Slave Hunts as “Normal Policing”

In May 1752 the French minister of the navy, Antoine de Rouillé, wrote to the governor of Saint-Domingue about the new problem of slaves in France. Slaves were “multiplying every day, more and more, in almost all the towns of the kingdom.” The minister’s disquiet followed a controversy that centered on an African man, age 22, whom I shall call Jean, though he also appears under other names (Charles-Auguste and Adonis) in the police archives. He was enslaved to Guy Coustard, a sugar planter in Saint-Domingue. Jean had the Coustard family’s monogram (CO) branded on his left breast.

Documents about Jean’s brief sojourn in France come from two slender files at the Bastille Archives, which contain letters to the lieutenant-general of police from the minister of the navy and from Jean’s would-be benefactor, the Dowager Princess of Nassau-Siegen, born Charlotte de Mailly de Nesle, who tried and failed to protect Jean from Coustard. Her staff and Coustard lodged in the same hotel, near the Luxembourg Palace. Through her servants, she learned of Jean’s physical abuse and despair.

From Mailly de Nesle we learn that Jean arrived in Paris during the spring of 1751 and fled from the city twice. On both occasions he tried to escape by joining the army. In March 1752 the French constabulary arrested him in Sedan, a frontier garrison town, and escorted him back to Paris in chains. He wound up in the dungeon of For l’Évêque, a former ecclesiastical prison. Many of the other inmates at that time were soldiers. Unlike Jean, who had hoped to become free by joining the army, those men were draftees, who had sought freedom from the army through desertion. On April 8, someone other than Coustard claimed Jean from prison. Port records in La Rochelle note that a slave named Jean sailed for Saint-Domingue in July.

The capture and imprisonment of Jean resulted from an order of the king, popularly known as a lettre de cachet. Masters paid a fee to police for these roundups and paid for the maintenance of their slaves in prison. In March 1752, Jean-Jacques Coustard, an elderly Parisian judge, lobbied the Crown to arrest Jean by royal writ. The judge did not own slaves himself and had probably never set foot in the colonies. He came from a clan of Angevine drapers who bought their way into the Paris legal establishment in the 17th century. The Paris Coustards abandoned trade for the law, to become a judging dynasty, just as a more intrepid, piratical sprig of the family settled in Saint-Domingue. The judge and Guy Coustard, Jean’s master, were cousins, not brothers. The capture of Jean resulted from the maneuvering of Crown officials to oblige both a sugar magnate and a member of the city’s judicial elite.

Jean’s failed bid for liberty offers a glimpse of how elusive freedom became for many slaves in Paris after the mid-18th century. His removal from the army and deportation back to Saint-Domingue resulted from new policing practices that crystallized around the time of his brief stay in France. Despite fleeing Paris, Jean became one of the first victims of an emerging system, based in France’s capital, by which slave owners, or their proxies, caused freedom-seeking domestics to disappear. 

 

The rising importance of the slave trade, and of colonial slave plantations, to Parisian social and economic life led the city’s elites to adopt a new attitude toward people of African and South Asian descent, whom they increasingly viewed as potentially saleable belongings. Resplendent sojourners from Saint-Domingue played a role in diffusing new racial concepts in Paris, but their influence should not be overstated. Ideas of race did not waft into the capital as a foreign essence. By 1750, slave plantations and the slave trade out of East and West Africa had become economically vital to Parisian institutions, including the Company of the Indies, which enjoyed direct support from the Crown and strong ties to Parisian high finance. There was nothing distantly managerial about the activities of Paris-based officials in the Africa trade. Consider this document from 1750, written one year before Jean arrived in Paris. Signed by all directors of the Company of the Indies, it sets forth a new scale of value for slave sales in Senegal.

RÉGULATION DES NOIRS, NÉGRESSES, NÉGRILLONS ET NÉGRITTES

21. Every negro between 14 and 40 will be reputed as one Indian piece so long as he has none of the defects indicated below.

22. One négrillon (boy) of 14 equals one Indian piece.

23. Four négrillons (boys) or négrittes (girls) from the age of 8 to 13 equal three Indian pieces.

24. Six négrillons (boys) or négrittes (girls) from the age of 4 to the age of 8 equal three Indian pieces.

25. Four négrillons (boys) or négrittes (girls) who are 4 years of age or younger equal one Indian piece so long as they are not nursing.

26. One negress who is between 14 and 35 years of age equals one Indian piece.

27. One negress who is age 13 and 14 equals one Indian piece.

28. Men between 40 and 50 years of age, and women between 35 and 40 years of age, equal one-half Indian piece and cannot compose more than 3 percent of the cargo.

29. All nursing children will follow their mothers and not be counted.

30. All negroes, negresses, négrillons (boys), and négrittes (girls) will be considered valid Indian pieces so long as they are not epileptic, maimed, blind, or suffering from formal disease.

31. Some missing teeth, and negroes with enlarged testicles who do not have hernias, cannot be refused by captains and surgeons, or excepted from the above regulation.

32. Negroes with one bad eye who are not over 30 years, others of the same age who are missing one or two fingers, however robust their bodies, will only be counted as one-half an Indian piece.

33. A negro who is lacking two toes will be estimated as two-thirds of a piece; a negress in the same case will be evaluated similarly; and négrillons (boys) and négrittes (girls) by the same proportion.

To pin down the novelty of this document requires that we identify what is not new. At direct points of sale among slave buyers in Africa or the Americas, this meticulously commodified view of the human body was familiar. It was normal for company agents to haggle over people with missing toes and enlarged testicles. There is also nothing new about the term pièce d’Inde (Indian piece), from the Portuguese peça das Indias, which originally referred to the value of a piece of cloth exchanged for slaves in Africa by 15th-century traders. French merchants began to employ this term in the early 18th century.

What seems new is this bald enactment by Paris-based officials of a common system of meaning that binds together the capital and trading posts in Senegal in which Africans about 30 years old are whole units, Africans about 40 years old are half-units, and nursing babies, the blind, and ailing people literally have no value. This is not merely a blunt statement of adhesion to the language of the slave captain by the city’s most eminent merchants; it is the other way around. It is Paris scripting the dialogue at the point of sale.

Police sources about slaves in Paris might seem worlds away from plantation inventories, or Indies Company contracts, yet they convey the same matter-of-fact view of black people as property. Stakeouts and arrests could not have occurred otherwise. Urban slave hunts, far from chafing against local values, reaffirmed them. The property that officials in Paris were willing to defend changed in step with the kind of property that Parisians believed in. By the mid-century, policemen accepted that property could take the form of people.

Slave hunts brought the ideology of the slave owner into the streets of Paris, raising the question of what neighbors thought. At least for bystanders, the arrest of slaves looked just like regular police raids. The question is not how neighbors reacted to the spectacle of capture so much as how they understood the status of their neighbors’ domestics, whether they reported fugitives to the police, and whether they hid people. It is impossible to venture a single answer to this question. Police files offer many clues to friendship, love, and complicity between Parisians and enslaved people. There were, nonetheless, some residents of the city who described their neighbors’ domestics in the crudest possible terms. In 1751, la Dame Mallecot, the wife of an administrator in Cayenne, sought help from the police with the removal of Esther, an African (Igbo) domestic. Mallecot plotted the woman’s arrest, sent Esther to the home of an elderly neighbor, and left town. The neighbor’s son complained to the lieutenant-general of police. “I beg you sir to order that Mallecot come for her negress, whom I will return. It is her property, she will do with it what she wants.” Esther was “a deposit” (un dépôt) for his neighbor to reclaim.

There did not need to be a slave master in the picture. Police agents presumed black and brown people to be stolen goods even when no one reported them missing. The arrest of a man called Mustapha in 1755 offers a revealing instance of this. Mustapha, newly arrived from Marseille, was doubly jinxed. The police had doubts about the fancy napkins Mustapha was hawking on a bridge, and they were just as suspicious about the provenance of Mustapha himself. He deepened their concern by refusing to answer questions (although he was believed to know French) and spent four weeks in For l’Évêque. “We did not find anything in his pockets indicating to whom he belonged.”

 

During the reign of Louis XIV, royal officials began to theorize policing as a vast, tentacular cleansing project by an all-knowing state. As Michel Foucault observes, the rise of new policing ideas would change the structure of government as people began to reimagine its purpose. Policing became a boom topic for publishers and Crown officials, especially after the death of Louis XIV in 1715. The end of Louis’s long reign heightened the reforming zeal of police enthusiasts, to inspire dictionaries, treatises, proclamations, and experiments in repression and surveillance. In Paris, the word police encompassed just about everything. It meant ridding the city of moral filth, actual filth, crime and delinquency, crooked houses, illegal workers, badly lighted streets, family embarrassments, and riotous effervescence among the laboring poor. In the service of this billowing project, the lieutenant-general of police in Paris could issue his own royal writs for the arrest of undesirables, who entered dungeons without passing through the courts.

The practical ability of municipal authorities in Paris to police evolved over time. The invention of inspectors in 1708, with an amplified role after 1740, altered the relationship between police and city dwellers. Through their webs of spies and informants, twenty police inspectors maintained an unrelenting, round-the-clock surveillance of lodging houses and rented rooms frequented by étrangers (strangers). The French word étranger, imbued with a sense of danger and suspicion, referred to outsiders in general, including people from elsewhere in France.

Changes to the policing of Paris responded to dearth, social unrest, and an increase in human mobility. Migration expanded both the city, as a physical space, and its population. The new brutal efficacy of police inspectors around the mid-century also came on the heels of war — the War of the Austrian Succession — and should be read in light of that conflict. As Arlette Farge notes, resistance to troop levies, together with mass desertion, spurred social upheaval in Paris. This may help to account for the menacing force of police in Paris after the war in confrontations with strangers and crowds.

Once agents of the Paris police put themselves in the service of slave owners, it became perilous for fugitives to hide in the city. Jean needed to escape from Paris and not into it. Enslaved domestics who accompanied masters to Paris in the 1740s tended to disappear after a couple of weeks.

Admiralty records provide numerous examples of flight by teenage Africans between 1742 and 1747. The police did not catch these people and there is no evidence they tried to. (They may have been focusing on deserters.) On the rare, documented occasions before 1750 when masters sought help from the police to recover enslaved domestics, nothing happened. In 1742 Anne-Marie-Josephe de Sorel, from Léogane, reported the flight of her slave Pierrot to the Admiralty. To find the boy, she summoned “Sir Genesty, exempt, and she charged him with conducting searches for the said negro, which he assures her of having done for several days and nights” to no effect. In August 1749 a Parisian solicitor reported the flight of his slave Jeanne, who remained at large despite “investigations and house searches that her master caused to be done” — which suggests another failed police hunt.

Masters in the 1750s who appealed to the police framed their demands by emphasizing the moral threat posed by escapees. At the time, the police and most of French society viewed the whole serving class as degenerate scoundrels. Through their depiction of runaways as urban contaminants, masters recast slave hunts as normal policing. In 1751 the Portuguese bishop of Noronha, governor of Sao Tomé, reported the flight of Figueret, “about 4 foot 3, black, dressed in black, in a curly wig gathered at the back, age 16 or 17, from Goa in the Indies.” Figueret was known to be spending his days at the Saint-Germain fair. Noronha explained that the boy “who belonged to him, has been extremely deranged for five or six months, since arriving in Paris, and it being important to oversee his conduct, to prevent him from committing some disorder, he would be very grateful for him to be put in the prison of For l’Évêque until he departs Paris for the Orient.” When informing the police about the flight of his slave, Louis Aubin, the Chevalier de Nolivos noted “how much pleasure (his arrest) would give me, because, independent of the real loss caused by this domestic, he swindled me.” Masters in the 1750s emphasized the resemblance between runaways and other delinquents. They did so to enable the extrajudicial arrest of people they regarded as valuable assets.

Excerpted from Slaves in Paris: Hidden Lives and Fugitive Histories by Miranda Spieler, published by Harvard University Press. Copyright © 2025 by the President and Fellows of Harvard College. All rights reserved.

Irrelevant at Best, or Else Complicit

It was not an optimistic time. In the United States, President John F. Kennedy and civil rights activist Medgar Evers had been shot dead in 1963, Malcolm X in 1965, and Dr. Martin Luther King Jr. and Robert F. Kennedy in 1968. Bodies piled up, too, in Vietnam. The year 1968 had brought a global surge of energy and solidarity: the growth of social movements, of struggles against dictatorships and authoritarian rule, of resistance even in the face of violent repression. But 1969 saw a massive global let-down. Coalitional hopes sagged nearly worldwide, replaced by feelings of chaos, dread, and hopelessness.

“Design,” whatever that might be, no longer looked to anyone like the answer to any of the world’s problems. At the 1969 International Design Conference in Aspen (IDCA) — the same conference that in 1961 had been themed “Man / Problem Solver,” that had emphasized the designer’s “great social responsibility” to help build “a new society with new institutions,” that had celebrated design’s capacity to “‘blast off’ for richer worlds” — the atmosphere had turned somber. The 1969 conference was titled “The Rest of Our Lives.” The industrial designer George Nelson bemoaned, in his conference talk, the difficulty of escape from “the perverted offspring of the American dream” — the dream itself having been brought about, Nelson said, in part by blind faith in technology. The conference’s overall mood, one commentator observed later, reflected “the despair the participants felt at the crumbling of American ideals.” 

The 1970 conference, titled “Environment by Design,” was even darker. Three days in, the American architect Carl Koch declared from the podium that “Our national leadership is unspeakable. The government’s sense of priorities is criminally askew. Our cities are rotting visibly before our eyes.” By a few days later, the program of organized talks had disintegrated. 

People gathered ad hoc in the conference tent to connect with one another and express ideas about the current crisis. A group of French participants read a screed against design itself, written for the occasion by Jean Baudrillard. Baudrillard’s statement lambasted the conference’s environmentalist theme as disingenuous (“Nothing better than a touch of ecology and catastrophe to unite the social classes”), even as it acknowledged, “The real problem is far beyond Aspen — it is the entire theory of Design and Environment itself, which constitutes a generalized Utopia; Utopia produced by a Capitalist system.” (Utopia, here, seems to imply the most self-delusional kind of fantasy.) 

The final hours of the conference, IDCA president Eliot Noyes wrote afterward, underlined “the relative irrelevance of the design subject in the minds of many who were attending.” At the subsequent board meeting, Noyes resigned as president, and the board resolved to search for a radically new form for the 1971 conference, if the conference were to be held again at all. Both the conferees and the board, Noyes reflected, now harbored “serious doubt as to whether at this moment in our national history and our state of emotional disrepair a conference on design can or should be held at all.” Focusing on design seemed irrelevant at best, or else complicit, deplorable, malign.

The whole concept of design was also under attack from those outside design’s professional bounds. In 1971, the German philosopher Wolfgang Fritz Haug published Kritik der Warenästhetik (later translated into English as A Critique of Commodity Aesthetics), a Marxist-cum-Freudian manifesto that described designers as the “handmaidens” of capitalism. Design, Haug contended, was an engine of the appetite-generating “illusion industry” of media and advertising, as well as of the broader consumer capitalist system behind them, all of which were organized around driving consumption and thereby producing profits. 

Haug, like the Frankfurt School before him, charged the modern culture industries and the commodities they produced with the manipulation of human beings. But Haug added a meaningful nuance to Theodor Adorno and Max Horkheimer’s thesis: he showed that manipulating people was only possible because design and its peer disciplines colluded with those people’s pursuit of self-interest, which was continuous, intelligent, and fully intentional. Even “manipulative phenomena” like design, as Haug put it elsewhere, still spoke “the language of real needs.” 

So what to make of design? Was it a necessary evil, or a poison to be eradicated? Neither: it was that poison’s dangerously sweet taste. Or, to use Haug’s own metaphor, design was like the Red Cross in wartime. “It tends some wounds, but not the worst, inflicted by capitalism,” Haug wrote. “Its function is cosmetic, and thus prolongs the life of capitalism by making it occasionally somewhat more attractive and by boosting morale, just as the Red Cross prolongs war. Thus design, by its particular artifice, supports the general disfigurement.”

1971 was also the year the Austrian American designer Victor Papanek published Design for the Real World. It has since become one of the most widely read design books in history; it has been published all over the world, has been translated into over twenty languages, and (as of 2024) has never fallen out of print. It’s a manifesto against what design had become. And it’s a passionate brief for what Papanek believed design could be.

 

As of 1971, Victor Papanek was dean of the newly formed School of Design at the California Institute of the Arts (CalArts). And he had begun to develop his own methodology for a design practice focused, he believed, on solving for real human beings’ real needs. 

Papanek preached design’s “unique capacity for addressing human issues,” as he put it in the magazine Industrial Design, and its “value beyond the purely commercial imperative.” His philosophy of “DESIGN FOR THE NEEDS OF MAN” was a set of seven “main areas for creative attack”:

1. Design for Backward and Underdeveloped Areas of the World.

2. Design for Poverty Areas such as: Northern Big City Ghettos & Slums, White Southern Appalachia, Indian Reservations in the Southwest and Migratory Farm Workers.

3. Design for Medicine, Surgery, Dentistry, Psychiatry & Hospitals.

4. Design for Scientific Research and Biological Work.

5. Design of Teaching, Training and Exercising Devices for the Disabled, the Retarded, the Handicapped and the Subnormal, the Disadvantaged.

6. Design for Non-Terran and Deep Space Environments, Design for Sub-Oceanic Environments.

7. Design for “Breakthrough,” through new concepts.

That designers should organize their work around addressing human beings’ real-world needs, however clumsily taxonomized—rather than around aesthetics, or function, or the profit imperative—was the message of Design for the Real World. First published in Swedish in 1970, it found global success when published in English in 1971, taking its place among other leftist English-language jeremiads of the time: Jane Jacobs’s The Death and Life of Great American Cities (1961), Rachel Carson’s Silent Spring (1962), James Baldwin’s The Fire Next Time (1963), Kate Millett’s Sexual Politics (1970), E. F. Schumacher’s Small Is Beautiful: Economics as if People Mattered (1973). 

Papanek’s book attributes a lot of agency to design: “In an age of mass production when everything must be planned and designed,” he writes, “design has become the most powerful tool with which man shapes his tools and environments (and, by extension, society and himself).” But the book doesn’t celebrate that agency. Instead, it charges designers, and the broader economies of production within which they operate, with wasting and abusing their power.

Take the process of creating and distributing a new secretarial chair. In a “market-oriented, profit-directed system such as that in the United States,” such a new chair almost invariably “is designed because a furniture manufacturer feels that there may be a profit in putting a new chair on the market,” rather than because there is any empirical evidence that a particular population’s sitting needs are not being met. The design team is simply “told that a new chair is needed, and what particular price structure it should fit into.” The team may consult resources in ergonomics or human factors, but inevitably they will find that the information available about their potential “users” is sorely lacking. So they design another generic chair, made neither to fit a specific population nor to solve a new problem. After some perfunctory testing, the chair hits the market, where, invariably, someone other than the secretary decides whether to buy it for her use. Some money is made. No one’s life improves. But the manufacturer is satisfied: “If it sells, swell.”

Young man paints the back of a wooden chair, by Arnold Eagle, c. 1940. [The J. Paul Getty Museum]

What should designers do instead? “A great deal of research,” Papanek replied. Designers should ask “big,” “transnational” questions: “What is an ideal human social system? … What are optimal conditions for human society on earth?” They should inquire into their potential users’ “living patterns, sexual mores, world mobility, codes of behavior, primitive and sophisticated religions and philosophies, and much more.” And they should learn about other cultures’ ways of prioritizing and addressing needs. They should undertake “in-depth study” of such “diverse social organizations” as the “American Plains Indians, the Mundugumor of the Lower Sepik River basin; the priest-cultures of the Inca, Maya, Toltec, and Aztec; the Pueblo cultures of the Hopi; the social structuring surrounding the priest-goddess in Crete; the mountain-dwelling Arapesh; child care in Periclean Greece; Samoa of the late 19th century, Nazi Germany, and modern-day Sweden”; et cetera, et cetera.

Papanek’s commitment to identifying needs by learning about the lives of specific users—largely those from non-Western cultures—might be called an “ethnographic” impulse: a drive to study groups of people (usually groups other than one’s own) and to document their cultures, customs, habits, and differences from an assumed norm. The ethnographic impulse played out not only in Papanek’s blockbuster book but also in his self-curation and self-presentation. He built a personal library, his biographer notes, containing hundreds of volumes of anthropological research and writing. Beginning in the 1960s, Papanek invited reporters into his home to photograph or draw him and his wife (whoever she was at the time) and their decor: Navajo weavings, Buddhist figures, Inuit masks and ritual artifacts, Balinese masks, other objects of vernacular culture.

Papanek also endeavored, through this period, to document his alleged ethnographic capital as a set of professional credentials. In the “biographical data” sheet (something like a curriculum vitae) that he presented to CalArts in 1970, Papanek wrote that he 

a. had traveled widely throughout Europe, Thailand, Bali, Java, Cambodia, Japan, etc.

b. spent nearly 6 months (with the Governor’s permission) living in a Hopi Indian pueblo

c. spent several months with an Alaskan Eskimo tribe and nearly five years in Canada

d. spent part of 5 summers in an art-and-craft centered milieu in the Southern Appalachians

e. received various grants that took me to Lapland, Sweden and Finland during the summer of 1966; Finland and Russia during the summer of 1967; and will take me to Russia, Finland, Sweden, and Norway during the summer of 1968

His biographer calls several of these items—particularly those suggesting that Papanek had carried out fieldwork with Hopi and Alaskan Eskimo tribes—“fallacious.” But that didn’t stop Papanek from repeating them across documents and forums. 

Excerpt adapted from The Invention of Design: A Twentieth-Century History by Maggie Gram. Copyright © 2025 by Maggie Gram. Available from Basic Books, an imprint of Hachette Book Group, Inc.

An Attempt to Defeat Constitutional Order

Conservatives in South Carolina first attempted to defeat the state’s new post-Civil War constitution by appealing to the federal government they had fought three years prior. A petition was submitted to Congress, describing the new constitution as “the work of Northern adventurers, Southern renegades, and ignorant negroes” and claiming that “not one percentum of the white population of the State approves it, and not two percentum of the negroes who voted for its adoption know any more than a dog, horse, or cat, what his act of voting implied.” Conservatives complained that “there seems to be a studied desire throughout all the provisions of this most infamous Constitution, to degrade the white race and elevate the black race, to force upon us social as well as political equality, and bring about an amalgamation of the races.” They ended the petition with a warning: “The white people of our State will never quietly submit to negro rule.”

Congress refused conservative entreaties. But conservatives persisted in their fight. To prevent Black people and Republicans from prevailing in the first elections after the constitution, many turned to coercion, intimidation, and violence. In testimony before a state legislative committee investigating a disputed election, one South Carolinian said that employers had resolved not to employ any man who voted Republican. This was a smart strategy as many former slaves still relied on contracts with their former masters to earn a living. Slaveholders had exploited Black labor to build their wealth, and then used that wealth to build white political power. 

Conservatives also used the legal system. One former slave was arrested and held without trial. Authorities released him when he agreed to vote Democrat. Sometimes, conservatives resorted to even more direct methods. In the spring of 1868, the Ku Klux Klan appeared in South Carolina for the first time, and worked to win the 1868 election for conservatives. After years of being denied a voice in the political process, Richard Johnson was excited to vote. But the night before the election, “the Ku Klux came through our plantation, and said if any of the colored people went to the polls the next day to vote, that they would kill the last one of them.” Some Black men on the plantation were so determined to vote that they still turned up at the polls. But several decided not to vote at the last minute because “the Democrats had liquor at the box upstairs and were drinking and going on in such a manner that the colored people were afraid to go up.” Eli Moragne was one of them. The day before the election, the Klan broke into his home, dragged him outside, stripped him naked, and then whipped him. He showed up despite the experience but was told that if he “voted the Radical ticket [he] would vote it over a dead body.” Armed white men stood between him and the ballot box.

Union Republican Ticket for Constitution, 1868. [University of South Carolina]

Sometimes, Democrats engaged in violence without bothering to wear their Klan robes. William Tolbert, a Democrat who helped murder a Black Republican, observed that “committees were appointed, which met in secret, and they appointed men to patrol in each different neighborhood.” This was done “to find out where the negroes were holding Union leagues.” They had instructions to “break them up, kill the leaders, fire into them, and kill the leaders if they could.” Committees were supposed to take ballots from Republicans and kill those who resisted. Republicans did resist because Tolbert described a scene where one Republican had been shot dead and others had fled. The violence was effective. At one precinct, Tolbert would ordinarily have expected between four and five hundred Black men to vote, but Democratic committee members in the area only allowed two Black men to vote before they started shooting. There were similar drops in Black turnout across the state. For example, in Abbeville County, around 4,200 Black men were registered voters, but only 800 actually voted in 1868’s fall elections.

Republicans won the governorship and control of the legislature. But Democrats and conservatives saw that violence could be effective. 

Carte-de-visite of Republican members of the South Carolina State Legislature, 1868. [Wikimedia Commons]

State authorities did try to respond. Amid Klan violence sweeping the state, Governor Robert Scott signed a bill authorizing a state militia. However, most whites refused to serve, a trend that became especially pronounced when Governor Scott rejected all-white militia companies offered by former rebels. In the end, as many as 100,000 men, mostly Black, joined by the fall of 1870. They often wielded state-of-the-art weapons such as Winchester rifles. White newspapers spread conspiracy theories about the militia. For example, after describing the militia sent to Edgefield as “the Corps d’Afrique,” the Charleston Daily Courier claimed that it had come to the town to commence “the arrest of citizens on trumped up charges of being ‘rebel bushwackers,’” and “‘members of the Ku Klux Klan.’” It then suggested that the militia had tortured an innocent white man into admitting that he was a “bushwacker.” Two things appear to have been truly offensive about Black militia units. First, they inspired pride among Black people. The paper complained that when a Black militia unit went to Edgefield, “the negroes of Edgefield became exceedingly jubilant, and determined to congratulate the colored soldiers on their great victory.” Second, the militia gave Black men another economic option besides relying on their former masters. As the paper lamented, “Among the numerous evils which have resulted to the people of Edgefield from this invasion of the county by the negro militia, has been the desertion of the fields by the negro laborers.”

Violence between Black militia units and white people erupted in Laurens County right after the 1870 election. After a gun discharged during a fight between a police officer and a citizen, a white mob began shooting at militia in the town. Several Black men and a few white men died during the fighting and in the subsequent upheaval. One of them was Wade Perrin, a Black legislator. White men caught up to him, ordered him to dance, sing, pray, and then run away. While he was running, they shot him in the back. Between 2,000 and 2,500 armed white men occupied the town. They had confiscated militia weapons from the armory. Two different stories developed about what had caused the violence. The Daily Phoenix blamed Black people. In the months before the 1870 election, the paper reported, “the white people had been subjected to an organized system of disparagement, abuse, and threats of violence to person and property, which had produced that feverish state of feeling incident to a deep sense of outrage and injustice.” Black people had allegedly become so unruly that “for weeks, whole families had not undressed for bed, so great was the apprehension of midnight negro risings, burnings and butcheries.”

The South Carolina Republican, however, claimed that a white man deliberately attacked a policeman to provoke him into firing so they would have an excuse to shoot. This must have been a premeditated plot because “it was not three minutes after the first shot was fired before a line of white men had formed across the public square … The white men came from every direction, out of the stores, the courthouse, and every other place, and what appears very singular is that every one was fully armed.” After the white men had fired on the militia, the paper reported that “white couriers were dispatched on every road, to rouse the people, so that by night at least one thousand men were scouring the countryside on horseback, and in little squads hunting up Radicals.” The incident attracted national media coverage. The New York Herald observed that “‘The War of the Races’ in South Carolina did not end with the rebellion, but occasionally bursts forth with its wonted fury.”

Governor Scott declared martial law in four South Carolina counties. But he also ordered remaining militia weapons in Laurens County transferred to Columbia. Removing the weapons ensured that the militia couldn’t be a serious fighting force and made the martial law proclamation meaningless. A wave of Klan violence swept the state after Laurens. The violence diminished temporarily later in 1871, though there is disagreement about why. Some have suggested that aggressive federal measures were responsible. 

In 1871, the federal government stationed more troops in the state and engaged in a thorough intelligence gathering operation to learn more about the Klan. Federal legislation authorized President Ulysses S. Grant to use the military to enforce the law and placed congressional elections under federal supervision. What became known as the Ku Klux Klan Act allowed Grant to suspend the writ of habeas corpus when he deemed it necessary. After considerable debate, Grant suspended the writ in nine South Carolina counties on October 17, 1871. Over the next months, federal authorities arrested thousands of men for allegedly participating in the Klan and secured dozens of convictions and guilty pleas. These efforts were enough for one historian to claim that “the limited steps taken by the Federal government were adequate to destroy” the Klan.

Indeed, Klan violence was lower for the end of 1871 and some of 1872 than it had been earlier. At the time, however, law enforcement officials themselves were skeptical about whether their efforts had been effective. One prosecutor even suggested that “orders were given” from unknown persons to end the violence “for the present” and that the Klan would simply “wait until the storms blew over” to “resume operations.” By the summer of 1872, Klan activity intensified, indicating that any benefits from federal intervention were limited.

Left: Jonathan Jasper Wright, 1870. [Wikimedia Commons] Right: William Whipper, c. 1879. [Wikimedia Commons]

Given the immense opposition it faced, South Carolina’s government made important achievements. The state greatly extended educational opportunities. In 1868, 400 schools served only 30,000 students. But by 1876, 2,776 schools served 123,035 students. The state also expanded the University of South Carolina, even providing 124 scholarships to help poor students with tuition.

Perhaps most importantly, South Carolina saw unparalleled Black involvement in politics during Reconstruction. During these years, 315 Black men served in political office. Six served in Congress. Two Black men served as lieutenant governor. South Carolina was a place where a parent could take a son who had experienced chattel slavery just three years previously to the legislature, point to a majority of the members, and say, “that could be you one day.” The state that was the first to plunge the nation into Civil War because of its commitment to Black slavery was also the first to raise a Black man up to its supreme court. Jonathan Jasper Wright was born in Pennsylvania to free Black parents and managed to save enough money to attend college, a rare feat for both white and Black people in the era. He read law in his spare time while teaching to support himself. Upon passing the bar, he became the first Black lawyer in Pennsylvania. After the Civil War, he came to South Carolina to organize schools for freedmen. Wright had a neatly trimmed beard and mustache, and his somber eyes betrayed a young man who was in a hurry or a man weighed down with cares, or perhaps both.

Corruption marred all of the progress. In 1870, the Charleston Daily News wrote that “the South Carolina Legislature enjoys the reputation, both at home and abroad, of being one of the most corrupt legislative bodies in existence.” Corruption was so bad, the paper claimed, that “a remark frequently made among the white men in Columbia, Radicals and Democrats, was that two hundred thousand dollars, judiciously circulated among the legislators, would secure the passage of a bill repealing the Emancipation act, and putting all but colored legislators back in slavery.” The paper then asserted that there was an organization known as the forty thieves pillaging the treasury. The organization allegedly had a captain, three lieutenants, four sergeants, and twenty-eight privates. The group conspired to prevent the legislature from passing any “measure unless money was paid to the members of the organization.”

Although conservatives may have exaggerated corruption, it did plague South Carolina during Reconstruction. After John Patterson won election to the U.S. Senate, authorities arrested him when a legislator said he had voted for Patterson after receiving a bribe. Critics called Patterson “Honest John,” supposedly because he always made good on his promises to pay bribes. The legislature attempted to impeach Governor Scott for his behavior in issuing bonds. At the end of 1871, a Republican newspaper lamented that “1872 finds South Carolina financially in a bad way, with no one to blame but officials of our own party. This is a disagreeable statement to make, but it is the truth.” William Whipper, who had argued for enfranchising women at the 1868 constitutional convention, asserted Scott bribed legislators to escape impeachment.

All the corruption caused schisms in the Republican Party. Eventually Whipper, who would himself be accused of corruption, asserted, “It is my duty to dissolve my connection, not with the Republican Party, but with the men, who by dishonesty, demagogism and intrigue have defamed the name of Republicanism, and brought financial ruin upon the State.” Disgruntled Republicans joined the new Union Reform Party along with some Democrats. In the 1870 campaign, the party’s platform was “honesty against dishonesty — cheap, economical government against exorbitant taxation — reduction of public expenses against extravagant expenditure of the people’s money — responsibility of officials for the faithful discharge of their duties against irresponsibility, selfishness and greedy absorption of power.” The Reform Party failed to win the fall elections, though members alleged fraud and intimidation at the polls. Corruption in the Republican Party deprived it of unity precisely when it was most needed to overcome the massive resistance it faced.

Some observers even claimed that corruption led to the Klan violence against Black people and Republicans. But whatever else is true about the corruption in the South Carolina Republican Party, it does not explain the attempt to overthrow the constitutional order. We know this because conservatives and Democrats never gave the 1868 constitution or the Republican Party a chance. They schemed to prevent a constitutional convention in the first place, protested to federal authorities, and used terrorism, cold-blooded murder, and economic coercion to prevail in the 1868 general election. The reality is that, given their hostility to Black political advancement, they would have engaged in violence and attempted to defeat the new constitutional order even if every Republican official had been honest and efficient.

Excerpt adapted from Sedition: How America’s Constitutional Order Emerged from Violent Crisis by Marcus Alexander Gadson. Copyright © 2025 by New York University. Published by NYU Press.

Lethal Injection Is Not Based on Science

We know how to euthanize beloved pets — veterinarians do it every day. And we know how physician-assisted suicide works — it is legal in several states. If drugs can be used to humanely end life in these other contexts, why is it so difficult in the death penalty context? The answer is one of the best-kept secrets of the killing state: lethal injection is not based on science. It is based on the illusion of science, the assumption of science. “What we have here is a masquerade,” one lab scientist says. “Something that pretends to be science and pretends to be medicine but isn’t.” Consider first the birth of lethal injection.

In 1976, the Supreme Court gave states the green light to resume executions after a decade of legal wrangling over the constitutionality of the death penalty, and Oklahoma was eager to get started. The only hitch was how to do it. Oklahoma’s electric chair was dilapidated and in need of repair, but more importantly, it was widely viewed as barbaric and inhumane. The state was looking to try something new. A state legislator approached several physicians about the possibility of death by drugs — a lethal injection. They wanted nothing to do with it, but the state’s medical examiner, Dr. Jay Chapman, was game. “To hell with them,” the legislator remembered Chapman saying. “Let’s do this.”

Chapman had no expertise in drugs or executions. As Chapman himself would later say, he was an “expert in dead bodies but not an expert in getting them that way.” Still, he said he would help and so he did, dictating a drug combination to the legislator during a meeting in the legislator’s office. Chapman first proposed two drugs, then later added a third. Voila. In 1977, the three-drug protocol that states would use for the next 30 years was born.

The idea was triple toxicity — a megadose of three drugs, any one of which was lethal enough to kill. The first drug, sodium thiopental, would kill by barbiturate overdose, slowing respiration until it stopped entirely. The second drug, pancuronium bromide, would kill by paralyzing the diaphragm, preventing it from pumping air into the lungs. And the third drug, potassium chloride, would kill by triggering a cardiac arrest. The effects of the second and third drugs would be excruciatingly painful, so the first drug did double duty by blocking pain as well.

How did Chapman come up with his three-drug combo? “I didn’t do any research,” he later confided in an interview. “I just knew from having been placed under anesthesia myself, what was needed. I wanted to have at least two drugs in doses that would each kill the prisoner, to make sure if one didn’t kill him, the other would.” As to why he added a third drug, Chapman answered, “Why not? … You wanted to make sure the prisoner was dead at the end, so why not add a third drug,” he said, asking: “Why does it matter why I chose it?”

This is how the original three-drug lethal injection protocol came to be: a man working outside his area of expertise and who had done no research just came up with it. “There was no science,” says law professor Deborah Denno, one of the leading experts in the field. “It was basically concocted in an afternoon.” As another lethal injection expert, law professor Ty Alper, put the point, Chapman “gave the matter about as much thought as you might put in developing a protocol for stacking dishes in a dishwasher.” For the careful dish stackers among us, it’s fair to say he gave it less.

But that was good enough for Oklahoma, which adopted the new execution method without subjecting it to a shred of scientific scrutiny. No committee hearings. No expert testimony. No review of clinical, veterinary, or medical literature. The state was embarking upon an entirely new way to kill its prisoners, and did none of the most basic things.

Texas followed Oklahoma’s lead the next day, and then other states did too, carelessly copying a protocol that had been carelessly designed in the first place. “There is scant evidence that ensuing States’ adoption of lethal injection was supported by any additional medical or scientific studies,” a court reviewing the historical record wrote. “Rather, it is this Court’s impression that the various States simply fell in line relying solely on Oklahoma’s protocol.” As Deborah Denno observes, the result was an optical illusion — states touted a “seemingly modern, scientific method of execution” without an iota of science to back it up. Jay Chapman was as surprised as anyone by other states’ adoption of his protocol. “I guess they just blindly followed it,” he later stated, adding, “Not in my wildest flight of fancy would I have ever thought that it would’ve mushroomed into what it did.” “I was young at the time,” he explained. “I had no idea that it would ever amount to anything except for Oklahoma.”

Over time, every death penalty state in the country would adopt Chapman’s three-drug lethal injection protocol — not because they had studied it, but because in the absence of studying it, there was nothing to do but follow the lead of other states. “I didn’t have the knowledge to question the chemicals,” one warden explained, saying that he had “no reason to because other states were doing it.” “It wasn’t a medical decision,” an official from another state explained. “It was based on the other states.”

Sociologists have a name for this, a term of art for fads based on a faulty assumption. They call it a “cascade to a mistaken consensus,” and lethal injection is a textbook example. States had come to a consensus in adopting the three-drug protocol, but it was based on the assumption that other states knew what they were doing. They did not.

 

The fact that the three-drug protocol wasn’t based on science is not to say that science on the drugs didn’t exist. All three drugs were FDA approved, so there were studies and FDA warning labels saying what each drug did. The problem was that none of that science could predict what would happen when the drugs were used in lethal injection. Lethal injection is an “off-label” use of a drug, and although doctors use drugs for off-label purposes all the time, they aren’t trying to kill people, so their off-label use doesn’t come anywhere close to the use of those drugs as poison in lethal injection. Lethal injection uses drugs in amounts that no one has ever prescribed, let alone studied in a research setting. It delivers the entire dose of a drug at once — a practice known as “bolus dosing” — rather than delivering the drug in an IV drip, as is typical for large doses in the clinical setting. And it uses combinations of drugs that are simply unfathomable in the practice of medicine, giving rise to the possibility of “profound physiological derangements” (science-speak for freakishly weird results), as overdoses of different drugs affect the body in different ways.

Who knew what was going to happen when all three of these perversions came together? No one did, and the studies to find out had not even begun. In the biomedical research setting, a baseline showing of scientific support is required for testing on animals, and the three-drug protocol didn’t even meet that threshold. As one lab scientist quipped, “You wouldn’t be able to use this protocol to kill a pig.”

But states weren’t killing pigs. They were killing people, so they forged ahead, undaunted by the unknowns. Yet over time, the executions that followed created data points of their own, and those data points drew scientists. If states would not go to the science, science would come to them.

Granted, the data was thin. In some states, the problem was secrecy. “There is an enormous amount of information from executions (autopsies, toxicology, ECG recordings, EEG recordings, execution logs, and photographs),” one expert explained, “but most of it has been kept secret.” In other states, the problem was poor record-keeping. In still others, it was a state’s decision to stop keeping records altogether. For example, Texas — which conducts more executions per year than any other state — stopped conducting post-execution autopsies altogether in 1989. “We know how they died,” a state spokesperson stated when asked about the reason for the no-autopsy policy.

That said, the raw data that scientists did manage to get was enough to raise serious concerns about the three-drug protocol. State officials were making “scientifically unsupportable” claims about lethal injection, researchers stated, so they decided to look at the data to see what it showed. In 2005 and 2007, researchers published two peer-reviewed studies on lethal injection, the first major studies of their kind.

In the first study, researchers obtained toxicology reports from forty-nine executions in Arizona, Georgia, North Carolina, and South Carolina. (Texas and Virginia, the two states with the most executions in the country at the time, refused to share their data.) Because they had no other way to determine whether prisoners were anesthetized when they were injected with the second and third drugs, researchers measured the postmortem amounts of sodium thiopental (the first drug) in the blood, finding that most prisoners had amounts lower than what was necessary for anesthesia, and some had only trace amounts in their system.

“Extrapolation of ante-mortem depth of anesthesia from post-mortem thiopental concentrations is admittedly problematic,” the researchers conceded. Still, the wide range of sodium thiopental amounts in prisoners’ blood suggested gross disparities during their executions as well. “It is possible that some of these inmates were fully aware during their executions,” the researchers stated, but their conclusion was more modest: “We certainly cannot conclude that these inmates were unconscious and insensate.”

Vigorous debate ensued. “You can’t take these post-mortem drug levels at face value,” one forensic pathologist stated, explaining that the amount of a drug in the blood dissipates after death, just as it does in life, and most autopsies in the study were conducted around twelve hours after death, so the postmortem measurements didn’t say much about the sodium thiopental in a prisoner’s blood during the execution. The study’s authors shot back with point-by-point responses to the criticism, but the damage was done. The so-called “Lancet study,” named for its publication in one of the most prestigious medical journals in the world, would forever be tainted by skepticism.

Had the first study been the only study of the three-drug protocol, one might have said that the science was inconclusive. But a second study was published two years later, and its findings were far less subject to dispute. In the second study, researchers examined execution logs in California. California’s expert had testified that the effects of sodium thiopental were well understood. Within sixty seconds of receiving the overdose, “over 99.999999999999% of the population would be unconscious,” the state’s expert stated, and “virtually all persons [would] stop breathing within a minute.” But when researchers examined the logs from California’s eleven executions by lethal injection, they found that this was not the case. In six of the eleven cases — 54% — the logs showed that the prisoner “continued to breathe for up to nine minutes after thiopental was injected.”

This was alarming not only because it showed that the state’s expert was wrong, but also because it suggested that the prisoners had died torturous deaths. In the absence of a trained professional assessing anesthetic depth, the cessation of breathing provides a rough proxy for adequate anesthesia. Thus, the fact that over half the prisoners continued breathing was an ominous sign that they had not been fully anesthetized prior to injection of the drugs that would cause slow suffocation and cardiac arrest. Executioners had recorded prisoners’ vital signs, but had not understood what they meant.

California’s execution logs revealed another problem as well: the same six prisoners who continued to breathe did not go into cardiac arrest after injection of the third drug, potassium chloride, which the state’s expert had said would kill within two minutes. Given the massive dose of potassium chloride, how could this possibly be? The answer was one of the “profound physiological derangements” that no one saw coming, at least not until researchers documented it: the bolus dose of sodium thiopental had depressed circulation so dramatically that it blunted the bolus dose of potassium chloride. Prisoners’ hearts raced in response to the potassium chloride, but not enough to induce cardiac arrest, leaving them to die by slow suffocation from the paralytic instead.

The findings from California’s execution logs led a federal court to invalidate the state’s lethal injection protocol in 2006. “The evidence is more than adequate to establish a constitutional violation,” the court stated, noting that it was “impossible to determine with any degree of certainty whether one or more inmates may have been conscious during previous executions or whether there is any reasonable assurance going forward that a given inmate will be adequately anesthetized.” The governor has since declared a moratorium on executions in the state, and it remains in place today.

Looking back, it’s fair to say that for the first 30 years of lethal injection, states used a three-drug protocol without understanding how it actually worked. State experts made claims and stated them with confidence, but what they said didn’t turn out to be true. Sodium thiopental didn’t do what states said it would do, and potassium chloride didn’t do what states said either — largely because no one accounted for the possibility that a bolus dose of the first drug would blunt the bolus dose of the third. States had no idea what their toxic drug combinations would actually do. They were slowly suffocating prisoners to death, and they didn’t have a clue.

Excerpt adapted from Secrets of the Killing State: The Untold Story of Lethal Injection by Corinna Barrett Lain. Copyright © 2025 by New York University. Published by NYU Press.

Buy This Book

Fri, 04 Jul 2025 08:44:08 +0000 https://historynewsnetwork.org/article/186036 https://historynewsnetwork.org/article/186036 0
Are You Not Large and Unwieldy Enough Already? On May 25, 1836, John Quincy Adams addressed the U.S. House of Representatives in an hour-long oration. Eight years earlier, when Adams was still president of the United States, an address of such length by the erudite Harvard graduate would have been unremarkable. But by 1836, Adams was no longer president. He had been defeated for reelection by Andrew Jackson in 1828; left the White House in 1829 without attending his successor’s inauguration; quickly grown restless in retirement as he observed with dismay Jackson’s populist, expansionist, and proslavery policies; and returned to Washington in 1831 as a member of the House. The nominal issue that inspired Adams’ sprawling speech in 1836 was a resolution authorizing the distribution of relief to settlers who had fled their homes in Alabama and Georgia following a series of violent altercations with Indigenous people. Adams used that conflict as an opportunity to embark on a wide-ranging discourse. As a Congressional Globe journalist archly put it, the ex-president addressed the chamber “on the state of the Union.”

Although Adams expounded on numerous subjects, he focused on the most pressing issue of the moment: the rebellion in the Mexican province of Coahuila y Tejas (or, as Americans called the northern part of the province, Texas). Beginning in October 1835, “Texians,” as expatriate American settlers in Texas were known, had revolted against Mexican rule. By April 1836, the Texians had unexpectedly defeated the Mexican force sent to subdue them, achieved a fragile independence, and appealed to the United States for annexation. Jackson plainly favored annexation, and Adams accused numerous House members of “thirsting” to annex Texas as well.

In dire terms, Adams warned against expanding the boundaries of the United States to include Texas. His opposition to annexation may have surprised some of his colleagues in the House. As a U.S. senator from Massachusetts in 1803, he had been the only Federalist to vote in favor of Thomas Jefferson’s acquisition of the Louisiana Territory. In 1818, as secretary of state during the administration of James Monroe, he had defended Andrew Jackson when Jackson, then an army general, had invaded Spanish Florida. In 1821, Adams acquired Florida for the United States from Spain in return for setting the southwestern boundary of the United States at the Sabine River — the border between the modern states of Louisiana and Texas. With that agreement in place, Adams believed that U.S. expansion had gone far enough. Before the House in 1836, he argued that to extend the already “over-distended dominions” of the United States beyond the Sabine would be an untenable overreach. “Are you not large and unwieldy enough already?” he asked proponents of annexation. “Is your southern and southwestern frontier not sufficiently extensive? Not sufficiently feeble? Not sufficiently defenceless?” Annexation, he predicted, would precipitate a war with Mexico that the United States might well lose. Adams warned that Mexico had “the more recent experience of war” and “the greatest number of veteran warriors.” He reminded the House of ongoing U.S. military stumbles in Florida, where the United States had struggled to establish its control since acquiring the peninsula from Spain: “Is the success of your whole army, and all your veteran generals, and all your militia-calls, and all your mutinous volunteers against a miserable band of 500 or 600 invisible Seminole Indians, in your late campaign, an earnest of the energy and vigor with which you are ready to carry on that far otherwise formidable and complicated war?”

Not least of all, he warned that if Mexico were to carry the war into the United States, the invader would find numerous allies among slaves and especially among the Indigenous people whom the United States was in the process of removing to the Indian Territory on the border with Texas. “How far will it spread,” Adams asked, should Mexico invade the United States, “proclaiming emancipation to the slave and revenge to the native Indian”? In such an instance, “Where will be your negroes? Where will be that combined and concentrated mass of Indian tribes, whom, by an inconsiderate policy, you have expelled from their widely distant habitations, to embody them within a small compass on the very borders of Mexico, as if on purpose to give that country a nation of natural allies in their hostilities against you? Sir, you have a Mexican, an Indian, and a negro war upon your hands, and you are plunging yourself into it blindfold.”

Adams’ speech sparked a debate that consumed five hours, causing the House to stay in session long into the evening. That night, Adams, in his inimitably cramped handwriting, recorded the day’s events in his diary. He congratulated himself that he had succeeded in sapping the House’s enthusiasm for annexation. Indeed, Adams and his like-minded colleagues in Congress managed to deter annexation for nine more years.

Ornamental map of the United States and Mexico, 1846. [David Rumsey Historical Map Collection]

In Adams’ view, the United States, which between 1783 and 1836 had expanded its territory northwest into the Great Lakes region, west into the Great Plains, and south to the Gulf of Mexico, had swollen beyond its capacity either to exercise effective sovereignty over border regions or to defend its extended borders against imperial competitors. The U.S. presence in the borderlands, a multilateral and multiethnic region, was tenuous: until the 1840s, Britain dominated the region between the western Great Lakes and Oregon, while Spain and, later, Mexico controlled the region between Texas and California. The success of the Seminoles together with the escaped slaves who were allied with them in resisting U.S. forces in Florida was hardly exceptional. In the western Great Lakes region, the Ojibwe dominated. The British liberally supported the Ojibwe and other Indigenous nations in the Great Lakes region. In the event of another war with Britain, the natives were likely to once again be British allies as they had been in the War of 1812. As for the horse-mounted natives of the Great Plains such as the Comanches and the Lakota, the United States in 1836 could not even begin to imagine challenging their control of the grasslands. Likewise, the fear that an invasion by a foreign power on the southwestern border might spur a slave revolt was quite real; by promising freedom, the British had encouraged thousands of enslaved people to join them in fighting against the United States in both the Revolutionary War and the War of 1812. In the first decades of the 19th century, numerous slaves fled from Georgia and Louisiana to Florida and New Spain; once in Spanish territory, maroon communities encouraged further flight and, slaveholders feared, rebellion. In short, Adams was entirely correct that in the first decades of the 19th century, the United States maintained a relatively weak presence on its borders where it had to contend with powerful, autonomous native groups, fugitive slaves, and competing imperial powers.

Leaders such as Adams who in the first decades of the 19th century pondered the weaknesses of the United States in its border regions were in many respects confronting a new problem. Before 1800, the most profitable imperial holdings in the Americas were of two types: sugar plantations in the Caribbean and coastal Brazil; and Spain’s silver mines at Potosí in the Andes and the Bajío in Mexico. Almost everywhere else, until the end of the 18th century, the British, French, Spanish, and Portuguese empires in continental North and South America were primarily commercial and tributary rather than territorial. European imperial settlements on the American mainland, with the notable exceptions of the Spanish silver mines and a few other places such as Mexico’s Central Valley, hugged the coastlines. European empires primarily claimed sovereignty over vast interiors of the Americas based on the reciprocal exchange of gifts and tribute with native leaders and by virtue of commerce in animal products and slaves that European merchants carried on with the Indigenous people of continental interiors.

Thus, throughout much of British, French, and Spanish North America, European imperial claims to territory depended on the commercial and diplomatic loyalties of Indigenous people. European military forces occasionally launched punitive expeditions into the interior against natives who resisted these commercial and diplomatic arrangements but rarely managed, or even tried, to establish an enduring military presence. Imperial boundaries, in this scheme, remained only loosely defined.

This system, in which Indigenous people held considerable influence, began to change in the late 18th and early 19th centuries, as European empires shifted away from defining sovereignty in terms of relationships with Indigenous people and toward negotiating imperial boundaries with each other. In 1777, for instance, Spain and Portugal agreed in the first Treaty of San Ildefonso to create a joint boundary commission to survey the border between their South American empires, marginalizing the Indigenous nations who lived in those lands. When the United States and Spain agreed to a border between Georgia and Spanish Florida in 1795, they did not consult with the Seminoles who inhabited the territory. Indigenous people were similarly excluded in 1818, when the United States agreed to a treaty with Britain establishing the northern boundary of the United States and providing for joint Anglo-American occupation of Oregon. They were likewise left out in 1821, when Adams negotiated a treaty with Luis de Onís, a Spanish minister, that established the border between the United States and New Spain at the Sabine River. All these agreements belonged to a larger European and U.S. effort to sideline Indigenous people and negotiate imperial boundaries among themselves. European- and American-made maps reflected the shift in imperial mentalities: in the 17th and 18th centuries, when imperial claims depended on alliances with Indigenous people, maps of the North American interior abounded with the names of Indigenous nations. By the 19th century, similar maps had erased references to Indigenous nations and showed only empty space.

Yet while European powers and the United States could erase Indigenous nations from their maps, they could not so easily dispense with the necessity of dealing with autonomous and powerful Indigenous nations on the outskirts of their territories. In the first decades of the 19th century, the old, somewhat unpredictable system of imperial sovereignty contingent upon diplomatic and commercial relations with Indigenous people persisted even as the new territorial system based on diplomacy (and sometimes war) between empires was ascending. For example, when the United States achieved its independence from Britain in 1783, it acquired — on paper at least — an extensive territory between the Appalachians and the Mississippi River. In 1783, however, the borders spelled out in treaties remained less meaningful than commercial and diplomatic relations with Indigenous people. While the British formally ceded the trans-Appalachian region to the United States, they maintained for decades merchant outposts in what was nominally U.S. territory. The U.S. explorer Zebulon Pike encountered one such outpost on the Upper Mississippi River in January 1806: a North West Company trading post. Seeing “the flag of Great Britain” over the post in what was nominally U.S. territory, Pike wrote, “I felt indignant.” But there was little he could do to assert U.S. authority. 

More than just flying their flag in U.S. territory, the British, through their trade, retained the commercial and diplomatic allegiance of Indigenous people in the new US Northwest Territory. When the United States and Britain went to war six years after Pike stumbled across the British trading post, most of the Indigenous people in the Northwest Territory sided with the British. To the south, the Spanish had seized Florida from Britain during the American Revolution; the Florida peninsula almost immediately became a haven for fugitive slaves from the United States. The Spanish, who also controlled New Orleans, periodically inconvenienced American merchants by closing the mouth of the Mississippi River to commercial travel.

Between 1803 and 1821, the United States acquired both Florida and New Orleans by treaty. The United States thus removed those territories from the control of an imperial competitor but in so doing took on an extensive territory where it struggled to establish its sovereignty. Understanding the early 19th-century United States as weak relative to Indigenous people, escaped slaves, and imperial competitors contradicts both the popular and the scholarly view of the United States in this period. Most historians of what the historian Arthur M. Schlesinger Jr. once called “the age of Jackson” depict U.S. expansion not only as inexorable but as one of the defining characteristics of the period. According to this view, the United States in the first half of the 19th century was like a seething boiler that could barely contain the outward economic and cultural pressures within it: a virulent, racist hatred of Indigenous people; an all-but-insatiable desire for land; a dynamic, profitable, and expanding slave-based plantation system; an explosive market economy; and a self-righteous American missionary Protestantism that saw itself as a reforming beacon to the world.

Pictorial map of the Great West, 1848. [David Rumsey Historical Map Collection]

Expansion was not a national consensus, and the expansionism that Andrew Jackson advocated was always a politically divisive and contested issue. In 1819, by a vote of 107–100, Jackson only narrowly escaped censure in the House of Representatives for his unauthorized attacks against Spanish outposts and British subjects during an invasion of Spanish Florida the previous year; in 1830, Jackson’s Indian Removal Act barely passed the House of Representatives, 101–97; in 1832, an anti-Jackson coalition won a majority of the Senate; and beginning in 1836 and lasting for the next nine years, Adams and his congressional allies successfully deterred Texas annexation. Adams was one of numerous elected leaders — many of them Northeasterners who eventually coalesced into the Whig Party — who advocated strengthening U.S. commerce, manufacturing, and infrastructure within existing U.S. boundaries rather than overstretching U.S. power by sprawling across the continent. Adams understood a reality about the U.S. position in North America that “manifest destiny” obscures:  a relatively weak United States found itself engaged with powerful European imperial competitors, and even more powerful Indigenous nations, in a complicated struggle for sovereignty in several regions on its borders. Unable to simply impose its will, the U.S. often reached out into the borderlands through diplomacy or commerce. Manifest destiny was just one of many narrative visions for the borderlands; in the first decades of the 19th century, it was neither the dominant vision nor the most plausible. 

Excerpt adapted from The Age of the Borderlands: Indians, Slaves, and the Limits of Manifest Destiny, 1790–1850 by Andrew C. Isenberg. Copyright © 2025 by the University of North Carolina Press.

Buy This Book

Fri, 04 Jul 2025 08:44:08 +0000 https://historynewsnetwork.org/article/186034 https://historynewsnetwork.org/article/186034 0
Mutant Capitalism In Neal Stephenson’s Snow Crash (1992), a novel that channeled perfectly the libertarian imagination of the post–Cold War moment, the territory once known as the United States has been shattered into privatized spaces: franchise nations, apartheid burbclaves, and franchulets, a world of what I have called “crack-up capitalism.” The threat in the plot is the Raft, a maritime assemblage several miles across: a decommissioned aircraft carrier lashed to an oil tanker and countless container ships, freight carriers, “pleasure craft, sampans, junks, dhows, dinghies, life rafts, houseboats, makeshift structures built on air-filled oil drums and slabs of styrofoam.” The Raft “orbits the Pacific clockwise” bearing a cargo of “Refus” or refugees, welcomed aboard by an entrepreneurial tech evangelist who has just cornered the global fiber optic grid and has schemes to subjugate the population through a computer virus administered as a bitmap narcotic. The Raft’s passengers are dehumanized and anonymized: a mass of insects “dipping its myriad oars into the Pacific, like ant legs” at whose arrival the coastal residents of California live in terror, subscribing to a “twenty-four-hour Raft Report” to know when the “latest contingent of 25,000 starving Eurasians has cut itself loose” to swim ashore.

Stephenson’s descriptions are stomach-turning, indulging in a grotesque racist imagery of nonwhite danger. The Raft was the fodder for, as he wrote, “a hundred Hong Kong B-movies and blood-soaked Nipponese comic books.” As the race scientist and former National Review journalist Steve Sailer noted, the Raft also had an obvious antecedent: the “Last Chance Armada” of Jean Raspail’s 1973 novel, first published in French, The Camp of the Saints. In that book, a disabled messianic leader from the Calcutta slums boards millions of indigent Indians on a lashed-together fleet of old ships to travel West “in a welter of dung and debauch.” The novel revels in what one scholar calls “pornographic prose” in its depiction of coprophagy, incest, and pedophilia aboard the armada. The plot ends in an orgy of violence after what the author sees as the suicidal embrace of the armada by the foreigner-friendly French population.

The first English translation of The Camp of the Saints was published by Scribner’s in 1975 to many positive reviews. The cover image showed a single Caucasian hand holding a globe up and away from grasping brown hands, with a catch line reading: “a chilling novel about the end of the white world.” The book returned to public discussion during the first successful presidential campaign of Donald Trump as an alleged inspiration to his advisers Steve Bannon and Stephen Miller, but it was already a common touchstone decades earlier. It was reissued in 1986 by the white supremacist Noontide Press and in 1987 by the American Immigration Control Foundation (AICF), which, along with the Federation for American Immigration Reform (FAIR), helped mainstream anti-immigrant arguments in part by piggy-backing on the mailing lists of right-wing magazines to help seed a national movement.

In 1991, John Randolph Club (JRC) founding member Sam Francis described the book as “a kind of science fiction novel” that had become American reality. “The future is now,” he wrote. The vision of the maritime refugee indexed with the evening news in the early 1990s. There were more than 30,000 interceptions of Haitians at sea in 1992 and nearly 40,000 Cubans in 1994; in 1993, the Golden Venture ran aground in Rockaway Beach, carrying 300 Chinese would-be migrants. Raspail’s novel “forecasts the recent landing of the Golden Venture,” as one letter to the Washington Times put it in 1993. The Social Contract Press reissue featured a photo of Chinese men wrapped in blankets after disembarking, the vessel in the background. Introducing the novel, the nativist ideological entrepreneur and FAIR director John Tanton wrote that “the future has arrived,” citing the Golden Venture and other instances of maritime flight that had taken Raspail’s plot “out of a theorist’s realm and transposed it into real life.” “Fiction can be more powerful than fact,” wrote JRC member and American Renaissance founder Jared Taylor in a review of The Camp of the Saints. “The novel,” he wrote, “is a call to all whites to rekindle their sense of race, love of culture, and pride in history for he knows that without them we will disappear.”

The Camp of the Saints had a special place in the paleo imagination. Ahead of the first JRC meeting, the Ludwig von Mises Institute’s Lew Rockwell claimed partial credit for the book’s circulation in the United States in 1975. In his talk “Decomposing the Nation-State” at the Mont Pelerin Society in 1993, Rothbard wrote that he had previously dismissed the novel’s vision, but “as cultural and welfare-state problems have intensified, it became impossible to dismiss Raspail’s concerns any longer.” He referred to his proposal of privatizing all land and infrastructure discussed in the last chapter as a solution to the “Camp of the Saints problem.” When the JRC met in Chicago in December 1992, the conference was titled “Bosnia, USA” and Hans-Hermann Hoppe spoke in the lead-off session named after The Camp of the Saints.

The year between the first and second meeting of the JRC had been momentous. The Los Angeles riots in April, Buchanan’s run for president, and Rothbard’s proposal of a strategy of right-wing populism made 1992 look like, in the words of author John Ganz, “the year the clock broke.” Another notable event was the publication of an article in National Review by the scheduled keynote speaker at the club: the journalist Peter Brimelow, a naturalized U.S. citizen born in England in 1947. When the article was published as a book by Random House in 1995 with thanks given to Rockwell and Jeffrey Tucker at the Ludwig von Mises Institute (as well as his agent Andrew Wylie), Alien Nation was described as a “non-fiction horror story of a nation that is willfully but blindly pursuing a course of suicide.” Historian Aristide Zolberg writes that the book “marked the ascent to respectability of an explicitly white supremacist position … that had hitherto been confined in the United States to shadowy groups.” Alien Nation came in the immediate wake of the passage of Proposition 187 in California, blocking access to education and health services for undocumented immigrants, one of the earliest instances of local governments “trying to retake immigration control into their own hands.” “No writer has argued more effectively for this change of policy than Peter Brimelow,” wrote Brimelow’s former colleague at Forbes, David Frum. “No reformer can avoid grappling with [his] formidable work.”

In 1999, Brimelow took his project online — “fortunately the Internet came along,” as he put it later — founding the website VDARE.com, named after the first child born to white settlers in North America, Virginia Dare. Serving as what the Washington Post called a “platform for white nationalism,” the website has hosted prominent advocates of scientific racism like Jared Taylor, J. Philippe Rushton, and Steve Sailer as well as alt-right activists Richard Spencer and Jason Kessler.

The website is an amplifier for themes and tropes of the Far Right: a search yields more than 20,000 posts with the term “white genocide,” more than 13,000 with “race realism,” and 6,000 with “Great Replacement.” Brimelow is also proximate to more mainstream figures in the United States. He was hosted at the home of then-president Donald Trump’s economic adviser Larry Kudlow in 2018 and at the same time held a role at Fox reporting directly to Rupert Murdoch. Brimelow has become Jean Raspail’s spokesperson for the 1990s and 2000s.

 

Where does the resurgence of the Far Right come from? Scholars attempting to explain how apparently fringe political ideologies have moved to center stage since the election of Trump in 2016 have split into two camps. The first locates the origins of the Far Right in culture: racism, chauvinism, xenophobia, the “tribalism” of “white identity politics,” or a longing for “eternity.” As a group, these commentators seem to ignore the admonition from the Frankfurt School sociologist Max Horkheimer, repeated so often that it threatens to become a cliché, that “whoever is not willing to talk about capitalism should also keep quiet about fascism.”

Capitalism can be hard to find in this literature. A recent book on “the far right today” does not mention the term once. Four other books on the alt-right and the white power movement barely mention it, one of them only to say that the alt-right is “skeptical of global capitalism.” References to “identity” outnumber “capitalism” at a ratio of several dozen to one. The assumption seems to be that Far Right ideology is either post- or pre-material: it inhabits a space of culture detached from issues of production and distribution. This is startling given that the radical Right’s central issue is nonwhite immigration, an eminently economic issue with a vast specialized literature.

By contrast, the second school of interpretation finds the origins of the Far Right in the spirit of capitalism itself. Rather than a rejection of neoliberalism, they see the Far Right as a mutant form of it, shedding certain features like a commitment to multilateral trade governance or the virtues of outsourcing while doubling down on Social Darwinist principles of struggle in the market translated through hierarchical categories of race, nationality, and gender. Brimelow’s work helps us see how the nation is understood as both a racial and economic asset to the Far Right.

Brimelow is described variously as a “white nationalist,” “restrictionist,” or “Alt Right figurehead.” Yet he is almost never described the way he described himself: as a libertarian conservative or even a “libertarian ideologue.” It is rarely, if ever, noted that he was a fixture in the standard networks of neoliberal intellectuals seeking to rebuild the foundations of postwar capitalism. He spoke at a Mont Pelerin Society (MPS) regional meeting in Vancouver in 1983 alongside Margaret Thatcher’s speechwriter and later National Review editor John O’Sullivan. Brimelow’s interviews and lengthier features in Forbes in the late 1980s and 1990s drew almost exclusively from the MPS roster. This included profiles and interviews with Thomas Sowell (twice), Peter Bauer, Milton Friedman (twice for Forbes and twice for Fortune), and Murray Rothbard. His longer features were built around the research of Gordon Tullock, Hayek, Friedman, and MPS member Lawrence White. He wrote a glowing review of Milton and Rose Friedman’s memoirs, recounting Milton’s first trip overseas to the inaugural MPS meeting and praising the couple’s contributions to “the free-market revolution in economics that has overthrown the statist-Keynesian-socialist consensus.”

To describe Brimelow as nativist and white nationalist may be correct, but it threatens to banish his concerns from the domain of the rational and the economic. In fact, he was a typical member of a transnational milieu linking Thatcherite intellectuals taking their own version of a cultural turn around the Institute of Economic Affairs’ Social Affairs Unit with social scientists like Charles Murray and Richard J. Herrnstein concocting theories linking race, intelligence, and economic capacity as well as neoconservatives from the United States to Singapore to Japan rediscovering the relevance of “Asian values” for capitalist success. For the new fusionists of the free-market Right, the economic was not a pristine space quarantined from matters of biology, culture, tradition, and race. Rather, these thought worlds overlapped and melded with one another.

Brimelow’s first book was not about politics or race. It was called The Wall Street Gurus: How You Can Profit from Investment Newsletters, marketed alongside books like The Warning: The Coming Great Crash in the Stock Market and Wall Street Insiders: How You Can Watch Them and Profit. For Brimelow, as for the authors of those newsletters, investment was simultaneously a strategy for making money, leveraging symbolism, and accruing influence. We can understand his turn to whiteness as the outcome of a portfolio analysis. The nation was a safe asset. The pro-white play looked like a payday.

Excerpt adapted from Hayek’s Bastards: Race, Gold, IQ, and the Capitalism of the Far Right by Quinn Slobodian. Copyright © 2025 by Quinn Slobodian. Published by Zone Books.

Buy This Book

Fri, 04 Jul 2025 08:44:08 +0000 https://historynewsnetwork.org/article/186032 https://historynewsnetwork.org/article/186032 0
“The End Is Coming! The End Is Coming!” If you were a child during the late 1990s, there’s a good chance you either owned Beanie Babies or your parents went crazy over them. Beanies featured simple designs that inspired complex madness. A mixture of polyester, synthetic plush, and plastic pellets, Beanies were ordinary-looking children’s playthings that took the world by storm. Throughout the mid- to late 1990s, they became ubiquitous in American homes and fostered a community. By the end of the decade, the craze went haywire. In April 1998, a police department near Chicago removed weapons from the streets by offering a buyback program where people could exchange their guns for Beanie Babies. A few months later, in a foreshadowing of the inanity of contemporary politics, U.S. trade representative Charlene Barshefsky sparked controversy when Customs forced her to turn over a few dozen Beanie Babies she purchased while traveling to China with President Bill Clinton. “Instead of trying to reduce our $50 billion trade deficit with China,” stated Republican National Committee chairman Jim Nicholson, “our trade representative was scouring the street markets of Beijing grabbing up every illegal, black market ‘Beanie Baby’ she could get her hands on.” Citing “a source close to the White House delegation,” the Washington Post reported that Barshefsky turned over 40 Chinese Beanies.

Beanie Babies came with a red heart-shaped tag with the word “Ty” printed in large white letters. The name is an homage to Ty Warner, who created the Beanies in 1993. His company, Ty Inc., grew from a modest upstart near Chicago with a handful of employees into an operation running a 370,000-square-foot warehouse and generating $1.4 billion in sales by 1998.

Looking at a Beanie Baby would give the impression that it was a child’s toy. But what drove the revenue of Ty Inc. wasn’t parents buying toys for their children to play with. It was adults buying Beanies for themselves and treating them like financial instruments. The CD-ROM Ultimate Collector for Beanie Babies made a splash at the video game industry’s largest trade event in 1999. For $25, consumers received software that helped them organize their collections and track price changes. Ultimate Collector featured tax summaries and insurance reports. “It was no longer a child’s toy,” said one collector when recalling why she accumulated Beanies throughout the late ’90s. “It was the hunt for me.”

Despite selling millions of stuffed animals, the Ty company convinced consumers that Beanies were in short supply. In the late ’90s, every few months a particular Beanie Baby would be “retired,” which drove prices up. Online Beanie reselling became so common that when eBay filed paperwork with the U.S. Securities and Exchange Commission to become a publicly traded company in 1998, it cited the volatility of Beanie Baby sales as a risk factor to the company’s financial health. During the second fiscal quarter in 1998, eBay had “over 30,000 simultaneous auctions listed in its ‘Beanie Babies’ category,” the company stated. “A decline in the popularity of, or demand for, certain collectibles or other items sold through the eBay service could reduce the overall volume of transactions on the eBay service, resulting in reduced revenues. In addition, certain consumer ‘fads’ may temporarily inflate the volume of certain types of items listed on the eBay service, placing a significant strain upon the Company’s infrastructure and transaction capacity.” eBay was correct to be cautious about a fad. Beanie sales accounted for a tenth of its total revenues, but the gravy train would not last.

There was enough interest in this market that Beanie Baby price guides were produced to forecast resale values the way investment banking analysts make stock predictions. At the market’s peak, there were more than a hundred Beanie Baby price guides, often published by collectors who themselves had large investments in Beanie Babies. Unlike impartial observers, these collectors had a personal interest in hyping prices rather than counseling caution. At a time when Beanies were reselling for thousands of dollars, one price guide predicted they would appreciate another 8,000% over the following decade. Enough people acted on price guide recommendations that, for a brief time, their outrageous predictions became self-fulfilling, until the bubble popped.

Self Portrait with Toys, by Viola Frey, 1981. [Smithsonian American Art Museum]

Retiring Beanies initially drove up resales. Then Ty Inc. tried a more aggressive tactic. “For years, nothing has been hotter than those cuddly little animals with cute little names,” stated CBS Evening News correspondent Anthony Mason during a September 1999 segment. “But abruptly this week, the makers of Beanie Babies, Ty Incorporated, announced over the internet that it was over … By the turn of the century, Beanie Babies will become has beans.” The ushering in of the new millennium would include Y2K and Beanie Baby production stoppages. Beanie Baby web forums were full of apocalyptic posts with titles like “The End Is Coming! The End Is Coming!”

Prices did not rise as Ty hoped. More people came to the realization that stuffed animals that originally sold for $5 should not be reselling for $5,000. That people who banked their life savings on children’s collectibles were playing a fool’s game. That there was no value in investing in a mass-produced product whose entire worth is premised on speculation. Beanie collectors were enamored with old toys that sold for high prices. But when old toys become valuable it is because most toys get beat up when children play with them, so finding a toy in mint condition 30 years after it came out is a rarity that drives prices up. 

Beanies were collected by thousands of adults and stored in glass cases. They were produced in such high volume that the supply outstripped the demand. Finding an old mint-condition Beanie Baby is about as rare as finding a pothole on a city street. 

People who spent thousands of dollars chasing the Beanie fad had closets full of merchandise that was worth less than its original retail price. A soap opera actor who spent $100,000 on Beanies as an investment in his children’s college education saw his investment evaporate. That might sound laughable, but the actor’s son made a haunting documentary about the saga that displayed the humanity driving the craze and its unfortunate consequences. Collectors were left with credit card debt they couldn’t pay off and regret they couldn’t shake.

By 2004, Ty Warner claimed a loss of nearly $40 million on his tax return. In 2014, Warner was convicted of tax evasion. (Warner loathed all taxes, including road tolls. According to Zac Bissonnette, Warner instructed people he was riding with to “just throw pennies and keep driving! It’s an illegal tax!”) Like so many other rich men found guilty of serious crimes, he avoided jail time and remains wealthy. 

Excerpt adapted from 1999: The Year Low Culture Conquered America and Kickstarted Our Bizarre Times, available for preorder from University Press of Kansas. Copyright © 2025 by Ross Benes.

Buy This Book

Fri, 04 Jul 2025 08:44:08 +0000 https://historynewsnetwork.org/article/186030 https://historynewsnetwork.org/article/186030 0
Telling Chestnut Stories At one time, more than 4 billion American chestnut trees spread from southern Canada all the way to Mississippi and Alabama. While it was rarely the dominant tree in the forests, this giant of the eastern woodlands was hard to miss. It could stand over one hundred feet tall, the trunks straight and true.

To those who lived in the eastern United States, especially Appalachia, the tree was invaluable. It provided food for both people and animals and wood for cabins and fence posts. In cash-poor regions, the tree could even put some money in people’s pockets when they sold nuts to brokers to take to the cities of the Northeast. Some joked that the tree could take you from the cradle to the grave, as the wood was used to make both furniture for babies and caskets. It was certainly “the most useful tree.” In the early 20th century, however, the chestnut came under serious threat. A fungus, first identified in New York in 1904, began killing chestnut trees. It quickly spread south across Appalachia, resulting in the almost complete loss of the species by the mid-20th century. This loss had enormous ecological impacts on the forests of the eastern United States and contributed to a decline in the population of many species of wildlife, including turkeys, bears, and squirrels.

Today, while millions of American chestnut sprouts remain in the forests of the East, almost all the large trees, as well as most of the people who remember the trees’ dominant place in the forest ecosystem, are gone.

Since 1983, the American Chestnut Foundation (TACF) has taken the lead in restoring the American chestnut. While scientists coordinate the project, volunteers play an important part in planting and caring for trees in TACF test orchards. Chestnut stories are one of the tools TACF uses to connect with volunteers. In TACF’s publication, originally published as the Journal of the American Chestnut Foundation and known since 2015 as Chestnut: The New Journal of the American Chestnut Foundation, chestnut stories can take many forms — oral histories, essays, poems — but they all document the relationship between humankind and the American chestnut tree. Chestnut stories serve an important purpose: reminding people of the value of the species and the many ways people used the tree before its decline. In documenting the story of the American chestnut through the journal, in sharing and interpreting this story, and in using it to mobilize volunteers and resources, TACF has demonstrated the value that practices rooted in the field of public history and the study of memory can bring to the realm of environmental science. Public historians are well aware of the power that narrative has in prompting action and encouraging people to rethink the status quo. The chestnut stories documented by TACF help create a historical narrative and also serve as a justification for the reintroduction of the species into the modern landscape. As we deal with the long-term consequences of climate change, the emergence of new diseases, and the loss of habitat, the work of TACF can, perhaps, provide a road map for other organizations to employ science, technology, public history practices, and memories to mobilize people to solve environmental challenges.

 

While it is difficult to pinpoint the exact moment when the fungus that devastated the American chestnut arrived in North America, it is possible to date when and where someone first noticed its effects. In 1904, in the Bronx Zoological Park in New York City, employee H.W. Merkel noticed that large sections of the park’s chestnut trees’ bark were dying, and the overall health of the trees appeared to be deteriorating. Dr. W.A. Murrill, who worked for the New York Botanical Garden, was called in to study the affected trees. He identified the cause: a new fungus, which he called Diaporthe parasitica (the name was changed in 1978 to Cryphonectria parasitica). It is highly unlikely that the trees in the Bronx Zoo were the first to be infected by the fungus, which had come into the port of New York from Asia; more likely, the zoo was simply the first place where someone paid enough attention to notice.

The “chestnut blight,” as it was commonly called, spread quickly, infecting trees in other locations in New York, as well as in New Jersey and Connecticut. Scientists studying the blight, such as Dr. Haven Metcalf and Dr. J.F. Collins, published bulletins about the disease, which contained recommendations for how to slow the spread. These recommendations included checking nurseries for blighted trees and quarantining those suffering from the blight, creating blight-free zones where chestnuts were removed in the hopes that the blight’s progress would be stopped if there were no chestnut trees, and performing surgery to remove cankers from infected trees. Unfortunately, the advice they gave did not stop the blight, and it began pushing farther south. In Pennsylvania, the Chestnut Blight Commission had permission to enter private property and remove trees infected with or threatened by the blight. In all, the commission spent over $500,000 to fight the blight. But, again, their efforts did not halt the spread. The blight reached West Virginia by 1911, pushing into the heart of Appalachia, where the tree had an important place in the lives of mountain communities. Combined with ink disease, which had been attacking chestnut trees in the southern end of the tree’s range since the 19th century, the blight caused widespread devastation. By 1950, 80% of the American chestnut trees were gone. In all, the upper portion of over 3.5 billion trees died, the equivalent of approximately 9 million acres of forest. The root structure of many trees, however, did not die, and stump sprouts continue to emerge from root systems today, well over a century later. Unfortunately, before they are able to grow very large, these stump sprouts become infected by the blight and die back. So today, while millions of stump sprouts do exist, few mature trees are left.

Chestnut blight, 2009. Photograph by Daderot. [Wikimedia Commons]

TACF eventually took the lead in chestnut restoration efforts. It began formally documenting its progress in 1985 with the publication of the Journal of the American Chestnut Foundation, published since 2015 as Chestnut: The New Journal of the American Chestnut Foundation. In the first edition, the then editor Donald Willeke lays out the mission of the journal: “We hope that it will be both a scientific journal and a means of communicating news and developments about the American Chestnut to dedicated non-scientists (such as the lawyer who is writing this Introduction) who care about trees in general, and the American Chestnut in particular, and wish to see it resume its place as the crowning glory of the American deciduous woodlands.” Over the years, the journal has moved from a volunteer publication released once or twice a year (depending on the year and on capacity) to a glossy, professional magazine released three times a year.

In the journal, the progress of the backcross breeding program is broken down into terms nonscientists can understand. The journal, however, is not only about the science behind the restoration effort. One of the most significant sections of the journal in its early years was the “Memories” section, which documented “chestnut stories.” While many of the memories included in the journal came to TACF unsolicited, TACF also recognized the importance of documenting people’s chestnut stories in a more organized fashion. In 2003, the then membership director Elizabeth Daniels wrote about the American Chestnut Oral History project, which aimed to preserve chestnut stories for future generations. In the spring of 2006, the then editor Jeanne Coleman let readers know she was interested in gathering chestnut stories. Stories came pouring in, and as Coleman says in the fall 2006 issue, “These stories are heartwarming [and] often funny.” Today, in essence, the journal itself acts as the archive of TACF’s chestnut stories, preserving and sharing them simultaneously.

Untitled (Squirrels in a Chestnut Tree), by Susan Catherine Moore Waters, c. 1875. [The Metropolitan Museum of Art]

A review of all 79 issues of TACF’s journal available online as of January 2022, along with other sources, makes the significance of the chestnut stories quite clear. The work of the scientists engaged in backcross breeding and genetic modification is essential to the restoration of the chestnut. But the success of TACF also has come from thousands of members and volunteers who have supported the work of the scientists. From the beginning, TACF understood the importance of engaging with people outside traditional scientific research circles to accomplish restoration.

People who mourned the past also supported work to bring about a future where the chestnut once again plays an important role in the ecosystems of the eastern woodlands. TACF members have been, per scientist Philip Rutter from the University of Minnesota, “trying to do something about the problem rather than just lament the loss,” which certainly challenges the argument that nostalgia can reduce the ability to act in the present. 

While maybe not quite as tall or as wide as remembered in chestnut stories, the American chestnut tree occupied a significant place in the forest and in the lives of those who lived under its spreading branches—and it is certainly worthy of the work to restore it. Chestnut stories document this significant place chestnuts held in the forest ecosystem, and the sharing of the stories reminds people of the value the tree brought to Americans before the blight destroyed it. In an interview with Charles A. Miller, William Good remembers how farmers fattened their hogs off chestnuts: “In the fall, because people didn’t raise corn to feed their hogs, farmers would let them run in the mountain where they fattened up on chestnuts. The hogs would have to eat through the big burs on the outside to get the nut out. . . . The hogs must have liked the nuts so much they would chew through them until their mouths were bleeding.”

In an interview that appears in the 1980 folklore collection Foxfire 6 and is reprinted in TACF’s journal, Noel Moore recollects that people in the Appalachians did not build fences to keep their stock in; they instead fenced around their homes and fields to keep out the free-range stock wandering the woods. Each fall, farmers would round up and butcher their hogs that had grown fat on acorns and chestnuts. Chestnuts also served as an important food source for wild animals, including turkeys, black bears, white-tailed deer, gray fox squirrels, and the now extinct chestnut moth. These animals, in turn, fed those who hunted them. Chestnuts also were an important food source for people. Myra McAllister, who grew up in Virginia, recalls that she liked chestnuts “raw, boiled, and roasted by an open fire.” Cecil Clink, who grew up in North Bridgewater, Pennsylvania, remembers filling old flour sacks with the nuts, which his mother would store in the family’s “butry,” or buttery, “with the smoked hams. . . . [They ate] the nuts boiled, or put them on the cook stove and roast them.” Other people made stuffing and traditional Cherokee bread out of the nuts, though they could not grind the nuts into flour because they were too soft; the nuts had to be pounded by hand into flour instead. And it was not just the nuts themselves that people loved. Noel Moore remembers the taste of the honey that bees made from the chestnut blossoms. The leaves also had medicinal uses: the Mohegans made tea to treat rheumatism and colds, and the Cherokee made cough syrup.

Some stories recall the act of selling chestnuts, which gave many families cash that they might otherwise not have had—likely making it a memorable moment. In Where There Are Mountains, Donald Davis shares the chestnut story of John McCaulley, who as a young man had gathered chestnuts for sale. The nuts he gathered sold for four dollars a bushel in Knoxville, Tennessee—and McCaulley could gather up to seven bushels a day. Jake Waldroop remembers seeing wagons loaded with chestnuts in his northeast Georgia mountain community. The wagons headed to “Tocca, Lavonia, Athens, all the way to Atlanta, Georgia.” Noel Moore recalls seeing families coming from the mountains and heading to the store in the fall, laden with bags of chestnuts. They traded the bags for “coffee and sugar and flour and things that they had to buy to live on through the winter.” Exchanging chestnuts for supplies or cash was “much less risky than moonshining.”

Chestnutting, by Winslow Homer, 1870. [Wikimedia Commons]

To the north in Vittoria, Ontario, Donald McCall, whose father owned a store in the village, recollects that “farmers counted on the money from their chestnuts to pay taxes on the farm.” The trees themselves also had value. William B. Wood recalls that his father tried to save the family farm during the Great Depression by felling and selling the wood of a huge chestnut tree dead from the blight that Wood calls the “Chestnut Ghost.” Unfortunately, the plan did not work, and the family lost their farm.

Other stories connect to the “usefulness” of the tree. Because chestnut wood is “even grained and durable . . . light in weight and easily worked,” the tree was used for a wide variety of purposes. Georgia Miller, who was 101 when she wrote “Chestnuts before the Blight,” recalls the split rail fences that lined the edges of pastures. The chestnut wood split easily and lasted longer than that of other species, making it a good material for what some called “snake fences.” Daniel Hallett, born in New Jersey in 1911, says his family used chestnut to trim doors and windows and also for chair rails in the houses they built. Dr. Edwin Flinn’s story (told by Dr. William Lord in 2014), which focuses on the extraction of tannins from dead chestnut trees, shows that the tree remained valuable even after the blight struck.

Chestnut stories recount more than memories of the tree’s usefulness or the role it played in Indigenous and rural economies. Many stories have documented how an encounter with the tree mobilized someone toward engagement with the restoration effort, demonstrating that chestnut stories can provide a pathway to a wider recognition of the natural world. Patrick Chamberlin describes such an experience in “A Practical Way for the Layman to Participate in Breeding Resistance into the American Chestnut.” Chamberlin tells readers how his grandmother used to reminisce about the tree when he was a young boy and then how he came across a burr from an American chestnut while in high school. He started looking for trees as he explored the woods on his parents’ farm. Eventually, while wandering near the old homestead site where his grandmother grew up, he came across two flowering chestnut trees. Returning later in the season, he found nuts from the trees. Through this experience, Chamberlin became involved in the backcross breeding program—and, at the end of his essay, encourages others to do the same. A chestnut story from the distant reaches of his youth started him on his journey, and science helped him continue his work into the present. Fred Hebard, who directed TACF’s Meadowview Research Farms for twenty-six years, saw his first American chestnut sprout while helping round up an escaped cow with a farmer he worked for. When the farmer told him the story of the American chestnut, Hebard changed his college major, earned a PhD in plant pathology, and began researching the chestnut. It became his life’s work.

Reprinted from Branching Out: The Public History of Trees. Copyright © 2025 by University of Massachusetts Press. Published by the University of Massachusetts Press.

Buy This Book

Fri, 04 Jul 2025 08:44:08 +0000 https://historynewsnetwork.org/article/186029 https://historynewsnetwork.org/article/186029 0
Scared Out of the Community Our featured weekly excerpts usually spotlight new history titles, but sometimes the news of the day makes returning to past scholarship, responding to different times and looking at the past from different contexts, a useful endeavor. This is the first entry in a series we hope to revisit from time to time, excerpting books from previous decades in order to bring the history they document to new audiences. Below is an excerpt adapted from Unwanted Mexican Americans in the Great Depression: Repatriation Pressures, 1929–1939, by Abraham Hoffman, published in 1974 by the University of Arizona Press. You can read the entire book online here

The old man entered the circular park, looked around, and sat down on one of the many benches placed there for the use of the town’s citizens. Several hundred people, mostly men, were also in the park, enjoying the afternoon sun. Sitting in the park enabled the old man to forget the reason that had brought him there. The deepening economic depression had cost him his job, and work was hard to find.

A sudden commotion startled the old man out of his reverie. Without warning, uniformed policemen surrounded the park, blocking all exits. A voice filled with authority ordered everyone to remain where he was. While the policemen guarded the exits, government agents methodically quizzed each of the frightened people, demanding identification papers, documents, or passports. With shaking hands the old man produced a dog-eared, yellowed visa. Only the other day, he had considered throwing it away. After all, he had entered the country so long ago…

The agent inspected the papers and barked several questions at the old man. Haltingly, he answered as best he could, for despite his years of residence in the country he had learned the language only imperfectly. With a nod of approval, the officer returned the papers. The old man sat down again; a sense of relief washed over him.

The agents continued their interrogation, and after about an hour everyone in the park had been checked and cleared. Or almost everyone. Seventeen men were placed in cars and taken away. The inspection over, the policemen left the park to the people. Few cared to remain, however, and in a few moments the place was deserted.

 

The time was 1931; the place, Los Angeles, California, in the city’s downtown plaza. The government agents were officers in the Department of Labor’s Bureau of Immigration, assisted by local policemen. Their goal was the apprehension of aliens who had entered the United States illegally.

Unlike many post-World War II aliens who knowingly entered in violation of immigration laws, immigrants prior to the Great Depression entered the United States at a time when the government’s views on immigration were in flux, moving from unrestricted entry to severe restriction. Many aliens found themselves confused by the tightening noose of regulations; one immigrant might enter with one law in effect, but his younger brother, coming to the country a few years later, might find new rules — or new interpretations of old rules — impeding his entrance.

With the onset of the depression, pressure mounted to remove aliens from the relief rolls and, almost paradoxically, from the jobs they were said to hold at the expense of American citizens. In the Southwest, immigration service officers searched for Mexican immigrants, while local welfare agencies sought to lighten their relief load by urging Mexican indigents to volunteer for repatriation. The most ambitious of these repatriation programs was organized in Los Angeles County, an area with the largest concentration of Mexicans outside of Mexico City.

Not all of the repatriates, however, departed solely under pressure from the Anglos. Many Mexicans who had achieved varying degrees of financial success decided on their own to return to Mexico, taking with them the automobiles, clothing, radios, and other material possessions they had accumulated. The Mexican government, vacillating between the desire to lure these people home and the fear that their arrival would add to an already existing labor surplus, sporadically launched land reform programs designed for repatriados. Between 1929 and 1939 approximately half a million Mexicans left the United States. Many of the departing families included American-born children to whom Mexico, not the United States, was the foreign land.

The peak month in which Mexicans recrossed the border was November 1931, and in all subsequent months the figures generally declined. Yet it was after this date that the number of cities shipping out Mexican families increased. Even after the massive federal relief programs of the New Deal were begun in 1933, cities such as Los Angeles, Chicago, and Detroit still attempted to persuade indigent Mexicans to leave.

With the start of World War II, Mexican immigration was renewed, when the United States and Mexico concluded an agreement to permit braceros to enter the United States. A system of permits and visas for varying periods testifies to the evolution of border regulations; their abuse and misuse bear witness to the difficulties of making such a system coherent. 

No other locality matched the county of Los Angeles in its ambitious efforts to rid itself of the Mexican immigrant during the depression years. By defining people along cultural instead of national lines, county officials deprived American children of Mexican descent of rights guaranteed them by the Constitution. On the federal level, no other region in the country received as much attention from immigration officials as Southern California. Because of the tremendous growth of this region after World War II, Southern California’s service as a locus for deportation and repatriation of Mexican immigrants is little remembered. To the Mexican-American community, however, repatriation is a painful memory. 

 

In 1931, various elements in Los Angeles had indicated support for the idea of restricting jobs on public works projects to American citizens. Motions were presented and passed by the Los Angeles city council and the county board of supervisors, while the Independent Order of Veterans of Los Angeles called for the deportation of illegal aliens as a means of aiding jobless relief.

The board of supervisors went so far as to endorse legislation pending in Congress and in the state legislature, which would bar aliens who had entered the country illegally from “establishing a residence, holding a position, or engaging in any form of business.” Supervisor John R. Quinn believed that such legislation would provide a sort of cure-all for all problems generated by illegal aliens, whom he believed numbered “between 200,000 and 400,000 in California alone.” Said Quinn in two remarkably all-inclusive sentences:

If we were rid of the aliens who have entered this country illegally since 1931 ... our present unemployment problem would shrink to the proportions of a relatively unimportant flat spot in business. In ridding ourselves of the criminally undesirable alien we will put an end to a large part of our crime and law enforcement problem, probably saving many good American lives and certainly millions of dollars for law enforcement against people who have no business in this country.

Quinn also believed the “Red problem” would disappear with the deportation of these aliens. 

It was in this atmosphere that Charles P. Visel, head of the Los Angeles Citizen’s Committee on Coordination of Unemployment Relief, published a press release in city newspapers. The statement announced a deportation campaign and stressed that help from adjoining districts would be given the local office of the Bureau of Immigration. Each newspaper printed the text as it saw fit, so that while one newspaper printed sections of it verbatim, another summarized and paraphrased. Certain embellishments were added. “Aliens who are deportable will save themselves trouble and expense,” suggested the Los Angeles Illustrated Daily News on January 26, 1931, “by arranging their departure at once.” On that same day, the Examiner, a Hearst paper, announced, without going into any qualifying details, that “Deportable aliens include Mexicans, Japanese, Chinese, and others.”

As the days passed, follow-up stories and editorials kept the public aware of the project. The Express two days later editorially endorsed restrictionist legislation and called for compulsory alien registration. On January 29, the Times quoted Visel, who urged “all nondeportable aliens who are without credentials or who have not registered to register at once, as those having papers will save themselves a great deal of annoyance and trouble in the very near future. This is a constructive suggestion.” The impending arrival of the special agents from Washington, DC, and other immigration districts was made known, the word being given by Visel to the newspapers. 

La Opinión, the leading Spanish-language newspaper in Los Angeles, published an extensive article on January 29. With a major headline spread across page one, the newspaper quoted from Visel’s release and from the versions of it given by the Times and the Illustrated Daily News. La Opinión’s article pointedly stressed that the deportation campaign was being aimed primarily at those of Mexican nationality.

 

Commencing February 3, Supervisor William F. Watkins of the Bureau of Immigration and his men, with the assistance of police and deputies, began ferreting out aliens in Los Angeles. By Saturday 35 deportable aliens had been apprehended. Of this number, eight were immediately returned to Mexico by the “voluntary departure” method, while others chose the same procedure rather than undergo a formal hearing. Several aliens were held for formal deportation on charges of violating the criminal, immoral, or undesirable class provisions of the immigration laws. Five additional immigration inspectors arrived to provide assistance, and five more were shortly expected.

On Friday the 13th, with the assistance of 13 sheriff’s deputies led by Captain William J. Bright of the sheriff’s homicide detail, the immigration agents staged a raid in the El Monte area. This action was given prominence in the Sunday editions of the Times and the Examiner. Watkins wrote to Robe Carl White, assistant labor secretary, that such coverage was “unfortunate from our standpoint,” because the impression was given by the articles that every district in Los Angeles County known to have aliens living there would be investigated. “Our attitude in regard to publicity was made known to the authorities working with us in this matter,” Watkins complained, “but somehow the information found its way into the papers.”

Considering the announcements from Walter E. Carr, the Los Angeles district director of immigration, that no ethnic group was being singled out and that aliens with criminal records were the Bureau of Immigration’s primary interest, the aliens captured in the Friday the 13th raid could only have made the Mexican community wary of official statements. Three hundred people were stopped and questioned: from this number, the immigration agents jailed 13, and 12 of them were Mexicans. The Examiner conveniently supplied the public with the names, ages, occupations, birthplaces, years in the United States, and years or months in Los Angeles County, while the Times was content just to supply the names.

While generalizations about the people stopped, questioned, and occasionally detained are impossible, the assertion that the county’s aliens were either criminals or people holding jobs that could only be held by citizens did not apply to these arrested suspects. Of the twelve Mexicans arrested, the most recent arrival in the United States had come eight months earlier, while three had been in the United States at least seven years, one for thirteen years, and another was classified as an “American-born Mexican,” a term which carried no clear meaning, inasmuch as the charge against the suspects was illegal entry. Eleven of the twelve gave their occupation as laborer; the twelfth said he was a waiter.

 

As Watkins pursued the search for deportable aliens, he observed that the job became progressively difficult:

After the first few roundups of aliens ... there was noticeable falling off in the number of contrabands apprehended. The newspaper publicity which attended our efforts and the word which passed between the alien groups undoubtedly caused great numbers of them to seek concealment.

After several forays into East Los Angeles, the agents found the streets deserted, with local merchants complaining that the investigations were bad for business. In the rural sections of the county surveyed by Watkins’ men, whole families disappeared from sight. Watkins also began to appreciate the extent of Southern California’s residential sprawl. He observed that the Belvedere section, according to the 1930 census, might hold as many as 60,000 Mexicans.

The Mexican and other ethnic communities were not about to take the investigations passively. La Opinión railed at officials for the raids, while ethnic brotherhood associations gave advice and assistance. A meeting of over one hundred Mexican and Mexican American businessmen on the evening of February 16 resulted in the organization of the Mexican Chamber of Commerce in Los Angeles, and a pledge to carry their complaints about the treatment of Mexican nationals to both Mexico City and Washington, DC. Mexican merchants in Los Angeles, who catered to the trade of their ethnic group, felt that their business had been adversely affected, since Mexicans living in outlying areas now hesitated to come into Los Angeles and risk harassment. Sheriff William Traeger’s deputies in particular were criticized for rounding up Mexicans in large groups and taking them to jail without checking whether anyone in the group had a passport or proof of entry.

Mexican Consul Rafael de la Colina had been working tirelessly on behalf of destitute Mexicans in need of aid or desiring repatriation. Much of his time was occupied with meeting immigration officials who kept assuring him that the Mexicans were not being singled out for deportation. He also warned against unscrupulous individuals who were taking advantage of Mexican nationals by soliciting funds for charity and issuing bogus affidavits to Mexicans who had lost their papers.

The Japanese community also expressed its hostility to the immigration officials. When several agents stopped to investigate some suspected Japanese aliens, the owner of the ranch employing the aliens threatened to shoot the inspector “if he had a gun.” Japanese people obstinately refused to answer any questions, and Watkins believed that an attorney had been retained by the Japanese for the purpose of circumventing the immigration laws. 

Despite the adverse reaction to and public knowledge of the drive on aliens, Watkins persisted. “I am fully convinced that there is an extensive field here for deportation work and as we can gradually absorb same it is expected [sic] to ask for additional help,” he stated. Responding to the charges of dragnet methods, he notified his superiors in Washington:

I have tried to be extremely careful to avoid the holding of aliens by or for this Service who are not deportable and to this end it is our endeavor to immediately release at the local point of investigation any alien who is not found to be deportable as soon as his examination is completed.

 

On February 21, 1931, Watkins wrote to White, and the following month to Visel, that 230 aliens had been deported in formal proceedings, of whom 110 were Mexican nationals, and that 159 additional Mexican aliens had chosen the voluntary departure option to return to Mexico.

These figures revealed that seven out of ten persons deported in the Southern California antialien drive were Mexicans. By the supervisor’s own admission, in order to capture the 389 aliens successfully prosecuted during this period, Watkins and his men had to round up and question somewhere between 3,000 and 4,000 people — truly a monumental task.

The effect of the drive on the Mexican community was traumatic. Many of the aliens apprehended had never regularized an illegal entry that might have been made years before; beyond that technicality, to call them criminals is to misapply the term. The pressure on the Mexican community from the deportation campaign contributed significantly to the huge repatriation from Los Angeles that followed the antialien drive. But this seemed of little concern to the head of the Citizens Committee on Coordination of Unemployment Relief. By the third week in March, an exuberant Visel could write to Labor Secretary William N. Doak:

Six weeks have elapsed since we have received ... Mr. Watkins, in reply to our request for deportable alien relief in this district. We wish to compliment your department for his efficiency, aggressiveness, resourcefulness, and the altogether sane way in which he is endeavoring and is getting concrete results.

The exodus of aliens deportable and otherwise who have been scared out of the community has undoubtedly left many jobs which have been taken up by other persons (not deportable) and citizens of the United States and our municipality. The exodus still continues.

We are very much impressed by the methods used and the constructive results steadily being accomplished.

Our compliments to you, Sir, and to this branch of your service.

However much Visel’s interpretation of the benefits derived from the campaign squared with reality, the Department of Labor was no longer as eager to endorse the Los Angeles coordinator, or even to imply the existence of an endorsement. Perhaps the department feared any such reply might be converted into another publicity release. At any rate, with the Nation and the New Republic lambasting the department, Doak shied away from making a personal reply. Visel’s letter was answered by Assistant Secretary W.W. Husband, who acknowledged Visel’s message and then circumspectly stated:

It is the purpose of this Department that the deportation provisions of our immigration laws shall be carried out to the fullest possible extent but the Department is equally desirous that such activities shall be carried out strictly in accordance with law.

Excerpt adapted from Unwanted Mexican Americans in the Great Depression: Repatriation Pressures, 1929–1939 by Abraham Hoffman. Copyright © 1974 by The Arizona Board of Regents. Used with permission of the publisher, the University of Arizona Press. All rights reserved.

Creating the “Senior Citizen” Political Identity

A mass social movement of the elderly was arguably the major force behind Social Security old-age pensions: the Townsend movement, named after its founder, Francis Townsend, M.D.

The Townsend movement was by far the largest social movement of the 1930s — larger than the protests of the unemployed and labor union organizing taken together. Never before had the elderly organized to advance their interests as old people. Yet despite its size and originality, it is the least studied social movement in American history. Scholars and policy wonks of its time focused almost exclusively on debunking the Townsend plan, pointing out its economic fallacies and labeling its supporters naïve and crackpot. Very few contemporary scholars have studied it. I find myself wondering, is this a sign of disdain for old people, precisely what the movement was challenging?

The Townsend movement’s influence continued even after the 1935 passage of the Social Security Act. It created a “senior citizen” political identity, now a powerful voting bloc. Townsenders insisted that the elderly should become a “guiding force in all things, political, social and moral.” This was a claim to wisdom and relevance at a time when the culture was increasingly disrespectful of old people. That claim allowed a single-issue social movement to consider itself a patriotic cause, one that could improve the well-being of the entire nation. Part of the Townsend movement’s appeal was that it could be both narrow and wide: its narrowness made it recognizable and straightforward, its message stickier; its width made it selfless. Its genius lay in joining its members’ material interests with altruism. Townsenders saw themselves as simultaneously beneficiaries of the nation and contributors to it, givers as well as takers.

Car in Columbus, Kansas, with sign supporting the plan of Dr. Francis Townsend to create a nationwide pension plan for the elderly, 1936. Photograph by Arthur Rothstein. [Wikimedia Commons]

Dr. Townsend was aware that the United States was alone among developed countries in providing no government old-age pensions. He published eight letters on the subject in the Long Beach Press Telegram between September 30, 1933 and February 20, 1934, and in those five months found he had jump-started a social movement. He soon sketched out an actual pension plan. Recruiting two partners — his brother Walter L. Townsend and real estate broker Robert Earle Clements — he created Old Age Revolving Pensions, Limited (OARP). The organization thus acquired two sorts of leaders: the doctor who led the social movement and the shrewd commission-earning businessmen who built the organization. This division of labor made the movement a juggernaut, but it also produced conflicts and allegations of corruption.

From early on it was clear that a social movement was arising. OARP’s newsletter, The Modern Crusader, soon sold 100,000 copies — granted, it cost only two cents. The proposal generated excitement throughout California, and OARP chapters appeared so fast that the headquarters could not keep track.

A committee, allegedly including statisticians, began drafting legislation, and within a year John McGroarty, the 73-year-old poet laureate of California, got himself elected to Congress, where he introduced the first Townsend bill, HR 3977. (The plan would be revised several times.) Its major provisions were: Every American citizen aged 60 or older would receive a monthly pension of $200, provided that they retired and refrained from wage-earning. Younger people who were “hopelessly invalided or crippled” would receive the same pension. Originally the plan proposed funding the pensions through a 2 percent sales tax — a regressive tax that would have disproportionately burdened low-income people. Later the plan substituted a tax on transactions, which continued to be regressive, would have raised commodity prices exponentially, and would have provided an incentive for vertical integration.

When McGroarty first introduced the bill, he presented it not as a pension proposal but as a plan for economic recovery, a claim often repeated by its supporters, one of whom called it a “big business” plan. It would work because the legislation would require each stipend to be spent within a month. The plan thus called itself “revolving” pensions on the theory that, after the first month, what was paid out would be recompensed by taxes, as if the same money would be cycling through the economy. The pensions would thus stimulate an economy in deep depression by boosting consumer spending. 

Even better, Representative McGroarty argued that freeing up jobs held by older people would open jobs for younger people — 4 million jobs would allegedly become available. Retirement would then allow elders to become a “service class” of volunteers doing charitable work; this would allow government to operate at a “high standard,” as a supportive newspaper put it. To the criticism that government stipends would encourage passivity and dependence, Dr. Townsend responded that volunteerism should be a fundamental aspect of active citizenship. As he put it, the “secondary purpose of Townsend clubs is a desperate fight to continue the democratic spirit and form of government in these United States.” He argued that his plan would end child labor and reduce or even do away with crime, which resulted, in Townsend ideology, from poverty and unemployment. It would end war, which was also the result of poverty and inequality. Dr. Townsend’s arguments became ever more utopian and less realistic — an unusual trajectory, as over time most social movements make compromises, and their goals become more modest.

 

Experts in groups such as the American Association for Old Age Security, the American Association for Labor Legislation, and the National Conference of Social Work had been discussing possible welfare programs since the 1920s. Some of them would participate in writing the Social Security Act of 1935, including lions of social reform Edwin Witte, John R. Commons and Arthur Altmeyer, and social democratic feminists Grace Abbott, Sophonisba Breckenridge, Florence Kelley, Julia Lathrop, and Mary van Kleeck (many of them part of the Hull-House network). They had promoted the 1933 Dill-Connery bill, which would have provided federal grants-in-aid for the elderly, to be matched by the states, in amounts up to $15 a month per person. It passed the House in 1933 and 1934, but failed both times in the Senate. President Roosevelt did not support it, and Massachusetts sued the Treasury Department, arguing that the program was an attack on the constitutionally reserved powers of the states. Though that argument was rejected by the conservative Supreme Court, the bill’s failure suggests the strength of resistance to such welfare expenditure — and the difficulty of overcoming opposition to the Social Security Act a few years later.

Dill-Connery’s stipends would have been too minuscule to help low-income people, and they would have been controlled by state governments, which almost guaranteed that nonwhites would be excluded. The Social Security Act to come would have equally great limitations. It excluded the majority of Americans — farmworkers, domestic workers, and most other employed women — who worked mainly for small employers who were not required to participate. Unemployed women were expected to share a husband’s stipend. Divorced women would have no entitlement to an ex-husband’s pension, and other unmarried women would get nothing.

The Townsend plan was better. True, it relied on discriminatory funding, like Social Security. Townsend proposed funding by sales taxes, while Social Security was funded by a percentage of earnings, so the poor who needed help most would get least. But the universality and the size of Townsend plan pensions would have mitigated inequality, by providing the same level of support to all Americans. Moreover, it would include people of both sexes and all races, a nondiscriminatory policy that might have set a precedent for future programs. The Townsend plan might also have had ideological influence, contributing to a positive view of government responsibility for the public welfare. Put simply, the Townsend plan was advancing a democratic understanding of citizens’ entitlements. By contrast, most New Deal programs were discriminatory, offering more to those who needed less, by excluding the great majority of people of color and white women.

Although the Townsend plan would have been redistributive across class, sex, and race lines and could thus be categorized as a left or progressive plan, it was also redistributive along age lines, and this was problematic. It called for transferring massive resources from young and middle-aged adults to older ones. One opponent calculated that it would give half the national income to one-eleventh of the population. A historian recently estimated that a quarter of U.S. GDP would move from those under 60 to the elderly. Either figure confirms the plan’s unfairness toward younger people and their needs. Townsenders countered with a moral argument: “We supported our children in youth, is it not right and just that in old age we shall be taken care of by youth?” This sentiment, consistent with traditional family values, brought in socially conservative supporters.

Dr. Townsend’s Socialist Party history, during a period of the party’s strength, must surely have influenced his concern to help the needy. Nevertheless he took care to dissociate his plan from socialism. He frequently insisted that the plan did not undercut the profit system “or any part of the present administration of business.” This political ambiguity made the plan seem inconsistent, even incoherent to its opponents. Yet it was a “masterly synthesis of conservatism and radicalism,” in the words of one scholar. Townsend supporters were not naïvely “falling for” this political fusion of left and right, as their opponents charged. While Townsenders were supporting an impossible means of financing the pensions, as opponents pointed out repeatedly in every conceivable medium, they might be classified as intuitive social democrats — believing, and hoping, that a rich capitalist country could become a welfare state. For some, that belief was more emotional than political, and few of them had a broad conception of a welfare state. But the critics’ disdainful appraisal of Townsenders as fools was itself foolish. They were as educated and informed as any middle-class Americans.

While anyone could join and everyone would be entitled to a pension in the Townsend plan, the movement’s racial and religious composition was extremely homogeneous and almost identical with that of the Klan — white and Protestant, particularly evangelical. One writer commented that “one sees no foreign-looking faces.” There were a few exceptions. In one upstate New York industrial county, most votes for the Townsend/Democratic Party candidate came from immigrant voters, and one organizer pleaded for literature in Yiddish, Polish, and Italian. But the national Townsend organization never crafted appeals to immigrants, “ethnics,” or Black Americans. There were a few African American clubs and a few integrated clubs, mostly in California — Los Angeles, Long Beach, Oakland, Stockton — but the overwhelming majority of clubs were 100% white.

Townsend national leaders probably had little concern for elderly Black people or people of color in general. The demographics of southern California may have played a role here: in 1930 African Americans constituted only 1.4% of the state population. On the other hand, the state’s population included tens of thousands of people of color who rarely appear in material by or about Townsend: 415,000 people of Hispanic origin, about 7%, and 169,000 of Asian origin, just under 3%. No doubt the whole movement had not only a white but a Protestant evangelical appearance and discourse, and most nonwhite people in the western states had learned caution about entering unknown “white” spaces. Certainly the Townsend movement and its clubs did not attempt to recruit them.

As with the Klan, many Townsend movement members were businessmen and white-collar workers, with some professionals. Its demographics contrasted with the Klan’s in several ways, however — it had more big-city dwellers, more women, of course more gray hair, and fewer young and middle-aged people. But while most active Townsenders were middle class, conceived broadly, that label meant something very different in the midst of an economic depression: the majority had probably experienced a sudden economic collapse rather than chronic deprivation. Some West Coast members, especially the poorest ones, were refugees from the “dust bowl.” But regions of chronic poverty, such as the southern states, did not produce many Townsend clubs. The universality of proposed pensions, which threatened to include African Americans, no doubt repelled many white southerners. As a Jackson, Mississippi, newspaper editorialized, “The average Mississippian can’t imagine himself chipping in to pay for able-bodied Negroes to sit around in idleness.”

 

The Townsend movement was a business. Millions flocked to it because of the pension plan and the doctor’s hokey charisma, but also because it offered them a chance to make a bit of money. Clements introduced the same recruitment-by-commission arrangement that had so ballooned the KKK. Previously an organizer for the Anti-Saloon League (like quite a few Klan organizers), he “hired” some three hundred organizers, aka recruiters, many of them also former Anti-Saloon League employees, some of them ministers. There were no wages, only commissions: they earned 20%, or 2.5 cents, from every 12 cents that new members paid. One early organizer claimed that the doctor promised him “handfuls” of money from the work.

At first the doctor worried about the opportunities for embezzlement created by the commission system, but his staff clung to the system because it was so cheap. Understandably, Townsenders appreciated the opportunity to earn in the midst of the still worsening Depression. Members felt even better because they were earning by bringing people into a just cause. Dr. Townsend defended the system by arguing that it freed the organization from having to solicit large donations from the rich. Recruitment by commission was more democratic, he said — it meant that the needy supported the movement themselves and would not be beholden to big money. Yet the doctor also defended this approach with an argument that justified and flattered his personal leadership: “Townsend …  is a Program of Proxy … Thousands of the world’s best people do not possess the high qualifications for personal leadership . . . yet they can partake in the program by letting their money become proxy for them. . . . Your dollars can become you.”

Excerpted from Seven Social Movements That Changed America. Copyright © 2025 by Linda Gordon. Used with permission of the publisher, Liveright Publishing Corporation, a division of W. W. Norton & Company, Inc. All rights reserved.

There’s Some Spirit Left Yet

On December 8, 1876, Bristol police arrested the bookseller Henry Cook for selling the American birth control booklet Fruits of Philosophy. (Victorian readers knew the latter noun was a byword for “science.”) The edition’s title page bore the name of the National Reformer publisher Charles Watts, who had purchased the plates years before and printed it alongside dozens of National Secular Society pamphlets.

Unbeknownst to the London-based Watts, Henry Cook had previously served two years in prison for selling pornography from under his shop’s counter. Between the Fruits’ pages of plain text, Cook had inserted images from his old trade that illustrated explanations of human anatomy and recommendations for safely preventing pregnancy.

On December 14, Annie Besant — the National Secular Society’s second-in-command, a 29-year-old single mother who had defiantly left her sexually abusive Anglican vicar husband in an era of strict coverture laws — arrived at the National Reformer’s smart new Fleet Street office to find a nervous Charles Watts. Upon hearing of the Bristol arrest, he had telegraphed to Cook, “Fear not, nothing can come of it.” But until now, Watts had never actually read the pages that he printed.

Handing a copy of Fruits of Philosophy to Besant, he asked her opinion. She read it on the train en route to a lecture on the emancipation of women. The pamphlet advocated parental responsibility, and the restriction of family size within the means of its existence. While the 1830s American medical English lacked her flair — George Bernard Shaw had recently proclaimed her among Britain’s best orators — Annie concluded that she would have been proud to author such a work.

Unlike the Victorian Dr. Acton — then instructing England that masturbation led to blindness — in his Fruits the American Dr. Charles Knowlton wrote about sex as a natural enterprise, and nothing to be ashamed of. Nor should it be limited to the purpose of procreation. “A temperate gratification,” Knowlton wrote, “promotes the secretions, and the appetite for food; calms the restless passions; induces pleasant sleep; awakens social feeling, and adds a zest to life which makes one conscious that life is worth preserving.”

From the train station, Annie telegraphed Watts, “Book defensible as medical work.” 

Her confidant Charles Bradlaugh, on the other hand, felt it was indictable. The National Secular Society president, a disciple of Richard Carlile and John Stuart Mill who for the past decade had attacked the Church and Crown over their hold on free speech, knew that open discussion of sex remained taboo. Previously he had urged Watts to pull the title from the press. Now that the horses had bolted, Bradlaugh instructed his long-serving coworker to appear in Bristol and admit to the magistrates that he was the pamphlet’s publisher. The hearing did not go well. Charles Watts gave the court 13 copies of the book. Embarrassed when select passages were read aloud, he denounced the pamphlet’s “vile purpose,” and withdrew his support for the arrested bookseller. After Boxing Day, Henry Cook would be sentenced to two years of hard labor. Before returning to London, Watts promised a judge that he would cease to print Fruits of Philosophy. The matter seemed closed. The incident barely made a ripple in the great paper ocean of Victorian newspapers.

Annie Besant, c. 1870. [National Portrait Gallery, London]

Meanwhile, news of the publisher Charles Watts’ impunity made its way back to London. On January 8, 1877, police arrested him without warning in Fleet Street. He was arraigned at Guildhall for publishing an obscene book, released on bail and committed for a February trial at the Old Bailey.

Understandably, Watts panicked. Charles Bradlaugh promised to hire a skilled lawyer, with the aim of convincing a grand jury to return a “no bill,” or recommendation to drop the indictment. “The case is looking rather serious,” Bradlaugh admitted to Watts, “but we must face it. I would the prosecution had been against any other book, for this one places me in a very awkward position.”

Annie Besant argued with both men that the case absolutely must go to trial, as the publicity would shine a needed light on a woman’s right to sex education and the power to make decisions about her own body and health. Knowlton’s Fruits might be bruised, but it was all they had. At length, both men agreed with her, even if they remained unenthused. “I have the right and the duty,” Bradlaugh said, “to refuse to associate my name with a submission which is utterly repugnant to my nature and inconsistent with my whole career.” However, “The struggle for a free press has been one of the marks of the Freethought party throughout its history, and as long as the Party permits me to hold its flag, I will never voluntarily lower it.”

Galvanized, Annie organized a defense fund, collecting over £8 at a talk that weekend in Plymouth. Concurrently, Charles Watts had a change of heart: he would switch his plea to Guilty and throw himself on the mercy of the Central Criminal Court.

Bradlaugh called him a coward, and, after 15 years of working and campaigning together, fired him from the National Reformer. The two would engage in an exchange of public recriminations that forced freethinkers to choose a side. Annie learned of this turn of events upon her return to London. She had been prepared, she wrote, to stand by her colleague Watts in battle, “but not in surrender.” She returned the donations to her Plymouth brethren, read Fruits of Philosophy once again, and planned a course of action that no British woman had ever undertaken before.

 

Sharing the same roof as the notorious Newgate Prison, the stone blockhouse of the Old Bailey squatted stolidly in the center of the City of London. The courthouse was a five-minute walk from the National Reformer office, via Limeburner Lane. In the February 5, 1877, volume of its proceedings, under the heading Sexual Offences, a clerk’s hand recorded:

CHARLES WATTS (41), PLEADED GUILTY to unlawfully printing and publishing thirteen indecent and obscene books – To appear and receive judgment when called upon.

In the end, his admission brought the leniency he hoped for. No jail time, and a steep £25 fine for costs.

Annie Besant proposed that she and Bradlaugh form their own publishing company, taking the National Reformer pamphlet plates away from the pigeon-hearted printer Charles Watts. That they had no experience in business did not deter Annie, who held that “all things are possible to those who are resolute.” The pair cobbled together funds to rent a dilapidated shop on Stonecutter Street, a passage linking Shoe Lane to Farringdon Street. The shop was even closer to the Old Bailey than their old one. If her scheme went as planned, she would have a shorter walk to her trial.

By the end of February, the partners had opened the Freethought Publishing Company. While Bradlaugh sniped at and attempted to scoop Charles Watts’ new rival publication, the Secular Review, Besant directed her partner to a more important fight: printing an updated edition of the prosecuted Fruits of Philosophy and challenging Britain’s obscenity law.

With his eye on standing a fourth time for Parliament, Bradlaugh did not share Besant’s enthusiasm for martyrdom. He did not even like the book. With the Church and Crown arrayed against them, he doubted they could win. They could be sentenced to prison. Mrs. Besant said she would publish the pamphlet herself. 

Readers of March’s final edition of the National Reformer found the announcement, topping the page of advertisements for tailored trousers and Bordeaux burgundies, of a new edition of Fruits of Philosophy: “The Pamphlet will be republished on Saturday, March 24, with some additional Medical Notes by a London Doctor of Medicine. It will be on sale at 28 Stonecutter Street after 4 pm until close of shop. Mr. Charles Bradlaugh and Mrs. Annie Besant will be in attendance from that hour, and will sell personally the first hundred copies.”

On the day of the actual printing, Bradlaugh was in Scotland to give a talk. His daughter Hypatia described Annie’s fear of a police raid and seizure of the stock before the sale. With her sister Alice’s help, the women “hid parcels of the pamphlet in every conceivable place. We buried some by night in [Annie’s] garden, concealed some under the floor, and others behind the cistern. When my father came home again the process began of finding as quickly as possible these well-hidden treasures — some indeed so well hidden that they were not found till some time afterwards.”

On the Saturday, Besant and Bradlaugh found a crowd waiting outside their printshop. In twenty minutes the first 500 copies sold out. Despite her hand delivery of the National Reformer to magistrates’ postboxes in Guildhall, the police never showed.

The following day, a Sunday, Besant and Bradlaugh hand-sold 800 copies of Knowlton and mailed parcels of the pamphlet to fulfil orders across England and Scotland. Letters of support flowed in. The feminist journalist Florence Fenwick Miller admired Annie’s noble stand against “this attempt to keep the people in enforced ignorance upon the most important of subjects.” Miller included a donation for the defense fund she promised they would be needing. She wished she had “fifty times as much to give.”

A week passed, the Freethought press kept printing, and the Fruits kept spilling out the door. “The Vice Society has plenty of spies and informers on its books,” Besant wrote. “One wise sentence only will I recommend to that sapient body; it is from the cookery book of Mrs. Glasse, dealing with the cooking of hares — Men and brethren, ‘first, catch your hare.’”

Annie decided to help them. To the police she offered to be at Stonecutter Street daily from 10 to 11 am. At last, on April’s first Thursday, she and Bradlaugh arrived to find “three gentlemen regarding us affectionately.” They looked to her then “as the unsubstantial shadow of a dream.”

The trio followed them into the shop. Detective Sergeant Robert Outram produced a search warrant. Bradlaugh said he could look around all he wanted; the last of the first print run of 5,000 copies had been sold the previous day. Outram nonetheless played his part as planned, placed the pair under arrest, and marched them down to Bridewell for booking.

If Annie Besant had any illusions that she would be treated differently than those arrested for street crimes, they were shattered when she was told to empty her pockets and hand over her purse, and was led by a matron into a cell to be searched.

“The woman was as civil as she could be,” she wrote, but “it is extremely unpleasant to be handled, and on such a charge as that against myself a search was an absurdity.”

To Annie’s surprise, she and Bradlaugh were led to the Guildhall basement. For two and a half hours (“very dull,” she wrote, “and very cold”) she simmered as, in a neighboring cell, “Mr. Bradlaugh paced up and down his limited kingdom.” Together they listened to the names of prisoners being summoned, until theirs were the day’s last names called to “go up higher.” Annie entered the dock, and measured up the magistrate. He appeared to her “a nice, kindly old gentleman, robed in marvellous, but not uncomely garments of black velvet, purple, and dark fur.” As the proceedings began, clerks handed her a succession of little tan envelopes holding telegrams from admirers, pledging their support.

A detective constable testified that on March 24 he had purchased a copy of Fruits of Philosophy from Annie Besant, who took his one shilling and returned sixpence change. “Bradlaugh saw her take the money,” William Simmonds added matter-of-factly. “I believe that a large amount of books,” the policeman concluded, “are now kept upon those premises for the purpose of sale.”

That suspicion was what compelled Detective Sergeant Robert Outram’s visit to the shop on the day when the pamphlet’s print run had already sold out. In the courtroom DS Outram, too, had seemed kind, as she watched him find seats for Bradlaugh’s daughters. Still, “It amused me,” Annie wrote, “to see the broad grin which ran round when the detective was asked whether he had executed the seizure warrant, and he answered sadly that there was ‘nothing to seize.’” Bail was set for the next hearing, “to which adjuration I only replied with a polite little bow.”

Walking into the waning spring sunlight, she was surprised to see a small crowd cheering. One voice called, “Bravo! There’s some of the old English spirit left yet!” The criminals had missed luncheon, and so set off to have a meal. Supporters straggled behind them like the tails of a soaring kite. Dining in the gathering dusk, Annie experienced the intoxicating thrill of reading about herself in the newspaper.

“The evening papers all contained reports of the proceedings,” she wrote with satisfaction, mentioning the Daily Telegraph and Evening Standard, “as did also the papers of the following morning.” They included, she especially noted, the hallowed Times, where her name appeared for the first time. Victoria’s favorite publication, the Pall Mall Gazette, placed news of Annie Besant’s arrest — “on a charge of publishing a book alleged to be immoral” — immediately after the lines detailing Her Majesty’s daily engagements. The queen’s activities necessitated two lines of type. Annie’s warranted 33.

Adapted excerpt reprinted with permission from A Dirty, Filthy Book: Annie Besant’s Fight for Reproductive Rights, by Michael Meyer, now available in paperback from WH Allen. © 2024 by Michael Meyer. All rights reserved.

“A Party for the White Man”

When an April 1964 Gallup poll asked Republicans whom they would most like to see nominated as president, only 15% named Barry Goldwater; 52% named Nelson Rockefeller, Henry Cabot Lodge Jr., George Romney, or William W. Scranton, all liberals on civil rights with support among African Americans. Outraged by Goldwater’s opposition to the Civil Rights Act, Pennsylvania Governor Scranton stepped into the race and launched a “stop-Goldwater” bid with Rockefeller’s endorsement just five weeks before the national convention. Though too late to win any primaries, he hoped that his campaign could sway delegates who agreed with him that Goldwater’s extremism represented “a weird parody of Republicanism.”

Conservatives, however, had a three-year head start packing state delegations with Goldwater supporters. Eliminating Black-and-Tan remnants from Southern delegations was essential to this strategy. South Carolina’s state party, which had been open to black inclusion in the 1950s, issued a report in the early 1960s boasting that “not a single Negro showed any interest” in the party, a fact that “was welcomed by new Party leaders as victory in the South at any level could never be achieved by a Negro dominated party.” Georgia’s Republican Party continued to welcome black participation through the early 1960s, and an African American served as vice chairman of the state party. One white official bragged that the GOP was one of only two “integrated public organization[s] in the state.” At the 1963 state party convention, the Fulton County Republican Committee proposed a platform endorsing black equality. Not only was the statement rejected, but the delegation from Atlanta was not prepared for the onslaught of conservatives who had only recently become interested in the mechanics of party gatherings. Whereas previous state conventions had averaged fewer than 400 participants, conservatives, including many former Democrats, filled the convention with more than 1,500 delegates. By the final day, they had removed every single black leader from power, including John H. Calhoun, the man who delivered Atlanta’s black vote to Nixon in 1960. For the first time in 40 years, Georgia’s delegation to the Republican National Convention was entirely white. One of the party’s new officials proclaimed, “The Negro has been read out of the Republican Party of Georgia here today,” and members celebrated with an all-white banquet.

 

Finally in control of the party’s most powerful committees and inspired by the GOP’s first staunchly conservative presidential nominee in decades, Goldwater delegates sought to humiliate their establishment enemies at the Republican National Convention in San Francisco’s Cow Palace. Conservatives intentionally delayed proceedings so that Nelson Rockefeller could not deliver his convention address until midnight, or 3:00 am on the East Coast. When the New York governor finally stepped on stage, a steady stream of boos interrupted him for a solid three minutes. Black delegates faced similar disrespectful treatment from members of their own party. The vice chairman of the DC Republican Committee, Elaine Jenkins, recalled, “There was no inclusion of black Republicans as a group at the convention. White staffers treated the few of us present as truly non-existent or invisible.” On one occasion, Goldwater’s “Junior Sergeant at Arms” blocked four black men from entering the convention floor, including Edward Brooke, one of the most powerful public officials in Massachusetts. It was not until a Nixon associate, John Ehrlichman, intervened on their behalf that they were granted entry.

African Americans in attendance also faced verbal and physical abuse. Memphis Black-and-Tan leader George W. Lee had to be escorted from Scranton headquarters to an undisclosed motel after receiving death threats during his contest against Memphis conservatives. When Clarence Townes, the only African American in Virginia’s delegation, cast his vote for Rockefeller, he “was forced to flee from the convention hall in company with television newscasters to escape angry conservatives.” In one of the most shocking events of the convention, William P. Young, Pennsylvania secretary of labor and industry, noticed smoke coming from his clothes after a heated exchange with a group of Goldwater delegates. After burning his hand to extinguish the flames, he discovered four holes burned into his suit jacket from a lit cigar placed in his pocket by an unknown assailant. The event was witnessed live by television cameras and reporters on the scene. Shortly thereafter, one Southern entrepreneur began selling “Goldwater Cigars,” which included a card that read, “These cigars can be used in many ways … Some Republican People at the San Francisco Convention Slipped a Lighted Cigar into a Negro Delegate’s Pocket! They Say He Seemed to Get the Idea That He Wasn’t Wanted. And He Left the Room in a Hurry!”

The events at Cow Palace confirmed many black Republicans’ worst fears about their party. Edward Brooke described the convention as “an exercise in manipulation by a zealous organization which operated with a militancy, a lack of toleration, at times a ruthlessness totally alien to Republican practices.” In a convention hall filled with Confederate flags waved by southern delegations, one African American remarked, “it’s clear to me … that this taking over of our party is based on resentment of civil rights advances.” Sandy Ray of New York lamented that his party had become home to “extremists, racists, crackpots, and rightists. What we experienced at the convention television onlookers could not believe.” Jackie Robinson declared, “as I watched this steamroller operation in San Francisco, I had a better understanding of how it must have felt to be a Jew in Hitler’s Germany.”

Scranton’s last-ditch campaign failed, and Goldwater easily secured the GOP nomination. Although a July poll of registered Republicans found that 60% favored Scranton, conventions are not democratic proceedings. Southern delegations cast over 97% of their votes for Goldwater under their newfound all-white leadership. By refusing to slate African Americans as delegates even from diverse states like California and replacing black leaders in Georgia and Tennessee with conservative whites, Goldwater’s forces had reduced black representation at the national convention to its lowest numbers in over 50 years. Especially disconcerting to moderates was that 7% of the convention’s 1,300 delegates were members of the anti-civil rights John Birch Society, while only 1%, or 14 individuals, were African American. Conversely, the 1964 Democratic Convention featured a record 65 black delegates.

In his acceptance speech, Goldwater rejected another opportunity to reconcile with moderates and liberals. Although civil rights had been the nation’s most pressing domestic issue for the past four years, the nominee did not make a single reference to the movement. He identified communism and an ever-expanding federal government as the primary threats to American “liberty,” but conspicuously left Jim Crow off the list. He used the words “free” and “freedom” 26 times, though none referred to the ongoing struggle for black equality that raged in the South. The same summer that Goldwater delivered his convention speech, four civil rights activists had been murdered, 80 had been beaten, and 67 black churches, homes, and businesses had been burned or bombed in Mississippi alone. Goldwater was silent on this wave of violence in the South, yet railed against violence in the “streets” of Northern cities. He also expressed his disdain for moderate Republicans, who had so often dismissed him as an extremist, and famously proclaimed, “extremism in the defense of liberty is no vice! and … moderation in the pursuit of justice is no virtue!”

Civil rights activists dressed up as Ku Klux Klan members to protest racists supporting the presidential campaign of Barry Goldwater at the Republican National Convention, San Francisco. Photograph by Warren K. Leffler. [Library of Congress]

Despite efforts to alienate them at the convention, African Americans refused to withdraw without a fight. According to Jet, black Republicans “poured out in numbers” to attend an anti-Goldwater rally led by Jackie Robinson and the Congress of Racial Equality. The rally, whose participants also included Nelson Rockefeller, Henry Cabot Lodge, George Romney, Jacob Javits, and Kenneth Keating, culminated in a 40,000-person march from Market Street to Cow Palace. On July 15, African Americans assembled at the Fairmont Hotel to discuss protest strategies. Temporarily naming themselves the Negro Republican Organization (NRO), the group issued a statement read by William Young to the press. “We have no confidence” in Goldwater’s “ability to enforce” the civil rights bill, they announced, and pledged to “defeat that minority segment” of the GOP, “which appears determined to establish a lily-white Republican Party.” Jackie Robinson called for a coordinated NRO walkout from the convention floor, but George Parker of Washington, DC, cautioned that because of their small numbers, “it would look as if they were just going out to lunch.” They ultimately agreed to stage a “walk around” instead of a walkout, hoping that television cameras would broadcast their protest. The demonstration occurred as the convention began counting verbal votes for the presidential nominee. A counter-protest soon eclipsed the demonstration, and journalists found it difficult to see the marchers amid “a tunnel of Goldwater banners, signs, pennants, streamers, and flying hats.”

To the black delegates who formed the NRO, leaving the party was not an option. The hostile national convention provided motivation to continue their fight against an uncompromising conservative movement. As Sandy Ray declared after the convention, “if we sit quietly and allow this band of racists to take over the party, we not only signal the end of the party of freedom, we also help to bring about the total destruction of America through racism.” When asked in 1968 if he had ever considered leaving the GOP, George W. Lee somberly replied, “during my Goldwater fight in San Francisco … I was a lone individual down there,” but he never thought of leaving, because “somebody had to stay there in the Republican Party and fight, and fight, and fight with the hope that the Republican Party wouldn’t be made a party of ultra-conservatism and further than that, a party for the white man.”

Adapted excerpt reprinted with permission from Black Republicans and the Transformation of the GOP, by Joshua D. Farrington, now available in paperback. © 2016 by the University of Pennsylvania Press. All rights reserved.

Lacking a Demonstrable Source of Authority

In the early 1880s Crow Dog, a member of the Sicangu Lakota Oyate (or Brulé Lakota or Brulé Sioux in older literature), ambushed and killed Spotted Tail, who was also Sicangu Lakota. The event took place on tribal land, which was increasingly surrounded by colonizers yet still very remote from what then constituted much of the United States. The motivation for the killing was assuredly political in nature — both men vied for leadership authority within the community — but it may have had personal elements as well. Nonetheless, the heinous act created a rift within the community, one that the community sought to repair. Shortly after the killing, a tribal council sent peacemakers who helped the families negotiate a settlement. The issue was resolved within the community to its satisfaction when Crow Dog and his family agreed to pay Spotted Tail’s family $600, eight horses, and a blanket, an astounding sum that testified to the significance of the family’s and community’s loss.

Although the matter was settled to the community’s satisfaction under Sicangu Lakota Oyate law, federal officials seized on what they regarded as an opportunity. At the time, in the 1880s, the federal government was in the early stages of what is regularly referred to as the Allotment Era of federal policy. The Allotment Era, lasting from approximately 1871 to 1934, was defined by the attempts of the federal government, of philanthropists who believed they were doing right, and of those who coveted tribal lands and resources, to destroy tribal nations and tribalism. The Allotment Era included, among other things, the process of allotment that fundamentally changed and reduced tribal land holdings, boarding schools that sought to eradicate tribal ways of life by forcing Native children into a Western mode of life, and Indian police and Indian courts that enforced Western law and norms in Native spaces. It would be difficult to overstate the amount of time, energy, and resources that were directed toward eradicating tribal ways of life during the Allotment Era or the lasting harm that the era’s efforts continue to cause.

Shortly after the matter was settled among the Sicangu Lakota Oyate, the federal government arrested Crow Dog under the pretense that a “public outcry” demanded that the killer be brought to justice. The true purpose for arresting Crow Dog, however, had little to do with public opinion. At the time, federal officials tasked with engaging with Native peoples wanted to exercise criminal jurisdiction over Native peoples on Native lands. In one respect, the sovereignty and nationhood of Native peoples made this seem absurd — much like it would be absurd if the United States tried to extend its criminal law over peoples living in Canada or Mexico. Yet, tribal nations were increasingly surrounded and imposed upon by the growing colonial force that was the United States. Under these circumstances and within the spirit of the Allotment Era, forcing American criminal law on Native peoples on Native lands felt less like an absurdity to many federal officials and more like a necessity.

Crow Dog’s arrest and trial were intended to produce a test case that would provoke American courts to decide whether the federal government had jurisdiction to exercise American criminal law over Native peoples on Native lands. The trial was swift, Crow Dog was convicted in a territorial court and sentenced to hang, and federal officials had their test case that was soon to be heard by the Supreme Court. According to legend, Crow Dog managed to convince a federal marshal to let him go free for a period of time to arrange his affairs. The day Crow Dog promised to return was cold and snowy, and few if any expected him to keep his promise. Nonetheless, he showed up on time, making him a local hero.

Crow Dog’s situation allows us to recognize that the American, or Western, system of justice is focused on punishing the offender. Crow Dog, under this vision of criminal justice, needed to feel a roughly equivalent amount of harm that he caused. Federal officials sought the death penalty and were incensed when he “went free” under tribal law. However, for the Sicangu Lakota, and for many tribal nations, the focus of the criminal justice system was not on punishing the offender but rather on making the victim (or the victim’s family) as whole as possible. Restoring a sense of balance and harmony within the community was the foremost goal and best accomplished through restitution rather than punishment. Consequently, under the Sicangu Lakota system, Crow Dog was not buying his way out of or otherwise avoiding justice but fully and meaningfully participating in effectuating it.

 

Every single court case, from the biggest to the smallest, is just a question that is seeking an answer: Did the accused commit the crime for which she or he is on trial? Did the company breach its contractual obligations? Is a tomato a fruit or a vegetable? Consequently, the key to reading and understanding court opinions is to discern the question that the court is trying to answer. When the test case that emerged from Crow Dog’s situation reached the Supreme Court, the question to be considered was blissfully uncomplicated and likely obvious: Did the federal government have jurisdiction to enforce American criminal law over Native peoples on Native lands?

The answer, according to the Supreme Court in its 1883 decision Ex Parte Crow Dog, was an equally simple “no,” even if the methodology for arriving at that answer was somewhat convoluted and the language employed by Justice Stanley Matthews in the majority opinion was replete with the types of rhetorical unnecessities that Strunk and White sought to kill off. Put most simply, the federal government had already given itself jurisdiction over crimes committed in “Indian Country” through two statutes. Yet, in those statutes the federal government specifically exempted from its jurisdiction crimes that were committed by one Native person against another Native person or crimes by Native people that had already been punished by the tribal nation. Since Crow Dog clearly fell within both exceptions, the lawyers for the federal government sought alternative justifications for federal jurisdiction and settled on tribal cessions made in an 1868 treaty and an 1877 agreement. The Supreme Court rejected this line of reasoning, stating among other things, “It is quite clear from the context that this does not cover the present case of an alleged wrong committed by one Indian upon the person of another of the same tribe.” Without jurisdiction, the federal government was forced to free Crow Dog, at which point he returned to his community, lived to an old age, and continued to remain a thorn in the side of federal officials.

The Supreme Court’s decision in Crow Dog was unquestionably a victory for Crow Dog and the Sicangu Lakota Oyate particularly and for tribal sovereignty and Native America more generally. It was an acknowledgment by the courts of the United States that the federal government, in what might be understood as a commitment to its foundational principles, could not simply assert its authority without a basis for that authority. It is rightfully celebrated for what it stands for.

Unfortunately, victories for tribal interests in American courts are rarely complete or without some corresponding aspect or aspects that diminish, limit, or completely negate the positive impact of the case for Native America. This is so with Crow Dog. Two distinguishing features significantly dull the shine of this particular outcome. The first is the rationale upon which the decision was made. While the final result of the case supported tribal sovereignty, Justice Matthews’ opinion makes clear that this was more an unintended consequence than a purposeful goal or statement of principle. The main focus in Matthews’ opinion was on federal claims to authority and their sources, or lack thereof. There is no discussion whatsoever of tribal criminal procedures, or of the fact that the matter was handled within the community to the community’s satisfaction.

The limited discussion of tribal peoples and methods in the opinion centers not on Sicangu Lakota Oyate structures or law but on the supposed deficiencies of Native America. In language that echoed earlier decisions and portended future ones, Matthews described Native peoples as “wards subject to a guardian” and “a dependent community who were in a state of pupilage.” Consequently, Matthews would later argue, it was unfair to measure Native peoples against American law. As part of the most famous passage in the case, Matthews wrote that the application of American law to Native peoples “tries them, not by their peers, nor by the customs of their people, nor the law of their land, but by superiors of a different race, according to the law of a social state of which they have an imperfect conception, and which is opposed to the traditions of their history, to the habits of their lives, to the strongest prejudices of their savage nature; one which measures the red man’s revenge by the maxims of the white man’s morality.”

Native peoples, inferior to their American counterparts according to Matthews, were merely the lens through which to view American jurisdiction and process. Although the opinion happened upon such an end, Matthews clearly did not intend to foster or support tribal sovereignty or methodologies. On the contrary, Matthews’ opinion demonstrates a low opinion of Native peoples. Even though it was a win for tribal interests, the case has limited usefulness as a building block or intellectual basis for subsequent arguments in favor of Native rights and authority.

The second prominent feature of Crow Dog that mitigated its benefit for Native America was how Justice Matthews opened the door to a reconsideration of the result. Near the end of his opinion, Matthews wrote that to find American jurisdiction over Crow Dog’s actions on tribal land was “to reverse in this instance the general policy of the government towards the Indians, as declared in many statutes and treaties, and recognized in many decisions of this court, from the beginning to the present time.” Had Matthews ended here, he merely would have made the type of general observation that is found in countless court opinions and that may be more or less accurate but is often ephemeral and mostly harmless. However, he did not stop with this bit of fluff. Instead, he continued by stating, “To justify such a departure, in such a case, requires a clear expression of the intention of Congress, and that we have not been able to find.”

 

We often conceptualize the three branches of the American government, particularly at their highest levels, as sometimes “talking” to each other. Since authority is divided between the president, Congress, and the courts, none of the three can exercise its will without limitation. To that end, when one branch runs up against the boundaries of its authority, it will sometimes signal through various means to another branch what it would like to see done, propose an alternative path toward a goal that cannot be accomplished as matters currently stand, or otherwise offer guidance, advice, or requests.

Understood within this framework, Justice Matthews was very much “talking” to Congress through this opinion. The Court was unable to find American jurisdiction over Crow Dog and tribal lands under the circumstances with which it was presented. Consequently, it is difficult to understand Matthews’ assertion that it would take a “clear expression of the intention of Congress” for the Supreme Court to find jurisdiction as anything other than an open invitation to Congress to change the circumstances. Matthews offered his brief description of the “general policy of the government towards the Indians” and then explained how Congress might alter that general policy with a “clear expression.” Matthews deliberately neutered the opinion’s capacity to protect and acknowledge tribal sovereignty by describing to Congress how to overcome the ruling in future cases.

Two years later, Congress accepted Matthews’ invitation, passing the Major Crimes Act in 1885. As originally constituted, the new law gave the federal government jurisdiction over seven “major” crimes committed by a Native person against another Native person in Indian Country, including murder. Federal officials and others seeking to radically transform Native peoples and ways of life had another weapon in their arsenal, just as they had hoped when they initiated the action against Crow Dog.

Of course, just because Congress passes a law doesn’t mean that it has the authority to do so. As many of us learn in our tenth-grade civics class, our government is one of limited and enumerated powers. Years ago, after I finally looked up the word “enumerated,” I better understood the basic premise that the phrase “limited and enumerated powers” is intended to invoke: governmental authority extends only as far as is spelled out in the U.S. Constitution. Put differently, unless the power to act is articulated in the U.S. Constitution, the government doesn’t have that power. This is how we assess the constitutionality, or validity, of laws — those laws that are made under demonstrable grants of authority are constitutional and valid, and those laws that lack a demonstrable source of authority are unconstitutional and invalid. When an assessment of the constitutionality of a law occurs in a court, we refer to the process as judicial review.

Congress passed the Major Crimes Act, but this in and of itself did not settle the question of American jurisdiction over Native individuals on tribal lands. Eventually the constitutionality of the Major Crimes Act would be tested.

Adapted excerpt reprinted with permission from The Worst Trickster Story Ever Told: Native America, the Supreme Court, and the U.S. Constitution, by Keith Richotte Jr., published by Stanford University Press. © 2025 by Keith Richotte, Jr. All rights reserved.

The Rise and Fall of Liberal Historiography

If a historian of the United States entered the public square in the 1960s or 1970s, it was often because of radical commitments. Eugene Genovese became a lightning rod after offering his “welcome” to the possibility of a Viet Cong victory; Staughton Lynd emerged as an icon of conscience thanks to Yale’s denial of his tenure over his antiwar activities; Lawrence Goodwyn’s writing on populism served as a historical bible for movement activists; Christopher Lasch was even read by Jimmy Carter in the White House, although he later castigated the president for missing his point. The profession seems to play this role less frequently today — despite celebrated exceptions such as Keeanga-Yamahtta Taylor. Today the most prominent “historians in public” are, in different ways and for good or ill, mainly working within liberalism, seeking to defend its institutions and norms against attack. Lasch’s brief moment in the sun notwithstanding, this latter group has a wider audience and even a kind of access to state power.

Tracing the fall of radical history and ascent of liberalism in the historical profession offers a chance to consider historians’ own place in the consolidation of professional-class liberalism, a process negotiated in part through a new relationship between social history and the tradition I call institutionalism: the approach that emphasizes political actors — policymakers, parties and their leaders, and the intellectuals who inform them — as first movers. What makes this approach specifically liberal is its emphasis on the autonomy of political ideology and action from social forces. 

As our profession both accesses the halls of power at its upper reaches and collapses in its internal economic structure, it is undergoing significant intergenerational ideological polarization. This development prompts reassessment of the renewed relationship between historiography and liberalism, through which historians drew closer analytically and practically to the institutions of state power — and so became institutionalists both in method and in politics. In the process, they gained new powers. But we may have lost something too. This ambiguity is written across the scholarship of the 1980s and 1990s. 

To place historians within the development of history is a useful exercise in two ways. First, of course, it is central to the project of historiography, the intergenerational dialogue by which we become self-conscious of how our angle of retrospective view has changed as we have moved forward in time. More ambitiously though, it is sometimes possible to say something meaningful about a historical event or process by reading contemporary historical work symptomatically. To push historiography from its traditional place as a secondary source and treat it as primary material might turn over fresh layers from familiar episodes. In particular, it is in relation to debates about method that historians often disclose some truth of their era.

In the years after 1980, the U.S. historical profession fell into intense internal debate — reaching a peak in the 1990s but not fully resolved today — as the lines of inquiry established by social history seemed increasingly alien in a more conservative historical moment. Theoretical and methodological strife ricocheted through historical scholarship, a branch of the “theory wars” occurring simultaneously across disciplines. The intellectual-historical context, the collapse of the horizon of possibility glimpsed by the New Left and the decline in scholarly belief in the potentiality of social activism, corresponded to the political context of the defeat of the postwar welfare state. Within U.S. history, a debate ensued about how to absorb the meaning of Reaganism — and ultimately, more broadly, neoliberalism — into scholars’ account of the past, for which social historians’ core concept of “agency” no longer seemed adequate given these political results. The political event had to be digested not only at the empirical level of substantive historical arguments, but also at the more abstract level where historical concepts are forged. 

Social history in particular, the engine of historiographical transformation since the 1970s, faced critiques from within and without implicating the “agency” concept. Agency, a humanist idea, aimed to describe the capacity of ordinary people to influence historical events: at each juncture, the historical presence of women, the enslaved, African Americans, immigrants, and the working class could be established and their efforts shown to have contributed to the outcome. Yet the disappointing political outcomes of the 1980s cast doubt on such an approach.

As the debate developed, it emerged that the reasons that some categories of human experience coagulated into “agents” of a particular kind — social-movement actors, especially — had largely been taken for granted by the “new social history.” What made a machinist an agentive worker and his union struggles a subject for labor history, asked Nell Painter, but the racism of his exclusionary union something less analytically significant? Surely one had to ask how race and class were constructed to reach an answer. Such critiques formed the basis of the “cultural turn” in history.

It was also here that the second critique, arising from social science, made its impact. Social history’s original maneuver — a critique of consensus liberalism — seemed ill-placed in a world where consensus liberalism had been defeated not from the left but from the right. As Alan Brinkley put it in his classic 1994 essay on conservatism, “New Left political scholarship has … generally been more interested in discrediting liberalism — and, within the academic world, in wresting leadership and initiative from liberal scholars — than in confronting what it has generally considered a less formidable foe: the self-proclaimed Right.” In his first footnote, Brinkley observed that social scientists had done a far better job than historians of tracking conservatism.

While those influenced by both cultural history and social science can be characterized as “post-Marxist,” they exited social history’s vaguely Marxist encampment heading in almost opposite theoretical directions: one group into the interpretive and hermeneutic, the other into positivism. Yet only the first, the cultural turn, has attracted sustained disciplinary reflection. The rise of cultural history has been widely debated, celebrated, and criticized. (Reader, imagine a colleague describing a talk or paper as “very 1990s.” What you envision probably involved discourse analysis, performativity, postcoloniality, the carnivalesque, or similar.) But what came of this second, social-scientific line of argument, which often emanated from a more avowedly politically and methodologically liberal position? We must remain in some sense within it, since we have not yet looked backward at it.

Brinkley, however, was part of a broader historical phenomenon. The call for social historians in crisis to reunite with liberalism — against which the field initially had emerged in revolt—was widespread. “The once-passionate impulse to recapture working-class struggles and commitments to anticapitalist imperatives now risks creating sentimental reminders of times lost and aspirations disappointed,” wrote Ira Katznelson in the same year, 1994, pointing to the failure of socialist dreams. 

His proposed alternative was to engage what he called “liberal theory,” in particular “to reincorporate at the center of the discipline the subjects of state-focused politics, institutions, and law.” What the U.S. history field has not yet made explicit in its own self-narrative is how successful this call was. 

This reentry into liberalism in the end produced a generation of “new political historians,” whose method was “institutionalism.” (Something closely related occurred in the next decade with another off-ramp from social history, this one into the history of capitalism.) The purpose here is not to attempt to vindicate or disprove institutionalism—neither of which could be done even at book length—but to illustrate and historicize it. For a new generation of scholars, this method has been in continuous ascendancy for most of our lifetimes. What results has it yielded and what limits has it met? 

Social history’s exhaustion by the 1990s had real sources, and institutionalism offered something significant in response. But institutionalism also aimed to suppress some animating elements of social history that might prove useful in the discontented, unequal period beginning around 2010. Many of us who have been shaped profoundly by institutionalism—arguably its third generation in the field of U.S. history—also lack the memories and scars that inspired it. We have different ones instead, which we can understand and to which we can assign meaning only partly through concepts developed in the 1980s and 1990s. This generational difference corresponds to a wider one among professionals, the younger cohorts of whom have drifted leftward, bypassing lessons of moderation learned and exemplified in the 1980s and 1990s by professional-class liberals. Since this period, political conditions have changed for the worse, possibly quite drastically. Such deterioration poses questions for any method of political analysis. 

Broadly, scholarly institutionalism can be classed into three groups. They are the “organizational synthesis” of the 1970s and 1980s in business and political history, the substantivist economic sociology of Karl Polanyi, and most significantly the “state-centered approach” that arose in the 1980s in political science (often itself called “neo-institutionalist” or “neo-Weberian”). These traditions have their own intersecting but distinct histories, but they share some personnel, as well as three fundamental features. 

First, institutionalists have been concerned with market structure. As was repeatedly argued by center-left reformers in the 1980s and 1990s, it is easier and more effective to regulate markets than to infringe directly on investment decisions and property rights: progressive reform might instrumentalize the power of markets rather than set itself against them hopelessly. These instincts shaped scholarly agendas. In the 1990s and the first decade of the 2000s especially, production tended to fall away as a site of social antagonism or historical analysis as it ceased to be a site of political possibility. Instead, market behavior — consumer politics or antimonopoly — came to signal social class.

Second, institutionalists share a particular interest in the specific and contingent organizational forms of economic and political life, to which they give causal priority over more abstract formalisms or determinisms. Politics and markets happen only through formation of organizations, and these organizations have particular histories. A populist focus on specific mechanisms and actors rather than structural determinants of elite power — corporate forms, elite networks — emerged here. 

And third, institutionalists see political power as irreducible to socioeconomic power. Instead, they trace it to organizational form, most importantly the state form. The modern bureaucratic state depends on expert knowledge and civil society organization, including party organization — a common feature of all modern societies. Finally, for this reason, the political party in general — and for U.S. scholars, the Democratic Party in particular — has distinct historical importance. Politics must be done through parties. They are — and in the modern U.S. the Democratic Party exclusively is — the indispensable vehicles of social progress. In this view, liberal electoral success is analytically prior to, and ultimately politically more imperative than, any other political program. Generational efforts at political change, such as go into great social movements, and which often demand sacrifices of liberal politicians, diminish in importance.

Institutionalism had at its heart a profound commitment to historical contingency. As Richard John put it, “It is, for example, no longer as intuitively plausible as it had been in 1980 to posit that the major changes in American public life bubbled up from below.” Importantly, John drew a distinction between “society” and “political economy” as units of analysis: “Not only individuals and groups, but also institutions, can be agents of change.” What marks this position as the historical methodology of political liberalism is the analytic removal of “society” and “political economy” into separate spheres: over here are citizens going about their lives; over there are markets and the state institutions that govern them. The relationship between the spheres has no necessary shape — it is, as John emphasizes, “contingent.”

Historians appreciate contingency, of course. Benignly, such a method of contingency allows for the possibility that the state might actually represent the people: if politics is not reducible to the social, then the inequalities in society might be counteracted by the equalities of democracy. As Gary Gerstle, a 1980s labor historian par excellence who became a prominent political historian, concluded his synthesis of American state development, Liberty and Coercion, “Fixing the system does mean giving Americans the tools and flexibility to fashion a government that works, and one that as members of a polity in which the people are meant to be sovereign, they deserve.” Such neo-Progressivism does not dismiss the reality of social inequality. Indeed, it laments it. But it dispenses with social inequality as a direct cause of historical effects. It is in its method rather than its ethics that institutionalism diverges most from the traditional commitments of social history. Divergence being only partial — methodological more than ethical — encounter and affiliation were possible.

But it is also historians’ task to explain obdurate continuities of social hierarchy. And as those continuities have reasserted themselves over recent decades, the shortcomings of this methodological liaison have shown more clearly. How can it be that each discrete outcome appears contingent and non-inevitable on close examination; yet at the same time, when we pull back the frame, so many events seem to align toward the reproduction of social inequality? Answering this question requires a move up the ladder of abstraction, a resort to structural explanation. To the extent that such moves fell out of historians’ repertoire, then the meeting of the two traditions seems less an alliance of equals and more an assertion of liberal hegemony over a defeated Left — tracking closely with developments going on in American politics writ large. 

 

It stands to reason that mass incarceration and financialized racial and social inequality — characteristic developments of the neoliberal period — would usher the historical study of political institutions onto new ground. These phenomena epitomize how the American state is an instrument of unequal, coercive rule. Neo-institutionalism arose to explore the possibility of progressive governance in the face of the neoliberal challenge, arguing that social and economic forces did not give any irresistible logic to the upward redistribution of wealth and power. But the 2020s present a harder environment for this view than the 1980s and 1990s did, as even figures from within the orbit of neo-institutionalist political science have recognized.

Within the intellectual sphere, 40 years of worsening inequality have given rise to an increasingly radicalized layer of younger scholars who are disinclined to suppress the mechanisms of social determination as the special pleading of parochial interests, and unwilling to accept that what is most needed to achieve appropriate and coherent governance is for competent liberal technocrats to hold power. Mass incarceration, U.S. imperialism, unchecked climate catastrophe, the privatized welfare state, the financialization and deregulation of the economy, the illegalization and deportation of immigrants — liberals have their fingerprints on all these phenomena; just as they do on the crisis of the university itself, which has forged the unequal institutional environments in which we ourselves work. Yet the discovery of these fingerprints has itself been made possible by the fusion of institutionalist social science with social history, which generated a reservoir of historical knowledge we can draw on to explore the darker, less democratic threads of our history — and the possible social bases for resistance and transformation. 

Although all historians are shaped by the moments in which we write, we first learn how to look at the world historically through the eyes of scholars shaped by an earlier moment. Tensions in historical discourse must develop from this sliding contradiction. But we can learn from this paradox if we will allow it its own historical meaning, in which we are necessarily implicated politically. Doing so is not antithetical to learning from previous generations: it is how we do it. As E.P. Thompson once observed, historians “are as much subject to our own time’s formation and determinations as any others. If our work is continued by others, it will be continued differently.”

Adapted excerpt reprinted with permission from Mastery and Drift: Professional-Class Liberals Since the 1960s, edited by Brent Cebul and Lily Geismer, published by The University of Chicago Press. © 2025 by The University of Chicago. All rights reserved.

Merry, Manly Militias

As a euphoric martial spirit swept the colonies after Lexington and Concord, many Americans showed surprising faith in their militias, especially given that they were about to wage war against the most formidable military power of the day. One Philadelphian, not yet sure how he felt about American independence, wrote a friend days after news of Lexington and Concord arrived: “The parliament seem determined to force us into an acknowledgment of their supremacy, I dread the worst, for I am sure they never can do it, from the number of inhabitants, and the very situation of this country; besides our rifle men, who are used to shooting in the woods, will never come to an open engagement; they are very expert at the Indian manner of fighting.”

In America, the militia was the center of a military tradition that deviated markedly from European warfare and was largely modeled after settlers’ (often skewed) conceptions of Native American warfare. Although settler militias inflicted real deadly violence on the British during the Revolutionary War, they developed and usually reserved this form of brutal and irregular warfare for making war on Indians. Outside the period of outright hostilities, both white militias and mobs applied limited violence that usually stopped short of death when engaging the “enemies of liberty.” In contrast, most of the fighting in America had developed and would continue to occur as part of the endemic wars between settlers and Native peoples or mixed forces of both. “For the first 200 years of [U.S.] military heritage,” the military historian John Grenier wrote, “Americans depended on arts of war that contemporary professional soldiers supposedly abhorred: razing and destroying enemy villages and fields; killing enemy women and children; raiding settlements for captives; intimidating and brutalizing enemy noncombatants; and assassinating enemy leaders.” Ironically, although Europeans in America became convinced that guerrilla warfare, which they called the “skulking way of war,” was innate to Indians’ supposedly savage condition, they sought to enlist Native peoples in their wars and developed a militia tradition of ranger companies that specialized in skulking, supposedly as well as any Indian.

American settlers, convinced of their mastery of the Indian way of bushfighting, embraced this new way of war as their own and as a marker of their virility, but there were important distinctions. Whereas Native forces usually sought to inflict violence over a relatively short campaign, as part of their logistical calculations and goals, European settlers in America introduced the settler colonial logic of elimination: extended, scorched-earth campaigns that sought to extirpate their foes. Native warfare employed caution and the element of surprise to great effect, inflicting damage while suffering as few casualties as possible — an approach often translated into what today is called guerrilla warfare. Europeans mimicked such tactics, but also combined them with their logistical capacity to support longer campaigns and extirpative goals. As the theorist Patrick Wolfe famously asserted, since British settlers came “to stay” and displace the Native population in North America, they ultimately waged a genocidal campaign of extermination. The “logic of elimination” behind the goals of the British and later American settlers made genocidal violence an ongoing feature of North American settler colonialism.

American militias, part of the infrastructure that carried out one of the most extensive ethnic-cleansing projects known, were also bands of “merry men” who came together to drink and frolic, away from their wives and children. One common refrain was that militiamen did little more than drink and boast; in fact, one of the most time-honored traditions associated with militia musters was to retire to the tavern as soon as possible. Timothy Pickering, who attempted to reform Massachusetts militia traditions, found the militiamen’s habit of firing in good fun at whomever they pleased particularly appalling. “It had been the custom in Salem,” Pickering wrote, “from my earliest remembrance, and of fifty or perhaps a hundred years before to fire at the officers, under the senseless notion of doing them honor … and it gave them a singular satisfaction to make women the objects of their dangerous diversion. Nor did strangers escape the hazard and inconvenience of their inhuman inhospitable sport.” For many participants, the militia was much of the time a largely social affair, where much mirth was to be had. Levity and play — eerily combined with anxiety, terror, and deadly violence — would also infuse the identity and practices of some of the same militiamen when they wore buckskins and painted their faces to carry out ruthless warfare against Native peoples. 

 

During the first half of the 18th century, almost all North American British colonies, save Quaker-influenced Pennsylvania, experimented with and came to rely to some degree on Native forms of warfare, employing Native allies, ranger units, or both. By midcentury, the term rangers had become familiar to most Americans given the decades of ongoing wars against various French, Spanish, and Native alliances. During these years, the success and celebrity of Gorham’s Rangers during King George’s War (1745–48) and Father Le Loutre’s War (1749–55), and, even more so, Rogers’ Rangers during the Seven Years’ War, firmly established this tradition as a point of pride for Americans.

Robert Rogers, by Johann Martin Will, 1776. [Wikimedia Commons]

It was also quite clear to American colonists that Native allies, and enemies, were a force to be reckoned with. After the Seven Years’ War, even leading British officers who often denounced American troops at times conceded that hardy American frontiersmen were necessary for their campaigns. Likewise, as much as elites living on the Eastern Seaboard enjoyed ridiculing backcountry bumpkins, they at times also drew inspiration and faith, tinged with a sense of virility, from their mastery of arms. 

Such latent dispositions to favorably view provincialism meant that, following the early successes in Massachusetts in the spring and summer of 1775, Americans’ faith in their virility and military prowess was at an all-time high. Once organized into militias for the Revolutionary War effort, they often adopted some form of Indian garb as their official uniforms, as one observer noted in delight: “Every man has a hunting shirt, which is the uniform of each company — almost all have a cockade & buckskin tale in their hats, to represent that they are hardy resolute, & invincible Natives of the woods of America.” Another report painted a particularly attractive scene of a company of Virginian veterans of Dunmore’s War on the way to join the forces organizing under General Washington at Cambridge, Massachusetts: “They bear in their bodies visible marks of their prowess, and show scars and wounds which would do honor to Homer’s Iliad … These men have been bred in the woods to hardship and dangers from their infancy. They appear as if they were entirely unacquainted with, had never felt the passion of fear. With their rifles in their hands, they assume a kind of omnipotence over their enemies.” “At night,” after demonstrating uncanny shooting skills, they enacted a tantalizing Indian masquerade:

A great fire was kindled around a pole painted in the court house square, where the company, with the captain at their head, all naked to the waist, and painted like savages … indulged a vast concourse of people with a perfect exhibition of a war-dance, and all the maneuvers of Indians, holding council, going to war, circumventing their enemies by defiles, ambuscades, attacking, scalping, &c. It is said by those who are judges, that no representation could possibly come nearer to the original.

Virginians seemed to take to such dress and revolutionary fervor with more verve than most. “The people all over America are determined to die or be free,” explained one Virginian, who also depicted how the local militia expressed such pathos in their uniquely American garb: “The general uniforms are made of brown Osnaburghs, something like a shirt, double caped over the shoulder, in imitation of the Indian, and on the breast, in capital letters, is the motto, Liberty or Death.”

Yet, such euphoria was also a contrivance, long nurtured by Americans to alleviate deep-seated anxieties. During the revolutionary period, American settlers internalized poignant apprehensions over their situation as Euro-Americans about to sever their cultural lifeline to their mother country in Europe, all the while continuing to engage an indigenous enemy in the “new world.” Americans performed fantastic intellectual acrobatics to negotiate the triangular, settler colonial crucible that on the one hand pitted them against their cultural metropolis, which claimed vast superiority and cast them as little better than savages, and on the other the original inhabitants of the land, who could outperform them in the North American woods, to which they had a truer claim. In a further ironic twist, the two peoples they were most anxious about and wished to outdo — the British and most Native nations in the region — became allies, joining forces to curb American settlers. For them there was no other way but to insist that they were more qualified than their British kin to form a just and civilized republican, even democratic, society, and at the same time to declare themselves better than Indians in their own way of war. In doing so, American settlers pioneered a host of practices, along with conflicting convictions and talking points, and forced them to abide together under the same cognitive canopy, contradictions be damned. Such strained ideological and cultural production proved all the more urgent for male settlers, given that for many it involved an ongoing struggle to maintain their hold on a particularly valuable form of cultural cachet — their manhood.

Excerpted from American Laughter, American Fury: Humor and the Making of a White Man’s Democracy, 1750–1850, by Eran A. Zelnik. Copyright 2025 © by Johns Hopkins University Press. Published with permission of Johns Hopkins University Press.

Expect Freedom Upon Arrival

On September 1, 1864, Sam Richards, a white merchant, saw the city of Atlanta explode. Around noon, rumors began circulating of a Confederate defeat on the outskirts of town. Confederate general John Bell Hood would be evacuating his forces at once, leaving William Tecumseh Sherman’s Union Army free to take the city. The next day, local officials surrendered the city to Sherman’s army without a fight. The campaign was complete.

Atlanta was free, and neither Sherman’s soldiers nor the city’s formerly enslaved people wasted any time in demonstrating what that meant. Richards, for one, couldn’t believe the “impudent airs” Atlanta’s freed people put on in the face of their former masters. They “were all free and the Yankee soldiers don’t fail to assure them of that fact,” he wrote, noting that one freedwoman was “as independent as can be” and that two of the men he had enslaved had both escaped into the city. It was like that all over town. Atlanta, which weeks earlier had had thousands of enslaved people working on its defenses, was now a haven for freed people from across the region, with men and women pouring in from the surrounding countryside. As the dumbstruck Richards wrote, it was as if slavery had suddenly “vanished into air.”

Despite Richards’ apparent disbelief, slavery’s demise hadn’t been quite as sudden as he thought. If anything, it had been a slow process that had begun once Union armies started invading the South in the earliest days of the war; yet that, too, understates the complexity of what the process actually looked like on the ground, particularly in places still experiencing the vortex of war. James M. Wells, a U.S. cavalryman with a taste for adventure, had caught a glimpse of how complex the process could be some two months before Atlanta’s fall while retreating back to the city following a failed cavalry raid on targets in middle Georgia. He and his men were facing a tall task. Georgia’s scorching summer heat was in full blaze. The men were separated from the main body of mounted horsemen, and their only instructions had been to escape back to Atlanta by whatever means necessary. Though they had covered some of the terrain before, the good news ended there. Not only were the roads humming with Confederate cavalry, but their horses were tired, they would soon need food, and the army’s lines around Atlanta remained many miles away.

Much to the benefit of Wells and his band, a group of enslaved women soon discovered the desperate cavalrymen and began acting as their guides. The women had no horses of their own, so the men rode while the women walked. The women guided them down creek beds and along footpaths so deep and dark that Wells likened riding along them to descending the depths of “some vast subterranean cavern.” Oftentimes the glare of torches led the way, shining upon fords or foot trails that made the dense Georgia brush more easily passable, which must have added to the feeling Wells had while riding in the dark of the night.

He couldn’t help but appreciate the steely courage of the women, who navigated the “impenetrable darkness” and faced grave repercussions if caught by Confederates. Pretty soon, larger groups of enslaved people began following along, increasing the size of the band. The enslaved people were all “determined to flee the country with us,” Wells remembered, though he left no indication that anyone ever did.

He didn’t get the chance to find out. Not long after meeting the enslaved women, Wells broke from the group during a surprise shootout with Confederate cavalry and was later captured, making him one of the many prisoners of war who never made it back to Atlanta following an operation sometimes remembered as Sherman’s “big raid.” The original plan of attack was for two cavalry forces to swing around Atlanta in opposite directions. General Edward M. McCook’s force of Union horsemen was to ride west while General George Stoneman’s squad of cavalrymen was to ride east. The two were then supposed to join forces south of Atlanta at Lovejoy Station in an attempt to cripple Hood’s last remaining supply line into the city. If successful, the two cavalry forces would ride on to Macon, liberate the Union soldiers held there, and then head for Andersonville Prison, the great gulag of the Confederacy, which held as many as 33,000 Union prisoners and sat in the state’s southwestern corner near the town of Americus.

The problem was that the operation had been a fiasco from the start. Stoneman ignored orders. Rather than linking up with McCook at Lovejoy Station, he bolted straight for Macon. That left Confederate cavalry free to consolidate around McCook’s forces, leading to a standoff southwest of Atlanta near Newnan. McCook, caught off guard and exposed, had no choice but to retreat in a wild ride back to Atlanta that saw hundreds of Union cavalrymen either killed or captured. Stoneman, meanwhile, ran into trouble of his own just outside Macon. He encountered a large force of Confederate cavalry at Sunshine Church, and like McCook, he soon realized he was in trouble and ordered his men to make their own wild, lifesaving ride back to Atlanta, which was the situation Wells found himself in before being captured. But unlike McCook, who managed to escape capture, Stoneman did not. He and more than four hundred of his men, many of whom had been holding the line so others could escape, were taken captive, which put a final end to one of the most calamitous cavalry movements in the history of the war.

Nevertheless, despite being a stunning failure, the “big raid” was one of the first instances in which Sherman’s mounted wings dug deep into the Georgia countryside, carrying the war to communities well beyond Atlanta. For the unsuspecting, it was a wake-up call. A war that had once been distant and abstract was now up close and personal and looming all around. Militias took to arms, and whole communities stood guard. That meant “sleepless nights,” wrote Dolly Sumner Lunt, a widowed plantation mistress from Covington, southeast of Atlanta. Stoneman’s cavalry had been seen on the road, and Lunt had heard reports of stores being ransacked, railroads being destroyed, and neighbors taking flight.

In Newnan, where McCook’s raid came to an end, Fannie A. Beers, a nurse in a Confederate hospital, watched as Union cavalrymen clashed with Confederate horsemen. Locals rushed past her into the action while others fled. “There was no time for deliberation,” she wrote. The war was fast extending its reach.

For enslaved people, the raids carried a different meaning entirely. Some were rightly wary. Armed white men on horseback were specters that enslaved people knew to approach with caution. Yet the raids were also the army’s first foray into the upper sections of middle Georgia, a part of the state where the lower Piedmont begins folding into the state’s fertile plantation belt, home to thousands of enslaved people.

Blue-coated men penetrating that far into Georgia stirred already restless waters. Enslaved people began fleeing to the two cavalry divisions almost immediately. Several enslaved men guided McCook during his journey before being forced back at Newnan, and one historian has estimated that by the time Stoneman’s cavalry approached Macon, as many as five thousand enslaved people were following along. Tragically, many of those men—and possibly women and children, too—met a brutal fate once the raid went bad. One soldier urged them to “escape while they could, as their fate would be severe if captured” but admitted that “Some followed this advice” while “many others chose to remain” with the soldiers “at the risk of any fate.”

The men and women who fled to Sherman’s cavalry units or escorted escaped horsemen back to federal lines represent part of a much larger whole. In all, historians estimate that at least half a million enslaved people fled to U.S. Army lines during the Civil War. Any number of others may have run to the army but never made it; many more may have fled to the army but were turned away. In any case, the point is that amid all the fighting and dying, one of the mainstays of the war was that enslaved people consistently and inexorably ran to the U.S. Army. Like death, it was one of the war’s few certainties: wherever the army moved, enslaved people followed its movements until they caught up and joined the ranks. It happened first in 1861, when three enslaved men in Virginia rowed over to Fort Monroe in Chesapeake Bay and asked for refuge, but it happened in every theater from the beginning of the war up until the very end.

The army had no choice but to respond. Benjamin Butler, the dour-faced New Englander who commanded Fort Monroe when the three men arrived, set the army’s first policy by declaring the men “contraband of war.” As he saw it, the men had been forced to work on Confederate defenses, or so they claimed, so to return them would be equivalent to helping the Confederacy. Had the three men not claimed that they had been forced to work on Confederate defenses, it is unclear how Butler would have acted. Nonetheless, when a Virginia slave owner insisted that the three men be returned, Butler refused, arguing that as a form of Confederate property, the enslaved men could be legally confiscated by the U.S. Army according to the laws of war. That was not exactly an emancipation decree. Butler’s reasoning said nothing about freedom. But that was never the point. The importance of the contraband policy was that it established a legal framework that allowed the U.S. Army to “confiscate” enslaved people who arrived at army lines with similar stories. The path to wartime emancipation began with that basic premise.

Congress codified the policy a month later in August 1861 when it passed an Act to Confiscate Property Used for Insurrectionary Purposes, otherwise known as the First Confiscation Act. As its name implies, the bill allowed for the seizure of Confederate property used in the Confederate war effort, and it included wording that applied directly to persons “held to labor or service,” which everyone knew meant enslaved people.

Yet from the moment Lincoln’s signature dried, the bill’s complications became apparent: Who, for instance, would determine if an enslaved person had been forced to aid the Confederate Army? According to the U.S. government, that was a decision for the court system, not necessarily the army, to make, which raised questions about how and when a court would make that decision and what the army should do in the meantime. Moreover, because the bill mirrored the contraband policy, it said nothing about freedom, which locked enslaved people into an in-between status as neither free nor enslaved. What did that mean for the freed people who arrived at army lines? What did it mean for the army? All those lingering questions made the First Confiscation Act difficult to enforce and easy to ignore.

Meanwhile, enslaved people continued escaping to the army. By the spring of 1862, large camps of freedom seekers began forming around Washington, DC, and Fort Monroe; other camps formed along the coast of Georgia and the Carolinas on islands occupied by the U.S. Army. Those camps have traditionally been known as “contraband camps,” but historians have recently renamed them “slave refugee camps” because that’s what they were: makeshift encampments and tent cities attached to the army and housing hundreds, sometimes thousands, of refugees from slavery, many of whom arrived with nothing while others carried all they had in a cart or wagon. Some refugees would eventually find work with the army. Men worked as teamsters or laborers; women often served as cooks or laundresses for a particular regiment. But not everyone found work. Others, especially women, children, and the elderly, lived as bona fide refugees, eking out an existence in the shadow of the army however they could.

Out in the west, where the army moved deeper and deeper into the Mississippi Valley as 1862 wore on, the situation was the same except that the camps there tended to be larger and more numerous. Large numbers of enslaved people fled to Nashville after it fell in February 1862. The same was true for New Orleans when it fell in May, and even larger numbers of enslaved people began arriving in the area around Memphis once the army occupied the city later in June. By then, a pattern had clearly emerged: all along the army’s path and especially in places where it had established firm control over a given area, enslaved people arrived in search of refuge. The experience was enough for some within the army to throw up their hands and ask for help from the government in deciding what to do. Ambrose Burnside, a U.S. general known more for his characteristic facial hair than his generalship, captured the exasperation some felt when he wrote from New Bern, North Carolina, describing an attempt to deter the refugees as “utterly impossible.”

The collective movement of so many enslaved people into army lines during the first year of the war eventually forced Congress into a policy change. In July 1862, just over a year after the first refugees had arrived at Fort Monroe, Congress passed a second, far more comprehensive Confiscation Act that read like a virtual emancipation decree. It dispensed with the messy distinction that only enslaved people who had been forced to work for the Confederacy could attain refuge and instead declared that any enslaved person with a rebellious master who escaped to federal lines would now be “forever free of their servitude, and not again held as slaves.” The new wording made achieving refuge more accessible. But it was the last bit that mattered most of all. The bill specifically mentioned freedom, which moved away from the language of confiscation and announced Congress’ official endorsement of wartime emancipation. Enslaved people within occupied parts of the South could now escape to the army and expect freedom upon arrival.

Few people knew, however, that as Congress debated the merits of emancipation on Capitol Hill, just a few blocks away in the White House, President Abraham Lincoln was planning an emancipation order of his own. 

Excerpted from Somewhere Toward Freedom: Sherman’s March and the Story of America’s Largest Emancipation by Bennett Parten. Copyright 2025 © by Bennett Parten. Reprinted by permission of Simon & Schuster, LLC.

Who Built the Panama Canal?

A young man named Edgar Llewellyn Simmons sailed out of Carlisle Bay, Barbados, on a Royal Mail boat in January 1908 to work on the Panama Canal. After a two-week journey, he reached Colón and construction officials found him a place to sleep for the night. The whistle blew at 6:00 am the next morning. When Simmons lined up, the boss picked every other man and gave each a pick and shovel. Simmons thought he would get a better job, since he’d not been chosen yet. Then came “one of our own West Indian fellow men” who took him to a dump car filled with coal, handed him a shovel, and told him to get to work unloading it all. The bosses called him “Shine.”

So begins the story of Edgar Simmons, one of the tens of thousands of men and women who traveled to Panama during the construction decade from 1904 to 1914. They left homes in Barbados, Jamaica, St. Lucia, Antigua, the Bahamas, Grenada, and other Caribbean islands. Simmons’ story is held in Box 25 of the Isthmian Historical Society Collection at the Library of Congress along with 111 other first-person testimonies by canal workers. The testimonies resulted from a competition in 1963 for the “Best True Stories of Life and Work” during the construction of the Panama Canal. As the 50th anniversary of the Panama Canal’s completion in 1914 approached, Ruth C. Stuhl, president of the Isthmian Historical Society, designed a competition open to all non-U.S. employees. She wanted to capture the experiences of laborers from the British Caribbean.

The historical society placed ads for the competition in newspapers in Panama, Jamaica, Barbados, British Honduras, Trinidad, Antigua, St. Vincent, St. Lucia, and Grenada, and sent notices about the competition in several thousand food packages distributed to disability relief recipients. The straightforward instructions declared that the historical society wanted to gather true stories of life and work on the Panama Canal during its construction. It noted that little had been written about the experiences of West Indians, and that officials wanted to generate remembrances before time ran out. People who had trouble writing or were not literate might ask a friend or family member to write for them. Stuhl declared upon completion of the competition that “the Society is most grateful for all the entries and we regret that there could not be a prize for everyone.”

Men of African descent from the British Caribbean wrote nearly all the testimonies. Their entries ranged in length and detail: some were mere fragments, a few sentences long, while others were six or more pages in length. The writers in Box 25 constitute a small sample of the West Indian men and women who exerted a tremendous influence on the history of the Americas. Constructing the Panama Canal was a ten-year undertaking that generated several waves of migration and proved to the world that the United States was a dominant power in the Caribbean and Latin America.

People migrated from dozens of countries around the world to build the canal, but the largest numbers came from British Caribbean islands like Jamaica and Barbados. The majority were men, many of whom signed labor contracts with the U.S. government, while others traveled on their own to the isthmus to seek a job. Over time several thousand women joined them, working often as laundresses or domestic servants. Some brought children with them, and others gave birth in the Canal Zone, so gradually family life shaped the canal experience for many. These Afro-Caribbeans tended to be rural, leaving harsh lives working small pieces of land or laboring for large landowners. They boarded a ship for the isthmus in hopes of earning more money, acquiring new skills, or simply seeing the world. Many people think of these workers as providing the brutal, unskilled labor demanded by the canal project, and to be sure a great number worked as diggers and dynamiters. But over time Afro-Caribbean men often received training and began working at jobs originally limited to white North Americans, for example as carpenters or machinists. Movement had long been important in Caribbean history, but this vast wave of migration to the Panama Canal Zone changed the Americas forever.

Many migrants settled permanently in Panama, making that young republic a profoundly Caribbean- and African-descended nation. Tens of thousands more moved onward across Central and South America, the Caribbean, and often farther along to the United States after the canal construction was completed. The Caribbean American community in the United States owes its origins predominantly to these forefathers and foremothers who traveled to Panama to work on the canal. And as they moved, their culture and political perspectives — a hybrid of African diasporic influences and the impact of British colonialism — migrated along with them. For a relatively impoverished migrant group, they were unusually cosmopolitan and quite sophisticated politically, more likely to have received some education and to have navigated across different empires and work regimes.

 

The men and women who chose to submit testimonies to the competition were working people, often landless laborers or craftsmen. They traveled to the Canal Zone to escape harsh environments on islands across the Caribbean, where most earned wages so low as to be near starvation. They had heard about well-paying jobs helping build the Yankees’ canal. Surely, they knew, it would be far more than they could earn in St. Lucia, Barbados, or Jamaica. They hoped they could save enough money to return home and buy a piece of land or open a shop. The men worked as carpenters, blacksmiths, railroad workers, gravediggers, salesmen, waiters, hospital attendants, janitors, and of course, diggers and dynamiters. The women most often worked as domestic servants in the homes of white officials or skilled workers from the United States. When the canal opened to great acclaim in 1914, Afro-Caribbean canal workers found other jobs. Some, including most of the Box 25 authors, kept working for the Isthmian Canal Commission, helping maintain the canal operations, while others moved home or onward to plantations across Central America, or saved their money and headed to New York City. Decade upon decade passed, and by 1963, when the competition was announced, the original canal builders were aged men and women.

Those who submitted testimonies were most often male workers who had spent their lives in Panama or the Canal Zone. No longer working, typically confronting severe poverty as well as the ravages of time on their bodies and souls, dealing with disability and disease and sometimes the approach of death, they looked to this competition as a possible lifeline. They wrote up their memories or asked a son or daughter to write for them. They placed stamps on envelopes and sent in their testimonies with hopes the prize money would afford them a few days of comfort.

When Afro-Caribbeans disembarked from their ships and entered the Canal Zone in the early twentieth century, they confronted a sea of white faces. Their white bosses and supervisors were tough taskmasters.

The United States developed the Canal Zone into one of the most modern and industrialized places on earth and sought to discipline Afro-Caribbeans into an efficient army of labor. U.S. officials and foremen saw their Caribbean workers as childlike creatures who needed to be prodded constantly to work hard. The official government archives lump many thousands of workers together, melting away important cultural and geographical differences. Official government records, for example, typically labeled all these workers “West Indians” instead of noting that an Antiguan and a St. Lucian might not see eye to eye, or that Caribbean foremen and policemen, usually from Jamaica, were feared and held in contempt.

Officials’ lack of deep understanding of their workers makes their writings of limited use. The testimonies in Box 25, by taking us into life and work in the Canal Zone through the eyes and souls of Caribbean workers rather than their supervisors, provide an opportunity to recreate the experience from the perspective of laborers. Like any archive, the testimonies in Box 25 emerged from a complex process in which personal experiences became entangled with the power dynamics of the larger world — in this case the colonialism of the United States and Britain, global capitalism, and the racial, gender, and class structures that resulted. 

The canal builders left few accounts of their experiences, despite their numbers and their major role in building the canal. Most archival sources regarding Afro-Caribbean male and female workers flatten their experiences or erase altogether the complexity wrought by the diverse cultural and socioeconomic characteristics of their home islands. Government officials, medical personnel, white U.S. housewives, and British globetrotters all published memoirs that bring the construction years to life. We can observe the project through the eyes of visitors like Speaker of the House Joseph Cannon or Presidents Roosevelt and Taft. We can pore over letters written home by a white U.S. steam shovel engineer or other white working men. We can follow a census taker turned policeman as he crisscrossed the zone, sharing his opinions about details small and large, thanks to the book he wrote. But for those Afro-Caribbeans who so dominated the labor force, we have very little.

In St. Louis, Missouri, the U.S. government keeps personnel records it collected over the centuries on its employees, including the hundreds of thousands of Afro-Caribbean canal workers. I looked at thousands of records there, and with some digging I found many of the Box 25 authors — including Edgar Llewellyn Simmons. The U.S. government had in most cases tracked and surveilled these workers for decades. Placing those personnel records in dialogue with the testimonies illuminated their lives and the stories they told. It also became possible, in many cases, to see photos of these workers whose words I had analyzed, bringing them more vividly to life. The personnel records thus created a wholly different lens for analyzing workers’ lives, viewpoints, and the struggles they faced. When we examine official government archives like the personnel records collected by the United States, we have to look through the haze of colonial condescension to reveal the lives of working men and women.

Officials saw workers as a means of production to be managed, disciplined, surveilled, and then disposed of as easily as possible. Michel-Rolph Trouillot, Jennifer Morgan, Marisa Fuentes, and other scholars have explored the complex history of archival production, the ways archives emerge out of existing power relations, and the silences built into them. They note the need to read archives carefully to comprehend fully the agency, subjectivity, and experiences of working people caught in the surveilling power of official archives. As Fuentes puts it, we must “account for the conditions in which they emerge from the archives.”

The Box 25 testimonies tell a story of transimperial relationships, of Caribbean canal workers who moved through a terrain marked by the British and U.S. empires but haunted as well by the legacy of the Spanish and French empires. The influence of the Spanish empire remained in the legal, political, and cultural structures of the Republic of Panama, while the tragic French effort to construct an ocean-level canal had transformed the landscape of the isthmus in ways that continued to shape the U.S. project 20 years later. Afro-Caribbean workers carried with them a personal history of colonialism, manifested via the impact of disease and the remnants of scars from their labor, their bodies a “secret archive of harm,” as one novelist phrased it. By examining the historical production of archives related to Caribbean labor, this project calls into question the assumed stability of the official colonial archive. It asks how officials’ misunderstanding of their workers as well as the demands of colonialism and global capitalism shaped the creation of the archives.

The Isthmian Historical Society, a social club founded in 1956 by prominent white residents of the Canal Zone, brought the testimonies of Box 25 into existence. The society’s constitution noted its objective: to “promote and inculcate interest in, and appreciation, study, and knowledge of the history of the Isthmus of Panama.” The society organized events that celebrated the history and legacy of the zone, honoring Theodore Roosevelt, for example, or bringing Maurice Thatcher (who headed the Department of Civil Administration for some years during the construction era) to give a public lecture. In 1958, to mark the centennial of Theodore Roosevelt’s birth, celebrations were held across the zone to honor white U.S. canal workers. Dozens of them returned to the isthmus for the event. As part of the celebrations, the Isthmian Historical Society interviewed thirty-five Roosevelt Medal Holders on tape, then transcribed the interviews and donated them to the Panama Canal Zone Library-Museum archives. The interviewees — male clerks, engineers, postmasters, and some of their wives — mostly recounted their employment history and a few vivid memories such as the exploding of Gamboa Dike that marked the completion of the canal. The interviews perhaps inspired a young librarian, Ruth Stuhl, to undertake a very different project a few years later: to collect memories of Afro-Caribbean canal workers.

 

Of the 112 entries, Ruth Stuhl chose sixteen contenders for best essay and forwarded copies to the three judges. Many of the essays provided illuminating reflections on the construction era — in general they were far more informative than the tape-recorded sessions with “old-timers” from 1958. Judge Loren Burnham noted the themes he saw repeated most often in the testimonies: a pride of workmanship, pride at being part of the “great Canal enterprise,” difficulty of supporting a family on low pay, and “satisfaction and a feeling of teamwork with ‘good’ bosses.” 

The society awarded first prize ($50) to Albert Peters, originally from Nassau, Bahamas, but living in Cristobal; second prize ($30) to George H. Martin of Barbados, living in the Canal Zone; and third prize ($20) to Alfonso Suazo of Honduras, living in Panama at the time of writing. Each of these three individuals wrote stirring essays of several pages in length. Peters’ entry eloquently told a harrowing tale of illness, interactions with doctors and nurses, and their successful efforts to treat him. Martin notably quoted from contemporary songs and included quite a bit of detail about everyday life. Suazo, one of the very few who wrote a testimony in Spanish, described the difficulties of life on the job. Clearly the job of judge involved subjective evaluations — there were many other essays as eloquent as these three. It’s very possible that in awarding one of the prizes to a Latin American, the judges were acknowledging the continued importance of Spanish heritage in Panama. The prize money would have made a difference in the men’s lives. Adjusted for inflation, fifty dollars awarded in 1964 would be worth nearly $500 today.

The testimonies remained in the zone as part of the Canal Zone Library-Museum until 1999, when the transfer of the Panama Canal to Panama was completed. At that point the entire holdings of the Canal Zone Library-Museum moved to the Library of Congress in Washington, DC. Today Panamanian scholars must travel to the United States to unearth their country’s history, to visit the vast holdings at the U.S. National Archives or the original copies of the Box 25 testimonies at the Library of Congress. To some Panamanians, the fact that the Box 25 testimonies reside in the U.S. capital rather than in Panama vividly suggests the continued legacies of colonialism.

From Box 25: Archival Secrets, Caribbean Workers, and the Panama Canal by Julie Greene. Copyright © 2025 by the University of North Carolina Press. Used by permission of the publisher.

Fri, 04 Jul 2025 08:44:08 +0000 https://historynewsnetwork.org/article/186010 https://historynewsnetwork.org/article/186010 0
The First and Last Queen of Haiti in Exile Although the queen had stood by Henry Christophe’s side since the earliest days of the Haitian Revolution, and eventually outlived most of her immediate family, dying in 1851, hardly any of the kingdom’s many chroniclers bothered to consult her tale. Much later, while living in exile in Italy — as one of the only “black faces,” in her words — she at last told her story to a British acquaintance, a frequent visitor at the former palace. She lamented that she had suffered through the deaths of her husband and her sons, including that of her eldest, François Ferdinand, who died in Paris in 1805. Seeking neither recognition, nor glory, nor pity, nor wealth, she said with a sigh, “I have lost a husband, an empire, and [nearly] all my children … sorrow has quite weaned me from the vanities of this life; at my age and in my situation, I can only look forward to the next world, as a place of rest and peace.”

Although Marie-Louise, then residing in England, at first with the abolitionist Thomas Clarkson’s family at Playford Hall, and later with her daughters on Weymouth Street in London’s Marylebone district, told other visitors, too, that she sought only to lead a life of solitude, soon after she arrived in Europe, an unflattering spotlight seemed suddenly to turn in her direction. Almost immediately after news of King Henry’s suicide reached Europe, the Nîmes-based Journal du Gard published a notice about the “fall of Christophe.” The Haitian royal family’s image did not necessarily fare better in England. The Christophe women arrived in London in the fall of 1821 to a British public primed with curiosity about the sordid story of the king’s death.

On December 11, 1820, Britain’s Morning Chronicle, clearly informed by Haitian newspapers, reported that one of the “first advantages that will be derived by humanity from the late revolution in the north side of Hayti, and the Death of Christophe, is the liberation of several victims, who, in the character of political enemies of his late sable Majesty, have long dragged a miserable existence in the dungeons of his citadel of Sans Souci [sic], who possibly had lost all hopes of ever again seeing the light of the sun.”

The Christophe women initially intended to stay with the Clarksons for only a few weeks, but they ended up extending their sojourn for half a year. During that time all three women suffered from bowel complaints and frostbite, unaccustomed as they were to such a cold and rainy environment. The Clarksons played patient and sympathetic hosts. They sought medical care for their guests and brought in tutors in French and Italian for the princesses. Clarkson also helped Marie-Louise attend to her finances. With his assistance, she obtained a credit account, and for a time life in England seemed to suit the grieving family. The Christophes even had dinner at the home of their former patriarch’s friend William Wilberforce in 1822, before they moved to a more secluded cottage in the seaside town of Hastings. Away from London’s hustle and bustle, they hoped to avoid the stares of strangers, undoubtedly curious about the presence of these stately and finely dressed Black women.

The friendship between the Clarksons and the Wordsworths grew strained over the presence of the Christophe women. The Wordsworths were more than a little scandalized that a Black queen was living in England. Hardly hiding their opinion that her presence was inappropriate, William’s sister, Dorothy, wrote a letter to Mrs. Clarkson in October 1822 in which she enclosed a racist poem mocking Queen Marie-Louise, written by William Wordsworth (author of the famous sonnet “To Toussaint L’Ouverture”) and his sister-in-law Sara Hutchinson. “My dear Friend,” the letter began,

At the end of my letter I must copy a parody (which I hope will make you laugh), that William and Sarah [sic] threw off last Sunday afternoon. They had been talking of Mr. Clarkson’s kindness to every human being, especially of his perseverance in the African cause, and of his last act of kindness to the distressed negro widow and her family. Tender thoughts of merriment came with the image of the sable princess by your fireside. The first stanza of Ben Jonson’s poem slipped from William’s lips, a parody, and together they finished it with much loving fun. Oh! how they laughed! I heard them in my room upstairs, and wondered what they were about; and, when it was finished, I claimed the privilege of sending it to you … Ben Jonson’s poem begins “Queen and huntress chaste and fair.” You must know it.

Queen and negress chaste and fair!
Christophe now is laid asleep
Seated in a British chair.
State in humbler manner keep
Shine for Clarkson’s pure delight
Negro princess, ebon bright!

Let not “Willy’s” holy shade
Interpose at envy’s call,
Hayti’s shining queen was made
To illumine Playford hall,
Bless it then with constant light,
Negress excellently bright!

Lay thy diadem apart,
Pomp has been a sad deceiver.
Through thy champion’s faithful heart
Joy be poured, and thou the giver,
Thou that mak’s’t a day of night
Sable princess, ebon bright.

Surprised at the brazen and overt racism, the Clarksons stopped speaking to the Wordsworths. This estrangement continued for several months until Dorothy apologized, with a hint of sarcasm, for “our joke on poor fallen royalty.”

 

Unable to reconcile herself to the climate, both racial and social, Marie-Louise opted to take her daughters to Italy. According to Catherine Clarkson, who later lost touch with the Christophe women, Marie-Louise remained in contact with her grandson, Prince Eugène’s child, and she hoped to spend the final years of her life back home in Haiti.

Though the climate of Italy suited her well, Marie-Louise still suffered much in body and mind. Her daughters were unwell, and all three women continued to be subjected to ridicule and derision. In an article quoted in the Black American newspaper Freedom’s Journal on May 11, 1827, the author refuted an inflammatory account of Marie-Louise, first published in the New-York Enquirer. In that newspaper, the ardent racist Mordecai Manuel Noah denounced as improbable a rumor that Madame Christophe was engaged to a German prince, since readers must “remember she is a fat, greasy wench, as black as the ace of spades, and one who would find it difficult to get a place as a Cook in this city.” “So much for royal taste,” he concluded. The author of the refutation in Freedom’s Journal, the first Black-owned newspaper in the United States, defended Madame Christophe against “this calumny” by writing, “We are induced, from a personal acquaintance with Madame Christophe for many years previous to and after she was elevated to the rank of Queen of Hayti, to bear testimony against the above illiberal and unjust representation.” “We do not hesitate to say, that no just person acquainted with the Ex-Queen could have thus characterized her, and that there are many Americans who will unite with us in this declaration,” the author wrote.

Marie-Louise and her children made every effort to live with dignity during their long exile. Having spied them in 1830 in the vacation spa town of Carlsbad (Karlovy Vary), part of the Austrian empire at the time, the French writer François-René de Chateaubriand could not help but take his turn gawking at and then writing about the Christophe women. Of Athénaïs, Chateaubriand wrote that “she was very educated and very pretty.” “Her ebony beauty rests free under the porticos among the myrtles and cypresses of Campo Santo, far from the field of cane and mangrove trees, in the shade of where,” he added, “she was born a slave.” Of course, lucky for them, Chateaubriand was mistaken: the Christophe girls had never been enslaved. Still, they did not escape sorrow.

The Christophe women’s exile across the Atlantic took them from England, to Austria, to the Italian cities of Rome, Florence, Turin, and Pisa. Amid their many wanderings, Marie-Louise and Athénaïs experienced a new tragedy when in October 1831, shortly after the three women took up residence in Pisa, Améthyste passed away from complications of an enlarged heart. Athénaïs passed away even more tragically eight years later, in the city of Stresa where she had been vacationing with her mother and where they had become friends with the Italian philosopher Antonio Rosmini. On September 10, 1839, Athénaïs reportedly hit her head so violently during a fall that she died. After Marie-Louise returned to Pisa, lonely and childless, with her late daughter’s corpse in tow, a friend of Rosmini’s lamented, “I very often see the unfortunate ex-queen of Haiti here.”

Occasional visitors, like the Englishman Robert Inglis, sometimes graced Marie-Louise’s doorstep, but for the most part the former queen passed her remaining days alone. Inglis tried to persuade Marie-Louise to return to England with him, but by that time she was perhaps too infirm, or too deeply resigned, to think of starting over again. She did appear to humor Inglis’s entreaty. He said, “We pressed her to think of coming back: she said that she had never liked any country so well as England; that she would never have left but for the health-sake of her daughters; but that now she had only to lie down & die, that she was daily endeavouring to prepare for it.” During their last meeting, Inglis recalled with an air of wistfulness, “I again took her hand & kissed it … she embraced me; & said that I was like her son, that her son would just have been of my age.”

If she did not want to return to England, Marie-Louise did seek to return to Haiti. Addressing a letter to President Boyer from Turin on November 7, 1839, Marie-Louise confessed, “A final and frightful misfortune has just put a climax to the calamities by which it has pleased divine Providence to cause me to experience.” “The last of my daughters, Madame Athénaïse [sic] has just succumbed,” she continued, before imploring, “In the state of isolation and abandonment in which I find myself, my thoughts and my wishes naturally turn toward my dear homeland, love for which has never faded from my heart.” Adding that she hoped to spend her final days among those with whom she shared “blood ties and who do not regard me as a foreigner,” she also asked for a passport for her sister Geneviève Pierrot. Boyer ended up denying Marie-Louise’s request to return to Haiti, but he did authorize Geneviève to travel to Italy, where both remained for the rest of their days.

Marie-Louise, unfortunately, had more suffering to endure. Even with the consolation of a sister by her side, her health continued to decline. In 1842, likely because of gangrene brought on by diabetes, she had her left foot amputated. Afterward, like so many Black women in the nineteenth century who found themselves in Europe, forcibly or of their own volition, she became the unwitting victim of scientific observation. Ferdinando Bellini, the surgeon who operated on Marie-Louise, donated her amputated limb to the museum at the University of Pisa. According to a notation in the museum’s archives, Zefferina, the Black “chambermaid” who attended Marie-Louise in her final days and to whom she bequeathed 400 Spanish pillar dollars in her will, also ended up in the museum after her own death in 1855.

A deeply pious woman, Marie-Louise had donated money for a small church to be built in Pisa called San Donnino, where she buried both her daughters under marble headstones in a dedicated sacristy, and where she herself was buried after her death on March 14, 1851.

Thanks to the efforts of scholar Miriam Franchina, there are now two historical markers commemorating the Christophe women in Pisa: the first outside the chapel where the women were interred, and the second in front of Marie-Louise’s last known residence at Piazza Carrara, a belated homage to Haiti’s first and last queen.

Adapted from The First and Last King of Haiti: The Rise and Fall of Henry Christophe © 2025 by Marlene L. Daut. Excerpted by permission of Alfred A. Knopf, a division of Penguin Random House LLC. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Fri, 04 Jul 2025 08:44:08 +0000 https://historynewsnetwork.org/article/186009 https://historynewsnetwork.org/article/186009 0
Exit, Pursued by a Stork Pregnancy, or expected “blessed events,” should never be discussed as such in screen stories. Most censor boards not only frown upon, but almost always delete any such references. Any direct or crude reference to pregnancy is considered out of place exactly as it would be in any normal society where children are present. It is entirely acceptable, of course, to refer to the baby that is expected, but any reference to conception, childbearing, and childbirth is considered improper for public discussion.

So wrote Olga J. Martin in Hollywood’s Movie Commandments: A Handbook for Motion Picture Writers and Reviewers (1937), designed to help film studios interpret the Production Code of 1930. In force until the late 1950s, the Production Code (commonly called the “Hays Code” after Will Hays, president of the agency tasked with enforcing it) provided content guidelines for U.S. film studios to avoid direct censorship from state and local film boards around the country. Though the code itself had little to say about pregnancy — only that “scenes of actual childbirth” were banned — officials like Martin made clear that neither the word “pregnant” nor the appearance of a “baby bump” would be permitted on screen. 

During this era, then, films used a variety of euphemisms to communicate pregnancy: a character seen knitting little garments, fainting, craving pickles, or visiting a doctor, for instance, was almost certainly pregnant. A man buying or distributing cigars was a father-to-be. The stork who delivered babies by dropping them on doorsteps or down chimneys was less easily integrated into live-action narrative, but quickly became a staple in animated stories of marriage and childbearing. 

Stork/baby folklore is quite ancient and ambivalent (it may originate with the Greek goddess Hera turning a rival into a bird), and travels through Hans Christian Andersen’s dark fairy tale The Storks (1839), but contemporary stork discourse in the U.S. is heavily sanitized: storks have largely been reduced to a handy tool to divert curious children wondering how babies are made. But in the hands of unruly code-era animators, the stork also provided a means to reference the facts of life without drawing the ire of industry censors. 

 

Though even roundabout references to birth control and reproductive choice largely disappeared from live-action films of the code era, those discourses survived through the euphemism of the stork. The stork was rather perfect as a replacement for the figure of the pregnant woman, precisely because it produced a complete separation between the woman’s body and the expected baby. And in this space of the safely hygienic, the stork also made room for some discourses of childbearing that were heavily repressed in live-action features.

A familiar scene from Dumbo (1941), for instance, shows an elegant squad of delivery storks under a bright moon flying in formation like synchronized swimmers. They drop their bundles to the circus animals below, and every mother animal is delighted with her new offspring. The stork method of reproduction is uniquely painless and convenient. It is strange, then, that the lyrics to the accompanying song, “So look out for Mr. Stork / And let me tell you, friend / Don’t try to get away / He’ll find you in the end,” sound almost menacing, as if the stork is an FBI agent on the trail of a supercriminal. Playful as they are, the lyrics acknowledge an alternate reality, where childbearing may not be joyful or individuals may fairly long to “get away” from the burdens and expenses of pregnancy and parenthood. Stork stories of the 1930s–1950s, particularly in animated form, frequently made room for ideas that were suppressed in code-approved comic representations of pregnancy, including the desire for contraception and anxieties about adoption.

The acknowledgment that not all reproduction is wanted referenced a social attitude that was spreading during the difficult years of the Depression, when even middle-class families felt the burdens of widespread economic hardship. Both Mickey’s Nightmare (1932) and the later Beau Ties (1945) are animated shorts featuring male characters dreaming about marriage, happy fantasies that turn into nightmares when the stork goes berserk, delivering dozens of mischievous children who bring chaos and destruction to their homes. Another animated short, Puzzled Pals (1933), dramatizes a stork’s dilemma when faced with a town where no one is willing to accept delivery of a baby. First the stork flies over the town and finds that his official destination is blocked, with the chimney covered, all windows boarded up, and a Detour sign posted on the roof. The stork then flies around town, looking for an alternate destination, but finds that every resident has taken precautions to prevent a new delivery: all the chimneys are covered over, and signs on the houses announce increasingly ludicrous reasons for quarantine, including measles, scarlet fever, leprosy, and seven-year itch. The stork soon finds an uncovered roof and is about to drop the baby down the chimney when five children run out of the house and begin shooting at it with toy guns and arrows. They are quickly joined by fourteen infant siblings and two parents. The father uses a hunting rifle and the mother a tommy gun to defend their home from the unwanted delivery. The cartoon is not subtle and boldly jokes about a rational desire for birth control in overstretched American families and neighborhoods.

All’s Fair at the Fair (1938), on the other hand, includes a birth control joke so sly you can barely catch it. A couple of country bumpkins, Elmer and Miranda, stroll around the world’s fair exclaiming, “Wonderful!” as they gawk at all the fantastical modern innovations. A knitting machine? “Wonderful.” A machine that makes furniture from logs? “Wonderful.” Eventually, they approach a machine that produces prefab houses. As the houses roll off the production line, a stork approaches and drops a baby in each chimney. “Ain’t that wonderful?” muses Miranda. “Nope,” replies Elmer, at which his wife covers her mouth and giggles. The idea that not all couples want children was still a little bit naughty and particularly funny coming from this wholesome and naive pair. They may be baffled by the wonders of modern technology, but they understand storks perfectly well.

Storks were useful surrogates for the concept of birth control in the studio era because they drove a small wedge into the idea of reproduction’s inevitability and divine predestination. Animated stork shorts of the 1930s–1950s show the process of delivery as something that does not always run smoothly — rather than being messengers from some daunting higher power, cartoon storks are often all too fallible and introduce chaos, obstacles, transgression, and choices into the story of reproduction.

While live-action films of the same period largely erase non-white reproductive practices, stork narratives also make room for a playfully diverse perspective on reproduction. A 1933 Warner Bros. cartoon, Shuffle Off to Buffalo, shows storks arriving in a sort of baby factory in heaven, where elves diaper and feed preborn babies and then dispatch them to earth. The popular song referenced by the title provides a rhythm for the factory and refers to the tradition of honeymooning at Niagara Falls, in upstate New York. So although all the action is set in the chaste baby factory, the excitement of marital consummation happening down on earth is always exuberantly present.

With sexuality thus relegated to a supporting role, much of the film’s visual humor is based on ethnic jokes: a Jewish baby is stamped on the bottom with a “Kosher for Passover” seal of approval; Father Time pulls two babies out of a freezer to send in reply to a request from “Mr. and Mrs. Nanook of the North.” These brief and stereotypical appearances relegate nonwhite reproduction to a marginal and humorous position, while white babies predominate, rolling by on a conveyor belt to be diapered, fed, and prepared for delivery. Though it was certainly a labor-saving device for the animators to render the white babies identical, this technique also produced a text that supports the idea that white babies are standard and normal, while minoritized babies are marked out as unusual and “funny.” Very much in the tradition of vaudeville ethnic play (singer Eddie Cantor makes a cameo appearance), Shuffle Off to Buffalo is a sly, messy, ambivalent celebration of sexuality and reproduction that could exist only in the world of animated fantasy.

As the economic troubles of the 1930s gave way to the baby boom of the late 1940s and 1950s, animated storks reversed course a bit and frequently came to represent national fertility in overdrive. The Farm of Tomorrow (1954) is a faux newsreel touting innovative animal crossbreeding. The narrator explains, “Here, we’ve crossed the old reliable stork with a big-horn elk to accommodate you impatient newlyweds, who are in a hurry for a big family.” The image shows a stork with a giant rack of horns, babies hung from each branch like Christmas tree ornaments. Baby Bottleneck (1946) starts with a drunken stork at the Stork Club complaining about its hectic delivery schedule. Labor conditions at the bustling baby factory are so excessive and chaotic that Porky Pig is installed as the new production chief, with predictably disastrous results.

Drunken storks are everywhere in animated films of the 1950s, mirroring the reproductive recklessness of the baby boom. These untrustworthy reproductive agents frequently deliver babies to the wrong houses, resulting in cross-species adoption narratives that work through stories of unwitting parents doting on genetically unrelated babies. A Mouse Divided (1953) sees a drunken stork bringing a mouse baby to a family of cats. Goo Goo Goliath (1954) shows a stork too drunk to carry a giant baby all the way to its new home at the top of a beanstalk. Giving up, the stork instead takes the baby to a human-size couple, who raise it as their own. Lambert the Sheepish Lion (1951) sees a stork leave a lion cub for sheep parents to raise. In Apes of Wrath (1959), a drunken stork loses an ape baby, so the stork kidnaps Bugs Bunny and delivers him to the ape parents instead.

These twin concerns, birth control and adoption, showcase the flexibility of the stork narrative in surfacing the hidden features of American reproductive practice. A final example dramatizes both ideas. In Stork Naked (1955), when a drunken stork delivers an egg to Daffy Duck and his wife, Daffy tries unsuccessfully to fight him off. Stuck with the egg, Daffy is delighted to see that the hatchling is a baby stork, not a baby duck. He immediately wraps up the chick and flies it back where it came from, muttering triumphantly, “For once, that stork is gonna get a taste of his own medicine.” A bit of the subversive birth control logic of 1930s storks combined with the wrong-delivery excesses of the 1950s result in a portrait of the stork’s role as a sly avatar of reproductive ambivalence. 

Excerpted from It’s All in the Delivery: Pregnancy in American Film and Television Comedy by Victoria Sturtevant. © 2024 by the University of Texas Press, published with permission from the University of Texas Press.

Fri, 04 Jul 2025 08:44:08 +0000 https://historynewsnetwork.org/article/186006 https://historynewsnetwork.org/article/186006 0
The Soundtrack to Vietnam War History Isn’t Quite Historically Accurate Today, more than any other American conflict, the military action popularly known as “Vietnam” (though it eventually involved Cambodia and Laos as well) has a soundtrack in the popular imagination. Thanks to Hollywood and documentarians like Ken Burns, we think we know the characteristic sounds of that war and what they meant for how it was fought. Cue up the Jefferson Airplane, the Doors, and Jimi Hendrix, with Grace Slick’s shimmering vocals catapulting us into the underground in “White Rabbit,” Jim Morrison’s baritone intoning the epic psychodrama of “The End,” and Hendrix’s wailing Stratocaster reinventing “The Star-Spangled Banner” as cri de coeur. In the exposition sequence of Apocalypse Now, Captain Willard (played by Martin Sheen) grouses about young soldiers he characterizes as “kids, rock and rollers with one foot in their graves.” That film, like Platoon, Born on the Fourth of July, Full Metal Jacket, and others, features music so prominently that the significance of music for American war-making in the 1960s now seems well understood and self-evident—if not a cliché. The defining, iconic music of the era — with its jangly guitars, deep reverb, and howling feedback — is heard as voicing the psychic lives of both American soldiers and the nation at large. In a 1977 review of Michael Herr’s landmark book Dispatches, critic John Leonard called Vietnam “our first rock-and-roll war,” and since then, many have come to see it that way.

While not entirely wrong, the Hollywood version is too loud. It drowns out a history that was more complicated — and more important — than Hollywood and Ken Burns have led us to believe. Whatever music meant for the counterculture and the anti-war movement, it did not mean the same thing in Vietnam. There, music was part of the machinery of waging war. Rock and roll did not disrupt how the war was prosecuted as much as it stood in for actual dissent, the opportunities for which were quite limited within the constraints of the military. If anything, from the military’s point of view, it was soul music — not rock — that threatened the status quo. Paradoxically, an account of the war attuned to rock as a revolutionary force may unwittingly serve the interests of the war machine, for such a narrative diverts attention away from what music was actually made to do in Vietnam, obscuring the ways music affirmatively kept the war going. 

 

For personnel assigned to combat duties, music was relatively scarce. Soldiers in the jungle needed to be all ears for signs of enemy belligerents, and they avoided extraneous sounds which could betray their own positions. On patrol, listening and being quiet could be matters of life or death. “I don’t know where or why the Vietnam War got the nickname ‘the rock ’n’ roll war,’” wrote W.D. Ehrhart, who served thirteen months in Vietnam in 1967–68. “That certainly wasn’t my experience.” He then detailed a few musical experiences he did have during his tour but noted such occasions “are so memorable precisely because they are so rare.”

But many more soldiers served in positions away from the front lines. Disdained by combat troops as “rear-echelon motherfuckers,” or REMFs, noncombat personnel greatly outnumbered combat troops (estimated ratios range from two to one to eleven to one), probably far more so than in any previous war. That is, most G.I.s spent most of the war on military bases near Saigon or elsewhere, living in relative security and comfort, not in the chaos of combat. 

The American Forces Vietnam Network (AFVN), an affiliate of the Armed Forces Radio and Television Service, broadcast throughout South Vietnam on both the AM and FM bands, twenty-four hours a day. Accessible to G.I.s every day, all over the country, AFVN made up a much bigger part of soldiers’ musical world than USO spectacles did. “Almost everyone listened [to AFVN],” wrote Vietnam veteran Doug Bradley and historian Craig Werner in their book about music in the Vietnam War. Biding their time in a listening area with few other English-language options on the radio, G.I.s tuned in — a lot — regardless of their branch of the military. From 1968 to 1971, the military conducted a series of surveys of the AFVN audience, offering a synoptic picture of who, when, how long, and to what G.I.s were listening. In 1968, 80 percent of respondents listened two or more hours a day, and a third of G.I.s listened to the radio for five hours a day or more. Further, at places like Long Binh Post, the daily averages were often substantially higher, especially among troops aged 20 and younger and those doing administrative or support work. And notably, these numbers remained fairly consistent, even as cynicism increased and support for the war declined in the military. By 1971, disaffection among the troops had grown substantially, yet average listening times to AFVN held rather steady.

On the air, music mattered most. Following the model of armed forces radio in World War II, AFVN tried to imitate stateside commercial broadcasting, with music-themed shows accounting for as much as 65 percent of the program schedule. Although the network also broadcast sports, informational spots on Vietnamese language and culture, and news reports, which were most listeners’ main source of information about the war and public affairs, one report noted, “music is the primary radio programming material for the American Forces Vietnam Network.”

As for what music the AFVN was broadcasting, program schedules and listener surveys offer a picture of preferences and listening habits that complicates the conventional wisdom. Audience members were opinionated — the vast preponderance of mail that AFVN received from listeners concerned music — and to keep soldiers’ ears, AFVN had to accommodate diverse tastes and tolerances, with contemporary rock making up only a relatively small proportion of what G.I.s were actually tuning in to. As much as the so-called rock revolution transformed American popular music in the 1960s, radically new sounds were not everyone’s cup of tea, or at least not their primary focus. For most listeners at the time, the cutting edge existed alongside a robust musical mainstream, as much Burt Bacharach as Big Brother and the Holding Company.

In 1970, listeners overwhelmingly ranked “current Top 40” and “oldies but goodies” (defined as songs from 1954 to 1969) as their first and second favorite styles or formats, followed distantly, in order of preference, by “easy listening,” “country-western,” “acid rock,” “soul,” “classical,” and “jazz.” Of course, rock and soul did feature prominently in the Top 40 at the time but in and among other styles, including a considerable amount of easy listening. To put this proclivity for Top 40 in perspective, the five leading recordings on the Billboard Hot 100 chart for 1970 commingled the hard-driving rock of “American Woman” by the Canadian band the Guess Who with three decidedly softer numbers — Simon and Garfunkel’s “Bridge over Troubled Water”; the Carpenters’ “Close to You”; and B.J. Thomas’s “Raindrops Keep Fallin’ on My Head.” The fifth song in the top five was Edwin Starr’s “War (What Is It Good For?),” an antiwar soul anthem which the AFVN did not play. Given the number of other top twenty artists that year who fell outside of the rock and soul orbit, including Ray Stevens, Bread, Vanity Fare, Neil Diamond, and Tony Orlando & Dawn, the designation Top 40 was far from synonymous with what today would be called classic rock. 

Some of AFVN’s music shows were produced in the United States, others in Vietnam. The most popular of the former was A Date with Chris, a Top 40 popular-music program hosted by Chris Noel, airing Monday to Friday for one hour in the late afternoon. Opening each program with her trademark “Hi, love,” she cultivated an air of flirtatious intimacy, and her patter emphasized to G.I.s that she was playing the same music their girlfriends and wives were listening to back home. Musically, a typical playlist moved easily between buoyant and soulful pop (the Dave Clark Five, “You Got What It Takes”; Arthur Conley, “Sweet Soul Music”), light, saccharine country (Sandy Posey, “What a Woman in Love Won’t Do”; Roger Miller, “Walkin’ in the Sunshine”), and the occasional rock number (the Seeds, “Pushin’ Too Hard”), with an inclination toward shimmering, string- and horn-soaked production (Andy Williams, “Music to Watch Girls By”).

Along similar lines, G.I.s’ favorite shows produced in-country were Dawnbuster and Million Dollar Music. Thanks to the 1987 film Good Morning, Vietnam, the Dawnbuster morning show is the best-remembered AFVN program today. It aired for three hours Monday to Saturday, featuring primarily Top 40 music interspersed with lively commentary and patter. In 1965 and 1966, it was hosted by Adrian Cronauer (played in Good Morning, Vietnam by comedian Robin Williams) and later by future TV game-show emcee Pat Sajak and others. As in the film, the real Dawnbuster did present a lot of rapid-fire talk and (nominal) humor, but it rarely strayed outside the bounds of military-approved propriety and generic middle-of-the-road music. The manic, anarchic, on-air antics of Williams in the movie were a Hollywood fiction. (The real-life Cronauer described himself in 2005 as a “lifelong card-carrying Republican” and campaigned for Bob Dole in 1996 and George W. Bush in 2000.)

The second show, Million Dollar Music, aired in the midafternoon, Monday to Friday, featuring “oldies but goodies,” a category AFVN defined not stylistically but as “pop standards and up-tempo music which continues to sell over a period of years.” Anything from the mid-1950s on was permissible, but generally this meant a repertoire of innocuous pop hits of the recent past which avoided any music that even whiffed of transgression. A typical show from 1971 featured Eddie Floyd’s “Knock on Wood” (1966), the Beach Boys’ “Surfer Girl” (1963), the Chiffons’ “He’s So Fine” (1963), and the Association’s “Along Comes Mary” (1966). In this spirit, when the DJ/producer of another oldies program, Bob Casey, played the Rolling Stones’ “Satisfaction” — a song banned by the AFVN for its “sexually suggestive” lyrics — a commanding officer tried to have him pulled off the air and reassigned to a remote location.

Looking outside of the military’s own data, we get a more complex and nuanced picture of music among G.I.s, but the contours are similar. In 1968, Rolling Stone magazine mailed its own questionnaire to a “select group” of military personnel from across the armed forces. In response to its queries about music and drug use, replies came in from servicemen stationed in nearly fifty locations, ranging from bases within the U.S. and ships at sea to Saigon and the Vietnamese jungle. The 10,000-word article summarizing these results, “Is This Any Way to Run the Army? — Stoned?,” both complicates and confirms the impressions created by the AFVN surveys. On the one hand, as would be expected of a leading organ of the counterculture, many of Rolling Stone’s respondents thought about and experienced music in ways that departed sharply from the AFVN’s official findings. Presumably, such G.I.s would have been among the soldiers less likely to have answered an AFVN survey. Numerous accounts, for example, emphasized the chasm between enlisted men and career military personnel, a.k.a. “lifers.” “Lifers can’t comprehend rock and roll,” one low-ranking soldier wrote. “They’re completely disoriented doers of the establishment.” Another recounted that fans of the Doors, the Grateful Dead, or Bob Dylan were subject to harassment from officers. To them, AFVN programming consisted of “mainly piped in restaurant type music” and “Big Brother Uncle Sam talking to you with his lifer-dog propaganda.” A corporal who gave his name as “Very Obscure” was blunter: “[AFVN] sucks, as the programming tries to please everyone. The Chris Noel show makes most G.I.s vomit.”

Such claims about the tastes and habits of “most G.I.s” and “most people” need to be taken with a grain of salt. If some scorned AFVN, others valued it as a weapon against soldiers’ great perennial nemesis: boredom. “It’s better than listening to the ship rattle in the wind,” a petty officer in the navy explained after cataloging how bad AFVN was. Further, other respondents to Rolling Stone suggested the committed partisans of rock accounted for only a relatively small minority of listeners. Another respondent from the navy, whose duties over three years in the service led him to interact with a wide swath of people, offered Rolling Stone readers a breakdown of sailors’ tastes that fell roughly in line with the AFVN’s own figures. Only 10 percent, he thought, were hard-core rock fans—the same proportion as fans of country (“in our idiom, shitkicking music”) and folk. Another 20 percent, he believed, liked rock casually and enjoyed “commercial sounds” equally well; 15 percent had “no musical tastes” or did not care; 5 percent favored classical; and 30 percent — the largest single group — were “R&B fanatics.” 

 

Country music had a presence equal to that of rock, if not greater. Compared to rock and soul, however, country music enjoyed a much higher profile within the military. It was heard more frequently on the airwaves and featured in USO shows like the Grand Ole Opry revue; the record bins in the PX stores teemed with Marty Robbins and Porter Wagoner discs; and jukeboxes in enlisted men’s clubs were regularly stocked with country hits. This prominence was not a simple reflection of consumer taste. It was also a product of the unusually tight relationship between the government and the music industry in Nashville. As historian Joseph Thompson has shown in a pathbreaking study, for more than a decade Nashville had promoted country music and musicians as the signature sound of both American patriotism and militarism. 

Although in World Wars I and II the music industries had also collaborated with the military, the connection with Nashville in the 1950s and ’60s elevated a particular stylistic preference — one long associated with whiteness, nostalgia, and regionalism — and linked it especially to career military personnel. These factors made country music unusually divisive. Equal numbers of write-in comments to AFVN specifically requested more country music and less of it, and far more listeners expressed a strong dislike of Town and Country, a show that featured country music every weekday for two hours, than of any other AFVN program.

Mythologies of the 1960s and ’70s have rendered country music almost completely invisible. In 2017, PBS aired The Vietnam War, a sprawling ten-part documentary series directed by Ken Burns and Lynn Novick, acclaimed upon its release for showing the war “from all sides.” The series’ promotional materials highlighted the filmmakers’ use of more than 120 popular songs from the period, and its website featured both Spotify playlists and an essay about the soundtrack by David Fricke, a well-known music writer. Far from evoking “all sides,” however, The Vietnam War relied overwhelmingly on rock music, with some folk, R&B, and soul numbers mixed in. For all their putative catholicism, Burns and Novick featured only two country songs over eighteen hours — Johnny Cash’s “Big River” from 1958 and Merle Haggard’s 1969 anti-hippie anthem “Okie from Muskogee” — neither of which was representative of the “Nashville Sound” and honky-tonk that soldiers heard daily on AFVN or that sparked clashes in bars and clubs, nor did viewers encounter any of the middle-of-the-road Top 40 music that hummed in the background of G.I.s’ daily lives.

If the rock-centered narrative gives voice to a cognitive dissonance many people felt during the war, this may help us explain why that narrative has become so entrenched and pervasive. It may accord with what many soldiers and civilians experienced even if it departs from what they were listening to most of the time. But the absence of country music distorts our understanding of social relations in the military. In particular, for African American G.I.s the prominence of country music was an affront. Black soldiers in the 1960s were well aware they made up a disproportionate number of draftees and combat troops and were dramatically underrepresented in the officer corps. Yet only a small slice of the music preferred by Black soldiers got airtime on AFVN. Wherever African American troops were stationed, the military’s elevation of country music and its exclusion or marginalization of Black music added insult to injury. In some cases, Black soldiers pushed back with formal demands for more soul records on jukeboxes. In others, social friction about music could spark explosive confrontations. “When trouble broke out over music,” historian James Westheider noted, “it almost always involved not rock but country and western.” 

Reprinted with permission from Instrument of War: Music and the Making of America’s Soldiers by David Suisman, published by the University of Chicago Press. © 2024 by David Suisman. All rights reserved.

Fri, 04 Jul 2025 08:44:08 +0000 https://historynewsnetwork.org/article/186002 https://historynewsnetwork.org/article/186002 0