The Unlikely Success of James Garfield in an Age of Division

An 1880 Puck Cartoon depicts Ulysses Grant surrendering his sword to James Garfield after being defeated for the Republican nomination.

The candidate, at first glance, seemed to have no business being his party’s nominee for the White House. In an era seething with political strife, he had long been viewed by peers in Washington as a pleasant but out-of-touch figure. Partisan warfare was not his strong suit; he cultivated friendships with civil rights opponents and election deniers alike. He enjoyed scrappy political debate but refused to aim any blows below the belt. (“I never feel that to slap a man in the face is any real gain to the truth.”) What’s more, American voters seemed to be in a decisively anti-establishment mood, and this nominee had been a presence in Washington for almost two decades – the epitome of a swamp creature.

Yet, somehow, it added up to a winning formula: James Garfield, the nicest man remaining in a polarized Washington, would be elected America’s next president in 1880. His rise to power would be framed as a rare triumph of decency in the increasingly bitter political environment of late 19th century America. It has resonance today as our country again navigates similar public conditions.

Garfield’s election was the very first of the Gilded Age. It was a time defined by tremendous disparity emerging in America. Men like Andrew Carnegie and Jay Gould were ascendant members of a new ruling class of industrialists, the so-called “robber barons.” But their factories were grinding down the working class; America’s first nationwide strike had broken out in 1877. Meanwhile, Reconstruction had failed in the South, leaving Black Americans in a perilous spot. They technically possessed rights, but, in practice, had lost most of them after former Confederates returned to power and reversed the policies of Reconstruction.

Yet the period’s discord was most obvious in its politics. The last presidential election had produced what half of Americans considered an illegitimate result: poor Rutherford Hayes had to put up with being called “Rutherfraud” for his term in the White House. Meanwhile, the broader Republican Party had fractured into two vividly-named blocs (the “Stalwarts” and the “Half-Breeds”), each of which loathed Hayes almost as much as they loathed each other.

In this setting, Minority Leader James Garfield was a uniquely conciliatory figure – the lone Republican who could get along with all the fractious, squabbling members of his party. Stalwarts described him as “a most attractive man to meet,” while the leader of the Half-Breeds was, perhaps, Garfield’s best friend in Congress. President Hayes also considered Garfield a trustworthy legislative lieutenant. The overall picture was a distinctly muddled approach to factional politics: Garfield did not fall into any of his party’s camps but was still treated as a valued partner by each.

Much of this came naturally (“I am a poor hater,” Garfield was fond of saying). But there was also, inevitably, political calculus informing it – the kind that comes from decades spent in Washington, trying in vain to solve the nation’s most pressing issues.

Exceptional as Garfield’s political style was, his life story was more so. He had been born in poverty on the Ohio frontier in 1831 and raised by a single mother. A dizzying ascent followed: by his late twenties, James Garfield was a college president, preacher, and state senator; within a few more years, he had become not just the youngest brigadier general in the Union Army but also, by 1864, the youngest Congressman in the country.

His talent seemed limitless; his politics, uncompromising. The young Garfield was an ardent Radical Republican – a member of the most progressive wing of his party on civil rights and the need for an aggressive Reconstruction policy in the postwar South. “So long as I have a voice in public affairs,” Garfield vowed during this time, “it shall not be silent until every leading traitor is completely shut out of all participation in the management of the Republic.”

But he lived to see this pledge go unfulfilled. Garfield’s Congressional career was exceptionally long – stretching from the Civil War through Reconstruction and beyond – and his politics softened as events unfolded. Principle yielded to pragmatism during what felt like countless national crises. “I am trying to be a radical, and not a fool,” Garfield wrote during President Johnson’s impeachment trial. By the end of 1876, a young firebrand of American politics had evolved into a mature legislative chieftain – the Minority Leader of a fractious Republican Party. Younger ideologues of the Party had Garfield’s sympathy but not his support. “It is the business of statesmanship to wield the political forces so as not to destroy the end to be gained,” he would lecture them.

It is no wonder, then, that Garfield’s reputation as an agreeable Republican was not entirely a positive one. From Frederick Douglass to Ulysses Grant, friends tended to say the same thing: that Garfield’s flip-flopping and politeness indicated he “lacked moral backbone.” Garfield, in contrast, would argue that open-mindedness was a sign of inner strength rather than weakness.

This argument was put to the test in the election of 1880. Republicans entered their nominating convention with a handful of declared candidates who had no clear path to a majority of party support. They emerged behind a surprising choice – James Garfield, who had been picked (apparently against his will) off the convention floor as a compromise candidate. The rank-and-file rejoiced. “His nomination will produce perfect unison,” one celebrated, “because he has helped everybody when asked, and antagonized none.” Garfield was not so exuberant about the outcome. Over the course of his political career, he had learned to view the presidency with deep suspicion; none of the Administrations he had witnessed up-close ended well.

His reservations were well-placed. While trying to appease his party’s different blocs, President Garfield ultimately failed to keep the peace between them – kick-starting a chain of events that led to his murder. The result, ironically, was that the nation’s politics suddenly shifted to resemble his own. Americans made Garfield into a martyr and blamed the country’s hyperpartisan political climate for his death. A great period of correction began, but, in all the drama around Garfield’s assassination, his remarkable life was overshadowed by its own untimely end.

On his deathbed, President Garfield seemed to sense this would be the case. Turning to a friend, he asked if his name would “have a place in human history.” The friend’s affirmative answer appeared to relax him. “My work is done.”

 

Martha Hodes Talks "My Hijacking" with HNN

Martha Hodes (l) and her older sister Catherine on the single passport they shared. Photo courtesy of Martha Hodes.

On September 6, 1970 Martha Hodes, then aged 12, and her older sister Catherine boarded a TWA around-the-world flight in Tel Aviv to return to New York after spending the summer in Israel. After a stop in Frankfurt, militants from the Popular Front for the Liberation of Palestine hijacked the flight and rerouted it to a makeshift airfield in the Jordan desert, part of a coordinated campaign of four hijackings. The Hodes sisters became part of a six-day drama that held the world’s attention before the ultimate release of all of the hostages.

Yet, for years her own memories of the event were vague and unclear, and she occasionally felt as if it weren’t certain that she had, in fact, been inside that plane at all. Her new book My Hijacking: A Personal History of Forgetting and Remembering, published to acclaim by HarperCollins, describes her work to apply her craft as a historian to her own memory.

Professor Hodes agreed to a discussion over email of the book and her unique experiences as a historian examining her own hostage crisis.

HNN: Readers might initially be shocked to learn about your longstanding ways of relating to your experience, which included anxiety and avoidance around air travel, but also a sense of unreality and detachment from the events. How did you begin to approach your own experience as a historian?

Martha Hodes: One of the oddest parts of thinking back on the hijacking was the sense that it had never happened to me. As a historian, I wanted to dispel this illogical perception by researching, documenting, and reconstructing the event—from the moment of the hijacking up in the air, to our week inside the plane in the desert, to our release. Yet even when I came upon raw news footage in which my sister and I appeared on screen, I felt more like a historian coming upon evidence of a distant historical subject. 

Bringing a historian’s skills to memoir-writing, I studied the work of other scholars writing about their own lives: Hazel Carby’s Imperial Intimacies, Clifton Crais’s History Lessons, Annette Gordon-Reed’s On Juneteenth, Saidiya Hartman’s Lose Your Mother, Jonathan Scott Holloway’s Jim Crow Wisdom, Daniel Mendelsohn’s The Lost, Edward Said’s Out of Place, Richard White’s Remembering Ahanagran, to name a few. Working as a memoirist writing a personal story and working at the same time as a historian writing about the past, I found it valuable to think of my twelve-year-old self as an historical actor in the past. In the end, that helped me come around to the fact that I had really been there, in the desert.

Part of this journey was reading your own diary from this period. I imagine most people would find their half-century old diary a challenging reading assignment even with much lower stakes—What did this document tell you?

My 1970 diary turned out to be a key document, though not in the way I’d expected. I’d packed my diary in my carry-on bag and wrote every day during the hijacking, so I thought it would be the scaffolding on which to build the book—after all, historians place considerable trust in documents from the time and place about which we are writing. Soon, though, I discovered that I’d omitted a great deal from those pages, in particular descriptions of frightening circumstances and events, as well as my own feelings of fear. When my students work with primary sources, I always teach them to ask, “Why did this person tell this story this way?” In that light, I came to understand my own inability, as a twelve-year-old caught in a world historical event, to absorb everything that was happening around me. Or maybe it was that I didn’t want to record what I knew I wouldn’t want to remember.

At many other times, documents speak to you in ways that disrupt your understanding; can you describe some of these moments?

I’ll mention a crucial one. Along with my diary, the other key document I discovered was a tape recording of an interview that my sister and I had given less than a week after we returned home—again, valuable as a document created very close in time to the event under investigation. From that astounding document I learned a great deal about how my family handled the immediate aftermath of the hijacking and how I would approach the hijacking for decades afterward. For me, writing the book broke my own pattern of denial and dismissiveness.

Your writing is particularly effective, I think, in mirroring your own developing understanding of the unfolding hostage crisis, breaking emotional barriers you had long maintained. For me, this was most dramatic when you begin to confront a question that preoccupied people from your 12-year-old self to Henry Kissinger for that week in September: would the PFLP follow through on its threat to kill hostages? What can you conclude as a historian about the danger you and your fellow hostages faced?

The dynamics of hostage-taking required our captors to keep their hostages off-balance. Sometimes they conveyed that no one would be harmed and occasionally they threatened that if their demands were not soon met, we would all die. And of course they told the world that the planes, along with the hostages, would be destroyed if their demands were not honored. In the course of my research, though, I learned that the Popular Front’s internal policy was not to harm anyone (and all the hostages did in fact return unharmed). But I also learned, during my research, of other ways that harm could have come to us—say, in an attack from the outside or by the accidental firing of one of the many weapons all around us. As a historian of the American Civil War, I teach my students about contingency on the battlefield; looking back, there was also considerable contingency out there in the desert.

Your parents’ story also presents something remarkable from today’s perspective: two people of modest origins making long, gainfully employed careers in the arts (dance, in their case). Can you discuss a bit the story of your family, and how it placed you on that TWA airliner?

My parents were both modern dancers in the Martha Graham Dance Company. They were divorced, and my sister and I had spent the summer in Israel with our mother, who had moved there to help start the Batsheva Dance Company, Israel’s first modern dance troupe. I learned during my research—something I hadn’t quite understood at the time—that my sister and I were different from most of the other American Jews among the hostages, who were keen on visiting Israel after the 1967 war. Both my parents were raised as secular Jews, and my mother had moved to Israel as a dancer, with no interest in Zionism, or even any particular interest in Israel. My childhood attachment to Israel stemmed from the fact that Tel Aviv was the place I spent carefree summers.

Stepping back a bit, I want to talk about historiography. In your career, you’ve been no stranger to writing about other people’s grief and trauma as it’s preserved and remembered in the archives. In My Hijacking you explore the way that forgetting can be a necessary, even purposeful, part of people’s response to loss and harm. How can readers and writers understand how this work shapes the historical record?

I’m a historian of the nineteenth-century United States, and in different ways, each of my previous books has addressed the problem of forgetting in the historical record. In White Women, Black Men: Illicit Sex in the Nineteenth-Century South, I found that overpowering ideas of white anxiety about sex between white women and black men erased the historical record of white southerners’ toleration for such liaisons under the institution of slavery. In The Sea Captain’s Wife: A True Story of Love, Race, and War in the Nineteenth Century, the act of forgetting was more personal: the protagonist, a white New England woman who married a Caribbean man of color, had been partially erased from family history. In Mourning Lincoln, I found that widespread gleeful responses to Lincoln’s assassination—in the North as well as the South—came to be forgotten, overtaken by narratives of universal grief.

Returning to the dilemma of my diary in My Hijacking, I saw quite starkly how first-person documents can be crafted in particular ways, and how erasure can then foster forgetting. As for My Hijacking shaping the historical record, it’s deeply researched, but it’s also my story alone. The experience of each hostage depended on factors ranging from where you were sitting on the plane to your convictions about the history of Israel/Palestine. As I write in the book, “I could strive only to tell my own story of the hijacking in the truest possible way.”

At HNN, we seek to connect the past and the present, and to draw insight onto current events from historical thought. It seems to me that Americans’ collective responses to upheavals in recent history, from 9/11 to the COVID pandemic, are deeply structured and enabled by forgetting. Would you agree? And how can understanding the work of forgetting help us think about the recent past?

Forgetting can be a way to survive, and during my research I found that my family was not alone in not talking much about the hijacking after my sister and I returned home. But it’s also the work of historians to remember, and while researching My Hijacking I learned about the process of forgetting in another way. Like many American Jews in 1970, I had no idea that Palestinians had once lived on the same land. In the book, I recount a visit my sister and I took with our mother to the village of Ein Hod. We didn’t know that in 1948 Palestinians had been exiled and the village resettled by Israeli artists. My sister wrote a letter home to our father, saying that the artists lived “in quaint little Arab style houses” surrounded by “beautiful mountain views and flowers,” thereby illuminating a kind of collective forgetting. At twelve years old, in the desert, I began to learn about the irreconcilable narratives told by different sides of the conflict. On the plane, my sister and I felt sorry for everyone—for our fellow captives, especially the Holocaust survivors among us, and for our captors and their families, some of whom had lost their homes in 1948 and 1967. We puzzled over the conflict, but at twelve and thirteen years old we couldn’t think of a solution.

Jared McBride Sheds Light on the Darker Parts of Ukraine's History

Ukrainian Auxiliary Police during Nazi occupation, c. 1943. Photo Bundesarchiv. 

Jared McBride, an Assistant Professor in UCLA’s History Department, sat down with HNN to discuss his research into 20th century violence in Ukraine. McBride specializes in the regions of Russia, Ukraine, and Eastern Europe and his research interests include nationalist movements, mass violence, the Holocaust, interethnic conflict, and war crimes prosecution. His research has been funded by Fulbright-Hays, the Social Science Research Council, the Mellon Foundation, and the Harry Frank Guggenheim Foundation and has been published in Holocaust and Genocide Studies, Journal of Genocide Research, The Carl Beck Papers, Ab Imperio, Kritika, and Slavic Review. At present, McBride is completing a book manuscript titled Pathways to Perpetration: Violence and the Undoing of Multi-Ethnic Western Ukraine, 1941-1944 that focuses on local perpetrators and interethnic violence in Nazi-occupied western Ukraine.

Q. In 2017 you wrote an article for Haaretz, a leading Israeli newspaper, which condemned the “mythmaking” attempts in Ukraine (then led by President Poroshenko) to “whitewash” the involvement of nationalist Ukrainians during WWII in terrorism against Jews and members of the Polish minority in Western Ukraine. Now, some six years later, the government of Ukraine is headed by Volodymyr Zelensky. In recent years, have Ukrainian museums or local municipalities begun to acknowledge the role of local people in supporting the Nazi invaders in WWII?

Many scholars assumed the government of  President Zelensky,  which posited itself as centrist and outside the usual divides in the Ukrainian political landscape, would mark a break in the more cynical memory politics regarding 20th century history employed by the Poroshenko government. Until the start of the new war in 2022, this appeared to be true. Crucially, one of the most common barometers for policy shifts concerning the past is how the often-controversial Institute for National Memory is staffed and how they orient their projects. In this case, Zelensky clearly opted for a more moderate and respected leader and inclusive projects meant to bridge divides, rather than create them. How the Russian invasion will ultimately shape these politics moving forward remains to be seen. Concerning museums and municipalities, the assessment remains mixed. The aforementioned Decommunization Laws led to the removal of many Soviet-era markers, which is certainly understandable, but the replacement of them with monuments to individuals who served in Nazi-led battalions and police forces has been met with less sympathy.

Still, it is important to note the latter does not represent most of the new memorialization efforts, many of which include important and non-controversial Ukrainian figures from the last two centuries. In terms of other prominent and public spaces, we find similar tensions and growing pains. More controversial spaces like the Prison on Lontskoho in L’viv continue to operate, whereas Ukrainians have made progressive efforts to mark spaces in commemoration of where other ethno-national groups lived and died on Ukrainian soil. I’d therefore like to highlight the prolific work of Rivne-based NGO Mnemonics, which has completed projects like memorializing the site of the Jewish ghetto and even laying steppingstones (Stolperstein) throughout the city, among a great deal of other work. Finally, the fate of the endlessly byzantine process around the Babyn Yar commemoration project in Kiev remains to be seen, but it should say a lot about the future treatment of these issues in a new Ukraine.

Q. How did you first get interested in this subject?

During my first year of college at Northeastern University, I took a course taught by Professor Jeffrey Burds that focused solely on the Second World War on the Eastern Front. This course highlighted various aspects of the war in the East including the intelligence front, partisan movements, local collaboration, the Holocaust, ethnic cleansing, and sexualized violence. In doing so, Dr. Burds exposed undergraduates to cutting-edge research on these topics through his own path-breaking work and that of others. When I took the course in the late 1990s, the field of study was rapidly developing so it was the perfect time to be exposed to these themes.

Shortly after the course ended, I began studying Russian and I followed up by learning German, and eventually studying Ukrainian in graduate school. I was able to put my Russian to use in two undergraduate research trips to Russian archives in Moscow where I began to work with primary source materials on the war. These experiences motivated me to seek a PhD in Russian/Eastern European history.

Q. How have students reacted when you lecture on this topic? Your scholarly articles discuss mass killings and torturing of women and children. Have any students complained about being exposed to potentially traumatic descriptions and images?

My experience teaching on these topics, first at Columbia on a teaching post-doc, and second, at UCLA since 2016, has been overwhelmingly positive. My courses on Eastern Europe in the 20th century and the Soviet experience are always full. I find students are curious and enthusiastic to learn about some of the many difficult moments of the 20th century. Most do not seem to come to the classroom with preconceived notions about the region, positive or negative, that I believe children of the Cold War, like me, had when we took these classes in the nineties or eighties. I also organize a team-taught course at UCLA each year on political violence and genocide for over two hundred first-year students called Political Violence in the Modern World. My experiences running this large course have been no different over the past four years – UCLA students can and do work through sensitive material in a respectful and engaged manner.

Q. In past years you were able to travel to Russia and Ukraine and get access to records. Has that availability changed because of the war in Ukraine?

I was able to complete most of my dissertation and now manuscript research in Ukraine and Russia before the events of the 2014 Maidan Revolution, so I did not have any access issues at the time. Access to Soviet-era archival materials in Ukraine only improved after the revolution and arrival of the Poroshenko government thanks, somewhat ironically, to a suite of controversial laws known as the Decommunization Laws. While controversial in terms of memory politics, the laws simplified access to the archives, including former KGB archives, and this was a boon for historians like myself. The war in Ukraine has not shuttered the archives – I know some colleagues continue to go and I have been able to support seasoned research assistants who have been able to access materials — but the war has unquestionably hampered the ability of young Ukrainian scholars to complete their work. Russian missiles have also damaged some holdings, which is terrible for scholars.

Russia has been the inverse of Ukraine in recent years where archives have been more restricted, especially for foreigners. Accessing Russian archives will likely prove increasingly difficult, and though there have been recent efforts to create crowd-sourced digital repositories for scholars, nothing truly replaces the experience of working on-site. The future is concerning for Soviet studies and archival research in Russia and Ukraine, but ultimately what matters most is that the war ends, and Ukrainians can rebuild their lives and livelihoods. Scholarship is second to survival.

Q. Please tell us a little bit about the book you are working on, Pathways to Perpetration: Violence and the Undoing of Multi-Ethnic Western Ukraine, 1941-1944.

My book expands upon my earlier work on local perpetrators in multiethnic settings. It is a micro-level social and political history of the Nazi occupation of western Ukraine. It examines the motivations of those who participated in various arenas of violence during the war including pogroms, the Holocaust, ethnic cleansing, and paramilitary violence. Throughout, I demonstrate how the social identities and group formations that are typically assumed to have caused the violence were instead caused by it, and that political choices were less often anchored in pre-existing ideologies and beliefs but rather more dynamic and situational than previously argued.

My conclusions therefore challenge overriding nationalist and primordialist interpretations of the war and people’s decisions in it. This integrative account of local perpetrators and decision-making is based on 10-plus years of research in Russia and Ukraine using sources in five languages from eighteen archives, including post-war Soviet investigations, newly declassified KGB trials, German documents, and personal accounts.

Maps are the Record of Humans' Imagination of the World

The Fra Mauro Map, 1460

One of the most significant world maps now hangs in the Correr Museum in Venice, Italy. This map is seven feet in diameter, inked on vellum, and covered with over 3,000 inscriptions in old Veneziano, the language of Venice. It was created by an obscure Venetian monk named Fra Mauro who worked with a team of cartographers, artists, and calligraphers in the mid-1400s at the monastery island of San Michele just off the north shore of Venice. Finished in 1459, this map was a masterpiece of both cartography and artistry, and it is the oldest Medieval map that has survived.

This map was also an inflection point in human history. Fra Mauro’s map was the first to show definitively that a ship could circumnavigate Africa at the southern tip and sail into the Indian Ocean, thereby opening sea trade between the West and the East. And it described people and goods across many cultures pointing out to Westerners that there were many other lifeways around the world. But most of all, this map was the first time that a cartographer moved away from religious mythology and ideology and embraced the science of geography. As such, Fra Mauro’s map foreshadowed the slide in Western culture from the insular Middle Ages into the enlightenment of the Renaissance and the beginning of the Scientific Revolution.

The ubiquitous nature of mapping suggests that diagraming our landscape is an ancient feature of human cognition and behavior and that we owe much of our evolutionary success to that ability. Maps appeared in the human record before there was writing, and even before there were systems of numbers. Because these drawings were used to represent something else, they were a means of communication and memory and a way to bridge languages and cultures. Among the many maps created by people over time and across cultures, one mode stands out as the most imaginative and creative, and the least practical—the world map. These maps don’t show the way to get home, guide a traveler, or even inform accurately what belongs to whom.

World maps are purely artistic in that they have always been made for grand effect. Mappamundi are also products of their times because they chart the history of geography and other knowledge and so these sweeping, impractical showpieces also echo the society in which they were produced; they are talismans of culture, the storytellers of human experience. Their story is our story, and that’s why they matter.

The first world map is a tiny bit of smashed-up clay, called the Babylonian map of the world, about the size of a human hand when glued together, and it dates to between 700 and 500 B.C.E. The reconstructed tablet is composed of 8-10 pieces with a hole in the center, which presumably marks the center of the Babylonian Empire. It is incised with rays and circles representing the Euphrates River and a horizontal rectangle that represents the city of Babel. The following centuries produced various world maps in Greece, the Roman Empire, the Arabic world, and Asia. These maps were made as ancient sailors and navigators began to travel long distances for exploration and trade, and they reflected how their cultures saw the world.

Eventually, the cartographers in Western culture used maps as supporting propaganda to reinforce Christian beliefs and to instill fear of the unknown by portraying mystical creatures, warning about barbarians, and highlighting uninhabitable and presumably dangerous places. And of course, all these early cartographers had no idea that there were two more continents on the other side of the globe, continents already inhabited by people who had walked, sailed, or rowed there long ago. These Medieval Western world maps were encyclopedias of knowledge, but that knowledge was biased and narrow.

Fra Mauro’s map was constructed during the Late Middle Ages, an exciting time for Western culture. The West was just on the cusp of breaking out of its known geography and sailing to far-flung places. But this Age of Discovery (or Age of Exploration) was not so much about exploring new and interesting places as a purposeful financial move. When Europeans moved out of their geographic comfort zone, they were incited by nascent capitalism, that is the desire to pick up goods and resources from foreign places and sell them back home or elsewhere at a profit.

That burgeoning of capitalism was underscored by a focus on technological improvements in trade ships and navigation. Because of Fra Mauro’s map, for example, one could now imagine circumnavigating Africa and entering the Indian Ocean, which had previously been imagined as a closed sea. As a result, trade with the East could be much more efficient and financially profitable by rounding the tip of Africa rather than sticking to land routes across Asia. And this map visually described other sea routes for trade and how they might connect to form one vast trading network.

Also, Fra Mauro's map was reflective of the various intellectual revolutions that had begun flowering. Like no world map before it, this one was brimming over with information from other places and cultures, suggesting there was a wide world out there waiting to be explored and understood. In that sense, Fra Mauro’s map was the first encyclopedia of the known world, and it pointed to a vast diversity of peoples and practices.

Fra Mauro’s map is not just a map of the known world in the mid-1400s. It is also a reflection of the tipping point that brought Western culture out of the Dark Ages into the light of modernity. His creation was a road map to expansion, discovery, trade, prosperity, and domination. And it gave birth to a long series of later world maps. In other words, this map was like a pebble thrown into a pond, creating various unpredictable but sizable waves of rings that spread out from the impact of that pebble; it changed world history, how world maps have since been used for various purposes, and established the scientific discipline of cartography.

Recovering the Story of the Empress Messalina After a Roman Cancellation

From "Messaline Dans La Loge de Lisisca," Agostino Carraci, 16th c., depicting the rumored moonlighting of the first-century empress in a Roman brothel.

Towards the end of 48 CE a workman carried his tools down into a tomb on the outskirts of Rome. Among the rows of niches, he found the urn holding the ashes of Marcus Valerius Antiochus. He had been a hairdresser and the freedman of the empress Valeria Messalina – a fact he had been proud enough of to record on his tombstone. The workman took out his tools; his job that day was to chisel off the empress’s name.

Messalina had been killed that autumn, in the midst of a scandal that had rocked Rome. She’d been accused of bigamously marrying one of her lovers and plotting to oust her husband, the emperor Claudius, from the throne. The real reason for Messalina’s fall probably lay more in the power plays of court politics than in some grand, mad, bigamous passion, but it didn’t matter. A succession of her alleged lovers were executed, and then, fearing that Claudius might be swayed by love for his wife, an imperial advisor ordered that Messalina herself be killed before she had the chance to plead her case.

Tacitus, the great historian of Roman tyranny, recorded that Claudius hardly reacted when the news of his wife’s death was brought to him at dinner –– he simply asked for another glass of wine. Claudius seemed to want to forget completely, and the senate was willing to help him. They decreed that every trace of Messalina –– every image of her, and every mention of her name –– should be destroyed. It was only the second time in Roman history that an official order of this kind, now referred to as damnatio memoriae, had been passed. The decree applied to both the public and private sphere; statues of Messalina were dragged off plinths in town-squares and domestic atria before being smashed, or melted down, or recut. Mentions of her name were rubbed off official records, and chiselled equally off honorific monuments and hairdressers’ epitaphs.

Damnatio memoriae has sometimes been referred to as a form of ancient Roman “cancel culture,” but this was a process utterly unlike modern cancellation –– one that could not be replicated today. In the age of the internet someone might be unfollowed, their invitations to speak at official events rescinded, they might be attacked in op-eds. Their name might even become unmentionable in certain circles. But while the reach and influence of “the cancelled” might be reduced, the evidence of their existence and actions cannot be destroyed. Their government records and Wikipedia pages still record their birthdate; their tweets, however dodgy, are still cached in some corner of the internet. They can post videos of themselves crying and apologizing, tweet a glib brush-off, or publish ten-thousand-word tracts of self-justification. The cancelled might be dismissed, but they cannot be erased.

The situation was different in 48 CE. The sources of information about Roman political figures were less varied and more traceable than they are today –– and the mediums through which such information was disseminated, generally more smashable.

The public image of imperial women like Messalina was carefully controlled. Official portrait types were developed, copies of which were sent off to cities throughout the empire, where they were copied and recopied again for public buildings, shop-windows, private houses. These statues, along with coin types and honorific inscriptions, were designed to present Julio-Claudian women as icons of ideal Roman femininity and imperial stability. Messalina’s best-preserved portrait is almost Madonna-like – she stands, veiled, balancing her baby son Britannicus, then heir to the empire, on her hip; coins minted in Alexandria depict the empress as a veiled fertility goddess, carrying sheaves of corn that promise the prosperity of imperially protected trade routes. Such a coherent image could be destroyed almost wholesale – especially when driven by an official, central edict rather than simply by a shift in popular consensus; there is only one surviving statue of Messalina that was not discovered pre-broken by the conscientious minor officials of the mid-1st century.

So where does this leave the historian? At first glance the situation is dire –– our information about imperial Roman women is always limited, and in this case much of that information has been purposefully and systematically destroyed. On reflection, however, it is more complex; the destruction of Messalina’s images and honours had created a vacuum and an opportunity.

The official narrative of the Julio-Claudian rulers, expressed in stone and bronze, was always supplemented by a secondary, ephemeral narrative of rumor. This was a period that saw politics move ever more away from the public arenas of the senate and the assembly into the private world of the imperial palace as power was ever-increasingly concentrated in the figure of the emperor. The women of the Julio-Claudian family were central to this new dynastic politics; they had access to the emperor that senators could only dream of, and all the while they were raising a new generation of potential heirs to the imperial throne. As the opacity of the new court politics encouraged ever more frenzied speculation about the private lives and intrigues of its players, much of that speculation came to center on the women.

Messalina’s dramatic and sudden fall from grace had raised questions and, in leaving her memory and reputation unprotected, the process of damnatio memoriae allowed people to propose answers. Rumours of the empress’ political and sexual conduct –– some of which may have been circulating during her life, some of which must have evolved after her death –– could now be openly discussed, elaborated upon and written about.

The result is an extraordinarily rich tangle of reality and myth. The sources are almost certainly right to accuse Messalina of orchestrating her enemies’ downfalls and deaths (no one could survive almost a decade at the top of the Julio-Claudian court without a little violence); their attribution of such plots to sexual jealousy and “feminine” passion rather than to political necessity is more suspect. Similarly, there is no reason to believe ancient writers totally unjustified in accusing Messalina of adultery; their claims that she slipped out of the palace nightly to work in a low-class brothel, or that she challenged the most notorious courtesan in Rome to a competition of who could sleep with more men in twenty-four hours (and won with a tally of twenty-five) are far more difficult to credit.

The unravelling of these stories is both the challenge and the joy of ancient history. The process is also revealing on two counts. The evaluation of these stories brings us closer to re-constructing the narrative of Messalina’s real life, her history, and her impact on her times. But even those tales that cannot be credited are of value. The stories and rumours that Rome constructed about its most powerful women when given totally free rein tell us a great deal about its contemporary culture and society –– its anxieties, its prejudices, its assumptions, and its desires. 

Ayn Rand's Defense of an Anti-Union Massacre

Photo from records of LaFollette Committee, National Archives and Records Administration

In July 1943, former Hollywood screenwriter Ayn Rand was still tracking responses, critical and commercial, to her first major novel, The Fountainhead.  It had been published two months earlier by Bobbs-Merrill after being rejected by a dozen other companies.   Rand had written two previous novels, along with two stage plays, none of which proved successful.  Now The Fountainhead was off to a slow start with audiences and reviewers.

While this was transpiring, Rand received in the mail a set of galleys for the memoir (eventually titled Boot Straps) by Tom M. Girdler, chairman of Republic Steel, which operated several massive plants in the Midwest and Pennsylvania. Many Americans had probably already forgotten the most tragic incident that Girdler was associated with, almost exactly six years earlier.  If Rand was among them, her memory (and high estimate of Girdler) was surely revived in reading those galleys.  Soon she would model a key character in her most famous novel, Atlas Shrugged, partly on Girdler.

Near the end of May 1937, workers who had been on strike for several days at Republic Steel in Southeast Chicago had called for a Memorial Day picnic, to build support, on the wide-open field several blocks from the plant entrance. Tom Girdler wouldn’t even recognize the union, famously vowing that he would retire and go back to growing apples before he’d do that. At least 1500 workers and family members, including many women and children, turned out for the picnic. After the festivities, organizers called on the crowd to march to the gates of the plant where they might establish a mass, legal, picket.

Halfway there, the marchers, at least 500 strong, were halted by a large contingent of Chicago police and ordered to disperse.  A heated discussion ensued.  A few rocks were thrown in the direction of the police.  Suddenly, some of the police drew their pistols and opened fire on the protesters at point blank range, and then as the marchers fled.   They chased after the survivors, clubbing many of them. 

Forty in the crowd were shot, with ten dead within two weeks. Dozens of the survivors were arrested and lifted into paddy wagons without medical attention.  Only a handful of police required treatment for minor injuries.  

Despite these one-sided results, local and national newspapers, right up to The New York Times and Washington Post, almost uniformly portrayed the marchers as a “mob” intent on rioting—that is, as the perpetrators of this tragedy.   Some falsely suggested that the unionists fired first. 

The only footage of the incident is quite graphic, showing the police shooting and then clubbing marchers; it was suppressed by Paramount News, a leading newsreel company. 

Then the Progressive Party senator from Wisconsin, Robert LaFollette, Jr., convened a sensational three-day hearing into the tragedy. The Paramount footage was screened in its entirety—and then in slow motion (you can watch it here)—providing more proof of police malfeasance. It emerged that Republic Steel had collaborated with police on this day, allowing them to set up headquarters inside their plant and supplying them with tear gas and axe handles to supplement their billy clubs.

When the LaFollette committee released its report (most of it, along with witness testimony, printed for the first time in my new book on the Massacre), it harshly criticized the police: “We conclude that the consequences of the Memorial Day encounter were clearly avoidable by the police. The action of the responsible authorities in setting the seal of their approval upon the  conduct of the police not only fails to place responsibility where responsibility properly belongs but will invite the repetition of similar incidents in the future.”

Ayn Rand clearly did not agree.  On July 12, 1943, she typed a five-page letter to Republic boss Girdler after reading his galleys.  “Allow me to express my deepest admiration for the way in which you have lived your life,” Rand wrote from New York City, “for your gallant fight of 1937, for the courage you displayed then and are displaying again now when you attempt a truly heroic deed—a defense of the industrialist….”  Then she offered to send him a copy of her novel.

“The basic falsehood which the world has accepted is the doctrine that altruism is the ultimate ideal,” she related.  “That is, service to others as a justification and the placing of others above self as a virtue.  Such an ideal is not merely impossible, it is immoral and vicious.  And there is no hope for the world until enough of us come to realize this.  Man’s first duty is not to others, but to himself…

“I have presented my whole thesis against altruism in The Fountainhead….Its hero is the kind of man you appear to be, if I can judge by your book, the kind of man who built America, the creator and uncompromising individualist.”

But Rand also admitted that “it shocked me to read you, a great industrialist, saying in self-justification that you are just as good as a social worker.  You are not.  You are much better.  But you will never prove it until we have a new code of values.  ​ 

“You had the courage to stand on your rights and your convictions in 1937, while others crawled, compromised, and submitted.  You were one of the few who made a stand.  You are doing it again now when you come out openly in defense of the industrialist.  So I think you are one of few men who will have the courage to understand and propagate the kind of moral code we need if the industrialists, and the rest of us, are to be saved.  A new and consistent code of individualism.” 

She concluded the letter “with deep appreciation for your achievement and that which you represent.”

Girdler replied on July 27, 1943, that he had just purchased The Fountainhead. A few months later, he met Rand in New York and told her that he had read and enjoyed the novel, which pleased her immensely, and he suggested they meet for lunch.

This apparently did not take place, but she would, a short time later, create one of the key characters in Atlas Shrugged, troubled steel industrialist Hank Rearden, based partly on Girdler.

Greg Mitchell’s new film Memorial Day Massacre: Workers Die, Film Buried, premiered over PBS stations in May and can now be watched by everyone via PBS.org and PBS apps.  He has also written a companion book with the same title.  He is the author of a dozen previous books.

What We Can Learn From—and Through—Historical Fiction

Novelist Anna Maria Porter, engraving from The Ladies' Pocket Magazine (1824)

Image from the New York Public Library Digital Collections, digital ID 1556053.

I have been a local historian for many years, but turned to historical fiction to tell a specific story for which there were no sources. There was a sense of going to the "dark side" in doing so, yet at the same time I was able to illuminate things that do not appear in the historical record. I suspect that there could be a lively debate online about what good historical fiction can accomplish—and also the misuse of history by those who write historical fiction.

As a local historian I tried to be true to the sources I found; to be trusted by readers. In the case of the dozen women who crossed the country in 1842, members of the first overland company to set out for the Pacific Northwest, I could find little. With no verifiable facts, but knowledge that women were present, I turned to fiction to put women in the picture and wrote Lamentations: A Novel of Women Walking West (Bison Books, an imprint of the University of Nebraska Press, 2021). To someone like Gore Vidal, that made perfect sense; he thought history should not be left to the historians, “most of whom are too narrow, unworldly, and unlettered to grasp the mind and motive” of historical figures. E. L. Doctorow would agree, but more agreeably, writing that “the historian will tell you what happened,” while the novelist will explain what it felt like. The historian works with the verifiable facts—fiction is a step beyond.

Historical fiction is generally dated to Sir Walter Scott, beginning with Waverly in 1814. It turns out, however, that Scott was not the first historical novelist. Devoney Looser has just published Sister Novelists (Bloomsbury Press, 2022) about Maria (1778-1832) and Jane (1775-1850) Porter, driven by poverty, who wrote popular historical novels beginning in the 1790s. A Wall Street Journal reviewer in 2022 noted that “Maria was a workhorse, Jane a perfectionist. Between them they wrote 26 books and pioneered the historical novel.”

There have been only a few academic treatments of historical fiction. Ernest Leisy issued The American Historical Novel in 1950 and George Dekker wrote American Historical Romance in 1987, both interested in chronological periods, but neither man created, or exhibited, much enthusiasm for it. Yet in 1911 James Harvey Robinson observed, in an essay titled “The New History” published in the Proceedings of the American Philosophical Society, that historians need to be engaging, even while “it is hard to compete with fiction writers.” He stated

History is not infrequently still defined as a record of past events and the public still expects from the historian a story of the past. But the conscientious historian has come to realize that he cannot aspire to be a good story teller for the simple reason that if he tells no more than he has good reasons for believing to be true his story is usually very fragmentary and uncertain. Fiction and drama are perfectly free to conceive and adjust detail so as to meet the demands of art, but the historian should always be conscious of the rigid limitations placed upon him. If he confines himself to an honest and critical statement of a series of events as described in his sources it is usually too deficient in vivid authentic detail to make a presentable story.

The historian Daniel Aaron took the genre of historical fiction seriously in a 1992 American Heritage essay in which he castigated Gore Vidal. Aaron, however, conceded that “good writers, write the kind of history [that] good historians can’t or don’t write.”

Aaron quotes Henry James, who thought of historians as coal miners working in the dark, on hands and knees, wanting more and more documents, whereas a storyteller needed only to be quickened by a letter or event to see a way to share it with readers or use it to illuminate a point about the historical past. He recognized that genres of reading had changed. In the 19th century we read historical tomes, mostly about the classical world or of British and European war and political alignments, but in the last quarter of the 20th century “so-called scientific historians left a void that biographers and writers of fictional history quickly filled.” Aaron cites inventive novelists who have perverted history for a variety of reasons, using Gore Vidal as his prime example. Vidal thought of historians as squirrels, collecting facts to advance their careers. But Vidal does not get the last word.

Professor Aaron recognized that historical fiction had moved from a limited earlier model focused on well-known individuals to serious re-tellers of history who have “taken pains to check their facts and who possess a historical sensibility and the power to reconstruct and inhabit a space in time past.” What a lovely description of some of the best of our contemporary historical fiction.

But what of putting women into the past where they often do not appear? Addressing this issue, Dame Hilary Mantel noted in her 2013 London Review of Books essay “Royal Bodies” that

If you want to write about women in history, you have to distort history to do it, or substitute fantasy for facts; you have to pretend that individual women were more important than they were or that we know more about them than we do.

Despite my great admiration for Dame Hilary, I think we can deal with the issue of women in the past by honoring their lives in such a way that does not turn them into twenty-first century heroines but as women who found themselves in situations they might not have wished, and did what they needed to do, thought about their circumstances, and dealt with what they found they had landed in. They, as we, are each grounded in our own time, deserve credit for surviving, and should be appreciated for our observations of life around us.

We should respect the historians’ knowledge of time and place and the novelists’ intuition that is sometimes spot-on. An example: in trying to explore the moment when the buttoned-down eastern women in 1842 encountered a band of Lakota, then identified as Sioux, I wondered what the women might have thought of those bronzed warriors whose clothing left much of their chests and shoulders bare. What would the women walking west have thought about? When I read the paragraph I had written to an elderly friend, she went to her desk and pulled out a letter from an ancestor who had crossed Nebraska, walked over South Pass, and on into Oregon. And that ancestor, in the 1850s, had said exactly what I had imagined. Sometimes, the imagined past is as we conceive it to be because we have grasped the knowledge of time and place on which to activate believable players.

My desire in Lamentations was to hear what the women were thinking, and sometimes saying to each other, but within the context of that century when much that was unorthodox could not be said aloud. I wanted to show how a group of people traveling together would get to know each other, rather as students in a class know that one was from Ohio and another played hockey. We do not know others fully, but from the vantages we are given. I wanted to display how the women gained information, and then passed it along; how tragedies were dealt with; how personalities differed, and how, in the end, Jane matured. I wanted to bring women of different generations together, to show discord among sisters, to think about what was important when dismantling a home, how women fit into the daily account of miles and weather and sometimes events kept by the company clerk. I wanted to explore what it was like to answer a longing for new beginnings, for a journey when one is the first to make it. I am interested in names and what they mean, in the landscape and how one travels through it. I wanted to hear the women speak when the records do not.

Historians need to be conscious of the audience we/they hope to have and perhaps can learn something about style and sense of place from the writers of historical fiction. Academic and local history can be told vividly, and good history can have good narrative; but some historical fiction tells a story that a historian cannot. I have written this to praise historical fiction when it respects the line between our times and the past, when it adheres to the known truth and does not pervert it for excitement—or for book sales. I appreciate Daniel Aaron, who thought historical fiction was worth taking seriously, and all those writers who have brought the past alive in this form.

Fiction is not the only way to explore the past, but historical fiction can attract readers to wonder and speculate and then explore the past in other forms. A friend said that as a child, reading fiction of other times led her to read history and then become a historian. Aaron wrote that historical fiction gives “us something more than the historical argument.” It can bring alive an era, a person, a moment in time so that we meet the past as it was, not as we might want it to have been.

                                                                                                                                   

From "Shell Shock" to PTSD, Veterans Have a Long Walk to Health

"The 2000 Yard Stare", by Thomas Lea, 1944, WWII. The Army Art Collection, U.S. Army Center for Military History

Will Robinson, an American Iraq war veteran, languished for months with depression and post-traumatic stress disorder (PTSD) all alone at home in Louisiana. One day in March 2016, he watched the movie “Wild,” starring Reese Witherspoon as Cheryl Strayed. Strayed’s book of the same title told of her redemption from despair by hiking through the wilderness of the 2,650-mile Pacific Crest Trail, which runs from Mexico to Canada. Robinson decided to follow Strayed’s example, packing up a tent and supplies a month later to duplicate her journey and, he hoped, its outcome.

He had nothing to lose. Forced into the army at the age of eighteen by a judge who promised to erase his conviction for petty theft if he served, he was deployed to South Korea in 2001 and Iraq in 2003. Six months in Iraq left him with injuries to his wrist, his knee and, more significantly, his mind. The army gave him a medical discharge for PTSD, but it offered little in the way of medical treatment. He attempted suicide with drugs the Veterans Administration issued him, surviving only because the pills made him vomit. Other vets of the war on terror were not so lucky; every day, an average of twenty-two take their lives rather than endure another moment of living hell. Robinson promised his mother he would not try again. Then she died, and he retreated into loneliness and depression.

It was during that dark time that Robinson saw “Wild” and took his first, literal, step towards recovery. He may not have known that he was following the advice of a British psychiatrist, Dr. Arthur J. Brock, who had prescribed similar solutions to soldiers traumatized in the First World War. The battles between 1914 and 1918 subjected young men to the unprecedented terrors of high explosive artillery shells, poison gas, flamethrowers, rapid machine-gun fire and claustrophobia in rat-infested trenches. Growing numbers of casualties carried to field hospitals had no physical wounds. At least, not wounds the doctors could see.

The soldiers suffered nervous breakdowns. They called their malady “shell shock,” a term introduced to the medical lexicon by psychiatrist Dr. Charles Samuel Myers after he visited the front in 1915. A high proportion of the victims were junior officers, who shared the troops’ fears but also led them in futile offensives against relentless enemy fire and felt a burden of guilt for their deaths. The military needed these officers, but the war had transformed them into paralysed, trembling, stuttering, blind or deaf wrecks unable to fight or to lead.

The British government was forced to open hospitals to aid them and, more importantly, make them fit to return to battle. Dr. Brock took up his post at Scotland’s Craiglockhart War Hospital for Officers when it opened in October 1916. His belief, based on his pre-war practice with mental breakdowns, was that “the essential thing for the patient to do is to help himself,” and the doctor’s only role “is to help him to help himself.” Brock blamed modern society as much as industrial warfare for severing people from the natural world and from one another, resulting in an epidemic of mental illness. His treatment for the soldiers was the same as it had been for civilians who broke down amid the struggle for survival in harsh economic times: reconnect to the world, especially the natural world. He encouraged his patients, including the poet Wilfred Owen, to explore the wild Pentland Hills near Craiglockhart. Many joined Brock’s Field Club to study nature and restore their pre-war relationship to it.

Symbolizing his method was a drawing on his consulting room wall. It depicted the mythological wrestling match between the hero Hercules and the giant Antaeus of Libya. Antaeus, son of the earth goddess Gaia, drew his strength from his mother earth as Samson did from his hair. As long as he was touching the ground, his strength was prodigious. Realizing this, Hercules lifted Antaeus into the air and broke his back. “Antaeus is civilisation,” Brock wrote, “and Hercules is the Machine, which is on the point of crushing it.” The war machine had crushed his patients’ minds. Some of them in fact had been hurled skywards and rendered unconscious by exploding shells. Brock urged them to find peace through nature.

Will Robinson made his connection to mother earth by trekking and sleeping rough on the Pacific Crest Trail, and later on other famous routes – the Tahoe Rim, the Arizona, the Ozark Highlands, the Continental Divide, and the Appalachian. He clocked over 11,000 miles, the first African American man to do so. ESPN declared him “the trailblazing superstar of thru-hiking.” Not only did he come to understand and survive wild environments, he discovered something his life was lacking: community. “Thru-hiking has that community, and it’s why I love it so much,” Robinson told ESPN journalist Matt Gallagher. “People need to know they belong to something.”

Brock would have approved. Connecting to others, becoming part of a community, was as vital to mental health as relating to the earth. Robinson made friends on his treks and mentored younger hikers, including other veterans. He also met the woman who became his girlfriend, and they continue to wander America’s rural byways together. Robinson worked hard to traverse those miles and overcome his war demons. For other American vets, as for World War I’s shell-shocked warriors, there is no single cure for PTSD. What works for one will fail another. Robinson found his way, and it is up to the government that sent them to war to find ways for the rest to resume their lives.

Modern American veterans have one advantage over Brock’s charges. The men whom Brock aided to health had to return to the trenches. Many broke down again or were buried in what war poet Rupert Brooke called “some corner of a foreign field/That is forever England.”

© Charles Glass 2023

Mary Wollstonecraft's Diagnosis of the Prejudices Holding Back Girls' Education Remains Relevant Today

Frontispiece engraving by William Blake from Mary Wollstonecraft, Original Stories from Real Life, 1791 ed. 

In 1785, aged only twenty-five, Mary Wollstonecraft, along with her two sisters and her good friend Fanny Blood, opened a school in Newington Green, London. Their aim was to fill the gaping hole in the education of young women, and there seemed no better place to start the rollout. As home to numerous religious radicals and dissenters, Newington Green was a community open to new ideas – one that had already rejected many a status quo. But, despite Wollstonecraft’s best efforts, the school soon failed. Rather than giving up, she turned to writing as a means of championing the cause. Her first book, aptly titled Thoughts on the Education of Daughters, was published in 1787. In 1792 she published what was to become her best-known work, A Vindication of the Rights of Woman with Strictures on Moral and Political Subjects, and by the end of that year she had moved to France in search of the Revolution.

Contrary to the opinion of the day, Wollstonecraft argued that women were brains, not just bodies: that they were just as capable as men and deserved the same access to education in order to broaden their minds. While this much is, of course, well known, there is a further aspect to Wollstonecraft’s work that has been buried in history. It is a golden nugget, and one that allows us to better understand the obstacles girls face. Referencing the popular conduct manuals of the time, which she described as “specious poisons” that created an “insipid decency,” Wollstonecraft noted that it wasn’t only society’s warped focus on women’s biology that hampered progress towards educational equality but also, more specifically, society’s obsession with female “purity.” Even for a girl whose parents had the means and inclination to support her education, the fear that her virginity could be brought into question made schooling alongside men a virtual impossibility.

The answer to this was the governess, but governesses were expensive, and schooling at home necessarily limited women’s ability to acquire a broad education, and to mix and debate with others. Having been a governess herself, working for the Kingsborough family in Ireland following the failure of her school, Wollstonecraft had firsthand experience. In championing girls’ schooling, Wollstonecraft might have been the proverbial turkey voting for Christmas, but she knew that much more was at stake than her own job (and, in any case, she didn’t much get along with the mother of the Kingsborough brood).

At a time when respectable families placed their daughters’ “morality” ahead of their education, Wollstonecraft stated that “[w]ithout knowledge there can be no morality.” True virtue, she argued, could only ever be achieved by immersing yourself in life and experiencing the world, as men were encouraged to do, including on their grand tours. In her Vindication, she writes that “men have superior judgement” because “they give a freer scope to the grand passions, and by more frequently going astray enlarge their minds.” Men were allowed to achieve wisdom and virtue because “the hero is allowed to be mortal.” By contrast, heroines “are to be born immaculate.” For women, everything was to be lost; for men, everything was there for the taking. What Wollstonecraft ultimately called for was a “revolution in female manners.”

While the revolution in female manners is still ongoing, progress in women’s schooling came in the second half of the nineteenth century, albeit only for the wealthy. In England, Cheltenham Ladies’ College opened in 1853, followed by Roedean School in 1885. By the late nineteenth century, young women were able to acquire an education at my own university, Cambridge, in ladies’ colleges strategically positioned outside the city center. The compromise was, of course, gender segregation.

Even if young women could by then acquire a mentally challenging education, the next step, entry to the workforce, also presented a reputational risk. While it was not a viable strategy for the poorest families, families with means expected their daughters to remain at home until marriage, spending their days helping with domestic tasks, preparing themselves to become good wives and mothers. Priscilla Wakefield was, however, no stranger to paid work. Living at the same time as Wollstonecraft, Wakefield managed to carve out a successful career as a writer, publishing a total of seventeen books, while also finding time to establish England’s first savings bank for women and children. Informed by her personal experience, Wakefield offered her own solution to the problem of preserving female virtue, one which involved embracing paid work but with strict limitations attached.

According to Wakefield, the central reason why women fell into “sexual sin,” including sex work, was a lack of financial support. Limiting young women’s educational development and their ability to earn was, she thought, a recipe for immorality, not morality. Rather than protecting women, their exclusion only succeeded in leaving them vulnerable. The phenomenon of “fallen women” was, she argued, an economic and not a social problem, one that resulted from a “dreadful necessity.” By way of a solution, her Reflections on the Present Condition of the Female Sex; with Suggestions for its Improvement (published in 1798) proposed an intricate and detailed plan for women’s work, tabulated by class, with educational and training recommendations for each “class.” She attempted to reconcile work and virtue, combining Wollstonecraft-style thinking with social conservatism. With it, Wakefield recommended that poorer women be properly trained as hairdressers, cooks or seamstresses so as to avoid falling into harlotry, and that men should be discouraged from working in such professions, keeping them “safe” for women. For the handful of women born into families with means, writing and painting were at the top of her list of recommendations, as they could be conducted from the “safety” of the home, away from men. Segregation along gender lines was, for Wakefield, the route to liberation.

The cult of female modesty has hampered women’s access to education and work for a long time. Sadly, it continues to have the same effect in parts of the world today. While the number of children not in school across the world has fallen over the last two decades, at current rates of progress it will be 2050 before all girls have been educated to at least primary school level. Evidence suggests that the poorest girls tend to be withdrawn from school at puberty (between the age of 12 and 14). In 2020, the countries with the highest out-of-school rate for girls in this age group were: Mali (84% out-of-school), the United Republic of Tanzania (81%), Guinea (78%), Nigeria (78%), Benin (73%), Pakistan (70%), Mauritania (63%), Afghanistan (62%), Senegal (58%) and Côte d’Ivoire (57%).

In 2012, the struggle for girls’ education in Pakistan came into sharp focus when Malala Yousafzai, then aged fifteen, was shot in the head by masked gunmen on her way home from school. She had become the target of the militant group Tehrik-i-Taliban following her campaign for girls’ education. Four years before the attack, in 2008, she and her female friends had been denied schooling when her town, in the Swat Valley, came under Taliban control. Since her recovery, Malala has continued her campaign. So too, sadly, have her enemies.

Following the Taliban’s return to power in Afghanistan in 2021, secondary schools were closed to girls. At the same time, the work of the Women’s Affairs Ministry was swallowed up by the Ministry of Vice and Virtue. Under the Taliban, much as in Wollstonecraft’s time, “morality” comes first, and that morality does not include a right to an education.

Travel Was a Driver of Eleanor Roosevelt's Leadership

Eleanor Roosevelt speaks with an American serviceman near a downed Japanese fighter plane on the island of Guadalcanal, 1943.

“MR. PRESIDENT WOULD YOU PLEASE SUGGEST THAT MRS. ROOSEVELT CONFINE HER DUTIES MORE TO THE WHITE HOUSE.”

The couple from Atlanta who sent the demanding telegram to President Franklin D. Roosevelt weren’t alone in their criticism of the First Lady.

From the time Eleanor Roosevelt entered the White House in 1933, she sharply divided public opinion. Traditionally, First Ladies were supposed to be discreet figures in the presidential background. They stayed close to the White House, primarily overseeing social functions, and took no active part in public life. When Eleanor stepped into the role, she was well established as a writer, educator, political advocate, and traveler. And yet even she, a daring early advocate of commercial air travel and a frequent flier who wanted to become a pilot, was expected to stay grounded at the White House.

Instead, travel became a key factor in Eleanor’s success as First Lady.

Within days of FDR’s inauguration, she flew from Newark, New Jersey, to Washington, D.C., on a bumpy flight buffeted by strong winds, and was officially on record as the first president’s wife to travel by air. The next year, she took a flight from Miami to Puerto Rico at her husband’s behest to report on labor and living conditions on the island. The fact that she flew over water enhanced her reputation as fearless and unconventional.

All the while, the nation, unused to First Ladies “darting about,” watched her “with mingled admiration and alarm,” stated a news reporter. When Eleanor wasn’t taking to the skies, she was often traveling by train or behind the wheel of her car. (She won a showdown with the Secret Service over driving her own car and going about unaccompanied.) Gas station attendants between the capital and New York City kept an eye out for her famous blue roadster, while a man in Maine refused to believe she was the president’s wife because she drove her own car.

But Eleanor didn’t travel merely for the thrill of it. An innate love of the road inherited from her adventurous father, who once spent part of his inheritance on a trek to India and the Himalayas, melded with a curiosity for knowledge and a desire to get to know people from all walks of life. “Instead of going in search of beauty or remarkable artistic collections, or any of the things for which we usually travel to strange places,” she said, “I traveled to see and meet people.”

To the dismay of traditionalists like the Atlanta couple, it was outside the White House where Eleanor decided that she could best help her husband, by being his “listening post.” It was vital, she believed, for politicians, and especially the president, to keep in touch with public opinion, “the moving force in a democracy.” It was also difficult for the commander-in-chief, siloed in Washington, to achieve this. And so she did it for him.

Eleanor’s self-made role fueled her strong sense of social responsibility and satisfied her wanderlust. “I want to know the whole country,” she said, “not a little part of it.” She earned a reputation for wanting to see things for herself, ceaselessly crisscrossing the United States giving speeches and inspecting New Deal initiatives. She visited factories, schools, hospitals, homesteads, and migrant camps. One morning, Americans opened their newspapers to find out their First Lady had traveled two and a half miles into a coal mine beneath the hills of rural Ohio. A longtime advocate for the rights of coal miners and other workers, she seized this chance to learn about their livelihood firsthand. She saw how coal was mined, entering a chamber where minutes earlier coal had been blasted from the walls, and discussed wages and working conditions with hundreds of miners.

Eleanor was famous for her travels, or infamous depending on the perspective, and she routinely made headlines for them. She averaged an astounding 40,000 miles on the road each year seeking out Americans in their own communities. Everywhere she went, she asked people what they thought and what they needed. The information she gathered was used to effect change through her own means and platforms, as well as to aid the president and his policy advisers.

“You know my Missus gets around a lot,” Franklin boasted in a cabinet meeting. “She’s got great talent with people.”

Despite the president’s backing, Eleanor’s intrepidness and independence continually created controversy. During the 1936 presidential election, as Franklin sought a second term, her travels were wielded as a political weapon by the opposition. Voters were assured that the Republican candidate’s wife, Mrs. Alf Landon, was a traditional wife and mother who would stay at home. Franklin won re-election in a landslide.

Five years and another successful re-election later, the United States officially entered World War II. With the onset of war, many Americans found themselves in far-flung locales well beyond the country’s borders, among them hundreds of thousands of soldiers, sailors, and marines serving in the Pacific. Just as Eleanor had been doing for a decade, she would go to them, venturing into a theater of war unlike any other in history—one where fighting took place across great distances on water and in places with harsh, unfamiliar surroundings.

All of Eleanor’s fact-finding expertise and travel savvy culminated in a precedent-breaking trip to the Pacific theater in August 1943. And yet even for an experienced traveler like Eleanor, this undertaking was further, longer, and more arduous than anything she had previously done. And it was more dangerous. During the five-week trip, she covered 25,000 miles trekking to Hawaii, New Zealand, and Australia, through the South Pacific and into territory still under enemy air attack. Along the way she thanked hundreds of thousands of U.S. troops for their service, bolstered diplomatic ties with Allied nations New Zealand and Australia, and linked the fighting front with the home front by reporting the unvarnished truth about what she encountered to the president and to the American people.

A reporter at the time described Eleanor’s trip to the Pacific theater—and I believe this still stands—as “the most remarkable journey any president’s wife has ever made.”

Can We Learn from Previous Generations of Historians Negotiating Between Past and Present?

James Harvey Robinson contributed to the "New History" movement with a 1911 essay of the same name. Photo c. 1922.

Many causes and campaigns these days are looking to history to buttress their arguments. Some involve political positions, others relate to gender and racial issues, while still others pertain to broader themes of income inequality or social justice. Should historians join these causes as advocates? If not, should they write and present history in such a way that its relevance and applicability to current issues becomes clear?

We sometimes think of this as a new dilemma (or opportunity) for historians, but actually it has a history of its own. Looking into that history might help us get and keep our bearings at this time of heated public discussion. A useful place to look would be the so-called “new history” which emerged in the early 20th century. Its proponents espoused strategies that would draw on a wider array of historical evidence than historians had used in the past. The new history would place more emphasis on social and economic trends, provide more coverage of the lives of ordinary people in history, draw on allied social sciences such as economics and sociology, and dovetail with the emerging field of social studies. It would demonstrate the relevance of history for providing insights into contemporary affairs and problems. Nineteenth-century history had been mostly a placid and slow-to-change field. The “new history” would liven things up.

That would be a tall order for historians. But, done well, it would permit historians, from a lofty perspective, to point out the historical origins of contemporary public issues and to show parallels with past developments. Through teaching and writing good history, they could contribute to the public good without having to go further and become advocates (unless they chose to do so).

Columbia University history professor James Harvey Robinson (1863-1936) led the way in his highly influential 1911 essay "The New History" in Proceedings of the American Philosophical Society.

Robinson made a number of points.

Historians need to be engaging, but it is hard to compete with fiction writers.

“History is not infrequently still defined as a record of past events and the public still expects from the historian a story of the past. But the conscientious historian has come to realize that he cannot aspire to be a good story teller for the simple reason that if he tells no more than he has good reasons for believing to be true his story is usually very fragmentary and uncertain. Fiction and drama are perfectly free to conceive and adjust detail so as to meet the demands of art, but the historian should always be conscious of the rigid limitations placed upon him. If he confines himself to an honest and critical statement of a series of events as described in his sources it is usually too deficient in vivid authentic detail to make a presentable story.”

What counts a good deal is the light that history casts on contemporary conditions.

“It is his [the historian’s] business to make those contributions to our general understanding of mankind in the past which his training in the investigation of the records of past human events especially fit him to make. He esteems the events he finds recorded not for their dramatic interest but for the light that they cast on the normal and prevalent conditions which gave rise to them. It makes no difference how dry a chronicle may be if the occurrences that it reports can be brought into some assignable relation with the more or less permanent habits of a particular people or person….”

Historians need to show how history can “explain our lives.”

“History is then not fixed but reducible to outlines and formulas but it is ever-changing, and it will, if we will permit it, illuminate and explain our lives as nothing else can do. For our lives are made up almost altogether of the past and each age should feel free to select from the annals of the past those matters which have a bearing on the matters it has specially at heart.”

Less relevant history might be accorded a lower priority.

“If we test our personal knowledge of history by its usefulness to us, in giving us a better grasp on the present and a clearer notion of our place in the development of mankind, we shall perceive forthwith that a great part of what we have learned from historical works has entirely escaped our memory, for the simple reason that we never had the least excuse for recollecting it. The career of Ethelred the Unready, the battle of Poitiers, and the negotiations leading to the treaty of Nimwegen are for most of us forgotten formulae, no more helpful, except in a remote contingency, than the logarithm of the number 57.”

Robinson was joined by a cadre of other historians, most notably Charles A. Beard, his Columbia colleague, whose 1913 book An Economic Interpretation of the Constitution of the United States contended that the nation’s founders were motivated in part by personal financial considerations and that the Constitution was designed to protect vested interests. That dovetailed well with progressive reformers’ campaigns to rein in those interests through legislation and regulation.

Robinson in effect took his own advice. He was an advocate in the limited sense that he tried to exhort historian colleagues to become more relevant and proactive, but he mostly steered clear of the political controversies of the day himself while writing several well-received books, including some best-selling texts. He and Beard, however, chafed at restrictions on academic freedom at Columbia and, in 1919, left to join other progressives in founding The New School for Social Research, which soon became a beacon of progressive thought in history and other fields.

The ideas that Robinson kicked off were also soon espoused by other historians. Historian Carl Becker, in his provocative 1931 American Historical Association presidential address, "Everyman His Own Historian," put a new twist on Robinson’s idea. “History is the memory of things said and done,” he said, “an imaginative reconstruction of vanished events, which each of us … fashions out of his individual experience.” That went well beyond what Robinson had in mind, but it extended his implicit point about taking history to the people. The Society of American Archivists, formed in 1936, took up the mantle of historical records preservation. Establishment of the American Association for State and Local History in 1940 helped with Robinson’s recommendation for more exploration of local history and sources.

In 1929, Robinson gave something of a valedictory in his American Historical Association presidential address, "Newer Ways of Historians". He conceded that even historians like himself, using the methods of the “new history” to gain some predictive insight, did not foresee World War I, the communist revolution in Russia, the great depression, and some other recent developments. But he insisted he had been right on point back in 1911.

“[H]istory at its best needs not simply to be authentic. Its value, as a contribution to wisdom, depends on the selection we make from the recorded occurrences and institutions of the past, and our presentation of them…. Never before has the historical writer been in a position so favorable as now for bringing the past into such intimate relations with the present that they shall seem one, and shall flow and merge into our own personal history.”

Was Robinson right? He certainly ignored a great deal and rather astonishingly seems to have left women mostly out of consideration. He did not offer a definite way to measure the impact and influence of history. He later seemed to backtrack a bit or at least pursue a different course, particularly in his 1921 book The Mind in the Making, which proposed that educational institutions and historians approach social problems with a more progressive intent of creating a just social order. In The Human Comedy As Devised and Directed by Mankind Itself, a posthumous collection of essays published in 1937, Robinson seemed to blend pessimism and optimism – humans have made progress over the years but seemed to carry an ancient warring spirit. The “new history” gradually gave way to other interpretations, and the process of reinterpreting history continues today.

Some historians these days seem to go beyond a semblance of objectivity in the way they present history and assert that their interpretation must be the right one. Others are more measured, presenting fresh perspectives on the story of America in well-documented, objective works. Their work is refreshing and revealing. They are closer in spirit to Robinson, whose work is something of a model of moderate, measured, well-considered presentation that demonstrates the relevance of history in people’s lives.

Recognizing the "Other Renaissance" of Northern Europe

Ghent Altarpiece, Hubert and Jan van Eyck, 1432

It is generally accepted that the European Renaissance began in Italy. However, as this developed south of the Alps, a historical transformation of similar magnitude began taking place in northern Europe. This “Other Renaissance” was initially centered on the city of Bruges in Flanders (modern Belgium), but its influence was soon being felt in France, the German states, England, and even in Italy itself.

This Other Renaissance was certainly influenced by the developments in Italy—in particular the Southern Renaissance, which focused on Florence. However, this Other Renaissance was far more than just a development wholly influenced by what was taking place in Italy. It also involved a number of purely independent features, characteristic of the locations in which it flourished, from Paris to the German states.

This northern Renaissance, like the southern Renaissance, largely took place during the period between the end of the Medieval age (circa the mid-14th century) and the advent of the Age of Enlightenment (circa the end of the 17th century). Arguably, three of the most important events of this period are linked with the Other Renaissance. First was the development by Johannes Gutenberg in 1439 of the moveable type printing press (which, unknown to him, had in fact been invented in China some centuries previously). This enabled the rapid and widespread dissemination of knowledge in the form of books, rather than painstakingly copied manuscripts.

The second northern development arguably changed the face of Europe forever. This was the religious revolution instigated by Martin Luther when he nailed his 95 Theses to the door of Wittenberg Church in 1517. This brought about the Reformation, ending the hegemony of the Roman Catholic Church in western Christendom. Worshippers could pray directly to God, without the intercession of a priest. This Protestantism largely took hold in the north of the continent. Europe was split into two opposing power groups.

The third major development instigated by the northern Renaissance was the proposal, published in 1543 by Copernicus, that the earth was part of a heliocentric system. In this solar system our world was no longer the center of the universe, but did in fact orbit the sun, as did the other planets such as Venus, Mars and Mercury. Accompanied by the discovery of new worlds beyond Europe, Copernicus’s heliocentric idea would have a subtle but profound effect on western psychology and self-understanding.

In parallel with these developments came new European discoveries about our own world. Not long after Columbus reached the New World, Cabot sailed from England to North America. And following the Portuguese discovery of a sea route around Africa, the Dutch established themselves in Indonesia and the English in India. Just as the earth could no longer be regarded as the center of the universe, so Europe recognized that it was no longer the center of the world.

The Italian Renaissance is justly celebrated for its supreme artistic and cultural achievements. Yet these should not be seen as overshadowing the cultural accomplishments of northern Europe. Oil painting was in fact first developed in northern Europe, where its most skilled early practitioners were the Flemish Van Eyck brothers. The Ghent Altarpiece and the Arnolfini Wedding are the Van Eyck masterpieces, arguably the finest early oil paintings in all Europe. The succession of northern European Renaissance artists would include the likes of Holbein and Dürer. In literature, many see the French writer Rabelais as the inheritor of Boccaccio’s influence. Meanwhile, the chateaux of the Loire valley remain unmatched in their unique architectural aesthetic.

The political philosophy of the quintessential Italian Machiavelli would have a profound influence on the thought of the English ruler Henry VIII and his devious chief minister Thomas Cromwell. But the consequences of this rule would develop in a way quite unforeseen in Italy, when Henry VIII established the Church of England and broke with Rome. England would flourish during the Elizabethan age and after, seeing dramatists such as Shakespeare and Marlowe, as well as poets of the caliber of Marvell. Across the Channel in France, Montaigne’s essays would introduce an entirely new examination of the human condition. But perhaps the supreme thinker of this age, both north and south of the Alps, was the Dutch humanist philosopher Erasmus. The later northern European Renaissance would influence the first philosophers of the Age of Reason: Descartes, Spinoza and Leibniz.

As with the Italian Renaissance, this northern development would require its financial benefactors. In Italy, the likes of the Medici bankers, rulers of city states and the Popes had largely filled this role. In northern Europe, innovative commercial developments would bring great riches to such cities as Amsterdam. Here the Dutch East India Company undertook joint-stock ventures to bring spices from Asia. In London, the East India Company would undertake even more ambitious ventures in India. Investors in these companies became rich, forming a new middle class, which aided the northern Renaissance by purchasing paintings, books and other works of art. By contrast, the southern Renaissance was largely funded by aristocrats. This contrast is reflected in many aspects of the northern Renaissance, which developed its own tendency towards bourgeois democratic ideals.

Despite this, the northern Renaissance did in fact produce its own version of the Florentine Medici family. This was the German Fugger family, which originated in Augsburg in southern Germany. Mining, banking and general trade from Hungary to Spain would eventually make the Fuggers the richest non-aristocratic family in Europe, even taking over some of the Medici trade as the Medici bank went into decline. Such was their fortune that they were soon influencing who was elected Holy Roman Emperor.

In parallel with the great scientific advances made in Italy by Galileo, scientists in northern Europe made many epoch-changing advances. The English physician William Harvey (who had been educated in Italy) discovered the circulation of the blood, which would revolutionize medical practice. The Flemish physician Vesalius had earlier transformed the field by producing the first modern work on human anatomy. In Scotland, John Napier of Merchiston made considerable advances in mathematics. He invented logarithms, was a pioneer in the use of decimal points and constructed a calculating device known as “Napier’s bones.” This trio of British scientists is completed by the flamboyant and controversial Sir Francis Bacon, who achieved political success as Chancellor of England, fell from grace, and was the first to articulate the new scientific method.

The northern European Renaissance would finally evolve into the Age of Reason and the consequent Enlightenment. Arguably, this “other renaissance,” and the figures it produced, would play a role at least as significant as the Italian Renaissance in bringing our modern world into being. This fact is largely overlooked. The Other Renaissance is an attempt to right this wrong.

For Derby Day, a Note of Caution About Horses and "Races"

Riderless Racers at Rome, Théodore Géricault, 1817

With the Kentucky Derby just around the corner, horse lovers and the uninitiated alike prepare their hats, mint juleps, and bets for what is widely known as “the most exciting two minutes in sports.” And with the puff of dust unleashed by the horses’ hooves, all the messiness of an old tradition with a deep history is kicked up once again.

Scandals abound, fueling the drama of the race. At only three years of age, the horses are young and their risk of injury is high, as the world saw when the filly Eight Belles was euthanized after she broke both front ankles following a second-place finish in 2008. Testing has sought to root out the use of drugs like the steroid that helped lead Medina Spirit to victory in 2021. That is without factoring in the gambling along with the social dynamics of wealth at play as 150,000 fans gather to watch the thoroughbreds round the track.

But another thread is even more nefarious. We find it in places like the race’s official website that celebrates that “The first step on the ‘Road to the Kentucky Derby’… is the breeding shed.” Breeding racehorses is an elaborate, expensive affair complete with pedigree charts and contracts. For registration with the US Jockey Club, a Thoroughbred must be conceived through “live cover,” not embryo transfer, cloning, or artificial insemination. That means the mare (the female horse) must be brought to the stallion (the male) when she comes into heat, and stallions might mate with up to three mares a day. Elite horses, then, are both born and made as so much work goes into both their breeding and training.

Horse racing activates “best in breed” thinking. Pursuing the best, most perfectly bred animal is not a new concept. It has its roots in the Renaissance, another heyday for horse racing, and contributed to the evolution of the idea of race.

Throughout the Italian Renaissance, the patrons of Leonardo da Vinci and Galileo Galilei spent a fortune on horses alongside artists and intellectuals. Their fascination with sketching out the perfect proportions of buildings and humanity, exemplified in the Vitruvian man, extended to creating perfect horses.

For exorbitant sums, they imported animals from around Europe and the Mediterranean in pursuit of the best characteristics of each, and the political ties that came from getting to know their breeders. Arabian horses were famous for their agility. Turkish horses were steady over long distances. North African horses were known for their sprinting. Once brought to Italy, these animals would be bred with European horses in pursuit of the best combinations for riding, carriage driving, and, of course, racing. Cities would grind to a halt as everyone watched the palio horse races that ran through the central piazza, showcasing which noble family had managed to breed the best animals. 

The irony is, though, that racing bred race. These projects to create the best horses popularized a word invented in the Renaissance: “race.” Owners, breeders, and trainers all referred to the horses they raised as “razze dei cavalli.” While the modern English term “race” is, according to Merriam-Webster’s Dictionary, chiefly a word for “any one of the groups that humans are often divided into based on physical traits regarded as common among people of shared ancestry,” the word began with a meaning akin to our word “breed.” In the Renaissance, “race” meant a stock (as in livestock), or a population that had been carefully bred. It didn’t emphasize physical differences; instead, it called attention to the work of selection and training.

However, as European empires expanded into overseas territories, they brought this new term “race” with them and used it to describe both humans and animals. In these hierarchical, colonial systems, it became a term to denote fixed, physical differences in humans rather than the temporary reproductive work of breeding in animals.

The slipperiness between breeding as innate and breeding as trained continues to rear up. Even now, attending a horse race carries the connotation of good breeding. That’s what’s at stake in the joke in My Fair Lady when Eliza Doolittle, the carefully educated protégée taught upper-class English to raise her out of poverty, makes the fateful error of shouting at her horse to run a touch faster. No matter how perfect her hat, accent, and company had been, something about her true background slipped out in the thrill of the race.

So, on May 6, when we root for the aptly named Angel of Empire and the other colts and fillies who make it to the Derby, we are slipping into a much older tradition. By admiring their rippling coats and eye-popping speed, we are celebrating animals that have been crafted with human intervention to attain quasi-eugenic ideals. 

"Class War" is Back in the Headlines. But What is it, Really?

Communards at a Barricade, Paris 1871

Ours will go down as the age of class war. We talk about it all the time, and with mounting urgency amidst the realities of economic crisis, ecological catastrophe, and social declension. Rising inequality and the ever-increasing stakes of our brutal economic divide are constant reminders of this conflict, as is the ongoing wave of strikes, with workers of all kinds demanding higher wages and better conditions. 

Barely a day goes by now without class war featuring in one headline or another. For British economist Ann Pettifor, the finance sector’s response to spiraling inflation – raising rents, crushing demand, and disciplining workers – is clear evidence that “their effective preference is for class war over financial stability.” Pettifor’s formulation, which has been republished throughout the bourgeois press, is just one of many similar usages of this phrase in 2023 alone.

And yet, despite a collective willingness to acknowledge the existence of class war, to use this phrase as a popular trope and a technical term, we don’t really know what it is. As one of history’s most prominent narrative categories, class war wants for critical explanation. So: what is class war?

Within critical thought, class war is used less as a technical term and more as an affective catalyst, reframing actions and rhetoric through military concepts and language, without offering so much as a program or practical strategy. That is what we encounter most famously with Marx and Engels, when in 1848 they summarized the development of the proletariat and the bourgeoisie: the “fight” between the classes has become so absolute that class struggle (Klassenkampf) modulates into civil war (Bürgerkrieg) and then open revolution. From this perspective, the work of politics is to undertake the transition from one phase to the other, from the grinding brutality of class struggle to the presumably intentional, organized, and openly violent confrontation with the bourgeois and their institutions.

In other words, revolution means to see the exploited class waging war against the economic regime and interstate system that maintains its exploitation, namely capitalism, whose beneficiaries will militarily defend their benefits with everything they have. To speak of class war might therefore be to conjure militancy and solidarity against the present state of things. And in this sense, class war is not just the stuff of political discourse; it emanates from the lived experience of social conditions.

The first ever recorded utterance of the phrase “class warfare” is from January 25, 1840, when it appeared in the Chartist newspaper the Northern Star. Chartism was a movement of redress that openly sought political enfranchisement. By the same token, however, the movement also comprised a militant faction for which actual warfare was seen to underwrite more peaceable demands. The invocation of class war came at the end of an article setting out the movement’s positions and demands, which pronounces ritual bloodletting as the outcome of underrepresentation coupled with immiseration. “Good must come to the nation out of this class warfare for pre-eminence,” it reads, “as from a compound of the most deadly poisons a wholesome medicine may be extracted.” No longer a threat, something to be worried about in the future, but alive and deadly, here and now: the class struggle had already erupted into civil war.

This usage of class war is carried on all throughout the history of social movements, into the twentieth century and beyond. It is what Rosa Luxemburg once described, against the beating drums of national chauvinism and the opening up of an imperial war of extermination, as “the crux of the matter, the Gordian knot of proletarian politics and its long term future,” namely the need to escalate the ongoing class struggle into actual civil war. “The proletariat does not lack for postulates, prognoses, slogans,” she says. “It lacks deeds, the capacity for effective resistance to imperialism at the decisive moment, to intervene against it during the war and to convert the old slogan ‘war against war’ into practice.”

Other well-known revolutionaries would hold to a similar line. For Lenin, thinking about the Paris Commune of 1871, the proletariat “must never forget that in certain conditions the class struggle assumes the form of armed conflict and civil war; there are times when the interests of the proletariat call for ruthless extermination of its enemies in open armed clashes.” Or for Mao, civil war marks the passage from contradiction into antagonism, or mobilization of masses. In his view, “the contradiction between the exploiting and the exploited classes” has persisted through slave society, feudal society, and into modern capitalism as a “struggle” between the two; “but it is not until the contradiction between the two classes develops to a certain stage that it assumes the form of open antagonism and develops into revolution. The same holds,” he adds, “for the transformation of peace into war in class society.”

When revolutionaries talk about class war they tend to do so in a hybrid future-present tense: class war is coming, but it’s also already upon us. Battlefronts are opening up, but something else is looming on the horizon. Specifically: the revolutionary invocation of class war raises consciousness of the former struggle as a point of departure into the latter conflict. It seeks to recruit and to motivate comrades from a state of contradiction into acts of antagonism. The proclamation of class war is what linguists might describe as a speech act: a performative utterance that, when said, is also a kind of action – like, for instance, a formal declaration of war, which not only announces but also commences.

This is just one shared feature that links the writing of Ann Pettifor and any number of other mainstream headlines to the formulations of Marx, Engels, Luxemburg, Lenin, Mao, and the social movements they all represent.

Class war is a red thread that conjoins our present moment with a submerged history of revolution. My new book, Class War: A Literary History, traces that thread, and in doing so tells a narrative that spans the globe and more than two centuries of history, from the Haitian Revolution to the Russo-Ukrainian War.

When Truly Stolen Elections Changed the Course of American History

"Liberty, the Fair Maid of Kansas in the Hands of the Border Ruffians," c. 1856. Image Boston Public Library

U.S. Secretary of State William L. Marcy, Minister to the UK James Buchanan, President Franklin Pierce, and US Senators Lewis Cass and Stephen Douglas are depicted as participants in the violence committed by pro-slavery Missourians

It has become a familiar cry over the past couple of years: The election was stolen! Fraud! We were robbed!

Former President Donald Trump and his supporters allege that election fraud in several states (Georgia, Arizona, Pennsylvania, Wisconsin, and others) resulted in the electoral votes of those states going to Joe Biden, determining the outcome of the 2020 presidential election. The claims brought Trump supporters to Washington, D. C. on January 6, 2021, and led to the attack on the Capitol. Similar claims about the 2022 midterm elections have been made, most notably by the Republican candidate for governor in Arizona.

While the recent election fraud allegations have been rejected by the courts due to lack of supporting evidence, there was a time in American history when elections were stolen, their outcomes determined by fraudulent votes, and their results certified by the federal government.

In 1854, Congress passed, and President Franklin Pierce signed into law, the Kansas-Nebraska Act. It repealed the Missouri Compromise’s prohibition on the expansion of slavery into territories north of the 36°30′ line. Instead, whether slavery would exist in the territories of Kansas and Nebraska would be determined by what was called “popular sovereignty,” i.e., the people of a territory would vote on whether to have slavery or not. Nebraska, located farther north and sharing a border with the free state of Iowa, was, most believed, destined to reject slavery. But the status of slavery in Kansas, to Nebraska’s south and bordering on the slaveholding state of Missouri, was uncertain. Both the North and South rushed settlers into Kansas to try to gain the majority.

The major tests of each side’s strength occurred at the ballot box. The first election was held on November 29, 1854, to select a delegate to represent the territory in Congress. The rules for voting, as determined by the territory’s Pierce-appointed governor, Andrew Reeder, had been clear. To cast a ballot, an eligible voter must actually reside in the territory of Kansas, to the exclusion of any other domicile, and have the intention of remaining permanently.

So much for the rules. On election day, hordes of proslavery Missourians crossed the border into Kansas and voted illegally. Although most were not slaveholders, they had heard plenty of speeches from their leaders inciting them to do whatever was necessary to stop the “Yankee abolitionists” from making Kansas a free state. Dubbed “border ruffians” by free-staters, these men were menacing in appearance and behavior and arrived for election day armed with knives and guns, as well as with ample supplies of barreled whiskey. They came in groups across the border a couple of days before the election and, a day or so after, turned around and went back to Missouri. Crowding the polling places, they demanded to vote and threatened poll judges who refused to let them do so. Some of the judges, in fear for their lives, quit on the spot; those who remained were helpless to prevent ballot boxes from being stuffed. Worse, the ruffians used intimidation and in some cases violence to keep legitimate slavery-opposing residents of Kansas from casting their ballots.

It worked. The proslavery candidate for Congress, John Whitfield, won the November election with almost 2,300 votes, compared to only around 300 for his closest challenger. Despite the widespread and obvious fraud, Governor Reeder let the results stand. The Kansas-Nebraska Act, the victors proudly declared, established that the people would vote to decide all issues pertaining to slavery. And vote they had. A congressional investigation later determined that more than 1,700 votes had been fraudulently cast.

Four months later, on March 30, 1855, another election was held in Kansas to select a territorial legislature. This election was far more important than the earlier one, which had only chosen a delegate to represent the territory in Congress. The legislative body elected in March would write the territory’s laws, put it on a course for statehood, and have a large say in whether that would be with or without slavery. In the months since November, hundreds more settlers had arrived from New England and other northern states, most of whom opposed slavery. Free-staters were confident that, if the election were held fairly, a legislature with a majority opposing slavery would be chosen.

Once again, however, thousands of Missourians crossed the border and cast illegal ballots. A census of Kansas residents taken just a few weeks before had documented fewer than 3,000 eligible voters. Yet, more than 6,000 votes were cast and, of those, more than 5,400 were for proslavery candidates. All but a handful of the seats in the legislature went to them. As in November, many legitimate Kansas residents who opposed slavery were unable to vote, due to intimidation, threats, and violence. An appeal to Governor Reeder to toss out the results ended in a revote in only a few precincts, nowhere near enough to change the outcome. Fumed Horace Greeley in the New York Tribune, “[A] more stupendous fraud was never perpetrated since the invention of the ballot-box. The crew who will assemble under the title of the Kansas Territorial Legislature, by virtue of this outrage, will be a body of men whose acts no more respect will be due . . . than a Legislature chosen by a tribe of wander[ers] . . . .”

Although elected by fraud, the territorial legislature was recognized by President Pierce as the legitimate government of the Kansas Territory. When it met in the summer of 1855, harsh proslavery laws were passed. These not only made slavery legal in Kansas, but also imposed the death penalty for assisting a slave escape to freedom, and even made speaking or writing in opposition to slavery in Kansas a felony punishable by up to two years of imprisonment at hard labor. 

The free-staters in Kansas dubbed these “bogus laws” enacted by a “bogus legislature.” They boycotted the machinery of the territorial government, adopted a policy of repudiating its laws, drafted a constitution for Kansas to enter the Union as a free state, and set up their own shadow government. President Pierce, who called these actions “revolutionary” and potentially “treasonable,” ordered the commanders of federal forts in the territory to suppress any armed resistance to enforcement of the laws. Over the next few years, much blood was shed in Kansas on both sides. The town of Lawrence, an antislavery enclave, was attacked by a proslavery mob, abolitionist John Brown massacred proslavery men and boys at Pottawatomie Creek, and there were battles between militia groups. Some estimates of the death toll run into the hundreds.  

By 1859, settlers in Kansas opposing slavery were clearly in the majority. A new governor, Robert Walker, vowed that elections would be held fairly. When he threw out fraudulently cast votes in elections for the territorial legislature held that fall, the free-staters were finally in control. In January 1861, two months after the election of Abraham Lincoln, and after the secession of several states in the Deep South, Kansas finally entered the Union as a free state. Lincoln, a former one-term congressman from Illinois, had left politics in the late 1840s. It was his opposition to the Kansas-Nebraska Act that caused him to reenter the political arena in the mid-1850s and put him on a path to the White House. But for the controversy, election fraud, and violence in Kansas, Lincoln may well have been just a footnote to history.

Why Did Madison Write the Second Amendment?

Illustration of a Mississippi slave patrol, c. 1863. This, argues Carl T. Bogus, was the "militia" which Madison wrote the Second Amendment to secure.

A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed. – Second Amendment, U.S. Constitution

When, on June 8, 1789, James Madison took to the floor of the House of Representatives to propose a collection of constitutional amendments – some of which, he said, “may be called a bill of rights” – he faced a steep climb. Few in the First Congress wanted to adopt a bill of rights, at least then. “The Constitution may be compared to a ship that has never put to sea,” said one congressman. It needed to be tested before decisions could be made about amending it.

Madison himself had opposed a bill of rights during the Constitutional Convention and the ratification process. He believed that rights were better secured through governmental structure than through what he called “parchment barriers.” He also worried that any list of rights would generate arguments that anything omitted was not a right.

What changed Madison’s mind? And why did he include the Second Amendment?

The answers to those questions begin one year earlier. By its own terms, the Constitution had to be ratified by at least nine states at special ratifying conventions. When Virginia’s ratifying convention convened in Richmond in June of 1788, eight states had ratified, but if Virginia failed to ratify, it was by no means clear that there would be a ninth state.

Federalists and antifederalists battled fiercely for nearly a full month in Richmond. Madison led the federalists, who wanted a strong national government and favored ratification. The antifederalists, who feared a strong national government and opposed ratification, were led by Patrick Henry and George Mason. Henry, Virginia’s first governor and the most politically powerful person in the state, was considered America’s greatest orator. Mason was one of three delegates to the Constitutional Convention in Philadelphia who had refused to sign the Constitution.

One of many arguments that Henry and Mason deployed in Richmond concerned the militia. Until then, states controlled their militia. But the Constitution gave Congress the power to organize, arm, and discipline the militia. The states were only given the authority to appoint officers and train the militia in accordance with the discipline prescribed by Congress. Time and again, Henry and Mason argued that Congress might “neglect or refuse” to arm the militia, on which the South relied for slave control. “The militia may be here destroyed by that method which has been practiced in other parts of the world before; that is, by rendering them useless – by disarming them,” Mason declared. Henry argued that authority to arm the militia implied the authority to disarm it, and he raised the specter of Congress – controlled by a faster-growing, increasingly abolitionist North – doing exactly that.

Southerners lived in terror of slave revolts. In 1739, an insurrection in Stono, South Carolina, by 60-100 slaves, armed with stolen muskets, left more than 60 White slaveowners, family members, and militiamen dead. No one knows how large the rebellion might have grown, or what the death toll would have been, if militia had not snuffed out the revolt before the end of its first day. In Eastern Virginia, where many of the Founders lived and the Richmond debate was taking place, enslaved Blacks outnumbered Whites. At night, militia groups patrolled designated areas – called “beats” – to ensure slaves were where they were supposed to be (this is where the terms “patrol” and the policeman’s “beat” originated).

But while militia were essential for slave control, the Revolutionary War had demonstrated they were useless as a military force. Lexington and Concord were the only true militia victories. In the face of the enemy, militia repeatedly threw down their muskets and fled. Henry “Light Horse Harry” Lee, a hero of the Revolutionary War, told the Richmond convention that he “could enumerate many instances” of militia unreliability but would describe just one: how Continental soldiers behaved with “gallant intrepidity” and militiamen fled at the Battle of Guilford Courthouse. Other Virginians had previously reported much the same. George Washington repeatedly expressed disgust with militia. After Virginia militia bolted without firing a single shot at the Battle of Camden, their own commander told Governor Thomas Jefferson, “[M]ilitia I plainly see won’t do.”   

When Madison argued that Virginia could arm its own militia if Congress failed to do so, Henry ridiculed him. The Constitution allocated some powers to Congress and others to the states. “To admit this mutual concurrence of powers will carry you into endless absurdity – Congress has nothing exclusive on the one hand, nor the states on the other,” Henry said. He was, of course, right.

The federalists prevailed at Richmond, but just barely. The key vote was 88-80.

Henry then worked to extinguish Madison’s political career. He had the state legislature send two other men to the U.S. Senate, gerrymandered Madison’s congressional district by packing it with antifederalist counties, and recruited rising star James Monroe to run against Madison for the House. Monroe intended to campaign on his support for, and Madison’s opposition to, a bill of rights. And so, Madison flipped – promising, if elected, to write a bill of rights.

Madison won the election, yet he remained in jeopardy as long as Henry held power. “Poor Madison got so cursedly frightened in Virginia that I believe he has dreamed of amendments ever since,” said one of Madison’s congressional colleagues. Another said Madison was “haunted by the ghost of Patrick Henry.” It makes perfect sense that, in writing a bill of rights, Madison would try to cure the problem Henry and Mason raised in Richmond. Madison could not expressly give states the right to arm the militia because he had vowed that his amendments would not contradict anything in the Constitution. In fact, states rarely armed their militia; they simply passed laws requiring militiamen to furnish their own arms. By using that model, Madison was able to write an amendment that assured his constituents – and the South generally – that they would have an armed militia for internal security.

50 Years Later: Eyewitness to the Last Day of US Military Command in Vietnam

BGEN Stan McClellan, U.S. Army, Chief of Staff of the Military Assistance Command-Vietnam (MACV), conducts a press conference for members of the civilian press in a MACV briefing room at Tan Son Nhut Airbase to discuss the details of the release of the prisoners of war.

February 1973, two months before the end of MACV

The calendar said March 29, 1973. But the last few thousand American soldiers in Vietnam called it "X plus 60"—the 60th day of the truce, and the deadline for the last U.S. troops to go home.

It was, when it came, a day with an overwhelming sense of anticlimax. Camp Alpha, the processing barracks for departing GIs at Saigon's Tansonnhut Air Base ("It's Camp Omega today," someone murmured as we drove through the gate), gave the impression not of a war zone but of a second-rate hotel lobby at the end of a salesmen's convention. In the lines of men coiling out of the gymnasium-like staging area onto buses that would take them to the flight line, you saw none of the teenaged grunts or fresh-faced platoon leaders who actually fought the battles. The last soldiers of America's war in Vietnam were captains and majors and senior sergeants, middle-aged men with thinning hair and thickening waists. Looking at them, you remembered not battle but the beery haze of officers' and NCO clubs. Many of them had seen combat in earlier tours, of course. But they were leaving now from offices, not foxholes, where with typewriters and duplicating fluid they had carried out the necessary but hardly glorious tasks of shutting down the American war machine.

At mid-afternoon, about 50 of them attended a forlorn little ceremony that was the last formation of the Military Assistance Command Vietnam—always called by its acronym, "Mack-Vee"—once an army of a half-million men. In a courtyard outside the huge headquarters building everyone in the little group stood at attention while a terse general order was read: "Headquarters Military Assistance Command Vietnam is inactivated this date and its mission and functions reassigned." Then an honor guard marched briskly forward carrying the MAC-V flag with its insignia of an upward-pointing sword. Facing Ambassador Bunker and General Weyand, the last MAC-V commander, the flag-bearer dipped the banner, then furled and encased it in an olive-drab bag resembling a golf bag, in which it was to be flown out of the country.

A few hours later, Weyand attended a second ceremony with the chief of the South Vietnamese Joint General Staff, the ineffectual Cao Van Vien. "Our mission has been accomplished," the lanky Weyand pronounced haltingly in Vietnamese from a phonetic script. Then he boarded a special Air Force flight and was gone.

Not many hundred yards away, Vietnamese workers celebrated the historic day by busily and thoroughly looting Camp Alpha's billets and storerooms. Lines of "hooch maids" streamed through the gate carrying electric fans, clothing, lamps, stacks of old magazines and paperbacks, and other booty that could be used or sold in the Saigon market. Another group, including off-duty Vietnamese soldiers and airmen in civilian clothes, ripped away a section of chicken-wire fence to break into the mess hall, which was supposed to be turned over to the international truce observers. Ignoring the curses of a few furious Americans who had worked past midnight to tidy up for the new tenants, the intruders carried off tables, chairs, and crates of food. Even ceiling fans were ripped from their fixtures. The crowd turned unruly, though still good-natured, and began smashing what could not be carried away. In less than fifteen minutes, the formerly immaculate dining room was a shambles of broken bottles, spilled food, and upturned furniture—a tiny but telling metaphor, I thought, for the country we had thought to save with American technology and wealth but had never fully understood.

The three-day airlift removing the last American troops had been carefully calibrated to coincide with the release of the last group of U.S. war prisoners. On March 27, 32 men were handed over to U.S. representatives in Hanoi and flown aboard U.S. Air Force hospital planes to Clark air base in the Philippines. On the 28th, 50 more were released, including ten captured in Laos whose status had been the subject of a tense ten-day dispute in the Joint Military Commission. And on the 29th, another 67 prisoners, the last of a total of 595 freed in the exchanges, were flown to freedom. In Saigon, the last 5,200 U.S. servicemen were flown out at a rate roughly matching the repatriation of the prisoners. Another 825 American military delegates to the truce commission were to leave in the two days following the deadline, leaving 159 Marine embassy guards and 50 military members of the Defense Attache Office as the only uniformed Americans remaining in Vietnam.

By the time the last flight of the 29th was ready to load, a slanting afternoon sun was casting long bars of shadow across the tarmac. Communist truce delegates in baggy green uniforms clustered about the plane, aiming cameras at the departing Americans as they compiled a copious photographic record of what was, to them, a triumphant occasion. A few dozen American and European reporters and cameramen also recorded the scene. Not far away but unnoticed was a flatbed truck loaded with twenty coffins: South Vietnamese dead, flown back from the north to be buried. At planeside a Communist colonel named Bui Tin, the spokesman for the North Vietnamese truce delegation, was carrying a gift: a straw-mat painting of a Hanoi street scene, which he planned to present as a memento to the last departing American. When Master Sgt. Max Beilke of Alexandria, Minnesota, stepped onto the boarding stairs, Colonel Tin hurried forward and thrust the package at him. But the gesture was too early. A few minutes later, while Tin watched empty-handed, two more Americans boarded the plane, Col. David Odel of Crystal Lake, Illinois, the Tansonnhut base commander, and Chief Master Sgt. Vincent R. Jacobucci of Forest Hills, New York, his senior noncommissioned officer.

Odel and Jacobucci paused for a moment at the top of the boarding steps, waved back to the truce observers and cameramen on the ground, then disappeared inside. The doors slid closed and the huge C-141 transport swerved toward the taxiway. At 6 p.m., 60 days and ten hours after the failed truce, it lumbered off the ground into an orange sunset that silhouetted the watchtowers out on the airport perimeter. For the first time in over eleven years—it seemed longer—there was no significant American military presence in Vietnam.

This excerpt is from Without Honor: Defeat in Vietnam and Cambodia, updated edition, McFarland & Company Inc., 2022.

Learning from Historical Fiction: A Family Tale Reveals a Brief Multicultural Moment of the American West

A watercolor depicts the Montana fort of Angus McDonald, the last Hudson's Bay Co. trader in the United States

Historians may well wonder what drives a creative writer to plunge into a previously unknown period to produce that peculiar hybrid: the “historical novel.” We historical novelists often ask ourselves the same question—usually right as we start trying to wrestle masses of factual detail into story form. Every project appeals for different reasons. Yet I have found the same thing happening in each dive into a new historical period. Whether the subject is the power politics of the medieval church, postwar Germany or the colonizing of the American West, the act of researching and writing a historical novel is a crash course in a portion of the past that helps me—and with luck my readers—gain new perspectives on the world.

My latest novel, The Shining Mountains, the story of a Scots-Nez Perce family caught in the crossfire of Western expansion, was particularly rich in this regard. The novel’s themes could not be more pertinent: Americans today are acutely conscious both of how history is told and of hearing voices from cultures long marginalized. The novel is a tale of the multicultural world that existed before American settlement of the West, one that includes indigenous points of view.

Yet this broader context was far from my mind at the start. I simply wanted to write a novel about the amazing life of my ancestor’s brother Angus McDonald, the last trader for the Hudson’s Bay Company in the United States, and his Nez Perce wife Catherine Baptiste, along with the family they raised at a time of tremendous conflict and change.

Instead, over the four years it took to research and write, I wound up teaching myself the basic facts of Native history in this country, with a focus on one specific slice: between 1842 and 1879 in the Pacific and Rocky Mountain Northwest. Then, using fiction—the imagined story of two real individuals—I was able to investigate and ultimately portray historical events that most of us never learned in school.  

Every story is about not just characters, but the world these characters inhabit. Learning everything I needed to describe this world accurately was a tall order, but one I approached with energy and fascination. This was my family’s history, after all; this is America’s history as a nation. I had to educate myself on the fur trade period and the role of America’s political, military, religious and civilian actors as the doctrine of Manifest Destiny unfurled. At the same time, I had to discover everything I could about the real protagonists' lives, in order to present a story at the level of granular detail that fiction requires.

Angus McDonald and his family were sufficiently prominent to have left letters and documents in university archives. I sought and gained the support of his descendants and the tribes involved, and started digging. It should go without saying that I, like every writer of historical fiction, am dependent upon—and exceedingly grateful for—detailed and labor-intensive studies of primary sources by professional historians. We could not spin our fictions without them. Subject headings in my archive suggest the scope of my initial research: fur trade; Hudson’s Bay Company; company marriages; Glencoe massacre; Nez Perce War; Salish culture; buffalo hunting; Yellowstone Park. Then the events in which the McDonalds were involved demanded that I learn even more: Steptoe Massacre; Yakama war; Isaac Stevens’ transcontinental railroad survey; 1850s treaty negotiations; Idaho and Washington gold rushes; clippings from territorial newspapers. The great historians of this time and place—Alvin M. Josephy Jr., Jerome A. Green, Allen P. Slickpoo, Lucullus McWhorter—are the authorities on which I built my part-real, part-imagined world.

There is lively debate today about how “true” historical fictions should be. I tend to agree with those who criticize events and situations invented from whole cloth. The truth is dramatic enough, more often than not. I have also worked as a journalist committed to accuracy in my reporting for the past forty years. So I bring the same focus on getting the facts right to my fictions. This has been especially true of the history behind The Shining Mountains.

Over time I have come to see that the truths on which this book rests matter as much to me as this one family’s story. The more I learned, the more I grasped that I might have a deeper purpose in telling their history. I finally concluded that I wanted readers to experience what I had experienced, viscerally, step by step: to see exactly how the U.S. government and the settlers it encouraged actually dispossessed these Native tribes. And I wanted them to learn—as I so belatedly had—that for decades before these wars and massacres that “cleared” the country for pioneers, a vibrant multicultural society existed in the Pacific and Rocky Mountain Northwest. To that end, I included an appendix in which readers could fact-check which parts of the story were true and which invented.

I once attended a conference in which an eminent historian of Tudor England praised the late great Hilary Mantel for her brilliant grasp of the entire period. In recent years the profession seems to have tempered its reflexive rejection of fictional forays into its territory. Many historians, after all, now write fictions set in their periods of expertise. It is well established that in the classroom, at least, history told in story form lodges more firmly than factual data in students’ minds. Of course, older readers vary greatly in the approaches they enjoy. Some want to learn, others just to be entertained.

In the same way, I suspect, some writers of historical fiction aim to educate, while others are more drawn to portraying universal human drives. Looking back, I see that I did the latter in my first novel, Gutenberg’s Apprentice, a tale of vast technological change. In The Shining Mountains, I come down more on the educating side. The reason for this difference, I think, is the position in which I find myself: a writer from the dominant culture writing about a largely erased minority group. I wanted to use my bullhorn, and my privileged standing, to teach a largely white readership about these historic wrongs.

The Curious History of Ulysses Grant's Great Grandfather

Fort William Henry, 1755

During the summer of 1756, Lieutenant Quintin Kennedy led a motley band of British Regulars, Scottish Highlanders, Mohawk warriors, and Provincial troops on a scouting party in the dense woods north of Lake George in New York. At that time, British forces and their Indian allies were sporadically engaging in violent clashes with French troops and their Indian allies in this fiercely contested region as part of the French and Indian War that broke out in 1754. Kennedy, who fought in the devastating British defeat at the Battle of the Monongahela in July 1755, had subsequently adopted Indian dress and military tactics. It was even rumored that Kennedy had married an Iroquois woman.

In one account of the scouting party, a British journal reported that Kennedy went “a-scalping, in which he had some success.” Kennedy’s group of sixty soldiers spent forty days in the woods creating havoc in New France, burning homes and killing several French settlers. His men destroyed property worth between £8,000 and £10,000 sterling. Upon their return to Fort William Henry, at the southern tip of Lake George, they brought back “one scalp, and two prisoners, who were the tavern-keeper and his wife, whose house, with others, they also burnt.”

All of Kennedy’s party returned safely on September 20, 1756, except for three individuals: “Captain Grant of Connecticut, and a cadet of the regulars, and one of the Highlanders, —a poor drunken fellow, not able to travel, they left behind to surrender himself to the enemy.” Captain Grant’s body was never found, alas. A Connecticut newspaper declared that Noah Grant died on Sept. 20, 1756, “Killed near Lake Champlain.” Tragically, Noah’s younger brother Lieutenant Solomon Grant had been killed in June 1756, after his scouting party was attacked by Indians in western Massachusetts.

Captain Noah Grant’s brief military career has been forgotten, but he is still remembered today for being the great grandfather of Ulysses S. Grant, who would become the general-in-chief of the United States Army over one hundred years later. Noting Noah Grant’s military service, Jesse Root Grant said of his son Ulysses, “The General comes of good fighting stock.” Noah Grant’s life helps us better understand Ulysses S. Grant in another way as well.

Ever since Matthew Grant first arrived in the New World at Dorchester, Massachusetts, in 1630, the Grant family had been instrumental in settling, exploiting, and defending the American frontier—first in the wilderness of Massachusetts and later in Connecticut, Upstate New York, Pennsylvania, and Ohio. Ulysses S. Grant might have been ambivalent about slavery when the Civil War broke out in April 1861, but he never wavered in his support for the Union. The story of the Grant family had been inextricably linked to the westward expansion of America for over two centuries.

At 37 years old, Noah Grant of Tolland, Connecticut, volunteered for military service after the General Assembly of Connecticut authorized the mobilization of 1,000 troops in early 1755. Like the 23-year-old Colonel George Washington, who commanded a Virginia regiment after the Battle of Monongahela, he’d be fighting on behalf of the British Crown. Later that year, Lieutenant Noah Grant participated in an unsuccessful expedition to take Fort Crown Point on Lake Champlain. The evidence suggests Noah was a brave and trustworthy soldier. In May 1756, he received a gratuity from the Connecticut Assembly worth thirty Spanish milled dollars for “extraordinary services and good conduct ranging and scouting, the winter past, for the annoyance of the enemy near Crown Point.” He was also promoted to Captain of the Seventh Company, Second Connecticut Regiment in March 1756.

According to muster rolls written by Noah Grant, there were several African American soldiers in his company, as the names Prince Negro and Jupiter Negro clearly indicated. Solomon Scipio and Jonah Chapman were two additional men in Grant’s company who were likely African Americans. It’s a curious fact of history that Captain Noah Grant’s great grandson would eventually expand opportunities for African American soldiers during the Civil War. There does not appear to have been segregation among troops during the French and Indian War, and it was widely acknowledged that Black troops were effective.

The fighting experienced by Noah Grant was shockingly violent. Scalping and other unspeakable atrocities were common during the French and Indian War. A few weeks before the disappearance of Noah Grant, eight carpenters were killed and two carters were scalped by Indians near Fort William Henry. And Grant’s scouting party returned from their expedition with at least one scalp of their own. During the war, both British and French authorities offered bounties for the scalps of their enemies. It’s conceivable that Noah Grant himself was scalped, though we have no evidence whatsoever on how he died.

Lieutenant Kennedy’s scouting party, the one in which Captain Grant was lost, exemplified a revolution in military tactics on the American frontier. Major General Edward Braddock’s defeat at the Battle of the Monongahela impressed upon Kennedy that the British would need lighter, more mobile forces to defeat the French and their Indian allies, who had perfected the art of la petite guerre. Kennedy, who first arrived in Virginia in 1755 as a young officer with the 44th Regiment of Foot, was a pioneer in learning to fight in a new way, one more suited to American conditions than to a conventional European battlefield.

According to one account, “Lieut. Kennedy has married an Indian squaw…has learned the language, paints [himself] and dresses like an Indian, and it is thought will be of service by his new alliance. His wife goes with him, and carries his provisions on her back.” The innovative tactics adopted by Kennedy and others eventually helped win a British victory in the French and Indian War, which had tremendous consequences for American history. “Freed of European rivals,” Pekka Hamalainen writes in Indigenous Continent, “the British would treat the Indians as subjects.” The Grant brothers, Noah and Solomon, played their small part in this bloody contest to open up the frontier to American settlers, who would eventually dispossess the original owners of this land.

Shortly after the Civil War broke out, Ulysses S. Grant wrote his father Jesse, “Whatever have been my political opinions before I have but one sentiment now. That is we have a Government, and laws and a flag and they must all be sustained.” His support for the Union was sincere and deeply held. This shouldn’t surprise us. The Grant family had deep connections to the American experiment from the very beginning.

Ulysses famously declared, in the opening line of his memoirs: “My family is American, and has been for generations, in all its branches, direct and collateral.” One of his ancestors had landed in the New World a mere decade after the arrival of the Pilgrims at Plymouth. Captain Noah Grant, as we’ve seen, gave his life in 1756 for the promise of securing western lands for future expansion by colonists. His son—also named Noah—claimed to have fought for independence from Great Britain during the Revolutionary War. And his son Jesse—the father of Ulysses—built a thriving business from scratch on the frontier along the banks of the Ohio River. By 1860, the Grants had made great sacrifices for their country and had been richly rewarded for their efforts, too. The sacrifice of Captain Noah Grant, during the French and Indian War, may have consciously or unconsciously influenced Ulysses S. Grant, as he decided to rejoin the United States Army in April 1861.

How The Irish Saved Wellington at Waterloo

"Closing the Gates at Hougoumont, 1815," Robert Gibb, 1903

For almost a millennium, the Irish have provided men and military expertise to the English and British crowns. From the hobelar light cavalry in the 13th century to the First World War and beyond, soldiers from Ireland have made outsized contributions. The Battle of Waterloo, June 18, 1815, stands as a shining example of Ireland’s place in Britain’s military history.

Taking the King’s Shilling

The Revolutionary and Napoleonic wars, which raged almost continuously from 1792 to 1815, would have a profound influence on the course of Irish history, fomenting bitter divisions and engendering the opposing ideologies of Republicanism and Unionism.

The economic booms and busts produced by the decades-long war with France left Ireland’s economy in a perilous state. That, coupled with the aftermath of the disastrous 1798 Rebellion, left many families destitute. For some, the only realistic option for survival lay in the enlistment of a son or a father (or both) into the British army.

Captured rebels were left with even starker choices: the hangman’s noose, transportation, or conscription into the King’s forces. Consequently, by the late 18th century one third of the British army consisted of Irish-born soldiers.

Of course, not all sons of Ireland fought for the British. The French and even German states fielded their own share of first and second-generation Irish regiments, although as many as 40 per cent of these foreign fighters ended up in red coats.

Unlike Britain, which was undergoing an industrial revolution at the turn of the 19th century, Ireland was suffering from massive unemployment. The destitute were disproportionately from the Catholic majority — victims of discrimination legalized by the harsh anti-Catholic penal laws.

Despite this, the British army at Waterloo fielded three predominantly Irish, Gaelic-speaking, and predominantly Catholic regiments: the 27th (Inniskilling) Regiment of Foot, the 6th Inniskilling Dragoons, and the 18th King’s Irish Hussars.

Following his escape from Elba and his hundred-day return to power in 1815, Napoleon Bonaparte had massed an army of 73,000 battle-hardened troops, experienced veterans who were fiercely loyal to their Emperor. Facing him was an Anglo-allied army of 68,000 led by the Irish-born aristocrat Arthur Wellesley, First Duke of Wellington. Wellington led an assembly of troops from Dutch and German states along with 25,000 British regulars. A Prussian army of 50,000, led by the old warhorse Gebhard Leberecht von Blücher, would eventually join the battle at Waterloo and decisively turn the tide for Wellington. Notably, of the 25,000 British troops at Waterloo, only 7,000 had any real battle experience; most of those were infantry, and the majority were Irish.

“The Bravest Man at Waterloo”

The battle began just after 11 a.m. with the French attacking Wellington’s right flank at Hougoumont Farm. Capturing this strategic, high-walled compound would enable Napoleon to outmaneuver Wellington. Recognizing its importance, the British commander reinforced the position with troops from the Coldstream Guards and Scots Guards. Their heroic defense of the impromptu citadel was dramatically depicted on canvas in Robert Gibb’s painting Closing the Gates at Hougoumont. The picture captures the crucial moment when opportunistic French soldiers force open the gates of the compound but are savagely repulsed by British soldiers.

“The success of the battle turned upon the closing of the gates at Hougoumont,” Wellington would write. And not surprisingly, an Irish soldier played a key role in the storied moment. Corporal James Graham of County Monaghan was instrumental in this action, sliding the crossbeam into place to secure the gate once it was pushed shut. Some years later Graham received a substantial reward for his contribution: a nomination by Wellington as “the bravest man at Waterloo,” in recognition of his courage and of his saving the life of the garrison commander, Lieutenant Colonel James MacDonnell of Glengarry.

“The Regiment that Saved the Center”

The 27th (Inniskilling) Regiment of Foot had occupied a key strategic position on the forward slope of a ridge in the center of Wellington’s line. For two seemingly eternal hours, the 27th, consisting of 747 infantrymen, endured continual sniping from French sharpshooters and pounding from enemy artillery. Despite this, the line still held.

As the battle progressed, the regiment formed squares to repulse the onslaught of the French cavalry. Comrades, brothers, cousins fell. But with extraordinary courage and amazing resolve, the 27th still held.

By this remarkable display of self-sacrifice, the 27th had given Wellington a most precious gift: time. Without it, the Anglo-allied line might have collapsed before the decisive arrival of Blücher’s Prussians.

“They Don’t Know When They are Beaten”

The 27th held out, blocking the road to Brussels and a likely French victory. The Inniskillings had suffered more than 50 percent casualties. Only two other regiments that day, both Scottish, would take such appalling losses.

“That regiment with the castles on their caps is composed of the most obstinate mules I ever saw,” Napoleon famously remarked about the Inniskillings. “They don’t know when they are beaten.”

But of course, help for the 27th, and the entire Anglo-allied army at Waterloo, was on the way.

On the afternoon of June 18, Blücher was marching towards the fighting at Waterloo, seeking bloody revenge for his defeat at Ligny two days earlier. The Prussians’ arrival would tip the balance.

Despite Wellington infamously declaring that his redcoats were “the scum of the earth,” he would graciously concede in 1829, when the Catholic Emancipation Bill was put before the House of Lords, that “It was mainly due to the Irish Catholic that we (the British) owe our pre-eminence in our military career.” Wellington’s support for Catholic emancipation would even lead to his participation in an 1829 duel with the rabidly anti-Catholic Earl of Winchelsea.

On November 23, 1918, Irish soldiers would take to the field at Waterloo yet again.

After more than four years of unimaginable horror in the trenches of the Western Front, the 2nd Leinsters were the first British army regiment to march across the battlefield of Waterloo since 1815.

Leading the outfit were the pipers. My grandfather Peter Farrell of Newstone, Drumconrath, County Meath, was, I’m proud to say, one of them.
