The COVID Era is the Latest Episode of Medical Scapegoating of Asian Immigrants

"A Statue for Our Harbor", George Frederick Keller, San Francisco Illustrated Wasp, November 1881

 

 

Since 2020, Asian Americans in the United States have experienced dual existential crises: anti-Asian violence and COVID-19. According to Stop AAPI Hate, nearly 11,500 hate incidents were reported to the organization between March 19, 2020, and March 31, 2022. While the uptick in this violence has been connected to present-day coronavirus-related racism and xenophobia, anti-Asian violence and the association of Asian bodies with disease are not new. Rather, they have a history as old as the earliest mass migration of Asians to the United States. Furthermore, a closer look at this history illuminates a distinctive kind of immigrant health paradox. Although Asian immigrants have historically contributed to the nation’s health as farmers and harvesters of fresh produce and as frontline health professionals, they are among the targets of anti-Asian violence.

The Medical Scapegoating of Asian Americans

As the numbers of Chinese arrivals in San Francisco increased after the discovery of gold in northern California in 1848, the stigmatization of Chinese people as a weak and inferior race, combined with perceptions of them as foreign economic competitors, fueled anti-Chinese sentiment in the region and the state. Tragically, the concurrent development of San Francisco’s public health institutions in the second half of the nineteenth century furthered anti-Chinese sentiment through municipal reports that blamed Chinese immigrants for smallpox outbreaks. Public health officials instituted measures--quarantine, physical examination, the fumigation of clothing and baggage--that targeted Chinese arrivals at the city’s port. Such medical scapegoating extended to ethnic enclaves. San Francisco Board of Health Annual Reports from 1876 to 1877 referred to the city’s Chinatown as a “moral and social plague spot” and a region “contaminating the atmosphere.”

Print culture spread anti-Chinese, medicalized horror stories to the general public. An 1881 political cartoon, entitled “A Statue For Our Harbor,” depicted the Statue of Liberty as a Chinese laborer wearing tattered clothing, a human skull at his foot, and an opium pipe in his hand. His queue, or traditional ponytail, was likened to a slithering snake, while a rat tail peeked out from behind the human skull. This classic representation of “yellow peril” was highlighted by the capitalized words that emanated from the Chinese laborer’s head in lieu of Lady Liberty’s crown: FILTH, IMMORALITY, DISEASES, RUIN TO WHITE LABOR.

European settlers had also spread smallpox and other diseases, yet the belief in Western medical superiority contributed to the popular nineteenth-century idea of manifest destiny, the divine right of the United States to expand westward across the continent and the Pacific Ocean and into the Philippines. Although Filipino nationalists had been fighting for their independence from over three centuries of Spanish rule, ideas that linked Filipino bodies to unsanitary practices and diseases, such as leprosy, justified U.S. colonization of the archipelago and its “benevolent assimilation” policies, which included Americanized nursing training.

Concurrently, Japan was emerging as a global power through imperialism, but that did not prevent the linkage of Japanese immigrant bodies in California to diseases, such as typhoid. In 1910, when a U.S. Public Health Service physician found that many Indian arrivals at Angel Island’s Quarantine Station had hookworm, the threat of disease became grounds for the movement to exclude them from entering the United States.

Asians and Asian Americans protested medical scapegoating in multiple ways. Some rejected the purported supremacy of Western medicine and refused to partake in American medical practices. Others filed U.S. federal lawsuits, criticizing the unequal administration of public health-related laws. A Chinese immigrant detainee at Angel Island Immigration Station wrote a poem on a barrack wall: “I thoroughly hate the barbarians because they do not respect justice.... They examine for hookworms and practice hundreds of despotic acts.”

They understood that at stake was not solely their livelihoods, but also their lives. Their denigration as filth, immorality, disease, and ruin to white labor dehumanized them and made them targets of violence.

Histories of Anti-Asian Violence

Like medical scapegoating, anti-Asian violence permeated and linked the experience of diverse groups of Asians in the United States beginning in the second half of the nineteenth century. In the 1885 Rock Springs Massacre in Wyoming, 28 Chinese workers were killed, their homes and bunkhouses set on fire. Historian Beth Lew-Williams notes that in 1885 and 1886, over 168 communities in the U.S. West expelled their Chinese residents, united in their vehemence that “the Chinese must go.”

Animosity linked to economic and sexual competition was intense. In 1907, a white mob of 500 working men expelled Indian migrant workers from Bellingham, Washington. They threw rocks and indiscriminately beat people. An angry mob of white workers attacked Korean farm workers in Hemet Valley, California, in 1913. Anti-Filipino riots took place in Exeter and Watsonville, California, in the 1920s and 1930s. White mobs roamed through Filipino agricultural labor camps, beating workers, smashing cars, and burning down bunkhouses.

International relations, most notably war, triggered anti-Asian violence. During World War II, Japanese Americans were racialized as an enemy of their own country. Their homes and businesses were targeted for arson, shootings, and vandalism. Beginning in 1942, approximately 120,000 Japanese Americans were forcibly relocated and incarcerated in remote internment camps across the United States without due process. Even though Asian American men served in the US armed forces and Asian American women worked as Rosie the Riveters, non-Japanese Asian Americans were racially lumped together with the Japanese. They feared going out at night and some were beaten even in broad daylight.

The passage of the Immigration and Nationality Act of 1965 and U.S. involvement in the Vietnam War resulted in new mass migrations of highly educated, professional Asians as well as diverse waves of Vietnamese, Hmong, Laotian, and Cambodian refugees. Advanced educations and alliances with U.S. military forces did not protect them from racial violence. In 2017, Indian-born engineer Srinivas Kuchibhotla was fatally shot by a man with a semiautomatic pistol. The man yelled, “Get out of my country!” before opening fire.

Thus, despite their many differences in national origins, languages, faiths, generations, and socio-economic status, and despite their longstanding and multigenerational presence in the US, histories of anti-Asian hate and violence in the United States have woven the fates of Asian Americans together. These histories seep into our present.

In 2020, a threatening note was taped on a Hmong American couple’s door in Woodbury, Minnesota: “We’re watching you f------ c----- take the Chinese virus back to China. We don’t want you hear infecting us with your disease!!!!!!!!!!”

An Immigrant Health Paradox

A closer look at Asian American histories of immigration and violence illuminates a distinctive immigrant health paradox. Although Asian Americans’ labor as farmers and harvesters of fresh produce and as healthcare professionals has contributed to healthier American foodways and to US healthcare delivery, Asian Americans continue to be the targets of medical scapegoating.

Beginning in the late 1860s, Chinese workers transformed tens of thousands of acres of California swampland into arable land. The ingenuity of Chinese horticulturalists, such as Ah Bing and Lue Gim Gong, contributed to the popular Bing cherry and cold-resistant citrus fruits.

By 1909, more than 30,000 Japanese were tenant farmers or farm laborers in California. They produced 70 percent of California’s strawberries, and grew the majority of the state’s snap beans, tomatoes, spring and summer celery, onions, and green peas, fulfilling the increasing demands for fresh produce in the cities. During this time period, many Indian immigrants also worked in farming, growing lettuce and beets. Among them was Dalip Singh Saund, who in 1956 became the first person of Asian descent elected to serve as a U.S. Representative, championing the farmers of his southern California district.

The mostly Filipino young men who came to the United States in the tens of thousands in the 1920s and 1930s labored as migrant agricultural workers. They followed the crops from California to the Pacific Northwest, harvesting grapes, onions, tomatoes, asparagus, potatoes, peaches, lettuce, sugar beets, celery and more. Anti-Filipino violence, meager wages, and poor working conditions contributed to their labor militancy, which culminated in Larry Itliong’s leadership and Filipino American farmworkers’ initiation of the Grape Strike in Delano, California, in 1965.

As a consequence of their Americanized nursing training, over 150,000 Filipino nurses have immigrated to the United States since the 1960s. They have cared for the most vulnerable Americans primarily at the bedside in acute and long-term patient care. Partly as a result of their direct exposure, they have suffered a disproportionate toll from COVID-19. Yet, they and other Asian American healthcare professionals are among the targets of present-day anti-Asian hate and violence.

Why does this violence happen over and over again? One root cause is ignorance of Asian American history, including both the long-standing tragedy of anti-Asian scapegoating and Asian American contributions to American health and health care delivery specifically, and to the nation’s culture, economy, and government more broadly. Thus, recent changes in K-12 education in Illinois, New Jersey, and Connecticut to include Asian American and Pacific Islander histories in schools are important and hopeful steps. We cannot begin to change what we do not know.

A Writer Reflects on Four Enlightening and Challenging Lunches with the Father of Black Liberation Theology

Dr. James H. Cone, Union Theological Seminary, 1969

 

 

The summer of 2013 proved especially consequential for me, occasioned by a series of four private lunches with Dr. James H. Cone, author of Black Theology & Black Power, a book that has, since its release, carried the distinction of being “the founding text of Black liberation theology.” To Cone, long-time distinguished professor at Union Theological Seminary, the gospel of Christianity had been hijacked and distorted by “white, Euro-American values.”

I have come to realize the conversations I experienced that summer with Cone – just the two of us, Black and white, one on one – constituted an essential segment of my “white reckoning,” a moment in time when my racial past, impressions, and ideas came under forceful examination by a newfound and sophisticated friend, whose distant Arkansas background and my own were similar, although we viewed the subjects we explored from historically obverse realities. After all, the two of us spent our youths – I was the younger – in small towns in south Arkansas around the same time with only six years in age and a relatively short distance of fifty miles separating us.

The thoughts rushed unbridled and unmeasured into consciousness as my wife, Freda, and I sat at the funeral for Cone on Monday, May 7, 2018, nearly five years after the intense discussions he and I shared over that summer of 2013. Seating capacity at Riverside Church on the upper west side of Manhattan is just a little over 2,000. From the vantage point of the pew we occupied that day, Riverside Church overflowed with a scattering of whites amid an ocean of African-Americans. Listening to nearby conversations, I realized many attendees traveled long distances to arrive at the ceremony honoring this controversial but seminal figure of philosophical and theological importance. As the funeral progressed, I felt a thrill about the degree of respect and acceptance his Black liberation theology had obviously gained among Black people across the country.

Thrilled. I cannot think of a better word to express my response to the crowd’s puissant reaction to Cone’s views of Black life in the United States. Eulogies were plentiful that day from prominent leaders in America’s Black churches and liberation theological circles, such as Cone’s friend Dr. Cornel West, former student Raphael Warnock (then the senior minister at Martin Luther King’s Ebenezer Baptist Church in Atlanta, who would later, in 2021, be elected Georgia’s first African-American United States Senator), and Dr. Kelly Brown Douglas, Cone protégé, prominent author, and Dean of the Episcopal Divinity School at Union Theological Seminary.

At least for this white author, my prospective book cried out for the voice of James Cone. While I had the advantage of a first-hand racial reckoning with him, asking questions and probing many answers, Cone’s writings called whites to task for their beliefs and behavior toward Blacks and Black liberation. Like Malcolm X’s, his was a voice that would go unheard at the nation’s peril.

My long 2013 article on the Elaine Race Massacre had been published in a national literary periodical, a copy of which I provided to Cone, who wanted to know more about the Elaine conflagration than my piece conveyed. He and I inherited common knowledge about the region where the massacre happened and the associated racial rituals, practices, and oppression that occurred there. We each recognized that so much about the Massacre was not unique for south Arkansas – in 1919, only the slightest degree separated a mass murder of African-Americans from a single lynching.

A geographic commonality brought us together, as he had shown considerable interest in learning much more about the Elaine Race Massacre, leading to our first luncheon in mid-June; the lunches spread over the summer, ending in late August. I was familiar with Cone’s work, having read some of his theological writings. He knew little of me but for my authorship of the Massacre piece and the biographical highlights that accompanied it, unless he looked for more elsewhere.

James Hal Cone was born August 5, 1938, in Fordyce, Arkansas, then a town of about 3,400 persons a little over forty miles northwest of my hometown of Monticello via a two-lane road – much of it would have been unpaved at that time. Eighty years ago, these two communities were about the same size. Located in the timberlands of rural south Arkansas, Fordyce’s only claim to fame during the 20th century rested on the municipality being the hometown of Paul “Bear” Bryant, legendary football coach of the University of Alabama Crimson Tide. Actually, Cone spent his youth in the smaller community of Bearden, some fourteen miles southwest of Fordyce, with a population of less than 1,000.

During the course of the 2013 summer, he recited for me a variety of stories about growing up in aggressively segregated south Arkansas and, more particularly, in Bearden. One such story dealt with watching a Black man being pistol-whipped in town at a four-way crossing by a local white law enforcement officer; apparently, the policeman thought the man had been too slow in accelerating his vehicle. Such gratuitous and arbitrary acts of violence, affront, and unfairness perpetrated by whites in and around Bearden wore on Cone for the rest of his life. He often invoked his parents’ relevance and influence, sometimes summoning his father’s name, “Charlie,” seemingly to give himself supplementary insights and additional fortitude to confront a moment of dilemma, uncertainty, or past pain.

From the outset, Cone brought to each of our lunches, as a gift, a different, personally inscribed book he had written, and from the very beginning of our conversations, I was struck by the eagerness, curiosity, and transparency of this man in his mid-70s. Throughout the summer, stories of Cone’s life in Arkansas, including the years he spent in Little Rock studying at Shorter College and Philander Smith College before moving on to receive his doctorate from Northwestern University, would stream from him unencumbered. While a student in Arkansas, he held a job as chauffeur for a prominent Little Rock businessman, with the “n-word” freely employed by his employer’s associates and colleagues from the backseat of the automobile. Recounting these stories of being a chauffeur, Cone still remained incensed at the ignominy of having to don the obligatory driver’s cap as part of his job.

Cone insisted on learning as much as possible about me: What was it like being white and teaching in the all African-American school in Monticello before integration, and what was the response then to my efforts from the local communities, both Black and white? How did my family react to my views and actions? How did I come to read so much Dietrich Bonhoeffer? How did my views about race develop to differ so tellingly from those of whites in south Arkansas, particularly from my own family? What did I think of James Baldwin and Malcolm X? Did the Episcopal Church make any reparations in connection with its apology and national day of repentance for the Church’s role in transatlantic slavery, using a litany which I wrote at the Church’s request? How and why did I become so committed to the first physical memorial to the Elaine Race Massacre? Posing a deluge of weighty, sometimes intimidating questions, he was continuously and implacably inquisitive, this renowned professor at prestigious Union Theological Seminary, where he had taught since 1969, this man of profound humility, who, notwithstanding his enviable oeuvre of remarkable compositions, complained about his writing skills and the great difficulty he faced putting word after word on paper.

He freely described and discussed the myriad of crucial subjects that occupied his focus and ruminations, including the steps and circumstances that brought him to Black liberation theology; his regular criticism of Reinhold Niebuhr’s neglect of Black plight in Detroit and New York City, two cities where Niebuhr had been quite active; our mutual attention to the Detroit riots, the City of Detroit, and surrounding Wayne County, Michigan (two local governments I served as a consultant earlier in my life and career); his belief that Black people habitually hide their more visceral comments about white people; the Million Man March; his view that white subjugation of African-Americans thrusts a higher burden of original sin on white folks; and our mutual recall of the integration of Little Rock Central High School, which happened in 1957 as I became a teenager and Cone was nineteen years old.

Our final lunch in late August 2013 proved to be the least satisfying – for us both, I believe. The conversation started very uncharacteristically, with personal recriminations by Cone. He accused me, along with whites generally, of not paying enough attention to Malcolm X. He additionally upbraided me, again castigating whites as a group, for not understanding Black circumstances and attitudes. Shortly into this unexpected jeremiad, I received a telephone call notifying me that my wife unexpectedly needed to go to the hospital, and I should meet her there as soon as possible. Cone’s demeanor shifted instantly and dramatically, demonstrating much concern and sympathy, but I needed to dash. So, our relationship ended quite abruptly and unsatisfactorily. He exhibited irritation with me for reasons that are still baffling, and I, in turn, felt offended at his aggression. We never reached out to each other again.

As a result of these summertime lunches with Dr. James H. Cone, I have often pondered a conundrum I have never been fully able to answer – that is, why did he desire to continue many lengthy discussions with me? We never really had a detailed agenda for any one of our talks. Until the very end, each of our meetings carried the redolent purpose of friends meeting for no reason other than to share notable experiences and personal propositions. After thinking about this question for years, I have resolved that I probably represented an opportunity for Cone to enter into a previously unrealized conversation he envisioned with a white person from his past who would willingly acknowledge and comprehend the life in Bearden and south Arkansas that Cone endured and overcame in the 1940s and 1950s. Is it wrong of me to surmise that four summer lunches in 2013 created a retrospective for James Cone that brought his own history and views into clearer focus against a backdrop of passage with a white man?

Healing a Divided Nation

The Army of the Potomac's recently formed Ambulance Corps drills in 1864

 

 

 

The following is excerpted with permission of Pegasus Books from Carole Adrienne's Healing a Divided Nation: How the American Civil War Revolutionized Western Medicine (August 9, 2022). 

 

The Civil War marked the beginning of modern advancements in medicine that were generated in response to the new weapons technologies that created a wholesale mechanical slaughter.

 

The medical community of the Civil War achieved an outstanding record for survival rates from disease and wounds. They designed, built, and operated revolutionary new hospitals. They served as the medical directors of huge armies and completely reorganized the medical corps. They initiated programs and research and left a legacy of skill and honor.

 

Organized, systemized medical care did not exist in the America of the 1860s. Skilled nursing as a profession or a staff position did not exist. Methods of getting wounded men from the battlefield to a place of care were haphazard at best and nonexistent at worst. There were no large-scale treatment facilities, and surgery was rarely performed in the country. By the end of the war, there were great hospitals like Chimborazo and Satterlee, hospital trains and ships, skilled nurses, and a working ambulance corps.

 

The Civil War doctors embraced a practical approach to medicine, setting up new systems and methods, sometimes learning surgical techniques in camps and hospitals from the diagrams in books. Many of the physicians and surgeons were recent medical school graduates with no practical experience, but they were able to share ideas and information with their more experienced colleagues. They quickly implemented the discoveries of other men, and what they hadn’t learned in the medical schools, they learned on battlefields and in field hospitals. The more immediate the care, the greater the likelihood of survival.

 

Many more lives were saved than was possible in earlier wars, and many lives were saved later because of knowledge gained during the Civil War. Both the Confederate and Union medical departments exercised good, solid, logical organization and changed the vista of health care. The war trained thousands of surgeons at a time when there were very few doctors in America who knew how to treat gunshot wounds. The technology of the time also gave them options that had not been available in earlier wars: surgical tools, anesthesia, and improved conveyances for the wounded.

 

The Civil War changed the long-held tradition that government did not have responsibility for the health of the individual soldiers. Before the war, no one who was injured had expected to be nursed by a trained professional, had envisioned that hospitals would be clean, or counted on the administration of anesthesia before surgery. The arrival of an ambulance was not anticipated by anyone who was wounded either in war or in peacetime, but by the end of the Civil War, expectations on the part of the patient, the military, and civilians had changed forever. Confederate General E. Porter Alexander of Georgia looked back on the medical outcome of the terrible conflict.

Was all our blood shed in vain? Was all the agony endured for the Lost Cause but as water spilled upon the sand? No! A thousand times, no!

We have set the world record for devotion to a cause. We have taught the armies of the world the casualties to be endured in battle; and the qualities of heart and soul developed both in our women and men, and in the furnace of our afflictions, have made a worthier race, and have already borne rich reward in the building up of our country.

 

These medical departments gained great insights and understanding from the horrific carnage. They grouped patients with similar injuries and made revolutionary observations. They worked with astonishing efficiency, saving lives by getting the wounded off the battlefields more quickly and transported to hospitals faster. They created centers of medicine that did not exist before the Civil War. They changed the substance of health care in America.

 

Dr. Robert D. Hicks noted that

Before the war, an M.D. was someone who attended a year of medical lectures and then repeated them for a second year. The war created a new process with measurable standards. A military doctor not only had to graduate with an M.D. and show apprenticeship to an established physician but had to pass an oral and written examination. Before the war, doctors were all M.D.s; after the war, specialisms began in neurology, trauma management, and ophthalmology, for example. Doctors became specialists. Hospitals were no longer just for indigent and dying people. Today, when we see an injured person transported to hospital emergency care via ambulance, we are witnessing Civil War medicine.

 

In retrospect, it seems remarkable that despite primitive surgical conditions and desperate supply shortages on the battlefield, so many of the casualties survived—a victory of both science and spirit. Although the bodies had lain “thick as leaves,” revolutionary changes in health care had emerged from the ashes of the war.

 

Like the killing power of weapons, medical science has soared in many ways. No subsequent war has taken the lives of as many Americans, a result of improved medical education, advanced surgical techniques, the understanding of neurology, and faster evacuation of the wounded. Women and African Americans have achieved prominent roles in medical science and society, and the terms of the Geneva Convention still seek to protect medics and the wounded in times of war. The American Red Cross and the International Committee of the Red Cross have grown to provide emergency relief for a myriad of natural and manmade disasters. These profound human achievements, as well as the record of the hideous carnage, are the legacies of the American Civil War.

 

Editor's Note: Carole Adrienne will be celebrating the launch of her book with a virtual in-conversation event with Robert D. Hicks, William Maul Measey Chairholder at The College of Physicians of Philadelphia, sponsored by the Free Library of Philadelphia’s Social Science and History Department on Tuesday, August 9th at 7:00pm.

Why Should War Criminals Operate with Impunity?

 

 

The issue of alleged Russian war crimes in Ukraine highlights the decades-long reluctance of today’s major military powers to support the International Criminal Court.

In 1998, the International Criminal Court (ICC) was established by an international treaty, the Rome Statute.  Coming into force in 2002 and with 123 nations now parties to it, the treaty provides that the ICC, headquartered at the Hague, may investigate and prosecute individuals for war crimes, genocide, crimes against humanity, and the crime of aggression.  As a court of last resort, the ICC may only initiate proceedings when a country is unwilling or unable to take such action against its nationals or anyone else on its territory.  In addition, although the ICC is authorized to initiate investigations anywhere, it may only try nationals or residents of nations that are parties to the treaty, unless it is authorized to investigate by the nation where the crimes occurred.

The development of a permanent international court dealing with severe violations of human rights has already produced some important results.  Thirty-one criminal cases have been brought before the ICC, resulting, thus far, in ten convictions and four acquittals.  The first ICC conviction occurred in 2012, when a Congolese warlord was found guilty of using conscripted child soldiers in his nation.  In 2020, the ICC began trying a former Islamist militant alleged to have forced hundreds of women into sexual slavery in Mali.  This April, the ICC opened the trial of a militia leader charged with 31 counts of war crimes and crimes against humanity committed in Darfur, Sudan.  Parliamentarians from around the world have lauded “the ICC’s pivotal role in the prevention of atrocities, the fight against impunity, the support for victims’ rights, and the guarantee of long-lasting justice.”

Despite these advances, the ICC faces some serious problems.  Often years after the crimes, it must locate the perpetrators and people willing to testify in their cases.  Furthermore, lacking a police force, it must rely upon national governments, some with a minimal commitment to justice, to capture and surrender suspected criminals for trial. Governments have also occasionally withdrawn from the ICC when angered, as the Philippines did after its president, Rodrigo Duterte, came under investigation.

The ICC’s most serious problem, however, is that 70 nations, including the world’s major military powers, have refused to become parties to the treaty.  The governments of China, India, and Saudi Arabia never signed the Rome Statute.  Although the governments of the United States, Russia, and Israel did sign it, they never ratified it.  Subsequently, in fact, they withdrew their signatures. 

The motive for these holdouts is clear enough.  In 2016, Russian President Vladimir Putin ordered the withdrawal of his nation from the process of joining the ICC.  This action came in response to the ICC’s finding that Russia’s seizure of Crimea amounted to an “ongoing occupation.”  Such a position, said Kremlin spokesman Dmitry Peskov, “contradicts reality,” and the Russian foreign ministry dismissed the court as “one-sided and inefficient.”  Understandably, governments harboring current and future war criminals would rather not face investigations and possible prosecutions. 

The skittishness of the U.S. government toward the ICC is illustrative.  Even as he signed the treaty, President Bill Clinton cited “concerns about significant flaws” in it, notably the inability to “protect US officials from unfounded charges.”  Thus, he did not submit the treaty to the Senate for ratification and recommended that his successor, George W. Bush, continue this policy “until our fundamental concerns are satisfied.”  Bush, in turn, “unsigned” the treaty in 2002, pressured other governments into bilateral agreements that required them to refuse surrender of U.S. nationals to the ICC, and signed the American Servicemembers Protection Act (sometimes called the “Hague Invasion Act”) which authorized the use of military force to liberate any American being held by the ICC. 

Although subsequently the Bush and Obama administrations grew more cooperative with the court, aiding it in the prosecution of African warlords, the Trump administration adopted the most hostile stance toward it yet.  In September 2018, Donald Trump told the UN General Assembly that the United States would provide “no support” to the ICC, which had “no jurisdiction, no legitimacy, and no authority.”  In 2020, the Trump administration imposed economic sanctions and visa restrictions on top ICC officials for any efforts to investigate the actions of U.S. personnel in Afghanistan.

Under the Biden administration, however, U.S. policy swung back toward support.  Soon after taking office, Biden—in line with his more welcoming approach to international institutions—dropped the Trump sanctions against ICC officials.  Then, in the spring of 2022, when the Russian invasion of Ukraine produced widely reported atrocities in the Ukrainian town of Bucha, the U.S. president labeled Putin a “war criminal” and called for a “war crimes trial.”

The ICC was the obvious institution for action.  In March 2022, the U.S. Senate unanimously passed a resolution backing an investigation into Russian war crimes in Ukraine and praising the ICC.  Weeks before this, in fact, the ICC had already opened an investigation.

Even so, it is unclear what the U.S. government can or is willing to do to aid the ICC in Ukraine.  After all, U.S. legislation, still on the books, bars substantial U.S. assistance to the ICC.  Also, Pentagon officials are reportedly opposed to action, based on the U.S. government’s long-time fear that U.S. troops might some day be prosecuted for war crimes.

For their part, Russian officials have claimed that the widely-recognized atrocities were a complete “fake,” a “fabrication,” and a “provocation.”  In Bucha, stated the Russian defense ministry, “not a single local resident has suffered from any violent action.”  Not surprisingly, Russian authorities have refused to cooperate with the ICC investigation.

Isn’t it time for the major military powers to give up the notion that their war criminals should be allowed to operate with impunity?  Isn’t it time these countries joined the ICC?

Who's Afraid of Critical Race Theory?

Harvard Law Student Barack Obama with Prof. Derrick Bell in 1991, during HLS student protests demanding more faculty diversity at the school.

 

 

 

In 2012, as Barack Obama ran for reelection, the right-wing website Breitbart set out to examine his time at Harvard Law School. They were looking for a scandal.  They found a tape of Obama, then a law student, speaking at a political rally.  He praised one of his professors, the controversial African-American scholar Derrick Bell, and hugged him.  Bell, it turned out, was a practitioner of something called Critical Race Theory (CRT), an academic field far less controversial then than it is today.  Guilt by association had been used against Obama before, most notably in regard to his pastor, Rev. Jeremiah Wright in 2008, but this time the attempt fell flat. The media moved on, and CRT ceased to be of public interest for almost ten years.

Under examination by more than thirty states, CRT is now back in the spotlight.  Then, as now, it is more controversial than radical.  For thirty years, CRT has been a school of thought among a small, informal group of professors, mostly in law.  These writers examine disparate topics, but all view the law through the prism of America’s fractured race relations. A Columbia University law professor, Kimberlé Crenshaw, coined the term. For her, CRT was “a way of seeing… the ways that our history has created these [racial] inequalities that now can almost effortlessly be reproduced unless we attend to their existence.”  Examining how our history created racial inequalities remains worthwhile, even if it makes some people uncomfortable.

CRT has been reinvented by Christopher Rufo, a right-wing activist and frequent guest on Fox News. He is proud of his fantasy vision of critical race theory:  “The goal is to have the public read something crazy in the newspaper and immediately think 'critical race theory’.”  Rufo brags about how conservatives have been able to “recodify” the term CRT, but he hasn’t recodified anything, just distorted the truth.   CRT has become a dog whistle, designed to stir up white animus against people of color without using inflammatory language.  The strategy is working for Rufo; thirty state legislatures are considering or have already passed legislation regarding CRT.

CRT is an intellectual movement, yet critiques of its intellectual project have lacked a key element: examination of the views of the professors involved in the movement. It is hard to summarize easily, because it is not a monolith. Nevertheless, to understand CRT it is worth starting with Derrick Bell, the professor who made the news for a day in 2012. If anyone could be described as a founder of CRT, it would be him.  Bell began his career as a civil rights lawyer working for the Justice Department in the 1950s. In 1959, the Justice Department, convinced that his membership in the NAACP would compromise his objectivity, demanded that he resign his membership.  He refused on principle, left the department, and began a new career as a law professor.

Bell’s academic career was built around a critique of mainstream Civil Rights discourse.  In an early article, “Serving Two Masters: Integration Ideals and Client Interests in School Desegregation Litigation,” Bell drew on his formative experience as a lawyer who worked against long odds to win school desegregation after Brown v. Board of Education (1954).   

Litigators had few tools to deal with “massive resistance” to the decision in the 1950s. Private segregationist academies opened in much of the South, and in many places whites essentially opted out of public education.  Even when courts ordered busing in large school districts such as Detroit and Boston, the sheer geographical size of the districts posed obstacles to providing integrated schools, as did white flight to the suburbs. 

Bell’s “Serving Two Masters,” published in the Harvard Law Review in 1976, started with a simple fact. Lawyers are supposed to work for clients, in this case African-Americans whose children attended segregated schools.  But in truth, Civil Rights lawyers served another master:  liberal groups such as the NAACP, who paid their salaries.   Liberals, Bell wrote, were “fixated” on integration.  If African-American parents were asked, Bell suggested, they might say the prospect of achieving “color-blindness” was too distant.  Facing an entrenched system, African-Americans might choose better schools, even if they were segregated. Bell’s argument accepted a false choice between integration and academic excellence. While it would not be fair to equate antiracist lawyers with the “separate but equal” doctrine of the 19th century, it was still a risky argument to make. Bell later admitted he lost friends over the article.

Bell also attracted attention with “Brown vs. the Board of Education and the Interest Convergence Dilemma,” published in the Harvard Law Review in 1980.  Here Bell argued that whites would not support change unless it was in their interest.    This view seemed pessimistic, but in one sense it was prescient: no Democratic presidential candidate has won the majority of white votes since 1964.  At the risk of oversimplifying Bell’s argument, the Supreme Court’s motives for making the Brown decision had almost nothing to do with integration.  The Court did advance the interests of blacks, but only did so to achieve goals desired by whites.  Yes, the Supreme Court chose “racial remedies” in the Brown case.  But the Brown decision happened due to “unspoken and perhaps subconscious judicial conclusions” that would play well globally in the Cold War. In other words, the decision was not about segregation, but about other “interests deemed important by middle- and upper-class whites.”

So what accounted for the victory of Brown?  Three things: the U. S. wanted to win the Cold War by projecting an image of racial harmony; it wished to reward the contributions of black veterans of World War II; and finally, it wanted to remove segregation as an obstacle to building the “sunbelt with all its potential and profit.”  But as the movement to desegregate schools foundered, white interests no longer converged with those of Blacks.  Because of that divide, Bell argued that perhaps the best solution was “the improvement of presently segregated schools as well as the creation or preservation of model black schools.” Here, he again called for something like the “separate but equal” doctrine of the notorious decision in Plessy vs. Ferguson.

In the 1980s, Bell broke even further from the norms of traditional legal scholarship, adopting unconventional approaches to the discipline, including story-telling and science fiction.  The most powerful of his many books in this vein was called, provocatively, Faces at the Bottom of the Well: The Permanence of Racism (1993).

Racism is not timeless, and Bell’s view of racism as permanent does not stand up to historical scrutiny.  Historians such as Winthrop Jordan and Edmund Morgan have illustrated slavery’s crucial role in the emergence of racism. If racism has an identifiable beginning, it can end.  If Blacks can fight for reforms, they can improve their position. In the struggle against slavery, historians, most notably Manisha Sinha, have illustrated how African-Americans led the interracial abolitionist movement.  W. E. B. DuBois effectively portrayed a “general strike” that ended slavery.  Denying the possibility of change denigrates the black crusaders who fought and died to end slavery or to win Civil Rights. 

Bell recoiled from fostering illusions that led to failed struggles.   Yet in academic politics, he never shied from a fight.  Bell advocated for scholars of color at Harvard, and threatened to leave unless the law school hired a black woman.  Out of all the black women practicing law in the United States, Harvard could not hire a single one. Bell was right:  Harvard’s stance can only be described as indifference, perhaps combined with a vision of “merit” that favored white men.  This critique connects to Crenshaw’s coinage “intersectionality,” which has become a buzzword on the academic left. Drawing on her experience as a litigator, Crenshaw drew attention to the plight of black women in cases involving employment. While a summary of her influential work is impossible in one sentence, she examined the fact that Black women suffer from overlapping oppressions, and that their experience differs from that of white women or Black men.

The recent nomination of Supreme Court Justice Ketanji Brown Jackson confirmed Bell’s view of the privilege embedded in the process.  Fox’s Tucker Carlson recently demanded to know her LSAT scores. It is unlikely he would ask for this private and irrelevant information about a white candidate.  While judges are appointed to the Court with an eye to politics, Carlson, with this strange standard, played the race card from the bottom of the deck. Bizarrely, Ted Cruz tried to make Jackson take a position on a children's book called Antiracist Baby, as if all African-American judges were responsible for every book written by a black person. Of course, Jackson was far too wise to take the bait.

In fairness, while Bell viewed the idea of racial equality as a sugar-coated fiction, he never, in his words, “accepted racism.” Bell was a lawyer above all else, and that may have skewed his vision. He worked to implement change through the courts, and discovered that Blacks could not litigate their way to freedom.  Fortunately, as labor activists and community organizers could note, the law is not the only means to change the world.  Indeed, the work of activist scholars informed by Bell’s and Crenshaw’s foundational critiques has brought legal scholarship together with social movements. Notably, Michelle Alexander’s The New Jim Crow has connected legal analysis of the mass incarceration crisis to antiracist, drug decriminalization, and prison reform and abolition movements. Indeed, it seems that the right's demonization of CRT stems from hostility to these political goals far more than to its ideas.

A headline from The Onion captures today’s controversy over CRT:  “Teacher Fired for Breaking State’s Critical Race Theory Laws After Telling Students She’s Black.”  People like Carlson and Rufo targeted CRT, the movement Derrick Bell helped create, out of racism.  But Bell was more complicated than their caricature suggests. He was controversial, but he was no fiery radical, nor even a liberal. It seems he wanted a fairer, more decent capitalism, and was bright enough to recognize that he likely would not get it.

Unfortunately, that does not matter to the right, who are not interested in CRT, but simply in scoring points. This was evident in the bizarre, performative confirmation hearings for Ketanji Brown Jackson, which taught us more about Republican phobias than about the nominee’s judicial ideas. There were so many dispiriting moments in those hearings; for me, watching Ted Cruz debate racist babies with Jackson was especially painful.

In Republican hands, the term CRT conjures up fear of Blacks.  Racism is still weaponized by reactionaries.   Conservatives will play games, and behave in bad faith, as they did at the recent hearing.  In other words, Bell had good reason to despair about the future.

But we should remember what happened when conservatives found the videotape of the young Barack Obama hugging his Harvard law professor.  They took their best shots at Obama and Bell, and they lost. Whatever his flaws as a president, Obama was a masterful politician, and his example should remind us that racism is not absolute or permanent.  It can change because we can change. Contrary to Bell, progressives should be guided by the possibilities for change rather than by our fears.

On Putin's Vacant Moral Imagination

 

 

Moral Imagination? What’s that? Among other things it’s the title of a book and the main essay in that book by David Bromwich. He defines that quality as  “the power that compels us to grant the highest possible reality and the largest conceivable claim to a thought, action, or person that is not our own, and not close to us in any obvious way.” It involves compassion and empathy, love and mercy. It is also a quality that Russia’s leader, Vladimir Putin, sorely lacks.

If he possessed it, he could not possibly continue inflicting all the tragedy; the killing and the maiming; the attacks on homes, schools, and hospitals; all the sorrow and heartbreak that he has rained down on the Ukrainian people for more than 150 days.

Ian McEwan’s novel Black Dogs (1993) captures well the scope of such wartime tragedy when he writes of his main character: “He was struck by the recently concluded war [World War II in Europe] not as a historical, geopolitical fact but as a multiplicity, a near-infinity of private sorrows, as a boundless grief minutely subdivided without diminishment among individuals who covered the continent like dust. . . . For the first time he sensed the scale of the catastrophe in terms of feeling; all those unique and solitary deaths, all that consequent sorrow, unique and solitary too, which had no place in conferences, headlines, history, and which had quietly retired to houses, kitchens, unshared beds, and anguished memories.”

McEwan reminds us that Putin is not the first leader to cause such suffering, that Hitler and others have also caused untold misery and tragedy. Moral imagination, compassion, empathy: these qualities do not usually top the list of those that characterize national rulers. But that does not excuse Putin, nor the Russians who have kept him in power for more than two decades, counting his stint as prime minister, when he continued exercising his authority.

Neither--and this should be made absolutely clear--do NATO missteps justify his aggression. Look, for example, at the recent defense of his war by Sergey Karaganov, a prominent Russian political scientist interviewed by The New York Times’ Serge Schmemann.

Karaganov justifies Russia’s attacks because he says NATO was turning Ukraine into “a spearhead aimed at the heart of Russia. . . . The belligerence against Russia has been rapidly growing since the late 2000s. The conflict was seen as more and more imminent. So probably Moscow decided to pre-empt and to dictate the terms of the conflict. . . . This conflict is not about Ukraine. Her citizens are used as cannon fodder in a war to preserve the failing supremacy of Western elites.”

Karaganov goes on to say that “for Russia this conflict is about preservation . . . the country itself. It could not afford to lose. That is why Russia will win even, hopefully.” This same political scientist, whom Schmemann has interviewed often since Putin came to power, has (like Putin) a negative view of Western democracies: “Taking into consideration the vector of its political, economic and moral development, the further we are from the West, the better it is for us. . . .  The problem of canceling Russian culture, of everything Russian in the West, is the Western problem. Akin to canceling its own history, culture, Christian moral values.”

He sees the “global liberal imperialism imposed by the United States,” which has attempted to include Ukraine in its imperialistic outreach, as collapsing and being replaced by a “movement toward a much fairer and freer world of multipolarity and multiplicity of civilizations and cultures.” One of the centers of this new world will be Russia, “playing its natural role of civilization of civilizations.”

In Russia itself, Karaganov sees a “bright spot” amid the present “belligerent Western policies” toward Russia: they “are cleaning our society, our elites, of the remains of pro-Western elements.” Yet, despite this, he thinks Russia “could remain one of the few places that will preserve the treasure of the European, Western culture and spiritual values.”

All of this is nothing new. We see similar views among nineteenth-century Russian Slavophiles and Russophiles regarding the moral decline of the West, its antipathy toward Russia, and the belief that Russia will “preserve” the best of Western values. We also see this same type of thinking in some, but certainly not all, of the writings and comments of Alexander Solzhenitsyn, whom Putin once met and praised.

In response to Schmemann’s final question about Russia’s goals regarding Ukraine, Karaganov basically parroted Russia’s most recent declarations: “The minimum is the liberation from the Kievan regime of Donbas, which is in its final stages, and then of southern and eastern Ukraine. Then, Russia’s aim should probably be that the territory left under Kievan control will be neutral and fully demilitarized.”

Almost all of the above views mirror those of Putin, who about a year ago also detailed his views of Ukraine in a lengthy essay.

Not all of the preceding Russian justifications are complete nonsense.  Some prominent Western experts on Russia and former U. S. officials believe it was foolish to encourage Ukrainian NATO hopes. Some aspects of U. S. mass culture are morally questionable, etc., etc., etc.  But none of that justifies the Russian invasion of Ukraine and the innumerable tragedies that it has caused. No way. If Putin had sufficient moral imagination, the assault would not have occurred.

As an example of that quality Bromwich cites Martin Luther King’s “A Time to Break Silence,” a 1967  anti-Vietnam-War speech delivered at New York’s Riverside Church. Here are some of the words that the author quotes:

What do the [Vietnamese] peasants think . . . as we test out our latest weapons on them. . . . We have destroyed their two most cherished institutions: the family and the village. We have destroyed their land and their crops. We have cooperated in the crushing--in the crushing of the nation's only non-Communist revolutionary political force, the unified Buddhist Church. We have supported the enemies of the peasants of Saigon. We have corrupted their women and children and killed their men.

Now there is little left to build on, save bitterness. Soon, the only solid--solid physical foundations remaining will be found at our military bases and in the concrete of the concentration camps we call "fortified hamlets." The peasants may well wonder if we plan to build our new Vietnam on such grounds as these. Could we blame them for such thoughts? We must speak for them and raise the questions they cannot raise. These, too, are our brothers.

A year after King’s speech, Kentucky writer Wendell Berry in “A Statement against the War in Vietnam” summed up King’s sentiment when he stated, “We have been led to our present shameful behavior in Vietnam by this failure of imagination, this failure to perceive a relation between our ideals and our lives.”

But it might be argued that the roles and responsibilities of a political leader like Putin are different from those of a minister or writer like King or Berry. And as the great German sociologist Max Weber noted as early as 1918, that is certainly true. But this does not mean that leaders don’t need moral imagination or have never displayed it.

When in 1962 John Kennedy (JFK) became aware of Soviet missiles in Cuba, his Joint Chiefs of Staff urged a full-scale invasion of Cuba, but remembering how various powers had stumbled into World War I through "stupidity, individual idiosyncrasies, misunderstandings, and personal complexes of inferiority and grandeur," he resisted invading.

Alarmed by how close the U. S. and the U.S.S.R. had come to war in October, JFK challenged graduates at an American University commencement in June 1963 to imagine a new approach to peace and to the Soviet Union.  Historian Robert Dallek has written that “the speech was one of the great presidential statements of the twentieth century.”

In Putin’s own country (the USSR and now Russia) moral imagination was also exercised by one of Putin’s predecessors, Mikhail Gorbachev, who led the USSR from 1985 until its collapse in 1991. Although he is now poorly thought of by most Russians, it is partly because they blame him for the collapse, which Putin has called “the greatest geopolitical catastrophe of the century.” But most Estonians, Latvians, Lithuanians, Ukrainians, and others who gained their freedom because of the fall of the USSR would not agree to that characterization.

To transform the vast USSR after what Gorbachev referred to as an “era of stagnation” required an imaginative leap, and the new Soviet leader provided it. His domestic policy was summed up by three words: glasnost (openness, less censorship), perestroika (restructuring), and demokratizatsiia (democratization). In foreign policy he urged “new thinking” and indicated that its backbone was the conviction that universal human values were more important than class struggle. This policy was central to ending the Cold War and freeing Eastern Europe from Soviet domination, but many who thought like Putin were obviously not happy with either the domestic or foreign consequences of Gorbachev’s policies.

 

Although there is little doubt that Gorbachev made some mistakes, a good part of Putin’s dissatisfaction stems from his Russian nationalism and narrow-minded background as a KGB officer. In his insightful book Soviet Civilization (1990), exiled Soviet dissident Andrei Sinyavsky writes that when KGB officers interrogated dissidents they often accused them of not being “ours,” that is, of not being loyal Soviet citizens. Sinyavsky also quotes a source who claims that in “the last twenty-five years” she has not known one deputy of the Supreme Soviet (theoretically the government’s main legislative body) who “has shown genuine and radical social initiative.”

 

Put simply, the Soviet system never encouraged moral imagination, especially from KGB officers like Putin. Too bad for the Ukrainians. Really too bad—in fact horrific and tragic. But also too bad for Russians and the rest of the world. Instead of the present horrors occurring in Ukraine, we would have peace, which Kenneth Boulding once said was “ploughing and sowing and reaping and making things . . . and getting married and raising a family and dancing and singing.”

For 38 Years of American History, There Has Been No Vice President

The Nixons celebrate with the Fords after Rep. Gerald Ford's appointment to replace Spiro Agnew as Vice President, 1973

 

 

Throughout much of our nation's history, vice presidents have been neither seen nor heard. In fact, very often there hasn't even been a vice president at all. On December 19, 1974, Nelson Rockefeller was sworn in as the forty-first vice president of the United States. Since that moment, we have had a vice president. What is surprising is that for over thirty-eight years of our country's existence, about twenty percent of our history, we did not have a vice president. Before the passage of the Twenty-Fifth Amendment in 1967, there was no constitutional mechanism for filling a vacancy in the office. In the 19th century, the office was vacant an astonishing twenty-six percent of the time.

Eight presidents have died in office, four by assassins' bullets. Each time this happened, the vice president ascended to the White House, leaving the vice presidency unoccupied. What is often overlooked is that seven vice presidents have also died in office, leaving the post without an occupant for the remainder of the term. Curiously, these seven died within a period of just over one hundred years: George Clinton died on April 20, 1812, and James Sherman died on October 30, 1912, just six days before that year's election. Six times in our history, we have had no vice president for three and a half years, almost seventy-five percent of a presidential term.

The Constitution of the United States, warts and all, is a remarkable and enduring document. It has withstood the test of time and is malleable enough to accommodate airplanes and automobiles, cameras and computers, telegraphs, text messages, and trains. The vice presidency seemed almost an afterthought when the Founding Fathers met in Philadelphia in the sweltering summer of 1787. The office was first discussed after September 4, in the last two weeks of the convention, and the deliberations centered more on the mechanics of presidential elections than on succession. Several prominent delegates, ironically including future vice president Elbridge Gerry, said they were "against having any vice-president." Initially, the runner-up in the presidential election became the vice president. The system worked out fine until the development of political parties, which did not exist in 1787.

A number of adjustments would follow, the first being the Twelfth Amendment. This cleaned up the controversy of the 1800 election, when Aaron Burr, who was intended to be Thomas Jefferson's vice president, tied Jefferson in the Electoral College and entertained accepting the presidency himself if it were offered.

The identity of the next person in line after the vice president has changed three times. The Presidential Succession Act of 1792 placed the President Pro Tempore of the Senate next in line behind the vice president. Surely you remember that office from your high school civics final exam (it is currently held by Senator Patrick Leahy of Vermont). One would have to be a rather serious scholar of American history to know the names of Willie Person Mangum, Lafayette Sabine Foster, John Hay, and John McCormack, all of whom were once a heartbeat from the presidency.

John Tyler became the first "accidental" president in April 1841, when William Henry Harrison became the first president to die in office. There was some controversy over whether Tyler truly became president or was merely "acting" as president, but Tyler insisted he was the chief executive and defiantly returned unopened any letter that addressed him as anything other than "President of the United States." Tyler set a precedent that the others would follow. On February 28, 1844, a constitutional crisis nearly occurred. Tyler was aboard the USS Princeton (fortunately below deck, courting the woman he would soon marry, thirty-three years his junior) when a naval gun exploded on deck, killing six. Had Tyler perished, Senator Mangum would have become the eleventh president of the United States.

Andrew Johnson became president in 1865 after the assassination of Abraham Lincoln. Had Booth co-conspirator George Atzerodt not drunk away his courage at the bar of the hotel where he and Johnson were both staying, and instead carried out his assignment to murder Johnson, Lafayette Sabine Foster would have been our eighteenth president. Johnson, the first president to be impeached, kept his job by just one vote in his 1868 Senate trial. Had one vote switched, Benjamin Wade would have become "acting president."

Eighteen years later, Congress rewrote the earlier law and passed the Presidential Succession Act of 1886, which made the Secretary of State third in line. The change came because when President James Garfield died in 1881, and again when Vice President Thomas Hendricks died in 1885, there was neither a president pro tempore nor a speaker of the house. One problem was that Congress was rarely in session back then; in 1885, it technically sat for just seventy-six days. President Chester Arthur, Garfield's successor (1881-1885), was diagnosed with Bright's disease early in his presidency, a condition that would take his life just twenty months after he left office. These men were not immortal, particularly with 19th-century medicine.

Theodore Roosevelt became the youngest chief executive in September 1901, when William McKinley succumbed to a gunshot wound inflicted eight days earlier. A bit less than a year later, Roosevelt was involved in a carriage/trolley collision in Pittsfield, Massachusetts, in which William "Big Bill" Craig became the first Secret Service member killed in the line of duty. The hyperactive president ignored his injuries, and a few weeks later, while he was campaigning in Indiana, his leg became infected and required surgery. Newspapers called the medical intervention a brush with death. Had TR died of the pre-penicillin infection, Secretary of State John Hay would have become the twenty-seventh president. Hay was not only better known than Mangum, Foster, and Wade, but also more qualified, having served as one of Lincoln's private secretaries and in other posts before heading the State Department.

The Presidential Succession Act of 1947 altered the order again, making the Speaker of the House next in line after the vice president, as remains the case to this day (the Twenty-Fifth Amendment did not change that order). President Harry S. Truman, who succeeded Franklin Roosevelt upon his death, thought that someone elected, not appointed, should be next in the line of succession.

When John F. Kennedy was assassinated on November 22, 1963, Vice President Lyndon Johnson's car followed close behind. Kennedy was rushed to Dallas's Parkland Hospital, and when Johnson arrived, the heart attack survivor (1955) was seen with his hand on his heart (any chest pains may have been from the force of Secret Service agent Rufus Youngblood's pushing LBJ to the floor of his car after hearing shots). Had Johnson suffered another heart attack and died (he would not survive one in January 1973, at just sixty-four), seventy-one-year-old Speaker John McCormack would have become the thirty-sixth or thirty-seventh president, depending on whether Johnson succumbed before or after being sworn in himself.

If anything, the Twenty-Fifth Amendment was long overdue. History makes one wonder what took so long, considering how often the country lacked a second in command. The slow death of James Garfield in 1881 over eighty days (from being shot, and from incompetent doctoring), Woodrow Wilson's incapacitation by stroke for the last eighteen months of his presidency, and Dwight Eisenhower's multiple health problems (heart attack, intestinal surgery, minor stroke) all underscored the need, compounded by the fact that fifteen presidents and vice presidents had died between 1812 and 1963. For a combined thirty-eight years, we did not have a vice president.

Fortunately, that amendment came in handy just six years later, when Vice President Spiro Agnew resigned. President Nixon was able to nominate a successor, Congressman Gerald Ford of Michigan, who was confirmed by Congress. Eight months later, when Nixon resigned, Ford became the thirty-eighth president of the United States, and the only person to hold that office without ever being elected president or vice president by the American people. He then nominated former New York Governor Nelson Rockefeller as his vice president.

Jimmy Carter and his vice president, Walter Mondale, changed the nature of the office, giving Mondale far more responsibility and inclusion in decision-making. Richard Cheney was undeniably one of the most influential vice presidents. Mike Pence has certainly been in the news since the Trump presidency, for better or worse, and isn't hiding his own ambitions to occupy the Oval Office. Kamala Harris, finally, became the first woman and the first Black and Asian American person to serve as vice president.

One of the vice presidential duties the Constitution does stipulate is breaking a tie vote in the Senate (Article I, Section 3, Clause 4). This has happened 291 times in history through May 12, 2022. With the current 50-50 split in the Senate, Vice President Kamala Harris has been performing this duty frequently, twenty-three times already. When Georgia’s Democratic candidates, Jon Ossoff and Raphael Warnock, won their runoff elections, the Senate was tied for the fourth time in history, something that previously occurred in 2000, 1954, and 1881.

A Primary Source Shows the Connection Between 1920s Flappers and Social Media Youth Organizers Today

 

 

In the aftermath of the police killing of George Floyd, there were heavily contested debates about police reform. Sadly, those debates subsided after the trial and conviction of the police officer who was mainly responsible for Floyd's death. However, after President Biden's State of the Union address and the increase in mass shootings, there has been a resurgence in the dialogue concerning policing. While some advocates of defunding the police genuinely seek to abolish it, most use "defund" as shorthand for reallocating some funds to social services and restorative justice initiatives. While the concept of social work as crime prevention is widely understood to date back to the late 19th-century settlement house movement, it has also had some unexpected advocates, including a number of the flapper intelligentsia.

While flappers and intelligentsia are rarely mentioned in the same sentence, as the word flapper is normally associated with youthful (and stereotypically female) frivolity, that perception could and should be challenged. In contemporary society, one could substitute social media influencers for flappers.

Born out of the nadir of the First Wave of Feminism, flappers were a natural result of women earning certain social freedoms, including the right to vote, combined with the increased leisure time and extended youth that came out of the Industrial Revolution. A well-known example of a flapper is Daisy Buchanan in The Great Gatsby; for movie-goers, either Mia Farrow's (1974) or Carey Mulligan's (2013) portrayal of her captures the stereotype. The Great Gatsby's other female protagonist, Jordan Baker, represents another, often-overlooked figure: the sporty flapper, a type that a contemporary magazine called Experience, aimed at an audience of flappers, heavily reported on and promoted. While flappers like Daisy would have been common, evidence suggests that some were wiser, more serious, and more interested in community solutions than their elders might have given them credit for. The following is one small example of why.

Tucked within the pages of Experience is an article about police reform. Often characterized by critics as a fashion magazine, Experience in 1923 published an article titled "Turn the Police into Social Workers." The article's author boldly declares in the opening lines: "Turn the Policeman into a philosopher! Replace the club with the admonishing finger! Make the star a symbol of protection rather than persecution!" Some may ask why these flappers would care about youth crime. Whom do you think the police were targeting but the brothers, boyfriends, and everyday acquaintances of these "frivolous" flappers?

The article continues with a quote from Chicago Police Sergeant Thomas Ryan, saying that we should make the "policeman a salesman for crime prevention, and you will keep 85 percent of the wayward youths from committing felonies, you will save millions of dollars to taxpayers, and you will also save parents from the agony of suffering, especially the poor mothers, who have to work to support the younger children."

While policing was an issue then, the role of the police has only expanded since. Spending on municipal police departments has nearly tripled since the 1970s, while spending on social programs clearly has not, meaning that police officers often fill those gaps too. In fact, according to the Washington Post, more than 1 in 5 people who died at the hands of police had some form of mental illness. A prescient sense of how to prevent this tragedy can be found in the relatively small two-page article written by some "inconsequential" flappers.

Most of the article is an interview that Experience conducted with Sergeant Ryan. He notes that youths under twenty-one are the ones most often in trouble and refers to these years as the "danger period," though the term "formative years" could also work. After introducing Ryan, the article tells a few anecdotes about encounters between the police and mostly young adult males that resulted "in reducing jail congestion." The strategy Ryan presented is simple: first get the person "in a listening attitude, then in a receptive mood, and finally in a position where he acts on conviction" and avoids youthful indiscretions. Sergeant Ryan goes on to say that "there is no joy that can come to a policeman on the beat greater than that arising from the consciousness that has helped a neglected youth to grow up into a good citizen." Kinder and gentler policing is often brought up in today's debate, making this quote ring true for current efforts.

The article then details how Ryan, himself a troubled orphan, now devotes himself to the construction of a "Crime Prevention Bureau" to help the "average wayward boy." Through guidance, states Ryan, troubled youth will learn that "labor, suffering, and trouble stir the depths of our being and bring out that which is best in us. Afflictions teach us that which we never before understood." While this is a bit of an overstatement and maybe even wishful thinking, it does reflect some common American ideals. Ryan then suggests that these ideals were never taught to these youths because they were not taught to their parents; in essence, he is describing a poverty cycle and arguing that properly trained officers could fill this void.

The article concludes with Ryan calling for assistance from “women’s clubs, civic leaders, and organizations for social betterment” to involve the average citizen, the city council, and beyond. These recommendations carried more nuance than much of the contemporary discussion of policing and crime. Could this be a strategy for the future?

Some cities have tried similar strategies. In 2016, the Alexandria, Kentucky police department hired a social worker to assist the police in nonviolent calls and follow-ups. The town’s police chief, Lucas Cooper, argues that using social workers eases police work and stress load by reducing repeat calls. According to Cooper, social workers “bring a different skillset to the table,” which helps to “fill in a lot of gaps.” It also seems logical that if police officers are less stressed, the frequency of accidental shootings should go down.

A multifaceted approach to solving the problems that lead to criminal activity will always be more effective than the one-size-fits-all approaches often suggested by policymakers and politicians alike.

So why would flappers care so much about this issue? Just as today, youths frequently take the brunt of societal problems, and young males (often from minority groups) take the largest hit when it comes to policing. Flappers then, like young people now, were often discredited by society. The parallels between those youths and their ideas and today's social media influencers, political protesters, and other activists are uncanny. A major lesson could be learned: older generations of political leaders should not dismiss the wisdom and knowledge within youth communities. It is often their "inexperience" that allows them to come up with fresh solutions to problems that have plagued society for generations.

With that in mind, what the flappers reported in their magazine can give us further insight as we move forward. We would benefit from emulating the flappers in the "frivolity" of protecting our youth. While partnerships between social workers and police officers may be a controversial subject, it is worthwhile to explore their potential benefits.

Perhaps just as important to lovers of history is how this brief magazine article, as a primary source, demonstrates something important about society and shows that the past has lessons for today.

The Roundup Top Ten for August 5, 2022

The Coming Pregnancy Surveillance State Will Bring "Homeland Security" to Women's Bodies

by Natalie Fixmer-Oraiz

The Dobbs ruling puts longstanding racist and nationalist beliefs that white women's reproductive labor is the price of their citizenship, and punitive controls on women of color, on a collision course with the modern capacity of digital surveillance, threatening the criminalization of any miscarried pregnancy. 

 

Reproductive Rights, Slavery, and the Post-Dobbs World

by Jennifer L. Morgan

Black women's history with reproductive freedom from slavery to today shows that racial and gendered oppression depend on the denial, embraced by Clarence Thomas, of a constitutional protection for bodily autonomy. 

 

 

The Power 5 Conferences Should Split Revenues with College Football Players

by Victoria Jackson

Another college football season means another chance to demand that universities and the NCAA recognize a fundamental fact about the dangerous and isolating work performed by players: they are not student-athletes, but employees of the football team. 

 

 

Is There a Biblical Solution to the Modern Problem of Debt?

by Eva von Dassow

Many are inspired by Old Testament rules for debt jubilees, but, while the practice has a historical basis, that history shows debt forgiveness was part of an unequal society in which forgiving old debts simply enabled the masses to take on new ones. 

 

 

Working 9 to 5: The Activism of Women Office Workers

by Ellen Cassedy

The author of a new firsthand history of a pioneering organization of women office workers discusses the history behind the movement for "Raises and Roses." 

 

 

"Freedom Dreams" at 20: Robin D.G. Kelley on the Ongoing Work of Imagining Liberation

by Robin D.G. Kelley

"The “Black Spring” rebellion of 2020 sparked a renewed interest in Freedom Dreams. But the book was never intended as a roadmap.... Instead, it humbly offered a different take on histories of a handful of social movements by centering their visions of a better future for all."

 

 

The Eugenic, Anti-Black History of the "Brazilian Butt Lift"

by Daniel F. Silva

Brazilian doctors developed the procedure in the wake of a eugenics movement that assimilated some stereotyped attributes of Black women's bodies into a new set of beauty standards that marginalized Afro-Brazilians. A similar dynamic occurs today on worldwide social media.

 

 

Fed Up with Emails from Democratic Pols? You're Not Alone

by Lara Putnam and Micah Sifry

The Democratic Party's strategy of electronic communications to raise fear and money is backfiring, as voters see little change despite their contributions. The party must reach people at their doorsteps, not through their inboxes. 

 

 

Teaching the History of Campus Police

by Yalile Suriel

The FBI's Law Enforcement Bulletin offers an insight into how law enforcement in the 1970s increased its presence on college campuses and redefined the function and goals of campus police forces. Here's how one professor has used this source in class. 

 

 

Bill Russell's Greatness Was Unfathomable

by Jack Hamilton

"Bill Russell wasn’t just everything we should want out of people who play sports; he was everything we should want out of public figures, an absurdly gifted human being who understood there was a world outside those gifts and who set himself to help change it."

 

The Highland Park Horrors Won't Break the Gun Cult's Mythic Hold on America

Missouri Senate candidate Eric Greitens (R) boasts of backing from "an army of patriots" in a recent ad. 

 

 

The Fourth of July: an American flag waving in the breeze, grills fired up and stacked with meat, picnic tables loaded with food and friends, fireworks launching from driveways or fairgrounds (or, sometimes, all over the driveway), parades and marching bands in the heart of town.

And because this is America in 2022, mass shootings.

Over the July 4th holiday weekend, more than 220 people were reported shot and killed in the United States. There were 11 mass shootings, including the one at the Highland Park, Illinois, Fourth of July parade. The Highland Park shooter, Robert Crimo III, has attracted a lot of attention because of his involvement in a number of internet subcultures promoting violence, alongside his casing of a local synagogue before the attack. Crimo may also have been planning a second attack in Madison, Wisconsin, before he was arrested. His was just one of many planned attacks; others were foiled, including one in Richmond, Virginia. Shortly thereafter, police in Long Beach, California, seized weapons from another man who was glorifying mass shootings and discussing attacks on minorities on social media. This, this is America, where mass shootings come too often for the media to properly cover them.

The Fourth of July is a potent symbol, and one that is always ripe for appropriation by anyone who wants to sell a message. The concept of the "Black-Robed Regiment," for instance, has been appropriated by Christian nationalists to sell a particular narrative of American history. The idea of "Patriot Churches" and the use of the title "Patriot" by QAnon members continue this far-right appropriation of the legacy of the Revolution. The Proud Boys' war plan for January 6th was entitled "1776 Returns," though one would be forgiven for thinking they had conflated the American Revolution with the Russian Revolution, given that phase one of the plan was "Storming the Winter Palace." And of course, as with almost any other event in contemporary America, Rep. Marjorie Taylor Greene had to comment on the Highland Park shooting, saying that it was a false flag event designed to push gun control on Republicans, and adding a touch of conspiratorial thinking: "as soon as we hit MAGA month, as soon as we hit the month that we're all celebrating loving our country, we have shootings on July Fourth."

And maybe that's where we should pick up the thread, because if MAGA month is an absurdity, the link between firearms, gun control, and the vision of what America means is, on the Fourth of July weekend, important to discuss. The myth of a "Good Guy With A Gun" is not just about active shooters (something that I worry will forever be associated with the Fourth of July now) but part of our American mythos. It is a religious conviction, just like the guns its believers carry. It's every bad legend of the American West, past and present. It's the Red Dawn mythos of the American Revolution: that the militia, untrained, passionate farmers and townsfolk picking up their hunting rifles, defeated the British army. And it's that legend, the linking of guns to the Revolution, of individual gun ownership to patriotism, to American identity, that needs to be discussed.

The Fourth of July becomes a moment for arms companies, the National Rifle Association, and politicians to champion firearms draped in the flag. Daniel Defense, the firearms manufacturer that used Proverbs 22:6 in a tweet to market guns for children, and whose weapons were used by the Uvalde shooter, had its own marketing tweet for the 4th:

              

At DD, we celebrate our nation’s founding knowing that every Daniel firearm, part, or accessory is designed and manufactured right here in the U.S. Happy Birthday America and may God Bless our Military, Law Enforcement, and First Responders. -The Daniel Defense Family pic.twitter.com/REwjDLKA9R

— Daniel Defense (@DanielDefense) July 4, 2022

The text is pretty simple, “Happy Birthday America and may God Bless our Military, Law Enforcement, and First Responders,” a fairly traditional Fourth of July message. Two tweets down, they’re selling their MK18 SBR with the background of an American flag. The National Rifle Association is much more explicit, posting a video saying that “The only reason you’re celebrating Independence Day is because citizens were armed. Happy Fourth of July from the National Rifle Association of America,” in a tweet with the text, “We are a country because of brave souls with guns who valued and fought for liberty and freedom.”

And then there is New Hampshire State Representative Jason Osborne, now the GOP House Majority Leader in the state. In a now-deleted tweet, on his now-private account, he wrote: "Instead of spending $20 more than last year on your Independence Day hot dogs, lay off the calories and grab a few more rounds for your AK-47. You'll thank yourself later." We can only imagine the multiple possible interpretations of this, but we live in a country where militia groups like the Oath Keepers and the Three Percenters drape themselves in the mantle of the Revolution and try to overthrow the federal government, while Republican officials continue to defend January 6th.

At least it wasn’t an Eric Greitens ad. It could be worse.

What does this tell us about the American Revolution, and its legacy on guns, gun ownership, gun control, and, dare we say it, gun idolatry in America?

Let's start with the Revolution itself. The notion that colonials living in British America revolted, much less revolted successfully, simply because their love of weaponry thwarted a British government trying to take it away is deeply flawed. The British empire in America relied heavily on the fact that its colonial subjects could be called upon in time of war to augment British regular forces through their local militias or "artillery companies." These well-regulated local militias were indispensable to British military might in America. Far from seeking to prevent Americans from owning guns, the British regularly complained that Americans cared too little for regular militia training and contributed far too few resources for the safekeeping and maintenance of existing weaponry. While it is true that the first shots of the Revolution were fired within the context of British troops attempting to seize caches of guns and ammunition in Massachusetts in 1775, it is important to remember that the British government already considered the colony in a state of open rebellion. And it wasn't well-armed and well-trained Americans that blocked the attempted confiscation, but rather poor intelligence, logistics, and morale among British soldiers. In fact, most Americans who did possess firearms in the early days of the Revolutionary War possessed aging weapons, inferior to those of the British, and, as Gen. Washington constantly lamented to Congress during the war itself, many barely knew how to use them. What Washington wanted, needed, and eventually got was a professional American army, not armed citizens, to turn the tide of war against a beleaguered British military that was by that time engaged in a world war against far stronger European rivals.

Indeed, the dichotomy between armed citizenry and a professional army is crucial to consider when we address the modern American fascination with firearms. Colonial militias, far from relying on privately armed-to-the-teeth colonists, often stored weapons and ammunition communally. The point wasn’t their freedom to own weapons, but rather their access to them in times of crisis. Without an existing crisis, few colonists saw the point in purchasing or maintaining what were extremely expensive items in areas away from the frontier. Again, this was a necessity born of weakness, not a constantly-flexing expression of strength.

It is perhaps easy to understand this myth when one considers how thoroughly the cult of the Second Amendment has flooded modern American politics. Though often cited, the amendment is almost never cited in its entirety. It states: "A well-regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms, shall not be infringed." Why is the proviso almost never cited in its entirety? Because it was precisely worded to specify exactly the framers' intent in including it: the defense of the state. One of the chief grievances of American colonists leading into the Revolution was the abuses of the British army in America. By the end of the Revolution, this suspicion and animosity had developed into a principled opposition to professional armies themselves. Many founders and citizens believed professional armies exist to fight wars and thus would inevitably find excuses and enemies to fight. How could the new nation avoid becoming an empire? How could it ensure that wars fought by a democratic people were defensive and never offensive? That an institution dedicated to the use of force would never turn itself on the government or people? Simply eliminate the need to construct one.

We should note that the United States subsequently constructed both an empire and a professional military force (presumably the strongest argument for the repeal of the amendment itself, or at least for a national conversation on choosing one or the other). The point remains, however, that the Second Amendment was designed to prevent this. If local militias in slave states could also double as slave-police forces, avoiding the need to have a free people's military tangle with the pervasive practice of stealing freedom from others, all the better. What the amendment never came close to allowing or advocating was the idea that Americans who elect their leaders would need a violent backup plan to voting, or would be endowed with all the individual powers of a military, or its weaponry. No state or federal administration has ever allowed the possession of tanks, cannon and artillery, submarines, jet fighters, or nuclear weapons by private citizens. Why? Because no serious political leader ever understood the founders to have recognized a right to parity in force or a challenge to the state's monopoly on the use of force. Well, until now.

The right-wing media apparatus will continue to push the narrative that there is nothing more American than baseball, apple pie, and open-carrying an AR-15 into the stadium. The National Rifle Association will continue to proclaim that the right to bear arms trumps the right to live in peace, the right of schoolchildren to survive the school day, the right of Americans to enjoy in peace a parade celebrating the signing of the Declaration of Independence; that "patriotism," that the Fourth of July itself, is a celebration of the individual right to own weapons of war. It's a mythology that has nothing to do with the Revolution or with the Founders, but it is a mythology that has everything to do with the dangerous, problematic, and ahistorical far-right legend of the Revolution, and the desire for a revolution to come.

 

Weaponizing Bad-Faith History is a Conservative Tradition from Jim Crow to Alito

Wade Hampton III of South Carolina, photographed by Matthew Brady during his tenure as a United States Senator ca. 1880

 

 

Thanks to the Supreme Court’s historic end-of-session rulings, I’ve found myself thinking about the dissertation I defended in 1997 and which was published in 2002 as In the Great Maelstrom: Conservatives in Post-Civil War South Carolina. In it, I argued that southern conservatives, despite their overwhelming defeat in 1865, relied on an understanding of history that justified using various means – including violence – to reestablish and maintain control over their state in the years that followed. A few core beliefs held their view of social order together through decades of undeniable change. As one would expect, they defended white supremacy. They also believed in the rule of local elites -- meaning themselves -- over everyone else, including poor white southerners, whom they openly tried to keep from voting for a time. A hallmark of this conservatism was that history itself justified a hierarchical social order and that history would eventually vindicate them.

 

A little part of me is impressed that the case seems to have held up. A much bigger part of me wishes I had been wrong. The Supreme Court’s deeply revanchist rulings show clearly how the majority of justices, much like the conservatives I examined, interpret and invoke American history in such a way as to justify elite male rule as the proper social order. As inheritors of this tradition, today’s conservative justices, therefore, will likely continue to solemnly invoke the past as they go on their way rolling back the hard-earned civil rights of more and more Americans. Today’s progressives should always remember that conservatives play the long game. 

 

Even amidst the ruins of their towns and farms, with slavery outlawed and African American men having the vote, southern conservatives assured one another after the Civil War that “many things here are changed, but all is not changed.” They defended the Old South and urged their fellow elites to “infuse her nature” into the New South. South Carolina writer Ben Robertson wrote later that “the past that Southerners are forever talking about is not a dead past…it is a living past, living for a reason.” Having long denied access to education to everyone else, they took it upon themselves to interpret and write that history. Their claims of reverence for the past – and their selective use of the past – are strikingly similar to modern conservatism’s favorite legal theory, originalism.

 

Recently on Ezra Klein’s podcast, Professor Kate Shaw of the Cardozo Law School reviewed the Court majority’s originalism and its slipshod use of history. Originalists turn “to the time right around when the constitutional language was written” to determine “what the language means based on what it meant at the time.” If a current rights claim cannot be found within the original meaning of the text of the Constitution, it must be rejected. This seemingly politically-neutral approach, Shaw notes, relies on “incredibly selective readings of the relevant historical record, in service of — it seems — outcomes that actually are philosophical… or substantive value driven outcomes.” In other words, conservative. Shaw concludes, “to suggest that this kind of method actually is objective in a way that earlier approaches to constitutional interpretation were not I think is just at worst sort of bad faith.”

 

Shaw is correct. The majority bloc of justices are lousy historians. As such, they are following in a long conservative tradition. What I learned through my own work was that the conservatives’ invocations of history often mixed wishful thinking about the past with bad faith in interpreting it. And it was always done with the present-minded purpose of maintaining elite white male rule, especially on matters of race.

 

For example, Frederick Porcher was a professor at the College of Charleston. The enslaved labor of over one hundred men, women, and children produced the wealth of his family's plantation outside the city. Before the war, Porcher developed a sophisticated defense of enslavement as the "natural" foundation for a divinely sanctioned social order. It worked, he argued, because everyone accepted their place within the hierarchy. During the war, some of his enslaved people had other ideas and used the war's disruptions to escape. After the war, Porcher wrote local history asserting that the enslaved had been happy in their enslavement, conveniently overlooking what he had witnessed just a few years prior. When the disfranchisement of black voters began in the 1870s, Porcher taught his history students that no actual rights were lost for the freedman since "he had none in the country from which he was brought" to the U.S.

 

Edward McCrady, Jr., an attorney, also dabbled as an historian as he served in the South Carolina state house in the 1870s and 1880s. His reading of the state’s history led him to conclude that “under our system of government, the public business is ours, just as much ours as our own personal and private affairs.” Our, ours – McCrady was not exactly subtle. And while history justified elite rule, it didn’t hurt to put one’s fingers on the scales. As Wade Hampton, a former Confederate general and enslaver of hundreds, explained on his way to becoming Governor, “If we cannot direct the wave it will overwhelm us.” McCrady agreed and pushed for a law that would deny the vote to black and poor white South Carolinians. He also was active in the local “rifle clubs” of the 1870s that carried out acts of violence against anyone challenging elite rule.   

 

At the dawning of the Jim Crow era in the early 1900s, Theodore Jervey, Jr., an attorney and author, looked back at the elite rule of his enslaver ancestors and sighed, "I don't believe Democracy ever received more loyal service than was rendered by the aristocratic representatives of the South in the first three or four decades of the Constitutional life of this Republic." In searching for a contemporary model to match his fondness for the antebellum southern order, Jervey studied Boer rule in South Africa, and he applauded as British and German imperialists abandoned their "extravagantly liberal and humanitarian ideas, with regard to the race question."

 

Finally, from the 1920s through the late 1940s, Charleston newspaper editor William Watts Ball frequently invoked the leaders of the Old South and the Confederate generation. He boasted that he was “steeped in their opinions and prejudices . . . rejecting nothing.” Sure enough, in 1933 he grew frantic that a stronger, more progressive federal government under Franklin Roosevelt would mean “goodbye to ‘state rights’ and all that sort of thing.” And that, of course, would eventually mean “the conferring in practice, as well as theory, of equal rights.” Ball explained to his wife that “I don’t believe in slavery, but I do believe in holding the negroes down.” He lived long enough to support Strom Thurmond’s Dixiecrat campaign for president in 1948.

 

Dobbs v. Jackson on abortion rights, Shelby County v. Holder on voting rights, and Vega v. Tekoh on Miranda rights all take aim at landmark progressive accomplishments of the 1960s and early 1970s. All now have been gutted if not directly overturned in the name of conservatism. Allison Orr Lawson has written recently, "if history is going to be a key driver for the Supreme Court's decisions…then it is imperative to ask where the justices are getting their historical sources, whether those sources are fact-checked, and (most importantly) who is narrating the history." The conservatives I studied in the 1990s offer some disturbing answers to Lawson's important questions. From Frederick Porcher to Samuel Alito, the history conservatives have used to justify their power has either been a fantasy or one that strictly and explicitly limits freedom for others. Since conservatives believe that history will eventually prove the rightness of their views – and that nearly anything goes to make sure of that outcome – no one should be surprised when the Court continues to mangle history in the name of originalism to make it so.

Don't Call them Conservatives

 

 

At the conclusion to the latest round of the House Select Committee hearings on the January 6, 2021 invasion of the United States Capitol Building, committee vice-chairwoman Liz Cheney (R-WY) stressed that much of the testimony against Donald Trump came from conservative-identifying members of the Republican Party.

 

However, what became clear during the hearings, and in recent American politics generally, is not only that Donald Trump incited a riot and was involved in a criminal conspiracy to overthrow a democratically elected government, but that the MAGA Republicans in the House and the Senate and the Republican majority on the United States Supreme Court are not "conservatives" in any meaningful sense of the word. They are illiberal rightwing reactionaries willing to circumvent democracy to maintain power. Many are political extremists, religious zealots, intolerant bigots, and racists who appeal to the basest instincts of their followers. Their ideas and actions come very close to those of the Fascist movements that swept through Italy and Germany between World War I and World War II, and to those of similar movements that have come to power in Hungary, Turkey, and Russia today.

 

Whatever you call Trump’s followers, don’t call them “conservatives” or “traditionalists.” There is nothing conservative or traditional about them. Those labels just provide a veneer of legitimacy to people who deserve no intellectual or political legitimacy at all.

 

In Republican primaries across the country, candidates for office keep championing their supposedly conservative credentials. Often that means little more than support for unrestricted ownership of the kinds of weapons frequently used in mass murders, and campaign ads showing politicos in photo ops carrying automatic weapons.

 

A recent poll by the Violence Prevention Research Program at UC Davis exposed very disturbing trends in the United States. Slightly more than half of the respondents believe that "in the next several years, there will be civil war in the United States," and over 40% believe that "having a strong leader for America is more important than having a democracy." About 1 in 5 said political violence like the kind we saw on January 6, 2021 is "justifiable." While 62% of the people who completed the survey identified as white and non-Hispanic and 47% as male, there was no breakdown by political identification or of how different demographic cohorts responded. A recent academic study published in the journal Psychological and Cognitive Sciences found that in the United States radical action by individuals associated with right-wing causes was more likely to be violent or to espouse violence than radical action by people on the left. The study also found that "right-wing individuals are more often characterized by closed-mindedness and dogmatism."

 

The United States has a strong conservative antidemocratic tradition dating back to the nation’s founding. What is different now is the conservative appeal to mass action to undermine democratic institutions, something we generally observe in Fascist movements. Early conservatism was clearly expressed in the Federalist Papers’ defense of the new Constitution. James Madison made very clear in Federalist #10 that the new government was specifically designed to inhibit decision making by an “overbearing majority” and to protect the influence of those he considered to be “our most considerate and virtuous citizens.” The Constitution was not a document designed to ensure democracy, but to protect the liberties of the elite, although it did include a commitment to rule by law. This traditional brand of conservatism sought to protect the power and property of the elite from the “huddled masses yearning to breathe free”.

 

The Declaration of Independence promised "liberty and justice," meaning the protection of privileges and property from arbitrary authority, but not for "all." These protections were not extended to women, enslaved Africans, or indentured whites. The Supreme Court made it clear that "liberty and justice" did not extend to people of African ancestry in its 1857 Dred Scott decision. These conservative traditionalists wanted to protect their own rights and freedoms, but not those of others.

 

The conservative commitment to rule by law in the early national era was crucial for the survival of the new nation. One of the earliest conservative leaders in the United States was John Adams, elected as the nation's first Vice-President in 1788 and 1792 and its second President in 1796. As political parties evolved, Adams was the candidate of the Federalist Party. In 1800, when Thomas Jefferson defeated Adams' bid for reelection because of the additional electoral votes granted to the slaveholding states, Adams conceded defeat because, unlike Donald Trump in 2020, he placed the survival of the country and its institutions ahead of party and power. Adams' concession represented one of the first times in world history that there was a peaceful turnover of power between rival political factions.

 

Conservative movements dominated the federal government both before and after the New Deal. Historian Gabriel Kolko argued in The Triumph of Conservatism: A Reinterpretation of American History, 1900-1916 (1963) that Progressive federal legislation at the beginning of the 20th century drew conservative support because it circumvented more radical state reform efforts. Republicans and moderate Democrats have held the Presidency since 1969.

I include Jimmy Carter, Barack Obama and Joseph Biden as politically moderate Democrats. No one like Bernie Sanders or Alexandria Ocasio-Cortez has been elected to the presidency, served in a presidential cabinet, or been appointed as a Supreme Court justice.

 

Prior to 1994 and Newt Gingrich's efforts to shut down the federal government and obstruct any legislative action, Republican conservatives generally believed they had a responsibility to support good government. Republicans held a majority on the Supreme Court when the Court ruled that there was a constitutional right to privacy and to terminate an unwanted pregnancy. Thirty Republican Senators voted for the Voting Rights Act of 1965, and Republicans supported the formation of the Environmental Protection Agency in 1970. When Richard Nixon was caught breaking the law in the Watergate scandal, many Republicans supported his removal from office and helped force him to resign.

 

The Supreme Court, whether it was intended to be or not, has almost always functioned as a conservative brake on social change. But it has rarely rewritten the Constitution or reversed its own positions as forcefully as it does today. The current Supreme Court, with three Trump appointees and a six-to-three rightwing majority, may well be the most activist and extremist Court in United States history. Its decision in Dobbs v. Jackson Women's Health Organization (2022) overturned Planned Parenthood of Southeastern Pennsylvania v. Casey (1992) and Roe v. Wade (1973), eliminating federal protection for abortion rights. Earlier, its decision in Janus v. American Federation of State, County, & Municipal Employees (2018), reached by a narrower rightwing majority, overturned Abood v. Detroit Board of Education (1977), taking away long-established labor union rights, and its decision in Citizens United v. Federal Election Commission (2010), reversing McConnell v. Federal Election Commission (2003), eliminated major federal regulation of election campaign financing. These decisions all violated the conservative belief in stare decisis, the doctrine that courts will adhere to precedent in making decisions, a principle each of the rightwing justices promised to uphold.

 

In a 2012 pre-Trump article in The Atlantic, Conor Friedersdorf asked what “Americans mean when they say they’re Conservative.” While he concluded “it depends,” he did identify a few basic themes. They include “an aversion to rapid change and mistrust of attempts to remake society” and a “desire to return to the way things once were,” a belief that it is “imperative to preserve traditional morality, as it is articulated in the (Christian) Bible,” “disdain for American liberalism, multiculturalism, identity politics, affirmative action, welfare,” an “embrace of free-market capitalism,” and a belief that “America is an exceptional nation.” Missing from these core values was any commitment to democracy and respect for the rights and lives of others.

 

The positions identified by Friedersdorf in 2012 offer a reasonable definition of American conservatism. But in the Trump era, with anti-democratic "values" coupled with contempt for reason, law, and tradition, and with rightwing Republican politicians stirring up popular unrest and insurrection at the Capitol, the United States is in deep trouble. As we confront what is happening in this country, we need to stop calling the MAGA movement conservative.

Kathryn Olmsted's "Newspaper Axis" Shows Media Extremism Nothing New

Marion Davies and William Randolph Hearst, 1942

 

In our hazy collective memory of Franklin Roosevelt, the Great Depression and the march toward World War II in the 1930s, we sometimes assume the people of the United States were absorbing the printed and broadcast words of an objective and fair news media.

Nothing could be farther from the truth, as University of California, Davis history professor Kathryn S. Olmsted ably proves in her deeply researched book The Newspaper Axis: Six Press Barons Who Enabled Hitler. Not only were some of the foremost newspaper publishers of the day needling FDR with stilettos on their editorial pages, but they even imposed their views on their reporters. "Fake news" was as evident in those days as it is today.

Hitler and FDR came to power within a couple of months of each other in 1933. When we think of the dire situation in the country when Roosevelt was elected in 1932 — 25 percent unemployment, the banking system collapsing — it’s amazing to think that he was endorsed over Herbert Hoover by just 41 percent of the nation’s newspapers — and that was the high-water mark. The fact that FDR continued to win re-election — carrying all but two states in 1936 — is a testament to the popularity of his programs and the power of his personality.

FDR was opposed on both the domestic and foreign fronts during most of his time in office by the press lords William Randolph Hearst, owner of a network of newspapers and radio stations whose motto was "America First," and Col. Robert McCormick, publisher of the Chicago Tribune, the self-proclaimed "world's greatest newspaper." While initially supportive, McCormick's cousins, siblings Joseph and Eleanor "Cissy" Patterson, eventually turned on FDR too. Joe Patterson published the largest-circulation newspaper in the country, the New York Daily News, while Cissy published the largest-circulation newspaper in Washington, the Times-Herald. As a group, they reached 30 percent of the American newspaper-reading public every day.

Not only that, but Hearst, McCormick and the Pattersons had cozy ties with two powerful press lords in England, Lord Rothermere and Lord Beaverbrook, who were supportive of Mussolini and Hitler until the day German armies attacked Poland in 1939 and continued preaching appeasement until the war escalated with the attacks on the Netherlands and France the following spring.

McCormick's vitriol easily matched that of the better-known Hearst. He called the New Deal the "Raw Deal" and instructed his White House correspondent to label the federal work relief programs "government easy money." When the Lend-Lease bill was debated by Congress in early 1941, the Chicago Tribune called it "the Dictator Bill" in all its coverage — without using quotes around the mocking nickname. McCormick was convinced that every Roosevelt program and action was designed to make him the country's dictator, but he was willing to give a true dictator like Hitler a pass because of his own fierce isolationism and hatred of communism. Apparently, a fascist dictator trumped a communist one.

FDR and his supporters were able to neutralize some of the power of Hearst, McCormick and the Pattersons regarding Lend-Lease and the need to rearm the country with tools such as FDR’s Fireside Chats over the radio and the growing popularity of commercial radio and motion pictures. Henry Luce, publisher of Time and Life magazines, owned a radio program and a movie newsreel service, both called the March of Time. Although he had opposed the New Deal, he fully appreciated the situation evolving in Europe under Hitler and covered it accordingly in his media.

It's hard for us to appreciate today the power of the movie newsreels. Americans spent most of every entertainment dollar on movie tickets in the 1930s, and the March of Time mini-documentaries "were seen by more than 20 million Americans every month," Olmsted writes. There was also a sea change in the late 1930s and early 1940s in how Americans received their non-local news: by 1942, polls found that 62 percent preferred to receive it from radio. Broadcasts from war-torn Britain in 1940 by CBS radio man Edward R. Murrow and others moved Americans away from the isolationism of Hearst, McCormick and the Pattersons.

After the fall of France in June 1940, Hollywood began producing more anti-Hitler films, especially those vilifying the Nazis and praising the English. They included Alfred Hitchcock's Foreign Correspondent and Darryl Zanuck's A Yank in the RAF.

Olmsted deftly interweaves the stories of the conservative press in the two countries through the 1930s and beyond. Although Lord Beaverbrook took major posts over war production in Churchill’s government, after the war he reverted to type, opposing the Marshall Plan, the United Nations and the European Common Market. Olmsted notes that Rupert Murdoch, who went on to found Fox News, was a protege of Beaverbrook’s in the 1950s. He learned at the knee of a master and helped engineer Brexit.

She concludes her powerful book with this eloquent paragraph:

“We can still hear the echoes of their voices today — in the anti-European headlines of the [British] Daily Express and the Daily Mail, in the angry populism of Fox News and Breitbart, in the nationalist speeches of Boris Johnson and Donald Trump. ‘From this day forward a new vision will govern our land,’ President Trump promised in his inauguration address. ‘From this day forward, it’s going to be only America First — America First.’ The last of the press lords died more than a half century ago, but their heirs continue their crusade for nation, for empire, for the ‘white race,’ and for Britain and America First.”

Collegiality, Interdisciplinarity, and the Historian's Work

Plato's Academy, after Carl Johan Wahlbom

 

 

At many universities, scholars are encouraged to be "interdisciplinary," in the classroom and in their work. Some scholars want to do interdisciplinary research or create work that speaks across disciplines. However, most PhDs spent most of their time earning that degree in a single discipline. And once out in the world, they are often housed in single-discipline departments. None of that precludes interdisciplinary work, but it does mean that regularly engaging with scholars and work from other disciplines is not automatic. Collegiality can help bridge the gap.

 

At my university, every summer we have a colloquium series for interested faculty, put together by the Honors Program. There are three books, mostly from the Honors curriculum or adjacent to it, which we read and discuss in a Socratic format. All of the faculty are invited and can read all three books or any one of them. Not all of the faculty participate, but every summer we have engaged participants from a variety of disciplines. This summer, during a session on St. Augustine's On Christian Doctrine, we were discussing semiotics with people from history, pharmacy, communication, computer science, and dance, among others.

 

The biggest immediate effect of this series is collegiality. People from very different parts of the university get to know each other and become more comfortable with each other. We learn the names that go with faces and we learn the specific skillsets that people bring to campus. And we learn to work together better—we practice collegiality. When you undertake to read and understand a book together, you are engaging in shared learning, but also something of a shared project. A book like Frankenstein may seem fairly easy to approach, but not all of us come to the table with interests or backgrounds prepped for Schiller’s On Aesthetic Education or MacIntyre’s After Virtue or Hofstadter’s Anti-Intellectualism in American Life. And we soon learn how differently disciplines can approach texts and questions. And how beautiful that can be.

 

If collegiality is the immediate outcome of the colloquium series, it is not the only outcome. As faculty become more comfortable with each other, they become more familiar with the tools and approaches other disciplines use. As it happens, sitting with someone who operates differently is an excellent starting point for learning from them or working with them. The colloquium series has prompted interdisciplinary collaboration and has served as continuing education for all of us.

 

Even in settings that are explicitly interdisciplinary or intended to have productive outcomes, collegiality smooths the way. This summer I was fortunate to participate in an NEH Institute, “The Revolution in Books” at Florida Atlantic University. We learned about the role of books and printing in the American Revolution and about books as objects, and we got to make and marble paper, set type, and stitch bindings. It was a great experience, enhanced by FAU’s resources, like the Jaffe Center for Book Arts and the Weiner Spirit of America Collection with its excellent sources. But our experience was made even better by the variety among participants.

 

The NEH Institute brought together all kinds of scholars and artists for “The Revolution in Books.” We had academics who focused on literature and history, librarians and archivists, and even artists. We had people familiar with the names of colonial American loyalists, people who knew pirate songs, and people who could recognize all kinds of typefaces. In sessions, we learned what different disciplines had to offer for approaching subjects. And as we got to know each other better, we also talked more outside of sessions about approaches to learning and teaching. A lunch or a dinner could be a launching point for a new project or an introduction to new pedagogy.

 

The NEH Institute was designed for just this kind of outcome. Participants are intentionally drawn from different disciplinary backgrounds. And the hands-on sessions put people together to work on projects. Most participants stay on campus. It builds camaraderie and collegiality. By the end of our time together, everyone had learned about new techniques for art and books, or new tools for teaching, or new resources for research. Each participant has a final project and it’s fair to say that all of them have benefited in some way from the span of interests and approaches that were in the group. (Those projects will be presented on a public Zoom, August 4 & 5.)

 

The best interdisciplinary collaborations come about when people aren’t just looking to “do interdisciplinary work,” but when we have regular interdisciplinary interactions. Conversations can lead to collaboration in very natural ways. If we want our departments or universities to see and do more outside disciplinary boundaries, we need to create environments for more engagement with others. Opportunities that involve working together to create something or to understand something are optimal. Collegiality is a strong foundation for interdisciplinary work and thought.

Time to Amend the Constitution

 

 

Much has been written about how broken our federal system is. Increasingly it seems that gridlock keeps Congress from passing laws designed to deal with the pressing problems we face as a society. The most recent spate of mass shootings has driven this home, with the Senate finally able to pass only limited gun control measures, despite overwhelming public support for even stronger laws. And this is just one area where the majority of the public favors one set of policies, whether on access to abortion, fighting climate change, or protecting voting rights, while a majority in Congress cannot reach agreement on what those policies should be. We also have a Supreme Court issuing broad opinions that are out of step with what most people believe.

 

Even more concerning, our former president Donald Trump, and many of his supporters, do not believe in democracy. Given this situation, we need to consider how to restructure our government to ensure that the majority can rule.

 

Thomas Jefferson wrote to James Madison about the new American constitution being proposed that “it is my principle that the will of the majority should prevail.” Jefferson’s view was not the prevailing one among the 55 delegates who attended the Constitutional Convention in the summer of 1787. His good friend, James Madison, feared that a majority could run roughshod over the rights of the minority. Many of the other delegates feared too much democracy as well, and the framework they created reflects a whole series of checks and balances designed to slow the popular will. But it was not meant to thwart majority will altogether.

           

James Madison would soon change his mind about the threat of the majority and instead, in the early years of the Republic, became concerned about minority rule. When he wrote Federalist 10, he assumed that a majority could easily control the minority. “If a faction consists of less than a majority, relief is supplied by the republican principle, which enables the majority to defeat its sinister views by regular vote.” But in the early 1790s, Madison had come to the conclusion that a minority in fact controlled the government in the form of the Federalist Party. He referred to the Federalists as the “anti-republican party” that was “weaker in point” yet still controlled power. His fear of minority government led him and Jefferson to form their own independent political party, the Democratic-Republicans.

 

It seems that we are now heading for a time, if we are not already there, when a minority of the country also controls power. Part of this is certainly a product of our constitutional design, which no longer seems to reflect the will of the majority, in contradiction of the views of Jefferson and the later view of Madison. Over the past twenty years, only one Republican has won the popular vote for the presidency, yet the two parties have split the office during this period. The Senate has always overrepresented small states and rural areas. Even the House, which is apportioned by population, over-represents the minority. In 2020 the Democrats received more votes than the Republicans in the aggregate but lost seats in the House. The Supreme Court is now dominated by a young conservative majority because Trump, a minority president, was able to appoint three new members to the court.

 

Each of these trends calls for an overhaul of our constitutional structure before minority rule overwhelms us. I can hear you, the reader, thinking: “We can’t even pass normal legislation, how can we ever amend the Constitution?” It’s the same view I have held for a long time. Yet in the past, when faced with a constitutional structure that no longer worked, we have amended the document. Our original constitution, the Articles of Confederation, was replaced by our current Constitution. The Constitution has been amended twenty-seven times, often to provide for more democracy and equality. From the Civil War amendments (the 13th, 14th, and 15th) to the direct election of senators and the extension of the vote to women and 18-year-olds, the Constitution has undergone significant changes.

 

It is in this spirit that I offer up the following amendment proposals, many of which have been raised by others. I have focused on structural changes that will allow for majority rule and avoided any issues that touch on core values (like access to abortion), which are simply too divisive.

 

The place I start is to make amending the Constitution easier. Sarah Isgur, who worked at the Justice Department during the Trump Administration, recently made a concise argument for just such a change. She recalled a conversation with the late Justice Antonin Scalia in which he said that “it was too hard for the people to overrule Supreme Court decisions.” Ms. Isgur’s proposal is to allow a majority of Congress to recommend amendments to the states, followed by a vote of two-thirds of the states to ratify those amendments (rather than two-thirds of Congress and three-fourths of the states).

 

The Senate is our most undemocratic institution, a product of compromise between the large and small states at the Constitutional Convention. As Douglas Amy, professor of politics at Mount Holyoke, has pointed out, the 40 million people in 22 small states are represented by forty-four senators, while nearly 40 million people in California are represented by two. My favored approach would be to have each state get one senator, with the other 50 senators allocated based on population. This would still provide protection to small states but allow for a more democratic Senate. It may be difficult to do, given that Article V of the Constitution makes it virtually impossible to change the two-senators-per-state provision. Short of this, the filibuster should be eliminated from the Senate rules so that the majority can pass legislation.

 

For elections to the House, the elimination of partisan gerrymandering would help restore that body to what the Framers hoped it would be, the people’s house. This may not require a constitutional amendment and could be done through legislation, but an amendment would make the reform far harder to undo. One analyst has suggested that the 15th Amendment could be reworded to read “The right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State on account of political belief, race, color, or previous condition of servitude.” A simple and elegant solution.

           

House terms should be extended from two years to four so that representatives serve a term similar to the president’s. The four-year terms would be staggered so that only half of the House members are up for re-election at any one time. This would provide greater stability, since we would not undergo a massive upheaval every two years in which the House can so easily pass from one party to the other. And there would still be a check on a president, since half of the House and one-third of the Senate would be up for election every two years.

 

The Electoral College should be eliminated and the president elected by popular vote; the institution was created for a variety of reasons that are no longer valid. Some of the Framers feared democracy and the direct election of the president. Southerners wanted to protect slavery, and since the Electoral College reflected the three-fifths compromise on counting slaves for purposes of representation, they believed it would result in more southerners being elected to the presidency. Alexander Hamilton thought that the Electoral College would be an independent body that could provide a check on an unscrupulous person being elected to the presidency. None of these reasons justifies keeping such an anachronistic institution in place. If the Electoral College cannot be eliminated, then the Constitution should be amended to ensure that the presidential candidate who wins the popular vote in each state gets that state’s electors. Right now, state legislatures have the option to choose the electors, which led to some of the shenanigans that took place in the aftermath of the 2020 election.

 

Finally, there should be term limits for Supreme Court justices, structured in such a way that each president would get to select at least one member of the court. This would ensure that the court stays in step with current trends in society. It would also lessen the vitriol over Supreme Court nominations.

 

These recommendations are designed to ensure that we can actually govern ourselves, solve problems, and allow the will of the majority to prevail, as Jefferson believed we should. For those who fear this gives too much power to the majority, our current system is worse: it gives too much power to the minority. Madison’s prescription for dealing with a minority through voting no longer works, and the minority party is not motivated to modify its policy positions to appeal to the majority.

 

There would still be checks on the power of the majority. Multiple majorities would still be needed to pass new laws, since both the House and Senate would need to vote for such bills, the president would need to sign them, and the Supreme Court would need to rule on whether the laws are constitutional. Is that not enough by way of checks on the system? So let the debate begin!

Interview: Joyce Berkman on the Value of History and the Historian's Mindset

 

Joyce Avrech Berkman is Professor Emerita at the University of Massachusetts Amherst’s Department of History where she was a faculty member for 48 years (1965-2013). She is the co-founder of the Women, Gender, Sexuality Studies department. Professor Berkman holds a PhD (1967) from Yale University and is a feminist teacher and activist. Her areas of research interest include U.S. history, British history, European women's history, and oral history, and she is the author of The Healing Imagination of Olive Schreiner: Beyond South African Colonialism and the editor of Contemplating Edith Stein.

 

Professor Berkman generously discussed history, her evolution as a historian, and more during a phone interview in April 2021 (the transcript has been edited for clarity).

 

 

What influenced your interest in history and specifically U.S., British, and European Women’s history?

 

My evolution as a historian is rooted in my passion for history and my belief in history and what it can provide in the way of an education. I really would prefer to begin there, with my evolution and personal history with the discipline. My interest in history begins with curiosity about life, about my life and your life and life as a general experience or phenomenon.

 

I’ve always wondered since childhood: why do things happen? Why does change happen? Why does change not happen? Why do people behave the way they do? Fundamentally, who am I? And why am I who I am? What are we human beings? Why are some people courageous and others behave like sheep? Why do people commit evil, and others do good? Why are some people brave and some people cowardly? And so forth.

 

The existential features of life have always intrigued me, and that is where all learning begins, but to learn meaningfully is to have an active, agile, curious, well-trained, broadly educated mind. For me, the goals of history meet those core goals of a liberal arts education and fuel my passion for history.

 

That is – to become a liberated mind, to have a liberated heart, to have a liberated imagination. I know no better discipline than history to enlighten people, to train them, to enable them to become full human beings. I think it does so in various ways. It does so by insisting upon a critical analysis of so-called “facts,” of opinions, of various dominant and minority values, especially of one’s assumptions and one’s intentions. History does so by bringing to bear the wealth of human experience, the actual panorama of human behavior and thought, and to study that requires a range of disciplinary tools. History is, at its best, widely interdisciplinary. No other discipline depends so vitally on interdisciplinarity. No other discipline calls for cross-cultural and comparative regional analysis within the context of time.

 

Anthropology tends to do some longitudinal studies but generally is not as time-contextualizing as historians are. That is, how does a moment in time create certain possibilities and not others? Similarly, how does a moment in space, in a particular, specific location, create certain possibilities but not others? This relates to other fundamental concerns of mine that are life-long, and that is: how do people make decisions? How do people choose this or that to do or think? What is this thing called “free will”?

 

A little anecdote: When I was studying history as an undergraduate, I was convinced that people had very, very little free will, and that everything was determined; I was very much a determinist. I remember speaking with a faculty member at his office hours about this, and he then showed me the shallowness of that position – how, for example, could courts pass judgment on people who committed rape? – and so we went around on this. It’s a question, by the way, I still wrestle with, but I now see the complexity of human nature in a much more sophisticated way.

 

We are both creatures of time and space and also creators of time and space, and that dynamic is complicated. So if I wish to explain or interpret a happening, such as Joe Biden’s election, then besides whatever I need to know about Biden, I need to know a lot about any individual voter. What are that voter’s values, that voter’s fears, hopes, and interests? What are the biographical influences on that voter, and how do those influences intersect with her or his communities, within her or his specific setting and time?

 

How did she arrive at certain psychological and sociological assumptions? What are her biases? What is her mental toolbox? What is her commitment to self-scrutiny? What is her ability to empathize with others? How lively is her compassion? In a voter’s choice, how do they anticipate the results of their choice? What is their calculation of risks, gains, and losses? What governs those thoughts?

 

I know no other discipline but history by which you can explore the decisions of people, the choices they make, past and present. It’s been life-long; it’s why I’ve always loved history and been fascinated by the intersection, ultimately, of sociology, psychology, anthropology, and history.

 

Now to my evolution as a historian: I would say there are various milestones and turning points. I’d begin with junior high school – we didn’t call it middle school in those days. I was born in 1937. I’m entering junior high school and I was fortunate in the eighth grade to have a social studies class that dealt with the Alamo and the occupation, if you will, of the Mexican War, and then the occupation and the annexation of Texas. I had very few good teachers, but this was one good teacher. She wanted to set up a debate on whether the annexation of Texas was a virtuous thing or not.

 

Was it a legal action? How does one evaluate, or what interpretation do we make of, our taking of Texas when it was Mexican territory? At the point at which I was in the eighth grade, it was not long after World War II and the United Nations had been set up. I was a fervent supporter of the UN doctrines against international aggression. I was a very deep enthusiast for everything United Nations [in] the late 1940s and 1950s. I was especially interested in “How do you promote peace?”

 

War and peace issues were one of my various historical interests (and still are). I was intrigued by the notion that there could be different points of view on a historical experience. As I began thinking about it, a cousin of mine was in the class, and we were both supposed to be on the side of the debate that says it was an act of aggression. If you looked at the doctrines of the UN, you realized it was an act of aggression. That was my beginning of becoming skeptical about what I’d say [on] flag-waving patriots and about nationalism in general. The concern about having the historical knowledge and historical scholarship to help students understand the character of American history was something that sort of ‘bit’ me at that time and stayed with me. 

 

Then in high school, I took a course in US history too, but the focus was especially on California history. I grew up in San Jose, California, and the teacher was keen on rooting things in California, to see what was happening in the country at-large, but also to put light on the California experience in particular. I remember being stunned by learning about the robber barons in California and the making of the California economy. 

 

After that, I was stunned by learning about the Japanese internment camps all around the state. I’d heard nothing about that until high school and was not aware of what happened to Japanese Americans in California. It was like a veil was ripped off my eyes and I was suddenly seeing California history, and hence American history again in a totally new and harsh light. It was very important to see that.

 

I was very proud to be a Californian. For me, the state represented all that was new and modern and open and exploratory, and then to learn about how power was exercised in California and the nature of racism against Japanese Americans was incredibly disillusioning, but also exciting. It stimulated my mind and imagination to want to explore these kinds of events. Why do things happen? Why do things not happen? By then, as well, as I was in high school, the news of the magnitude and horror of the Holocaust emerged, and that was awfully shocking. How could people who believed in democracy – we were dealing with the Weimar Republic then – support Nazis and Hitler? Why did the Italians support Mussolini and the fascists? At that point, I thought, “Oh, Americans would never do something like that,” and I was fascinated by the ability of people to suspend their alleged democratic values and support cruel dictators – autocrats.

 

I was my high school’s representative to the Model United Nations which is a phenomenon – I don’t know when exactly it began across high schools in this country – and they’d hold many UN conferences. I did so again in college. I remember strongly [being] assigned to represent South Africa. How to understand the apartheid regime?

How did this happen? Why do people oppress others? Why do people persecute? Then how do people respond? I’d say my first introduction to white supremacy was not Jim Crow in America (which I studied very little) or even the genocide of Native Americans. It was South African history; realizing what had been done to Africans, I then shifted my gaze to see the counterparts of those same kinds of behaviors in the United States.

 

By the time I graduated high school, I knew I wanted to teach history, but originally, I thought I wanted to teach social studies in high schools because that had been so formative for me in my growing up. 

 

My college experience not only reinforced my desire to teach history but also to eventually want to teach at the college level. In my first year in college, I was required to take as a history major (you had requirements then in the 50s) an undergraduate course in Western civilization. It was a two-semester course and I was fascinated by the history of ideas. Why, again, interpretation, explanation – why do people think what they think, create what they create? 

 

From stunning gothic cathedrals and polyphonic music to hideous crusades and awful witch-hunting and witch-burning and Inquisitions. Then to study the Reformation and realize that the intolerance did not end with the Reformation. Denominations became very intolerant too, except for the Quakers – and at the same time, you’re learning about the magnificence of Renaissance art and literature. It’s a complex experience you’re having of greatness that you can really esteem, admire, revere – and then really gory, gory, ghastly and awful stuff.

 

By the time I was a senior (I attended UCLA as an undergraduate; I graduated in 1958), my interest in history had expanded. It had become a question of the very possibility of explanation and interpretation, given the limits of mortal minds. If I’m always asking the question, “Why?” – do I really think I’m going to get a conclusive answer? I became engrossed, increasingly, by the question of rethinking and re-feeling a person’s choices, the ‘biographical’ of history. R. G. Collingwood argued that “One can enter the thinking process of Caesar contemplating the crossing of the Rubicon, and one can re-enter what he was agonizing over in his mind.” 

 

My professor at UCLA also introduced us to the thinking of Wilhelm Dilthey, the German philosopher and historian who claimed that we can enter the feelings of others, not just the mind, and we can do this through imagination, through empathic methods, to discover another person’s present or past experience. But this got me into all kinds of hot water when I was teaching. Later, I’m teaching courses on African American history and I’m having some of my students and their parents saying, “How can a white woman teach the experience of black women?” This is a women’s history course. I actually was teaching the course jointly with a black male colleague in African American studies at UMass, but the people who were complaining were saying, “What do we have here? You have a man teaching about women’s lives; we have a white woman teaching about black women’s lives – how dare they? They can talk the talk but they can never walk the walk.”

 

You have to be able to connect the “walk the walk” with the “talk the talk.” There were a lot of those challenges, and that’s still an issue of identity, knowledge, and identity politics as it enters into both learning and teaching and scholarship which remains very controversial. 

 

This further directed me into a field that became very important in my past couple of decades as a historian, and that is doing oral history. Most of my scholarly writing (with some areas of exception) has been biographical studies. I’m fascinated by people’s subjectivities: documents, letters, diaries, memoirs, autobiographies, and then, in time, oral history.

 

The concept of empathy is very problematic and one that I uphold, but with many caveats. My most recent historical biographical focus has been on Edith Stein and her dissertation, completed in Germany in 1917, on the nature of empathy. I’ve published a lot, not just on empathy as she thought about it; I’ve also written about empathy myself, independent of Stein’s writing.

 

In graduate school, I continued without concern, but I became intrigued by the ironies of past phenomena, the trails of changes, the ambiguity of historical changes and historical events. For example, I studied the American Revolution under the remarkable historian Edmund Morgan at Yale. As you study the American Revolution close-up, looking at evidence and asking yourself, “What does this evidence say? Is this evidence credible? How does this evidence compare with other evidence from other colonies?” and the like, something emerges: I began to see how the effects of the American Revolution were both liberating and oppressing. It liberated and oppressed Native Americans. Women were worse off after the revolution than they were before in many ways; it was a genuine setback for women.

 

The complexities of change made me think very hard about the nature of even progressive social and political movements, in the past and in our time. The ambiguity given the ironies of technological revolutions, be they the Industrial Revolution or our present-day Digital Revolution. That was very absorbing to me. The other dimension of graduate training which I tried to do a lot with when I taught graduate studies courses is the nature of historiography. How do we understand the history of scholarship on a topic, whether the scholarship is on the American Revolution, World War I, the Vietnam War? It doesn’t matter what the movement, the event is, but you’re going to have a history. You’ll discover a history of scholarly interpretation. Why do those interpretations appear when they do? What is the nature of dialogue and debate among historians on any particular matter? That became fascinating to me. 

 

My dissertation, however, harkens back to my junior high school [years]. It was the impact of the Vietnam War on me. I wrote on the history of British pacifism between 1914 and 1945; that is, what is the nature of conscientious objection, of pacifist movements in England during the interwar era, beginning first with World War I and taking it up to World War II? I was intrigued by various figures who were leaders in the pacifist movement. I became fascinated even before the Women’s Movement gripped me, by a leading female English pacifist, Vera Brittain. After that, and at the time, of course, I’d been swept up by the Women’s Movement and committed to doing as much scholarship as I can on women.

 

My focus shifted to Red Cross nurses on both sides of the conflict during World War I. Vera Brittain was a Red Cross nurse in England and had written so many letters, diaries, everything about her life. Then I began working on Edith Stein on the German side. Looking from a comparative national perspective on their thoughts, their values, their choices – that captivated my interest. I should say, at the same time, the South African apartheid motif that captivated me in high school also reappeared in my book on a major South African progressive thinker, Olive Schreiner. She wrote fiction and nonfiction; she was absolutely an extraordinary woman. I still work on her. I just recently had an article of mine published in The Journal of Commonwealth Literature on Olive Schreiner.

 

The global focus that I became wrapped up with in high school, as well as the issue of war and peace, continued throughout my career, evolving and surfacing in different ways – that’s the evolution.

 

What is the purpose of history and who are its intended consumers? Does the historian have a social responsibility?

 

We get back to some of what I said initially, but I can also underscore my views by talking about what I see as the mission of history. As a historian, one part of that mission is to promote the ideal of an educated human being, however you define that. I mentioned earlier what I thought were the key elements: the liberated mind, heart, and imagination – an active mind, one capable of thinking critically about oneself and about whatever evidence is in front of you, and also of thinking compassionately.

 

The second would be that we have an enormous need in our nation, now more than ever, for an enlightened and compassionate citizenship in our democracy. If we’re going to keep democracy alive – never in my life have I felt it more in peril than I do now in our country – then we need students, at every age up through graduate school, to take the courses and do the thinking and acquire the critical thinking skills, the mental equipment, and the emotional equipment to become enlightened citizens and voters.

 

I’d add a third thing: to be a world citizen.

 

It’s not just enough to be a good American citizen; one has to be a good world citizen. This means being able to think of the many ways in which our nation is interdependent and intertwined with other nations, and to look at things beyond our nation. What is the welfare of humanity? What is the welfare for everybody? Consider the extent to which people are willing to think about climate: the climate is your climate, but that’s also very much self-interest. I’m not saying that thinking globally does not have a self-interest component, but to be a world citizen means sometimes subordinating one’s own national interests because something larger is at stake.

 

We’d like to have maybe a really strong competitive economy, but if it means that it’s going to be at the expense of nations in Africa or nations in Latin America, let’s think twice about that. Let’s think about ways in which we can help one another, we can collaborate, we can cooperate, and not just think, “What serves America’s interests?” first. 

 

Those are, to me, the key missions of a good education. History, if it’s done right and well, is particularly prepared to help people reach those goals.

 

For those who aren’t as aware of world history, how far do you go back in the written record to provide an overview?

 

I find it to be terribly hard. I’d never want to teach a world history course. I know high school teachers are challenged to do that. I’d probably take a particular moment – maybe two – in the history of the world. It may be a war, it may be the Industrial Revolution, it could be the Women’s Movement, it could be anything, it doesn’t really matter. 

 

I’d use that particular moment to unpack the nature of global interconnection and the interdependency and need for one another, and to build enormous respect and understanding for other cultures. It might be music history; it might have something to do with the Silkroad group which is still so important, that has been going around our country and other countries performing. 

 

If I had no choice and I had to teach a course like that, I’d look for a particular event or moment or person, maybe someone like Mahatma Gandhi, and use that as the lens through which I could bring in an infinite array of other ways of looking at how world history has taken place. India: That’d be very exciting – to study the nature of Buddhism in Indian culture, of the immense brilliance of that culture, its own art, its own philosophy, and then at the same time, look at British imperialism as it colonized India. What positive hybridity emerged from that, as well as what got lost? What happened as a result that was not good?

 

I’d look at the issue as it was understood in India and England, when India was able to have a successful war of independence. Why did India get to be independent? The whole question of the independence movement in India is still yet to be sufficiently probed.

 

COVID right now is going to offer historians of the future a huge trove of possibilities for exploring international relationships and the way in which we have to think about the future. I would’ve liked, last night, if [during his State of the Union speech] Biden had been more open about the need for us to shift all this unused AstraZeneca and everything else that we have over to India, which is suffering so keenly, and to bring home to Americans that unless the pandemic is halted in India or anywhere else, there will be new variants, and to explain why that is. I don’t think a lot of people understand the science. It is again where the historian must be multidisciplinary. The science of viral development tells us that if a virus is active, it will have more mutations, and if it has mutations and variants, they will come back to the United States and the vaccines that we love so much won’t be sufficient.

 

We have an enormous responsibility to the world – there’s a self-interest [component] but I’d like to think we have a compassion for the deaths and the suffering that’s going on elsewhere on this planet. I don’t hear much about that.

 

What are the challenges facing the historian, the discipline of history, and historical production today?

 

I think that there are many challenges. I don’t know nationally, but [in] many parts, many states, we are facing a demographic cliff. That is, there are fewer and fewer students entering colleges, and unless we expand immigration, which doesn’t look likely, alas, fewer and fewer 18-year-olds means that colleges and programs are going to have to close. On top of that, we have the magnet of the digital revolution and where the jobs are. There aren’t jobs for historians like there were maybe after World War II. We don’t have the practical opportunities.

 

History has been very resilient. The history profession has found more and more ways to grow and expand itself into new directions. Public history is a very exciting area of our department’s and university’s development, working in consultation and collaboration with people in film and theater and all kinds of areas – but that’s public history. The prospects for a young person who wants to be a history professor, or even a history teacher at the high-school level, are not good. At the high-school level, some degree of history will always be required of students. The college level? Not likely.

 

We have so diluted the requirement to learn how to think historically by saying, “Well, if you can take a history course, take history of economics. You don’t have to take it in the history department. Or maybe you can take a history of music course and that’ll satisfy your history requirement.”

 

People who teach the history of economics or music, some of them are very fine historians, but on the whole that’s not going to introduce students to the study of history and its use: becoming an enlightened democratic citizen. I do think that we’re facing a major crisis here.

 

As I understand it, at our university, I don’t know if it’s true [elsewhere], when students are asked what they plan to major in when they enter the university, very, very few say “history.” The other area that history majors used to be interested in is going into law. There’s now a glut in law schools and so being a history major, even thinking of becoming a lawyer is now no longer so wonderful an ambition as it was maybe 10 or 20 years ago. There’s a problem there. We’re watching, in other words, the humanities and the arts always very much in jeopardy, and I don’t know what history can do. We’re really facing a very difficult situation. The only way around it – because I feel history is vital – is to reintroduce the requirements. 

 

I was one in the Sixties when I first started teaching who fought university requirements! I thought, “People should take courses that they want to take, not what they’re required to take,” but I’m re-thinking that. I want students to have a proper liberal arts education, and a proper liberal arts education requires that they take a full year of history.

 

Particularly, if you can’t offer a global history course, the Western Civilization course, or Eastern – it doesn’t matter, it could be a Latin American history course. Any kind of history course that will introduce students to the complexities of thinking and arriving at opinions, and exploring what may or may not be true in the historical past. That can be done by a good professor in any area of history. They did it for Japanese history.

 

I’m somewhat a ‘creature of the 1960s’ revolution in education; there’s still something to be said about having first-year courses be topical and explored from multiple disciplines so that the vision that you have… for example, nearby us at Amherst College and other places, where you take some field of knowledge, an event, some phenomenon, and you bring to bear – maybe Black Lives Matter. A whole semester on Black Lives Matter in which you look at this movement in terms of the history of anti-racial movements, of the experience of African Americans. You look at it in terms of performance theory, in terms of journalism and media, how social media has represented it. You analyze all of that, you look at it from as many angles as possible, as many disciplines as possible, and one analyzes, explores, thinks, and argues about it, studies it as a kind of Freshman curriculum. 

 

That would be part of the vision we all had in the 1960s of what learning should be all about, which never got realized because every faculty member had vested interests in their department. Their departments didn’t want to lose faculty members.

 

Is there a historical event or series of events that captivates you most of all?

 

It’s kind of indiscriminate. Any time I read or hear about or discover something, I want to study it. It captivates me. But I’ll give an area that wasn’t biographical and first-person, one that was a motif of my study of history and my publication work – reproductive rights.

I’m fascinated by, upset by, and emotionally wrapped up in the whole history of women’s struggle for reproductive justice, which is what we call it now – reproductive choice. The right to have a baby, the love of having a baby, and the right not to have a baby, and how that choice is contingent on race, on social class, on time, on space – so many variables. I’ve done a lot of lecturing on that. I’ve also done a lot of lecturing on the 15th Amendment, which was, after the Civil War, the amendment that enfranchised African American men, until the policies of states and cities began quickly to deny black men their right to vote.

 

That caused an enormous split in the Women’s Movement between those who supported the 15th Amendment even though it did not enfranchise women – it made only men, as a category, capable of voting – and those who opposed it for that reason. They’d thought that at the end of the Civil War there was a vision that maybe we would become a real democracy, and women would be able to vote and have the same legal rights as men of any color and women of any color.

 

That vision was naïve, and the dispute split the Women’s Movement in half. It took decades to recover, and there’s an enormous legacy that has affected relationships between white and black women ever since. Something like that, and its long tail, its enormous legacy – it’s the sort of thing I’ve given a number of talks on and that continues to grip me. I oscillate, I go back and forth between various characters of the time and biographies, and my attitudes and values about what they’re saying and doing.

 

There really isn’t any that wouldn’t captivate me. There’s nothing, nothing in human experience that if I’m exposed to sufficiently and read about and become familiar with, that I wouldn’t want to be the historian of that particular event or phenomenon.

 

Can you speak to any influential ‘strands’ of history which shaped and molded civilization through the ages?

 

I’m fascinated by the history of what we call religious faiths. Part of my study of Edith Stein had to do with her exploration as a philosopher of the relationship between religion and theology, between religion and spirituality, between religion and philosophy and history. I had a colleague who’s now retired who taught the history of religion – I think that’s a grand course to teach, and any moment in the history of religion or any movement within the history of religion and antiquity from the earliest cultures, whatever anthropological material there is that can be used, is absolutely exciting!

 

Stein was talking about the longevity of human experience, and the whole conflict between faith and science. Religion and science as we’re witnessing it today is precedent upon precedent upon precedent. It’s not just the monkey business. It’s absolutely fascinating.

 

In the course of your career, has your overriding teaching philosophy evolved as time went on?

 

That’s an interesting question in terms of pedagogy. Being so much influenced by the 1960s, I envisioned an idealized classroom in which the faculty member was the facilitator, with students engaging each other in the exploration of topics; it would be a totally collaborative team effort, with enormous give-and-take.

 

Unfortunately, for many reasons, students weren’t prepared to do that in college. I don’t know whether those teaching social studies in high school had a better experience. What I found is that sometimes I was lucky: in any given semester, if I was teaching two or three courses, I’d have classes where students were engaged, and I’d have many classes where they weren’t, where they just wanted me to lecture. They’d say, “I pay my tuition; you’ve done all this reading. You’ve gone through all the scholarship. We want to hear your thoughts. We don’t want to hear each other; we don’t know anything!” It was very disappointing because I had so idealized, back in the 60s, the fruitfulness of that kind of really open and self-critical exchange of ideas within a classroom.

 

As this happened more and more, I did find myself slipping into the lecture mode when I didn’t really want to do much lecturing. But then students would say that they loved it and were really stimulated. It was rewarding – and they shouldn’t have been rewarding me – but that reward ended up contributing to my letting go of some of the pedagogical idealism I had in the 1960s.

 

I haven’t really changed in my views about the imperative of multidisciplinary approaches. I’ve always taught history from many angles. I’ve always taught history as inclusive. I was teaching black women’s lives, lesbian lives, in the 1970s when I was enveloped in women’s history. I do think that in the last twenty years the challenge to me – one I haven’t resolved, but I’ve brought it up in class – is: who are women? That is, what is women’s history? If we are abandoning the binary and we are getting rid of the notion of male history and female history and that sort of thing, and we are understanding that identity is so fluid and variable, and we have a world filled with people who don’t identify as women and don’t identify as men, then is women’s history now outdated? Do we redefine it? A gendered history would mean I’d have to teach the history of transsexuals, I’d have to teach the history of men, and I’m a women’s historian!

 

What does this all mean for me? I’m still tackling that one, but I always bring it up, I’ve done it during the last ten years of teaching. I tell students, “You’re here at a women’s history course but gee, in this classroom, there may be some people here who are taking it who are men who became women, or women who became men or is this all women’s history too? What are we talking about?” 

 

It’s much better to develop one’s mental toolbox, one’s skills, than to necessarily master huge bodies of knowledge. If I were to continue teaching my women’s history courses (the two-semester sequence should really be four to six semesters), I’d just choose a few moments in time each semester. Rather than try to build a narrative – even though this comes at the cost of understanding certain levels of continuity and change – I’d probably be very selective and use those particular moments in time or those particular people as the lens through which to study and to develop one’s mental equipment.

 

What moments in history could you use to teach a course?

 

I could teach Edith Stein, I could teach Olive Schreiner (many of the people I’ve written about), Vera Brittain. Any one of those people I could teach an entire course on and let that person’s life become a lens through which to learn how to think critically, how to evaluate evidence, how to think about continuity and change. How to look through a bifocal lens in which you try to understand a person’s experience where they don’t know the future and they’re making choices based on what they see now as opposed to us looking back retrospectively and realizing what happened by […] rather than by choice.

 

Edith Stein examples: Her decision to convert to Catholicism in 1921, and her decision to become a nun in 1933. Those have become absorbing tidbits of her life that people have written about at length. For me, the choice would be when she decided (she’s German) to leave the University of Breslau, where she’d done her undergraduate work in history to go to the University of Gottingen to study with a particular philosopher, Edmund Husserl. 

 

When Husserl left Gottingen and went to Freiburg, she followed him and went there to study. I came to understand, through a lot of reading, that that was typical in that time (though not for women): students, particularly graduate students, followed the faculty members they were working with, whose particular approach – in this case philosophical; Husserl was the founder of phenomenology – they wanted to study, wherever that person was going. Let’s say it was Martin Heidegger; wherever he was teaching, that’s where students would go. They might’ve studied with him in one place, and if he then went to another campus, they followed him to that campus.

 

I’d like to study that phenomenon, and her life, of making the decision to leave her family in Breslau to go to Gottingen, a considerable distance in 1914. 

 

The other example is when Edith Stein became a Red Cross nurse during World War I. That might be even better as I’m thinking aloud. I’d begin the course with her positioned to become a Red Cross nurse, and to do nursing on the German side and how that impacted her, how she started thinking cross-culturally. She was dealing with war victims from Hungary, from Italy, from Poland, and she then began to shed her kind of ‘flag-waving’ German perspective. 

 

Do you have any advice for aspiring historians?

 

Ask questions, as many as you can, and ask them from every discipline. Don’t let your mind close any issue. Keep everything open. 

 

What have you been doing since your retirement?

 

I retired at 75 because I had been putting off, for years, wanting to be a musician. Music was my second interest; at one point – I had studied piano when I was in high school – I really wanted to teach piano, but I didn’t. Once I became more interested in other things, and with family and everything, I had no time. I promised myself that I’d retire when I was still healthy and well so that I could return to piano studies.

 

By that time, I was also interested in composition and wanted to see if I could write or compose anything. I’m often taking composition lessons, and I sing music with The Choral Society. Music was the motivation for my retirement. I’ve brought music and history together rather recently: the piece I wrote for Past@Present, our department’s e-journal, that came out this past summer was a review of John Eliot Gardiner’s brilliant volume. It’s a huge one on Johann Sebastian Bach, and Bach is my hero. I can’t get enough of playing and listening to his work.

 

That’s part of it. Then upon retirement, I knew that I’d have time to write. I was always feeling the crunch: because I’m such a committed teacher, I had too little time – maybe a summer or a sabbatical – to do something significant, because every week was occupied with [teaching] classes, grading papers, mentoring students, and the like.

 

I’ve probably published more during my retirement than I did before I retired, and I’m still at it. In fact, I’ve just been contracted to write what’s called a companion to Edith Stein’s autobiography, which is being published by a branch of Rowman & Littlefield called Lexington Books. They’re doing a series of little booklets or treatises called Edith Stein Studies. I’m not sure of how to dub these studies of different aspects of her life and writing. 

 

That’s my summer work ahead, but I’ve gone to conferences, given talks in Poland and Germany on Stein, on reproductive rights in Berlin. I’ve participated in history conferences in various places, and I had the time to prepare for the talks, time to write essays, and did some publications. That has continued; it’s the major struggle of my life, finding a balance between the music I want to do and love and that I wanted to do upon retiring, and continuing my love for scholarship and doing the work that I’m doing on various topics in history.

 

Because of COVID and the pandemic, the past year has been a bit different. We’ve not been able to visit with family. My family is dispersed; I have a son and a grandchild in Vancouver, a son and grandchildren near Philadelphia and Swarthmore, and then I have a sister in California. People are scattered and I’m not flying and they’re not flying, and so we’re not seeing them.

 

During the years of my work on Stein, we went to Germany pretty frequently and I kept trying to speak and learn German. It’s such a difficult language and I continue to take weekly German lessons. I have a tutor who comes and we really just converse in German as best as I can (I stumble through). The other thing about retirement, besides the fact that with the pandemic we’re not seeing family, which we’d ordinarily do, is that I’ve had more time with family members. When I retired, my mother was still alive, and I knew that she wasn’t going to live much longer, and I really wanted to be able to spend some time with her before she died.

 

The other is that with the pandemic, [I live] in a neighborhood with gardeners, and they’re remarkable. I don’t have a strong enough back to do anything that they’re doing but I do have a gardener who helps me, and I’ve been spending time gardening, and that’s been very nurturing for me. I’ve loved it.

 

Reading a lot. I have much more time to read – joy reading – reading I’d never have time for when I was teaching full-time, and a lot of the reading is in the history of music, biographies of composers, but it’s also crime fiction, which I love. I’m a Louise Penny fan; I’ve read every novel she’s ever written, so there’s a lot of that too!

What Does the Perot Phenomenon Tell Us about Andrew Yang's Third Party Prospects?  

 

Mass discontent is growing once again in the United States. In a recent survey conducted by AP-NORC, 85 percent of Americans said the country is headed in the wrong direction. That is the highest percentage recorded in the last 50 years, and it makes one wonder what the political establishment, or anyone else for that matter, is going to do about it.

 

In a recent New York Times opinion piece, columnist David Brooks suggested that “if there ever was a moment ripe for a Ross Perot-like third candidate in the 2024 general election, this is that moment.” Brooks is by no means speaking in hyperbole, as the majority of the American people seem to be in agreement with him on the issue of a third party or outsider candidate. Back in February, Gallup released a poll which showed that 62% of U.S. adults said that the two major “parties do such a poor job representing the American people that a third party is needed.”

With third parties and Perot’s name being thrown around at this critical moment of volatility in American politics, it seems worth remembering Perot, his movement, and his political party, the Reform Party.

 

Perot’s first major foray into politics was his 1992 campaign for the presidency. One important thing that his candidacy showed was that when the political establishment only seems able to agree about what many may view as the wrong or least important issues, a slot can open up for someone with the right resources and message to gain momentum. Luckily, Perot had both when he chose to run as an independent candidate in 1992. He was a tech billionaire who used his own money to finance his campaign (lending credibility to his claim to be a “free agent”), and his campaign was laser-focused on the seismic shift away from a manufacturing-based economy to a service economy and the consequences of that shift for the American middle and working classes. Because Perot had the money to spend on things like 30-minute infomercials, did not and would not take any money from special interest groups, and had a message that deeply resonated with a good portion of the electorate, he was able to do better than any third-party candidate in a general election since Theodore Roosevelt’s Progressive Party run in 1912.

 

By the time of Perot’s second campaign for the presidency in 1996, he had founded the Reform Party a year earlier. The party was intended to be a real bulwark against the direction in which the country (and the world at large) was headed, defined by the outsourcing of U.S. manufacturing jobs, ballooning federal debt, and rising inequality. Perot and other Reformers were also interested in addressing problems of corruption within the political establishment which had diminished the level of “real” democracy in the United States. They wanted to “return to government for the people” by ending the two-party system to create a more “ethical” government. The party generated a lot of interest for a while, even showing up on the radar of then-real estate mogul Donald Trump, who considered running as a Reform candidate in the 2000 presidential election.

 

The only successful candidate the party ever produced was the former pro-wrestler and Governor of Minnesota, Jesse Ventura. While one might assume that a Reform Party candidate winning a statewide election would have been welcome news for the party, it proved more internally divisive than anything else because of the candidate representing it. Ventura’s campaign for governor of Minnesota was defined by crassness, lewd behavior, the exploitation of celebrity status, and a lack of knowledge of the basic functions of the state’s government. In short, it was a far cry from Ross Perot’s issues-based campaign in 1992, and it is likely because of this that Perot did not openly support Ventura’s candidacy for governor in 1998.

 

After Ventura, however, the party imploded in 2000 over disputes about who would run as the presidential nominee once Perot counted himself out. With figures like Ventura representing the party in government, and speculation about Donald Trump running as its presidential nominee, it is not hard to see how the party became divided and effectively useless within four years. But what the Reform Party and Ross Perot show us is that certain outsiders can come along to meet the moment. When there is widespread disillusionment with the two-party system among the American public, leaders who step up and not only speak to that feeling but offer a real alternative to the mainstream of American politics can reinvigorate the electorate and build trust with their movement. Perot’s voters in the 1992 election said that, compared to Bush and Clinton, he was the most “trustworthy” and “really seemed to care.” How many politicians today enjoy that kind of reputation?

 

This kind of political outreach was not just confined to Ross Perot in the 1990s. However much Perot may have disliked his campaign style, Jesse Ventura was able to engender a real sense of trust between himself and his voters. This is not to say that Ventura’s brash personal style and “stick it to the man” campaign mantra were not also major factors in his appeal, but he came off to voters as someone who thought like them, who was them, and would therefore govern with their interests in mind as opposed to the interests of the political elite. As political scientists Stephen I. Frank and Steven C. Wagner put it, “Ventura seemed to be a mirror image of [his] voters.”

 

A perfect example of how this manifested during his campaign was recounted by Ventura in a 2015 interview with journalist Graham Bensinger. Ventura explained that when he was asked during a debate how he would work with the IRRRB (Iron Range Resources and Rehabilitation Board) and replied that he didn’t know what that was but “if it was important” he’d “learn about it,” the crowd erupted in applause. He understood what many Minnesotans wanted: someone who was a genuine break from career politicians, someone who wouldn’t lie to them. The fact that he didn’t know about the IRRRB wasn’t a red flag to many voters. Rather, his willingness to be forthcoming about his lack of knowledge on the subject made him a more trustworthy and appealing candidate.

 

Perot’s and Ventura’s campaigns are models of what a successful third party candidacy might look like. However, if the last six years of American life have taught us anything, it is that moments of mass discontent are not quelled overnight, in one election, or by one candidate. Thus, the lesson of Perot, Ventura, and the Reform Party may be that any third party or outsider candidate willing to step up to meet our current moment needs money, resources, and a strong message, of course, but also the stamina to be in it for the long haul. Perot had all the money and air time he needed, and the Reform Party was on the right side of enough issues for many Americans to get behind it, but neither had the fortitude or patience to build something that could outlast them. Likewise, Ventura had a unique political style and a genuine connection with his voters that helped land him the governorship, but he served only one term and has spent his subsequent years as a media personality, eventually ending up on RT America dressed like Dog the Bounty Hunter to monologue about conspiracy theories.

 

The most high-profile attempt at a third party today is Andrew Yang’s “Forward Party.” Yang may be trying to build a third party movement right now, and he may well succeed. But “Not left. Not right. Forward” has so far not proven to be a message that resonates with or invigorates the public in any way resembling the Perot or Ventura campaigns. If Yang, or anyone, wants a third party movement to succeed, they need to listen to the Americans who feel the most unseen and who are the most prepared to mobilize for a candidate they believe in. That is what Perot did, and it was part of Ventura’s strategy as well. They reached out to those voters and framed themselves and their campaigns as “the only real choice.” Pragmatic centrism, which so far seems to be the rallying cry of Yang’s Forward Party, feels less like a real break from traditional politics and more like the campaign message of any number of Democrats from the last 30 years.

The Roundup Top Ten for July 29, 2022

Why "Life of the Mother" Exemptions Vanished From Abortion Restrictions

by Mary Ziegler

Exceptions to allow abortions to protect women's lives are popular. They are being eliminated from new abortion bans because the people pushing those bans distrust both women and doctors. 

 

"They Want Your Children!": Right-Wing School Panics Seek to Repeal Modernity

by Rick Perlstein

"Reactionary panics about what children learn in school are about as old as time. And they won’t ever go away."

 

 

Trump's Reelection Threatens a Politicized Civil Service

by Heather Cox Richardson

Since 1883, the federal civil service has been protected from the old spoils system by rules for merit-based hiring and promotion. Trump has threatened to revert to a system of rewards for loyalists and punishment for enemies, without regard for performing the public's business. 

 

 

Till's Accuser's Memoir Shows the Pandora's Box She Opened has Never Closed

by Peniel E. Joseph

"What does it say about America that we are still in search of justice for the victim of an almost 70-year-old crime that helped spark the modern civil rights movement?"

 

 

What Conservative Justices Get Wrong About the Founders

by Timothy C. Leech

It's preposterous to argue that the Founders, men of the Enlightenment generation, would have intended for the constitution they drafted to be immutable and unchanging. 

 

 

Can America Apply Lessons from HIV/AIDS Crisis to Deal with Monkeypox?

by Dan Royles

Public health debates on monkeypox need to look at the history of health messaging about HIV/AIDS to focus on the communities of gay men currently at risk while avoiding triggering homophobic responses and stigma.

 

 

The Irish Lesson: Abortion Bans Won't Stop Abortion

by Fintan O'Toole

The recently overturned Irish constitutional ban on abortion and the recent attack by conservative Americans on abortion rights have a common intellectual champion in Notre Dame's Charles E. Rice. The Irish learned the hard way what followed. 

 

 

Young Faculty Refusing the "Free Labor" Their Predecessors Performed Have Their Reasons

by John Warner

Faculty used to operate in a gift economy of unpaid labor supporting peer review, journal editing, and writing letters for tenure reviews. Now that institutions have withdrawn the possibility of that work being cashed in for job security, why should any faculty bother with it? 

 

 

What Does an Electric Makeover Mean for the Car of the Counterculture?

by Jill Lepore

The new electric VW bus seems to lack the charm of the vehicle of the counterculture, reflecting changes in technology and society. 

 

 

A Brief History of the Vatican and Western Canadian Missions

by Roberto Perin

"Residential schools and the papal bulls justifying the fallacious doctrine of discovery call out for concrete acts of atonement and reparation on the part of the church."

 

For America, Let's Send the Adults Back to School

The nation's first Lyceum was founded in Millbury in central Massachusetts in 1826

 

 

Each day seems to bring dire assessments of the poisonous state of America’s political culture and urgent warnings that our fraternal animosities and passions may soon erupt into widespread violence.  Prominent American historians have taken to the pages of major newspapers to express concerns that our political divides are beginning to resemble those that rent the union in the 1850s.  In her new book How Civil Wars Start, UC San Diego political scientist Barbara Walter concludes that the United States is an anocracy, a country that occupies a perilous position between autocracy and democracy and which is ripe for internal conflict. Anxieties are growing too among foreign observers of American politics. Recently, Tom Homer-Dixon, a respected Canadian intellectual, urged his nation to begin contingency planning for a crackup of the American polity.  Ordinary American citizens are apparently seeing through the same dark prism as experts. A recent CBS poll finds that 62% of Americans believe that in the aftermath of a future presidential election, the losing side will resort to violence. 

 

What our public discourse lacks, however, are practical proposals to dampen intense political partisanship and provide our nation with a path away from what far too many of us have already concluded is an irrepressible conflict. It is incumbent on all concerned citizens, regardless of political orientation, to support efforts to promote mutual understanding and take the first halting steps toward reconciliation. I have no illusions that such a path will be quick, easy, or without controversy. What I envision is a concerted, bipartisan, 50-state, multi-year approach that draws its inspiration from the lyceum movement of the 19th century and is modeled on the summer seminars that thousands of American teachers, myself included, attend annually.

 

The lyceum movement arose in the United States in the 1820s and was predicated on the assumption that ongoing education was critical for the civic health and prosperity of the community. In towns across a still largely agrarian nation, adults were provided with opportunities for self-improvement through hearing lectures and debates by scholars and attending classes on a variety of topics. The often parochial world of small town life in the 19th century was enlarged through the creation and maintenance of lyceums across America.

 

How much less parochial and isolated are we than our forebears? How much are we in need of enlargement ourselves?  It has become a tragic truth of our times that conservatives and liberals have become sorted geographically, with the former inhabiting mostly rural areas and the latter clustered in cities.  To make matters worse, experts warn that we are increasingly living in narrow informational silos, our views circumscribed by the network we watch or the algorithm that steers our social media accounts.  We should take a cue from the lyceum movement and get serious about designing institutions to counter the pernicious provincialism of our own times. 

 

We already possess a contemporary template for how this new lyceum movement could be built. Every summer, organizations such as the National Endowment for the Humanities and the Gilder Lehrman Institute of American History bring together cohorts of teachers to study different topics at sites across the nation. Over the years, I have been fortunate to attend seminars on John Steinbeck, the history of the transcontinental railroad, and the Pacific Theater during World War II. In a typical summer, scores of such seminars are conducted, allowing thousands of teachers to come together to learn from renowned scholars, to share ideas with one another, and to renew themselves by temporarily throwing off the constraining routines of their own lives.

 

I propose that we replicate this model, but instead of bringing teachers to sites across the country, we bring everyday adult citizens. Citizens would be selected through a formal application process, and travel costs, meals, and accommodations would be paid by philanthropic organizations and perhaps pledge drives conducted on our ubiquitous cable news shows. Each state could host one of these citizen seminars, inviting 50 people, split more or less evenly among Republicans, Democrats, and independents, to a weeklong event. Participants would live, eat, recreate, and study together in a communal setting such as a college dormitory or convention center.

 

Ideally, states would select a venue within their borders that appeals to our collective identity as Americans and reminds attendees of the bonds that still unite us. National parks and sites of historical significance would make ideal places to learn together and soften obdurate hearts. During the week, attendees would discuss assigned readings about history, civics, and contentious contemporary debates. There is no dearth of pressing topics to consider. One day could be devoted to the problem of race in American history, another to the Second Amendment and gun policy, a third to our basic constitutional architecture. Imagine the conversations that could ensue, and the mutual respect and trust that could be fostered, if citizens simply had a structured forum to share their views with civility.

 

The overriding goal of the week, though, would be teaching participants about the central role that norms play in a democratic republic. Sadly, many citizens have concluded from their studies of the American system of government that the U.S. Constitution alone is sufficient to safeguard our democracy and ensure our cherished rights. To borrow from the 19th-century poet James Russell Lowell’s words on the centennial of the ratification of the Constitution, there is a deeply entrenched faith that the document’s mechanistic checks and balances render it a “machine that would go of itself.”

 

However, in their book How Democracies Die, Steven Levitsky and Daniel Ziblatt underscore the importance of norms, which they define as unwritten “shared codes of conduct” akin to calling your own fouls in pickup basketball or respecting the do-over on a close call in a neighborhood wiffle ball game. Establishing which norms participants agree are essential to our politics, and then agreeing to practice them throughout the week, could have a transformative effect on the nation as participants fanned back out to their homes and shared their experiences at their supper tables, their places of worship, their workplaces, and on the internet.

 

Well-respected university professors would deliver lectures each day to contextualize the readings and answer questions. Local schoolteachers could then help facilitate discussions and activities encouraging participants to learn from each other and to share how they arrived at their own political sensibilities. Teachers also have hard-won experience mediating disputes, calling people back to their better selves, and maintaining order. Our republic is indeed fractious, but I am certain that teachers could ensure a weeklong truce between the Don’t Tread on Me crowd and their woke adversaries.

 

Samara Klar, at the University of Arizona, has conducted research that validates the merits of the plan sketched above. She has found that participants in bipartisan discussion groups were more likely to modify their views and find areas of common ground than those in homogeneous partisan groups. What’s more, they reported enjoying the experience and wanting more such interactions.

 

Good citizenship is a learned behavior that requires practice. A 50-state, 50-participant program would send 2,500 citizens back into their communities annually with a better understanding of public policy and political ideology, and, more importantly, imbued with a new spirit of civic virtue. Teachers know that very few students leave a classroom without being changed for the better. Let’s send the adults back to school.

The Thirty Meter Telescope Project Exemplifies Scientific Progress and Indigenous Dispossession

Hawaiian cultural practitioner Joshua Lanakila Mangauil and other protectors discuss their blockade of the road to Mauna Kea, part of protests of the Thirty Meter Telescope project, with Mayor Billy Kenoi, October 7, 2014. Photo: Occupy Hilo, CC BY-SA 2.0

 

The Thirty Meter Telescope belongs to a new class of extremely large telescopes expected to enable new scientific discoveries through its ability to observe deeper into space. The TMT corporation was founded in 2003, and in 2009, following approval from the State of Hawai’i Land Board, the dormant volcano of Mauna Kea was chosen as the preferred site for the construction of the telescope. The protests against the Thirty Meter Telescope in Hawai’i are not mentioned or included in the timeline provided on the TMT’s website. It instead describes only a temporary stand-down issued by the Governor of Hawai’i in 2015 and the selection of La Palma as an alternate site in 2016. It is important to highlight that the official website’s timeline glosses over the reasons behind the construction delays, stating in 2016 that a “new contested case hearing begins in Hawaii” and in 2019 simply that “efforts to restart construction stall.” Although the TMT website does not detail the protests that occurred at the Mauna Kea construction site, they are well known and well documented by various news outlets.

In late 2021, construction of the Thirty Meter Telescope on Mauna Kea was delayed for a couple of years. The TMT’s chief of staff cited the need to heal the relationship with the community and to recover from delays caused by the pandemic; the project also needed additional funding from the US government to proceed. Despite this stated desire to heal the relationship with the community, construction is still projected to continue after the delay, regardless of the concern of Indigenous peoples over the intrusion on and destruction of sacred land. While the relationship with the community is cited as a reason for the delay, the need for additional funding and the wait for the results of the federal Decadal Survey are likely the main causes of the postponement.

According to the Office of Hawaiian Affairs’ website, Mauna Kea is a deeply sacred place regarded as a shrine for worship, a home to the gods, and the piko of Hawai’i Island. Mauna Kea is also a critical part of the ceded lands trust that the State of Hawai’i must protect and preserve. In 2015, the OHA engaged the state and the University of Hawai’i in an extended mediated process to resolve the mismanagement of Mauna Kea, but was unsuccessful. The OHA’s concern with protecting this site thus extends beyond the construction of the telescope, and there is a long history of mismanagement of this protected land.

The existing Caltech Submillimeter Observatory on Mauna Kea is set to be decommissioned, its astronomical instruments having been removed back in 2015. For perspective, the removal of this existing Mauna Kea telescope, scheduled for later this year, has drawn attention to the necessity and difficulty of a safe removal. While Indigenous Hawaiians have sought the decommissioning and removal of the telescope from their sacred site, the removal process can be just as dangerous as construction. The ecological challenge of safe removal is captured in the multitude of studies conducted, including an archaeological assessment, a cultural setting analysis, a hydrogeological evaluation, and more. This frame of reference demonstrates the extent of the telescopes’ impacts, both in endangering the natural environment and in their long-lasting effects on Indigenous sacred sites, effects that persist even while an intrusion is being removed.

Beyond the largest challenge, funding, protests also succeeded in halting construction of the TMT in both 2013 and 2018. Because of the challenges posed by Hawaiians protecting their stolen land, the TMT project selected La Palma in the Canary Islands as a backup site. The TMT’s website cites the evaluation of four potential backup sites, but does not explicitly list which locations were under consideration apart from La Palma. The Spanish conquest of the Canary Islands was completed in 1496, and the Indigenous population, the Guanches, is widely and incorrectly believed to have been completely wiped out in the conquest. This assumption parallels the common assumption that Native Hawaiians went extinct; in fact, despite intermarriage with white Europeans and Americans, Indigenous lineage persists in both places. That the backup site chosen is another colonized site is telling. La Palma was colonized by Spain, so the backup option would still place the telescope on Indigenous lands stolen by a colonizer. While the protests may have posed enough of a challenge to force the TMT project to consider construction elsewhere, they have not penetrated the white settler colonial logic that refuses to see Indigenous lands as anything other than its own possession.

What other sites were considered as alternatives besides La Palma, in light of Indigenous land claims? Were Indigenous land rights and potential dispossession even considered in the search for alternative sites? While the consideration of an alternative to Mauna Kea comes as a relief to the Native Hawaiians who hold the volcano as sacred and under their protection, transferring the project to another site that was also stolen from Indigenous peoples does not fix the underlying problem: the assumption that colonized land belongs to the colonizer on the premise of its violent dispossession, and that the original inhabitants’ ties and connections to the land are invalid. This is not to discount the importance of further scientific discovery, but to draw attention to questions about the location and disruptiveness of new scientific instruments and how they affect the land and the communities tied to it. Clearly, even in the removal process, scientific instruments can be dangerous and harmful to the land and its ecological systems, and so it is important to recognize that Indigenous communities have relationships to the land that Westerners, whether American or European, may not fully understand.

While there are Indigenous land protectors in Hawai’i fighting for Indigenous land rights, other Indigenous groups lack such protectors. When an Indigenous group does not have the ability to fight for its rights, does that mean we ignore those rights or pretend they do not exist? This leads one to ask several questions about the TMT project. Are there no other valid locations suitable for the construction of the Thirty Meter Telescope? Might we not find other land on which to construct the telescope, land whose use is not predicated on the dispossession of Indigenous peoples? Ongoing scientific study and discovery are both positive and necessary, but if we are to keep expanding our scientific understanding of the world, we must find a way to reconcile that progress with the recognition and understanding of Indigenous rights and ways of life. Without such recognition and understanding, are we even making progress?

 

The Republicans' Holocaust Problem

This article was originally published by Tikkun: The Prophetic Jewish, Interfaith and Secular Humanist Voice for Social Justice, Environmental Sanity and Peace.

US Rep. Marjorie Taylor Greene (R-GA) with white nationalist and America First PAC founder Nick Fuentes, 2022

 

Republicans are having difficulty deciding how they should think about Nazis and the Holocaust. They deny actions they have publicly taken, propagate and then delete messages, verbally promote and legislatively limit teaching about what the Nazis did. They seem confused, but aren’t. Some Republicans cozy up to Nazis. Some Republicans, often the same ones, call Democrats Nazis. Many Republicans across the country are attacking the foundation of Holocaust teaching. These three arms of Republican behavior around the Nazis have a single result: to trivialize the Holocaust.

Embracing Nazis always makes news. Carl Paladino, Republican nominee for NY Governor in 2010, Trump’s NY campaign chair in 2016, and current House candidate, is simply the latest fascist advocate. In a radio interview last year, which somehow did not become public news until this month, he praised Hitler: “He would get up there screaming these epithets and these people were just, they were hypnotized by him. I guess that’s the kind of leader we need today. We need somebody inspirational. We need somebody that is a doer.” Paladino combines admiration for Nazis and old-fashioned American racism: in 2016, he hoped that Barack Obama would die of mad cow disease and suggested that Michelle Obama be “let loose in the outback of Zimbabwe where she lives comfortably in a cave with Maxie, the gorilla.”

The overlap between conservative Republicans and neo-Nazism has a long history. Former Nazis and neo-Nazis were founders of the Republican Heritage Groups Council in 1969, which excluded Black and Jewish Americans. Some Republican candidates in the 2018 elections were open Nazis, white supremacists and/or Holocaust deniers: Vox said 5, the Forward said 9. Illinois Rep. Mary Miller approvingly quoted Hitler the day before the January 6 riots, and recently won the Republican primary.

More Republicans stand next to Nazis without themselves praising Hitler. Arizona Republican officeholders and candidates appeared at a 2021 rally organized by Matt Braynard, former director of data and strategy for Trump’s 2020 campaign, which featured as a speaker Greyson Arnold, who calls Nazis “the pure race” and supports the neo-Nazi group Stormfront. Idaho Lt. Gov. Janice McGeachin appeared this year at the America First Political Action Conference, which is hosted by white nationalists who express antisemitism and deny the Holocaust. She posed for pictures with Holocaust denier Vincent James Foxx. Georgia Rep. Marjorie Taylor Greene stood proudly next to Nazi-sympathizer Nick Fuentes at the same conference, where he later praised Putin and Hitler.

White supremacy has become integral to Republican messaging. A Twitter employee in 2019 argued internally that getting rid of racist content would involve deleting Republican Party messages, including Trump’s: “on a technical level, content from Republican politicians could get swept up by algorithms aggressively removing white supremacist material”. Prominent Republicans who have openly promoted the “white replacement theory” that Democrats are trying to replace real Americans with ethnic minorities in order to win elections include Texas Lt. Gov. Dan Patrick, Wisconsin Sen. Ron Johnson, and House Republican Conference Chair Elise Stefanik. FOX’s Tucker Carlson has been the most vocal propagator of this theory. German Nazis could not have been so bad if our political celebrities want to take selfies with their American cousins and parrot their racist nonsense.

It only seems contradictory that for many Republicans, including those who happily consort with American fascists, “Nazi” is a favorite label for politicians and government employees they don’t like.  Donald Trump, Jr., in 2018 said the Democratic Party’s 2016 platform was “awfully similar” to Nazi Party platforms. Doug Mastriano, the Pennsylvania nominee for governor, compared Democrats’ gun control proposals to the Nazis in 2018 and again this year. In June 2021, Pennsylvania Rep. Scott Perry said Democrats were like Nazis who want to destroy America. Even though Trump’s most notable achievement was the development of a vaccine, Republicans as a Party have criticized every government effort to save lives through masks and vaccines. Colorado Rep. Lauren Boebert called government advocates of vaccinations “needle Nazis” and “medical brownshirts” in front of a cheering CPAC crowd in July 2021. Sen. candidate Josh Mandel in Ohio in April 2021 and Ohio Rep. Warren Davidson in January 2022 compared our government’s health policy to the Nazis. Lara Logan, a host on Fox News Media’s streaming service, said in November that Anthony Fauci “represents Josef Mengele”.

Marjorie Taylor Greene denounced the media for comparing Republicans to Nazis in May 2021, then said the Democrats were the “national socialist party”. When Nancy Pelosi announced rules in May 2021 requiring unvaccinated members of the House to wear masks on the chamber floor, Greene said on a Christian Broadcasting Network program: “You know, we can look back at a time in history where people were told to wear a gold star, and they were definitely treated like second class citizens, so much so that they were put in trains and taken to gas chambers in Nazi Germany. And this is exactly the type of abuse that Nancy Pelosi is talking about.” After the American Jewish Congress tweeted back, “Such comparisons demean the Holocaust”, she insisted: “I stand by all of my statements; I said nothing wrong, I think any rational Jewish person didn’t like what happened in Nazi Germany, and any rational Jewish person doesn’t like what’s happening with overbearing mask mandates and overbearing vaccine policies.” She was so convinced of her imagery, she used it the next week in a tweet about one company’s vaccination policy: “Vaccinated employees get a vaccination logo just like the Nazi’s [sic] forced Jewish people to wear a gold star.”

Greene is not demeaning the Holocaust. Playing with Nazis, calling her opponents Nazis, and comparing herself to Jewish Holocaust victims all serve to diminish the Holocaust. Republicans are attempting to remake the Holocaust into a normal political event. If America’s doctors are like German Stormtroopers, if requiring one’s employees or our members of Congress to follow the most obvious public health rules is like murdering thousands of Jews and others every day for years, then the Holocaust as a singular event has disappeared.

Weeks later Greene apologized. As one of the most public faces of the Republican Party, she had gone one step too fast in pursuit of the Party’s goal of normalizing the Holocaust.

The Holocaust is a dangerous subject for American conservatives, because it was the mass murder of Jews by Christians. A few prominent Nazis espoused crackpot theories of Aryan paganism, and Polish Catholics and Russian Orthodox Christians were also slaughtered in vast numbers. But the murder of 6 million Jews was the culmination of centuries of official Christian persecution. Teaching about the Holocaust should begin with the Bible and must explain the violent antisemitism of nearly all Christian denominations right into the 20th century. Anti-Jewish racism was embedded in Christian European and American societies and their legal systems in order to uphold the supremacy of white Christians. The recognition of Christian responsibility for Western antisemitism and the Holocaust led every Christian denomination in Western Europe and America after 1945 to repudiate centuries of their own dogma.

The wave of Republican censorship of public school and university curricula in response to the sudden American reckoning on race after George Floyd’s murder purports to be about “critical race theory”. When Florida’s Board of Education banned “critical race theory” from public school classrooms one year ago, the Board seemed to protect Holocaust education by also banning any teaching that denies the Holocaust. But their language points in the opposite direction. Critical race theory “distorts historical events” by asserting “that racism is not merely the product of prejudice, but that racism is embedded in American society and its legal systems in order to uphold the supremacy of white persons.” The Holocaust was caused by precisely such embedded white supremacy. And like American anti-Black racism, that white supremacy had deep roots in official Christianity.

I have seen my students become uncomfortable when confronted with facts about Christian persecution of Jews and Nazi admiration for American Jim Crow legislation in the 1930s as a model for the Nuremberg Laws. The American eugenicist Madison Grant, whose 1916 eulogy for Nordic supremacy was entitled “The Passing of the Great Race”, was equally popular with American segregationists and Adolf Hitler, who called the book his “bible”. My students were disturbed by the realization that German Jews, from the passage of the Nuremberg Laws in 1935 until the Nazis invaded Poland in 1939, were treated essentially the same as African Americans here, whose racial persecution continued unabated into the 1960s. That same knowledge frightens today’s right-wing Christians across the Western world. The Christian nationalist parties in Europe all seek to diminish the Holocaust, especially the role played by Christians in their own nations: those in power in Poland and Hungary, and those trying for power in Germany and France.

The literal wording of recent Republican censorship laws bans education that doesn’t exist. The fake narrative that critical race theory is taught in public schools is the basis of this wave of legislation. A different and broader invention imperils Holocaust education: the claim in Wisconsin’s 2021 law that it is necessary to forbid teachers from indoctrinating their students with the idea “that one race or sex is inherently superior to another race or sex and that an individual, by virtue of the individual’s race or sex, bears responsibility for acts committed in the past by other individuals of the same race or sex.” That kind of systematically damaging pedagogy was in fact integral to American education for centuries. The long racial reckoning which began in the 1960s demonstrated how white supremacy was written into all levels of educational curricula. The claim that American racism is over, the foundation of the attacks on critical race theory, ignores the continuing power and weight of adult Americans who were subject for years to those curricula, as I was.

Any hint that a teacher is promoting racial or gender superiority is likely to be called out without any help from new laws. The Republicans are not anxiously hunting for hidden examples of white supremacy or male superiority. That’s what they promote. They want their supporters to believe that they will reveal and defeat the teaching that blacks are superior to whites and that women are superior to men, exactly the kind of fake crisis that dominates the politico-cultural war.

Over years of interacting with teachers of the Holocaust, I never heard of any who told students that they bore “responsibility for acts committed in the past by other individuals of the same race or sex”. Holocaust teachers do mention that this was precisely what Christian churches had been saying for centuries about Jews. Such claims were fundamental to murderous persecution. But inducing guilt in today’s students is hardly useful in teaching history.                                                                                 

The discussions during the Republican effort in Louisiana to ban critical race theory display how the right-wing ideology of the Holocaust plays out at the state level. Republican state representative Valarie Hodges sponsored a bill in 2021 to mandate Holocaust education in Louisiana. Hodges was an avid promoter of the idea that Democrats are as bad as Nazis. She was part of the effort of conservative Republicans in the state to require the teaching of patriotic themes in American history and to block more teaching about America’s racial history. Hodges brought a Metairie resident to testify about the dangers of “communism” in our government: “To put it in Holocaust terms, the communists are now the Nazis and we are the Jews. They are the predators. We are the prey. We need to teach this history to our future citizens so we don’t end up like the Jews.” No Jewish organizations testified in favor of Hodges’ bill. The executive director of the American Historical Association, Jim Grossman, speaking for professional historians in America, recognized the ultimate goal. “You’re saying, ‘You have to teach the history of the Holocaust, but you can’t teach the history of institutionalized, deeply embedded racism in the United States.’”

Rep. Ray Garofalo, the head of the Louisiana House Education Committee, sponsored a bill barring teaching about institutional racism. He then slipped and said the right-wing truth: any lessons about American slavery should include “the good, the bad, the ugly”. Garofalo’s other unprofessional antics made him such an easy target that the Republican Speaker of the House removed him as chair and replaced him with another Republican. All the bills about mandating and preventing subjects in Louisiana public education ultimately failed.

The legislative history of Republican censorship in Arizona offers similar clues about what the issues are and what will be attempted in the future. Arizona Republicans in the state legislature are unanimously in favor of putting an amendment to the state’s constitution before the voters. The bill’s lengthy section B enumerates seven varieties of fake complaints about non-existent educational practices. The key is section A: teachers in public schools from elementary to high school “may not use public monies for instruction that promotes or advocates for any form of blame or judgment on the basis of race, ethnicity or sex”. The bill’s sponsor, Michelle Udall, argued that, “If a teacher can’t teach [history] without placing blame or judgment on the basis of race, they shouldn’t be teaching.” She was clear about what she meant: it will be okay to say that a mass murder in a Buffalo grocery store happened, but it would “not be appropriate” to say that the mass murderer was a white supremacist. Her bill would ensure that such teachers could be personally punished. Republicans in the Arizona House and Senate unanimously voted in favor. The bill was signed into law as part of a budget whose main item was a tax cut for better-off Arizonans.

How does one teach the Holocaust or slavery without detailing the responsibility of particular human groups for inhuman treatment of fellow humans of other groups based on racist ideologies?

Conservative politicians can count on well-funded organizations to create the local crises around curriculum that alarm enough parents to get educators fired. Nearly 900 school districts, educating one-third of all public school students in the country, were targeted by anti-CRT efforts from September 2020 to August 2021. The most thorough study of the nationwide campaign against teaching about race concluded: “The anti “CRT” effort is a purposeful, nationally/state interconnected, and locally-driven conflict campaign to block or restrict proactive teaching and professional development related to race, racism, bias, and many aspects of proactive diversity/equity/inclusion efforts in schools, while — for some — gaining political power and control. The conflict campaign’s loudest, most powerful voices caricature actual teaching and stoke parent anxiety in a quest to control both schools and government.”

The real danger that Republican curricular censorship presents to Holocaust teaching is not the occasional eruption of stupidity, as in Southlake, Texas. Texas House Bill 3979 requires teachers to present multiple perspectives when discussing “widely debated and currently controversial” issues. Gina Peddy, the executive director of curriculum and instruction in the Carroll Independent School District in Southlake, told teachers, “Just try to remember the concepts of 3979 . . . make sure that if you have a book on the Holocaust, that you have one that has an opposing, that has other perspectives.” That caused a small scandal. Despite posing for photographs with Holocaust deniers, Republican politicians don’t yet demand that Holocaust denial become part of the curriculum.

But when Holocaust denial comes from within the community, from antisemitic parents, the new laws make teaching difficult. A North Carolina teacher wrote: “My SUPERINTENDENT asked us to advise students to ‘ask your parents’ rather than insist that the Holocaust was real. We received professional development to help us navigate this political environment safely. Our superintendent attended and told us to advise kids to ‘ask your parents’ instead of try to show evidence to a child whose family swears the Holocaust didn’t happen.” 

New Republican laws and their emboldened approach to white supremacy will inevitably lead to an attack on any Holocaust teaching which goes beyond the discussion of prejudice to analyze the power of embedded racism and Christian white supremacy.

For Republicans, teaching the histories of America and of the Holocaust is too dangerous to allow. Such teaching causes intellectual, and then social, disturbance. Both histories explain the role of embedded racism in Western society and its disastrous consequences. The Holocaust is over, and Christian nationalists all over Western society have been calling for Jews to get over it. But American racism and sexism are not over. The success of the Black Lives Matter and #MeToo movements in demonstrating the continuing influence of male and white supremacy has frightened Christian conservatives. They are using the inevitable discomfort of students learning that their predecessors committed genocide to try to sanitize the history those students will learn.

The American Association of University Professors and the American Historical Association, along with other educational organizations, released a statement in June 2021 opposing the new rollout of bills restricting the teaching of history. The statement focuses entirely on “the role of racism in the history of the United States”. Thus far, Holocaust teaching has suffered only collateral damage in the Republican war against American history. But without trivializing Holocaust education into anodyne lessons on intolerance, Republicans will never be able to cover up the historical truth that critical race theory foregrounds: racism has been and may still be embedded in American life.

Today teachers of American history are the targets of Republican censorship. Holocaust teachers, you’re next.

 

Learning About Stalin from His Books: An Interview with Geoffrey Roberts

(L-R) Kliment Voroshilov, Maxim Gorky and Josef Stalin, October 11, 1931.

 

 

 

Professor Geoffrey Roberts has just published Stalin’s Library: A Dictator and His Books, diving deep into the books, reading habits, and intellectual disposition of this key figure of the twentieth century. Aaron J. Leonard recently corresponded with him to discuss the book.

 

_______

 

Aaron Leonard: As your book makes clear, Stalin was a passionate reader. Could you talk about the breadth and depth of his personal library?

 

Geoffrey Roberts: There were about 25,000 books, pamphlets and periodicals in Stalin’s library by the time of his death in 1953.  The plan was to retain the collection intact as part of a Stalin Museum based at his main Moscow dacha, the model being Lenin’s former residence at Gorki on the outskirts of the Soviet capital. By the time Lenin died in 1924 there were nearly 9,000 books in his personal library. But Stalin’s personal effects, including his books, were dispersed after Khrushchev denounced the dictator at the 20th party congress in 1956. However, party archivists retained in storage some 5,500 texts that were identifiably Stalin’s, either because they contained his ex libris stamp or he had marked them in some way. They also kept several hundred books inscribed to Stalin by their authors. The remnants of his library only became available to researchers beginning in the late 1980s.

 

Stalin was a devoted reader of fiction as well as non-fiction. His book collection reputedly contained many thousands of novels, plays and short stories. Unfortunately, Stalin didn’t habitually mark literary works, nor did they bear his library stamp. But he did have a lot to say about the kind of fiction he liked and about his preferred authors, which enabled me to write a chapter in my book about Stalin and Soviet literature.

 

The books you find in the surviving remnant of Stalin’s library are the texts you’d expect to find in the collection of a devoted Marxist with particular interests in history, philosophy, economics and politics. Most of these books’ authors are Marxists and socialists but Stalin was happy to take ideas and information from anyone, including political opponents such as Leon Trotsky. Stalin’s favorite historian was the non-Marxist Robert Vipper, who specialized in early Christian history. Vipper also wrote a book that changed Stalin’s view of Russian history —a defense of Ivan the Terrible as a patriotic state-builder.  The Tsars had done a lot of terrible things, Stalin told his closest associates in 1937, but they had also built a vast and strong state, which the Bolsheviks who now controlled it had a duty to defend.

 

Another state-builder much-admired by Stalin was the “Iron Chancellor,” Otto von Bismarck, whose memoirs he read.  Marx, Engels and Lenin were also greatly interested in Bismarck, particularly his “revolution from above” that had unified Germany. As Arfon Rees has pointed out, more than one Stalin biographer has compared that Bismarckian revolution to Stalin’s state-driven modernization of Soviet Russia through accelerated industrialization and the collectivization of agriculture. But I also think Stalin was interested in Bismarck as a practitioner of realpolitik in the sphere of foreign policy. There were a lot of books on diplomacy, war and international relations in Stalin’s collection.

 

As an internationalist, Stalin’s interests were global, and he collected and read books on many different countries – China, Japan, India, Mexico, the United States, Britain, France, Italy, Germany, Ireland and numerous others.

 

Apart from Russian, Stalin’s grasp of foreign languages was very limited. There were hundreds of books in English, French, German, Italian and other languages in his collection but there is no evidence he read any of them. Stalin’s library was overwhelmingly a collection of Russian-language texts plus a few books in his native Georgian.

 

Stalin annotated as well as read in Russian and there are nearly 500 books that contain his pometki (markings), though mostly in the form of underlinings rather than words.

 

 

Stalin’s religious training seems to have left a lasting impression. I remember reading in his later work, Economic Problems of Socialism, his challenge to a comrade that invoked the concept of sin, “To equate a part of the means of production (raw materials) with the means of production, including the implements of production, is to sin against Marxism…” Allowing for his use of exaggeration, this does suggest a view that the tenets of Marxism were — as in religion — unassailable. How do you see his early training in regard to his larger ideological framework later in life?

 

As you know, Stalin was educated in a church school in his home town of Gori and then in an Orthodox seminary in Georgia’s capital, Tbilisi, where he trained to be a priest. He spent a decade immersed in Christian education before he rebelled against the Church and gave up the priesthood to become a professional revolutionary. Stalin liked to reminisce about reading Marx’s Das Kapital but my guess is that there is no book he read more thoroughly than the Christian Bible.

 

Stalin’s belief in Marxism was certainly very dogmatic in the sense of the ideology being deemed true and beyond question. So, you could see the teenage abandonment of his childhood Christian convictions and his transition to Marxism as swapping one faith for another. But that was not how Stalin saw it. For him, Marxism was rooted in reason and science, and the knowledge it produced was empirically verifiable. He believed Marxism was irrefutable because it was true but it could, in principle, be disproved, unlike religion, whose eternal truths were based on faith and revelation.

 

Stalin used religiously infused words and phrases throughout his life – as many convinced atheists do – but I don’t think this signified any deep or enduring influence of his religious upbringing, except in one important respect: Stalin’s Christian convictions were deeply emotional, and the same was true of his Marxism and Communism.

 

In the book I propose that we should see Stalin as a feeling intellectual for whom ideas had an emotional as well as a rational resonance. It was the emotional force of his belief system that underpinned his intellectual commitments and enabled him, as a person, to sustain decades of brutal, dictatorial rule that resulted in the deaths of millions of innocent people. The pious young Stalin’s religious sensibilities can be seen as the inception of the kind of political intellectual that he later became.

 

 

Flowing from that, Stalin gave the utmost importance to raising the theoretical level of party cadres—you write about the premium he put on publishing the History of the Communist Party of the Soviet Union: Short Course—as a tool to rectify the Party after the Great Purge. Putting aside, if that is possible, how disturbing that is, Stalin was not simply a dogmatist. As you point out, he saw the study of history as a science based on evidence. To what degree did Stalin employ critical as opposed to dogmatic thinking? Or, perhaps better put, how did he reconcile the two?

 

Being a dogmatic Marxist did not blind Stalin to reality or deprive him of the power to reason and engage in critical self-reflexivity. Indeed, he always claimed to be a creative Marxist, someone whose ideas changed and developed as history progressed and the realm of human experience expanded. That said, Stalin did not like to admit mistakes and was fond of blaming others when the practical implementation of his ideology went awry. As Fidel Castro put it, if socialism had defects these were the result of people, not the system or the ideology.

 

As David Brandenberger and others have shown, Stalin was the prime author of the Short Course, which is an interesting example of his dogmatic yet creative Marxism. On the one hand, it is a sectarian, self-serving history of the party that trashes Stalin’s socialist political opponents as traitors – a narrative that cannot withstand serious empirical scrutiny.  On the other hand, Stalin also strove in this text to counter-balance the excesses of his personality cult. In editing and composing the book he reduced considerably his personal presence in its pages because he wanted people to love and commit to the party as an institution. He also wanted to arm party cadres with knowledge of Marxist theory that would insulate them from harmful bourgeois influences and enable them to correctly interpret and implement party policy.

 

I’m not saying that Stalin succeeded in these aims but the book was read and studied in depth by millions of Soviet citizens, including his son Vasily who had to sit an exam based on it – which he passed with flying colors.  Stalin also gave a copy to his daughter Svetlana to read, but she found it too boring!

 

On something of a lighter note, in describing Stalin’s marginal markings you write, “among his choice expressions of disdain were ‘ha ha,’ ‘gibberish,’ ‘nonsense,’ ‘rubbish,’ ‘fool,’ ‘scumbag,’ ‘scoundrel’ and ‘piss off.’” Could you talk about how he used marginalia and what it tells us about him?

 

Those are examples of negative expressions that Stalin used, but he could also be positive and enthusiastic about texts. Indeed, by far his most frequent annotation – in documents as well as books – was NB [note well], which he wrote in Latin script. Stalin read to learn, not to sound off. Lenin was his top author, but he was willing to learn from anyone, including arch rivals and sworn enemies. Stalin’s emotional engagement is apparent from his annotations, as is his complete fidelity to Marxism: in the many thousands of book pages marked by him in the privacy of his personal reading, there is not the slightest hint of any doubt about his chosen politics and ideology.

 

Most of Stalin’s pometki consist of underlinings of sentences and paragraphs and lines in the margin. He also liked to number points that he picked out from the text. Stalin’s non-verbal markings show what was interesting and important to him and how systematic and engaged a reader he could be. Technically, there is nothing special about Stalin’s pometki. Lenin’s were quite similar. Indeed, they look like those of anyone who marks their books, including my own!

 

When the remnants of Stalin’s library became accessible to researchers, there was a rush to find smoking guns that would reveal all about Stalin as a person, especially his motivation for the Great Terror. But while Stalin’s marginalia are interesting, and often intriguing, they mostly confirm what we know from many other sources, for example, that he was a devoted disciple of Lenin. The explanation for the Terror is not to be found in Stalin’s marginalia but hiding in plain sight in the politics and ideology of ruthless class struggle in defense of the revolution and the pursuit of socialism.

 

What Stalin’s pometki do show is that he was a serious intellectual who had a rich reading life, a lifelong learner who took seriously the Bolshevik admonition to revolutionize your own mind as well as society.

 

Finally, to get contemporary, since the Russian invasion of Ukraine, there has been a proliferation of comparisons between Stalin and Putin. For example, Stalin biographer Simon Sebag Montefiore wrote recently, “Putin’s repression at home increasingly resembles Stalinist tyranny – in its cult of fear, rallying of patriotic displays, crushing of protests, brazen lies and total control of media – although without the mass deportations and mass shootings.” While I don’t think Montefiore’s example is particularly compelling — it could describe the behavior of any number of dictatorships, past or present — I’m curious about how you see the comparison between the two?

 

After Stalin’s death Soviet socialism became far less authoritarian and violent but it remained recognizably the system he had created. Putin was born and brought up in that relatively relaxed post-Stalin system. Like most Soviet citizens he accepted its values, ideology, politics and socio-economic structures. He was a member of the communist party and served in the KGB. But when the USSR collapsed in 1991 he reinvented himself as a pro-capitalist liberal democrat and later transitioned to a conservative and more authoritarian politician.

 

The great political continuity between Stalin and Putin that I see is their espousal of multinationalism–their shared concept of a state based not on ethnicity but on the patriotic loyalty of its citizenry. In that regard, the Soviet multinational state still exists in the form of the Russian Federation. Like Stalin, Putin is determined to defend that state against foreign foes, including through the expansion of its borders.

 

Putin is without doubt a very well-read politician. Like Stalin, his favorite reading is history and fiction. He has published respectable, albeit controversial, articles on the origins of the Second World War and on the history of Russia-Ukraine relations. His speeches are peppered with references to history and to literary classics. He keeps a copy of Lermontov’s poetry on his desk and claims you can’t understand Russia without reading its great literature.

 

Putin is also greatly interested in ideas, especially those of conservative Russian philosophers who provide an alternative worldview to both Marxism and liberalism. But I don’t get the impression that his engagement with their ideas is particularly deep, or that he is a systematic thinker like Stalin, let alone a theorist or ideologue.

 

As an intellectual, Stalin’s politics were shaped by his utopian ideology and by his profound belief in the transformative power of ideas. He was an intellectual in power. Putin is a more conventional politician, albeit one with a deep love of reading.

 

Geoffrey Roberts is Emeritus Professor of History at University College Cork and a Member of the Royal Irish Academy. His previous books include the acclaimed Stalin’s Wars: From World War to Cold War, 1939-1953 (2006) and Stalin’s General: The Life of Georgy Zhukov (2012), which won the Society for Military History’s Distinguished Book Award. Stalin’s Library: A Dictator and His Books was published by Yale University Press in February 2022.

 

 

At 75th Anniversary, What Can Anne Frank's Diary Teach Today's Teens?

 

 

Anne Frank, who died in a concentration camp after hiding for over two years in a secret annex behind her father’s office, is arguably the most famous child of the twentieth century.

 

Her diary, first published in Dutch on June 25th, 1947, has been translated into more than 70 languages and reconfigured as a Tony Award-winning Broadway play, an Academy Award-winning film, and a graphic novel. The famous text has inspired teenage diarists, from Sarajevo in Bosnia and Herzegovina to Long Beach, Calif., and is among the books most frequently read by those who are incarcerated. Imprisoned for 27 years, Nelson Mandela read the diary so many times the volume fell apart and he had to copy passages on toilet paper. And after the film version of a young adult novel, “The Fault in Our Stars,” featured the annex as backdrop to the first kiss of two teenagers dying of cancer, the Anne Frank House became the third most visited museum in the Netherlands.

 

Because it stops abruptly three days before Anne’s arrest, some criticize the diary as an incomplete Holocaust narrative that replaces the horrors of the death camps with adolescent angst. And yet who better than an adolescent to speak to teenagers in our own time of anxiety and uncertainty, of deep political division, of exposed inequity? 

 

In seeking to address both contemporary antisemitism and racism, the Anne Frank Project at Loyola University New Orleans guides teens in understanding not only the terrible history of the Holocaust, but how to apply those lessons today. The exhibit, “Anne Frank: A History for Today,” was created by the Anne Frank House in Amsterdam thirty years ago; our copy was loaned to us by the Anne Frank House’s American partner, the Anne Frank Center at the University of South Carolina.  The exhibit is designed not only to provide historical facts, but to use peer docents to encourage discussion regarding tolerance, inclusion, racism, and human rights.  

 

The middle schoolers we train may have heard of Anne Frank or of the Holocaust, but, for them, it is ancient history, long ago and far away. It has no relevance to their lives or to the big challenges in New Orleans, a majority Black city with one of the highest murder rates in the country, where 40% of adults are illiterate, one in six children experiences food insecurity, and gun violence is endemic. If our goal (and it is certainly mine) is to teach not just about but through the Holocaust, to address contemporary challenges, then Anne’s story provides an important point of access.  

 

The diary is, indeed, not a story of the death camps, but of life in hiding for a girl persecuted because of her identity. Addressed to “Kitty,” the diary is friend and comfort, a space where a 13-year-old girl can explore her turbulent emotions in a time and place of unbelievable stress, anxiety, and, frequently, terror. An active, self-described “chatterbox,” Anne is not able to go outside for over two years. She is forced to remain still and silent from 8 a.m. to 5 p.m. each workday, unable to flush the only toilet until the workers leave the building. Confined to 450 square feet with seven other people (including a 50-something dentist who shares her bedroom), Anne expresses the frustrations of communal life under severe constraints, her sense of injured merit at being treated as a child, her longing for true companionship, and, beginning in 1944, her evolving feelings for Peter, in hiding as well. And, always, there is the fear of being discovered. Amid the quotidian anecdotes of selfishness and claustrophobia, Anne makes vivid how terrifying their life is, particularly at night, when sleep is shattered by bombardments that send Anne, quaking, to her father’s bed, and during two break-ins that underscore their vulnerability.

 

Anne’s story takes on new meaning when it is explained to children by people their own age. The peer-led tours provide, of course, a powerful educational experience, but it is the docent training that is truly transformative, as teenagers – Anne Frank’s age – spend several days working together, building community and learning not only to provide tours, but how to be upstanders against prejudice and hate.  

 

Anne’s diary is powerful because she is “just like us,” or at least, in so many ways, like any adolescent trying to make sense of a world turned upside down. She speaks to teenagers quarreling with their parents or feeling misunderstood, teenagers who are hungry, who fear gun violence, who are despondent because their sexual identity is demonized in contentious public debate.

Understanding Anne in her larger context guides students in thinking critically and acting justly. She is extraordinary not because no other little girls were in hiding but because, in fact, so many people were hiding for their lives. Even as Jewish people were being dehumanized, demonized, and destroyed, Anne kept a witty, insightful diary that offers powerful witness: each of the 1.5 million murdered Jewish children was a unique individual.

 

In other words, the foundational lesson of Anne’s diary is the recognition of innate human dignity: to comprehend how the Holocaust could have happened, why genocide continues to happen, and how racist systems continue to be perpetuated, we must each move beyond statistics and the panoramic sweep of events to consider the perspectives, lived experiences, choices, and values of individuals.

 

Although some claim Anne’s famous affirmation, “I still believe, in spite of everything, that people are truly good at heart,” can only be read as tragic irony, Anne is also a figure of hope. Her decision in March 1944, after almost two years in hiding and when she was not yet 15, to rewrite her own diary as a novel to be read by an envisioned postwar audience in a liberated Netherlands, is a remarkable act of spiritual resilience that deserves to be recognized and celebrated, even if the author was murdered before she had a chance to complete the work.

 

By learning and teaching others Anne’s story, young people can embrace that hope, just as their connection to Anne can be a powerful reminder of the dangers of stereotypes, and that hatred and white supremacy left unchecked lead to death and destruction.  Her diary serves as a clarion call to recognize the innate human dignity not just in Anne, whom we know so well from her own words, but the millions of others killed in the Holocaust,  and the millions of children who are currently refugees, who live in hunger, who live in fear.  Anne’s story, like the Holocaust itself, inspires us to ask, not  “What would I have done?” but rather, “What will I do?”  

Abortion and Birth Control Have Always Been Linked

It’s been three weeks since SCOTUS overturned Roe v. Wade. To learn how the Dobbs v. Jackson decision could affect birth control access (and about birth control history in general), I spoke to Dr. Kelly O'Donnell and Professor Lauren MacIvor Thompson, two experts on reproductive history. Turns out, birth control is wrapped up with abortion in a bit of a “turducken” (you’ll see).

A condensed transcript edited for clarity is below. Paying subscribers to Skipped History can access audio of the full conversation here, as well as a written bit of Skipped History about the surprising big money, anti-abortion alliance here. This is the second in a series of conversations on how Roe fell, and where we go from here.

Ben: You’re both quite knowledgeable about the birth control movement and birth control history. To ground our conversation, let's start back in time with early views of birth control.

LMT: I can take that one. The biggest thing to know about birth control in the 17th, 18th, and 19th centuries, in colonial America and in the United States, is that for a long, long time women managed their own reproductive health. Care was not in the hands of physicians and it was not in the hands of the law. By and large, women were using herbs from their gardens, things like tansy and pennyroyal, which have grown in American gardens for centuries.

Ben: Skipped gardening history.

LMT: Yeah. They're using these herbs in teas, and women soak sponges in solutions and insert those. They use douching mechanisms after sex. So women are really managing their reproductive health from start to finish. And before the early 19th century, the vast majority of women's pregnancies and births would have been handled by midwives.

So this is really a woman's space, and it's not until the early 19th century that we begin to see a transition to men in the field. The earliest laws about abortion and birth control on the books in state legislatures really have more to do with the fact that there are unscrupulous business owners and manufacturers who are mailing out contraceptives and abortifacients, along with headache remedies or gout remedies, or you name it.

None of these remedies are regulated, and so legislatures start passing laws that try to prevent people from being poisoned. So they're not actually concerned about abortion. They're not concerned that women are regulating their fertility. 

KOD: Yeah, and it’s really in the mid-19th century when we see more of a focus on regulating the practice of medicine in addition to the safety concerns. We get Anthony Comstock coming along, whom I know you talked about in your last interview.

LMT: Right, states passed their own mini Comstock laws regulating contraception, pornography, abortion, sex education materials—anything that has to do with sex under the umbrella of “obscenity.” This is in the 1870s, 1880s, and those laws remained on the books until the late 20th century. In some cases they’re still on the books, they're just not enforced. Of course, now that Roe has been overturned, it’s anyone’s guess if they’ll be enforced again.

KOD: Getting back to birth control, this is why you can't really separate out contraception and abortion. Dating back to the 19th century, anything related to sexuality, and any kind of regulation of not having children, is wrapped up into one obscenity constellation.

Ben: An obscenity burrito.

KOD: Right, a turducken of reproductive options.

Ben: Interesting. So, physicians and moral crusaders like Comstock push abortion outside of the women's sphere. It becomes a regulated, male-dominated action...

KOD: Yes, by the end of the 19th century, going into the 20th century, all management of reproduction is going towards a male-controlled, medicalized model. And there were some benefits to that, right? For some women, hospital births were safer. Not getting infections and dying in childbirth was obviously an improvement (that is, for women who weren’t discriminated against in hospitals). 

So, undoubtedly, there are good things that happen. But alongside that comes—how do I say this in a way that's not super nefarious sounding—a sort of medical surveillance regime.

Ben: Totally not nefarious. 

LMT: Yeah. Between the 1940s and the 1960s, the immediate two decades before the Roe decision, that’s when we see the most cracking down by police forces and the legal system. Police forces become devoted to rooting out abortionists and also putting women on trial for seeking abortions. And so, for example, women ended up going to the hospital for some awful septic abortion, and the hospital committee and police officers would interview them on their hospital bed going, who did this to you? Why did you do this? And then the next thing you know, you're in court. 

And here’s a really important thing to remember that helps explain where we are today. There’s this popular idea that Roe enshrined women's rights, but it didn't.

Ben: In the sense that the decision didn't enshrine women's rights, but rather a right to privacy? How would you phrase that?

LMT: Well, if you read the text of the Roe decision and the way Justice Harry Blackmun words the key portions of the majority opinion, it's really about granting physicians rights and upholding physicians’ professional expertise. In fact, Blackmun says, the feminist movement has argued that “a woman's right is absolute and that she is entitled to terminate her pregnancy at whatever time, in whatever way, and for whatever reason she alone chooses. With this, we do not agree.”

In other words, he’s explicitly saying, this isn't about feminism. It's not about women's rights. It’s about this practical medical matter that women need to take care of in consultation with their (male) physicians and their (male) physicians need to have full professional authority to make those decisions without fear of getting arrested.

KOD: Yeah, the quick story that people have about Roe is oh, right to privacy, now women can have abortions and it's legal.

But that really collapses a lot of the complexity of the history of abortion, which is much more uneven than people realize. Dating back to the Comstock Laws, there’ve always been women who’ve had trouble accessing abortion—lower-income women and women of color in particular.

So one point to take away as we're facing this absolute chaos post-Dobbs is that, in a lot of ways, there’s always been this chaotic patchwork of women with uneven access to abortion. It steadily improved over the years, but now we’re descending back into chaos—in ways significantly worse, given the oversight and surveillance of people's bodies that are possible today.

Ben: This return to an earlier patchwork seems to suggest that maybe the shape of progress is less a straight line than... a rhombus, with lots of little unexpected and unwanted turns and pointy edges. Moving forward, how does the Dobbs decision affect birth control access?

KOD: Again, abortion and birth control have always been linked. Unsurprisingly, we’re already seeing increasing discomfort with birth control types like emergency contraceptives. Prescribers are also worried about things like IUDs because they can’t be sure if a court would view them as abortifacients and not contraceptives.

So I imagine we’ll see a lot of preemptive CYA (cover your ass) moves, just for fear that doctors are maybe getting a little too close to what some people in their brains think is abortion. 

LMT: Right, we’re also heading into an uncertain future regarding ectopic pregnancies, where the sperm meets the egg and implants accidentally in the fallopian tube, which is terrible and you bleed out if it's allowed to continue. Now, according to interpretations of some abortion laws, you may not remove that embryo even though it's implanted in a place where it’s going to kill the woman and itself. Legal experts warn that abortion laws can be interpreted to say that if you treat an ectopic pregnancy, that's the equivalent of an abortion. It is madness.

Ben: What would you say to people who want to reverse the madness?

LMT: I think we can learn from other countries. If you look at places like Argentina or Ireland, two very Catholic countries that have liberalized abortion laws, activists haven’t glommed onto that language of choice and citizenship.

Rather, they’ve attacked the pro-choice question from an empathy angle, pointing out that women will die. Their messaging is fundamentally different than how Americans are approaching this, and it’s been successful.

KOD: I agree, and generally, I think it’s important to remember that there are no simple answers. This history is complicated, and anyone who's trying to give you a black and white version of it, whether they're a Supreme Court justice or someone on Twitter, they’re wrong.

There's never really been a simple solution to granting (or removing) women’s reproductive autonomy. We would not be having this conversation if there were.

Ben: Okay, got it. Your reminders are nothing is simple and the past is chaos.

KOD: No! But maybe.

Ben: I will let you go from there. This was so fascinating. Disturbing but fascinating. Thank you both so much for your time.

LMT: Thank you.

KOD: Thanks for having us.

Putin's "Mad Gamble" Isn't Atypical for Oil Dictators

 

 

Four months after the Russian invasion of Ukraine, the future of the world has never felt so wide open. The military campaign in the Donbass calls into question both the stability of the global economy and the long lull in superpower conflict that has prevailed since 1991. In light of that, it is vital to decipher Putin's motives and try to figure out the possible end game of the Russian-Ukrainian conflict. To do that we need to start with what really defines Vladimir Putin as a leader.

Since his rise to power, Putin has single-mindedly focused on using the natural resources of Russia to catapult his country into superpower status. Putin utilized the revenue of oil and gas sales both to bribe the population (by raising wages and pensions) and beef up the capabilities of his armed forces. It would be no exaggeration to call Putin an oil dictator. However, he is not the first leader in history to build his legitimacy around oil revenue and there is much to be learned from the sad history of these regimes. Two salient examples that come to mind are the Soviet Union in the late 1970s and Saddam Hussein's Iraq.

Like Putin, these regimes made disastrous decisions to invade their neighbors. In 1979 the Soviet Union invaded Afghanistan. A year later, Saddam invaded Iran. Even though that step embroiled Iraq in a bloody and protracted conflict, a decade later, in 1990, Saddam sent his forces into Kuwait. Indeed, oil dictatorships are prone to exhibit belligerent foreign policies. But what do they hope to achieve by being aggressive toward their neighbors?

In all the cases listed above, the oil dictatorship aimed its attack at an energy-rich province of its neighbor. In 1979, Soviet troops were quick to secure the gas fields in northern Afghanistan. In 1980, Saddam's forces sought to take Khuzestan, where the majority of Iran's oil and gas fields are located. And this year, Putin's troops focused on eastern Ukraine, which has abundant coal deposits and vast reserves of gas. Yet, how would control of more energy further the interests of the oil dictatorship?

Since oil dictatorships are so dependent on oil revenue, they are deeply worried about the threat of losing market share. The reasons varied but the response remained the same.

By the late 1970s the Soviet business model was to buy gas cheaply from Iran and Afghanistan and sell it for a hefty profit in the European energy market. Following the revolution in Iran in 1979, the Khomeini regime signaled it was about to break its contract with Moscow. In parallel, events in Afghanistan suggested that it was on the verge of another Islamic revolution. The Soviets apparently were unwilling to take the risk of losing two gas suppliers in the same year.

In 1980, Saddam Hussein faced a similar problem. Iran was aiding and abetting a Shia insurrection in the south of Iraq. The south also happened to be the location of the Rumaila oil field, Iraq's largest. Saddam apparently decided that instead of letting the Iranians deny him access to his oil fields, he would deny them access to their oil fields.

In 2022, Ukraine's steady drift toward EU and NATO membership created a comparable challenge for Putin. A third of Russian gas exports to Europe were going through pipelines on Ukrainian territory. These pipelines had already been blown up, probably by Ukrainian nationalists, in both 2007 and 2014. Putin's efforts to develop a parallel network, known as the Nordstream pipelines, which would allow him to bypass Ukraine, hit a wall when the German regulator refused to approve the use of one of them. If that wasn't enough, Ukrainian companies were on the verge of developing gas fields in the Black Sea, and thus competing with Russia's Gazprom in the European energy market.

The wars launched by oil dictatorships end badly. Oil dictators make decisions with only a small group of advisors who are loath to contradict them. The armies of oil dictators fight poorly, as the regime prefers to promote loyal officers rather than competent ones. As a result, both the Soviets in Afghanistan and the Iraqis in Iran got bogged down in protracted warfare, pre-war dreams of swift victory notwithstanding. Both regimes emerged from their invasions greatly weakened. As Russia's campaign in Ukraine exhibits the same morbid symptoms, it's hard to think its fate will be any different.

Kentucky Fried Vice President?

 

 

The United States has always had citizens who take a non-traditional path into politics. Often people will run for the school board, city council, state legislature, and so on, gaining experience along the way. For better or worse, it has become more common of late for aspiring politicians to seek office on the strength of their fame or status as a celebrity. Donald Trump—the first president never to have served either in the military or any public office—is certainly the most recent and glaring example. Ronald Reagan would certainly have been considered a celebrity when he ran for the presidency, but the former actor had served two terms as governor of California. In 2022, celebrity Dr. Mehmet Oz and former football player Herschel Walker are seeking the GOP nominations in Pennsylvania and Georgia for the United States Senate. Neither has any political experience.

Several celebrated people have either been elected to or sought political office in the past. Frontiersman Davy Crockett was elected to three terms in Congress from Tennessee before famously dying at the Alamo in 1836. When John Wayne (1960) and Billy Bob Thornton (2004) play you in a movie, you, sir, are legendary.

Newspaper magnate William Randolph Hearst (today his granddaughter Patty may sadly be more famous) was very active politically in the first decade of the 20th century. He was elected to two terms in Congress (1903-1907), and unsuccessfully ran for mayor of New York City in 1905 and 1909, governor of New York in 1906, and even president in 1904. William Randolph Hearst foreshadowed Donald J. Trump in many ways.

Upton Sinclair, best known as the author of The Jungle (a graphic treatise on the meat packing industry that surely sent a few readers to vegetarianism), was potentially to the left of Bernie Sanders. He ran for Congress, the U.S. Senate, and governor of California. Actress Helen Gahagan Douglas was elected to Congress from the state three times as a Democrat (1945-1951) before being defeated by Richard Nixon in a senatorial race in 1950 (John F. Kennedy quietly contributed to his future rival’s campaign against her). She is best known for giving Nixon the nickname "Tricky Dick" (he called her “the Pink Lady,” implying she was soft on communism). A third future president, Lyndon Johnson, served as a bit more than her political mentor, which you can look up on your own.

After Reagan’s election as governor, the proverbial floodgates of prospective celebrity politicians would open (and I am only citing those elected). Actors: Clint Eastwood (mayor of Carmel, California), The Love Boat’s ‘Gopher’ Fred Grandy (US Congress, Iowa), Fred Thompson (US Senate, Tennessee; presidential candidate 2008), and Jesse Ventura (Governor, Minnesota). Athletes: Basketball Hall of Famer Bill Bradley (US Senate, New Jersey; presidential candidate 2000), former NFL quarterback Jack Kemp (Congressman, New York; vice presidential nominee 1996), Football Hall of Famer Steve Largent (US Congress, Oklahoma), and NFL quarterback Heath Shuler (US Congress, North Carolina). Bodybuilder/actor Arnold Schwarzenegger (Governor of California) was an athlete and an actor.

George Wallace ran for president four times from 1964-1976. He survived an assassination attempt in 1972. 1968 was his most successful presidential run, when he latched onto a group of supporters who (or perhaps their children) may now identify with Donald Trump more than they may know. George Wallace was the governor of Alabama three times, serving a total of sixteen years, and famously (or infamously) pledged “segregation today, segregation tomorrow, and segregation forever.” He entered some Democratic primaries late in 1964, and did (to some) shockingly well. In 1968, he ran as an independent.

1968 was one of the most tumultuous years in American history: the Tet Offensive, the King and Kennedy assassinations, and so on. Wallace tapped into disaffected white voters with his combative style on the stump. At one point that summer, he polled more than twenty percent nationally, and still higher in certain parts of the country, especially the South. Studies have shown he drew voters from both the Republican nominee Richard Nixon and Democrat Hubert Humphrey. The man without a party needed a running mate, and this is where an intriguing possibility for a celebrity politician did NOT happen.

Wallace and his advisers “agreed that it was important to broaden his Deep South base, they wanted to reach out for a national figure who might give the ticket credibility outside the region.” He considered former Secretary of Agriculture Ezra Taft Benson, former baseball commissioner and Kentucky Governor A.B. “Happy” Chandler, radio commentator Paul Harvey, FBI Director J. Edgar Hoover, and others, before settling on retired Air Force General Curtis LeMay. He also contemplated Colonel Harland Sanders of Kentucky Fried Chicken fame, who was a contributor to his campaign.

Colonel Sanders was a familiar sight on television at the time, even though he had sold the company to John Y. Brown (future governor of Kentucky as well as the former husband of TV personality Phyllis George and father of CNN’s Pamela Brown). Sanders wasn’t successful until later in life. He’s associated with Kentucky, but was born in Indiana and spent many years doing odd jobs in the region. He didn’t start cooking for money until the age of forty, and didn’t begin franchising his restaurants until he was already collecting Social Security. A little research will show that the man was more obsessed with gravy than chicken.

Sanders’ candidacy was unlikely for a couple of reasons. First of all, he was seventy-eight years old, and if Joe Biden is old at seventy-nine in 2022, the Colonel was really old in 1968 (though he did live to be ninety). Secondly, since he was the face, but not the owner, of KFC, it is meaningless yet curious to speculate whether the sales agreement for the chicken franchise would have allowed him to run politically, especially considering what a polarizing figure Wallace was. A good share of Americans may have stopped eating that tasty poultry, even if it was “finger lickin’ good,” had a Wallace/Sanders ticket been on the ballot in 1968.

Sanders has enjoyed a bit of a resurrection in the last decade as eighteen different actors have played the Colonel (Paste Magazine even ranked them), including Jason Alexander, George Hamilton, Ray Liotta, Rob Lowe, Reba McEntire, Norm MacDonald and even Hafthor Bjornsson, perhaps better known as ‘The Mountain,’ on Game of Thrones (yes, it is worth looking that up as well).

The infamous General LeMay did Wallace more harm than good.  He was the inspiration for General “Jack D. Ripper” in the 1964 satire Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb, which was neither flattering nor complimentary. The movie was a work of fiction, but the reality of the Cuban Missile Crisis was just six short years in the past. LeMay arguably conferred military credibility on Wallace, though the governor, who did fly in B-29 bombing raids in World War II under LeMay, may not have needed help in that area.

On October 3, 1968, Wallace announced LeMay as his running mate in working class Pittsburgh. After an introduction,

Wallace stood off to the side, a stony expression on his face, while LeMay self-destructed and in the process all but brought down the Wallace campaign. LeMay offered sincere, straightforward responses to reporters’ questions, leaving no doubt about where he stood.

The numbers confirm that assessment. On September 29th, a Gallup poll listed Richard Nixon with 43%, Hubert Humphrey just 28%, and Wallace at 21% (with 8 percent undecided). By election day, those numbers finished at 43.4%, 42.72%, and 13.53%. Wallace lost a third of his support in the final month.

Not all of that can be blamed on LeMay. Humphrey changed his position on Vietnam and ran a strong campaign in October. Additionally, for perhaps the last time they exerted this much influence, labor unions “poured unprecedented resources into the Democratic campaign going into the home stretch, registering 4.6 million voters, sending out 115 million pamphlets, establishing 638 phone banks, fielding 72,000 house to house canvassers and 94,000 Election Day volunteers.” Humphrey poached fifteen late-campaign points from Wallace among unionists.

The American electorate rejected George Wallace for President in 1968 (as well as in 1964, 1972, and 1976). Nevertheless, Vice President Harland Sanders is an amusing historical “what if?"

Sadly, we will never know if the possible thirty-ninth vice president of the United States would have wanted to be addressed as “Mr. Vice President,” or simply, “Colonel.”

The Roundup Top Ten for July 22, 2022

What the Antebellum Period Tells Us about the Coming Battles Over Abortion

by Kate Masur

"The history of the 19th century reminds us that arguments for states’ rights, or for federal power, have no intrinsic political or moral valence."

 

Why Biden Failed

by Adam Tooze

If Biden’s plan was to stabilize US democracy with progressive politics – an updated New Deal for the 21st century – the conclusion now must be that his presidency has failed.

 

 

Stuck on the Rufo Road

by Jennifer Berkshire

As conservative activists mount a multi-front campaign to discredit and defund public schools, too many leading Democrats seem unaware that the popularity of public education means they have a winning issue right in front of their faces, says an education historian and policy analyst. 

 

 

Letting States Legislate Morality Will End Badly (Again)

by Nancy C. Unger

The shameful history of Mann Act prosecutions shows what happens when panics over sexual morality are hastily written into criminal laws. 

 

 

Can We Have International Cooperation Without Domination?

by Jamie Martin

There is no golden age of international relations free of the coercive power of capital. A different version of internationalism is needed. 

 

 

The Right-Wing Court Has a New Target: Native American Rights

by Nick Estes

The Court recently overturned precedent to allow state governments criminal jurisdiction over tribal lands, which has historically been a tactic of oppression and elimination. 

 

 

Archival Structures and the Preservers and Retrievers of Stories

by Fernando Amador II

"Historians rarely understand the terminology, organizational strategies, or labor required for establishing and maintaining an archive, and I was no exception."

 

 

Think You Know the Biblical Position on Abortion? You May Be Surprised

by Melanie A. Howard

Although the Bible was written at a time when abortion was practiced, it never directly addresses the issue.

 

 

Twitter is Just Fine

by John Warner

Twitter "can be a terrible place, but it is also a place that – at least for me – has been far more welcoming and supportive of my academic pursuits than academia itself ever managed."

 

 

The Dangerous Misunderstanding of America's History of Mob Action

by Stefan Lund

Contrary to the protestations of January 6th apologists, mob action in America has usually worked to suppress, rather than defend, democracy.

 
