1,056 Feet: Why I Needed the 1619 Project Growing Up

The Roanoke Star

 

Atop Mill Mountain in Roanoke, Virginia, sits a giant metal star, standing at nearly ninety feet. Almost every night the star is brightly lit, a sort of landlocked lighthouse at the edge of the Blue Ridge Mountains. It can be seen from almost anywhere in the small city, including the cramped living room of my childhood apartment. And it always seemed so far away from where I lived, both physically and figuratively. A lush green mountain, topped with a star, was a far cry from the low-income HUD apartment of my youth, which sat feet from a large concrete drainage ditch. When it was warm outside, children would play in the ditch and explore the open tunnels that punctuated each side, all under the watchful gaze of the Roanoke Star. Roanoke was, and remains, a deeply segregated city. 

 

Closer to home sat Lincoln Terrace Elementary School. Google Maps puts the school a mere 1,056 feet from my patio. Had it not been for the small wooded area that sat between the school and my apartment complex, it would have been just as visible as the Roanoke Star. My experience with Lincoln Terrace Elementary, however, is tangential at best. I never went there. Nor did anyone else in my 108-unit apartment complex, or half the children in the complex next to mine. The one street of our neighborhood bisected those apartments, leading children on the other side of the road to Lincoln Terrace. My side of the street, on the other hand, went to a different school that required my bus to pass the school just feet from where I lived and to drive past yet another elementary school before dropping me off each morning. 

 

Why? The school district lines hadn’t been changed since desegregation. Those lines remained unchanged until 2010, my junior year of high school. 

            

I never quite understood how that could happen. I had learned about de jure and de facto segregation in school. The former was segregation by law, the latter by choice. But why would people choose to live segregated lives? And how did the country really get from one form of segregation to the other? I knew about Reconstruction and Jim Crow, but that seemed so far in the past, and so confusing. Furthermore, how could the institution of slavery possibly be connected to which classroom I sat in to learn history? After all, the vast majority of white Southerners didn’t even own slaves.

 

That’s the one thing I can remember learning in grade school about slavery. Not all white people. I know I learned more than this, but this is the one thing I vividly remember being stressed in my grade school education. The rest is a sort of blur. Pieces of history dispersed throughout the grand narrative of American progress. What glimpses of Black America I found in my education never seemed to connect.

 

If Americans have traditionally understood 1776 to be a monumental moment in our history, and the birth year of our country, the 1619 Project takes us further back to 1619, when the first enslaved Africans arrived in British North America. By foregrounding this country’s history of race-based chattel slavery and its long shadow, the 1619 Project seeks to connect those disparate moments of history I was never able to make sense of as a student. It is, first and foremost, an education project, one that will hopefully spare future generations of students from being left, as I was, wondering what any part of our history means. The Project has had much success--some people lined up for hours to receive a free copy--and some school systems will include it as supplemental material in their history lessons.

 

Yet, the 1619 Project has not been warmly received by everyone. Conservatives denounced it immediately, saying it was--in the words of Newt Gingrich--a “lie.” The 1619 Project has also garnered a large amount of criticism from a vocal minority of historians, with Sean Wilentz, Victoria Bynum, James McPherson, James Oakes, and Gordon Wood taking the lead. These historians present their critiques as “a matter of facts,” yet their comments betray their true grievances.

 

One of the major points of historical critique these scholars have leveled against the 1619 Project deals with one sentence in the project’s introductory essay by Nikole Hannah-Jones. “One of the primary reasons the colonists decided to declare their independence from Britain was because they wanted to protect the institution of slavery,” she wrote. While the 1619 Project is nearly 100 pages in total, these historians have given interviews and written entire articles almost solely dedicated to debunking this statement.

 

No work of history is above critique. Not the 1619 Project. Not mine. And certainly not those of the 1619 Project’s dedicated critics. These historians have attempted to prove, without a doubt, that Americans could not possibly have wanted independence from Britain because of slavery. And, if we are fair, they have a good case. They point out that Britain had only a fledgling anti-slavery and abolitionist movement at this time. Loyalist enslavers also present a conundrum for Hannah-Jones’s statement, as many enslavers not only chose to stay loyal to the British Empire, but had their property--i.e. slaves--seized by Patriot governments as a result.

 

But it is worth asking why it seems so difficult to admit slavery could have played a major part in the American Revolution. After all, Americans were infuriated by British attempts to free enslaved people during the war. Lord Dunmore’s 1775 Proclamation, which offered freedom to enslaved people who fled rebel enslavers, angered and scared Patriots and Loyalists alike. Are we to assume no Patriots would have picked up arms to protect their property, a sacred right in the Anglo-American tradition, from a royalist governor attempting to foment a slave rebellion? 

 

Immediately following the war, the question of whether runaway slaves belonged to Patriots or Loyalists was of such importance that a committee was set up in New York City to settle disputes. This eventually led to the Book of Negroes, something Simone Browne, a scholar of blackness and the surveillance state, has likened to a “no-sail list.” The actions of the British left such a bad taste in the mouths of southern enslavers that throughout the 1780s, they accused Northerners of being just as bad as, or worse than, their former imperial lords if Northerners didn’t actively assist in the retrieval of fugitive slaves. 

 

These are all well-known facts. This is to say that even the arguments leveled by these critics are an interpretation of history, despite their claims of simply fact-checking. Then again, simply stating the facts has never been the real point for these scholars. They are, after all, accomplished and award-winning historians who have built their own careers critiquing and being critiqued for their interpretations of “the facts.” Yet they have repeatedly acted as if the very fact that the 1619 Project can be critiqued means the entire work is worthless. The idea that any other piece of historical writing could stand up to such standards is almost laughable. If we were to apply the model the 1619 Project’s detractors are putting forward, the world would either have no works of history or exactly one “right” work. 

 

Their real reasons for opposing the 1619 Project are, well, quite black and white. If the Project seeks to correct the established narrative of American history, which has for so long sidelined slavery, racism, Jim Crow, the Civil Rights Movement, mass incarceration, and a host of other subjects under the prefix of African American history, these scholars seek to maintain the status quo. For these scholars, the 1619 Project gets the “facts” wrong and instead interprets history in a way which, in their words, foregrounds “identity politics.” 

 

Identity politics is the favorite phrase of those who oppose changes to the status quo. It is less a scholarly critique than an insult leveled by the Federalist Society. It betrays a fundamental misunderstanding on the part of these scholars: history is, and has always been, political. The national narrative of U.S. history we are all so familiar with could not be constructed without first contemplating who is and is not a part of the nation, which is in and of itself a political project. 

 

This political project is not only wrapped up in what constitutes a “fact,” but also in interpreting them. Yet, in his most recent criticism of the 1619 Project, Sean Wilentz ended by imploring people to stick to the facts, and remember the example set by W.E.B. DuBois. Likening his own crusade against the 1619 Project to DuBois’s struggles to rewrite the racist historical narrative of Reconstruction, Wilentz stated that “in exposing the falsehoods of his racist adversaries, Du Bois became the upholder of plain, provable fact.” 

 

Such a use of DuBois willfully ignores that in heralding the “facts” before the people, the narrative of history DuBois wrote was not only part of his own political project in fighting for Black equality, but something his readers pushed back against. William MacDonald, in his 1935 review of Black Reconstruction for the New York Times, commented that “there is no need to accept the author’s views about racial equality in order to recognize” his contribution to history. In quoting DuBois’s statements on Black equality, MacDonald ended his review by saying DuBois’s uncompromising “fight for absolute equality” was troubling, to say the least.

 

Wilentz’s ill-advised use of DuBois is a not-so-subtle sleight of hand, meant to admonish the 1619 Project and its contributors for straying from “real” history towards “identity politics.” It is also, to be frank, the scholarly equivalent of using the “I have a Black friend” excuse. Quoting DuBois doesn’t change the fact that Wilentz and his fellow scholars, like MacDonald, would rather have the history without the social consequences, and the “facts” without the interpretation.

            

What DuBois and so many historians before and after him have known, however, is that history has never been just the facts. DuBois did not seek out the facts for the sake of facts. He did so to combat Jim Crow America. The 1619 Project is not interested in retelling America’s founding story. It seeks to forge a new one. The people who contributed to this effort know full well those like myself, who grew up in the drainage ditches of America, in the long shadow of a bright star, need to hear and read our history. Demands to “stick to the facts” often sideline or silence our story. 

 

What Wilentz and his fellow critics are intent on forgetting is that history has always been an interpretive art. Facts can say one thing. History should often say another. After all, the vast majority of white Southerners didn’t even own slaves.

Twelve Scholars Critique the 1619 Project and the New York Times Magazine Editor Responds

Editor's note: Twelve historians and political scientists who research the Civil War era composed a letter to The New York Times Magazine concerning 'The 1619 Project.' The NYTM editor, Jake Silverstein, responded, but the NYTM declined to publish the letter and his response. The scholars then wrote a reply, and Silverstein had no objection to the exchange being published in another venue. It is published below.

 

To the Editor of The New York Times Magazine  12/30/2019

Re: The 1619 Project

 

We are writing to you today, in tandem with numerous others, to express our deep concern about the New York Times’ promotion of The 1619 Project, which first appeared in the pages of the New York Times Magazine on August 14th in the form of ten essays, poems and fiction by a variety of authors. The Project’s avowed purpose is to restore the history of slavery to a central place in American memory and history, and in conjunction with the New York Times, the Project now plans to create and distribute school curriculums which will feature this re-centering of the American experience.

 

It is not our purpose to question the significance of slavery in the American past. None of us have any disagreement with the need for Americans, as they consider their history, to understand that the past is populated by sinners as well as saints, by horrors as well as honors, and that is particularly true of the scarred legacy of slavery. 

 

As historians and students of the Founding and the Civil War era, our concern is that The 1619 Project offers a historically-limited view of slavery, especially since slavery was not just (or even exclusively) an American malady, and grew up in a larger context of forced labor and race. Moreover, the breadth of 400 years and 300 million people cannot be compressed into single-size interpretations; yet, The 1619 Project asserts that every aspect of American life has only one lens for viewing, that of slavery and its fall-out. “America Wasn’t a Democracy Until Black Americans Made It One,” insists the lead essay by Nikole Hannah-Jones; “American Capitalism Is Brutal. You Can Trace That to the Plantation,” asserts another by Matthew Desmond. In some cases, history is reduced to metaphor: “How Segregation Caused Your Traffic Jam.”

 

We are also dismayed by the problematic treatment of major issues and personalities of the Founding and Civil War eras. For instance: The 1619 Project construes slavery as a capitalist venture, yet it fails to note how Southern slaveholders scorned capitalism as “a conglomeration of greasy mechanics, petty operators, small-fisted farmers, and moon-struck theorists.”[1] Although the Project asserts that “New Orleans boasted a denser concentration of banking capital than New York City,” the phrase “banking capital” elides the reality that on the eve of the Civil War, New York possessed more banks (294) than the entire future Confederacy (208), and that Southern “banking capital” in 1858 amounted to less than 80% of that held by New York banks alone.[2]

 

Again: we are presented with an image of Abraham Lincoln in 1862, informing a delegation of “five esteemed free black men” at the White House that, because black Americans were a “troublesome presence,” his solution was colonization -- “to ship black people, once freed, to another country.” No mention, however, is made that the “troublesome presence” comment is Lincoln’s description in 1852 of the views of Henry Clay,[3] or that colonization would be “sloughed off” by him (in John Hay’s diary) as a “barbarous humbug,”[4] or that Lincoln would eventually be murdered by a white supremacist in 1865 after calling for black voting rights, or that this was the man whom Frederick Douglass described as “emphatically the black man’s president.”[5]

 

We do not believe that the authors of The 1619 Project have considered these larger contexts with sufficient seriousness, or invited a candid review of its assertions by the larger community of historians. We are also troubled that these materials are now to become the basis of school curriculums, with the imprimatur of the New York Times. The remedy for past historical oversights is not their replacement by modern oversights. We therefore respectfully ask the New York Times to withhold any steps to publish and distribute The 1619 Project until these concerns can be addressed in a thorough and open fashion.

 

William B. Allen, Emeritus Dean and Professor, Michigan State University

Michael A. Burlingame, Naomi B. Lynn Distinguished Chair in Lincoln Studies, University of Illinois, Springfield

Joseph R. Fornieri, Professor of Political Science, Rochester Institute of Technology

Allen C. Guelzo, Senior Research Scholar, Princeton University

Peter Kolchin, Henry Clay Reed Professor Emeritus of History, University of Delaware

Glenn W. LaFantasie, Frockt Family Professor of Civil War History and Director of the Institute for Civil War Studies, Western Kentucky University

Lucas E. Morel, Professor of Politics, Washington & Lee University

George C. Rable, Professor Emeritus, University of Alabama

Diana J. Schaub, Professor of Political Science, Loyola University

Colleen A. Sheehan, Professor of Political Science and Director, The Matthew J. Ryan Center, Villanova University

Steven B. Smith, Alfred Cowles Professor of Political Science, Yale University.

Michael P. Zuckert, N. Reeves Dreux Professor of Political Science, University of Notre Dame

 

 

From Jake Silverstein, Editor, The New York Times Magazine 1/10/2020 

Dr. Guelzo, Thank you again for your letter regarding The 1619 Project. We welcome feedback of all kinds, and we take seriously the job of reviewing objections to anything we publish. As you know, the project has been the topic of considerable discussion in recent weeks. I’m sure you saw the letter from Sean Wilentz and others, along with my response, both of which were published in our Dec 29 issue. I believe that this earlier letter, together with my response, addresses many of the same objections raised in your letter.

 

I asked our research desk, which reviews all requests for corrections, to read this letter and examine the questions it raises. They did so, and concluded that no corrections are warranted. Your letter raises many interesting points, which is no surprise considering the distinguished group of signatories, but they are not points that prompt correction. For instance, you write that “The 1619 Project offers a historically-limited view of slavery, especially since slavery was not just (or even exclusively) an American malady.” This is a critique of the project, not a request for correction. I believe you made a similar point in your essay for City Journal. Similarly, your letter notes critically that “The 1619 Project asserts that every aspect of American life has only one lens for viewing, that of slavery and its fallout.” Those are your words, not ours, but again, the complaint goes to a difference of interpretation and intention, not fact. 

 

I do allow that some of the queries in your letter are of a more factual nature. Below are our research desk’s responses to those matters. Sincerely, Jake Silverstein

Notes from our research desk:

1. The letter states that the 1619 Project construes slavery as a capitalist venture and fails to note how Southern slaveholders scorned capitalism as ‘a conglomeration of greasy mechanics, petty operators, small-fisted farmers, and moon-struck theorists.’

This quote appears in James L. Huston's The British Gentry, the Southern Planter, and the Northern Family Farmer: Agriculture and Sectional Antagonism in North America (2016). In full it reads, "Free Society, we sicken at the name, what is it but a conglomeration of greasy mechanics, petty operators, small-fisted farmers, and moon-struck theorists. All the Northern, and especially the New England states, are devoid of society fitted for a gentleman." Huston attributes this quote to "a Georgia editor in a foul humor." It does not have to do with capitalism but with aristocratic plantation owners scoffing at small-scale family farms of the North. In Huston’s words, this is about the “aristocratic disdain” of the slavers. 

 

2. The letter states that although the 1619 Project asserts that “New Orleans boasted a denser concentration of banking capital than New York City,” the phrase “banking capital” elides the reality that on the eve of the Civil War, New York possessed more banks (294) than the entire future Confederacy (208), and that Southern “banking capital” in 1858 amounted to less than 80% of that held by New York banks alone.

The sentence in Matthew Desmond’s essay has to do with New Orleans and New York City. The citation has to do with entire states—and not with the concentration of banking capital but with banks. Several works—Sven Beckert and Seth Rockman, eds., Slavery’s Capitalism (2016); Seth Rockman, “The Unfree Origins of American Capitalism” in The Economy of Early America (2006); Calvin Schermerhorn, The Business of Slavery and the Rise of American Capitalism, 1815-1860—deal with the importance of finance and banking in the American South—in particular, with the rise of state chartered banks.

 

3. The letter asserts that Nikole Hannah-Jones does not provide enough context in her essay for Lincoln’s "troublesome presence" quote and that this was only Lincoln's description of the views of Henry Clay. 

 

Hannah-Jones does not state, as the letter implies, that Lincoln recited these words to the visiting delegation of free black men. Second, while the occasion for Lincoln’s words was indeed a eulogy for Clay, the full context makes it clear that Lincoln was endorsing Clay’s position: “He considered it no demerit in the society, that it tended to relieve slave-holders from the troublesome presence of the free negroes; but this was far from being its whole merit in his estimation... [Clay’s] suggestion of the possible ultimate redemption of the African race and African continent, was made twenty-five years ago. Every succeeding year has added strength to the hope of its realization. May it indeed be realized!” Hay’s diary entries about Lincoln’s eventual abandonment of the colonization scheme, two years after he met with the delegation, do not alter the fact that we correctly describe Lincoln’s views at the time of the meeting in 1862. The letter’s other concerns about how Hannah-Jones’s essay characterizes Lincoln are fundamentally requests for the inclusion of additional information--about Frederick Douglass’s estimation of Lincoln, or the conditions under which Lincoln was assassinated--rather than errors in need of correction.

From Allen C. Guelzo, Princeton University 1/11/2020  

Dear Mr. Silverstein:

Thank you for your reply. That our letter addresses both matters of overall interpretation and specific fact was, we thought, self-evident. That the disagreement concerning overall interpretation, and on a subject of such consequence, can be simply dismissed out-of-hand is dismaying, but so are the dismissals by your "reference desk" of the specific examples we offered. 

 

It is evasive to claim that the Georgia quotation is only about aristocracy and therefore not germane; the whole point, not only of Huston’s book but our letter, is that the Southern slave economy was aristocratic, not capitalistic, in spirit and practice. Citations to works of contemporaneous authors, North and South, foreign and domestic, to the same effect could be easily multiplied, the most notorious being the words of the pro-slavery apologist George Fitzhugh. 

 

It is similarly evasive to claim that the statement about banks and banking capital only applies to two cities; the point of our objection was that the slaveholding South possessed minuscule amounts of such capital when compared to the North, and merely vaguely invoking the work of Beckert, Rockman and Schermerhorn (and without a specific citation) fails to speak to the hard data of 1859. And had your “reference desk” paid attention to the material cited in our letter, it would have seen that New York alone outdistanced the entire future Confederacy in terms of both banks and banking capital as well. 

 

Finally, your response does nothing to correct the mistaken attribution to Lincoln of views which Lincoln specifically attributed to Clay. Lincoln was, at best, ambivalent about colonization – something evidenced by his comments on the subject in 1854 – and it is unhelpful to attempt to distance Lincoln’s August 1862 meeting with the black delegation from Hay’s 1864 diary entry, where it is clear that Hay is articulating Lincoln’s views. Your response also takes no notice of the fact that Lincoln’s appeal in 1862 was for voluntary emigration; that he called off the only temporary experiment in such emigration which the federal government sponsored in 1863; and that Lincoln was at the same time advocating the recruitment of black soldiers whom, in 1864, he has already begun declaring must be granted equal voting rights. That The 1619 Project failed to speak to these matters is an error of omission, but a colossal omission, and still an error.

It is my assumption, given your response, that the New York Times Magazine has no intention of publishing our letter. I hope, in that case, that you will have no objection to our publishing it in an alternative venue.

 

(Dr) Allen C. Guelzo

Senior Research Scholar, The Council of the Humanities

Director, Initiative in Politics and Statesmanship, James Madison Program in American Ideals and Institutions

Princeton University

 

[1] James L. Huston, The British Gentry, the Southern Planter and the Northern Family Farmer: Agriculture and Sectional Antagonism in North America (Baton Rouge: Louisiana State University Press, 2015), 168-169.

[2] The American Almanac and Repository of Useful Knowledge for the Year 1859 (Boston: Crosby, Nichols, and Co., 1859), 218.

[3] “Eulogy on Henry Clay” (July 6, 1852), in Collected Works of Abraham Lincoln, ed. R.P. Basler et al. (New Brunswick, NJ: Rutgers University Press, 1953), 2:132. 

[4] John Hay, diary entry for July 1, 1864, in Inside Lincoln’s White House: The Complete Civil War Diary of John Hay, eds. M. Burlingame & J.R.T. Ettlinger (Carbondale: Southern Illinois University Press, 1997), 217.

[5] “Oration of Fred. Douglass,” New York Daily Herald (June 2, 1865).

Modeling Grief: The Death of Children in Historical Perspective

 

I remember the afternoon my brother died in vivid detail: the chill of the afternoon, the bustle of doctors throughout the house, the sound of the ambulance – and the silence that followed. That silence stretched interminably, it seemed to me, from the moment of death, through the tearful embraces at my brother's graveside a few days later, into the months and years to come. I was not yet eight at the time, just a couple of years younger than my brother had been. Yet death, I understood even then, carried with it a finality that defied words, doubly so when the death in question was that of a child. 

 

The death of children, for contemporary Western readers, looms large among the horrors most hope to never experience – and most indeed never will. In the United States in 2015, estimated average life expectancy at birth was nearly 79 years; the United Kingdom and most Western European countries report even higher figures.(1) Put slightly differently, only 24.9 of every 100,000 American children are expected to die between the ages of one and five; just 38.1 of those 100,000 won’t live to see age fourteen. The haunting specter of facing the death of a child nevertheless informs a wide cross-section of Americans’ experience, from the quasi-apotropaic ritual of announcing pregnancies only after a number of months have passed, to the media's reporting of international tragedies by presenting children as "ideal victims": the faces of affliction most likely to engender sympathy. 

 

When the unthinkable occurs, as it did in my family that distant afternoon, resources are nevertheless surprisingly scarce, with few tools designed to address and help families move past the experience of bereavement, and few ways to break the silence surrounding loss and grief. As students of the history of emotions have argued, even the deepest and most primal feelings require cultural scripts to come to expression: models for experiencing one's own suffering and for responding appropriately to others’. The relative lack of such models in contemporary American society contrasts sharply with other eras' approach to voicing bereavement. Frequently and unsurprisingly, it has been religious communities that developed ways of speaking about loss, including the loss of a child. In late antiquity, roughly the first millennium of the Common Era, for example, Christian homilists and liturgists – the authors of hymns and funeral services – crafted scripts for families and communities confronting grief. 

 

Perhaps the most surprising aspect of these is their pluriformity. Letters and treatises addressed to elite recipients frequently commended restraint to mourners. The dead child, these writers argued, was enjoying a blissful afterlife in the company of Jesus, angelic caretakers, and appropriately saintly playmates; parents accordingly ought to rejoice rather than deplore their loss. Most writers, however, addressed themselves to broader audiences, including the ordinary women and men who gathered in churches across the Roman Empire on Sundays and feast days. For them, homilists brought to life sympathetic characters from the very pages of Scripture: mothers who, like the first woman Eve, had endured the death of a son, and fathers like Job who had to bury all their children in a single day. 

 

Most of these stories found only the briefest of warrants in the biblical text. The death of Job's children as part of a wager between the divine and his accuser, for example, commands only two verses in the eponymous book (Job 1:18-19), and is recounted to Job only as part of an onslaught of trials that include the loss of crops, herds, and slaves' lives. In the Hebrew Scriptures, Job responds alike to all misfortunes; and yet, ancient Christian exegetes agreed, it was the death of his sons and daughters that truly caused him suffering. One anonymous Greek author, in fact, depicted Job searching through the ruins of the house that crushed his children. In this homily, Job painstakingly retrieves their remains from the detritus, joins together body parts that had been scattered, and so prepares each child for burial. In the same vein, another author envisioned Eve's horror at discovering her son Abel's corpse. The Bible's first bereaved mother, this text suggests, did not at first recognize the familiar signs of death, tearfully calling out to Abel as to one asleep and appealing to him to speak to her.

 

At times, writers even imagined the grief and anxiety of parents otherwise marginalized in Scripture. Genesis 22, the story of Abraham's sacrifice of his son Isaac at God’s behest, for example, makes no mention of Isaac's mother, Sarah. And yet, ancient Christian homilists wondered at a father's ability to abscond with a child without consulting his mother, and envisioned Sarah as fearful and suspicious of her husband's motives. One anonymous Syriac homily, for example, has her appeal to Abraham to join her in pleading with God for Isaac's life, and even to die in her son’s stead. In the face of Abraham's refusal, Sarah's fear turns to anguished lament: 

 

I wish I were an eagle or had the speed of a turtle-dove,

so that I might go and behold that place, where my only child, my beloved, was sacrificed!

That I might see the place of his ashes, and look on the place of his binding,

and bring back a little of his blood to be comforted by its smell.

[That] I had some of his hair to place somewhere inside my clothes,

and when grief overcame me, I had some of his clothes,

so that I might imagine him, as I put them in front of my eyes;

and when suffering sorrow overcame me I gained relief through gazing upon them.(2) 

 

In late antiquity, these narratives served to draw the lives of biblical characters into Christians’ experiential realm. Listeners could sympathize with the despair of a mother learning of her child's death, or the devastation of a father having to bury in short succession not just one but several children. The writers who crafted these accounts sought to create a fellowship of grief between their communities and the biblical heroes they encountered in these stories. More than that, the latter could serve as models of emotion for Christians. In observing these characters’ laments, rituals, bargaining with God, or even railing against divine injustice, late ancient audiences learned to encounter – and to survive – the quotidian tragedy of bereavement.

 

In the twenty-first century, these figures and their stories may no longer speak to bereaved individuals with similar clarity; at the very least, they do not speak in the same ways in a religiously pluralistic society where childhood mortality has become rare as they did to Mediterranean and Middle Eastern audiences in the first millennium. And yet the fact that they speak at all is perhaps sufficient: their voices stand against the silences surrounding death and bereavement, and against the despair that lies in wait in those silences. If their message or their messengers in the intervening centuries have ceased to speak to families and communities, they nevertheless issue a challenge: to find new voices by which to speak grief and rage, fear and disempowerment – even, at long last, hope.

 

(1) Sherry L. Murphy, Jiaquan Xu, Kenneth D. Kochanek, Sally C. Curtin, and Elizabeth Arias, “Deaths: Final Data for 2015,” National Vital Statistics Reports 66.6 (November 27, 2017): 2. 

 

(2) B.L. Add. 17206, lines 113-22, in Sebastian Brock, “Two Syriac Verse Homilies on the Binding of Isaac,” Le Muséon 99.1-2 (1986): 61-129, at 121-2, translation in Sebastian Brock, Treasure-house of Mysteries: Explorations of the Sacred Text through Poetry in the Syriac Tradition (Yonkers, NY: St. Vladimir’s Press, 2012), 82-3.

Justin Amash On the Exquisite Veracity of Truth Telling

‘Truth is the only merit that gives dignity and worth to history.’ – Lord Acton

On December 18, 2019, Congressman Justin Amash voted to impeach Donald Trump. Since he was elected as part of the tea party wave in the 2010 mid-term elections, Amash has consistently shocked his colleagues with his unerring adherence to conservative/libertarian principles. He voted to repeal federal legislation against same-sex marriage and opposed gerrymandering; sponsored a bill ending the federal prohibition against marijuana; unfailingly supported efforts to rein in government spending; and opposed any abridgement of personal freedom.

 

Amash, however, is no longer a Republican. On July 4, 2019, Amash announced in a Washington Post opinion piece that he was leaving the GOP and noted that his commitment to integrity and truth-telling led him out of his party. 

 

No great fan of President Trump — he called him a “childish bully” in 2017 — as early as May of that year Amash asserted that if allegations that the President had pressured former FBI Director James Comey to end the investigation of National Security Advisor Michael Flynn were true, it was an act worthy of impeachment. 

 

As a result of these actions, Amash has been subjected to a torrent of abuse from his former colleagues. White House aide Dan Scavino called on Michigan Republicans to defeat Amash in the Republican primary. As if on cue, by August 2019 reports circulated that as many as five Republicans were exploring the idea of challenging Amash in the 2020 elections. President Trump called him a “loser” who sought notoriety through “controversy.” House Minority Leader Kevin McCarthy accused Amash of voting with Nancy Pelosi more than with his party (PolitiFact judged that to be false). Amash has learned the tragic irony in H.L. Mencken’s dictum, “The men the American public admire most extravagantly are the most daring liars; the men they detest most violently are those who try to tell them the truth.” 

 

American history is replete with political figures for whom truth-telling is a rhetorical device rather than the foundation of an honorable life. If, in contrast, integrity and fidelity to the truth are essential to republican governance, a journey into the past provides enlightening insights. Consider, if you will, three other courageous American politicians — Alexander Butterfield, Margaret Chase Smith, and Edmund Ross — who were unwavering champions of truth-telling, consequences be damned. 

 

The experience of Colonel Alexander Butterfield illustrated the importance of truth-telling as a regular part of professional engagement. After retiring from the U.S. Air Force in 1969, his college friend, H.R. Haldeman, hired him as deputy assistant to President Richard Nixon, a role that included managing the documents that crossed the President’s desk and running the White House in Haldeman’s absence. In early 1971, Butterfield supervised the installation of a secret taping system in Nixon’s offices. Butterfield felt his White House job was tedious and left in early 1973 to become head of the Federal Aviation Administration. 

 

Called before the Senate Watergate Committee in June 1973, Butterfield confirmed that Richard Nixon had been secretly taping his conversations and phone calls and subsequently “triggered a constitutional crisis.” Each time he was questioned by Watergate investigators he told the truth. While he did not participate in the Watergate burglary, he admitted to his role in legally supervising the cash used to pay for the break-in. Despite his transparency, Butterfield initially received no political retribution for his revelations. In March 1975, however, President Ford, cleaning house of top Nixon officeholders, forced him to resign from the FAA, and he struggled to find employment for two years. Eventually he established his own consulting firm. By 2015, in retirement, he was a Ph.D. candidate focusing his work on presidential pardon power. 

 

Often circumstances require the truth-teller to stand alone when all the world is demanding they relent. In 1950 Margaret Chase Smith (R-Maine) was a moderate Republican in a party that was beginning to shift fiercely to the right. Despite threats to their existence, moderates like Smith did not go quietly into the night. Smith was elected to her deceased husband’s House seat in the early 1940s and, like her husband, she supported much of the New Deal legislation of Franklin Roosevelt. She backed the foreign policy of President Harry Truman and was elected to the U.S. Senate in 1948, remaining there until 1973. 

 

Soon after taking her seat, Smith learned the painful penalty exacted of those who stand against the whirlwind of popular anxiety. At first impressed with the accusations of Senator Joseph Raymond McCarthy (R-WI), Smith soon became disenchanted with his tactics of abuse. It was the time of the second great Red Scare in the early days of the Cold War and McCarthy rose as the prince of anti-communist darkness. When his evidence of widespread communist influence in the federal government was not forthcoming, Senator Smith spoke with integrity. She expected that the majority Democrats would provide a brake on McCarthy, but was disappointed when they proved themselves as timid as her Republican colleagues in the face of his campaign of political terror. 

 

On June 1, 1950 she stood in the well of the Senate and made a “Declaration of Conscience.” Without mentioning McCarthy’s name, she precisely described his methods, ones that had “debased” the Senate, making it a “forum of hate and character assassination.” Smith decried the idea that her party should use the “four horsemen of calumny – fear, ignorance, bigotry, and smear” to obtain victory.  Smith quickly came to feel the rage of the four horsemen of the Republican right. When several of her moderate Senatorial colleagues supported her declaration, McCarthy called them “Snow White and the Six Dwarfs.” He replaced her on his committee with Richard Nixon, and supported a primary challenge to her re-election in 1954. Smith weathered the political storm, but many conservatives considered her to be anathema and blocked her rise in party politics.

 

A commitment to truth can also place a public servant in direct opposition to one’s allies and one’s partisan affiliation. In 1868 President Andrew Johnson was impeached and tried for violating the Tenure of Office Act, which restricted the President’s ability to hire and fire officeholders. Johnson survived conviction in the Senate by a single vote. Senator Edmund Ross (R-KS) provided the decisive vote that spared President Johnson the disgrace of removal. As a passionate abolitionist and war hero, Ross came to the Senate in 1866 and was expected to join fellow Radical Republicans in removing Johnson. 

 

Some who voted for acquittal saw the case presented by House managers as unfair. Some feared a future in which presidents became pawns of a dominant legislature. Some believed the president had a right to choose his own subordinates. There is some evidence that bribes and patronage jobs were offered to some of the Senators, Ross included, but subsequent investigations yielded no compelling evidence of malfeasance, so no Senators were charged, and there were no convictions. 

 

Ross’s decision to acquit Johnson, to place himself on the side of truth as he saw it, brought an end to his political career. In fact, none of the Republican senators who voted to acquit Johnson — Fessenden, Fowler, Grimes, Henderson, Trumbull, Van Winkle, Dixon, Doolittle, Norton and Ross — were ever elected to office again. 

 

Amash. Butterfield. Smith. Ross. They came from different eras, parties, and persuasions, but all were consistent, honest citizens of integrity and character, possessing a strong fidelity to the truth. They held to their convictions with integrity, told the truth as an ordinary part of their professional lives, stared down popular discontent to tell the truth as they saw it, stood alone while others around them bent to the demands of popular anxiety, defied those with whom they were in alliance, and, even in loss, grasped on to the passion that animated their principles.

 

All reside in vivid distinction to Winston Churchill’s lament, “Men occasionally stumble over the truth, but most of them pick themselves up and hurry off as if nothing ever happened.”

 

Copyright 2020 by Dan Roberts Enterprises, Inc.

Could the Climate Crisis Be “The Good News of Damnation”?

 

On August 12, 1945, six days after the U.S. government obliterated the city of Hiroshima with a single atomic bomb, Robert Hutchins, the president of the University of Chicago, delivered a remarkable public address.  Speaking on his weekly radio program, the University of Chicago Round Table, Hutchins observed that Leon Bloy, a French philosopher, had referred to “the good news of damnation” under the assumption that only the fear of perpetual hellfire would motivate moral behavior.  “It may be,” Hutchins remarked, “that the atomic bomb is the good news of damnation, that it may frighten us into that Christian character and those righteous actions and those positive political steps necessary to the creation of a world society.”

 

According to Hutchins, this world society would serve as the foundation of a world government, and, in the context of the existential danger posed by nuclear war, he was totally committed to creating it. “Up to last Monday,” he said, “I didn’t have much hope for a world state.”  But the shock of the atomic bombing, he added, crystallized “the necessity of a world organization.”

 

In the following months, Hutchins created and then presided over a Committee to Frame a World Constitution―a group of farsighted intellectuals who conducted discussions on how best to overcome humanity’s ancient divisions and thereby move beyond nationalism to a humane and effective system of global governance. In 1948, they issued a Preliminary Draft of a World Constitution, with a Preamble declaring that, to secure human advancement, peace, and justice, “the age of nations must end and the era of humanity begin.”

 

The Chicago committee constituted but a small part of a surprisingly large and influential world government movement that, drawing on the slogan “One World or None,” flourished during the late 1940s. In the United States, the largest of the new organizations, United World Federalists, claimed 46,775 members and 720 chapters by mid-1949.  The goal of creating a world federation was endorsed by 45 major national organizations, including the National Grange, the General Federation of Women’s Clubs, the United Auto Workers, the Junior Chamber of Commerce, the Young Democrats, the Young Republicans, and numerous religious bodies.  That year, 20 state legislatures passed resolutions endorsing world government, while 111 members of the House of Representatives and 21 Senators sponsored a congressional resolution declaring that the new United Nations should be transformed into “a world federation.”  Much the same kind of uprising occurred in nations around the world.

 

Although this popular crusade waned with the intensification of the Cold War, as did the hopes for a sweeping transformation of the nation-state system, the movement did secure a number of vital changes in the international order.  Not only did the United Nations begin playing an important part in global peace and justice efforts, but the original impetus for the world government movement―the existential danger of nuclear war―began to be addressed by world society.  

 

Indeed, a massive, transnational nuclear disarmament movement, often led by former activists in the world government campaign, emerged and rallied people all around the planet.  In this fashion, it placed enormous pressure upon the world’s governments to back away from the brink of catastrophe.  By the mid-1990s, national governments had reluctantly agreed to a sweeping array of international nuclear arms control and disarmament treaties and were no longer threatening to plunge the world into a nuclear holocaust.

 

More recently, however, that world society has been crumbling thanks to a dangerous return of nationalism. From the United States to Russia, from India to Brazil, numerous countries have been swept up in xenophobia, triggering not only a disastrous revival of the nuclear arms race, but an inability to work together to challenge the latest existential threat to human survival: climate change.  Championing their own narrow national interests―often based on little more than enhancing the profits of their fossil fuel industries―these nations have either torn loose from the limited international environmental agreements of the past or, at best, shown their unwillingness to take the more significant steps necessary to address the crisis.

 

And a crisis it is.  With the polar ice caps melting, sea levels rising, whole continents (such as Australia) in flames, agriculture collapsing, and storms of unprecedented ferocity wreaking havoc, climate catastrophe is no longer a prediction, but a reality.

 

What can be done about it?

 

Clearly, just as in the case of heading off nuclear annihilation, no single nation can tackle the problem on its own.  Even if a small country like the Netherlands, or a large country like the United States, managed to quickly develop a system of 100 percent renewable energy, that action would be insufficient, for other countries would still be generating more than enough greenhouse gasses to destroy the planet.

 

So there really is no other solution to the onrushing climate catastrophe than for people and nations to forget their tribal animosities and start behaving as part of a world society, bound together by an effective system of global governance.  The climate crisis, like the prospect of nuclear annihilation, really is “the good news of damnation.”  And we can only overcome it by working together.

 

One world or none!  

The Film “1917” and the Allegory of the Wooden-Headed

 

The recently released film “1917” is a cinematically stunning and dramatically riveting “war movie”.  I have no quibble with its receipt earlier this month of the Golden Globe award for best film of 2019 (although Quentin Tarantino’s “Once Upon a Time in Hollywood,” a sort of anti-history of the Manson murders, is equally deserving).  But I want to argue here that it is so much more than a typical war-genre film.

 

Filmed from April through June last year on Britain’s Salisbury Plain (which reportedly upset conservationists, who feared the disturbance of undiscovered ancient-human remains), the film succeeds in placing us in the extensive trenches and corpse-strewn No-Man’s Land of the Western Front.

 

Of the latter, Leon Wolff, whose 1958 In Flanders Fields: The 1917 Campaign is still the best book on its topic, writes, “The problem of terrain has bedeviled military commanders in Flanders throughout history…. For clay plus water equals mud --- not the chalky mud of the Somme battlefield to the south, but gluey, intolerable mud.”  Wolff quotes one officer, instructed to consolidate his advance position, as writing back to HQ, “It is impossible to consolidate porridge.” [1]

 

The film’s two protagonists’ trek across No-Man’s Land depicts this “gluey, intolerable” muck, punctuated by decaying men and horses, perfectly.  It also depicts, though not so obviously (and perhaps unintentionally), the abysmal stupidity with which warfare was still being conducted nearly three years after hostilities started in August 1914.

 

Wolff remarks, “In the fourth year of this war there occurred one of many military cataclysms:  The Third Battle of Ypres, often referred to as the Paschendaele campaign, or the 1917 Flanders offensive.”[2]  

 

Stalin said, “When one man dies, it’s a tragedy.  When a million die, it’s a statistic.” “1917” puts two human faces on the statistics of that year’s cataclysm.

 

Cataclysms, of course, were nothing new in the history of warfare when World War I rolled around.  Also not new was the abysmal “wooden-headedness” of military leadership. Credit Barbara W. Tuchman for introducing this label in her 1984 book The March of Folly.[3]  Tuchman, who built her reputation on the First World War with The Guns of August[4] some two decades earlier, walks us through a series of military disasters from the Trojans’ acceptance of the wooden horse, through the Brits’ loss of the American colonies, to America’s Vietnam debacle. 

 

Still, for sheer stupidity, World War I arguably has no equal, either before or since.  The eminent English historian Martin Gilbert sums it up for us in his The First World War: A Complete History. “The destructiveness of the First World War, in terms of the number of soldiers killed, exceeded that of all other wars known to history.”  He approximates the total as 8,626,000.[5]  While World War II --- which some might characterize as a continuation and final resolution of WWI --- exceeded this figure in total human carnage (principally due to the extension of total war to civilian populations), eight-plus million remains a stunning figure.  It also remains a tragic figure in light of the technological innovations available literally at the fingertips of the generals who, time and again, threw the flower of their national manhood at the barbed wire and machine guns.

 

The Context of “1917”

 

Wolff observes, “The conflict which had exploded in 1914 was, it was felt at the time, fortunately going to be a short one…. [I]n the event, the unexpected power of the defensive… brusquely smashed the respective military schemes….”[6]  After the opponents crashed head on “like two mountain goats” in 1914, the Western Front solidified.  Still, both sides clung to the dream that victory could be won by the sheer weight of massed men and artillery.  Iron will was rated higher than iron tanks.  “In 1916 Foch, under the continuing delusion that sheer will power could break through barbed wire and machine guns, further drained the life blood of France in vast, notorious battles….”[7]

 

Wolff goes on to tell us that 1917 would have been a good year to end the struggle.  Both sides were bled white and exhausted.  Instead, first the French and next the British launched new, abortive offensives.

 

By the time of the Third Battle of Ypres, Field Marshal Haig was again convinced that he could win… and he intended to win before the Americans arrived in force and stole his thunder.  And, although a few visionary leaders like Winston Churchill appreciated the value of tanks against barbed wire and machine guns, this weapon remained largely a novelty. Haig, himself a cavalry officer, clung to the view that his beloved horse soldiers would play a crucial role in his triumph; once the vaunted breakthrough occurred, they would pour through the gap and seize the day. Claims Wolff, “By 1917 Field Marshal Haig had lost not a particle of his optimism and self-esteem, though all his offensives to date had miscarried, the war was a stalemate, British casualties exceeded a million, and his fitness for command had become a known matter of debate….”[8]

 

This, then, is the context in which the protagonists of “1917” go off across No Man’s Land, carrying a typewritten message.  And, as I have said, the story is indeed an engaging one.  Still, no matter how one cheers for our side, no matter how we clutch the arms of our seats in our anxiety for their fate and that of the troops they are attempting to warn of impending disaster, when the theater’s lights go up, we are left with a gnawing suspicion that we’ve been had.  After all, it was 1917.  Wasn’t there an easier way?

 

Why not a wireless message?

 

At the film’s start, we are informed that those nasty Germans, as they retreated, unhelpfully cut all the telephone lines in the abandoned trenches. But wait.  What about wireless radio?  

 

In Intelligence in War, another eminent British military historian, John Keegan, writes, “Between 1897 and 1899…, Marconi so much improved his apparatus that by 1900 the British Admiralty had decided to adopt wireless as a principal means of communication….”[9]  Keegan added that wireless worked better at sea than on land for a variety of technical reasons.  However, he granted that laxity in the use of unencoded (“clear”) communications was a greater hindrance to effective wireless communication on the battlefield than was technical difficulty.[10]

 

As with the tank, so it was with the wireless radio.  Per Keegan, “During the years of static warfare…, neither wireless messaging nor interference played any significant part, since the available equipment was ill-adapted to trench conditions and most communication, both strategic and tactical, was conducted by hand-carried paper, as was traditional, or by telegraph or telephone.” (my emphasis) The technology was known. It was either available or readily adaptable.  It was the will to adapt that was missing.  And so our two young Tommies sally forth with a letter from General Erinmore (Colin Firth) to Colonel Mackenzie (Benedict Cumberbatch), warning the latter that his impending attack would run straight into a deadly trap at the Germans’ Hindenburg Line.

 

Director Sam Mendes tells us at the film’s conclusion that the plot is based on stories related to him by his grandfather, Alfred Mendes, who was a Lance Corporal in WWI.  Keegan, as we see, confirms the veracity of the elder Mendes’s recollections of running messages.  

 

Then why not an airplane?

 

That said, my friend and colleague, Dr. Gregory J. W. Urwin, professor of history at Temple University, and himself a distinguished author on the history of warfare,[11] and I have been speculating about the sheer lunacy of sending the two corporals (played by George MacKay and Dean-Charles Chapman) off on their quixotic mission.

 

Let us grant that wireless radio, like the armored tank, was a neglected technology. But why place the fate of Mackenzie’s two battalions (1600 men) in the hands of a couple of corporals afoot?  As Dr. Urwin opined to me, “[S]ince they knew the location of the 1600 troops in the two advanced battalions, send over one or more aircraft to drop containers.  Heck, you could have landed a plane on the ground that the regiment occupied to transmit the news.”

 

General Erinmore insists that Colonel Mackenzie read the message in front of witnesses, because Erinmore fears that, with his blood up, Mackenzie might ignore the order and proceed with his planned attack. But why not send an officer with the message in a biplane?  In fact, why not opt for redundancy (the hallmark of trench warfare) and send two planes?  We see plenty of aircraft in this film, reflecting that, in contrast to tanks, ‘aeroplanes’ were (to a limited degree at least) an accepted innovation on both sides. 

 

Airpower, admittedly, was in its infancy. Pilots, like knights of old, fought one-on-one for control of the sky. Beyond scouting the enemy’s lines, they did little of real use to the war effort. Bombing raids, which would become a hallmark of the Battle of Britain two decades on, were novelties in WWI. Still, the capacity of a biplane or two to get the message to Mackenzie seems beyond debate.

 

So, is Director Mendes’s tale no more than a contrivance for dramatic effect?

 

I don’t think so. If one accepts his claim that he inherited his story from his granddad’s lips, and we also take Keegan at his word that the generals clung to the hand-carried message, then the yarn takes on the trappings of veracity. Add into the mix the ample circumstantial evidence (Haig clinging to his faith in cavalry, when early experiments with armored tanks had proven them efficacious, for example) and “1917” is a microcosmic dramatization of Tuchman’s “wooden-headedness” thesis.

 

A cautionary tale for today?

 

The March of Folly’s finale is Vietnam. Just as the Germans learned from World War One and opened hostilities in 1939 with “blitzkrieg” (lightning war), the American military learned from its bloody humiliation in Southeast Asia. This was demonstrated in spades in Kuwait in 1991.

 

But wooden-headedness, like all traits of human nature, is intractable. Assuming that Bush the Elder’s lightning war against Iraq was the appropriate template, Bush the Younger plunged us back into Iraq in 2003. Seventeen years later, not only are we still there; we may be only one more drone strike away from a new Middle East war.

 

Viewed with one eye on this current context, “1917” can surmount its surface characterization as an exciting “war movie” to become an allegory. “As a literary device, an allegory is a narrative, whether in prose or verse, in which a character, place or event is used to deliver a broader message about real-world issues and occurrences.”[12]  Instead of “1917,” let me propose the alternative title “The Allegory of the Wooden-headed,” with a suitable bow to the late, great Barbara Tuchman.

 

[1] (New York: Time, Inc. edition, 1963), pp. 122-23.

[2] Ibid., p. xxxvi.

[3] (New York: Alfred A. Knopf, Inc., 1984), p. 7 (“Wooden-headedness, the source of self-deception, is a factor that plays a remarkably large role in government.”)

[4] (New York: The Macmillan Co., 1966).

[5] Martin Gilbert, The First World War: A Complete History (New York: Henry Holt & Company, 1994), pp. 540-41.

[6] Wolff, op. cit., p. 6.

[7] Ibid., p. 9.

[8] Ibid., p. 53.

[9] John Keegan, Intelligence in War (New York: Random House, 2004), p. 102.

[10] Ibid., p. 144.

[11] https://liberalarts.temple.edu/academics/faculty/urwingregory-j-w

[12] https://en.wikipedia.org/wiki/Allegory

 

BBC Whitewashes U.S. Refusal to Bomb Auschwitz

The trailer for the Bombing Auschwitz documentary

 

The new BBC documentary about the question of bombing Auschwitz deserves an award—for creative fiction. Through omissions, distortions, and “re-enactments” of conversations with imaginary dialogue inserted for effect, the BBC has made a shambles of the historical record concerning this important issue.

 

The film, “Bombing Auschwitz,” was broadcast in the United States by PBS on January 21 and is being screened at various venues. It purports to tell the story of what it calls the “debate” in 1944 over “one of the great moral dilemmas of the 20th century” --that is, whether to bomb the gas chambers at Auschwitz, despite the risk that some inmates might be harmed.

 

In fact, there was no such “debate.” There were a few individuals who privately expressed qualms. But they did so long after the Roosevelt administration had repeatedly rejected the bombing requests, on completely different grounds.

 

U.S. officials did not cite the danger of harming inmates when they turned down the bombing requests. That was not a consideration. The first such requests—and many of the later ones—asked for the bombing of the railways and bridges leading to the camp, and striking such targets obviously did not endanger civilians.

 

  Rescue advocates sought those bombings because hundreds of thousands of Hungarian Jews were being deported in cattle cars across those tracks and bridges, destined for Auschwitz. Damaging the transport routes would have interrupted the mass murder process. 

 

Remarkably, the requests and rejections concerning bombing the railways and bridges are not discussed in the film. In fact, some of the requests that were made to bomb the railways are misleadingly presented in the film as requests to “bomb Auschwitz.” The BBC has, in effect, whited out the actual historical record and replaced it with a distorted narrative that suits its creators’ agenda.

 

THE “STUDY” THAT NEVER WAS

 

The first requests for bombing were made in telegrams to the Roosevelt administration in June 1944 by leaders of Agudath Israel (an Orthodox group based in New York) and the Jewish Agency for Palestine, and by Roswell McClelland, a Switzerland-based official of the U.S. government’s own War Refugee Board. They named the specific rail lines and bridges between Hungary and Poland that should be targeted to disrupt the deportations.

 

An official of Agudath Israel, Meier Schenkolewski, also met in person, on June 19, with two senior members of President Franklin D. Roosevelt’s cabinet to plead for the bombing of the railways and bridges. Secretary of State Cordell Hull passed the buck, telling the Agudath Israel emissary to go talk to the War Department. Secretary of War Henry Stimson falsely told Schenkolewski that bombing those targets was impossible because they were “within the competence of the Russian Military Command.” In fact, American planes were already flying in the vicinity of Auschwitz, in preparation for attacks on other targets. Neither Hull nor Stimson is mentioned in the film.

 

Assistant Secretary of War John McCloy, acting on behalf of the administration, replied to the written requests of June 1944 and rejected them out of hand. He used nearly identical language in the rejection letters that he sent later that summer, in response to requests by other Jewish groups to bomb both the railways and the gas chambers within Auschwitz. 

 

In his letters, McCloy did not express any concern about possibly harming the inmates of Auschwitz. He wrote that the War Department had undertaken “a study” which concluded that any such bombings were “impracticable” because they would require “the diversion of considerable air support essential to the success of our forces now engaged in decisive operations” elsewhere in Europe.

 

McCloy’s explanation was false. No such “study” was ever conducted. No “diversion” of airplanes would have been needed—because U.S. bombers were already striking German oil factories in the Auschwitz industrial zone, just a few miles from the gas chambers. The real reason for the rejections was the Roosevelt administration’s policy of refraining from using even the most minimal resources for humanitarian objectives, such as interrupting genocide. 

 

McCloy’s letter, which is central to this historical episode, is not mentioned in the film. Instead, the “diversion” argument is presented as a legitimate objection—as if bombing Auschwitz really would have undermined the war effort. The entire fictional “debate” is presented to the viewer as a clash between U.S. officials who were waging the war, and semi-hysterical Jewish leaders who wanted to divert from the war effort for the sake of their narrow Jewish interests.

 

THE OIL WAR

 

Jewish leaders were aware at the time that the Allies were bombing the oil factories at Auschwitz. The co-chairman of the World Jewish Congress, Nahum Goldmann, who repeatedly met with U.S. officials to press for bombing the railways and the gas chambers, specifically cited the fact that they were already “regularly bombing the I.G. Farben factories, a few miles distant from Auschwitz.”

 

The “oil war,” as it was called, was no secret. On August 21, 1944, for example, a front page article in the New York Times described how “500 United States heavy bombers from Italy today…bombed the I.G. Farbenindustrie synthetic oil and rubber plant at Oswiecim in Polish Silesia.”

 

The fact that Oswiecim/Auschwitz also was a death camp for Jews was no secret, either. The Times itself mentioned the mass murder of Hungarian Jews in Auschwitz in news articles on June 25, July 3, and July 6. During this same period, the Jewish Telegraphic Agency —the leading international Jewish news agency—repeatedly described the mass killings that were taking place in what it called “the notorious ‘extermination camp’ at Oswiecim.”

 

In other words, Goldmann and his colleagues knew at the time that the “diversion” argument was false. Yet Goldmann does not appear anywhere in the BBC’s film—not among the characters quoted, not in the narration, and not in the re-enactments. 

 

The producers preferred one of Goldmann’s subordinates, A. Leon Kubowitzki, because he proposed that Auschwitz be attacked by Allied ground troops rather than from the air. Kubowitzki was the only official of any Jewish organization who told U.S. officials he opposed bombing Auschwitz. By contrast, 30 different officials of Jewish organizations called for bombing. Yet the BBC film highlights Kubowitzki’s view, omits Goldmann, and disingenuously creates the impression that there were a few Jewish leaders in favor, and a few against. 

 

Kubowitzki was useful to the BBC producers—but only up to a point. The fact that he also circulated proposals for bombing the railway lines and bridges leading to Auschwitz was simply omitted. That historical information would have conflicted with the film’s narrative—and when the historical record conflicts with the preconceived narrative in “Bombing Auschwitz,” apparently history must give way.

 

GEORGE McGOVERN’S ROLE

 

Just before the film was completed, a BBC producer contacted me for an interview. When the project’s extreme bias became apparent, I declined to participate. I had good reason to expect that anything I said on camera which undermined the predetermined narrative would end up on the cutting room floor. Which, as it turns out, is exactly what happened with George McGovern.

 

In a series of telephone conversations and email exchanges with the BBC producer, Sue Jones, I explained that there had been no “debate” over bombing Auschwitz and that the main Jewish requests that U.S. officials rejected concerned bombing the railways and bridges. I pointed out to her that young George McGovern, the future U.S. senator and Democratic presidential nominee, was one of the pilots who flew over the Auschwitz region in 1944. In a lengthy videotaped interview some years ago (with the filmmakers Haim Hecht and Stuart Erdheim), McGovern said that bombing the railways and bridges would have been feasible. He noted that Allied pilots frequently bombed railways and bridges as part of the war effort, even though they were sometimes difficult to hit. 

 

McGovern said the “diversion” argument was just “a rationalization,” since he and other U.S. pilots were already flying over that area, and didn’t need to be diverted.  “Franklin Roosevelt was a great man and he was my political hero,” McGovern added. “But I think he made two great mistakes in World War Two.” One was the internment of more than 120,000 innocent Japanese-Americans; the other was the decision “not to go after Auschwitz…God forgive us for that tragic miscalculation.”

 

Ms. Jones told me she had never heard of McGovern’s role—despite the fact that the interview with him had been the subject of dozens of published articles in major newspapers, and even was screened on Capitol Hill for a Congressional task force.

 

I sent her the link to the video. Jones feigned interest. “I’m very interested in including McGovern,” she wrote me on April 1. And: “The McGovern interview is a good watch.” Not good enough, apparently. McGovern is not mentioned even once in the film. One of the most prominent figures in recent American political history was directly involved in the events in question—and yet the BBC could not find even a few seconds to mention him. Because, of course, what he had to say would have contradicted the film’s agenda.

 

“The film will be entirely faithful to the history of 1944,” Sue Jones of the BBC wrote me. In truth, “Bombing Auschwitz” is entirely unfaithful to the historical record. It is faithful only to the preconceived misconceptions of its creators.

Islamophobia Goes Global

World War II ended the reign of fascism and racial supremacy, and the victors of the war took pride in their cultural diversity, ethnic pluralism, and universal humanitarianism. They defined their modernity through the scope of civil rights and liberties not just for their own ethnic minorities, but for deprived and disenfranchised minorities globally. International organizations such as the United Nations addressed the same concerns beyond cultural identity and cultural relativism through the United Nations Development Programme (UNDP), the United Nations International Children's Emergency Fund (UNICEF), the United Nations Educational, Scientific, and Cultural Organization (UNESCO), and other such agencies. Nations adhered to, or at least respected, the principles of democracy. Even authoritarian states pretended to be democratic, at least in name: the Democratic People’s Republic of Korea, the German Democratic Republic, the people’s democratic republic of… Other nations proudly claimed to be ‘the beacon of democracy,’ ‘the world’s largest democracy,’ or ‘the Middle East’s only democracy.’ In the short interval before the East-West military realignment, it seemed that humanity had reached a critical turning point in its history and that the tyrannies of the past (genocides, pogroms, massacres, and holocausts) were just that: the past.

 

At the end of the Cold War impasse, however, as the Berlin Wall came crumbling down, the East-West ideological divide was replaced by a Middle East-West racial, religious, and cultural divide that culminated in the election of Donald Trump and the rise of neo-fascism parading as super-nationalism. With the imposition of the Muslim Ban, Bush’s “you’re either with us or against us” anti-terrorism campaign shifted into high gear and every effort of Obama’s attempted outreach to the Muslim world was thrown out the window.

 

Religious extremism was no longer lurking on the fringes of society as a rebellious opposition or a disenfranchised minority; rather, it was firmly established in the seat of power from Eastern Europe to the Middle East, from India and Myanmar to the Philippines, and, under the guise of ideology, even in Russia and China. Most of these governments and the media readily portrayed the terrorists as Muslims, but were reluctant to identify the victims of these terrorist and counter-terrorist campaigns as Muslim civilians.

 

The United States destroyed its neutrality as an impartial arbitrator in the Middle East conflict with concrete anti-Muslim positions such as recognizing Jerusalem as the capital of Israel, subsidizing unhindered settlements on occupied lands, rescinding the designation ‘occupied’ from Palestinian territories, and giving Israel the green light to drill in, and practically annex, the Golan Heights.

 

Even India’s Hindu nationalist government joined what appears to be an unholy alliance of sorts against Muslims by becoming the world’s largest customer for Israeli weapons and by turning Kashmir into one of the most militarized places on earth through a massive infusion of Indian forces. The Indian suppression of Muslims in Kashmir now resembles the Israeli suppression of the Palestinians. The Indian annexation of Kashmir mirrors Israel’s annexation of the Golan Heights and the rumored annexation of the West Bank. Both countries are wreaking havoc on the victims of the 1947 and 1948 partitions of India and Palestine, respectively. As the Palestinian BDS National Committee (BNC) claims: “The Israeli weapons that India uses to oppress Kashmiris have been ‘field-tested’ on Palestinian bodies.” The parallels play out further: the Hindu nationalists who consider India ‘the Holy Land of the Hindus’ recently denied the autonomy of the Muslim-majority state of Kashmir by revoking Article 370 of India’s Constitution and enacted the exclusion of Muslims in the Citizenship Amendment Bill (CAB), raising concerns about the bill's constitutionality amid growing anti-Muslim rhetoric in India. The Hindu nationalists seem bent on replicating the 1947 partition of the soil of India with the partition of the soul of India seven decades later.

 

Not to be left behind, China has sped up the cultural colonization of the Muslims in Eastern Turkestan (Xinjiang), where it has put some two million minority Uyghur Muslims in ‘re-education’ camps. Hundreds of thousands of children of the Uyghurs held in these camps have been taken to detention centers for forced assimilation and ‘cultural cleansing.’ Thousands of Uyghur intellectuals are imprisoned, some of whose cases have been documented by Radio Free Asia.

 

A BBC report reveals widespread destruction of mosques, and a Wall Street Journal investigation details the extensive use of cutting-edge technology by the Chinese government in the domestic surveillance of Muslims. CNN’s Matt Rivers referred to China’s policy of cultural repression as “the biggest human rights story on earth."

 

If you thought genocides like the Russian tyranny in Grozny or the Serb massacre of Muslims in Srebrenica were a thing of the past, think again. As Islamophobia goes global, Muslims have become every oppressive regime’s favorite minority to suppress—from the Rohingya in Myanmar to the Mindanao Muslims in the Philippines to the Muslim refugees fleeing their devastated homelands. Even the repressive regimes of so-called Islamic states are persecuting their ethnic, linguistic, or sectarian Muslim minorities: Saudi Arabia in Yemen, Turkey in the Kurdish lands, Syria against its own citizens. What this leaves behind is devastated cities, destroyed lives, and refugees displaced in the millions.

 

The sad irony is that this globalized Islamophobia is carried out not only by authoritarian regimes that have no concern for world opinion but also by political parties who themselves represent minority constituencies. The Republican Party, the Grand Old Party in the U.S., is anything but grand with only 32,854,496 members. The Hindu nationalist Bharatiya Janata Party (BJP) claims 180 million members, a mere 37% of the constituency in a country of more than a billion people. The Communist Party of China, even if we assume it adheres to democratic principles and procedures, has a membership of 90,594,000, even smaller than India’s BJP, for a population of roughly the same size. The Likud Party’s small representation is evidenced in its inability to form a majority government—proof positive that the tyranny of the minority has morphed.

 

The twists of irony are many, and they originate in pretty much the same period of time: the partition of Kashmir in 1947, the partition of Palestine in 1948, and the annexation of Eastern Turkestan in 1949. When foreign powers with their foreign solutions came to these unfortunate neighborhoods, they drew lines of otherness between people, making them foreign to each other. A critical turning point in human history? Not so fast.

 

Whether there is light at the end of the tunnel depends on how long the tunnel is. It’s déjà vu all over again. The political suppression in Muslim-majority states in the 1960s and 1970s resulted in the marginalization and radicalization of political Islam that produced the likes of bin Laden in the 1980s and 1990s. What the outcome of the upbringing of millions of Muslim youngsters in these refugee camps, and of this wave of vilification through Islamophobia, will be in a decade or two, only time will tell.

 

This rampant militarism cloaked in vapid nationalist sloganeering may be serving the excesses of unbridled corporate greed, but it is destroying a social order, with unforeseeable consequences that will be detrimental to civil order and world peace. If names like Sanna Marin (Prime Minister of Finland), Katrín Jakobsdóttir (Prime Minister of Iceland), Alexandria Ocasio-Cortez (U.S. Congressperson), and Greta Thunberg (Swedish activist) are signs of the future, then it is likely that the wounded soul of the world will be healed through a harmonic convergence of feminine energy, wisdom, and foresight.

How Neville Chamberlain Misread Hitler and Allowed the Third Reich to Threaten the World Order

 

When Adolf Hitler unleashed the might of the German armed forces against Poland on 1 September 1939, shock waves of horror and trepidation ran through the cities of Europe. After years in which Britain and France methodically capitulated to the territorial demands of the Fuhrer in an effort to attenuate the widely perceived punitive excesses of the Treaty of Versailles (1919), Hitler remained insatiable in his desire for vengeance and world conquest. The principal architect of the policy of appeasement, British Prime Minister Neville Chamberlain, entered the House of Commons and angrily denounced Hitler, stating that “the responsibility for this terrible catastrophe lies on the shoulders of one man – the German Chancellor, who has not hesitated to plunge the world into misery in order to serve his own senseless ambitions.” (p. 3) Members of Parliament cheered at his long-awaited defiance. One day later, however, Chamberlain backtracked and signaled the resumption of appeasement by endorsing a diplomatic initiative floated by fascist Italian dictator Benito Mussolini to resolve the crisis. Hence, the prime minister lost the support of his own Cabinet and most of the nation, and subsequently earned an infamous judgment at the bar of history as an icon of hollow and failed statesmanship.

 

In the new and magisterial study Appeasement: Chamberlain, Hitler, Churchill and the Road to War (2019), Tim Bouverie incisively reconstructs the ideological landscape of post-WWI Britain to explain how Chamberlain and other politicians and pundits misread Hitler and ultimately allowed the Third Reich to threaten the entire world order.

 

A Case of Mass Denial & Wish Fulfillment: Embracing Hitler, Proclaiming Peace

One year prior to the ascension of Adolf Hitler as chancellor of Germany in January 1933, the Deuxieme Bureau (French intelligence service) revealed a secret rearmament campaign by Berlin in violation of the Treaty of Versailles.  At nearly the same time, the Bureau obtained an unexpurgated copy of Mein Kampf (My Struggle, 1925) – a political manifesto penned by Hitler while in prison several years earlier – and attempted to warn Continental leaders of his ambitions to conquer France and embark upon military expansion.  Shortly after his visit to Hitler’s Germany in 1933, British MP Bob Boothby returned to London alarmed, declared the previously defeated nation to be in “‘the grip of something very like war fever’” and called for Britain to counter the clear intentions of Berlin with military preparedness.  Despite mounting and incontrovertible evidence of impending German aggression, most politicians and the public turned a blind eye to the ominous developments.  Why?  

 

Fifteen years after the end of the First World War, the heinous violence, ineffable carnage and countless atrocities generated by the conflict remained seared in the memories of the survivors. Rather than adopting the fourth-century Roman maxim Si vis pacem, para bellum (“If you want peace, prepare for war”) to deter other nations from seeking territorial aggrandizement, psychologically traumatized Britons and their equally affected Continental counterparts declared “Never again!,” relocated responsibility for the war to the pre-1914 arms race instead of the militarist course taken by Kaiser Wilhelm II, and all but declared peace at any price. In February 1933, the Oxford Union and its student membership declared in the majority “‘This House will in no circumstances fight for King and country,’” and the Labour Party called for a nationwide work stoppage to force the government to disarm only eight months later. (p.24-25) At the same time, a significant number of high-level government officials succumbed to the diplomatic machinations and manipulative wiles of the Fuhrer. While future prime minister Anthony Eden concluded “‘I find it very hard to believe the man [Hitler] himself wants war’” after meeting with Hitler in Berlin, Philip Kerr, the Marquess of Lothian and a leading political figure, became enthralled with the German chancellor to the point of characterizing Hitler as “‘a prophet.’” (p.49) Even former Prime Minister David Lloyd George, who led Britain during the First World War and signed off on imposing relatively harsh terms of surrender on Germany, not only partially succumbed to Nazi propaganda casting Germany as a victim of Allied hegemony but also sized up Hitler as “‘the Greatest German of the age’” from a short tea-time meeting with the Fuhrer in September 1936. (p.116)

 

Similar to Lloyd George, Prime Minister Neville Chamberlain believed adjustments and ameliorations to the Treaty of Versailles with “‘careful diplomacy [could] stave [war] off, perhaps infinitely.’” (p.132) Despite Hitler’s re-introduction of conscription to build a formidable standing army, an audacious announcement to rearm on a massive scale in March 1935, and the annexation of Austria by Germany (Anschluss) through intimidation and subterfuge three years later, Chamberlain and most of the British public – still haunted by the unprecedented death and destruction wrought by the previous war – ultimately rejected the stark realities of the Nazi menace and continued to vainly believe in a diplomatic solution. From Chapter XV “The Crisis Breaks” to Chapter XXIV “The Fall of Chamberlain,” Bouverie vividly recounts how the beleaguered prime minister succumbed to Hitler and his strategically cunning statecraft. After Hitler ratcheted up pressure on London and Paris by conducting a robust propaganda campaign and (falsely) accusing the Czech government of repressing the German population within its borders, Chamberlain traveled to Hitler’s Bavarian headquarters – the Berghof – and attempted to de-escalate tensions. On 15 September 1938, the British prime minister agreed to cede Czech lands containing a preponderant number of Sudeten Germans to the Third Reich over the objections of Prague. Two weeks later, in one final and fatal grand gesture of placation, Chamberlain flew to Munich for further discussions with the Fuhrer and assented to his demands for control of the Sudetenland. Upon returning to London, the prime minister declared the agreement would translate into “peace for our time” to a rapturous crowd. For his perceived accomplishment, his popularity reached new heights at home.

 

The Unlikely Resurrection and Triumph of Winston Churchill

As a result of his role in planning the disastrous Gallipoli campaign (1915-16) and subsequent shortcomings as Chancellor of the Exchequer (1924-29), Winston Churchill had become a relatively marginal figure and seemed en route to political oblivion by 1930. Yet Churchill, who had once egregiously characterized fascist Italian dictator Benito Mussolini as “the greatest lawgiver among men,” not only immediately diagnosed the threat of the Nazi regime to Continental security but also began issuing a long series of prescient warnings on both Hitler’s irredentist agenda and the price of appeasing a militarist state. Only ten months after the appointment of Hitler as chancellor (January 1933), Churchill publicly shared the contents of intelligence reports indicating a vast German buildup of “scrap iron, nickel and other war metals” for the purpose of military rearmament and implored the British government to increase the size of the Air Force. (p.31) By 1936, the oft-ignored and frequently dismissed former cabinet minister once again rose in stature with his uncannily accurate predictions of Hitler’s preparation and path to war. Upon the Nazi takeover of Austria without resistance from London, Paris or Moscow in 1938, Churchill exclaimed:

For five years…I have watched this famous island descending incontinently, fecklessly, the stairway which leads to a dark gulf…Now the victors are the vanquished, and those who threw down their arms in the field and sued for an armistice are striding on to world mastery. (p. 190-91)

 

After more than half a dozen years of decrying appeasement as a flawed and ultimately dangerous means to curb Hitler’s campaign to unravel the Versailles Treaty, Winston Churchill regained the trust of his party and the nation and entered 10 Downing Street as prime minister on 10 May 1940 at age sixty-five. In light of its dire consequences, the definition of appeasement quickly evolved from a strategy of containment to an unprincipled, ruinous policy of abject surrender by degrees. Thus, “Neville Chamberlain” and “appeasement” have become political tropes for willful national capitulation to the interests or designs of other nation-states.

 

A New Understanding of Neville Chamberlain & His Era

In Appeasement: Chamberlain, Hitler, Churchill and the Road to War (2019), Tim Bouverie has authored a focused, well-written and exemplary study of the socio-cultural and political dynamics of the rise and fall of appeasement in the 1930s. His insights into the wounded, collective British psyche in the aftermath of the First World War offer a holistic analysis of how and why appeasement flourished at the highest levels of government and within the body politic inside one of the largest and most powerful empires in world history. As such, his superlative monograph will reshape the historiographical debate over the years 1919 to 1945.

History & Law: GW History Professor Jennifer Wells Discusses How Her Study of Law Has Informed her Career in History

Recently, I had the pleasure of sitting down with Dr. Jennifer Wells, Assistant Professor in the History Department at the George Washington University, to discuss the path that led her to a professional career in history, her opinion on the future of the field in the greater context of collegiate academia, and some tips for current students considering graduate work in history and law.

As a current student in Dr. Wells’ class “Law, State, and Empire,” I also had the opportunity to ask the professor for some of her more personal thoughts regarding modern international law and international relations, and how important the application of what we learn in history classrooms is and will be to responding to the current challenges experts in these fields face. 

 

[Detlor]: Prior to pursuing your PhD in History, you studied/practiced law, correct? What was your area(s) of study/practice and what drew you to that field?

[Wells]: Yes, I went to law school and completed my JD at the University of California in 2010. I specialized in international law (even though you don’t need to specialize in anything in law school) and have an American Bar Association certification in that subject. I focused on international law because I have always found international relations and foreign policy fascinating – even as a kid, I remember being fascinated by things in the news: the conflict in the former Yugoslavia; the Troubles in Northern Ireland. Those were things that made headlines all the time in the 90s and they intrigued me. Extensive international travel in college reaffirmed and solidified my interest in the wider world and its peoples and cultures.

 

[Detlor]: What drew you to study History at the graduate level? Was there something unique about History or academia more broadly that you felt a career in law was lacking?

[Wells]: My path to academia wasn’t particularly straightforward in that it wasn’t my end goal. While I really loved (and still love) the policy end of law, I absolutely hated my first year of law school. Civ Pro and Contracts were excessively boring. I went straight from undergrad into law and in hindsight that was probably too young/too early. I’d always recommend students take a few years off and work prior to beginning graduate programs (more on that in a bit). You are so much wiser as to how life can just hit you over the head with a brick; you realize what it is to deal with difficult bosses; you know how hard you have to work to excel. College doesn’t really prepare people for “reality” in my opinion (your professors are pushovers) and law school is reality in a big way. It is 100% sink or swim. At 22 you are just way too young and naïve as to the ways of the world. I think going into a graduate program at 25 gives you a huge advantage. The people who had a few years on me and other fresh college grads really excelled. Conversely, once I was in grad school, I had a massive advantage at age 26 over my counterparts who were 22/23.

So in addition to feeling a bit at sea in law, I also found that I missed the research associated with history. I’d done an honors thesis in college and graduated summa cum laude because of it. I absolutely loved the original research – coming up with an idea, researching it, turning it into a story. I worked for a federal judge in San Francisco my first summer and found myself reading Past & Present instead of court briefs and realized I’d probably enjoy doing original research as a professor more than filling out templates as a litigator.

On top of that, while I loved the policy/criminal prosecution end of international law, the pay is comparable to a college professor (sometimes worse) and it is a tough field – just as hard/harder than getting a job in academia. Someone like Amal Clooney, an idol, put in many, many years of hard work to get where she landed. You can take all sorts of dud cases and work very hard and know you still probably won’t win. I figured I’d rather expend my energy in academia and have the freedom and flexibility afforded by academic schedules.

As a result, I applied to grad school in my second year of law school. I was admitted to several places and Brown allowed me to defer for a year in order to finish law school, which I did. Today, I am more grateful for my law degree than my PhD; its versatility is incredible and it opens so many doors.

 

[Detlor]: What field(s) in History do you study? What drew you to them?

[Wells]: My specialization in history is Britain and Ireland, specifically during the early modern period. I was drawn to Irish history because my mother’s family is Irish. Growing up, my grandparents shared all sorts of stories about Ireland, their families’ opinions of it, how they emigrated to the United States, etc. As I mentioned earlier, the Troubles was a constant news item when I was growing up and I was curious to learn more about this ancient enmity between Irish Catholics and English and Irish Protestants in the North. Personally, I don’t think you can focus on Irish history and not have a deep understanding of British history. I’ve always found the story of the British and their empire interesting. How is it that some tiny, insignificant, rain-drenched island in the North Atlantic on the fringes of Europe could, at its height in 1900, rule more than ¼ of the globe with xxx people owing allegiance to the House of Windsor? Britain has arguably had the biggest impact on the modern world – in no small part because of its former colony, the United States. The global lingua franca is English for a reason. And, in the words of an Irish friend, “If you see straight lines on a map, you know the British were there.”

My focus on the early modern world (that is, the world prior to the eighteenth/nineteenth century, depending on who you ask) is a result of taking a number of courses in the subject with a talented professor when I was an undergraduate. It gave me a firm understanding of the history and historiography that provided a seedbed for graduate study. On top of that, understanding early modern Europe is critical for understanding the world today. Our (and here I mean the West’s) great advances in science, technology, statecraft, the arts, religious pluralism, even the concept of the individual – originated in Europe between the fifteenth and eighteenth centuries. It was a remarkable time for human advancement and ingenuity and shaped the modern world.  In a British and Irish context, this is really when Britain began its expansion across the globe. The foundations it laid in the seventeenth century were vital to its ultimate domination in the nineteenth and early twentieth centuries.

 

[Detlor]: Would you say your background as a lawyer has influenced your study of History? In what ways? 

[Wells]: Definitely. I probably approach history much more as a lawyer than as a historian, chiefly in terms of looking at arguments as flimsy or lacking evidence/rigorous analysis (and a lot of historical arguments do). I’m much more analytical as a result of law and try to immediately make an assertion and back it up with evidence when I write; I think it’s a very effective way of writing but I’m not sure that I would have mastered it had I not gone to law school. The other thing that law really helped with was knowing what a good argument is; there are all sorts of bad arguments and weak arguments – mostly because the idea is too narrow, lacks evidence, or (the biggest) is simply not important. Sometimes I listen to projects people are working on (knitting in a 12th century convent or selling shovels in Indiana in the 19th century) and I’m like, “Who cares? And what is the point?” There are all sorts of interesting factoids floating around out in the world but that doesn’t mean you should focus upon one, shape an entire project around it, and re-tell it to a broader audience. Because law is the foundation of our society – and world society – it will by default address more pressing issues. That is something that guides every single thing I write and every single project I undertake: “why is this important? Why is it transformative? What is it telling us about our world that is critical and fundamental?”

 

[Detlor]: Conversely, how has your study of the history of states and imperialism influenced the way you think about international law, if at all? 

[Wells]: History is valuable for studying international law because you understand the wider context in which these laws were (and are) crafted and the numerous events that led to some incident occurring. Law by itself is narrow and bereft of context. Just look at the law professors who spoke at the House Judiciary Committee’s impeachment hearing recently. They had to provide our legislators with the historical context of impeachment. To me, a lot of this stuff was pretty basic history for anyone who has gone to law school or gotten a BA in history and loaded up on courses in early modern Britain and early America. But for most people, the history was illuminating and helped them to understand where impeachment law came from, what the framers of the Constitution had in mind, the English context for impeachment, etc. When writing on international law, my historical knowledge of various events and the roots of such events (which were often sown centuries before some sort of incident occurred) is invaluable.

 

[Detlor]: I'm currently enrolled in your "Law, State, and Empire" class. I'm curious as to your opinion on the state of the modern empire. Does such a thing exist today? How might it differ from more traditional empires of the past?

[Wells]: I always find the “modern empire” question a bit bogus when historians or academics in various cultural studies fields bring it up. Sure, you can argue that America has “an empire” or China has “an empire,” but their empires don’t really hold up in traditional thinking, nor would there be any legal argument you could make to support this line of thinking. Empires were traditionally (and in my opinion still are) founded upon the notion of sovereignty – that an entity or ruler exercises legitimate authority over some jurisdiction. This necessarily involves land or sections of the sea. America may well influence the cultures of a variety of nations but in no way does it exercise sovereignty over the other 192 member nations of the United Nations. Decisions in the US do impact people across the world; but it’s wrong to claim the US is an empire and I don’t know of any lawyer or law professor who would take that line of argument since there is little to sustain it.

 

[Detlor]: We've talked extensively about the varying degrees of efficacy and importance of international organizations such as the UN in class as well. I think you have the unique perspective of having studied the creation of such organizations while possessing concrete experience in international law. Considering this, how would you rate the importance of these organizations? Do you predict them continuing to have a place in international relations into the future?

[Wells]: I think institutions like the UN, the European Union, NATO, Interpol, the International Criminal Court, and so on are very important to the world. Sure, these institutions are flawed and they don’t work all that well. But until someone comes up with something better, I don’t see another alternative. There is a symbolic importance to most of these institutions too. They underscore humanity’s higher ideals, better angels, and aspirational goals – and for that reason, these institutions are very important. Having people come together and say, “Okay, despite our differences, we acknowledge that we share xyz and this institution is a mechanism by which to safeguard xyz” is powerful and important for long-term stability.  Systemic change that would lead to better enforcement and cooperation at the international level can only happen if powerful countries like the US and China start to play by the same rules as everyone else. The US’s (repeated) failure to sign international treaties or ignore/refuse to sign UN resolutions is a genuine problem. If you want something to work, the biggest players have to have just as much skin in the game and be willing to adhere to the same standards as everyone else.

As for whether these institutions survive – I think they will but a lot is highly dependent on where humanity goes in the next few years. In the US and Europe, we have been destroying each other for nearly twenty years as people and politics have become more polarized and I think we’re now at a legitimate cross-roads. One road sees us continuing along this path of division, the nativism, populism, and xenophobia. If Trump wins in 2020, I firmly, 100% believe that this is the way we go and it is 100% the wrong direction. Should that occur, the US will become increasingly isolated from the rest of the world. I don’t know that NATO will survive. Institutions like the UN could well become moribund by the close of the 21st century. The EU might actually strengthen in response to an isolated America but there is also the probability that Europe’s own conflicts pull it further apart. This lack of leadership in the West then allows China to fill the vacuum. It has already laid the groundwork for this, pouring billions into infrastructure in Southeast Asia, Africa, and Latin America. People may complain about an American-led world order but do you like the alternative? Our other path is one in which we embrace institutions, both at home in the US and internationally, that are meant to safeguard basic, fundamental rights.

 

[Detlor]: Another aspect of your class that I've enjoyed has been your consistent application of the history and theories we study to modern issues of international law and international relations. What value do you see in this aspect of application of history in the classroom? Do you think it's something that historians should be engaging in more in the classroom?

[Wells]: 100% yes (preach!): there needs to be more application of history to contemporary events in the classroom. One of the constant anxieties of historians or the Chronicle of Higher Education or any number of academic journals and news outlets is the decline of historical thinking (there was an Atlantic article on this recently) and the slow but steady decline of history as an academic discipline. I share these concerns. However, to play devil’s advocate, I also think the decline of history as a discipline can be attributed to the way it is taught and the failure of professors to make it relevant to students (or to research and write on relevant topics). This is something many of my colleagues will likely bristle at but sadly statistics show that the way we are continuing to teach history and train historians actually contributes to fewer majors and a decline in the discipline. If I can’t take someone seriously who writes about 12th century convent knitting then why on earth should a student or the general public? This goes back to my earlier point about relevance – if you cannot articulate to me why farming patterns in early modern France are relevant, then for the love of god, don’t teach it or write about it.

There are millions of ways to make things relevant too: for instance, in our course on Law, State, and Empire, I had you read things about resource exploitation in early modern Europe and apply the ideas of contemporaries like Grotius to resource exploitation in Africa today. The actions of modern men are all rooted in past practices. We are simple creatures. There is very little new under the sun; what is new is the way we apply ideas and knowledge to current problems and adapt existing infrastructure for innovative ends. Having an understanding then of history and how people thought about a particular subject is vitally important for resolving issues in our current world, but it involves moving beyond our narrow historical parameters to wider application -- again something historians might freak out about! As an aside and a good example of this: I gave a conference paper a year ago on the origins of refugee law and a colleague talked about climate crises in early modern Europe. Several people said to us it was “ahistorical” to bring up these issues in historical context and apply the ideas to modern problems. Ahistorical or not, we wound up with a book contract with a popular press on the topic – which shows that there is a public appetite for seeing past ideas applied to present predicaments.

Applying history to the present also shows why history is still highly relevant as an academic discipline and a major. Former UN Ambassador Samantha Power remarked in her recent memoir, Education of an Idealist (a fabulous book that I highly recommend), that the single most important course she took was a seminar entitled “The Use of Force: Political and Moral Criteria.” She was enrolled in law school at the time but the course was offered to Harvard’s undergrads and she enrolled. The professors in the class assigned students various readings from medieval and early modern thinkers – Aquinas, Augustine, Niebuhr, Michael Walzer – and then asked the students to apply the ideas of these men to contemporary events, including the Vietnam War, the Gulf War, and the 1992-93 US intervention in Somalia. I read that section of her book deep into our class on Law, State, and Empire but was delighted to see that the methodology for our course had had an outsized impact on a UN Ambassador and led her to pursue a career in human rights.

 

[Detlor]: As we move into 2020, what would you say are the top three biggest challenges the field of international law will tackle in the next decade?

[Wells]: An easy one: climate change, along with the two issues that will grow out of it, increased conflict and increased migration crises.

 

A Museum For The People: Anacostia Community Museum

Originally named the Anacostia Neighborhood Museum by the Secretary of the Smithsonian Institution, S. Dillon Ripley, the Anacostia Community Museum opened in 1967 as the first federally funded community museum. At the time, the museum was an outreach effort by the Smithsonian to represent the local African American community. John Kinard, a local community activist and minister, was appointed director and employed his skills in community engagement, organizing, and outreach to shape the practice and direction of the museum. Beginning in the 1980s, the exhibition program turned to broader national themes in African American history and culture with a focus on preserving that history. This effort was followed by a series of exhibitions featuring African American figures such as André Leon Talley and Camille Cosby. 

 

The museum’s name was later revised to the Anacostia Museum and Center for African American History and Culture to reflect these efforts, which is how the museum became a prototype for the new National Museum of African American History and Culture that opened in 2016. Once this new museum opened on the National Mall, the Anacostia Community Museum reconnected with the early work spearheaded by Mr. Ripley. By rebranding itself as both a museum and a center for history and culture, the ACM began to serve broader audiences and foster unique exhibitions that focus on community issues and local history. In 2006, the museum’s name was changed to the Anacostia Community Museum to reflect a renewed commitment to examining issues of contemporary urban communities. Throughout its storied history of over 50 years, the museum has remained relevant, developing documentation projects, exhibitions, and programs which speak to the concerns, issues, and triumphs of communities and which tell the extraordinary stories of everyday people.

 

On December 6, 2019, the Anacostia Community Museum (ACM) and the Asian Pacific American Center (APAC) held their symposium “A Museum for the People: Museums and Their Communities, 50 Years Later.” The symposium was organized to outline the future of community-based cultural organizations and how these organizations could meaningfully serve the needs of communities. Scholars, museum professionals, community members, and other arts and culture workers convened to explore what it means to be a museum for the people in the 21st century and beyond. The moderators for the panel discussions were Melanie Adams, Director of the Smithsonian Anacostia Community Museum, and Lisa Sasaki, Director of the Smithsonian Asian Pacific American Center. During the symposium, they asked the panelists how museums could transform communities and vice versa. The goal of the symposium was to have an open and honest discussion about successes in community-based work, the challenges these organizations face, and potential strategies for working together to bring about lasting, positive change. 

 

After the symposium, I interviewed Perry Paul, Director of Education and Outreach for ACM, to talk about the educational programs being implemented to help the community. When discussing how ACM represents the people of Anacostia, Paul said they represent them through their mission: together with local communities, ACM illuminates and amplifies their collective power, preserving communities’ memories, struggles, and successes. Their vision for urban communities is to activate their collective power for a more equitable future. Paul said the exhibition they have now, A Right to the City, examines gentrification in Washington D.C. He says they are trying to “educate the public about these issues through a two-way dialogue.” By using a historical perspective they want to “engage people with civic engagement and civic activism.” An important reason why the education programs have flourished is their collaborations.

 

The ACM is currently partnering with the DC Public Library for its A Right to the City exhibit and with Martha’s Table for a new exhibit focusing on food access that will open in Fall 2020. Paul says these collaborations are part of ACM’s concept of a museum without walls. Their goal is to bring exhibitions and programming offsite into the communities they serve. Paul says it was “a conscious decision in the education program to take [the museum] off site into the DMV area...to do author talks, lectures, and panels.” This way they can make their programming inclusive of all of its primary audiences. To help museum staff remember their primary audiences based on the new strategic plan, they like to use the acronym DMV, which stands for: DC history lovers, the Metropolitan DC area, and visitors involved in civic engagement activities or organizations. This embodies two different communities, a local constituency and a wider audience, which allows ACM to develop programs for everyone interested in African American history and culture. 

 

When asked what makes the Anacostia Community Museum different from other Smithsonian museums on the National Mall, Paul explained that they take a “microscopic approach” to the way they address issues. He described how the other Smithsonian exhibitions take a large view of things; because ACM is smaller, it can go into the weeds of an issue. They focus on the unsung voices. For example, the civil rights movement was a time of growth for the African American community and it was important to document those achievements. ACM decided to take a grassroots approach and talk about people who weren’t in the history books but still made an important impact during that period. By breaking down issues at a local level, ACM has had a meaningful influence on the people of Anacostia. Through education programming, ACM has created meaningful dialogues between people and made them want to do something for their community. This is how ACM continues its ongoing efforts to represent, amplify, challenge, and innovate for the community of Anacostia and the DMV area.

What Dreams of Canada Tell Us About Race in America

 

A few days into the new year, Americans awoke to news that the U.S. had assassinated Iranian General Qasem Soleimani. Fears of military conflict with Iran dawned on thousands, search engine hits for “World War 3” soared, and both Iranians and their loved ones in America braced for what might be next.

 

Although U.S. military personnel overseas were the ones closer to harm’s way, many young people at home in the U.S. immediately wondered if there could be a draft. By 8 a.m. on January 3rd, the Selective Service, which maintains records of those registered for military draft in case of war, was reporting website overload. On social media, users posted memes about leaving for Canada, imagining spontaneous road trips north and draft-safe igloos. Others posted more poignant messages, like the mother of an 18-year-old who sat her son down for “the talk” about moving north if the draft became real. Why does Canada spring to mind so quickly when Americans fear war?

 

Canada’s unique role in the American imagination comes in part from its very real history as a source of refuge. During slavery, enslaved Africans sang songs with encoded messages of resistance: Follow the north star to freedom, in Canada. That was precisely what thousands of enslaved people did, founding Canada’s largest early Black community. 

 

One hundred years after slavery’s official end, a new generation of Americans sought refuge in Canada. Their goal was to escape participation in the Vietnam War, and they couldn’t help but see themselves as a modern-day version of the Underground Railroad, writes historian Wendell Adjetey. What resulted over the next decade was a massive, highly organized project to resist the war by draining the U.S. military of its human power. 

 

Roughly 50,000 men are estimated to have made it out of the draft’s crosshairs by fleeing to Canada. Female activists, although exempt from the draft, approached their political role with equal seriousness, sometimes posing as partners of male resisters to help them cross the border without arousing suspicion. Volunteers in both countries threw themselves into the work of helping young people get to Canada: staffing hotlines, counseling youth about their options if drafted and putting them in touch with people and resources to make the journey to Canada go smoothly. Canadians put pressure on their government, staging clever border actions to demand immigration officials welcome a larger number of resisters. 

 

But when it came to the Underground Railroad analogy, there was one problem. At a time when American casualties in Vietnam were disproportionately African American, most of those who successfully made it to Canada were white. Other racialized groups among the resisters in Canada are scarcely discussed in sources on the period. 

 

The experience of Black resisters in Canada was fraught. While attempting to cross the border, they faced scrutiny that white resisters did not. For those who made it into Canada, adapting to their new country was difficult and blending in was impossible. Local demographics, art and culture felt so white that resister Eusi Ndugu compared arriving in Canada to “jumping into a pitcher of buttermilk…There’s a race problem here, just like in the Northern cities of the U.S.” Although Canadians were polite, Black resisters could sense a “subtle anti-Black bias” – not just among locals, but among white resisters as well. 

 

To combat their alienation, Black resisters took matters into their own hands. In 1970, a small group of them founded BRO, the Black Refugee Organization. BRO members helped newer arrivals meet their needs and worked out a plan to match African-American resisters with local Black Canadian families - which involved bridging cultural gaps with a now predominantly Caribbean community. Yet very few Black resisters ultimately remained in Canada. BRO members soon urged Black resisters still in America to “stay there if it is at all possible – do what you can to resist there.” 

 

The experience of white resisters was worlds apart. For many, Canada became a new home; a place to reinvent themselves among like-minded peers and create lives of meaning. Canadians welcomed them warmly; even government officials later called the influx of war resister immigrants “the largest, best-educated group this country ever received.” When a 1977 amnesty allowed draft evaders to return home without punishment, thousands chose to stay in Canada.

 

Resisters of all races went to Canada because they were worried for themselves – but also because of their horror at what was happening to Vietnamese civilians. For a generation whose political awakening had begun with Civil Rights, it was difficult not to see bombing and napalming brown-skinned Vietnamese civilians as a racist war. Many early anti-war activists had learned their tactics of non-violent civil disobedience from Civil Rights work. The young Black activists of the Student Nonviolent Coordinating Committee led the way in making this connection between war and racism explicit. Dr. Martin Luther King, Jr. soon followed, gradually making his opposition to the war more visible in his Civil Rights work. 

 

The desperation that many young people felt about stopping the war was summed up by the words of 22-year-old Civil Rights activist Mario Savio. Three months before the deployment of U.S. combat troops to Vietnam, the young Italian-American cried out to student protesters, “There's a time when the operation of the machine becomes so odious, makes you so sick at heart, that you can’t…even passively take part. And you've got to put your bodies upon the gears and upon the wheels…and you've got to make it stop.” 

 

For a decade during the Vietnam war, young Americans tried every means they could think of to stop that machine. They sabotaged it by destroying draft records, stood in its way by blockading the trains that moved troops, and made it difficult for napalm manufacturers and recruiters to show their faces in public. By going to Canada, thousands attempted to remove themselves from the machine’s gears entirely. Yet none of them could escape the way that, in the end, racism shaped their available options.

 

Leaving America surely saved some resisters’ lives. But the machine they fought is still at work. It is held together by the message that some lives –at home and abroad –count as less human than others. A new generation will now face the question of how to dismantle it.

The History of Presidential Misconduct Beyond Watergate and Iran-Contra

 

In 1974, James Banner was one of 14 historians tasked with creating a report on presidential misconduct and how presidents and their families responded to the charges. The resulting report was a chronicle of presidential misconduct from George Washington to Lyndon B. Johnson. It catalogued moments of corruption episode by episode, with no connective tissue between administrations. 

 

The historians delivered the report in eight weeks and it was prepared for distribution to the House Judiciary Committee. But then Richard Nixon resigned and the hearings it was designed for never happened. The report was published as a book but it got very little attention. Very few American historians even heard of it. “And thus it lay,” stated Banner. 

 

Then, in August 2018 historian Jill Lepore called Banner and asked him about the book. Afterwards, Lepore wrote about the 1974 report in the New Yorker and the press and surrounding political events ignited interest in it. To bring the chronicle of presidential misconduct up to date, Banner identified and recruited seven historians to write new chapters so that a new version of the book would end with Barack Obama. Presidential Misconduct: From George Washington to Today was published by the New Press in July 2019.

 

At the American Historical Association’s annual meeting in early January, Banner moderated a panel with three of the book’s new authors: Kathryn Olmsted, Kevin M. Kruse, and Jeremi Suri. Examining the presidencies of Richard Nixon, Jimmy Carter, and Ronald Reagan, the historians discussed how the recent past shapes the current discussion of presidential misconduct.

 

Kathryn Olmsted examined presidential misconduct beyond Watergate in the Richard Nixon administration and argued that abuse of power and law-breaking were central to Nixon’s presidency. 

 

Dr. Olmsted urged historians to remember just how unusual Nixon was. Popular accounts of Watergate often minimize the crimes and focus on the subsequent cover-up, but Nixon’s crimes were substantial and began before he was elected. 

 

During the 1968 election, President Lyndon B. Johnson announced that if the North Vietnamese made certain concessions, he would halt bombing campaigns and begin negotiations with the North Vietnamese. Nixon publicly agreed with LBJ’s stance, but privately he took action to sabotage the plan. Nixon enlisted Anna Chennault, a prominent Republican fund-raiser, as a go-between to encourage South Vietnam not to accept the negotiations. When North Vietnam signaled it would make the necessary concessions, many thought the war would end, and Hubert Humphrey, the Democratic candidate for president, started to do better in the polls. Then the South Vietnamese indicated that the concessions would not be sufficient for them to negotiate.

 

LBJ knew Nixon influenced the potential negotiations because he instructed the FBI to listen to Nixon’s phone calls. LBJ believed Nixon’s actions amounted to treason but he did not have definitive proof to show Nixon knew about the entire operation so LBJ did not reveal the information publicly.

 

It is likely the Chennault Affair contributed to the paranoia that eventually led to the Watergate break-in. F.B.I. Director J. Edgar Hoover informed Nixon that LBJ knew of Nixon’s role in sabotaging negotiations with North Vietnam and Nixon became obsessed with the idea that the Democrats had information that could hurt him. 

 

Once in office, Nixon’s illegal behavior snowballed. He authorized secret bombings of Cambodia and warrantless wiretaps on news reporters, and he created the infamous “Plumbers.” The Committee to Reelect the President raised 20 million dollars--much of it acquired through bribery and extortion--and then used the funds for massive harassment and surveillance of Democrats during the 1972 election. 

 

Thus, illegal behavior was central to Nixon’s conception of the presidency. Nixon himself explained this to David Frost in 1977. Nixon insisted he should have destroyed the tapes and maintained that “when a president does it that means it’s not illegal.”

 

Princeton historian Kevin Kruse discussed the presidency of Jimmy Carter. While Carter had a pronounced commitment to ethical government, a closer look at his presidency shows that even those who try to meet anticorruption standards can be brought low by their efforts. 

 

Even before Watergate, Carter presented himself as a political outsider. After Watergate, Carter capitalized on American concerns about a lack of morals in politics. Carter wanted to seem as trustworthy as possible on the campaign trail, and after he was elected he worked to maintain the public’s trust. Carter famously put his peanut farm into a blind trust, and the White House implemented stricter rules regarding conflicts of interest and financial disclosure. 

 

Because of these promises, Carter’s family came under intense scrutiny and the constant hunt for dirt hurt the administration.

 

Carter asked Bert Lance, the incoming director of the Office of Management and Budget, to put his stocks in a blind trust. Lance agreed, but then the stocks began to plummet, so Lance delayed doing it. In response, a Senate committee and the Comptroller General opened investigations into Lance. The investigations revealed sloppiness but concluded there was no wrongdoing. Nonetheless, Carter’s administration was consumed by the Lance investigation, and Carter stuck by his friend despite his staff’s recommendation to fire him. Finally, in the fall of 1977, Carter pressed Lance to resign; in retrospect, Carter realized he should have done so sooner. 

 

Next, a scandal emerged centered on Carter’s peanut warehouse, which had been put in the blind trust. As the business had fallen on hard times, it was revealed that Bert Lance had once extended a loan to the warehouse. Many were concerned that the funds had been diverted to Carter’s campaign. A team of investigators reviewed 80,000 documents, and Carter even gave a sworn deposition (the first time a president was interviewed under oath in a criminal investigation). The investigation concluded there was no evidence that money had been diverted. 

 

The third scandal of Carter’s administration centered on Billy Carter. Billy ran a gas station and when the business started to struggle, Billy tried to capitalize on his brother’s fame. In September 1978, Billy took a trip to Libya seeking to make a deal with Muammar Gaddafi. While there, Billy made anti-Semitic comments and urinated on the airport tarmac.  This all caused a great deal of embarrassment for Jimmy Carter. Worse, it was soon revealed that Billy had received hundreds of thousands of dollars in loans from Libya. The scandal was dubbed Billygate. After a Senate investigation, a bipartisan report concluded Billy had not done anything illegal. 

 

These sloppy practices each invited close investigation, but in each case officials concluded the acts were not criminal. Nonetheless, these scandals overshadowed much of Carter’s presidency and demonstrate that Carter’s actions never quite lived up to his high-minded intentions. 

 

Jeremi Suri, a historian at the University of Texas at Austin, discussed presidential scandal during the Ronald Reagan administration. Suri noted that he was surprised at how little attention historians have paid to presidential misconduct, likely because historians like to stay away from scandal and research more “serious” events. Suri, however, thinks that misconduct was central to policy for Reagan.

 

Scandal under Reagan is a paradox because Reagan personally was not corrupt—he did not personally profit from misconduct—and was averse to discussions he thought were unseemly. Nonetheless, because of his personal qualities, Reagan’s policies were built on a pyramid of misconduct, or a “cocktail of corruption,” centered on the intersection of deregulation, ideological and at times religious zealotry, lavish resources, and personal isolation from the daily uses of those resources. 

 

In other words, the institutional structure of the executive branch created incentives for corrupt behavior. Strikingly, over 100 members of the Reagan administration were prosecuted and 130 billion dollars were diverted from taxpayer uses. 

 

Dr. Suri focused on a few particular scandals, starting with the Environmental Protection Agency. Reagan appointed Anne Gorsuch, the mother of Supreme Court Justice Neil Gorsuch, as the administrator of the EPA. Gorsuch directly negotiated contracts with land developers, and when she was investigated she refused to turn over documents or testify and was held in contempt of Congress. Her deputy served two years in prison. 

 

Secretary of the Interior James Watt was forced to resign after he made explicitly racist comments. After his resignation, Watt used his connections at the Department of Housing and Urban Development to lobby for his friends to get contracts to build affordable housing that wasn’t actually affordable. Watt was indicted on 25 felony counts. 

 

As Reagan approached his second term, many advisors resigned and became lobbyists, flagrantly going past the legal limitations on lobbying. Those who continued to work in the White House continued the corruption that plagued the administration in its first term. 

 

Attorney General Edwin Meese combined petty corruption with the gargantuan. Meese would try to get double-reimbursed for expenses. He took out personal loans from people who were bidding for government contracts, did not disclose the loans, and then lobbied for the lenders to get the contracts. Meese was not convicted but resigned. 

 

Assistant Secretary of the Navy Melvin Paisley stole 622 million dollars from the government. The F.B.I. concluded this was a consequence of large-scale appropriations with insufficient oversight. 

 

Suri argued that the Savings and Loans Crisis and the Iran Contra scandal emerged from an administrative culture that was unregulated, permissive in the misuse of resources, and lavish in spending. He concluded that this structural corruption has not gone away. 

 

To conclude the panel, James Banner gave a thoughtful comment that connected the history discussed to the present impeachment of President Donald Trump. Nixon was a pioneer in orchestrating misconduct from the Oval Office. Reagan pioneered allowing a shadow administration to implement policies that were not approved by Congress. To Banner, it seems that the Trump administration is doing both of these things at the same time. 

The History Behind the Border and Immigrant Detention Centers

How can history help us understand the detention of undocumented immigrants along the Southern border? At the American Historical Association’s annual meeting earlier this month, four historians examined this issue from different angles on a panel entitled Late Breaking: The Border Crisis in Historical Perspective. 

 

First, W. Fitzhugh Brundage, a professor at the University of North Carolina at Chapel Hill, examined how the history of human rights and torture illuminates the detention of migrants. 

 

All refugees are guaranteed broad protections under international law. These protections included a promise that an asylum seeker would not be returned to the country they were fleeing and protections for minor children and families. 

 

These protections were less consequential in reality than hoped. International law provided no monitoring or formal enforcement mechanisms; instead, it relied on the good will of signatory nations. The United States adopted the narrowest definitions of asylum and persecution, so they applied only to a small portion of migrants. 

 

In 2005, under George W. Bush’s administration, the United States began deporting apprehended migrants using “expedited removal.” Operation Streamline established zero tolerance towards unauthorized border crossings and made it a misdemeanor. Deportations subsequently skyrocketed. Barack Obama expanded the policy and the number of people prosecuted for unauthorized border crossings quadrupled. 

 

Now, President Donald J. Trump has created an unprecedented campaign to delegitimize and dehumanize asylum seekers. Trump casts the southern border as a national security dilemma. Historically, national security has always been the largest justification for violating civil liberties and human rights. 

 

For much of American history, immigration was not considered a law enforcement issue. At the end of the 19th century, the government created formal oversight of immigration and placed it in the Department of the Treasury. In 1903, the duty moved to the Department of Commerce and Labor, and later to the Department of Labor. Then, in 1940, immigration oversight moved to the Department of Justice. In 2003, it moved to the newly created Department of Homeland Security, solidifying the view that immigration is a matter of national security. This latest move is particularly troubling because the Department of Homeland Security has very little oversight, so it is hard for lawmakers to regulate its actions.

 

Recent news coverage has illuminated the systematic and pervasive human rights violations of agencies like the U.S. Immigration and Customs Enforcement (ICE).  Most egregiously, families are separated and people are indefinitely detained.  Inhumane conditions have amplified the problem, especially overcrowding. 

 

The intent of such policies is to make conditions in immigration detention so bad that asylum seekers will think twice about seeking asylum in the United States.

 

This indefinite detention is a glaring violation of the UN's ban on torture. Discussions of torture always involve debates over what torture actually is; some defend a practice by dismissing any suggestion that it is actually torture. The controversy over using the term “concentration camps” for the immigrant detention facilities shows the tension of the moment. 

 

The bottom line, however, is that the DHS and DOJ are committing huge violations of human rights. Trump will say these policies are necessary for national security, but the evidence shows that asylum and immigration enforcement are meant to intimidate, humiliate, and terrorize. 

 

Lauren Pearlman from the University of Florida discussed the economics of immigration detention with a specific focus on the role the private prison industry plays in immigration policy. 

 

Ellis Island, opened in 1892, was the first detention center created for immigrants. It wasn’t until the 1980s, however, that large numbers of people began to be detained. 

 

In 1988, Congress passed the Anti-Drug Abuse Act, which mandated the detention of noncitizens convicted of aggravated felonies. The aftermath of September 11, 2001 cemented the securitization of immigration detention policy. The newly created Department of Homeland Security took over the duties previously administered by the Immigration and Naturalization Service (INS) under the Department of Justice.   

 

The Immigration and Naturalization Service was one of the first federal agencies to use private prison operators. By 1988, 800 of the 2,400 people in INS custody were housed in private facilities. 

 

Andrea Miller, an anthropologist at the University of California Davis, examined drones, air policing, and immigration. Dr. Miller asserted it was important to understand the longstanding connection between war power and police power. Atmospheric policing technologies (like helicopters, airplanes, and balloons) have long been used to control populations and are connected to tear gas, sonic weaponry, etc. 

 

Recently, U.S. Customs and Border Protection (CBP) announced it would test a small drone system weighing just 14 pounds. Incorporating this technology could close gaps in and “enhance situation awareness,” a military term that denotes extending police awareness beyond human ability. 

 

The use of drones also broadens the extensive system of immigration surveillance. Today, undocumented immigrants are most likely to be apprehended in a traffic stop. ICE has access to data from license plate scanners and readers; this data is stored and can be requested to create snapshots of where people live, travel, and so on. This mirrors the pattern-of-life analysis used in the War on Terror: data points are placed in relation to each other to reveal patterns of mobility and to determine whether someone is threatening. 

 

Dr. Miller is also interested in the connections between military tactics and tools and local police forces. For example, the Los Angeles Police Department first acquired drones in 2014. By 2017, the LAPD had posted suggested guidelines for its unmanned aerial vehicle (UAV) pilot program, which has since been formally adopted. To Miller, it is telling that police and immigration agencies are adopting similar technologies adapted from the military. 

 

Finally, Stuart Schrader of Johns Hopkins University discussed El Salvador and the history of border patrol aid. 

 

Dr. Schrader examined the legacy of aid and immigration in the context of El Salvador. As Trump threatened to cut off aid to Central America, news circulated in September 2019 that El Salvador received enough American assistance to create a new border control agency. The agency would consist of 400 officers, 300 of whom would be dedicated to immigration and the other 100 working as part of a national police force. 

 

The United States has offered technical assistance for border patrol for a long time. In the 1950s and ’60s, the State Department worked with El Salvador to improve surveillance at its borders. 

 

U.S. assistance has always been designed to meet American policy goals. Previously, the U.S. Border Patrol was central to state building and to fighting the Cold War in the Third World. Today, the U.S. provides assistance to places like El Salvador to keep people inside their own country and prevent them from immigrating to the U.S. The irony is that the U.S. is responsible for creating many of the problems Salvadorans are trying to escape, and attempting to prevent people from leaving puts more pressure on El Salvador. 

 

Each of these perspectives offers a historical frame to help us understand the deep historical trends that shape daily headlines. 

Roundup Top 10!  

Why No GOP Senator Will Stand Up to Trump

by Garrett M. Graff

Barry Goldwater had the power to tell Nixon it was all over. But don’t expect a repeat this time.

 

Martin Luther King Jr. on Making America Great Again

by Justin Rose

“I’d like somebody to mention that day that Martin Luther King, Jr., tried to give his life serving others…I want you to say that I tried to love and serve humanity.”

 

 

The Injustice of This Moment Is Not an ‘Aberration’

by Michelle Alexander

From mass incarceration to mass deportation, our nation remains in deep denial.

 

 

The National Archives' dangerous corruption of history

by David Perry

While the National Archives issued an apology and vowed to undergo "a thorough review" of its policies after the Washington Post first reported on the alteration, having discovered it by chance, as a historian I worry about how many other altered documents the Trump administration has buried in our records. Will we ever know?

 

 

The Road to Auschwitz Wasn't Paved With Indifference

by Rivka Weinberg

We don’t have to be ‘upstanders’ to avoid genocides. We just have to make sure not to help them along.

 

 

What Antiabortion Advocates Get Wrong About The Women Who Secured The Right to Vote

by Reva Siegel and Stacie Taranto

The most famous suffragists largely weren’t antiabortion and wanted women to have more control over their bodies.

 

 

Universities must open their archives and share their oppressive pasts

by Evadne Kelly and Carla Rice

The archives of academic institutions can tell previously untold stories of eugenics. Universities can begin to undo oppressive legacies by opening them to artists and communities.

 

 

The Neighborhoods We Will Not Share

by Richard Rothstein

Persistent housing segregation lies at the root of many of our society’s problems. Trump wants to make it worse.

 

 

A Matter of Facts

by Sean Wilentz

The New York Times’ 1619 Project launched with the best of intentions, but has been undermined by some of its claims.

 

 

Pence's outrageous op-ed holds deeper meaning

by Jeremi Suri

Vice President Mike Pence published a powerful, but deceptive article in Friday's Wall Street Journal.

 

 

 

Charlotta Bass for Vice-President: America’s Two-Parties and the Black Vote

by Denise Lynn

Bass, an influential political activist, ran for Vice-President on the Progressive Party ticket in 1952. Her campaign demonstrated some of the shifting loyalties of Black voters and the failure of both political parties to address the needs of the Black community.

Reflecting on Martin Luther King Jr.'s Dream and Legacy

I went to the annual city-sponsored celebration of Martin Luther King, Jr., on Monday. Jacksonville’s Mayor, Andy Ezard, inaugurated these yearly breakfasts a decade ago. Every year a speaker helps us think about what MLK said, what he wanted to happen, and how he lived. There is often music, and we sing “Lift Every Voice and Sing”, a hopeful song: from “the dark past”, “the gloomy past” “that with tears has been watered”, to the present, “the place for which our fathers sighed”, to the future, “Let us march on till victory is won.”

 

Some years, that song and the celebration around it do lift hope, because the present is evolving to a brighter future that is joyous to imagine. But not today.

 

Today hope means believing that we will soon stop going backwards, that this moment is just a hesitation on the journey toward the unity we seek. Sometimes hope makes way for despair about how things might get worse.

 

What American governments since the 1960s have created in order to undo centuries of prejudice, discrimination, and persecution, the Republican Party is dismantling. I don’t say that Trump is doing this, even though his name is on every new policy of his administration, because he is not alone. Republican politicians across the country are doing this work themselves, defending the work of their colleagues, and pledging allegiance to the man who is leading the charge backwards.

 

Two authors of distinguished books on the history of race in America just wrote articles for the New York Times for Martin Luther King Jr.’s birthday, which tell us where we are and what is being done in our names. Michelle Alexander published “The New Jim Crow: Mass Incarceration in the Age of Colorblindness” 10 years ago, when Trump was just a glittery real estate con man. She showed how the explosion of the number of Americans in jail in the wake of the “war on drugs” was at its heart “another caste system — a system of mass incarceration — that locked millions of poor people and people of color in literal and virtual cages.”

 

The numbers must be printed to make their proper impression. These are careful estimates only, because fuller data does not exist, but they are the best estimates we have. Between 1980 and 2010, the proportion of Americans in prison tripled to 1%, and the proportion of African Americans in prison rose from 1.3% to 3.1%. The racial disparity remained about the same, as American governments imprisoned so many more Americans. In 2010, one out of three black men had a felony conviction in their past, and one of every ten was in prison or on parole. That was true for only one in fifty of the rest of the population.

 

This happened in Boston, where African Americans were subject to police observations, interrogations, and searches at seven times the rate of the rest of the population. It happened in Charlottesville, Virginia, in 2017, when African Americans were nine times as likely to be subject to police investigative detentions. And so on.

 

Alexander shows that both Democratic and Republican political leaders gave the US the dubious distinction of having more than one fifth of the world’s prisoners and the highest incarceration rate in the world: 756 per 100,000, while most countries imprison fewer than 150 per 100,000. Both Boston and Charlottesville were dominated by Democrats.

 

Now she delivers a shorter message: our nation must move back to the path toward racial justice from the detour we are taking. Obama and the national Democratic Party did not do enough to reverse those trends. But that is a long way from what has happened in the past 3 years. She is clear that the transition from Obama to Trump moved us from a hopeful discussion of racial reform to an era of white supremacy, clothed as returning to greatness.

 

Richard Rothstein published “The Color of Law: A Forgotten History of How Our Government Segregated America” three years ago. He also covered the long history of discrimination that Alexander described, but this time from the point of view of housing segregation. In fine detail, Rothstein explained how the federal government throughout the 20th century, under Democrats and Republicans, used its vast financial powers to promote further residential segregation, notably in the new postwar suburbs. Where I grew up, in the giant Levitt developments on Long Island, the federal government insured Levitt’s loans on the condition that African Americans would be excluded from buying his houses.

 

This has been the American way for centuries, putting an unfair economic burden on African Americans. 1968 appeared to put an end to federal complicity in the segregation of American housing. Through the Fair Housing Act, included in the Civil Rights Act of 1968, groups who are discriminated against in anything to do with housing can use the legal system to demand redress. That Act was passed in the wake of MLK’s assassination.

 

But discrimination continues, taking less obvious forms. An example of how this occurs out of our sight comes from Syracuse. Since 1996, property has not been reassessed in the city, which seems like merely local government incompetence. But since then, the values of homes in white neighborhoods have risen much faster than homes in black neighborhoods. Reassessment would shift some of the weight of property taxes toward those much more valuable white homes. Not doing anything means that black homeowners have been paying an increasingly disproportionate share of property taxes. The city government in Syracuse is dominated by Democrats.

 

Rothstein’s article in the NY Times shows how Secretary of Housing and Urban Development Ben Carson is directing his vast bureaucracy and billions of dollars away from the process of desegregation. Since he was a candidate for President in 2016, Carson has argued that efforts to fix racial segregation are bad “social engineering”. Now HUD is trying to make it impossible for residents of a community like Syracuse, where government or business policies discriminate against racial minorities, to prove that in court. This is one of several far-reaching policies of the Trump administration that make fighting discrimination more difficult.

 

These are pieces in today’s national puzzle of race. Martin Luther King has missed more than 50 years of change in race relations, in party politics, in the American landscape. But his yet unrealized dreams can still inspire hope.

 

Hardly anything is more worth fighting for.

What Will the Museums of the Future Be Like?

Help the Smithsonian’s National Museum of American History by taking this brief survey to share what you’d like to see from *your* National Museum. 

To access the survey, click this link: https://s.si.edu/YourOpinionMatters.

To access the survey in Spanish, click this link: https://s.si.edu/TuOpiniónImporta.

Four Speeches by Dr. King That Can Still Guide Us Today

 

(Author's note: On January 9, 2020 I delivered the Martin Luther King, Jr. tribute lecture at the Uniondale Public Library in Uniondale, New York.  The presentation focused on four speeches by Dr. King that illustrate his concerns and suggest what his views on the world today might have been.)

 

Thank you for inviting me to speak today at the Uniondale (NY) Public Library Martin Luther King, Jr. commemoration. It is a great honor. I speak as a Hofstra University educator, as a historian, as an activist, but also as a white man committed to social justice. I want to share my ideas about Dr. King’s legacy, but I also came tonight because I want to hear yours. In my talk I will use Negro rather than Black or African American because that is how Dr. King referred to himself.

 

In the tradition of Dr. Martin Luther King, Jr., I am here today to recruit you. At a time when democracy itself is under threat from a renegade President and his rightwing, often racist supporters, the United States desperately needs a renewed King-like movement for social justice. The world also desperately needs an expanded King-like movement for climate action. Where we are sitting now may be under water by the end of the 21st century, or even sooner. These are the movements I am recruiting you to join. 

 

I am not recruiting you to a particular ideology, political program, or point of view. The world is constantly changing – we are all constantly changing. But I am recruiting you to become compassionate human beings with respect for diversity in the tradition of Dr. King. We need to be concerned with the needs of others who share this planet with us, to recognize their humanity, and to understand that they want the same things that we do for themselves and their families -- adequate food and housing, decent education and medical care, safety, and hope for the future. We need to understand that just because someone does not live your way, practice your religion, or make the same choices that you make, does not mean they are wrong or less than you. These are some of the lessons I have learned from Dr. King.

 

The African American Civil Rights Movement in the United States was a major world historic event that motivated people to fight for social justice in this country and others. Its activism, ideology, and achievements contributed to the women’s rights movement, the gay and lesbian rights movement, the struggle for immigrant rights, and the anti-war movement in this country. It inspired anti-Apartheid activists in South Africa and national liberation movements in Third World countries. 

 

The traditional myth about the Civil Rights Movement, the one that is taught in schools and promoted by politicians and the national media, is that Rosa Parks sat down, Martin Luther King stood up, and somehow the whole world changed. But the real story is that the Civil Rights Movement was a mass democratic movement to expand human equality and guarantee citizenship rights for Black Americans. It was definitely not a smooth climb to progress. Between roughly 1955 and 1968 it had peaks that energized people and valleys that were demoralizing. Part of the genius of Dr. King was his ability to help people “keep on keeping on” when hope for the future seemed its bleakest.

 

While some individual activists clearly stood out during the Civil Rights Movement, it involved hundreds of thousands of people, including many White people, who could not abide the U.S. history of racial oppression dating back to slavery days. It is worth noting that a disproportionate number of whites involved in the Civil Rights movement were Jews, many with ties to Long Island. In the 1960s, the Great Neck Committee for Human Rights sponsored an anti-discrimination pledge signed by over 1,000 people who promised not to discriminate against any racial or ethnic groups if they rented or sold their homes. They also picketed local landlords accused of racial bias. The Human Rights Committee and Great Neck synagogues hosted Dr. King as a speaker and raised funds for his campaigns on multiple occasions.

 

King and Parks played crucial and symbolic roles in the Civil Rights Movement, but so did Thurgood Marshall, Myles Horton, Fannie Lou Hamer, Ella Baker, A. Philip Randolph, Walter Reuther, Medgar Evers, John Lewis, Bayard Rustin, Pete Seeger, Presidents Eisenhower and Johnson, as well as activists who were critics of racial integration and non-violent civil disobedience such as Stokely Carmichael, Malcolm X, and the Black Panthers.

 

The stories of Rosa Parks and Martin Luther King have been sanitized to rob them of their radicalism and power. Rosa Parks was not a little old lady who sat down in the White only section of a bus because she was tired. She was only 42 when she refused to change her seat and made history. In addition, Parks was a trained organizer, a graduate of the Highlander School where she studied civil disobedience and social movements, and a leader of the Montgomery, Alabama NAACP. Rosa Parks made a conscious choice to break an unjust law in order to provoke a response and promote a movement for social change. 

 

Martin Luther King challenged the war in Vietnam, U.S. imperialism, and laws that victimized working people and the poor, not just racial discrimination. When he was assassinated in Memphis, Tennessee, he was helping organize a sanitation workers union. If Dr. King had not been assassinated, but had lived to become an old radical activist who constantly questioned American policy, I suspect he would never have become so venerated. It is better for a country to have heroes who are dead, because they cannot make embarrassing statements opposing continuing injustice and unnecessary wars.

 

The African American Civil Rights Movement probably ended with the assassination of Dr. King in April 1968 and the abandonment of Great Society social programs by the Democratic Party, but social inequality continues. What kind of country is it when young Black men are more likely to be involved with the criminal justice system than in college, inner city youth unemployment at the best of times hovers in the high double-digits, and children who already have internet access at home are the ones most likely to have it in school? What kind of country is it when families seeking refuge from war, crime, and climate disruption are barred entry to the United States or put in holding pens at the border? These are among the reasons I am recruiting everyone to a movement for social justice. These are the things that would have infuriated Martin Luther King.

 

I promised I would share excerpts from four of Dr. King’s speeches. Everyone has the phrases and speeches that they remember best. Most Americans are familiar with the 1963 “I have a Dream” speech at the Lincoln Memorial in Washington DC and the 1968 “I’ve been to the Mountaintop” speech in Memphis just before he died. These are four other speeches that still resonate with me the most today.

 

The first speech I reference is one for local Uniondale, Long Island, and Hofstra pride. In 1965, Dr. King was honored and spoke at the Hofstra University graduation. It was less than one year after he received the Nobel Peace Prize and three years before his assassination. In the speech Dr. King argued “mankind’s survival is dependent on man’s ability to solve the problems of racial injustice, poverty and war” and that the “solution of these problems is . . . dependent upon man squaring his moral progress with his scientific progress, and learning the practical art of living in harmony.” I have no doubt that if Dr. King were alive today, he would be at the forefront of the Black Lives Matter movement, demands for gun control, climate activism, and calls for the impeachment of Donald Trump. 

 

In his Hofstra speech, Dr. King told graduates, families, and faculty, “we have built machines that think, and instruments that peer into the unfathomable ranges of interstellar space. We have built gigantic bridges to span the seas, and gargantuan buildings to kiss the skies . . . We have been able to dwarf distance and place time in chains . . . Yet in spite of these spectacular strides in science and technology, something basic is missing. That is a sort of poverty of the spirit, which stands in glaring contrast to our scientific and technological abundance. The richer we have become materially, the poorer we have become morally and spiritually. We have learned to fly the air like birds and swim the sea like fish. But we have not learned the simple art of living together as brothers.”

 

Always a man of hope, as well as of peace, Dr. King concluded that he had faith in the future and that people would “go all out to solve these ancient evils of mankind,” but he also acknowledged that the struggle would be difficult and that “there is still a great deal of suffering ahead.” He then challenged the graduates to become “an involved participant in getting rid of war, and getting rid of poverty, and getting rid of racial injustice. Let us not be detached spectators or silent onlookers, but involved participants.” Dr. King would have been here to recruit you too.

 

Letter from Birmingham Jail was not originally given as a speech, but Dr. King recorded it later so I am including an excerpt here where he explained the legitimacy and urgency of direct political action, the kind that brought down apartheid in South Africa and we are seeing from young people in Hong Kong today.

 

"Why direct action? Why sit ins, marches and so forth? . . . Nonviolent direct action seeks to create such a crisis and foster such a tension that a community which has constantly refused to negotiate is forced to confront the issue. It seeks so to dramatize the issue that it can no longer be ignored . . . The purpose of our direct action program is to create a situation so crisis packed that it will inevitably open the door to negotiation . . . We know through painful experience that freedom is never voluntarily given by the oppressor; it must be demanded by the oppressed. Frankly, I have yet to engage in a direct action campaign that was "well timed" in the view of those who have not suffered unduly from the disease of segregation. For years now I have heard the word "Wait!" It rings in the ear of every Negro with piercing familiarity. This "Wait" has almost always meant "Never." We must come to see, with one of our distinguished jurists, that "justice too long delayed is justice denied."

 

I was an anti-war activist in the 1960s and this speech by Dr. King had particular importance for me at the time. On April 4, 1967, at a meeting of Clergy and Laity Concerned about Vietnam at Riverside Church in New York City, he denounced U.S. involvement in the war on Vietnam and imperialism in general.

 

“As I have called for radical departures from the destruction of Vietnam, many persons have questioned me about the wisdom of my path . . . ‘Peace and civil rights don’t mix,’ they say . . . There is at the outset a very obvious and almost facile connection between the war in Vietnam and the struggle I and others have been waging in America. A few years ago there was a shining moment in that struggle. It seemed as if there was a real promise of hope for the poor, both black and white, through the poverty program. There were experiments, hopes, new beginnings. Then came the buildup in Vietnam, and I watched this program broken and eviscerated as if it were some idle political plaything of a society gone mad on war. And I knew that America would never invest the necessary funds or energies in rehabilitation of its poor so long as adventures like Vietnam continued to draw men and skills and money like some demonic, destructive suction tube . . . We were taking the black young men who had been crippled by our society and sending them eight thousand miles away to guarantee liberties in Southeast Asia which they had not found in southwest Georgia and East Harlem . . . I could not be silent in the face of such cruel manipulation of the poor . . . For the sake of those boys, for the sake of this government, for the sake of the hundreds of thousands trembling under our violence, I cannot be silent.”

 

In the same speech Dr. King argued: “I am convinced that if we are to get on the right side of the world revolution, we as a nation must undergo a radical revolution of values . . .We must rapidly begin the shift from a thing-oriented society to a person-oriented society. When machines and computers, profit motives and property rights, are considered more important than people, the giant triplets of racism, extreme materialism, and militarism are incapable of being conquered. A true revolution of values will soon cause us to question the fairness and justice of many of our past and present policies . . . True compassion is more than flinging a coin to a beggar. It comes to see that an edifice which produces beggars needs restructuring. A true revolution of values will soon look uneasily on the glaring contrast of poverty and wealth. With righteous indignation, it will look across the seas and see individual capitalists of the West investing huge sums of money in Asia, Africa, and South America, only to take the profits out with no concern for the social betterment of the countries, and say, ‘This is not just.’” 

 

On August 16, 1967, in Atlanta, Georgia, at the annual meeting of the Southern Christian Leadership Conference, Dr. King asked, “Where do we go from here?” It is a question people concerned with social justice are still asking today.

 “The movement must address itself to the question of restructuring the whole of American society. There are forty million poor people here. And one day we must ask the question, “Why are there forty million poor people in America?” And when you begin to ask that question, you are raising questions about the economic system, about a broader distribution of wealth. When you ask that question, you begin to question the capitalistic economy. And I’m simply saying that more and more, we’ve got to begin to ask questions about the whole society. We are called upon to help the discouraged beggars in life’s marketplace. But one day we must come to see that an edifice which produces beggars needs restructuring. It means that questions must be raised. You see, my friends, when you deal with this, you begin to ask the question, “Who owns the oil?” You begin to ask the question, “Who owns the iron ore?” You begin to ask the question, “Why is it that people have to pay water bills in a world that is two-thirds water?” These are questions that must be asked . . . [W]hen I say question the whole society, it means ultimately coming to see that the problem of racism, the problem of exploitation, and the problem of war are all tied together. These are the triple evils that are interrelated.”

 

So where do we go from here? 

 

Will our country be on the right side of history?

 

Will each of us be part of defining a direction and participate in direct action to achieve social change?

 

Will you join, not me, but in Dr. King’s memory, the struggle to achieve racial, social, and economic justice in the United States and the world?

 

I know I said I would only refer to four King speeches, but I need to reference one more, this time by his nine-year-old granddaughter Yolanda Renee King at the 2018 March for Our Lives Rally in Washington DC.

 

“My grandfather had a dream that his four little children will not be judged by the color of the skin, but the content of their character. I have a dream that enough is enough. And that this should be a gun-free world, period. Will you please repeat these words after me? Spread the word, have you heard? All across the nation we are going to be a great generation. Now, I’d like you to say it like you really, really mean it. Spread the word, have you heard? All across the nation we are going to be a great generation.”

 

Ross Douthat's Prescription for Academia Won't Solve the Real Problem

Fourteen years ago--four before I would enter a PhD program in English--New York Review Books Classics rereleased a slim novel entitled Stoner, first published in 1965 by an almost entirely forgotten writer named John Williams. Like many books in the (excellent) NYRB Classics series, it took a while for readers to come back to Stoner, but by 2012, when I was studying for my comprehensive examinations in Renaissance poetry, this book by a man who died in 1994 had become an unlikely international hit, lauded by some critics as one of the most perfect novels ever written. What defines Stoner, as Williams writes, is the “love of literature, of language, of the mystery of the mind and heart showing themselves in the minute, strange, and unexpected combinations of letters and words, in the blackest and coldest print.” 

 

It’s an unassuming narrative, the story of William Stoner, born to poor Missouri farmers at the end of the nineteenth century, who through patience and simply working day in and day out ends up becoming a relatively unaccomplished professor of Renaissance literature at the University of Missouri. He dies unheralded and mostly forgotten, still having lived a life dedicated to teaching and to poetry. Stoner’s career isn’t particularly happy, but he bears personal and professional hardship with a stolid midwestern dignity, and though Williams makes clear that the professor isn’t the most promising of his cohort, there is a rightful valorization of the work that he does and the professionalism with which he conducts himself. Stoner is undeniably, in addition to being about the abstract love of literature, a celebration of work itself. 

 

With some foreshadowing, it was the New Critical close reading of Shakespeare’s Sonnet 73 that convinced the undergraduate Stoner that he belonged in a classroom and not on a farm. “That time of year thou mayst in me behold/When yellow leaves, or none, or few do hang/Upon those boughs which shake against the cold, /Bare ruined choirs, where late the sweet birds sang.” The poem evokes Stoner’s eventual solitary death, but in a manner far more prescient than Williams could have known. Shakespeare’s words are also a fair summation of what’s happened to the professor’s entire discipline over the last two generations as departments have shrunk, tenure track jobs have disappeared, and the academy has increasingly come to rely upon an exploited underclass of contingent, part-time faculty. 

 

I started graduate education a year before the financial collapse from which the vast majority of this country has yet to recover. Prospects for employment at a college or university were already bad enough in 2007; thirteen years later and they’re practically non-existent. That English departments – and by proxy the rest of the humanities including history, modern language, religious studies, philosophy, and so on – won’t exist in any recognizable form by let’s say 2035, should be obvious to anybody who surveys the figures and who has worked within the academy itself. 

 

The argument of who exactly is responsible for this state of affairs rages on, but New York Times columnist Ross Douthat insists he knows the answer. Part of the Times squad of Bret Stephens, David Brooks, and Bari Weiss--conservatives who are only read by liberals to prove how politically ecumenical said liberals are--Douthat doesn’t have a graduate degree himself. Nor has he (to my knowledge) produced peer-reviewed scholarship, or taught a university class (as anything other than a guest lecturer or as a function of his job as a columnist, I should say). But despite that, he has the hubris that can only be conferred by an undergraduate degree with the word “Harvard” printed on it, and so Douthat recently authored a column prescribing who is responsible for the “thousand different forces [that] are killing student interest in the humanities and cultural interest in high culture.” 

 

Like any column written by Douthat, Brooks, Stephens, or Weiss, what’s so insidious is that they’re often 25% correct – sometimes even 33% accurate! Douthat blames the disciplines themselves for their current predicament – and of course he’s correct. No doubt he’s critical of the lack of class solidarity between faculty and adjuncts, the ways in which professional organizations refuse to advocate for us, and the manner in which the ethic of the business school has infected the entire university. 

 

But of course that’s not what he finds responsible.  As predictably as if he were Allan Bloom writing The Closing of the American Mind in 1987, Douthat writes that the recovery of the humanities relies on a resuscitation of Victorian critic Matthew Arnold’s championing of “the best that has been thought and said.” A resurrection of the humanities must depend “at least on that belief, at least on the ideas that certain books and arts and forms are superior, transcendent, at least on the belief that students should learn to value these texts and forms before attempting their critical dissection.” Which we would all try to do, of course, if only there were any jobs in which to do it.

                  

I know that it’s been fashionable in conservative circles since the 1980s to bemoan the “post-modernist” professor who refuses to acknowledge Shakespeare’s brilliance while teaching comic books, but the schtick has gotten so old that I wish a young fogy like Douthat would learn a new tune. The critical studies professor railing against “dead, white males” is as much a bogeyman of the conservative conscience as the Cadillac driving welfare-queen, but the former remains a mainstay of their bromides, even while Douthat feigns an unearned reasonableness. 

 

So to his prescription, I must answer that of course those who study and teach literature cherish it as a source of value and transcendence, that of course they acknowledge that there are writings that endure because of individual, brilliant qualities, that of course we want our students to be enthusiastic about these things we’ve loved. Nobody spends the better part of a decade getting a PhD in English, or history, or philosophy because they hate English, history, or philosophy (though certainly some of them understandably come to). 

 

Nobody invests the better part of their youth in such research and teaching just so that Douthat can tell them that their professional failures are a result of just not loving their discipline enough. That the critical, theoretical, and pedagogical consensus of the past few generations has rightly concluded that the job of the humanities isn’t mere enthusiasm, but also critical engagement with those texts – for both good and bad – speaks to the anemia of Douthat’s own education. Douthat, who has apparently never heard of the scientific method, writes that “no other discipline promises to teach only a style of thinking and not some essential substance,” as if learning how to rationally and critically engage with the world should be some afterthought to remembering that Shakespeare knew how to turn a phrase (I’ll cop to finding his phrase “essential substance” unclear--I suppose he means facts). Critical analysis of texts has been a mainstay of humanistic inquiry since biblical exegetes first analyzed scripture; it runs through the humanists of the Renaissance, the philologists of the nineteenth century, and the New Critics and formalists of the Modernist era. It’s hardly something made up at a Berkeley faculty meeting.  

 

Douthat’s pining for a purely aestheticized type of non-inquiry owes much to a certain segment of Victorian criticism, but it has hardly defined the discipline for its whole history. Nor is it particularly radical to hold that critically engaging texts, historical events, or philosophical ideas is independent of whether you personally derive aesthetic pleasure from them. It’s been a mainstay of the civitas humanitas since before the Enlightenment, whereby the study of literature, history, and philosophy wasn’t just an instruction in connoisseurship, but rather training in how to be a citizen. I’d propose that instilling civic engagement is precisely what the “politicized” teaching and research that Douthat bemoans is trying to do. 

 

Unlike culture warriors of the past, Douthat makes a shallow gesture towards the academic jobs crisis; he alludes to the economic austerity that has gutted humanities departments. But like all masters of the culture-war form, Douthat’s conservative politics make it impossible for him to properly name the actual culprits behind what happened to the humanities. Jacques Derrida didn’t kill the English department – Milton Friedman did. 

 

With some accuracy, albeit of the straw-man variety, Douthat argues that today it should be “easier than ever to assemble a diverse inheritor” of the old canon. I’m assuming that when he was at Harvard, he must have encountered those rightful diverse inheritors of the canon, because what we’ve been doing in the Renaissance literature classroom for thirty years is precisely that – teaching Shakespeare alongside Aemilia Lanyer, John Milton with Aphra Behn. Like a teenager who asks why nobody has told him about the Beatles before, Douthat acts as if it’s some great insight that “This should, by rights, be a moment of exciting curricular debates, over which global and rediscovered and post-colonial works belong on the syllabus with Shakespeare” – but that’s precisely what we’ve been doing all this time. 

 

He writes that “humanists have often trapped themselves in a false choice between ‘dead white males’ and ‘we don’t transmit value,’” but this is only the situation within his own reactionary fever dream. Douthat’s prescription is as helpful as asking a person with an illness to just stop being sick: he tells us that the “path to recovery begins… with a renewed faith not only in humanism’s methods and approaches, but in the very thing itself.” May I suggest a humbler solution? The path to recovery of the humanities begins with actually funding the humanities, with hiring and paying people to teach and write about them, with making sure that their interests are not completely overlooked by universities more concerned with the NCAA, administrative pay, and making sure that students have the full “college experience.” That proposal might make the trustees of universities squeamish, though, and they, after all, vote for the same political party of which Douthat is a member. Better just to say that professors don’t love literature enough. 

 

Because the humanities do matter, the false choice that Douthat gives between aesthetic appreciation and critical analysis is to the detriment of both. Stoner wouldn’t have been as popular as it was if its sentiments didn’t move so many of us, graduate students who talked about the novel as if it were samizdat. What’s beautiful about Stoner is the character’s love of literature and of teaching. What’s inexplicable to us about it is that he’s actually able to have a job doing those things. Douthat may pretend that this is a spiritual problem, but the bare ruined choir of the academy wasn’t emptied because of insufficient faith, but rather because the money changers have long been in charge of the temple. It’s a spiritual problem only insofar as all economic problems are at their core spiritual. Pretending that the disappearance of the English department has always been an issue about liberal professors attacking Western civilization is, with apologies to Borges, a bit like watching two bald men fight over a comb.  

 

The fact is that there is no excess in teaching critical analysis – in an era of increasing political propaganda and weakening democratic bonds it’s inestimably necessary. We teach how to critically read culture – including movies, comics, and television – not because we don’t acknowledge the technical greatness of a Shakespeare, but in addition to it. Contrary to Douthat’s stereotypes, there’s not an English professor alive who doesn’t understand Shakespeare’s technical achievements when compared to lesser texts, but we understand that anything made by people is worthy of being studied because it tells us something about people. That is the creed of Terence when he wrote that “I am human and I let nothing which is human be alien to me” – no doubt Douthat knows the line. Did I mention that he went to Harvard?   

 

George Orwell, 70 Years Later

 

Seventy years ago this Tuesday, George Orwell, the author of Nineteen Eighty-Four, died alone in a London hospital.  He had struggled for years with pulmonary tuberculosis, and his weak lungs hemorrhaged for the final time.  He was only 46 and just on the verge of fame, having published his great novel seven months earlier, in June 1949.

 

Would the visionary author of Nineteen Eighty-Four—who always insisted on the full eighteen letters as his novel’s proper title, not the four screaming digits “1984”—have ever imagined that George Orwell might become the most important writer since Shakespeare and the most influential writer who ever lived?  That is my contention, based on his cultural and social impact: the omnipresence of his coinages in the contemporary political lexicon and of his dystopian vision in the political imagination.

 

Crucial to his compelling language and vision was his superlative rhetorical ability to coin catchwords, such as those in his beast fable that allegorizes the history of the Soviet Union, Animal Farm (1945), and his dystopian novel, Nineteen Eighty-Four. His talent for composing arresting, memorable lines in both his essays and his fiction, especially openers (“It was a bright cold day in April, and the clocks were striking thirteen”) and closers (“…He loved Big Brother”), is equally unforgettable.  Nineteen Eighty-Four rose to #1 on the bestseller lists (for an amazing fourth time in its history) during January 2017 (after Donald Trump’s inauguration).  I fully expect it to do so again sometime during the presidential campaign this year.

 

Orwell’s worldwide fame—sales of his last two books approach 60 million in more than five dozen languages—certainly rests on his fable and dystopian novel.  Yet I would also maintain that Orwell’s importance is due not only to his “impact” as a polemicist or rhetorician; it is also explainable on the grounds of literary style, in strictly literary terms. He is arguably the most important literary figure of recent generations. His direct literary influence in Britain and America on the generations directly following his own—the Movement writers and the Angry Young Men of the 1950s, the New Journalists such as Tom Wolfe and Gay Talese of the 1960s—rivals that of virtually any other twentieth-century writer. His influence on literary-political intellectuals since his death in 1950 is unrivalled—no successor has even come close to filling his outsized shoes.

 

Even more notable than all this, however, is the authority his “clear, plain” prose style has indirectly exerted, as countless writers have attested.  Orwell’s style has played a role in shaping nonfiction writing since midcentury. Along with Hemingway, Orwell is the literary stylist whose work has contributed most significantly to shifting the reigning prose style in English from the eighteenth-century ideal of the orotund, Ciceronian periodic sentence of Dr. Johnson, Gibbon, and the Augustan Age toward the limpid, fast-moving, direct, and hard-hitting sentences of present-day journalism. It is in these respects that Orwell’s literary influence is sizable indeed and bolsters his claim to the title “England’s Prose Laureate.”

 

Until recently, this was not at all the consensus, especially among British and American professors of English, who until the late 1980s typically relegated Orwell to the ranks of middlebrow authors.

 

When I began teaching at UT in the 1980s, literary academe was still dismissing Orwell as a rather simple “juvenile” or “high school” author.  Distinguished “difficult” authors such as Vladimir Nabokov (of Lolita fame) dismissed Orwell as a “popular purveyor.”  In other words, Orwell’s works were, at best—in the much-quoted phrase that George Orwell used about the writings of some of his own favorite authors—“good bad books.”

1917: The War Movie at Its Very Best

The World War I movie 1917 starts out quickly. A British general tells two enlisted men, Private Schofield and Lance Corporal Blake, that two British battalions are marching into a trap set up by the Germans several miles away. The two men must reach the 1,600 men in the battalions – one man’s brother among them – and warn the soldiers to turn back. To get there, the duo must march in and out of trenches, survive No Man’s Land, endure machine-gun fire, avoid bombs, race through blazing buildings and continually test their own courage and fortitude.

 

It is a film on fire that emulates the world on fire in Europe in 1917. It is loud. It is tense. It is dramatic. It is terrific.

 

1917, which opened nationwide last week, produced by DreamWorks and directed by Sam Mendes, is one of those great war movies that come out only once every generation or so (think Saving Private Ryan). It is also one of the few films about World War I, a conflict that always seems to run third in public interest behind World War II and the American Civil War.

 

There are numerous elements that make 1917 a classic war film, and a classic film, period. First, the action is focused on just two men at the start, and they have to win or lose in the effort to save the apparently doomed battalions. Second, their route to the troops takes them through hell on earth, with numerous Biblical symbols (the air all around on fire, climbing over dozens of dead bodies to save their own lives). Third, the special effects are impressive, with airplanes, explosions, and long, long lines of men in trenches. Fourth, the film has numerous closeups of exhausted, wiped-out soldiers, most of whom are panting from the fury of the battle.

 

The movie is a story within a story – the two men within the greater war. Director Mendes has fashioned the film so that you constantly cheer the two men on, praying that they make it, yet at every moment of the film you think they might perish, and shortly afterwards the 1,600 men they were sent to save with them.

 

There is no great cavalry charge up the hill here, as in so many westerns, no sterling oratorical speeches by Henry V at Agincourt, no general riding a white horse and waving a sword in the air. It is a war of the grunts, trying to just get home. World War II was a war of victory and considerable glory; World War I was a fight for survival. There are numerous references to the idea of survival, and to the lack of any real purpose to the conflict – to any conflict. One general tells a corporal that it doesn’t matter what today’s orders are – next week the high command will issue orders that are just the opposite. Men don’t think of victory and welcome-home parades, just of getting home in one piece.

 

The first half of the film is slow but has some just plain astonishing scenes. In one, a troop transport truck gets bogged down in the mud and a dozen soldiers, pushing and grimacing, try to get it out and back on the road. All of the pain of war is told in their faces and their aching arms and legs. In another, a German plane is shot down by two Allied planes, hits the ground, and slides directly at the two messengers, and you are certain it is going to kill them.

 

There are dazzling cinematic scenes, such as long moments focused on soldiers in the trenches, vast wastelands of empty meadows broken only by a few lone bombed-out farmhouses, mud puddle after mud puddle. There are vast plains with just one single tree still standing in the middle of them. There is a poignant scene in which Schofield meets a young woman and her baby hiding out from everybody in a building. He is attracted to her but has to leave to evade the Germans, who are constantly looking for him.

 

Much has been made of the one-camera effect in the film. A single camera picks up the two soldiers when they leave on their journey and follows them most of the way. You see everything through their eyes or with the camera in front of them, in their faces. The final scene of the movie, shot this way, is striking.

 

The one problem in the film, and it is in just about all war movies, is that it starts too slowly. Our two heroes march and march and march and little happens.

 

Then, all of a sudden, the whole world explodes around them, and around the audience.

 

We are off….

 

Director Mendes does a brilliant job on this film about a ghastly conflict that tore apart the world. He gets numerous fine performances from a strong ensemble of actors. The two stars of the film, Dean-Charles Chapman as Corporal Blake and George MacKay as Private Schofield, are superb and win you over from the first shot of the story.

 

The movie won the Golden Globe for Best Motion Picture and was nominated for Best Picture at the Oscars. It deserves the accolades.

 

Right after World War I ended, they all said that it was “the war to end all wars.” It sure did, didn’t it? 

Edmund G. Ross Was a Profile in Impeachment Corruption, not Courage

 

The current impeachment proceedings have revived the historical error of proclaiming Kansas Senator Edmund G. Ross a hero for providing the vote that saved President Andrew Johnson’s job in his 1868 impeachment trial.   

For many years after that one-vote verdict, Ross was proclaimed the savior of the presidency from the rabid forces of impeachment.  More recent studies have noted that the racist Johnson was mostly a blight on the presidency, creating harsh divisions after the Civil War rather than binding up the wounds from that bloody conflict.

But another lingering fiction is that Ross cast his pro-Johnson vote altruistically.  Ross proclaimed his own heroism in his memoir.  When he cast the impeachment vote, he wrote, it was like looking into his open grave, but he courageously leapt in despite the consequences.

Future President John F. Kennedy (or his ghostwriter) revived this myth in his often-inaccurate book Profiles in Courage, pronouncing Ross’s vote “the most heroic act in American history.”  

That vote was a profile in corruption, not courage.

Ross, a printer previously accused of bid-rigging on Kansas state contracts, secured his Senate seat because his sponsor – a crook named Perry Fuller – paid Kansas state legislators $42,000 in bribes to send Ross and Samuel Pomeroy to Washington as the state’s senators.  

In the 1860s, frontier Kansas had a well-earned reputation for corruption.  Ross replaced a senator who had killed himself after the revelation that he took a $20,000 bribe from (yup) Perry Fuller.  

In his first months in Washington, Ross did nothing to attract the attention of even the most dedicated Senate-watcher.  When the impeachment crisis erupted in early 1868 and landed in the Senate for Johnson’s trial, Ross flirted with both sides of the contest, then began to offer his vote for the best deal he could get.  

Less than two weeks before the trial ended, Ross sent his sponsor, Perry Fuller, to Johnson’s Interior Secretary.  Fuller had already joined with a Treasury official to offer a bribe that would have secured Ross’s vote for Johnson.  Fuller told the Interior Secretary that Ross would vote for acquittal if only Johnson would speed the return to the Union of three Southern states. That was outside of Johnson’s power, so no deal was struck.

Days later, Ross promised pro-impeachment senators he would vote their way, a pledge he repeated to a reporter three days before the final vote.  But then, something happened.  On the morning of the vote, Ross breakfasted with his good friend, the ubiquitous Perry Fuller.  Then Ross reversed himself to cast the vote that kept the president in office.

The inference that Ross was bribed to vote for Johnson is powerful, although bribes in 1868 were paid in cash that could not (and cannot) be traced.  But records show that Ross immediately moved to cash in on his pro-Johnson vote with patronage appointments.

At the top of Ross’s shopping list was a top job for – you guessed it – Perry Fuller. The job?  Commissioner of Internal Revenue, a position from which Fuller’s corruption could spread through the nation like a virus.  Johnson promptly nominated Fuller for the pivotal position, but the Senate Finance Committee would not swallow a flat-out crook in that office.  It sent Fuller’s name back to the president.

So Ross set his sights lower.  In August, he secured Fuller’s appointment as Collector of Revenue in the port of New Orleans.  In seven months in that office, Fuller more than doubled its number of employees, then was arrested for stealing $3 million.  

When Fuller had to post bond for his pretrial release, Senator Edmund Ross of Kansas was happy to guarantee it.

But Fuller was not the only name on Ross’s patronage shopping list.  The Kansan also requested the appointment of a friend as superintendent of Indian lands in what is now Oklahoma, stressing to President Johnson the “large amount of patronage connected with that office.”  Johnson made the appointment.  Then Ross asked for ratification of a treaty with the Osage Tribe, which Johnson swiftly granted.  

Even the widespread belief that Ross’s impeachment vote ruined his life is a fable. He did lose his Senate seat to a man who paid $60,000 in bribes to Kansas legislators, a moment of poetic justice.  Ross went on to publish two newspapers in Kansas before landing in 1885 as territorial governor of New Mexico.

Edmund G. Ross a profile in courage?  No. Not ever.

 

Jimmy Carter and The Myth That Gave the Iowa Caucuses Their Political Power

 

Every four years, the country witnesses what should be an inspiring ritual: Iowans like me brave the cold winter night, gather in school gyms, and talk about politics with our neighbors. But there is another ritual that also merits our attention: the condemnation of the Iowa precinct caucuses coming from around the country. Such criticism comes from both the political right and the left.

 

There are many good reasons to oppose the caucuses. After all, it is impossible to justify the same state going first each time, since it undermines the spirit of equality that is supposed to be crucial to American government. 

 

Further, Iowa is unrepresentative--disproportionately whiter, older, and more rural than the country as a whole. Finally, people with work or childcare responsibilities at night are out of luck. In this system, even the Democrats are undemocratic. 

 

Unlike primaries, the caucuses are a Rube Goldberg machine; that is, they are a complicated system designed to do something simple.  If parties were starting from scratch, they would do better to create something more like a primary. Historians can tell you, though, that politicians never start from scratch. Our caucuses are the product of historical accident, rather than conscious design.  

 

The accident began at the Democratic convention in Chicago in 1968. With cameras rolling, police officers attacked young activists who opposed the Vietnam War.   Meanwhile, the party leadership, still loyal to war President Lyndon B. Johnson, nominated Vice-President Hubert Humphrey, whose position on the war was ambiguous at best.  To protesters and much of the press, it appeared that Humphrey had been hand-picked by unelected pro-war Democratic Party leaders in a secretive process.  Meanwhile, police violence against dissenters and even reporters on the streets outside the Convention escalated into chaos.   The spectacle looked terrible to millions of voters watching on television, and the mess contributed to the Democrats’ defeat that fall.  

 

Something had to give.  While caucuses and primaries happened in a few states by 1968, most state parties used a smorgasbord of methods to insulate the nomination process from popular control.  Often conventions featured multiple ballots in which delegates, not voters, made the final choice of a nominee.  Now, more state parties began choosing their convention delegates through primary votes meant to empower average voters.  Others turned to party caucuses.  In these, at least in theory, party enthusiasts and volunteers led the proceedings. 

 

In 1972 and again in 1976, Iowa’s precinct caucuses happened first by historical accident. Put simply, the complexity of the state’s system for selecting national delegates, which is unknown to most of the public, generates a lot of meetings. The now-familiar precinct caucuses are only the start of Iowa’s system for selecting delegates, which continues long after journalists have left the state. Jimmy Carter (or rather his young adviser, Hamilton Jordan) thought a win at the caucuses could generate momentum by bringing media attention early in the process.  Jordan believed that early notice in the press was essential for a candidate to gain traction in a crowded field of presidential hopefuls.  The story is well-known: Carter focused on the state and did well in the caucuses, which set him, an unknown, on the path to the nomination. Presidential hopefuls have flocked to the state ever since. 

 

However, Carter’s story is not quite that simple. First, Carter didn’t exactly win. He received about 27% support, putting him in second place behind an always formidable opponent:  “undecided.” The forgettable Birch Bayh of Indiana finished third, ahead of a large but thoroughly mediocre group of challengers.

 

Moreover, the precinct caucuses did not provide Carter with any delegates to the national convention.  It doesn’t work that way. (This may be a little boring, but it is important to the story, so stay with me.) Those chosen as delegates by their precincts advance to the county convention, which in turn chooses delegates for the district conventions, which then designates participants in the state convention.  Finally, the state convention picks delegates to the national convention.  This complex system takes a lot of time, and Iowa Democrats start early because they have to in order to have delegates before the summer.    

 

Those county conventions are all-day affairs which happen long after the media has moved on. County delegates often get the job not because they want it, but because of pressure from their neighbors.  They may have to change their vote because their candidate has dropped out and that disheartening fact can create no-shows. Yes, there are alternate delegates, but things can go wrong.  

 

To recap: county conventions choose delegates for the district conventions, which in turn pick delegates for the state convention, which then chooses a few genuine delegates for the national convention. There are not many of these because Iowa is a small state.  The delegate stakes of the precinct caucuses are incredibly low. 

 

In other words, Carter won a glorified straw poll.  Nonetheless, his staff touted the caucuses as the will of the people, and Iowans had no reason to disagree. The caucuses did help Carter get a big media bounce.  Yet the burst of attention he received happened because he also won the New Hampshire primary, which was then the traditional indicator of early strength.  Without Iowa, New Hampshire alone would have put him on the map.

 

Carter’s success actually happened because he was the perfect post-Watergate candidate. He was an evangelical who promised never to lie to the public. He was a naval veteran, a former nuclear engineer, and a peanut farmer.  As a former Governor of Georgia, he seemed to be a Washington “outsider” at a time when public disgust with politics was at its zenith.

 

Importantly, Jimmy Carter’s success also happened because he was a moderate, even conservative, by the standards of his party. Democrats, still smarting from the landslide loss of South Dakota liberal George McGovern in 1972, believed, correctly, that Carter could win by moving the party to the right. Ironically, in 1972 McGovern had himself performed well in Iowa, finishing second to Edmund Muskie, but received little acclaim for it in the media.  In 1976, the Carter campaign touted the event, and reaped the benefits. 

 

Iowa may be crucial to the nomination system today, but it still mirrors 1976 in one important respect: the caucuses are a media event. They mattered then not because they produced delegates, but because they gave Carter media attention.  The caucuses were real politics in the same way that Celebrity Apprentice was reality television.    The Chicago protesters in 1968, while being pummeled by the police, chanted “the whole world is watching.”  And it was. Party reformers changed the system after 1968, but with an unintended consequence.  Today, the whole world is still watching, but they are watching Iowa.  

 

If you are a blue-state Democrat and you don’t like our system, you can try to make it go away.  Iowa’s status is always in danger, and yet the caucuses continually survive, maybe because Iowans care more.  We don’t have an NFL franchise, and we deserve to watch something.  

 

Or, voters can ignore the caucuses. In 2008, John McCain skipped the caucuses, which he viewed as a lost cause given his opposition to tax credits for ethanol. He won the Republican nomination. Michael Bloomberg, a far richer man than Donald Trump, has launched the extraordinary experiment of skipping the first four states and saturating the big media markets. 

 

Most pundits would tell you Bloomberg is doomed to fail, but you have to admire the impulse. He is ignoring us, which is probably the best way to minimize our importance. His strategy seems smarter than that of former Maryland U.S. Representative John Delaney, who has spent nearly two years wandering the state to little avail. Parents will tell you to ignore a toddler’s tantrums if you want them to stop. Twitter users know that they should refuse to feed the trolls; to do otherwise would be to reward bad behavior.  If you really don’t want to spend the evening of February 3 hearing media chatter about which candidate exceeded expectations, it’s on you.  Go to a bar, go bowling, read a novel.  I love my adopted state, but I have faith that the country is ready to choose a nominee without our help.

The World is Losing another Historic Generation

Temple of Confucius of Jiangyin, Wuxi, Jiangsu. Photo by Zhangzhugang, CC BY-SA 3.0.

 

Every day, little by little, one of the world’s largest and most important historic generations is passing away— the last generation to have grown up in China before the communist takeover in 1949.  These individuals were born between 1919 and 1937, right before the outbreak of the Sino-Japanese War, and were raised in what we might loosely call traditional China. Yes, China was surely modernizing during the years 1919-1949, but traditional Chinese culture retreated only slowly and was still pervasive in 1949, even in foreign-influenced coastal cities like Shanghai.  Dense bastions of traditional practices remained in education, the judiciary, and government bureaucracies, as well as in family structure and social relations in general. Students and young people shaped by this environment were in touch with a very ancient Chinese system of values, with all its cultural beauty and social flaws.  

 

After 1949 the Chinese Communist Party (CCP) launched an active campaign to eradicate traditional values they regarded as “feudal thinking,” extending their long-standing hostility against the Confucian worldview to national educational policy. Things got much worse during the Cultural Revolution of 1966-1976 when fanatical anti-traditional hysteria generated a decade of state-tolerated anarchic violence and contempt by Red Guard youth groups directed against “feudal” and “bourgeois” values. The widespread physical destruction of artwork and cultural sites during those years was the most visible manifestation of a policy designed to root out traditional thinking tied to the pre-revolutionary past.   

 

More recently the CCP has paid lip-service to the nation’s past culture by appropriating Confucius’s name for the vast network of international “Confucius Institutes” that project a positive image of communist China to the outside world.   These institutes have little connection to China’s past; they are primarily centers of communist propaganda and recruitment of naïve, sympathetic foreign students.  A Chinese ruling elite that tolerates corrupt communists and crony capitalists has no genuine interest in a state grounded in Confucian ethics. Instead they pick out the Confucian elements that reinforce communist rule, such as emphases on political loyalty and communalism.

 

Students coming of age in China today are acquainted with only those parts of classical Chinese art, literature, and philosophy that the CCP regards as useful for justifying its continued one-party dictatorship. Conversations with Chinese students reveal alarming gaps in their knowledge of China’s political and economic development before 1949.  Disastrous events under communist rule like the failure of the Great Leap Forward and the resulting famine of 1959-1961 or the Tiananmen massacre of 1989 are “forbidden topics” beyond discussion or even mention in Chinese schools and universities, but that is a separate topic.   This is not simply complaining about the habits and interests of today’s youth in China; these knowledge gaps are a deliberate product of the CCP’s Orwellian educational policy reaching back over many decades with the intent of allowing vast swathes of pre-communist history and culture to wither away. 

These trends mean that as the last generation of individuals who grew up in, were educated in, or were at least exposed to classical Chinese education and culture leaves us, its members will not be replaced. That human loss will be a bitter blow, perhaps the death blow, for a civilization with a 5,000-year pedigree. The surviving cultural remnants will be only those crumbs from the pre-communist past selected by the CCP for public presentations carefully managed by the party for its own purposes.

 

In a dangerous irony, this great reduction in the human cultural capital of pre-communist China comes at the same time the growth of Chinese economic, military, and political influence around the world makes international audiences more interested than ever before in understanding some of the deeper elements of Chinese culture. 

 

If the outside world wants to understand China’s long history, traditional values, and cultural contributions in an intellectually honest way, we will have to get beyond the carefully controlled version of history being offered by the current communist government. We can help by doing two things. First, by maintaining open, honest, and critical scholarship of China’s long history at universities and research institutes around the world. That includes refusing to knuckle under to China’s frequent demands that we cooperate with their efforts to censor critical research produced by foreign-based scholars. Second, we can support the remaining outposts of Chinese culture that exist beyond the reach of the CCP, for example in Taiwan, where non-communist versions of China’s past, present, and future are on display every day.  

 

We cannot prevent the passing away of China’s last generation from the pre-communist era, but we can do our part in helping to preserve the ancient culture they knew, so that it does not vanish completely with them. 

Poles Apart: Putin, Poland and the Nazi-Soviet Pact

 

As the 75th anniversary of the end of World War II approaches, two of that war’s main victims – Poland and Russia – are once again embroiled in a highly emotional dispute about its origins. At the heart of the matter is the perennial controversy about the Nazi-Soviet pact of 23 August 1939. 

 

The polemics were kick-started by President Vladimir Putin when he was asked about the European Parliament’s resolution on the 80th anniversary of the outbreak of World War II at a press conference in Moscow on 19 December. Putin deemed the resolution unacceptable because it equated the Soviet Union and Nazi Germany and accused its authors of being cynical and ignorant of history. He highlighted instead the Munich agreement of September 1938 and Poland’s participation in the dismemberment of Czechoslovakia. The Soviet-German non-aggression treaty was not the only such agreement made by Hitler with other states. Yes, said Putin, there were secret protocols dividing Poland between Germany and the USSR but Soviet troops only entered Poland after its government had collapsed. 

 

This is not the first time Putin has made such arguments. He made many similar points in 2009 on the 70th anniversary of the outbreak of war. But his tone then was conciliatory rather than combative. At the commemoration event in Gdansk, Putin stressed the common struggles of Poles and Russians and called for the outbreak of the war to be examined in all its complexity and diversity. Every country had been at fault, not just the Soviet Union: “it has to be admitted that all attempts made between 1934 and 1939 to appease the Nazis with various agreements and pacts were morally unacceptable and practically meaningless as well as harmful and dangerous.”

 

Responding to Putin, the then Polish Prime Minister, Donald Tusk, stressed that on 1st September 1939 his country was attacked by Germany and then two weeks later, invaded by the Soviet Union. But Tusk also emphasised that while “truth may be painful, it should not humiliate anyone.”

 

The day after his news conference in Moscow, Putin addressed leaders of the Commonwealth of Independent States at a meeting in St Petersburg convened to discuss preparations for the 75th anniversary. Putin used the occasion to deliver a long analysis of what led to the outbreak of war in September 1939, including detailed citations from many diplomatic documents.

 

One document that caught Putin’s eye was a September 1938 dispatch from Jozef Lipski, the Polish ambassador in Berlin, reporting on a talk with Hitler. During the conversation Hitler said that he was thinking of settling the Jewish issue by getting Jews to emigrate to a colony. Lipski responded that if Hitler found a solution to the Jewish question the Poles would build a beautiful monument to him in Warsaw. “What kind of people are those who hold such conversations with Hitler?", asked Putin. The same kind, he averred, who now desecrate the graves and monuments of the Soviet soldiers who had liberated Europe from the Nazis.

 

The main point of Putin’s trawl through the British, French, German, Polish and Soviet archives was to show that all states had done business with the Nazis in the 1930s, not least Poland, which sought rapprochement with Hitler as part of an anti-Soviet alliance. Putin linked this history to present-day politics: “Russia is used to scare people. Be it Tsarist, Soviet or today’s – nothing has changed. It does not matter what kind of country Russia is – the rationale remains.”

 

Putin vigorously defended Soviet foreign policy in the 1930s. According to the Russian President, Moscow sought a collective security alliance against Hitler but its efforts were rebuffed, most importantly during the Czechoslovakian crisis of 1938 when the Soviets were prepared to go to war in defence of the country, provided France did the same. But the French linked their actions to those of the Poles, and Warsaw was busily scheming to grab some Czechoslovak territory. In Putin’s view the Second World War could have been averted if states had stood up to Hitler in 1938.

 

In relation to the Nazi-Soviet pact, while Putin accepted there was a secret protocol, he suggested that hidden in the archives of western states there might be confidential agreements that they had made with Hitler. He also reiterated that the Soviet Union had not really invaded Poland, adding that the Red Army’s action had saved many Jews from extermination by the Nazis.

 

Putin returned to the subject of the war’s origins at a meeting of Russia’s Defence Ministry Board on 24 December: “Yes, the Molotov-Ribbentrop Pact was signed and there was also a secret protocol which defined spheres of influence. But what had European countries been doing before that? The same. They had all done the same things”. But what hit him hardest, Putin told his colleagues, was the Lipski report: “That bastard! That anti-Semitic pig – I have no other words”.

 

To be fair to Putin there is more to his view of history than pointing the finger at Poland and the west. He also identified more profound causes of the Second World War, including the punitive Versailles peace treaty that encouraged “a radical and revanchist mood” in Germany, and the creation of new states that gave rise to many conflicts, notably in Czechoslovakia, which contained a 3.5 million-strong German minority.

 

Poland’s first response to Putin’s furious philippics was a statement by its foreign ministry on 21 December, expressing disbelief at the Russian President’s statements. Poland, the foreign ministry said, had a balanced policy towards Germany and the Soviet Union in the 1930s, signing non-aggression pacts with both countries. “Despite the peaceful policy pursued by the Republic of Poland, the Soviet Union took direct steps to trigger war and at the same time committed mass-scale crimes”.

 

According to the Polish foreign ministry the crucial chronology of events was that in January 1939 the Germans made their claims against Poland; in mid-April the Soviet ambassador offered Berlin political co-operation and at the end of April Hitler repudiated the German-Polish non-aggression pact; in August the Nazi-Soviet pact was signed; in September Germany and the USSR invaded Poland and then signed a Boundary and Friendship Treaty that formalised Poland’s partition.

 

Among Soviet crimes against Poland was the mass repression of Poles in the territories occupied by the Red Army, including 107,000 arrests, 380,000 deportations and, in spring 1940, 22,000 executions of Polish POWs and officials at Katyn and other murder sites.

 

On 29 December 2019 Polish Prime Minister, Mateusz Morawiecki, issued a statement, noting that Poland was the war’s first victim, “the first to experience the armed aggression of both Nazi Germany and Soviet Russia, and the first that fought in defense of a free Europe.” The Molotov-Ribbentrop pact was not a non-aggression agreement but a military and political alliance of two dictators and their totalitarian regimes. “Without Stalin’s complicity in the partitioning of Poland, and without the natural resources that Stalin supplied to Hitler, the Nazi German crime machine would not have taken control of Europe. Thanks to Stalin, Hitler could conquer new countries with impunity, imprison Jews from all over the continent in ghettos and prepare the Holocaust”.

 

Morawiecki pulled no punches in relation to Putin: “President Putin has lied about Poland on numerous occasions, and he has always done so deliberately.” According to Morawiecki, Putin’s “slander” was designed to distract attention from political setbacks suffered by the Russian President, such as US sanctions against the Nord Stream 2 gas pipeline project and the World Anti-Doping Agency’s banning of Russia from international sporting events for four years.

 

All states like to present themselves as victims rather than perpetrators, and this is not the first time Poland and Russia have clashed over the Nazi-Soviet pact. The piquancy of the polemics is obviously related to the dire state of Russian-Western relations and to the presence in Warsaw of a radical nationalist government.

 

But how should we evaluate the historical content of these exchanges? My first book, published in 1989 on the 50th anniversary of the Nazi-Soviet pact, was The Unholy Alliance: Stalin’s Pact with Hitler. Since then I have written many more books and articles about the Nazi-Soviet pact. My research has led me to conclude that Putin is broadly right in relation to the history of Soviet foreign policy in the 1930s but deficient in his analysis of the Nazi-Soviet pact.

 

After Hitler came to power in 1933 the Soviets did strive for collective security alliances to contain Nazi aggression and expansionism. Moscow did stand by Czechoslovakia in 1938 and was prepared to go to war with Germany.

 

After Munich the Soviets retreated into isolation but Hitler’s occupation of Prague in March 1939 presented an opportunity to relaunch their collective security campaign. In April Moscow proposed an Anglo-Soviet-French triple alliance that would guarantee the security of all European states under threat from Hitler, including Poland.

 

Some historians have questioned the sincerity of Moscow’s triple alliance proposal but extensive evidence from the Soviet archives shows that it was Stalin’s preferred option until quite late in the day. The problem was that Britain and France dragged their feet during the negotiations, and as war grew closer, Stalin’s doubts about the utility of a Soviet-Western alliance grew with it. Fearful the Soviet Union would be left to fight Hitler alone while Britain and France stood on the sidelines, Stalin decided to do a deal with Hitler that kept the USSR out of the coming war and provided some guarantees for Soviet security.

 

The Soviets were not as proactive as they might have been in trying to persuade the British and French to accept their proposals. Some scholars argue this was because the Soviets were busy wooing the Germans. However, until August 1939 all the approaches came from the German side, which was desperate to disrupt the triple alliance negotiations. The political overture of April 1939 mentioned in the Polish foreign ministry statement is a case in point: the initiative came from the Germans not the Soviets.

 

One state that Moscow did actively pursue in 1939 was Poland. The bad blood in Soviet-Polish relations notwithstanding, after Munich the two states attempted to improve relations. When Hitler turned against Poland in spring 1939 Moscow made many approaches to Warsaw, trying to persuade the Poles to sign up to its triple alliance project. But Warsaw did not want or think it needed an alliance with the USSR given that it had the backing of Britain and France.

 

The failure of this incipient Polish-Soviet détente sealed the fate of the triple alliance negotiations, which broke down when the British and French were unable to guarantee Warsaw’s consent to the entry of the Red Army into Poland in the event of war with Germany.

 

After the signature of the Nazi-Soviet pact there was extensive political, economic and military co-operation between the Soviet Union and Germany. Most people see this as a tactical manoeuvre by Stalin to gain time to prepare for a German attack. However, I have argued that in 1939-1940 Stalin contemplated the possibility of long-term co-existence with Nazi Germany.

 

Putin makes the point that Stalin did not sully himself with meeting Hitler, unlike British, French and Polish leaders. True, but Stalin received Nazi Foreign Minister Ribbentrop twice - in August and September 1939 - and in November 1940 he sent his foreign minister, Molotov, to Berlin to negotiate a new Nazi-Soviet pact with Hitler. It was the failure of those negotiations that set Soviet-German relations on the path to war.

 

The first clause of the secret protocol attached to the Soviet-German non-aggression treaty concerned the Baltic states. Throughout the triple alliance negotiations Moscow’s major security concern was a German military advance across the Baltic coastal lands to Leningrad. With the signature of the Nazi-Soviet pact that Baltic door to German expansion was locked by a spheres of influence agreement that allocated Latvia, Estonia and Finland to the Soviet sphere. Lithuania remained in Germany’s sphere but was transferred to the Soviets in September 1939.

 

It was the second clause of the protocol that divided Poland into Soviet and German spheres but this should not be seen as a definite decision to partition Poland, though that possibility was certainly present. The protocol limited German expansion into Poland but did not specify the two states would annex their spheres of influence. The actions of both states in that respect would be determined by the course of the German-Polish war. In the event, Poland was rapidly crushed by the Germans, while the British and French did little to aid their ally except declare war on Germany. It was in those circumstances that Berlin pressed the Soviets to occupy Eastern Poland. Stalin was not ready, politically or militarily, to take that step but he knew that if the Red Army did not occupy the territory then the Wehrmacht would.

 

Putin glosses over the fact that the Red Army’s entry into Poland was a massive military operation involving half a million troops. Large-scale clashes with Polish forces were averted only because Poland’s commander-in-chief ordered his troops not to fire on the Red Army. Even so, the Red Army suffered 3,000 casualties, including a thousand dead.

 

Often accused of parroting the Soviet line, Putin did not invoke the most potent argument that Moscow used to rationalise its attack on Poland, which was that the Red Army was entering the country to liberate Western Belorussia and Western Ukraine. 

 

Poland’s eastern territories had been secured as a result of the Russo-Polish war of 1919-1920. These territories lay east of the Curzon Line – the ethnographical frontier between Russia and Poland demarcated at Versailles. The majority of the population were Jews, Belorussians and Ukrainians and many welcomed the Red Army as liberators from Polish rule. Such enthusiasm did not outlast the violent process of sovietisation through which the occupied territories were incorporated into the USSR as part of a unified Belorussia and a unified Ukraine.

 

During the Second World War Stalin insisted that the Curzon Line would be the border between Poland and the USSR – a position that was eventually accepted by Britain and the United States. As compensation for its territorial losses Poland was given East Prussia and other parts of Germany. The result of this transfer was the brutal displacement of millions of Germans from their ancestral lands.

 

History is rarely as simple as polemicizing politicians would like it to be. Both sides of the Russo-Polish dispute have some valid arguments; neither has a monopoly of what is a bitter truth. The Nazi-Soviet pact is a fact but so is Polish collaboration with Hitler in the 1930s. The Soviet Union did cooperate with Nazi Germany but it also played the main role in the defeat of Hitler. Stalin was responsible for vast mass repressions but he was not a racist or genocidal dictator and nor was he a warmonger. The Red Army’s invasion of Eastern Poland was reprehensible but it also unified Belorussia and Ukraine. During the Second World War the Red Army was responsible for many atrocities but it did not commit mass murder and it did, together with its allies, liberate Europe from the Nazis.

 

Politicians will always use the past for political purposes. But in 2009 Putin came quite close to a balanced view about the Nazi-Soviet pact, as did Tusk in his measured rejoinder. Let’s hope that Poland and Russia can find their way back to such middle ground. 

 

The victory over Nazi Germany required enormous sacrifices by both countries. Surely it is possible to celebrate this common victory with dignity and with respect for differences about its complicated history.

Strange Obsession: President Trump's Obama Complex  

 

Journalists were astonished when President Donald Trump took verbal shots at President Obama (without naming him) in a speech intended to deescalate a conflict with Iran on January 8, 2020. In that kind of international crisis, U.S. presidents ordinarily encourage a united American front. Yet Trump’s remarks had a disuniting effect. He presented a sharply negative judgment about Obama’s leadership. Trump criticized Obama’s “very defective” and “foolish Iran nuclear deal.” He claimed missiles fired by Iran at bases housing U.S. troops were financed “with funds made available by the last administration.” The statement implied that blood would be on Obama’s hands if Americans died in the bombings. Journalists said it was quite unusual for a president to lash out at his predecessor when delivering an important foreign policy message that needed broad public support. 

 

The journalists should not have been surprised. Trump has publicly berated Barack Obama on numerous occasions. While Trump’s dislike of Obama is complex and multifaceted, his behavior at the 2011 White House Correspondents’ Dinner may reveal an important source of the hostility.

 

At the time of that event Donald Trump maintained that Barack Obama had not been born in the United States and therefore was ineligible to be president of the United States. When President Obama stepped up to deliver a humorous monologue at the April 30 dinner, he saw an opportunity to poke fun at the champion of this false claim. Referring to rumors that Donald Trump might run for president someday, Obama pointed to Trump’s limited leadership experience as TV host of Celebrity Apprentice. Then Obama referred to recently published documentation confirming his U.S. birth. Now that the birther claim was put to rest, Obama teased, Trump could focus on “issues that matter -- like, did we fake the moon landing?” Comedian Seth Meyers piled on. “Donald Trump has been saying he will run for president as a Republican,” noted Meyers, “which is surprising, since I just assumed he was running as a joke.” 

 

Commentators in the national media interpreted the situation as a public humiliation. They observed that Trump appeared angry and did not smile. When asked about the event later, Trump scolded Meyers for being “too nasty, out of order” but said he enjoyed the attention. From that time on, though, Trump’s references to Obama became more contemptuous. Trump made several statements in 2011 claiming President Obama might attack Iran in order to boost his chances in the next presidential election.

 

A suggestion that President Trump’s contempt for Barack Obama played a role in America’s recent troubles with Iran is, of course, a matter of speculation. We cannot be sure that contempt for Barack Obama affected President Trump’s decision-making on key policy matters. But there is context for considering the idea. Throughout his presidential campaign and years in the White House, Donald Trump has delivered numerous verbal beatings to supposed villains. 

 

Just about anyone who criticizes Trump publicly becomes a target. The president has ridiculed Hillary Clinton, Adam Schiff, and Nancy Pelosi. Heroic and much-admired individuals received the president’s wrath, as well, including Senator John McCain, aviator and POW in the Vietnam War, and Khizr Khan, whose son, a U.S. soldier in Iraq, died protecting his men from a suicide bomber. Even the 16-year old climate activist Greta Thunberg received insults. Thunberg’s offense?  Staring down President Trump at a UN meeting on climate. Donald Trump has expressed scorn towards numerous people, but no public figure has been as consistent a mark for contempt as former president Barack Obama.  

 

Trump’s long record of criticizing Barack Obama seems to reveal deep-seated enmity. Ordinarily, U.S. presidents do not speak much about their predecessors, but when they do, the references tend to be positive. Trump refers to Obama often and in a disparaging way. CNN analyst Daniel Dale calculated that Trump mentioned Obama’s name 537 times in the first 10 months of 2019. “For whatever reason,” observed Fernando Cruz, who served both Obama and Trump at the National Security Council, “President Trump has fixated on President Obama, and I think that he views President Obama as a metric he has to beat.” Peter Nicholas, who covers the White House for The Atlantic, said “a guiding principle of Trump’s White House has been, simply: ‘If Obama did it, undo it.’”

 

Trump hammered President Obama’s domestic initiatives. He tried to terminate Obama’s signature achievement, the Affordable Care Act (its popular name, “Obamacare,” provided an attractive target). President Trump reversed Obama’s efforts to move energy consumption away from coal, and Trump opened national parks to commercial and mining activity, rejecting the protections Obama favored. Trump also undermined the Obama administration’s environmental initiatives. He mocked Obama’s promotion of wind power and rolled back regulations for oil and gas production, including standards for methane gas emission. 

 

In foreign affairs President Trump abandoned his predecessor’s efforts to bring nations together to fight climate change, and he rejected Obama’s plans for a trans-Pacific trade deal. Trump also scratched Obama’s programs for improved relations with Cuba. 

 

The most notable attack on Obama’s legacy in international affairs came in May 2018 when President Trump began pulling the U.S. out of the nuclear accord with Iran. Trump ignored advice from members of his national security team who supported the agreement. The accord had been working. Iranians complied with its terms, placing nuclear programs on hold in return for a promise of reduced sanctions. President Trump blasted the accord as “the worst deal ever.” His actions led Iran to reinstate nuclear development. In a brief time, Trump managed to smash an effective security arrangement that also had backing from the UK, Russia, France, China and Germany.

 

The reasons for Donald Trump’s major decisions appear shrouded in mystery. Why did President Trump try to obliterate Obamacare but offer no well-conceived substitute? Why did he abandon the Iran nuclear deal but offer no alternative that foreign policy experts considered effective? Perhaps Trump’s rejection of these and other important measures did not reflect disagreement about policy details. Maybe Trump objected to them because they symbolized goals and accomplishments of Barack Obama.

 

There can be no certainty about the emotional impact of Donald Trump’s unpleasant experience at the White House Correspondents’ Dinner on April 30, 2011. Suggestions about a connection must remain speculative. Not even well-trained psychologists or psychiatrists can provide definitive judgments about the significance. Nevertheless, Donald Trump’s lengthy and extensive record of negative comments about his predecessor is so unusual that connections to the event of 2011 deserve study.

A Courageous Catholic Voice Against Antisemitism

Boston Irish activist Frances Sweeney was one of the few Catholic voices to challenge the silence in response to antisemitic attacks. 

 

“Eight young men came careening out of a side street. One snatched a yeshiva boy’s glasses and spun them into the street…another dumped the [Jewish] newsboy’s [papers] into the gutter; as yet another yanked, as he had seen in the newsreels, an old, spidery Jew by his beard.”

 

This scene, which sounds as if it could have taken place this week in Crown Heights or Williamsburg, actually appears in the autobiography of the late award-winning journalist Nat Hentoff, recalling the wave of violent assaults on Jews in Boston in 1938.

 

Hentoff, then a student at Northeastern University, was an eyewitness to what the newspaper PM described as an “organized campaign of terrorism” against Jewish residents of Boston’s Roxbury, Mattapan, and Dorchester neighborhoods in the late 1930s and early 1940s.

 

The perpetrators were Irish Catholic youths, who were inspired by the rabble-rousing “Christian Front” organization and Father Charles Coughlin, the antisemitic priest whose hate-filled radio show drew millions of listeners each week.

 

As the harassment and beatings of Jews in the streets of Boston reached epidemic levels in 1943, one hundred Jewish boys and girls, ages 12 to 16, sent a poignant petition to the mayor. 

 

The violence “makes us sometimes doubt that this is a democratic land,” the children wrote. “We cannot walk on the streets, whether at night or in the daytime, without fear of being beaten by a group of non-Jewish boys.” They pointed out that the environment had become so dangerous for Jews that Jewish Girl Scout troops and other social clubs had been forced to stop meeting.

 

Instead of taking action against the violence, Mayor Maurice Tobin dismissed the attacks as “strictly a juvenile problem,” while Governor Leverett Saltonstall accused the New York newspaper PM of being “utterly unfair” in criticizing the political leadership’s response to the crisis.

 

Given the fact that both the youth gangs and the Christian Front agitators were overwhelmingly Irish Catholic, the failure of the local Catholic leadership to speak out was especially troubling. 

 

Boston Irish activist Frances Sweeney was one of the few Catholic voices to challenge the silence. “The attacks on Jews…are the complete responsibility of Governor Saltonstall, Mayor Tobin, the [Catholic] church, and the clergy—all of whom [have] ignored this tragedy,” Ms. Sweeney charged.

 

Sweeney was the editor of a small crusading newspaper, the Boston City Reporter, which focused on exposing the antisemitic outbreaks and other instances of racism in the city. She was aided by a dozen volunteer researchers, including young Hentoff, who tracked the assaults and interviewed the victims. 

 

A fearless muckraker in the best sense of the word, Sweeney likely took her life into her hands when she infiltrated an event at South Boston High School in 1942 featuring the antisemitic priest, Rev. Edward Lodge Curran (he was known as the “Father Coughlin of the East”). Sweeney was spotted, roughed up, and physically thrown out of the building.

 

In her newspaper, Sweeney repeatedly called on the head of the Catholic Church in Boston, William Cardinal O’Connell, to “tell the faithful, without equivocation, to stop persecuting the Jews.” O’Connell summoned Sweeney to his office and threatened to have her excommunicated if she continued her “recklessly irresponsible attacks on the Church.” Not surprisingly, Sweeney refused. “The facts are the facts,” she replied. “Silence is a fact, especially when it comes from on high.”

 

Strong words of condemnation by Catholic leaders in the 1940s could have helped change the anti-Jewish atmosphere in Boston. Strong words of condemnation today by African-American leaders might influence those who have been assaulting Jews in Brooklyn in recent weeks. 

 

In the absence of such leadership, it is left to courageous individuals to speak out. That generation was fortunate to have Frances Sweeney. Will comparable voices emerge to counter the antisemitic violence of our own time?

]]>
Mon, 27 Jan 2020 03:17:58 +0000 https://historynewsnetwork.org/article/174071 https://historynewsnetwork.org/article/174071 0
Remembering Dennis Showalter, Grandmaster of Military History

 

On the evening of December 30, 2019, Dennis Showalter, a noted scholar of German, American, and military history, fought his last battle and rode off on a pale horse. He leaves behind a career spanning six decades of teaching countless undergraduate students, shepherding thousands of graduate students, authoring many seminal works, and demonstrating the enduring importance of military history in the minds of policy makers, service members, and the American public. 

 

As word circulates of his movement from a practitioner of the art to a subject of history, many individuals fortunate enough to have shared a room with him will undoubtedly give testimony to his excellence as a lecturer, even as he fought the cancer that eventually stilled his body. When he spoke on matters of history, culture, politics, or even baseball at gatherings, he did not simply lecture in the staid and muted voices too common in today’s academic halls. His deep baritone voice filled rooms with an oratorical performance like few others, weaving complex thoughts into a symphonic confluence of research, historiography, and common culture made easily accessible to elders of the discipline and the uninformed alike. Reading a book a day on average, few could keep pace with the man on a wide array of topics, including the use of transportation, armor, and firepower; the ancient laws of piracy applied against Al-Qaida and ISIS; Buffy the Vampire Slayer as a cultural metaphor for PTSD in RAF combat pilots; and how a Yankees pitcher managed to flush a perfect game. 

 

Like most serious scholars, Dennis Showalter possessed a depth and breadth of understanding across a wide array of subjects that was a draw in itself, but what kept students overfilling his classes and people coming back for more was the “Showalter Experience.” Over the years, my fellow surrogate academic sons and daughters of Dennis would often sit at the backs of rooms and watch new grad students enter for conference panel sessions, carrying notepads, pens, and almost dour expressions of academic seriousness. As he began to perform, we watched as these sullen figures suddenly dropped first their jaws and then their pens, entranced by the man’s intellectual repertoire, sprinkled with cultural touchstones from science fiction and music and punctuated by the phrase “Now in this discussion, mind you, there are three points that need to be considered….” No one was able to compete with Dennis and his sharp wit. 

 

Few knew from whence the mold that made Dennis Showalter was cast. He was born in Minnesota in 1942; neither of his parents was well-educated, monied, or politically aligned. They were practical, stern, fiscally conservative, and they pushed their son with an urgency that defined the children of the Great Depression, hoping their progeny would never need to suffer as they had. His mother was a stern homemaker who took care of the family while his salesman father was away, traveling from town to town in rural America, selling items door-to-door in places still recovering from the Depression and largely bypassed by the industrial transformation brought about by the Second World War. Upon reflection in the last years of his life, Dennis frequently mentioned how he treasured the trips he made with his father when he was old enough: looking for customers, talking baseball, understanding the value of money, and learning that everyone deserved to be treated with dignity, inherent value, courtesy, and the closeness of a friend one had yet to meet. On a subconscious level, it also taught him the mechanics of oration, the audience-performer dynamic, and the art of the show in the sale. When it came time to make a career, Dennis admitted how he hated sales. As he was also “not good with his hands,” he knew the only path open to him was through making “this education thing work.” So he took his mother’s steadfastness and his father’s ability to sell into the academic world, graduating first with a BA from St. John’s University in 1963 and, later, earning a PhD from the University of Minnesota in 1969. In the latter, he became close friends with the noted Germanist and WWII OSS Chief of the European Axis Section of the Board of Economic Warfare, Harold Deutsch, merging a thorough attention to detail with his now legendary showmanship. 

         

Now on the job market and perceiving himself as a fish out of water as the Vietnam War boiled over, Dennis Showalter set aside his political beliefs and pushed himself to measure up to the more “well-heeled crowd” of scholars with which he found himself sharing office and classroom space. In 1969, he began teaching at Colorado College and, in spare hours, threw himself into writing. During this period, he published many of the still-standing authoritative works in his field, such as Railroads and Rifles: Soldiers, Technology, and the Unification of Germany; German Military History Since 1648: A Critical Bibliography; and Tannenberg: Clash of Empires, 1914. However, they were more than just the requisite ticket-punching of a new scholar. They were (and still are) recognized as a cut well above the rest, garnering him distinguished visiting professorships at the Air Force Academy, the United States Military Academy, and Marine Corps University, a position on the Iraq War Study Group, and a regular place on the national and international lecture circuit. As he later identified in others, Dennis was “a first rate mind,” and he tried to make the world a better place with it through the heart of American democratic principles. 

 

By the late 1970s, Professor Showalter was firmly entrenched in academic institutional circles, which granted him a vantage point from which he helped lift the profile of military history, much maligned by radicals in the aftermath of the Vietnam War. At first, he served as a trustee for the then-named American Military Institute and as an editorial advisory board member for Military Affairs. Following a major incident in which a small group of radicalized American Historical Association (AHA) participants disrupted a panel of American military historians at the 1984 AHA conference, Dennis was also one of the members who stepped in to attempt to heal the breach between the small, self-funded military history group and the congressionally chartered organization. There he won over crowds with his special brand of humor and personal charm, inviting the social-history-centric clique to become involved in dialogues with the military history community. When Robin Higham retired after a lengthy term as editor of Military Affairs, Dennis Showalter, as president of the Society for Military History, helped steer the flagship publication, renamed The Journal of Military History, through a rough transition into the now longstanding editorial hands of Bruce Vandervort at the Marshall Foundation, opening the door to a broader field of scholarly subject matter. Around the same time, Dennis became a regular pinch-hitter for various academic presses as a series editor, using these outlets to aid the fledgling careers of young scholars by showcasing their work, among them Kathryn Barbier, Patrick Speelman, Michael Neiberg, and David Ulbrich, to name just a few. If someone had a good idea, he would find a matching outlet for their efforts whether or not his politics aligned with the author’s views. Long before “diversity” became a cultural buzzword for change, Dennis Showalter was already ahead of the crowd, championing the (still) most underappreciated quotient of American culture, “the diversity of ideas.” 

 

“New ideas are always needed,” he once told me. “If they can stand up to inspection, then no one should be left out in the cold.” For these efforts and more, he was awarded the SMH Samuel Eliot Morison Award in 2005 and the Pritzker Literature Award in 2018 for lifetime achievement. At the time of his death, he was already working on his twenty-eighth book.

 

Still, Dennis Showalter never quite learned his father’s most important lesson, or so he occasionally told me. You see, like the best of educators, Dennis sold his audiences on his subject matter with enthusiasm, but, instead of bargaining or raising the price as others do, he gave away his most valuable possessions for free: his time and his example of how to be a good person in a solipsistic world. Dennis went the extra mile for anyone looking for advice or in need of assistance…even his few detractors. In nearly twenty years, I long ago lost count of the number of people in need to whom he lent money, how many checks he picked up for starving grad students, how many dinners or manuscript edits he dropped to rush to campus to aid a student in crisis. When duty called, he stepped up before others even recognized the need. For example, when Harry Deutsch died, Dennis finished his last book (originally entitled What If? and since repackaged as If the Allies Had Won) without question, losing time, money, and a few hairs off his head in the process. “If I make a promise, I stick to it,” my Doktorvater told me in the middle of a situation that would have broken others. “If I make a friend, I side with them to the bitter end.” As those close to him have long been intimately aware, this modus vivendi also extended to felines. Dennis Showalter never let a cat go hungry and never left the Colorado Springs pound without a grateful feline in arm…or two…or, in one case, three. He said he couldn’t “bear the thought of leaving them there to await a needle in a cold cell when there was room in my home.” 

 

It was an odd quirk of fate that put me in the same room with Dennis Showalter. One day, Bill Forstchen of Montreat College looked to widen Dennis’s audience by bringing him into mainstream publishing and “get him paid what he was worth.” When the meeting was called, I had put my doctoral pursuits on hold to take care of my dying father. I knew of his scholarly reputation, and I had read his seminal “Railroads and Rifles” and his biography of Frederick the Great (still the definitive books in the field several decades later), but I knew little else, which left me to be the one to translate academese into business-speak and keep him on target for manuscript delivery without seeming more than an interested fan. In the weeks following 9/11, as the ashes of nearly 3,000 victims of America’s post-Cold War “Peace Dividend” rained down outside a Manhattan office, a group of us managed to craft for Dennis the dream project proposal he never thought an academic press would touch: a World War II dual biography akin to Stephen Ambrose’s Crazy Horse and Custer. Patton versus Rommel was born that day, as was his follow-up, Hitler’s Panzers. My first eye-opener to the “Showalter Experience” came when he refused to allow me to call him “Doctor” or “Professor.” Those were “titles that got in the way,” he said. “Just call me Dennis.” The second came when the conversation turned to money. Upon hearing the suggested sum to go with the proposal, he looked around the office, fixed on a copy of a Britney Spears memoir, and asked with a broad grin if we “could possibly get Britney to appear on the cover? Maybe on top of a tank? Hey, if I’m getting my dream, why not get as many readers as possible?” 

 

When Dennis was first diagnosed with esophageal cancer, he asked me one night if I thought he had measured up to what his parents had wanted. I’ll tell you what I told him: “Some people are measured by their books, or their perceived professional reputation, or the number of bodies one leaves behind them or, in our current age, their vigorous support for a given political ideology. You, sir, have an embarrassment of riches of which they would have been proud.” Dennis didn’t care about things for himself. He cared about what he could do with them to advance the cause of fellowship, free discussion, and an understanding of what it means to be human through the most horrific aspect of our animal dimension, war. There was a time when he was not here, and that time, with profound sadness I must admit, has come again. Many have spoken and will speak of him as a valued grandmaster of the profession, but, most importantly, as he now rides into the sunset of history, we should remember Dennis Showalter as a kind soul, a selfless friend, and a good man. 

]]>
Mon, 27 Jan 2020 03:17:58 +0000 https://historynewsnetwork.org/article/174072 https://historynewsnetwork.org/article/174072 0
A Play About Historical Reenactors Grapples With American Identity

You have all seen the historical re-enactors, the men, women and children who dress up in period costumes, grab a musket and re-fight battles of the Civil War, American Revolution and other conflicts. They have been in numerous movies (Lincoln, Glory, Gettysburg), appeared on hundreds of television programs and been the subjects of countless magazine and newspaper articles. They jump back into history and bring it alive for us.

 

Talene Monahon’s new play, How to Load a Musket, takes a deep, hard look at the re-enactors of two wars, the American Revolution and the Civil War. There is a lot of humor connected to the American Revolution, but when she turns her sights on the Civil War she fires away at the lives of the re-enactors, and their views of history and politics, with a blazing musket of her own. She hits most of her targets, too. This play, which opened Thursday at the E. 59th Street Theater in New York, is a scorcher, and the big parades and quaint campfires we have come to know and love fade off into the distance as the playwright fires away at what America was really like, is like, and might be like in the future. It is a bare-knuckled, no-holds-barred historical brawl on the race issue in 1861 and today, too. She charges that the race argument is about today, and not yesterday.

 

The play starts off in the office of the head of the Lexington, Massachusetts, re-enactment group and its lovable members. They are cute and charming. One George Washington re-enactor says that he is actually jealous of another George Washington re-enactor. The Americans who play British soldiers poke fun at themselves, and a high-spirited middle-aged woman with a thick Boston accent giggles about the men she meets on the battlefield, just as the men chuckle about the women. They all talk about how hard it is to meet people, but quite easy in the middle of a re-enactment battle. They discuss at length what a warm world they have created within the confines of the re-enactor universe.

 

When the playwright moves to the Civil War, though, the three-cheers-for-the-red-white-and-blue atmosphere changes and the terrain sizzles with debates over the role of re-enactors and which America they represent. There are loud and pronounced verbal fisticuffs over the controversial tearing down and removal of Confederate monuments and what many African Americans might really feel about race back then, today, and tomorrow morning.

 

This is an electrifying play that pulls no punches, a play that grabs your throat. It asks again and again: whose America was it in the past, and whose America is it today?

    

The playwright focuses much of the second half of the play on the 2017 white supremacist rally in Charlottesville, Virginia, that was held to support far right political causes and to prevent the removal of statues of Confederate war heroes. There were KKK men and women in their white robes, far right sympathizers and dozens of Confederate flags flying in the breeze. The far-right people were opposed by hundreds of shouting counter protestors. Things got out of hand. One woman was killed and several people were injured. The confrontation, recalled again in the play, drew international attention. In the play, the re-enactors fear they’ll be attacked, too.

 

That incident then erupted into a national debate over racism and President Trump’s famous line that there were good people on both sides. He should have said there were bad people in the crowds. The line is repeated in the play.

 

The great grandson of a Confederate soldier says that what is happening in America with monument removals and name changes is “historical genocide,” and that liberals today are trying to seriously rewrite history, cutting the stories of brave Confederate heroes out of it. This is, he insinuates, denying a part of American heritage. This is, of course, a debate that has been raging for several years.

 

The Confederate great grandson notes that his family helped a post-Civil War newly freed slave family learn how to farm and take care of their home. America is not, he claims, just heroes and villains.

 

The play is more of a moving conversation and heated debate than it is either a comedy or drama. Ms. Monahon deftly turns it into a play, though, carrying you along in the trenches as the re-enactors debate their lives and their wars.

 

The playwright does step over the line a few times. She suggests that tomorrow morning the U.S. might plunge into a civil war over race. That is highly doubtful. She has an African American character say that Abraham Lincoln was a white supremacist. Oh, come on!

   

Jaki Bradley has done a fine job of directing this play. She has carefully woven dialogues and story to turn a serious debate into an engaging and rewarding play. All of the performers are superb in this drama. Bradley gets fine performances from Carolyn Braver, Ryan Spahn, Adam Chanler-Berat, Andy Taylor, David J. Cork, Lucy Taylor, Richard Topol and Nicole Villami.

 

This play is a bumpy night at the theater. If you go, regardless of your political persuasion – lock and load!

 

PRODUCTION: The play at the E. 59th Street Theaters is produced by the Less than Rent Theatre. Sets: Lawrence Moten, Lighting: Stacey Derosier, Sound: Jim Petty, Props: Caitlyn Murphy, Costumes: Heather McDevitt Barton. The drama is directed by Jaki Bradley. It runs through January 26. 

]]>
Mon, 27 Jan 2020 03:17:58 +0000 https://historynewsnetwork.org/article/174075 https://historynewsnetwork.org/article/174075 0
Digesting History: A Conversation with the Museum of Food and Drink’s Curatorial Director Catherine Piccoli

Catherine Piccoli is a food historian and writer, whose work focuses on the intersection of food, culture, memory, and place. She brings this multidisciplinary approach to the Museum of Food and Drink. As curatorial director, she oversees the creation of MOFAD’s exhibitions and educational programming, and guides the operations team. Catherine was instrumental in the research, writing, and development of past major exhibitions, Flavor: Making It and Faking It and Chow: Making the Chinese American Restaurant, as well as gallery shows Feasts and Festivals, Knights of the Raj, and Highlights from the Collection. She also established the museum’s robust public programming.

 

The following interview was originally conducted on November 25th, 2019.

 

What do you think separates MOFAD from other museums or similar institutions?

 

Because we are the Museum of Food and Drink, we start at a place of similarity with everyone. Everybody eats. Whether or not you like to eat, you have to do so multiple times a day. It’s something that we engage in out of necessity, it’s something that we engage with through culture, so I think having that starting point in common means that it’s much easier to reach people because they already have that interest in food. I think also because we believe that “Food is Culture”, and we do a lot of programming around that idea, we're meeting people with an idea that they're already comfortable and familiar with. Most people can think about what their families fed them, what they ate growing up, what is nostalgic to them, what is a part of their personal history and what that means to them. Then we can really take it from there and go in so many different directions and hopefully teach people something that they didn't know before. A big thing for us internally is thinking about the invisible every day. You can do that so easily with food. When you open your refrigerator and you look at, say, a Chinese takeout box: What is the history of that food? What is the history of a Chinese takeout box? What is the history of a refrigerator? Why do we have one? Why is it an electric refrigerator? With all of these sorts of things, we can really blow people's minds wide open about food and use food as a lens to talk about larger ideas.

 

You mentioned that MOFAD’s slogan is “Food is Culture”. What does that mean to you in a historical context?

 

For me, that starts on a personal level. You know, your family's culture and history. We can take me for example, I am Italian-American and Polish- and Slovak-American, but I also grew up in the Midwest. So, growing up in Chicago, what are the things I grew up eating that my Italian family made? Or that my Slovak grandmother made? What does it mean to have grown up in a city with a really large Polish population? How did that impact the foods that I ate every day? And then you can go out even further than that to Poland. Cuisine in Poland, culture in Poland, how does that travel? What does transnational cuisine look like? How do cuisine and culture change when people move? For me, thinking "Food is Culture" is all-encompassing from the micro to the macro.

 

I have noticed that MOFAD offers an abundance of public programming, and that programming is more interactive than I have seen at other museums. What do you think the advantages are of inviting the public to become active participants in history?

 

Not only in our public programming but also in our exhibitions, we feel it’s really important to engage people through all their senses. That's easy to do because food does that. When you come to see our exhibitions you will eat, you will literally, I like to say, "digest" the information that you have just literally digested. It's really important when you're talking about food to be able to experience it as well. We do that in our exhibitions as well as our public programming. We just had Marcus Samuelsson come last week and talk about the release of Our Harlem as an audiobook. He had some of the people that he interviewed there, they had a panel discussion and then there were foods from Red Rooster that people got to eat afterwards. Not everyone may be able to go to Red Rooster, but maybe you can come to MOFAD and taste some of those foods. Or not everyone may like to cook, or feel they're good enough to try one of those recipes. So they, too, can come to MOFAD and try that. Through the years we've done programming and exhibitions around the flavor industry, which included programming around your sense of smell as well as things that you're eating and tasting. We've done honey tastings in the past, wine tasting, beer and cheese pairing, all sorts of different things to help people continue to engage with the topic but also think more deeply about food and drink.

 

What do you think are the advantages and disadvantages of featuring only one exhibit at a time? For example, you currently are displaying "Chow: Making the Chinese American Restaurant".

 

For us, I guess you could say we are a fledgling institution, and our current space is called MOFAD Lab. We call it that instead of calling it the Museum of Food and Drink because we really saw it as our experimental space, our exhibition design studio or even our "test kitchen" if you want to have another pun, where we can test out how to be a museum. While MOFAD has existed as an idea since 2010, it wasn't until 2015 that we had our first physical space. The Lab is not big enough to have multiple full-size exhibitions, but that was okay for us because we're still a small team and doing one exhibition at a time really helps us to focus and make the best exhibition possible. It has worked very well for us, I think, but it can be difficult at times. You know Chow’s been open for a few years now so some people think we're the Chinese Restaurant Museum or even a Chinese-American restaurant sometimes, which is a little bit silly. But we find that when people come in confused, once they get to MOFAD Lab and we can talk to them, they really understand what we're doing and want to come back and see more. It is our goal, ultimately, to grow to an institution on the scale of the Smithsonian or the Met. Obviously, this is our first step towards that and hopefully the next phase will be several galleries instead of one so we can have multiple exhibitions at a time.  

What do you hope that a visitor who comes in with no prior knowledge gleans from your current exhibition? What do you want them to walk out of MOFAD thinking about?  

For us, a lot of it has to do with connection. With Chow, we're using food as a lens to talk about racist immigration policy. We're talking about the Chinese Exclusion Act and how despite the fact that during that 60-year period Chinese people are functionally excluded from entering this country, the Chinese-American restaurant really blossomed and that restaurant cuisine becomes a part of the culinary zeitgeist. We want people to leave understanding why that's a remarkable story and how that happened, but we also want visitors to go home and think about their local Chinese takeout place differently. Here in New York, a lot of the Chinese takeout restaurants are still family-run and probably across the country as well. Hopefully, people are going into those restaurants and engaging with the folks that are running them. Who are those people? How did they get to the U.S.? What are their plans? What are their dreams? What are they cooking? We really want people to look at those spaces in a new way and engage with the folks who are cooking their food.

 

David Chang, a chef who I personally admire, has been talking a lot recently about trying to get people to rethink MSG. Do you bring that at all into your current exhibit or into your conversation about Chinese-American food?

 

It's funny you asked that. Our first exhibition at Lab was called "Flavor: Making it and Faking it" and it was on the history and the technology of the flavor industry. We had three main stories that we told, one of which was the quote-unquote "discovery" of umami as a taste and how Dr. Kikunae Ikeda, who's a Japanese chemist, is the one who "discovers" it and names it and then begins manufacturing MSG in Japan. So we talked a lot about MSG in our first exhibition and we decided not to have any panels about it in this exhibition. But we often get that question at our culinary studio, and we often refer people to Harold McGee's piece on MSG which was in the first issue of Lucky Peach. But our stance as a museum on MSG, if that's what you’re asking, is that the studies have not borne out whether or not MSG is definitely bad for people. Now obviously everybody's bodies are different so if somebody feels that they react to MSG we're not going to argue with them about that because we don't know what's happening in each other's bodies. But, you know, MSG is used in so many foods in this country in the industrial food system and has been since the 30s in things like Campbell's soup here in the U.S. So, for us, it's not a scary thing. We did talk in the Flavor exhibition about the racist underpinnings of the fear of MSG and "Chinese restaurant syndrome", but again, that's not an active piece of this exhibition.

 

While conducting research for the CHOW exhibit, did any one dish or food strike you as having a particularly interesting history?

 

I'm going to sort of answer your question. I became really intrigued by chop suey. We ate a lot of it because our initial thought was that our tasting at our culinary studio for Chow would be chop suey and historic tastings of chop suey. So, we found a lot of historic recipes dating back to the late 1800s for chop suey. We made a lot of chop suey and we ate a lot of chop suey. It's one of those things that's funny, right: it's one of the first Chinese-American dishes to really blow up if you will. But it's not something that's really on many menus anymore, and even now it's different. Those early recipes show usually a soy or Worcestershire based brown sauce and today chop suey is made with white sauce. It's interesting to me to think about how dishes change over time and why. I didn't look into it very much but I'm still fascinated by the idea that this one dish is a reason that Chinese-American restaurants really become en vogue in the late 1800s/early 1900s but it's something that we don't really eat anymore.

 

What made you choose to focus on food history?

 

I majored in history as an undergrad, my full major was social and cultural history which I got at Carnegie Mellon University in Pittsburgh. While I was at CMU I really thought I was going to be a music journalist. I also minored in clarinet performance so I was taking a lot of music history classes as well. That was where my passions lay. I did take a class where we read Sidney Mintz's book Sweetness and Power about sugar and I was not touched by it. It's a seminal food studies book but I was like "What is this? Why do I care?", which is funny to me thinking about it now. But I finished college and I didn't know how to write for Rolling Stone and didn't really know what was next. I was just working around Pittsburgh and started thinking about food in a different way in my 20s. I was having people over for dinner parties, getting into wine, those sorts of things. I had always been a good eater, both of my parents worked so we spent a lot of time together around the dinner table or in the kitchen on the weekends menu planning or cooking things for the week, or family baking around the holidays. Food was always the center of things that we did when I was growing up. So, it makes sense that I rediscovered that in my early 20s when I was working and forming a household of my own. I started thinking about food differently, I started interacting with food differently, and I started reading food memoirs such as Ruth Reichl and Michael Pollan. I started thinking about how I could have a career in food that wasn't necessarily working in a restaurant because that wasn't something that I was interested in, I didn't want to become a chef. I saw an ad in the newspaper for a food studies program at Chatham [University] which is where I did my master's degree. It really all sort of clicked, it fell into place for me. And of course, while I was there I realized that I could study food history. I think I saw it as the history of a recipe, or of a dish, or of a chef. Which, again, that's not personally where my passion for food history is. I like thinking about people and place and culture and the bearing that has on your food and what you eat and why you eat that, and I really was allowed to do that there. When I moved to New York with a master's degree and again wasn't really sure what I was going to do, I found MOFAD. It was sort of perfect because I had volunteered and interned and worked at history museums while I was in college and after college and then here was this museum that had food as its central focus and it made a lot of sense for me. I lucked out, I think, finding MOFAD and realizing that food history made sense for me, and being able to hold onto it and keep working.

 

As an undergrad studying history, I feel compelled to ask this question. How did your academic career influence your working career? (The answer to this question was submitted after the interview via email.)

 

There are the obvious skills around research (using databases to locate materials, analyzing primary and secondary sources, crafting and conducting oral history interviews) and writing (synthesizing and analyzing research, crafting tight and compelling narratives). A few other skills also come to mind that I've jotted down below:

 

1. Learning the formal way to address and communicate with professors: One of the first things my freshman seminar professor taught our class was the proper way to interact with professors – how to address them in person and over email, how to keep our requests short and respectful. It feels so simple now, but I'm so glad I learned this skill early on in my academic career. At MOFAD, I often have to reach out to professionals, academics and others, with no introduction. Sending that first professional email can set the tone for a productive working relationship.

 

2. Comfort using non-traditional primary sources: Perhaps "non-traditional" is not quite the right term. Still, I became quite comfortable during my undergraduate coursework for my music degree in using performances, songs, lyrics/poems as primary sources. This has served me well as a food historian where cookbooks, agricultural manuals, and recipes can serve as primary sources.

 

3. Communicating why you should care: I think this is something I began to learn as an undergrad, but really honed during my graduate work. Whenever I write, I keep the question "But why should I care?" in the back of my mind and try to answer it (sometimes again and again). I think with any topic, but especially with a topic rich in materiality like food, it's so important to convey to your reader why this thing matters, why they should care. What can a historical event teach us about current events that are affecting our daily lives?

 

Where do you see MOFAD headed in the future?

 

My dream for MOFAD is that we can continue growing and can continue putting together meaningful and thoughtful exhibitions. We're in a bit of a transition right now. Our next exhibition, which I'm really excited about, is called "African/American: Making the Nation's Table" and it's about the many contributions of African Americans to the creation of American cuisine. As part of that, we won the rights to the Ebony Test Kitchen from the Johnson Publishing Co. building. That was the test kitchen where all of the recipe testing was done for Ebony Magazine, Jet Magazine and also some other Johnson Publishing Co. magazines. It's really exciting for us to have this historic and crazy super psychedelic, 1970's, orange, purple, green, swirly kitchen on display from my hometown of Chicago as part of that. That exhibition will be on show at a different space, not MOFAD Lab but instead at the Africa Center, which is a museum in Harlem. It'll be on display there for six months next year and then it will travel. So that's amazing for us, a travelling exhibition. And then from there, we're figuring out what's next for us, where we'll go. I think that for me and for our staff, we're not trying to be and we don't want to be a place like the Museum of Ice Cream or one of those sort of Instagram “experiential” museums. We are really hoping that people come and they have an "a-ha" moment, and they learn something about food that they didn't know or they're inspired to think deeper about the things that they're putting into their body, or how foods and drinks get to their plates and to their cups. I think that we've been able to do it with our exhibitions so far and I just hope that they keep getting bigger and better.

 

 

Catherine Piccoli can be found on Twitter @gigaEats and Instagram @giga.eats. The Museum of Food and Drink can be found at mofad.org, as well as on Twitter and Instagram @mofad.

]]>
Mon, 27 Jan 2020 03:17:58 +0000 https://historynewsnetwork.org/article/173798 https://historynewsnetwork.org/article/173798 0
The Supreme Court Historical Society: An interview with the President

The Supreme Court Historical Society is a private, non-profit organization dedicated to the collection and preservation of the history of the Supreme Court of the United States. It was founded in 1974 by Chief Justice Warren E. Burger, and is still in operation today. 

 

I recently interviewed the President of the Supreme Court Historical Society, Chilton Varner. Ms. Varner is from Atlanta, Georgia where she is a litigator at the law firm King & Spalding. Ms. Varner graciously spoke about the work of the Society and its relation to history. 

 

Ms. Varner is passionate about the Society. She listed a number of things the Society does for the public, including the creation of various lesson plans, scholarly publications, a lecture series, an annual reenactment of landmark decisions, and a number of lectures open to the public.

 

Ms. Varner’s favorite activity the Society sponsors is the Supreme Court Summer Institute for Secondary Teachers. This program allows thirty secondary teachers to come to the Court to enhance “the level of their instruction about the court” for their own students. The teachers come to Washington D.C., where they are given a tutorial about the Court, the Constitution, and the Judiciary Branch. They are able to interact with one another and the Chief Justice of the Supreme Court. Ms. Varner described how this program leaves the teachers “excited, energized, and armed with new information” which they bring back to the students they teach. 

 

Ms. Varner also made note of how civic courses have been disappearing from schools around the country. She values these courses and wishes to see more young people educated about the government and its history. She would also like to see more young people sign up for the Historical Society because it “keeps them current, teaches them about legal history, and is important for the future.” It is exciting for her when she sees young people from her own law firm join. Ms. Varner expressed the importance of young people learning history. The Supreme Court Historical Society strives not only to keep the history of the Court alive, but to pass it on to the next generation. 

 

The Supreme Court Justices have been critical in the success of the Historical Society. Ms. Varner was greatly appreciative of all the Justices and what they have done for the Society. From introducing guest speakers to providing various forms of support, they have been key to the function of the Society. Ms. Varner noted the amount of time the Justices give to the Society. She believes they “recognize the importance of their own Court’s history” and the importance of the Society.

 

Ms. Varner herself is a part of history. In 1983, she became the first woman litigation partner at her firm and was the only woman trial lawyer at the firm for a number of years. Since the start of her time as a practicing attorney, she has seen numerous changes for women in law. Most notably, when she argued in front of the Eleventh Circuit Court of Appeals, it was before a panel of all women. While Ms. Varner says there is “still a way to go and more progress to be made,” today is a very different environment where “nobody blinks an eye now when a woman trial lawyer stands up to strike a jury.” Ms. Varner sees trial lawyers like herself as “historians” in both their lives and what they are arguing. 

Throughout my interview with Ms. Varner, she was clear about the importance of history in general and that of the Supreme Court. Ms. Varner highlighted just how important the Supreme Court Historical Society is in preserving this history and educating the public. You can check out the Historical Society's website at https://www.supremecourthistory.org/.

]]>
Mon, 27 Jan 2020 03:17:58 +0000 https://historynewsnetwork.org/article/173476 https://historynewsnetwork.org/article/173476 0
Do Morals Matter in Foreign Policy?

 

Do morals matter in American foreign policy, or is American moralism just hypocrisy, as realists teach us? Conventional wisdom is skeptical, and surprisingly few books and articles focus on whether presidents’ moral views affected their foreign policies and how that should shape our historical judgment of them. I set out to answer these questions in my new book Do Morals Matter? Presidents and Foreign Policy from FDR to Trump. 

 

Examining 14 presidencies since 1945 shows that a radically skeptical view of morality is bad history. Morals did matter. For example, a purely realist account of the founding of the postwar order in terms of the bipolar structure of power or an imperial imposition of hegemony does not explain FDR’s Wilsonian design or Harry Truman’s delay in adapting it after 1945, or the liberal nature of the order that was created after 1947. George Kennan suggested a realist policy of containment, but Truman defined and implemented it in broader liberal terms. 

 

Similarly, an accurate account of American intervention in Korea in June 1950—in spite of the fact that Secretary of State Dean Acheson had declared earlier that year that Korea was outside our defense perimeter—would have to include Truman’s axiomatic moral decision to respond to what he saw as immoral aggression. Similarly, to explain the major elevation in the priority of human rights in American foreign policy after the Vietnam era, we must include the moral outlook of Jimmy Carter. Ronald Reagan’s decision to ignore his advisors and his previous harsh rhetoric about the “evil empire” must be understood in the light of his personal moral commitment to ending the nuclear threat.

 

Looking back over the past seven decades of American primacy, we can see certain patterns in the role of ethics and foreign policy. All presidents expressed formal goals and values that were attractive to Americans. After all, that is how they got elected. All proclaimed a goal of preserving American primacy. While that goal was attractive to the American public, its morality depended on how it was implemented. Imperial swagger and hubris did not pass the test, but provision of global public goods by the largest state had important moral consequences.

 

The moral problems in the presidents’ stated intentions arose more from their personal motives than from their stated formal goals. Lyndon Johnson and Richard Nixon may have admirably sought the formal goal of protecting South Vietnamese from Communist totalitarianism, but they also expanded and prolonged the war because they did not want to be “the man who lost Vietnam.” In contrast, Truman allowed his presidency to be politically weakened by the stalemate in Korea rather than follow General Douglas MacArthur’s advice of using nuclear weapons. Morality mattered greatly in both these cases.

 

If we determine morality based on the three dimensions of intentions, means and consequences, the founding presidents of the post-1945 world order—FDR, Truman, and Eisenhower—all had moral intentions, both in values and personal motives, and largely moral consequences. Where they sometimes fell short was the dimension of means, especially the use of force. In contrast, the Vietnam era presidents, particularly Johnson and Nixon, rated poorly on their motives, means and consequences. The two post-Vietnam presidents, Gerald Ford and Jimmy Carter, had notably moral foreign policies on all three dimensions but their tenures were brief, and they illustrate that a moral foreign policy is not necessarily the same as an effective one. The two presidents who presided over the end of the Cold War, Reagan and George H.W. Bush, also scored quite well on all three dimensions of morality. The years of unipolarity and then the diffusion of power in the twenty-first century produced mixed results with Bill Clinton and Barack Obama above the average and George W. Bush and Donald Trump falling well below average. Among the fourteen presidents since 1945, in my view the four best at combining morality and effectiveness in foreign policy were FDR, Truman, Eisenhower, and Bush 41. Reagan, Kennedy, Ford, Carter, Clinton and Obama make up the middle. The four worst were Johnson, Nixon, Bush 43, and (tentatively because of incompletion) Trump. Of course, such judgments can be contested and my own views have changed over time. Historical revision is inevitable as new facts are uncovered and as each generation re-examines the past in terms of new circumstances and its changing priorities.

 

Obviously, such judgments reflect the circumstances these presidents faced, and a moral foreign policy means making the best choices that the circumstances permit. War involves special circumstances. Because wars impose enormous costs on Americans and others, they raise enormous moral issues. Presiding over a major war such as World War II is different from presiding over debatable wars of intervention such as Vietnam and Iraq. 

 

The importance of prudence as a moral virtue in foreign policy becomes clear when one compares Dwight Eisenhower’s refusal to send troops to Vietnam with John Kennedy and Lyndon Johnson’s decisions. After losing 241 Marines in a terrorist attack during Lebanon’s civil war in 1983, Reagan withdrew the troops rather than double down. Similarly, Obama and Trump’s reluctance to send more than a small number of forces to Syria may look different with time. Bush 41 was criticized for restricting his objectives, terminating the Gulf War after four days’ fighting, and not sending American armies to Baghdad in 1991, but his decision seems better when contrasted with the lack of prudence that his son showed in 2003 when members of his administration expected to be greeted as liberators after the invasion of Iraq and failed to prepare adequately for the occupation. In foreign policy as in law, some levels of negligence are culpable.

 

Realists sometimes dismiss prudence as an instrumental and not a moral value, but given the complexity and high prospect of unintended consequences when judging the morality of foreign policy decisions, the distinction between instrumental and intuited values breaks down and prudence becomes a crucial virtue. Moral decisions in foreign policy involve both intuition and reason. Willful ignorance or careless assessment produces immoral consequences. Conversely, not all decisions based on conviction are prudential, as some of my cases indicated. Truman’s response to North Korea’s crossing the 38th parallel in Korea, for example, was imprudent, though he saw it as a moral imperative. These reasoned and intuited virtues can conflict with each other. Principle and prudence do not always coincide. The problem that presidents often face is not a question of right versus wrong, but right versus right. But whatever the choices, they cannot be dismissed on the basis of simplistic realist models of “national interest.” What matters is how interest was defined, and to answer that question, good history shows that morals mattered. 

]]>
Mon, 27 Jan 2020 03:17:58 +0000 https://historynewsnetwork.org/article/174018 https://historynewsnetwork.org/article/174018 0
Our GOP Problem

New York Times, April 2, 1950

 

Stone Age Brain is the blog of Rick Shenkman, the founding editor of the History News Network. His newest book is Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (Basic Books, 2016). You can follow him on Twitter @rickshenkman.

 

The Republican Party, unsurprisingly, has taken the position that President Trump should be defended. This is unsurprising because this is what parties in power do.  If we want to explain what has happened to the Republican Party, which all must try to do in this hour of crisis when democracy itself is on the line owing to Republican perfidy, it is essential for us to view events not from the perspective of the rational actor but from that of the party politician.  Only then can the alarming events through which we are living become understandable. 

 

We must begin with the basics.  Three overriding causes may be said to account for the behavior of politicians holding national office.  One is money, with which we need not concern ourselves too much.  It’s obvious the role money plays in our politics.  Members of Congress must always be thinking in the back of their mind how any vote may affect their chance of financing their next election. Every GOP member of Congress has to worry that if they vote against Trump they’ll be cut off from various campaign funds available to Republicans in good standing with the party and the party’s major-domo donors such as Sheldon Adelson and Charles Koch. 

 

More interesting, though also obvious, is the second factor, pure partisanship. The social sciences tell us that partisanship is hard-wired in the human brain.  It is the reason we cheer for our side in a ball game and hope for the opposition’s defeat.  Once we identify with a group we look for evidence that confirms the group’s status and dismiss evidence that detracts from it.  Because partisanship is stronger among Republicans generally than it is among Democrats, perhaps owing to a default loyalty bias among people who identify as conservative, it is pretty easy to comprehend the ordinary Republican’s behavior in ordinary times.  

 

Of course, these are not ordinary times.  Presidents are rarely impeached.  So Judiciary Committee Chairman Jerry Nadler, ahead of the committee’s vote on impeachment, issued a rare plea that his Republican colleagues consult their consciences before voting.  As many have noted Republicans during Watergate did just this, voting against Nixon when they similarly faced an impeachment vote.  Why is no Republican doing that this time?

 

The explanation may be found in the third factor accounting for the behavior of politicians. It is this one that is perhaps the most telling in the current situation.  Politicians prefer winning over losing, and recent history suggests that the way to win, notwithstanding the losses the party suffered in 2018, is to stand with Donald Trump.  By nature politicians are cautious.  The only way to know what will succeed in winning votes is to follow the path of proven winners like Trump.  As long as he appears to be retaining the support of the GOP party base it is prudent to assume that he has figured out the magic sauce in the recipe of political victory and to follow the recipe closely.  Only a few dare to tamper with the ingredients.

 

Change is unlikely in the Republican Party short of a massive defeat.  Only in defeat do politicians, facing years in the wilderness, risk experimenting with new approaches.  Thus far there’s little sign that the party base is having second thoughts about Trump.  He remains nearly as popular among Republicans today as he was when he was elected.  Polls show his support among Republicans in states like California and Texas is north of 85 percent.  Nixon's support, by contrast, began to collapse by the time he faced impeachment.  At the beginning of 1973, before Watergate shook the country, Nixon had the support of 91 percent of GOP voters.  By the end of the year — a year in which John Dean testified about payoffs to the Watergate burglars and Special Prosecutor Archibald Cox was fired in the Saturday Night Massacre — Nixon’s support in the GOP had fallen to 54 percent.  

 

So the real question isn’t why members of Congress are remaining staunch Trump supporters, but why the GOP base is.  Many reasons have been offered for this strange phenomenon (strange because Trump is so unlikely an avatar of Republican virtue). They include Fox News, Rush Limbaugh, and the other leading cogs in the propaganda machine that props up the Republican Party.

 

Whatever the cause of Trump's hold over the GOP base, it's a fact, and we as a country need to do something about it. We have to hope that the GOP evolves into a better version of itself because, as Arthur Schlesinger Jr. observed in an article in the New York Times in 1950, this country needs two intelligent parties. Right now we've got just one.  Only the Democrats are grappling with the real problems the United States faces, among them climate change and inequality.  This is untenable over the long term.

 

Through much of our history we have had a responsible conservative party, as Schlesinger noted in his piece in the Times.  In antebellum America the party of Jefferson was cross-checked by the party of Hamilton and Adams.  In the next generation Jacksonians faced off against Whigs, and while the Whigs eventually disappeared, for decades they offered Americans like Lincoln an intelligent alternative.  In the postbellum period the GOP espoused (for a time) a bold vision of racial equality and entrepreneurial zeal.  Later it was captured by the plutocrats but by the turn of the 20th century reform elements led by Teddy Roosevelt succeeded in refashioning the party as an engine of reform.  In the 1920s the party once again became beholden to the rich until the Great Depression put an end to its control of the federal government.  For a couple of decades it nearly ceased to exist at the national level.  Then, as if in response to Schlesinger’s call, the party finally made peace with the New Deal under the leadership of Dwight Eisenhower.  “Should any political party attempt to abolish social security, unemployment insurance and eliminate labor laws and farm programs,” Ike wrote, “you would not hear of that party again in our political history.”

Under both Richard Nixon and Ronald Reagan the GOP continued to deal with real world problems, particularly in foreign affairs.  But slowly in the years following the end of the Cold War Republicans gave themselves over increasingly to fake nostrums.  They did this because they found they couldn’t win by running on their real agenda -- tax cuts for the wealthy, which constituted nearly the whole of their domestic program once welfare had been reformed in the 1990s.  

 

Trump in 2016 correctly identified several key issues that demand public attention, especially the decline and demoralization of much of rural America. But rather than offer a rational program to address this and other issues, he won election by dividing the country along racial and religious lines. Instead of appealing to the better angels of our nature he played his voters for fools. He began his time in the national political spotlight by hinting that Barack Obama was born in a foreign country and might be a secret Muslim. Later he signed up as a card-carrying member of the anti-science brigade of climate change deniers. Throughout his presidency he’s spread rumors of conspiracies. And his biggest "accomplishment"? It was giving the wealthy huge tax breaks.

 

What if the GOP doesn’t reinvent itself as a responsible party? Schlesinger worried seven decades ago that the GOP could collapse into pieces, leaving “its members prey for fascist-minded demagogues.” There was, it turns out, another possibility Schlesinger didn’t anticipate: that the party would hold together precisely by appealing to a trinity of fascist evils: xenophobia, racism, and authoritarianism. This should worry all of us.

 

 

 

Chief Justice John Roberts' Predecessors: The Supreme Court Chief Justices Who Presided Over Previous Impeachment Trials

 

As the Senate impeachment trial of Donald Trump looms, many aspects of the trial are still undetermined. Will the parties call witnesses? How long will it last? How seriously will Senate Majority leader Mitch McConnell take it? 

 

One aspect that is determined but often misunderstood is who presides over the trial. As Chief Justice John Roberts, appointed by President George W. Bush in 2005, readies himself for his historic role as the presiding judge over the trial, it is instructive to look back at the experiences of the two prior Chief Justices who presided over the trials of President Andrew Johnson in 1868 and of President Bill Clinton in 1999.

 

Salmon P. Chase, Chief Justice from 1864-1873, and William Rehnquist, Chief Justice from 1986-2005, both faced great pressures as presiding judges over highly partisan impeachment trials. Neither had an uncontroversial career, but both had the responsibility to uphold the Constitution at times of great turmoil, and both did so, though in Chase's case only after an early period of controversy.

 

Salmon P. Chase’s career reflected the realignment of political parties in the mid-nineteenth century. He was a member of the Whig Party in the 1830s, the Liberty Party in the 1840s, the Free Soil Party from 1848 to 1854, the Republican Party from its founding in 1854 to 1868, and finally the Democratic Party in the last five years of his life, while still serving as Chief Justice by appointment of Abraham Lincoln.

 

Chase helped recruit former Democratic President Martin Van Buren to run as the Free Soil Presidential candidate in 1848; helped found the Republican Party on the same principles of antislavery activism; sought the Republican nomination for President in 1860 before Lincoln was selected by the Republican National Convention; and sought the Presidency on the Democratic Party line in 1868 and the Liberal Republican line in 1872 while still serving as Chief Justice. He had a varied career as Ohio Senator (1849-1855), Governor (1856-1860), and Secretary of the Treasury under Lincoln (1861-1864).

 

Chase attempted to establish the concept of unilateral rulings on procedural matters during the early days of the trial of Andrew Johnson, but he was overruled by the Senate majority, controlled by Radical Republicans, and quickly gave up trying to control the trial. He moved toward neutrality and simply presiding as the trial moved forward after the early turmoil.

 

William H. Rehnquist could not have been more different from Salmon P. Chase in his political leanings. As far “left” as Chase was in his times, Rehnquist was far “right,” starting his political career as a legal advisor to Republican Senator Barry Goldwater of Arizona in the Senator's failed 1964 campaign for President. Rehnquist was appointed Assistant Attorney General of the Office of Legal Counsel in 1969 by President Richard Nixon.

 

Nixon nominated him for the Supreme Court in late 1971 and he was confirmed and sworn in the first week of 1972. Rehnquist served nearly 34 years on the Court and was elevated to Chief Justice in 1986 by President Ronald Reagan. He was regarded as the most conservative member of the Warren Burger Court and was one of the most consistently conservative Justices in modern times. Rehnquist recused himself from participating in the US v. Nixon case in 1974, in which the President was ordered to hand over the Watergate tapes to Special Prosecutor Leon Jaworski, leading to Nixon’s resignation on August 9, 1974.

 

Presiding over the Bill Clinton Impeachment Trial in early 1999, Rehnquist chose to resist any attempt to influence the trial promoted by the strongly conservative Republican leadership in the House of Representatives, led by Speaker of the House Newt Gingrich and House Judiciary Committee Chairman Henry Hyde. Despite his strong conservative credentials, Rehnquist managed always to get along well with his Supreme Court colleagues, and there were no controversies about his handling of the Clinton Impeachment Trial.

 

He was, despite his right-wing credentials and voting record on the Court, seen as fair-minded, approachable, and a far more unifying leader of the Court before and after the Clinton Impeachment Trial than Chase was before and after the Andrew Johnson Impeachment Trial.

 

Now, Chief Justice John Roberts, who clerked for Rehnquist in 1980-1981, is faced with the same challenge of presiding over a highly charged impeachment trial.

 

Roberts worked in the Ronald Reagan and George H. W. Bush Administrations in the Justice Department and the Office of White House Counsel, then as Principal Deputy Solicitor General, followed by private law practice before his appointment to the U.S. Court of Appeals for the D.C. Circuit by George W. Bush in 2003. In 2005, he was nominated to replace the retiring Associate Justice Sandra Day O’Connor, but before hearings could begin on the nomination, Chief Justice Rehnquist died. Roberts was then nominated to replace Rehnquist.

 

Roberts has been very clear in his desire to run a Court that has the respect and regard of the American people. While he has displayed a strong conservative judicial philosophy in his 14-plus years on the Court, he has also shown a willingness to work with the Supreme Court’s liberal bloc, and he has been seen as the “swing” vote on the Court since Associate Justice Anthony Kennedy retired in 2018.

 

He has surprised many liberal commentators with some of his votes, including the preservation of “ObamaCare.” He is seen as comparatively moderate and conciliatory, and he has been somewhat critical of utterances by President Donald Trump regarding the supposed bias of judges appointed by Presidents Bill Clinton, George W. Bush, and Barack Obama.

 

It is clear that Roberts, only the 17th person to head the Supreme Court, wants a good historical reputation. While he will work to avoid controversy in the upcoming Trump Impeachment Trial, he will also wish to preserve respect for the Constitution, democracy, and the rule of law, and he will be the center of attention in the coming weeks and months.

Roundup Top 10!  

The Job of the Academic Market

by Rebecca S. Wingo

Over three years, I dedicated 106.5 workdays to getting a job—while working another job. 

 

Prohibition Was a Failed Experiment in Moral Governance

by Annika Neklason

A repealed amendment and generations of Supreme Court rulings have left the constitutional regulation of private behavior in the past. Will it stay there?

 

 

History and the Opioid Crisis

by Jeremy Milloy

In the 1970s, just as now, people living with and recovering from substance use disorders faced prejudice and mistreatment at the hiring stage and in the workplace itself.

 

 

1619?

by Sasha Turner

What to the historian is 1619?

 

 

Boris Johnson Might Break Up the U.K. That’s a Good Thing.

by David Edgerton

It’s time to let the fantasy of the “British nation” die.

 

 

The problem with a year of celebrating the 19th Amendment

by Andrew Joseph Pegoda

Our entire understanding of the history of feminism is skewed.

 

 

Assassination as Cure: Disease Metaphors and Foreign Policy

by Sarah Swedberg

Kinzinger’s words fit within a long historical tradition of badly used disease metaphors that often accompany bad outcomes.

 

 

Another Disability Disaster in the Making

by Jonathan M. Stein

The Trump administration’s Social Security proposal would repeat one of Ronald Reagan’s most damaging mistakes.

 

 

How the President Became a Drone Operator

by Allegra Harpootlian

From Obama to Trump, the Escalation of Drone Warfare

 

 

 

What Australia’s Fires Should Teach the USA: Be Alarmist!

by Walter G. Moss

Most importantly in this 2020 election year, the Australian tragedy tells us we should vote out all the human-caused climate-change deniers and minimizers.

Stepping Back From the Brink of War

Trump’s order to kill General Soleimani is one of the most reckless acts taken by a president, who once again has put his personal political interest above the nation’s security. Certainly, Soleimani deserved to meet his bitter fate. He was behind the killing of hundreds of American soldiers in Iraq while threatening and acting against American allies. However, killing him without considering the potentially dire regional repercussions and without a strategy, under the guise of national security concerns, is hard to fathom. Republican members of Congress who praised the assassination of General Soleimani seem to be utterly blinded by their desire to see him eliminated; they seem to have no clue what will happen next. Trump, who is fighting for his political life, appears to care little about the horrifying consequences as long as the strike distracts public attention from his political woes. He made the decision to assassinate Soleimani seven months ago, but he gave the order now to serve his own self-interest, especially in this election year, when he desperately needs a victory while awaiting an impeachment trial in the Senate.

During the Senate briefing on Iran, Secretary of State Pompeo, Secretary of Defense Esper, and CIA Director Haspel produced no evidence that there was an imminent danger of an attack on four American embassies orchestrated by Soleimani, as Trump has claimed. In fact, Esper said openly in a January 12 interview that he saw no evidence. Republican Senator Mike Lee labeled it “probably the worst briefing I have seen, at least on a military issue…What I found so distressing about the briefing is one of the messages we received from the briefers was, ‘Do not debate, do not discuss the issue of the appropriateness of further military intervention against Iran,’ and that if you do ‘You will be emboldening Iran.’”

Now, having failed to produce evidence of imminent danger, the Trump administration claims that the killing of Soleimani was part of a long-term deterrence strategy. The assassination has certainly emboldened Iran’s resolve to continue its nefarious activities throughout the region, but even so, the measure Trump took to presumably make the US more secure has in fact done the complete opposite. It has created new mounting problems and multiple crises. Trump dangerously escalated the conflict with Iran; severely compromised the US’ geostrategic interest in the Middle East; intensified the Iranian threat against our allies, especially Israel; led Iran to double down in its support of terrorist and Jihadist groups; badly wounded the US’ relations with its European allies; led friends and foes alike to deem the US untrustworthy; and pushed Iran to annul much of the nuclear deal, all while it impressively advances its ballistic missile technology. And contrary to Trump’s claim that he made the right decision for the sake of American security, 55 percent of voters in a USA Today survey released on January 9th said he made the US less safe.

And now we are still at the brink of war. Although Iran has admitted to being behind the attack on the al-Asad air base in Iraq, it initiated the attack to save face in the eyes of its public and demonstrate its possession of precision missiles and willingness to stand up to the US.
This retaliation was expected, but since Iran wants to avoid an all-out war, it was strategic and carefully calculated to inflict the fewest American casualties, if any, to prevent a vicious cycle of retaliatory attacks that could get out of control and lead to a war. This, however, does not suggest that Iran will stop its clandestine proxy operations—employing its well-trained militias in Iraq, Yemen, and Syria to execute new attacks on American and allied targets in the region while maintaining deniability. Meanwhile, the clergy can pressure hawks in and outside the government to avoid any provocative acts against the US. Iran is patient and will carefully weigh its gains and losses before it takes the next step.

Following Iran’s attack on the al-Asad base, Trump has also shown restraint because he too wants to prevent an all-out war, knowing that even though the US could win it handily, it would be the costliest victory in blood and treasure and certainly in political capital.

The whole mess began when Trump withdrew from the Iran deal. What did Trump think he could accomplish? Withdrawing from the deal without having any substitute, without consulting the European signatories, and while re-imposing sanctions, especially when Iran was in full compliance with all the deal’s provisions, was dangerously reckless—undermining our national security interests and jeopardizing the security of our allies in the region. The Iran deal was not perfect, but the idea was to build on it, gradually normalize relations with Iran, and prevent it from acquiring nuclear weapons altogether as it works to become a constructive member of the community of nations.

To resolve the crisis with Iran, the US must demonstrate a clear understanding of the Iranian mindset. Iran is a proud nation with a long and continuing rich history; it has huge natural and human resources, is the leader of the Shiite world, occupies one of the most geostrategically important locations in the world, and wants to be respected. The Iranians are not impulsive; they think strategically and are patient, consistent, and determined. The revocation of the Iran deal simply reaffirms Iran’s distrust of the US, a distrust that runs from the CIA’s toppling of the Mosaddeq government in 1953 to the continuing sanctions, adversarial attitude, and open calls for regime change.

Both Khamenei and Trump have their own domestic pressures to contend with and want to avoid war. The Iranian public is becoming increasingly restive; people are back in the streets demanding immediate economic relief. For his part, Trump has calculated that further escalation of violent conflict with Iran would erode rather than enhance his political prospects and would make defeat in November all but certain. West European countries are extremely sensitive to any major escalation of violence, as it would lead to mounting casualties and destruction on all sides. Iran can resort to a wide range of hostile measures, including disrupting oil supplies from Saudi Arabia and other Gulf states and mining the Strait of Hormuz, through which 21 million barrels per day (21% of global oil consumption) pass, resulting in massive economic dislocation in the Middle East and Europe in particular.

The pause in hostilities offers a golden opportunity to begin a new process of mitigation. Germany, France, and Britain have already engaged the Iranians in an effort to ease the tension between Iran and the US and create conditions conducive to direct US-Iran negotiations.
By now, Trump must realize that Iran cannot be bullied and that the only way to prevent it from pursuing nuclear weapons is through dialogue. However flawed Trump considers the Iran deal, it still provides the foundation for a new agreement, as many of its original provisions remain valid and can be built upon. Other conflicting issues between the two sides, especially Iran’s subversive activities, should be negotiated on a separate track. In some ways, both Iran and the US need to lick their wounds and begin a new chapter, however long and arduous it may be, because war is not and will never be an option.

Trump Checks all of the Impeachment Boxes: Will it Matter?

 

In my last article, I wrote that there is something wholly different about Donald Trump’s actions from those of other presidents who have exceeded their power. Why? Unlike other presidents, Trump’s actions meet each of the requirements that the Framers laid out to impeach a president. Ironically, just as impeachment is needed most, the partisan tenor of the times may make it impossible to accomplish.

 

The Framers of the Constitution had included the power of impeachment for instances of treason, bribery, or other high crimes and misdemeanors committed by executive branch officials and the president. They had rejected policy disputes as a basis for impeachment. When George Mason proposed adding maladministration as an impeachable offense, Madison responded that “so vague a term will be equivalent to tenure during pleasure of the Senate.” It was at this point that “high crimes and misdemeanors” was added. While the first two are clear, the third sounds vague to us today. Yet the Framers had a clear idea of what this meant. As I wrote previously for the History News Network, the Framers “thought that the power of impeachment should be reserved for abuses of power, especially those that involved elections, the role of foreign interference, and actions that place personal interest above the public good,” which fall within the definition of high crimes and misdemeanors. Professor Noah Feldman of Harvard Law School, in his testimony to the House Judiciary Committee, said that for the Framers “the essential definition of high crimes and misdemeanors is the abuse of office” by a president, of using “the power of his office to gain personal advantage.” Trump’s actions with Ukraine check each of these boxes.

 

Presidents and Congress have often found themselves in conflict, dating back to the earliest days of our Republic. Part of this is inevitable, built into a constitutional system of separation of powers with overlapping functions between the two branches. Only Congress can pass laws, but presidents can veto them. Presidents can negotiate treaties but must obtain the advice and consent of the Senate. As Arthur Schlesinger Jr. observed, checks and balances also make our political system subject to inertia. The system only really works “in response to vigorous presidential leadership,” Schlesinger wrote in The Imperial Presidency. But sometimes presidents grasp for powers that fall outside of the normal types of Constitutional disputes. This is where impeachment enters the picture.

 

In the early Republic, disputes between presidents and Congress revolved around the veto power. Andrew Jackson was censured by the Senate over his veto of bank legislation and his subsequent removal of federal deposits from the Second Bank of the United States. Jackson’s actions showed a willingness to grab power and to ignore the law and the system of checks and balances when it suited his purposes. While the censure motion passed, it did not have the force of law. It was also unlikely that impeachment would have been successful, since the dispute was over policy. While the president had violated the law, not all illegal acts are impeachable. As Laurence Tribe and Joshua Matz have noted, “nearly every president has used power in illegal ways.” Impeachment is meant to be limited to those actions, like treason and bribery, “that risk grave injury to the nation.”

 

Congress considered impeaching John Tyler in 1842 over policy disputes in which he too used the veto power. Tyler is sometimes referred to as the accidental president since he assumed the presidency when William Henry Harrison died in office one month after Harrison was sworn in. Tyler had previously been a Democrat and states’-rights champion who had joined the Whig Party over disagreements with Jackson’s use of presidential power. Yet he proceeded to use the powers of the presidency to advance his own policy views, and not those of his newly adopted Whig Party, vetoing major bills favored by the Whigs, which led to an impeachment inquiry. A House committee led by John Quincy Adams issued a report that found the President had engaged in “offenses of the gravest nature” but did not recommend that Tyler be impeached.

 

Even Andrew Johnson’s impeachment largely revolved around policy disputes, albeit extremely important ones. Johnson was a pro-Union southerner who was selected by Lincoln to aid in his reelection effort in 1864. Johnson was clearly a racist, a member of the lowest rung of southern white society, those who felt their social position was threatened by the advancement of blacks, a view that shaped Johnson’s policies on Reconstruction. While the Civil War ended slavery, it did not end the discussion of race and who can be an American. Johnson, much like Trump, represented those who believed that America was a typical nation, made up of one racial group. “I am for a white man’s government in America,” he said during the war. On the other hand, the Radical Republicans believed that America was essentially a nation dedicated to liberty and equality, and they set out to fulfill the promise of the American Creed for all American men, black as well as white. This was the underlying tension during the period of Reconstruction. “Johnson preferred what he called Restoration to Reconstruction, welcoming the white citizenry in the South back into the Union at the expense of the freed blacks,” Joseph Ellis writes.

 

But Johnson was ultimately impeached on false pretenses, not for his policy disputes with Congress, but due to his violation of the Tenure of Office Act, which some have characterized as an impeachment trap. The act denied the president the power to remove Senate-confirmed executive branch officials without the Senate’s approval. While the House impeached Johnson, he escaped removal by one vote in the Senate. Johnson’s sin was egregious; he had violated one of the core tenets of the American Creed, that all are created equal. Yet this did not rise to the level of an impeachable offense that warranted the removal of a president. It was a policy dispute, one that went to the core of who we are as Americans, but it was not a high crime or misdemeanor.

 

It would be over one hundred years before impeachment would be considered again, this time in the case of Richard Nixon. Like Trump, Nixon abused the power of his office to advance his own reelection. Both men shared a sense that the president has unlimited power. Nixon famously told David Frost that “when the president does it, that means it’s not illegal,” while Trump has claimed that under Article II “I have the right to do whatever I want.” But Nixon, unlike Trump, did not elicit foreign interference in his reelection effort. The two men also shared certain personality traits that led to the problems they experienced in office. As the political scientist James David Barber wrote in his book The Presidential Character, Richard Nixon was an active-negative president, one who had “a persistent problem in managing his aggressive feelings” and who attempted to “achieve and hold power” at any cost. Trump too fits this pattern, sharing with Nixon a predilection toward “singlehanded decision making,” a leader who thrives on conflict.

 

Indeed, Nixon would likely have gotten away with Watergate except for the tapes that documented in detail his role in covering up the “third rate burglary” of the Democratic Party headquarters on June 17, 1972. Paranoid about possibly losing another election, Nixon had directed his staff to use “a dirty tricks campaign linked to his reelection bid in 1972,” presidential historian Timothy Naftali has written. When the break-in was discovered, Nixon then engaged in a systematic cover-up, going so far as to tell the CIA to get the FBI to back off the investigation of the break-in on bogus national security grounds.

 

Much like Trump, Nixon stonewalled the various investigations into his actions, what Naftali calls “deceptive cooperation.” He had the Watergate Special Prosecutor, Archibald Cox, fired in October 1973 in order to conceal the tapes, knowing his presidency was over once they were revealed. In the aftermath of Cox’s firing during the so-called Saturday Night Massacre, Nixon refused to release the tapes to the House’s impeachment inquiry. Instead, he provided a transcript he had personally edited that was highly misleading. The final brick in Nixon’s wall of obstruction was removed when the Supreme Court unanimously ruled in July 1974 that Nixon had to release the tapes, which he complied with. One wonders if Trump would do the same.  

One difference with the Trump case is that there was a degree of bipartisanship during the Nixon impeachment process. By the early summer of 1974, cracks had begun to appear in Republican support for Nixon. Unlike today, there were still moderate Republicans who were appalled by Nixon’s actions and had become convinced that the president had engineered a cover-up. Peter Rodino, the Democratic chairman of the House Judiciary Committee, had bent over backwards to appeal to the moderate Republicans and to Southern Democrats, among whom Nixon was popular. Despite strong pressure from the leadership of the GOP in the House, it was this group that ultimately drew up the articles of impeachment.

 

Still, a large number of Republicans in the House continued to stick with the president until the tapes were finally released. It was at this point that even die-hard Nixon supporters deserted him, when it became apparent that Nixon had been lying all along and had committed a crime. Nixon’s case shows both the importance of bipartisanship in the impeachment process and how difficult it is for members of the president’s party to turn on him. In August of 1974, Nixon resigned when confronted by a group of Senators and House members, led by conservative Senator Barry Goldwater.

 

The impeachment of Bill Clinton is the anomaly, since it was not about policy (as in Johnson’s case) or the abuse of power (in Nixon’s case). Rather it emerged in part due to a character flaw. Clinton could not restrain himself when it came to women. 

 

The facts of the case are well known. While president, Clinton had an illicit sexual encounter with Monica Lewinsky in the Oval Office. He then proceeded to lie about it, both to the country and also during a deposition in the Paula Jones case, and attempted to cover up the affair. Kenneth Starr, who had been appointed as Independent Counsel to investigate the Whitewater matter, a failed land deal in which the Clintons lost money but did nothing wrong, then turned his investigation to the president’s actions with Lewinsky and recommended that the House of Representatives consider impeachment proceedings for perjury and obstruction of justice.

 

By this point, Clinton had admitted he had lied to the country and apologized for his actions. The House had the opportunity to censure Clinton, but Tom DeLay, one of the Republican leaders, buried that attempt, even in the aftermath of the midterm elections when Democrats gained seats, clearly pointing to public opposition to impeachment of the president, whose approval ratings were going up. While the House voted for impeachment largely along partisan lines, the Senate easily acquitted Clinton on a bipartisan basis. Clinton’s actions, while “indefensible, outrageous, unforgiveable, shameless,” as his own attorney described them, did not rise to the level the Framers had established for impeachment.

 

Clinton’s impeachment in the House was largely a product of partisan politics that were out of control. As Laurence Tribe and Joshua Matz have written, “starting in the mid-1990s and continuing through the present, we’ve seen the creeping emergence of a permanent impeachment campaign.” Both George W. Bush and Barack Obama faced impeachment movements during their terms in office over issues that in the past no one would have considered impeachable. During the 2016 election, both Hillary Clinton and Donald Trump lobbed accusations that the other would face impeachment if elected. The current toxic political environment raises the issue of whether a bipartisan impeachment effort has any chance at all, especially when the two sides cannot even agree over basic facts. Nancy Pelosi was right to hold off the movement to impeach Trump prior to the Ukrainian matter, but now that we have such a clear abuse of power, what is the alternative? At this moment, when the tool of impeachment is most needed for a president who meets all of the criteria laid out by the Framers, the process itself has become captive to extreme partisanship by the Republicans.

 

The ball is now in the Senate’s court, where the trial will soon occur. While conviction and removal from office are highly unlikely short of additional corroborating evidence (which Mitch McConnell has been attempting to squelch), perhaps the Senate can find the will to issue a bipartisan censure motion that condemns the president and issues a warning that another similar abuse of power will result in removal. Ultimately, the voters will likely decide Trump’s fate come November. We can only hope they choose correctly.

Annual Jewish Film Festival, Following New Wave of Anti-Semitism, Offers Hope and Inspiration

Early last month, four people were killed at a Jewish food store next to a synagogue in Jersey City, N.J. (two blocks from the building in which I work). A few days later, a man wielding a machete stabbed and badly injured five Jews praying in the home of a rabbi in Monsey, New York, about 40 miles from New York City. Since then, several swastikas have been painted on buildings in various cities. These incidents are all part of a growing wave of anti-Semitism in America. Anti-Semitic crimes in Chicago, New York and Los Angeles were at their highest levels in 18 years in 2019. Hate crimes in Los Angeles, a category expanded by police to include swastikas on any religious property, doubled in 2019 over the previous year. New York City counted 229 anti-Semitic crimes in the past year, a new record and a significant increase over the year before. The Anti-Defamation League said 2019 showed the third highest anti-Semitic crime total in the entire history of the organization.

 

On Wednesday, the 29th annual New York Jewish Film Festival, a two-week (January 15-28) cinematic celebration of Jewish life, kicks off at Lincoln Center’s Walter Reade Theater in New York, offering hope and inspiration not just to Jews, but to everybody.

 

Given the attacks on Jews all over the country, the Jewish Film Festival, one of the oldest in the United States, could not have come at a better time.

 

Aviva Weintraub, the executive director of the festival that is sponsored by the New York Jewish Museum and the Film Society of Lincoln Center, said what has been happening against Jews in the nation over the last two months is “horrifying.” She said the goal of the festival each year is to “bring Jews together with each other and others” and said she is hopeful that will happen again this year. 

    

Discrimination and persecution, of course, are no strangers to Jews, and the selection of films in the festival reflects that.

 

The film festival starts with the upbeat Aulcie, a sports film about how basketball star Aulcie Perry was spotted playing in a New York City playground tournament by a scout for Israel’s Maccabi Tel Aviv basketball team in 1976. He was signed and, despite personal problems, helped the Maccabi team win two European championships. He later converted to Judaism and became an Israeli citizen.

 

The centerpiece of the film festival is the screening of the award-winning 1970 film The Garden of the Finzi-Continis, director Vittorio De Sica’s movie about the struggles of the Jews in World War II-era Italy, now celebrating its 50th anniversary. That Holocaust-era film is joined by a new documentary, Four Winters: A Story of Jewish Partisan Resistance and Bravery in WW II, which tells the story of Jewish resistance to the Nazis throughout World War II in different countries.

 

“We chose The Garden of the Finzi-Continis because of its anniversary, but also because it is such a great film about the struggle of the Jews against the Nazis and because it is a beautiful and moving story,” said Ms. Weintraub. The film won 26 international awards in 1970, plus the Oscar for Best Foreign Language Film.

 

Executive director Weintraub is equally proud of Four Winters. “The strength of the film is not just its story, but the inspiring story of each of the men and women, seniors now, who survived the Holocaust and, in the movie, describe what happened. It is stirring,” said Ms. Weintraub. “You cannot see that documentary and not be moved by it.”

 

She said Four Winters is a factual and inspirational story of years of resistance against the Nazi regime. “It’s amazing to realize that the Jews and others resisted for that long,” she said.

 

The festival has always been popular. Weintraub chuckles when she thinks back on different years of the Festival. “We have hordes of people who literally camp out at Lincoln Center to catch as many films in the festival as they can,” she said. “It’s not surprising for someone to see several films. Many people come to Lincoln Center on their way home from work, or between shopping trips on weekends,” she said.

 

She and two others spend about a year winnowing down the films to 31 or 32 for each festival. “We look for films that represent history, politics and Jewish life. Each year the mix of movies is different,” she added.

 

Some movies in this year’s festival depict the Holocaust. There is Birch Tree Meadow, a 2003 film that tells the story of a concentration camp survivor who returns to the camp years later to confront memory and the descendant of a Nazi guard.

 

An Irrepressible Woman is the story of French Prime Minister Leon Blum, imprisoned at Buchenwald in the 1940s, and his love, Jeanne Reichenbach, who fell in love with him as a teenager and risks her life to find him again.

 

There are cultural tales. The 1919 silent film Broken Barriers was the first film to tell some of the old Sholom Aleichem stories that much later became the world-famous play and movie Fiddler on the Roof.

 

Incitement is the complicated story of the lead-up to the highly publicized assassination of Israeli Prime Minister Yitzhak Rabin in 1995. It tracks not only the murder, but the politics in the nation at the time.

 

God of the Piano is the story of a woman whose father forces her to meet high expectations as a pianist. When she grows up, she places those same high expectations on her son, but he is deaf. The story becomes one of larger family conflict.

 

The festival closes on a high note with the unification film Crescendo, the story of how music conductor Eduard Sporck took over a joint Israeli-Palestinian youth orchestra. At first, he saw his job as simply getting all of the musicians to produce beautiful music, but he soon realized the harder, and more rewarding, job was to get the children from the two opposing political sides to set aside personal differences and work together as a smoothly running musical group.

Painter of Disquiet: Félix Vallotton at the Metropolitan Museum of Art

La Chambre rouge (1898)

 

The Metropolitan Museum of Art is currently presenting the work of Félix Vallotton, an artist who has been largely neglected relative to his contemporaries, such as Pierre Bonnard and Édouard Vuillard. This makes the present exhibition all the more welcome, and fascinating. Vallotton’s work unquestionably merits the renewed attention — his paintings possess a mysterious quality, narrative appeal, and attention to detail, as well as a delicious sense of irony and wit.

Born in the Swiss town of Lausanne on the shores of Lake Geneva in 1865, Vallotton displayed early an ability to draw from life, and at sixteen he arrived in Paris to study painting at the Académie Julian. A self-portrait at the age of twenty reveals a virtuosic, confident brush and much more — Vallotton is not interested in merely demonstrating his facility as a realist painter: he has a penetrating eye, a psychological depth, and a naturalism that owes much to northern Renaissance masters, especially Albrecht Dürer and Hans Holbein.

It is not entirely clear when Vallotton’s friendship with Les Nabis began. Taking their name from the Hebrew word for prophet, the Nabis were an avant-garde group of Parisian artists that included Bonnard, Vuillard, Charles Cottet, and Ker-Xavier Roussel, among others. Vallotton’s nickname within the group may be revealing — he was the ‘foreigner Nabi’. Perhaps this name reflected his Swiss origin, but the Nabis were an international group anyhow. It could also reflect in some measure that Vallotton was something of a loner; or it may allude to the fact that while for a time he adopted the Nabis’ dismissal of naturalism in favor of flat forms and the expressive use of color, Vallotton was and fundamentally remained a realist painter.

Recognition arrived early in Vallotton’s career for reviving the art of woodcut prints, which had largely been forgotten since the Renaissance. Indeed, by the age of 25, Vallotton had single-handedly brought about a kind of revolution in xylography. Inspired by Japanese woodcuts that were popular in Paris at the time, Vallotton produced images with sharp contrasts of jet black and pristine white. His woodcuts are remarkable for their commentary on the French bourgeoisie, their stinging rebuke of societal decadence in fin-de-siècle Paris, and their critical orientation toward the police. Vallotton has an eye for violence — be it in the form of murder, the sudden carriage accident, the political execution or the suicide. He combines a dark realism with sophisticated satire, wry humor, and a keen acerbic wit. He has an eye for the ambiguous and the enigmatic — a deft and subtle touch that defies finalization.

The Demonstration from 1893 renders the political chaos of the day with humor — from the old man who has lost hold of his hat to the sole figure in white, a woman racing along the left-hand side. Vallotton would return in both woodcuts and paintings to the scene of the bourgeoisie shopping for the latest luxury goods at the Bon Marché department store — the first store of its kind in the world. The 1898 triptych is not without a certain irony, given that the format Vallotton chose was traditionally associated with altarpieces such as would be found in church.
All of which is just to underscore that Vallotton was a close observer of modernity, fascinated by the new world he saw emerging around him — with its rampant consumerism and its technological novelties (such as a moving conveyor belt along a footbridge, which he includes in his woodcut series devoted to the World’s Fair of 1900).

A series of ten woodcuts entitled Intimités is a sustained and biting critique, and among Vallotton’s greatest achievements as a graphic artist. In this unsettling and disquieting series, which lays bare the hypocrisies of bourgeois society, Vallotton deftly exposes a decadent class through scenes of adultery, deceit, romantic quarrels and indecent proposals.

In 1899, Vallotton married Gabrielle Rodrigues-Henriques, and the union was to have a significant effect on the remainder of his career. His wife was a wealthy widow and the daughter of a Paris art merchant, which meant that he now enjoyed a certain financial security and could turn exclusively to painting. While Vallotton’s work generally lost its satirical wit and subversive edge, a certain psychological insight and a marked turn toward the inwardness of his subjects constitute much of the power of this later period.

The acquisition of a Kodak camera in 1899 led to changes in the way the artist worked. Now he would typically take snapshots of imagery that appealed to him – and then use those photographs to craft his paintings in the studio. Often he would retain the sharply contrasting patterns of light and shadow revealed in the small photograph. The painter, however, was by no means subservient to the photographic image, as The Red Room, Etretat (1899) demonstrates. It is a remarkable painting for its unity of composition and psychological structure. All lines in the painting essentially point to (and up to) the seated figure of Gabrielle Vallotton – even the viewer is made to feel they’re looking up to this woman, who meanwhile looks down at the small child in the foreground. This child, so crucial to the psychological depth of the painting, is entirely absent from the photograph.

Vallotton was also a master of ambiguity. There is always something more to the story he is telling that must ever remain just beyond our reach. Consider, for example, one of this exhibition’s finest offerings, The White and the Black (1913), a provocative work depicting a young white woman, nude and reclining, and a black woman, seated and coolly smoking a cigarette. The painting may be a response to Édouard Manet’s Olympia (1863), in which a white model, likely a prostitute, lies on a bed attended by a black servant bringing her flowers (probably from a client). But Vallotton may also be in dialogue with Jean-Auguste-Dominique Ingres’ Grande Odalisque (1814) and Odalisque with a Slave (1839). All of his friends attest to Vallotton’s love and admiration for Ingres, by whom he was “conquered without offering any resistance,” as one contemporary put it. Vallotton was clearly a close observer of Manet as well, and many of his paintings – for example, Misia at Her Dressing Table (1898) – emphasize the harsh light, large color surfaces and shallow depth that were characteristic of Manet’s work, including Olympia. At the same time, Vallotton subverts the traditional roles of mistress and servant by making the relationship between these two women utterly ambiguous.
The Provincial (1909) is notable for its exploration of one of Vallotton’s recurring themes – namely, the complex, uneven relationship between men and women. In this painting, and others such as Chaste Suzanne (1922), a powerful female figure holds sway over her subservient male counterpart. The white lace of her blouse protrudes in a witty but subtle reminder of her breasts, which only underscores her sexual dominance over the docile man who sits beside her with eyes deferentially lowered.

Moonlight (1895) is a standout work that reveals the influence of the Symbolists in Vallotton’s attention to emotional force over actual topographical representation – water, earth and sky have become interfused in what is almost an abstraction. The picture also anticipates the latter part of his career, when Vallotton increasingly turned to landscape painting – often beginning with a photographic image or an on-site sketch, which was then imaginatively reconstructed on the canvas. The painter referred to his later landscapes as paysages composés (composed landscapes) and remarked in 1906, “I dream of a painting free from any literal respect for nature.” Vallotton said he wanted to “be able to recreate landscapes only with the help of the emotion they have provoked in me.” His method allowed him to simplify the compositional structure, intensify the mood and emphasize the emotional impact of color – as, for example, in Last Rays (1911), where we find the silhouettes of umbrella pines as they receive the last light of the evening sun.

Félix Vallotton more than deserves the attention that this exhibition brings. His work – from the groundbreaking forays into xylography, to his portraits and scenes of bourgeois society, to his hauntingly mesmerizing landscapes – defies identification with any artistic school. Like all truly great artists, Vallotton is inimitable; while he experiments with various artistic programs, he ultimately remains aloof from them, determined to paint pictures purged of all sentimentality, beholden only to the emotional, psychological or social reality with which he is concerned. Such a body of work remains ever fresh and vital, and rewards close attention with a glimpse of truth.

Depression Era Tenor Hits All the High Notes

 

It is 1934, at the height of the Depression, and an opera company in Cleveland is trying to make enough money to stay in business. The answer, its officials believe, is to bring in fabled Italian tenor Tito Merelli for a one-night concert. Merelli’s appearance is highly publicized and the theater is sold out. The opera company has a stage set. It has an orchestra. It has an audience. But it has no Merelli. He is missing.

 

In late afternoon, just a few hours before the performance, the distraught opera company’s manager takes a bold step. He will have Max, his assistant and an amateur tenor, dress as the clown Pagliacci and do the show as Merelli. With all the makeup, a man in disguise with a tenor’s voice, and roughly the correct height and weight, who would know? What could possibly go wrong?

 

That’s where everything starts to collapse in Lend Me a Tenor, a play set in 1934 that opened last weekend at the Westchester Broadway Theater in Elmsford, N.Y. It is a wacky, crazy, upside-down play by Ken Ludwig that has the audience roaring with laughter. This tenor can hit all the high notes and makes everybody laugh, too. He is his own opera company – if he can be found.

 

The fraudulent singer plan is in place, Max is ready for his fifteen minutes of fame and the audience is waiting, unaware of the deception. Then, unannounced, Tito Merelli arrives, as flamboyant as flamboyant can be, with his overly emotional, arm-waving wife, Maria, who is always angry about something. He is ready to go on. Max is ready to go on. Who will go on?

 

And then…………well, see the play.

 

Lend Me A Tenor is not only a hilarious show, well directed by Harry Bouvy, but a neat look back at how opera singers and other highly regarded performers toured the country in that era. Merelli lived in a time when performers left their homes in the U.S. or in Europe and went on tours of America, appearing in different cities at different times of the year. The play captures the physical details of the traveling star’s Depression-era life well.

 

Star tours were very common in the Depression, despite the sagging national economy, which hurt theaters and opera houses as badly as it hurt everything else. Bringing in a star for a one-night performance usually worked not only to put money in the opera house’s bank account, but also to garner enormous attention for the opera company, which translated into ticket sales for the company’s regular operas. Marian Anderson, the great singer, toured the U.S. almost every year in the late 1930s, sometimes taking tours that lasted several months and included up to eighty concerts. The big bands of the era - Duke Ellington, Glenn Miller, the Dorseys - did the same thing, hitting the road early in the year and living out of suitcases for most of it. The casts of Broadway shows also traveled in that manner. There were always logistical problems - train breakdowns, snowstorms, ticket mix-ups - but, in general, the tours worked well.

 

Except for Tito Merelli.

 

He and his wife have enormous problems to combat when they finally do arrive in Cleveland (Tito’s main problem is his firecracker wife). How do you un-vanish? How does Max, bursting for a chance in the musical spotlight, cope with the fact that Tito is, in fact, in Cleveland? Or is he? Is anybody really in Cleveland?

 

In Lend Me A Tenor, the Cleveland Opera Company has rented a palatial two-room suite for Merelli and his wife, an adoring bellhop brings up their bags and the opera company’s manager treats him like visiting royalty. All of this is historically accurate. Performers arrived by plane or train, mostly, although some went by car. They ate well in the hotel’s restaurant, were given tours of the area and introduced to dozens of artistic and municipal officials. Newspaper ads for Merelli would have been taken out for a week or more prior to the show. Hundreds of large cardboard broadsides would have been put in store windows or nailed to trees and telephone poles. There might even have been a Tito Merelli Day (that is, if they could find Tito).

 

Everything that show business people did to arrange an overnight stay and performance for a star like Merelli in 1934 was done year after year and became the backbone of the American tour. It is the same way today, although social media, television, emails and iPhones have modernized the tour. Tito Merelli would love the contemporary tour, if he could find it.

 

Director Bouvy, who pays such careful attention to the history of the play, has done a superb job with a talented cast of actors. He lets them all shine as the eccentric, egomaniacal show biz people we all know and love. The two stars of the show are Tito, played wonderfully and with great style by Joey Sorge, and Max, played equally well by J.D. Daw. In addition to them, Bouvy gets fine work from Molly McCaskill as Maggie, Max’s girlfriend, Philip Hoffman as the opera manager, Kathy Voytko as Tito’s wife, Tregoney Sheperd as the doughty opera company board chair, Hannah Jane McMurray as a soprano, Sam Seferian as the bellhop and John Savatore as an assistant stage manager.

 

PRODUCTION: The play is produced by Westchester Broadway Theater. Sets: Steve Loftus. Costumes: Keith Nielsen. Lighting: Andrew Gmoser. Sound: Mark Zuckerman. The play is directed by Harry Bouvy. It runs through January 26.

Can America Recapture Its Signature Exuberance?

 

As the news cycle churns on with impeachment coverage, pundits and politicians are quick to remind us that the Constitution is all that stands between us and the whims and dangers of authoritarian rule. It’s a good point, but incomplete. America is a country of law and legend and our founding document, essential as it is, won’t save us if we don’t also buy into a binding story about the point and purpose of our democracy.

 

That’s a tall order in today’s world. To author Lee Siegel, hacking America’s dour realities is like scaling a rock face. In his recent essay “Why Is America So Depressed?” he suggests we reach for the metaphorical pitons—after “the iron spikes mountain climbers drive into rock to ascend, sometimes hand over hand”—to get a humanistic grip amid our “bitter social antagonisms” and the problems of gun violence, climate crisis and social inequities that have contributed to an alarming rise in rates of anxiety, depression and suicide. 

 

“Work is a piton,” says Siegel. “The enjoyment of art is a piton. Showing kindness to another person is a piton.”

 

Thanks to Siegel, I now think of the 200th anniversary of poet Walt Whitman’s birth, marked in the year just ended, as a piton for the uplift it gave me. As Ed Simon, a staff writer for The Millions and contributing editor at HNN, declares in another fine essay published New Year’s week, Whitman is “our greatest poet and prophet of democracy.” We’d do well to enlist the bard’s ideas in 2020, he argues, to steer ourselves through “warring ideologies” and to offset “the compelling, if nihilistic, story that authoritarians have told about nationality.”

 

I would simply add this: Whitman is also the godfather of our national exuberance, a champion of the country’s heaving potential and can-do spirit—traits that, until recently at least, could win grudging regard from some of America’s fiercest critics. Despite his intermittent bouts of depression, Whitman tapped into the energy and brashness of our democratic experiment like no other. His verse hovers drone-like above the landscape, sweeping the manifold particulars of American life into a rapturous, cosmic frame.

 

Whitman did some of his most inspiring work in the worst of times. He published the first edition of his collection “Leaves of Grass” in 1855 as the country drifted toward civil war. His Homeric imprint, inclusive for its time, invited individuals from varied walks of life (“Workmen and Workwomen!…. I will be even with you and you shall be even with me.”) to splice up with the democratic push to explore contours of the American collective.

 

My first brush with Whitman’s high spirits came when I was an undergraduate in the late 1960s. Like campuses across the country, the University of Washington was riven by clashing dogmas and muscular protests against the Vietnam War. Like not a few working-class kids brown-bagging it from their family homes, I rode the fence politically, not knowing how to jump. As a citizen, I was offended by club-wielding riot police entering campus in force; as a student of Asian history, I was repulsed watching radical activists trash an Asian studies library, flinging rare old texts out windows onto the grass. Cultural revolution was a tough business.

 

Lucky for me, a generous English professor, Richard Baldwin, supplied some useful ballast. Sensing my fascination with Emerson, Thoreau and Dickinson, he offered to help me sort out the Transcendentalists during office hours. But it was Whitman who bulled his way forward. His booming energy, delivered in his self-proclaimed “barbaric yawp,” turned me into a forever fanboy.

 

I loved how Whitman challenged the thou-shalt-nots of established order in “Song of Myself,” joining the role and responsibilities of the individual to humanity’s common lot:

 

Unscrew the locks from the doors!

Unscrew the doors themselves from their jambs!

 

Whoever degrades another degrades me,

And whatever is done or said returns at last to me.

 

I loved his fervent embrace of democracy in “Democratic Vistas,” his 1871 dispatch to a divided nation:

 

Did you, too, O friend, suppose democracy was only for elections, for politics, and for a party name? I say democracy is only of use there that it may pass on and come to its flower and fruits in manners, in the highest forms of interaction between men, and their beliefs - in religion, literature, colleges, and schools- democracy in all public and private life … .

 

I loved his zest for the many-parted American experience, for linking the natural environment to the soul of the country, as he proclaimed his outsized poetic ambition in “Starting from Paumanok”:

 

… After roaming many lands, lover of populous pavements,

Dweller in Mannahatta my city, or on southern savannas,

Or a soldier camp'd or carrying my knapsack and gun, or a miner in California,

Or rude in my home in Dakota's woods, my diet meat, my drink from the spring,

Or withdrawn to muse and meditate in some deep recess,

Far from the clank of crowds intervals passing rapt and happy,

Aware of the fresh free giver the flowing Missouri, aware of mighty Niagara,

Aware of the buffalo herds grazing the plains, the hirsute and strong-breasted bull,

Of earth, rocks, Fifth-month flowers experienced, stars, rain, snow, my amaze….

Solitary, singing in the West, I strike up for a New World.

 

Whitman had his flaws, to be sure. Despite cheerleading for a big-tent America, he also shared the prejudices of his time, blotting his copybook, for example, with ugly remarks about African Americans. By 1969, when I hungered for a healing narrative, the bard’s prescriptions were undergoing reappraisal in light of righteous and long-overlooked storylines offered up by the civil rights and antiwar movements.

 

Still, Whitman has kept his place in the chain of title to America’s resilience. He conveys a panoramic confidence that we are greater as a people than our contemporary realities suggest; that if we lose the thread, we’ll find it again and reweave our story into a new and improved version. As Whitman says in the preface to the 1855 edition of “Leaves of Grass,” people look to the poet “to indicate the path between reality and their souls.”

 

Thomas Jefferson saw utility in the cohesive national story. In her 2018 book “The Death of Truth,” Michiko Kakutani writes that Jefferson “spoke in his inaugural address of the young country uniting ‘in common efforts for the common good.’ A common purpose and a shared sense of reality mattered because they bound the disparate states and regions together, and they remain essential for conducting a national conversation.”

 

Today, Kakutani argues, we’re mired in “a kind of homegrown nihilism … partly a by-product of disillusion with a grossly dysfunctional political system that runs on partisan warfare; partly a sense of dislocation in a world reeling from technological change, globalization, and data overload; and partly a reflection of dwindling hopes among the middle class that the basic promises of the American Dream … were achievable … .” 

 

Whitman understood that transcending the national mood is an uphill climb. Periods of division and strife are baked into our democracy, part of the yin and yang, the theory goes, that sorts new realities into a renovated sense of purpose. Yet periods of upheaval must lead to a refitting, not an obliteration, of our common story, or democracy is toast.

 

Do Americans still have the chops to reimagine the song of ourselves? 

 

Lord knows, we’ve had the chops. In his 1995 book “The End of Education,” media critic Neil Postman writes: “Our genius lies in our capacity to make meaning through the creation of narratives that give point to our labors, exalt our history, elucidate the present, and give direction to our future.” But we do have to mobilize the genius.

 

That’s easier said than done in a time brimming with distraction, distress and division. It’s hard to say when or if we’ll devise a story with the oomph necessary to yank us up and over the walls of our gated communities of the mind—toward efforts that will make society more inclusive, the economy more equitable and life on the planet more sustainable. 

 

In the meantime, three cheers for the ebullient currents of American life Walt Whitman chronicled. On our best days, they can lift us above our brattish, self-defeating ways.

Cinema Paradiso: The Academy Museum of Motion Pictures Will Be The Home of Movies, Past and Present


 

Acclaimed film director Martin Scorsese was recently in the news for asserting that comic book movies are not “cinema.” In light of his upcoming historical film depicting the disappearance of Teamsters leader Jimmy Hoffa, Scorsese expanded on his claim that the massive push for superhero movies has affected young people’s understanding of history, saying “they perceive even the concept of what history is supposed to be [differently].”

 

Whether it promotes a better understanding of history or not, film itself has remained a major cultural influence around the world for over a century now. Its larger historical impact is hard to measure. Soon, however, the first large-scale museum in the country solely dedicated to the history of motion pictures is set to open and attempt to do just that. 

 

The Academy Museum of Motion Pictures will be located at the corner of Wilshire Boulevard and Fairfax Avenue in Los Angeles. According to Jessica Niebel, the Exhibitions Curator for the museum, “the Academy Museum of Motion Pictures will be the world’s premier institution devoted to the art and science of movies and moviemaking.” The museum is on track to open sometime in 2020 after several delays in construction.

 

The purpose of the museum is to encapsulate how film has changed over time. Moving pictures can be traced all the way back to the 19th century; the Lumière brothers held their first public screenings in 1895, and their roughly 50-second “Arrival of a Train” reportedly caused an uproar among audiences, who had never even conceived of what a moving picture could be. Since then film has grown as an art form, which is something that the Academy Museum aims to capture. Niebel hopes that the museum’s programs and exhibitions will allow visitors to experience “how movies evolved and are made, and highlight artists, craftspeople, designers and technicians who make the movies possible.”

 

The 20th century was integral to the growth of film as an art form. Though the medium was new, film, Niebel explains, “being the most democratic artform of the 20th century as it was available, affordable and attractive to the masses, had a very strong connect to cultural histories everywhere.” Hollywood’s growth as an industry was soon followed by the rapid growth of film industries in India with “Bollywood,” and later in Nigeria with “Nollywood.” The Academy Museum plans to showcase international film in its first major temporary exhibition, “an unprecedented retrospective on Hayao Miyazaki, which will be the first major exhibition in the United States of the work of the legendary Japanese filmmaker, organized in collaboration with Studio Ghibli.”

 

Visitors to the museum can expect a wide variety of programs, including film programs, exhibitions, public programs, publications, and education programs. The six-story museum itself will be a resource for film education, featuring “more than 50,000 square feet of exhibition galleries, a state-of-the-art education studio, two film and performance theaters, a 34-foot high, double-height gallery for cutting-edge temporary installations, a restaurant and café, and public and special event spaces.” Some programming may involve hosting industry professionals and guest speakers, who can offer insight into their experience with film and an insider’s look at how collaborative a process filmmaking really is. A recent New Yorker piece detailed how the American movie industry took off in the 20th century with the help of many groups that don’t get on-screen credit, especially women. Museum programming hopes to address that.

 

An official grand opening date has not yet been announced, though it has been a long time coming. Plans for the construction of the museum were announced in 2012; the project has seen significant delays in the seven years since. The Academy has chalked the delays up to the sheer scale of the undertaking, which includes renovating a 1939 LA landmark (the May Company building), building a new spherical structure that includes a 1,500-panel glass dome, and joining the two together. In a statement, the Academy said that “we have always chosen the path that would enhance the structure, even if that meant construction would take more time to complete,” and “we are weighing the overall schedule for major industry events in 2020, and on this basis will choose the optimal moment for our official opening.”

 

Once it finally opens, the museum will be the first of its kind in the US. As such, it has been very important for planners like Niebel to create an experience that is altogether unique, with an eye to the future. That involves screening films in the formats they were intended to be seen in, while also providing exhibitions that complement the screenings. Education will be an emphasis, as Niebel explained: “Film exhibitions cannot recreate the cinematic experience but translate it into another medium, that of the exhibition.” The Academy Museum has differentiated itself from other museums in that regard. History and art museums typically focus on either education or visual display. Niebel maintains that the Academy Museum will be able to address both, because “film exhibitions are a ‘genre’ of their own in that they combine artifacts, film clips, design and immersive experiences to achieve not only educational aspects or aesthetic impressions, but wholistic experiences.”

 

The museum will have the advantage of access to the Academy of Motion Pictures Arts and Sciences Archive. The archive has been acquiring film since 1929 and currently holds over 190,000 materials, including many of the films nominated for Oscars in all categories and all of the award-winning films for Best Picture. Niebel confirmed that the museum will “draw on the unique intellectual and material resources of the Academy of Motion Picture Arts and Sciences.” It should be an interesting landmark for historians and movie buffs alike. Film has always been a kind of public history, a reflection of society and culture. The Academy Museum of Motion Pictures, once it opens, may capture that. Score one for Martin Scorsese. 

Why The West Is Losing The Fight For Democracy

 

Adapted from The Light That Failed by Ivan Krastev and Stephen Holmes, published by Pegasus Books. Reprinted with permission. All other rights reserved.

 

The future was better yesterday. We used to believe that the year 1989 divided ‘the past from the future almost as clearly as the Berlin wall divided the East from the West.’ We had ‘trouble imagining a world that is radically better than our own, or a future that is not essentially democratic and capitalist.’ That is not the way we think today. Most of us now have trouble imagining a future, even in the West, that remains securely democratic and liberal.

 

When the Cold War ended, hopes for liberal capitalist democracy spreading globally were high. The geopolitical stage seemed set for a performance not unlike George Bernard Shaw’s Pygmalion, an optimistic and didactic play in which a professor of phonetics, over a short period of time, succeeds in teaching a poor flower girl to speak like the Queen and feel at home in polite company.

 

Having prematurely celebrated the integration of the East into the West, interested observers eventually realized that the spectacle before them was not playing out as expected. It was as if, instead of watching a performance of Pygmalion, the world ended up with a theatrical adaptation of Mary Shelley’s Frankenstein, a pessimistic and didactic novel about a man who decided to play God by assembling replicas of human body parts into a humanoid creature. The defective monster felt doomed to loneliness, invisibility and rejection. And envying the unattainable happiness of its creator, it turned violently against the latter’s friends and family, laying their world to waste, leaving only remorse and heartbreak as legacies of a misguided experiment in human self-duplication.

 

So, how did liberalism end up the victim of its heralded success in the Cold War? Superficially, the fault lay with a series of profoundly destabilizing political events: the 9/11 attack on the World Trade Center in New York, the second Iraq War, the 2008 financial crisis, Russia’s annexation of Crimea and intervention in Eastern Ukraine, the impotence of the West as Syria descended into a humanitarian nightmare, the 2015 migration crisis in Europe, the Brexit referendum, and the election of Donald Trump. Liberal democracy’s post-Cold War afterglow has also been dimmed by the Chinese economic miracle, orchestrated by a political leadership that is unapologetically neither liberal nor democratic. Attempts to salvage the good name of liberal democracy by contrasting it favourably with non-Western autocracy have been undercut by the feckless violation of liberal norms, as in the torture of prisoners, and the evident malfunctioning of democratic institutions inside the West itself. Tellingly, how democracies atrophy and perish has become the question that most preoccupies liberal scholars today.

 

The very ideal of ‘an open society,’ too, has lost its once-fêted lustre. For many disillusioned citizens, openness to the world now suggests more grounds for anxiety than for hope. When the Berlin Wall was toppled, there were only sixteen border fences in the world. Now there are sixty-five fortified perimeters either completed or under construction. According to Quebec University expert Elisabeth Vallet, almost a third of the world’s countries are rearing barriers along their borders. The three decades following 1989 turned out to be an ‘inter-mural period’, a brief barricade-free interval between the dramatic breaching of the Berlin Wall, exciting utopian fantasies of a borderless world, and a global craze of wall-building, with cement and barbed-wire barriers embodying existential (if sometimes imaginary) fears.

 

Most Europeans and Americans today also believe that the lives of their children will be less prosperous and fulfilling than their own. Public faith in democracy is plummeting and long-established political parties are disintegrating or being crowded out by amorphous political movements and populist strongmen, putting into question the willingness of organized political forces to fight for democracy’s survival in times of crisis. Spooked by the phantom of large-scale migration, electorates in parts of Europe and America are increasingly drawn to xenophobic rhetoric, authoritarian leaders and militarized borders. Rather than believing that the future will be uplifted by the liberal ideas radiating out of the West, they fear that 21st-century history will be afflicted by the millions of people streaming into it. Once extolled as a bulwark against tyranny, human rights are now routinely accused of limiting the ability of democracies to fight terrorism effectively. Fears for liberalism’s survival are so acute that references to William Butler Yeats’s ‘The Second Coming’, written in 1919 in the wake of one of the deadliest conflicts in human history, became an almost obligatory refrain for political commentators in 2016. A century after Yeats wrote them, these words are now the mantra of apprehensive defenders of liberal democracy worldwide: ‘Things fall apart; the centre cannot hold; / Mere anarchy is loosed upon the world.’

The History of Fire: An interview with Stephen Pyne

 

Stephen Pyne is an emeritus professor at Arizona State University. He has published 35 books, most of them dealing with fire, and others on Antarctica, the Grand Canyon, and the Voyager mission. With his oldest daughter, he wrote an inquiry into the Pleistocene. His fire histories include surveys of America, Australia, Canada, Europe (including Russia), and the Earth. To learn more about his work, please visit his website.

 

What made you want to be a historian?

I drifted into history. I enjoyed reading it, especially world history, as a kid, then the spark took when I realized that I understood things best through their history.  It was by reading Carl Boyer’s History of the Calculus, for example, that I finally appreciated what the numbers were all about.  The same has proven true for topics like fire, Antarctica, Grand Canyon, and the rest.  Then I realized that history could also be literature. That clinched it.

I saw that you used to be a wildland firefighter. Is this what made you want to study the history of fire?

A few days after graduating from high school, I was hired as a laborer at Grand Canyon National Park. While I was signing my papers, an opening appeared on the North Rim fire crew, and I was asked if I wanted to join. I’d never been to the North Rim, never worked around a flame bigger than a campfire, didn’t even know the names of the basic tools, and of course said, Sure. It was a moment of biographical wind shear. I returned to the Longshots for 15 seasons, 12 as crew boss, then spent three summers writing fire plans for Rocky Mountain and Yellowstone National Parks. Then I went to Antarctica for a season. The North Rim fire crew and campus were separate lives: neither had much to do with the other. It took 10 years as a Longshot, and a Ph.D., before I finally brought the two worlds together. I decided to apply the scholarship I had been trained in to the subject that most animated me. Fire in America was the result. It’s not a hard weld; the two lives have never fully fused. I have one line of books that continues the topics I studied in grad school, and another that deals with fire, particularly big-screen fire histories for America, Australia, Canada, Europe (including Russia), and the Earth. In some respects, I’ve been a demographic of one. But it’s also like being on the American River in 1848 California. There are riverbeds of nuggets just waiting to be picked up.

 

How did personal experience influence your scholarship?

Obviously, as a topic.  I would never have thought of fire as a subject without those hopping seasons on the Rim.  They gave me woods credibility. More subtly, those years shaped how I think and speak about fire.  On a fire crew, you quickly appreciate how fires shape a season, and how fire seasons can shape a life.  It’s not a big step to wonder if the same might be true for humanity. After all, we are a uniquely fire creature on a uniquely fire planet.  Fire is what we do that no other species does. It makes a pretty good index of our environmental agency. You pick up a language, a familiarity, a sensibility toward fire – it’s a relationship, in some way.  Without anthropomorphizing fire, you learn to animate it – give it a presence. That’s what I can bring that someone else might not.  At the same time, I need to abstract the vernacular into more general concepts, which is what scholarship does. I’m constantly fluctuating between the two poles, a kind of alternating current. There are always trade-offs.  I begin with fire and construct a history.  I don’t begin with historiography – questions of interest to the history community – and use fire to illustrate them.  That makes my fire stuff different, and it means it can be hard to massage it into a more general history or a classroom.  I sometimes wonder if I invented a subject only to kill it.  

 

As I’m sure you are aware, wildfires recently ravaged the state of California, and bushfires continue to burn Australia. How does your understanding of the history of fire shape how you think about these wildfires? 

Ah, as long as California keeps burning it seems I’ll never be lonely.  I do a lot of interviews. Last year I finally wrote a fire primer for journalists and posted it on my website. California is built to burn and to burn explosively.  Against that, we have a society that is determined to live and work where and how it wants.  For a century California has buffered between hammer and anvil with a world-class firefighting apparatus.  But four fire busts in three years have – or should have – broken that strategy. It’s costly, it’s ineffective against the extreme events (which are the ones that matter), and it’s unfair to put crews at risk in this way.  Doing the same things at ever-higher intensities only worsens the conditions. Australia is California at a continental scale, only drier, and with winds that can turn the southeast into a veritable fire flume.  It has a long history of eruptive fires, but the 2009 Black Saturday fires and the Forever fires burning now feel different.  They are more savage, more frequent, and more disruptive.  Australia is also a firepower because it has a cultural connection to fire at a range and depth, from art to politics, that I haven’t found elsewhere.  Australia’s foresters were the first to adopt controlled burning as a strategy for protection – their experience makes a fascinating contrast to what happened in the U.S. Both places show the importance of framing the problem.  Fire is a creation of the living world, but we have defined it as a phenomenon for physics and chemistry, which leads us to seek physical solutions like dropping retardants and shoving hydrocarbons around.  We haven’t really thought about nuanced ecological engineering. We’ve mostly ignored the ideas and institutions that shape the social half of the equation. We’ve neglected how these scenes are historically constructed, how they carry a long evolution that doesn’t derive from first principles. For that matter, the intellectual history of fire is relevant because fire as an integral subject was a casualty of the Enlightenment.  We have no fire department at a university except the one that sends emergency vehicles when an alarm sounds. There is a lot here for historians to chew on.  But it isn’t enough to problematize. We have to show how our analysis can lead to problem-solving.

 

Is climate change alone enough to explain wildfires or do we need to understand more about history to understand why they are such a problem? 

There are many ways to get big fires.  Presently, climate change is serving as a performance enhancer.  In the 19th and early 20th centuries, megafires an order of magnitude greater than those of today were powered by logging and land clearing slash.  Climate integrates many factors, so does fire, and when you stir those two sloppy variables together, it’s tricky to attribute particular causes to the stew of effects. If you make fire an informing principle, a narrative axis, you find that the shift to burning fossil fuels, what I think of as the pyric transition, unhinged Earth’s fire regimes even without climate change.  The conversion has rolled over habitat after habitat. Land use and humanity’s fire practices, for example, are hugely important in interacting with climate to shape fires. But most of those changes also trace back to fossil fuels. Basically, we’re burning our combustion candle at both ends. How lithic landscapes and living landscapes interact has not been something fire ecology or physics has considered.  They dread dealing with humans because people muck up the models. You have to come at those notions sideways, you have to view the scene from outside the disciplinary prisms that we’re trained in.  Paradoxically perhaps, the humanities may be better positioned to make conceptual contributions than the natural sciences.

 

How do you think the field of environmental history will change as the climate crisis becomes a more and more pressing issue?

The crisis is spooky.  But I reject the notion that we are heading into a no-narrative, no-analog future.  With fire, I can offer a pretty substantial narrative – it’s one of the oldest humanity has.  In truth, I now regard climate history as a sub-narrative of fire history. And I’ve come to imagine our evolving fire age as something comparable to the ice ages of the Pleistocene.  That’s a crisp analog. Changing sea levels, mass extinction, wholesale upheavals of biotas, regions reconstructed with the fire-equivalent of ice sheets and pluvial lakes – it’s all there.  For me, the Anthropocene extends across the whole of the Holocene, and from a fire perspective, the Anthropocene could be usefully renamed the Pyrocene. There are other helpful analogs out there.  The problem of powerline-kindled wildfires is very similar to that of railroad fires in the past.  The problem of fringe communities burning (the fatuously named wildland-urban interface) replays the chronicle of settlement fires in the 19th century.  Then it was agricultural colonization; now, an urban reclamation of rural lands. We have pretty good examples of how we might cope. The WUI problem got defined by the wildland fire community which saw houses complicating their management of land, but it makes more sense to pick up the other end of the stick and define these places as urban settings with peculiar landscaping.  The aptest analogy is not to wildland fire but to urban fire. Do that, and it’s obvious what we need to do to reduce the havoc. History also holds lessons beyond data and techniques.  How do we live in a contingent world about which we have incomplete knowledge?  That’s a topic for stories and characters, not algorithms. Mostly, though, the sciences and fire folk don’t credit history with much analytical power.  Historians deal with anecdotes. Historians are good for yearbooks and court poetry. The critics are wrong, but it can be a tough slog, like mopping up in mixed conifer duff. Still, when I began, fire was a fringe topic.  Now it’s a global concern.  People want context, and that’s what history provides, which has created a context for what I do.  It’s been an interesting ride.     

 

I also saw that you specialized in the history of exploration. Have you been able to intertwine that interest with your knowledge of fire?

I went to grad school to study geology, western history, exploration – stuff relevant to my life on the Rim. All my applications were rejected. Then, serendipitously, it was suggested I apply to William Goetzmann in the American Civ program at UT-Austin. He accepted me and mostly left me alone. At the time he was playing with the idea of a second great age of discovery. I quickly added a third and have used it as an organizing principle – a kind of conceptual rebar – for a series of books, including The Ice, How the Canyon Became Grand, and Voyager. I’ve finally systematized the grand schema into The Great Ages of Discovery, now headed for publication. I see exploration as a cultural movement and, for the West, a kind of quest narrative. My exploration books do better critically and commercially than my fire books. Exploration has a literary tradition; fire, apart from disaster and battlefield accounts, doesn’t. It’s been helpful to have two themes – it puts me back into the two-cycle rhythms I knew in my rim-campus days and keeps me from getting too stale in either one. If I were starting over, I’d write the two series under different names.

 

In 2012, you wrote a book with your daughter, The Last Lost World. What was it like working with her, and what kind of expertise did she bring to the book?

My oldest daughter, Lydia, was attracted to archaeology and paleoanthropology and went to graduate school to study them.  Then she decided that the history of the field was more appealing. For years we joked about writing a book together sometime.  She graduated in 2009, a horrible moment for a new Ph.D., so I thought that the time had come. I had a sabbatical semester, and we took her topic and wrote The Last Lost World.  I knew about a few of the subjects, and Lydia the others, and she directed our research.  I credit her with keeping us (me) on message. I have good memories of the project, particularly the weeks we spent at our mountain cabin revising. (She’s gone on to a successful career as a writer; her latest, Genuine Fakes: How Phony Things Teach Us About Real Stuff, is just out.)  And, yes, we’re still speaking to each other.

 

Is there any other information you want people to know about you or the work that you do?

Because I ended up in a school of life sciences, I didn’t do much with graduate students.  Historians didn’t want someone from biology on their committee, and biologists didn’t want a historian.  Eventually, I decided to offer a course on nonfiction writing, and then wrote a couple of books about writing books.  Call it my postmodern phase.

A New History of American Liberalism

 

Historians interested in the history of political philosophies would do well to read James Traub's new book What Was Liberalism? The Past, Present, and Promise of a Noble Idea. Traub's ambitious book documents how liberalism has evolved over time. For John Stuart Mill and Thomas Jefferson, it was mostly a check on government power and a preservation of individual liberty. For Theodore Roosevelt and Franklin Roosevelt, liberalism meant using government's power to regulate business and promote social welfare. Recently it has been more about government policies for equality and inclusion.

 

But Traub frets that liberalism has foundered. People's faith in government has eroded. Working and middle class people have become disenchanted. Traub recounts that George Wallace, running for president in 1964, rebuked what he called liberal elites. "They have looked down their noses at the average man in the street too long," Wallace cried. That alienation grew and culminated in Donald Trump's 2016 campaign. Trump, as president, keeps hammering at liberal values. In addition, in Traub's view, Trump poses a threat to political civility, free speech, and the rule of law.

 

Traub's book begins with a chapter on "Why Liberalism Matters." He concludes it with a chapter on "A Liberal Nationalism" that laments liberalism's eclipse but looks for signs of its potential resurgence. He finds faint hope in liberalism's history of adaptation and renewal. Liberals need to identify with opportunity, national confidence, inclusion, and civility. They need to be seen as champions of "the public good."

 

In the book's last sentence, he says, "Liberalism will renew itself only if enough people believe that its principles are worth fighting for."

 

James Traub's sense of concern contrasts sharply with the buoyancy, optimism, self-confidence and determination that helped make liberalism a success in the past.

 

In his odyssey through liberalism's history, Traub brings in Hubert H. Humphrey (1911-1978), longtime Democratic senator from Minnesota, Vice President (1965-1969) and the party's unsuccessful presidential nominee in 1968. Traub draws extensively on Humphrey's autobiography, The Education of a Public Man: My Life in Politics, published in 1976. A better book, though, for insight into the liberal mind at the movement's high tide in the 1960's -- and a contrast to liberal disarray and doldrums today -- is Humphrey's 1964 book The Cause is Mankind: A Liberal Program for Modern America.

 

Humphrey exudes optimism and confidence: liberals know what the nation needs and are determined to secure it.  “The enduring strength of American liberalism," Humphrey wrote in the book, "is that it recognizes and welcomes change as an essential part of life, and moves to seize rather than evade the challenges and opportunities that change presents. It is, basically, an attitude toward life rather than a dogma—characterized by a warm heart, an open mind, and willing hands.”

 

To be sure, there were lots of challenges. "We are living in an age when America seems to be bursting with issues and problems. We must secure civil rights for all our citizens. We must end poverty. Our economy must grow in all parts of the country. Automation and technology must create new jobs, not more jobless...We must rebuild our cities, revitalize our rural areas... We must conserve our natural resources."

 

But in Humphrey's sunny view of things, challenges were little more than opportunities for reform. Nothing was too difficult for Americans; they needed only to endorse bold government initiatives engineered by the liberal spirit. Humphrey's book included a chapter on planning. He had proposals to streamline the work of Congress. He had proposals for reconciling big business and labor and for preserving competition. "The chief economic role of government," he wrote, "must be the smoothing of the way for new men and new ideas."

 

He endorsed the "welfare state" and government's responsibility to ensure human dignity and a decent standard of living for everyone. He proposed a massive "war on poverty" that was bolder than what President Lyndon Johnson was sponsoring. Better agricultural policies would preserve family farms and at the same time provide food in abundance. Federal education aid would boost that sector.

 

Civil rights, something Humphrey had championed for years, would keep progressing.

 

Sometimes, it would take experimentation and improvisation. No matter, said Humphrey. Americans were good at that.

 

Humphrey did not foresee that the war in Vietnam and other events would soon undercut the Johnson/Humphrey liberal domestic agenda. In his book, it was all about forward momentum into a beckoning, bracing future. American liberalism "sees free competitive enterprise as the mainspring of economic life and is dedicated to the maintenance of the traditional freedoms of speech, of the press, of assembly and the like." But it also stands behind "the use of power of the state to achieve both freedom and a reasonable measure of equality."

 

Liberals looking for an enlightening but sobering account of their movement's history should read James Traub. 

 

But liberals should also read Hubert Humphrey for some much-needed inspiration.
