The Real Patriots Invaded the Nation’s Capital Fifty Years Ago

Vietnam Veterans Against the War in Washington, DC. 1971.


They called their trip to Washington, D.C., an “invasion.” Vowing not to be “deterred or intimidated by police, government agents, [or] U.S. marshals,” they arrived outfitted for war in fatigues and jungle boots with weapons and gas masks firmly in hand. Calling themselves “concerned citizens” and “patriots,” they announced their intention to “protect the flag” by “stop[ping] all business as usual, until the government recognizes and responds positively to our demands.”

No, these were not the self-professed patriots who stormed the U.S. Capitol Building on January 6, 2021.

This was back in 1971 when President Richard Nixon claimed to be fulfilling his campaign promise of “peace with honor” by lowering the number of American ground troops in Vietnam.  Much to the horror of thousands of recently returned GIs, the civilian branch of the most vocal and sustained antiwar movement in American history took the bait and stopped protesting.

And thus, on the evening before Patriots’ Day, twelve hundred members of Vietnam Veterans Against the War (“VVAW”) arrived in Washington from around the country for what they called Operation Dewey Canyon III, a pointed rebuke of the recent American expansion of the war into Laos under the code names Operation Dewey Canyon I and II.

At first the public was confused.  The men who descended on the nation’s capital in olive drab, some with bandoliers strapped across their chests, did not look anything like the closely clipped GIs featured in the military recruiting posters plastering America’s post offices.  These guys had beards and long hair.

“Son, I don’t think what you’re doing is good for the troops,” a Daughter of the American Revolution complained to one of them as the veterans marched past the DAR’s Memorial Hall.

“Lady, we are the troops,” was the ready reply.

After four days spent in such peaceable pursuits as lobbying their congresspeople, laying funeral wreaths at Arlington National Cemetery for both the American and the Vietnamese dead, holding a candlelight vigil at the White House, and testifying in front of the Senate Foreign Relations Committee, the veterans announced their plan to descend on the Capitol Building, which the Nixon administration decided to surround, preemptively, with a version of the same kind of barrier fence that encircles it now.

The nation held its collective breath.

But rather than storm the seat of the legislative branch of the U.S. government, the veterans set about assembling a makeshift platform on the west side of the Capitol, which they equipped with a powerful sound system.  At the appointed time, those who were not confined to wheelchairs walked up to the microphone one-by-one.  Holding up their medals, ribbons, and citations, each man told the assembled crowd of veterans and journalists what the nation’s highest honors meant to him.

“A symbol of dishonor, shame, and inhumanity,” said one veteran as he hurled his medals over the barrier fence. 

“Worthless,” said another as the pile of discarded honors grew.

Many of the veterans called out the American government for being racist towards Southeast Asians and others.

“I symbolically return my Vietnam medals and other service medals given me by the power structure that has genocidal policies against the nonwhite peoples of the world.”

“Our hearts,” many of the veterans declared, “are broken,” and their copious tears proved it. 

In taking a stand against the war in front of the Capitol Building, the veterans were following in the footsteps of Martin Luther King Jr., who addressed the American people in 1963 from the steps of the Lincoln Memorial as a means of asking them to measure the distance between the promise of the Emancipation Proclamation and the reality of Jim Crow.  VVAW was similarly asking the country to note the difference between the promise of an inclusive and transparent government, as represented by the welcoming façade and the usually open doors of the Capitol Building, and the secret air war the Nixon administration was conducting.

These first veterans to protest a war in which they had served won their countrymen’s respect.  Noting that the day they began their protest was the “anniversary of the day the ‘shot heard round the world’ was fired at Concord Bridge,” one Boston newspaper asked any readers who might be reluctant to recognize the veterans as patriots to remember that “in 1775 the colonial forces were also unruly and young.”

After all of the major news outlets had photographed and filmed them throwing away their medals and discarding what turned out to be Mattel-manufactured toy M16s, the veterans packed up their gear and policed their campsite on the National Mall. Just to be sure they left it in better shape than they had found it, they planted a tree. Then they went home to their local VVAW chapters, where they continued to work to end the war by mobilizing other sacred symbols. The New England chapter marched Paul Revere’s route in reverse, stopping at the famed Revolutionary War battlefields in Concord, Lexington, and Charlestown to perform mock search-and-destroy missions in a demonstration of the difference between fighting against an imperialist regime and becoming one. On another occasion, antiwar veterans signaled their distress about the ongoing war in Southeast Asia by hanging an upside-down American flag from the crown of the Statue of Liberty. And when the war was finally over in 1975, VVAW set to work advocating for better mental health care for those American servicemen who had been traumatized by being asked to do the most un-American thing imaginable: deny another country its own April 19, 1775.

On this Patriots’ Day, fifty years after a battalion of Vietnam veterans brought their anguish and their outrage to the Capitol Building, the nation owes its thanks not only to the colonial militiamen who lost their lives along the famed Battle Road, but also to their direct descendants, the antiwar veterans who, in reminding a nation of its foundational values, sought to reset its course. 

History Found Dixie Kiefer, one of the Greatest Heroes of World War II in the Pacific

USS Yorktown, struck by Japanese bombs in the Battle of Midway, June 4, 1942.

A "sailor's skipper," Executive Officer Dixie Kiefer led efforts to repair the damaged ship and, though injured himself, pushed a raft of injured sailors to rescue.


Like Forrest Gump or Woody Allen’s Zelig, some people seem to have a knack for being present whenever and wherever history is made. But one such person is not a fictional character at all. Dixie Kiefer was about as real as they come. Like many naval leaders, he was born far from the sea, in Idaho. At Annapolis, he was better known for his difficulty in maintaining his weight than for his academics. But Kiefer’s determination, leadership skills, and ability to foresee the direction of future naval warfare would eventually make him one of the greatest heroes of World War II, though not one of the best known.


He was a visionary when he—unlike many of the leaders of the world’s naval powers—saw that the battleship would soon no longer rule the seas. Instead, the seas would be ruled by an airfield that could be floated to wherever it was needed: the aircraft carrier. Kiefer became a pioneer in naval aviation even before the carrier arrived. He was the first man to fly an airplane off a warship at night, at a time when aircraft were catapulted off destroyers or cruisers. He and others helped develop carriers, from construction to implementation to tactics, and that work paid dividends in World War II.


Kiefer was the executive officer—the second in command—on the first USS Yorktown (CV-5) in the historic Battle of the Coral Sea. Historic primarily because, fought by carrier-launched aircraft, it was the first sea battle in which the competing vessels never saw each other. Historic, too, because though neither the Japanese nor the Allies could claim victory, the outcome would soon have an effect on another showdown. The United States lost one carrier, and Yorktown was badly damaged. The Imperial Japanese Navy did not lose a ship. However, two IJN carriers were damaged badly enough to require that they return to the Home Islands for repair. They would be unavailable for another critical battle about a month later, this one near Midway Atoll.


The Battle of Midway, arguably the turning point of the sea war in the Pacific, would be another moment in history in which Dixie Kiefer would play a key role. His carrier, the Yorktown, limped back from the Coral Sea to Pearl Harbor for repairs that everyone believed would require up to two months. Admiral Chester Nimitz would have none of that. Intercepted Japanese messages indicated a major IJN task force was headed to Midway Atoll to take that island and entice the remaining American aircraft carriers—the IJN believed Yorktown had been sunk at the Coral Sea—into a confrontation the US could not win. Nimitz ordered Yorktown to be ready to steam toward Midway in less than a week with sister flat-tops Enterprise and Hornet. Dixie Kiefer was instrumental in making sure his ship was ready.


As most know, the US carriers, with their torpedo- and dive-bombers and brave crews, defeated the Japanese at Midway. Planes off Yorktown played a big role. Japan would never fully recover. But during the battle, two air-delivered Japanese torpedoes struck Yorktown, inflicting heavy damage. She began listing badly, apparently sinking. The order came to abandon ship. Kiefer was one of the last off.


But the next morning, the carrier was still afloat. The decision came to tow Yorktown back to Hawaii. Kiefer led the repair party back aboard to prepare the ship. Then, an enemy submarine managed to hit the carrier with two torpedoes. As she quickly sank, Kiefer helped surviving crew off the ship. In the process, he fell into the water, striking the ship’s hull, badly breaking his ankle and leg. Even so, he swam through the water pushing a raft full of badly injured sailors to their rescue. That heroic effort won Kiefer the Navy Cross.


Once recovered, Kiefer became the CO of a new carrier, Ticonderoga (CV-14). After taking the ship to war, and even steering her through a typhoon that claimed more American ships and lives than many sea battles of the war, Kiefer ultimately faced his toughest test. The desperate Japanese had begun sending pilots on suicide missions. In January 1945, two kamikaze planes struck Ticonderoga. The first hit the deck where fueling and ordnance were being loaded on planes, setting off massive fires. But a quick-thinking Kiefer ordered ballast tanks be flooded, purposely causing the ship to list. That caused burning fuel and equipment to slide off the deck and into the sea. He also turned the ship so the wind would blow smoke and flames away from men fighting the fires.


Then a second kamikaze hit the carrier just below the bridge, where Kiefer was directing the firefighting. The captain suffered more than sixty-five shrapnel wounds and a badly broken arm. However, he refused to leave the bridge until the fires were controlled and every wounded man had been cared for. For this extraordinary action, Kiefer was awarded the Distinguished Service Medal. At the medal award ceremony, Secretary of the Navy James Forrestal dubbed Kiefer “The Indestructible Man.”


World War II ended before Kiefer recovered from those injuries. He was promoted to the rank of commodore and named commander of the naval air station at Quonset Point, Rhode Island.

It is a cruel irony that only three months after VJ Day, the announcement of the Japanese surrender, and while employing a form of transportation he had pioneered, Dixie Kiefer proved not to be indestructible after all. He was returning from a trip to New York City with five other Navy personnel when their airplane encountered bad weather and crashed into Mount Beacon, New York, above the Hudson River. All six men died. One way Kiefer’s body was identified was by the plaster cast still on his broken arm from the kamikaze assault on Ticonderoga. It is noteworthy that two of those who died with Kiefer were enlisted men, given a lift by the commodore. Kiefer was known as a “sailor’s skipper,” beloved by all, an indication of his leadership style. And one of the men was an African American sailor who had repeatedly been denied pilot training by the Navy because of his race. Kiefer was actively working to change that policy.


More cruel coincidences: James Forrestal, the man who gave the “Indestructible Man” his nickname, was born in Beacon, New York, not far from where Dixie Kiefer died. And the crash occurred on November 11, Armistice Day, marking the end of World War I.

Law, Politics, Public Health and Deadly Epidemics: A Conversation with John Fabian Witt on “American Contagions”


If the past is a guide, how our law responds to contagion now and in the future will help decide the course of our democracy. – John Fabian Witt, American Contagions


At this writing, the deadly COVID-19 pandemic has killed a shocking 560,000 Americans and infected about 32 million. The United States remains the nation with the highest death toll in the world, and the death rates here betray stunning inequities for people of color, among other vulnerable and disadvantaged communities.

In March, Dr. Deborah Birx, the White House coronavirus response coordinator under the previous administration, told a CNN reporter: "There were about a hundred thousand deaths that came from that original surge. All of the rest of them, in my mind, could have been mitigated or decreased substantially." Our nation suffered for months under a chaotic presidential administration that mocked science and public health experts as it politicized efforts to reduce infections and prevent spread of the virus.

The Biden administration has made the pandemic its top priority. Many Americans, particularly those in high-risk categories, have been vaccinated, but the pandemic has yet to abate. Communities of color and impoverished people continue to disproportionately bear the brunt of illness and death from COVID-19.

The experience of this novel pandemic in the past year has fueled questions about the role of the federal and state governments in addressing epidemics; the importance of public health versus individual freedoms; the inequities in access to health care; and more.

To help address these concerns, Yale professor of history and law John Fabian Witt has provided a comprehensive citizens’ guide to the history of law and epidemics with his recent book, American Contagions: Epidemics and the Law from Smallpox to COVID-19 (Yale University Press).

In his book, Professor Witt explores how infectious diseases through our history have shaped the law, and how law has shaped our response to these recurrent diseases. For the most part, since the inception of our nation, public health has held primacy over other interests such as individual rights. Court decisions often reflected the principle set forth by the Roman scholar and lawyer Cicero more than two millennia ago: “Salus populi suprema lex esto.” [The health of the people is the supreme law.]

But, as Professor Witt stresses, the results of health laws and court decisions have not always been experienced equally by all citizens. While laws may have protected white majority populations, vulnerable minorities and the poor were often ignored or were subject to harsh measures such as confinement and strict quarantines. The book offers stunning examples of past laws that penalized rather than prevented illness among marginalized groups such as Native Americans, recent immigrants, Black people, Asians, and others. As Professor Witt observes, the ostensibly neutral rules and laws that govern American life “contain the compounded form of discriminations and inequities, both old and new.”

And there’s a new twist in the legal story since American Contagions was published. In recent months, the U.S. Supreme Court has chipped away at public health law precedents with a series of religious freedom cases.

As Professor Witt notes, we can intelligently face the future, but only if we grasp our “often disturbing past.” He urges that the history of law and epidemics not only tells us where we have been but shapes the present moment and informs us on where we are headed. And epidemics can be used to illuminate inequities and correct them if we recall the lessons from our imperfect past.

John Fabian Witt is Allen H. Duffy Class of 1960 Professor of Law at Yale Law School where he teaches courses in American Legal History, Torts, History of the Laws of War, and Problems in Legal Historiography. He also taught for a decade at Columbia Law School, visited at Harvard and the University of Texas at Austin, and served as a law clerk to Judge Pierre N. Leval on the United States Court of Appeals for the Second Circuit. He holds a J.D. and a Ph.D. in history from Yale.

His other books include Lincoln’s Code: The Laws of War in American History, which received the Bancroft Prize and the American Bar Association’s Silver Gavel Award and was a finalist for the Pulitzer Prize; To Save the Country: A Lost Manuscript of the Civil War Constitution; Patriots and Cosmopolitans: Hidden Histories of American Law; and The Accidental Republic: Crippled Workingmen, Destitute Widows, and the Remaking of American Law. He has also written for scholarly journals as well as The New York Times, Slate, the Wall Street Journal, and The Washington Post.

Professor Witt generously responded in writing to a series of questions on his teaching and his new book, American Contagions.


Robin Lindley: Congratulations Professor Witt on your new and very timely book, American Contagions: Epidemics and the Law from Smallpox to COVID-19. You’ve created a lively and clear guide for citizens on the history of public health, epidemics, and American law. Did the COVID-19 epidemic spark your book or were you already working on this subject?

Professor John Fabian Witt: Thanks, Robin! The book took shape in the spring of 2020 as the pandemic set in. I retooled a section of my course on American Legal History to include a unit on the legal history of epidemics in the U.S. Thanks to some amazing RAs, I was able to gather some awesome materials and some excellent images, and I turned my lecture on the subject into a public lecture. My editor saw it and suggested that I turn it into a book – the rest is history!

My boys and I sat at the dinner table while they did schoolwork on Zoom…. We called it our Covid Coffee Shop.

Robin Lindley: An excellent working arrangement at a challenging time. I saw that the book was dedicated to your boys.

As you note in your book, under our federal system, state and local governments bear the primary responsibility for dealing with public health under their “police power.” What is the role for the federal government and Congress in dealing with a nationwide epidemic?

Professor John Fabian Witt:  Great question, and we’ve seen a big reversal on this over two administrations now. 

It’s a tough question. On the one hand, the fact that germs don’t respect boundaries is a powerful argument for a centralized approach directed by the federal government. My brilliant Yale colleague Nicholas Christakis compares a decentralized approach to letting swimmers urinate in one corner of the swimming pool and hoping for the best. And of course, everyone recalls the period in which federal government inaction left states competing with one another for protective gear and ventilators.

On the other hand, we’ve also had a powerful lesson in the dangers of centralized power.  The Trump Administration’s mix of malevolence and incompetence was a terrible recipe for pandemic management.  Centralized power in public health, as in other domains, is a high-risk arrangement. Our decentralized approach functioned as insurance against the real risk of failure in Washington, DC. Governors were able to adopt masking requirements, business closure mandates, and gathering limits that almost certainly wouldn’t have come out of the federal government. 

Robin Lindley: How do you see the national response to COVID-19 under the Trump administration?

Professor John Fabian Witt: The true crisis for any president is the one they are least suited to manage.  The pandemic was exactly that for the Trump presidency. 

We have a deep history of administration in public health in the US.  Public health measures in the mid-nineteenth century produced the modern administrative state.  But the Trump administration was deeply resistant to expertise and suspicious of the civil service and the state.  Part of this was specific to Trump, who is a kind of genius of self-promotion and who rightly identified the professionals in the federal bureaucracy as a threat to his unaccountable entrepreneurialism in the White House.  But Trump’s particular reasons for seeing the state as a threat connected to a more general phenomenon, namely the Republican Party’s resistance to the administrative state.  

Of course, it’s always important to observe that lots of countries around the world struggled with the pandemic.  But if you look at per capita death rates, the only developed countries with rates as bad as the US (countries such as the U.K., Spain, and Italy) are countries that had older and more vulnerable populations.  Some western countries did much better than the U.S. even though they had substantially older populations.  Austria and Germany are good examples.  Japan, Singapore, and South Korea boasted performances that put all these western countries to shame. 

Robin Lindley: At last, vaccines are gradually getting to the public. What more would you like to see the Biden administration do to protect citizens?

Professor John Fabian Witt: A competent federal government could set clearer standards for state public health guidelines on questions such as reopening.  The current spike in Michigan, for example, is one where local political pressures seem to have led state officials to abandon crucial public health regulations.  Tougher limits are unpopular, but the federal government can sometimes reduce or deflect those pressures. 

I also think the federal government and the Justice Department may be in a position to help states manage a new and emerging problem, which is the centralized policing of state public health regulations by the Supreme Court. 

Robin Lindley: The previous president and many of his partisans undercut science by openly mocking public health officials and scientific experts. Is this fierce politicization of a deadly virus and science denial unprecedented or were there earlier examples of such politicization?

Professor John Fabian Witt: This is a great question.  There are certain precedents for politicization, but the way in which politicization is playing out in national partisan terms is, so far as I can tell, completely unprecedented and incredibly dangerous.

It shouldn’t be news that unpopular public health impositions produce political backlash.  That is an old story.  My Irish and German Catholic ancestors in Five Points in New York City in the 1850s were pretty sure that Whig and then Republican administrations in City Hall discriminated against them in the administration of public health rules because they were Democrats.  The Democrats certainly urged them to think so.  Residents of Staten Island rioted when City Hall tried to locate a quarantine facility close to their homes.

 So political controversy and public health rules in epidemics go together historically.  What’s new is that the country’s political parties are polarized along ideological lines.  What used to be local fights have become national battles with much higher stakes.  

Robin Lindley: It seems our history shows that the courts usually uphold state efforts to protect public health in line with Cicero’s dictum that you note, “health of the people is the supreme law.” Yet, some recent US Supreme Court decisions indicate that religious freedom trumps public health protections. What do you see in these recent decisions?

Professor John Fabian Witt: If there’s one thing I would like readers to come away from my book with, it is that today’s courts have made a radically novel departure from the long history of judicial deference to public health officials. 

Going back at least to the time of Chief Justice John Marshall, courts have recognized that governments need to be able to protect the health of the people. Courts have played a role in shaping and channeling public health limits, sometimes ruling out the abusive uses of government power. But they have rarely if ever gotten in the way and blocked public health authorities from putting in place the measures they think important. The paradigm case has been Jacobson v. Massachusetts from 1905, in which the Supreme Court upheld mandatory smallpox vaccination. In the novel coronavirus pandemic, by contrast, state supreme courts in places like Wisconsin and Michigan struck down state COVID-19 limits. The Michigan decision was especially striking because it ruled the state’s emergency public health arrangements unconstitutional.

At the U.S. Supreme Court, a series of religious freedom decisions starting just before Thanksgiving and accelerating last week has interposed individual rights against public health limits that, candidly, had colorable public health rationales.

Robin Lindley: You note that public health measures often have protected white populations while targeting or neglecting the powerless: the underprivileged, minority groups, immigrants, and others. Are there examples of discrimination and public health that stand out for you?

Professor John Fabian Witt: A classic and dreadful example is San Francisco at the turn of the twentieth century, when local officials imposed a quarantine on Chinatown that was limited only to people of Chinese descent. Time and again, politically vulnerable populations have borne the brunt of the awesome public health powers of the state. 

There is a paradox here.  Those powers are dangerous and awesome.  But they are indispensable, too.  The public health power Cicero talked about – salus populi suprema lex – is like the power of national self-defense. Terrible things can be and have been done in its name even though we can hardly do without it.  

Robin Lindley: What changes in law and policy would you suggest to protect citizens, particularly minority groups and other marginalized people, during an epidemic? 

Professor John Fabian Witt: Epidemics are paradigm cases for panicked public policy making, and courts can play a valuable role in constraining the worst forms of arbitrary discrimination.  The Federal Ninth Circuit Court of Appeals, for example, struck down the San Francisco plague quarantine of 1900. 

But today the most discriminatory features of the pandemic seem to arise out of socio-economic and health care inequalities.  Until we have better systems for the provision of basic social goods like health care, income, and housing, we will see poor Americans in vulnerable positions.    

Robin Lindley: It seems that economic inequity and systemic racism loom large in your accounts of vulnerable populations and public health.

You have a unique background as an academic historian with a law degree and you’re known for your groundbreaking and accessible books and other writing on how law has shaped the American past. What brought you to a career that combined law and history?

Professor John Fabian Witt: Is it true that everyone finds their way for a mix of personal and professional reasons?  My father is a brilliant lawyer in Philadelphia. He teaches now at Penn as an adjunct professor.  Early in his career, he pursued a Ph.D. in history.  I suppose I’ve been singularly uncreative in pursuing much the same path! At the same time, my curiosity about the law resists the usual disciplinary boundaries in the law school world.  I move from subfield to subfield, writing about and teaching about industrial accidents, constitutional law, contracts, war, and more. 

I think my secret project may be to try to make sense of it all by studying a little bit of everything.  The risk is that I’m an expert in none of it.  It can drive me to distraction, but I wouldn’t have it any other way. 

Robin Lindley: This past year has been painful for many citizens, and often frustrating for those who look to science for answers when addressing a deadly pandemic. Where do you find hope in these challenging times?

Professor John Fabian Witt: I got my first vaccine dose two weeks ago in a pop-up public health clinic run by the New Haven Health Department.  After so much dreadful failure in our public health infrastructure, it felt like impressive evidence of reservoirs of state capacity and good citizenship.  

Robin Lindley: That had to be gratifying. Do you have any closing thoughts for readers on the law and epidemics or anything else?

Professor John Fabian Witt: One of the great challenges of the coronavirus pandemic is that it has revealed the limits of some of our most powerful and long-standing institutions.  In the U.S. we rely on private property and markets to deliver all sorts of crucial social goods.  People rely on markets to put food on their tables, keep a roof over their heads, and get medical care for themselves and their families.  Such markets have considerable virtues.  But the pandemic has made salient the limits of such markets in situations of public health risk. 

Collective risks press us to develop collective solutions.  (Think of the glorious democracy of New Haven’s pop-up vaccination clinics.)  Our private mechanisms have produced dreadful outcomes for the most vulnerable.  Consider that our overall death rates are double those of comparable western European countries like Germany.  Or consider that once we adjust for age, Black and Latinx people have accounted for twice as many deaths per capita as whites.  Such disparities are a result of legal arrangements and policy choices.  We can do better.   

Robin Lindley: Thank you, Professor Witt, for your thoughtful comments, and congratulations again on your groundbreaking and timely new book American Contagions.


Robin Lindley is a Seattle-based writer and attorney. He is features editor for the History News Network (hnn.us), and his work also has appeared in Writer’s Chronicle, BillMoyers.com, Salon.com, Crosscut, Documentary, ABA Journal, Re-Markings, Huffington Post, and more. Most of his legal work has been in public service. He also served as a staff attorney with the US House of Representatives Select Committee on Assassinations and investigated the death of Dr. King. He can be reached by email: robinlindley@gmail.com.

What Do John Dewey's Century-Old Thoughts on Anti-Asian Bigotry Teach Us?

John Dewey with wife Alice Chipman Dewey and other Chinese educators, c. 1920.


Whether or not one agrees with Pulitzer Prize-winning historian Richard Hofstadter’s observation that the famous philosopher John Dewey’s “style is suggestive of the cannonading of distant armies: one concludes that something portentous is going on at a remote and inaccessible distance, but one cannot determine just what it is,” or with the noted Harvard pragmatist William James, who opined that Dewey’s writings were “damnable; you might even say God-damnable,” it remains hard to ignore Dewey’s social and political views regarding American attitudes toward Asian Americans. After all, Dewey was in many respects more commentator than philosopher.

The organization Stop AAPI Hate has identified nearly 3,800 reported anti-Asian hate incidents in the US over the past 12 months (a total that represents only a fraction of all such events). A century ago, Dewey commented on the issue of race prejudice in the wake of another global crisis, the aftereffects of World War I. Today, we are experiencing another world crisis, COVID-19, and there are parallels in how we are treating our Asian American citizens.

The global pandemic that has consumed and overtaken our lives has led to a fresh wave of hatred against those of Asian descent but particularly Chinese Americans. The recent attacks at massage parlors in Atlanta and random assaults on the streets of New York and other cities are stark reminders of what can happen when people feel confined, angry, and compelled to blame someone else for their own current predicament. Scholars at Cal State San Bernardino estimate that in 2020, attacks against Asian Americans increased by one hundred and fifty percent from the previous year, a trend which seems to be intensifying in 2021.

The current spate of hate crimes and prejudice against those of Asian descent is particularly worrisome but should not come as a complete surprise. We have a long history of nativist resentment towards those who do not look Western European. The Chinese Exclusion Act of 1882, the 1885 killing of twenty-eight Chinese coal miners by a white mob in Rock Springs, Wyoming, the Gentlemen’s Agreement of 1907, and, most famously, the establishment of internment camps during World War II, which witnessed Japanese American citizens being torn from their homes and jobs on the West Coast under the pretext of national security (measures not imposed on Germans or Italians in other parts of the country), are just some examples of how Asian ethnic groups have become targets at moments of national tension.

Dewey was America’s most noted philosopher of the day, and his post-World War I trip to Asia remains instructive. Fresh from a two-year sabbatical in the Far East from 1919 to 1921, Dewey returned to resume his duties at Columbia prior to his retirement in 1930. He had been battered and intellectually bruised by his former student, Randolph Bourne, who soundly criticized him for supporting America’s entry into the war without carefully thinking through its consequences. Indeed, the petty bickering at the Treaty of Versailles and the failure to implement all of Wilson’s Fourteen Points led Dewey to issue his own public apologia, “The Discrediting of Idealism.” He had heartily welcomed this needed hiatus when invited to the Far East by a number of his former Chinese students at Teachers College; he was encouraged, especially by Hu Shih, to present his ideas on progressive education to coincide with the wave of nationalism and modernization as China emerged from its feudalistic past.

The two years he spent, first lecturing in Japan for six weeks and then teaching and lecturing at the University of Nanking and other colleges in China while traveling about the countryside during the remainder of his sabbatical, gave Dewey a newfound appreciation for the Chinese and their culture. While he found Chinese thinking difficult to penetrate, he was uplifted by their willingness to entertain certain aspects of Western democracy and industrialization.

But what he did not count upon when he arrived back in his own homeland was the virulent xenophobic nationalism that had surged in his absence. Symptoms included the Red Scare of 1919, the rebirth of the Ku Klux Klan, rural suspicions of expanding urban centers, and growing calls for a stricter immigration bill. The pinnacle of white Anglo-Saxon nativism was the 1924 National Origins Act, which imposed strict quotas to restrict immigration by those not from Northern Europe. The historian John Higham neatly captures the reasons for this nativist hostility in his excellent work, Strangers in the Land: Patterns of American Nativism.

Naturally, Dewey had hoped that upon his return to the United States attitudes would be different. Unfortunately, it was not to be the case. Perhaps he should have seen this coming as a result of the war hysteria and anti-German feelings exhibited between 1914-1918. Although the war had discredited his own idealism, he still found it very difficult to understand why his own nation not only refused to abandon its wartime intolerance but focused it on new enemies; he viewed with dismay and disappointment the nativist mind-set sweeping across the American landscape in the new decade.

Determined to speak out and challenge Americans to understand their reasons for treating Asian Americans as they did, as well as to satisfy Chinese doubts about the sincerity of Western intentions, he delivered a powerful and moving speech in 1921. He then fine-tuned it with force and conviction for his American readership, and it appeared in 1922 in the Chinese Social and Political Science Review under the apt title “Race Prejudice and Friction.”

What is most interesting about this speech, and why it needs retelling today, is how Dewey defined race and prejudice. In this article he insisted that racial prejudice is a social disease, one that comes before judgment; it cuts short our thinking and relies simply on desire or emotion, thereby forcing people to see things only in one light and slanting one’s beliefs. What is shocking to our customary habits, Dewey observed, is the manufactured creation of a mentality that nurtures intolerance and hatred.

The anti-foreign sentiment Dewey experienced upon his return led to his further exploration of the causes of such attitudes. In re-reading this essay I decided to dig deeper into the philosopher’s thinking, only to find to my surprise that he hit upon the obvious: what leads to such reaction is a current crisis. In our case, today, it is the pandemic; the attitudes we are currently witnessing against Asian Americans have been fanned by those who chose political expediency and blame rather than accepting responsibility for their own inaction from the very beginning of this crisis here in the United States.

Perhaps a good way to frame Dewey’s line of thinking and apply it to our present situation is the principle of post hoc, ergo propter hoc, a fallacious determination that, in Dewey’s own words, “since one thing happens after another it happens because of it.” Since things did not go well once the pandemic hit, Asian Americans have now become objects of blame, contempt, and anger. The same analogy can be applied to Muslim Americans in the wake of 9/11. Indeed, the anti-foreign animus, which Dewey experienced after World War I, continues to resonate within a certain element of Americans who see only Darth Vader among those U.S. citizens of a different skin color, religion, or physical appearance. We can even make the same argument when it comes to immigration from south of our border. We doubt, however, that there would be the same reaction if a bunch of French-speaking Canadians crossed the St. Lawrence River and invaded Maine; they might even encounter a friendly moose or two as they set up camp.

For Dewey race is an abstract idea and in terms of science is primarily a “mythical idea.” What we, as Americans, must learn from Dewey’s own words is that race “in its popular usage is merely a name given to a large number of phenomena which strike attention because they are different.” We must consider those factors complicating the relationships in our “melting pot” while paying close attention to those cultural aspects found in our “salad bowl.” When and if understanding of the mythical nature of race becomes common, it may counteract the tendency to regard ethnic Americans as strange, unwelcome, or threatening. More importantly, it may allow the embrace of Asian Americans as equal participants in Dewey’s ideal of democracy as a way of life, rather than a mere political construct.

And speaking of political realities, perhaps the most important lesson Dewey gave us in this speech, later published, is that race, unfortunately, has been tied too closely to the notion of nationalism, which in turn has “become almost exclusively political fact.” Let Dewey’s words speak for themselves. “The political factor,” he wrote, “works in two ways. In the first place, the fact of political domination creates the belief in superiority on one side and inferiority on the other. It changes race prejudice into racial discrimination.” The second aspect, he argued, is one that engenders a “psychological effect of rule upon the dominant political group”—one that inevitably fosters arrogance and contempt. Seeking cover for their own missteps, certain public officials made all those of Asian nationality responsible for America’s misfortune—a calculated attempt grounded in a tone of self-righteous superiority and indignation.

In reading Dewey’s words we can only wonder if anything has really changed about the true nature of American nativism: “The same man who is sure of the inherent superiority of the white race will for example hold forth on the Yellow Peril in a style which would make one believe that he believed in the inherent inferiority of the white race, though he usually tries to save himself by attributing fear to superiority in numbers.” Race prejudice, Dewey maintained throughout his life, is nothing more than an instinctive dislike and dread of what is different. It is a prejudice “converted into discrimination and friction by accidental physical features, and by cultural differences of language, religion, and, especially at the present time, by an intermixture of political and economic forces [just think today of the political and economic consequences of our current pandemic].” Need Dewey have said more?

Yet Dewey’s philosophy was not so much about ideas in and of themselves as about how they could help work out our common social problems. Civic or public involvement captures his philosophical view of democracy in action. A democracy is only as good as the people who make it, apart from the political structure in place, he once proclaimed in The Public and Its Problems. What he sought to do in his writings and speeches was offer a method of inquiry for revising those ideas that prevented people from understanding exactly which social and political problems required thought and action, and which required remediation and correction. He was truly a public philosopher whose works were aimed at audiences outside of the academy—an important virtue that has rapidly declined over the years.

By applying his own method of inquiry upon his return to America, he recognized the critical importance of getting at the root of racial prejudice and, in his case, how we treat Asian Americans. What needed correction, then and now, is how those “who have claimed racial superiority and who instigated and used race prejudice to maintain their state of superiority” were allowed to get away with it and why education in schools lost sight of its democratic/civic purpose. How is it possible, Dewey asked, to separate the governing constructs of democracy from the social and cultural patterns of the way we live?

So, what did Dewey suggest? He argued that the nation needed to do a better job of promoting a clear understanding of foreign cultures. Despite global communication networks available to encourage understanding, we still remain ill-informed and even less willing to work on this proposition individually. Many of us receive information passively, with the goal of being given certainty of knowledge and guidance on how to act on it, or selectively, with the goal of confirming pre-existing prejudice (problems Dewey certainly recognized). What still persists is an ongoing reluctance to examine critically and question vigorously what needs to be understood for overcoming long-held misconceptions and built-in biases regarding cultural differences.

But perhaps more importantly, Dewey provided a vital clue in his own time that continues to resonate and make sense. What society has never fully come to grips with is the problem of what he called “acute nationalism.” To overcome animosity toward those of non-Western European heritage, we need, in Dewey’s words, a “degree of political internationalism.” In other words, what he argued a century ago was that the biggest obstacle to cultural assimilation is actually not one of race but a reluctance to adjust to different types of culture. This can only occur when a new state of mind is created that is favorably inclined to encourage fundamental changes in political and economic relationships—one which breaks down those cultural barriers currently steering many white or native-born Americans to blame and anger over a supposed “Chinese virus” instead of the embrace of shared humanity in fighting the global pandemic. Such appreciation and willingness, Dewey insisted, would forgo the nationalistic predilections of entrenched political systems existing solely for the preservation of the status quo.

Indeed, in his concluding words, he warned his readers that “the problem of the mutual adjustment to one another of distinct cultures each having its roots deep in the past is not an easy one at the best. It is not a task to be approached in either an off-hand or a querulous mood. At the present moment the situation is not at its best; we may hope in fact that it is at its worst.” Unfortunately, despite what he observed and what he encouraged a century ago, the way we are treating our Asian American citizens today would not make Dewey very happy. His message still remains unheeded.           

Who Won the American Revolution?

"The Battle of Lexington,"  Amos Doolittle, 1775, based on Doolittle's interviews with town residents and militiamen.


The American War of Independence broke out on this day (April 19) in 1775, when 70 Massachusetts militiamen confronted 700 British troops on the Lexington green.  Six years later, the last engagement of the war ended with the surrender of a British army to George Washington’s Continental soldiers in Yorktown, Virginia.  The contrast between the two types of American troops – the citizen-soldiers of the militias and the professional and uniformed soldiers of the Continental Army – was a meaningful one to Americans during the war years, and has remained important ever since. 


Local governments prioritized their own armed forces (the state militias) over the Continental Congress’s army with regard to provisioning. Civilians likewise were more supportive of militia (with provisions and hospitality) because militiamen were locals, whereas Continentals were strangers from distant states. Moreover, militia provided various services for local communities – from regional and town defense to suppressing Loyalist opposition – which Continentals did not. These factors explain why civilians were much more likely to perform their military service in the militia – which they did in vast numbers – than in the Continental Army. As a result, the Continental Army struggled to maintain its numbers and became increasingly populated by socially marginal Americans – men at the bottom rungs of the socio-economic ladder and at the outskirts of society – whereas militias featured a more representative cross-section of the male citizenry.


After the war, Americans overwhelmingly credited the militia for the victory. Not only did militiamen serve as combatants alongside Continentals, they also fought in their localities against Loyalist militias, Britain’s Indian allies, and British foraging and raiding parties. The militia was also the key to Patriot civic control in countless American towns, which enabled Patriots to sustain the Continental Army with provisions and recruits, while denying these invaluable resources to the British Army. In the twentieth century, however, Americans transferred the laurels of victory from the militia to the Continental Army. Thus, when historians and laypeople now consider the Revolutionary War, they focus primarily on the national army’s operations and are generally dismissive of the militias. This is reflected in both academic and popular histories, as well as in museum exhibits, documentaries, literature, and film.


This modern view is supported by the testimony of George Washington himself, who deemed militiamen unreliable soldiers – “men just dragged from the tender Scenes of domestick life,” unaccustomed to military life and to combat, and naturally “timid, and ready to fly from their own Shadows.” Washington also thought that the sudden change in militiamen’s lodging bred physical illnesses among them, and “an unconquerable desire of returning to their respective homes,” resulting in high rates of desertion. “Men accustomed to unbounded freedom, and no controul, cannot brooke the Restraint which is indispensably necessary to the good Order and Government of an Army.”


This question – whether it was the militia or the army that won the war – has never been purely academic.  Rather, this historical question was intimately related to the way Americans organized their political lives in any given era since the Revolution.  The militia and the army are emblems of adversarial administrative systems – the state governments and the national government – that have competed with one another over jurisdiction and authority since the birth of the republic.  In the centuries that followed the Revolution, Americans engaged in fierce contests over the proper roles, jurisdictions, and powers of the Federal and state governments.  The competing narratives about the Continental Army and Revolutionary militias illustrated the political and administrative principles that Americans championed in their various contemporary debates over Federal power and states’ rights. 


Thus, the idea of an effective militia that was the backbone of the war effort served Americans in the early republic as a testament to the efficacy of democratic civic institutions. It taught that the states had led the war effort and won the war, and should therefore take the lead in administering public life in the young republic. By contrast, the twentieth-century narrative of a feckless militia and strong army was a testament to the need for professional expertise to run important executive bodies. It taught that the national government had won the war, and that it should therefore direct public policy.


Federal power and states’ rights were intensely contested issues in the early republic, with the advocates of states’ rights largely winning the ideological, political, and public-relations battle.  It should come as no surprise, then, that Americans in Revolutionary America and the early republic – living as they did in a states’ rights republic – largely judged the militias favorably, as the bulwark of American independence.  By contrast, during the Progressive Era (1890-1930) and increasingly ever since, the United States has transformed into a modern nation-state, in which states’ rights have receded in the face of Federal power.  It makes sense, therefore, that during this time, Americans have shifted their historical understanding of the Revolutionary War, determining that the Continental Army had won the war, rather than the militias.

Sadly, Hatred is Very Much American

"St. Patrick's Day, 1926" 


The anti-hate movement that is rightfully sweeping across America, spurred by unprovoked, violent assaults on Asian-Americans, including the murder of six women in Atlanta, often brands ethnic bigotry as "un-American," but ignores an historical reality: Americans have frequently encouraged hate as an emotion designed to further a political agenda, whether in war or to preserve the nation as a white man's country. There has nearly always been a place for hate in America, provided it is widespread. When, as now, it's not the attitude of the majority, hate may be loudly condemned.

Americans have found what they consider legitimate grounds for hate since the colonial era. "Indians" were always the enemy. Redcoats and Tories of the 1770s were the equivalent of "Japs" and "Wops" of the early 1940s. Pejorative slurs, conveying a hatred of certain nationalities or ideological groups, have been a fixture of American life for centuries. "No Irish wanted here" was not a tongue-in-cheek sign. It meant what it said. "Love thy neighbor" had its limits when it came to different ethnicities at various times.

While the Irish were the butt of American hatred in the first half of the nineteenth century, they gave way to the Chinese, especially in California, in mid-century. As 1900 approached, the hatred was turned against most immigrants, especially from southern and eastern Europe, fostering a full-fledged xenophobic movement. In the 1930s, here in California, it was the "Okies" who were the object of our hatred, to the point of trying to stop their migration at the Arizona border.

In wartime, hate has been an essential rallying cry among Americans. Whether in North or South, the Civil War required an emotion of hate to justify the slaughter of hundreds of thousands of fellow Americans. Hate was an instrument of war in both World Wars, especially the second one. In 1941, Japanese Americans were the object of our hate, so much so that one of the arguments for relocation was to protect them from white Americans. Hollywood depicted enemy German and Japanese soldiers as heartless in films made during World War II, but when the war ended we discovered the German soldier was human, as depicted in "The Young Lions." It took longer to humanize Japanese troops, but that came with Clint Eastwood's "Letters from Iwo Jima."

On occasion, such as now, there is an ideological basis for hatred. A groundswell of anti-foreign hatred on the right is evident in more than the anti-Asian assaults provoked by blaming China for the Covid pandemic. It also has an anti-Hispanic element, seen in the call for a much more formidable wall at our southern border and in the fear of an invasion of Latinos following the inauguration of President Joe Biden. The "crisis at the border" is not just a humanitarian concern. Those on the right who offer that argument also hate the potential "mongrelization" of white America. They justify their hatred of illegal immigrants on various grounds, such as the fear that illegals will give more congressional seats to Democrats and will, eventually, become voters, swinging red states to blue ones. The anti-immigrant crowd doesn't believe that their hatred of the unchecked flood of immigration is a bad thing.

There is also hatred from the left. That was evident among some of the anti-Vietnam War protesters of the 1960s and '70s. In some cases, they expressed their hatred by taunting troops returning from the war, and burned and bombed the hated symbols of American imperialism - banks, government offices, even college buildings. A decade ago, the Occupy Movement expressed its hatred of the rich and powerful by taking over part of Wall Street. The siege of Portland, Oregon, in 2018 and the destruction of public and private buildings there, whether by anarchists or other ideological leftists, was fueled by a hatred of capitalism.

Hate has nearly always had a place in America. At the moment it is out, but when a resurgence of the Covid virus or the possibility of nuclear war dominates our thinking, hate will once again be in. Lieutenant Cable, and Oscar Hammerstein, had it wrong in "South Pacific." Americans don't have to be "carefully taught" to hate. Historically, it's been inherent, one generation after another. The only change has been the target.

Review: "The Third Man: Churchill, Roosevelt, Mackenzie King, and the Untold Friendships that Won WWII"

Canadian Prime Minister William Mackenzie King (r) with, from left, the Earl of Athlone (Governor General of Canada), Franklin D. Roosevelt and Winston Churchill. Quebec City, 1944.

Neville Thompson’s The Third Man: Churchill, Roosevelt, Mackenzie King, and the Untold Friendships that Won WWII (2021) is a worthy addition to the books dealing with the two great victorious leaders, Winston Churchill and Franklin Roosevelt (FDR). Thompson is a professor emeritus of history at the University of Western Ontario, and the chief value of his book is that it is primarily based on the relevant years of Canadian Prime Minister William Mackenzie King’s diary of mind-boggling length--the complete diary runs to 30,000 typewritten pages, the equivalent of thirty-five volumes of Thompson’s The Third Man, itself nearly 500 pages.

 

The diary is important because (in Thompson’s words):

 

No other national leader was so intimate with both Churchill and Roosevelt. Indeed, King arguably knew them both better than they knew each other. No one else had such a privileged view of the evolution and workings of their relationship. . . . Few were so perceptive about them and none left so detailed an account [in his diary] of their circumstances, policy, and interactions. In addition to closely observing the two individuals, King, as the leader of a country vital to their interests, used his position to actively participate in the transfer of global hegemony from Britain to the United States, an outcome not entirely apparent until a decade or more after the end of the Second World War. (p. 2)

 

After an introductory chapter on “The Atlantic Triumvirate,” Thompson’s next two chapters deal with the pre-WWII relations between “King and the ‘Great Genius’ [Churchill]” and “King and the Brave Neighbour [USA].” Mackenzie King became head of the Canadian Liberal party in 1919 and prime minister in 1921, and he continued as the Canadian leader for most of the next quarter century except for the early 1930s. Thus, he had plenty of pre-war opportunities to get to know both foreign leaders.

 

Even before 1919, King had met with Churchill several times, first in 1900.  After meeting with him in England in 1908, King noted that Churchill did not seem as egotistic as earlier, but that self-interest still seemed his main driving force. Months after meeting with Churchill again, in 1934 at his Chartwell estate in southeast England, King wrote in his diary that he was “a truly remarkable man, [a] great genius.” (p. 46)

 

King’s first recorded meeting with FDR occurred only in late 1935, at the White House. This was a few years after Roosevelt had become president and just a few weeks after King had been reinstalled, along with a large Liberal parliamentary majority, as Canadian prime minister. Nevertheless, King’s diary had already recorded impressions of Roosevelt’s policies, which he thought too radical.  Although a liberal, King was more like a nineteenth-century British liberal such as Prime Minister William Gladstone, less bold and sweeping than FDR regarding the government’s economic role.

 

The purpose of King’s trip to the White House was to firm up a trade deal between the U. S. and Canada. Both FDR and his wife, Eleanor, favorably impressed King, and after FDR’s State of the Union address in early 1936, King noted that he “was a brave man . . . with his voice the nation will believe a sincere and good man, determined to help his day and generation.” (p. 58)

 

In the years that followed, King was often impressed by the speeches of Roosevelt and Churchill. For example, after FDR’s State of the Union address in January 1939, King wrote in his diary, “It was the finest thing I have heard anywhere at any time, in the way of comprehensive constructiveness.” (p. 87) Perhaps King’s admiration stemmed partly from his own inability at speech-making. Thompson writes: “King’s greatest distinction was his failure as an orator. No great occasion was ever ennobled by one of his speeches. He had none of the rhetorical skill, the sensitivity to language, the gestures and ringing cadences of Roosevelt and Churchill, to say nothing of their distinct, resonant voices.” (p. 6)

 

Between 1935 and the German annexation of Austria in March 1938, King had a few more meetings each with Roosevelt and Churchill, and even went to Berlin in mid 1937 to meet with Hitler. Like several other foreign political leaders both before and after him--but not Churchill--King thought Hitler desired peace.

 

Thompson’s Chapter Three, entitled “A Visit in the Shadow of War,” runs from the Austrian annexation to the declarations of war on Germany by Britain and France on 3 September 1939, two days after Germany’s invasion of Poland. During this year and a half, in his desire to prevent war, King was more sympathetic to Prime Minister Neville Chamberlain’s appeasement policies than to the bellicose Churchill’s criticism of them. (Although Churchill had held some government positions earlier and was a prominent member of parliament, he did not become prime minister until May 1940, remaining in office until mid 1945.)

 

Chapters 4 to 12 of The Third Man deal with the wartime relations of the three leaders up through FDR’s death in April 1945. The first meeting of all three occurred at the White House the day after Christmas, 1941, several weeks after the Japanese bombing of Pearl Harbor had brought the U.S. into the war. From that first meeting until Roosevelt’s death there would be numerous other meetings, sometimes involving all three but more frequently without King participating. Nevertheless, he often communicated with each of the two leaders and occasionally met one of them without the other being present, either in their countries or in Canada.

 

Continually citing the diary, as well as occasional other sources, Thompson provides King’s observations about the personality and changing health of each leader. King often commented on how much Churchill drank. For example, visiting London in August 1941, the abstemious King “was astonished at the quantity and variety of liquor that Churchill consumed.” (p. 177) During a visit to D. C. in March 1945 (a month after the FDR-Churchill-Stalin Yalta Conference), King jotted down in his diary that Roosevelt has “pretty well lost his spring. He is a very tired man. He is kindly, gentle and humorous, but I can see is hardly in shape to cope with problems.” (p. 391) Months later, in October 1945--with Churchill now no longer prime minister and FDR having been dead since April--King came to London and was told by Churchill that at Yalta Roosevelt was “gone, that there was not life there, that the man was exhausted.” (p. 415)

 

In regard to issues, King was a natural go-between for Churchill and FDR because Canada was part of the British Commonwealth but also shared an extended border with the USA. In the year and a half that Churchill-led Britain was at war with Germany while the U. S. remained neutral, the British leader was desperate for U.S. assistance and thought King could facilitate that goal, which he was willing to do. Thompson cites King diary entries of August 17 and 18, 1940 to indicate that King “thought the purpose of his life was to bring Canada, the United States and Britain together.”  (p. 137) 

 

King was naturally concerned with the role that Canada, as part of the British Commonwealth, would play in aiding Britain’s war effort.  And it did much: Thompson indicates that by the end of the war “over 47,000 [Canadians] had been killed, 54,000 wounded.” (p. 402) Proportionate to population (with the U.S. population being more than ten times that of Canada), Canadian casualties were similar to U. S. numbers. But King resisted Churchill’s efforts to pursue any Commonwealth policies that the Canadian feared might reduce his country’s independence, and he criticized (in his diary) Churchill’s determination to prevent British Empire countries like India from gaining more independence. Thompson quotes King as saying (in mid 1944), “The British Empire and Commonwealth is a religion to him [Churchill].” (p. 326)

 

Other issues that King commented on include the attitudes of FDR and Churchill towards China under Chiang Kai-shek (FDR was more favorable) and the French leader in exile, Charles de Gaulle (both FDR and Churchill found him difficult to deal with). More frequently mentioned were the differences between Churchill and Roosevelt (and Stalin) regarding the timing of the cross-channel invasion, which eventually occurred at Normandy in June 1944. FDR and Stalin wanted it to come sooner, but, fearful of great casualties if it did, Churchill kept delaying it. Partly because of his concerns with Russian expansion into Eastern Europe, the British prime minister placed much more emphasis on attacking German forces in Italy and perhaps other areas reachable from the Mediterranean Sea.

 

King’s comments reinforce the standard historical view that in the latter stages of the war Churchill was much more suspicious and hostile toward Stalin than was FDR. Although King sometimes had his doubts about the wisdom of Churchill’s wartime distrust of Stalin, the Canadian prime minister greatly valued Churchill’s famous hardline “iron curtain” speech (March 5, 1946 at Westminster College, in Fulton, Missouri) and telephoned him to say it was “the most courageous made by any man at any time.” (p. 423) By this time Churchill was no longer prime minister, his Conservative party having been voted out by the British people in July 1945.

 

The last two chapters of The Third Man, 13 and 14, cover the period from FDR’s death in April 1945 until King’s retirement in 1948 and subsequent death in 1950. Among the later topics touched on by King in his diaries (and Thompson in his book) are why the British Labour party was able to defeat Churchill’s Conservatives in its surprising July 1945 victory, King’s dealings with the new U. S. president, Harry Truman, and King’s perspective on the new world order emerging out of the ashes of WWII.

 

Although Thompson relies mainly on the King diaries, he also considers other important books on the two leaders, like those of Martin Gilbert and Nigel Hamilton--although, unfortunately, there is no mention (in the bibliography or index) of historian Lewis E. Lehrman’s Churchill, Roosevelt & Company: Studies in Character and Statecraft (2017).

 

One final point: although Thompson explains some of the political reasons why, early in the century, neither King nor Churchill was sympathetic toward more Asian immigration into Canada, and why neither King nor FDR favored admitting more Jewish refugees to their countries in 1939, none of the three leaders demonstrated the kind of heartfelt concern expressed by the poet W. H. Auden in his haunting “Refugee Blues.”  About the refugee discussion between FDR and King in mid 1939, Thompson does at least write that the topic was “far more tragic than acknowledged.” (p. 91)

More Senators Who Made an Impact, Despite First Being Appointed (Not Elected)

Sen. Sam Ervin (D-NC) chairs the 1973 Senate Watergate hearings.

Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015).  A paperback edition is now available.

 

A previous essay identified several US Senators who were initially appointed to their seats, rather than winning election. In the second half of the 20th century, six other senators achieved historical significance despite originally being appointed on a temporary basis.  

Sam Ervin (D-NC) served in the Senate for more than 20 years, from 1954 to 1974, after originally being appointed on a temporary basis in early 1954. He defended segregation and Jim Crow laws, yet was somehow also seen as a constitutional expert, although he generally referred to himself as a simple “country lawyer.” He opposed the Supreme Court decision in Brown v. Board of Education in 1954, and promoted the Southern Manifesto in 1956, urging defiance of that decision.  He also opposed the Civil Rights Act of 1964, the Voting Rights Act of 1965, the Immigration and Nationality Act of 1965, and the proposed Equal Rights Amendment.

 But this negative record was contradicted by a number of other positions.  While no defender of civil rights, Ervin was a defender of civil liberties: an opponent of “no knock” search laws, invasions of privacy through data banks, and lie detector tests, as well as of a constitutional amendment advocating prayer in the public schools. He opposed making illegally gained evidence admissible in criminal trials, and he was one of the small band of senators who challenged Joseph McCarthy of Wisconsin in the year before McCarthy’s censure by the US Senate at the end of 1954.

Even more significant was his leadership on the Senate Watergate Committee in the late spring of 1973, investigating the background of the emerging scandal which would bring down President Richard Nixon. The Ervin Committee’s investigation would lead to impeachment charges in the House Judiciary Committee, and Nixon’s ultimate decision to resign.  Ervin became a national hero and gained great publicity at the Watergate hearings with his personal charm and Southern drawl and mannerisms. He was suspicious of the abuse of presidential power, a concern that is ever more significant in the 21st century.

 

Walter Mondale (D-MN) served in the US Senate for 12 years, from 1964 to 1976. He was first appointed to finish out the term of Hubert Humphrey when the latter became Lyndon Johnson’s vice president. Mondale had served as Minnesota’s Attorney General from May 1960 to December 1964, and would leave the Senate in 1976 after being elected Vice President under Jimmy Carter, just as his mentor Humphrey had done 12 years earlier.  Like Humphrey, Mondale served one term as Vice President, and like Humphrey, he later ran for President and lost--in 1968 for Humphrey and 1984 for Mondale. However, Mondale is seen as perhaps the most active and engaged Vice President; his service under Carter was as close to a “co-presidency” as America has seen. Humphrey, sadly, was neutralized under Johnson, which undermined his presidential campaign in 1968. Mondale was able to expand the vice presidential role to include being a presidential advisor and full-time troubleshooter for Carter. He held a vice presidential office in the White House and had weekly lunches with the president, a tradition which has continued ever since the late 1970s.

While in the Senate, Mondale became a leader on such issues as consumer protection, fair housing, desegregation, and tax reform. Notably, he was a member of the 1975 Select Committee to Study Governmental Operations with Respect to Intelligence Activities, led by Senator Frank Church of Idaho, which later led to the creation of the Senate Select Committee on Intelligence. After returning to private life following his defeat in 1984, Mondale came back to public service from 1993 to 1996, serving as Ambassador to Japan. Mondale has survived post-vice presidency longer than any other vice president, more than 40 years, sharing with his “boss” Jimmy Carter amazing longevity after service in the White House.

 

Ted Stevens (R-Alaska) served 40 years in the Senate, from 1968 to 2009, and was the longest-serving Republican senator until Senator Orrin Hatch of Utah surpassed him in 2017.  Originally appointed to a vacant seat, he was elected to it two years later and won overwhelming reelection victories until 2008, when he lost in an election that closely followed his indictment and conviction on federal charges of failing to report gifts (the conviction was overturned for prosecutorial misconduct).  He was President Pro Tempore of the Senate from 2003-2007, and chaired several committees during his career, including the Ethics, Rules, Governmental Affairs, Appropriations, and Commerce Committees.  Stevens was also Senate Minority Whip and Senate Majority Whip during the Carter and Reagan administrations, respectively.

Stevens’ key historical role was in the sponsorship and promotion of Alaska’s economic and social development, including the Alaska Native Claims Settlement Act, the Trans-Alaska Pipeline Authorization Act, the Alaska National Interest Lands Conservation Act, and the Magnuson-Stevens Fishery Conservation and Management Act.  He was also notable for his promotion of the Amateur Sports Act of 1978, which resulted in the establishment of the US Olympic Committee. Stevens had earlier promoted Alaska statehood and served in the Alaska legislature during the mid 1960s; he had lost two Senate bids, in 1962 and 1968, before his appointment later in 1968.

 

Howard Metzenbaum (D-Ohio) served nearly 20 years in the US Senate, starting in 1974 by appointment and then by election from 1976 to 1995, after eight years in the Ohio legislature from 1943 to 1951. Born in poverty to an immigrant Jewish family, Metzenbaum became a successful lawyer and businessman, and grew very wealthy through real estate investments in the Cleveland metropolitan area.  He ran for the Senate in 1970 and lost to Robert Taft, Jr. Metzenbaum was appointed to the Senate in 1974, but then lost that year’s Democratic primary for the seat to astronaut John Glenn, who won a general election landslide.  However, in 1976, Metzenbaum defeated Taft, the son of the icon of conservative Republicanism. Metzenbaum would serve in the Senate for the next 18 years until his retirement in 1995. 

Metzenbaum would gain the reputation of being a prominent liberal, highly controversial for his strong convictions and ability to gain and keep the attention of the news media.  He was active on the Senate Judiciary Committee, particularly on the issues of antitrust and consumer protection legislation.  He became a master of the filibuster tactic, and was a leader in the support of abortion rights for women.  Metzenbaum was also strongly pro-labor, advocating legislation requiring warning periods for large factory closures.  He was active in promoting legislation to prohibit federally subsidized adoption agencies from delaying or denying child placement on grounds of race or ethnicity.  He also was a major promoter of the Brady Handgun Violence Prevention Act of 1993, which mandated federal background checks on firearms purchasers.

 

George Mitchell (D-Maine) served in the US Senate from 1980 to 1995, originally by appointment to replace Senator Edmund Muskie, who had been appointed Secretary of State by Jimmy Carter. Mitchell won the seat in 1982 and served two complete terms in the Senate.  Mitchell’s Senate service followed a legal career working for the federal government, culminating in appointment as a US District Court Judge by President Carter. He moved up rapidly in Senate leadership, and was Majority Leader from 1989-1995. He promoted the Americans with Disabilities Act, the Clean Air Act, the North American Free Trade Agreement, and the formation of the World Trade Organization.  He had the opportunity to be appointed to the US Supreme Court in 1994, but passed on it to try to promote President Bill Clinton’s health care legislation, which failed. Mitchell retired at the end of his term in 1995.

But this was not the end of Mitchell’s public career, as he became Clinton’s Special Envoy for Northern Ireland in 1995, brokering the Good Friday Agreement in 1998 that ended the sectarian violence between Catholics and Protestants that had raged since 1969.  Mitchell was nominated for the Nobel Peace Prize in 1998 and awarded the Presidential Medal of Freedom in 1999 for his efforts.  He also was engaged in the Middle East peace process regarding the Arab-Israeli conflict, first under Bill Clinton and later under President Barack Obama, but achieved no long-term results; some parties were suspicious of his involvement due to his Lebanese ethnicity.  Mitchell was also on a short list to be Al Gore’s running mate in 2000, and to be Secretary of State had Gore won (Obama considered him for the same Cabinet post in 2009).  He has served on the boards of several corporations, including the Boston Red Sox and the Walt Disney Company, and was involved in Major League Baseball’s investigation of steroids and performance-enhancing drugs in 2006-2007.  He remains very much involved in public life at age 87.

 

Dan Coats (R-Indiana) served in the Senate twice, from 1989 to 1999 and from 2011 to 2017. The first time, he was appointed to the seat vacated by Vice President Dan Quayle, elected in 1990 to serve the remainder of the term, and then reelected for one full term before retiring.  But Coats decided 12 years later to run for his old Senate seat, and served another term until a second retirement.  Previously, Coats had served in the House of Representatives from 1981-1989.  Coats’s voting record in both the House and the Senate can be described as traditionally conservative.  He was highly respected, however, as a serious legislator. After his first Senate stint, he served as US Ambassador to Germany under President George W. Bush from 2001-2005, after being on a short list to be Secretary of Defense.

After his second Senate retirement, Coats served as Director of National Intelligence under Donald Trump from 2017 to 2019, having earlier served on the Senate Select Committee on Intelligence. Coats often clashed with Trump, as Coats was concerned about Russian meddling in American elections, which Trump denied.  But Coats was courageous in his open criticism of Trump, not only on Russia, but also on North Korea and Iran.  Trump’s pressuring of Ukraine’s president to undermine Joe Biden through innuendo--the controversy that led to Trump’s first impeachment in late 2019--seems to have spurred Trump to announce the dismissal of Coats, because of his clear disloyalty to the Trump agenda and his willingness to be publicly blunt.  There was speculation, never substantiated, that Coats might have been the author of an anonymous op-ed published by the New York Times in September 2018 criticizing Trump’s foreign policy as dangerous to American national security. Coats is now retired, with a public image of having been a serious leader on intelligence, but one unable to sway Donald Trump, who worked to undermine the intelligence community.

 

So these six senators, along with the six discussed in the first article of this two-part series, prove that beginning a Senate career by appointment does not preclude an impactful one.

The Roundup Top Ten for April 16, 2021

This Much is Clear: Derek Chauvin’s Trial Won’t Change Policing in America

by Simon Balto

A historian of policing warns that, while many hope for a guilty verdict, that result, by identifying and punishing "bad" policing, may effectively render legitimate the forms of violence and abuse that are historically part of policing in minority communities.

 

Return the National Parks to the Tribes

by David Treuer

"The idea of a virgin American wilderness—an Eden untouched by humans and devoid of sin—is an illusion" that has hidden the forced removal of Native people from the lands converted to national parks. Native people should tend and protect the land again.

 

 

My Ancestors Were Enslaved—But Their Freedom Came at a Price for Others

by Alaina E. Roberts

Historian Alaina Roberts' work grew out of a family history in which her ancestors were brought to Indian Territory as slaves of Cherokee masters expelled from the southeast, then became landowners as the government erased tribal control of land. 

 

 

Left Behind: The Trouble with Euphemism

by Nancy Isenberg

A historian of white rural poverty says that the cultural phenomenon of JD Vance's book "Hillbilly Elegy" is just the latest deployment of the "left behind" euphemism to obscure the nature of poverty in the United States. The rural poor are and have been part and parcel of the American economic order.

 

 

Why the Hope for Peace is Waning in Northern Ireland

by James Waller

"The Troubles, the decades-long Catholic uprising against British rule starting in the 1960s, began with Catholic frustration over a government that would not leave. If widespread violence returns, it will be because of Protestant frustration over a government that would not stay."

 

 

A Once-in-a-Century Crisis Can Help Educate Doctors

by Molly Worthen

The COVID-19 pandemic has offered valuable lessons on the necessity of humanistic education in the training of medical professionals. 

 

 

Trump, Defying Custom, Hasn’t Given the National Archives Rally Speech Records

by Shannon Bow O'Brien

"Until President Trump, there have been no missing public speeches in the permanent collection. By removing these speeches, Trump is creating a false perception of his presidency, making it look more serious and traditional."

 

 

Stacey Abrams’s Fight against Voter Suppression Dates Back to the Revolution

by Karen Cook Bell

"The roots of Black women’s activism can be traced back to the Revolutionary Era, when thousands of Black women protested with their feet and ran away from their enslavers." This act would shape the demands of radical Black politics in the ensuing decades.

 

 

"Where Perversion is Taught": The Untold History of a Gay Rights Demonstration at Bucks County Community College in 1968

by Marc Stein

A student protest at Bucks County (PA) Community College in 1968, sparked by the college's decision to block a speech by gay rights advocate Richard Leitsch, should be recognized as a key event in the growing movement for gay liberation. 

 

 

The Media will be Key to Overcoming a Senate Filibuster on Voting Rights

by Donald A. Ritchie

"From the Boston Massacre to Watergate, the power of the media became manifest whenever editors and reporters, convinced of the seriousness of their cause, kept a story alive until they forced people to pay attention." TV journalist Roger Mudd kept the story of the Senate's filibuster of the Civil Rights Act in the public eye. 

 

Prosecuting Sedition in a Divided Nation is a Challenge as Old as America

With arrests mounting and prosecutors starting to develop their legal strategies to address the January attack on the Capitol, sedition--the centuries-old concept broadly defined as conspiring to overthrow or destroy by force the government of the United States, or to forcibly prevent the execution of the law--is back in the news.  The frequently referenced stepchild to treason, sedition has long been a dark specter, a sword hanging over the not-really-United States, almost from the beginning of the nation's existence.  Indeed, whether we realize it or not, talk of sedition is but another indication of the divisions that have haunted the distinctive experiment that is the United States since its inception. 

Ironically, for all the language about force and conspiracy, the legislative enshrinement of sedition has most often served as a vehicle for squelching political opposition while testing the limits of free speech.  In its statutory infancy it was part of the infamous Alien and Sedition Acts, which the Federalists enacted in an effort to hang on to their diminishing political power at the end of the eighteenth century. The Sedition Act was aggressively used to try to silence John Adams's political opponents: the act made criticism of the president a violation, but not criticism of the vice president, who just happened to be running for president against the incumbent. Fortunately, Federalist prosecutors had the good sense not to target candidate Thomas Jefferson, but his fellow Republican, Vermont Congressman Matthew Lyon, was not so lucky, earning the distinction of being both the first person prosecuted for violating the act and, after his conviction, the first person elected to Congress while sitting in jail. 

While the original sedition act was allowed to expire upon Jefferson’s ascension to the presidency, the concept remained alive, reemerging when Woodrow Wilson used the newly enacted Sedition Act of 1918 to limit opposition to World War I. Wilson aggressively tried to silence that part of the populace unhappy with the actions of a president who had won re-election on the slogan “He kept us out of war.” In an ironic turn, while fighting the war he said would make the world “safe for democracy,” Wilson used the sedition act to suppress it at home, most famously prosecuting socialist leader and perennial presidential candidate Eugene V. Debs. Those sedition-based prosecutions would lead to Supreme Court cases that offered the first significant clarion calls concerning free speech in the United States, albeit as dissents by Justices Oliver Wendell Holmes, Jr. and Louis Brandeis.    

The concept, not to mention the laws still on the books, has reappeared on the public's radar as prosecutors have raised the possibility of charging at least some of those involved in the January 6 attack on the Capitol with sedition. And yet those discussions should remind us of the complicated nature of the concept and the law.  A look at recent cases offers some interesting clues to how sedition cases can proceed, as well as the elements that may resonate with juries. The 1988 Ft. Smith, Arkansas, case stands as a cautionary tale for those who believe that sedition is the way to proceed against the January 6 terrorists.      

The Ft. Smith trial took place over the course of seven weeks, from February 16 to April 7, 1988, when the all-white jury of ten men and two women returned a verdict of not guilty against the fourteen individuals who were charged with conspiring to overthrow the U.S. government.  It was a multi-faceted trial that heard from almost 200 witnesses--113 for the prosecution and 79 for the defense--as federal prosecutors presented evidence that sought to prove that ten of the defendants had conspired and plotted to overthrow the federal government.  In addition, they asserted that the other defendants were guilty of trying to kill a federal judge and a Federal Bureau of Investigation (FBI) agent.  According to the scenario laid out by assistant U.S. attorney Steven N. Snyder, it all began at a July 1983 meeting in Hayden Lake, Idaho, when leaders of a number of racist and neo-Nazi groups began planning and implementing a program that would ultimately include robberies, bombings, and murders as part of a conspiracy to create a white Aryan nation in the United States.

The defendants were a geographically diverse but like-minded group that included multiple Ku Klux Klan grand dragons as well as leaders of other white nationalist groups, all allegedly intent on overthrowing the U.S. government. But in their efforts to prove that the diverse cast of alleged conspirators had met and developed such a plan, the prosecution seemed to get bogged down in the more than 100 specific and overt acts which the group had allegedly committed. In the end not only were there no convictions for conspiracy, but fourteen men went free--or at least as free as they could be, given that five were already in prison for previous offenses.

District Court Judge Morris Arnold worked hard to keep the trial on track, but between the extensive number of charges and the numerous defendants—some of whom delivered their own opening statements—it was not your everyday event.  When the prosecution’s systematically presented scenario, a veritable roadmap detailing the meetings and actions that constituted the alleged conspiracy, ran head-on into one defendant’s assertion that the issue at hand was not conspiracy but rather freedom of speech--and that the group’s professed effort to replace one government with another was little more than democratically-based free speech--the jury returned its verdict of not guilty.

The trial results led to many post-verdict interpretations, analyses, and responses. The defendants saw it as a victory for religious freedom in the United States.  Some commentators maintained that prosecutors had placed too much faith in witnesses whose credibility and motives were suspect. Others saw the verdicts as a disturbing reflection of the nation’s changing cultural landscape, a “real setback in the war against organized hate.” It certainly represented a hurdle in efforts to address the increasingly aggressive white nationalist movement, one whose profile and outreach would only continue to grow, culminating, it now appears, in the January 6 attacks. 

Yet as we look to Ft. Smith as a guide for any January 6 prosecutions, we must recognize that the advent of social media leaves prosecutors in uncharted waters.  The evidence provided via social media is a massive improvement over the questionable witnesses who were so central to the Ft. Smith prosecutors’ case.  Cellphone pictures and preliminary reports, including those used in the Trump impeachment proceedings, paint a very different picture from previous efforts. 

But what does that really mean?  Convictions for sedition carry significantly tougher penalties, including jail sentences of up to 20 years, and the pursuit of such charges would leave no doubt about the seriousness with which the government, representing the American people, took what happened on January 6.  And yet with admittedly lesser, but more easily provable, charges available, is it worth the risk of yet another unsuccessful sedition prosecution, one which might only serve to embolden both those who are found not guilty and their followers--especially since so many profess simply to have been doing their patriotic duty, following the wishes of their commander in chief?

The frustration officials felt in the aftermath of Ft. Smith was only reinforced in 2010, when the government brought an unsuccessful sedition case against the Hutaree militia, a so-called Christian patriot group.  And in neither the Ft. Smith nor the Hutaree militia trials were prosecutors dealing with defendants who could point to the President of the United States--one whose own actions were scrutinized as part of the same effort, only for him to be acquitted in an impeachment proceeding--as a central part of their defense.  For better or worse, as it has for literally hundreds of years, the specter of sedition looms over the American populace, its very existence and potential power a reflection of the divisions from which its original enactment sprang and which its currently contemplated use all too clearly reflects.  But it may not be the best way to solve the problem.  

Can Space Exploration Restore American Faith in Science?

Yuri Gagarin prepares for the first manned space flight, April 12, 1961

 

April 12 marks both the 60th anniversary of the first manned space flight and the 40th anniversary of the first U.S. space shuttle launch. These anniversaries might pass with little fanfare, but as NASA hopes to put the first woman on the moon by 2024 with its Artemis mission, Americans might want to ask themselves how so much of the wonder of space exploration has faded in just six decades — and faith in science and our institutions with it.

On so many levels, the era that produced American space travel will seem like a foreign country, even to those who once lived there. This was a time when massive government expenditures for new projects and bureaucracies were not only unremarkable but exciting. It was a time when the public and private sectors were tightly enmeshed, with the public sector and the guiding hand of the government dominant, and no one thought to call it socialism. It was a time when we implicitly trusted our government.

The space program was a symbol of our incredible faith in both our government and science. The scourge of polio had just been conquered as Sputnik brought new fear into our lives, and our faith in the government and scientists to protect us was natural. Americans would breathlessly listen to reports from an American scientist with an incredibly thick German accent and a name that suggested he was a recent immigrant, Wernher von Braun. His rehabilitation from Nazi to German-American was a simple matter of suspending our disbelief, which was easy because we had faith in our institutions. Whether it was curing polio or conquering space, Americans believed that men in white coats would point the way to a better future. Even with the breathtaking progress of research into the pandemic, American faith in science is far from its halcyon 1950s/1960s high.

The Challenger tragedy shattered our dreams, but more than three decades later, there are few signs America is ready to once more trust or even take seriously the science of space flight.

Today, we do not have the patience to follow the advice of scientists in the face of the greatest health crisis America has ever faced. We do not have the political will to make investments that will pay off in the future. So how will we be able to delay our gratification to make the kind of wise, patient investment that the space program needs? John F. Kennedy made a promise in 1962 that NASA kept in 1969, just under the wire. What promises will Americans make to our future selves?

I would guess that a lot of Americans, if asked to think of the future of space travel, would think of Elon Musk, and just maybe Virgin Galactic — both of these have a certain futuristic flair to them and a lot of profit motive. The incredible image of the earth in space, the “big blue marble” taken on December 7, 1972, briefly united a world divided by Cold War and hot wars, a world haunted by the prospect of impending famine and disease. That image reminded us that we were all united after all. That sight could soon be available to the highest bidders as a selfie background. It remains to be seen if a celebrity influencer could have the same gravitas as an astronaut. It won’t be easy to remind us of our shared humanity and the miracle of science.

Space tourism and space hotels, a space entrepreneur who seems part Tony Stark, part P.T. Barnum — this is what Americans think of the future of space. While the private sector focuses on such frivolity as sending a Tesla to space, the scientists, mathematicians and engineers trudge on, mapping the trajectories and designing the spacecraft and software, and waiting for the stars to align themselves again. The astronauts train, dreaming of the chance to go up. 

Our rocket scientists are some of the most brilliant people in the world. Instead of getting rich — and please notice that the parking lots of their workplaces are not filled with luxury cars — they are using their brilliance to take mankind farther than it has gone before. And the astronauts? They are part rocket scientist, part superhero. The best of the best in so many physical, mental, and emotional categories. They risk their lives for the dream of exploration. They used to be household names and international heroes. Maybe they can be again.

 

Making Religious Peace in Afghanistan

The Hanging, Jacques Callot, ca. 1632.

Nearly twenty years ago, on October 7, 2001, the United States, supported by a broad international coalition, started what the Bush administration called a War on Terrorism; it began with an aerial attack on Afghanistan, whose radical Islamist Taliban government had harbored those responsible for the devastating 9/11 attacks. Two weeks earlier President George W. Bush had described the coming conflict as the necessary response to “a new kind of evil. . . . This crusade, this war on terrorism is going to take a while, and the American people must be patient.”

The President’s aides quickly tempered Bush’s rhetorical impulse, insisting that this was not a “crusade” against Islam, but a defense of freedom and democracy. Still, Bush was right in at least one respect: The War on Terrorism has taken a while. At this point, we still don’t know how and when our war in Afghanistan will end, but the Taliban and “radical Islam” are still generally considered to be the principal obstacles to peace. As the Biden administration struggles with the questions of whether and how soon to withdraw the remaining US military forces from Afghanistan, it is critical to recognize the religious dimensions of our “forever war” and to accept the challenge of making religious peace possible.

The day the aerial attack on Afghanistan began, Andrew Sullivan declared, in an essay in the New York Times Magazine, “This is a Religious War,” not unlike Europe’s religious wars. As a scholar of Europe’s religious wars, I appreciated Sullivan’s sense of historical recognition, which is still useful today. The problems and enmities that underwrote Europe’s religious wars as well as the War on Terrorism were religious in the sense that the forces in conflict recurrently and often insistently identified their enemies in terms of religious ideas, behaviors, or affiliations. While some observers framed the War on Terrorism as a global struggle between Islam and the (Judeo-Christian) West, Sullivan framed it as a “war of fundamentalism against faiths of all kinds that are at peace with freedom and modernity.” After twenty years of “religious” war, however, religious fundamentalism has not been defeated in Afghanistan. It’s long past time to make religious peace.

But how do we shift from prosecuting religious war to making religious peace? Here the historical analogy with the religious wars in Europe is particularly useful. During more than a century of intermittent and increasingly destructive religious wars, Europeans learned to accept and manage their religious differences, thereby establishing the foundations of modern religious pluralism. This European religious peace, which I have described as complex and messy, has since been disrupted by revolution, nationalism, authoritarianism, and world war, but so far it has survived even the mass religious migrations of the last decades without descending to the coordinated destruction of religious war.

To learn anything useful from this history, however, we must shift our focus from contentious ideas to political action. Ideas, theologies, and ideologies provide useful clues for understanding the motives and intentions of those who prosecute wars, but it is a much broader array of political actors and actions that make war and peace possible, as often as not quite unintentionally. This is because the outcomes of large historical processes – like the cycles of religious conflict, violence, and war evident in early modern Europe and in the world today – are the product of contentious human interactions, which do not yield clear winners and losers. Indeed, European history shows that if the essential foundation of religious war is ideological intransigence, the essential foundation of religious peace is political compromise.

During Europe’s Age of Religious Wars, most of the wars ended with a political compromise that took the form of a truce, an edict, or a treaty. From the First National Peace (Landfrieden) in Switzerland in 1529 to the Peace of Westphalia in 1648, each of these agreements was founded on three essential principles: mutual recognition, security guarantees for all parties to the agreement, and mediation to prevent the escalation of future disagreements into the coordinated destruction of war. Many of these political compromises failed when they were not accompanied by the demilitarization of the contending religious parties, but the ones that were successful, like the Peace of Augsburg, the Edict of Nantes, and the Treaties of Westphalia, earned the grudging consent of those who controlled the means of coercion and warmaking. That consent, however grudging and implicit, entailed the recognition that war was the problem, not the solution to the “problem” of religious difference, and that durable religious coexistence or diversity was the necessary condition for a more peaceful future.

Formal peace agreements can end wars – what we might call negative peace – but they do not suddenly create new, more peaceful conditions on the ground – which we might consider positive peace, or that which makes peace much more than the absence of war. The durable forms of religious coexistence that were the foundation of Europe’s religious peace emerged prior to, survived during, and were already firmly in place at the end of the military conflicts. And they had been created by a motley crew of political actors: often intolerant rulers and frequently dissenting subjects as well as competing claimants to religious authority and external allies and enemies. What the peace settlements did was to validate diversity that already existed for some time and then, over time, to protect that diversity in law and political institutions, both for groups and individuals.

In early modern Europe, the religious diversity that many considered problematic or even unacceptable was the legacy of a long-term process of religious fragmentation or pluralization within Christianity that began with the Reformation and was consummated by the permanent diminution of papal power. Thus, their religious wars did not represent a global struggle between Protestantism and Catholicism but were a myriad of struggles among fragmented communities of Christians and the ethnicities and political formations affiliated with them.  

Similarly, in our current cycle of religious violence and war, in the Middle East, North Africa, and South Asia, and in Afghanistan, in particular, the religious diversity that many consider problematic or even unacceptable is the legacy of a long-term process of religious fragmentation or pluralization within Islam that began with religious fundamentalist criticisms of the Ottoman Empire in the eighteenth century and was consummated by the abolition of the Ottoman caliphate in 1924. Thus, the religious struggle in Afghanistan is not merely an echo of a global struggle between Islam and the West or between fundamentalism and more tolerant faiths, but a local struggle among many fragmented communities of Muslims and the ethnicities and political associations affiliated with them, including the Taliban and the current Afghan government.

What this means for Afghanistan is that making religious peace is both straightforward and an enormous political challenge. The goal is not to defeat a fundamentalist ideology, but to broker a political agreement that validates religious diversity and protects that diversity in law. The Taliban have already stated that religious diversity can be protected under Islamic law, although their interlocutors do not trust their sincerity. But trust is not necessary; neither is religious dialogue or reconciliation. What is essential is that mutual recognition, security, and mediation be built into a political agreement that entails the demilitarization of the religious parties to the agreement and the protection of religious diversity in law.

Without recognition of the religious dimensions of the conflict and explicit protection of religious diversity, including the fundamentalism of the Taliban, peace is likely to elude us once again. But one thing all sides need to accept, however grudgingly, is that more war will accomplish nothing.

Gordon Liddy and the Greek Connection to Watergate

Did this Democratic National Committee filing cabinet contain damaging evidence of an illegal contribution to the 1968 Nixon campaign by the Greek military dictatorship, rooted out by journalist Elias Demetracopoulos?

 

Obituaries of G. Gordon Liddy, appearing one year shy of the 50th anniversary of the bungled Watergate break-in that he masterminded, remind us how history morphs with new information and changing attitudes.

Just what was Liddy seeking in the offices of Democratic National Committee chairman Larry O’Brien? It remains a mystery. After he transformed himself from a tight-lipped, Hitler-admiring political operative to a self-promoting right-wing media entertainer, Liddy embraced some far-fetched conspiracy theories touted by Nixon admirer Roger Stone and others. These fanciful tales were designed to implicate others in the Watergate crimes and exonerate Nixon and himself.

But in his more contemporaneous 1980 autobiography, Liddy said the purpose of the June 17 break-in, as ordered by Nixon aide Jeb Stuart Magruder, was to “find out what O'Brien had of a derogatory nature about us, not for us to get something on him or the Democrats.”

According to then-White House counsel John Dean, it was a fishing expedition. Magruder told Liddy to “photograph whatever you can find,” and Howard Hunt, Liddy’s political sabotage co-conspirator, told the burglars to “look for financial documents—anything with numbers on them,” especially if it involved “foreign contributions.”

That search would likely have included evidence O’Brien possessed concerning a large illegal transfer of cash, nearly $4 million in today’s dollars, from the Greek dictatorship to the 1968 Nixon campaign. The bagman for that payoff was Greek-American tycoon and uber-GOP fundraiser Tom Pappas, later named on the Watergate tapes as “the Greek bearing gifts.”

Pappas had convinced the Greek military junta, which had recently overthrown its democratic government, that underwriting the Nixon-Agnew campaign would be a good investment. The whistleblower in 1968 was Elias Demetracopoulos, a controversial and scoop-hungry Greek journalist whose exposes had so angered American officials that the CIA and State Department had long tried disinformation campaigns to destroy his reputation. After the junta took over in 1967, Demetracopoulos escaped to the United States. Changing from journalist to activist, he sought to generate American opposition to the junta, working to restore Greek democracy from Washington. After Nixon’s running mate Spiro Agnew endorsed the junta in September 1968, breaking a promise of neutrality, Demetracopoulos investigated, uncovered the secret Greek money trail, and met with O’Brien twice in October at the Watergate, trying unsuccessfully to get him to expose the plot.

At the time, support was soft for all three candidates: Nixon, Hubert Humphrey, and George Wallace. This lack of enthusiasm meant a higher-than-usual possibility of last-minute switches spurred by a late campaign disclosure. The Greek money revelation would have exploded the so-called “new Nixon” image campaign. That alone could have changed the outcome of the second-closest presidential contest of the 20th century. The victory margin was less than one percent; a shift of fewer than 42,000 votes in just three states would have thrown the outcome into the Democratic-controlled House of Representatives.

The Nixon people knew fragments about Demetracopoulos’s 1968 disclosure to O’Brien. For years it caused them great anxiety. In 1969, Jack Caulfield, handling wiretaps and political surveillance for Nixon, sent John Ehrlichman a confidential memorandum titled “Greek Inquiry.” In 1970, John Dean became aware of anti-Demetracopoulos smears. In July 1971, Elias testified before the House Foreign Affairs Committee against the Greek dictatorship--and against Pappas.

In that hearing, he hinted that he would disclose more about the 1968 money. So, Pappas and his allies tried different approaches to get both the Greek and American governments to attack Demetracopoulos. Nixon’s hatchet man Charles Colson took Elias to lunch in September 1971 to warn him not to criticize Pappas. Pappas menaced him directly. In early 1972, Attorney General John Mitchell publicly threatened Demetracopoulos. He tried to have him deported, which would surely have led to his torture and likely death.

The evidence Demetracopoulos gathered and presented in 1968 was still in O’Brien’s files in June 1972, and there is strong circumstantial evidence that it was part of what the burglars were looking for in the Watergate break-in. Liddy told me he was aware of Demetracopoulos, and Liddy’s co-conspirator Hunt knew him from Hunt’s days with the CIA in Greece. Jeb Magruder admitted to historian Stanley Kutler that information on the Greek money would have been part of what the burglars were seeking. Years later, Harry Dent, White House counsel and an architect of Nixon’s Southern Strategy, told Demetracopoulos that the Greek-money origins of Watergate “makes sense.” Congressional investigations to explore these connections were repeatedly blocked. Before we are inundated with memorial reflections on the significance of the events of June 1972, we should consider that the roots of Watergate likely extend back to the 1968 Nixon campaign. Timely disclosure of the 1968 illegal money transfer could have meant a Hubert Humphrey victory--meaning no President Nixon, no Watergate break-in, and a different course of history.

A Reason Republicans May Not Wish to Proclaim Themselves the Party of Lincoln

The Republican Party may soon live up to its moniker, “The Party of Lincoln,” though not in a way that bodes well for the GOP.  Abraham Lincoln, of course, was the first Republican president.  While he is heralded today as one of our greatest chief executives, he was never very popular in his own day.  Nor was his party.

It is worth remembering that the Republican Party was not a “national” party from its inception; it drew its support exclusively from the North.  Born in 1854 in Jackson, Michigan, the party in its earliest days was organized around opposition to the extension of slavery into the western territories.  Concern over slavery in the West erupted in the wake of the Kansas-Nebraska Act of 1854, which opened up those two territories to slavery after over thirty years of prohibition there.  Energized over containing slavery, the party attracted anti-slavery activists, Northern Whigs, and ex-Free Soilers, who similarly wanted the West kept as “free soil for free men.” 

Early Republicans knew that their fledgling party needed to broaden its appeal beyond the slavery issue.  So they advocated support for internal improvements, what today we would call infrastructure.  As New York Tribune editor Horace Greeley wrote in 1860, “An Anti-Slavery man per se cannot be elected.”  But, “a Tariff, River-and-Harbor, Pacific Railroad, Free Homestead man, may succeed although he is Anti-Slavery.”  Containing the “peculiar institution” alone would not be enough to secure victory, but adding other planks to the party’s platform could result in electoral success.

For obvious reasons, the Republican Party held no appeal for southerners.  During the 1860 election, Lincoln’s name did not even appear on the ballot of ten southern states.  Although he was able to amass a majority of the electoral college votes, Lincoln won only 39.7% of the popular vote.  Six out of every ten Americans voted for someone else, as the nation descended into civil war.

Four years later, some Republicans wanted to dump Lincoln for Salmon P. Chase or John C. Fremont.  While Republicans renominated Lincoln, they replaced Hannibal Hamlin as the vice presidential nominee with Andrew Johnson, a War Democrat.  Republicans knew their base alone would not be enough to secure victory.  The party of Lincoln had such limited appeal that Lincoln himself needed the support of Democrats to win reelection.

Fast-forward to the present: one hundred and sixty years later, the Republican Party risks again becoming a party of limited appeal.  Under Donald Trump’s leadership, a deep fracture has grown within the GOP.  While a majority of Republicans remain loyal to the former president, some have grown weary of his mendacious ways.  Even after election results were counted and recounted, Donald Trump will not concede defeat.  And as he continues to falsely claim victory, he demands that Republicans similarly proclaim “the lie.” 

Even before the election, fissures in the Grand Old Party were apparent.  Some Republican leaders, especially those who had retired or were about to retire from office, rejected Trump.  It was a strange and telling moment when Ohio’s former governor John Kasich, a life-long Republican, spoke at the Democratic National Convention in support of Joe Biden’s candidacy.

Nonetheless, the influence Trump has over Republican office holders is so great that the vast majority do not dare suggest that the emperor has no clothes—that Trump lost the election fair and square.  Instead, they embrace and perpetuate the lie.  To do anything less would invite Trump’s wrath and a primary challenge.  Even after January 6, only a small number of Republicans in Congress have shown the political will to challenge Trump’s deceit.  So they voted again for acquittal when he stood trial for inciting the Capitol riot.  In their shortsighted effort to save their own political skins, those Republicans are fundamentally transforming their party. 

In stark contrast to the majority of House Republicans who supported efforts to overturn the Electoral College vote, ten Republicans voted for impeachment.  In the Senate, though the handful of Republicans who mustered the courage to vote to convict might seem small, they represent a growing number who reject Trumpism and the lie.  Once alone, Mitt Romney was joined by Lisa Murkowski, Susan Collins, Ben Sasse, Pat Toomey, Bill Cassidy, and Richard Burr.  Indicative of the depth of Republican divisions, several of them were censured by their own state party committees. 

The Republican electorate is similarly split.  A full three out of every four Republicans believe that there was widespread voter fraud in 2020, handing the election to Biden.  Though most Republican voters remain loyal to Trump and his lie, a growing number are re-assessing their fealty, repulsed by the events of January 6.  For them, there was no steal to stop.  They would not believe the lie.  After all, “you can’t fool all of the people all of the time.” 

Such divisions within the Republican Party threaten to devastate the GOP.  A party that has won the popular vote only once in the last eight presidential elections can ill afford its present fracture.  Trump’s lie threatens to fatally handicap the party of Honest Abe.  As Lincoln warned years ago, “A house divided against itself cannot stand.”

Fri, 23 Apr 2021 02:37:05 +0000 https://historynewsnetwork.org/article/179850 https://historynewsnetwork.org/article/179850 0
The Seductions and Confusions of Genealogical Research

For a long time, I thought that researching family history was a dubious pastime, and one fraught with peril when undertaken for the purposes of ancestor-glorification and ego-gratification.  Should you have a forebear by whom you set great store – as my Aunt May did by Philip Alston – you may well learn many disreputable things about him, of which owning slaves is only one.

That didn’t stop May from pursuing pedigrees on my behalf.  I remember being told as a teenager that she had filled out a chart in my name, detailing a lineage that would qualify me to join not only the Daughters of the American Revolution but also the Daughters of the Confederacy.  This was not how I pictured my future and I told my father, none too politely, to forget it.

Yet somehow this document survived – I found it among the other papers in the Pile. Labeled D.A.R. ANCESTRAL CHART, it diagrams a branch of my father’s family, starting with his name, Richard Griffin Banks, and working backwards in time through a Major Edwin Banks and a Dr. Richard G. Banks. 

This wasn’t the kind of rabbit hole I had any intention of going down.  Until for some reason it was.  Richard Griffin Banks is an unusual name.  Maybe I wasn’t ready to track my father on a genealogy website, but why not just Google him and see what I found?  Several hours later I was following the Internet trail of a Confederate Army Surgeon named Richard Griffin Banks.  Could this be my father’s great-grandfather,  the Dr. Richard G. Banks from the Ancestral Chart?

As my morning slipped away, I pursued Dr. Banks through 38 entries in my search results.  I learned that he was a trustee of a public school in Hampton.  I learned that at one point he became embroiled in a dispute over a school budget, which led to his being assaulted with “horse whip and pistol” by C.J.D. Pryor, a teacher at the school. 

At that point I clawed my way out of the ancestry rabbit hole for the time being – but not before taking note of a line in the Richard Griffin Banks entry on the “Deceased Banks . . .” website:  “Unclear why he was born before the marriage date of parents.”  

What started as an idle pastime – Googling my father’s name – produced several surprises. It was of no particular consequence to learn that my great-great-grandfather may have been born out of wedlock.  But I was shocked to come across the information that he had owned seven slaves.  It wasn’t surprising that my planter ancestors would have been slaveholders, but this great-great-grandfather was a doctor.  I didn’t know — though I have since learned — that households owning small numbers of slaves were not unusual; nearly half of the Southerners who owned slaves held fewer than five.  

According to a website compiling “All Deceased Banks & Bankses Persons of European Origin in the U.S. . .” Dr. Banks’s Hampton, Virginia, house was burned down during the Civil War and the family was forced to flee, saving only a pair of silver candlesticks.  This colorful detail comes from the records of a Mrs. James Banks and may or may not be apocryphal.  (And if it IS true, what became of those candlesticks?)

I take note of the qualifying “of European Origin” in the webpage title.  In the 1840 census, Dr. Banks’s household consisted of “1 white male, 1 white female, and 7 slaves.”  In 1840, enslaved men and women were not listed by surname. But if they were eventually assigned the last name of Banks, as was common, it must have seemed important to the compiler of the genealogy to exclude them from the white Bankses.  


Read more about Ann's Confederates In My Closet on her website. 

Fri, 23 Apr 2021 02:37:05 +0000 https://historynewsnetwork.org/blog/154487 https://historynewsnetwork.org/blog/154487 0
What Will be the Terms of Racial Forgiveness in America?

Memorial "Dedicated to those known and unknown who lost their lives in the Elaine Massacre" of September 30-October 7, 1919. Dedicated September 29, 2019.


Dietrich Bonhoeffer, the German martyr and theologian who fought against Hitler and the Nazis in his native country during the 1930s and early 1940s, defined "cheap grace" as grace without cost, as "grace we bestow on ourselves." I hear many whites today defend themselves against the fearsome appellation of "racist" by announcing that they would never think of practicing the kind of behavior shown in recent years across this country by white nationalists. Yet having taken that position to convince themselves they are not racists, they nonetheless keep only white friends, attend virtually all-white religious institutions and social clubs, live in all-white neighborhoods, and do business almost exclusively with white partners, associates, and contacts. When we whites vindicate ourselves from racism this way, we commit the equivalent of bestowing exoneration upon ourselves.

The bar is far higher if we whites are ever to have any right to receive actual forgiveness from African-Americans for our white racism, both inherited and practiced. So what are the criteria to begin the process – merely to begin the process – that can help lead us whites toward a realistic chance for such forgiveness and, ultimately, toward a path of racial reconciliation with our Black brothers and sisters?

First, we must thoroughly acknowledge, through demonstrable and sincere acceptance, the unvarnished racial prejudice and history of 400 years in the United States – without any deflective excuse from filiopietism (excessive veneration of ancestors, the past, and tradition), too often employed to soften the effects of those 400 years. We whites must, by unfettered actions and resolute conviction, accept the truth of our illegitimate and evil white American domination of African-Americans, as manifested in our "damaged heritage": that essence of American white racism, passed on from generation to generation through ingrained, prejudicial customs and traditions, sometimes codified into law, and historically combined with frequent, gratuitous, and often severe violence and repression perpetrated by American whites against African-Americans. We whites know what has happened through our racial subjugation of African-Americans, and we have no defensible or legitimate reason whatsoever to excuse it by resort to prideful mythologies and endless genealogies.

In a related thought, the philosopher Soren Kierkegaard wrote 170 years ago that one generation does not learn anything "genuinely human" from a past generation; in other words, each generation must learn anew those qualities that constitute the "genuinely human." Yet the qualities that can lead us whites to the "genuinely human," and away from our adherence to racism, are not the standards we have habitually applied to the treatment of Blacks – we have relied instead on customs, traditions, skin color, accents, and history. The "genuinely human" is deeper, more fundamental, instinctual: it wills a connection between us, Black and white, to understand, to empathize, to reconcile, to love, to co-inhere, to step into another's shoes and be that person. There can be no reservation about the need for whites to take this additional and highly significant step.

Still, why does any white in America need to be forgiven for past racism?

Because we carry a self-destructive legacy that we routinely do not even recognize, derived from white privilege and pure white domination. Since this legacy has been with us so long and has been so thoroughly ours, most white Americans cannot perceive its corrosive effects or its consequences for others and for ourselves. While it may not have been apparent, whites have always known, even if unable to admit it, that in truth we can only be cleansed of that legacy and its evil nature by those we made our victims: the Black brothers and sisters among us. We simply do not have the resources to perform the forceful act of truly forgiving ourselves for the accumulation of such evil, displayed through the white privilege and white domination we foisted upon African-Americans for generations.

Notwithstanding the indispensability of this cleansing of racial legacy, the steps called for here must be unilateral steps taken by whites, taken without expectation of anything immediate in return, solely to demonstrate a sincere and personal mission to empathize, to reconcile, to love, to reach out to those we have egregiously and continuously harmed in body, spirit, and mind as an inhuman cudgel of national, white policy and practice.

Fri, 23 Apr 2021 02:37:05 +0000 https://historynewsnetwork.org/article/179852 https://historynewsnetwork.org/article/179852 0
60 Years Later: The Enduring Legacy of the Bay of Pigs Fiasco


On April 17, 1961, a CIA-trained force of fourteen hundred Cuban exiles landed at the Bay of Pigs; within seventy-two hours, all of them had been captured or killed. In the aftermath of this fiasco, critics often asked how someone as smart as John F. Kennedy could have approved what some have described as the “perfect failure.” But there was a certain inevitability about the entire Bay of Pigs operation. Kennedy hoped to deliver on one of the key promises from his presidential campaign – to remove the cancerous communist growth 90 miles from Key West. Kennedy was determined to reverse Dwight Eisenhower’s “lethargic” foreign policy and saw a chance to do so within three months of his inauguration. A successful overthrow of Castro would have been a signal that American complacency had been replaced with a renewed “vigor,” a favorite term from the New Frontier. Toppling Castro would fulfill Kennedy’s inaugural pledge to “pay any price, bear any burden . . . support any friend, oppose any foe to assure the survival and the success of liberty.” More directly, it would fulfill his promise to “let every other power know that this hemisphere intends to remain the master of its own house.”

There were several consequences stemming from Kennedy’s failure at the Bay of Pigs. Some were significant, others less so. Allen Dulles was removed as CIA Director seven months after the failure of Operation Zapata. Kennedy told Dulles, “Under a parliamentary system of government it is I who would be leaving office . . . but under our system it is you who must go.” While it has become one of the main talking points of the post-Bay of Pigs, pro-Kennedy narrative, it is nonetheless true that he became more suspicious of expert advice, including from the military and the intelligence community. Kennedy’s speechwriter and alter ego Ted Sorensen recalled Kennedy saying to him, “I got where I am by not trusting experts. But this time I put all my faith in the experts and look what happened.”

Another repercussion from the Bay of Pigs turned out to be a boon for future historians, as President Kennedy secretly installed a tape-recording system in the White House to make sure that he, and he alone, would have important discussions “on the record.” Some advisors who favored the invasion claimed in off-the-record discussions with reporters that they had opposed it, and this duplicity irked Kennedy. He apparently intended to use these recordings to write a memoir someday.

But the most important consequence of the failure at the Bay of Pigs was Kennedy’s decision to intensify covert efforts to topple the Castro regime. The President placed his brother, Attorney General Robert Kennedy, in charge of the effort, codenamed “Operation Mongoose,” straining the concept of plausible deniability to the breaking point. Mongoose was designed, according to Robert Kennedy, to “stir things up” with “espionage, sabotage, general disorder.” But it also involved eliminating Castro by any means necessary. At least eight attempts were made on Castro’s life, with the CIA enlisting the help of American organized crime to do its bidding.

Attorney General Robert Kennedy viewed the Bay of Pigs as an “insult that had to be redressed” and he pressured a sclerotic bureaucracy to ensure that Mongoose received all the funding it needed to carry out its campaign to topple the Castro government. The assassination element of the campaign saw the CIA develop a variety of means to eliminate Castro including various poisons and exploding seashells designed to lure the curious scuba diving dictator to his death.

Many veterans of the Second World War considered assassination to be a legitimate weapon. The United States military had targeted the commander of the Imperial Japanese Navy, Admiral Isoroku Yamamoto, when decoded intercepts revealed his flight plans, while the British had trained the assassins of the “Butcher of Prague,” SS-Gruppenfuhrer Reinhard Heydrich. The Cold War’s ever-present threat of “mutual assured destruction” lent further credence to the idea that assassination was a legitimate “tool” in the nation’s arsenal.                              

Operation Mongoose continued for the entirety of the Kennedy presidency, despite Kennedy’s “no invasion” pledge to Nikita Khrushchev during the missile crisis of October 1962.  Mongoose became one of the largest covert operations in the CIA’s history, involving some 400 agents and an annual budget of over $50 million. Kennedy’s successor, Lyndon Johnson, shut Mongoose down in April 1964, later observing that the United States had operated a Murder, Inc., in the Caribbean.                                                                                                  

Operation Mongoose was publicly revealed in 1975 by a Senate committee chaired by Idaho Senator Frank Church that examined abuses of power by the CIA and the FBI. Only a handful of high-ranking U.S. government officials out of the one hundred and eighty-nine million Americans alive in November 1963 were aware that assassination had been adopted as a tool of American foreign policy by Dwight Eisenhower and John F. Kennedy. These 1975 revelations would fuel the already flourishing Kennedy assassination conspiracy complex, providing endless and fruitless leads for those intent on proving that Lee Harvey Oswald did not act alone. Sadly, this proved to be one of the most enduring legacies of the “perfect failure” at the Bay of Pigs.

Fri, 23 Apr 2021 02:37:05 +0000 https://historynewsnetwork.org/article/179856 https://historynewsnetwork.org/article/179856 0
Holocaust Remembrance 80 Years After the Beginning of Hitler's Campaign of Genocide

Ruins of the Great Choral Synagogue of Riga, Latvia, which was burned by Nazis with Jewish victims inside on July 4, 1941.


We’ve got a lot on our collective plates and minds, coping with a pandemic and the frustrating partisan politics that continue to mire progress.


It would be natural for us at this time to embrace the rites of Spring and avoid the haunting historical memories and anniversaries of the Holocaust that began to occur 80 years ago in Europe, and which continued beyond the end of World War II in 1945.


Even as we are solemnly reminded each April to “Never Forget.”


But my purpose and mission are to unearth these horrors anew. Tragically, and inexcusably, most Americans remain ignorant of both the locations of these terrible crimes and the men, among the worst killers of the Holocaust, responsible for them. Horrifyingly, recent surveys in this country indicate that over 40% of Americans, and 66% of our millennials, cannot even say what Auschwitz was.  In 2021, how is this possible or acceptable?


On June 22, 1941, Hitler unleashed his armies on the Soviet Union, thereby forever altering the nature of both World War II and history itself. In the remainder of that year, the Holocaust would claim almost 500,000 victims, innocent people shot to death by Nazi Einsatzgruppen killing squads as the Wehrmacht’s four invading armies raced northeast into the Baltics, eastward through Belarus towards Moscow, southeast into Ukraine, and farther southeasterly towards the Crimea and the Caucasus.


The Nazis were quick to reach Lithuania. From June 25-29, 5,000 people were murdered at Kaunas, in what would be among the first of more than 250 such mass executions in Lithuania. At the Ninth Fort, also in Kaunas, on Nov. 25 and Nov. 29, Karl Jager ordered the death by shooting of almost 5,000 more Jews.  Arguably the most horrific killing site in the country was in the Ponary Forest (now Paneriai) outside the capital of Vilnius, where from July 1941 to August 1944 some 100,000 people, including over 70,000 Jews, mostly from the capital, were murdered.


Nazi forces entered Riga, Latvia, on July 1, and within three weeks, over 6,000 Jews had been killed in targeted actions against them.  On July 4, the Great Choral Synagogue in Riga was burned to the ground with Jews locked inside. Heberts Cukurs, “The Hangman of Riga,” and Viktors Arajs organized collaborationist killing squads who helped Friedrich Jeckeln murder over 25,000 Jews on Nov. 30 and Dec. 8 at the Rumbula Forest outside Riga. Thousands of Jews were similarly murdered at both Skede and Liepaja, in what were, after Riga, the most notorious killings in Latvia.


The Nazi massacre of 33,771 Jews in the ravine at Babi Yar, outside Kiev in Ukraine, on Sept. 29-30, 1941, was the worst two-day killing spree of the war. Overall, more than 100,000 people would be killed there under the orders of Kurt Eberhard, Paul Blobel and Otto Rasch. The murders represented one of the largest mass killing actions in the early months of the invasion of the Soviet Union.


Only the massacres carried out by Romanian forces against Jews in the Black Sea port city of Odessa would numerically surpass the Babi Yar killings. From Oct. 22-24, 1941, almost 35,000 Jews would be shot or burned alive in locked buildings.  Over 100,000 people would be annihilated in the area through the early winter months of 1942 under the orders of the Romanians Marshal Ion Antonescu and Gheorghe Alexianu, both of whom would later be executed as war criminals.


These anniversaries serve as grim reminders of the early months of the Holocaust against the peoples of eastern Europe. While many Americans are reminded of the Japanese attack at Pearl Harbor on Dec. 7, 1941, most have no clue that in Nazi-occupied Poland, on Dec. 8, the Germans began operating the first of six death camps, at Chelmno, where some 360,000 Jews from the Lodz ghetto would be taken to be killed in mobile gas vans and buried deep within a forest setting. The first phase of killing at Chelmno would last until April 11, 1943, coinciding with the deadliest phase of the Holocaust. The camp would resume its operations once more from June 1944-January 1945.


As awful as these murders were in 1941, the worst was yet to come. The following year would mark the zenith in frenzied Nazi killing in the death camps at Belzec, Sobibor, Treblinka, Majdanek and Auschwitz-Birkenau (which was already killing people in 1941). The 80th anniversary of the Final Solution at those sites will be solemnly remembered next year.


The war and mass murders inside the Baltics and Soviet Union 80 years ago may seem to be only a distant memory today within our country. The Nazis' premeditated attempt to obliterate peoples from the earth is today, eight decades and at least four generations afterwards, increasingly forgotten and ignored. Yet the evils of intolerance, racism, prejudice, and the horrors of ethnic cleansing that combined to produce the destruction of some 11 million people during the Holocaust live on, and even worse, remain pervasive today.


The Nazi goal was to eradicate these peoples from the face of the earth and then to remove all traces of the instruments of their destruction: the camps were to be destroyed, the ground plowed over, all previously buried victims exhumed, burned, and reduced to ashes, and no traces or records of the slaughter were to be left. In essence, the goal was to leave no memory of these victims for future generations.


Those of us alive today have an obligation to remember what happened, as well as the reasons why it happened and was allowed to happen, because we have the responsibility to at least try to prevent it from happening again. The very fact that the dehumanization and destruction of “undesirables” continues in our own time should serve as a gruesome reminder that Nazi ideology is still alive and well, and has been improved upon since 1945.


We must see our lives as inextricably linked to both the past and the future, so that all peoples, individually and collectively, never again have to know a world with genocide.


As we prepare to enjoy the remainder of our spring and impending summer months, let us pause to remember those who, through no fault of their own, were not allowed the same seasonal joys of Eastern Europe, 1941. Let us commit ourselves to remembering and learning from that difficult year, and to educating ourselves, our children and successive generations, so that our voices and memory will open new futures of hope, of restraint and of justice. There is no such thing as a lesser person.

Fri, 23 Apr 2021 02:37:05 +0000 https://historynewsnetwork.org/article/179855 https://historynewsnetwork.org/article/179855 0
Senators who Made an Impact, Despite First being Appointed (not Elected)

Harry Byrd, while Governor of Virginia, photographed ca. 1928


Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman Littlefield Publishers, 2015).  A paperback edition is now available.


The US Senate, since the beginning of the 117th Congress this January, has seen a grand total of 1,994 members in its 232-year history.

Among those, there have been a total of 202 appointed Senators since the adoption of the 17th Amendment in 1913, which provided for direct popular election of Senators.

Therefore, it is common to think of appointed Senators as just temporary replacements, waiting for the next regularly scheduled election for that Senate seat, or until the next even-year election. This has often been true.

But several have ended up being major historical figures in Senate and political history.

This article is the first of two to examine the historical significance of twelve US Senators who, despite being originally appointed rather than elected, made a difference in American history.


Charles McNary (R-Oregon) was appointed in May 1917, and then was elected to the Senate in November 1918, serving until his death in February 1944.  He was chosen by the Oregon Governor for the vacancy due to his support of women’s suffrage and Prohibition, two policies that were established by constitutional amendments ratified before the 1920 national election.  He was Chair of the Senate Agriculture Committee from 1926-1933, and held the position of Senate Minority Leader during Franklin D. Roosevelt’s New Deal from 1933 until 1944, longer than any Republican has held that post. 

He was perceived as a “progressive” Republican who supported much of the New Deal and defense measures as World War II came closer, including the Selective Service military conscription in 1940 and the Lend Lease Act in 1941. A westerner, he supported the development of hydroelectric power, including the Grand Coulee and Bonneville Dams, as public works projects.  He was the primary promoter of the proposed McNary-Haugen Farm Relief Bill, twice vetoed by Republican President Calvin Coolidge in the 1920s, which might have staved off or alleviated the effects of the Depression on agriculture.  McNary was the Vice Presidential running mate of Wendell Willkie in 1940. In an odd footnote, had the duo been elected over FDR and Henry Wallace, they might have become the first president and vice president to both die in office, as McNary did in February 1944 of a brain tumor, and Willkie of a heart attack in October 1944.  My book, Twilight of Progressivism: The Western Republican Senators and the New Deal (Johns Hopkins University Press, 1981), has McNary as a leading figure in that group, which cooperated with FDR on many New Deal initiatives.


Carter Glass (D-Virginia) was appointed in November 1919, and then was elected to the Senate in November 1920, serving until his death in May 1946.  Glass had earlier served in the House of Representatives from 1902-1918, chairing the House Banking Committee from 1913-1918, and was appointed by President Woodrow Wilson for 14 months as Secretary of the Treasury from December 1918 until his appointment to the Senate. 

He served as Senate Appropriations Committee Chairman from 1933 until his death in 1946, and was also President Pro Tempore of the US Senate from 1941-1945.  He also helped to establish the Federal Reserve Banking System under Wilson, and was the author of the Glass-Steagall Act that set up the Federal Deposit Insurance Corporation under FDR’s New Deal in 1933.  However, as a staunch supporter of States Rights, he opposed much of the New Deal, and advocated both the disenfranchisement of African Americans in his state and nationally and Jim Crow segregation laws.


Gerald Nye (R-North Dakota) was appointed to the Senate in November 1925, and was elected to three full terms before he was defeated in 1944.  He was termed a “progressive” Republican, and my book on the subject included an interview with Nye conducted in March 1971, his last interview with a historian before his death a few months later.

Nye became noted for his investigation of the Teapot Dome scandal, and for helping to create Grand Teton National Park.  He supported much of the New Deal until later breaking with the President, but became most controversial as a leading isolationist spokesman. This included heading the Nye Committee in 1934-1935, which investigated the munitions industry, and promoting the view that America could have avoided entrance into World War I. He was a leading advocate of the neutrality laws passed by Congress in the mid-1930s.  Nye was accusatory toward Jews in the film industry, leading to charges of antisemitism, and was a major critic of both Great Britain and the Republican Presidential nominee Wendell Willkie in 1940.  He was also an active speaker on radio and at rallies of the America First Committee in 1940-1941, the leading organization attempting to keep America out of World War II. Nye told me, thirty years after Pearl Harbor, that he believed Roosevelt had plotted to get America into that war.  Nye was even ridiculed by Dr. Seuss for his isolationist views and his vehement rhetoric and oratorical manner.


Arthur Vandenberg (R-Michigan) was appointed to the Senate in March 1928, after a career in journalism as an editor and publisher in Grand Rapids, and was then elected to four terms, dying in office in April 1951.  Originally supportive of President Herbert Hoover, he would support much of the early New Deal of FDR, but then became part of the conservative coalition that opposed the 1937 Supreme Court “packing” plan and the pro-labor Wagner Act, and was an isolationist in foreign policy until after the Japanese attack on Pearl Harbor in December 1941. 

His position on foreign policy changed radically as a result, and he became an internationalist, marking the transformation in a well-hailed speech in the Senate in January 1945.  He became a promoter of the United Nations, and as chair of the Senate Foreign Relations Committee from 1947-1949 cooperated in a bipartisan fashion with President Harry Truman on the Truman Doctrine, the Marshall Plan, and the formation of the North Atlantic Treaty Organization.  Vandenberg was President Pro Tempore of the Senate during the 80th Congress (1947-1949), and thus two heartbeats away from the Presidency, and was a “favorite son” candidate for the White House in 1940 and 1948.  The Senate Reception Room has a portrait of Vandenberg, part of a very select group of seven legislators rated by the Senate as the most prominent in its history.


Harry F. Byrd, Sr. (D-Virginia) was appointed to the Senate in 1933, and served 32 years.  Previously, he had been Virginia Governor from 1926-1930 after a career as a newspaper publisher and two stints in the Virginia State Senate.  His state political machine dominated Virginia politics for a half century, enforcing literacy tests and poll taxes to deny the franchise to African Americans. He became a leader in the conservative coalition against the New Deal, and, both as Governor and in the Senate, opposed any racial desegregation, advocating “massive resistance” to the 1954 Supreme Court decision in Brown v. Board of Education.

But in foreign policy, Byrd was an internationalist and supported FDR’s foreign policy as a leader on the Senate Armed Services Committee after World War II. He later became the Chairman of the Senate Finance Committee.  Byrd refused to endorse President Truman in 1948 or Democratic nominee Adlai Stevenson in 1952, and was always a thorn in the side of Dwight D. Eisenhower—refusing to support the Interstate Highway System—and of Lyndon B. Johnson—opposing the Civil Rights Act of 1964.  Byrd received 15 electoral votes in 1960, from Mississippi, Alabama, and Oklahoma, in the election that made John F. Kennedy President.  His greatest legacy was the creation of the Shenandoah National Park, Skyline Drive, the Blue Ridge Parkway, and the Virginia state park system.


Ralph Flanders (R-Vermont) was appointed to the Senate in November 1946, and then was elected to two full terms, serving until the first days of 1959.  He had a career as a mechanical engineer and industrialist, and was President of the Boston Federal Reserve Bank for two years before his Senate career. He served on the Joint Economic Committee in an investigatory and advisory capacity, and on the Finance Committee and Armed Services Committee.  He promoted public housing, higher education spending, and the Civil Rights Act of 1957 under President Dwight D. Eisenhower. 

He promoted arms control in foreign policy, and gained national attention as the major critic of Republican Senator Joseph McCarthy of Wisconsin, who was pursuing what Flanders saw as reckless rhetoric and behavior in his Red Scare tactics from 1950-1954.  He was an early and strong critic of McCarthy, saying on March 9, 1954, that McCarthy was misdirecting America’s efforts at fighting communism overseas and causing a loss of respect for America in the world community.  His Senate address was a scathing criticism of McCarthy, hailed by many but attacked by critics as supporting the Communist cause.  Flanders introduced a resolution on June 11, 1954, condemning the conduct of McCarthy and calling for his censure for flagrant abuse of power. The US Senate would censure McCarthy on December 2, 1954. Republicans split evenly on the motion, but the total vote was a landslide of 67-22, and McCarthy never recovered from the censure.  Flanders became a national hero, and a profile in courage to many millions of Americans. 

Fri, 23 Apr 2021 02:37:05 +0000 https://historynewsnetwork.org/blog/154488 https://historynewsnetwork.org/blog/154488 0
Political Precedent for the Trump Cult of Personality


The term “Trumpism,” alluding to a cult of personality surrounding the 45th president, has penetrated the American vernacular. So much about Donald Trump and his presidency has been unprecedented. But in this case, the phenomenon is not new. A cult of personality also engulfed Ronald Reagan. Although the two men are very different in character, their cults of personality share similar qualities. Neither man was always truthful, both made serious mistakes, and both cults were tinged with racism.

A political cult of personality means strong admiration of and devotion to a leader. Frequently, the leader spreads his fame widely through mass media. Followers become enamored to the point of idolizing the leader while overlooking or ignoring his shortcomings. This characterizes the public life of both Trump and Reagan.

Familiar to millions of Americans from appearing in movies and hosting the weekly General Electric Theater on Sunday night television, Ronald Reagan began a political career on October 27, 1964, with a nationally televised speech on behalf of Republican presidential candidate Barry Goldwater. It was a week before the election. The speech was filled with false claims about the overbearing U.S. government and unverified anecdotes. This was all to support Reagan’s view that government needed to get out of the way of the economic freedom of the American people. Reagan falsely claimed that farmers who did not cooperate with federal government programs could be imprisoned, and that the Federal Reserve Board planned inflation.

Reagan also said, “We were told four years ago that seventeen million people went to bed hungry each night. Well, that was probably true. They were all on a diet.” Because of his growing cult of personality, this particularly callous and inaccurate quip was overlooked. When Reagan spoke, more than 36 million Americans were living in poverty, nearly one-fifth of the country. Following the formula of that speech, Reagan won the California governorship two years later by a landslide and would go on to win the presidency twice by equally impressive margins. The Reagan cult of personality enabled him to remain popular with his followers even when violating his own conservative principles. Throughout his political career, Reagan railed against big government deficit spending. But when the national debt rose by 189 percent on his watch, he suffered no political consequences. When Reagan admitted to misleading Americans during the Iran-Contra scandal, his popularity went down temporarily, but bounced back by the end of his presidency.

Donald Trump, like Reagan, gained fame with the American public through show business. Trump starred in a reality television show called The Apprentice. Many Americans assumed that Trump was the “boss” starring in his own program, but in reality Trump was an actor employed by a television production company, just as Reagan was an actor employed by the General Electric Company. The Apprentice gave Trump a favorable celebrity status leading toward a political cult of personality. While Reagan launched his political career with a televised speech, Trump began his with a nationally televised accusation that Barack Obama should not be president because he was not a natural-born U.S. citizen. With no proof other than his words, Trump claimed to have investigators in Hawaii uncovering evidence that Obama was not born there as his birth certificate indicated. “They couldn’t believe what they’re finding,” Trump asserted. Several years later, shortly before winning the presidency, Trump admitted that he believed Obama is a U.S. citizen.

When Trump announced his presidential candidacy, he declared, “Sadly the American dream is dead.” The campaign slogan became “Make America Great Again.” That is not unlike Reagan’s decrying big government for destroying our freedom. Reagan’s 1984 campaign theme of “Morning in America” is not very different in meaning from the Trump 2016 slogan. Like Reagan, Trump deviated from facts to support political points. Examples of this are legion, from Trump’s assertion that he saw thousands of Muslims on 9/11 cheering the collapse of the twin towers to his claim of Obamacare imploding. One difference, however, is that Reagan’s factual deviations usually served to buttress his political points, while Trump’s were often meant to boost himself, from the false claim to have graduated from the Wharton School at the University of Pennsylvania at the top of his class to the boast of being a “very stable genius.” That arrogance was not in Reagan’s character.

Trump’s personality cult protected him to some extent, as it did Reagan. Trump’s popularity was never as high as Reagan’s. But his approval ratings always remained in the middle 40s, not dropping precipitously as in the cases of Nixon and Carter, for example. That is despite numerous scandals, including the Russia investigation, and a poorly handled pandemic that killed hundreds of thousands of Americans. In the end, 74 million Americans voted for Trump. The cult of personality remained intact.

Another and more sinister similarity in the Reagan and Trump cults of personality is white racism. Both men saw an opportunity to advance their political careers by appealing to white voters in a racially prejudicial way. In Reagan’s 1966 campaign for governor, he appealed to white voters disgusted with the “beatniks, radicals, and filthy speech advocates,” as Reagan termed them. In his 1976 campaign for the Republican presidential nomination, Reagan frequently told the “welfare queen” story about a woman on welfare who allegedly defrauded the U.S. Government of $150,000. The story was significantly embellished, but was in keeping with Reagan’s political views. He once called welfare recipients a “faceless mass waiting for a handout.” He did not mention race, but the implication was abundantly clear that the welfare queen was black. In his 1980 presidential campaign, after winning the nomination, Reagan traveled to Mississippi to deliver a speech glorifying states’ rights to a white audience at the Neshoba County Fair; states’ rights has long been the cry of white Southerners fighting civil rights. Neshoba County is the site where three civil rights workers were infamously killed in 1964.

Donald Trump’s appeal to white racism has been more blatant. In August 2017, the Unite the Right rally in Charlottesville, Virginia, ended with one counter-protester killed. Trump said that “you also had people that were very fine people on both sides.” One side had neo-Nazis, Ku Klux Klansmen, and Alt-Right people. In the last presidential campaign, Trump appealed to racism in numerous ways in an attempt to win re-election. For example, he condemned NASCAR for banning the Confederate flag. He condemned Black Lives Matter and predicted the “beautiful suburbs” would be destroyed by low-income housing if Biden won. He blamed big-city Democrats and their black voters for stealing the election, ignoring the fact that he lost battleground states because too many whites in the suburbs deserted him.

Two recent presidents have had cults of personality, although such cults are antithetical to democracy. Those cults enabled both men to win their party’s nomination and the general election. They gave both men the luxury of deviating from truthfulness, and enabled Reagan to survive a severe scandal and Trump to be incompetent and scandalous while maintaining a significant base of popularity. This also indicates something ominous about America. If a candidate has a cult of personality, and develops a large number of devoted followers who believe he or she can do no wrong, it could potentially make white supremacy or other malignant elements of politics seem permissible, with unknown consequences for democracy.

Fri, 23 Apr 2021 02:37:05 +0000 https://historynewsnetwork.org/article/179849 https://historynewsnetwork.org/article/179849 0
Life During Wartime 532

Fri, 23 Apr 2021 02:37:05 +0000 https://historynewsnetwork.org/blog/154489 https://historynewsnetwork.org/blog/154489 0
Roundup Top Ten for April 9, 2021

The Meaning of the Democrats’ Spending Spree

by Keeanga-Yamahtta Taylor

Joe Biden supported a balanced budget amendment in 1995, ran as the "establishment" candidate in the Democratic primaries, and has been a regular advocate of bipartisanship. So why is his administration proposing the massive American Rescue Plan Act, and showing a willingness to act without securing Republican cooperation? A tour of recent history can explain. 


Our Greatest Libraries are Melting Away

by David Farrier

Ice core samples from the Greenland shelf are a physical archive of the long sweep of human history, and demonstrate the connections of humanity's past and future. 


Without Asian American Studies, We Can’t Understand American Racism

by Min Hyoung Song

The establishment of Asian American Studies and ethnic studies programs has been essential to putting Asian American scholars (and scholars of Asian Americans) in position to engage the mass media around events like the Atlanta shootings. As those programs are under fire, it's time to recognize their value. 


What Manhattan Beach’s Racist Land Grab Really Meant

by Alison Rose Jefferson

Debates over the redress of past racial injustice must acknowledge that some past actions have harmed communities in ways that can't be repaired, including the loss of space for communal leisure or equal access to everyday pleasures.


A Poem That Shows How to Remember the Holocaust

by James Loeffler and Leora Bilsky

"Lemkin’s anguished text also explains why the world had already begun to forget the Holocaust. Genocide represents more than a large-scale physical assault on human bodies, he suggests; it is also an attack on the very existence of minority cultures. In a genocide, books are burned and memories are extinguished."


“Taxpayer Dollars”: The Origins of Austerity’s Racist Catchphrase

by Camille Walsh

The rhetoric of protecting "taxpayer dollars" hinges on a selective interpretation of who pays taxes that reinforces the privilege of affluent whites to have government follow their preferences. 


Higher Education's Racial Reckoning Reaches Far Beyond Slavery

by Davarian L. Baldwin

American universities have grown in harmony with American racism throughout their history, from building on land appropriated from Native Americans to accommodating Jim Crow to promoting social science theories that justified segregation and directly encouraging gentrification through real estate purchasing. 


The World the Suez Canal Made

by Aaron Jakes

"The purpose of the Suez Canal, from the perspective of both the Egyptian state and its European investors, was not simply to render the world more interconnected and international transport more efficient, but to extract transit fees from the ships passing through it."


Restoring the People’s Universities

by Alejandra Marchevsky and Jeanne Theoharis

"We see this trend across the nation: when students of color finally began to gain access to higher education, disinvestment and the shrinking of educational opportunity followed."


Biden’s Plan for Central America Is a Smokescreen

by Aviva Chomsky

The Biden plan for Central America revives the Cold War formula of business-friendly economic development and militarized security in the name of stopping migration toward the US. This, the author argues, amounts to doubling down on failed policies that have driven migration for decades.

Fri, 23 Apr 2021 02:37:05 +0000 https://historynewsnetwork.org/article/179845 https://historynewsnetwork.org/article/179845 0
Hidden Stories of Jewish Resistance in Poland


In 1959, writing about the Holocaust, the scholar Mark Bernard highlighted that Jewish resistance was almost always considered a miracle, ethereal, beyond the scope of research. Still today, this impression generally persists. And yet, Jewish defiance was everywhere during the war, carried out in a multitude of ways, by all types of people.


I first encountered this phenomenon several years ago, when I accidentally came across a collection of Yiddish writing by and about young Polish-Jewish women who rebelled against the Nazis. These “ghetto girls” paid off Gestapo guards, hid revolvers in marmalade jars, and built underground bunkers. They flung homemade explosives and blew up German trains. I was stunned. Why had I – a Jewish writer from a survivor family, not to mention a trained historian who held a Ph.D. in feminist art – never heard this side of the story?


And so began my research. As I discovered, due to preconceived notions of gender, the girls’ educations, and the lack of evident markers of their Jewishness (i.e., circumcision), women played a critical role in the Jewish underground in Poland. But when I set out to write their story and sought a chronological context, it quickly became apparent that there was none. No comprehensive history of the men in the underground existed either. Sure, excellent academic biographies and case studies of rebellions in particular ghettos and camps had been published, but there were no recent English books that relayed the tale of Jewish resistance in the country as a whole. As much as I was baffled by the ferocious female fighters, I was equally baffled by the entire Jewish effort in Poland, the epicenter of the bloodshed, where 3 million Jews (90% of the pre-war population) were savagely murdered. The truth was, though I’d heard of the Warsaw ghetto uprising, I had no idea what actually happened. I certainly had no idea of the scope of Jewish revolt. 


Holocaust scholars have debated what “counts” as an act of Jewish resistance. Many take the broadest definition: any action that affirmed the humanity of a Jew; any solitary or collaborative deed that even unintentionally defied Nazi policy or ideology, including simply staying alive. Others feel that too general a definition diminishes those who risked their lives to defy a regime actively, and that there is a distinction between resistance and resilience. The rebellious acts that I discovered among Jewish women and men in Poland, my country of focus, ran the gamut, from those entailing complex planning and elaborate forethought, like setting off large quantities of TNT, to those that were spontaneous and simple, even slapstick-like, involving costumes, dress-up, biting and scratching, wiggling out of Nazis’ arms. Some were one-offs, some were organized movements. For many, the goal was to rescue Jews; for others, to die with dignity and leave a legacy of it. 


As guerrilla fighters, the Polish-Jewish resistance took only a handful of Nazi casualties and achieved a relatively minuscule victory in terms of military success, but the effort was much more significant than I’d known. Over 90 European ghettos had armed Jewish resistance units. In Poland, where many of these were located, the units comprised “ghetto fighters” who used found objects (like pipes), manufactured items (such as homemade explosives), and smuggled-in weapons (including pistols and revolvers) to engage in spontaneous or, more often, organized anti-Nazi assaults. Most of these underground operatives were young, in their twenties and even teens, and had been members of youth movements, which now formed the core structures of resistance cadres. Ghetto fighters were combatants as well as editors of underground bulletins and social activists. The Warsaw Ghetto Uprising, I learned, was youth-driven, and strategically planned over months. Most accounts agree that about 750 young Jews participated. (Roughly 180 of them were women.)  


Some Jews fought inside the ghettos, but 30,000 (ten percent of them women) fled their towns and cities and enlisted in forest-based partisan units; many carried out sabotage and intelligence missions. ‘The Avengers,’ a Jewish-led detachment outside Vilnius, blew up German trains, vehicles, bridges, and buildings. They used their bare hands to rip down telephone poles, telegraph wires, and train tracks. Other Polish Jews joined Soviet, Lithuanian, and Polish-run detachments or foreign resistance units; while others still worked with the Polish underground, often disguised as non-Jews, hiding their identities even from their fellow rebels. 


Alongside military-style organizations, Jews organized rescue operations to help fellow Jews escape, hide, or live on the Aryan side as Christians. The Warsaw-born Vladka Meed, a Jewish woman in her early 20s, printed fake documents, distributed Catholic prayer books, and paid Christian Poles fees for hiding Jews in their homes; she also helped save Jewish children by sneaking them out of the ghetto and placing them with non-Jewish families. In Poland, rescue networks supported roughly 10,000 Jews in hiding in Warsaw alone; they also operated in Krakow. Mordechai Paldiel, the former director of the Righteous Gentiles Department at Yad Vashem, Israel’s largest Holocaust memorial, was troubled that Jewish rescuers never received the same recognition as their Gentile counterparts. In 2017 he authored Saving One’s Own: Jewish Rescuers During the Holocaust, a tome about Jews who organized large-scale rescue efforts across Europe. Poland, he claims, had only a small number of these efforts, but still, they were significant.


All these accompanied daily acts of defiance: smuggling food across ghetto walls, creating art, playing music, hiding, even humor. Jews resisted morally, spiritually, and culturally, in public and intimate ways, by distributing Jewish books, telling jokes during transports to relieve fear, hugging barrack-mates to keep them warm, writing diary entries, and setting up soup kitchens. Mothers kept their children alive and propagated the next Jewish generation, in and of itself an anti-Nazi act. Jews resisted by escaping or by taking on false Christian identities. Roughly 30,000 Jews survived by dyeing their hair blond, adopting a Polish name and patron saint, curbing their gesticulations and other Jewish-seeming habits, and “passing.”


I was fascinated by this widespread resistance effort, but equally by its absence from current understandings of the war. Of all the legions of Holocaust tales, what had happened to this one?


While I researched the lives of Jewish rebels, I simultaneously probed the trajectory of their tales. As I came to find, though there were waves of interest in Jewish defiance over the decades, the resistance narrative was more often silenced for both personal and political reasons that differed across countries and communities. The history of the Jewish underground has generally been suppressed in favor of a “myth of passivity.” Holocaust narratives were shaped by the need to build a new homeland (Israel), the fear of exposing wartime allegiances (Poland), and redefining identity (USA). Early post-war interest in partisans turned into a 1970s focus on “everyday resilience.” A barrage of 1980s Holocaust publications flooded out earlier tales.


Many fighters who survived kept their stories hidden. Many women were treated with disbelief; relatives accused others of having fled to fight instead of staying to look after their parents; still others were charged with sleeping their way to safety. Sometimes family members silenced them, as they feared that opening up old wounds would tear them apart. Many hushed their tales due to oppressive survivors’ guilt: they felt that compared to others, they’d “had it easy.”


Then, there was coping. Women in particular felt a cosmic responsibility to mother the next generation of Jews. They wanted to create a normal life for their children and for themselves. They did not want to be “professional survivors.” Like so many refugees, they attempted to conceal their pasts and start afresh. The fighters’ formidable tales were buried with their traumas, but both stayed close to the surface, waiting to burst out.


The Warsaw Ghetto Uprising began in April 1943, on the first night of Passover. In her groundbreaking book, We Remember with Reverence and Love: American Jews and the Myth of Silence After the Holocaust, 1945–1962, Hasia Diner explains that Passover, a holiday where Jews celebrate liberty, became the time around which American Jews commemorated the Holocaust. However, the uprising element was forgotten. When my book comes out this April, I hope to bring the revolt to the fore once again. I cannot think of Polish Jewry without it; theirs is a story of persistent resistance and profound courage.

Fri, 23 Apr 2021 02:37:05 +0000 https://historynewsnetwork.org/article/179779 https://historynewsnetwork.org/article/179779 0
What Comes Next?

Poster images by Amanda Phingbodhipakkiya, from I Still Believe in Our City, a recent public art campaign for the New York City Commission on Human Rights. 


“Until we address the discrimination and harassment against Asian Americans today, they will become deeply entrenched in the fabric of our nation, causing unimaginable harm and suffering and taking decades to undo,” Manjusha P. Kulkarni, Executive Director of the Asian Pacific Policy & Planning Council, explained in a recent written statement submitted to the House Subcommittee on Constitution, Civil Rights, and Civil Liberties. On March 18, activists, scholars, and artists across the Asian American and Pacific Islander community provided testimony on the increase in anti-Asian hate speech and violence since March of 2020. These attacks aligned with former president Donald Trump’s use of phrases like  “Chinese virus” on Twitter and in public statements. Stop AAPI Hate—a coalition of activists and scholars maintaining a database that contains nearly 3,800 documented incidents of verbal and physical abuse—has continued the legacy of Asian Americans pursuing protection by presenting evidence of racism.


But what comes next?


Forty-two years ago, Asian Americans spoke to legislators in Washington, D.C., as consultants on civil rights issues still faced by the AAPI community long after the legislative milestones of the 1960s. On May 8 and 9, 1979, the U.S. Commission on Civil Rights held its first hearing on specific rights violations encountered by Asian Americans. It coincided with increasing representation of Asian Americans in politics and Congress’s passing of Public Law 95-419, which designated the week of May 4th as Asian American Pacific Islander Week. Just a few days earlier, President Jimmy Carter declared, “We have succeeded in removing the barriers [for Asian Americans] to full participation in American life.” Refugees from southeast Asia fleeing the wreckage of the Vietnam War were also resettling in the US, adding diversity to the AAPI community and, Carter declared admiringly, "their successful integration into American society and their positive and active participation in our national life demonstrates the soundness of America’s policy of continued openness to peoples from Asia and the Pacific." Carter’s praise for the AAPI community and their “enormous contributions to our science, arts, industry, government, and commerce” bolstered the idea of Asian Americans as the model minority who had overcome adversity to achieve the American Dream.

 

However, for those who appeared before the Commission, Carter’s comments did more harm than good. He described Asian Americans as economic drivers for the United States whose rewards were acceptance and economic comfort—an idea that glossed over the challenges they faced. Dr. Ling-Chi Wang, then an assistant professor of Asian American Studies at the University of California, Berkeley, provided a historical overview of Asian American experiences that clashed with Carter’s simplistic characterizations. “I just want to add,” Wang stated during his testimony, “that current popular beliefs, held most firmly by government agencies—that Asians have no problems, that Asians have made it, that Asians take care of their own problems, and that Asians are too proud to seek government assistance—are but persistent manifestations of the highly institutionalized government attitude toward Asian Americans of benign neglect.”

 

This neglect stemmed from a history of exploitation by the government and white employers. “Almost without exception,” Wang continued, “each economic crisis was accompanied by an anti-Asian movement… each Asian group was imported to meet a concrete demand for cheap labor, and each was subsequently excluded by law when each was no longer perceived to be needed or when it was no longer politically and economically expedient to continue its utilization.” Racist policies excluded Asian immigrants, making them expendable, rendering them as outsiders, and making them easy to scapegoat in different crises. As Wang charged, the model minority myth “absolves the government of any responsibility of protecting the civil rights of Asian Americans and assigns Asian Americans to a permanent status of being neglected.”

 

Others presented evidence of the damage from more than a century of anti-Asian sentiment. The challenges faced by Asian Americans ranged from limited access to health services and a lack of bilingual educational resources to poverty—social problems also encountered by other communities. Participants in the consultation offered solutions such as promoting more representation in the federal government, directing more money to community grants, and developing a set of criteria for identifying civil rights violations specific to Asian Americans. There was hope—particularly after the movement for reparations for Japanese Americans who survived incarceration during World War II—that with more attention, Asian American civil rights would progress.

 

But in 1986, the Commission on Civil Rights issued a disturbing report. “Recent Activities Against Citizens and Residents of Asian Descent” noted an uptick in attacks on Asian Americans. Economic competition from Japan spurred a reinvigorated anti-Japanese movement in the US during the early 1980s. In 1982, a Japanese American state legislator in California reported that someone had spray-painted the word “Jap” on his garage door, while a group called the White American Resistance Movement distributed anti-Asian pamphlets throughout San Francisco. The report connected these incidents to the death of Vincent Chin, a Chinese American draftsman murdered in Detroit by two white men tied to the struggling auto industry: a plant supervisor and his recently laid-off stepson. Thinking he was Japanese, they blamed Chin for the layoffs and brutally beat him.

 

The recent deadly, racially motivated attacks on Asian American women in the Atlanta metro area have brought attention to the historical trend highlighted by Wang during his 1979 testimony. In May of 2020, the Commission on Civil Rights promised to prosecute civil rights violations and hold public hearings on anti-Asian hate, but these initiatives have largely languished despite calls from the AAPI community for legislators to take verbal and physical abuse seriously.

 

The promises made by the Commission in 1979 did not save Vincent Chin. And now—as Wang predicted—the COVID-19 pandemic is the latest in a long list of crises that have produced violent anti-Asian attacks. The Chinese Exclusion Act and other historic moments are crucial to understanding where the nation is today, but historians have more contemporary examples to draw from. The pleas for help before the Commission on Civil Rights in 1979 largely went unanswered. Today, holding public officials accountable for the promises they will undoubtedly make to the AAPI community after recent hearings depends upon forcing Americans to reckon with a cycle of perpetual scapegoating and the racist language that makes it possible.

Pamela, Randolph and Winston: The Wartime Discord of the Churchills

Pamela Digby, photographed in 1938, before her brief courtship and tumultuous marriage (1939-1945) to Randolph Churchill.

In the spring of 1941, Averell Harriman, Roosevelt’s special envoy to Britain, started an affair. That both he and the woman in question were married was not a huge problem; there were different rules in wartime. What was more complicated was the identity of the woman’s husband: Randolph Churchill, the British prime minister’s adored, spoiled, turbulent son.

 

Randolph had started the Second World War desperate for two things. He wanted to be wherever the fighting was fiercest, and was anxious to find a wife who would bear his child. Both were ways of pleasing his father, who placed an outsized premium on physical bravery, and was obsessed with the idea of building a powerful political dynasty. Randolph, he felt, had a duty to make sure the line was continued.

 

Randolph’s first ambition was stymied by the artificial calm that followed Hitler’s invasion of Poland, as well as his father’s reluctance to get him reposted. He was more successful in achieving the second. In the course of a fortnight he proposed to, and was rejected by, eight women. Then he met Pamela Digby.

 

Pamela had wide, deep-blue eyes, pink-flushed cheeks and auburn hair streaked with a patch of white. Some saw her as “a red headed bouncing little thing regarded as a joke by her contemporaries,” but beneath the plumpness and a forced air of jollity was an adamantine desire to escape her dull, provincial life in Dorset.

 

Winston embraced her immediately. Pamela was soon an essential part of the Churchill family, especially once Winston became prime minister and they moved to Downing Street. She had an uncanny instinct for sensing what people needed, and then giving it to them almost before they had realized themselves. She was a source of support for an embattled Winston, and a much-needed confidante for his wife, the lonely Clementine. In October 1940 she gave birth to a boy, named, inevitably, Winston.

 

The only problem was her husband. Randolph was charming, clever, generous and funny. Most of the time. He was also rude, arrogant and incapable of understanding why marriage should stop him sleeping with other women. All of these qualities were exacerbated when he drank, which he did uncontrollably.

 

Bills and arguments mounted up. When Randolph behaved appallingly, or ran up debts he couldn’t cover, it was to his parents that Pamela ran for help. They, increasingly, took her side, which was another source of friction in an already fraught web of relationships.

 

It was after Randolph finally got his posting abroad that the problems really began. In January 1941, his Commando unit set sail for Egypt. Before their ship had even docked on the other side of the Atlantic he’d lost more at the gaming table than he could possibly pay back. Once Pamela had fixed the financial disaster her wayward husband had forced upon her, she deftly, single-mindedly, began to fashion a new, independent life for herself. Before the end of spring she had started sleeping with Harriman.

 

Randolph was incandescent when he discovered his wife’s infidelity. This was largely because he was convinced that his father had at the very least tolerated, and at worst actually encouraged, an affair that was being conducted beneath his nose. After all, the situation presented a clear political advantage to Winston. And so although Pamela did not create the tensions that ran between father and son – they had a long history of their own – her actions brought matters to a head.

 

Winston and Randolph’s bond had always had an almost romantic intensity. Winston was obsessed with his son, claiming that he would not be able to continue leading the country if anything were to happen to him; Randolph was devoted to his father. They had spent the last decade living in each other’s pockets: drinking, plotting, gambling, talking and quarrelling. But this closeness masked some profound difficulties.

 

Throughout his life Randolph had struggled to reconcile the outsized expectations Winston had thrust onto his shoulders with the asphyxiating loyalty his father demanded. Every time Randolph tried to fashion an opportunity for himself, or attempted to assert an independent position, he found himself accused of sabotage. He had been Winston’s most passionate defender during his time in the wilderness, an unfailing source of affection and reassurance. And yet when Winston formed his government in 1940, there was no place for Randolph. All of this had lain under the surface for years; now it erupted.

 

Volatile, unable to control their emotions, the two men launched into rows that frightened anybody who witnessed them. Winston became so angry that Clementine feared he would have a heart attack; Randolph stormed out of rooms in tears, swearing that he would never see his father again.

 

Although a fragile peace was restored, it could not last. Randolph was unable to reconcile the deep animal love he bore for his father with what he regarded as Winston’s treachery. Nor could he understand why his parents continued to show Pamela such open affection. Winston reacted violently to his son’s reproaches. Wrapped up in his own consuming sense of destiny, and unable ever to read what was going on in anybody else’s heart, he did not see that he had done anything wrong. As Pamela moved serenely from one affair to the next, father and son fought, again and again, opening deeper and deeper wounds.

 

Randolph and Pamela’s divorce was finalized in 1945. Randolph could survive this, but the damage to his relationship with Winston was irreparable. They would never recapture the intimacy they had enjoyed before the war. Randolph had married Pamela to make his father happy, and yet he only succeeded in alienating the man he loved more than anybody else on the planet.

Economic Justice and Political Stability Require More Progressive Taxation

Income Tax filers, 1920


The invasion of the Capitol on January 6th is a sign of deep anger at the course of American life among what is usually called the “white working class.” Alongside it is the protest of black America over continuing racism and poverty. People with little else in common count perceived economic unfairness among their complaints. What can we as citizens of a democracy do about it? Significant reforms, such as those usually ascribed to the left in the Biden administration, are going to cost money. A return to progressive tax rates through meaningful tax reform will be part of the solution.

 

Economists with a sense of history point out that inequalities began to grow around 1980, starting with the Reagan tax cuts. Emmanuel Saez and Gabriel Zucman of the University of California, Berkeley, have done a service to the republic by methodically tracing what has happened to equality over the last 40 years. Their book, The Triumph of Injustice (2020), is contentious, but it sets out uncomfortable facts that bear upon a solution, and it provides a complementary analytical tool.

 

They trace and measure the working-, middle-, and upper-class divisions in American society all the way up to the super-rich and the top 400 families. Fully 50% — half — of the American people are classified as working class, with an average annual income of $18,500. They earned 20% of national income in 1980; 40 years later, only 12%. Most gains in the economy due to technological progress and globalization went to the upper 10%. The next 40% of the people are middle class, earning an average of $75,000. The last 10% of the people are reckoned as upper class or the rich, earning $220,000 annually. But they have divisions, too. The top 1% earn $1.5 million per year. In 1980 they earned 10% of national income, half as much as the whole working class; 40 years later, the positions had nearly reversed, with their share grown to 20% (pp. 3-7). At the very top are 400 families of the super-rich, including Warren Buffett, who earned $3.2 billion in 2015 and paid taxes of $1.8 million (a rate of 0.055%) (p. 129). Buffett is honorable in that he openly admits he should pay a higher rate; his rate, he has famously stated, is less than his secretary’s.

 

Reversal of this pattern is absolutely vital to a sense of fairness in America. We have already had one insurgency. But won’t the rich, especially the very rich, resist any proposal that increases their taxes? Money is their property. A tax is an appropriation of private property for public benefit. People within democracies are resistant to taxes until convinced of their necessity and justice. This country began in a tax revolt. How can we convince the 400, from Jeff Bezos (net worth $179 billion) to Alice Walton ($62 billion), to share?

 

The principle of progressive taxation was established over the course of American history. The Constitution did not originally provide for an income tax, but it distinguished between indirect taxes (like customs duties) and direct ones (like taxes on land). Customs duties or tariffs were understood as taxes on consumption, which are regressive, but the citizenry of the early republic were so nearly equal — most were owners and cultivators of farms — that the slightly increased price of foreign imports was bearable. For 100 years the principal revenue of the U.S. federal government was drawn from tariffs.

 

Great national crises have been the settings for the introduction of a progressive income tax. During the Civil War, the first income tax was introduced — as a direct tax — to meet the threat to the Union. Its rates were gradually reduced until the 1890s, at the height of the industrial Gilded Age, when the Supreme Court ruled (in Pollock v. Farmers’ Loan & Trust Co., 1895) that an income tax was a direct tax that had to be apportioned among the states, effectively striking it down. That defect was removed by the 16th Amendment (1913), one of the high achievements of the Progressive Era (1905-15). A regulated, orderly capitalistic economy was steadily established by the Interstate Commerce Commission, the Sherman and Clayton Anti-Trust Acts, the Federal Reserve System, the Federal Trade Commission, later the Securities and Exchange Commission, and the income tax.

 

Initial rates for the tax were quite modest (7% for the top bracket) but U.S. entry into the Great War increased the top marginal rate to 67% to counter war profiteering.  An estate tax was also established at 10% for the largest bequests, which grew to 20% by the late 1920s.

 

The great expansion of the income tax came with the supreme crises of the Depression and the Second World War. In the 1930s, with business in ruins for many owners and many workers reduced to poverty, President Franklin D. Roosevelt aimed to confiscate remaining excessive incomes. The top marginal rate rose to 79% in 1936. Roosevelt argued that in American democracy no one should, after taxes, have an income of more than $25,000 (equivalent now to about $1,000,000). The purpose was plainly to redistribute income to create a more equitable society. Roosevelt explained in his second inaugural address, in 1937, “The test of our progress is not whether we add more to the abundance of those who have much. It is whether we provide enough for those who have too little.” During the war, the top marginal rate rose to a maximum of 94%. This progressive rate fell slowly in the 1950s and ’60s (even in Nixon’s time it was 70%), producing the most equitable, and hence just, U.S. society yet in the industrial age.

 

This achievement from about 1936 to 1980 has largely been undone by tax cuts (Saez and Zucman calculate that the effective rate now paid by the richest 400 families is just 23%) and by the rise of an immense tax-dodging industry. Neoliberal economics argues that the optimal tax rate on capital should fall to zero, and that capital gains revenue should be replaced by higher taxes on labor income or consumption (p. 99). Reversal of this 40-year pattern is vital to bringing the working class into the promise of America and the very rich into recognition of their obligations to a democracy.

 

If you want peace, work for justice.

Richard Minear Reflects on Teaching History, Including Teaching Vietnamese History during the Vietnam War  


Dr. Minear grew up in Newton outside of Boston and his wife was born in Northampton.  He taught at Ohio State University from 1967 to 1970. “That was prime Vietnam time,” he said. “Columbus, Ohio is distinctly not New England. Ohio State is a huge school.” He also taught at the University of Massachusetts Amherst from 1971 to 2008.

Dr. Minear graciously answered questions via phone about history, his career, teaching, and what he is doing now while in retirement.

 

Did your education in your early life prepare you to eventually pursue a career as a historian?

Yes and no. I didn’t set out just to learn languages, but in this neck of the woods, being competent in Japanese, Chinese, or Vietnamese for that matter is a prerequisite.

Did you think you would travel to so many continents and experience different cultures?

By then, I had lived in Germany for two or three years and in Sweden for six months. When I was an eight-year-old, I went to a Swedish school while my father was on sabbatical. This was 1958-59: my brother was on a Fulbright scholarship in Germany and I was in Heidelberg for my junior year. My parents were in Holland that spring. European winters are god awful, with not much sunshine. That spring, because of my dad’s interest, he took my brother and me on a week-long tour: we hit Istanbul, then Palestine and Israel, coming in on the Jordanian side – Palestine/Israel was my dad’s turf, New Testament theology. Then we went to Rome on the way back. I had had a lot of travel.

Which Japanese island have you been to the most?

There are four major islands in Japan, and I spent most of my time on the main island. UMass has a sister university, Hokkaido University, where I spent a summer and part of another year. My first three years were in Kyoto, with a little time in Tokyo.

Is there a historical event that captivates you most?

I was born in 1938. I learned about World War II primarily after the fact. My first ‘political’ memory was of the atomic bomb while I was up in Vermont. I got on a boat on the lake and the sirens started going off and I remember a bonfire. That's part of my background in Japan and a natural focus or interest.

I was too young to serve in Korea, and I used the educational deferment, which got me through 1968, by which time I was 30 and married. The accident of chronology kept me out of the military during Vietnam, and Vietnam had a major impact on everything that I did afterwards. Ohio State had a quarter system, which meant three ‘semesters’ each year. As a beginning teacher I taught six courses that year, and I quickly ran out of Japan courses. In the third quarter of the first year, which would have been spring of 1968, it dawned on me that there was no course on Vietnam at Ohio State.

Here was a major university without a course on Vietnam and the war, and I proposed a course. Even though I had never had a course on Vietnam, my Asia background gave me some kind of entree. I taught the course in the spring of 1968, 1969, and 1970, and Vietnam had gotten very big. I brought in guest lecturers. Ever since, it has had a major effect on my politics, on my thinking, both having watched it and having read materials on it. I taught about Vietnam at UMass throughout the seventies, and it has had an effect on all of my teaching.

Was the effect related to how people perceived the Vietnam War or based on how you approached teaching and explaining about it?

It very quickly dawned on me that this was more than textbook stuff. I had students who graduated from my class and went into the military. One of the faculty members at Ohio State, who was also an ROTC instructor, gave (with a colleague) a single lecture in my course. It later dawned on me that they were only free to give the Pentagon line. He went back to Vietnam in late spring 1968, and a couple weeks later he was killed. Students were graduating from the course and then going to Vietnam, and the student population was wrapped up in the anti-war movement. That gives a sense of urgency, a certain seriousness, to what you do in the classroom.

Growing up in the 1940s and 50s, the high school and college education I had was pretty straight-lined and celebratory toward the American master narrative. My involvement with teaching about Vietnam and reading about Vietnam basically knocked me off that master narrative.

What influenced your interest in history?

It all looks different in retrospect than in prospect. A while ago, I looked back at my high school yearbook and several people had said, “You're going to make a great professor!” They were way ahead of me!

My background was liberal arts – English, history, and language. I knew I didn’t want to go into theology. I was a history major as an undergraduate, although it wasn’t much of a major. I spent my junior year abroad in Heidelberg, and we went to classes, but there was no attendance, no grading, and no exams. It was great for languages and for other purposes, and then came graduate school. I can remember Christmastime in 1960, after I graduated from college, when my family was in the living room. The question was, “What will Richie do next year?”

There wasn't any drive on my part or any consuming interest driving me to history, but once I got there, I was not sorry.

I had seen enough of German and European history, which is what my undergraduate major was mainly about, to realize that it was a pretty trampled, congested field, and somebody had told me about two-year programs in Asian studies. Yale had one, Harvard had one, and so did Berkeley. It was only two years – what could go wrong? That’s what got me into graduate school and into Japanese language.

How did the perspectives acquired through your education influence your career as a historian?

I think the steering was more from the outside world than from the education itself. The education that I got made possible the things that happened, but I think it was more stuff outside of the classroom. Vietnam happened, and it had a major impact on my teaching. From 1964 to 1966 I was in Japan as a Fulbright graduate student, and by the time I got back, the war was heating up and had become a much bigger topic in this country.

One of the fortunate things for me, first at Ohio State and then here in Massachusetts, was that I was the only Japan historian, which meant that except for rare occasions I wasn’t team-teaching or preparing my students to take an advanced course in the subject with someone else. I had an unusual independence when it came to coverage.

One of the major problems in history teaching is the compulsion – felt or actual – to cover the field: Japan from A to Z, or the United States from A to Z, or Germany. If there are others in your department who are likely to get those students the next year, if your teaching has to cover what other colleagues expect it to cover, then that is one thing; but I never faced that issue. That’s extraordinary freedom, and it has been important for me all the way through. The standard introductory courses are large, with discussion sections taught by graduate students, but all the way through I was able to lead my own discussion sections, so I rarely taught more than 60 people in one course. It was a Monday and Wednesday lecture and a discussion. Leading the discussion sections, I got to know the students as a result. It was important for me, not for them, to know where they were coming from.

How different were students’ specialties when they came to your survey course?

Here at UMass, the Japan survey was open to everybody, so the history majors were a small part of that. I didn’t really register which students were history majors or engineering or the sciences. This had an impact on my sense of audience, so I could give them some kind of perspective. One of your questions has to do with pedagogy: what we ought to be teaching and how to teach it.

Every year I had one and often two Japan survey courses taught to people who would never have another Japan course and who came from all parts of the university. This kind of shaped my ideas about teaching. We often think of covering the field, but in my experience, we don’t teach history; we talk about history. We should teach a habit of mind, not a list of facts. This may have changed since my student days, but I’m not sure. Students can take our courses without really getting a sense of what it means to think historically.

Things are different nowadays. Back when I was studying Asia, there were a handful of Asia experts at a dozen major universities. Now the US has many historians of Asia; it wasn’t true back then.

The name I knew was Edwin O. Reischauer, a major figure in the beginnings of Japanese studies. Later, Kennedy appointed him U.S. Ambassador to Japan. He was one of the names that attracted me to Harvard, but as soon as I got there he left, and I left before he got back.

We rarely teach about the history of the field. We rarely teach about who the historians are – who Reischauer was, what the American Japanists were doing in the World War II era. I didn’t have to cover more than six to eight people to cover the field. This was after Vietnam had shaped my thinking. Those scholars came out of World War II and its patriotic fervor, when Japan was the enemy; hence, they had a certain take on Japan. Some of them had a negative experience of Japan, but certainly the idea of ‘an American nation spreading democracy’ was shared almost across the spectrum.

Part of this is teaching about the background of the field and part of it is more practical – in my syllabi, this is after I had gotten my feet on the ground, after teaching about Vietnam for a while – I gave biographical information on every author we encountered in the course, and I included myself. Date of birth, educational background. Every discussion session, once a week, started out with a quiz. The first question was, “Who is the author?” and another question likely was “When was this written?” and maybe a third question was, “Where was it published?” Was it Life magazine or was it the Harvard Journal? That kind of questioning.

It underlined for the students that a major, major part of history is analyzing sources. Who is this person, and why are they saying these things about Vietnam, or about Japan in World War II? Who is his or her audience? The emphasis for me in teaching – yes, the subject was Japan, but the underlying goal throughout – was to get people to read critically, to think critically, not just about the authors we read, but also about me. At the end of the course, say, “OK, this was Professor Minear’s course; who is he and where is he coming from? Then factor that in.” How many history courses today have biographical data on the professor and everybody else?

And the continued emphasis in discussions, lectures: Who is this author? When did he write? Was it before the Tet Offensive or after? For what audience? It makes the students into players rather than audience members.

In your opinion, what is the purpose of history? Who are its intended consumers, and does the historian have a social responsibility?

I think for everyone, it’s different! With the audience, something we tend to forget is – in my case, I began graduate study as a 21-year-old, and I think that’s true for many of the folks. The sense of the audience then is nonexistent; you are just trying to get through the next exam, get your master’s, and decide whether to go on. But once you get past that and into your thesis, your audience is the three or four guys – and they were all men back then – on your thesis committee, all of them distinguished academics. I can remember thinking for a while in my thirties that my audience for my writing wasn’t anybody at UMass. My audience was 30 or 40 Japan experts like me, scattered around the country but limited to the ‘in-group’ of the real experts. I can remember thinking, at that stage, that maybe my audience was, in part, historians like me in Japan. If I was really good, they might learn something about Japan from what I had to say.

My teaching and publishing gradually got me away from that kind of hyper-professional focus on specialists and toward what was useful for non-experts – the students I was teaching in my courses, or general readers. Each historian has a different path to follow, and maybe everyone has different expectations and a different take on this.

My ‘5 minutes of fame’ was Dr. Seuss Goes to War: The World War II Editorial Cartoons of Theodor Seuss Geisel, and soon after that came out, I gave a talk in Dr. Seuss’s adopted hometown of La Jolla, California, at the University of California, San Diego (UCSD). They had posters around the campus with Dr. Seuss and my name on them. I knew one of the Japanists at UCSD. I bumped into him after the talk, and he said, “I saw this poster. I knew it wasn’t you because you were a Japan person.” The idea that a Japan person would write about Dr. Seuss didn’t compute, and yet that book got me on Good Morning America and All Things Considered. Part of teaching about writing got me into E.B. White, and I did an essay tracking the changes across the various editions of his book, The Elements of Style. The idea that a Japanist could do Dr. Seuss and E.B. White…

Back on Japan and speaking to the Japanese: my second book was Victors' Justice: The Tokyo War Crimes Trial. I wrote it in the middle of the Vietnam War, in anger. The Tokyo trial was the Pacific counterpart of Nuremberg, and when you look at it in retrospect, it was heavily a propaganda operation, and it had a serious impact. In that sense, I was writing a counter-piece about a trial that wasn’t exactly an exercise in justice. That gets translated into Japanese, and it reinforces what the hard right in Japan was saying about the war and about the Tokyo trial – that it was a put-up job. They were coming at it from a political position diametrically opposed to mine: the context makes a huge difference. They reacted; they loved the book. The Japanese have a proverb to the effect that if something about Japan is big news abroad, it tends to feed back into Japan, and the Japanese press sits up and takes notice. I guess it’s much less so for the United States, partly because of size, reach, and influence. What other people say matters [in Japan].

I have done a lot of translations of Hiroshima survivor accounts and more recently translations of "ephemera" (pamphlets, wall posters) produced by Japan's left-wing activists. It’s fascinating how stuff that you do for one audience can be read very differently by another audience.

I think maximum clarity about your own politics, your own stance, your own commitments – not simply clarity, but not hiding your politics – can give your readers enough material, whether it’s a biographical squib on a syllabus or a translator’s introduction to a translation, to have some clue as to who you are and where you are coming from.

One of the first major translations I did was of a WW2 battleship epic, Requiem for Battleship Yamato. The battleship sailed out at the end of the war into the Okinawa campaign on essentially a suicide mission. What were they going to do with the battleship? They turned it into a floating platform that would maybe have some minor effect on the battle; without air cover, it would be destroyed rapidly. One of the officers on Yamato wrote his account – he was one of the three hundred or so crew members, out of 3,332, who survived. We tend to look down on military history, but it was a stunning, gruesome, yet gorgeous account of his own experience, of truth-seeking. I showed a draft to my colleagues – a European historian and a Canadian classicist. The classicist said to me, “Any classicist (of whatever tradition) would appreciate Requiem for Battleship Yamato.”

No matter which classics (whether you’re a classical scholar in the European tradition, the Indian tradition, or the Chinese tradition), there’s horror on the one hand and human nobility on the other, but also underlying human need, a common humanity. I think that’s part of what we owe to the public and to our kids, to get across with our work.

When I started teaching the Vietnam course, I very quickly found a classic Vietnamese poem, The Tale of Kieu, written before the French takeover of Vietnam. It’s beautiful, utterly unconnected to the war – and yet. Kieu is a woman who undergoes great suffering, largely not of her own devising, and yet survives. The author is Vietnam’s Shakespeare. I can remember one fellow here at UMass in the Vietnam course who had to read this; he had served in Vietnam. He came up after I had him write a paper, and he said, “I feel closer to Kieu than I ever felt with any Vietnamese.”

If you approach Japan, China, Vietnam, or Russia through classics and poetry, it becomes a little harder to accept unthinkingly what used to be in the textbooks and the press, in the newspapers and the comics. I used to do a lecture on a Sergeant Rock comic book story, “Ali My”: it told of a U.S. operation in the war, and “Ali My” is an anagram of Mỹ Lai. A gruesome American massacre of Vietnamese civilians gets transmuted into a heroic battle.

How many of us grew up reading comics and war comics? Somebody needs to study videogames for their images. Who is the ‘other,’ who is the bad guy, how are they depicted, what are the gender dynamics? Videogames are having a far greater impact on our kids than any teacher in a classroom.

Who is our audience, and what do we know about our audience? What de-programming needs to be in place? One of the major influences on my intellectual development was Orientalism by Edward Said. That book blew my mind! I was already coming off of Vietnam disillusioned. Said’s book takes the entire tradition of European and American thinking about the Arab world and points out what a coherent, self-congratulating, and denigrating constellation it is. When the book came out, the Journal of Asian Studies commissioned essay reviews from three Asia experts: one on Japan, one on China, and one on India. I was the Japan person, and almost everything that Edward Said says about Orientalism transposes beautifully onto prewar and wartime American thinking about Japan.

You’re inside a tradition and you can’t see it as a tradition because it’s the world; but when somebody points it out from outside the tradition, or from a position within – when somebody nails it so beautifully – you can say, a-ha. This is a worldview. This is a coherent system, and we need to re-examine all of it.

There was true excitement there. What we ought to be doing in teaching is somehow to start conveying that excitement, that possibility, to the folks who are in classrooms. And then to say: OK, who is Edward Said? Where’s he coming from? And who am I – either as a professor or as a student – and where am I coming from? How does this all factor into how I read Edward Said and how I look at the American or European hang-ups about Japan or the Orient? It’s a game of mirrors, but it’s a deadly serious game of trying to be aware – not simply of what the tradition is, the matrix, what’s handed down, but also of myself and how I’m reacting and how I’m contributing in one way or another to the perpetuation or the challenge.

It’s only when you’re getting into it at that level, that order of operation, that you begin to see what a fascinating and difficult and impossible task we all have. But that’s where it goes back to the syllabus – biographical sketches of all of the authors and the dates when they’re writing: who is this person, when was she writing, where was this published? For the most part, we just don’t make our students aware that there is this whole level of thinking. How many times have you run into people who said, “I had history in high school and I hated it”? Don’t blame the teachers; they’re doing the best they can, given the constraints of SATs and covering the waterfront and all that.

Part of the problem is history is not exciting for most people because they don’t see it for what it is. They can get into a historical novel because in one way it comes alive, but when you read a book like Said’s Orientalism, all of a sudden, the whole board game shifts. The whole perspective gets challenged in ways that can only be useful.

Who writes history? By and large, of course, it’s the victors, but we don’t know who the victors are until much later. They cover stuff up. It has to be uncovered by oddballs like historians who don’t buy into the master narrative.

A story about Vietnam: when I was teaching the Vietnam course in the mid- to late 70s, the class size was smaller – there was less interest after a while. It was a class of maybe 40 kids, and we got two-thirds of the way through, and I said, “OK, you’ve got some play here in the last several weeks – what topics would you like to cover?”

I listed several possibilities, including Mỹ Lai. After the class, one of the guys who had been sitting in the back all semester said, “Well, Professor, if you were going to cover Mỹ Lai, I’d be happy to answer questions.” He had been in Lt. Calley’s platoon at Mỹ Lai. He did two class periods and took us through his training – he had been through Vietnamese language training. Your jaw drops.

 

What have you been doing since your retirement?

I retired in 2008, when I was 69. Since then, I’ve published three or four book-length translations. I’ve kept some of that going, and I’m doing a little bit now. I’m still living in Amherst, and I stopped teaching cold turkey and haven’t gone back to part-time teaching. It’s been 12, 13 years now since I gave a talk. For a while, with the Dr. Seuss book, I was giving talks on a regular basis, but I did stop, and I’ve been happily [retired].

Amherst is a neat place to retire; it’s a beautiful fit. The town is close to the hills, and the roads are good for biking. I do 25-30 miles when I go out. I hike in the hills; I bike north and south along the Connecticut. There’s an online journal, The Asia Pacific Journal: Japan Focus, and my most recent stuff is there, including, as I mentioned, a Japanese leftist pamphlet about a Japanese massacre of Chinese forced laborers in the summer of 1945. I’ve kept a toe – or two toes – in. I loved teaching while I was doing it, but I’m happy not to be doing it now.

I’ve always been active.

I have two sons, now in their 50s, but for a while we did triathlons as a team. I swam, one of them biked, and one of them ran. If you did a great time, you could qualify for the Ironman. We weren’t in that category, but it was fascinating just to see how fit some of the folks were.

For many students then and now, martial arts offers a way into Japanese culture. One of my students from 20 years ago sat in on and then took one of my courses. She’s now an MMA practitioner, in the top ten in her weight category.

You take them where they are – try to figure out where they are and what you can do that might be useful, not in terms of a profession, but in thinking about Japan, about life, about what it means to be human. Those folks are maybe less likely to doubt the basic humanity of the Vietnamese or the Japanese. Martial arts practitioners – or fans of anime, or Zen meditators – have an advantage. One toe in the door.

A Personal and Family History of Encountering Prejudice and Intolerance


Anti-Asian sentiment is nothing new in America. Consider the Chinese Exclusion Act of 1882 and the acts that preceded it, especially those that restricted Chinese women from living and working in the United States, and the laws that followed, which relegated any person of Asian descent to second-class citizenship. There has always been a deep-seated fear of what white supremacists call the yellow peril, a concept that some historians believe originated as far back as the Greco-Roman Wars. William Randolph Hearst is the villain who in the 1900s popularized the yellow peril in his newspapers as a major selling tool in the era of yellow journalism. Whether or not he believed it, he claimed that America was under threat of invasion from Japan: thus the yellow peril. We should never forget Hearst and the role he played in creating this deepest of inhumane prejudices. The threat of a Japanese invasion is long gone, but the fear of Asians, their look and skin color, remains deeply engrained in America's collective psyche.

It would be easy to review, law by law, how Asians en masse have suffered because of discrimination, but that would be nothing new. I am here to tell you about my life, a personal history if you will, as I inadvertently became a part of the wider Asian community in many different countries. My late wife was from Saigon, Vietnam. We met in Saigon, married in Hong Kong, and lived in London, Washington and New York. We had three mixed-race children, two boys and a girl, now thriving adults. I have three grandchildren, boys who, because of their antecedents, are part of the Asian continuum.

We lived in dynamic cities. As a soon-to-be-married couple we had a tough time in Saigon. Unmarried and still courting, we did not live together. When we appeared in public as a couple, Vietnamese soldiers mocked and chided us, accusing me of usurping their women and calling my future wife a whore. We found it better to walk separately, with me behind her, and never to hold hands or otherwise touch in public. Incidental, but no less important, are the memories my wife had of being chased through the streets of Saigon by French soldiers from Africa. From an early age she understood what it meant to be sexually harassed.

We thought life in Hong Kong would be better, and for the most part it was, but prejudice tailed us everywhere we went. In the 1960s, Hong Kong was a progressive, dynamic city with many mixed-race couples, yet European mixed-race couples were more easily accepted than I was as an American with an Asian wife. I should note that during the Vietnam War there was no love for Americans; the war was not very popular in Southeast Asia, of which Hong Kong was a part. Do I attribute the prejudice we felt to my being an American, for some reason easily detectable because of the way I walked, looked, and dressed? To a degree, yes, but it was mostly because we were a couple. A Chinese doctor friend said with a smile that many Asian men could not understand why a beautiful Asian woman, particularly one from Vietnam, would consort with a pale-faced, big-nosed American. Beyond that popular descriptive utterance, he had no answer for why prejudice should be part of anyone's life. I knew it was not part of his.

Life in London, Washington and New York seemed, at least on the surface, to hold less prejudice than Saigon or Hong Kong, yet it still existed in many forms, especially for my wife and then for my mixed-race children as they grew and we established ourselves on Long Island. The outward expression of the prejudice we experienced was the hard stares of people who viewed us as beings out of the ordinary. Most of Long Island then was conservative and not very progressive, so the sight of a mixed couple, often with their mixed-race children, out for a meal in public was strange enough that most people could not help staring, even for a moment. We were uncomfortable, but we did nothing to stop the stares. We learned that keeping to ourselves in public was the best defense, though at times I wanted to physically strike out against their stupidity, as I once did in a movie theater in Hong Kong when we faced a crowd of teenagers who attacked us verbally for being a couple.

My wife worked for years on Long Island helping settle Vietnamese, Lao and Cambodian refugees. She served as a court interpreter for many of them, helping them navigate proceedings in a language most did not speak or understand, and she worked with them to traverse the intricacies of the benefits they were due. The prejudice those new immigrants felt knew no bounds, but neither they nor she ever complained publicly. Getting and holding jobs and making life work, no matter how trivial what they did may have seemed, was more important than registering a complaint about a life they were trying to understand and survive. Many of these former refugees, now adults, made it through to the new world of opportunity in America. When they first arrived, intolerance, though a concern, was not an issue. In time, they ignored, but never forgot, the unreasonable hatred they knew as newcomers to our so-called hallowed land. For many years hate crimes were not an issue for them. Now that they are, living their lives to the full and educating their children about the evils of hate and intolerance works best for them in our current climate.

I am a Caucasian Jew, my family from Lithuania and Russia. That is normally enough for full-bore bigotry. I grew up in a diverse neighborhood in Brooklyn and felt almost no prejudice. It was not until college that I suffered for being not only Jewish, but a New York Jew, a condition I survived with added strength into adulthood. My wife was South Vietnamese, part Chinese and a Buddhist. By the sheer force of her personality she was able to overcome much of the racial intolerance that permeated Long Island, but she never understood why some people did not like her because she looked different. My mixed-race children, today all worldly adults, are part of a unique fabric that is more like an abstract quilt. They are white and Vietnamese, but with those other fragments blended in. It was quite a mix, and a serious burden for young children to carry. As children they knew they looked different. Every day in their preteen and young teen years they knew the slings and arrows of racism. "Chink" was the epithet with which intolerant and mostly ignorant kids and teenagers usually attacked my sons. My oldest son knew he looked different; he turned to Judo in the hopes that he could defend himself if attacked. My daughter simply said yes when asked if she had known bigotry, but she did not elaborate.

As a family we never talked about hatred and racial intolerance, but I know this: what my wife and children went through informs my children's lives to this day. In a backward sort of way, the bigotry in their lives has taught them to be better husbands, a better wife and better parents. They are better people for what they learned, and for what many other people have never known or, sadly, will never understand.

Paying the Price: Our Veterans and the Burden of Parkinson's Disease


Parkinson’s disease is the world’s fastest-growing brain disease, growing even faster than Alzheimer’s. The number affected worldwide has doubled in the past 25 years and, absent change, will double again in the coming generation. In the U.S., 1.1 million Americans bear its burden, up 35% in just the past decade. The toll is especially great on veterans: 110,000 have the debilitating disease.

 

Veterans are at high risk for at least three reasons. First, many were exposed to toxic herbicides like Agent Orange during the Vietnam War and other conflicts. Richard Stewart is one of those affected. He is a former Green Beret who served as a platoon leader in Vietnam for the U.S. Army’s famous 101st Airborne Division. He, like thousands of other veterans and millions of Vietnamese, was often soaked by the 45 million liters of Agent Orange (“pretty nasty stuff,” in his words) that were sprayed in the country. The chemical, which derived its name from the large orange barrels in which it was stored, killed vegetation and crops and contributed to birth defects, cancer, and Parkinson’s disease. Today, Stewart lives in upstate New York with his wife, a “flower child who peacefully protested the war.” He still walks 2.5 miles and does 200 push-ups daily, is a member of local veterans’ groups, and says, “I only have Parkinson’s. A lot of people are worse off.”

 

Pesticides are not the only chemicals contributing to Parkinson’s disease among veterans. Trichloroethylene, or TCE, is another. TCE has been used to decaffeinate coffee, clean silicon wafers, and remove grease. The military used the dangerous chemical to clean engines and vehicles. At the Marine Corps Base Camp Lejeune in Jacksonville, North Carolina, TCE and 70 other chemicals poisoned the base and its water supply for 25 years. Over one million service members, their spouses, and children were exposed to its toxic effects, leading to miscarriages, birth defects, cancer—and Parkinson’s disease. Many drank contaminated water or inhaled TCE that had evaporated into their homes, like radon, from polluted groundwater. The consequences of that exposure are still being felt 30 years later.

 

Finally, head trauma contributes to Parkinson’s disease.  A single head injury causing loss of consciousness or memory loss can triple the risk of Parkinson’s.  Repeated head trauma raises the risk even further.  These injuries are common in the military.  According to the U.S. Department of Defense, nearly 400,000 service members have had a traumatic brain injury since 2000.  Another eight million veterans have likely experienced such an injury.  Of those with moderate or severe injury, one in fifty will develop Parkinson’s within 12 years.

 

So what can we do to help our veterans? The first and most important step is to prevent those who serve from ever developing the disease.  Banning harmful pesticides and chemicals like TCE, which the Environmental Protection Agency has proposed to do, is an important step.  We also need to clean up contaminated sites throughout the country, many of which are located on current or former military bases.  In addition, service members must have proper equipment to minimize the risk of head injury.

 

Next, we need to advocate for those who have already been harmed. Veterans who have Parkinson’s and were exposed to Agent Orange are now eligible for disability compensation and health care. Some efforts have been made to help those whose Parkinson’s is tied to their service at Camp Lejeune. But these efforts are insufficient and have excluded many who have been injured. For example, in 2019, the U.S. Navy denied civil claims from about 4,500 people harmed at Camp Lejeune.

 

We also need more research to prevent, measure, and treat the condition.  Despite Parkinson’s growth over the past decade, funding from the National Institutes of Health for the condition, adjusted for inflation, has actually decreased. 

 

Anyone anywhere with Parkinson’s should receive the care that they need.  The Veterans Health Administration has long had dedicated centers to research and treat Parkinson’s.  However, not every veteran lives near one of these centers.  Telemedicine is one way to expand the reach of care, but some veterans do not have internet access.  Others need in-person in-home care and support.  Increased access and novel care models can help ensure that no one suffers in silence.

 

Finally, better treatments for Parkinson’s disease are lacking.  The most effective medication for the condition is now 50 years old, and we have had no major therapeutic breakthroughs this century.  The economic burden of Parkinson’s disease is over $50 billion per year.  Federal and foundation support is less than 1% of that total.  That will not get the job done.  We must increase our research efforts ten-fold to change the course of Parkinson’s as we did for polio, HIV, and COVID-19.

 

Veterans have served and sacrificed too much to have Parkinson’s be their fate. 

Teachers, Keep Hope about the Minds You Influence

Professor Donald Treadgold of the University of Washington. 


How and to what degree does a teacher impact a student? I doubt that we will ever be able to gauge the matter. I believe it may boil down to matters of hopefulness and pessimism and the moral imperative of making a choice between them. Surely each one of us, professional educators and laymen alike, impacts the lives of people whose paths we cross, but we often don't know which ones or to what degree. We must remain hopeful as an article of faith.

I once had a memorable professor at the University of Washington, Donald Warren Treadgold, an eminent scholar of the Soviet Union. Let's be kind and just say that this man, a cold warrior extraordinaire, knew his own mind. He was more than a little famous for that. Professor Treadgold was a prolific author, and his work was known throughout the world. He authored Lenin and His Rivals; The Great Siberian Migration; Twentieth Century Russia; The West in Russia and China (two volumes); and Freedom: A History.

When I first wandered into one of Professor Treadgold's classes, almost all of my study had been of western Europe. I was a stranger in a strange land in my attempts to learn about Russia and the Soviet Union. Considering myself a hotshot, I plunged forward. But wait. The rub was that Professor Treadgold attempted to teach me a great deal that I found myself resisting at every turn. It all took place within the constraints of academic etiquette, but make no mistake, this was a slugging match. And it was a mismatch, for he knew so much, and I knew so little. I considered him to be an old relic. He considered me to be a dopey, misguided, poorly informed idealist. I dug in. He persisted.

Throughout the following years, my memory of him remained fresh. I continued to remember his disdain for my viewpoints, his deep learning, his patient demeanor, and the overall gentleness of his character. And as the decades passed, I found myself incorporating much of what he had vainly tried to teach me. It dripped into me, consciously and subconsciously. I never swallowed it whole, but the slow drip never stopped. I can now firmly say that he had as great an impact on me, both morally and intellectually, as any person I have known.

One day, many years later, I was pecking away at my computer. Suddenly, for no conscious reason, I googled his name. I found that he had passed away two years earlier of leukemia. Stunned, I gazed out my window. The sun was going down and it looked cold outside. The streets were empty. I placed both hands over my face and sobbed like a little child.

The Roundup Top Ten for April 1, 2021

The Painful History of the Georgia Voting Law

by Jason Morgan Ward

The new wave of vote suppression bills, like the one in Georgia, reflects a less obvious but important aspect of Jim Crow law: the use of superficially race-neutral language to keep specific groups from voting. The danger is that courts today will similarly fail to see these bills for what they are.

 

Mitch McConnell is Wrong. The Filibuster is, in Fact, Racist

by Keisha N. Blain

"Try as he might, McConnell cannot erase the historical record. To use his own words, 'There's no dispute among historians about that'."


Working with Histories that Haunt Us

by Marius Kothor

The author responds to a recent essay on the traumatic aspects of archival research. For her, a political exile from Togo, identity and experience converged with subject matter she couldn't study at a remove.


Government has Always Picked Winners and Losers

by David M.P. Freund

Government action has always been tied to economic growth and has always involved policy choosing winners and losers. The policies proposed by the Biden administration as part of the COVID recovery aren't inserting the government into the market; they're changing the parties favored by government policy.


The Problem with Confederate Monuments

by Karen L. Cox

"I also believe it’s important that I, a Southern white woman, write and speak about this topic with blunt honesty. Monument defenders cannot dismiss me as a Northern liberal who has invaded the region to tell them what to do. I’ve grown up here, too."


Teaching Controversial History: Four Moves

by Jonathan Wilson

A reflection on the work of teaching controversial subjects argues that it's essential to respect students' autonomy and provide them with the tools with which to change their own minds. 


Who's Afraid of Antiracism?

by Chelsea Stieber

Recent books in different genres shed light on the limits of the French governing ideal of republican universalism for a society where racism is real and historically significant. 


Paleo Con

by Daniel Immerwahr

Why do the lifestyles of paleolithic hunter-gatherers repeatedly pop up as foils for western capitalist modernity? 


The Lack of Federal Voting Rights Protections Returns Us to the Pre-Civil War Era

by Kate Masur

New vote suppression bills in multiple states threaten to return the United States not to the Jim Crow era but to the period before the Civil War and Reconstruction, when civil and political rights were protected or denied according to state politics.


America’s Longest War Winds Down

by Andrew Bacevich

Public fatigue over the ongoing War on Terror must not allow political leaders to do what they seem to want most to do: avoid taking responsibility or learning lessons.
