History News Network - Front Page. Wed, 23 Jun 2021 11:57:58 +0000

HNN Turned 20 This Week! Revisit the First Edition

HNN Founder Rick Shenkman published the first issue of History News Network twenty years ago this week. In the two decades since, HNN has continued to carry out its mission of offering historical perspective on the news and giving historians a platform to use their expertise to comment on new discoveries, current events, and public understanding of the past.

Thanks to the invaluable resource of the Internet Archive's Wayback Machine, you can check out that inaugural issue (most images are, sadly, lost). 

I hope that this is a fun trip down memory lane for our readers who have been with HNN for the whole journey and for relative newcomers, myself included. 

It is absolutely the case that HNN has endured and thrived because of the support of our readers. We are the leading site on the web putting news in historical perspective because of our audience and our contributors, who make HNN a destination for informed commentary, a forum for incisive discussion, and, we believe, an asset to the community of historians. 

Our readers have also sustained HNN with financial support. Contributions from our readers are necessary to pay for our web hosting, staffing, media subscriptions, and other vital tools of our work. If you value HNN and want to see what we will do in the next 20 years, please consider making a contribution today. 

 


PUTTING THE NEWS IN PERSPECTIVE

HNN FEATURE WAS THIS MAN GAY? The Question That Won't Go Away
Homophobia in Lincoln Studies? Philip Nobile Reports

 

ASK MR. HISTORY Who invented the gas tax?

BOOK REVIEW: Why Ike Ran
CNN's Pearl Harbor Mistake
Thomas Fleming Exposes Pearl Harbor Myths
The Jeffords Switch: Why Bushes Get Betrayed
Still Mad About Election 2000?

 

WEEK OF JUNE 10
HNN FEATURE: W. Scott Thompson, Was Lincoln Gay?
SETTING THE RECORD STRAIGHT: Thomas Fleming, Pearl Harbor Hype
HNN COLUMN: William Thompson, Do You Have to be a Democrat to Like the Jeffords Switch?
GOTCHA! HNN Staff, CNN's Pearl Harbor Mistake
READY, AIM, FIRE! Philip Nobile, Don't Ask, Don't Tell, Don't Publish: Homophobia in Lincoln Studies?
POLITICS: Bernard Weisberger, What History Tells Us Will Likely Happen to Those Giant Surpluses
POLITICS: Walter Nugent, Reflections on Election 2000
HISTORIAN'S TAKE ON THE NEWS: Rick Shenkman, Family Feud: Jeffords Switch
ASK MR. HISTORY: HNN Staff, Who Dreamed Up the Gas Tax?
BOOK REVIEW: Steven Wagner, William Pickett's Eisenhower Decides to Run

 

Copyright ©2001 History News Network. All Rights Reserved.
https://historynewsnetwork.org/article/180514
Powerline Politics in the 1970s and Today

In the frigid winter of 1983, the first book my college professor assigned for our “Introduction to Political Science” course was Powerline: The First Battle of America’s Energy War.  The professor, Paul Wellstone, had co-authored the book in 1981 with his Carleton College physics colleague, Barry (Mike) Casper.  Powerline recounts the fierce protests in the late 1970s by western Minnesota farmers against the construction of a 430-mile, 400-kilovolt powerline, called the “CU Project,” that ran from a coal-fired power plant in central North Dakota across agricultural land to a substation outside Minneapolis. 

 

Wellstone and Casper had coached the powerline protestors and documented their stories.  The alliances they forged helped lay the groundwork for Wellstone’s 1990 election to the U.S. Senate, where he served as its most liberal member until he was killed in a plane crash in 2002.  I wonder what Wellstone and Casper (who died in 2007), both champions of renewable energy and environmental sustainability, would make of today’s brewing energy wars over proposals, such as President Joe Biden’s infrastructure plan, to expand high-voltage, long-distance transmission lines.  A massive build-out of the grid has become central to strategies for decarbonizing the electric sector, electrifying the economy, and meeting net-zero carbon-emission targets by 2050.  Commentators usually present the barriers to such expansion as “financial and administrative.”  An equally large barrier, however, is likely to be political resistance to powerlines like the revolt in rural Minnesota forty years ago.

 

Powerline is a classic David-versus-Goliath morality tale.  The Goliath electric utilities and state government officials possessed the power of eminent domain, the right to take private property for public use.  Their allies in state courts and law enforcement enforced government decisions and protected the line.  The dairy farmer Davids in Grant, Pope, and Stearns counties, by contrast, had little voice in the process of granting the “certificate of need” for the line and determining where it would be located.  Neither non-violent nor violent action could stop the project.  The lesson my classmates and I took away from the book was “fight the power!” 

 

Lost on us at the time, though, was the complexity of the power struggle.  The developers of the CU Project were not the investor-owned “private power” companies long distrusted by farmers, but two associations of rural electric cooperatives, the kind that rural Americans fought for under the 1936 Rural Electrification Act.  In the early 1970s, the price paid by the cooperatives for electric power skyrocketed.  Coal-fired power was by far the cheapest option for new electricity generation, and one promoted heavily by the Ford and Carter administrations.  As interviews from the Minnesota Powerline Oral History Project reveal, the cooperatives viewed the CU Project as a way of increasing control over energy costs to their members, many of whom favored the line.  But those supporters did not have easements forced on them through eminent domain.

 

For those who did, the powerline represented an assault on their way of life, a government intrusion into how they managed their privately owned land.  Many also alleged that electrical and magnetic fields emitted from high-voltage lines caused cancer and other ailments, allegations that have since been disproven.  After exhausting their options through the legal and political system, the farmers’ initial efforts at non-violent civil disobedience turned violent.  Protestors pulled survey stakes, destroyed concrete tower foundations, toppled transmission towers (fourteen in all), and shot at private security guards.  

 

In Powerline, Wellstone and Casper justify these acts of violence and sabotage as the farmers’ only recourse in the face of “increasing centralization of energy facilities and energy control.”  They conclude that the powerline would not have been needed had the utilities followed the “soft energy path” toward conservation and local renewable generation.  They argue that the $1.2 billion in low-interest REA loans for the CU Project would have been better spent on home insulation, wood-burning stoves, and solar-water heaters.  Taking cues from Powerline, a recent history of the 1970s “rural revolt” goes so far as to depict the powerline protestors as budding renewable energy advocates.

 

That depiction is far-fetched.  These farms, averaging 300-400 acres in size, consumed enormous amounts of electricity, fuel oil, propane, diesel fuel, and gasoline.  Small-scale solar installations would not have powered their dairy, poultry, and crop operations.  These protests were driven by concerns not about energy, but about land.  As the Powerline Oral History project shows, the protesting farmers resented the use of eminent domain to force easements across their land against their will.  They seethed at state and federal environmental laws that prevented the siting of utility lines through wildlife areas and along highway rights-of-way, but not across farmland.

 

The farmers’ revolt did not prevent the CU powerline from entering into operation in August 1979.  It still delivers electricity today.  However, the revolt did win important reforms for future projects.  Minnesota passed a “buy-the-farm” statute that guaranteed more public involvement in energy siting decisions and strengthened landowner rights in eminent domain proceedings.  The state later amended eminent domain law to increase condemnation payments to property owners.  Congress also revised the Highway Beautification Act to make it possible to build powerlines within highway rights-of-way.  These reforms smoothed the way for the 2017 completion of the CapX2020 transmission project, a system of eight hundred miles of new transmission to bring electricity from the Dakotas across Minnesota. 

 

Still, property rights challenges to the construction of energy infrastructure are gaining force, echoing the complaints from the Minnesota plains in the 1970s.  The context this time is different, mainly due to the issue of climate change.  As part of their “keep it in the ground” strategy to reduce carbon emissions, climate activists oppose the use of eminent domain when it comes to oil and gas pipelines.  Such opposition, however, also feeds resistance to high-voltage powerlines, whether transmitting electricity from fossil fuels or renewable sources. 

 

Whereas climate activists may distinguish between fossil fuels and renewable energy when it comes to enforcing eminent domain, their property rights allies in pipeline fights do not.  Like the Minnesota farmers in the 1970s, rural Americans are intent on guarding their land, and they object more to infrastructure above ground than below.  In 2016, Iowa farmers challenged the granting of eminent domain for the Dakota Access Pipeline (DAPL).  Although supported by the Sierra Club, these Iowans were not concerned about DAPL’s climate implications, but by what they viewed as an unconstitutional taking of their property.  The Iowa Supreme Court denied the challenge, but the state legislature in 2017 acknowledged this property rights insurgency by banning the use of eminent domain for siting overhead electricity transmission lines.

 

Recently, developers have sketched out many large-scale transmission projects to integrate dispersed wind and solar power into regional grids.  Property owners across the United States are fighting to stop them.  The now-debunked claim that power lines cause cancer still fuels opposition.  As Russell Gold’s Superpower recounts, Clean Line Energy Partners worked fruitlessly for years to overcome landowner objections to powerlines that would bring wind energy from Oklahoma to Tennessee and from Kansas to Indiana.  Those regions, along with Iowa and Nebraska, are precisely where wind energy is needed the most to meet net-zero goals.  Battles have been joined in New Jersey, New Hampshire, Illinois, and Wisconsin to block other transmission projects.  In addition, much opposition has mobilized against wind projects across the Northeast and Midwest, including Minnesota.

 

In some places, there are options for running lines along railway or highway rights-of-way to avoid eminent domain proceedings.  Transparent and inclusive transmission planning, exemplified by CapX2020, has proven that landholders can be won over in some regions.  Nevertheless, achieving carbon-reduction targets will require many wind farms, solar arrays, and transmission lines across the land of rural Americans.  Will some end up resisting in the way that many Minnesotans did in the 1970s?  Would Paul Wellstone and Mike Casper support such resistance now if it undermined national climate action?

https://historynewsnetwork.org/article/180575
Recovering the Stories of Pioneering Frenchwomen of Science

Science depicted as a woman in an 18th century engraving. Bodleian Libraries, Oxford

Women of science who did their work in the past are powerful examples for female scientists today, and for the push to encourage girls to enter, and women to stay, in the STEM fields. There is nothing close to parity between the sexes in these disciplines, a source of great concern in our present climate where advances in science and technology are critical for restoring the democracies of the world to global leadership. For too long, governments in the Western world have failed to support and recognize the extreme importance of scientific research, and hence have not provided anywhere close to the financial and moral support that they require. The consequence of this is the decline of our position on the world stage, and even more frighteningly the assault on facts that right now threatens any real progress in the pursuit of knowledge. Add to that the problem that the female half of humanity is denied the same uplifting and recognition in the STEM fields as their male counterparts, and we have a true crisis on our hands.

Why would any sensible government not recognize how essential it is to treat women equally, in every respect? But here I am arguing that this is especially essential to the survival and eventually to the thriving of the sciences. Women can and want to contribute, and they need all the inspiration and backing they can get. Only about 3% of Nobel prizes in science have ever gone to women, and that includes the uptick in female laureates in just the last few years.

Examples of intrepid women in the past who accomplished great things can only help in this effort, even if they lived centuries ago and in another country. Their stories demonstrate the perseverance necessary to study nature and further the understanding of its workings, when the conventions of the day required nothing more of them than that they marry, run an efficient household, and produce children, preferably an heir and even a spare. But a few women bucked the tide and found ways to excel beyond these confines, to nourish their intellectual ambitions and make valuable contributions to knowledge. They had no role models and so were themselves the pioneers. Their courage, grit and resoluteness in the face of obstacles were extraordinary.

I have devoted my career to writing the lives of such women in 18th century France, the country that gave birth to the Enlightenment. It might be argued that the first major figures of the Enlightenment were English, but this movement positively exploded in France from the early 1700s. The decadent Catholic French monarchy and its strict censorship of new ideas gave rise to resentment of traditional norms, and this in turn erupted into new demands for progress, for toleration, for the use of reason as the arbiter of truth, and for science. A true war on fanaticism had begun, encapsulated in Voltaire’s famous call to “Écrasez l’Infâme,” meaning “crush the loathsome thing,” in particular superstition and intolerance of all kinds. Voltaire’s partner, not accidentally, was Mme du Châtelet, a brilliant woman of science who translated Newton into French and presented an erudite commentary on his physics, in addition to doing original experiments of her own.

The Enlightenment was capped in 1789 by the French Revolution, with its stirring, unforgettable slogan “Liberté, Egalité, Fraternité.” Fraternity is all very well and good, but it is the part of this rallying cry that has always bothered me. Where were the women? What were they doing? They were surely there, but why was nobody talking about them? By adjusting our lens, we can find them.

My work has always attempted to bring to the fore some of their stories. My first book dealt with women journalists of the Enlightenment, especially those who edited a paper called, appropriately, Le Journal des Dames. Its first female director, Mme de Beaumer, wrote the following: “Let it be firmly resolved that women will henceforth be enlightened and intelligent…. Courage, women, no more timidity. Let us prove that we can think, speak, study, and criticize as well as [men]…. I await this revolution with impatience. I will do my utmost to be the first to precipitate it…. Men everywhere are being forced to recognize that Nature made the two sexes equal.” Yes, even in the 1700s radical feminists could be found. And she set the tone for her male colleagues to use the paper for opposition causes.  

My second book, The King’s Midwife: A History and Mystery of Madame du Coudray, told of an astonishingly brave woman who was commissioned by Louis XV, and then re-commissioned by Louis XVI, to travel throughout the realm teaching the latest obstetrical practices, with the mandate to arrest infant mortality—nothing less! The royal administrators in all the provinces had to do her bidding so that she could accomplish her mission, which lasted several decades. She could, and did, report them to the king in Versailles if they made things difficult for her.

My new book, just out last month, Minerva’s French Sisters: Women of Science in Enlightenment France, presents the lives and work of a sextet who sought to elucidate the workings of Nature: mathematician and epistemologist Elisabeth Ferrand; astronomer and “learned calculator” Nicole Reine Lepaute; field naturalist Jeanne Barret; botanist and botanical illustrator Madeleine Françoise Basseporte; anatomist and inventor Marie-Marguerite Biheron; and chemist Geneviève Thiroux d’Arconville. One was a peasant, another a pharmacist’s daughter, yet another a grande dame of the aristocracy. But they shared a passion for science, and devoted decades of their lives in the pursuit of understanding the mind, the mechanics of the heavens, the flora of exotic places, the wonders of the human body, and the phenomenon of putrefaction, how living flesh decays, and what substances can arrest or delay this process.

Women today who aspire to do science still have an uphill battle, but as Marie Curie is believed to have said, “We must have perseverance and above all confidence in ourselves. We must believe that we are gifted for something, and that this thing, at whatever cost, must be attained.”

https://historynewsnetwork.org/article/180576
"Juke": Bluesman Bobby Rush on the Roots of Rock and Roll

Cover image courtesy Hachette Books. Bobby Rush performs at Chicago Blues Festival, 2010. Photo: Bryan Thompson, Creative Commons Attribution 2.0

The good Lord had planted me in the right place. For Pine Bluff, Arkansas, had lots of juke joints. There was Sturdik’s, the Elks Lodge, Drum’s, Nappy’s, the Jack Rabbit, and Jitterbug, where I got to first know Elmo James. In the early ’50s some juke joints had improved in the way they looked, while others looked like they hadn’t been touched since Negroes built them. Some didn’t even have indoor toilets.

 

Most of the jukes in Pine Bluff couldn’t hold more than a hundred people. Many were in the woods, off the beaten path. This was so the bootlegging and gambling and—sometimes—prostitution could take place freely. ’Cause the four Bs—the blues, booze, broads, and booty (money, not boo-tay)—were the magical mixture of people spending their paychecks, and that was the lifeblood of the juke joint hustle.

 

Still, music being the central attraction for most, we Black folks raised Cain to get into the juke joints. With delicious music, there was a feelin’ in the air of livin’ it up. Everybody is enjoying everybody else. Men picked up women and women picked up men. Some people were fully liquored up, and some just a little tipsy. Segregation was a way of life for us. But in juke joints we fixed onto being segregated. Being in the thick of ourselves with our own groove. The sights and sounds, the tastes and smells, were ours and ours alone. There was freedom in these places. But it doesn’t mean that jukes couldn’t get rowdy, especially on the weekends after payday.

 

But I would make a name for myself at Nappy’s and at a jumping spot in Altheimer, Arkansas, called the Busy Bee Cafe. Nappy’s was the first place I performed with a band, a microphone, and an amplifier. Just a drummer, upright bass player, and me. Nappy’s sat behind a big old sawmill. And like most juke joints, it didn’t look like anything from the outside. Just a frame dwelling with a front door and a back door. No windows. There were three of us, each paid 75 cents along with a plate of chitlins, a hot dog, and a hamburger. I sold my chitlins and my hamburger for 50 cents and ate my hot dog. I had $1.25. I was happy.

 

In my first gig at Nappy’s I was just that teenager with personality that could get the house goin’. Even though I was playing 90 percent of every song I knew in the same key, with a hard-finger-playing bassist and a drummer who could really snap, we grooved. And I played enough of the hits of the day to keep the party going. Louis Jordan’s “Saturday Night Fish Fry” and “Caledonia,” Johnny Otis’s “Double Crossing Blues” and “Cupid’s Boogie.” I blazed through Charles Brown’s classic “Trouble Blues” and Memphis Slim’s “Messin’ Around.” Those were great party songs. And with a few of my own ditties thrown in, I had a pretty bouncin’ set.

 

But this was late 1951, and there was one song that came out earlier in the year that I had to play in my set. And that was “Rocket 88” by Jackie Brenston & His Delta Cats. But for all intents and purposes, this was an Ike Turner record. It was his band and, although not credited, he wrote the song. That record changed everything. Most people say “Rocket 88” is the first rock ’n’ roll record, but as Ike Turner told me repeatedly for over forty years, “Bobbyrush, everybody knows that that’s a damn R&B record.”

 

R&B was very boogie-woogie influenced early on. But in terms of what they then called pop or popular music, that boogie-woogie sound was the opposite of the very calm and stuffy hit parade tunes. Hell, the big pop hits of 1951 were songs like “The Tennessee Waltz” by Patti Page, “Too Young” by Nat Cole, and “If” by Perry Como.

 

“Rocket 88” just turned the world upside down. Just like ten years before when the snap and pop of the jump blues sound swept young folks off their feet. The distorted guitar and sexual innuendo dripping from “Rocket 88” was too much for young Black and white kids to take. You felt that shit in your bones. It made you horny in a way that wasn’t all sexual. It had heat. It was literally a fresh feeling when I played that song from the stage. But no one could deny the smokin’ energy of “Rocket.”

 

I took Boyd Gilmore’s advice when he said, “You gotta git, boy,” and made it my religion. I went to every gig by every artist I could in Pine Bluff and the surrounding area. Saw everybody. From guys performing on the street to stars performing at big nightclubs and the little juke joints. I would do a lot of soaking at the Townsend Park Recreation Center, or, as everybody called it, “Big Rec.”

 

Big Joe Turner was the first artist I saw at the Big Rec. This was before he had his hit record with “Shake, Rattle & Roll.” But he had popular songs everybody knew, like “Piney Town Blues.” Most of his grooves were boogie-woogie, and back then that was irresistible—you couldn’t help but bop your head. People that didn’t (or couldn’t) dance would just wiggle their first finger in time as he sang. The boss of the blues was light skinned like me and had a flawless slicked-back haircut. He looked to be three hundred pounds or more. Joe Turner wasn’t an outrageous performer; he just stood there and pointed every once in a while. But his voice was so commanding you couldn’t help but focus on him. And when he’d stretch both of his arms far out to his side, he looked like a giant T. Big Joe was so big—it was as if he was wrapping the entire room up in his outstretched arms. I immediately copied that move into my performances. I also tried to copy something else.

 

To get my hair like his.

 

Excerpted from I AIN’T STUDDIN’ YA: My American Blues Story by Bobby Rush, with Herb Powell. Copyright © 2021. Available from Hachette Books, an imprint of Hachette Book Group, Inc.

https://historynewsnetwork.org/article/180549
Keep History Teachers Free to Teach, in Iowa and the Nation

Iowa Governor Kim Reynolds has signed legislation restricting the teaching of racism in schools and colleges in the state. 

 

 

The Iowa legislature has just passed a new law on teaching about racism in the Iowa schools. It is long, vague, contradictory, and poorly drafted. It is clear, though, that it drastically restricts speech by students and teachers. Now that it is law, it is nonetheless unlikely to have much legal impact: it is almost certainly unconstitutional, and it includes no tough enforcement measures.

Yet it still matters. We are losing many of our best college graduates to places like Chicago, Minneapolis, and Texas. Our civic leaders try to win new businesses for the state, yet employers are unlikely to invest in a state that looks like Mississippi, only with cold weather.

Reading this new law felt like skimming the Terms of Service on a commercial website, or carefully reading the warranty on a new toaster oven.  It left me wondering how such a mess received the support of GOP leaders. 

Sadly, this law is purely political. It has arrived in the midst of a larger movement against Critical Race Theory (CRT), a doctrine taught almost solely in law schools. Yet it will damage faculty governance over teaching in universities, and, along with recent threats to academic tenure, make it difficult to recruit the best professors to our state. However, it meshes well with current Republican thinking in the age of Trump. Iowa has a long tradition of local control of education, but this law gives most power to the legislature. On a variety of issues, from hog lots to COVID-19 masking, the Iowa GOP believes in local control, except when it doesn’t.

The law seems to protect students against discrimination based on their “political ideology.” That sounds laudable, but it does not effectively define “ideology,” creating confusion. Other elements of the law seem equally problematic. For example, it prohibits teaching anything with the consequence that “any individual should feel discomfort, guilt, anguish, or any other form of psychological distress on account of that individual’s race or sex.” 

I’m not a lawyer, but it appears that the legislature has just passed a law against hurt feelings. My students are adults, and I love hearing their varied opinions. But when they learn about slavery, Indian removal, or the Vietnam War, it might be distressing. Feel-good history, which celebrates great men, might sound fine. Unfortunately, it obscures the fact that disagreement and dissent are crucial to our country’s past.

Teaching about race and anti-racism has been central for the field of history for more than a century. It has always been controversial because it has always been uncomfortable. And that won't change. American history contains stories of thoughtfulness and heroism. It also contains stories of brutality and hatred.

We have to tell the uncomfortable stories, the stories of Jim Crow, slavery, and race riots. Why? Because we can do better. The history of Reconstruction or the Civil Rights movement shows us that we HAVE done better.

We have a set of principles based on equality, derived from the Enlightenment. That is, we value freedom. It is precisely because we are committed to freedom that talking about unfairness and inequality can be so painful.

It can also be tremendously liberating; it offers the opportunity to change; to move closer to that American ideal of freedom. Let's try to get there. But we will always be uncomfortable. It can feel like pushing a boulder up a hill, only to see it roll back to the bottom. But that boulder won't stay at the bottom. That is because Americans want real history--the stories that tell about our best moments and also our worst.

Twenty-five years ago, I moved to Iowa in order to teach history. I had been told three things about the state: it was cold in the winter, flat, and had a great system of education.  Iowa winters are undeniably cold.  It is not flat, and my first walk up College Hill in Cedar Falls cured me of that mistaken impression. And it did have a great system of education. The University of Northern Iowa (UNI) may be a small school, but I love the students. They are willing to think hard about the past.  It is hard to define “Iowa nice,” but our students have it and I am grateful for that. Is UNI still part of a great system of education, known across the country, and stretching from kindergarten to college?  Maybe, but that tradition is hanging by a thread.  A key step in saving it would be studying history, but not the warm and fuzzy fantasies that the legislature has created.  There is a better way: let’s give teachers and professors, who have studied the field, a chance to do their jobs without stifling control from the legislature and governor in Des Moines. 

https://historynewsnetwork.org/article/180574
How to Change History

John Wilkes Booth escaping Ford’s Theater after shooting President Lincoln.

 

John Wilkes Booth was more than likely in the audience in Montgomery on the day Stephen Douglas spoke from the statehouse steps.  Booth would have cheered Douglas as he put before the citizens of Montgomery the case for remaining in the Union.  The actor had arrived in town a week earlier to make his debut as a leading man, in the title role of Richard III.

Booth believed that slavery was a blessing rather than a sin.  “I have been through the whole South,” he wrote in an unfinished speech, “and have marked the happiness of master & of man.” True, he had seen “the Black man whipped, but only when he deserved much more than he received.”  

 
John Wilkes Booth believed that slavery was a blessing but he opposed secession.

 

Nevertheless, Booth was strongly opposed to secession, believing that “the whole union is our country and no particular state.”  According to his manager, his public utterances on behalf of the Union “were so unguarded” as to put his life in jeopardy.  

A year earlier, Booth had attached himself to the Virginia Militia detachment sent to maintain order and stand guard at the scaffold where John Brown was hanged.  “I may say I helped to hang John Brown,” he later wrote proudly.  Although Booth despised Brown’s anti-slavery cause, he envied him his fame and heroic stature, calling him “the grandest character of this century.”   If John Brown’s execution hastened the freeing of the slaves by igniting the Civil War, it also very likely inspired John Wilkes Booth in another history-changing act five years later, the assassination of the President who signed the Emancipation Proclamation.  

 

See:

Terry Alford, Fortune's Fool: The Life of John Wilkes Booth

Arthur F. Loux, John Wilkes Booth: Day by Day

 

https://historynewsnetwork.org/blog/154507
50 Years Later, the Pentagon Papers Remain an Historic Landmark for Freedom of the Press

 

 

Democracy has seemed paper thin lately.

 

Politicians and pundits increasingly forget, ignore and distort First Amendment principles in hopes of scoring points with constituents.

 

This week, we celebrate the 50th anniversary of a key safeguard of freedom of the press, one which, like our democratic principles, survives as a hardened protection for publishers and remains unmoved by the unrelenting attacks on our First Amendment freedoms.

 

On June 30, 1971, the Supreme Court announced its decision in New York Times v. United States, also called the Pentagon Papers case. In its terse, one-page opinion, the Court concluded the government cannot censor or restrain the press. The First Amendment does not allow it.

 

Quoting from a previous decision, the Court concluded, “Any system of prior restraints of expression comes to this Court bearing a heavy presumption against its constitutional validity.”

 

That presumption against government censorship has weathered half a century of turmoil, from a Nebraska judge’s order barring journalists from covering a high-profile murder case to the WikiLeaks disclosures in 2010. The Pentagon Papers case affirms that the government cannot stop the presses – or keyboards.

 

The rushed, 275-word decision takes up only a single page in the Court’s records, merely a cocktail napkin compared to the justices’ 54-page decision in Citizens United v. Federal Election Commission in 2010. A printout of the entire case, including emotional concurring opinions by Justices Hugo Black and William O. Douglas, would take about 10 double-sided pieces of paper. Paper-thin, indeed.

 

Yet, at the half-century mark, the Pentagon Papers decision remains a solid wall against government censorship. It continues to represent a line that has not been crossed.

 

The outcome was far from certain in 1971. Daniel Ellsberg, a military analyst, leaked the classified Pentagon Papers, a history of the U.S. government’s involvement in the Vietnam War commissioned by the Department of Defense, to the New York Times’s Neil Sheehan in January 1971. The story of how Sheehan acquired the papers, which he kept secret until his death, rivals the plot of nearly any spy movie.

 

As the conflict continued in Vietnam, Sheehan and a growing number of Times reporters and editors read through the 7,000 pages of documents and deliberated over whether and how to report the information. The Times published its first stories based on the papers on June 13, reporting that the government had been lying to Americans about its involvement in Vietnam for decades.

 

Attorney General John Mitchell demanded the Times stop publication, but the newspaper refused. A federal judge temporarily barred the Times from publishing stories based on the stolen documents, but refused to compel the newspaper to turn the papers over to the government. By June 18, the Washington Post was publishing revelations from the classified papers, leading to a court injunction against it as well.

 

The Court agreed to hear the case on June 25. During the oral argument, it became clear that four justices strongly sided with the Times and Post, three sided with the government, and Justices Potter Stewart and Byron White were undecided. Justices William Brennan and Thurgood Marshall sought to have the case dismissed and the temporary restraints on publication vacated.

 

Justices heard the case on June 26 and went into conference that day. As they deliberated, they heard the government had obtained a restraining order against the St. Louis Post-Dispatch, halting it from publishing information from the Pentagon Papers. Justice Stewart concluded the government’s lawyers had failed to meet the burden needed to restrain the press. Justice White agreed, making it a 6-3 decision.

 

Justices took up the case and provided a written decision in a four-day period, a record pace compared to the years it often takes for a case to move through the court system to resolution by the Supreme Court. 

 

Black, in his final First Amendment opinion before his death in September 1971, lauded the press for realizing its purpose in a democracy by reporting vital information from the top-secret documents.

 

“Only a free and unrestrained press can effectively expose deception in government,” Black wrote. “And paramount among the responsibilities of a free press is the duty to prevent any part of the government from deceiving the people and sending them off to distant lands to die of foreign fevers and foreign shot and shell.”

 

The relatively short opinion has generally kept the government from halting publication for half a century. When the New York Times, as well as the Guardian and Der Spiegel, received stolen, classified government documents regarding U.S. wars in Iraq and Afghanistan from WikiLeaks in 2010, the government made no effort to stop the presses. News organizations spent days of coverage scrutinizing the often-embarrassing documents. Certainly, the government went after WikiLeaks founder Julian Assange, as well as the Army private who helped leak the documents, but no effort was made to halt publication.

 

Three years later, Edward Snowden, a subcontractor working for the NSA, leaked stolen documents about U.S. surveillance programs. Again, no effort was made to censor the news organizations.

 

The precedent stymied multiple attempts to halt publication during the Trump administration. A judge rejected Trump’s efforts to stop former national security adviser John Bolton’s book, The Room Where it Happened, from being published last June. Similarly, a New York judge rejected attempts to stop Mary Trump’s tell-all book, Too Much and Never Enough, from publication in July.

 

While democratic norms have, at times, seemed paper thin in recent years, the Court’s terse opinion in the Pentagon Papers case persists as a bulwark against government efforts to halt publication.

 

https://historynewsnetwork.org/article/180552
Liberty, Freedom, and Whiteness: Reviewing Tyler Stovall's "White Freedom"

 

 

 

White Freedom: The Racial History of an Idea by Tyler Stovall (Princeton, NJ: Princeton University Press, 2021)

 

 

Tyler Stovall, a professor of history and the Dean of the Graduate School of Arts and Sciences at Fordham University, is a European historian specializing in the history of France. In 2017, Stovall was President of the American Historical Association. In White Freedom: The Racial History of an Idea, Stovall defines “white freedom as the belief (and practice) that freedom is central to white identity, and that only white people can or should be free” (11). While the definition of white freedom does not necessitate it, Stovall believes that in practice white freedom meant the subjugation of other races, both to preserve and justify a system that privileged people of European ancestry. It is a history “replete with paradoxes” (18). The era of the European Enlightenment was also the apex of the trans-Atlantic slave trade (102).

 

Starting with the seventeenth century European Enlightenment, Stovall traces the way liberty (individual rights) and freedom (the absence of restraint) for whites was based on the exploitation of people of African ancestry in colonized territories and in metropolitan centers. Stovall, focusing on France and the United States, argues that “two seemingly opposite philosophies, liberty and racism, are in significant ways not opposites at all” (x). They are reinforcing concepts of the same social system. He cites Edmund Morgan’s study of colonial Virginia where Morgan argues “Racism made it possible for white Virginians to develop a devotion to the equality that English republicans had declared the soul of liberty” (13). Whites in Virginia learned to prize their shared freedom so highly because “they could see every day what it meant to live without it” (12).

 

Stovall’s book is especially timely given debate in the United States between supporters of the 1619 Project and 1776 Unites over the lingering impact of slavery and racism on contemporary American society. He opens the book by pinpointing what may be the ultimate irony in the history of the United States, that the American “Temple of Liberty,” the Capitol, was constructed, at least in part, by enslaved Africans. In a second irony, when Congress acknowledged this history in 2007, it renamed the building’s Great Hall as Emancipation Hall to recognize the work of the enslaved Africans. Stovall questions the choice of the new name because it was unlikely any of the Africans lived long enough to be emancipated. He calls the name a “mockery” that ignores their actual history and the way liberty has always been identified with and reserved for “whiteness” (5). Liberty and race, to Stovall, are at the core of “white identity” in the past and in the United States today, as racism has shaped the identity of those denied “whiteness.” People of color have historically had to struggle to overcome the identification of freedom with whiteness and to redefine it to include all humanity (7).

 

White Freedom is divided into three sections with two extended chapters in each. Part 1 is more philosophical and more narrowly focused than chronologically historical, with chapters on alternative ideas of freedom and on the Statue of Liberty as a symbol of freedom. Part 2 examines the relationship between race and freedom in the Enlightenment period and the 19th century, while Part 3 extends the discussion into the 20th century, up to the tearing down of the Berlin Wall in 1989 and the collapse of Soviet communism. Stovall concludes with a brief discussion of the post-Cold War world, but stops short of examining more recent rightwing “populist,” ethno-nationalist, and neo-fascist movements in the United States and other areas of the world. He does associate the post-9/11 war on terrorism (Islam) and President Trump’s effort to build a wall to separate the United States from Mexico with past racist ideology and practice. Unfortunately, readers of this book do not get to learn his views on the significance of the elections of Barack Obama and Kamala Harris or of the Black Lives Matter movement, although he has commented on them elsewhere.

 

In Chapter 3, Stovall’s recounting of the revolutionary generation’s condemnation of British tyranny, coupled with their own endorsement of African slavery, what he labels a “bizarre spectacle,” is especially well done. There was widespread panic, especially in the southern colonies, that the British would employ formerly enslaved Africans to crush the rebellion, both before and after the Royal Governor of Virginia, Lord Dunmore, issued a proclamation in November 1775 promising freedom to enslaved Africans who fought for the British (118). Colonial leaders, at least some of them, understood their hypocrisy. In my classes I reference a 1773 letter from Patrick Henry to Robert Pleasants, something not mentioned by Stovall, in which Henry wrote that he abhorred the “lamentable Evil” and could not justify it, yet was committed to its continuance:

 

Would any one believe that I am Master of Slaves of my own purchase! I am drawn along by ye. general inconvenience of living without them, I will not, I cannot justify it. However culpable my Conduct, I will so far pay my devoir to Virtue, as to own the excellence & rectitude of her Precepts, & to lament my want of conforming to them.

 

Because he is grounded in both American and French history, Stovall is able to give important recognition to the Haitian slave rebellion in Saint-Domingue, an actual struggle for freedom that struck fear in European colonial empires and American enslavers by challenging their limited whites-only version of freedom. Stovall mentions Alexis de Tocqueville on multiple occasions, but I was surprised he did not quote the passage from de Tocqueville’s 1835 Democracy in America where he commented on the “Situation of the Black Population in the United States, And Dangers with Which its Presence Threatens the Whites.” In a passage that supports Stovall’s thesis, de Tocqueville, with incredible prescience, wrote:

 

I do not believe that the white and black races will ever live in any country upon an equal footing . . . A despot who should subject the Americans and their former slaves to the same yoke might perhaps succeed in commingling their races; but as long as the American democracy remains at the head of affairs, no one will undertake so difficult a task; and it may be foreseen that the freer the white population of the United States becomes, the more isolated will it remain.

 

The United States today is a product of the failed effort at Reconstruction following the American Civil War. This was very much, as Stovall points out, the result of Northern ambivalence toward ending slavery and rejection of racial equality. In the end, for most white Northerners, freedom for the formerly enslaved just meant the end of slavery. With the Compromise of 1877, the federal government surrendered the future of freed men and women to former slaveholders who resumed control over Southern institutions. This was facilitated, in part, by the assimilation of new waves of European immigrants into whiteness, providing a low wage workforce for an expanding industrial giant, and an expanded white political majority that cemented de Tocqueville’s perception of a racially divided America in place.

 

Stovall has a very interesting take on the impact of World War I and II on America’s racial ideology. He sees both wars further racializing American society as the United States battled stereotyped German and Japanese racial enemies and used racist propaganda to mobilize the home front. After World War II, the United States emerged as an imperial power at war with East and Southeast Asians who again were racially stereotyped. Stovall has one chapter sub-head that aptly captures his view of these developments, especially the Treaty of Versailles rejection of nationalist aspirations in the non-white colonized world, “Making a World Safe for Whiteness” (204).

 

Stovall documents how white America responded to a Black sense of possibility after World War I with racial violence intended to suppress any possibility of change. At the same time, pseudo-scientific intellectuals championed a racial hierarchy that led to the exclusion from whiteness, at least temporarily, of Southern and Eastern European immigrants, largely Jews and Southern Italians. During the post-war period there was a rise in Klan activity in the United States and Nazi ideology in Europe that expressly mirrored United States treatment of African and Native Americans (219). World War II was portrayed as a war to promote freedom, but as Gandhi pointed out, it was a “hollow” declaration “so long as India, and for that matter Africa, are exploited by Great Britain, and America has the Negro problem in her own home” (228).

 

While there were important shifts in racial policy in the United States during World War II, Stovall believes that the overall picture was the persistence of discrimination, segregation, and violence aimed at Blacks who sought to improve their situation and argued for “Double V,” victory over fascism in Europe and racism at home. However, “the most egregious example of racial discrimination in America during World War II was the internment of Japanese Americans” (238). Stovall argues that the internment of Japanese Americans, rather than being an exception, represents an “extreme example of the over-arching theme of white freedom” in American history (239).

 

Stovall sub-titles his discussion of the post-World War II world “The Fall and Rise of White Freedom During the Cold War” (247). The weakening of British and French empires and their mobilization of colonized people to support the war effort, contributed to a post-war surge in independence movements and unprecedented challenges to white freedom across the globe and in the United States by the African American Civil Rights Movement. Stovall has an interesting take on the Cold War, one I accept but don’t entirely agree with. Instead of positing it as a conflict between capitalism and communism or democracy versus totalitarianism, as it is in standard narratives, Stovall argues “Western cold warriors were outraged by the denial of freedom and independence to the white nations of Eastern Europe, an outrage that certainly did not extend to the absence of self-determination for the [non-white] peoples of the colonial world” (252). The European colonial powers and the United States fought a series of anti-independence wars in Africa and Asia and the United States repeatedly intervened in Latin America and the Caribbean to suppress liberation movements.

 

Stovall argues that the history of the Civil Rights Movement in the United States has to be understood as intricately interwoven with the history of the Cold War and with the decolonization movements abroad; like them, it argued for the extension of fundamental rights, of freedom, to all people. “Freedom was the ultimate goal of the worldwide struggle against racism” (270). Yet despite gains in voting rights, educational opportunity, and access to public facilities, and new taboos on overtly racist language, the Civil Rights Movement was met, not by white acceptance, but by white resistance and a “new variant” of white freedom. Globally, newly independent nations remained dependent on and dominated by the former colonial powers and their international agencies and were ripped apart by wars rooted in ethnic conflicts exacerbated by colonial regimes. In the United States, a white backlash against school and housing desegregation efforts elected right-leaning governments. Legal limits were placed on affirmative action programs and school integration efforts. Social service budgets were cut at the same time that police funding to protect white communities was increased.

 

One of the things that makes the book most engaging and thought-provoking is Stovall’s ability to muse about a wide range of material, including literary, artistic, and cultural references. Chapter 1 begins with a long discourse on Peter Pan (1904) by J.M. Barrie. Stovall argues that the lesson of the original play, not the Broadway or Hollywood versions, is that “savage freedom inexorably gives way to white freedom,” which Barrie equates with middle-class Edwardian society, not the fantasy Neverland, which, after all, is a never land inhabited by racialized (“redskins”) and criminalized (pirates) others (24). Later Stovall compares Napster, the now illegal computer program that permitted users the “freedom” to share downloaded music files, with romanticized notions of pirates such as the one depicted in Robert Louis Stevenson’s Treasure Island (1883); they were the ultimate free men unbounded by the constraints of civilization (39). Chapter 3 starts with a discussion of Mozart’s opera The Magic Flute (1791) as a representation of the triumph of light over darkness, the victory of “brotherhood, enlightenment, and liberty” (99). However, Stovall argues there is also a racial dimension to the opera that portrays the dual victory of white Enlightenment Europe over monarchy and the savagery of a Moorish slave who attempted to rape the heroine and then allied with the Queen of Night (99-100). Chapter 4 begins with the debate over whether Heathcliff, the protagonist in Emily Brontë’s Wuthering Heights (1847), who is described with dark metaphors and as having a semi-savage nature, had at least partial African ancestry (134).

 

The battle over the meaning of the Statue of Liberty is one of the essential American paradoxes that Stovall highlights. The statue presents Liberty as a white lady on a pedestal who welcomes white European immigrants to New York harbor. But she is also the “white Goddess” in the poem “Unguarded Gates” (1895) by the nativist Thomas Bailey Aldrich. Aldrich chides the statue for failing to protect America from the “wild motley throng” whose “tiger passions” threaten to destroy white, Anglo-Saxon America (85-86). In class, I have students compare the images presented in Aldrich’s poem with Emma Lazarus’s “The New Colossus” (1883). They discover that the nativist version wins out as the United States passes increasingly restrictive immigration laws, culminating in 1924 with a virtual halt to immigration from Eastern and Southern Europe. Stovall notes that it was not until after World War II that Lazarus’s vision, and European immigration, became more broadly acceptable in the United States (90). He believes that even with this broader vision, the Statue of Liberty remains the “world’s greatest representation of white freedom” (95).

 

Stovall demonstrates the breadth of his scholarship in his discussions of childhood as an example of immature freedom and of adolescence as a period of greater maturity, marked by rebellion against restraint and a greater acceptance of norms. He is also fluent in the pseudo-scientific studies in which Europeans and white Americans tried to establish a scientific basis for defining race and justifying racial inequality and racism (108-111).

 

One point of disagreement I have with Stovall is his tendency to use liberty and freedom interchangeably. French and American Enlightenment documents, including the Declaration of Independence (1776) and the Declaration of the Rights of Man (1789), defend liberty or individual rights from oppressive government. The United States celebrates the Sons of Liberty, the Bill of Rights, and the Statue of Liberty. France has the revolutionary slogan “Liberté, Egalité, Fraternité,” as well as “Liberty Leading the People” (1830), the famous painting by Eugène Delacroix in which Miss Liberty leads the people, white people, against tyranny. Freedom for Jean-Jacques Rousseau (“Man is born free and everywhere he is in chains”), and especially for America’s founders, simply meant the opposite of enslavement. It was not until Abraham Lincoln’s Gettysburg Address (1863), with its call for a “new birth of freedom,” that the idea of freedom was irrevocably connected with democratic government (Foner, 1998). Stovall acknowledges that when “Enlightenment writers talked about the evils of slavery they did so only in a metaphorical rather than literal sense” and that they were primarily concerned with their felt oppression by the church and monarchical state (107). Leading European Enlightenment figures like John Locke, David Hume, and Voltaire profited from slavery and the slave trade (108). For them, their liberty was very different from the freedom of others.

 

Reading White Freedom gave me the chance to reconnect with someone I knew fifty years earlier when we were counselors at Camp Hurley, an interracial program in the Catskill Mountains of New York State that attempted to build bridges across America’s racial divide. I am white and Tyler is African American, something I was surprised he did not mention in the book, although it is clear from his cover photograph. In the preface he talks about his father, a World War II veteran who was stationed in Europe, but not about what the experience meant to an African American soldier in a segregated military. Stovall acknowledges the personal nature of the subject of this book, and I wish he had personalized it more with his own experience as a Black man and the father of a Black son.

 

Whether you agree with Tyler Stovall’s overall theme or his specific points, there is no question that this is an excellent work of comparative and integrative history that deserves a wide academic and general audience. Stovall believes the history of white freedom as documented in his book is “both a sobering tale and one full of hope, and if the past is a guide I consider myself justified in believing that hope will prevail in the future” (321). I also hope so.

Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180551 https://historynewsnetwork.org/article/180551 0
The Roundup Top Ten for June 18, 2021

As Immigration Politics Changed, So Did "In the Heights"

by A. K. Sandoval-Strausz

The film release of Lin-Manuel Miranda's "In the Heights" reflects the way the show has evolved in response to the shifting politics of immigration and nativism in the United States. 

 

In Vietnam, the Pentagon Papers Are History Written by the Defeated

by Lien-Hang Nguyen

A Vietnamese historian explains how the Pentagon Papers have become a foundation of domestic histories of war (both before and during US involvement) even as the Vietnamese government has declined to release its own official histories of the conflict. 

 

 

Our Insurance Dystopia

by Caley Horan

America's health insurance morass is a result of the replacement of the ideal of mutual, universal risk sharing with the privatization of risk in pursuit of profit. 

 

 

Inhumane System of Incarceration in U.S. Poses Special Danger to Women

by Jessica L. Adler

When politicians close single prisons after complaints of abuse, they leave untouched a cruel and dehumanizing system that poses particular risk to women. 

 

 

The Story of January 6 Will Be Told

by Julian Zelizer

Republican obstructionism makes it likely that the full story of the January 6 attack on the Capitol and the election certification will be told in popular culture. 

 

 

How Deep Is America’s Reckoning with Racism?

by Kerri Greenidge

"Juneteenth has gained recent popular attention after white Americans responded to last summer’s mass protest movement in the most American way possible—through token gestures of “historical reckoning” rather than actual atonement through, say, restoration of Section 4b of the 1965 Voting Rights Act."

 

 

History as End: 1619, 1776, and the Politics of the Past

by Matthew Karp

"Current American inequalities, many liberals insist, must be addressed through encounters with the past. Programs of reform or redistribution, no matter how ambitious, can hope to succeed only after the country undergoes a profound “reckoning”—to use the key word of the day—with centuries of racial oppression."

 

 

We've Been Telling the Alamo Story Wrong for Nearly 200 Years. Let's Correct the Record

by Bryan Burrough and Jason Stanford

"Imagine if the U.S. were to open interior Alaska for colonization and, for whatever reason, thousands of Canadian settlers poured in, establishing their own towns, hockey rinks and Tim Hortons stores."

 

 

To Find the History of African American Women, Look to Their Handiwork

by Tiya Miles

"To discover the past lives of those for whom the historical record is abysmally thin, I’ve found that we must expand the materials we use as sources of information."

 

 

America’s ‘Great Chief Justice’ Was an Unrepentant Slaveholder

by Paul Finkelman

John Marshall's previous biographers have glossed over the extent of his slaveholding and his enthusiasm for the institution. Reappraisal of his legacy is entirely appropriate in light of new discoveries. 

 

Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180571 https://historynewsnetwork.org/article/180571 0
Experts Beware: Is America Headed for a Scopes Moment over Critical Race Theory?

Clarence Darrow and William Jennings Bryan during the trial of John Scopes, Dayton, TN 1925.

 

 

In a recent debate over a law to ban the teaching of Critical Race Theory, Tennessee legislator Justin Lafferty (R) explained to his colleagues that the 3/5th Compromise of 1787, used to determine a state’s representation in Congress by counting enslaved people as “three fifths of all other Persons,” was designed with “the purpose of ending slavery.” Lafferty had his facts spectacularly wrong, but that did nothing to derail the law’s passage.

 

Anti-Critical Race Theory laws like the one passed in Tennessee – as well as in Texas, Iowa, Oklahoma, and Florida – are not just meant to push back against the heightened awareness of the nation’s history of racial injustice in the wake of the popularity of the 1619 Project and last summer’s massive protests over the murder of George Floyd. They are also attacks on educators – and on expertise itself. As Christine Emba explained in a recent Washington Post article on conservatives’ current obsession with Critical Race Theory, “disguising one’s discomfort with racial reconsideration as an intellectual critique is still allowed.” Not only is it allowed in these public debates, it is an effective strategy for curbing movements for social change. It is also not new.

 

A century ago a similar right-wing outrage campaign was launched against the teaching of evolution in public schools. The 1925 Scopes “Monkey Trial” remains a touchstone of this era of conservatism. When John Scopes, a substitute teacher in Dayton, Tennessee, was charged with violating a new state law against teaching evolution, the case became an international story. Scopes was found guilty and fined $100.

 

The Scopes Trial’s legacy rests perhaps too comfortably on defense lawyer Clarence Darrow’s skewering of the anti-evolution hero William Jennings Bryan in that hot Tennessee courtroom, memorialized in the play (and film) Inherit the Wind. Darrow’s withering questioning made Bryan appear ignorant and incurious. In response to Darrow’s questions about other religious and cultural traditions, Bryan acknowledged that he did not know about them, but added that he did not need to know since through his Christian faith, “I have all the information I need to live by and die by.” 

 

Bryan’s responses were more clever than the popular legend of the trial might lead us to believe. By asserting that he did not need to know what Darrow and the scientists knew, Bryan was calling into question the social value of modern expertise itself. When Darrow asked if Bryan knew how many people there were in Egypt three thousand years ago, or in China five thousand years ago, Bryan answered simply, “No.” Darrow pressed on, “Have you ever tried to find out?” Bryan: “No sir, you are the first man I ever heard of who was interested in it.” Translation: experts studied subjects that no one needed to care about. When asked if he knew how old the earth was, Bryan again responded that he did not, but added that he could “possibly come as near as the scientists do.” Here Bryan rejected the premise that the experts really knew what they were talking about any more than he – presenting himself to the court and the public as a simple man of faith – did.

 

The legacy of these tactics is on full display today. As David Theo Goldberg wrote in the Boston Review recently, Republican critics of Critical Race Theory “simply don’t know what they’re talking about.” Goldberg is correct of course, but their ignorance is not a hole they are looking to fill anytime soon. It is rather both a shield and a weapon used to go on the offensive against the experts themselves. What the experts “know” about the 3/5th Compromise or the history of racial injustice generally (or climate change, or the dangers posed by COVID-19, or the outcome of the 2020 election) threatens their beliefs in how American society should look and function.   

 

Similar to what we’re seeing today, the attack on the teaching of evolution in the 1920s was an effective means by which to challenge all manner of troubling developments that always seemed to emanate from the latest pronouncement of some expert somewhere. Mordecai Ham, for example, was a popular Baptist preacher under whose revival preaching Billy Graham was first converted. In a 1924 sermon he moved seamlessly from attacking evolution as false to warning parents that having Darwinism taught to their children would assuredly lead to communism and sexual promiscuity. He thundered, “you will be in the grip of the Red Terror and your children will be taught free love by that damnable theory evolution.” That Ham skipped effortlessly from the teaching of evolution to Bolshevism to free love makes sense only if one remembers that winning a debate over evolution was not the goal – condemning the modern-day teaching of evolution was. Evolution then served as the entry point to attack educators and expertise in general as existential threats to their way of life.

 

After Bryan’s death in 1925 sidelined the evolution debate, conservatives continued to connect expertise with unwelcome social change. When University of North Carolina (UNC) sociologists began to investigate the often-poor living conditions in nearby textile mill villages, David Clark, publisher of the Southern Textile Bulletin, the voice of the powerful textile industry, became irate. Clark was convinced that university sociologists were not “just” interested in research. In response he accused the school’s experts of promoting “dangerous tendencies” and “meddling” in the business community’s affairs. The university, he charged, “was never intended as a breeding place for socialism and communism.” UNC sociologists like Howard Odum, a fairly conservative but well-respected expert, were taken aback by Clark’s virulence. But, as with Bryan and Ham, linking expertise with radicalism was central to Clark’s strategy.

 

As Goldberg observed of today’s critics of Critical Race Theory, David Clark actually knew very little about sociology or socialism. This became clear when UNC invited him to campus in 1931 to make his case before the faculty and students themselves. During the question and answer period an exasperated audience member asked Clark if he knew what socialism actually was. He responded: “I don’t know, and I don’t think anybody else does.” A newspaper account recorded that “The audience fairly howled.”

 

Clark’s followers would not have been bothered by his concession on socialism -- and they would have not been surprised that the university audience laughed at him. Once again, the goal was not to win a debate over socialism; it was to stop social change they objected to. The experts represented a movement aimed at them, they believed -- a movement that also seemed to take delight in pointing out all that people like David Clark did not know. UNC, so proud of its accomplished faculty, was actually, in the conservatives’ view, a “breeding ground for reformers” and “radicals.”

 

The factual misstatements by today’s Republicans can seem breathtaking to those who value living in an evidence-based reality. These include historical errors like Representative Lafferty’s forehead-smacking account of the 3/5th Compromise or Congressman Madison Cawthorn’s (R-NC) reference to James Madison signing the Declaration of Independence. And there is the ongoing misrepresentation, and even outright denial, of current-day events that happened in plain sight – the January 6th insurrection, for example. But attempts by “the experts” to set the record straight will most likely be seen as more proof that the world is out to get them. For the rest of us, perhaps some comfort can be taken by remembering that facts, as John Adams once pointed out, “are stubborn things.”

Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180483 https://historynewsnetwork.org/article/180483 0
It's Time for a "Don't Trust, Do Verify" US-Russia Cybersecurity Treaty

Last month's Colonial Pipeline hack shows the urgency of US-Russia cybersecurity negotiations.

 

 

 

As he started his European tour this week, which culminates in a summit in Geneva with Vladimir Putin, President Joe Biden said he wants a stable, predictable relationship with Russia. Moscow has been echoing that sentiment. Although each side has a different understanding of what those qualifiers mean and expectations for the meeting are very low, the hacks on SolarWinds and Colonial Pipeline demonstrate that cyberspace is the most glaring threat to stability and predictability.

 

In November 1985, when Ronald Reagan and Soviet leader Mikhail Gorbachev had their first meeting in that Swiss city, expectations were also low. The Soviet downing of a Korean commercial airliner in 1983 and Reagan’s hot-mic joke in 1984 about outlawing and bombing the USSR clearly indicated just how tense relations were.

 

The upcoming Biden-Putin summit provides an opportunity to begin discussing a framework for an Internet version of the most significant U.S.-Russian cooperation to date: the Intermediate-Range Nuclear Forces (INF) Treaty, signed by Reagan and Gorbachev in 1987 as the product of work done at the Geneva and, later, Reykjavik summits.

 

Reagan adopted the Russian phrase “Trust but Verify” and developed respect for a Soviet leader whose ideology he loathed. Gorbachev vanquished internal foes to ensure successful treaty implementation. The result was thought to be impossible: military intelligence officers inspected the opposing countries’ missile storage and launch facilities.  Another thirty inspectors from each side took up residence at the gate of their former enemy’s most secret rocket-motor manufacturing facility.

 

The idea of on-site inspection had been discussed for years, but no one believed both sides could push the boundaries of sovereignty and counter-intelligence concerns to make it work. But INF did work. All 2,692 short- and intermediate-range nuclear missiles were destroyed and mutual trust was established. The treaty ushered in two decades of bilateral cooperation, including the Cooperative Threat Reduction program, which secured and eliminated strategic and chemical weapons across the former Soviet Union.

 

Critics, no doubt, will regard applying the arms control approach to cyber security as naïve, impractical, and even dangerous. But it’s worth remembering that big problems require bold solutions. And the incentive is clear: hackers threaten governments, the private sector and individuals, electric grids, transportation and energy facilities, defense installations and intellectual property. A tit-for-tat response to an attack may well escalate into armed conflict.

 

A cyber treaty is certain to be based on little trust, with lots to verify. Technological challenges, however, can be overcome. Both sides have extensive experience in monitoring public communications. From Solzhenitsyn’s days in a “sharashka” (scientific labor camp) developing decoding technology for Stalin, to the now ubiquitous SORM (a Russian abbreviation for “system of operative-investigative measures”) boxes attached by security services to the equipment of every telco and internet provider in the country, Russian officials know who is doing what to whom.

 

American systems are more poetically nicknamed: PRISM, MYSTIC, Carnivore, Boundless Informant. Government agencies conduct packet sniffing and people snooping—at home to benefit local law enforcement and abroad to spy on friends and enemies, counter ISIS and track monsters like Bin Laden.

 

What if each side allows the other to install such systems on the global Internet Exchange Points (IXPs) on their territory and let loose the algorithms and other tools necessary to identify botnets, hackers and disinformation campaigns?

 

A monitoring center staffed by experts from both countries could be established with anomalies and threats displayed in real time. The UN could supply neutral inspectors and arbitrate disputes.  The treaty should provide protocols for deterring and punishing bad actors.

 

As with INF, the devil will be in the details. Thousands of IXPs will have to be monitored. Though many Russians and Americans understand that their digital privacy has already been compromised, meta-anonymity could be maintained to protect individuals. 

 

A cyber treaty could also help both countries combat drug trafficking, terrorism and child pornography.

 

The advantages of a don’t trust, do verify “cyber-INF” seem clear. But do our leaders have the political will to go forward? Without in any way minimizing the obstacles, we believe there are reasons for cautious optimism.  In 2015, Russia and China agreed not to conduct cyberattacks against each other that would “disturb public order” or “interfere with the internal affairs of the state.”   In September 2020, President Putin proposed a cyber agreement with the United States. President Biden seems cautiously open to seeking out commonalities, without, of course, the unrequited bromance his predecessor had with Putin.

 

There’s no time to lose. A digital iron curtain is descending. Russia continues to turn the screws on internet freedom and is examining ways to isolate itself from the WWW, while pressing foreign content providers to submit to local rules about appropriate content and come on shore with their customer data – or face fines, restrictions and eventual blocking.

 

Should our leaders find the courage to create a monitorable digital peace, perhaps they’ll be willing to turn their attention to the other urgent problems of the 21st century – climate change, terrorism, inequality, pandemics and unchecked artificial intelligence.

 

The aphorism Robert Kennedy “borrowed” from George Bernard Shaw seems appropriate for addressing the prospects of a substantive cyber treaty: “Some men see things as they are and ask, ‘Why?’ I dream things that never were and ask, ‘Why not?’”

Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180515 https://historynewsnetwork.org/article/180515 0
The Night Vietnam Veterans Stormed Bunker Hill

Vietnam Veterans Against the War (VVAW) members vote to remain on Lexington Green in defiance of an order by local government to vacate, May 30, 1971. VVAW members were subjected to a mass arrest, but gained support from town residents, who gave them rides to the Bunker Hill monument in Charlestown to continue the group's march from Concord to Boston. Photo: Richard Robbat.

 

 

Citing continuing public health concerns about COVID-19, the city of Boston has declined again this year to issue a permit allowing the annual and always much-anticipated Bunker Hill Day Parade to proceed through the streets of Charlestown. 

The hiatus is an opportunity to recall the holiday’s history and the summer fifty years ago when the country was as politically divided as it is today, until Vietnam Veterans Against the War, or VVAW, insisted on celebrating Bunker Hill Day early.

Bunker Hill Day was initially intended to commemorate the role Massachusetts played in securing the nation’s independence. Fought on June 17, 1775, the Battle of Bunker Hill was a Pyrrhic victory for the imperial British. The newly formed Continental Army was forced to retreat, but not before inflicting enough damage that British forces were confined to Boston. Famously fought on nearby Breed’s Hill, the battle’s anniversary was first observed with a parade in 1785. On the fiftieth anniversary, the newly formed Bunker Hill Monument Association organized the first Bunker Hill Day. While very much a local holiday then as now, the entire nation observed it in 1843, when the Association’s soaring 221-foot granite obelisk was dedicated.

After Irish immigrants moved into the neighborhood in the final quarter of the nineteenth century, Charlestown became “the only place on the planet,” as famously noted by actor Will Rogers, “where the Irish celebrate a British military victory.” A cartoonist from that era was prompted to draw a picture of the obelisk with the words “Erected by the Irish in Memory of Patrick O’Bunker of Cork.” Observances came to include companies of reenactors marching in colonial attire to the cadence of fife and drum, as well as elements of Irish peasant culture, including carnivals, fireworks, and alcohol. As journalist J. Anthony Lukas put it, Bunker Hill Day became “an exuberant statement of Charlestown's independence from the rest of the world.”

The late 1960s and early seventies were difficult years for Charlestown. To the dismay of many white parents, the Massachusetts legislature was insisting on school desegregation. And, as a working-class neighborhood, Charlestown was sending a disproportionate number of its children to fight in Southeast Asia. The Charlestown community engaged in activism on behalf of anti-busing efforts, sometimes resorting to violence; however, few residents joined what became the most vocal and sustained antiwar movement in US history, out of fear of hurting troop morale.

No one could predict how this community, very much on edge in the spring of 1971, would respond on the Sunday evening of Memorial Day Weekend when not the British, but a wave of American Vietnam veterans swept up Breed’s Hill towards the obelisk the Irish-Americans in Charlestown had made their own.

Forty-eight hours earlier, over one hundred members of VVAW dressed in jungle fatigues had commenced a three-day march that was intended to retrace Paul Revere’s mythic midnight ride in reverse.  Like Revere, the antiwar veterans were seeking to bring a message to the people, in their case that the country had shamefully reversed its earlier course and become the type of imperial aggressor the colonists had once fought to vanquish.  The march route passed through four Revolutionary War battlefields where the veterans planned to demonstrate their patriotic respect for their colonial brothers-in-arms while illustrating with their physical wounds and anguished spirits how far the nation had fallen from its founding ideals. 

VVAW’s march kicked off without incident in Concord, where officials from the National Park Service had granted the veterans permission to camp next to the Old North Bridge, and townspeople served the veterans a hearty dinner.  In marked contrast, the Lexington Selectmen (the Massachusetts equivalent of a town council) refused to grant the veterans permission to camp the second night of the march on the town’s sacred Battle Green.  Intent on punishing the veterans for what he later described as deflating the spirits of those troops still in harm’s way, the Chairman of the Board of Selectmen ordered a mass arrest.

When the veterans were released from the town’s makeshift jail and had paid their fine in county court, they considered skipping Bunker Hill, the final Revolutionary War battlefield on their itinerary.  The mass arrest had taken up a lot of time and they were now at risk of arriving late to the Memorial Day antiwar rally on Boston Common to which they had invited the public. 

Of greater concern was the fact that Charlestown might not be as welcoming as the liberal elites of Lexington, many of whom had decided to get arrested with the veterans and who would later ensure the Chairman was not re-elected. 

Over a dinner prepared for them by one of Lexington’s congregations, the veterans conferred about what to do.  Buoyed by the national media’s sympathetic coverage of the mass arrest, a wounded veteran living at the Bedford VA hospital urged the veterans onwards.

“We’ve already begun the Battle of Lexington,” he enthused about VVAW’s success thus far in unleashing the energy that birthed the nation.  “The whole country knows it.  So let’s go on to Bunker Hill.”

The problem of lost time was solved by hitching rides to Charlestown from their Lexington supporters.  Disembarking in Sullivan Square so they could respectfully approach the Bunker Hill battlefield on foot as the descendants of those who fought and died there, some of the veterans later recalled feeling very worried about how they would be received.

“Was it gonna be food and acceptance or sticks and stones?” one wondered.

As the veterans started uphill toting the very real-looking toy M16s they had carried from Concord as a sign of their authority to speak about the war, windows in the tenement buildings lining the narrow streets flew open and cheers erupted. The veterans had served alongside Charlestown’s own sons and were being honored as such. Minutes later, when the veterans set foot on the hallowed ground where so many Americans had died so that their children could be free, Charlestown’s residents bore silent witness as the veterans ceremoniously rejected their weaponry in a message that the war must end.

“We love you and we are happy to be here with you,” one of the still stunned veterans exclaimed to these new supporters who hours before VVAW considered avoiding.  “We must begin to share with one another the peace we need right now.”

The next morning, when the veterans emerged from their tents, countless residents returned to their side, offering food and coffee to fuel the veterans’ final push for Boston.

Fifty years ago this summer, Bunker Hill Day was celebrated early by Charlestown’s residents and a new breed of antiwar activists who came together around the idea that the Vietnam War did not reflect the values for which the colonists gave their lives on Breed’s Hill nor those of the Bunker Hill Irish-American community whose children were being forced to fight it.  It was a victory for VVAW and the antiwar movement as great as the ones traditionally celebrated on Bunker Hill Day.

Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180485 https://historynewsnetwork.org/article/180485 0
Valor Roll: American Newsies in the Great War and the Flu Pandemic

"Scotty and His  Beloved Sho-Sho Gun", Gayle Porter Hoskins. Appeared in Ladies Home Journal, June 1919. Image Delaware Art Museum

Newsboy Albert Edward Scott of Brookline, MA enlisted at age 15 in 1917. The scene depicts his death by a sniper's bullet after defending a road as a machine gunner. "Scotty" was the youngest American casualty of the war.

 

 

World War I presented new opportunities to honor newsboys, particularly those who joined the armed forces. The Boston Globe made a minor celebrity of Fifekey Bernstein, the first Boston newsboy to enlist in the war. The Chicago Tribune placed former Loop news crier Joe Bagnuola on its “valor roll” after he distinguished himself as a battlefield messenger. And the Hartford Courant lauded the fighting spirit of Nat Fierberg, who joined the army to avenge the death of his brother Sam, a former Main Street hawker. Sam had enlisted at age 14 and died at Seicheprey, the first major action involving US ground troops. The Courant commended the boys who joined up “to make news instead of sell it.”

 

Newspapers also applauded newsboys who demanded to see the draft cards of suspected “slackers,” who taunted those who drove on gasless days as “Hun lovers,” or, in the case of a 10-year-old in New York, who gut-punched a suspected German spy as he was being led through Penn Station under armed guard.

 

Newsboys were anything but slackers. Newsboys’ homes supplied many raw recruits. Father Dunne’s home in St. Louis sent 126 residents into the armed forces, five of whom were killed in action. The Brace Memorial Home in New York contributed 2,890 current or past residents to the military. Its superintendent signed enlistment papers for 1,600 boys. Fifteen were killed in action and twenty were wounded. The first to fall, at Château-Thierry, was George “Blackie” Kammers. Others included “Libby” Labenthal, a pitcher on the home’s baseball team, and Peter Cawley and Jackie Levine, who starred in the home’s minstrel shows. Their inch-long obituaries mention their affiliation with the Newsboys’ Lodging House, just as those of Ivy Leaguers mention their association with Harvard, Princeton, or Dartmouth.

 

Eighteen was the minimum age for induction into the army, yet boys like Sam Fierberg sometimes lied their way into service. The foremost example is Albert Edward Scott, a newsboy from Brookline, Massachusetts, who enlisted in July 1917 at age 15 and became a machine gunner in the 101st Regiment. The youngest American casualty of the war, “Scotty” died defending a road near Epieds, killing thirty “boches” before a sniper got him. He received a hero’s burial in France and posthumous honors at home. An oil painting by Gayle Porter Hoskins showing four soldiers gathered around Scotty’s body in the woods was one of the Ladies’ Home Journal’s “souvenir pictures of the Great War.” The Roosevelt Newsboys’ Association raised funds to install a bronze tablet and bas-relief sculpture of the painting in Brookline’s town hall. Vice President Calvin Coolidge ordered two navy destroyers to convey a three-hundred-piece newsboy band from New York for the dedication ceremony, attended by former secretary of state William Jennings Bryan. Scotty was eulogized as a “steady, self-reliant, manly American boy” who “did his duty in war and in peace, in France and in Brookline.” On a visit to Boston, Marshal Ferdinand Foch, supreme commander of the Allied armies, left a wreath of roses bearing the French tricolor to be placed on Scotty’s tablet. The next year the corner of Chambers and Spring Streets in Boston was renamed Benjamin Rutstein Square after a popular West End newsboy killed in the Argonne in 1918. Thus did newsboys participate in the culture of commemoration that followed the war. Plaques, parades, paintings, wreath layings, and street dedications helped give meaning to the slaughter and replenish the wellsprings of nationalism.

 

Movies and songs about newsboys proliferated during the period, and some directly addressed the theme of making the world safe for democracy. The silent film Ginger, the Bride of Chateau Thierry, follows two tenement sweethearts who are separated when Ginger is adopted by a judge. She befriends his son Bobby but stays true to newsboy Tim Mooney. Once grown up, the two men vie for her hand, but they call a truce when war is declared; they ship out to France, and Ginger follows with the Red Cross. When Tim is wounded, Bobby risks his life to carry him to a hospital where he dies in Ginger’s arms, freeing her to marry Bobby. The movie, which includes actual scenes of trench combat, portrays the war as a great crusade unifying the classes. Striking the same note musically was the 1919 Tin Pan Alley flag-waver “I’d Rather Be a Newsboy in the USA than a Ruler in a Foreign Land.” One critic called it “pathetic patriotic piffle,” but it got a smile from AEF commander Pershing when sung for him by a wounded Yank at Walter Reed Hospital.

 

One of the most devastating effects of the war was the influenza pandemic of 1918, which killed 650,000 Americans and 50 million people worldwide. The scourge pushed war news off the front pages and took its toll on many of the children and elderly who sold those papers. One casualty, “Mullen the newsboy,” was a Chicago Loop vendor who gained notoriety after passage of the Seventeenth Amendment by vowing to run for the Senate and introduce legislation allowing indigent newsboys to live in old-soldiers’ homes. Chicago newsies came under scrutiny as potential carriers of the disease because of their habit of spitting for good luck on every nickel earned. Newsies in Norwood, Massachusetts, had to put their money on a table for the manager to spray with a disinfectant before he’d touch it. In Harrisonburg, Virginia, the Daily Independent suspended publication after its entire workforce fell ill. Newspapers from Pueblo, Colorado, to Winnipeg, Manitoba, outfitted carriers with gauze masks to protect their health and alleviate subscribers’ fears. The Wichita Eagle went further, offering its newsboys mittens, health insurance, and the services of a physician.

Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180484 https://historynewsnetwork.org/article/180484 0
A Celebrity Apology and the Reality of Taiwan

Students in Taipei protest a trade agreement with the People's Republic of China in 2014.

Photo Max Lin, CC BY-SA 2.0

John Cena made international headlines this week while promoting his new movie, The Fast and the Furious 9. The professional wrestler turned action star referred to Taiwan as “the first country” where people would be able to see the film. Chinese citizens were outraged, and Cena quickly issued a video apology, spoken in Chinese, for the “mistake.” The apology provoked another round of criticism: the reaction on social media and cable news was unforgiving, calling Cena everything from gutless to disgusting. I do not intend to join the pile-on. Cena has had enough punishment for one news cycle.

 

However, underneath the scathing takes lies an inconvenient truth: most Americans could not find Taiwan on a map (the island lies off China’s southeastern coast, between Japan and the Philippines), much less trace the origins of Taiwanese identity. Cena’s foray into international affairs provides an opportunity to examine entrenched misunderstandings about the history of Taiwan at a time when the U.S. and the world are paying attention.

 

The oft-repeated dictum that “Taiwan is an integral part of China’s historic territory” was not widely held within China in 1895, the year the Qing Dynasty ceded Taiwan to Japan. The Chinese government did not begin to assert control over much of the island until the 1870s, and in 1895 officials expressed less interest in protecting Taiwan than other territories demanded by Japan. They did want to keep Taiwan out of Japanese hands, but they suggested to British and French diplomats that those countries could annex it instead. Within Taiwan itself, when given the option of remaining to live under Japanese colonialism or moving to China, fewer than 10,000 of Taiwan’s roughly 2.5 million inhabitants chose to make the journey across the Taiwan Strait. None of this indicates that Taiwan rose to the level of integral territory before the twentieth century.

 

During the 50 years of Japanese rule, the majority of those residents and their descendants came to think of themselves as Taiwanese, albeit in ways that reinforced divisions between indigenous and non-indigenous groups. Violent and non-violent resistance to the Japanese colonial regime remained a feature of Taiwan’s history, but it was couched in terms of preventing either encroachment into indigenous lands or the eradication of social and religious practices, and rarely if ever in the language of reunification with China. Taiwanese remained interested in China, of course, but as an ancestral homeland or a site for lucrative business activities. Instead, they developed new identities as Taiwanese and displayed them in calls for independence from Japan, drives for voting rights within Japan and for an autonomous legislature for Taiwan, and a wide range of social and cultural behaviors, from social work to religious festivals. All of these behaviors clearly distinguished them from the Japanese settlers and the colonial government that attempted to transform them into loyal Japanese subjects. They became, in short, Taiwanese.

 

That they had not remained Chinese—at least not as people in China defined that term during the early twentieth century—became very clear to everyone on the scene soon after the end of World War II. Members of the Nationalist Party and the government of the Republic of China (ROC), and Chinese popular opinion, had begun to speak of Taiwan as a part of China during the 1930s and 1940s, in the context of anti-Japanese sentiment and war. However, government officials and many Chinese settlers looked upon the Taiwanese as backwards people who had been tainted by Japanese influence. Those Taiwanese viewed themselves as having resisted Japanese assimilation and having built their identities in burgeoning modern metropolises and in relation to modern capitalist industries. Even though many Taiwanese began to study the new national language of Chinese, as they had Japanese, they felt no connection to the national struggles and heroes that they were told to embrace.

 

All of this was evident before 1947, when the separation between Taiwanese and Chinese came into high relief during the 2-28 Uprising, its brutal suppression by Nationalist Chinese military forces, and the White Terror that began soon thereafter. Decades of single-party rule under martial law by Chiang Kai-shek’s regime did not effectively instill in most of Taiwan’s residents a new sense of Chinese national identity. The ROC nevertheless perpetuated Taiwan’s political separation from China, a condition that has held for almost all of the past 126 years, and Chinese insistence on the idea of Taiwan as a part of China has failed to convince the roughly twenty-three million Taiwanese. Chinese views have been more effective in shaping international opinion, but they do not change Taiwan’s modern history or the reality that Taiwan is a country.

 

To close, the controversy surrounding Mr. Cena’s apology highlights two things: the power of ideas—in this case, the idea of Taiwan as a part of China—and the geopolitical and economic power of countries like China to shape opinions and actions both domestically and around the world.  People, companies, and countries should make their own decisions about what accommodations they are willing to make to do business with China and its citizens. But they should do so with an understanding of the history that lies behind and challenges such ideas.

https://historynewsnetwork.org/article/180486
American Conference for Irish Studies Connects the Past, Future of Irish-American Relations

Republic of Ireland President Michael Higgins addresses the ACIS virtual meeting.

U.S. and Irish leaders have suggested the historic bonds between the two countries must evolve to meet modern challenges. This includes Ireland’s role in post-Brexit relations with the European Union, U.S. support for maintaining peace in Northern Ireland, and other robust alliances to protect democracy against rising tides of illiberalism and authoritarianism.

 

Irish President Michael D. Higgins, Irish Ambassador to the United States Dan Mulhall, and U.S. Rep. Brendan Boyle (D-Pa.) made separate remarks at the 2021 American Conference for Irish Studies (ACIS). The annual conference was based at Ulster University’s Magee campus in Derry, Northern Ireland, but held virtually June 2-5 due to COVID-19 restrictions.

 

Mulhall suggested that, in addition to their usual studies of history and literature, scholars explore how globalism has shaped modern Ireland since the start of the 21st century. Of particular interest, he said, is how the country--so far--has managed to resist the anti-immigrant populism and other right-wing ideology that has taken root elsewhere.

 

“The challenges stem from societal changes, especially in Ireland,” Mulhall said. Not only must Ireland explain how it has changed, especially from outdated and clichéd archetypes, he said, but it must also expand its outreach to new generations of the more than 33 million Americans who claim Irish ancestry, including those who are Black, Hispanic, or LGBTI.

 

From either side of the Atlantic, Mulhall said, “it is unreasonable for us to expect a monolithic outlook.”

 

Ireland plans to open its eighth consulate office next year in Miami, a more extensive U.S. presence than that of many larger nations. Ireland is now the ninth-largest source of foreign direct investment in the U.S., with Irish firms employing over 110,000 American workers. More than 750 U.S. multinationals have made Ireland their base for European operations.

 

“In the wake of Brexit, the relationship grows in importance to the U.S.,” said Boyle, who has represented a Philadelphia district since 2015. “Ireland is the only English-speaking country around the E.U. table.”

 

Boyle joined the 2019 congressional delegation led by the U.S. House Speaker Nancy Pelosi (D-Calif.) to the Republic of Ireland, part of the E.U., and Northern Ireland, which remains tied to Britain. In his ACIS remarks, Boyle reiterated that U.S. President Joe Biden and members of Congress from both parties will not accept a “hard border” on the island of Ireland, especially if it threatens peace in the north.  

 

More importantly, he warned of the danger of allowing segments of any population to slip into economic isolation and social resentment, which can “manifest in unhealthy ways.” He suggested this has happened more in America than in Ireland.

 

“Ireland most shares American values,” Boyle said. “With the world in a democracy recession, we need Ireland to speak up for these values.”

 

Higgins praised Biden’s inaugural speech and its “offer of a moral reawakening on our global responsibilities, including how we respond to COVID-19 and climate change, global conflicts gone on too long.”

 

Returning to the traditional academic ground of ACIS, Higgins said American scholarship on Irish history, especially of the late 19th and early 20th centuries, is “a debt never to be forgotten.”

 

ACIS was founded in 1960 and has about 800 members in the U.S., Ireland, Canada, and other countries. More than three dozen U.S. colleges and universities offer Irish studies programs, and about 12,000 American students visit Ireland annually under non-pandemic conditions.

https://historynewsnetwork.org/article/180513
The Roundup Top Ten for June 11, 2021

The Fog of History Wars

by David W. Blight

Nations have histories, and someone must write and teach them, but the 1990s battle over the National Standards for History remains a warning to all who try: setting a history curriculum is politics by other means, and the right has always been willing to fight over it.

 

Conspiracies in the Classroom

by Elizabeth Stice

"Colleges and universities and faculty have a responsibility to ground their disciplines in truth claims that go deeper than the rabbit holes of the internet and to graduate students who are capable of distinguishing between conspiracy and reality."

 

 

The Push For LGBTQ Equality Began Long Before Stonewall

by Aaron S. Lecklider

Pride month is based on an origin story of the LGBTQ liberation movement that starts with Stonewall. There is a longer history of queer political activism that has been erased because of its origins in the left.

 

 

The Fissure Between Republicans and Business is Less Surprising than it Seems

by Jennifer Delton

Friction between the Trump-led Republican Party and big business organizations like the Chamber of Commerce over supposed "woke capitalism" isn't a new story. Big business's partisan allegiances have shifted according to capital's interests for decades. 

 

 

A Supreme Court Case Poses a Threat to L.G.B.T.Q. Foster Kids

by Stephen Vider and David S. Byers

State and local social service agencies for decades have been actively working to protect the safety and dignity of queer youth in the foster care system. A Supreme Court case threatens that progress in the name of "religious freedom." 

 

 

Protesters in Elizabeth City, N.C. are Walking in the Footsteps of Centuries of Fighters for Black Rights

by Melissa N. Stuckey

A historian living and working at the site of Andrew Brown Jr.'s killing by police explains that local protesters are following generations of freedom seekers. 

 

 

It’s Time for an Overhaul of Academic Freedom

by Emily J. Levine

The idea of academic freedom doesn't account for the present precarity of most university teachers, and doesn't rest on a positive concept of what professors should do with students and the public. 

 

 

The Problem with a U.S.-Centric Understanding of Pride and LGBTQ Rights

by Samuel Huneke

The histories of gay liberation politics in divided Germany offer surprising insight into what it means for LGBTQ people to live freely in a society. 

 

 

‘Lady of Guadalupe’ Avoids Tough Truths About the Catholic Church and Indigenous Genocide

by Rebecca Janzen

"Although it portrays the story of the Virgin of Guadalupe for a broad audience, ultimately this film sanitizes the real-life brutality of the Church toward Indigenous peoples in the 16th century."

 

 

The Last Time There Was a Craze About UFOs and Aliens

by Daniel N. Gullotta

A recent resurgence of interest in UFOs in respectable public discourse recalls the 1990s, when the X Files reflected a similar moment of distrust in authority and conspiratorial thinking. 

 

https://historynewsnetwork.org/article/180512
America's First Peaceful (Just Barely!) Transfer of Power

 

 

On July 14, 1798—nine years to the day after the storming of the Bastille—President John Adams signed an American Sedition Act into law. The 1789 Parisian incident had set in motion events that ultimately toppled and killed King Louis XVI; his queen, Marie Antoinette; and their heir to the throne, the dauphin. Adams’s signature likewise led to his own ouster, but the president; his lady, Abigail; and their heir, John Quincy, got to keep their heads in the transition and thereafter. On two telling dimensions—orderliness of regime change and avoidance of bloodshed—Federalist-era America showed itself vastly superior to Revolutionary France. But the events of 1798-1801—America’s first peaceful transfer of power from one presidential party to another—were in fact far more fraught than is generally understood today and in myriad respects cast an eerie light on the not entirely peaceful transfer of presidential power in 2020-21.   

UNDER THE TERMS OF THE Sedition Act, anyone who dared to criticize the federal government, the president, or Congress risked a fine of up to $2,000 and a prison term of up to two years. But venomous criticism, even if knowingly false and violence-inciting, that targeted the vice president was fair game under the law. Thus, in the impending 1800 electoral contest between Adams and his main rival, Thomas Jefferson—who was also Adams’s sitting vice president—Adams and his Federalist Party allies could malign Jefferson, but Jefferson and his allies, the Democratic Republicans, could not reciprocate with equal vigor. Congressional aspirants attacking Congressional incumbents would need to watch their words, but not vice versa. Just in case the Democratic Republicans managed to win the next election, the act provided that it would poof into thin air on March 3, 1801, a day before the new presidential term would begin.

 

On its surface, the act seemed modest. It criminalized only “false, scandalous, and malicious” writings or utterances that had the “intent to defame” or comparable acidic motivation. The defendant could introduce into evidence “the truth of the matter contained in the publication charged as a libel.”

 

This was more generous than libel law at the time in Britain, where truth was no defense. Indeed, truth could actually compound a British publisher’s liability. “The greater the truth, the greater the libel,” because the libelee would suffer a greater reputational fall if the unflattering story was, in fact, true. British law was thus all about protecting His Majesty and His Lordship and His Worshipfulness from criticism; it was the product of a residually monarchial, aristocratic, and deeply deferential legal and social order. British freedom of the press meant only that the press would not be licensed or censored prepublication. Anyone could freely run a printing press, but printers might face severe punishment after the fact if they used their presses to disparage the powerful.

 

Back in the 1780s, Jefferson had urged James Madison and other allies to fashion a federal Bill of Rights that would go beyond English law—but not by miles. As Jefferson envisioned what would ultimately become America’s First Amendment, “a declaration that the federal government will never restrain the presses from printing any thing they please, will not take away the liability of the printers for false facts printed.” Jefferson evidently could live with publisher liability for “false facts printed.” But what if the falsehood was a good-faith mistake, or a rhetorical overstatement in a vigorous political give-and-take? Could an honest mistake or mere exuberance ever justify serious criminal liability and extended imprisonment?

 

Also, who would bear the burden of proof? The Sedition Act purported to criminalize only “false” statements, but in the 1790s many derogatory comments were legally presumed false. The Sedition Act said that a defendant could “give in evidence in his defence, the truth of the matter,” but many edgy statements mixed truth with opinion and rhetoric. If a critic wrote that John Adams was a vain and pompous ass who did not deserve a second term, how exactly could the critic establish the courtroom “truth of the matter”?

 

ADAMS ERRED NOT SIMPLY in signing the Sedition Act but in mindlessly and mercilessly prosecuting and punishing, and never pardoning, men under it. He and his minions hounded tart but peaceful speakers and printers whose only real crime was dislike of John Adams, his party, and his policies, in cases whose facts were miles apart from treason, riot, or mayhem. Indeed, under the ridiculously strict standards of his own administration, a young John Adams himself should have been fined and imprisoned back in the 1760s and 1770s for his vigorous denunciations of colonial Massachusetts royal Governor Thomas Hutchinson.

 

In the first high-profile sedition case, brought in October 1798, the Adams administration targeted a sitting Democratic Republican congressman from Vermont, Matthew Lyon, for political writings and harangues, some of them at campaign rallies. In one passage highlighted by the prosecution, Lyon had written that Adams had “swallowed up” every proper “consideration of the public welfare” in “a continual grasp for power, in an unbounded thirst for ridiculous pomp, foolish adulation, or selfish avarice.” Adams, wrote Lyon, had “turned out of office . . . men of real merit [and] independency” in favor of “men of meanness.” Lyon had also read at public meetings a communication from a French diplomat bemoaning the “extremely alarming” state of relations between France and the United States, worsened by the “bullying speech of your president and stupid answer of your senate.” Congress, wrote the diplomat in words that Lyon publicly repeated, should send Adams “to a mad house.”

 

How exactly could Lyon prove in a courtroom the technical truth of these words, blending as they did fact, opinion, analysis, interpretation, and rhetoric? The jury convicted and the court sentenced Lyon to a fine of $1,000 and a four-month imprisonment.

 

Dozens of newspapers across the continent brought readers detailed reports of the cause célèbre. While in prison, Lyon wrote an account of his travails that Philadelphia’s Aurora General Advertiser published in early November, followed by newspapers in many other localities. The congressman vividly described his conditions of confinement: “I [am] locked up in [a] room . . . about 16 feet long by 12 feet wide, with a necessary in one corner, which affords a stench about equal to the Philadelphia docks, in the month of August. The cell is the common receptacle for horse-thieves, money makers [counterfeiters], runaway negroes, or any kind of felons.” When Lyon stood for reelection—from prison!—in December, his constituents gave him a roaring vote of confidence, returning him to his House seat. Adams thus won the first courtroom battle but was beginning to lose the war of public opinion.

A year and a half later, the last big Sedition Act trial before the election of 1800 resulted in an even harsher sentence—nine months’ imprisonment. The defendant was the trashy but talented journalist James Callender—the man who broke the Alexander Hamilton sex-scandal story in 1797 and would later, in 1802, expose Jefferson’s affair with his slave mistress Sally Hemings (who was also his deceased wife’s half sister). In the run-up to the election of 1800, Callender published a campaign pamphlet, The Prospect Before Us.

 

Callender painted in bright colors and attacked Adams for just about everything: “Take your choice, then, between Adams, war and beggary, and Jefferson, peace and competency!” The “reign of Mr. Adams has been one continued tempest of malignant passions. As president, he has never opened his lips, or lifted his pen without threatening and scolding.” The administration’s “corruption” was “notorious.” Indeed, the president had appointed his own son-in-law, William Stevens Smith, to a plum federal office, surveyor of the port of New York, thus “heap[ing] . . . myriads of dollars upon . . . a paper jobber, who, next to Hamilton and himself is, perhaps, the most detested character on the continent.”

 

Notably, Callender also blasted the Sedition Act itself, and Adams’s abuse of it: “The grand object of his administration has been . . . to calumniate and destroy every man who differs from his opinions.” The “simple act of writing a censure of government incurs the penalties, although the manuscript shall only be found locked up in your own desk,” noted Callender. Here, the Sedition Act did indeed approximate mind control, yet Adams apparently never shuddered to think about his own diary diatribes against Hutchinson and other governmental figures in the 1760s and 1770s. Finally, Callender, who showed more self-awareness than Adams on this point, connected his critique of the act to the very nature of the election-year pamphlet in which his more general critiques of Adams were appearing. The act made it virtually “impossible to discuss the merit of the candidates.” If a person proclaimed that he “prefer[red] Jefferson to Adams”—as Callender was of course doing in this very pamphlet—wouldn’t that itself be an actionable slur on Adams?

 

The Adams administration apparently agreed, and prosecuted Callender in the spring of 1800 for what today looks like a rather typical, if overstated, campaign tract.

 

Callender’s nine-month sentence drew the gaze of printers and readers across the continent, just as the Adams-Jefferson race was unfolding in a series of statewide contests for electoral votes. Alongside the conviction of Lyon, Callender’s case cast Adams in an unflattering light, as did other lower-profile cases. (One featured a Newark drunkard, Luther Baldwin, who made a crude joke about the president’s rear end.)

 

All told, the Adams administration initiated more than a dozen—indeed, one recent historian says many dozen—prosecutions under the Sedition Act and closely related legal theories. Some cases never came to trial but still captured attention. For example, the feisty printer of Philadelphia’s Aurora General Advertiser, Benjamin Franklin Bache, named for his famous printer-grandfather, died while under indictment—the victim of a yellow fever epidemic. The Aurora was a high-profile anti-administration paper published in an iconic city. Going after Bache was the eighteenth-century equivalent of a Republican president today seeking to imprison the editors of the Washington Post or a modern Democratic president aiming to criminalize the publishers of the National Review.

 

Indeed, Jefferson himself had secretly financed Callender (a fact that came to light only later). If Callender was guilty, why not his accomplice Jefferson? Adams’s policies were thus the eighteenth-century equivalent of, say, Donald Trump trying to imprison Joe Biden in 2020 for speaking ill of Trump and supporting others who did the same.

 

TWO SUPREME COURT JUSTICES riding circuit had sided with Adams, but America’s ultimate supreme court consists of the sovereign American people, who express themselves most consequentially via constitutional amendments and pivotal elections. The Adams-Jefferson contest was just such a pivotal election, and the court of public opinion ultimately sided with Jefferson and Madison, as has the court of history.

 

The biggest problem with the Sedition Act of 1798 was its self-sealing quality. Anyone in the press who harshly criticized this horrid law (such as Callender) risked prosecution under the law itself.

 

But each state legislature was a special speech spot. Even if newspapers risked prosecution under the Sedition Act if they initiated their own critiques of the act, or reprinted other newspapers’ critiques, surely they would enjoy absolute immunity if they merely told their readers what had been said in the special speech spots in state capitals. Thus, Madison and Jefferson quietly composed resolutions for adoption in the Virginia and Kentucky legislatures, respectively.

 

Madison was by far the abler constitutional theorist and practitioner, and his version has aged better than Jefferson’s. On Christmas Eve 1798, the Virginia General Assembly denounced the provisions of the Sedition Act as “palpable and alarming infractions of the Constitution.” That act, “more than any other, ought to produce universal alarm, because it is levelled against that right of freely examining public characters and measures, and of free communication among the people thereon, which has ever been justly deemed, the only effectual guardian of every other right.”

 

Over the next six weeks, newspapers in most states reprinted or excerpted Virginia’s protest. In the short run, Madison and Jefferson did not succeed in getting other state legislatures to join the Virginia and Kentucky bandwagon. But in the end, it did not matter whether the two statesmen immediately convinced a majority of state lawmakers, just as it did not matter whether they immediately convinced a majority of sitting Supreme Court justices. What mattered most in 1800–1801 was winning a majority of Electoral College votes in the Jefferson-Adams slugfest.

 

And that Jefferson did. When the American people, having now seen quite clearly what freedom meant to Adams and what freedom meant to Jefferson, decided between these two icons of 1776, they decided for Jefferson.

 

BUT THERE WAS A CATCH, involving palace intrigue eerily similar to some of the strangest moments that would unfold in America 220 years later, in January 2021.

 

The backstory to this episode of palace intrigue and near mayhem in 1800–1801 began, fittingly enough, with the early 1790s rivalry between Jefferson and Hamilton. Who was truly Washington’s prime minister? In particular, who should succeed to the presidency if both Washington and Adams were to die, become disabled, or resign?

 

The Constitution’s Vacancy Clause left this question for the federal legislature to decide: “Congress may by Law . . . declar[e] what Officer shall then act as President.” The text authorized an ex officio designation—not who but what, not which person but “what Officer” qua officer would serve as acting president as part of his regular office. In 1791 Jefferson’s partisans in Congress, led by Madison, proposed to designate the secretary of state as the officer next in line, a move that would bolster the status of Thomas Jefferson (who then held that office) and deflate the pretensions of then Treasury Secretary Alexander Hamilton. Hamilton’s Congressional admirers balked. As a compromise, some proposed to designate the chief justice—a post then held by the Hamilton-leaning John Jay. After bouncing between House and Senate and various committees thereof, the bill as finally adopted in 1792 placed America’s top senator—the Senate president pro tempore—first in line, followed by the Speaker of the House.

 

Alas, this was unconstitutional. As Madison and others persuasively pointed out, senators and House members were not, strictly speaking, “officers” within the letter and spirit of the Constitution’s Vacancy Clause. Only judges and executive officials—those who acted upon private persons, and were not mere lawmakers—were proper “officers” for succession purposes. Indeed, Article I, section 6 expressly prohibited sitting congress members from holding executive or judicial office: “no Person holding any Office under the United States, shall be a Member of either House during his Continuance in Office.”

 

All this set the scene for the post-election drama of 1800–1801. The Democratic Republicans won the election, with 73 electoral votes for Jefferson compared to 65 for Adams. But the fledgling party blundered, slightly.

 

Under the original Constitution, there was no separate balloting for the vice presidency. Rather, each member of the Electoral College cast two votes for president. The top vote-getter, if backed by a majority of Electors, would win the presidency, and whoever came in second in the presidential balloting would become vice president. The Democratic Republicans aimed to catapult Jefferson into the presidency and his running mate, New Yorker Aaron Burr, into the vice presidential slot, but every Jeffersonian Elector also voted for Burr. The party should have designated one Elector to throw away his second vote to ensure that Jefferson would outpoint Burr, but somehow failed to do this. Thus there was a tie at the top, a tie that would need to be untied by the lame-duck, Federalist-dominated House of Representatives.

 

The House could surely pick Jefferson—the only proper outcome, thought the Jeffersonians. Indeed, this is what the House ultimately did, thanks in no small measure to Hamilton’s emphatic appeals to Congressional Federalists on behalf of Jefferson. Hamilton told his correspondents that despite his own fierce feuds with Jefferson and the personal dislike that each man had for the other, the former secretary of state was an honorable and capable public servant committed to his country’s welfare. Once in power, Jefferson would, Hamilton hoped, eventually see the (Hamiltonian) light and govern in a way that would protect America’s vital interests at home and abroad. (Hamilton guessed right on this, in general.) Hamilton told his Federalist allies that Burr, by contrast, was a charming but corrupt wild card, who might sell the nation out to the highest bidder merely to line his own pocket.

 

Still, the Federalist-dominated Congress could lawfully pick Burr. Many Jeffersonians considered this scenario underhanded, because none of Burr’s Electors had truly wanted to see him president. From a legal point of view, however, Burr’s votes were no different from Jefferson’s. If Federalists actually preferred Burr, why shouldn’t he win as the consensus candidate? After all, had Federalist Electors known long in advance that Adams was a lost cause, they could have chosen to vote for Burr in the Electoral College balloting in the several states. Had even a single Federalist so voted, Burr in fact would have received more electoral votes than Jefferson, and thus would have won under the strict letter of the rules. How was the matter any different if Federalist House members opted to back Burr over Jefferson when allowed to untie the Electoral College tally? If this flipping of their ticket irked Jeffersonians, they had only themselves to blame for having picked Burr as their second man. After all, even if Burr were selected by the Federalist-dominated House, nothing would stop (President) Burr from resigning in favor of (Vice President) Jefferson. Easier still, nothing stopped Burr from publicly urging all House members to endorse Jefferson, mooting any need for post-inaugural heroics.

 

What if the House failed to pick either Jefferson or Burr? This sounded lawless, but it wasn’t, really. The Constitution required the House to untie the election under special voting rules reminiscent of the old Articles of Confederation. Each state delegation in the House would cast one vote, and the winner would need a majority of state delegations. If a state delegation were equally divided or abstained, its vote would count for zero, not one-half for each candidate. It was thus imaginable that neither Jefferson nor Burr would have an absolute majority of state-delegation votes in the House—nine out of sixteen—when Adams’s term expired at the end of March 3.

 

If so, could Adams simply hold over for a short period past his constitutionally allotted four years? For, say, a month? For a year? For four years? Or would the Succession Act spring to life when Adams’s term expired, allowing the Senate’s president pro tempore to become the president of all America? Even if that person were a Federalist? (The Federalists had a comfortable majority in the lame-duck Senate; the new Senate would be closely divided.) What about the argument that the Succession Act was in fact unconstitutional?

 

Enter “Horatius,” stage right. In a pair of newspaper essays initially published in early January 1801 in the Alexandria Advertiser and widely reprinted in both the capital area and beyond, the anonymous Horatius offered a cute way of untying the “Presidential Knot.” Horatius argued that the Succession Act was indeed unconstitutional. The lame-duck Congress should thus enact, and the lame-duck president, Adams, should sign, a new Succession Act designating a proper “officer” to take charge after March 3 in the event of a Jefferson-Burr House deadlock. Horatius did not explicitly state what officer should now fill the blank, but the obvious choice, legally and politically, for the lame-duck Federalists, was the secretary of state. After all, he was the highest-ranking officer, except for the arguable possibility of the treasury secretary and the chief justice. But the position of chief justice was vacant in early January. And although Horatius said none of this—he didn’t need to—the sitting secretary of state in early 1801 just happened to be the Federalists’ most popular and able politician: Jefferson’s old rival and first cousin, once removed, John Marshall.

 

It was an elegant and brilliant idea, a political and legal stroke of genius—evil genius, from a Jeffersonian perspective. But whose genius idea was it to crown John Marshall? Who was this Horatius? Most likely, according to modern scholars, John Marshall himself!

 

Even if Marshall was somehow not Horatius, Marshall surely agreed with Horatius. In mid-January 1801, James Monroe sent Jefferson a letter bristling with concern: “It is said here that Marshall has given an opinion in conversation …that in case 9 States should not unite in favor of one of the persons chosen [by the Electoral College—that is, Jefferson or Burr], the legislature may appoint a Presidt. till another election is made, & that intrigues are carrying on to place us in that situation.” In an earlier letter to Jefferson, Monroe had also identified Marshall as a likely beneficiary of the Horatius gambit: “Some strange reports are circulating here of the views of the federal party in the present desperate state of its affrs. It is said they are resolved to prevent the designation by the H. of Reps. of the person to be president, and that they mean to commit the power by a legislative act to John Marshall,. . . or some other person till another election.”

 

Jefferson responded by treating the situation as 1776 all over again, rallying his troops and rattling his saber. In mid-February 1801, he told Monroe that he “thought it best to declare openly & firmly, one & all, that the day such [a succession] act passed, the middle states would arm, & that no such usurpation even for a single day should be submitted to.” This was not casual chitchat. In 1801 Monroe was the sitting governor of Virginia, which of course bordered on the new national capital city. Jefferson was telling Monroe to ready his militia to march on Washington—with weapons—and Monroe was listening carefully.

 

Jefferson’s were the words of a sloppy, rash, and trigger-happy politico. What was his legal warrant for threatening to incite states near the national capital (“the middle states”) to take up arms against the central government? The Horatius gambit was surely sharp dealing, given that it aimed to give the presidency to neither Jefferson nor Burr, but how was it illegal? The Jeffersonians themselves had created the mess that Horatius slyly offered to tidy up. After all, Jefferson himself and his party had picked the ethically challenged Aaron Burr to be—under their own plan—a heartbeat away from the presidency.

 

If Burr were supremely honorable, he could simply declare, publicly and unequivocally, that he would not accept the presidency even if offered the post by the lame-duck Federalist-dominated House. Had Burr made such a clear and public declaration, it is impossible to imagine that the House could have deadlocked. Jefferson would have become president by process of elimination, much as if Burr were dead. (Imagine, say, an early 1801 duel in which Hamilton killed Burr!)

 

To his credit, Burr did not actively lobby in his own behalf. He did not hasten to Washington City to meet with House members, nor did he make any promises by letter or via intermediaries in exchange for House votes. But he did not, as he easily could have done, emphatically and openly disavow willingness to be selected over his senior partner.

 

Four years earlier, Jefferson had acted with more modesty when he had faced a remarkably similar situation. In mid-December 1796, he wrote a letter to his campaign manager, Madison, that ended up yielding enormous political dividends. If, upon the unsealing and counting of Electoral College ballots in early 1797, he and Adams ended up tied in the contest to succeed the retiring George Washington, thus obliging the House to break the tie, he wrote, “I pray you and authorize you fully to solicit on my behalf that Mr. Adams may be preferred. He has always been my senior from the commencement of our public life, and the expression of the public will being equal, this circumstance ought to give him the preference.” As events unfolded, Adams ended up with an outright majority over Jefferson in the Electoral College tally, rendering Jefferson’s sacrificial offer moot.

 

Adams himself learned of the letter and was charmed. (Jefferson, who had far more self-possession and politesse, generally knew how to play Adams—via professions of friendship and fulsome praise of the senior statesman’s early services to the republic.) In an exultant note to Abigail written on New Year’s Day, 1797, John regaled his wife with (imagined and inflated) details of Jefferson’s admiration and deference:

 

So many Compliments, so many old Anecdotes. . . . [Dr. Benjamin Rush] met Mr. Madison in the Street and ask’d him if he thought Mr. Jefferson would accept the Vice Presidency. Mr. Madison answered there was no doubt of that. Dr. Rush replied that he had heard some of his Friends doubt it. Madison took from his Pocket a Letter from Mr. Jefferson himself and gave it to the Dr. to read. In it he tells Mr. Madison that he had been told there was a Possibility of a Tie between Mr. Adams and himself. If this should happen says he, I beg of you, to Use all your Influence to procure for me [Jefferson] the Second Place, for Mr. Adams’s Services have been longer more constant and more important than mine, and Something more in the complimentary strain about Qualifications &c.

 

Perhaps Jefferson in late 1796 knew all along that Adams had more votes, and the letter to Madison was a brilliant ploy designed mainly to flatter Adams and put him off guard. (If so, it worked.) Or perhaps Jefferson meant everything he said (which was less than Adams recounted; the tale grew in the telling). Either way, it is notable that Aaron Burr did not follow in Jefferson’s deferential footsteps, even though Burr, in 1800–1801, had infinitely more reason to yield to his senior partner and teammate Jefferson than Jefferson in 1796 had to yield to his old friend, but now rival, Adams.

 

On Wednesday, February 11, 1801, Congress met in the new capital city of Washington in the District of Columbia to unseal the presidential ballots that had been cast by electors in the several states. Per the Constitution’s explicit provisions, the Senate’s presiding officer—that is, the incumbent vice president, Thomas Jefferson himself—chaired the proceedings. As expected, there was the tie at the top: 73 votes for Jefferson and 73 votes for Burr. The House immediately began balloting by state delegation. House rules said that the House “shall not adjourn until a choice be made.”

 

All through the night and the next morning, the House voted over and over, but neither Jefferson nor Burr could reach the requisite nine states (out of sixteen total). After twenty-eight continuous rounds of balloting, the exhausted legislators broke off shortly after noon on Thursday to get some sleep. Friday the 13th brought no resolution. Nor did Saturday. Still nothing when Congress reconvened on Monday the 16th. Adams’s term of office was due to expire on Tuesday, March 3—a mere fortnight away.

 

If the impasse continued, would Adams audaciously (illegally?) hold over past his allotted four years? Or would the lame-duck and electorally repudiated Federalist Congress in its final hours ram through a new Succession Act, à la Horatius, crowning Marshall ex officio as acting president, either in his capacity as secretary of state or in his new and additional role as America’s chief justice? (He was nominated for this post by President Adams on January 20 and confirmed by the Senate on January 27; he received his judicial commission on January 31 and took his judicial oath on February 4. Thus for the last month of the Adams administration, he wore both an executive and judicial hat.) If Adams or Marshall took steps to act as president on March 4, would Jeffersonian middle-state militias in Virginia and Pennsylvania respond with force as threatened? Would the self-proclaimed acting president Adams or Marshall counter with federal military force? Whom would the federal military salute? Would Federalist New England militias mobilize and march south? Would Hamilton try to jump into the fray? (In the late 1790s, he had been commissioned as a high general, second in command to George Washington, in anticipation of possible military conflict with France.) With the irreplaceable Washington no longer alive to calm the country and rally patriots from all sides to his unionist banner, would the American constitutional project ultimately collapse in an orgy of blood and recrimination, like so many Greek republics of old and the fledgling French republic of late?

 

These and other dreadful questions darkened the horizon in mid-February. And then, suddenly—as if a strong blast of fresh air abruptly swept across the capital city—the impasse ended. On the thirty-sixth ballot, on the afternoon of Tuesday, February 17, enough House members changed their minds to swing the election to Jefferson, by a vote of ten states to four, with the remaining two states professing neutrality. Most historians believe that Jefferson gave certain assurances to fence-sitting Federalists. Jefferson denied having made any promises, but he was a master wordsmith; his carefully crafted statements of intent (as distinct from promises) had sufficed. Thus, various Federalists crowned Jefferson with the expectation, confirmed by winks and nods from Jefferson and his authorized intermediaries, that he would govern as a moderate.

 

ON MARCH 4, 1801, America’s new chief justice administered the presidential oath of office to his rival and kinsman to complete the nation’s first peaceful (?) transfer of power. Adams was not there to witness the event. Earlier that day, he had left the capital city on a coach bound for his family homestead, brooding about what might have been.

]]>
Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180415 https://historynewsnetwork.org/article/180415 0
Why a Culture War Over Critical Race Theory? Consider the Pro-Slavery Congressional "Gag Rule"

 

 

 

What is Critical Race Theory, and why are Republican governors and state legislators saying such terrible things about it? If you are among the 99% of Americans who had never heard of this theory before a month or two ago, you might be forgiven for believing that it poses a grave threat to the United States through the indoctrination of our schoolchildren. To understand the sudden rise in attacks on this little-known theory, it helps to consider an earlier campaign of silencing in US history—the effort to shut down any discussion of slavery in Congress through a gag-rule that lasted almost a decade in the 1830s and 1840s.

 

In 1836, in response to a flood of anti-slavery petitions, the House of Representatives passed a resolution (Rule 21) that automatically tabled all petitions on slavery without a hearing. By doing so, they effectively prohibited even the discussion of slavery in Congress. The Senate, for its part, regularly voted not to consider such petitions at all. Southern Representatives and their Democratic allies in the North believed that any attention paid to slavery was divisive in that it heightened regional tensions and promoted slave rebellions. They argued that the drafters of the Constitution never intended for the subject of slavery to be discussed or debated in Congress.

           

At the beginning of each session after 1837, during discussion of the House rules, the ex-President and then Representative John Quincy Adams would attempt to read anti-slavery petitions he had received. Originally, only Whigs supported his efforts, but more Democrats joined him each session, so that the majority against Adams gradually decreased until the gag-rule was repealed at the beginning of the 1845 session.

 

Parallels between the gagging of anti-slavery petitions and the campaign to prohibit the teaching of Critical Race Theory are clear, if unnoticed before now. Like the Southern delegations who opposed discussion of slavery, opponents of Critical Race Theory believe that any discussion of persistent racial inequities in legal and other institutions is unacceptable because it is “divisive.” Ben Carson and Gov. Kristi Noem (R-SD) have asserted that Critical Race Theory is “a deliberate means to sow division and cripple our nation from within.”

 

In fact, the theory, based on an understanding that race is not biological but socially constructed, yet nevertheless immensely significant for everyday life, provides a way to investigate systemic racism and its consequences. It recognizes that racism does not exist solely in the past: structures embedded in laws and customs persist in the present and permeate social institutions. These structures, intentionally or not, lead to the treatment of people of color as second-class citizens or less-than-full human beings.

 

As their central charge, critics frequently take the theory’s argument that in the US racism is “structural” or “systemic” as synonymous with saying that the United States is “systematically” or “inherently” racist. However, doing so conflates “systemic” with “systematic”: “systemic” practices are those that affect a complex whole of which they are a part; “systematic” practices are planned and methodical. To say an attitude or pattern is structural does not mean that it is unavoidable and unchangeable, that it cannot be addressed and its effects reduced through reforms. Indeed, a central tenet of the theory is that racism has produced its effects through specific, historical institutions, and that reduction of racial inequities can be accomplished, but only once the existence of such injustices is recognized.

 

Most lines of attack on Critical Race Theory depend in similar ways on misunderstandings or distortions. Whether subtle or not-so-subtle, unintentional or willful, their effect is the same: they misrepresent the theory. The opponents criticize what they call the theory’s “race essentialism”— their misconception of Critical Race Theory as saying that an individual, based on their race, is “inherently” racist or oppressive. Against the idea of structural or “inherent” racism, the critics assert that racism only expresses personal choices and actions. But we need not accept their assumption that racism must be either structural or personal; both can surely exist at the same time.

 

Nor do we need to agree with the opponents that the theory considers all white people “inherently privileged” because of their race. In the 1930s, Social Security benefits were denied to domestic workers, the right to organize a union was withheld from propertyless farmworkers, and federally funded mortgages were denied to people of color generally through the practice of “redlining.” The vote was denied to many people of color via poll taxes and other legal obstacles. Recognizing this pattern is not the same as saying that white workers, voters, and mortgage holders are “inherently privileged.” Yet recognizing such a pattern does mean that some of the inequalities and disadvantages under which people of color have labored as a result of discriminatory legislation can be addressed through reformative legislation.       

 

When State Senator Brian Kelsey of Tennessee supported a ban on teaching Critical Race Theory in public schools, he stated that the theory teaches “that the rule of law does not exist and is instead a series of power struggles among racial groups.” However, to acknowledge that laws have been shaped by social structures and cultural assumptions of a particular time does not mean that the rule of law does not exist. Rather, it poses a challenge for us to root out the racist patterns and practices that have been invisibly at work in the idea of “equality under the law.”  

 

Finally, the detractors charge teachers with “imposing” or “forcing” the theory on their students. But these critics are not in fact calling for independence of thought. Rather, their charge seeks to suppress thought that questions historic and continuing inequities and inequalities, just as, almost two hundred years ago, representatives of Southern slave-owners and their Northern sympathizers imposed a gag-rule on their anti-slavery Congressional colleagues.

 

It is instructive that opponents of Critical Race Theory deny what the theory does not assert—that each white person is inherently, essentially racist, and that the institutions of American society are fundamentally, unchangeably racist. It may be easier to legislate these denials and to gag educators than to acknowledge what the theory does assert, and then work to make the difficult changes that are called for in the legal and the educational systems of our country. By denying that racism is entrenched and unyielding, they render it more entrenched and more resistant to attempts to address its consequences. 

]]>
Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180452 https://historynewsnetwork.org/article/180452 0
The Legacy of Same-Sex Love in Ancient Thebes

 

 

 

Among the many roads leading up to the Supreme Court’s 2015 decision on same-sex marriage, one of the more significant routes passes through Boeotia in central Greece.  This region, and its principal city, Thebes, established a precedent for male same-sex unions that deeply impressed the ancient Greek world as well as gay rights pioneers in nineteenth-century England and the U.S. 

 

The story is little known compared with, say, that of the poetess Sappho of Lesbos, whose homoerotic verses have made the name of her home island, in adjective form, a virtual synonym for female same-sex love.  The Thebans wrote little compared with other Greeks, and those who wrote about them were often biased against them.  But traces survive of their uniquely gay-friendly culture, including a set of archaeological sketches made in 1880 but brought to light only very recently.

 

The long trail begins not with a Theban but with a Corinthian, a wealthy aristocrat named Philolaus.  Sometime in the 8th century BC this man left Corinth with his male lover, an Olympic athlete named Diocles, and landed in Thebes.  The pair were fleeing the incestuous passion of Diocles’ mother – a drama worthy of Sophocles, one supposes, but Aristotle, the source for their flight, gives no details.

 

Committed male couples, willing to go into exile together, were as yet uncommon in ancient Greece.  Homoerotic affairs were more typically short-lived, ending when the junior partner – who may have been pre-pubescent at the outset – began to grow facial hair.  That is the model described by Plato’s speakers in the dialogue Symposium, one of our fullest sources for ancient sexual mores.  But Philolaus and Diocles were both mature men.

 

Did this pair go to Thebes because they knew that their bond would be welcomed there?  Or did their arrival help make Thebes a more gay-positive place?  Aristotle says that Philolaus crafted laws for the Thebans, and other writers make clear that those laws gave special support to male unions.  It’s the first we hear, in any Greek city, of a legislative program designed to encourage same-sex pair bonding; some other Greek law codes explicitly discouraged it.

 

Fast-forward to the 4th century BC, where evidence of Theban uniqueness is more widespread.  In Athens, observers like Xenophon noted that male lovers among the Boeotians (the ethnic group that included the Thebans) lived together “as yoke-mates,” a metaphor usually used of heterosexual marriage.  Aristotle, in a work now lost but cited by Plutarch, described how Theban male couples swore vows of fidelity to one another beside the tombs of Heracles and Iolaus, a mythic pair of heroes assumed by most Greeks to have also been sexual partners.

 

By far the most significant evidence of same-sex bonding among Theban adults is the legendary Sacred Band, described by Plutarch (himself a Boeotian) in his work Parallel Lives.  This infantry regiment was formed from 150 male couples, in which both partners were clearly above the age of military service.  The Thebans established this corps in 378 BC in response to Spartan aggression, and with its help they defeated the Spartans in open battle only seven years later.

 

Our information about the Band is scanty enough that a handful of scholars have doubted Plutarch and suggested the erotic principle described in the Lives was only a fiction.  But the mass grave of the Band was uncovered in 1880 at Chaeronea, the spot where they fell in battle, and sketches were made of their remains by Panagiotis Stamatakis, the chief excavator.  These sketches, uncovered by Greek archivists only in the past few years, reveal that pairs of corpses were interred with arms linked – dramatic confirmation of Plutarch’s account.

 

The effectiveness of the Band, says Plutarch, was based on the way that two lovers, fighting side by side, would strive to impress one another with prowess and courage.  It’s just the same reason a pair of chariot horses run faster than any one horse, says Plutarch (who seems to know this for a fact).  Plato makes a related point in Symposium, discussing a hypothetical army of lovers: No one, he says, would want his beloved to see him turn and run in the face of danger. 

 

The tomb of the Sacred Band was unearthed at just the moment, in the late nineteenth century, that gay men, in Europe and the U.S., were first coming out of the closet.  Walt Whitman wrote in Leaves of Grass of a city where “manly love” flourished, seemingly inspired by Plutarch’s account of Thebes.  In England, the classical scholar and essayist J.A. Symonds wrote an impassioned defense of Greek male homosexual culture, including that of Thebes, in his seminal pamphlet “A Problem in Greek Ethics.”

 

The work of both Symonds and Whitman influenced George Cecil Ives, a gay Victorian man and a friend of Oscar Wilde, to make the Sacred Band his emblem of gay male pride.  He formed a secret society, the Order of Chaeronea, to provide a forum for closeted homosexuals and to work toward a more inclusive society.  The voluminous diary he kept during the late 19th and early 20th century is today considered a vital source for the start of the gay rights movement.

 

Ives became so entranced with the Sacred Band that he began dating his diary entries according to the years elapsed from 338 BC, the date of the Battle of Chaeronea.  He thought the new age of the world had begun not with the birth of Christ but with the destruction of the Sacred Band by Alexander the Great.  The extinction of the Band, in his eyes, had marked a great fall from grace, the end of an era when gay men like himself could live a life unimpeded by condemnation.

 

Ives was optimistic that the golden age that ended at Chaeronea could someday return.  “I believe that Liberty is coming,” he wrote in his diary in 1893.  “I sometimes think that some of us will live to see the victory.”  How gratified he would have been by the legalization of same-sex marriage and other kinds of “victory” for inclusion -- milestones attained with the help, in some small part, of Thebes and its Sacred Band.

]]>
Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180453 https://historynewsnetwork.org/article/180453 0
Paying People to Get Vaccines is an Old Idea Whose Time has Come Again

John Haygarth founded the Smallpox Society of Chester in 1778 to promote the then-unpopular practice of inoculation.

 

 

Several states now offer incentives for COVID vaccinations, hoping that enough people will sign up to drive the infection rate down and protect the entire community. When this was first tried in the late eighteenth century, it met with mixed success.  The originator was John Haygarth of Chester in Northwest England, who published his plan for a "general inoculation” of the poor as "An Inquiry How to Prevent the Small-pox" in March 1778. Haygarth argued that "social distancing” and other preventive steps could quell urban smallpox epidemics.  At that time, smallpox most often attacked poor young children whose families could not afford to shield them from all contact with infectious people. It was by far the most fatal disease in Britain, causing about half of all deaths among children under ten.  Most adult city residents had contracted the disease as children and become immune.  Outbreaks struck every few years when enough children had been born to sustain a fresh epidemic, although inoculation, a preventive measure, was increasingly accessible.

When it was first introduced into England in 1721, smallpox inoculation, also known as “engrafting” or “variolation,” was a brutal and dangerous procedure.  By inserting matter from a smallpox pustule under the skin of a new patient, a practitioner could usually produce a comparatively mild infection.  Although it was less lethal than naturally contracted smallpox, which killed about one patient in five, inoculation initially had a fatality rate of about one in fifty.  By mid-century, practitioners had become more adept, making the procedure safer and less complex.  Members of the entrepreneurial Sutton family were especially skilled. Daniel Sutton claimed that between 1763 and 1766, he had inoculated 22,000 people with only 3 deaths. However, this created a new problem: because inoculation caused an actual infection with smallpox, recently inoculated patients could infect any susceptible person who came near them before they had fully recovered.  Haygarth set out to solve this double problem: save more children without spreading the disease to others.

After studying smallpox outbreaks, Haygarth decided that it spread only by contagion from person to person and was transmitted primarily by an airborne vapor over a very short distance.  He drew up “Rules of Prevention” warning those with smallpox to avoid going out in public and those who were susceptible to avoid entering any house that held a smallpox patient.  Everyone and everything touched by any discharges from a patient should be washed and exposed to fresh air, and all medical attendants must wash their hands. Then he launched a campaign in Chester for mass inoculations.  Inoculating groups of people at the same time could also reduce the odds that they would transmit smallpox.

With his ally Thomas Falconer, Haygarth founded a "Society for Promoting Inoculation at Stated Periods and Preventing the Natural Smallpox" in Chester.  They raised donations to pay local doctors to perform the procedure and to pay poor families for bringing their children. All the doctors volunteered to participate without charge, increasing the fund for the families.  One concern was that recently inoculated children might spread smallpox. Even worse, some families exposed their children deliberately. To prevent these problems, the Society offered two payments to any poor family that (1) inoculated its children and (2) faithfully followed Haygarth's “Rules of Prevention.” The Society wrestled with the ethics of paying parents for inoculations but concluded that it could be considered compensation for the wages lost while they nursed their children. They also hired an “inspector” to follow up with the families and ensure that they observed the quarantine rules.  Mass inoculations were not new, but this may have been the first time anyone tried to create widespread immunity by offering a reward to families for inoculations. At first the society was successful, inoculating hundreds of children, suppressing incipient epidemics, and halving the death rate in Chester from smallpox.  In 1781, however, with their funds dwindling, they decided to stop paying families for inoculating their children and focus on rewards for obeying the rules of prevention.  When the scheme was first initiated, they thought paying for the inoculation itself was necessary to overcome “inveterate prejudices” against the procedure, but it had come to be seen as a bribe for doing something wrong.

After six years the plan collapsed in the face of frequent re-importation from nearby cities, the transit of infected soldiers through the city, and the resistance of many Cestrians.  The Society admitted that a single payment of five shillings for following the rules was too small to attract parents.  With so many new cases emerging, the inspectors were overwhelmed, and the society could not raise enough money to sustain its campaign or attain widespread compliance. (Proceedings, 205)

Haygarth, a very tenacious man, was undaunted.  As the constant re-importation had contributed to the Society's collapse, he decided that only a nationwide inoculation effort could succeed. He filled in the details in 1793 with a sprawling two-volume compilation entitled Sketch of a Plan to Exterminate the Casual Small-pox from Great Britain.  This ambitious plan was out of step with the realities of eighteenth-century British governance.  In any case, it was soon pre-empted by the safer practice of “cowpox” vaccination.  Even vaccination did not eliminate smallpox, although child mortality began to plummet after a reformed British government made it mandatory in 1853.

Yet Haygarth's efforts were not wasted.  His research on smallpox produced new information about the behavior of contagious diseases, including incubation times, infectivity, conditions for exponential growth, epidemiology, and methods of control.  Haygarth drew on his growing expertise to investigate influenza, typhus and, less successfully, yellow fever, and to establish isolation wards for typhus at the Chester Infirmary. Haygarth was among the leaders of a small group of British physicians who redefined infectious diseases and claimed that contagion could be prevented by relentless cleanliness, separation, and fresh air.  Although their work was always controversial in Britain, it was widely read both at home and abroad and created a more secure foundation for medical research. 
So when you line up for your shots, win the vaccination lottery, or throw away your mask, take a minute to acknowledge the activists of the eighteenth century who worked to improve public health with mass inoculation.    

]]>
Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180451 https://historynewsnetwork.org/article/180451 0
Understanding Gun and Police Violence Lies Between History and Power

Navy Junior ROTC Cadets use the firearms training simulator at Great Lakes Naval Air Station (IL).


National Gun Violence Awareness Day is here, yet given the continued spikes in gun and police violence over the past year, one could argue a reminder isn’t necessary. According to the Alliance for Gun Responsibility, African Americans are 10 times more likely than whites to die by gun homicide. They are also more than three times as likely to be killed during a police encounter.

Gun and police violence are not separate affairs. Police shootings are part of America’s gun problem: police violence is a leading cause of death for young men. There is a correlation between police killings, states’ gun control laws, and gun ownership rates. And peoples of African ancestry, more than others, pay for this mix of gun culture and militarized police with their lives.

After Derek Chauvin's conviction, many believed justice prevailed and that the George Floyd Justice in Policing Act would become law. Yet further police killings just 24 hours after the verdict, and President Biden marking Floyd’s murder with a discussion rather than a law, show why that belief may be premature.

U.S. police kill civilians at higher rates and in larger numbers than police in other democracies, and policy changes, body cameras, and media scrutiny have not reduced the racial disparity in fatal police shootings. We need to understand the limits of policy prescriptions in the face of deeply rooted cultural and social norms related to policing and gun violence. The Supreme Court ruled racial segregation in public schools unconstitutional in 1954, but today public schools are more segregated by race and income than they were six decades ago.

Likewise, we cannot reform our way out of policing and guns, because they are tethered to settler colonialism and slavery’s ongoing violence. Understanding this past in the present outweighs any act of protest or congressional bill. But grasping the past means reckoning with the fact that a third of U.S. adults cannot pass a U.S. citizenship exam, and that most K-12 students, and many lawmakers, have a poor grasp of U.S. history. Indeed, billionaire David Rubenstein undertook a project to teach politicians U.S. history.

The issue of police and gun violence, especially against non-white peoples, is marginalized if not absent from the public’s understanding of both. There are stubborn perceptions: guns don’t kill, people do; most cops are good, with only a few bad ones; the threat of police violence experienced by non-whites is overblown or justified; and high-profile cases of police “misconduct” are anomalies fixable through reform. These perceptions ignore the history of the power derived from control over people.

In the colonies that later formed the United States, policing and violence were tightly braided. If policing and gun violence were circles of a Venn diagram, the overlap would be enforcement—the act of power over another. Enforcement animated police violence, and as policing became bureaucratized, this legitimacy shielded police and gave them greater discretion in the use of deadly force.

Policing in the United States, as in England, evolved from community watches. These were supplemented by unarmed and unpaid constables without uniforms. In urban areas like Boston and New York City, centralized police forces became publicly funded, full-time bureaucracies. Their mandate was to ensure social control rather than crime control. Private businesses transferred the cost of their protection to the state, which paid for policing.

Rural southern areas of the U.S. used a mixture of “slave patrols,” bounty-hunting, deputization, and general surveillance of chattel to apprehend fugitives, deter revolt, and enforce racist laws. Together, rural and urban policing were corrupt and brutal, operating under the control of politicians who were beholden to economic elites. Those political and economic elites created the venues for public drinking, prostitution, and workers’ strikes, then criminalized those behaviors, assigning them to an identifiable “class” dangerous to social order.

Protests or strikes were criminalized as “riots.” Police were legally authorized to use force under the guise of the rule of law, to patrol and surveil, and to wear uniforms that signaled a clear difference between them and the “dangerous” elements. The central problem that led to policing was never crime, but political and economic power. As centralized police departments became the norm, they decided to arm officers only after officers had already armed themselves. Indeed, white people were armed in the United States long before centralized police forces existed, and so white police forces simply formalized the “right to bear arms” argument, seemingly reserved for white people. Viewed from this perspective, gun control has also meant limiting gun access for peoples of African ancestry.

Though gun and police violence are in the crosshairs of current political debates, the real targets are the history and the very coercive power used to build this nation. The nation cannot be made anew and accountable through feel-good implicit bias training, anti-racist workshops, and policy prescriptions. Facing deeply rooted cultural and social norms around policing and gun violence requires confronting histories and concentrations of power which make them viable, and at a minimum making police violence an integral part of National Gun Violence Awareness.

]]>
Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180449 https://historynewsnetwork.org/article/180449 0
The New Meaning of "The Loyal Opposition"


The phrase “the loyal opposition” was coined by John Hobhouse in a debate in the British Parliament in 1826.  Less than a hundred years later, A. Lawrence Lowell, a political scientist (and later president of Harvard University), proclaimed the loyal opposition “the greatest contribution of the nineteenth century to the art of government.”

 

Designed to make space for the political party out of power to dissent and hold the majority party accountable without facing accusations of treason, the concept of a loyal opposition depends on the deference of non-governing parties to the authority of democratic institutions and the normative framework in which they operate.

 

The saving assumption of the loyal opposition, Michael Ignatieff, former leader of the Liberal Party in Canada and President of the Central European University, has written, is that “in the house of democracy, there are no enemies.”  When politicians treat each other as enemies, “legislatures replace relevance with pure partisanship.  Party discipline reigns supreme… negotiation and compromise are rarely practiced, and debate within the chamber becomes as venomously personal as it is politically meaningless.”

 

Republicans in the United States Congress, many of whom endorsed groundless claims that the 2020 presidential election was rigged, it now seems clear, have changed the meaning of “loyal” to obeisance to party rather than to democratic principles.  And the decision of GOP leaders in the House and Senate to block a bi-partisan commission to investigate the January 6 assault on the Capitol serves as the most recent example:

 

According to John Katko, the New York Republican Congressman who negotiated the provisions of the draft legislation with his Democratic counterpart Bennie Thompson of Mississippi, the bill was modeled on the 9/11 commission to ensure it was “depoliticized entirely.”  The commission would have been composed of an equal number of Republicans and Democrats, with equal subpoena powers, an inability to subpoena a witness without bi-partisan agreement, and shared authority to hire staff.

 

Although Democrats incorporated the provisions Kevin McCarthy (R-California) demanded into the bill, the House Minority Leader declared last month that he opposed the commission because its “shortsighted scope” omitted “interrelated forms of political violence in America… I just think a Pelosi commission is a lot of politics.”

 

Senator Mitch McConnell (R-Kentucky), who declared in 2010 that “the single most important thing we want to achieve is for President Obama to be a one-term president” and in 2021 that “100% of his focus” would be on “stopping the [Biden] administration,” claimed, without evidence, that Pelosi, Thompson and Company negotiated “in bad faith” in order to “centralize control over the commission’s process and conclusion in Democratic hands.”  Although the Justice Department is limited to investigating crimes and lacks the power to subpoena individuals with knowledge of the assault who did not break the law, the Minority Leader opined that the DOJ probe rendered a bi-partisan commission “redundant.”  To this allegedly good reason, he added his real reason: winning majorities in the House and Senate in 2022 requires Republicans to prevent Democrats from continuing “to debate things that occurred in the past.”  McConnell then orchestrated the filibuster that prevented the Senate from considering the legislation.

 

Ditto John Thune, the Republican Minority Whip.  Without addressing the need to determine what happened on January 6, who was responsible, and how another assault might be prevented, Thune expressed his fear that an investigation “could be weaponized” in 2022.  Senator John Cornyn, who had agreed in February “with Speaker Pelosi – a 9/11 type commission is called for to help prevent this from happening again,” also began to sing along with Mitch.  “The process has been hijacked for political purposes,” he declared.  Democrats are “going to try to figure out what they can do to win the election.  Just like 2020 was a referendum on the previous problem, they want to make 2022 one.”

 

In the closing pages of 1984, George Orwell’s dystopian novel, O’Brien, a functionary in the totalitarian state of Oceania (whose first name is never revealed), predicts that in the not-too-distant future “there will be no loyalty, except loyalty to the Party… There will be no laughter, except the laugh of triumph over a defeated enemy.”

 

It has been said that “when the loyal opposition dies, the soul of America dies with it.”  And it may not be unreasonable to fear that, unless principle begins to trump party, that time may be at hand.

]]>
Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180455 https://historynewsnetwork.org/article/180455 0
John Cena's Taiwan Controversy Recalls Richard Nixon's Biggest Mistake

Richard Nixon Meets Chairman Mao, February 1972


John Cena, a former wrestler turned actor, found himself enmeshed in controversy after giving an interview to a Taiwanese television station. Promoting his new film, F9, Cena said, “Taiwan is the first country to watch Fast and Furious 9.”  Calling Taiwan a country sparked a furious backlash in mainland China, where Cena is a major star and the Fast and Furious franchise has raked in billions over the years. Cena quickly backtracked, offering a groveling and pathetic apology to the Chinese government. In an effort to appease communist China, Cena took to the social media site Weibo and, speaking in Mandarin (in which he is fluent), issued the following statement: “I love and respect China and Chinese people. I’m very, very sorry for my mistake.”

 

Cena deserves condemnation for bowing to Beijing, though he’s hardly alone in doing so, but he can perhaps be forgiven for not understanding the geopolitical issue between China and Taiwan. Situated about 100 miles off the coast of the mainland, Taiwan occupies an ambiguous place in world affairs. In 1949 Chiang Kai-Shek and the Nationalists fled there as the civil war ended and Mao’s forces took control of China. Despite the military defeat, Chiang proclaimed himself the legitimate ruler of China and denied that the communist government had any right to rule over Taiwan. For two decades the United States accepted his claims and refused to establish ties with Mao Zedong’s People’s Republic of China.

 

It wasn’t until Richard Nixon went to China in February 1972 that the situation changed, when Nixon abandoned America’s longtime ally. At the end of Nixon’s visit the two sides issued what became known as the “Shanghai Communique.” China’s part of the statement included the following: “The Government of the People’s Republic of China is the sole legal government of China; Taiwan is a province of China which has long been returned to the motherland; the liberation of Taiwan is China’s internal affair in which no other country has the right to interfere; and all U.S. forces and military installations must be withdrawn from Taiwan.” Nixon, the man who owed his rise in politics to his fierce anti-communism, willingly went along with China’s demands. The American response in the communique was that the United States acknowledged “one China and that Taiwan is part of China.”[i]

 

Nixon’s betrayal of Taiwan was far worse than anything he did during the Watergate scandal. However flawed Chiang Kai-Shek may have been as a leader, he was a man who fought against Mao’s tyranny and stood loyally with the United States for decades. Taiwan was a haven for many who had fled Mao’s terror and they counted on Americans to defend the island from a communist takeover. Although not quite another Munich, Nixon’s selling out of Taiwan is a stain upon his presidency and a dark chapter in American foreign policy.

 

While China has never invaded Taiwan, the threat of intervention looms over the island. Today Taiwan maintains that it is in fact an independent nation, while Beijing insists that it is part of the People’s Republic. Over a dozen countries, including the Vatican, have diplomatic relations with Taiwan, but the United States is not among them. That shameful fact is part of Richard Nixon’s legacy. In retrospect, his opening to China was not the success he believed it was. At some point the two nations were going to establish formal ties; the line that “Only Nixon could go to China” was perhaps correct then, but one of Nixon’s successors would have done it anyway. Further, Nixon’s adulation of Chairman Mao looks worse as time goes on. Mao’s brutal dictatorship, especially his Great Leap Forward, which one historian estimates led directly to the deaths of 45 million Chinese, does not make Nixon’s effusive praise of Mao look wise.[ii] His ignoring of the PRC’s appalling human rights record and his willingness to forsake an old and valued friend are a blight on his presidency. A man who did much good and was a far better president than he is given credit for, Nixon nonetheless deserves censure for his capitulation to China and Mao.

 

[i] Richard Nixon, RN: The Memoirs of Richard Nixon (New York: Simon & Schuster, 1978), 576-577.

[ii] For the estimate of 45 million killed see Frank Dikotter, Mao’s Great Famine: The History of China’s Most Devastating Catastrophe, 1958-1962 (New York: Bloomsbury, 2010).

]]>
Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180448 https://historynewsnetwork.org/article/180448 0
Whataboutism Didn't Get Nixon Off the Hook. It Shouldn't Stop Investigation of the Capitol Riots


The idea of there being an equivalency—moral or otherwise—between the Capitol riot of January 6 and Black Lives Matter and Antifa street agitation is preposterous. Republican attempts to submarine the January 6 commission proposal based on the isolated incidents of urban violence that arose during legitimate citizen protest over police abuses are a complete ruse. But whataboutism is a favorite tactic of politicians who have no good response to their own malfeasance or that of their followers.

Let’s take Richard Nixon and Watergate as the example.

In January 1973, Richard Nixon faced a sticky situation. He was trying to end the Vietnam War for the United States through brutal tactics because diplomacy had failed. In December, Nixon, without Congressional approval, instituted a punishing bombing campaign of Hanoi and Haiphong Harbor to bring the North Vietnamese back to the bargaining table in Paris. The strategy was having some success, though at the expense of international outrage over the bombing of civilian centers.

Nixon wanted support from the former president, Lyndon Johnson, and reached out to him on January 2, 1973, for what would be their last phone call. Johnson encouraged Nixon to keep at it. “Well, I just feel the torture you are going through on Vietnam,” Johnson said. Nixon replied: “As you know, and I’m sure you feel the same way, we’ve got to get this finished in the right way and not in the wrong way.”

Johnson responded, almost inaudibly, “You’re doing it, and I just wish the best for you.”

The complication for Nixon was that the Watergate burglars’ trial was just about to commence in Judge John Sirica’s courtroom in Washington. Howard Hunt, one of the leaders of the burglars, pleaded guilty before the trial started, believing he had an implicit promise of a pardon through his lawyer’s talks with Chuck Colson, a Nixon adviser. All of this seemed suspicious to Senate Democrats, including Ted Kennedy, Sam Ervin and Mike Mansfield, who were making noises about an investigation into Watergate and the political shenanigans of the 1972 campaign.

Nixon wanted a strategy to starve the Congressional appetite for a full-blown Watergate investigation. John Dean, Nixon’s White House Counsel, suggested to Nixon’s Chief of Staff Bob Haldeman that Nixon rummage around at the FBI to see if there might be corroboration to the rumor that LBJ had wiretapped Nixon’s campaign plane in the 1968 campaign. J. Edgar Hoover supposedly told Nixon after he was elected that his plane had been bugged by Johnson.

Nixon never forgot it. And now Dean was recommending they try to “turn off” the Watergate investigation by threatening to expose Democratic wrongdoing in 1968, or whataboutism.

Nixon ordered a deep dive into FBI files and had his staff contact former FBI assistant to Hoover, Cartha “Deke” DeLoach, to find out if hard evidence existed of the 1968 plane bugging. This was a dangerous game, as Nixon knew LBJ would react with anger if he found out. Nonetheless, Nixon persisted.

On January 11, Nixon asked for an update from Haldeman. John Mitchell, Nixon’s former attorney general, had spoken with DeLoach, who, according to Haldeman and an Oval Office tape, confirmed that the spying on Nixon did take place. DeLoach offered to help find confirming evidence but refused to provide an affidavit. Nixon was unhappy. “Bob, I want it from DeLoach.”

All concerned were wary of Johnson’s reaction. Schemer that he was, Nixon suggested that someone tell Johnson that the Washington Star was on to the 1968 bugging story and that together they had to squelch the story by telling Congress to back off all campaign investigations, whether of 1968 or 1972.

The ploy backfired. “LBJ got very hot,” according to Haldeman and called DeLoach and said, “if the Nixon people are going to play with this, that he would release [deleted material—national security], saying that our side was asking for certain things to be done.” Whatever the counterthreat was, it was serious enough to be classified to this day.

In the end, the gamesmanship did nothing to deter the Watergate investigation. The Senate voted in February to start its inquiry. That famous investigation, led by North Carolina Democrat Sam Ervin and Tennessee Republican Howard Baker, broke the back of the Nixon administration’s criminal cover-up of the Watergate break-in. John Dean broke ranks and testified about his own culpability in the cover-up and his warning to Nixon that there was a “cancer growing on the presidency.” Dean’s testimony was fully corroborated after the Supreme Court ordered the White House tapes turned over in July 1974. Weeks later, Nixon resigned.

Whataboutism is a dangerous strategy. At least in the Watergate example the two instances were of some equivalence, if true—both involved presidential surveillance that was probably unlawful. The current situation is vastly different. Comparing protest violence connected to street demonstrations might be apples to apples—for example, violence at BLM protests and violence by Trump-supporting Proud Boys in the street protests in Washington in December 2020 might be considered of a kind.

But a violent insurrection at the United States Capitol with the purpose of stopping Congress from certifying a presidential election is of a different character and magnitude altogether. A frontal attack on democracy is entirely different from street violence that is easily controlled by law enforcement. We have had riots in our cities and looting related to civil unrest, but we have never had the seat of government placed under siege, except during the War of 1812 with Britain.

If Republicans want to investigate street violence, they are free to look into it. But it is imperative to our very form of government that the insurrection of January 6 be completely and fully investigated.

]]>
Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180450 https://historynewsnetwork.org/article/180450 0
Two Films Show the Historical Toll and Present Danger of Ethnic Violence

Still from Habermann (2010)


With ethnic tensions in the USA much in evidence, as witnessed by attacks on Asian Americans, increased antisemitism, and continuing Trumpian white resentment against minorities and immigrants, two films recently available on Amazon Prime Video that display ethnic hatred are timely indeed. They also reflect real historical happenings.

 

The first, listed as “Hatred” on Prime, is a Polish drama that first came out in 2016 under the title “Volhynia.” The movie shows hatred aplenty, including some brutal killings--especially of Poles by Ukrainians.

The original title refers to a Ukrainian region south of modern-day Belarus. Now part of Ukraine, it has historically bounced back and forth between Russian and Polish control: Russia ruled it in the nineteenth century, and Poland controlled its western part between WWI and WWII. A graphic at the film’s beginning gives the ethnic makeup of Volhynia prior to WWII as 70% Ukrainian, 16% Polish, and 10% Jewish.

Many viewers of the film will probably just shake their heads and ask how humans can be so hateful and cruel to each other. All the ethnic and religious killings remind one of the lines from Indian-born writer Salman Rushdie’s The Moor’s Last Sigh: “In Punjab, Assam, Kashmir, Meerut--in Delhi, in Calcutta--from time to time they slit their neighbor’s throats. . . . They killed you for being circumcised and they killed you because your foreskins had been left on. Long hair got you murdered and haircuts too; light skin flayed dark skin and if you spoke the wrong language you could lose your twisted tongue.”

 

But in Volhynia, like in India and Pakistan, not only did different ethnic groups (e. g., Poles, Ukrainians, Jews, and Russians) clash, but so also did differing religious beliefs--in Volhynia’s case, Catholicism, Orthodoxy, and Judaism. In eastern Europe as a whole, similar conflicts occurred from even before the assassination of the Austrian archduke Franz Ferdinand by a Bosnian Serb in 1914 up to the 1990s’ conflicts among Serbs, Croats, Bosnian and Kosovar Muslims and other ethnic groups in the former Yugoslavia that led to the deaths of hundreds of thousands and created millions of refugees. In his sweeping The War of the World: Twentieth-Century Conflict and the Descent of the West, historian Niall Ferguson lists ethnic conflict as one of the three main causes of the “extreme violence” of the century, and central and eastern Europe as the most deadly of the “killing spaces.”

 

Hatred centers on the story of a young Polish girl, Zosia Głowacka, living in a Volhynian village. It begins with the wedding of her sister to a Ukrainian. Zosia is also in love with a young Ukrainian, Petro--an early scene shows them physically intimate. But in exchange for farmland and some animals, her father marries Zosia to Maciej, an older widowed Polish landowner with children. (In the interwar years, the Polish government had helped many war veterans and other Polish colonists settle in Volhynia, which contributed to Ukrainian resentment.) The wedding occurs on the eve of Germany’s 1939 invasion of western Poland, and Maciej is soon drafted to help the Poles fight the Germans.  

 

After the Germans quickly rout them, Maciej and other Poles attempt to return to their Volhynian homes, but many are captured, tortured, and killed by local Ukrainians. Maciej, however, returns to his village by disguising himself as a Ukrainian. But he will not remain there long: according to the secret protocol attached to the Nazi-Soviet Pact of August 1939, part of “Polish” Volhynia is to be taken over by the Russians. After they do so, they arrest and send many Poles, including Maciej, to forced labor in Siberia or Kazakhstan, and the new teacher tells her young students (in Russian) that religion is a superstition.

 

From late 1939 to the summer of 1941, Zosia remains at Maciej’s farm with his children and an infant son of her own, probably fathered by Petro, who gets killed soon after helping Maciej’s children and Zosia avoid deportation.

 

In June 1941 the Germans attack the USSR and quickly take over the Volhynian area where Zosia lives. Some of her fellow villagers who are Ukrainian greet the German troops and cooperate in the arrest and killing of Jews and Poles. Zosia, however, risks her own life to help some Jews.  

 

By the summer of 1943 the Ukrainian Insurgent Army (UPA) has grown, as has the number of local Poles it has killed. The film shows two Ukrainian Orthodox priests preaching to their congregations. The first warns against excessive nationalism, but the second states, “We need to fill all the rivers with Polish blood because Ukraine has to be pure.”

 

Shortly thereafter some of the most horrific scenes of the film appear as local Ukrainians burn Polish huts and kill by burning, stabbing, axing, and other means, while shouting, “Death to the Poles.”

 

Zosia escapes with her little son, but sees her stepson burned alive. Eventually, she arrives at the home of her sister, Helena, and her Ukrainian husband, Vasyl, who is urged by his brother to kill the Polish Helena. Instead, Vasyl ends up killing his own brother with an axe.

 

Shortly thereafter, however, it is the Poles’ turn to be barbaric. They kill Helena’s whole family, including her for marrying a Ukrainian. Zosia escapes again, hiding in the woods with her son.

 

The film’s final scene shows her on a long dirt road, lying in the back of a horse-drawn cart, her and her son being transported by a kindly young man who found them in the woods. And the following wording is displayed on the screen: “In the period of 1943-45 an estimated 80 to 100 thousand Poles and 10 to 15 thousand Ukrainians had fallen victim to Ukrainian nationalists’ attacks and Polish retaliations in the Eastern Borderlands” (According to various historical sources, these estimates seem a bit high, but a joint Polish-Ukrainian conference in 1994 agreed that 50,000 Polish deaths was a moderate estimate).

 

The second film on Amazon Prime, Habermann, is a Czech-German film that reflects tensions between Czechs and Germans in the Sudetenland, the area of Czechoslovakia bordering Germany, during the years 1938 to 1945. This border region was part of Czechoslovakia from 1918 until 1938, when Hitler annexed it. On the first page of his Mein Kampf (1925), Hitler had written that all German-speaking people should be united in an enlarged Germany. In March 1938 he began this process by absorbing (German-speaking) Austria into Germany. Later that year, in late September, he got the governments of England and France to “appease” him (in the infamous Munich Agreement) by agreeing that the Sudetenland, where ethnic German speakers were in the majority, would be given to Germany.

 

August Habermann is a Sudeten German sawmill owner whose family has run the mill for generations. He is married to the Czech Jana, whose father (unbeknownst to her) was Jewish. They have a young daughter. August’s best friend is a Czech forester named Jan Brezina, who is married to Martha, an ethnic German. Most of the employees at the mill are ethnic Czechs, and August treats them fairly. But he begins having major problems after the German takeover, when SS Major Koslowski starts making demands on him and the sawmill and complains that Habermann employs mainly Czechs as opposed to Sudeten Germans.

 

The film then displays various examples of Nazi cruelty--for example, Major Koslowski demands that Habermann select 20 Czech civilians for execution to avenge the deaths of two German soldiers, and Habermann’s wife Jana is sent to a concentration camp. It also shows various examples of Czech resentment of Nazi control. Although some Sudeten Germans like August Habermann are unhappy about Nazi demands, others, like his younger brother Hans, who joins the German army, are fervent Nazi supporters.

 

The movie’s final scenes, like its opening one that foreshadows them, display the violent wrath (including killing) of the Czechs against their Sudeten German neighbors after the Nazis pull out in 1945. Unfortunately for August, the local Czechs blame him for cooperating with the Nazis. They even direct their hatred at his Czech wife, Jana, who has been freed from the concentration camp. “Habermann’s whore,” they call her.

 

Like Hatred, Habermann visualizes for its audiences many unpleasant truths about how beastly we humans can be to one another. According to the Czech historian Tomas Stanek in Verfolgung 1945 [Persecution 1945], from May until early September 1945 Czechs brutalized and killed large numbers of Sudeten Germans as they drove hundreds of thousands of them out of Czechoslovakia.

 

As we watch all this ethnic hatred on display, we naturally ask ourselves, why?  How can we act so inhumanely toward other human beings?

 

In Chapter 1, “A Century of Violence,” of my An Age of Progress? Clashing Twentieth-Century Global Forces (2008), I attempted to explain ethnic and other twentieth-century violence. In doing so, I cited the Nobel Prize-winning economist Amartya Sen, who wrote that much of it flowed from “the illusion of a unique and choiceless identity,” for example, that of nationality, race, or class. He added that “the art of constructing hatred takes the form of invoking the magical power of some allegedly predominant identity that drowns other affiliations and in a conveniently bellicose form can also overpower any human sympathy or natural kindness that we may normally have.”

 

I also indicated that

 

there are many reasons why the deaths of foreigners or those considered fundamentally different seemed to matter much less to people than the deaths of those more similar. . . .  It is natural for people to feel more compassion for those closer to them--for family members, neighbors, or members of a group or nation with whom they identify. In addition, in the case of a nation or state, patriotism and nationalism were often reinforced by education, by media, and by social and cultural rituals such as the singing of national anthems, and, especially in wartime, by government propaganda. (For more of my thoughts on the motivations for violence and the dehumanization that often precedes it, see here.)

 

Looking specifically at the two films reviewed here, a focus on past grievances and an ideology of ethnic nationalism are two main causes of much of the bloodshed: Volhynian Ukrainians remembering mistreatment by the Polish government and individual Poles in the first film, and Czechs avenging Nazi oppression in the second.  In an interview about his Volhynian film, director Wojciech Smarzowski said it is “against extreme nationalism. The film is a warning–it shows what a human being is capable of doing when equipped with a relevant ideology, political or religious doctrine and is allowed to kill.”

I have often written against any nationalism, dogmatism, or ideology that makes us less tolerant of others. And I closed a recent essay with the hope that the USA, “a land of many ethnic groups and various believers and non-believers,” can live harmoniously together and become “stronger, not in spite of its many elements, but because of them.”

How exactly to do so is complex, but a starting point might be to look at the example of the South Africans Nelson Mandela and Archbishop Desmond Tutu, who in the 1990s respectively created and chaired a Truth and Reconciliation Commission, which attempted to move South Africa’s Whites and Blacks beyond the cycle of violence and counter-violence.

]]>
Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180454 https://historynewsnetwork.org/article/180454 0
Governing With an Evenly Divided Senate is a Rare Tightrope Act

Sen. Kyrsten Sinema (D, AZ) may ultimately decide how much legislation passes the Senate before the midterm elections.

Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.

Today, America has, for the fourth time, an evenly divided US Senate. Already this has complicated the ability of President Joe Biden and the Democratic Party to accomplish their goals.  Senate Democrats need party unity in an unusually urgent way. Passing most legislation under current Senate rules is blocked by the ability of the Republicans to filibuster. While 50 Democrats plus Vice President Kamala Harris could vote to change the rules, two party moderates, Joe Manchin of West Virginia and Kyrsten Sinema of Arizona, have been prominently opposed to changing the filibuster despite Republican obstructionism.

The tactic of budget reconciliation allowed the passage of the “American Rescue Plan” in March, and may be pursued for the “American Jobs Plan,” the much-debated infrastructure bill, due to the constant refusal of Senate Republicans to present a counterproposal close enough to the Biden administration's plan to begin real bipartisan negotiation.  Another major initiative, the “American Families Plan,” a major federal initiative promoting education, healthcare, and child care by raising taxes on corporations and on individuals who earn more than $400,000 in income, also faces political barriers.

It is becoming apparent that only modifying or removing the filibuster, which requires a 60-vote supermajority to move legislation toward a final vote (and thus allows a minority faction to control legislation), will make it possible to accomplish such ambitious goals, the greatest since the Great Society or the New Deal. The same is true for Biden’s goals on civil rights, gun regulation, voting rights, climate change, immigration, the minimum wage, criminal justice reform, and education.  What’s more, Biden faces another opponent in the calendar: Republicans are betting that the filibuster will continue to prevent Democrats from passing potentially popular legislation before the midterm Congressional elections in 2022, which historically favor the party opposed to the president.

The average age of current US Senators is 63. Five Senators are older than 80, 25 are in their 70s, and 18 of these 30 Senators are Democrats. There is also concern and alarm over the fact that if one of those 18 Democrats should become incapacitated or die, the Republican Party would hold a majority of the Senate.

The Senate has been evenly split three times in the past, with the 83rd Congress (1953-1955), covering the first two years of the Eisenhower presidency, being the most chaotic.  In January 1953 the Senate had 48 Republicans, 47 Democrats, and Independent Wayne Morse of Oregon.  Three senators died in 1953, and six died in 1954, a total never reached before or since. After the death of Senate Majority Leader Robert Taft of Ohio on July 31, 1953, the Democrats had more members for nearly a year, until July 7, 1954.  However, although Independent Wayne Morse had left the Republican Party, he agreed to caucus with the Republicans to keep their majority, so the Democrats were not able to take over leadership in the 83rd Congress.  In 1955, with the Democrats controlling the Senate by one vote, Morse finally joined the Democratic Party, although he continued to vote in an independent fashion.

Seven decades earlier, the 47th Congress (1881-1883) convened with 76 Senators from 38 states. With the Democrats and Republicans evenly divided, the two Independents split their support between the parties. Republicans could rely on the vote of Chester A. Arthur, the vice president under James Garfield, to break ties. Arthur’s ascension to the presidency in September 1881, after Garfield’s assassination, made the Republican hold on the Senate more tenuous, as the vice presidency remained vacant for the remainder of Arthur’s only term and the Senate elected presidents pro tem from within its ranks.

Most recently, the 107th Congress (2001-2003) under President George W. Bush saw party control of the US Senate switch a total of three times in a two-year period.  The Republicans controlled the chamber from Inauguration Day on January 20, 2001 until June 6, 2001, when Republican Senator Jim Jeffords of Vermont became an Independent and agreed to caucus with the Democrats, switching the Senate from a 50-50 tie to 51-49 Democratic control.  It would remain that way for nearly the rest of the Congress, although the Democrats technically lost the majority in its final weeks, while the Senate was out of session. Minnesota Senator Paul Wellstone died in a small plane crash on October 25, 2002, and interim Senator Jean Carnahan of Missouri was defeated at the polls that November.  Her husband, Mel Carnahan, had been elected posthumously to the Senate in 2000, three weeks after he was killed (also in a small plane crash), and she had been appointed to the seat until the next regular Congressional election in 2002.  Since the Senate was not in session or doing any important business in November and December 2002, the party switch had no practical consequences.

One must hope that the aging Senate will not see the loss of members of either party, but a death could be politically significant, depending on its timing and the partisan affiliation of the deceased.  History suggests the Democrats, the party in the White House, are likely to lose seats, and thus control of the Senate, in the 2022 midterm elections.  But two-thirds of the seats coming up for election are now held by Republicans, and a number of veteran Republicans are not running for reelection. The Democrats could defy that pattern, win seats, and achieve a solid majority in the Senate for the third and fourth years of the Biden term.  Having a record of popular legislation will be essential in that effort. If the Democrats can’t accomplish enough through reconciliation, scrapping the filibuster may be their only chance.

]]>
Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/blog/154506 https://historynewsnetwork.org/blog/154506 0
The Roundup Top Ten for June 4, 2021

What We Believe About History

by Kristin Kobes Du Mez

"Understanding that beliefs have a history does not preclude a commitment to truths outside of history. But it does prompt believers to consider how historical forces and cultural allegiances may have shaped their own deeply held convictions."

Anti-Vaxxers are Claiming Centuries of Jewish Suffering to Look like Martyrs

by Sarah E. Bond

"Anti-vaxxers and anti-maskers would have us believe that the evil of being encouraged to get a vaccine is the same as the project of ethnic labeling and cleansing undertaken by the Third Reich. It appears at first a farcical analogy, but it’s not without its dangers."

Beyond the Nation-State

by Claire Vergerio

Much of what has been told about the rise of the nation-state from the Peace of Westphalia in 1648 is wrong. Reevaluating that history is essential for conceptualizing solutions to local and global problems that defy the logic of the nation-state.

The Racist Roots of Campus Policing

by Eddie R. Cole

Campus police forces often trace their origins to moments when Black demands for expanded housing opportunity clashed with universities' ambitions for expansion or desire to maintain white residential areas near their campuses. 

The Unbearable Easiness of Killing

by Arie M. Dubnov

"As a colleague justly commented, it is only helpful to call a situation ‘complicated’ if one is committed to unfolding the package, willing to examine its contents and prepared to be surprised by what one finds hidden inside."

How Cruelty Became the Point of Our Labor and Welfare Policies

by Gail Savage

The persistence of Malthusian thinking in social welfare debates is leading to policies that create needless suffering and a corrosion of the common bonds of humanity that sustain a society.

The Reconstruction Origins of "Black Wall Street"

by Alexandra E. Stern

Understanding Tulsa's Black Wall Street as a product of the rise and fall of Reconstruction helps us think more productively about how the Tulsa massacre speaks to the policy problems of racial justice.

It’s Time to Break Up the Ivy League Cartel

by Sam Haselby and Matt Stoller

Ivy League institutions have an unfair hold on the distribution of opportunity and on the diversity of ideas in America and the world. 

James Meredith Reminds Us that Powerful Movements can Include those with Very Different Ideas

by Aram Goudsouzian

Meredith’s historical meaning is slippery, but that very inability to pin him down can teach important lessons – not only for how to remember the 1960s, but for how to think about social change.

Race, Free Speech, and the Purge of Campus Blasphemers

by Jonathan Zimmerman

An adjunct literature instructor at St. John's University has fallen victim to an administration's desire to turn complex teaching challenges – like how to evaluate Twain's use of racial slurs in the context of satire – into simple rules.

]]>
Wed, 23 Jun 2021 11:57:58 +0000 https://historynewsnetwork.org/article/180445 https://historynewsnetwork.org/article/180445 0