The Army Warned Troops in 1945 of the Danger of Fascism. That Warning Rings True Today

On March 25, 1945, the United States Army issued “Fact Sheet #64: Fascism!” to promote discussion among American troops about fascism as the war in Europe drew to a close. Discussion leaders were alerted: “Fascism is not the easiest thing to identify and analyze; nor, once in power, is it easy to destroy. It is important for our future and that of the world that as many of us as possible understand the causes and practices of fascism, in order to combat it.”

It is worth revisiting the Army’s warnings as Donald Trump and MAGA Republicans denounce legal due process and threaten civil war.

The Army fact sheet addressed four key points for discussion:

(1) Fascism is more apt to come to power at a time of economic crisis;

(2) Fascism inevitably leads to war;

(3) It can come to any country;

(4) We can best combat it by making our democracy work.

The fact sheet described findings by war correspondent Cecil Brown, who toured the United States after leaving Europe. Brown discovered that most Americans he talked with were “vague about just what fascism really means. He found few Americans who were confident they would recognize a fascist if they saw one.” The War Department, concerned that such ignorance could allow fascism to emerge in the United States, issued recommendations for how to prevent it.

As a simple definition, the War Department described fascism as the “opposite of democracy. The people run democratic governments, but fascist governments run the people. Fascism is government by the few and for the few.” Fascists remain in power through “skillful manipulation of fear and hate, and by false promise of security . . . At the very time that the fascists proclaimed that their party was the party of the ‘average citizen,’ they were in the pay of certain big industrialists and financiers . . . They played political, religious, social, and economic groups against each other and seized power while these groups struggled against each other.”

The War Department acknowledged that the United States had

native fascists who say that they are ‘100 percent American’ . . . [A]t various times and places in our history, we have had sorry instances of mob sadism, lynchings, vigilantism, terror, and suppression of civil liberties. We have had our hooded gangs, Black Legions, Silver Shirts, and racial and religious bigots. All of them, in the name of Americanism, have used undemocratic methods and doctrines which experience has shown can be properly identified as ‘fascist’.

The War Department warned,

An American fascist seeking power would not proclaim that he is a fascist. Fascism always camouflages its plans and purposes . . . Any fascist attempt to gain power in America would not use the exact Hitler pattern. It would work under the guise of ‘super-patriotism’ and ‘super-Americanism’.

The War Department identified three attitudes and practices that fascists share in common. Fascists pit “religious, racial, and economic groups against one another in order to break down national unity . . . In the United States, native fascists have often been anti-Catholic, anti-Jew, anti-Negro, anti-Labor, anti-foreign-born.” Fascists also “deny the need for international cooperation” and that “all people — regardless of color, race, creed, or nationality — have rights.” They “substitute a perverted sort of ultra-nationalism which tells their people that they are the only people in the world who count.” Finally, for fascists, the “[i]ndiscriminate pinning of the label ‘Red’ on people and proposals which one opposes is a common political device.”

Learning to identify American fascists and detect their techniques was not going to be easy, but

it is vitally important to learn to spot them, even though they adopt names and slogans with popular appeal, drape themselves with the American flag, and attempt to carry out their program in the name of the democracy they are trying to destroy . . . In its bid for power, it is ready to drive wedges that will disunite the people and weaken the nation. It supplies the scapegoat — Catholics, Jews, Negroes, labor unions, big business — any group upon which the insecure and unemployed are willing to blame.

They become frightened, angry, desperate, confused. Many, in their misery, seek to find somebody to blame . . . The resentment may be directed against minorities — especially if undemocratic organizations with power and money can direct our emotions and thinking along these lines.

The goal of the fascist doctrine is to prevent “men from seeking the real cause and a democratic solution to the problem.”

Fascists may talk about freedom, but

freedom . . . involves being alert and on guard against the infringement not only of our own freedom but the freedom of every American. If we permit discrimination, prejudice, or hate to rob anyone of his democratic rights, our own freedom and all democracy is threatened.

New York's Education Wars a Century Ago Show how Content Restrictions Can Backfire

Matthew Hawn, a high school teacher for sixteen years in conservative Sullivan County, Tennessee, opened the 2020-21 year in his Contemporary Issues class with a discussion of police shootings.  White privilege is a fact, he told the students.  He had a history of challenging his classes, which led to lively discussions among those who agreed and disagreed with his views.  But this day’s discussion got back to a parent who objected.  Hawn apologized – but didn’t relent.  Months later, with more parents complaining, school officials reprimanded him for assigning “The First White President,” an essay by Ta-Nehisi Coates, which argues that white supremacy was the basis for Donald Trump’s presidency.  After another incident in April, school officials fired him for insubordination and unprofessional behavior.

Days later, Tennessee effectively outlawed such teaching statewide, placing restrictions on what could be taught about race and sex. Students should learn “the exceptionalism of our nation,” not “things that inherently divide or pit either Americans against Americans or people groups against people groups,” Governor Bill Lee announced. The new laws also required advance notice to parents of instruction on sexual orientation, gender identity, and contraception, with an option to withdraw their children.

Over the past three years, at least 18 states have enacted laws governing what is and is not taught in schools. Restricted topics mirror Tennessee’s, focusing on race, gender identity, and sexual orientation. In some cases, legislation bans the more general category of “divisive concepts,” a term coined in a 2020 executive order issued by the Trump administration and now promoted by conservative advocates. In recent months, Florida has been at the forefront of extending such laws to cover political ideology, mandating lessons teaching that communism could lead to the overthrow of the US government. Even the teaching of mathematics has not escaped Florida politics, with 44 books banned for infractions like using race-based examples in word problems.

In a sense the country is stepping back a century, to a similar hysteria that invaded New York’s schools during the “Red Scare” at the end of World War I, when fear of socialism and Bolshevism spread throughout the US. New York City launched its reaction in 1918 when Mayor John Francis Hylan banned public display of the red flag. He considered the Socialist Party’s banner “an insignia for law hating and anarchy . . . repulsive to ideals of civilization and the principles upon which our Government is founded.”

In the schools, Benjamin Glassberg, a teacher at Commercial High School in Brooklyn, was cast in Matthew Hawn’s role. On January 14, 1919, his history class discussed Bolshevism. The next day, twelve students, about one-third of the class, signed a statement that their teacher had portrayed Bolshevism as a form of political expression not nearly so black as people painted it. The students cited specifics Glassberg gave them – that the State Department forbade publishing the truth about Bolshevism; that Red Cross staff with first-hand knowledge were prevented from talking about conditions in Russia; that Lenin and Trotsky had undermined rather than supported Germany and helped end the war. The school’s principal forwarded the statement to Dr. John L. Tildsley, Associate Superintendent of Schools, who suspended Glassberg, pending a trial by the Board of Education.

Glassberg’s trial played out through May.  Several students repeated the charges in their statement, while others testified their teacher had said nothing disrespectful to the US government.  Over that period, the sentiments of school officials became clear.  Dr. Tildsley proclaimed that no person adhering to the Marxian program should become a teacher in the public schools, and if discovered should be forced to resign.  He would be sending to everyone in the school system a circular making clear that “Americanism is to be put above everything else in classroom study.”  He directed teachers to correct students’ opinions contrary to fundamental American ideas. The Board of Education empowered City Superintendent William Ettinger to undertake an “exhaustive examination into the life, affiliations, opinions, and loyalty of every member” of the teachers union.  Organizations like the National Security League and the American Defense Society pushed the fight against Bolshevism across the country.

After the Board declared Glassberg guilty, the pace picked up. In June, the city’s high school students took a test entitled “Examination For High Schools on the Great War.” The title was misleading: the first question was designed to assess students’ knowledge of and attitude toward Bolshevism. The instructions to principals said this question was of greatest interest and teachers should highlight any students who displayed an especially intimate knowledge of that subject. The results pleased school officials: only 1 in 300 students showed any significant knowledge of or leaning toward Bolshevism. The “self-confessed radicals” would be given a six-month course on the “economic and social system recognized in America.” Only if they failed that course would their diplomas be denied.

In September, the state got involved. New York Attorney General Charles D. Newton called for “Americanization,” describing it as “intensive instruction in our schools in the ideals and traditions of America.” Also serving as counsel to the New York State Legislative Committee to Investigate Bolshevism, commonly known as the Lusk Committee after its chairman, Newton was in a position to make it happen. In January 1920, Lusk began hearings on education. Tildsley, Ettinger, and Board of Education President Anning S. Prall all testified in favor of an Americanization plan.

In April, the New York Senate and Assembly passed three anti-Socialist “Lusk bills.” The “Teachers’ Loyalty” bill required public school teachers to obtain from the Board of Regents a Certificate of Loyalty to the State and Federal Constitutions and the country’s laws and institutions. “Sorely needed,” praised the New York Times, a long-time advocate for Americanization in the schools. But any celebration was premature. Governor Alfred E. Smith had his objections. Stating that the Teachers’ Loyalty bill “permits one man to place upon any teacher the stigma of disloyalty, and this even without hearing or trial,” he vetoed it along with the others. Lusk and his backers would have to wait until the gubernatorial election in November, when Nathan L. Miller beat Smith in a squeaker. After Miller’s inauguration, the Legislature passed the bills again. Miller signed them in May despite substantial opposition from prominent New Yorkers.

Over the next two years, the opposition grew.  Even the New York Times backed off its unrelenting anti-Socialist stance.  With the governor’s term lasting only two years, opponents got another chance in November, 1922, in a Smith-Miller rematch.  Making the Lusk laws a major issue, Smith won in a landslide.  He announced his intention to repeal the laws days after his inauguration.  Lusk and his backers fought viciously but the Legislature finally passed repeal in April.  Calling the teacher loyalty law (and a second Lusk law on private school licensing) “repugnant to the fundamentals of American democracy,” Smith signed their repeal.

More than any other factor, the experience of the teachers fueled the growing opposition to the Teachers’ Loyalty law. After its enactment, state authorities administered two oaths to teachers statewide. That effort didn’t satisfy Dr. Frank P. Graves, State Commissioner of Education. In April 1922, he established the Advisory Council on Qualifications of Teachers of the State of New York to hear cases of teachers charged with disloyalty. He appointed Archibald Stevenson, counsel to the Lusk committee and arch-proponent of rooting out disloyalty in the schools, as one member. By summer the Council had earned a reputation as a witch hunt. Its activities drew headlines such as “Teachers Secretly Quizzed on Loyalty” and “Teachers Defy Loyalty Court.” Teachers and principals called before it refused to attend. Its reputation grew so bad that New York’s Board of Education asked for its abolition and the President of the Board told teachers that they need not appear if summoned.

A lesson perhaps lies in that experience for proponents of restrictions on what can be taught today. Already teachers, principals, and superintendents risk fines and termination for violating laws that are ambiguous about what is and is not allowed. The result has been a chilling environment in which educators simply avoid controversial issues altogether. Punishing long-time and respected teachers – like Matthew Hawn, whom dozens of his former students defend – will put faces on the fallout from the laws being passed. How long before a backlash rears up, as it did in New York over Teachers’ Loyalty?

Was a Utah District's Decision to Remove the Bible from Shelves a Win for the Anti-Anti-Woke? History Says Maybe Not

The latest twist in America’s culture wars saw crowds at the capitol in Salt Lake City this summer, protesting a book ban from the elementary and middle school libraries of Davis County, Utah. Such bans are increasingly prevalent in American public life, with issues of race and sexuality proving especially controversial. In this instance, though, contention arose because an unexpected book was deemed too “violent or vulgar” for children.

The Davis School District’s decision to ban the Bible has riled many, but Utah’s case is not unprecedented. Although the cultural context has changed, controversy over scripture in America’s public schools dates back to the “Bible Wars” of the 1840s, when use of the Protestant King James Bible came under fire. In cities throughout the United States, Protestants clashed with Catholics over the Bible’s place in the nation’s nominally secular but culturally evangelical public schools. In Philadelphia, rumors that Catholics sought to ban the King James Bible from city classrooms sparked deadly riots in 1844, with over twenty killed and dozens injured.

In Utah—at the time of writing—the controversy has not yet triggered physical violence, although today’s “Bible War” is entangled with broader conflict. The angry ambivalence of cancel culture is well illustrated in the placard of one protestor at the Utah Capitol, urging lawmakers to “Remove Porn Not the Holy Bible.”

The Davis School District Bible ban stems from H.B. 374—a “sensitive content” law enacted by Utah’s State Legislature last year. This legislation, backed by activist groups such as Utah Parents United, targets “pornographic or indecent material,” and provides a fast track for the removal of offending literature. Davis had already banned such books as Sherman Alexie’s The Absolutely True Diary of a Part-Time Indian and John Green’s Looking for Alaska when it received an anonymous complaint in December 2022. “Utah Parents United left off one of the most sex-ridden books around: The Bible,” asserted the complainant. “You’ll no doubt find that the Bible (under state law) has ‘no serious values for minors’ because it’s pornographic by our new definition.” Tellingly, the Davis school board upheld this objection and removed the Bible, although this decision is under appeal. A similar complaint has since been lodged within the district against the Book of Mormon.

Support for the Utah Bible ban comes from unexpected quarters. Republican state representative Ken Ivory, a co-sponsor of H.B. 374, initially criticized the removal but has since reversed his position. Ivory admitted that the Bible is a “challenging read” for children. More to the point, he questioned whether the school library was the best place for them to encounter scripture. “Traditionally, in America,” he added, “the Bible is best taught, and best understood, in the home, and around the hearth.” Doubling down on his broader skepticism of public education, Ivory demanded that Utah school boards review “all instructional materials” for content, though he failed to address how such a sweeping assessment might work.

Ivory’s appeal to hearth and home hints at a deeper ideology, one that evokes the tradition of limited government and what Thomas Jefferson called the “wall of separation between church and State.” Such historical parallels, though beguiling, are misleading. Jefferson was neither the consistent partisan idealized by today’s libertarians, nor the die-hard secularist admired by critics of religion. On the contrary, his pragmatism was reflected in the Northwest Ordinance of 1787, which framed the territories between the Ohio River and Great Lakes as a political template for American expansion. This ordinance stated that “Religion, morality, and knowledge, being necessary to good government and the happiness of mankind, schools and the means of education shall forever be encouraged.” In so doing, it earmarked public lands for future schools and colleges, while accepting the generally porous boundaries then maintained between the pulpit and the classroom.

Even as the Northwest Ordinance established public education in the Midwest, immigrants from Catholic Europe were challenging the region’s dominant Protestant culture by the 1830s. Tensions peaked in Cincinnati, the urban hub of the Ohio Valley and America’s sixth-largest city by 1840. Cincinnati largely escaped the ethnic violence experienced in Philadelphia, but nativist demagogues flooded in by the score. Among them was Lyman Beecher, the notorious New England evangelical who strove to redeem the frontier for Christ. In his 1835 anti-Catholic tract, A Plea for the West, Beecher declaimed: “We must educate! We must educate! Or we must perish by our own prosperity.” Beecher demanded militant Protestant nationalism to stanch the foreign influence of Catholicism. The growing rivalry between the secular public and Catholic parochial school systems, which developed side by side through the early nineteenth century, only intensified such demands.

Rivalry between Cincinnati’s public and parochial schools culminated shortly after the Civil War. In 1869, hoping to appeal to Catholic and Jewish parents, the public school board voted to ban the King James Bible, which had been assigned “without note or comment.” Outraged citizens took to the streets and to the courts in protest. In Minor v. Board of Education (1869), Cincinnati’s Superior Court upheld plaintiff John D. Minor’s assertion that the board acted illegally. Many children, Minor insisted, “receive no religious instruction or knowledge of the Holy Bible, except that communicated as aforesaid in said schools.” In a dissenting opinion, Judge Alphonso Taft defended the board’s position. “This great principle of equality in the enjoyment of religious liberty, and the faithful preservation of the rights of each individual conscience is important in itself ... But in a city and State whose people have been drawn from the four quarters of the world, with a great diversity of inherited religious opinions, it is indispensable.” Ohio’s Supreme Court later overruled Minor v. Board of Education following appeal by the school district. The later ruling, Board of Education v. Minor (1873), “broke open the floodgates,” wrote historian Steven K. Green, “ushering in a national conversation about the meaning of separation of church and state.” Ohio became the first state to authorize (but not require) the banning of the Bible in public schools. The Buckeye State’s decision predated by nearly a century Abington School District v. Schempp (1963), whereby the U.S. Supreme Court banned Bible reading and the Lord’s Prayer in public schools, leading to complaints that “the Supreme Court has made God unconstitutional.”

Cincinnati’s Bible War exposed a nerve. In his Second Inaugural Address a few years before, Abraham Lincoln reflected on the Civil War: “Both sides read the same Bible and pray to the same God; and each invokes His aid against the other.” Goaded and consoled by the same text, Americans slaughtered one another. As historian Mark A. Noll argued in America’s Book (2022), “the importance of the Bible for explaining the meaning of America,” and “the importance of America for explaining the history of the Bible” are tightly woven motifs. Following the Civil War, “the inability of Bible believers to find common ground in the Book they championed as the comprehensive guide to all truth” signaled the demise of a distinctly Protestant “Bible civilization,” among other consequences, heralding a more multicultural—apparently more secular—nation.

As Utah’s controversy suggests, the Bible may have fallen from grace, yet it remains a potent symbol. No longer assigned as a devotional text in America’s public schools, its mere presence on library shelves remains incendiary. The context surrounding its removal has shifted from nineteenth century sectarianism to twenty-first century culture wars, but continuities ignite destructive passions. Cynics might contend that Utah’s anti-woke warriors have been hoisted on their own censorious petard. However tempting this conclusion, we should also recognize the bitterness of old wine in a new wineskin, as the Bible once more becomes a focus of partisan discord.

What to the Incarcerated Is Juneteenth?

Incarcerated laborers sew military uniforms under the UNICOR (Federal Prison Industries) program 

Juneteenth is a bittersweet day for Black people in prison holding onto the promise of freedom.

Let’s start with history. The Emancipation Proclamation -- issued by Abraham Lincoln on September 22, 1862, during the American Civil War -- declared that on January 1, 1863, all slaves in the Confederacy would be “forever free.” Unfortunately, that freedom didn’t extend to the four slaveholding states not in rebellion against the Union, and the proclamation was of course ignored by the Confederate states in rebellion. For the roughly 4 million people enslaved, Lincoln's declaration was symbolic; only after the Civil War ended was the proclamation enforced.

But the end of the fighting in April 1865 didn’t immediately end slavery everywhere. As the Union Army took control of more Confederate territory during the war, Texas became a safe haven for slaveholders. Finally, on June 19, 1865, Union General Gordon Granger rode into Galveston and issued General Order No. 3, announcing freedom for those enslaved. There were about 250,000 slaves in Texas when it became the last state to release African American bodies from the cruelest institution known to American history. By the end of that year, the 13th Amendment abolished slavery, mostly (more on that soon).

Darrell Jackson: My understanding of Juneteenth developed in prison

Juneteenth has long been a special day in Black communities, but I didn’t learn about it until I went to prison.

In the early 2000s, prisoners at Washington State Penitentiary in Walla Walla decided to hold a Juneteenth celebration. Because the Department of Corrections didn’t treat the day as special, Black prisoners used the category of "African American Cultural Event" (which had usually been used to celebrate Black History Month) as a platform to celebrate Juneteenth. The spirit of liberation moved through the incarcerated population, motivating other prison facilities across the state to follow suit. 

It seems hard to believe, but prior to my incarceration, I knew nothing about Juneteenth. I had taken classes that included American history, but the event apparently wasn’t part of my school's curriculum. I first heard about that history from other prisoners at Clallam Bay Corrections Center, where I was incarcerated in 2009, and that prompted me to learn more about the ways that African Americans mark Juneteenth. By the time I had entered the prison system, Black prisoners had expanded the celebration to include family, friends, and community members, creating an opportunity for prisoners to connect with loved ones in ways that regular visiting did not permit. We were able to choose what foods we ate and provide our own entertainment, using creative ways to communicate inspiring messages to the African American prison population and their families.

One memorable moment for me came in June 2012. The Black Prisoners Caucus was hosting the event, which had not happened since 2007, and a friend and I were asked to perform a few songs. It was my first vocal performance and I was extremely nervous. When we finished, my friend's 7-year-old daughter shouted from the audience, "Daddy, they killed it!" Though I didn't have any family present at the event, that little girl's endorsement etched a long-lasting smile on my heart. Her words had become a soothing balm in the face of the stress, self-doubt, depression, and anger I had felt as a prisoner throughout that year.

It’s important for the world to know that Juneteenth holds great significance for the Black bodies who are locked away in prison. But we shouldn’t forget that the 13th Amendment abolished slavery and involuntary servitude, “except as a punishment for crime whereof the party shall have been duly convicted.” For the men and women who are Black and in prison, that exception connects us to our ancestors who were in chains long ago. 

Antoine Davis: The conflict of celebrating freedom while in chains

I can only imagine the effect that June 19, 1865, had on the souls of those trapped in the most barbaric institution in American history. The hope for freedom, passed down from one generation to another, had finally come to pass. Tears from Black faces must have run like rivers, not from the pains of a master's lashes but from the joy of knowing that one’s momma and daddy, sons and daughters, family and friends would no longer live in bondage.

Such images of joy have run through my mind as I have celebrated Juneteenth with Black families and White families, all occupying the same space in the prison's visiting room. While our loved ones laugh and dance, eat and rejoice, the truth about what we celebrate creates for me a tension between joy and grief. The joy comes from recognizing how far we've come as a people. The grief comes from the reminder that while chattel slavery was abolished, a new form continues in a prison system that incarcerates African American people at an alarming rate.

Prisoners aren’t the only people who understand the injustice. For several decades, activists and academics have developed an analysis called “Thirteentherism,” which argues that the 13th Amendment created constitutional protection for the brutal convict-leasing system that former Confederate states created after Reconstruction and which evolved into today’s system of racialized mass incarceration.

Here’s just one statistic of many: In Washington state, 33 percent of prisoners serving a sentence longer than 15 years for an offense committed before their 25th birthday are Black. Black people make up 4.3 percent of the state's population.

The statistics that demonstrate the racialized disparities in prisons make me think of Devontae Crawford, who at the age of 20 was sentenced to 35 years in prison. Although he committed a crime with three white friends, Devontae ended up with more time than all his crime partners put together. Today, all three of them are free, and Devontae still has 30 years left to do in prison.

One of my closest friends, who asked to remain anonymous, also comes to mind. He was sentenced to 170 years in prison after his white friend shot a man during an altercation. Although his friend admitted to being the gunman, this prisoner remains incarcerated while his white friend was released after serving seven years.

Jackson/Davis: Still slaves to the system

As Black men in prison, we live the tension between celebrating the abolition of slavery and struggling inside the criminal justice system that replaced slavery. We prisoners who are left to deteriorate inside one of America's most inhumane systems are able to find joy in celebrating Juneteenth, but not without indignities.

For example, a number of us were told by the DOC that prisoners would have to pay an estimated $1,500 for this year's Juneteenth celebration -- $500 for food, not including the cost to our guests, and $1,000 to pay for additional guards. Juneteenth became a national holiday in 2021, and DOC officials decided that African American prisoners should cover the overtime and holiday pay for the extra guards deemed to be necessary for us to celebrate Juneteenth with our children. 

That’s a lot of money for any working folks, but consider what it means for people who make 42 cents an hour, maxing out at $55 a month. No matter what the job in prison, that’s our DOC wage. This means that if we African American prisoners want to include our children in celebrating the historical meaning behind June 19, we will be forced to give the prison facility 3,600 hours of labor. The irony is hardly subtle: Prisoners in Washington state who work at near-slave wages will have to pay to celebrate a day that represents freedom.
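To see where the 3,600-hour figure comes from, here is a back-of-the-envelope check using only the wage and cost quoted above:

\[
\frac{\$1{,}500}{\$0.42\ \text{per hour}} \approx 3{,}571\ \text{hours} \approx 3{,}600\ \text{hours}, \qquad \frac{\$1{,}500}{\$55\ \text{per month}} \approx 27\ \text{months of maximum pay}
\]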

We live with this injustice, but we African American prisoners find a way to maintain joy in the face of adversity. It's never easy to cope with the social conditions that are designed to diminish prisoners’ sense of their own value, but we keep on keeping on. If our ancestors were able not only to survive but also sometimes thrive in the face of enslavers’ disregard for their humanity, so can we. Something as simple as a 7-year-old girl shouting encouragement after a Juneteenth performance can be enough to keep us going.

African American prisoners have learned to embrace all the positives, celebrating freedom when we can while living in modern-day chains.

Comparing the Trump–DeSantis Race to the Republicans' 1912 Debacle is a Stretch... Right?

Leonard Raven-Hill, "For Auld Lang Syne" from Punch, 1912

And they’re off. With just a year and a half to go, Ron DeSantis has finally thrown his hat into the ring. Now the race for the GOP nomination truly begins. Nikki Haley, Asa Hutchinson, and a variety of other runners and riders are there but, for most people, this is a two-horse race between Donald Trump and DeSantis. This potential head-to-head has been a long time coming, and some think that DeSantis has left it too late. DeSantis was well ahead of Trump in GOP opinion polls shortly after the 2022 midterms, but now Trump has a commanding lead. However, we shouldn’t forget that a lot can change between now and November 2024.

Let’s go back a little, to see how polling looked for Trump in 2011 and 2015 (around a year and a half out from the presidential elections of 2012 and 2016).

In April 2011, Trump led a poll of GOP primary voters with 26 percent, more than ten points ahead of eventual nominee Mitt Romney. Much of this “Trump bump” was linked to his high-profile “birther” campaign demanding to see President Obama’s birth certificate. However, once Obama produced the document, Trump’s numbers swiftly dissipated, and he decided not to run. Conversely, in June 2015, Trump polled just 1 percent, leaving him in eleventh place out of 16 candidates in a survey of GOP primary voters. Of course, Trump looks to be in a much stronger position at the end of May 2023, with polls of primary voters putting him at over 50 percent and DeSantis trailing with around half of that. But remember, Trump won the nomination in 2016 despite polling at only 1 percent at the same stage of the nomination campaign, while four years earlier his numbers had collapsed after he gathered a significant lead.

So, let’s imagine, just for argument’s sake, that DeSantis stays the course. We get a broad slate of candidates (as we did in 2015-2016), and Trump isn’t the outright winner come the Republican National Convention. Let’s stretch our imaginations even further to see the GOP Convention tightly contested so that, in the end, DeSantis gets the nomination by the narrowest of margins. Trump, spurned, storms out and decides to run independently under the banner of the “Truth Party.” Come November, Trump picks up a number of states he won in 2020, and DeSantis takes Florida and a handful of flyover states for the GOP. Meanwhile, Biden wins by a mile, as the divided Republican vote lands him easy wins in Pennsylvania, Georgia and Ohio, and he even sneaks by in Texas. It’s an outlandish scenario, but for those of you with a long memory, it’s not quite the frantic fever-dream of a teacher overcome by too much grading that it might seem. I take you back to 1912….

In February 1912 – election year – Theodore Roosevelt (Republican president from 1901-1909) formally challenged the incumbent Republican President William Howard Taft for the GOP nomination. In Roosevelt’s mind, he had made Taft’s career; TR had appointed Taft to his cabinet while president and handpicked Taft as his successor. Roosevelt campaigned for Taft in 1908, had photographs taken with him, went the whole nine yards. In many ways TR felt he won the election for Taft. Yet, while in office, Taft disappointed Roosevelt. Always a larger-than-life personality, TR was never really going to favor retirement.

The 1912 Republican nomination campaign turned nasty. Roosevelt launched several stinging attacks on Taft. Taft, a TR true-believer and former friend, was wounded and was slow to resort to the same sort of name-calling as Roosevelt. The stage was set for a close race, and the result went down to the wire. That June, though, Taft wrested the nomination from Roosevelt at the convention. Roosevelt cried foul play, a corrupt stitch-up! He stormed out of the convention and weeks later ran a third-party “Bull Moose” campaign under the banner of the Progressive Party.

The 1912 election became a three-horse race – though special mention should go to five-time presidential candidate for the Socialist Party, Eugene Debs, who received over 900,000 votes (from a total vote of just over 15 million). Roosevelt won 88 Electoral College votes, including large states like Pennsylvania, while Taft got the measly 8 votes of Utah and Vermont. Between them, the erstwhile allies got over 50 percent of the popular vote, but the Electoral College saw Democrat Woodrow Wilson win in a landslide, with 435 votes out of 531.

As ever with these historical parallels, there are innumerable other variables that don’t mirror the past anywhere near as well. However, this comparison is not so much aiming to suggest that 2024 might be a full repeat of 1912, as to offer a glimpse into the danger that a full split could cause for the GOP if DeSantis and Trump really did divide the vote come November 2024.

Trump and DeSantis started as allies. Many, including Trump, felt that Trump’s backing won the Florida gubernatorial race for DeSantis in 2018. DeSantis appeared to be a Trump true-believer. Trump did not want to retire quietly, and he doesn’t seem to like how DeSantis has “betrayed” him. They are now running against each other for the nomination, and Trump has been criticising “Ron DeSanctimonious” for months, while DeSantis has remained largely passive in his response. There is an echo of the past here for sure, even if it’s a faint one thus far.

However, if things were to run their course, and the Convention looked like it might be closely contested… if the remaining court cases against Trump were to go against him, and the Republican Party threw its weight behind DeSantis to narrowly deny Trump the nomination… then it does not seem quite so far-fetched that Trump could run as an independent. Maybe, just maybe, 2024 might see more echoes of 1912 when it arrives. If so, then President Biden will no doubt be happily ordering copies of James Chace’s 1912 or Lewis Gould’s Four Hats in the Ring, and merrily assessing his chances in a three-horse race come next November.

Can the Left Take Back Identity Politics?

Members of the Combahee River Collective, 1974. Included are (back row, l-r) Margo Okazawa-Rey, Barbara Smith, Beverly Smith, Chirlane McCray, and Mercedes Tompkins; (front row, l-r) Demita Frazier and Helen Stewart.

The Combahee River Collective

“We were asserting that we exist, our concerns and our experiences matter,” said Black feminist activist Barbara Smith in an interview she gave almost four decades after the publication of the seminal Combahee River Collective Statement, credited as the first text where the term “identity politics” is used. “We named that ‘identity politics' because we said that it is legitimate to look at the elements of one’s own identity and to form a political analysis and practice out of it.”

The Combahee River Collective was a Black feminist lesbian socialist organization active in Boston from 1974 to 1980. The Collective took its name from a military expedition at the Combahee River in South Carolina planned and carried out by the abolitionist Harriet Tubman on June 2, 1863. The raid, which freed some 750 slaves, was the first military campaign in American history led by a woman. When asked to describe her work with the Combahee Collective in Boston, Smith said, “I think it was really fated that I ended up there. In Boston there's something about the size and the scale of the city that made it more possible for those of us who were like-minded to find each other.”

But the Collective's impact extended far beyond the local activist scene, thanks to its widely circulated statement of principles. Written by Barbara Smith, her sister Beverly Smith, and Demita Frazier in 1977, the statement was published in 1979 in Zillah Eisenstein's anthology Capitalist Patriarchy and the Case for Socialist Feminism, and has since become one of the foundational texts of Black feminist thought:

Our politics initially sprang from the shared belief that Black women are inherently valuable. This focusing upon our own oppression is embodied in the concept of identity politics. We believe that the most profound and potentially most radical politics come directly out of our own identity ... In the case of Black women this is a particularly repugnant, dangerous, threatening, and therefore revolutionary concept because it is obvious from looking at all the political movements that have preceded us that anyone is more worthy of liberation than ourselves. We reject pedestals, queenhood, and walking ten paces behind. To be recognized as human, levelly human, is enough.

This was indeed a very different understanding of identity politics than the hollowed-out versions that dominate public debate today. First, it refused the idea of comparing and ranking oppressions, focusing instead on the particularity of each lived experience. “We actually believed that the way you come together is to recognize everyone fully for who they are,” Smith said, “as we work toward common goals of justice and liberation and freedom.” This opened the door to cooperation and coalition-building, including with those who don't resemble, or necessarily agree with, us. Second, it rejected single-issue politics by pointing to the “interlocking” nature of major systems of oppression. This was in fact the reason the Combahee statement was written in the first place: to point to the failure of the Civil Rights movement, Black nationalism and White feminism to sufficiently address the realities of Black lesbian women.

But the statement didn't prioritize the liberation of one group of people over any other, and proposed what was effectively a new model of social justice activism — foregrounding what would later be called “intersectionality.” Oppressions were multilayered and experienced simultaneously, and that required multi-issue strategies that reject a rights-only agenda. And third, the Combahee vision was unabashedly internationalist and anti-capitalist. The members of the Collective were actively involved in the anti-war movement, for they considered themselves to be, in the words of Barbara Smith, “third world women”: “We saw ourselves in solidarity and in struggle with all third world people around the globe.” Growing out of the organized Left, they defined themselves as socialists, and believed, as their statement put it, “that work must be organized for the collective benefit of those who do the work and create the products, and not for the profit of the bosses.”

Till Identity Do Us Part

But times have changed, and not for the better. A new type of identity politics was forged on university campuses, one that didn't fully grasp the connection between theory and practice, or engage the bread-and-butter issues that affect all women. This narrow version “was used by people as a way of isolating themselves, and not working in coalition, and not being concerned about overarching systems of institutionalized oppression,” Barbara Smith said, expressing her discontent with the ways in which identity politics was reconfigured by the campus Left. “Trigger warnings and safe spaces and microaggressions — those are all real, but the thing is, that’s not what we were focused upon.” Like other groups of Black women who were organizing around Black feminism, Combahee was “community-activist based. Focusing on looking at real issues affecting all Black women, which includes poor Black women.”

Demita Frazier, another co-author of the Combahee statement, concurred. Part of the problem is “the commodification of everything,” including identity politics, which was completely detached from its anti-capitalist origins. This was because of the way it was co-opted by academics, she added: “I wouldn’t say co-opted if it weren’t for the fact that there’s still this big divide between practice and theory, right? I mean, I’m glad that the children and the young’uns are getting educated, but it looks like a factory to me right now.”

This brief excursion into history, and the reflections of the veteran activists of the Combahee River Collective on the legacy of their statement, provide several insights into the problems that plague current understandings of identity politics. The radical identity politics of campus activists, Diversity, Equity and Inclusion trainers and anti-racism gurus is everything that the identity politics of the Combahee River Collective is not. The new upgrade is profoundly narcissistic, and focuses on perceived individual harm at the expense of structural injustices; it establishes hierarchies of oppression by resuscitating the theological concept of “eternal sin,” which is then imputed to certain groups of people who are expected to devote a certain percentage of their daily lives to confess and repent (after all, no salvation without self-flagellation!); it interjects the term “intersectionality” here and there as a catchphrase, but treats identities as if they are fixed, insulated categories with no internal hierarchies or divisions; it disparages the idea of universal values or human rights, treating them as tools for domination invented by the powerful to maintain the status quo; it sees no allies, and it seeks no allies; it is thus “separatist,” in the sense in which Barbara Smith used the term. “Instead of working to challenge”, Smith said, “many separatists wash their hands of it and the system continues on its merry way.”

“This Bridge Called My Back”

For the Combahee women, identity politics was about politics, and identity was one way of doing politics and challenging hierarchies. For the campus Left, identity politics is about identity, and identity is beyond politics. It's a sacred value that needs to be preserved intact, at all costs. The questions of who defines a particular identity, or what causes harm, are left unanswered. In that sense, early critics of radical identity politics, Marxists and liberals alike, were right, but only partially. It's true that for the campus Left, “symbolic verbal politics” was the only form of politics that was possible. But today, even verbal politics is out of bounds. Terms are not discussed but dictated; truth, in an ironic twist, is no longer relative but absolute. Paradoxical as it may sound, new identity politics is “anti-politics” — not only in the conventional sense of alienation from or distrust in mainstream politics but also in the broader sense of how we understand “the political,” as a space of contestation. The current obsession with privilege closes up that space, ruling out the possibility of dialogue and building alliances. In such a scheme, anyone who criticizes dominant progressive orthodoxies is branded as a “useful idiot,” advancing or unwittingly enabling a right-wing agenda. White progressives, Black conservatives, centrists or bona fide liberals are considered to be more harmful to the cause of social justice than explicitly racist modern day Ku Klux Klanners. It may well be so. But what does this mean, politically speaking? Are we not supposed to reach out to fellow progressives or, indeed, regular people, and explain to them that in a society built on White values, colorblindness may not be the best way to achieve racial equality? And if we cannot even speak to the progressives, how are we going to convince the conservatives, reactionaries, or overt racists who still constitute a substantial part of any given society?

The Combahee women who coined the term identity politics knew the answer to these questions because they were doing political work and consciousness-raising in the real world, with women of all colors and walks of life, not peddling virtue in sterilized boardrooms or slick vodcasts. They were guided by the motto “This Bridge Called My Back” (which was later to become the title of a ground-breaking feminist anthology edited by Cherrie Moraga and Gloria E. Anzaldúa), which they saw as the key to success. “The only way that we can win — and before winning, the only way we can survive,” said Barbara Smith, “is by working with each other, and not seeing each other as enemies.”

California's Collusion with a Texas Timber Company Let Ancient Redwoods be Clearcut

Old-growth redwoods, Headwaters Forest Reserve (CA)

In 1985, fresh out of college, I took a job as a reporter in my hometown of Guerneville, a small village set alongside the Russian River in western Sonoma County, California. Among my many beats was the North Coast timber industry. At first I focused on the logging of second-growth redwood forests by international timber giant Louisiana-Pacific. Yet just three months after I’d landed that reporter’s job, a Houston company called Maxxam completed one of the most consequential corporate buyouts of the era by leveraging $754 million in high-yield “junk bonds” to purchase the 117-year-old Pacific Lumber Company.

At the time, Pacific Lumber owned the very last private inventory of old-growth redwood forest still standing outside of parks. The company’s 200,000 acres of forestland included 8,000 acres of virgin redwood and 56,000 acres of “residual” old-growth redwood forest. It was the single largest expanse of old-growth redwood remaining in the world, inside or outside of parks.

At first I investigated Maxxam’s redwood liquidation as a reporter. But after I visited the targeted forests I quit my job and moved to Humboldt County, a coastal outback two hundred miles north. Searching for a means of saving these ancient redwood groves, I co-founded a North Coast chapter of the radical environmental movement Earth First!. Our small group of determined activists camped high into the canopy of old-growth redwood trees, blockaded roads, and occupied the Golden Gate Bridge to draw national attention to what turned out to be the illegal destruction of the last ancient redwoods.

When I wasn’t dangling from trees, I was researching and publishing, which turned up several startling discoveries. It was shocking enough that a Houston oil and real estate company could employ the services of three soon-to-be-convicted financial felons—Ivan Boesky, Boyd Jeffries, and, at the notorious junk-bond firm of Drexel Burnham Lambert, Michael Milken—to secure a hostile takeover of, and immediately set to liquidating, the world’s last ancient redwoods. But what was especially jarring was the understanding that the California Department of Forestry (CDF), which was charged with enforcing statutes designed to protect soils, streams, and habitat, would approve every one of Maxxam’s destructive timber harvest plans (THPs) no matter that most of them violated several state and federal environmental laws.

Prior to the takeover of Pacific Lumber, Maxxam’s legal eagles no doubt understood that in 1970 California lawmakers had passed an unusually effective environmental law called the California Environmental Quality Act (CEQA). Three years later the state also passed the California Forest Practice Act (FPA). At the time the Forest Practice Act was touted as a means of curbing abusive logging practices, particularly in the redwoods. More accurately, the Forest Practice Act allowed timber companies to evade the far more restrictive requirements of CEQA—no matter that such evasion was illegal, as several court challenges would prove.

CEQA required that a company proposing a “project” in California must first provide an environmental analysis that takes into consideration the potentially “significant, cumulative, adverse environmental effects” of that project when considered alongside “past, present, or reasonably foreseeable future” development plans in the area. This was a very high bar, especially when applied to logging, among the most cumulatively destructive activities in California at the time. Properly enforced, CEQA would have prevented the scorched-earth logging that has always occurred in California, particularly in the state’s minuscule stands of remaining ancient forest. In contrast, the Forest Practice Act required that private timber companies submit just a short, check-box “timber harvest plan” whose level of environmental review was cursory at best. The Forest Practice Act reconstituted the California Department of Forestry as an enforcement agency, but in reality CDF served as a bulwark to assuage and fend off public challenges to destructive logging.

In 1985 the Humboldt County-based Environmental Protection Information Center (EPIC) penetrated the bulwark by securing a state appellate court decision against timber giant Georgia-Pacific Corporation and CDF for submitting and approving, respectively, a timber harvest plan designed to clear-cut the very last 75-acre grove of virgin redwood forest (called Sally Bell Grove) on California’s famous Lost Coast. The court ruled that timber companies and CDF must comply not just with the Forest Practice Act, but also the “cumulative effects” clause of CEQA.

Nonetheless, a confederacy of state officials—from the nine-member state Board of Forestry (made up primarily of loggers, ranchers, and academics loyal to the timber industry), to the Director of CDF in Sacramento, to CDF foresters on the ground—almost to a person refused to properly enforce CEQA, or even to consider it. Department of Forestry officials were largely culled from the same forestry programs—at Humboldt State University and UC Berkeley—as were corporate foresters. They all knew each other, and some of them were related. At the time of the Maxxam takeover of Pacific Lumber, the state foresters were also under pressure from Republican Governor George Deukmejian, a great friend of the timber industry, to maintain a heavy cut. State officials stuck with the weak provisions of Forest Practice Act and approved every Maxxam timber harvest plan with dizzying haste.

At the end of 1986 I compiled a list of the fifty-two timber harvest plans that CDF had approved that year for Maxxam, for a total of 10,855 acres—more than double PL’s acreage of the previous year. Of this expanse, 9,589 acres contained old-growth forest, virgin and residual. There would be no more selective logging. This was an appalling rate of cut. Every standing old-growth tree—hundreds of thousands total—would be leveled as the amount of lumber that Maxxam extracted from the redwoods actually tripled. The cumulative environmental effects of this liquidation were certainly significant; more so, they were extreme. Whole watersheds began unraveling as ten timber crews carved abusive roads and denuded miles of terrain in the most rapid acceleration of old-growth redwood logging in history. Rivers and creeks filled with up to fifteen feet of sediment, and salmon, along with all manner of terrestrial wildlife, disappeared.

By 1986, thanks to the EPIC lawsuits against Georgia-Pacific, CDF foresters were now forced to provide at least a cursory CEQA review of logging plans. Cursory became Orwellian. In approving a Maxxam clear-cut of 125 acres of virgin redwood, a CDF forester addressed CEQA’s questions of “significant, cumulative, adverse environmental effects” in this way:

Tractor logging and new road construction will contribute to surface soil erosion, but it is unlikely to be significant at this time. Mass soil movement may happen but to say that it will be significant is to [sic] early. New road construction and tractor logging may somewhat decrease water quality but only for a short time period. It cannot be judged at this time if it will be significant. This stand of old-growth timber has direct access to the public, however it cannot be judged at this time if aesthetics would be significantly impacted. No endangered species were noted during the inspection. Old-growth timber has been noted to shelter all types of plants and animals. Unless one observes these species it cannot be judged if any significant impact would occur.

Department of Forestry officials specialized in throwing wildlife under the bulldozer. Incredibly, they often argued in Orwellian fashion that wildlife would “benefit” from the destruction of the extremely rare virgin redwood groves held by Pacific Lumber. “A general short-term improvement for wildlife habitat is seen from this plan,” CDF wrote of a 294-acre old-growth redwood logging plan on Chadd Creek, adjacent to Humboldt Redwoods State Park. Likewise, a “possible minor improvement to wildlife habitat” would result from an 88-acre clear-cut of old growth on Corner Creek, and “wildlife habitat may be improved” after 309 acres of old-growth clear-cutting on Larabee Creek. On Strongs Creek, where Pacific Lumber would clear 760 acres of old-growth redwood, “Some deer habitat will be improved.”

In May 1987 I attended a CDF “review team meeting,” during which state foresters examined two Maxxam timber harvest plans that proposed clear-cutting 274 acres out of the heart of 3,000-acre Headwaters Forest. I had discovered and named Headwaters Forest just two months earlier. Headwaters was the largest remaining island of virgin redwood forest still standing outside of parks; the value of its old-growth habitat was undeniable. Headwaters Forest stood as the only ancient forest remaining in the Humboldt Bay watershed, and it was one of the most important breeding areas for the endangered marbled murrelets.

During the review team meeting, I asked Stephen Davis, the Humboldt County-based CDF forester who reviewed Maxxam’s new logging plans, about the cumulative effects of logging Headwaters Forest. Davis was sanguine. He saw nothing amiss in the ecological dismantling of the grove. On paper, prior to the meeting, Davis had whipped through CDF’s “Cumulative Impacts Checklist” for the THP (positioned as an afterthought at the end of the review document after EPIC’s court victory) as if he were renewing a driver’s license. A checklist question asked whether the clear-cutting would cause “significant adverse cumulative environmental effects…to fish or wildlife or their habitat.” Davis had answered, “No. Minimal impacts will occur to these values; some wildlife may benefit.”

Davis was sitting directly across from me. I pulled out a cassette recorder, turned it on, and placed it on the table in front of Davis. Reading over Davis’s cumulative impacts checklist, I asked him, in the language of the California Environmental Quality Act, if clear-cutting nearly 10 percent of the world’s largest remaining unprotected grove of ancient redwoods wouldn’t significantly, cumulatively, and adversely impact the rare, threatened, and endangered species that depended on the grove for survival.

“I don’t think so,” said Davis.

“You don’t think so?” I asked.

“No.”

“What makes you not think so? Once the old-growth habitat is gone, how will the wildlife species that depend on that habitat survive?”

“What habitat are you speaking of?”

“The old-growth-forest habitat.”

“Who?”

“This”—I pounded my fingers into a THP map that was on the table—“this old-growth-forest habitat.”

Davis said, “I don’t think there’s a cumulative effect on those.”

“You don’t think so? You don’t think that by eliminating old growth in general, old-growth-dependent species will also be eliminated?”

“There’s plenty of habitat out there.”

The Department of Forestry quickly approved the THPs. Just as quickly, EPIC sued. The organization won this and nearly every lawsuit that it brought against Maxxam and CDF, exposing the willingness of both a voracious Houston corporation and California officials to violate state environmental laws (EPIC would also soon win a federal lawsuit against Maxxam that enforced the Endangered Species Act). But EPIC could only litigate against individual plans—the courts refused to consider a lawsuit targeting a company’s entire operation, no matter the obvious cumulative devastation.

In 1999, the state and federal governments paid Maxxam $480 million for 3,000-acre Headwaters Forest. It was an extraordinary sum in the face of lawsuits that had locked up the grove and rendered its actual value closer to $50 million. After the deal, Maxxam continued cutting virtually every remaining tree of value on its 200,000 acres until, as if on cue, in January 2007, Pacific Lumber declared bankruptcy.

In two decades Maxxam had liquidated nearly all of Pacific Lumber’s assets, valued at between $3 billion and $4 billion. Twelve hundred employees lost their jobs. Pacific Lumber still owed bondholders $714 million, virtually the same debt incurred at the time of the junk-bond takeover in late 1985. Nonetheless, Maxxam, thanks to profits realized in the liquidation of Pacific Lumber, had made the Fortune 500 list eight times between 1989 and 1998. It was a sordid, though long predicted, ending to a brief history of ancient redwood liquidation.

This excerpt of The Ghost Forest is published by permission of Public Affairs. 

Dangerous Records: Why LGBTQ Americans Today Fear the Weaponization of Bureaucracy

Prisoners at Sachsenhausen concentration camp wear triangle badges indicating the nature of their offenses against Nazi social code (pink would indicate homosexuality). National Archives and Records Administration, 1938.

The recent rise of far-right political movements in the United States and globally has prompted historical comparisons to the Nazis. The atrocities committed by the Nazis have been studied widely, particularly in reference to the Jewish victims of the Holocaust, but it is also important to understand lesser-known victims and the ways that prior discrimination affected their persecution. In focusing on the pre-war experience, it is crucial to understand how the Nazis relied on bureaucratic information to know whom to target, especially when the classification was not an obvious ethnic or religious one (as with assimilated and secular Jews, or with gay men, lesbians, and others persecuted for gender or sexual behavior). Today, there are important lessons to learn about the dangers that bureaucratic information gathering, combined with escalating prejudice and vilification, could present.

The rise of the Nazi party in Germany brought several laws restricting access to literature, as well as laws governing the treatment of what we today would refer to as LGBTQ+ people. Paragraph 175, a law criminalizing same-sex male relationships, was established in 1871, but the Nazi party revised it to encompass a broader range of punishable acts. Queer men were targeted early in the Nazi regime, which placed heavy blame on them for losing the First World War. Nazi ideology justified discrimination and repression by claiming that a lack of masculinity was a contributing cause of the country’s downfall and economic depression. Though only half of the 100,000 men arrested for the alleged crime of homosexuality were prosecuted, this figure is still large enough to raise the question of how the Nazis knew whom to target and where the information was coming from. Political factors appear to have been involved, because a majority were prosecuted within six weeks after Heinrich Himmler’s assumption of control of internal security in 1943. Each man came to the attention of authorities in one of a few similar ways: a report by a private individual, a police raid, or the “Pink List.”

The practice of bureaucratic organizations gathering information about members of minority groups has a startling history of being used for oppressive ends, particularly by the Nazis. A clear example is the Nazis’ utilization of the “Pink List,” compiled from the records of support organizations such as the Scientific Humanitarian Committee and from reports by private individuals, and then held by the police. The Scientific Humanitarian Committee aimed for “Justice Through Science” and espoused the biological theory of homosexuality, the idea that sexuality is an innate biological feature rather than a characteristic of weakness or psychological deviance. The SHC was targeted by the Nazi party early in Hitler’s rise because of its advocacy for homosexuals. The SHC kept lists of homosexual Germans for support and scientific purposes, but the Nazis seized those lists and used them to target the people named on them.

The story of Pierre Seel offers a clear example of the danger that could befall a young gay man who interacted with police on any other matter. Seel arrived at his local police station to report a stolen watch and, when questioned about the specific circumstances, revealed that he had come from Steinbach Square, a well-known hangout for gay men seeking each other's company. After intense questioning, he was released and assured that nothing would come of the compromising information, but three years later he was arrested as a suspected homosexual because of the list on which he had been placed after leaving the police station. That list had been compiled by police and security forces over the years and augmented by the confessions of imprisoned gay men, who were raped and tortured to compel them to add more names. The Pink List is a clear example of how dangerous information that categorizes someone into a minority group can be, particularly in the hands of those in power with ill intentions.

While the Holocaust is an unmatched and exceptional example of cruelty and systematic persecution of social outgroups, it is nevertheless important, even crucial, to recognize similarities between those events and the present, especially where prejudices join with bureaucratic state power. Today, transgender Americans are being framed as deviants, accused of undermining traditional gender roles, and described as “groomers” and child sex abusers. Armed vigilantes have harassed people attending drag performances, and activists are seeking to remove books about gender and transgender experiences from schools and libraries. When the power of the state aligns with these expressions of prejudice and identification of outgroups as a threat to children, family and society, there is real cause for concern.

Anti-LGBTQ sentiment has been particularly vociferous in Texas. Texas Attorney General Ken Paxton’s recent request for a list of individuals who have changed their gender on state-issued driver’s licenses, as well as on other departmental documents, has concerning similarities to the “Pink List” compiled by Nazi officials in 1930s Germany. The request for the list itself made transgender Texans subjects of surveillance, implying that the state views them as dangerous. According to an email sent on June 30, 2022 by Sheri Gipson, the chief of the DPS’s driver license division, the Attorney General’s office “wanted ‘numbers’ and later would want ‘a list’ of names, as well as ‘the number of people who had a legal sex change’.” This first request produced over sixteen thousand results. Unfortunately for the Attorney General, it was difficult for the state agencies to meet his request. One issue involved gender changes made to correct filing mistakes (cases where a cisgender person’s gender had been recorded inaccurately and the change simply affirmed their identity). A subsequent attempt narrowed the data to court-ordered document changes only, which would identify transgender people specifically. Although the agency could not accurately produce this data, this episode, alongside the various laws being introduced throughout the state such as the prohibition of gender-affirming care and the limiting of LGBTQ+ lessons in school, raises the startling question of what damage such information gathering could do, not only now but in years to come.

The weaponization of personal information available to state organizations should not be taken lightly. It has presented, and will continue to present, danger to those targeted by the state as threats. Laws targeting transgender children by restricting their access to gender-affirming care or to affirming ideas in books have become commonplace in several Republican-led states, but an explicit attack on legal adults raises the question of where this will stop and who will stop it. These laws send a clear message that the right does not want transgender people to have a presence in society, either in everyday life or in the media surrounding them. The proposed laws restricting gender-affirming care, classifying the parents of transgender children who receive such care as child abusers, limiting LGBTQ+ lessons in school, and banning books and media that showcase queer people all attempt to erase the queer experience from modern life as well as from history.

All of these efforts depend on being able to identify those who are not living with the gender assigned to them at birth. Bureaucratic records may not be considered dangerous by the public, but the ability of government officials to access the records of those whose place in society they are seeking to erase can lead to dangerous consequences in the future. Other vulnerable groups will be targeted, and it is necessary to examine the historical implications and repercussions of the blatant targeting of these groups.

The Modern Relics in Crow's Cabinet of Curiosities

Senator Sheldon Whitehouse (D-RI) points to a painting commissioned by Harlan Crow depicting a meeting at Crow's vacation retreat including Federalist Society head Leonard Leo (2nd from left), Justice Clarence Thomas (2nd from right) and Crow (far right)

Who is Harlan Crow? As questions mount about Supreme Court Justice Clarence Thomas’s alleged failure to disclose significant gifts (and attendant concerns about his integrity multiply), his principal benefactor has achieved a certain, curious fame. Until recently Harlan Crow, despite his enormous wealth and influence, remained a relatively obscure Dallas billionaire. Now, many want to know why he has lavished so many gifts on Justice Thomas, including a Bible once owned by the great abolitionist Frederick Douglass, an invaluable piece of Americana and an American relic.

For me, and for many others, the most fascinating aspect of Crow’s new celebrity is his controversial penchant for collecting rare—and sometimes disturbing—historical objects. These include things we might call “atrocious relics.” In my recent book, American Relics and the Politics of Public Memory, I wrestle with such matters. Why do we collect relic-like things? What do they mean? What do they “do” or “say”—to those who possess them and to those who view them? Relics can be whimsical, glorious, or sober, but they are also volatile and sometimes alarming and offensive.


What is a “relic”?

A relic is commonly defined as a material object held in reverence by believers because it is linked to a holy person. In medieval Christendom, relics—blood and bones of saints, pieces of the “true cross,” and other sacred traces—gave power to their possessors and access to the divine. Their presence elevated and sanctified churches and communities, helped mold worshippers’ identities, and fixed them in a larger Christian world.

In our more secular modern world, relics endure and perform some of the same functions. Prized vestiges of former times, souvenirs or mementos connect us directly to the past. They do not merely illustrate it; they physically embody it, its glory and triumph, sometimes its tragedy or even horror. Relics are the past, persisting in our present.

Important public relics seemingly possess an ability to speak firsthand, to communicate authentically, wordlessly, emotionally, compellingly. They are both the argument and the evidence, veritable “smoking guns.” Sometimes they look ordinary. Who cares about some old, unremarkable fountain pen, until we learn that Lincoln used it to inscribe the Emancipation Proclamation in 1863? What’s the big deal with some old, tattered book, until it’s revealed as the Bible once owned (before Crow and Thomas) by Frederick Douglass? Through such things, we are uncannily linked to “history.”

Crow’s Nest

Harlan Crow has accumulated lots of such stuff at his Highland Park estate—astonishing stuff—including (randomly) a letter written by Christopher Columbus, a silver tankard crafted by Paul Revere, the deed to George Washington’s Mount Vernon, Dwight D. Eisenhower’s helmet, adorned with five stars, a cannonball from the Battle of Gettysburg, and much, much more.

But mingled among these American treasures are linens, medallions, and other Nazi artifacts and memorabilia, as well as an autographed copy of Hitler’s hateful tome Mein Kampf and two of his paintings, landscapes distinctive because of their artist, not their artistry. The manor’s grounds include a sculpture park arrayed with statues of notorious Communist leaders, a so-called “garden of evil” populated by Marx, Lenin, Stalin, Tito, Castro, Ceausescu, and other villains perhaps more obscure but nonetheless malignant, such as Gavrilo Princip, the assassin of Archduke Franz Ferdinand whose act precipitated World War I.

Why would Harlan Crow harbor such things? Of course, they are rare and valuable commodities, which might command a considerable price if sold, and which conspicuously display the inestimable fortune of their possessor. They are the prizes of Crow’s wealth. But his collection is not merely an investment, uncurated, or randomly compiled. These things hold meaning beyond their financial valuation, and they help define the man who owns them. If Crow tells stories through them, they tell stories about him.

Maybe Crow’s despots in bronze and stone function like big game trophies, displaying dominance over one’s quarry or foes. Or maybe they are a snarky, conservative troll to antagonize liberal critics, representing Crow’s supremacy over his opponents. They allow him, literally, to crow. Defenders argue that such collections are benignly didactic, marking the triumph of good over evil and reminding us of what to hate. In fact, new sorts of institutions—memorial museums—emerged after the Second World War that were designed to confront evil, to teach, memorialize, and heal in the wake of cataclysms, the Holocaust most prominently. But these institutions commemorate victims, not perpetrators like those assembled by Crow. Despite the rationales, Crow’s garden of evil does not teach or heal. It pays implicit homage to the evildoers and their power, deadening viewers to the full measure of their horrific ideas and acts.

It’s not really possible to renovate disgraced public monuments, unlike structures or institutions saddled with an unfortunate name, which can be changed and repurposed. Fort Benning recently became Fort Moore; Fort Bragg, Fort Liberty; Fort Hood, Fort Cavazos. But a statue of Robert E. Lee or Josef Stalin is inescapably a statue of Lee or Stalin. Neither can be rehabilitated by unilaterally rechristening them Martin Luther King or Lech Walesa. Crow doesn’t try and likely doesn’t care.

Crow’s unnerving monuments and memorabilia connect us to a reprehensible past, revivifying that which is sinister and frightening and, even for Crow perhaps, sordid and shameful. As one visiting reporter noted, the Nazi artifacts are placed in cabinets, “out of the view of visitors,” controlling their ability to “say” indiscreet things. Such materials evoke the lynching postcards and other grisly souvenirs once prized by white supremacists, kept privately as racist talismans. Broader public scrutiny transformed them into appalling objects, atrocious relics. Recent revelations thus pose some uncomfortable questions. Has Crow collected Thomas? And what do his relics say about him, and about us?

Texas Judge Revives Anthony Comstock's Crusade Against Reproductive Freedom

In April, a Texas judge ruled invalid the Food and Drug Administration’s approval of a pill used in over half the abortions in America.  Going further, he invoked the federal Comstock Act to declare it “nonmailable.” Twenty Republican Attorneys General promptly warned pharmacy chains to halt its sale.  Such sales would violate a law initiated 150 years ago by a Connecticut farm boy turned dry goods salesman beginning his battle against reproductive rights.

From an early age, Anthony Comstock showed his moralistic zeal.  At eighteen, he broke into a gin mill near his family’s farm and drained the liquor onto the floor. Enlisting after Gettysburg, he fought his fellow soldiers’ vices – liquor, lust, swearing, breaking the Sabbath – as vigorously as the Confederates.  Moving to New York, he futilely tried to jail a smut dealer loaning obscene books to schoolboys.

It was against the “hydra-headed monster” of smut that he made his first big kill.  On March 2, 1872, he and a police captain raided booksellers along Manhattan’s Nassau Street, the heart of America’s smut industry.  In one shop, he purchased The Confessions of a Voluptuous Young Lady of High Rank. In others, he bought Women’s Rights Convention and La Rose d’Amour.  Evidence in hand, the pair secured warrants from a judge who agreed the books were obscene.  Returning to Nassau, they arrested eight culprits and confiscated five bushels of obscene merchandise.

Later that month, Comstock targeted a crime catering more to women, and which he considered an immeasurably greater evil.  Smut merely inspired lust.  This crime enabled it.  His specific target was a man, Dr. Charles Manches.  But the services Manches offered helped women overcome the safeguards God had built to control their passions:  the fear that could make a woman on the brink stop and preserve her chastity.

Manches advertised his “French Imported Male Safes” as “a perfect shield against disease or conception.”  For ladies wishing to take matters into their own hands, he offered “Ladies Protectors,” commonly known as womb veils.  If those devices failed to prevent pregnancy, he promised “Ladies Cured at One Interview, with or without medicine, $5.”  He was one of over a hundred abortionists in the city, according to the New York Times.

With support from the YMCA, Comstock continued his raids.  By mid-year, he had eight smut cases pending in New York courts.  But prosecutors continually requested postponements.  When one case finally proceeded, the defense didn’t contest Comstock’s testimony.  It simply argued the material confiscated was no more obscene than passages in the Bible.  The argument wasn’t convincing.  Ten jurors voted to convict.  But the two who didn’t meant the defendant walked.  That proved the best outcome of his pending cases.

Frustrated under state law, Comstock changed tactics.  Seven years earlier, Congress had banned obscenity from first class mail.  The law was weak, narrowly defining obscenity and prohibiting postmasters from unsealing mail even if they knew a piece contained it.  Prosecutions had barely hit half a dozen.

Comstock began ordering smut by mail.  After receiving obscene goods, he obtained warrants in US Circuit Court.  Four dealers were convicted and sentenced to one year in jail and $500 fines – too lenient for Comstock, but the maximum the law allowed.

Raiding one dealer’s medical associate, he discovered the doctor’s teenage patient awaiting his third attempt to abort her fetus.  But abortion was a state crime.  A district attorney killed that case.

Dissatisfied, Comstock outlined ideas for a tougher federal law to Morris Jessup, the YMCA’s President.  Jessup got US Supreme Court Justice William Strong to finalize a bill for Congress.  In February 1873, Comstock visited the US Capitol to exhibit obscenities – books, sex toys, rubber goods.  Attending senators declared they would accept any bill he wanted so long as it was constitutional.  They could pass it before the current session closed for President Grant’s second inauguration March 4.

New York Congressman Clinton Merriam introduced the bill in the House, expecting to pass it quickly under a suspension of the rules.  Connecticut Senator William Buckingham followed in the Senate.

An optimistic Comstock got a head start on enforcement.  On Treasury Department letterhead, he contacted nine suspicious doctors.  “I am an employee of the Treasury,” he wrote under the pseudonym Anna M. Ray, “I was seduced about four months ago, and I am now three months gone in the family way.”  “Anna” begged each doctor to send something to relieve her condition.  “For God’s sake do not disappoint a poor ruined and forsaken girl whose only relief will be suicide if you fail me.”

The optimism was premature.  With resisting legislators invoking rules and demanding changes, weeks passed.  On Saturday evening, March 1, the House met for its final session.  Comstock watched.  At midnight, unwilling to break the Sabbath, he gave up.  Leaving the Capitol, he spent a sleepless night too depressed even to pray.  Not until dawn could he accept the failure as God’s will. Only when he ran into the Senate’s chaplain did he learn the news.  “Your bill passed the House at two o’clock this morning,” the chaplain said.  It was immediately sent to the Senate and passed.  President Grant signed it the next day.

His bill launched Comstock’s four-decade career fighting smut dealers, abortionists, birth control advocates, artists, playwrights, and poets.  Its opening section foretold his war on reproductive rights, explicitly banning anything – device, medicine, tool, information, advertising – “for the prevention of conception” or “for causing unlawful abortion.”

Women bookended that career.  As he was pushing his bill in Congress, Comstock indicted “Free Lover” Victoria Woodhull and her sister Tennie Claflin for publishing an obscene article exposing the adultery of Reverend Henry Ward Beecher.  While the article might have been libelous had it not been true, it wasn’t obscene.  But Comstock guessed the arrests would be a publicity coup that would help his bill pass.  After a year of harassment, the sisters were acquitted.

Under his bill, Comstock quickly attacked abortionists—twelve in Chicago, seventeen in New York.  But Chicago judges imposed trivial fines. In New York only three served serious time.  Through 1875, Comstock claimed 49 abortion arrests with 39 convictions, but even he acknowledged the difficulty of bringing the practitioners to justice.  In 1878, he achieved one notable feat.  He entrapped New York’s notorious abortionist Madame Restell, driving her to suicide.  “A Bloody ending to a bloody life,” he noted without remorse.

Months later, Comstock entrapped Dr. Sara Case.  She supplied syringes for post-coital douching with substances like vinegar and carbolic acid to prevent conception.  As their battle played out in the press, Case renamed her device the “Comstock Syringe.”  Sales soared.

The list went on until Comstock closed his career arresting birth control advocate Margaret Sanger.  She fled to Europe to escape his clutches.  Comstock resorted to prosecuting her estranged husband for handing out a birth control pamphlet.

Of course the women he attacked directly were not the only victims of Comstock’s fight against reproductive rights.  Others were the desperate women forced to bear children, no matter the risks to their health, their inability to support another baby, or their simple satisfaction with the family they already had.

With the Texas judge’s decision stayed and appeals underway, the battle over reproductive rights continues in Anthony Comstock’s shadow.

AI the Latest Instance of our Capacity for Innovation Outstripping our Capacity for Ethics

The eagerness with which movie and television studios have proposed to use artificial intelligence to write content collides with the concern of Writers Guild members for their employment security and pay in the latest episode of technological innovation running ahead of ethical deliberation. 

Regarding modern technology, the psychologist Steven Pinker and the economist/environmentalist E. F. Schumacher have expressed opposite opinions. In his Enlightenment Now: The Case for Reason, Science, Humanism, and Progress (2018), the former is full of optimism--e.g.,“technology is our best hope of cheating death”--but many decades earlier Schumacher stated that it was “the greatest destructive force in modern society.” And he warned, “Whatever becomes technologically possible . . . must be done. Society must adapt itself to it. The question whether or not it does any good is ruled out.”

Now, in 2023, looking over all the technological developments of the last century, I think Schumacher’s assessment was more accurate. I base this judgment on recent developments in spyware and Artificial Intelligence (AI). They have joined the ranks of nuclear weapons, our continuing climate crisis, and social media in inclining me to doubt humans’ ability to control the Frankensteinian  monsters they have created. The remainder of this essay will indicate why I have made this judgment.

Before taking up the specific modern technological developments mentioned above, we can state our main failing: the structures that we have developed to manage technology are woefully inadequate. We have possessed neither the values nor the wisdom necessary to manage it well. Several quotes reinforce this point.

One is General Omar Bradley’s: "Ours is a world of nuclear giants and ethical infants. If we continue to develop our technology without wisdom or prudence, our servant may prove to be our executioner."

More recently, psychologist and futurist Tom Lombardo has observed that “the overriding goal” of technology has often been “to make money . . . without much consideration given to other possible values or consequences.”

Finally, the following words of Schumacher are still relevant:

“The exclusion of wisdom from economics, science, and technology was something which we could perhaps get away with for a little while, as long as we were relatively unsuccessful; but now that we have become very successful, the problem of spiritual and moral truth moves into the central position. . . . Ever-bigger machines, entailing ever-bigger concentrations of economic power and exerting ever-greater violence against the environment, do not represent progress: they are a denial of wisdom. Wisdom demands a new orientation of science and technology towards the organic, the gentle, the nonviolent, the elegant and beautiful.”

“Woefully inadequate” structures to oversee technological developments. How so? Some 200 governments are responsible for overseeing such changes in their countries. In capitalist countries, technological advances often come from individuals or corporations interested in earning profits--or sometimes from governments sponsoring research for military reasons. In countries where some form of capitalism is not dominant, what determines technological advancements? Military needs? The whims of authoritarian rulers or elites? Show me a significant country where the advancement of the common good is seriously considered when contemplating new technology.

Two main failings leap out at us. The first, Schumacher observed a half century ago--capitalism’s emphasis on profits rather than wisdom. Secondly--and it’s connected with a lack of wisdom--too many “bad guys,” leaders like Hitler, Stalin, Putin, and Trump, have had tremendous power yet poor values.

Now, however, on to the five specific technological developments mentioned above. First, nuclear weapons. From the bombings of Hiroshima and Nagasaki in 1945 until the Cuban Missile Crisis in 1962, concerns about the unleashing of a nuclear holocaust topped our list of possible technological catastrophes. In 1947, the Bulletin of the Atomic Scientists established its Doomsday Clock, “a design that warns the public about how close we are to destroying our world with dangerous technologies of our own making.” The scientists set the clock at seven minutes to midnight. “Since then the Bulletin has reset the minute hand on the Doomsday Clock 25 times,” most recently in January of this year when it was moved to 90 seconds to midnight--“the closest to global catastrophe it has ever been.” Why the move forward? “Largely (though not exclusively) because of the mounting dangers of the war in Ukraine.”

Second, our continuing climate crisis. It has been ongoing now for at least four decades. The first edition (1983) of The Twentieth Century: A Brief Global History  noted that “the increased burning of fossil fuels might cause an increase in global temperatures, thereby possibly melting the polar ice caps, and flooding low-lying parts of the world.” The third edition (1990) expanded the treatment by mentioning that by 1988 scientists “concluded that the problem was much worse than they had earlier thought. . . . They claimed that the increased burning of fossil fuels like coal and petroleum was likely to cause an increase in global temperatures, possibly melting the polar ice caps, changing crop yields, and flooding low-lying parts of the world.” Since then the situation has only grown worse.

Third, the effects of social media. Four years ago I quoted historian Jill Lepore’s highly praised These Truths: A History of the United States (2018): “Hiroshima marked the beginning of a new and differently unstable political era, in which technological change wildly outpaced the human capacity for moral reckoning.” By the 1990s, she observed that “targeted political messaging through emerging technologies” was contributing to “a more atomized and enraged electorate.” In addition, social media, expanded by smartphones, “provided a breeding ground for fanaticism, authoritarianism, and nihilism.”

Moreover, the Internet was “easily manipulated, not least by foreign agents. . . . Its unintended economic and political consequences were often dire.” The Internet also contributed to widening economic inequalities and a more “disconnected and distraught” world. Internet information was “uneven, unreliable,” and often unrestrained by any type of editing and fact-checking. The Internet left news-seekers “brutally constrained,” and “blogging, posting, and tweeting, artifacts of a new culture of narcissism,” became commonplace. So, too, did Internet-related companies that fed people only what they wanted to see and hear. Further, social media “exacerbated the political isolation of ordinary Americans while strengthening polarization on both the left and the right. . . . The ties to timeless truths that held the nation together, faded to ethereal invisibility.”

Similar comments came from the brilliant and humane neurologist Oliver Sacks, who shortly before his death in 2015 stated that people were developing “no immunity to the seductions of digital life” and that “what we are seeing—and bringing on ourselves—resembles a neurological catastrophe on a gigantic scale.” 

Fourth, spyware. Fortunately, in the USA and many other countries independent media still exists. Various types of such media are not faultless, but they are invaluable in bringing us truths that would otherwise be concealed. PBS is one such example.

Two of the programs it produces, the PBS Newshour and Frontline, have helped expose how insidious spyware has become.  In different countries, its targets have included journalists, activists, and dissidents.  According to an expert on The Newshour,

“The use of spyware has really exploded over the last decade. One minute, you have the most up-to-date iPhone, it's clean, sitting on your bedside table, and then, the next minute, it's vacuuming up information and sending it over to some security agency on the other side of the planet.”

The Israeli company NSO Group has produced one lucrative type of spyware called Pegasus. According to Frontline, it “was designed to infect phones like iPhones or Androids. And once in the phone, it can extract and access everything from the device: the phone books, geolocation, the messages, the photos, even the encrypted messages sent by Signal or WhatsApp. It can even access the microphone or the camera of your phone remotely.” Frontline quotes one journalist, Dana Priest of The Washington Post, as stating, “This technology, it's so far ahead of government regulation and even of public understanding of what's happening out there.”

The fifth and final technological development to consider is Artificial Intelligence (AI). During the past year, media has been agog with articles on it. Several months ago on this website I expressed doubts that any forces will be able to limit the development and sale of a product that makes money, even if it ultimately harms the common good. 

More recently (this month) the PBS Newshour again provided a public service when it conducted two interviews on AI. The first was with “Geoffrey Hinton, one of the leading voices in the field of AI,” who “announced he was quitting Google over his worries about what AI could eventually lead to if unchecked.”

Hinton told the interviewer (Geoff Bennett) that “we're entering a time of great uncertainty, where we're dealing with kinds of things we have never dealt with before.” He recognized various risks posed by AI such as misinformation, fraud, and discrimination, but there was one that he especially wanted to highlight: “the risk of super intelligent AI taking over control from people.” It was “advancing far more quickly than governments and societies can keep pace with.” While AI was leaping “forward every few months,” needed restraining legislation and international treaties could take years.

He also stated that because AI is “much smarter than us, and because it's trained from everything people ever do . . . it knows a lot about how to manipulate people,” and “it might start manipulating us into giving it more power, and we might not have a clue what's going on.” In addition, “many of the organizations developing this technology are defense departments.” And such departments “don't necessarily want to build in, be nice to people, as the first rule. Some defense departments would like to build in, kill people of a particular kind.”

Yet, despite his fears, Hinton thinks it would be a “big mistake to stop developing” AI. For “it's going to be tremendously useful in medicine. . . . You can make better nanotechnology for solar panels. You can predict floods. You can predict earthquakes. You can do tremendous good with this.”

What he would like to see is equal resources put into both developing AI and  “figuring out how to keep it under control and how to minimize bad side effects of it.” He thinks “it's an area in which we can actually have international collaboration, because the machines taking over is a threat for everybody.”

The second PBS May interview on AI was with Gary Marcus, another leading voice in the field. He also perceived many possible dangers ahead and advocated  international controls.

Such efforts are admirable, but are the hopes for controls realistic? Looking back over the past century, I am more inclined to agree with General Omar Bradley--we have developed “our technology without wisdom or prudence,” and we are “ethical infants.”

In the USA, we are troubled by divisive political polarization; neither of the leading candidates for president in 2024 has majority support in the polls; and Congress and the Supreme Court are disdained by most people. Our educational systems are little concerned with stimulating thinking about wisdom or values. If not from the USA, from where else might global leadership come? From Russia? From China? From India? From Europe? From the UN? The past century offers little hope that it would spring from any of these sources.

But both Hinton and Marcus were hopeful in their PBS interviews, and just because past efforts to control technology for human betterment were generally unsuccessful  does not mean we should give up. Great leaders like Abraham Lincoln, Franklin Roosevelt, and Nelson Mandela did not despair even in their nations’ darkest hours. Like them, we too must hope for--and more importantly work toward--a better future.

Forget "Finding Forrester"—Our Best Teaching Can Be Ordinary

Plato and Aristotle in a detail from The School of Athens by Raphael (1509–1510), fresco at the Apostolic Palace, Vatican City.

Every few years there is a movie about a gifted young person striving to reach their potential and being formatively shaped by a teacher or mentor. Finding Forrester is a classic in this genre. The main character, Jamal, is a gifted young writer who meets a famous but reclusive novelist, William Forrester, who helps Jamal improve by challenging him and not being overly easy with the praise. In Whiplash, Miles Teller plays a gifted young drummer named Andrew Neiman whose music teacher, Terence Fletcher, is determined to draw out his genius. Fletcher’s approach is abusive and even somewhat insane. But Andrew wants so badly to be a musical legend on the level of Charlie Parker that he practices until his hands bleed and he endures the abuse.

Though university level instruction should not involve the abusive behavior we see in Whiplash, and we probably have to be more orthodox in our teaching than an old novelist eating soup and pecking at a typewriter, we sometimes dream of working with the kind of student pictured in those films. This would be a young person who has a natural gift and an unnatural drive to succeed. They want to be challenged. When you push them, they keep getting better. They go on to achieve remarkable things. You get to help launch them into the stratosphere.

In reality, very few students are going to resemble the characters in these movies. Some of your students aren’t interested in your class. Some are trying to decide if they are interested. Some are interested, but have other priorities. Some want to get better at whatever your discipline is, but do not believe that your course is part of their hero’s journey. Not everyone is going to read your comments on their paper. Not all who do will take the comments to heart. A few of your students will cheat on class assignments. Some of your students will certainly go on to greatness and many have significant abilities, but most of your students will not practice until their hands bleed.

There aren’t a lot of movies about doing an excellent job with normal students and getting normal outcomes. However, if it’s true that the process is more important than the product, those movies are missing something anyway. There’s quite a bit of true excellence in teaching that never gets associated with students who go on to win Nobel prizes or become MacArthur Fellows. Exceptional outcomes are not the only measure of excellence in teaching. An excellent teacher can teach all kinds of students. You can do meaningful work and inspire people without becoming the backstory of the next Stand and Deliver.

In films with bright students, those students arrive with the passion. Jamal is already a writer before he finds Forrester. Andrew Neiman has aspirations in the opening sequence. In real life, some college students are still searching for their passion. Some of them need that flame to be nourished. Even those with significant gifts are not always a half step from legendary excellence. Sometimes the role of the excellent teacher is to introduce a subject or to guide the first steps along the path of whatever a student is pursuing. Sometimes what you impart is not even a passion for your own subject.

A lot of the wise mentors in movies are set in their ways and have a pretty fixed and cantankerous approach to instruction. That may not slow down a gifted student who cannot be deterred from learning, but, even then, it may not actually be the best approach. Teaching excellence does not always take the form of pushing students to the extreme limits of their abilities. All students need to be challenged, but not all in extreme ways. Some also need to be encouraged. Struggle can help with growth, but sometimes students are struggling with things that are more important than our classes and don’t need provocatively difficult assignments to learn to push themselves in life. That doesn’t mean that every semester, every course, has to be tailored to each individual student, or that everything should be easy, but it does mean that good teaching is much more than setting the bar at the correct height and then noting who makes it over and who doesn’t. There is a real art to setting meaningful but realistic expectations for students and ourselves.

One very unhelpful thing about films with amazing students is that they give us a distorted sense of impact. A good teacher’s legacy is not built on the genius of a single student helped along the way. A good teacher’s legacy includes people who became slightly better writers, casual readers of history, more critical viewers of documentaries, more knowledgeable citizens, and even people who just got better at passing college classes. A good legacy may even include helping direct a student to a better major for them. A good legacy is built on hundreds, thousands of recommendation letters, for all kinds of positions with varying degrees of prestige.

The reclusive novelist in Finding Forrester is roughly modeled on J.D. Salinger. Interestingly, Salinger’s novel Franny & Zooey has a relevant passage. Franny is a college student experiencing a kind of breakdown, and is judging her peers and professors along the way. Though both are part of the Glass family, full of child geniuses, her brother Zooey suggests that she is not so much flexing her intellect as being snobbish. Both had been precocious kids on a radio quiz show, and Zooey reminds his sister that their older brother Seymour always encouraged them to do their best for the “Fat Lady”—to do their best for some unknown woman in the audience that they imagined as really deserving and really listening. Zooey even shined his shoes, for the radio program, for the “Fat Lady.” He tells his sister:

“I don’t care where any actor acts. It can be in summer stock, it can be over a radio, it can be over television, it can be in a goddam Broadway theatre, complete with the most fashionable, most well-fed, most sunburned-looking audience you can imagine. But I’ll tell you a terrible secret—Are you listening to me? There isn’t anyone out there who isn’t Seymour’s Fat Lady. That includes your Professor Tupper, buddy. And all his goddam cousins by the dozens. There isn’t anyone anywhere that isn’t Seymour’s Fat Lady. Don’t you know that? Don’t you know that goddam secret yet? And don’t you know—listen to me, now—don’t you know who that Fat Lady really is?... Ah, buddy. It’s Christ Himself. Christ Himself, buddy.”

There are days it feels like we are doing the Broadway equivalent of teaching—students seem to be lighting up, they’re going on to bigger and better things, they’re asking for outside reading recommendations. It is easy to feel inspired. But there are days we are off-off-Broadway—monitoring low grades and repeating ourselves in class. It is our job to see all of our students as significant, whether or not they seem special to us when we first meet them. Even if they would rather be texting, it is our job to be teaching to the best of our abilities.

Excellence in teaching is in meeting the challenge of real-life classrooms, filled with students of all abilities, and resulting in all kinds of outcomes. Excellent teaching is not just about throwing down challenges to push great students on to more greatness. We don’t work on a film set, we work in a university classroom. We are great when we are consistently excellent, whether or not our students are famous or we are experiencing moments that have the feel of movie magic.   

Contemporary Pundits Need a Refresher on Populism's History

From the People's Party Paper, 1892

The way “populism” is typically invoked in today’s media, you wouldn’t know that the word comes down to us from one of America’s most successful progressive movements— the grass-roots crusade that resisted corporate power and fought to save democracy 130 years ago.

Many of today’s pundits would have you think otherwise.

“Is American Democracy Doomed by Populism?” asks Yascha Mounk of the Council on Foreign Relations, writing days after Trump supporters stormed the Capitol. Politico called Trump “The Perfect Populist” in 2016, likening him to George Wallace, Alabama’s white-supremacist governor in the 1960s. “Trump and Sanders Lead Competing Populist Movements,” says the Washington Post, echoing a common claim that progressives share some kind of “populistic” perspective with the far right.

In this ahistorical babble, you rarely hear mention of the men and women who organized a multiracial resistance to the first corporate oligarchs.

“The fruits of the toil of millions are boldly stolen to build up colossal fortunes for a few,” the Populists announced when they formed the People’s Party in 1892. The mega-rich who “despise the republic and endanger liberty” were the real danger to democracy.

The People’s Party would contest the rule of these “plutocrats” at a time of rapid social change. Railroads, electricity, and mechanized crop harvesting were transforming the economy, making the “Robber Barons” who controlled these new technologies the richest men on earth. While many workers took home less than $10 a week in 1890, Jay Gould, the infamous stock speculator, was pocketing more than $20,000 a day (in today’s dollars, about $700,000).

Farmers were routinely abused. Railroad monopolies gouged them with inflated charges for shipping wheat and cotton to distant markets, while lenders (especially in the South) extorted interest of 40% or more on loans for overpriced supplies and equipment. At a time when farmers and farm laborers accounted for more than 40 percent of the labor force, their collective anger posed a genuine threat to unfettered capitalism.

Neither the Democratic nor Republican parties saw what was coming. Both were dominated by monopoly capitalists who wanted minimal taxation, no regulation of their “private” business empires, and no legal rights for the farmers and workers who resisted corporate profiteering. At a time when there were no primary elections for choosing a party’s presidential candidate, there was little prospect for internal reform in either major party.

The Populists had to launch a new political movement, drawing support from the Farmers Alliance, the American Railway Union, the women’s suffrage movement, Christian Socialists, the United Mine Workers, and utopian reformers. The People’s Party was also a multiracial movement in the South, where African Americans served on the party’s state executive committees in Texas, Louisiana and Georgia.

The economic and political goals of these Populists were as broad as their membership. They wanted farmer-owned cooperatives that would negotiate for better prices from processors and merchants. They favored public ownership of railroads, utilities and other natural monopolies. They called for postal savings banks and low-cost federal loans for farmers and workers. They wanted recognition of farm organizations and labor unions. Where bankers favored the high interest rates that came from basing the money supply on scarce reserves of gold, the Populists wanted to abolish the Gold Standard and expand the money supply with government-issued bills and silver coinage.

Above all, they wanted to restore a democracy corroded by the blatant buying of privilege. Nationally, they favored the election of senators rather than their appointment by bought-and-sold state legislatures— as was then the case. To reform state government, they called for referendum, recall, and votes for women. In the South, they favored political rights for Black voters.

On this reform platform, the Populists called on the “producing classes” to vote for the return of government “to the hands of the ‘plain people’.”

They failed nationally, but it was a close call in the West, the Great Plains and the South. Fifty Populists won election to Congress from 16 states. North Carolina, Oregon, South Dakota, Nebraska, Kansas, and Colorado all elected Populist governors. The Populist vote would have been higher still in the many southern states where white elites organized a deadly backlash, stealing votes, murdering Populists, and imposing one-party rule by white-supremacist Democrats.

Even so, the Populists transformed the political terrain in America, marked by the subsequent emergence of progressive movements in both national parties. The watershed was 1896, when William Jennings Bryan won the Democratic Party nomination for president on a pledge to regulate the railroads and expand the money supply with silver. Running as a Democrat— and widely viewed as a “Popocrat”— he fell short with 47 percent of the popular vote. But progressives thereafter gained ascendency in the party, leading to reforms in the next century that included much of the Populist platform: election of senators, votes for women, corporate regulation, collective bargaining rights for workers and farmers, and an end to the Gold Standard.

Bernie Sanders, the Democratic Socialist, is at least a distant cousin of these original Populists. Donald Trump is not. Even the phrase “right-wing populism” is— historically speaking— an oxymoron. The Populists of the 1890s would have despised the likes of Trump, a preening billionaire allied with today’s mega-rich.

Mainstream pundits would nevertheless have us believe that any popular movement calling on “the people” to overturn “corrupt elites” is a populist threat to democracy. Lumping Sanders together with a wanna-be fascist like Trump implies that both men seek to sway voters with equally polarizing and manipulative rhetoric.

Those who apply this shape-shifting term are actually branding themselves. Some are simply unwitting users of a phrase that’s in vogue and gives the appearance of historical insight. Others may know better, but have gotten used to it. Still others deliberately use the populist label to stigmatize any movement that challenges the questionable legitimacy of our elite-dominated “meritocracy.”

Elites who tar their critics in the U.S. with the sly pejorative of “populist” count on our collective amnesia. They’d rather the real Populists remained forgotten, along with the potential they represented.

How a Little-Known Anti-Vietnam Protest Reverberates Today

Probably no period of US history witnessed more student unrest than the Vietnam War years before the draft ended in 1973.  Student demonstrations started in 1963 at St. John’s University in Queens, NY, gained momentum at Berkeley the following year, and culminated with strikes at nearly 400 colleges and universities following the killing of students at Kent State and Jackson State in 1970.

Why, then, might a small-scale demonstration at a remote, though distinguished, university in western New York deserve our attention?  Only fifteen students and two professors demonstrated against the Vietnam War during an ROTC ceremony at Alfred University (AU) in May 1968.   The university suspended seven students and fired one of the two faculty protestors.  AU charged them with violating recently adopted guidelines relating to demonstrations.  The incident hardly created a ripple beyond the local area. 

Although most Americans knew nothing about this event, the ACLU took note.  Alfred University is a private university, but it contains an internationally acclaimed Ceramics College funded by the State University of New York.  Three of the seven students were enrolled in that SUNY unit, and the ACLU, mainly represented by a young and brilliant civil rights attorney named Neil Fabricant, aided the students.

The fired professor, a 40-year-old historian named Michael Kay, was an outspoken anti-war radical.  Ironically, he had been hired in part because he was a Marxist; the department chairman, David Leach, thought students should be exposed to a range of historical viewpoints.  Nevertheless, Kay had become a thorn in the side of the university. He organized a chapter of Students for a Democratic Society (SDS).  He rarely attended faculty meetings and frequently canceled classes.  AU’s president complained, with justification, that he “has a passion for anarchy and a genius for discord.”

Many of Kay’s colleagues would have agreed.   A sociologist who shared Kay’s political views wrote after the university fired him that “he gave no quarter and deserves none.”   He was considered so disagreeable that not even the local American Association of University Professors (AAUP) chapter came to his defense when he alleged that the university had fired him because of his left-wing politics.    

Despite calls to reinstate him, the university stood firm.  He had clearly violated the university’s demonstration guidelines by interfering with the progress of the ROTC ceremony and refusing to move away when ordered to do so. He had been warned that his behavior at the ROTC ceremony placed him at risk of dismissal.   And because AU was a private institution, he could not legally challenge his firing. 

Not so the three suspended SUNY Ceramics College students.  They claimed that AU violated their First Amendment freedoms of speech and assembly, along with their Fourteenth Amendment right to due process.  Joined by the other four students, all seven went to court.  Because the Ceramics College, one of AU’s four colleges, was fully funded by the State of New York and because the state provided AU with about $200,000 to cover instructional costs for Ceramics students taking courses in other AU colleges, the plaintiffs argued that AU officials had acted as state agents and therefore that the suspension constituted “state action.”  That concept—state action—though little known outside of the legal community, became critically important to their suit.

Their case would be known as Powe v. Miles, Emile Powe being the first of the seven plaintiffs and Miles being Leland Miles, president of the university. The students went to Federal District Court in Buffalo, where the case was assigned to Judge John T. Curtin. Following two days of hearings, Judge Curtin held that the university had not acted in the role of the state and therefore that the “state action” principle was inapplicable. For that reason, the students could not pursue their claims in federal court. A private university, Curtin concluded, could suspend students for almost any reason, and AU was private despite receiving state monies for the Ceramics College.

The students’ attorney, Neil Fabricant, strongly disagreed.  He persuaded his clients to appeal.  Off they went to the U.S. Court of Appeals for the Second Circuit in New York City. 

 There, a three-judge panel that included Henry J. Friendly, perhaps the most highly respected appellate court jurist in the country, accepted Fabricant’s argument that New York State’s funding of the Ceramics College meant that suspending the students indeed constituted “state action.”  Fabricant reminded the Court that the very name of the college—the New York State College of Ceramics at Alfred University—justified the “state action” designation.  The Second Circuit therefore reversed Judge Curtin’s lower-court decision. It concluded that a federal court could properly address the First and Fourteenth Amendment issues raised by the plaintiffs.

Unfortunately for the students, however, the Appeals Court did not find that AU had violated their constitutional rights. The Court held that AU’s demonstration guidelines, which required such things as 48 hours’ prior notice and no disruption of educational activities (the ROTC ceremony was technically a class), were reasonable. Moreover, the Court noted that the university had given the students adequate opportunity to protest in a way that did not abridge their First Amendment rights: they were permitted to display signs calling for an end to the Vietnam War and the abolition of compulsory ROTC by standing to the side of the ROTC parade grounds, so long as they did not disrupt the ceremony. The university also granted the students a right to appeal their suspension, thereby preserving their Fourteenth Amendment right to due process. AU even permitted the students to take their spring semester final exams off campus. The Court therefore sustained AU’s decision to suspend the students for the fall semester.

The Appeals Court may have exonerated AU, but the AAUP was less forgiving. The AAUP is a professional organization committed to the defense of faculty and the principle of academic freedom. It ignored the student side of this controversy and mounted a fourteen-month investigation into Kay’s dismissal. With laser focus, the AAUP highlighted the fact that Kay had been fired before he had a chance to exercise his right to appeal and concluded that he had therefore been denied due process. The AAUP disregarded the inaction of the local AAUP chapter, some of whose members had found Kay so objectionable that, even before the ROTC protest, they had recommended his termination.

Nevertheless, in what can only be viewed as a victory for AU, the AAUP stopped short of censuring the university after Alfred officials agreed to pay Kay a year’s salary and to update the faculty handbook in accord with AAUP recommendations.

So why should we remember this matter? Not because of Professor Kay’s fate, but because the Court of Appeals redefined the legal status of a private university that receives state funding. Is a private university subject to state regulation with respect to protests? Will its faculty and students enjoy constitutional protections?

Powe v. Miles became a nationally influential precedent. It has been cited in federal and state courts 216 times since 1968, and seventy-five of those citations relate specifically to the “state action” concept. Moreover, in the immediate wake of Powe v. Miles, the New York State legislature passed Education Law Section 6450, requiring every institution of higher learning in New York receiving public funds to “adopt rules and regulations for the maintenance of public order…and provide for the enforcement thereof.” A college refusing to abide by Section 6450 would forfeit state monies.

From Powe v. Miles in 1968 until about 1982, courts expanded the scope of the “state action” concept, especially in matters of race and gender. After 1982, reflecting a more conservative legal environment, courts narrowed their interpretation. In short, Powe v. Miles has influenced a corner of American law for over half a century, which is to say that the ripple effects of the 1968 demonstration at Alfred University reverberate into the 21st century.

Brandon Johnson Built a Coalition to Win in Chicago. Can He Keep it to Govern?

On Monday, May 15, Brandon Johnson takes the oath of office to become the 57th mayor of Chicago – a moment echoing Harold Washington’s path-breaking election, forty years ago this spring, as the city’s first Black mayor. The historic parallels between 1983 and today, however, are less about who Johnson is – the 46-year-old former teacher and union activist will become the fourth Black mayor in the city’s history – than about how he got to the fifth floor of City Hall and the challenges that face him and his administration.

Not unlike Washington, Johnson won a narrow victory against a more conservative white opponent in Paul Vallas, a former city budget director and schools CEO who leaned on law-and-order themes and other racial dog whistles throughout the campaign. Johnson built a multiracial coalition – an overwhelming Black vote plus substantial Latino and white support – to beat Vallas. Younger voters, who had largely eschewed the first round of the mayoral campaign, came out in larger numbers to help push Johnson over the top in the runoff.

Similarly, by the time Harold Washington delivered his inaugural address in May 1983, he had vanquished three prominent white opponents in two elections with an astounding 99 percent of the Black vote, nearly 80 percent of Latinos, and a small but significant number of white progressives, including much of the city’s burgeoning gay and lesbian community. Promising to make the city a fairer, more inclusive place, Washington inspired high hopes among his supporters that he indeed could open up the city to all.

But while Washington’s new administration made some important strides, the reality of governing proved even harsher than most had predicted. The explicit racism Washington and his allies faced during the campaign continued as a white City Council majority thwarted most of his policies and appointments for the first two-and-a-half years of his mayoralty in what was called the Council Wars. Deindustrialization, a hostile Reagan White House, and crises posed by crime, drugs, and AIDS proved just as daunting to his policy agenda. When Washington was successful, it was often not only because his allies had his political back, but also because they were willing to maintain their own pressure on the new mayor to follow through on his campaign promises.

For instance, when Washington moved slowly to incorporate Latinos in his new administration, activists such as Nena Torres, Miguel del Valle, and Linda Coronado threatened to establish their own independent Latino affairs commission. What became the Mayor’s Advisory Commission on Latino Affairs started as a provocation to the new mayor to take them and their issues more seriously. The commission, formalized in 1984, became an essential voice for Latino interests – from affirmative action and redistricting to infant mortality and urban renewal – and a model for mayoral commissions representing other groups. But the commission only came into existence through intense lobbying by Latinos.

Washington’s historic choice of Fred Rice as the city’s first African American police superintendent offered another example. While important symbolically, Rice’s appointment did not change the culture of what remained a highly dysfunctional police department known for harassment and even torture. Excessive force complaints, in fact, rose during the first two years under Washington and Rice. And yet Washington supporters and activists at the time generally trod lightly on his management of the department, knowing that sharp public criticism of the first Black mayor’s handling of the police would simply add fuel to his opponents’ efforts to discredit him.

Forty years later, Brandon Johnson faces the same kind of high expectations that Washington did, but in a city far more unequal and financially strapped than it was under Washington. As the new mayor navigates issues of rising crime, under-resourced schools, and now a growing migrant crisis, staying in the good graces of the diverse and inherently fragile coalition that elected him may prove difficult.

Ultimately, as in 1983, it will be up to those Chicagoans who voted for reform – including the powerful Chicago Teachers Union, of which Johnson was once a member and organizer – to decide how much patience and grace to show their now elected ally. How accountable will they hold him to his campaign promises to govern differently than his predecessors? Or will another chance at reform in the Windy City slowly blow away?

Political Pundits, Apply the "Resentment" Label with Caution

Although former President Donald Trump's 2024 campaign frequently references his own resentments, is that emotion driving his supporters?

If someone ever managed to copyright the word “resentment,” the owner would enjoy a steady stream of revenue, especially from columnists and opinion writers. Take those of the venerable New York Times. “The Resentment Fueling the Republican Party is Not Coming from the Suburbs,” reads the headline of a Thomas Edsall column from earlier this year (January 25, 2023). Just a day later, Edsall’s Times colleague Paul Krugman declared, “Rural resentment has become a fact of American politics” (January 26, 2023). Earlier that month, Bret Stephens wrote, in a colloquy with David Brooks, “The problem is that Trump turned the [Republican] party into a single-purpose vehicle for cultural resentments,” adding: “It doesn’t help that coastal elites do so much on their own to feed those resentments” (January 15, 2023). And in August of last year, Jamelle Bouie struck the same chord: “Republicans would like to offer you some resentment” (August 22, 2022).

Given these assertions, it is no surprise to discover that the rush to invoke resentment coincided with the election of Trump in 2016. It quickly became an off-the-shelf explanation for a political phenomenon that defied all rational expectations. David Remnick, the editor of the New Yorker, vilified the victorious candidate as a “slick performer” who essentially duped his followers by being “more than willing to assume their resentments, their fury, their sense of a new world that conspired against their interests.” And days after the election, Leon Wieseltier, writing in the Washington Post, seized upon it as the apt word to describe the present moment: “Resentment, even when it has a basis in experience, is one of the ugliest political emotions, and it has been the source of horrors,” he declared. Others followed suit.

What are we to make of the place of “resentment” in the echo chamber of a significant segment of the commentariat? Does its frequent, casual, sometimes unthinking deployment really offer any insight into the motivations and values of the millions of Americans who voted the former president into office and support him still? It’s like inflation: when we use something too frequently, its value is diminished. Might it be time to place a moratorium on “resentment”?

Perhaps not. But we might at least become more aware of its potential meanings and implications, especially those that risk overshooting the mark of what commentators intend to convey.

We might recall, for example, that at least since Friedrich Nietzsche it has usually been understood as a profoundly demeaning characterization of people convinced of their unjust victimization, consumed by bitterness and envy, governed by a twisted sense of the reasons for their fate. “Nothing on earth consumes a man more quickly than the passion of resentment,” he wrote in Ecce Homo. And in The Genealogy of Morality, where he cast the emotion as fundamental to the debased morality of the slaves, he says of the resentful man, “His soul squints.”

In more recent times, commentators have usually defined this psychological disposition in similar terms.  It is the “villain of the passions,” according to the philosopher of emotions Robert Solomon. It poisons “the whole of subjectivity with its venom… maintaining its keen and vicious focus on each of the myriad of petty offenses it senses against itself.”  One doesn’t have to embrace this rather Nietzschean view to appreciate that resentment is an emotion that few people are eager to “own.” 

Or we might also realize that resentment has often been used to delegitimize people who merely exhibit a profound dissatisfaction with the status quo, who insist that they are being denied their just deserts. The literary scholar Fredric Jameson sees recourse to resentment in explaining protestors’ motivations as “little more than an expression of annoyance at seemingly gratuitous lower-class agitation, at the apparently quite unnecessary rocking of the social boat.” Too often, to fixate on resentment is to ignore or underplay the real grievances that stand behind this usually unappealing emotional state. It is to mistake the symptom for the cause.

On the other hand, we might consider that there are different modes of resentment, some indeed not so much a function of envy, or bitterness, or feeling cheated by fate, but rather righteous indications of an injustice that must not be ignored.  And here it is precisely the irritating, clamorous tone of resentment that serves this purpose. “In the midst of the world’s silence, our resentment holds its finger raised,” wrote the Auschwitz survivor Jean Améry in 1966: his lonely protest against the blithe alacrity of his contemporaries to put the past behind them, especially when it came to the Shoah. In the face of this tendency, he writes, “I ‘stuck out’…I persevered in my resentments.” 

The moral philosopher Amélie Oksenberg Rorty warned that if we slight or ignore expressions of resentment, we are like the physician who dismisses the symptoms of a suffering patient.  And in the experience of various “Truth and Reconciliation Tribunals” around the world, it has often been former victims’ insistent expressions of resentment that have called a temporary halt to the proceedings—which almost always aimed at achieving the “closure” of forgiveness—until their grievances were adequately acknowledged.

Finally, those quick to brand others with the label of resentment ought to ask whether they themselves are immune from the same ascription. One thing that distinguishes resentment from other kindred emotions, such as anger, bitterness, or enervating envy, is that it usually signals a moral injury—a conviction that you have been wronged in a way that contravenes some basic notions or standards that should normally govern what people expect for themselves and from others. In our current climate, the tendency is to think of resentment as the farthest thing from “moral”—often, given some of its uglier manifestations, with justification. But anyone with a sense of self-worth has to be at least prone to the kind of moral aggrievement which gives rise to resentment.

I am not arguing for banishing “resentment” from our current lexicon. It’s clearly useful in illuminating the passions and grievances that animate many people in the US and elsewhere, especially on the right. But let’s deploy it less as a means of reproach and more in the quest for insight, perhaps even empathy.   

Buried Footage Helped Chicago Police Get Away with Killing 10 Labor Activists in 1937

Chicago Police attack a march of steelworkers and their families, May 30, 1937. Photo: National Archives and Records Administration.

A major labor strike is back on the front pages this week, as Writers Guild members—movie and TV writers—have walked out. The most obvious fallout so far: late night talk shows going dark or struggling for jokes. Stephen Colbert, before signing off on CBS for who knows how many weeks, did declare: “This nation owes so much to unions.”

Labor actions and organizing, in fact, have been surging in recent years, mainly in new industries, from Starbucks to Amazon and Apple.

Decades in the past, strikes often led to violent conflict between workers and local police. Such violence virtually never happens today. This is at least partly due to what happened eighty-six years ago this month in Chicago, when police shot forty steel strikers and supporters (mainly in the back) and killed ten of them in what has become known as the Memorial Day Massacre. No labor conflict has come close to this toll since.

Yet Paramount buried the only footage of the incident until a famed reporter and a crusading U.S. senator brought it to light.

My new film exploring all this premiered on PBS on May 6. It’s titled Memorial Day Massacre: Workers Die, Film Buried. It will be aired on local PBS stations all month, but everyone can watch it starting the same night, and for several weeks after, via PBS.org and PBS apps, or at the site for the hosting station, KCET in Los Angeles.

It’s also explored in my companion book of the same title, the first oral history of the tragedy, with testimony from eyewitnesses as well as historians and authors such as Howard Zinn, John Hope Franklin, Gore Vidal, David Kennedy, and Studs Terkel, and even Ayn Rand.

The background: A wave of labor actions swept America starting in 1935. Sit-down strikes became all the rage, and even General Motors and Chrysler caved. The largest steel company, U.S. Steel, avoided a strike by offering workers a contract—under pressure from new CIO chief John L. Lewis—but the companies known as Little Steel (though hardly small), with plants across the Midwest and Pennsylvania, refused even to recognize the new Steel Workers Organizing Committee.

More than 70,000 workers at those plants declared a strike in late May 1937. When some set up picket lines outside Republic Steel in South Chicago, police swung nightsticks and injured more than two dozen picketers. The strikers then scheduled a family picnic to mobilize support on a broad field several blocks from the Republic plant on May 30. Well over one thousand people turned out on that hot, sunny day, including many women and children dressed in their Sunday best. Organizers called for a ragged march to the plant for legal mass picketing.

When they were halted by hundreds of Chicago police armed with pistols—some also carrying axe handles or tear gas provided by Republic—a few minutes of heated discussion ensued. Some marchers tossed stones or a tree branch, and police lost patience when the crowd, which included women and children, failed to disperse as ordered. Suddenly police hurled tear gas bombs and then fired dozens of shots.

About 40 marchers would be shot as they fled across the open prairie, including an 11-year-old boy, the vast majority wounded in the back or side. (Ten would die that day or in days ahead.) Dozens more would suffer head wounds after police clubbed the retreating marchers.

Police did not call ambulances or administer first aid but instead arrested the wounded and shoved them into paddy wagons for trips to a prison hospital and other distant medical facilities. Only a handful of police suffered injuries, all minor.

Newspapers across the country (including The New York Times) almost invariably described the marchers as a “mob” of “rioters” who left police no choice but to fire shots to keep them from attacking the plant. After two weeks of this, it emerged that a leading newsreel company, Paramount News, had a cameraman on the scene. He had filmed almost the entire confrontation and its ugly aftermath. But Paramount declined to release the four-minute newsreel it prepared, claiming it feared the footage might set off riots in movie theaters—more likely, the company wanted to protect Chicago police and officials.

This prompted a Senate subcommittee under the Wisconsin progressive Robert M. La Follette, Jr. to subpoena the footage. A staffer leaked it to investigative reporter Paul Y. Anderson, who wrote a sensational report picked up by many newspapers. At the well-publicized hearings called by La Follette at the end of June, the footage was screened for the first time.

Paramount now had little choice but to release a newsreel devoted to the incident. Screenings, however, would be banned in cities such as Chicago and St. Louis, or by entire theater chains. The Senate report would place full blame on police for the massacre. Yet a coroner’s jury in Chicago would judge the killings “justifiable homicide.” No police would be punished. Dozens of unionists had been wounded, jailed or fined.

Workers returned to the plants without a contract, but they would win recognition and most of their demands a few years later. And there was this positive result: strike leaders in nearly every field now tried to avoid violent conflict at all costs, and police were determined to control labor actions without the use of firearms.

Today, police shootings of unarmed citizens remain far too common, and often unpunished. But there is this further legacy of the 1937 massacre: it provoked the first calls for police to be equipped with cameras to document arrests—anticipating the dashboard cams and body cams that reveal so many of the unjust shootings today.

PBS is now streaming this documentary.

"An Inconvenient Truth" Shows the Missed Opportunities to Act on Climate Change

When the documentary film An Inconvenient Truth opened in May 2006, former Vice President Al Gore, the central figure in the film, hoped the movie would increase public awareness of climate threats and arouse bipartisan political support for action. Two years after the movie’s release, prospects for cooperation looked promising. The nominees for President of the United States from both major political parties called for ambitious climate programs. Later, partisan division on climate issues turned severe. Political leaders failed to take meaningful action. In recent years the recurrence of floods, droughts, and heat waves has convinced world leaders that a shift to green energy is imperative. But the hour for adjustment is late. UN Secretary-General Antonio Guterres summed up the dangers in a 2023 scientific report issued by the United Nations: “Humanity is on thin ice,” warned Guterres, “and that ice is melting fast.”

An examination of An Inconvenient Truth’s place in this history illuminates the record of lost opportunities in struggles to protect the planet.

The idea for creating the movie emerged when producer Laurie David saw a film excerpt of Gore’s lecture. She thought a feature-length production could inform the public about environmental threats. Davis Guggenheim soon joined the project as director. He faced a challenge trying to turn Gore’s lectures and slideshow into compelling entertainment. To excite the interest of moviegoers, Guggenheim made Gore the central character in the narrative.

The production drew attention to Al Gore’s longtime work as a climate activist. Gore became concerned about warming temperatures when he took a class at Harvard taught by Roger Revelle, a scientist who recorded the buildup of carbon dioxide in the atmosphere. Later, as a new member of the House in 1976, Gore held the first congressional hearings on climate change. In 1992 he published a bestseller, Earth in the Balance. As Vice President in the 1990s, Gore promoted the 1997 Kyoto Protocol, a UN initiative to reduce greenhouse gases. After he lost the presidential election in 2000, Al Gore devoted considerable time to lecturing on climate change in the United States and around the world.

An Inconvenient Truth was a surprising hit for a documentary film. It received considerable praise, won a prestigious award, and influenced public opinion. David Edelstein, a reviewer for New York Magazine, called the film “One of the most realistic documentaries I’ve ever seen – and dry as it is, one of the most devastating in its implications.” The movie won an Academy Award for Best Documentary Feature. In 2007, Gore and the Intergovernmental Panel on Climate Change received the Nobel Peace Prize. That year, a 47-country survey on the impact of An Inconvenient Truth conducted by the Nielsen Company and Oxford University reported that 66% of respondents who had seen the movie said it changed their minds about global warming.

When an interviewer asked President George W. Bush if he would watch the film, Bush replied, “Doubt it.” Relations between Bush and Gore had been strained since the 2000 presidential campaign. The tension was due partly to Gore’s criticism of Bush for failing to address global warming.

Some Republicans in Washington blasted the film and Gore. Senator Jim Inhofe of Oklahoma, a prominent climate denier, compared An Inconvenient Truth to Adolf Hitler’s book, Mein Kampf. Inhofe said every claim in the movie “has been refuted scientifically.” Congressman Lamar Smith of Texas claimed the science was flawed because of “exaggerations, personal agendas and questionable predictions.” Republicans also complained that Al Gore’s prominence in the movie damaged the film’s potential for attracting bipartisan support. Gore, they noted, had been the Democratic candidate for president, and they said Gore’s abrasive personality hurt the environmental cause.

Intervention by powerful business interests magnified political divisions. In the years after An Inconvenient Truth’s release, officials at coal, oil, and gas companies bankrolled communications that questioned the validity of climate science. These well-financed campaigns delivered political dividends. One of the most productive strategies involved requests in 2010 for “No Climate Tax” pledges from congressional candidates. Of the 93 new members who won election to Congress that year, 83 signed the pledge.

President Barack Obama tried to implement climate initiatives by giving the Environmental Protection Agency greater authority to regulate carbon emissions and by promoting electric cars and batteries, but the next president took a different approach. Donald Trump promised to “save coal” during the 2016 campaign. He denounced climate change as a hoax. Shortly after moving into the White House, Trump pulled the United States out of the Paris climate accord. His administration scrubbed references to “climate change” from government websites.

Climate denial was prominent in G.O.P. politics, but several Republicans were still willing to advocate climate action in the first years after the release of An Inconvenient Truth. In 2008 the Republican Party’s candidate for president made the case for renewable energy. John McCain called for mandatory limits on greenhouse gas emissions, supported a “cap-and-trade” program that gave companies incentives to invest in clean alternatives, and pledged to work with the globe’s biggest polluters, including China and India, to protect the earth. In a major speech on the topic, McCain urged action rather than “idly debating” whether climate change was man-made. He said, “We need to deal with the central facts of rising temperatures, rising water and all the endless troubles global warming will bring.”

In recent years some Republican officials have demanded responses to the kind of “endless troubles” McCain described. They recognized the threats to communities, evidenced by storms, floods, droughts, and melting ice. Several Republican leaders in southern Florida demanded climate action because of rising waters and flooded streets. Jim Cason, the mayor of Coral Gables (adjacent to Miami), said, “I’m a Republican, but this is a non-partisan job . . . You have to deal with facts, deal with risks and probabilities, you can’t keep putting your head in the sand.”

Unfortunately, a head-in-the-sand approach is still favored by many Republican officials and legislators. In 2021 twelve states with Republican attorneys general sued President Joe Biden because of his efforts to implement rules aimed at reducing greenhouse gases. That year 109 Republican representatives and 30 Republican senators in the 117th Congress refused to acknowledge the scientific evidence of human-induced global warming. In 2023 the House Republicans’ debt ceiling proposal aimed to repeal major policies designed to incentivize deployment of green energy.

Al Gore made a significant contribution to public awareness of climate issues through his central role in An Inconvenient Truth, but Gore could have done more for the environmental cause if he had won the 2000 presidential election. George W. Bush joked about his opponent in that campaign, saying Gore “likes electric cars. He just doesn’t like making electricity.” Al Gore, who realized long ago that electric cars could play an important role in protecting the planet, lost that presidential election by just a few hundred ballots in Florida.

Progress toward reducing greenhouse gases might have come earlier if the count in Florida had gone the other way. It seems likely that efforts to raise public awareness of climate threats would have been more robust in a Gore presidency.

Let Us Now Praise R. DeSantis

R. DeSantis, Governor of Florida, speaks to an audience of persons in Tampa

I am such a fan! Thank you so much for putting an end to all references to gender and sexuality in Florida’s public schools. I can’t believe it’s taken this long to have a political leader take a stand against gender and sexuality!

No more references in social studies classes to Pocahontas and her husband John Rolfe in colonial Virginia! Our children do not need to know that John identified as a man. From now on we’ll teach our children about Pocahontas and their spouse J. Rolfe!

No more talk of George and Martha as the first U.S. president and his First Lady! We’ll instead tell our students about G. Washington, who was followed by J. Adams, who was influenced by their spouse A. Adams in the famous “Remember The Non-Gender-Specific People” letters. We can then refer to T. Jefferson, our third president, who apparently was devastated by the loss of their first spouse, but was comforted by their coworker S. Hemings. We’ll finally be able to teach our children that Washington, Adams, and Jefferson were three of our most important Founding Parents!

I can’t wait to find out how the new lesson plans will avoid all discussion of gender identities in our unit about the 1848 Seneca Falls Convention and its famous call to enfranchise people who were inexplicably denied the right to vote. And to tell you the truth, I thank my gender-free gods (or goddesses, or will we now be referring to them as godx?) that Florida will finally allow us to correct that terribly worded speech by S. Truth, which we will henceforth translate as “Am I Not A Person?”

Who could vote after the 19th Amendment was ratified? People, not women!

Who were the youth who fought for the United States in World War One? The dough-people!

Who kept the factories running during World War Two? R. the Riveter!

Who staffed the Federal Bureau of Investigation? G-People.

What patriotic organization said no to allowing M. Anderson to sing at Constitution Hall in 1939? Children of the American Revolution!

What really surprises and pleases me is that your law is finally going to do away with gender-segregated bathrooms in Florida’s  public schools. No teacher should be telling my children which bathroom to use! No more teaching children about gender identity!

And those dreadful family tree projects! No more branches for fathers, mothers, sons, daughters, brothers, sisters, uncles, or aunts. We’re all relatives now!

And when you, R. DeSantis, are written up in future history textbooks, Florida’s children will be able to commemorate your fight against M. Mouse and D. Duck. You will be known around the world for insisting that parents, not teachers, should teach their children about rodent and avian genitalia.

All of this just leaves me with one question: how will the future teachers of Florida celebrate you for banning public school lessons about gender and sexuality if we can’t mention the unmentionable?

The First Campgrounds Took the City to the Wilderness

Overflow Crowd of Campers in Stoneman Meadow, Yosemite National Park, 1915. Bancroft Library, University of California, Berkeley.

A camping area is a form, however primitive, of a city.[1]

—Constant Nieuwenhuys

There is a deeply satisfying immediacy about the prospect of establishing and occupying an encampment for the night—clearing the site, erecting the tent, chopping wood, building a fire, and cooking over the live flame—that undoubtedly suggests a meaningful connection to landscape, place, and the rugged life of backwoods adventurers. As they reach their numbered campsites on foot, by automobile, or by RV, modern practitioners of the craft no doubt give little thought to the ways in which campgrounds have reshaped this powerful experience of nature, in ways both subtle and fundamental.

Recreational camping emerged in the United States after the American Civil War as a form of escape from the hustle of city life. Spurred by the 1869 publication of Adventures in the Wilderness; or, Camp-Life in the Adirondacks by the Reverend William Henry Harrison Murray (1840–1904), aka “Adirondack Murray,” visitors flocked to the region in search of spiritual and physical renewal, marking what the historical geographer Terence Young characterized as the birth of the practice as we know it today.

Before this important book appeared in print, recreational camping had remained the exclusive province of wealthy enthusiasts who could afford to hire expert guides to lead them, build their camps, fish, hunt, and prepare their food. As they casually set about exploring their surroundings, these new leisure campers were likely unaware that they would have little hope of surviving this unforgiving American wilderness on their own. As demand surged, these individual camps made way for large, organized tours such as those the Wylie Permanent Camp Company began operating in Yellowstone National Park during the mid-1880s. The 1910 edition of the company’s Yellowstone brochure promised an experience that sounds remarkably similar to modern-day glamping, in which visitors overnighted in well-appointed tents built on sturdy wooden platforms and furnished with comfortable beds.[2] As they set out in the morning, these campers were secure in the knowledge that the next stop on the itinerary would offer the very same conveniences.

Paradise Valley Campground, Mount Rainier National Park, 1915. Courtesy of the National Park Service.

During the early part of the 20th century, the influx of affordable automobiles like the Ford Model T made it possible for prospective sightseers to reject the artificial trappings of these organized tours. Instead, these newly minted motor tourists set out to experience the world on their own terms: “Thoreau at 29 cents a gallon.”[3] Unfortunately, these hordes of intoxicated travelers often met wild nature with enthusiasm but not always with high regard. Careless campers used streams as latrines and left behind unextinguished campfires and heaps of trash. Even the most respectful were not immune, since they might inadvertently consume water polluted by towns, factories, or campsites upstream. In an ironic twist, the very places that early enthusiasts like Adirondack Murray had championed for their thrilling scenery and regenerative properties were slowly turning into breeding grounds for diseases like typhoid and cholera, long since eradicated from urban agglomerations nationwide.[4] The noted expert Frank E. Brimmer (1890-1977) remarked dryly of this period that “The vandals sacked Rome, but what 12,000,000 motor campers may do in this land of the free, unless good sportsmanship is the keynote of camping, will make the inroads of the Huns look like a fairy story for juvenile consumption by comparison.”[5]

As a new type of spatial landscape during the 1910s and ’20s, campgrounds helped mitigate some of these destructive impacts. They did not close the door on increasing masses of visitors, but instead concentrated campers and their motor vehicles in designated enclaves, often demarcated by fences or moats, sparing more delicate and ecologically sensitive natural areas. It is interesting to think of the convergence of these two trends: the independence afforded by the automobile, and the slow, progressive removal of agency caused by the protective enclosure of the campground. In truth, campgrounds came into existence precisely because of motor vehicles. For Murray and his contemporaries, siting the rustic encampment in direct proximity to a lake, river, or stream, and within reach of an ample supply of firewood, represented a strategic decision of great consequence. With the emergence of these new massive public facilities, however, the location of the encampment was no longer open for discussion: park here also meant camp here. Campgrounds offered motor tourists the persuasive illusion of roughing it in nature, albeit within a structured spatial setting.

Art Holmes, Summit Lake Camp Boundary, Lassen Volcanic National Park, California, 1932. Courtesy of the National Park Service, NPGallery Digital Asset Management System.

Little more than open fields serviced by a few latrines, a well, and a large evening bonfire, the first campgrounds were characterized by overcrowding and loosely arranged clusters of motor vehicles, tents, and trailers. The Yosemite historian Stanford Demars observed that “It was commonly joked—and not without some truth—that the first camper to drive his automobile out of the campground on a holiday morning was likely to dismantle half of the campground in the process due to the common practice of securing tent lines to the handiest object available—including automobile bumpers.”[6] Over the next two decades, scores of additional controls, such as limits on length of stay, one-way roads, and dedicated numbered plots serviced by parking spurs, firepits, and picnic tables, would progressively shape the campground from a free and relatively open territory into the facilities we know and enjoy today.

Every summer now, over forty million Americans take to the open road in search of this powerful experience of nature.[7] They travel to state parks, national parks, and other federally managed lands. They summer at commercial campgrounds, like KOAs (Kampgrounds of America), or at private, luxury glamping sites advertised on Airbnb and Tentrr. Less discriminating campers even overnight their RVs in Walmart parking lots. Despite the hundreds of campsites available at popular facilities, demand has become so great that reservations must be booked months in advance of arrival using online websites such as reserveamerica.com, recreation.gov or campgroundviews.com, an amenity that offers prospective campers a Google Street View-like experience of many facilities nationally. Serviced by extensive networks of infrastructure and populated with automobiles, trailers, $300,000 RVs, massive tents, thick mattresses, coolers, stoves and other forms of specialized gear, each of the country’s 900,000 "lone" campsites, spread over 20,000 campgrounds nationally, functions as a stage upon which cultural fantasies are performed, often in full view of a nearby audience of fellow enthusiasts interested in the very same "wilderness" experience. As Ralph Waldo Emerson (1803-1882) keenly observed in his poem The Adirondacs, “We flee from cities, but we bring the best of cities with us.”[8]

[1] Cited in Charlie Hailey, Campsite: Architectures of Duration and Place (Baton Rouge, LA: Louisiana State University Press, 2008), 62.

[2] Idem.

[3] "Two Week's Vagabonds," The New York Times, July 20, 1922, sec VII, 7. Cited in Warren James Belasco, Americans on the Road: From Autocamp to Motel, 1910-1945 (Baltimore: Johns Hopkins University Press, 1997), 8.

[4] Cindy Aron, Working at Play: A History of Vacations in the United States (Oxford: Oxford University Press, 1999), 224.

[5] Frank Brimmer, Coleman Motor Campers Manual (Wichita, KS: The Coleman Lamp Co., 1926), 62.

[6] Stanford Demars, The Tourist in Yosemite (Salt Lake City: University of Utah Press, 1991), 139.

[7] Based on the 2017 Outdoor Foundation Camping Report. Downloaded at https://outdoorindustry.org/wp-content/uploads/2015/03/2017-Camping-Report__FINAL.pdf. [visited 08.16.19]

[8] Ralph Waldo Emerson, “The Adirondacs,” in May-Day and Other Pieces (Boston: Ticknor and Fields, 1867), 60.
