History News Network, Wed, 31 May 2023

What We Can Learn From—and Through—Historical Fiction

Novelist Anna Maria Porter, engraving The Ladies' Pocket Magazine (1824)

This image is available from the New York Public Library's Digital Collections under the digital ID 1556053 (digitalgallery.nypl.org, now digitalcollections.nypl.org).


I have been a local historian for many years, but I turned to historical fiction to tell a specific story for which there were no sources. There was a sense of going over to the “dark side” in doing so, yet at the same time I was able to illuminate things that do not appear in the historical record. I suspect that there could be a lively debate online about what good historical fiction can accomplish—and also about the misuse of history by those who write it.

 

As a local historian I tried to be true to the sources I found, so as to be trusted by readers. In the case of the dozen women who crossed the country in 1842, members of the first overland company to set out for the Pacific Northwest, I could find little. With no verifiable facts, but knowing that women were present, I turned to fiction to put women in the picture and wrote Lamentations: A Novel of Women Walking West (Bison Books, an imprint of the University of Nebraska Press, 2021). To someone like Gore Vidal, that made perfect sense; he thought history should not be left to the historians, “most of whom are too narrow, unworldly, and unlettered to grasp the mind and motive” of historical figures. E. L. Doctorow agreed, though more charitably, writing that “the historian will tell you what happened,” while the novelist will tell you what it felt like. The historian works with the verifiable facts—fiction is a step beyond.

 

Historical fiction is generally dated to Sir Walter Scott, beginning with Waverley in 1814. It turns out, however, that Scott was not the first historical novelist. Devoney Looser has just published Sister Novelists (Bloomsbury Press, 2022) about Maria (1778-1832) and Jane (1775-1850) Porter, who, driven by poverty, wrote popular historical novels beginning in the 1790s. A Wall Street Journal reviewer in 2022 noted that “Maria was a workhorse, Jane a perfectionist. Between them they wrote 26 books and pioneered the historical novel.”

 

There have been only a few academic treatments of historical fiction. Ernest Leisy issued The American Historical Novel in 1950 and George Dekker wrote American Historical Romance in 1987; both were interested in chronological periods, but neither man generated, or exhibited, much enthusiasm for the genre. Yet in 1911 James Harvey Robinson, in an essay titled “The New History” published in the Proceedings of the American Philosophical Society, observed that historians need to be engaging, even while “it is hard to compete with fiction writers.” He stated:

 

History is not infrequently still defined as a record of past events and the public still expects from the historian a story of the past. But the conscientious historian has come to realize that he cannot aspire to be a good story teller for the simple reason that if he tells no more than he has good reasons for believing to be true his story is usually very fragmentary and uncertain. Fiction and drama are perfectly free to conceive and adjust detail so as to meet the demands of art, but the historian should always be conscious of the rigid limitations placed upon him. If he confines himself to an honest and critical statement of a series of events as described in his sources it is usually too deficient in vivid authentic detail to make a presentable story.

 

The historian Daniel Aaron took the genre of historical fiction seriously in a 1992 American Heritage essay in which he castigates Gore Vidal. Aaron, however, conceded that “good writers, write the kind of history [that] good historians can’t or don’t write.”

 

Aaron quotes Henry James, who thought of historians as coal miners working in the dark, on hands and knees, wanting more and more documents, whereas a storyteller needed only to be quickened by a letter or event to see a way to share it with readers or use it to illuminate a point about the historical past. Aaron recognized that genres of reading had changed. In the 19th century we read historical tomes, mostly about the classical world or about British and European wars and political alignments, but in the last quarter of the 20th century “so-called scientific historians left a void that biographers and writers of fictional history quickly filled.” Aaron cites inventive novelists who have perverted history for a variety of reasons, using Gore Vidal as his prime example. Vidal thought of historians as squirrels, collecting facts to advance their careers. But Vidal does not get the last word.

 

Professor Aaron recognized that historical fiction had moved from a limited earlier model focused on well-known individuals to serious re-tellers of history who have “taken pains to check their facts and who possess a historical sensibility and the power to reconstruct and inhabit a space in time past.” What a lovely description of some of the best of our contemporary historical fiction.

 

But what of putting women into the past where they often do not appear? Addressing this issue, Dame Hilary Mantel noted in her 2013 London Review of Books essay “Royal Bodies” that

 

If you want to write about women in history, you have to distort history to do it, or substitute fantasy for facts; you have to pretend that individual women were more important than they were or that we know more about them than we do.

 

Despite my great admiration for Dame Hilary, I think we can deal with the issue of women in the past by honoring their lives in a way that does not turn them into twenty-first-century heroines, but presents them as women who found themselves in situations they might not have wished for, did what they needed to do, thought about their circumstances, and dealt with what they had landed in. They, like us, were each grounded in their own time, deserve credit for surviving, and should be appreciated for their observations of the life around them.

 

We should respect the historians’ knowledge of time and place and the novelists’ intuition, which is sometimes spot-on. An example: in trying to explore the moment when the buttoned-down eastern women of 1842 encountered a band of Lakota, then identified as Sioux, I wondered what the women might have thought of those bronzed warriors whose clothing left much of their chests and shoulders bare. What would the women walking west have thought? When I read the paragraph I had written to an elderly friend, she went to her desk and pulled out a letter from an ancestor who had crossed Nebraska, walked over South Pass, and gone on into Oregon. And that ancestor, in the 1850s, had said exactly what I had imagined. Sometimes the imagined past is as we conceive it to be because we have grasped the knowledge of time and place on which to set believable players in motion.

 

My desire in Lamentations was to hear what the women were thinking, and sometimes saying to each other, but within the context of that century, when much that was unorthodox could not be said aloud. I wanted to show how a group of people traveling together would get to know each other, rather as students in a class know that one was from Ohio and another played hockey. We do not know others fully, but only from the vantages we are given. I wanted to display how the women gained information, and then passed it along; how tragedies were dealt with; how personalities differed; and how, in the end, Jane matured. I wanted to bring women of different generations together, to show discord among sisters, to think about what was important when dismantling a home, and how women fit into the daily account of miles and weather and sometimes events kept by the company clerk. I wanted to explore what it was like to answer a longing for new beginnings, for a journey when one is the first to make it. I am interested in names and what they mean, in the landscape and how one travels through it. I wanted to hear the women speak when the records do not.

 

Historians need to be conscious of the audience they hope to reach, and perhaps can learn something about style and sense of place from the writers of historical fiction. Academic and local history can be told vividly; good history can have good narrative; and some historical fiction tells a story that a historian cannot. I have written this to praise historical fiction when it respects the line between our times and the past, when it adheres to the known truth and does not pervert it for excitement—or for book sales. I am grateful to Daniel Aaron, who thought historical fiction was worth taking seriously, and to all those writers who have brought the past alive in this form.

 

Fiction is not the only way to explore the past, but historical fiction can attract readers to wonder and speculate and then explore the past in other forms. A friend said that as a child, reading fiction of other times led her to read history and then become a historian. Aaron wrote that historical fiction gives “us something more than the historical argument.” It can bring alive an era, a person, a moment in time so that we meet the past as it was, not as we might want it to have been.


Wed, 31 May 2023 https://historynewsnetwork.org/article/185767
White House Speechwriter Cody Keenan on the Crucial 10 Days of the Obama Presidency

Cody Keenan (Photo by Melanie Dunea)

 

Other than being able to string a sentence together, empathy is the most important quality in a speechwriter. The ability or at least the attempt to understand your audience, to walk in their shoes for a little while, even if empathy will never be a perfect match for experience.—Cody Keenan, Grace

 


Ten days in June 2015 were some of the most intense during the presidency of Barack Obama. The president was awaiting US Supreme Court decisions on the fate of the Affordable Care Act and marriage equality. And, on June 17, a hate-fueled white supremacist shot to death nine African American worshippers at a historic church in Charleston, South Carolina.

Chief White House speechwriter Cody Keenan focuses on this extraordinary period in his revelatory and lively new book Grace: President Obama and Ten Days in the Battle for America (Mariner Books).

In response to this perfect storm of historic events, Mr. Keenan drafted memorable speeches and a heartfelt and now immortal eulogy for Reverend Clementa Pinckney and other victims of the Charleston violence. And that address moved beyond a eulogy with the president’s powerful plea for unity and reconciliation and his surprising segue as he led the congregation and the nation in singing “Amazing Grace.”

In Grace, Mr. Keenan recounts highlights of his career as a speechwriter as he describes the tumultuous ten days. The reader immediately senses the demands of working for a president who was himself the former president of the Harvard Law Review and among the most celebrated writers and orators of recent history. As Mr. Keenan puts it, “To be a speechwriter for Barack Obama is f---ing terrifying.” Mr. Keenan worked “to his limits” in his high-pressure position to provide President Obama with the best drafts possible. And it’s obvious from Grace that the two men were gifted collaborators who worked together with great mutual respect and admiration.

As he provides a behind-the-scenes perspective on White House operations, Mr. Keenan introduces key presidential aides such as Valerie Jarrett, Jen Psaki, Ben Rhodes, Jon Favreau and his speechwriting team. He also intersperses the book with the story of his romance with esteemed presidential fact-checker Kristen Bartoloni, who often challenged and corrected his writing. They married at the White House in 2016.

By 2015, President Obama had delivered more than a dozen eulogies for the victims of gun violence, including for those who died in the Arizona massacre in which Representative Gabby Giffords was seriously wounded, and for the 20 children and six adults murdered at Sandy Hook Elementary School in Newtown, Connecticut. Mr. Keenan wrote those eulogies as well as the president’s now famous speech honoring the fiftieth anniversary of the 1965 voting rights march in Selma and the peaceful protesters, including civil rights icon Representative John Lewis, who endured a bloody attack by police there.

Mr. Keenan writes powerfully of the pain and sorrow that he and the president experienced in addressing yet another mass shooting in June 2015, that time with the added dimension of racist violence. The description in Grace of the creation of the president’s address for the funeral of beloved Reverend Clementa Pinckney is a case study in collaboration in the speech drafting process.

During the same sad week, Mr. Keenan wrote statements for the president to deliver if the Supreme Court gutted the Affordable Care Act and ended marriage equality. We now know that those speeches on the Court decisions weren’t necessary. And the eulogy for Reverend Pinckney will be remembered as one of the great presidential addresses. Mr. Keenan concedes that this eulogy was his most difficult assignment after working on more than three thousand speeches for President Obama.

Mr. Keenan’s heartfelt and moving memoir Grace shows how a gifted president and his devoted team worked together tirelessly for a more fair, more tolerant, and more just nation.

Mr. Keenan is best known as an acclaimed speechwriter. He studied political science at Northwestern University and, after graduation, worked in the office of US Senator Ted Kennedy. After several years in that role, he earned a master's degree in public policy at the Harvard Kennedy School. He subsequently secured a full-time position with Barack Obama's presidential campaign in Chicago in 2008.

When President Obama took office in 2009, Mr. Keenan became deputy director of speechwriting in the White House. He was promoted to chief White House speechwriter during the president’s second term. He also collaborated with President Obama on writing projects from the end of his term in 2017 until 2020. He has said that he wrote his dream speech just four days before Obama left office—welcoming the World Champion Chicago Cubs to the White House.

Mr. Keenan is currently a partner at the speechwriting firm Fenway Strategies and, as a visiting professor at his alma mater Northwestern University, he teaches a popular course on political speechwriting. Today, he and Kristen live in New York City with their daughter, Grace.

Mr. Keenan graciously responded by email to a long series of questions on his new book and his work.

 

Robin Lindley: Congratulations Mr. Keenan on your engaging new book Grace, a revelatory exploration of your work as chief speechwriter for President Obama at an incredibly turbulent time. Before getting to that period, I wanted to ask about your background. You majored in political science at Northwestern University. What sparked your interest in politics?

Cody Keenan: Well, I enrolled at Northwestern as a pre-med student. I wanted to be an orthopedic surgeon after a football injury forced a knee reconstruction. Chemistry 101 weeded me right out, though. I just wanted to take biology.

But politics had always been an interest. My parents often argued about politics at the dinner table – my mom was a Kennedy Democrat from Indiana; my dad was a Reagan Republican from California – and whatever could make them so animated was something worth exploring. One value they both hammered into me, though, was the idea that I should do whatever I could to make sure more people had the same kind of opportunities I did growing up – and by the time I graduated from college, only one political party cared about that.

Robin Lindley: Did you have academic or other training in speechwriting?

Cody Keenan: No. Writing was something that always came naturally, and I think that came from being a voracious reader. I won every summer competition at the local public library. You can’t be a good writer without being a great reader.

Robin Lindley: You interned for legendary Senator Ted Kennedy after college. Did your duties in that role include speechwriting?

Cody Keenan: Not as part of the internship, or even the first position after that. Three months as an intern got me hired to answer his phones. I ended up working for him for almost four years in four different roles.

In 2004, when I was on his staff for the Committee on Health, Education, Labor, and Pensions, the Democratic National Convention was in Boston, his hometown. We all took a week off work to volunteer. I was on the arena floor the night that Barack Obama gave the speech that made him famous. He walked into the arena anonymous; he walked out 17 minutes later a global megastar. It shows you what a good speech can do.

Once we were back in Washington, I must have talked about that speech a lot, because that’s when my boss asked if I could write a speech. I don’t know if he meant did I have the time or did I know how, but it didn’t matter – I lied and said yes.

Robin Lindley: Senator Kennedy was known as a great legislator in the Senate who could work across the aisle. Did you work with him or his staff on any significant projects? What did you learn from that internship?

Cody Keenan: As an intern, one of my tasks was to read and route mail that came to the office. Perfect strangers were writing a senator – often one who wasn’t even their senator – to ask for help. There’s an act of hope involved in that. Even when it was a tough letter to read, even when you could see that the writer had wiped a tear from the page, they hoped that someone on the other end would care enough to help. I learned right away just how important this stuff is.

Later, as a staffer, I worked on all sorts of legislation. Kennedy was involved in everything. Health care, minimum wage, education, immigration, the Iraq War, the response to Hurricane Katrina, Supreme Court nominations – we were always busy. And with good mentors, I learned that just as important as the policy itself was often the way you communicated it.

Robin Lindley: What attracted you to working for President Obama during his first presidential campaign in 2007? Did you work as a speechwriter before his election?

Cody Keenan: Well, what struck me about that 2004 speech was that he described politics the way I wanted it to be – as this collective endeavor in which we could do extraordinary things that we couldn’t do alone. His only speechwriter at the time, Jon Favreau, called me early in the campaign and asked if I wanted to join the speechwriting team he was putting together. I said yes.

Robin Lindley:  What did you learn or do to prepare for work as a speechwriter for President Obama, one of our most celebrated American writers and thinkers even then? Did you go back and read works of some of the great White House writers such as Ted Sorensen, Bill Moyers, and Peggy Noonan? Did you read speeches by the likes of Lincoln, FDR, JFK, Churchill, and other memorable leaders?

Cody Keenan: I didn’t. I’d already read the canon of presidential hits, but to be a speechwriter for someone means writing for that specific person, helping him or her sound not like anybody else, but rather the best version of himself or herself.

Robin Lindley: I read that you didn’t personally meet President Obama until his first day at the White House in 2009. Yet, you had been working for him for a year and a half. What do you remember about your first meeting and your early days at the White House?

Cody Keenan: Yep – he visited Chicago headquarters maybe three times during the campaign. He was out campaigning! And when he did visit, it was for strategy sessions with his top aides and to address the entire staff at once, not to meet with his most junior speechwriter.

On our first day at the White House, he called me into the Oval Office because he’d seen my name at the top of speech drafts and he just wanted to put a face to the name. Those early days were drinking from a firehose: the economy was falling apart, millions of Americans had lost their jobs and their homes in just the four months before he took office, and millions more would in the first few months after. There was no honeymoon; we were busy trying to turn that firehose onto the fire.

Robin Lindley: Did you immediately start as a speechwriter once President Obama began work at the White House?

Cody Keenan: I did.

Robin Lindley: How does one prepare for a job that requires knowing the voice and propensities of the person they are writing for?

Cody Keenan: Well, I had a year and a half foundation from the campaign. I’d read his books to absorb his worldview, listened to the audio versions to absorb his cadence, and paid close attention to his edits. He was a writer. He was our chief speechwriter. And he was our top editor. I learned a lot just by poring over his edits to our drafts.

Robin Lindley: How did your relationship with President Obama evolve over his eight years in office? You wrote that working for this acclaimed writer could be terrifying. It seems he offered good advice to you such as having a drink and listening to Miles Davis or John Coltrane. Or reading James Baldwin. Did you see him as a kind of coach or mentor?

Cody Keenan: I was the junior writer on the team for the first two years, sitting across the driveway in the Eisenhower Executive Office Building. Then a series of high-profile speeches got me promoted to deputy director of speechwriting, and I moved into a West Wing office with Jon Favreau. Once he left after the second inaugural, I took over as chief speechwriter. So naturally, our relationship evolved – I went from seeing Obama every couple weeks to every week to every day.

I saw him as my boss. I guess as a writing coach of sorts. And sometimes even as an uncle or older brother who loved to dispense advice. He hosted my wife and our families and our best friends at the White House on our wedding day. It was his idea. He didn’t have to do that.

Robin Lindley: Are there other bits of President Obama’s advice that stick with you?

Cody Keenan: “Don’t impart motives to people.” That’s advice we could use more of.

Robin Lindley: Indeed. A big question, but can you give a sense of the speechwriting process? What sparks the process? Who is involved? What’s it like to collaborate with a team of writers and other staff?

Cody Keenan: He viewed speechwriting as a collaboration. He just wanted us to give him something he could work with. We wrote 3,477 speeches and statements in the White House, and believe it or not, he edited most of the speeches, even if lightly. But he couldn’t be deeply involved with all of them.

For any speech of consequence, though, we’d start by sitting down with him and asking “what’s the story we’re trying to tell?” Then the speechwriting team would talk over each speech, helping each other get started. Then we’d all go back to our own laptops and draft whatever speech we’d been assigned. The drafting was not a collaborative process. The revising was – with each other, but more importantly with him.

Robin Lindley: What’s the fact checking process for a speech draft before it goes to the president? It’s interesting that your future wife Kristen was one of the very diligent fact-checkers you relied on.

Cody Keenan: Yeah, she literally got paid to tell me I was wrong. Every day. For years. It was her team’s job to fireproof the president – to make sure he never said something he shouldn’t, with someone he shouldn’t be with, at a place he shouldn’t be visiting. They prevented countless alternate timelines where we’d have to do some cleanup in the press. They saved us from ourselves again and again.

Robin Lindley: Congratulations on your marriage to Kristen with the magnificent White House wedding. Your blossoming romance runs like a red thread through your book. You note that President Obama would stay up late at night to review and edit drafts of speeches he would give the next day. And you often received late night calls from him or met with him in the wee hours. How did those final hours work with a speech? It seems the president would often edit to the time of delivery.

Cody Keenan: He always edited in the wee hours of the morning. It’s when he preferred to work. It was rare that we were editing right up until delivery. If we were flying somewhere for a speech, he’d always go over it one or two final times on the plane. But he didn’t like chaos. In fact, the reason he edited so heavily, so often, was because he wanted the speech exactly the way he wanted it. Sometimes it was perfectionism. But it’s really just preparation.

Robin Lindley: What did you think when the president ad libbed or changed something from your draft as he spoke? I think you said something to the effect that he was a better speechwriter than all of his writing staff.

Cody Keenan: I loved it. I can’t think of a time I cringed at an adlib. He had a knack for it. It could be a little white-knuckled if he did it at the end of the speech when there’s no text for him to come back to. In that case, he’d have to build a new runway while he was speaking on which to land the plane.

Robin Lindley: When does humor come into the mix? Do you write for events such as the White House Correspondents Dinner? President Obama had some zingers for his eventual birther successor at these events.

Cody Keenan: Those were our most collaborative sets of remarks. The entire team would pitch jokes, and we’d reach out to professional comedy writers to solicit their help. We’d start out with about 200 jokes and whittle them down to the 20 funniest. Sometimes, none of your jokes would make the cut. You’ve got to have a thick skin.

Robin Lindley: And you and the other speechwriters did not use a template such as this speech is on the economy or this speech is political, so we’ll use the file template X or Y. You were responsible for more than three thousand speeches, yet it seems each speech was approached as a unique project.

Cody Keenan: Yes and no. We never used a template. But while each individual speech should tell a story, so should all speeches. What I mean by that is, we were mindful that every speech we wrote fit into a longer narrative arc – both of his presidency and his entire political career.

Robin Lindley: You worked for the president through his eight years in office. How did you come to focus in Grace on ten days in 2015, as the president dealt with the horrific mass murder of nine Black parishioners by an avowed white supremacist at Mother Emanuel Church in Charleston, South Carolina? The president then also was preparing to address two impending Supreme Court decisions that would determine the fate of the Affordable Care Act and marriage equality.

Cody Keenan: Yeah. People will remember all of the stories and all of the events in this book. They won’t remember that they all happened in the same ten-day span. I mean, that in and of itself is a story that demands to be told. In addition to a massacre carried out by a self-radicalized white supremacist, there was a very real chance that the Supreme Court would say no, people who work two or three jobs don’t deserve help affording health insurance; no, gay Americans don’t get to get married like the rest of us; all of those people are now second-class citizens. And the first Black president has to serve as the public narrator and provide some moral clarity for all of this.

Someone once described it as ten days too implausible for an entire season of The West Wing. But it’s also what those events symbolized and how they fit in the broader, centuries-long story of America – whether or not we’re actually going to live up to the ideals we profess to believe in. Whether we’re going to stand up to white supremacy, and bigotry, and people who profit from inequality and violence. And that week, the answers were all “yes.”

Robin Lindley: With the Charleston massacre, the president had to address another mass shooting, and he was tired of giving eulogies after the murders at Sandy Hook and all of the other heartbreaking mass shootings during his term in office. How was his speech at Mother Emanuel Church different from previous addresses? What was your role in creating this memorable speech? How did the speech go beyond a eulogy to become a message of reconciliation?

Cody Keenan: We had done over a dozen eulogies after mass shootings at that point. And this goes back a few years: the shooting in Newtown, Connecticut, where 20 little kids were murdered in their classrooms along with six of their educators, came right after he’d been reelected.

And he put aside his second-term agenda right out of the gate to try to do something about guns, because what an abdication of leadership it would have been if he didn’t. And he had a little boost from Joe Manchin and Pat Toomey, an arch-conservative from Pennsylvania with an A-rating from the NRA. They both had one. They decided to work together on a background checks bill. And even though we knew the odds in the Senate would be long, that gives you something to try for. And so we traveled the country for a few months. He made it a centerpiece of his State of the Union address. Big, emotional, powerful ending. And in the end, in April, Republicans blocked a vote on it with the parents of the Newtown kids watching from the gallery.

And that’s about as cynical as I’ve ever seen Barack Obama. Yet he went out and spoke in the Rose Garden with those families. I handed him a draft of the speech and he said, look, I’m going to use this as a template, but I’m just going to wing it. And he came in after that speech into the outer Oval Office, which is this room just off the Oval where his assistants sit, and he was almost yelling once the door closed. He said, “What am I going to do the next time this happens? What am I going to say? I don’t want to speak. If we’ve decided as a country that we’re not going to do anything about this, then I don’t want to be the one who closes the cycle every time with a eulogy that gives the country permission to move on.”

Ultimately, we did decide to do a eulogy after Charleston, and it was his idea to build the structure of the speech around the lyrics to “Amazing Grace.”

Robin Lindley: I think everyone was surprised and moved when President Obama sang “Amazing Grace” during the Charleston speech. Were you surprised or was that part of the plan for the speech?

Cody Keenan: That, too, was his idea. He told me on Marine One that morning that, if it felt right in the arena, he might sing it.

Robin Lindley: You now teach speechwriting at your alma mater Northwestern University. Do you have any other advice for prospective speech writers?

Cody Keenan: It’s fun, training a new generation of speechwriters and trying to convince them that public service is worth it. What I didn’t expect was that my students would end up teaching me quite a bit in return. There’s an impatience to their generation that mine didn’t have to have. Politics and the pace of change is now existential for them in a way it hasn’t been since schoolkids were doing duck and cover drills during the Cold War. They’re doing those duck and cover drills again because of guns. They can see an end to their future because of climate change.

And let me tell you, when they see a party more into policing books than policing assault weapons; when they see a party more exercised about drag queens than about climate change – they feel a real disdain there. I want them to harness it, though, in a productive way. And part of that means telling them the truth. To tell them that change has always taken time isn’t fun. To tell them that they’re not always going to win isn’t fun. To tell them that even when they vote in every election, they’ll never elect a leader who delivers everything they want. Well, that’s just not inspiring. But it’s also true.

Nobody ever promised us these things. That’s democracy. But here’s the thing about democracy: we get to refresh it whenever we want. Older generations aren’t entitled to their full tenure. So, while I counsel patience and realism, I also fan the flames of their impatience and idealism. I tell them to join a campaign now, to start an advocacy group now, to run for office now. Stay at it not just until the people in power are more representative of what America actually is, but until they’re the ones in power themselves. Then make the system your own. Faster, smarter, more responsive to the needs of a modern, pluralistic democracy. And one way to do that is through my cardinal rule of speechwriting: help more leaders talk like actual human beings.

Robin Lindley: You also continue to work as a speechwriter and you note that you worked with President Obama after his tenure in office. Did you consult with the president on writing projects such as his monumental memoir Promised Land?

Cody Keenan: I worked for him full-time for four years after we left the White House, ultimately leaving after the 2020 election so that I could devote my time to writing Grace.

Robin Lindley: What sorts of clients do you work with as a speechwriter now?

Cody Keenan: All kinds. Progressive candidates, nonprofit, academic, and corporate. Our rule is that each client has to be putting more into the world – hopefully much more – than it’s taking out. But the best part of it is to be surrounded by a team of idealistic young speechwriters again. I missed that over the four years after the White House.

Robin Lindley: Would you consider working with a president at the White House again?

Cody Keenan: Maybe. Depends on who it is. For a speechwriter, it really, really depends on who it is. Speeches require a deeper relationship than a lot of other staff positions. But I’m also older and have a young daughter. Both of those things make the grind of the White House much less attractive.

Robin Lindley: It seems we’re more divided now than during the Obama years. I never thought I’d see Nazi rallies in America in the 21st century. Where do you find hope for our democracy at this fraught time?

Cody Keenan: My students. While politics as it is may make them cynical, they’re not cynical about America and its possibilities. Somehow, they’re not as plagued by fear or suspicion as older generations; they’re more tolerant of differences of race, culture, gender, and orientation, not only comfortable navigating all these different worlds but impatient to make them all fairer, more inclusive, and just plain better. They’re consumed with the idea that they can change things. They just want to do it faster.

Robin Lindley: Is there anything you’d like to add for readers about your book or your work?

Cody Keenan: You’re going to love Grace. I wrote it because it’s a hell of a story and it’s the most intimate look at Obama’s approach to speechwriting that exists.

But I also wrote it, as I told Stephen Colbert when he had me on, to blow up people’s cynicism about our politics. Because politics isn’t some rigid system we’re trapped under. It’s us. It’s only as good as we are. That’s why I was so happy when Obama called it “an antidote to cynicism that will make you believe again.”

But I was just as happy to read a review that described it this way: “Grace is a refreshing departure from the flood of scandalous ‘literary’ flotsam that typically washes up in the wake of the transfer of power. This book might not make breaking-news headlines, but it just might restore a little faith in the presidency and the backstage men and women who work around the clock to fulfill the chief executive’s promises to the American people.” The publicist at the publishing house didn’t love the part about “breaking-news headlines,” because that’s what sells books – but I was proud to write it the way I did. There’s no sleazy tell-all in this book, but there are a bunch of great never-before-told stories about what it’s like to sit alone with Obama and unlock the right words for a fraught moment.

Robin Lindley: Thank you Cody for your generosity and thoughtful comments. Your book captures the reality of work in the tense and often exhilarating environment of the White House with a president who was devoted to creating a more just and tolerant nation. Best wishes on your continuing work and congratulations on Grace.

 

Robin Lindley is a Seattle-based attorney, writer, illustrator, and features editor for the History News Network (historynewsnetwork.org). His work also has appeared in Writer’s Chronicle, BillMoyers.com, Re-Markings, Salon.com, Crosscut, Documentary, ABA Journal, Huffington Post, and more. Most of his legal work has been in public service. He served as a staff attorney with the US House of Representatives Select Committee on Assassinations and investigated the death of Dr. Martin Luther King, Jr. His writing often focuses on the history of human rights, social justice, conflict, medicine, visual culture, and art. Robin’s email: robinlindley@gmail.com.

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/blog/154750 https://historynewsnetwork.org/blog/154750 0
Dangerous Records: Why LGBTQ Americans Today Fear the Weaponization of Bureaucracy

Prisoners at Sachsenhausen concentration camp wear triangle badges indicating the nature of their offenses against Nazi social code (pink would indicate homosexuality). National Archives and Records Administration, 1938.

 

 

The recent rise of far-right political movements in the United States and globally has prompted historical comparisons to the Nazis. The atrocities committed by the Nazis have been studied widely, particularly in reference to the Jewish victims of the Holocaust, but it is also important to understand lesser-known victims and the ways that prior discrimination shaped their persecution. Focusing on the pre-war period, it is crucial to understand how the Nazis relied on bureaucratic information to know whom to target, especially when the classification was not an obvious ethnic or religious one (as with assimilated and secular Jews, or with gay men, lesbians, and others persecuted for gender or sexual behavior). Today, there are important lessons to learn about the dangers that bureaucratic information gathering, combined with escalating prejudice and vilification, could present.

The rise of the Nazi party in Germany also brought several laws restricting access to literature and governing the treatment of what we today would call LGBTQ+ people. Paragraph 175, a law criminalizing sexual relationships between men, was established in 1871, but the Nazi party revised it to broaden the range of acts that could be punished. Queer men were targeted early in the Nazi regime, which placed heavy blame on them for the loss of the First World War; Nazi ideology justified discrimination and repression by claiming that a lack of masculinity was a contributing cause of the country’s downfall and economic depression. Though only half of the 100,000 men arrested for the alleged crime of homosexuality were prosecuted, this figure is still large enough to raise an important question: how did the Nazis know whom to target, and where was the information coming from? Political factors appear to be involved, because a majority were prosecuted within six weeks of Heinrich Himmler’s assumption of control of internal security in 1943. Each man came to the authorities’ attention in one of a few similar ways: a report by a private individual, a police raid, or the “Pink List.”

The practice of gathering information about members of minority groups by bureaucratic organizations has a startling history of being used for oppressive ends, particularly by the Nazis. A clear example is the Nazis’ use of the “Pink List,” a registry held by the police, compiled from the records of support organizations such as the Scientific Humanitarian Committee and from reports by private individuals. The Scientific Humanitarian Committee aimed for “Justice Through Science” and espoused the biological theory of homosexuality: the idea that sexuality is an innate biological feature rather than a mark of weakness or psychological deviance. The SHC was targeted early in Hitler’s rise because it advocated for homosexuals. The committee kept lists of homosexual Germans for support and scientific purposes, but those lists were seized by the Nazis and then used to target the men named on them.

A clear example of the danger that could befall a young gay man who interacted with police on any other matter is the story of Pierre Seel. Seel arrived at his local police station to report a stolen watch and, when questioned about the circumstances, revealed that he had come from Steinbach Square, a well-known hangout for gay men seeking each other's company. After intense questioning, he was released and assured that nothing would come of the compromising information, but three years later he was arrested as a suspected homosexual because of the list he had been placed on after leaving the police station. That list was compiled by police and security forces over the years and augmented by confessions extracted from imprisoned gay men, who were raped and tortured to compel them to add names to it. The Pink List is a clear example of how dangerous information that categorizes someone as a member of a minority group can be, particularly in the hands of those in power with ill intentions.

While the Holocaust is an unmatched and exceptional example of cruelty and systematic persecution of social outgroups, it is nevertheless important, even crucial, to recognize similarities between those events and the present, especially where prejudices join with bureaucratic state power. Today, transgender Americans are being framed as deviants, accused of undermining traditional gender roles, and described as “groomers” and child sex abusers. Armed vigilantes have harassed people attending drag performances, and activists are seeking to remove books about gender and transgender experiences from schools and libraries. When the power of the state aligns with these expressions of prejudice, and with the identification of outgroups as a threat to children, family, and society, there is real cause for concern.

Anti-LGBTQ sentiment has been particularly vociferous in Texas. Texas Attorney General Ken Paxton’s recent request for a list of individuals who have changed their gender on state-issued driver’s licenses and other departmental documents bears concerning similarities to the “Pink List” compiled by Nazi officials in 1930s Germany. The request itself made transgender Texans subjects of surveillance, implying the state views them as dangerous. According to an email sent on June 30, 2022 by Sheri Gipson, chief of the DPS’s driver license division, the Attorney General’s office “wanted ‘numbers’ and later would want ‘a list’ of names, as well as ‘the number of people who had a legal sex change’.” This first request produced over sixteen thousand results. It proved difficult, however, for state agencies to meet the request. One issue involved gender changes made to correct filing mistakes (a cisgender person’s gender was accidentally recorded inaccurately, and the change affirmed their identity). A subsequent attempt narrowed the data to court-ordered document changes only, which would identify transgender people specifically. Although the agency could not accurately produce this data, the episode, alongside laws being introduced throughout the state such as the prohibition of gender-affirming care and limits on LGBTQ+ lessons in school, raises the startling question of what damage such information gathering could do, not only now but years from now.

The weaponization of personal information available to state organizations should not be taken lightly. It has presented, and will continue to present, danger to those targeted by the state as threats. Laws targeting transgender children by restricting their access to gender-affirming care or to affirming ideas in books have become commonplace in several Republican-led states, but an explicit attack on adults raises the question of where this will stop and who will stop it. These laws send a clear message that the right does not want transgender people to have a presence in society, in everyday life or in the media. The proposed laws restricting gender-affirming care, classifying parents of transgender children who receive such care as child abusers, limiting LGBTQ+ lessons in school, and banning books and media that showcase queer people attempt to erase the queer experience from modern life as well as from history.

All of these efforts depend on being able to identify those who are not living with the gender assigned to them at birth. Bureaucratic records may not be considered dangerous by the public, but the ability of government officials to access the records of those whose place in society they are seeking to erase can lead to dangerous consequences in the future. Other vulnerable groups will be targeted, and it is necessary to examine the historical implications and repercussions of the blatant targeting of these groups.

 

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/article/185765 https://historynewsnetwork.org/article/185765 0
150 Years of "Zero-Sum Thinking" on Immigration

Last week Title 42, a Trump-era policy that has limited immigration for the last three years, expired. Still, the Biden administration warned people arriving at the border that “the border is not open” and anyone arriving there would be “presumed ineligible for asylum.” In conversation, Dr. Carly Goodman revealed the 150-year-old history behind the US government’s restrictionist stance.

Specifically, Dr. Goodman explored this history through the lens of the Diversity Lottery. Not by coincidence, Dr. Goodman is the author of Dreamland: America’s Immigration Lottery in an Age of Restriction. She’s also a Senior Editor of Made by History at the Washington Post, which provides fantastic daily commentary from the nation’s leading historians.

A condensed transcript edited for clarity is below.

Ben: Dr. Goodman, thank you so much for being here.

CG: Thank you, Ben. 

Ben: Today I'd like to explore the history of the Diversity Visa as part of a broader exploration of US immigration history writ large.

Before we go back in time, could you please just give a quick overview of what the lottery is? 

CG: Sure. The Diversity Visa Lottery has been part of our immigration policies and laws since the Immigration Act of 1990. It's an annual lottery, open to almost every country. People from eligible countries can put their names in to register for the lottery, and if they are selected, they can then apply to become lawful permanent residents of the US. 

The first lottery was held in June of 1994, and it remains one of the very few pathways to legal status in the US. It's restrictive in some sense—you still have to apply for the visa and fit qualifications like having a high school diploma or its equivalent—but also much more expansive than many parts of our immigration system.

Ben: I think that’s a good segue into exploring the system's restrictive nature, beginning in the 1870s. What were the first immigration restrictions, imposed at that time?

CG: I’ll mention that my colleague, historian Hidetaka Hirota, has written about state-level restrictions prior to the imposition of federal immigration controls.

However, when the US federal government started to think about imposing regulations on immigration, it began by excluding almost all Chinese immigrants in the 1880s, who were seen as competing for work and land in the American West. This set an enduring pattern wherein immigration would be racialized.

Ben: The next big evolution in immigration policy occurred in the 1920s. What happened then?

CG: This time period is really relevant to the rise of the Diversity Lottery later on.

In the early 20th century, eugenicists looked around at growing numbers of immigrants from Europe—Italians, Poles, Jews (including my ancestors), for example—and they really didn't like how the American nation (as they perceived it) was changing.

So, officials came up with national origins quotas that imposed severe numerical restrictions on the entry of people they deemed undesirable, especially southern and eastern Europeans (as well as Asians), who were seen as almost a contagion on the nation.

The national origins quotas were explicitly eugenic in nature, and they remained in place from 1924 until a major immigration reform in 1965 finally dismantled them. The Immigration Act of 1965, also known as the Hart-Celler Act, instead emphasized family ties as one of the main ways to legally migrate.

Ben: You write that the shift toward family ties wasn’t purely altruistic.

CG: No, in some ways it was a compromise meant to mollify bigots who hoped that prioritizing family ties would lead to primarily white family members joining their relatives in the States.

Ben: Related, you quote the famous (but problematic) historian Arthur Schlesinger Jr. who worried that the arrival of different immigrant groups in the US might “shift the balance from unum to pluribus.”

To continue speaking in Latin, did Schlesinger’s ad nauseating fear come to fruition?

CG: Well, in addition to creating the family ties route to becoming citizens, Hart Celler imposed the first numerical restrictions on immigration from the Western Hemisphere. There’d long been migration from Latin America, both because the US destabilized many countries there, leading people to leave, and because of the need for workers here.

After 1965, Latin Americans who’d been coming to the US were still coming, but now they ran up against numerical limits. As historian Mae Ngai discusses in her work, Hart-Celler thus created the problem of undocumented immigration. Some would say that's one of the most important legacies of the act.

Ben: Moving into the 80s, how did the Irish defy the growing conceptions of illegal immigration, and what reforms did they push for?

CG: There's a long, storied history of Irish immigration to the US. For example, I live in Philadelphia, and we have a vibrant Irish-American community here.

Ben: The Philly cheese steak is famously an Irish creation.

CG: Um, it's closer to Italian.

Ben: ...right.

CG: Anyway, that sense of heritage was foremost on Irish immigrants' minds in the 80s. They felt the injustice of having to leave Ireland amid an economic crisis, just as their grandparents had, but encountered the added injustice of restrictions on their access to the US. Many Irish people came as visitors and overstayed their visas to try and find work. They were undocumented and white, contrary to the more racially motivated stereotypes of people without legal status that burgeoned in the 70s.

Congress, meanwhile, had been working on passing immigration reform. In 1986, legislators passed bipartisan reform that combined new enforcement measures with a couple of legalization programs to help people gain status and a path to citizenship.

Most of the Irish hadn’t arrived in time to qualify for the legalization, so members of the Irish communities in major cities got together to try to get legislation passed that would help them out. Basically, they identified as their problem the Immigration Act of 1965, which had reduced the number of visas available to them compared with the laws from the 1920s.

But it wasn’t cool to say, let’s bring back the eugenicist quotas that privilege white people. Instead, congresspeople close with the Irish community—Brian Donnelly and John Kerry from Massachusetts, for example—began asking, what if we could create pathways for countries that get very few visas these days? Countries like, oh, I don't know... how about Ireland?

There were all kinds of proposals for how to do this, but they came up with a lottery because it was the cheapest way to administer it. They opened it up to all countries in the world that had sent fewer than 50,000 immigrants to the US in the previous five years.

That’s how the Diversity Lottery began.

Ben: And surprisingly, African countries, long ignored or excluded in US immigration policy, maybe benefitted the most from the Irish-led reform, is that right?

CG: Exactly. The lottery began in 1994. The following year, 6.5 million entries from around the world vied for just 55,000 visas. 

I first learned about the lottery by speaking with people in places like Ghana, Nigeria, and Cameroon. It seemed to foster a sense of admiration for the US and for its openness and diversity. In some ways, the lottery format, relying on chance, disrupted people's perception that they were being turned away from the US because of their African passports and a racist system.

Ben: At the same time, you point out that when a person from an African country was lucky enough to win the lottery, they then encountered racism in the US. It’s like: you pack your bags, ready to embark on new life, and then you have to face all of the US' own baggage.

CG: Yep, and the lottery aside, the 90s turned out to be a time of more immigration restriction, not less. Levels of nativism reached points not seen since the early 20th century, and politicians on the state and federal levels began to see what anti-immigrant demagoguery could do for them. Even policymakers who were supposedly pro-immigration, like Bill Clinton, were relying on and expanding a growing system of immigrant detention.

After 9/11, restrictions only intensified. Under George Bush, the government began to view immigration as a threat. More and more money was put into border militarization and enforcement. 

Ben: Bringing us into the present day, you talk about how Obama and then Biden effectively maintained continuity with the past in terms of restrictive immigration procedures. Biden of course struck down Trump's African and Muslim travel bans, but he's also kept in place lots of Trump’s policies at the border.

How do you view the lottery within this still-restrictive context?

CG: Well, there’ve been efforts to dismantle the lottery over the last 20 years, and a lot of critics’ arguments are really built around zero-sum thinking; around the idea that this was a weird policy created for the Irish, and we’re already pretty diverse, so can’t we use the visas for something better?

But, that’s zero-sum thinking. As it turns out, we could just create more visas for more people. This leads to one of the central points I’m trying to make: Since the 1870s, we’ve had a restrictionist, gatekeeping system, but it’s possible to widen access if we want to. 

The thing preventing us, as it’s always been, is racism. When Donald Trump called for the lottery to be replaced with a system that would be based on what he calls “merit,” he meant white people (which he clarified). Policymakers know that any reform to end the lottery would diminish the number of visas available to Africans and limit one of their few legal pathways to coming to the US.

So, I study the lottery because it’s a part of our immigration system that we really never hear about, and it just works. It's operated fine for thirty years. I don't want to say that the lottery is a good thing for the people who feel that they have no choice but to enter, but I know that more inclusion and more access serve our communities in so many ways, despite our government’s best attempts to limit migration for the last 150 years.

Ben: A good concluding note. I might suggest that your book be called Dreamland: A Little More Pluribus, A Little Less Unum.

CG: Ha!

Ben: Thank you so much for your time today, Dr. Goodman. This has been a pleasure.

CG: Thank you for having me.

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/blog/154749 https://historynewsnetwork.org/blog/154749 0
The Mexican War Suggests Ukraine May End Up Conceding Crimea. World War I Suggests the Price May Be Tragic if it Doesn't

"American Army Entering the City of Mexico" by Filippo Costaggini, 1885. Architect of the Capitol.

 

In April 1846, the United States invaded Mexico after a highly disputed incident at the border. Freshman Congressman Abraham Lincoln challenged President James Polk’s account of Mexican provocations as misleading and demanded to know the “spot” where they supposedly took place.

None of the major European powers got involved on either side. Great Britain remained officially neutral during the war, although it objected to an American blockade that interfered with its trade with Mexico. France was uninvolved but insisted that Mexico remain an independent nation.

By September 1847, American forces had captured the Mexican capital and forced surrender. An overwhelmed Mexico signed the 1848 Treaty of Guadalupe Hidalgo, ending the war and transferring to the United States over half of its territory, including modern-day California, Nevada, Utah, and most of present-day Colorado, New Mexico, and Arizona. Mexico was also forced to drop its claims to the former Mexican province of Texas and accept the Rio Grande as the new border between the countries. In return, the United States paid Mexico a consideration of fifteen million U.S. dollars, worth between 500 and 600 million dollars in today’s money.

Mexico is never going to receive its stolen territory back. The annual economy of California today alone is $3.5 trillion, approximately three times that of Mexico.

Fast forward to 1913, when Europe was divided into two military alliances. The Triple Alliance (Germany, the Austro-Hungarian Empire, and Italy) faced off against the Triple Entente (Great Britain, France, and the Russian Empire). During the war, the first bloc fought as the Central Powers, joined by the Ottoman Empire and Bulgaria, while the Entente was later joined by the United States and by Italy when it changed sides. The alliances provided some stability in Europe, much like NATO and the Warsaw Pact did during the Cold War, but they also set the conditions for the situation in Europe to spiral rapidly out of control.

On July 28, 1914, Austria-Hungary invaded Serbia after the assassination of the Austrian Archduke in Sarajevo, capital of Bosnia, which had been annexed by Austria-Hungary in 1908. The assassins hoped to liberate Bosnia and Herzegovina from Austro-Hungarian rule. On August 8 Montenegro joined in the defense of Serbia, and on August 9 Russia, an ally of Serbia, attacked German and Austro-Hungarian positions. Meanwhile, Germany invaded Belgium, bringing France and Great Britain into the war. In the east, Russia collapsed, but in the west the two alliances stalemated. The war dragged on until the German collapse in the fall of 1918. Military and civilian casualties during World War I, counting both deaths and injuries, reached an estimated 40 million people. The punitive treaty that ended the war became an underlying cause of World War II and the deaths of another 80 million people.

Fast forward again, this time to more recent decades. With the collapse of the Soviet Union, Ukraine and Russia became independent countries, with the former Soviet Black Sea naval base now located in Ukraine, Crimea having been administratively transferred from Russia to Ukraine in the 1950s. In 2014, a Ukrainian government allied with Russia was replaced by a westward-leaning government, and Russia seized Crimea and its warm-water naval base in violation of international agreements, established after World War II, protecting the territorial integrity of nations. In response, western nations placed economic sanctions on Russia, and NATO expanded eastward and considered admitting Ukraine into the alliance. Russia responded by invading Ukraine with the goals of putting a friendly government into power there and annexing territories on the Russian-Ukrainian border. The invasion stalled when NATO, including the United States, armed the Ukrainian military with modern weaponry more sophisticated than that used by Russian forces. It is now a war between NATO and Russia, although still a limited one, not just a war between Ukraine and Russia.

Ukrainian President Volodymyr Zelensky continually pressures NATO and the United States to provide Ukraine with more advanced weaponry. NATO has already agreed to deliver tanks, anti-missile systems, drones, and rockets, but Zelensky wants fighter jets that will allow Ukraine to shift from a defensive war and attack targets deep inside Russia.

The United States and NATO face a serious dilemma. They are committed to supporting Ukraine and preserving its national integrity, but Zelensky is demanding that Russia return all occupied territory, including Crimea, and pay reparations to rebuild areas of Ukraine that were destroyed by the Russian invasion, demands that Russia will never accept. Russia will not return Crimea to Ukraine, just as the United States will never return California to Mexico.

If NATO and the United States deliver jet fighters and Ukraine uses them to attack Russian targets, including cities, the world faces an escalating domino effect similar to the one that started World War I and led to World War II. That is why, as a historian, I am deeply worried about events playing out in Ukraine. The only peaceful resolution that I see is Ukraine agreeing to accept Russian control over Crimea and some of the disputed border areas in exchange for the NATO alliance rebuilding areas destroyed by the war. NATO and Russia will then have to resolve their own differences, but I am not hopeful they will find an amicable solution.

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/article/185766 https://historynewsnetwork.org/article/185766 0
The "Critical Race Theory" Controversy Continues

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/article/177258 https://historynewsnetwork.org/article/177258 0
The Right's Political Attack on LGBTQ Americans Escalates

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/article/182937 https://historynewsnetwork.org/article/182937 0
Mifepristone, the Courts, and the Comstock Act: Historians on the Politics of Abortion Rights

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/article/181169 https://historynewsnetwork.org/article/181169 0
The Roundup Top Ten for May 25, 2023

Why Historians Love Comparing Themselves to the Great Detectives

by Carolyn Eastman

The best point of comparison between Holmes and historian isn't in solving the case but in the struggle to make sense of the facts. 

 

Hollywood Strikers Carry the Legacy of Ned Ludd

by Gavin Mueller

Our techno-utopian society holds the Luddites in low regard, but their actual history helps explain what's at stake in the screenwriters' strike and any labor conflict where new technology threatens workers' livelihoods.

 

 

Republican Push for More Capital Punishment Echoes Crime Panic of the 1980s

by Duncan Hosie

The Supreme Court decision in 1976 that allowed the states to resume executions coincided with a rise in anxiety over crime and pushed politicians to pledge more executions. 

 

 

After Dobbs, Abortion Politics are Straining the Republican Coalition

by Daniel K. Williams

When the party could focus on appointing anti-Roe judges, the Republicans could make abortion a political issue without having to decide matters of policy that inevitably leave parts of their coalition angry and disappointed. Have they lost by winning? 

 

 

"Return to Rigor" Isn't the Answer to Restoring Student Engagement

by Kevin Gannon

A post-COVID reaction to the improvisations made on grades, schedules, and deadlines supposes that students are suffering from too much flexibility, but a singular focus on rigor won't address the causes of disengagement.

 

 

How to Fight Back Against the Right's "Parents' Rights" Moral Panic

by Jennifer Berkshire

Parents' fears about losing control over their children have been the raw material for potent politically-motivated moral panics for a century and more. But those panics aren't irresistible, because parents everywhere still value public schools as democratic community institutions.  

 

 

Trump and DeSantis: Two Peas in a White Nationalist Pod

by Clarence Lusane

Any Republican candidate will need to lean in to the politics of white Christian nationalism ascendant on the right; Trump has needed the MAGA movement as much as it's needed him. 

 

 

"Salts" are Part of Labor's Fight to Organize. They were once Part of the Antiwar Movement

by Derek Seidman

Taking a job with the covert intention of organizing the workplace is a time-honored labor tactic that's back in the news. Some dedicated activists in the 1960s "salted" the U.S. military in the hopes of building an antiwar movement within the ranks. 

 

 

Coca-Cola Can't Go Green While Selling Drinks Cold

by Bart Elmore

If the worldwide beverage giant wants to reduce its carbon footprint, it's time for it to reverse its historical commitment to make its drinks available cold—in electric coolers—across the globe.

 

 

The Writers' Strike Opens Old Wounds

by Kate Fortmueller

The plot of each sequel of negotiations between the producers and writers has followed a formula of compromise for mutual self-preservation. Technological advances have convinced studio heads that they no longer need the labor of writers enough to keep compromising. 

 

https://historynewsnetwork.org/article/185763
Texas Judge Revives Anthony Comstock's Crusade Against Reproductive Freedom

 

 

 

In April, a Texas judge ruled invalid the Food and Drug Administration’s approval of a pill used in over half the abortions in America.  Going further, he invoked the federal Comstock Act to declare it “nonmailable.” Twenty Republican Attorneys General promptly warned pharmacy chains to halt its sale.  Such sales would violate a law initiated 150 years ago by a Connecticut farm boy turned dry goods salesman beginning his battle against reproductive rights.

 

From an early age, Anthony Comstock showed his moralistic zeal.  At eighteen, he broke into a gin mill near his family’s farm and drained the liquor onto the floor. Enlisting after Gettysburg, he fought his fellow soldiers’ vices – liquor, lust, swearing, breaking the Sabbath – as vigorously as the Confederates.  Moving to New York, he futilely tried to jail a smut dealer loaning obscene books to schoolboys.

 

The “hydra-headed monster” of smut is where he made his first big kill.  On March 2, 1872, he and a police captain raided booksellers along Manhattan’s Nassau Street, the heart of America’s smut industry.  In one shop, he purchased The Confessions of a Voluptuous Young Lady of High Rank. In others, he bought Women’s Rights Convention and La Rose d’Amour.  Evidence in hand, the pair secured warrants from a judge who agreed the books were obscene.  Returning to Nassau, they arrested eight culprits and confiscated five bushels of obscene merchandise.

Later that month, Comstock targeted a crime catering more to women, one he considered an immeasurably greater evil.  Smut merely inspired lust.  This crime enabled it.  His specific target was a man, Dr. Charles Manches.  But the services Manches offered helped women overcome the safeguards God had built to control their passions:  the fear that could make a woman on the brink stop and preserve her chastity.

 

Manches advertised his “French Imported Male Safes” as “a perfect shield against disease or conception.”  For ladies wishing to take matters into their own hands, he offered “Ladies Protectors,” commonly known as womb veils.  If those devices failed to prevent pregnancy, he promised “Ladies Cured at One Interview, with or without medicine, $5.”  He was one of over a hundred abortionists in the city, according to the New York Times.

 

With support from the YMCA, Comstock continued his raids.  By mid-year, he had eight smut cases pending in New York courts.  But prosecutors continually requested postponements.  When one case finally proceeded, the defense didn't contest Comstock's testimony.  It simply argued the material confiscated was no more obscene than passages in the Bible.  The argument wasn't convincing.  Ten jurors voted to convict.  But the two who didn't meant the defendant walked.  That proved the best outcome of his pending cases.

 

Frustrated under state law, Comstock changed tactics.  Seven years earlier, Congress had banned obscenity from first class mail.  The law was weak, narrowly defining obscenity and prohibiting postmasters from unsealing mail even if they knew a piece contained it.  Prosecutions had barely hit half a dozen.

 

Comstock began ordering smut by mail.  After receiving obscene goods, he obtained warrants in US Circuit Court.  Four dealers were convicted and sentenced to one year in jail and $500 fines – too lenient for Comstock, but the maximum the law allowed.

 

Raiding one dealer’s medical associate, he discovered the doctor’s teenage patient awaiting his third attempt to abort her fetus.  But abortion was a state crime.  A district attorney killed that case.

 

Dissatisfied, Comstock outlined ideas for a tougher federal law to Morris Jesup, the YMCA's President.  Jesup got US Supreme Court Justice William Strong to finalize a bill for Congress.  In February 1873, Comstock visited the US Capitol to exhibit obscenities – books, sex toys, rubber goods.  Attending senators declared they would accept any bill he wanted so long as it was constitutional.  They could pass it before the current session closed for President Grant's second inauguration March 4.

 

New York Congressman Clinton Merriam introduced the bill in the House, expecting to pass it quickly under a suspension of the rules.  Connecticut Senator William Buckingham followed in the Senate.

 

An optimistic Comstock got a head start on enforcement.  On Treasury Department letterhead, he contacted nine suspicious doctors.  “I am an employee of the Treasury,” he wrote under the pseudonym Anna M. Ray, “I was seduced about four months ago, and I am now three months gone in the family way.”  “Anna” begged each doctor to send something to relieve her condition.  “For God’s sake do not disappoint a poor ruined and forsaken girl whose only relief will be suicide if you fail me.”

 

The optimism was premature.  With resisting legislators invoking rules and demanding changes, weeks passed.  On Saturday evening, March 1, the House met for its final session.  Comstock watched.  At midnight, unwilling to break the Sabbath, he gave up.  Leaving the Capitol, he spent a sleepless night too depressed even to pray.  Not until dawn could he accept the failure as God’s will. Only when he ran into the Senate’s chaplain did he learn the news.  “Your bill passed the House at two o’clock this morning,” the chaplain said.  It was immediately sent to the Senate and passed.  President Grant signed it the next day.

 

His bill launched Comstock’s four-decade career fighting smut dealers, abortionists, birth control advocates, artists, playwrights, and poets.  Its opening section foretold his war on reproductive rights, explicitly banning anything – device, medicine, tool, information, advertising – “for the prevention of conception” or “for causing unlawful abortion.”

 

Women bookended that career.  As he was pushing his bill in Congress, Comstock had "Free Lover" Victoria Woodhull and her sister Tennie Claflin indicted for publishing an obscene article exposing the adultery of Reverend Henry Ward Beecher.  While the article might have been libelous were it not true, it wasn't obscene.  But Comstock guessed the arrests would be a publicity coup that would help his bill pass.  After a year of harassment, the sisters were acquitted.

 

Under his bill, Comstock quickly attacked abortionists—twelve in Chicago, seventeen in New York.  But Chicago judges imposed trivial fines. In New York only three served serious time.  Through 1875, Comstock claimed 49 abortion arrests with 39 convictions, but even he acknowledged the difficulty of bringing the practitioners to justice.  In 1878, he achieved one notable feat.  He entrapped New York’s notorious abortionist Madame Restell, driving her to suicide.  “A Bloody ending to a bloody life,” he noted without remorse.

 

Months later, Comstock entrapped Dr. Sara Chase.  She supplied syringes for post-coital douching with substances like vinegar and carbolic acid to prevent conception.  As their battle played out in the press, Chase renamed her device the "Comstock Syringe."  Sales soared.

 

The list went on until Comstock closed his career arresting birth control advocate Margaret Sanger.  She fled to Europe to escape his clutches.  Comstock resorted to convicting her estranged husband for handing out a birth control pamphlet.

 

Of course the women he attacked directly were not the only victims of Comstock's fight against reproductive rights.  Others were the desperate women forced to bear children, no matter the risks to their health, their inability to support another baby, or their satisfaction with the family they already had.

 

With the Texas judge’s decision stayed and appeals underway, the battle over reproductive rights continues in Anthony Comstock’s shadow.

 

 

 

https://historynewsnetwork.org/article/185707
Forget "Finding Forrester"—Our Best Teaching Can Be Ordinary

Plato and Aristotle in detail from The School of Athens by Raphael (1509–1510), fresco at the Apostolic Palace, Vatican City.

 

 

Every few years there is a movie about a gifted young person striving to reach their potential and being formatively shaped by a teacher or mentor. Finding Forrester is a classic in this genre. The main character, Jamal, is a gifted young writer who meets a famous but reclusive novelist, William Forrester, who helps Jamal improve by challenging him and not being overly easy with the praise. In Whiplash, Miles Teller plays a gifted young drummer named Andrew Neiman whose music teacher, Terence Fletcher, is determined to draw out his genius. Fletcher’s approach is abusive and even somewhat insane. But Andrew wants so badly to be a musical legend on the level of Charlie Parker that he practices until his hands bleed and he endures the abuse.

 

Though university level instruction should not involve the abusive behavior we see in Whiplash, and we probably have to be more orthodox in our teaching than an old novelist eating soup and pecking at a typewriter, we sometimes dream of working with the kind of student pictured in those films. This would be a young person who has a natural gift and an unnatural drive to succeed. They want to be challenged. When you push them, they keep getting better. They go on to achieve remarkable things. You get to help launch them into the stratosphere.

 

In reality, very few students are going to resemble the characters in these movies. Some of your students aren’t interested in your class. Some are trying to decide if they are interested. Some are interested, but have other priorities. Some want to get better at whatever your discipline is, but do not believe that your course is part of their hero’s journey. Not everyone is going to read your comments on their paper. Not all who do will take the comments to heart. A few of your students will cheat on class assignments. Some of your students will certainly go on to greatness and many have significant abilities, but most of your students will not practice until their hands bleed.

 

There aren’t a lot of movies about doing an excellent job with normal students and getting normal outcomes. However, if it’s true that the process is more important than the product, those movies are missing something anyway. There’s quite a bit of true excellence in teaching that never gets associated with students who go on to win Nobel prizes or become MacArthur Fellows. Exceptional outcomes are not the only measure of excellence in teaching. An excellent teacher can teach all kinds of students. You can do meaningful work and inspire people without becoming the backstory of the next Stand and Deliver.

 

In films with bright students, those students arrive with the passion. Jamal is already a writer before he finds Forrester. Andrew Neiman has aspirations in the opening sequence. In real life, some college students are still searching for their passion. Some of them need that flame to be nourished. Even those with significant gifts are not always a half step from legendary excellence. Sometimes the role of the excellent teacher is introducing a subject or guiding the first steps along the path of whatever it is that a student is pursuing. Sometimes what you impart is not even a passion for your own subject.

 

A lot of the wise mentors in movies are set in their ways and have a pretty fixed and cantankerous approach to instruction. That may not slow down a gifted student who cannot be deterred from learning, but, even then, it may not be the actual best approach. Teaching excellence does not always take the form of pushing students to the extreme limits of their abilities. All students need to be challenged, but not all in extreme ways. Some also need to be encouraged. Struggle can help with growth, but sometimes students are struggling with things that are more important than our classes and don't need provocatively difficult assignments to learn to push themselves in life. That doesn't mean that every semester, every course, has to be tailored to each individual student, or that everything should be easy, but it does mean that good teaching is much more than setting the bar at the correct height and then noting who makes it over and who doesn't. There is a real art to setting meaningful but realistic expectations for students and ourselves.

 

One very unhelpful thing about films with amazing students is that they give us a distorted sense of impact. A good teacher’s legacy is not built on the genius of a single student helped along the way. A good teacher’s legacy includes people who became slightly better writers, casual readers of history, more critical viewers of documentaries, more knowledgeable citizens, and even people who just got better at passing college classes. A good legacy may even include helping direct a student to a better major for them. A good legacy is built on hundreds, thousands of recommendation letters, for all kinds of positions with varying degrees of prestige.

 

The reclusive novelist in Finding Forrester is roughly modeled on J.D. Salinger. Interestingly, Salinger’s novel Franny & Zooey has a relevant passage. Franny is a college student experiencing a kind of breakdown, and is judging her peers and professors along the way. Though they are part of the Glass family, full of child geniuses, her brother Zooey suggests that she is not necessarily flexing her intellect as much as she is being snobbish. Both had been precocious kids on a radio quiz show and Zooey reminds his sister that their older brother Seymour always encouraged them to do their best for the “Fat Lady”—to do their best for some unknown woman in the audience that they imagined as really deserving and really listening. Zooey even shined his shoes, for the radio program, for the “Fat Lady.” He tells his sister:

 

“I don’t care where any actor acts. It can be in summer stock, it can be over a radio, it can be over television, it can be in a goddam Broadway theatre, complete with the most fashionable, most well-fed, most sunburned-looking audience you can imagine. But I’ll tell you a terrible secret—Are you listening to me? There isn’t anyone out there who isn’t Seymour’s Fat Lady. That includes your Professor Tupper, buddy. And all his goddam cousins by the dozens. There isn’t anyone anywhere that isn’t Seymour’s Fat Lady. Don’t you know that? Don’t you know that goddam secret yet? And don’t you know—listen to me, now—don’t you know who that Fat Lady really is?... Ah, buddy. It’s Christ Himself. Christ Himself, buddy.”

 

There are days it feels like we are doing the Broadway equivalent of teaching—students seem to be lighting up, they're going on to bigger and better things, they're asking for outside reading recommendations. It is easy to feel inspired. But there are days we are off-off-Broadway—monitoring low grades and repeating ourselves in class. It is our job to see all of our students as significant, whether or not they seem special to us when we first meet them. Even if they would rather be texting, it is our job to be teaching to the best of our abilities.

 

Excellence in teaching is in meeting the challenge of real-life classrooms, filled with students of all abilities, and resulting in all kinds of outcomes. Excellent teaching is not just about throwing down challenges to push great students on to more greatness. We don’t work on a film set, we work in a university classroom. We are great when we are consistently excellent, whether or not our students are famous or we are experiencing moments that have the feel of movie magic.   

https://historynewsnetwork.org/article/185704
Stronger Global Governance is the Only Way to a World Free of Nuclear Weapons

Some of the 800 members of Women Strike for Peace who marched at United Nations headquarters in Manhattan to demand UN mediation of the 1962 Cuban Missile Crisis

 

It should come as no surprise that the world is currently facing an existential nuclear danger.  In fact, it has been caught up in that danger since 1945, when atomic bombs were used to annihilate the populations of Hiroshima and Nagasaki.

Today, however, the danger of a nuclear holocaust is probably greater than in the past.  There are now nine nuclear powers―the United States, Russia, Britain, France, China, Israel, India, Pakistan, and North Korea―and they are currently engaged in a new nuclear arms race, building ever more efficient weapons of mass destruction.  The latest entry in their nuclear scramble, the hypersonic missile, travels at more than five times the speed of sound and is adept at evading missile defense systems. 

Furthermore, these nuclear-armed powers engage in military confrontations with one another―Russia with the United States, Britain, and France over the fate of Ukraine, India with Pakistan over territorial disputes, and China with the United States over control of Taiwan and the South China Sea―and on occasion issue public threats of nuclear war against other nuclear nations.  In recent years, Vladimir Putin, Donald Trump, and Kim Jong-Un have also publicly threatened non-nuclear nations with nuclear destruction.

Little wonder that in January 2023 the editors of the Bulletin of the Atomic Scientists set the hands of their famous "Doomsday Clock" at 90 seconds before midnight, the most dangerous setting since its creation in 1947.

Until fairly recently this march to Armageddon was disrupted, for people around the world found nuclear war a very unappealing prospect.  A massive nuclear disarmament campaign developed in many countries and, gradually, began to force governments to temper their nuclear ambitions.  The results included bans on nuclear testing, curbs on nuclear proliferation, limits on the development of some kinds of nuclear weapons, and substantial nuclear disarmament.  From the 1980s to today the number of nuclear weapons in the world sharply decreased, from 70,000 to roughly 13,000.  And with nuclear weapons stigmatized, nuclear war was averted.

But successes in rolling back the nuclear menace undermined the popular struggle against it, while proponents of nuclear weapons seized the opportunity to reassert their priorities.  Consequently, a new nuclear arms race gradually got underway.

Even so, a nuclear-free world remains possible.  Although an inflamed nationalism and the excessive power of military contractors are likely to continue bolstering the drive to acquire, brandish, and use nuclear weapons, there is a route out of the world’s nuclear nightmare.

We can begin uncovering this route to a safer, saner world when we recognize that a great many people and governments cling to nuclear weapons because of their desire for national security.  After all, it has been and remains a dangerous world, and for thousands of years nations (and before the existence of nations, rival territories) have protected themselves from aggression by wielding military might.

The United Nations, of course, was created in the aftermath of the vast devastation of World War II in the hope of providing international security.  But, as history has demonstrated, it is not strong enough to do the job―largely because the "great powers," fearing that significant power in the hands of the international organization would diminish their own influence in world affairs, have deliberately kept the world organization weak.  Thus, for example, the UN Security Council, which is officially in charge of maintaining international security, is frequently blocked from taking action by a veto cast by one of its five powerful, permanent members.

But what if global governance were strengthened to the extent that it could provide national security?  What if the United Nations were transformed from a loose confederation of nations into a genuine federation of nations, enabled thereby to create binding international law, prevent international aggression, and guarantee treaty commitments, including commitments for nuclear disarmament? 

Nuclear weapons, like other weapons of mass destruction, have emerged in the context of unrestrained international conflict.  But with national security guaranteed, many policymakers and most people around the world would conclude that nuclear weapons, which they already knew were immensely dangerous, had also become unnecessary.

Aside from undermining the national security rationale for building and maintaining nuclear weapons, a stronger United Nations would have the legitimacy and power to ensure their abolition.  No longer would nations be able to disregard international agreements they didn’t like.  Instead, nuclear disarmament legislation, once adopted by the federation’s legislature, would be enforced by the federation.  Under this legislation, the federation would presumably have the authority to inspect nuclear facilities, block the development of new nuclear weapons, and reduce and eliminate nuclear stockpiles.

The relative weakness of the current United Nations in enforcing nuclear disarmament is illustrated by the status of the UN Treaty on the Prohibition of Nuclear Weapons.  Voted for by 122 nations at a UN conference in 2017, the treaty bans producing, testing, acquiring, possessing, stockpiling, transferring, and using or threatening the use of nuclear weapons.  Although the treaty officially went into force in 2021, it is only binding on nations that have decided to become parties to it.  Thus far, that does not include any of the nuclear-armed nations.  As a result, the treaty currently has more moral than practical effect in securing nuclear disarmament.

If comparable legislation were adopted by a world federation, however, participating in a disarmament process would no longer be voluntary, for the legislation would be binding on all nations.  Furthermore, the law’s universal applicability would not only lead to worldwide disarmament, but offset fears that nations complying with its provisions would one day be attacked by nations that refused to abide by it.

In this fashion, enhanced global governance could finally end the menace of worldwide nuclear annihilation that has haunted humanity since 1945.  What remains to be determined is whether nations are ready to unite in the interest of human survival.

 

 

 

 

https://historynewsnetwork.org/article/185705
AI: The Latest Instance of Our Capacity for Innovation Outstripping Our Capacity for Ethics

The eagerness with which movie and television studios have proposed to use artificial intelligence to write content collides with the concern of Writers Guild members for their employment security and pay in the latest episode of technological innovation running ahead of ethical deliberation. 

 

 

 

Regarding modern technology, the psychologist Steven Pinker and the economist/environmentalist E. F. Schumacher have expressed opposite opinions. In his Enlightenment Now: The Case for Reason, Science, Humanism, and Progress (2018), the former is full of optimism--e.g., "technology is our best hope of cheating death"--but many decades earlier Schumacher stated that it was "the greatest destructive force in modern society." And he warned, "Whatever becomes technologically possible . . . must be done. Society must adapt itself to it. The question whether or not it does any good is ruled out."

 

Now, in 2023, looking over all the technological developments of the last century, I think Schumacher's assessment was more accurate. I base this judgment on recent developments in spyware and Artificial Intelligence (AI). They have joined the ranks of nuclear weapons, our continuing climate crisis, and social media in inclining me to doubt humans' ability to control the Frankensteinian monsters they have created. The remainder of this essay will indicate why I have made this judgment.

 

Before taking up the specific modern technological developments mentioned above, we should state our main failing: the structures we have developed to manage technology are woefully inadequate. We have possessed neither the values nor the wisdom necessary to do so. Several quotes reinforce this point.

 

One is General Omar Bradley’s: "Ours is a world of nuclear giants and ethical infants. If we continue to develop our technology without wisdom or prudence, our servant may prove to be our executioner."

 

More recently, psychologist and futurist Tom Lombardo has observed that “the overriding goal” of technology has often been “to make money . . . without much consideration given to other possible values or consequences.”

 

Finally, the following words of Schumacher are still relevant:

“The exclusion of wisdom from economics, science, and technology was something which we could perhaps get away with for a little while, as long as we were relatively unsuccessful; but now that we have become very successful, the problem of spiritual and moral truth moves into the central position. . . . Ever-bigger machines, entailing ever-bigger concentrations of economic power and exerting ever-greater violence against the environment, do not represent progress: they are a denial of wisdom. Wisdom demands a new orientation of science and technology towards the organic, the gentle, the nonviolent, the elegant and beautiful.”

 

“Woefully inadequate” structures to oversee technological developments. How so? Some 200 governments are responsible for overseeing such changes in their countries. In capitalist countries, technological advances often come from individuals or corporations interested in earning profits--or sometimes from governments sponsoring research for military reasons. In countries where some form of capitalism is not dominant, what determines technological advancements? Military needs? The whims of authoritarian rulers or elites? Show me a significant country where the advancement of the common good is seriously considered when contemplating new technology.

 

Two main failings leap out at us. The first, as Schumacher observed a half century ago, is capitalism's emphasis on profits rather than wisdom. The second--connected with that lack of wisdom--is that too many "bad guys," leaders like Hitler, Stalin, Putin, and Trump, have had tremendous power yet poor values.

 

Now, however, on to the five specific technological developments mentioned above. First, nuclear weapons. From the bombings of Hiroshima and Nagasaki in 1945 until the Cuban Missile Crisis in 1962, concerns about the unleashing of a nuclear holocaust topped our list of possible technological catastrophes. In 1947, the Bulletin of the Atomic Scientists established its Doomsday Clock, “a design that warns the public about how close we are to destroying our world with dangerous technologies of our own making.” The scientists set the clock at seven minutes to midnight. “Since then the Bulletin has reset the minute hand on the Doomsday Clock 25 times,” most recently in January of this year when it was moved to 90 seconds to midnight--“the closest to global catastrophe it has ever been.” Why the move forward? “Largely (though not exclusively) because of the mounting dangers of the war in Ukraine.”

 

Second, our continuing climate crisis. It has been ongoing now for at least four decades. The first edition (1983) of The Twentieth Century: A Brief Global History  noted that “the increased burning of fossil fuels might cause an increase in global temperatures, thereby possibly melting the polar ice caps, and flooding low-lying parts of the world.” The third edition (1990) expanded the treatment by mentioning that by 1988 scientists “concluded that the problem was much worse than they had earlier thought. . . . They claimed that the increased burning of fossil fuels like coal and petroleum was likely to cause an increase in global temperatures, possibly melting the polar ice caps, changing crop yields, and flooding low-lying parts of the world.” Since then the situation has only grown worse.

 

Third, the effects of social media. Four years ago I quoted historian Jill Lepore’s highly-praised These Truths: A History of the United States (2018): “Hiroshima marked the beginning of a new and differently unstable political era, in which technological change wildly outpaced the human capacity for moral reckoning.” By the 1990s, she observed that “targeted political messaging through emerging technologies” was contributing to “a more atomized and enraged electorate.” In addition, social media, expanded by smartphones, “provided a breeding ground for fanaticism, authoritarianism, and nihilism.”

 

Moreover, the Internet was "easily manipulated, not least by foreign agents. . . . Its unintended economic and political consequences were often dire." The Internet also contributed to widening economic inequalities and a more "disconnected and distraught" world. Internet information was "uneven, unreliable," and often unrestrained by any type of editing and fact-checking. The Internet left news-seekers "brutally constrained," and "blogging, posting, and tweeting, artifacts of a new culture of narcissism," became commonplace. So, too, did Internet-related companies that fed people only what they wanted to see and hear. Further, social media "exacerbated the political isolation of ordinary Americans while strengthening polarization on both the left and the right. . . . The ties to timeless truths that held the nation together faded to ethereal invisibility."

Similar comments came from the brilliant and humane neurologist Oliver Sacks, who shortly before his death in 2015 stated that people were developing “no immunity to the seductions of digital life” and that “what we are seeing—and bringing on ourselves—resembles a neurological catastrophe on a gigantic scale.” 

Fourth, spyware. Fortunately, in the USA and many other countries independent media still exists. Various types of such media are not faultless, but they are invaluable in bringing us truths that would otherwise be concealed. PBS is one such example.

Two of the programs it produces, the PBS Newshour and Frontline, have helped expose how insidious spyware has become. In different countries, its targets have included journalists, activists, and dissidents. According to an expert on The Newshour,

“The use of spyware has really exploded over the last decade. One minute, you have the most up-to-date iPhone, it's clean, sitting on your bedside table, and then, the next minute, it's vacuuming up information and sending it over to some security agency on the other side of the planet.”

The Israeli company NSO Group has produced one lucrative type of spyware called Pegasus. According to Frontline, it “was designed to infect phones like iPhones or Androids. And once in the phone, it can extract and access everything from the device: the phone books, geolocation, the messages, the photos, even the encrypted messages sent by Signal or WhatsApp. It can even access the microphone or the camera of your phone remotely.” Frontline quotes one journalist, Dana Priest of The Washington Post, as stating, “This technology, it's so far ahead of government regulation and even of public understanding of what's happening out there.”

The fifth and final technological development to consider is Artificial Intelligence (AI). During the past year, media has been agog with articles on it. Several months ago on this website I expressed doubts that any forces will be able to limit the development and sale of a product that makes money, even if it ultimately harms the common good. 

More recently (this month) the PBS Newshour again provided a public service when it conducted two interviews on AI. The first was with “Geoffrey Hinton, one of the leading voices in the field of AI,” who “announced he was quitting Google over his worries about what AI could eventually lead to if unchecked.”

Hinton told the interviewer (Geoff Bennett) that “we're entering a time of great uncertainty, where we're dealing with kinds of things we have never dealt with before.” He recognized various risks posed by AI, such as misinformation, fraud, and discrimination, but there was one that he especially wanted to highlight: “the risk of super intelligent AI taking over control from people.” It was “advancing far more quickly than governments and societies can keep pace with.” While AI was leaping “forward every few months,” needed restraining legislation and international treaties could take years.

He also stated that because AI is “much smarter than us, and because it's trained from everything people ever do . . . it knows a lot about how to manipulate people,” and “it might start manipulating us into giving it more power, and we might not have a clue what's going on.” In addition, “many of the organizations developing this technology are defense departments.” And such departments “don't necessarily want to build in, be nice to people, as the first rule. Some defense departments would like to build in, kill people of a particular kind.”

Yet, despite his fears, Hinton thinks it would be a “big mistake to stop developing” AI. For “it's going to be tremendously useful in medicine. . . . You can make better nanotechnology for solar panels. You can predict floods. You can predict earthquakes. You can do tremendous good with this.”

What he would like to see is equal resources put into both developing AI and “figuring out how to keep it under control and how to minimize bad side effects of it.” He thinks “it's an area in which we can actually have international collaboration, because the machines taking over is a threat for everybody.”

The second PBS May interview on AI was with Gary Marcus, another leading voice in the field. He also perceived many possible dangers ahead and advocated international controls.

Such efforts are admirable, but are the hopes for controls realistic? Looking back over the past century, I am more inclined to agree with General Omar Bradley: we have developed “our technology without wisdom or prudence,” and we are “ethical infants.”

In the USA, we are troubled by divisive political polarization; neither of the leading candidates for president in 2024 has majority support in the polls; and Congress and the Supreme Court are disdained by most people. Our educational systems are little concerned with stimulating thinking about wisdom or values. If not from the USA, from where else might global leadership come? From Russia? From China? From India? From Europe? From the UN? The past century offers little hope that it would spring from any of these sources.

But both Hinton and Marcus were hopeful in their PBS interviews, and just because past efforts to control technology for human betterment were generally unsuccessful does not mean we should give up. Great leaders like Abraham Lincoln, Franklin Roosevelt, and Nelson Mandela did not despair even in their nations’ darkest hours. Like them, we too must hope for--and more importantly work toward--a better future.

 

John de Graaf on his Powerful Documentary on Stewart Udall, Conservation, and the True Ends of Politics

John de Graaf and Stewart Udall

 

We have, I fear, confused power with greatness.—Stewart Udall

 

Stewart Udall (1920-2010) may be the most effective environmentalist in our history, considering his monumental accomplishments in protecting and preserving the environment and improving the quality of life for all citizens. Unfortunately, his tireless efforts for conservation and environmental protection and his gifts as a leader are not well known to the wider public. His life offers inspiration and a model for, among others, today's public servants and citizen activists.

As the Secretary of the Interior from 1961 to 1969 under Presidents John F. Kennedy and Lyndon Baines Johnson, Udall took the department in new directions as he crafted some of the most significant environmental policies and legislation in our history. With his talent for forging bipartisan alliances, he spearheaded the enactment of major environmental laws such as the Clean Air, Water Quality, and Clean Water Restoration Acts, the Wilderness Act of 1964, the Endangered Species Preservation Act of 1966, the Land and Water Conservation Fund Act of 1965, the National Trails System Act of 1968, and the Wild and Scenic Rivers Act of 1968.

Secretary Udall also led in expanding federal lands, establishing four national parks, six national monuments, eight national seashores and lakeshores, nine national recreation areas, 20 national historic sites, and 56 national wildlife refuges, including Canyonlands National Park in Utah, North Cascades National Park in Washington, Redwood National Park in California, and more. A lifelong advocate for civil rights, Udall also desegregated the National Park Service.

After his term as Secretary of the Interior, Udall continued to work for decades as an attorney advancing environmental protection, worker health and safety, human rights, tolerance, Indigenous rights, racial equality, and justice.

Despite his many achievements, Udall seems to have faded from memory and most people today know little of his monumental legacy. His name doesn’t usually leap to mind when considering the great leaders on the environment and human rights.

To remind us of Udall’s remarkable life and legacy, acclaimed filmmaker and activist John de Graaf created a new documentary, Stewart Udall, The Politics of Beauty (The film is available through Bullfrog Communities: www.bullfrogcommunities.com/stewartudall).

The film charts the trajectory of Udall’s life as it introduces viewers to a history of the origins of the modern environmental movement. There’s the journey from Udall’s childhood in Arizona, his schooling, and his World War II combat duty, to his commitment to public service, his terms in Congress, and his achievements as Secretary of the Interior. The film further recounts his later life as a zealous attorney, author, and voice for beauty, simplicity, and peace as he warned about climate change, health hazards, rampant consumerism, and the dangers of polarization and extreme partisanship. Especially engaging are interviews with Udall and his family supplemented with family films as well as scenes with JFK and Lady Bird Johnson.

The film is based on exhaustive archival research as well as interviews with historians, family members, friends and colleagues of Udall. Personal films, photographs and papers were shared with Mr. de Graaf and his team. As the life of Udall unfolds, the film provides historical context illustrated with vivid scenes from the turbulence, environmental devastation, and movements for justice and peace in the sixties and seventies. There are also stunning sequences of natural beauty from the forests, seas, deserts and other sites that Udall sought to protect.

The story of Udall’s life may provide a way forward for younger people today who are skeptical of politics and disillusioned by stasis and polarization that prevent meaningful change for a better quality of life and a more livable world. Udall’s visionary pursuit of environmental and social justice came out of his cooperative nature and his belief in democracy. May his inspiring example create hope and fire the minds of citizens today.  

Mr. de Graaf is a Seattle-based award-winning filmmaker, author, and activist. He has said that his mission is to “help create a happy, healthy and sustainable quality of life for America,” and his documentary on Stewart Udall is an aspect of that desire. He has been producing and directing documentaries for public television for more than forty years. His nearly 50 films, including 15 prime time PBS specials, have won more than 100 regional, national and international awards.

Mr. de Graaf also has written four books, including the bestselling Affluenza: The All-Consuming Epidemic. The John de Graaf Environmental Filmmaking Award, named for him, is presented annually at the Wild and Scenic Film Festival in California. He is also co-founder and president of Take Back Your Time, co-founder of the Happiness Alliance, former policy director of the Simplicity Forum, and founder of the emerging organization, And Beauty for All. 

Mr. de Graaf graciously responded to questions about his background and his Udall documentary by phone from his Seattle office.

 

Robin Lindley: Congratulations John on your heartfelt and vivid Stewart Udall film. I appreciate the work you do and your persistence. Every documentary film must be a long haul.

John de Graaf: Thank you. I had a team of great people to work with, so I can't take all the credit.

Robin Lindley:  Before we get to the Udall film, I wanted to give readers a sense about your background. What inspired you to work now as an activist, author and filmmaker?

John de Graaf:  I was an activist first, and that led me to do quite a bit of writing, to print reporting. And that eventually led me to do a public affairs radio show at the University of Minnesota in Duluth. Doing that, I met a character that I thought would make a great film. And then I connected with this videographer at the University of Minnesota Minneapolis, and we put a film together that was then aired on Minnesota Public Television in 1977, and the film won a major PBS award and that launched me.

Four years later I started doing freelance documentary production at Channel Nine, the PBS station in Seattle. I was there for 31 years basically, until they kicked me out in 2014, but I've continued. My film Affluenza was a big success on PBS, so I was asked to write a book by a New York agent. Then a California publisher put out the Affluenza book, and that took off like the film. It has sold nearly 200,000 copies in 10 or 11 languages internationally.

I also made a little film called What's the Economy for Anyway? and that led to another book. I also edited a book called Take Back Your Time that was connected with research and activism I was doing about overwork in America.

Robin Lindley: Congratulations on those projects aimed at exposing social justice and environmental issues and at encouraging work to improve the quality of our lives.

John de Graaf: Yes. The quantity of our stuff, or the gross national product, or world power, or any of those things should not be the goal. Instead, the aim should be about the best quality of life for people. I think all of these themes connect with that.

Robin Lindley: Thanks for your tireless efforts. The title of your new documentary is Stewart Udall, The Politics of Beauty. What do you mean by the politics of beauty? It seems that expression ties in with your interests in the environment and nature as well as your efforts to promote happiness and a better quality of life.

John de Graaf: I think there is a lot of evidence that our common, even universal, love for beauty, especially nature’s beauty, can bring us together and reduce polarization.  It’s no accident that the most bipartisan bill passed during the Trump administration was the Great American Outdoors Act.  Beautiful cities can slow us down and reduce our levels of consumption and our use of the automobile.  Parks and access to nature are a more satisfying substitute for material stuff.  The response to my film confirms this for me.  Stewart was aware of all of this.

Robin Lindley: What inspired you to make a film now about Stewart Udall, who seems to be an overlooked champion for the environment? He's not remembered in the same way as naturalist John Muir maybe, or author Rachel Carson or Sierra Club’s David Brower.

John de Graaf: Of course, John Muir was a huge figure in his time. His writing was known by everybody and he stirred such a movement but he needed political figures like Teddy Roosevelt and later, Udall, to make his dream of the National Parks come true.

Rachel Carson’s book Silent Spring was very powerful, but that's what she did and she died soon afterwards. She wasn't able to accomplish a lot without people like Udall who actually created and passed legislation. I don't mean to in any way denigrate her. She was great and Udall loved and appreciated her. He was a pallbearer at her funeral. Her book stirred a lot of interest and attention, and people like Udall got behind it, and so it had a major effect.

In terms of environmental work, David Brower was exceedingly important because he was involved in so many things including the Sierra Club. Aldo Leopold was a key figure with his impact. And there have been many, many others since then. Now you'd have to probably add Bill McKibben, Gus Speth, and people like that.

Robin Lindley: It seems, however, that Udall has been overlooked or forgotten. Was that one of the reasons you wanted to do a film about him?

John de Graaf: I was impressed years ago when I interviewed him, but I'd forgotten about him until I saw a newspaper story in 2020 that said “a hundred years ago today Stewart Udall was born.” I was struck by my memory of him, and I remembered he had given me a book, so I went to my shelf and pulled down the book he had signed for me when I interviewed him.

And then I started doing a little more research, first online and then ordering biographies of him. And I thought, what a fascinating character. I knew that he had created several national parks and some things like that, and I knew that he had stopped the Grand Canyon dams because that was what I'd interviewed him about. But I had no idea about his civil rights activity, his work for world peace, his work for the arts, and his support for victims of atomic fallout and uranium miners, and so many other things that he ended up doing. That came as a complete surprise to me, and I think made the film richer.

Robin Lindley: Udall seems a renaissance man. I didn't know much about his life, and your film was certainly illuminating. What do you think inspired him to get involved in environmental protection and then in environmental and social justice issues?

John de Graaf: Number one, he did spend a lot of time outdoors when he was a kid on a farm in Arizona and hiking in the nearby White Mountains. And he got very interested in the natural world and the beauty of the natural world when he was out hiking.

And then, he grew up in a Mormon family, but it was unusual because it was a very liberal Mormon family. His father impressed on all the kids that Mormons had been discriminated against and that's why they were in these godforsaken places in the desert. They'd been pushed out of Illinois and Missouri and other places, so they had to stand up for other people who were discriminated against, and that included especially Native Americans, because they lived in the area where he was, and Black Americans, and so forth.

And then, he fought in World War II. He flew on 52 very dangerous bombing missions. He was very lucky to come back alive and he said that he must have been allowed to live for some reason. He decided, “I really need to be involved in public service in the best way that I know how.”

When he came back, he played basketball at the University of Arizona, and he was very committed to civil rights. He and his brother Mo both joined the Tucson chapter of the NAACP right after the war. And they’d had Black friends in the military and Mo had been a lieutenant with a division of Black troops. And they both fought to integrate the University of Arizona.

And Stewart was especially interested in the environment and protecting the beauty of the West. Later, that went beyond conservation, beauty and preservation to a much wider view of ecology and the environment and pollution.  

Robin Lindley: Udall’s probably best known for his role as the Secretary of the Interior under JFK and LBJ. How did he come to be appointed the Secretary of Interior? What brought him to the attention of the Kennedy administration?

John de Graaf: As a congressman, Udall worked with Senator John Kennedy. They worked on a number of bills together in the late fifties, and he was very impressed by Kennedy.

When Kennedy decided to run for president in 1960, Stewart got behind him. Stewart was a very influential and persuasive person in Arizona at that time, though nobody knew anything about him beyond Arizona. But he was able to convince Arizona's delegation to unanimously support Kennedy for president over Lyndon Johnson at the Democratic Convention. And Kennedy appreciated that.

Kennedy was also looking for somebody who knew something about the outdoors and somebody who was a westerner because it was traditional that the Interior Secretary be a westerner. Stewart Udall was the obvious choice for Kennedy at that time.

Robin Lindley: Did Udall have a close relationship with Kennedy during his short presidential term?

John de Graaf: I think Kennedy was distant and Stewart wanted a much closer relationship than Kennedy would allow with him, or I think with anyone else. But they were friends, of course, and Kennedy supported what Stewart was doing and Stewart supported what Kennedy was doing. He felt that Kennedy had a prodigious intellect and capacity for getting things done, but he was not a person who was easy to make friends with. Stewart was actually much better friends with Jackie, Kennedy's wife. She thought Stewart was such a gentleman and a fascinating character. She liked his personality and very much liked his wife. They were friends with his family.

Stewart didn't know how Johnson would be, but it turned out that Johnson was a much more social person than Kennedy, and much easier to be with and have a friendship with. And Johnson really loved nature and was committed to environmental protection in a stronger way than Kennedy had been. A lot of that came from Johnson’s wife, so Stewart cultivated his friendship with Lady Bird Johnson, who adored him, according to Johnson’s daughters.

Udall convinced Lady Bird Johnson that she should make a name for herself in conservation by first doing a beautification campaign and then through various other work. Lady Bird took up that Beautify America campaign and became a great advocate for the environment.

Robin Lindley: Didn’t Lady Bird and Udall share a concern about impoverished urban areas also?

John de Graaf: It didn't start with the impoverished areas. It started with the idea of beautifying America. But Lady Bird and Lyndon Johnson loved the cities that they visited in Europe, and they felt that Washington was a derelict place-- a mess in comparison to the other capitals of the world. It was embarrassing to bring people to the United States capital.

They felt that they had to start their campaign addressing cities in Washington DC, and that justice compelled them to start in the poorest communities, which were African American communities. They decided to put money first into beautifying those areas before focusing on the neighborhoods that were already gentrified.

Robin Lindley: And that approach also ties into Udall’s interest in civil rights, which you stress in your documentary.

John de Graaf: Yes. He was very interested in promoting civil rights. One of his first discoveries as Secretary of Interior was that the Washington Redskins (now Commanders) football team wouldn't hire Black players. So, he basically gave them this ultimatum that, if they wanted to play in the National Stadium, which the Department of Interior controlled, they needed to hire Black players or Udall would not lease the stadium to them. And so, they hired Black players, and that changed football immensely. In fact, the Washington Redskins became a much better team. The Washington Post even suggested that Stewart Udall should be named NFL Coach of the Year because of what he’d done to improve the team.

Udall also discovered that the National Park Service, which he was in charge of, was segregated. They had Black rangers only in the Virgin Islands, which is primarily Black. He was determined to change that. He sent recruiters to traditionally Black colleges in order to do it.

His kids told me that he would watch the civil rights protests on television. And he would say things like “Look at those brave young people. They have so much dignity.” And these young people were getting slammed, and weren't violent. They were quite the opposite, and Stewart said, “These kids are what America should be all about.” He added, “We need kids like this in the National Park Service, and the National Park Service needs to look like America.”

Bob Stanton from Texas was one of the first Black park rangers, and he went to Grand Teton. He later became the head of the National Park Service. He's a wonderful guy and I’ve gotten to know him well. Bob's 83 now, but he has the deepest memories of all that happened and Stewart Udall's role in it.

Stewart also had to decide whether the 1963 March on Washington could happen because it was planned for the National Park areas of the Lincoln Memorial and the Washington Monument. He had to grant a permit for the march to proceed, and there was enormous pressure on him not to approve it, coming from the Jim Crow Democratic senators in the South, who were also putting huge pressure on President Kennedy.

The march happened, and it was huge, and its impact was huge. Stewart watched it from the sidelines, but you could see in the photos of the march that National Park rangers were standing right near Martin Luther King when he spoke.

Robin Lindley:  Thanks for sharing those comments on Udall’s support of civil rights. Didn’t he leave the Mormon Church because of its racist policies?

John de Graaf: He wasn’t a Mormon anymore by then, but he always claimed that he remained a cultural Mormon--that he believed in the Mormon ideas of public service, of community and family, and all those things. And Mormons did have a real ethic of serving the community in those days. Those communities were tight, and people worked together. And Stewart believed in that.

World War II really cost him his faith because he just couldn't accept that, if there was a God, God would allow the things to happen that he saw in the war. He became basically an agnostic but he did not reject the church, and he did not openly criticize the church until the mid-1960s when he became concerned about the church's refusal to allow Blacks in its priesthood.

Udall thought that was astounding and terrible, so he finally wrote a letter to the church saying there was a Civil Rights Movement and the position of the Mormon Church was unacceptable. The church basically wrote back and said that it might agree with Udall but it doesn’t make those decisions. God does. Until God gives a revelation to the head of the church, things must stay as they are.

Ten years later, God gave a revelation to the head of the church and they changed the policy. Stewart basically was out of the church and was not considered a Mormon, but he was never excommunicated and never really disowned in any sense by the church. In fact, some of the strongest supporters of this film today are Mormons even though it’s clear about Udall leaving the church. Some evangelicals believe that former members are betrayers, but the Mormons don't take that position at all. In fact, they very much honor Udall. I just spoke at Utah State University, and a young Mormon woman came up to me after the screening and said she wanted to show this film. She said she was a board member of the Mormon Environmental Stewardship Association, and she added that “We're proud of Stewart Udall.” It was very positive to see that attitude.

Robin Lindley: Thanks for explaining that, John. I didn't quite understand Udall’s interactions with the Mormon Church.

John de Graaf: The church's view was that Stewart had honest reasons for rejecting policies and for leaving the church, and that was respected. And it did not make him a bad person. You had to decide that he was a good or bad person on the basis of the deeds that he did, which seems a good attitude.

Robin Lindley:  Yes. And Stewart Udall had a special gift for working both sides of the aisle to pass legislation including many important environmental measures. Wasn’t the Congress of the 1960s far less polarized than what we see now?

John de Graaf: It was, particularly after Kennedy's death, but there was a lot of fighting and it was hard for Stewart to move things through. He certainly had some very key Republican support, but he also had some major Democratic opposition, not only from the head of the Interior Committee, Wayne Aspinall, a Colorado Democrat, but also from southern Democrats who hated him because of his civil rights positions.

But after Kennedy was killed, and Johnson was elected in a landslide, that brought the Congress together around the idea of LBJ’s Great Society programs and civil rights laws. And Johnson did a much better job of getting things through Congress than Kennedy. Then you saw the Land and Water Conservation Fund, and the Wilderness Act, and Endangered Species List--major bills that passed because Congress and Johnson supported them.

But some environmental laws didn’t get passed until Nixon came in, because of the huge protests on the first Earth Day in 1970. These bills were already in Congress, and Congress moved them ahead. And when Nixon was president, he had a Democratic Congress. The bills moved ahead but there was never a veto-proof majority except on a couple of bills like the Wild Rivers Act. Nixon, though, with the pressure of Earth Day and all the environmentalist sentiment at that time, signed the bills.

Nixon himself had an environmental sensibility. He was terrible on civil rights issues and the war but he was much more open about the environment. He realized the impact of pollution. He had seen the Santa Barbara oil spill, the polluted Cuyahoga River. Nixon felt comfortable in signing the act creating the Environmental Protection Agency.

Robin Lindley: Is it fair to say that Stewart Udall was the architect of the EPA’s creation?

John de Graaf: It's fair to say that he was certainly one of the main architects. He didn't do it alone. He had key people in Congress who were supporting him, but he certainly pushed hard for it. I don't know if the idea was originally his, but he was probably the first who talked about it, and he certainly played a major role in it.

Stewart was also the first political figure to speak about global warming. He heard about it from his scientific advisor, Roger Revelle.  Revelle was an oceanographer who worked with the Smithsonian and was one of the first scientists to look at how the oceans were heating up. He said we have a problem on our hands with global warming. Stewart was talking with him on a regular basis and then decided to go public with this threat.  Other politicians knew about it, but they wouldn't go public, but Stewart said this was a major problem and he predicted flooding of Miami and New York and melting of the polar ice cap. And he was talking about global warming in 1967.

Robin Lindley: That surprised me. He was so prescient.

John de Graaf: Yes. There were smart scientists, but most politicians wouldn't dare touch it, even though the signs of much of it were already there. Daniel Moynihan gave a big public speech in 1969 about global warming as a big issue. More attention was probably paid to that speech than to Stewart, because Stewart wrote about the climate in books and in articles rather than in speeches.

Robin Lindley: It was interesting that, in one of Johnson's major speeches on the Great Society, he spoke about civil rights and poverty, and he decided to add a section that Stewart had suggested on the quality of life despite objections from some politicians.

John de Graaf: The speech was written by Richard Goodwin, the president’s speechwriter. But certainly, Goodwin had to have been reading what Stewart had written for LBJ because the language was exactly the same as much of Stewart's language.

Stewart had actually written short speeches for LBJ that had that language about quality of life and beauty. He wrote that when we lose beauty, we lose much that is meaningful in our lives.

That Great Society speech was interesting because Johnson was clearly influenced by Stewart and he agreed with his views about quality of life and nature. And Johnson told Richard Goodwin to have three themes in that speech: poverty, civil rights, and the quality of life and beauty. But then he told Goodwin to share the speech with the leaders of the House and Senate and get their opinions on it because he wanted them to like it and to support it. When Goodwin did that, he found that the Democratic leaders wanted him to take out the part about beauty and quality of life and to focus on the war on poverty and civil rights because they felt that these other things would distract from the main message that the president wanted to share.

The story is that Goodwin took those sections out of the speech and passed the speech back to LBJ who read the speech before giving it. He looked at Goodwin and he said, “What the hell happened to the stuff about quality of life?” Goodwin said, “You told me to show it to the House and Senate leaders. They said I should take it out because it was a distraction from your message.” And Johnson slammed his hand on the desk and said, “They don't write my speeches. That's just as important as the other stuff. Put that back in.” So that language on quality of life ended up being part of his incredible Great Society speech.

Robin Lindley: And I was surprised that Udall was working on a nuclear test ban treaty and was very concerned about nuclear proliferation.

John de Graaf: Yes. That was under Kennedy before the Test Ban Treaty of 1963 was signed by Kennedy and Khrushchev.

In 1962, Stewart was very concerned about nuclear war. He also had been very concerned about the dropping of the bomb on Japan. He felt, even as a soldier, that it was going beyond what he believed in. He believed that it was all right to bomb military installations but he did not believe that we should bomb civilians deliberately. He accepted that civilians would inadvertently be killed, but we should never target civilians. That was simply awful and against all notions of how we fight and against the Geneva Convention.

Udall went to the Soviet Union to discuss energy issues, and he took the poet Robert Frost along to meet Soviet poets like Yevtushenko because he knew that the Russians loved poetry, while at that time Americans didn't pay it much attention. With Frost along, Udall was able to get a meeting with Khrushchev where they discussed nuclear weapons and banning atmospheric nuclear testing, which was going on in both countries at that time.

Nothing immediately came of the talks because it was actually right before the Cuban Missile Crisis. But it apparently had some influence, because once that crisis was resolved and nuclear weapons were not used, the Russians came back to the table with Kennedy and agreed to ban atmospheric testing. They were able to do that and I think Stewart had some influence, although it's impossible to say for certain.

Robin Lindley: Thanks for that insight. Udall must have been haunted by his World War II experiences. Many veterans were.

John de Graaf: Yes. The stresses of the war pushed quite a few Mormon servicemen into smoking and drinking, which the Mormon Church didn't allow. Many came back with those habits as a way to relieve stress, and Stewart was certainly one of them, because the war was such a tragic experience.

Robin Lindley: Didn’t Udall differ with Johnson about the war in Vietnam?

John de Graaf: Big differences. Initially Stewart shared some of the worries about the spread of communism as many people did at that time. Stewart was never really a far leftist, but he was a strong liberal and he was afraid of communism or any totalitarianism, especially after fighting the Nazis.

Initially, Udall believed that maybe we should try to stop the spread of communism and help Vietnam be democratic. But that didn't last for long. Once Johnson sent the troops and Udall started seeing what was happening to the people of Vietnam, Udall changed his mind, probably as early as late 1965. He tried to get Johnson to pull back.

And Secretary of Defense Robert McNamara was a close friend of Udall. They hiked and backpacked together. Their kids knew each other. They always liked each other very much. But McNamara's son Craig told me that he didn’t know that Stewart was so against the war until he saw my film.  He said he always liked Stewart and thought Stewart was a wonderful guy. And his dad liked him, he said, but his dad never talked about what other people thought about the war.

McNamara completely separated his work and family life so he would not talk at home about anything going on with other cabinet members. So, McNamara's son had no idea that Stewart was so vociferously against the war along with Nicholas Katzenbach, Johnson’s Attorney General, and a couple of others who criticized the war at the cabinet meetings and to the president. Craig McNamara wrote to me saying that he wished his dad had listened to Stewart Udall.

Robin Lindley: And after Udall left his post as Secretary of the Interior at the end of the Johnson administration, he worked as a lawyer on environmental justice and human rights issues. How do you see his life after his term as Secretary?

John de Graaf: He didn't know exactly what to do in Washington. He wanted to work as a consultant to improve cities, to make them more livable. He became very critical of the automobile and our use of energy. And he saw racism tearing our cities apart.

Stewart was looking for things to do, but it was not easy. What kept him in Washington was that he and his wife wanted to allow their kids to finish high school with their friends. After the kids were adults and off to college, the Udalls moved back to Arizona and to Phoenix. It took a while for Stewart to figure out what to do there after he’d been in a position of power and influence. He was 60 years old with so much behind him.

Robin Lindley: He practiced law after his years as Secretary of the Interior and focused on social justice and environmental issues. The film notes his work with “downwinders” who were ill from radiation as well as miners who faced work hazards. What do you see as some of his important accomplishments after he moved back to Arizona?

John de Graaf: Two things: certainly, his work for downwinders and uranium miners for more than ten years was the most significant.  Then in 1989, he moved to Santa Fe and did a lot of research and writing.  In all, he wrote nine books, the most significant being The Myths of August, an exploration of the terrible impacts of the nuclear arms race.  He loved history and several of his books are about the history of the American West.

Robin Lindley: You obviously did extensive research for the film. Can you talk about how the project evolved and some of the archival research and the interviews that surprised you? It seems that Udall’s family and colleagues were very enthusiastic and open to sharing their perceptions with you.

John de Graaf: The Udall family was wonderfully gracious and open to me.  Much of the real research had been done by Udall’s biographers so I just picked up on that.  As I talked to people, I discovered that no one would say anything negative about him; even those who disagreed with his politics had total respect for his humility and integrity.  That’s not common with political figures, especially in this polarized time.  I was especially impressed by current Interior Secretary Deb Haaland’s insistence that “the politics of beauty lives on.” And I was stunned by the paintings of Navajo artist Shonto Begay, a wonderful guy.  I use some of his paintings in the film.  I had great cooperation from the University of Arizona in finding still photos.

Robin Lindley: Congratulations John on the film and its recent warm reception at the Department of Interior with Secretary Deb Haaland, the first Native American in that role.

John de Graaf: Yes. That was a wonderful event. We had about 300 people there, and Secretary Haaland spoke and talked about Stewart.

And we are getting a very good response to the film at other screenings. My biggest concern is it's hard to get young people to come out to see it. But when they do, they like it, like the young Mormon woman who I mentioned at Utah State. And a Hispanic student at University of Arizona who is a leader of the students’ association there wants to present screenings to get students more active in politics. I think that's the way it's going to have to happen. The screenings already turn out faculty and the older community, but they don’t turn out students. But once they see it, they do respond to it. I've been very surprised at how many students come up to me afterwards and want to talk. They tell me that they never knew about any of this history. They didn't learn about it in school. We’ve also been treated very well by media.  We’ve done fairly well in festivals, though I’m disappointed that my own hometown Seattle International Film Festival didn’t take the film.

Robin Lindley: Thanks for your thoughtful comments, John, and again, congratulations on your intrepid work to create and now display this moving cinematic profile of Stewart Udall. I learned a lot, and the film brought back many memories of the sixties, those times of exuberance and turbulence. The film not only illuminates our history, but it's also inspiring. Udall’s example offers hope for our divided nation now.

 

Robin Lindley is a Seattle-based attorney, writer, illustrator, and features editor for the History News Network (historynewsnetwork.org). His work also has appeared in Writer’s Chronicle, Bill Moyers.com, Re-Markings, Salon.com, Crosscut, Documentary, ABA Journal, Huffington Post, and more. Most of his legal work has been in public service. He served as a staff attorney with the US House of Representatives Select Committee on Assassinations and investigated the death of Dr. Martin Luther King, Jr. His writing often focuses on the history of human rights, social justice, conflict, medicine, visual culture, and art. Robin’s email: robinlindley@gmail.com.  

 

 

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/blog/154745 https://historynewsnetwork.org/blog/154745 0
The Roundup Top Ten for May 19, 2023

I'm Headed to Florida to Teach-In Against DeSantis's Education Policies

by Kellie Carter Jackson

On May 17, historians staged a 24-hour teach-in in St. Petersburg, Florida, to protest the restrictions on curriculum, books, and ideas pushed by Governor Ron DeSantis and his allies. As a historian of abolition, the author stresses that denying people the pen may drive them to pick up the sword. 

 

Bull Connor's Police Dogs Shocked the Nation in 1963, but they were an American Tradition

by Joshua Clark Davis

"In 1963 liberal critics condemned the Alabama city’s K-9 unit as a relic of the Old South. The harder truth to accept, however, was that it was actually a product of a new America."

 

 

MLK: Christian, Radical

by Jonathan Eig

Veneration has hollowed out Martin Luther King, Jr.'s legacy, and obscured the way that his political leadership always aimed at radical transformation of American society, argues the author of an acclaimed new biography. 

 

 

If it's Ineffective and Harmful, Why is Gay Conversion Therapy Still Around?

by Andrea Ens

Conversion therapies endure because their purpose is political, not therapeutic. They seek and symbolize the eradication of LGBTQ people from society and are promoted by groups who want that eradication to happen. 

 

 

Florida Just Banned Everything I Teach

by William Horne

Black historians during the Jim Crow era observed that the history taught in schools justified slavery, segregation, and lynching. A professor thinks that's where Ron DeSantis's vision of history is headed. Some politicians may think curriculum is a winning issue, but students and society will lose. 

 

 

Texas Shooting Highlights Long History of Anti-Black Violence in Latino Communities

by Cecilia Márquez

History shows that there have long been strains of anti-black racism in Latino communities, and that the categories "white" and "latino" are not mutually exclusive. Understanding today's far right requires attention to those details. 

 

 

The Relevance of Common Law to Abortion Debate: How Did the Law Work in Practice?

by Katherine Bergevin, Stephanie Insley Hershinow and Manushag N. Powell

Samuel Alito's ruling in Dobbs claimed to ground itself in the English common law's treatment of pregnancy. But he focused on a small number of published treatises while ignoring the record of how the law actually treated pregnant women and fetuses. 

 

 

There's Never Been a Right Way to Read

by Adrian Johns

The intellectual work and play of reading has always competed with other demands on attention; only recently have science and commerce converged to sell remedies for distraction and proprietary methods for reading. 

 

 

China is Cutting the US Out of the Middle East with an Axis of the Sanctioned

by Juan Cole

Recent American policies have squandered an opportunity to engage productively with Iran and Saudi Arabia and instead pushed them toward stronger economic development relationships with China. 

 

 

Henry Kissinger: A War Criminal Still at Large at 100

by Greg Grandin

Henry Kissinger was instrumental in Nixon's decision to undertake the illegal bombing of Cambodia. His foreign policy machinations also led him to push Nixon to the actions that led to Watergate and the president's downfall, though Kissinger has remained unaccountable. 

 

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/article/185702 https://historynewsnetwork.org/article/185702 0
Contemporary Pundits Need a Refresher on Populism's History

From the People's Party Paper, 1892

 

 

The way “populism” is typically invoked in today’s media, you wouldn’t know that the word comes down to us from one of America’s most successful progressive movements— the grass-roots crusade that resisted corporate power and fought to save democracy 130 years ago.

Many of today’s pundits would have you think otherwise.

“Is American Democracy Doomed by Populism?” asks Yascha Mounk of the Council on Foreign Relations, writing days after Trump supporters stormed the Capitol. Politico called Trump “The Perfect Populist” in 2016, likening him to George Wallace, Alabama’s white-supremacist governor in the 1960s. “Trump and Sanders Lead Competing Populist Movements,” says the Washington Post, echoing a common claim that progressives share some kind of “populistic” perspective with the far right.

In this ahistorical babble, you rarely hear mention of the men and women who organized a multiracial resistance to the first corporate oligarchs.

“The fruits of the toil of millions are boldly stolen to build up colossal fortunes for a few,” the Populists announced when they formed the People’s Party in 1892. The mega-rich who “despise the republic and endanger liberty” were the real danger to democracy.

The People’s Party would contest the rule of these “plutocrats” at a time of rapid social change. Railroads, electricity, and mechanized crop harvesting were transforming the economy, making the “Robber Barons” who controlled these new technologies the richest men on earth. While many workers took home less than $10 a week in 1890, Jay Gould, the infamous stock speculator, was pocketing more than $20,000 a day (in today’s dollars, about $700,000).

Farmers were routinely abused. Railroad monopolies gouged them with inflated charges for shipping wheat and cotton to distant markets, while lenders (especially in the South) extorted interest of 40 percent or more on loans for overpriced supplies and equipment. At a time when farmers and farm laborers accounted for more than 40 percent of the labor force, their collective anger posed a genuine threat to unfettered capitalism.

Neither the Democratic nor the Republican Party saw what was coming. Both were dominated by monopoly capitalists who wanted minimal taxation, no regulation of their “private” business empires, and no legal rights for the farmers and workers who resisted corporate profiteering. At a time when there were no primary elections for choosing a party’s presidential candidate, there was little prospect for internal reform in either major party.

The Populists had to launch a new political movement, drawing support from the Farmers Alliance, the American Railway Union, the women’s suffrage movement, Christian Socialists, the United Mine Workers, and utopian reformers. The People’s Party was also a multiracial movement in the South, where African Americans served on the party’s state executive committees in Texas, Louisiana and Georgia.

The economic and political goals of these Populists were as broad as their membership. They wanted farmer-owned cooperatives that would negotiate for better prices from processors and merchants. They favored public ownership of railroads, utilities and other natural monopolies. They called for postal savings banks and low-cost federal loans for farmers and workers. They wanted recognition of farm organizations and labor unions. Where bankers favored the high interest rates that came from basing the money supply on scarce reserves of gold, the Populists wanted to abolish the Gold Standard and expand the money supply with government-issued bills and silver coinage.

Above all, they wanted to restore a democracy corroded by the blatant buying of privilege. Nationally, they favored the election of senators rather than their appointment by bought-and-sold state legislatures— as was then the case. To reform state government, they called for referendum, recall, and votes for women. In the South, they favored political rights for Black voters.

On this reform platform, the Populists called on the “producing classes” to vote for the return of government “to the hands of the ‘plain people’.”

They failed nationally, but it was a close call in the West, the Great Plains and the South. Fifty Populists won election to Congress from 16 states. North Carolina, Oregon, South Dakota, Nebraska, Kansas, and Colorado all elected Populist governors. The Populist vote would have been higher still in the many southern states where white elites organized a deadly backlash, stealing votes, murdering Populists, and imposing one-party rule by white-supremacist Democrats.

Even so, the Populists transformed the political terrain in America, marked by the subsequent emergence of progressive movements in both national parties. The watershed was 1896, when William Jennings Bryan won the Democratic Party nomination for president on a pledge to regulate the railroads and expand the money supply with silver. Running as a Democrat— and widely viewed as a “Popocrat”— he fell short with 47 percent of the popular vote. But progressives thereafter gained ascendancy in the party, leading to reforms in the next century that included much of the Populist platform: election of senators, votes for women, corporate regulation, collective bargaining rights for workers and farmers, and an end to the Gold Standard.

Bernie Sanders, the Democratic Socialist, is at least a distant cousin of these original Populists. Donald Trump is not. Even the phrase “right-wing populism” is— historically speaking— an oxymoron. The Populists of the 1890s would have despised the likes of Trump, a preening billionaire allied with today’s mega-rich.

Mainstream pundits would nevertheless have us believe that any popular movement calling on “the people” to overturn “corrupt elites” is a populist threat to democracy. Lumping Sanders together with a wanna-be fascist like Trump implies that both men seek to sway voters with equally polarizing and manipulative rhetoric.

Those who apply this shape-shifting term are actually branding themselves. Some are simply unwitting users of a phrase that’s in vogue and gives the appearance of historical insight. Others may know better, but have gotten used to it. Still others deliberately use the populist label to stigmatize any movement that challenges the questionable legitimacy of our elite-dominated “meritocracy.”

Elites who tar their critics in the U.S. with the sly pejorative of “populist” count on our collective amnesia. They’d rather the real Populists remained forgotten, along with the potential they represented.

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/article/185651 https://historynewsnetwork.org/article/185651 0
Political Pundits, Apply the "Resentment" Label with Caution

Although former President Donald Trump's 2024 campaign frequently references his own resentments, is that emotion driving his supporters?

 

 

If someone ever managed to copyright the word “resentment,” the owner would enjoy a steady stream of revenue, especially from columnists and opinion writers. Take those of the venerable New York Times. “The Resentment Fueling the Republican Party is Not Coming from the Suburbs,” reads the headline of a Thomas Edsall column from earlier this year. (January 25, 2023)  Just a day later, Edsall’s Times colleague Paul Krugman declared, “Rural resentment has become a fact of American politics.” (January 26, 2023) Earlier that month, Bret Stephens wrote, in a colloquy with David Brooks, “The problem is that Trump turned the [Republican] party into a single-purpose vehicle for cultural resentments,” adding: “It doesn’t help that coastal elites do so much on their own to feed those resentments.” (January 15, 2023) And in August of last year, Jamelle Bouie struck the same chord: “Republicans would like to offer you some resentment.” (August 22, 2022)

 

Given these assertions, it is no surprise to discover that the rush to evoke resentment coincided with the election of Trump in 2016. It quickly became an off-the-shelf explanation for a political phenomenon that defied all rational expectations. David Remnick, the editor of the New Yorker, vilified the victorious candidate as a “slick performer” who essentially duped his followers by being “more than willing to assume their resentments, their fury, their sense of a new world that conspired against their interests.” And days after the election, Leon Wieseltier, writing in the Washington Post, seized upon it as the apt word to describe the present moment: “Resentment, even when it has a basis in experience, is one of the ugliest political emotions, and it has been the source of horrors,” he declared. Others followed suit.

 

What are we to make of the place of “resentment” in the echo chamber of a significant segment of the commentariat? Does its frequent, casual, sometimes unthinking deployment really offer any insight into the motivations and values of the millions of Americans who voted the former president into office and support him still? It’s like inflation: when we use something too frequently, its value is diminished. Might it be time to place a moratorium on “resentment”?

 

Perhaps not. But we might at least become more aware of its potential meanings and implications, especially those that risk overshooting the mark of what commentators intend to convey.

 

We might recall, for example, that at least since Friedrich Nietzsche it has usually been understood as a profoundly demeaning characterization of people convinced of their unjust victimization, consumed by bitterness and envy, governed by a twisted sense of the reasons for their fate. “Nothing on earth consumes a man more quickly than the passion of resentment,” he wrote in Ecce Homo. And in The Genealogy of Morality, where he cast the emotion as fundamental to the debased morality of the slaves, he says of the resentful man, “His soul squints.”

 

In more recent times, commentators have usually defined this psychological disposition in similar terms.  It is the “villain of the passions,” according to the philosopher of emotions Robert Solomon. It poisons “the whole of subjectivity with its venom… maintaining its keen and vicious focus on each of the myriad of petty offenses it senses against itself.”  One doesn’t have to embrace this rather Nietzschean view to appreciate that resentment is an emotion that few people are eager to “own.” 

 

Or we might also realize that resentment has often been used to delegitimize people who merely exhibit a profound dissatisfaction with the status quo, who insist that they are being denied their just deserts.  The literary scholar Fredric Jameson sees recourse to resentment in explaining protestors’ motivations as “little more than an expression of annoyance at seemingly gratuitous lower-class agitation, at the apparently quite unnecessary rocking of the social boat.” Too often, to fixate on resentment is to ignore or underplay the real grievances that stand behind this usually unappealing emotional state. It is to mistake the symptom for the cause.

 

On the other hand, we might consider that there are different modes of resentment, some indeed not so much a function of envy, or bitterness, or feeling cheated by fate, but rather righteous indications of an injustice that must not be ignored.  And here it is precisely the irritating, clamorous tone of resentment that serves this purpose. “In the midst of the world’s silence, our resentment holds its finger raised,” wrote the Auschwitz survivor Jean Améry in 1966: his lonely protest against the blithe alacrity of his contemporaries to put the past behind them, especially when it came to the Shoah. In the face of this tendency, he writes, “I ‘stuck out’…I persevered in my resentments.” 

 

The moral philosopher Amélie Oksenberg Rorty warned that if we slight or ignore expressions of resentment, we are like the physician who dismisses the symptoms of a suffering patient.  And in the experience of various “Truth and Reconciliation Tribunals” around the world, it has often been former victims’ insistent expressions of resentment that have called a temporary halt to the proceedings—which almost always aimed at achieving the “closure” of forgiveness—until their grievances were adequately acknowledged.

 

Finally, those quick to brand others with the label of resentment ought to think again whether they are so immune from the same ascription.  One thing that distinguishes resentment from other kindred emotions, such as anger, bitterness, or enervating envy, is that it usually signals a moral injury—a conviction that you have been wronged in a way that contravenes some basic notions or standards that should normally govern what people expect for themselves and from others. In our current climate, the tendency is to think of resentment as the farthest thing from “moral”—often, given some of its uglier manifestations, with justification. But anyone with a sense of self-worth has to be at least prone to the kind of moral aggrievement which gives rise to resentment.  

 

I am not arguing for banishing “resentment” from our current lexicon. It’s clearly useful in illuminating the passions and grievances that animate many people in the US and elsewhere, especially on the right. But let’s deploy it less as a means of reproach and more in the quest for insight, perhaps even empathy.   

 

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/article/185620 https://historynewsnetwork.org/article/185620 0
Brandon Johnson Built a Coalition to Win in Chicago. Can He Keep it to Govern?

 

 

 

On Monday, May 15, Brandon Johnson takes the oath of office to become the 57th mayor of Chicago – a moment echoing Harold Washington’s path-breaking election, forty years ago this spring, as the city’s first Black mayor. The historic parallels between 1983 and today, however, are less about who Johnson is – the 46-year-old former teacher and union activist will become the fourth Black mayor in the city’s history – but more about how he got to the fifth floor of City Hall and the challenges that face him and his administration.

 

Not unlike Washington, Johnson won a narrow victory against a more conservative white opponent in Paul Vallas, a former city budget director and schools CEO who emphasized law and order and other racial dog whistles throughout the campaign. Johnson built a multiracial coalition with an overwhelming Black vote and substantial Latino and white support to beat Vallas. Younger voters, who had largely eschewed the first round of the mayoral campaign, came out in larger numbers to help push Johnson over the top in the runoff.

 

Similarly, by the time Harold Washington delivered his inaugural address in May 1983, he had vanquished three prominent white opponents in two elections with an astounding 99 percent of the Black vote, nearly 80 percent of Latinos, and a small but significant number of white progressives, including much of the city’s burgeoning gay and lesbian community. Promising to make the city a fairer, more inclusive place, Washington inspired high hopes among his supporters that he indeed could open up the city to all.

 

But while Washington’s new administration made some important strides, the reality of governing proved even harsher than most had predicted. The explicit racism Washington and his allies faced during the campaign continued as a white City Council majority thwarted most of his policies and appointments for the first two-and-a-half years of his mayoralty in what was called the Council Wars. Deindustrialization, a hostile Reagan White House, and crises posed by crime, drugs, and AIDS proved just as daunting to his policy agenda. When Washington was successful, it was often not only because his allies had his political back, but also because they were willing to maintain their own pressure on the new mayor to follow through on his campaign promises.

 

For instance, when Washington moved slowly to incorporate Latinos in his new administration, activists such as Nena Torres, Miguel del Valle, and Linda Coronado threatened to establish their own independent Latino affairs commission. What became the Mayor’s Advisory Commission on Latino Affairs started as a provocation to the new mayor to take them and their issues more seriously. The commission, formalized in 1984, became an essential voice for Latino interests – from affirmative action and redistricting to infant mortality and urban renewal – and a model for mayoral commissions representing other groups. But the commission only came into existence through intense lobbying by Latinos.

 

Washington’s historic choice of Fred Rice as the city’s first African American police superintendent offered another example. While important symbolically, Rice’s appointment did not change the culture of what remained a highly dysfunctional police department known for harassment and even torture. Excessive force complaints, in fact, rose during the first two years under Washington and Rice. And yet Washington supporters and activists at the time generally trod lightly on his management of the department, knowing that sharp public criticism of the first Black mayor’s handling of the police would simply add fuel to his opponents’ efforts to discredit him.

 

Forty years later, Brandon Johnson faces the same kind of high expectations that Washington did, but in a city far more unequal and financially strapped than it was under Washington. As the new mayor navigates issues of rising crime, under-resourced schools, and now a growing migrant crisis, staying in the good graces of the diverse and inherently fragile coalition that elected him may prove difficult.

 

Ultimately, as in 1983, it will be up to those Chicagoans who voted for reform – including the powerful Chicago Teachers Union, of which Johnson was once a member and organizer – to decide how much patience and grace to show their now elected ally. How accountable will they hold him to his campaign promises to govern differently than his predecessors? Or will another chance at reform in the Windy City slowly blow away?

 

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/article/185649 https://historynewsnetwork.org/article/185649 0
Mary Wollstonecraft's Diagnosis of the Prejudices Holding Back Girls' Education Remains Relevant Today

Frontispiece engraving by William Blake from Mary Wollstonecraft, Original Stories from Real Life, 1791 ed. 

 

 

In 1785, aged only twenty-five, Mary Wollstonecraft, along with her two sisters and her good friend Fanny Blood, opened a school in Newington Green, London. Their aim was to fill the gaping hole in the education of young women, and there seemed no better place to start the rollout. As home to numerous religious radicals and dissenters, Newington Green was a community open to new ideas – one that had already rejected many a status quo. But, despite Wollstonecraft’s best efforts, the school soon failed. Rather than giving up, she turned to writing as a means of championing the cause. Her first book, aptly titled Thoughts on the Education of Daughters, was published in 1787. By 1792, she had moved to France in search of the Revolution and was publishing what was to become her best-known work: A Vindication of the Rights of Woman with Strictures on Moral and Political Subjects.

Contrary to the opinion of the day, Wollstonecraft argued that women were brains, not just bodies: that they were just as capable as men and deserved the same access to education in order to broaden their minds. While this much is, of course, well known, there is a further aspect to Wollstonecraft’s work that has been buried in history. It is a golden nugget, and one that allows us to better understand the obstacles girls face. Referencing the popular conduct manuals of the time, which she described as “specious poisons” that created an “insipid decency,” Wollstonecraft noted that it wasn’t only society’s warped focus on women’s biology that hampered progress towards educational equality but also, more specifically, society’s obsession with female “purity.” Even for a girl whose parents had the means and inclination to support her education, the fear that her virginity could be brought into question made schooling alongside men a virtual impossibility.

The conventional answer was the governess, but a governess was expensive and necessarily limited a woman’s ability to acquire a broad education, and to mix and debate with others. Having been a governess herself, working for the Kingsborough family in Ireland following the failure of her school, Wollstonecraft had firsthand experience. In championing girls’ schooling, Wollstonecraft might have been the proverbial turkey voting for Christmas, but she knew that much more was at stake than her own job (and, in any case, she didn’t much get along with the mother of the Kingsborough brood).

At a time when respectable families placed their daughters’ “morality” ahead of their education, Wollstonecraft stated that “[w]ithout knowledge there can be no morality.” True virtue, she argued, could only ever be achieved by immersing yourself in life and experiencing the world, as men were encouraged to do, including on their grand tours. In her Vindication, she writes that “men have superior judgement” because “they give a freer scope to the grand passions, and by more frequently going astray enlarge their minds.” Men were allowed to achieve wisdom and virtue because “the hero is allowed to be mortal.” By contrast, heroines “are to be born immaculate.” For women, everything was to be lost; for men, everything was there for the taking. What Wollstonecraft ultimately called for was a “revolution in female manners.”

While the revolution in female manners is still ongoing, progress in regard to women’s schooling came in the late nineteenth century, albeit only for the wealthy. In England, Cheltenham Ladies’ College opened in 1853, followed by Roedean School in 1885. By the late nineteenth century, young women were able to acquire an education at my own university, Cambridge, in ladies’ colleges strategically positioned outside the city center. The compromise was, of course, gender segregation.

Even if young women could by then acquire a mentally challenging education, the next step, entry to the workforce, also presented a reputational risk. While it was not a viable strategy for the poorest families, families with means expected their daughters to remain at home until marriage, spending their days helping with domestic tasks, preparing themselves to become good wives and mothers. Priscilla Wakefield was, however, no stranger to paid work. Living at the same time as Wollstonecraft, Wakefield managed to carve out a successful career as a writer, publishing a total of seventeen books, while also finding time to establish England’s first savings bank for women and children. Informed by her personal experience, Wakefield offered her own solution to the problem of preserving female virtue, one which involved embracing paid work but with strict limitations attached.

According to Wakefield, the central reason why women fell into “sexual sin,” including sex work, was a lack of financial support. Limiting young women’s educational development and their ability to earn was, she thought, a recipe for immorality, not morality. Rather than protecting women, their exclusion only succeeded in leaving them vulnerable. The phenomenon of “fallen women” was, she argued, an economic and not a social problem, one that resulted from a “dreadful necessity.” By way of a solution, her Reflections on the Present Condition of the Female Sex; with Suggestions for its Improvement (published in 1798) proposed an intricate and detailed plan for women’s work, tabulated by class, with educational and training recommendations for each “class.” She attempted to reconcile work and virtue, combining Wollstonecraft-style thinking with social conservatism. With it, Wakefield recommended that poorer women be properly trained as hairdressers, cooks or seamstresses so as to avoid falling into harlotry, and that men should be discouraged from working in such professions, keeping them “safe” for women. For the handful of women born into families with means, writing and painting were at the top of her list of recommendations, as they could be conducted from the “safety” of the home, away from men. Segregation along gender lines was, for Wakefield, the route to liberation.

The cult of female modesty has hampered women’s access to education and work for a long time. Sadly, it continues to have the same effect in parts of the world today. While the number of children not in school across the world has fallen over the last two decades, at current rates of progress it will be 2050 before all girls have been educated to at least primary school level. Evidence suggests that the poorest girls tend to be withdrawn from school at puberty (between the ages of 12 and 14). In 2020, the countries with the highest out-of-school rate for girls in this age group were: Mali (84% out-of-school), the United Republic of Tanzania (81%), Guinea (78%), Nigeria (78%), Benin (73%), Pakistan (70%), Mauritania (63%), Afghanistan (62%), Senegal (58%) and Côte d’Ivoire (57%).

In 2012, the struggle for girls’ education in Pakistan came into sharp focus when Malala Yousafzai, then aged fifteen, was shot in the head by masked gunmen on her way home from school. She had become the target of the militant group Tehrik-i-Taliban following her campaign for girls’ education. Four years before the attack, in 2008, she and her female friends had been denied schooling when her town, in the Swat Valley, came under Taliban control. Since her recovery, Malala has continued her campaign. So too, sadly, have her enemies.

Following the introduction of the new Taliban regime in Afghanistan in 2021, secondary schools were closed to girls. At the same time, the work of the Women’s Affairs Ministry was swallowed up by the Ministry of Vice and Virtue. Under the Taliban, and much as in Wollstonecraft’s time, “morality” comes first and that morality does not include a right to an education.

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/article/185652
How a Little-Known Anti-Vietnam Protest Reverberates Today

 

 

Probably no period of US history witnessed more student unrest than the Vietnam War years before the draft ended in 1973. Student demonstrations started in 1963 at St. John’s University in Queens, NY, gained momentum at Berkeley the following year, and culminated with strikes at nearly 400 colleges and universities following the killing of students at Kent State and Jackson State in 1970.

Why, then, might a small-scale demonstration at a remote, though distinguished, university in western New York deserve our attention?  Only fifteen students and two professors demonstrated against the Vietnam War during an ROTC ceremony at Alfred University (AU) in May 1968.   The university suspended seven students and fired one of the two faculty protestors.  AU charged them with violating recently adopted guidelines relating to demonstrations.  The incident hardly created a ripple beyond the local area. 

Although most Americans knew nothing about this event, the ACLU took note. Alfred University is a private university, but it contains an internationally acclaimed Ceramics College funded by the State University of New York. Three of the seven students were enrolled in that SUNY unit, and the ACLU, mainly represented by a young and brilliant civil rights attorney named Neil Fabricant, aided the students.

The fired professor, a 40-year-old historian named Michael Kay, was an outspoken anti-war radical. Ironically, he had been hired in part because he was a Marxist; the department chairman, David Leach, thought students should be exposed to a range of historical viewpoints. Nevertheless, Kay had become a thorn in the side of the university. He organized a chapter of Students for a Democratic Society (SDS). He rarely attended faculty meetings and frequently canceled classes. AU’s president complained, with justification, that he “has a passion for anarchy and a genius for discord.”

Many of Kay’s colleagues would have agreed.   A sociologist who shared Kay’s political views wrote after the university fired him that “he gave no quarter and deserves none.”   He was considered so disagreeable that not even the local American Association of University Professors (AAUP) chapter came to his defense when he alleged that the university had fired him because of his left-wing politics.    

Despite calls to reinstate him, the university stood firm.  He had clearly violated the university’s demonstration guidelines by interfering with the progress of the ROTC ceremony and refusing to move away when ordered to do so. He had been warned that his behavior at the ROTC ceremony placed him at risk of dismissal.   And because AU was a private institution, he could not legally challenge his firing. 

Not so the three suspended SUNY Ceramics College students. They claimed that AU violated their First Amendment freedoms of speech and assembly, along with their Fourteenth Amendment right to due process. Joined by the other four students, all seven went to court. Because the Ceramics College, one of AU’s four colleges, was fully funded by the State of New York and because the state provided AU with about $200,000 to cover instructional costs for Ceramics students taking courses in other AU colleges, the plaintiffs argued that AU officials had acted as state agents and therefore that the suspension constituted “state action.” That concept—state action—though little known outside of the legal community, became critically important to their suit.

Their case would be known as Powe v. Miles, Emile Powe being the first of seven plaintiffs, and Miles being Leland Miles, president of the university. The students went to Federal District Court in Buffalo where the case was assigned to Judge John T. Curtin. Following two days of hearings, Judge Curtin held that the university had not acted in the role of the state and therefore that the “state action” principle was inapplicable. For that reason, the students did not have standing to sue in a federal court. A private university, Curtin concluded, could suspend students for almost any reason, and AU was private despite receiving state monies for the Ceramics College.

The students’ attorney, Neil Fabricant, strongly disagreed.  He persuaded his clients to appeal.  Off they went to the U.S. Court of Appeals for the Second Circuit in New York City. 

 There, a three-judge panel that included Henry J. Friendly, perhaps the most highly respected appellate court jurist in the country, accepted Fabricant’s argument that New York State’s funding of the Ceramics College meant that suspending the students indeed constituted “state action.”  Fabricant reminded the Court that the very name of the college—the New York State College of Ceramics at Alfred University—justified the “state action” designation.  The Second Circuit therefore reversed Judge Curtin’s lower-court decision. It concluded that a federal court could properly address the First and Fourteenth Amendment issues raised by the plaintiffs.

Unfortunately for the students, however, the Appeals Court did not find that AU had violated their constitutional rights. The Court held that AU’s demonstration guidelines requiring such things as 48 hours’ prior notice and no disruption of educational activities (the ROTC ceremony was technically a class) were reasonable. The Court further noted that the university had given the students adequate opportunity to protest in a way that did not abridge their First Amendment rights. They were permitted to display signs calling for an end to the Vietnam War and the abolition of compulsory ROTC by standing to the side of the ROTC parade grounds so long as they did not disrupt the ceremony. The university also granted the students a right to appeal their suspension, thereby preserving their Fourteenth Amendment right to due process. AU even permitted the students to take their spring semester final exams off campus. The Court therefore sustained AU’s decision to suspend the students for the fall semester.

The Appellate Court may have exonerated AU, but the AAUP was less forgiving.  The AAUP is a professional organization committed to the defense of faculty and the principle of academic freedom.  It ignored the student side of this controversy and mounted a fourteen-month investigation into Kay’s dismissal.  With laser focus, the AAUP highlighted the fact that Kay had been fired before he had a chance to exercise his right to appeal and therefore concluded that he had been denied due process.  The AAUP disregarded the inaction of the local AAUP chapter. Some members had found Kay so objectionable as to have recommended even before the ROTC protest that he be terminated. 

 Nevertheless, in what can only be viewed as a victory for AU, the AAUP stopped short of censuring the university after Alfred officials agreed to pay Kay a year’s salary and to update its faculty handbook in accord with AAUP recommendations. 

 So why should we remember this matter?  Not because of Professor Kay’s fate, but because the Court of Appeals redefined the legal status of a private university that receives state funding.  Is a private university subject to state regulation in respect to protests?  Will its faculty and students enjoy constitutional protections?   

Powe v. Miles became a national moot court case. It has been cited in federal and state courts 216 times since 1968. Seventy-five of these citations relate specifically to the “state action” concept. Moreover, in the immediate wake of Powe v. Miles, the New York State legislature passed Education Law Section 6450 requiring every institution of higher learning in New York receiving public funds to “adopt rules and regulations for the maintenance of public order … and provide for the enforcement thereof.” A college refusing to abide by Section 6450 would forfeit state monies.

From Powe v. Miles in 1968 until about 1982, courts expanded the scope of the “state action” concept, especially related to issues of race and gender. After 1982, reflecting a more conservative legal environment, courts narrowed their interpretation. In short, Powe v. Miles has influenced a corner of American law for over a half century, which is to say that the ripple effects of the 1968 demonstration at Alfred University reverberate into the 21st century.

 

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/article/185650
Slowing Our Roll on Silicon Valley

With the world aflutter about AI technology, Silicon Valley Bank going belly up, and legal proceedings around the collapsed cryptocurrency exchange FTX heating up, I spoke with Malcolm Harris, author of Palo Alto: A History of California, Capitalism, and the World.

Malcolm and I went back a few millennia to contextualize and uncover a virulent, prejudicial ethos that has shaped Palo Alto, home of the tech industry, since its founding. A transcript edited for clarity is below. 

Ben: Malcolm, thank you so much for being here. 

MH: Thank you so much for having me. 

Ben: Of course! Today I'd like to explore what you present as “the curse” of Palo Alto, both on itself and the world. 

To begin, let’s discuss the people who inhabited California long before settlers arrived.

MH: Right, when you're talking about Anglo-American California, you have to step back and talk about not just the Spanish and Mexican periods that preceded it, but pre-contact California.

Estimates now have 300,000 Indigenous people living in California for millennia before Spanish colonization. The political nature of the Indigenous communities of California, particularly Northern California and the Bay Area, was stunning.

Ben: As you write, California was home to “one of the densest concentrations of human linguistic and cultural diversity scholars have ever been able to reconstruct anywhere in world history.”

MH: And we still don't quite understand the complexity of the political and social structures of these societies.

Spanish colonization was devastating for California’s Indigenous population, decreasing the population by around half to 150,000 people. Still, Indigenous societies endured well into the 19th century, which I think is really important to note. California's self-justifying ideology blames the Spanish for the genocide of Indigenous people and then holds that Anglo-Americans showed up and kicked out the Spanish without any blood on their hands, but that's not what happened.

As the gold rush began in 1849, state-sanctioned murder gangs took over California based on the pattern of Texas. During the Civil War, California was home to some of the most egregious colonial violence in US history. That's how Palo Alto was able to emerge on what had long been the homeland of the Muwekma Ohlone.

Ben: We explored Texas history recently with Professor Gerald Horne, and one of the takeaways from that conversation was the sheer scale of violence in Texas. To think such violence extended to California and even perhaps exceeded it is astonishing.

Let’s shift to the exceptionally unexceptional founder of Palo Alto, Leland Stanford. How did he end up there?

MH: Leland Stanford, originally from New York, was the least competent of an early group of capitalists to arrive in California. This group, who later called themselves the Associates, grew rich from running the Central Pacific and Southern Pacific railroads, which brought white settlers to the West. Leland was the dumbest of the Associates, so his buddies made him the public face of the company to take the heat for a pyramid scheme they were running. 

And though Leland dodged a federal investigation (by dying before it was completed), he faced a lot of heat from workers who, during the 1870s, rallied outside his house in San Francisco. Leland wasn’t that worried—again, he wasn’t too smart—but other people were worried for him. So Leland bought a big piece of land in the South Bay, moved there with his wife, Jane, and their son, and created the suburb we now know as Palo Alto (named after a tall tree), where they were far better insulated from the class tensions of the city.

Ben: And Leland became governor of California at one point, right? I remember you noting that he went by “governor” for the rest of his life.

MH: Yep. He served one relatively undistinguished term (and later served as senator). One of the few things he did as governor was fund the genocidal militia campaigns. 

Ben: Another reminder to never trust anybody who chooses to maintain a title for the entirety of their lives (Queen Bey excepted). 

Obviously, the name “Stanford” sounds familiar. Can you discuss the founding of the university?

MH: In 1884, Jane and Leland’s son suddenly died, so they decided to take the privileges that they were going to endow their son with and spread them to “the children of California” (i.e. the children of other members of their settler class). They founded Stanford the following year, but it was really the first president whom the Stanfords recruited, David Starr Jordan, who set the direction for the kind of university Stanford would become.

Ben: You describe him as a “school administrator committed above all to the genetic future of the white race.” 

MH: Jordan was not just a eugenicist, but one of the senior eugenicists in the world, and in his mind, the onset of World War I presented one of the greatest threats yet to the white race. He recruited new faculty to Stanford to develop eugenic strategies for fighting the war, including most impactfully a guy named Lewis Terman.

Terman brought this technology called the IQ test from France to the United States and reformatted it at Stanford into what we now call the Stanford-Binet IQ test. The goal was to figure out how to ensure that supposed A-students were at college doing reserve officer training while C-students were on the front lines getting shot and shooting.

The military adopted the IQ test. From the beginning, it was based on racial pseudoscience, intended to send people of color to the front lines.

Ben: In the book, you include a sample question from the IQ test: “What is Christy Mathewson’s job?” Answer: pitcher for the New York Giants.

I'm a big Yankee fan, as well as a fantasy baseball player, and I promise you that knowledge of obscure baseball statistics is inversely proportional to one’s ability to function as a contributing member of society.

MH: Ha! The tests were totally arbitrary, as many scientists at the time pointed out. However, they went into mass use, and though we don’t think of California as the laboratory for the construction of whiteness, it really was. 

Ben: One thing that's really interesting about your book, too, is that it’s not just a history of California, but a global history as well.

Can you discuss how after the world wars, Palo Alto became “a conduit for the production of an outsourced capitalist planet?”

MH: Absolutely. Stanford students and faculty and early Palo Alto companies helped develop the aviation technology that later culminated in the raids over Tokyo and Germany during World War II. 

The areas that were bombed became the real centers of growth for the post-war era, and the electronics industry in particular moved in as soon as the war was over. For example, Hewlett-Packard, founded by Stanford graduates, built its first overseas factories immediately in the bombed-out areas of Japan and Germany.

This was part of a new American policy of building up countries as ramparts against what was now the new threat, global communism. During the Cold War, throughout the developing world, American companies and Silicon Valley firms especially replicated this dynamic, going into foreign countries, taking advantage of unsettled labor situations (or intentionally destabilizing labor situations), and saying, alright, now you’re going to work to the advantage of the US economy.

This practice set the tone for more famous tech companies that came along later like Apple to outsource their work.

Ben: Related, you say that innovators in computer technology became “the tools that got capital from the crisis of the 1960s to the ‘greed is good’ 80s.” 

Can you elaborate on that crisis, and how computer nerds fit into the story?

MH: The 60s were explosive, both in the US and around the world. The global decolonization movement had stepped up, picked up guns, and started to win territory, right? It began to wind back control of whole societies, whole continents.

And this movement, which included the Black Power movement in the US, was very much a threat to the status quo of white power. For the litany of Californians who still believed in white hegemony, this was a problem similar to the one that David Starr Jordan and Lewis Terman perceived earlier in the century: How do you maintain inequality in a world that is globalized?

That brings us to the nerds. We don't often link the creation of the personal computer to this problem, but they're definitely associated. 

Consider for example what it meant to be in a private school in the late 60s and early 70s in segregated American enclaves. Pulling your kids out of public institutions and placing them into private schools that were white-only or that could exclude non-white people at much higher rates, thereby concentrating privilege within those institutions, was very much a white power solution to the threat of integration.

I talk about Bill Gates and Paul Allen within this context because they attended just such a school, the Lakeside School in Seattle, at the same time that Stokely Carmichael was in town rallying nearby Black students about the need for integration and sharing of resources. 

The mentality to pull out of the city, go to the suburbs, and go to private schools can’t be separated from the mentality to shrink computers down, as Steve Jobs did at Apple, or to privatize computer software, as Gates and Allen did at Microsoft. Effectively, Gates, Allen, Jobs, and other so-called “geniuses” made computer access a private privilege. In a world that was getting rapidly worse for most people, they weren’t serving the public but themselves, and they got rich from it.

Ben: I've never felt such antagonism towards my computer as I did when reading your analysis.

Apple and Microsoft have made a lot of money but many celebrated Silicon Valley companies have never turned a profit. Can you discuss why you think the historical trajectory of entrepreneurialism and Palo Alto has taken a turn for the “dumb”?

MH: Sure. Over the last fifteen or twenty years, people have gotten rich by selling failing businesses at the right time or by selling promises that companies will have access to lots of funds in the future. So, Google made money. Facebook made some money, but Uber's never made any money, right? Stripe has never made any money.

But these companies have still made financiers rich, either because the investors sold their shares to other people or maintained stakes in now-public companies that are worth billions, even though the companies have never produced anything in the productive sense of the economy. 

And so that's where I talk about it getting stupid. In the early 20th century, the finance layer underlying the Bay Area business community could look at the future of California business and say, yeah, we're in on this, we're in on farmland, we're in on the water rights, we're in on the movie industry, we’re in on the radio industry.

These were all real growth areas, but now, tech founders are continually rewarded for their failure to produce anything.

Ben: I wish I’d been rewarded for the same.

MH: Or consider the background of Sam Altman, the CEO of OpenAI, which is more or less a series of catastrophes—bad business racked up on top of bad business and failure upon failure. In this finance profit era, those all turn out to be successes, and he’s now reaching iconographic status. 

I think the kind of bank run that we saw recently at Silicon Valley Bank reveals how fickle this kind of success is, as well as signifies diminishing faith in the profitable expansion of this dumb finance-tech business model.

Ben: The fickleness of the dynamic also seems to represent an extension of Palo Alto’s history as a place rooted in bad science, reckless speculation, and outright fraud dating back to the days of Leland Stanford, and bringing us full circle.

Malcolm, this has been a pleasure. Thank you so much for your work and your time today. 

MH: Thanks again! I had a great time.

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/blog/154737
The Roundup Top Ten for May 12, 2023

Arlington Cemetery Will Remove its Racist Confederate Monument; Some Won't Let it Go

by Erin L. Thompson

The Arlington Confederate Memorial, according to a Defense Department review, mythologizes history so severely that no contextualizing signage could overcome its embodiment of the Lost Cause myths that justified Jim Crow. Some adherents of that myth are mad. 

 

Faculty and Librarians Must be Allies in the Fight to Save Higher Ed

by Emily Drabinski

"A robust defense of free expression requires an equally vociferous defense of the institutions where that speech is most widely celebrated. The fight for higher education must be a fight for the library as well."

 

 

Americans Can and Must Fight Back Against Anti-Woke Authoritarians

by Khalil Gibran Muhammad

Signs are emerging that Americans won't be persuaded by moral panics to surrender the freedom to learn. What can this counteroffensive build on? 

 

 

Can Capitalism Exist Without Excess?

by Trevor Jackson

The pandemic supply chain disruptions have focused attention on shortages, but the problem of gluts—of food being destroyed when it can’t be profitably sold—reflects a deeper problem with global capitalism.

 

 

Who's Afraid of a Black Cleopatra?

by Gwen Nally and Mary Hamil Gilbert

The controversy over the portrayal of Cleopatra by the Black British actress Adele James highlights the difficulty of reading modern ideas of race and identity back onto the past. But more interesting questions arise around why people in the present seek commonality with past figures. 

 

 

Why the French are Striking

by Moshik Temkin

Brits and Americans commonly refer to French protests as a form of national sport, which obscures the serious retrenchment of the welfare state that President Macron is seeking to impose, and trivializes opposition to the changes.

 

 

Coke Money and the Public Relations of Higher Ed Divestment from Apartheid

by Amanda Joyce Hall

At Emory University in the 1980s, the close relationship between Coca-Cola and the administration allowed Coke to leverage its financial support of the university to press the administration to suppress and co-opt student movements for divestment from Apartheid South Africa.

 

 

Onoto Watanna, the First Asian American Screenwriter

by Ben Railton

Under the pen name of Onoto Watanna, a woman named Winnifred Eaton of British and Chinese descent became a literary prodigy, penning romance novels, ethnic cookbooks, and screenplays—and a searing critique of the treatment of writers in Hollywood that rings true today. 

 

 

The Siege of Wounded Knee was a Beginning for Renewed Native Resistance

by Benjamin Hedin and Nick Estes

Movement activists occupied the Wounded Knee site in 1973, in defiance of corrupt tribal leadership and federal authorities. Both the occupation and the massacre of Native people at the same place in 1890 had been cast as tragic endings. Native activists insist that they represent cultural and political rebirths.

 

 

Washington State Law to Offer No-Interest Home Loans to Redress Decades of Discriminatory Housing

by James Gregory

Specific practices by private lenders and public authorities have created and perpetuated disparities in homeownership and wealth through real estate. Guided by researchers, Washington State is attempting to compensate for that harm.

 

Wed, 31 May 2023 10:41:47 +0000 https://historynewsnetwork.org/article/185648