History News Network - Front Page. Tue, 23 Jul 2019

Jane Addams and Lillian Wald: Imagining Social Justice from the Outside

Jane Addams (left) and Lillian Wald (right)


Anyone who has taken a United States history course in high school knows the story of Jane Addams and Chicago’s Hull House, the first Settlement House in America and arguably the genesis of social work in the country. More advanced textbooks may even have discussed Lillian Wald, founder of New York’s Henry Street Settlement House, who was instrumental in introducing the concept of “public health” – and the important epidemiological axiom that physical well-being is inseparable from economic and living conditions. 


What no one learned in high school, or later, was that Addams and Wald were women who loved other women and that these relationships – as well as the female friendship networks in which they were involved – were profoundly instrumental to their vision of social justice that changed America. 


Since the country’s founding – even amid deep-seated prejudices and politics of exclusion and animus – there has been an American impulse to help the less advantaged. This was the kinder aspect of Winthrop’s 1630 sermon “A Model of Christian Charity” (better known as the “City upon a Hill” sermon), and the sentiment was still evident in George H. W. Bush’s 1988 “Thousand Points of Light” speech. Helping fellow countrymen – at least those deemed worthy of help – was a social and political virtue. 


What Jane Addams and Lillian Wald did was different. They imagined an America in which helping the poor was not charity but a work of democracy and a demonstration of equality. Addams and Wald, and many other women like them, were complicated products of the traditional American impulse for charity and the massive reforms of the Progressive Era. What made them distinct was that their status as single women, and as lovers of women, gave them an outsider status that allowed them to envision different ways of structuring society.


Jane Addams, born in 1860, grew up in what looked like a nineteenth-century picture-book American home in Cedarville, Illinois with servants and farmhands. Her family was prosperous and owned factories that produced wool and flour. Her father, a friend of Abraham Lincoln, was an abolitionist and progressive and raised his children likewise. While attending Rockford Female Seminary in Rockford, Illinois, Addams met Ellen Gates Starr and the two became a couple, exchanging constant letters while they were apart. In 1885 Starr wrote to her: 


My Dear, It has occurred to me that it might just be possible that you would spend a night with me if you should be going east at the right time. If you decide to go the week before Christmas - I mean - what do I mean? I think it is this. Couldn't you decide to spend the Sunday before Christmas with me? Get here on Saturday and go on Monday? . . . Please forgive me for writing three letters in a week


In 1887, after hearing about Toynbee Hall in London’s impoverished East End, Addams became intrigued with the new concept of a settlement house: group living in poor neighborhoods that brought local women, men, and children together with teachers, artists, and counselors from various backgrounds. Today, we might call the concept “intentional living groups.” These collectives – often funded by wealthy people – offered education, health care, arts training, day care, meals, and emotional support for the economically disadvantaged. Addams and Starr visited Toynbee Hall and decided to open something similar in Chicago. In 1889 they opened Hull House with the charter “to provide a center for the higher civic and social life; to institute and maintain educational and philanthropic enterprises, and to investigate and improve the conditions in the industrial districts of Chicago.” Later, after Addams and Starr separated, Addams’s new lover Mary Rozet Smith joined her in this grand social experiment.





Lillian Wald had a similar story. Born into a comfortable, middle-class Jewish family in Cincinnati, Ohio in 1867, she was raised in Rochester, New York. Although she was a brilliant student, she was turned down by Vassar College because, at sixteen, she was considered too young. Instead, she later went to nursing school. Inspired by Jane Addams and Hull House, Wald and her close friend Mary Brewster moved, upon graduating, into a tenement in the immigrant communities of New York’s Lower East Side and began their nursing careers. They believed that nursing involved more than physical care. It was important for them, and other nurses, to live in the neighborhoods of the people for whom they cared and to address social and economic problems as much as physical ills. Wald coined the term “public health nurse” to convey the broad swath of this goal. Soon, Wald and Brewster moved into a home on Henry Street that eventually became the Henry Street Settlement. It became a model of community-based health initiatives, and the Visiting Nurse Service eventually grew out of this work. In 1906 it housed 27 nurses; by 1913 it employed 92. 


Wald and Brewster received emotional and financial support from many women, and some men. But, much of the core of Henry Street Settlement was formed around a close network of single women, who among themselves had a complex series of personal friendships and romantic relationships. The Manhattan socialite, and daughter of a prominent New York minister, Mabel Hyde Kittredge, for example, worked at Henry Street Settlement for many years and was an intimate friend to Wald. In the early years of their friendship she wrote to Wald:


I seemed to hold you in my arms and whisper all of this. . . . If you want me to stay all night tomorrow night just say so when you see me. . . . Then I can hear you say "I love you"-and again and again I can see in your eyes the strength, and the power and the truth that I love. 


Wald had a vast network of women friends – lovingly referred to as her “steadies” – and at the end of her life she said “I am a very happy woman... because I’ve had so many people to love, and so many who have loved me.” 


What does it matter that Addams and Wald were women who loved women? Addams had two major loves in her life, with whom she shared work, a vision, and a bed. Wald’s relationships were less dedicated, but no less intense. Would they have been able to do this important work if they had been heterosexual, married, and probably mothers? Certainly there were many married women – from Julia Ward Howe in the mid-nineteenth century to Eleanor Roosevelt in the mid-twentieth century – who partook in public life, public service, and social reform. What set Addams and Wald and their friendship circles apart was that they were outsiders to social conventions. 


In a world dominated by heterosexual expectations, being a single woman culturally set you apart in ways that were dismissive – words and phrases such as “spinster” and “old maid” – but also liberating: you were not burdened with the duties of marriage and motherhood. Addams and Wald were also fortunate to come from wealthy families, which gave them the ability to dictate their own life choices. With limited opportunities for gainful employment, many women understood that marriage was their best path to economic security. As women unattached to male spouses, Addams and Wald were able to break from traditional modes of female giving, such as the ideology of motherly love or the distanced, munificent “lady bountiful.” 


Yet there is something else here as well. Unburdened by the expectations of heterosexual marriage these women imagined and explored new ways of organizing the world. They created new social and housing structures – extended non-biological families – that were more efficient and more capable of taking care of a wealth of human social, physical and emotional needs. In large part they were able to do this because they did not rely on the traditional model of heterosexual marriage and home as the building block of society. Instead, they rejected this model. 


Historian Blanche Wiesen Cook has written extensively on how these female friendship circles – precisely because they were homosocial, and in many cases homosexual – were able to transform American social and political life with a new vision of how to organize society and care for family in the largest sense of the word. Such a vision is not only profoundly American, it is the essence of social justice. 

https://historynewsnetwork.org/article/172589
The History of the American System of Interrogation


Why do people confess to serious crimes they did not commit? Such an act appears totally against human nature. And yet, we have the case of the Central Park Five, in which five teenage boys didn’t just say “I did it,” but gave detailed, overlapping confessions. Based upon those confessions, and with no supporting physical evidence, the boys were sentenced to prison. After they had served nearly a dozen years, someone else confessed to the rape, and that confession was backed up by DNA evidence. How can this happen?


The more important question is: were the Central Park Five confessions a freak accident? The answer is no. Every day, adults and juveniles confess to serious crimes they did not commit.


After 40 years of practicing law and having handled a case similar to the Central Park Five, I can tell you that it is not the people who make these false confessions that society should look to for an explanation, but rather the system itself – in particular, how the authorities question potential suspects.


At the core of American criminal justice is an accusatorial system that assumes a suspect is guilty. This accusatory model runs through all levels of law enforcement and naturally leads to an accusatory method of interrogation, where the suspect is presumed guilty by their questioner.


At first, physical torture was used to extract confessions, verifying the interrogator’s theory of guilt. Then in 1936, the United States Supreme Court ruled, in Brown v. Mississippi, that confessions obtained through violence, such as beatings and hangings, could not be entered as evidence at trial. The court recognized that any human can be coerced to say anything, and as such, confessions obtained by torture were unreliable.


As a consequence, the authorities went to a softer and less obvious method of coercion: the “third degree.” The third degree left less-observable physical marks of torture; the police shoved the suspect’s head into a toilet, twisted arms, or struck the accused in places that would not leave an obvious mark. Interrogations were conducted nonstop for days, with sleep deprivation, bright lights, verbal abuse, and threats to the suspect and suspect’s family all commonplace.


In the early 1960s, John E. Reid, a polygraph expert and former police officer, and Fred E. Inbau, a lawyer and criminologist, devised an extensive method of psychological interrogation called the Reid Technique of Interrogation. This model is based on psychological manipulations and the ability of the questioner to tell when the suspect is lying, and it is used today by practically all police departments in the United States. The Reid Technique follows the American tradition of accusatory criminal investigation. Instead of torture, however, the Reid Technique utilizes isolation, confrontation, the minimization of culpability and consequences, and the officer’s use of lies about evidence that supposedly proves the suspect is guilty.


This accusatory method establishes control over the person being investigated by leaving the suspect alone in a small, windowless, claustrophobic room prior to interrogation; has the interrogator ask accusatory, closed-ended questions that reflect the police theory of what happened; and has the officers evaluate body language and speech in order to determine if the suspect is lying. The goal of this psychological interrogation is to overwhelm the person being questioned and to maximize the suspect’s perception of their guilt. When necessary, a softer approach by the investigator allows the suspect to perceive their conduct in a socially more acceptable light and thereby minimizes both the perception of the suspect’s guilt and the likely legal consequences if they confess.  


By the time the Supreme Court decided Miranda v. Arizona, psychological interrogations had supplanted physical coercion. But with no obvious marks of torture, the Supreme Court now had difficulty distinguishing voluntary from involuntary confessions. The Court noted:

[T]he modern practice of in-custody interrogation is psychologically rather than physically oriented.


As we have stated before, this Court has recognized that coercion can be mental as well as physical, and that the blood of the accused is not the only hallmark of an unconstitutional inquisition.


The justices went on to emphasize the “inherent coercion of custodial interrogation [when considering] the interaction of physical isolation and psychological manipulation,” and concluded that new safeguards were necessary in order to ensure non-coerced confessions. Thus, the Court required the now-famous Miranda rights warning that law-enforcement agencies read to suspects.


These safeguards are as follows:

  • You have the right to remain silent.
  • Anything you say will be used against you in court.
  • You have the right to an attorney.
  • If you cannot afford an attorney, one will be provided to you.

However, these Miranda safeguards do not prevent psychologically induced false confessions.


Scholars say the flaw in psychological interrogations like the Reid Technique is the assumption that the investigator can detect when the suspect is lying. Studies challenge the ability of a person, even a trained investigator, to tell when a person is lying. This is particularly true when dealing with the young, the poorly educated, or the mentally ill, especially when the suspect is under the psychological stress of isolation, accusation, and the presentation of false evidence.


As a criminal defense attorney, I have had firsthand experience observing the results of the Reid Technique when used against juvenile suspects. The Crowe murder case is a prime example of how psychological interrogation goes wrong when the police proceed from the assumption that a suspect is guilty. In the Crowe case, the police had no evidence as to who killed 12-year-old Stephanie Crowe. But her 14-year-old brother, Michael, didn’t seem to be grieving appropriately. By the time the police were done interrogating Michael, he had confessed to the murder, and two of his high school friends had given statements tying them to the murder. All three were charged as adults for the murder. I represented one of the boys. At the defense’s insistence, a mentally ill vagrant’s clothing was tested. DNA tests found Stephanie’s blood on the vagrant’s clothing. The boys were released and exonerated of all guilt.


Such false confessions need not happen. There is a new method of interrogation created by the High-Value Detainee Interrogation Group (HIG) that is quietly being used by a few law enforcement agencies. HIG was established in 2009 as a reaction to the physical and psychological torture at Guantanamo Bay, Abu Ghraib, and other overseas CIA facilities during the post-9/11 Bush years. The HIG technique represents a joint effort by the FBI, the CIA, and the Pentagon to conduct non-coercive interrogations.


How HIG works and the tactics used by interrogators are closely held government secrets. But this we do know: The United States government through HIG has funded over sixty studies in the psychological and behavioral sciences worldwide, with particular emphasis on studies of the law enforcement models in England and Canada. These two countries have abandoned the Reid Technique of psychologically accusatory interrogations for a “cognitive interview” model where the suspect is asked what they know about the crime. This method presumes the suspect is innocent and allows the suspect to tell their story without interruption or accusation. The investigator may ask the suspect about contradictions or inconsistencies between the suspect’s narration and the known evidence. But the interrogator may not lie about the evidence or deceive the suspect.


Could we be seeing the end of the American method of accusatory interrogation and the beginning of a new and more effective form of interrogation? One thing is sure: a 2014 HIG study found the cognitive-interview method more effective than the American accusatory approach at producing true, rather than false, confessions.

https://historynewsnetwork.org/article/172588
    "If ever I felt that I ought to be five priests it was that week:" Chaplains in World War 2



An exasperated Father Henry Heintskill, C.S.C., a Notre Dame chaplain posted to naval duty in the Pacific, faced the same issue with which almost every military chaplain grappled during World War II—how to perform the multiple tasks that normally required the services of two or three chaplains. “There are all sorts of problems the men have,” Father Heintskill wrote in a letter to his superior, “they’re worried about conditions at home, etc. We have to do what we can.” He explained that after one Friday evening service, at least two hundred men gathered for Confession, requiring him to remain an hour after lights out at 9:30 p.m. “If ever I felt that I ought to be five priests it was that week.”


Chaplains celebrated Mass and helped the men complete government forms. Some soldiers, not long out of high school, wondered what combat would be like. Others asked about the morality of taking another man’s life. 


One duty, however, was paramount—to be at the side of a soldier or sailor as the young man died. It was then that the chaplain could administer Last Rites, with their promise of dying with a clear conscience. 


That was evident from the first day of the war, when ninety Japanese aircraft struck Clark Field in the Philippines shortly after noon. Father John Duffy, a Notre Dame diocesan priest from Ohio, eluded bullets and bombs as he ran to the field, littered with dead and wounded, to hear confessions and administer Last Rites. To avoid wasting precious moments by inquiring about a soldier’s faith, he gave Last Rites to any dying serviceman he came across. “I knew it would be effective for the members of my faith & that it would do the others no harm,” he explained later. “There wasn’t sufficient time for inquiry about religious tenets of the wounded.” 


Four months later, Father Duffy lay at the receiving end of the sacrament. After enduring severe abuse at the hands of cruel Japanese guards on what became known as the infamous Bataan Death March—“Extreme Unction, Baptism, Confessions administered daily on march,” wrote Father Duffy. “Death, pestilence, hunger, exhaustion, depleted all.”—the priest lay on the ground, apparently dying from bayonet slashes to his body. A Protestant chaplain knelt beside his friend, held Duffy’s head in his hands, and prayed, “Lord, have mercy on your servant. He’s a good man who served you well. Receive his soul.” Within moments another Catholic chaplain came upon the scene and, also thinking the priest was dying, anointed Father Duffy.


The importance of Last Rites extended even to the enemy. Two and a half years later, during the bitter combat in Normandy following the June 6, 1944 invasion, Father Francis L. Sampson spotted a German soldier lying in a creek a few feet away. He crawled over to do what he could for the enemy soldier, but as Sampson lifted him into his arms, the German groaned a few times and died. Because he saw a higher duty, the Catholic chaplain from Notre Dame, wearing an American uniform, gave absolution to a German soldier dying in a French creek.


Father Joseph D. Barry, C.S.C., recognized that a paralyzing fear for a wounded or dying soldier was to lie alone on the battlefield, left to face his final moments without comfort. During his more than 500 days in European combat with the 45th Infantry Division, Barry exerted every effort to reach a boy prone on the ground and bring him the peace of knowing that someone was there with him. “After 54 years, I can still see Father Barry administering last rites to soldiers in the field while enemy shells exploded all around him,” wrote Albert R. Panebianco, a soldier in the 45th Infantry Division.  


On one occasion Barry talked with a soldier who, due to go into battle in a few hours, feared that “this might be my last night.” The soldier confided that he accepted fear as part of his task, but wondered if he would control his panic and still perform when it counted.  


Barry inquired if there was anything the priest could do for him. Above all, the boy told Barry, he had wanted to be a good soldier—for his men, his family, his country, and his God—and if he died, would Barry please tell his family that he had fulfilled that wish. During combat later that night, German fire cut down the youth. Father Barry rushed to him, cradled the mortally wounded boy in his arms, and with explosions and combat nearly drowning out his words, shouted into the dying boy’s ear, “Remember how we talked last night. Here it is. And I can say you were a good soldier.”


Father Barry consoled more people than the soldiers he tended. He also penned letters to parents and loved ones, often at the behest of a dying soldier who asked the priest to tell his mother or wife that he loved her. Above all, he made certain that they knew their son had died with a priest at his side. “I wrote to so many. You could write what they wanted to know more than anything else: ‘I wonder if there was a priest with my boy,’” Barry explained in an interview. “And that is the only reason I wrote.” 


Off Okinawa, Father John J. Burke, C.S.C., knew the difficulty of fashioning letters to grieving loved ones. After a Japanese torpedo struck the aft portion of his battleship, USS Pennsylvania, on August 12, 1945, killing twenty men, Father Burke mailed twenty letters in which he relayed, with dignity and compassion, information about the loss of a son, brother, or husband. Rather than send an identical form letter to each family, he crafted similar opening and closing paragraphs but inserted personal information unique to each individual in the main portion. “God bless you in your present sorrow,” Father Burke began each letter. “As the Catholic Chaplain aboard the U.S.S. Pennsylvania I want to assure you that your son [here he inserted their first name] received Catholic Burial. The Holy Sacrifice of the Mass was offered several times for the repose of his soul.”


He then added personal information about each sailor. To Mrs. Angeline Ortbals of Ferndale, Michigan, whose son, nineteen-year-old Seaman 1/c Robert J. Ortbals, died, he wrote that Robert “had a heart of gold” and went out of his way to help his shipmates. To the parents of a sailor named Roemer, he wrote, “I feel that a boy so young must very soon, if not already, be enjoying the eternal happiness of heaven which is beyond human description and to which, in God’s mercy we all look forward.” 


Father Burke closed all letters by explaining that their son had recently attended Mass and received Communion, and that as far as he knew, had led a religious life. “It is impossible for me to express anything that will lessen the sorrow which you must endure. You have returned to God your beloved son on your Country’s Altar of Sacrifice. In this supreme sacrifice your son is most like our Divine Savior; and you, I trust, most like his Blessed Mother. God bless you with the humble and Christian spirit of resignation to His Divine Will.”


Though their duties were many and rigorous, the chaplains cherished the knowledge that they had comforted dying young men, and subsequently their families, in those final moments. As Father Duffy related, “I did what I could for each regardless of his faith, and a look of ineffable peace came to the face of many a tortured soul in that last bitter hour on earth.”

https://historynewsnetwork.org/article/172584
From the Founding Fathers to Trump: Who Can Be an American?

“Why don’t they go back and help fix the totally broken and crime-infested places from which they came,” the president of the United States recently tweeted. Trump was referring to four congresswomen of color, three of whom were born in the United States. The fourth is a naturalized American citizen. Trump continued his criticism of “the Squad,” in particular Congresswoman Ilhan Omar, at a campaign rally, and the crowd responded by chanting “send her back.” As David Leonhardt of the New York Times wrote, “It was an ugly, lawless, racist sentiment, and President Trump loved it.” Trump later denied that he supported the crowd’s chant. It’s hard to imagine Trump telling a white man, even someone like Senator Bernie Sanders with whom he disagrees, to go back to where he came from. Correctly charged as a racist statement, the tweet also reflects an age-old question in our history: who can be an American?


There have always been two views of what defines American identity. One is tied to a traditional racial or ethnic view – ethno-nationalism for short; the other is that America is an idea. The Swedish social scientist Gunnar Myrdal dubbed the second the American Creed: Americans were bound together by “the ideals of the essential dignity and equality of all human beings, of inalienable rights to freedom, justice and opportunity.” Myrdal was referring to the natural rights section of Jefferson’s Declaration of Independence when he made this observation.


Today, the United States is religiously, culturally, and ethnically diverse. Yet we see ourselves as Americans in large part due to this creedal notion of America. In 2018, two scholars at Grinnell College “polled Americans on what they most associate with being a real American.” They found that a “vast majority of respondents identified a set of values as more essential than any particular identity.” As the historian Mark Byrnes wrote for HNN back in 2016, “The United States is fundamentally an idea, one whose basic tenets were argued in the Declaration of Independence and given practical application in the Constitution.” These ideas revolve around liberty, equality, self-government, and equal justice for all, and have universal appeal. “Since America was an idea, one could become an American by learning and devoting oneself to” those universal ideas, Byrnes observes.


Despite the strong appeal of the American Creed, 25 percent of those polled by Grinnell College held nativist views similar to those espoused by Donald Trump during his 2016 election campaign for president, and as further reflected in his comments after Charlottesville and in his recent tweet. The view that ethnicity and race made the United States one people predominated in the early American Republic. John Jay, in Federalist No. 2, made the argument that the United States was one nation at the time of the debate over ratification of the Constitution by appealing to ethno-nationalism. He wrote that we are “one united people—a people descended from the same ancestors, speaking the same language, professing the same religion, attached to the same principles of government, very similar in their manners and customs and who, by their joint counsels, arms, and efforts…established their general liberty and independence.” Jay’s thesis largely reflected the traditional view of nationhood. A majority of Americans were of English descent in 1788, and they viewed America as a nation for white people, with Caucasians, and specifically Anglo-Saxons, as the superior race. Some scholars have even defended these ideas: the late Samuel P. Huntington, a Harvard political scientist, argued that this Anglo-Saxon heritage ultimately contributed to the American Creed.


While Jay’s ethno-nationalist perspective obviously cannot describe the United States today, it was inaccurate even in 1790, when we were already a diverse people. African Americans, most of them enslaved, made up 20 percent of the total population in 1790. In Pennsylvania, 33 percent of the people were of German ancestry, and both New York and New Jersey had large German and Dutch populations. There were also conflicts between the English and other groups from the British Isles, including the Irish, the Scottish, and the Welsh.


While ethno-nationalism has deep roots in the United States, so too does Jefferson’s American Creed. Jay himself noted that the United States was “attached to the same principles of government,” a reference to the support for the establishment of a government grounded in the consent of the governed. To Thomas Paine, the country was drawn from “people from different nations, speaking different languages” who were melded together “by the simple operation of constructing governments on the principles of society and the Rights of Man” in which “every difficulty retires and all the parts are brought into cordial union.” Washington saw America as a place that was “open to receive not only the Opulent and respectable Stranger, but the oppressed and persecuted of all Nations and Religions.” Hector St. John de Crevecoeur was another adherent of the view that America was an idea. Originally from France, he emigrated to New York during the colonial period. Crevecoeur marveled at the “mixture of English, Scotch, Irish, French, Dutch, Germans, and Swedes” who were “a strange mixture of blood.” He called those who came to the United States Americans, inhabitants of a place where “individuals of all nations are melted into a new race of men.”


Still, rapid increases in immigration have always threatened this notion of American identity. The founding fathers drew distinctions even among the inhabitants of the original thirteen colonies, who were largely northern European, though today we see few differences among people of European descent. Benjamin Franklin complained about the “Palatine Boors” who swarmed “into our Settlements…herding together” and creating a “Colony of Aliens.” Thomas Jefferson doubted that he shared the same blood as the “Scotch” and worried about immigrants from the wrong parts of Europe coming to the United States, as Francis Fukuyama writes in his recent book Identity. During periods of rapid immigration, nativist movements tend to emerge. 


For people of color, America has rarely been a welcoming place. Many Black Americans were brought here as slaves, and Native Americans were overrun as the insatiable desire for land led to ever greater westward expansion. Our history must always take into account “the shameful fact: historically the United States has been a racist nation,” as the historian Arthur Schlesinger framed it in his book The Disuniting of America. “The curse of racism has been the great failure of the American experiment, the glaring contradiction of American ideals.”  


So much of American history can be seen as an attempt by previously excluded groups to be granted their share of the rights for which the American Creed calls. By the 1850s, abolitionists had been agitating for an end to slavery and the extension of rights to black people. Lincoln eventually became committed to a creedal view of America that extended the rights enshrined in the Declaration of Independence to all people, black and white, native born and immigrant. Martin Luther King Jr., in his “I Have a Dream” speech delivered in front of the Lincoln Memorial in 1963, reminded the nation that it had fallen short of its founding ideals, that the Declaration of Independence was a “promissory note to which every American was to fall heir…yes black men as well as white men.”  


    In the early 21st century, in the aftermath of the election of our first African American president, many of us hoped that our great nation had grown beyond the ethno-nationalist version of America, but the election of Donald Trump proved that this is not the case.  As Americans, our challenge, in the words of Francis Fukuyama, is to avoid “the narrow, ethnically based, intolerant, aggressive, and deeply illiberal form that national identity took” in our past, and which Trump is trying to reignite. Instead, we need “to define an inclusive national identity that fits [our] society’s diverse reality.” The challenge of our times is to continue our commitment to a creedal vision of America. We need to make a reality of the opening words of our Constitution, that “We the People” means all people who share the American creed, regardless of race, ethnicity, or religion, and to constantly strive to unleash, in Lincoln’s words, “the better angels of our nature.” We can start by denouncing Trump’s continuing appeal to racism. 

    https://historynewsnetwork.org/article/172583
    Can Donald Trump Be Compared to Caligula, the Mad Emperor of Rome?



    Even before Donald Trump was elected president of the United States, he was being compared to Caligula, third emperor of Rome. Following Mr. Trump’s election, comparisons flowed thick and fast. But is it fair to compare the unpredictable, ultimately chaotic reign and questionable mental state of Caligula with the administration and personality of the forty-fifth president of the United States? Do the comparisons stand up to scrutiny?


    Well, both men ruled/rule the largest military and economic powers of their age. Caligula emptied the treasury with his extravagances. Trump presides over a ballooning U.S. national debt. Neither man had served in the military he ended up commanding.


    Both had few friends growing up. Both had multiple wives. Both had successful, wealthy fathers. The parents of both Caligula and Trump died before their sons rose to the highest office in the land.


    Both men rid themselves of senior advisers who restrained them. Both were/are sports lovers, building their own sporting facilities in furtherance of their passions. In Caligula’s case it was chariot racing and hippodromes. For Trump, it’s been golf and golf courses.


    Then there are the obvious differences. Caligula was twenty-four years old when he came to power. Trump was seventy on taking the top job. Caligula had absolute power with no specified end date. Unless the system is changed, Trump can expect a maximum of eight years in power. Trump has made numerous outrageous claims. Caligula made just one—that he and his sister Drusilla were gods.


    Caligula was well-read and an accomplished public speaker with a lively if barbed wit. Trump’s wit can be similarly stinging. But he comes across as an inarticulate man, exhibiting an obvious discomfort with formal speeches and producing a nervous sniff when out of his comfort zone.


    It’s instructive to look at the handshakes of both. The handshake as a form of greeting went well back before the foundation of Rome. Originally, it demonstrated that neither party held a sword in the right hand. If a Roman respected the other party, he would “yield the upper hand” in a handshake, offering his hand palm up.


    Caligula yielded the upper hand to few men other than his best friend Herod Agrippa, grandson of King Herod the Great. Donald Trump sometimes yields the upper hand. But is it through respect, or diffidence?


    At his first public meeting as president with Russia’s president Vladimir Putin at the 2017 G20 summit in Germany, Trump offered his hand first, palm up, yielding the upper hand to Putin. He did the same when meeting France’s president Emmanuel Macron that same year. In contrast, Trump offered female leaders Germany’s chancellor Angela Merkel and Britain’s prime minister Theresa May a straight up-and-down handshake.


    Through late 2018, Trump was photographed yielding the upper hand to Japan’s Prime Minister Shinzo Abe and Australian Prime Minister Scott Morrison. In October, he did the same with America’s then ambassador to the United Nations, Nikki Haley, in the Oval Office.


    In terms of policy, Trump and Caligula are poles apart. Some of Caligula’s public infrastructure policies were ambitiously innovative and progressive, if expensive. While Trump has always painted himself as entrepreneurial, his policies have been regressive – a blanket program of retreat. Retreat from the Paris Climate Accord. Retreat from free trade. Retreat from government regulatory control of the economy and the environment. Retreat from military boots on the ground in Syria and Afghanistan.


    As US Secretary of State Mike Pompeo said in January, “When America retreats, chaos often follows.” From the available evidence, it seems Caligula did suffer from a mental illness. Trump’s mental stability is, in the words of an ancient Roman saying, still before the judge. 


    In the end, it wasn’t external foes who caused Caligula’s downfall. Caligula was brought down by a dread among his inner circle of being next as he eliminated many around him. Loyalty and friendship were no guarantee of survival. Similarly, it’s been said that President Trump turns on a dime when it comes to friends. In the case of Caligula’s friends, self-preservation eventually made the most loyal the most lethal.


    When Caligula’s reign was terminated at the point of swords wielded by assassins in his own guard, it had lasted around four years, the equivalent of a U.S. presidential term. Perhaps it will take that long for the proverbial knives to come out among the Republican old guard in Washington today. As was the case in AD 41, it will probably not be a pretty sight.

    https://historynewsnetwork.org/article/172593
    The Beginning of the HIV/AIDS Epidemic – and One Doctor's Search for a Cure


    The following is an excerpt from The Impatient Dr. Lange by Dr. Seema Yasmin. 


    Before the living dead roamed the hospital, the sharp angles of their bones poking through paper-thin bed sheets and diaphanous nightgowns, there was one patient, a harbinger of what would consume the rest of Dr. Joep’s life. Noah walked into the hospital on the last Sunday in November of 1981. It was Joep’s sixth month as a doctor and a quiet day in the emergency room at the Wilhelmina hospital, a red brick building surrounded by gardens in the center of Amsterdam. 

    Noah was forty-two years old, feverish, and pale. His skin dripped a cold sweat. The insides of his cheeks were fuzzy with thick streaks of white fungus. And then there was the diarrhea. Relentless, bloody diarrhea. Noah’s stomach cramped, his sides ached, he couldn’t swallow food. Doctors admitted him to the infectious disease ward, a former army barracks in the ninety-year-old hospital, where they puzzled over the streaky plaques of Candida albicans, a yeasty fungus growing inside his mouth, and the bacteria Shigella breeding inside his gut. 

    Noah swallowed spoonfuls of antifungal medicine. Antibiotics were pushed through his veins until his mouth turned a rosy pink and his bowels quieted. Still, the doctors were baffled by his unlikely conglomeration of symptoms. “The patient needs further evaluation,” they wrote in his medical records. “He has anemia. And if the oral Candida recurs, it would be useful to check his immune function.” They discharged him on Friday, December 11, 1981. 

    Had they read the New England Journal of Medicine on Thursday, December 10, they would have found nineteen Noahs in its pages. 


    Reports were coming in from Los Angeles and New York City of gay men dying from bizarre infections usually seen in transplant patients and rarely in the elderly. Like Noah, their immune systems had been annihilated and they were plagued with a dozen different bugs—ubiquitous microbes that rarely caused sickness in young men. 

    The week that Noah walked out of Wilhelmina hospital, the New England Journal of Medicine dedicated its entire “original research” section to articles on this strange plague. In one report, scientists from Los Angeles described four gay men who were brewing a fungus, Pneumocystis carinii, inside their lungs, and Candida inside their mouths and rectums. Doctors in New York City puzzled over fifteen men with worn-out immune systems and persistent herpes sores around their anus. 

    By the time the New England Journal of Medicine article was printed, only seven of the nineteen men in its pages were still alive.


    ...Four days after Noah walked out of the Wilhelmina hospital, a young man walked into the emergency room at the Onze Lieve Vrouwe Gasthuis, or Our Lady Hospital, a ten-minute bike ride away in east Amsterdam. 

    Dr. Peter Reiss was on call that Tuesday night, his long white coat flapping around his knees as he hurried from bed to bed. Peter and Joep had met in medical school, where they realized they were born a day apart. Joep was one day older than Peter, and he liked to remind his friend of this fact. 

    Peter picked up the new patient’s chart and stroked his trim brown beard as he read the intake sheet. His bright blue eyes scanned the notes just as he had read the New England Journal of Medicine the previous Thursday. 

    He walked over to the cubicle and pushed aside the curtain. There was Daniel, a skinny nineteen-year-old with mousey blonde hair, sitting at the edge of the examination table. His skin was pale with bluish half-moons setting beneath his eyes. He was drenched in sweat and hacking a dry cough. 

    Daniel looked barely pubescent, Peter thought, and he checked the chart for his date of birth. There it was, February 1962. Daniel watched as his young doctor slipped blue gloves over his steady hands. Peter had a broad, reassuring smile and a calm manner. 

    “Tell me, how long have you been feeling sick?” he asked gently. Daniel had been ill since November. First, a prickly red rash dotted his chest and arms, then itchy red scabs appeared on his bottom. The diarrhea started soon after and he was running to the toilet every few hours. He had a fever that kept creeping higher. 

    Peter asked if he could palpate his neck and Daniel nodded. Pressing his fingers under the angles of his jaw, the pads of Peter’s fingers found shotty lymph nodes as large as jelly beans. Peering inside Daniel’s mouth, he saw a white blanket of Candida coating his tongue and tonsils. When he stepped back, Daniel coughed and caught his breath and Peter realized that the air had escaped his own lungs, too. 

    A teenaged boy with enlarged glands, oral thrush, and perianal herpes sounded a lot like the journal articles he had read on Thursday. The case reports flashed through his head: young gay men, history of drug use, American cities. 

    “Are you sexually active?” Peter asked softly. Daniel looked away. “Yes,” he whispered. “I had sex with a man for the first time ten weeks ago. He was much older than me, forty-two years old, I think. I heard he’s very sick.” 


    In the summer of 1981...Amsterdam was a safe haven. Lovers from the provinces could walk down the street and do the unthinkable: hold hands, hug, plant playful kisses on their boyfriend’s faces. Here, there was safety in numbers— freedom in a place where gay men were met with smiles instead of slurs. Those who took vacations in the United States reported back that Amsterdam was a lot like San Francisco with its kink bars and bathhouses, places where gay men could hang out and enjoy anonymous sex. 

    In both cities, the new illness was preying on love and freedom. If colonialism had sparked the spread of HIV from chimpanzees to humans, homophobia was the fuel that helped the epidemic spread from one person to another. The virus was exploiting the need for comfort and community as it swept through bedrooms and bathhouses in the Castro and on Reguliersdwarsstraat. 

    More than twenty bathhouses dotted San Francisco, including the Fairoaks Hotel, a converted apartment building on the corner of Oak and Steiner. Yoga classes ran alongside therapy sessions and group sex. Wooden-framed signs at the front desk advertised poppers and t-shirts at five dollars apiece. 

    Disease detectives from the CDC descended on these refuges to collect samples and stories, a nameless disease with an unknown mode of transmission giving them license to inject themselves into the private lives of strangers. They offered no answers, only long lists of questions: How many men did you have sex with? What kind of sex was it? Can you write down all your lovers’ names? 

    The men offered up memories and saliva samples, fearful of what the government doctors would find inside their specimens. The disease detectives were trying to work the investigation like any other outbreak, following the same steps in their usual logical manner. Except this time, the world was watching and waiting for answers. 

    A handful of diseases have been eliminated from a few pockets of the globe, their numbers dwindling to levels that give humans a sense of dominance over the microbial world. But only one infectious disease has been eradicated: smallpox. 

    The mastermind behind the global erasure of that virus was Dr. Bill Foege, a looming figure who worked tirelessly to eradicate smallpox in the 1970s. In 1977, he was appointed director of the CDC by President Jimmy Carter. 

    But a few years into his tenure, Bill’s scientific acumen was up against political fatuity. Carter lost the election to Ronald Reagan, who was supported by a political-action group called the Moral Majority. “AIDS is the wrath of God upon homosexuals,” said its leader, Reverend Jerry Falwell. Pat Buchanan, Reagan’s communication director, said the illness was “nature’s revenge on gay men.” 

    Reagan said nothing. He uttered the word “AIDS” publicly for the first time in May of 1987 as he neared the end of his presidency. By that time, fifty thousand people were infected around the world and more than twenty thousand Americans had died. 

    To make matters worse, the Reagan administration demanded cuts in public health spending. Bill had to tighten his purse strings just as the biggest epidemic to hit humanity was taking off. 

    Even within the CDC, some leaders were doling out politically motivated advice. “Look pretty and do as little as possible,” said Dr. John Bennett, assistant director of the division of the Center for Infectious Diseases. He was speaking to Dr. Don Francis, a young and outspoken epidemiologist who had returned from investigating the world’s first outbreak of Ebola in Zaire. 

    Bill possessed a stronger will. Armed with political savvy and epidemiologic expertise, he instructed Dr. James Curran to assemble a team. James was head of the research branch of the CDC’s Venereal Disease Control Division. By assigning him a new role, Bill was working the system to give James enough latitude to conduct what would be the most important investigation of their lives. 

    James gathered thirty Epidemic Intelligence Service officers and CDC staff to form a task force. Joined by Dr. Wayne Shandera, the Epidemic Intelligence Service officer assigned to Los Angeles County, the Task Force on Kaposi’s Sarcoma and Opportunistic Infections got to work. 

    The first item on the to-do list in any outbreak investigation—even one as devastating as AIDS—is to come up with a case definition, a short list of criteria that will help other doctors look for cases. The disease detectives huddled around a table in their Atlanta headquarters and listed the major scourges of the new syndrome. 

    A case was defined as a person who had Kaposi’s sarcoma or a proven opportunistic infection such as Pneumocystis carinii pneumonia. They had to be aged younger than sixty, and they couldn’t have any underlying illness such as cancer or be on any medications that would suppress their immune system. 

    They shared the case definition with doctors around the country and by the end of 1981, as Noah and Daniel were walking into hospitals in Amsterdam, the CDC had a list of one hundred and fifty-eight American men and one woman who fit the description. Half of them had Kaposi’s sarcoma, 40 percent had Pneumocystis carinii pneumonia, and one in ten had both. Looking back, the earliest case they could find was a man who fell sick in 1978. 

    They looked for connections between the cases and found themselves writing names on a blackboard and drawing white lines between the people who had sex with one another. A spider’s web of a vast sexual network emerged. Thirteen of the nineteen men who were sick in southern California had had sex with the same man. 

    Task force Drs. David Auerbach and William Darrow cast their net wider, looking at ninety gay men across a dozen cities who fit the case definition. Forty of those men had sex with the same man, who was also sick. Still, some were vehemently opposed to the idea that the syndrome was sexually transmitted. If it was spread through sex, why hadn’t this happened before? But then came the summer of 1982 and reports of babies and hemophiliacs with Pneumocystis carinii. The common link was blood transfusions. This added a new mode of transmission. Like hepatitis B, the new illness was spread through sex and blood. 

    The CDC announced that four groups of people were most vulnerable to the new illness: hemophiliacs, homosexuals, heroin users, and Haitians, and the disease earned a new name: 4H. With that public health announcement came public outrage and vitriol against those groups, especially gay men and Haitians. Houses were burned, children expelled from school, families forced to move towns because they were sick. Politicians sat complicit in their silence. 

    It was unparalleled, this confluence of public health, politics, clinical medicine, and public anxiety. The unknown disease was spreading faster than imagined. Humanity had never seen anything like it. 


    https://historynewsnetwork.org/article/172592
    Climate Change and the Last Great Awakening


    Historians from Joseph Tracy in 1842 up to the present day have seen the religious revival movement of the mid-eighteenth century as the first mass movement in American history.  With its roots in the works of Congregationalist minister Jonathan Edwards and the John Wesley-influenced Anglican George Whitefield, it renewed and expanded the Puritan notion of the “second birth” in achieving salvation in a Christian-dominated milieu.  Tracy dubbed this movement the “Great Awakening” and it began what I like to think of as “conscience in American history.”  This movement, as it ebbed and flowed over time, influenced the American Revolution, abolitionism, women’s rights, the labor movement, social welfare, and environmental concerns right up to the present day.  


    Another American mass movement was the social revolution of the 1950s into the 1970s that resulted in more equitable civil rights, the end of the Vietnam War, the rejuvenation of the women’s rights movement, and the environmental movement. Today, this mass movement for change is in need of resurgence. We need another Great Awakening to convince the public and political leaders to accept the devastating reality of man-made climate change and embrace efforts to combat it. Even if movements like Extinction Rebellion take off and become massive, with millions of people in the streets, the attempt to curb the catastrophic effects of Climate Disruption may well be the Last Great Awakening.


    Fifty years ago, historian Richard Bushman wrote that twentieth-century inhabitants, if they had ever even heard of it, misunderstood the nature of the eighteenth-century Great Awakening.  This was a period of religious “revival” that ran through roughly the middle third of the eighteenth century.  The original sixteenth- and seventeenth-century Puritans – the ones who had made their way to Massachusetts aboard the Mayflower – believed themselves to be creating a “City on the Hill” to welcome the imminent return of Christ the Messiah, ushering in a thousand-year reign known as the Millennium.  These colonists were Calvinists, meaning that they embraced not only millennialism but a doctrine known as “pre-destination”: it had already been determined who was going to Heaven and who was going to Hell in the eternal realm of the Father, Son, and Holy Spirit.  But how could one know whether one was pre-destined for Heaven or not?  Calvinists responded that one could know by having a “second birth” of the spirit – a profound psychological experience that would leave a lasting mark on one’s psyche, making it quite clear to anyone versed in the doctrine that one had been “Chosen.”  The relief from having the experience, or the dread of not having it, could be, and usually was, profound.  Today, we hear of people being “Born Again” in charismatic Christian churches (and elsewhere), but many consider this to be either fake or the ramblings of the mildly insane.  So, when one mentions this business of a “second birth” now, many people simply ignore it and carry on with their lives.  They don’t understand the implications for those having the experience, especially during the eighteenth century’s Great Awakening.  This is what Bushman was saying.  

    Great Awakener Jonathan Edwards’s famous “Sinners in the Hands of an Angry God” is often put forward as an example of the kind of jeremiad that would induce the desired “second birth” of the reprobate (one who had not had the “second birth”). Subtitled “Sermon on the Danger of the Unconverted” and delivered at Enfield, CT in July of 1741, it needs to be remembered as a spoken sermon delivered "enthusiastically."  (Think “hellfire and brimstone”.)  The function of the sermon was to induce a sublime terror that would propel the listener into a cataclysmic psychological transformation.  


    This is not an unusual psychological journey for humans on Planet Earth. Visionary experiences are common in the mystical aspects of all religions.  The context and agency can vary.  Bushman implies this in his essay when he compares the eighteenth-century Great Awakening to the ‘60s Civil Rights and anti-war movements.  The Civil Rights movement – delayed justice for African Americans – and opposition to the atrocious crime known as the Vietnam War were “awakenings” that had both a political and a cultural side.  The politics were those of the New Left – Students for a Democratic Society, the Black Panthers, the American Indian Movement, and others.  The cultural side was a bohemian spirit inherited from the "Beats" of the 1950s that became truly massive with the "hippie movement" and featured, among other things, the shared experiences of rock music and psychotropic substances like LSD, psilocybin mushrooms, peyote cactus buttons, etc.  While this directly involved a fairly small percentage of the population overall, historians and other students of the period, including psychologists and other caregivers, are beginning to understand its impact. And, like the Great Awakening of the eighteenth century, there was a ripple effect that spread throughout the culture at large.  


    Like the first Great Awakening, the fragmented twentieth-century movements mentioned above made an impact that is strongly felt today, although many people born since then don’t realize it.  The idea of self-realization – becoming the person you were meant to be (or, in Christian terms, the person God intended you to be), getting society back on a track of justice and equity and freedom, having a sense of mission for bringing positive change to the world – these are the results of profound psychological experiences not unlike those of the Great Awakening. An entire generation awoke both to what we were doing to the planet and to the fact that we were essentially poisoning ourselves by not paying attention to what we were putting in our mouths.  In the eighteenth-century Awakening, Bushman estimated that up to twenty or thirty percent of a town could be converted to the “New Lights” (those who had experienced the “second birth”) in one pass by the iconic Anglican preacher George Whitefield.  The counterculture and its politics, while suppressed, have maintained significant numbers of adherents.  It is possible to see both of these “Awakenings” as “seasons of revival, outpourings of the Holy Spirit, and converted sinners experiencing God’s love personally."  In the eighteenth century, those who experienced the “second birth” often “saw the light” of both religious and political freedom, compassion for their fellow humans, and a strong sense of staying attuned to their inner life.  In the twentieth century, many in the Boomer Generation experienced similar feelings of love and compassion not only for their fellow humans, but for all life.  


    Now, in the twenty-first century, as the Boomer Generation has begun passing through the Sun Door, we have the reality of Climate Disruption and cataclysmic change staring us in the face.  It was the counterculture of the Sixties and Seventies – readers of Aldo Leopold and Rachel Carson (et al.) – who first awoke to this danger on a mass scale.  If one takes the language of the first Great Awakening metaphorically, Bushman’s admonition hits close to home.  The “slippery slope” that Edwards, Whitefield, and a small army of itinerant revivalists used to induce the transformative “second birth” describes the reality we now face.  We are sliding.  Edwards’s “God” is our “Climate” and its reality cannot be denied.  We are too late to prevent many of the cataclysmic disruptions set in motion in the name of profit and convenience.  But we can still mitigate some of it.  We still have some agency, but it is slipping away daily. This is the Last Great Awakening. It’s not just humanity that is sliding; it’s the entire planetary ecological system as well as future generations.  This is the REAL DEAL: the Sixth Extinction is underway, and everyone is responsible now. 


    Richard Bushman’s observation that modern observers do not understand those eighteenth-century individuals who underwent a “second birth” has gained profound significance.  Ignoring what we have been doing to our home planet has created consequences that we are only beginning to grasp and that are no longer abstract.  The itinerant preachers of the Great Awakening bent on inducing a “second birth” were, in their way, absolutely right about the need for profound personal change and diligent attention to one’s inner life.  The Last Great Awakening, if it is to be massive and successful, must involve profoundly altering our personal behaviors and inner lives while seriously committing to living in a sustainable way.  Indeed, the sublime terror needed to propel massive action does not need the abstract references to HELL of the First Great Awakening.  The consequences of the “reprobate’s” failure to change are not abstract at all; they are happening now.


    ©Douglas Harvey 2019

    https://historynewsnetwork.org/article/172585
    Exploring the Curious Sources of Medieval Law: An Interview with Acclaimed Historian Robin Chapman Stacy


    As my research has shown, lawbooks of this period could communicate ideas and opinions as well as information; they could convey outrage and resentment as well as the stability of custom.

    Our challenge as scholars is to read in ways that allow us to fully get whatever jokes these authors might be telling.

    Robin Chapman Stacey, Law and the Imagination in Medieval Wales


    I first met Professor Robin Chapman Stacey, an acclaimed medieval historian, at the University of Washington in Seattle in May 2019. She had recently published a book about Wales of the thirteenth century, Law and the Imagination in Medieval Wales (University of Pennsylvania Press, 2019). 


    I told her I thought that law and imagination were contradictory. She briefly explained that she had found that literature, ritual, myth, and other imaginative enterprises had influenced and shaped the law she examined from this far-off time in Wales. I was hooked when she mentioned the bawdy humor, flights of whimsy, and even burlesque in this body of law. For her, the law of the period was actually a complex political fiction. 


    I was intrigued by her remarkable book. It was apparent that Professor Stacey had spent years on this ambitious project that required painstaking translating from medieval Welsh and Latin, as well as rigorous exploration of the work of other scholars—and a perceptive and knowing sense of humor. 


    In Law and the Imagination in Medieval Wales, Professor Stacey examines the literary and political aspects of the Welsh lawbooks, and argues that the laws are best read not as objective records of native custom but, rather, as important and often humorous commentaries on the politics of thirteenth-century Wales, a nation facing challenges from within and without.


    Professor Stacey finds political commentary and even bizarre comedy in the Welsh lawbooks, arguing that they addressed threats to native traditions posed by the encroaching English while attempting to ensure stability in domestic concerns such as marriage, divorce, and inheritance, as well as to deal with corruption, abuse, and violence. Welsh law of the period also reflects a special concern for preserving male authority, evidence of discomfort with the participation of women in economic and political affairs. Professor Stacey peppers her examination of the old lawbooks with examples of medieval Welsh irreverence, bawdiness, wit, and sexual humor as she breathes life into the dry bones of this law of yore.


    Robin Chapman Stacey is a Professor of History at the University of Washington. She teaches medieval history, and her academic focus is Ireland, Wales, and England from the Iron Age through the thirteenth century. In addition to her appointment in History, she is an Adjunct Professor in the Gender, Women, and Sexuality Studies Department. Her work with students has been honored with the UW Distinguished Teaching Award. She has graduate degrees from Yale and Oxford, and has done intensive academic study in medieval Welsh and Irish languages. 


    Professor Stacey’s other books include The Road to Judgment: From Custom to Court in Medieval Ireland and Wales (1994), on the Irish and Welsh institution of personal suretyship in the high middle ages; and Dark Speech: The Performance of Law in Early Ireland (2007), on the role played by speech and performance in ensuring social order in early medieval Ireland. Her books have received prizes from the Medieval Academy of America, the American Conference for Irish Studies, and the Board of Celtic Studies of the University of Wales. She has also written numerous articles on subjects pertaining to medieval Ireland, Wales, and England, including divorce, law and memory, riddles, and legal education. 


    Professor Stacey’s research has been supported by grants from the Guggenheim Foundation, the American Council of Learned Societies, All Souls College Oxford, and the Dublin Institute for Advanced Studies. She is a Past President of the Celtic Studies Association of North America, and has served on the Board of Directors of the American Society for Legal History, and was a past Councilor and Executive Board member of the Medieval Academy of America. 


    Professor Stacey graciously responded by email to my barrage of questions about her career and her new book. 


    Robin Lindley: Thank you Professor Stacey for agreeing to respond to some questions on your work as a historian and your new book, Law and the Imagination in Medieval Wales. Before getting to your book, could you mention how and why you decided to study history?


    Professor Robin Chapman Stacey: I have always been fascinated by the past, but until I went to college, I was planning to be an archaeologist.  In fact, I taught myself Egyptian hieroglyphics in middle school—only to realize once the deed was done that I wasn’t going to get very far knowing the script if I didn’t also know the language!   

    The switch from archaeology to history, however, was the direct result of my taking a mind-numbingly dull class in anthropology at the University of Colorado followed a year later by a mind-blowingly stupendous class in history on the French Revolution at Colorado College.  The teacher I had for French Revolution, Professor Susan Ashley, was the best undergraduate teacher I ever had:  one of those instructors who could make everything you did for class matter so intensely that you stopped paying attention to what was actually going on in your day-to-day life.  After her class, the die was cast--I switched to history and never looked back.  

    Ironically, of course, I now make use of both archaeology and anthropology in my historical work, whereas the closest I get to the French Revolution is the occasional novel!


    Robin Lindley: And how did you decide to specialize in medieval history with an emphasis on Wales, Ireland and England? 


    Professor Robin Chapman Stacey:  I had a class on medieval English history at Colorado College in which we were asked to write a short paper on the Easter controversy as depicted in Bede.  I was so intrigued by the manner in which Anglo-Saxon, British, and Irish met and mingled in the multi-cultural world of the north that I decided later to write my senior Honors thesis on a related topic. Naively, I thought this would be a great thing to study in graduate school—I didn’t even know enough about what I was doing when applying for schools to realize that most graduate programs wouldn’t have someone teaching that period of history.  

    Happily, while there were no Celtic specialists in sight at Yale, where I went (Harvard is the premier Celtic program in the country), I did have the good fortune to work with two very supportive (though decisively non-Celtic) medievalists, Professors John Boswell and Jaroslav Pelikan, who did everything they could to promote my interest in what seemed to them the most obscure of subjects.  Then, in my second year, Professor Warren Cowgill, an eminent Indo-European linguist, decided to offer a course in Old Irish, which I jumped at the chance to take. Honestly, I was terrible at it: I had never even had a course in linguistics before, and Old Irish is an incredibly complex language.  However, I was also stubborn, wouldn’t quit, and was fortunate enough to win a Fulbright to study Irish at Oxford, where I met my mentor and now good friend, Professor Thomas Charles-Edwards.  He was both a world expert in Irish and Welsh law and the nicest and most patient man in the universe; without his help, I might never have finished my degree.   


    Robin Lindley: You have a gift for writing that breathes life into the dry bones of the law. How did you also come to focus on law in your work as a historian? Did you ever consider law school and working as a lawyer?


    Professor Robin Chapman Stacey:  Thank you! I enjoy writing, at least when I don’t hate it, if you know what I mean.  

    In terms of law:  well, I am intellectually interested in legal issues and always have been.  However, the fact that law emerged as my professional focus was the result of an entirely random event:  when I went in to consult Professor Cowgill about a paper topic for Old Irish, he pulled a legal text down off the shelf and told me to work on it.  That text, Berrad Airechta, a tract on personal suretyship, ended up being the basis for both my Oxford and Yale theses.  It was probably also my experience with that text that caused Professor Charles-Edwards to agree to work with me later at Oxford.  Had he chosen a literary rather than a legal text, my career might have been altogether different.


    Robin Lindley: Your background is fascinating. You’ve studied Welsh and other languages and translated from documents that are hundreds of years old. I read that you had a special tutor at Oxford in Welsh. How would you describe your interest in learning often obscure languages and your facility with languages other than modern English?


    Professor Robin Chapman Stacey:   Well, I had done some French, Latin, and German in high school, and until I went to graduate school and studied Old Irish, I had thought of myself as being fairly decent at learning languages.  In fact, my graduate degree was actually in Medieval Studies rather than in History, because I had initially thought when applying that I would like to work across the disciplines of history, language, and literature.  A year of Old Irish cured me of any illusions I might have had about my facility with languages (!), though it also persuaded me that I needed them if I was to do serious historical work, and that is why I applied for the Fulbright.  

    I hadn’t intended to tackle Welsh until I went to Oxford and found myself on the receiving end of this rather intimidating question: “You do know French, Latin, German, Anglo-Saxon, Irish, and Welsh, don’t you?”  (For the record, my tutor now claims this account to be entirely apocryphal, but that’s my story and I’m sticking to it!)  Two weeks later, I had started Welsh and found myself completely wasting the time of the highly eminent Welsh professor D. Ellis Evans, who sat there patiently as I went painfully, word by word, through one of the easiest texts in the medieval language.  Then came a summer course in Modern Welsh, and now I work more with medieval Welsh and Irish than I do with Latin.


    Robin Lindley: It seems you have a long-term interest in the role of imagination in the politics and culture of the societies you study. Did your new book on Wales somehow grow from your previous research on performance and law in early Ireland in your book Dark Speech?


    Professor Robin Chapman Stacey: That is a fascinating question, and I’m not sure I know the answer to it.  My M.Litt. thesis at Oxford was fairly straight-forward legal history.  However, as I worked to turn that into a book, I became more and more interested in questions like the social context of the law, the nature of its authority, and its relationship to other genres and ways of thinking in the period.   

    In my subsequent work, I found myself returning more and more to the connections between law and language, on the one hand, and law and literature on the other.  Dark Speech is focused very much on the former topic, the main issue being the ways in which the use of heightened language in performance lent authority to particular legal rituals or specialists.  Law and the Imagination, by contrast, focuses on the latter, exploring the ways in which literary themes and tropes were used by the medieval Welsh jurists to comment on contemporary political events. I suppose one could say that imagination ties both of these projects together, and in that sense one may have led to the other.


    Robin Lindley: How do you see the historical problem you address in your new book on medieval Wales?


    Professor Robin Chapman Stacey:  The Welsh lawbooks are the most extensive body of prose literature extant from medieval Wales.  They are preserved in approximately 40 manuscripts, both in Welsh and in Latin, and were clearly extremely important to the people who wrote and made use of them.  One can read them as law is traditionally read, as more or less straight-forward (if stylized) accounts of legal practice.  Reading them in this way gives us a sense of how Welsh law worked and developed over time.  However, these lawbooks were written in the twelfth and thirteenth centuries, the last two centuries of Welsh independence and a time of rapid internal change and heightened external conflict with England.  It is my belief that these texts reflect the period in which they were composed in very direct ways, and that reading them in the way we read literature, with close attention to theme and symbolic detail, reveals them to be a sophisticated, opinionated, occasionally even humorous commentary on contemporary political events. 


    Robin Lindley: When I first saw your book, I thought that the concepts of law and imagination were absolutely contradictory. How do you respond to that sense in light of your findings on the law in medieval Wales?


    Professor Robin Chapman Stacey:  We are so accustomed to seeing historical legal texts in the light of our own experience—no one nowadays would likely turn to a statute book for fun!—that we often make presumptions about lawbooks written in the past, and those presumptions govern how we read them.  

    What I am arguing in this book is that at least with respect to this particular body of sources, we need to let go of our expectations and read these texts within their own historical context.  The lawbooks of medieval Wales were not legislative in nature, and their authors had extensive family connections with poets, storytellers, and other more overtly literary artists.  There is a rich body of political poetry extant from this period, as well as a number of fabulous tales and a considerable corpus of erotic verse.  The lawbook authors are aware of all of these and, I argue, deploy many of the same tropes and symbols in their own work.   If we abandon the idea that law is always factual and instead approach these lawbooks in the way we might a tale, I think we will see that these texts also are meant to be read on more than one level.


    Robin Lindley: Your work is deeply researched and thoughtful. What are some of the challenges and limitations regarding sources in the kind of work you do?


    Professor Robin Chapman Stacey:  The biggest general challenge was realizing what I was arguing and deciding how far I was willing to take it, on which more below.  The biggest source challenge was that every tractate within the lawbooks is different from the others in its nature and development; additionally, many lawbook manuscript versions also differ significantly from one another.  This made it challenging to draw general conclusions while also respecting the fact that what is true of one tractate or manuscript might not be true of another.


    Robin Lindley: How did your book evolve from your initial conception to the final product?


    Professor Robin Chapman Stacey:  Honestly, this was one of those books that crept up on me article by article until I realized that everything I was writing about Welsh law was tending to go in the same direction and ought actually to be a book. 

    The very first article I wrote on the subject was an assignment given to me by the editors of a volume on the court tractate to write about the king, queen, and royal heir.  I was asked—a daunting invitation for a relatively young scholar!--to write on these figures because all the other contributors (infinitely more experienced and venerable than myself) had already written on the subject.  At first, I felt stymied and intimidated, but then I began to study these sections carefully and to realize that what I had expected to see was not at all what I was finding.  I was particularly taken by the manner in which the passages I had been assigned seemed intentionally to be creating an image of space within the court that was more politicized than real.  When I later found myself writing about divorce, I was struck again by the “unreality” of what was being described, and by the political subtext that seemed to me to be emerging from what purported to be a description of procedure.  Other chapters followed, and the book progressed from there.


    Robin Lindley: You write about Wales in the thirteenth century. What was Wales like then? Was it a loose group of principalities bound by a common language? 


    Professor Robin Chapman Stacey: The unity of Wales in the twelfth and thirteenth centuries was vested primarily in language, culture, law, and a shared mythology. Various Welsh rulers, usually but not always from the northern province of Gwynedd, had made attempts over the centuries to exert political control over the other regions of Wales. However, these attempts were sometimes successful and sometimes not, and they never lasted, not least because Welsh lords from other regions often resisted Gwynedd’s attempts to extend its rule. Additionally, the pressures posed by the presence of Marcher lords and by the aggression of the English crown were a constant obstacle to native unification. 

    Native law was an important factor in defining Welsh identity in the period, but it was already the case that individual Welshmen—and even some Welsh rulers—were beginning to adopt Common Law ideas and procedures even before the final conquest of Wales by Edward I in 1282-1283. The Statute of Rhuddlan, enacted in 1284, established the basis for English rule of the Principality up till 1536, permitting some aspects of native law to continue, but introducing Common Law procedures in other areas.


    Robin Lindley: You also write about a period when Wales was eventually conquered by England. How did Welsh law respond to English power and Welsh nationalist concerns?


    Professor Robin Chapman Stacey: My argument is that the Welsh lawbooks speak directly to the need for political unity in face of external aggression and to the importance of native law as a marker of Welsh identity.  They tackle some of the criticisms commonly made against the Welsh in the period, such as sexual immorality and an inordinate penchant for violence.  They also voice, albeit obliquely, criticisms of native rulers for their abandonment of native customs and increasingly intrusive forms of governance.


    Robin Lindley: You see the law as a form of political literature where even the ridiculous can have meaning. Could you give a couple of examples of where you found this quality in law?


    Professor Robin Chapman Stacey:  My favorite examples are the “burlesques”—ridiculous and even obscene rituals described by the jurists as taking place in the court or localities.  For example, one of the things the authors of the lawbooks were concerned about was the degree to which native Welsh rulers were enlarging their jurisdiction by intruding on the traditional prerogatives of the uchelwyr, “noble” or “free” classes from which the lawbook authors came.  The manner in which they described the royal officers taking part in this process was intended (I think) to convey their contempt for them.  The royal sergeant or tax collector, for example, is depicted in the lawbooks as wearing ridiculously short clothing with boots better suited to a child than to a full-grown man; additionally, he is wrongly dressed for the season, wears his underwear on the outside, and goes about trying to do his dirty business holding a short (and flaccid) spear in front of him in a gesture that certainly looks sexual to me in the manuscript illustration we have of it.  Similarly, the judge is said to sleep at night on the pillow the king sits on during the day (imagine the odors here, not to mention the posterior as a source of royal justice); and the porter (who guards the entrances and exits to the court) is imagined as receiving all the bovine rectums from the court as his due.  If this isn’t satire, then I don’t know what is!  


    Also held up for ridicule are women and men who violate native Welsh sexual strictures by having sex without marrying.  Women would normally get property as part of an authorized marital arrangement.  What happens in one passage is that the greased tail of a steer is thrust through the keyhole of a house.  The woman is on the inside; the man is on the outside with two of his friends urging on the steer with whips.  If she can hold on to the tail, she can keep the animal; if she can’t, she gets nothing.  If one imagines the see-sawing motion back and forth, the grease coming off on her hand, and the two “helpers” spurring on the excited animal…well, the sexual imagery is pretty hard to avoid.  We don’t know whether this was an actual ritual in the community; however, it functions in the lawbook as a response to the perceived immorality of the Welsh by showing that they do indeed punish bad behavior and uphold marriage.


    Robin Lindley: The Welsh distinguished court and country in law. Did this mean that there was one law for royalty and elites at court and another law for workers and farmers in the country? Why was this distinction significant?   


    Professor Robin Chapman Stacey:  The basic pattern of the lawbook divides Wales into court and country.  The court is where the prince and his entourage and servants live, and the country is all the rest.  The intent is not so much to suggest that each live by different laws, as to highlight the differing spheres of jurisdiction.  The tractate on the court really just describes the different royal officers and their duties and prerogatives.  The tractates on the country are focused on different legal subjects and procedures affecting Welshmen as a whole:  suretyship, land law, marriage, animal law, and the like. Part of the idea here is, I think, to create a sense of spatial politics moving outwards from the royal court to encompass the settled and wild parts of the gwlad, “kingdom.”  Opposed to this (at least in one redaction) is the gorwlad, the lands outside of gwlad, which are portrayed as regions of danger and predation.  This is part of how the jurists highlight the need for political unity in the face of external threat.  


    Robin Lindley: What are some ways the Welsh law reflected past stories, lore, myth, and such?  


    Professor Robin Chapman Stacey: There are certain myths reflected in the lawbooks, again with political intent.  Perhaps the most obvious of them is the harkening back to a (mythical and politically motivated rather than historical) time before the coming of the English when Britain was ruled by a Welsh-speaking king residing in London.  But another, I think, is the image of Wales itself created in these lawbooks:  as timeless, set in the reign of no particular king and thus of them all, and enduringly native.  It says something important about these sources, I think, that some of the most obvious facts of political life in thirteenth-century Wales—such as the March and Marcher lords--aren’t even mentioned.


    Robin Lindley: It seems law always has a role in assuring stability. Under Welsh law, wasn’t there a special interest in limiting any social mobility and keeping people in their place?   


    Professor Robin Chapman Stacey: Yes, I think you would find that in any body of medieval law to some extent.  But because these texts are written by uchelwyr, this free or noble class, the focus is mainly on protecting themselves from unwarranted and untraditional royal demands, and on preserving their own distinctiveness as a class from economic pressures fragmenting their wealth and pushing them downwards.


    Robin Lindley: You stress “bodies” in the law. How did the law see gender and class in terms of the bodies pertaining to law?


    Professor Robin Chapman Stacey:  I think bodies and, particularly, the gendering of bodies, play an important role in the political commentary implicit in these tracts. One example would be the depiction of the sergeant mentioned earlier, where he is portrayed symbolically not only as ridiculous, but as simultaneously a child and a woman in the items he is given by the king.  In fact, I think the body itself is an important metaphor in these texts, with the prince being depicted as the only person fully in possession of an entire body, while his courtiers are represented by the body parts of the animals they receive as rewards for their service.  Each officer receives something appropriate for his office—the judge gets the tongues, the watchman gets the eyes, the doorkeeper the skins, etc.  It may be that we are supposed to take these rewards seriously; after all, animal body parts had real value in the middle ages. On the other hand, their symbolism is evident, and some of them (in my view) just have to have been invented by the jurists.  What was a porter to do with a pile of rectums, for example?  And what are we to make of the image of a watchman surrounded by a mound of eyes?


    Robin Lindley: You mention that slaves were considered “no-bodies” under Welsh law. What did this mean in terms of the treatment of slaves?


    Professor Robin Chapman Stacey: Given the way in which other ranks of person are represented in the lawbooks by body parts symbolic of their status and duties, it seems significant to me that slaves receive nothing themselves, and that their owners receive only the tools of their trade in compensation if the slave is killed.  They are, literally, “no-bodies” as far as the lawbooks are concerned.


    Robin Lindley: The Welsh sense of humor was often ribald and sexual, and that came through in some laws. How did this humor influence legal relationships such as marriage?


    Professor Robin Chapman Stacey:  I don’t think we have any real way of knowing how humor played out in marriage.  However, I can see a mordant sort of humor in the divorce procedure described in these texts, where the divorcing parties divide up their goods one by one. Previous historical interpretations portrayed this process as practical in nature, as setting each party up to move on into a new life and new relationships.  However, when one looks at the items being separated, many of them are things that cannot function apart from one another—one party gets all the hens and only one cat to protect them, while the other gets all the cats and no hens, for example.  One party gets the sieve with large holes and the other the sieve with fine holes.  Moreover, several of these symbols are sexual and, I think, intended to be shaming and perhaps ruefully funny.  Not only are there lots of phallic symbols (validated as such by contemporary Welsh poetry), the top and bottom stones of the quern by which seed is ground go to different parties, and it is surely not coincidental that this provision comes right after one about the separation of the bedclothes.  What we have here, I think, is a symbolic “sermon” on the infertility and waste that result from divorce.


    Robin Lindley: Many readers may be surprised that Welsh law permitted divorce and canon law did not prevail there in the thirteenth century. How would you describe the reality of divorce in Wales then? Although permitted in law, it seems divorce was regarded as a great human failure.   


    Professor Robin Chapman Stacey:  Yes, I think that is exactly the point.  Welsh law did permit divorce, although the Welsh were severely criticized by the Church for this because divorce was not allowed under canon law.  My suspicion about the divorce “burlesque” is that it is a sign that attitudes to divorce were changing on this front even in native Wales, even though the practice had not yet been abolished.


    Robin Lindley: Female virginity was prized in the law. Why was this status so significant?


    Professor Robin Chapman Stacey:  I’m not sure how to explain why some societies value female virginity and some seem to care somewhat less about it.  For the Welsh, virginity was tied up in lordship, which might explain something of its importance.  Lords received monetary payments for the first sexual experiences of women under their jurisdiction; these payments are said to parallel in very direct ways the payments made to lords when men under their jurisdiction die.  My guess is that virginity was perceived as an asset for the lord:  both a duty that their female dependents owed them and a source of revenue and potential political connections.


    Robin Lindley: Outsiders saw the Welsh as violent and bloodthirsty. What was the reality you found when you compared them with other societies of the period?


    Professor Robin Chapman Stacey:  Most studies done of Welsh violence have found that the Welsh were no more or less bloodthirsty than others at this period. Indeed, at least one study has even argued that the Welsh learned some of their most gruesome practices from England! However, it was a common criticism of the Welsh, and I believe that many lawbook authors deliberately downplay violence in their work as a way of rebutting this accusation. 


    Robin Lindley: In the Welsh law of homicide you discuss, murderers are less of a concern than those who abet or assist in murders. Why was that the focus of the law?   


    Professor Robin Chapman Stacey:  Again, a fascinating question to which I wish I knew the answer!  Part of it likely reflects the fact that payments for abetment go to lords, and these sources are written in part to support good lordship.  Part of it also, I think, is the desire on the part of the lawbook authors to downplay the specter of actual violence.  They couldn’t just leave the abetments section out of their work, as all indications are that it was an old and traditional part of the lawbook pattern.  However, only certain manuscripts go on to describe actual violence:  most don’t, and I think that is deliberate.


    Robin Lindley: Is there anything you’d like to add for readers about your work or your groundbreaking new book on law and imagination in Medieval Wales?


    Professor Robin Chapman Stacey:  Merely my thanks to you for taking the time to ask about it!  Few people realize just how interesting and important these sources are.  The written Welsh legal tradition is extensive and, as I hope I have suggested above, wonderfully absorbing in its imagery and attention to symbolic detail.  And yet few medievalists are acquainted with these texts.  One of my hopes is that by tackling questions of interest to historians of other medieval regions, that might change.  


    Robin Lindley: Thank you for your generosity and thoughtful comments Professor Stacey. And congratulations on your fascinating new book.  


    Professor Robin Chapman Stacey: You’re welcome and thank you!

    The Power of Microhistory: An Interview with Bancroft Prize Winner Douglas Winiarski


    Dr. Douglas Winiarski recently won a grant from the National Endowment for the Humanities. From the press release:

    "University of Richmond religious studies and American Studies professor Douglas Winiarski will begin two central chapters of his latest book project, Shakers & the Shawnee Prophet: A Microhistory of Religious Violence on the Early American Frontier, 1805–1815, this summer.

    His research is being funded by a $6,000 Summer Stipend from the National Endowment for the Humanities.

    “Shakers and the Shawnee Prophet examines the local sources of religious violence on the early American frontier during the years leading up to the War of 1812,” said Winiarski. “I anticipate that the book will resonate with readers attuned to the politics of religious difference and the troubling connections between religion and violence in our own times.” 

    Winiarski is the author of Darkness Falls on the Land of Light: Experiencing Religious Awakenings in Eighteenth-Century New England, which was awarded several honors in 2018, including the prestigious Bancroft Prize in American History and Diplomacy and the Peter J. Gomes Memorial Book Prize."


    Dr. Winiarski kindly gave HNN an interview and explained his exciting research.


    Your previous work focused on the religious history of New England. What prompted you to shift to studying the American frontier for this project? 


    This project actually predates my work on popular religion in eighteenth-century New England. I first started thinking about the pacifist Shakers’ unusual relationship with the militant followers of Tenskwatawa, the notorious Prophet and brother of the famed Shawnee war captain Tecumseh, more than two decades ago. As a graduate student at Indiana University, I was fortunate to study with two outstanding mentors: Stephen Stein, a leading historian of Shakerism and new religious movements in America, and David Edmunds, who had written the definitive biography of Tenskwatawa. They were colleagues and knew each other, of course, but had never discussed the intriguing connections between their scholarship. Steve’s definitive Shaker Experience in America makes only a brief reference to the Shakers’ 1807 mission to the Indians of the Old Northwest; David’s Shawnee Prophet relies on the Shaker missionaries’ journals but doesn’t explain why those sources existed in the first place. As I read their books side by side, I realized that both scholars were working around the edges of a fascinating, untold story. 


    I started poking around with some of the primary sources from the period and was stunned by what I found. The Shakers produced dozens of journals, letters, and speeches documenting their meetings with the Prophet and his followers. They provisioned the Prophet’s villages and invited Tecumseh to visit their communal settlement at Turtle Creek, Ohio. During a three-day meeting during the summer of 1807, in fact, the Shakers and Shawnee danced together in front of an astonished and horrified audience numbering in the hundreds. Then the frontier erupted into violence. The Prophet faced relentless pressure from all sides, native and American. The Shakers suffered intimidation, theft, and arson. In 1811, the Prophet’s movement was nearly destroyed at the Battle of Tippecanoe; one year earlier, an armed mob threatened to raze the Shakers’ entire village. It’s an amazing story, but also a tragic one. I always planned to come back to this project after I had completed Darkness Falls on the Land of Light.


    What is a religious micro history? What made you use that term?


    I’m pretty sure the term “microhistory” won’t make the final cut in the title of the book, but I used it in my NEH application to signal the distinctive method I’ve adopted in this project. Microhistorians take well-documented but obscure individuals or events and use them to explore larger historical phenomena, issues, problems, and themes. It’s a narrative genre as well. Classic microhistories, such as John Demos’s Unredeemed Captive or Paul Johnson and Sean Wilentz’s Kingdom of Matthias, often read like gripping historical novels. They share much in terms of approach with popular nonfiction books, such as Erik Larson’s Devil in the White City.


    A microhistorical approach allows me to tell a great story. This one features two groups of religious outsiders—both despised and feared by their own people—who briefly discovered common ground and mutual respect within the racially charged and frequently violent crucible of the early American frontier. But I think the story of the Shakers and the Shawnee Prophet raises larger questions about religion, race, and violence in the newly United States. Seen from a broader angle, it’s a cautionary tale about what could have been, what might have been. The violence that erupted in response to the Shakers’ meetings with Tenskwatawa and Tecumseh brings into sharp relief the important connections between America’s rising imperial aspirations, which were directly tied to the dispossession of native Americans, and the emergence of American evangelicalism and the rise of the southern Bible Belt.


    What do you think would surprise readers about this time period and subject? Did anything surprise you while you researched?


Just about every Protestant denomination sent missionaries to the native peoples of the Ohio Valley and Great Lakes region during the early decades of the nineteenth century. Most missionaries were aggressive agents of culture change; a few, including the Moravians and Quakers, tried to work closely with native communities and sympathized with their plight. Yet all of them presumed that the Indians needed to change or perish, to become, in the language of the period, “civilized.” The Shakers were different. They were convinced that other denominations believed and practiced Christianity in error. In fact, they typically referred to Baptists, Methodists, and Presbyterians as “antichristians”! In 1805, three Shaker missionaries traveled to Kentucky and Ohio seeking to convert members of these evangelical denominations to the Shakers’ distinctive faith. After some impressive early gains, however, the Shaker mission faltered and the believers began to experience various acts of violence at the hands of their antichristian neighbors. Two years later, when members of the struggling Shaker community at Turtle Creek heard that a new group of “religious Indians” led by a charismatic prophet had recently established a new settlement less than fifty miles away, they quickly dispatched a delegation. But the Shakers never attempted to convert the Shawnee Prophet or his followers; nor did they think the Indians needed to be civilized. Instead, the Shakers understood the Prophet’s movement as a manifestation of the very same “operation of the holy Spirit” that had once existed among Anglo-American “revivalers” on the frontier. The Holy Spirit had abandoned evangelical Protestants, as one Shaker noted in his journal, and been “given to the Heathen, that they should bring forth the fruits of it.” This is an extraordinary statement. And it set the stage for a series of fascinating encounters between the two groups—and a lot of anti-Shaker violence as well.


    What lessons has your research into this period offered for American politics and society today?


    And I guess that’s the contemporary payoff of this project. I think my book will resonate with readers attuned to the politics of religious difference and the troubling connections between religion and violence in our own times. Loose talk of religious “extremism” seems to be everywhere in our public discourse these days—of individuals “radicalized” and religious beliefs “weaponized.” Anglo-Americans would have understood the Shakers and the Shawnee Prophet in similar terms in 1807. Then, as now, violence against outsider religious communities was fueled by partisan media and war-hawking politicians. When Indiana territorial governor and future president William Henry Harrison wrote to the Secretary of War and mistakenly claimed that Tenskwatawa “affects to follow the Shaker principles in everything but the vow of celebacy,” he was beating the drums of racial and religious prejudice to build support for an Indian war that would begin with the Battle of Tippecanoe and continue through the War of 1812. Perhaps, in understanding the little-known story of the Shakers and the Shawnee Prophet, readers will be in a better position to assess the dangerous power of similar forms of political invective in our own times. We’ll see.


    What has been your favorite archival experience while researching this book?


    That’s an easy one. It’s my obsessive quest to recover the history of something called “the jerks.” My book is about the rise of western Shakerism and the believers’ complicated relationship with the Shawnee Prophet. Along the way, I needed to figure out why the Shakers of upstate New York and New England sent missionaries to the trans-Appalachian west in the first place. It turned out to be an interesting story in itself. By the turn of the nineteenth century, the Shakers had developed a notorious reputation for their ecstatic dancing worship practices. During these “laboring” exercises, as they were called, Shaker brothers and sisters frequently collapsed to the ground, trembled uncontrollably, or whirled in circles. Outsiders began referring to them as “convulsioners.” Powerful religious revivals began sweeping across western settlements during the first years of the nineteenth century, but the Shakers waited to send missionaries to investigate until a curious article appeared in a local newspaper in November 1804. The anonymous author of “The Jerks” described a peculiar new religious phenomenon that had recently surfaced in the Shenandoah Valley of Virginia in which the bodies of revival participants convulsed uncontrollably. Benjamin Seth Youngs, the central figure in my book, and two colleagues set off for the frontier just one month later. The Shaker missionaries targeted communities of these “jerkers” during their travels, and revival participants who experienced such unusual somatic phenomena were among the Shakers’ earliest converts in Kentucky and Ohio. So I spent quite a bit of time tracking down these “jerkers”—the first “Holy Rollers” in the history of American evangelicalism. Readers can learn more about that side project on my website (www.douglaswiniarski.com/essays) or by visiting my “History of the Jerks” digital archive (https://blog.richmond.edu/jerkshistory).


    https://historynewsnetwork.org/article/172590
    History Provides a Critical Thinking ‘Toolbox’ for Students: An Interview with Ortal-Paz Saar



    Ortal-Paz Saar is a postdoctoral researcher in the Department of History and Art History at Utrecht University where she specializes in religious studies and Jewish cultural history. 


    What books are you reading now?


    Yesterday I finished A Pale View of Hills by Kazuo Ishiguro, which I read slowly, so as to “save” it for as long as possible. It is a true masterpiece, from every imaginable point of view. It could make a very good horror film, meaning an intelligent one, not a scary one. As it is, the novel makes you shiver because of the things it does not say, somewhat like Giorgio de Chirico’s painting “Mystery and Melancholy of a Street”. Before that I read Sarah Perry’s The Essex Serpent, a historically set novel with moving and convincing characters. I came across it by chance in the bookstore, while looking for a novel by a different Sarah, Sarah Moss, whose Ghost Wall I enjoyed very much.


    What is your favorite history book?


    If you mean “history” as in “non-fiction”, then Gideon Bohak’s Ancient Jewish Magic: A History (2008). Bohak was my PhD supervisor and is a true intellectual whom I profoundly admire. He also happens to write exceptional academic prose: clear, pleasant to read, and full of humor. 


    When reading fiction, I tend to notice the historical setting, and I often learn a lot from novels – good ones encourage you to go and read more about a period.


    Why did you choose history as your career?


    I started out as a classical archaeologist but fell in love with magic-related artifacts during my MA studies, which led to a doctorate focusing on manuscripts and history. For me, historical research is fascinating in a way similar to a detective investigation: you have clues, some of which are misleading and others fragmentary, and you need to piece together an image. You strive to achieve an accurate one, although history (especially pre-modern periods) precludes certainty.


    What qualities do you need to be a historian?


    To be a historian you probably just need to study, find a research topic, and work on it. To be a good historian, however, you need curiosity, imagination, passion, and the courage to go against the current if you believe you are right. Come to think of it, one needs those qualities to be good at any profession.


    Who was your favorite history teacher?


    Tough question. I do not recall any positive (high)school experiences with this subject. At Tel Aviv University I had several good teachers, and particularly liked Prof. Israel Roll, who unfortunately passed away in 2010. He had a very systematic method of teaching, clear and easy to follow, whether he was teaching about classical art, excavations in Pompeii or sites in Israel. I can still remember many of his classes, and think the students appreciated him.


    What is your most memorable or rewarding teaching experience?


    Both memorable and rewarding: adult-education courses in which the participants were people from all walks of life and different religious backgrounds: an orthodox Jew sitting next to two Muslims and several atheists. My lectures were about the major world religions, and I will always remember the warm, respectful, and friendly atmosphere at those meetings. 


    What are your hopes for history as a discipline?


    If we want to maintain history as a subject worth teaching even when ever more historical information can be found online, we need to think seriously about its raison d’être. We need to ask: What does history really teach us, and why is it needed today? These questions seem even more pertinent when we talk about ancient history – why should people care about what happened more than two millennia ago? I do not often come across discussions of the philosophical aspects of this discipline; maybe because people working within the discipline love it, so they do not stop to ponder its future or its relevance. They climb the mountain because it is there. However, I find it important to pose these questions, both in class and among colleagues. 


    One of the things I often tell students is that I would like to teach them critical thinking, and that history provides a toolbox they can take with them once they finish the course. This is increasingly important in the age of fakes: fake news, deepfakes, you name it. It may not be long before distinguishing true history from its other forms becomes impossible, and worse: irrelevant. My hope is that we, and the student generations we help shape, will be able to prevent this.


    On a less serious note, I hope that someone will finally invent the time machine that history lovers have been dreaming about for so long (suggested reading: M.R. James’ “A View from a Hill” -- and anything else by this author).


    Do you own any rare history or collectible books? Do you collect artifacts related to history?


    None. I am not a collector, and tend to get rid of things that clutter my space (books are never clutter, of course).


    What have you found most rewarding and most frustrating about your career?


    Rewarding: the fact that my work and my hobbies coincide. I go to work each morning feeling happy about the hours ahead. This is probably one of the greatest blessings one can ask for. Frustrations? None so far.


    How has the study of history changed in the course of your career?


    One word: digitization. Two words: digital humanities. I remember writing my PhD and using microfilm images of Cairo Genizah manuscripts: black and white, poor quality, and the microfilm machines were constantly breaking. Only a decade later, those manuscripts, fully digitized, can be viewed from any laptop in excellent resolution. Secondly, the rapid increase in the use of DH techniques and methodologies has brought to light historical patterns previously unseen, enabling people to ask questions that were inconceivable before. 


    What is your favorite history-related saying? Have you come up with your own?


    Not just history-related: “The broader your horizons, the more points of contact you have with infinity” (attributed to Blaise Pascal). I use it to justify spending time reading interesting (but irrelevant) things when I should finish writing my monograph on epitaphs that reflect Jewish diasporic identity.


    What are you doing next?


    Finishing that monograph, of course. 

    https://historynewsnetwork.org/article/172587
    Apollo: America’s Moonshot and the Power of a National Project


    On October 4th, 1957, Americans looked skyward to see that their world had forever changed. The Soviet satellite Sputnik was orbiting the Earth about every ninety-six minutes. The Soviets had kicked off the Space Race in grand fashion, shocking and embarrassing America’s political establishment.


    Senate Majority Leader Lyndon Johnson was unequivocal about the new threat: “soon, the Russians will be dropping bombs on us from space like kids dropping rocks onto cars from freeway overpasses.” In 1958, Johnson supported a massive appropriations bill to expand the American space program and create NASA. During the 1960 election, John F. Kennedy would hammer Vice President Nixon about the “missile gap” between the USSR and America. 


    As Kennedy took office, the Soviets retained their edge in the Space Race. On April 12th, 1961, Yuri Gagarin became the first man to orbit the Earth. A month later, after Alan Shepard completed America’s first spaceflight, the president announced that America would land a man on the moon before the end of the decade. 


    Far from naïve idealism, the declaration laid out an ambitious roadmap to restore America’s technological superiority. As Kennedy later said, “we choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard.” His goal would demand a tremendous amount of time and resources. By the mid-1960s, NASA’s budget was over 4% of federal spending, and its activities engaged over 400,000 people. 


    On July 20th, 1969, Kennedy’s bold vision was realized when Neil Armstrong set foot on the lunar surface. However, NASA did far more than win the Space Race. Each dollar the agency has spent has produced eight to ten dollars of economic benefits. As historian Douglas Brinkley noted, space hardware spurred major advances in nearly all facets of modern life including: “satellite reconnaissance, biomedical equipment, lightweight materials, water-purification systems, [and] improved computing systems.” The Apollo program shows how big goals paired with effective execution and government-supported R&D can play a unique role in driving national progress. 



    The Longest Journey


    Before Americans could walk on the moon, they needed to reach space and return safely. Project Mercury was America’s first foray into manned spaceflight and established several key practices that were essential to the moon landing’s success. 


    First, Mercury established the public-private partnership approach NASA would use effectively during Project Gemini (the successor to Mercury) and Project Apollo. NASA’s Space Task Group designed the Mercury spacecraft and McDonnell Aircraft produced it. Likewise, Army engineers designed the rocket that would propel the spacecraft into orbit, and Chrysler built it. 


    Second, the Mercury project thrived on cooperation instead of competition. In the earlier days of rocket design, Army and Navy teams competed against each other. Now, the entire Mercury program fell under NASA administration that concentrated personnel and resources on the task of spaceflight. The seven pilots who became the Mercury astronauts came from the Air Force, Marines, and Navy. 


    In 1961, Alan Shepard became the first American in space, with a fifteen-minute suborbital flight. The next year, John Glenn would become the first American to orbit the Earth. Three other manned flights would follow Glenn, and the astronauts would be feted as national heroes. However, a world of difference separated these brief journeys above Earth from a quarter-million-mile adventure to the moon.



    NASA recognized that the path to the moon demanded incremental steps over several years. This long-term perspective envisioned Gemini as a stepping-stone to Apollo. Although Gemini spacecraft never flew more than eight hundred miles from Earth, the two-man missions provided invaluable knowledge about the tasks required to make a moon landing possible. 


    On Gemini 4, Ed White became the first American to perform an extra-vehicular activity (EVA), commonly known as a spacewalk. Later Gemini missions would refine the techniques for maneuvering outside the spacecraft required when the astronauts landed on the moon.


    Given that a roundtrip to the moon would take nearly a week, NASA had to ensure that the crew could survive in space for far longer than during the Mercury missions. Gemini 5 orbited the Earth one hundred twenty times during a weeklong mission. Later in 1965, Frank Borman and Jim Lovell spent two cramped weeks within Gemini 7.


    Borman and Lovell also participated in the first space rendezvous. Orbiting hundreds of miles above Earth, they would be joined by Gemini 6. During the rendezvous, the two spacecraft would come close enough for the astronauts to see each other clearly. This exercise provided confidence for advanced docking procedures, where Gemini crafts would connect with an unmanned target vehicle. This docking simulated the detachment and reattachment of the Lunar Module (LM). 


    The Gemini program concluded by setting altitude records and practicing reentry into the Earth’s atmosphere. With Gemini completed, NASA was ready for Apollo. 



    Project Apollo marked the culmination of America’s manned space program. By the mid-1960s, over half of NASA’s annual $5 billion budget (approximately $40 billion in today’s dollars) went to the Apollo program. NASA contracted with thousands of companies, including IBM, which developed state-of-the-art computers. Dozens of universities provided their brightest minds too, including MIT, which developed navigation and guidance systems. Apollo was propelled into space by the Saturn V rocket, a three-hundred-foot colossus designed under Wernher von Braun’s direction. The three-man Apollo capsule also far exceeded the tiny Gemini capsule in spaciousness and complexity.


    However, Apollo suffered from numerous minor engineering defects and technical glitches, continually frustrating its first crew. Then, on January 27th, 1967, disaster struck. During a test of Apollo 1, faulty wiring created a spark which rapidly spread through the capsule’s pure oxygen environment. Astronauts Gus Grissom, Ed White, and Roger Chaffee perished. 


    For a period, the space program’s very future was in doubt. Even before the fire, some critics had condemned Apollo as a “moondoggle.” Now, the public and Congress were demanding immediate answers. 


    Rather than attempting to deflect blame, NASA created a review board to investigate Apollo 1. Frank Borman and other astronauts literally walked the floors of North American Aviation, the company that had assembled the capsule. Engineers, research directors, and spacecraft designers also joined the review board. After several painstaking months, the board recommended a series of comprehensive changes that would ultimately make Apollo far safer and more reliable. “Spaceflight will never tolerate carelessness, incapacity or neglect,” Flight Director Gene Kranz told his team after the tragedy, “from this day forward Flight Control will be known by two words: ‘tough and competent.’” 


    When Apollo resumed manned spaceflights in October 1968, the culture of nonstop self-improvement instilled by Kranz and others had taken root. Apollo 7 was an operational success.


    Apollo 8 marked a huge step forward as the crew of Frank Borman, Jim Lovell, and Bill Anders became the first human beings to orbit the moon. After their quarter-million-mile journey, they approached within seventy miles of the lunar surface and glimpsed the far side of the moon. While in lunar orbit, Anders snapped a photo of our fragile home planet in the void of space. “Earthrise” would become an icon of the nascent environmental movement.


    After a successful test of the LM on Apollo 10, the Apollo 11 mission put Neil Armstrong and Buzz Aldrin on the moon (their crewmate Michael Collins piloted the command module as they descended). After Apollo 11, NASA completed five additional lunar missions. In the later missions, astronauts spent almost a full day on the moon and successfully deployed a lunar rover. They also conducted valuable experiments and returned with rock samples that have taught us much about the moon’s origins and the state of the early Earth. 


    No More Moonshots

    After the first moon landing, public interest in the space program waned. Even in the last years of the Johnson administration, NASA’s budget was cut as the Vietnam War escalated. After America had conclusively won the Space Race, Nixon enacted even steeper cuts. By the early 1970s, the 400,000 people working with NASA had been reduced to under 150,000. Ambitious plans for lunar colonization and further exploration were scrapped along with the final Apollo missions.


    In the late 1970s, NASA turned its attention to the Space Shuttle program. The shuttle would provide a reusable and cost-effective vehicle for transporting astronauts into low-Earth orbit. However, the shuttle proved far more expensive and less dependable than expected. Among the Shuttle program’s greatest accomplishments was the construction of the International Space Station (ISS). Still, many at NASA considered the shuttle a partial success at best. NASA Administrator Michael Griffin argued that the Saturn rocket program could have provided more frequent launches into deeper space at a similar cost. Had that path been pursued, “we would be on Mars today, not writing about it as a subject for ‘the next 50 years,’” Griffin asserted. The Shuttle program ended in 2011, and American astronauts now rely on Russian spacecraft to reach the ISS. NASA’s current budget is less than 0.5% of total federal spending, barely a tenth of its mid-1960s peak.


    Interestingly, NASA’s grand plans also fell victim to the ideals of the Reagan revolution. While President Reagan supported Cold War military spending, he espoused the belief that “government is not the solution to our problem, government is the problem." That philosophy has become an article of faith for American political conservatives. Even among moderates, a deep skepticism of government programs has become commonplace. 


    Faith in government has been replaced by faith in markets. True believers claim that market competition alone drives human progress and advancement. However, economic realities have challenged that optimistic assessment. For corporations, executive compensation has become increasingly linked to stock performance. Investors press for management changes if companies underperform their targets. Corporate leaders are ever more beholden to the next quarterly earnings report and short-term growth. These demands make it far harder to invest in long-term R&D efforts, especially when the outcome is uncertain. For startups, the situation is not much better. A firm may develop a novel product, leading to a massive infusion of venture capital. However, that capital comes with a high price. To justify a sky-high valuation, investors require rapid expansion. That expansion puts an obsessive focus on user growth and customer acquisition for the existing product, leaving little time for meaningful innovation.


    Many in the business community have recognized the limitations of the current market system and have sought new ways to pursue ambitious projects. Earlier this year, a set of entrepreneurs launched the Long-Term Stock Exchange to address concerns with short-termism. Google and Facebook have created their own venture and innovation arms to pursue projects beyond their core business activities. 


    Returning to space exploration, Jeff Bezos’s Blue Origin and Elon Musk’s SpaceX are both working toward private spaceflight. Although Blue Origin and SpaceX have shown promise, their budgets represent a minuscule fraction of their founders’ assets (and a tiny percentage of Apollo’s budget in adjusted dollars). Each company employs only a few thousand people. While not discounting their impressive accomplishments to date, both companies are passion projects of ultra-wealthy individuals. 


    The Best of Our Energies

    Projects like Apollo show what a national mission can achieve. President Kennedy understood that his “goal [would] serve to organize and measure the best of our energies and skills.” We need similar thinking today. We face challenges that the market is poorly equipped to address, from infrastructure improvement to antibiotic development. Multiyear projects that require significant resources and provide broad-based benefits to society are prime candidates for government investment. That is not to say government should go it alone. Apollo succeeded as a collaborative effort between the government, companies, and research institutions. Indeed, given NASA’s partnerships today with Blue Origin and SpaceX, these companies may well be key contractors for a reinvigorated American space program. 


    Government-funded R&D also brings a cascade of associated benefits. As mentioned previously, NASA research has led to many new technologies, from the everyday (memory foam, water filters, and smartphone cameras) to the lifesaving (cancer-detection software, fireproofing, and search-and-rescue signals). The modern world would be unthinkable without satellite communication, advanced computers, and the internet, all of which began within government research programs. 


    Finally, Apollo represents the best of the American spirit. It represents exploration and innovation, hard work and teamwork, as well as the relentless desire to push the limits of human possibility. Our history is one of big dreams. We dug the Panama Canal, built the Hoover Dam, sent a man to the moon, and sequenced the human genome. These accomplishments have become part of our national identity. We should be similarly audacious today. Let’s pledge to wipe out cancer or address the challenges of climate change head-on. Regardless of the mission, let us remember Apollo and shoot for the moon. 


    https://historynewsnetwork.org/article/172580
    What is a Concentration Camp?

    Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.


    A new argument has broken out over the Holocaust, or more precisely, over references to the Holocaust in contemporary life. The sequence of events is revealing about politics, but not especially reliable about history.


    In response to the increasing comparison of right-wing populists in Europe and America to Nazis, last December Edna Friedberg, a historian in the United States Holocaust Memorial Museum’s William Levine Family Institute for Holocaust Education, wrote an official statement for the Museum about the dangers of Holocaust analogies. She was clear about what she condemned: “sloppy analogizing”, “grossly simplified Holocaust analogies”, “careless Holocaust analogies”. Dr. Friedberg criticized the political use by left and right of “the memory of the Holocaust as a rhetorical cudgel”. She urged upon everyone better history, “conducted with integrity and rigor”.


    This was not controversial, but rather typical of what historians say about the much too common references to Hitler and Nazis and fascism in our political discourse.


    Congresswoman Alexandria Ocasio-Cortez said last month on social media that the U.S. is “running concentration camps on our southern border”. Many Jewish organizations and Holocaust institutions condemned her remarks, as well as the usual chorus of conservative politicians. Although she did not mention the Holocaust, it was assumed that she was making one of those careless analogies for political purposes.


    This appears to have prompted the USHMM to issue another brief statement on June 24, that then ignited a wider controversy: “The United States Holocaust Memorial Museum unequivocally rejects efforts to create analogies between the Holocaust and other events, whether historical or contemporary. That position has repeatedly and unambiguously been made clear in the Museum’s official statement on the matter,” referring to Dr. Friedberg’s earlier statement.


    In response, an international list of over 500 historians, many or most of whom write about the Holocaust, signed an open letter to Sara J. Bloomfield, the director of the Museum, published in the New York Review of Books, urging retraction of that recent statement. They criticized the rejection of all analogies as “fundamentally ahistorical”, “a radical position that is far removed from mainstream scholarship on the Holocaust and genocide.” They argued that “Scholars in the humanities and social sciences rely on careful and responsible analysis, contextualization, comparison, and argumentation to answer questions about the past and the present.”


    There have been many media reports about the Museum’s June statement and the historians’ letter criticizing it. But there has been no discussion of the obvious distinction between the original statement by Dr. Friedberg and the newer unsigned “official” statement. Dr. Friedberg had noted that the “current environment of rapid fire online communication” tended to encourage the “sloppy analogizing” she condemned. Ironically, the too rapid response by someone at the Museum to Rep. Ocasio-Cortez’s remarks ignored the difference between bad historical analogies for political purposes and the careful use of comparisons by scholars. Now the stances of the Museum appear contradictory.


    The outraged historians also ignored the difference between the two versions of Museum statements, and demanded a retraction of the recent version without reference to Dr. Friedberg’s thoughtful statement.


    An easier out for the Museum is to issue one more statement affirming that Dr. Friedberg’s formulation is their official position, excusing itself for the poorly worded June statement, and thanking the historians for defending the proper context in which the Holocaust ought to be discussed and the proper means for that discussion.


    Lost in this furor is the fact that Ocasio-Cortez did not make a Holocaust analogy when she referred to concentration camps. Widely accepted definitions of concentration camp are worded differently but agree in substance. The online Merriam-Webster dictionary defines concentration camp as: “a place where large numbers of people (such as prisoners of war, political prisoners, refugees, or the members of an ethnic or religious minority) are detained or confined under armed guard.” The Oxford English Dictionary offers some history: “a camp where non-combatants of a district are accommodated, such as those instituted by Lord Kitchener during the Boer War (1899–1902); one for the internment of political prisoners, foreign nationals, etc., esp. as organized by the Nazi regime in Germany before and during the war of 1939–45.” The Encyclopedia Britannica offers a similar definition: “internment centre for political prisoners and members of national or minority groups who are confined for reasons of state security, exploitation, or punishment, usually by executive decree or military order.”


    Perhaps the most significant definition of the phrase “concentration camp” in this context comes from the USHMM itself, on its web page about Nazi camps: “The term concentration camp refers to a camp in which people are detained or confined, usually under harsh conditions and without regard to legal norms of arrest and imprisonment that are acceptable in a constitutional democracy. . . . What distinguishes a concentration camp from a prison (in the modern sense) is that it functions outside of a judicial system. The prisoners are not indicted or convicted of any crime by judicial process.”


    From what we have learned recently about the actual conditions in the places where asylum seekers are being held on our southern border, Rep. Ocasio-Cortez’s use of the term fits closely within these definitions. She is supported by people who understand the realities of concentration camp life. The Japanese American Citizens League, the oldest Asian-American civil rights group, calls the camps which the US government set up to hold Japanese American citizens “concentration camps”, and repeated that term in June 2018 to condemn the camps now used to hold asylum seekers.


    Rep. Ocasio-Cortez used careful and responsible analysis to make a comparison between current American policy and a century of inhumane policies by many governments against people who are considered enemies. It will take much more contextualization and argumentation to tease out the differences and similarities between all the regrettable situations in which nations have locked up entire categories of innocent people. But given the emotions which have prompted even the most thoughtful to leap to briefly expressed one-sided positions, it appears unlikely that such rational processes will determine our discourse about this important subject.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/blog/154229 https://historynewsnetwork.org/blog/154229 0
    Roundup Top 10! Videos of the Week

    How Do We Know Something Is Racist? Historian Ibram X. Kendi Explains

    Kendi explained the important historical context behind telling a person of color to go back to where they came from.


    Historian Jon Meacham: Trump now with Andrew Johnson as 'most racist president in American history'

    The historian drew the parallels between Trump and Johnson one day after Trump targeted four freshman House lawmakers.



    Roundup Top 10


    Not Everyone Wanted a Man on the Moon

    by Neil M. Maher

    Protesters in the late ’60s and early ’70s pushed for spending at home on the same multibillion-dollar scale as the moon race.


    Republicans Want a White Republic. They'll Destroy America to Get It

    by Carol Anderson

    Already, Trump and the Republicans have severely harmed the institutional heft of checks-and-balances. But they’re not done.



    Tennessee just showed that white supremacy is alive and well

    by Keisha N. Blain

    Honoring a former Confederate general and KKK grand wizard in 2019 is outrageous.



    Trump’s America Is a ‘White Man’s Country’

    by Jamelle Bouie

    His racist idea of citizenship is an old one, brought back from the margins of American politics.



    Citizenship once meant whiteness. Here’s how that changed.

    by Ariela Gross and Alejandro de la Fuente

    Free people of color challenged racial citizenship from the start.



    How activists can defeat Trump’s latest assault on asylum seekers

    by Carly Goodman, S. Deborah Kang and Yael Schacher

    Immigration activists helped give power to asylum protections once before. They can do it again.



    Eisenhower's Worst Nightmare

    by William D. Hartung

    When, in his farewell address in 1961, President Dwight D. Eisenhower warned of the dangers of the “unwarranted influence” wielded by the “military-industrial complex,” he could never have dreamed of an arms-making corporation of the size and political clout of Lockheed Martin.



    Immigration and the New Fugitive Slave Laws

    by Manisha Sinha

    The abolitionists’ protests against the fugitive slave laws, which deprived large groups of people of their liberty and criminalized those who offered assistance to them, should be an inspiration in our dismal times.



    Marshall Plan for Central America would restore hope, end migrant border crisis

    by William Lambers

    The Marshall Plan was key to restoring stability to Europe after WWII. Now, a similar approach must be taken in Central America.



    What the French Revolution teaches us about the dangers of gerrymandering

    by Rebecca L. Spang

    Our institutions must remain representative and responsive.



    What Americans Do Now Will Define Us Forever

    by Adam Serwer

    "I want to be very clear about what the country saw last night, as an American president incited a chant of “Send her back!” aimed at a Somali-born member of Congress: America has not been here before."


    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172581 https://historynewsnetwork.org/article/172581 0
    The Truth About the Holocaust and Holocaust Education


    Principal William Latson of Spanish River Community High School in Palm Beach County, Florida was removed from his post and reassigned elsewhere in the Palm Beach County school district after refusing to affirm that the Holocaust was a “factual, historical event.”


    It is not that he personally denied the Holocaust. Rather, in an email to a student’s parent, he relied on a faux professionalism and a dangerous sense of relativism, claiming that as a school district employee, he was not in a position to say that the Holocaust is a factual, historical event since not everyone believes the Holocaust happened.


    While it is important to recognize the limits of one’s own expertise, and it is usually wise to avoid speaking as an authority on matters outside the scope of one’s proficiency, Latson’s claim that he could not affirm that the Holocaust is a historical fact is not only unacceptable but irresponsible. One does not need to be a professional historian to know that the Holocaust occurred.


    We know that it occurred. 


    We can visit Auschwitz and walk through the barracks of the concentration camps that now serve as memorials and house personal artifacts of the victims, such as clothing, shoes, prosthetic limbs, even human hair.  We can stand in the gas chambers and see the ovens used to burn the bodies of those who were murdered.  We can talk to survivors and witness the numbers tattooed on their arms.  


    Moreover, when educating students, it is vital to provide them with the skills to analyze data, verify which data are reliable, and arrive at justifiable conclusions; however, it does a disservice to students to make them question the veracity of obvious facts, simply because “not everyone believes in them.” Imagine if the principal questioned the importance of teaching “Introduction to Physics,” simply because not everyone believes that gravity exists.


    Encouraging students to “see all sides” of a complicated issue, one in which values can be prioritized in different ways with varying implications, teaches them to reach conclusions based on facts, their values, and the consequences; that is a sound lesson. Presenting truth and falsity as two equally valid options is quite another matter.


    Underlying Latson’s refusal is a question about the importance of Holocaust education.  Currently, 11 states, including Florida, where Mr. Latson was principal, have laws requiring schools to provide Holocaust education. The most recent state to require it was Oregon, in 2019; its law stipulates that instruction be designed to “prepare students to confront the immorality of the Holocaust, genocide, and other acts of mass violence and to reflect on the causes of related historical events.” Yet Holocaust education matters for more than awareness of a historical fact. It can provide a unique lens for evaluating many of the social, political and professional issues that challenge us today.


    While it is typically conceived of as a devastating moment in Jewish history, the Holocaust has much broader lessons to teach. The Holocaust is the only example of medically sanctioned genocide in history: the only time when medicine, science and politics merged to endorse the labeling, persecution, and eventual mass murder of millions of people deemed “unfit” in the quest to create a more perfect society. Individuals were stripped of their dignity and viewed solely as a risk to the public health of the nation. Their value was determined by their usefulness – or danger – to society.


    Today’s political and social landscape is one where the voices of nationalism and populism have become louder and louder. While the ethos of liberal democracy and multiculturalism still provides a strong foundation for peaceful civil societies and international law, its influence on shaping the future of domestic and international politics is waning. One can see the deleterious effects of hardening ideologies and prejudice, not only along the margins of society, but even within those sectors of society that have traditionally been seen as its stalwarts.


    By showing that the tragedy of the Holocaust is not only a tragedy in Jewish history but a lesson for everyone, Holocaust education can serve to foster civics and ethics education. The Holocaust can function as a historical example for understanding the danger of placing societal progress and political expediency ahead of individuals. Holocaust education is an opportunity to teach the next generation about the essential connection between the past and the future, to give them the tools they need to learn about moral decision making and to emphasize our responsibility to stand up and speak out when we see evil in any form.


    How we teach the memory of the Holocaust is intricately tied to our vision for the future of our society. Let’s stop thinking that “Never Forget” is enough of a message. Let’s remember not only for the sake of remembering, but for the sake of helping our students become people who respect one another.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172561 https://historynewsnetwork.org/article/172561 0
    American Billionaires' Ties to Moscow Go Back Decades



    Sleazy American businessmen who make deals with corrupt Russians with shadowy connections to intelligence agents or powerful politicians did not spring up de novo in the twenty-first century.  As far back as 1919, Armand Hammer, the future head of Occidental Petroleum, met with Vladimir Lenin and agreed to a truly corrupt arrangement which enabled his family to obtain a lucrative business concession in the Soviet Union in return for financing illegal communist operations in Europe and the fledgling American Communist Party.


    In the 1970s, Hammer introduced another American businessman to his friends in Moscow.  Although David Karr, the former CEO of Fairbanks Whitney, never became a household name like his sponsor, he made millions of dollars doing business with the USSR, while providing the KGB with an entrée into the American political world. While less financially successful than Hammer, Karr, the subject of my new biography, The Millionaire Was a Soviet Mole: The Twisted Life of David Karr, lived an even more adventurous life, ricocheting from young Communist to government bureaucrat, from newsman to public relations man, from proxy fighter to international businessman, from Hollywood mogul to the hotel business, and finally to a KGB informant.  When he died in Paris in 1979 at the age of 60, his Machiavellian past and his myriad enemies inspired speculation in the French press that he had been murdered by the KGB, CIA, Israeli Mossad, or the Mafia.  Remarkably, there were scenarios, not all plausible, that could finger any one of that unlikely quartet as his killer.


    Born into a middle-class Jewish family in New York, David Katz (as he was then known) was attracted to the CPUSA by its militant anti-fascism and began writing for its newspaper, the Daily Worker, in the 1930s.  For the rest of his life, however, he insisted he had never joined the Party.  His early flirtation with communism, though, haunted him for more than twenty years. He went to work for the Office of War Information during World War II, but was forced to resign after being denounced by Congressman Martin Dies for his past associations.  After becoming syndicated columnist Drew Pearson’s chief “leg man,” he was denounced on the floor of the Senate by Senator Joseph McCarthy as Pearson’s KGB controller.  He was a frequent target of right-wing columnists, particularly Westbrook Pegler. Throughout the 1940s and 1950s Karr himself tried to cozy up to the FBI, claiming he had provided information on Communists.


    In the early 1950s he moved back to New York and joined a public relations firm, eventually setting up his own business and developing a specialty in working for corporate raiders during proxy fights.  With another shady character, Alfons Landa, as his partner, in 1959 Karr successfully took control of Fairbanks Whitney, a diversified company that held a number of defense contracts.  One of the youngest CEOs of a major American corporation, Karr faced problems getting a security clearance, and even more difficulty running the company.  Within four years, amid constant losses, an exodus of high-ranking personnel, shareholder complaints about excessive executive salaries, and the failure of a major investment in a water desalinization initiative in Israel, Karr was forced out.  Undaunted, he relocated to Broadway and Hollywood, serving as a co-producer for several movies and shows.


    Karr’s personal life was almost as tumultuous as his business career.  Already married and divorced twice, with four children, he was engaged to a glamorous Hollywood actress when he met a wealthy French woman.  Breaking the engagement, he married and moved to Paris where her parents bought them a luxurious apartment.  Parlaying his wife’s connections, Karr brokered the sale of several luxury hotels to British magnate Charles Forte and became general manager of the iconic George V.  From his new perch in French society, he did business deals with Aristotle Onassis, befriended America’s ambassador to France Sargent Shriver, and established ties to the Lazard Freres investment house.


    Through Shriver, Karr began the final phase of his peripatetic career.  He was introduced to Armand Hammer and he accompanied the owner of Occidental Petroleum to Moscow.  Developing close ties to Djerman Gvishiani, a high-ranking Soviet trade official and Premier Kosygin’s son-in-law, Karr soon became a key figure negotiating for Western companies hoping to gain access to the Soviet market. 


    His access to Soviet officials and lucrative contracts sparked rumors about possible intelligence connections.  Not only did Karr arrange the financing of the first Western hotel built in the USSR in the run-up to the Moscow Olympics, but he controlled the North American rights to Misha the Bear, the mascot for the Games, in partnership with former Senator John Tunney.  He and Hammer also had the rights to market Olympic commemorative coins.  Amid published claims that he had bribed  Gvishiani, and with labor troubles afflicting Forte’s Paris hotels, Karr became ill in Moscow in July 1979 at events celebrating the opening of the Kosmos Hotel. He died of an apparent heart attack shortly after returning to Paris.


    His personal life also took a dramatic turn in the late 1970s. Divorcing his French wife, he married a German model, more than thirty years his junior.  In less than a year of marriage, there were two divorce filings. Shunned by his family, his widow made a series of sensational charges in the French press, alleging he had been murdered. Journalists retailed stories from anonymous sources that he had cheated the Soviets on the coin deal and that he and an Israeli partner had run guns to Idi Amin and Muammar Khadaffi.  A British journalist published a sensational book alleging that he and Onassis had arranged the murder of Bobby Kennedy with the PLO.


    Untangling these lurid accusations, I have been able to establish that Karr was recruited as a source by the KGB in the early 1970s.  He never had access to top-secret information, but he did report on the presidential campaigns of Shriver, Scoop Jackson, Jerry Brown and Jimmy Carter.  He tried to insinuate himself into the Gerald Ford White House.  He probably also worked for the Mossad.  Was it any wonder that many people and organizations had good reason to dislike and fear David Karr?


    His secretive empire of corporations, trusts, and entities in several countries took a decade to probate, as three ex-wives, one widow, and five children wrangled over the estate.  His complicated legacy is well worth pursuing, illustrating the intersecting worlds of business, journalism, politics and espionage.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172525 https://historynewsnetwork.org/article/172525 0
    Alexa: What Can Apollo 11 Teach Us About Our Relationship with Technology?


    If you haven’t seen Samsung’s Apollo 11-themed television ad for its next-generation 8K TVs, it’s inspired: Families of contrasting backgrounds huddle around the tube in dark-paneled living rooms of the 1960s, eyes glistening with wonder, as they watch Neil Armstrong step onto the lunar surface. As commercials go, it’s a canny ode to American greatness past, and a stroke of advertising genius. It also reminds us that nostalgia makes for a foggy lens.


    Yes, Apollo 11 was a big deal. Historian Arthur Schlesinger, Jr. rated space exploration as “the one thing for which this century will be remembered 500 years from now … ” and saw the July 1969 moon landing as the key event. Its success sealed America’s standing as the planet’s unrivaled leader in science and technology, and today’s media lookbacks, including major TV documentaries, make that case. The better of these reports highlight the turbulent times in which Apollo took flight, as American cities boiled with protests for racial and economic justice and against the Vietnam War, and concerns about the environment were on the rise. Yet, it’s still easy to gloss over the fact that, for most of the 1960s, most Americans opposed Washington spending billions of dollars on space. 


    What also gets overlooked is Apollo’s importance as a pivot in our national thinking about science and technology. By the late 1960s, young Americans, in particular, had come to see our rivalries with the Soviet Union—“space race” and “nuclear arms race”—as stand-ins for an ur-struggle between humankind and its machines. Baby boomers, like me, loved their incredible shrinking transistor radios, out-of-this-world four-track car stereos, and Tang, the breakfast drink of the astronauts. But we’d also seen Stanley Kubrick’s “2001: A Space Odyssey” (1968), and knew Hal the computer was not to be trusted. Harvard behavioral psychologist B.F. Skinner wryly fingered the irony of the age: “The real question is not whether machines think but whether men do.” Given the times, a healthy skepticism was in order.


    In today’s digital age, we have renewed cause for pause, given the way our machines have snuggled into our daily lives, a Siri here, an Alexa, smartwatch or home-security system there. The new intimacy raises a question that animated the early days of the Space Age: How do we harness technology’s promethean powers before they harness us?


    C.P. Snow joined that debate in 1959, when the British physicist and novelist argued that a split between two “polar” groups, “literary intellectuals” and “physical scientists,” was crippling the West’s response to the Cold War. In “The Two Cultures,” a landmark lecture at Cambridge University, Snow said “a gulf of mutual incomprehension” separated the sides, “sometimes [involving] … hostility and dislike, but most of all lack of understanding.” Scientists in the U.K. and throughout the West had “the future in their bones,” while traditionalists were “wishing the future did not exist.” A cheerleader for science, Snow nonetheless warned that the parties had better heal their breach or risk getting steamrolled by Russia’s putative juggernaut.


    Without “traditional culture,” Snow argued, the scientifically minded lack “imaginative understanding.” Meanwhile, traditionalists, “the majority of the cleverest people,” had “about as much insight into [modern physics] as their Neolithic ancestors would have had.” Snow’s point: Only an intellectually integrated culture can work at full capacity to mesh solutions to big problems with its fundamental human values.


    On this side of the pond, President Eisenhower wondered where the alliance among science, industry and government was propelling America. In his 1961 farewell address, the former five-star general warned that a military-industrial complex could drive the United States toward an undemocratic technocracy. By that time, of course, the Russians had already blunted Ike’s message thanks to their record of alarming firsts, including Sputnik I, the world’s first earth-orbiting satellite, in October 1957, and the dog-manned Sputnik II the next month. The nation’s confidence rattled, Eisenhower had ramped up the space program and launched NASA in 1958.


    Talk of a technology gap with Russia, including a deeply scary “missile gap,” gave the Soviets more credit than they deserved, as it turned out, but the specter of a foe raining nuclear warheads on us was impossible for political leaders to ignore. Meanwhile, the boomer generation got enlisted in a cultural mobilization. Under Eisenhower, public school students learned to “duck and cover.” When John Kennedy replaced him in 1961, our teachers prepared us to confront the Soviet menace by having us run foot races on the playground or hurl softballs for distance; in the classroom, they exhorted us to buckle down on our math and science lest the enemy, which schooled its kids six days a week, clean our clocks.


    In April 1961, the Soviets sprang another surprise—successfully putting the first human, cosmonaut Yuri Gagarin, into low-Earth orbit. President Kennedy countered on May 25, telling a joint session of Congress that “if we are to win the battle that is now going on around the world between freedom and tyranny …” one of the country’s goals should be “landing a man on the moon and returning him safely to the earth” by the end of the 1960s. It was a bold move, requiring a prodigious skillset we didn’t have and would have to invent.


    The fact that we pulled off Apollo 11 at all is a testament to American ingenuity and pluck. Yet while the successful moon landing decided the race for space in America’s favor, it didn’t undo our subliminal angst about the tightening embrace of technology.


    The mechanized carnage of World War II had seen to that. The war had killed tens of millions of people worldwide, including over 400,000 Americans, and the atomic bombs dropped on Hiroshima and Nagasaki opened humankind to a future that might dwarf such numbers. In a controversial 1957 essay, author Norman Mailer captured the sum of all fears: In modern warfare we could well “… be doomed to die as a cipher in some vast statistical operation in which our teeth would be counted, and our hair would be saved, but our death itself would be unknown, unhonored, and unremarked . . . a death by deus ex machina in a gas chamber or a radioactive city. …”


    As the United States and Soviet Russia kept up their decades-long nuclear stalemate, the American mind wrestled with a sublime paradox: Only modern technology, the source of our largest fears, could protect and pacify us in the face of the dangers of modern technology. 


    Today, we grapple with a variation on that theme. Fears of nuclear annihilation have given way to concerns less obtrusively lethal but potentially devastating: cyber-meddling in our elections, out-and-out cyberwarfare, and nagging questions about what our digital devices, social media, and today’s information tsunami may be doing to our brains and social habits. In 2008, as the advent of the smartphone accelerated the digital age, technology writer Nicholas Carr wondered about the extent to which our digital distractions had eroded our capacity to store the accreted knowledge, what some call crystallized intelligence, that supports civilized society. The headline of Carr’s article in The Atlantic put the point bluntly: “Is Google Making Us Stupid?”


    Similarly, MIT social psychologist Sherry Turkle has argued that too much digital technology robs us of our fundamental human talent for face-to-face conversation, reduces the solitude we need for the contemplation that adds quality to what we have to say, and contributes to a hive-mindedness that can curtail true independence of thought and action.


    That’s a head-twisting departure from the American tradition of the empowered individual–an idea that once inspired our intrepid moonwalkers. In his 1841 essay “Self-Reliance,” Ralph Waldo Emerson advised America to stand on its own two feet and eschew Europe as a source for ideas and intellectual custom; rather, we should establish our own culture with the individual as the sole judge of meaning, and get on with creating a new kind of nation, unshackled by the past. “There is a time in every man’s education,” wrote Emerson, “when he arrives at the conviction that envy is ignorance; that imitation is suicide; that he must take himself for better, for worse, as his portion. …”


    A half-century later, in a globalizing and technologically more complex world, philosopher William James applied the Goldilocks principle to citizen-philosophers. For Americans, he argued, European-inspired “rationalism” (being guided by high-minded principle) was too airy, “empiricism” (just the facts, please) was too hard—but “pragmatism” (a mix of principles and what really works, with each individual in charge of deriving meaning) was just right. James sought to meld “the scientific loyalty to the facts” and “the old confidence in human values and the resultant spontaneity, whether of the religious or of the romantic type.”


    Maybe this is what James had in mind when he reached for a description of America’s democratic inner life: “For the philosophy which is so important in each of us is not a technical matter; it is our more or less dumb sense of what life honestly and deeply means. It is only partly got from books; it is our individual way of just seeing and feeling the total push and pressure of the cosmos.”


    Today, the gap between our technological capabilities and our human means of coping with them is only likely to widen. As Michiko Kakutani pointed out in her 2018 book “The Death of Truth”: “Advances in virtual reality and machine-learning systems will soon result in fabricated images and videos so convincing that they may be difficult to distinguish from the real thing … between the imitation and the real, the fake and the true.”


    (If you’ve been keeping up with developments in “deepfake” technology, you know that a scary part of the future is already at the disposal of hackers foreign and domestic.)


    In a sense, today’s digital dilemma is the reverse of what C.P. Snow talked about 60 years ago. Our technology innovators still have the future in their bones, to be sure; but overall, the health of our society may rest less on making the world a more convivial place for science per se than on deploying our humanistic traditions to make our fast-moving technology serve and sustain the human enterprise.


    At a broader level, of course, Snow was right: “To say we have to educate ourselves or perish, is a little more melodramatic than the facts warrant,” he said. “To say, we have to educate ourselves or watch a steep decline in our own lifetime, is about right.” And we can’t truly do that without prioritizing a more comprehensive partnership between the science that pushes technology ahead and the humanities that help us consider the wisdom of such advances in light of the best that has been thought and said.


    And there, the Apollo program serves as a helpful metaphor. In 1968, as Apollo 8 orbited the moon in a warm-up run for Apollo 11, astronaut Bill Anders snapped “Earthrise,” the iconic photograph showing our serene blue-white sphere hanging in lonely space. Often credited with helping to launch modern environmentalism, the image underscored what human beings have riding on the preservation of their home planet. The turn in our thinking was reflected 16 months later when Earth Day was born. And ironically, perhaps, the Apollo program spun off technology—advanced computing and micro-circuitry—that helped ignite today’s disorienting digital explosion, but also produced applications for environmental science and engineering, for example, that promote the public good.


    Meanwhile, our shallow grasp of digital technology presents problems that are deeper than we like to think when we think about them at all. As it stands, most Americans, this one included, have only the shakiest handle on how digital technology works its influences on us, and even the experts are of mixed minds about its protean ways.  

    That order of technological gap is what worried Aldous Huxley, author of the classic dystopian novel “Brave New World.” As Huxley told Mike Wallace in a 1958 interview, “advancing technology” has a way of taking human beings by surprise. “This has happened again and again in history,” he said. “Technology … changes social conditions and suddenly people have found themselves in a situation which they didn’t foresee and doing all sorts of things they didn’t really want to do.” Unscrupulous leaders have used technology and the propaganda it makes possible in subverting the “rational side of man and appealing to his subconscious and his deeper emotions … and so, making him actually love his slavery.”

    It may not be as bad as all that with us—not yet, anyway. But it does point back to our central question: Have our digital devices gotten the drop on us—or can we train ourselves to use them to our best advantage? This summer’s Apollo 11 anniversary is as timely an event as any to remind us of what’s at stake in managing digital technology’s race for who or what controls our personal and collective inner space.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172523 https://historynewsnetwork.org/article/172523 0
    HBO’s Chernobyl and the Rendering of History


    While watching HBO’s recent 5-part dramatization of the 1986 Soviet nuclear accident at Chernobyl, which spewed more radioactive material into the atmosphere than the Hiroshima and Nagasaki bombings combined, I kept thinking of all the suffering it caused. (Because of the difficulties of determining eventual early deaths due to radiation exposure, we don’t know whether they number in the thousands, tens of thousands, or more.)


    I also kept thinking of lines from Ian McEwan’s novel Black Dogs (1993):

    He was struck by the recently concluded war [World War II] not as a historical, geopolitical fact but as a multiplicity, a near-infinity of private sorrows, as a boundless grief minutely subdivided without diminishment among individuals who covered the continent like dust. . . . For the first time he sensed the scale of the catastrophe in terms of feeling; all those unique and solitary deaths, all that consequent sorrow, unique and solitary too, which had no place in conferences, headlines, history, and which had quietly retired to houses, kitchens, unshared beds, and anguished memories.

    Like wars, the Chernobyl accident had all kinds of unforeseen consequences.


    In an earlier HNN essay, I mentioned that novels, films, or television can sometimes stir our emotions and imaginations more than drier works by professional historians. And truth comes to us not just through our intellects, but also from the affective areas of our personalities. That same essay dealt with the problem of determining truth in fictionalized history. Regarding Chernobyl, Masha Gessen, who is both a U.S. and Russian citizen, provides some guidance.


    She lauds the “uncanny precision with which the physical surroundings of Soviet people have been reproduced.” One example that struck me was a dilapidated sign hanging over a street that read “Our goal is the happiness of all mankind” (also the title of Episode 4). Such signs were abundant in Soviet Russia. One banner hanging over a street (a photo of which I included in my A History of Russia) urged children returning to school after summer vacation in 1978 to “get ready to become active fighters for the cause of Lenin and for communism.”


    Although praising background depictions, Gessen faults the miniseries for “its failure to accurately portray Soviet relationships of power.” Too often, it unrealistically depicts “heroic scientists,” especially the fictional Ulyana Khomyuk (Emily Watson), “confronting intransigent bureaucrats by explicitly criticizing the Soviet system of decision-making.”


    Despite such failures, the Chernobyl episodes do a good job depicting the effects of the tragedy. The suicide of scientist Valery Legasov. The suffering of the young Lyudmilla Ignatenko as she watches the slow and painful death of her fireman husband, Vasily, and later has her newborn daughter die of the radiation she absorbed while pregnant. The hundreds of miners exposing themselves to Chernobyl radiation—at the end of the series we are informed that “it is estimated that at least 100 of them died before the age of 40.” (All quotes from the miniseries are taken from the episode scripts.) The young soldier Pavel forced to kill contaminated dogs and other animals. The old woman who refuses to move out of her contaminated home even after a soldier shoots the cow she is milking—some 300,000 people “were displaced from their homes.” And we think, “How could Soviet leaders have been so careless as to allow such a tragedy to occur?”


    The causes, as usually happens with historical events, were many, and the series mentions some of them. The fifth (and last) episode of the miniseries is devoted mainly to the 1987 trial of three Chernobyl officials who Soviet authorities claimed were most responsible. The miniseries certainly indicated that they shared some of the blame, but it also showed that the fault was endemic to the Soviet system.


    In episode 5 a fictional KGB head tells scientist Legasov that the Chernobyl accident was essentially “the result of operator error.” The KGB and judge at the trial wanted to deflect any suggestion that the Soviet communist system itself was at fault. The judge tells Legasov, “If you mean to suggest the Soviet State is somehow responsible for what happened, then I must warn you—you are treading on dangerous ground.”   


    Gessen mentions that “the Harvard historian Serhii Plokhy’s 2018 book on Chernobyl . . . argues, it was the Soviet system that created Chernobyl and made the explosion inevitable.” (See an excerpt of the book here.) In fairness to the miniseries, it does indicate some of that blame.


    In one scene featuring the three men who were put on trial, one of them tells the other two that the power at Chernobyl could not be lowered to the extent it should have been for the safety test, the failure of which caused the massive nuclear accident. Why couldn’t it be lowered more? “It's the end of the month. All the productivity quotas? Everyone's working overtime, the factories need power.”


    As one book on the Soviet environment states, “For the environment, the central planning system became Frankenstein’s monster. . . . The plan and its fulfillment became engines of destruction geared to consume, not to conserve, the natural wealth and human strength of the Soviet Union.” 


    Fulfilling quotas, whether multi-year, yearly, or monthly ones, generally became more important than safety or quality considerations. (See here for prioritizing the production schedule over nuclear safety at Chernobyl.) In the vast Soviet bureaucratic central-planning system, pleasing your superiors by meeting quotas became an important path for advancement. 


    That same system discouraged individual initiative, initiative that might have prevented or mitigated the effects of the accident. One individual reporting in May 1986 to the Central Committee of the Communist party about conditions his investigating group discovered at Chernobyl wrote that “we constantly heard the following phrase: ‘We did not receive those instructions from the center.’” He added, “They waited for orders from Moscow.” (In annual summer trips to the USSR in the mid- and late 1980s, I frequently observed this reluctance to exercise initiative. In the summer of 1986, for example, just months after Chernobyl, the group I was leading was assigned inadequate lodging when we checked into a hotel in Odessa, a city more distant from Chernobyl than Kiev, where we were originally scheduled to go. When I complained to the hotel manager that the accommodations assigned to us were inferior to those we had arranged and paid for, he informed us that he would have to straighten the matter out with officials in Moscow. It took three hours to do so. Only then were we assigned proper lodging.) 


    At the trial mentioned above, Legasov indicates still other failings of the Soviet system, especially its secrecy and lies. “They are,” he says, “practically what defines us. When the truth offends, we lie and lie until we cannot even remember it's there.” He also mentions that, to save money, various safety measures, such as building containment structures around the reactors, were not taken.


    In his Memoirs, published after he no longer headed the Soviet Union (1985-1991), which itself had disintegrated, Mikhail Gorbachev alluded to the Soviet failings indicated above. He wrote: 

    The accident at the Chernobyl nuclear power plant was graphic evidence . . . of the failure of the old system. . . . 

    The closed nature and secrecy of the nuclear power industry, which was burdened by bureaucracy . . . had an extremely bad effect. I spoke of this at a meeting of the [Communist party] Politburo on 3 July 1986: ‘For thirty years you scientists, specialists, and ministers have been telling us everything was safe. . . . But now we have ended up with a fiasco.  . . . Throughout the entire system there has reigned a spirit of servility, fawning, clannishness and persecution of independent thinkers. . . .

    Chernobyl shed light on many of the sicknesses of our system as a whole. Everything that had built up over the years converged in this drama: the concealing or hushing up of accidents and other bad news, irresponsibility and carelessness, slipshod work, wholesale drunkenness. This was one more convincing argument in favor of radical reforms. 


    Gorbachev himself is depicted in the miniseries as someone more interested in discovering the truths of Chernobyl than in covering them up. And in general that was true, but he inherited a Soviet system that was not much interested in truth or justice, and he had to contend with many government and Communist party officials opposed to some of the radical reforms he pushed, such as more openness, less censorship, and economic restructuring.


    Media and the courts were strictly controlled by the Communist party and government. About the Chernobyl trial, Legasov says, “It's a show trial. The ‘jury’ has already been given their verdict.” One indication that such was the usual practice was the observation of an earlier Soviet dissident that among the 424 known political trials in the decade following 1968, there was not a single acquittal.


    But the value of HBO’s Chernobyl is not just what it tells us about the failings of the Soviet system. It offers much more. In 2011, the Bulletin of the Atomic Scientists published an essay by Gorbachev that listed some of the lessons the world could learn from the accident. One was that we “must invest in alternative and more sustainable sources of energy” like wind and solar. Surely, another lesson is the need for caution in developing any powerful technology. In June 2019, conservative New York Times columnist Bret Stephens wrote an op-ed entitled “What ‘Chernobyl’ Teaches About Trump.” In it he compared the effects of Trump’s many lies to those of Communist officials. Wikipedia offers a convenient overview of the miniseries, including a summary of each episode and, most significantly, links to various other essays that comment on the miniseries and its relevance for today. 

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172521 https://historynewsnetwork.org/article/172521 0
    Say It Ain't So Joe: Strategies of Segregation in Ventura County

    A protest of desegregation busing in Boston, 1974



    From the last Democratic debate, we learned that in the 1970s Joe Biden opposed federally mandated busing to desegregate schools because he believed it was a matter for local government to resolve.

    To be fair to Joe, most people—black, brown, and white—at the time liked their neighborhood schools. 


    White-collar professionals purchased homes largely based on the public school that came with them, schools historically better resourced than those in black and brown communities systematically concentrated in the nation’s inner cities. Less affluent minority parents, too, simply desired equally funded neighborhood schools with effective teachers friendly to the needs of their children. And for many others, a culturally relevant curriculum that instilled an amour propre in students from diverse backgrounds was a plus.

    Largely absent from today’s public conversation on mandated busing is the racism that created segregated neighborhoods in the first place and translated into poorly funded schools for minority children. As Eric Avila in Popular Culture in the Age of White Flight: Fear and Fantasy in Suburban Los Angeles (2004) and Richard Rothstein in The Color of Law: A Forgotten History of How Our Government Segregated America (2017) detail, in the 1930s Federal Housing Authority policy, via the Home Owners’ Loan Corporation, created a redlining system that encouraged real estate interests (lenders, developers, and agents) to concentrate people of color away from white homeowners.

    This was the history behind the desegregation case of Soria v. Oxnard School District Board of Trustees (1971) in Ventura County, California. As David G. García details in Strategies of Segregation: Race, Residence, and the Struggle for Educational Equality (2018), beginning in the 1930s the Oxnard School District accommodated white homeowners who did not want their children socializing with Mexican children, the largest non-white demographic. As limited funding and facilities made complete segregation impossible, OSD administrators, at the direction of trustees, gerrymandered attendance boundaries and schedules to separate students as much as possible. 
    To maintain this system, the OSD constructed two segregated Mexican schools in the 1940s less than one block from each other. When these sites overcrowded, the district imported portable classrooms and constructed new campuses nearby.

    Ten years after Brown v. Board of Education (1954), the Community Service Organization, an ethnic Mexican civil rights group, and the National Association for the Advancement of Colored People of Ventura County protested the segregationist practices of the OSD trustees. The district contended that de facto school segregation was an outcome of residential patterns outside its purview. As the City of Oxnard grew, the CSO and NAACP persistently petitioned the OSD to remedy racial imbalances in the schools. The board rejected all of the numerous desegregation plans proposed by Althea Simmons, field secretary of the Los Angeles chapter of the NAACP, and by its own advisory committee.

    Fed up with the intransigence of OSD trustees, black and ethnic Mexican parents filed the Soria case in federal court in 1970. In May 1971, Judge Harry Pregerson’s summary judgment found that both de facto segregation and “de jure overtones” of segregation included, but were not limited to, the creation of new schools, individual intra-district transfers via busing, and the use of portables to keep black and brown students concentrated in segregated schools.


    These were constitutional violations of equal protection under the 14th Amendment. As a result, Judge Pregerson mandated a paired-schools busing plan as a remedy. That September, buses transported children of the barrio to their paired schools in the city’s more middle-class neighborhoods and vice versa. Like Kamala Harris in Berkeley at this time, as a first grader I, too, was bused from an ethnically integrated neighborhood of black, brown, and Asian American families in south Oxnard to Brittell Elementary in the predominantly white, northern part of the city.

    In November 1973, the U.S. Ninth Circuit Court of Appeals vacated Judge Pregerson’s summary judgment and remanded the case for a trial. Subsequently, board minutes from the 1930s surfaced that evidenced the de jure segregation of Mexican children to appease white parents. Former OSD superintendents, including Los Angeles County Superintendent of Schools Dr. Richard Clowes, also testified that up to and throughout the 1960s, trustees maintained segregation. Based on prior and fresh findings, Judge Pregerson ruled in favor of the plaintiffs and busing continued.

    The need to bus students faded through the 1980s as the City of Oxnard increasingly browned. Its cause: a middle-class flight of diverse races and ethnicities to the neighboring communities of Camarillo and Ventura. But as the demographics of these communities shifted over time, flight renewed. People moved further eastward, if able, to Newbury Park and Thousand Oaks. Hence, a more insidious segregation exists today as people of all colors and creeds troll education and real estate websites for school rankings. The systemic outcome: the segregation of largely black and brown students, again. If a school’s status dips and the number of brown students rises, some parents, if they can, will move to whiter, more affluent neighborhoods or commute their children to higher performing and less racially diverse schools.
Without the segregationist mentors with whom Joe Biden proudly worked as a U.S. Senator in the 1970s, this is the new face of de facto school segregation.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172519 https://historynewsnetwork.org/article/172519 0
    Delightful but Dizzying Romp as the 17th Century Meets the 1950s  

    Viola is a lovely young blonde-haired woman washed up on a sandy shore after a shipwreck that took the life of her brother Sebastian. Now alone in the world, and seeking a mate, she disguises herself as a man and goes to work for the wealthy and powerful (and very handsome) Duke Orsino in Illyria. She falls in love with him. The Duke, though, is smitten with Lady Olivia and chases her fervently while Viola chases him. So, you have a man, the Duke, chasing a lady, Olivia, and a man chasing the Duke. Or is it a woman chasing the Duke? Or is anybody really chasing the Duke? Is everybody chasing him?


    There is merriment galore in this new production of William Shakespeare’s play Twelfth Night, which just opened at the Shakespeare & Co. theater in Lenox, Massachusetts in the Berkshires. There is a bit of dizziness, too, as you try to figure out who is who. There are other characters who are trying to help or hinder the romances of this smart trio. They contest against each other and conspire with each other throughout the tale. You need a scorecard.


    This production of Twelfth Night has an unusual setting – a seaside nightclub in the year 1959 to represent, I guess, fun and frolic just before the turbulent 1960s. As you find your seat you are happily bombarded with rock and roll music from 1959, tunes such as Venus, by Frankie Avalon.


    All of this makes for an enjoyable night at the theater, but you get lost trying to follow the plot and trying to figure out the motives of all the very odd characters in the story and a lot of hidden wrinkles that should remain hidden – very hidden. 


    Twelfth Night has been staged in many different ways. It is often set on a ship, as an example, and the sea surrounds the actors. It has been set in different centuries and folks travel by cars, horses and carriages. Here it is the Bard rocking and rolling to the beat of Pink Shoelaces, by Dodie Stevens, and Rock Around the Clock by Bill Haley and the Comets.


    Our heroine, Viola, is really two people, a man and woman. She becomes the sexual object of desire, as a man, for Lady Olivia. Now Olivia is being pursued ardently by a very surprising mystery man. For Viola, wrestling with Olivia throughout the play as a guy and pining for Orsino as a woman (you’ve got to pay attention here), the question is – what to do?


    She bumps into a group of men who tell jokes, make sarcastic remarks and sing a lot. They sing original songs written for the play and they, and the audience, listen to a long list of 1950s songs that sometimes have something to do with the play and are a part of this 1601 Dick Clark’s American Bandstand television show.


    The pace of the story is faster than the Indianapolis 500 auto race and you have to keep up with three or four subplots at the same time and try, try, try, to find Duke Orsino, who wanders around 1959 like a man in search of an Esso gas station. All of the characters tease and torture poor old Malvolio, one of Shakespeare’s great comic characters, who is wonderful and involved in the hidden subplot.


    Although the plot is a bit mixed up, there are many good things to say about this Twelfth Night. Director Allyn Burrows has done a very admirable job of mastering the story and, although unwieldy, keeps the tale moving along and milks every bit of comedy out of it in addition to directing a fine cast of actors. There are wonderful small scenes involving small characters, such as the mirthful Sir Andrew and his pals Sir Toby and Feste, who does some fine singing throughout the play. They add some spark to the story.


    The director gets really superb work from the cast. Ella Loudon is triumphant as the woman/man Viola. She is on stage for practically the entire play and does yeoman work. Miles Anderson is just a vision as scampish Malvolio, who has the audience chuckling for the length of the play. Other good performances are turned in by Martin Jason Asprey as the sea captain who rescues Viola from the shipwreck, Bruce Michael Wood as heartthrob Orsino, Steven Barkhimer as Sir Toby, Gregory Boover as Feste, Nigel Gore (marvelous) as Sir Andrew, and Deacon Griffin Pressley as the mysterious suitor of Lady Olivia. When Olivia is not shouting too much, she is a welcome addition to the play, portrayed nicely by Cloteale L. Horne. Bella Merlin plays Olivia’s servant and possesses the world’s loudest and longest cackle.


    If you see this Twelfth Night, bring a scorecard to keep track of the characters and the plot and be prepared for a long two-and-a-half-hour play. Oh, and bring your dancing shoes. You may not be able to follow Viola, but you certainly can follow the rock and roll songs in the story.


    PRODUCTION: The play is produced by Shakespeare and Co. Sets: Christina Todesco, Costumes: Govane Lohbauer, Sound and original music: Arshan Gailus, Lighting: Deb Sullivan, Fight Director: Allyn Burrows. The play is directed by Burrows. It runs through August 4.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172552 https://historynewsnetwork.org/article/172552 0
    Will We all Survive by the Skin of Our Teeth?


    In 1942, just after the United States entered World War II, playwright Thornton Wilder, who had written the classic Our Town just five years earlier, wrote The Skin of Our Teeth. It was very unusual then and pretty unusual now. The play is the story of a family of mammals that has evolved into people over 5,000 years. They are the Antrobuses, a family of mom and dad, who do not get along, a daughter, and a son who is a rebel. The dad, George, is the President of the mammal society and has been since the days of Noah. The family has dealt with tragedy for all those centuries and now they are smacked with World War II.


    The play, which just opened at the Fitzpatrick Theater, Berkshire Theatre Group, in Stockbridge, Massachusetts, begins with the family living in the Ice Age and surrounded by large mammoths and other ancient animals. Act one tries to show that the mammals were more civilized than people. It also sets up the idea that this is a unique play, a bit weird, maybe, and, in the end, the story not just of humankind, particularly Americans, but of what will happen to humankind if it does not change its ways (as if we are EVER going to change our ways).


    It is a strange look at the world, very surrealistic, and something you would expect to be written today, and not in 1942. In it, playwright Wilder shows an amazing vision of the future. Much of it happened, although vaguely, just the way he predicted.


    Despite its odd structure and its tromping, stomping mammoths, which wander about half off the stage, their roars frightening everybody, The Skin of Our Teeth is a colossal success, a perceptive look back to the past and to the future and from post-World War II life to the present day and to our future, too. 


    In the beginning of the story, someone reminds the audience that Americans got through the Depression “by the skin of our teeth” and that we can get through anything, at that time referring to the war. You could jump into 1942’s future, though, and look at Watergate, Vietnam, the Civil Rights movement and even today’s border crisis to see how Americans did get through everything, and in much the way Wilder predicted. Oh, we’ve had our bumps and bruises, but we made it so far. Wilder then adds that if we have gotten this far, we can get farther. He’s right.

    Following the sci-fi start of the show in act one, the tale shifts to contemporary 1942 at America’s most glamorous resort, Atlantic City (which has fallen into gambling disrepair over the last forty years). George and his wife are the leaders of the mammal society but have their domestic problems, and their two troublesome teenagers (teenagers were troublesome 5,000 years ago, remain so today, and will be a pain 5,000 years from now). Middle-aged George, married so long, thinks about having a fling with Miss Atlantic City, his kids rebel and his wife seems fed up with everything and everybody. They tell people that when the big machine on stage turns red the world will be coming to an end. It does, and the world seems to be rushing that way, given the war, but it survives. In act three, in a dazzling performance by the teenage son, Henry, we see the future after World War II and it is not rosy.


    The Skin of Our Teeth is not an easy play to stage or watch. At the start of act three, for no reason, the action stops and the stage manager tells us, as part of the play, that actors have become ill and new actors have been put on stage. There is a rehearsal for the new actors. The play then reverts back to form with the newcomers. There are trips to 5,000 years ago, Conventions, beaches, strobe lights, explosions, boardwalks and lots of cantankerous people. 


    Director David Auburn has done a splendid job of keeping all of this running smoothly in his skilled hands. He also remembers the past warmly but meets the future with open arms and an eager smile, as do the actors. Auburn gets wonderful performances from the entire cast. Particularly good performances are turned in by Danny Johnson as George, Harriet Harris as his wife, Lauren Baez as daughter Sabina, Marcus Gladney Jr. as son Henry, and (delightful) Ariana Venturi as the Antrobus’ maid.  They are surrounded by an ensemble of fine actors.


    At the end of the play, George Antrobus looks out over the audience and says that the end of the play, earth’s resolution, has not been written in 1942. It has not been written today, either, and will not be until another 5,000 or 10,000 years have gone by. People struggle on, sometimes triumphant and sometimes tragically, but we do survive.


    What will the world be like 10,000 years from now?


    Will there still be robocalls?


    Will there still be 546 candidates in each televised Presidential debate? 


    Will parents still tell everybody that their kids are the smartest, most athletic and most beautiful people that ever lived?


    PRODUCTION: The play is produced by the Berkshire Theatre Group. Sets: Bill Clarke, Costumes: Hunter Kaczorowski, Sound: Scott Killian, Lighting: Daniel J. Kotlowitz. The play is directed by David Auburn. It runs through August 3.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172553 https://historynewsnetwork.org/article/172553 0
    Mary Jo Kopechne’s Legacy



    On July 19, 1969, Senator Edward M. Kennedy drove his black Oldsmobile off a bridge on Chappaquiddick Island, near Martha’s Vineyard, Massachusetts. His 28-year-old passenger, Mary Jo Kopechne, was killed in the accident.                                    


    A week later, Kennedy went on national television to ask the people of Massachusetts for their forgiveness. And they forgave him each of the six times he was reelected to the Senate until his death from brain cancer in 2009. History has not been as kind to Kopechne.


    For fifty years, Mary Jo was treated as collateral damage by the Kennedys and the Washington political establishment.  The media spun the tragedy as part of a much larger “curse” on the Kennedy family, and one that prevented the Massachusetts senator from being elected to the presidency. Other, more sensationalist writers suggested that Kopechne was an opportunist who was having an affair with the senator.                                                                                          


    But Kopechne’s life and legacy are much greater than her death at Chappaquiddick and the cottage industry of scandalous accounts that followed over the next half century.  Mary Jo was a bright young woman who was a pioneer for a later generation of female political consultants, including Mary Matalin, Ann Erben and Donna Lucas.                                                                                                    


    Inspired by President John F. Kennedy's challenge to "ask what you can do for your country," Mary Jo Kopechne became part of the civil rights movement, taking a job as a school teacher at the Mission of St. Jude in Montgomery, Ala. Three years later, she joined the Capitol Hill staff of New York Sen. Robert F. Kennedy.                                                                                       


    A devout Catholic, Kopechne lived in a Georgetown neighborhood with three other women. She rarely drank, didn’t smoke and was offended by profanity, yet she was irresistibly drawn to the fast-paced, glitzy world of Washington.                                


    Mary Jo distinguished herself by working long hours at RFK’s Washington headquarters. During Bobby’s 1968 presidential campaign, Kopechne served as a secretary to speechwriters Jeff Greenfield, Peter Edelman and Adam Walinsky and tracked and compiled data on how Democratic delegates from various states might vote. She shared the latter responsibility with five other young women: Rosemary Keough, Esther Newberg, Nance and Maryellen Lyons, and Susan Tannenbaum. Collectively, they were known as the “Boiler Room Girls,” after the windowless office they worked in at 2020 L Street in Washington, DC.


    At age 27, Mary Jo was the oldest of the Boiler Room Girls and the one who had worked for RFK the longest. She was the key Washington contact in the Boiler Room.  She also kept track of delegates in Indiana, Kentucky and Pennsylvania, critical battleground states where polls were predicting a close race between Kennedy and Vice-President Hubert Humphrey.                                                    


    Mary Jo was only paid $6,000 a year when she was hired, compared to the male legislative assistants who started at a salary of between $12,000 and $15,000 a year.  In the four years she worked for RFK she never earned more than $7,500 a year; enough to pay rent and maintain a Volkswagen Beetle.                                 


    By today’s standards, Kopechne was grossly underpaid, extremely overworked and dismissed as a “secretary” when her responsibilities demanded the more respectable title of “political consultant” and pay to match. But she belonged to a transitional generation of women who paved the way for the feminists of the 1970s and their fight for gender equality.


    Of all the Boiler Room Girls, Kopechne was the “the most politically astute,” according to Dun Gifford, who supervised the operation.  “Mary Jo had an exceptional ability to stay ahead of fluctuating intelligence on delegates. That ability allowed her to negotiate deals on RFK’s behalf, to travel with him when necessary and even to offer her opinions when she had the best working knowledge of a situation.      


    “Had Bobby won the election, Mary Jo would have been rewarded with a very significant job in his administration,” added Gifford. Kopechne was devastated by RFK's assassination in June 1968, and she felt she could no longer work on Capitol Hill. Instead, she took a job with a political consulting firm in Washington, D.C.                                                                             


    On the evening of July 18, 1969, Mary Jo attended a party thrown by Ted Kennedy on Chappaquiddick to honor her and the other Boiler Room Girls. Later that night, she accepted the senator's offer to drive her back to her hotel on Martha’s Vineyard. Kennedy's car swerved off a narrow, unlit bridge and overturned in the water. The senator escaped from the submerged car, but Kopechne died after what Kennedy claimed were "several diving attempts to free her."                                                   


    By the time Kennedy reported the accident to police the following morning, Kopechne's body had been recovered. John Farrar, the diver who found her, reported that she had positioned herself near a backseat wheel well, where an air pocket had formed, and had apparently suffocated rather than drowned. Farrar added that he "could have saved her life if the accident had been reported earlier." Kennedy could easily have been charged with involuntary manslaughter and sentenced to significant jail time for Kopechne's death. Judge James Boyle instead sentenced him to two months' incarceration, the statutory minimum for the crime. He then suspended the sentence, saying that Kennedy had "already been, and will continue to be, punished far beyond anything this court can impose."


    Perhaps Ted Kennedy tried to do his penance in the United States Senate, where he championed historic legislation on civil rights, immigration, education, and health care.                                                                                                                  


    If so, Mary Jo Kopechne inspired those achievements because her death forced the embattled senator to strive for a higher standard.


    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172522 https://historynewsnetwork.org/article/172522 0
    Billionaires and American Politics


    Is the United States becoming a plutocracy?

    With the manifestly unqualified but immensely rich Donald Trump serving as the nation’s first billionaire president, it’s not hard to draw that conclusion.  And there are numerous other signs, as well, that great wealth has become a central factor in American politics.

    Although big money has always played an important role in U.S. political campaigns, its influence has been growing over the past decade.  According to the Center for Responsive Politics, by 2014 the share of political donations by the wealthiest 0.01 percent of Americans had increased to 29 percent (from 21 percent four years before), while the top 100 individual donors accounted for 39 percent of the nation’s super PAC contributions.  

    With the 2016 presidential primaries looming, would-be Republican nominees flocked to Las Vegas to court billionaire casino magnate Sheldon Adelson and his wife, who had donated well over $100 million to Republican groups during the 2012 election cycle. Although even Adelson’s money couldn’t save them from succumbing to vicious attacks by Trump, Adelson quickly forged a close alliance with the billionaire president. In 2018, he became the top political moneyman in the nation, supplying Republicans with a record $113 million.

    In fact, with Adelson and other billionaires bringing U.S. campaign spending to $5.2 billion in that year’s midterm elections, the big-ticket players grew increasingly dominant in American politics.  “We like to think of our democracy as being one person, one vote,” noted a top official at the Brennan Center for Justice.  “But just being rich and being able to write million-dollar checks gets you influence over elected officials that’s far greater than the average person.”

    This influence has been facilitated, in recent years, by the rise of enormous fortunes. According to Forbes―a publication that pays adoring attention to people of great wealth―by March 2019 the United States had a record 607 billionaires, including 14 of the 20 wealthiest people in the world. In the fall of 2017, the Institute for Policy Studies estimated that the three richest among them (Jeff Bezos, Bill Gates, and Warren Buffett) possessed more wealth ($248.5 billion) than half the American population combined.

    After this dramatic example of economic inequality surfaced in June 2019, during the second Democratic debate, the fact-checkers at the New York Times reported that the wealth gap “has likely increased.” That certainly appears to be the case. According to Forbes, these three individuals now possess $350.5 billion in wealth―a $102 billion (41 percent) increase in less than two years.

    The same pattern characterizes the wealth of families.  As Chuck Collins of the Institute for Policy Studies recently revealed, Charles and David Koch of Koch Industries (their fossil fuel empire), the Mars candy family, and the Waltons of Walmart now possess a combined fortune of $348.7 billion―an increase in their wealth, since 1982, of nearly 6,000 percent.  During the same period, the median household wealth in the United States declined by 3 percent.

    Not surprisingly, when billionaires have deployed their vast new wealth in American politics, it has usually been to serve their own interests.

    Many, indeed, have been nakedly self-interested, sparing no expense to transform the Republican Party into a consistent servant of the wealthy and to turn the nation sharply rightward.  The Koch brothers and their affluent network poured hundreds of millions (and perhaps billions) of dollars into organizations and election campaigns promoting tax cuts for the rich, deregulation of corporations, climate change denial, the scrapping of Medicare and Social Security, and the undercutting of labor unions, while assailing proposals for accessible healthcare and other social services.  And they have had substantial success.  

    Similarly, billionaire hedge fund manager Robert Mercer and his daughter, Rebekah, spent $49 million on right-wing political ventures in 2016, including funding Steve Bannon, Breitbart News, and Cambridge Analytica (the data firm that improperly harvested data on Facebook users to help Trump’s campaign).  After Trump’s victory, Robert stayed carefully out of sight, sailing the world on his luxurious, high-tech super yacht or hidden on his Long Island estate.  But Rebekah worked on the Trump transition team and formed an outside group, Making America Great, to mobilize public support for the new president’s policies.

    The story of the Walton family, the nation’s wealthiest, is more complex.  For years, while it fiercely opposed union organizing drives and wage raises for its poorly-paid workers, it routinely channeled most of its millions of dollars in campaign contributions to Republicans.  In the 2016 elections, it took a more balanced approach, but that might have occurred because Hillary Clinton, a former Walmart director and defender of that company’s monopolistic and labor practices, was the Democratic standard-bearer.

    Although some billionaires do contribute to Democrats, they gravitate toward the “moderate” types rather than toward those with a more progressive agenda.  In January 2019, an article in Politico reported that a panic had broken out on Wall Street over the possibility that the 2020 Democratic presidential nomination might go to someone on the party’s left wing.  “It can’t be Warren and it can’t be Sanders,” insisted the CEO of a giant bank.  More recently, billionaire hedge fund manager Leon Cooperman made the same point, publicly assailing the two Democrats for their calls to raise taxes on the wealthy. “Taxes are high enough,” he declared. “We have the best economy in the world. Capitalism works.”

    The political preferences of the super-wealthy were also apparent in early 2019, when Howard Schultz, the multibillionaire former CEO of Starbucks, declared that, if the Democrats nominated a progressive candidate, he would consider a third-party race.  After Schultz denounced Warren’s tax plan as “ridiculous,” Warren responded that “what’s ‘ridiculous’ is billionaires who think they can buy the presidency to keep the system rigged for themselves.”

    Can they buy it? The 2020 election might give us an answer to that question.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172520
    5 Times Presidents Lost Big in the Midterms But Won (or nearly won) Reelection


    By any standard of measure, the 2018 midterm congressional elections created a blue wave that swept major regions of the country. With a record voter turnout for midterm elections, eight million more people cast ballots for Democrats than Republicans. That translated into a Democratic majority in the House of Representatives with a gain of 40 seats. 


    Attention now turns to 2020 and its presidential election. What happens in the next presidential election when a first-term president’s party is rejected by the American people? In the post-World War II era, presidents have repeatedly won reelection after their parties suffered significant losses in the midterms. Analyzing these instances may give us a clue as to whether a Trump second term is in our future. 


    Democrat Harry Truman was the first post-war president whose party suffered a major defeat in the midterms. Assuming the presidency less than three months after becoming vice president, Truman faced a myriad of domestic problems stemming from the transition from World War II to a peacetime economy. Inflation and labor unrest, including a nationwide railroad strike, consumed the new presidency. The American people went to the polls in the 1946 midterms and resoundingly punished the party in power. The Democrats lost their majorities in both the Senate and House of Representatives, dropping 10 and 54 seats respectively. Believing the election demonstrated that the public wanted a Republican president, Arkansas Senator J. William Fulbright called on Truman to resign so that the new Speaker of the House, Joe Martin, would assume the presidency.


    Truman did not take that advice and instead rose from the political ashes to win in 1948 by reinforcing the government programs and policies that had proved popular during FDR’s New Deal. In September 1945, Truman proposed his 21 Point Program, including a minimum wage extension and increase, expanded public works, and stronger unemployment compensation, housing subsidies, and farm price supports and subsidies. Truman later proposed universal healthcare. Although almost none of these proposals were law in 1948, Truman nonetheless made social welfare the cornerstone of his platform. In his acceptance speech at the 1948 Democratic Convention, Truman said, “Republicans approve of the American farmer, but they are willing to help him go broke. They stand four-square for the American home—but not for housing. They are strong for labor—but they are stronger for restricting labor's rights. They favor minimum wage—the smaller the minimum wage the better.” The promise of expanded government programs was popular with various segments of American society, and Truman barnstormed the country and won a presidential term. 


    Twenty-eight years later, Republicans experienced heavy losses in the 1974 midterm elections. Only three months earlier, Richard Nixon had resigned in disgrace, leaving the White House to his appointed vice president, Gerald Ford. When the American people went to the polls in November 1974, inflation was in the double digits and—even worse—the seemingly honest, untainted President Ford had pardoned his predecessor one month to the day after Nixon announced his resignation. The results were devastating: the Republicans lost 48 seats in the House and 3 in the Senate. 


    Like pessimistic Democrats in 1946, some Republicans viewed the 1974 results as a warning that Gerald Ford should not be the party standard bearer in 1976. But Ford had a reputation for honesty and conservative principles, and he fought the Democratic-controlled Congress, vetoing 60 bills. Although unemployment was fairly high, inflation had dropped into single digits by 1976, and Ford successfully repelled Ronald Reagan’s strong bid for the presidential nomination. After falling far behind his Democratic opponent Jimmy Carter, Ford nearly caught up and won 48 percent of the popular vote. A shift of a few thousand votes in Ohio and Mississippi would have given Ford a majority in the Electoral College.


    Ronald Reagan also won reelection despite Republican losses in the preceding midterms. During the 1982 elections, the nation was in the worst recession since the Great Depression. With inflation declining but unemployment nearing 11 percent, the Republicans lost 26 House seats but broke even in the Senate and maintained a slim majority. In the next two years unemployment declined and inflation remained under control and Reagan’s massive tax cut seemed to be working. Reagan, remaining as likeable as ever to most Americans, easily won re-election in 1984.


    More strikingly, the Democrats lost 54 seats in the House and 10 in the Senate in the 1994 midterms during President Bill Clinton’s first term. Republicans won a majority in the House for the first time in 40 years and in the Senate for the first time in eight years. Anti-Democratic groups such as right-to-lifers, term-limit supporters, and the NRA got out the vote, but the Democrats did not. Republicans began calling Clinton irrelevant and predicted his easy defeat in 1996. 


    A government shutdown likely aided Clinton’s reelection bid. In 1995, the Republican House of Representatives under Speaker Newt Gingrich proposed a U.S. budget that would limit what the federal government could do for the environment, education, and many other areas. Worst of all for Republicans, the budget would have increased Medicare premiums. When Clinton vetoed the budget, the government shut down in 1995-96. By the time Congress reached an agreement, most Americans blamed the Republicans. Clinton ran for reelection by asking the voters, “Do you want the same people who would endanger your Medicare to also have the White House?” Furthermore, Clinton won reelection by appearing mainstream and moderate, signing welfare reform and the Defense of Marriage Act. He expanded his base.


    In the midterm elections during Barack Obama’s first term in 2010, the Democrats lost 63 seats and their majority in the House. Although Democrats narrowly maintained the majority in the Senate, they lost six seats. The Republicans mobilized the voters by claiming that Obama’s $831 billion stimulus package did nothing to end the recession and criticized “Obamacare,” the newly passed health care legislation. 


    By November 2012, unemployment had dropped two points and at least a million more Americans had obtained health insurance under Obamacare. Furthermore, the Republicans and their presidential nominee Mitt Romney played to a very conservative base without expanding it. The coalition of ethnic and racial minorities that helped elect Obama in 2008 had incentives to stay together and succeeded again.


    The post-war history of midterm elections contains lessons for Trump and his campaign advisors. Presidents Truman, Reagan, Clinton, and Obama all saw their parties defeated badly in midterms in their first term, yet two years later the voters rewarded them with a second presidential term. President Ford came close to doing that too. 


    Given this history, what are Trump’s 2020 prospects? Truman succeeded by fighting against a “do-nothing” Congress and fighting for government programs popular with many Americans. But Trump’s border wall has gained no traction with the American people and many will likely blame him for the ongoing government shutdown. Reagan’s likeability coupled with a rebounding economy saved his presidency. Trump, however, remains strongly disliked by two thirds of the American people. Clinton moved toward the center attracting moderate voters who comprised a large segment of the electorate in a general election. Trump shows no sign of taking moderate or broader positions on any issue from immigration to healthcare. So while presidents have previously overcome midterm election losses, Trump isn’t following any of the previously established paths to rebound and win reelection. 

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172524
    The Civil War Battle Decided By A General's Mistakes

    Downtown Lynchburg circa 1919


    Lynchburg, Virginia, today displays many markers of its Civil War history. Several signs in and around the city indicate where Confederate forces were placed in its defense. There is a statue of a Confederate infantryman at the top of Monument Terrace. In Riverside Park, what is left of the hull of the Marshall, the boat that carried the body of Gen. Stonewall Jackson through Lynchburg, sits in an enclosed display near the James River. At the junction where Fort Avenue splits, just beyond Vermont Avenue, and Memorial Avenue begins, there is a monument to Jubal Early. Near that junction sit Fort Early, just to its west, and Fort McCausland, on what is today Langhorne Road; each served as part of the outer defenses of Lynchburg, and each is named for a commander considered a Confederate hero of the battle. Finally, Samuel Hutter’s Sandusky residence, the headquarters of the Union’s leaders during the campaign, sits undisturbed down Sandusky Drive to the north of Fort Avenue.


    Why does Lynchburg display so many markers, ultimately in the cause of a losing effort?


    It was one of the most important Southern cities in the conduct of the war and the only major Virginia city never captured by the North. By analyzing how the city escaped capture, we gain an additional perspective on the military history of the Civil War and see, in this instance, the large consequences of following one’s inclinations instead of one’s orders.


    Lynchburg, Virginia, was deemed by both North and South to be a pivotal city in the conduct of the Civil War. It was centrally located in Virginia, where much of the fighting took place. Furthermore, with its three railroads and the Kanawha Canal, it was an extraordinary transportation hub. Confederate troops gathered in Lynchburg to be sent via railroad to other places; it was a strategic depot for supplies; and, with its numerous warehouses, it was a place to bring wounded Southern soldiers in hopes of convalescence. It was also near Richmond, the capital of the Confederacy. Yet with a wall of mountains to its northwest and the James River to the east, it was defensible.


    In spite of its significance, the Civil War did not come to Lynchburg until the summer of 1864, when Union Gen. David Hunter was tasked with capturing the city. Hunter had a history of conducting military affairs as he saw fit, not as he was ordered. For instance, in 1862 he issued, without proper authority, an order freeing all slaves in Georgia, Florida, and South Carolina. That order was quickly countermanded by President Lincoln. He also, without authorization, began enlisting black soldiers from South Carolina to form the 1st South Carolina. Lincoln again rescinded that order.


    In June 1864, Hunter and his men approached Lynchburg after leaving the Shenandoah Valley. The Confederate convalescents from the hospitals, under the command of the invalid Gen. John C. Breckinridge, erected breastworks around the city in an effort to defend it and its citizens.


    Hunter was under orders to destroy the railroad and the canal at Lynchburg and generally to follow a scorched-earth policy vis-à-vis all industries that might be used to benefit the South on his way through Staunton to Lynchburg. Union Gen. Ulysses Grant wrote to Hunter: “The complete destruction of [the railroad] and of the canal on the James River are of great importance to us. You [are] to proceed to Lynchburg and commence there. It would be of great value to us to get possession of Lynchburg for a single day.” In a single day, Lynchburg’s infrastructure could be annihilated, thereby crippling the South’s capacity to transport goods, soldiers, and the ill and wounded.


    Yet again Hunter did as he pleased, not as he was commanded. As he moved southwest from Staunton, he tarried so that he could burn or destroy almost everything in his path to Lynchburg. He was sidetracked by several raids in Lexington, where he remained from June 11 to June 14. He burned down the Virginia Military Institute and plundered Washington College—he even took a statue of Washington as part of his booty—and had plans to raze even more as he traveled, for instance the University of Virginia. These raids were likely his undoing in the battle. Darrell Laurant writes in “The Battle of Lynchburg”: “The invaders were thwarted for a number of reasons, but chief among them, there was the failure of commanding general Hunter to cut this vital rail line north of the city when he had the opportunity.”


    Before the arrival of Confederate troops, Lynchburg was protected only by some 700 convalescent soldiers, under the active command of the lame Gen. Francis Nichols. Thus, Gen. Robert E. Lee ordered Gen. Jubal Early to assist the invalids Breckinridge and Nichols in defending Lynchburg. Breckinridge had Gen. D. H. Hill set up breastworks around the city. They were also aided by McCausland, who arrived in Lynchburg ahead of Hunter, and by John Imboden, who had a small remnant of cavalry; the two had established a defensive posture to the southwest of Lynchburg, at a breastwork near the Quaker Meeting House, near the Salem Turnpike.


    Early and the 2nd Corps arrived in Lynchburg early in the afternoon on Friday, June 17. With the railroad tracks damaged in several places, transit from Charlottesville to Lynchburg took five hours. Until Early arrived, McCausland and Imboden had kept Hunter’s troops, over 10 times their number, in check, but they were slowly being driven back. Even with Early’s troops, the Confederates were still at a distinct numerical disadvantage—some 8,000 to 10,000 Confederate soldiers against some 16,000 to 18,000 Union soldiers—so Early ran trains all night along the tracks on June 17 in an effort to convince the Union troops that still more Confederates were arriving. Hunter wrote in his diary, “During the night the trains on the different railroads were heard running without intermission, while repeated cheers and the beating of drums indicated the arrival of large bodies of troops in the town.”


    Early’s ruse—pretending that there were more Confederate soldiers than there were—worked. Hunter was convinced that he faced superior numbers. Ammunition, he wrote in his diary, was also running short. After discussing military affairs with colleagues at Sandusky, Hunter ordered an immediate withdrawal of Union troops on the night of June 18. Hunter later wrote to Gen. Grant of his decision: “It had now become sufficiently evident that the enemy concentrated a force at least double the numerical strength of mine, and what added to the gravity of the situation was the fact that my troops had scarcely enough of ammunition left to sustain another well-contested battle.”


    The following morning, Early pursued the retreating Union soldiers with some success, soon catching the rear guard of the blue-coats and killing a number of them. Yet the Union soldiers did escape to Salem and eventually to the mountains of what is now West Virginia. Grant, however, had requested that Hunter, if in retreat, move toward Washington, where his troops could be of use in defense of the city. Hunter chose a safer route because, he said, of his dearth of ammunition.


    Confederate Capt. Charles Blackford, who left behind a lengthy account of the battle that was published in 1901, challenged Hunter’s account of having inferior numbers and of being low on ammunition. Hunter knew that his numbers were superior—“he had scouts on both railroads and the country was filled with the vigilant spies who prided themselves on their cleverness”—and he was not low on ammunition. “It cannot be believed that a corps was short of ammunition which had been organized but a few weeks, a part only of which had been engaged at Piedmont, and which had fought no serious pitched battle, and the sheep, chickens, hogs and cattle they wantonly shot on their march could not have exhausted their supply. The corps would not have started had the ammunition been so scarce.” The Union, he maintained, was well-stocked with ammunition. He concluded that Hunter, more interested in campaigns where there was little chance of loss of life, was a coward.


    There is substance to Blackford’s assertions. Hunter could readily have arrived in Lynchburg by June 16, when he would have faced only the convalescent guard, the Silver Grays, and the few other men available to Breckinridge. Had he done so, Lynchburg would have fallen. Yet he tarried in Lexington to pillage and burn needlessly. He was also slowed by the constant burning and plundering of houses along the way, which caused pain and loss to Southerners uninvolved in the fight and in no way advanced the North’s cause.


    Thus, one of the most significant cities for the fate of the South, Lynchburg, was not captured. Had Hunter arrived before Early and had he destroyed the railroad tracks and Kanawha Canal as he was ordered to do, the city would have fallen and the Civil War would likely have ended in 1864.


    And so the real savior of Lynchburg was neither Early nor McCausland, but Hunter, who, scholars agree, had ambitions much larger than his abilities. Perhaps the Hill City ought to erect a monument in his honor in a prominent place.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172526
    What Does It Mean to Be Patriotic?

    Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive and writes about Jewish refugees in Shanghai.



    July 4 means that healthy and unhealthy discussions of patriotism again take center stage. Will you wave the flag?


    I always welcome thoughtful conversations about what behavior is patriotic, about how we should act if we love our country. It’s too bad that this rarely happens. Face-to-face talks about patriotism usually begin as arguments and accusations, and then get worse. Among people of like minds, how to be patriotic is also seldom sincerely and frankly addressed. Maybe we all are afraid to discover that we don’t agree, or that our ideas can be easily criticized.


    For example, the premise that all good Americans should love our country is a starting point that is never questioned. The postwar conservative refrain that liberals did not love America and wanted to betray it to the world communist movement has never abated; it has only taken different forms in different political eras. When I was growing up, it was crudely expressed as a taunt to antiwar protesters: “Love it or leave it.” Of course, no self-respecting conservative would now dare suggest that cozying up to post-Soviet Russia is unpatriotic, considering Trump’s attempts to excuse Putin’s electoral meddling. That has taken some of the sting out of the taunt that Democrats are socialists, but not so much that Republicans don’t use it every day.


    Just a few years ago, Republicans howled that if a black man like Rev. Jeremiah Wright said, “God damn America, for treating our citizens as less than human”, and if our black President had ever listened to him, then the whole election of Obama was tainted by lack of patriotism.


    Must a German Jew love her country? Could she not be a loyal citizen, but still experience other feelings besides love for Germany, even 70 years after the end of Nazi rule? Must a Russian whose grandparents were murdered in Stalin’s purges by the secret police now love a country run by the former KGB leader?


    Must African-Americans who experienced discrimination on their own bodies now simply love America, when segregation and discrimination still exist, and when our President is an unrepentant racist? That’s just the beginning of a thoughtful confrontation with the meaning of patriotism.


    A second problem with patriotism discussions is how they often are about symbols rather than behavior. In fact, conservatives and liberals agree about many political behaviors that should characterize a patriotic American: voting, paying taxes, and serving on juries. But conservatives tend to value reverence for symbols of America much more than liberals. In a survey last year, 71% of Republicans, but only 34% of Democrats said that knowing the Pledge of Allegiance was important for good citizenship. Displaying the flag was important for 50% of Republicans, but only 25% of Democrats.


    Someone posted on Facebook the false claim that none of the 10 Democratic presidential candidates at the first debate wore flag pins, and concluded that “Democrats hate Americans and America.” That is a familiar refrain from the right wing.


    Another difference between partisans is how criticisms of one’s country are regarded. While half of Democrats think a good citizen should protest when the government does something wrong, that is true for only a third of Republicans. Conservatives have argued my entire lifetime that criticisms of America and American history are equivalent to treason. That was the position conservatives defended when protests came mainly from liberals during the 1960s and 1970s. Now that much protest comes from the right, about “over-regulation” or investigations of Trump, conservative protest has become legitimate. For them it’s fine that candidate and President Trump displayed patriotism by offering wide-ranging criticisms of America: our President was illegitimate; our airports were “third-world”; our FBI committed treason; our military leaders are ignorant. Trump became the epitome of conservative patriotism, not out of any principles about what patriotism means, but from pure partisanship.


    Some Republican “principles” are defended only when convenient. 79% of Republicans said that good citizens “always follow the law”, compared to 61% of Democrats, but Trump’s multiple legal transgressions are ignored or defended.


    Whatever the thinking behind the idea of patriotism, Republicans believe theirs is the right way. A survey one year ago showed that 72% of Republicans rated themselves “very patriotic”, while only 29% of Democrats chose this label for themselves. Since Trump’s election, the self-proclaimed patriotism of Democrats has dropped significantly.


    It turns out that patriotism refers both to long-term feelings about country and more temporary feelings about current political leadership. Behavior and symbols are both important, but to different people. Political differences lead too often to claims that the other side is not just wrong, but also unpatriotic.


    Because patriotism is about feelings, it is hard to analyze, even for oneself. The American women just won the soccer World Cup. I rooted for them all the way, just the way I root for American athletes I never heard of in the Olympic Games or at Wimbledon. I don’t think that makes me a better American, just a normal one.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/blog/154227
    The 2020 Election Presents a Unique Opportunity to Elect a “New Generation of Leadership”


    The 2020 election presents a unique opportunity to elect a “new generation of leadership” to the presidency. The American public has done so before, as represented by John F. Kennedy in 1960; Jimmy Carter in 1976; Bill Clinton in 1992; and Barack Obama in 2008.


    One way to elect a “new generation of leadership” is by electing a younger president. Such would be the case with Pete Buttigieg, who would be 39 at the time of inauguration; Tulsi Gabbard, 39; Seth Moulton, 42; Julian Castro, 46; Tim Ryan, 47; Beto O’Rourke, 48; Cory Booker, 51; Steve Bullock, 54; Kirsten Gillibrand, 54; Kamala Harris, 56; Michael Bennet, 56; John Delaney, 58; Bill de Blasio, 59; or Amy Klobuchar, 60.


    When one examines modern American political history, one discovers that the Democratic Party has traditionally nominated much younger presidential candidates than the Republicans.


    The average age of all Presidents is about 55, but since 1952, with two exceptions, all of the Democratic presidential nominees have been younger than 60 years old. As exceptions, John Kerry was 61 when he ran for President in 2004 and Hillary Clinton was 69 in 2016. In chronological order, the Democratic nominees were: Adlai Stevenson, age 52 and 56; John F. Kennedy, 43; Lyndon B. Johnson, full term, 56; Hubert Humphrey, 57; George McGovern, 50; Jimmy Carter, 52 and 56; Walter Mondale, 56; Michael Dukakis, 56; Bill Clinton, 46 and 50; Al Gore, 52; Barack Obama, 47 and 51. 


    The Republican nominees have generally been older: Dwight D. Eisenhower, age 62 and 66; Gerald Ford, 63 when running for full term; Ronald Reagan, 69 and 73; George H. W. Bush, 64 and 68; Bob Dole, 73; John McCain, 72; Mitt Romney, 65; Donald Trump, 70. The only exceptions were Richard Nixon, 47, 55 and 59; Barry Goldwater, 55; and George W. Bush, age 54 and 58.


    So if the Democrats nominate Bernie Sanders, 79 at the time of inauguration; Joe Biden, 78; Elizabeth Warren, 71; Jay Inslee, 69; or John Hickenlooper, 68; they would alter a historical pattern. 


    In the past, there has often been a wide age gap between the two presidential candidates, as with Gerald Ford and Jimmy Carter in 1976 (11 years); Ronald Reagan and Jimmy Carter in 1980 (13 years); Ronald Reagan and Walter Mondale in 1984 (17 years); George H. W. Bush and Bill Clinton in 1992 (22 years); Bob Dole and Bill Clinton in 1996 (23 years); John McCain and Barack Obama in 2008 (25 years); and Mitt Romney and Barack Obama in 2012 (14 years).


    Now in 2020, we could have a much wider divergence in age—as much as 36 years between Donald Trump and Pete Buttigieg.


    2020 could be a “revolutionary” and unique election year beyond the issue of age. We could possibly elect the first woman President (Warren, Harris, Klobuchar, Gillibrand, Gabbard); our first mixed race woman President (Harris); our second African American male President (Booker); our first Latino President (Castro); our first gay President (Buttigieg); our first Jewish President (Sanders, Bennet); our first Hindu President (Gabbard), born in the US territory of American Samoa; our oldest first term President at inauguration (Sanders, Biden, Warren); our first President who will reach 80 years of age in office (Sanders, Biden); our first sitting Mayor President (Buttigieg, de Blasio); our first sitting Congressman President since James A. Garfield in 1880 (Gabbard, Moulton, Ryan); or a President younger than Theodore Roosevelt or John F. Kennedy (Buttigieg, Gabbard, Moulton).


    Why is this important for the upcoming election?  The answer is that “fresh blood,” whether in age, gender, ethnicity, or sexual orientation, represents the long-term future of America, as the nation becomes more diverse than it has ever been. Promoting change and uniqueness in political leadership could result in higher voter turnout and would potentially enhance efforts to address the challenges of the 21st century. Historically, this occurred in the early to mid 20th century with the era of Theodore Roosevelt, Woodrow Wilson, Franklin D. Roosevelt and Harry Truman.  

    The future is ultimately in the hands of those born since 1980 who will lead America in the next few decades.  Despite the strength at the moment of leaders born in the World War II and early Cold War years, the long-range future suggests the “torch should pass to a new generation of leadership,” as California Congressman Eric Swalwell stated, quoting John F. Kennedy’s Inaugural Address, in the first debate in late June (although Swalwell dropped out of the race on July 9). The same situation occurred when Jimmy Carter, Bill Clinton, and Barack Obama took the oath of office, and it is likely that the same will occur in 2020.


    Most certainly, the Presidential Election of 2020 will be one of the most fascinating and significant elections in American history.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172492
    Roundup Top 10!  

    Only Washington Can Solve the Nation’s Housing Crisis

    by Lizabeth Cohen

    The federal government once promised to provide homes for every American. What happened?


    Democrats’ Ominous Shift on School Segregation

    by Brett Gadsden

    It’s not just Joe Biden—the party has backed away from its commitment to fighting segregation in the public schools.



    How antitrust laws can save Silicon Valley — without breaking up the tech giants

    by Margaret O'Mara

    For AT&T in the 1950s, antitrust enforcement helped increase competition while keeping Ma Bell intact.



    How Fake News Could Lead to Real War

    by Daniel Benjamin and Steven Simon

    We think of false information as a domestic problem. It’s much more dangerous than that.



    The War Against Endless War Heats Up With Koch-Soros Salvo

    by Ronald Radosh

    The otherwise ideologically opposed billionaires are the latest unlikely pair to find common ground in the idea that American power is the root cause of the world’s problems.



    The Riptide of American Militarism

    by William Astore

    As Americans wrestled with the possibility of finding themselves in a second looming world war, what advice did the CFR have for then-President Franklin Delano Roosevelt in 1940?



    The white nostalgia fueling the ‘Little Mermaid’ backlash

    by Brooke Newman

    The uproar over a black Ariel shows how important representation in children’s entertainment is.



    There’s More to Castro Than Meets the Eye

    by Jonathan M. Hansen

    The revolutionary leader fought for and defended the very democratic ideals his government would later suspend.



    Roosevelt versus the refugees: One FDR policy that Bernie Sanders never mentions

    by Rafael Medoff

    Sanders favors a much more liberal U.S. immigration policy. Not Roosevelt. In fact, FDR’s immigration policy was so strict that if Sanders’s father, Eli, had not arrived from Poland before Roosevelt became president, he probably would not have been admitted.



    Why We Need More Black Women In Economics

    by Keri Leigh Merritt

    Recently a group of brilliant, driven, young Black women formed The Sadie Collective, an organization that “seeks to be an answer to the dismal representation of Black women in the quantitatively demanding fields such as public policy, economics, data analytics, and finance.”



    What to an American Is the Fourth of July?

    by Ibram X. Kendi

    Power comes before freedom, not the other way around.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172517
    DJ Alan Freed: The Inventor of Rock and Roll


    “Jukebox musical” is the term for a play about an individual or group from rock music history that offers a lot of songs but a thin plot and undeveloped characters. Most of them fail. Some, like Jersey Boys, Beautiful, and Ain’t Too Proud, succeed.


    Rock and Roll Man: The Alan Freed Story, the tale of the fabled 1950s DJ who coined the term “rock and roll” and was the most famous DJ in America until the arrival of Dick Clark, falls somewhere in the middle. The play, with book by Gary Kupper, Larry Marshak and Rose Caiola, and music and lyrics by Kupper, just opened at the Colonial Theatre, part of the Berkshire Theater Festival, in Pittsfield, Massachusetts. It is a rocking and bopping night of wop-bop-a-doo-wop rock music entertainment, loaded with fabulous songs from the early days of rock, such as those by Chuck Berry, Little Richard, the Coasters and the Platters, and full of brilliant choreography.


    The play has eye-opening staging. There is a turntable on stage that spins about in the first act, a high courthouse bench for the play’s judge, a second-level balcony on which performers delight the audience, recreated music studios, night clubs and bars. It is a music city of a stage. It is, for rock fans, a rip-roaring good night at the theater.


    The play has a flashback format. At its beginning, Alan Freed emerges as a troubled drunk as the payola scandal (bribes to DJs to play particular songs) hits America. Then we go back to a mythical court drama in which Freed is on trial (in the court of public opinion). His defense attorney is the colorful, flamboyant singer Little Richard, played with all of his pomposity and wildness by Richard Crandle, whose costume looks like an exploding candle. The prosecutor is FBI Director J. Edgar Hoover. The court then recounts Freed’s life from his early days as a DJ at a small radio station in Cleveland to his position as the number one DJ in America at New York City’s WINS radio in the 1950s.


    Director Randal Myler does a pretty good job of holding on to the reins of a play that is a bit cumbersome. He gets fine performances from Crandle as Little Richard and a very talented ensemble of actors, singers and dancers, plus numerous famous quartets.


    The play has some problems that hurt it, though. The opening act is overloaded with songs and underdeveloped in plot. It plays more like a musical revue – the best moments of the 1950s, everybody please clap. You are overwhelmed with song after song; dizziness sets in. The tunes are good, but they knock you over in your seat. The story of Freed, who was such a towering music figure in that era, emerges very slowly, and Alan Campbell, the actor who plays him, never quite gets going in the role. Campbell presents Freed as a bystander in his own story rather than the tenacious Freed himself; he needs a sharper focus and more pizzazz. George Wendt, co-star of the famed television series Cheers, has the same problem as J. Edgar Hoover: he never captures Hoover’s character and is a bit miscast in the role. The play is also too long. It runs about two and a half hours, and a good twenty minutes could be cut, especially in the first act. Many songs overlap each other, make the same plot point, and could be dropped.


    The story needs to be sharper. There are points in it, such as the African American singer Frankie Lymon kissing a white girl on TV, that were very controversial, and they are glossed over in the story. Freed’s deadly alcoholism is mentioned several times in the play, but you never sense that drink will ensnare him; that thread needs to be strengthened. The payola scandal, which stunned the country, brought down several well-known DJs and grabbed headlines for weeks, needs to be explained better.


    Overall, though, Rock and Roll Man is a good show. It is a treasure house of entertainment history, from Freed’s rise to the payola scandal. You learn just about all there is to know about how Freed emerged, caught the public eye and became the number one DJ in America. You learn how DJs on radio worked, how they moved to television (Dick Clark) and how rock and roll, so feared by the police, parents and the schools, literally took over America. You learn much about the attitude of teenagers in that day (accused of being juvenile delinquents by just about everybody). There is a lot on how records became number one best sellers, the integration of music and America, and the development of the rock and roll concert, a rarity in American entertainment history.


    And, of course, the show gives you dozens of classic songs, real finger-snappers, from the early days of rock and roll: Good Golly Miss Molly, Great Balls of Fire, I’m Walkin’, Lucille, Maybellene and Roll Over Beethoven, to name a few.


    So, try to ignore the weak spots in the show, put on your blue suede shoes and jitterbug out on to the dance floor.


    Rock and roll is here to stay…


    PRODUCTION: The show is produced by the Berkshire Theater Festival. Scenic Design: Tim Mackabee, Costumes: Leon Dobkowski, Lighting: Matthew Richards, Sound: Nathan Leigh. Choreography: Brian Reeder. Director: Randal Myler. The play runs through July 21.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172494
    Jimmy Carter, Public Historian

    Jimmy Carter Sunday School, February 3, 2019, photo by Jill Stuckey



    Those who have attended Jimmy Carter's Sunday School know that the time between arrival and 10:00 a.m.—measured in hours—is not really empty. A church member, often Jana Carter or Jill Stuckey, orients visitors, goes through a list of do's and don'ts, and provides short history lessons along the way. It is a presentation that church members have perfected over the years and the orientation is interactive and lively. "What have the Carters been doing in retirement?" Jill asks the audience. It is a loaded question. "Habitat!" or "Building houses!" is the most common first response. "Yes, the Carters build houses one week a year," Jill responds, smiling through gritted teeth. "How about the other fifty-one weeks?" Soon a more comprehensive accounting emerges: helping to ensure fair elections; eradicating Dracunculiasis or "guinea worm"; writing books; staying in shape; hunting and fishing. One avocation, though quite successful, is never mentioned: Jimmy and Rosalynn Carter are practicing public historians.


    Jimmy who? A public what? The work that the Carters do at the local level meets the National Council on Public History's inclusive definition of the field: "public history describes the many and diverse ways in which history is put to work in the world." Consider, for instance, Sunday school itself. The history lesson begins before President Carter arrives and continues after he enters. Political scientist Jason Berggren writes that Sunday school at Maranatha Baptist Church in Plains, Georgia, serves "as a press conference of sorts" and "an occasion for presidential apologetics – an ongoing defense and explanation of his presidency."(1) The setting also makes it a form of wide-ranging historic site interpretation that includes Carter's upbringing, his years in the White House, his post-presidency and— last but key to it all—his deep, compelling faith.


    Sunday school at Maranatha is not a political rally, but it has a secular significance as public history. Guests often come from around the world to connect with the past—a need that drives much of the public history world, from heritage tourism to reenactments. President Carter serves as both historical subject and docent in these moments. "Where are you from?" he asks the audience. When someone says, "Washington State," Carter responds, "The best nuclear-powered submarine in the world is stationed there. I'll let you guess the name." Carter is referring to a ship that bears his name—a subtle reminder of his history with the U.S. Navy. At the end of Sunday school, Carter shifts fully into public historian mode, shaping the way he wants people to remember him. "I used to say I'd be happy to take photographs with you after church," he jokes with a smile. "Now, I'm willing to do them." Or, with a smaller smile, he apologizes about his waning mobility. "My doctor tells me I have to sit during photographs. Please don't take my sitting to mean that I think I'm better than you."



    Jimmy and Rosalynn Carter speak at Plains High School for President's Day, 2016, photo by Jill Stuckey



    Jimmy Carter continues to shape his historical legacy in other ways. Indeed, to visit Plains High School, the railroad depot that served as a presidential campaign headquarters, or the Boyhood Farm is to take a guided tour led by public historian Jimmy Carter. Although Carter has written more than thirty books, his favorite to write was An Hour before Daylight: Memoirs of a Rural Boyhood (2001). Published just as his boyhood farm opened to visitors, this book has shaped interpretation at the Jimmy Carter National Historic Site more than any other. In fact, it would be difficult to overemphasize the impact of the book at the farm. According to historian Zachary Lechner, "Carter's perspective—very much in evidence throughout the site's interpretation—is omnipresent at the farm."(2) From Jimmy Carter voiceovers to written excerpts, the boyhood farm is nearly as immersive an experience as Maranatha. There are few autobiographical landscapes quite like it.


    Both Jimmy and Rosalynn Carter also remain active members of the public history community, including the Plains Better Hometown Program and the Friends of Jimmy Carter National Historic Site. In December 2016, one year after President Carter beat cancer, the Better Hometown Program held a Christmas party in the Matthew Rylander House. Better known locally as "the haunted house," the ca. 1850 plantation house was rented by the Carter family between 1956 and 1961. Although now vacant, it is owned by the Better Hometown Program, and the Carters led the way in stabilizing the building. The night of the party, with a torrential storm outside and only Christmas lights inside, the Carters went to each table after dinner, describing what used to be here or there and pointing out "hidden" rooms between first-floor closets and an attic. The family thought these nooks were the source of the house's haunting. The Carters also make recurring cameo appearances in a "whodunit" murder mystery series organized by Kim Fuller, Director of the Friends of Jimmy Carter NHS. The popular event is held on the SAM Shortline, an excursion train between Cordele and Plains that the Carters lobbied to bring to the area in 2000.



    Carter painting door at the Plains High School, 2015, photo by Jill Stuckey



    As board members of the Friends, the Carters play key roles from interpretive work to fundraising. Rosalynn Carter recently led the way in putting Plains on Georgia's Camellia Trail. In 2016, the National Park Service made President Carter an honorary park ranger, and the Carters have given special programs on President's Day for years. Together, they have helped the Friends group raise millions of dollars for the park. This fundraising has enabled the organization to hire a full-time education specialist who creates museum lesson plans and coordinates field trips. In some ways, the Carters have created in Plains a living history museum of 20th-century rural America with a twist: the global perspective of a former president and first lady.


    Orientation before Sunday school at Maranatha makes it clear that Jimmy Carter still thinks carefully about official and ceremonial titles. He does not like to be referred to as "Mr. President," an orientation leader explains, "because there is only one 'Mr. President' at a time, and that is the person who occupies the Oval Office." So be it. Indeed, we ought to respect President Carter’s wish that we not confine him too much to his time as chief executive. So address him as President Carter, or describe him by one of his less formal titles and roles: Sunday school teacher, public health leader, navy veteran, compassionate Christian, and a friend to strangers. And to these sobriquets, each more descriptive than “Mr. President,” we should add one more: Jimmy Carter, public historian.


    Carter becomes honorary park ranger, 2017, photo by Jill Stuckey


    (1) D. Jason Berggren, "Life after the Presidency: Jimmy Carter as Sunday School Teacher," White House Studies, vol. 13, no. 2 (2015), 111, 109-127.

    (2) Zachary J. Lechner, "Commemorating Jimmy Carter and Southern Rural Life in Plains, Georgia," in Born in the U.S.A.: Birth, Commemoration, and American Public Memory, edited by Seth. C. Bruggeman (Amherst: University of Massachusetts Press, 2012), 83.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172464
    Fictional History, Patriotism, and the Fight for Scottish Independence

    A screenshot from Braveheart (1995) 


    Premiering at the 2019 Edinburgh International Film Festival in advance of its general release on June 28, 2019, Robert the Bruce, directed by Richard Gray, will “boost support for Scottish independence,” if actor and independence activist Angus Macfadyen has his way. Macfadyen revisits his role as the titular Scottish leader, a role he first played in Braveheart (1995). That film, he believes, “led to a surge in Scottish nationalist confidence.” Coincidentally, within a few days of the premiere of Robert the Bruce in Scotland, former UK prime minister Gordon Brown warned that “the unity of the United Kingdom has never been at greater risk,” due to the “hijacking of patriotism” by Conservative Party leaders and Brexit bulldogs Boris Johnson and Nigel Farage, and by the Scottish National party’s embrace of “a more extreme nationalism.” 


    To many, the contribution of popular but historically inaccurate films—and literature—to Brexit and the evolution of a misguided patriotism that fails to take account of historical and political complexities seems obvious. Perhaps even more disturbing, however, is the synergy between politics, popular culture, and economics: as promoters of Scotland as a tourist destination continue to embrace “tartan heritage” in an effort to support Scotland’s important tourist industry, they unwittingly reinforce a version of history that serves the purpose of political propaganda, rather than disseminating a nuanced understanding of Scotland’s past. 


    The case for Braveheart’s influence on Scottish politics has been made previously by other observers, including historian Robert Brent Toplin, who noted in a 2015 History News Network article that Scottish audiences gave the film standing ovations at screenings and began supporting the separatist movement in far greater numbers after its appearance. Toplin concluded that “Braveheart’s impact on the people of Scotland reveals the potential of film to shape public opinion and agitate national politics.” It’s important to keep in mind that films such as Braveheart and Robert the Bruce, and recent books such as Diana Gabaldon’s Outlander series with its “wildly popular” Starz adaptation, are building upon a romantic vision of Scotland developed by eighteenth-century writers and Romantic visual artists and codified by the poetry and novels of Sir Walter Scott in the nineteenth century: their creative work established romantic Jacobitism as a dominant narrative of Scotland’s past. This narrative of history fostered the idea of Scotland as an “imagined community,” to use Benedict Anderson’s phrase, associated with a heroic but doomed rebellion against an indifferent, often unjust overlord, or, as it evolved over time, patriotic Scots against the cruel English colonizer. When the contemporary American novelist Diana Gabaldon came to choose the subject for her first novel, she tapped into a historical master-narrative of Scotland that already had an established set of associations and cultural values influenced by fiction. 





    The truth of Scotland’s history is, of course, much more complex than the narrative of the past one finds in the realm of popular culture. Romantic artists erased the Gaelic population by visualizing Scotland as a picturesque landscape, sublime and largely empty of people, despite the presence of industry throughout the country in the eighteenth century and the rapid urbanization of Edinburgh and Glasgow. Romanticism’s promotion of Gaelic primitivism, now popularized by contemporary literature and film, has also overwritten the significant global contributions made by Scottish Enlightenment philosophers, statesmen, scientists, and innovators. Popular stories of the Jacobite Rising of 1745  typically narrate a conflict between heroic Highlanders and a better-equipped English army that overlooks the military successes and subsequent poor military decisions of Prince Charles’s army, as well as the presence of many Scots who fought and died alongside the English at Culloden. The history of the Highland Clearances is similarly more complicated than a nationalist narrative of ethnic cleansing by the English suggests. As author Madeleine Bunting has observed in her memoir Love of Country: A Journey through the Hebrides, “Racism, betrayal [by fellow Scots], and imperial exploitation: three toxic elements have been incorporated into different readings of the Clearances” (147).


    The fictional “history” of Scotland has received, and continues to receive, reinforcement via the consumer website of Scotland’s national tourist board, which seeks to capitalize on the popularity of Braveheart and, now, Outlander by invoking that romantic narrative as it entices visitors and their pocketbooks to Scotland. In fact, just as the nineteenth-century tourist industry drew upon the popularity of Scott’s works to inspire readers to visit the locations he made famous, promoters of tourism today are quick to invite fans of Outlander to experience a version of Scotland that exists largely within the realm of the imaginary. 


    One may ask why this matters: if fan tourism brings much-needed money into the country, does it matter that those tourists are ill informed about history, so long as the inhabitants of the country know better? If historical fiction had no effect upon citizens’ perceptions and political decision-making, the oversimplification of Scotland’s history by novelists and filmmakers in quest of a good story—and the reinforcement of that story by those seeking economic gain—would not matter. But, as noted above, fiction does inform life in the case of Scotland’s independence movement: the popular story of Scotland told across print and media platforms, on screen, in books, and on websites has become, for the many Scots who get their history from popular culture, the only story of their past.


    Comments about Culloden made by members of the popular Facebook page Outlander Series Books & TV reveal that this series has constructed the history that some believe is true. As one member commented, “Scotland is where I was born and raised. . . . I never knew anything about the battle of Culloden until I watched outlander [sic].” Pop-culture-derived “history” has been similarly on display during Scottish independence rallies since the 2014 referendum. Reporting on a 2015 rally in Glasgow, VICE correspondent Liam Turbett noted the expression of “dodgy pseudo-ethnic nationalism” which, while it resembled “a parody of everything people say to discredit the independence movement,” was cheered by those “along the fringes of the Yes movement.” Turbett supplemented his verdict of this “contortion of history” with a mention of a pro-independence sign containing a quote attributed to William Wallace—but really made by “his fictional dad in the film Braveheart.” 


    Whose responsibility is it to ensure that a more nuanced understanding of history is shared widely, especially among those who may lack the interest in or ability to access the scholarship of historians? The example of Scotland and the forces unleashed by Brexit and the current nationalist debate illuminate the importance of understanding how commercial and political entities use pseudo-historical narrative for self-promotion and the creation of an imagined community. However, it may be as important for serious writers and filmmakers to create historical fiction more thoughtfully. Knowing that literature and film can shape public opinion and beliefs about the past, writers and readers who crave a better-informed populace may need more often to use the power of the pen to avert the power of the sword.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172467
    Gresham's Law of Reading: Bad Reading Drives Out Good


    James W. Loewen is a sociologist.  The New Press recently brought out new paperbacks of Loewen's bestseller, Lies My Teacher Told Me, and Sundown Towns, about places that were/are all-white on purpose. 


    Gresham's Law, as I'm sure you recall from Econ. 101, states, "Bad currency drives out good." It works like this. Suppose you have $100 in gold coins and $100 in paper bills. You want to buy a sport coat for $99. (I did buy a sport coat for $99, just before Christmas.) Are you going to hand over your gold coins or your paper bills? 


    You're going to hand over your paper bills. At least most of us will.


    After all, the paper bills depend upon the backing of the government. The gold coins have intrinsic value. If North Korea or an ISIS terrorist sets off a nuclear bomb in D.C., where I live, I can escape in my car, camp out in southern Pennsylvania, and maybe trade a gold coin for some bread and cheese from the nearest Amish farmer. Even without the threat of societal breakdown, the gold coins also look nice, so I derive pleasure from merely owning them. From the paper, not so much. 


    As a result, gold coins don't work as currency. People don't exchange them. They hoard them. By definition, "currency" is "a medium of exchange." Bad money has driven out good. 


    So it goes with reading, at least for me. My current fiction read is Cloud Atlas, a complex, remarkable novel by David Mitchell that takes place in 1841, 1931, more-or-less the present, and several future eras. I recommend it to you. 


    I've been reading it for years. First, I used it as bedtime reading. This didn't work because, to the annoyance of my spouse, I fall asleep within 30 seconds of opening it. Then I switched to taking it on trips with me. 


    Cloud Atlas has now been to, in chronological order, West Virginia, Indiana, Colorado, Montana, Minnesota, Georgia, California, Wisconsin, Philadelphia, New York City, Switzerland-to-Amsterdam on the Rhine, the United Kingdom, the Bahamas, New York City again, Vermont (twice), and Massachusetts (three times). A year ago it visited the Azores (which were excellent, by the way). This past April, it went down the Nile (a bucket-list trip, fascinating in many ways). Just last month, it ventured to Portland, Oregon, and then to Minnesota. Still, I didn't finish it.  


    What is going on? 


    It's Gresham's Law of Reading. Bad reading drives out good. 


    Specifically, it's the newspaper, in my case, the Washington Post. It's Time, Smithsonian, and Multicultural Perspectives. It's The National Museum of the American Indian. (Yes, that's a magazine as well as the institution that puts it out.) God help me, it's AARP the Magazine and whatever the magazine is called that AAA sends me. I am always behind on reading them, so I always pack a stack of them on my trips. Since I don't want to bring them back home, I always read them first, so I can throw them out. Consequently I rarely get to the gold. 


    This pattern does have one payoff: I do catch up on my magazines. This saves me from the fate of a Time subscriber whose letter I still recall from about 1952, when I was ten years old, reading my father's magazine. From memory, it went, 


    I really like your magazine. You're doing a fine job. However, it is too much material for me. I file each new issue on my bookshelf on the right, and I read them from the left. Right now I'm in the middle of 1943, and I can't wait to see how it all turns out!


    On my last day on earth, however, I shall be sad if I have not finished Cloud Atlas. I doubt I'll lament not having finished the latest AARP. 


    Could this perhaps be a metaphor? On that day, might I also be sad, not having taken care of the important things — the gold — while wasting my time on tasks that have currency, but no real value? 

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/blog/154226
    Flight Girls: Remembering World War II's Women Airforce Service Pilots


    What is a hero? It’s a question I’ve pondered off and on for the past seven years, ever since I came across a stack of books at my aunt’s house and read a piece of WWII history I hadn’t previously known. 


    The Women Airforce Service Pilots (WASP) program was the brainchild of famed aviatrixes Jacqueline Cochran and Nancy Harkness Love. With the assistance of General Henry “Hap” Arnold, the commanding general of the Army Air Forces, they built a program teaching female pilots to fly every type of airplane the military owned, so long as they met the age and height requirements, had 500 flying hours under their belts, and held a pilot’s license. They were taught to fly “the Army way” and flew warplanes that had been damaged in battle, planes right off the production line, simulated strafing missions, and towed gunnery targets for live ammunition training. The women who flew were bound by spirit and duty, bravery and skill… and bonded by their love of country and a job they knew they could do well. Some say they could handle those planes better than many of the men.


    If they washed out, they had to pay their own way home. If they were injured or killed, it was up to their friends and family to get them the care, or the casket, they needed. 


    Once training was finished, they were sent to one of the many military bases across the country, where they ferried planes from base to base, transported military personnel and cargo, or continued testing new planes. There wasn’t always a designated space for them to bunk, so sometimes they slept in the nurses’ quarters. Other times they had to get a hotel room. No plane to fly back from the base you’d just landed at? No problem! Wait around for a day or more, or get yourself a ticket on a commercial flight – on your own dime, of course. They weren’t allowed to pack much in the way of clothing—warplanes don’t always have a lot of room for luggage—so they tucked spare bits of clothing into the cockpits’ nooks and crannies. There were undergarments in logbooks, a pair of heels beside a seat. Sometimes they got stuck in a city for days, washing and re-washing the few items of clothing they’d brought until they could get back to their home base. 


    They did this without complaint or expectation. They did this so the men could go to war.


    I was stunned by anecdotes of bravery, death, and outright misogyny. And I was baffled that the subjects of these stories had tried to be heard but still, seventy-seven years later, were for the most part unknown to the greater public. 


    On a humid and windy May morning, I arrived at what is now the Texas State Technical College. Seven-plus decades ago, though, in place of the brick buildings stood long wooden structures that housed the pilots who trained there. There were offices and a chow hall, classrooms, and hangars. Boots marched on this dirt. Planes buzzed overhead at all hours of the day and night in the wide-open blue sky. This had been Avenger Field. And from 1942 to 1944, 1,074 women served their country with bravery and a whole lot of moxie. 


    What brought me there was the annual WASP Homecoming Reunion. I had heard there would be five members attending; only two were able to make the trip. Kay Hildebrand and Dorothy Lucas were greeted with a salute and escorted from their cars by service women and men, who rolled them in their respective wheelchairs between two walls bearing their comrades’ names and helped them onto the low brick wall encircling a wishing well – the same one they’d jumped into when they graduated the program so many years before, and where they sat now, smiling at their admiring crowd. 


    The faces smiling back were both young and old. Some women wore outfits of an era gone by, their hair in Gibson Rolls, their lips painted red. There was one dressed as Rosie the Riveter and a young girl named Jenna sporting a pilot’s costume, goggles perched upon her little head. There were family members and fans, and there were the women who came after. Women who may never have had the chance to wear an Air Force uniform if not for the two women by the fountain. 


    Those two women represented the 1,074 who served. They did not fight at Pearl Harbor. They didn’t storm the beaches of Normandy. They didn’t serve in the Pacific or stand on the front lines of any battle. They never stared down the barrel of a rifle, waiting to plunge a bullet into a Nazi soldier racing to land his shot first. 


    But they did serve their country. They served at home, on American soil. They served without military status or benefits. Without expectation or praise.


    These are the women history forgot.


    Let me rephrase.


    These are the women erased from history.


    Do they not deserve recognition purely because they weren’t allowed to set foot on a front line? Or drop a bomb from thousands of feet in the air? Or sit in the ball turret of a B-17 discharging a machine gun?


    They signed up with no chance at being promoted. No raise in their future. No contract stating they’d be taken care of. And they did it with pride… and barely a thank you in return. 


    The WASP program ended on December 20, 1944. The women, with some exceptions, were responsible (of course) for getting themselves home. If they wanted to have a career flying, they’d have to find it elsewhere. They were no longer invited to fly the military’s aircraft. 


    And that was it. The file on them was sealed and for thirty-five years there was not a peep about what these birds of war had done. How they had stood up to serve their country – and how their country disserved them. 


    In 1977, after much debate between the Veterans Administration and the Department of Defense – the former against, the latter in favor – the WASP were finally given veteran status, and President Jimmy Carter signed it into law on November 23 of that year. On March 10, 2010, President Barack Obama awarded the WASP the Congressional Gold Medal.


    And yet, why do so few know about these fearless flyers STILL? Why is their story not being added to curriculums across the country? Why is it not taught in elementary schools, high schools, colleges? They are barely a blip on the History Channel’s website. I almost fell off the sofa when Josh Gates from Expedition Unknown went in search of Gertrude “Tommy” Tompkins, a WASP who went missing after taking off from an airfield decades ago. I shouldn’t be surprised to see these stories. It should be a given that ALL those who served have their stories told. There should be field trips to the National WASP Museum. There should be mentions and parades and films! 


    I stood watching the two remaining WASPs sitting at that wishing well. When I’d first learned of their service, I was astonished and outraged I hadn’t known before. I hadn’t expected that, since that day eight years ago, I would fall in love with their stories. That I would write a book inspired by them. That I would become one of their biggest fans and greatest champions.


    What is a hero? By definition a hero is a person admired for courage, outstanding achievements, or noble qualities.


    I think the WASP fit that bill.  They are certainly my heroes.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172466 https://historynewsnetwork.org/article/172466 0
    Report on the National Strategy Meeting of Historians Convened by Historians for Peace and Justice


    Close to fifty historians attended the day-long National Strategy Meeting of Historians at Columbia University on May 28, 2019. Historians for Peace and Justice (H-PAD) convened the meeting. The unprecedented gathering of historians independent of a formal conference testified to the urgent need many of us feel to continue and expand our opposition to the Trump regime as well as the multiple crises that confront us in this country and around the world. Thanks to the efforts, enthusiasm, and contributions of a number of historians, the meeting was stimulating, congenial, and successful, despite being organized on a shoestring budget. For the list of attendees and agenda, go to https://www.historiansforpeace.org/national-strategy-meeting-of-historians/.


    Van Gosse and Margaret Power opened the meeting, which then broke into small groups to discuss several questions, among them: (1) What is the role of historians in this time of acute global and national crises? (2) How can we go forward together, forging stronger alliances and connections, nationally and locally, with each other, as engaged scholars, and with the larger movements? (3) How important is it to act within our profession, including its associations? The body reconvened, and a representative from each group reported on the main points it had explored. No clear consensus emerged from the reports; instead, a wide range of opinions and priorities was expressed.


    In the afternoon, people attended one of six work groups, based on what they wanted to work on. The six working groups and conveners that emerged and are currently functioning are the following: 

    Direct Action/Combatting the Right’s Fake News, Contact:  Jeremy Varon, jvaron@aol.com

    Empire and War, Contact: Prasannan Parthasarathi, prasannan.parthasarathi@bc.edu

    K-12, Contact: Barbara Winslow, bwpurplewins@gmail.com

    Democratize the Academy/Smash the Carceral State, Contact:  Andy Battle, andrew.battle@gmail.com 

    Palestine, Contact:  Leena Dallasheh, leena.dallasheh@gmail.com

    Immigrants’ Rights, Contact: Alex Avina, Alexander.Avina@asu.edu, and Margaret Power, marmacpower1@gmail.com 


    If you are interested in finding out more about the groups or in joining one of them, please contact the convener listed above. H-PAD hopes that other working groups will also form, so if you are interested in forming or participating in one, please contact us and we will announce them and put people with similar interests in contact with each other.


    The group also discussed whether to form a new organization to incorporate all the non-H-PAD people in attendance or whether to continue and expand H-PAD. Participants overwhelmingly decided it would serve no purpose to start a new organization and that those who wanted to get involved in the work should join H-PAD. For more information, go to https://www.historiansforpeace.org/

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172465 https://historynewsnetwork.org/article/172465 0
    The End of Humanitarian Intervention? A Debate at the Oxford Union With Historian David Gibbs and Michael Chertoff


    The issue of humanitarian intervention has proven a vexing one for the political left during the post-Cold War era. In light of mass violence in Rwanda, Bosnia-Herzegovina, Kosovo, Darfur, Libya, and Syria, many leftists abandoned their traditional opposition to militarism and argued for robust military intervention by the United States and its allies to alleviate these crises. Critics argued in response that interventionism would end up worsening the very crises it was supposed to resolve. These issues were recently debated at the Oxford Union Society at Oxford University on March 4, 2019. The participants were Michael Chertoff – former Secretary of Homeland Security during the presidency of George W. Bush and coauthor of the USA Patriot Act – who presented a qualified defense of humanitarian intervention, and myself, who argued against the practice. 


    In past years, when I debated this issue, I was struck by the sense of almost religious zeal that characterized advocacy for interventionism. “We have to do something!” was the standard refrain. Those who offered criticisms – including myself – were cast as amoral heretics. However, the repeated failures of interventionism that I note below have taken their toll and have served to moderate the tone. During the Oxford debate, I noted a remarkable absence of emotionalism. I came away from the event sensing that, while some still defend humanitarian intervention, their arguments lack the crusading tone that was so noteworthy in the past. I sense that public support for interventionism is beginning to ebb.


    What follows is a verbatim transcript of the full statements by myself and Mr. Chertoff, as well as our responses to questions posed by the moderator and a member of the audience. For reasons of brevity, I have omitted most of the audience questions, as well as the responses. Interested readers can find the full debate at the Oxford Union’s YouTube site.



    Daniel Wilkinson, Oxford Union President

    So, gentlemen, the motion is: “This house believes humanitarian intervention is a contradiction in terms.” And Professor Gibbs, your ten-minute opening argument can begin when you’re ready.


    Professor David Gibbs

    Thank you. Well, I think that when one looks at humanitarian intervention, one has to look at the record of what has actually happened, and in particular the last three major interventions since 2000: the Iraq intervention of 2003, the Afghanistan intervention of 2001, and the Libya intervention of 2011. And what all three of these have in common is that all three were justified at least in part on humanitarian grounds. I mean, the first two partly, the third almost exclusively, were justified on humanitarian grounds. And all three produced humanitarian disasters. It is really quite clear, I think, to anybody who has been reading the newspapers that these interventions have not gone well at all. And when evaluating the larger issue of humanitarian intervention, one really has to first look at those basic facts, which are not pleasant. Let me add that it’s very surprising to me in a lot of ways that the whole concept of humanitarian intervention wasn’t fully discredited by those experiences – but it was not. 


    We still have calls for other interventions, including in Syria, most notably. Also, there are frequent calls for regime change, essentially intervention, in North Korea. I really don't know what is going to happen in the future with North Korea. But if the United States does undertake regime change in North Korea, I will hazard two predictions: One, it almost certainly will be justified at least in part as a humanitarian intervention designed to liberate the people of North Korea from a very unwholesome dictator; and two, it'll produce probably the biggest humanitarian disaster since 1945. One of the questions is: Why are we not learning from our mistakes? 


    The scale of the failures in these three previous interventions is in a lot of ways quite impressive. With regard to Iraq, it’s perhaps the best documented failure, I would say. We have the 2006 Lancet study, which looked epidemiologically at excess deaths in Iraq, at that time estimated at 560,000.(1) This was published in 2006, so presumably the figure is much higher by now. There have been other estimates, mostly on par with that one. And this is something that is problematic. Certainly, things were terrible under Saddam Hussein, that’s indisputable, as they were under the Taliban, as they were under Muammar Gaddafi, as they currently are under Kim Jong Un in North Korea. And so, we went in and removed those figures from power one by one (or I should say, with the Taliban, it was Mullah Omar leading a larger regime), and things promptly got worse. It didn’t seem to have occurred to policymakers that things could actually get worse, but they did. 


    Another effect that's worth noting is what I would say is a kind of destabilization of regions. This is particularly striking in the case of Libya, which destabilized much of North Africa, triggering a secondary civil war in Mali in 2013, which was directly attributable to the destabilization of Libya. This required a secondary intervention, by France this time, to combat basically the instability arising in that country, again justified at least in part on humanitarian grounds. 


    Certainly, one of the things one can say in terms of the effects of humanitarian intervention is that if you have a vested interest in intervention, and that is something you are seeking, it’s an excellent idea, because it’s the gift that just keeps on giving. It keeps on destabilizing regions, producing new humanitarian crises, thus justifying new interventions. That’s certainly what happened in the case of Libya and then Mali. If you’re interested in humanitarian effect, however, the situation does not look so good. It does not look very positive at all. 


    The very striking thing here is the lack of any loss of credibility. I’m very struck by the fact that the people who helped to argue for these three interventions -- and by that I don’t just mean policymakers, but also academics and intellectuals like myself -- have lost no credibility. I myself didn’t argue for them, but many of my colleagues did. And it’s rather remarkable to me that there’s no expression of regret or acknowledgement that they did anything wrong in arguing for these interventions. Nor is there any effort to learn from our mistakes and to try to avoid interventions in the future. There’s something very dysfunctional about the character of discussion on this topic, when we fail to learn from past mistakes. 


    A second problem with the issue of humanitarian intervention is what some have called the “dirty hands” problem. We are relying on countries, and agencies of those countries, which do not have very good records of humanitarian activity. Let us look at the United States and its history of interventionism. If one looks at that history, one finds that the United States as an intervening power was a major cause of humanitarian crises in the past: the overthrow of Mossadegh in Iran in 1953, for example, or the overthrow of Allende in Chile in 1973. And I think the most striking example, a less known one, is Indonesia in 1965, where the CIA helped engineer a coup and then helped orchestrate a massacre of people that led to about 500,000 deaths. It’s one of the really great massacres post-1945, yes indeed, on the scale of what happened in Rwanda, at least approximately. And that was something caused by intervention. One could also go into the issue of the Vietnam War and look, for example, at the Pentagon Papers, the secret Pentagon study of the Vietnam War, and one does not get a sense of the United States as either a gentle power or a particularly humanitarian one. And the effects certainly were not humanitarian in any of these cases. 


    There’s a larger issue, perhaps, of human rights violations by the agencies of state that are involved in intervention in the United States. We now know from declassified documents that both the uniformed military and the CIA were responsible in the 50s and early 60s for conducting radiation experiments on unsuspecting individuals; doing things like having doctors working for the military inject people with radioactive isotopes and then tracking their bodies over time to see what effects it had and what kinds of illnesses it caused them -- without telling them, of course. The CIA had very disturbing mind-control experiments, testing new interrogation techniques on unsuspecting individuals, with very damaging effects. One of the scientists involved in the radiation studies commented in private, again in a declassified document, that some of what he was doing had what he called the “Buchenwald” effect, and we can see what he meant. And the obvious question again is: Why on earth would we want to trust agencies that do things like this to do something humanitarian now? This was of course long ago. But the fact that we now use the term “humanitarian intervention” does not make it a magical phrase, and does not magically erase this past history, which is relevant and has to be taken into account. I do not want to focus excessively on my own country; after all, other states have done other disturbing things. One could look at the history of Britain and France, let us say, with their colonial and postcolonial interventions. One does not get a picture of humanitarian activity; quite the contrary, I would say, either in intent or in effect. 


    Now I think one of the issues that finally has to be noted is the cost of humanitarian intervention. This is something that is rarely taken into account, but perhaps should be, especially since the record of results is so bad in terms of humanitarian effect. Military action, generally speaking, is extremely expensive. Amassing division-sized forces and deploying them overseas for extended periods of time cannot be done except at extreme expense. In the case of the Iraq War, what we have is what has been termed “the three trillion-dollar war”: Joseph Stiglitz of Columbia and Linda Bilmes estimated in 2008 the long-term cost of the Iraq War at $3 trillion.(2) Those figures are of course obsolete, because that’s over ten years ago, but $3 trillion is quite a lot when you think about it. In fact, it’s greater than the gross domestic product of Great Britain at the present time. And one wonders what kind of wonderful humanitarian projects we could have done with $3 trillion, rather than wasting it in a war that did nothing but kill several hundred thousand people and destabilize a region. 


    And these wars are not over, of course, in Libya, Iraq, or Afghanistan. Afghanistan is nearing the end of its second decade of war and the second decade of US intervention. This may very well turn out to be the longest war in US history, if it is not already. It depends how you define longest war, but it’s certainly getting up there. And one can think of all sorts of things that could have been done with some of this money – for example, vaccination of children who are under-vaccinated. (Two minutes, is that right? One minute.) One could think of people who don’t have enough medicines, including in my own country, the United States, where many people go without proper medicines. As economists know, you have opportunity costs. If you spend money on one thing, you may not have it available for another. And I think what we’ve been doing is overspending on intervention, again with no significant humanitarian results, or very few that I can discern. I guess I’m very impressed by the medical analogy here and the medical emphasis, so that’s of course why I titled my book “First Do No Harm.” And the reason is that in medicine you don’t just go and operate on the patient because the patient is suffering. You have to do a proper analysis of whether or not the operation will be positive or negative. An operation can of course hurt people, and in medicine sometimes the best thing to do is nothing. And perhaps here, the first thing we should do with humanitarian crises is not make them worse, which is what we’ve done. Thank you.



    Thank you, Professor. Michael, your ten-minute argument can begin when you’re ready.


    Michael Chertoff

    The proposition here is whether humanitarian intervention is a contradiction in terms, and I think the answer to that is no. Sometimes it’s ill-advised, sometimes it’s well-advised. Sometimes it doesn’t work, sometimes it does work. It rarely works perfectly, but nothing in life does. So, let me first begin by talking about the three examples the professor gave: Afghanistan, Iraq, and Libya. I’m going to tell you Afghanistan was not a humanitarian intervention. Afghanistan was the result of an attack launched on the United States that killed 3,000 people, and it was quite openly and deliberately an effort to remove the person who launched the attack from the ability to do it again. If you think it wasn’t worth it, I will tell you from personal experience: When we went into Afghanistan, we found laboratories that al Qaeda was using to experiment with chemical and biological agents on animals, so they could deploy those against people in the West. Had we not gone into Afghanistan, we might be inhaling those now as we speak. This is not humanitarian in the sense of altruistic. This is kind of basic, core security that every country owes its citizens. 


    Iraq is also, in my view, not principally a humanitarian intervention. We can debate in a different debate what happened with the intelligence, and whether it was totally wrong or only partially wrong, regarding the possibility of weapons of mass destruction in Iraq. But at least that was the major assumption going in. It may have been erroneous, and there are all kinds of arguments that the way in which it was executed was poorly done. But again, it was not humanitarian. Libya was a humanitarian intervention. And the problem with Libya is, I think, the second part of what I want to say, which is that not all humanitarian interventions are good. And in order to make a decision to intervene, you have to take into account some very important elements of what you’re facing. What is your strategy and your objective; do you have clarity about that? What is your awareness of what the conditions in the place you’re intervening in actually are? What are your capabilities and your willingness to commit to see things through to the end? And then, to what degree do you have support from the international community? Libya is an example of a case where, while the impulse may have been humanitarian, these things were not carefully thought out. And if I can say so, Michael Hayden and I made this point in an oped shortly after this process began.(3) The easy part was going to be removing Gaddafi. The hard part was going to be what happens after Gaddafi is removed. And so here I agree with the professor. Had someone looked at the four factors I mentioned, they would have said: “Well, you know, we don’t really know, we haven’t really thought through what happens without Gaddafi.” What happens to all the extremists in prison? What happens to all the mercenaries he’s paid, who now aren’t getting paid anymore? And that led to some of the negative results. I also think there was a failure to understand that when you remove a dictator, you have an unstable situation. 
And as Colin Powell used to say, if you broke it you bought it. If you're going to remove a dictator, you've got to then be prepared to invest in stabilizing. If you're not prepared to make that investment, you have no business removing him. 


    By way of example on the other side, look at the interventions in Sierra Leone and Ivory Coast. Sierra Leone was 2000. The Revolutionary United Front was advancing on the capital. The British came in and repelled them; they drove them back. And because of that, Sierra Leone was able to stabilize, and they ultimately wound up having elections. Or Ivory Coast: you had an incumbent who refused to accept that he had lost an election. He began to use violence against his people. There was an intervention. He was ultimately arrested, and now Ivory Coast has a democracy. So again, there are ways to do humanitarian intervention that can be successful, but not if you don’t pay attention to the four characteristics I talked about. 


    Now, let me give you an example from something that we are literally facing today, and that is what is going on in Syria. And let’s ask the question whether, a couple of years ago, before the Russians got deeply involved, before the Iranians got deeply involved, an intervention would have made a difference in saving literally tens of thousands of innocent civilians from being killed with bombs and chemical weapons, as well as averting a huge mass migration crisis. And I think the answer is: Had we done in Syria what we did in northern Iraq in 1991 – established a no-fly zone and a no-go zone for Assad and his people – and had we done it early, we might have averted what we now see unfolding and continuing to unfold in the region. So, now I’m going to look at it through the other lens: What happens when you don’t intervene, as I suggest we might have done in Syria? Well, not only do you have a humanitarian crisis, you have a security crisis. President Obama said there was a red line about chemical weapons, and then the line disappeared when the chemical weapons were used. Because we didn’t enforce these humanitarian measures, we had not only many deaths, but literally an upheaval that has now reached into the heart of Europe. The reason the EU is now having a crisis about migration is because, and perhaps with some intent, the Russians as well as the Syrians deliberately acted to drive civilians out of the country and force them to go elsewhere. Many of them are now in Jordan, putting a strain on Jordan, but many of them are trying to get into Europe. And I have little doubt that Putin understood or quickly recognized, even if it was not his original intent, that once you create a migration crisis, you are creating disorder and dissension within your principal adversary, which is Europe. 
And that has a destabilizing effect, the consequences of which we continue to see today. 


    And so, one of the things I want to say to be honest, is when we talk about humanitarian intervention, there is often an altruistic dimension to it, but frankly there is also a self-interested dimension. Places of disorder are places where terrorists operate, and you've seen Isis until quite recently had territory in parts of Syria and parts of Iraq that were not properly governed. It creates migration crises and similar crises, which then have an impact on the stability and the good order of the rest of the world. And it also creates grievances and desires for payback that often result in cycles of violence that continue over and over again, and you see that in Rwanda. 


    So, my bottom line is this: Not all humanitarian interventions are warranted, not all humanitarian interventions are properly thought out and properly executed. But by the same token, not all of them are wrong or improperly executed. And again, I go back to 1991 and the no-fly zone and no-go zone in Kurdistan as an example of one that worked. The key is this: Be clear why you're going in; don't underestimate the cost of what you're undertaking; have the capabilities and the commitment to see that you can handle those costs and achieve the result that you set out for yourself. Make sure you are aware of the conditions on the ground, so you make a rational assessment. And finally get international support, don't go it alone. I think in those circumstances, humanitarian intervention can not only be successful, but it can save a lot of lives and make our world more secure. Thank you.


    Question (Wilkinson)

    Thank you, Michael. Thank you both for those introductory remarks. I’ll ask one question, and then we’ll move over to questions from the audience. My question is this: You both cited a number of historical examples. But would you say it is a fair assessment that, practically, the problem is that there can never be a sufficient long-term plan, sufficiently good intentions, sufficiently benevolent motivations, or a sufficient harm analysis to counter the fact that individual organizations and international organizations are fallible, and they will always make mistakes? And the fallibility of those groups means that humanitarian intervention has to be a contradiction in terms. So, Michael, if you’d like to respond. 


    Answer (Chertoff)

    My answer is this: Inaction is action. Some people think if you don't do something that's somehow abstaining. But if you don't do something, something is going to happen. So, if for example Franklin Roosevelt had decided not to help the British in 1940 with Lend Lease, because “I don't know if I'm making a mistake or not,” that would have resulted in a different outcome with respect to World War II. I don't think we'd be saying “well but that was inaction, so it didn't matter.” I think inaction is a form of action. And every time you're presented with a choice, you have to balance the consequences as far as you can project them, from both doing something and abstaining from doing something. 


    Answer (Gibbs)

    Well, I think that of course inaction is a form of action, but the onus should always be on the person advocating intervention. Because let’s be very clear on this: Intervention is an act of war. Humanitarian intervention is a mere euphemism. When we advocate humanitarian intervention, we are advocating war. The movement for intervention is a movement for war. And it seems to me that those who advocate against war really have no burden of proof on them. The burden of proof should be on those who advocate for the use of violence, and really the standards should be very high for the use of violence. And I think we can see it’s been used quite frivolously in the past, to an extraordinary degree. 


    And a basic problem you have in small interventions -- for example the 1991 no-fly zone over Iraq -- is these things take place in the real world, not in a pretend world. And in that real world, the United States considers itself a great power, and there'll always be the question of American credibility. And if the U.S. undertakes half measures, such as a no-fly zone, there will always be pressures on the United States from various factions in the foreign policy establishment to take a more maximalist effort and solve the problem once and for all. Hence the need for another war with Iraq in 2003, producing an utter catastrophe. I get very queasy when I hear people discussing “let us just do a limited intervention, it'll just stop at that,” because it usually doesn't stop at that. There's the quagmire effect. You step into the quagmire, and you get deeper and deeper into the quagmire. And there will always be those who advocate deeper and deeper intervention.


    I guess one more point: I did want to respond to the frequent claim that the Iraq and Afghanistan wars were not really humanitarian interventions. It is true that, to some extent, both interventions were at least partly about traditional national interest, realpolitik, and the like. But if you look back at the record, clearly both were justified in part as humanitarian interventions, both by the Bush administration and by many academics. I have here before me an edited volume published by the University of California Press, I believe in 2005, called A Matter of Principle: Humanitarian Arguments for War in Iraq.(4) Just do a Google search on "humanitarian arguments for war in Iraq," and this was very much part of the picture. I think it's a bit of a rewriting of history to say that humanitarian intervention was not a significant factor in the arguments for war in Iraq or Afghanistan. It was very much part of both those wars. And I would say the results very much discredit the idea of humanitarian intervention.


    Question (Audience)

    Thanks, so you've both talked about some historical examples and I'd like to hear both of your perspectives about the ongoing situation in Venezuela. And the Trump administration and the plans and the reports have come out that they might have plans to use military force there and how you would evaluate that in light of both of the perspectives that you've shared.


    Answer (Chertoff)

    So, I think what's happening in Venezuela is, first of all, obviously a political dictatorship. And as I've said, I don't think political regime issues are a reason to intervene militarily. There is also a humanitarian element here. People are starving. But I don't know that we're at the level of humanitarian crisis that we've seen in other cases. So, my short answer would be: I don't think we've met the threshold for having a real discussion about humanitarian intervention in a military sense. 


    That's not to say there aren't non-military ways to intervene, just to be clear, so we round the picture out. There are a lot of tools in the toolbox when you deal with intervention. There are economic sanctions. There is even the potential use of cyber tools as a way of having some impact on what's going on. There is the possibility in some instances of legal action, for example through the International Criminal Court. So, all of these ought to be considered part of the toolbox. If I were looking at Venezuela, assuming it did reach the level warranting humanitarian intervention, which I emphasize it has not, you would then have to balance issues like: Is there an endgame or a strategy we see to be successful? Do we have the capabilities to achieve it? Do we have international support? I think all of those would probably militate against it. That's not to say it couldn't change, but I don't think the dimensions of this have reached the point where military action is reasonable or likely.


    Answer (Gibbs)

    Well, the most important thing you need to know about Venezuela is that it's an undiversified oil-exporting economy, and there's been a drop in oil prices since 2014. I'll certainly grant that a lot of what is going on now is the fault of Maduro and the authoritarian actions he's been taking, as well as mismanagement, corruption, and so on. But most of what has been going on, by any reasonable, informed reading, is due to low oil prices. 


    It points, I think, to a larger issue, which is the way humanitarian crises are often triggered by economic crises. Discussions of Rwanda almost never mention the fact that the genocide -- and I think it really was a genocide in the case of Rwanda -- the genocide by the Hutu against the Tutsi took place in the context of a major economic crisis resulting from the collapse of coffee prices. Again, a very undiversified economy that was reliant almost exclusively on coffee. Coffee prices collapse, and you get a political crisis. Yugoslavia had a major economic crisis just before the country broke up and descended into hell. We know about the descent into hell; most people don't know about the economic crisis. 


    For some reason people find economics boring, and because it's boring and military intervention seems more exciting, we think the solution is to send in the 82nd Airborne Division. Perhaps it would have been simpler, cheaper, and better from a humanitarian standpoint to address the economic crisis: the very heavy emphasis placed on austerity in the international economic system, and the very damaging political effects austerity has in many countries. Historical context is necessary here: For all the constant, repetitious references to the Third Reich and to World War II, which we hear again and again, people often forget that one of the things that brought us Adolf Hitler was the Great Depression. Any reasonable reading of Weimar Germany's history would be that without the Depression, you almost certainly would not have gotten the rise of Nazism. So, I think the economic issues need greater attention in the case of Venezuela. Even if the United States were to overthrow Maduro by whatever means and replace him with someone else, that someone else would still have to deal with low oil prices and their damaging effects on the economy, which would remain unaddressed by humanitarian intervention, whether we call it that or something else. 


    I guess another point about the United States and Venezuela: the United Nations sent a representative down there, who condemned the US sanctions as greatly intensifying the humanitarian crisis. So, the intervention the United States has been undertaking -- economic at this point, mostly, rather than military -- is making things worse, and that clearly has to stop. If we're interested in helping the people of Venezuela, surely the United States would not want to make things worse.


    (1) Gilbert Burnham et al., “Mortality after the 2003 Invasion of Iraq: A Cross-Sectional Cluster Sample Survey,” Lancet 368, no. 9545, 2006. Note that the Lancet’s best estimate of excess deaths due to the invasion is actually higher than the one I cited above. The correct figure is 654,965, rather than the 560,000 that I presented.

    (2) Linda J. Bilmes and Joseph E. Stiglitz, The Three Trillion Dollar War: The True Cost of the Iraq Conflict. New York: Norton, 2008.

    (3) Michael Chertoff and Michael V. Hayden, “What Happens after Gaddafi is Removed?” Washington Post, April 21, 2011.

    (4) Thomas Cushman, ed., A Matter of Principle: Humanitarian Arguments for War in Iraq. Berkeley: University of California Press, 2005.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172461 https://historynewsnetwork.org/article/172461 0
    Historian Ian Reifowitz on How the Race-Baiting Invective of Rush Limbaugh on the Obama Presidency Led to Trump


    Ultimately, the right wing needs white racial anxiety. In fact, it cannot survive without it.

    Ian Reifowitz, The Tribalization of Politics


    On January 20, 2009, Barack Obama was inaugurated as the forty-fourth president of the United States of America—the first African American to attain this exalted office. Hundreds of thousands crowded the National Mall during the ceremony to wish the new president well.


    However, rather than offering the president words of encouragement and congratulations, voices from the far right almost immediately expressed the hope that President Obama would fail and serve no more than one term. He had inherited a faltering economy, a war, and a country still divided by race and other vexing issues. Meanwhile, the right-wing media labeled him anti-American and unpatriotic, a black president who would favor his constituents of color to the detriment of white citizens. 


    Popular far-right talk radio host Rush Limbaugh was one of the most vociferous voices, and the one with the largest audience. He launched racially charged attacks on President Obama virtually every day of his two terms in office. 


    Historian Ian Reifowitz examines Limbaugh’s hateful invective and American political polarization in his new book, The Tribalization of Politics: How Rush Limbaugh’s Race-Baiting Rhetoric on the Obama Presidency Paved the Way for Trump (Ig Publishing).


    To better understand the attacks on President Obama, Professor Reifowitz took on the daunting task of analyzing the transcripts of Limbaugh’s radio shows and associated materials from the Obama years. As a result, he has documented the manifold instances of Limbaugh’s hateful race-baiting and “othering” of the president. Limbaugh, for his part, has profited greatly as a leading voice stoking white fears of racial peril. 


    The book traces the election of Donald Trump and the recent rise in white supremacist activity to the incendiary language of racism that the right wing relies on to win politically. Historian Keri Leigh Merritt commented that The Tribalization of Politics is “a must-read for anyone seeking to understand how the US has reached its lowest point in race relations since the Civil Rights Movement.” 


    Professor Reifowitz teaches history at Empire State College of the State University of New York. His other books include Imagining an Austrian Nation: Joseph Samuel Bloch and the Search for a Multiethnic Austrian Identity, 1846-1919, and Obama’s America: A Transformative Vision of Our National Identity. He has published a number of academic articles in the Journal of Jewish Identities, Nationalities Papers, and East European Quarterly, among others. Professor Reifowitz is also a contributing editor at Daily Kos, and his articles have appeared in the Daily News, Newsday, The New Republic, In These Times, Truthout, Huffington Post, and others. His awards include the 2009 Susan H. Turben Award for Scholarly Excellence, and the 2014 S.U.N.Y. Chancellor's Award for Excellence in Scholarly and Creative Activities.


    Professor Reifowitz graciously responded to a series of questions in an email exchange on his work and his new book. 


    Robin Lindley: You’re a historian specializing in the modern history of the United States. How did you decide to study history, and then to focus on the American past?


    Professor Reifowitz: I’ve always, since I was in college (too many years ago), been interested in multiethnic societies, and specifically in how they work to create ‘national’ bonds across lines of ethnicity to bind together their diverse populations. 


    My graduate study, which led to my first book and other early academic publications, focused on Austria-Hungary. That state tried and failed to create national bonds, i.e., bonds based on citizenship in and loyalty to a common state, strong enough to allow it to survive World War I and the overthrow of the Habsburg dynasty. Even while pursuing that research, I’d also been reading and thinking about another multiethnic society, the one we live in, which faces some of the same issues (thankfully, we don’t have to rely on a monarchy as the foundation of our unity). 


    Eventually, my passion for understanding how unity and diversity were playing out today drove me to begin writing and researching the contemporary U.S. I published a couple of articles in The New Republic, and later in other outlets, and began to read more deeply and develop my ideas further. Then along came Barack Obama. I wrote my previous book, Obama’s America, in which I examined his conception of American national identity, one that incorporates pluralism and inclusiveness into a strong, unifying vision of national community that, one would hope, Americans of every background could adopt. 


    Robin Lindley: How did you come to write about Rush Limbaugh’s race-baiting rhetoric in the Obama Era? Did the project grow out of your past research for Obama’s America?


    Professor Ian Reifowitz: To continue the story from above, one section of Obama’s America examined critics (mostly on the right, but a few to Obama’s left) of Obama’s vision of American national identity. I had spent some pages examining Rush Limbaugh’s rhetoric from the first couple of years of Obama’s presidency, and that had energized me (albeit with a sort of dark energy, compared to the more uplifting work of looking at Obama’s writings and speeches). 


    Then, in the summer of 2015, the idea came to me for another book, and I thought: why not do a comprehensive, close examination of everything Limbaugh said about the Obama presidency? I put together a proposal, started the work in late 2015, kept up the research until Obama left office, and then started writing. In the meantime, of course, Trump had emerged and been elected. Trump’s campaign and then victory helped me decide to focus the book on Limbaugh’s race-baiting, both in order to document it in a comprehensive way for people, and to draw parallels between what he was doing—playing and preying on white anxiety—and what Trump did in his campaign (and, to be sure, for years beforehand, starting with his own incendiary, racist rhetoric about the Central Park Five and right up to his claiming the mantle of birther-in-chief).


    Robin Lindley: You address our current political “tribalization” by focusing on Limbaugh’s rhetoric. In your view, what is tribalization, and how does it affect our politics now?


    Professor Ian Reifowitz: I’ll give you the definition I used in the book’s introduction rather than come up with something off the top of my head.


    “Tribalization refers to a transformation much more profound than merely convincing Americans to be partisans who vote based on a shared set of policy preferences. It means cleaving America in two, and, in the case of Limbaugh, creating a conservative tribe animated somewhat by political ideology, but more so by racial and cultural resentment that feeds a hatred of the opposing tribe.”


    Robin Lindley: This new Limbaugh project had to be daunting and possibly distasteful to you in view of your past research on President Obama and your favorable view of his efforts to unite our diverse nation. How did you feel as you put your book together? 


    Professor Ian Reifowitz: Well, I did mention above that I felt a different kind of passion motivating me on this project compared to the Obama book. But I have to admit that, once I got deep into the research, there were times when I wished I hadn’t committed to the project. There were plenty of times that I didn’t want to read through another word of Limbaugh. 


    I guess my stubborn streak helped. I wasn’t going to abandon a project that I’d already invested so much time and energy in, and certainly wasn’t going to do so because Limbaugh’s rhetoric was hard to stomach. I hoped I was doing something important, that could make some connections that would help people better understand where our politics has gone in the past few years.


    Robin Lindley: What was your research process for your new book?


    Professor Ian Reifowitz: I went to RushLimbaugh.com and read through the transcripts, which he thoughtfully published free of charge, for every show he did during the eight years Barack Obama was president. To be honest, if the transcripts didn’t exist, I don’t know that I could have done the research by listening to the audio recordings. That might have been too much. Thankfully I didn’t have to find out. 


    I also read secondary sources on contemporary politics, in particular on matters of race and identity. After I started focusing on the connections between Limbaugh and Trump, I read political science scholarship on public opinion in 2016, which documented how white anxiety and resentment correlated with votes for Trump both in the primary and general election, and I incorporated that information into my analysis.


    Robin Lindley: What did you learn about Limbaugh’s origins?


    Professor Ian Reifowitz: I read some about his rhetoric in his early years, how he had used racist language even before making his turn toward talking full-time about politics in the 1980s. But the focus of the book is on what he said about Obama, which spoke for itself. To clarify, I don’t care whether he actually believes what he’s saying, because the effect his words have is the same whether he’s just a cynical opportunist or a true believer. I’m not especially interested in his motivations.


    Robin Lindley: How did you come to focus on Limbaugh in your book? You see Limbaugh as a major force in dividing the US during the Obama era, but other potent Obama detractors included the current president, Senator Mitch McConnell and much of the Republican Party, Fox News, the Tea Party, and others. How would you weigh Limbaugh’s influence, if possible?


    Professor Ian Reifowitz: My background, in terms of the kind of work I do, focuses on analyzing political rhetoric. Limbaugh was the person whose rhetoric I chose to examine because he broadcasts about two hundred shows a year, so there would be essentially no important issue relating to the Obama presidency that he would not address. Plus, he had the largest radio audience in the country throughout all eight years Obama was president (and decades before as well, and even in the years since up through the most recent month). 


    I used him as a case study—where the biggest part stands in for the whole of the right-wing media. The transcripts helped as well, as it would be impossible to read every word broadcast on, say, Fox. This way, I had a closed, yet comprehensive, set of data to use as my source base.


    Robin Lindley: Thanks for explaining your process. Do you see Limbaugh as an ally of white supremacist organizations such as the Ku Klux Klan and the American Nazis? 


    Professor Ian Reifowitz: His show helps push sanitized versions of some of their ideas into the mainstream. The views he expresses are not the same as the views of the KKK or American Nazis, but he taps into some of the same hate and fear that they do. I don’t think that makes him an ally, but more like an enabler.


    Robin Lindley: How did Limbaugh view Senator Obama before he was elected in 2008? 


    Professor Ian Reifowitz: I didn’t look at the pre-inauguration rhetoric in a comprehensive way, but from what I saw nothing changed on Election Day.


    Robin Lindley: How did Limbaugh usually describe President Obama? 


    Professor Ian Reifowitz: You want the whole book in a nutshell? Here’s a brief summary from the book:


    “While Obama was president, Limbaugh constantly, almost daily, talked about him using a technique that scholars call “racial priming”—in other words, he race-baited. The host aimed to convince his audience that Obama was some kind of anti-white, anti-American, radical, Marxist, black nationalist, and possibly a secret Muslim to boot. This was neither a bug nor a supporting element of Limbaugh’s presentation, but instead stood as a central feature deployed strategically in order to accomplish a very specific task, a task reflected in the title of this book. The tribalization of politics is exactly what Limbaugh set out to achieve.”


    I’ll add: “[Limbaugh] portrayed him in a way designed to exacerbate white racial anxiety about a black president, or depicted him as a foreign “other,” outside the bounds of traditional Americanness.”


    Robin Lindley: How did Limbaugh exploit Islamophobia and the fear of immigrants to attack President Obama?


    Professor Ian Reifowitz:  He repeatedly sought to portray Obama as some kind of “secret Muslim” or somehow more sympathetic to Muslims—even terrorists—than to Christians and/or the interests of the United States. On immigrants, I’ll give you the following example:


    “On July 1, 2015, two weeks after Trump’s infamous comments [made during his announcement that he was running for president] about Mexican immigrants being rapists and bringing drugs into the United States, a woman named Kathryn Steinle was shot and killed in San Francisco by Jose Inez Garcia Zarate [an undocumented immigrant with a criminal record].


    “. . . On the campaign trail, Trump pounced, and Limbaugh followed suit a few days later. On July 7, in comments designed to inflame white racial resentment, the host claimed that Steinle’s name would “never be as well-known as Trayvon Martin,” and that the president would not deliver the eulogy at her funeral, even though Obama had not delivered a eulogy at the funeral of Martin or any other citizen killed by police. Obama did, however, speak at the memorial service for the five Dallas police officers murdered a year later….Limbaugh speculated that the president did not care about Steinle’s murder, and blamed it on the administration’s immigration policies, which were “coming home to roost”—this was a phrase uttered by Reverend Jeremiah Wright that was discussed so often on Limbaugh’s show. The host again talked about Obama hating America and wanting to alter its “composition” in order to change “the face of the country.”  


    Limbaugh attacked the president over Steinle on three more shows over the next week. On July 15, 2015, the host contrasted Obama not having contacted the Steinle family with his having written letters to forty-six felons whose sentences he commuted, and with his outreach to the family of Michael Brown in Ferguson. Limbaugh’s point was to remind his listeners that Obama cared more about prisoners (read: black and Hispanic people) and black people killed by cops than about a white woman who was murdered by someone here illegally. If there’s one segment that both encapsulates Limbaugh’s tribalizing history of the Obama presidency and shows how his race-baiting rhetoric paved the way for the rise of Trump, this was it.


    Robin Lindley: How did President Obama respond to Limbaugh’s attacks, particularly in terms of dealing with claims that he was pro-Muslim, anti-police, and anti-white? 


    Professor Ian Reifowitz: He basically ignored them, but I did not examine Obama’s responses comprehensively.


    Robin Lindley: Limbaugh attacked President Obama almost daily during his eight years in office. For Limbaugh, it seems that the ideals of equality, tolerance, democracy, community, and serving the common good are anathema and, indeed, anti-American. What is your sense of Limbaugh’s view of these ideals? 


    Professor Ian Reifowitz: He would pay lip service to most of those ideals in the abstract, while attacking Obama and other liberals for seeking to change the traditional definition of them to something involving retribution and reparations that would take from whites and give to non-whites. He would turn any criticism of racial inequality in America back around and argue that the problem of racism in America stemmed from people exaggerating it. For example, on July 25, 2013, Limbaugh accused “the left” of wanting “race problems” to remain unsolved, and in fact wanting to make them worse. Why? Because “too many people make money off of racial strife, and therefore they’re always going to promote it.” Here’s a quote from May 26, 2010, about Obama and liberals in general: “everything’s about race. Everything is about skin color to these people, or however they classify people, however they seek to group them, whatever, they’re victims.” This is how he viewed racism in America. 


    Robin Lindley: Beyond Limbaugh, what are some things that you learned about the massive right-wing media misinformation machine?


    Professor Ian Reifowitz: I didn’t do too much with them, because they do generally move in lockstep. I did note in the book that in the summer of 2018, Tucker Carlson and Laura Ingraham on Fox News echoed Trump’s language of white anxiety regarding immigration and demographic changes. Limbaugh had spoken similarly as well during the Obama presidency, which I documented in greater depth.


    Robin Lindley: Were there any particular findings that surprised you as you researched and wrote your book?


    Professor Ian Reifowitz: Nothing really surprised me in terms of ideology; Rush pretty much delivered exactly what I expected when I started the research. I was already pretty familiar with his bile. However, when I came across Limbaugh’s comments connecting Tiger Woods and his sex-related scandals involving white women to Obama—which suggested that the president might be involved in something analogous based on little more than the fact that both were multiracial guys with a similar skin tone—that was something beyond even what I had expected. There were also a few times, at least until I got used to it, when I was surprised by how baldly Limbaugh just lied about facts and statistics, in particular regarding the economy. Either he really didn’t understand them, which is not likely to be true because he didn’t manipulate them to make President Trump look bad—only President Obama—or he just thought lying was the right thing for him to do.


    Robin Lindley: Limbaugh wasn’t new to exploiting race to divide Americans. In fact, that’s been a Republican strategy for decades. What did you find about how Republicans use race to their political advantage? 


    Professor Ian Reifowitz: I wrote this in the book: “As journalist Dylan Matthews noted in an article entitled “Donald Trump Has Every Reason to Keep White People Thinking About Race,” a vast corpus of social science research indicates that “even very mild messages or cues that touch on race can alter political opinions,” and added that “priming white people to so much as think about race, even subconsciously, pushes them toward racially regressive views.”


    Robin Lindley: What do you see as Limbaugh’s role in the election of President Donald Trump?


    Professor Ian Reifowitz: I’ll share an example from the book, with some data, that demonstrates the role Limbaugh’s race-baiting rhetoric played in paving the way for Trump:


    “Public opinion research data suggests that exactly this kind of rhetoric helped move some whites who had previously voted for Obama into Trump’s column by 2016—most Obama-Trump voters expressed high levels of anger toward non-whites and foreigners. It might be hard to imagine Obama voters being bigoted, but John Sides, Michael Tesler, and Lynn Vavreck found that significant numbers of whites who voted for Obama in 2012 expressed varying degrees of white racial resentment while also overwhelmingly embracing liberal positions on issues such as taxation and the existence of climate change. It might be surprising, but about 25% of those whites who found interracial couples unacceptable nonetheless voted for Obama in both 2008 and 2012. The country’s racial climate during Obama’s second term contributed to this phenomenon of racially resentful white Obama voters shifting to Trump, as [according to Zack Beauchamp at Vox] Black Lives Matter and Ferguson “kicked off a massive and racially polarizing national debate over police violence against African Americans.” Limbaugh took full advantage of that climate, and his race-baiting helped pave the way for Trump.”


    Robin Lindley: What has Limbaugh been doing since Trump’s election? Does he continue to blame President Obama and Secretary Hillary Clinton for problems the nation faces? 


    Professor Ian Reifowitz: I’ve stayed away from Limbaugh to some degree, just to give myself a break. But he’s still a huge media figure. He’s done exactly what I expected, which is the same thing he did once Trump became the presumptive nominee. He’s been a huge Trump backer and has continued to use rhetoric aimed at ginning up white anxiety, to make sure those anxious whites keep on remembering who their (false) champion is. I did check to see what Limbaugh said about Tiger Woods recently, now that Trump has embraced him, and in fact Limbaugh has done a 180, offering nothing but praise for how Tiger has been a friend to Trump. However, while discussing Tiger and Trump, Limbaugh made sure to remind his audience that Obama is still the one to blame for exacerbating racial tensions in America. He certainly doesn’t blame Trump—or himself, for that matter. None of those things qualify as a surprise.


    Robin Lindley: What are some good ways to counter the hateful and inaccurate rhetoric of Limbaugh and his fellow extremists in the right-wing media? 


    Professor Ian Reifowitz: I’ll leave folks with the concluding paragraphs of the book, which are as close as I get to offering a prescription going forward regarding how to counter the Limbaugh/Trump vision of America:


    “White racial identity has been the foundation of the single most destructive form of identity politics over the course of American history. In colonial times, slave-owners raised the status of white indentured servants—many of whom had developed close relationships with the enslaved African Americans alongside whom they worked—transforming these “plain white folks” into equal citizens and telling them that they were superior to blacks, who were thus undeserving of freedom. Why did they do this? Because the slave-owning elites had one fear above all: a white-black coalition of the masses that would unite to overthrow them. Similarly, after emancipation, the Southern economic elites made sure to bind poor whites to them through the race-based advantages conferred by Jim Crow, all in the name of thwarting that same white-black, class-based political partnership. 


    “In this century, some working- and even middle-class whites, especially those without a college degree, have been drowning economically in a way they have not since the Great Depression. For many, whatever privilege comes with being white is not enough to keep them afloat. They are angry, afraid, and looking for a scapegoat. Limbaugh has been only too happy to oblige. He has absolutely no interest in helping the country figure out how to deal in a productive way with the white anxiety that arises from demographic change. He is interested in one thing, and one thing only: exacerbating this phenomenon in order to keep apart whites and Americans of color who do share common economic interests. That is how Republicans win elections. 


    “Limbaugh’s divisive approach, in that specific regard, is a carbon copy of the approach taken by the nineteenth-century Southern white elites. The more he can get working- and middle-class whites to identify with their racial identity—their tribe—above their economic interests, the better he will be able to prevent the multiracial, progressive coalition assembled by President Obama from growing strong enough to defeat Limbaugh/Trump-style conservatism once and for all. Ultimately, the right wing needs white racial anxiety. In fact, it cannot survive without it.”


    Robin Lindley: Thanks for those powerful words. What are you working on now?


    Professor Ian Reifowitz: Right now? How about a nap? I’m still teaching full time, so I’ll be spending time thinking about a new project over the coming months.


    Robin Lindley: Thank you for your thoughtful comments, Professor Reifowitz, and congratulations on your fascinating new book, “The Tribalization of Politics.” 

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172463 https://historynewsnetwork.org/article/172463 0
    Charles Reich and The Greening of America – an Appreciation


    Charles Reich, author of The Greening of America, passed away last month at the age of 91.


    His book was first published in 1970 to mixed reviews: Newsweek’s Stewart Alsop called it “scary mush.” Another critic labeled it “a toasted marshmallow,” devoid of substance. Yet Reich’s book, a combination of history, sociology, and philosophy, struck a chord in a somber time of war and national protest. It went on to sell five million copies and become a key cultural anchoring point, a book that explained the new counterculture in a clear, uplifting manner. 


    In 1970, I was one of the legions of long-haired, dope-smoking, anti-war protesting college students. We knew what we were against, but were struggling to define a vision of the future. Like many of my friends, I devoured Reich’s book, underlining dozens of passages. The Greening of America became a touchstone for our generation, the center of many intense conversations in campus cafeterias and smoke-filled dorm rooms.  We were angered by Nixon’s deceitful actions to prolong the Vietnam war, distrustful of a soul-crushing corporate culture and curious about the promise of new technology (NASA landed on the moon in 1969, but an affordable personal computer was still a decade away). 


    The Greening of America spoke to our concerns with a carefully reasoned, historically anchored thesis. It explained many of the hopes and fears we felt intuitively but had not been able to articulate at length. 


    Rather than talking about a violent political revolution, Reich described a revolution based on a new, open culture that freed men’s minds rather than repressing them. He described an America that had evolved since the 1776 Revolution through three “consciousnesses.” In the first hundred years, Consciousness I, based on individual freedom and self-reliance, spurred the settling of the new nation. Consciousness II, born with the rise of industrial society in the 19th century, created a new hierarchy and demanded the submission of an individual’s identity to the corporation. The rise of a mass consumer culture defined happiness in terms of a man’s position in a hierarchy of status. Factories and workshops produced a split between the duties and identity of the “man at home” and the “man at work.” 


    According to Reich, the post-World War II boom brought new economic security and allowed the first stirrings of Consciousness III to emerge among the children of a new, expanded middle-class. Many members of this generation sought to gain a new freedom based on a “lifestyle” that was authentic. Their identity was based on cultural interests, creativity and self-expression, not status-seeking through the accumulation of consumer goods.


    One of the joys of Reich’s book was its optimism; one reviewer noted “It combined the rigor of an intellectual and the enthusiasm of a teenager.”


    Greening was based on the belief that America had been founded with great hopes for personal freedom and that its Constitution allowed for major societal change. As Reich saw it, “there is a revolution underway. If it succeeds it will change the political structure as its final act. It will not require violence to succeed. Its ultimate creation could be a higher reason, a more human community and a new and liberated individual.” 


    1950s Social Criticism

    Reich’s book did not appear in a vacuum. Concern about the domination of culture and politics by large corporations, and about the loss of individual identity, had been brewing for some time. David Riesman’s The Lonely Crowd, a critique of the new suburban culture, appeared in 1950. In 1964, Herbert Marcuse, a philosophy professor at UC San Diego, published One Dimensional Man: Studies in the Ideology of Advanced Industrial Society. Marcuse’s book contained scores of profound insights, but it was written at the level of a philosophy textbook, with numerous references to Marx, Hegel, Max Weber, Walter Benjamin, and other German scholars. While we loved the title, One Dimensional Man was simply beyond the comprehension of most undergraduates.  


    For college students today, the word “greening” is closely associated with the environmental movement (green buildings, a Green New Deal), but Reich’s book barely touched on the environment. For him, greening meant newness, a natural growth. He compared the emerging youth culture to “flowers poking up through the concrete.”


    Reich wrote in 1970, before the 1973 oil embargo, when gasoline was around 30 cents per gallon and solar power was so expensive it was used only on NASA space probes. The rainforests were still intact and global warming had yet to manifest itself.  


    Greening did not meet with universal acclaim. Critics on the far left, including Herbert Marcuse, condemned it as “naïve” for imagining that massive social change was possible without violent action. In a critique published in the New York Times in November 1970, Marcuse warned that no national revolution had ever succeeded without violence. He argued that the entrenched “groups, classes, interests” in America controlled the police and armed forces; they set the priorities for America, and they would not voluntarily give up any of their power.    


    Reading Reich’s book today, some fifty years after its publication, we can see that many of his descriptions of the repressive culture accompanying Consciousness II are still valid. But his predictions about an emerging Consciousness III were off-target. 


    From today’s vantage point, it is clear this book was written by an affluent white man working at an elite cultural institution and for an audience of well-educated young white people. Although Reich included a few quotes from Eldridge Cleaver’s recently published Soul on Ice, he never discussed the crushing poverty of inner-city ghettos, the suburbs’ segregated schools, or the structural racism still in place in the 1960s.


    He also seemed blind to the nascent feminist movement. Betty Friedan’s The Feminine Mystique was published in 1963 and the National Organization for Women was founded in 1966, so the basic tenets of feminism were known, if not yet widely practiced. His vision for Consciousness III women was confined to “liberated housewives” and enlightened school teachers. 


    Still, Reich was eerily prescient about many other trends in American society. In Greening he posited that “the great question of these times is how to live in and with a technological society; what mind and what way of life can preserve man’s humanity against the domination of the forces he has created.”


    He also warned of “the willful ignorance in American life.” He lamented that Americans “could be sold an ignorant and incapable leader because he looked like the embodiment of American virtues.” 


    Reich left Yale Law School in 1974 and moved to San Francisco. He published several more books, including an autobiography, The Sorcerer of Bolinas Reef, in which he revealed his gay identity. 


    Reich gave the younger generation hope in a dark period in American history.  He will be missed.


    Note: Although the original 120,000-word edition of The Greening of America is out of print, a condensed, 25,000-word e-book version, with a new foreword by Charles Reich, was published in 2012 and is available on the Internet. 

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172462 https://historynewsnetwork.org/article/172462 0
    Who Says Historians Can’t Travel Back in Time?


    Ricky Law is a historian of interwar Germany and Japan and an Associate Professor in the Department of History at Carnegie Mellon University. He received his PhD from the University of North Carolina at Chapel Hill in 2012.


    What books are you reading now?


    My book, Transnational Nazism: Ideology and Culture in German-Japanese Relations, 1919–1936, was just published. I still can’t help but read passages from it to come up with possible improvements. I have spent so much time in the past several years reading and revising the manuscript that I find it a little hard to kick the habit abruptly, even though there is no chance of making changes.


    I am also beginning to prepare for my next book project on the cultural and social impact of foreign language learning in interwar and wartime Japan. This is an interdisciplinary project about history, international relations, and linguistics. I am now working my way through Robert Phillipson’s Linguistic Imperialism and Linguistic Imperialism Continued, which examine the concomitant rise of English and English-speaking countries to global predominance.


    For personal interest, I am reading The Hellenistic World and the Coming of Rome by Erich Gruen and The Senate of Imperial Rome by Richard Talbert. As teachers, both authors were generous to indulge me in my amateurish fascination with ancient history. I am now starting to catch up on their books.


    What is your favorite history book?


    One favorite is hard to say. A book that left a deep impression on me was Weimar: Why Did German Democracy Fail? It is not a typical monograph but a discussion among four historians. The introduction by Ian Kershaw features some of the most illuminating writing I have read on the complex event. I encountered the book as a graduate student already familiar with the topic, but I was still struck by the clarity and insight of Kershaw’s overview of the history and historiography. The discussion following the introduction is an excellent source to learn how historians think and interact.


    I read Christopher Browning’s Ordinary Men: Reserve Police Battalion 101 and the Final Solution in Poland three times – once as an undergraduate, once as a graduate student, and once as an instructor. Each time I learned something new, and I think that is the mark of a great work. Another book that I pick up from time to time is History of the Later Roman Empire, Volume 2 by J. B. Bury. I could hardly put the book down the first time I read it. Good history writing involves good storytelling.


    Why did you choose history as your career?


    I have always been interested in the past, but I didn’t know that I could turn it into a career. I began my undergraduate studies at UC Berkeley with a major in electrical engineering and computer science. It was the heyday of the original internet boom in the Bay Area, so a career in tech seemed self-evident. But I quickly realized that engineering did not suit me and that if I worked in tech I would be miserable, so I switched majors to study what really motivated me. 


    The greatness of UC Berkeley as a university was that I walked out of one building and into another, but I still received the same world-class education. I ended up double majoring in history and German, and was one course short of a minor in classics. I also studied abroad in Germany in my junior year. After graduation, I took a chance offer to teach English in Japan and gained a deep appreciation for the country. I decided that I wanted to learn more about the history of Germany and Japan, so I went to UNC Chapel Hill for graduate school with a project on Japanese-German relations. I actually had no intention of entering academia until very late in my graduate studies. I was planning to become a civil servant – I went to public schools all my life and wanted to give back to society. I applied to both government and academic positions, and the university job offer came at the right time.


    What qualities do you need to be a historian?


    Attention to detail, persistence, open-mindedness, and a willingness to take risks. History is not a subject like mathematics or physics where one can make major discoveries through personal genius alone. The past is not something that one can just “figure out” – no amount of intellectual brilliance can replace time and effort spent with historical sources. I gained some important insights in my research by noticing scribbles in books, changes to letterheads, or choice of fonts. Published materials likely went through an editing process, so even small details can reveal the thoughts of their creators. To understand the past, it is important not to confine oneself just to political, cultural, or social history, etc. These are categories for analysis but people don’t live separate lives like that. Going beyond disciplinary boundaries can be risky but can also bring unexpected results.


    Who was your favorite history teacher?


    I was fortunate to have many excellent mentors throughout my studies. At UNC Chapel Hill, Christopher Browning and Miles Fletcher were superb co-advisers for my dissertation on transnational history. They taught me lessons on teaching and research that I still apply in my classroom and writing. At UC Berkeley, I took every class offered by Michael Grüttner, then a DAAD visiting professor from the Technical University of Berlin. In the twenty years we have known each other, he has always been ready to answer my questions and has provided indispensable insights for my book. I only took a freshman seminar taught by Tom Havens, but he has since been generous with his time and supportive of my work. Although I was just one of hundreds of students in Leon Litwack’s US history survey course, we had many interesting conversations during his office hours. When I began teaching my own large lecture courses, I looked back to that experience for inspiration and guidance. The senior-thesis seminar taught by Erich Gruen was singularly enlightening. When I write, I still keep in mind his exhortations to interpret evidence more thoroughly and skeptically. My history teacher at King City High School, Paul Cavanaugh, was not only a phenomenal teacher but also a great role model.


    What is your most memorable or rewarding teaching experience?


    I find it most rewarding when students tell me that they had dreaded history classes because of a bad experience in high school, but that my course changed their view of the subject and encouraged them to take more history classes or major in history. I teach a large, introductory survey course that is part of general education. Most of my students intend to specialize in a field other than history, so convincing any of them to take another history class or add a history major feels like a victory and a validation of my effort. 


    What are your hopes for history as a discipline?


    History as a discipline is indispensable for democracy. It trains us to place ourselves in the shoes of others in different times and places. Thinking beyond the self and from the perspectives of others is the essence of democracy. My interactions with my students make me hopeful that history and democracy will thrive together.


    Do you own any rare history or collectible books? Do you collect artifacts related to history?


    I have a small library of German and Japanese books published in the interwar era. Most of these books were not accessible at the German National Library or the National Diet Library, so I had to look for them in bookstores or flea markets. Both Japan and Germany have a robust antiquarian trade. Going book hunting gave me a good reason to take a break from research or writing. Leipzig, where I stayed for many months, has historically been a center of publishing in Germany and host to one of Europe’s largest book fairs. But there is nothing quite like Jinbōchō in Tokyo, a district with streets lined with bookstores. I spent countless hours there browsing bookshelves looking for titles and discovering ones I did not know about. Many of the books I collected deal with the practicalities of intercultural interactions, such as travel guides, travelogues, and language textbooks. I am fascinated by the mechanics of how humans made sense of ideas, objects, and people from other cultures. I also have a small collection of maps from that period. Most of these maps were not for finding directions, but they gave viewers a chance to imagine traveling abroad.


    What have you found most rewarding and most frustrating about your career?


    My most rewarding experiences came from sharing my knowledge of history with others, be they colleagues, students, or anyone interested in learning more about the past. Seeing my book, which I worked on for over a decade, in print is truly gratifying and makes me feel that I have added to human knowledge. But it is also frightening, as it is now out in the open for everyone to see and comment on. The most frustrating times were when students wrote in course evaluations that they did not see the point of having to take a history class because they planned careers in STEM, the performing arts, and so on. I wish they had told me earlier, so I could have tried personally to convince them that a grounding in history will make them better scientists, artists, and businesspeople. College is precisely the place to explore various areas of interest.


    How has the study of history changed in the course of your career?


    I may belong to the last generation of historians who remember the days before the internet. Information technology has undoubtedly transformed how we research history. Digitization allows the historian to handle an amount of material previously unimaginable. I could not have written a book on the mass media of two countries without digitized sources. Having said that, I think digitization causes some problems. One is information overload. Because it is so easy to access electronic material, it is very tempting to keep looking for the elusive “whole picture” rather than to start writing. Another is the loss of context. A keyword search can be the fastest, most efficient way to find a relevant document, but it rips the document from its surroundings, like reading newspaper clippings rather than scrolling through the pages. An even more disruptive change is that far fewer people are studying history. The interest is still there, but history is often erroneously perceived as a field that does not lead to career success. I worry that history may soon be studied only by the privileged few who can afford to. That would set history back to the status it held hundreds of years ago – certainly not what it was when I began my studies.


    What is your favorite history-related saying? Have you come up with your own?


    I like to say that both astrophysicists and historians can travel back in time, but only historians can do anything about the past.


    What are you doing next?


    There are some post-publication activities associated with Transnational Nazism. I have read numerous books, but until I wrote my own, I was not aware of how much work goes into publishing one. I gained much respect for book authors. I have established a presence on Twitter (@rickywlaw), mostly to discuss my book and to comment on historically relevant current events. I have felt, and still feel, rather ambivalent about the purpose of social media and its repercussions. But that’s how the world works now (I’ll leave it to future historians to assess social media’s worth), and those who know better have a responsibility to speak up and speak out when history gets misused. I am also starting to write my next book. It will analyze why and how the Japanese learned various foreign languages in the interwar and wartime years. Additionally, I will develop a few new classes on Roman, Japanese, and German history.


    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172469 https://historynewsnetwork.org/article/172469 0
    Roundup Top 10!  



    How the Declaration of Independence became a beacon to the world

    by Charles Edel

    The Declaration’s international reach.


    Reflecting On The Civil Rights Act’s Anniversary With James Baldwin

    by Lindsey R. Swindall

    Much like the time in which Baldwin wrote, we are living through a period of deep political division and social crisis framed by global discord.



    The Supreme Court Is in Danger of Again Becoming ‘the Grave of Liberty’

    by Eric Foner

    Supreme Court decisions have practical consequences, which justices too often blithely ignore.



    An Open Letter to the Director of the US Holocaust Memorial Museum

    by Omer Bartov, Doris Bergen, Andrea Orzoff, Timothy Snyder, and Anika Walke, et al.

    The United States Holocaust Memorial Museum released a statement on June 24 condemning the use of Holocaust analogies.




    The Surprising History of Nationalist Internationalism

    by David Motadel

    Internationalism, a concept that, after all, implicitly presumes the existence of the nation, and extreme nationalism are not necessarily incompatible. The far right is less parochial than we think — and that’s dangerous.




    The Lingering of Loss

    by Jill Lepore

    My best friend left her laptop to me in her will. Twenty years later, I turned it on and began my inquest.



    The False Narratives of the Fall of Rome Mapped Onto America

    by Sarah E. Bond

    It is disturbing to see how gravely inaccurate 19th-century depictions of the destruction of Rome are used to illustrate news stories today, particularly those that draw parallels between Rome and the United States.



    Why Democrats are wrong about Trump’s politicization of the Fourth of July

    by Shira Lurie

    He’s not doing anything that hasn’t been done for centuries.



    How Lincoln's disdain for demagogues pricks Trump's Fourth of July pomposity

    by Sidney Blumenthal

    If nothing else, the president’s speech on the Mall on Thursday will show how far we have fallen since Lincoln.


    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172470 https://historynewsnetwork.org/article/172470 0
    A Declaration of Independence from Hunger

    The Homestead at Hot Springs, Virginia, USA, where a UN Food Conference in 1943 laid the foundation for the UN Food and Agriculture Organization. (courtesy FAO)


    As we celebrate the Fourth of July, let's remember the greatest line from our Declaration of Independence from Great Britain in 1776: "We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness." These ideals of human rights for all citizens inspire us to end world hunger, for without basic food and nutrition no person is free to build a life and reach their God-given potential.


    These essential rights were evident in President Franklin Roosevelt's Four Freedoms speech early in World War II, the third of which was Freedom from Want. When FDR called for a United Nations Conference on Food during the war, it drew inspiration from the Declaration of Independence, according to Harvard Professor John Black in his book Food Enough. The delegates stated at Hot Springs, Virginia in 1943: "This Conference, meeting in the midst of the greatest war ever waged, and in full confidence of victory, has considered world problems of food and agriculture and declares its belief that the goal of freedom from want of food, suitable and adequate for the health and strength of all peoples, can be achieved."


    While progress has been made in fighting world hunger, too many people still go to bed hungry. We must continue to fight hunger at home and abroad. Americans who donated to the Letter Carriers National Stamp Out Hunger food drive in May can feel proud of their efforts. This year's donations were the third-highest total ever, reaching over 75 million pounds of food – 63 million meals for America's foodbanks. Letter Carriers President Fredric Rolando says, "We are thrilled by the results of this year’s food drive and the impact it will have on helping feed those in need. And we appreciate the generosity and compassion of so many Americans." The postal branch with the most donations was San Juan, Puerto Rico, which beat Los Angeles for the top spot. Orlando was third, and Florida was the state with the most donations.


    When Americans donate to foodbanks, it also sends a message to our government to do its part. Congress and the President should support the TEFAP program that helps foodbanks, as well as the SNAP food stamp safety net for families. The Senate should pass legislation put forth by Senators Sherrod Brown and Kirsten Gillibrand to expand summer feeding. You never know when trauma will strike and a family will need help putting food on the table.


    Charity in fighting hunger may start at home, but it does not end there. Hunger is a major international crisis right now, especially with the wars in Syria, Yemen, Afghanistan, and South Sudan. Relief agencies need support in the worst hunger emergencies of our time. Children suffer the most in the war zones, becoming victims of deadly malnutrition. The new Global Childhood report from Save the Children says, "Many children who manage to survive in these fragile and conflict-affected settings suffer from malnutrition. Recent estimates put the number of stunted children living in conflict affected countries between 68 million and 113 million (45 to 75 percent of the global total)."


    The child malnutrition crisis will get much worse this summer, with East Africa facing a major drought. Congress should increase funding for Food for Peace, McGovern-Dole global school lunches, and other aid programs. As our experience after World War II taught us, you cannot build peace where there is hunger and chaos.


    Food is of the utmost importance to our domestic and foreign policies. Each one of us can do something about it, especially when it comes to helping charities fight hunger. You can support your local foodbank and charities that fight hunger across the globe, like the World Food Program, Catholic Relief Services, Mercy Corps, Save the Children, Action Against Hunger, UNICEF, and so many others. You can write letters to your representatives in Congress urging them to do more to feed the hungry. Everyone can be a leader in declaring a new independence: freedom from hunger for all.  

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172459 https://historynewsnetwork.org/article/172459 0
    America, One and Inseparable


    This Independence Day it is all too easy to find the theme we must adopt as we reflect on American history and identity. “America: One and Inseparable” comes to mind at once as we think on what we need to seek out and build for our greater good.


    Those of us who have lived a long life in our great country look about at division everywhere among us.  In Congress, in many a state legislature, deep in our Society—the expression “united we stand,” once so normal, so commonplace, there for our use, seems no longer to describe us.


    It is so very tempting to rush at the very outset to the figure of the one who has so successfully divided us and continues to do so.  He need not be mentioned, for all are all too well aware of what is happening and why this disaster has come upon us.


    But today, I’d like to return to the document this holiday celebrates. The words enshrined in the Declaration of Independence tell us who this country is for and who our duty belongs to:  “all men” it says, and we now consider that to mean “men and women alike.” 


    But despite those words and national mission, from birth to our passing we allow ourselves to be separated from our fellow beings! The quiet acceptance of the First Amendment so common in earlier years seems open to debate, while that Second Amendment is in the public eye for absolutely devastating reasons, is it not so? Our Leader seems determined to make of the word “Immigrant” something alien to us, as those with darker skin and sometimes a different religion are deemed quite unacceptable in the America we know and love. Violence is omnipresent—from the daily news to our entertainment. It is a vehicle increasingly used to entertain, while its consequences create deeper and deeper fear.


    Yes, the challenges we face as a nation are vast.  Guns and drugs and opioids have no place here, nor do threats that robots may take over tomorrow’s means of earning a living.  Often, it seems like compromise is impossible. As I offer these words, representatives of a political party in one state just moved out of state to avoid voting in the Legislature, denying legal representation and participation for all elected there.


    To combat these challenges, we should listen to our former presidents, our great authors, and those who so brilliantly fill our pulpits.


    We want to live with words like those of President Thomas Jefferson in his First Inaugural Address, President Abraham Lincoln as he faced those who suffered at Gettysburg, and even President Franklin D. Roosevelt, apolitical as he warned Americans against life governed by “fear.” 


    We need to think on the Four Freedoms as characteristic of a free and self-governing Nation: the freedom of speech and expression, the freedom to worship God, the freedom from want, and freedom from fear. 


    As we seek sources that may offer inspiration, we must not overlook the Archives of past Presidents of our Land.  Every time I enter the doors at West Branch, Iowa (Hoover), Abilene, Kansas (Eisenhower), Independence, Missouri (Truman), and Austin, Texas (Johnson), I get a real lift.  Two weeks at Hyde Park, N. Y. raised my spirits in 1952, and I’m sure the words of Kennedy (usually inspiring), and Nixon would inspire these days as they did long ago.  



    Billy Graham in his day and others who offer spiritual guidance are also capable of raising our ideas for improving Society.  Some columnists and press writers use prose that lifts our boats (an old expression).  There is no doubt in my mind that present day dwelling on The Negative every morning with my newspaper and tuning in so often to TV News with its recital of “unpleasantness” is having an adverse effect that may prove lasting—that is, if not diluted by the uplifting.


    Finally, in law-making it is becoming much more than essential that we COMPROMISE strong belief and frame new legislation that will serve us well.  


    Do the ideas above seem remote from the old, that traditional 4th of July spirit that served us well in the Past?  Well they might, for at the very outset I strongly suggested that our Nation is seriously on the wrong track and needs A New Spirit. 


    What we must have, and soon, is a renewed  affection for Unity, and respect for togetherness.  We need to breed  hope across our landscape for a Unity of thinking about a safe and noble Future for us all.  Rejecting hatred and suspicion that will surely prevent our unity in spirit and action as we move forward! Yes, and a determined and renewed affection for one another.  In that spirit, let us hope, will the day intended to build patriotic spirit serve us so we can face the Future united as a single democratic people.

    Tue, 23 Jul 2019 09:06:58 +0000 https://historynewsnetwork.org/article/172458 https://historynewsnetwork.org/article/172458 0
    Gerrymandering presented a ‘political question doctrine’ deemed outside Supreme Court jurisdiction

    An example of gerrymandering



    In Rucho vs. Common Cause, the Supreme Court held that the question of whether partisan gerrymandering in North Carolina and Maryland violated the Constitution was a political question over which federal courts lacked jurisdiction. The result was a long time in coming but was clearly correct. As Chief Justice John Roberts ably demonstrated, there were no neutral and objective judicially manageable standards by which a federal court could resolve questions of political gerrymandering. Beyond that, however, the decision may portend the much-needed revitalization of the political question doctrine, which the Court might apply to other legal questions as well.


    First, here is a little background on the political question doctrine. Justice Elena Kagan’s dissent in Rucho assumes that if there is a constitutional violation, there must be a judicial remedy. However, if the political question doctrine has substance, the opposite is true. The political question doctrine holds that there are some legal questions that courts can’t resolve even if they are convinced that the legislative or executive branch – or a state institution – resolved them in a manner that was clearly incorrect. In other words, there is no instant replay. Just as there was a consensus that referees missed a blatant instance of pass interference in the 2019 NFC Championship game (affecting the outcome and denying the New Orleans Saints a trip to the Super Bowl), so it is with law. There are some legal, often constitutional, errors that are beyond the judicial capacity and authority to correct.


    The political question doctrine has evolved in the federal courts since the very beginning of the republic. Chief Justice John Marshall may have recognized it in Marbury vs. Madison. It has most frequently been applied in the area of foreign affairs and war, but not exclusively. The great case that attempted to analyze and define the political question doctrine was Baker vs. Carr, which presented the question of whether equal protection-based challenges to malapportionment of state legislatures presented non-justiciable political questions.


    Writing for the Court, Justice William J. Brennan made a valiant attempt to deduce principles from the chaotic political question decisions that had accumulated over 150 years. He concluded that the political question doctrine generally, but not always, involved questions of separation of powers. He indicated six reasons why the Court had found political questions in the past. The only criterion pertinent to the reapportionment issue presented in Baker, as well as the partisan gerrymandering issue raised in Rucho, was the absence of judicially manageable standards. In other words, the Court’s obligation, as stated in Marbury vs. Madison, was to resolve legal disputes through the application of pre-existing law. If there were no judicially manageable standards, and hence no law to apply, federal courts had no business resolving the dispute regardless of how important it might seem.


    Applying this principle to the reapportionment dispute, Justice Brennan concluded there were judicially manageable standards since the Court was accustomed to applying the Equal Protection Clause to a variety of issues. This was one of the greatest mistakes in constitutional history. In dissent, Justice Felix Frankfurter vainly argued that there was no constitutionally mandated or discoverable benchmark for proper apportionment. He argued that political question analysis should turn on the nature of the issue at stake rather than the legal theory underlying the challenge. Political thinkers had disagreed throughout history, including American history, as to the best way to apportion a legislature, and the Constitution itself provided no guidance. Thus there were no judicially manageable standards. The Court would simply have to choose among several contested alternatives, which it did two years later in Reynolds vs. Sims when it chose one person, one vote as the appropriate constitutional benchmark. Thus Baker vs. Carr effectively diminished the role of the political question doctrine. Instead, future courts acted on the assumption that if they could figure out a way to justify their decisions, no matter how unpersuasive or how lacking in constitutional pedigree, then by definition the case was justiciable and presented no political question.


    However, the one area following Baker in which there was serious judicial concern about the lack of judicially manageable standards was challenges to partisan gerrymandering. Over a period of almost 50 years, the Court heard several challenges to partisan gerrymandering but never invalidated a legislative districting plan on that ground. In 1986, in Davis vs. Bandemer, Justice Sandra Day O’Connor (writing for three justices) argued that challenges to partisan gerrymandering constituted a nonjusticiable political question due to the lack of judicially manageable standards. Eighteen years later, in Vieth vs. Jubelirer, Justice Antonin Scalia made the same argument, this time on behalf of four justices. Justice Anthony Kennedy, however, held out hope that a judicially manageable standard might yet be discovered, though he conceded that the Court had failed to find one so far.


    The search for such a standard seemed hopeless given that the Court had long made it clear that any standard that even implicitly assumed or led to proportional representation between political parties was forbidden. The Court did not want to be faced with a flood of challenges to redistricting plans, recognizing that the losers of elections would have an incentive to file such lawsuits. Finally, the Court had long declared that some degree of partisanship in redistricting was constitutionally appropriate. Thus the question became one of degree: how much is too much?


    After five decades of searching for a standard, Chief Justice Roberts was finally able to assemble a majority that concluded “enough is enough.” There is no judicially manageable standard here. This is a non-justiciable political question.


    The political question doctrine was the constitutionally appropriate way to dispose of the ever-increasing judicial challenges to redistricting based on partisanship. That, in itself, was a major achievement. But beyond the immediate case and issue, Rucho may lead to a revitalization of the political question doctrine after its diminishment in Baker vs. Carr nearly 60 years earlier. As Justice John Marshall Harlan declared in his dissent in Reynolds vs. Sims, “The Constitution is not a panacea for every blot on the public welfare.” The political question doctrine aids the Court in its appropriate constitutional role of resolving disputes through the application of pre-existing legal rules, as opposed to making up legal rules from whole cloth to resolve disputes. If, after Rucho, we see more of the political question doctrine, it will help return the Court to its proper place in our constitutional system.

    12 History Podcasts You Should Be Listening To


    Stuff You Missed in History Class

    The name is pretty self-explanatory, but Stuff You Missed in History Class definitely should not be (missed, that is). Each episode features the hosts, Holly and Tracy, telling the story of a historical event that isn’t usually covered in standard history classes. They take turns explaining the event chronologically, with some humorous commentary on the side. If you feel that your history education is lacking, or just want to know more about niche topics of history, this podcast is number 19 on Spotify’s list of top educational podcasts. It’s easy to listen to, and the 30- to 45-minute episodes go by very quickly. 


    Recommended first episode: The Bone War Pt. 1 (and 2, if you like it)

    Revisionist History

    This weekly podcast, which sits at number 21 on the US iTunes podcast charts and is hosted by journalist and author Malcolm Gladwell, tackles topics of history that have either been overlooked or conventionally misunderstood. It typically runs from 30 to 40 minutes in length and “asks whether we got it right the first time.” Gladwell adopts a pseudo-documentary style for each episode, which features primary interviews and various recorded sounds that establish setting. He goes in-depth every episode and does the important job of debunking common misconceptions about the events of the past. 


    Recommended first episode: McDonald’s Broke My Heart

    Ridiculous History 

    Ridiculous History is iHeartRadio’s history podcast. Its two hosts, Ben Bowlin and Noel Brown, tackle new topics twice a week and “dive into some of the weirdest stories from across the span of human civilization.” The hosts open each episode with an introduction that briefly explains the concepts they will be discussing. Episodes sometimes feature guests, like podcasters Jack O’Brien and Miles Gray of The Daily Zeitgeist, who supplement the retelling. This podcast has some of the most unique subject matter of any on this list. 


    Recommended first episode: (Some of) History’s Dumbest Military Prototypes

    The History of Rome

    In this series, which ran from 2007 to 2012, host Mike Duncan takes listeners through the complete history of the Roman Empire. The episodes are much more scripted than some of the other podcasts on this list and sound like chapters of a book being read aloud. It is a limited series, meaning one should listen to the episodes in order rather than skipping around. Although it is admittedly dry, this podcast is a great in-depth exploration of one of the most famous and formidable civilizations in human history. At only 15 to 30 minutes per episode, it is perfect for a morning or afternoon commute. 


    Recommended first episode: 001 – In the Beginning

    The History Chicks

    The History Chicks introduces listeners to various historical female figures as hosts Beckett Graham and Susan Vollenweider discuss the challenges the figures faced and the most interesting parts of their lives. Graham and Vollenweider give a little introduction of historical background to set up the figure they then talk about. Their side commentary interspersed throughout episodes keeps listeners entertained. This podcast, posted twice a month, is on the longer side, usually running between 60 and 90 minutes.


    Recommended first episode: Mary, Queen of Scots

    Our Fake History

    This podcast tackles historical myths and commonly repeated stories that are either not completely true or sometimes completely false. Host Sebastian dramatically reads out historical accounts from newspapers, public documents, and even historians; he then works through challenges to those accounts from eyewitness testimony or other historians. The podcast is well researched and delivers a lot of information on interesting topics. 


    Recommended first episode: Episode 38 – Was There a Real Atlantis? (Part 1 & 2)

    BBC Witness History

    Witness History is a short podcast produced by the BBC that describes itself as “history told by those who were there.” It covers various topics from modern history, from the war on drugs to pioneering women pilots. The host is supplemented by primary audio recordings and interviews, giving it a more journalistic, news-report feel. A new episode covering a different topic appears every couple of days, and episodes run only 9 to 12 minutes, so if you’re looking for a podcast to listen to on your walk, this is it. 


    Recommended first episode: D-Day

    The Dollop

    This podcast is a personal favorite. Comedians Dave Anthony and Gareth Reynolds host. In each episode, Anthony takes on one subject of American history and reads the historical account, while Reynolds reacts to hearing it for the first time. It features more commentary than some other podcasts, but that makes the educational component fun. Some of the stories they cover are so genuinely entertaining that they almost don’t require any commentary. 


    Recommended first episode: 210 – The New Jersey Shark Attacks

    Atlanta Monster

    If true crime series interest you, this is your podcast. Host Payne Lindsey adopts an investigative journalism style as he covers the notorious Atlanta Child Murders that took place between 1979 and 1981. The podcast uses audio from news clips and relies heavily on interviews, which highlight the first-person perspectives and experiences that make it so compelling to listen to. Atlanta Monster is a serialized podcast, devoting an entire season to a single historical case. 


    Recommended first episode: S1 Ep1 – Boogeyman


    BackStory

    BackStory is the product of four historians at Virginia Humanities. Ed Ayers, Brian Balogh, Nathan Connolly, and Joanne Freeman take current events that people in the US are talking about and approach them from a historical perspective. They consistently choose interesting topics, like college sports, women in Congress, and gambling. Episodes run from 30 to 70 minutes, and the hosts do a good job of staying on topic. 


    Recommended first episode: 276 – Red in the Stars and Stripes? A History of Socialism in America


    In Our Time

    In Our Time, produced by BBC Radio, covers older history that isn’t typically addressed by the other podcasts on this list, such as the Inca, Moby Dick, and the Epic of Gilgamesh. In each episode, host Melvyn Bragg gets right into the subject matter and brings on historical scholars as guests to interview and offer explanations. The interviews keep things moving and provide expert analysis. This fast-paced, 40- to 60-minute podcast offers a stylistic contrast to the podcasts that focus on modern history. 


    Recommended first episode: 1816, the Year Without a Summer

    Past Present Podcast

    Past Present tackles current political events from the perspective of professional historians. It is hosted by historians Neil Young, Natalia Petrzela, and Nicole Hemmer, an editor of Made by History, the Washington Post’s history section. The podcast tries to make sense of what’s happening in the world by placing it in historical context. It attempts to avoid partisan punditry and offers a nice alternative to current news cycles. Recent episodes cover various aspects of the 2020 election race, including Elizabeth Warren’s candidacy, Joe Biden and the 1994 Crime Bill, and tariffs.


    Recommended first episode: Episode 184 – YouTube, Tariffs, and Elizabeth Warren

    The 2020 Election and Presidential Age


    Last week, 20 Democrats took to the debate stage over the course of two nights in hopes of becoming the 2020 Democratic nominee for president. On the second night in particular, the age gap between the candidates was striking. Joe Biden, 76, and Bernie Sanders, 77, shared the platform with Pete Buttigieg, 37. 


    America has had 44 men serve in the Presidency. Theodore Roosevelt, 42 years old at his inauguration, was the youngest president, and Donald Trump, 70 at inauguration, is the oldest. The average age of presidents at inauguration is slightly over 55. Eleven presidents were 60 or older, 24 were in their 50s, and nine were in their 40s at inauguration. 


    Several of those who were 60 or older had health issues while in office. Two of the 11 died in office: William Henry Harrison and Zachary Taylor. Ronald Reagan displayed such signs of aging that many believed he was in the early stages of dementia or Alzheimer’s disease. Dwight D. Eisenhower suffered a massive heart attack while in office. 


    Several of the presidents who were elected in their 60s struggled to lead effectively. Two of these Presidents, John Adams and George H. W. Bush, could not win reelection, and Gerald Ford, who became president after Richard Nixon’s resignation, was unable to win a full term in the White House. Only three Presidents who served in their 60s and beyond--Harry Truman, Dwight D. Eisenhower, and Ronald Reagan--had what were regarded as outstanding administrations, making the top ten list of presidents in just about any scholarly poll.


    Most of the remaining top 10 presidents were in their 50s when taking office (George Washington, Thomas Jefferson, Abraham Lincoln, Franklin D. Roosevelt, and Lyndon B. Johnson), with the exception of Theodore Roosevelt and John F. Kennedy, who were in their 40s. 


    Several candidates would raise the average age of presidents based on their age on the day they would be inaugurated: Bernie Sanders (79), Joe Biden (78), Elizabeth Warren (71), Jay Inslee (68), John Hickenlooper (67), and Amy Klobuchar (60). At the same time, several potential nominees in their 50s would be consistent with the average age of presidents: (from oldest to youngest) Bill de Blasio, John Delaney, Michael Bennet, Kamala Harris, Kirsten Gillibrand, Steve Bullock, and Cory Booker. The potential presidents who would be in their 40s on Inauguration Day would lower the average presidential age: (from oldest to youngest) Beto O’Rourke, Tim Ryan, Julian Castro, Seth Moulton, Eric Swalwell, and Tulsi Gabbard. Finally, South Bend, Indiana, Mayor Pete Buttigieg, who would be only 39 years and one day old on Inauguration Day 2021, would be nearly four years younger than Theodore Roosevelt and four years and eight months younger than John F. Kennedy. Moulton, Swalwell, and Gabbard would also be younger than TR or JFK, but older than Buttigieg.


    So the potential exists that we could have the oldest President in American history at inauguration with Sanders, Biden, or Warren, or the youngest President in American history with Buttigieg, Gabbard, Swalwell, or Moulton. If any of these seven takes the oath, they will either raise the average age of American Presidents or lower it dramatically.

    When the Future is the Past: A ‘Star Wars’ Summer at Tanglewood


    This Sunday afternoon, the star fleet of the evil Empire, the stormtroopers of Darth Vader, and the nasty masters of the Death Star will once again battle the good guys – Han Solo, Luke Skywalker, Chewbacca, C-3PO, R2-D2, and Princess Leia. This time the epic conflict of the film Star Wars will unfold at, of all places, the renowned Tanglewood music center in Lenox, Massachusetts, as part of a three-concert film tribute to composer John Williams.


    The Tanglewood series resumes August 16 with a screening of Star Wars: A New Hope, with Keith Lockhart conducting the orchestra, and concludes August 24 with Williams back again for a concert of music from his other films.


    Star Wars represents the future in the past. Though famously set “a long time ago in a galaxy far, far away,” it depicts a world that reads as the future, yet the film itself is already 42 years old, an historic gem. It is entertainment history, Yoda and all, at its best.


    Films accompanied by live music from famous orchestras are a craze across the nation, and Tanglewood was one of the first in the field, starting back in the 1990s. Its films have even included Alfred Hitchcock’s Psycho.


    Why Williams and Star Wars this summer? Could anything be more unclassical for this world-renowned home of classical music than Han Solo, Luke and Chewie zipping through the galaxy in the Millennium Falcon, blasting away at the bad guys?


    “We are devoted to great music, and everyone, just everyone, puts the theme to Star Wars among the top pieces of music ever written,” said Dennis Alves, the director of artistic planning at Tanglewood. “People hum the Star Wars theme everywhere. It is great music, plain and simple, and people just love it.”


    Alves, who has to book every kind of act at Tanglewood, from the Venice Baroque Orchestra to James Taylor, has a formula for deciding which movies will be screened with the orchestra playing the music – at the Lenox music center in summer and at the Boston Pops’ home in Boston in winter. 


    “You want to pick a very popular movie, something that appeals to all or is remembered by all, because you want to draw as many people and, frankly, they sell tickets. You want to pick a genre of film that is beloved, such as science fiction or adventure or comedy films. We did a Bugs Bunny tribute that was wildly successful. You want to select a film that will please teenagers as well as adults, too. And men and women,” he said.


    To him, Star Wars is one of those films. “They are so popular that when each one comes out, I wait several weeks to see it to avoid the crush of the big crowds at the theaters,” said Alves.


    John Williams is one of America’s greatest film music composers. He has won five Oscars and been nominated for 51 (second only to Walt Disney). Among his films are all of the Star Wars movies, Close Encounters of the Third Kind, the first three Harry Potter films, the first two Jurassic Park films, Home Alone, Superman, and E.T. He even composed the music for the first season of the television series Gilligan’s Island. He last appeared at Tanglewood in the summer of 2017; back in the ‘90s he began a long association with the music center and has conducted there dozens of times.


    “He’s a fan favorite,” said Alves, who added that the type of movies Williams scores are exactly the kind of films the music center wants to show.


    Tanglewood executives claim that another reason they do film concerts is that they draw a very different audience from their standard classic offerings. “Star Wars is the perfect example of that. We’ll get thousands of teenagers each summer for those concerts. We hope that these teenagers, who ordinarily would not come here, will come back later, or bring their own kids back to see the classical works.”


    Another reason for the use of Star Wars is the movie’s cult following. “There are millions of Star Wars fans and there are a number of different Star Wars movies. One time when Williams was doing one of his concerts here people in the audience yelled that they wanted more songs in an encore. What did he choose? The orchestra did music from The Empire Strikes Back. The place simply went crazy. They loved it,” said Alves.


    The setting for the movie concerts at Tanglewood is beautiful. The music tent, or “shed,” and nearby Ozawa Hall sit on sprawling lawns carved out of a forest in the hills of Lenox, in western Massachusetts, three hours from both New York and Boston. In addition to the roughly 5,000 seats under the tent, hundreds of music lovers sit on the wide lawns beyond it. People often arrive in the morning and picnic on the lawns while listening to the Boston Pops practice, in a quiet world far from the madding crowd.


    Tanglewood is certainly not unique. For several years, the New York Philharmonic has added several movie/concerts to their schedule. Last spring, it presented a two-and-a-half-hour concert of Bugs Bunny cartoons with the orchestra playing the music and David Geffen Hall at Lincoln Center was completely sold out. Best of all, everybody had the chance to have their picture taken with Bugs Bunny himself (he was not chewing a carrot, though). The audience there roared at all the cartoons and applauded madly at the end of the films, as audiences do across the country in these movie concerts.


    So, on Sunday, and again on August 16 and 24, Tanglewood will be packed with movie fans. Their enthusiasm will be with them. Their energy will be with them and, most importantly, the Force will be with them, too.
