Sick Leave Provisions in the COVID Relief Bill Were Historic. That's the Problem.

 

On April 1, for the first time in our country’s history, some U.S. workers became entitled to receive paid sick time on an emergency basis related to the coronavirus pandemic for the remainder of 2020. There is a lot that’s remarkable about that sentence. The legislation instituted an emergency program rather than expanding a longstanding right. It embraced some workers, but not all. And it is contingent on an ongoing emergency with an end date, rather than being a permanent guarantee.

The Families First Coronavirus Response Act (FFCRA), enacted on March 18 and in effect from April 1 through December 31, 2020, provides up to 10 paid sick days for people who must miss work because of a COVID-19 quarantine order, self-isolation recommendation or diagnosis; because they are caring for other individuals affected by COVID-19; or because a child’s school or place of care has closed. The new law applies to public sector workers and employees in businesses with fewer than 500 workers – but excludes those who work for larger companies, about half the workforce. It also includes potential exemptions – which the U.S. Department of Labor has interpreted quite broadly – for health care workers and emergency responders, which means essential workers on the frontlines of the coronavirus crisis may be prohibited from accessing paid sick time if they or a loved one fall ill with the virus.

Guaranteed access to paid sick days is necessary – now, during this pandemic, and always – because more than one-quarter of the U.S. workforce (27 percent) has no paid sick time at their jobs, and access rates are even lower among lower-wage workers and service sector workers – the very people who have high rates of contact with the public and cannot afford unpaid days away from work without jeopardizing their ability to buy food for the month or cover housing costs. The United States stands alone among high-wealth countries in failing to guarantee workers paid sick time; this is a byproduct of our anti-regulatory approach to business, the “bootstraps” ideology that workers should negotiate for benefits with their employers on a one-to-one basis, and the low density of union members within the workforce relative to our international peers.

Notwithstanding its limitations, the FFCRA brought to fruition a victory that advocates and lawmakers have pursued for more than a decade: a federally mandated right to paid sick time. This right must be enhanced and made permanent to achieve the goal of universal coverage. Congress first endeavored to pass paid sick days legislation more than 15 years ago. First introduced in 2004 – before any states or cities had enacted paid sick days laws of their own, as dozens now have – the Healthy Families Act would have guaranteed most workers the right to earn up to seven paid sick days per year to address their own health needs or seek health care services, to care for a sick family member, or to accompany a family member to medical appointments. More recent versions of the Healthy Families Act also include, as acceptable uses of paid time away from work, seeking services related to domestic violence, stalking and sexual assault, and attending a child’s school meetings related to their health and education.

Over the past dozen years, states and cities have acted where Congress has not. San Francisco was the first jurisdiction in the United States to adopt a paid sick days law. In 2006, a group of young restaurant workers affiliated with the organization Young Workers United successfully appealed to voters to approve a paid sick leave ballot ordinance. Approved by 61 percent of the electorate, the law went into effect in February 2007. Two other cities followed suit the next year: Washington, D.C., through D.C. Council action in March 2008, and Milwaukee by ballot question in November 2008 (unfortunately, Milwaukee’s law was challenged in court and, once upheld, was preempted by the Wisconsin state legislature). States began to adopt paid sick days laws in 2011, starting with Connecticut, whose law covered only a portion of workers, and coverage has built from there to reach more workers and more businesses in cities and states across the country. Today, 10 states plus D.C. have comprehensive paid sick days laws, as do more than 20 cities and counties; these generally follow and expand on the Healthy Families Act model, which allows workers to earn paid sick time for personal and family purposes. (Notably, other cities adopted laws that were later preempted by state law, and Michigan passed a law with holes so damaging as to render it relatively toothless.)

Evidence from cities, counties and states has shown the value of paid sick days laws and dispelled opponents’ concerns. Research consistently shows that paid sick days help avoid the spread of contagious illnesses to coworkers and improve individual and public health. For example, researchers found that state and municipal paid sick days laws in the U.S. are associated with a 5 percent reduction in flu rates. Analysts conclude that the laws are used appropriately by workers and have not had adverse effects on business creation, job creation, wages or prices.

As Congress, businesses and advocates continue to address workers’ needs during the coronavirus pandemic, they should fill the gaps left by the FFCRA. While some larger businesses have voluntarily announced emergency paid sick days policies, many have not. Investigative journalism reveals that even workers at companies that have made such announcements may not receive the benefits they have been promised, due to limitations in policy roll-out or policy details that make utilization difficult or impossible. Only a universal law can ensure all workers a baseline level of protection with enforceable rights.

Beyond the current moment, the coronavirus pandemic has laid bare the structural gaps in our society, of which workers’ lack of guaranteed access to paid sick time is one. Who knows how much more slowly coronavirus might have spread had a paid sick days law been in effect, allowing workers with unexplained fevers or coughs to stay home? It is well past time for lawmakers to cure the United States’ paid sick leave malady and provide permanent paid sick days protections, so that no one is forced to work sick, risk their paycheck or risk their job.

To learn more about specific holes in the FFCRA and recommended congressional fixes, as well as ways to get involved in the national paid sick days effort, read here.

US v. Sineneng-Smith Echoes the Fugitive Slave Act

 

The right-wing majority on the United States Supreme Court will soon decide an immigration case, United States v. Sineneng-Smith, in which its decision could threaten fundamental American freedoms.

 

In 2015, Evelyn Sineneng-Smith, an immigration consultant based in California, was convicted of tax and mail fraud and of violating a 1997 law against encouraging undocumented immigrants to remain in the United States. A U.S. Court of Appeals decision invalidated parts of the “encouraging” law in December 2018, with Judge A. Wallace Tashima arguing that “criminalizing expression like this threatens almost anyone willing to weigh in on the debate.” Sineneng-Smith’s case has now been argued before the Supreme Court, and a decision is expected this term. The Trump administration argues the law applies only to individuals who provide “substantial assistance” to undocumented immigrants, but that is not how the law reads.

 

The Supreme Court has three options. It can rule narrowly, declaring Sineneng-Smith either guilty or not guilty without ruling on the constitutionality of the 1997 law. It can declare that criminal penalties for speech encouraging an undocumented immigrant to remain in the United States violate the First Amendment and are unconstitutional. Or it can uphold both Sineneng-Smith’s conviction and the specific wording of the 1997 law, criminalizing anything that encourages an undocumented immigrant to actively or passively resist deportation.

 

U.S. Code § 1324 has a clause titled “Bringing in and harboring certain aliens.” It includes sections that criminalize “any person who . . . knowing or in reckless disregard of the fact that an alien has come to, entered, or remains in the United States in violation of law, transports, or moves or attempts to transport or move such alien . . . conceals, harbors, or shields from detection . . . [or] encourages or induces an alien to come to, enter, or reside in the United States.” It is also illegal to engage in “any conspiracy to commit any of the preceding acts” or to aid or abet their commission. Penalties for people who aid or encourage undocumented aliens include fines and imprisonment for between 5 and 10 years.

 

Driving an undocumented immigrant somewhere or housing them are pretty specific acts – ones that should not necessarily be illegal and that are certainly not immoral. Aiding and encouraging someone are much more nebulous. Is feeding a hungry child or providing the child with medical care a crime? How about teaching a child to read or speak English? Does “I wish you would stay” or “I love you” cross the line into criminality? Is marriage an act of aiding and abetting?

 

A Supreme Court decision that broadly upholds U.S. Code § 1324 has the potential to drive a wedge through the nation, much as the Fugitive Slave Act of 1850 did when it played a major role in precipitating the American Civil War. The 1850 act fined marshals who did not enforce the law and civilians who refused to serve when deputized to help recapture freedom seekers.

 

African American abolitionists and former slaves Frederick Douglass and Jermain Loguen denounced the law in speeches and essays. Douglass charged, “Under this law the oaths of any two villains (the capturer and the claimant) are sufficient to confine a free man to slavery for life.” Loguen declared, “I don’t respect this law—I don’t fear it—I won’t obey it! It outlaws me, and I outlaw it, and the men who attempt to enforce it on me.” Enraged abolitionists in the North rallied support to actively resist the law in Boston, New York, and Syracuse. Speaking in the United States Senate, William Seward (Whig-NY) denounced “the principle of the law for the recapture of fugitives, as… unjust, unconstitutional, and immoral; and thus, while patriotism withholds its approbation, the consciences of our people condemn it . . . We are not slaveholders. We cannot, in our judgment, be either true Christians or real freemen, if we impose on another a chain that we defy all human power to fasten on ourselves.”

 

Daniel Webster (Whig-MA) ultimately endorsed the Senate compromise that included the Fugitive Slave law, declaring its passage necessary for the “preservation of the Union.” It was a position that effectively destroyed Webster’s presidential ambitions.

 

The 1850 Fugitive Slave Act spurred resistance to slavery and the expansion of the Underground Railroad. Two years later, Charles Sumner (Free Soil/Ind.-MA), newly arrived in the United States Senate, declared himself “openly against the usurpation, injustice, cruelty, of the late enactment by Congress for the recovery of fugitive slaves” and demanded repeal of the Fugitive Slave Act, urging Congress to “let its terrors no longer rage through the land.”

 

Ultimately the Fugitive Slave Act contributed to an unbridgeable breach between North and South that propelled the nation to Civil War. A Supreme Court decision in United States v. Sineneng-Smith that broadens the authority of the federal government to suppress the rights of advocates for undocumented immigrants could well spur similar resistance and divide the nation irreparably.

 

As a historian, a teacher, and a citizen, I encourage undocumented immigrants to remain in the United States, especially those covered under Deferred Action for Childhood Arrivals. To paraphrase Jermain Loguen, if the Supreme Court tries to eliminate freedom of speech in the United States, I won’t respect its decision — I won’t fear it — I won’t obey it! It may outlaw me, but I outlaw it, and anyone who attempts to enforce it.

The Atomic Bomb, War Room Intrigue and Emperor Hirohito's Decision to Surrender

As the world observes the 75th anniversaries of the atomic bombings of Hiroshima and Nagasaki later this year, Americans have still not reached a consensus about the role of the atomic bomb in Japan’s surrender. Was Japan ready to surrender without the bomb? Some of us believe so, particularly if the Allies had simply guaranteed the preservation of the Imperial System (with the divine Emperor as the embodiment of all sovereign power in the realm). Others contend the Soviet entry into the war actually caused Japan’s surrender.

However, a close look at the deliberations of Japan’s Supreme Council for the Direction of War, known as the “Big Six” (evidence that has been available for decades), shows Japan’s leadership deeply divided into two factions. A hardline contingent remained opposed to surrender even after the second atomic attack on Nagasaki and the Soviet Union’s entry into the war. On the flip side, the peace faction used the atomic bombings to engage the Emperor in the surrender decision, an unprecedented move within Japan’s dysfunctional form of government, which required unanimous approval for any decision.

By the summer of 1945, the United States and Japan had finalized their plans for the battle for mainland Japan. Paramount in President Harry S. Truman’s mind were the expected American losses. Multiple sources in the American military estimated the invasion of Japan would result in a quarter of a million to one million U.S. soldiers killed and four times those numbers in total casualties. They also predicted five to ten million Japanese deaths in the invasion; the Japanese themselves thought the total could reach twenty million.

From May until early August, Japanese Foreign Minister Shigenori Tōgō, at the behest of the Big Six, carried on a series of vague communications with the Russians that proved futile and never broached the subject of unconditional surrender, an Allied demand since January of 1943. On July 18, Japan’s ambassador to the Soviet Union, Naotake Satō, sent Tōgō a message saying that the Japanese must accept the equivalent of unconditional surrender, excepting the preservation of the Imperial System. Tōgō flatly replied, “With regard to unconditional surrender, we are unable to consent to it under any circumstances whatever.”

On July 26, the Allies––excluding Russia––gave Japan an ultimatum known as the Potsdam Declaration. It listed thirteen terms, a break from the blanket unconditional surrender imposed on Germany: for Japan, unconditional surrender applied only to the nation’s military. The Allies also included two warnings: that Japan must surrender now or face “prompt and utter destruction,” and that the Allies would “brook no delay.” Prime Minister Kantarō Suzuki rejected the declaration, calling it “a thing of no great value.”

On August 6, the U.S. exploded the first atomic bomb used in combat over the city of Hiroshima. The Big Six did not meet until three days later on August 9, the same day that the Soviets entered the war against Japan and the United States detonated a second atomic bomb over Nagasaki. The previous day, the Emperor had told Tōgō: “Now that this sort of weapon has been used, it is becoming increasingly impossible to continue the war. I do not think it is a good idea to miss an opportunity to end the war by attempting to secure advantageous conditions.”

Nevertheless, the members of the Big Six remained at odds throughout the 9th. The peace faction, which included Suzuki, Tōgō, and Navy Minister Admiral Mitsumasa Yonai, wanted to surrender, preserving only the Imperial System. But War Minister General Korechika Anami, Army Chief-of-Staff General Yoshijirō Umezu, and Navy Chief-of-Staff Admiral Soemu Toyoda demanded three additional conditions: limited occupation or none at all, control of war crimes trials, and control of the disarmament of the nation’s military. Absent these, they wanted to fight the decisive battle on the Japanese homeland, believing they would inflict such high casualties that they could achieve better terms.

After hours of heated debate, the peace faction managed to out-maneuver the hardline militarists and convene an Imperial Conference to get the Emperor’s opinion on an issue that clearly had not been agreed upon unanimously. After hearing both sides of the argument, Hirohito told his government he favored the peace faction’s proposal.

Baron Kiichirō Hiranuma, President of the Privy Council, demanded that acceptance of the Potsdam Declaration be contingent on “the understanding that the said declaration does not comprise any demand which prejudices the prerogatives of His Majesty as a Sovereign Ruler.” This proviso preserved the Imperial System and retained Hirohito as Emperor and head of the Japanese government. Anami extracted one further requirement, a promise from the cabinet that the war would continue should the Allies reject the Japanese offer.

Upon learning of the offer, Truman immediately called a meeting with Secretary of War Henry Stimson, Secretary of State James Byrnes, Secretary of the Navy James Forrestal, and his Chief of Staff Admiral William Leahy. “Could we even consider a message with so large a ‘but’ as the kind of unconditional surrender we had fought for?” he asked. Forrestal, Stimson, and Leahy wanted to accept the Japanese offer. Stimson said allowing the Japanese to keep the Emperor served American interests by encouraging the military to lay down their arms. Byrnes demurred. In his view, Japan’s phrasing amounted to a refusal to accept the Allies’ offer. Forrestal suggested that some changes to the wording might make the offer acceptable—an idea that appealed to Truman, who asked Byrnes to come up with a counteroffer acceptable to the Allies.

Byrnes’ response contained two key points: first, that “the authority of the Emperor… shall be subject to the Supreme Commander of the Allied Powers,” and second, that “the ultimate form of government of Japan shall, in accordance with the Potsdam Declaration, be established by the freely expressed will of the Japanese people.” Upon receipt, the Japanese government relapsed into stalemate for two and a half days, with Anami, Umezu, Toyoda, and Hiranuma demanding acceptance of their original four conditions or continuation of the war.

On the morning of August 14, American B-29 bombers released millions of leaflets over Japan describing the surrender negotiations. Marquis Kōichi Kido, the Lord Keeper of the Privy Seal and the Emperor’s most trusted advisor, found one and warned the Emperor that the leaflets might provoke a rebellion, since the peace negotiations had been kept secret from the Japanese people and military. The Emperor instructed Suzuki to muster another Imperial Conference, where, if necessary, he would “command” the cabinet to accept Byrnes’ counteroffer. After listening to Anami, Umezu, and Toyoda’s now familiar arguments for rejecting the Allies’ terms, Hirohito told his government he wanted the Byrnes offer accepted.

The record of the Imperial Conferences extinguishes any notion that the Japanese government was ready to surrender, that it would have surrendered had the Allies guaranteed the Imperial System, or that it surrendered in response to Russia’s entry into the war. It makes clear two salient facts: that Emperor Hirohito ended the war, and that he ended it because of the atomic bomb. He stated as much on four separate occasions: at his meeting with Tōgō on August 8, in his Imperial Rescript (considered by the Japanese to be an inviolable statement from their divine Emperor) on August 15, in a letter to his son on September 6, and when he told Supreme Allied Commander General Douglas MacArthur on September 7 that “the peace party did not prevail until the bombing of Hiroshima created a situation which could be dramatized.”

Students of History, Your Professors have Prepared You for such a Time as This!

 

 

Students of history, your professors have prepared you for such a time as this!

 

The study of history offers an approach to the world necessary for the creation of good citizens in a democratic society. In school, most of us probably recall taking some sort of "civics" courses that taught us things about the United States government. We learned about the importance of voting, the system of checks and balances, and some basic information about our constitutional rights.

 

This kind of knowledge is essential and useful. But taking a course, or memorizing some facts about the system of government, does not make us citizens with an understanding of our roles, power and obligations within that system. And citizenship is what we need at this moment.

 

Good students and teachers of history understand full well that history is more than just "the facts." Yet even they may fail to grasp the role of history within civic education. Too often young people are taught to engage public life for the purpose of defending their rights or, to put it negatively, their self-interests. The result of this approach to citizenship education, as historian Robert Ketcham writes in his 1987 book Individualism and Public Life, "would be intricate knowledge of how the system really works and shrewd understanding of how and where to exert pressure to achieve particular objectives."

 

Such a rights-based approach, an operating manual for the civic machine, is a vital part of citizenship, but it does not help us in a time when sacrifice is essential. The coronavirus pandemic demands a citizenship that places a commitment to the public good over self-interest. Yes, we have a right to spend Spring Break partying in Florida, eat meals in restaurants, and buy as much toilet paper as we can afford, but citizenship also requires obligation, duty, and responsibility. Sometimes the practice of these virtues means that we must temporarily curb our exercise of certain rights. We must think of others and their needs.

Historians think critically about their world, and about how they can reliably know it. They will evaluate the information they receive about the coronavirus and develop insight into which sources they can trust and which they can't. In a time when news and information about this virus is changing and developing at a rapid rate, context becomes very important. News that came across our feeds two days ago may no longer be relevant today. Historians’ work revolves around building a context for knowledge out of disparate documents and sources, and demands revising and reframing knowledge in light of new discovery. Odd as it may seem, the skill of building knowledge from an archive of old documents is the same skill as sorting the flood of electronic information.

Historians are also able to put this pandemic in a larger context. Type the words “1918 Influenza” into your web browser and you will find opinion essays, interviews and news articles written by or featuring dozens of historians trying to help us make sense of the present by understanding the past. They can, at times, alert us to the potential consequences of present-day behavior by reminding us of what happened in an earlier era.

The study of history also cultivates the virtues necessary for a thriving democracy. In his book Historical Thinking and Other Unnatural Acts, Stanford historian Sam Wineburg argues convincingly that it is the strangeness of the past that has the best potential to change our lives in positive ways. Those who are willing to acknowledge that the past is a foreign country–a place where they do things differently than we do in the present–set off on a journey that has the potential to transform themselves and their society.  An encounter with the past in all of its fullness teaches us empathy, humility, and selflessness. We learn to remove ourselves from our present context in order to encounter the culture and beliefs of others who live in this "foreign country." Sometimes the people we meet in the past may appear strange when compared with our present sensibilities. Yet the discipline of history requires that we understand them on their own terms, not ours.

 

History demands we set aside our moral condemnation of a person, ideal or event from the past in order to understand it. It thus, ironically, becomes the necessary building block of informed cultural criticism and political commentary. It sharpens our moral focus and places our ethical engagement with society in a larger context. One cannot overestimate how much the virtues learned through historical inquiry also apply to our civic life. The same skills of empathy and understanding that a student or reader of history learns from studying the seemingly bizarre practices of the Aztec Empire might also prove useful at work when we don’t know what to make of the beliefs or behavior of the person in the cubicle next to us.

The study of the past has the potential to cure us of our narcissism. The narcissist views the world with himself at the center. While this is a fairly normal way for an infant or a toddler to see the world, it is a very immature way for an adult to view it. History, to quote Yale historian John Lewis Gaddis, “dethrones” us “from our original position at the center of the universe.” It requires us to see ourselves as part of a much larger human story. When we view the world this way, we come face-to-face with our own smallness, our own insignificance.

 

As we begin to see our lives as part of a human community made up of both the living and the dead, we may start to see our neighbors (and our enemies) in a different light. We may want to listen to their ideas, empathize with them, and try to understand why they see the world the way they do. We may want to have a conversation (or two) with them. We may learn that even amid our religious or political differences we still have a lot in common. We may also gain a better understanding of why their ideas must be refuted.

This is a time for engaged citizenship and regard for our fellows along with ourselves. The study of history reminds us that we are all in this together.

Laughter in an Age of Pandemics

 

In the 1941 movie classic Sullivan’s Travels, successful movie director John L. Sullivan, played by Joel McCrea, laments the fact that in the midst of the misery caused by the Depression and war, he is making frivolous films such as Ants in Your Plants of 1939. Sullivan rebels. He decides to pose as an average citizen and go out among the people to see what they are like, what they want, and how he can be of service to humanity. After a series of troubles along the way, Sullivan happens upon the sound of laughter. He searches for the source and finds a group of down-and-out men laughing hysterically at a silly cartoon. Eureka! Sullivan realizes the error of his ways. The people don’t want serious, ponderous social criticism; they want to laugh, escape, lose themselves for just a few moments, forget about the troubles they face and have a good time. The point is driven home by Sullivan in the final lines of the film: “There’s a lot to be said for making people laugh. Did you know that’s all some people have? It isn’t much, but it’s better than nothing in this cockeyed caravan.”

During World War II, the commissioner of baseball, Judge Kenesaw Mountain Landis, sent President Franklin D. Roosevelt a letter offering to cancel the baseball season if the President so wished. Roosevelt, in a January 15, 1942 letter, told the commissioner that baseball must go on. The people needed it in the midst of the troubles of the war. Baseball brought joy to millions of anxious Americans. The game had to go on.

When things go from bad to worse, we have essentially two choices: let it defeat us, or rage against the madness and laugh. Laughter is good medicine for virtually anything that ails us. And in this age of pandemics, when social distancing removes the tactile from our daily lives and forces us to hibernate in isolation, we social animals hunger for the embrace of others. Stripped of direct contact with others, we search to fill the void. Laughter helps. True, things aren’t very funny just now, but life remains ironic, silly, discombobulated, and downright hysterical – if you wish to see things that way. And if you do, it will help see you through this insanity. In a world where the Trump Covfefe Panic Index has exploded off the charts, we all need distractions from the misery that surrounds us. And speaking of distractions, I find myself suffering from Kardashian Withdrawal Syndrome. My social grounding has been torn out from under me.

As our politicians inadvertently spread fear and anxiety, we search for security and hope. There was a time when FDR could remind us that the only thing we had to fear was fear itself. But today, watching Donald Trump bumble and fumble his way through a press briefing on the coronavirus, we are left dumbfounded and with a feeling of “Oh Dear Lord, all is lost if this guy is in charge.” Yes, we get the occasional chuckle, as when Dr. Fauci stands behind the President at the daily press briefing, shakes his head, looks down at his feet and invites us to imagine the thought bubble over his head that reads “What the [bleep] is wrong with this moron?” But that is little consolation. Trump, who is wrapped tighter than an airport sandwich, actually inspires fear and anxiety every time he opens his mouth. His credibility has disappeared faster than cupcakes at a pot party, and as each member of Team Trump – crammed together in a very non-socially-distanced way – goes up to the microphone, bows, and makes the ritual pronouncement “You are doing a wonderful job, Dear Leader” before delivering the bad news about a pandemic out of control, we cringe and think, “Life under Trump is like running through hell wearing a gasoline bathing suit.” Trump’s disappointing response to the coronavirus has been as welcome as an ingrown toenail. Our president, who used to say “I alone can fix it,” has been revealed as a fraud. He does, however, have the Midas Touch… everything he touches turns to mufflers.

This president may be a joke, but it is no laughing matter. In this, Marx was right. Of course I refer to Groucho Marx, who said that the problem with political jokes is that they keep getting elected. Can President Trump lead us out of this crisis? That’s about as likely as Mike Pence marrying Cardi B. And while the President says that he is doing a tremendous job (and that is why I do not let my students grade their own exams), and that he would give himself an “A” for his handling of the crisis, in reality the case for Trump handling this crisis well has fallen apart faster than a third-grade science project.

If President Trump cannot provide decisive leadership in this crisis, at least we can laugh, and at this time, laughing at and not with President Trump is a tiny bit comforting. Our hope is that governors and mayors can lead us through the crisis. President Trump is AWOL on this, and perhaps we are all the better for that (OK, we aren’t better off for that, but if he can’t lead the least he can do is get out of the way). 

We are all struggling, and we all need the distractions that only absurdity can provide. If we take President Trump seriously, we are lost. And so our only option is to turn away from our president and turn to each other for comfort, solace, and hope.  Social distancing makes that a bit harder but we are a strong, resilient people. We have been through worse than this. So laugh now and then; see the silly, the absurd and the comic in life. And remember, always remember, we are all in this together and we can get through this together. Reach out to your friends, your neighbors (at a safe distance, of course) and spread hope. It is better than despair. 

What About the Waitresses?


 

Erica S.’s post on Nextdoor.com read, “Are you a bartender or restaurant worker who is financially struggling because your work is shut down right now? If so I have a $25 gift card from the amazing Le Beau market for you!” Among the hopeful respondents was Kira M., who praised the market, and added that she “was laid off from both my server jobs this week.” As the thread wove on, Erica revealed that her impetus for giving came from a friend who asked his friends to celebrate his birthday by supporting someone in the restaurant/retail industry. She had been given the gift card she was donating, but “what the heck,” she’d match it, while others chimed in that they would buy gift cards from Le Beau to donate, too.

            

These mostly upbeat exchanges came from San Francisco’s wealthy Nob Hill enclave, where in the past residents have posted inquiries about such matters as finding a superior tailor to alter expensive trousers. And nanny shares – many, many nanny shares.

 

Farther down the economic food chain from Erica S.’s post came a memory emailed to me a few minutes later. Diana, a friend and (happily) former waitress, recalled that she used to make half the minimum wage – the myth being that tips would make up the rest. This was very familiar territory to me, for it echoed the conclusion of a book I spent years researching and writing: Hey, Waitress! The USA from the Other Side of the Tray. I interviewed waitresses from the low end to the high, from a Maine seafood spot to the Tohono O’odham Indian Reservation in Arizona, from Seattle fine dining to the Florida Everglades, and points in between, including some memorably disgusting places where my friend Diana and others had worked. (One New York waitress recalled her manager stirring his coffee with his penis. She added that the coffee never was very warm.)

 

If I added my hours, as waitresses add their tips, my guess is that some 40 interviews yielded 100+ hours of candor, insights, an overall reflection of the United States for worse and for better, and a litany of piques. Two piques topped the list. The emotional-sociological winner was classism, waitresses being treated as if they were lesser beings than their hungry customers. The economic (if also emotional-sociological) winner was, no surprise, tipping.

 

Server after server described what I came to believe is the central problem with tipping:  it turns customers into employers, and not every customer employs well. Problem with your meal? Don’t blame the manager, the chef, the line cook. Instead, stiff the waitress. Plus, her smile seemed insincere.  

 

Wherever they worked, though, the majority of American waitresses now might be (or soon will be) out of work. The Federal Reserve Bank of St. Louis recently estimated that 13.3 million workers in food preparation and service jobs are at high risk for layoff. The Pew Research Center reports a slight majority of these are women. 

 

I especially worry about the waitresses. Some, presumably, are helping with the semi-new semi-normal of readying take-out meals. How do tips work their way into that equation? Do the “subtract the taxes and the drinks before tipping” doofuses figure no tip is required now anyway? As an extension of take-out, some waitresses apparently have become home deliverers of the meals, either handing over the bag of pad Thai or, as some recommend, leaving it at the door of the recipient. No touching. So, no tipping? The distance, the facial anonymity, would seem to protect the tip-challenged from comparison with the generous.

            

What about tipping in advance? That too is alien to the world of waitressing as I know it. Good waitresses – again, those I’ve met – believe that their tip-increasing strategies were successful, that their professionalism or their ruses worked as hard as they did. (And it brings to mind the stigma attached to prostitutes, who generally get paid first, an element of cultural sexism that continues to affect waitresses.)

           

Indeed, think of what a laid-off waitress does not have to endure on her job site, particularly from strangers: unwelcome touches, insulting comments. “Where’s the pretty one who waited on me yesterday?” “What else do you do?” My favorite retort, if apocryphal, to “What time do you get off?” was “Oh, about an hour after I get home.”

 

Such deflections are but one of the skills of effective servers. Virtually all the waitresses who spoke to me indicated the pride they took in at least some aspect of the work, including getting better tips (hate the tipping, love the tips?). One way was to comp their regulars a piece of pie now and then, or to up the tab of those two ladies having lunch (hint: suggest dessert, while mentioning that to-go bags are at the ready for unfinished entrees). Some waitresses all but boasted that they knew how to calm the cook, or manage the computer. And maybe – no, often – they simply did their job well, knowing how much their regulars treasured them. Waitresses were often the fulcrum of the neighborhood.

 

As I think of the women I met, I wonder about today’s waitresses, including a sweet young Frenchwoman at my local bistro, who smiles gamely at every uttered “mercy” trying to be “merci.” The bistro serves take-out only now. Will unemployment checks cover gratuities? Je pense que non.

 

I wonder if some of the many laid-off waitresses will re-assess the job, and what they like or don’t like about it, if the unplanned distance changes their perspective. If they return to the same work, or if the work is available again, will some waitresses, as a group wildly varied in their spending habits (if all famously generous as tippers), start saving for another rainy day? Or does that apron pocket of tips demand to be spent immediately?

            

I wonder too, if – when – this viral horror is over, customers will have a new appreciation for the women, and men, who serve.

           

Meanwhile, my former waitress friend, Diana, also wrote in her email about a friend of hers, a poker dealer laid off in California’s casino shutdowns. “She is scrambling and just applied for food assistance. Her teens are going to school every day to pick up the free lunch available.”

            

Let’s hope Kira M. landed that gift card for her groceries.

 

Readers interested in support for restaurant workers may learn about one national effort at rerf.us.

What the "Primacy" Debate in Foreign Policy Gets Wrong

 

Americans and people around the world wondering about the diminution of American leadership should be paying more attention to a vivid debate that has been taking place among academics and pundits. It sounds arcane but it has big implications for the reshaping of elite opinion about foreign policy. 

 

At issue is not how the United States will act with other nations – during global crises in particular – but whether the United States will act at all. In framing that false choice, the debate echoes and to an extent distorts past controversies about how the United States should or should not use its power in the world.

 

The debate centers on what is loosely called grand strategy, and – as simplified, exaggerated, and abstract as such debates tend to be – it boils down to the old line about power: use it or lose it. On one side are “liberal hegemonists,” who are often referred to as “primacists” by their adversaries. This group would generally agree with the “use it or lose it” formulation. Their principal adversaries, the “restrainers,” would not.

 

Unfortunately, both camps tend to shift easily from questions of substantive success or failure to questions of motive. Restrainers accuse primacists of greed or aggression, and primacists accuse restrainers of naïveté or a selfish withdrawal from the world. 

 

This shift tends to put the primacists on the ground of grand values and the restrainers on the particulars of policy, leaving the two camps shouting past each other. Mainly this is because the primacists are vulnerable on the bad track record of recent interventions and the restrainers are vulnerable to being associated with the worst elements of American isolationism.

 

But aside from the sincerity of the debaters and the subject matter, there is an even more basic, and potentially more dangerous, problem with this debate. It is one of meaning. Repeatedly, partisans on both sides use terms like “primacy,” “hegemony,” “preeminence,” “dominance,” “unipolarity,” and “supremacy” almost as if they are interchangeable. But they mean different things.  

 

For example, “primacy” comes from the Latin for “first” – in influence, importance, or rank, but not really in physical strength, aggression, or egoism, as “America-first” implies. To students of international relations, primacy is associated with the role of foreign (or domestic) policy in statecraft. Primacy is as much a normative concept about the character of U.S. leadership as it is a description of America’s ability to exert power. And so it requires a certain ethics; if we were to turn back to the ecclesiastical root of the word, these ethics are combined with spiritual leadership and, as the Oxford English Dictionary tells us, must be “properly distinguished from ‘supremacy.’”

 

Restraint, by contrast, suggests that assessing the consequences of any action is the most appropriate yardstick. Therefore, the opposite or alternative to primacy is not “restraint.” It is a state of being lowest or last – “abjectist,” “ultimatist” or perhaps “postumist.” And the opposite or alternative to “restraint” is not “first,” but “disinhibition” or even “recklessness.”

 

The incommensurability of these terms makes it difficult even to agree on what is being debated. The good news is that much of the ideological baggage can be stripped away from the debate to focus on what policy does and what it achieves.

 

What restrainers really protest is the belief that the United States ought to try to rule the world in perpetuity. They say that is a doomed strategy, and they’re right. And what some primacists really defend is not superiority per se but superiority as a calibrated (and usually collective) means to advance certain American interests and values, in the belief that security is indivisible – geographically as well as functionally – across economic, military, political, and social relations. That does not necessarily mean what is good for Americans is good for everyone else; rather, it means that it is not possible to disaggregate the elements of security from one another and survive in an interdependent world.

 

Under these circumstances, it is no surprise that primacists seek to shift the terms of debate back to values and revert to a more familiar, loaded term: “isolationism.” Just as the promoters of liberal hegemony seek to absolve themselves of the legacies of imperialism (forgetting perhaps that hegemony was once called “imperialism with good manners”), most restrainers, given the term’s association with appeasement in the years before the Second World War, strongly denounce isolationism, insisting that they are every bit as “internationalist” as their critics, the difference being that they seek peace rather than domination.

 

American isolationism was more complex, however, than its detractors then or now recognize. Isolationism rejected not internationalism but interventionism, and specifically military intervention in foreign conflicts. Few of the people once known as isolationists sought actual isolation behind two oceans. They sought instead to shift the emphasis of foreign policy away from such intervention and perceived entanglement. One of their spokesmen, Senator Arthur Vandenberg, said he preferred the term “insulationist.” Like today’s restrainers, Vandenberg’s generation of isolationists also sought peace, neutrality, and independence, along with, for some, a nostalgic return to the “Era of Good Feelings” of the early 19th century when John Quincy Adams was the Secretary of State.    

 

The scholar and diplomat George Kennan (who, incidentally, didn’t think much of the term “grand strategy”) once wrote that there are just two types of isolationist: those who “hold the outside world too unimportant or wholly wicked and therefore not worth bothering about” and “those who distrust the ability of the United States Government, so constituted and inspired as it is, to involve itself to any useful effect in most foreign situations.” Kennan admitted to being of the second type. That was one of the reasons Kennan’s career as a professional diplomat lasted half as long as his career as a scholar.

 

Primacy is a state of affairs. Restraint is an attitude. Neither, strictly speaking, is a strategy. To the extent the United States has ever had a “grand strategy,” it remains remarkably similar to the one that Woodrow Wilson half-sold to the world a century ago: that the world must be made safe for democracy, which is to say, a world whole, free, and at peace. 

 

At least, that’s the principle. Many countries repudiated it in practice, but it has proceeded nonetheless in forms of collective security, economic integration, and cultural understanding. Its progression has not been universal, as Wilson may have hoped, but piecemeal, region by region: in much of the Western Hemisphere, in Europe, and in some parts of Africa and Asia during the second half of the 20th century. It should still be allowed to proceed there and in other regions, and globally, with American help. People of goodwill on both sides of the debate should welcome it.

 

So, yes, let’s end “endless wars” and enhance peaceable diplomacy around the world. But let us not at the same time renounce by redefinition America’s international responsibilities and its capacity for wise, moral leadership. The American people and the country’s survival depend on it. 

New Deal or Nazism? Historical Comparisons to Trump's Performance as a Leader in Crisis

 

In times of crisis, effective leadership is more crucial than ever. As President Trump struggles with the early stages of the coronavirus pandemic, including the filing of more than ten million new unemployment claims in late March and critical shortages of equipment like effective masks and ventilators, the ways other leaders responded to the Great Depression offer lessons both inspirational and cautionary for the present. Although Franklin Roosevelt and Adolf Hitler operated in two very different political cultures, their first 100 days in power offer a sobering reminder of the consequences of decisions pursued by leaders in crisis.

 

By late 1932, the Depression was several years old. One of every four workers in the USA was unemployed; in Germany, one of every three. But in early 1933 Hitler and then FDR came to power, both remaining in office until they died in 1945. Historian Robert Dallek insists that FDR’s First Inaugural Address of March 4, 1933 displayed “a masterful use of language to ease current fears and encourage positive thinking about the future.” The new president declared, “This is preeminently the time to speak the truth, the whole truth, frankly and boldly.” And he indicated that he would ask Congress to convene in a special session to act upon his proposals.

 

Following the speech he proceeded, in the words of historian David Kennedy, “to act with spectacular vigor.” For advice he relied on a wide variety of sources, including some in academia. By the time Congress’s special session ended on June 16, he “had sent fifteen messages to Congress and had in turn signed fifteen bills into law.” 

 

Among other results, these actions guaranteed bank deposits (earlier bank defaults had left depositors poorer than ever), set up a national relief system, aided farmers, and created jobs through such new programs as the Civilian Conservation Corps. The CCC employed hundreds of thousands of young men (more than three million in the next decade) to work on conservation and beautification projects in such areas as national parks. “No less important,” writes Kennedy, “the spirit of the country, so discouraged by four years of economic devastation, had been infused with Roosevelt’s own contagious optimism and hope.” 

 

Aiding FDR were his interactions with the media and public. On March 8 he held a press conference. Dallek writes, “His engaging manner routinely disarmed the journalists, and as early as the end of the first conference, in an unprecedented expression of appreciation for his effectiveness in drawing the country back from the brink of disaster and the civility he showed the press, the reporters applauded his performance.” To continue interacting with the press, he met with them regularly, holding 80 more press conferences during the remainder of 1933--far more in his first year than any subsequent president.

 

On March 12, FDR spoke on the radio for about twenty minutes in the first of his more than 300 presidential fireside chats. Some 60 million Americans, about half of the country’s population, listened in--historian Jill Lepore has written that during his subsequent summertime fireside chats, “people said that . . . you could walk down a city street, past the open windows of houses and cars, and not miss a word.” She adds that “he spoke on the radio with an easy intimacy and a ready charm, coming across as knowledgeable, patient, kind-hearted, and firm of purpose.”

 

For the rest of his first presidential term, FDR continued in a like manner with his New Deal policies. The results of the 1936 presidential election (FDR won 46 out of 48 states) testify to the citizens’ approval.

 

Adolf Hitler became chancellor of Germany at the end of January 1933, a little over a month before FDR became president. Although, unlike Roosevelt, he at first had to contend with another executive authority, President Paul von Hindenburg, it was Hitler who dominated. Historian Peter Fritzsche’s new book, Hitler's First Hundred Days, provides a timely analysis of that period, most of which ran concurrently with FDR’s first Hundred Days.

 

Prior to the Great Depression, Hitler’s Nazi Party had been a small one. In the summer elections of 1932, however, it won 230 out of 608 seats, making it by far the largest of the many parties in the Reichstag. Although the miseries of the Depression certainly helped the Nazis’ rise, Fritzsche stresses other factors that aided them. For example, he often quotes the letters of Elisabeth Gebensleben, wife of a conservative deputy mayor in a Saxon town. Fritzsche states that “the crowds of unemployed Communists who gathered in city streets” frightened Elisabeth most, and that “middle-class Germans shared . . . [her] fears.” By 1932, she had concluded that “only Hitler’s National Socialists could protect Germany from the Communists.”

 

Fritzsche also indicates that Hitler and the Nazis “mined the recent past in a different way.” They emphasized an idealized pure German community and depicted the Weimar Republic, established in 1918, as an alien institution. Many Germans accepted their views on “community, nation, and race” and their methods of making Germany great again.

 

Fritzsche further points out that “the Nazis resolved the paradox of promoting national unity by dividing the country. They did so by promulgating a binary worldview,” a we-vs-they approach: “pitting patriotic Germans against subversive Communists, Aryans against Jews, the healthy against the sick, the Third Reich against the rest of the world.”

 

In late February, Hitler blamed a Reichstag-building fire on German communists and convinced Hindenburg to sign an emergency decree, supposedly to protect Germany against communist violence. It indefinitely suspended due process of law, while also serving as an excuse to persecute communists. In Reichstag elections in early March 1933, the Nazi vote increased to 44 percent of the total. The communist vote declined, and none of their deputies, fearing arrest, took their seats in the new Reichstag. Soon afterward, Hitler established a new Ministry of Public Enlightenment and Propaganda. Later that month, using threats and promises, Hitler convinced Reichstag deputies to agree to the Enabling Act, which transferred legislative power to Hitler and his cabinet and allowed him to suspend parts of the Weimar constitution. In addition, the Nazis appointed officials who helped them take over federal states like Prussia and Bavaria, as well as local governments. 

 

Furthermore, as Fritzsche writes, the Nazis “dismantled the trade unions, coordinated many of the institutions of civic life, and promulgated laws denying German Jews equal rights as citizens.” On May 10, the 101st day of Hitler’s chancellorship, Nazi students organized the burning of “unpatriotic” books. The fact that German unemployment eventually began declining, partly as a result of German rearmament, concerns Fritzsche less than the motives of Hitler and his supporters.

 

Fritzsche succinctly explained the contrast between FDR and Hitler in an interview:

 

Roosevelt’s 100 days were an imaginative and improvised effort to restore confidence and put Americans back to work through government legislation. . . . Hitler’s 100 days were to consolidate power around his party, which then spoke for the nation at large. . . . Roosevelt spoke in an inclusive voice, especially when he addressed Americans in fireside chats; Hitler divided Germans into friends and foes, and promised a final reckoning with enemies. Hitler and his conservative allies wanted to smash the Weimar Republic, not save the fiscal or economic ship of state. 

 

Like FDR and Hitler dealing with the horrors of the Great Depression, Donald Trump now confronts a great crisis. FDR eased people’s fears, generally spoke the truth, acted with vigor, sought to cushion people from absolute economic calamity, sought advice from creative and varied sources, spoke with an “inclusive voice,” had good relations with the press, and came across on the radio as “patient, kind-hearted, and firm of purpose.” Hitler’s main concern, as Fritzsche tells us, was not improving the conditions of the German people, but strengthening his own power by dividing “Germans into friends and foes” (communists, Jews, defenders of the Weimar Republic, etc.). Which previous leader sounds most like Trump?

 

To date, Trump has eased few people’s pandemic fears. Truth has long been an alien concept to him, and two recent articles, Frank Rich’s “Trump Lies His Way Through a Pandemic” and James Fallows’ “Trump Is Lying, Blatantly,” indicate that it remains so.

 

He has not acted with vigor. Recent reports suggest an Obama-era National Security Council document intended as a pandemic response playbook has gone unread. Meanwhile, as Fallows stated on March 17, for weeks Trump had “been mocking the virus threat—at rallies, in tweets, and in press remarks. [“We have it totally under control,” he said during a Jan. 22 interview.] But both yesterday and today, he’d suddenly shifted to warning that the public-health and economic problems were real, and would remain so for a long time.” Such shilly-shallying has impeded bold actions and created increasing anxiety. Rather than relying on creative and varied reliable sources, Trump, fearing independent thinkers, has surrounded himself with yes-men (few women) and is becoming increasingly impatient with such truth-tellers as Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases since 1984.

 

Unlike FDR, Trump has seldom spoken with an “inclusive voice” and has poor relations with reporters. During a March 20 press conference, he told Peter Alexander of NBC, “you’re a terrible reporter” and accused him, as well as NBC and Comcast, of sensationalism. He also lamented, as he often has, “fake news.” At a March 21 press conference, after a reporter asked him about a Washington Post story criticizing him for insufficient action on the virus in January and February, he stated that “the Washington Post covers . . . me very inaccurately. . . I think it’s a disgrace.” Trump’s tweets, his main form of communication, are divisive, rife with insults, and the opposite of FDR’s “patient, kind-hearted” fireside chats.

 

On March 21, Peter Baker in the New York Times wrote, “Mr. Trump’s performance on the national stage in recent weeks has put on display” such traits as his “profound need for personal praise, the propensity to blame others, the lack of human empathy, the penchant for rewriting history, the disregard for expertise, the distortion of facts, the impatience with scrutiny or criticism.” In the same paper, the following day, David Leonhardt’s “How Trump Is Worsening the Virus Now” appeared. And on March 25, a conservative contributor to The Atlantic, Peter Wehner, wrote one of the most insightful and damning critiques to date of Trump’s coronavirus-crisis failings.

 

A March 18-22 poll on Trump’s handling of the crisis reflected a deep divide on the question between Republicans and Democrats. On March 25 the Trump administration and the U.S. Senate agreed on a $2 trillion relief package, and House Speaker Nancy Pelosi indicated the House of Representatives would soon add its approval. To what extent this package will help Trump’s popularity remains uncertain. Despite such uncertainties, however, Trump’s personality and actions thus far in the present crisis do not inspire confidence. In his narcissism, his primary concern with strengthening his own power, and his we-versus-they view of the world, Trump resembles Hitler more than FDR. 

 

 

Historian William McNeill Warned in 1976 that a Mutated Flu Virus Could Cause a Pandemic

 

William McNeill, the eminent University of Chicago historian, won a National Book Award for The Rise of the West in 1964. But it is his 1976 book Plagues and Peoples that many readers today will find eerily prophetic. 

Summarizing his research in the last pages of the book, he observed that while 20th century medicine had effectively controlled a number of deadly diseases (e.g. smallpox, polio), world leaders had grown complacent and the resulting surge in world population had created an “extraordinary ecological upheaval.” 

He warned readers that “infectious disease which antedated the emergence of humankind will last as long as humanity itself… and will remain… one of the fundamental determinants of human history.”

He even pinpointed the most likely cause of the next pandemic: a mutated form of the influenza virus. He noted that the influenza virus was not only highly infectious, but “is unstable and alters details of its chemical structure at frequent intervals,” which prevents humanity from acquiring any long-term immunity. Thus, human populations are dependent upon researchers continually developing new influenza vaccines which must be mass produced in a hurry.  

McNeill’s major achievement in Plagues and Peoples was to incorporate developments in microbiology, anthropology and archeology, fields usually operating in silos, and synthesize them in a popular world history, the first nonfiction book to identify disease as a primary shaper of world history. Although McNeill’s research was conducted six decades ago—in an era before personal computers, DNA sequencing and the discovery of the Lucy skeleton—many of his conclusions remain valid and seem all the more prescient. 

One of McNeill’s insights in Plagues was to describe the relationship between “microparasites” (e.g. bacteria, viruses) and “macroparasites” (e.g. rats, humans) which exist in a constantly evolving relationship. McNeill theorized that humankind itself was a type of disease or microparasite on its host, the Earth. He warned that if humans put too great a stress on that host, it could self-destruct. Indeed, McNeill formed this hypothesis, which presages arguments about climate change, in the early 1970s, when the concern about rising greenhouse gases was limited to climatologists.

Before McNeill’s book, many sweeping or synthetic histories of western civilization were based on political or social movements or used the Great Man theory to explain the rise and fall of empires. Epidemics were seen as trivial episodes compared with military battles or charismatic monarchs. Perhaps this reflected the 20th-century vantage point of many historians. In a 2010 magazine profile of McNeill, Dr. Donald Hopkins of the Carter Center in Atlanta observed that with smallpox, polio and other diseases virtually eradicated, “it was part of the hubris of the late 20th century to say we’d taken care of infectious diseases… we’ve got them under control.” 

McNeill, who died in 2016 at age 97, recounted in the introduction to Plagues how he conceived of the book’s central thesis. He was deep into researching The Rise of the West when he read an account of how Hernan Cortez and his small band of Spanish warriors conquered the Aztec Empire. The Aztecs were skilled warriors who had beaten the Spanish in an initial battle, but when Cortez returned four months later, he faced no opposition; smallpox had wiped out half the population and most of the Aztec leadership. 

Since the “entire history of the new world hinged on Cortez’s gamble” and the Spaniards’ group immunity to the new epidemic, McNeill decided to research the role of disease in the collapse of other civilizations. His book went on to link Sparta’s defeat of Athens, the decline of the Roman Empire and the fall of the Han dynasty to populations decimated by epidemics.

Plagues and Peoples was published to positive reviews and major sales. It has been in print ever since and has led to many other books incorporating biological interpretations of world history, such as Jared Diamond’s widely read 1997 book Guns, Germs, and Steel. (In the New York Review of Books, McNeill praised Diamond’s book but questioned his emphasis on technology over human agency.)

While McNeill was a historian capable of great conceptual leaps, he did not work in isolation; he was influenced by two historians who preceded him. 

As an undergraduate at the University of Chicago, he had immersed himself in the first three volumes of A Study of History by Arnold J. Toynbee. Toynbee, a professor of international history at the London School of Economics, argued that all civilizations rose and fell in cyclical patterns. For example, he drew parallels between fifth-century Greece and the early 20th century. Toynbee’s work eventually spanned 12 volumes and looked at patterns of success and failure in 28 separate civilizations.

McNeill also read Hans Zinsser’s Rats, Lice and History, published in 1935. Zinsser was a microbiologist at Harvard University, and his book (as the title indicates) focused on epidemics of typhus and bubonic plague, spread by lice and by rat fleas, respectively. Zinsser’s book was based on the science of epidemiology but written in a witty, picaresque style. One chapter was titled, “On the louse: we are now ready to consider the environment which helped to form the character of our subject.”

William McNeill went on to author a total of 23 books including The Pursuit of Power: Technology, Armed Force, and Society since A.D. 1000, Arnold J. Toynbee: A Life, and The Human Web: A Bird's-Eye View of World History, the latter co-authored with his son, J. R. McNeill, a professor of environmental history at Georgetown University.

It is interesting to speculate what William McNeill would have thought of our current coronavirus pandemic. One thing is certain: he would not have been surprised. As he warned in Plagues and Peoples, humankind remains susceptible to epidemics, and the constantly mutating flu virus is the most likely threat.

In a later work he suggested that there might be a law of “conservation of catastrophe,” under which each successful civilization adapts itself to one or more disasters. This adaptation gives leaders a false sense of security, but it also raises the probability that a new, unforeseen ecological disaster will destabilize the civilization.

McNeill was, however, optimistic overall about mankind’s ability to adapt. In The Rise of the West he said that human intelligence has proven again and again that it is capable of overcoming catastrophe. He added that “in human society… belief matters most” and that the values and stories we employ to make sense of life are “often self-validating.” 

The Coronavirus Pandemic, Like Other Global Catastrophes, Reveals the Limitations of Nationalism

 

We live with a profound paradox.  Our lives are powerfully affected by worldwide economic, communications, transportation, food supply, and entertainment systems.  Yet we cling to an outdated faith in the nation-state, with all the divisiveness, competition, and helplessness that faith produces when dealing with planetary problems.

As we have seen in recent weeks, the coronavirus, like other diseases, does not respect national boundaries, but spreads easily around the world.  And how is it being confronted?  Despite the heroic efforts of doctors, nurses, and other medical personnel, the governments of individual nations have largely gone their own way―some denying the pandemic’s existence, others taking fragmentary and sometimes contradictory steps, and still others doing a reasonably good job of stemming the contagion.  The UN’s World Health Organization (WHO) should be at the center of a global campaign to contain the disease.  But its early warnings were ignored by many national officials, including those of the U.S. government, who rejected the WHO’s coronavirus testing kits.  Moreover, the WHO has limited funding―more than three-quarters of which now comes from voluntary contributions rather than from the dwindling assessments paid by individual nations.  Undermined by parochial national concerns, the WHO has been less effective in safeguarding the health of the world’s people than it could have been.

Similarly, the unfolding climate disaster presents a stark contrast between a worldwide problem and the behavior of national governments.  The world’s leading climate scientists have concluded that urgent changes are needed by 2030 to rescue the planet from irreversible climate catastrophe, including extreme heat, drought, floods, and escalating poverty. And yet, despite an upsurge of social movements to save the planet, national governments have been unable to agree on remedial action, such as sharp curbs on fossil fuel production.  Indeed, two of the biggest oil producers―the Russian and Saudi Arabian governments―are currently opening the spigots in an oil production war.  For its part, the U.S. government has turned sharply against the solar power industry and is heavily subsidizing the fossil fuel industry.  This national irresponsibility occurs despite the urgent pleas of UN leaders.  “The point of no return is no longer over the horizon,” UN Secretary-General Antonio Guterres told reporters in late 2019.  “It is in sight and hurtling toward us.”

Warfare, of course, constitutes yet another problem of global dimensions.  Over the centuries, war has shattered countless lives and brought human civilization to the brink of annihilation.  It is estimated that, during the 20th century alone, war (including two world wars) caused 187 million deaths, plus far greater numbers of injuries, widespread devastation, and economic ruin.  Furthermore, nuclear war, unleashed in 1945 as the culmination of World War II, today has the potential to wipe out virtually all life on earth.  And how are individual nations preparing to avert this global catastrophe?  By getting ready to fight wars with one another!  In 2018 (the last year for which figures are available), world military expenditures rose to a record $1.8 trillion, with the governments of the United States and China leading the way.  Ignoring the 2017 UN Treaty on the Prohibition of Nuclear Weapons, the nine nuclear-armed nations, at enormous cost, are currently busy ramping up their nuclear production facilities and producing a new generation of nuclear weapons.  In response to the looming nuclear menace and climate catastrophe, the editors of the Bulletin of the Atomic Scientists recently reset the hands of their famous “Doomsday Clock” at an unprecedented 100 seconds to midnight.

Nor are these the only global threats that the nation-state system has failed to adequately address.  Among other things, the world is undergoing a refugee crisis of vast proportions, suffering from the predatory policies of multinational corporations, and experiencing widespread poverty and violations of human rights.  Do we really think that the current crop of flamboyant, flag-waving nationalist leaders, busy promising to make their countries “great” again, is going to solve these or other global problems?

Of course, for centuries there have been great ethical, intellectual, and political leaders who have sought to move beyond nationalism by emphasizing the common humanity of all people.  “The world is my country,” declared the adopted American revolutionary Tom Paine, and “all mankind are my brethren.”  Albert Einstein dismissed nationalism as “an infantile disease,” while British novelist H.G. Wells, like Einstein, became a staunch advocate of world government.  The idea of limiting national sovereignty in the interest of global security helped spark the creation of the League of Nations and, later, the United Nations.  

But, unfortunately, the rulers of numerous countries, though often paying lip service to international law and international security, have never accepted significant limitations on their own government’s ability to do what it liked in world affairs. Thus, major military powers hamstrung the League and the United Nations by refusing to join these world organizations, withdrawing from them, vetoing or ignoring official resolutions, and refusing to pay their annual dues or other assessments.  A particularly flagrant example of contempt for global governance occurred in mid-March 2020, when U.S. Secretary of State Mike Pompeo ridiculed the International Criminal Court and threatened its staff (and even their family members) for daring to investigate U.S. war crimes in Afghanistan.

Thus, although robust and capable global governance is now more necessary than ever, a primitive, shortsighted nationalism continues to frustrate efforts to come to grips with massive global problems.  

Even so, an extraordinary danger presents humanity with an extraordinary opportunity.  The coronavirus disaster, like the other current catastrophes ravaging the planet, might finally convince people around the globe that transcending nationalism is central to survival.

The Polarized, Partisan Pandemic

 

There’s nothing like a crisis to bring out in bold relief the differences between left and right. The coronavirus bailout passed by Congress and signed into law just a couple of days ago presented a facade of cooperation between Democrats and Republicans: the bill passed the Senate 96 to 0. But I have been struck by how differently the two parties have approached their responsibilities to Americans and America in this unprecedented medical and economic disaster. There has been a nearly blindered media focus on Trump and the lying incompetence with which he has proposed one bad idea after another, failed to do what everyone thinks should be done, and then bragged about it. It’s worth looking beyond him to the political struggles across the country to save lives and win votes.

 

It’s important to see that the coronavirus has affected Democratic states much more heavily than Republican states thus far, because Democrats control urban states where the virus struck earlier and more rapidly. That partially explains the partisan differences in response to the pandemic at the level of state governments. The first states to issue statewide stay-at-home orders were California (March 19), Illinois (March 21), New Jersey (March 21), and New York (March 22), all states with Democratic governments and large urban populations.

 

The next wave of statewide orders between March 22 and March 29 included 22 states: New Hampshire, Vermont, Massachusetts, Connecticut, Rhode Island, Delaware, Kentucky, West Virginia, Ohio, Indiana, Michigan, Wisconsin, Minnesota, Louisiana, New Mexico, Colorado, Montana, Idaho, Oregon, Washington, Hawaii, and Alaska. This group includes 8 states under Democratic control, 10 where the state government is mixed, and 4 Republican states.

 

The final group over the past couple of days includes 3 Republican states, Arizona, Kansas and Tennessee; 2 mixed states, North Carolina and Maryland; and 1 Democratic state, Virginia. Still with no statewide orders are 2 Democratic states, Maine and Nevada; 2 mixed states, Iowa and Pennsylvania; and 14 Republican states across the South and West. Summarizing: only 2 of 15 Democratic states and 2 of 14 mixed-government states do not have statewide orders, but 14 of 21 Republican states lack them.

 

Within states without statewide orders, there are many counties or cities where local stay-at-home orders have been issued. Again, these tend to follow partisan differences. In heavily Republican Mississippi, the only municipality to issue a stay-at-home order was Oxford, home of the University of Mississippi, whose mayor is a Democrat. The 6 states where no jurisdiction has issued an order, as identified by the NY Times, include 4 of the states which voted most heavily for Trump in 2016.

 

Some people have gotten news coverage for their seeming indifference to reasonable precautions and other people’s health.  We might call them outliers on the spectrum of responses. Oklahoma Governor Kevin Stitt faced wide criticism after he tweeted a photo of himself and his family at a crowded restaurant on March 14. The next day he declared a state of emergency for Oklahoma. Pastor Tony Spell in Baton Rouge defied the state’s orders about social distancing to hold massive services twice last week. He told a reporter that he is not concerned about his congregants contracting the virus. “The virus, we believe, is politically motivated.” Devin Nunes, Congressman from California, urged Americans to go out to eat on March 15: “it’s a great time to just go out, go to a local restaurant.” Kentucky Congressman Thomas Massie, an opponent of the stimulus bill that was just passed, forced many representatives to travel to Washington to vote for it, earning even Trump’s criticism. Within the media, FOX News is an outlier, because of the lack of concern about the spreading virus broadcast by some, not all, of its stars. Sean Hannity, Laura Ingraham, and Trish Regan downplayed the dangers and blamed Trump’s opponents for whipping up unnecessary “hysteria”. That is, until Trump declared a national emergency, and they changed their tune. All of these outliers are Republicans.

 

Meanwhile, the most politically active Democrat has been New York Governor Andrew Cuomo, whose daily media briefings have displayed constantly updated statistics, careful reasoning, and concern for the victims of the disease. His briefings have been broadcast live by the major news networks, making him a media star. He is exhibit A for what government can do and should do in a crisis.

 

The background of these widely differing political responses is the gulf between Republican and Democratic voters in their views of the pandemic. A Gallup poll in early March showed that 42% of Republicans were very or somewhat worried about the virus, versus 73% of Democrats. An NPR/PBS NewsHour/Marist Poll in mid-March showed that 54% of Republicans, but only 20% of Democrats, thought the coronavirus threat had been blown out of proportion. The demographic groups with the greatest allegiance to Trump are the same ones that have taken the fewest precautions to prevent the spread of the virus: white males without a college education and people from small towns and rural communities.

 

These partisan differences reflect the circular interaction among mutually reinforcing causes: the early virulence in a few cities and the lack of cases in rural areas; the suspicion among Republicans across the country of the “elites”, the medical professionals who have provided accurate information and warnings for months; and the official Republican messaging, led by Trump, that there was nothing to worry about.

 

Less easy to explain is why the recent sharp reversal in Trump’s message has not led to skepticism among his supporters. After suggesting that everything would be over by Easter, Trump on Sunday said that 2.2 million people might die unless preventative measures are taken. “And so if we can hold that down, as we’re saying, to 100,000, that’s a horrible number, maybe even less, but to 100,000, so we have between 100 and 200,000, we all, together, have done a very good job.” In China, there have been fewer than 3,500 deaths. Worldwide the death toll just passed 40,000. Thus his new message is that if 200,000 Americans die, he, Trump, has “done a very good job”.

 

And Republicans will believe that.

 

Steve Hochstadt

Jacksonville IL

March 31, 2020

Life during Wartime #502

Roundup Top Ten for April 3, 2020

During the Covid-19 Pandemic, Immigrant Farmworkers are Heroes

by Eladio Bobadilla

Tracing Cesar Chavez's transforming views on immigration may shed light on how we can support farmworkers’ rights today.

 

For More Than A Century, Americans Fine-Tuned The Rules Of Democracy. Why Have We Stopped?

by Gregory P. Downs

The United States has survived not by keeping the same system but by transforming its rules at crucial moments.

 

 

The Internet Archive Chooses Readers

by Karin Wulf

To elevate the needs of the reader above all others is to dismiss the labor of archivists, authors, compositors, designers, editors, librarians, marketers, metadata creators, and all the other myriad people involved in bringing knowledge into being and into the marketplace.

 

 

The Cult of the Shining City Embraces the Plague

by Jared Yates Sexton

Those who see Trump as a messianic figure believe the coronavirus will put a fallen world right again.

 

 

Sanctions Are Inhumane—Now, and Always

by Aslı U. Bâli, Aziz Rana

In a world imperiled by global pandemic, it is long past time to put an end to sanctions—including new ones against Iran—and to reconstruct U.S. foreign policy around international solidarity.

 

 

When Americans Fell in Love with The Ideal of ‘One World’

by Samuel Zipp

In 1943, failed presidential candidate Wendell Willkie advanced a strikingly anti-racist, anti-colonial plan to bring the planet together.

 

 

Why Politicians Can’t Stop Talking About “Folks”

by John Patrick Leary

Most politicians are playacting: a privileged cohort of other-than-real Americans desperately trying to convince a mass following that they are, indeed, just plain folks.

 

 

Reality Has Endorsed Bernie Sanders

by Keeanga-Yamahtta Taylor

American life has been suddenly and dramatically upended, and, when things are turned upside down, the bottom is brought to the surface and exposed to the light.

 

 

The Religious Right’s Hostility to Science Is Crippling Our Coronavirus Response

by Katherine Stewart

Trump’s response to the pandemic has been haunted by the science denialism of his ultraconservative religious allies.

 

 

The Trouble with Triscuits

by Charles Louis Richter

Where did the name of this popular snack come from? An exercise in historical reasoning.

 

HNN Introduces Classroom Activity Kits

As part of a new initiative to merge the worlds of education and journalism, HNN is introducing a series of Classroom Activity Kits.

 

These kits, all crafted by undergraduate students at George Washington University, are designed to illustrate the relationship between current events and history. Each kit consists of a complete 45-60 minute lesson plan which requires little to no preparation on the part of the instructor. These lessons are designed for high school and college students, but they can be altered to suit other education levels.

 

During these lessons, students will use news articles as a tool for understanding a specific history. In doing so, students will engage in critical thinking, group work, text analysis, research and more. This first batch of activity kits covers the history of U.S. immigration, climate change, sports activism, and the U.S. prison system.

 

Aside from the educational benefits to students, these kits are also a great resource for educators. All you need to do is click one of the download links below and boom! – you have a fully formed unique activity.

 

HNN’s Classroom Activity Kits are part of a continuing effort to bring news into the classroom.

 

Click on each link below to learn more and download the activity kit. 

 

 

Classroom Activity Kit: The History of Climate Change

What do farmers from the 1950s, anti-smoking campaigns and climate change have in common? Download this Classroom Activity Kit to find out.

 

Classroom Activity Kit: The History of U.S. Immigration

This Classroom Activity Kit teaches students about U.S. immigration history while also highlighting their personal histories.

 

 

Classroom Activity Kit: The History of Sports Activism

Discussing athletes from Jackie Robinson to Colin Kaepernick, this Activity Kit teaches students about the history of political activism in sports.

 

 

Classroom Activity Kit: The History of Private Prisons in the U.S.

Download this Classroom Activity Kit to teach students about the history of the American prison system.

 

 

Classroom Activity Kit: The History of Climate Change and Activism

Download this Classroom Activity Kit to help students understand climate change activism in its historical context.

 

Getting Medieval on COVID? The Risks of Periodizing Public Health

 

 

Since the outbreak of COVID-19, many observers have described fighting the pandemic in terms of adopting either “medieval” or “modern” approaches, most recently in a New York Times op-ed. But that dichotomy is a false one, and a dangerous one as well. First, from a historian’s perspective, distinguishing between modern and premodern is a facile way to periodize. Applying a medieval/modern binary to public health, or any other past social practice, implies a normative path towards universal best practices. It also promotes a sense of superiority over earlier societies, on the one hand, and over present-day societies framed as living in the deeper past, on the other. As scholars working across medieval and (post)colonial studies have shown, designating a cultural practice as medieval (or modern) legitimizes hierarchies, inequalities and even violence, rather than providing a nuanced view of the human past. Second, from a preventative-health perspective, mitigating pandemics’ impact hardly demands a choice between non-biomedical (“medieval”) solutions and advanced and rational (“modern”) ones. As recent campaigns against HIV/AIDS and Ebola demonstrate, success requires thinking about how best to combine diagnostic, preventative and curative measures, since different admixtures fit different environmental, administrative and political contexts. Imagining medieval and modern solutions as antithetical blunts this contextual awareness and can put certain communities at greater risk by ignoring their specific needs.

 

Italy’s current struggle against COVID-19 is a case in point. At the time of writing, nearly one-fifth of all reported cases and one-third of all confirmed deaths had occurred there, especially in the peninsula’s industrialized north (Lombardy, Emilia-Romagna and Veneto), making it so far the most stricken large nation in the world. The historical irony could not be starker, as it was Italy’s inhabitants who led Europe in medical innovation for nearly a millennium, starting with the reception, from the Arabic-speaking world, of the Hippocratic and Galenic medical corpus. Current analyses chalk up Italy’s vulnerability to a combination of a sizable elderly community, multigenerational households, high rates of smoking, polluted cities, and a heavy dose of bad luck. Italy also reacted complacently and belatedly to the virus’ spread, but that was, and remains, hardly unique. Without firm instruction and enforcement, many Italians, like most humans, resisted adapting to a new reality for as long as they could. 

 

But when central and regional governments finally drew the line, they rolled out a set of preventative measures that local cities had been practicing at least since such towns proliferated in the twelfth century: social distancing, quick burials, closed gates, limited movement, scrupulous market supervision, and rapid communication to keep others abreast. And, while many commentators imagine the second plague pandemic (aka the Black Death, 1346-53) as a trigger for consolidating public health power analogous to modern state action, I have found that these measures spread much earlier in local communities. Rather than the calamity of the Black Death, the normal pressures of urban life taught people that these measures would help them cope with living in hazardous environments.

 

To be sure, when plague pathogens (Yersinia pestis) entered Europe in the mid-fourteenth century, likely through several Italian ports, they dealt the region a major blow, as they did the Middle East, and likely East Asia and Sub-Saharan Africa. The disease killed at least a third of Europe’s population, even as it spawned a new genre of prophylactic literature and preventative measures such as the enclosure of people and goods now known as quarantine (from the Italian quaranta, “forty,” after the forty-day isolation imposed on vessels arriving from reportedly afflicted areas). The Black Death’s significance in contemporary thinking about contagion reflects the influence of the modern/premodern binary. In particular, the belief that the plague spread in Europe due to hygienic apathy has led many to invoke it as a quintessentially medieval—that is, uncivilized—event, caused or at least exacerbated by the supposed backwardness of its victims.

 

This line of thinking can similarly constrain addressing contagion today. Consider the emergence of COVID-19, reported to have begun in Wuhan, China in late 2019. New cases soon appeared in South Korea and Iran. This allowed Euro-American observers in January and early February 2020 to compare conditions abroad to the onset of the Black Death and deride other governments’ slow and authoritarian responses (or the threat of applying them in the West) as a return to the Middle Ages. However, as the global scale and scope of the infection became evident, and with Italy taking center stage, Western commentators changed their tone. Although early accounts speculated whether “medieval” or “modern” approaches would serve Italians—and by extension Europeans—best, media discourse soon began juxtaposing “strict” and “liberal” paradigms instead. Perhaps the final blow to any tendentious alignment of certain solutions with particular eras came when China and Cuba headed relief missions to Italy, visibly outperforming efforts by the European Union.

 

By mid-March, democratically elected governments in developed nations began imposing unprecedented peace-time restrictions, including tracking infected people’s phones, and suggested an entirely different framing of contagion. Rather than construing pandemics as medieval aberrations, leaders stressed their inevitability: health crises were not borne out of hygienic ignorance and incompetence but were part of the human condition and its fragile socio-political order, which would now be maintained by complying with new regulations. A conceptual barrier was thereby removed between the medieval, unhygienic past and the modern, hygienic present. Monty Python bringing out their dead—and almost dead—came to have a different ring to it, as Italian morgues and cemeteries literally ran out of space. Yesterday’s Draconian law became today’s “aggressive measure.” Quarantine, now carefully distinguished from social isolation, lost its pejorative undertones, aided by posts on quarantine hacks from hip social media influencers. Unity and conformity, not innovation and economic viability, became the paramount concern in the war on COVID-19, even among avowed free-marketeers such as Emmanuel Macron of France and Mark Rutte of the Netherlands. Donald Trump, having initially played down the crisis, proudly described comprehensive travel bans, and the widest rollout of quarantine measures in modern history, as a rational way to put (a certain) America First and beat the “Chinese virus”.

 

Social distancing may still affront present-day sensibilities. Yet by and large the embracing of medieval—or, to be precise, non-biomedical—preventative health measures in Italy and elsewhere reflects public health professionals’ informed and practical approach. It is also a helpful nod at the power of history to educate and inspire new synthetic solutions to new cases of recurrent problems. For millennia, healthcare in Europe, the Middle East and elsewhere was a preventative pursuit, requiring basic hygienic education, clear communication and the efficient policing of infrastructures. Swapping vaccines and ICUs for curfews and quarantines may strike many today as insufficient under the juggernaut of global trade and transportation. The world is also far more urbanized in 2020 than ever in the past, and pandemics distinctly threaten the sort of sociability urban life requires. Yet for most people, coping with COVID-19 successfully will mean following the tried and tested techniques of group and personal hygiene. Broad consensus on the latter among public health experts came as bad news to “supply-side” economic elites, who may well benefit from medievalizing social isolation in order to delegitimize it. But, especially in places like the US, sending un- or poorly insured people back to work, and relying on biomedical measures when health systems for non-elites are utterly unprepared, is tantamount to calling for mass suicide.

 

Of course, as populations around the world follow low-tech recipes for disease prevention, trained professionals using high-powered computers are already rushing to develop vaccines. Central banks are scrambling to offer financial safety nets so we can go back to consuming what we were used to. And in obvious ways our social stability continues to rely on digital technologies and infrastructures (increasingly private ones) barely conceivable even fifty years ago. Returning to our immediate past may no longer be an option. Yet however we choose to move on, it does not involve either getting medieval or becoming modern. We were never either.

 

To learn more about past prophylactics, visit Premodern Healthscaping

Twitter: @Prosanitate

Filmmaking, Reality and Fact: How Documentaries Shape Americans' Ideas of Truth

 

Documentaries—screened, broadcast, and streamed—are more appreciated and influential today than ever before. It’s been said we live in a Golden Age of nonfiction filmmaking. At the same time, media phenomena like “alternative facts” and “fake news” lead some to suggest we live in a “post-truth” era. It’s a major challenge for viewers trying to gain an understanding of contemporary politics, and for those who study America’s past. 

How did we get here? Given that a sizable percentage of Americans get their sense of history through television documentaries, the history of nonfiction filmmaking offers insights.  How documentary filmmakers struggled to discover and convey a sense of the real world—succeeding, failing, and compromising along the way—provides a basis for understanding and appreciating the difficulties and importance of representing and determining evidential truth in times where facts are discounted, distorted, or ignored. 

Since the first movies in the 1890s, nonfiction filmmakers were driven by curiosity about the real world and changing times—and shaped by interactions with their audiences. Three documentaries focused on families mark changes in both filmmaking techniques and philosophical arguments about how film and television have represented reality during the past one hundred years.

By 1922, the first Dream Factories of Hollywood dominated international movie production, but a less-than-glamorous story about an Inuit family of seal hunters in the Canadian arctic was a surprise box office hit. Shot on location without trained actors, Robert Flaherty’s Nanook of the North is considered the first documentary film narrative. With moments of drama, suspense, humor, and poignancy—unlike previous travelogues—Flaherty mostly portrays Inuit people as complete human beings, not primitive sources of detached curiosity for his largely white audience, many with attitudes of racial superiority.  

Yet, as “realistic” as it appeared (and sought to appear), the truth of Nanook was affected, at least partly, by the filming process. Since the heavy hand-cranked silent cameras of the day limited Flaherty’s ability to respond spontaneously to changing action, he staged much of what he shot. Without the capability to record sound, narrative information and context are provided by title cards. 

At a time when distinctions between fiction and nonfiction films were yet to be defined, Flaherty still “cast” his characters. His “star,” a respected seal hunter, was not actually named Nanook; his real name, Allakariallak, would have been unpronounceable to English-speakers, so Flaherty dubbed him Nanook (Bear). The woman who plays Nanook’s wife, “Nyla” (Smiling One), wasn’t his actual spouse, but was chosen for her beauty. 

Notwithstanding these creative licenses, Flaherty intended to act as a cultural preservationist. He aimed to convey a sense of the Inuit as they lived before the influence of the white man.  Even though modern academic anthropologists disdain this as “salvage anthropology”—an approach that ignores native people as they are in order to convey an idealized vision of the past—Flaherty wasn’t making a deliberate attempt to deceive. Despite creative liberties, Inuit life as portrayed in Nanook represents the reality of a forbidding arctic environment where starvation was a true threat to survival. Facing criticism, Flaherty openly admitted, “sometimes you have to lie to tell the truth.” 

The idea that small facts can be sacrificed in service to larger truth remained a cornerstone of the craft of documentary film even as modern technology made it possible to record events without more obtrusive staging.

During the 1960s, the invention of light, portable cameras, more sensitive film stock, and easily synchronized sound lifted many of the restraints that had hindered Flaherty. The result was cinéma vérité, a new style and philosophy of nonfiction moviemaking that transformed portrayals of reality on film.  

Vérité pioneers like Bob Drew, Ricky Leacock, Al and David Maysles, D.A. Pennebaker, and Fred Wiseman sought to capture experience and environment, to give viewers a sense of “being there,” not necessarily to communicate factual information. They vehemently objected to authoritative narration, interviews, and the emotional guidance provided by a separate musical soundtrack. They wanted the audience to be actively involved in evaluating and determining the truth of what was shown, not to be told how to interpret what they were seeing. 

Building on the cinéma vérité tradition, the 1973 PBS/WNET series An American Family closely observed the lives of the upper middle-class Loud family of Santa Barbara, California.  While previous vérité films, like Drew and Leacock’s Primary, which focused on the 1960 Democratic presidential race, suggested a new kind of journalism, An American Family producer Craig Gilbert saw his project in terms of a sociological study. The series was recorded by a husband-and-wife camera and sound team, Alan and Susan Raymond, who virtually lived with the Louds during filming. Despite early sociological intentions, during the editing process the twelve-part series morphed into a kind of real-life soap opera, with portents of reality TV to come.  

As it turned out, along with familiar upper-middle-class activities, life with the Louds unexpectedly included adultery, frequent shots of vodka, and the uninhibited lifestyle of a son, Lance, who was proudly gay at a time when openly gay characters were missing from television and movies. He attracted critical mockery, but later proved to be an example of changing attitudes in the decades to come. As film scholar Jeffrey Ruoff wrote: “Lance Loud didn’t come out on American TV, American television came out of the closet through An American Family.”

As An American Family unfolded, the normally sedate public television audience was divided between fascination and disdain. For many viewers, although the Louds weren’t scripted actors, it was easy to see them less as real people and more like performers playing themselves. The result was a new kind of celebrity and questions that added to doubts about the veracity of screened reality. Did the filmmaking process affect the truth it purported to capture? 

Even with these questions, An American Family offers valuable, if filtered, historical evidence, not only about family life in the 1970s, but also the evolution of portrayals of the real world on film and television. Sadly, the outtakes of a popular TV show aren’t often considered serious source material for future study. Most of the three hundred hours of original footage shot by Alan and Susan Raymond was destroyed during a WNET housekeeping session.

If cinéma vérité purists scorn traditional documentary approaches like narration and interviews, found on television since the 1950s, Ken Burns, perhaps the best-known and most successful contemporary American historical documentarian, is proudly old-fashioned. Burns’s 2014 seven-part, fourteen-hour PBS family portrait The Roosevelts: An Intimate History engagingly traces the intertwined lives of Theodore, Franklin, and Eleanor, based on deep and careful archival research, informative narration by longtime Burns collaborator Geoffrey Ward, and the on-camera guidance of interviews with authority figures and eyewitnesses. 

As an evidential historian, Burns benefits from the evolution of nonfiction filmmaking genres between 1920 and 1950, especially newsreels. Teddy Roosevelt wasn’t the first president to appear on film. That was William McKinley in 1896. But TR’s physically flamboyant speaking style made him a compelling “picture man” on silent movie screens. 

Sound movies and radio were boons for FDR and Eleanor, providing a vivid sense of who they were in “real life” as well as the world they inhabited—impressions that are harder for print chroniclers to convey. Yet even though contemporary documentarians may access a wealth of facts and engaging archival imagery, questions about how documentaries are made, dating from the days of Nanook, continue, perhaps inevitably, to generate questions about truth on film.

Like his fiction filmmaking counterparts, Ken Burns considers himself a storyteller, “an historian of emotions,” emotions being “the glue of history.” Most viewers found The Roosevelts moving as well as informative. However, fairly or not, a sense shared by filmmakers and audiences that movies are an emotional medium has long been a burden for nonfiction filmmakers with informative intentions. Despite, or perhaps because of, Burns’s popular success, many academic historians are suspicious of film, considering it a source of impressions and feelings rather than the purported intellectual depth and rigor of the printed word.

Notably, Burns has been criticized for a tendency to view historical stories like the lives of the Roosevelts, and his most lauded effort, The Civil War, in terms of a series of trials that ultimately lead to an uplifting triumph over adversity—a narrative arc found in fiction. Although Burns doesn’t ignore social and political injustices, especially concerning race, as a skilled popular historian he doesn’t dwell on unfinished business that could disturb or challenge viewers who look to historical documentaries to provide a sense of closure.

Compensating for this, “open wounds” from the more recent past are often the subject of less-reassuring documentaries like The Central Park Five. An uncharacteristic Burns film, made in collaboration with his daughter Sarah and her husband David McMahon, The Central Park Five investigated the unjust conviction of a group of young African American and Latino teenagers accused of a brutal (and sensationally-reported) rape. 

The nonfiction film history I’ve sketched reveals the persistent limitations and compromises of documentary film as well as the medium’s expressive strengths. Just as academic historians bicker over differences of interpretation and the importance of archival minutiae without dismissing the idea of truth itself, serious documentarians remain committed to reflecting an honest sense of the real world.  

This is especially vital during a moment in society when people may toggle between believing what is staged and edited, cynically rejecting everything as fake, or selectively embracing evidence that affirms their beliefs. Polarized audiences can be tempted to dismiss documentaries as just another kind of entertainment—no more true than fiction films. 

Such attitudes are encouraged by the reality TV presidency of Donald Trump, whose administration is known for a dramatized attitude toward the real world, an entertainer’s ability to appeal to emotions, tweets that offer diverting plot twists, and attention-grabbing behavior that delights fans and outrages the less enamored. 

If this weren’t enough, there could be more uncertainty ahead. Technology has transformed the possibilities and supported new forms of nonfiction storytelling and will affect traditional historians as well. The emerging realm of “deepfake” imagery threatens to manipulate real images beyond detection, and artificial intelligence can target viewers with the stories they prefer to see, true or not.

Like all nonfiction, documentary films have always depended on credibility, even when the building blocks of moving images have been staged, selectively edited, or chosen above others for reasons of narrative. Of course, print historians face similar challenges in constructing narratives that marry facts with interpretations, rather than simply assembling encyclopedic lists of dates and names. Unlike a documentarian, a fiction filmmaker can have a flop and move on. For a creator of nonfiction films, losing audience trust is catastrophic.

Viewers can lose out, too. If they make choices about what they believe based on entertainment appeal or personal preference rather than thoughtful evaluation, even a Golden Age of documentaries can become counterfeit. American democracy—and an accurate appreciation of our past—will be the worse for it.

"Rogue" Manufacturing in China: Past and Present

 

China’s rapid ascent as an economic superpower at the turn of the twenty-first century has fueled considerable global anxiety. The Trump administration’s recent decision to adopt an aggressive, no-holds-barred tariff war with China is one such expression of concern. At the core of the unease is the sense that China does not play by the rules and engages in rogue manufacturing and industrial espionage. While copying and trade theft occur worldwide, China has become identified as the most egregious offender in the production of fakes and knockoffs, portrayed as unable to innovate and add productively and originally to the global economy. 

 

In China itself, however, copying is not necessarily unethical, nor mutually exclusive from innovation. Manufacturing products by copying preexisting brand name items is often associated with the term shanzhai. Originally literary in nature, referring to the mountain strongholds in which rogue heroes pursued extralegal justice, shanzhai is now translated as “knockoff” or “local imitation,” and is linked to the culture of underground factories located in manufacturing hubs such as Shenzhen that deliver local adaptations of brand products. Despite critics in the West raising alarm bells regarding Chinese counterfeits, Chinese leaders have come to celebrate shanzhai practices and promote the adaptation of imported technologies as crucial to “indigenous innovation.” Some Western journalists have even started to praise the culture of shanzhai in Shenzhen by likening the city to Silicon Valley and celebrating the area as a tech “nirvana.”

 

If shanzhai practices characterize China’s economic growth today, manufacturing that resembles shanzhai has a longer history. In the early twentieth century, unconventional and homegrown ways to build industry, which included drawing from global knowledge and engaging in strategic emulation, were avidly pursued. One particularly colorful individual engaged in such practices was Chen Diexian (1879-1940), a novelist and pen-for-hire, a professional editor/translator and dabbler in chemistry, and, eventually, a patriotic captain of industry. A new-style entrepreneur, Chen Diexian deftly navigated China’s early twentieth-century transition to industrial modernity. The fledgling republican state was weak and distracted, busy staving off internecine warfare and threats from abroad. Economic imperialism loomed large and Chinese entrepreneurs faced daunting challenges in developing native industry. Yet, while political chaos reigned and economic imperialism seemed insurmountable, the rise of vibrant treaty-port economies and modern print and light manufacturing industries also allowed unexpected opportunities to emerge. The decline of orthodoxy and tradition that had followed the fall of the empire meant that enterprising individuals could engage in new regimes of knowledge and pursue new endeavors. 

 

Taking advantage of the unprecedented opportunities, Chen Diexian was one such enterprising individual. He dabbled in scientific activity and developed commercial enterprises, both lettered and material. He translated texts on and explored regimes of chemical and legal knowledge, adapted foreign technologies, and openly pursued profit—activities once deemed unthinkable for respectable men in late-imperial China. Productive in the making and selling of words and things, Chen turned his writer’s studio into a chemistry lab, shared brand name manufacturing formulas as “common knowledge” in newspapers, and utilized proceeds from his romance novels to manufacture the incredibly popular “Butterfly Brand Toothpowder,” unique in its ability to double as face powder. By doing so in a moment when China was experiencing penetrative economic imperialism, Chen’s industrious activities constituted a form of “vernacular industrialism” that was local and “homegrown” (as opposed to foreign), informal and part of China’s consumer culture (rather than state-sponsored or academic-oriented), and artisanal and family-run, if eventually located in factories. 

 

A key element of Chen's unconventional route to entrepreneurship and vernacular industry building was a "Do-It-Yourself" maverick approach to manufacturing. As a self-proclaimed nativist not able to speak a single foreign language, Chen tapped into global networks of knowledge by employing practices of collaborative translation where fidelity was second to adapting texts to local concerns. He would then tinker with the translated technologies, improve foreign recipes, and present such adaptations as virtuous “emulation” crucial to the building of native Chinese industry. Chen’s iconic product, the Butterfly Brand Toothpowder, was ingeniously manufactured when Chen improvised a foreign manufacturing recipe by experimenting with local cuttlefish bones to source local calcium carbonate, a crucial ingredient. While an advocate of emulating foreign manufacturing formulas and technologies, he sought patents for his own recipes and gadgets, basing his claims of ownership not on original invention, but on improvement (gailiang), an approach that came to inform the National Products Movement, a “buy and manufacture Chinese goods” campaign. These practices were not examples of ignorance or deviousness, but instead demonstrate how copying, improvement and innovation were not always at odds. They also reveal the strategic agency of Chen, who despite being highly “local,” drew from far reaching circuits of law and science. 

 

Chen’s early twentieth-century vernacular industrialism can serve to remind contemporary observers that shanzhai practices of strategic emulation, hands-on tinkering, “open-source” know-how, assembling and incremental improvement to remake technology, all have a long history in China. By considering the two periods side by side, we can see how and why practices of DIY tinkering, copying, improving, and reassembling have come to be so closely associated with modern China and why they have come to be seen as “rogue” in global discourses. Republican-era vernacular industrialism and contemporary shanzhai manufacturing overlap insofar as they emerged during moments of China’s entry or reentry into global capitalism. 

 

Yet real differences between the two moments exist and reflect some of the variant political implications behind not just the manufacturing cultures but also the divergent place of China vis-à-vis global capitalism and the differences between capitalism in the two eras. In the earlier moment, the nascent Republic of China was struggling with what were unremitting economic and political imperialist pressures while being ripped apart by internal warfare and political fragmentation. Despite this inhospitable context, vernacular industrialists such as Chen were able to adapt new forms of industrial manufacturing that emerged with global developments in chemistry and physics to generate a patriotic and anti-imperialist National Products Movement. In this context, “rogue” practices for the purposes of import substitution were deemed necessary. The contemporary moment offers us a different iteration of the way China engages with global capitalism. Today’s shanzhai and counterfeit manufacturing emerges during a period when a strong post-socialist authoritarian state has been eager to reenter global capitalism and has adopted muscular policies that have enabled China to do this extremely successfully. Practices of shanzhai copying and adapting electronic and digital technologies of the current postindustrial global economy have been part of this success, helping fuel China’s ascent, even while generating anxiety among global competitors.

 

Both Chen’s vernacular industrialism in the early twentieth century and today’s shanzhai force us to rethink conventional narratives and normative ways of understanding ownership, innovation, and what constitutes industrial work and industrial development. For a Trump administration eager to demonize Chinese industry in pursuit of its trade war, the legal conceit that emulation precludes innovation is a convenient basis for castigating China for its purportedly flagrant disregard of intellectual property. Yet the history of shanzhai-style manufacturing reveals a more complex picture, in which innovation and emulation have often been inextricably bound. This history, moreover, converges with contemporary discourses in maker’s movements and start-up cultures worldwide that embrace open-sourcing and shared access to knowledge, similarly threatening the easy distinction between imitation and innovation. This all suggests that imposing any one set of norms on intellectual property in industry is likely to be a futile effort. Rather, what we might see is that as China’s economic power grows, intellectual property might end up looking more and more like shanzhai in the future.

DC Comics and the American Dilemma of Race

 

The sudden firing this past February of Dan DiDio as co-publisher of DC Comics continued a tumultuous era for the company. During DiDio’s decade-long tenure, DC’s comic universe frantically churned through four separate reboots/rebrandings–“The New 52,” “DC You,” “DC Rebirth,” and “DC Universe”–with a fifth reportedly in the offing for later this year. A consistent feature of this hyperactive course correcting has been an explicitly stated intent of building stories and heroes that more fully reflect the diversity of their real-world readers. Such an emphasis appears likely to continue. Speaking in the aftermath of the corporate turnover, now sole publisher Jim Lee promised an even brighter future grounded in superheroes who are “inclusive and diverse.”

It might be easy to think of this impetus toward inclusion within comics as a fairly recent phenomenon. But understanding diversity as only a 21st century preoccupation shortchanges what comic books have (and have not) been doing since Superman first flew onto the scene in 1938. Superhero popular culture, in fact, has always been embedded within American racial attitudes, reflecting and even contributing to them in ways that have set the stage for how we continue to grapple with these matters in 2020, especially in revealing that goodwill is not sufficient, in and of itself, to fix our problems.

Self-conscious explorations of racial issues in American comics date most fully from the late 60s and early 70s, when creators turned in earnest to the idea that they might use their medium to help build a more egalitarian world. No creators better embody this turn towards what would be termed “relevancy” in comics than Dennis O’Neil and Neal Adams, the creative team that produced the much-lauded “Green Lantern/Green Arrow” series. Wanting, as he later wrote, to do his part in the movement for civil rights, O’Neil used his four-color pulpit starting in 1970 to explore a range of issues, including urban and rural poverty, industrial exploitation, environmental degradation, overpopulation, and teen drug addiction. Alongside these, O’Neil and Adams also addressed race as the heroes encountered not only African Americans and Native Americans, but also the discrimination and marginalization such persons of color confronted on a daily basis. 

However, O’Neil and Adams’s work – despite its undoubtedly good intentions – reflected the limits of the liberal imagination of its time. Postwar liberalism offered grand visions of racial equality and harmony, but too often imagined the obstacle to these goals as discrete, misguided individuals rather than systemic inequities within U.S. society. Such an understanding readily translated to the good guy/bad guy duality in comics, and so Green Lantern and Green Arrow regularly dealt with corrupt slumlords and businessmen while leaving intact – if not entirely unacknowledged – the structural problems fomenting race-based discrimination and impoverishment. Too, the series often put the onus for change on nonwhites themselves, chiding them to, in essence, get their act together and/or accept benevolent white assistance, implicitly casting them as part of the problem rather than its victims.

If 60s and 70s comics were hemmed in by the liberal ideology prevailing within U.S. society, 80s and 90s comics found themselves trapped by the problematic understandings inherent within what would become known as “multiculturalism.” Nowhere is this seen more fully than in the super teams that developed during this era and would seem ready-made to promote inclusion. It turns out, however, that they fell prey to the limits of multiculturalism itself, which too often traded in superficial forms of inclusion as well as a flattening of nonwhite persons into racialized caricatures.

On the comic book page, inclusion often meant adding one nonwhite – and most often, black – member to an otherwise all-white lineup. The lauded relaunch of DC’s “New Teen Titans” by writer Marv Wolfman and artist George Pérez, for example, included the African American Cyborg as its only nonwhite member (setting aside, of course, the orange-skinned alien Starfire and green-skinned Changeling, neither of whom represent any real-world forms of racial difference). Other teams with wider-ranging diversity in their memberships traded in reductive stereotypes. The “Detroit-era” Justice League, for instance, added the Latino hero Vibe, who spoke in a stilted dialect and came from “the street,” as well as the African Vixen, who was as much defined by her sexuality as by her extranormal abilities. Even worse was DC’s “New Guardians,” an even greater conglomeration of stereotypical associations: an emotionless Japanese hero who was half human and half computer, a Chinese heroine with mystical abilities and an unrelenting sex drive, and a Latin American magician who likely could not have embodied more degrading stereotypes of homosexual men if his creators had tried.

If looking back reveals signposts marking the ways in which comics have long evidenced Americans’ limited success in addressing race, we might then wonder what DC’s hyperactive – if not hyperreactive – rebooting of its heroes suggests about U.S. society, particularly as these reboots have been inextricably linked to inclusion. Certainly, it reveals a well-intentioned will to build something better. But such will, as in the past, does not guarantee a better world, and Americans, not unlike DC, still struggle to enact real reform. As DC struggles to find solutions that are anything more than feel-good bromides, Americans remain contradictorily caught between the pleasant fiction of what we claim – and have always claimed – that this country represents regarding diversity, and the unpleasant realities of a president who brags about building a wall along our southern border, a government agency that separates immigrant families, and police officers who brutally slay young men simply because they are black. The result is a kind of paralysis: We remain hemmed in by the disjunction between our lofty ideals and the disturbing realities that goodwill and talk are insufficient to resolve. Until such realities are acknowledged, real change is no more likely than the fairy-tale happy endings that comics so often promise.

An Interview with "Most Wanted" Author Sarah Jane Marsh

 

 

Chelsea Connolly: While reading Most Wanted, it occurred to me that I couldn’t remember the last time I had read a children’s book, and your book reminded me of how unique a medium they are. Could you talk about what you enjoy most about being a children’s author?

Sarah Jane Marsh: I love the design challenge of crafting compelling nonfiction for children. I start with a vision of a story I’m fired up to share with kids. In Most Wanted, I wanted readers to understand why John Hancock and Samuel Adams were called out by name as troublemakers by the British government (reminiscent of our current U.S. president’s name-calling on Twitter). In the process, I’m whisking readers through ten years of American Revolution history...all in less than 2,500 words. It’s challenging to channel all this inspiration and research into a compelling story within the format of a picture book biography. I’m writing to engage my young reader, but I must also consider the needs of the illustrator, publisher, teacher, and librarian. And the magic of narrative nonfiction is that it reads like fiction yet is historically accurate. So one immediate challenge is employing fiction techniques, such as narrative arc, character, pacing, and dialogue, without straying from the facts. And word count is my toughest constraint. An ideal nonfiction picture book is less than 2,000 words. Each word has to hold its weight.

Chelsea Connolly: Children’s books are just as much a visual medium as they are textual. This puts a great deal of responsibility on the illustrator as well as the author. What is the relationship between an author and an illustrator when crafting a children’s book? How do you help and/or influence each other?

Sarah Jane Marsh: Interestingly, the author and illustrator don’t usually have direct contact. Our work passes through our editor and art director, who help orchestrate the story. But it is important for the illustrator to first create their vision based upon the completed manuscript. This begins with sketches that my editor sends to me for comment. It’s my job to weigh in if something doesn’t work historically. I do compile resources for our illustrator from my research: images and primary source descriptions of buildings, objects, clothing, etc. My publisher also employs a historian to fact-check both the manuscript and illustrations.

I also share my thoughts about overall theme and context. In both Most Wanted and Thomas Paine and the Dangerous Word, it was important for me to have a visual sense of “the people.” Although I use individuals such as Adams, Hancock, and Paine as a vehicle for the story, I want readers to understand the American Revolution was ultimately a mass movement of the people. My editor shared this insight with our illustrator of both books, Edwin Fotheringham, who crafted some amazing scenes such as the 5,000 Bostonians who gathered at Old South Meeting House on the evening before the protest now known as the Boston Tea Party. We also had fun with the final illustration by adding some famous folks of the American Revolution. 

Visual literacy is also an important skill for young readers. With picture books, the illustrations enhance the story in many ways, such as by adding context, explaining vocabulary, or expressing emotion. Ed brilliantly depicted the oppressive nature of the Stamp Act through a visual metaphor of a super-sized stamp falling from the sky as John Hancock obliviously sips his beloved Madeira wine. Ed added an incredible amount of historical detail in Most Wanted, especially impressive considering he’s from Australia! 

Chelsea Connolly: The causes of the American Revolution make for a very dense subject. How did you choose to tell this specific story?

Sarah Jane Marsh: My picture book biography on Thomas Paine focused on the personal story behind the famous pamphlet Common Sense. As a result, readers gain an understanding of how and why we declared independence. (Insider secret: underneath the jacket cover is a replica cover of Common Sense!) For Most Wanted, I was curious about General Thomas Gage’s 1775 proclamation pardoning all militia gathered outside Boston -- except for Samuel Adams and John Hancock. What trouble had these two men caused to be called out by name? In the process of writing this story, I realized I was creating a prequel, history-wise, to Thomas Paine. At first, I approached this story by focusing on the legendary hectic night inside the Hancock-Clarke house before the battles in Lexington and Concord. I spent a year writing this story. And my literary agent wasn’t wild about the resulting manuscript. But she was intrigued by the relationship between Hancock and Adams. So using that as my new lens, I expanded the story across ten years of revered (pun intended) history, taking readers from the Stamp Act through the protests in Boston, the fighting in Lexington and Concord, to Hancock and Adams’s triumphant entrance into Philadelphia for the Second Continental Congress, and ending with General Gage’s proclamation. This gave me the opportunity to compare and contrast these two leaders, and show the cause and effect of the events that led up to the fighting at Lexington and Concord. When the final book arrived on my doorstep, I was thrilled to discover that our editorial team had surprised me by recreating Gage’s proclamation underneath the jacket cover -- similar to our Paine book. A total delight!

Chelsea Connolly: The author’s note at the end of your book addresses how traditional narratives of the American Revolution often ignore and silence the populations who suffered greatly at the hands of the colonists. How did you incorporate this complexity in the book itself? How do you strike a balance between addressing nuance while also keeping the story accessible to children?

Sarah Jane Marsh: My author’s note, written closer to publication, reflects an awareness I didn’t have when I began writing Most Wanted in 2015. My focus was navigating the complexities of the confusion prior to the fighting in Lexington and Concord. As engaging as the story is, my ten-year history sticks to the traditional narrative that we are beginning to see as limited and one that omits the experiences of those who did not hold power at the time, such as women, African Americans, and Native Americans. In Thomas Paine we address the issue of slavery directly, as Thomas Paine did in his writings. In Most Wanted, the topic is alluded to in the illustrations, but not addressed in the text other than a hypocritical quote by Hancock declaring he “will not be a slave” in the presence of his enslaved servant. By listening to the deeply knowledgeable voices challenging our traditional narratives, I began to see how we have distorted our understanding of our history and of ourselves. My author’s note is my attempt to correct course, prompt discussion of the limitations of the text, and encourage critical thinking about these narratives, our history, and ourselves. There is an awakening happening in American history.
Books like Hidden Figures, Never Caught, Stamped From the Beginning, and the New York Times “1619 Project” are examples of how a new generation of historians is delving into our past to share silenced voices, experiences, and accomplishments. Howard Zinn was an early voice in this realm. Traditionally, our history has been told primarily by white males through their perspective. That is changing, and we are seeing the effects and gaining a better understanding of our shared history. Children’s publishing is also experiencing an awakening, thanks to the efforts of those pushing to address the inequality of representation in children’s books. Only 23% of children’s books depict characters of diverse backgrounds. And there are good discussions underway and groundbreaking books that address tough topics with kids. In nonfiction, nuance can be discussed more fully in the backmatter. Every child, especially those dealing with tough situations, deserves to be seen in a book. In many ways, we explain the world to ourselves and our children through our books. Storytelling is our most powerful medium for transmitting values.

Chelsea Connolly: Much of the work you do as an author can be translated to the work a teacher does in the classroom. As a former elementary and middle school teacher yourself, what advice would you give to educators who want to talk about these complex and often upsetting issues with their young students?

Sarah Jane Marsh: I think it’s important for educators to first develop their own cultural competency. Like learning itself, it’s a lifelong journey and one that I’ve recently begun through books, discussion, online resources, and trainings available to guide this growth. Cultural competency starts with a better understanding of self: an awareness of our own identity and how we’ve been socialized within our own culture to hold certain norms. For example, it was eye-opening to realize that the books I read as a child universally featured white main characters. This inherently reinforces a bias that the white experience is the norm. When you become aware of your biases, you can work against them -- for example, by reading and sharing authors outside your race, gender, culture, sexual identity, etc. As Steven Pinker wrote, “reading is a technology for perspective-taking.” It’s important to recognize these biases in our classroom and curriculum, understand how they negatively affect our students, and actively seek and share broader viewpoints. Doing this with your students can help develop their cultural competency as well. And culturally responsive teaching builds on that competency by seeking to understand the diverse cultures and identities of our students and incorporating them into our classroom. All students should see themselves represented and reflected in their learning at school. Most importantly, use available tools. Teaching Tolerance is a project of the Southern Poverty Law Center and has a wealth of resources for teaching hard history and how to sit with that discomfort. Welcoming Schools has tools to create LGBTQ- and gender-inclusive schools and to prevent bias-based bullying. Books like An Indigenous Peoples’ History of the United States and Stamped provide an eye-opening understanding of our history and have versions for teens, a powerful tool for the classroom. These resources also provide important frameworks. For example, how and why we teach about complex and upsetting issues matter. Students need to know about the violence and oppression in our U.S. history of enslaving other humans, but also the many acts of resistance by enslaved people and the beautiful cultures that survived this brutality. (Kwame Alexander’s picture book The Undefeated does this well with African American history and won the Caldecott Medal, a Newbery Honor, and the Coretta Scott King Award.) Similarly, Mexican and Native Americans experienced many of the same injustices and violence as African Americans, but this is rarely taught in the classroom. Celebrating these resilient, thriving cultures is an important part of the education we need to impart.

Chelsea Connolly: What do you hope that children take away after reading Most Wanted?

Sarah Jane Marsh: I hope that Most Wanted inspires readers to want to read and learn more about our nation’s history. The American Revolution is not always taught in elementary school, and I hope this book sparks a curiosity to learn more. My own fascination with the American Revolution was inspired by reading Laurie Halse Anderson’s picture book Independent Dames about the courageous women of that era. One book can be a doorway to further exploration.
Also, I hope that my author’s note prompts my young readers and the adults in their lives to engage in critical discussion about the book, this era, and our history as a nation -- and to seek out expanded viewpoints and fill the gaps in their own knowledge with more learning. Our history is fascinatingly complex and surrounds us still today.

Chelsea Connolly: Do you have any ideas in the works for future projects?

Sarah Jane Marsh: I am re-evaluating my role as a historian and storyteller for children. I want to use the agency I have in the publishing world to broaden our children’s understanding of our nation’s history and to help them think critically, engage in democratic discourse, and ultimately, to paraphrase Martin Luther King Jr., help bend the long moral arc of the universe toward justice. I’m working on another picture book that tackles the issues that I’ve been wrestling with as an author. It looks at America through a social justice lens to prompt conversation. I can’t tell if it’s brilliant or terrible, but I’m enjoying the creative challenge!

 

A Word to My Fellow Progressives: Lessons of the 2020 Democratic Primary

 

 

This year feels to many progressives like a lost opportunity. With Donald Trump in the White House and anger among disaffected groups running high, it should have been possible to nominate a very progressive candidate to be the Democratic standard bearer this November. In the face of the administration's overreach, latitude existed to move the center of American politics leftward, compelling the electorate to accept a slightly more liberal candidate than would usually be tolerated, in deference to the catastrophic dearth of competence and probity in the Oval Office.

 

As the nominating contest heated up, moderates invoked the precedent of the McGovern campaign of 1972 in warning against the dangers of a progressive candidacy: drifting too far to the left would alienate the middle-class voters who are foundational to any winning coalition. But the similarities between 2020 and 1972 were always tenuous. Few people had heard of the Watergate office complex as voters went to the polls in the fall of 1972. The nation was polarized by the Vietnam War, and for all that McGovern floated many progressive proposals, his was effectively a single-issue campaign seeking to channel anger over the war. Since the war had been escalated by successive Democratic administrations, and Nixon himself was promising a policy of “Vietnamization,” the election was less a referendum on progressive politics in general than on “gradualist” versus “subitist” models of military disengagement.

 

The closer parallel to 2020 was arguably the election of 1932. In that year, four years of perceived mismanagement of Depression fiscal policy by Herbert Hoover created widespread disaffection, not merely with the Republican party, but with the policy “status quo” more generally, paving the way for the nomination of Franklin Roosevelt to head the Democratic ticket over the more moderate Al Smith. Though we today (setting aside the Covid-19 pandemic, which hit after the Democratic nomination had largely been decided) have not suffered an acute crisis comparable to the Great Depression since 2008, Donald Trump and his administration had intensively alienated large swaths of the electorate as the Democratic primary got underway. In numerous polls, voters registered their fears over growing wealth inequality, climate change, the costs and availability of education and health care, and a growing sense of insecurity in a rapidly changing world riddled with unpredictable bad actors. Many African-American, Latinx, female, and Muslim voters felt that the US government was manifesting hostility not seen from federal authorities since the 1950s or 1960s. The spreading disenchantment, the sense that government was failing to provide needed leadership in urgent times, was reminiscent of the national mood at the end of Hoover’s first term. It was a moment in which progressives might have “turned the ship of state” and opened a course toward robust reforms like those of the New Deal era.

 

Now that Joseph Biden, the embodiment of the Democratic Party’s moderate wing, is all but sure to win the nomination, that chance has passed. What, then, can progressives learn and apply to the future from the experience of 2020? I would propose the following items:

 

1) The Primary’s the Thing. The American electoral system works along very particular lines. One of its idiosyncrasies is that, given the reality of the two-party system, each election has two phases which are strategically alien to one another. Winning the Democratic primary requires fundamentally different tactics, and the building of a fundamentally different coalition than does winning the general election. If progressives want to get onto the ballot, they must strategize intensively to compete in the primary contest. In other words, they must mobilize to defeat Democratic moderates in a contest among Democrats, state by state. 

 

This may seem like a truism, but it is a principle that is roundly ignored by members of all parties in all election cycles – it is one of the blind spots that helped propel Donald Trump to the Republican nomination in 2016, and that has helped him maintain control over the GOP ever since. As an example of things progressives might do differently if they were following such advice: campaign as Democrats. While it is true that Trump came in as an outsider and effected a kind of “hostile takeover” of the GOP, he did not do so while disavowing membership in the Republican party altogether. If you want to lead the party, you have to be willing to join the party.

 

2) Consolidate Early. The failure of a crowded Republican field to back a single insider helped Trump win the nomination in 2016. Democratic moderates eventually took this hint and consolidated behind Joe Biden in March 2020. If Pete Buttigieg, Amy Klobuchar, and Mike Bloomberg had not made timely withdrawals from the race, progressives might have captured the nomination. Conversely, if progressives had rallied behind either Elizabeth Warren or Bernie Sanders early on, outcomes might likewise have been different. Books like The Party Decides have raised awareness of the ways in which party power is institutionalized, such that systemic processes will generally favor moderates, whose donors consistently fund robust infrastructure. But Trump’s campaign demonstrated that the party “machine” cannot fully contain the energies of a consolidated movement. Progressives on the Democratic side only need to achieve purposefully what Donald Trump did by accident in 2016. They do not need an operation as elaborate as the DNC, only a forum analogous to CPAC in which progressive opinion and strategy can be deliberated. The Center for American Progress tried to launch such a forum in this cycle with their “Ideas Conference.” Progressives should treat that event in the future as an opportunity to achieve “movement discipline.”

 

3) Fight Astroturf Aggressively. Corporate interests will inevitably use their disproportionate power in the media to demonize progressive candidates and policy initiatives. We saw this early on with regard to Elizabeth Warren. A well-funded media blitz fabricated the self-fulfilling narrative that her “Medicare for All” plan was unpopular with voters and made her “unelectable.” Progressives must be ready for that kind of assault and move aggressively to counter-message in defense. Obviously there are limits to how effectively such resistance may be mounted, but the leaders of progressive campaigns should not be above such tactics as circulating “talking points” to surrogates and allies, by way of cultivating “message discipline.” Some part of campaign war chests likewise should be earmarked for “anti-astroturf” use. Here Democrats might take a page from Donald Trump: voters respond to a candidate who sticks to her guns, and Warren’s move to “moderate” her position may have contributed to the effectiveness of the campaign to discredit her, costing her support among those on the fence between her and Bernie Sanders.

 

4) Engage Voters of Color. Failure in this regard is perhaps the key to the defeat of the progressive wing of the party in 2020. Bernie Sanders won the trust of many Latinx voters, but neither he nor Elizabeth Warren was able to garner robust support in the African-American community. The fundamental lesson here is that the campaign season is too late to forge relationships in communities of color. Any progressive Democrat who is planning to seek higher office must begin now, in whatever office they occupy or position they hold, to communicate with Latinx and African-American leaders (this includes candidates who are themselves African-American or Latinx, who may not presume upon the support of voters of color) and to partner with them on issues of urgent concern. 

 

This is a dynamic that works at the levels of both policy and politics. Latinx and African-American voters reserve their support for candidates who both address key problems (i.e., fighting against discrimination in education or credit markets, defending communities against racist violence) and demonstrate, in their public activities and communications, that they are comfortable with and respect communities of color. The success of Doug Jones in Alabama is perhaps the best object lesson in what this looks like. The fact that he took risks to bring white-supremacist terrorists to justice as a U.S. Attorney convinced African-American voters that he might uphold his promises to them as a U.S. Senator.

 

5) Build a Winning Coalition – Among Democrats. This is a corollary of principle #1, “the primary’s the thing.” Progressives have a distinct advantage in the Democratic primaries, in that turnout in those contests is higher among progressives than among moderates. But 2020 shows that progressives still cannot win on their own: a higher percentage of self-identified “progressives” may turn out to vote, but progressives as a whole are still outnumbered by margins large enough to overwhelm their advantage in turnout. A progressive candidate must be able to win over some moderates in order to claim the nomination. Not all, some. Enough to make up the difference between the progressive plurality and a winning majority. Moreover, this imperative works hand-in-hand with #4, above: moderates are disproportionately represented among voters of color.

 

This does not mean that the left must settle for “progressives-in-name-only,” only that a degree of compromise will need to be tolerated. A Sanders candidacy seems to have been an exercise in progressive “overreach.” Powerful systemic forces draw the electorate rightward in the American political process; progressives will always need to thread the needle between fighting that tide and riding it.

 

As a progressive, I would like to see “our wing” of the party succeed in capturing the White House. A great deal might thus be achieved. Some of the disappointments of the “neoliberal” moderation of the Clinton and Obama years might be redressed: a more robust approach to global warming, meaningful reform in health care and education, redress of wealth inequality, and much more. But for any of this progress to be possible, the riddle of the American ballot box must be confronted. If we can assimilate the lessons of 2020, perhaps in the future lost opportunities can be redeemed.

The Ruthless Litigant in Chief: James Zirin Paints a Portrait of Trump Through 3,500 Lawsuits

 

American presidents before Donald Trump had some record of public achievement in politics, government or the military before they were elected. Donald Trump lacked any of those credentials, but brought his astounding history of involvement in thousands of lawsuits to the nation’s highest office. This trove of cases from more than 45 years reflects Trump’s contempt for ethical standards and for the US Constitution and the rule of law, the foundation of American democracy.

As a perennial litigant, Trump weaponized the law to devastate perceived enemies, to consolidate power, to frustrate opposing parties, as former federal prosecutor and acclaimed author James D. Zirin illuminates in his compelling and disturbing history of Trump’s use and abuse of the law, Plaintiff in Chief: A Portrait of Trump in 3,500 Lawsuits (All Points Books).

Mr. Zirin is a distinguished veteran attorney who spent decades handling complex litigation. He is also a self-described “middle of the road Republican.” Plaintiff in Chief stands as his response to Trump’s disrespect for law and our legal system. He stresses that the book is a legal study, not a partisan takedown. 

In his book, Mr. Zirin scrupulously documents Trump’s life in courts of law. Based on more than three years of extensive research, the book examines illustrative cases and how they reflect on the character and moral perspective of the current president. The details are grounded in more than 3,500 lawsuits filed by Trump and against Trump. Litigation usually involves sworn affidavits attesting to accuracy and testimony given under oath if a trial occurs, so Mr. Zirin is able to reference page after page of irrefutable evidence of Trump's legal maneuvering, misstatements, hyperbole, and outright lies. 

As Mr. Zirin points out, Trump learned how to use the law from his mentor, the notoriously unprincipled lawyer and fixer Roy Cohn whose motto was “Fuck the law.” Trump took Cohn’s scorched earth strategy to heart and used the law to attack others, to never accept blame or responsibility, and to always claim victory no matter how badly he lost.

“Trump saw litigation as being only about winning,” Mr. Zirin writes. “He sued at the drop of a hat. He sued for sport; he sued to achieve control; and he sued to make a point. He sued as a means of destroying or silencing those who crossed him. He became a plaintiff in chief.”

And Trump also has been a defendant in hundreds of legal actions, as Mr. Zirin details. In 2016, there were 160 federal lawsuits pending in which he was a named defendant, as well as numerous other investigations and proceedings. Mr. Zirin observes that Trump “has been sued for race and sex discrimination, sexual harassment, fraud, breach of trust, money laundering, defamation, stiffing his creditors, defaulting on loans, and . . . he [has] been investigated for deep ties to the Mob, which he enjoyed over the years.”

And Trump’s pattern of disrespect and contempt for the law persists. As Mr. Zirin writes, "All this aberrant behavior would be problematic in a businessman. . . But the implications of such conduct in a man who is the president of the United States are nothing less than terrifying."

Mr. Zirin is a leading litigator who has appeared in federal and state courts around the nation. He is a former Assistant US Attorney for the Southern District of New York under the legendary Robert M. Morgenthau. His other books include Supremely Partisan: How Raw Politics Tips the Scales in the United States Supreme Court and The Mother Court, on great trials from the Southern District of New York in the mid-twentieth century. His articles have appeared in an array of publications including Time, Forbes, Barron’s, The Los Angeles Times, The London Times, and others.

 

Mr. Zirin also hosts the critically acclaimed television talk shows Conversations with Jim Zirin and Digital Age, which air weekly throughout the New York metropolitan area. In August 2003, Mayor Michael R. Bloomberg appointed him to the New York City Commission to Combat Police Corruption. He is a Fellow of the American College of Trial Lawyers, and is a member of the Council on Foreign Relations. A graduate of Princeton University with honors, he received his law degree from the University of Michigan Law School, where he was an editor of the Michigan Law Review and a member of the Order of the Coif.

 

Mr. Zirin graciously responded to questions on his study of Donald Trump by telephone from his office in New York.

 

Robin Lindley: Congratulations on your new book Mr. Zirin, and on your distinguished legal career. In your book, you chronicle Donald Trump’s life as a ruthless litigator for almost a half century. How did you come to write Plaintiff in Chief on Trump’s life through more than 3,500 lawsuits? 

James D. Zirin: About three years ago, a friend suggested that I write a biography of Roy Cohn. I knew Roy Cohn. He was an unscrupulous lawyer. He was disbarred in 1986, shortly before he died. And he was Trump's lawyer and confidant, and their relationship was very close, very intimate. He boasted to a journalist that he and Trump spoke about five or six times a day. This was before Trump had any notion of seeking political office.

Cohn really taught Trump everything he knows about waging what I call asymmetrical warfare, weaponizing the law and using litigation as a means to attain his various objectives. They met in a bar in 1973, just after Trump had been named as a defendant, along with his father, in a housing discrimination suit brought by the Justice Department. Trump had a number of lawyers, and normally a suit like that ends quickly with a consent decree, the defendant agreeing that he or she won't discriminate anymore without admitting or denying the allegations in the complaint.

Cohn had a different recipe for going forward. He liked to beat the system. He'd been indicted three times by the legendary prosecutor Robert M. Morgenthau, and he'd been acquitted three times. Cohn's recipe was to fight, and he taught Trump the tools he used. Rule number one is, if you're charged with anything, counterattack. Rule number two is, if you're charged with anything, try to undermine your adversary. Rule number three is work the press. Rule number four is lie. It doesn't matter how tall a tale it is, but repeat it again and again. Rule number five is settle the case, claim victory, and go home. And that's exactly what happened in the race discrimination case.

So anyway, I created a book proposal, which I sent to my agent and my publisher, St. Martin's Press. In its wisdom, the Press said I should try to write a larger book about the influence of litigation on Donald Trump because that's the way he had conducted himself in the 40 years before he achieved office, and I should use Cohn perhaps as a springboard but the book should center on Trump and what experience he had had in litigation. So, I did that and that's how I came to create the book. 

Robin Lindley: It's a remarkable and chilling account of Trump's life through the prism of his legal affairs. You stress in the book that the two most powerful influences in his life were Roy Cohn and his father, Fred Trump. 

James D. Zirin: Yes. His father, of course, was a defendant as well in the race discrimination cases. His father was also a real estate operator, and he came up against the government in the arena of FHA loans. He was accused of profiteering. He testified before a Senate committee and was interrogated by Senator Lehman. He made a lot of money by mortgaging out with FHA loans in ways they were never intended to be used. Then, when he was asked about the profits, he said he had never withdrawn the money from the bank, so therefore there were no profits. That was ridiculous. But here is an example of saying something that's totally ridiculous for public consumption that somehow or other some people will believe. And that's the approach Trump has used professionally, and that's the approach he continues to use in office.

Robin Lindley: Were you ever involved in litigation with Donald Trump? 

James D. Zirin: No. I never was. I met him several times, and I met him with Roy Cohn several times. 

When I first met Cohn, I was an Assistant US Attorney and Cohn was being investigated by a federal grand jury. I worked for Robert M. Morgenthau then, and that investigation resulted in indictments. Cohn was in the anteroom of the grand jury chamber, where witnesses were waiting to testify, and he raised his open hand in what might be interpreted as a high five. I naively thought it was a high five to encourage the witnesses, since they were facing the daunting experience of testifying before a grand jury. And it wasn't a high five at all. He was telling them to take the Fifth.

That’s how Cohn operated and that's the way Trump operates. It's saying something that's highly incriminating and doing something that's highly incriminating, but doing it in a way so that you have total deniability if anyone calls you out for it. 

Robin Lindley: What was your impression of Trump when you met him decades ago?

James D. Zirin: I really met him only to shake hands. I never met him to talk with him, but I knew of his reputation. I knew he didn't pay his bills. I knew he didn't pay his lawyers. I knew he'd been in bankruptcy five times, and I knew about his Atlantic City casinos. I knew he'd been sued a number of times. And I knew that he had been a plaintiff an extraordinary number of times. He sued journalists. He sued small business people for using the Trump name. He sued women who he was involved with. He sued his wives even after a divorce, both Ivana and Marla Maples. And I knew that a lot of settlements he entered during litigation were kept under seal in the files of the court so the public would never know the terms of the settlements. 

In one major litigation effort, you had the so-called Polish Brigade case, which involved the construction of Trump Tower, which opened in 1983. Trump had undocumented Polish workers, and he did not contribute to the union pension fund as he was required to do. There was lengthy litigation, including a trial in which the trial judge said Trump's testimony was completely lacking in credibility, but the case was eventually settled. We never knew what the terms of the settlement were until, about 20 years later, a judge unsealed the settlement papers and it turned out that Trump had settled for 100 cents on the dollar.

Robin Lindley: Full disclosure: I'm a lifelong Democrat and I think most in my party would agree with your history and characterization of Trump.

James D. Zirin: I'm actually a lifelong Republican and I'm decidedly anti-Trump because I don't think he represents the values of the country or the Constitution of America. I think he's been a rogue president.  

Robin Lindley: I agree. A lot has happened since your book came out, with Trump's reaction to the Mueller report, his impeachment, his weaponizing of the Department of Justice, and more suits against the media and others. And Trump continues to follow the Cohn rules. Trump famously said he needed a Roy Cohn. Does he have his Roy Cohn now in William Barr, the Attorney General?

James D. Zirin: Many people have suggested that. I think Barr is more of an ideologue. He's not an unscrupulous lawyer as Roy Cohn was. 

Roy Cohn represented mobsters and he was a crook. He was eventually disbarred because he stole $100,000 from a client. He was disbarred because, after his yacht went up in flames and a crew member was killed, he collected the insurance money, which was supposed to go to his creditors, and instead pocketed it. He was disbarred because he made false and misleading statements on an application to become a member of the DC bar. There was a disbarment hearing, and Trump was one of a number of his character witnesses; he testified to Cohn's good reputation for honesty, integrity, truth and veracity. And of course, Cohn's reputation for honesty, integrity, truth and veracity was very bad.

And after Cohn was disbarred in 1986, Trump distanced himself from Cohn, but that was not for long because Cohn died three weeks thereafter. There was a funeral, and Trump stood in the back of the room, delivered no eulogy, and never said much more about him.

We do know how close the relationship was: 30 years later, in 2016, when Trump was elected president, he turned to gossip columnist Cindy Adams, a friend of his, and said, “Cindy, if Roy were here, he never would've believed it.” And in the White House in 2017, when counsel Donald McGahn was dragging his feet about firing Sessions, Trump made the famous statement, “Where's my Roy Cohn?”

Robin Lindley: It seems to many observers that William Barr is acting as the president's personal attorney and has an authoritarian attitude about the Constitution and the role of the president while scoffing at the separation of powers. 

James D. Zirin: That is easily said, but I don't think it's easily demonstrated, because the Constitution does not say that the attorney general must be independent of the president. There is a tradition of independence in the Justice Department, particularly since Watergate, under which the Attorney General must serve the Constitution, not the president, and if there's a conflict the Attorney General should do something to resolve it.

I think Barr has been quite cavalier about observing that tradition. He doesn't believe in it. He is a contrarian and a libertarian. He believes in the unitary executive, so that Trump is free to do basically anything he wants to do because he's the President of the United States. Barr has not been a check on Trump's unbridled abuse of power, but it's not really for the attorney general to do that. It's for the Congress to do that through the impeachment power, so you can't say that Barr has failed to ride herd on the president, because he would take the view that that's not his obligation.

Robin Lindley: Thanks for explaining that view of the Attorney General’s role. Since your book came out the impeachment occurred. Senator Susan Collins thought the president would be chastened by that process. Of course, that hasn’t happened. How do you see Trump’s response to the impeachment and the unanimous Republican Congressional support of him, with the exception of Mitt Romney? 

James D. Zirin: I think Trump believes he's above the law, and when I say the law, I mean the law including the Constitution.

The Republicans in the Senate were willing to give him a pass for various reasons. I suppose they could rationalize it. They could say, number one, it was ultimately for the American people to decide whether he should remain in office, and we have an election coming up in a few months. And number two, what Trump did was bad perhaps, but it wasn't so bad as to amount to an impeachable offense. Impeachable offenses are what two-thirds of the Senate say they are.

I don't think anyone ever thought that two-thirds of the United States Senate would vote to remove him from office. But the Constitution provides for a trial, and it's supposed to be presided over by the Chief Justice. And this was not a trial. It was a travesty, because who has ever had a trial where the prosecution can't call witnesses to present the evidence? And that's what Romney was extremely upset about.

I think it was Senator Lamar Alexander who said we don't need witnesses because, if five people say you left the scene of the accident, why call a sixth? And so, it was pretty much uncontested what the facts were, and what is to be made of those facts is up to the United States Senate under our Constitution. It shows that the hoary document we call the Constitution of the United States, which we put on a pedestal and which supposedly has iconic significance, is an 18th-century document that in the real world is pretty inefficient at curbing the powers of a tyrannical president. And I think that history will record that.

Robin Lindley: And Trump responded that the impeachment was “a hoax” and said his call with Ukrainian President Zelensky was “perfect.” He actually asked a foreign government to interfere in an American election. It seems a high crime and misdemeanor under the Constitution. Elections are sacrosanct in a democracy.

James D. Zirin: Well, that's true. And a high crime and misdemeanor does not have to be a crime that's in the United States Code, although this amounted to an invitation for a bribe and also to extortion, both of which are in the United States Code.

But at the time of the enactment of the Constitution, there was no United States Code. The Constitution mentions bribery or other high crimes and misdemeanors. It was quite clear from the Federalist Papers and the ruminations of Hamilton and Madison and others that abuse of presidential power was an impeachable offense. And here you certainly have an abuse, where Trump was using the foreign policy of the United States and the leverage of withholding funds for military aid that were authorized by Congress in order to achieve a domestic political advantage and benefit himself.

Robin Lindley: And Trump continues to bring lawsuits from the White House. In the last couple of weeks, he's sued the New York Times and CNN for defamation. Of course, he'll never appear to be deposed, so those lawsuits will probably go nowhere. He continues to use the law as a weapon. You chronicle that sort of abuse of the legal system for the last 45 years or so. 

James D. Zirin: That's right. He has sued a lot of writers. Before he took public office, he sued the journalist Tim O'Brien for daring to write that his net worth was overstated. O'Brien's lawyers took Trump's deposition, and he demonstrably lied at least 32 times under oath, and the case was eventually dismissed. The defense was able to show the truth: that, in fact, he had overstated his net worth. That was one of Trump's sore points, and he sued whenever someone said he was worth less than he claimed. But O'Brien won his case.

And he brought other cases against journalists. He sued an architectural critic for the Chicago Tribune for suggesting that one of his buildings, which wasn't even up but was planned, would be an eyesore on the horizon. The judge threw the case out under the rule protecting opinions. To succeed in a libel action, you have to show that a statement of fact which is defamatory and false was made of and concerning the plaintiff. The critic stated an opinion that the building would be an eyesore on the horizon, and that's not something that could ever be libelous.

Robin Lindley: You use the term “truth decay,” and Trump is probably responsible for misstatements or outright lies an average of at least 10 times a day. How does this pattern of lying fit into his attitude toward the law?

 James D. Zirin: I think he enjoys lying. I think it's part of his DNA. I don't think he has any grasp of the facts at all, so he says whatever he thinks will help him and whatever comes into his head. 

It is expedient, I suppose, to lie in litigation: if you crossed an intersection through a red light, you can lie and say it was a green light, and that changes the legal outcome of your case. And that's the way Trump operates. But he would go beyond that, because he would say the heck with you and the horse you rode in on, as he did in the House impeachment inquiry. Then, he denounced Adam Schiff, denounced Jerry Nadler, and denounced the witnesses. He tried to subvert the whole proceeding by denouncing the whistleblower and by showing that those people who lined up against him were of low character and were themselves liars.

All of this goes back to Joe McCarthy, because this is the way McCarthy operated. McCarthy's adversaries accused him of engaging in a witch hunt and of generating hoaxes. And of course, Roy Cohn was McCarthy's chief counsel, so Cohn learned how useful those charges can be and how devastating they can be in any kind of controversy. And he taught all of that to Trump, and Trump uses it to his great advantage.

Robin Lindley: Yes. And Trump certainly follows the Roy Cohn rule about declaring victory no matter how badly you lose. 

James D. Zirin: Yes. Even that conversation with Zelensky he called “perfect,” and it was something other than perfect. I don't know whether he's going to say the coronavirus is a perfect hoax, but perfect is a word that recurs again and again in his lexicon.

Robin Lindley: Pardon me for this psychological observation, but you write that power and dominance are even more important to Trump than money. That seems pathological. And he seems to take a sadistic joy in attacking and ruining anyone he perceives as a foe of some sort. 

James D. Zirin: Well, that's true. 

In one of the cases that I describe in the book, he got wind of the fact that a small business, a travel agency run by a father and daughter in Baldwin, Long Island, was using the name Trump Travel. Not Donald Trump Travel, but Trump Travel. They used Trump Travel because they were selling bridge tours for people who play bridge, and “trump” is a bridge term. Also, like “Ace Hardware,” they thought “Trump” connoted excellence. This was a little storefront travel agency in a small Long Island community. Trump had never been in the travel business and he never had any business involvement in Long Island, but he sued them. And at the end of the day, they were allowed to continue to use the name Trump Travel, but they had to make the lettering a little smaller. And they'd exhausted their life savings in defending the case. So he was quite sadistic about the way he went about it.

There was another similar case where an unrelated family named Trump from South Africa had a multibillion-dollar pharmaceutical business and Trump sued them. He'd never been in the pharmaceutical business and had never been in business at that point anywhere outside the United States. But this family had the wherewithal to defend the case and eventually the case was thrown out completely.

 Robin Lindley: What are a few things you learned about Trump's ties to the Mob or organized crime? 

James D. Zirin: In the first place, his father had ties to the mob. His partner was a man named Willie Tomasello, a made man, and the two were partners in various real estate ventures.

Through Roy Cohn, Trump met leaders of the five families in New York, principally Fat Tony Salerno, Paul Castellano, and others who controlled the poured concrete business in New York. He also met John Cody who was a labor racketeer and president of the Teamsters.

At that time, particularly because of the mob involvement, poured concrete was a much more expensive way of constructing a building than structural steel. The poured concrete business was dominated by Castellano, who was murdered, and Salerno, who was eventually sent to jail for a term of about 99 years.

Trump retained these mafia companies to construct buildings out of poured concrete even though it was more expensive. We don't really know why he did that, but his mob ties ran quite deep. They existed in Atlantic City where he had a number of casinos, principally the Taj Mahal, which went bankrupt six months after its opening. At times, on a Tim Russert program and under oath, he denied that he had any contact with the mob, but he was warned by the FBI when he went into Atlantic City that he shouldn't deal with mobsters because it would ruin his reputation. 

But Trump continued to have contact with mobsters. On at least two occasions, which I relate in the book, he admitted that he had ties with the mob. During the construction of Trump Tower, the Teamsters called a citywide strike. Construction trucks and concrete trucks didn't have access to construction sites, but mysteriously, at Trump Tower, poured-concrete trucks passed the picket lines and continued their work. Cody, the president of the Teamsters, got not one, not two, but several condominium units at Trump Tower for a female friend of his who had no visible means of support.

Robin Lindley: Did you learn anything about Trump’s ties to the Russian mob and Russian oligarchs?

James D. Zirin: Yes. A lot of it is revealed in the book [by Craig Unger] House of Trump, House of Putin. But in 1986, a Russian oligarch walked into Trump Tower and he bought five condominium units with monies that had been wired from London and laundered from Russia. He was a member of the so-called Russian mafia. Eventually the Attorney General cracked down on it and made a finding that these apartments were purchased with laundered funds. So Trump’s ties to Russia go back at least that far, maybe further. And he has continued to deal with Russian oligarchs throughout his business career. 

Robin Lindley: Money laundering is complicated to me. Can you say more about Trump and laundered funds?  

James D. Zirin: Money laundering is where money comes from some illegal source and the money can't be reported, so the origin of the money must be concealed, and that's why it's called laundered money. A good way to conceal the source of money, particularly with money from Russia that is obtained by fraud or theft or in violation of Russian laws, is to buy a condominium unit in New York and the condominium unit is there and there's no tracing of the funds. The funds went to Trump and he deposited them in his bank accounts and he used it to pay his loans, and the origin and tracing of the funds just disappeared. 

Robin Lindley: And you indicate that Trump has repeatedly used laundered money to hide illegal funds.

James D. Zirin: The interesting thing with Trump and the Russians is that Deutsche Bank was his principal lender. No other bank would touch him. He now owes about $365 million to Deutsche Bank. At one point in time, he was in default on a debt service payment to Deutsche Bank and they were about to sue him. Trump tried to stop them with the same technique, a suit against the bank: a counterattack, suing for fraud in lending. Somehow or other that got resolved, the debt service obligation was extended, and another department of Deutsche Bank continued to lend him large sums of money. Now that's very suspicious, because what bank lends money to a customer who's been in default, number one, and, number two, what bank lends money to a customer who has sued them and charged them with fraud?

Deutsche Bank pleaded guilty to money laundering for Russian interests, and there were definite ties between Deutsche Bank and the Russians, which have never been fully explored. Russian money in effect may have been used to guarantee Trump’s indebtedness. I think Trump’s son Eric said on a number of occasions, and Donald Jr. said at one point, that they couldn’t get financing until they got it from Russia. 

Robin Lindley: Yes, I recall that comment. Trump also has been able to keep his tax returns secret. How do you see his refusal to reveal his returns and its significance?

James D. Zirin: Trump’s five predecessors in office all had no trouble releasing their tax returns. Both Republicans and Democrats seem to regard this release as a tradition although there is no legal obligation imposed on a president to release his or her tax returns. 

Trump’s tax returns remain shrouded in mystery. Now, the District Attorney of New York County obtained a grand jury subpoena covering eight taxable years, five of them before Trump became president. He didn’t subpoena Trump for them, but subpoenaed Trump’s outside accountants and the Trump Organization. Right up the line, the courts sustained the subpoenas and said that the accountants had to comply, which they were willing to do except that Trump had instructed them not to. That matter is now before the Supreme Court and will be argued in March. Presumably it’ll be decided in June, before this term of the Court ends. 

In addition, committees of the House of Representatives have subpoenaed some tax returns and that matter is also before the Supreme Court. 

Now, this is absolutely appalling. In the Second Circuit subpoena case brought by the Manhattan District Attorney, Cy Vance, Judge Chin questioned Trump’s lawyer: “Now your client said he could shoot someone on Fifth Avenue and no one would mind. None of his followers would mind and would still vote for him. I suppose if he shot someone on Fifth Avenue a district attorney would investigate the case. Couldn’t the police investigate the case? Couldn’t they seize the gun? Couldn’t they talk to witnesses?” Trump’s lawyer answered, “No, your honor. He’s protected by the fact that he’s president.” And if anyone buys that one, I think the rule of law is seriously compromised. 

Robin Lindley: Yes. That goes back to the wealth of evidence you present that Donald Trump has no regard for the rule of law or the Constitution. 

James D. Zirin: Yes, if it gets in his way. He said that impeachment was unconstitutional even though impeachment is provided for in the Constitution. So he doesn't know what he's talking about as a legal matter. 

There were also instances of his undermining the judiciary, accusing judges who decided against him of being Obama judges or Mexican judges and taking them on individually as so-called judges. He has undermined the judiciary in a way that’s totally obnoxious to any lawyer who’s dedicated to the rule of law.

Robin Lindley: He swore an oath to the Constitution as president and yet continues to attack the legal system and the rule of law. 

James D. Zirin: Chief Justice John Marshall said we’re a government of laws, not men. Today he would’ve said men and women. But Trump has attacked not only the legal system but also the judges who administer it, like Justices Sonia Sotomayor and Ruth Bader Ginsburg, who he thinks should recuse themselves from all Trump-related cases. And again, he’s gone after the individual judges. 

Robin Lindley: You also point out his abusive treatment of women and his payments of hush money to his paramours.

James D. Zirin: He did that before he took office, and that’s why Vance wants to see Trump’s tax returns: to see how these payments were treated and perhaps to find payments to other women. 

Robin Lindley: Since the book came out, have you received any backlash or criticism from Trump supporters or Republicans?

James D. Zirin: No. I’m often asked if I expect Trump to sue me, and I say I wish he would, because it would help the sales of the book. The book Fire and Fury would have sold about 5,000 copies, but when Trump tried to enjoin it, it was like “Banned in Boston,” and it became a runaway bestseller, selling three million copies. 

I haven’t received any backlash because the book is very well documented and everything in it is true, but the [Trump supporters’] response is basically, so what. The stock market is up. We have less regulation. The government is less intrusive in our lives, except maybe when it comes to abortion. [To Trump supporters] this is all a good thing. And they spit out the names Pelosi and Biden and Bernie Sanders, and they ask: would you rather have someone who’s going to tax and spend like Bernie Sanders, who promises a chicken in every pot? Or would you rather have a Donald Trump whose policies are good even though he’s a little ridiculous? That’s the comment.

Robin Lindley: In the context of your study of Trump’s attitudes and perspective on the law, how do you see Trump’s response to the public health crisis now of coronavirus?

James D. Zirin: The coronavirus is quite likely to be the undoing of Donald Trump: his mendacity, ignorance and shallowness have come into full view as an empirical reality, as indisputable as the laws of science or a Euclidean equation.

I saw it all coming, and I cried aloud in my book, Plaintiff in Chief.

Here’s a partial list of Trump’s lies about the coronavirus: 

In President Trump’s first public comments about the coronavirus, on Jan. 22, he assured people that it would not become a pandemic: “No. Not at all,” he told viewers of CNBC. “It’s going to be just fine.”

In the weeks that followed, he offered a series of similar reassurances:

“We have it very well under control.”

“We pretty much shut it down coming in from China.”

“I think the numbers are going to get progressively better as we go along.”

“We’re going very substantially down, not up.”

“It’s going to disappear. One day — it’s like a miracle — it will disappear.”

None of it was true.

Robin Lindley: Do you have any words of wisdom now on the future of democracy and the rule of law? Trump has persisted in twisting the law to his interests or ignoring it in the lifelong pattern you portray vividly.

James D. Zirin: He has continued and I think he will continue. And I think the rule of law has been seriously undermined. 

Our democracy has been seriously compromised because the framers of the Constitution never thought the system would work this way. Republican senators deserve part of the blame: out of their need to retain power, they did not respect the oath they took to be fair and impartial judges of the facts and the law, but instead voted along party lines to acquit him. 

Robin Lindley: Trump said sometime in the ’90s, as you note in your book, “I love to have enemies.” I think most of us wouldn’t feel that way, and it seems pathological to me. What did you think when you found that quote from him? 

James D. Zirin: Most politicians are controversial and have political enemies. I think Trump relishes that perhaps more than others and he loves to attack them personally. We know Biden is “Sleepy Joe” and Elizabeth Warren is “Pocahontas” and Bernie Sanders is “Crazy Bernie.” He has nicknames for everyone and he revels in trashing them rather than addressing the merits of anything they propose. 

Robin Lindley: The mentality of an eight-year-old bully, it seems. You write that Trump’s experience in lawsuits reflects “his inmost ulterior motivation.” This comes out in your responses, but could you sum up your sense of his character, motivation, and morality based on your extensive research?

James D. Zirin: His virulent combination of anti-science, anti-law, ignorance, irrational conspiracy theorizing, instability, narcissism and vindictiveness has led us to national catastrophe. If he is re-elected, I fear for the republic and the American people.

 

Robin Lindley: As you demonstrate, Trump is an anomaly in terms of the adversarial system. Do you have anything to add on his abuse of legal process? 

James D. Zirin: Look, the adversary system is the crown jewel of our legal system. We got it from the British. The idea is that you had lawyers on both sides who were partisan, who were like the Knights Templar who rode into battle on behalf of somebody or other in the Middle Ages. That has been the best way of getting at the truth. In contrast, in civil law countries like France or Germany, the judge conducts the inquiry. The judge might ask questions of the lawyers, but the lawyers don't develop the evidence on both sides. 

But adversary doesn’t mean enemy. What Trump has demonstrated is that we have a great legal system and we all have the benefit of it. But there are also limitations to the law, and those limitations can be exploited by someone determined to beat the system. That’s what Trump has done.

Robin Lindley: Thank you, Mr. Zirin, and congratulations on your groundbreaking and compelling book on the life of Donald Trump through the perspective of his thousands of lawsuits. It’s been an honor to talk with you.

Roundup Top Ten for March 27, 2020

We’ve Never Been Here Before

by Adam Tooze

There are historical analogies to this kind of collective shutdown, but they are not attractive.

 

Women Also Know Washington

by Lindsay Chervinsky

The last decade has witnessed a noticeable uptick of works on Washington authored by women, with more to come in the pipeline.

 

 

The History of Asian American Discrimination in Public Health

by Stanley Thangaraj

The popularity of various pseudo-scientific, ad hoc religious, and other problematic discourses about the coronavirus are jeopardizing national and global health. 

 

 

What Our Contagion Fables Are Really About

by Jill Lepore

In the literature of pestilence, the greatest threat isn’t the loss of human life but the loss of what makes us human.

 

 

Dismantling the Federal Bureaucracy Has Left Us All Vulnerable to COVID-19

by Teal Arcadi and Casey Eilbert

Decades of deriding bureaucrats and cuts have undermined government capacity to serve us when we need it most.

 

 

Babe Ruth's New York @ 100

by Jonathan Goldman

When Babe Ruth started hitting home runs, the US started to change.

 

 

Hospice of the Creative Class

by Alex Sayf Cummings

No event has so starkly revealed the brutal inequalities of contemporary capitalism as the coronavirus pandemic.

 

 

It Doesn’t Have to Be a War

by Tim Barker

The Trump administration appears ready to invoke the Defense Production Act to speed manufacture of essential goods like face masks. What if we didn’t have to resort to the analog of war?

 

 

Joking in the Time of Pandemic: The 1889–92 Flu and 2020 COVID-19

by Kristin Brig

As we see with COVID-19, the darkest periods in history expose the best—and worst—of humanity.

 

 

Assimilationists of a Feather

by Elliot Friar and Travis LaCouter

If the history of gay liberation has taught us anything, it’s that assimilationism is one hell of a drug.

 

A New Lease on Life for the "Jewish Nose" Lie

 

Few people are shocked when a drunken buffoon or a white supremacist rants about Jews having distinctive noses. But when a historian at a respected institution perpetuates that dangerous myth, it’s a cause for concern.

In a tweet on December 17, 2019, Dr. Rebecca Erbelding, a staff historian at the United States Holocaust Memorial Museum, who frequently represents it at events around the country, wrote: “At a talk today, asked about my personal background. I confessed that I’m not Jewish, but with a Hebrew first name, German last name, and my nose and hair, I ‘pass’.”

The same week that Erbelding made her remark, UNESCO announced that it was canceling its association with the annual Carnival of Aalst, in Belgium, because the carnival had included a float that mocked Jews by depicting them with huge “Jewish noses.” 

Perhaps Erbelding or the Belgian float designers believed they were acting in a humorous spirit. But that misses the point. Wrapping a racist stereotype in a joke does not make it any less racist. It does the same damage—it perpetuates the degradation of the Other. The idea that there is a distinctive “Jewish nose” is one of the oldest anti-Jewish myths. Medieval anti-Semites introduced it in the 12th century CE as a way of singling out Jews for contempt. The nose became “a physical symbol of otherness for Jews,” as Prof. Roy Goldblatt has put it.

Nazi Germany’s propaganda machine made ample use of the “Jewish nose” stereotype. A notorious Nazi film produced in 1940, called “The Eternal Jew,” purporting to expose the “real” Jew, focused on “Jewish faces,” zooming in on Jews’ noses to make them seem unattractive. 

Similar images appeared throughout the Nazis’ news media, cultural publications, and children’s books. Der Giftpilz, an anti-Jewish children’s book published by Julius Streicher (publisher of the Nazi newspaper Der Sturmer), included a section called “How To Tell A Jew.” It described a 7th grade boys’ class, in which “Karl Schulz, a small lad in the front row,” steps to the chalkboard and declares: “One can most easily tell a Jew by his nose. The Jewish nose is bent at its point. It looks like the number six. We call it the ‘Jewish six.’ ”

In recent years, the “Jewish nose” lie has been heard from time to time. In 1999, for example, Arizona state legislator Barbara Blewster told a colleague, “You can’t be Jewish. You don’t have a big hooked nose.” Blewster was compelled to publicly apologize, but she continued to insist, “I have no prejudice at all. I admire the Jews.” 

The recently resigned prime minister of Malaysia, Mahathir Mohamed, has repeatedly referred to “hook-nosed Jews.” The website of Belgium’s Ghent University until recently included a sign-language video showing a hooked nose as the translation for the word “Jewish.” 

A related episode occurred in Sweden last year. A Jewish woman reported that when she went to a Stockholm police station to have her photo taken for an ID card, an antisemitic officer digitally altered her image to drastically enlarge her nose.

Women in particular have suffered from the “Jewish nose” stereotype. Rachel Jacoby Rosenfield and Maital Friedman, of the Shalom Hartman Institute, have written about how the “Jewish nose” and similar stereotypes have been used to intimidate Jewish women into altering their appearances. “These negative stereotypes have impacted our Jewish psyche and spawned a self-consciousness and communal shame about ‘Jewish looks’,” they point out.   

Some scholars see a connection between the Jewish nose stereotype and violence against Jews. In a recent essay, Jonathan Kaplan of the University of Technology Sydney noted that Pittsburgh synagogue gunman Robert Bowers invoked classic anti-Jewish stereotypes in his online ravings. “How we speak about and depict others in the media and social discourse perpetuates long-held stereotypes and ultimately emboldens hate-filled individuals,” Kaplan warned.

That, in my view, is what makes the Jewish nose lie so dangerous, and why it is so important for mainstream society to reject it. Stereotypes fuel hatred. Hatred fuels violence. So even when a stereotype is invoked in a supposedly humorous way, it has the same impact.

In fact, one could argue that a racist “joke” is even more insidious than the kind that is yelled at a passerby or scrawled on the side of a synagogue. Anti-Semitic humor helps bigotry infiltrate mainstream culture and discourse. Humor becomes the channel through which otherwise unacceptable sentiments are deemed acceptable.

Similarly, when a representative of a major Holocaust museum perpetuates the myth of the “Jewish nose,” it’s much more damaging than when some drunk at a local bar spouts off about Jewish noses. A scholar gives legitimacy to the stereotype in a way that the average gutter bigot never could.

Scholarly and cultural organizations, institutions, and museums need to respond swiftly when such stereotypes rear their ugly heads. UNESCO did the right thing in dissociating itself from the “Jewish nose” stereotypes of the Belgian carnival. Other institutions should do likewise.

Can We Build On Yesterday, Salvage Today—And Save Tomorrow?

There was life as it was lived in the United States before the coronavirus; and, as we all know, there is life after its arrival (from China—or wherever, it matters little).  Our nation has been torn apart and not put together again.  Lowly people, and the elevated ones as well, have been torn from “normalcy” and face uncertain futures.

       There is a temptation to mentally visit a large city and imagine what’s been affected. Go down the list: schools, bars and restaurants, entertainment of all kinds, transportation, retirement homes—everything is disrupted.  Jobs have been lost, with everybody hoping it’s temporary.

       In a great many cases, an interviewer questioning a public authority is likely to find the answerer “uncertain” as to exactly what has happened. The superior who tried to tell him hardly knew the facts himself!  Predicting an individual’s future—short or long range—is mighty hard.  Let’s try with an old person, male or female.

         So many Americans these days live on pensions.  We anticipate their arrival on perhaps the first of the month, or thereabouts.  Are they going to come now?  Or, are they held up in court?  Will a judge suddenly proclaim:  “Let’s pack it in, boys; come back in a month.”

        The place where we worked is still there, thank goodness.  But is that door locked in some special way? Is my key useless? Nobody’s coming in; we all got a card saying “Don’t be reporting for work until we notify you that we’re active again!”

       I need to go to the bank now and then.  The time is now, but when I went to the familiar front door it had a notice pasted on it.  The words said, essentially, “We are meeting to see where we stand, and will be letting you know pretty soon just what is likely to happen.”

       My drugstore’s pharmacy was open, thank goodness, and I got the product I wanted, but the conversation was not of a kind I would like to get regularly.  Predictions about a “vaccine” that will combat this world virus situation and bring real relief are not being made—even on the internet, people are hesitant to predict improvement.

     I watch television a lot these days.  My favorite commentators are announcing and explaining.  Good.  But they just don’t seem to know as much as they (and I) would like!  I listened to the Governor of New York State and he was really lucid, detailed, and calm.  But the things he was saying were, well, horrible.  Schools closed; maybe curfews ahead; teaching school on television channels. Can we really do all that?

       There’s wild talk out there.  “Let’s just abolish the coming 2020 Elections.”  Let’s not!  We did abandon basketball and other events that draw crowds. Even the Masters golf!  Cultural events are suffering.  People are staying home.  In a way, that sounds terrible. I guess looking another way, there are worse places to be than one’s home—especially when the whole World’s exploding around you….

     I’m sitting quietly in my living room  as I write. I’m trying to imagine, to visualize, to predict:  yes, I’m trying to envision a life like the one I had from birth ‘til now being lived so very differently….  It is really hard.  I am trying to imagine life being lived in a place so very different from the United States I (we) had. Everything was working, it seemed.  Things were “on time.” Products were available.  Services could be gotten for the asking.

        Now: well, let’s just skip “now” for now. It won’t be like today in a couple of days, will it?  Change is the way things are going to be.  On the other hand:  we haven’t had a war. No earthquake’s upheavals. No tornado or hurricane’s eradication of the whole landscape. Our cruise ship didn’t sink. (And we didn’t get locked up outside one of our finest ports, unable to swim the miles to an invisible shore.)

       Some in authority speak of living  two weeks down the line.  (Sadly, others talk of “next summer.”)  I have heard the phrase:  “Things will never be the same, will they?”  Then I glance at my investments and groan.  There seems to be nothing else I can do (except maybe pray). 

        Still and all:  I and mine are alive and well.  There’s been no fatal automobile accident.  All kinds of people in authority seem to have plans that stress “what we did before, remember? It worked pretty well, didn’t it?  We have our Army, and those military branches somebody seems to know about.  We can build things, can’t we?”   So we have to slow down.  Well, I can do that if I must.  

       But Lord:  we will have to focus on those poor and unemployed persons we know and could know. It’s time to give, to share, to “think outside the box.”  Members of our own family have experience in doing that.  Let’s Unite.  Let’s lick this.  We’ll be patient, and innovative, and use our imaginations.  And, relax a little.  I just know that we can come out of this united; maybe not better; but not licked!

Our Stories Will Carry Us to the Future. Can They Save Us Now?

 

In his essay ‘The Storyteller,’ 20th century German-Jewish philosopher of history Walter Benjamin recounts an episode in the Histories of Herodotus, on the humbling of the Egyptian king Psammenitus. Captured by the Persian ruler Cambyses at Pelusium in 525 BC, Psammenitus was forced to witness his son and daughter join a parade of Egyptian prisoners. But he remained unmoved until he saw an old man, a former servant, stumbling along at the tail of the procession. Only at this final sight did Psammenitus beat his fists against his head in anguish.

According to Benjamin, the episode shows us the potential stories have to reach across time—not with declarative “truths” but with questions or mysteries. Successive generations have speculated on the reasons for Psammenitus’s reticence and sudden show of grief, but, Benjamin tells us, Herodotus “offers no explanation. His report is the driest.” This is why the tale can still provoke many thousands of years later: “It resembles the seeds of grain which have lain for centuries in the chambers of the pyramids, shut up air-tight, and have retained their germinative power to this day.” Psammenitus’s story should provoke us to consider how the distant future will find the seeds of our society and our stories about ourselves.

The changes we are making to the Earth run wide and deep. The environmental crisis is accelerating, absorbing lives on a daily basis; but to appreciate the scale of this, we need to also learn to see the deep time impacts of our actions. And yet, we struggle to really see this, in the parade of traces human activity daily leaves on ocean, air, and earth. 

Sea level rise is one of the most urgent crises facing humanity; how far the waters climb will decide the fate of hundreds of millions. 150 million people could have to leave their homes as early as 2050.  Cities like Bangkok and Mumbai could become lagoon cities, or even be lost altogether to the rising waters. War and want would surely follow. Whole regions could go up in flames, ignited by water.

If the worst predictions come to pass, by 2100, 630 million people could be homeless. The economies of entire nations will collapse, and already-unstable regions will face unprecedented pressures. Mumbai, India’s economic hub and home to one of its main nuclear weapons research facilities, will become again the archipelago of small islands that it was in the 1850s. Basra, the second largest city in Iraq, will be submerged beneath a vast inland lake.

Although so much will be lost, the inundation will also be an act of preservation, on a scale that has never been seen before. 24 million people live in Shanghai—by the end of the next decade, it will be closer to 30 million—but the city has already sunk by 2.6 metres in a hundred years. The financial district of Pudong boasts some of the tallest buildings in the world, erected impossibly on a soft bed of mud and sand that is also threaded by over 600 kilometres of metro lines. Seawater that floods its streets will claim even the tallest towers over the course of several hundred years. But the subterranean city—the underground shopping malls, the metro stations, and the steel and concrete roots of the skyscrapers—will be sealed against decay by a layer of thick marine sediment. In time, as it is pressed deeper into the rock, the city will become a vast trace fossil of twenty-first century life. 

Some miraculous transformations will occur underground: glass will devitrify, acquiring a milky sheen; iron will react with sulfides in the sediment to form pyrite, or fool’s gold. Everyday objects discarded in the rush to flee the drowning city will leave their impressions. One hundred million years from now, writes Jan Zalasiewicz in The Earth After Us, Shanghai will be a metre-thick layer in the strata, filled with the outlines of SIM cards and bicycle wheels, as precise as fingerprints.

“We live in things,” wrote Virginia Woolf. Each molecule of carbon and sliver of plastic is a message to the future, locked up against decay and with the potential to flower into meaning millennia from now. 

Our future fossils will be our stories, but it is hard to predict how, exactly, they will be interpreted. What will the C-14 bomb spike, trace evidence of thousands of nuclear weapons tests since the middle of the twentieth century that will be legible for tens of thousands of years, say about us? Or the ‘reef gap,’ a crimson stain in the rock record that, millennia from now, might tell that a continuous coral ecosystem, 2,300 kilometres long, once thrived off the east coast of Australia? Just as we wonder how Psammenitus could stand impassively, perhaps anyone confronted with our stories in the deep future will wonder how we could simply stand by and watch the procession of disaster. 

But our future fossils are both our legacy and our opportunity. We can choose how we will be remembered. In learning to see the deep future flash upon the present, we will be better able to shape how our story will be told.

We’re confronted every day with stories of urgent and strange change. The evidence is quite literally all around us, but we are conditioned to look away, as if the climate crisis were also a crisis of the imagination. To remedy this, science is essential, but we also need stories. We know ourselves first and foremost in the tales we tell. This has been so since the beginnings of humanity; and since then, it is art and narrative that have borne our essence into the future. The question we must ask is, what stories will our future fossils tell?

Why Holocaust Fiction?

 

As a biographer I envy novelists, who can craft a captivating tale without needing to carefully document sources of information. Though they face definite challenges, fiction writers can, with whatever degree of knowledge and understanding they possess, invent composite characters and telling dialogue.

 

As a reader of Holocaust literature, I prefer non-fiction. Give me the facts straight up, please. I do not want to be left wondering whether some person existed or some action occurred. And though I love good stories, I see little need to manufacture them when the truth is powerful and strange and terrible enough. 

 

As a daughter of survivors, I think my grandparents—two of whom were in their early forties and two in their early sixties when they were killed in Auschwitz-Birkenau—would have wanted the full truth of what happened to them and their children to be wailed unto the heavens—lest they all vanish without a trace of having existed, as evidence of the human capacity for evil. 

I understand, however, that a case can be made for Holocaust fiction. In portraying virtuous, ignoble, or complex characters in extreme situations, novelists shed light on human behavior in normality. Vivid scenes facilitate our entry into foreign worlds. And in distilling events, the fiction writer can make the complicated comprehensible. Finally, readers who might not otherwise have known that Mengele experimented on twins, or that diplomat-rescuers saved some prominent Jews, or that tattooists engraved numbers on the arms of select Auschwitz inmates—or about any other of the innumerable dimensions of the maelstrom—might learn something. They may be spurred to further exploration. 

 

When fiction is “based on true events” we receive more than the author’s imagination. Of course, the degree to which such stories can be relied upon for historical accuracy varies. How deeply did the author research the subject? How can those of us who are not scholars evaluate whether we are getting a true picture? 

 

Some fiction writers use the Holocaust in the service of a good yarn—as if throwing perpetrators or victims or survivors into their narrative adds pathos or heft. Sometimes the most fantastical accounts (for example, the movies Life is Beautiful and Jo Jo Rabbit) dish up the absurd enmeshed in the plausible, like an SS officer barking orders at inmates in a language they did not understand. This happened. Nazi leaders training Hitlerjugend to defend the fatherland—this happened. While such works may be accused of trivializing the most serious of subjects, they make no claim to being other than farcical. 

 

But pretenders to truth (such as Binjamin Wilkomirski’s Fragments: Memories of a Wartime Childhood) indisputably cross a line. Passing off a false account as true provides fodder for Holocaust deniers and affronts us all. 

 

Sometimes true accounts are mistaken for fiction. Each semester that I taught a college course on the Holocaust, students would hand in papers that read, “In Elie Wiesel’s novel Night...” The Nobel laureate wrote several novels, but Night is a true account of teenaged Wiesel’s experience of the war. I wanted my students to know that. 

 

Survivor accounts are among our most trustworthy sources. Though it was near impossible for people in extreme situations to remember precise dates and times (not even decently-fed soldiers could recall such details), those who were there were (and are) experts on what they saw and felt on their own skins. 

 

Perhaps, then, the greatest good that ever came out of a work of Holocaust fiction was the Institute for Visual History and Education of the University of Southern California Shoah Foundation. In March of 1994, after accepting an Academy Award for Best Picture for Schindler’s List (based on Thomas Keneally’s historical novel), Steven Spielberg launched an ambitious, seemingly impractical project: knowing that survivors’ stories would soon be lost to history, he would capture on video as many of their testimonies as possible. Among the roughly 350,000 survivors then alive, most had been young adults during the war; they were now seniors; it would be a race against time. 

 

Moving quickly and efficiently, with the aid of historians, scholars, and production logistics experts, and with project directors, coordinators, and, eventually, 2,500 interviewers in 33 cities in 24 countries, Spielberg set about achieving his goal. Wanting viewers to “see the faces, to hear the voices,” he insisted that survivors be interviewed in their own homes wherever possible. They were to tell their complete life stories, but spend most of the interview recounting their Holocaust experiences. Spielberg’s team ultimately amassed 52,000 videotaped testimonies. 

 

What possessed these (mostly) ordinary citizens, including those who were shy or humble or who had never before spoken about their experiences, to dress neatly, invite interviewers and videographers into their living rooms, and open up about the darkest period of their lives? For one, most knew about Steven Spielberg’s Schindler’s List. Secondly, they learned about the project through multiple media sources; flyers and ads with the headline “So Generations Never Forget What So Few Lived to Tell” awakened their sense of moral responsibility. Ms. Miller, a child who had hidden with her family in a crowded farmhouse in the hills of Italy, said, “We come forward because we are aware of our own mortality and how important it is to share what happened.” 

 

Fortuitously, during the six-year period (1994-2000) in which the interviewing took place, there was an explosion in the field of information technology, enabling Spielberg’s team to create a vast, searchable cyber-archive. The carefully catalogued and scientifically preserved videos were distributed to various organizations (including the U.S. Holocaust Memorial Museum and Yad Vashem). 

 

It would take twelve years (or more than 105,000 hours) to watch all of the interviews. I have only had time to view some, obtained online through the USC Shoah Foundation. Once I began listening to certain survivors, I could not tear myself away. Their stories are inherently harrowing, gripping, and educational. And authentic. 

 

For his noble work, all of humanity owes a debt of gratitude to Steven Spielberg (whose foundation has subsequently worked with the Kigali Genocide Memorial to capture the testimonies of survivors of the Rwandan genocide). Twenty-five years after the release of Schindler’s List, in December 2018, the filmmaker reflected on this “most important experience” of his career. Survivors who could bear to watch the film told him that it could not compare to what was. But they were glad he told the story—it should not be forgotten.

 

Owing to the singular circumstances, perhaps no author of Holocaust fiction can aspire to again produce a work as far-reaching as Schindler’s List. But writers who ignore or take liberties with the truth ought to reflect on their purposes. If in some measure they aim to edify, counter hate, and inspire empathy, they might be mindful of those who did not live to tell their stories—who, when they could, engraved their names and places of birth in the walls of barracks, or implored others to remember them. Had they had a choice, I believe Hitler’s victims would have wanted nothing about the mortal crimes against them falsified. 

Plots Against America?: Jim Crow was Homegrown Fascism

 

Note: this essay quotes a sign attached to the body of a man lynched in 1919, which uses a racial slur.

HBO’s recent release of a prestige adaptation of Philip Roth’s 2004 novel The Plot Against America makes it worthwhile to examine whether fascism is really so alien to the United States as many wish to believe. In Roth’s novel, a Jewish extended family in Newark experiences fascism’s arrival in America, with the 1932 election of Charles Lindbergh to the presidency, as an intrusion of European extremism against an American Way—and a short-lived one, at that. When many white thinkers ponder Roth’s Plot or the sardonic title of Sinclair Lewis’s 1935 novel It Can’t Happen Here, they often miss the many ways in which it has already happened here. After all, the public prominence of the Ku Klux Klan and the massive riots that rock the climax of Roth’s novel were already regular features of American life, depending upon where you lived. For example, in early May 1927, just a few weeks before the historical Lindbergh took off from Roosevelt Field in the Spirit of St. Louis, some 5,000 whites rioted in the black business district of Little Rock, Arkansas, where they burned the body of man named John Carter. Local police officers, many rumored to be Klan members, did nothing to stop the violence—and may have even taken part.

Black observers of current trends, however, tend to be a little more astute. For example, in his February 21, 2020, New York Times column, Jamelle Bouie argues that the expansive authoritarianism of Donald Trump has its analogue in the Jim Crow South. But we should not consider the Venn diagram of fascism and Jim Crow to be a circle. The reality is a little more complicated.

The word “fascism” has also long been employed as a political Rorschach blot. As early as 1946, just one year after the end of World War II, George Orwell was complaining of this fact in his essay “Politics and the English Language,” writing: “The word Fascism has now no meaning except in so far as it signifies ‘something not desirable.’” But let us take this as a working definition:

Fascism is the attempt, birthed in reactionary politics, to resolve the contradictions of democracy for purposes of preserving elite power against the demands of the masses.

This will need some explanation. Although we today associate democracy with high ideals, its origins are a bit grubbier. For example, opposition to the Angevin kings of England (which led to the Magna Carta) included, according to historian Robert Bartlett, such charges as “heavy taxation, elevation of low-born officials, slow and venal justice, disregard for the property rights and dignities of the aristocracy.” Much of the Magna Carta focuses upon preserving the property rights and prestige of the aristocracy while limiting the king’s ability to levy certain taxes without “the common counsel of the kingdom.” The eventual emergence of a mercantile class in the late Middle Ages and early Renaissance sparked another expansion of “democracy,” as the bourgeoisie sought similar privileges in order to protect their own wealth. With industrialization, and the eventual concentration of the lower classes into cities, the emerging proletariat began to press for access to the franchise itself on both sides of the Atlantic, as exemplified in the UK by the Reform Act of 1832 and in the US by Jacksonian democracy and the removal of property qualifications for the vote. 

At each step in the expansion of democracy, those who already possessed the franchise feared the loss of their power and wealth by allowing any “lower” classes the privilege of voting. The United States has been much more a racial society than a class society along the lines of the UK, and so here it was easier to get elite buy-in to the idea of universal male suffrage, so long as those males were exclusively white. The abolition of slavery and the expansion of suffrage to those former slaves and their eventual descendants provoked the rage of the south’s idle landlords, who initiated a campaign of violence in the immediate aftermath of the Civil War in order to return to the status quo ante of black servitude and submissiveness. What historians call the first Ku Klux Klan was an elite project to scuttle the political empowerment of African Americans.

The “contradictions of democracy” can be seen in this struggle between those who believe that republican government should preserve elite power and the democratic desire that all citizens be given a voice in how they are governed. Fascism is an attempt to short-circuit this tension through the advancement of a purely corporate figure who is cast as the savior of “the people,” not by empowering them but rather by emphasizing his own unique attributes to act on their behalf. As the Israeli scholar Ishay Landa points out in The Apprentice’s Sorcerer: Liberal Tradition and Fascism, while fascism regularly employs the rhetoric of collectivism, it centralizes such collective and democratic yearnings upon the individual strongman leader, so that he becomes democracy personified, the one true spokesman for “the people,” who no longer need engage in self-governance. 

But there is more to it. As historian Aristotle Kallis observes in Genocide and Fascism: The Eliminationist Drive in Fascist Europe, fascist ideology was born with the specific aim of seeking redemption from recent “humiliations” by latching onto the glories of the past to drive a new utopian future. This “redemption” manifested itself externally, through expansionist policies of conquest, and internally, through a “cleansing” of the population aimed at eliminating those figures responsible for recent humiliations: socialists, communists, Jews and other minority groups. The drive to “cleanse” the state, Kallis writes, “helped shape a redemptive licence to hate directed at particular ‘others’ and render the prospect of their elimination more desirable, more intelligible, and less morally troubling.” 

This has been just a brief overview, but it allows us to draw some parallels between fascism and Jim Crow. Both fascism and Jim Crow were means of limiting democratic participation and thus the political and economic emancipation of certain “others.” And both fostered a “license to hate” that resulted in massive violence against the enemies of the elite. But there are more parallels. As Landa writes in Fascism and the Masses: The Revolt against the Last Humans, 1848–1945, “Rhetoric of honoring labor aside, the Nazis strove to achieve the exact opposite: keeping wages low and increasing working hours, which was precisely what German business was insisting should be done throughout the years of the Weimar Republic.” Much the same held true in the Jim Crow South, where particular ire was reserved for those who resisted the southern tradition of racialized economic exploitation. In June 1919, after Clyde Ellison refused to work for Lincoln County, Arkansas, planter David Bennett for a mere 85 cents a day, he was hanged from a bridge, with a sign attached to his body reading, “This is how we treat lazy niggers.” Later that year, and not too far away, white mobs and soldiers would slaughter untold numbers of African Americans, in what has become known as the Elaine Massacre, for daring to organize a farmers’ union.

Although both fascism and Jim Crow constituted violent means of securing elite power, there are important distinctions to note. While southern states had their share of demagogues, Jim Crow was a multi-generational project of the Democratic Party, one not centered upon any particular individual. Too, while both fostered a “license to hate” against racial and ideological others, the Jim Crow project made a distinction between “good negroes” who “knew their place” and “bad negroes” who sought the privileges reserved to whites. The latter may have to be killed, and the region “cleansed” of those “outsider” whites who spread dreams of equality, but black people who were dutifully submissive could be tolerated in so far as their unpaid or underpaid labor created the region’s wealth.

Back in 2004, The Plot Against America was widely regarded as a commentary about the administration of George W. Bush. No doubt, the 2020 television adaptation will be viewed in the light of Donald Trump, whose rhetoric and policies have been compared by critics to both fascism and Jim Crow. Perhaps the television series will exhibit a more sophisticated understanding of fascism and America than did the book. Perhaps not. Either way, the series should provide a good opportunity for historians to educate the public on who, exactly, lay behind the centuries-old plot against all Americans.

The Three Daring Women Who Traversed the Himalayas

 

 

Antonia Deacock, Anne Davies and Eve Sims were three rather extraordinary women who, when in their mid-twenties and thirties, set off overland from England to Tibet in 1958.  Their aim was climbing one of the Himalayas’ unexplored high peaks. They made the 16,000-mile drive to India and back, adding a 300-mile trek on foot to Zanskar, a remote province of Ladakh (part of Kashmir), and became the first European women to set foot there.

 

They called their adventure the Women’s Overland Himalayan Expedition, and were inspired and encouraged by their husbands, explorers and climbers themselves. Not wanting to be left behind "chained to the kitchen sink" when their husbands went on a trek, the women spent six months planning their own adventure. "It was just something we wanted to do," said Anne matter-of-factly. "And it was a good idea."

 

They had no vehicle, little money and no equipment, and didn’t even know each other very well at the start, but by working together they mustered enough support to fund their expedition, gaining sponsorship from, among other companies, Brooke Bond Tea (they mistakenly ordered enough tea “to keep a family going for 150 years,” according to Eve), Illustrated magazine, John Player & Son, even cosmetician Max Factor. They persuaded Land Rover to sell them a demonstration model of a modified long-wheelbase all-terrain vehicle at a significant discount, and the British Ministry of Agriculture and Fisheries donated steak, vegetables, and berries preserved with the new technology of freeze-drying. A publicity campaign before they left captured the public’s attention, and headlines such as “Four Fed-up Wives” (one member had to pull out at the last minute, due to an unexpected pregnancy) helped raise the required funds.

 

They weren’t completely naïve, however. Anne was fluent in Urdu and Hindi and had previously trekked in Kashmir with her husband, their baby strapped to the back of a mule. Sims had spent two years motorbiking around Australia and New Zealand, and before that had learned to climb in Wales. Deacock was an experienced rock climber. They weren’t the kind of women who would let the fact that two of them couldn’t even drive when they started planning their trip deter them.

 

Eve and Antonia had not yet had children, but Anne left behind her three sons, aged 15, 14 and five. "I didn't feel guilty," she said. "Because Lester had gone off on expeditions, and this was my turn."

 

Their five-month journey saw them battle illness, delays, inhospitable terrain, the effects of altitude, and terrible roads. They occasionally had to fend off unwelcome advances and negotiate with recalcitrant porters. Rest days were spent scrubbing their laundry on rocks in a nearby river, catching up on correspondence–communication was slow and difficult, and they could only occasionally get word to their husbands and families that they were safe–and checking their supplies. The entire trip was a considerable feat of organisation – after the women estimated what they would need to last the entire trip, several crates were shipped to Bombay and had to be retrieved from the docks after bureaucratic delays.

 

Due to the heat in Iran, they were often forced to drive at night, and were faced with constant enquiries as to where the ‘sahibs’ were and incredulity that the women were travelling on their own. They camped almost everywhere, and were welcomed by Land Rover’s agents in the European cities they travelled through, who praised the women for their maintenance of the vehicle, though they often expressed surprise that they successfully completed the trip. 

 

Following an audience with the Indian Prime Minister Nehru, they were granted the rare privilege of travelling beyond the “inner line,” a boundary across India and Tibet that had been drawn up in the 1800s and beyond which no British subject might rely on government protection or rescue. No non-Indian had been granted permission to cross the inner line since before the Second World War, and it was one of the last regions untraveled by Europeans on the planet. They also were thought to be the first European women to cross Afghanistan unescorted. 

 

Fording fast-flowing meltwater streams, sometimes up to their waists in water, they trekked through snow and ice, climbing an 18,700-foot peak and naming it Biri Giri (“Wives Peak”). As they travelled, they passed through villages untouched by European contact, meeting locals who had never seen things such as zippers or nylon climbing ropes. One elderly woman was terrified by the sight of her own face in one of their mirrors. “How privileged we were to witness and partake in societies that were virtually strangers to the modern world that we know,” said Antonia afterwards.

 

The women were bound by a common desire to prove themselves. "I’d been my father’s daughter, my husband’s wife," said Eve. "But this time I was somebody on my own." Despite living in close proximity and enduring many hardships, surviving dust and discomfort, the freezing cold and the intense heat, they claim never to have argued, preferring to thoroughly discuss issues and abide by a majority rule on decisions.

 

After their expedition, they carried on the spirit of adventure in their lives. Antonia wrote a book about their exploits, then eventually moved to Australia, where she and her husband established an Outward Bound school and the world’s first dedicated adventure travel company, forging close links with Nepal along the way. Eve had three children and went on to run an Outward Bound centre with her husband, and Anne helped her husband run an Outward Bound school in the Lake District.

 

The 1950s were a time when relations between Britain and other Western countries and the newly independent nations of South Asia were in their infancy, and the women’s fearless endeavour is all the more remarkable for it. Although trekking in remote lands is now far more common, it is unlikely that they would be able to make such a journey today, particularly through Afghanistan and Iran.

 

A film about their expedition, by Pulse Films and Britain’s Film4, is in development. Antonia Deacock’s book, No Purdah in Padam, is now out of print, but available from antiquarian booksellers.

As We Zoom into Online Learning….

 

 

“Zoom” – this playful kid-word, which once referred to fast cars, now signals a fast-approaching sea-change in higher education, unfolding before our eyes thanks to COVID-19 and the need to move live university classes online. Zoom, for those who do not know, is the video meeting platform through which we faculty are migrating our classes to an online format.

 

We all might have larger things to worry about in the next few weeks and months, like our loved ones, our colleagues, and our students becoming terribly ill. If this happens, then the nature of online education will hardly be our biggest problem. 

 

But for the moment at least, as an academic who has been teaching at state universities for nearly thirty years, I am torn concerning the issue at hand. On the one hand, the students who signed up for my History of the Holocaust class this semester at the University of Florida did so because they were interested in the topic, some intensely so. As I try to move my lectures and discussion sessions from a classroom to a Zoom format, I want to provide something as close to the classroom experience as I can. On the other hand, I suspect, as do many of my colleagues, that university administrators and state legislators throughout the US will one day study this crash experiment in online education very closely. Are we academics showing them how they might replace us in the name of heightened efficiency? 

 

We can agree that some of the efficiencies are indeed desirable. Those of us who remember putting books on reserve in the library for twenty-five students at a time will attest to this. A certain number of online classes, moreover, have existed for the last couple of decades, helping place-bound students and those students, younger and older, who work full time. But what happens when everything goes online, and all at once? If we discover that all classes can be delivered online from a remote location, then what is the point of having lecture halls, classrooms, or for that matter a diverse faculty of broad expertise and talents? The arguments that have been percolating in universities for the past decade will intensify overnight. Yes, there are faculty who have put in an immense amount of time in order to develop fine online experiences. But I have also seen half-baked efforts over the years that are rather disastrous, even within the oft-cited rationalizing context of the apocryphal 1970s professor (I never actually had one of these guys) who mumbled through his yellowing lecture notes. 

 

My colleagues are proceeding cautiously. One colleague warned me not to record my lectures into the Zoom cloud, but to provide them live through Zoom. Everyone, I hear, is giving synchronous (“live” in Zoomspeak) as opposed to asynchronous (“recorded” in Zoomspeak) lectures. Anyone who has seen the infatuation with online learning in higher education administration over the past twenty years knows that this is hardly a paranoid reaction. The university would own the recorded content, as the work is done for the university in return for compensation. I actually recorded the first few lectures for my class. I needed to crawl with this technology before I could walk, and the students, I thought, would need time to adjust to the new reality of a full course load online as they simultaneously move from Gainesville back to their homes in Florida and elsewhere in the US.  Nonetheless, like some of my colleagues, I am uneasy even with synchronous content. If I have learned anything from other people’s travails with Facebook and Twitter over the years, it is that nothing put online, even briefly, is truly protected or truly deleted. 

 

And if I know what faculty will say once the experiment is over, I am less sure about the students.  I hope that they give a contextual yet roundly negative assessment -- something like, “I understand the situation, but I can’t wait to get back to live classes.” But today’s students are online-surveyed to death, starting with online evaluations of faculty each semester that most do not complete. Worse, twenty-somethings believe that they can effectively multitask–what others of us would call diffusing one’s focus. The professor telling them at the start of class to turn off their cell phones and laptops is now the professor who depends on these devices as we try to lecture or hold discussion sections from remote locations. Zoom actually has a feature that discloses a student’s level of attention—have they left their screen? Are they messaging? Are they watching Netflix? But I really don’t want to check that feature, and the fact that Zoom has it at all reveals the nature of the problem. Live classes promote a level of decorum that everyone in the classroom understands and from which everyone in the room benefits. But taking a class alone in one’s kitchen or bedroom? There is a reason that different rooms have different names, and in every language.

 

Finally, there is the quality of our own work, our pedagogical preparation. Like most of my colleagues in certain disciplines, I believe that each lecture and each discussion is the result of having worked at our craft over a period of years. What material shall we present to make a particular point about, say, Jewish resistance in the Warsaw ghetto, or about Reconstruction after the Civil War, or about Robespierre’s dictatorship? How shall we present it? What verbiage will we use? What visuals will we use? When will we leave the lectern for a stroll up the aisle? When will we pause and urge the students to think rather than just take notes? What questions will we pose to them when they discuss? How can we encourage them to interact and even debate with one another face-to-face-to-face, complete with expressions and gestures? How will we get them to understand that there are no black and white answers but only arguments, some thoughtful, some needing intensive development? 

 

These questions and many others form the very stuff that makes live higher education on a university campus an experience for faculty and students that cannot be replicated online, at least through the Zoom technology with which I have become familiar. Even if all of the technology “works,” how can our broader efforts, having been squeezed through the portal between a faculty computer and those of the students, come out undistorted on either side in ways that we cannot yet fully recognize? Zoom is fine technology—for conference calls. It enables business executives to talk to one another over long distances while presenting flowcharts and such. Academics can even have faculty meetings via Zoom, so that we ourselves can “multitask” to our heart’s content while discussing the minutiae of departmental by-laws.

 

But the real interaction that results in true learning? I am not sure at all. Zoom at its bandwidth-driven heart allows us to see one another and hear one another only to the point where we can talk to one another’s images (with cheesy optional backgrounds of the tropics or of outer space no less) and not speak to one another as human beings. We can see, but our vision is circumscribed. We can listen, but our hearing is muffled. We can connect, but our interaction is impeded.

 

For this, we all need to be, once again, in the same room. 

 

Let’s hope it is soon.

Use What You Know: Online Teaching Tools

If you teach history at any sort of educational institution, whether K–12 or higher ed, chances are your institution has “pivoted” to remote (online) instruction, if not closed campus entirely, as a response to the rapid spread of COVID-19 and the imperative for social distancing. Over the past week, a wave of these pivots and closures has left many of us scrambling for alternative means of engaging our students in an online and likely asynchronous setting. This is not the optimal way to teach and learn history, given that good online courses take more time and planning to develop than the handful of days most of us have been given.

So what do we do? How do we keep as many of the essential elements of our courses as possible, even if those elements look different online? We want to help our students stay engaged with history; that is, to still feel present in the course and to actively work with the course material. We want our students to discuss, analyze, work with primary sources, and communicate their interpretations to others…

Many historians have already been doing this kind of teaching online, so there may be no need for you to re-discover fire. Check out the #twitterstorians hashtag on Twitter to read and participate in ongoing conversations and resource sharing. Waitman Beorn, a senior lecturer in history at Northumbria University, has generously created a spreadsheet of teaching tools, digital history sites you can direct students to, digital tools for historical scholarship, digital humanities projects, and digital archives (use the tabs at the bottom of the spreadsheet to navigate between categories). H-Net has put together a repository for resources on teaching history online, which should soon grow into a thriving community. One of the most exciting cross-disciplinary efforts has been the “Keep Teaching” online community set up by the staff of Kansas State University’s Global Campus, an excellent virtual gathering spot where faculty, staff, and designers are sharing tips, tricks, techniques, and—most importantly—solidarity as we navigate this rapidly changing landscape together.

As historians and teachers, we pride ourselves on being able to engage students with the complexity and wonders of the past. Though our current circumstances are far different from what we anticipated, we have the research skills and critical faculties to help solve this new set of problems. Being analytical and discerning about the tools we use is a necessary part of that process, but so, too, is our discipline’s remarkable willingness to collaborate and share expertise. If you’re one of the thousands of us “moving online,” good luck, and see you on the internet!

The Cultural Constants of Contagion

 

Sometime during the year 541, a few rats found their way into Byzantium. Soon more would arrive in the city. Whether they came from ships unloading cargo in Constantinople’s bay, or overland in carts bringing goods from points further east, the rodents carried fleas harboring the Yersinia pestis bacterium. Byzantium’s citizens, ruled over by the increasingly erratic Justinian, had heard accounts of the plague as it struck down other nations first. The historian Procopius, who witnessed the ensuing pestilence, writes in his History of the Wars that the resultant disease “seemed to move by fixed arrangement, and to tarry for a specified time in each country, casting its blight… spreading in either direction out to the ends of this world, as if fearing some corner of the earth might escape it.”

Known as Justinian’s Plague, the pandemic marked the first appearance of the bubonic plague in Europe, almost a millennium before its more famous manifestation as the medieval Black Death. William Rosen writes in Justinian’s Flea that while contagion was a constant feature of human life, before the sixth century “none of them [had] ever swept across what amounted to the entire known world, ending tens of millions of lives, and stopping tens of millions more from ever being born.” By the pandemic’s end, epidemiologists estimate, perhaps a quarter of the known world’s population had perished.

There is something deep within our collective unconscious that dimly apprehends these past traumas. The accounts of a historian like Procopius sound familiar to us; echoes of such horror persist in everything from the skeleton masks of Halloween to the morbid aesthetic of the gothic to the pandemic horror movie–from the clinical Contagion and Outbreak to the fantastical The Walking Dead and 28 Days Later. We thrill so fearfully to narratives of apocalypse that when faced with our own pandemic there is something almost uncanny about it. For months, a few Western observers nervously read reports about the coronavirus outbreak in Wuhan, China. More people paid attention as cases emerged in Italy. Shades of Procopius, who reflected that “at first the deaths were a little more than the normal, then the mortality rose still higher, and afterwards the tale of dead reached five thousand each day, and again it even came to ten thousand and still more than that.” Today the coronavirus has reached almost every nation in the European Union, a majority of U.S. states, and every continent save Antarctica. Physicians have yet to fully ascertain the disease’s mortality rate, but thousands have already died, and contrary to the obstinate denials of the president of the United States, Covid-19 clearly seems to be more than “just the flu.”

Those of us who work in the humanities have, for more than a generation, been loath to compare radically different time periods and cultures–and for good reason. There can be a flattening to human experience when we read the past as simply a mirror of our own lives. Yet plague is, in some ways, a type of cultural absolute zero, a shared experience of extremity that does seem to offer certain perennial themes that approach universality. For the first time in more than a century, since the Spanish Influenza outbreak of 1918, we collectively and globally face a pandemic that threatens to radically alter the lives of virtually everybody on the planet. It behooves us to converse with the dead. Much is alien about our forebears, but when we read accounts of pestilence raging through ancient cities, it’s hard not to hear echoes of our own increasingly frenzied push notifications. 

There is a collection of morbid motifs that recur with epidemics–the initial disbelief, the governmental incompetence, the hoarding of goods, desperate attempts at protection (including the embrace of superstition), social stigma and bigotry, social distancing and quarantine, and of course panic. In the pilfered N95 surgical masks there are echoes of the avian masks worn by Renaissance plague doctors; in the rosemary, sage, and thyme clutched by medieval peasants there are precursors of the Purell which we’re all desperately using (albeit the latter is certainly more effective). As different as those who lived before us may be, the null void which is the pandemic does display a certain universality in how people react to their circumstances, born necessarily from biology itself. For example, gallows humor is an inextricable and necessary aspect of human endurance, though it can also mark a dangerous denialism. Catharine Arnold writes in Pandemic 1918 that in its earliest days, the Spanish flu was either denied or joked about: “At the [Cape Town, South Africa] Opera House, a cough in the audience provided the actor on stage with an excellent opportunity to ad-lib. ‘Ha, the Spanish flu, I presume?’ The remark brought the house down.” But “within days,” she adds, “that joke wasn’t funny anymore.”

Few emotions are as all-encompassing, understandable, and universal as panic. In the queasy feeling which a pandemic inculcates throughout a society there is a unity of experience across disparate time periods. Consider the language of The New York Evening Post in July of 1832, as it traced the inevitable arrival of a cholera epidemic into the city, whose inhabitants fled “from the city, as we may suppose the inhabitants of Pompeii or Reggio fled from those devoted places.” Naturally, hoarding is another predictable behavior, if often dangerous and self-defeating, as anyone who has tried to buy hand sanitizer or toilet paper in the last week can attest, or as seen in the squirreling away of face masks that public health workers urgently need. Procopius again notes that in Byzantium, “it seemed a difficult and very notable thing to have sufficiency of bread or of anything else; so that with some of the sick it appeared that the end of life came about sooner than it should have come by reason of the lack of the necessities of life.”

This panic can metastasize into rank irrationalism, as pandemics often lead to the embrace of quack cures or, more disturbingly, the promulgation of noxious hatreds. Philip Ziegler writes in The Black Death that during the pandemic of 1348, a “fashionable course of study” regarding the plague in France included “an ointment called ‘Black de Razes’” that was “on sale at apothecaries as a cure recommended for virtually any ailment.” Its effectiveness might be compared to the colloidal silver solutions hawked by televangelist Jim Bakker as a prophylactic against coronavirus. During the Black Death, fear was a convenient catalyst for old hatreds, as the plague was used to justify antisemitic pogroms. By dint of both their isolation and religious strictures that encouraged rigorous hygiene, medieval Jewish communities were sometimes spared the worst of the plague. Christians interpreted this as complicity in the pandemic, and so the Jews were made to suffer for their previous good fortune. John Kelly writes in The Great Mortality that antisemitism “bubbled up from the medieval Teutonic psyche,” as in Germany and other nations the religious cult of the Flagellants “believed the curse of the mortality could be lifted through self-abuse of the flesh and slaying Jews.”

Xenophobia has similarly marked more modern pandemics. In the first four years of the twentieth century, San Francisco saw America’s only prolonged outbreak of bubonic plague. Brought by ship into the city, where today a coronavirus-contaminated cruise ship lingers off the bay, the plague broke out in Chinatown, leading to hundreds of deaths. Marilyn Chase explains in The Barbary Plague that “public health efforts… were handicapped by limited scientific knowledge and bedeviled by the twin demons of denial and discrimination.” Rather than offering treatment, Mayor James D. Phelan spread fear about the city’s Chinese residents, anticipating today’s talk of a “foreign virus.” He called them “a constant menace to the public health,” while ironically exacerbating the outbreak through his policies. Meanwhile, California governor Henry Gage simply denied the existence of the plague at all, fearing harm to the economy more than the deaths of his fellow citizens.

If there is a commonality to the recurring themes that mark pandemics throughout history, it’s because of the physical reality of the ailment. Covid-19 can’t be explained away, can’t be ignored, can’t be obscured, can’t be denied. It can’t be tweeted out of existence. It turns out that internet memes aren’t the same thing as viruses. We may yet discover that in a “post-reality” world, reality has a way of coming to collect its debts. 

An important reminder, however: for all of the uncertainty, panic, and horror which pandemics spread, there is also the acknowledgment, at least among some, of our shared affliction, our collective ailment, our common humanity. Within a plague there are fears both rational and irrational, there are prejudices and panics, there is selfishness, cruelty, and hatred. But there is also kindness, and the opportunity for kindness. The story of pandemics contains flagellants, but also selfless physicians and nurses; it includes the shunning of whole groups of people but the relief of treatment as well. We’ve never faced something like the coronavirus in the contemporary Western world, but we’d do well to remember something strangely hopeful that Daniel Defoe observed in his quasi-fictional A Journal of the Plague Year, set in 1665, when the last of the major bubonic outbreaks killed a fifth of London’s population: “a close conversing with Death, or the Diseases that threaten Death, would scum off the Gall from our Tempers, remove the Animosities among us, and bring us to see with differing Eyes, than those which we look’d on Things with before.”

 

 

Bancroft Prize Goes to Books on Emancipation and Urban Renewal

A sweeping reconsideration of the complexities of Emancipation and a biography of the nearly forgotten mid-20th-century urban planner who reshaped Boston and other cities have won this year’s Bancroft Prize, which is considered one of the most prestigious honors in the field of American history.

Lizabeth Cohen’s “Saving America’s Cities: Ed Logue and the Struggle to Renew Urban America in the Suburban Age,” published by Farrar, Straus and Giroux, was cited for offering “a nuanced view of federally funded urban redevelopment and of one of its major practitioners that goes beyond the simplicity of good and bad, heroes and villains.”

Reviewing the book last year in The New York Times Book Review, Alan Ehrenhalt praised Dr. Cohen, a professor of American Studies at Harvard, for her “incisive treatment of the entire urban-planning world in America in the last half of the 20th century” and for her fair-mindedness in addressing what has become, he wrote, “a highly polarized subject.”

The second winner, Joseph P. Reidy’s “Illusions of Emancipation: The Pursuit of Freedom and Equality in the Twilight of Slavery,” published by the University of North Carolina Press, was cited by the prize committee for the way it builds on and departs from the huge existing literature on the subject to “deepen our understanding of the vagaries of Emancipation in the United States.”

UPDATED 4/6: What Historians Are Saying About COVID-19 and Trump's Response

Updated 4/5: Historians Discuss the Media's Coverage of COVID-19

Roundup Top Ten for March 20, 2020

The Shortages May Be Worse Than the Disease

by Elise A. Mitchell

Societies hasten their own destruction whenever they deny anyone health care, housing, or dispensation from work on the basis of employment, socioeconomic, or immigration status.

 

We Need Social Solidarity, Not Just Social Distancing

by Eric Klinenberg

To combat the coronavirus, Americans need to do more than secure their own safety.

 

 

Hurricane Katrina Provides Lessons about Closing Campuses

by Andre M. Perry

Students in New Orleans needed resources to return to normalcy. But when racial wealth gaps are the norm, a stumble can become a fall.

 

 

Work Requirements are Catastrophic in a Pandemic

by Elisa Minoff

Instead, we should be implementing policies that support people’s work in the wage labor force and make it possible for working families to make ends meet.

 

 

Counting Everyone—Citizens and Non-Citizens—In the 2020 Census is Crucial

by Brendan A. Shanahan

Even without a citizenship question, the Trump administration wants to shape how states reapportion their legislatures.

 

 

Why Sanders Isn’t Winning Over Black Voters

by Keeanga-Yamahtta Taylor

For millions, even when government “works” it is not working for them.

 

 

An Epicenter of the Pandemic Will Be Jails and Prisons, if Inaction Continues

by Amanda Klonsky

How will we prevent incarcerated people and those who work in these institutions from becoming ill and spreading the virus?

 

 

Democracy: How 1860 Connects to 2020

by Daniel W. Crofts

In the years before the Civil War, just as today, minority rule was the norm. White Southerners dominated the Democratic Party, and the Democratic Party dominated the federal government.

 

 

College Worth Fighting For

by Ryan Boyd

Professors are in a class struggle, a real fight that cannot be won with critique alone.

 

We Can’t Forget Women as We Tell The Story of COVID-19

by Jennifer Brier

Women who have been medical (and political) subjects of HIV/AIDS also have much to teach us during our current pandemic.

An Interview with Mary V. Thompson on the Lives of the Enslaved Residents of Mount Vernon

 

Mount Vernon Historian Mary V. Thompson is the author of “The Only Unavoidable Subject of Regret”: George Washington, Slavery, and the Enslaved Community at Mount Vernon (University of Virginia Press, 2019).

Robin Lindley is a Seattle-based writer and attorney. He is features editor for the History News Network (hnn.us), and his work also has appeared in Writer’s Chronicle, Crosscut, Documentary, NW Lawyer, Real Change, Huffington Post, BillMoyers.com, Salon.com, and more. He has a special interest in the history of human rights and conflict. He can be reached by email: robinlindley@gmail.com.

 

Drawing on years of extensive research and a wide variety of sources, from financial and property records to letters and diaries, Ms. Thompson recounts the back-breaking work and everyday activities of those held in bondage. Without sentimentality, she describes the oppressive working conditions; the confinement; the diet and food shortages; the illness; the drafty housing; the ragged clothes; the spasms of cruel punishment; the solace in religion and customs; and the episodic resistance.

Ms. Thompson also illuminates the lives of George and Martha Washington through their relationships with their black slaves. Washington was a strict disciplinarian with high expectations of himself and his slaves. As a young man, he callously bought and sold slaves like cattle. However, as Ms. Thompson explores, his attitudes toward slavery and race changed with the American Revolution, when he saw black men fight valiantly beside white troops. Although he was not a vocal abolitionist, his postwar statements reveal that he found slavery hypocritical and incompatible with the ideals of democracy and freedom for which he had fought. He was the only Founding Father to free his slaves in his will.

Ms. Thompson brings to life this complicated history of enslaved people and their legendary owner. Her careful explication of the many aspects of life at Mount Vernon offers a vivid microcosm through which readers can better understand the institution of slavery and its human consequences during the colonial period and the early decades of the republic.

Since 1980, Mary V. Thompson has worked at George Washington's Mount Vernon in several capacities, and she currently serves as Research Historian, supporting programs in all departments at Mount Vernon with a primary focus on everyday life on the estate, including domestic routines, foodways, religious practices, slavery, and the slave community. She has lectured on many subjects, ranging from family life and private enterprise among the slaves, to slave resistance, to religious practices and funerary customs in George Washington's family. Her other books include “In the Hands of a Good Providence”: Religion in the Life of George Washington and A Short Biography of Martha Washington. Ms. Thompson also has written chapters for several books, entries in encyclopedias, and numerous articles. She earned an M.A. in History from the University of Virginia.

Ms. Thompson generously responded by email to a series of questions on her work and her new book on the slave community at Mount Vernon.

 

Robin Lindley: Congratulations, Ms. Thompson, on your recent book on George Washington and enslavement at Mount Vernon. Before getting to your book, I wanted to ask about your background. How did you decide on a career as a historian?

Mary V. Thompson: My father was a major influence on that.  He served for 32 years as an Army Chaplain and, through quite a few moves, would drag us to nearby museums and historic sites and encourage us to read about the next place we were going and all the exciting things that happened there, so we were pretty psyched by the time we got there. He was also the first curator of the Army Chaplains Museum, when it was in Brooklyn, during the Bicentennial of the Revolutionary War.  As part of that job, he also edited a 5-volume history of the Chaplains Corps, while writing the first volume, which covered the American Revolution.  So, as I went through high school, I helped in the museum with some of the exhibits, helped with acquisitions, and with research. I loved all of it.

Robin Lindley: I understand that you’ve spent most of your professional career as a historian at Mount Vernon. How did you come to work at this historic plantation and what is your role?

Mary V. Thompson: This was definitely a result of serendipity---or providence, depending on your world view.  I was getting ready to finish a master’s degree at the University of Virginia, while working as a volunteer for the Army Ordnance Museum at Aberdeen Proving Ground in Maryland, and sending out what felt like bazillions of resumes for jobs all over the country.  I started out part-time [at Mount Vernon] as an historic interpreter (giving tours to about 8,000 visitors per day).  From there, I moved on to doing special projects for the Curator, then to assisting full-time in the Curatorial Department.  I moved up to being the Registrar in the Curatorial Department, which involved cataloguing new objects as they came into the collection, keeping track of where everything was, doing inventories, working with insurance companies, etc.  

To keep me from going nuts, they gave me one day per week to do research on a specific, agreed-upon topic, the first of which dealt with foodways. After a few years, my boss asked me to switch to studying slavery and slave life at Mount Vernon. In the late 1990s, as the 200th anniversary of George Washington’s death was rapidly approaching, I worked on three major projects: a travelling exhibition entitled “Treasures from Mount Vernon: George Washington Revealed,” which opened in late 1998 and travelled to five cities around the country; redoing the furnishings in the mansion, with special exhibitions to make the house look as though the Washingtons had just walked out of the room; and the recreation/reenactment of George Washington’s funeral, a three-hour event on C-Span.

I was then moved to the Library, where I worked as the Research Specialist and then as Research Historian. This involved fielding questions from people all over the country, generally about domestic life here at Mount Vernon; helping authors, illustrators, and publishers by vetting publications; and helping pretty much every department on the estate with apt quotes and with deciding whether we had enough information on a particular subject to build a special exhibit or program around it. Best of all was the opportunity to give talks on and publish my own research.

Robin Lindley: What sparked your recent book on enslavement at Mount Vernon? 

Mary V. Thompson: I actually started working on the topic in the late 1980s, because Mount Vernon really needed to be able to teach its staff and visitors about this issue, but it was probably about seven or eight years after that before I knew it wanted to be a book. It was in the early 1960s that I first learned about slavery, as a result of the Civil War centennial, which was going on when I was an elementary school student, at the same time that the Civil Rights movement was playing out on the news every night during dinner. Then in graduate school at the University of Virginia in the late 1970s, slavery was the subject of much of our reading and classroom discussions.

Robin Lindley: Your book has been praised for its impressive detail and extensive research. What was your research process?

Mary V. Thompson: Thankfully, I was able to start with some of the sources compiled by prior members of our Library staff.  One of the Librarians had put together a bound volume of statements by George Washington on the topic of slavery, which she’d typed up back in the 1940s.  I went through that, page by page, listing the topics covered on each and then photocopied the pages and put them into loose-leaf binders for each of those topics.

I also went through bound volumes of photostats of the Weekly Work Reports that Washington required from his overseers, as well as photostats of his financial records. The Weekly Reports provided detailed information on the work being done on each of the five farms that made up the Mount Vernon estate, as well as on the weather each day, the food delivered to each farm, the number of people working on each farm, and explanations for why certain people were not working each week. This last category was really interesting, because it provides information on illnesses, injuries, childbirth, and how long women were out of work recovering from giving birth.

Another great source was correspondence by family members other than George Washington, as well as descriptions of Mount Vernon by visitors to the plantation, that often mention those enslaved there.  In order to understand where Mount Vernon fit in the overall picture of plantations in Virginia, it was also necessary to learn about life at Monticello, Montpelier, Sabine Hall, and elsewhere in the colony/state.   

Robin Lindley: You reconstruct and put a human face on the lives of slaves at Mount Vernon—despite the virtual lack of any contemporary documents by slaves from that period. How did you deal with that challenge?

Mary V. Thompson: Getting at the enslaved community was one of my favorite parts of this project. I started by taking the two fullest slave lists, from 1786 and 1799, and used them to try to reconstruct families. Thankfully, these two lists enumerated the people on each of the five farms and what their work was, with the 1786 list linking mothers to their children who were too young to work and giving the ages of those children. The 1799 list did the same, but also linked women to their husbands and told where those husbands lived (whether they were on the same farm with their wives and children, lived on another of Washington’s farms, belonged to another owner altogether, or were free men).

Comparing the two lists made it possible to start reconstructing extended, multigenerational families. I put together a document for each of the farms, organized by family, and then, as people were named in the work reports, the financial records, or correspondence, I would add those references to the individual records, if I was as sure as I could be that I’d found the right person.

For most of the people, I was keeping track of such things as information about what work they were doing; references to their health; children; ways they might have made extra money; rations of food and clothing; instances of resistance; etc.

Robin Lindley: I was impressed by your description of the massive size of Mount Vernon and the number of slaves who worked there. How would you briefly describe the Mount Vernon plantation in Washington’s era in terms of area, farming, crops, forests, and number of slaves? 

Mary V. Thompson: Mount Vernon reached an ultimate size of 8,000 acres during Washington’s lifetime. While Washington, like many plantation owners prior to the American Revolution, started out as a tobacco grower, by the late 1760s he was making the switch from tobacco to grain and from markets in Europe to American and West Indian markets. Much of the land remained forested after the switch in crops and markets. As I understand it, keeping fireplaces running on a daily basis for heating, cooking, and washing takes ten acres of forest to yield enough naturally dying trees and branches for those purposes, without the need to cut any more trees. The largest number of enslaved people on the plantation was 317 in 1799, the last year of George Washington’s life.

Robin Lindley: What are a few salient things you learned about Washington’s treatment of slaves? 

Mary V. Thompson: Washington was a stickler for detail and a strict disciplinarian.  He was also approachable when his enslaved workers had problems with their overseers, needed to borrow something, or someone was interested in moving from one plantation job to another that required more responsibility.  They even talked to him to clarify things, when he didn’t understand a particular problem.  

Robin Lindley: How did Washington’s military background affect his treatment of slaves and other workers?

Mary V. Thompson: Washington used the same methods to keep an eye on his army as he did on the plantation with his slaves.  He directed that both officers and overseers spend time with his soldiers and slaves, respectively; he expected regular reports from them so that he had a very good idea about how things were going and would also travel daily through his military camps and farms to catch problems before they became major issues.  He also insisted on proper medical care for both soldiers and slaves and was a strict disciplinarian in both situations.

Robin Lindley: How did Martha Washington see and treat slaves? It seems she was more dismissive and derogatory than her husband concerning black people.

Mary V. Thompson:  Like her husband, Martha Washington tended to doubt the trustworthiness of the enslaved people at Mount Vernon.  Upon learning of the death of an enslaved child with whom her niece was close, she wrote that the younger woman should “not find in him much loss,” because “the Blacks are so bad in th[e]ir nature that they have not the least grat[i]tude for the kindness that may be sh[o]wed them.”  

The Washingtons never seemed to realize that they knew Africans and African-Americans only as people who were enslaved, which meant that they were not interacting as equals, and that any ideas they may have had about the innate qualities of this different culture were tainted by the institution of slavery.

Robin Lindley: I realize that direct evidence from slaves is limited, but what did you learn about how slaves viewed George Washington? 

Mary V. Thompson: Because Washington was so admired by his contemporaries, many people came to Mount Vernon to see his home—and especially his tomb—and those visitors often talked with the enslaved and formerly enslaved people on the plantation in order to learn snippets about what the private George Washington was like.

Members of the extended Washington family, former neighbors, official guests, and journalists often wrote about their experiences at Mount Vernon and what they learned about Washington from those enslaved by him. Some of the enslaved were still angry about how they had been treated, while others were grateful for having been freed by him.

Robin Lindley: In his early years as a plantation owner, Washington—like most slave owners—saw his slaves as his property and he bought and sold slaves with seeming indifference to the cruelty and unfairness of this institution. He broke up slave marriages and families, and he considered black people indolent and intellectually inferior. However, as you detail, his views evolved. How do you see the arc of Washington’s life in terms of how he viewed his slaves and slavery?

Mary V. Thompson: That change primarily happened during the American Revolution.  Washington took command of the American Army in mid-1775.  Within three years, he was confiding to a cousin, who was managing Mount Vernon for him, that he no longer wanted to be a slave owner.  In those years, Washington was spending long periods of time in parts of the country where agriculture was successfully practiced without slave labor and he saw black soldiers fighting alongside white ones. He also could see the hypocrisy of fighting for liberty and freedom, while keeping others enslaved.  There were even younger officers on his staff who supported abolition.  

While he came to believe that slavery was something he wanted nothing more to do with, it was one thing to think that slavery was wrong, and something else again to figure out how to remedy the situation. For example, it was not until 1782 that Virginia made it possible for individual slave owners to manumit their slaves without going through the state legislature. After an 8-year absence from home, during which he took no salary, Washington also faced legal and financial issues that would hamper his ability to free the Mount Vernon slaves.

Robin Lindley: Many readers are familiar with the story of Thomas Jefferson and Sally Hemmings. Did you find any evidence that George Washington had intimate relationships with any of his slaves or any free blacks?  

Mary V. Thompson:  Not really. As a young officer on the frontier during the French and Indian War, one of his brother officers wrote a letter, teasing him about his relationship with a woman described as “M’s Nel.”  The wording suggests several possibilities: she might have been a barmaid working for a tavern owner or pimp, whose first initial was M; another possibility is that she was the mistress of a brother officer; or perhaps that she was enslaved to another person.  With the minimal evidence that survives, there are many unanswered questions about this mystery woman.

The oral history of an enslaved family at Bushfield, the home of Washington’s younger brother, John Augustine Washington, alleges that George Washington was the father of a young male slave named West Ford, who was born in Westmoreland County, Virginia, roughly 95 miles from Mount Vernon, about a year or two after the American Revolution.  Here, the surviving documentary evidence contradicts the oral history, indicating that Ford’s father was someone in the Bushfield branch of the family.

Robin Lindley: What struck you particularly about the working conditions for slaves at Mount Vernon and how did they compare to conditions at other plantations?

Mary V. Thompson: As was true on other Virginia plantations in the eighteenth century, the enslaved labor force at Mount Vernon worked from dawn to dusk six days per week, with the exception of four days off for Christmas, two days each off for Easter and Pentecost, and every Sunday throughout the year. Because Easter and Pentecost took place on Sunday, which was already a day off, the slaves were given an additional day off on the Monday following the religious holiday. If they were required to work on a holiday, there is considerable evidence that they were paid for their time on those days.

Robin Lindley: What are a few things you’d like readers to know about the living conditions of slaves at Mount Vernon?

Mary V. Thompson:  Most of the enslaved residents at Mount Vernon lived in wooden cabins—the smaller ones served as homes for one family, while the larger “duplexes” housed two families, separated by a fireplace wall.  

The majority of Americans at this period, free and enslaved, lived in very small quarters.  In comparing the sizes of cabins used by enslaved overseers and their families at two of the farms at Mount Vernon with those of the overseer on a plantation in Richmond County, the two at Mount Vernon had a total living space of 640 square feet, while the other had 480 square feet.

The homes of 75% of middle-class white farmers in the southwestern part of Virginia in 1785 were wooden cabins ranging from 394 square feet to 640 square feet. Our visitors tend to be very surprised to learn that the entire average Virginia home of a middle-class or poor family in the eighteenth century would fit easily into just “the New Room,” the first room they enter in the Mount Vernon mansion. In other words, pretty much everyone was on the poor end of the scale, unless they were like the Washingtons, the Custises, or the Carters.

Robin Lindley: I was surprised that some of the Mount Vernon slaves were literate. I had thought that education of slaves was illegal then. 

Mary V. Thompson: There were no restrictions on teaching slaves to read in eighteenth-century Virginia; in fact, literacy might have been a useful skill, especially for slaves working in more of a business capacity than in agricultural labor. It was not until after a slave revolt known as Gabriel’s Rebellion (1800) that the state passed a law forbidding enslaved people to gather together in order to learn to read. At least one historian has suggested that between 15 and 20 percent of slaves could read in the 18th century.

Robin Lindley: You found evidence that many slaves were aware of African lore and practices—at times from stories passed down through generations and at times from black people more recently arrived from Africa. What are some things you learned about African influences?

Mary V. Thompson:  African influence can be seen in everything from naming practices within families, to family lore and folk tales told to children, the languages spoken in the quarters, religious beliefs and practices, and even some of the food and cooking traditions.

Robin Lindley: You note that slaves were punished physically at Mount Vernon and that even Washington at times applied the lash. What did you find about forms of punishment at the plantation?

Mary V. Thompson: One of the changes on the plantation after the war, recorded by Washington’s secretary Tobias Lear, was that his employer was trying to put limits on the physical punishment doled out to the enslaved. According to Lear, Washington wrote that no one was to be punished unless there was an investigation into the case and “the defendant found guilty of some bad deed.” After the war, Washington also tried to use more positive reinforcement, instead of punishment, in order to get the sort of behavior he wanted. Those positive reinforcements included such things as the chance to get a better job, monetary rewards, or even better quality clothing.

Robin Lindley: What happened to slaves at Mount Vernon who escaped and were recaptured? 

Mary V. Thompson: It would depend on the circumstances and how difficult it was to get them back.  Some people might run away briefly because of a conflict with someone else in the quarters, or with an overseer and needed a breather to let the situation cool off.  Others might have left to visit relatives on another plantation.  If they were not gone long and came back on their own, there might be little punishment.  In other cases, if someone continually ran away or was involved in petty crimes, they might be punished physically or even sold away.  

We know of at least one slave who was sold to another plantation in Virginia after running away four times in five years; of three occasions when George Washington sold a person to the West Indies, something many people today consider akin to a death sentence; and of one case where a young man at Mount Vernon—and his parents—were told that he would be sold there as well if he didn’t start exhibiting better behavior.

Robin Lindley: Did you find examples of slave resistance?

Mary V. Thompson: Yes, many. When people today think of resistance, most are probably thinking of things like running away, physically fighting back against an overseer, stealing something to eat, or poisoning someone in the big house. Not everyone was brave enough or desperate enough to do something so easily detectable. They might well have tried something less obvious, like slowing down the pace of work, procrastinating on finishing a particular job, or even pretending to be sick or pregnant.

Robin Lindley: Oney Judge Staines was a Mount Vernon slave who escaped to New Hampshire a few years before Washington died. Washington was angry and vigorously sought her return, but was unsuccessful. Did you find new information on this fascinating case?

Mary V. Thompson: It wasn’t exactly new information, but the fact that this young woman was one of the “dower slaves” from the estate of Martha Washington’s first husband meant that Martha did not own her or any of the others, but only had the use of them (and any offspring they had) until her death. George Washington would lose access to those slaves upon Martha’s death, when the dower slaves would be divided among the heirs of her first husband, who in this case were her four Custis grandchildren.

According to a Virginia law at the time, if any dower slave from that state was taken to another state, without the permission of the heirs—or presumably the guardian of those heirs if they were minors—then the heirs or the guardian acting on their behalf would be entitled to take the entire estate immediately, without having to wait for the death of either the husband or wife. Oney’s escape may well have threatened the entire Custis estate.

Robin Lindley: You note that Washington was the only slave-owning Founder who freed all of his slaves in his will. You also note that he seemed circumspect and perhaps ashamed about owning slaves later in his life. Did he ever speak out publicly for the abolition of slavery in his lifetime?

Mary V. Thompson: It depends on what a person means by “publicly.” Washington corresponded with quite a few abolitionists, both British and American, after the Revolution. In response to those people who were pushing him to emancipate those he held in bondage, Washington typically responded that he thought the only legitimate way to do that was through a gradual process of manumission, much like the northern states were setting up. He noted that he would always vote to forward such a plan; however, he never stood in front of a legislative body as a proponent of such a plan.

Robin Lindley: What do you hope readers take from your groundbreaking book? 

Mary V. Thompson:  I would like people to understand that slavery in eighteenth-century Virginia differed from the same institution in both the seventeenth and nineteenth centuries, and that it was a complex institution.  For example, there were people at Mount Vernon who were free, hired, indentured, and enslaved.  They came from many countries and cultures on two continents, represented a variety of both European and African religious traditions, and began their relationships speaking many different languages.  

Robin Lindley: It’s a complicated story. Thank you very much for your thoughtful comments, Ms. Thompson, and congratulations on your illuminating book on the Father of the Country and enslavement on his plantation.

 

When the Western Wall Was A Battleground For Jewish Rights

 

Moshe Phillips reviews this book as part of Herut North America's Zionist History Book of the Month series.

 

The book The Western Wall Wars (Whirlwind Press, 2019) tells the stories of the young men who, from 1930 to 1947, violated British regulations that banned the sounding of the shofar at the Western Wall at the conclusion of Yom Kippur services each year.

 

Moshe Zvi Segal was the first of these young men, and he was arrested for sounding the shofar. He blazed a path for young Zionist revolutionaries to follow in what became the longest-running operation in the history of the Zionist underground.

 

Rabbi Moshe Segal (1904-1985) was the quintessential Zionist rebel and a key figure in the histories of Betar, Brit HaBiryonim, the Irgun, LEHI (the Stern Group), and the Haganah, and he was the founder of the Brit HaShmonaim religious youth movement. All of these organizations were part of the movement initiated by Zev Jabotinsky (1880-1940), the greatest pre-World War Two Zionist leader after Theodor Herzl. Segal himself was a close comrade of Yitzhak Shamir when the future prime minister was a 1940s commander of LEHI.

 

Author Zev Golan knew Rabbi Segal personally and interviewed him many times, in addition to attending his lectures and translating his writings. Golan is one of only a handful of Americans who made it their business to seek out the aging heroes of the Irgun and LEHI while they were still alive and to get to know them, their stories, and the ideas that animated their deeds.

 

The Western Wall Wars is subtitled How the Wailing Wall Became the Heroic Wall and is a direct result of Golan's relationship with Segal. In 1930, Segal became the first individual to violate the British regulations against the sounding of the shofar at the Western Wall at the conclusion of the Yom Kippur service. Until 1947, a volunteer from the Irgun, Betar, or the Brit HaShmonaim sounded the shofar every year–often after receiving personal training from Segal in both the mitzvah of the shofar and how to elude the British police. The British authorities went to great lengths to stop the shofar from being sounded. British efforts to stop Jews from performing a mitzvah will probably seem impossible for today’s readers to fathom, and that is just one of the reasons this book is so important.

 

The book explains how Segal and the others who followed in his footsteps transformed the Western Wall from a site of wailing to one of national pride. The book reveals the details of the actual operations at the Western Wall and the full stories of the volunteers who were arrested, escaped from prison, and/or deported to prisons in Africa. Some were involved in the 1946 Irgun attack on the King David Hotel and other Irgun or LEHI operations. Many later fought in Israel's wars. The Western Wall Wars also covers Arab attempts during the 1920s to drive the Jews from the Western Wall and the Jewish response to the Arab effort. Segal was a leader of the opposition in this area as well.

 

In the pre-1940 period, as the Jewish underground was emerging, the Jabotinsky movement suffered the slings and arrows of the leftist establishment and bravely soldiered on. The light of history has shown that the stances of the Jabotinsky Zionists were correct. If Jabotinsky had been more successful, perhaps the tragedies of the Holocaust and the loss of life in the 1948 war could have been lessened. Progressive historians have always downplayed–and often completely removed–the role of Jabotinsky's movement from their histories of Zionism. This book helps to preserve authentic history, and that is a highly praiseworthy thing.

 

The story of the Zionist underground in the pre-state period told here also helps the reader to understand the ideology that guided these warriors as they fought for Jewish rights and rebelled against the British Empire.

 

And this is no small thing. The ideology of Jabotinsky, Rabbi Segal, and their comrades is just as instructive and relevant now as it was many decades ago–probably more so.

 

Now that the Jewish People possess a sovereign Jewish State, the concept of just what a Jewish State should rightly be is of vital importance. In a June 2007 interview with Israel's Haaretz newspaper, Avrum Burg, a former Speaker of the Knesset and a former chairman of both the World Zionist Organization and the Jewish Agency, said, "To define the state of Israel as a Jewish state is the key to its end." Now, we live in a time when many radical Jewish organizations in the U.S. struggle to redefine Israel as something other than a Jewish State.

  

For today’s Zionists to be truly successful in a way that transcends politics and elections–in a nation-transforming way–we must reevaluate the philosophy of the heroes who fought for Israel’s freedom and for Jewish rights in Jerusalem. These heroes were not only the ideological heirs of Jabotinsky but the champions who brought Jabotinsky's deepest hopes into reality.
