Ira Chernus's MythicAmerica, a blog brought to you by History News Network.

Medicaid and the Eurozone Crisis: Beware the Wages of Sin

MythicAmerica.us is hereby launched, with special thanks to History News Network editor David Walsh, an excellent harbor pilot who has steered this new vessel out of its dock and headed us toward the deep waters of America’s mythic narratives. Now we’re free to explore wherever we like. Welcome aboard the maiden voyage.

Most posts on MythicAmerica.us will begin with something in the news that jumps out at a myth-seeker, like this gem: The states should reject increased Medicaid funding from Washington, now that Chief Justice Roberts has given them that option. Why? Because if the federal government spends more, “sooner or later you become Greece or Spain or Italy.” That explanation comes from Rich Galen, a Republican strategist who was Newt Gingrich’s press secretary when Gingrich was Speaker of the House.

You don’t have to be an ultra-conservative to fear seeing America turn into one of those Mediterranean lands. No doubt it’s a prospect that disturbs plenty of Americans across much of the political spectrum. 

How else explain the constant drumbeat of criticism of those three nations in the U.S. mass media? Yes, there’s an economic argument, which Galen compressed into this sound bite: “If you keep expanding unemployment insurance and expanding Medicaid and expanding food stamps,” eventually “the money runs out. ... The government cannot keep growing without fraying at the seams.” That’s what most Americans, who get their news and views from the mass media, think is happening to the Eurozone’s three southernmost members -- a stark proof of the dangers of “big government.”

And lots of Americans think they know why it’s happening: Those Spaniards and Italians and Greeks would rather take from the government teat than work hard to take care of themselves.

The only problem is that the facts don’t bear out this stereotype. There’s no correlation between how productive workers are and how well their national economies are doing.  If you measure productivity by GDP per hour worked, the way the Organization for Economic Cooperation and Development does, Spanish workers are more productive than those in Finland, Canada, or Australia. Italians are more productive than the Japanese or Israelis. Greeks are considerably more productive than South Koreans.

Nor is there any correlation between government support for human services and the health of the economy. Some nations are doing much better than the Mediterranean three, even though their government-funded safety nets are just as generous, or even more so. Places like Sweden, Finland, and Canada are hardly fraying at the seams.

Why, then, are the problems of the Mediterranean three so consistently blamed on their social safety nets? And why is this dubious explanation so rarely questioned in the American political conversation?  

Much of the answer surely goes back to that fictional image of the lazy southern European who would rather lie around in the sun and drink than put in an honest day’s work. It’s a time-honored stereotype in the U.S., part of a time-honored tradition of stereotyping.

For most of our history, most Americans of northwest European descent took it for granted that every nation had a distinct national character. In the nineteenth century and well on into the twentieth, “everyone knew” -- at least everyone who lived inside the dominant public discourse -- that Greeks love to eat and drink. Italians love to sing (especially opera), drink, and make love.

What about Spaniards? Well, there weren’t too many folks living in the U.S. who had emigrated from Spain. But south of the border, there were those millions of Hispanics, speaking Spanish. In mythic terms they couldn’t be separated from Spain itself. And “everyone knew” that they loved to do not much of anything at all, except drink, make love, and make trouble.

The prevailing myth of “national characteristics” arranged them in a hierarchy: Italy and Greece were somewhere in the middle, well below northwest European lands, but well above Africa and Native Americans. Spain, represented by Latin American Hispanics, was a notch below its southern European neighbors.

But all three nations had two things in common, in the heyday of this mythic view: First, dark skin. Before World War I, when large numbers of Greeks and Italians came to the U.S., they were widely seen by lighter-skinned Americans as a different race. Only gradually did they become “white people.” Hispanics -- and thus Spain itself -- have remained largely outside the “white” preserve, despite the Census Bureau and other official agencies counting them as a certain kind of “white.”

Second, southern Europe was not predominantly Protestant. Now that anti-Catholic prejudice has tapered off so much, it’s too easy to forget how virulent it was for most of American history. Most Greeks were not Roman Catholic. But they practiced a form of Christianity even more foreign, mysterious, and thus intimidating to American Protestants.

The lack of white skin and Protestant faith was a literally damning combination in the mythology that prevailed at least through the mid-twentieth century. Like every myth, this one has its internal logic: White Protestants are uniquely blessed in their determination and ability to control their bodily desires. That’s why they work so hard, forgoing today’s pleasures for tomorrow’s earthly and heavenly gain.

Darker-skinned non-Protestants, lacking this capacity for deferred gratification, obviously would rather indulge now at someone else’s expense. In a word, they are lazy. In another word, they are sinners. And the wages of sin, it turns out, are plummeting wages or outright unemployment in this world as well as eternal perdition in the next.

So there is no reason to take pity on their suffering. They’ve brought it upon themselves -- which is the basic message underlying the story of their economic suffering in the American mass media today.

Of course the facts of American labor history tell a very different story. Immigrants from Italy, Greece, and Spanish-speaking lands have typically done the most arduous (and often dangerous) work for the lowest pay, because they were determined to build a better life for their children. (The greatest roadblock to today’s anti-immigrant crusade is the business community’s acute awareness of this fact.)

Again, the question arises: Why has it been so easy for a mythic perception to eclipse obvious empirical truth? In this case, we should recognize that, like most myths, this one may have a kernel of truth. Southern Europeans are willing to work as hard as northern Europeans. But there probably is a significant difference in their cultural view of what people do when the working day is over.

The Northwest European Protestant traditions insist on a myth of absolute self-discipline, 24/7. To the south, there seems to be more acceptance of some degree of indulgence in sense pleasure once the work is done. (Perhaps the Catholic and Greek Orthodox theologies of confession, penitence, and indulgence play a role here, but that’s a question for specialists to debate.)

Of course Americans of Northwest European heritage knew that there was plenty of indulgence in their own communities, too. But their cultural traditions demanded that it be kept secret and largely denied. The best tool for denial is to point the finger of blame at some “other,” who is labeled, by definition, sinful. That makes “us,” the total antithesis, by definition virtuous. So a difference in cultural orientations was inflated into a dichotomy of saints against sinners.

This dichotomy dominated the discourse of white America through the early twentieth century. Only gradually has it faded over the last six or seven decades. The Latino community, along with the African American and Native American communities, still suffers from it in quite overt ways.

Does this mythic heritage directly affect the way American media report, and American people perceive, the economic situation in southern Europe? There’s no way to prove it. But the disparity between perception and empirical economic facts, plus the scarcity of discussion about that disparity here in America, suggests that some unrecognized factors are at work beneath the surface of our political culture. The mythic tradition I’ve sketched here must surely play a part.

Whatever those X factors are, they have very real consequences in the real world. A headline in the New York Times (mobile edition, July 5, 2012) sums it up: “Washington’s New Austerity Spreads Around the Pain.” Of course the pain isn’t spread evenly. As always, the lower down you are on the economic ladder, the more it hurts. But if most Americans are persuaded that they need more austerity to avoid the dread fate of becoming Greece or Spain or Italy, the pain will only get worse.

U.S.-China Policy: Hiding the Military-Economic Link

“For Clinton, an Effort to Rechannel the Rivalry With China.” Under that headline the New York Times’s Jane Perlez reports: “At a gathering of business executives in Cambodia this week, Secretary of State Hillary Rodham Clinton plans to urge the expansion of American trade and investment across Asia, particularly in Southeast Asian nations on the periphery of China.”

Perlez explains: “The extra attention devoted to economics is intended to send a message that Washington recognizes that it initially overemphasized the military component of its new focus on Asia, setting up more of a confrontation with China than some countries felt comfortable with. ... Both sides have an interest in channeling their rivalry into trade more than weaponry.”

This story has two implicit punch lines: China poses some military threat; the only question is how much. And the Chinese military threat is essentially independent of economic issues.

If Alfred Thayer Mahan could read the New York Times in his grave, he probably couldn’t decide whether to roll over or laugh -- or maybe both, considering what a howler Perlez offers from the perspective of U.S. foreign policy since Mahan’s time.

Rear Admiral Alfred Thayer Mahan, 1904. Credit: Library of Congress

In 1890, just as the U.S. was on the brink of its rise to global power, Mahan published his immensely influential treatise The Influence of Sea Power Upon History: 1660-1783. The argument in the book that impressed many future U.S. policymakers, most notably Theodore Roosevelt, was simple: A nation’s economy now depended on the volume of its global trade. To trade as much as possible, you’ve got to prevent rival nations from interfering with your trade. Since nearly all international trade in 1890 went by sea, that meant controlling the sea lanes and ports along those lanes for refueling -- controlling them by any means necessary.

When Roosevelt sent the famous Great White Fleet on its 15-month voyage around the world, he sent the clearest signal yet that the U.S. was adopting Mahan’s basic perspective: Global trade and global military power could not be separated.

That theory had its ups and downs at the highest levels in Washington. In 1916, as most Americans eagerly sought to avoid involvement in World War I, the Assistant Secretary of the Navy, TR’s distant cousin Franklin D., took a different view. His great passion was the sea and sailing vessels; he was an avid reader of Mahan’s works.

In a private letter, FDR wrote that human nature itself makes every nation greedy for all the power it could gain. “I am not at all sure that we are free from this ourselves,” he added. But his conclusion was right out of Mahan: “The answer is … ‘build the ships.’” In public speeches, he warned that if the U.S. did not build up its military strength, “anybody that wishes [would] come right along and take from us whatever they choose.” And “if you cut off the United States from all trade and intercourse with the rest of the world you would have economic death in this country before long.”

Roosevelt avoided such militant words for nearly two decades after World War I; they were so unpopular that they spelled political suicide. By the late ‘30s, though, with Germany and Japan threatening U.S. access to resources and markets in Europe and East Asia, he had little trouble getting Congress to approve huge increases in military spending.

The rest -- dare I say it? -- is history. Since World War II, U.S. foreign policy has pretty much taken for granted the Mahanite theory, summed up by Thomas Friedman in these memorable words (from The Lexus and the Olive Tree): “The hidden hand of the market will never work without a hidden fist. McDonald’s cannot flourish without McDonnell Douglas.” For those who need prose rather than poetry to get the point, Friedman told New York Times readers: “The emerging global order needs an enforcer. That’s America’s new burden.”

Of course U.S. political leaders don’t talk that way in public. Nor do most journalists, even in the Times. Typically, they tell the story the way Perlez does -- a way that denies the link between America’s military machine and its efforts to dominate global trade. Taxpayers aren’t likely to shell out such immense amounts every year to maintain overwhelming military power if it’s just to give U.S.-based multinational corporations overwhelming economic power. Some other stories have to be told.

So we get, every day, new variations and improvisations on the myth of national insecurity. China, Iran, Al-Qaeda, the Taliban: there’s no end to the list of candidates to play the role of “enemy.” Whoever gets the part, the story is always about a military threat that has no direct connection with America’s global economic aspirations.

I don’t mean to suggest that the story Mahan, FDR, and Friedman told is the only one that really moves federal policymakers and legislators to fund our extravagant military budget. There are a lot of interlocking myths that lead to the same result. I’ll be touching on many of them in future posts. And some of them do treat supposed military threats as quite separate from the economic sphere.

But it’s important to recognize how one version of the story, which portrays military and economic goals as competing with each other, can mask the way those goals have worked synergistically in U.S. policy and elite mythology for so many years.

Election 2012: What’s the Real Story Here?

Changes in the Electoral College from 2008 to 2012. Credit: U.S. Census Bureau

Come Election Day, we’ll learn two very important things: Who will be president for the next four years? What story will be told about the presidential election of 2012? I’m not sure which of those two is ultimately more important. Some presidential elections create stories that last longer than the presidents who get elected. Sometimes the stories may have even more impact than the presidents themselves.

Richard Nixon, for example, went down in disgrace. But the popular story told about his two winning campaigns -- “The people want law and order, not abortion, acid, and amnesty” -- still strongly affects our political life. So does the story that was used to sum up Ronald Reagan’s two victories: “The people want big government off their backs.”

So far, it seems like the story of 2012 will be a pretty predictable repeat of Bill Clinton's 1992 win: It’s the economy, stupid. If Obama wins, we’ll be told that most voters are optimistic; they believe the economy is generally on the upswing. If Romney wins, we’ll hear that most voters feel hopelessly mired in a seemingly endless recession.

In either of those versions, the candidates are passive victims of economic forces beyond their control. What they say or do doesn’t matter much at all.  That’s what most of the pundits are saying. And there is a whole body of research in political science to support that view.

Now, though, there are hints that another story might emerge on Election Day, the one the candidates themselves seem to favor: the voters are choosing between two profoundly different visions of what it means to be an American. On the Sunday after the Fourth of July, two top political journalists in the nation’s two most influential newspapers told readers that this election really is about choosing between those visions.

In the Washington Post, Dan Balz wrote:

On both sides, it is a choice between black and white with little in between. On one side, it is seen as the threat of big government, shackles on the economy and an end to freedom. On the other side, it is seen as shredding the middle class in order to reward the rich. Swing voters in the middle are being asked to pick one side or the other.

In the New York Times, Richard Stevenson wrote:

Presidential campaigns are never just about policies or even personalities. They tend to turn as much as anything on values, and the values in this case go to central questions about the psyche of the American electorate in 2012. ... Will the long-held assumption that the United States is an aspirational society that admires rather than resents success hold true? ... Do political leaders have less incentive to put the needs of the poor and the middle class ahead of the agendas of their benefactors?

Will the commentariat turn to this view of the election as a profound choice between competing worldviews? Or will it stick with the prevailing view that only the economic statistics really matter? That’s an important question I’ll be tracking here between now and Election Day.

If the U.S. Is The World's Fireman, Who Rebuilds The Burned-Out Neighborhoods?

U.S. Air Force firefighters during a training exercise. Credit: Wikipedia

One of the advantages of a mythic approach to political culture is that it gives us a chance to put the pieces of the puzzle together in new ways, opening up new, sometimes unexpected, perspectives. Today’s pieces are wildfires in the West and politics in the Middle East.

When fire ravaged some 360 homes in Colorado Springs, federally-funded firefighters were quickly on the scene. Soon Barack Obama was there too, offering more federal aid. I expected the mayor of the Springs, a bastion of shrink-the-government conservatism, to declare indignantly that his people could take care of themselves perfectly well, thank you. In fact, local officials didn’t just take the money. They asked for it even before the president arrived.

It reminded me of the time I had a small fire in my house. The firemen were there for hours, making sure every tiny ember was extinguished. When they wrapped up to leave, I felt like I should ask for the bill. I had to remind myself that when it comes to putting out fires, we Americans are socialists. We all chip in what we can and then take what we need.

In fact that’s what we do in all kinds of emergencies, whether natural or human-made. The heroic firefighters of 9/11 didn’t present anyone with a bill, either.

But nearly all that public funding goes for putting out the fire. What happens after it’s out and, as in Colorado Springs, whole neighborhoods must be rebuilt? There will be a bit of federal money for crisis counseling and unemployment assistance. Beyond that, Obama simply appealed for private charity and donations to the Red Cross.

Fire victims who have private insurance can probably restore their own homes pretty well. To those who were not insured, most of the good burghers of the Springs will simply say, “Well, whose fault is that? Not ours.”

And what about the roads, the electric lines, the sidewalks, the parks, and all the other public infrastructure that has to be rebuilt? In the Springs, they’ve already cut funding for all sorts of infrastructure drastically. They don’t even replace burned-out streetlight bulbs. They call it the American way: rugged individualism, getting big government off the backs of the people (as Ronald Reagan loved to say).

Perhaps they’ll make an exception for the burned-out neighborhoods, which have evoked so much public sympathy. Is it too cynical to think that it depends a lot on how much political clout those neighborhoods can muster? If a poor neighborhood had gone up in flames, I don’t think you’d see streetlights or parks or even sidewalks there for a long time to come. And uninsured homes would remain empty lots for even longer. Look at what happened in New Orleans’ Ninth Ward.

Or look at Libya. Here’s where the puzzle pieces may seem disconnected at first. But I started thinking about the aftermath of the Springs fire just after it occurred to me that I had not seen any prominent news about Libya in ages. Most Americans had cheered when Libyans began a movement to overthrow their dictator, Muammar Gaddafi. There was scarcely a loud murmur of dissent when U.S. forces started dropping bombs on Libya to speed up the process.

In the American mythic lexicon, the link between fire and Gaddafi is obvious: He was the Great Satan of the day, one in a long line of Great Satans who deserved to spend eternity in the flames. But he had no right to inflict his burning evil on his people. In the U.S. mass media, it was taken for granted that it was a battle of the dictator versus “the Libyan people”; the Libyans who supported Gaddafi were rarely mentioned. And it was taken for granted that, like a forest fire threatening civilized structures, the dictator had to be extinguished immediately, regardless of the cost.

Yet once the fire was out -- once Gaddafi was gone -- Americans rather quickly turned their gaze elsewhere and didn’t seem to look back. It was time for the Libyans to take care of themselves.

By coincidence, the day after I started thinking about all this, Libya did break into the news briefly. It was election day there. The headlines trumpeted the triumph of democracy after years of Gaddafi’s autocracy.

If you read the details, though, it wasn’t a much prettier sight than the aftermath of a fire. The new government is gearing up shakily, in fits and starts, and there was plenty of violence to mark its first elections. The New York Times’s veteran Mideast correspondent David Kirkpatrick chalked it up to tribal rivalries: “Libya has been riven for decades by recurring battles among regions and tribes.” So has the rest of the Arab world, most American readers would silently add.

However, getting democracy going is almost always a messy procedure, sort of like rebuilding a burned-out neighborhood. It took the United States eleven years just to get a Constitution. And that merely set the stage for the 1790s, which many historians see as the decade marked (or marred) by the most vicious political battles in U.S. history.

In any event, news about the vicissitudes and violence of nation-building in Libya didn’t last long. (Just two days after the election, “Libya” couldn’t be found on the home page of either the New York Times or the Washington Post.) I’d bet a bundle that we won’t hear much about Libya again for a long time, except maybe for an occasional mention if there’s some massive violence.

Even large-scale violence in Iraq, which made headlines when American troops were at risk, now gets only a passing glance in our mass media. After all, the fire named Saddam, like the one named Gaddafi, has long been extinguished.

All of this is more than just ancient history because Americans face a similar scenario looming in Syria. When Secretary of State Hillary Clinton took the spotlight at a meeting of the “Friends of Syria,” she merely confirmed the underlying assumption of virtually all American mass media reporting: The battle in the country is simple to understand; an entire nation -- all of its people -- is pitted against a single totalitarian leader. For Americans, the impulse to side with “the people” is understandable, seemingly natural, perhaps inevitable.

But it’s not natural. It’s cultural. Syria today, as most Americans see it, is one more example of a basic pattern of our national culture, summed up succinctly by the prominent historian of American religions, John F. Wilson: “A resolution is repeatedly believed to be at hand to that one special evil which, when overcome, will permit a long-anticipated era to be ushered in” -- an era far better than anything we’ve seen before, an era even, perhaps, of millennial perfection. Just put out the fire and all will be well.

From this perspective, the only question that remains is whether to support “the Syrian people” with or without military violence. Stephen Zunes makes a cogent case that in this situation, as in most situations, the rebels are less likely to suffer and die if they maintain strictly nonviolent tactics. (Abolitionist Charles Whipple made the same case, retrospectively, in 1839 about the American Revolution.)  

As part of his argument, Zunes makes an even more important point:

A fairly large minority of Syrians -- consisting of Alawites, Christians and other minority communities, Baath Party loyalists and government employees, and the crony capitalist class that the regime has nurtured -- still back the regime. ... The regime will only solidify its support in the case of foreign intervention. The Baath Party is organized in virtually every town and neighborhood. ... It has ruled Syria for nearly 50 years. And with an ideology rooted in Arab nationalism, socialism and anti-imperialism, it could mobilize its hundreds of thousands of members to resist the foreign invaders.

So the satisfying simplicity of “the people versus the dictator” is rather fictional here, as it was in Libya and Iraq. We are really looking at another civil war. That doesn’t necessarily mean the U.S. should just stay home and mind its own business, though there is a serious case to be made for that option.

It does mean that Americans should resist the temptation to rush in and treat Bashar Al-Assad as if he were a fire to be extinguished ASAP. Even if a large majority of Syrians would like to see Assad gone, a civil war is a terribly complicated thing, as anyone who has studied American history knows all too well. When you interfere in situations you know virtually nothing about, trying to be the world’s fireman, you are actually playing with -- and probably stoking -- the fire.

When it comes to Syria, a responsible public conversation about U.S. options would take into account as many variables as possible and remember that once the fire is out, the problems are just beginning. Who will do the rebuilding? How? What complexities are likely to arise? How might we be adding to those complexities, even if we have the best of intentions?

I’d hate to have to be called on to answer those questions. The complications are so immense and unpredictable. It’s like trying to model an ecosystem: There aren’t any computers, or any human minds, big enough to handle all the variables. Indeed, that complexity may be the strongest argument for resisting the temptation to take sides and intervene.

But the American cultural tradition makes it unlikely we’ll ever see any of those questions or complications at the forefront of public discussion. Instead, we will probably push on, trying to overthrow the dictator in the name of “the people.” If we succeed, whether through diplomacy or force, we’ll leave the actual people of Syria -- as we left the Libyans and Iraqis (and soon enough the Afghans) -- to do the rebuilding alone. 

If they complain that they’ve been abandoned in their hour of greatest need, most Americans would say, as most of the good neighbors of Colorado Springs would say, “Hey, it’s none of our business. Now you’re on your own. That’s what we mean when we say, ‘It’s a free country.’ That’s what we came here to give you. That’s why we put out the fire.”  

Slavery and “Big Government”: The Emancipation Proclamation’s Lessons 150 Years Later

First Reading of the Emancipation Proclamation of President Lincoln, Francis Bicknell Carpenter, 1864.

One hundred fifty years ago today, on July 13, 1862, Abraham Lincoln went out for a carriage ride with his Secretary of State, William Seward, and his Secretary of the Navy, Gideon Welles. Lincoln told them (as Welles recalled it) that he had “about come to the conclusion that it was a military necessity absolutely essential for the salvation of the Union, that we must free the slaves.” That was the seed of conception for the Emancipation Proclamation, which came to birth five and a half months later, giving Lincoln his greatest legacy: “He freed the slaves.” It’s a story everyone knows.

But it’s not quite accurate. Only the slaves in the Confederate states were emancipated. Citizens of the Union could still own slaves.

The part of the story that portrays Lincoln issuing the Proclamation simply from his deep moral concern about the evils of slavery is rather misleading, too. He had plenty of moral concern. But he stated over and over that his number one goal was winning the war. He said he would be willing to keep every slave enslaved if it would help win the war.

Fortunately, by mid-1862 freeing the slaves seemed to be the best way to win the war. Emancipation would deprive the Confederates of their main source of labor and bring much of that labor into the Union Army, where blacks served in all sorts of ways.

Lincoln knew he was taking a political risk. The vast majority of Northerners, like their president, had gone to war only to save the Union, and now they were shedding blood at an unprecedented and unexpected rate for that cause. But racism was rampant in the North. Would whites fight and die for emancipation?

There was also a huge controversy in the North over whether blacks should be allowed to fight; the specter of armed African Americans terrified many white Northerners almost as much as Southerners. And Northern Democrats played the racist card as their strongest weapon to discredit Lincoln and the Republicans.

So why was Lincoln willing to take the political risk of declaring the Confederacy’s slaves free? The military advantages were certainly the decisive factor. And he had finally given up his fervent hope that the border states would agree to a slow, gradual emancipation plan.

But historians also point to a major shift in public opinion between the war’s beginning and the middle of 1862, which diminished the political risk. Though racism still abounded, there was a large and unexpected growth of sympathy for the slaves that brought with it support for the idea of giving them freedom. Indeed, Congress had already passed the Confiscation Acts, giving Union soldiers the right to free slaves in any Confederate territory that the Northern army controlled.

Why this sudden public desire to free the slaves? Many of these congressmen and their constituents were not only racist but rather conservative by today’s standards. And the surge of evangelical piety unleashed by the Second Great Awakening was still rising. A vast number of Northerners in 1862 would have been quite comfortable with the religious, social, and for the most part political views of today’s “religious right.” Recent scholarship (Richard Carwardine, David Goldfield, and Orville Vernon Burton, among others) sees the religious factor as key to understanding the era.

The Republican Party first coalesced around a demand to keep slavery out of the western territories. Their rallying cry, “free soil, free labor, free men,” had a rather libertarian ring to it. There was little to no enthusiasm for war.

Once the Confederacy seceded, though, the slavery issue itself almost disappeared, as “the salvation of the Union” became the one and only concern. (It is noteworthy that Lincoln used this religious terminology in a private chat with his Cabinet secretaries.) A year later, limiting slavery had returned, in a new form, to join victory at the top of the Republican agenda.

It’s striking to see how quickly and easily nineteenth-century evangelicals (again, mostly conservative by today’s standards) could change their top political issues.

We’ve seen the same thing in the twenty-first century. The day after Election Day, 2004, the pundits credited George W. Bush’s re-election to the power of the religious right and its overwhelming concern for morality and social issues. A closer analysis of the exit polls showed, though, that the real key to Bush’s win was the perception that he could best win the war against terrorism.

In 2004 conservatives showed no fear of “big government.” They wanted a government strong enough to protect them from “terrorists” and “secular humanists,” so they voted enthusiastically for a president who had driven the nation deeply into debt, erasing the surpluses of his Democratic predecessor.

Now, neither social issues nor terrorism rises to the top of conservatives’ list of most important issues. Neither one gets more than about 5 percent in “What is your most important issue” polling, although 35 percent to 40 percent of Americans call themselves conservative. Even among evangelicals, the leading concerns of the day are the supposedly “crushing burden” of federal debt and the need to curb the spending of “big government.”

We could look at any other era of American history and see the same pattern we see in the Civil War era and our own. Conservatives, evangelical and otherwise, are not defined by any single issue. Their favorite issue(s) change with the times, sometimes very rapidly.

Still, their movement does have a unity and continuity that explains its enduring strength. Burton offers an important clue when he describes the national mood in 1856, as the Republican Party was first taking the national stage:

Americans had a strong sense of failure in spite of economic prosperity. Socially, communities seemed troubled, in flux, coming apart. Culturally, all that was American seemed steadily diluted, adulterated, narrowed. Political vision seemed utterly lacking.

Fast forward five years, add the shock of massive death and suffering on the battlefield, with no victory in sight, and one can only imagine how much stronger was the sense of failure, how much more communities seemed troubled. Yet they were no longer in such flux, and America no longer seemed diluted or adulterated. Now all came together around a fixed cause to fight for, to give America meaning: a war against the evil of secession.

Add another year, and slavery joined secession on the list of evils to be exterminated. But the principle remained the same: When conservatives are plagued with those disturbing feelings, their healing balm is to divide the world into a simple dichotomy of good against evil and to join the forces of good in a war -- social, political, and military if need be -- against the evil.

Now fast forward 156 years, and we must subtract the economic prosperity of 1856. But the rest of Burton’s description is an almost exact summary of how most conservatives feel, especially the evangelicals among them. And the loss of confidence in the economy surely heightens all the other anxieties he identifies.

Now, as then, the antidote to anxiety is to find an evil to resist. Now, as then, the name of the evil is a secondary matter. The heart of the matter is a simple truth: A world starkly divided between good and evil is a world that has a firm, clearly defined structure. Psychological studies show that conservatism arises from a desire for structure, for a controlled world that offers the security of certainty.

As long as the moral boundary line seems immutable, people who are comforted by structure feel far less troubled. Their world no longer seems in flux or coming apart. As long as they can place America squarely on the side of good, they no longer have to worry about their nation seeming diluted or adulterated.

Any issue that lets conservatives draw an absolute, patriotic dividing line will do the job and inspire their passion, at least for a while. That’s why the causes they fight against can change so readily.

Unfortunately, when the world is morally divided like that, someone on the “wrong” side usually suffers. In today’s political climate, the millions who depend on government funds for their very survival are the potential victims of the conservative crusade. (Just take a look at this one sad story, out of dozens that appear every day.) In the nineteenth century, the Emancipation Proclamation was produced by the same crusading reform spirit that demanded the repression of sexuality, the prohibition of alcohol and gambling, and other such routes to “purity.” Yet in this case it produced an indisputable moral good.   

Just as the nineteenth-century spirit of reform produced mixed results, so the mythic tale that the Emancipation Proclamation spawned has carried mixed results. As Eric Foner has written:

The sheer drama of emancipation fused nationalism, morality, and the language of freedom in an entirely new combination. ... It crystallized a new identification between the ideal of liberty and a nation-state whose powers increased enormously as the war progressed. ...  Henceforth, freedom would follow the American flag. As Frederick Douglass proclaimed, “The cause of the slaves and the cause of the country” had become one.

As conservatives showed in 2004 -- and indeed, as they have shown since the 1940s -- they, too, want an enormously powerful nation-state, as long as it wields its powers only to fight foreign enemies and domestic moral evils, as conservatives define them. The debate in this election year is about the economic meaning of the freedom that follows the flag.

But conservatives also tend to want their government to wield its power quickly and decisively. When good goes up against evil, they see a battle of absolutes. So there’s no time for reflecting on subtle shades of gray. 

That’s the way the story of Lincoln and the Emancipation Proclamation has been told: One day, Lincoln just decided that slavery was wrong, so he freed the slaves, each and every one, with a single stroke of the governmental pen. This myth goes hand-in-hand with another myth born out of the Civil War: When American soldiers carry the flag into battle, they must win absolute victory by totally destroying the enemy -- and the quicker the better. Like the final battle of Christ against Satan in the Apocalypse (The Book of Revelation), victory should ideally be instantaneous.

This was apparently the strategic vision of General Ulysses S. Grant. Once he took command of the Union forces and achieved victory, it became what Russell Weigley has called “the American way of war,” showing its implications most clearly in Sherman’s march through Georgia.

Since the Civil War, total, instantaneous freedom -- whether by decree or by force of arms -- has been a powerful American myth. Today’s conservatives seem more likely than liberals to build their politics upon it. We shouldn’t discount its impact on liberals, too. But those who demand a clear-cut, immutable structure for their nation and their world are more inclined to be conservative, and they’re more inclined to try to put this apocalyptic myth into political practice.   

Nevertheless, it’s important to remember that, in 1862, politicized evangelicals pushed a hesitant president to do what was so obviously the morally right thing, even if he did it only as a way to win the war. And Lincoln’s decision transformed the very idea of America.

Today, too, there is a small but growing minority of evangelicals who would move public policy in progressive directions. Some are deeply concerned about the environment. A smaller number of white evangelicals -- and lots of evangelicals of color -- are deeply concerned about peace and social justice.

Progressives have a dangerous tendency to stereotype all evangelicals as conservative and reactionary. It would be better strategy to engage them in conversation and treat disagreements somewhat diplomatically. No one knows what issue might engage evangelicals’ passions next year. 

The War on Drugs and the Wars on Mexico

Mexican troops in a gun battle, 2007. Credit: Wikipedia

There’s a growing debate among policymakers about how to wage the war on drugs, the New York Times reports. No one doubts that the war must be waged, apparently. At least the article, in the nation’s most influential news source, doesn’t hint at any doubts. Like any mythic truth, the need to keep the war going is simply taken for granted.

But how should we fight it? That’s the question now, it seems. In one corner is the traditional “stop the flow from abroad” approach. In the other corner, a new and rising view:

The money now used for interdiction could be better spent building up the institutions -- especially courts and prosecutors’ offices -- that would lead to long-term stability in Mexico and elsewhere. ... Since 2010, programs for building the rule of law and stronger communities have become the largest items in the State Department’s antidrug budget, with the bulk of the money assigned to Mexico. That amounts to a reversal from 2008 and 2009, when 70 percent was allocated to border security and heavy equipment like helicopters. ... American officials say they are now focused on training Mexican prison guards, prosecutors and judges.

In sum, U.S. drug policy is rather confused, one expert reports: “Some U.S. officials favor building institutions; others think it’s hopeless.”

It’s a classic case of head-to-head competition between the two great mythologies that have vied for dominance throughout American history. The mythology of homeland insecurity focuses on keeping us safe from dangerous forces trying to pierce through our borders. The mythology of hope and change urges us to go out to the frontier and tame those dangerous forces by bringing them the gift of civilization, which we as Americans are uniquely suited -- some say obligated -- to bestow.

Those officials who favor strengthening the rule of law in Mexico by building institutions like courts and prosecutors are acting out the latter myth with splendid clarity. It seems obvious to them that the Mexicans simply can’t figure out on their own how to run a legal system. They’ve got to learn it. And who better to teach them than their northern neighbors?

U.S. officials have been trying to teach the Mexicans that kind of lesson for a long time. Most famously, Woodrow Wilson sent troops south of the border in 1914 to remove the Mexican president, Victoriano Huerta -- an “ape-like” man, his counterpart in Washington declared. But as Wilson explained his motives to the British ambassador to Washington, his personal dislike for Huerta wasn’t really the issue. It was mainly about Wilson’s fear that the kind of nationalist revolution which brought Huerta to power might break out in Central American nations too, places much closer to the all-important Panama Canal.

The specter of nationalists seizing control of the Canal was intolerable. So, Wilson said, those Central Americans had to have “fairly decent rulers”; that is, rulers decently disposed to support policies favoring U.S. interests. Wilson sent troops to depose Huerta in order “to teach those countries a lesson”: They had to learn to “elect good men.”  

That’s not to suggest history is merely repeating itself. There’s no indication that U.S. officials are contemplating a military invasion to achieve their institution-building goals. All their means are peaceable. It’s only the interdiction fans who want to use the military -- at least so far. And if the Times has it right, the pendulum is swinging toward the advocates of peaceful means. Some would say this is real progress.

The progress looks even more significant if we expand the historical perspective back to the U.S. war with Mexico, 1846-48. Though there were strong and loud opponents of the conflict that President Polk intentionally provoked, the dominant mood of the country was pro-war. The big question that most Americans debated was: How much of our defeated southern neighbor should we annex? Some clamored for “all Mexico.” We were “pioneers of civilization,” as a prominent historian of the era, William Prescott, put it. We could regenerate the Mexican people by making them Americans.    

But Prescott himself argued the other side: “The Spanish blood will not mix well with the Yankee.” Indeed, said Andrew Jackson Donelson (nephew of the president for whom he was named), “We can no more amalgamate with her people than with negroes.” The racist fear of white Americans inter-breeding with Mexicans, contaminating American blood, bringing Americans down to the degraded level of Mexicans, was one big reason -- some historians say the biggest reason -- that the U.S. took only the northern part of Mexico (where relatively few Mexicans lived).

We rarely hear such overt anti-Mexican racism from elite voices today. And of course neither overt annexation nor military invasion is ever discussed. No doubt that’s progress.

Yet there are striking continuities. Wilson worried most about the Panama Canal, which was making investment in foreign trade so much safer, more profitable, and thus more attractive than ever before. There’s a similar motive at work in the move toward “institution-building” today: “We see crime as the leading threat in some countries to economic growth and the leading threat to democracy,” Mark Feierstein told the New York Times. He’s the assistant administrator for Latin America and the Caribbean in the U.S. Agency for International Development.

Note that democracy gets second place in this short list of priorities. Economic growth is number one. If we can teach the Mexicans to elect (or appoint) good judges and prosecutors, foreign capital will be safer. Again, the basic assumption is that the Mexicans will never figure it out on their own. They’ve got to be carefully taught.

But Americans have had to be carefully taught too -- taught to assume the inferiority of even the most elite Mexicans. Behind that teaching lies another lesson, memorably phrased by Oscar Hammerstein II in South Pacific:

You've got to be taught to be afraid
Of people whose eyes are oddly made,
And people whose skin is a diff'rent shade,
You've got to be carefully taught.

Here the two great mythologies meet. Even if today’s “pioneers of civilization” are mainly concerned about making Mexico safe for foreign investment (and no one can know their true motives), they can’t come out and say so. The American public wouldn’t pay for that. “Hope and change” is hard enough to sell even here at home.

The best way to build public support for this new policy direction is to play subtly on two deeply-rooted strands of mythic America: the continuing sense of superiority that so many white Americans feel when they look southward, and the fear that crime, so often imagined in American discourse as a physical plague, is epidemic in Mexico and constantly threatens to spread northward across our border. “Homeland security” is still what sells best.  

P.S.: Just after I posted this piece I noticed that the NY Times website had posted, very prominently, an article about the very widespread use of bottled water in Mexico, because of the perception (perhaps inaccurate, the article notes) that the water throughout Mexico is contaminated. It's probably just coincidence that this article appears right next to the "drug war" piece on the Times's site. But in the realm of mythic language and thought, everything gets connected, whether it's logical to do so or not.  

Obama’s Killing Problem -- Or Ours?

Predator drone firing a missile. Credit: Wikipedia

Esquire magazine has just brought out their new Fall Fashion Preview issue, and they somehow thought it made sense to include an article on “Obama’s Killing Problem.” You can see that title featured on the cover, just above “29 Reasons to Watch the Olympics.”

Obama’s problem, if I understand author Tom Junod right, is that by relying so heavily on drones meting out targeted assassinations, he’s changing the face of war in ways neither he nor anyone else can predict. Whatever the U.S. does, the rest of the world is bound to follow, and those drones will probably some day come back to haunt us.

This was also a major theme in American media in the first days after the bombing of Hiroshima and Nagasaki: We are now vulnerable, too. It was true then, and it’s true now.

But is this, as Junod claims, Obama’s problem? (He underscores the claim by writing his whole long article as a letter addressed to the president.) If John McCain were president, would he be less likely to use the drone technology that the Pentagon has made available to any resident of the White House? Or is any president almost inevitably seduced by the technological imperative?

The best historical accounts I’ve read of the summer of 1945 suggest that neither Harry Truman nor most of those in his inner circle ever seriously considered not using the Bomb. It was so “technically sweet,” as Robert Oppenheimer famously said. Its lure was irresistible. Why should drones be any different?

There’s certainly more to Obama’s reliance on drones than the siren call of technology. It’s just possible that he hates to kill anyone but has made a cool political calculation: The only way to preserve health care reform and have any chance of promoting his other domestic policies is to neutralize the predictable attacks from the right that he’s “soft” on national security. And he's got to give the Pentagon something in return for ordering so many troops out of Iraq and Afghanistan.

It seems more likely, though, that he’s driven as much or more by philosophy than by politics. By his own account he’s been, since his college days, a devoted fan of the works of Reinhold Niebuhr. He even used his Nobel prize speech -- the PEACE prize speech -- to give the world a lesson in the same watered-down version of Niebuhr 101 that shaped U.S. policy in the days of Truman, Dean Acheson, and George Kennan: Evil is out there, waiting to destroy us unless we destroy it first. And since we’re all sinners, it’s inevitable that even the best of us will fight evil with some of evil’s own means.

This doesn’t absolve Obama of responsibility for the lethal choices he has made. He has blood on his hands. As a Niebuhrian he had to expect that from the day he first stepped into the Oval Office.

But it does raise some disturbing questions: Can we, the people, expect anything different as long as we remain so enamored with “cool” technology, so enmeshed in Niebuhrian assumptions about evil and original sin, and so ready to fuse the two in our conversations about America’s role in the world? If we allow that cultural pattern to remain dominant, shouldn’t we expect whomever we choose as president to embrace it too, and act upon it? Isn’t that the way democracy is supposed to work?

These questions, and others that Junod’s article raises, are a huge can of worms. I just wanted to lift the lid a tiny bit and take a quick peek inside. No doubt I’ll be looking more deeply into it in the future. For now, it’s enough to note that Obama’s drone-driven policies bring together two old and familiar strands of American mythology, strands that we are likely to find wrapped around any American president. If those policies are problematic -- and I’d say that’s an understatement -- then it’s our problem as much as, or more than, Obama’s. 

Fantasy of Absolute Safety is Killing Us, Even in Movie Theaters

A group of handguns. Credit: Wikipedia

My son was spending the night in Aurora, Colorado, when all hell broke loose just a few miles away. He wasn’t in the Century 16 theater. But he might have been; he loves those opening nights. And there wasn’t a thing I could do to protect him.

I’m a professor at the University of Colorado (though not on the campus where James Holmes studied). I’ve surely had quiet students who were deeply troubled but, like Holmes, drew no attention to themselves. So there wasn’t a thing I could do to help protect them. 

I’m active in a community organization trying to improve Colorado’s abysmal mental health services: The state ranks dead-last in per capita psychiatric hospital beds, and services of every kind have suffered drastic budget cuts. I’ve had more than one family member who needed help badly and got that help only after a long wait, persistent struggle, and nasty fights with insurance companies. Lots of folks don’t have decent insurance, or anyone to fight the woefully inadequate system on their behalf.

The shooting at that movie theater in Aurora hit me personally on all these levels. It made me realize how little I, or anyone, can do to prevent such mass violence.

That’s the bottom line of the national gasp of horror. “This was supposed to be a safe space,” as Monica Hesse wrote in the Washington Post. But now it feels like, “no space is safe; maybe that’s what’s shocking.” Surely that’s what’s shocking, I’d say. But a moment’s reflection on my experience as a parent, a teacher, and a community activist tells me it’s true. We can never make our public and private spaces absolutely safe.

Yet we can make them safer.

From the first reports, it seems there’s nothing the mental health system or the University could have done to spot Holmes as a troubled young man. No one really noticed him. He was so “ordinary.” That’s what the neighbors typically say in these cases, and it’s usually true. Even when the signs of trouble are there, it’s extraordinarily hard to change a person who is so deeply disturbed.

It would be much easier to change their access to weapons of mass killing. At least easier ideally, in principle. But not in political reality, it seems. The problem isn’t just the clout of the National Rifle Association, which is real but probably overrated. The bigger problem is that so many Americans are paralyzed on the gun issue, caught in a crossfire of competing cultural traditions, beliefs, and symbols that make it very difficult to mobilize the public in any clear direction.

Just look at the numbers:

Gallup tells us that the number of Americans favoring stricter gun laws has fallen by nearly half in the last half century. That shocking statistic reflects the long post-‘60s rightward shift in the national mood. “Gun control” is widely seen as an idea by and for liberals. By now less than a quarter of us will wear that badge. To the rest of America, liberals look more or less dangerous because they are "soft" on keeping us safe from enemies, foreign and domestic. It’s impressive that even 43 percent of us would support the liberal cause of “gun control.”

However, the number who want gun laws eased has risen dramatically since 1990: from 2 to 11 percent. Yes, only 11 percent of us want less strict regulation of guns. And support for specific gun control measures -- waiting periods and background checks for gun buyers (even at gun shows), banning assault weapons, registering all guns with local government -- remains very high. A slim majority even support limits on the number of guns a person can own. (Most gun owners have several, and most mass killers are caught holding many guns.)

So here’s the real political problem: Ask people about specific, common-sense gun control measures and they strongly approve. Ask them about “gun control” in the abstract, and a growing majority says no, though almost half say yes. We, the people as a whole, want controls but we don’t want them. When nations, like individuals, try to go in two directions at once they get paralyzed. That’s where we are on the politics of gun control.

The roots of our paralysis run very deep in our cultural history, where traditions about guns are equally ambivalent. It looks like Americans have a love affair with guns. “America's gun ownership rates are vastly higher than that of other wealthy countries,” according to economist Howard Steven Friedman. “Only one OECD country has a rate that is even half as much of America's gun ownership rate.” The Gallup survey found gun ownership dropping just slightly in recent decades.

But far fewer than half of American homes have guns. The General Social Survey of the National Opinion Research Center found a steady and sizeable decline over the last 35 years in the number of households with guns. That decline showed up in every age group and was especially sharp in recent years among people under 30. 

Those numbers reflect a contradiction as old as the nation itself. On the one hand, we’ve got a tradition going back to colonial times that says: If you want to be safe, get a gun; if you want to be absolutely safe, get a lot of guns. That’s why Americans once built forts and stockades and included the right to well-regulated militias in the Constitution.

Since World War II, we’ve made our quest for absolute safety our number one national priority by far, under the banner of “national security.” That’s why we built a nuclear “shield” of tens of thousands of bombs that can each destroy a whole city. It’s also why we have a military nearly as big as all the rest of the world’s militaries combined. Now we call it “homeland security,” and we’ve enshrined it as our sacred national myth.

And that’s why, with the eager help of the military-industrial complex, we are awash in a sea of military weapons -- a sea that on tragic occasions turns to blood in our own homeland.

Yet we also have another tradition as old as the nation itself, inscribed in the very first words of our Constitution: to provide for the common defense, which most of us now take to mean absolute safety. The longing for absolute safety is certainly as strong, and probably stronger, among conservatives as it is among liberals. Across the political spectrum most of us want stricter specific gun control laws, which we expect will keep guns out of the hands of “evildoers” at home just as we hunt down and annihilate the “evildoers” abroad.

So we’re caught in a crossfire of competing cultural traditions and beliefs that make it very difficult to mobilize the public in any clear direction when it comes to guns. Paralyzed by our ambivalence, we can’t mobilize for political change. So we make it easy for anyone to get weapons of mass slaughter.

The result is a growing fear that no space is safe any more, that at any moment our longing for absolute safety could be shot to pieces. Fear is even more paralyzing than ambivalence. When Americans do manage to act on their fear, their most common response is to chase the fantasy of safety by getting another gun, or at least allowing others to get more guns. Fear will override common sense almost every time.

In the movies we see the most fantastic military-style weapons deal out measureless blood and gore. Audiences applaud it all, because they trust that the good guys on the screen will end up with their absolute safety restored. Unfortunately, it doesn’t work that way in real life -- not even in movie theaters.

The root of the problem is our dedication to the fantasy of absolute safety and security. The sooner we recognize that as our national fantasy and stop arming ourselves to the teeth in pursuit of it, the safer we all will be.

Toppling Dictators Can Be Dangerous, Even If You Are Exceptional

Billboard of Bashar al-Assad in Damascus, 2006. Credit: Wikipedia

I can’t resist a brief comment on the debate about American exceptionalism that’s now the headline offering on History News Network. If you’re interested in the mythic dimension of American political life, exceptionalism is bound to be an important topic. Even if you just casually peruse the day’s news you are likely to meet it in one guise or another.

Today you can meet it in its purest form, with no guise at all, in Mitt Romney’s speech to the VFW:  “Our country is the greatest force for good the world has ever known. ... Throughout history our power has brought justice where there was tyranny, peace where there was conflict, and hope where there was affliction and despair. ... Our influence is needed as much now as ever.”

Romney’s words are likely to find receptive ears across America, especially among those who hear the news of Syrian warplanes dropping bombs on Syrians in Aleppo, the nation’s largest city.  It’s getting mighty grim over there. It’s hard to resist the feeling that someone really ought to stop that slaughter, by any means necessary. It will be a nasty job, but if we don’t bring justice where there is now tyranny, who will?

That’s exactly what they’re starting to say in the White House, too, according to the New York Times. “We’re looking at the controlled demolition of the Assad regime,” as a Syria expert at a Washington think tank put it. But the Times thought it helpful to add that expert’s word of warning: “Like any controlled demolition, anything can go wrong.”

If you are wondering what might go wrong when America wields its power, the Washington Post offered one answer just a day after that Times story appeared, in a story headlined, “In Syria, U.S. Intelligence Gaps.” The United States, it seems, “is struggling to develop a clear understanding of opposition forces inside the country, according to U.S. officials who said that intelligence gaps have impeded efforts to support the ouster of Syrian President Bashar al-Assad.” U.S. spy agencies “are still largely confined to monitoring intercepted communications and observing the conflict from a distance, officials said.”

“The lack of intelligence,” the WaPo explained, “has complicated the Obama administration’s ability to navigate a crisis that presents an opportunity to remove a longtime U.S. adversary but carries the risk of bolstering insurgents sympathetic to al-Qaeda or militant Islam.”

Complicated, indeed. Supposedly controlled demolitions often turn out to be a lot less controlled than the controllers expect, especially when they are relatively clueless about what’s really going on. As the man said, anything can go wrong -- no matter how exceptional the demolition crew may be. 

Ignoring the Homeless is Un-American

Homeless man in The Bowery, Manhattan. The advertisement is for luxury condos.

The other day I read that homelessness in my small, typical, middle American city has jumped 39 percent in just one year. Still reeling from the shock, I wondered, “What would Thomas Jefferson say?” Jefferson spent long hours worrying whether the fledgling United States -- the first country based on the right to life, liberty, and the pursuit of happiness -- would survive. He wasn’t sure that ordinary people would be willing to contribute their hard work and tax dollars to support the common good and needs of the nation.

Jefferson’s solution was that everyone should have a home of their own. Homeowners would be the most responsible, civic-minded citizens. They would realize that their own place would prosper best if the whole community prospered.

That’s a fundamental pillar of the myth of Jeffersonian democracy, which has been such a central thread in American life. In the nineteenth century the federal government gave away homesteads to make sure that the frontier would be settled by responsible homeowners committed to building strong communities. We get tax deductions for mortgages because the government still wants to promote home ownership.

Of course plenty of us can’t afford to buy homes. When government provides subsidies for renters, the principle is the same: People can fulfill their personal potential, and thus contribute most to the community, if they have a place of their own.

That’s not to say the homeless don’t contribute. Over 40 percent of homeless adults have jobs, many of them full-time. Homeless people take care of their communities, whether in shelters or on the streets. But so much human potential is wasted when people must spend a good part of each day figuring out how they’ll get through the night -- and get through it safely, if they are lucky.

In the current recession, as homelessness skyrockets, the resources of government to help the homeless wane. Housing subsidies are much harder to come by, though homeowners still get generous tax deductions on their mortgage interest. Our society divides further between haves and have-nots -- precisely the danger Jefferson foresaw if everyone did not have a place of their own to live.

There’s another great American myth: “We shall be as a city on a hill. The eyes of all people are upon us.” Ronald Reagan loved that line. But he left out the rest of what John Winthrop told the first Puritans headed for America: “We must be knit together in this work as one man. We must be willing to abridge ourselves of our superfluities, for the supply of others’ necessities.” That doesn’t sound like the kind of society that would let homelessness rise 39 percent in a single year.

The Rev. Dr. Martin Luther King gave new life to Winthrop’s vision when he proclaimed: “We are caught in an inescapable network of mutuality, tied in a single garment of destiny. Whatever affects one directly, affects all indirectly.”

King did not say we “should” be tied together. He was not telling us how we should live our lives. He was telling us what the world is like. We are in fact tied together, the richest and the poorest, those who live in mansions and those who live on the streets. Whatever happens to the homeless affects all of us, even if it’s so indirect that the homeless remain largely invisible to most of us.

Jefferson understood what John Winthrop and Martin Luther King were talking about. He understood that democracy means we are, in fact, all tied together in the single garment of our community’s destiny. We must all contribute. Most of the homeless already contribute what they can. But they could give so much more if they had the stability that a home provides.

Sadly, many homeless can’t contribute a lot because of mental and physical disabilities that go untreated. Here, too, our society has failed to recognize that we’re all in this together, that treating those who need help is a crucial way to help all of us live better lives.

Responding to the growing epidemic of homelessness is not simply an act of charity. It’s an essential way to live out our most traditional myths of what it means to be a good American. To ignore the homeless, and the skyrocketing rate of homelessness, is more than a moral lapse. It’s positively un-American.

The Olympics: New Model for Foreign Policy?

Amid all the extravagant hoopla of the Olympics’ opening ceremony in London, one striking contrast caught my eye. Though some of the athletes parading into the stadium looked like they had come for a great party, many were obviously taking the occasion very seriously -- especially those from smaller and “less developed” nations. You could tell that being in the Olympics was the greatest occasion of their lives. But even those who looked most dignified were often smiling. Their big broad smiles made these beautiful young people look absolutely radiant. Even if they weren’t smiling, you could see the obvious pride and pure joy bursting out of them.

Contrast that with the uniformed soldiers who were conscripted to perform in the ceremony. Maybe they were just as proud, perhaps even just as joyful. But their appointed role and the ethos of military culture combined to prevent them from showing it, or showing any emotion at all. They kept their faces stiff and blank, while they moved like life-sized robots.

There’s nothing surprising in that rigid demeanor. It’s what we expect from military personnel performing military jobs. After all, the people in uniform are obligated, above all, to follow orders. They have largely given up their individual personalities to become merely extensions of the state apparatus that gives the commands.

Of course the Olympic athletes are extensions of their home nations and their governments too, most obviously in the opening parade of nations. The athletes are at least as highly trained as the soldiers, at least as disciplined in perfecting their skills, and at least as dedicated to winning for their nations -- on all three counts, probably even more so.

But athletes don’t represent their nations by fighting and killing. They represent their nations by playing games. Can anyone imagine Queen Elizabeth saying, “I declare open these Olympic wars”?

The media do often tell us that the athletes are “battling” for the gold medal and, more often, that countries are “battling” for the lead in the medal count. Once the games begin, the competition and desire to win can fairly be called “deadly” serious.  But everyone knows that those are just metaphors drawn from the realm of war, where the words are meant quite literally.

It doesn’t work the other way around, though. No one ever says, even metaphorically, that U.S. troops in Afghanistan are “playing” war. No one calls the war a “game.” (There are “war games.” But everyone knows that they are just pretend, that no one risks getting killed, except in the rare accident.)

So we have two starkly contrasting models of international competition, creating two equally different images of patriotic service to the homeland: the robotic soldiers fighting deadly wars and the joyous athletes playing games.

The two have not always been so different, however. In medieval jousting tournaments (as in bullfights even more recently), men did get killed playing athletic games. Conversely, when those medieval knights went out to war, they were in a real sense playing a game. It was much like what we call a sporting event. The same has been true of warfare in many other cultures around the world.

When war was play, the contestants were all expected to observe elaborate rules and codes of conduct. War, like sports, was highly ritualized. And though the results were often gruesomely horrific, the death tolls were surprisingly small compared to what we are accustomed to now.

That’s because (according to one theory, at least) the goal was to display superiority, but not to destroy the enemy completely. In fact it was necessary to let most of the enemy forces survive so that both sides could return to the playing field to renew the game another day. It was the process, not the outcome, that mattered most.

We still have an echo of that medieval heritage in the word “sportsmanlike,” defined in one dictionary as “qualities highly regarded in sport, such as fairness, generosity, observance of the rules, and good humour when losing”; in another dictionary the qualities are “fairness, courtesy, good temper, etc.”  And we still occasionally hear that old, once popular, saying: “It’s not whether you win or lose, but how you play the game.”

Of course now we’re more likely to hear the much more popular saying, “Winning isn’t everything. It’s the only thing.” The widespread currency of that saying is a sign of how much we’ve turned our athletics into war and, more importantly, given up the old idea of war as sport.

How that transformation of war occurred is a very complicated story, still debated by historians of warfare. There’s general agreement that the Napoleonic wars were a decisive turning point. For the first time, war was waged not by small professional armies but by huge conscripted forces. Entire nations were mobilized to support those forces. Every citizen was encouraged to see him- or herself as part of the war effort.

The result was a sense that the nation’s very existence depended on victory. Governments eagerly promoted that view: If all citizens saw the prospect of defeat as a prospect of annihilation, they were more likely to sacrifice all to support the war. If annihilation was the only alternative to victory, the logical response was to try to annihilate the enemy. Thus war became a zero-sum game, which meant, in effect, no longer a game at all.

The old idea of war as a game -- where the rules of honorable conduct were as important as, perhaps more important than, victory -- lasted longest among the most elite military leaders. One sign of the death of this ethos at the highest level came in 1945, on the day Germany admitted defeat in the European war. The Allied commander, Dwight Eisenhower, refused to shake the hand of the surrendering German commander, a ritual that military tradition had always required.

In his war memoir, Crusade in Europe (1948), Eisenhower explained both that refusal and the book’s title in the same sentence: “Because only by the utter destruction of the Axis was a decent world possible, the war became for me a crusade in the traditional sense of that often misused word.” The representative of pure good could acknowledge no hint of comradeship with the representative of pure evil, Ike implied, because that would imply some kind of equality, as if both were players in the same game.

There is good reason to believe that Eisenhower was offering an ex post facto explanation, trying to prove his credentials as a crusading cold warrior. During the war he had given little indication of concern about, or even understanding of, fascism as a political system or ideology. His memoir was published in the same year that official Washington fully committed itself to mobilizing the nation in an anti-communist crusade, a crusade made all the more convincing by conflating Nazis and communists in an image of “red fascism.” It was the cold war, even more than World War II, that banished the last trace of sportsmanlike conduct in war.

The Olympic Games have such broad appeal, I suspect, in part because they offer a rare chance to regain at least a glimpse of that old-fashioned idea of international competition as a sporting event. The Olympic Games also have broad appeal because we watch athletes doing things we might imagine, but can scarcely believe are possible in reality.  

Danny Boyle’s production for the opening ceremony was certainly an extravagant indulgence in pure fantasy. So it got my imagination going in some pretty extreme ways. I imagined for a while what war would be like if we went back to the medieval tradition of battle as ritual contest.

Then I took a bigger leap and imagined what U.S. foreign policy would be like if we thought of it in the same way. We would compete earnestly with other nations for wealth, power, and influence over world events. Policy debates and decisions would still be deadly serious. But we would not think, even for a moment, of destroying the nations and groups we were competing with. We would understand that the whole point is to keep the game going.

So the most urgent question of foreign policy would not be “Who won?”, but “How did we play the game?” The most important goal would be to make sure that we acted in the international arena with fairness, courtesy, observance of the rules, generosity, and good humour when losing.

And if we lost one round in the contest, we would shake hands with our opponents, congratulating them on a contest well played. Then we would go back to the drawing board, figure out how to do better next time, and prepare to play better another day.

My fantasy is not totally beyond the realm of possibility. It is, in some respects, how kings conducted their foreign policy in days of old, when knights were bold. Since the Olympic Games inspire us all by showing people achieving feats we never dreamed possible, why not dream of a new mythology for foreign policy: picking up where those kings left off but going even further, making foreign policy in every respect a serious, strenuous, but ultimately playful game?

New York Times: “All the Fear That Fits, We Print”

I know, the Times’ motto is really, “All the News That’s Fit to Print.” No scandal, no sentiment, no sensationalism. Just straight, sober, boring facts. That’s why they call it “The Gray Lady,” right? Well, the times they are apparently changin’. “The Gray Lady” ended the month of July with two lurid lead stories in a row, stoking fear in vivid technicolor.

July 30 top headline: “Jihadists Taking A Growing Role In Syrian Revolt.” The lead: “Syrians involved in the armed struggle say it is becoming more radicalized: homegrown Muslim jihadists, as well as small groups of fighters from Al Qaeda, are taking a more prominent role.”

July 31 top headline: “Militant Group Poses Risk of U.S. - Pakistan Rupture.” The lead: “Grinning for the camera, the suicide bomber fondly patted his truckload of explosives. ‘We will defeat these crusader pigs as they have invaded our land,’ he declared. ... The camera followed the truck to an American base in southern Afghanistan, where it exploded with a tangerine dust-framed fireball.”

Even the well-educated, well-to-do movers and shakers who make up a significant share of the Times’ readership (which is why it’s so influential) might have found their worst fears for our national security confirmed. With hearts pounding and adrenalin pumping, many probably didn’t bother to read the stories in full, where they would find some facts to confirm their fears but others to calm them. The truth, it seems, is more complicated than the scary headlines.

Symbols of jihadi and Salafi (strict traditionalist) Islam are growing among Syrians fighting to overthrow their government. But “both fighters and analysts said not all the jihadist symbols could be taken at face value. ... there tends to be more Salafi guys in the way the groups portray themselves than in the groups on the ground.” Why? The fighters desperately need money, and most of it now comes “from religious donors in Saudi Arabia, Qatar and elsewhere in the Persian Gulf region whose generosity hinges on Salafi teaching.”

The old image of a single well-organized monster called “Al Qaeda,” sending its cadres around the world, has long been debunked, even by the Congressional Research Service. In fact, says the Times article, “there is, as yet, no significant presence of foreign combatants of any stripe in Syria,” and “not all foreign fighters are jihadists, either.” But the story’s first paragraph, which is where many readers stop, reinforces the false, outdated image of a unified bloc of armed “Muslim radicals” spreading its tentacles ever closer to American shores.

The other story, from Pakistan, identifies the suicide bomber as a member of the Haqqani network. U.S. mass media have been focusing on that group for many months as the mainstream Taliban were moving toward negotiations, which removed them from the list of suitable monsters to feature in the headlines. Someone in Afghanistan has to fill that bill, as long as we have soldiers fighting there.

Haqqani is the top candidate. “Inside the [Obama] administration,” the Times reports, “it is a commonly held view that the United States is ‘one major [Haqqani] attack’ away from unilateral action against Pakistan -- diplomatically or perhaps even militarily.”  

But just as Times readers were absorbing this frightening picture of Pakistan as a hot-bed of anti-American instability, U.S. and Pakistani diplomats were signing an agreement that will allow NATO convoys to move across Pakistan to Afghanistan at least until the end of 2015 and maybe longer. “The pact seems to close, for now, one of the most contentious chapters in the long-turbulent relationship between Washington and Islamabad,” the Washington Post reported. Perhaps the Times was exaggerating the danger to U.S. - Pakistan relations just a bit.

And exaggerating the danger of the Haqqani, too. That group has claimed credit for a few high-profile attacks in Afghanistan, leading both the House and the Senate to pass bills that urge Secretary of State Clinton to designate the Haqqani network a “foreign terrorist organization.” But “the headlines created by such violence are disproportionate to their military significance,” as the Times article itself notes; “Haqqani operations account for one-tenth of the attacks” on NATO troops, “and perhaps 15 percent of casualties.”

Is the Times shaping its headlines and lead paragraphs in such unnecessarily fear-mongering ways to influence policy? When it exaggerates the Haqqani threat -- and adds that Pakistan’s Inter-Services Intelligence agency “is covertly aiding the insurgents,” according to unnamed “American officials” -- is it encouraging a tougher administration stance against the ISI, just as that agency’s new head arrives for talks in Washington? Is it trying to push the administration to give more money to the Syrian fighters, so that they won’t have to turn to conservative Muslims for their funding?

Or is the Times just trying to gain more readers? Most people turn to the news less for factual truth and more for emotionally satisfying mythic narratives. Myths are not simply lies. They are, typically, a complex blend of fiction and truth, with the fiction taking the lead and bending the empirical facts to fit the story. In other words, myths do just what these New York Times front-page stories do.

And myths drive policy. The “war on terror” industry has been able to keep its massive federal funding, while other programs are drastically cut, because the myth of homeland insecurity has taken such deep root in American political culture. The New York Times is doing its part, at least this week, to keep that myth alive. 

[PS: I've got a new article up on Alternet, Mitt Romney Disses Palestinians, Mexicans -- Who's Next? Please take a look.]

Pentagon Planning for War With China

You usually have to dig a bit to find the mythic dimension in political discourse. But sometimes it is right there on the surface, staring you in the face.

Latest example: A Washington Post report on “Air Sea Battle,” a Pentagon plan for war with China. They’ve gamed it all out, it seems, and, I’m happy to report, we win!  Here’s how it goes:

The war games are set 20 years in the future and cast China as a hegemonic and aggressive enemy. Guided anti-ship missiles sink U.S. aircraft carriers and other surface ships. Simultaneous Chinese strikes destroy American air bases, making it impossible for the U.S. military to launch its fighter jets. The outnumbered American force fights back … Stealthy American bombers and submarines would knock out China’s long-range surveillance radar and precision missile systems located deep inside the country. The initial “blinding campaign” would be followed by a larger air and naval assault.

This reminds me of the nuclear war scenarios that were popular in the late 1940s and early 1950s. My favorite was a cartoon spread in Life Magazine that sketched out an American-Soviet war. Of course the Soviets start it. Rockets fly and there is mass devastation on both sides. The last cartoon shows Manhattan reduced to rubble, except for the 42nd Street Library’s guardian lions, which remain standing in all their nobility. Underneath, the unexplained (and inexplicable) caption reads simply: “The United States wins.”

Of course that was for public consumption, to whip up cold war fervor among the masses. But similarly fantastic story lines were used in the Pentagon’s secret “games” back then, too. President Dwight Eisenhower ordered plans for winning a nuclear war -- and for (in his words) “digging ourselves out of the ashes” after it was over.  

In 1958, as the size of the nuclear arsenals and the estimates of casualties spiraled beyond imagining, Ike demanded “a basis for further planning which is in the range of something reasonable… manageable or useable.”

Notes from one planning session read:

The President observed that he had asserted many times that if we assumed too much damage there would be little point in planning, since everything would be in ashes.  An earlier presentation had estimated that some areas would not be useable for 30 years after an attack; of course planning on this basis is impossible.  While we don’t get off scot free in case of an attack, we should make assumptions which describe a realm in which humans can operate.

So Eisenhower officially directed his National Security Council to keep “assumptions as to the extent of damage within limits which provide a basis for feasible planning.”

The WaPo article does not suggest that President Obama or anyone close to him ordered the recent war game scenario with China. It’s the brainchild of Andrew Marshall, who at 91 years old is still doing what he’s done for decades: sitting in his Pentagon office, dreaming up worst-case scenarios, and (with enthusiastic help from the military-industrial complex) persuading lots of people to take them seriously.

So far the battle, like any fantasy, is all in the mind. Marshall worries, says the WaPo, that China might some day supplant the United States’ position as the world’s sole superpower. One of his supporters, a senior Navy official, explains: “We want to put enough uncertainty in the minds of Chinese military planners that they would not want to take us on. … Air-Sea Battle is all about convincing the Chinese that we will win this competition.” It sounds like an Olympic champion plotting to psych out the challengers, doesn’t it?

The nuclear arms race of the cold war era reached such fantastic proportions for much the same reason: Each side was so good at imagining what the other side might possibly, conceivably, some day, be able to do, and each side was determined to gain the psychological edge.

Of course such mental fantasies have a nasty habit of becoming self-fulfilling prophecies played out in all-too-physical reality. The WaPo notes that a U.S. attack would result in “incalculable human and economic destruction,” according to an internal assessment prepared for the Marine Corps commandant. Some defense analysts “warn that an assault on the Chinese mainland carries potentially catastrophic risks and could quickly escalate to nuclear armageddon.”

But “the war games elided these concerns. Instead they focused on how U.S. forces would weather the initial Chinese missile salvo and attack.”

“Elided.” Such an elegant word. Eliding the real world keeps everything in the mind, where myth and fantasy flourish. I suppose if Dwight Eisenhower had known the word he would have been proud to say that he elided the actual estimates of death and destruction in his war planning, too.

Some critics of Marshall’s war planning decry not only his “eliding” but the very premise of his project: “It is absolutely fraudulent,” said Jonathan D. Pollack, a senior fellow at Brookings. “What is the imaginable context or scenario for this attack?”

But for people like Andrew Marshall (and there are lots of them), raised among staunch cold warriors who saw worst-case scenarios as the only scenarios worth considering, imagining a scenario for this attack is no challenge at all. “We tend to look at not very happy futures,” Marshall says, with wry understatement. When your mental world is shaped by the myth of homeland insecurity, there are monsters all around. It’s easy enough to pick out the scariest one and invent fantasies of the next battle.

Will all this planning make war more likely?  I can easily imagine Andrew Marshall’s eliding that most crucial issue by answering: “That’s not my department.”

The Secret Tie that Binds U.S. and Israel

Israeli and American flags on a ship's mast on the Sea of Galilee. Credit: Wikipedia.

I was really excited when I saw an op-ed in the New York Times by Israel’s most important critic of his own government, Avraham Burg. He’s the most important critic because of his high political standing (he served as the speaker of Israel’s Knesset [parliament]), because he so persuasively condemns Israel’s occupation of Palestine, and because he focuses on such an important, but too often neglected, motive for Israel’s oppression of Palestinians: the mistaken belief that Israel is a weak nation, vulnerable to enemies who want to destroy it.

In his most influential book, The Holocaust Is Over; We Must Rise From its Ashes, Burg argued that it’s dangerous for Jews as well as Palestinians when Israel views its political opponents as if they were Nazis. They’re not, and the huge difference is crucial if Israel is ever to have a realistic security policy that will really bring its people security.

Yet that mistaken equation, Palestinians (or Arabs) equal Nazis, is what Israel’s quite popular prime minister, Benjamin Netanyahu, promotes at every opportunity. On his recent trip to Israel Mitt Romney embraced Netanyahu, making it clear that, if he becomes president, Romney will base U.S. Mideast policy on the same wrong-headed, fear-based ideology.

Of course both Obama and Romney justify their pro-right-wing-Israel tilt by claiming that there’s a “special relationship” between Israel and the U.S., that there’s “no daylight” between the two governments on basic policy issues. And the “special relationship” goes beyond geopolitical interests, we’re always told. It’s a matter of fundamental values.

What exactly are those values? That’s just the question Avraham Burg said he intended to address in his op-ed. In the 1950s, he began, the crucial ties were a common commitment to “democracy, human rights, respect for other nations and human solidarity.” Now the two nations are bound by “a new set of mutual interests: war, bombs, threats, fear and trauma.”

Right on target! I thought. The most important message anyone can bring about U.S.-Israel relations, delivered by a former top-ranking Israeli leader on America’s most prestigious op-ed page.

Unfortunately, Burg’s column did not go on to address the crucial issues his opening paragraphs raised. He wrote eloquently about the gradual disappearance of democracy in Israel, as religious intolerance and ethnic chauvinism become the norm -- an important subject, to be sure. But he said nothing about the deeper insight of his book: the way unjustified fears and feelings of victimization warp Israeli Jewish life and give rise to anti-democratic, anti-humanistic trends.

Even more unfortunately, he said nothing more about the strikingly similar trends in American political culture. No, we did not endure a Holocaust. But the fear of a German invasion of the U.S. homeland was very real in the early 1940s. And Franklin D. Roosevelt did everything he could to fan the sparks of that fear into an anti-German fire. It was a crucial part of his strategy to build political support for his program of supporting the British war effort with everything short of sending U.S. troops to Europe. That was before December 7, 1941.

When the Japanese attacked Pearl Harbor, the fear for the homeland that Roosevelt had created was immediately, seemingly effortlessly, extended from Germany to Japan. Resistance to war, which had been strong enough to remain Roosevelt’s number one political concern, evaporated as a politically significant force overnight.

Roosevelt did not realize the staying power of the worldview he created for the American people, based on the warning that we must always be watching out for enemies who want to destroy us. That warning had played little role in American life since the 1840s (some historians would say since 1815), except for the brief U.S. engagement in World War I.

World War II changed all that. Once the Axis was defeated, the “red menace” of communism took its place, to be followed by “terrorists,” “Islamofascism,” and now “the Iranian bomb.” Next year the enemy might have some other name. Who knows?

Israel has gone through a similar revolving door of enemies. Once it was “the Arabs,” then particular Arab nations, then “the Palestinians,” then “the PLO,” now “Hamas terrorists” and “the Iranian bomb.”  Next year Israel’s enemy might have some other name too.

For now, Israel and the U.S. agree that Iran is bogeyman number one. They differ only on tactics for combating the purported threat. And the agreement is not only among top political leaders. There seems to be as much support for anti-Iranian policies among Americans as among Israeli Jews.

Israel’s myth of insecurity and Americans’ myth of homeland insecurity foster the fear of Iran. Indeed the myths demand the fear: Someone has to play the threatening enemy to make the myths believable.

A myth of insecurity, with the sense of vulnerability and victimization that it breeds, is the most fundamental tie that binds the two nations. Both peoples learned long ago to base their national identity and sense of patriotism on fighting off enemies who are, they believe, bent on destroying their nations. Both are deeply committed to and shaped by these myths.

In both countries there are dissenters who say that their own people are being warped by their myths. These minorities see the peril as well as the foolishness of a fear-based worldview. But in both countries the minorities are as yet small enough that they have little influence on policy.

This is what I hoped Avraham Burg would write about: the shared values that so clearly bind the U.S. and Israel yet remain undiscussed, virtually unnoticed, as hidden as any secret. What a chance Burg had to lay the secret out in the open, to try to wake Americans up, just as his book tried, with some success, to wake Jews up.

It’s a message that Americans need to hear, too. If it were delivered in their most prestigious newspaper, at least the relatively well-educated and elite readers of that newspaper might begin to ponder it. It would be secret no longer.  

Here’s hoping Avraham Burg will return to the op-ed pages of influential American newspapers with his full message another day. Then we could begin to have a public debate not only about the U.S. relationship with Israel but about our relationship with our own understanding of security. 

Why Do Married Women Vote Republican?

Mitt Romney TV ad from the primary campaign featuring his wife Ann.

An article in today’s New York Times takes on one of the enduring mysteries of recent American politics: Why do single women vote for Democrats in far greater numbers than married women do? Single women, predictably, are suffering more than married women in this protracted recession. So if the election is essentially a referendum on Obama’s handling of the economy, as so many pundits tell us, then the polling should show singles more eager to reject the president. Yet most single women still say they’ll vote for Democrats, while married women trend more to the GOP.

The Times’s reporter Shaila Dawan ends up theorizing that single women assume they’ll go on getting the short end of the economic stick regardless of who is president, so they make up their minds based on social issues, where Obama’s more liberal views are more appealing to them. Perhaps. But Dawan’s evidence is nothing more than a few comments she heard from a few single women.  

Another view comes from an acknowledged expert on the subject, pollster Celinda Lake. She quite rightly looks to the “symbols and images of politics” for an answer. More specifically, she looks to the photos of picture-perfect (usually perfectly white) families, with smiling wives and 2.3 children, that Republican candidates rely on to symbolize the order and domestic tranquility that they hope voters will prize above all else. Mitt Romney is certainly following that well-worn path.

One journalist summed up Lake’s view of why this symbolism falls flat with the unmarried: “If you’re a single mom in Alabama struggling to work and take care of a kid alone, it can be grating to have to take in three generations of Romney perfection. ‘That's not the lives of these women,’ Lake says. ‘They are economically marginal, they are short of time, they are juggling, and hoping that one of the balls doesn't fall on their head at any given time.’”

The “grating Romneys” argument is a bit of a stretch, since it’s hard to beat the Obama PR machine for a steady stream of absolutely charming photos of a picture-perfect (though perfectly African-American) family. But Lake is surely right to focus on symbolic images. As political psychologists have shown in so many ways, most voters of both genders, when they cast their ballots, are moved more by such images than by rational analysis of issues.

Perhaps, then, Republicans (with or without family photos) symbolize something that is more valuable to married than to single women. Scholar June Carbone, who has studied the demographic patterns of red and blue voters, suggests that we should ask: “Who's most anxious about family values?” Her answer (in 2010): The most anxious voters are “in Sarah Palin's America,” where divorce and unwed pregnancy rates are the highest. 

If images, not issues, sway voters the most, I’d phrase the question a bit differently: “Who’s most anxious about the difficulty of holding on to images of enduring values and lifestyle patterns, or anything constant, in American life?” But I, too, would look for those worried folks in (to update the imagery) Mitt Romney’s America.

It’s not Romney’s family photos, but Romney himself and all that he symbolizes, that create a reassuring image of constancy, which is what conservatives crave. They got their name precisely because they want to conserve, right? Romney serves them the way all those Currier & Ives prints of rural America used to serve urban and suburban dwellers, who were one or two (or more) generations removed from rural life but hung the prints on their walls to try to mitigate (or perhaps deny) the impact of the change the nation had gone through.

Might this explain why married women vote Republican so much more often than single women? Regardless of the state of the economy, perhaps married women have more of a stake in trying to maintain the status quo -- to ward off change symbolically, when they have little control over it in any practical way.

It wouldn’t be surprising if the most crucial change they hope to ward off is the loss of their married status. Women who suddenly find themselves single typically find themselves in worse financial straits. That can be a huge challenge even in good economic times, and red state women see it happening around them at higher rates than in blue states.

Given the economic times we live in now, divorce can be terrifying for women. So when we’re looking at female voters, if we ask “Who’s most anxious?” as a way of predicting who’s most likely to vote Republican, it wouldn’t be surprising if the answer is: married women.

This is a speculative conclusion, I know. I’d love to see some real research on it. For now, I offer it mainly to point out that when election time rolls around, and we want to understand what’s going on, there’s real value in looking at the kinds of symbolic images that myths are made of. They can often help us unravel electoral puzzles when the more conventional kind of analysis, focused on issues and material interests, just leaves us baffled.

Myth Versus Myth: Remembering Nagasaki

Oak Ridge Environmental Peace Alliance rally against nuclear weapons at the Y-12 National Security Complex in Oak Ridge, Tennessee, April 6, 2011. Credit: Wikipedia.

The sixty-seventh anniversary of the atomic bombing of Nagasaki prompts me to indulge in a bit of autobiography. My path to the study of mythic America began when I was a young historian of religion, writing highly specialized studies of rabbinic Judaism, and in my spare time an antinuclear activist, protesting the Rocky Flats Nuclear Weapons Plant near Boulder, Colorado, where I lived and worked. When I first realized that I could apply the tools of my trade -- the analysis of mythic and symbolic language -- to the nuclear issue, I was glad to bring my professional life into synch with my political and ethical commitments.

It seemed obvious to me at the time that the object of my new study should be my political foes: the Bomb and the millions of my fellow citizens who saw it as an acceptable, even laudable, part of American life. From any moral or practical viewpoint their attitudes seemed to me as inexplicable as they were objectionable. So I wrote a book, Dr. Strangegod: On the Symbolic Meaning of Nuclear Weapons, analyzing those attitudes as modern expressions of very old mythic-symbolic traditions.

It only dawned on me very gradually that the same approach could be applied to my side of the political conflict. The first awareness came one year as I stood on August 9 (or maybe it was August 6) at the entrance to Rocky Flats, along with a huge throng of antinuclear activists, in the annual protest/vigil. Some had walked the nine long, uphill miles from Boulder, as they did every year, led by a Buddhist monk chanting and drumming. Some sang the same old familiar songs. Some carried the same old familiar protest signs. Some held hands and bowed their heads in silent meditation. A few probably climbed through the barbed wire fence and waited to see if they would be arrested. I don’t remember exactly.

But I do remember that I thought to myself, These four days, from the 6th to the 9th, are the high point of the antinuclear year. This is our most solemn occasion, our annual pilgrimage, our High Holy Days. At the time I took it for granted that I’d be observing this sacred holiday every year for the rest of my life.

Well, times change. By the time Rocky Flats was closed down as a bomb-making factory, in 1989, the antinuclear movement had already faded to a shadow of its once-powerful self. I was beginning to tire of studying mass destruction. So I turned to the meaning of peace in U.S. history. But I quickly realized that war and peace were so intertwined that I would have to study the whole history of U.S. foreign policy, though still using the same tools.

Eventually, just as I’d recognized a quasi-religious ritual in the antinuclear movement, I recognized that the movement was also steeped in mythic language and symbolic imagery of its own. It was too simple to say that we, the good guys in the peace movement, were seeing the world objectively, paying attention only to the facts, while those evil warmakers and their millions of supporters were warped by mythic thinking.

As I taught my students about William Lloyd Garrison and Thoreau, Gandhi and King, Jane Addams and Dorothy Day, I realized that those gifted writers and orators had tapped into the roots of imagination as much as any Strangelovian nuclear strategist. Their power to move people and change history came precisely from their immense talent for blending fact and imagery in the service of humane values. The challenge to the peace movement was not to transcend myth but to create new myths, as our great heroes had done.

That offered me another way to understand the scholarly history I was studying. It’s certainly not original to suggest that academic history, especially when it’s done in the old-fashioned narrative way, is a form of storytelling that inevitably has its own mythic dimension. But it might touch a few raw nerves to read the competing histories of the decision to drop the Bomb on Nagasaki that way.

If we had any doubts that the history of that event was a touchy subject, they were erased by the firestorm surrounding the Smithsonian’s exhibit on the occasion of the fiftieth anniversary. The main issue then was whether the Japanese narrative should, or could, be allowed into the halls of America’s one great national museum.

The fire came from those who would permit no doubt to be cast on what Stanley Kutler calls today’s “common wisdom[:] that Truman had only two simple, stark choices: to use the Bomb or invade and suffer a million casualties.” A narrative being treated as “common wisdom” is a hallmark of myth -- in this case, the myth that the bombing of both cities was perhaps tragic but ultimately necessary in the service of a morally good cause. 

My friends in the peace movement were appalled by that firestorm of right-wing fury. But it’s easy to imagine them reacting with some anger of their own if I stick to my view of myth, which sees every myth as a blend of fact and fiction, and put forth two related propositions:

First, there is some truth in the popular narrative that Truman was compelled to use the Bomb twice. The history books that tell the story according to the “common wisdom” perpetuate that myth. But the best of those books were written by competent historians who have some accurate facts embedded in their accounts. Their conclusion is certainly wrong, in my opinion. But they are not spinning totally fictional yarns.

Second, the peace movement’s counter-narratives are themselves myths. Two myths predominate among those of us who condemn Truman’s decision. One is that the bombing of Hiroshima, and certainly of Nagasaki, was unnecessary because Japan would have surrendered in any event, obviating the need for an American invasion. The other myth says that Truman was moved to bomb perhaps Hiroshima and certainly Nagasaki by his (and his advisors’) desire to demonstrate America’s unprecedented might to the new enemy on the horizon, the Soviet Union.

There are plenty of facts to back up both of those myths. But the facts are not so absolutely compelling as to eliminate all competing views. Historians are still free to choose how to put the facts together and how to tell the story. It is, and no doubt will remain, myth versus myth.

That’s not a bad thing. The peace movement would be all the stronger if it recognized that its political influence depends largely on the strength of its myths. No political movement ever succeeded without a powerful narrative. The history of the American peace movement itself teaches us that lesson.

And there’s danger in relying solely on the persuasive power of historically verifiable facts. In any political struggle, the other side will never concede that its facts are totally flawed. The contest of fact against fact will go on forever. Most historians accept that as a given. It’s our lifeblood.

If a political movement waits until its narrative is absolutely, indisputably proven by facts that all historians agree on, it will wait forever. A successful movement gathers what facts it has, weaves them into an effective mythic narrative, and moves ahead with its work, recognizing that political life will always be myth versus myth.

The Mythic Paul Ryan Enters, Stage Right

Paul Ryan and Mitt Romney in Norfolk, Virginia. Credit: Wikipedia.

For those of us wondering what will be the defining story line of the 2012 presidential election, the selection of Paul Ryan as Mitt Romney’s running mate makes it a whole new ball game. Maybe. Or maybe not.

The story line so far has remained in flux. For a while the common wisdom said it all depended on the state of the economy; nothing else mattered. Then the conventional wisdom decided that the big story was the Obama campaign’s full-court advertising press to define Romney as a callous capitalist: Would it succeed or backfire?

Now, most of the pundits tell us, the Ryan pick really is a game-changer. That view is summed up in two columns on the op-ed page of today’s New York Times: “Let the Real Debate Begin,” says Joe Nocera: “With Paul Ryan on the Republican ticket, Americans can have a much needed discussion about the size and role of the federal government.” Roger Cohen agrees that we are finally going to get down to substantive issues: “Romney's choice of Ryan has the merit of opening a serious debate about the debt undermining America.”

If they’re right, then Americans will finally get what both candidates say they’ve wanted all along: A clear-cut choice between competing philosophies of political economy. Frank Bruni writes on the same Times op-ed page:

Right after Romney announced Ryan, who has positioned himself as the wonk prince of the Republican Party, there was some barbed commentary that Romney had outsourced the policy for his campaign, answering the question of what he really stood for by standing with Ryan.

Then Bruni adds another perspective: “Romney outsourced the emotion, the charisma and the narrative as well.” “What Paul Ryan can give Mitt Romney is a tutorial in political myth-making,” says the teaser for Bruni’s column. Read the whole piece and you’ll find this spot-on assertion: “Modern politics demands some myth-making.”

But the mythic narrative that Bruni sees Ryan bringing to the GOP ticket has little if anything to do with political philosophy or economic first principles. It’s all about personal image: Ryan’s striking ability to turn himself into the outsized hero of a compelling life story.

Bruni explains that there’s

a nonstop chorus of Republican allies urging [Romney] to talk more about his Mormonism or his Massachusetts years or Ann Romney’s struggle with multiple sclerosis. They want him to show some skin. They want him to show some soul. Ryan does that so deftly that the contradictions, holes and hooey in his story recede. ... He has fine-tuned the most valuable oxymoron in political life: he’s utterly slick in his projection of genuineness.

(It seems to me I’ve heard that oxymoron applied to Ronald Reagan, both as praise and as blame, more than once or twice.)

I lift up Frank Bruni’s column for two reasons. First, it’s a rare example of a house writer for a prestigious newspaper focusing on the power of myth. Bruni replaced Frank Rich in the culture-critic-turned-political-pundit niche on the Times op-ed page. So it’s not surprising that he would see politics as a competition between dramatic narratives where the characters are not principles or concepts but real flesh-and-blood people playing mythic roles.

Second, his column is a useful reminder that the storm of punditry whipped up by the Ryan pick may pass as quickly as the summer thunderstorms out here in the Colorado Rockies.

In the end, the decisive story may not be about political principles or the economy, stupid. It may be about what some experts think every election is about: The candidates themselves as characters acting out implied, vaguely defined, yet emotionally powerful narratives.

When the nation’s most eminent leaders cajoled and flattered a reluctant George Washington into coming out of retirement to be the first president of the United States, they weren’t especially interested in Washington’s political or economic principles. Future Republicans as well as future Federalists added their imploring voices. They were all convinced that no one else embodied the emotion, the charisma and the narrative needed to hold the fledgling nation together.

The mythic meaning of the presidency has always rested, more or less, on the mythic stature of the person holding the office. Presidential elections have always been understood, more or less, as contests between contending mythic heroes. Whether this election will be remembered as more or as less mythic in that regard is a crucial question. It won’t be finally answered until some time after Election Day.

In Search of New Mythology (Part One)

American eagle and flag. Credit: Bubbels/Wikipedia.

In “MythicAmerica: Essays” I have described two great American mythologies (that is, two large sets of mythic themes and traditions) that are most important for understanding American political culture today. I call them (using current terminology) the mythologies of “hope and change” and “homeland insecurity.” The mythology of hope and change casts America as a dynamic force constantly transforming both itself and the world by expanding its frontiers, civilizing people who live in wilderness, and reshaping the world in the image of America’s highest ideals. The mythology of homeland insecurity casts America as the protector of its own borders against alien threats and the protector of the whole world against those same threats.

    These two mythologies are so deeply rooted in American society, so completely dominant, and so overwhelmingly powerful -- especially when they work together to reinforce each other, despite the contradictions between them -- that it may be hard to imagine them ever being replaced by any new mythology. Of course that was once true of the mythologies that supported monarchy, slavery, patriarchy, and other institutions that seemed unchallengeable for centuries. Fundamental change does happen. But it’s very slow and arduous. The question is always: Is it worth the effort it takes to develop new mythologies and the much greater effort it takes to make them truly living, working mythologies that have a powerful impact on a nation’s life?

    Another great American mythology, pragmatism, suggests that we should answer these questions by asking other questions: What are the practical results of living within the existing dominant mythologies? What might be the tangible results of replacing them with new mythologies? Would the benefits of the new mythologies outweigh the losses and justify the effort involved in creating and promoting them?

    Some of the practical results of the two great mythologies are easy enough to see. Both have served to legitimate killing, injuring, and harmful acts of all kinds that have brought suffering to countless numbers of people. In some cases whole cultures and societies have been destroyed. For some Americans -- going back to the earliest Quaker immigrants to the New World -- any myths that legitimated harm to others have been, by definition, objectionable.

    Of course most Americans have assumed that harm is acceptable in some cases, as long as it is a means to good ends and the harm is outweighed by the good. But that moral calculus is always computed from within the framework of the mythology that legitimates the action. One of the most basic functions of myth is to create the perspective from which we judge what is true and false or good and bad. So when any act is motivated and justified from within a particular mythic framework, its results are likely to appear, on balance, more constructive than destructive.

    That leaves open the question of whether another mythic framework applied to the same situation might have mitigated or perhaps even avoided completely the harm done. So the death, injury, and suffering inflicted in the name of the dominant mythologies is perhaps the most obvious reason to search for alternatives.

    There are other, less obvious, reasons, which I have discussed in my essays on the two great mythologies. I summarize them briefly here:

    Myths are supposed to provide a dependable structure and sense of certainty, a firm foundation for a society’s sense of meaning. But the mythology of hope and change is riddled with internal paradoxes that undermine structure and certainty. It values progress above all -- pushing back the frontier both in geographical space and in time: To move west is to move into a better future. This vision of progress is rooted in the biblical story of history moving toward a utopian consummation, an era without any evil. Yet the hope for perfection requires the dynamism of constant internal improvement, which means constant change. The nation must go on improving, making progress, forever. So the ever-shifting real can never match the static ideal. 

    Moreover, the very idea of a frontier implies some opposing force on the other side, which is typically viewed as an evil threat, creating an “us versus them” dualism. And evil must exist inside as well as outside the nation. How else could Americans demonstrate their ability to improve and purify their nation, which is an essential mark of progress within this mythology? So evil can never be fully overcome. The struggle to defeat it, and the fears that accompany the struggle, must go on forever.

    Thus the mythology of hope and change demands pursuit of a perfection that can never be attained. The inevitable result is frustration, anxiety, and insecurity, which many historians have identified as a constant feature of American history. 

    The mythology of homeland insecurity has fewer internal contradictions because it has a simpler message: America will always be threatened by enemies bent on destroying it. To keep itself secure, America must be constantly prepared to defeat those enemies by any means necessary. The most effective way to maintain national security is to keep control of potentially threatening forces around the world, which means, in effect, controlling everything of consequence that happens anywhere in the world. This has the welcome side effect of making America the protector of the whole world against the menacing enemies.

    However, this mythology has its downside, too, in one overwhelming paradox. Though it posits security as the nation’s highest goal, it also assumes that threat is a permanent fact of life, creating a permanent state of national insecurity. The insecurity is typically expressed as fear of evil beyond the nation’s borders. When the effort to control the world inevitably provokes resistance in some places, the mythology interprets that resistance as confirmation of its premise that there will always be a threat to fend off.

    This mythology also conflates space and time. So it breeds equal, or perhaps greater, fear of whatever lies beyond the border separating the present from the future. Every kind of fundamental change comes to look like a threat from the future invading the safely bounded present. The natural response is to protect the status quo, which becomes the mythic equivalent of protecting the nation. But change is inevitable. So the peril of uncertainty becomes a permanent fixture of the nation’s life.

    Despite their profound differences, then, the two great mythologies meet in their ultimate result: a society pervaded by a sense of constant threat, insecurity, anxiety, and frustration. For those who would rather not live in such a society, the most pragmatic course is to search for new mythologies that avoid these pitfalls.

    In principle, that search has no boundaries. America, like every nation, is an imagined community, and imagination has no limits. We could (again, in principle) imagine American identity and America’s role in the world in any way we collectively choose. For those who want to indulge in fantasy, all options are open.

    For pragmatists, though, the question has to be put in more concrete political terms: What kinds of new mythologies would actually work? What would be effective in reshaping American political culture? Here we are limited by the lessons of history: People are not very likely to totally abandon their most fundamental mythic structures and jump headlong into brand new structures. The new structures that become powerful and dominant are adopted precisely because they retain some kind of continuity with the old.

    So what are the minimum requirements for a mythology to have real success with the American public of the early twenty-first century? Any answer to that question can be no more than educated guesswork; all I can do is offer my own best guess. I’d say no mythology has a chance of meaningful impact unless it offers five basic elements that most Americans expect (whether they know it or not) from their national mythology:

    -- a strong appeal to patriotism and national pride, including an assertion of something uniquely good about America

    -- an assurance that there are eternal, universal truths and values, which are not merely human creations and thus provide an objective, unshakeable foundation for human life  

    -- a narrative pitting those eternal values against their opposites -- a moral drama of good versus evil -- on a global scale

    -- an affirmation of individual freedom as the highest value of all

    -- continuity with the mythic past through deep roots in distinctively American traditions and a close connection with a figure from the pantheon of national heroes

    This last criterion suggests that it would be most pragmatic to build a new mythology on the foundations of one of the two great existing mythologies. Which of the two is a better candidate? The very names of the two suggest an obvious answer. 

    “Homeland insecurity” has built into its very name the biggest problem that we must overcome. And since it inherently militates against fundamental change of any kind, it militates against a change in mythology, which is often the hardest aspect of any nation’s culture to change.

    The other mythology has built into its name the possibility of change of all kinds and the sense of hope for a better future. So it seems clearly the better candidate on which to build a new mythology -- if there is a way to eliminate from it the perception of threat, which breeds anxiety and evokes responses that cause harm to Americans and others.

    Perception of threat may seem to be inherent in the mythology of hope and change because that mythology has always centered around the image of a frontier: a line dividing “us” from “them,” the known from the unknown, the safe from the dangerous. So all promises of progress toward a radically better future are inextricably bound up with fear of what that future might hold.

    The only way to avoid this paradox is to develop a new mythology of hope unmixed with terror, one that views even fundamental changes in American life as progress rather than threat. This mythology would have to avoid pitting America against any enemies; avoid a division of the world into the “virtuous” and the “evildoers”; avoid images of a perfect future that clashes with the reality of the present moment. It would also have to meet the five criteria for any successful mythology, listed above.  

    That’s certainly a tall order. It might seem impossible. But myth-making is an exercise of imagination. Can we stretch our imaginations to conjure up a mythology that fits all these requirements? Perhaps the answer lies closer than we think.

    (This is the first installment in a series. Stay tuned for Part Two.) 

    In Search of New Mythology (Part Two)

    Martin Luther King, Jr. giving the "I Have a Dream" speech at the March on Washington, 1963.

    In Part One of this series I explained why America needs new mythology and why it makes the most sense to build it upon the existing mythological tradition of “hope and change.” I set out a series of features that any new mythology must have if it is going to avoid the pitfalls of the current dominant mythologies and have a chance of widespread acceptance by the American public:

    -- a strong appeal to patriotism and national pride, including an assertion of something uniquely good about America

    -- continuity with the mythic past through deep roots in distinctively American traditions and a close connection with a figure from the pantheon of national heroes

    -- an affirmation of individual freedom as the highest value of all

    -- an assurance that there are eternal, universal truths and values, which are not merely human creations and thus provide an objective, unshakeable foundation for human life 

    -- a narrative pitting those eternal values against their opposites -- a moral drama of good versus evil -- on a global scale

    The new mythology would also have to eliminate the most harmful features of the current mythology of hope and change, those that breed harm to others and insecurity to ourselves:

    -- dividing the world into the “virtuous” and the “evildoers”

    -- pitting America against perceived enemies

    -- mixing the hope inherent in the ideal of progress with a strong dose of fear of fundamental change

    -- promising a perfect future that clashes with the reality of the present moment

    As I concluded in Part One, "That’s certainly a tall order. It might seem impossible. But myth-making is an exercise of imagination. Can we stretch our imaginations to conjure up a mythology that fits all these requirements? Perhaps the answer lies closer than we think."

    I begin my search by thinking back to the great myth-makers of the nation’s history, those who have earned a place of great respect and admiration in the minds of most Americans. That’s a rather small group, and nearly all have offered up versions of the two great dominant mythologies that are so problematic. Very quickly, though, I come to one name that stands out not only for eminence and mythic imagination, but for meeting the requirements of a successful mythology while avoiding the dangerous pitfalls: the Reverend Dr. Martin Luther King, Jr.

    To call Dr. King a myth-maker is not to say that he offered up pure fiction. It is to say that he had a rare ability to tell the truth in emotionally powerful ways that could inspire dramatic political and cultural change. Like all great American myth-makers, he took a great number of empirical facts and wove them into a deeply moving narrative centered on the ideal of freedom. Unlike so many of the others, though, he included facts that were disturbing to most Americans, facts about the tragic denial of freedom in this land. So there is less of a gap between fact and myth in the national story that he created than in most others.

    King also had a rare ability to root the radically new elements of his mythology deeply in the existing mythology of hope and change. He insisted that the dream he so famously had was nothing really new, that it demanded no novel ideals or values. He dreamed only that the nation would finally live up to the most basic values on which it was founded, the ones it declared as its reason for being on July 4, 1776: the inherent right of every person to equality, life, liberty, and the pursuit of happiness. The Founding Fathers expected a nation living by these ideals to transform the world. Dr. King agreed.

    It is already evident that his mythology meets the first three tests: patriotic appeal, continuity with the mythic past, and freedom as the highest ideal. King clearly met the fourth test -- an unshakeable foundation for our lives -- when he preached freedom and equality as eternal, objective truths that were granted not by any human entity but by God. 

    However, King was quite aware that religious language, which was his native tongue, would not be meaningful to many Americans. So he carefully cast his mythology simultaneously in both religious and secular languages. Like the Founding Fathers, he presented the fundamental values as trans-human truths both by claiming a divine source for them and by arguing that they are self-evident to human reason. Here again he showed his continuity with the mythic past and stood with the pantheon of our earliest national heroes.

    I shall offer a sketch of a mythology based on King’s words as I read them (with occasional quotations). I rely only on King’s secular words, since any mythology has the greatest chance of success when it can appeal to the widest range of people. Many Americans will want to translate this story into the religious language that King so commonly used. He made that translation surprisingly easy, which is one more good reason to use his words as a springboard for a new myth. (A summary of King’s views in my book American Nonviolence shows how effectively he intertwined religious and secular language.)

    I do not suggest that King’s mythology is the one-and-only cure-all for the nation’s ills. I present it merely as a springboard for imagination, an example of what a search for new mythology could look like. Like any mythology, it can be developed in endless ways.

    The basic story line begins with the Founding Fathers creating something brand new and extraordinary: A nation-state based on the eternal truths that are self-evident to any reasonable person. It is obvious that every human being feels a need to be free. Everyone, if allowed basic freedom, feels that they are intrinsically worthy and valuable. Everyone wants equality -- to be treated with justice -- because of their inherent desire for freedom and sense of their intrinsic value. 

    The Founding Fathers emphasized certain kinds of freedom -- to speak openly, vote in elections, and own property -- because they saw humanity as essentially a collection of separate individuals, all free to compete with each other for life’s rewards. In fact, though, freedom means much more. It means “the opportunity to fulfill my total capacity untrammeled by any artificial barrier.” It means the ability of each person to choose their own way to actualize their own unique potentials to the fullest.

    But no one can reach their full potential on their own. Sooner or later (and usually sooner rather than later) we all need some kind of help from others. That’s why “my personality can be fulfilled only in the context of community…the mutually cooperative and voluntary venture of man to assume a semblance of responsibility for his brother." To be fully free, we must recognize that we are all members of a single human family. “We are caught in an inescapable network of mutuality, tied in a single garment of destiny. Whatever affects one directly, affects all indirectly. This is the interrelated structure of all reality. You can never be what you ought to be until I become what I ought to be” -- and vice versa. What happens to one happens to all.

    Most importantly, if one person’s freedom to pursue their own fulfillment is abridged, then everyone suffers. Since that one person cannot contribute fully to the fulfillment of others, all suffer a loss of their fullest freedom. It is in everyone’s self-interest, then, that no one interfere with the freedom of anyone else. Certainly the Founding Fathers understood that.

    But full freedom requires a more positive approach. It requires each of us to actively support all others in fulfilling themselves. That’s the only way we can totally fulfill our own potentials. This active support is the deepest meaning of love. We must not merely tolerate everyone else; we must love all members of the human family equally. We must care about what happens to every person, respond to the unique needs of each, and thus help all fulfill their highest potentials.

    For some this may be a religiously or morally motivated altruism. For others, though, it need only be a matter of common sense. We need others to fulfill ourselves. The more optimally others are functioning, the more they can give to us. We must live in whatever kind of community we create. The happier and healthier the community, the happier and healthier our own lives. For all these reasons, when we help others we are also serving ourselves: “We are in the fortunate position of having our deepest sense of morality coalesce with our self-interest.”

    If everyone acted upon this common-sense insight, we would be living in “the beloved community.” This is a particular interpretation of the utopian or millennial goal that has been such an essential part of the mythology of hope and change throughout American history. In the beloved community, everyone would recognize the truth that we all are, always have been, and always will be interdependent. And everyone would act upon that truth. The ideal is active interdependence and mutual loving service, not individual self-reliance and competition. Therefore there would be no hierarchies, no irresolvable conflicts, no oppression.

    The beloved community would be one of perfect unity but not strict uniformity. Diversity would be fully valued, because the distinctive qualities and potentials of every individual would be fully valued. The unity would come from each one appreciating and enhancing the qualities that make every other one different and unique.

    At first sight it seems that this millennial ideal creates the same gap between the real and the ideal that all other millennial visions have created, breeding the same frustration and anxiety, since it is hard to believe it could ever be attained in this world. Indeed, this new mythology openly acknowledges the radical difference between real and ideal. Obviously, in today’s real world many people are selfish and unjust; many ignore or actively thwart the needs of others, especially the need for freedom; inequality is far too widespread; societal problems fester and are exacerbated every day.

    At the deepest root of all these problems is the tragic fact of separation (or, in religious language, sin). This is readily apparent in our own nation’s life. Americans have learned from their earliest beginnings to see themselves essentially as separate individuals who have a terribly difficult time figuring out how to relate to other individuals. That difficulty is reflected in the many separations between groups: genders, nations, ethnicities, races, religions, etc. As soon as there is separation there is likely to be a contest for domination between the two opposing sides. This is the ultimate source of all inequality, which brings with it oppression, injustice, and all too often violence.

    In the modern world, we also find growing separation between our own competing values: some toward ethical/spiritual ideals, others toward material acquisition. And the material side seems increasingly to be dominating. The urge to materialist domination is a major cause of the environmental perils we face. But the only reason we can even think about dominating nature is our deep cultural tradition of treating humans as separate from the rest of the natural environment. In all these ways, separation is the source of humanity’s ills.

    However, separation is not the final word. There is a countervailing reality, “some creative force that works for togetherness, a creative force in this universe that works to bring the disconnected aspects of reality into a harmonious whole.” For many people, not all of them formally religious, the existence of this force is another objective truth that can serve as a reassuring foundation for life. We can see this force at work most easily in natural environments unspoiled by human interference: an organic system of endless interactions, all contributing to an overarching harmony.

    It is harder to see the same togetherness in human life. Though we are all threads interwoven in the single garment of destiny, the many separations we experience make the weave ragged and torn, sometimes leaving gaping holes.

    Thus there are two forces contending for dominance in our world and our daily lives: one for togetherness and one for separation, one toward the beloved community and one away from it, one an “arc of the universe that bends toward justice” and one thwarting that arc. The conflict between these two is the kind of moral drama -- a battle between good and evil -- that seems to be necessary for any mythology to gain dominant influence in American life.

    Each one of us is called to choose sides in this conflict. We can create more rips and holes in the garment of destiny, or we can help to mend the weave and bring it closer to the harmonious blend it is meant to be. If we enter into this moral drama and fight for good against evil, we become menders. We resist every form of inequality, injustice, oppression, and violence. We move our nation, and ultimately the world, toward the beloved community.

    But resistance means doing battle. It might even be called a form of war, although one of the greatest ills to be fought against is war of the traditional, military kind, which brings the cruelest separations of all.

    This moral dualism is perfectly consonant with the dominant American tradition of the mythology of hope and change. It is, in a sense, the familiar call to go out to the frontier and defeat the evil enemy; to promote the march of civilization; to take a stand on the cutting edge of progress, where the present meets the future, and bravely face the unknown. So it might seem to keep us trapped in the sources of insecurity this mythology has always created: fear of enemies, of the unknown, of evils beyond our control, and of a future where the real can never match the ideal.

    But the fundamental innovation of this mythology -- viewing all humanity as one family, tied together in a single garment of destiny -- points a way out of this dilemma, as we shall see in the next installment.

    (This is Part Two in a series. Read Part One here. Stay tuned for Part Three.)

    Why Are Americans So Confused?

    William Gropper: "Construction of a Dam" (1939) -- an example of New Deal-style populist murals.

    Back in junior high school, I had a civics teacher who put forth the proposition that democracy depends on one simple principle: People are rational. Give them free access to information, and they’ll think things through logically to figure out what policies are best not just for themselves but for the whole nation.

    The latest poll from the Washington Post-Kaiser Family Foundation gives some support to that proposition. 67 percent of this sampling of 3,130 Americans understand that “there are many goods and services which would not be available to ordinary people without government intervention.” Only 29 percent disagree.

    As a group they are more worried about jobs and health costs than the federal deficit. 45 percent think Democratic economic policies help them; only 37 percent say that about the GOP.

    To be fair, the group sampled was a bit more liberal than in most other polls. 34 percent identify as Democrats and only 25 percent as Republicans; 29 percent declared themselves liberal on most political matters, which is more than you see in most polls. They favor Obama over Romney by a solid margin, 50 percent to 43 percent.

    On a number of questions, though, a majority of this seemingly left-tilting group sound like they are reading from Romney’s script. 55 percent want “smaller government with fewer services”; only 40 percent want more government services. 53 percent think budget cuts to reduce the federal deficit would help the economy. But not in the Pentagon. Only 45 percent would reduce military spending, while 51 percent oppose that.

    Yet 63 percent agree with the Democrats that additional spending on roads, bridges, and other public works projects would help the economy; only 13 percent think it would hurt. No support for smaller government or budget cuts there.

    And 65 percent support Obama’s plan to raise taxes on households with incomes of $250,000 per year or higher. Only 33 percent oppose it. But 51 percent say cutting personal income taxes would help the economy, and 53 percent support cutting taxes on businesses.

    These people seem rather confused. So maybe it’s not surprising that when asked whether the economy would benefit more from increased spending or avoiding federal deficit, they deadlock at 48 percent to 48 percent.  

    They’ve got the same kind of confusion when it comes to broader principles. More think government regulation is helpful than think it’s harmful, by 49 percent to 44 percent. 52 percent say “the federal government should do everything possible to improve the standard of living”; only 44 percent say that is “not the government’s responsibility, each person should take care of themselves.” What happened to the majority who want smaller government with fewer services?  

    They show up again in the 60 percent who agree with the Romney-Ryan view that “government controls too much of our daily lives” and in the 65 percent who say, “Most people who don’t get ahead should not blame the system, they have only themselves to blame”; only 33 percent disagree. A whopping 75 percent say, “People should take responsibility for their own lives and economic well-being and not expect other people to help”; only 23 percent disagree. (Remember the 67 percent who said many goods and services would not be available to ordinary people without government intervention?)

    However another whopping 70 percent endorse the view that “If people were treated more equally in this country we would have many fewer problems”; only 28 percent disagree. If people are not treated equally, how can they be blamed for their own troubles? In the very same poll, though, only 52 percent will agree that “one of the big problems in this country is that we don’t give everyone an equal chance,” while fully 46 percent disagree.

    With such confusion on basic principles, no wonder this group contradicts itself so much on economic policies. But if your head is spinning now, wait. There’s just a bit more.

    Should we “be more tolerant of people who choose to live according to their own moral standards even if we think they are wrong”? A huge majority say yes, we should be more tolerant, 75 percent to 23 percent. But are Americans “too tolerant and accepting of behaviors that in the past were considered immoral or wrong”? You guessed it. A large majority (61 percent) say that we are too tolerant; only 36 percent disagree.

    Well, that’s how democracy works. Not by rigorous logical thinking, as we were taught in civics class. But not by cynical manipulation of the masses, either. There’s enough support for Democratic and even liberal views here to show that these are not passive victims of Fox News propaganda.

    Should we conclude that people are hopelessly confused and leave it at that? No. I think there’s a way to make sense out of all this contradiction, if we look at one more question from this poll: “Do you think that people and groups that hold values similar to yours are gaining influence in American life in general these days, or do you think that they are losing influence”? 59 percent reply that they and their people are losing influence. Only 34 percent see themselves gaining. 

    To put it just a bit too bluntly, an awful lot of Americans feel like losers. They know that they are hard-working citizens who play by the rules. But they aren’t getting ahead and they don’t feel they’re getting a fair shake. It seems like the little guy just doesn’t stand a chance. That’s their story.

    So they are ready to agree with the story Obama tells on the campaign trail: Inequality is a big problem. Ordinary folks deserve more help from the government to equalize things and help out the little guy. That means more government spending. The rich should pay a bigger share of the costs to offset that. And a bit more tolerance all around will make us a better society.

    But Obama can’t solve their biggest problem: How to explain why they are losing out. Someone must be to blame. Enter Romney and Ryan. They, like Ronald Reagan and lots of other Republicans, know who is to blame: the government. It’s a simplistic, wrong-headed answer. But if this poll is anywhere near accurate (and plenty of other polls get similar results), a majority of Americans find the R-crowd’s story pretty appealing.

    They know that they are capable, responsible people who can make it on their own if given half a chance. So they figure everyone else can take care of themselves too. The problem must be the huge, impersonal government controlling our lives. Get it off the backs of ordinary people -- let everyone stand on their own two feet -- and those of us who are decent folks, who still live by the tried-and-true moral values of the past, will do fine. That’s how the Romney-Ryan story goes.

    All presidential candidates dish up mythic narratives, wrapping their selective version of the facts in stories that give them meaning and emotional punch. But this poll suggests that in 2012, unlike some election years, the myths the two sides are telling have roughly equal appeal. The public as a whole just can’t make up its mind which story to take as its guide. That’s why the candidates are virtually tied in the polls, which have barely moved in months.  

    There’s a lesson here for progressives who wonder why their movement has so much trouble gaining political traction with the masses. Yes, the masses are manipulated, but not as much as many progressives think. The problem progressives ignore is that they still believe what they learned in civics class: Give the people the true facts and their minds will lead them to logical conclusions. What the civics teacher left out is the powerful, perhaps ineradicable, human tendency to look for meaning by thinking in (or by means of) mythic narratives.

    This poll suggests that a majority of Americans are listening with a somewhat open mind to the traditional populist narrative: It’s the rich bosses against the little guy, and it’s government’s job to balance the scales.

    Obama and his campaign strategists are betting that a “soft” version of this story, hedged with concessions to the Romney-Ryan tale of rugged individualism, will win at least the 270 electoral votes needed for victory. So they are giving the basic progressive story line a public hearing and respectability that it hasn’t had since the days of Lyndon B. Johnson, the last of the New Dealers.

    That gives progressives something to build on, if they will forget what they learned in civics class and accept the fact that in politics, it’s myth against myth. A successful myth has to have deep roots in the past. The old populist story meets that test. From the 1890s to the 1930s, and then again in the 1960s, it had huge support in the political mainstream. There’s no reason it can’t be revived.

    But other deep roots have to be included too, if a progressive myth is going to have political success: the American tradition of individual responsibility and self-help, along with some kind of respect for familiar, reassuring moral values and a strong dash of patriotism.

    It’s a tough task to put all that together in an appealing story. But with some imaginative effort it certainly can be done. If you doubt that go back and read the speeches of Dr. Martin Luther King, Jr. The populist myth was only part of King’s much broader message. But he can serve as an instructive and inspiring example because he was such a shrewd politician. He was able to win over much of the nation for radical change, despite massive opposition, in part because he had factual and moral truth on his side, but in part because he expressed that truth in such an emotionally powerful mythic narrative.

    In Search of New Mythology (Part Three)

    Civil rights marchers in Washington, D.C., August 28, 1963. Credit: National Archives

    In Part Two of this series, I sketched out the foundations of an American mythology of hope and change based closely on the words of Dr. Martin Luther King, Jr. In this new patriotic vision, a good American believes in every person’s freedom to discover and fulfill their own unique potentials. Life is all about exploring new possibilities, and there is no end to that exploration. A good American also believes that all humanity, indeed all life, is woven together in a single garment of destiny. Freedom means fulfilling oneself by helping all others fulfill themselves.

    America’s mission is to move toward the beloved community at home and abroad, to create a world where everyone acts lovingly to enhance the fulfillment of all and bring the disconnected aspects of reality into a harmonious whole. It is the patriotic duty, and privilege, of all Americans to fight against separation of every kind, especially against its most pernicious forms: inequality, injustice, oppression.

    In the past, America’s moral battles have typically caused a number of problems that any new mythology must avoid: killing, injuring, and in many ways harming others; creating insecurity for ourselves through fear of enemies and of future evils beyond our control; creating frustration by aiming for a future ideal that can never be attained in reality.

    A mythology based on Dr. King’s words would include the moral drama of good combating evil yet avoid these troubling effects because we would enter the battle with the new attitude that is at the center of this mythology: viewing every person not as a separate unit, trying to figure out how to relate to others, but as a strand in the single garment of destiny, already related to others in myriad ways, with each of us affecting all others in one organic whole.

    From this new perspective many of the old, familiar assumptions of both dominant American mythologies simply make no sense. We cannot claim to be purely good and innocent, as if we stood apart from those we oppose, and ascribe all evil to them, as if we had no role in contributing to the ills that plague us. Nor can we hope to heal those ills by imposing our control over others, as if we were some kind of Lone Ranger arriving from outside to right every wrong.

    Once we recognize that we are all parts of an interactive network of mutuality encompassing all humanity, we realize that we can never stand outside that network. We are never passive victims of history, nor can we be isolated from the dynamics of history. And the hope of fully controlling people and events is a fantasy; every effort at control acts back upon us in unexpected, usually harmful, ways. But we always influence what happens. So we each share some degree of responsibility for contributing to the ills of the system. The ills arise out of the pattern of relationships. They cannot be blamed on any one person or group of people and certainly not on “those people” across the border, since the border is itself a mode of relationship, a place where two groups meet and interact.

    From this perspective, the enemy is no longer any particular person or group of people. It is the evil that has arisen from all of us. In a more abstract sense, the enemy is the fact of separation itself. Therefore, in this new mythology, Americans no longer see themselves as the “good” people dedicated to destroying the “evil.” America's mission is to overcome separation, to strengthen every thread in the garment of destiny by strengthening the interaction of each with all others. Every good American must have that same goal.

    So America still has opponents, both abroad and at home -- those who appear to be increasing the separation in the world and blocking progress toward reconnection. We recognize them by the inequality they promote, the injustices they inflict, and the harm they do to others and themselves. But we oppose their actions, perhaps even label those actions “evil,” without viewing the people themselves as evil.

    Instead, good Americans treat them the way we treat all people, as equally important threads in the single garment of destiny. We respect their inherent dignity and demand the same freedom and justice for them as for all others. If we resist their actions, it is only because we want the best for the whole society, including them. We aim to help our opponents fulfill their full potential, which in turn will help us do the same.

    Since we and our opponents are parts of the same human family, we give them the same respect, empathy, and love we give our own family members even when we disagree with them. We handle conflict with them the way we handle conflicts with our own family members: asserting our own views, sometimes very strongly, only because we want the best for the whole family, including those we disagree with. The American way is guided by the principle of universal love, which means overcoming every separation, even between ourselves and our opponents.

    So we try to see the world through our opponents’ eyes, “to see the enemy’s point of view, to hear his questions, to know his assessment of ourselves.” (All quotations here are from Dr. King.) Only then can we have the fullest possible view of what is best for all.

    However, when grave moral matters are at stake, good Americans take a firm stand and fight for it. Indeed, in this new mythology it is our patriotic duty to risk death, if we must, defending our nation’s highest values. But it is equally our duty never intentionally to inflict death in defense of those values. Killing or physically harming our opponents would only increase the separation we aim to overcome.

    In other words, nonviolence is an intrinsic part of this new mythology of hope and change. But refraining from physical injury is only one part of the larger principle of nonviolence: to love all and want the best for all. That means we must never intend to do any harm to others; we must never try to gain advantage by imposing ourselves or our views in ways that will thwart the fulfillment of others.

    Any intention to do any kind of harm creates conflict and separation, not only physically but psychologically. Hatred and anger lead us to depersonalize and dehumanize others, to treat them as an "It" rather than as "Thou." Because violent intentions as well as actions always perpetuate this dehumanizing, they can only “intensify the cleavage in a broken community.” In the end, violence “leaves society in monologue rather than dialogue.” It simply will not work to pursue the goal of community by means that drive people apart. Even when violence is used to promote a just cause, it destroys the very community it seeks to create. 

    When Americans are called to fight for our ideals nonviolently, we will stand firmly against others, but only temporarily, and only to help them in the long run to heal the rifts that set them apart from others. Responding to hate with love “is the only way to reestablish the broken community.”

    This vision of nonviolence can serve as a basis for all relationships, from person-to-person all the way up to nation-to-nation. Just as parents and children are tied together even in the worst moments of conflict, just as the criminal and the victim are tied together, so the United States is tied to Iran, North Korea, Al Qaeda, and the Taliban. In world affairs, as in personal affairs, there are no winners and losers. Either everyone wins or everyone loses.

    John Quincy Adams once said that “America goes not abroad in search of monsters to destroy.” From the viewpoint of nonviolence, there can never be any monsters. As soon as we start to imagine monsters and set out to destroy them, we destroy the global community and the chance of fulfilling our own highest potentials. So America, like every other nation, will flourish best if it shows “an overriding loyalty to mankind as a whole.” America, like every nation, will preserve its own best values only by helping others enhance their own.

    It’s easy enough to see that including nonviolence in a new mythology of hope and change avoids two of the major themes that have always marked this mythology: dividing the world into the “virtuous” and the “evildoers,” and pitting America against perceived enemies. Thus it removes the insecurity and anxiety these themes have bred. Of course it also removes the impetus to do harm to others, which has so often blown back in harm upon Americans. 

    Though it may be less obvious, nonviolence also avoids the other two major problems of the traditional mythology of hope and change: mixing the hope with a strong dose of fear of fundamental change, and promising a perfect future that clashes with the reality of the present moment. Nonviolence avoids these problems because it does not aim merely to create harmony in some far distant future. It uses means that are meant to bring people together at every step of the way in order to reach togetherness; its ends are fully present in its means.

    When Americans go out to do nonviolent battle, we recognize from the beginning that we are always already connected with everyone, including our opponents. All our actions are guided by that awareness. So in the very act of resisting others we make the beloved community a present reality, in a partial and preliminary way. We realize that we may never have a perfect beloved community. But in every fragmentary experience of it we see the separation between present and future, real and ideal, being overcome. We experience the process of creating more unification. And that process of endless change toward greater harmony is the essence of the beloved community.

    For Americans who live within the mythology of homeland insecurity or the traditional myth of hope and change -- and are therefore prone to see major change as dangerous -- Dr. King, his words, and his example may still appear threatening. But for those who live, or aspire to live, within a mythology based on his words, major change of any kind is not inherently threatening. Every effort for change reinforces our awareness that we have no enemies and that there is no necessary clash between present and future, since the future we seek can always be realized, at least partially, in the present moment. No matter what obstacles we face, the way we face them demonstrates that our lives are changing for the better and thus gives us hope.

    Thus we gain a sense of security that mythologies based on dualities -- “us” versus “them,” the present versus the future -- can never offer. More broadly, we gain all the advantages of a mythology of hope and change without the disadvantages of the familiar expressions of hope and change that dominate our culture now.

    This vision of a new mythology may seem like idle utopian speculation. In light of our current American reality, it may very well seem impossible to imagine nonviolence becoming a central theme of the prevailing American mythology. But it’s worth remembering that nonviolence has been part of the nation’s political culture since before there was a nation, when the Quakers made such a success of Pennsylvania in the seventeenth century. Nonviolence has been especially prominent in the fight for racial justice for nearly two centuries, its banner carried by such eminent figures as William Lloyd Garrison and Julia Ward Howe as well as, of course, Dr. King himself.

    The movement for racial justice is a reminder of how long change can take. But it also proves that the basic assumptions of American life can change in ways once thought impossible. Racism was taken for granted throughout most of American history as an immutable fact. Though we still have a long way to go in improving race relations and equal opportunity, the level of racial integration and equality we have today was absolutely unthinkable to the vast majority of Americans as late as the 1940s.

    In a similar way, lesser mythic themes can become dominant surprisingly quickly. In the mid-1930s, virtually no one could believe that American discourse would ever be dominated by a fear of foreign enemies invading the nation. It seemed unimaginable. By the 1950s it was not merely a reality but an apparently irreversible reality. If the nation’s mythology could be transformed so quickly in the direction of homeland insecurity, it seems equally possible, in principle, to transform it in the opposite direction.

    One of the reasons (among many) for the rise of the “homeland insecurity” myth was the tremendous conscious effort that a lot of people put into making it happen. Staffers in the Eisenhower administration made elaborate plans, with the president’s approval, to accustom the populace to Cold War fear as what they called “the new normal.” They were merely speeding up a process that was already well underway, and they got the results they wanted. Americans became accustomed to what seemed impossible in the mid-‘30s: a life built on a constant, deep, underlying conviction that our national existence is constantly threatened. That conviction still dominates our national life in many ways.

    If we are going to escape from the mythology of homeland insecurity, and from the negative consequences of the mythology of hope and change, it will take just as much conscious effort. This time, though, it’s not likely to happen at the highest levels of government. It will have to emerge the way the civil rights movement emerged, from deep thinking and wise planning at the grassroots of American life.

    (This is Part Three in a series. Read Part One and Part Two here. Stay tuned for further installments.)

    For GOP, It’s the Patriotism, Stupid

    Mitt and Ann Romney on Super Tuesday, 2012. Credit: Flickr.

    When an incumbent president is running for re-election and the economy is in the doldrums, what’s a challenger to do? All the pundits and political analysts agree: Focus like a laser on just one issue: the economy, stupid. It worked for Bill Clinton, the last challenger to run against an incumbent during a recession. Any other strategy would indeed be stupid, the common wisdom says.

    But there are millions of Americans who don’t know the common wisdom. They tuned in to the campaign for the first time when they watched Mitt Romney’s acceptance speech, or perhaps earlier bits of the Republican Convention. And they’re not likely to think that Romney’s campaign is guided by the famous mantra of Clinton’s 1992 campaign. They probably came away assuming that there’s a very different sign posted in Romney’s campaign headquarters: “It’s not the economy, stupid. It’s the patriotism.”

    The pundits ignored the patriotism that drenched Romney’s acceptance speech, just as they ignored the chants of “USA! USA!” that punctuated the speech and all those “We believe in America” signs decorating the GOP convention hall. They figured it was just the usual window-dressing required at any Republican convention -- or Democratic convention, for that matter. There will be just as much red-white-and-blue in Charlotte as in Tampa.

    But Romney’s speechwriters and strategists are not likely to dismiss their patriotic flourishes as mere window-dressing. They’re not stupid. They’ve got to mobilize their conservative base while appealing to a crucial sliver of voters in the swing states. As the polls consistently show, both those target groups consist mainly of white married people and over-65s. They’re the only demographics that can give Romney a victory. And they’re not the ones most affected by a weak economy.

    The people suffering most from unemployment and underemployment are the young, single women, and people of color. If the economy were really the only issue that mattered, and the common wisdom were accurate, Romney should be adding big chunks of these distressed voters to a very big chunk of his conservative base, giving him an easy victory. But the most distressed groups are overwhelmingly for Obama, which suggests that the common wisdom misses the mark.

    Romney’s strategists know this. So they have to find some other issues to bring their target voters on board. The social issues of the so-called “culture war” are too dicey to stake a campaign on. Patriotism is absolutely safe. And for years now it has been GOP territory.  “We believe in America” obviously implies the unspoken sequel: “And those others, who chose Barack Hussein Obama as their leader, do not.”

    That’s not to say the Romneyites are appealing to racism or anti-Muslimism. No doubt they are happy enough to have such prejudice work in their favor. But there probably isn’t enough of it to swing the election.

    What there is, in great abundance among Romney’s target demographics, is a strong feeling that the Democrats lost their patriotism back in the days of the Vietnam War. When the Dems picked a vehement opponent of the war, George McGovern, as their presidential candidate, they lost their claim to truly believe in America, and they’ve never regained it, as far as Romney’s target voters are concerned. Even the assassination of Osama bin Laden can’t erase that deep conviction, because Obama won’t say the sacred word that Marco Rubio fairly shouted out in introducing the GOP candidate: Romney “understands what makes America exceptional.”

    To be sure, Romney did spend plenty of time in his speech harping on the weak economy. But he wasn’t appealing to people’s personal suffering. Most polls show that a majority of Americans say their own economic situation is OK or even good. When Romney asks Ronald Reagan’s classic question -- “Are you better off now than you were four years ago?” -- if the voters answer honestly based only on their own family’s situation, most would say “yes,” or at least “no worse off.” And that’s more likely to be true among Romney supporters than Obama supporters.

    But Romney’s target voters are nervous about their future because they see the nation’s economy as a whole in bad shape. So when he asks that classic question, it’s a coded way of asking, “How do you think America is doing? Is the economy safe? Is America keeping you safe?” He’s raising the issue of national pride, the core of patriotism. And he’s probing the tenderest of political spots among his target voters, their deeply buried sense of national insecurity.

    He’s also asking, “Do you think your family is safe?” The biggest applause line of his speech (according to the subjective applause meter in my ears) was his scoffing remark that “President Obama promised to begin to slow the rise of the oceans and heal the planet,” followed by the powerful punch line: “My promise is to help you and your family.”  How can Democrats be patriotic when they care so much about the whole world?, Romney was asking. True patriots worry about taking care of business at home.

    With that the GOP leader tied together the three key elements of his patriotic narrative: American exceptionalism, prosperity through unbridled capitalism, and the safety of the traditional nuclear family. That’s the holy trinity of “the America that we all know,” as Romney put it -- though he should have said, more accurately, the America that we Republicans imagine once existed and still long to believe in, the America we think any true patriot must believe in, too. 

    For many of Romney’s target voters, Obama symbolizes profound doubt about whether the familiar America of their imagination exists any more or can ever exist again. Romney’s strategists surely understand the anxiety those voters feel about losing the America they believe in. Just as surely, the Romneyites want to raise that anxiety as high as they can between now and Election Day. Romney’s speech and all those “We Believe in America” signs were a strong start.

    Obama and his strategists know this well enough. So we are likely to hear the president and other Democrats oozing a patriotism that will make some of their own base a bit queasy. That part of the Dem base has, in truth, been more or less skeptical about unbridled patriotism ever since it carried us into the horror of Vietnam.

    But there’s an election to be won. It’s a simple rule of battle strategy: When your opponent rolls out a big gun, if you’ve got the same gun in your arsenal, you fire back with it every chance you get.

    The two candidates’ visions of patriotism are hardly the same, though. In Obama’s rhetoric, the holy trinity is American leadership-in-partnership, prosperity through cooperation, and the safety of families of every kind, both traditional and not.

    The real story of Election ‘12 may turn out to be not just a referendum on the economy nor a choice between two ideas about government’s role, but also -- perhaps most importantly -- a choice between two visions of what it means to be a patriotic American. 

    Michelle Challenges Nineteenth-Century Myth

    Official White House portrait of Michelle Obama, 2009.

    I’m an unabashed Michelle Obama fan, and my wife is even more so. It’s not just Michelle’s extraordinary set of talents. It’s the way she carries them so gracefully. If her air of humility and naivete is not genuine, then in addition to all those other talents she’s the greatest actress of our time. So as we watched her speech to the Democratic National Convention we ooh-ed and aah-ed over her delivery and her magnetic presence.

    But to be honest there was not much interesting substance in the speech beyond the expected, politically necessary words. There was just one sentence that made my wife exclaim, “Good line!”, and I had to agree: “When you've worked hard, and done well, and walked through that doorway of opportunity, you do not slam it shut behind you. You reach back, and you give other folks the same chances that helped you succeed.”

    The Obama campaign has done a pretty good job of creating the impression that Mitt Romney, having walked through that doorway, quickly slammed it behind him. No doubt Romney would protest that it just ain’t so, that he cares as much as any Democrat about reaching back and helping others succeed. And he might very well be telling the truth.

    The crucial difference between the two candidates and the two parties is in how they see that metaphorical hand reaching back.

    The Republicans see it primarily as an act of charity, a personal decision by individuals who get ahead to reach back and help individuals of their choosing who lag behind. Communities that vote overwhelmingly Republican are filled with churches, clubs, societies, and organizations whose main purpose is to help others. And they do help others, immensely, every day.

    It’s a tradition that goes back to colonial times, when its prime motivation was rooted in religion -- as it still is for so many Republicans. Calvinist theology (in the version most popular among the colonists) taught the successful to see themselves as blessed by God and obliged by God to help others less blessed. But it also taught that success was a sign of being right with God, while lack of success showed some flaw in one’s relationship with God. In its cruder (but very popular) form, the message was that lack of success was a sign of sin. So charity was a way not only of giving the sinners a helping hand, but also of publicly reinforcing the message that they were, indeed, sinners.

    That tradition continued to dominate the American view of the helping hand through the end of the nineteenth century.

By the early twentieth century, though, a revolution was occurring. The Progressive movement was on the rise, spreading a new message: Lack of success was a sign of failure not by the individual but by societal structures and institutions that limited the individual’s opportunities, no matter how hard he or she worked. That premise dramatically changed the view of the helping hand. Now it had to be not merely a personal decision to bestow charity, but a decision for structural change. Without that change, all the charity in the world would merely perpetuate the problems and ensure that some people would lag behind, that they’d never get the help they needed to make it through the doorway of success.

    In a democratic republic, structural change can never happen at the whim of one or even many separate individuals. It has to be initiated through the political process. So, in the Progressives’ view, the helping hand had to be extended by the body politic as a whole. And the obvious agent of the body politic is government.

    That’s what Michelle Obama meant, of course, when she said, “You reach back”: We, the people, change our laws and policies to make sure everyone can get through that doorway.

Insofar as this election is a choice between those two visions of the helping hand (and that’s just a part, maybe even a small part, of what this election is about), it’s a choice that twenty-first-century voters will make between nineteenth-century and twenty-first-century worldviews -- the stuff that myths are made of. It’s a useful reminder that when a myth is eclipsed, it doesn’t always die. Often it lives on in the shadows, just waiting for a chance to make its comeback.

The Simple Message Dems Are Missing

[Image: Barack Obama needs to refine his elevator speech game. Photo credit: Pete Souza.]

    Barack Obama needs a good elevator speech. So does every political activist. It’s the quick little speech you give a stranger you meet on an elevator about your group’s goal, why it matters, and why that stranger should support you. You don’t know which floor the stranger will get off on, so you have to convey your whole message clearly in just a few words.

    If you get on an elevator at the first floor with Mitt Romney, you know what you’ll get: “Barack Obama is destroying our economy because he lets the government take your money and give it to other people, who probably don’t deserve it. We Republicans will build prosperity by letting you decide what to do with your hard-earned money.” Second floor, speech over, all out.

    But suppose you get on the elevator at the first floor with the president. Which speech will you get? You might be on floor 20 or 30, still trying to figure it out.

    Will it be the speech about tax fairness, income inequality, everyone playing by the same rules, making hard work pay off, building the middle class, guaranteeing everyone a middle class life, building the nation’s infrastructure, finishing the job we started, keeping hope alive, Romney’s job-killing at Bain Capital, Obama understanding your problems, social justice for women and minorities? And there are surely a few I’ve missed.

    Theoretically, all these speeches can be knit together into a logical whole. The problem is that in this age of instant communication, few of the swing voters who will decide the election have the patience, or the interest, to think through the logical connections.

But there is one elevator speech the president often mentions that can quickly sum up how all the others fit together: “Our destinies are bound together. A freedom which only asks what’s in it for me [which is Romney’s kind of freedom], a freedom without love or charity, is unworthy of our founding ideals. … We travel together. We leave no one behind. We pull each other up,” as Obama put it in his acceptance speech in Charlotte.

In Osawatomie, Kansas, last year he put it even more succinctly: Today’s Republican “philosophy is simple: We are better off when everybody is left to fend for themselves and play by their own rules. I am here to say they are wrong. We’re greater together than we are on our own.”

    That elevator speech is not about specific policies or what will happen in the next four years. It’s about two basic philosophies of human society that have been vying for dominance throughout the history of this country. Pick the one you believe in, and every other political and social view flows from it.

It’s not really a question of personal preference about how people should live, though. It’s a question of whether or not we’ll recognize how things really are. Dr. Martin Luther King, Jr. put it most memorably. Whether we like it or not, we are in fact “caught in an inescapable network of mutuality, tied in a single garment of destiny. Whatever affects one directly, affects all indirectly. This is the interrelated structure of all reality.”

Recognizing that truth should be “the fundamental rule of our national life,” Obama said at Osawatomie: “In the long run, we shall go up or down together.” He was quoting another president who had spoken those words in the same place a century earlier -- a Republican, Theodore Roosevelt.

    Obama could just as well have quoted the Democratic President Roosevelt, who said in his first inaugural address: “The basic thought that guides these specific means of national recovery is … the insistence, as a first consideration, upon the interdependence of the various elements in all parts of the United States … We now realize as we have never realized before our interdependence on each other.”

    It’s true that, for both Roosevelts, the theme of interdependence was only a small part of their rhetorical arsenal. TR’s elevator speech focused more on justice and personal morality. FDR’s focused on the social morality of keeping every person out of abject poverty. But they each followed the basic rule of politics that says a winning campaign is built on no more than two clear, simple, positive messages, repeated over and over.

    Obama seems to want to have it all. He gives us both Roosevelts’ elevator speeches, along with a few themes of his own that differ from both TR and FDR, and he throws in a dash of MLK for good measure. The president may get re-elected, even though he’s defying that basic rule of politics. Then we’ll know who the next president is. But we won’t know exactly what message his victory sent.

However, it doesn’t all depend on the president. The average American in the street does not have to parrot all of his many elevator speeches. As we talk about the election with everyone we meet, we are free to focus on whatever theme we want. We could choose to focus on the message of interdependence: “Our destinies are bound together. … We’re greater together than we are on our own.”

    If enough of us make that choice, the election would become a referendum not just on candidates or policies, but on the fundamental question of American political life: rugged individualism versus the common good, “you’re on your own” versus “we’re all in it together.”

Then, if Obama wins on Election Day, Dr. King’s vision of “a single garment of destiny” would be the clear winner too. The Republicans’ “every man for himself” philosophy would go down in defeat. American political life -- indeed all of American life -- would turn a profound corner and might never be the same again.

The Myth of Arab (Or Is It Muslim?) Rage

[Image: Newsweek cashes in on Muslim rage.]

    I’m on a brief vacation, from writing this blog and from almost everything else -- traveling to both coasts, seeing friends, museums, and oceans -- which means I don’t get to know very much about the news: just brief snatches of headlines caught from newspaper racks, TVs in public places running CNN, and an occasional glance at the New York Times website.

    That’s actually a very revealing way for a writer on mythic America to get the news, because that’s the way most Americans get their news. Headline writers don’t have time or space for details or, often, facts. They just need to grab attention with some emotionally punchy words, the kind of words that good myths are made of.

    So I know that mobs are venting anti-American rage throughout the Arab world. Or is it the Muslim world? I’m not quite sure. And how many Arabs, or Muslims? What percentage of the population in predominantly Arab or Muslim nations? I have no idea. Like most Americans, I know only that “those Arabs” -- or maybe it’s “those Muslims” -- are raging against us. Oh, and I know that they’re creating a big new headache for the Obama and Romney campaigns.

    For the headline writers, that’s a good enough story. And it’s a pretty satisfying story for a lot of Americans. Our prevailing national myth, the myth of homeland insecurity, requires that some foreigners be out to get us. At least since 9/11/01 Arabs (or is it Muslims?) have been the number one candidate for that role. The latest anti-American outbursts came on the anniversary of that tragic day, which is quite convenient, speaking in mythic terms. It allows the connections to be made so easily; the world seems to fit together, just as most myths aim to suggest.

And, as on 9/11/01, the story is about a new threat that we must all prepare to deal with for an indefinable, but surely lengthy, amount of time (or so we’re told). Even the Times, widely acclaimed as our most serious, in-depth newspaper of record, is satisfied with the headline: “U.S. Is Preparing for a Long Siege of Arab Unrest.” The many readers who never get past that headline or the first few paragraphs of the story still won’t know how many Arab countries, or what percentage of the population in those countries, we’re actually talking about here.

Imagine that a tiny group of Arabs made a movie critical of Jesus, which provoked anti-Muslim demonstrations among some right-wing Christians in the U.S. It’s easy enough to imagine a few thousand or even a few tens of thousands participating in those demonstrations. Some might even get out of hand for a while.

    But if newspapers in Arab lands headlined that anti-Muslim “rage” now characterized the whole United States, most of us would laugh bemusedly at how badly those headline writers misunderstand us.

    Yet so many Americans, and so many of our journalists, seem to be content with crude sweeping generalizations about “the Arabs” or even “the Muslims.” And many are not merely content but eager to purvey and consume these generalizations. They serve to confirm stereotypes that have been popular in American culture throughout the nation’s history, ever since the days of the Barbary pirates.

    In fact these stereotypes are much older than the United States. The first Europeans who came to North America carried with them centuries-old pejorative images of Arabs and Muslims. Of course the Muslims were seen as “infidels”; literally, people without true faith (and Muslims repaid the compliment to Christians). At a deeper level, though, the Christian image of the Muslim could be traced back to the “civilized” Greek and Roman image of the “barbarian”: lazy, dirty, impulsive, unruly, unpredictable, and easily given to sudden outbursts of rage. It’s the same image, of course, that Americans of northern European descent have applied to a long list of other Americans who didn’t seem quite “civilized.”

    What nearly all these pejorative images boil down to is a supposed lack of self-restraint. That’s the essence of the current Arab (or is it Muslim?) “unrest” that even the New York Times warns us we should be worrying about now.

So we’re caught, as a nation, in a conflict between our awareness that those age-old generalizations have become an unacceptable form of prejudice and our unawareness (for the most part) that it can really feel good to give vent to prejudicial stereotypes every so often.

Ironically, there’s some evidence that a similar internal struggle between new and old cultural perspectives is playing out in the anti-U.S. demonstrations themselves. The anti-U.S. “rage” surely represents only a portion, and probably quite a small portion, of public sentiment in predominantly Arab and Muslim lands. It’s a pretty safe bet that this latest episode will pass and be forgotten in the U.S., just as the stir created by the Danish cartoons of Muhammad passed and was forgotten seven years ago.

But it’s an equally safe bet that in the future we will see other such episodes blown out of proportion in the U.S. mass media, because they offer the perverse satisfaction of purveying the old mythic image of Arabs and Muslims as if it were fact.

GOP vs. Dems: The Failure to Communicate

[Image: What we've got here is a ... well, you know. Photo credit: Flickr/HNN staff.]

    There’s a common view that the clandestine video of Mitt Romney’s “47 percent” speech revealed “the real Mitt.” But why should we assume that? How can we know? We can never read someone else’s mind.

There is one thing we can know, though: Politicians tend to tell audiences what they want to hear. A good politician’s stock-in-trade is a knack for summing up an audience’s shared narrative more effectively than the folks in the audience themselves can do it. Why shouldn’t that be just as true of Romney at a small gathering of super-rich donors as at a huge crowd of gun owners or Tea Partiers or teachers?

And why shouldn’t it be just as true of Barack Obama in 2008, when he was captured in a clandestine audio recording talking to super-rich donors about industrial workers facing permanent unemployment: “It's not surprising then they get bitter, they cling to guns or religion or antipathy toward people who aren't like them or anti-immigrant sentiment or anti-trade sentiment as a way to explain their frustrations.”

We’ll never know if Obama really believes that, either. But there’s a good chance that he knew what story his audience wanted to hear. So both of these infamous recordings probably do tell us something significant about what elite liberals and conservative donors believe and the messages they send to their masses.

    Taken together, they look like a pair of messages that have no connection with each other. Like skew lines in geometry, they can go on forever and never meet at a single point. So they reveal the huge failure to communicate that marks our current political discourse.

    If I remember rightly, conservatives responded to Obama’s ’08 remarks with insulted outrage, as if to say, “This is absurd. You don’t know what you’re talking about. We won’t even dignify that nonsense with any effort to disprove it.” Liberals have offered a similar sense of insulted outrage in response to Romney’s recent remarks.

    In neither case did the offended side say, “OK. Let’s take your narrative seriously, unearth the assumptions behind it, and use it as a springboard to identify the most important issues that divide us. Then we can discuss those issues on their merits and demerits. Let’s make that debate the central issue of this electoral contest.” 

    The comparison is not perfect, because liberal journalists have gone into great detail proving Romney’s facts wrong. Many non-income-taxpayers will not vote for Obama, and many income-taxpayers will. Romney will carry many states that get more from the federal government than they pay in, while Obama will carry many states that pay in more than they get back.

    But those facts don’t refute the main thrust of the conservative narrative, which isn’t about amounts. It’s about attitudes: the purported “sense of entitlement” versus the equally purported “sense of responsibility.” And there’s no way to disprove that story, just as there’s no way to disprove the liberal story. They are both, in the philosophers’ terminology, non-falsifiable.

    Of course political parties have been creating non-falsifiable narratives about their opponents throughout American history. The classic example is the first: In the 1790s, Federalists insisted that Republicans wanted to plunge the nation into the same kind of bloody revolutionary chaos that had engulfed France; Republicans insisted with equal fervor that Federalists wanted to turn the nation into an English-style monarchy.

    Then, as now, these were claims about the supposedly deepest beliefs of the other side. Since nothing the other side said could effectively refute those claims, the two sides simply hurled their charges at each other. The substantive issues dividing them, which might have been effectively debated, never had a chance to dominate the public conversation. So there was too little genuine conversation and too much of the parties talking past each other.

    In the 1790s, the analogy of two groups speaking different languages, with little interest in learning each other’s language, was not far from the truth. Nor is it today.

However, there are a couple of very important differences between the infamous Obama and Romney remarks to donors.

    Though Romney’s words in private were rather more blunt (“inelegant,” as he put it) than his words in public, there was little in the substance of his narrative that we haven’t heard from him in public settings. And that narrative elicits cheers and jeers from the GOP faithful every time it’s told again.

    Obama’s private words of ‘08, on the other hand, have rarely if ever made it into the public arena. I suspect that’s because they aren’t very effective fodder for evoking cheers and jeers at public rallies, since they are not really an attack on conservatives.

    Taken in context, they are a rather sorrowful explanation (from a liberal perspective, of course) of the roots of conservative attitudes. They cast the blame for the social attitudes of the conservative masses on the failed economic policies of conservative elites. They don’t do what Romney did: blame the victim.

    The other great difference between the two infamous remarks is that many of Romney’s factual claims can easily be refuted, as we now know. Conversely, there is a substantial, academically respectable body of psychological data and social theory that tends to support (though not definitively prove) Obama’s claims.

    The data show that conservatives, in general, tend to want predictable order and structure because they have trouble tolerating ambiguous situations. They are more inclined than liberals to follow norms and rules and to plan and organize their activities. So they feel more troubled than liberals by unstable systems, while they are less open to new (hence unpredictable and unstable) experiences.

    One well-known explanation for all this data comes from the prominent conservative social theorist Peter Berger. He argued (to make a very long story far too short) that humans naturally treat their familiar cultural structures (such as “God and guns”) as immutable objective truths, because that’s the best way to make sense out of the constant flood of stimuli that would otherwise overwhelm and paralyze us. That may not be true of all people, but the data show that it’s significantly more true of conservatives than liberals.

If Barack Obama doesn’t know this combination of data and theory (and he very well might), someone in the West Wing surely does and could explain it to him quickly enough. So it looks like Obama has missed a golden opportunity. He could tout his “God and guns” message very publicly and live up to the common caricature of him as a professorial lecturer. Yes, he would incur the wrath of conservatives, but probably no more than he has in any event. So there would be no appreciable political loss.

    The political gain would be to create a new narrative on the liberal side, where there is now (let’s be honest) plenty of blaming the victim. The new narrative would disagree with the political and social values of conservatives but show compassion for the people who hold them and especially for their economic plight. Imagine how that would elevate the public discourse, change the political landscape, and perhaps help nudge undecided voters toward the Democratic side.

    Obama has missed another opportunity too, at least so far. He could challenge Romney to back up his story about the 47% with facts and make the resulting debate the center of the campaign. No, the president could not disprove Romney’s main points. But it would give him another way to put the spotlight where he wants it to be: on the stark contrast between the candidates’ competing narratives. And it just might open the door to a real debate about the merits and drawbacks of their very different political stances.

    Instead, so far, the president has merely echoed the liberal emotions of insult and outrage. He has perpetuated the failure to communicate. That’s a shame.

    But hey, we still have three “debates” coming. Who knows? Maybe somewhere in there we will get a genuinely substantive debate, the stuff that democracy is supposed to be made of. 

Israel Versus Iran: Netanyahu's Cartoon Version

[Image: Still from Netanyahu's speech. Credit: C-SPAN.]

    I was driving home listening to NPR when the top-of-the-hour headlines came on. First item: Just moments earlier, Israeli Prime Minister Benjamin Netanyahu, addressing the UN General Assembly, “warned that by next summer Iran could have weapons-grade nuclear material.” Then a clip of Netanyahu, trying to sound chilling: “At stake is the future of the world. Nothing could imperil our common future more than the arming of Iran with nuclear weapons.” 

    "Nothing?," I wondered. Not even the melting of the polar ice caps, or a huge spike in global food prices, or an accidental launch of one of the many nukes that the U.S. and Russia still keep on hair-trigger alert?

Then I asked myself, Why is this big news? Everyone knew what Netanyahu was going to say. Everyone knows that he’s been beating the war drum for years to build his political base at home. Meanwhile, as everyone knows, he’s alienating the rest of the world. Top U.S. political and military leaders, and many of Israel’s top leaders, want him to cut it out before he stumbles us into a war that no serious person (very possibly not even Netanyahu) really wants. There’s nothing new here, though there is something really dangerous in giving these bellicose words top billing when they hardly deserve it.

    When I got home I noticed that the NPR website was running the story as its lead. But it wasn’t just NPR. On the websites of the nation’s two most respected newspapers, the New York Times and the Washington Post, the lead stories were “Netanyahu Sets a Time Frame for Stopping Iranian Bomb” and “Netanyahu: Iran Could be Nuclear by Next Summer.” Again I asked, Really, why is this big news?  

    It was WaPo columnist Alexandra Petri who put me on the track of an answer. She noted that a diagram Netanyahu had held up during his speech, supposedly showing Iran’s progress toward a bomb, was drawn in the crude shape of a cartoon bomb that her four-year-old might have produced using MS Paint. 

    “It violates this bomb’s contract that there are no train tracks or Looney Tunes characters visible in the shot,” Petri mocked, “and that it is surrounded by three-dimensional people in color. This is not even a Clip Art bomb. This is a Wingding.”

    But she went on to explain why “everyone’s fixated on the graphic design” (and indeed, both of our great newspapers prominently featured a photo of Netanyahu holding up the Looney Tunes bomb):  “When you have to critique a speech, there are two approaches: try to step back and see the whole thing, or alight on one moment that anyone who made it out of kindergarten intact can argue about.”

    Most people take the latter route because it’s easier: “Forget close reading. Skip the text.” Just say, “It was all there in the bomb.” It was all there in the cartoon.

    Netanyahu’s endless warnings about “the Iranian threat” are just as cartoonish as his bomb. That’s not meant as an insult; it’s simply a description. We all appreciate a good political cartoon. It communicates a clear, simple message that you can grasp in an instant, because it’s done with a few exaggerating strokes of the pen -- or a few exaggerating words. Drawing verbal cartoons is one of the skills we expect any political leader to have.  

    It’s also, I suppose, one of the skills that journalists in the mass media must have, since their work is measured so much by the size of the audience they draw. Even readers of our two great newspapers rarely flock to subtle, in-depth analysis. (Just check out their websites’ lists of “most viewed” articles.) Most people want their stories clear, simple, easy to grasp -- and, I suspect, exaggerated. It’s the exaggerations that make the news not only simple but emotionally engaging.

    Now back to my original question: Why was Netanyahu’s speech big news? Mainly, I think, because it would bring in big audiences for the same reason cartoons do. It offered yet another chance to trot out a simplistic, immensely popular, decades-old cartoon: little Israel, our hero, bravely and cleverly fighting off the Muslim foe, much like Roadrunner fends off Wile E. Coyote every time.

But without the laughs. We are forbidden, by an unwritten but immutable cultural law, to laugh. We must take absolutely seriously Netanyahu’s umpteenth reiteration of his warning about “the greatest threat to the future of the world,” even when it’s illustrated with a laughable cartoon bomb. Because this childish story of absolute good against absolute evil (and isn’t that what most great cartoons provide?) wouldn’t be nearly as emotionally satisfying if everyone admitted how silly it is and had a good Looney-Tunes-style belly laugh. No, it has to be treated as a profound, stirring drama.

    So we are supposed to take this cartoon with a totally straight face, the way most cultures have taken their myths. A myth is not a lie. It’s a story that expresses something fundamental about the worldview and the values of the people who tell it. In our culture, cartoonish political words often do the same.

    There is a lot of similarity between myths and cartoons. Both mix fact and fiction. Both exaggerate facts to fit the fiction and to evoke emotional response. So both create a caricature of truth, a picture that is oversimplified, schematized, and therefore easier to grasp and respond to.

But some myths, like some cartoons, are higher quality than others. A good myth or cartoon tells something important about the society that produces it. It has some complexity, some subtlety, something that an interpreter can sink his or her intellectual and emotional teeth into, even if it’s only to reject the myth.

    The stories that Barack Obama, Israeli Defense Minister Ehud Barak, and the latest report from Israel’s Foreign Ministry tell about Iran’s nuclear program have some of the qualities of a good myth. Even if they are ultimately built on fictions, they have at least a bit of nuance and complexity. There’s something there you can push back against.

    Netanyahu’s cartoonish tale, on the other hand, is about as simplistic as it gets: absolutely easy to understand in a moment, even for those who barely made it out of kindergarten. That, I conclude with no pleasure but with serious sorrow, is a large part of the reason it made such big news. 


    (PS: A few hours after I posted this piece I noticed that articles about Netanyahu's speech, complete with photos of the cartoon bomb, were number one on the most popular list of the Washington Post website, but only number five on the New York Times site.)

“Hope and Change”: The “Comeback Kid” of Political Narratives?

[Image: Obama campaign graphic.]

    On the eve of the “great debate,” the presidential election narrative in the mass media is moving toward “Obama’s widening lead.” That may or may not be true, depending on how seriously you take the polling process. But in politics, as in so much of life, the story will trump the facts nearly every time.

    If Obama is indeed widening his lead, the change is most evident in the battleground states, where voters are inundated with advertising, robocalls, and candidate appearances as portrayed on the TV news. Why are Obama’s numbers improving, slightly but steadily? Theories abound.

    Here’s one that comes from a little fragment of (perhaps previously unreported?) history that I just stumbled across, reported by Howard Kurtz, The Daily Beast and Newsweek’s Washington bureau chief. (Isn’t it telling that a once-serious magazine, now turned into a pop tabloid, would hire a very talented media -- especially TV -- critic as its Washington bureau chief?)

    It seems that Mike McCurry, who was White House spokesman for Bill Clinton, told Kurtz this story: 

    In the summer of 1996, Clinton “had not crystallized his argument for reelection until he watched Dole deliver his acceptance speech,” which included the line “let me be the bridge to a time of tranquility, faith, and confidence in action.” In the hotel room, Clinton “slammed the desk and said, ‘No, that’s wrong. You’ve got to be a bridge to the future. That’s how I want to make my closing argument.’” “Obama is now building that argument,” Kurtz adds, speculating that this goes far to explain the September boost the president is getting in the polls.

There’s no way to prove it, of course. But we do know that Clinton is now up there in the pantheon of modern American political geniuses, alongside Ronald Reagan and Lyndon Johnson. So his advice is always worth listening to. We also ought to know that Clinton’s famous piece of campaign advice from 1992, “It’s the economy, stupid,” cannot, by itself, explain Obama’s standing in the polls.

    So it’s worth considering the possibility that the Obama campaign’s focus on the future really has made a difference. The old slogan of “hope and change” has not been trotted out again. The media wrote its obituary long ago, so it would make too tempting a target for Republican scorn. But the idea is certainly there, front and center. Perhaps the “comeback kid” has revealed to the Obama campaign the secret of coming back from the brink of disaster.

    When the campaign first settled on that one-word slogan, “Forward,” I laughed. It seemed not merely a sad cliché, but a flimsy one. Even less substance than “hope and change.” Who’s going to take it seriously, I wondered?

    But as the contest has unfolded, a pattern is emerging. Romney, as the challenger, naturally focuses on what the incumbent has done wrong. Indeed the challenger has come in for some major criticism from media wonks like Kurtz because he has not been able to keep the media focus, or his own focus, on one simple message: Obama is ruining the economy.

    But that still remains the best argument Romney can make. And it boils down to, “Voting for me is the only way to prevent disaster.” Romney would have us believe that Obama, the symbol of “big government” and “the crushing burden of federal debt,” is leading the invasion, destroying the tranquility that many voters imagine America enjoyed before the turmoil of “the ‘60s.” It’s not a message about making the future better but about preventing it from becoming much worse.

    So Romney is following the script that we might expect from any presidential candidate. As Maureen Dowd once wrote, “Every election has the same narrative: Can the strong father protect the house from invaders?” 

That kind of frightening narrative -- contrasting the peril we face with the safety we crave -- has indeed dominated American politics for a very long time (since the 1930s, I would argue). Through this spring and summer, the Obama campaign continued that tradition, emphasizing a negative message about protecting our national house from an invader named Mitt Romney. That message worked well enough to prevent Romney from moving into the lead. But it didn’t give Obama a lead either.

During the Democratic convention, though, we saw a shift in tone. Clinton wowed the audience and media with a speech that largely accentuated the positive. When I read Obama’s acceptance speech with my best skeptical eye, even looking between the lines for an implied narrative of protecting America from threats and dangers, I must admit I had a hard time finding it. The speech really was almost all about a vision of a better future, with the candidate, of course, presenting himself and his party as the bridge to that future.

    Now Obama may be opening up a lead by following the 1996 dictum of the “comeback kid” and giving us a kind of “hope and change” redux. If a positive focus on the future gives Obama victory and a second term, it will certainly be worth watching whether he uses that second term to try to fulfill the promise of his first campaign: to change the basic tone of American political discourse from fear to hope.

I wouldn’t bet much on it. The “protect us from invaders” theme is so fundamental to American political life that challenging it in any significant way would be a massive, and politically risky, undertaking. Presidents win political victories most commonly by presenting their policies as the only way to ward off disaster.

Clinton himself is a good example. Though he may have campaigned on building a bridge to the future, he is best remembered as president for resisting putative threats like “welfare queens,” Slobodan Milosevic, and Newt Gingrich’s “Contract with America.”

    Obama certainly followed Clinton’s dictum during his first campaign, when he promised to move us to that politics of “hope and change.” In fact, though, Obama won mainly by using the “protect us from invaders” plot line so effectively (in this case, an invasion of economic disaster), as I argued in a recent article in the journal Political Theology. I also showed that his most important speeches during his first year in office were based on that negative theme.

    I haven’t done careful research on the president's rhetoric since then, but my impression is that it shows a mix of positive messages -- “a future built to last” -- and negative messages about protecting us from dangers foreign and domestic (mostly China and the rising federal debt), both driven by political necessity.

A second-term Obama is likely to focus on getting a few more landmark pieces of legislation through a Congress dominated by Republican obstructionists, as well as ensuring another Democratic victory in 2016. And the best -- perhaps only -- way to achieve both goals is to lean heavily on the rhetoric of resisting threats to the nation, as history shows.

    Still, it’s worth noting that a more positive message may very well turn out to be the key to victory for an incumbent whose chances, not long ago, looked rather uncertain. If Obama does win, the consequences are impossible to predict with any certainty. A political narrative, like a politician, always has a chance to be the next “comeback kid.” 

The Presidential Debate: Myth versus Myth

I wrote this after the debate:

    The myth of democracy did put in a brief appearance tonight. Each candidate gave us a whole series of little logical arguments, compressed into soundbites. But the part of the myth that requires thoughtful debate, with every point subjected to sustained, careful logical exploration, was predictably missing in action.

    What we got instead, again predictably, was a fine display of the “theater state” in action. The familiar ritual did come off “without a hitch” -- so much so that many observers found it a rather dull affair.

    So who “won”?  

    Full disclosure: I am an occasional local volunteer for the Obama campaign, so my personal preference is obvious. But I agree with many of the pundits that Romney came off better than expected. He certainly showed more energy than the president, and he got the benefit of appearing to stand up to the “leader of the free world” as an equal.

    Of course no one doubts that the president can show as much energy as Romney, and probably more, if and when he wants to. But tonight he never went on the attack. Nor did he play defense. In almost every case, when his opponent hurled potentially damaging charges at him, he simply ignored them.

    Perhaps Obama was just “off his game.” But his campaign organization is a pretty shrewd calculating machine that so far has shown impressive results. So it’s worth considering the possibility that his performance was a deliberate choice.

After describing Obama’s demeanor as “grim/uninterested,” Washington Post political analyst Chris Cillizza concluded: “My guess is that Obama and his team made the calculated decision not to hit Romney” because “a) it wouldn’t look presidential” and b) the Democrats’ relentless attacks on Romney have “already penetrated deep into the political consciousness of the electorate.”

Looking presidential means always remaining centered, never losing your balance, remaining at all times the regal actor-in-chief of the “theater state,” whose equipoise does not merely symbolize but actually creates the equanimity and balance of the societal structure. Let others do the attacking and defending, raising tensions and stirring destabilizing conflict. The president must remain implacable, unmoved.

The challenger is obliged to do a certain amount of attacking and stirring conflict. Romney appears personally prone to inner stresses that he is constantly trying to repress; when he defends against others, he often appears to be fending off his own inner tension, too. At least that’s the way it looked to me, tonight as always.

    So perhaps Obama intentionally chose his placid demeanor to bring out the contrast between his own imperturbable official status and the excited agitation of the challenger. Perhaps it was a calculated strategy to give the impression that dethroning him would mean overturning the order of the “theater state” and ushering in a new era of frightening chaos.

    If most viewers get that impression, it would add one more negative mark to the long string of negatives with which the Obama campaign has tarred Mitt Romney. To achieve that goal, though, the president had to refrain from reminding viewers of all those other negatives. He and his strategists had to count on those others to be in the air, working the way television always works: subliminally.

    This may be a charitable interpretation. But this was only act one. There are many scripts that can be played out effectively by the actor-in-chief of the “theater state.” In the next two debates we may well see a rather different Barack Obama, which would tend to bear out the view that tonight’s performance was indeed a deliberate choice.

Obama’s Other Debate Failure: No Narrative

[Image: Teddy Roosevelt knew how to string a narrative together.]

    Everyone is talking about Barack Obama’s flat performance in the first debate, and with good reason. The debates are essentially television shows. Like any theatrical contest, the performer who is most entertaining and charismatic wins. The other guy loses.

    But Obama also failed in another very important way. He failed to tell a good story. He didn’t offer any persuasive narrative that would tie together all his talking points. If he had, it might have compensated for his poor performance and softened the blow he suffered that night.

The funny and sad thing is that the Obama campaign has the makings of a consistent and powerful narrative, one that contrasts sharply with that of the Republicans. The president laid it out clearly last December at Osawatomie, Kansas: “We’re greater together than we are on our own. ... In the long run, we shall go up or down together.”

That’s a time-honored story in American political life, though it hasn’t been heard as the main theme of a presidential campaign in decades. Obama went to Osawatomie to take it off the shelf because that’s where Teddy Roosevelt spoke the same words over a century ago.

    When TR used that narrative to run for the presidency in 1912 as a Progressive, the Democratic and Socialist candidates, Woodrow Wilson and Eugene Debs, were telling variations on essentially the same story. Among them they got fully three-quarters of the votes. 

    The last major party candidate to run on that narrative, Franklin D. Roosevelt in 1936, got re-elected with 60 percent of the votes -- still a landslide in American political terms. The prospect of another Democratic president basing his re-election campaign on that progressive story, giving it new life in a new century, was exciting.

At Osawatomie Obama enlarged the story when he praised “the promise that's at the very heart of America. ... Even if you're born with nothing, work hard and you can get into the middle class.” The idea that everyone who works hard earns the right to a middle-class life is something new in American history. A campaign centered on that narrative would have been a landmark.

    Obama used the same story for months. But then it got lost in the midst of a tangle of stories. The president began to show his central narrative the way he shows his smile -- in fleeting, and presumably carefully calculated, flashes. By the time he got to his acceptance speech in Charlotte, he was still saying, “Our destinies are bound together. … We travel together. We leave no one behind. We pull each other up.” But that message was no longer central.

    The words “middle class” showed up only twice in the acceptance speech. Obama mentioned, almost in passing, that he was fighting to restore the values that built the world’s largest middle class. But the “make-or-break moment” and the promise that everyone could make it into the middle class were gone. 

    Then, in the fiasco of the first debate, the progressive narrative pretty much evaporated. Obama hinted at it in his opening statement when he offered “a new economic patriotism that says America does best when the middle class does best.” But then it went MIA. There were merely a few vague references to helping the middle class and one weak claim that, though free enterprise is “the genius of America, ... there are also some things we do better together.”

    You could find the whole progressive story between the lines of Obama’s rambling words, but only if you tried really hard. The whole point of good storytelling is that the audience does not have to try hard. The main lines of the plot are too obvious to miss, because the storyteller puts them front and center and repeats them over and over again. 

Bill Clinton proved that in his speech in Charlotte. He reminded us that not long ago, for eight years, we enjoyed a masterful Democratic storyteller in the White House. Before Clinton, the most popular presidents of both parties -- Reagan, Kennedy, the Roosevelts -- were all equally skilled storytellers, especially on the campaign trail. As the prominent Democratic pollster Stanley Greenberg once wrote, in a presidential election “a narrative is the key to everything.”

No doubt Obama’s strategists know all this. And no one doubts Obama’s ability to put across an appealing story when he wants to. He proved it four years ago with his narrative of “Hope and Change.”

    So what were those strategists thinking as they prepared their man for the first debate? Perhaps they were out to prove that he is a master of detail with a head full of numbers. But we already knew that.

    More likely, they were obsessed with the messages they got from their polls and focus groups. So they had their man tell a new narrative: The Democrats have a plan to reduce federal deficits and the debt while still offering specific benefits to specific groups of people. Obama spent most of his time ticking off those benefits, in traditional Democratic laundry-list style, while insisting that Romney was the one who would increase the debt.

    Maybe that’s the message the focus groups wanted to hear. Maybe it could be a winning narrative. If so, Obama should have stated it up front and then repeated it constantly. Implication and indirection don’t win elections. A clear narrative, told in simple language over and over, is what wins.

    But there’s an obvious danger in letting focus groups determine the story: Next week a different focus group will want to hear something else, so the story will keep on changing.

    Fortunately for the Democrats, Romney isn’t any more consistent as a storyteller than Obama. His performance in the first debate confirmed him as the etch-a-sketch candidate he’s been all along.

    With neither candidate offering one clear-cut narrative, there’s nothing to interfere with the main goal of the debates: to show which candidate is a better performer on stage. That’s one more nail in the coffin of democracy. 

    The New “New Normal”: Saving Ourselves From the Cliff

    Are you worried about the looming “fiscal cliff”? Well if it’s your only worry about the American economy, you’re not worried nearly enough. There are plenty of other economic cliffs out there, just waiting for you.

    That’s the lead story on the front page of this past Sunday’s Washington Post. “Even if Washington somehow finds a way to avoid the fiscal cliff -- the automatic tax hikes and federal spending cuts that threaten to plunge the nation back into a recession --” Zachary A. Goldfarb warns us, “the economy could suffer a stiff blow next year.”

    Tax hikes and spending cuts could take billions of dollars out of the economy. But if we extend tax cuts and cancel spending cuts, we’ll increase the federal debt, bringing new and unpredictable economic suffering. So we’re trapped.

    There’s no glimmer of good news to counter the gloom and doom brought to you by the WaPo. There’s only an overwhelmed Congress and administration, forced to grapple with an impending economic apocalypse. We’re likely to go over one cliff or another, it seems -- no matter who wins the election.

    Indeed, from this article you wouldn’t even know that there is an election coming up. The apocalyptic threat is treated as a fact of life that transcends politics.

    We’ve seen endless news reports and opinion pieces for a long time now telling us that this is “the new normal.” It doesn’t always mean that we’re doomed to go totally over the cliff. But it always means something pretty disastrous compared to the promise of endlessly growing prosperity, which was, until recently, taken for granted in our shared national story. 

    A permanent possibility of disaster is nothing new in the American story, though. What’s new is to find it in the domestic, economic arena. When it comes to foreign affairs, Americans are accustomed to living with apocalyptic danger as the norm, expecting their government to manage the threat at best, but never to extinguish it.

    We first learned this fear-ridden way of life back in the 1950s. Of course then the threat was “the reds.” The Eisenhower administration created the foreign policy that Ike’s successors followed throughout the cold war, the policy I call “apocalypse management.” 

    Eisenhower warned publicly that we were “not in a moment of peril, but an age of peril.” In an internal White House memo, a staffer described it as “the new normal.”  After the 9/11 attack, Dick Cheney used the same phrase to describe the supposedly endless “war on terror.”

    Now, the WaPo suggests, permanent fear is still the new normal. The only difference is that the peril comes from the economy within.

Over at the Sunday front page of the nation’s other most influential newspaper, the New York Times, the horizon is just a tad brighter. There’s a big color photo of Donna’s Diner in Elyria, Ohio, with the dawn’s early light barely relieving the gloom of night. The headline reads: “At the Corner of Hope and Worry: A Small Café, and a Small City, are Put to the Test by a Tough Economy.”

    Below is a photo of Donna, the proprietor, holding her hands in an obviously prayerful gesture, with anxiety etched on her face. It looks like there’s still a chance that Donna, an iconic ordinary American, will somehow avoid the cliff, pass the test, and make it through these tough times -- if she has enough hope and faith, the photo suggests. But no one can say for sure.

    Put these lead stories from the nation’s two most prominent newspapers together and you get a complicated narrative: As we head toward a domestic apocalypse, there’s not much the government can do about it. The politicians will try their best to manage this “new normal.” But they are so hopelessly tangled in their internal contradictions, we can’t count on them for anything. We would do better to put our hope in the faith and resilience of ordinary Americans, people just like you and me. That sounds like a very Republican message.

    When it comes to foreign policy, presidents of both parties have offered much more than that when they pledged to protect the American people from “the red menace” and "the terrorists." They never said it was up to the people themselves to keep the nation safe. They promised that the government would do the job.

Democrats traditionally made the same promise when economic apocalypse loomed. William Jennings Bryan famously preached that the Democrats would save the people from being crucified on a cross of gold. Franklin D. Roosevelt asserted that, if Congress failed to halt the Great Depression, he would ask for “broad Executive power to wage a war against the emergency, as great as the power that would be given to me if we were in fact invaded by a foreign foe.” That was the biggest applause line of his first inaugural address.

    In the same address Roosevelt summed up the traditional Democratic view of how ordinary Americans respond to crisis. He insisted “as a first consideration, upon the interdependence of the various elements in all parts of the United States -- a recognition of the old and permanently important manifestation of the American spirit of the pioneer.” FDR knew that the pioneers were no rugged individualists. They built their communities by working together, using government as their agent.  

    Barack Obama seemed to be building his campaign on the same kind of message, until he lost his narrative way. He still offers more from government than Romney, to be sure. But now, as the race tightens and he knows he’ll have to work with another Republican House, he seems to focus more on the power of “ordinary people.” In his closing remarks in the first debate, all he could offer from government was to “channel” the “genius, grit, and determination” of the American people.

    If that is the political narrative of the future -- if the alternative that Democrats once offered is so muted in our national conversation -- then today’s “new normal” is something new indeed. And the real looming tragedy is the way it diminishes the possibilities for a better future that we all could enjoy if, like true pioneers, we expected -- and elected -- government leaders to serve our common interests.     

Fact-checking the Candidates: A Sacred Ritual

[Image: Romney as Pagliacci -- acting out the theater state. Credit: Flickr/HNN staff.]

There’s an old theory that people perform religious rituals as a way of acting out their sacred myths. Scholars of religion don’t take this old theory very seriously any more. It’s far too simplistic and misses too many aspects of the meaning and function of ritual. Sometimes, though, this theory still sheds interesting light on rituals. It’s especially useful when a ritual does pretty obviously act out a myth and the people performing the ritual tell you that they are reenacting one of their myths.

    A fine example is the Christian ritual of Eucharist: eating the body and drinking the blood of Christ. In the Gospel story of the Last Supper, Jesus explicitly tells his disciples to keep on eating bread and drinking wine after he is gone, because those consumables are his body and blood. When you ask Christians who believe that the consumables literally become the body and blood why they are doing the ritual, they’ll tell you that they are obeying Jesus’s command and doing exactly what the disciples did. They are acting out their sacred myth.

    Christians, when I call the Gospel story a myth, please don’t be offended. I don’t mean it’s a lie. A myth is a narrative that people tell to express their most basic views about what the world is like and how they should live in it. The myth serves that purpose whether it’s totally false, totally true, or (as is usually the case) some mixture of the two. So it’s perfectly possible that every word in the Gospels tells us what actually, literally happened in the life of Jesus of Nazareth. The Gospels would still be Christians’ mythology.  

    Fact-checking the myth is irrelevant to its role in the lives of the people who tell it.  They do not judge it by whether it can be proven factually true. Rather, it shapes their view of truth; it tells them what they can accept as factually true and what they must consider false. So they act out their myth in a ritual to reinforce their commitment to truth as the myth teaches them to see it -- or so the old theory goes.

    It’s worthwhile dusting off that old theory in this election season, which presents us with an interesting twist: What happens when fact-checking itself becomes a ritual? I don’t have quantitative data, but it seems to me that we have much more fact-checking in this presidential election than in any election before. Fact-checkers seem to be all over the place.

    And the mass news media promote their fact-checking as a major part of their campaign coverage. They treat it as something their audience really wants. Since they are in business to make money, presumably they do have quantitative data; presumably they’ve done market research that shows they can increase their ratings or readership with all that fact-checking.

    Why is fact-checking so popular? The traditional American view of democracy has a ready answer: The people know that, to be responsible voters, they must know the facts. How else can they judge which party’s policies are best for the nation? And they must know whether the candidates are leveling with them. We want a president who is a straight-shooter, not one who will deceive us for his or her own political gain.

    There’s a complex myth of democracy packed into that little story. There’s a basic premise: Democracy can work because we humans are rational animals. We are built to be fact-checkers; we all have the capacity to separate true facts from lies. And once we have true facts, we know how to analyze them logically to come to reasonable conclusions. If that weren’t true, democracy would be a foolish experiment, indeed.

    But, the myth goes on to say, a capacity is useless unless it is developed through training. That’s why democracy demands universal access to education. How much education is a matter of debate; other democracies tend to set the bar higher than we Americans. The basic concept is the same in all democracies, though: Only educated people can be responsible citizens because only the educated have actualized their potential for fact-checking and rational thinking.

    Many of the reformers who promoted universal public education in the nineteenth century (for boys at least; some weren’t sure about girls’ capacity for reasoning) were motivated by that myth. Of course capitalism also drove education reform; the industrial revolution created a demand for more educated workers, just as the high-tech revolution has in our own time. But a genuine commitment to the mythic vision of democracy played a significant role back then. (We’re probably too close to evaluate how much of a role it plays in moves toward expanding educational opportunity today.) 

The myth of democracy says that citizens must be educated enough to know which policies are best for their community. But good citizens must also bring their rationality into the polling booth. They must know which candidates promote and implement the right policies. They must know whether incumbents have done so, and whether challengers might do better. That means they must have honesty from their leaders and transparency from their government.

    Hence, the need for fact-checkers at every step on the campaign trail. It’s only logical.

    Except that there’s no evidence all the fact-checking has any measurable impact on the voters’ choices.

    As soon as the first presidential debate ended, many Obama supporters were quite gleeful. Mitt Romney had made so many demonstrably false statements, and denied his own positions so often, that it seemed like a bonanza for the Democrats. They duly set about broadcasting that bonanza, falsehood and deception by falsehood and deception.

    And look what they got for their efforts.

    Even the prominent pro-Obama intellectual Robert Reich, a master of progressive ideas, opens and closes his “Memo to the President” for the next debate with advice about performance style. Though Reich offers plenty of ideas too, he knows that ideas hardly mattered any more than facts in the outcome of the first debate. Romney won on style points alone.

    The “theater state” is a performance art. Every candidate is judged, above all, on their performance. Good theatrical performers know how to create satisfying illusory images of truth. It’s one of their highest skills. Mitt Romney proved that in the first debate. The big question, all the mass media reports tell us, is whether Barack Obama can prove equal to the task in the second debate.

    Michael Scherer’s conclusion to his perceptive Time cover story on fact-checking is quite on the money:

    When the final book is written on this campaign, one-sided deception will still have played a central role. As it stands, the very notions of fact and truth are employed in American politics as much to distort as to reveal. And until the voting public demands something else, not just from the politicians they oppose but also from the ones they support, there is little reason to suspect that will change.

    But why should the voting public demand something else? They’ve already got this enormous stage in the political theater packed to the rafters with fact-checkers. The fact-checkers are performing their duly appointed role in the drama, just as the candidates are. The fact-checkers, too, are seasoned performers skilled in the art of creating satisfying illusory images of truth.

    Above all, they create the illusion that American democracy is alive and well because the public is apparently being informed of the facts and the veracity of each candidate is apparently being carefully evaluated and widely reported. Fact-checking, then, is the ritual enactment of our myth of democracy. As long as the myth keeps getting acted out, we can trust that it is alive and well.

There has been growing suspicion over the years about whether democracy really is alive and well in this postmodern world, where signs are increasingly detached from the reality they claim to signify. The ritual of fact-checking eases the anxiety about the state of our democracy in this “theater state.” That, I submit, is why fact-checking is so popular.

The Second Debate: “They Were So Good Being at Each Other’s Face”

[Image credit: Flickr/Obama for America.]

    Did you think the second presidential debate was too nasty, that it was sad to see the two lead actors portray such a polarized image of American politics? The third performer up on the stage, moderator Candy Crowley, didn’t think so.

    “They were talking to their bases who want to see them stand up to each other,” Crowley said on CNN after the debate. “They were so good being at each other’s face, and I thought this was a debate, so I let it go. … It was so good.”

    The woman with the only front row seat didn’t seem to be interested in the content of the candidates’ arguments, much less their logical coherence. She cared about the show. And as long as they were at each other’s face, “it was so good.”

    A long-time TV professional, who has made television her life, naturally judges the debate by the same criteria she would judge any television show. And appropriately so, since the debate is above all television entertainment.

That’s why when debate season rolls around I always turn to TV critics, like the New York Times’ David Carr. What struck Carr most about the first debate was not anything about the content. It was the extraordinary size of the audience -- over 70 million -- “breaking a 32-year-old record in viewership.” (And there was every likelihood that the second debate would score even higher.) Only the Super Bowl drew more viewers -- a TV show where we don’t merely hope, but know with certainty, that the performers will be at each other’s face.

    “Credit live event television,” Carr wrote, “the last remaining civic common in an atomized world. While ratings for almost everything on television have sunk, big spectacles that hold some promise of spontaneity -- N.F.L. games, the Olympics and various singing competitions -- continue to thrive.” And, of course, so do the presidential debates, as long as the race is close enough that the big prize is at stake.

    Carr quotes Jeff Zucker, former chief executive of NBC Universal: “Television is about drama, whether it is the Olympics, the Super Bowl, or ‘Homeland,’ and these debates have provided incredibly great drama. It just proves the adage that if you put on a good show, and both of these debates have been very good television, the audiences are going to be there.”

     Carr and Zucker didn’t say it, but they know as well as Candy Crowley what makes great drama that draws big audiences: conflict, characters standing up to each other and being at each other’s faces.

    Crowley and Carr were merely two of the thousands of journalists and commentators, not only on TV but in every news medium, who all read from the same prescribed text: It’s fundamentally about performance. Obama lost the first debate because of his poor performance. In fact he lost most because of his performance when he wasn’t speaking. So the content of his words could not have played much role at all in his loss.

That’s why everyone was focused on Obama’s performance in the second debate. And he played it pitch-perfect. When Romney spoke, Obama showed no scorn, indifference, or boredom. He was all ears, apparently paying attention with the appropriately neutral face. But when it was his turn to speak, he was at Romney’s face -- certainly not all the time, but enough to make it the biggest news event of the night.

    Romney gave as good as he got, though -- letting the New York Times website headline (happily, I trust), “Rivals Bring Bare Fists to Rematch.”

    Media professionals don’t really care who won, as long as they get a good conflict-packed show. Having one candidate declared the surprise, clear-cut winner, as in the first debate, is a bonus; it makes the show even better.    

    Most voters will agree it was a good debate. It offered enough conflict to create a good drama, which is always entertaining.

    But the voters care about more than just production values and being entertained. They have a much more urgent question than “Was it a good show?” As Maureen Dowd put it, “Every election has the same narrative: Can the strong father protect the house from invaders?” That’s the question the voters ask about each candidate -- consciously or unconsciously -- as they watch the two perform.

    That’s bound to be the crucial question in a nation whose political life is shaped so much by the myth of homeland insecurity -- a myth that says invaders are always outside, threatening to burst through the door and destroy us if our leaders don’t have fists strong enough to keep them out.

    There’s no common agreement about who the invaders are. Indeed, one way to understand American political discourse is to see it as a debate about the name of the truly threatening invader. Is it the rich who thrive in an unregulated, runaway, overly free market? Or is it the government, imposing too much taxation and too much regulation? Or perhaps the terrorists? Or maybe it doesn’t matter so much who, exactly, the invaders are.

    The crucial question is which candidate is strong enough to keep out the invaders, whoever they may be.

    Oh, perhaps you thought the crucial question had something to do with the economy, since you’ve been told that about a zillion times. Consider this:

    In CNN’s instant (but “scientific”) poll of second debate watchers, well over 55% said Romney would be the better president when it comes to boosting the economy and lowering the deficit. But the same group awarded Obama a victory in the debate by the sizeable margin of 46% to 39%.

    Obama lost the first debate, the media consensus agrees, because he simply did not look strong enough to protect the house. In the second debate, he was warned, he had either to look strong enough or to expect defeat on Election Day. He certainly got the message, proved himself up to the task, and took home the blue ribbon.

    But Romney did a creditable job of performing the role of strong father, too. So he’s not out of the race by any means. It will continue to be close unless one or the other candidate shows a moment of major weakness.

    Whoever wins, though, this debate will stand as evidence that many voters are looking for both good entertainment and that strong father to soothe their insecurities.  Perhaps they are looking for good entertainment mostly because it, too, soothes their insecurities. 

In Memoriam: George McGovern and Liberal Politics

[Image: McGovern vs. Nixon campaign pamphlet, 1972. Credit: Pennsylvania AFL-CIO.]

George McGovern was the first presidential candidate I actively campaigned for. Like many baby boomers, I stood on the street corner handing out “Vote for McGovern” handbills. The fifty-year-old Democrat was such a rarity among politicians that we gave him a special exception to our first commandment: Never trust anyone over thirty.

    Under thirty? Sure. We knew we could trust each other. Or so we thought.

But the day before George McGovern died, I stumbled across a little-known fact that took me back those forty years and made me wonder whether my trust was misplaced.

Assuming that we can trust the data compiled by American National Election Studies, it seems that on Election Day 1972, of my fellow under-thirty, baby-boomer voters, only 47 percent marked their ballots for McGovern; 53 percent voted for Richard Nixon.

    There are at least two good lessons here:  First, our knowledge of the body politic depends largely on who we hang out with. We tend to assume too easily that the people we know in our own demographic groups (age, gender, race, whatever) represent the entirety of those demographics. I suppose we ought to get around more, talk to more people who are like us demographically but not politically.

    The other lesson is that the common wisdom handed down as history is often not borne out by the facts. I suppose we ought to do more empirical research and less parroting of the common wisdom (of which, in this case, I was guilty all these years).

    By coincidence, on the day George McGovern died I learned another fact about that 1972 election: women voted overwhelmingly for Nixon, in virtually the same numbers as men. And there was no gender gap at all in 1976. Since 1980, though, Republican presidential candidates have done far better among men and Democrats far better among women. It looks like the same pattern will repeat again this Election Day.

    When I mentioned this to my wife, she asked an obvious question that I’ve rarely if ever seen discussed in all the fevered analysis of the polls: In presidential elections, do more women vote, or more men, or is it roughly equal?

    Since I had the American National Election Studies website up on my computer, it was easy to get an answer: In ’72 and ’76, women voters far outnumbered men. It didn’t matter much then, since there was virtually no gender gap.

    But since 1980, women have continued to outnumber men by nearly as much. On average, roughly 54 percent of voters have been women. To repeat: In all those elections, women have voted Democratic in significantly higher numbers than men. So if the vote had been evenly split between the two genders, the Republicans would have done significantly better.

In 2000, for example, Al Gore won the women’s vote by 11 points. George W. Bush won the men’s vote by 9 points. But 56 percent of the voters were women. Had it been 50-50, Bush would have won easily and the Supreme Court would have been spared its worst embarrassment in living memory.
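A rough back-of-the-envelope illustration of that arithmetic, using hypothetical round numbers rather than the actual 2000 exit-poll figures: suppose Gore had taken 54 percent of the two-party vote among women and 45 percent among men. With women at 56 percent of the electorate, his overall share would be 0.56 × 54 + 0.44 × 45 ≈ 50.0 percent; with a 50-50 electorate it would drop to 0.50 × 54 + 0.50 × 45 = 49.5 percent. In general, the bonus women contribute equals their share of the electorate above 50 percent times the gap between the women’s and men’s vote -- here, 6 percent of a 9-point gap, or about half a point. In a race as close as 2000, half a point decides the popular vote.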

    As far as I can tell, since 1972 (when the stats I have on the gender gap begin), there’s no case where the preponderance of women was the decisive factor; i.e., where a 50-50 gender turnout would have swung the election to the other candidate.

    But 2012 could be a first. As close as this election is, and with the gender gap as large as ever, if the pattern of women outnumbering men by about 8 points continues, Barack Obama might well gain re-election solely due to the women’s vote.

    The larger point here is that, since 1980, the presidential vote has not accurately reflected the political views of the population at large (assuming that the gender split in the overall population is roughly 50-50, which is roughly the case in the U.S.). With so many more women voting, the electorate has trended a bit more Democratic than the whole body politic. In other words, the presidential election results have led us to think that the American people were a bit more liberal than they really were.

    In the same way, the mythic tale of George McGovern and the “youth vote” led us to think that the baby-boomers of “the ‘60s” were a bit more liberal than they really were.

Considering what hard times liberals have often endured since 1972, it’s a bitter pill for those of us on the left to learn that the reality has been even worse than we thought.

    It’s sad that we no longer have George McGovern with us. He was such a fine model of the committed liberal who keeps on speaking up for what he (or, more likely, she) believes in, regardless of how chilly the political climate may be.

Political Symbolism Is Political Reality: The Case of Wisconsin

[Image: Tammy Baldwin in 2010. Credit: Flickr/Center for American Progress.]

    In case anyone doubts the power of myth and symbol in American politics:  In the dead-heat race for the Senate in Wisconsin, one issue now towers over all others, the Washington Post reports. It’s not health care or education or energy or immigration. No, it’s Democratic Congresswoman Tammy Baldwin’s 2006 vote against a purely symbolic bill to continue recognizing September 11 as a national day of remembrance and mourning.  

    Baldwin voted against the bill because it included a clause endorsing the Patriot Act and a host of other post-9/11 legislation, which few people had read completely and even fewer understood thoroughly.  

    But an ad by Baldwin’s opponent, former Wisconsin governor and secretary of health and human services Tommy Thompson, conveniently omits that explanation and all the symbolic recognitions of 9/11 that Baldwin did vote for. Instead, the ad features military personnel and veterans charging that Baldwin dishonors the victims of 9/11, disgraces the flag, slaps every one of America’s military personnel in the face, puts the nation’s security “in jeopardy,” leads us down “a very dangerous path,” and doesn’t care about America’s children. All this from one symbolic vote -- and in 30 seconds.

    As a piece of political advertising, it has impressive production values and certainly tugs at plenty of voters’ heartstrings. But only one thing sets it apart from many other such slick ads: It has now made Baldwin’s no vote six years ago the pivotal issue in the far-too-close-to-call contest, according to WaPo reporter Aaron Blake.

    It would take an entire book to unpack all of the symbolic and mythic narratives crammed into those thirty seconds. I won’t even try to outline the table of contents of that book here. I simply want to note what a huge role pure symbolism can play in what we think of as the very real world of power politics, as if “symbolic” or “mythic” and “real” were somehow opposites.

    But if we define “real” as whatever makes a difference in the world, then in Wisconsin in 2012, at least, the mythic symbolism is the dominant reality. If a politician as liberal as the fifty-year-old Baldwin enters the Senate, she might well be there for three decades or more, moving up to committee chairs, wielding significant influence, and thereby nudging the Senate at least a bit further to the left. If she’s kept out by the emotional impact of this ad and this issue, the future of the Senate will be at least a little different for decades to come.

    Moreover, Wisconsin is still very much a toss-up in the presidential race. Voters’ feelings for or against Baldwin are sure to influence the fate of Wisconsin’s ten critical electoral votes.

In this context it’s also worth recalling a former senator from Wisconsin named Joseph McCarthy. Talk about myth and symbolism becoming political reality!

    All this is a useful reminder that myths and symbols are political realities, deserving the same careful attention we give to any other political reality. 

Red vs. Blue: Causes Elude Us, but Effects Are Clear

[Image credit: Wikimedia Commons.]

    The prominent psychologist Steven Pinker has a long piece on the New York Times website, trying to explain why Republicans do so well in the South and the West but not in the rest of the country. It seems that it all comes down to how different regions have, historically, dealt with the eternal threat of societal anarchy. Harvard media stars rush in where careful historians usually fear to tread, or at best tread very lightly.

There are plenty of holes in Pinker’s speculative framework big enough to drive most any vehicle you can think of through. For starters, if the North is indeed historically accustomed to counting on government to tame anarchy, as he argues, how to explain the Republican strength in New Hampshire, or in the non-urbanized areas of northern Ohio, Indiana, and Illinois? And if the West (which one assumes includes the “red” Great Plains states) is so accustomed to rejecting government as the tamer of anarchy, how to explain the great political success of Progressivism and farmer-labor coalitions in those states in the days of William Jennings Bryan?

    If Pinker’s whole edifice is taken seriously, it quickly dies the death of a thousand qualifications.

    But rather than subject it to such a slow, painful death by analyzing it in detail, I’d rather look at the part of the article that has some persuasive power. That means setting aside all the speculation about the history of geographical regions and looking at politics in terms of personal decisions. What makes some people choose a candidate who sees a prominent role for government in society, while others choose a candidate who wants to limit and weaken government's role?

    Pinker is an expert on the history of the long-term decline in human violence. So his focus, naturally, is on how people deal with violence and the prospect of it being inflicted upon them.  

    He links the small-government view to the culture of honor, where individuals -- mostly men -- decide for themselves when they have been offended and how to punish the offenders. They keep “the safeguarding of their personal safety” as their own private prerogative. At best, they cede that power to “their own civilizing forces of churches, families and temperance,” created largely by women.

    Those who would allow government a much larger role “are extensions of Europe and continued the government-driven civilizing process that had been gathering momentum since the Middle Ages.” They are especially extensions of “the Age of Reason and the Enlightenment, [when] governments were forced to implement democratic procedures, humanitarian reforms and the protection of human rights.”

    If there’s any truth in this speculation, it suggests that less-government advocates live in a social world where collective institutions for curbing violence appear to be relatively weaker and less dependable, compared with the social world of more-government advocates. (Note that I say social, not geographical, world. Two next-door neighbors can -- and from the campaign yard signs I see in my town, often do -- live in totally different social worlds.)

Explaining how and why those different social worlds arose is like explaining the weather: The causal factors are way too complicated, with far too many variables, to be modeled completely on even the most sophisticated computers. The best we can hope for are partial explanations, depending on what particular questions are asked. Historians and social scientists should certainly keep on vigorously pursuing those questions. But they should not hope for the kind of simple, all-encompassing explanation that Pinker offers here.

    However, like the weather, the effects of different social worlds can be understood with a lot more certainty than the causes. People who feel relatively less protected from offense and violence, for whatever reasons, are more likely to feel more vulnerable, to see the world as a more threatening place and other people as sources of threat. So they are more likely to draw upon the mythology of homeland insecurity to make sense out of their experience -- a mythology based on the premise that we Americans will always face some serious threat to our very existence.

    People who feel relatively safer from offense and violence are more likely to feel more protected, to see the world as a place where people can cooperate because others are not such sources of threat. So they are more likely to draw upon the mythology of hope and change to make sense out of their experience -- a mythology that says people can work together to make a better community for all, using government as their collective agent. 

    In the current presidential election we might seem to have a direct head-to-head competition between the two social worlds, with the two locked in a virtual tie. But things are more complicated. The number one apostle of the mythology of hope and change, Barack Obama, states bluntly that “the first role of the federal government is to keep the American people safe.” That’s “homeland insecurity” at its best.

    Perhaps he is simply trying to appeal to the less-government advocates, so he can peel off enough of their votes to eke out a victory. If so, it’s good evidence of how strong the mythology of homeland insecurity is.

    But I think this is better evidence of how closely the two great mythologies are intertwined. Pinker’s “red state vs. blue state” kind of analysis is popular for the same reason athletic contests of all kinds are so popular. We want to see two clearly defined sides fight it out and, in the end, have a clear-cut winner and loser.

    But it doesn’t match the reality of American life. No one feels absolutely threatened or absolutely secure. Like Pinker’s “red state” and “blue state” personalities, these absolutes are ideal types, useful only for theoretical purposes.

    In fact, all of us live somewhere on a spectrum between those two theoretical constructs. All of us feel some degree of threat and vulnerability, and some degree of safety and protection. So all of us are drawn to both of the great mythologies. How we vote will depend largely on the particular mix of the two within our minds and our autonomic nervous systems.  

    It’s not surprising, then, to see both of the major party presidential candidates drawing on both of the mythologies and blending them together. Each does it in his own way, gesturing somewhat more toward one end or the other of the spectrum. But both recognize that the crucial swing voters are in the middle of the spectrum, with their sense of vulnerability and their sense of protection balanced in roughly equal measure.

    That seems to sum up the state of the union in the autumn of 2012. No one can yet predict which way the balance will tip by Election Day.

    For the as-yet-undecided, it’s worth remembering that even the smallest gesture toward one end or the other of the spectrum is a self-fulfilling prophecy. Those who act as if the institutions that protect us are relatively weak end up weakening the institutions that protect us, so that ultimately we are all in fact more vulnerable. And that’s true no matter where we live. 

What's Still the Matter With Kansas?

[Image credit: Wikimedia Commons/HNN staff.]

Today I posted a long article on Truthout.org titled "What's Still the Matter With Kansas -- and With the Democrats?" The title refers to a popular 2004 book by Thomas Frank, exploring the puzzle of why so many people of middling economic means vote for Republicans whose policies so clearly favor the rich and do little to help those same voters. Frank chose Kansas as the place to study a large number of voters who vote against their economic self-interest because he came from Kansas.

    In my article I use "Kansas" as a symbol for all those voters.  I argue that Democrats are losing this key demographic group, and maybe this election, because they're unwilling to support values issues dear to the heart of “Kansans” that they could very plausibly endorse.

One little piece of that article may be of special interest to historians. I note another book on the same topic, Red State Religion, by another native Kansan, the eminent sociologist of American religion Robert Wuthnow. He stresses the powerful spirit of community you will find among these Republican voters of Kansas. He also traces the history of that spirit being expressed both in religious communities and in electoral politics. There’s a rich tradition of many “Kansans” voting Democratic, in the nineteenth and early twentieth centuries, when populism and progressivism overlapped in so many ways. Back then, lots of “Kansans” understood the invaluable role of government.

    Now, their descendants will still often bend over backwards to help you out when you need it -- as long as they judge you deserving. But, crucially, they insist on reserving that right to judge for themselves. They won’t let any government bureaucrat do it.

    Why not? Wuthnow traces the distrust of the federal government back to 1938, when Franklin D. Roosevelt failed to follow through on the promises he’d made in the 1936 campaign. As my Truthout article shows, that’s far too simplistic an explanation. It’s only one factor, and probably not a major factor, in understanding the “Kansas” of today.

    Still, it’s an interesting point. Wuthnow does make a strong case for the late ‘30s as the crucial point at which “Kansas” began to support the conservative drive to shrink government.

But he neglects to explore the complexities of that turning point. FDR did not intentionally forsake “Kansas.” He made a couple of bad strategic blunders: insisting on his court-packing plan after it was obviously bound to fail, and campaigning in the 1938 primaries against some stalwart conservative Democrats running for re-election to Congress, who won re-nomination and re-election anyway.

    As a result, FDR lost a lot of political capital in Congress. For that and lots of other reasons, Congress became more conservative and blocked progressive measures that FDR probably would have been happy to sign into law. 

    So FDR was blamed for failures that were mostly caused by an obstructive Congress. Of course back in those days a president was allowed to blame Congress, loud and clear, for obstructing progressive measures that he would have approved.

    Today that seems to be pretty much taboo. Barack Obama, who suffered much the same fate as the second-term Franklin Roosevelt, has put very little effort into pinning the blame on the Republicans in Congress. If he tried to make that a major issue, he would be pilloried by the press as a whiner and a weasel, trying to avoid taking responsibility. 

    I’m not sure why that change in media perspective has happened. But it’s certainly worth noticing.  

From “Who Lost China?” to “Who Lost Libya?”

[Image credit: HNN staff.]

    “Who lost Libya?” Mitt Romney has not asked the question exactly that way. Neither has Paul Ryan, nor any prominent Republican politician or commentator, as far as I know. But anyone familiar with the history of U.S. foreign policy since the 1940s can hardly avoid hearing that question, between the lines, in the GOP assault on the Obama administration’s handling of the September 11 killings in Benghazi.

    The “Who lost … ?” pattern first emerged after the communist revolution transformed mainland China in 1949. Republicans angrily demanded, “Who lost China?” The taste of omnipotence coming out of World War II was still fresh in Americans’ mouths. It seemed like the U.S. had such immense power, we could control just about everything that happened everywhere outside the Soviet Union and its eastern European bloc.

    The Democrats boasted about that apparent omnipotence. Secretary of State Dean Acheson crowed that the U.S. was “the locomotive at the head of mankind ... the rest of the world is the caboose.” The Democrats assumed that claiming credit for achieving such power could only redound to their political advantage.

    Then suddenly the Chinese revolution made it seem like a big “red” chunk of the caboose had come loose and was careening out of control. Given the widespread premise that the U.S. controlled the entire “free world,” it was impossible for many Americans to believe that the Chinese had the power, on their own, to release themselves from America’s grasp.

    The only logical way to explain it was to assume that someone within the U.S. government had consciously let China go. Someone had committed treachery. It must have been an inside job.

    The Republicans saw this explanation as a great chance to neutralize the points the Democrats had scored on foreign policy throughout the 1940s. They insisted that the traitorous villains had to be inside Acheson’s State Department.

    The political dynamite was defused in June 1950, when Truman sent several hundred thousand U.S. troops to fight the communists in Korea. That was hardly his main motive, but it was a welcome political side effect.

However, the “Who lost China?” debate had long-lasting effects. Apart from the ensuing purge of the best Asia experts from the State Department (which paved the way for the disastrous U.S. involvement in Vietnam), the debate had a major impact on the narrative of U.S. foreign policy for years to come.

    It reinforced the assumption of American omnipotence. To argue seriously about “Who lost China?” implied that we once “had” China, as a sort of possession, and had let it slip from our grasp.

    To chalk it up to internal treachery was not merely consistent with the image of U.S. omnipotence; it actually reinforced the image. Now, the story went, just as the U.S. government could hold on to nations at its will, so it could let them go, even though that would always be a mistake.

    And the Democrats’ response to the charges -- ramping up the Cold War in Korea and elsewhere -- further reinforced the idea that the U.S. ought to aim, at least, at total control of the “free world.” The Democrats had to say that to reassure a nervous public. The obvious fact that other nations act independently could hardly get a fair hearing.

Nevertheless, the reassuring implications of the debate were offset by a more frightening one. Though we were still holding on to the rest of the “free world,” the “loss” of China showed how fragile our hold was. If we weren’t hyper-vigilant, who knew what country we might lose next. At any time the “dike” might burst (as Dwight Eisenhower warned his National Security Council as they discussed Vietnam in 1954) and the “red tide” would flood our own homeland.

    The reassurance and the fear actually reinforced each other. The more Americans worried about “losing” some other nation, the more they reinforced the premise that the “free world” was indeed a possession under our control. And the more we “had,” the more we had to “lose.” So our global control would always be threatened, it seemed. But the bipartisan narrative agreed that strong, wise, patriotic leaders should be able to keep the “dike” firm and hold on to the “free world” forever.

    This myth of homeland insecurity became the fundamental myth of American foreign affairs. And Democrats were haunted by the shadow of the “Who lost China?” question. They were constantly on the defensive, vulnerable to GOP charges of being weak on security. Only in the late 1950s and early 1960s did they successfully fend off those charges.

    Although the Cold War ended, the myth and its specter of permanent peril endured. Once the “Iron Curtain” fell, the whole world came to look like a possession that we were supposed to control. Every nation was ours to lose. As Colin Powell, chair of the Joint Chiefs of Staff, put it in the early ‘90s, “the real threat is the unknown, the uncertain.” The U.S. needed “the ability to respond to the crisis nobody expected, nobody told us about, the contingency that suddenly pops up at 2:00 in the morning.”

During the Democratic primary contest of 2008, some copywriter for the Hillary Clinton campaign advanced the danger hour to 3:00 am. But the impact of that famous “phone call” ad showed that the myth of homeland insecurity, institutionalized during the Cold War years, was still as powerful as ever.

    In early September, 2008, Barack Obama was falling behind in the polls; his campaign based on “hope and change” was stumbling. Then suddenly a new peril appeared on the scene: an impending collapse of the economic system that threatened to flood the nation with disaster. Obama was judged most able to fend off that peril and he surged ahead. 

But Obama and his political strategists knew that, as Democrats, they would always be open to charges of being “weak” on security issues. No doubt many factors moved the president to adopt a national security policy that in many ways resembled his predecessor’s. But the need to guard his right political flank was surely one of those factors.

    Like the Democrats of the late ‘40s, Obama’s 2012 campaign team expected to score lots of political points by crowing about American domination -- in this case, domination of a splintering, Osama bin Laden-less Al Qaeda. Once again, though, calling attention to homeland security issues put the Democrats in a precarious political position. Having intentionally created an impression of a strong U.S. hand controlling events around the world, they were vulnerable to any event that called their total control into question. On September 11, 2012, in Benghazi, that event arrived.

“Hope and Change” Born Again: The New, Improved Version

[Image credit: Twitter/BarackObama.]

I’ve waited eagerly for the day after Election Day, to see what the story of Election 2012 would be. Every presidential winner has a story attached to his name. Sometimes the story is not so memorable. (What was the day-after-victory story of Jimmy Carter or George H.W. Bush?) Often, though, the story told about an election outlives the direct influence of the president whose name is attached to it:

    1960: John F. Kennedy: Youth and vigor can meet any challenge.

    1968 and 1972: Richard Nixon: Law and order stem the tumult of “the ‘60s.”

    1980 and 1984: Ronald Reagan: It’s morning in America as we shrink big government.

    2004: George W. Bush: America must win the war on terror.

    What about 2008, when the name of Barack Obama was indelibly linked to the words “hope and change”? Had Obama lost in 2012, his story probably would have been as forgotten as Carter’s or Bush 41’s.

    But given Obama’s victory, the jury is still out, awaiting the verdict of history yet to be written.

In his bid for reelection, the president intentionally avoided any emphasis on the “hope and change” narrative. Focus groups showed that voters had “lowered their expectations, and they responded better when Obama appeared to have lowered his expectations, too,” as Ezra Klein reports.

    Yet many observers, listening to the president’s 2012 victory speech, thought they heard powerful echoes of the “hope and change” story returning.

    This time, though, the story is thicker because it’s linked to two themes that dominated Obama’s campaign rhetoric. One is the burned-once caution Klein notes, which showed up clearly in the victory speech: “As it has for more than two centuries, progress will come in fits and starts” because the work of self-government is always “hard and frustrating.”

The other new theme is economic inequality, the demand that the rich should pay a little bit more so that the middle class can survive, expand, and perhaps even thrive again. Nearly a year ago the president signaled that this would be the leitmotif of his campaign, in a speech in Osawatomie, Kansas.

    As the campaign went on and the focus groups held sway, that theme was blurred by a host of others which looked like winners among crucial niche groups in crucial states. But the original leitmotif never disappeared.

    It came back in a Washington Post story just days before the election, surely planted by the Obama campaign, that the president would demand higher taxes on the rich in the post-election bargaining as we approach the “fiscal cliff.”

And it came back in the victory speech, too. The president coupled “reducing our deficit” with “reforming our tax code.” He promised to “continue to fight for new jobs and new opportunities and new security for the middle class,” to “keep the promise of our founding,” that “you can make it here in America if you're willing to try.”

Obama put these economic promises in the same broader ideological context he had used since Osawatomie: “We are an American family, and we rise or fall together as one nation and as one people. ... What makes America exceptional are the bonds that hold together the most diverse nation on Earth, the belief that our destiny is shared. ... This country only works when we accept certain obligations to one another and to future generations.”

    So “hope and change” now has a more specific meaning: Struggling against entrenched opposition to force the rich to act, at least a little bit, as if they had an obligation to care about the economic well-being of the rest of us.

Despite the Obama campaign’s efforts to blur and soften this narrative, there was no way -- and is no way -- that it can be separated from the man. It remains the most obvious demarcation between the president and his challenger, who was widely perceived as the embodiment of the wealthy and their power and privileges.

    This difference in narratives does not explain Obama’s victory. None of the influential voices in the mass media are saying that. Indeed, so many different explanations are being offered for Obama’s victory that no single story will emerge as “the story” of the day after Election Day, 2012. But history can attach a narrative to a president even if it does not judge that narrative to be the key to his electoral success.

    History can also judge the narrative immensely successful regardless of the president’s policies. No one should expect Obama to make a serious dent in the power and privileges of the rich. His first term shows no evidence that he wants to do more than symbolically chip away at the edges of that power and privilege.

Yet symbolism is an immensely powerful force. Kennedy’s youthful vigor never solved his greatest challenge, Vietnam. But it helped give rise to the youth culture of the ‘60s. Nixon could not turn back all the changes the youth culture initiated. But his theme of “law and order” blunted the truly radical power of the ‘60s and paved the way for Reaganism.

    Reagan didn’t really shrink government. But he created a mythology that government is the problem, which still reigns in the House of Representatives today. George W. Bush’s war on terror led to fiascos in Iraq and Afghanistan. But drones still kill innocent civilians by the scores because political reality demands that any president must “defeat terrorists.”   

    Regardless of his policies, if Obama is widely seen four years from now as a successful president, his story of shared destiny translated into greater economic equality will be remembered as the true meaning of “hope and change.” And it will take on a life of its own, exerting a powerful influence on American political life long after Barack Obama leaves the White House.

Obama vs. Boehner: Who is the True Jeffersonian?

[Image credit: Flickr/Wiki Commons/HNN staff.]

    As the presidential race neared the finish line, I occasionally tried to resist my obsession with today’s politics by opening Peter Onuf’s Jefferson’s Empire. The more I read, though, the more I realize that studying Jefferson doesn’t take us out of the present at all. It merely reminds us that, as Faulkner said, the past isn’t even past.

    Onuf explains that Jefferson’s vision of America was profoundly shaped by his understanding of the British empire, where all power and wealth flowed from the periphery (especially the colonies) to the center, the great metropolis of London and its royal court. Jefferson insisted that the United States of America must be the opposite: a vast empire with no metropolitan center and thus no periphery to be oppressed by the center.

    This view became the framework for Jefferson’s understanding of American nationalism and thus (like so much else in Jefferson’s thought) a basic staple of the American political narrative for future generations.

    At every moment of crisis, Onuf writes, Americans have repeated Jefferson’s essential revolutionary gesture. They have understood -- “(or imagined)” he adds, in a crucial parenthetical remark -- that “they confronted powerful domestic enemies” ensconced in the metropolis “who were prepared to sacrifice the common good for their own selfish advantage. Thus even as the memory of the Revolution evoked images of transcendent brotherhood and union -- the apotheosis of empire -- it also taught young patriots to question the patriotism of their opponents and to mobilize against them.”

    Jefferson is, of course, the holy grail of every generation of American political speakers. All want to prove that they are his genuine representative, worthy to bear and pass on his legacy.

    So it’s not surprising that, in his victory speech, Barack Obama evoked powerful Jeffersonian images of transcendent brotherhood and union. “We rise or fall together as one nation and as one people. ... What makes America exceptional are the bonds that hold together the most diverse nation on Earth, the belief that our destiny is shared,” he proclaimed.

    But he embedded his reconciliatory words within a veiled warning that there are still domestic enemies to be confronted: “By itself, the recognition that we have common hopes and dreams won't end all the gridlock.”

    And throughout his speech he made it clear how to identify the enemies. They’re the ones who resist all the “common hopes and dreams” he named: better schools, new technologies, health care for all, equality for racial minorities and gays and the disabled, “new jobs and new opportunities and new security for the middle class”; in short, the whole agenda of policy goals for which he advocates government action and spending.      

    Where would the money to fund these improvements come from? And who would oppose them? Obama didn’t have to spell out the answers. Having spent months demanding higher taxes from the rich, and attacking Republicans who promise lower taxes, he could assume that everyone got the message clearly enough.

    Obama did not question the patriotism of the rich whose special interest lies in resisting higher taxes, nor of the Republicans who carry their banner in Congress. But when he rejected the “wishful idealism that allows us to ... shirk from a fight,” he was clearly mobilizing his political troops to do battle against them. In good Jeffersonian fashion, he clearly implied that all patriotic Americans would rally to his call.

    The most immediate battleground is the showdown over the looming “fiscal cliff.” Enter another major contender for the title of “true Jeffersonian” in 2012: John Boehner.

    In a press conference just hours after Obama’s victory, the speaker of the House of Representatives sounded his own clarion call for transcendent brotherhood and union: Voters “gave us a mandate to work together to do the best thing for our country. ... Let's challenge ourselves to find the common ground that has eluded us ... and do the right thing together for our country.”

    But Boehner, too, could scarcely conceal his warning about domestic enemies who imperil the common good for their selfish interests. “The greatest challenge of all [is] a massive [federal] debt.” And “the entitlement programs are the primary drivers of our debt.”

    Boehner knew it would be impolitic for the losing party to spell out the obvious implication: The enemies are all those recipients of Social Security, Medicare, and Medicaid who refuse to take the cuts that true patriots would eagerly accept.

    (Over at Fox News, Bill O’Reilly didn’t hesitate to say it out loud. “It’s not a traditional America anymore,” he lamented. There’s a new majority made up of people who “want stuff.”)

    Boehner went beyond Obama by identifying the good guys as well as the bad guys. No less than seven times he lauded small businesses -- the “rock of our economy” -- and demanded that they be protected from tax hikes.

    This praise of independent entrepreneurs gave Boehner another point in the competition for the title of “true Jeffersonian.” Jefferson assumed that the vast majority of patriotic Americans would be independent yeoman farmers, the most common form of small businessmen in his day.

    Boehner scored an even bigger Jeffersonian point, though, when he warned against “government taking a larger share of what the American people earn.” Here was the familiar heart of the GOP's Jeffersonian message, the evil of the metropolis and especially the royal court: “Feeding the growth of government through higher tax rates won’t help us solve the problem. ... A ‘balanced’ approach isn’t balanced if it’s done in the old Washington way of raising taxes now, and ultimately failing to cut spending in the future.”

    But don’t count Obama out in this “true Jeffersonian” contest. He’s proven that he’s every bit as much a “comeback kid” as Bill Clinton. And in this case his path to victory is clear, though not easy. He has to explain to the American people that the new form of empire, which Jefferson did so much to create, only managed to produce part of the change that TJ expected.

    It did largely eliminate the old imperial system, in which rulers housed in the metropolis reaped direct financial gain from their political control. No one gets rich simply by being president or speaker of the House. Top-flight politicians can almost always make far more money by using their skills in the private sector, where the real wealth is.

    Real wealth still flows in huge waves to the metropolis, of course. But it’s not the same metropolis as the seat of political power. To put it bluntly, Washington and New York (and Chicago, Los Angeles, San Francisco, Houston, and Dallas) are separate metropoles; the political and economic centers are no longer the same. That’s the piece of the picture Jefferson did not foresee.

    This means that, theoretically, the government in Washington can be the true agent of the common good, the benefactor of all, while the masses remain oppressed by the other metropoles, where wealthy and powerful domestic enemies sacrifice the common good for their own selfish advantage.

    Obama could use his impressive rhetorical gift to make the case that this theoretical possibility has become the actual reality. Then he could call true patriots to mobilize in support of the political metropolis against the selfish enemies of the nation, who live so lavishly in the economic metropoles.

    Obama could add that the economic metropoles have reorganized our economic life so that Jefferson’s vision of a land full of small businessmen can no longer match the reality. But, he could explain, Jefferson’s praise of the yeoman farmer need not be seen as praise for the family or household as an independent economic enterprise.

Rather, Jefferson was making a compelling argument that everyone in society benefits when each household has a firm and dependable foundation of economic sufficiency. In our day this comes much more often from earned wages and benefits than from labor in one’s own fields. But it is still the task of the political metropolis to ensure that the land is filled with economically secure households, despite all the opposition from the economic metropoles.

If Obama makes these rhetorical moves he can defeat Boehner and the Republicans in the contest for the title of “true Jeffersonian.” More importantly, he can update our understanding of America as the Jeffersonian “empire of liberty” and make it relevant for the twenty-first century. And, in the process, he just might take control of policymaking in the political metropolis, too.

The “Fiscal Cliff” and THE SCANDAL

[Image: David Petraeus and Paula Broadwell in 2011. Credit: Flickr/U.S. Navy.]

    Robert Rubin, former secretary of the Treasury, writes in the New York Times: “Now that the election is over, Washington’s attention is consumed by the looming combination of automatic spending cuts and tax increases known as ‘the fiscal cliff.’” 

    “Consumed”?  Excuse me, but I just checked the websites of the Times, the Washington Post, USA Today, CNN, Fox News, CBS, NBC, and ABC. Every one of them had the same lead story -- and it was not “the fiscal cliff.” 

    By now, of course, you know what it was. Everybody knows: THE SCANDAL WIDENS! 

If Robert Rubin had written that some people in Washington are giving some attention to “the looming ‘fiscal cliff,’” he might have been correct. In Washington they’re sort of forced to deal with such wonkish stuff, at least part of the time.

But outside Washington the “fiscal cliff” must be so far eclipsed by THE SCANDAL that hardly anyone can see the “cliff” at all, much less see it looming ominously just ahead. Even in Washington, the news sources suggest, the “cliff” is taking a distinct back seat to THE SCANDAL.

The news media are once again showing their depressing penchant for sensationalism. But there’s no point in complaining. It would be as useful as complaining about the weather. Like Hurricane Sandy, THE SCANDAL will dominate the headlines until it runs its natural course and plays itself out.

    If you want to know why, try this little thought experiment. Imagine that you are a Hollywood screenwriter hoping to pen the next box-office blockbuster. You’ve been offered two very different projects.

One is a film about the president and Congressional leaders negotiating to avoid a financial catastrophe. The other is about the nation’s two most prominent generals, one of them head of the CIA, caught in some mysterious secret relationships with two attractive younger women, both married, one a wealthy socialite and the other a Harvard-trained expert on terrorism.

No-brainer, right? That second project sounds like something that could only happen in a Hollywood movie, not in real life -- something manufactured in “the dream factory,” full of larger-than-life characters freighted with complex symbolic meanings, doing things that pack a powerful emotional punch.

    In short, like any good movie, THE SCANDAL has all the qualities we associate with myth. Which is precisely why it has eclipsed what may be the most important political negotiations in decades.

    Of course it’s the very real newsroom editors, not some hypothetical Hollywood writer, who are faced with the choice. Their job is to deliver audiences to advertisers. And what audiences want from their news is not so much accurate facts or penetrating logical analyses as gripping tales, the kind that would make good movies. So that’s what the editors give them. Why do you think we call them news “stories”?  

    No doubt it’s significant that this particular story is loaded with sex appeal. I’ve read plenty of Freud (once even taught a course on him), so I could offer some opinions on why sex sells. But I’ll demur.

The larger and more important point is the power of narrative to shape our perceptions of public events. (I was going to say “public affairs,” but that seems a poor choice of words here.) Indeed, it may be fair to say that events don’t become public -- at least don’t take on public significance -- until they are represented in narrative form. And the more mythic those narratives are, the more public attention they get.

    In this case, as so often, it’s all most unfortunate. THE SCANDAL will soon be forgotten, as most scandals are, and will have no lasting impact on the nation. But the negotiations to avoid going over “the cliff” will have a huge and lasting impact on all of us. The fate of Medicare and Medicaid, and perhaps Social Security too, hangs in the balance.

    There’s an organized movement to stop President Obama from agreeing to cuts in those entitlement programs. That movement might have some success if it can muster broad public support. Will the public ever know about it? I wonder. It is getting a bit of news coverage. It’s even featured on the WaPo website -- buried beneath six (6) stories about THE SCANDAL!

    As I said, there’s no use complaining about it. But we can take THE SCANDAL as a useful reminder that we can’t understand American public life -- and certainly not American political life -- without giving serious attention to its mythic dimension.

    I suppose what those opponents of cuts to entitlements need now is a good myth. Something about “Grandma,” perhaps? Remember those fictional “death panels”? But, as THE SCANDAL reminds us, mythic tales can be full of empirically true facts.

    An empirically true, but emotionally powerful, story about what will happen to “Grandma” if her Medicare is cut might be just the thing right now -- once THE SCANDAL fades from the front page of public memory. Let’s just hope it fades before “Grandma,” and all of us, go over “the cliff.” 

    Class: The Missing Link in the Story of Election 2012.

    On Election Day we learned who will be president for the next four years. In the days after Election Day we learned something almost as important: the story that will be told about the election of 2012. The popular story of any election takes on a life of its own, and it can shape the political landscape for years to come.

    We can now safely project the winner of this year’s election story contest: Republicans self-destructed by moving too far to the right on issues that matter to women (especially unmarried women), newly empowered Latinos, and still empowered African-Americans.

    Among liberal pollsters this pro-Obama coalition (plus the under-30s) is often called “the rising American electorate” (RAE). They are the future, the story goes. The Republicans must face that fact, make the necessary changes, or get ready to become history. Race, ethnicity, and gender are destiny.

    But I wouldn’t write off the Republicans so fast. If the story is told this particular way it can actually work to the GOP’s advantage. The 2012 election may become a turning point in our political history, as the story makes it out to be, only if class is added to race, ethnicity, and gender as a fundamental element in the plot.

    I came to that conclusion by looking at some numbers that have largely been left out of the popular story.

    First there is the most crucial and most often ignored number: seven million. That’s the drop in the number of white voters between 2008 and 2012. Seven million white voters just didn’t show up this year. The big question is whether they will show up four years from now or, just as importantly, two years from now.

    In 2014, 20 Senate seats now held by Democrats will be up for grabs, 11 of them in states where Dems are vulnerable. Republicans will have 13 seats up, but only one in a state where a Dem might win. So a large turnout of Republican voters could easily give the GOP control of the Senate.

    For the Democrats to retake the House in 2014, they must hold on to all the new seats they won this year -- all in swing districts -- and win at least 18 more, eight of them in GOP-leaning districts. A strong showing of Republican voters would prevent that and ensure that the GOP gets an even larger majority in the House.

    As this year’s exit polls show, Republican success depends on a high percentage of white male voters. And there’s one thing that is sure to bring lots of white men to the polls: the currently popular story that emphasizes race, ethnicity, and gender.

    It’s already being translated into language that white conservative men understand all too well: Latinos are teaming up with blacks and liberal (code for “loose”) women to take over the country. They’re the reason we are losing the America we once knew and loved. Rush Limbaugh told his millions of listeners the day after Election Day, “I went to bed last night thinking we've lost the country.”

    But two years from now Limbaugh will be telling those millions that it’s time for patriots (read: whites) to take back their country. And they will try mightily, simply by showing up at the polls. Ditto for the dittoheads four years from now.

    So for those of us who fear this vision of the future, it’s a good idea to look for another story about this year’s election that fits the facts but can blunt the boomerang effect of the “race, ethnicity, and gender” narrative. Fortunately, it’s staring us right in the face.

    Pick up the exit poll and look at the category labeled “Family Income.” (The best breakdown is on the Fox News site, but it’s the same poll all the media used.) You’ll see a strikingly simple tale: the more money you make, the more likely you were to vote for Mitt Romney. Families earning under $30K went 63% for Obama, and $30-50K families went 57%. Among $50-100K families Romney got 52%, and among $100-200K families he rose to 54%.

    $50K, the median family income, is the great political divide. Voters below the median gave Obama 60% of their votes, and thus his victory. And some of them were white men and married women above age 30.

    The RAE made up 48% of the voters, and two-thirds of them went for Obama. So 32% of the electorate were pro-Obama RAE voters. But Obama got a shade over 50% of the votes. So some 18% of voters were not part of the RAE yet opted for Obama. Some were folks with graduate degrees, most of them no doubt above the median income. But that leaves the decisive swing voters: several million white men and married women below the median income who voted for Obama.
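
    To make that arithmetic explicit, here is a back-of-the-envelope version using the rounded figures above (so the totals are approximate, not exact exit-poll tallies):

    \[
    0.48 \times \frac{2}{3} \approx 0.32 \quad \text{(pro-Obama RAE voters, as a share of all voters)}
    \]
    \[
    0.50 - 0.32 \approx 0.18 \quad \text{(Obama voters outside the RAE, as a share of all voters)}
    \]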

    So the election wasn’t just about racial, ethnic, or gender politics. It was also about the economy, stupid. As Paul Krugman wrote, “the big numbers came from groups unified by economic fear. … While single women and members of minority groups are more insecure at any given point of time than married whites, insecurity is on the rise for everyone, driven by changes in the economy.”

    Yet the story of class -- which fits the exit polls as well as the story of race, ethnicity, and gender -- got virtually no hearing in the mass media.  

    It’s always been taboo in America to talk about class. The myth that “we’re all middle class” has been among the most powerful of all our national myths. Both candidates this year knew that very well, which is why they often sounded so silly as they fought to see who could mention the sacred words “middle class” most often. Barack Obama never talked about helping the poor, only about helping people who aren’t yet in the middle class achieve that normative status.

    Historically, Americans have been able to avoid talking about the glaring class divides and tensions in their midst by focusing on the equally glaring divides and tensions surrounding race. It’s almost a cliché among historians to say that, while other nations have dealt so often with class conflict, we’ve dealt constantly with race conflict.

    The growing salience of Latinos complicates matters a bit because thoughtful people (and the U.S. Census Bureau) know that Latinos are an ethnic group composed of many racial identities. But many (most?) white Americans see Latinos merely as “brown-skinned people,” making it easy to assume that Latinos, like African-Americans, are a dark-skinned race. So in practical political terms Latinos become part of the story of race as a substitute for class in public discourse.

    Feminist historians would be quick to add that gender conflict has been central to public discourse, along with race conflict, throughout American history.

    So the popular story of Election Day 2012 reflects a long-standing pattern, unique to the United States, of avoiding talk of class in favor of talk about race, ethnicity, and gender.

    The Democrats are just now beginning to talk about the divide between the rich and the rest of us. It’s another big step to talk about the differences between those above and those below the median income -- including the decisive political difference.

    But if Democrats don’t take that step soon they risk another major defeat in 2014 and perhaps 2016. Then all the benefits of re-electing Barack Obama could easily slip down the political drain.

    On the other hand, imagine this as Democrats’ story of the 2012 election: The winning coalition was a rainbow of folks under median income who saw clearly where their bread was buttered, and it wasn’t on Wall Street or in the corporate offices of Bain Capital.

    Then the central question of the elections of 2012 and 2014 might become, “Do you want a government dedicated solely to increasing the wealth and cutting the taxes of the rich while slashing the vital government services we all depend on?,” rather than, “Do you want those blacks, Latinos, and liberal women to take over the country?”

    Making class a central issue could get at least some whites, especially men, in the lower income brackets to think of their vote in a rather different light -- not as revenge against the people who are “taking away our country,” but as a chance to continue a move toward the economic justice they deserve as a reward for all their years of hard work.

    Nineteenth-Century Nationalism Still Alive, and Deadly, in Mideast

    [Image: IDF brass in a briefing about the conflict in Gaza, November 17, 2012. Credit: Flickr.]

    Ask most Americans why Israel went to war in Gaza again and they’ll give you a simple answer: Palestinians were shooting rockets into Israel, and, as President Obama said, “there’s no country on Earth that would tolerate missiles raining down on its citizens from outside its borders.”

    To name those rockets as the root cause of the war is like saying my fever caused my flu. But why shouldn’t the public identify a symptom as the cause of the conflict? They hear and read the same misleading explanation in their news media over and over again. So they see no reason to dig any deeper.

    Historians, of course, will dig deeper. They’ll be suspicious of explanations of any war that go back no further than the last few days, or even the last few months. A barrage of rockets may have been the “precipitating” event, as Obama put it, with what must have been a carefully chosen word. It can equally be said that Israel's assassination of a high-ranking Hamas official involved in negotiating a truce was the precipitating spark.

    But the fuel has been building up for two centuries.

    I’m not talking about the wrong-headed cliché, “Oh, those Jews and Arabs. They’ve hated each other for centuries. They’ll go on fighting forever.” The long history of Jewish-Arab relations runs the gamut from bitter enmity to tolerant co-existence to cordial friendship. When Jews and Arabs meet, anything is possible.

    Yet the historical circumstances of any particular meeting set limits to the possibilities. Since the nineteenth century, the overwhelming historical circumstance has been a passionate embrace, on both sides, of modern secular nationalism.

    I emphasize “secular” to dismiss the other common but wrong-headed cliché, “It’s a religious war, and those never end.” Religion is the tail that may occasionally wag the dog in Jewish-Arab relations. But the dog -- the beast itself -- is the core innovation of nineteenth-century nationalism: one’s personal identity, worth, and dignity come from full membership in a nation-state.    

    By the end of the nineteenth century, most Jews who met Arabs were Zionists. Zionism was, and still is, the Jewish form of modern secular nationalism. Nineteenth-century Zionist writing is rife with expressions of anger over the indignities suffered by Jews in the preceding centuries -- but even more with expressions of shame, implying (and often stating outright) that Jews have themselves to blame, that they allowed themselves to become powerless victims. In the canon of modern nationalism, that is perhaps the gravest of sins. 

    Zionism was, above all, an effort to use modern nationalism to prove that Jews could achieve personal worth and dignity only by escaping the shameful sin of powerlessness. To that end, Zionists had to enact a script in which they confronted and (unlike their ancestors) successfully overcame enemies who were persecuting them for no other reason than being Jewish. That was the Zionists’ way of proving their right to have a proud, self-respecting nation-state, entitled to an equal place alongside all the modern nation-states.

    But here was the Catch-22: Zionists could never feel like a proud nation unless they were actively dispelling the pall of the shameful Jewish past. So they had to be constantly enacting their script, in which innocent Jews struggle to overcome oppressive enemies.  

    The need for constant enemies produced a Jewish myth of constant insecurity, which shaped the Zionist view of history at every step. (I make this case in much more detail in my essay “The Myth of Israel’s Insecurity.”) 

    Of course the script required some real people to play the role of the anti-Semitic enemy. Before 1947, when the British ruled Palestine, they played that role, along with the Palestinian Arabs. Once the state of Israel was born, a long (but narrowing) list of actors played the role: the Arabs, the Nasserites, the Palestinians, the PLO, and now Hamas.

    Certainly not all Israelis view the world through the myth of insecurity. But so many do that no successful Israeli political leader has dared (or perhaps wanted to) question it. So the myth became the guiding light of policy.

    In the case of Gaza, the myth dictates that Hamas must be treated as an irrational gang of anti-Semites determined to destroy Israel. All the evidence to the contrary (including the most recent CNN interview with the head of Hamas) must be dismissed as merely the devious lies one would expect from such a diabolical crew.

    More specifically, the myth dictates that Hamas must be smuggling into Gaza the weapons it needs to mount an all-out assault on Israel. So it makes perfect sense, from the Israeli perspective, to demand that Gaza be blockaded, to prevent Hamas from getting those weapons -- even though that means Gazans also can’t get food, medicine, building materials, and other necessities of life. 

    Some Israelis may find that an unfortunate side effect of the blockade. Others may see it as the main effect, like the prominent Israeli official who said that the point of the blockade is “to put the Palestinians on a diet, but not to make them die of hunger."

    In either case, the blockade has the effect of proving Israel’s power over an enemy, which in turn proves (according to the mythic script) that the Jewish nation can hold its head up high, that Jews need no longer feel ashamed and blame themselves for powerlessness.

    Thus the blockade has continued, provoking the only kind of resistance that Gazans have managed to come up with: sporadic rocket fire into Israel. Those who fire the rockets have said repeatedly that they will cease when the blockade ceases. But Israel’s nationalism demands that it dismiss these promises as deceit.

    Of course those who fire the rockets know that their weapons are far too weak to influence Israeli policy directly. Perhaps they harbor some theories of indirect influence. In any case, they appear to be moved by the same nineteenth-century nationalist values that shaped Zionism: To remain passive, to accept and exhibit powerlessness, would be a mortal blow to their sense of dignity and self-respect. Rather than risk this gravest sin, they must show Israel, and the whole world, whatever power they have. So they risk the terrible retaliation that Israel periodically mounts.

    Ultimately, then, it is the legacy of nineteenth-century nationalism that has kept the two sides locked in this ongoing conflict. Perhaps right now we are seeing the glimmer of a new kind of nationalism taking over, where pragmatic self-interest gets precedence over old-fashioned notions of national pride. But we should not underestimate the reach of the long shadow of the nineteenth century.

    “Lincoln”: Jesus Christ! God Almighty! What a (Biblical) Movie!

    About three score and a couple of years ago my sister was a research librarian in Hollywood, working for an outfit that dug up information needed by moviemakers. One day she called me and said, “You’ve got a PhD in the history of Judaism. So what are the facts about the lost ark, the one that was in the Jerusalem Temple in biblical times?”  “There are no facts,” I quickly replied. “It’s all just legend. Why do you want to know, anyway?”

    “Steven Spielberg is making a movie about the lost ark, and he wants us to get him the facts.” “A movie about the lost ark?” I asked incredulously. “Is he crazy? Does he think anyone is going to pay money to see that?”

    Obviously, I may know something about history but not much at all about the movies. I suppose that alone might disqualify me from making any comment on Spielberg’s latest epic, Lincoln.

    But when America’s greatest living mythmaker takes on America’s most mythicized president, how can the author of a blog called MythicAmerica remain silent? If it’s not my obligation to say something, at least it’s an irresistible temptation.

    What places Lincoln above all presidents in our national memory is his image as The Great Emancipator, a larger-than-life man led by a crystal clear and unwavering moral vision on the transcendent moral issue of American history. In mythic terms, the mere fact that America could produce such a leader is powerful evidence of a clear moral vision at the heart of America, a vision that all Americans can draw from and thus share in, at least vicariously. If that moral vision can be combined, in this one person, with skillful use of our democratic system to put the vision into practice, so much the better.

    But recent historians have created at least a hint of a different myth, in which Lincoln is larger than life because he so skillfully manipulated the system in pursuit of some lesser goal -- saving the Union not for a greater moral purpose, but merely as an end in itself; or perhaps, even worse, merely being a winner for the sake of being a winner, in both war and politics.

    I expected the film to explore this issue, to take a stand on it, to tell us what the Lincoln myth for our generation should be. Spielberg’s choice to focus on the Thirteenth Amendment seemed well suited to the task. The key scenes would be those in which Lincoln came to his decision about pressing for immediate passage. That would reveal just what kind of mythic figure the director (and screenwriter Tony Kushner) wanted us to see.

    Watching the film, I quickly found myself frustrated because that question was sidestepped, or at best made rather secondary. Lincoln’s decision-making process had been concluded before the time frame of the film even began. We are introduced to his firm decision in the form of a dream.

    My frustration was heightened by the rather wooden way the political-historical facts were discussed. The dialogue was so fragmentary and rapid-fire that it could hardly be considered a thoughtful, much less thought-provoking, treatment of the issues in question. The historian in me couldn’t figure out quite what to make of it all.

    Scenes of personal interaction -- among Lincoln, his wife, his sons, their servants, minor functionaries, and soldiers -- relieved the tension because they meant nothing as history. They were simply superb cinema, and I could indulge completely in enjoying them as such.

    Then at a certain point it struck me that I was missing the point of the movie: It was all simply superb cinema. If I let myself, I could be sucked into the story and carried along by it, as I suspect most of the audience was (except the guy sitting next to my wife, who fell asleep). Once I allowed myself to suspend disbelief and treat what I called the political scenes on the same level as what I called the personal scenes, it was a truly glorious piece of theater, a spectacle from the Hollywood “dream factory” at its best. How appropriate that we meet the Thirteenth Amendment first in a dream.

    The tension between historical fact and pure theater was reinforced right after the movie by two incidents. As the credits rolled, a woman sitting near me told a friend about a high school American history teacher who was giving his students extra credit for seeing the movie. Maybe he wanted them to think about how history is turned into mythic spectacle. But I doubt it. Since the filmmakers emphasized so strongly their debt to historian Doris Kearns Goodwin, and the credits included thanks to so many other historians, there’s an understandably widespread (though unfortunate) view that this is a fine way to learn real history.

    When I got home and glanced at my email, I found that a friend had sent out a piece by the New Yorker’s film critic, David Denby. His conclusion sums up the very ahistorical quality of the film. It’s strange to call a movie “momentous,” he says, because great movies typically suggest their larger meanings only through implication. But “Lincoln” is momentous because the message is so direct: “Spielberg and Kushner marched straight down the center of national memory ... and they got it right.”

    What they “got right,” of course, was not the facts of history; no doubt they got plenty of those facts right, but that misses the point. What they “got right” was the path that leads down the center of national memory. Since national memory is mythic and need not be checked by facts, that path can always appear to aim at, and be guided by, America’s crystal clear and unwavering moral vision, so that it runs straight and true through the twists and turns of messy democracy.

    Spielberg is obviously in love with this traditional story of America’s journey along the path of moral truth. (See Saving Private Ryan, Amistad, and his video game, Medal of Honor.) Now his immense technical gifts have allowed him to create his most impressive pageant of America marching down that path, headed by its greatest leader, as interpreted by its greatest mythmaker.

    Of course it’s not just Spielberg. Some scholars believe that Americans, as a people, are more likely than many others to see their history as a morality tale because so many Americans have taken the Bible as a sort of code book to decipher the meaning of our historical events.

    David Denby raises this theory at the outset of his review, quoting Lincoln’s law partner, William Herndon: Lincoln was “the noblest and loveliest character since Jesus Christ ... I believe that Lincoln was God’s chosen one.” Denby goes on to note that the popular image of Lincoln still includes “attributes both human and semi-divine ... which combine elements of the Old and New Testaments.”

    As for the New, he might have noted the obvious: In the end, Lincoln is martyred for having cleansed his people of their sin. Denby also could have pointed to the sequence in which Lincoln reminds his son that the president is the all-powerful ruler (at least as far as the army is concerned), but then gives his son up to the risk of death in that army, where so many soldiers died to wash away the sins of the whole nation -- a sort of “God the son becomes God the father” sequence. Is it too much to add that the exquisite lighting of the film, especially in the interior shots, creates an aura of the holy spirit hovering over everything the great man says and does?

    Denby offers only one Old Testament reference: the sequence in which Lincoln talks of his “awesome power,” and demands that his aides get the last two votes to pass the Thirteenth Amendment. “It’s Lincoln’s only moment of majesty in office ... Any thought of Jesus disappears. This is an Old Testament figure, wrathful and demanding.”

    But there’s a deeper Old Testament dimension to the lead character, which Spielberg spotlighted by closing the film with a flashback to the second inaugural address: “The Almighty has His own purposes.” Both the “offense” of slavery and “this mighty scourge of war” to punish that offense may be among those purposes. Yet “as was said three thousand years ago, so still it must be said, ‘The judgments of the Lord are true and righteous altogether.’"

    The New Testament is a relevant prototype for a story about God’s martyred chosen one. But since the film is really about a nation’s memory, the Old Testament is the more relevant prototype. The Old is the story of a whole nation’s historical struggles with offense and punishment, embedded in a thick web of political complexities, but all guided by an omnipotent moral hand toward a transcendent goal.

    It may be most rewarding to watch Lincoln as a biblical epic, ranked alongside films like The Ten Commandments and The Greatest Story Ever Told as one of the best American films of that genre. Lincoln reminded me why so much of the Bible is such fine literature: Once we are grabbed and swept away by a great story, crafted by great storytellers, a careful analysis of the historical facts no longer seems so important, and certainly not nearly so interesting.

    After the last credits roll and the last reviews are read, though, we are left wondering what it means for a nation to continue remembering its own history as if that history were a Bible story.

    Related Links

    HNN Hot Topics: "Lincoln": The Movie

    A New "New Cold War" in the Mideast? Credit: HNN staff.

    Just when we thought it was safe for Americans to go out in a democratizing Middle East ... Well, I guess we stopped thinking that a while ago. But now a lead story on the front page of the New York Times makes it official. Far from boosting our security, the Arab Spring has given us more to be afraid of.

    Gone are the days when all we had to worry about was fanatical Shi’ite Islam. Now a new Sunni “axis” is emerging, the Times informs us -- using a word that should send chills up the spine of anyone who knows anything about World War II -- with Egypt, Turkey, and Qatar playing the role once filled by Germany, Japan, and Italy.

    All three Mideast nations are governed by Sunni Muslim parties. So are Libya and Tunisia. More ominously, according to the Times, Hamas is allying with the “axis.” And if the Syrian rebels win their civil war, they’ll take Syria out of the Iranian orbit and into the new “axis” too.

    The result “could be a weaker Iran.” After years of warning us about a “new cold war” with a possibly nuclear-armed Iran, you’d think the mass media would be celebrating.

    But no. The Times merely warns us that we have to shift our anxiety to a new target. Why? The answer is a mother lode of precious material for students of American political mythology.

    These Sunnis, reporter Neil MacFarquhar explains, “promote a radical religious-based ideology that has fueled anti-Western sentiment around the region.” That’s a good example of how exaggerated facts create the emotional punch so essential to myth.

    Yes, there’s a religiously-based ideology fueling anti-Western sentiment. “Radical” makes it sound inherently dangerous. But it’s hardly radical or dangerous to those who hold it. It’s perfectly sensible to them.

    And it’s a huge stretch to say that government leaders in Egypt and Turkey “promote” this ideology. They are trying to harness and lead it. But at the same time they are trying to restrain it as they navigate tricky political waters, where they depend heavily on secular forces for their economic and political well-being.

    More soberly, MacFarquhar writes that “the new reality could be ... a far more religiously conservative Middle East.” Yes, it could be. But it might not be. There’s no way to know.

    The American journalist shows a sharper understanding of the response in his homeland, focusing precisely on this uncertainty of the future.  “The shifts seem to leave the United States somewhat dazed.” “The United States” here means that tiny fraction of one percent of the American population who make or directly influence foreign policy. On the international stage, where they act out the drama of geopolitics, they represent the entire nation.

    They are dazed because “what will emerge from all the ferment remains obscure. ... Confusion reigns in terms of knowing how to deal with this new paradigm, one that could well create societies infused with religious ideology that Americans find difficult to accept.” In this case, “Americans” probably does mean a majority of the whole population.

    Why should Americans find it difficult to accept the religious choices of people on the other side of the world? Why does it even matter if Americans accept them? MacFarquhar’s answer is simple and surprisingly candid: “The old leaders Washington relied on to enforce its will, like President Hosni Mubarak of Egypt, are gone or at least eclipsed. ... The new reality could be ... [a] Middle East that is less beholden to the United States.”

    In case you’re one of those liberals who doesn’t think the U.S. should be enforcing its will on independent foreign nations, the next sentence should bring you around: “Already, Islamists have been empowered in Egypt, Libya and Tunisia, while Syria’s opposition is being led by Sunni insurgents, including a growing number identified as jihadists, some identified as sympathizing with Al Qaeda.” 

    It’s impressive to watch guilt by association in action: Islamists are linked to Syrian Sunnis, who are linked to insurgents (an inherently danger-packed word), who are linked to jihadists (an even scarier word), who are linked to sympathizers with Al Qaeda (the scariest word of all, of course).

    By the logic of association -- a basic principle of mythic thinking -- the conclusion is obvious: If the U.S. can no longer enforce its will and keep Mideast nations beholden to us, we’re on the way to rule by Al Qaeda. Are you worried now??

    Despite the allusion to World War II in the loaded word “axis,” this all reminds me more of the era right after the war. Many American policymakers were somewhat dazed and confused by the political ferment, especially in Europe, that was making the future obscure.

    But some told a simple story that made sense of it all: The U.S. had such preponderant power that total global control seemed within our reach. Nothing less should satisfy. However, the Soviet Union and other political actors wouldn’t just roll over and submit. The principle of guilt by association proved that they must all be communists controlled by Stalin. He was causing all the ferment, promoting a radical ideology that fueled anti-American sentiment.

    It was a paradigm foreign to the American way, the story went; wherever it took hold, nations would no longer be beholden to the U.S., and we would no longer be able to enforce our will. Nor could we control, or even predict, the future. Anyone even remotely associated with anyone remotely associated with a communist shared blame for this frightening chaos. All of them became enemies who had to be destroyed.

    Those enemies might arise anywhere, which meant danger lurked everywhere. So America became an embattled fortress and a watchtower of constant vigilance. By the late ‘40s this narrative reigned supreme.

    The result was not merely four decades of cold war, but a firmly entrenched mythology of homeland insecurity that still persists and spawns new fears. Global control remains the impossible dream. Since there is always someone frustrating it, there is always a new enemy springing up. And so we are told yet again that we must be always on guard, always afraid, always ready for the next “new cold war.”

    We aren’t anywhere near that in our relationships with Sunni-led nations -- at least not yet. But in a myth-soaked foreign policy discourse, “the good guys” can become “the bad guys” awfully fast, as we learned in a very few years right after World War II. I guess it is a good idea to be on our guard.

    Why Are Dems Left Hanging on Edge of “Fiscal Cliff”?

    [Image: How the Dems could win the fiscal cliff debate: mobilize for the moral equivalent of war. Credit: Flickr/Library of Congress/StockMonkey.com/HNN staff.]

    Where’s that surge of public outrage that’s supposed to force the Republicans to surrender in the “fiscal cliff” negotiations? The Democrats are still waiting for it ... and waiting ... and waiting, while they teeter on the edge of the cliff.

    The Dems are so busy scrutinizing the polls, they forgot to notice the impact of the little word “cliff.” Sure, it’s just a metaphor. But every metaphor tells a story. And the stories we tell (or, more commonly, take for granted, without ever spelling them out) shape the way we view things, which in turn determines the policies we’ll adopt or reject and the way we’ll live our lives.

    Any story about a “cliff” is simple: We are safe now, with our feet planted firmly on solid ground. The whole broad earth supports us. But if we take one more step in the direction we’re currently heading it will be an apocalyptic step. Suddenly we’ll be plunging down through the abyss toward certain destruction, helpless to save ourselves. If we step in any other direction we will remain securely on solid ground; we’ll escape the apocalypse.

    The story of the “fiscal cliff” is more complicated because the public is getting so much conflicting advice about which direction is safe and which is the truly dangerous one. When you are standing on the edge of the precipice, with so many voices yelling “Go this way!” -- “No, that way!” -- “No, the other way!” -- what’s the sensible thing to do? Don’t move at all. At least that way you know you are safe.

    And sensible reasoning is reinforced by emotion. When we’re confused and in mortal danger our “fight or flight” response can easily get paralyzed. We freeze; play dead. It’s a primal response, the psychologists say, from deep inside the reptilian brain. 

    When people are too afraid to move, they see all images of change as images of danger. Inertia carries them on in the direction they’ve been going. It seems like the safest direction because it requires no new decisions. Conserving the status quo feels like the most comforting path.

    In short, when apocalypse looms and it’s not clear how to prevent it, people are likely to become more conservative. So if the Democrats want dynamic movement -- a surge of public support for innovative new policies to reduce economic inequality -- “cliff” may be exactly the wrong metaphor. 

    “Cliff” may also be the wrong metaphor if you want a story that actually fits the facts, as two reports in the New York Times explain: “America’s fiscal condition will be altered without a deal between President Obama and the Republicans in Congress. But not radically so, and in many cases not immediately.”

    “Policy and economic analysts … said the term ‘fiscal hill’ or ‘fiscal slope’ might be more apt: the effect would be powerful but gradual, and in some cases, reversible.” “The slope would likely be relatively modest at first,” according to Chad Stone, the chief economist at the Center on Budget and Policy Priorities.

    So why do the Dems ignore more appropriate metaphors and go along with the popular metaphor of the “fiscal cliff”?

    For the same reason Republicans embrace the “cliff” image, says Washington Post wonk Ezra Klein: “Legislators from both parties have concluded that crises are the only impetus to get anything -- and thus the opportunity to get everything -- done.”

    That may well be true in the back rooms of DC, where there’s little sense of urgency and some confidence that a final deal will surely be cut. But outside the beltway, crisis is more likely to breed conservatism.

    Except, perhaps, when we go to war. Historian Michael Sherry has shown that the most effective impetus to get anything done in American political life is to convince the public that we’re living “in the shadow of war.” Then we have a feeling of apocalyptic crisis, since Americans have always tended to talk about their wars in apocalyptic terms, as if the only alternative to victory were the demise of the nation.

    But when war breaks out we also have a clear consensus on how to respond. We don’t freeze. We band together and mobilize to fight back.

    The enemy need not be a foreign foe. Sherry offered copious examples of domestic societal problems framed as wars as far back as the 1930s, when Franklin D. Roosevelt often proclaimed that fighting the Great Depression was much the same as fighting the Germans in World War I. During FDR’s first term, there was widespread agreement that the New Deal was the best way to resist the enemy of a broken economy. So the nation mobilized to fight back.

    However, the New Deal teaches another lesson about the “war” metaphor: It triggers a dynamic common effort for apocalyptic victory -- at first. But war also breeds apocalyptic fear, which sooner or later creates a more conservative mood, at least on the domestic policy front. That was clear by the middle of FDR’s second term. Both world wars, the Korean War, the Vietnam War, and the post-9/11 response all produced similarly conservative reactions on domestic issues.

    In any case, this isn’t the ‘30s redux. The “fiscal cliff” is not a war metaphor. The only “war” triggered by our current economic problems is the one between the Democrats and Republicans about what to do as we teeter on the “cliff.” So talk of a “fiscal cliff” doesn’t unite the nation and set it moving in a clear direction, as war metaphors do. The political warfare only heightens the confusion and, therefore, the conservative impulse.

    It’s worth wondering how the Dems would have fared if they had refused the “cliff” metaphor and opted instead for “war.” If we can have wars on cancer or poverty, for example, why not a similar war against “special privileges” or “to save the middle class”?

    Those obviously metaphorical “wars” on the domestic front don’t usually generate apocalyptic fear the way actual military conflict does. Perhaps the “war” metaphor might have mobilized the kind of support the Democrats had hoped for.  

    We’ll never know. For better or worse the Democrats are content to leave us, and themselves, hanging on the edge of a “cliff.”

    Cheer Up, Rush: Dems Keep "Traditional America" Alive

    Dear Rush Limbaugh,

    The night President Obama was re-elected you went to bed thinking that Mitt Romney “put forth a great vision of traditional America, and it was rejected.” So “we’ve lost the country.” You explained to your audience that the voters had chosen a “Santa Claus” government over hard work as the way to get their needs met.

    Well, now that Santa is finishing up the last toys and getting the reindeer ready to fly, I want to bring you a season’s greeting full of good cheer. I want to cheer you up by telling you about the Christmas card I just got from the Obamas. It should ease your fears that your country, the one you call “traditional America,” is disappearing.

    All the Obamas signed the card, even their little dog Bo (who added his pawprint). In fact Bo is the star of the card; he’s the only one who got his picture on it. There he is romping through the snow on the Obamas’ lawn. Hey, Rush, what could be more traditionally American than that?

    And then read the message inside: “This season, may your home be filled with family, friends, and the joy of the holidays.” That’s it. No government coming into your home to spy on you -- or to give away stuff. In fact, no stuff at all. And no fat guy in a red suit to bring stuff. Just a home filled with family, friends, and Christmas joy.  

    (Yeah, I know it says “holidays.” But seriously, when did you ever see a picture of a little dog romping in the snow as a symbol of Hanukkah, or of anything associated with Muslim culture? We are obviously talking Christmas here.)

    Why do you suppose I got this card? I don’t know. The only reason I can imagine is that I’m on some list of people who volunteered for Obama during the campaign. It says it was “not authorized by any candidate or candidate’s committee.” It was “paid for by the Democratic National Committee, www.democrats.org” (which means no government money was spent on the card, so don’t jump to any nasty conclusions).

    But you know as well as I that the Democrats are trying to hold on to all of us who volunteered, so that when the time comes they can mobilize us in whatever political fight they need us for. I mean, you should see all the emails they still send me.

    The thing is, I didn’t do very much for the campaign. There must be tens of thousands of people, maybe hundreds of thousands, who did as much as I did. You’ve heard about the size of the Obama “ground game,” I bet. And they must all be getting the same card.

    Now think about it, Rush. (This should really dry your tears.) The Democrats made a Christmas card to send to this huge list of people who support the Dems so solidly that they’ll give a few volunteer hours. These are all the people that you think are taking away your country, rejecting “traditional America.” The Dems surely hired some pretty high-priced PR professionals to figure out exactly what should go on that card -- what would make all of us who get it feel so good that we’ll want to volunteer even more.

    And what did they come up with? Santa giving away stuff to a diverse rainbow coalition of greedy Americans? An inter-faith gathering, complete with atheists, celebrating a neutralized “seasonal observance”? A gay couple sitting down to Christmas dinner with their multi-racial children?

    No. Not even a white working-class couple sitting down to Christmas dinner with their blond-haired, blue-eyed children. Just a little dog in the snow and a “home, family, friends, coded-Christmas” greeting.

    But that’s not all, Rush. It gets better. The picture shows the dog, wearing a scarf no less, in front of a grand, immense mansion. His head is held up straight and high, aligned perfectly with the stately pillars of the White House, as if he were marching in a military parade. And it’s all framed in a thin line of gold. Open it up and the message is embossed in gold, under the seal of the president, rendered in fine detail in the same gold.

    Why, when I hold this in my hands I feel like I’ve been magically transported to Romneyland. Come to think of it, suppose Mitt had won and the Republicans sent a Christmas card to all the volunteers from his campaign. What would be different?

    Well, they might leave out the dog, because that would remind people of the “tied to the top of the car” story. But surely they would have found some equally traditional Christmas-y picture full of snow, and done it in equally elegant style, with the same visual allusion to the martial dignity of America. Beyond that, only the names would be changed.

    So apparently the Democrats’ best PR pros think that an elegant Romneyesque vision of “traditional America,” filled with gold, will warm the hearts of Dem loyalists. What do you make of that, Rush?

    I take it as a coded message, not merely that Christmas is still the top-dog holiday around here, but that your idea (I call it your myth) of “traditional America” is very much alive and still packs an emotional wallop.

    Yes, your myth took something of a hit this last election day. But it wasn’t such a serious blow. Last I looked, your guy got over 47 percent of the votes and my guy less than 51 percent. My guy did a couple of points better four years ago. But there was a congressional election in between where we got slaughtered by “traditional America.”

    I bet the wizards who plot strategy and make Christmas cards for the Democratic National Committee remember that slaughter very vividly and aren’t nearly so sure as you are that you’ve lost your country. At least, they want us Dem activists to know that we had still better give lip service to “traditional America.”

    I suspect it’s more than that, though. I suspect that at least the Christmas-y piece of the “traditional America” myth is still meaningful in some (perhaps subliminal) way to a lot of dyed-in-the-wool Democrats. They don’t think Christmas is about Santa handing out stuff -- and no Dems I know (which is a lot) think that government is about playing Santa, handing out stuff.

    But they do have some sentimental attachment to the Norman Rockwell version of America and all the values it represents. They are even impressed (though they might hate to admit it) by the elegance of gold.

    So cheer up, Rush. You’ve got the old American myth on your side of the political fence. And old myths die hard -- so hard, apparently, that even a lot of us on the opposite side of the fence are still hooked into your “traditional America.” 

    But here’s the best news: The deepest message of this card from the Obamas is that they love a lot of old American traditions, and they assume plenty of us Dem loyalists do too. We’re all patriots, on your side and ours. We all love the same country and want the best for it, even if we have different ways of getting there.

    Merry Christmas, Rush!

    Ain’t No "Cliff," Pardner; This Here’s a "Showdown" Solidarity poster from Poland in 1989 -- an effective use of the "showdown" myth in politics. Credit: Wiki Commons.

    Progressive groups are trying to rally their troops to stop any cuts to Medicare, Medicaid, and Social Security. They may wish they could turn out crowds large and noisy enough to make a media splash, the way the Tea Party did a couple of years ago. But their troops are all volunteers, and as far as I can tell not enough of them are showing up for duty to make that media splash.

    Barack Obama and his ax-wielding budget aides will draw the obvious conclusion: Most people say they oppose cuts to the big three “entitlements.” But they don’t care strongly enough to make any noise about it. Mostly what they want is to stop hearing about the dangers of the “fiscal cliff.”

    So Democrats can make cuts to the big three, satisfy the Republicans, end the “fiscal cliff” crisis, and pay a very small political price. In fact the Dems will probably come out with a higher rating in the polls because they’ll show that they can “make Washington work.”

    That’s probably what’s going to happen in the next few weeks, unless some progressive crowds get out there with Tea-Party-like enthusiasm and start screaming “No! Stop!”

    Why aren’t they out there yet? One reason, I’ve suggested, is that progressives have not challenged the metaphor that everyone uses to describe the situation: We’re headed for a “cliff.” Every metaphor tells a story. And the stories we tell shape the way we view things, which in turn determines the policies we’ll adopt or reject.

    The story of the “cliff” tells us that apocalyptic peril looms ahead. We’re all in this together, and if we take one more step in the wrong direction we’re doomed. But we don’t have any consensus on which direction is the right one. Most people, facing that kind of threat, are afraid to take a step in any direction. So they just stand still, cling to the status quo, and turn more conservative. 

    Recently I learned that there are some progressives who understand the power of metaphor. I met some folks who are organizing to save Medicaid. They certainly want Medicare and Social Security protected too. And they’re not talking about any “cliff.” They are talking about the “fiscal showdown.” 

    All of a sudden the whole situation looked different to me. It’s not all of us together rushing toward a precipice, trying frantically to figure out where to direct our collective steps, constantly bumping into each other -- and sometimes trampling each other -- in our panic. If it were, we’d have good reason to feel paralyzed, afraid to move at all.

    No, the “showdown” metaphor gives us two clearly defined groups -- good guys and bad guys -- facing each other in a fight to the finish. We each get to choose which side is good and which is bad. But once we’ve made the choice, we get to stand with the good guys and join in the fight. We get to take action.

    Once the good guys defeat the bad guys, the people who have been blocking progress toward a better life for all are gone. The way is clear to make all sorts of improvements for our society and everyone in it.

    Sure, for progressives that’s a fantasy. Even if the Republicans go down to terrible defeat in this round of negotiations (which is hardly likely, given their majority in the House), they’ll bounce right back and start trying to force some other horrible new policies on us.

    But imagine if all the headlines were about the “fiscal showdown,” not the “fiscal cliff.” “Showdown” is an energizing fantasy. It creates a feeling that we can eventually “clean up this town, make it a decent place where fine folks will want to raise their families.” I think I heard that in a movie or two, or actually a few dozen.

    The film history of the “showdown” -- with its familiar mantra, “draw, podner” -- reminds us that this metaphor is classic Americana. The good guy is the all-American kid. Whatever virtues he represents are, by definition, all-American virtues. And he’s expected to win an unconditional victory over the bad guy. At the OK Corral or anywhere else, the “showdown” has a fine patriotic pedigree.

    If progressives go out into the street for a “fiscal showdown,” they’re acting out a traditional American drama. In a strange way that makes them more appealing to the rest of the public, even to the most conservative among us.

    On the other hand, if we are hurtling toward the cliff the best we can hope for is to avoid disaster at the last minute. The only film prototype I can think of is James Dean as the Rebel Without a Cause. That’s hardly an appealing image if progressives hope to get their message beyond their already rebellious circles.

    Those of us who are committed to nonviolence may not feel very comfortable with the traditional American “showdown” metaphor, since it’s so loaded with overtones of violent death. But we don’t shrink from confrontation any more than Gandhi or Dr. King did. The “showdown” we want isn’t between two groups of people. It’s between two sets of policies, each with its underlying values and mythic narratives.

    When we support more funding for Medicare, Medicaid, Social Security, and all the government’s other human service programs, we are going out to fight for a society where we are all interconnected; all threads in a single garment of destiny; each caring deeply for and feeling responsible for the well-being of all others. We are fighting against a rampant, uncaring individualism built on greed and selfishness.

    That’s what this fight is really about. And when you bring it down to that level of basic values, it’s hard to see how anyone can advocate compromise. Because if greedy individualism wins, we all lose -- even the richest among us, though they don’t know it yet. So the only way to avoid sending our society over the moral as well as fiscal cliff is to make sure progressives win this showdown.

    And here in America, the traditional place for a showdown is in the street, out in public, where everyone can see the victory of right over wrong.

    America's Proud Individualism Helped Pull the Trigger

    [Image credit: Flickr.]

    I know it’s foolish hubris to hear about a tragedy like the school shooting in Connecticut and then immediately start writing about it. But many of us who blog do it, at least in part, as a way to deal with feelings that otherwise might overwhelm us. It’s cathartic. And it’s our wager that, in the process, we’ll say something helpful to others who are trying to make a little bit of sense out of at least some corner of the tragedy.

    Convincing explanations of any kind are ultimately bound to elude us. All one can do is try to shed a little light on a little piece of the immense calamity, from one’s own particular viewpoint. I naturally think about American mythic traditions that seem relevant in this situation.

    After the mass killing in an Aurora, Colorado movie theater last summer I noted a point that Washington Post wonk Ezra Klein confirms in a very useful post today: While the American public generally supports a number of specific gun control proposals, when pollsters ask about “gun control laws” in the abstract a growing number of Americans say they oppose them. And pollsters consistently find that mass killings do nothing to increase support for gun control.

    Back then I suggested that “when nations, like individuals, try to go in two directions at once they get paralyzed. That’s where we are on the politics of gun control.” I added that the paralysis makes us ever more frightened and craving safety. The traditional American source of safety is a gun -- or two, or three, or more. I concluded that “the root of the problem is our dedication to the fantasy of absolute safety and security. The sooner we recognize that as our national fantasy and stop arming ourselves to the teeth in pursuit of it, the safer we all will be.”

    At the time I did not know that the killer had been in treatment with a very competent psychiatrist. I merely assumed that it’s mentally or emotionally disturbed people with guns who kill people, at least on such a mass scale. We still don’t know anything about the killer in the Connecticut school. But again that assumption seems to be a rather safe one.

    In other words, I start with the premise that the opponents of gun control are half right. Guns don’t kill people, as they like to say. But the other half of the truth is the part they won’t say: Mentally or emotionally disturbed people with guns kill people.

    And now I’m thinking about the connection between mental/emotional disturbance and the widespread resistance to the idea of “gun control,” which I assume comes from the mythic tradition that equates guns with absolute safety.  

    I’ve been working with a group in my community trying to promote public support for mental health treatment. It has made me very aware of the profound reluctance we see all around us (even in a very liberal and wealthy county like mine) to treat mental/emotional disturbance as a communal problem.

    To say the same thing from the other side: When we talk about mentally or emotionally disturbed individuals, our society puts the emphasis on “individuals.” Without really thinking about it, most of us assume that we’re dealing with peculiar cases, each one caused by some unique set of problems encased in one individual’s brain.

    We just don’t have many cultural resources at all to think about mental/emotional disturbance as a societal problem. Oh, there are shelves full of books in university libraries that can teach us to see it that way. But that academic perspective has not percolated through to our shared public myths. We still tend, as a society, rather reflexively to see troubled people as individual “weirdos,” unique outliers from the norm.

    And our natural inclination, most of the time, is to stay as far away from them as we can -- unless they are family members or otherwise connected to us in ways we couldn’t escape even if we wanted to. Then we try our best to get help for them. And we usually discover that the resources our society provides are far too meager to give them the help they really need -- precisely because, as a society, we don’t think of such disturbances as a collective problem. So we don’t even think about, much less provide the resources for, collective solutions.

I suspect this pattern has its deepest roots in a tradition that was pervasive through the late nineteenth century and still affects us deeply: viewing mental/emotional disturbance through the lens of religious and spiritual language. I’ve spoken with ministers who are trying hard to bring their fellow clergy into fruitful conversation with mental health professionals. It’s an uphill struggle, they say, in part because there are still many clergy who assume that personal prayer and spiritual renewal are the only appropriate treatment.

What we have here, to some degree that’s impossible to quantify, is a living legacy of the days when mental and emotional disturbance were interpreted as signs of sin. (“Evil visited this community today,” said Connecticut Governor Dan Malloy, as if the tragedy were caused by some distant, utterly alien metaphysical force.) Just as sin was seen to be the responsibility of the individual, so mental/emotional disturbance is still seen to be, if not the individual’s responsibility, at least an individual problem.

    The proud American tradition of individualism is also, I suspect, at the root of the popular resistance to gun control. Discrete gun control measures gain popularity because most people think that they will apply only to others. Things like background checks and no guns for felons -- or the mentally ill -- don’t apply to me, the average respondent in a poll assumes. But gun control in general means that I may no longer have the right to defend myself, my family, and my home.

    The curious fact (which I noted in my post last summer and Klein confirms) is that the actual number of American households with guns has declined fairly steeply in the last forty years. So the objection to gun control laws doesn’t come only from people who have guns and want to hold on to them (though they are the largest portion of the naysayers). It also comes from people who imagine that they might some day feel the need for a gun to protect themselves. They don’t want their individual freedom abridged.

So here is the picture we end up with: an image of a nation where at least half the people (or more, depending on the poll) assert their individual rights by opposing gun control laws, while uncounted millions are walking around with serious disturbances locked up inside them -- disturbances that occasionally burst out with horrific consequences. It’s a picture made up of 300-plus million separate individuals.

    Most of us see it that way because we don’t have the cultural traditions -- the myths, I’d say -- that would let us see both gun ownership and mental/emotional disturbance as societal facts, as manifestations of what the community as a whole is doing.

    So we go on letting individuals arm themselves to protect their individual rights and freedom, or so the myth tells us. (Illinois just became the 50th state to allow citizens to carry concealed guns.)  But we tragically underfund and ignore societal programs to help the mentally/emotionally disturbed, because we simply don’t see any relationship between them and the rest of us, or so the myth tells us.

    In such an individualistic nation, the recipe for absolute safety seems simple enough: Give everyone the freedom to carry a concealed gun, and stay as far away as possible from those “weirdos.” We’ve just seen, in a Connecticut schoolhouse, what that recipe produces.

    Gun Control: The New Abolitionism?

    Guns and violence are “a deep illness in our society,” columnist Frank Rich opines. “There's only one other malady that was so deeply embedded into the country's DNA at birth: slavery. We know how long it took us to shake those shackles. And so ... overthrowing America's gun-worship is not a project that will be cured in a legislative session; it's a struggle that's going to take decades.”

    I wonder if Rich is too pessimistic. He assumes that the gun-control issue is now where the slavery issue was in perhaps the 1820s, when the abolitionist movement was just beginning to gather steam as an organized Protestant reform effort. But that doesn’t seem a fair comparison.

    There has already been a well-organized, well-publicized gun control movement in the U.S. for decades. And it has already had a brief era of great success, in the early 1990s: the Gun-Free School Zones Act in 1990 (revised 1995), the Brady Bill in 1993, and the 10-year assault-weapons ban in 1994. That era was followed by a strong and relatively successful reaction from anti-gun-control forces, leaving us now with a common but mistaken impression that most Americans have always been reactionaries on this issue.

If the analogy is to the slavery debate, it might be more accurate to think of 2012 as akin to 1852. In the preceding years pro-slavery sentiment in the South, and the pro-slavers’ political clout in Washington, had grown much stronger. Then Harriet Beecher Stowe’s epochal novel Uncle Tom’s Cabin appeared. The immensely popular book, and the many dramatizations of it that were quickly produced, gave powerful new energy to the anti-slavery movement.

    Although historians are supposed to refrain from predicting the future, there is no rule against imagining hypothetical possibilities. So I’ll suggest, with lots of qualifiers, that it’s possible that the dreadful murders in Newtown might turn out to play a role in some way akin to Uncle Tom’s Cabin.

Who would have thought that Barack Obama, so deeply immersed in such delicate negotiations about taxes and budget, would run the risk of publicly advocating specific gun control measures? He called for banning the sale of military-style assault weapons and high-capacity ammunition clips, and for requiring background checks before all gun purchases. Granted, they are popular measures, as Obama himself admitted.

    But there will be plenty of pushback from the National Rifle Association and other pro-gun groups, who have proven very effective in the past. So the president knows he is taking a considerable political risk.

    In fact, if the 1850s is the appropriate decade for comparison, it’s a safe bet that the movement Obama has now joined will suffer losses in the near future. The anti-slavery movement was shocked by the Kansas-Nebraska Act in 1854, the ensuing battle over “bloody Kansas,” the Dred Scott decision in 1857, and the hanging of John Brown for raiding the Harper’s Ferry Arsenal in 1859 (just to name the most influential events).

Yet each of those shocks ultimately had an effect similar to the shock we received when all those little children and their teachers were killed in Newtown. They redoubled the commitment of reformers to create political change, and therefore they heightened the tension between the opposing political forces, a tension that ultimately led to massive change.

    So the lesson of the 1850s is that no one event is likely, by itself, to transform public attitudes and policies. But a series of events, each one profoundly shocking, can have that effect. When the first of those events occurs, no one can know for sure that it is the first of a history-changing series. That’s something we can only know in retrospect. But we can know that change does sometimes happen in a series of spasmodic leaps.  

    There’s one more interesting parallel to consider. Throughout the 1850s, the total abolition of slavery always remained a minority view. The history-changing events of the decade never made the abolition of slavery a broadly popular opinion. The broad wave of support, spurred by every tragic turn of events, was for “free soil”: banning the extension of slavery to places it was not already legal.

    That was clearly Abraham Lincoln’s position, the major plank on which he won the presidency.  Only under fierce pressure to win the Civil War did he become “The Great Emancipator,” the prophet of total abolition.

    Similarly, there is no serious talk now of a total ban on the sale and/or possession of guns in the United States. Barack Obama knows it would be political suicide to endorse such an extreme position, just as Lincoln knew in the 1850s that total abolitionism would be political suicide.

    But the lesson of Lincoln’s career is that political issues and causes have a life of their own. Once you join or endorse them in even a partial way, there’s no telling where you might end up. The fates forbid that we ever have to endure anything remotely like the bloodshed of the Civil War, for any reason, including the eventual banning of guns. But even without violence history can lead us to very unexpected outcomes, sometimes in very sudden leaps, as we are learning right now.

Fiscal Deal: And the Winning Myth Is ….

    As Congress and the administration went through their tortured post-election wrangling (or was it a dance?) over fiscal policy, Americans never seemed quite sure what mythic lens was best suited to viewing the proceedings.

Were we watching ourselves, all together, hurtling toward a cliff and trying desperately to avoid plunging over it? Or were we divided into two political and ideological camps, approaching a final showdown? I explored both the “cliff” and “showdown” metaphors in the run-up to the New Year’s denouement.

    Now that a deal has been done and we can watch the public reaction through the news media, which mythic metaphor is the winner?

    The many “Who won? Who lost?” evaluations seem to support the “showdown” view. The widespread view that this is merely round one, with more battles between the two parties sure to follow, also seems to give the nod to the “showdown” metaphor. Or perhaps, instead of a single showdown, we’ll start talking about a long, drawn-out "war."

But take a closer look. “Deal done, but Threats Remain; ‘Cliff’ deal averts economic disaster but hazards linger,” the headline article on the Washington Post website gravely warns us. USA Today titles its roundup of opinion “‘Fiscal cliff’ deal doesn’t bode well,” and the editors of that paper conclude that the deal “only resets the stage for the next suspenseful act.” (At least they understand that their job is to turn complex economic and political problems into dramatic stories.) The editorial page editor of the New York Times sums up the common view: “The Cliff is Dead. Long Live the Cliff.”

    So we’ve actually ended up with a story that blends the two dominant metaphors. It tells us that we are still heading toward, or perhaps teetering on, the brink of a disastrous cliff, precisely because more showdowns between the two major parties lie ahead.

    This isn’t a myth that Americans are very familiar with. The closest parallel might be the “government shutdown” deadlock of 1995, which led to two brief suspensions of many federal services. But few people are likely to recall that as a dreadful disaster; the immediate aftermath that’s best remembered is a spike in Bill Clinton’s popularity ratings.

Franklin D. Roosevelt tried his best in 1937 to depict a political showdown as a looming economic disaster for the nation. But his public showdown was principally with the Supreme Court, and only secondarily with conservatives (especially Democrats) in Congress. He lost both fights, and though the economy continued to struggle there was no precipitous decline (in part because the Supreme Court ended up approving a number of New Deal measures that FDR feared would be struck down).

    Perhaps the closest historical analogy to the doomsayers’ view of the future would be the 1850s, when intractable political struggle split the nation apart. But that was not a struggle over economic policy (at least not in the public imagination). And the result was a temporary calamity resulting in long-term benefit to the nation, as far as most Americans are concerned.

    So the idea that an ongoing political war over economic policies might truly bring national disaster has few if any roots in the soil of the American mythic imagination. 

    Of course the idea of living on the edge of disaster has powerful roots in modern American history in the realm of foreign affairs. During the early Cold War years Americans became accustomed to living on “the brink,” as it was commonly called, of nuclear war.

In 1956 Secretary of State John Foster Dulles explained in Life Magazine that “the ability to get to the verge without getting into the war is the necessary art” of Cold War diplomacy. Dulles left no doubt that he and the whole Eisenhower administration had mastered the art of brinksmanship. Historians still debate that claim vigorously.

    But there’s little debate that during the Eisenhower era the myth of homeland insecurity -- the story of a nation living constantly on the brink of catastrophe, in what the president called “an age of peril” -- became “the new normal,” as a White House staffer put it. So Americans were afraid, but not terribly surprised, when Ike’s successor, John F. Kennedy, had to face the brink most directly during the Cuban missile crisis.

    Nor should anyone have been surprised when the myth of homeland insecurity arose again with such power within hours after the Twin Towers fell on September 11, 2001. The sense of a nation living always on the brink, facing a constant threat of destruction, had indeed become “the new normal.” So -- again, no surprise -- Vice President Dick Cheney raised few eyebrows when he said that we’d have to get used to “the war on terror” as “the new normal” forever.

    When it comes to national mythology, the lesson of the Cold War years was that domestic political showdowns come and go; they’re most commonly described as “squabbles.” But they’re all fought out under the shadow of permanent threat, in a nation teetering at the edge of extinction.

That lesson endures. So the showdown -- even if it is seen only as the first battle in a long war, the first act in a protracted drama -- will become merely a way to explain the most basic “fact”; that is to say, the winning myth: America is doomed to live on the edge of the “fiscal cliff” for a long, long time. The New Year fiscal deal has confirmed the view so many already held: Our economic plight is now “the new normal.” We might call this the new normal myth.

    As long as the “cliff” myth prevails, it will carry many of the same implications as the Cold War “brink.” When you are facing catastrophe your highest priority is to protect yourself and what you already have. It’s only logical to avoid making any major changes or even thinking about any substantial innovations. They’re simply too risky for people teetering on the edge of a cliff.

    It's no wonder that this “showdown” was a drama acted out by extremely cautious people taking only the smallest, most cautious steps. Nor should we expect anything else in the future, as long as the “cliff” myth prevails. 

    "Fix the Debt": Sheer Hypocrisy or a Myth Worth Debating?

The New York Times has just published an exposé on Fix the Debt, “a group of business executives and retired legislators who have become Washington’s most visible and best-financed advocates for reining in the federal deficit.” It turns out that “close to half of the members of Fix the Debt’s board and steering committee have ties to companies that have engaged in lobbying on taxes and spending, often to preserve tax breaks and other special treatment.” The Times gives plenty of examples to support that charge.

    I’m shocked. Shocked. Why, I wouldn’t be surprised if tomorrow’s Times reveals that there’s gambling going on in the back room at Rick’s Café Americain.

    Actually, the analogy with the film Casablanca may not be a bad one. Mr. Rick certainly makes enough money from his business establishment to live quite comfortably, dress impeccably, and keep his place among the city’s elite. But we know that he’s in no way just a greedy money-grubber. He aims to be a useful citizen, providing a service that his city needs and offering it at a high level of quality.

    I suspect that Erskine Bowles and Alan Simpson, who co-founded Fix the Debt to promote economic policies along the lines of their Simpson-Bowles plan, would probably say much the same about themselves. So would the other leaders of Fix the Debt who were outed by the Times.

    They would make the case that they aren’t merely out to engorge their own bank accounts. Smart people who go into government or lobbying know that they are not going to get truly rich that way. They are going to help other people get truly rich, though they’ll be well-paid for their services along the way.

    Nor, they’d probably claim, are they hypocrites. Though the Times article never uses that word, it’s the word that will spring to the minds of many readers. The article leaves a clear impression of men (they’re all male) who claim to be serving the public good while actually serving their own private interests.

    Fix the Debt leaders might well protest that this view rests on a false dichotomy, as if one must choose between public good and private interests. On the contrary (I suspect they’d argue), the genius of capitalist democracy is to erase the conflict between those two. Our system is set up so that the only way to improve material life for everyone is to let the truly rich get even richer. Call it “a rising tide” or “trickle down.” Either way, if we limit the wealth of the truly rich we all suffer. So enriching private interests is the best way to serve the public good.

    If they are historically minded, they might point out that most of the Founding Fathers saw things this way, too. Alexander Hamilton articulated this view at great length on behalf of the Federalists, who didn’t think a fledgling democracy could survive unless it had a strong central government making thoughtful arrangements for the very rich to get richer. 

Jefferson and Madison disputed that view on behalf of the eighteenth-century Republicans. But they came from the wealthy, slave-owning elite of Virginia. Jefferson argued against abolishing the slavery that made him rich because it would bring the whole socio-economic system crashing down upon everyone -- rich and poor, white and black, alike. And as president he used the power of government in many ways that helped the truly rich get richer, because he saw those policies as necessary to serve the public good. Every president since has done the same thing.

    There are many good arguments, both economic and moral, against the view that our society, or any democracy, must or should or really does work this way. But Simpson, Bowles, and the rest of the Fix the Debt crew probably have never taken those arguments seriously; perhaps never even heard them. Nor did most of the Founding Fathers. The rich and their top-flight hired hands generally live in a restricted social circle, where they meet and hear only other economic elite figures like themselves.

    As in any social circle, their conversation is based on shared premises. It constantly reinforces their shared story of how human life works. In other words, they hang out with each other and keep telling each other the same myth. They never get a chance to hear any other myths. So why shouldn’t they genuinely believe their own? I’m not saying they do believe it. I’m just suggesting they might.

    It’s worth considering because, if the rest of us assume that they are merely money-grubbers, most of us may easily dismiss the Times’ story with the same mock shock that was my first reaction. Of course they are gaming the system for their own profit, we’ll say. That’s how the system works. It’s the way of the world. So we’ll have our few minutes of righteous indignation and then go on our way, reaffirming our own myth about the selfishness of the rich.

But suppose a major newspaper ran an exposé on the mythic worldview of Fix the Debt’s leaders. Suppose it was clearly explained and traced back to its roots in the worldview of the Founding Fathers. Then at least some of us would feel that we ought to start thinking about it.

    Just what is wrong with their assumptions? If we don’t want our society based on those assumptions, can we reform the current system but still keep its basic structure? Or is their elite worldview the foundation of that structure? If we challenge their way of thinking, must we soon find ourselves talking about a revolution?

    Back in the 1790s, the Republican attack on the Federalists raised these issues, albeit in a restricted way. Jefferson and Madison never seem to have taken a really long, hard look in the mirror. But even if they had a disturbing measure of hypocrisy, they did spark a debate about whether the United States really needed a federal government helping the truly rich get richer. Sometimes, at least, that debate was conducted on a fairly sophisticated intellectual level.

    In the past year the United States has flirted with at least a few threads of that debate, and even that very limited flirtation has been very healthy for us. We should take every opportunity to continue it. If we cut it off with a curt “They all do it,” we steer ourselves toward an intellectual and political dead end, which cuts off any possibility of meaningful change. We ignore the myths of the economic elite at our peril.

“Ike” and the “Red Menace”: Some Myths Won’t Die

Martin Luther King, Jr. with President Eisenhower in 1956.

You probably know the mythic Dwight Eisenhower, the “great peacekeeper in a dangerous era,” who bravely withstood the communist threat while skillfully avoiding all-out war. The quote comes from Evan Thomas, the latest writer to make a mint by retelling the tale. It would hardly be worth noticing, except that pundits keep trotting out the mythic Ike as a model for the current president to follow.

    Latest example: the Washington Post’s influential foreign affairs columnist David Ignatius, a dependable megaphone for the centrist foreign policy establishment. He’s praising  Thomas’ book, Ike’s Bluff, for supposedly showing us how a great president deals with “continuing global threats … that require some way to project power.”

    Thomas’ book bears the grandiose subtitle “President Eisenhower’s Secret Battle to Save the World.” Save it from what? Why, the “red menace,” of course. And now, says Ignatius, Obama must deal with al-Qaeda and Iran -- who are also, presumably, threatening to destroy the world. Eisenhower had to stop the communist “advance in Europe and around the world,” Ignatius writes. “Obama has a similar challenge with Iran.” Then he tacks on al-Qaeda as the other looming threat to our national security. It’s the myth of homeland insecurity, as clear as you’ll ever see it.

    When I call this a myth, I don’t mean it’s an outright lie. Like most of the myths in American political life, it blends some number of facts with a sizeable dose of fiction to create a narrative that expresses basic assumptions about the world and shapes government policies. 

    For example: In the very fluid situation created by the devastation of World War II, the U.S. government saw a chance to install its capitalist system solidly everywhere except the Soviet Union. Stalin, seeing his nation potentially encircled by an enemy, naturally did what he could to promote Soviet influence throughout Eurasia.

    Eisenhower made this the stuff of myth: “Russia is definitely out to communize the world,” he wrote in his private diary. “Now we face a battle to extinction.” In 1953 Ike carried this fear-stoked exaggeration into the White House. He wrote in private letters that the Soviets were “seeking our destruction,” and his goal was to prevent “the Kremlin’s control of the entire earth.” 

To achieve that goal, he was absolutely ready (though certainly not eager) to use nuclear weapons. Sorry, Evan Thomas, but Eisenhower was never bluffing. He told his National Security Council that “if the Soviets attempt to overrun Europe, we should have no recourse but to go to war.” He was equally ready to use nukes to end wars in Korea and Vietnam, he told the NSC, if he thought it necessary. In 1958 he said much the same about the standoff over Berlin.

Eisenhower understood the risks. But he summed up his view quite succinctly to the British ambassador: “He would rather be atomized than communized.” In his mythic worldview, those were both very real possibilities. However, the risk of being atomized arose only because he was approving the most rapid buildup of weapons of mass destruction in U.S. history and making sure that disarmament negotiations could never succeed.

    Ike did all this because he took for granted the mythic threat that he, and so many other Americans, had created out of their own fears: the “red menace.” Driven by this image of imminent danger, he sowed all the seeds of a nuclear confrontation that could “atomize” the world. It was largely just luck that allowed him to escape the ultimate showdown.

    His successor wasn’t so lucky. JFK had to taste the bitter fruit that grew from the seeds Ike planted.     

    Despite all this history, which is plain enough to anyone who reads the once-secret documents of the era, the mythic version of Eisenhower continues to be held up as a model that current presidents should follow.

    So pundits like David Ignatius encourage Barack Obama to threaten Iran with “economic, military and political destruction if it refuses to make a deal” -- on U.S. terms, of course, which is bound to stiffen Iranian resistance. And he encourages Obama to continue using lethal drones to kill people, without knowing who they are or what their attitudes toward America might be -- which is sure to turn attitudes in the victims’ communities against America.

    But all this is done in the name of “national security,” to contain supposed threats that are imagined to be as ominous as the “red menace” that once dominated America’s public imagination. What do we gain by letting our imaginations run away with us again?

    Evan Thomas is right on one point: “Public terror was a price” -- the price, I would say -- that the nation paid for Eisenhower’s policies. Why do so many “foreign policy experts” want to take us back to that era of terror, or create a new incarnation of it?

    The answer involves more than cynical manipulation. Those “experts” may very well be sincere when they tell us about the terrifying “global threats … that require some way to project power.” The more they discuss the “sources of insecurity” with each other at their high-level conferences and expense-account luncheons, the more they convince each other that their myth is literal fact.

    The same goes for the politicians tutored by the experts. Sure, the politicians will lie to get specific policies implemented. But when they tell the tale that shapes their policies -- the story of “impending threat to our national security” -- there is no reason to assume that they are bluffing us.

    That’s what I learned from reading thousands of pages of Eisenhower’s letters, diaries, and private conversations. No one can ever know what was in his mind. But in the documents there was never a hint that he was consciously purveying an invented “red menace” narrative. On the contrary, everything he said seemed to take for granted the truth of that myth.

So who knows? The pundits who equate “the Iranian threat” with “the red menace” may really believe it. Barack Obama may believe it too. Looking back to the Cold War years teaches us how dangerous it is when the “experts” and national leaders take their own myths seriously.

Of course we should debunk the falsehoods they purvey. But debunking alone doesn’t weaken the power of a myth. It takes a new narrative. That’s something to think about as we approach a unique convergence -- Inauguration Day and Martin Luther King Day on the very same day. The president, beginning his second term, is hardly likely to give us a radically new narrative. Dr. King already gave us one, many decades ago.

Why Does a Hostage Crisis Fascinate America?

Camels in the Sahara near In Aménas, the site of the hostage crisis. Credit: Flickr/albatros11.

    Barack Obama and his political advisors surely thought that gun control would dominate the headlines for days to come after the president announced his controversial proposals. But some armed men in a remote gas drilling site in the Sahara desert had other ideas.

    The pundits love to tell us that a president who focuses on domestic policy is inevitably frustrated, because there are bound to be unexpected crises abroad that demand his, and the nation’s, attention. But there’s really nothing inevitable about it. It’s a choice that the public, and the news media who must sell their wares to the public, make.  

    Certainly the lives of the people at risk in the Sahara are important. It’s a tragedy when anyone is killed. But let’s face it. A handful of American lives may be lost in Algeria; maybe not. Whatever the outcome, this incident will soon disappear down the American memory hole.    

    In the gun control debate, on the other hand, we’re talking about a continuing threat to a huge number of Americans. Thousands of lives will surely be lost this year, and next year, and the year after that, ad infinitum, if the laws aren’t changed. Yet the gun control issue was quickly eclipsed by the public’s rapt attention to the hostage drama in the desert.

Historians may not be surprised. White Americans have been fascinated by stories of their own people being taken hostage by “bad guys” ever since the seventeenth century.

    Back then, many colonists were captured by native warriors. To the native Americans it was perfectly logical: Whenever some of their people were killed by whites, they would capture -- not kill -- a roughly equivalent number of whites to replace the lost members of their community. It wasn’t about good destroying evil. It was about maintaining an approximate balance.

    But the whites didn’t understand that. As most of them told the story, absolutely good (white) people were locked in an endless struggle with absolutely evil (native) people. When whites were taken captive by natives, whites typically saw it as a violation of the most basic moral rule: good should triumph over evil. When whites escaped, it was easy to explain it as an act of God, restoring the proper moral order of the universe.

    That’s how Mary Rowlandson told the story of her captivity in The Sovereignty and Goodness of God (1682), a book that quickly became a best-seller and continued to be widely read for more than a century.  But Rowlandson’s is only the most famous of the many so-called captivity narratives that have captivated the imaginations of white Americans ever since.

What makes these stories so compelling? Historians have made a cottage industry out of finding new answers to that question. One intriguing theory begins with a well-documented observation: Plenty of captured whites were in no rush to go home. A good number chose to “go native” and live out the rest of their lives among the Indians.

    This widely known fact freaked out a lot of white people; it turned their world upside down. If the “good” voluntarily chose to blend into the “evil,” how could anyone be sure anymore where the line was that separated the two? And if that line was blurred, how could there be any moral order at all?

    What these whites needed, above all, was reassurance that the moral line dividing them from the native people was absolute, impermeable, and immutable. That’s why captivity narratives were so popular (this theory goes): In these tales, the whites were always absolutely good and the natives absolutely evil. Telling and reading the stories over and over again was a way of reaffirming the simplistic moral fantasy as the true reality, which made it easier to treat the observable, empirical world as if it were not real.

    Is it still going on today? The parallel is far from perfect. There’s no evidence that armed Muslim forces want to capture white Americans to populate Muslim communities. Yet white America still has an insatiable appetite for captivity narratives.

    And the reasons behind that appetite may very well be much the same.  Even if very few white Americans have visibly “joined Al Qaeda” (whatever, exactly, that might mean), lots of white Americans feel increasingly unsure that they can see a clear-cut, absolute line between good and evil.

    The gun control debate is a fine example. While some Americans are sure that stricter gun laws would be good, and some are sure those same laws would be evil, a vast number in between aren’t quite sure of anything. So the public as a whole is in a state of moral confusion. The same is true, of course, about so many other issues.

    On at least one point, though, there is an overwhelming consensus: Al Qaeda (whatever it may be) is evil. So when America is attacked by, or pitted against, Al Qaeda, America is self-evidently good.

Ditto when Americans are captured by Al Qaeda -- especially if the capture is “masterminded” by a fierce-looking, black-turbaned, one-eyed Muslim with the exotic name Mokhtar Belmokhtar, AKA “the Uncatchable.” It sounds too wicked to be true, like something “straight out of central casting,” as the Times of London said -- although in a grade-B Hollywood movie, where we would expect to find him, he would be called simply “the Evil One.”

It’s precisely the mythic quality of this story, and of all captivity narratives, that makes them so fascinating. In myth, as in Hollywood, all the world’s shades of gray can be boiled down to simple black and white.

    There’s a perverse sort of advantage when good people are captured by evildoers rather than killed outright. Attacks and battles are usually short-lived affairs. The story is told, and then we’re quickly on to the next story. Nothing is as stale as yesterday’s news.

    But a hostage crisis can continue for a long time. Day after day we get to see or read the captivity narrative. And each repetition offers more reassurance that, despite all our disputes and uncertainties, the struggle of good against evil goes on. So we know there are still some absolutes to provide order in our moral universe.

    This theory that explains the popularity of captivity narratives also explains why the public would so quickly switch its focus from gun control to the drama unfolding in the Sahara. The gun debate only reinforces the sense that no one knows any longer what’s good and what’s bad. The endless news about the hostage crisis eases that disturbing feeling and replaces it with a satisfying reassurance that, ultimately, all is still right with the world -- even if a bunch of people have to die to prove it. 

    (For a look at the mythic qualities of the gun control debate, see my recent post on ReligionDispatches.org.)

Inauguration Shows President as Prime Minister and King

Barack and Michelle Obama at his second inaugural. Credit: Flickr/Adam Fagen.

There were passages in Barack Obama’s second inaugural address that sounded like a European prime minister from a Labor or Social Democratic party addressing his Parliament. Obama had a whole laundry list of progressive proposals. Some were explicit:

    “Care for the vulnerable and protect people from life's worst hazards and misfortune” through “Medicare, and Medicaid, and Social Security”; “respond to the threat of climate change”; make sure that “our wives, our mothers, and daughters can earn a living equal to their efforts. … our gay brothers and sisters are treated like anyone else … no citizen is forced to wait for hours to exercise the right to vote”; “find a better way to welcome the striving, hopeful immigrants.”

    Some of the progressive program was implicit:

    Protect the environment with “the technology that will power new jobs and new industries” (presumably funded generously by government); “revamp our tax code” (presumably to make the rich pay more); “reform our schools and empower our citizens with the skills they need” (presumably with more public funding for education); keep “all our children … always safe from harm” (presumably through gun control laws).

    Yet Obama could not actually come across, on Inauguration Day, as a progressive prime minister. The occasion has traditional rules, written and unwritten, that bind any president, no matter what his or her political views. There must be pomp and ceremony, strict protocol, splendor and grandeur. There must be patriotic praise of America, religious praise of God, and ample assurance that the two are inextricably connected.

    In other words, the occasion must be a coronation, and the star of the show must act, to a considerable degree, not as a prime minister but as a king. While the particulars of the ceremony are uniquely American, its underlying structure can be traced back to royal rituals of the third millennium BCE, when a new king received his crown and scepter (typically from priests) amid the same kind of pomp and splendor.  

    One scholarly opinion explains these ceremonies in terms of a worldview that saw the state as an island of order surrounded by a threatening sea of chaos. The ruler and the axis connecting him to the gods were the linchpins of order. So the demise of a ruler was an immensely threatening event. The new ruler had to be installed according to an elaborately structured ritual to protect the vulnerable state from tipping over into chaos.

    The new ruler’s job was the same as the old ruler’s: to continue that protection by living every moment of his royal life according to the traditionally prescribed, ritualized rules. The state was a 24/7 dramatic production -- a “theater state.”  As long as the heroic lead actor performed perfectly, the order of the state (and, in most versions of the “theater state,” the world) would be preserved.

    A more skeptical scholarly view holds that the real intent of the coronation ceremony was to overawe the inhabitants of the state as well as its potential enemies, to impress upon them the immense power wielded by the new ruler. The same intent motivated the daily ritual and grandeur of the royal court after the new king was installed, this theory holds. If everyone was impressed enough with the king’s power, they would obey his commands and refrain from any kind of resistance. Thus the prevailing status quo -- the existing order -- would continue undisturbed.

    So both theories arrive, by different routes, at the same conclusion: The pomp and splendor symbolized a guaranteed assurance of permanent order in the face of an ever-present threat of chaos. Maintaining the status quo was the essential -- and essentially conservative -- purpose of the “theater state.”

    During the presidential campaign I wrote about the debates as an example of the “theater state.” I suggested one lesson from the survival of this ancient tradition in our democracy: Americans want their president to be, in some sense, like a king, offering “the reassurance that comes from seeing and hearing the same ritualized words and behaviors, over and over again, in a well-acted political theater.”  

    The presidential inauguration is more obviously a direct descendant of the ancient “theater state.” It shows more clearly that the president must be both prime minister and king. No matter how progressive he may want to be in the former role, his royal obligations force him to be the guarantor of the status quo, hence essentially conservative.

When Obama concluded his inaugural address with an appeal to the citizenry to “shape the debates of our time -- not only with the votes we cast, but with the voices we lift in defense of our most ancient values and enduring ideals,” he offered a fine example of the dual role that he must play.

    The call to lift voices is a pragmatic prime minister’s tactic: Mobilize public opinion in support of the ruling party, so that the opposition will see more political risk than benefit in blocking the PM’s program.

    The invocation of what is “ancient” and “enduring” also has a pragmatic purpose: to win over wavering centrists to at least some parts of his program. But no matter what a president’s political calculations may be, he or she is compelled to use such language on inauguration day. It is the obligatory royal language fit for the occasion of a coronation ritual. It is inescapably conservative language, because the conservatism inherent in the role of the king is inescapable.

    Barack Obama may be comfortable with that inescapably conservative element of his job, or he may be quite unhappy about it. After four years I still can’t tell. In any event, though, he is stuck with it because, in a democracy, government must give the people at least some of what they want.

    Thomas Jefferson thought that his victory over the Federalists in 1800 dealt a decisive defeat to the desire for monarchy in the United States. He made his inauguration an extremely modest affair to symbolize that point. So far, at least, it seems that Jefferson was wrong.

    Yet precisely because this is a democracy citizens can shape the outcome of the political process. Indeed, as Obama said, we “have the obligation to shape the debates” -- and, he might have added, the inaugurations -- “of our time.”

Zero Dark Dirty: The “Good War” Lives

    I once heard a prominent expert on contemporary Islam say that Al Qaeda is not an organized group (and this was while Osama bin Laden was still alive). It isn’t even, primarily, a group of people at all. Al Qaeda is best understood as a body of discourse, a way of talking.

    How do you fight a body of discourse? With another body of discourse, of course. The United States government is doing that in all sorts of ways, spreading the gospel of democratic capitalism and the American way of life.

    But how do you make a movie about a war between two bodies of discourse? If you want to win awards, pack the theaters, and turn a profit, you don’t. A good movie has to start with a mythic script.  And it’s awfully hard to find the myth in a war of discourse versus discourse.

    So you make a movie about a war of good guys against bad guys. That’s about as mythic as it gets. It’s the American war story that has been made in Hollywood a thousand times -- well, a thousand and one, now that we have Zero Dark Thirty. I’m finally getting around to writing about the film, after just about everyone else in the world has had their say, because I finally got around to seeing it. It turns out there was no reason to rush anyway.

    After all the controversy about the torture scenes, and Kathryn Bigelow’s highly publicized claim that her film merely depicts the horrors of American behavior in the “war on terror,” letting us all make up our own minds about the moral issues, I was expecting some complexity and ambiguity -- something like what she gave us in The Hurt Locker. 

Instead I got two hours and thirty-seven minutes of classic, mythic American war movie.

    I suppose the difference between Bigelow’s two films is a good index of the difference between the two wars they depict, as far as public perception goes. A popular film about the Iraq war was bound to be ambiguous because, once Saddam Hussein was gone, no one could tell exactly who the enemy was. They were simply (as so many American soldiers told us on the evening news) “the bad guys.”

But as long as Osama was alive, the war against Al Qaeda was a perfectly unambiguous war. In fact it was a “good war,” because it fit so well the prototype of all “good wars”: the war against Nazi Germany. Both were waged against forces that had, without a doubt, done terrible things. And in both wars American forces also did terrible things. But American deeds were rarely called into question because the enemy’s deeds were so indisputably evil.

    There is one major difference between World War II and the war against Al Qaeda: the Germans never made a significant attack on American soil. In that sense, the war against Japan is a better parallel to our current war.

    But mythically (and thus cinematically) the war against Germany remains the prototype, for many reasons, no doubt. Zero Dark Thirty reminds us of one big reason: German forces were led by a single arch-villain, the man who remains for Americans the epitome of evil. Leadership in Japan was more diffuse. Since World War II, Americans have needed an enemy led by a single “Hitler figure” before they would sustain support for a war. Osama was the Hitler-est of them all. 

    Zero Dark Thirty also fits the WWII movie mold by giving us a superhero who wins the day. Granted, a woman who defeats the enemy by brainy manipulation of digital data is a far cry from John Wayne in the trenches. She’s a fine measure of how much American culture has changed in the last half-century or so.

    Nevertheless, Zero Dark Thirty fits the WWII mold: a gripping story of one purely good person defeating one purely evil person (and an inept bureaucracy on her own side, to boot). It’s a dark, dirty job the superhero must do, even if she wears a clean white collar. But, then, as long as the evildoer is at large, commanding his forces of evil, it’s a dark, dirty world. Someone has to do the dirty work to clean up and purify this dirty world. Someone has to descend into the darkness to create a bright new light for all of us to bask in. And that someone, our national story insists, must be an American.   

    There’s another important parallel linking the war against Germany with the one against Al Qaeda. In both cases, neither the troops doing the fighting nor the general public knew very much at all about the beliefs, values, or ideologies that drove their enemies. They simply “knew” (that is, believed) that the arch-villain and his minions were evildoers who threatened the very existence of the United States and thus had to be stopped at all costs. That was the essence of the myth.

    Zero Dark Thirty reflects that myth quite perfectly. We never get a hint of interest on the part of the American fighters in why their enemy perpetrates violence. This is Hollywood -- or perhaps I should say, this is America -- and it just doesn’t matter.  As long as there is an American superhero pitted against a foreign arch-villain and our superhero wins, no questions need be asked. Perhaps that’s why all the controversy about this film has centered on the torture scenes, not on the simplistic, superficial, conventionally American triumphalism that prevents it from being a great film. 

    After the arch-villain is vanquished, though, there is one question to be asked -- the question that ends Zero Dark Thirty: Now that you are no longer threatened by the evildoer, “Where do you want to go?” I presume Kathryn Bigelow wants the audience to see Jessica Chastain, at that moment, as a symbol for America. Once we have defeated evil, where do we as a nation want to go?

    The tear falling down Chastain’s cheek tells me that the question is supposed to make us all cry. Why? Finally, in the last seconds of a very long film, a note of ambiguity: You decide why.

    I don’t have any trouble with that one. We Americans can unite so readily and act so effectively, as a nation, as long as we believe we are fighting an evil that threatens our country, or, to use Michael Sherry’s apt phrase, as long as we feel that we’re “in the shadow of war.”

    But suppose we could escape from that shadow into a world that is no longer dark and dirty? Could we unite and choose a positive new direction for our nation? Our history since the 1940s suggests that we have largely forgotten how to do that.

    We do fine when we are acting out our mythology of national insecurity. But if we try to think about acting out a mythology of hope and change, we don’t know how to change or even what to hope for. That is indeed worth shedding a tear for. 

Kerry Admits It: “Foreign Policy Is Economic Policy”

John Kerry embraces John McCain at his recent confirmation hearings. Via Flickr/Glyn Lowe.

    At his confirmation hearing, the new Secretary of State, John Kerry, declared flatly:  “Foreign policy is economic policy.” Now them is fightin’ words if they’re spoken by a scholar of U.S. foreign policy. Scholars of the “revisionist” school have been attacked, reviled, and marginalized for decades simply for saying what Kerry seemed to say: Economic motives are the main drivers of foreign policy. So when revisionists hear a top government official say it out loud, it’s like discovering gold: It’s hard evidence that their view is correct.

    And, like discovering gold, it doesn’t happen very often. When U.S. government officials speak in public, they are usually careful to say that American foreign policy has one overriding aim: promoting American values and ideals around the world. Those values and ideals hold true everywhere, the official narrative has always insisted. So our foreign policy goal is to promote the good of everyone, all over the world.

    It is permissible, sometimes obligatory, to add that U.S. foreign policy also aims to protect the United States against enemies. And that can readily lead to the goal of “promoting American interests.” But the official narrative assumes that the stronger America is on the world stage, the more able it is to promote its universally true values, which are the only key to world peace. So there can be no conflict between our interests and our altruistic ideals. That identity of interests and values has always been the bedrock of the official story.

    It was still the bedrock on Inauguration Day, when Barack Obama proclaimed: “We will defend our people and uphold our values through strength of arms and rule of law. ... Our interests and our conscience compel us to act on behalf of those who long for freedom. ... Peace in our time requires the constant advance of those principles that our common creed describes.” The official narrative seemed alive and well.

    But just three days later Senator Kerry -- a solid pillar of the foreign policy establishment -- had surprisingly little to say about values and ideals in his statement to the Senate Foreign Relations Committee. He did talk openly about “advanc[ing] America's security interests in a complicated and even dangerous world.” And he warned that “we will do what we must to prevent Iran from obtaining a nuclear weapon.”

But Kerry went out of his way to emphasize what revisionists have long seen as the most precious, closely guarded secret: Economic interests are the mainspring of foreign policy. And he treated it as if it were an obvious, ordinary observation.

    The establishment press put the spotlight just where the new head of State wanted it. “Kerry Links Economics to Foreign Policy,” the New York Times headlined. Though he “outlined no grand agenda for the next four years,” the Washington Post reported, “the closest he got to a foreign policy mission statement” was the simple equation: foreign policy = economic policy.

As Kerry explained himself, he gave a whole arsenal of ammunition to the revisionist argument. He introduced his discussion of economics and foreign policy this way: “It’s often said that we can’t be strong at home if we’re not strong in the world.” Then he warned that the U.S. is at risk of losing its “leverage ... strength and prospects abroad.”

    Leverage and strength for what prospects? Kerry gave several kinds of answers.

    The first sounded like classic revisionist theory: “The world is competing for resources and global markets. Every day that goes by where America is uncertain in that arena, unwilling to put our best foot forward and win, unwilling to demonstrate our resolve to lead, is a day in which we weaken our nation itself.” In other words, the global economy is like a huge pie. America’s strength is defined by how big a slice we get. The goal of foreign policy is to make sure we get a bigger slice than anyone else.

Kerry’s other explanations for building American “leverage” and “strength” were less direct. “The first priority ... as we work to help other countries create order ... will be that America at last puts its own fiscal house in order.” “Order,” revisionists point out, has been a central term in American foreign policy discourse for a long time. It’s a code word for a stable capitalist system, where capitalists can safely predict that they’ll get a decent long-term return on their investments.

    Kerry was warning that if capitalism can’t guarantee long-term prosperity in the U.S., foreign nations will not be so eager to accept American investments in their own land.

    He also had another kind of warning: “It is hard to tell the leadership of any number of countries they must get their economic issues resolved if we don't resolve our own.” Getting “their economic issues resolved” is another coded message, one straight from the fount of common capitalist wisdom: To create the “order” that makes investment safe, many nations must cut public expenditures drastically. To make the world safer for American investors, the U.S. government must use its “leverage” and “strength” to compel other governments to make those painful cuts.

    The underlying premise here is the premise of all U.S. foreign policy since at least the 1930s: America’s role in the world is to create and safeguard global “order” -- to make the world safe for capital investment, especially American investment. The U.S. is entitled, in fact obligated, to impose “order” everywhere, by any means necessary. Now that means, in most cases, imposing austerity.

    But, Kerry said, demands for austerity from the U.S. won’t be credible if we have a huge budget deficit of our own. So “the first priority of business which will affect my credibility as a diplomat ... is whether America at last puts its own fiscal house in order.” 

No doubt Kerry said all this to support Obama’s budget battle against the Republicans. Obama’s call (in his inaugural speech) “to act in our time” was a message to the GOP to quit their obstructionist ways and accept the centrist compromises the president is always ready to offer. The administration is trying to make the case on every front that the nation’s well-being demands it. Kerry was showing that he’ll be his boss’ loyal servant and sound appropriately urgent.

But Kerry’s eagerness to make the “Foreign policy is economic policy” case reflects more than short-term political tactics. It’s a sign that the official narrative of American foreign policy is changing, or at least is open to change. Top officials are ready to say openly what revisionists claim they’ve been saying privately, among themselves, all along (and revisionists have plenty of evidence to support that claim).

    Why the shift?

    The government always faces a major problem when it comes to foreign affairs: Not many Americans care much about the rest of the world, and certainly not about spreading American ideals throughout the world. Government officials have to come up with some other reason to justify their extensive involvements abroad and the tax dollars they spend on those involvements.

    It’s not so hard when there is some clearly identified enemy to fight -- as long as the public thinks their tax dollars are buying American victories. Now, though, the only “victories” are pinpoint attacks on “terrorists,” and Obama wants to preserve his freedom in that fight by keeping it secret. The obedient Kerry’s single mention of “terrorism” and “drones” was to downplay their importance.

    How can the whole foreign policy enterprise be justified today? At a time when public opinion focuses so single-mindedly on the economy, the answer is obvious: Just say, loud and clear, “Foreign policy is economic policy”; there’s a global economic struggle going on; we Americans need to be strong enough to win it; the only way to win it is to control economic life around the world.

    And there’s no great danger for an incoming Secretary of State to say all that, nor to have it headlined in the nation’s leading newspapers. The revisionists have been so effectively silenced that their cries of “I told you so” are not likely to cause much of a ripple. So there’s no reason for the foreign policy establishment to be afraid of their criticism.

    But there’s a lesson here that foreign policy revisionists might want to ponder. The stories that interpret and justify public policies -- I call them myths -- are created for political purposes. They can shift as quickly as the political winds. Sometimes those winds blow a heavy dose of truth into the myths. That’s why a myth is not a lie; it’s a mixture of truth and falsehood, with the proportions depending, in large part, on the political needs of the time.

    Precisely because the political winds can shift so quickly, groups that have little influence today may find themselves with a lot more influence tomorrow. So the revisionists have good reason to store up their political resources, polish up their own myths, and pack them with as much empirical truth as they can. The golden nugget offered by John Kerry is a treasure that can serve revisionists well for all three of those purposes.

    The Imaginary World of the “War Against al-Qaeda”

    The on-again, off-again debate is on again: Does the executive branch of the United States government ever have the right to assassinate American citizens without due process of law? A brave soul, who hopefully will remain nameless, has leaked an internal Justice Department “White Paper” outlining the Obama administration's reasons for answering “Yes.” A chorus of critical voices answers, just as loudly, “No.”  

    But most of the critics agree with the administration and its supporters on one point: The question here is about the executive’s power in wartime.

    If that is indeed the question -- a big “if” -- history offers a certain kind of answer. Lincoln, Wilson, and Franklin D. Roosevelt all pushed their constitutional authority to the limit during war -- and beyond the limit, critics in their own day and ever since have contended. Yet the overreach of these three presidents (if overreach it was) did little to tarnish their reputations.

    Even their critics generally place their actions in a larger context: It’s understandable, though regrettable, that war subjugates all other concerns to the overriding goal of victory. And imagine if the United States had lost any of those wars. Where would we be now? The “White Paper” assumes much the same question as its foundation: Who would countenance a president risking the security of the United States in wartime?

    So the document ignores the more basic question: Is this actually “wartime”? Is there a precise parallel between the situation this president faces and the wars his illustrious predecessors waged?

    The “White Paper” itself admits that this is a different kind of war: “a non-international armed conflict.” But it ignores the difference. It acknowledges that this is not “a clash between nations.” Yet it consistently treats al-Qaeda, for all practical purposes, as if it were a nation. And it uses all the reasoning that would apply to an old-fashioned war between two nations.

    This version of reality -- call it the “We’re at war again” story -- has been so dominant for so long that it’s easy to forget how it began. After the 9/11 attack, the Bush administration made a very calculated decision to declare it an act of war. There was an obvious alternative: After the botched 1993 attack on the World Trade Center, President Bill Clinton chose to treat it as a criminal act, to be addressed by the police and justice apparatus, not the military.

    A decade ago, there was still some public controversy about whether the Clinton or Bush approach was the best way to proceed. But that controversy didn’t last long. The war party’s story won out and is still winning out.

    Every story creates its own world, a world spawned in imagination. The  “war against al-Qaeda” story lends itself very readily to fiction; its world has been depicted in innumerable movies, novels, and TV shows.

    Now the “White Paper” offers a valuable confirmation that this imagined world has become the very real world of the Obama administration and the national security establishment. In many respects, it is the world in which all Americans live. The “White Paper” lets us take a good look at its mythic foundations.

    In this world, al-Qaeda is not a jumble of separate, vaguely connected cells (as many experts describe it). It is a virtual nation, with a unified, well-disciplined army whose “leaders are continually planning attacks.” Their purported motives are irrelevant; at least they are never addressed in the paper. All that matters is their one and only activity in life: ceaselessly planning attacks.

    To make matters worse, “the U.S. government may not be aware of all al-Qaeda plots as they are developing and thus cannot be confident that none [i.e., no attack] is about to occur.” In other words, we must live and act as if an attack were about to occur unless we have firm evidence to the contrary. And since that evidence can never be found -- How can you prove a negative? -- the threat of attack is “imminent” at every moment of every day. That’s the pivotal premise of the story.

    But who or what is always about to be attacked? Here the war story’s world gets a bit murky. On the one hand, the target is clearly the entire nation; the “White Paper” repeatedly insists that the president is acting only to protect the nation from attack. On the other hand, the document insists just as often that he is acting to protect individual Americans from attack.

    The two kinds of attack are treated as interchangeable. So the war story, in effect, makes every person in America an embodiment of the nation. An attack on any one, if somehow linked to al-Qaeda (or an “associated force”), is the equivalent of a whole al-Qaeda army invading our homeland.

    Is any attack on an individual American, by definition, really an attack on America itself and thus an act of war? Yes, the “White Paper” assumes -- if the attack is planned and carried out by al-Qaeda (or an “associated force”). Yet it never offers any argument to substantiate this claim. There’s no need for an argument. Within the world of the war story it’s a tautology: Since al-Qaeda is, by definition, at war with us, any violent deed it or its associates commit is, by definition, an act of war. Within another story -- say, Clinton’s story of 1993 -- the same deed would be a criminal act, calling for a hugely different kind of response.

    The “White Paper” occasionally mentions a third kind of attack, on U.S. “interests.” These remain undefined. But it treats any attack on our “interests” as equivalent to an armed invasion of the nation -- even if those “interests” are on the other side of the globe. In the war story, “the nation” is an expansive concept, indeed.

    Those are the highlights of the war story and the world it creates. The crucial question that the “White Paper” raises is whether this is the world we want to live in. Once we recognize that this world is a product of imagination, born from one story among several that we might have told after 9/11, we also recognize that we are not forced to live in this world. It is a choice.

    The ultimate results of this choice are clear enough. There are uncounted numbers of people dead. A few of them are U.S. citizens. Some (we shall never know how many) may actually be planning an attack that might kill people on U.S. soil. And some (more than we would like to imagine, perhaps) are wholly innocent “collateral damage.” Their deaths raise powerful anti-American sentiments and motivate a few among the survivors to become active planners of attack on the United States.

    Growing anti-Americanism reinforces one more inevitable result of the war story: a distant, muffled, yet very real and constant drumbeat of cultural anxiety that has become part of the soundtrack of American life.

    The debate about whether the executive has the right to execute U.S. citizens without due process in wartime is certainly an important one. But isn’t it rather more urgent to debate whether we want to live in this frightening imagined world of “wartime”? 

    The American people may collectively choose this world despite its perils. One sign: The public endorses the president’s policy of extra-judicial killing of U.S. citizens, according to polls. In fact pollsters no longer find it a controversial issue; the most recent poll I could find that asked the question was a full year ago.

    Perhaps most Americans have forgotten that another story is possible. Or perhaps most prefer to be at war. The war story, and war itself, have undeniable appeal. And a “good war,” in which the enemy is absolutely evil and the only Americans who die are “bad guys,” is so much more appealing.

    But if that’s what the public wants, at least it should be a conscious choice. Then, if there’s another attack on U.S. soil, we will have to acknowledge that the story we chose to tell played a role in making the attack more likely.

    Of course we could choose a different story and a different world, one where police and judicial action rather than war is the proper response to attacks on U.S. soil. Then the question about the executive’s right to kill citizens extra-judicially would simply evaporate. Wouldn’t that be a simple, elegant way to end the debate?

    No, Prof. Meyer, Anti-Zionism is NOT Anti-Semitism
    Swedish B.D.S. poster. Credit: Wiki Commons.

    Andrew Meyer wants us to believe that anyone who opposes Zionism, for whatever reason, is inherently anti-Semitic. He starts from the premise that we should focus on historical effects rather than intentions. Perhaps he thinks that restriction works to the advantage of his argument.

    After all, it’s obvious that plenty of people have opposed Zionism with no anti-Semitic intent. Before World War II many Jews -- perhaps a majority of the world’s Jews, and certainly a vast number of devout Orthodox Jews -- opposed the Zionist project in principle. They surely had no anti-Semitic intentions. There are still plenty of Jews today who oppose Zionism. Some of them, especially in Israel, make a very thoughtful case that Zionism is ultimately harmful to the best interests of the Jewish people. Their intentions are obviously not anti-Semitic. So looking at intent certainly would undermine Prof. Meyer’s case.

    But even if we look only at historical effects, his argument is mistaken. It really boils down to one claim: “Israel has been the single greatest impediment to institutionalized anti-Semitism in the international arena.” Without a Jewish state, he argues, “Jewish communities throughout the world” would lack “concrete protections” from anti-Semitism, and there would be “a more favorable climate for the growth and spread of anti-Semitism.”  

    That argument might have been convincing once upon a time. Historians will probably argue about it forever.

    Today, though, there can hardly be any doubt that Israel is actually increasing anti-Semitism around the world. Every day Israel is creating more opposition, antagonism, and sometimes anger toward the Jewish state -- not because of its mere existence, but because of its palpably unjust treatment of Palestinians, its unjust (and too often violent) military occupation of Palestinian land, and its reluctance to make a just peace that would leave it living alongside a viable Palestinian state.

    The growing atmosphere of world-wide criticism of Israel hardly helps erase the vestiges of anti-Semitism. On the contrary, it does more than anything else to keep anti-Semitism alive. Most critics of Israel’s policies know that this effect is unfortunate and unfair. They say that they object to the Jewish state’s treatment of Palestinians, not to Jews or Judaism, and there is no reason to doubt their sincerity.

    However unfair it is, though, this historical effect of fostering anti-Semitism is understandable. The leaders of Israel’s government in recent years have insisted loudly that their state has, will always have, and must have a “Jewish” identity. As Prof. Meyer points out, there is no consensus on exactly what that means. But the general message comes across emphatically: Whatever Israel is and does, there is something uniquely Jewish about it.

    What the rest of the world sees Israel doing, more than anything else, is occupying and oppressing Palestinians. So it’s easy enough -- even if illogical in the strict sense -- to conclude that military occupation and oppression are somehow essential expressions of Jewish identity. That’s bound to fuel anti-Jewish feelings.

    Similarly, the leaders of Israel have always insisted that the state acts on behalf of all Jews, everywhere. Those leaders have done whatever they could to make that claim true, and they have largely succeeded. Israel is widely seen as the primary agent, and in a sense, the embodiment of the Jewish people on the world stage. So it is natural that many non-Jews would understand Israel’s actions as deeds done by the Jewish people at large. Since the most public of those deeds are morally dubious, at best, it is inevitable -- though again, illogical in the strict sense -- that many observers will have an increasingly negative view of Jews.

    The process works in yet a third way: Growing numbers of Israel’s critics are persuaded that there is something inherently unjust in a state that privileges one group of people over all others. This argument is heard much more widely now than it was twenty or thirty years ago. Anyone who has watched events over those decades knows why: More and more people every year are concluding that the Jewish state is incapable of mending its ways. The facts on the ground give support to the (once again, logically erroneous) argument that a Jewish state is bound to be an oppressive state, which further fuels anti-Jewish feelings.

    The points I’m making here are so well known and so widely discussed that I’m surprised Prof. Meyer ignored them. You can find some columnist worrying about them in the Israeli press nearly every day.

    I’m surprised by something else. Prof. Meyer says he “stand[s] with the Palestinian people in demanding their right to statehood, and decr[ies] the injustice of the Israeli occupation.” And he defends his college’s sponsorship of a public discussion on the “boycott, divestment, sanctions” movement.  All laudable sentiments.

    So I wonder why he ignores the actual effect of writing an article titled “Anti-Zionism Is Anti-Semitism.” He must know that this slogan is commonly used to stifle expression of exactly the views he holds. In fact, the slogan is most often used to try to silence all criticism of Israel’s policies and actions, no matter how unjust or inimical they are to the interests of peace.

    Many readers of the news, on the web or in print, never get past the headlines. So by choosing to write on this topic Prof. Meyer, no matter how unwittingly, is serving the same unjust policies he criticizes. And he is aiding, no doubt unintentionally, the suppression of the free debate that he actually wants to foster.

    He notes at the end of his article that critics of Israeli policies are often perceived, by supporters of those policies, “to be evasively concealing” their true agenda. Unfortunately the same perception readily applies to anyone who writes an article titled “Anti-Zionism Is Anti-Semitism,” regardless of his intent or the content of his writing. I absolutely do not believe that Prof. Meyer had any hidden agenda in writing this article. Other readers might not be so generous. Effects are often as unfair as they are illogical. But I heartily endorse Prof. Meyer’s view that they must be taken carefully into account.

    The State of the Union and the State of the "Homeland"
    Claire Danes in "Homeland."

    In our home the State of the Union address was not followed by the Republican reply. We skipped Marco Rubio’s rebuttal in favor of watching a DVD of old “Homeland” episodes. We’re finally catching up on the first season of the “CIA versus terrorists” drama that everyone else has been watching and raving about for the past two years.

    The incongruity of watching the SOTU and “Homeland” in the same evening was a stark reminder of how much has changed in America in just a few years. “Homeland” would have made a wholly congruous nightcap to any SOTU speech by George W. Bush.

    That’s not to say Obama’s “war on terror” policies are so different from W.’s. The similarities as well as differences have been parsed at length by the pundits, and similarities there are a-plenty. But the tone of American life has changed so much now that we have a “hope and change” president instead of a “war president.” 

    “Homeland” takes us back to the dramatic world that W. invited us into: a world where evildoers lurk unseen beneath the surface of American life, a life that is constantly (if sometimes only slightly) on edge, because no one knows for sure where and when sudden death may strike again, as it did on September 11, 2001. W. fit easily as an actor in that world. Indeed he gave himself a leading role in the drama.

    We may not have been happier in that world of so recent yesteryear. But “Homeland” reminds us why so many Americans found it gripping and exciting: It seemed like a matter of life and death. That’s the stuff great theater is made of.

    Barack Obama’s SOTU, like every SOTU, was meant to be great theater too. Yet there was something less than satisfying about the show. Watching “Homeland” made it clear what was missing in Obama’s show: The death-dealing bad guys were nearly invisible. The “terrorists” got a very brief mention, mostly to assure us that they could be defeated by technical means, like any other technical problem, without any compromise of our cherished American values.

    The real bad guys lurking constantly between the lines of the speech were the Republicans. But they were never called out by name. And their evil -- the fact that their proposed policies would kill many more Americans than “terrorists” ever will -- was hidden so deeply between the lines, it was practically invisible. So they could hardly perform effectively as the villains in the piece.

    The Republicans’ evil had to be hidden because the world that the president created in his address was such a utopian world, where everything wrong in American life is just a technical problem that can be fixed with relatively little effort. In Obama’s world evil is simply a temporary error, a lapse in clear thinking, easily corrected under the guidance of a skillful tutor.

    Obama took us back to the days of Theodore Roosevelt and Woodrow Wilson, when all we had to do was to reason together. We would surely recognize the logic of his proffered solutions, he seemed to say with every breath. Then, with only the slightest application of good will, all our problems could be quickly resolved. He made it all sound so simple, so obvious.

    The world of “Homeland” -- W.’s world -- is the world of Franklin D. Roosevelt (and Winston Churchill), where evil is far more than a mistake in logic. It is a permanent, intractable element of human life. We cannot reason together, because some of us are moved by an impulse to evil that defies all reason. So evil is not a problem to be solved. It is an enemy to be defeated by any means necessary -- perhaps even extra-constitutional means, though that remains a matter for debate.  

    Few Americans watched the SOTU and “Homeland” in the same evening. But all got a taste of this stark contrast in national narratives when they watched the evening news, where Obama had to share the headlines with an evildoer defeated in the mountains outside Los Angeles. Any TV news editor worth his or her professional salt would probably lead with the story of the dead LA ex-cop, not the SOTU. The battle of good against evil is the heart and soul of all television drama, even on the news.

    Yet the utopian impulse can create great theater, too. After all, it rests on imagination and fantasy, which are the root not only of theater but of all entertainment. Utopia is only entertaining, though, if it offers a vision of a completely perfect world that can be attained some day, no matter how distant, without compromise.

    Barack Obama will not give us that emotional satisfaction. He is a self-confessed disciple of the theologian Reinhold Niebuhr, who battled fiercely against the utopian political impulse -- created largely by Christians -- that flourished in the days of TR and Wilson. Niebuhr accused Christians of being “unrealistic” because they ignored the classical doctrine of original sin: evil is a permanent fact of life, which we must wrestle with forever. It’s a testament to Niebuhr’s enduring influence that being “unrealistic” and “utopian” remains the cardinal sin of American political life.

    Obama danced on the edge of that sin in his SOTU. The tone he set left an  unmistakable sense of utopian aspiration. Yet it remained merely a vague impression because every time he approached the edge of utopia he backed away, as he always does, for the sake of “realistic” compromise with the GOP evildoers.

    The question Obama's SOTU speech poses is whether the utopian impulse can be resurrected in a nation that has been gripped for so long by the drama of good against evil, a nation that has made the war against evildoers the essence of its national narrative.

    Obama himself can never be the agent of utopia’s resurrection. But then, John F. Kennedy was certainly far from a true utopian either. And his rhetoric played a role -- a major role, some historians think -- in creating the brief era of the late 1960s, when the utopian impulse flourished throughout the land once again.

    Of course JFK had MLK to do the heavy utopian rhetorical lifting. Dr. King had studied Niebuhr carefully, and he too asserted the reality of evil. But he threw in his lot with the faith of the Christian utopians who were convinced that some day evil will be overcome, not by war but by the reason and good will of humanity. No one can say how long it will take. The arc of the moral universe is long. But it bends toward justice and the perfection of the beloved community.

    Obama has no one with nearly the stature of MLK to offer such a message to America today. So his tantalizing hints of utopia must do their work on their own. We don’t yet know what that work will be. Just as no one in the days of JFK could predict what effect his words would have, so we cannot predict the long term effects of Obama's turn toward utopian imagination.

    Stay tuned for our next exciting episode.

    “War on Terror”: The Ticking Time Bomb
    Dick Cheney in 2011, and the infamous Bulletin of the Atomic Scientists Doomsday Clock.

    I saw Zero Dark Thirty a few weeks ago and then consumed the whole first season of “Homeland.” Don’t tell me what happens in season two. I love the suspense.

    I also love those brave (fictional) CIA analysts, Maya and Carrie. They see a huge danger ahead that everyone else is blind to, and they insist on crying out a warning, regardless of the risk -- just like the biblical prophets. What’s not to love? 

    In fact they’ve inspired me to cry out a warning of my own. It’s not the threat of “another terrorist attack,” but the threat of America being seized once again by “war on terror” fever.  I know that seems crazy, because hardly anybody worries seriously about the “terrorist” threat any more. In the last year, when pollsters asked about the single most important issue facing the nation, they usually didn’t even list “terrorism” as an option. When they did, it consistently showed up at the bottom of the list.

    But Zero Dark Thirty and “Homeland” reminded me that one sector of the American populace ranks “terrorism” right up at the top, way above any other national concern, and obsesses about the “threat” night and day. It’s not just the CIA but the entire “Homeland Security Complex” -- what the Washington Post called, in 2010, “Top Secret America: A hidden world, growing beyond control.”

    The Post’s article began by saying that the HS Complex is “so large, so unwieldy and so secretive that no one knows how much money it costs, how many people it employs, how many programs exist within it or exactly how many agencies do the same work.”

    It went on to estimate that “some 1,271 government organizations and 1,931 private companies work on programs related to counterterrorism, homeland security and intelligence in about 10,000 locations across the United States,” and counted “33 building complexes for top-secret intelligence work built or under construction since September 2001” -- with total space bigger than three Pentagons -- in the DC area alone.

    That’s a lot of people, spending a lot of money, focusing like a laser on the single goal of “defeating the terrorists.” And that was three years ago. Think how the HS Complex has grown since then.

    As those “most important issue” polls show, there’s a huge disconnect between the HS Complex and the rest of the nation, which doesn’t think much about “the terrorist threat” at all any more -- except as an exciting theme for suspenseful movies and television shows.

    We’re back in a situation much like the early and mid-1970s, when “détente” was the watchword in U.S. relations with the Soviet Union and the People’s Republic of China. The debate over Vietnam generally treated that country as an isolated hot spot, largely detached from its global cold war context. So the global cold war purred along quietly in the background of American life, little more than a plot device for the entertainment industry -- except in the huge Military-Industrial-Intelligence Complex, which devoted itself night and day to waging and worrying about the conflict between the U.S. and the whole of the “communist bloc.”

    The M-I-I Complex was a kind of fuel waiting to be ignited, to put the global cold war once again at the top of the national priority list. The fuse that did the job was the election of Ronald Reagan in 1980. It re-ignited the cold war, which dominated much of the nation’s life for the next decade (and vastly inflated the federal budget). 

    The resurrection of the cold war was a triumph for a group of high-powered conservative and neoconservative politicos, banded together under the banner of the Committee on the Present Danger (CPD).

    One of their icons, Norman Podhoretz, explained that they worked hard to counter a “national mood of self-doubt and self-disgust” triggered by the debacle in Vietnam. Americans were crippled by “failure of will” and “spiritual surrender,” the neocon writer lamented. They would no longer make the sacrifices needed to “impose their will on other countries.”

    The only way to counter this national transformation, as the CPD saw it, was to revive cold war brinksmanship. A nuclear buildup, bringing increased risk of nuclear war, was not merely a price worth paying; it was a way to teach the public to accept sacrifice as the route to national and spiritual greatness.

    The CPD’s diagnosis and prescription were extreme, to say the least. But many Americans did feel what President Jimmy Carter called “a crisis of confidence”; many others called it a “malaise.” And the CPD understood correctly that it was triggered, above all, by the Vietnam war -- the first war that America had ever lost -- which did so much to undermine faith in the narrative of American exceptionalism.

    The 1970s teach us a vital lesson: The “foreign threat” narrative has a much better chance to prevail when large numbers of Americans are unsure that the familiar structures of public life are secure.

    We had already learned that lesson in the late 1930s. The Great Depression had created so much anxiety about American life for so long that the public was ripe for a “foreign threat” narrative. In the late 1970s there was also economic anxiety, powerfully reinforced by the all-too-fresh memory of the defeat in Vietnam.

    It could happen again! The fuel pile -- the Homeland Security Complex -- grows bigger every day. The neocon heirs of the CPD still have their well-honed public relations machine in high gear, eager to see that fuel explode.

    In an economy that is slowly recovering but still perceived as feeble, even a minor “terrorist attack” -- or any incident that could plausibly have that label pinned on it -- would give the neocons the fuse to ignite the fuel. We might well be thrown back a decade, to a time when the “war on terror” dominated national life in a way that teenagers of today can hardly imagine. 

    The HS Complex, as big as it is, could grow by leaps and bounds. Presidents Eisenhower, Kennedy, and Reagan all showed that an M-I-I Complex widely seen as immense could actually grow much bigger.

    There’s one other vital piece in this ticking time bomb: the weakness of any alternative narrative to interpret a “terrorist attack.” Whenever there’s sudden, unexpected violence, people will demand some narrative or other to make sense out of it. That’s why we have news media: to “give us the story.” Right now, the only story the media can even think about using is the “war on terror” tale.

    It wasn’t always thus. When an attempt was made to bomb the World Trade Center in 1993, the Clinton administration treated it as a criminal attack. The media followed suit. And eventually people were convicted for the crime through due process of law.

    In the weeks after the 9/11 attack, there was some effort to invoke the same “crime and justice” narrative. But it was quickly eclipsed by the story that labeled the attack an “act of war,” and that view has prevailed ever since.

    As long as there is no other narrative on the scene to effectively challenge the “war” story, the “war on terror” time bomb will keep on ticking. The longer it ticks, the more likely it is to explode, plunging us back into the world that Dick Cheney once assured us would be “the new normal” forever. 

    “Yellow Peril” Morphs into Chinese Borg
    You remember those Chinese hackers, the ones we were all supposed to be so terribly worried about just a few days ago? They’ve disappeared from the headlines; apparently we’re not supposed to worry about them any more, at least for now. But they’re bound to be back in the headlines sooner or later, and probably sooner. So we ought to take a close look at the story.

    The joke is on the hackers, says Washington Post wonk blogger Ezra Klein. They’ve been suckered in by a great myth -- the myth that there’s some secret plan hidden somewhere in Washington, the script for everything that the American government and American corporations do. The Chinese think that if they hack enough computers, somewhere buried in that mountain of data they’ll find the master key that unlocks the plan.

    The joke is that there’s no key because there’s no plan, says Klein. Everyone in Washington and in corporate America is just bumbling along, buffeted by each day’s endless compromises and unexpected twists and turns that make our system run.

    I imagine Klein is describing the American system pretty accurately. But I suspect the joke is on most Americans too, including many in positions of power, because we too have been suckered in by a great myth -- a myth about China that’s a mirror image of the myth Klein says the Chinese hold about us.

    All the headlines I’ve seen about the alleged hacking suggest that there’s some great monolithic monster out there -- some say it’s the Chinese army, some say the Chinese government, some just call it “China” -- doing the electronic snooping. Its tentacles are everywhere across America (though they’re especially dense inside the D.C. Beltway), scooping up vast amounts of precious data (the story would hardly matter if the data weren’t precious) and feeding it all into some superbrain that is plotting something.

    Who knows what? We certainly can’t know. After all, the monster is Oriental; thus, as an age-old American tradition tells us, inscrutable.

    But the Chinese monster surely knows what it’s plotting. It must have a plan. Otherwise why go to all the trouble of doing all that hacking? And the plan must be nefarious. I mean, we’re talking about foreign spies here. Whatever they’re up to, it’s always no good. Whoever heard -- or more likely saw, in the movies or on television -- a story about foreign spies who didn’t pose a threat to us?

    If the spies are inscrutable Orientals, who think in ways we can’t ever understand, they are all the more threatening. And if those Orientals are agents of the world’s largest nation -- a nation so vast we can scarcely comprehend its size, much less its motives -- why, that’s the most threatening image of all.

    Wait, it gets worse.

    I’m writing this a few days behind the headlines because I’ve been traveling. You learn a lot traveling. For example, from reading the nation’s most respected, influential newspapers I had learned that the Chinese hackers are a threat to our national security. But in the airport I learned that I had a lot more to worry about than that.

    On every newsstand the message leaped out at me from Bloomberg BusinessWeek: “YES, THE CHINESE ARMY IS SPYING ON YOU” -- spelled out against a blood-red background in big bold yellow letters. Very yellow.

    It’s appropriate that my destination was California. I’m writing this piece in the state that did more than any other to promote the notion of the “yellow peril” and the Chinese Exclusion Acts of the 1880s and 1890s.

    In light of this sad history, I was hardly surprised that there would be so much fear of the Chinese spying on our government and corporations. But now Bloomberg tells me that the “yellow peril” is coming after little old me -- and you and you and you -- scooping up all the most personal private data that each one of us has stored on our home computers.

    Who would have imagined it, even in the heyday of the “yellow peril” panic? No matter how much you fear the foreign evildoers at your doorstep, it seems, there’s always more to be afraid of.

    If the Chinese are absorbing data from each and every one of us, they are obviously looking for more than just Washington’s master plan. Surely they know that nothing on personal computers in Keokuk or Kankakee can yield the mythical key. No, it’s obvious that they want everything that America has to offer, absolutely everything, wherever it is.

    I mean, think about it. China keeps on expanding in every way imaginable -- just like the Borg. So, like the Borg, it must intend to absorb us all. Absorbing our precious data is merely the first step in the plan.

    And while we Americans merely bumble along chaotically, the Chinese Borg must have a plan. All this hacking can’t be happening at random. Even Ezra Klein, who usually makes his living promoting healthy skepticism, assumes that everything in China is guided by some master plan.

    Indeed, that’s the vital difference between the two great powers, Klein concludes, and the key to America’s superiority. Since we don’t have any great plan, when things get screwed up (as they inevitably do) we are flexible enough to get along pretty well anyway.

    But the Chinese depend so heavily on “the plan” that when it gets screwed up they are stuck. In other words, the Chinese are just too regimented, as if the whole nation were one big army. As in any army, when things get snafu’d no one knows what to do next, because the underlings have never learned to think for themselves.  Everyone must simply follow orders. Resistance is futile.

    Which brings us back to the “China as the Borg” myth. But this Borg isn’t brave enough to confront us boldly, declaring with supreme self-confidence, “Resistance is futile.”

    No, this is a Chinese Borg. So it must be not merely inscrutable but sly, devious, duplicitous. Real (read: white) Americans have always been fascinated by what they imagined went on in the alleyways of our Chinatowns, the kitchens and back rooms of our Chinese restaurants, and -- the most fascinating mythic realm of all -- the smoky, gauzy netherworld of the opium dens.

    We could never know what they were doing. But whatever it was, like the deeds of foreign spies, it could bode no good for real Americans.

    Now, it seems, that inscrutable “yellow peril” has morphed into a perilous yellow cyber-Borg, aiming its e-tentacles at each and every one of us. For lovers of political mythology, China is the gift that keeps on giving.

    Seriously, I’d bet the farm (if I had a farm) that the joke is on us. Of course China has hackers who scoop up as much data as they can from us -- just like the American hackers who are aiming their e-tentacles at China. When the New York Times first reported that the Chinese hackers were probably working for the army (probability was all they could offer), their only source was “American intelligence officials who say they have tapped into the activity of the army unit for years.” I should hope so. That’s what we pay them, with our tax dollars, to do.

    And just as our government and military analysts sift and organize the spy-collected data in endless ways, constantly arguing about what it all really means, no doubt the Chinese do the same, bumbling along in the same way with the same random, chaotic results.

    It’s spy versus spy. I imagine it has been going on ever since the Sumerians faced off against the Akkadians well over four thousand years ago. It’s the obvious thing for great powers to do too. And for all those four millennia, I suspect, every spy has tried to convince his (or her; see “Homeland”) boss that s/he’s got the master key to the enemy’s grand plan. How else can a spy hope to get promoted or given a juicier assignment?

    Mad Magazine’s “Spy Versus Spy” cartoons captured the essential comedy of the game for a whole generation of youth growing up in the heyday of the cold war. No doubt, if the Sumerians or Akkadians had had a Mad Magazine, the basic joke would have been the same.

    But while my generation was laughing at “Spy Versus Spy,” our elders were avidly supporting a government that was heading toward the Cuban missile crisis, when the president’s closest advisor (his brother) could only estimate the rough odds on avoiding an all-out nuclear holocaust. No joke.

    So while we laugh at the funny side of the “Chinese hackers” panic, we also need to take it, and talk about it, very seriously. Understanding its deep roots in American mythology is a useful first step. Resistance to even the most deeply entrenched myth is never futile.

    “Debt Crisis”: The Myth Behind the Myth

    While the two major parties plot strategy for the next battle in the federal debt-reduction war, another war rages among economists over the question, “Is debt really the federal government's biggest problem?” Some insist that unless Washington cuts spending substantially to reduce the debt quickly, we are headed for disaster. Others insist with equal fervor that growth is the number one priority: Aggressive pro-growth policies will reduce the debt in the long run with far less pain.

    If the pro-growth economists could gain public support they would give liberal Democrats a powerful weapon to resist the Republicans’ budget-slashing ax. But the pro-growth faction makes little headway in the public arena because the political wind is blowing so strongly against it. Why should the wind blow that way?

    It’s not because voters have studied the competing theories carefully and concluded that the debt-crisis faction has the stronger case. When it comes to economic theory, few of us draw any conclusions at all. We get lost in the esoteric arguments so quickly that we give up trying to understand. Politicians know this; most of them are probably just as lost as the rest of us amidst the esoteric arguments.

    But the best politicians know something else: Few voters care much about theories at all. Few of us make up our minds through careful logical analysis of the facts. Instead we rely on myths to organize the vast barrage of information constantly bombarding us.

    The two economic theories represent two myths, deeply rooted in American political culture, that have competed for dominance throughout our history. The debt-crisis theory is so powerful now because it evokes the more compelling myth. 

    (Again, to clarify: when I speak of myth, I don’t mean an outright lie. Like most historians of religion, I take myths to be the stories -- compounded of fact and fiction -- that we take for granted, stories we use, often unconsciously, to make sense out of life and turn it into meaningful experience.)

    The pro-growth view is summed up by its most prominent spokesman, Paul Krugman: The size and danger of the debt are overstated. And the best way to reduce the debt we do have is to spend more government money now to stimulate growth. Yes, it will raise the federal debt for a while. But soon the expanded economy will be putting enough back in the government coffers to erase that increased debt. 

    Here we have the cherished American myth of progress, or, as Barack Obama rechristened it, the myth of hope and change: America’s mission is to make a better life for all its people and, in the process, for the whole world. Head out to a new frontier. Believe in your vision of the future. Invest in it. When that future arrives, things are bound to be better than they are now. So take some risk. Show some courage. That’s what America is all about.

    But the debt-crisis party won’t buy it. The heart of their economic theory is fear, especially of federal debt. As Republican economic guru Bruce Bartlett put it, “The debt limit is the real fiscal cliff.” For a nation, as for individuals or families who spend beyond their means, the party can continue only so long, they say. Then comes the day of reckoning, when it’s impossible to pay back the borrowed money: Bankrupt!

    No one can predict when that tipping point will come, the prominent columnist Robert Samuelson has written. Like any cliff, it can remain unseen until we go over it, suddenly and unexpectedly.

    He quotes economist Barry Eichengreen, “a leading scholar of the Great Depression,” who warns that if the U.S. debt grows large enough bond traders will stop funding it: “This scenario will develop not gradually but abruptly. Previously gullible investors will wake up one morning and conclude that the situation is beyond salvation. They will scramble to get out. … The United States will suffer the kind of crisis that Europe experienced in 2010, but magnified.”

    Americans are very familiar with such warnings of a surprise attack, though not from bond traders but from foreign evildoers, be they fascists, communists, or terrorists. To draw on the current parlance, it’s our myth of homeland insecurity: America is constantly at risk. Its chief mission is to protect itself from forces that would destroy it.  

    So let’s arm ourselves well, circle the wagons, and proceed with utmost caution. Any morning we may wake up and find our nation under attack. Any misstep might plunge us over the cliff into the abyss of national catastrophe.

    Economists like Krugman can argue with the most compelling logic that a nation is not like a family. Governments don’t have to repay all their debts. They need only “ensure that debt grows more slowly than their tax base.” And a nation’s debt is largely “money we owe to ourselves.” They can point out that the current federal debt is no higher (perhaps lower) than it was in the post-World War II years, when the U.S. was beginning its greatest economic boom ever.

    But the myth of homeland insecurity is a formidable foe. Like any deep-seated myth, it’s largely impervious to logic.

    Most historians agree that Franklin D. Roosevelt’s warnings about German bombers attacking Kansas City were exaggerated, to say the least. So were the warnings about a Soviet nuclear “bolt out of the blue” from Eisenhower, Kennedy, and Reagan. Nevertheless, the sense of permanent insecurity they created has become a firm pillar -- perhaps the central pillar -- of American political culture.

    It’s the foundation on which the Republicans build virtually all their rhetoric and policies. And they can tie the rising federal debt to traditional fears of foreign foes by citing warnings that the debt is “the most significant threat to our national security” (Admiral Mike Mullen) and puts the U.S. “at risk of squandering its global influence” (New York Times analyst David Sanger).

    Of course even the most conservative Republicans still pay lip service to the American faith in progress. But even most Democrats agree that, when it comes to making policy, national security trumps every other concern. There’s a bipartisan consensus that we must always be on the alert for threats, old and new, and ready to resist them by any means necessary. That consensus is bound to keep us insecure, constantly ready to see every new development as a potential crisis, which gives a clear edge to “homeland insecurity” in the war of the myths. 

    Barack Obama has paid homage to the myth of homeland insecurity ever since he first won the presidency by promising to stave off an impending financial collapse. He has promoted his major policies not only in the name of hope and change but, even more often, as ways to prevent things from getting worse. Now he constantly reassures the public that he takes the idea of an impending debt crisis very seriously and is dedicated to resolving it.

    Most Democrats follow the president’s lead, insisting that debt reduction is at the top of their agenda, just as Republicans insist it must be. This bipartisan consensus is a testament to the enduring dominance of the myth of homeland insecurity. It’s the power of this mythology, not any facts or logic, that deprives the pro-growth view of any serious public hearing. Every day that the debate over debt reduction dominates the headlines cements America more deeply into its long-standing dominant mythology.

    Every society has the right to choose its myths. But every right has a correlated responsibility. As we grow more self-conscious about the role of mythology in political life, we also have a growing responsibility to recognize the consequences of our choices. America’s obsession with homeland security has already had grave consequences. The debate between debt-crisis and pro-growth economists gives us a chance, from an unexpected quarter, to consider whether we want to dig ourselves deeper into that hole.

    North Korea as “Bad Guy”: A Multi-Layered Myth
    North Korean leader Kim Jong-un.

    “The United States is taking the threat of a ballistic missile attack from North Korea very seriously,” Melissa Block informed us on NPR the other day, sounding very serious herself. To protect us from that threat, the U.S. will station 16 more anti-missile missiles in Alaska.

    “The big questions, of course, are this,” NPR’s Tom Bowman explained: “Would North Korea actually launch a missile against the United States, and would these missile interceptors work? And frankly nobody knows for sure, but the Pentagon says, we have high confidence.”

    High confidence that the missile defense will work or that the North Koreans would attack the U.S.? No doubt Bowman meant the former (though “the testing has been a bit spotty,” as he tactfully put it).

    But the whole project, with all its ballyhoo and its $1 billion price tag, makes no sense unless the Pentagon also has high confidence that North Korea might indeed attack the United States.

    Seriously? North Korea is unlikely to have the technical means to attack the U.S., at least not for many years to come. If they ever get that capacity, their nuclear arsenal, like their military capability as a whole, would still be infinitesimal compared to ours. It takes a microscope -- or whatever equivalent CIA analysts use -- even to see it. That’s not going to change.

    Any gesture of attack would give the U.S. license to devastate the small, poverty-stricken Asian land. The exercise would be rather effortless for the U.S. North Korea’s leaders must know that attacking the world’s mightiest nation would mean instant national suicide.  

    The whole idea of the North Korean mouse attacking the American elephant seems rather absurd, to say the least. Yet the “threat” is widely reported in the U.S. mass media as if it were an undeniable fact. Why?

    Bowman offered an important clue when he said that U.S. anti-missile missiles “are the ones that would actually hit an incoming enemy missile from, let's say, North Korea.” His “let’s say” implies that we are defending against a generic threat, of which North Korea is merely one example. North Korea is just the current actor filling the role of “enemy attacker” that the generic script calls for.

    It’s much the same mythic scenario that white Americans have been acting out, and basing policy on, ever since the first colonial militias were formed to fend off the Indians -- the scenario twentieth-century Americans came to know (and often love) as “cowboys versus Indians.” Now, some nation or other has to play the role of Indians.

    Since the United States was created, only one other nation -- Great Britain -- has actually launched an attack on U.S. soil. That was a full two centuries ago. But the mythology of homeland insecurity, with its picture of an America constantly at risk of enemy invasion, remains powerful. This deeply-rooted and long-regnant mythology -- with America playing the cowboys and some (any) enemy nation the Indians -- is the lens through which the mainstream of American culture sees the world. It seems totally natural. That’s one reason it’s so easy for the U.S. media, and so many American people, to believe in “the North Korean menace.”

    What’s more, in our traditional national narrative the “bad guy” enemy is, by definition, “savage” and thus bereft of reason. So he might well do something as totally self-destructive as attacking us. How often have we heard that North Korea’s leaders are erratic, irrational, and indeed “crazy”?  

    Digging deeper we find other, more paradoxical, sources for this old myth’s staying power.

    It’s getting harder to see the world through the familiar lens of fear. After the Cold War ended, the Chairman of the Joint Chiefs of Staff, Colin Powell, complained: “I’m running out of demons.” He knew that an enormous military budget needs “demons” to sustain it.

    A mythology of insecurity needs those “demons” just as badly. American culture is deeply invested in its mythology. Many millions of Americans can hardly imagine what it would mean to be a patriotic American if we did not have potential attackers to resist at all costs. In America, the sense of security that comes from a taken-for-granted mythic narrative needs some nation or other to play the Indians.

    But if Powell worried about the absence of “demons” twenty years ago, how much more might a Chair of the Joint Chiefs worry now. The whole tradition of courageous resistance to enemy nations may soon be just a quaint relic of a bygone era, unless we keep on finding “threats to our national security.” North Korea may be just what we need to save the worldview of American patriotism.

    Let’s go another level deeper. I’ve been re-reading Alan Trachtenberg’s fine study of the “Gilded Age,” The Incorporation of America. Trachtenberg makes a key point about the “cowboys versus Indians” narrative. The story depends on a cowboy using his unique combination of skill and courage to save a whole (white) community from “savagery.” The mythic cowboy is a throwback to the knightly bravado of Launcelot and Galahad. He is popular culture’s way of celebrating the same heroic individualism that Frederick Jackson Turner celebrated academically in his famous “frontier thesis”: “that dominant individualism ... that masterful grasp of material things, lacking in the artistic but powerful to effect great ends.”

    Guns were certainly high on the list of material things that the frontier hero masterfully grasped -- and used, though only in self-defense, to effect great ends, as the story assures us. 

    Here’s the paradox that Trachtenberg points out: The “cowboys versus Indians” narrative first came to dominate popular culture precisely when the era of rugged individuals determining the destiny of anyone or anything in the West was rapidly dying. The real emerging power in the post-Civil War era was the corporation: the ever-growing army of anonymous technicians, managers, and accountants who, each day, gained more and more power over the resources, the culture, and the lives of people in the West.

    The growing supremacy of corporations triggered a cultural crisis because it raised such a fundamental question: How could Americans continue to base their lives on their familiar worldview and values? Those had grown up at a time when Americans still had reason to believe that they might control a substantial part of their lives through their own choices and actions. Would that old way of life have to be abandoned altogether? Or could some part of it be saved?

    One way to save it, Trachtenberg argues -- perhaps the only way -- was in imagination, by creating the mythic tale of the heroic cowboy: “Through such popular fictions, the West in its wildness retained older associations with freedom, escape from social restraint, and closeness to nature.” The ultimate, though unseen, point of the story was to hold on to an old worldview precisely because new realities were rapidly rendering it irrelevant.

    That may well be the point of today’s popular story, too -- the one that casts North Korea as the “bad guy” who must be defeated by the heroic U.S. military (witness the recent Red Dawn remake). Enemy “demons” have disappeared because resistance to U.S.-led multinational corporate capitalism has been largely extinguished in the few places it remained: Serbia, Libya, Iraq, Afghanistan. Apart from North Korea, only Iran remains on America’s list of inarguably evil nations. And the Obama administration still insists that it might be possible to negotiate our differences with Iran.

    That leaves North Korea as the sole irredeemable “bad guy,” the only nation left to play the role of Indian in our long-cherished national tale of global “cowboys versus Indians.”  

    In a more theoretical vein, analysts of world affairs have been debating for many years whether nation-states will remain significant actors on the global stage in an age of multinational corporate capitalism. Some argue that even now national borders have become, for all practical purposes, irrelevant as they’re swallowed up by the multinational monoliths. Headlines announcing an increase in U.S. anti-missile defenses to meet the North Korean threat may seem to prove them wrong.

    But the lesson Alan Trachtenberg draws from the nineteenth-century cowboy narratives teaches just the opposite: Stories may well become prominent precisely because they are irrelevant to, and stoutly deny, the actual facts of life.   

That lesson is all the more convincing if we look at another, closely related aspect of the “Gilded Age”: the growing call for a more “muscular” American military. The president who trumpeted that call the loudest was the “Rough Rider,” Theodore Roosevelt. TR saw the military as one crucial way to revive “the strenuous life” of rugged individualism and its masculine virtues, which he claimed to have learned from the cowboys on the Dakota frontier.

    It’s no coincidence that TR was also the first president to fight against the monopolistic practices of corporations. For him and many of his generation, corporations threatened to sap the individualistic vigor that was essential to the American way of life. Tales of heroic cowboys and soldiers, both defending Americans against savages, pointed the way toward averting the threat posed by the corporate way of life.

    Now, with this threat grown global, stories of a muscular American military response to a savage enemy may serve much the same purpose: reassuring Americans (even if unconsciously) that they, as individuals, still matter and still have some control over their ever more corporatized lives.

The ultimate irony, which was already becoming evident in the “Gilded Age,” is totally obvious today: In the story that Americans tell, their security depends on highly technological weapons built by multinational corporations and wielded by anonymous, bureaucratized military managers. The 21st-century military “cowboy,” the mythic figure so many depend on to resist total corporate domination, has been completely corporatized.

The President, in Israel, Giveth and Taketh Away

[Photo: Barack Obama and John Kerry at the Church of the Nativity in Bethlehem. Credit: U.S. State Department.]

    The real Barack Obama was clearly on display in his quick trip to Israel and Palestine. Wherever you are on the political spectrum, he always gives you something you want with one hand, while he takes away something equally important with the other hand.

    When Obama spoke in Jerusalem, I cheered as loudly as the audience of liberal Jewish students who shared my views, which the president voiced so eloquently: The occupation is really bad for Israel; Prime Minister Benjamin Netanyahu must lead his nation to a just peace with an independent, viable Palestinian state.

    I cheered most when I heard Obama say words that I never thought I’d hear an American president say in Israel: The occupation is not merely harmful to Israel’s national interests, it’s downright immoral: “It is not fair that a Palestinian child ... lives with the presence of a foreign army that controls the movements of her parents every single day. ... It is not right to prevent Palestinians from farming their lands ... or to displace Palestinian families from their home.” Bravo!

    But Obama is no starry-eyed idealist. He crafts such idealistic words for practical political purposes. In this case he was pushing Israeli liberals and centrists further toward the peace camp, widening the gap between them and the Netanyahu-led right wing. Down the road, he can use the political tensions he stirred up to move Israel toward the kind of peace agreement he wants.

    The pundits who declared him finished with the peace process were obviously wrong. (Even Thomas Friedman can make mistakes.) The president gave me something I want very much: A promise of more American pressure on Israel to make a just peace, for moral as well as practical reasons.

    Predictably, though, at the same time Obama took away something equally important: his demand that Israel stop the main roadblock to peace, its expansion of settlements in the West Bank. Instead he fell back on the vague language we’ve heard from many presidents before: “We do not consider continued settlement activity to be constructive, to be appropriate”; “Settlement activity is counterproductive to the cause of peace.”

    In Ramallah, standing alongside Palestinian Authority president Mahmoud Abbas, Obama called the settlements merely an “irritant.” He urged Abbas not to use settlements as an “excuse” to refuse direct negotiations.

    There’s some evidence that the PA had already received and perhaps accepted this message from Washington. Talking points prepared for Abbas suggested that he should agree to negotiations after getting only private assurances from Netanyahu on stopping settlement expansion. How much could those assurances be worth?

    Back down on moral principle and tolerate an evil for the sake of a greater good: That seems to be Obama’s message now on the settlements. As usual, the president gives and at the same moment takes away. Does it make him just another crass politician, maneuvering to score the next victory, bereft of any principle?

    Not necessarily. Biographies of Obama suggest that, from his college days, he has been a devotee of a consistent set of principles: the script laid out over 80 years ago by the famous theologian Reinhold Niebuhr in his classic book, Moral Man and Immoral Society -- though Niebuhr supposedly said, years later, that he should have called it “Immoral Man and Very Immoral Society.”  

    Indeed. Because Niebuhr’s basic point is that we are all doomed to tolerate and even embrace evil. We are all selfish, always out to get more than the other guy, simply because we are human. It’s the old story of original sin -- the myth of Adam and Eve eating the forbidden fruit, expelled forever from paradise -- dressed up in twentieth-century clothing.

    If individuals are bound to be nasty and brutish to each other, it’s worse in relations among nations, Niebuhr argued. Never expect anything from a nation except greed and lust for power. Even on the rare occasion that a nation pursues a relatively good aim, it’s bound to use evil means. And that includes Niebuhr’s homeland, the good old US of A.

    It pained him to see his theology become the dominant narrative of Cold War America with one huge twist: U.S. presidents and policymakers exempted America from the universal stain of sin -- at least in public, where they insisted that America would, and could, do no wrong.

    In private, the cold warriors acted upon (and occasionally admitted to each other) the principle that Niebuhr said all nations will inevitably use: accepting evil means to pursue even the best goals. The twenty-first-century warriors against terrorism, Democrats as well as Republicans, have followed the same Niebuhrian script. Now Obama has brought it to the Middle East.

    In fact Americans have always practiced such hypocrisy, Niebuhr argued, although they generally denied it and claimed that their nation was as pure as Eden. That’s The Irony of American History (as he titled his other most famous book).

Obama surely understands this irony very well. He never quite comes out and admits that he is embracing evil for the sake of a greater good. But he doesn’t boast of America’s perfect purity in the way the early cold warriors, or his predecessor George W. Bush, did.

    Obama addresses almost every issue in the Niebuhrian way he spoke of the settlements: “The politics there are complex ... It’s not going to be solved overnight,” because there is no absolute good or evil; we always deal in shades of gray; we all make compromises; sooner or later, we all become hypocrites.

    But I wonder whether Obama ever stops to think about the other irony of American history, since Niebuhr became a guiding light of its foreign policy.

    When he wrote Moral Man and Immoral Society, Niebuhr thought he was showing a better path toward hope and change than the idealistic Christian liberalism of the Progressive era. You’ve got to get your hands dirty in the political process if you want to improve the world: That was the essence of the myth that he intended to create.

    History played a trick on him, though -- just as his own theory predicted. The main message that American readers and leaders took from his book is that the world is a dangerous place; everyone is out to get us; self-protection is the name of the international game; so do evil unto others before they do it unto you.

    This is the foundation of what I call the American mythology of homeland insecurity. It’s the narrative that dominates U.S. foreign policy -- and Israeli foreign policy too, though the Zionists didn’t need Niebuhr to teach them. They developed their own myth of insecurity before he ever wrote a word.

    The same narrative dominated Obama’s rhetoric in Israel. He wrapped his calls for peace in endless recitation of the supposed dangers that Israel faces, dangers that are largely imaginary. He may have meant it as a pragmatic move, to convince Israeli Jews that he really does care about their fate.

    But irony always wins in the end, Niebuhr taught. So Obama’s powerful reinforcement of Israel’s insecurity is likely, in the end, to undermine his call for Israel to compromise and take risks for peace. As long as the Israeli Jews, and their supporters here in the U.S. (mostly gentile conservatives), believe that they are as endangered as Obama says, they are not likely to take any risks at all. They are more likely to do evil to others, because their fearful imaginings tell them that others are about to do evil to them.

    Myths of insecurity always block the path to hope and change. Barack Obama, the faithful Niebuhrian, always gives hope and change with one hand and takes it away with the other.

The Myth of the All-Powerful President: A Very Brief History

[Photo: FDR with Ibn Saud, first king of Saudi Arabia, in February 1945.]

    In a column I’ve just posted on Tomdispatch.com I summarized the tremendous task Barack Obama seemed to commit himself to, in his recent Middle East trip, as he once again took on the role of peacemaker:

    [He] must satisfy (or mollify) both the center-left and the right in Israel, strike an equally perfect balance between divergent Israeli and Palestinian demands, march with [Israeli Prime Minister Benjamin] Netanyahu up to the edge of war with Iran yet keep Israel from plunging over that particular cliff, calibrate the ratcheting up of punishing sanctions and other acts in relation to Iran so finely that the Iranians will, in the end, yield to U.S. demands without triggering a war, prevent the Syrian civil war from spilling into Israel, which means controlling Lebanese politics too -- and do it all while maintaining his liberal base at home and fending off the inevitable assault from the right. 

    That’s a tall order, indeed. But in American political culture we expect no less from any president. After all, he is “the most powerful man in the world” -- so he should be able to walk such a high wire adroitly, without fretting too much about the consequences should he fall.

    Whenever an American president travels abroad, his overriding plan is to act out on the world stage the fantasy that so many Americans love: Their leader, and the nation he embodies, have unlimited power to control people and events around the globe.

    In this imaginary scenario, the president can do all because he knows all. He is above every fray, understanding the true needs of both sides in every conflict. That’s why he can go anywhere and tell the locals what is true and right and how they should behave.

    With his awesome wisdom and omnipotence, the mythic president can deftly maneuver his way across the most challenging and dangerous situations and settle every dispute with god-like justice. He can be all things to all people. So he never has to make painful sacrifices or suffer any losses, as he proves that the American way must eventually triumph over all.

    Historians should wonder: How did this mythic image of the all-powerful president arise? Already in late eighteenth-century writings we can find confident claims that the fledgling United States of America is destined to play a unique role in bringing peace to the world.

But the idea that the president would personally have such power has its earliest seed in Theodore Roosevelt’s successful mediation to end the Russo-Japanese War in 1905. TR was probably motivated mostly by concern that the war would interfere with burgeoning U.S. trade interests in east Asia. But the Nobel Peace Prize he received seemed to mark him as less interested in national power than world peace.

    That image of a disinterested pursuit of peace and justice was magnified manifold by Woodrow Wilson, who truly founded the myth of the omnipotent president on the global stage. Wilson deftly blended appeals to the idealism of the Progressive era (and the Christian Social Gospel) with warnings that Americans would never be safe until the world was “safe for democracy.”

To what extent was he an idealist, and to what extent did his idealistic words mask a crafty pursuit of U.S. interests? Historians will probably debate that question forever. But there’s no debating his profound influence on the image of the presidency as an office responsible for bringing peace and justice to all lands.

    That image languished during the Republican presidencies of the 1920s, waiting to be revived by Franklin D. Roosevelt. He always cited “cousin Teddy” and Wilson as his two great political heroes. So it’s not surprising that FDR followed their lead. In private conversations and letters, he promoted a vision of a unified democratic capitalist order spanning the entire globe, with America leading the way. And he was more than ready to play the role of omnipotent dispenser of peace and justice to maintain that order.

But he knew that the American people were hardly ready to see the nation, or the president, take on that level of international involvement. Even when war broke out in Europe, FDR had to confront a public deeply divided on whether they and their country should get involved. Like Wilson, FDR mounted a major public relations campaign to gain support for his efforts to use America’s mighty power to control events around the world. Like Wilson, he appealed to both idealistic traditions and vivid depictions of threats to U.S. interests and American lives.

    Once the U.S. entered World War II, resistance to this new global role pretty much evaporated. But FDR continued to worry that, once the war ended, the public would revert to its “isolationist” tendency to ignore issues of peace and justice in the rest of the world.

    Roosevelt had underestimated his own achievement. By the war’s end, his skillful rhetoric had persuaded nearly all Americans that their own safety depended on their government's -- and especially their president’s -- ability to control events everywhere. With all the other major powers devastated, the U.S. had such preponderant power that the fantasy of total control seemed quite realistic.

    Josef Stalin’s Soviet Union quickly burst that bubble. But by the late 1940s American public discourse had settled on a seemingly comfortable consensus that lasted through the Cold War era: The U.S. would control everything of significance that happened in the “free world,” on our side of the Iron Curtain, while exercising enough control over the communist bloc to “contain” it.

    Once the Cold War ended FDR’s vision of a single global order seemed genuinely within reach. So there was even more reason to embrace the mythic vision of the president’s unlimited power.  

    There’s a good argument to be made that the most important results of U.S. foreign policy ever since the 1940s -- for better and for worse -- have flowed directly from this image of the omnipotent president, representing the omnipotent nation, trying to exercise unlimited control.

The most vivid lessons came from presidential (some call it imperial) overreach, most notably in Vietnam and Iraq. Yet despite this remarkable evidence to the contrary, many Americans still cling to the mythic narrative of “the most powerful man in the world,” able to control events in every corner of the globe. Why?

    The question can be answered in many ways. If we stay strictly within the confines of the study of myth, one explanation seems most compelling.

The claims for presidential control have always grown hand in hand with fears about what we now call homeland security. There’s a straight line leading from Wilson’s warning that a Hun victory would spell the end of all civilized (read: American) values to the Obama administration’s warnings about North Korea’s nuclear weapons and Syria’s chemical weapons. All the fears built up along the way created what I call the myth of homeland insecurity: the conviction that the very existence of America is constantly in peril.

    The best way -- perhaps the only way -- to allay that fearful belief has been, and apparently still is, to accept the myth of the omnipotent president: “The most powerful man in the world” can manage every situation, no matter how perilous, with wisdom and skill. He can give a cleverly calculated prod here and a perfectly calibrated nudge there, pull all the strings with unfailing precision, without ever losing his perfect balance. Thus he can guarantee a safe outcome for America.

    How reassuring it must be to believe that. And how predictable it is that, as long as this mythic story prevails, presidents will continue to overreach their true, limited power, with results that most of America will come to regret.

Yet the irony is obvious: The more regret, the more insecurity; the more insecurity, the more powerful the appeal of the myth of the all-powerful president. And the cycle just keeps on turning.

    The Conspiracy to Kill MLK: Not a Theory but a Fact

    Should the United States government be allowed to assassinate its own citizens? That question was in the air briefly not long ago. April 4 is an excellent day to revive it: On April 4, 1968, the government was part of a successful conspiracy to assassinate the Rev. Dr. Martin Luther King, Jr.

    That’s not just some wing-nut conspiracy theory. It’s not a theory at all. It is a fact, according to our legal system.

    In 1999, in Shelby County, Tennessee, Lloyd Jowers was tried before a jury of his peers (made up equally of white and black citizens, if it matters) on the charge of conspiring to kill Dr. King. The jury heard testimony for four full weeks.

    On the last day of the trial, the attorney for the King family (which brought suit against Jowers) concluded his summation by saying: “We're dealing in conspiracy with agents of the City of Memphis and the governments of the State of Tennessee and the United States of America. We ask you to find that conspiracy existed.”

It took the jury only two and a half hours to reach its verdict: Jowers and “others, including governmental agencies, were parties to this conspiracy.”

I don’t know whether the jury’s verdict reflects the factual truth of what happened on April 4, 1968. Juries have been known to make mistakes, and (probably rather more often) they have made mistakes that remain unknown.

    But within our system of government, when a crime is committed it’s a jury, and only a jury, that is entitled to decide on the facts. If a jury makes a mistake, the only way to rectify it is to go back into court and establish a more convincing version of the facts. That’s the job of the judicial branch, not the executive.

    So far, no one has gone into court to challenge the verdict on the King assassination.

    Yet the version of history most Americans know is very different because it has been shaped much more by the executive than the judicial branch. Right after the jury handed down its verdict, the federal government’s Department of Justice went into high gear, sparing no effort to try to disprove the version of the facts that the jury endorsed -- not in a court of law but in the “court” of public opinion.

    The government’s effort was immensely successful. Very few Americans are aware the trial ever happened, much less that the jury was convinced of a conspiracy involving the federal government.

    To understand why, let’s reflect on how history, as understood by the general public, is made: We take the facts we have, which are rarely complete, and then we fill in the gaps with our imaginations -- for the most part, with our hopes and/or fears. The result is a myth: not a lie, but a mixture of proven facts and the fictions spawned by our imaginings.

    In this case, we have two basic myths in conflict.

    One is a story Americans have been telling since the earliest days of our nation: Back in not-so-merry old England, people could be imprisoned or even executed on the whim of some government official. They had no right to prove their innocence in a fair, impartial court. We fought a bloody war to throw off the British yoke precisely to guarantee ourselves basic rights like the right to a fair trial by a jury of our peers. We would fight again, if need be, to preserve that fundamental right. This story explains why we are supposed to let a jury, and only a jury, determine the facts.

    (By odd coincidence, as I was writing this the mail arrived with my summons to serve on a local jury. The website it directed me to urged me to feel “a sense of pride and respect for our system of justice,” because “about 95 percent of all jury trials in the world take place in the United States.”)  

    Then there’s another myth, a story that says the federal government has only assassinated American citizens who were truly bad people and aimed to do the rest of us harm; the government would never assassinate an innocent citizen. Most Americans devoutly hope this story is true. And most Americans don’t put MLK in the “bad guy” category. So they resist believing what the legal system tells us is true about his death. 

    Perhaps a lot of Americans would not be too disturbed to learn that the local government in Memphis or even the Tennessee state government were involved. There’s still plenty of prejudice against white Southerners. But the federal government? It’s a thought too shocking for most Americans even to consider. So they fill in the facts with what they want to believe -- and the myth of James Earl Ray, “the lone assassin,” lives on, hale and hearty.

Since that’s the popular myth, it’s the one the corporate mass media have always purveyed. After all, their job is to sell newspapers and boost ratings in order to boost profits. Just a few days after the trial ended, the New York Times, our “newspaper of record,” went to great lengths to cast doubt on the verdict and assure readers, in its headline, that the trial would have “little effect” -- an accurate, though self-fulfilling, prophecy.

    Imagine if the accused had been not a white Southerner but a black man, with known ties not to the government but to the Black Panther Party. You can bet that the trial verdict would have been bannered on every front page; the conspiracy would be known to every American and enshrined in every history book as the true version of events.

    None of this necessarily means that the federal government and the mass media are covering up actual facts. Maybe they are, maybe they aren’t. Again, I don’t claim to know what really happened on April 4, 1968.

    But there surely were people in the federal government who thought they had good reason to join a conspiracy to get rid of Dr. King. He was deep into planning for the Poor People’s Campaign, which would bring poor folks of every race and ethnicity to Washington, D.C. The plan was to have them camp out on the Mall until the government enacted major economic reforms to lift everyone out of poverty. That meant redistributing wealth -- an idea that made perfect sense to Dr. King, who was a harsh critic of the evils of capitalism (as well as communism).

    It also meant uniting whites and non-whites in the lower income brackets, to persuade them that the suffering they shared in common was stronger than the racial prejudice that divided them. Dr. King did not have to be a prophet to foresee that the longer whites blamed non-whites, rather than the rich, for their troubles, the easier it would be to block measures for redistributing wealth. The unifying effect of the Poor People’s Campaign spelled trouble for those whose wealth might be redistributed.

    At the same time, Dr. King was the most famous and respected critic of the war in Vietnam. By 1968 he was constantly preaching that the war was not just a tragic mistake. It was the logical outgrowth of the American way of life, based on what he called the inextricably linked “triplets” of militarism, racism, and materialism. Had he lived, the Poor People’s Campaign would have become a powerful vehicle for attacking all three and showing just how inseparable they are.  

    Yes, plenty of people in the federal government thought they had good reason to put an end to the work of Dr. King. But that hardly proves federal government complicity in a conspiracy to kill him.

    So let’s assume for a moment, just for the sake of argument, that the jury was wrong, that James Earl Ray did the shooting and acted alone. The federal government would still have good reasons to suppress the conspiracy story. Essentially, all those reasons boil down to a matter of trust. There is already immense mistrust of the federal government. Imagine if everyone knew, and every history book said, that our legal system has established as fact the government’s complicity in the assassination.

    If the federal government has a convincing argument that the jury was wrong, we all deserve to hear it. There’s little advantage to having such uncertainty hanging in the air after 45 years. But the government should make its argument in open court, in front of a jury of our peers.

    In America, we have only one way to decide the facts of guilt or innocence: not through the media or gossip or imagination, but through the slowly grinding machinery of the judicial system. At least that’s the story I want to believe.

Social Security Cuts: More Than Money At Stake

    I’m old enough to remember when Social Security was the “third rail” of American politics -- too dangerous for even the most conservative politician to touch. You’re probably old enough to remember that, too. It wasn’t very long ago. As recently as the 2012 Republican primaries, Mitt Romney defended Social Security against attacks from other candidates (notably Rick Perry), and Romney emerged the GOP standard-bearer.

    How things have changed in just a year. It’s not merely that a Democratic president is offering, very publicly, to cut Social Security benefits. There’s something much more important: In the mainstream of American political conversation, this revelation was not treated as very big news.

Oh, it made headlines. But at best it shared equal billing with -- and was often buried beneath -- a host of other stories. You could easily get the impression that the most important thing Barack Obama did on April 5 was to remark on the good looks of California’s state attorney general and then apologize to her.

    As I wrote this, his budget proposal including the Social Security cuts was number five on the Washington Post website’s “most viewed stories” list; his apology to that attorney general was number one. The very idea of cutting Social Security benefits is no longer a big deal in the media, so it’s no longer a big deal to the public. And that change is a very big deal indeed in the mythology of American politics.

    Until recently, successful Republican as well as Democratic politicians have avoided any hint of tampering with Social Security. It’s a bipartisan tradition going back to Dwight Eisenhower. When he was just a private citizen he warned that the liberal Democrats, who had created Social Security a decade earlier, wanted to “advance us one more step toward total socialism, just beyond which lies total dictatorship.” In letters to wealthy friends he pledged to “combat remorselessly all those paternalistic and collectivistic ideas."

    Once in the White House Eisenhower changed his tune. Even though his Republican party controlled both houses of Congress, he told aides that it would be politically impossible to change the Social Security system; the public would never stand for it.

    That was precisely the way Franklin D. Roosevelt had planned it. Knowing conservatives were eager to pounce on every New Deal program, he designed Social Security so that no one could attack it as a “government giveaway.” The benefits would not -- and still do not -- come out of the government treasury. They come out of a special pool of money funded by payroll taxes. To hammer home that point, FDR made sure the first benefits would not flow until some time after the first taxes were paid in.

    In this way FDR created the myth of Social Security: When you retire or are disabled and stop working, the checks you get in the mail don’t come from the government. They are your own money -- the money you’ve set aside over the years -- being returned to you, fair and square. No American worker is going to let the government touch his or her own money. That’s how FDR made sure Social Security would be the “third rail.”

    Like most myths, this one is compounded of fact and fiction. It’s true that Social Security is a separate fund and an insurance program of sorts. You don’t take out of that fund unless you’ve paid in. But the amount you take out is rarely directly proportional to what you’ve put in. And the separation between Social Security fund and federal treasury disappeared long ago, because the government has raided the fund regularly.

However, the proportion of fact to fiction has little to do with the power of a myth. The myth of Social Security has been such a staple of American political life because it is so simple and seems to capture so well the basic idea of fairness and equity: If you work hard and set aside some of your earnings every paycheck, when you can no longer work you will still have enough to live a decent life. The government won’t give you the money; it’s your money. But government will guarantee that the money will be there.

    That sounds a lot like the recent rhetoric of Barack Obama: If you work hard and play by the rules, it’s the government's responsibility to make sure you can live a decent middle-class life, no matter how long you live.

    Yet now Obama’s policies have diverged from his rhetoric, as well as from the policies of all successful politicians since FDR’s day -- and it’s not very big news at all. How can that be?

    The short answer is that the old myth of Social Security no longer has the power it has held since FDR’s day. We are watching a time-honored political myth begin to die.

    Myths don’t die because they are debunked by facts. Myths die when new, more persuasive myths come along to take their place. Soon, the demise of the old myth just doesn’t seem so important any more.

The new myth, in this case, is a story about the baby-boomers who will soon all be retired or disabled. They’ll draw down huge amounts of Social Security money, the myth says, far more than they have put in and far more than the younger generation can provide for them. If we don’t cut their benefits, the nation will go broke.

This myth thrives despite the mountain of facts that contradict it. Social Security is in fine financial shape until the youngest baby-boomers are at least 90, maybe 100. If the system needs more money after that, there’s a simple solution: Raise the cap on the Social Security payroll tax. Right now, no matter how many millions you may earn, you pay that tax on only the first $113K of your income. Start taxing income over $113K (income earned by only the top 5% of wage-earners) and the Social Security fund begins to swell, covering all future needs. It’s a solution that gets huge support from the public, when pollsters ask about it.
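The cap arithmetic is easier to see in numbers than in prose, so here is a minimal sketch in Python -- an illustration added for clarity, not anything from the original post. The $113,700 wage base and the 6.2% employee OASDI rate were the actual 2013 figures behind the “$113K” above; the sample salaries are purely hypothetical.

    CAP = 113_700   # 2013 Social Security taxable wage base (the "$113K" above)
    RATE = 0.062    # employee share of the payroll (OASDI) tax

    def social_security_tax(wages, capped=True):
        """Tax owed on wages, with the cap (current law) or without it."""
        taxable = min(wages, CAP) if capped else wages
        return taxable * RATE

    for wages in (50_000, 113_700, 1_000_000):  # hypothetical earners
        print(f"${wages:>9,}: capped ${social_security_tax(wages):>9,.2f}, "
              f"uncapped ${social_security_tax(wages, capped=False):>10,.2f}")

Under the cap, the million-dollar earner owes the same $7,049.40 as the earner at exactly $113,700; lift the cap and that bill becomes $62,000. That difference, multiplied across all high earners, is the swelling of the fund described above.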

    But, to repeat, myths don’t die because they are debunked by facts. They die when they are eclipsed by new, more powerful myths. The myth that Social Security benefits must be cut is one part of the myth of the “debt crisis,”  which is, in turn, just part of the much larger mythology of homeland insecurity -- the narrative that says America is teetering on the cliff, ready to be plunged into disaster, very possibly even extinction, by some evil force or other. That’s been America’s master narrative, our most powerful myth, for decades.

    The fact that Social Security is sound, and could be made more sound by raising that cap on the payroll tax, doesn’t make headlines or get much air time because it doesn’t fit into our ruling myth of insecurity. And Obama’s decision to propose Social Security cuts doesn’t make a huge political wave because it fits so well into that ruling myth.

    Sure, polls consistently find people opposing those cuts. But the more important point is that -- judging from the media coverage of Obama’s decision and the weakness of the opposition to it -- the public now accepts those cuts as inevitable, because the myth of insecurity is the political water we all swim in.  

    Cutting Social Security benefits even a small amount will be another victory for the myth of homeland insecurity. Once that myth comes to control the public view of Social Security, there’s no limit to the cuts in benefits. Everything and anything must be given up for the sake of national security: That’s been America’s creed for a very long time. When Social Security cuts become routine news, that creed gets a huge boost, opening the door to even larger benefit reductions.

    So there’s much more at stake in the Social Security debate than the amounts of the monthly payments. It’s really a fundamental debate about the mythology that shapes American political life. Will we maintain the New Deal myth, the basic social contract, which Barack Obama voices so eloquently: If you work hard and set aside part of each paycheck, you’ll be guaranteed a decent life after you stop working?

    Or will we scrap it in favor of the insecurity myth, which creates such a different  kind of social contract: Since danger threatens America so massively and so imminently, each of us must sacrifice to save a whole nation on the brink of collapse -- and the poorer you are, the more you must sacrifice? That’s the direction Obama’s policy is taking us in.

    Yet both options remain open. Like FDR and Eisenhower, Obama will go whichever way the political wind blows. The choice is up to us. 

MythicAmerica Returns to Meet the New News, Same as the Old News

[Photo: Protesters in Taksim Square. Credit: Wiki Commons.]

When I left the country back in April for an extended sojourn in Europe I made myself a promise and a prediction. I promised that I would not look at a newspaper or any news source -- cold turkey, for a news junkie like me. I predicted that when I got home and fell back into my old junkie ways, the news would be very much the same as when I left home. It’s a lot like a soap opera: You can skip the news for weeks at a time, and when you turn it back on you feel like you’re picking up right where you left off; you haven’t missed anything important at all.

    Keeping the pledge to abstain was easy. I enjoyed the vacation from the news so much that I extended it nearly two weeks after I got home, with only one exception. Having spent ten days in Istanbul, I had to follow the protests in Taksim Square, a place I’d visited several times during my stay there.

When the protests moved down to the Dolmabahce Palace -- the splendiferous digs of the last Ottoman emperors, where Turkish prime minister Recep Tayyip Erdogan, the main target of the protests, now has his office -- it all got even more real. My wife and I had been to Dolmabahce and stayed in a rented apartment less than a mile away. We immediately emailed our landlord -- a thirty-something, highly secular video editor -- to make sure he was OK. Here’s the response we got:

    “dont worry we are fine so far....but the police continuously assault us like we are enemy, and behave like they want to killl us....we will resist we will win against this f---n’ dictator...we are just standing here doing nothing, just standing peacefully for our rights but if the police assault us we are covering each other........pls share this and help us to kick off this motherf---r dictator above us.” That’s just one Istanbulli’s opinion, though apparently it’s shared by many thousands of others.

    As for my prediction that when I went back to my news junkie habit it would seem like I’d never been away, because so little would have changed: Previous experience told me that was a very safe bet. I’d been through this on plenty of vacations before.

    I got my first hint that it would work the same way this time when, on our flight back to the U.S., I inadvertently glimpsed a headline in a fellow passenger’s New York Times: “Debates over debt and deficit widen rift in GOP” -- the same story I’d seen reported for weeks before I left for my stay abroad.

Today I finally ended my moratorium on the news, clicked those well-worn “NYTimes” and “WashPo” bookmarks on my web browser, and (you guessed it): Meet the new news, same as the old news. The National Security Agency is collecting data on phone calls and web surfing from American companies. I’m shocked, shocked. Why, the next thing you know they’ll be telling us that there’s gambling going on in Rick’s Café Americain!

    The Times site featured another story with roughly equal shock value: “Clashes in the Golan Heights rattled Israel as the violence of Syria’s war threatened to spill over into Israeli-held territory.” How many months have we been hearing that one?

    Over at the Washington Post, the other featured story was “IRS official apologizes for agency’s lavish 2010 conference in Calif.” I’m sure the official added, “We’ll never do it again.” How many years have we been hearing that one?  

    Stories on government electronic eavesdropping here in the U.S. have been running for nearly half a century. Stories on Israeli fear and U.S. agencies’ lavish spending are much older than that.

    All this old news certainly doesn’t mean nothing new happened while I was away. It merely means that the new must be hidden behind the old, or made to look like the old.

Consider the protests in Istanbul. Is it “Arab Spring” or “Occupy Wall Street” redux? That question largely dominates U.S. coverage of the events, though they are in fact unique and don’t fit either category comfortably, as my Istanbulli landlord could easily explain.

    But it’s not the corporate mass media’s job to challenge us with anything really new. Their job is to deliver audiences to advertisers. In the editorial rooms of America’s mass news media they know the basic truth of their profession: It’s the story, not the news, that hooks the audience. And the most satisfying stories are the ones that are old and familiar.

    Which goes far to explain why our news is so much like soap opera: stock characters in simplistic, stereotypical conflict situations, reciting trite and predictable dialogue, with plot lines that go on for decades full of superficial twists and turns, all designed to evoke the strongest possible emotional response from an audience that seems to have an inexhaustible capacity to be gripped, fascinated, and perhaps titillated by it. Only one set of American TV shows has greater longevity than soaps like General Hospital and As the World Turns: the network news shows.

    In this sense we are little different from our ancestors who gathered around the village bonfire to hear respected elders recite the same old myths. Every recitation might well include some small innovations (as scholars of myth have proven over and over again). But the essential story line remained as familiar and satisfying as ever.  

    If you’re asking the all-important question, “So what?” -- and I hope you are -- I found at least part of the answer when I waded through weeks of email accumulated while I was away. Much of it came from independent, non-corporate news sources, all sending me their own variations on the same single message: Something is outrageously wrong in our world and it must be fixed, immediately!

    I didn’t take the time to read every one. But a random sampling, and years of reading similar emails, suggests two conclusions: Most of them are probably quite correct. And, like the corporate mass media news, these are the same messages I’ve been reading and hearing for years. In progressive circles, too, when you meet the new news it’s generally pretty much the same as the old news.

    One change over recent decades is surely worth noting: Stories that used to get major coverage only in progressive indy media are now more likely to make headlines in U.S. mass sources like the Times, the Post, and the network TV news. NSA spying is a good case in point. Ditto for global climate change, bloated military budgets, bloated executive salaries and other corporate abuses, violence against women and minorities (and gun violence of all kinds), shameful treatment of military vets, and the list goes on and on.

    Yet despite all the exposure such outrages now get in the mass media, meaningful change is hard to find, as my chock-full inbox reminded me. Why? The answer is as complex and manifold as the explanation of any chaotic system in nature.

    But one piece of it that is largely overlooked is the way our mass news media give us soap opera as if it were the real world. Soap operas thrive on vivid depictions of outrageous happenings. They can make us moan and mourn and sometimes even shed real tears. But it never occurs to anyone in the audience to feel responsible for putting a stop to the outrages.

    Indeed that would defeat the whole point of the soap opera, which is to keep the story, with all its outrages, going. The motto is not “Fix the world, now,” nor even “Here’s what the world is really like.” It’s “The show must go on.” As I return from my moratorium on the news, it strikes me that mass media news runs by the same motto.

    That’s one reason I’m bringing MythicAmerica back after its brief hiatus: to give concrete examples of the soap-opera-ish, mythic patterns that dominate public discussion of the most crucial issues of the day, making it hard for Americans even to grasp the actual realities of the world we live in, much less respond to them in constructive ways.

The Neverending Morality Play of the Deficit Hawks

    When I returned from a long stay abroad, my first blog post noted how much the new news at home looked like the old news, as if I’d never left. I assumed that new, important events had unfolded. They just didn’t make the headlines. Sure enough, I had barely returned to my news junkie habits when Paul Krugman confirmed that I was right.

    For years, he wrote, he and other liberal economists have been fighting “the policy elite's damaging obsession with budget deficits, an obsession that led governments to cut investment when they should have been raising it, to destroy jobs when job creation should have been their priority.”

    The big news: “That fight seems largely won -- in fact, I don't think I've ever seen anything quite like the sudden intellectual collapse of austerity economics as a policy doctrine.”

    A bit of research suggested that Krugman’s victory celebration (shared by The New Yorker and others) is probably a bit premature. The academic champions of budget-cutting did take some serious body blows while I was away. But they are still fighting back, with plenty of influential allies.

    And it’s a long way from the halls of academe to the halls of Congress where the federal budget is actually made, and where budget hawks still wield the same power they did before I left the country.

The battle of economic theories is still being waged in the White House, too, where the outcome is likely to be the kind of “split the difference” policy that Barack Obama opts for on nearly every major issue.

    Why does the passion for cutting government spending stay so strong, when it’s increasingly clear that it is indeed a passion, not a rational view well-grounded in solid evidence and theory?  Krugman’s answer takes the conversation directly into the realm of myth:

    Everyone loves a morality play. “For the wages of sin is death” is a much more satisfying message than “Shit happens.” We all want events to have meaning. When applied to macroeconomics, this urge to find moral meaning creates in all of us a predisposition toward believing stories that attribute the pain of a slump to the excesses of the boom that precedes it.

    He cites the well-known example of German bankers chastising the Greeks for indulging their own passions to excess. Now the Greeks, like all sinners, must pay the price. Bail them out and you’ll just encourage them to go further on the road to hell, which is paved with government-funded good times for all, whether they earn it or not.

    The same kind of moral fear of pleasure-seeking foreigners has haunted white middle-class America for a long time (as I noted in my very first MythicAmerica post). Among economic theorists, too, it’s an old story. Economist Steven Conn traces it back to the Gilded Age, digging up this pithy quote from the prophet of “Social Darwinism,” William Graham Sumner:

    If we should try by any measures to relieve the victims of social pressure from the calamity of their position we should only offer premiums to folly and vice and extend them further....The only two things which really tell on the welfare of man on earth are hard work and self-denial.

    “Except for its eloquence,” Conn adds wryly, “that sentence could have been uttered by any member of the GOP leadership yesterday.”

    But let’s be fair to America’s deficit hawks. Unlike their German counterparts, they don’t typically promote their case by preaching the moral dangers of excess and vice.

    What they claim over and over again (at least in public) is that they are merely trying to save the nation from the economic disaster that is bound to befall us if the national debt grows too large. As GOP economic guru Bruce Bartlett put it, “The debt limit is the real fiscal cliff.”

    And like falling over a cliff -- or like the biblical apocalypse -- it could all happen in a flash. Washington Post columnist Robert Samuelson quotes this frightening vision of instant collapse from austerity theorist Barry Eichengreen: “Investors will wake up one morning and conclude that the situation is beyond salvation. They will scramble to get out.…The United States will suffer the kind of crisis that Europe experienced in 2010, but magnified.”

    The American public has long shown itself ready to respond to such warnings of a sudden surprise attack. But in the past the attackers haven’t been bond traders. They’ve been fascists, communists, or terrorists. The deficit hawks are giving us a new version of the old myth of homeland insecurity: America is constantly at risk. Arm yourself well, circle the wagons, proceed with the utmost caution. Any morning you may wake up and find your nation under assault. Any misstep might plunge you into the abyss of national catastrophe.

That kind of worldview is bound to make most people more conservative. And conservatives can tie the rising federal debt to traditional fears of foreign foes, citing warnings that the debt is “the most significant threat to our national security” (Admiral Mike Mullen, former chair of the Joint Chiefs of Staff) and puts the U.S. “at risk of squandering its global influence” (New York Times analyst David Sanger).

    That’s not to say the morality-play theory of Krugman and Conn is irrelevant. On the contrary, it fits into the pattern of conservative fears quite easily. If every new experience that brings pleasure is bound to be followed by pain; if every burst of excess is bound to provoke punishment; if the only way to avoid punishment and pain is a limited, constricted life of constant self-denial; then the world must indeed look like a dangerous place, full of pitfalls everywhere, with every step a risk that wise people will surely avoid. That’s the kind of world the myth of homeland insecurity gives us.

    Like any deep-rooted mythic worldview, it is largely impervious to logic. It’s never easy to give up, or even question, the narratives we use to give our lives meaning. As Steven Conn warns his fellow economists, if they view policy “as the expression of unyielding [moral] principles, admitting that you were wrong means the whole edifice comes crashing down. Which is why we may be stuck with austerity economics for the foreseeable future.”

    At least it’s a safe bet that the debate between budget hawks and doves is very likely to continue. It’s just what last winter’s “fiscal cliff” debate was: another battle in an ongoing policy war that is really a war of competing myths and has no end in sight. As I said, Professor Krugman’s celebration is probably premature.

Guns and the NSA Make Strange Bedfellows

    I was chatting with my local state legislator the other day about guns. He supported the gun control measures that passed in Colorado this year. But he took far more criticism (and lost far more campaign contributions) for those votes than any other he cast. Many of the critics are liberal on every other issue, he told me; they just won’t abide a law that limits them to “only” 15 rounds in a clip.

    The gun control issue brought conservatives, moderates, and even some liberals together. And it riled people up like nothing else in his district, where nearly all the voters live in cities and suburbs, though a few live in rural areas.

The rural vs. urban/suburban divide struck him as a key to the issue. The main arguments he heard against gun control centered on self-protection: If you’re alone and attacked, you’d better have plenty of ammo. That argument might make sense, he opined, in rural areas where you’ve got to wait a long time for the police to show up if there’s trouble. But it makes little sense in urban/suburban areas where “the law” is just minutes away.

    I’m not convinced by my legislator’s argument about the valid needs of rural folks. But his viewpoint helped me understand why there is such powerful resistance to gun control across the nation, and not just from conservatives.

    There’s a widespread belief -- whether the data bear it out or not -- that people living on scattered farms or out in the woods or up in the mountains need a lot of firepower to protect themselves from evildoers, because they are so far from the centers of population and legal authority.

    But my state legislator’s district is a microcosm of the whole country: Only a vanishingly small fraction of Americans live in such isolated areas. Why then all the resistance to gun control? Part of the answer, surely, is that so many people living in urban/suburban areas view the issue as if they themselves were the “rural folks” of their imagination. 

    The imagined countryside has been shaping the politics of America for a long time on all sorts of issues. No matter how urbanized we become, rural values and the rural lifestyle have always retained a privileged place. You can still find plenty of apartments in the biggest cities decorated with those sweet, pastoral Currier & Ives prints, reminding everyone who enters that the countryside -- at least in its mythic version -- is the “real” America.

    When it comes to the link between guns and the rural lifestyle, I suspect the biggest influence is the endless stream of “cops vs. bad guys” stories filling our movie theaters and TV screens. They are most often set in big cities. Yet they are typically just updated, high-tech versions of the old stories of the rural frontier.

    Even when the urban lawmen work for governmental institutions, they so often win the battle by acting as if they were fighting the evildoers all alone, or with one partner at best. And one of the most beloved plot lines is the lawman (or ex-lawman) who can defeat evil only by working outside the institutional structures of the state.

    Urban/suburban dwellers are fed a steady diet of this kind of narrative, set in increasingly realistic visuals of neighborhoods that look much like their own. How easy it must be for them to imagine that, when it comes to issues of “law and order,” they too  are still living in the rural frontier -- reassuring them that they, too, are still “real” Americans. And out there where the mythicized “real” Americans live, you can’t depend on the institutions of the state to protect you.

    Colonial historians tell us that this feature of the urban-rural divide was already evident even before there was a United States of America. Travelers to the colonial frontier -- the hills of Appalachia -- who left written records were often struck by the anarchy they saw: everyone assuming it was up to them to protect themselves, their families, and their property, since there were no agents of the state to do the job. And everyone on the frontier was well armed for the task.

    These writing travelers had come from the more urbanized Atlantic coast. They may well have exaggerated; what they were really recording was the contrast between their accustomed social ways and the rather different ways of the frontier. In those days the dominant trend was to scorn the rural style. Over the centuries, though, the cultural balance shifted markedly in favor of the frontier, as our movies and TV shows remind us every day.

This rural bias extends far beyond the issue of guns. (Perhaps, for example, to the new laws that would give special treatment to undocumented immigrants if they’ve worked in agriculture, though there are plenty of economic factors at work in that one, too.)

    And it’s not just a question of the absence of state authority. The myth (and often, no doubt, the empirical reality) of the early frontier is marked by a stout resistance to agents of state authority -- most famously in the form of “revenooers,” who could try to collect taxes from well-armed moonshiners only at the risk of their lives.

    At a later stage of our history the frontier gave us a seemingly opposite myth: the courageous sheriff who is appreciated, even revered, by the townsfolk for protecting them when the bad guys ride into town. But this narrative has always been counterbalanced by a certain reverence for outlaws. In this counter-narrative, the lawmen are typically cast as agents not so much of the state as of the bankers and other wealthy capitalists who victimize “real” Americans.

    And it doesn’t necessarily take an outlaw to defeat these rich evildoers. Another iconic image is the poor but honest, hardworking farmer who can resist foreclosure all by himself -- because he has his gun and is ready to use it when the sheriff, doing the bidding of the banker, arrives.

    The difference between the revered and reviled agent of the state is easy to see: The former protects against trouble that arrives from outside the community, whether it’s British “redcoats” or (in much later generations) “commie reds,” “outside agitators,” and eventually “terrorists.” The state lawman becomes an enemy of the people when it’s a question of troublemakers who live among us, especially those in the upper economic crust. The dominant tradition tells us that, unless danger threatens from beyond our borders, “real” Americans want to be left to take care of themselves -- with gun in hand.

    This same tradition sheds an interesting light on the current controversy about the NSA’s data-gathering programs, which has also brought sharp protest from across the political spectrum. The politics of privacy has made left and right strange bedfellows, we are often told. True, both sides are protesting the same policy. But it’s less often noticed how different their motives are.

    From the left, there’s no objection to the government collecting data about us, as long as it’s for a good reason: data about things like health care, housing, education, the environment. Most people on the left expect the data to be anonymous. But I have a hunch many would be willing to have their names attached as long as they felt sure it was serving such benign purposes.

    The real objection from the left is that all this spying on us is done in the name of “national security” and the “war on terror.” The national (in)security state that spies on all of us is the same state that has hung the nuclear threat over our heads, killed millions in Vietnam and Iraq, and taken so many trillions in tax dollars that could be used for humane purposes. The paranoia, like the violence, of the (in)security state seems to know no limit. That’s the essence of the complaint from the left about violations of our privacy.

    From the right it’s quite a different, and more mixed, story. Some conservatives agree with the left that the NSA’s snooping is part of a war against terrorists. But since they see those terrorists as intruding foreigners, they accept (maybe even applaud) the government agents invading their privacy.  

    For others on the right, though, it’s the latest chapter in the old story of rural, “real” America -- one that would have been very familiar to those colonial pioneers in Appalachia, so disdained by eastern visitors. The NSA is the sheriff as bad guy: the nose of the state and its elite, snooping and prying into our lives; just one more set of “gummint” agents, in a long line stretching back nearly three centuries, who won’t leave us alone to live our lives -- and settle our differences -- on our own, as we please, with our own guns in hand.

    From the right, then, the resistance to NSA spying grows from the same root as the resistance to gun control laws. What I learned from my state legislator is that this root has been spreading leftward (spreading largely unnoticed, as roots will do), because so many who would not call themselves conservative still prize their imagined version of rural America.

    Even on the left, protests against government intrusion may be fueled by more than resistance to the national (in)security state. The myths and values of rural life work their influence, unseen, across the political spectrum. They make gun control laws harder to pass. But they also fan the flames of outrage over NSA spying on U.S. citizens.

    If the ideal of the rural as the “real” America did not still wield so much cultural power, we would probably have a lot more gun control laws. But we might not have nearly so much controversy about the NSA violating our right to privacy.

    Cultural myths, when they enter the political arena, can make strange bedfellows indeed.  

    Of course we always have the option of recognizing those myths at work, setting them aside for purposes of policy debate, and deciding each policy issue on its own genuine merits. But I wouldn’t hold my breath waiting for that to happen.

    On DOMA, Right-Wing Justices Got It Right -- and Wrong

    [Image: Antonin Scalia in 2010. Credit: Wiki Commons.]

    (This post is dedicated to my son, Angel, and his spouse, Thomas, who had to leave their home state and go to another state simply to exercise the legal right of enshrining their love in the bonds of matrimony.)

    No one has ever accused Justice Antonin Scalia of timidity. So it’s not surprising that his opinion in United States v. Windsor, the case that struck down the federal Defense of Marriage Act (DOMA), fairly screams: I’m not a bigot. I’m not. I’m not.

    “The majority says that the supporters of this Act acted with malice,” he claims in his dissent. And of course by dissenting he became a supporter of the act. So he must defend himself against the charge that he harbors malice toward gays and lesbians. “I am sure these accusations are quite untrue,” he retorts flatly. “To defend traditional marriage is not to condemn, demean, or humiliate those who would prefer other arrangements... To hurl such accusations so casually demeans this institution.” In other words, “It demeans me!”

    And I don’t want to demean anyone, Scalia clearly implies. I’m “simply supporting an Act that did no more than codify an aspect of marriage that had been unquestioned in our society for most of its existence -- indeed, had been unquestioned in virtually all societies for virtually all of human history.”

    Before we laugh off Scalia as a bigot flailing around to find some way to defend himself, let’s take a closer look at what he said. We liberals sort of take it for granted that bigots will be conservatives, and that conservatives are more likely to be bigots. But we all too rarely ask why that should be.

    Scalia offers a glimpse of an answer here: It’s not that we have some kind of blind, irrational bias against certain groups of people, he argues. We merely want to keep up patterns of thought and behavior that have “been unquestioned in our society for most of its existence.” We want to conserve. Why do you think they call us conservatives?

    Scalia’s partner in conservatism, Samuel Alito, agrees emphatically in his dissent (with which the silent partner, Clarence Thomas, joins): “It is well established that any ‘substantive’ component to the Due Process Clause protects only ‘those fundamental rights and liberties which are, objectively, “deeply rooted in this Nation’s history and tradition.”’ ... It is beyond dispute that the right to same-sex marriage is not deeply rooted in this Nation’s history and tradition. ... Nor is the right to same-sex marriage deeply rooted in the traditions of other nations.” So how can it be a fundamental right?

    In other words, if you ain’t doin’ what we’ve always done, you ain’t got no right to be doin’ it -- at least no legal right protected by the Constitution.

    Alito goes on to explain why. The opponents of DOMA want “the recognition of a very new right, and they seek this innovation ...  from unelected judges.” Those judges had best be cautious, he warns, because “the family is an ancient and universal human institution. Family structure reflects the characteristics of a civilization, and changes in family structure and in the popular understanding of marriage and the family can have profound effects.” No one “can predict with any certainty what the long-term ramifications of widespread acceptance of same-sex marriage will be.”

    In other words, when you start doin’ things different from what we’ve always done, no one knows what you’ll do next. That makes life feel unstable, off balance, and dangerous. Conservatives conserve because it makes them feel safe.

    Later in his opinion Alito seems to take the argument in a different direction: “Throughout human history ... marriage has been viewed as an exclusively opposite-sex institution and as one inextricably linked to procreation.” Now we’ve got a new, “‘consent-based’ vision of marriage, a vision that primarily defines marriage as the solemnization of mutual commitment -- marked by strong emotional attachment and sexual attraction -- between two persons. ... Our popular culture is infused with this understanding of marriage.” So is the argument for same-sex marriage, Alito implies. If you accepted the old view that marriage is mainly for procreation, same-sex marriage would make no sense.

    Since “the Constitution does not codify either of these views of marriage,” he concludes, it should be up to the people, not the courts, to decide between them. And the people, through their elected representatives, have the right to choose “between competing visions of the good, provided that the vision of the good that they adopt is not countermanded by the Constitution.”

    Of course the Court majority has now decided, in effect, that the vision of the good that links marriage to procreation is countermanded by the Constitution. But the conservatives say the majority is wrong -- because, and (as far as their written dissents tell us) only because, the old definition of marriage is, well, old: “deeply rooted in this Nation’s history and tradition.”

    So it looks like “the purpose of marriage” and “the nature of the family,” which opponents of same-sex marriage generally cite as the crucial issues, really aren’t. The crucial issue is playing it safe by conserving tradition.

    More evidence comes from the Bipartisan Legal Advisory Group of the U.S. House of Representatives (BLAG) and its lead lawyer, Paul Clement, who represented the pro-DOMA forces in oral arguments before the Court. BLAG’s brief to the Court argued in detail that the purpose of marriage is for “providing a stable structure to raise unintended and unplanned offspring,” “encouraging the rearing of children by their biological parents,” and “promoting childrearing by both a mother and a father.”

    But when Clement argued before the Court he never said a word about any of this. Not a word. What he talked about, over and over and over again, was the need to preserve the “traditional” understanding of marriage.

    During the oral argument Justice Breyer summarized Clement’s view succinctly: “There has been this uniform one man - one woman rule for several hundred years or whatever, and there's a revolution going on in the States. [Congress said] we either adopt the revolution or push it along a little, or we stay out of it. And I think Mr. Clement was saying, well, we've [i.e., Congress] decided to stay out of it and the way to stay out of it is to go with the traditional thing.”

    Behind all this legalese we see the link between bigotry and conservatism. The essence of bigotry is treating people unequally -- giving rights, privileges, and respect to one group that are denied to another group. The people with the rights and privileges pretty quickly get used to having those advantages. After a while the inequality becomes tradition, the way things have always been. As long as things stay that way, the world seems familiar, predictable, and therefore safe.

    But as soon as there is any substantial step toward more equality -- whether it’s women getting the vote, blacks going to school with whites, undocumented immigrants getting a path to citizenship, same-sex couples getting married, or whatever -- conservatives think, “Hey, if this can happen, who knows what can happen next?” As Justice Alito wrote, “No one ... can predict with any certainty what the long-term ramifications” will be.

    So the right-wing justices did get something right. As they made clear, resisting same-sex marriage is merely the most current example of what’s always the really crucial issue for conservatives: continuity versus change; the familiar, which is predictable, stable, and safe (or so they want to believe), versus the unfamiliar, which feels so unpredictable, unstable, and scary. The essence of their arguments, whatever the issue, always comes back to conserving the old so we can avoid the uncertainty of the new.

    Justice Anthony Kennedy, in the majority opinion, pointed out part of what the conservatives got wrong. If you really want to live in a stable, predictable society, he wrote, you should strike down DOMA. “By creating two contradictory marriage regimes within the same State, DOMA forces same-sex couples to live as married for the purpose of state law but unmarried for the purpose of federal law, thus diminishing the stability and predictability of basic personal relations the State has found it proper to acknowledge and protect.”

    Kennedy recognizes the value of stability and predictability. But he understands that sometimes you get more of it by allowing change than by keeping things rigidly the same. That’s what makes him the swing vote on the Court.

    Kennedy says that this decision applies only to people already married in states that allow same-sex marriage. By his own logic, though, he should have gone further. If the goal is a more stable, predictable society, as conservatives claim, then all couples who have made life-long commitments to each other should be able to get legally married.

    After all, conservatives constantly tell us that marriage is a bedrock institution for preserving social stability. Maybe, maybe not. But if it is, then the more couples who are married, of whatever gender, the better off we all are. The conservatives on the Court seem to have missed that obvious point. They’re so scared of change that they can’t see when it is promoting their ultimate goals. That’s what they really got wrong.

    Now, with the Court’s decision in hand, more courageous and clear-thinking conservatives can join with moderates and liberals across the country to honor the “personhood and dignity” (as Justice Kennedy put it) of every American by granting everyone the right to marry whomever they love.   

    Who Says Conservatives Are More Patriotic?

    As we got busy preparing for Fourth of July festivities, this question popped into my head: Are conservatives more patriotic than other Americans? If you were a foreigner spending some time in the USA, getting news from the mass media and just talking to people, you might easily get that impression -- especially around the Fourth, when conservatives seem to be the ones most likely to display those big American flags.

    In fact you might easily get that impression on any day of the year, when conservatives seem to be the ones most likely to put their love of country on display in all sorts of ways, aiming to leave no doubt in anyone’s mind about their patriotism.

    But what’s the truth behind the display? Are conservatives really more patriotic than others? Well, it depends on what you mean by patriotism.

    And there lies the heart of the matter: Conservatives appear to be more patriotic because they have so much control over the very meaning of the term. Most of the time, when anyone uses the word “patriotism,” it turns out to mean what conservatives say it means.

    Debates about the meaning of patriotism may rage in the margins of our political life. But in ordinary day to day America, where the real action is, nobody pays much (if any) attention, because the fundamentals of patriotism are generally taken for granted. And they are assumed to be pretty much what conservatives usually say they are: the right words (“greatest, and freest, country on earth,” “support our troops,” “I regret that I have but one life to give,” etc.); the right images (Uncle Sam, Statue of Liberty, Capitol dome, etc.); the right actions (waving the flag, singing the national anthem, etc.) -- the words, images, and actions that they love to flourish on the right. So of course most Americans say the right is more patriotic.

    Oh, sure, on the Fourth of July you’ll find even the most liberal politicians throughout the land proclaiming their particular brand of liberalism as truly American and genuinely patriotic. Politicians of every stripe do that every day. Most organizations that have any significant clout, across the political spectrum, will loudly assert their patriotism too, if they are pressed to say anything about the issue. But expressions of patriotism outside the conservative orbit are widely received as a kind of window-dressing, not to be taken too seriously.

    Conservatives' expressions, on the other hand, are generally seen, in the mainstream of the culture, as the genuine article. They are credited as totally serious and as an essential piece of the whole conservative package -- naturally, since patriotism is defined so largely in conservative terms.

    But this seemingly intrinsic connection between conservatism and patriotism is only an appearance. It’s like a magic trick. A good magician’s tricks are so dazzling because the audience wants to be dazzled; the trick is a transaction between the magician and the audience.

    In the same way, the idea that conservatives are especially patriotic -- that they understand and feel patriotism more deeply, that it’s more fundamental in their lives -- has taken root throughout American political culture only because everyone who is not conservative has agreed to play along. Conservatives control the meaning of patriotism because most everyone else lets them get away with it.

    As long as conservatives have such a strong lock on patriotism, they have a built-in advantage in the political arena -- especially among the 20 percent or so of voters who don’t have any special allegiance to either major party, leaving their votes always up for grabs.

    A lot of those uncommitted voters stay that way because they don’t see much clear-cut difference between the Republicans and the Democrats. When you look at two alternatives that appear roughly equally balanced, any one factor can tip the scales. Who knows how many votes Republicans get from voters who see the two parties as roughly equal, except that the GOP appears to be so much more devoted to patriotism? The GOP will always have that advantageous appearance as long as its control over the language, imagery, and ritual of patriotism goes unchallenged.

    Moderates and liberals could push back. They could take a firm stand in favor of their own brands of patriotism; show that theirs are just as genuine as any conservative’s; insist that patriotism is just as important in their lives as in any right-winger’s; make the meaning of patriotism a defining political battleground, as important as gender rights or immigration or Social Security. Even on the progressive far left, there is plenty to contribute to a conversation about patriotism.

    Trying to challenge conservatives on this ground would be an uphill struggle, to be sure, because they have a major advantage on the right: They are generally quite sure that they know what patriotism is, and they tend to agree with each other on their definition. So they present a pretty solid united front (at least when viewed in the rather hazy, general terms that most Americans see all things political).

    Everywhere else on the political spectrum there is a lot more questioning, disagreement, and uncertainty about the true meaning of patriotism, though the degree will surely vary from point to point on that spectrum. The further you go toward the left, the more uncertainty there is about whether patriotism of any kind has any value at all. Eventually you reach a point where it’s widely taken for granted that patriotism is something bad, something to be rejected out of hand.

    That extreme stance is not likely to win too many votes, so it doesn’t have much direct political power. Nevertheless it has an important political effect: The questions about patriotism raised so pointedly on the far left have seeped across the whole left side, and even into the center, of the political spectrum, stirring up the uncertainty that weakens the Democratic Party on this issue.

    It’s been going on for a long time. In the 1830s William Lloyd Garrison, the great preacher of nonviolent abolitionism, wrote: “Breaking down the narrow boundaries of a selfish patriotism, [I have] inscribed upon my banner this motto: My country is the world; my countrymen are all mankind.”

    But the most important historical root of our current situation is, without doubt, the Vietnam war. As the antiwar movement grew, so did the belief that patriotism was the last refuge of the scoundrels who had led us into, and now perpetuated, the war -- from Johnson and Nixon on down to the millions who gave unwavering support to those presidents’ war policies.

    Those millions waved their flags and spouted patriotic rhetoric as a sign of their support for the war. So it was perhaps inevitable that, from the antiwar side, it became harder and harder to distinguish patriotism from militaristic chauvinism.

    To be sure, some antiwar activists went out of their way to insist that they were the true patriots; they even carried American flags as they joined the protesting crowds. But their message was drowned out by the louder voices on their side decrying patriotism as a root of the war’s evil. And antiwar patriots were largely ignored by the mass media, who were eager to put the spotlight on every “Amerikka” sign they could find.

    One telling example: When Martin Luther King first publicly denounced the Vietnam war (a year to the day before he was murdered) he stressed that he was speaking out because of his deep love for his country and its ideals. But in antiwar circles then (and in liberal circles now) his patriotism was almost always ignored. All that got remembered was his eloquent critique of the war and of “the greatest purveyor of violence in the world today: my own government.”  

    The Vietnam war era raised questions about the meaning and value of patriotism more profoundly and persistently than ever before in U.S. history -- questions that large numbers of Americans found unsettling, at least, even if they never bothered to think them through very systematically. The war excised the taken-for-granted patriotism that had once been the heart of American political culture. Instead of sparking a public debate about patriotism, though, it left only a gaping hole in the body politic.

    Surely one part (historians will always argue about how big a part) of the rightward shift of the latter 1970s was a desire to escape that unsettled feeling and fill that hole by returning to the “good old days” of unquestioned patriotism. Ronald Reagan was the ideal pitchman for the job, selling the old-fashioned wine of patriotism in new bottles that perfectly suited the times. The demand was huge. And the supply, from Reagan and the right-wing movement he led, was unlimited.

    Those who refused to buy Reaganism also refused to buy the heady brew of patriotism he was peddling, and vice versa. But they had no alternative vision of patriotism to offer because they were caught in the uncertainty about, or outright rejection of, patriotism that the war had brought them.

    So the deal was sealed: Patriotism would come from the right. And whatever came from the right would be -- by definition -- the accepted meaning of true patriotism. Where else could that meaning come from, with the rest of the political spectrum in such disarray on the subject?

    Moreover, the right was offering expressions of patriotism that had deep roots in America’s past, while the rest, if they wanted patriotism at all, would be happy only with some genuinely new formulations. At a time when so many Americans felt like changes were coming too thick and fast, the seemingly old had a natural advantage over the new. Conserving the familiar expressions of patriotism was more popular than the alternative of liberating patriotism to find new meanings and new values.

    This was one of the many lasting effects -- and one of the great tragedies -- of the Vietnam war. How different things might have been if all the war critics, all the liberals, even all the radicals, had followed Dr. King’s lead and framed their antiwar sentiment within an overarching patriotism: a commitment to making a better America because they loved America so much. They might have declared, in all honesty, that they were trying to save America, as well as Vietnam, from all the evils the war brought; that they clearly loved their country more than conservatives, who applauded a war that did the U.S. (and, of course, Vietnam) so much harm.

    It didn’t happen that way, and the damage was done. But it’s never too late for moderates, liberals, and even leftist progressives to start proclaiming their patriotism loud and clear. Yes, it would be an uphill struggle to break the perception of a conservative monopoly on patriotism. The conservatives do have all those advantages. But fighting for what’s right against daunting odds is the American way. What could be more patriotic?

    So this Fourth of July, if you’re sitting in a crowd waiting for the fireworks to begin and you’re not a conservative, you might seek out someone to your political right and say, “Hey, let’s talk about the real meaning of patriotism.” Maybe offer them a cold beer, too. It’s the American way.  

    Liberals Tolerate Ambiguity More Than Conservatives

    [Image via IMDB.]

    In my little corner of the world it’s time for a celebration. A book -- an entire book -- has just been published devoted to my favorite subject: the role of myth in American history. That doesn’t happen very often.

    Time No Longer, by Patrick Smith, is hardly the definitive book on the subject. (That will be written by a great historian of the future, one perhaps now still in grade school.) It’s a journalist’s energetic gallop through the history of the nation, full of the kind of sweeping generalizations, illustrated with anecdotes, that journalists so often love.

    Some of Smith’s points are smart enough to deserve applause. Many are dubious, or debatable at best. But when debatable, or even dubious, claims are about important issues they are intellectually stimulating. Such claims show up often enough in this book to make it definitely worth a careful, critical read.

    What I found most provocative is Smith’s basic argument: Myth and history are mutually exclusive ways of experiencing life because myth, by definition, takes place outside of time and history. The United States is a unique nation because its people have always opted, and still opt, for myth. Thus, while other nations remember their actual history and live in real history, grappling with its constantly changing challenges, Americans take refuge in a mythic past that allows them to avoid facing the changes, and the demands for decision, that history brings. And that refusal to deal with the reality of history is the nation’s biggest problem, the source of our biggest mistakes, the one thing above all we have to change in our national life.

    There’s so much here to think about and write about that one hardly knows where to begin.

    For example, the claim that Americans have embraced their national myths much more strongly than other nations, so that myth plays a uniquely powerful role in American life. Europeans have been saying so for a long time, and Smith cites them as his authority. But perhaps that’s merely a way for the Europeans to validate their mythic images of their own nations.

    Smith builds his case mainly by contrasting the intellectuals of other lands to the views of ordinary, average Americans. Don’t ordinary, average folks in France or Russia or China or anywhere else see their nation’s life through the lens of long-standing myths? That’s a question worth a lot more discussion than it usually gets here in the U.S.

    And who says that “timelessness” or “ahistoricality” is the essential quality of myth? Smith got the idea, he tells us, from reading the works of Mircea Eliade, who was once the king of the academic study of religion. But that was about four decades ago. The king has long been dethroned, his work now read by scholars of religion mainly in graduate courses on the history of their discipline. There are so many other things to say about myth that are now considered more, or at least equally, important.

    (It’s curious how often preeminent academic figures lose their standing in their own discipline, but go on for decades being cited as authorities by writers in other fields of study.)

    Nevertheless, it’s true that the urge to “escape from time” and “live outside of time” does play, and has always played, an important role in American political culture, shaping the ways Americans relate to their past, present, and future. Even when there is public clamor for “change,” the most popular changes are typically the ones that seem to undo recent history and let us live once again in the past as we fondly imagine it, allowing us to nurture the illusion that time has not affected us. So the whole issue of time and timelessness in public life deserves far more attention than it usually gets. Any writer who brings it to our attention as forcefully as Smith does deserves our thanks.  

    He deserves more thanks for digging so deeply into this issue and coming up with some really useful insights. The most useful, to me, is his observation that the writings of the earliest European immigrants to the “new world” are filled with both anticipation and anxiety, and that this makes perfect sense, because the uncertain future they faced was bound to evoke both moods.

    Ever since, American culture has been shaped by the contrasting myths that I call “hope and change” and “homeland insecurity.” By Smith’s logic, this should be clear evidence that our culture has remained steeped in uncertainty about the future. Yet he argues that we are so steeped in myth precisely as a way to avoid and thus reduce uncertainty. If so, since we have been telling and living by our myths for all these centuries, we should have a much lower degree of uncertainty by now, and hence less need for the myths. Yet we keep on telling and living by the myths as much as ever.

    There is an obvious solution to this only apparent paradox: The more we try to avoid uncertainty the more we foster it; what is repressed returns stronger than ever, just as Freud predicted. As uncertainty grows, so does the ambivalence of anticipation and anxiety that it brings. Smith does not articulate this conclusion explicitly, but it seems implied by his whole analysis.

    (To take an example from a great American mythic drama: When Scarlett O’Hara concludes “Gone With the Wind” by saying “I’ll think about that tomorrow. After all, tomorrow is another day,” don’t we know that her hope is rather superficial, that tomorrow she will be even more anxious about the future than she is today?)

    If Smith’s premises are thus both debatable and stimulating, his conclusion is more so, and more important:

    On September 11, 2001, he writes, “America’s long mythological notion of itself crumbled along with the Manhattan towers....From those moments onward, America became part of history.” Americans had to face the naked fact of uncertainty because their mythic tools for evading that fact were gone.

    Smith grieves for the tragic way it happened, but he concludes that this outcome is ultimately all for the good, because a nation living in myth does bad things -- as he demonstrates in lengthy chapters on the Spanish-American war, the cold war, and the “war on terror.” A nation finally forced to face and take responsibility for its history will be, he contends, a better land and a better citizen of the world.

    It’s a fine story, as we would expect from an experienced journalist whose professional skill is writing news stories. In fact it’s a pretty good myth. The narrative has broad explanatory power. It endows a historical event with a trans-historical meaning that embraces the past, present, and future of a whole nation and has profound implications for the whole world. A story that rich is bound to pack a powerful emotional punch, as this one does if you take it seriously.

    But how seriously should we take it? As a description of what actually happened in post-9/11 America, to call it dubious is an understatement -- as Smith himself makes clear. His concluding chapter, on the response to the 9/11 disaster, shows that the mythological structure did not crumble at all. On the contrary, he demonstrates in vivid detail how 9/11 revived and intensified the power of our national myths and how those myths profoundly shaped the response to the disaster.   

    What Smith really means, it turns out, is that America’s mythological notion of itself should have crumbled along with the Manhattan towers, that after 9/11 America should become part of history, that we can no longer afford the luxury of avoiding history and the difficult decisions it imposes on us, that we should now be living by his story: a nation that has abandoned myth to enter into real history.

    A fine story, as I say. But again, how seriously should we take it? Smith wants us to be realistic. But is his call to live without myth realistic?

    Living without myth is a dream that goes back to the eighteenth-century Enlightenment. Though America’s Founding Fathers were deeply influenced by the Enlightenment, they also handed on to posterity (that means us) a legacy of national myth, as Smith argues in some detail.

    Nevertheless they also passed on an Enlightenment legacy, which has always played a powerful part in American intellectual culture -- and even in popular culture, where perhaps its preeminent representative was Sergeant Joe Friday, the mythic detective so beloved for demanding, “The facts, ma’am, just the facts.” We might say that Smith wants all Americans, and certainly all historians, to emulate the good Sergeant’s model.

    But it seems rather late in the day to be chasing what Peter Novick called “that noble dream” of perfect objectivity. Hasn’t Smith heard of Sam Spade in “The Maltese Falcon” or Jake Gittes in “Chinatown,” those mythic detectives so beloved in more modern (and post-modern) circles of American intellectual culture? It was their job to gather facts, and they got plenty of them. But the more facts they collected, the less they really knew. They learned the hard way that the facts themselves can never reveal the truths of human life, for the facts always come filtered through interpretation, wrapped in a story. In the end, even the biggest pile of facts amounts to no more than someone’s -- or some nation’s -- story.

    There is an ongoing tension in intellectual culture about the possibility of living in a world of pure facts, without mythic narratives to give meaning to those facts. Intellectual fashions may ebb and flow on this question. But I think there is little doubt that the long-term trend is to dilute, and perhaps eventually wash away, the influence of the “Sergeant Friday” school of thought and its fantasy of life without myth.

    That same tension gets played out in the political arena, where it’s likely to be with us a lot longer. And it makes strange bedfellows: From the left there is a chorus (which Smith now joins) calling for Americans to drop their myths and face the empirical, indisputable facts; from the right, there is a rather louder chorus calling for Americans to reject the Sam Spades and Jake Gittes among us and take a stand on bedrock certainties. The sharp differences between the two sides make it easy to overlook this rather abstract common ground.

    Both also share an antipathy toward the policies, and often the person, of Barack Obama. That’s hardly surprising if, as James Kloppenberg claims, the premise of Obama’s political life is a rejection of all claims to absolute truth. At the least, anyone who has read Obama’s Dreams from My Father knows that he has a fine novelist’s sensitivity to the power of interpretation, subjectivity, and storytelling in people’s lives. And anyone who has heard his campaign speeches knows that he has an equally fine politician’s sensitivity to the power of mythic narrative in a nation’s life.

    As far as I can tell, the Sergeant Fridays of the left who demand “just the facts” are not yet making any powerful headway in the race for political power. That race, for now at least, is between the Sergeant Fridays of the right and the liberal adherents of the “Jake Gittes” school. Let’s hope the right-wing devotees of certainty, and the whole nation, don’t have to suffer as much as Mr. Gittes did to learn that life without myth may be a noble dream, but it isn’t likely ever to be encountered in reality. 

    What Ever Happened to American Regionalism?

    [Image: Soda vs. pop vs. coke vs. soft drink. American regionalism in a nutshell.]

    I spent several hours last week driving around New York City. For a guy like myself from the wide open Rocky Mountain West, it was rather hellish. I plan to drive in NYC again approximately when hell is covered with a thick sheet of glacial ice.

    It could have been worse, I suppose. I’m not a native Westerner. I grew up in the suburbs of New York. So I had a good idea of what driving in “the City” might be like. I knew that, if you live in the tri-state area, “the City” (and indeed each of its boroughs) is a distinct region, far different from its suburbs.

    And I knew that my wife, a native Midwesterner, made a terrible linguistic faux pas when she told someone that she had business meetings “on” Manhattan and “in” Long Island. She got the prepositions exactly reversed. There’s no logic to it. It’s all just local knowledge.

    I had already been thinking a lot about the power of locale and regionalism in America before my trip to New York. I had just read Ira Katznelson’s recent study of the New Deal, Fear Itself. Katznelson’s main theme is the immense power of the Southern bloc in Congress in the 1930s and 1940s, a power wielded primarily to maintain the harsh cruelty of segregation. Southerners were happy enough to support the New Deal in its earliest years, he argues.

    But during Franklin Roosevelt’s second term they began to see, dimly on the horizon, the possibility that New Deal policies (especially the empowerment of labor unions) might give even a tiny amount of increased political strength, wages, and autonomy to African Americans. The vaguest hint of such changes sent the Southern legislators into a frenzy of opposition, and the New Deal’s energy soon began to sputter.

    Katznelson mentions in passing that the Southerners could wield such power because they allied with Republicans, who were equally fiercely opposed to the New Deal. But everyone already knows about the Republicans, so there’s no need to go into any detail there. What’s new in Katznelson’s book is the obstructive power of the Democrats from the South, so that’s what gets all the attention.

    Whether we are talking about visiting New York City or analyzing the New Deal -- or pretty much anything else in America and its history -- you can’t understand it unless you pay close attention to the power of regionalism.

    Of course there is always the countervailing power of nationalism. When I wanted caffeine to keep up the sharp attention and quick reflexes a driver in New York City needs to survive, the only place I could find to sit and drink coffee was a Starbucks.

    More seriously, Katznelson explains at length that, when FDR put out a patriotic call for all Americans to come to the aid of Britain in its fight against Nazi Germany, white Southerners were the first to rally round the flag. They had good economic reasons: Their agricultural-based economy benefited most from free trade policies that were threatened by the Nazis, and the Roosevelt administration wooed them by putting a disproportionate number of new military bases in the South. FDR ensured white Southern support by promising to keep the rapidly growing military strictly segregated.

    But the white South was also moved by its traditional masculine code of honor, which had always been acted out most vividly in wartime. And white Southerners, despite their intense regionalism, still felt wounded by their lingering sense of being treated as second-class Americans (at best), a vestige of the Civil War that remained very much alive three-quarters of a century later.

    Now the president was warning that the whole nation was imperiled and only a massively expanded military could save it. Whether FDR was right or wrong was hotly debated then in most of the country (and still is debated by thoughtful historians). But in the white South there was little debate. By rallying to the purported defense of the whole nation, white Southerners could exercise their sense of honor and show that they were fully equal to other regions in their patriotism -- thus proving that they now deserved to be treated as fully equal Americans in every sense.

    Ever since, the South has been far over-represented among Americans in military uniform (though since the 1960s that over-representation has included black and Latino as well as white Southerners).

    All this got me thinking about the role of regionalism and its limits in American culture and politics -- especially, in our own time, its limits. It’s really striking how relatively small a role regional identity now plays in American life, compared with the past. The long-term trend seems clearly to be working against regionalism.

    When the nation was born, regional conflicts were fierce. So were state conflicts, even within the same region -- so fierce that there was great doubt a union of the thirteen original states could endure. Political compromises and economic ties were cultivated by political leaders quite intentionally to overcome the centrifugal forces. But the two key factors that held the union together emerged less consciously and more organically: a set of national myths and, among European-Americans, a largely unquestioned assumption of white supremacy.

    Both of those factors played critical roles in reasserting the unity of the nation after it was torn apart by the Civil War. It was only after the Civil War that people began speaking of “the United States” in the singular, instead of the plural form that had been universal before the war: “these United States.”

    One of the few things that really unified white people across the United States in those post-war years was white supremacy. Those who had lost loved ones fighting for slavery and those who had lost loved ones fighting against it agreed (with only a few murmurs of dissent) that white people were inherently different from and superior to those of other “races.” (The quotation marks are there to acknowledge that the concepts of race and of the various races are all social constructions.)

    In recent decades the issue of national unity has been widely raised again. What holds the Union and all its people together? That question has disturbed some substantial number of Americans -- at least among those who speak and write in the public arena -- since the 1960s. But it has not been sparked by any significant resurgence of regionalism. Race and myth are still the key factors.

    It’s no coincidence that the ‘60s was the era of proud self-assertion among the “non-white races.” Those who have raised the question of national unity in worried tones have virtually all been white. In effect, many seem to be asking: Now that we can no longer assume that white people are the “normative” Americans, how can we know what norms bind us all together? The intense reactions stirred by proposals for immigration reform and the acquittal of George Zimmerman are stark reminders that racial issues still carry a powerful punch.

    Of course it’s not fair to assume racial overtones whenever the question of national unity is raised. In the cultural upheaval of the late 1960s, all of the nation’s traditions, not merely its racial biases, were widely called into question. So even among those who sympathized with demands for genuine racial equality, some worried that the nation might not survive the dissolution of its other binding force, its national myths.

    Take, for example, Richard Hughes, author of one of the very few recent books devoted to serious analysis of American political myths, Myths America Lives By. Hughes goes out of his way to note how racially loaded these myths have been, and how differently they are read by African Americans. When he worries that the nation is “in peril of disintegration” he blames not any racial group, nor racial tensions in general, but political factions: the so-called “fundamentalists of the left, who can find no good in America whatsoever,” and the equally dangerous (in his view) fanatics on the right, who cling unquestioningly to the old myths that have so oppressed minority groups.

    The proper middle ground, Hughes argues, is to revise the myths in light of current notions of equality by building on the one narrative that he claims is more than a myth: the national “Creed” that all men (and women too, of course) are created equal and endowed by their Creator with inalienable rights to life, liberty, and the pursuit of happiness. 

    Americans must not “scuttle their national myths,” Hughes concludes, but embrace them “in their highest and noblest form ... with extraordinary humility. ... In this way, their national myths might yet sustain the promise of the American Creed.”

    Is it a feasible project? That’s a question worthy of much more debate than it gets. The myths Hughes writes about -- “the Chosen Nation,” “the Christian Nation,” “the Millennial Nation,” “the Innocent Nation” -- may very well be so imbued with the seeds of oppression (racial and otherwise) that they are beyond saving in any noble form that can sustain the promise of liberty and justice for all.

    But when a Westerner like me survives the rigors of New York City and reads about the depth of Southern influence in the demise of the New Deal spirit, he’s led to ask another question that deserves just as much debate and gets even less. It’s not just whether national unity is really possible, but why it remains such a crucial issue in American public life at all. Why do thoughtful people like Richard Hughes, and so many others, lie awake at night worrying about “what holds us together”?

    What would be wrong with imagining “the United States” as merely a loose administrative structure for a group of quite autonomous regions? Or perhaps an agency for safeguarding human rights and redistributing wealth in the interests of greater equity, or an entity serving only the purpose of protecting its various regions from threats coming from outside U.S. borders, or a vast debating society where we congenially discuss competing myths and values, or any number of other functions one might think of that “the United States” could play, while leaving regions as the principal source of political-cultural identity?

    It’s worth asking why such questions are so rarely raised, while it’s widely assumed that the one question, “What holds us together as a nation?” is a burning question that must be answered, quickly and decisively, if we are to avoid perishing in some imagined catastrophe. 

    Walt Whitman: Dreaming of America, On the Road

    [Image credit: Wiki Commons.]

    With the fiftieth anniversary of Dr. Martin Luther King, Jr.’s greatest speech approaching, I started wondering: Who has a dream anymore? Down toward the left end of the political spectrum, where I spend most of my time, there’s plenty of (usually quite accurate) complaining about the ills that plague our world and some talk about specific policies aiming to remedy one or more of those ills. But hardly anyone ever follows Dr. King’s example and publicly shares a vision of what a far better world would look like. It’s just “not done” these days. Perhaps it feels too naïve, too unrealistic, too embarrassing.  

    I’ve got a piece up today on another site exploring the decline of political dreaming in progressive twenty-first-century America, which starts with my confession: I do have a dream. I dream of a vigorous nation-wide conversation about alternative mythologies of America, new ways of envisioning what the nation is about and where it should be headed. I’ve written here previously about why I think we need alternatives to the mythologies that dominate our political culture now.

    One point I stressed is that new mythologies have the best chance of gaining influence if they are rooted in what’s old, familiar, and respected. So I turned to the mythology of America I found in the words of Dr. King as an example, to get a conversation started.

    As I spent this summer thinking and writing about political dreams, I went back and re-read the work that may well be America’s most important contribution to the literature of political (and cosmic) visions: Leaves of Grass. If we want a source for a national mythology that’s old, familiar, and respected, who better to call on than the venerable poet, Walt Whitman?

    So I’ve written a new essay about an alternative mythology of America, based on my understanding of Whitman. Here are a couple of selections from that essay, which I hope will intrigue you enough to want to read the whole thing:

    America is more than a place. It is a project -- a process with a purpose. Though Whitman describes that purpose in many ways, he comes closest to the heart of his vision of America when he describes the mission of a true American poet: to proclaim “the great Idea, the idea of perfect and free individuals.” Since he maintains that the true American poet embodies the entire nation, he clearly implies that the mission of the entire nation is to promote “the great Idea,” to create and nurture perfect free individuals -- an idea that turns out to be the central thread of his mythology of America.

    Each individual is the center of an endless web of interconnections, which the awakened soul can feel: “Whoever walks a furlong without sympathy walks to his own funeral drest in his shroud.” This awareness of universal interconnectedness is the essence of human perfection. The boundaries that seem to separate one person from another and from all the other realities of the world are seen for what they really are: bridges that connect the individual to everyone and everything else. When the soul evokes even a glimpse of perfection, the individual realizes “the plenteousness of all, that there are no bounds.”

    If we are all parts of the same single pool of humanity -- if there are no boundaries that actually separate one from another -- then there are no limits to define and confine the individual. Moved by a powerful awareness of his soul, Whitman exclaims:

    From this hour, freedom!
    From this hour I ordain myself loos'd of limits and imaginary lines,
    Going where I list, my own master total and absolute.

    Given the awareness of infinite connection -- that each of us contains all others -- each individual must extend the freedom they feel to all others. That’s the essence of democracy.

    Whitman knew as well as anyone, and could say more eloquently than most, that American democracy has always been, and still remains, tragically limited and stunted, a faint foreshadowing of what a true democracy would be:

    Society, in these States, is canker'd, crude, superstitious, and rotten ... [with] the depraving influences of riches just as much as poverty ... We live in an atmosphere of hypocrisy throughout. ... Using the moral microscope upon humanity, a sort of dry and flat Sahara appears.

    But his is a mythic tale. It is not meant to be a snapshot of reality as it exists now. Rather it is a story of the nation’s project, the road that it follows to bring the real ever closer to the ideal, with many “passing errors” and “perturbations” along the way. And there are, in the nation’s past and present, realities that are a partial (sometimes very partial) fulfillment of the ideal. The foundations of the ideal are already set in place, here and now, to point the way.

    The crucial point of his story is that the process of moving toward the ideal continues, and must continue. Wherever we are, “we must not stop here, However shelter'd this port and however calm these waters we must not anchor here.” “It is provided in the essence of things that from any fruition of success, no matter what, shall come forth something to make a greater struggle necessary.”

    Whitman’s most original contribution to the tradition of millennialism is to proclaim that the process has no end. America is constantly fulfilling its mission, constantly moving toward a fuller realization of its purpose: “Others take finish, but the Republic is ever constructive.” The nation’s highest goal is to keep its millennial project alive forever. The road to paradise is itself the American paradise.

    The most authentic way to be an American is to be, like nature, always in process, always restless, always on the move and wanting more:

    To see nothing anywhere but what you may reach it and pass it,
    To conceive no time, however distant, but what you may reach it and pass it,
    To look up or down no road but it stretches and waits for you, however long but it stretches and waits for you,
    To see no being, not God's or any, but you also go thither.

    All patriotic Americans will always be pioneers, “conquering, holding, daring, venturing as we go the unknown ways.” They “know the universe itself as a road, as many roads,” and they are always on the road.

    In a nation whose collective process is a millennial project, any pursuit can be the road to paradise, offering anyone a new way to experience moments of perfection here and now. In a nation dedicated to the freedom to discover and affirm one’s own unique identity, each must find their own road: “Not I, not any one else can travel that road for you, You must travel it for yourself.”

    Well, these are just a few tidbits from my take on Whitman’s mythology of America. They don’t do justice to the richness and complexity of the legacy he left us. But I hope they will whet your appetite enough to prompt you to read the whole essay.

    Bradley Manning Meets Woodrow Wilson: The Secret of the Espionage Act Revealed

    [Image credit: Wiki Commons/HNN staff.]

    Military justice is to justice as military music is to music. In a civilian court, the judge explains the decision as soon as it’s handed down. In the military, the judge just announces the decision and passes sentence.

    In Bradley Manning’s case, Judge Denise Lind did say “she would issue findings later that would explain her ruling on each of the charges.” We don’t know how long “later” may be. All we know now is that Judge Lind does not think Manning was aiding the enemy.

    Which raises an interesting question: If you take classified documents, but you don’t do it to help some enemy, apparently you haven’t done any harm to the United States. So why is it a crime? Why does it count as “spying” at all? I always thought “spying” meant one side stealing secrets from the other side.

    Manning said he did it on behalf of a nation -- his own. He did it on behalf of all of us. I haven’t heard of any reason to doubt him. Yet he’s getting applause only at the left end of the political spectrum. Across the rest of the spectrum the responses range from uncertainty to outright condemnation. So the public verdict on Manning, like the judge’s verdict, is decidedly mixed: “Hero to some, traitor to others,” an AP story called him.

    You might think he’d get plenty of applause from the mass news media. After all, he provided them with headline material for weeks. But the mass media are hardly showing much appreciation. Some (like that AP headline) are studiously neutral. Others, including high-profile liberal outlets, have avoided the substance of the issue by making Manning’s personality the issue. “Bradley Manning had long been plagued by mental health issues,” NPR headlined. The New York Times called him a “loner” and “misfit,” “a teenager bullied for his conflicted sexuality.” That’s one easy way to convict him in the court of public opinion.

    Which raises another interesting question: Why is there so little public approval for a man who took immense risks simply to let us all know what our government is doing, with our money, in our name?

    To dig into both of the questions I’ve raised, let’s look at the origins of the Espionage Act under which Manning was convicted. It began with Woodrow Wilson.

    In his authoritative biography of Wilson, John Milton Cooper reminds us that U.S. entry into World War I aroused a lot of public opposition and protest. “The need to whip up popular fervor behind the war made dissent look dangerous.” The Espionage Act aimed mainly to quell that dissent.

    Perhaps Wilson, who initiated a massive PR campaign to swing public opinion to support the war, recognized that it also works the other way around: Making dissent look dangerous is one powerful way to whip up popular fervor. Anything that makes the public feel endangered helps generate support for the government’s efforts to “defend us against all enemies, foreign and domestic.”

    When war critics were convicted en masse under the Espionage Act, it served this dual purpose admirably, blunting dissent and fostering a sense of danger that built pro-war sentiment. No doubt the conviction of even one soldier today under the same Act has a somewhat similar effect, which helps to explain why there is so little public support for Bradley Manning.

    But the matter is more complicated, because Wilson proposed the Act a full 16 months before the U.S. entered World War I, when he was still promising to keep us out of the war. In his annual message to Congress for 1915 he explained, “I have had in my mind no thought of any immediate or particular danger arising out of our relations with other nations.”

    What he did have in mind -- what moved him to call for a new anti-spying law -- were foreign-born citizens of the U.S. who “have sought to pry into every confidential transaction of the Government” with the goal of moving the public to support one or another of Europe’s warring nations. In other words, they threatened to limit the Wilson administration’s freedom to shape U.S. policy, for war or peace, as it saw fit.

    Wilson was obviously pointing a finger mainly at pro-German sympathizers. But he couldn’t say so, because his official stance -- and the ostensible reason for proposing an Espionage Act -- was that the U.S. had to maintain its neutrality.  

    As usual, Wilson turned it into a moral issue: These foreign-born citizens who pry into the government’s confidential affairs “are not many, but they are infinitely malignant.” They “have sought to bring the authority and good name of our Government into contempt. ... I am urging you to do nothing less than save the honor and self-respect of the nation. Such creatures of passion, disloyalty, and anarchy must be crushed out.”

Maybe it’s just a coincidence that the prosecutor in the Manning trial called the defendant “an anarchist.” Or maybe he had studied Wilson’s speech. In any event, the word “anarchy” gets to the root of the Espionage Act and the issue of “spying,” as well as the public reaction to the Manning conviction.

    The premise of the Act is that the government can, indeed must, have “confidential transactions” -- secrets, to put it bluntly. If you are an honorable, self-respecting gentleman and a secret is entrusted to you, you keep it safe. You erect inviolable barriers to protect it from prying intruders who would steal it away.

    The same goes for an honorable, self-respecting nation: If the government allows its secrets to be stolen, the structures it has so painstakingly erected will be seriously endangered. Whenever structures are weakening, it’s a sign that passion is taking control. Anarchy looms on the horizon. Presidents don’t come out and say this any more, the way Wilson (and his predecessors) did without hesitation.

    But the idea remains as a key premise, not just in presidential speeches but throughout American political culture: Life is a struggle of order against unruly passion. The government, simply by its continuing existence, serves as a powerful symbol of enduring order for many (I suspect for most) Americans. Even those who are most critical of “big government” enthusiastically embrace the notion of “Constitutional government”; that is, a government working the way it’s supposed to, according to the prescribed structure. Passion and anarchy are what they fear most; rigid, inviolable control is what they want most.

    A government that can keep its secrets and quickly punish snooping secret-snatchers is obviously a government that knows how to set limits, maintain boundaries, keep control, and preserve order.

    The secret-snatchers, on the other hand, call into question the authority of the government, its ability to maintain boundaries and preserve order. Thus they tread on the government’s good name and cast it into disrepute. What good is a government that isn’t strong and honorable enough to keep its secrets, as any gentleman would? Such a weak government is always vulnerable to the forces of chaos and anarchy.

    That’s the real meaning of “spying.” It’s not necessarily about one nation or group taking secrets from another. It’s about any individual or group breaking through the barriers the government has set up to keep its secrets safe. The essence of “spying” is demonstrating that the government’s structures are not at all inviolable.

    That’s what makes espionage a crime. That’s why it has to be punished -- to prove that the government is still in control; that its structures have been repaired; that no one can violate them with impunity.   

    The Manning trial is a moral drama, played out in the bright media spotlight. It’s supposed to teach us all a simple lesson: Order always triumphs over anarchy in the end. That’s why, in the final scene, Bradley Manning has to be put behind the inviolable walls of a prison cell -- to prove that the government can keep its secret-stealers, like its secrets, safely locked up forever.

    But in a digital age that can never be the true moral of the story. As Manning, Edward Snowden, Julian Assange --- and the NSA -- have so clearly reminded us, in a digital age there are no secrets. No matter how hard the government tries, once it digitizes all its secrets there are no walls strong enough to safeguard them forever.

Does that mean we are plunging into anarchy or entering a new age of transparency and participatory democracy? That’s the real question driving the debates about Manning, Snowden, Assange, and all the future “spies” who are bound to pierce more holes in the government’s increasingly vulnerable wall of secrecy.

“Them Bad Russians” Still Haunt America

[Image: Nikolai Bulganin, Dwight Eisenhower, Edgar Faure, and Anthony Eden at the 1955 Geneva Summit.]

    “America it's them bad Russians. Them Russians them Russians. ... She wants to take our cars from out our garages. Her wants to grab Chicago. ...  Her make us all work sixteen hours a day. ...  America this is the impression I get from looking in the television set,” the poet Allen Ginsberg wrote. 

    But Ginsberg’s poem “America” was written in 1956, when Cold War fervor gripped the nation. We have come a long way since then -- haven’t we?

    Sure we have, Barack Obama assured us, when he lamented that “there have been times where they slip back into Cold War thinking and a Cold War mentality.  And what I consistently say to them [Russians], and what I say to President Putin, is that’s the past and we’ve got to think about the future, and there’s no reason why we shouldn’t be able to cooperate more effectively than we do.”

    Obama said this on the Jay Leno show. So he might have meant it as a joke. But he seemed to be perfectly serious.

    I try to avoid psychological categories as I observe mythic America. But occasionally the language of psychology is irresistible, at least as metaphor. The impression I get from looking in the television set -- and the Internet -- is that Obama was offering a classic example of projection: denying some characteristic of oneself by ascribing it to someone else. I use “projection” as a metaphor because I’m not talking about Obama as an individual. I’m talking about America, the whole nation, slipping back into a Cold War mentality.

    For weeks, the U.S. mass media have been gripped by the drama of Snowden. Would he leave Russia or stay? Would the Russians grant him asylum? Now “them bad Russians” have gone and done it. If Ginsberg were writing “America” today he would add, “Them Russians, she keep the traitor Edward Snowden.” And they’ve got to be punished. That’s certainly the story I get from the media. 

    It’s the story Russians get too, though apparently many of them think it’s funny. A joke making the rounds there portrays Obama as a jilted suitor: "Obama won't see Putin because Putin is already seeing Snowden."

    Some more serious Russian observers dismiss the Obama snub as no danger, and perhaps even a help, to Putin, who is happy to build up his nationalist political base. His effort is helped by stirring up memories of the Cold War, like analyst Sergei Markov’s claim that "Obama is under powerful pressure from the cold war lobby." America still has a “Cold War lobby”? Who knew? That sounds like a projection from the Russian side.

    So we’ve got mirror images, each side accusing the other of slipping back to the bad old days. No doubt there’s some truth on both sides.

    The ghost of the Cold War certainly still haunts Obama’s America. He was under powerful pressure to cancel the Putin summit or else pay the political price at home, just as Democrats from Henry Wallace to Jimmy Carter paid for being seen as “soft on communism.”

    For a short time there was some popular enthusiasm for “hitting the reset button” in U.S.-Russian relations. But that never erased the stronger enthusiasm for bashing “them Russians” whenever there was a chance.

    True, the Russians have given their American critics some tragically fat targets for criticism. Their continued anti-gay campaign, done with government encouragement, is to me the most blatant and frightening example. But after recent events in Egypt, who would believe that the U.S. government bases its foreign policy decisions on moral considerations?

    The Russian support for Syria’s president Bashar al-Assad is another dismaying example. But that’s obviously a matter of power politics. If the Obama administration saw any real advantage in supporting Assad, they’d do it too. Again, see Egypt (and a host of other countries).

    Yet the traditional American story does not allow us to see any kind of equivalence between our own “land of freedom” and “them bad Russians.” Americans insisted on the fundamental difference even in the pre-Soviet days, when an autocratic czar presided over an eastward expansion in many ways similar to, and as cruel as, America’s westward expansion.

    The same was true during the post-World War I “red scare” and the pre-World War II days of the early HUAC (the House Committee on Un-American Activities), when America showed itself to be the land of something less than full freedom. (Imagine if the U.S. government had digital technology back in those days.)  

    But it was surely the Cold War era that fixed in American political mythology the unbridgeable chasm between us and “them bad Russians,” the chasm reflected in the furor over Snowden and in Obama’s remarks on the Leno show.

However, the lingering effects of the Cold War narrative are only part of the picture. For the Obama administration, as the Christian Science Monitor headlined, “It's about much more than Edward Snowden.”

    As the Monitor noted, the White House statement on the summit cancelation revealed what’s probably the heart of the issue for the president. It “listed arms control, missile defense, trade relations, and human rights as among the issues that would have been discussed by the two leaders but which have not had enough progress to necessitate a summit.” “Progress,” of course, is a code word for significant concessions from the other side.

    In the White House no doubt they read the New York Times editorial that appeared just hours before the cancelation, which offered quite a similar conclusion: “There is no reason for Mr. Obama to attend unless Mr. Putin provides solid assurances that he is prepared to address contentious issues in a substantive and constructive way.” “Substantive and constructive” are more code words for Putin making significant concessions to meet Obama’s demands.

    Presidents don’t want to meet with less-than-chummy leaders of other countries unless they can count on such concessions. What they worry about most are the media stories that a summit will produce. Unless the headlines are sure to feature code words like progress, substantive, and constructive, the president would rather stay home. Why give the other side a chance to look like America’s equal in all those photo ops if there’s no guarantee of a payoff for our side? The cost-benefit analysis just doesn’t add up.

    Dwight Eisenhower was one president who believed that fervently. And for purposes of understanding the current canceled summit it is worth reviving at least the memory of the early Cold War era, the days of Ike and Ginsberg’s “America,” when “summitry” was a topic of constant interest.

    For over two years after Josef Stalin died, Kremlin leaders pushed hard, in public and private, for a summit with the U.S. president. Eisenhower resisted just as firmly. He saw no chance that “them Russians” would yield enough ground to repay the cost of giving them a stage to look like his equal.  

    By the time Ginsberg wrote “America,” though, Eisenhower had met the Soviet leaders, Nikita Khrushchev and Nikolai Bulganin, at Geneva. Why did he go to the summit? “World opinion could be allayed or at least satisfied a bit,” was how Ike explained it to Winston Churchill.

    Mostly he was concerned with public opinion in Western Europe. The specter of any move toward neutrality there, spurred by a perception that the Soviets were more peace-loving than the U.S., terrified official Washington. To forestall such a shift, and to bolster pro-U.S. leaders in Western Europe, the president went to the summit.

    To assure a PR victory, he grabbed the headlines by offering his “Open Skies” plan, giving each side the right to fly over the other’s land and see its nuclear facilities.

Khrushchev immediately rejected the plan, knowing that it would “have a spectacular appearance which will perhaps deprive the Soviet Union of their propaganda advantage in slogan ‘ban the bomb’” and also “allay [American] fear of surprise attack. … Military advisors agree that [the U.S.] would gain more information than would Soviets.”

    Those were not Khrushchev’s words. They were written by U.S. Secretary of State John Foster Dulles in a cable to the State Department. And they summed up Eisenhower’s own understanding of the big win he hoped to score with “Open Skies.” He wanted that win so badly that he went scurrying to find Khrushchev in his final moments in Geneva, hoping to get the Soviet leader to change his mind. But Khrushchev was already on the plane headed home. He never even considered discussing “Open Skies.”

    The story of the Geneva summit shows what presidents are most likely to think about when they schedule, or don’t schedule, or cancel, summit meetings.

It also shows how far the U.S. has come from the Cold War era of the ’50s in at least one respect. Obama can hardly be too worried about public reaction among America’s closest allies because of the canceled Moscow meeting. And he is under no international pressure to offer a spectacular new plan for U.S.-Russian cooperation. Our multi-polar world just doesn’t work that way any more.

    But the story of the canceled summit shows, too, how much the narrative of the Cold War era still shapes American public perceptions of “them bad Russians.” Even if Obama has freed himself from that narrative, its continuing grip on America means that he must now treat Putin as more of a foe than a friend. So there will be no summit until it seems sure that the American president can come away claiming some kind of victory.

Occupy Wall Street, Occupy Egypt: Compare and Contrast

[Image: Police preparing to attack pro-Morsi protesters in Cairo. Credit: Flickr.]

    The professor inside my head just handed out an essay assignment:

    Occupy Wall Street, Occupy Egypt: Compare and Contrast.

The big difference leaps to my mind first: The crackdowns on Occupy Egypt (the movement in support of ousted president Mohamed Morsi) have been far more violent than the ones that dispersed the Occupy Wall Street encampments across the U.S. in 2011. In both cases definite numbers are almost impossible to come by. But there seems to have been not a single death clearly due to police action against an OWS site, though surely hundreds suffered injuries. In Egypt, of course, hundreds have been killed by uniformed agents as well as civilian supporters of the military government. Injuries run into the thousands.

    On the other hand, since the Egyptian authorities seem more intent on doing violence than arresting people, their arrest totals at the encampments may not equal the 7,762 arrests of OWS protesters in the U.S. recorded so far.

The victims of violent suppression in the U.S. might be the first to say that their hearts go out to the victims in Egypt, who suffered on a far larger scale. But the Americans might also want to tell us what it feels like to be pepper-sprayed in the face or run over by a police scooter or slammed to the ground and smacked with a police baton.

I couldn’t find any tally of lasting injuries -- concussions, broken bones, dislocations -- among U.S. Occupiers. But anyone who has suffered such an injury for any reason will tell you it doesn’t go away quickly. And when you are in pain, it’s hard to be consoled very much by knowing that half-way around the world a lot more people are in pain, and some are dead. After all, pain is pain.

    If this were a philosophy class, I might add the argument that state violence is state violence. The difference in scale doesn’t change the central fact: In both cases the state was inflicting violence on protesters simply because they expressed political views the state didn’t like.

    Another important similarity is the generally peaceful demeanor of the victims of violence. “The protest camps at the heart of Egypt's political crisis feel more like a village fair than a bastion of resistance,” Reuters reported just hours before the vicious crackdown of August 14. The most violent protesters the reporter could find in the sweltering heat were the boys who “ran around with water dispensers strapped to their backs, spraying people and laughing.”

    Video and photos from Occupy encampments broken up around the U.S. told a very similar story.

    Yet media headlines back then told quite a different tale. ABC News was typical: “Occupy Deaths Make Oakland, Salt Lake City, Burlington Order Camps Closed.” Officials had no choice, the headlines implied. The camps were a clear and present danger to public safety -- though the deaths, in each case, were wholly unrelated to the protesters’ activities.

    In Oakland the city’s eviction notice to protesters spelled out more detail: "Your activities are injurious to health, obstruct the free use of property, ... and unlawfully obstruct the free passage or use of a public park or square." The Oakland Police Officers’ Association went further: "This Occupy Oakland has created an environment that is conducive to crime."

    The chief of police in Burlington, VT, was shocked -- shocked -- to discover that “there has been extensive consumption of alcohol and some use of drugs” in the encampment there, as if that made it different from any other neighborhood in Burlington. 

    What’s the word from Egypt? Al-Jazeera reports the official explanation for a month-long state of emergency: "The security and order of the nation face danger due to deliberate sabotage ... and the loss of life [inflicted] by extremist groups. ... The armed forces, in cooperation with the police, [will] take all necessary measures to maintain security and order and to protect public and private property and the lives of citizens."

    That’s rather more exaggerated than the words of American officials breaking up OWS, but the gist of the message is strikingly similar. And when an Egyptian government spokesman assures us that security forces exercise “self-control and high-level professionalism in dispersing the sit-ins," while the Muslim Brotherhood protesters alone are responsible for "escalation and violence," it sounds all-too-familiar to anyone who followed the news in the heyday of the Occupy Wall Street movement. 

    Yet there’s another fundamental difference. When OWS was broken up, the official explanations were widely quoted in U.S. news sources. And editorial comments were usually sympathetic: It’s a darn shame that force has to be used, but what choice did the authorities have? After all, you can’t let a bunch of rag-tag protesters disrupt the good order of the community and obstruct the freedom of decent, law-abiding citizens.

    But when it comes to Egypt, the U.S. media don’t seem at all interested in quoting the official words of justification (which is why one must turn to Al-Jazeera). Reporters seem to think the words irrelevant, since the immorality of the violence and the hypocrisy of the military leaders is so obvious.

    The same editorial pages that once sympathized with police action against OWS now offer unsparing criticism of the Egyptian attacks on protesters. “It is difficult to understand,” says the New York Times, “why the army ... would think that crushing the [Muslim] Brotherhood could benefit the country.” The editors at the Washington Post agree that what they called a “massive violation of human rights” cannot possibly help Egypt toward democracy.

    But the WaPo editors have a more urgent concern on their minds: “The Obama administration is complicit in the new and horrifyingly bloody crackdown.” Though the administration has warned the Egyptian leaders to cut it out, “the military’s disregard for these appeals was logical and predictable: Washington had already demonstrated that its warnings were not credible.” And indeed, as I write this, the Times website carries the headline: “U.S. Condemns Crackdown, but Doesn’t Alter Policy.” (The cancelling of one joint military exercise is a nice symbol, but it hardly counts as a change in policy.)

    Obama’s national security advisor, Susan Rice, gave the green light for the coup launched by the leaders who have now unleashed such violence. Would those leaders risk the $1.5 billion or more they get from Washington without getting some kind of OK for their crackdown from the hand that feeds them?  We may never know.

    Yet we do know that here at home many of the officials who broke up Occupy encampments in their cities got direct guidance and perhaps coordination from the Department of Homeland Security and the FBI.  That never seemed too newsworthy to the American mass media -- perhaps because breaking up the American protests was generally seen as fully justified, even necessary. It’s hard to report “government does the right thing” as front-page news.

    Barack Obama may decide that he underestimated the American public’s reaction to the horrific scenes from Egypt. He may tell the generals there to back off, and they’ll have little choice but to obey.

    Yet no mayor of an American city who sent the police into Occupy Wall Street camps felt compelled to apologize (though a few may have “regretted an occasional excess”).

    And therein lies one more crucial difference between the two movements. The U.S. public heard, and generally affirmed, two very different stories. One is about violence ordered by good, democratically elected, American governments. The other is about violence unleashed by a bad, unelected, Egyptian government.

    Since stories with good guys always have to have bad guys, and vice versa, the implication is unavoidable: The Occupy Wall Street protesters must be bad guys, and the Egyptian occupiers have surprisingly become good guys.

    How to explain the different perceptions of two sets of events that are, though so different in scale, so similar in their basic structure? (Remember, this is all about how events are seen through an American lens.)

    Here in the U.S. we have a long history of state violence inflicted on people protesting peacefully for more economic equity. From the North Carolina Regulators in the 1760s to the steel strikes of 1937, the pattern was rather predictable: Newspapers owned by wealthy publishers cast the protesters and strikers as villains, but public opinion was broadly divided. When the state called out its troops, their targets could count on plenty of sympathy. And a major public debate ensued.

    That didn’t happen during Occupy Wall Street. According to the experts at Gallup, only about a quarter of the public supported OWS. Most Americans didn’t care enough to have any opinion at all. The more state violence was unleashed, the more the public turned against the methods (though not the goals) of the protests. So state violence evoked little public debate -- certainly nothing like the furor set off by the police riot at the 1968 Democratic Convention in Chicago and the shootings at Kent State in 1970. But those were antiwar protests being suppressed, not protests over economic issues. 

    Perhaps it’s an effect of the long period from roughly 1940 to 2007 when most white Americans generally assumed that they’d have an endlessly increasing prospect of economic growth and freedom to enjoy the good life. They didn’t want that prospect clouded by crowds in the streets disturbing their comfortable way of life. So they were happy enough to see the authorities break up those crowds, with little concern about the methods used.

    Perhaps antipathy, or at best apathy, toward visible protest became such an entrenched habit that it endures, despite the dramatic change in the economic prospects of most white Americans. 

    Whatever the reason, it would have been absurd to think the suppression of Occupy Wall Street would spark a civil war in the U.S. -- just as it’s absurd to dismiss the possibility of civil war in Egypt.

    But public opinion is a slippery, unpredictable creature. Who would have thought that Americans’ hearts would go out to a movement led by the Muslim Brotherhood? There is, to be sure, a long American tradition of sympathizing with people whose democratically elected government is overthrown by a military coup. But it’s a selective sympathy. To take just one example that remains painfully relevant: There was little lament or debate in the U.S. when the Iranian military, under CIA tutelage, overthrew the democratically elected government of Mohammad Mossadegh in 1953.

    No doubt there are voices whispering in Obama’s ear: Give the Egyptian conflict the right spin and the American public will soon enough turn today’s good guys into tomorrow’s bad guys, returning to its habit of choosing order over the specter of chaos. It shouldn’t be too hard when that specter wears a Muslim veil. Those voices may prove to be right.

    On the other hand, the Occupy Wall Street movement may revive as unexpectedly as it appeared the first time, with even more strength. Given the unending Great Recession, the media and the public may respond quite differently next time.

    The only thing history teaches us for sure is that we should not try to predict the future. 

The Question Americans Can't Ask About Egypt and Syria

What does the Obama administration really want in Egypt and Syria? "To reduce the risk to U.S. interests," writes Washington Post blogger Max Fisher. The administration wants "to play the middle and to avoid any strong positions" because it values "above all, an aversion to risk." So "the White House tried [in Egypt], as in Syria, to manage it from behind the scenes."

    The result has been bad for the U.S. and horrendous for the Egyptians and Syrians: "Bloody stalemate has become the status quo." But that status quo is "unsustainable."

    It's hard to argue with Fisher's observations. It's equally hard not to argue with his conclusion, a call for the U.S. to take charge: "Better to force a solution, however uncertain, than wait for things to combust on their own. Staving off catastrophe only works for so long."

    Really? I wish we could assemble all the Cold War presidents, from Truman to Reagan, read them that last line, and listen to them chuckle. Staving off catastrophe was the heart and soul of their foreign policy. They called it "containment." (I call it the mythology of homeland insecurity.)

    Then we could assemble an all-star roster of historians and political analysts who would assure us that it worked pretty damn well for four decades. That's a debatable conclusion; it depends on what you mean by "worked well." But it's the prevailing view in the foreign policy establishment.

    Anyone who thinks that the U.S. must choose between a) forcing solutions in other countries, and b) avoiding strong positions, doesn't know how U.S. foreign policy has been working for the last seven decades. Cold War presidents all used both of those options as means to pursue their goal of containment -- staving off catastrophe -- in hopes of reducing the risks to U.S. interests everywhere around the world.

    Sometimes they thought they had to force a solution and overthrow the status quo in some nation or other to serve U.S. interests. And they were always ready to do it. (The most important example for our current world is the overthrow of the government in Iran in 1953 -- which Iranians have not forgotten, even if most of us have.) During the Cold War era, such jolts to the status quo almost always came where communism supposedly threatened.

    Throughout the non-communist "free world," the goal was to manage events from behind the scenes in order to maintain stability everywhere. "Stability" was the code word for sufficient U.S. control to contain the spread of communism and thus stave off the catastrophe.

    Since 1989, without any overall global struggle to fight, U.S. administrations have still pursued the traditional goal of stability to reduce the risk to U.S. interests. In effect, they have acted as if we had turned the whole world into the "free world." So they've continued to follow the golden rule of the cold war containment policy: avoid risk; stave off catastrophe.

    Occasionally, in obedience to that rule, they still may upset the status quo, hoping short-term chaos will breed long-term stability. Apparently something like that happened in Egypt. If the U.S. didn't orchestrate the coup, it certainly gave a green light to the generals who did.

    But the theory is always that the U.S. is strong enough to control the outcome by managing every situation, even the most catastrophic, either by overt force or from behind the scenes. (I call it "apocalypse management.") If you give up that belief, you in effect give up U.S. foreign policy as we've known it for nearly three-quarters of a century. Then you're really left with chaos, not in some country far away, but in Washington.

    The Obama administration, and much of the foreign policy establishment, has to believe that the current bloody status quo in Egypt and Syria is only temporary, that with enough skill and persistence the U.S. government can still produce an outcome that serves U.S. interests.

    If that means playing the middle and avoiding any strong positions right now, so be it. You've got to have lots of arrows in your quiver, they'll say. Sometimes the best way to force a solution is to act decisively (as in OK'ing a coup). Sometimes the best way is to play a patient long game and make it look like you are not forcing any solution at all.

    So the most basic question about U.S. policy in Egypt and Syria is not: Should we force a solution or discreetly play the middle? Those are just different means toward the same old end of reducing risk and staving off catastrophe.

    The most basic question is: Does a foreign policy built on reducing risks to "national interests" (as defined by the establishment), using all the old familiar tactics, really serve the best interests of the American people?

    Unfortunately, neither the administration nor the foreign policy establishment and its pundits (like Max Fisher) can ask that question. Their language, generated by and trapped in the dominant mythology, simply doesn't allow it. They can't think outside of the box they have created.

    If they could ask the question, the past -- in places like Korea, Vietnam, and Iraq -- and the present -- in places like Afghanistan, Syria, and Egypt -- would send back the same answer: Every conflict is more likely than not to drain the U.S. of resources, influence, prestige, and, all too often, American blood -- all because of a self-defeating obsession with forcing solutions and staving off catastrophe.

    If bloody stalemate has become the status quo in Syria and Egypt, as Fisher says (and he could well add Afghanistan to the list), it's hardly because the U.S. has restrained itself from interfering.

    On the contrary, U.S. pursuit of perceived self-interest, by a variety of means, has played a significant role in creating the stalemate and the bloodshed. That's more obvious in Egypt (where both sides in the brewing civil war openly cast blame on the U.S.) and in Afghanistan than it is in Syria.

    But U.S. foreign policy still follows the dictum laid down by Cordell Hull in 1937, when the mythology of homeland insecurity was just being born: "There can be no serious hostilities anywhere in the world which will not one way or another affect interests or rights or obligations of this country."

    In other words, the U.S. has to have at least its foot, and usually more parts of the national corpus, in every conflict, everywhere. And all of us, in America and around the world, pay the price.  

    Is there any alternative? Proponents of nonviolence in America, from the early Quakers to the abolitionist William Lloyd Garrison to Dr. Martin Luther King, Jr., all thought there was.

A lesser-known but equally great leader of the nonviolence tradition, A. J. Muste, once wrote: "When a crisis develops, people turn upon the pacifist, figuratively hold a gun to his head, and demand: 'Now how would you pacifists stop this thing? -- in five minutes and painlessly.'"

    It is usually the violent who want simple, instant solutions, he pointed out. The nonviolent recognize that there are never any simple solutions to foreign policy problems, most of which were created by policies of force and control. 

    What would happen if the world's strongest nation exerted all of its diplomatic, moral, and economic force based on the principles of nonviolence? There is no way to know. Again, politicians, policymakers, the foreign policy establishment, and its media scribes simply cannot ask that question. The very words are incomprehensible within their mythic framework. So the American public at large never learns how to ask it either.

    That's the bloody stalemate which has become America's status quo.

Americans Still Favor Global Intervention -- FDR-Style

If you want to understand the fracas over Syrian chemical weapons as a chapter in American history, I'd suggest starting with these words: "For nearly seven decades the United States has been the anchor of global security. ... Out of the ashes of world war, we built an international order and enforced the rules that gave it meaning. ... Even a limited strike will send a message to Assad that no other nation can deliver. ... Now is the time to show the world that America keeps our commitments. ... We do what we say."

    That's a pastiche from Barack Obama's two major speeches on Syria (August 31 and September 10). The message was coded, but easy enough to decipher: America still has more of a commitment than any other nation to enforce the rules of the international order, because we built that order. The rules we made are the "red line" that Syria crossed. Now Syria must be punished, and only the U.S. can do the punishing -- indeed must do the punishing, if only to prove that the U.S. is still a credible enforcer.

    I don't suggest that this is a key to deciphering Obama's true motives. No one can read his mind to ferret out his motives. Perhaps even Obama himself isn't really sure of his true motives.

    But if you want to put Obama's initial eagerness to attack Syria, and his later step back from the brink, in the context of American history, these words from his speeches are the key -- especially his reference to "the ashes of world war," obviously meaning World War II.

    When the Germans conquered most of continental Europe in 1940 and raised fears of an imminent attack on Britain, Franklin D. Roosevelt launched a skillful campaign to persuade Americans to pay for a huge increase in military weaponry and send as much as needed to the British to defend themselves. But, FDR promised, U.S. troops would never go to Europe or anywhere else in the world. The U.S. could help win this war without risking a drop of American blood; in today's jargon, he promised "no boots on the ground." Of course the bombing of Pearl Harbor ended that vision of bloodless victory.

    But by 1942 FDR was telling people (quite privately) about his utopian postwar vision for lasting peace on earth. The story was quite simple: Nations go to war only if they have the weapons to do it. Take away their weapons, and -- presto! -- no more war. Eternal peace would let the United States go on freely trading with, and profiting from, every nation on earth forever.

    Of course someone had to be strong enough to take away all those weapons and make sure no one else could obtain new ones. So four nations would be exempt from the command to disarm. The U.S., Britain, Russia, and China would be the world's "four policemen," as FDR called them, each enforcing the rules in their own part of the world.

    However "by 1944 Roosevelt's musings about the four world policemen had faded into the background," as Martin Sherwin wrote in A World Destroyed, his classic history of how the atomic bomb reshaped world diplomacy in the 1940s. FDR was getting encouraging reports from the Manhattan Project and growing optimistic that the United States would have soon have an atomic bomb.

The fateful decision he made was to share the bomb and knowledge of how to make it only with Britain and not with any other nation -- including, most importantly, Russia. FDR was misled into thinking that the U.S. and Britain could keep an atomic monopoly for two decades or more. So, he assumed, there would actually be only two policemen.

    Even though Roosevelt hoped for postwar cooperation with the Russians, "the underlying idea" of his original plan, as Sherwin wrote, "the concept of guaranteeing world peace by the amassing of overwhelming military power, remained a prominent feature of his postwar plans." And not a drop of American blood would ever have to be shed.

    After Roosevelt's death, the Truman administration made that concept the most prominent feature of America's postwar plan -- the international order that, as Obama said, the U.S. built out of the ashes of world war.

    Harry Truman came increasingly under the sway of cold war hawks, who turned back all efforts to cooperate with the Russians on anything related to the bomb. Our national commitment -- the responsibility we awarded ourselves in 1945 -- was to enforce the new world order and keep the peace by ourselves, brandishing the bomb (or as Truman called it "the hammer").

    Of course the Soviet Union was not nearly as intimidated by that "hammer" as Truman and his cold warriors hoped. It took over 40 years to persuade the Soviets to stop competing for the title of global enforcer.

    But for more than twenty years now the United States has held undisputed claim to the title of the world's sole indispensable superpower. And, as in the cold war era, Americans still generally justify that claim with a simple moral tale of good (that's us) against evil (that's whomever we happen to be opposing at the moment).

    Chemical Weapons and WMDs

    Back in the 1940s, with all attention focused on the atomic bomb, chemical and biological weapons didn't get much public attention. Only in recent years were they lumped together with nuclear weapons under the umbrella term "WMDs." But the principle that prevailed in U.S. foreign policymaking circles remained the same: We alone would enforce the rules and red lines of the international game, because we alone would have the power to do it.

    The U.S. has signed the Chemical Weapons Convention, the international agreement calling on signatories to destroy their stocks of chemical weapons, and U.S. chemical weapons stockpiles are being destroyed (though much more slowly than the Convention called for; the U.S. claims it will finish the job only 11 years behind schedule).

    At the same time, though, U.S. leaders have emphasized a kind of equivalence between chemical and nuclear weapons. They have often said that their response to any chemical attack would be "devastating," with no weapons ruled out -- which implies, despite the calculated ambiguity, a clear threat of nuclear retaliation.

    There's another obvious link between nuclear and chemical weapons. In American political culture, both are framed within the same dualistic narrative: Some WMDs are acceptable, some are unacceptable, and you damn well better know the difference, or else the enforcer will soon be at your doorstep.

    How to know which is which? The enforcer will decide and then let you know, explaining it all with the familiar tale of good versus evil. And if your WMDs are unacceptable, the enforcer will fulfill his commitments and do what he has been saying he'd do for the past 70 years: punish you. After all, his credibility is on the (red) line.

    That narrative goes back to the fateful decision Franklin Roosevelt made: a British atomic bomb would be acceptable. A Soviet atomic bomb would be unacceptable.

    Today we see the same dualism played out around the world. Iran must be prevented, at all costs, from obtaining even one nuclear weapon. Meanwhile, its neighbors to the east (Pakistan and India) and not far to the west (Israel) are perfectly entitled to keep expanding their nuclear arsenals. The U.S. is perfectly entitled to keep who knows how many nuclear weapons stationed in South Korea. But North Korea is a dangerous, even monstrous, threat because it has a tiny handful.

    The same dualistic principle applies to chemical weapons. When Syrians opposed to Bashar al-Assad stockpiled and very possibly used them, Washington uttered not a peep. Those WMDs are, apparently, acceptable. But if Assad did indeed use similar weapons himself, they are totally unacceptable and he must be punished.

Similarly, we are urged to be outraged that Assad refuses to sign the Chemical Weapons Convention; his very possession of the weapons is unacceptable. But we hear not a word about his most powerful regional rivals, Israel and Egypt, refusing to sign the same Convention. Their chemical weapons are, apparently, acceptable -- even though Egypt once actually used them in Yemen.

    Sometimes the very same chemical weapons can be transformed from one category to the other. It all depends on the context. Most famously, Saddam Hussein's use of chemical weapons in the 1980s was acceptable to the United States. By 2003, that same event had become utterly unacceptable, offered as prime evidence that he must have WMDs, including the nuclear kind, and therefore must be destroyed.

    The moral of the story is clear: The global policeman's responsibility is to decide which WMDs are unacceptable and then to punish those nations that use or even obtain them, because those nations are bad guys by definition (a definition marked "made in USA"). Every president since FDR has embraced that as a foundational mythic narrative of U.S. foreign policy.

The Latest Chapter: Syria

    Obama, having placed Assad's chemical weapons in the category of unacceptable, called the world to act out the venerable myth once again. Of course he didn't get quite the answer he expected.

He first ran into trouble because Roosevelt's idea of the "four policemen" never really died. It was enshrined in the UN Security Council, in the form of the permanent members. (France was added as a fifth world cop, as a last-minute afterthought, when the UN was created in 1945.)

    To prove their status as the world's most powerful nations, all five were given veto power -- which is why Obama couldn't use the Security Council, the route he and everyone else would have preferred, to legitimize an attack on Syria. Global cop Russia was determined to veto it.

So Obama was forced to turn to the U.S. Congress for a stamp of approval. Congress is packed with Republicans who were eager to stick it to Obama any way they could. And on this issue they had it easy, because they could say in all honesty that most of the voters back home were against an attack (even though most believe Assad did use chemical weapons against civilians). Plenty of Democrats got the same message from the voters, forcing them to choose between embarrassing their president and defying the will of the people, most of whom seemed uninterested in being global enforcers.

    The public's reluctance to use military force in Syria has sent the pundits scurrying back to the FDR era -- this time to the years just before the U.S. entered World War II, where they found what seemed to be the obvious analogy: Once again, it seems, it's interventionists versus isolationists.

    But the analogy is actually quite mistaken. All the evidence suggests that the U.S. public today is not at all loath to be involved in war around the world. It just depends on how the war is fought.

If it's fought with drones, even drones that kill American citizens, there is some public controversy, but not a whole lot. If it's fought with cyberspying, even cyberspying that scoops up millions of Americans' phone calls and emails, there's more controversy. But it's nothing nearly on the scale of the row over attacking Syria.

And then there are the issues that arouse no public discussion at all: the massive U.S. arsenals of WMDs and the smaller arsenals held by U.S. allies (of course, they are all "acceptable" WMDs, so what is there to discuss?); the fact that U.S. military capacity, as a whole, is vastly larger than any other nation's; and the vast array of U.S. military bases around the world, ready to wield that capacity anywhere at a moment's notice.

    In other words, the American public still accepts the vision that FDR used against the "isolationists," the same vision he later offered for world peace: the U.S. enforcing global order without risking American blood.

    What the American public won't accept, right now, is the risk of shedding even a drop of American blood. After twelve gruesome and apparently (to most Americans) pointless years of war in Iraq and Afghanistan, most Americans simply wouldn't believe the president's promise that no Americans would ever be sent to kill and die in Syria.

    So the public rejection of Obama's call for intervention had nothing to do with pre-World War II "isolationism." It was, rather, a hearty endorsement of FDR's ideal method of interventionism (even making room for his idea of cooperating with the Russians). In effect, it was a message to Obama that the nation has too often strayed from FDR's plan; now it's time to get back on the track he first plotted out over seven decades ago.

    Does that mean the global policeman must hang up his badge? It depends on the weapons the cop uses. As long as the policeman can enforce the law with no risk to himself, he can keep that badge pinned on proudly. But if he has to show up the old-fashioned way, in person with gun in hand, and shoot down the bad guys, then it seems the American version will have to hang up his badge -- at least for a while.

    For how long? No one can say. As so many fictional policemen and sheriffs -- and Ronald Reagan and two presidents Bush -- have shown us, a seemingly retired cop can always be tempted to pick up his badge and gun and go out to the dusty streets again, if evil threatens in an ugly enough form. It's a story American audiences never seem to tire of. So I wouldn't write an obituary for the old-fashioned global cop just yet.

Obama's Syria Shift Sends Foreign Policy Elite Back to School

In my last post, I suggested that the American public, far from being "isolationist," still generally accepts the vision that Franklin D. Roosevelt used to defeat the "isolationists," the same vision he later offered for world peace: the U.S. enforcing global order without risking American blood. Now Barack Obama has stepped back from the brink of "boots on the ground" in Syria and returned to FDR's idealized approach to policing the world.

    Obama also took another page from FDR's book: Compromise, be flexible, and cooperate with the Russians when that's the best way to avoid risking American lives. In this case, Obama is cooperating with the Russians to deactivate the Syrian government's chemical weapons stockpile. As we now know, the president began exploring that idea with Vladimir Putin back in June. They both liked the idea enough that their technical people have already spent months figuring out how to do it. And the American people like the idea enough that an astounding 82% favor the U.S. - Russian plan.

    The foreign policy elite, still living in the world created by Harry Truman's cold warriors, are horrified by the very idea of Obama trading toughness for conciliation. In their world, flexibility is weakness, and only the strong survive.

Which just shows what a gap there is between the elite and the people, and why Obama seems to be surviving the Syrian crisis, politically, quite well, thank you.

    As Washington Post political analyst Greg Sargent found when he looked at the polling data, the "Beltway establishment criticism, which has focused largely on process and theatrics, is deeply misguided and disconnected from how Americans view the situation. ... There is just no evidence Americans see this through the prism favored by establishment pundits -- that adapting to shifting circumstances is not 'resolute' or 'decisive,' and is therefore inherently a bad thing that has 'weakened' the presidency and the country."

In a WaPo poll, 60 percent said Obama “sticks with his principles,” roughly unchanged since January 2012. Forty-six percent said his handling of the Syria issue “has not made much difference to U.S. leadership,” while only 32 percent said it weakened the country. "What we really need," Sargent concluded, "is a reevaluation of all the unstated assumptions that shape establishment discourse about these matters."

    How true, how true. And how surprising to see it said by an in-house analyst for the nation's most ultra-establishment news organization.

    Fortunately some of those establishment assumptions are no longer unstated. The debate about whether to attack Syria brought them out of the woodwork and into the pundits' columns, as they urged the president to get tough. So we can start the reevaluation by looking back on the punditry of recent weeks.

    Let's consider a couple of examples.

    "Principle backed by credible force made the United States the anchor of global security since 1945," New York Times columnist Roger Cohen asserted. Credibility depends on sending a clear message and then acting upon it. But there has been no "message discipline on Syria from the careening Obama administration," Cohen lamented. Obama's "wavering has looked like acquiescence to a global power shift. ... America signaled an inward turn that leaves the world anchorless." It's all about signals and messaging (or, as Sargent put it, theatrics).

If an American president does not send a consistent message of strength and resolve, we'll end up with "an anchorless world," a "post-American world -- and that means chaos.” Why? Cohen quoted W.H. Auden: “The ogre does what ogres can.” Once you give ogres the message that they can run wild, chaos reigns.

    Vali Nasr, dean of Johns Hopkins University’s Paul H. Nitze School of Advanced International Studies (a famous training ground for the foreign policy elite), was equally worried about ogres. In an op-ed, he wrote: "The Obama administration has no choice but to enforce the 'red line' the president laid out a year ago. To maintain American credibility -- and his own -- President Obama has to do so quickly and decisively."

Why? For Nasr, too, it's about the theatrics of signals and messages, because they shape impressions and perceptions. Enforcing the red line "would impress American allies and adversaries alike." By "dithering," the president is "reinforcing the perception that the United States is no longer keen on leading the world."

    That perception "will embolden America’s adversaries and deject its friends. America could soon find itself alone in standing up to Iran or North Korea, or in pushing back against China and Russia. ... Shirking from our global responsibilities will only create bigger problems that will eventually raise both the cost and the likelihood of American intervention." Thus saith the establishment.

    Now let's look at a few of the unstated assumptions, the premises you have to accept if these words by eminent foreign policy analysts are going to make any sense:

    The geopolitical world that matters is divided into two and only two parts: America's adversaries and friends, the bad (ogre) guys and the good guys. The bad guys are always pushing against the good guys, trying to do as much evil as they can. If they aren't kept in check, the world situation keeps shifting and changing. Life gets unpredictable; it's not fully under anyone's control. And whenever no one is in control, the only alternative is chaos. Who can feel secure facing a growing tide of chaos? 

    The U.S. has a unique responsibility to keep the world securely under control because the U.S. is the only nation with enough power to do it. So America has to prevent the bad guys from pushing too hard and fomenting too much change. Fortunately that usually doesn't require military force. It merely requires a message that is as firm and unchanging as we want the world to be: If you keep pushing, you'll pay such a high price that the pushing simply won't be worth it.

    But if America's spokesman, the president, doesn't stand firm --  if he's constantly moving and shifting -- the bad guys have no reason to believe that the world is under strict control. They won't expect to pay any great price for making the world move and shake even more. So they'll keep on pushing.

Eventually the U.S. will have to stop them, to prevent global chaos. But when "eventually" comes around, the only way to stop them may be with violence. And that would be a shame, since it could have been done with a firm, unwavering message -- backed up by the mere threat of force.

    It's a simple story that any average grade-school child could understand. In its childlike simplicity it has all the charm and appeal of a good myth. Indeed, it's part and parcel of America's prevailing myth, the myth of homeland insecurity.

    But it's taught as perfectly logical fact and common sense in the nation's most elite grad schools, institutes, and think tanks, where the foreign policy establishment is trained -- and where reevaluation of unstated assumptions is strictly taboo.

    So the establishment can't see the fundamental contradiction in its favorite narrative.

    On the one hand, the story assumes that the bad guys are reasonable. They'll understand a clear, consistent message from Washington, do a perfectly rational cost-benefit analysis, and figure out that the cost of whatever action the U.S. has proscribed (for example, using chemical weapons) isn't worth the benefit.

    On the other hand, it assumes that the bad guys are ogres. And ogres, as every child knows, do evil simply for the sake of doing evil, whether it makes any sense or not. In psychological terms, their reasoning faculties are overwhelmed by their impulses. Or, to put it theologically, their will is determined not by reason but by original sin.

    Why put it theologically? Because the establishment narrative was christened back in the 1930s by the highly influential theologian Reinhold Niebuhr. George Kennan, who did so much to turn this story into policy in the early cold war years, called Niebuhr "the father of us all."

    What Niebuhr taught "us all" (to put it a bit too simply) is that people are sinners because they want to assert and aggrandize themselves without limit. When sin takes over, reason and common sense go out the door. So does good order. Order depends on limits; evildoers, driven by sin, know no limits.

    Logic creates orderly structures out of the chaos of the world, Niebuhr taught. But when sin lets impulse loose it destroys all orderly structures and limits in its rush to gratify its whims and wishes. As Yeats said, "Things fall apart; the centre cannot hold." Then we get Roger Cohen's "anchorless world."

    But here's the rub: Since the ogre-ish bad guys are driven by sin, they'll always let impulse trump reason. Why, then, should we think that they will or can do any cost-benefit analysis, no matter how unwavering the message from Washington? Why should we think that any message or signal has the power to hold back the chaos?

    That's just one of the lapses in logic we'll have to confront if we follow Greg Sargent's advice, as we should, and start reevaluating all the unstated assumptions that shape the establishment narrative.

    There is still plenty to criticize in Barack Obama's foreign policy. But if his recent shift on Syria helps to send the foreign policy elite and their pundits back to school, this time to learn with some really rigorous logic and common sense, it will be one big point in the president's favor.

     

    "Yes, We Have No Narratives": A Great American Myth "Dueling Narratives Over Iran - U.S. Relations." That New York Times headline jumped out at me and my heart skipped a beat. Was David Sanger, one of the nation's most respected foreign affairs journalists (who co-authored the story with Michael Schwirtz), really telling us that U.S. policy toward Iran depends on a contest between competing narratives?

    This was a news story, not an opinion piece. An NYT reporter recently told me that the news staff at the Times is under strict orders to keep a clear distance between themselves and the opinion page staff. So it seemed Sanger was presenting the role of narrative as objective fact.

    Of course narratives themselves are never merely objective accountings of fact. They always depend on interpretation. But it's a fact that interpretive narratives -- what some call myths -- wield tremendous influence in policymaking.

    It's also a fact that the mass news media rarely tell us how crucial narratives are. Indeed they rarely treat narratives as news at all. Even the best journalists generally treat policymaking as a simple matter of gathering information, analyzing it, and drawing rational (or irrational) conclusions. They'll tell us that different policymakers have different analyses and come to different conclusions. But they generally ignore the way preconceived narratives shape the whole process.

    That's why I was so surprised and pleased to read this headline. For once, it promised, a major news source was going to put the spotlight on narrative in the public arena. I was about to get some facts about the different narratives that are competing for dominance in the United States and in Iran. Or so I thought.

    Unfortunately, as I realized a second later, my surprise and pleasure were somewhat premature. Yes, I did get a story about the importance of political narratives, and that itself was rare good news. But my eye-to-brain connection had played a trick on me. I'm so accustomed to thinking about the role of narrative in American life that I see it even when it isn't there.

    What the headline actually said was: "Dueling Narratives in Iran Over U.S. Relations."  The story was only about what's going on in Iran: Top officials there are dueling with each other about the stance their nation should take toward the U.S., and two competing factions each have their own narrative. There was not a word about dueling narratives here in the U.S.

    Of course every news "story" is just that -- a narrative tale, an interpretation of the facts, though it presents itself as a mere recitation of facts. And every story has a moral.

    Perhaps the Times editors thought the moral of this story was that there are actually competing opinions in the highest circles of Iranian politics. And they probably thought readers would find that surprising. The prevailing narrative in the U.S. mass media is that Iran is a bad-guy nation, an evildoer. And in America's mythic world evil nations are always, by definition, totalitarian nations where dissent just isn't permitted, much less dueling narratives. In that respect, this article did challenge the common view of Iran.

    To what end, though? Dig a bit deeper, and a more telling moral appears. Iran's new president, Hassan Rouhani, made headlines by coming to the UN and seeming to offer an olive branch to the U.S. and its allies. On the nuclear issue -- the only issue that seems to count, as far as the U.S. mass media are concerned -- Rouhani "is widely believed to have the backing of the country’s supreme leader [Ayatollah Ali Khamenei] to at least give negotiations a try," as this story notes.

    But don't be fooled by all the good news, David Sanger and his colleague were warning. There are very powerful people in Iran who are trying their best to stop Rouhani and go back to the bad old days of pushing the U.S.-Iranian confrontation to the brink.

    The Times story offered one and only one example: Iran's deputy foreign minister, Abbas Araghchi, "sought to assure conservative factions that Iran remained skeptical of Washington and would not rush headlong into a deal." “We never trust America 100 percent,” he said. "And, in the future, we will remain on the same path.” Araghchi and other "hard-liners among the Iranian leadership are watching warily and could try to derail an agreement."

    And the anti-American trouble runs deeper than the leadership, the story tells us. When Rouhani returned to Teheran, a crowd of hard-liners surrounded his car, shouting, "Our people are awake and hate America!" The accompanying photo shows the melee and the bodyguards Rouhani needed (as the cutline explains).

    Along with such ominous (though no doubt factual) reports, the story offered an equally factual list of issues that remain contentious between the U.S. and Iran. In the context, those served as more warnings that we should not yet break out the champagne and celebrate a new day in relations between the two nations. There are still plenty of reasons why the budding romance might fall apart before it ever gets a chance to blossom: "The dueling narratives [in Iran] underscored the complexity of any rapprochement between the two countries."

    This one rather short story underscores the complexity of studying narratives in public affairs. It's an American narrative about competing narratives in Iran that reinforces the prevailing narrative about Iran in America: Iran is a dangerous place, a place full of unpredictable tensions where nothing is necessarily what it seems to be and anything can go wrong at any time.

    Yet the story is really not complex enough, not by half. The half that's missing is the topic I naively hoped I was going to read about: The dueling narratives about Iran here in the United States. Whether you read it as a story about an ongoing battle or a budding romance (or both), it seems that only one partner is really doing anything. The other (that's America) is apparently a passive victim of circumstance, just waiting to see what happens. This sense of passivity is another staple of America's narrative life.

    So is the myth that American policymakers don't have narratives shaping their perceptions and decisions. Only the bad guys, it seems, have such narratives. I suppose that's because narratives strike us as subjective products of bias and emotion, the kind of irrationality that can lead people to do evil. So it's a comforting part of the American narrative that our leaders make their decisions based on facts and reason, not narratives. It's a story we Americans have been telling ourselves ever since the days of the 18th-century Enlightenment, when we declared our independence and justified it by invoking the self-evident truths of reason.

    Obviously there are, in fact, dueling narratives in America about Iran. There are more than just two (as, no doubt, there are also more than just two in Iran). Sometimes we do get news stories about the spectrum of competing opinions on the Iran issue among U.S. policymakers, though not as often as we used to. What I've never seen is a story about how those opinions flow from and create competing narratives.

    And I'm not sure whether it's more funny or sad that this story about Iranian narratives shows how easy it would be to write a companion piece on America. Virtually every sentence describing or quoting Iranian hard-liners and moderates could be used, almost verbatim, to do the same for American hard-liners and moderates.

    To note just one example: When Iranian Foreign Minister Zarif was asked why crowds are chanting "hate America!" he said, "The Iranian people hated American policies, not the American people." For decades, it has been de rigueur for American leaders to insist that we never hate the people of any nation, even one we may be attacking and bombing. Surely, if we do some day attack Iran, we will be told that we hate Iran's policies (and probably its leaders) but definitely not its people.

    The Times story does include one brief allusion to the U.S. as an active participant in the relationship. And it points up, indirectly, the parallel between the two nations:

    "Susan Rice, the national security adviser, said sanctions would remain until the United States and its allies were convinced Iran was not pursuing nuclear weapons. But like Mr. Zarif, she did not discuss how a lifting of sanctions could be conducted or how much of its nuclear infrastructure Iran would have to dismantle."

    Yet there was no suggestion that Rice, or anyone in America, has a narrative about Iran.

    As long as the myth of America as a land without political myths holds sway, we won't get the news we really need -- a factual story about two duels of narratives, in two very different countries, which mirror each other in many ways and interact to create a whole new set of constantly shifting narratives.

    Until that story is widely known, we won't have a really well-informed national conversation about U.S. policy toward Iran. But I guess we'll have to wait a long while for a journalist as prominent as David Sanger and a source as eminent as the New York Times to bring us that narrative.

      

    The Myth That Makes the GOP Suicidal

    "These voters think they are losing the country," and they are very scared. That's how prominent pollster Stan Greenberg summed up his recent intensive study of Republicans. Nothing new there. Many others have said just the same thing.

    What is new is that the GOP is no longer interested in fighting to regain control of the country through the political system. They'd rather bring the whole system down, even if their party has to go down with it.

    Tea Party stalwart John Culberson, the Congressman from Texas, made that clear right after the GOP House caucus voted to shut down the government unless the Obamacare program was put on hold. Culberson recalls that he "said, like 9/11, 'let's roll!'"

    So supporters of Obamacare are like hijackers trying to crash a plane into the heart of America? And right-wing Republicans are like the passengers of Flight 93, who chose to crash and die heroically rather than let the hijackers have their way? Let no one say that the American political scene is devoid of symbolism. Of course we already knew the Tea Party had a real knack for symbolism when they chose their name.

    But Culberson occasionally lapses into prosaic and surprisingly honest language, revealing what's at the heart of the GOP's suicidal impulse. Just two days later he told an interviewer: "We need to get the federal government the heck out of healthcare. ... It’s a violation of our most sacred right as Americans to be left alone."

    When the interviewer asked how that squared with Culberson's support for Medicare, the legislator replied: "That’s not even relevant to the conversation."

    OK, so we don't get logical reasoning here. But we do get powerful symbolism and a crystal clear explanation of the symbolism: Since right-wingers can no longer control the country, they want to be left alone, isolated from and thus unaffected by the country. They believe it's their right.

    Obamacare is the most recent symbolic message that the rest of the country won't leave them alone. So the right-wingers have decided that their only option is to commit political suicide and bring down the whole political (and perhaps financial) world with them. It's a crusade to defend every American's "most sacred right."

    Which suggests that the Tea Party chose the wrong symbol as its emblem. The original Tea Partiers, who threw all that valuable cargo into Boston Harbor, weren't demanding to be left alone. They wanted to reform the structure of their government, give the colonists equal rights and an equal voice in it, and be part of it, not destroy it.

    Extreme right-wing Republicans like Culberson should actually have called themselves the Fence Party. What they really want, symbolically speaking, is a fence -- in fact, as many fences as possible --  to protect them not only from government but from all the disturbing messiness of the world out there that they can't control.

    "The Fence Party" could evoke deep echoes of American history. Fences have played an important role in the nation's symbolic life. There's the white picket fence surrounding the home of the American dream; the stone fences that Robert Frost told us make good neighbors; the barbed wire fence surrounding the north 40 in all those cowboy movies; the electrified fence along sizeable stretches of the U.S. - Mexican border; and all the other fences we share as part of our cultural store.

    They always have a similar symbolic meaning. It's best expressed in a myth -- a story that, whether true or not, expresses a whole worldview and gives meaning to life for those who believe it. The tale has innumerable variants, but the plot always goes something like this:

    "I have worked hard to get a space -- a physical, economic, social, and cultural space -- to call my own. I have a right to be in charge of that space. So I'm determined to keep my space surrounded by a sturdy fence. I alone will control the gate, deciding who gets in and who doesn't. As long as my fence stands strong I can control all of my space. I can keep it secure, in good order, well protected from the chaos that threatens just beyond the fence."

    The chaos has taken on different faces from one era to the next. The faces it wears today -- "big government," unwelcome foreigners, sexual freedom -- all have a long, distinguished pedigree in American political myth and symbolism. In every era they have inspired fear, worry, and an impassioned drive to keep symbolic fences strong and well-mended.

    As many commentators are beginning to note, though, there was only one time in our past that the fear and passion rose high enough to inspire a suicidal urge to destroy the whole system. Yes, today's far right conjures up, in this respect, echoes of the Old South, which chose to risk its own destruction rather than yield to Washington's dictates.

    I wouldn't push the parallel too far. This isn't 1860. But it has been 153 years since we've seen a political movement desperate enough to say "Give me a fence or give me death" and powerful enough to push the entire country to the political brink.

    And now, as in early 1860, no one can say just how it will end, because there is apparently equal determination on the other side. It's hard to see where there's room for compromise. In the short run that's because the Democrats got suckered so often in Obama's first term, and now they say they won't get fooled again.

    But there's a deeper long-term impasse here. Liberals see the conservatives' symbolic fence as a myth in the conventional sense of the word -- an illusion with no basis in fact. In the modern world, liberals say, we have all become so interdependent, so enmeshed in large (often global) institutions, that the idea of controlling the gate to your own space is perhaps a nice fantasy, but surely nothing more.

    For proof, just consider the very air we breathe and the water we drink, the salaries we earn (or don't earn), the health care we get (or don't get); indeed, everything that determines whether our lives are secure and orderly. It all depends on choices made by people (perhaps millions of people) that we will never know or even see, choices that are beyond our control. None of those people can be kept from affecting our lives deeply, no matter how earnestly we work at building symbolic fences. 

    It is far more realistic to recognize that we are all parts of a community -- in fact many communities, extending around the world -- and we should all share in taking care of everyone in those communities. That's the essence of the liberal myth. Liberals can't compromise on it because they can't deny the evidence of their eyes, their ears, and their reasoning minds.

    This debate has been an important part of American political life, in one form or another, for as long as there has been a United States of America. Today its symbolic platform is Obamacare versus a federal government shutdown and perhaps default. Soon enough, it will be some other policy conflict. But the clashing myths driving the debate are likely to remain the same.

    What liberals can and should offer, by way of compromise, is a recognition that political positions depend on much more than evidence and logic. All of us are shaped by the mythic narratives we embrace (or, perhaps more precisely, that embrace us). What we see, how we see it, and how we think about it are all filtered through the lens of the symbolic myths we hold. And every myth enshrines a worldview and values that are the foundations of people's lives. No one likes to see their foundation threatened. That's always a scary feeling.

    Liberals should acknowledge that the conservative myth of the fence is more than mere illusion. It's a matter of the heart. It expresses powerful human emotions and longings that are deeply rooted in American history. They're understandable enough and deserve to be respected -- even when the policies that serve as their symbolic vehicles should be resisted. If liberals are as open-minded as they claim, they should be able to take this step toward engaging in some constructive, though surely contentious, dialogue with their political foes.  

    It's a crucial step that can open up room for the debate we really need in this country: a genuine, thoughtful debate about the myth and symbol of the fence. It will turn out to be a complicated affair, involving many other myths and symbols that will have to be interpreted with the same care and sensitivity. But no one ever said democracy would be easy. At least we would be following John Culberson's lead and talking honestly about the heart of the matter.   

     

     

    Uncovering the Tea Party's Radical Roots

    For decades, Democrats across the country have been holding Jefferson Day dinners, filling their coffers by honoring their party's founder. Suddenly, along comes the extreme right wing of the Republican Party, snatches up poor old TJ, and says, "Sorry, he's actually ours. After all, didn't he say, 'That government is best which governs least'?"

    Well, no, in fact he didn't. But perhaps he should have. He often expressed skepticism, and sometimes outright criticism, of the growing powers of the federal government. So which side in today's political divide is most entitled to carry the name of Jefferson on its banner? Exploring that question led me to a surprising discovery: If we put the Tea Party's claim to TJ's mantle in the proper historical perspective, we come out not on the far right but on the far left.

    It all began when I was re-reading Gordon Wood's The Radicalism of the American Revolution (trying to escape from obsessively tracking the DC rollercoaster). As Wood observes, the Jeffersonians and Hamiltonians divided over basically the same issue that plagues us now: How much of a role should government play in people's lives? (Though the clash back then was so fierce, and split American society so sharply, that it makes today's politics look rather mild by comparison.)

    But Wood takes us deeper into the substance of the issue. Jeffersonians were willing to limit government only because they assumed that there was "a principle of benevolence ... a moral instinct, a sense of sympathy, in each human being." They  were founding an American nation upon the European Enlightenment's belief that "there was 'a natural principle of attraction in man towards man' [as Hume put it], and that these natural affinities were by themselves capable of holding the society together."

    This was exactly the point that frightened Alexander Hamilton most. He summed up his opponents' view quite accurately: "As human nature shall refine and ameliorate by the operation of a more enlightened plan," based on common moral sense and the spread of affection and benevolence, government eventually "will become useless, and Society will subsist and flourish free from its shackles." Then Hamilton, the greatest conservative of his day, dismissed this vision of shrinking government as "a wild and fatal scheme."

     The Republicans who now control the House obviously have a very different view of what it means to be a true conservative. But that doesn't mean they have become Jeffersonians. Not by any means. In many ways they would be closer to Hamilton, who scorned Jefferson's trust in human nature.

    The Tea Party et al. don't defend their call for less government by claiming that we are all born with an innate sense of benevolence and sympathy toward all other people. On the contrary, they claim "the most sacred right to be left alone" largely because they don't trust people outside their own familiar circle, so they don't want those strangers meddling in their affairs.

    Yet the current call for less government is a useful reminder of the worldview on which Jefferson and many of the Founding Fathers expected to build the United States. They assumed it was "natural to infer, that a disposition to do good, must, in some degree, be common to all men."

    And this, Wood goes on to write, "was the real source of democratic equality." Every human being can be equally trusted to make wise decisions for the good of all (the reasoning went) because everyone, simply by virtue of being human, has a natural concern for the good of all -- as long as that inborn sense of sympathy and benevolence is not corrupted by a misguided society. Let nature take its course and everyone will be taking care of everyone else so well that there won't be very much for government to do.

     In the mid-19th century Henry David Thoreau drew that line of thinking out to its logical conclusion in his essay "Civil Disobedience":

    I heartily accept the motto, -- "That government is best which governs least" -- and I should like to see it acted up to more rapidly and systematically. Carried out, it finally amounts to this, which also I believe, -- "That government is best which governs not at all"; and when men are prepared for it, that will be the kind of government which they will have.

    How would men (and women, to be sure) get prepared for such anarchy, which was really Thoreau's ideal? He offered no simple rule, because there was none, in his view: "I would have each one be very careful and find out his own way," he wrote in Walden. "Explore the private sea, the Atlantic and Pacific Ocean of one's own being."

    Within that private sea of our own being, though, Thoreau was sure that every one of us could find -- each in our own way -- the eternal, spiritual "solid bottom" of the universe. "Next to us the grandest laws ... all the laws of Nature ... are continually being executed." We can know those laws directly and be guided by them, as long as we "live deep and suck all the marrow out of life." Then we will find government superfluous.

    Thoreau concluded "Civil Disobedience" by "imagining a State" that would let a few people

    live aloof from it, not meddling with it, nor embraced by it, who fulfilled all the duties of neighbors and fellow-men. A State which bore this kind of fruit, and suffered it to drop off as fast as it ripened, would prepare the way for a still more perfect and glorious State, which also I have imagined, but not yet anywhere seen.

    It would be a state of perfect Transcendentalist anarchy, where everyone would fulfill all the duties of neighbors and fellow-men not because they were following the government's laws but because they were letting nature take its course within them, living deep and sucking all the marrow out of life.

    Today's right-wing extremists would probably run from Thoreau's view of life even faster than from Jefferson's. But there is no denying that their obsession with shrinking government stands in a long, distinguished line of American tradition where these two luminaries shine so bright.

    Those same right-wingers would probably run fastest of all from another luminary, Walt Whitman, who was surely marching to his own drummer when he rhapsodized about his own transcendental moments: "From this hour, freedom! From this hour I ordain myself loos'd of limits and imaginary lines." Where the Tea Party would erect fences stronger and higher, Whitman would have every fence torn down.

    And in his imagined freedom, shorn of all defenses, Whitman found "the joy of that vast elemental sympathy which only the human soul is capable of generating and emitting in steady and limitless floods." Even Jefferson could not have expressed the Enlightenment faith in benevolent human nature more eloquently.

    Whitman gave classic voice to the link between the anarchic Transcendentalists and the Jeffersonians: Live free, follow your natural promptings, and you will spontaneously act upon the elemental sympathy for all that wells up from within you.

    So it seems a fitting coincidence that I first heard this tradition voiced by friends at "Leaves of Grass," my local countercultural bookstore, back in the late 1960s. They summed it up by asking, in Whitman's words: "What do you need, Camerado? Do you think it is love?", and answering, in the Beatles' words, "All you need is love."

    These friends were imagining something not yet anywhere seen: a society blending personal freedom and spiritual seeking with universal sympathy, so that everyone could suck all the marrow out of life. Most of them thought they were the first to even imagine it. They didn't know that they were only forging the next link in a historical chain of imagining -- a chain of political mythmaking -- stretching back to the American Revolution.

    As for the size of government, I don't recall it being a burning issue back then outside a small circle of political philosophy wonks. For the rest of us, it seemed just a matter of common sense. The innate sense of sympathy, as well as direct contact with the marrow of life, had been stunted for far too long by a society that valued profits and material goods above people. It would be many years before everyone's genuine needs would be fully met by spontaneous acts of benevolence and love.

    Until then, government should fill the gaps, since only government has the resources to make sure they are all filled. But it should stay out where it does more harm than good -- most obviously, back then, in Vietnam.

    So if we drag the Tea Party and its fellow-travelers (kicking and screaming, no doubt) back into their proper historical context, we discover that the size of the government is not the crucial issue at all. They are here to remind us of something much bigger: a grand mythic vision that appeared at the very birth of the nation and has remained with us ever since, periodically blazing up in individuals or groups who have articulated it in clear and sometimes eloquent words.

    So far the spotlight on the Tea Party has done much more to obscure than illuminate this mythic vision. But history has its way of playing unexpected tricks on us. Exhibit A: If it weren't for the Tea Party's vehement opposition, the U.S. would probably be dropping bombs on Syria right now, and very possibly sinking deeper into prolonged military involvement there.

    So let's give thanks where thanks are due, recall the patriotic far right's true roots in America's radical history, and do what we can to cultivate those roots so that they'll give rise to a healthier plant in the future.

     

    In the Shadow of the Progressive Era, for Better and for Worse

    The other day I took in, at a single glance, three headlines on the home page of the Washington Post. As they merged in my mind, they set me thinking back to the Progressive Era -- the decade from 1905 to 1915 -- that is arguably as pivotal as any in U.S. history and, for those of us who see a need for significant social change, perhaps more worth studying than any other decade.

    It was a time when a Republican president with impeccable conservative credentials, Theodore Roosevelt, could stake his political life on words like these: "every man holds his property subject to the general right of the community to regulate its use to whatever degree the public welfare may require it. ... Whenever the alternative must be faced, I am for men and not for property." "The welfare of each of us is dependent fundamentally upon the welfare of all of us." 

    It was a time when there were no businesses "too big to fail." The biggest business of all, Standard Oil, had its crimes and abuses exposed in articles and a book that became the talk of the nation. TR racked up plenty of political points by demanding that this trust, like others, be broken up.

    It was a time when experts began to use statistical methods to measure the effects not only of trusts, but of poverty, crowded slum housing, contaminated food, substandard education, and hundreds of other things. Armed with such objective evidence, Progressives got laws passed that have saved and improved innumerable lives ever since.

    We still live in the long shadow cast by Progressivism, for better and for worse. Though some things have changed, some have not, as the three articles I took in made clear.

    Today, a president with impeccable liberal credentials echoes TR's words. But he  doesn't follow through on that narrative nearly as rigorously. That liberal president (I read in the first of those WaPo articles) has allowed his Justice Department, in good Progressive fashion, to impose an unprecedented fine on JPMorgan Chase. The fabled House of Morgan will have to pay something like $13 billion -- more than half of last year's profit.

    But what about all the accumulated profits from the other years since Morgan started peddling subprime mortgage packages and other derivatives that it knew were dubious, at best? Those profits will more than compensate for whatever pain Morgan feels from the fine, and they’ll keep the firm’s top execs living in their accustomed uber-luxury.

    Indeed, once Morgan's accountants tote up the cost-benefit analysis in good Progressive fashion, they're bound to conclude that, in the long run, the bank's misdeeds were quite profitable. The obvious lesson is simply to go on doing more of the same -- especially when Morgan knows that one central tenet of Progressivism is long gone: Huge banks and corporations are now considered too big to fail. So the fine is not likely to have much deterrent effect at all.

    And it's still very possible that Morgan, with its massive legal war chest, will take the case all the way to the Supreme Court. Can we count on today's Supreme Court to follow the public will and allow a harsh penalty to stand? It's certainly doubtful. This is 2013.

    How different things were in 1913, when Standard Oil, after spending huge sums in court fighting the break-up order, took its case all the way to the Supreme Court and lost, ending up with no choice but to organize its own demise.

    One IQ Point Spells Death

    The Supreme Court was itself the subject of the second article that caught my attention. It has agreed to decide whether a convicted murderer must be put to death by the state of Florida because of ONE IQ point.

    In 2002 the Court ruled that it was cruel and unusual punishment to execute someone who is mentally disabled. That was far short of the goal promoted by Progressives like Clarence Darrow: declaring all capital punishment cruel, unusual, and hence unacceptable.

    And to limit the effect even further, the Court gave each state the right to define "mentally disabled." Florida picked IQ 70 as the legal cutoff point. Anything above 70 and you can't claim that you're mentally disabled. So Florida plans to execute Freddie Lee Hall, whose IQ, the state’s tests showed, is at least 71.

    Hall's lawyers argue what every psychologist knows: The results of IQ tests are just approximations, at best. There's a heavy dose of subjectivity in the scoring. And there are shelves of research questioning whether IQ tests, even when administered by experts, tell us anything definitive at all about someone's intelligence. But the Florida Supreme Court has said, in effect, "Sorry, Freddie. We've got an objective measurement. And you're one point too high to live."

    That's one legacy of the darker side of the Progressive era. Progressives were devoted to the newly arising cult of objectivity and statistical measurement, with some unarguably humane results. But here we have a case -- and so many others could be cited -- where the same myth of the all-powerful objective statistic may very well end up taking a life in a way that a more liberal Supreme Court could easily consider cruel and unusual punishment.

    It's only a bit of a stretch to see the $13B fine against Morgan as part of the same pattern. Why, it's more than four times the size of any previous fine against a financial giant! By pinning that enormous number on Morgan's misdeeds, we may get a warm feeling of satisfaction, as if justice has been served. And to warm our hearts more, some $4B of the settlement will supposedly go to the victims of the huge bank's nefarious scheming.

    But will $4B really be enough to compensate all of them (even if they get it all, which historical precedent suggests is doubtful)? Every foreclosure has effects that can never be quantified, not only in the lives of the people evicted but in the lives of everyone they touch. Those effects ripple out across their block, their neighborhood, their whole city. Sometimes those effects leave dead bodies, quite literally, in their wake.

    When foreclosures are rampant, as in the last few years, the whole is greater, and potentially deadlier, than the sum of its parts. To try to quantify all that seems foolish at best, callous at worst.

    Yet the feeling of comfort provided by "hard numbers" is part of the legacy of Progressivism that we still live with. And as long as those "hard numbers" are so big, as in this case, they make it easier for the public to live with the idea that some financial institutions are still too big to fail. So the deadly cycle of boom and bust goes on.

    Humanity or Technical Rationality?

    The Progressive faith in numbers also sheds light on the third article I saw on the home page of the WaPo, the same story most newspapers were headlining that day: The president's apology for the technical failures of the Obamacare rollout and his assurance that he was bringing in the top experts to fix it all ASAP.

    Apparently there's plenty of blame to go around. But the fundamental issue is that millions of Americans are equating the plan for expanded health insurance with the computers that are so central to the plan -- or at least so the mass media tell us, in endless stories that are likely to be self-fulfilling prophecies, at least in the short run.

    It's certainly possible that after a few months, while the enrollment period is still open, those top experts will succeed, the problems will be pretty much fixed, and the millions who crashed the system in its first days because they were so eager to get health insurance will reach their goal.

    Yet to turn this into a gripping story -- and after all that's how they make their money -- the mass media have to persuade us that it's equally possible the public will give up on the whole plan before the experts can do the job. In that case, millions will lose health care, and thousands will die, because a potentially humane health care program was judged by the quality of its digital computing.

    The idea that government should take responsibility for the health of all is a direct descendant of the Progressive era. But so is the digital age, harking back to the Progressive faith that everything can be reduced to numbers. In the case of computing, it’s a faith in two little digits, one and zero. More broadly, it’s a faith that the right machines will save us -- machines that only experts can design, build, and maintain.

    Even more broadly, the digital age is the culmination of the rule of technical reason that first emerged so clearly in the heyday of Progressivism. It was hardly just an American phenomenon, though some would argue that Americans have carried it to its furthest extreme. But Americans were also slowest to recognize its dangers.

    By the time Progressivism was sweeping this nation, European intellectuals were already beginning to warn about those dangers. The new mode of rationality cared only about calculating what means could reach any given end most efficiently. Reducing everything to numbers was a necessary first step in making those calculations. But what numbers and technical means-ends reasoning could never tell us was the ultimate value of the ends that society was seeking.

    In fact, they argued, modern society was losing the very ability to ask about ultimate goals. We were building machines meant to serve us and make our lives better, but then letting the machines become our masters, determining the quality and the very fabric of our lives. Keeping the machines going -- maintaining a dependable, seemingly rational order -- became not merely a means to an end but the very purpose of society.

    People who make the whole value of Obamacare depend on the smooth functioning of the computers are making a similar mistake. If the tools don't work, of course they should be fixed. But the tools are only the means to the goal. The goal should not be equated with, or judged by, the quality of the tools.

    Yet we can make that mistake so easily because we are so quick to reduce reality to its virtual, computerized version. That, too, is ultimately a legacy of the Progressive faith in numbers, experts, and machines. 

    The computerization of medical care, like the Progressive era itself, yields us an ambiguous fate. I've seen (at the Mayo Clinic) how immensely helpful a well-oiled, fully computerized medical system can be. I want all my medical records in a national data bank accessible to any provider I visit, despite the risks to my privacy, which I believe are a price worth paying.

    But I do not want my medical providers, medical insurers, or anyone else to be so mesmerized by numbers, computers, and the myth of perfect objectivity that they cannot see me, the human being in front of them.

    Nor do I want the public so mesmerized by the huge number $13 billion that it cannot see all the human beings whose lives continue to be ruined by corporations that remain fundamentally unchecked because they are deemed “too big to fail.”

    And I certainly don’t want the Supreme Court so mesmerized by numbers and the myth of objectivity that it cannot see the humanity of Freddie Lee Hall. Though I personally find all capital punishment to be cruel, and would like to see it become not just unusual but unthinkable, there seems little doubt that the Supreme Court in 2002 meant to make Mr. Hall’s execution legally “cruel and unusual.” 

    But there’s every possibility that in this case, too, the Progressive era’s devotion to numbers and objectivity will triumph over its undeniable concern for the value of human life.

    PS:  Just a day after these three articles appeared, as if to underscore the point, the WaPo offered another story explaining how our misguided trust in the "pinpoint" guidance systems of drones, guided by computers thousands of miles away, has killed far more people than most Americans would want to admit, or perhaps even think about. Yet the president assures us that in this case, too, he’s got experts refining the digital technology to fix the problem of “excessive collateral damage.”

    Critics of Spying on Allies Need a Better Narrative

    Seriously, though, I really was shocked to hear the U.S. Director of National Intelligence, James Clapper, admit the espionage publicly. Our allies spy on our leaders just as we spy on theirs, he explained to the House Intelligence Committee: "It's one of the first things I learned in intel school in 1963, that this is a fundamental given in the intelligence business is leadership intentions, no matter what level you're talking about."

    Racking up more points for honesty, Clapper pointed out the hypocrisy of the legislators, who know perfectly well what goes on yet are treating this as some scandalous new revelation. Like me, he couldn't resist the obvious film cliché: “Some of this reminds me of the classic movie Casablanca -- ‘My God, there’s gambling going on here.’”

    Of course U.S. intelligence agencies want to know everything that's done and said, everywhere, even on German Chancellor Angela Merkel's cell phone. That's their job, as they see it. Why else would they get those uncounted (literally uncounted, hidden in a blacked out budget) billions of our tax dollars every year? What's more, if the members of the House Intelligence Committee (or the Senate Intelligence Committee) don't know what's going on, they obviously aren't doing their own jobs competently.

    Yet here is Senator Dianne Feinstein, chair of the Senate Intelligence Committee, who generally defends all kinds of government surveillance, expressing her own shock: She does “not believe the United States should be collecting phone calls or emails of friendly presidents and prime ministers.” 

    It seems like everyone inside the Beltway is shocked about something or other.

    What's really going on is a tangle of conflicting narratives, all evoking strong passions because they hold such deep mythic meanings.

    For Clapper and the humongous intelligence bureaucracy he supposedly runs (though it's far too big for him to know even a fraction of what's really going on), the story is simple: Since September 11, 2001, we have been at war. And in wartime spying is a totally acceptable, indeed indispensable, weapon.

    It's a story at least as old as the days of ancient Israel, when biblical writers didn't hesitate to say that their ancestors had used spies to conquer the land they now lived in (see Joshua 2:1-7). The Roman empire had its spies, and it's safe to assume that every later empire did the same.

    The fledgling United States of America relied on spies to help win its first war, against Great Britain. Nathan Hale is the most famous of those spies. But when the Military Intelligence Corps Association set up an award to "recognize individuals who have contributed significantly to the promotion of Army Military Intelligence in ways that stand out," it didn't name the award for Hale. It created the Knowlton Award, commemorating Hale's commander, Thomas Knowlton. He was picked by George Washington in 1776 to create America's first spy unit, commonly known as "Knowlton's Rangers." Espionage in wartime has been de rigueur for the U.S. military ever since.

    But why spy on allies? The simple answer is the obvious imperative to gather as much information as possible, from every source possible. Ideally, every military commander would like to be omniscient, because in wartime, especially, knowledge is power.

    But things got more complicated during the cold war. Americans learned to expect an endless war as the new normal; the old line between wartime and peacetime disappeared. The battle was waged by economic, diplomatic, and cultural as well as traditional military means, so the line between civilian and military was blurred, too. Hence intelligence gathering became a constant responsibility shared across that line.

    Yet another line was blurred by the cold war, the most important of all: There was no longer any conventional front line between friend and foe. "The commies" could be anywhere -- even behind the desk at your local library, Senator Joseph McCarthy said. And certainly they might have infiltrated the highest levels of allied government offices. So it only made sense to spy everywhere.  

    The U.S. still spies on allies all over the world, as the New York Times points out, and not just on government leaders but on "their top aides and the heads of opposing parties" too. "It is all part of a comprehensive effort to gain an advantage over other nations, both friend and foe," the Times bluntly concludes.

    That's all the explanation needed for foreign policy elites and their journalistic scribes, who live within the narrative of political "realism." That view had already grown ascendant in Washington before World War II, and it was firmly entrenched by 1963, when James Clapper learned it: In the "great game," every major power is jockeying for advantage. So everyone spies on everyone, as they always have and always will, during times of cold as well as hot war.

    Why, then, is so much criticism leveled inside the Beltway at Clapper, the whole establishment he heads, and its ultimate boss, the president? Why would members of Congress, or anyone else, deny what seems so obvious to intelligence professionals and "realists" everywhere?

    Much of the answer, I think, comes from the cold war's unique contribution to American political mythology. By 1962 the distinctively American mythology of homeland insecurity had become institutionalized as the dominant narrative of the nation. It demanded that official voices in government and media express deep ambivalence toward "realism," embracing it while also rejecting it.

    The mythology of homeland insecurity assumes that America is always the innocent nation, trying only to make the world better. So it must reject the idea that America should do what everyone else does. America, it insists, is more moral than everyone else. We have a higher set of national values. We are the standard-bearers of virtue and civilization in a world always threatened by savage evil.

    That virtue alone gives us the right to fight evil wherever it appears, by any means necessary. That's the only reason we can use "realist" tactics -- because our goals are definitely not those of the "realist." We want to build up the nation's moral standing, not its brute power.

    At least that is our prevailing public narrative. And anything that undermines our public appearance of unique virtue -- like snooping in allies' offices and tapping their phones -- must be denounced, at least in public.

    There's another side of the myth involved, too. If our homeland is constantly insecure, we'd better have a leader who is powerful enough to defend us against all the unpredictable threats that may pop up anytime, anywhere -- even inside his own executive branch. 

    That's one reason the question "What did the president know and when did he know it?" is so urgent. A president, who is the nation's highest military commander, is like a god. He cannot be omnipotent unless he is omniscient.

    To be sure, there's yet another time-honored American myth at play here: The narrative of a government consisting of three coequal branches, each jealously guarding its own powers. Congress must have something to criticize the president about, if only to assert itself.

    That's especially true at a time when so many headlines have been trumpeting, in one way or another, the story, "President defeats House." As pundit David Gergen points out, "this is a important turn in the Obama administration's position within American politics. They were really riding high coming out, because the Republicans were on the defensive, you know, and the extremism over the government shutdown. And that narrative has now been replaced by narrative of what did the president know and when did he know it."

    Gergen was talking about the question of what the president knew about the weaknesses in the Obamacare software. But his words shed just as much light on the controversy about spying on allies.

    As members of Congress know, the mass media are eager right now to magnify any controversy between the president and anyone else. The mass media need their own story, one that will sell. And they know that any narrative of conflict revolving around the president is a guaranteed good draw. It will boost media ratings more than Congress' ratings.

    If narrative is the key issue, though, the administration is currently in a stronger place than its critics on the spying-on-allies issue. If the best narrative that the critics can come up with is, "America should be more virtuous than others and the president should know everything that's going on," James Clapper can justly reply:

    "I agree. Absolutely. That's what I said, too. We all know that the intelligence community, including the Senate and House committees, is a kind of old boys (and girls) club, much like the casino at Rick's café. We all share those same basic premises. We all know that espionage has always been part of the game. We all know what we can talk about publicly and what we're supposed to keep secret. We've just got a little quarrel about tactics going on here. It's merely a question of how we, and especially our president, can best safeguard our virtue in a world full of evildoers."

    When the quarrel is only about tactics, the professionals in the executive branch are likely to best their critics, in Congress as well as outside of it, just about every time.

    If the critics hope to gain any real traction in this debate, they'll have to take it to a deeper level and challenge the basic premises of the intelligence community's narrative. They'll have to join the ad hoc coalition of left and right who are challenging the whole idea of government spying.

    That coalition has its own narrative, which also goes back to Revolutionary War times. Why did we fight that war in the first place? One big reason, they say, was to get rid of a monarchical system that could invade and control our private lives at any time, on any whim. The glory of the new system was enshrined in the Fourth Amendment to the Constitution, guaranteeing everyone the right "to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures."

    Our "effects" surely now include our digital communications. And we didn't fight selfishly just for our own rights, the traditional story says. We claimed those rights in the name of all humanity. So every human, everywhere, is entitled to the same right of privacy in their digital communications, no matter what political roles they might play. 

    Yet the "right to privacy" narrative itself will face an uphill struggle as long as the mythology of national insecurity dominates our public conversation. If we are an innocent nation, constantly threatened by "evildoers" who might pop up anywhere, it only makes sense that we must always be on our guard. After all, eternal vigilance is the price of liberty, right? Always has been; always will be. And if we now have digital technology to be vigilant for us, why not use it?

    As long as fear of unseen "evildoers" haunts the land, that argument will be hard to refute. 

    If advocates of the right to privacy want any chance to set meaningful limits on government spying, on allies or anyone else, they will have to challenge the basic premises of the national insecurity state. They will have to argue not only that privacy trumps security, but that the demands for security have been far exaggerated in an American society that has never really escaped the cold war narrative of constant danger and fear.

    Some critics of spying are already making that case, to be sure. But their voices will have to grow a lot louder if anything is really going to change.   

     

     

     

    "End Times for Obama": The Jacksonian Myth Lives On  "If everything -- everything -- isn’t fixed by Nov. 30, we’re looking at a presidency that is going to collapse into utter disaster." That's how Michael Tomasky sums up, quite accurately, the current media narrative on the troubled launch of Obamacare. Why is the "end times for Obama" meme turning up everywhere? Tomasky attributes it to a journalistic worldview that is always about "who is displaying mastery of the game and who is being mastered at any given moment ... a certain type of political journalism that so exists in the moment that numerous such moments have been declared to be disasters for Obama, going back to Jeremiah Wright."

    No doubt that's part of the story. So is another piece of the journalism game, explained by David Gergen: The administration was "really riding high" during the government shutdown "because the Republicans were on the defensive" and ended up in utter defeat, as far as the mass media were concerned. Now the media need a president on the defensive, just to make the political game tense enough in the moment to keep their ratings up.

    Still, no story has such powerful "legs" (as they say in the newsroom) unless a big segment of the public is already inclined to care about it. So we should dig deeper and ask why so many people are ready to believe that Obama and his administration are facing disaster when, logically speaking, nothing terrible has happened at all.

    Yes, the website is a mess. But we are all accustomed to being frustrated by malfunctioning websites; it's just a normal part of life now. We are also accustomed to glitches in the rollout of government programs. Medicare Part D had major problems. The glaring "donut hole" didn't get repaired for years. Yet no one saw that program as the demise of the G.W. Bush administration. "In the end," as Paul Krugman wrote, "the program delivered lasting benefits, and woe unto any politician proposing that it be rolled back."

    Most importantly, people who want or need to sign up for insurance under Obamacare still have a full four and a half months to do it. The current glitches, which apparently are huge, don't constitute an emergency by any stretch of the imagination.  

    Yes, the president lied when he said no one would have to give up the insurance they already have. But we are all accustomed to presidents lying for political advantage. The question we ought to ask is: What are the consequences of the gap between the lie and the truth? If the consequences are, say, a disastrous eight years of war in Iraq, then that's something to be very upset about. If the consequences are that people will be required to buy better health insurance, for which a few will have to pay significantly more, we are in a very different ballpark.

    Matthew Yglesias, among many others, makes a compelling case that "It’s Good That You Can’t Keep Your Insurance Plan." Insurers are only canceling plans that threaten their profits, he argues, plans they would have canceled anyway once the insured got seriously ill.

    So when Krugman wrote "the glitches of October won’t matter in the long run," he was giving the public credit for thinking the issue through reasonably. But by the end of that month he was starting to worry about the public's reasonableness: 

    The biggest reason Obama and co. should be anxious to fix these things now, I’d argue, isn’t the fate of the program itself, which can survive even large early wobbles, but the midterm elections. If Obamacare is fixed, Republicans will be in the position of attacking a program that is benefiting millions of Americans; if it isn’t, they can still run against the legend, not the fact.

    With a tweak of terminology, I'd say that Republicans will run -- indeed already are running, and have been for a long time -- against the mythic narrative that gave rise to Obamacare in the first place.

    Obama's narrative about health care reform was consistent from the beginning. By saying that most Americans would experience no change he tried to steer the public conversation away from the question of how his reform would impact individuals. What he talked about, over and over again, was how it would improve the quality and outcomes of the nation's health care system as a whole, while reducing the cost for the nation as a whole.

    Whether he was right or wrong about that isn't relevant to my point here. I'm simply saying that the president and the people who created Obamacare were telling a story about what would happen to the whole country, to all of us Americans en masse. That's apparently the way they looked at the issue. In any event, that's certainly the way they talked about it.

    The Republican-led opposition to Obamacare was based on just the opposite narrative: Think about (and be frightened by) what will happen to you, as an individual, and to your own private family. The GOP never tried very hard to rebut the administration's claims about what would happen to the nation as a socioeconomic system. That simply wasn't relevant to their story.

    The sense of catastrophe that pervades mass media reports about Obamacare now is based squarely on that Republican narrative. So many individuals are unhappy! That's all it amounts to. If the president were fully honest (which, let's face it, no president ever is), he'd say:

    Well, we knew some individuals would be unhappy. We didn't expect so many to be unhappy. But it's really beside the point. Every systemic change makes some individuals unhappy. My job isn't to make every individual happy -- which would be impossible anyway. My job, indeed the job of any federal official, is to improve the quality of life for the nation as a whole.

    Obama cannot say that and hope to survive politically because he never succeeded in winning the public at large over to his narrative. The rules of the political game are still defined by the current conservative narrative.

    I emphasize "current" because a focus on individual satisfaction has not always been directly linked to what we now call conservatism; nor has a focus on the good of the nation as a whole always been linked to what we now call liberalism.

    Consider the striking parallel between the difficulties Obama faces now and the difficulties that John Quincy Adams and his close political ally, Henry Clay, faced in their ongoing battle with Andrew Jackson in the 1820s. Adams and Clay were widely perceived, with good reason, as pro-big-business leaders. They were eager to unleash the budding capitalist energies of the corporate and financial sectors in their day.

    For Adams and Clay, though, that eagerness was merely part of a larger program of improving the nation, a program that Clay dubbed "The American System." An equally important part was an ambitious series of improvements in transportation and infrastructure funded by the federal government.

    Daniel Walker Howe, in his Pulitzer Prize-winning history of the era, says that "as Clay envisioned it, the American System ... would create, not division between the haves and have-nots, but a framework within which all could work harmoniously to improve themselves both individually and collectively." And the collective improvement would be the precondition of individual improvement. Adams and Clay, like Obama and his administration, focused most on what they thought was best for the nation as a whole, assuming that a more robust national economy would benefit everyone.

    Perhaps Clay was fooling himself, thinking that his System would not widen the gap between haves and have-nots, since it was so tilted toward the interests of the emerging financial and corporate sectors. That's why we call him (and Adams), by our current standards, conservative.

    Of course no one should call Obama anti-big-business, not by a long shot. But he is more open than his Republican foes to regulating the corporate and finance sectors. And it is hard to see how his health care reform would increase economic inequality. That's why we call him, by our current standards, more liberal than the GOP.

    Yet his health care reform is built on the same mythic narrative of national improvement touted by the conservatives Clay and Adams -- a narrative that has become the heart and soul of what we now call liberalism.

    We can see the origins of our modern liberal-conservative divide more clearly if we take a closer look back at the 1820s. Then, as now, opposition to a concern for national well-being was not based on opposition to the particulars of any policy. Jackson's support was not based on opposition to the American System or any other policy, says Howe: "Jackson possessed an appeal not based on issues; it derived from his image as a victor in battle, a frontiersman who had made it big, a man of decision who forged his own rules." In short, Jackson was the ultimate prototype of American "rugged individualism" -- a narrative that has become the heart and soul of what we now call conservatism.

    Howe cites another historian, Daniel Feller, to explain why Adams and Clay could never sell America on their American System. "The inclination of its people was for diffusion rather than discipline, toward self-determination and away from supervision, however benign." Everyone going their own way and doing their own thing, judging the value of any policy by the immediate fulfillment it gives individuals, has often been a winning narrative in American political life, at least since the 1820s. Apparently it still is today. That's why it's so easy for the media to sell the story of "end times for Obama."

    Jackson also won lots of votes, says Howe, by initiating another "common and effective tactic in American politics: running against Washington, D.C." The more things have changed, it seems, the more they've stayed the same.

    But Jackson really wasn't the first candidate for president to run against the power of the nation's capital. That honor belongs to Thomas Jefferson. It's not surprising that the aged Jefferson was among the harshest critics of the American System, because it would give so much power to the government in Washington, D.C.

    Jefferson had always feared that a metropolitan center would dominate the American union of states -- what he called the "empire for liberty" -- turning it into a mirror image of the British empire dominated by the court in London that Jefferson had fought so bitterly. Jefferson wanted fervently to believe that the United States could flourish with a weak central government, but for just the opposite reason from the Jacksonians': TJ trusted that Americans would not be rugged individualists.

    Peter Onuf, studying Jefferson's ideas of nationalism, wrote that TJ wanted "a federal union that preempted the concentration of despotic power in a domineering metropolis." Americans would be held together as a nation not by a powerful federal state but "by their harmonious interests, common principles, and reciprocal affections." That was the essence of Jeffersonian nationalism. He could believe in that vision because he believed in the innate benevolence of human nature.

    "Yet," Onuf adds immediately, "as Jefferson discovered, these exalted expectations were repeatedly frustrated." So he felt "chronic concern that Americans would sink into a state of collective unconsciousness, forgetting that they were a people" -- that is, a society held together by reciprocal concern for all other Americans and thus for the national community as a whole.

    Jefferson didn't live quite long enough to see his fears fully realized in Jackson's presidential victory of 1828 -- a victory for the new Jacksonian vision of nationalism as a stout assertion of rugged individualism.

    But Barack Obama has lived long enough to see that victory in the panic engendered by the few brief weeks of the stumbling launch of his health care reform. The panic comes only from the mythic narrative dominating the nation: judging government by what it has done for me and my family very lately, in the current moment.

    There's much more hanging on the fate of Obamacare than health care or the 2014 election, as important as those are. Obamacare is an effort to reassert the mythic narrative that Jefferson, Adams, Clay, and so many other American leaders have promoted -- a story of Americans concerned more about what happens to all of us as a nation than to any one of us as individuals. If Obamacare is ultimately judged a failure it will be yet another victory for the Jacksonian myth of rugged individualism.

    The outcome doesn't depend on how many people can sign up for Obamacare by some arbitrary deadline or how many have to pay a bit more for new, improved insurance policies. Those facts will make headlines, but they will be only symbolic expressions of the underlying battle of myths. The outcome really depends on how hard each side fights for its myth.    

    11/22: The Day "Truth" Died

    Last week I wrote a column pointing out that evangelical Christians supported a lot of progressive, even radical, political views in nineteenth-century America. Slowly, some evangelicals are starting to return to their left-leaning roots. Get a random cross-section of evangelicals together and you might get quite a lively debate about the economy, the role of government, the environment, and a host of other issues.

    But there's one thing they'd all agree on: Whatever political position they hold, evangelicals will always begin explaining their view with the words: "The Bible says." The stamp of biblical authority can be put on any political position, from far left to far right, as U.S. history proves.

    Whenever that stamp is pasted on, it gives any political position a sacred aura of absolute certainty. That's the old-fashioned sense of "truth": something that's eternal, universal, unarguable, unshakeable, unchallengeable; "the God's honest truth."  

    I wrote that column about evangelicals just as the stream of media words about the fiftieth anniversary of John F. Kennedy's death was swelling higher and higher. The coincidence set me to thinking about how the question of truth and certainty has played out in American political history.

    This week we are deluged by reminiscences of JFK and Jackie and, even more, the endless flow of theories about what really happened in Dealey Plaza on that tragic day. Some are sure a single gunman named Oswald killed the president. Some are sure that's a lie. Some aren't sure of anything. And some, like me, think the most important fact about the assassination is precisely that, after fifty years, the debate goes on with no end in sight.

    Perhaps documents not yet released will turn up some irrefutable "smoking gun" (at least metaphorically and perhaps literally). For now, though, there's only one thing for sure: Half a century on, we have no societal consensus about what really happened. As a nation, all we have is a shared collective uncertainty.

    Though no one knew it at the time, the announcement of JFK's death also announced the beginning of the end of the old-fashioned idea of "truth" -- an absolute certainty that we can all depend on.

    A profound sense of uncertainty set in as soon as we heard the news on November 22, 1963. The thought was sometimes articulated and almost universally felt: If the young, vigorous leader out of Camelot could be cut down in the prime of his life, by a shot or shots out of the blue, and bleed to death all over his beautiful young wife, then anything was possible. There was no way to know what to expect next. Call it metaphysical uncertainty.

    The generation yet unborn at the time can easily relate to that feeling by recalling the morning of September 11, 2001, another day in living memory that gave America an equal sense of shock and uncertainty, as if the ground had been pulled out from under us.

    But the deeper significance of JFK's killing appears by contrasting it with the third such day in living memory (and there have only been three): the bombing of Pearl Harbor. The result of that attack was an immediate end to uncertainty. The debate about whether to intervene in World War II collapsed within hours. There was a sweeping consensus on who was good, who was evil, and what had to be done.

    JFK's murder was a whole different story. Metaphysical uncertainty opened the door to moral uncertainty: If there was no consensus on who did it, how could we know for sure who was good and who was evil?

    A few years later the same kind of question abounded on many fronts, most notably the battle front. Growing numbers of Americans were coming to the conclusion -- a terribly painful one, for many -- that we were not the good guys in Vietnam. Many were seriously doubting that there were any good guys. A war without good guys against bad guys? That just didn't compute in the American mythological tradition. All that remained was bafflement.

    Many Americans found similar uncertainty looking at the question of race. Evangelical Christianity had been merged with politics most recently in the civil rights movement in the South, which never could have succeeded without the powerful force of the black churches behind it.

    As the African-American struggle for equality became centered less in the churches of the South and more in the streets and secular meeting halls of the North and West, it no longer felt so comfortable to many of the whites who had cheered the evangelical preaching of Martin Luther King, Jr.

    Still, most of those whites understood that African-Americans, and indeed all people of color, had justifiable reasons for anger. On the other hand, justifiable reasons for violence? Again, moral certainty was increasingly hard to come by.

    The sense of confusion was fueled by a host of other issues. Gender roles, sexual behavior, poverty, parental authority, education, drugs and alcohol -- just about everything seemed to be an area of contention and endless questions, where once seemingly settled rules had created a secure sense of certainty.

    To be sure, that yearning for the good old days of "truth" and settled rules was largely a matter of nostalgia for a mythic past that never quite existed, at least not the way so many people imagined it. But in the political arena mythic pasts are just as real as historically accurate pasts -- and often more powerful, as the voters would prove in 1980 when they overwhelmingly elected Ronald Reagan to the presidency.

    Reagan's popularity rested on his call for shrinking government in the domestic sphere and expanding it in the military (especially nuclear) sphere, but most of all on his uncanny ability to provide a reassuring sense of certainty. In a manner more gentle than strident, he made his point clear: There are immutable rules in human life, America stands for those rules, and he would make sure they would never be successfully challenged or even called into question.

    (If anyone doubted it, they had only to recall that Reagan first came to political prominence as governor of California, when he ordered a massive police assault on the "dirty hippies" who wanted to seize the University of California's land and turn it into a People's Park.)

    Though Reagan was far from an evangelical Christian, he appealed to that community by speaking well of them and, even more, by bringing the stamp of authority,  certainty, and "truth" back into American political life. Plenty of Americans breathed a sigh of relief. The thought that truth no longer meant certainty had been a frightening one.

    What Reagan, the evangelicals, and all who shared their yearning for certainty didn't realize is that, all the time, another revolution had been brewing, one born in Paris. Back in 1968, even the most conservative Americans had been thankful that, as chaotic as our nation seemed, it was nothing compared to the streets of Paris (and other French cities), where a radical student-worker coalition manned and womanned the barricades to fight pitched battles against the police.

    But the real revolution -- the one that would transform life around the world, including the USA -- was happening in the elite universities of Paris, where Jacques Derrida, Michel Foucault, and their avant garde colleagues were teaching students a whole new way to understand what truth is.

    Deconstructionism and post-structuralism abolished the very possibility of certainty. What we call truth became nothing but a momentary arrangement of words whose meanings were constantly slipping and sliding under the pressure of other words, all produced by an ever-shifting array of constellations of power.

    Whenever someone declared they had found the truth, there was always another way to look at it. And that other way would probably prevail, sooner or later, because where you stood on any question of truth depended largely on where you sat in the unstable field of power.

    It was precisely during the Reagan presidency, while so many Americans were enjoying their newly regained sense of certainty, that these imports from Paris snuck into the universities and transformed American life. Academic humanists, and some social scientists, were finally catching up with what sophisticated physicists had known since Werner Heisenberg's famous pronouncement in 1927: Even in the most rigorous scientific experiment, there is always an element of uncertainty.   

    Now we have a whole generation of college-educated Americans who were taught that the old notion of "truth" as certainty is an outdated relic. Perhaps it's no coincidence that this generation is less likely than their elders to identify with any particular religion.

    But the more important -- I'd say momentous -- consequence of the rise of uncertainty is in the political realm. There's a mass of experimental evidence linking conservative political views to a desire for firm, dependable structure and an aversion to (or even fear of) ambiguity and uncertainty. Those who hold liberal views typically show just the opposite traits. (Take the quiz here.)

    Almost anything can become a symbol of uncertainty. Communism, "terrorist" attacks, legal abortion, gay marriage, and the banning of prayer from schools have all headed the list at one time or another. Now it's "big government." Behind all those issues, and so many more conservative bêtes noires, lies the fundamental disturbing question: Whatever happened to our cherished notion of "truth"?

    The answer, so painful to so many Americans -- not just to conservatives, by any means -- is that the old idea of "truth" is vanishing. It's an agonizingly slow process, moving forward in fits and starts. It brings all sorts of pain to many Americans, as those who suffer psychologically respond with political policies that inflict physical suffering on others.

    Yet the process is probably irreversible. The old "truth" is gradually being replaced by the idea Gandhi articulated so well: "Absolute truth ... is beyond reach. The truth we see is relative, many-sided, plural. ... There is nothing wrong in every man following truth according to his lights.  Indeed it is his duty to do so. ... In this world, we always have to act as judges for ourselves."

    Of course Gandhi recognized the question that immediately springs to mind: If everyone is deciding what's true for themselves, how will we ever get along with each other? How will we ever have any harmony in society? And he was ready with an answer, the only answer I've ever come across that makes sense: Nonviolence.

    Compromise on matters that aren't crucial, the Mahatma advised. When moral principle is at stake, stand up for truth as you see it. Yet offer love and compassion to those who see it differently, even as you firmly resist their actions and policies. Never seek to harm them. If harm must come, let it be on you.  

    Maybe some day, at the end of this long, painful transition, America will embrace Gandhi's view as its own. Maybe not. While we are waiting, we should resist the dangerous policies promoted by people who are terrified at the prospect of losing absolute "truth."

    But we should also understand why they are terrified. We should remember to give them our compassion, as Gandhi urged.

    And we should remember that it all began on that November day in 1963 when the president was killed by Lee Harvey Oswald, or by Oswald and someone (or someones) else, or someone(s) else and not Oswald at all, or … well, 50 years later the only truth we're left with is that we are, as a nation, still uncertain. 

    The Mideast is America's New Wild West

    Why the enduring "special relationship" between the U.S. and Israel? Cultural historians, who look at symbols and stories more than politics and policies, say a big part of it goes back to the late 1950s, when Leon Uris' novel Exodus reached the top of the bestseller list and was then turned into a blockbuster film, with an all-star cast headed by Paul Newman.

    Scholar Rachel Weissbrod called it a "Zionist melodrama." M.M. Silver devoted a whole book to the phenomenon: Our Exodus, with the subtitle, The Americanization of Israel's Founding Story.

    A preeminent historian of American Judaism, Jonathan Sarna, came closest to the truth in his blurb for Silver's book: Exodus "consciously linked brawny Zionist pioneers with the heroes of traditional American westerns." The protagonist, Ari ben Canaan ("lion, son of Canaan"), is the Jewish Shane, the cowboy of impeccable virtue who kills only because he must to save decent people -- especially the gentile woman he loves -- and civilize a savage land.

    Screenwriter Dalton Trumbo (in his first outing after years of being blacklisted) did add a penultimate scene missing from the novel: Ari swears that someday Jews and Arabs will live together and share the land in peace. But then he heads off to fight those very Arabs. Who could resist rooting for Paul Newman, no matter how many bad guys he was forced to wipe out?

    Just a year later the Israelis kidnapped, tried, and executed Nazi bureaucrat Adolf Eichmann. Who could resist seeing fiction come to life, with the increasingly common equation, Arabs = Nazis?

    Thus cultural myth combined with historical event to set the stage for widespread support of the Johnson and Nixon administrations' sharp pro-Israel tilt, when Israel went to war with its neighbors in 1967 and 1973.

    I'm writing about this history now because it still lives, today (December 4), in our flagship newspaper, the New York Times.

    The influential columnist Thomas Friedman tells us that the Middle East is a "merciless, hard-bitten region" where everyone is out to get everyone, and "it is vital to never let the other side think they can 'outcrazy' you" -- because the craziest people will be the most violent and thus the winners, one assumes. Apparently those Middle Easterners don't settle their differences politely and rationally, as we do here in "civilized" America.

    Are you beginning to see the melodrama of old-fashioned Westerns yet? Wait, there's plenty more:

     The Jews and the Kurds are among the few minorities that have managed to carve out autonomous spaces in the Arab-Muslim world because, at the end of the day, they would never let any of their foes outcrazy them; they did whatever they had to in order to survive, and sometimes it was really ugly, but they survived to tell the tale.

    Today, just as in the days of Exodus, Israelis must be threatened, Friedman assumes, and they must be willing to be crazy killers to survive. In fact, it's this old mythic narrative that must survive.

    Now the plot has been updated to make the Iranian part of "the Arab-Muslim world" the peril to Israel's very existence. (Friedman must have missed the episode of Homeland where Dana Brody informs her high-school classmates that Iranians are not Arabs, so there is no monolithic "Arab-Muslim world.")

    Friedman is sure that all the reports of Iranian leader Ali Khamenei supporting the moderate president Hassan Rouhani (even in the Times itself) are not to be trusted. As evidence, he cites three acts of mass killing attributed to "Iran and Hezbollah" two or three decades ago. For him, this is proof enough that "the Iranians will go all the way" in irrational slaughter and that "the dark core of this Iranian regime has not gone away. It’s just out of sight, and it does need to believe that all options really are on the table for negotiations to succeed."

    How to show "the dark core" at the heart of the "Arab-Muslim world" that we can be violently crazy too? Friedman nominates Israeli Prime Minister Bibi Netanyahu to do the job, to continue being "crazy" with "his Dr. Strangelove stuff and the occasional missile test." How else can we tame the savagery in the Wild West that we call the Middle East?

    Well, that's the view from the authoritative moderate voice not only of our flagship newspaper but of the liberal foreign policy establishment here in the U.S.

    What about the moderate view in Israel? The Times' website is now giving us that view from Shmuel Rosner, a veteran centrist Israeli journalist who specializes in "the special relationship" between his nation and the U.S.

    In his Dec. 4 column, Rosner writes about the controversy between the Israeli government and the "thousands of Israeli Bedouins and Arabs [who] staged demonstrations, some of them violent, against a government plan to resettle the Bedouins of the Negev desert." By the third paragraph of the column, you can't help feeling you are back in America's Wild West -- this time with the decent folk facing not crazy gunslingers but primitive "Injuns." 

    Rosner hastens to tell us of the dreadful poverty of the Bedouins and shows his sympathy by asserting that their "community needs help to advance" -- help that can come, apparently, only from the civilized Israeli government. Bedouin communities are "more clusters of huts than real villages." Theirs is

    a historically nomadic society[,] and its relationship to land clashes with the state’s notion of ownership and its need for planned development. ... They claim the land as their own, based on a long history as its residents. They have no legal documents proving ownership, and the country has been reluctant to formalize their claims.

    Why, that reminds me of the early Puritan minister who opined that the natives' "land is spacious and void, and they are few and do but run over the grass, as do also the foxes and wild beasts." And the Jamestown settler who described the natives as "only an idle, improvident, scattered people, ignorant of the knowledge of gold, or silver, or any commodities."

    John Winthrop, head of the Puritans' Massachusetts Bay Colony, explained that since the natives "inclose noe Land, neither have any settled habitation, nor any tame Cattle to improve the Land," the whites could take pretty much as much land as they wanted, leaving the natives just what his government deemed "sufficient for their use" -- which wasn't much at all, of course. No doubt he agreed with another Jamestown settler who said, "Our intrusion into their possession shall tend to their great good, and no way to their hurt, unlesse as unbridled beastes, they procure it to themselves."

    Yes, America's Wild West myth started way back when all the whites lived in towns hugging the East coast, wanting only to do  "great good" for all those native "beasts."

    In today's Israel, under the so-called Prawer Plan, "the government is ready to give the Bedouins title to some land." Their "clusters of huts" will be replaced with houses with running water and electricity and officially recognized as settlements.

    There's just one catch: "Between 30,000 and 40,000 Bedouins will have to relocate to existing or new towns in the same area." That's why Bedouins and their supporters are protesting.

    But, hey, Rosner urges us to believe, that will be in no way to their hurt (unless as unbridled beasts, they procure it to themselves, I suppose). And "Israel will also have to pay a high price." Not only will it give Bedouins land. "It will also spend considerable taxpayer money — about $2 billion for the entire effort, including over $330 million on economic development — to improve their living conditions ... bringing much-needed help to one of the country’s most disfavored groups."

    The link will take you to the Israeli government's website, describing its "comprehensive policy aimed at improving [Bedouins'] economic, social and living conditions, as well as resolving long-standing land issues. ...  a major step forward towards integrating the Bedouin more fully into Israel's multicultural society, while still preserving their unique culture and heritage."

    You might hear Ulysses S. Grant murmuring approval from the grave -- Grant being the president who did more than any other to promote the idea of putting native Americans on reservations to "improve their conditions." Maybe "The Great White Father" is now Jewish.

    To be fair, the parallel is far from complete. The Israelis are not talking about "reservations" in the sense that Americans know them. And not even the most Orthodox Jews in Israel are talking about converting the Bedouins to Judaism. They don't have anyone like the Puritan missionary John Eliot, who created "praying towns" to bring Christian civilization to the indigenous people -- who were doomed, he said, if they continued to live "so unfixed, confused, and ungoverned a life, uncivilized and unsubdued to labor and order."

    In fact many Orthodox Israelis reject the Prawer Plan as a giveaway to the indigenous people. One of their icons, Foreign Minister Avigdor Liberman, called the situation simply "a battle for the land. ... We are fighting for the national lands of the Jewish people." You might hear Andrew Jackson murmuring approval from the grave; after all, his USA was still "the New Israel."

    Of course Jackson got huge resistance from whites for his Indian removal program. So does Liberman. Just as Americans long debated, sometimes fiercely, about "the Indian problem," Israelis now debate fiercely about "the Arab problem." Yet in the U.S. that debate gets little media attention. The media are more likely to oversimplify the issue, casting it through the lens of a centuries-old American mythology.  

    That's why I've gone into such detail about these two Times columns -- not because there's anything extraordinary about them, but precisely because they are so ordinary. It's just another typical day in American journalism's coverage of "our friend Israel versus the Arab-Muslim world." From the Times, the pinnacle of our journalism, these old Wild West stereotypes trickle down to all the rest of the media and thus to the public at large.

    The particulars of Israeli policy toward Arabs are quite different from the specific ways the U.S. has dealt with its indigenous peoples. But the myths that shaped U.S. whites' attitudes toward native Americans for four centuries or more (and to some extent still do) are strikingly similar to the myths that shape American public attitudes toward Israel and "the Arab-Muslim" world.

    Especially the conservative public. The old idea that "the Jews" are responsible for the U.S. government's pro-Israel tilt has been put to rest by recent polling data from CNN, the Huffington Post, and Pew. All show that, in the U.S., the strongest support for Israel’s right-wing policies now comes not from Jews but from Republicans.

    That's especially true for white evangelical Christians. In one recent poll, 46% of those evangelicals said the U.S. is not supportive enough of Israel, while only 31% of Jews held that view. Half of the evangelicals said Israel could never coexist with an independent Palestinian state, while only a third of Jews said the same.

    But the conservative pressure on any U.S. president to tilt toward Israel -- a pressure Barack Obama feels every day -- is not primarily a matter of religion. It's much more about a cultural affinity Americans have long felt for the story of Israel that they learned so long ago -- especially conservatives, who are most likely to love that story of the innocent good guys, who just want to civilize the wilderness, constantly threatened by "the dark core" of savage evildoers.

    That's the story at the heart of the myth of insecurity so fundamental to political culture in both Israel and America. But in America the media rarely cast the native people as savages any more, at least not explicitly.

    So perhaps many Americans are clinging to their old familiar myth vicariously by projecting it onto what Friedman calls the "merciless, hard-bitten" Middle East, where most everyone seems crazy -- if you accept the mass media's story as the truth. As I'm finishing this piece, the Times' website is featuring yet another in the endless string of frightening headlines, which all sound so much the same: "Jihadist Groups Gain in Turmoil Across Middle East." Meet the new news, same as the old news.

    The only good news is that myths do change. For years the best historians have been describing a native American culture, going back to pre-contact days, that was fully as rational and advanced a civilization as the whites', and deserves to be understood on its own terms.

    Indeed there's a persuasive theory that the British colonies of North America created pejorative myths about the native peoples to negate the lure of native ways, since so many immigrants found the natives' life more civilized -- and comfortable -- than the European life they'd brought across the sea.  

    That more accurate story of the American past is beginning to filter into history textbooks that millions of students will read in the coming years. Some of them will become journalists who will eventually control and revise the story line in the mass media. So there's hope that, some day, a more accurate story of Arabs and other Muslim peoples will find its way into our mass media, too.

    Meanwhile, let's be aware of the old story that still prevails about "the Arab-Muslim world" and recognize how it appeals to many Americans, letting them hold on to a new version of an old narrative that they are reluctant to give up. And let's be aware that the appeal of this narrative plays a huge role in the public demand for a pro-Israel tilt from Washington. At a time when the Obama administration is immersed in potentially world-changing negotiations, both with Iran and at the Israel-Palestine table, the role of myth in political life is too important to ignore.

    Time to Bring Back the "Great Man" Theory of History?

    I don't know exactly when the "great man" theory went out of fashion among professional historians. I do know that when I started studying American history, a very long time ago, the theory was already so outdated that no respectable historian would think of subscribing to it -- at least not in public. Perhaps some still harbored private suspicions that "great" men and women play a much bigger role in shaping and explaining history than the profession now allows us to admit.

    Of course outside the circle of professional historians the "great man" theory never died. It's as alive, vibrant, and probably even dominant as ever. Just browse the history shelves at any big, popular bookstore. Then go over to the biography section and notice that it's almost as big as the history section.

    Or, now, browse any big, popular newspaper or magazine, where articles about Nelson Mandela are just barely beginning to fade from the headlines, a full week after his death. It was a week of indulging in (no doubt well deserved) praise of the great man who turned the tide of history -- virtually single-handedly, if you accept the popular version of the story that has dominated the news all week. That story would not have made the front page every day if it didn't sell newspapers, which is one more piece of evidence that for most non-historians the "great man" theory is still the key to understanding why history unfolds as it does.

    Is it possible that the masses know something we professional historians have forgotten? I'm certainly not going to make a case for bringing the "great man" theory back to the dominant place it once held in American historiography. Obviously there are a huge number of other modes of explanation that have earned our respect, with good reason. To be a good historian is to have all those explanatory models and mechanisms in your tool bag and know how and when to use each one to best effect.

    But I do wonder if we shouldn't give a bit more respect to the "great man" -- we'll now call it the "great man and woman" -- theory. In fact, a lot of historians still use that tool in practice, even if they prove their professional credentials by dissing and dismissing it when the conversation turns to historiographical theory. We still have a constant flow of excellent history books and articles that focus on the role famous individuals played in shaping events.

    And we still have the nagging question: Would things have turned out the same if person X had not appeared on the scene, or had disappeared from the scene earlier? Suppose (and it's all too easy to suppose) that the South African apartheid regime had killed Nelson Mandela in 1962 rather than imprisoning him. Yes, there's a good case to be made that structural forces would still have compelled the end of apartheid, eventually. But can we be sure? And even if we are sure, how much longer might "eventually" have taken?

    Perhaps I'm somewhat more intrigued by the "great man and woman" theory than a lot of other historians because I have specialized in foreign policy, where it's somewhat easier to make the case that unique individuals have determined the outcome of events. Would the U.S. have sent huge numbers of troops into Vietnam if that bullet had missed John F. Kennedy? Would the U.S. have joined the League of Nations if Woodrow Wilson had been a paragon of health through the end of his second term?

    Recently Frank Costigliola has returned to perhaps the most famous and fateful of these "if" questions: Would the U.S. have entered into a Cold War with the Soviet Union if Franklin D. Roosevelt had managed to stay healthy enough to finish his fourth term and keep control of foreign policy through January, 1949? Costigliola offers no theoretical argument for resurrecting the old "great man" theory. But as the subtitle of his latest book, Roosevelt's Lost Alliances: How Personal Politics Helped Start the Cold War, suggests, he had the courage to bring a modernized, much more sophisticated version of that theory back into the academic arena.

    And with impressive results. He mounts a strong case that we cannot understand what happened to American (or Soviet or British) foreign policy in the 1940s without understanding the personalities of the great men who were in charge of deciding policy. He mounts an equally impressive (though sure to be debated) case that, had Roosevelt lived and been healthy enough to remain fully in charge, U.S. policy to the Soviets would have followed a rather more conciliatory course; the cold war might well have been at least much less cold if not averted altogether.  

    I'm also intrigued by the "great man and woman" theory because of my non-professional experience with local organizations of all kinds, trying to get all sorts of things done. As a product of the counter-cultural '60s, I grew up believing in communal decision-making, with a fierce determination never to let any one person become a strong leader, lest they gain too much power over the rest of us. In the circles I moved in as a young person, those were bedrock taken-for-granted principles.

    At the same time, though, most of us idolized Dr. Martin Luther King, Jr. We took it for granted that the civil rights movement never could have achieved so much without him. We eagerly followed the well-publicized doings of leaders like Angela Davis, Abbie Hoffman, Mark Rudd, and all the other media stars of the left, assuming that what they did really made a big difference.

    I don't recall that we talked or thought much about the contradiction between, on the one hand, our egalitarianism and rigid rejection of strong leaders in our own local groups, and on the other hand our own version of the "great man and woman" theory on the national scale.

    Over the decades, I've watched enough of those local groups rise and fall to come to this conclusion: One of the ingredients any group needs if it hopes to make change of any kind is strong leadership. "A few talented fanatics" is the way I often put it now -- a few people who have the time, energy, and skills to make sure everything that needs to be done gets done well. Without such talented fanatics, most groups are doomed to fade away sooner or later, and usually sooner. At least that's what my experience and observation tell me.

    We may be watching something similar happening to progressives now on a national scale. The Occupy movement made a powerful mark. It sparked a media interest in economic inequality that hardly anyone would have predicted, or perhaps thought possible, just a few years ago.

    So why did the Occupy movement fade so quickly from public view? Certainly the main reason was swift, harsh, and widely approved police repression. But Occupy was also hampered by a lack of leaders. Of course each Occupy encampment did have leaders. The sociological theories that explain why every group is bound to create some kind of leadership structure remain pretty convincing.

    Yet the Occupy leaders, as well as their followers, were guided by the same kind of egalitarian "no leaders" philosophy that I remember from the '60s. So they consciously downplayed the role and media profile of their leaders. And police repression made sure the movement never had time to generate the kind of media stars that the '60s left produced.

    Now let's play "what if". (The recent rise in "counter-factual" history is a useful reminder that old theories once discarded can in fact rise again.) What if Elizabeth Warren had become a senator, media star, and darling of the progressives before the Occupy movement arose? The two would have been a match made in media heaven.

    The results are hard to predict. But the thought of Warren, like Dr. King, being carted off to jail because she was standing up for justice -- and all seen by millions on the evening news -- is enough to make me believe that our history would somehow have become quite different.

    As it is, Warren's rise to stardom is at the center of the growing media meme about the rising influence of progressives in the Democratic party and the national political scene. Such media memes are often self-fulfilling prophecies. They are no guarantee of success, but without them a political movement is pretty well guaranteed to have little success.

    Without the media spotlight shining so brightly on Elizabeth Warren, progressive politics would still be a matter of small groups working on their specific issues, on the fringes of public awareness, with little hope for making any major change in the nation's political mood. One charismatic person has changed all that.

    And perhaps it has changed Barack Obama's political tune. Obama first began talking about inequality when Occupy was making its big, brief media splash. He stuck to the theme in the following months, though less enthusiastically, just enough to make sure he got re-elected. Now, after a bunch of distractions, he's resurrecting it as a way to try to resurrect his own political power. Would he be doing that if it weren't for Elizabeth Warren?

    It's like asking whether FDR would have turned to the left in the 1936 campaign if it hadn't been for the growing appeal of progressive populists like Huey Long. No one can say for sure. But most historians of the '30s credit those progressives with making a large impact on FDR's public stance, even if they would never admit to subscribing to any "great man and woman" theory.  

    If Obama's return to the issue of economic inequity works for him politically, and genuine policy changes occur as a result, it will give historians more grist for the mill of the "great man and woman" debate a few years from now. If Hillary Clinton takes the challenge of a Warren candidacy seriously and picks up Obama's rhetorical focus as her own, that mill may start turning fast and furious.  

    Great political changes need at least two ingredients: Great myths and lots of small, local organizations contributing to the effort. Great myths need compelling, charismatic characters. Successful small organizations need talented, dedicated leaders.

    We'll never return to the heyday of the "great man" theory. But we may find ourselves paying more attention to the history-making role of great men and women as time goes on. Then again, maybe not. I'm still not totally sure how I feel about it. I offer all these reflections merely as something worth thinking about.

    Iran a U.S. Ally? Who Would Have Thunk It?

    One great thing about watching history unfold is that it's so full of surprises.

    The United States and Iran suddenly "find themselves on the same side of a range of regional issues" in the Mideast, the New York Times reports. “'The Americans are confessing Iran stands for peace and stability in this region,' said Hamid Reza Tarraghi, a hard-line political analyst, with views close to those of Iran’s leaders." And a slim majority of Americans favor a negotiated settlement with Iran about its nuclear program.

    Who would have thunk it?

    A vocal minority of Americans still oppose any rapprochement with Teheran. And, of course, everyone in the U.S. seems to agree that, one way or another, Iran must be prevented from getting nuclear weapons -- or so the mass media tell us. The possibility of tolerating a nuclear-armed Iran scarcely ever comes up.

    So why is an Iran with a couple of nukes, or even just the capability of making them, the Prince of Darkness, while an Iran that renounces the right to make nukes might be on the way to international respectability, perhaps even as a U.S. ally?

    The argument that an Iranian bomb would start a Mideast arms race makes no sense, since Israel started that race decades ago. The argument that it would upset our Saudi and Gulf State allies makes "realist" sense, but few Americans outside the foreign policy establishment care much about those alliances. Why, then, does the premise that Iran must never get the bomb go virtually unchallenged?

    The most fundamental facts in the debate about Iran, as in any debate, are the assumptions that both sides share in common. Yet those are the facts most likely to be ignored. Since everyone takes them for granted, why bother talking about them?

    Anyone who does want to talk about them will quickly discover that the shared assumptions don't usually form any systematic philosophy or ideology. They're much more likely to be connected as parts of a taken-for-granted story. In the political arena, especially, those stories are likely to have the features that scholars of religion often associate with the term myth.

    American foreign policy debates are full of myths, all tied together in a vast, tangled web. The consensus that Iran must never have a nuke emerges from this web.

    There is the moral dualism of the Wild West yarn, with good guys pitted against irrational evildoers; American exceptionalism, the heroic tale of one virtuous nation leading the whole world toward peace and decency; old-style progressivism, an optimistic narrative of reasonable negotiated solutions to every conflict; the myth of "realism," a gritty story that says all of us are condemned to vie endlessly for power, and the guy with the biggest gun earns the right to rule the roost; and so many others, all interacting in endlessly complicated ways.

    Firm opposition to a nuclear-armed Iran is surely embedded in the "moral dualism" myth. But that raises the question of why Iran is a "bad guy" only if it has nuclear capability.

    One key to the mystery lies in the observation that both sides in any dispute usually share a common mythic narrative. Though Iranian leaders debate with each other about nuclear policy, all seem to take one story for granted: No nation can be taken seriously as a world power unless it has at least the capability to make the gun of infinite power, the nuclear one.

    U.S. policymakers have long assumed the same story. It's a myth marked "Made in 1940s America."

    In Franklin D. Roosevelt's mythic worldview, America's exceptional virtue entitled it to organize the postwar world as a progressive, harmonious "neighborhood," where a handful of big powers kept order, including the Soviet Union. But the rising "realist" myth prompted FDR to ensure U.S. preeminence by keeping the "secret" of the bomb rather than sharing it with Stalin.

    Harry Truman rebalanced the mythic scales, putting "realist" fears ahead of progressive hopes for global cooperation. The cold war narrative of national security took control, though it actually plunged us into what a recent history of America's cold war calls "the politics of insecurity," or as I call it, the myth of homeland insecurity. For the foreseeable future, that new myth told us, our nation would always be threatened.

    Stalin saw the Americans rushing to embrace the atomic bomb as the new symbol of national pride, power, and security. So he made sure he got one of his own, fast. Truman, constantly on guard against new perils, countered with the decision to build the hydrogen bomb, which cemented the myth of the nuke as the weapon of infinite power. Mythologically speaking, any country that had one could lay claim to infinite national power.  

    Since the cold war ended, American leaders have had higher hopes that FDR's progressive myth might shape policy -- as long as all the world powers joined a single "international community" under America's benevolent guidance. But suppose a nation that doesn't share this narrative gets the weapon of infinite power and thus rises to the level of a world power? How can we count on it to play by the rules of the global "neighborhood" that Roosevelt had envisioned?

    Then the "realist" myth kicks in, warning that our national security is at serious risk -- a lesson we all learned again (the myth tells us) on September 11, 2001. Hence Iran must never get the bomb.

    At least that's the story that rules American public discourse today, still shaped by the myths of the 1940s -- especially the myth of homeland insecurity, which requires someone or something to play the role of mortal threat. A nuclear-armed Iran will fill the bill just fine, at least for now.

    It's crucial to understand the role of myth in political life. It's equally crucial to see that political myths can change surprisingly quickly. Hardly anyone was talking about national security or insecurity in the America of the mid-1930s, while nearly everyone was talking about those fears in the America of the late 1940s. Similarly rapid and unexpected changes can happen again.

    Myths can change even faster if we recognize that, though they have great power over us, they are produced by human choices. It's immensely difficult to challenge such potent myths as "national security" and "the bomb as entry ticket to the 'global power' club." In principle, though, we are always free to create new myths to reshape our political life.

    We can create a myth that sees Iran as a rising power taking its rightful place in the international community. We can create a myth that sees the bomb as a mark of national fear rather than pride and strength. We can create a myth that sees nuclear capability as a barrier, rather than a key, to national security.

    If we want to, we can create all sorts of myths that become the shared premise of our national debates so fast it leaves us scratching our heads in wonder and saying, "Who would have thunk it?" Perhaps, when it comes to Iran, we're seeing something like that happening right before our eyes.

    Is Israel a "Jewish Nation "? Is the U.S. an "American Nation"? As the media spotlight shines on U.S. negotiators talking with Iranians and Syrians, the Israeli-Palestinian talks have faded into the background. They're still grinding on, slowly, with several contentious issues unresolved.

    One of those issues doesn't get as much attention as it deserves in U.S. media. Israeli Prime Minister Benjamin Netanyahu "has catapulted to the fore an issue that may be even more intractable than old ones like security and settlements," the New York Times' Jodi Rudoren recently reported: "a demand that the Palestinians recognize Israel as ... 'the nation-state of the Jewish people.'”

     The Palestinians are resisting the demand, fearing "that recognizing Israel as a Jewish state would disenfranchise its 1.6 million Arab citizens [and] undercut the right of return for millions of Palestinian refugees," Rudoren reports. Israeli leaders respond "that the refugee question can be resolved separately and that the status of Israel’s Arab minority can be protected."

The refugee question can probably be resolved separately. Roughly a decade ago the Palestinian leader Yasser Arafat suggested on the Times op-ed page that he would accept a token return of refugees and huge monetary compensation for the rest. That idea has become a standard part of the settlement outline that most observers have assumed for years.

    The money will come mainly from the United States, just as the U.S. agreed in 1978 to pay relatively huge amounts to Israel and Egypt each year as long as they keep the peace agreement they signed then. That's one reason Americans have a personal interest in the outcome of the current talks.

    As for Arab rights, Israel has been abridging them throughout its history. There's little reason to think an official recognition of Israel as "the nation-state of the Jews" would change the status of Arab Israelis in any major way.

    Americans have a personal responsibility in that regard, too. We've spent over two centuries telling the rest of the world that it must live up to our creed of "all are created equal." Yet we've spent billions of our tax dollars and much of our diplomatic capital supporting Israel's domination of the Palestinians. The least we can do now is to ease our hypocrisy by making sure that Israel does protect the rights of all its citizens, telling the Israelis, in effect, "We recognize the error of our ways. Do as we say, not as we have done."

    But the "most important" sticking point, according to Rudoren, is the Palestinians' sense that recognition of Israel as a Jewish nation-state would "require a psychological rewriting of the story they [Palestinians] hold dear about their longtime presence in the land." And, in Rudoren's telling, Israeli Jews don't try to refute this point. They agree that the crux of the issue is the political impact of a national story and its psychological ramifications.

    This is not news. While Americans generally ignore the political impact of national narratives, both Israelis and Palestinians constantly talk, hear, and read about the central role of the "competing narratives" in their political conflict. 

    As Rudoren notes, Palestinian leaders from President Mahmoud Abbas on down have long said that Israel can call itself whatever it wants, once it ends the occupation and accepts an internationally recognized border between its own land and that of a new Palestinian state.

    All countries define themselves, Hind Khoury, a former Palestinian minister and ambassador, told Rudoren. “Why doesn’t Israel call itself at the U.N. whatever they want to call it — the Jewish whatever, Maccabean, whatever they want. Then the whole world will recognize it.” But, Khoury added, “We will never recognize Israel the way they want, I mean genuinely, from our hearts. ...  Why for them to feel secure do we have to deny our most recent history?”

    "For them to feel secure" -- There's the heart of the matter, as Americans should easily understand. Israeli Jews, like white Americans, have always known that their claim to the land they call their own is dubious.

    Ever since the first Europeans arrived in what would become the United States, they have paraded an endless array of papers, all claiming to be treaties signed by native peoples ceding their lands to the conquerors. "You see, we have a right to this land," the whites proudly proclaimed. Never mind that most of the treaties were either coerced, signed by native peoples who did not understand them, or outright fraudulent. They gave at least the appearance of legal right.

    Israel has a somewhat stronger case with UN Resolution 181, passed in 1947, providing for "independent Arab and Jewish States" in Palestine. But the right of the Jews to have their own state in Palestine has still remained a matter of contention (pardon the understatement) ever since.

Why did so many white Americans find it so important to be able to wave those pieces of paper "proving" their "legal right" to the land? Why does a sizeable majority of Israeli Jews favor the demand that Palestinians acknowledge Israel as "the nation-state of the Jewish people"? Obviously, both peoples are insecure about their right to their land. If they can get the former inhabitants to relinquish their rights, it gives the appearance, at least, that the vanquished concede to the victors a moral right to the land they have taken.

    But the issue of security runs even deeper.

    Yedidia Z. Stern, a vice president of the Israel Democracy Institute, told Rudoren:  “We don’t know what it means to be a Jewish state. But does that mean we have to give it up? No way. I would leave. The reason I’m here is because this state is a Jewish state.”

    On the face of it, this sounds shockingly illogical. Why stake your life on three words whose meaning you can't define or explain -- three words whose meaning your own people have been debating for over a century?

    But the shock I got was one of recognition. So many people in the U.S. have been doing much the same thing for over three centuries: insisting that what makes us a great and exceptional people is that we are Americans, yet being unable to say exactly what it means to be "an American" and endlessly arguing about it.

    The book that has cleared up this mystery for me, more than any other, is David Campbell's Writing Security. To oversimplify a sophisticated theory, Campbell argues that, as Khoury says, every nation creates a label for itself: "the Jewish state," "the American people," whatever. But no one in the nation can ever say exactly what that term means in any clear, substantive way. Nations are far too complicated for any essentialist definition. And they're always changing, to boot.

    Yet in the modern world we are urged, perhaps in many nations almost required, to define ourselves primarily by our national identity: "I may not know what else I am, but I know for damn sure that I'm an American, and damn proud of it!"

    So we build our identities on constantly shifting sands, knowing (however unconsciously) that this means our identity might be washed away at any moment. Talk about being insecure! Israel is steeped in its myth of insecurity, as are we Americans.  

To gain at least a shred of security, we must find some answer to one of the great questions -- perhaps the greatest question -- of the modern world: How can we give our national, and thus personal, identity some firm foundation?

    That's the question Israeli Jews, like Americans, have been grappling with throughout their national history. And the Jews have come up with much the same answer that Campbell says Americans -- especially white Euro-Americans -- have always relied on: We may never be able to say what we are or what positive qualities mark us as a distinctive group. But we can certainly say what we are not: We are not "them"!

    "Them," in American history, has been a very fluid category. Native peoples, Africans, Irish, southern Europeans, Latinos, communists, terrorists, and so many others have filled that slot. In the future, no doubt there will be others.

Right now, the dominant "them" in American political life is the undocumented immigrant. Conservatives insist that the undocumented must never become citizens; they must always remain the alien other. The dominant liberal compromise is that a path to citizenship should be opened, but the border with Mexico must remain tightly sealed. Either way, the line between "us" and "them" must be strictly drawn. Those white Americans who don't see any pressing need for such a line remain a sadly small minority.

    For Israelis, the "them" slot has always been filled by the single word, Arabs. But the principle remains the same in Jewish Israel as in white America. It doesn't matter who "they" are. All that matters is that "they" are not "us." So we know we are "us" -- "one nation, indivisible" -- only because we are not "them." And that knowledge, in a perversely logical way, breeds a sense of security.

    "National security" rests squarely on a story about the difference between "our" nation, despite all its internal variegation, and "them." If we can get "them" to tell the same story -- to confirm the difference between "us" and "them" -- how much more secure we would be!

    Americans got something like that from the Congressional Hispanic Caucus when it endorsed "smart and reasonable enforcement that protects our borders ... by targeting serious criminals and real threats at our northern and southern borders" as part of its immigration reform plan.

    For many Israeli Jews, a Palestinian recognition of Israel as "the nation-state of the Jewish people" would do the trick. So, if Rudoren is right, the Israelis are blocking the path to a peace settlement that is finally -- perhaps -- in sight, primarily because they demand that the Palestinians ease Jewish Israeli insecurity.

    The question for the American people is: Will we let them do it? Will we let Israelis go on oppressing Palestinians, occupying their lands, destroying their homes and fields, jailing their people, even on occasion killing their children, simply because Israeli Jews feel insecure and insist that only their long-time enemies can, and must, take away their insecurity?

    Make no mistake: The American people hold the trump card in this situation, just as the U.S. has always held the upper hand. The Palestinians depend on U.S. money and the leash the U.S. can keep on the Israelis. The Israelis are stuck on that leash, whether they like it or not, as has been proven many times in the past.

    U.S. administrations have let Israel get away with all sorts of injustices, with only a murmur of protest heard from Washington, because those administrations feared the domestic political repercussions if they pressed Israel too hard -- repercussions not so much from Jewish-Americans as from conservatives of every stripe.

    It's hardly a coincidence that conservatives who demand beefing up the Mexican-American border also tend to support Israeli right-wingers against Obama's push for a two-state solution. Conservatism is always marked by a quest for security and clearly defined boundaries; conservatives, more than others, need a clearly defined "them." The bigger and more (supposedly) threatening the "them," the more they reassure "us" that we are "us." How convenient, then, for conservatives to lump Latinos together with Palestinians as supposed "threats" to be resisted, while embracing "our best friend" Israel as if it were part of "us."

    Despite the continuing pressure from the right, Obama and his political advisors are now estimating that they can get away with pressing Israel harder. It's no coincidence that the latest push for Israeli-Palestinian talks began right after Obama's re-election, when he no longer had to worry so much about the political fallout that might come from forcing the two opposing sides to the negotiating table.

    Whether Obama's risk pays off depends on how we, the  American people, respond. So we must decide: Will we give in to Israeli and American insecurity and let the Israelis hold up the peace process until the Palestinians recognize Israel as "the nation-state of the Jewish people"? Or will we give Obama political space to ignore the political wages of insecurity and forge a settlement without that recognition?

    To put it more bluntly: Will we let Israel go on persecuting the Palestinians every day, and keep giving Israel $3 billion+ a year, simply because Israeli Jews feel insecure? Or will we tell the Israeli Jews that they have to stop their persecution, end the occupation, agree to an independent Palestinian state, and work out their insecurities on their own?

     That depends, in part, on whether we can stop obsessing about "threats at the border" and start working out our American insecurities on our own. 

When Did "the '60s" Begin? A Cautionary Tale for Historians

When, exactly, did the era of radical ferment we remember as "the '60s" begin? Exactly one half-century ago, PBS tells us in its recent documentary titled "1964," kicking off a year when we'll celebrate the 50th anniversary of a host of memorable events:

    ·       Lyndon Johnson declared a war on poverty, pushed the Civil Rights Act through Congress, and got a blank check from Congress (the Tonkin Gulf resolution) to send troops to Vietnam.

·       The Mississippi Freedom Summer saw civil rights workers murdered and sent hundreds of white students back to their campuses radicalized that fall.

    ·       Some of those students, at Berkeley, created the Free Speech Movement.

    ·       African Americans "rioted" in Harlem.

    ·       America began to hear of Malcolm X, and Cassius Clay became Muhammad Ali.

·       After Republicans took a sharp turn to the right and saw their presidential candidate, Barry Goldwater, win only 38.5% of the vote -- buoyed by the rhetoric of political newcomer Ronald Reagan -- right-wing politicos began planning a "New Right" movement.

·       The Beatles came to America, and Motown's biggest hit was "Dancing in the Street."

    ·       TV viewers were spellbound by an immensely strong, totally independent woman on the season's biggest new hit, "Bewitched."    

    Connect the dots, the PBS show's talking head historians all say, and you'll see a year that changed America forever. "The 60s" had begun!

    There's just one problem with this story: Hardly anybody in 1964 was connecting the dots. The public generally saw these events as quite separate from each other. LBJ's support for civil rights and helping the poor were clearly connected. But hardly anyone foresaw how the Gulf of Tonkin resolution would intersect with, and ultimately destroy, his liberal domestic agenda. The Beatles sparred with Clay in a fun photo-op. But who could see any link between them and the Berkeley students taking over the university administration building?

    In fact 1964 seemed a rather calm year to most Americans compared with the two years that had preceded it, which had brought the Cuban Missile Crisis and the murder of President Kennedy. Even the change that seemed obviously greatest in 1964, the Civil Rights Act, struck most Americans outside the South as something that was happening elsewhere and wouldn't affect them directly.

    "The '60s" as a real political-cultural phenomenon was not evident to most Americans until 1967 or maybe even 1968. It's only in retrospect that so many events of 1964 seem so obviously intertwined.

    That's what historians do: look back and see things that people at the time couldn't see. It's a job well worth doing. But it's equally important that we don't confuse the early seeds of a major political, social, and cultural change with the substance of the change itself.

    If we make that mistake, we miss the most important lesson of 1964: The seeds can be all around us, yet the change itself remains unexpected, invisible, even unimaginable to most people at the time. And, as the huge leap from 1964 to 1968 teaches us, we should never forget how surprisingly fast it can happen. 

    Historians of "the '60s" often make a similar mistake when it comes to deciding when that era came to an end. They focus on the very beginning of the end, in 1968 and 1969, the very years that most Americans first began to feel engulfed by the wave of change. That wave remained strong, as far as most Americans could tell, into the first years of the '70s, though you might not know it from reading some histories of "the '60s."  

Historians face a methodological problem here. If you're going to decide that the key to understanding any historical era is to track down its roots -- as '60s scholars so often do -- where do you stop? Everything that happened in 1964 -- or any other year, for that matter -- was the fruit of things that happened earlier. It's well known by now that the roots of "the '60s" really lie in the supposedly opposite era of "the '50s."

In fact, just out of curiosity, I took a look at the year 1950, to see whether I could build a case for it as the year "the '60s" really began. It turned out to be a quick, easy job. In 1950:

·       Senator Joseph McCarthy launched his crusade against domestic communism, sounding the death knell of the Old Left and paving the way for the New Left and (arguably) for the Goldwater-Reaganite New Right.

    ·       NSC-68 became the Democratic administration's secret roadmap for the cold war.

    ·       The administration made a formal commitment to fight the communist-led independence movement in Vietnam.

    ·       Presaging the future, the same administration sent a huge military force into a land war in Asia with widespread public approval at first, though the war would eventually destroy a Democratic presidency.

    ·       Soon after the Korean war began, over a quarter of new Army enlistees were African-Americans, and for the first time U.S. fighting units were integrated;  those African-Americans would come home with a very new view of what was possible.  

    ·       The Mattachine Society, the first gay liberation organization, was founded.

·       Jack Kerouac published his first novel (The Town and the City) and told Neal Cassady about a "spontaneous prose" technique he was using to write another book, based on experiences they and other Beats like Allen Ginsberg were having "on the road."

·       Professor Longhair, often called the first rock 'n' roll musician, had his only national hit, "Bald Head."

    ·       Alan Watts left the Christian ministry to devote himself full-time to the study and practice of Eastern religions and published The Supreme Identity.  

    ·       Herbert Marcuse gave lectures that would later be published as Eros and Civilization, his radical critique of the erotic repression demanded by capitalism.

    ·       Charles Schulz began publishing "Peanuts," showing young people as the true fount of all wisdom.

    ·       Volkswagen made its first VW camper van. 

    I wouldn't seriously argue that 1950 was the beginning of "the '60s." I would seriously argue that seeds of change are being planted all around us all the time. Some grow underground, unseen, for a long, long time before they come to fruition. We shouldn't confuse the seeds with the full-flowering plant.

    Nevertheless, tracking down those seeds from eras past is a very important job, mostly because it can help us pay more attention to seeds that are growing underground right now. Of course we can't predict which seeds will connect up with which other ones to create significant change, and certainly not when or how it will happen. But history can teach us to watch more closely and optimistically for signs of change that might be coming surprisingly soon.

    Who knows whether, some day, PBS will produce a documentary called "2013." Talking head historians will tell us that 2013 was indeed the year everything changed in America in a way we hadn't seen since the '60s:

    ·       Wealth inequality became a constant topic of discussion.

    ·       Republicans who shut down the government to advance their anti-equalization agenda suffered ignominious defeat in the court of public opinion.

    ·       That defeat created a fatal schism among Republicans, dramatically weakening the once-powerful Tea Party.

    ·       The concern for inequality put Elizabeth Warren in the political spotlight, giving progressives their first media star with real influence in government.

·       Pope Francis began moving the Catholic Church in more liberal directions, especially on issues of economic justice.

    ·       Edward Snowden revealed massive spying by the NSA, sparking public outrage over government abuses in the name of national security.

    ·       A wave of protests against the Keystone XL Pipeline hit the White House and cities across the country, including some civil disobedience actions, and over 75,000 people pledged to risk arrest if the president approves the Pipeline project.

    ·       Iran and the U.S. signed a preliminary agreement to settle differences through diplomacy.

    ·       The U.S. initiated ongoing peace talks between Israeli and Palestinian leaders.

    ·       A majority of Americans for the first time approved of gay marriage.

    ·       Colorado and Washington drafted laws to govern retail pot shops.

    ·       "The Hunger Games -- Catching Fire," depicting teenagers rebelling against an oppressive government, was the year's top box-office film.

    That's just skimming the surface. No doubt everyone will have their own favorite potential roots of change that I've missed in this quick overview.

    For historians the conclusion is this: We absolutely should trace the sources of change as far back as we can. But we should also make a clear, careful distinction between when the earliest root of any change took hold and when that change became truly significant for society at large.   

    For society at large the conclusion is this: Never forget how rapidly big changes, sometimes for the better, can happen. And never forget that the sources of the next big change are already gathering all around us. 

Maybe "The '60s" Really Did Begin in 1960

Not long ago I wrote a column asking, "When did 'the '60s' begin?" It evoked a lot more comment than my columns usually get. It seems there's still a big appetite out there for recollecting the '60s. Most of the comments nominated the writer's favorite candidate for the year or the occasion when "the '60s" really began.

    One of those comments is uppermost in my mind now: an email from someone who was among the throng protesting the House Un-American Activities Committee hearing at San Francisco City Hall on May 13, 1960. That throng was literally washed out of the building by police wielding fire hoses. Many were also smashed by police billy clubs. And 68 were arrested. Nevertheless, they managed to shut down the hearings (and eventually got the charges against them thrown out of court).

    Some say it was the beginning of the end of HUAC as a meaningful political force.  HUAC never held a hearing outside of Washington, DC again.

More to the point, my correspondent and others who were there say it was the beginning of "the '60s." "It was the end of the '50s," one of them recalled on the 50th anniversary of the event. "It made possible the 1960s in all its variations," added another. PBS (whose documentary on "1964" triggered my original column on the origin of "the '60s") says that "the entire [anti-HUAC] episode lays the foundation for subsequent eruption of the Free Speech Movement at Berkeley in 1964."

    Maybe those are exaggerations with a grain of truth. But no doubt 1960 was the first time in many, many years that a crowd of (mostly) college students took to the streets in political protest -- the kind of action we now see as such an essential feature of "the '60s."

We see "the '60s" that way, though, only because of what happened in 1968 and 1969, when the crowds swelled to hundreds of thousands. It took eight or nine years for the seed planted at San Francisco City Hall to bear its full fruit.

    That incident sticks in my mind because, shortly after I got the email reminding me about it, I read about a huge crowd -- estimated at 80,000 to 100,000 -- on a "Moral March" through the streets of Raleigh, North Carolina, on February 8, 2014, demanding a wide array of progressive changes in American society. It moved me to write a piece suggesting that "this could be the start of something big."

     I didn't try to predict how long it would take until that "something big" might appear. I'm not that foolish. Historians are not supposed to play at prophecy. It's worth remembering, though, that after the 1960 anti-HUAC demo nothing quite like it happened in the Bay Area for more than four years, until the Free Speech Movement hit the Berkeley campus in the fall of 1964.

    So if nothing like the Moral March happens again for another four years, or even more, we still may look back on it some day as an important seed of a major left-wing movement that could take eight or nine years, or even more, to reach its peak.

    There are some important differences between San Francisco 1960 and Raleigh 2014 that say a lot about how American life has changed over the last 54 years. The San Francisco protesters were nearly all white. They readily acknowledged that they were inspired by the black civil rights movement in the South. But they did not directly connect their issue -- the rights of white people to express progressive political ideas -- with the issue of African-Americans' legal rights.

    It would take eight or nine years before white students recognized that all political issues -- including the question of race relations -- were connected. By that time they would have welcomed people of color marching with them. But most people of color were quite happy to create political action on their own, thank you. And by the mid-'70s most of the white protesters were fragmenting again into single-issue campaigns that often resisted making connections with others.

In Raleigh, on the other hand, the crowd was multi-racial. And the march promoted a wide range of issues: economic justice; a living wage for every worker; support for organized labor; well-funded, diverse public schools; affordable health care and health insurance for all, especially women; environmental justice and green jobs; affordable housing for every person; abolishing the death penalty and mandatory sentencing; expanded services for released prisoners; comprehensive immigration reform to provide immigrants with health care, education, and workers' rights; ensuring everyone the right to vote; enhancing LGBT rights; keeping America's young men and women out of wars on foreign soil; and more.

    The long list of March sponsors reflects its comprehensive, multi-faceted political aspirations.

    They came together in one coalition because they all saw their own issues as connected pieces in one huge puzzle: How to make moral concern the guiding light of every public policy in America?   

    That moral focus reflects the leaders who inspired the Raleigh march. Most of them were clergy or faith leaders. In that long list of sponsoring groups, the biggest chunk was the ecumenical collection of religious and faith groups. And the March leader, the man who inspired it, was an evangelical preacher, Rev. William Barber.

    The anti-HUAC activists of 1960 surely included religious people. But as a group they did not give faith any credit for inspiring them to their actions, as so many of the Moral Marchers did.

    In fact throughout the '60s white students never managed to recognize how their political power could be enhanced (probably immensely) by allying with the power of organized religion. Many of them who have remained active as progressives, and many of the younger progressives who join them, still don't get that key point about successful political strategy in America.

But the Moral Marchers did get it. They connected not only all the issues and all the races and ethnicities, but also the secular with the religious. March organizers proclaimed their event "open to all, whatever their beliefs or lack thereof when it comes to religion." And that's what they got. "The march brought together a diverse group from Baptists to Muslims and gay marriage supporters," as USA Today reported.

      Perhaps, then, just as the 1960 anti-HUAC protest foreshadowed big things to come years later, so does the Moral March in Raleigh. Perhaps it augurs a progressive movement that grows because it has learned important lessons since the '60s and '70s, a movement ready to be more diverse in its makeup -- even welcoming religious and secular people marching side by side -- but more unified because it sees all political issues tied together by the same thread of moral concern.

    Uh-oh, that's coming perilously close to prophecy, isn't it? Well, at least a historian is entitled to say that the protest of 1960 is a useful reminder of how long it can take for political seeds to come to fruition.

And the American historical record also seems to bear out another conclusion: If you are going to go into the streets crying "Power to the people!" and want to have real political effect, you are better off adding "The Bible says!" rather than "Tear down the walls!"

A Funny Thing Happened on the Way to the Apocalypse

Everything has a history, and a pre-history. Even the end of history. Tracing "the end" back to its beginnings, and through some of the surprising twists and turns it has taken in U.S. history, can help us think more creatively about how to avoid "the end" and move toward a more hopeful future.

This post is a companion to "Apocalypses Everywhere," a piece I've just posted on another site, which I hope you'll take a look at. There I wrote:

    Apocalyptic stories have been around at least since biblical times, if not earlier. They show up in many religions, always with the same basic plot: the end is at hand; the cosmic struggle between good and evil (or God and the Devil, as the New Testament has it) is about to culminate in catastrophic chaos, mass extermination, and the end of the world as we know it. 

    That, however, is only Act I, wherein we wipe out the past and leave a blank cosmic slate in preparation for Act II: a new, infinitely better, perhaps even perfect world that will arise from the ashes of our present one.

    The Jewish writers who invented this myth didn't make it up from scratch. They were inspired by earlier myths, songs, and poems -- especially biblical prophecies that warned of the coming destruction, not of the whole world, but only of the Israelite or Judean kingdoms. The earliest of those prophecies probably had no happy endings. Destruction was coming, they proclaimed: "The eyes of the Lord God are upon the sinful kingdom, and I will destroy it from the face of the earth" (Amos 9:8). That was the whole story.

But later editors couldn't abide such utter hopelessness. So they added the happy ending: "I will restore the fortunes of my people ... They shall rebuild the ruined cities and inhabit them ... And they shall never again be plucked up" (Amos 9:14-15).

Ancient Jewish apocalypses projected this story of national death and rebirth onto a global or cosmic scale. So did the last book of the New Testament, Revelation, the prototype of all Christian apocalypses. In those stories the Christians, the New Israel, must descend into universal chaos so they can emerge from it to live forever in the perfection of the New Jerusalem.

    The earliest British colonists in North America knew these biblical stories of endings, both cataclysmic and blissful, very well -- especially the Puritans, whose theology has had such a disproportionate influence on American political mythology.

    They also knew a third mythic theme: Repentance could avoid the prophesied destruction of the nation or community. That's the essential message of all those Puritan sermons, and their later imitators, which we know as jeremiads -- though the theme of "repent and ye shall be saved" actually doesn't show up very much in Jeremiah, nor in the other biblical prophets. It's rather the hallmark of Deuteronomy and the biblical histories written under the influence of that book.   

These three kinds of narratives run like interwoven threads through the history of American political mythology.

    Even Thomas Paine, reviled in his time as an enemy of religion, urged the colonists to revolution in overtly apocalyptic language: "The birthday of a new world is at hand. ... We have it in our power to begin the world over again," even if we must endure the chaos of war to bring that new world to birth.

    The same tones can be heard in the preaching of some nineteenth-century abolitionists. Abolitionism ultimately led to a Civil War that was widely interpreted by both sides as an apocalypse. (Ernest Tuveson's Redeemer Nation offers a copious collection of apocalyptic texts stretching from early colonial times to the Civil War, showing the continuities that persisted through widely different historical contexts.)

    Yet the dominant theological message of the antebellum abolitionists was not apocalyptic. Its language, shaped mainly in Puritan-influenced New England, drew most from the jeremiad tradition: The nation could avoid disaster by mending its ways, first and foremost by abolishing slavery. We call it "reform," but pious antebellum Protestants were more likely to call it repentance. 

    Several decades after the Civil War a new outburst of calls for repentance swept the land in the form of Progressivism, Bryanism, and the Social Gospel. That movement seemed to be felled by U.S. entry into World War I, which Woodrow Wilson justified in stark apocalyptic language: Defeat the forces of evil and we'll live in a world eternally safe for democracy. But Progressivism returned in the New Deal, though its religious underpinnings were now rather invisible.

    In this interplay between apocalypse and jeremiad, what happened to the original biblical source of the tradition, the prophetic theme of inescapable doom coming upon the nation? It was pretty much driven underground until it emerged, quite suddenly and unexpectedly, at the end of World War II, when the news broke of the atomic bombing of Hiroshima and Nagasaki.

    As Paul Boyer has shown in By the Bomb's Early Light, victory in World War II was not widely celebrated as an apocalyptic triumph ushering in a far better world. What the war's nuclear end ushered in was, instead, a nuclear fear that radically changed the very meaning of the word apocalypse.

    Type "define apocalypse" into Google's search engine and you'll first get the meaning that the Bomb made dominant in American political discourse: "the complete final destruction of the world." Utter annihilation, with nothing better -- indeed nothing at all -- to follow. THE END.

    Then you'll get the most fashionable current meaning of the word: "Any event involving destruction on an awesome scale; [for example] 'a stock market apocalypse.’" So apocalypse is no longer just "the end of everything" but, by extension, "the end of anything": mounting federal debt, the government's plan to take away our guns,  the Comcast-Time Warner mergerocalypse, Beijing's pollution airpocalypse, the American snowpocalypse, and the list goes on and on.

    That's why I call this the age of "apocalypses everywhere" -- an age whose dominant mood is defined by the paralyzing message first delivered by The Bomb: "We're doomed to annihilation."

    The old apocalyptic promise of a new heaven and new earth has largely disappeared in public discourse -- except in some (sizeable) evangelical Christian circles, where it offers hope, but only to the saved. Unfortunately, their “left behind” culture has produced among many a readiness, sometimes even eagerness, to fight both the final (perhaps nuclear) war with evildoers abroad and the ultimate culture war against sinners at home.

    In evangelical circles one can also hear jeremiads often enough, warning that America is doomed unless it repents of its evil (read: liberal) ways.

    Where does that leave the progressive left?

    A truly apocalyptic vision was once alive on the left, too -- most recently in the late 1960s. When the Jefferson Airplane chanted "Tear down the walls," millions of (mostly young) Americans who lustily sang along felt like they were enlisting in a revolution, offered themselves as "volunteers of America." They assumed that the tearing down was, simultaneously, a process of building, from the ground up, a brand new America, and indeed a brand new world, where "all you need is love."

    If it sounds a bit foolish in retrospect, it was no less -- and perhaps rather more --  realistic than all the apocalyptic visions that came before it.  

    A combination of violent repression in the streets and popular rejection in the voting booth put an end to the overt apocalypticism of those "volunteers of America."  Certainly many held on to some kind of revolutionary vision in some deeper recesses of their minds.

Yet in the decades since, while "the left" was renaming itself "progressives," it came increasingly under the spell of our growing cultural tendency to see apocalypses everywhere. By now the progressive message, regardless of the issue, is typically: "Stop this catastrophe now or we're doomed!" (e.g., "Stop the Keystone XL pipeline or it's 'Game Over'!"). Period. A better future may be implied between the lines, but it doesn't get much attention. So the note of hope that was the hallmark of religious apocalypse easily gets lost.

    The goal, of course, is to create a modern kind of jeremiad -- to alarm people and move them to change their ways. But in an age of apocalypses everywhere, all leading to nothingness, the summons to repentance loses its psychological impact. Instead, all this doomsaying from the left is most likely to reinforce the hopeless sense of impending annihilation that pervades American society.

One sector of the left has largely escaped the siren call of doom. Logically enough, it's the religious left, the sector that inherited the antebellum Christian reformers' tradition of jeremiad. Today's religious left, like its ancestor, typically avoids an apocalyptic narrative; it does not foresee destructive chaos as the only route to the better world it envisions.

    Rather, it returns to its roots in Deuteronomy, where the message is simple and stark: "I have set before you life and death, blessings and curses. Choose life" (Dt. 30:19). It's never too late to turn around, do the right thing, and avoid the catastrophic consequences of our past and present mistakes.

    Our past and present mistakes are most obviously catastrophic in our natural environment, where the end of the world as we know it is creeping up on us at every moment. It's what Todd Gitlin has called a "slow-motion apocalypse," using the word in its nuclear age meaning of total extinction.

    Amidst the gloom of global climate change, though, we may see a ray of the light of a new mythology to address the environmental crisis. Suppose we let the secular left define the problem in its familiar apocalyptic language -- which in this case is all too realistic -- but use the religious left's jeremiadic language of reform to spell out the solution.

    What we end up with is a "slow-motion revolution": every day, more and more people recognizing the danger and choosing life; that is, shifting from extinction-breeding fossil fuels to alternative energy sources, thus gradually transforming the impending doom to a realistic hope for a far better world. 

    Indeed, as I point out in "Apocalypses Everywhere," it's already happening. Scientists have shown that renewable sources like sun and wind could provide all the energy humanity needs. Alternative technologies are putting those theories into practice around the globe, just not (yet) on the scale needed to transform all human life.

    By combining the biblically-based apocalyptic and jeremiad traditions, we can make our words and thoughts reflect, not just our fears, but the promise of this revolution that is beginning all around us. In an age in which gloom, doom, and annihilation are everywhere, it's vital to bring genuine hope -- the reality, not just the word -- back into political life. 

The Washington Post's Editorial Fantasies Put Ukrainian Lives at Risk

"President Obama's foreign policy is based on fantasy," an indignant Washington Post editorial headline exclaims -- as if there were any other possibility. Of course Obama's foreign policy is based on fantasy. So is every president's, just as every criticism of a president's foreign policy is based on fantasy.

    When we're talking about foreign affairs no one escapes the grip of fantasy, though I prefer to call it myth, since our fantasies always turn out to be based (consciously or unconsciously) on some story we embrace about how the world works. We all interpret the facts through our own mythic lens.

    Since every fantasy or myth is a self-fulfilling prophecy, the important question is  which one predicts, and thus points us toward, a more peaceful, humane world. If we really care about the Ukrainian people, we have to confront this question both urgently and carefully.

    The editors of the Washington Post don't have a very satisfying answer. They are deeply immersed in a dangerous fantasy of their own, one that's worth taking a close look at because it sums up a view that echoes so widely throughout the U.S. mass media -- a view that puts Ukrainian lives needlessly at risk.

    In this fantasy the world is divided into two kinds of nations. Some (like the U.S. and its allies) are reasonable; they want a peaceful world where everyone cooperates to promote economic growth.

    Others are under the thumb of leaders who just will not "behave rationally and in the interest of their people and the world." Instead they are "concerned primarily with maintaining their holds on power." So they ignore "the weight of world opinion," measure their country's standing "in throw-weight or battalions," and constantly engage in "misbehavior" that dams up the "tide of democracy in the world."

    (Russia has just joined China and Syria on the Washington Post's list of misbehavers; Iran has been dropped, for the time being at least).  

    In such a bifurcated fantasy world, the Post editors see only two choices for the United States:

    1) make the misbehavers "pay a price" by exerting U.S. "leadership" through shows of "military strength, trustworthiness as an ally, staying power in difficult corners of the world such as Afghanistan"; or else  

    2) let the world become "a more dangerous place," bereft of rational leadership and therefore prey to "disorder."

    The obvious choice, say the Post editors and so many other political and media leaders, is to "lead" and make sure those Russians -- especially their president, Vladimir Putin -- "pay a price" for their "misbehavior."

    I don't know if the Post editors would be more offended or dumbfounded to have this view labeled a fantasy. No doubt they see themselves as "realists."

    That was the label adopted by a new wave of foreign policy thinkers who emerged in the years just before World War II. They assumed that nations must always be competing because all are constantly seeking more power, using every means they can muster, including battalions and throw-weights.

    Their fantasy was bolstered and christened by the very influential theologian Reinhold Niebuhr. He argued that nations, even more than individuals, are driven by an innate egotism and lust to aggrandize themselves. It's what Christians have traditionally called original sin.

    The "realists" of the late 1930s achieved a brilliant coup by stealing the word "real" and pinning it on their particular fantasy. So they could claim that their view alone reflected "reality" and dismiss all the others as misguided, illusory fantasy -- which is just what the Washington Post has done, along with all the other voices loudly demanding that Obama "get tough." Toughness is the only thing our foes understand, "realists" insist.

    Since every fantasy is a self-fulfilling prophecy, it's easy to see where this one is likely to lead: a world always at war, as each nation shows its own strength to make others "pay a price" for showing theirs, while victims caught in between suffer the consequences.  

    But it gets worse if you claim to be an old-fashioned "realist" while you are really something quite different. That's the game the Post editors are playing, along with the whole chorus of American "get tough" voices they represent.

    A thoroughgoing "realist" might understand and accept Russia's move into the Crimea as inevitable. No nation is going to leave its only warm-water naval port in the hands of a new, distinctly unfriendly government. That would, indeed, be irrational. And real "realists" assume that every government is rationally maximizing its power.

    The dominant view in elite circles, the one that drives opinion in Washington Post editorials and a tsunami of other American mass media words, is actually a kind of radically modified "realism." It cannot accept any Russian response other than kowtowing to U.S. demands.

This fantasy says, in effect: "Realism" tells us about our enemies' goals; they're driven by a mad lust for power. We have to stop them because we are pursuing a higher goal -- a peaceful, democratic world order. To stop them, though, we must imitate them and use any means necessary. So we fight as if we were "realists" in order to eventually put an end to a world of "realism."

    Here's a true anecdote that shows how such a distorted "realism" found its way into the White House:

    In 1944 Franklin Roosevelt met a theologian who started telling him about the Danish Christian philosopher Soren Kierkegaard, a forerunner of Niebuhr in resurrecting the doctrine of original sin in modern garb. FDR got so interested that he asked the theologian back to the White House several times.

    If Roosevelt had really understood what he was hearing, he would have said, "Oh now I see why we are all in this war, all fighting for a bigger share of the global pie."

    In fact, he told his Secretary of Labor Frances Perkins (as she recalled it in her memoir): "Kierkegaard explained the Nazis to me as nothing else ever has. ... They are human, yet they behave like demons. Kierkegaard gives you an understanding of what it is in man that makes it possible for these Germans to be so evil.” Apparently it never occurred to Roosevelt that original sin could possibly have anything to do with America.

    A few years later FDR was gone and his successor, Harry Truman, was applying the same chauvinistic combination of "realism" and American exceptionalism to our former ally, Russia: They are demons, driven by the irrational force of original sin. We, exempt from the taint of sin, have no choice but to stop them before they plunge the world into the chaos that the Devil himself has always been aiming at.

    Since Truman's time, U.S. public discourse has applied the same template to any number of foreign foes. Now we have come full circle and Russia is again Demon Number One. But the moral of the story is always the same: The only way to stop the demons and save the world is for us to take the lead and make them "pay a price," by any means necessary. We must use "realist" means for our purely ethical ends.

    It's easy to see why such a satisfying fantasy took powerful hold of American public discourse. And having taken hold for so long, it's easy to see why it feels reassuring to have it repeated over and over, as each new -- or, in this case, old -- enemy arises. It's so familiar, so easy to take as obvious truth, as if it were merely a photograph of an objective reality.

    But it's not genuine "realism." Indeed such a distorted, American exceptionalist use of "realism" is more dangerous than the original version for at least two reasons.

    First, it makes the risk of war even greater. When other nations see us using force while we forbid it to them, it looks like an obvious double standard. What we call "leading" they see, understandably enough, as "dominating." Naturally the unfairness of it makes them angry and, as the original "realists" predicted, they fight back. What looks to them like hypocrisy makes them that much less likely even to consider any compromise with us.

    Which leads to the second danger of America's popular distortion of "realism." While genuine "realism" is most likely to be a self-fulfilling prophecy of a world at endless war, it can occasionally have the opposite effect, at least in the short run. A "realist" would recognize why, for example, Russia would seize Crimea and perhaps accept it as an understandable fait accompli.

    Though the editors of the Washington Post are too moralistic for such compromise, they do offer a place on their op-ed page to two columnists who advise it. Here is Eugene Robinson's fantasy:

    "Russia gets sole or joint possession of Crimea. Ukraine and the other former Soviet republics remember that Moscow is watching, and we all settle down. ...  Realistically, that may be a deal the world decides to accept" -- without, apparently, forcing Russia to "pay a price."  

Katrina vanden Heuvel agrees that "we desperately need a strong dose of realism" and explains how both the deposed Ukraine leaders and Mr. Putin have acted rationally to maximize their own benefit. She also notes that it's realistic for Russians to worry about how they'll be treated in the new Ukraine: "The new leaders in Kiev include ultra-nationalists who, in one of their first acts, voted to repeal the 2012 law allowing Russian and other minority languages to be used locally."

Vanden Heuvel's fantasy is that "the E.U., Russia and the United States can join together to preserve Ukraine’s territorial unity; to support new and free elections; and to agree to allow Ukraine to be part of both the E.U. and Russian customs union, while reaffirming the pledge that NATO will not extend itself into Ukraine."

    "It is time to reduce tensions and create possibility, not flex rhetorical muscles and fan the flames of folly," she concludes.

Obama and his inner circle may wish they could follow her advice. They may already be working back channels toward the kind of end game Robinson and vanden Heuvel suggest.

    But in an election year that's already looking gloomy for the Democrats, it's too politically dangerous for the administration to consider advancing such a compromise publicly. They have to at least pose as "getting tough" against Russia or else open themselves to another dangerous line of attack from the Republicans. So the administration has to keep Ukrainians closer to the brink of war than a thoroughgoing "realist" fantasy would find necessary.

    Needless to say, no one gets to be president of the United States who would consider, even for a moment, a fantasy to the left of "realism" -- one that would reject arms and power struggles, dissolve NATO, and see no need for Ukraine or any other nation to be dominated by economic systems run by the great powers.

    If the question is how to move toward a more peaceful, humane world, that fantasy also deserves a full and fair hearing. But it remains far out on the left fringe, its voice barely audible to most Americans.

    That's the sad situation all these decades of a radically modified, American exceptionalist "realist" fantasy have led us to. Not only the Ukrainians, but people in areas of tension around the world, are the ones who must actually pay the price. 

Ukraine + Flight 370 = Bad News for Neocons

In America the news is big business. That's not news. Everyone realizes that the corporate mass media make their money by delivering readers, viewers, and listeners to advertisers. The bigger the audience delivered, the bigger the profit. So corporate news editors have to know what good entertainers know: what the audience wants and how to give it to them.

    In late winter, 2014, it seemed that American news audiences wanted one thing above all else: a U.S. - Russia showdown over Ukraine. Why? Plenty of theories might be offered.

    But reading the headlines themselves, one explanation seemed most obvious: Americans understood that their nation's global prestige was on the line. Russian president Vladimir Putin was using Ukraine to test the will and resolve of the Obama administration. So Americans turned to the news each day to see whether their government would demonstrate enough strength to go on leading the international community.

    At least that was the story.

    Then came an unexpected turn of events calling that story deeply into question. On March 7 Americans began to drown in a deluge of headlines pointing them thousands of miles from Ukraine, to Malaysia, where Flight 370 had inexplicably vanished.

    Ever since, the mystery of 370 has at least rivaled, and more often eclipsed, Ukraine in U.S. news headlines -- even in our most respected elite news sources. Ten days after it disappeared, Flight 370 still held five of the top six spots on the New York Times website's "most viewed" list, while Ukraine limped in at numbers 8 and 9. Over at the Washington Post site, the missing flight took two of the top four spots on "Post Most" (and an impending snowstorm held the other two). No sign of Ukraine at all.

    Why such obsessive fascination with one missing plane on the other side of the world? Americans do not typically show deep concern about the fate of a handful of Asians (to put it politely). There were apparently three Americans on board, but they were not the main focus of the U.S. headlines.

    Nor can the possibility of terrorism explain it; that didn't become a central focus of the investigation until days after the plane disappeared. Yet the deluge of headlines began as soon as news of the disappearance broke. Even after Malaysian officials started focusing on foul play, only one of those NYT "most viewed" stories dealt with that issue. 

    The most obvious explanation for our fascination with the mystery of Flight 370 is simply that it's a great mystery. Our 24/7 news cycle lets us ride along, as it were, moment by moment with the detectives trying to solve it.

    From The Maltese Falcon to NCIS, Americans have loved a good detective story. And the likelihood of mass death never hurt any story's ratings. Make it a Hitchcockian murder mystery -- one that starts out in a setting so normal you could easily imagine yourself there (like a routine air flight) -- and you're headed for the top of the charts, or, in this case, the headlines. That's entertainment!

    What does the obsession with Flight 370 tell us about Americans' concern for their nation's strength and resolve as world leader? At the very least, it says, that concern is weak enough to be quickly diverted by an entertaining -- or, more precisely, infotaining -- mystery.

    Another possibility is equally plausible. Perhaps the corporate news media gave us all those headlines about Ukraine, knowing they would bring in big audiences, because the U.S. - Russia showdown itself was great entertainment. It, too, was a story involving great risk of life, whose outcome was unknown -- another mystery we could follow in real time, 24/7.

    For whatever reason, Ukraine and Flight 370 have held roughly equal appeal in the American news appetite, with 370 often having the edge. So the deep geopolitical dimensions of the Ukraine story obviously don't matter a whole lot to the news-consuming public. The people want to be infotained.

    That's very bad news for the neoconservatives who have worked so hard, and are still working, to make the U.S. - Russia showdown over Ukraine a matter of incomparable import and urgency.

    Not that they care so deeply about the Ukrainian people. For neocons, Ukraine is just the latest center stage for a drama that is always unfolding (more or less) everywhere, a drama pitting strong U.S. leadership against a global collapse into chaos and anarchy. Those are the only two alternatives neocons can see. And to them it looks like a matter of life or death.

    Apparently the rest of America no longer sees it that way. That's the bad news for the neoconservatives.

    To understand what’s at stake here for the neocons and for the rest of us, let's look briefly at the history of their movement.

    Neoconservatism crystallized in the late 1960s, when it had little concern about foreign affairs at all. As its intellectual godfather Irving Kristol wrote: “If there is any one thing that neoconservatives are unanimous about, it is their dislike of the [American]  counterculture.”

    The counterculture at home had unleashed a dangerous wave of selfish indulgence in private pleasures, Kristol complained: “Everything is now permitted. ... This is a prescription for moral anarchy. …The idea of ordered liberty could collapse,” leaving only “freedom, confusion, and disorientation."

    The other great exponent of neoconservatism, Norman Podhoretz, called the "epidemic" of  '60s radicalism "a vulgar plot to undermine Western civilization itself.” The root of the problem, in his view, was that “nobody was in charge” of the world any more.

    Neocons insisted that America could be saved only by restoring the rule of traditional authorities -- "organized religion, traditional moral values, and the family," as Kristol put it. Somebody had to be in charge.

    The neocons began to focus on foreign affairs only in the mid-1970s, “after the New Left and the ‘counterculture’ had dissolved as convincing foils for neoconservatism,” as historian Peter Steinfels pointed out.

    Neocons now worried that, after the '60s and the Vietnam debacle, Americans had lost the moral fiber that comes (they claimed) only from self-discipline. Political scientist Robert Tucker complained that the United States was afraid to make the “effort and sacrifice required to sustain the exercise of power.” So it might “no longer be the principal creator and guarantor of order.” The result, he warned, would be a “drift and uncertainty” in policy that might “lead to chaos.”

    Neoconservatives championed renewed cold war and a huge nuclear buildup in the '70s as symbols of "spiritual discipline," historian Edward Linenthal explained, "an inner transformation, a revival of the will to sacrifice." Such a return to traditional values would reject the "hedonism" of the '60s and restore order, both at home and abroad. As Podhoretz's wife, Midge Decter, said, for neocons “domestic policy was foreign policy, and vice versa.”

    When the cold war ended, most neocons turned back to their original battle against domestic moral anarchy. But a few kept the focus on global affairs, led by Charles Krauthammer, who preached: “If America wants stability it will have to create it. The alternative…is chaos.”

    Two new neocon lights, Irving Kristol's son William and Robert Kagan, agreed. In the '90s they praised "conservatives' war against a relativistic multiculturalism ... reversing the widespread collapse of morals and standards in American society." But, they warned, "the remoralization of America at home ultimately requires the remoralization of American foreign policy.” So the U.S. should impose a “benevolent global hegemony,” demonstrating “that it is futile to compete with American power.”

    This was the worldview that George W. Bush brought into the White House. After the neocons had launched their wars in Afghanistan and Iraq, two scholars of the movement, Stefan Halper and Jonathan Clarke, observed: “Even today they look with horror at American society, which, in their view, has never recovered from the assault of Woodstock.”

    Bush's neocons projected their fear of America's moral decay onto a global stage. They relied on a "tough" foreign policy, with endless shows of American "will and resolve," to fight against the "chaos and anarchy" that had first provoked them into action in the 1960s.

    They are still waging the same war, driven by the same fear. Listen to three of their most respected voices, clamoring for Obama and his administration to "get tough" with the Russians:

    Elliott Abrams: "Before Obama, there was a sense of world order that relied in large part on America."

    Charles Krauthammer: "What Obama doesn’t seem to understand is that American inaction creates a vacuum."

    Reuel Marc Gerecht: "If Washington retreats, only the void follows. Things are likely to get very, very nasty and brutish and short."

    For neocons to see the nation ignoring their warnings and indulging in the pure, self-centered pleasure of news as mere infotainment must be agonizing.

    That's how it looks from inside the neocons' mythic worldview. Nothing has changed since they first switched their focus from domestic to foreign fears in the 1970s -- except that most Americans no longer buy the neocon warnings as a genuine cause for anxiety, nor as a foundation for foreign policy.

    Perhaps most would agree with our last ambassador to the Soviet Union, Jack F. Matlock, Jr., that Putin is reacting understandably to a long "cycle of dismissive actions by the United States ... the diplomatic equivalent of swift kicks to the groin," most of them administered by Bush and his neocons. Administering more such kicks and thereby "encouraging a more obstructive Russia is not in anyone’s interest."

    The public buys the neocon view, apparently, only as an entertaining story. When a more exciting story comes along, like Flight 370, the U.S.-Russian showdown simply can't control the headlines any more.

    Inside my mythic worldview we call that a step in the right (well, actually, left) direction. But it's only a step. The next big step is to make the quest for peace, nonviolence, and justice just as exciting and entertaining as the push toward war.

    The great Abolitionist William Lloyd Garrison knew how to do that. So did Gandhi and Dr. King. We always need to be re-learning the lessons they taught.

    However, I'd still keep an eye on the neoconservatives. They've suffered decline before. Yet they keep on coming back, the same old wolves, just wearing slightly altered clothing.

    They speak for one permanent strain of American insecurity -- a fear of disorder and confusion, disguised as a fear of foreign enemies. It lies buried beneath the surface of our political culture now, but not too deeply. It could be unearthed all too easily, as suddenly as an airplane can vanish.

    There would be nothing entertaining about the result, though, as the lingering effects of the wars of George W. Bush remind us. So let us enjoy this interlude when infotainment reigns and use it to build a peace movement strong enough to resist the next onslaught of the neocons. 

    Americans Have Stopped Looking For Another Hitler

    U.S. foreign policy hawks once had an unbeatable trump card: the Hitler analogy. Just convince most Americans to see the leader of any nation as another Hitler, and it was easy to get public support to "get tough," show "will and resolve," and even go to war. Serbia's Milosevic, Iraq's Saddam Hussein, and Al Qaeda's bin Laden were all labeled as new Hitlers. All fell victim to America's firepower.

    In recent months the hawks thought they had a rich crop of new Hitlers: Iran's Ali Khamenei, Syria's Assad, Russia's Putin. Each one seemed ripe for the Hitler analogy. 

    According to widely circulated press reports, Ali Khamenei sounded as anti-Semitic as any Nazi when he declared it "acceptable to kill all Jews" and called Israel's leaders "animals." Assad, like Hitler, purportedly killed citizens of his own nation in massive numbers. Putin was perhaps the most Hitler-like. He annexed foreign territory with the claim that it really belonged to his nation because so many of his own nationals lived there.

    If the Hitler analogy still held its once-invincible sway over American public opinion, the hawks should be riding high. The U.S. should be preparing to fight one, two, or perhaps even all three of these foreign leaders.

    Obviously it's not working out that way.

    The Russian annexation of Crimea has disappeared from American headlines as quickly as it arose. Photos of the U.S. secretary of state shaking hands with Russia's foreign minister, as they discussed plans to ease the Ukraine crisis, caused hardly a ripple in the mass media.

    Renewed negotiations between Iran and the U.S. and its allies got even less notice. It's now apparently taken for granted that it's OK to cut a deal with the Iranians.

    The most glaring failure of the Hitler analogy came last September, when the president of the United States tried to rally public support for an attack on Assad's Syria. Not only hawks but moderates and even some doves in the foreign policy elite supported Obama. But their immense efforts failed. The public simply wasn't interested in "getting tough" against another Hitler.

    Part of the reason is that the current crop of new Hitlers just isn't acting like the original Hitler.

    Putin called Barack Obama "to discuss ideas about how to peacefully resolve the international standoff over Ukraine," as the New York Times reported -- hardly a Hitler-like move. An NBC news crew, after traveling 1,200 miles along the Russia-Ukraine border, found "no signs" of the widely but erroneously reported Russian military buildup.

    When the UN passed a resolution calling on Syria to get rid of its chemical weapons, Assad simply said, "Of course we have to comply" -- hardly a Hitler-like response. And the process of removing those chemical weapons is moving toward completion.

    Ali Khamenei has recently tempered his words about Jews and Israel. Of the Holocaust, he now says, "if it happened, it's uncertain how it happened." His foreign minister, Javad Zarif, claims that Iran has never denied the Holocaust.

    Ali Khamenei's earlier, seemingly anti-Semitic, remarks came in a legal brief arguing that Iran "would be justified in launching a pre-emptive strike against Israel because of the threat the Jewish state's leaders are posing against its own nuclear facilities" -- the very same kind of argument Israeli leaders have used to justify a potential strike against Iran. The whole record of Ali Khamenei's rhetoric tends to support Zarif's recent words: "We never were against Jews. We oppose Zionists."

    However, all these softening moves by the purported new Hitlers cannot, by themselves, explain the waning power of the Hitler analogy. It would be easy enough for the American public to ignore them or to frame them in ways that bolster the Hitler analogy.

    The bitter truth for U.S. hawks is that the public no longer seems eager to see another Hitler on the horizon.

    That poses a huge problem for the hawks. It's not merely that they have less leverage over public opinion. Even worse for them, they now have to deal with the question of foreign leaders' motives.

    As long as a foreign leader could be portrayed as another Hitler, the issue of motives never came up.

    In American mythology Hitler is the quintessential devil figure, the man who did evil purely for the sake of doing evil. The motives that drive most leaders' policies -- national security, power, wealth, pride, etc. -- are dismissed as irrelevant for understanding Hitler.

    As usual in political mythologies, there is no doubt some degree of truth in this view. How much truth? Historians will go on debating that question forever.

    But in American public memory the case is closed: Hitler's only motive was sheer evil for its own sake. So there was no way to placate him. No negotiations, compromises, or changes in U.S. policy could have affected the Nazi leader's actions one whit. International relations became a simple battle of America versus the devil. The only sensible option in fighting the devil's irrational, implacable evil was brute force.

    Thus, if any leader could successfully be portrayed as another Hitler, there was no need to ask about that leader's motives. The question would be not merely foolish but dangerous. It would lead us down the primrose path of negotiation and compromise. We would appease the devil, let down our guard, and inevitably fall prey to his next evil move. 

    The only way to deal with another Hitler, it's assumed, is the way we dealt with the first one: violence and more violence, until we compel the enemy's unconditional surrender.

    But if the Hitler analogy no longer sways public opinion, we are less likely to reach reflexively for our guns. So we have psychological space to think about the motives of leaders like Putin, Assad, and Ali Khamenei.

    We can ask how they see us, whether they might be responding to the policies and actions of other nations -- including our own -- and whether they might have understandable reasons for the choices they make.

    Once these questions are raised, the differences between today's leaders and the mythic Hitler quickly become apparent.

    Putin is understandably afraid that Ukraine might join NATO. Imagine an American president's response if Mexico considered joining a Russian-led military alliance. And Putin's afraid that an unfriendly Ukraine would deprive the Russian navy of its only warm-water port, Sevastopol in the Crimea.

    Ali Khamenei is waging a struggle for regional power with two much stronger nations, Saudi Arabia and Israel, both massively armed by huge U.S. aid grants. And he has faced threats of attack, for years, from two nuclear-armed powers: Israel and the U.S.

    In both cases, the leaders' policies are perfectly rational, judged by the rules of the international power politics game.

    Assad's case is different. He faces a powerful internal rebellion that might well oust him and his regime. Though it's easy to sympathize with the rebels' cause, it's equally easy to understand that any leader threatened with rebellion would resist. That's something Hitler never had to deal with.

    Indeed each of the three contemporary situations is different from Hitler's case. That's inevitable, because every historical situation is unique; there never really was another Hitler and there never will be another Hitler.

    One of the most potent roles of myth in political life is to deny that uniqueness, to create a frame that depicts new situations as exact replicas of old ones. Myth thrives, in part, because it offers a reassuring sense of familiarity. "Oh, I know what this situation is all about," we say, "because it's exactly like one I've been through before." It's never quite true. But the allure of this mythic message is undeniable.

    Which makes it all the more surprising that the American public, offered three likely candidates for the role of new Hitler, has rejected all three.

    That doesn't mean the mythic power of Hitler is gone forever. Finding another Hitler is an old habit. It goes back to the beginning of the cold war, when Stalin became the new Hitler, the "red fascist." (Never mind that reds and fascists despise each other; myth need not be troubled by such logical contradiction.) And old habits are hard to break.

    For now, though, most Americans seem ready to break the habit. We have stopped seeing new Hitlers because we have stopped looking for them.

    Who knows? Maybe it's the beginning of a long-term trend. Maybe public opinion will grow less and less likely to view the world in the simplistic terms of America versus the devil.

    The idea that national leaders everywhere act for comprehensible reasons, even when we don't like their policies, might move into the mainstream of American public discourse. Then Americans might begin to assume that we should always negotiate, compromise, and acknowledge our own role in creating international conflicts.

    Maybe, some day, the mythic Hitler will finally die. At least that demise now appears possible. 

    Americans Still Blind to Israel's Domination of Palestinians

    When Stephen Colbert looks at people he doesn't see skin color. He treats everyone equally because everyone is equally white to him. In the same way, Joseph Kahn, foreign editor of the New York Times, sees no difference between Israelis and Palestinians. When it comes to covering the U.S.-supervised talks between the two neighbors, the Times treats both sides with scrupulously fair equality, Kahn insists.

    Of course Colbert's "color blindness" is a joke, meant to remind us that it's absurd to treat the historically powerful and powerless as if they were equal. But when the Times' editor insists on his even-handedness he is apparently dead serious.

    When it comes to the Israeli-Palestinian negotiations, at least, the Times is still the flagship of the U.S. mass media, charting the course that most others follow. Even if the Israelis were all white and the Palestinians all black, our mass media would remain strictly color blind. How else could they achieve their constant goal of neutrality, the key to objective reporting?

    I trust you get the Colbertian joke. Israel has been dominating the Palestinians ever since it conquered their territories in 1967. It would be crude to say that it's just like the way white Americans have dominated black Americans over the centuries. There are vast differences. But there is also a very rough analogy here.

    At least it's an exaggeration that points to a crucial truth: In this case, as in so many others, neutrality cannot be the key to objectivity because it ignores an immense inequity of power. When the powerful meet the powerless, journalists must always keep that inequity front and center if they want their reporting to be anything close to objective.

    Yet the U.S. mass media do just the opposite. Their reporting on the Israel-Palestine interaction is an endless litany of "he said, she said; he said, she said," constantly reinforcing the mistaken impression that the two sides are equals in something like a fair fight.

    Consider the latest impasse in the talks. The Israelis had promised to release a number of Palestinians they had imprisoned for the "crime" of fighting for their own nation's independence. No doubt some had used, or planned to use, violent means -- imitating the American revolutionaries of 1776 whom our national mythology holds up as heroes. Others, like many Americans during the Revolutionary War, had done no violence but were imprisoned in arbitrary roundups.

    In return, the Palestinians had promised not to join any international organizations or sign on to international treaties.

    Israel reneged on its promise to release the prisoners and rubbed salt in the wound by announcing 700 new apartments for Jewish settlers in East Jerusalem. The Palestinian Authority responded by signing 15 international conventions and treaties, including the Geneva Conventions of 1949, the Hague Convention respecting the Laws and Customs of War, and treaties dealing with women's and children's rights.  

    Yet the official U.S. government response treated both the Israeli and Palestinian actions as equally "unhelpful" acts. The U.S. media was similarly even-handed at best -- depicting both steps as equally damaging to the peace process -- or, like the Times, put more blame on the Palestinian move.

    So Palestinian commitment to international laws of peace and justice was framed as something evil, just as bad as or worse than Israel expanding its illegal housing and imprisoning rebels who were fighting (or wrongly accused of fighting) for their nation's right to be free.  

    That's what happens when you are politically color-blind.

    In a sense, American political leaders, media, and most of the public really are blind to the difference between Israeli and Palestinian actions. They are so sympathetic to Israel, and so antipathetic to Palestine, that they don't see Israel's dominance over Palestinian life as a form of brute oppression. So they don't see Palestinian reactions as an understandable -- and, in recent years, quite restrained and nonviolent -- response to oppression. Treating the two sides as if they were equal combatants in an endless fight reinforces this blindness.    

    Now the U.S. government is trying, with greater persistence than ever, to resolve the quarrel. So the appearance of neutrality is more important than ever.  

    If the media constantly reminded us of the immense power inequity between the two sides, the practical implication of U.S. policy would be clear: By constantly demanding Palestinian concessions in roughly equal measure to Israeli concessions, the U.S. government is continuing its historical pro-Israel bias, a bias going back to the very inception of the nation of Israel.

    When Harry Truman extended de facto recognition to Israel on the very day it declared its independence (though he waited several months for de jure recognition), he overrode the objection of his own State Department that he would alienate Arab states.

    The anti-Arab bias grew stronger in the Eisenhower White House, where it was an article of faith that most Arab governments were tilting toward the communists. When Eisenhower demanded that the Israelis pull back from the Suez Canal in 1956, he wasn't moved by any concern about justice for the Arabs; he was incensed that Israel -- and its allies in the conquest of Suez, Britain and France -- would act without his approval and risk pushing the Arabs further into the communists' arms.

    America's pro-Israel tilt became more blatant during the 1967 and 1973 wars and has remained a hallmark of U.S. policy and politics ever since. The only change is that Republicans are now more likely than Democrats to support the Israeli government, no matter how hawkish it may be. 

    Barack Obama is following the tradition of Jimmy Carter and Bill Clinton, all Democratic presidents who have posed as neutral mediators between Israel and its Arab rivals (though George W. Bush tried to make that a bipartisan tradition). The key to this pose is a Colbertian pretense of blindness to difference.

    By demanding roughly equal concessions from both sides, the U.S. is insisting on an outcome that will maintain the imbalance in power, even if it becomes an imbalance between two independent states.

    Yet there's more than just a pro-Israel bias driving America's image of equality between the negotiating rivals. It's also a matter of America's self-image about its role in the world.

    It has always been an article of faith in American public mythology that our foreign policy aims at justice and peace, while other nations are moved by lust for wealth and power. That's just one of the many ways America is exceptional, the mythology says.

    The idea goes back at least as far as Thomas Paine, who wrote in Common Sense that "in England a King hath little more to do than to make war" which "is to empoverish the nation." So colonial subjects of the British king would always be embroiled in and impoverished by war. But if they became an independent republic, ruled by the will of the people, they could shape an independent foreign policy that would keep them in peace and prosperity. They would remain above every foreign fray, keeping their hands free of bloodshed and thus pure.

    In the early years of the 20th century the myth was extended to cast America as the prime force for world peace and moral purity. Theodore Roosevelt won the Nobel Peace Prize in 1906 for his "happy role in bringing to an end the bloody war recently waged between two of the world's great powers, Japan and Russia," the Nobel Committee declared, adding that "the United States of America was among the first to infuse the ideal of peace into practical politics."

    But there was plenty of self-interest involved. "We have become a great nation ... and we must behave as beseems a people with such responsibilities," TR boasted. Mediating the Russo-Japanese war was a way to act out American greatness and muscle on the world stage, well-dressed as moral virtue.

    It was also a way to stop the expansion of the obviously stronger Japanese military. Roosevelt "decided that [the war] must be stopped before Japan could gain too great an edge and he offered his good offices" as a mediator, George Herring wrote in his authoritative history of U.S. foreign policy, From Colony to Superpower.

    Woodrow Wilson made this image of American neutrality and superiority a bipartisan affair in his response to World War I. By guiding the warring Europeans toward a just and lasting peace, he proclaimed before the U.S. entered the war, he would make sure America played "the great part in the world which was providentially cut out for her. ... We have got to serve the world."

    But he also intended to lead and shape the world. Though he posed at the Versailles peace conference as a morally superior neutral, he insisted that a just peace "must be constructed along American lines," especially to ward off the threat of Soviet Bolshevism, which he "abhorred," to use Herring's words.

    So the Obama administration, with immense help from the U.S. mass media, is continuing a distinguished bipartisan tradition. By treating Israel and the Palestinians as equals, it can portray America standing virtuously above the bloody Old World fray, while at the same time ensuring that the outcome suits U.S. interests and the political interests of the administration.

    So far that approach doesn't seem to be moving the Middle East closer to peace.

    The only realistic path to peace is to speak honestly -- to name Israel as the stronger party, the occupier, and thus the side obliged (legally, morally, and practically) to make more concessions.

    There is now a hint that the administration may yet surprise us and move in that direction. But all depends on how the White House reads the domestic public winds. All depends on the opinions expressed by the American people.      

    9/11 Museum Stirs Memories -- and Protest

    Memories of old conflicts often spark new conflicts. So it's no surprise that there's controversy swirling around the National September 11 Memorial Museum, due to open on May 21, rising from the ashes of the fallen World Trade Center. The Museum will offer visitors a short video about another rising: "The Rise of Al-Qaeda."

    After the museum's Interfaith Advisory Group saw a preview, though, “everyone was just like, wow, you guys have got to be kidding me,” according to Peter Gudaitis, who initiated the Group. He and his colleagues spelled out their complaint in a letter to the Museum's director:

    Museum visitors who do not have a very sophisticated understanding of the issues could easily come away equating al-Qaeda with Islam generally. ... The video may very well leave viewers with the impression that all Muslims bear some collective guilt or responsibility for the actions of al-Qaeda, or even misinterpret its content to justify bigotry or even violence toward Muslims or those perceived to be Muslim (e.g., Sikhs).

    But museum officials are so far unmoved. “I don’t see this as difficult to respond to, if any response is even needed,” Clifford Chanin, the education director, wrote in an email to museum directors, which accidentally went to the protesting ministers, priests, rabbi, and imam. (Whoops!)

    After that gaffe, some response was obviously needed. Joseph Daniels, head of the foundation that oversees the museum, gave it to the New York Times: "We had a very heavy responsibility to be true to the facts, to be objective, and in no way smear an entire religion when we are talking about a terrorist group. ... We have gone out of the way to tell the truth.”

    The question at issue is, apparently, whether the video does smear an entire religion.

    But there are deeper questions. Can such a video, or such a museum, ever simply tell the truth through objective facts? If objectivity is a "noble dream" in the writing of history, as Peter Novick put it years ago, it's surely more of a dream in films and museums, where -- even more than in books -- history is not fact but story.

    The real question that the critics of the video raise is: What story should the museum tell about the men who allegedly perpetrated the horrendous events of 9/11?

    To explore that question, let's first consider another that they did not raise: Why tell the story of Al-Qaeda at all?

    The museum is part of a memorial complex at the foot of the new One World Trade Center, now officially declared the nation's tallest skyscraper at exactly 1776 feet. That tower and its official height tell a symbolic story of their own, the story that George W. Bush began telling almost immediately after the attack: “The resolve of our great nation is being tested. But make no mistake: We will show the world that we will pass this test.” “This will be a monumental struggle between good and evil. But good will prevail.”  

    One World Trade Center is a monumental way to say "We have prevailed!" -- to proclaim that (as Ronald Reagan boasted when the U.S. defeated Grenada in 1983) America and all it has represented since 1776 is still "standing tall." We have passed the test; we and our goodness still tower high above all who would attack or condemn or criticize us.

    To tell the whole story, though, there is also (at the foot of the tower) a memorial to the fallen, reminding us how incredibly gruesome the test was and how much blood had to be shed. Yet the memorial's website tells us that "its design conveys a spirit of hope and renewal." Perhaps it should say "resurrection." In a country so steeped in Christian traditions, you don't have to be Christian to get the message (at least unconsciously): The horror of wholly unjustified death is made holy because the victim is risen again, high and mighty, right before our eyes.   

    If the tower and memorial tell the story clearly, why do we need a museum at all? In part, to make sure no visitor misses the symbolic point of the whole complex. In part, to spell out the story in greater detail.

    Most importantly, though, the museum adds a crucial piece to the story: This was a battlefield where good met evil in an unusual but very real kind of war, it says. If the tower and memorial tell us who the good people were (and still are), the museum tells us who were (and still are) the bad guys, the perpetrators of this horror.

    The video might have been devoted to the heroic rescue efforts on 9/11 or the immense outpouring of generosity that followed. But instead it is devoted solely to a story that might well be called "Who Was -- And Still Is -- Our Enemy?" The hall that houses the video is, in a sense, a theater of war. And right next to it, lest we miss the point, there's a gallery with photographs of the 19 alleged hijackers.

    The Interfaith Advisory Group does not object to the video's claim that America faced an enemy on 9/11. It's the more specific message -- our enemies are Muslims -- that has raised protest. In their letter the clergy said that "if generalized labels are needed" to explain the roots of the attack "we suggest using specific terms such as 'Al Qaeda-inspired terrorism.'”

    This may raise a serious problem for some historians, who see it as their job to provide all the facts and relevant context. Professor Bernard Haykel of Princeton, who vetted the film, spoke for those historians: “The critics who are going to say, ‘Let’s not talk about it as an Islamic or Islamist movement,’ could end up not telling the story at all, or diluting it so much that you wonder where Al Qaeda comes from.”

    Where Al-Qaeda comes from is indeed a pressing question for most Americans. However, it's not a matter of needing facts, as the museum officials claim. It's a matter of needing a satisfying story.

    What most people want from a war museum is, above all, a story that makes sense, a story that gives us an enemy substantial enough to be meaningful. Ultimately most of us feel safe as long as the world, despite all its conflicts, seems to have some overarching structure to give it sense and meaning.

    What if the museum depicts the attackers coming from nowhere in particular, out of the blue, like a tornado? Then there is no meaningful story to tell. 9/11 becomes a senseless, random act, and Americans are left to go on living intolerably precarious lives, no matter how many drone attacks are launched against suspected Al-Qaeda operatives.

    So our story must pit an entire good nation against an entire evil one, or two full sets of values and ideologies against each other -- something like the cold war, or World War II (as we remember it now; during World War II itself Americans showed little understanding of or interest in Nazi ideology).

    A story about high-tech military robots picking off random, shadowy individuals just doesn't offer the kind of familiar, clear-cut, full-bodied war story that makes sense to most Americans. It certainly can't give meaningful structure to our lives. And meaning is, above all, what the September 11 Memorial Museum seems determined to offer, as any war museum inevitably would.

    Even at the risk of stirring up anti-Muslim sentiment, however unintentionally? That's the question the Interfaith Advisory Group poses. Since the museum claims to be "the country’s principal institution for ...  exploring the continuing significance of September 11, 2001," the answer it gives will ultimately speak for all America, for a long, long time to come.

    Beneath all these complex layers of contestation there lies, unspoken, yet another issue, the deepest of all. The museum and its video do not merely speak for America. They speak, loud and clear, about America. They make a profound comment about the mythic vision Americans hold of their own nation.

    Whenever we tell a war story, as we define the enemy we are also in some way defining ourselves. Certainly the burden of all George W. Bush's post-9/11 rhetoric was to insist that we are everything the evildoers are not, and vice versa.

    Whatever we say about "them" -- and the way we say it -- inevitably says a lot about "us" and who we think we really are. This is why the issue of defining the enemy is so fraught with tension.

    The video seems to define America as the "not-Muslim-extremist" nation -- which the protesting clergy fear will all too easily be turned into "America is the not-Muslim nation."

    Perhaps those clergy remember World War I, when America so crudely became the "not-German" nation, with many schools banning the teaching of German and many restaurants refusing to serve sauerkraut.

    Perhaps they remember World War II and the mass internment of wholly innocent American citizens simply because of their Japanese ancestry.

    Perhaps they remember the Korean War, the era when Joseph McCarthy and his followers had such free rein to terrorize loyal Americans with unsubstantiated charges of aiding and abetting the communist enemy.

    Ever since early colonial times, when immigrants from Europe could not be sure whether the native Americans were within or without the white community -- because they were in fact both -- there has been a powerful impulse to associate every purported foreign evil with an internal evil and to attack the internal as well as the external in a single war.

    Implicitly but very clearly, the protesting clergy are saying we must stoutly resist that impulse. We must avoid even the slightest risk of becoming that sort of intolerant nation again. As we define the enemy of 9/11, we must define ourselves as the most tolerant, all-embracing nation we can be, a nation filled with equal respect for all its inhabitants, whatever their religion or land of origin. The real test we face, they are saying, is whether we can rise to that level of humanity.

    Their view of America also has ancient roots. Nearly 400 years ago, one of the earliest Puritan settlers in New England, Thomas Morton, erected a maypole at his settlement, Merrymount. He welcomed everyone and anyone -- including native Americans -- to enjoy "revels and merriment after the old English custom," as he recalled it, "and therefore brewed a barrel of excellent beer ... with other good cheer, for all comers of that day."

    Morton's open-armed vision of what the New World could be outraged many other Puritans, whom he called, with fine precision, "Separatists." They could not stand the sight of native people joining hands with Puritans to dance around the "pagan" maypole. So Miles Standish led an armed band of these Separatists who angrily chopped it down. By that act they symbolized their dearest belief: Their community could remain pure only by separating itself absolutely from the "savages."

    In fact the whole idea of a good, pure, virtuous community would make no sense unless there were "savages" to represent its polar opposite, showing the world clearly what the Separatists were not. The same kind of demand for a clear-cut demonstration of what America is, by contrasting it with what it is not, still echoes across the land today. The National September 11 Memorial Museum and its video stand ready to fulfill that demand.

    The conflict over the video is so tense, and so important, because the two sides have such different mythic visions of America. One side wants an America defined by firm resolve to resist and overcome a clearly-defined enemy. The other side wants an America defined by its readiness to embrace the widest possible diversity of humanity.

    Museum officials face a band of clergy who are the symbolic descendants of Thomas Morton. Those officials may wish they could solve the problem as quickly and decisively as Miles Standish did, or so the errant email from the museum's education director suggests.

    But America is no longer the land of the Separatist Puritans. The demand for a community that welcomes all comers is not so easily stifled. The more it is resisted, the louder it will grow.

    Since the museum wants to represent America as a whole, it invites all of us to add our voices to the debate. "For feedback," says its website, "email info@911memorial.org."  

    Church and State in America: A Brief Primer

    The Supreme Court has ruled, 5-4, that Greece, New York, can open its town meetings with a prayer, even though nearly all the prayers have contained distinctively Christian language. No doubt advocates and critics of the opinion are scouring American history, looking for proof that their view is correct.

    If they look with an unjaundiced eye, they'll quickly discover one basic principle: Whatever position you hold on this issue, you can find some support in our nation's history. So history alone cannot resolve the ongoing debate. But it can help inform the debate.

    To understand that history we have to begin in the European Middle Ages, when the Roman Catholic Church held sway over the religious life of almost all western Europeans. Politically, each area was usually ruled by a single monarch. Since "Church" and "state" were both monolithic institutions, it made sense to talk about "church-state relations" quite literally.

    In principle, both sides usually agreed that the state ruled over the affairs of this world and the church ruled over the affairs of the soul as it headed toward the next world. In practice, though, each side often tried to extend its power over the other.

    When the Protestant reformation came along in the 16th century, it rejected the Catholic church's claim to control other-worldly affairs. But it did not challenge the basic idea that each area should have one secular ruler and one established church, and the two should live side by side, each respecting the other's domain. So tensions between church and state inevitably continued.

    Since nearly all the early European colonists in what would become the United States were Protestants, they brought that Protestant view with them. Different denominations had majorities in the various colonies, and each had its own model of church-state relations.

    But nearly everyone assumed that it could make sense for a colony to have one established church, which would have special privileges from and influence upon the colony. Most of the colonies did, in fact, have established churches.

    By the early 1700s, though, the colonies were filling up with immigrants from different places who held different religious views. So the established churches everywhere had to tolerate dissent from the official religion, to a greater or lesser degree.  At the same time, the colonies were experimenting with all sorts of different political structures.

    Thus "church" and "state" were no longer monolithic entities as they had been in medieval times. Gradually, the term "church" became a code word for religion in general, including the many different religious beliefs and practices held by different groups and individuals. And the term "state" became a code word for the many various political structures -- town, city, county, colonial legislature, royal council, etc.

    Things got more complicated in the 18th century as people found their identity based less in fixed social institutions and more in open-ended individual conscience. The Enlightenment philosophers taught that religion was a matter of private belief and individual relationship with God. They also taught that every individual was free to choose their own political views and that the state should base its policies on the will of the majority.

    A large Christian revival movement called the Great Awakening reinforced the idea that religion is a matter of inner experience and personal relationship with God more than membership in a church. So the Enlightenment and the Awakening combined to promote individualism and the notion of religion as a private matter.

    By the time of the American Revolution, then, there was a complex triangular structure, with private individuals, political institutions ("state"), and religious institutions ("church") all interacting. So the term "church-state relations" meant, more than ever, an endlessly complex set of changing relations among all the different forms of religious and political life.

    But there was a growing belief in the colonies that the private individual had highest priority, that the main role of the state was to protect the individual's rights, including the right to decide on one's own religion.  

    The colonists who joined the Revolution against England all agreed on one thing: the English political system was a tyranny, and the Church of England was part of that tyranny. So there was growing fear of the very idea of an established church.  

    It was only natural, then, that the new United States would want to protect its citizens from an established church. So the first words of the Bill of Rights said that "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof."

    But there was no clear agreement then, as there is none now, about exactly what those words mean.

    Some see the two clauses making two opposite points. "No law respecting establishment of religion" makes it illegal to force people to practice a religion; "no law prohibiting free exercise" makes it illegal to stop people from practicing religion. The "no establishment" clause protects the people and the government from religion. The "free exercise" clause protects religion from the government and the will of the majority.

    But some say that both clauses actually make the same point: They both protect individuals from the federal government. The government cannot impose a religious institution on any individual, nor can the government restrict any individual's religious life. In fact some religious institutions supported the 1st amendment when it was ratified and refused to take any support from the government because they feared such support would entitle the government to impose controls upon them.

    The debate about the meaning of the 1st amendment and the intentions of the founders still rages on because they did not bequeath to us any single consistent view on church and state. They all claimed to be Christian. But they had many different ideas of what it meant to be Christian. Each individual could hold what we might see as contradictory views and practices.

    To take one important example: Thomas Jefferson created the image of a "wall of separation between church and state" and wrote powerfully about the need to protect the religious freedom of every individual. Yet in the Declaration of Independence he based the entire political philosophy of the new nation on the idea that all men are endowed by their Creator with certain unalienable rights. Without God, Jefferson's whole political philosophy makes no sense. Jefferson was also devoted to the teachings of Jesus, but only as he understood them; he even created his own version of the Gospels. Jefferson also supported, on occasion, legislation to create public prayer days and to punish people who broke Sabbath laws.

    If we cannot expect logical consistency even from Thomas Jefferson, we certainly can't expect it from the founding fathers as a group.

    The 1st amendment was the product of political compromise among the founders. So perhaps it is best to see it as the beginning of a conversation or debate about the relation of political and religious life. Perhaps many of the founders knew that all they could agree on was the need to continue the debate.

    Though the founders disagreed on what it meant to be Christian, they all assumed that some version of what each one saw as the "basics" of Christianity was more or less necessary as a foundation of an orderly society. Most of them assumed that Christian values were the basis of political liberty.

    Even those who were wary of Christian bias would probably have agreed with Justice Anthony Kennedy, who wrote the majority opinion in the recent Greece case:

    "Prayer is but a recognition that, since this nation was founded and until the present day, many Americans deem that their own existence must be understood by precepts far beyond the authority of government to alter or define."

    So most of the founders saw no contradiction between the federal government guaranteeing freedom of religion and the states having established churches that could get special privileges from government, provide prayers for political occasions, and dictate the teaching of religion in schools.

    But by the late 18th century all the states had so much diversity that the power of established churches was rapidly fading. Massachusetts was the last state to end its established church, in 1833. By the 19th century, then, Americans did not merely believe in the right to dissent from the dominant church. They assumed that there would no longer be any dominant church.

    Yet the 19th century was dominated by one religious view: evangelical Christianity. Evangelicals emphasized individual experience as the basis of religion. So religion became, more than ever, a matter of individual choice, which led to the creation of many new churches. But the evangelical fervor also strengthened the idea that all Christians share basic values in common, and that these were the core values of the American way of life -- a view that would surface again in some 20th century Supreme Court decisions.  

    For evangelicals, the "wall of separation" meant that everyone was free to influence the government as much as possible according to their own version of Christian values, with the goal of making America the kingdom of God on earth. For some that meant causes we would consider liberal, like free public schools for all and the abolition of slavery. For some it meant causes that we would call conservative, like prohibition of alcohol and teaching the Bible in public schools. Many felt comfortable supporting all these reform movements.

    From the 1840s on, large waves of Catholic immigrants came to the U.S. They learned to accept religious pluralism and reject the old Catholic tradition of one universal church for everyone. But they created their own schools, raising new questions about state support for religious education. These problems, like nearly all problems of church and state in the 19th century, were dealt with at the local and state levels.

    After the Civil War, the 14th amendment made all states subject to rule by the federal constitution, opening the way for federal courts to apply the 1st amendment and rule on church-state issues. In 1879 the Supreme Court issued its first opinion directly dealing with church and state. It ruled that the government could forbid Mormons from practicing polygamy. The Court cited words written by Jefferson indicating that the wall of separation prevents the government only from controlling religious beliefs. But the government could forbid behaviors it deemed harmful to society.

    However, it was not until the 1940s that the Supreme Court began addressing the church-state question in earnest. By that time the federal government was playing a much larger role in the life of every American, while a slowly rising tide of secularism was undermining the notion of America as a Christian nation. For growing numbers of Americans, "the American way of life" meant a dedication to pluralism, diversity, and the fullest protection of individual rights. These factors combined to bring many issues related to religion before the Court.

    In 1940 the Court took on the case of Jehovah's Witnesses who argued they should be able to go door to door without a state license. The Court agreed, declaring for the first time that the 1st amendment's "free exercise of religion" clause applied to local and state governments as well as the federal.

    In the same year, though, a group of Jehovah's Witnesses argued that their children should not be required to salute the flag in school because it violated their free exercise of religion. The Court ruled against them. Then two years later, in an almost identical case, it ruled that the Jehovah's Witness children did not have to salute the flag.

    Why the abrupt turnaround? There is some evidence that the Court was influenced by a wave of criticism of its first decision from scholars and newspapers, and also by dismay over a wave of anti-Jehovah's Witness prejudice after the first ruling. This case reminds us that the Court is never making its decision in some abstract realm of pure legal rationality. It is always, to some extent, a barometer of the climate of public opinion.

    In the Everson case of 1947 taxpayers argued that their town, which paid for children's bus rides to public school, should not pay for Catholic children's bus rides to Catholic school. Writing for the majority, Justice Hugo Black penned a famous, stirring defense of the wall of separation, arguing that the 1st amendment's "no establishment of religion" clause applied to local and state as well as federal law. This became an accepted principle of later Court cases. Yet Black and the majority decided in favor of the Catholic children getting public money because it was going to them as individuals, not to the church.

    This case, and the Court's reversal in the Jehovah's Witness cases, foreshadowed the history of church-state cases ever since then. There has been no consistent pattern, but rather what Justice Robert Jackson called a "winding, serpentine" wall of separation, full of all sorts of unpredictable twists and turns in the Court's views.

    Vagueness often prevails. In the Lemon case of 1971, the Court ruled that no law may "have the primary effect of either advancing or inhibiting religion" and left it for later Courts to figure out what that means.  Now the Court has added another contorted brick to that wall, by a 5-4 margin, as has so often been true in recent church-state cases.

    The Court still reflects the climate of public opinion, which remains divided and uncertain about the proper relation of religious life to the body politic and the lives of individuals, or what we have come to call "church and state." So the debate initiated by the 1st amendment goes on -- which may be just what the founders intended.

    Pity the Poor Second Term Presidents

    The commentariat of America's corporate mass media have just about reached their verdict on Barack Obama's second term: F for failure. We shouldn't be surprised. The day after Obama won re-election there was a wave of punditry reminding us that no second-term president has ever achieved very much. So little more than failure was ever expected from this second-term president.

    But as Leo Tolstoy might have said, successful first-term presidencies are all alike; every unsuccessful second-term presidency is unhappy in its own way.

    The headline of a recent Washington Post editorial spoke for the emerging consensus: "America's global role deserves better support from Obama." "For seven decades since the end of World War II, the United States has shouldered the responsibility of global security guarantor," the editorial began, summing up the premise of that consensus.

    Now, the Post lamented, the president is shirking America's responsibility, giving in to the temptation to "lay down that burden," "focusing on the costs of the U.S. global role," and thus offering us, instead of the bold leadership our global role requires, only "an uncertain trumpet."

    Those last words take us back to 1960, when General Maxwell Taylor wrote a best-selling critique of Dwight Eisenhower's foreign policy, calling it an excessively weak "uncertain trumpet" in the face of a growing Communist menace -- for which Taylor was rewarded, by the new, more liberal president John F. Kennedy, with the chairmanship of the Joint Chiefs of Staff. Taylor used his position of power to lead America's troops much deeper into the swamps of Vietnam.

    Never mind that the American public remembers the lessons of Vietnam -- and Iraq and Afghanistan. Never mind that, as the Post editorial noted, "a new Wall Street Journal/NBC News poll found that 47 percent of Americans want the United States to be 'less active' in world affairs -- a 33 percentage-point increase since 2001." Never mind that Obama tried to play the global commander-in-chief in Syria and had to back down in the face of the public's wrath. The American people just don't understand what's good for them and for the world, the centrist Post lamented.  

    Now, as in 1960, liberals are joining the chorus of criticism of a president who purportedly won't stand up to the Russian menace.  After he described his own foreign policy this way -- “You hit singles; you hit doubles. Every once in a while, we may be able to hit a home run" -- the usually supportive New York Times called it "a sadly pinched view of the powers of his office. ... It does not feel as if he is exercising sufficient American leadership and power."

    Times columnist Maureen Dowd mocked Obama in her typically punchy way: "A singles hitter doesn’t scare anybody. It doesn’t feel like leadership. It doesn’t feel like you’re in command of your world. ... We expect the president, especially one who ran as Babe Ruth, to hit home runs." The real problem, Dowd concluded, speaking for the consensus, is that "Barry" is "whiffing."

    Either you command the world or you strike out. Those seem to be the only alternatives the mass media commentariat will allow (with a few notable exceptions).

    Actually Eisenhower, the president the liberals attacked in 1960 for weakness, pretty much agreed with that dichotomous view of America's role in the world. He tried his best to command the world and to make sure the U.S. could go on commanding the world for decades to come. (That's why he wouldn't spend as much money on the military as the Democrats wanted; he was in it for what he called "the long haul.")

    Yet just 14 months into his second term (Obama is now 15 months in), Ike was summing up his second-term experience, in a private letter, in self-pitying terms: "There has scarcely been a day when some seemingly insoluble problem did not arrive on my desk."

    Eisenhower didn't see that his insoluble problems were largely of his own making. During his first term he could have gone to Geneva, forged a genuine rapprochement with the Soviets, and built a foundation for jointly resolving many of the problems that plagued him throughout his second term.

    Instead he offered an "Open Skies" plan that was an obvious ploy to give the U.S. further cold war advantage. The Soviets saw through the ruse and rejected it (in part because they feared letting the world see how pitifully inferior their nuclear war-fighting capabilities were).

    Eisenhower never imagined the possibility of really cooperating with the Soviets, and certainly not a joint U.S.-Russian space venture. So instead he had to suffer the second-term humiliation of seeing the Russians launch the first earth-orbiting satellite (Sputnik), and then the further humiliation of having a U-2 spy plane shot down over Russia, because he could never give up his goal of global control through global surveillance.

    Eisenhower's experience is a good reminder of the maxim that every unsuccessful second-term presidency is unhappy in its own way.

    But he also reminds us that a two-term president is usually remembered by history for what he accomplished in his first term, not what he failed to do in his second term.

    Consider the roster of presidents who were seen, during their second terms, to be floundering and failing. Besides Eisenhower, it includes Wilson, FDR, Truman, and Reagan -- all now widely admired by historians for leaving a powerful mark on the nation's history. Historians go on debating whether that mark was positive, of course; that's what historians are supposed to do. But there's general agreement that these, like Eisenhower, were strong presidents whose first terms marked significant turning points. In the history books, their perceived second-term failures are typically treated as less significant.

    Wilson's failure to get the U.S. to join the League of Nations may be an exception to that rule. But most historians see the rejection of the League as a failure of the Senate, not the president, who often gets high marks for sticking to his first-term vision of a world without war, a world eternally safe for democracy. So Wilson probably does fit the general rule that second-term distress is eclipsed in the history books when there is reason to see first-term greatness.

    Why, then, was there such a rush to condemn even the most eminent 20th-century presidents during their second terms? Because all of them, like Eisenhower, were failing to achieve goals they had set during their first terms, largely due to their own mistakes.

    Wilson ran up not only against a recalcitrant Senate but against British and French leaders at Versailles whose aims he misread, and in both cases he overestimated his power of rhetorical persuasion.

    FDR's New Deal faltered because he cut government spending, tried to pack the Supreme Court, and campaigned against Southern Democrats who were blocking his legislative program.

    Truman, after launching America's cold war, continued to prosecute a war against communists that he could not win.

    Reagan, after a first term filled with inspiring rhetoric of renewing American virtue, got caught up in the Iran-Contra scandal and responded with clumsy, unconvincing words of self-excuse.

    Put all these criticisms of second-termers together, look at them the opposite way around (as if they were photographic negatives), and they paint a vivid picture of what we expect a president to be: wise, honest, virtuous, politically masterful, and powerful enough to win every battle, at home and abroad.

    We want, not a human being with human failings, but a fairy-tale hero of mythic stature who will embody the kind of perfection that America's mythic traditions attribute to the nation as a whole. Or as Maureen Dowd put it, we want every two-termer to spend eight full years being the Babe Ruth of politics, hitting homer after homer -- and we want it, I would add, so we can go on believing that America is, year after year, the Babe Ruth of nations.

    Eventually presidents' second-term failings fade from public memory, while their first-term home runs are remembered, so we can go on imagining that we've really had presidents who lived up to our cherished mythic standard.

    In fact none did. And there's no reason we should expect Obama to, either.

    However, he is in a different category from all previous two-termers in one crucial respect. He is being denounced, not because he has failed to achieve foreign policy goals he set out during his first term, but precisely because he is doing so well in achieving his goals.

    He made it clear from the beginning that, when it came to relations among nation-states (non-state "terrorists" are a different story), he would not flex America's muscle to demonstrate overt global control. Instead he would follow the maxim of John Quincy Adams (nearly every historian's choice for our greatest Secretary of State): America "goes not abroad in search of monsters to destroy." Obama's great fault, his centrist and liberal critics insist, is that he is sticking firmly to that maxim, which is widely scorned by today's foreign policy establishment, though it's once again quite popular among the people at large.

    Ironically, in the current orgy of Obama-bashing, there is little mention of the one foreign policy goal he set in his first term that he may very well fail to fulfill due to his own mistake -- the Israel-Palestine agreement that seems to be eluding him. In the mass media there's barely a whisper of the obvious: those peace talks collapsed because Obama would not put enough pressure on the Israeli government to follow through on its promises and to stop new development in the West Bank.

    Historians may eventually remember that as Obama's one true foreign policy failure. But if Obama follows the path of his predecessors, historians will turn that failure into a footnote, while lauding yet another president who achieved greatness in his first term. That's one way we keep the myth of America's greatness alive. 

    A Civil War Myth That Hurts Us All

    Why are so many Americans woefully ignorant of their nation's history? That perennial question is raised yet again by Timothy Egan in his latest column on the New York Times website.

    To prove that the problem is real, Egan cites two pieces of evidence. The first one surely is cause for concern: "a 2010 report that only 12 percent of students in their last year of high school had a firm grasp of our nation’s history" (though historians will surely wish that Egan had added a link to the report, so we could track down the source).

    Egan's second piece of evidence suggests that he may be a participant in as well as observer of the problem. "Add to that," he writes, "a 2011 Pew study showing that nearly half of Americans think the main cause of the Civil War was a dispute over federal authority -- not slavery -- and you’ve got a serious national memory hole."

    I'm no expert on the Civil War, but I've read a number of recent books by historians who are. They all agree that in 1861, when thousands of Northerners eagerly enlisted, few were  signing up to fight for the abolition of slavery. They were signing up to do the one and only thing Abraham Lincoln called them to do: to save the Union, which is to say to affirm federal authority over all the states.

    True, the dispute over federal authority was sparked by the problem of slavery. Most Northerners were determined to stop slavery -- but only in the territories of the West, where they feared slaves would block work opportunities for free whites. Hence the popular slogan: "Free Soil, Free Labor, Free [White] Men."

    When it came to the existing slave states, most Northerners agreed with Lincoln that there was no legal ground to abolish slavery. Most expert historians suggest that there was still not enough political will in the North to try to abolish slavery.

    So from the North's point of view, at least, federal authority was indeed the fundamental issue.

    Only gradually, as the war progressed, did many Northerners come to see it as a war against slavery. Many others never reached that point, as Steven Spielberg's recent film Lincoln reminded us. Even among those who did get there, most probably embraced abolition largely as both a symbol of and strategic means to victory in the war, not as a good in and of itself.

    In the South issues of slavery and federal authority certainly were inextricably entwined. "Slavery was enshrined into the very first article of the Confederate Constitution; it was the casus belli, and the founding construct of the rebel republic," as Egan writes. That's a good snapshot of the issue from the Southerners' perspective.

    But it's the winners who are supposed to write the history of any war. For a Northerner to cite the Confederate Constitution as the full explanation for the war is questionable, at best.

    So let's thank Timothy Egan for adding a bit more proof that even "opinion leaders," as he writes (as well as "corporate titans, politicians, media personalities and educators") are sunk, more or less, in that national memory hole. At least their knowledge of history usually has some serious holes in it.

    And I should personally thank Egan for reinforcing a point that's dear to me: When we recall our history, and especially when we bring that memory into the political arena, we are more often in the realm of myth than empirical fact -- though most of our political and historical myths aren't simply falsehoods; they include facts, but those facts are always wrapped in imaginative, symbolic narratives that dictate how we interpret the facts.

    The story of the Civil War as essentially a war against slavery -- with all other issues secondary -- is a fine example. It's a story so deeply rooted in American public memory, at least outside the white South, that it will probably never be dislodged, no matter how many historians write how many books. Such is the power of myth.

    When Egan wanted to understand why Americans have such a weak grasp of their history, though, he didn't look into the power of myth. Instead he "asked a couple of the nation’s premier time travelers, the filmmaker Ken Burns and his frequent writing partner Dayton Duncan."

    Burns said: "It’s because many schools no longer stress 'civics,' or some variation of it," so students don't learn "how government is constructed" -- a curiously irrelevant response from someone who has enriched our understanding of so many aspects of our history.

    Duncan did offer a direct and provocative, if speculative, answer: "Americans tend to be 'ahistorical' — that is, we choose to forget the context of our past, perhaps as a way for a fractious nation of immigrants to get along."  

    That's where Egan adds his comment on the South's Constitution and slavery as the casus belli, as if to prove the point by example. "That history may hurt," he explains, implying that North and South can get along easier if we ignore the hurts of their fractious past. "But without proper understanding of it, you can’t understand contemporary American life and politics."

    No arguing with that conclusion. Coming on the heels of Egan's (mis)reading of the causes of the Civil War, though, it points to a more complex view of the American memory hole.

    Why do most Americans outside the white South embrace the mythic view of the Civil War as a battle essentially over slavery, from beginning to end? Isn't it because "history may hurt" -- because it would, and should, pain us to recall how deeply racist most Northern whites were in 1861, and how many were willing to let slavery continue in the existing slave states?

    If we believe the story of the North in 1861 as a monolithic bloc dedicated to eradicating slavery, it eases the hurt. It lets us believe that white America, outside the South, has a proud history of sacrificing blood and treasure for the cause of racial equality. It makes the rapid end of Reconstruction, white Northerners' apathy toward Jim Crow laws in the South until the 1960s, and white racism in the North until the present day all look like aberrations in a fundamentally moral history.

    So we can more easily forget that, as Ta-Nehisi Coates reminds us, "America was built on the preferential treatment of white people -- 395 years of it." He rightly laments "our inability to face up to the particular history of white-imposed black disadvantage," an unbroken history that continues to the present day in wealth, jobs, housing, education, incarceration, voting, and so many other areas of life.  

    If our ultimate goal is, as Egan suggests, to understand contemporary life and politics, the prevailing myth of the Civil War as a crusade for freedom and equality is counterproductive.

    That doesn't mean we should aim to replace myth with pure objective fact -- a noble but impossible dream. It does mean we need a myth of the Civil War that comes closer to the facts and helps to close the still-yawning gap between black and white America. We need a myth that makes sense out of all kinds of racism and racial disparities in the present, not one that obscures them.

    Such a myth would probably open up more white hurt, at least for a while. But it's the only way we might possibly, some day, heed Lincoln's call to bind up the nation's wounds.  

    Mental Health vs. Gun Control: A Devil's Bargain

    Elliot Rodger

    The tragic death of seven young people in Isla Vista, CA, has sparked renewed calls for gun control, as everyone expected. Less expected: Republicans in the House are leading a push for a well-funded federal program to give a broad range of new services to Americans with serious mental illness.

    Before we get to the details, let's review some basic facts: Only 5% or less of violent acts are committed by people with serious mental illness. Mental illness alone causes virtually no increase in the likelihood that any person will do violence. People with no mental disorder who abuse alcohol or other drugs are far more likely than the mentally ill to commit violence.

    These facts lead a lot of people to the rather logical conclusion that the real problem raised by mass killings lies not in mental illness but in the all-too-easy availability of guns.

    Of course Republicans will have nothing to do with that line of thinking. Again, no surprise. The surprise is the new Republican interest in seriously addressing the nation's shamefully inadequate treatment of the mentally ill.

    It's led by GOP Rep. Tim Murphy, a clinical psychologist from Pennsylvania. He's introduced a rather sweeping bill, The Helping Families in Mental Health Crisis Act (H.R. 3717). While some of its provisions are no doubt debatable, overall it would provide an unprecedented array of services to people struggling with mental illness and their families. Some of the reforms would come from changes in existing federal law and interpretations of law.

    But some would require significant increases in federal spending. The bill even calls for a whole new level of bureaucracy: an Assistant Secretary for Mental Health and Substance Use Disorders within the Department of Health and Human Services.

    So far the bill has 86 co-sponsors -- and 50 of them are Republicans!

    Why the sudden GOP enthusiasm to see the feds take care of Americans who have suffered so much neglect for so long? A spokesman for a prominent House Republican, Duncan Hunter, acknowledged what everyone knows: GOP members "want to avoid any situation where mental health is primarily hitched to the gun debate."

    Murphy himself put it more delicately: "If guns caused mental illness, then we would treat that; mental illness needs to be treated, and it is not." But the point is clear enough.

    So what's a self-respecting liberal to do? Murphy's bill is the stuff that liberal dreams have been made of for years. Anyone who has directly seen the agony mental illness can cause will want to stand up and cheer for the Republican psychologist and his 50 colleagues. And the bill can't pass the House without plenty of Democratic support. 

    Meanwhile, with the House surely under GOP control through 2016, and perhaps the Senate too, chances for any kind of gun control legislation are as nonexistent as the unicorn.

    Still, supporting Murphy's bill is a symbolic endorsement of the politics behind it: pandering to the totally false but widespread belief that mental illness, not guns, is the primary cause of violence in the United States. It comes pretty close to saying that gun control no longer really matters, at least not for the time being.

    Should liberals buy this devil's bargain?  

    That question brought to my mind the old Joni Mitchell line: "We're caught in the devil's bargain / And we've got to get ourselves back to the garden."

    It's a pithy summary of the political dilemma Americans have struggled with throughout our nation's history, the one that this mental health bill raises yet again: Are we pragmatists who take only what we can get, believing that politics is the art of the possible? Or are we idealists, standing up for absolute truth and justice every time as the genuine American way?

    Idealists since colonial times have claimed that the Old World was marred by pragmatism -- the willingness to compromise with the devil and soil one's soul in the dirtiness of political deals. Here in the garden of the New World, on the other hand, life could be Edenic. Every kind of perfection was possible. We could have it all -- or so the story was told. 

    Thomas Morton's Merrymount, Brook Farm, and the communes of the '60s hippies are only the most famous of the many efforts to put that vision into practice.

    At the same time, there has been an equally powerful tradition of priding ourselves on our distinctive pragmatism, our Yankee ingenuity, our ability to get the job done no matter what it takes -- even compromise on basic principles. The Constitution, putting into practice Madison's vision of checks and balances, stands as the greatest monument to this side of America's national narrative. The story of the Constitutional Convention has been told over and over to prove that our spirit of compromise works -- even if it produced something as shameful as the 3/5 compromise (slaves counting as 3/5 of a person).

    Similarly, Franklin Roosevelt used a (no doubt invented) "Bulgarian proverb" to justify alliance with the Soviets in World War II: "You are permitted to hold hands with the devil until you get across the bridge." That line has often been quoted, almost always with approval -- except perhaps by ardent, principled anti-communists. Yet just a few years after the war's end, even they were willing, even eager, to embrace all manner of evil means to defeat the "red menace," and they never seemed ashamed of saying so.

    Which is a good reminder that both liberals and conservatives have been found in abundance among both the pragmatists and the idealists. The current battles between the tea party and the more "moderate" Republicans as well as between the Clinton and Warren wings of the Democratic Party are both as American as apple pie.

    The lesson of history is that pragmatism and idealism are permanent features of all our major political parties. Every one has been riven by internal strife between its absolutists and its compromisers. Often enough the same person has been an absolutist on some issues and a compromiser on others.

    So if we ask whether Democrats will support Rep. Murphy's anti-gun-control mental health bill, the obvious answer is that some will and some won’t.

    The question that remains is how each side among the liberals will deal with the other. Will the supporters of the Murphy bill respect the purist gun control advocates and their righteous motives for criticizing the bill, recognizing that the purists want both mental health reform and gun control, not a choice between the two? Will the purists respect the righteous motives of pragmatists who support the bill, recognizing that the pragmatists remain committed to gun control whenever it becomes politically possible?

    The lesson of history is that the answer to both questions is "Not very likely."

    Idealists have typically been absolutists, stoutly resisting every suggestion of compromise. And their absolutism has given America some of its finest moments -- like Dr. Martin Luther King's refusal to tolerate the words "wait" and "gradualism" in the drive for genuine equality, now! The civil rights movement of the 1960s might have won no victories at all if the compromisers had prevailed.

    Pragmatists have typically criticized the purists, often harshly, for letting the best become the enemy of the good and thus condemning the nation to end up stuck with the bad. Their cautious approach, too, has led to some fine results.

    When FDR first entered the White House, for example, many of his advisors urged a utopian program of transforming the U.S. into what historian William Leuchtenburg called a "Heavenly City: the greenbelt town, clean [and] green" prevailing everywhere. FDR opted for more limited, realistic goals. As a result we still have Social Security and unemployment checks flowing across the land to people in need.

    Even if Tim Murphy's mental health bill becomes law, it's not likely to be remembered by history on the same level as the New Deal and the civil rights movement -- though for the millions affected by mental illness and forced to endure our terribly inadequate mental health system, the suffering is often on a par with the worst effects of poverty and racism.

    While the bill is being debated, however, it offers liberals of both the pragmatic and idealist persuasion a chance to show each other some respect and acknowledge that good motives can be at work on both sides.

    Our national mythology has always insisted that such mutual respect is possible because (as illogical as it sounds) Americans are both exceptional pragmatists and equally exceptional idealists -- that we have a unique ability to walk on both sides of the fence simultaneously.

    Our national mythology has also enshrined the claim that America created the best possible political system, where honest disagreement between well-meaning factions need not lead to outright hostility.

    True, those mythic principles have been more honored in the breach than in the observance. Yet they remain ideals worth keeping in mind when liberals are divided by the offer of a devil's bargain. 

    "Iraq" Is Still Arabic For "Vietnam"

    When George W. Bush and the neocons launched their war in Iraq, critics coined the slogan, "'Iraq' is Arabic for 'Vietnam.'" The point was obvious: Another long quagmire of a war in an inhospitable foreign land would lead once again to nothing but death, suffering, and defeat for America.

    That was back in 2003 and 2004, when the parallel was to the Vietnam war of 1965-1973.

    To see why "Iraq" is still Arabic for "Vietnam" we have to turn the historical memory dial back just a few more years, to 1962 and 1963. That was when John F. Kennedy struggled with the same dilemma now facing Barack Obama: How much, if at all, should we get involved militarily to help a corrupt leader who stays in power by terrorizing his political enemies?

    Here's what JFK told interviewers in September, 1963, about South Vietnam under President Ngo Dinh Diem: "I don't think ... unless a greater effort is made by the Government to win popular support that the war can be won out there."

    Here's what Barack Obama told reporters on June 13, 2014: "Iraq’s leaders have to demonstrate a willingness to make hard decisions and compromises on behalf of the Iraqi people in order to bring the country together. ... and account for the legitimate interests of all of Iraq’s communities, and to continue to build the capacity of an effective security force."

    JFK: "In the final analysis it is their war. They are the ones who have to win it or lose it. We can help them, we can give them equipment, we can send our men out there as advisers, but they have to win it."

    Obama: "We can’t do it for them. ...  The United States is not simply going to involve itself in a military action in the absence of a political plan by the Iraqis that gives us some assurance that they’re prepared to work together." 

    JFK balanced his calls for Diem to reform with what sounded like a promise that the South Vietnamese government would get U.S. aid no matter what it did or failed to do: "I don't agree with those who say we should withdraw.... This is a very important struggle even though it is far away. ... We also have to participate -- we may not like it -- in the defense of Asia."

    Obama sounded a similar note: "Given the nature of these terrorists, it could pose a threat eventually to American interests as well. Iraq needs additional support to break the momentum of extremist groups and bolster the capabilities of Iraqi security forces. ...  They will have the support of the United States. ...  We have enormous interests there."

    Just as Kennedy publicly denied that he contemplated any significant troop buildup, Obama insisted, "We will not be sending U.S. troops back into combat in Iraq." Yet JFK continued pouring "advisors" into Vietnam throughout his presidency, just as Obama promised that there would be "selective actions by our military ...  We have redoubled our efforts to help build more capable counterterrorism forces so that groups like ISIL can’t establish a safe haven. And we’ll continue that effort. "

    Kennedy's warning that military aid depended on South Vietnamese government reform was not merely for public consumption. A year earlier he had sent Diem a private letter promising more money for Diem's army but adding a warning that the aid was "specifically conditioned upon Vietnamese performance with respect to particular needed reforms" that would be "most effective to strengthen the vital ties of loyalty between the people of Free [i.e. South] Vietnam and their government."

    Whether Obama has sent such a letter to Iraq's prime minister Nouri al-Maliki is anybody's guess.

    There's another key difference. In his 1963 interviews JFK explained that Vietnam itself was not the crucial issue. It was more about the world's perception of America's power. Losing Vietnam would give "the impression that the wave of the future in southeast Asia was China and the Communists."

    Obama has not come out and said anything quite like this. Yet he must be keenly aware that his critics at home -- and even some of his usual supporters -- are urging him to make sure the world knows that the U.S. still runs the show.

    Just a week before Mosul fell to the ISIS/ISIL forces, liberal commentator Fareed Zakaria wrote that "the world today... rests on an order built by the United States that, since 1989, has not been challenged by any other major player." The big question, he said, is: "How to ensure that these conditions continue, even as new powers -- such as China -- rise and old ones -- such as Russia -- flex their muscles?" Now a new power is rising in the Middle East, and the question of preserving the world order is likely central to the conversation in the Oval Office.

    Indeed another usual supporter of Obama's foreign policy, the New York Times, says that neocon Robert Kagan's recent article "Superpowers Don't Get to Retire" "struck a nerve in the White House" -- so much so that "the president even invited Mr. Kagan to lunch to compare world views." "Events in Iraq Open Door for Interventionist Revival," the Times' headline declared.  

    So Obama is stuck in much the same dilemma that faced Kennedy: feeling compelled, both by global geopolitical and domestic political concerns, to bolster an ally, but knowing that all the military aid in the world won't help such a fatally flawed ally win the military victory that the U.S. government wants.

    How to resolve the dilemma? JFK insisted on keeping all his options open. Obama said: "I have asked my national security team to prepare a range of other options that could help support Iraqi security forces, and I’ll be reviewing those options in the days ahead."

    JFK sent a seemingly endless round of envoys to Vietnam to study the situation and report back to him. Obama may well end up doing the same.

    "We want to make sure that we have good eyes on the situation there," the current president said. "We want to make sure that we’ve gathered all the intelligence that’s necessary so that if, in fact, I do direct and order any actions there, that they’re targeted, they’re precise and they’re going to have an effect." 

    Have an effect? Looking back at the outcome in Vietnam, all one can say to Mr. Obama is, "Lotsa  luck, buddy."

    And one must wonder whether Obama has told Maliki in private what JFK told Diem: U.S. troops would not actually be doing the fighting; we would only send military aid and advisors. Nevertheless, the U.S. would "expect to share in the decision-making process in the political, economic, and military fields." Looking back to Vietnam and ahead to Iraq, one can only say again, "Lotsa luck, buddy."

    To the end of his life Kennedy remained caught up in a typical American fantasy: If you just work hard enough at it, you can reason your way to the precisely perfect solution. You can walk the fine line that lets you avoid hard decisions and instead find the perfect balance that embraces both sides of the dilemma. You can have it all. And because you are America you can bend smaller nations to your will, enforce that perfect solution, and insure a happy ending for everyone.  

    If the ghost of JFK still wanders the White House he might be waking Barack Obama in the middle of the night, saying, "Lotsa luck on that one, too, buddy."

     

     

    Israel's Strategy and America's Mythology

    Bombs are falling and people are dying in Gaza. It's headline news in America's mass media. As usual, though, we get only today's events, with no historical context to explain what's really going on and why.

    The crucial piece of history our mass media ignore is that one basic principle has always guided Israel's foreign policy: Keep the perceived enemies divided; never let them unite.

    That's why Israel aided the creation of Hamas in the 1980s. The Israeli government feared the prospect of all Palestinians uniting under the flag of the Palestine Liberation Organization, dominated by Yasser Arafat's Fatah party. Hamas seemed to offer a counterweight.

    The recent reconciliation of Hamas and Fatah raises that specter again. Israeli leaders want to stop it at all costs, to drive a wedge into the uneasy peace between the rival Palestinian parties. Hence the onslaught against Hamas and Gaza.

    Though he's in his 90s, the veteran Israeli politician and commentator Uri Avnery can see it all quite clearly. After three Israeli teenagers were kidnapped in the occupied West Bank, "the Netanyahu government immediately saw in the incident an auspicious opportunity. Without the least evidence (as far as we know) it accused Hamas. The next day," he wrote, the Israelis "started an attempt to eradicate Hamas in the West Bank," with massive arrests of Hamas leaders.

    "The main aim," Avnery posits, "is to pressure Mahmoud Abbas to abandon the inter-Palestinian reconciliation and to destroy the new experts-only Palestinian government. Abbas resists. He is already widely denounced in Palestine, because of the ongoing close cooperation between his security forces and the Israeli ones, even while the Israeli operation is continuing."

    Avnery wrote that before Israel began bombing Gaza, where Hamas rules. Surely he, and Israeli government strategists, knew that the crackdown on Hamas would provoke some ineffectual rocket fire by splinter groups in Gaza, giving Israel an excuse to blame Hamas for those rockets, too, and begin bombing Gaza.

     Now that so many Palestinians have died, the pressure on Abbas to "get tough" is all the greater. So is the pressure on Hamas to fight back, to abandon the nonviolent policy that was so basic to the Fatah-Hamas unity government. The more rockets fly out of Gaza, the harder it will be to patch up the Fatah-Hamas split. And the easier it will be for Israel to go on making its disingenuous case: How can we negotiate a peace with "terrorists" who want to destroy us?

    The Israeli government must have predicted all this when it first pinned blame on Hamas for the kidnappings, although it could present no evidence to support the charge. Anyone who has followed the conflict for very long could have predicted it. The logic of Israel's strategy, however deadly, is easy enough to see.

    So why is that strategy so glaringly absent from U.S. press coverage of the current conflict? Are American journalists in Jerusalem just too ignorant to get it? That's possible, but it doesn't seem likely.

    What's more likely is that their perceptions, and the perceptions of their editors, are trapped in a sort of tunnel vision. They can only see what long-standing American myths allow them to see. Two myths have dominated the history of American perceptions of the Israel-Palestine -- or what's often called, more broadly and misleadingly, the "Israel-Arab" -- conflict.  

    From the time that the State of Israel was born in 1948 and immediately plunged into war with neighboring nations, the U.S. news media tended to treat it as a "tit for tat" struggle. It's a story that's been familiar to Americans since colonial times: In the Old World, there's just this inexplicable urge for nations to fight each other. "Inexplicable" means we don't have to try to understand the context, nor the motives of each side. They just hate each other and will go on fighting forever.

    That's a widespread view, in this country, of the Israeli-Palestinian struggle. It's showing up once again in headline after headline that all boil down to "Israel and Hamas Trade Bomb Attacks" -- period, as if nothing more need be said.

    In the aftermath of the Six-Day War of 1967 a second myth came to the fore in the U.S., one that saw Israel as a permanent victim of constant hatred and attack from its neighbors. It's familiar to Americans from endless hours of watching television: cowboys against Indians, cops against robbers, and any number of other variants on the "good guys against bad guys" myth -- with no doubt allowed, in this case, that Israel is the good guy. Maybe we should call it the "Israel can do no wrong" myth.

    That's also a widespread view showing up now in headlines in this country, even in our most influential newspapers, like: "Rockets Hit Israeli Heartland as Offensive Begins." Though the story speaks of Israel's offensive against Gaza, the cursory reader (and aren't most readers cursory?) who sees only headlines would assume that it's Hamas on the offensive -- "as usual," the mythic voice adds subliminally. After all, that voice says, Hamas is a "terrorist organization," isn't it? Israel's just defending itself, isn't it?

    Now these two myths are working together to put blinders on American journalism.

    The "tit for tat" myth is probably dominant, for historical reasons. Beginning with Israel's 1982 invasion of Lebanon, when its army stood by knowingly while hundreds, perhaps thousands, of Palestinians were massacred in the Sabra and Shatilla refugee camps, American journalists began to back away from the "Israel can do no wrong" myth. Over the years, they've increasingly informed us that some blame must be ascribed to Israeli policies.

    In U.S. coverage of the current situation, though, even the rare explicit critique of Israel usually reinforces the mythic perspective.

    For example, the New York Times' Isabel Kershner briefly mentioned one criticism: "Israeli experts often describe Israel’s periodic campaigns in Gaza in terms of 'mowing the grass,' with the limited goals of curbing rocket fire, destroying as much of the militant groups’ infrastructure as possible and restoring deterrence. Critics say the use of such terminology is dehumanizing to Palestinians and tends to minimize the toll on civilians as well as militants."

    But anyone reading Kershner's report, or viewing the Times' web video on "Mowing the Grass," is likely to conclude that, even if the terminology is dehumanizing, the practice makes perfect sense, because the war can be understood only as "tit for tat" or "good guys against bad guys." In such a war, the stronger nation would naturally want to "mow the grass" every so often.   

    There is no place in American mass media coverage for any other viewpoint -- and certainly not for the aim that so obviously motivates Israel's current attack on Hamas and Gaza: destroying the infant rapprochement between the two Palestinian parties and thus easing the international pressures on Israel to end the occupation of the West Bank and the isolation of Gaza. That just doesn't fit into the prevailing mythic framework.

    Of course Israel's strategy is shaped by its own long-standing mythology. I've called it the myth of Israel's insecurity -- the story that says Jews will always be under attack from enemies who want to destroy their state simply out of anti-semitic hatred. Avnery calls it "the ghetto reflex, formed by centuries of persecution, for Jews to stand together against the evil goyim [gentiles]." This myth is now powerful enough among Israeli Jews to drive the political policies of their government.

    Every political myth has some elements of fact wrapped up in its imaginary structure. Surely there is anti-semitism among some Palestinians. Surely Hamas is under pressure from other militaristic factions in Gaza and therefore uses its rockets to keep its political power. Surely there is now such a long history of animosity that it's a difficult cycle to break.  

    But in myth the imaginary overwhelms the factual, dictating that many facts -- often the most crucial facts -- be left out.

    So the Israeli government, and the vast majority of Israeli Jews, ignored the obvious fact that whoever kidnapped and presumably killed those three Israeli teenagers did not act on behalf of Hamas, much less the Palestinian people as a whole. On the contrary, the kidnappers no doubt intended to break up the reconciliation of Fatah and Hamas. They were able to understand, as clearly as Avnery, that the Israelis would react with violence and thus undermine the reconciliation. Tragically, as so often, the extremists on both sides became partners in pushing toward a common goal.  

    More basically, the Israelis have ignored for years the Hamas offer of a long-term truce during which the two sides would negotiate a permanent peace, including a de facto recognition of Israel by Hamas.

    Here in the U.S., the Hamas offer was beginning to get some notice a couple of years ago; there was a glimmer of possibility that a new myth, more true to the facts, was in the making.

    But now it has disappeared. Hamas is routinely described as "committed to Israel's destruction." That, too, is once again part of the prevailing mythology, making it easier for U.S. media to restrict coverage of the current events to the "tit for tat" and "good guys against bad guys" myths.

    The combination of these two myths dictates that Americans must be given the Israeli version of events: The kidnappers become not isolated individual criminals but merely "Palestinians," and the mythic tale of Hamas, or perhaps simply "the Palestinians," launching a deadly attack on an innocent Israel now passes for reality. Meanwhile, the obvious strategic purpose of Israel's response is ignored.

    Yet the history of the U.S. mass media's reporting on Israel shows that mythic frameworks can change. Another change, bringing myth closer to reality, is always possible. And if not now, as a famous Jew once said, when? 

    Don't Blame Climate Change Deniers

    Ira Chernus is Professor of Religious Studies at the University of Colorado at Boulder and author of "Apocalypse Management: Eisenhower and the Discourse of National Insecurity."

    MythicAmerica is on a forced hiatus while I deal with health problems. But over 300,000 people in New York City the other day reminded us all that no one's health will matter much unless we take care of the planet's health. So I felt moved to polish up a previously unpublished column and share these thoughts with you:

    The old joke, "Everybody talks about the weather, but nobody does anything about it," is no laughing matter any more. It's dead serious. Yet the United States seems politically paralyzed on this most vital issue.

    It's easy to blame the climate change deniers. But it's wrong. In Gallup's most recent poll only 18% of us denied climate change. In a CBS poll, only 11% were outright deniers.    

    The vast majority of Americans are well aware that there's a real problem. More than four out of five agree with the overwhelming scientific consensus that climate change is happening now or surely will happen soon. And a solid majority believe that what they read in the news about climate change is either accurate or underestimates the problem.

    Nevertheless, Americans put the climate almost dead last on the list of problems facing the nation.

    Thirty percent of Americans believe climate change is here or on the way but simply do not worry about it. Virtually the same percentage believe it's already happening or will happen in their lifetimes but doesn't pose any serious threat to them.

    Another public opinion study, by scholars at Yale and George Mason (Y/GM), found Americans falling into rather clear-cut categories. The "Cautious" and "Disengaged" -- neither true believers nor deniers -- add up to exactly 30%. A sizeable majority of them believe climate change poses a high risk to future generations. Yet virtually none of them "have thought a lot" about climate change.

    The biggest political stumbling block is not the deniers. It's all those ignorers. How can so many ignore what they know is coming?

    The Y/GM study found one crucial reason: uncertainty about the facts. Though most of the ignorers see a danger looming, few are really sure that it's happening now. Only about a third of them think that scientists agree on the facts. About four out of five say they "need more information to form an opinion." Nearly all say they could "easily" change their minds.

    Don't be too quick to blame the 30% though. Even those the Y/GM study calls the "High Involvement Public" show surprising levels of uncertainty and apathy. About two-thirds of the "Concerned" say they're sure climate change is happening now. Yet four out of five say they need more information to make up their minds and 70% could "easily" change their minds. And only a tiny 13% have thought about it "a lot."

    Among the thin sliver of the public (16%) who are "Alarmed" -- who all know climate change is happening and poses a danger to future generations -- roughly half say they need more information, and nearly a quarter are open to changing their minds. More than one-third have not thought "a lot" about the issue, and only about a third have expressed their concern to any public officials. 

    Which means (I'm embarrassed to admit) that I'm a pretty typical American. For years I've written thousands of words on a wide range of subjects. Yet I've rarely addressed climate change, even though I've known that it's happening and poses unthinkable danger.

    When I look in the mirror and try to figure out why I've avoided the issue, what I see staring back at me is that word unthinkable. When I write I try to be sure I know what I'm talking about. When it comes to climate change, the science seems so complex, so daunting, so far over my head that I hesitate to say or even think anything. I can never feel certain.

    And I know that even the best scientists have to deal with uncertainty. They understand, as Elizabeth Kolbert recently noted in the New Yorker, that "while it is possible that the problem could turn out to be less serious than the consensus forecast, it is equally likely to turn out to be more serious."

    That's why one of my friends, who is on the UN's Nobel-Prize-winning Intergovernmental Panel on Climate Change, taught me long ago to call the problem "climate chaos." She and her colleagues are sure that climate change is happening. But they also know that the dangers to human life come from the unpredictable, erratic, and often massive weather events that it causes (like the storm that dropped some 20 inches of rain in just a few days on her neighborhood, triggering unprecedented flooding).

    Moreover, my friend tells me, climate scientists have been talking about all kinds of uncertainties for years. Recently she organized a conference on "Uncertainty in Climate Change Research: An Integrated Approach," because "uncertainty is present in all phases of climate change research." 

    Even climate change philosophers deal with uncertainties that make our national conversation on the issue chaotic. Dale Jamieson points out that we can't be sure who to blame: "A lot of our thinking about policy tends to be oriented around a sort of good guy-bad guy polarization. Climate change is an issue that doesn’t fit very neatly into that stereotype. ... We’re all involved in contributing to the problem to some extent and we’re all involved in suffering from the problem to some extent."

    The noisy climate change deniers bear some of the responsibility, of course, but surely not all. The fossil fuel corporations are a big part of the problem, too. Yet, as Paul Krugman recently wrote, "it’s not mainly about the vested interests. ... The monetary stakes aren’t nearly as big as you might think."

    Then there are the huge greenhouse gas emissions from poorer countries, especially China and India. Can we really say they are part of "the enemy" on this issue, when we Americans emit so much more per capita? Millions of us in the U.S. drive our cars, and use more energy than we need, every day. We have met the enemy and he is us.

    The evildoers in this tale are such a vast, diverse, vaguely-defined mass of people they're virtually invisible.

    If we think of carbon dioxide as the enemy, it's also invisible: "tasteless, odorless -- it doesn’t present to our visual systems," as Jamieson says. David Ropeik, an expert on risk perception, agrees. The public doesn't worry because the threat "doesn’t feel immediate/imminent. It doesn’t feel…well…real. It’s more of an idea, a concept, an abstraction."

    And we can't even be sure how big a problem carbon dioxide is. Methane may be the major culprit here.

    Moreover, the effects of climate change are creeping up on us so slowly that they, too, are largely invisible. If this is an apocalypse, it's an agonizingly gradual one, the kind we just don't know how to think about or even believe in, much less deal with.

    All in all, when I try to grasp the chaotic truth about climate change, I think I've got good reason to feel unsure and confused. 

    So I ask myself: Is there anything I know pretty much for sure? I know that in politics "a narrative is the key to everything," as Democratic polling guru Stanley Greenberg once wrote. The Yale/George Mason scholars agree that if there's any chance of motivating the ignorers to get involved, new narratives are a key:

    "Narratives foster involvement with a story and characters, and prior issue involvement is unnecessary for drawing the audience's attention. Memory of narrative content tends to be high ... and studies find that the persuasive effects of fiction can be as high as for non-fiction."

    I know that the best politicians of every stripe -- from FDR to Reagan, from Elizabeth Warren to Ted Cruz -- are always great storytellers. Of course they aren't novelists. Though they may lie when it's useful, the stories they rely on most to get themselves elected and their policies enacted have to include some dose of real facts. Yet those facts have to be embedded in a simple, emotionally powerful narrative rooted in familiar cultural traditions.

    The best politicians understand that shared stories are the glue that hold communities together. People cling to comfortable narratives because they want to cling to the other people in their comfortable group. Research now shows that even among the small minority who actually deny climate change, many probably know the scientific facts. They deny them mainly to reinforce their status as "true conservatives" -- the group bond that gives them a sense of identity.

    Here's another thing I know pretty much for sure: The dominant narrative of climate change activists isn't working well enough. "We are absolutely certain," that narrative insists. "Virtually all scientists agree. Unless we act urgently we are doomed." What could be simpler or more emotionally gripping?

    Nevertheless, this story has not made much headway in the American political arena. The group Gallup calls "Concerned Believers" has held steady at only 39% for the last 14 years. And, as we've seen, not many of them are moved to consistent action or even apprehension. Hence the lack of political action.

    Maybe that's because most of them, like the "Cautious" and "Disengaged," aren't impelled by a narrative that relies on a claim of absolute certainty. As long as climate change activists don't have any other kind of story to offer, they aren't likely to win any big political victories.

    That doesn't mean the activists should throw out their prevailing narrative. Because here's another thing I know for sure: Every good political campaign needs niche marketing. There's still a sizeable minority of the U.S. population that believes the claims of scientific certainty, and they should hold on to their story.

    The people I worry about are in all those other niches, the ones who will respond only to stories that begin with "No one knows for sure, but ..."

    Then I ask myself, "Why worry?" I study and write about political narratives all the time. It should be fun to find some that allow for uncertainty. And it should be easy. In fact there's lots to choose from already. 

    A Republican stalwart, Henry Paulson, says flatly: "It is true that there is uncertainty about the timing and magnitude of these risks ... We'll never know enough to resolve all of the uncertainties." But "we must not lose sight of the profound economic risks of doing nothing." Good businessmen don't wait for certainty. They calculate the odds and then take action.

    That story about benefits to the marketplace from an all-out attack on climate change is growing. And it's bipartisan. Tom Steyer, perhaps the nation's wealthiest climate change activist, funds Democratic candidates and NextGen Climate, whose slogan is: "Act politically to avert climate disaster and preserve American prosperity."

    EPA head Gina McCarthy took a similar tack when she announced the Obama administration's proposals for limiting coal plant emissions: "The plan will create demand for designing and building energy-efficient technology ... It spurs ingenuity and innovation. ... All this means more jobs” -- regardless of how big the threat really is.

    However, it's a gamble whether such a naked appeal to economic self-interest will have a big impact, when so many Americans vote against their own best economic interests.

    A recent experiment tested a more idealistic message. Conservatives, in particular, proved more favorable to safeguarding the environment when they were told that "it is patriotic." Most moderates and even many liberals may respond to that kind of call too.  

    The Pentagon has long been touting its version of that story. Its latest Quadrennial Defense Review  "identified climate change as one of our most significant national security problems"; at least that's the way the commander-in-chief read the report. Obama agreed with the Joint Chiefs that "climate change could end up having profound national security implications."  

    Look at it this way, and suddenly uncertainty is even less of a problem. Whenever American public opinion has believed that a potential risk to our nation and our way of life loomed on the horizon, no matter how small, we've never waited for absolute certainty. We acted first and got all the facts later.

    Sometimes we've prepared for war -- and even gone to war -- no matter how slim the odds of real threat, because when it comes to protecting our homeland we take no chances -- as today's events in Syria and Iraq make painfully clear.

    Risk analyst Ropeik is pessimistic. He thinks the patriotic vein won't be tapped deep enough to yield political results unless we "feel we were at war -- bullets-flying ... NOW 'I am in Danger' war." He might be right; the "Climate Patriots" meme has been around for several years without garnering very much attention (perhaps because it's been yoked to a meme of absolute scientific certainty).

    But political narratives are germinating, unnoticed, all the time. Occasionally, unpredictably, one bursts into powerful prominence. People were talking about abolishing slavery, for example, for more than a century before the Civil War and the Emancipation Proclamation. Christopher Hayes, for one, thinks we need a new abolitionism, though he knows it will be a tough fight.

    Ever since Franklin D. Roosevelt called on the nation (in his first inaugural address) to "wage a war" against the Great Depression as if "we were in fact invaded by a foreign foe," Americans have united to resist all sorts of non-military dangers -- poverty, drugs, cancer, and even fat -- as long as the campaign was dubbed a "war."

    They've also learned to pay big bucks for research and development in wartime that led to all sorts of unexpected and profitable technological breakthroughs. So the economic benefit, patriotism, national security, abolition, and war stories might all fit together in a tale I suggested recently: a gradual apocalyptic transformation from the possibility of catastrophic risk to the possibility of a far better world.

    On the other hand, maybe the best we can hope for is an endless war of containment, like the cold war. For decades most Americans assumed that the apocalyptic communist threat could never be vanquished; we'd be staving it off forever. National security was reduced to risk management in a world of permanent uncertainty.

    Now the U.S. government is funding an international project treating climate change precisely as an exercise in risk management. These scientists call it "a problem imbued with deep uncertainty." Their first, still unanswered question is "How large are the uncertainties?"

    All these narratives -- and surely there can be lots more -- can start with the words, "No one knows for sure. But why take chances?" Any one of them might, or might not, be a political game-changer. 

    In any event, looking over all the climate change narratives, there's one last thing I know for sure: The dominant story of the American mass media, "doom-sayers versus deniers," is far too narrow to reflect the true complexity of the political landscape.

    So I say let a thousand narratives bloom. Or at least plant a thousand seeds, and see which ones bloom into political successes. No one can be certain about the future.

    All we can do is keep nurturing all those stories and embrace the uncertainty. Because the political landscape of climate change, like the climate itself, is bound to be chaotic at least for a while. Right now, it seems to me, the more chaotic the better.   

     

    Here's What Is New in Our Latest Fight with China

    MythicAmerica explores the mythic dimension of American political culture, past, present, and future. The blogger, Ira Chernus, is Professor of Religious Studies at the University of Colorado at Boulder and author of Apocalypse Management: Eisenhower and the Discourse of National Insecurity.

    You may not recall that a United States Navy warship recently took a brief cruise past the Spratly Islands in the South China Sea, to show that the U.S. disputes China’s claim to the islands. You may not have heard about it at all. Though top leaders of both nations are still talking about it -- and talking tough, suggesting that the incident may presage future conflict -- the news came and went here in the U.S. just as quickly as the USS Lassen’s trip past the islands itself. Now it’s largely forgotten. Who knows what’s going on in those islands? And who really cares?

    It all reminds me of one of my earliest political memories: John F. Kennedy and Richard Nixon, in the 1960 presidential campaign, vigorously debating U.S. policy toward Quemoy and Matsu, two tiny islands off the coast of China. I didn’t know what was going on in those islands. I suspect most Americans didn’t really know either.

    But we all knew why they mattered: They had become the latest symbolic battleground in the Manichaean struggle we called the cold war. We had a simple mythic framework of good versus evil to make sense out of every disputed spot in the world, no matter how obscure or unimportant the place itself.

    The Spratly Islands are just as much terra incognita now, for most Americans, as Quemoy and Matsu were 55 years ago. But when we turn to our mass media to find out why the Spratly Islands matter, we don’t find the story told with the same rigid moral dualism. Maybe that’s why the story could come and go so quickly.

    Certainly America and its allies in the region remain the good guys in the tale. There is a consensus that we and the world face danger from Chinese “expansionism.” But exactly why is that dangerous? The old bogeyman of “global communism” no longer works as an answer; the fact that the Chinese government is, at least officially, communist now often goes unmentioned. The old cold war tone of absolute good versus absolute evil has been softened considerably.

    In fact it is easy enough to conclude, from U.S. mass media reports, that the Spratly Islands conflict is essentially an old-fashioned geopolitical struggle between two great powers. It’s easy to get the impression that the U.S. is defending its status as the world’s Number One superpower. The U.S. response sounds much like a boxing world champion facing a tough young challenger for the title. The champ works out very publicly, sending frequent PR releases and taunting jibes from the training camp, just to make sure the challenger knows who is still Number One.

    Most news reports also mentioned the rich potential deposits of oil and natural gas beneath the Spratly Islands and their surrounding waters. Old-fashioned geopolitics readily blends with old-fashioned competition for resources as a reasonable explanation of what’s going on.

    Do I hear the ghost of Reinhold Niebuhr applauding us from the grave? In his 1952 classic, The Irony of American History, the influential American theologian lamented his nation’s penchant for framing its foreign relations within the Manichaean myth of good versus evil. We always see the enemy as a nation grasping for more domination of people and resources, he pointed out, but we never admit that we, too, are motivated not by morality but by self-interest. The irony, Niebuhr warned, is compounded when our self-righteousness leads us into policies that end up hurting our own self-interest. Niebuhr lived just long enough to see his warnings fulfilled most dreadfully in Vietnam.

    Does the Spratly Islands incident show that we Americans have grown up since 1960? Are we finally ready to admit that we play “the great game” as avidly as all those nations we have labeled “evildoers” for so long?

    Well, not so fast. The Spratly Islands caused only a ripple in our public awareness. But we are paying frequent, avid attention to the Syrian civil war. And the rhetoric coming from the Obama administration, loudly amplified by the mass media megaphone, still sees that war through the lens of the familiar dualistic mythology. We and the (rather hard to find) Syrian “moderates” we support are presented as absolutely good. The Islamic State (or ISIS, to use our favorite, oddly persistent, acronym) is absolutely evil. So is the alliance of Bashar al-Assad’s Syrian army and the Russian forces dispatched by Vladimir Putin.

    Pitted against our goodness, those two starkly opposing forces might readily merge in Americans’ minds, simply because they share the fundamental quality of pure evil. The Manichaean myth, and the strange tricks it plays on our minds, are still very much alive.

    True, Secretary of State John Kerry is negotiating with his Russian counterpart to seek a resolution of the Syrian conflict. But negotiating with evil enemies was a tradition well developed during the cold war. It began when the Vietnamese civil war was supposed to have been settled at Geneva in 1954. That may not be a memory the Obama administration would like us to dredge up now, since within two years the U.S. had effectively scuttled the Geneva settlement, paving the way for nearly twenty years of war in Vietnam. Still, it’s worth remembering that Americans are accustomed to negotiating with the representatives of what they see as pure evil. Of course they expect the outcome to favor the interests of the purely good; that is, the Americans.

    If we step back and take a global view, it appears that the United States now applies its familiar dualistic mythic framework selectively: While we remain wholly good, some of our opponents are still absolutely evil, but others are merely “expansionist.”

    One sign of the difference: In the U.S. mass media, Russian and Syrian policies are always presented as the nefarious doings of one man—Putin or Assad—so that we know exactly which malevolent person to blame, while the name of China’s leader remains widely unknown, since he is so rarely mentioned. Since the days of King George III, Americans have typically seen the doings of nations they labeled purely evil as the deeds of a single individual. It is so much easier to cast a person as the devil than a nation. Blaming a whole nation (like China) takes us perilously close to the image of global geopolitics that “realists” like Niebuhr have promoted, suggesting that “it’s just what all nations do.”

    Why does communist China escape with the epithet of mere “expansionism,” which can place its wrong-doing so easily in the mythic frame of “realism”? Perhaps, when it comes to China, the Obama administration and the U.S. mass media have given up the moral dualist myth because nowadays it would simply cost too much. The American economy is dependent on China in too many ways. And there is already plenty of nervousness abroad in the land about depending so heavily on such a relatively evil country. If that perception of evil were transferred from the relative to the absolute category, the cognitive dissonance might be too much to bear. Our economic ties to China might fray beyond repair, leaving us—well, who can predict?

    When Franklin D. Roosevelt wanted to justify his wartime alliance with the French fascist leader Darlan, he cited (or invented) “an old Bulgarian proverb”:  “It is permitted you, my children, in time of danger to walk with the Devil until you have crossed the bridge.”

    But suppose the bridge—in this case, our endlessly complex economic links with China—has no end? Better, it seems, to decide that this Satan is no longer really such a devil after all. He has become an “expansionist.” Of course “expansionism” is still an evil of a sort and, in the persistent common wisdom, America remains untainted by any evil. So we must invent this third mythic category, “expansionism,” which lies not beyond but between absolute good and absolute evil.

    It’s an interesting case study in how mythic narratives and symbolic terms are rather easily reshaped to fit changing needs, and how little attention that reshaping often receives. But such seemingly small, and thus unnoticed, changes can have big consequences. 

    If American reporting on the Lassen’s trip past the Spratly Islands is any indicator, the new mythic category may end up pushing us, however subtly and unwittingly, toward questioning the common wisdom of our perfect goodness. And the Spratly dispute shows no sign of ending soon. So we may have to get used to a new, more “realist” stage in our nation’s mythic self-understanding, if only to keep our position as the world’s wealthiest nation. Do I hear the ghost of Reinhold Niebuhr chuckling from the grave? How ironic. 

    Don’t Put All Blame for War Fever on Conservatives


    In the wake of the Islamic State (IS) attacks in Paris, many American liberals, from the president on down, have accused conservatives of something like warmongering – rushing to demand large-scale U.S. military force in response to a perhaps exaggerated IS threat. There is plenty of truth to the charge. But before they point the finger of blame, liberals should take a good look in a mirror framed by historical perspective and consider their own role in the move toward wider war.

    Since the Vietnam war era, it has been common for liberals, especially those at the far left end of the political spectrum, to lament that Americans are, and have always been, too eager to go to war. But history does not sustain the charge. The American attitude toward war has always been ambivalent. In every war that the U.S. has fought, a sizeable portion of the public has either resisted entry into the war or eventually rejected it as a mistake. World War I and the Iraq war saw both kinds of responses.

    Since World War II, at least, the American public as a whole has endorsed war only when there was a widespread view that we faced an enemy bent on destroying or conquering our whole nation – what is now often called an “existential threat.”

    The loud chorus of demands for more military force since the Paris attacks thus suggests that many more Americans do see the IS as an existential threat. Among the Republican presidential candidates, Ben Carson has used that term explicitly, while others have only implied it. Senator Lindsey Graham has also used the term explicitly. A best-selling author echoes that view in the Washington Times.

    Over at the Washington Post, the editor of the editorial page warns that the IS, if not already an existential threat to the U.S., is well on its way to becoming one. Other influential voices warn that our “way of life,”  if not our physical homeland, faces an existential threat, the same warning often heard during the cold war.

    It’s not only conservatives who raise the parallel with the cold war. In the New York Times, liberal columnist Roger Cohen suggests that we are fast approaching World War III – the term used so often during the cold war to raise the specter of massive nuclear destruction of the American homeland.

    In fact, any talk or implication of the Islamic State posing an existential threat to the United States is based far more on fantasy than reality. Why, then, is there such a growing militaristic mood? Which is to ask, why such a widespread belief that we do face an existential threat to our homeland?  

    Here is where liberals should look in that mirror framed by historical perspective and see something disturbing.

    The idea that the United States might be conquered by a foreign enemy had not been heard from a president since 1812 until the iconic liberal president, Franklin D. Roosevelt, voiced it repeatedly in 1940 and 1941. Trying to raise public support for U.S. aid to Britain against the Nazis, FDR told Americans that the German air force was preparing to set up bases in South America, from which it would launch attacks against places like St. Louis, Kansas City, and Iowa – the very heart of the homeland.

    Once the U.S. entered World War II, it was natural to extend that imagery of existential threat not only to the Nazis but to the “Japs.” In the late ‘40s another iconic liberal president, Harry Truman, led the nation to transfer the same kind of fear almost seamlessly to a new foe, “the commies.”

    In 1960 John F. Kennedy, still the darling of many liberals, won the presidency largely by playing on that fear; he warned that the Republicans had created a “missile gap,” leaving the U.S. vulnerable to Soviet nuclear attack. Lyndon Johnson, perhaps the most liberal president ever on domestic issues, used a similar fear to justify war in Vietnam. If we did not fight the Reds over there we would have to fight them in San Francisco, he insisted.

    This liberal heritage made it easy for Americans to believe Ronald Reagan’s warnings about the communist threat – Reagan called it the “window of vulnerability” – and George W. Bush’s warnings that Saddam Hussein was preparing to conquer the United States with his nuclear arsenal.

    In every one of these cases, historians have been able to show that the threat was either exaggerated or wholly non-existent. But fears like this rarely stem from verified facts. They stem from imagined realities so powerful that they have the weight of myth. Indeed American political culture has been immersed, for decades now, in a whole mythology of homeland insecurity, centered on the notion that the United States faces a permanent existential threat, though the name of the enemy is subject to change. 

    One of the central tenets of the mythology of homeland insecurity is that our enemies have no rational motives. Like the devil, they are driven by an irrational will to evil for its own sake. The possibility that they are responding to U.S. policies or behaviors is ruled out a priori.

    The logical conclusion is that no changes in U.S. policy can mitigate the danger posed by the enemy. Military force to destroy, or at least contain, the threat is assumed to be the only option. The Nazis, whose conquests gave rise to this mythology, still serve as the prime example, continually paraded to prove that this chain of mythic reasoning is valid.

    Conservatives have happily adopted the mythology of homeland insecurity and adapted it to current historical circumstances. But its true birth parents were predominantly liberals.

    With such a long history of devoted support across the political spectrum, this mythology has become the most basic frame within which all issues of national security are discussed. Voices that do not accept the mythology as a basic premise have been largely relegated to the margins of American public discourse.

    Naturally, then, the public response to the attacks in Paris is massively shaped by fears for the very existence of the homeland. Every fearful word is magnified by the dominant mythology into a fear of existential threat. Thus it becomes further reason to demand a stronger military response. Conversely, every call for a stronger military response is heard through the filter of the dominant mythology and reinforces the idea that we face an existential threat.

    Liberals may find all this illogical. Devoted as they are to reason, liberals understandably avoid facing a truth they find unpalatable: In political life, as in so much of life, mythology is usually more powerful than rationality. So liberals may easily be unaware of the effects of their words.

    But those effects are very real. When liberals say anything that stirs fear of the Islamic State, and when they call for increased military action, no matter how minimal, they are feeding the growing sense of existential threat. And at the very highest level, liberals are doing just that.

    Barack Obama has called the IS “the face of evil.” Hillary Clinton has moved hawkishly to the president’s right, calling for stronger military force because “we need to crush ISIS.” Such words are bound to feed the fear-driven war fever.

    (The same concern applies to progressives and democratic socialists who are further left than liberals. Bernie Sanders proclaims that “our priority must be … to destroy the brutal and barbaric ISIS regime.” Sanders cautions against rushing to increase our military force, apparently not recognizing that his florid words will encourage the rush to force regardless of his cautions.)

    What liberals should see when they look in the mirror, then, is their own double culpability. First there is the legacy of the liberal past, creating the mythology of homeland insecurity that shapes every moment of the current debate. Then there is the reality of the present, when liberals, however unintentionally, help shift the debate to the right with words that reinforce the mood of fear and the resulting belief that we must ratchet up the military force. 

    Liberals are supposed to be committed to a faith that all conflicts can somehow be resolved peacefully through rational discourse. That is their cherished mythology.

    Some liberals may give the proven facts of the current situation close logical scrutiny and conclude that peaceful resolution through reason is not possible. If so, they should do what many liberals did in FDR’s day: consciously renounce liberalism when it comes to foreign affairs. 

    Other liberals may find a way to break out of the stranglehold of that mythology, dig deeper into the facts, apply a more rigorous logic, and remain true to their liberal tenets. They will find a way to grapple with the complex challenge of the Islamic State from within the traditional peace-oriented mythology of liberalism. So will progressives and social democrats, who hold the same faith in peace through rational discourse.

    As a first step, everyone who remains true to the left will patiently explain why American military force against the IS should not be increased but ended – because every bomb that falls is another recruiting card, “like manna from heaven,” for ISIS and another invitation to ISIS to launch more attacks.

    As a second, perhaps more difficult, step, liberals and leftists will watch their own words carefully. They will avoid any words that might feed, however indirectly, the image of the IS as an existential threat to the U.S. Instead, they will directly attack that notion and do whatever they can to undermine its credibility. Doing so, they will also be attacking the whole structure of the mythology of homeland insecurity.

    Finally, to attack that mythology from another angle, liberals and leftists will examine U.S. policies and actions carefully, asking how we might be contributing to the problem. Out of that analysis they will suggest changes to U.S. policies and actions – beyond an end to military force – that might ease rather than exacerbate the conflict. That’s no easy task. But no one ever said it would be easy to be a true liberal. 

    Liberals Are Ambivalent About War Against Evil

    Since the shootings in San Bernardino I’ve posted two columns. One analyzes the dynamics of America’s urge to wage war against evil itself. The other points out that liberals must accept some of the blame for this persistent urge.

    But I don’t want to be unfair to liberals. Most of them advise us to respond to the Islamic State with what amounts to a moral crusade against evil, and as in all crusades they tend to dehumanize the enemy. Yet liberals also suggest more caution and moderation than conservatives (who mostly seemed inclined to “bomb the shit out of ’em,” as Donald Trump put it, saying out loud what so many other right-wingers may be thinking).

    Exhibit A: the president. After the Islamic State (IS) murders in Paris, Barack Obama called the IS “barbaric terrorists ... the face of evil” and labeled their actions “an attack on all of humanity.” If the IS fighters are barbarians who target all humanity, they clearly stand outside of humanity. They are some other species, merely appearing to be human.

    Yet Obama claimed that the heart of the problem “is the ideology that they carry with them and their willingness to die” for it. “The narrative that ISIL developed of creating this caliphate makes it more attractive to potential recruits. ... They are very savvy when it comes to social media.” Only human beings are capable of the rational thought that produces ideological narratives purveyed through social media.

    And Obama recognized that, since they are human, they are influenced by the ways their opponents respond to them: “Part of the reason that it is important what we do in Iraq and Syria is that the narrative that ISIL developed of creating this caliphate makes it more attractive to potential recruits.”  

    Hillary Clinton has apparently moved hawkishly to the president’s right, saying “We need to crush ISIS ... break the group’s momentum and then its back.” But Hillary, too, believes that “online or offline” we are in “a war of ideas against an ideology.”

    Bernie Sanders (who calls himself a democratic socialist but aligns in his policies with the left wing of historic liberalism) goes a step further. Though he agrees that “our priority must be … to destroy the brutal and barbaric ISIS regime,” he immediately adds: “and importantly to address the root causes underlying these brutal acts. ... to create conditions that prevent fanatical extremist ideologies from flourishing.” So we can affect their behavior by our own choices. (Perhaps Sanders is thinking of the conditions Pope Francis has enumerated: “Terrorism feeds on fear, mistrust and the despair born of poverty and frustration.”)

    These top liberals are following a time-honored American tradition of ambivalence. The earliest English settlers on the eastern seaboard never reached a consensus: Were the native peoples they found here human? Some, like John Eliot and Thomas Morton, answered with a definite “Yes” and reached out to form community with the Indians. Others, like Captain Miles Standish, treated the Indians as nothing but wild beasts of the forest who had to be eradicated for civilization to advance.

    These were Puritans, of course. From the time they first exterminated a whole tribe (the Pequots in 1636) they explained their violence with Protestant theology: If irrational beasts took the form of human beings, they must be devils in disguise, or at least agents of the devil. Like all devils, they were bent on (to use one liberal pundit’s words about the IS) “slaughter for its own sake.” So no changes in the white man’s policies could affect the red beast’s madness. The only option was annihilation. Some Puritans used this as convenient justification for stealing land. But many, no doubt, solemnly believed their own ideology.

    The same ambivalence about the humanity of the Indians plagued white America for another three centuries or more (and perhaps still does, in some ways, today). Thomas Jefferson, the icon of liberal rationalism, was confident that all human beings could learn, some day, to conciliate their differences reasonably. As president he made treaties with the Indians, as one nation to another, clearly implying their humanity.

    Yet Jefferson wrote that some Indians would surely "relapse into barbarism ...  and we shall be obliged to drive them, with the beasts of the forest into the Stony [now Rocky] Mountains.” Even worse, “if ever we are constrained to lift the hatchet against any tribe, we will never lay it down till that tribe is exterminated.”

    The Civil War saw a similar kind of ambivalence. Preachers on both sides urged their flocks to see the contest in apocalyptic terms, pitting divine goodness against devils in human form. The only way to deal with absolute evil was to wipe it out. Yet only gradually did Abraham Lincoln move toward giving his generals license to fight what the historian of U.S. warfare Russell Weigley called the “American way of war” -- the war of total annihilation. Once given the green light from the White House, generals like Grant and Sherman were willing to oblige.

    From then up to Saddam Hussein’s Iraq in 2003, Americans have shown the same double-edged approach to war. Large numbers -- especially liberals -- have often opposed war either before it began or after it was over (or both, as in World War I and the Iraq war). But when U.S. forces were actually fighting, a large majority of the nation -- including liberals -- has seemed quite willing to view the fight as a moral crusade against inhuman evildoers, thus a war that need observe no limits.

    Before the Civil War, though, two centuries of fighting Indians had already created the template. A vast number of white Americans were ready, and remain ready, to believe that our enemies are bent on evil for its own sake and that nothing we have done, or can do, could have any impact on the enemy’s behavior. Why has this view of the enemy as non-human devils won out so often?

    The Civil War offers the clearest example of a truth that historians of every American war have discovered: War has typically broken out at times when the public as a whole was divided, or at least deeply confused, about its sense of national identity. White Americans have typically used their wars to create at least the illusion of consensus about what it meant, and how good it was, to be a white American.

    This, too, goes back to the earliest days of English settlement. Jill Lepore made the point persuasively in her history of King Philip’s War, the Puritans’ effort to do God’s work by exterminating all the tribes of New England in 1675.

    The English colonists, “plagued with anxieties of identity,” used their victory to draw “new, firmer boundaries ... between what it meant to be ‘English’ and what it meant to be ‘Indian.’” By defining themselves “against the Indians’ savagery,” which was “considered inhuman,” they “attempted to carve out for themselves a narrow path of virtue.” 

    But “the same cultural anxieties,” Lepore concluded, “would continue to haunt them ... their descendants ... [and] peoples from other parts of Europe” who came to these shores. So white America “would fight uncannily similar wars over and over again.”

    Perhaps it could not be otherwise. America has often been explained as a noble, yet fragile, experiment: people with no common ethnic or cultural heritage, and frequently not even a common language, trying to create a unified society built only on a set of abstract ideals that are honored more often in the breach than the observance. No wonder we can rarely agree with any certainty on who we are.

    War offers a relief from that uncertainty by letting us say, “We may not know exactly who we are, but we are for damn sure not them, the enemy.” To gain the full measure of the illusion of unity and the relief it brings, we must insist that we are the absolute opposite of our foes in every way. We need a boundary higher and sturdier than the Stony Mountains to separate “us” from “them.”

    The highest, sturdiest boundary line of all is the one we erect between our absolute good and the enemy’s absolute evil. As a nation steeped in Protestant lore, most of us cannot fully escape turning that line into a moral dualism of God’s people versus the devil. Even if we know that we are safer treating the foe as human beings, unconscious forces push us toward apocalyptic imagery of annihilating the devil.

    Time and again Americans have shown themselves willing to act out that imagery even at the risk of their own lives. They have made war against a foreign devil their highest priority, even though war meant more Americans would die.

    Perversely, that may have been the point: the more American deaths, the stronger the conviction that the enemy was indeed a devil, the absolute opposite of our own national goodness. So more American deaths became the surest way to create an illusion of unified national identity, an illusion that allowed us to avoid yet again the troubling uncertainty about what it means to be an American.

    This pattern has typically been most popular among political conservatives. But at least since Woodrow Wilson led us into World War I, modern liberals have also been prey to its seductive appeal.

    That appeal is all the stronger when the belief reigns across the political spectrum -- as it has since FDR led us into World War II -- that our enemies aim to conquer and destroy our entire nation. Today, conservatives like Pat Buchanan openly call the Islamic State “devils who came to kill us,” triggering that fear for the nation’s very existence (even though the IS obviously has nowhere near the capacity for such total devastation). Such claims stand as further proof that the enemy is an inhuman devil. That familiar sense of homeland insecurity becomes a stronger moral justification for annihilating the foes. And it makes the conviction of clearly defined American identity, unity, and absolute goodness all the stronger.

    We need only recall the days after the 9/11 attack to understand how reassuring all this could feel.

    The nation’s leading liberals are obviously concerned about the anxiety of identity. But when they address it directly, they are almost always talking about taking in refugees. “Slamming the door in their faces would be a betrayal of our values,” the president says. And to bar only Muslim refugees is “not American. That’s not who we are.”

    Hillary Clinton reads from the same script: “Discriminating against Muslims, slamming the door on every single Syrian refugee—that's just not who we are. We are better than that." So does Bernie Sanders:  “We will not turn our backs on the refugees ... We will do what we do best and that is be Americans.”

    Our top-ranking liberals obviously want all Americans to agree on just what it means to be an American. But using the refugee issue to promote a sense of clearly defined, enduring national value isn’t working very well. It just ends up fueling a more divisive debate.

    These liberals may not realize the lesson of history: The words most likely to cement a stronger consensus on American identity and values are their words advising us to use military force against the IS. Public opinion will give liberal candidates more praise than blame, and very possibly more votes, when they pledge to go on dropping bombs, even though those bombs are the most powerful recruiting posters for the IS. So, however unintentionally, the top liberals are reinforcing the old pattern.

    If IS attacks continue we may discover a bitter truth: Beset by so much confusion and dissension about what it means to be an American, the people as a whole may not make their own safety their highest priority.

    The people may once again be willing to sacrifice American lives on the altar of an illusion of a unified, permanent national identity. They may be willing to offer that ultimate sacrifice to avoid accepting what may be America’s fate, or indeed America’s privilege: an insoluble uncertainty, and a conversation that goes on without end, about who we, as a nation, really are.
