Jim Loewen blog, History News Network

Taking 9/11/2001 Seriously on 9/11/2011

(1) Newsweek, 3/31/2008, 23.

(2) Chalmers Johnson, "Intellectual Fallacies of the War on Terror," TomDispatch, at tomdispatch.com/post/174852/chalmers_johnson_12_books_in_search_of_a_policy  (10‑22‑07), reprinted at HNN. 

________________

Copyright James Loewen

"New Beginnings" at the AASLH The AASLH (American Association for State and Local History) just concluded its annual meeting, held in Richmond, VA.  Signs of "new beginnings" in local history—a phrase used in the conference title, abounded, both at AASLH and in Richmond.  Just in time, too! 

My book Lies Across America:  What Our Historic Sites Get Wrong has at least one entry for every state.  Virginia received eight, more than any other state, and Richmond supplied four of those.  I visited the city several times in the late 1990s and found it overrun by a neo-Confederate interpretation of its past.  Since then, new voices have famously contested Richmond's past.  Arthur Ashe got added to Monument Avenue.  A good historical marker went up, telling the story of Elizabeth Van Lew, spy for the Union.  An amazing monument telling about Virginia's massive resistance to school desegregation and the courage of black students in bringing the case now stands on the Capitol grounds.  A statue of Lincoln and his son Tad, commemorating their bold walk through Richmond shortly after its surrender, drew protests from the Sons of Confederate Veterans but remains on the landscape at Tredegar Iron Works.  Tredegar also boasts a new Civil War museum, telling the story from three viewpoints:  Confederate, Union, and African American. 

AASLH is also changing.  I was the AASLH banquet speaker last year in Oklahoma City and found that almost 80  percent of that national audience believed the Southern states had seceded "for states' rights."  That kind of traditional thinking seemed missing in Richmond.

AASLH built Richmond's sites into its program.  For example, on the first day of the conference, Richmond natives Sylvester Turner and Cricket White led a tour titled "Walking Through History, Honoring Sacred Stories."  We began at a landing point on the James River where ships disembarked enslaved Africans.  Turner minced no words, reminding us of the literally putrid condition we would have been in as we made our way to the shore.  Then we walked perhaps half a mile holding hands, simulating a coffle, stepping slowly to accommodate our slowest member, likely a child.  Mosquitoes attacked us, requiring us to cooperate to brush them off or to drop character and hands and swat them; either response helped us feel the discomfort members of the coffle would have experienced.

Hope in the Cities organizes these walking tours at least twice a month, usually for school groups, but also for as many as 150 adults.  Individuals can also walk the trail, however, owing to about sixteen historical markers with extensive text and illustrations.  Again, these are hard-hitting; one heading, for example, was "Despair."  Cricket White's husband Ralph White, who runs the James River Park System, wrote a booklet, "Seeing the Scars of Slavery in the Natural Environment: An Interpretive Guide to the Manchester Slave Trail Along the James River in Richmond," that the Park System put out in 2002.

At a turn-around point, Turner noted that coffles sometimes had to walk from Richmond all the way to Natchez, MS.  Our destination, which we reached by bus, was Robert Lumpkin's slave yard and jail near Richmond's Main St. railroad station.  Lies Across America told the story of Lumpkin, one of the biggest slave dealers in the U.S., and lamented that nothing on the Richmond landscape memorialized any form of the slave trade.  In 2008, Richmond hired archaeologists to explore Lumpkin's property, called by African Americans in 1850 the "Devil's Half Acre."  They unearthed many objects, a beautifully paved yard, and foundations of the jail and other buildings.  To preserve it, they covered it all back up, but three historical markers tell its story.  Nearby is one of the few commemorations anywhere in the world of the triangular trade, from West Africa to the U.S. (and the Caribbean) and the United Kingdom.  This is a sculpture, "Reconciliation," unveiled in 2007 before a crowd of 5,000 people.  Also spearheaded by Hope in the Cities, similar monuments stand in Benin and Liverpool.

More traditional conference fare was a panel the next day, "Interpreting Divergent Voices and Challenging Narratives."  Although traditional in form, it was innovative in content.  One speaker told how an upper-class home in Richmond, complete with gold faucets and silk wallpaper, now narrates the story of "the help" — years before the recent bestselling novel.  Another told of Colonial Williamsburg's tentative beginnings toward interpreting Native Americans, surely overdue, since Native tribes from as far away as present-day Pennsylvania and Ohio came to the town to treat with the English.  The room was full.  Other conference panels had titles like "Remembering Even When It Hurts" and "Programming Outside the Civil War Box."  These sessions too drew large audiences.

Several sessions focused on the Civil War, an obvious choice, given the year and locale.  I organized a session, "Secession and the Confederacy: Issues for Local History Sites," that was well-attended.  I presented the discouraging results of my widespread polling about the cause(s) of secession.  (See my Washington Post piece, reprinted at HNN.)  Dwight Pitcaithley, former Chief Historian of the National Park Service, reported that of the 65 constitutional amendments proposed in 1860-61 to defuse the crisis, 95 percent dealt with slavery, providing additional evidence that the maintenance and extension of slavery was indeed the leading cause of secession.  John Coski of the Museum of the Confederacy in Richmond explored some of the problems facing museums as they try to tell this story accurately.  He also noted that as they moved beyond their K-12 schooling, some adults, including site managers, felt a need to move beyond "slavery" as "the cause" of secession, leading them astray. 

There were three keynotes.  Adam Goodheart, journalist and author of the new book 1861, suggested historic sites needed to "complexify" their narratives.  Since he included few specifics, this advice could lead to mischief, as Coski pointed out.  Dorothy Cotton, formerly education director for the SCLC, gave an autobiographical talk that shaded into a civil rights rally, to the delight of many members of the audience.  Ed Ayers, historian and new president of the University of Richmond, spoke on the Civil War, emphasizing emancipation and pointing out that we must make even our newest immigrants think of it as "their" history, leading to rights and conflicts that still affect all of us.  Applause interrupted him twice before he finished.

Local history is no longer the intellectual backwater that many academic historians formerly assumed.  Many site managers pine to discuss historical issues, and academicians who pine for engaged readers need to consider composing exhibit narratives as well as pedagogical monographs.  The twain can meet at places like AASLH.

Richmond, too, is no longer the intellectual backwater that it used to be.  During my three days there, the only lack of candor I saw was near the beginning of the slave walk.  It begins at the river, just below a sewage disposal plant.  Richmond calls the plant its "Excess Nutrient Treatment Facility."  Of course, like everywhere else, shit happens in Richmond.  What we learned at the AASLH conference is that in the area of public history, Richmond is finding the courage to face its past honestly—all of it.

________________

Copyright James Loewen

Rick Perry's "Niggerhead" Camp Is Only Part of the Problem

On Sunday, October 2, a front page story in the Washington Post told of Gov. Rick Perry's hunting camp, a place known as "Niggerhead."  For many years a large flat rock stood upright at its gates, announcing the name in painted letters.  That rock is still at the entrance, now lying on its back, parts of the name still visible, painted over ineffectually. 

The camp has been important to Perry's political career.  Perry often hosted friends and supporters and fellow legislators there for turkey shoots and other outings.  Now Perry implies that he first saw the rock with its offensive name only in 1983 and immediately got his parents to paint over the letters.  As Post reporter Stephanie McCrummen delicately phrases it, Perry's version

differs in many respects from the recollections of seven people ... who spoke in detail of ... seeing the rock with the name at various points during the years that Perry was associated with the property.

The seven saw the sign in place and unpainted much later, even as late as 2008.

The name predates the Perrys' ownership.  Apparently it refers to the larger pasture area.  The sign at his hunting camp isn't the only racist sign in Throckmorton County, where the camp is located, however.  Throckmorton, the county seat, reportedly posted a sign at least as late as the 1950s that said, "Nigger, don't let the sun set on you in this town," according to a person who went to high school in Throckmorton at that time.  In 2006, another Throckmorton native emailed me, "It was common knowledge throughout that part of Texas that African Americans were not welcome in Throckmorton County."  In 1953, a nearby white high school football team played Throckmorton High School, but because the visiting team employed a black trainer, the team and its trainer had to have a police escort to and from the stadium.  The county did not have a single black household in it from 1930 into the new millennium. 

In short, Throckmorton County was a "sundown county."  The term is common in Texas and the Midwest and some other parts of the country.  Except in Texas, the Ozarks, Appalachia, and along the "outside" of Florida, sundown towns are rare in the South.  Sundown towns and counties are much more common in the Midwest, Oregon, and other parts of the North.  In some parts of the country, such as Oregon and Pennsylvania, towns that were all-white on purpose were numerous, but the term "sundown town" was not used. 

The key questions to put to Governor Rick Perry are two:  When did you learn that your camp was in a sundown county?  What did you do about it? 

Every sundown town or county needs to take three steps to get over it:

— Admit it.  We did this.  We kept out African Americans (and/or Jews, Chinese Americans, Native Americans, etc.).

— Apologize.  We did it, and it was wrong, and we're sorry.

— And state:  "but we don't do it any more."  That last step needs to have teeth:  We now have a racial ombudsperson, or a civil rights commission.  We are hiring affirmatively for our K-12 teaching staff, our police force, our trash collectors.

Absent these steps, African Americans have no reason to believe they can prudently move to Throckmorton County.  In the distant past, perhaps in the late 1920s, whites are said to have lynched an African American who had allegedly killed a white person; the lynchers were never brought to justice.  As recently as 1995, several African Americans who came to a funeral — without even staying the night — caused a stir among the "keepers of the flame," as a Throckmorton native termed them.  The 2010 census shows eleven African Americans, so the county may have "broken," but household data do not seem readily available yet.  Absent the three steps, the small thug minority that exists almost everywhere in the world can think it their business to make life unpleasant for the few African Americans who may have ventured in.

Did Rick Perry, before or after becoming governor, try to get Throckmorton County to take any of the three steps?  As governor, he oversees the distribution of state funds and programs to Throckmorton County.  Tax dollars from African Americans as well as nonblacks make these programs possible — yet they go to locales that have had a policy of forbidding African Americans from living in them.  What does Governor Perry think of this?  Is it like using federal monies to fund abortions, even though some of the people paying taxes oppose abortion?  Or is it okay?

________________

Copyright James Loewen

Going Postal History

Just now, your local post office—easier to find than it will be next year, when the Postal Service plans to close as many as 3,600—features a stamp of Owney, a dog.  He appeared in the Albany, NY, post office in 1888, where "clerks took a liking to him," according to the history that the USPS supplies on the back of each sheet of Owney stamps. 

Owney followed mailbags onto trains, where Railway Mail Service employees considered him their good-luck charm.  As Owney traveled the country, clerks affixed medals and tags to his collar to document his travels.

The Postal Service goes on to tell how the Postmaster General John Wanamaker gave Owney "a special dog-sized jacket to help him display them all."  He wound up with between 400 and 1,000 tags, far more than could fit on the jacket.  Later, Owney "toured the world by steamer and became an icon of American postal lore."  The account on the stamps ends with this happy conclusion:  "Today he enjoys a place of honor at the Smithsonian Institution's National Postal Museum in Washington, D.C." 

Such a cheerful story.  In those years (and for decades thereafter), clerks rode the rails, sorting the mail in special cars while the train was moving.  Trains even picked up mail from trackside poles without stopping.  The system was very efficient.  And the story gets even happier:  in an era when train wrecks were all too common, no train Owney rode was ever in a wreck.  Postal workers came to see him as a good luck charm.  During his tour around the world in 1895, he met the emperor of Japan among other notables.  Briefly, he was the most famous dog in the world and lent his charisma to dog shows by making guest appearances. 

But that's not the full story. 

In April 1897, the Superintendent of the Chicago mail district forbade Owney from riding the rails any more.  His edict was unkind:

If the dog were in any wise remarkable for his intelligence, there might be some reason for paying attention to him.  He is only a mongrel cur, which has been petted until the thing has become disgusting.  His riding around on the postal cars distracts the attention of the clerks, takes up the time of employees at stations in showing him around, and it is about time he is kicked out.

Nevertheless, Owney took one final ride.  On June 11, 1897, now perhaps seventeen years old, Owney rode the rails to Toledo.  While he was there, a postal clerk tried to chain him for a photo opportunity, and Owney bit him.  The postmaster had a local gendarme shoot him, still chained. 

The postal service knows the full story, of course.  So does the Smithsonian, which now displays Owney at its Postal Museum.  But apparently the stamp-buying public does not need to know.  Neither does the museum-going public.  The museum displays Owney in a prime location, near the entrance, where no visitor can easily miss him.  It used a $10,300 grant and additional donations to pay for his makeover, just in time for the new stamp.  He got a new hand-sculpted snout, new eyes and claws, and pieces of coyote pelt to patch up some bald spots.  He gets a case all his own, next to a railway mail car, and a total of three different labels—but none tell anything bad.  Nor does "Owney the Railway Mail Service Mascot"—Owney's main page at the Postal Museum website — say a thing about his unfortunate demise (http://postalmuseum.si.edu/owney/index.html).  In an obscure corner of its website (http://postalmuseum.si.edu/owney/Postmasters_Advocate_Owney_article-2011-04.pdf, 9/2011), Nancy A. Pope, Postal Museum historian, tells all, although her account of Owney's demise differs from mine in a few details.  But there is no way to get from Owney’s page to Pope's article. 

"Relax, Loewen," some folks may say.  "You’re going postal.  It's just a dog, for heaven’s sake!"  Indeed, Owney was "just a dog"—and now just a stuffed dog.  But where do we draw the line?  Do we tell the unpleasant truths about, say, Woodrow Wilson?  He's long been a favorite of historians:  when Arthur M. Schlesinger asked 75 leading historians to rank the presidents in 1962, they listed Wilson fourth, ahead of Jefferson.  So let's write postal history about Wilson.  And so it is that only two of eighteen textbooks that I surveyed for Lies My Teacher Told Me even mention that Wilson authorized a naval blockade of the Soviet Union and sent troops to Archangel, Murmansk, and Vladivostok to help overthrow the Russian Revolution, in concert with Japan, Great Britain, and France.  Admitting this misadventure might cast a complexifying cloud on the sunny statement that the U.S.S.R. started the Cold War in 1946. 

The unnamed and unknown minions who really write our K-12 U.S. history textbooks write postal history about almost everything.  When discussing the Vietnam War, for instance, most never mention the My Lai massacre; only one of eighteen treats it as an example of a class of events.  My Lai was no more sunny than Owney's bullet.  Can textbooks be right to leave out the cloudy parts of the Vietnam War?  Can students understand the anti-war movement in such a vacuum?  In their important book on historiography, After the Fact, James West Davidson and Mark H. Lytle agree that My Lai exemplified a larger phenomenon.  Lytle told me, “The American strategy had atrocity built into it.”  They also argue that My Lai “became a defining moment in the public’s perception of the war.”  But their textbook for high school students never mentions My Lai. 

George W. Bush likewise supplied postal history analysis.  Nine days after the 9/11/2001 attacks, he gave Congress his answer to the important question of why terrorists struck the Pentagon and the World Trade Center:

Americans are asking, why do they hate us?  They hate what we see right here in this chamber—a democratically elected government.  Their leaders are self-appointed.  They hate our freedoms—our freedom of religion, our freedom of speech, our freedom to vote and assemble and disagree with each other.

What a sunny thought:  they hate us because we are good! 

His own research led journalist James Fallows to critique this line of rhetoric:  "The soldiers, spies, academics, and diplomats I have interviewed are unanimous in saying that 'They hate us for who we are' is dangerous claptrap."  Fallows called this ideology "lazily self-justifying and self-deluding."  Later, the Pentagon itself pointed out, "Muslims do not 'hate our freedom,' but rather they hate our policies." 

Some history museums—especially small ones, like historic houses—supply postal history too.  When I toured Wheatland, James Buchanan's mansion in Lancaster, Pennsylvania, staff members specifically denied both Buchanan's homosexuality and his pro-slavery stance.  Yet both are not only fact but also related.  So tourists who visited Wheatland left stupider about Buchanan than when they had arrived.  Today the Wheatland website and the National Park Service site about Wheatland merely omit these crucial facts.  Recently, two major museums in San Francisco—the Museum of Modern Art and the Contemporary Jewish Museum—both mounted important exhibits about Gertrude Stein.  Neither found it necessary to state clearly that Stein was a Nazi sympathizer, according to Mark Karlin of Truthout.  "Among the many fascinating aspects of the Stein story," the art museum explained, "the museum hasn't seen this particular topic as especially germane to our project."  The Jewish museum, too, hid behind the claim that her Nazi sympathies weren't relevant to the art or other objects they displayed.  Unfortunately, the curators of the Stein exhibit at the Jewish museum also never made her Nazi leanings clear in their hour-long opening lecture.  It's hard to give adequate attention to such unfortunate matters—Stein's Nazi sympathies, Owney's death—while valorizing Stein (and her lover, Alice B. Toklas) or Owney. 

Deliberate omission is a slippery slope.  We need to include Owney's bullet—and the bad behavior that led to his getting it—if we teach about Owney at all.  There is no safe resting point, no bright line that tells us which truths we can tell, which we must cover up.  Americans need to know about our war on the U.S.S.R.  We need to face what we did in Vietnam—all of it.  We need to understand that Buchanan's position in the pro-slavery wing of the Democratic Party derived in part from his relationship with William Rufus King, slaveowner and senator from Alabama.  We need to realize that people can be Jewish (and homosexual) and still be pro-Hitler.

Postal history won’t do.  Indeed, because it causes us to be ignorant when we think we know, postal history is worse than no history at all. 

________________

Copyright James Loewen

Victimized by Folklore

(1) Encyclopedias were large books filled with articles claiming to cover all human knowledge.  In those days, parents trying to do their best for their children bought them, especially the multi-volume World Book Encyclopedia. 

(2) Singer's January 7, 2002, New Yorker article, "Who Killed Carol Jenkins?" set a high standard of reporting. 

(3) "Martinsville's Sad Season," Sports Illustrated, 2/23/1998, 24.

(4) Earl Woodard as paraphrased by Jeff Swiatek, "Martinsville tired of living with image of racism, bigotry," Indianapolis Star, 6/25/1989.

________________

Copyright James Loewen

Penn State and Violence Against Men

The Penn State scandal brought forth a thoughtful commentary by Daniel Mendelsohn, Charles Ranlett Flint Professor of Humanities at Bard College.  Mendelsohn begins his recent New York Times op-ed, “What if it had been a 10-year-old girl in the Penn State locker room that Friday night in 2002?”

He concludes that, in that case, Mike McQueary, the graduate assistant to the football team, would surely have intervened or at least called the police.  "But the victim in this case was a boy," Mendelsohn notes.  He goes on to speculate that the university, too, would have taken the crime more seriously, had the victim been female.

Even though we cannot know for sure without at least interviewing McQueary, Joe Paterno, Graham Spanier, and other Penn State officials, surely Mendelsohn is right.  As he puts it,

Does anyone believe that if a burly graduate student had walked in on a 58-year-old man raping a naked little girl in the shower, he would have left without calling the police and without trying to rescue the girl?

However, Mendelsohn mistakes the source of this inequity.  He locates it in the shame associated with homosexuality.  Since the rape was male on male, he opines, the victims were “somehow untouchable, so fully tainted they couldn't, or shouldn't, be rescued.”  He notes that athletics is “the last redoubt of unapologetic anti-gay sentiment.”  Of course, he has overlooked many other redoubts, such as religious organizations from Muslims and Orthodox Jews through Mormons and Southern Baptists.  But this is a quibble:  male athletics is an anti-gay redoubt, if hardly the last one.  Mendelsohn goes on to speculate that somehow this anti-gay sentiment prompted denial, converting anal penetration into mere "horsing around," in the now-notorious words of Penn State's athletic director.  Such reasoning falls short.  Of course, loyalty to a coach, to a friend, can prompt police avoidance, regardless of the sex of the victim.  However, to claim that prejudice against homosexuality promotes winking at homosexual behavior is not logical.

Besides, there's a simpler explanation.  Our society does not take violence against males as seriously as violence against females.

Look at what happens in domestic abuse cases.  Research shows that, although women are more likely to be killed, men are the victims of domestic violence about half the time.  (See, inter alia, Straus and Gelles, The National Family Violence Survey, Philip W. Cook's Abused Men, and copious studies by David Finkelhor.)  Yet most cities provide many shelters for abused women and none for abused men.  The federal government passed a “Violence Against Women Act” but no “Violence Against Men Act.”  Imagine a federal law designed to protect white victims of criminal acts while ignoring black victims!

Outside the family, the pattern continues:  in the workplace, men are more than a dozen times more likely than women to be killed.  To be sure, men also commit more than their share of workplace murders, but 90 percent of deaths on the job are accidental, not purposeful, and women's jobs are statistically much safer than men's.  Even God seems to have it in for men:  lightning strikes males seven times as often as females. 

Lightning, of course, is random, but men are much more likely to be working outside in inclement weather.  They are “supposed to”—terms like “telephone lineman” convey this expectation.  Men are also more likely to be playing outside in bad weather.  It's “not manly” to give up football or even golf just ‘cause of a little thunderstorm.  It’s also not manly to seek shelter from domestic violence.

For that matter, it's not manly to see a doctor for “just a little ache or pain.”  So it happens that women make 70 percent of all visits to doctors while men die five years earlier than women.  This difference is slightly greater than the difference race makes.  Like the racial difference, the male/female difference in lifespan largely derives from our culture, not our genes.  It has changed over time; a century ago, men lived longer than women.  Yet the discipline of sociology, which has taught us that most gender differences stem more from social causes than biological, has mainly ignored perhaps the most basic gender difference of all:  in length of life itself. 

Mendelsohn's piece about Penn State, reinterpreted, prompts us to notice what we otherwise take for granted:  folkways and mores embedded in our culture that make it all right in many families to hit boys, but not girls.  All right to require young men to register for military service, but not young women.  All right to execute male murderers while female murderers get prison terms.  If the Penn State scandal helps us take violence against males more seriously than before, perhaps that is the one good thing that can come of it.

________________

Copyright James Loewen

Can American Trains Achieve Steam Speeds in the Modern Era?

I began writing this piece aboard Amtrak's Acela, the fastest train in North America.  It travels from Washington to Boston in 6 hours and 32 minutes.  Eventually, we read, despite Republicans, we may have truly high-speed rail, linking those cities and also perhaps speeding through corridors in California, Florida, and the Midwest. 

Pardon me, but haven't we been around this track before?

I remember reading the same story back in 1965.  Then they called it the Metroliner.  It would speed between Boston and Washington at an amazing 115 mph.  I remember that story because I remember where I was when I read it.  I was on a train.  Indeed, I was riding the famous City of New Orleans between Jackson, Mississippi, and Mattoon, Illinois.  I got to the Metroliner article as we were passing through the flat corn fields of central Illinois. 

The City of New Orleans was then the second fastest train in America, after the Santa Fe Super Chief.  It streaked along the Illinois prairie at 81 mph, including stops.  Owing to holiday traffic, we were running late, so between Carbondale and Mattoon the train made up fifteen minutes.  A little long division revealed that this accomplishment required that we be traveling at about 115 mph between stops.

So already in 1965 I had a feeling of deja vu, reading about the marvelous new Metroliner.  The Metroliner went into service in 1969, the last accomplishment of private passenger rail service in the United States before Amtrak took over in 1971.  Owing to design problems with the self-propelled cars, the Metroliner never ventured north of New York City and rarely exceeded 90 mph. It averaged just 75 mph. 

Amtrak Metroliner train, 1974

In 2000, Amtrak put its Acela in service between Boston and Washington.  The new train was supposed to travel at speeds up to 150 mph and does reach that speed for two short distances.  Despite those bursts, on its journey from Boston to D.C.—456 miles—it averages just under 70 mph (78 mph for the old Metroliner part of the run, from New York City to D.C.).  If Acela merely went as fast as the Illinois Central's City of New Orleans did in Illinois half a century ago—81 mph with stops — it would reach Washington 5 hours and 40 minutes after leaving Boston, shaving almost an hour off its current schedule.  If it went as fast as the City of New Orleans did when I took it, making up time, it would arrive in Washington in just 5 hours.
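
A quick check of that arithmetic, using the 456-mile Boston-to-Washington distance given above:

\[ \frac{456 \text{ miles}}{81 \text{ mph}} \approx 5.63 \text{ hours} \approx 5 \text{ hours } 38 \text{ minutes}, \]

which rounds to the 5 hours and 40 minutes cited, about 54 minutes less than Acela's current 6 hours and 32 minutes.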

Nevertheless, Acela is an accomplishment of sorts, because it is so much faster than today's regular passenger service.  Amtrak schedules its City of New Orleans at just 64 mph between Carbondale and Mattoon owing to freight traffic and track deterioration on the Illinois Central.  From New Orleans to Chicago the fabled train averages less than 48 mph.  It went faster in the age of steam, even though it had to stop about every 50 miles for water.  The successor to the Super Chief now takes 41.25 hours to trundle from Chicago to Los Angeles, averaging 54 mph.  In 1956 it required just 37.5 hours, about 60 mph. 

Other trains are even worse.  The Vermonter averages just 44 mph and actually runs backward from Palmer, Massachusetts, to its terminus in St. Albans, Vermont, to avoid a bad patch of track.  The famed Lake Shore Limited—successor to Cary Grant and Eva Marie Saint’s favorite train, the 20th Century Limited—is limited, all right:  Passengers now climb aboard in New York City at 3:45 pm instead of 5:00 pm, and reach Chicago at 9:45 the next morning instead of 7:45.  To a business traveler, those differences are huge. 

The Lake Shore Limited entering Croton-on-Hudson, NY, 2008

The first point of this commentary, then, is not to argue for high-speed rail (that would be a different article), but simply for a return to the speeds that America's regular passenger railroads achieved at the close of the age of steam.  Then we might strive further to ramp up to the speeds of the diesel heyday.  I remain suspicious that high speed rail—trains capable of traveling at, say, 200 mph in Japan—would somehow wind up averaging maybe 80 in the U.S. ... just like the City of New Orleans in 1965. 

My second (and final) point is not to kvetch, but to coax my readers to try a train.  Last week, for example, I spoke at Notre Dame.  I took the Capitol Limited, leaving D.C. Union Station at 4:05 pm.  After passing the familiar Maryland suburbs from an unusual vantage-point, the train runs along the Potomac River, with the ruins of the Chesapeake and Ohio Canal in between.  I went to the Sightseer Lounge to catch the best view of rapids that I had canoed years ago, the Shenandoah joining the Potomac at Harpers Ferry, and the Armory, also known as John Brown's Fort.  The town of Harpers Ferry looked like a fairyland as dusk came down, and I toasted it with a Sierra Pale Ale.  Later I enjoyed a steak, cooked rare, with Amtrak's new horseradish butter sauce, baked potato, salad, and cheesecake.  The dining cars on the overnight trains bear no resemblance to the sad "AmCafes" on day-time trains such as the Northeast Regional service between Boston and Washington.  After dinner, I retired to my little compartment, did some work, read a book, then asked the porter to make up my bed for the night.  The next morning I took a shower, had cheese blintzes for breakfast, and stepped off the train in South Bend at 7:56 AM, five minutes late. 

Amtrak dining car, 2009

To accomplish such an arrival by plane, I would have had to leave earlier in the afternoon, arriving at the South Bend airport around 11:00 pm after a change of flights at O'Hare.  Then I would have had to get to Notre Dame and rent a room at their inn. 

Returning made even more sense:  my train left South Bend at 8:34 pm, perfectly timed for the end of my 6:30 pm talk.  It arrived in D.C. at 1:00 pm the next day, twenty minutes late.  Had I flown instead, assuming I had awakened at 7:00 am after another hotel night in South Bend, 1:00 pm is about when I would have arrived in D.C. anyway.  But the train was much more fun.  Always, I wind up in interesting conversations about sundown towns, chain saw sculpture ("It's not just bears anymore."), and other important historical topics.  Except during the summer, trains are also cheaper, even with a sleeper, because tickets include meals and overnight accommodations. 

No one I met at Notre Dame had ever taken Amtrak to or from South Bend.  When the NCSS, OAH, or AHA meet in DC, I rarely meet anyone from Chicago or Indiana who has come by train.  The same holds for Atlanta, South Carolina, Savannah, and northern Florida, all within convenient distance of D.C. by overnight train.  Other Amtrak overnight routes that have met my business and speaking needs include Kansas City to southern Colorado, Montana to Portland and Seattle, and central Illinois to Memphis. 

Try one!  You're helping maintain an important resource.  You're saving energy, compared to other forms of travel.  You're participating in history—often at speeds so slow they come from the 1930s, not the 1950s.  Best of all, you're having a blast. 

________________

Copyright James Loewen

Meaning in 'Pure' Music: Shostakovich's Fifth Symphony

"Stalin" by Isaak Brodsky, ~1939

By the 1930s, however, the increasingly authoritarian Soviet regime felt threatened by its artists. Or maybe Stalin, et al., simply felt that they should determine what was done in the arts as in the economy as in the political life of the country.  In any event, by the 1930s, painting had pretty much been reduced to Socialist Realism. Abstraction was forbidden. Stalin kept his eye—all right, his ear—on music, too. Even though a symphony might seem by definition apolitical, neither Stalin nor the Union of Soviet Composers thought so.

The Terror was a deliberate attempt to smash conventional social relations, again to foster the new obedient Soviet Man.  In the Soviet Union of the '30s, children informed on their parents, workers on their co-workers, and lovers on each other. Meanwhile, like slaves in the Old South, everyone had to wear a grin. "It was essential to smile," recalled Nadezhda Mandelstam. "If you didn't, it meant you were afraid or discontented."  The U.S.S.R. became a nation of masks.

In 1936, Shostakovich became "the first musician to take a blow," in the words of Russian soprano Galina Vishnevskaya, also the wife of cellist and conductor Mstislav Rostropovich. His opera Lady Macbeth of Mtsensk premiered to great popular acclaim at the Bolshoi.  A month later, Pravda, the official Communist newspaper, published a vicious attack on it titled "A Mess Instead of Music."  "The music quacks, moans, pants, and chokes," said Pravda.  The article demanded "Socialist Realism" in music, as in sculpture and the other arts. Stalin, who had attended the opera two days earlier and disliked it, probably instigated the article. Certainly everyone thought he did. 

Shostakovich in 1942

In this atmosphere of terror, Shostakovich realized that not only his career but even his life were at stake. He responded eventually with his Fifth Symphony. Before its premiere, he called it "a Soviet artist's practical creative response to just criticism," or at least signed a statement containing those words.  He also gave a private premiere to Party officials at which he told them it ended "on a joyous, optimistic plane." They bought it. Party-line critics in the U.S.S.R. developed a Hamlet-like interpretation, in which the symphony celebrates the transformation of the hero, perhaps Shostakovich himself, from alienated individuality into a triumphant identification with the State. 

Many Western commentators bought this interpretation as well. The phrase, "a Soviet artist's response to just criticism," became something of a subtitle, and a millstone in the West. Taking that at face value, Western commentators for years were not sure whether it was a good thing that Western audiences liked the work so much. They called the symphony a concession to political pressure and an example of Socialist Realism. 

The audience at the world premiere of Symphony #5 in Leningrad heard the work very differently.  The first two movements are full of unpleasant repeated notes, sarcasm, and what Ian MacDonald calls a "Stalin motif." Then comes the largo, the heart of the symphony, its lyrical grieving slow movement. "Its intensity of feeling is more nakedly direct than anything the composer had written before," according to MacDonald. It comes across like a requiem, and it was during this movement that the audience began to weep. The final movement sounds triumphant, but only on its surface.  As Vishnevskaya put it in her autobiography, "beneath the triumphant blare of the trumpets, beneath the endlessly repeated A in the violins, like nails being pounded into one's brain—we hear a desecrated Russia..." She goes on to describe what happened next, at the premiere:

Each member of the audience realized that it had been written for him and about him. And the people reacted. They jumped from their seats shouting and applauding, and continued for half an hour, expressing their support for the composer....

A more complex view of Shostakovich surfaced after his death in 1975, particularly with the release of Solomon Volkov's book Testimony in 1979. Volkov claimed Shostakovich dictated or at least read every page. In his 1990 book The New Shostakovich, Ian MacDonald summarizes the controversy about that claim. He concludes that Volkov got Shostakovich right overall, even if Testimony is not by the composer. 

I have long been interested in whether and how instrumental music, that most abstract art form, can convey ideas. Shostakovich's Fifth seemed to invite a test of some sort. Accordingly, some years ago I played it to a class of advanced undergraduates at the University of Vermont—not music majors but students in sociology and education.

I set it up as a lab experiment. One third of the students read program notes that described the symphony as Socialist Realism—the triumph of the New Man. Another third read notes based on Vishnevskaya's memoir, describing the work as "a huge complex of human passions and sufferings." The final third received a neutral description, noting its four movements and telling about its instrumentation.

The entire "laboratory" was new to most of my students, who had never listened to a full symphony before. It's astounding to realize how insulated most young adults are today from classical music.  At the time (1994), the University of Vermont was the most expensive state school in the United States and drew a student body from the top end of the national income structure. They came largely from the suburbs of Boston, New York, Philadelphia, and Washington, D.C. Even from these elite families in metropolitan locations, many had never attended an orchestral concert. Not one in a hundred knew that Beethoven wrote an "Emperor Concerto." Nobody knew anything about Shostakovich. 

But they listened. Indeed, they listened well. When that final thundering tympani blast had faded to a distant echo, I asked them all to write down their impression of what the music was about. The third group, with neutral program notes, spoke first. To my surprise, they told of the anguish of the music, of passion and suffering, agreeing with the Vishnevskaya notes they had not seen. Indeed, even students who had received notes describing the symphony as a Socialist Realist triumph were converted by what they heard into a more tragic interpretation.

You can perform this experiment at home. Find someone you love—yourself, if you don't already know this symphony—and give them a CD of it—perhaps conducted by Rostropovich, a close friend of the composer. In the name of novelty or appreciation of another culture or Christmas, encourage them to listen to it, all the way through, doing nothing else, volume up high. Then ask them what it was about.

Modern Russian stamp commemorating Shostakovich

My hope is that they'll know, too, and that having heard it once, they'll want to hear it again. Why?  Because it happens in life that we all have terrible times—maybe not so bad as the Stalinist Terror, but tragic enough to us, all the same. Music that speaks honestly to us at such times is worth a great deal. Shostakovich obviously thought so—enough that he risked his life to give it to us. 

________________

Copyright James Loewen

We Have Had a Gay President, Just Not Nixon

In his recent Washington Post article "Was Nixon Gay?" journalism professor Mark Feldstein puts down the claim made by Don Fulsom in his book Nixon's Darkest Secrets. Almost no evidence supports it, he (rightly) points out.  He calls the book a "pathography," using a term invented by Joyce Carol Oates for biographies that emphasize the pathological. 

But then he goes on to dismiss similar claims about Abraham Lincoln, James Buchanan, and J. Edgar Hoover as equally baseless.

Just as we should not rush to believe all the rumors about the sexual orientations of important past Americans, neither should we rush to deny them. Feldstein implies that history on this issue is just about impossible: "[T]here is almost no way to prove—or disprove—alleged intimacies from so long ago." But there is. It is called evidence.

Consider the case of President Buchanan. For many years in Washington, he lived with William Rufus King, Senator from Alabama. The two men were inseparable; wags referred to them as "the Siamese twins." Andrew Jackson dubbed King "Miss Nancy"; Aaron Brown, a prominent Democrat, writing to Mrs. James K. Polk, referred to him as Buchanan's "better half," "his wife," and "Aunt Fancy." When in 1844 King was appointed minister to France, he wrote Buchanan, "I am selfish enough to hope you will not be able to procure an associate who will cause you to feel no regret at our separation." After King's departure, Buchanan wrote to a Mrs. Roosevelt about his social life:

I am now "solitary and alone," having no companion in the house with me.  I have gone a wooing to several gentlemen, but have not succeeded with any one of them. 

King and Buchanan's relationship, though interrupted from time to time by their foreign service, ended only with King's death in 1853. 

I find the evidence for Buchanan's homosexuality, summarized above and presented in greater detail with footnotes in Lies Across America, persuasive beyond a reasonable doubt. That is the standard for a criminal conviction. Homosexuality is no longer a crime, however, at least in most of the country, and should never have been in the first place, so the criminal standard should not apply anyway. Surely these facts surpass the "preponderance of the evidence" standard required in civil trials—and for good history.

Does it make any difference? In Buchanan's case, almost surely it does. He became a stalwart of the radically pro-slavery wing of the Democratic Party. He supported U.S. expansion into Cuba because he worried that Spain might abolish slavery, leading to another black-run nation like Haiti. He appointed Howell Cobb Secretary of the Treasury; Cobb later became the first president of the Confederacy, before Jefferson Davis. He let his Secretary of War, John B. Floyd, soon to become a Confederate general, ship nearly 200,000 rifles to Southern arsenals. In 1860, Buchanan vetoed a homestead bill making Western land available cheaply from the government. He did so because planters opposed it: homesteading would have drawn free men, probably free-soil men, to the West. Homestead legislation had to wait until halfway through Lincoln's administration. 

Buchanan's faction's newspaper, the Washington Union, even pushed for the United States Supreme Court to take the Dred Scott decision one step further. Dred Scott required the United States to guarantee slavery in every territory, regardless of the wishes of its residents. Buchanan's paper argued that the United States should guarantee its citizens the right to carry their property—all kinds of property—in any state as well. It flatly came out against the very existence of free states: "The emancipation of the slaves of the northern States was then, as previously stated, a gross outrage on the rights of property, inasmuch as it was not a voluntary relinquishment on the part of the owners." [their italics]

Yet Buchanan hailed from Lancaster, Pennsylvania, surrounded by Mennonites and Quakers, the most anti-slavery white neighbors one could imagine. His own church, the Presbyterian, refused him membership for years because of his pro-slavery views. Coming from such a background, why would Buchanan endorse such a position? Surely his pro-slavery politics stemmed, at least in part, from his 23-year connection with King. Certainly Buchanan thought highly of King: "He is among the best, purest, and most consistent public men I have ever known, and is also a sound judging and discreet fellow."

Buchanan's sexual orientation matters in another way, too. It's important for all Americans to realize that gays (and now lesbians) can be president, indeed, can play all sorts of important roles in American society. The best way for us to realize that is by understanding the roles that gays and lesbians have played. Our greatest poet, Walt Whitman, was gay (and as with Buchanan, it influenced his work). Gays have been major composers—Aaron Copland comes to mind first, and although I cannot hear how his homosexuality affected "Appalachian Spring" or "Symphony #3," again, it's important to know it. Otherwise, we may conclude that gays and lesbians have made little impact in our past, so they hardly matter. 

Late in his article, Feldstein also denounces conspiracy theories, including those that have sprung up around the assassination of John F. Kennedy. He implies that they are equally baseless.  He needs to visit the Sixth Floor Museum in Dallas, which does a model job of presenting the various major theories of who shot Kennedy. The museum does not suggest that all conspiracies are equally likely. But neither does it say that Lee Harvey Oswald, acting alone, killed Kennedy, and then Jack Ruby, acting alone, killed Oswald. It presents the evidence. 

Feldstein complains that revisionism can be "oblivious to facts." So can put-downs of revisionism, as Feldstein lamentably demonstrates. Throwing up our hands with the excuse that "there's no way to know" simply won't do.

________________

Copyright James Loewen

Project Censored At Home And Abroad

Recently I received an email from Peter Phillips, the president of the Media Freedom Foundation, also known as Project Censored. With Mickey Huff, Director of Project Censored, he co-edited Censored 2012, their annual account of news stories that received little or no coverage in the previous year. Phillips titled his email "Cuba Sets a Global Example for the Achievements of Socialism." His article with the same title is the current lead item at the Project Censored website.

It's a curious antique, redolent of leftist writing in the U.S. 40 years ago, and is perhaps instructive as well. The title is a fine example of that style of "news" writing found even further back in Pravda and Izvestia, known as "socialist realism." 

My first reaction was: "a global example of socialism?" Isn't that a euphemism? Isn't Cuba an example of Marx's dictatorship of the proletariat, a one-party state, a.k.a. communism? Certainly it's not an example of democratic socialism like, say, Sweden. Indeed, isn't Cuba just about the only remaining example of Marxist socialism, other than perhaps North Korea? 

Reading further, it became clear that the story was actually an effusive account of a nine-hour conference held in Havana on February 10, 2012, titled "Intellectual Encounters for Peace and the Preservation of the Environment." Attending were "some 120 authors, professors, and journalists," reported Phillips, rather breathlessly, "from dozens of Caribbean, American and African countries." According to Phillips, Fidel Castro, "(age 85)," addressed the group on a number of topics, ranging from the need to have gold or other assets backing up paper money to the threat to the environment posed by "neo-liberal capitalism." In Phillips's words, "Castro's main message was clear. Cuban socialism is an international example of a humanitarian economy in the world."

Not a single word of criticism of Castro marred the entire essay, which totaled nearly 1,000 words. On the contrary, Phillips's tone was fawning. Note this sentence, for example: 

Fidel Castro, reverently referred to as "Commandante" by many of those present, was flanked by the Cuban Minister of Culture, Abel Prieto, and the president of the Cuban Book Institute, Zuleika Romay.

Let us pause to imagine what Phillips and Project Censored would say if, during the last administration—or for that matter, our current administration—professors and journalists referred to George W. Bush—or Barack Obama—as "Commandante." Unless the reference were satirical, appearing in The Onion, say, or on The Daily Show, Project Censored would surely be outraged. 

About Castro, he is merely obsequious, sycophantic, "honored to participate in the discussions held with the 'Commandante.'" Indeed, Phillips finds only marvelous things to say about Castro:

His energy is inspiring and his command of history and contemporary issues is phenomenal. Castro had serious health issues a few years back, but remains mentally alert. He walked with assistance from his bodyguards, but remained fully participatory in the nine-hour session.

One is reminded of the story put out by propagandists of the Chinese Communist regime in 1966 claiming that Chairman Mao in his seventies had swum nearly ten miles down the Yangtze River in just over an hour. This verges on the much lamented "cult of personality," a characteristic of Marxist socialist societies most recently on exhibit in the funeral of Kim Jong-il in Pyongyang, capital of the "Democratic People's Republic of Korea." 

Just as he cannot find anything wrong with Castro, Phillips cannot find a thing wrong in Cuba. He quotes Fidel: "We have over 80,000 doctors." Now, Cuban medical care is indeed a wonder, as is its medical education. What about its journalism, its media? Surely that topic would interest an organization that calls itself "Media Democracy in Action." But no, the article says nothing about censorship in Cuba. Apparently Project Censored is only interested in censorship when it takes place in the U.S. and other capitalist nations.

So was the conference. In Phillips's words:

The lies and propaganda of the corporate/capitalist media were important themes for the day. One participant remarked how the global corporate media seeks to create a monoculture of the mind inside the capitalist countries.

Now, let us not lampoon Project Censored or the other leftists who attended this conference. Rather, I wish to make several more general points.

First, Project Censored, located at Sonoma State University in California, does excellent work critiquing the United States. It is incapable of critiquing "socialist" societies. So were many leftists during the 1960s and '70s. I remember a friend in Mississippi, an innovative worker against its system of racial segregation. He also developed important critiques of various policies of the federal government. But when we discussed East Germany, for example, he defended even its policy of shooting citizens if they tried to flee their "socialist paradise." Phillips verges on this position, noting without criticism that only in the 1990s, "Cuba opened it doors to those who wanted to leave." 

Phillips goes on to minimize the exodus from Cuba:  "Some 30,000 people choose [sic] to move to the United States. Yet, ten million people choose to stay and build the independent socialist country that Cuba is today."  Actually, the Pew Trust notes that more than 250,000 people who left Cuba after 1990 live in Florida alone. Across the U.S., about 1 million people claim to have been born in Cuba. I have no idea where Phillips got his number, but the actual outflow was at least thirty times larger. So Phillips engages in censorship or distortion of bad news about a "socialist" country.

Second, Project Censored and other writers of the same political persuasion seem oblivious to what their fate would be, were they Cuban. Not for a moment would Castro's "socialist" government permit anything like Project Censored. Phillips would last about a week in Cuba, once he started to point out its unreported or underreported stories. Like Lenin and Stalin before him, Castro has openly stated that Cuban education, media, and cultural activities sought to create a new socialist man, "a monoculture of the mind." To his credit, Phillips would not fit in.

Third, it is surprising that a project that focuses on censorship and the First Amendment in the U.S. does not even notice the complete absence of First Amendment rights in another country. Indeed, it's so surprising that it calls into question the objectivity of the project's work in this country.  Commentators of various political persuasions have already questioned that objectivity. Phillips is aware of the controversy. Interviewed by KC Active late in 2009, he responded to "long-time critics who claim that Project Censored is a left-leaning organization. Nothing could be further from the truth." Then he shoots himself in the foot with his own email from Havana.

About Cuba: its government is indeed a dictatorship of the proletariat.  It is a dictatorship, and it is of the proletariat. It has the health care, educational system, and some other benefits that go with the latter, and it is repressive like the former. To recognize the one without the other is bad scholarship, plain and simple, “BS” for short. Of course, "Cuba Sets a Global Example for the Achievements of Socialism" is not really scholarship at all, or even reportage. It's advocacy, plain and simple. It does not try to be accurate.

Even the notion that setting up a country so the same person can remain in charge for 49 years—and then be succeeded by his younger brother—might be problematic never occurred to Phillips. One wonders what Phillips thinks of Robert Mugabe, who passed his 88th birthday a few days ago and is in even better shape than Castro. Mugabe's Zimbabwe also claims a Maoist heritage and, like Cuba, initially emphasized health and education. Probably, like many leftists, he would say that Mugabe strayed from true socialist principles. Surely Mugabe did. The point, however, is that when the leader of a Marxist socialist state strays, its people have little recourse. That's the key problem with these states, and it is structural, sociological, not psychological. That problem is the most important single fact about Cuba. Peter Phillips is a professor of sociology at Sonoma State, but he missed it.

________________

Copyright James Loewen

George Zimmerman, Trayvon Martin, and Me I write late in the evening on Tuesday, April 10. This morning I woke up famous, at least in certain circles. George Zimmerman, famous for killing Trayvon Martin in Florida, had cited me on his new website. Correspondents rushed to tell me. His site was receiving so much traffic that it took seven minutes to log on to his home page. I could not reach any subsidiary page, specifically the page titled "The Facts," where I had been told Zimmerman prominently displayed my words, until late in the morning. 

Many other websites had picked up my quote, however.  According to "George Zimmerman Launches Website to Fund Legal Costs," an unsigned article at JD Journal, a site whose motto reads "Nothing but the Truth,"

The site carries a quote from sociologist James W. Loewen: "People have a right to their own opinions, but not to their own facts. Evidence must be located, not created, and opinions not backed by evidence cannot be given much weight."

At their websites, MSNBC, CBS, and many other news services also included the quote.  CBS termed it "a philosophy attributed to sociologist James W. Loewen." By mid-afternoon at least 427 sites, from the New York Times to the "Brother Of Yeshua Blogspot," included the quotation. 

I'm not the only person Zimmerman quoted, but I'm the only living person. He also included a famous sentence by Edmund Burke: "The only thing necessary for the triumph of evil, is that good men do nothing." A jury might take this quote to be a rationale for his vigilante activism toward "evil." Hence Zimmerman's attorneys might well have been unhappy with this posted quote, even before they resigned as his counsel later in the day. However, the Burke sentence pales compared to the macabre connotation of his second quotation, by Henrik Ibsen: "A thousand words will not leave so deep an impression as one deed." Again, his lawyers could not have been happy that Zimmerman posted this sentence, since a jury might infer that he wanted to make a "deep impression" by committing a dastardly deed. Lastly, Zimmerman quoted Thomas Paine: "The world is my country, all mankind are my brethren, and to do good is my religion." 

I'm happy to be in the company of Burke, Ibsen, and Paine. Who knows?  Maybe Zimmerman will get me into Familiar Quotations. (I had hoped that my one-liner, "Those who don't remember the past are condemned to repeat the eleventh grade," which I used at the beginning of Lies My Teacher Told Me, might make the grade, but so far, only three websites use it.)  However, the first of my two sentences was said in essence by Bernard Baruch in 1950, by various folks since then, and probably by others long before. So I think I must search elsewhere for my fifteen minutes of fame. 

Still, it was jarring to see my name and quotation behind the talking heads Tuesday as they told the story of Zimmerman's lawyers' resignations on the evening news. I'm not happy with being used as a resource by George Zimmerman, and I disclaim any relationship with him and his cause. Of course, once they have unleashed words upon the world—in particular, upon the World Wide Web—authors have no control over their use, for good or ill. Moreover, one reason why I have not written a thing about the death of Trayvon Martin is my lack of facts. I know only what I have learned from the newspapers (yes, I subscribe) and other media. Anyone likely to read anything I might write about the matter has already read the same sources. 

I would like to know how George Zimmerman learned of my words that he used.  They appear on page 358 of Lies My Teacher Told Me. While I would like to believe he read the entire book, if he did, he seems to have missed its anti-racist central message. 

When it comes to “Brother Of Yeshua,” who actually emailed me Tuesday morning to tell me he had used the quotation, I think it's safe to infer that he first encountered my words at Zimmerman's website or news sites that quoted it. Again, he has a (Constitutional) right to use my words to support any position he wants, and here is what he used them for: 

When rightly understood, what we are presented with is the manifestation of the statement by James W. Loewen that while "People have a right to their own opinions, but not to their own facts. Evidence must be located, not created, and opinions not backed by evidence cannot be given much weight"—and while the facts demonstrate that Mormonism is actually closer to the original Gospel teachings and objectives, mainstream Christianity has been in denial of the very facts that they have long censored and remain in denial of, to the degree that those who believe they are Christian, have been spiritually disenfranchised by the very Church they look to for truth.

So now my words are invoked to support belief in Mormonism as well as George Zimmerman's innocence!

"Brother Of Yeshua" writes further, "2000 years ago I lived as Jacob who people call James, and was known as the Brother of Yeshua/Jesus." Such a statement does not carry the weight of fact. Elsewhere on his site, he states that he holds to "Religion As The True System Of Education"—again at odds with education based on fact. 

At some point, I should relate all this to the study of history as taught in our K-12 schools—on which I've spent much of the past twenty years—so let's do so now. One reason why many Americans are not critical readers and do not insist upon facts stems from their history textbooks. Bear in mind that five-sixths of all Americans never take a history course after leaving high school. High school history textbooks include no footnotes or other system of references. Moreover, even when issues remain contested, such as when and how people first got to the Americas, textbooks cite no evidence—in this case, from archaeology, human biology, or anthropology. They just go on blandly relating certainties, even on topics still ruled by uncertainty.

Moreover, if Allan Cronshaw in Graham, North Carolina, writes as "Brother Of Yeshua," that's not so different from what happens in the K-12 textbook world. There, unnamed gnomes deep in the bowels of the publishers write in the names of Daniel Boorstin, Alan Winkler, and many other famous historians whose names grace the covers of books they didn't write.

Nor does the style of history textbooks—written in a monotone, presenting "information" to be memorized—promote critical thinking skills or prompt students to question sources. Such skills might have induced Mr. Zimmerman and others not to profile young African American males, which—this much seems factual—he seems to have done. 

Then there is Zimmerman's use of the American flag on his website. He wraps himself in the flag to stop thought, not to start it. All six of the twenty-first-century textbooks that I analyzed for the new edition of Lies My Teacher Told Me similarly wave the American flag on their covers, and for the same reason: to quell critical thinking. Publishers wave it so prospective purchasers will not question them or doubt that they are "good Americans." If instead these books would distinguish between patriotism and nationalism, their flag-waving might be different. I take my definition of a patriot from Frederick Douglass, who said, "For he is a lover of his country who rebukes and does not excuse its sins." Surely textbooks need to help students to develop informed reasons to criticize as well as to take pride in their country. Nationalists, on the other hand, take pride in their nation no matter what—and do not care to think about its sins. If textbooks made that useful distinction, then Americans might not "follow the flag" even when our leaders take it into dangerous places on behalf of foolish and even immoral purposes. If the flag connoted "do your best critical thinking about the U.S.," then when politicians, vigilantes, and textbook authors waved it to garner unthinking approval, the rest of us would simply laugh at them.

I believe—at least I hope—that the millions of people who came upon my statement comparing facts and opinions Tuesday do not infer that I am George Zimmerman's ally. I am not. Rather, I hope that Americans will ground their opinions about this case on the facts. We all surely hope that a process has finally been set in place that will allow the facts to emerge. Meanwhile, those of us far from Sanford, even far from Florida, must set processes in place that will transform how we teach about the American past in grades K-12. When we allow facts to emerge—even awkward and untoward facts—when we encourage students to question national and local policies—and yes, when we insist that "opinions not backed by evidence cannot be given much weight"—then we are educating. Then we are producing Americans who are unlikely to profile. Then we are patriots.

________________

Copyright James Loewen

Now It's Obama Who's Our First Gay President! The new issue of Newsweek features a cover photo of President Obama topped by a rainbow-colored halo and captioned "The First Gay President." The halo and caption strike me as cheap sensationalism. I realize airport travelers look at a magazine for 2.2 seconds before moving on to the next one. I grant that this cover will probably get Newsweek a 4.4 second glance. I also understand that Newsweek is desperate for sales. Nevertheless, I doubt that the Newsweek of old, before it was sold for a dollar, would have pandered as shallowly.

The caption is a superficial way to characterize an important development of thought that the president -- along with the country -- has been making over recent years. It is also entirely wrong. Like the mini-furor a couple of months back about the claim that Richard Nixon was our first gay president, the story simply ignores that the U.S. already had a gay president more than a century ago.

There can be no doubt that James Buchanan was gay, before, during, and after his four years in the White House. Moreover, the nation knew it, too -- he was not far into the closet.

Today, I know no historian who has studied the matter and thinks Buchanan was heterosexual. Fifteen years ago, historian John Howard, author of Men Like That, a pioneering study of queer culture in Mississippi, shared with me the key documents, including Buchanan's May 13, 1844, letter to a Mrs. Roosevelt. Describing his deteriorating social life after his great love, William Rufus King, senator from Alabama, had moved to Paris to become our ambassador to France, Buchanan wrote:

I am now "solitary and alone," having no companion in the house with me. I have gone a wooing to several gentlemen, but have not succeeded with any one of them. I feel that it is not good for man to be alone; and should not be astonished to find myself married to some old maid who can nurse me when I am sick, provide good dinners for me when I am well, and not expect from me any very ardent or romantic affection.

Despite such evidence, one reason why Americans find it hard to believe Buchanan could have been gay is that we have a touching belief in progress. Our high school history textbooks' overall storyline is, "We started out great and have been getting better ever since," more or less automatically. Thus we must be more tolerant now than we were way back in the middle of the nineteenth century! Buchanan could not have been gay then, else we would not seem more tolerant now.

This ideology of progress amounts to a chronological form of ethnocentrism. Thus chronological ethnocentrism is the belief that we now live in a better society, compared to past societies. Of course, ethnocentrism is the anthropological term for the attitude that our society is better than any other society now existing, and theirs are OK to the degree that they are like ours.

Chronological ethnocentrism plays a helpful role for history textbook authors: it lets them sequester bad things, from racism to the robber barons, in the distant past. Unfortunately for students, it also makes history impossibly dull, because we all "know" everything turned out for the best. It also makes history irrelevant, because it separates what we might learn about, say, racism or the robber barons in the past from issues of the here and now. Unfortunately for us all, just as ethnocentrism makes us less able to learn from other societies, chronological ethnocentrism makes us less able to learn from our past. It makes us stupider.

To think even for a moment about aspects of personal presentation other than sexual orientation forces us to realize that we today are not necessarily more tolerant. Consider facial hair. In 1864, with a beard, Abraham Lincoln won re-election. Could that happen nowadays? Is it mere chance that no candidate with facial hair has won the presidency since William Howard Taft -- and he wore only a mustache? Indeed, since Thomas Dewey in 1948 no major party candidate with facial hair has even run for president, and Dewey wore only the smallest of mustaches.

Perhaps the presidency is too small a sample. Let's add in the Supreme Court. Since 1930, 34 different men have served on the Supreme Court. All save Thurgood Marshall have been clean-shaven. (Lest readers think that Marshall's tiny mustache might topple this argument, let me point out that during most of the last 82 years, 70 percent of adult black males have had some facial hair, yet the only three African Americans to have served on the Supreme Court or as president have had almost none.) The chance that a random sample of 33 white males would have had no facial hair is something like (.9)^33 or about .03, not very likely.
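For readers who want to check that back-of-the-envelope figure, here is a minimal sketch in Python. The 0.9 chance that any given white male would be clean-shaven and the sample of 33 justices come from the paragraph above; treating the justices as independent draws is a simplifying assumption added for illustration.

```python
# A minimal check of the probability above. The 0.9 clean-shaven chance and the
# 33 justices are the figures in the text; independence between justices is a
# simplifying assumption for illustration.
p_clean_shaven = 0.9
n_justices = 33
prob_all_clean_shaven = p_clean_shaven ** n_justices
print(round(prob_all_clean_shaven, 3))  # about 0.031 -- "not very likely"
```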

"Even" today, many institutions, from investment banking firms to Brigham Young University, flatly prohibit beards on white males. Brigham Young falsifies its past to make this rule seem "natural." Its chief founder, John Maeser, usually wore a full beard and mustache. In front of the building bearing his name stands his bronze statue complete with full beard and mustache. In about 1960, however, perhaps earlier, BYU banned beards. Then in 1986, the university commissioned artist Ron Bell to paint a portrait of Maeser. Working from an old photograph, Bell did; of course, Maeser wound up bearded. So the administration asked him to remove the beard. "They didn't want today's students to believe they could follow suit," in the artist's words. He complied.

If this example seems too religious, consider the huge secular company Walt Disney Enterprises. The last time I visited Disney World, it still banned facial hair, although it quietly made exceptions for African Americans with well-trimmed beards or mustaches.

In themselves, beards may not be signs of progress, although mine has subtly improved my thinking. Nevertheless, we reached an arresting state of intolerance when the Disney organization, founded by a man with a mustache, would not allow one even on a janitor. Moreover, before we trivialize these examples by thinking they apply only to facial hair, consider that Lincoln was also our last president who was not a member of a Christian denomination when taking office. Could a non-Christian like Jefferson or Lincoln be president today? It's not clear.

All that said, President Obama's change of heart about gay marriage remains significant. It does show increasing tolerance compared to our recent past. During the Nadir of race relations, that terrible period between 1890 and about 1940 when white America went more racist in its thinking than at any other time, the U.S. also clamped down on beards, liquor (briefly), and, yes, homosexuals. As Jackie Robinson was not the first black player in Major League Baseball, but rather the first after the Nadir, so President Obama is not our "first gay" president (forgive me: I cannot seem to retype Newsweek's silly headline without putting quotation marks around the words), but only our "first" since the Nadir.

Remembering that James Buchanan was homosexual complexifies our national narrative, to be sure, but it is a complexity that we need. It prompts us to remember that terrible era, the Nadir, when we all moved backward, not just the South. Not just organized baseball but also the Kentucky Derby, the NFL, and even previously "black" jobs like railroad foremen got redefined "white only." Communities across the North became sundown towns, barring African Americans formally or informally. Even North Dakota outlawed interracial marriage.

Forgetting Buchanan's sexual orientation helps us forget all the other national secrets we have packed into that closet with him. Ultimately, it prompts us to succumb to chronological ethnocentrism. If, however, we can rid ourselves of the fantasy that we are already always getting better, then maybe we can create a nation that actually becomes more tolerant. Then we might -- again -- elect a real gay president. After all, just three months ago, Disney started letting white male employees grow beards.

________________

Copyright James Loewen

Dinesh D'Souza: Knave or Fool?
Ad for Obama 2016.

Recently, right-wing commentator Dinesh D'Souza released 2016: Obama's America, a movie trashing President Obama. (D'Souza is also president of King's College, a small religious college in Manhattan that is a subsidiary of Campus Crusade for Christ.) Obama's America is now playing in 1,500 theaters across the nation.

According to an NPR interview with D'Souza broadcast on September 1, 2012, its thesis is "that President Obama is weakening the country -- deliberately." The film paints Obama as "anti-colonialist," and D'Souza does not intend the term as a compliment. Anti-colonialism means, according to D'Souza, "Western countries, and now the United States, have become rich by invading and occupying and looting the poor countries, so that the wealth of the world is unfairly distributed. And what Obama wants to do is correct that."

Some anti-colonialists, such as Walter Rodney (How Europe Underdeveloped Africa), do hold the view that D'Souza attributes to Obama. Others do not agree that imperialism and mercantilism provided the foundation of Western economic progress. They see colonialism as largely an exercise in hubris, made easy by Western military superiority. I believe that the first takeovers -- of North and South America -- helped Christianity against Islam, fueled the triumph of Europe over Asia and Africa, and led to the rise of capitalism. Later takeovers, like France's in Indochina and Germany's in Namibia, seem less important to the rise of the West.

So far as I can tell, D'Souza ignores these issues. His focus is really on President Obama. As he put it at the end of the interview, "President Obama has an agenda for downsizing America that he dare not share with the American people because it would endanger his support." According to D'Souza, the point of his film, which he calls a "documentary," is to convince Americans of this hidden agenda. D'Souza wants to convince us that Obama "wants America to have less wealth and power so that people in other countries can have more wealth and power."

Several reasonable responses to D'Souza spring to mind. One might be to note that in his first term Obama has acted in our national interest -- from killing bin Laden to advocating that we reward companies for keeping jobs at home. But I have a personal response. Having had a work of mine trashed by D'Souza years ago, I wish to denounce him as either a knave or a fool. His standards of intellectual work are so low, at least in my own experience, as to exclude him from the company of people whose voices are worth considering.

Years ago, teaching at Tougaloo College convinced me that "aptitude tests" like the SAT and ACT did not really measure aptitude, at least not across different subcultures and social structures. My classes at Tougaloo, a black college in Mississippi, contained students with extraordinary ability. They would have stood out in a Harvard seminar, as I knew from four years of graduate work at Harvard. One Tougaloo graduate went on to finish her doctorate at the University of Wisconsin in just three years. Others earned doctorates from Harvard, Berkeley, and other sociological powerhouses. Yet the SAT scores of these outstanding students averaged around 560. (Probably you know that SAT scores range from 200 to 800. 500 is average.) In contrast, a Harvard student of mine with 560 on his SATs, admitted as a legacy, was mired in the lowest quintile and simply could not do high-level college work.

To show this problem, I concocted the "Loewen Low-Aptitude Test." It contains five items. Each is biased against upper-middle-class white people in its own way. For example, one uses carpentry terms. Teaching Sociology published the test back in April of 1979 (6 #3, 221-44).

In 1991, Dinesh D'Souza published Illiberal Education: The Politics of Race and Sex on Campus. It became a best seller. For a page or two, probably because it was featured on ABC-TV's "20/20," D'Souza attacked my article. Here is his treatment. He picks on the item biased in favor of African Americans:

James Loewen of Catholic University, who alleges cultural bias in testing, gives an example of an alternate SAT question that would be more comprehensible to blacks.

Saturday Ajax got an LD:

(a) He had smoked too much grass.

(b) He tripped out on drugs.

(c) He brought her to his apartment.

(d) He showed it off to his fox.

(e) He became wised up (less dense).

I need to explain, as I did in Teaching Sociology, that I adapted this item from a then-well-known test by Robert Williams, a black social psychologist in St. Louis. Williams saw that his African American students were not as inept as their "standardized" test scores made them out to be. Inferring that cultural bias was involved, he developed the "Black Intelligence Test for Cultural Homogeneity," based on the vocabulary of inner-city St. Louis. It proved a "BITCH" for nonblack people to pass, because it relied on vocabulary relatively unfamiliar to them. Language has long been an area of noteworthy black creativity, and "LD" was a term for Cadillac Eldorado, then a stylish luxury car. "Fox" of course meant attractive girlfriend, a black invention that got picked up by the larger culture and has now become a variant meaning of the word, along with its adjectival form, "foxy."

As I wrote in Teaching Sociology, the point of the exercise was to show that some of "our meritocratic barriers are not meritocratic at all." Often the reasoning involved in answering even difficult items on "standardized" tests turns out to be trivial, if one knows the vocabulary. For example, an item from the Miller Analogy Test, required by some graduate and professional schools, is elementary if one knows that a meaning of "sake" is Japanese rice wine, impenetrable if one only knows the "for God's sake" meaning of the word.

D'Souza completely missed and misrepresented my point. He went on to write "that this line of criticism stereotypes blacks. [Loewen's] model presumes that blacks are most at home in the world of slang, womanizing, and drugs. Why a familiarity with this vocabulary is a good preparation for college, Loewen does not say."

Of course, the item had nothing to do with womanizing or drugs. And of course, I had not claimed that familiarity with black slang provides good preparation for college. Neither does familiarity with Japanese rice wine. To misread the exercise so completely marks D'Souza as either a knave or a fool, depending upon whether his misreading was deliberate or the result of too-quick reading.

In case his mistake was an honest one, let me spell out once more the point of the "Loewen Low-Aptitude Test." I deliberately created a test that was biased against upper-middle class white students. I did so to show it can be done, to give such students the experience of failing a test owing to test bias, and to suggest that cultural bias may explain at least some of the gaps between the scores of minority students and white upper-middle-class students.

For the record, I do believe that within a population, either test -- the SAT or the BITCH -- may provide useful information. Within my class in introductory sociology at Tougaloo, for instance, most students who scored in the lowest quintile on the BITCH test came from truly rural backgrounds. They had no more knowledge of urban black slang than did most white suburbanites. Among my sociology students at Tougaloo, the SAT listed many in the "right order" -- matching professors' assessments of their abilities. Across cultures, however, both tests would prove to be of little value.

Incidentally, partly as a result of similar arguments I made to Nancy Cole, then Vice-President of Educational Testing Service, purveyor of the SAT, ETS removed "aptitude" from the title of the SAT. (See The Validity of Testing in Education and Employment.) In 1994 it became the "Scholastic Assessment Test." A few years later, painfully aware that "Assessment Test" was redundant, repetitive, and said the same thing twice, ETS renamed the SAT once more. Now it merely stands for "S.A.T." -- the initials mean nothing at all! The change also amounts to nothing at all, however, because most people don't know it occurred. Google sends people who search for "Scholastic Aptitude Test" to ETS's home page for the SAT, even though neither "scholastic" nor "aptitude" appears visibly on that page. Hence students from rural Mississippi -- disadvantaged on both the SAT and the BITCH -- remain likely to infer that their test scores tell them they have low aptitude -- even if they don't, even if test bias remains the culprit.

Surely D'Souza could not have missed my point. Hence, I do suggest that his mischaracterization of my work derived from knavery, not ignorance or stupidity. He knew that few readers of his book would have access to Teaching Sociology in that pre-internet age; fewer still would use that access. So he was free to misuse the item for his own purposes. I suspect he did the same with the sources and interviews he compiled for 2016: Obama's America. Beware!

________________

Copyright James Loewen

Registering to Vote, Then and Now
Voter registration begins at the Magnolia Motel in Prentiss, Mississippi, August 25, 1965. Credit: Winfred Moncrief.

During the summer of 1965, while a graduate student, I ran the "Social Science Lab" at Tougaloo College in Madison County near Jackson, Mississippi. "Ze Lab" was the creation of Dr. Ernst Borinski, a refugee from Hitler's Germany who taught at Tougaloo from 1947 until his death in 1983. Borinski was a remarkable man -- an inspirational professor who used his status as outsider to cross boundaries between white and black Mississippi on behalf of social change. He is one of the main subjects of the book, movie, and now museum exhibit, From Swastika to Jim Crow.

On August 25 of that summer, the U.S. sent federal voting registrars to several counties in Mississippi. These were counties that had proven particularly reluctant to register African Americans to vote. The next day, curious to see the scene for myself, I drove to Canton, county seat of Madison County. There the registrars had rented a vacant storefront on the courthouse square.

What I saw was a Day of Jubilee. A card table and a folding chair, on which sat a white registrar, were the only furnishings in the otherwise bare room. From this table, in a line that stretched to the door, African Americans waited calmly to register. Outside, the line became double and stretched east to the end of the block, where it turned south, ran another block, reached the corner, turned west, and reached the end of that block. The wait time was more than a day, until the Department of Justice added two more card tables and two more registrars. Those waiting seemed not to mind. Some had been waiting for decades to register to vote; another day wouldn't exhaust their patience. A spirit of Jubilee -- not boisterous, just quiet satisfaction -- floated in the air. Everyone in line knew that the Democrats, the party of white supremacy, had for decades made it extremely difficult if not flatly impossible for African Americans to vote in Mississippi and especially in Madison County. Now it was their turn.

That scene -- with its hopeful masses yearning for democracy -- impressed itself on my mind. It returned when I read reporter Ann Gerhart's poignant account of Cheryl Ann Moore's successful attempt to get a non-driver photo ID card so she could vote in 2012.

The situations are not the same. Ms. Moore had registered to vote when she was 19; she is now 54. She lives in Pennsylvania, not Mississippi. The Republicans, not the Democrats, are now the party of white supremacy. But it is safe to say that the party in power in the state of Pennsylvania in 2012 is doing what it can to keep black (and poor and transient and Latino) voters from voting. So are Republicans in South Carolina, Virginia, and several other states.

In Mississippi in the early 1970s, I testified for the U.S. Department of Justice, the Lawyers' Committee for Civil Rights Under Law, and the American Civil Liberties Union in several voting rights cases. I showed statistically that whites voted overwhelmingly for white candidates, blacks voted almost as overwhelmingly for black candidates, and blacks produced fewer votes than whites, per capita. Indeed, a 65/35 advantage in total population was required to provide African Americans with a 50/50 shot at winning an election. That was because, on average, African Americans had more children, so a total population 65% black would be only about 60% black in voting age population. In turn, because it was harder for African Americans to register to vote, a 60/40 advantage in voting age population led to a 55/45 advantage in registered voters. Finally, because it was harder for African Americans to get to the polls and vote and vote freely, a 55/45 registration advantage translated to a tossup election.
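For readers who want to see that arithmetic laid out, here is a minimal sketch of the attenuation chain. The roughly five-point loss at each stage is a rounded illustration of the figures quoted above, not the actual estimates from the expert testimony, and the function name is invented for the example.

```python
# A rough sketch of the attenuation described above: a 65/35 black advantage in
# total population shrinks, stage by stage, to a tossup at the ballot box. The
# five-point loss per stage is a rounded illustration, not the testimony's data.

def ballot_share(total_pop_share, losses=(5.0, 5.0, 5.0)):
    """Apply successive percentage-point losses: total population ->
    voting-age population -> registered voters -> votes cast."""
    stages = ("voting-age population", "registered voters", "votes cast")
    share = total_pop_share
    for stage, loss in zip(stages, losses):
        share -= loss
        print(f"{stage}: about {share:.0f}% black")
    return share

final = ballot_share(65.0)  # the 65/35 advantage in total population
print(f"Roughly a {final:.0f}/{100 - final:.0f} split -- a tossup election")
```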

I was testifying about the obstacles facing poor black would-be voters when a Mississippi judge interrupted me to ask, "Tell me, do you think that illiterate people should vote?" His question took me by surprise, but I managed on that occasion to answer well. "I do," I replied, and went on to explain why: Illiterates can vote better on their own behalf than anyone else can do for them. After all, they know their own situation and their own minds better than anyone else.

Historically, illiterate recently freed African Americans voted in massive numbers across the South during Reconstruction. Indeed, they voted in substantial numbers in the North as well. And they voted their interests responsibly. Of course they voted Republican, despite efforts by Democrats to persuade or coerce them to do otherwise. Across the South, they also voted for the best state constitutions that the Southern states have ever had, far better than the constitutions under which they operate today.

Afterward, Democrats concocted canards about the behavior of African American and white Republican voters during Reconstruction. About voter fraud in South Carolina, a 1911 article in Confederate Veteran said this:

Armed troops were kept at every county seat to uphold negro rule and encourage him to vote the Republican ticket as often as he pleased, the Republicans by this means running up great majorities.

Such statements are simply not true. I suspect I need not belabor that claim for this audience. Confederate Veteran went on to decry the usual "carpetbaggers and scalawags" who committed "disgraceful scenes at the statehouse in Columbia."

Today Republicans, not Democrats, are voicing untrue claims about vote fraud. In South Carolina, Republicans cobbled together a collection of stories about individual fraudulent voters from other states to back their contention that their state needed a new voter ID law. In reality, as historian Vernon Burton noted in his expert witness report in the lawsuit challenging this law, "no bill sponsors, election administrators, or members of the testifying public could identify any verified instances of voter fraud that would be addressed by the voter ID law." Ironically, South Carolina already had voter ID cards; they just weren't photo ID cards. Still, they had accomplished their purpose without any problem, if that purpose was to prevent vote fraud.

The new voter ID law had a very different purpose. As Democratic Representative David Mack put it, "The Republican Party benefits from a low voter turnout. The Democratic Party benefits from a better, high voter turnout. That's what it's all about." [Tr. of House Debate, 1/26/2011, 57.] Indeed, the law had a racial purpose that Senate GOP Caucus Director Wesley Donehue made overt. Reporter Jim Davenport wrote a story for the Associated Press that said, "South Carolina's new voter photo identification law appears to be hitting black precincts in the state the hardest." Donehue responded by linking to Davenport's article and tweeting, "Nice! @jimdavenport_ap proves EXACTLY why we need Voter ID in SC." Burton amasses much other evidence that decreasing the black vote -- which of course goes overwhelmingly Democratic in South Carolina -- was the key Republican goal.

The same story played out in Pennsylvania. Last March, Republican majorities in the Pennsylvania House and Senate passed a photo ID law ostensibly to eliminate fraud. After it passed, however, Mike Turzai, majority leader of the Pennsylvania House of Representatives, let the real purpose out of the bag. He was videotaped proclaiming to a group of fellow Republicans, "Voter ID, which is gonna allow Governor Romney to win the state of Pennsylvania -- done!"

As in South Carolina, Pennsylvania leaders admitted they had no evidence of any fraud that the new law might combat. Indeed, they stipulated in the court proceeding now underway, "There have been no investigations or prosecutions of in-person voter fraud in Pennsylvania; and the parties do not have direct personal knowledge of any such investigations or prosecutions in other states."

As in South Carolina, Pennsylvania Republicans professed not to understand how the new law burdened the lower class. Tom Corbett, governor of the state, "estimated that 99 percent of the state's 8,300,000 voters already had an acceptable PennDOT ID," according to Gerhart. That would leave fewer than 83,000 without the right to vote. Actually, the governor was way off; his own Department of State estimated that 758,000 voters, more than 9 percent of the electorate, lack an acceptable ID. Republicans also profess not to understand how anyone can function in society without already having an official photo ID. Obviously, they have never been inside one of the check-cashing establishments that dot poor neighborhoods from large cities to small towns across America.
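To put the two estimates side by side, here is a quick check using the 8,300,000-voter figure quoted above; the numbers are the post's, and the arithmetic is supplied for illustration.

```python
# A quick check of the two estimates quoted above, using the post's figure of
# 8,300,000 registered Pennsylvania voters.
electorate = 8_300_000
governor_implied = electorate * 0.01   # the "99 percent already have an ID" claim
state_dept_estimate = 758_000          # the Department of State's own count
print(f"Governor's implied number without an ID: about {governor_implied:,.0f}")          # ~83,000
print(f"Department of State estimate: {state_dept_estimate / electorate:.1%} of voters")  # ~9.1%
```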

Gerhart followed one about-to-be disfranchised person, Ms. Moore, as she got her photo ID card. Moore owns her own home in Philadelphia. For twenty-four years, she has been a hospital custodian. Like many urban residents, she has no car and no driver's license. She has no bank account and has cashed her paycheck at the same store every two weeks for years.

Knowing it would take a long time, she devoted a vacation day to getting her ID card. She got to the office at about 11:30 am and was given ticket #C809, a clipboard, a two-page form, and a pen. Her estimated wait time was 63 minutes. By 12:30 pm, having skipped breakfast, she was hungry, and the number being served was just #C765, so she went to Subway for lunch. When she returned, every seat was taken, so she sat on a heating vent at the back. The office does not do voter IDs on Mondays. One can only imagine what the crowds will look like in October.

At 1:10 pm, Moore had been there for more than an hour and a half, but the office was only up to #C773. She worried that they might not take cash for the $13.50 fee, so she went outside to a store and bought a money order for that amount. Finally, at 1:42 pm, she stepped up to the window and handed in her paperwork. Since she was already registered, she had filled out the wrong form. She filled out another two-page form and returned. A clerk "phoned the Philadelphia Board of Elections," in Gerhart's account, "and, after a wait, verified she was, indeed, registered to vote. Ten minutes later, she directed Moore to print and sign her name on a sheet of paper labeled 'Examiner's Report.'" Moore proved her residency with the address stub from her paycheck. She swore an oath. The result, at 2:10 pm, was another ticket, A230, to get in the photo line. At 3:25 pm, she heard her number called and "scampered over to the camera, only to have the clerk take #A231 and the man standing behind her."

"Hey!" she called out, "I'm right here!"

The photo clerk said he had already called her number three times and demanded she get a new number! "This is bullshit," she replied. "At the end of all this?" Finally, another clerk took her photo and gave her her ID card.

As in Canton, Moore seemed not to mind her four-hour wait. She grinned, kissed the card, and said "I feel good!" I would have been livid. Republicans would have been livid. Republicans (if you'll pardon a mild overgeneralization) aren't used to waiting for hours in a government bureaucracy to regain a right that they had lost through no fault of their own.

Voters in rural areas aren't as fortunate as Ms. Moore. They can't take the bus to the DMV. They have to drive, but of course precisely those who need non-driver voter IDs cannot drive. Voters who have lost (or never had) an original "raised-seal" birth certificate aren't so fortunate either. Neither are voters who have misplaced their original Social Security card. It can require several trips, with waiting periods, to several offices, just to get the documentation required to get a photo ID.

Faced with these problems, Republicans allowed an amendment to substitute photo IDs from institutions of higher learning and nursing homes, if they had expiration dates. Then it turned out that the IDs at 91 of 110 colleges and universities in Pennsylvania had no expiration dates. And the idea of an "expiration date" on a nursing home ID is too macabre to contemplate. "Death panels" indeed!

Political scientists can predict the proportion of registered voters who will not or cannot jump through these hoops and will therefore be disfranchised. No doubt Rep. Turzai had seen such an estimate when he predicted that the requirement would allow Romney to win Pennsylvania. Certainly the law is working: through 9/11/2012, according to Gerhart, the state had issued just 7,500 voter ID cards. Can you imagine what would happen if the United States sent voting registrars to Pennsylvania (or South Carolina)? The lines would stretch out around the block.

________________

Copyright James Loewen

High Speed Amtrak: Part II A year ago, I wrote about Amtrak's efforts to achieve high-speed rail, or at least what is now coming to be known as higher-speed rail. Higher-speed rail does not mean trains that go faster than bullet trains in Japan (185-mph), high-speed trains in France (180-mph), or even Acela Express in our Northeast Corridor (150-mph). Higher-speed rail merely refers to trains that go faster than they used to. This is an update.

On October 19, 2012, according to a release by the National Association of Rail Passengers (NARP), "federal, state, and local leaders gathered ... in Illinois to celebrate the start of 110-mph rail service along the Chicago to St. Louis rail corridor."

A train averaging 110 mph over the 274 miles from Chicago to St. Louis would take 2½ hours. Flying takes just an hour, but that does not include the hour travelers must add for screening and flight check-in. As well, Amtrak travels city center to city center. For the business traveler or the tourist, Chicago offers much more within a few blocks of Union Station than Elk Grove Village does within a few blocks of O'Hare. So the higher-speed rail trip might generate considerable ridership. At present, "Lincoln Service" trundles along at 51 miles per hour, requiring 5 hours and 25 minutes for the journey.

Amtrak's celebration was happy. NARP called the event "a momentous step forward in the development of a modern and efficient Midwestern high speed rail network." Governor Quinn waxed, "A twenty-first century rail system in Illinois will create jobs and drive economic development throughout the Midwest, while making travel across Illinois faster, safer, and more reliable." He went on, "these long-term investments in our transportation system will benefit the citizens of our state for generations to come."

However, it was premature. All it meant was that an Amtrak train ran at 110 mph over one 15-mile stretch between Dwight and Pontiac. "The train ate up the 15 mile stretch of track in a mere 6 minutes," NARP exulted. Still, it was only 15 miles. No one has yet figured out how to move trains rapidly from downtown Chicago through its suburbs, within Springfield, or from Alton across the Mississippi River into downtown St. Louis.

As well, 110 mph is not very fast. In my earlier piece, I wrote of traveling faster than that across Illinois on the "City of New Orleans" in 1965, before Amtrak was born. Moreover, if Republicans win the presidency and Congress, even these modest goals will never be realized. Mitt Romney has not disclosed how he will save the trillions of dollars he has promised, but Amtrak's subsidy of about $1.5 billion (including almost a billion in capital expenditures) is unpopular with Congressional Republicans.

This map, showing what has been accomplished and what remains to be done, makes clear that higher-speed rail is a major attainment.

Surely the hardest challenge Amtrak faces is how to get Americans to think of it. Several years ago, I was eating lunch in a bar/restaurant in O'Hare en route to a speaking engagement in Nebraska. I was well aware of the Nebraska-Missouri football game scheduled for that evening in St. Louis, because people hosting me in Nebraska worried that no one would come to my talk, since the game was on TV. The intensity of the Nebraska-Missouri football rivalry is, shall we say, unrivaled. O'Hare had clear skies; so did Nebraska; I looked forward to a trouble-free connection. Next to me at the bar, however, was a thirty-year-old man with less happy prospects. He had been there for hours, downing beer after beer, because he could not get to St. Louis. Alone in the nation, St. Louis had been hit by a snow squall -- no planes in or out. My barmate had a date with three college friends to meet at the Nebraska-Missouri game. Now he could not make it. He was drowning his sorrow.

Since he had been at O'Hare for two hours and the game wasn't until evening, I asked him, "Why didn't you just take the train?" Obviously the thought had never occurred to him. "Where would I get it?" he asked. "At the train station," I said. "Oh yeah," he replied, wonderingly. But the concept was still completely beyond his ken. I flew off to Nebraska, leaving him nursing yet another beer and making plans to fly back home.

I provoke the same reaction when suggesting to acquaintances that they use Amtrak's overnight trains to travel from Washington, D.C., my home town. They've taken overnight trains in Europe but never considered them here. That trains might even be time-effective never occurred to them either. A traveler can leave D.C. after a full work day at 6:30 pm and be in Atlanta at 8:15 am the next morning. Moreover, that's after a shower and a full breakfast. The same economies hold for engagements in Chicago, South Carolina, or northern Florida.

Given this state of awareness, I suppose it's OK for Amtrak to celebrate a 6-minute higher-speed ride, if it makes folks more aware of trains. By the way, did you notice that the ride did not take 6 minutes? At 110 mph, 15 miles takes 8 minutes. But we can allow Amtrak a modest exaggeration, can we not? Considering the difficult tracks they face?
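For anyone who wants to redo the arithmetic, here is a minimal sketch. The mileages and speeds are the figures quoted in this post, not timetable data, the calculation assumes a constant average speed, and the helper function is invented for the example.

```python
# A minimal check of the travel-time arithmetic in this post; mileages and
# speeds are the figures quoted above, assuming a constant average speed.

def travel_minutes(miles, mph):
    """Travel time in minutes at a constant average speed."""
    return miles / mph * 60

print(f"Chicago-St. Louis (274 mi) at 110 mph: {travel_minutes(274, 110) / 60:.1f} hours")  # ~2.5
print(f"Chicago-St. Louis (274 mi) at 51 mph:  {travel_minutes(274, 51) / 60:.1f} hours")   # ~5.4
print(f"Dwight-Pontiac (15 mi) at 110 mph: {travel_minutes(15, 110):.0f} minutes")          # 8, not 6
```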

________________

Copyright James Loewen

Morons in Africa In 1994, with Richard Herrnstein, Charles Murray brought out The Bell Curve, subtitled "Intelligence and Class Structure in American Life." It sold well and soon came out in paperback with Simon & Schuster. Indeed, the acquisitions editor at S&S who bought The Bell Curve next bought my bestseller, Lies My Teacher Told Me. I know this because she bragged to me about having done so. She wanted me to infer that she was a big-league acquisitions editor, worthy of my book, since she had acquired Murray's book. I had already bought The Bell Curve in hardbound and was teaching a course that considered it at length; her acquisition did not excite me as much as she'd hoped.

The Bell Curve makes an astonishing claim about the 700,000,000 people in Africa (its approximate population when the book came out -- now nearing 900,000,000). Herrnstein and Murray (hereafter "Murray") cite studies, particularly by Richard Lynn, showing "the median black African IQ to be 75, approximately 1.7 standard deviations below the U.S. overall population average, about 10 points lower than the current figure for American blacks." (Bell Curve, 289)

Probably most readers know that "normal" or "average" IQ is the range from 90 to 110. Herrnstein is right that 75 is about 1.7 standard deviations below the U.S. average. Some researchers consider 75-85 "mentally retarded." The Wechsler Adult Intelligence Scale (WAIS), the most widely used IQ test in the U.S., classifies 75 as "borderline." Such people have about a 50/50 chance of reaching high school, according to research cited by J.C. Loehlin, et al., Race differences in intelligence (San Francisco: W.H. Freeman, 1975). During and after World War II, the army rejected draftees who scored 75 or less. Today Social Security considers an IQ of 75 to support a finding that a claimant is disabled, in conjunction with other factors.
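To see where the 1.7 figure comes from, here is a quick check. It assumes the conventional IQ scaling of mean 100 and standard deviation 15, which is a standard convention supplied for illustration rather than a number quoted from The Bell Curve above.

```python
# A quick check of the standard-deviation arithmetic, assuming the conventional
# IQ scaling (mean 100, standard deviation 15) -- a standard convention, not a
# figure taken from the passage above.
mean_iq, sd_iq = 100, 15
claimed_median = 75
z = (mean_iq - claimed_median) / sd_iq
print(f"An IQ of {claimed_median} is about {z:.1f} standard deviations below the mean")  # ~1.7
```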

So Herrnstein and Murray claimed that the average African is borderline disabled in intelligence. Just below Forrest Gump, whose IQ was 76 (hence he could serve in the Army). The reason African Americans have a higher IQ, according to The Bell Curve, is that their genetics benefit from white admixture.

When The Bell Curve came out, I had never been to Africa. Since then, I have (twice). In November of 2003, I heard Charles Murray give a book talk about a new book, Human Accomplishment. (It tries to explain the clustering of "geniuses" or major contributors to culture, in time and space). I asked this question about his earlier work:

I have read and taught from The Bell Curve. In it, as you know, you say that African Americans have IQs 15 points lower than white Americans, while black Africans have IQs 25 points lower than whites, averaging 75, which is borderline between dull normal and what used to be termed moron. Since then, I traveled to Africa. Now, borderline morons are noticeable, you know? And I just didn't find that the average person in Guinea, the country where I was, was a borderline moron. So my question is: have you been to Africa? And if so, did you find the average African to be 25 points below what we consider a "normal" IQ? And if not, how do you reconcile that with your claim in The Bell Curve?

Murray replied that further studies of IQ in Africa since The Bell Curve have further confirmed his conclusion that they average 25 or even 30 points lower. But they are not lower in what he called "social IQ," which he defined as the ability to function in society.

I felt like a giant hole had opened up before my feet, right in the bookstore. Neither "social IQ" nor "social intelligence" appears in The Bell Curve's index or, to my memory, in its text. On the contrary, in The Bell Curve Murray professes strong belief that IQ tests measure g, general cognitive ability, "whatever it is that people mean when they use the word [sic] intelligent or smart in ordinary language." (p. 22) Relying on Charles Spearman, who invented important statistical measures around 1900, Murray posited that g is "a general capacity for inferring and applying relationships drawn from experience." Since other questioners waited behind me for the microphone, I did not reply. However, "social IQ" represented a dramatic retreat from his reliance on g.

In Africa, I saw people flying airplanes, driving cars, running stores, teaching school -- performing, in short, the range of occupations people do in the United States. These people were Africans. People with borderline intelligence could not perform many of these tasks. Furthermore, all of them required "a general capacity for inferring and applying relationships drawn from experience." Arthur Jensen, who died last month, likewise emphasized g, which he termed "general intelligence."

Conversely, if Murray is right to claim that Africans have normal "social intelligence" but are borderline in g or general intelligence, it follows logically that g must not relate to such tasks as flying airplanes, driving cars, running stores, and teaching school.

To put it another way, such a retreat by Murray invalidates his entire theory.

As well, Forrest Gumps are noticeable. I did not notice any in Guinea, Ghana, Burkina Faso, or Mali. Yet Murray holds that half of all Africans have lower intelligence than Forrest Gump!

The Bell Curve goes on to state, "IQ scores are stable." (p. 23) Famous research done as long ago as the 1960s shows otherwise. In Pygmalion in the Classroom, Robert Rosenthal and Lenore Jacobson showed that students in first grade in San Francisco gained an average of 27 points in IQ in one year. The researchers had generated such gains simply by "leaking" to first-grade teachers the names of students in their classes who had supposedly excelled at a "Harvard Test of Inflected Acquisition," said to predict which youngsters were about to "spurt"! Murray would claim this could not be: "Changing cognitive ability through environmental interventions has proved to be extraordinarily difficult." (p. 314) Rosenthal and Jacobson had provoked enormous gains by a short and seemingly minor intervention, undercutting the claim of difficulty. Gains of 27 points in one year similarly undercut the claim of stability.

More recently, a French study summarized by David Kirp in 2006 looked at poor French kids who were adopted between ages 4 and 6. Their IQs had been tested in the orphanage and found to be in, shall we say, Herrnstein's "African range": they averaged 77, "nearly retarded." To the researchers, this reflects the abuse and neglect they had suffered as infants, then being "shunted from one foster home or institution to the next" as toddlers. I don't know what Murray would say. Nine years later, after being adopted by farmers and laborers, they averaged 88.5 -- real improvement. If adopted by middle-class families, they averaged 92. And if adopted by upper-class families, the children averaged 98, a 21-point gain. Kirp concludes, "that is a huge difference... and it can only be explained by pointing to variations in family circumstances."

Again, Murray would claim this could not be, since "cognitive ability," which he says IQ tests measure, "is substantially heritable, apparently no less than 40 percent and no more than 80 percent." Moreover, "whatever variation is left over for the environment to explain ..., relatively little can be traced to the shared environments created by families." (pp. 23, 108) Since the orphans hardly received new genes when they got new families, gains of 21 points undercut Murray's claims of high genetic heritability and low influence by families. Since the Rosenthal and Jacobson first-graders hardly received gene transplants during the school year, their 27-point gains also undercut the claim of high genetic heritability.

Last month, an article by David Dobbs in the New York Times, "If Smart Is the Norm, Stupidity Gets More Interesting," went at this problem from a different position. Dobbs points out that so far, trying to find the genes responsible for intelligence has proven futile.

Researchers have tried hard to find [the key to intelligence] in our genes. With the rise of inexpensive genome sequencing, they've analyzed the genomes of thousands of people, looking for gene variants that clearly affect intelligence, and have found a grand total of two. One determines the risk of Alzheimer's and affects IQ only late in life; the other seems to build a bigger brain, but on average it raises IQ by all of 1.29 points.

Dobbs goes on to note, "A report last year concluded that several hundred gene variants taken together seemed to account for 40 to 50 percent of the difference in intelligence among the 3,500 subjects in the study. But the authors couldn't tell which of these genes created any significant effect. And when they tried to use the genes to predict differences in intelligence, they could account for only 1 percent of the differences in IQ." He quotes Robert Plomin, a professor of behavioral genetics: "If it's this hard to find an effect of just 1 percent, what you're really showing is that the cup is 99 percent empty."

Dobbs's piece sparked this essay. I cannot here do justice to the complexities of the IQ literature, the expectancy effect, or the reasons why Africans might not score well on the WAIS. (I do touch on these matters in Chapter 2 of Teaching What Really Happened. Readers might also examine The Validity of Testing in Education and Employment, a 1993 report of the U.S. Civil Rights Commission.) However, the good news, I believe, is that environment makes a huge difference.

The bad news is, of course, American children grow up in families that differ much more than do the French families Dobbs summarized above. That's partly because the U.S. has more social stratification than any other industrialized nation. In their pathbreaking study, Meaningful Differences in the Everyday Experience of Young American Children, Betty Hart and Todd Risley show that "by the time they are four years old, children growing up in poor families have typically heard a total of 32,000,000 fewer spoken words than those whose parents are professionals." Answers to this gap include early Head Start programs, free preschools, and various kinds of advice and assistance to poor parents.

Otherwise, as it stands now, IQ differs dramatically by social class, which in turn allows the rich to cite these IQ differences as justification for our wide gaps in income and schooling, which then create further differences in IQ. Authors like Charles Murray play a crucial part in maintaining this circular process, which seems perfectly logical at any juncture until examined more carefully.

________________

Copyright James Loewen

Reviewlets Yesterday I went through my enormous "books to read" pile and realized that more than half of them I did not choose. Publishers send copies hoping I will blurb them. Authors send copies hoping I will recommend them when I speak to large conventions of teachers. I keep them because their topics interest me, or should, and I intend to read them tomorrow.

Of course, tomorrow never comes, at least not to my book pile. Time does march on, however, and soon it's too late to blurb them anyway.

How do "real" blurbers and reviewers keep up? Well, some cheat. They blurb books they have not read, not even skimmed, only looked at. Or, as we used to say in grad school, discussing a source glibly, "I haven't read it personally, but ...," and we proceeded to invoke or dismiss it, hopefully to good effect. (This is called "socialization into the profession.")

I cannot bring myself to blurb a book I haven't read, so I hardly ever blurb anything. But my heart is in the right place. I try to read what authors and publishers send me. I just fall behind.

Hence this essay, which announces the birth of the reviewlet. The reviewlet is the review of that part of the book which the reviewer has finished as of the moment of the review. Surely the reviewlet is a time-saving device whose time has come, no? The reviewlet also has an advantage for the reader: it's much shorter than a review of the whole book. So the shortened attention span of today's reader matches the shortened work span of today's reviewer. I'm telling you, this reviewlet thing is gonna catch on! And you're in on its birth!

Here follow my reviewlets of three different books that I meant to review, or at least blurb — honest, I did!

Barbara J. Miner's Lessons from the Heartland is what we sociologists used to call a "community study." I wish she had titled it Race and Education in Milwaukee, which would tell readers what it's about. Most people outside Wisconsin -- and all too many within the state -- have no idea how racist it was in the '60s and '70s. Miner recounts the story of how Henry "Hank" Aaron, the famous Milwaukee Braves outfielder who broke Babe Ruth's home run record, desegregated the formerly sundown suburb of Mequon. (A sundown town, you will recall, is a town that for decades was -- some still are -- all-white on purpose.) Even though Aaron was the most famous African American in Wisconsin and perhaps the United States, and although most whites in the state were Braves fans and Aaron fans, he still faced difficulty moving in. Then, when he wanted to take in his younger sister from Mobile, Alabama, she had trouble at school, where she was of course the only African American. Comments of "nigger" and "Go back to Africa" added up to misery. When Alfredia Aaron finally "burst out crying at the breakfast table," in Miner's retelling, Henry Aaron realized he had to do something. He and his wife took his sister to school to meet with the principal. Alfredia wrote, "The principal looked Henry right in the face and said, 'There wouldn't be a problem if you hadn't brought her to this school.'" Miner concludes, "Before long Alfredia moved back to Alabama."

Frank Huyler's Right of Thirst is a novel. But not just any novel. Its opening chapter is riveting. With extraordinary concision and a physician's sure knowledge, it relates the death of the narrator's wife. Its prose exemplifies perhaps the most famous passage in Strunk & White's "little book," The Elements of Style: the paragraph titled "Omit Needless Words." In its entirety, Strunk & White write:

Vigorous writing is concise. A sentence should contain no unnecessary words, a paragraph no unnecessary sentences, for the same reason that a drawing should have no unnecessary lines and a machine no unnecessary parts. This requires not that the writer make all his sentences short, or that he avoid all detail and treat his subjects only in outline, but that every word tell.

Their paragraph is a masterpiece. It embodies its own instructions, thus leading by example. Having read it in college half a century ago, I remembered parts of it even today. However, I misremembered its final phrase, recalling it as "but only that every word tell." My mistaken addition was instructive: omitting "only," as Strunk & White did, makes the phrase more precise, concise, and powerful.

I cannot be certain that Huyler read Strunk & White, but I feel sure that he rewrote his first chapter a dozen times. Every word tells. Small details convey swaths of past life. "Her ... hair black where it had grown in again" divulges cancer and chemo. "As I eased in beside her the plastic crackled beneath us" tells that this is not just a bed but a sick bed, probably a death bed. Tiny details likewise reveal the emotional lifetime of the couple. When I finished the five-page chapter, I muttered a simple "Wow" to myself, and though about to go to sleep, began it anew and read it again.

I am now further into the book. After his wife's death, the main character, a cardiologist, leaves to do medical rescue work after an earthquake in some undeveloped country, perhaps Afghanistan. Not all of it is as riveting. But that remarkable first chapter will stay with me, even influencing my prose, the way the beginnings of "The Bear" and The Sound and the Fury did half a century ago.

Finally, last June Beacon Press sent me A Disability History of the United States by Kim Nielsen, asking for a blurb by June 25, 2012. The editor at Beacon pulled out the flattery stop: "Acquiring this book reminded me of the experience I had when reading Lies -- changing the way I understood not just the past but also the present world around us." But it didn't work. I opened it only last week.

I read the introduction on a D.C. Metro train. Shortly after she signed the contract for the book, "after I'd been in the field [of disability history] for over a decade," Nielsen writes, a serious illness disabled her sixteen-year-old daughter. Now, Nielsen continues,

Making my daughter laugh until she tips over and falls down is really, really funny. Her core body muscles are so weak that strong laughter makes her fall over.

Her husband did it first. He tried to convince their youngest daughter, who was preparing to be Mary in the annual church Christmas pageant, to cry out, "My God, it's a girl!" at the pivotal moment. "My newly disabled daughter laughed so hard that she fell on the kitchen floor and couldn't get up.... Picking her up from the floor caused more pain, which somehow made the maneuver so precarious that we laughed even more." I sat convulsed on the Metro, first from imagining the impact of Mary's outburst upon the assembled multitude in the pews, should she take her father's advice, and then from the image of the young teenager, laughing so hard she couldn't get up. Somehow the scene in the kitchen emphasized and humanized her disability at once.

There's history in the book, too. Nielsen reminds us that the Huron man who brought the Great Law of Peace to the Iroquois stuttered so badly that many people could not understand him, so he spoke through an interpreter. I suspect she'll teach me more ... but I am only on page twenty.

Maybe some of you have read one or more of these volumes. Maybe you will augment these reviewlets into wikireviews -- another innovation! -- by adding comments to this little essay. I can do no more, however -- not if I ever wish to read a book of my own selection, to say nothing of writing one.

________________

Copyright James Loewen

"The Other Civil War": Howard Zinn, Abraham Lincoln, Lerone Bennett, Stephen Spielberg, and Me Two years ago Michael Signorelli, an editor at Harper/Collins, asked me to write an introduction to a little book that Harper/Collins was spinning off from Howard Zinn's A People's History of the United States. It would include chapters 9, on the Civil War and Reconstruction, and 10, "The Other Civil War," about the class warfare of the late nineteenth century. A marketing ploy to tie in with the sesquicentennial, it would carry "The Other Civil War" as its title. I had just written the introductions to the documents in The Confederate and Neo-Confederate Reader, so I had been thinking about the issues Zinn addresses. Besides, my name alone would convince some bookstore browsers to buy the little volume, Signorelli said.

No stranger to marketing ploys and always susceptible to flattery, I agreed to write the introduction. I warned Signorelli, however, that I was not a Zinn partisan; my introduction would probably include some negatives as well as positives. The editor assured me that would pose no problem. I set to work.

"I'm delighted that HarperCollins asked me to preface a stand-alone book on the Civil War extracted from Howard Zinn's massive bestseller, A People's History of the United States," I began. "Personally I'm pleased, because Zinn's blurb for Lies My Teacher Told Me helped it become a bestseller, and Dr. Zinn was generous in his praise of my later works too."

I went on to praise Zinn's larger work for "the effect that People's History has had on legions of readers." I told how, after my talks lamenting how badly history is taught in most high schools, an audience member often came up afterward to tell me that their history teacher was different. "She assigned us People's History as well as the regular textbook, and her course was interesting." I also noted that The Other Civil War supplies "a stellar introduction to the difficult topic of slavery."

The Other Civil War says almost nothing about secession, however, so I filled that gap, also a problem in most mainline textbooks. I noted that Americans give four answers when asked today why the South left the Union:

(1) slavery

(2) states' rights

(3) the election of Lincoln

(4) tariffs and taxes.

Invited to vote, more than half then choose states' rights. Another 10 to 20 percent select tariffs and taxes. Unfortunately, both of those answers are dramatically wrong. As the secession statements collected in The Confederate and Neo-Confederate Reader show, Confederates actually opposed states' rights when Northern states invoked them -- for example, to obstruct the Fugitive Slave Act. They seceded for slavery, not states' rights. Nor did they complain about tariffs. Why would they? The South had helped write the tariff under which the U.S. was functioning.

I went on to explain why I had spent so much time on the matter:

"I supply the foregoing because in an unfortunate sentence, Zinn writes, 'The clash was not over slavery as a moral institution -- most Northerners did not care enough about slavery to make war over it. He is right that the North did not make war to end slavery but to hold the nation together. Most white Southerners, on the other hand, were outraged at abolitionists' moral attacks on slavery. They indeed cared enough about the institution -- and its allied ideology of white supremacy -- to secede, knowing that war would likely result....”

Thus, from the South's point of view, secession and ensuing war were about slavery.

Signorelli asked me to tone down some of my language criticizing Zinn. I pointed out that what I wrote would hardly affect sales, because few bookstore browsers would get into the interior of the preface before deciding whether to buy the book. Signorelli said he was fearful of the reaction my introduction might draw from Zinn's heirs. I listened to his specific complaints and softened my language somewhat. For example, he wanted "unfortunate sentence" removed from the foregoing paragraph. I acquiesced. Now it just reads, "Zinn writes, 'The clash was not over slavery as a moral institution....'"

I went on to critique Zinn's analysis of emancipation as inadequate. Zinn claimed slavery ended "only when required by the political and economic needs of the business elite of the North." I noted that conspiratorial thinking marred his treatment of emancipation from the beginning. Zinn implied that a cabal within the Northern upper class directed emancipation and later clamped limits on the extent of black freedom. In reality, the events of 1860-65 spiraled out of the control of any elite. Besides, the elite was split. Many Northern business and banking men had ties to slaveowners and were Democrats, even Copperheads, during the war. They hardly directed Republican policy.

With a few changes to which I agreed, HarperCollins published my introduction intact to this point, and I believe it helps readers to question Zinn's interpretation. I also gave Howard Zinn credit for going beyond the textbooks to quote a critical sentence of Abraham Lincoln's letter of August 22, 1862, to Horace Greeley's New York Tribune. As Lies My Teacher Told Me notes, this letter is textbook authors' favorite Lincoln quotation, used by fifteen of the eighteen textbooks I surveyed. But they excerpt only these two sentences:

If I could save the Union without freeing any slave, I would do it; and if I could save it by freeing all the slaves, I would do it; and if I could save it by freeing some and leaving others alone, I would also do that. What I do about slavery and the colored race I do because I believe it helps to save this Union; and what I forbear, I forbear because I do not believe it would help to save the Union.

Thus textbooks present a Lincoln unconcerned about slavery, concerned only to save the nation. Zinn gives readers what Lincoln wrote next:

I have here stated my purpose according to my view of official duty, and I intend no modification of my oft-expressed personal wish that all men, everywhere could be free.

So far, so good.

"Nevertheless, Zinn derides Lincoln's views on slavery," I went on to note. "He points to the inadequacies of the Emancipation Proclamation but fails to note that even with its limitations, it was too radical for most Northern voters and cost the Republicans dearly in the November, 1862, election. He needs to show that if Lincoln got too far ahead of the electorate, he would cease to have followers."

The foregoing was too much for HarperCollins. Instead of publishing what I wrote, they wrote and published this:

"What's more, Zinn questions Lincoln's views on slavery, He points to the inadequacies of the Emancipation Proclamation and notes that even with its limitations, it was too radical for most Northern voters and cost the Republicans dearly in the November, 1862, election. Lincoln always understood that if he got too far ahead of the electorate, he would cease to have followers."

The changes are subtle but substantive. "What's more" puts me on Zinn's side when he questioned Lincoln. I was not. I did not claim that Zinn had noted "that even with its limitations, it was too radical for most Northern voters and cost the Republicans dearly in the November, 1862, election." Zinn never said anything of the sort; I had claimed that he should have. Zinn never credited Lincoln for understanding that he could not get too far ahead of his followers; again, he should have. He might have quoted Lincoln's reply to Charles Sumner in the fall of 1861, when the senator from Massachusetts tried to get him to end slavery then: "It would do no good to go ahead any faster than the country would follow." But he didn't.

Around the same time that I was writing my introduction for HarperCollins, I also wrote a new introduction to the Chinese edition of my bestseller, Lies My Teacher Told Me. As I did with HarperCollins, I agreed to write the new piece but informed the editor that he might not like the result. They agreed either to publish it as I wrote it or not to publish it at all. As with HarperCollins, the Chinese publisher asked me to soften some language. As with HarperCollins, I was persuaded to make some changes. Even so, however, my preface was too much for them, and the Chinese firm decided not to publish it, leaving me free to publish it elsewhere. They chickened out, but at least they maintained their honor.

I am outraged that HarperCollins would change my words and publish the result without my approval. In so doing, HarperCollins maintained neither their honor nor my own. I am now on record, in print -- and in hard copy, not an ever-changing on-line edition -- praising Howard Zinn for something he did not write. Any reader can skim the next few pages of The Other Civil War and see that I am either a fool who misreads Zinn or a knave who misrepresents him. HarperCollins has damaged my reputation as a scholar -- even as a competent reader. In fact, the fault was HarperCollins's, and they are either fools or knaves. Signorelli claims the former: the rewrite happened somehow by accident. I suspect the latter: the changed words happened somehow to be in their interest.

Unlike what "my" introduction says, Zinn cuts Lincoln no slack. He even blames him for going to war: "Lincoln initiated hostilities by trying to repossess the federal base at Fort Sumter, South Carolina," which would have been news to Jefferson Davis, who in fact initiated hostilities by firing on Sumter, which federal forces had never vacated. About slavery, Zinn goes on, only U.S. desperation prompted Lincoln to act.

Zinn's take on Lincoln parallels Lerone Bennett's in Forced into Glory. Indeed, Bennett may have influenced Zinn. People's History debuted in 1980. In 1968 Bennett wrote the seminal Ebony article, "Was Abe Lincoln a White Supremacist?" which later grew into Forced into Glory. Back during the Black Power movement, many African Americans insisted that Lincoln was no friend of theirs, indeed, that no white folks were friends of theirs. In 1971, Muhammed Kenyatta, later to win some fame during a trip to Hanoi, thundered at a Black Power rally at Tougaloo College, "They are your enemies. Not one white person has ever had the best interests of black people at heart." (John Brown sprang to my mind, but Kenyatta anticipated my objection: "You might say John Brown did, but remember, he was crazy." I have written about Brown elsewhere.)

Today three different schools of thought attack Lincoln for not giving a damn about slavery: left-wingers like Zinn, militant African Americans like Bennett, and neo-Confederates like Thomas DiLorenzo. Given this lineup, it's refreshing to see a movie like Steven Spielberg's Lincoln, which shows the president genuinely concerned to end bondage, as indeed he was.

________________

Copyright James Loewen

Mitch Daniels: Friend or Foe to Academic Freedom?

On January 18, 2013, Michael Gerson, formerly George W. Bush’s speechwriter, wrote an op-ed bemoaning Mitch Daniels’s retirement from politics. He called Daniels “arguably the most ambitious, effective conservative governor in America” and was proud that Daniels had decreased state government by 6,800 jobs, ended mandatory union dues, and privatized a toll road. “Daniels is just the sort of leader most needed in a Republican revival,” Gerson proclaimed. He concluded, in sorrow, “The best Democratic politician in America is about to take his oath as president of the United States. The best Republican politician will soon be president of Purdue.”

From personal experience, here’s another side to Mitch Daniels.

In November, 2007, I was scheduled to go on a week-long speaking tour of the Midwest: flying to Kansas City, speaking there; driving to Missouri, speaking and doing research there; driving to Springfield, IL, and speaking there; and finishing by driving to Indianapolis. There, I was addressing the National Association of Student Personnel Administrators. In addition, the Indiana Civil Rights Commission, a state agency, impressed with my research and writing about “sundown towns,” had coordinated three additional events: a talk about “sundown towns” in its office, organized with Indiana University-Purdue University Indianapolis (IUPUI), and a talk and a workshop at Ball State University in nearby Muncie.

Sundown towns, you may know, are communities that for decades were (and some still are) all-white on purpose.

The day I was to leave, I got a distraught phone call from my contact at the Indiana Civil Rights Commission telling me that she had had to cancel all three events they had coordinated. I asked why and got a confused answer, part of which was that almost no interest had been expressed in the event at her office. I doubted this reason and managed, in the hour before leaving for my flight to Kansas City, to reach people at IUPUI and Ball State and re-establish all three events, of course without the participation of the Civil Rights Commission. While traveling, I found that the Civil Rights Commission had also canceled my hotel room, but I managed to rebook it.

Later I learned, definitively, that lack of interest had nothing to do with the cancellation. In July 2006, I had written a short article, “Honda’s All-American Sundown Town,” posted on the History News Network. It noted that Honda had recently chosen Greensburg, Indiana, as the site for its new factory. Like many towns in Indiana, Greensburg was a sundown town. Indeed, probably every town and county in Indiana that has a color in its name – from Brown County and Brownsburg to Vermillion County, White County, and of course Whitestown – kept out African Americans. Color names were not a secret code for sundown towns, as some African Americans long suspected; the correlation was simply coincidental: a majority of all towns in the state were sundown towns, including all those with color in their names.

In 1906, Greensburg’s white residents drove out most of its black population. By 1960, the entire county, which had boasted 164 African American residents in 1890, was down to just three, all female. In the 2000 census, Greensburg still had only two black or interracial households among 10,260 residents. My article noted, “While Honda was choosing its site, its executives had to have noticed the racial composition of Greensburg and Decatur County.” I went on to ask whether Honda chose Greensburg despite or because of its racial past. (Honda has a reputation for avoiding black workers, perhaps believing they are likely too pro-union.)

This article proceeded to stir up a minor hornet’s nest in Indiana. On July 11, 2006, Mike Leonard wrote an article about it in the Bloomington Herald Times and in the process asked the mayor of Greensburg, Frank Manus, whether it was a sundown town. By my memory, Manus replied, “Well, I’ve heard that it was, but I’ve also heard there’s an Easter bunny.” He went on to note, “We have several colored people who live in the city.” His antiquated terminology, typical of sundown towns, caused the Bloomington paper to make his sentence a “quote of the week,” and the story got picked up by other newspapers, including the Fort Wayne Frost. Eventually it reached the publisher of Greensburg’s newspaper, Pete VanBaalen, who has a black son. He met with the mayor and said he was offended by his terminology. The mayor did not apologize, so VanBaalen wrote an editorial about the matter on July 22. In reply, one long-term resident commented,

“People are complaining about Frank using the term ‘Colored People’?! At least he didn't make the comment about what the signs on the outskirts of town said because I do believe the first word was the ‘N’ word. ‘N, don't let the sun set on your back in Decatur County!’ and it even had a picture of the sun setting! I can remember seeing it with my own 2 eyes.”

Another commenter said police use the term “B.I.G.,” meaning of course “Black In Greensburg,” and get “complaints every time a black person walks down a street.” This all prompted the Indianapolis Star to do a story; they interviewed the mayor, who this time did apologize for “colored people.”

My piece at HNN drew many comments, some critical. Readers correctly pointed out that I had offered no evidence that Honda had chosen Greensburg because of its sundown past. Some were sure it would not ever have done so. One stated, “It's a safe bet [choosing Greensburg] had nothing to do with its racial history.” "Does Honda, as a matter of course, pick areas with small black populations, or is this isolated?" asked another, clearly assuming the latter. In January 2007, an authority, James B. Treece, “industry editor” for Automotive News, weighed in. He noted that Honda indeed did pick areas with small black populations, and on purpose. “In 1988, Honda paid what was then the largest EEOC settlement ever -- $6 million -- over its discriminatory hiring patterns at Honda's Marysville, Ohio, factory. The multimillion-dollar settlement was based on Honda's red-lining of Columbus, Ohio, and its minority residents.” He told how Honda then said it would hire only within a certain radius of its plant, drawing the line to exclude black neighborhoods. “Was it race-based? Absolutely.” He also cited a study by two professors at the University of Michigan, Robert E. Cole and Donald R. Deskins Jr., "Racial factors in site location and employment patterns of Japanese auto firms in America," suggesting that “the Japanese,” in the authors’ words, “have a 'taste for discrimination.'”

Landing Honda’s new plant in Indiana had been “a major victory for Gov. Mitch Daniels,” according to the Star. To get it, Indiana coughed up more than $140,000,000 in taxpayer subsidies, according to Roger Bybee. At some point the teapot tempest I had aroused reached the eyes and ears of the governor, or at least his staff. Indiana was not going to help host someone who had attacked Honda! At the instigation of his office, the Civil Rights Commission withdrew my invitations to speak. Apparently Gov. Daniels did not care if Honda was racist. Neither did some commenters on my original article. “Honda exists to make a profit selling cars. It isn't a social welfare agency, which Mr. Loewen thinks it should be,” wrote one. “Honda likely choose [sic] the town because they thought the citizens would make good employees.” To Honda and that writer, blacks might not?

For the record, corporations should care about the communities into which they move and within which they hire. Many do. For example, before Quaker Oats moved into Danville, Illinois, not far from Greensburg, it asked Danville to pass an open housing ordinance, partly so its black employees could be assured they could find places to live. Danville did. Research by Laurie Bassi, Ed Frauenheim, and Dan McMurrer shows that companies that care about “social welfare” actually prosper. So doing good helps them to do well.

The Civil Rights Commission’s claim of lack of interest was belied by a standing-room-only crowd at IUPUI, drawn partly by co-sponsorship from IUPUI's Africana Studies Program. Law School Prof. Florence Roisman introduced me with a five-minute talk about the First Amendment and freedom of speech. Ball State was delighted to re-establish my events on their campus, which were also well-attended.

Is it “personal” of me to take a view of Mitch Daniels’s retirement from state or national politics so different from columnist Gerson’s? I don’t think so. I think it’s a good thing for Indiana and the United States that Barack Obama, not Mitch Daniels, was re-elected president. I think it’s a bad thing for Indiana and Purdue that Mitch Daniels has just become president of Purdue. Retaliation for speaking out against injustice is not what we need in higher education or politics.

________________

Copyright James Loewen

At War With Art

Appears in print in the March 11-18 edition of The Nation.

“The Civil War and American Art,” the current exhibition at the Smithsonian American Art Museum in Washington, marks the sesquicentennial of the Civil War. After it closes in late April, the show will travel to the Metropolitan Museum of Art in New York for most of the rest of 2013. To complement the show, the Smithsonian has published, in conjunction with Yale University Press, a beautiful companion volume that includes many images not on display in the galleries and several chapters of commentary. The exhibit and book are an occasion not only to showcase some fascinating art, but also to clear up the misconceptions that many Americans still hold about the period. Unfortunately, both squander that opportunity.

Unlike, say, Emanuel Leutze’s famous Washington Crossing the Delaware, which depicts exactly what its title describes, few of the works in “The Civil War and American Art” portray the war directly. The curator, Eleanor Jones Harvey, explains this by claiming that photography, only recently invented, carried “the gruesome burden” of documenting the war’s carnage and destruction. As a result, artists shied away from direct representations of the war. But photography in the 1860s could not capture the action of war. Photographers could shoot posed generals, but because exposure times required still subjects, they could not photograph battles, only their aftermath -- mostly corpses. Mathew Brady’s famous 1862 New York City exhibit was titled “The Dead of Antietam,” not “The Battle of Antietam.”

To make Harvey’s claim stick, the exhibition simply omits most art that did portray the war itself. Two blocks from the Smithsonian is the National Building Museum, formerly the Pension Building. All the way around its exterior, on a frieze above the first floor, march Civil War soldiers, cavalry, artillery and supply trains. In front of the Capitol, the Grant Memorial shows the carnage of war viscerally. As early as 1862, the sculptor John Rogers offered mass-produced small sculptures of battlefield scenes like Sharp Shooters. All are missing from “The Civil War and American Art.” 

Sharp Shooters by John Rogers (1862)

Also missing are the mass-produced images about the conflict -- lithographs from Currier and Ives, woodcuts from Leslie’s Illustrated and Harper’s Weekly, and editorial cartoons of all kinds, including by the master, Thomas Nast. The exhibition includes portraits and landscapes by Winslow Homer but not his battle scenes for Harper’s; those would interfere with the claim that artists did not depict the war itself. Robert Knox Sneden’s drawings and watercolors of Andersonville Prison and other Civil War scenes, including naval engagements, make no appearance. Neither does folk art. 

Winslow Homer's sketch of The Sharpshooter for Harper's

The 1863 oil-on-canvas version.

Even if the exhibition had been titled “The Civil War and American Painting,” there still would be a problem. Harvey’s claim that because photography was better at capturing the “grim reality” of war, “American artists could not depict the conflict with the conventions of European history painting” simply isn’t true. Neither European history painters nor Americans felt such a constraint. Americans hired Europeans to produce enormous “cycloramas” -- oil paintings on canvas more than 300 feet long, mounted in circles surrounding their audiences -- of the battles of Gettysburg, Vicksburg and Atlanta. They attracted thousands of tourists; the cycloramas in Gettysburg and Atlanta still do. Their popularity undercuts the exhibit’s claim that war photographs drove historical paintings out of vogue. American artists like Peter Rothermel also painted the war itself, but this exhibit simply leaves them out. Two paintings of the Battle of Antietam by James Hope do make it into the book, but not onto the walls.

* * * * *

A Coming Storm by Sanford Robinson Gifford (1863)

After omitting art that portrays the Civil War, Harvey concludes there was little art portraying the Civil War. Not to worry, though: the exhibition’s title is “The Civil War and American Art,” not “in American Art.” The exhibit’s organizing principle is: “Genre and landscape painting captured the transformative impact of the war, not traditional history painting.” This insight proves not so much a conclusion as a prejudice. It leads Harvey to imagine traces of the Civil War in paintings of landscapes far removed from the conflict: if a painting shows a thunderstorm over Lake George, that signifies the approaching war; if another shows a rainbow over a tropical landscape, that’s because the artist, affected by the conflict, was seeking to transcend it. The exhibit’s wall text for Frederic Church’s Rainy Season in the Tropics quotes a sentence from The New York Times in 1865, as the war ended: “No more deluge of blood.... The whole heavens were spanned with the rainbow of promise.” So Church really was painting the war, not a rainbow. In fact, Church had gone to Ecuador and Colombia in 1853 and 1857, well before the conflict was even envisioned. Sketches made then led to Rainy Season in the Tropics in 1866, complete with that rainbow. He traveled to the tropics -- and later to the Arctic -- in response to the polymath Alexander von Humboldt, who implored artists to paint the very ends of the earth.

Rainy Season in the Tropics by Frederic Edwin Church (1866)

Unfortunately, seeing war in artworks that aren’t about the war leads to shaky commentaries on the art. Harvey finds a “poignant connection” between Sanford Robinson Gifford’s A Coming Storm (1863) and “the nation’s grief.” Inspection of the painting, however, reveals that it is actually more about light than darkness. Brightness shines through in the center, where the clouds part, lighting up a boulder and some maple trees in autumn. (Gifford retouched and redated the painting in 1880, and may have added more light; art historians aren’t sure.) It cannot presage the gloom of war. In her comments on Homer Dodge Martin’s The Iron Mine, Port Henry, New York (circa 1862), Harvey misreads two works at once: “The mine shaft openings resemble bullet wounds, and the rusted tailings of iron ore stain the slopes like dried blood. This scarred landscape subtly recalls Gardner’s battlefield photographs of fallen soldiers.” Well, no, it doesn’t. Alexander Gardner’s photographs show few bullet wounds, no colors (of course), and no slopes with blood running down. The comparison is solely in the mind of the curator. No one who knew Martin claimed that he was alluding to the Civil War, so far as I can tell. More misleading still is the wall text for Frederic Edwin Church’s Aurora Borealis (1865):

Under a dark Arctic sky, polar explorer Isaac Israel Hayes’s ship, the SS United States, lies frozen in the pack ice…. The auroras above erupt in a cascade of eerie lights. Throughout the war, auroras were solidly associated with apocalyptic warnings about the conflict. As the ice grips Hayes’s ship, and by proxy the nation, the auroras snake across the sky like a grim warning from God, a bleak foreshadowing of doom.

Aurora Borealis by Frederic Edwin Church (1865)

Only a curator who had never seen the northern lights would imagine that they “snake.” They don’t. The S-curve is the bottom of a curtain that descends or appears but does not snake. An impressive display appeared over the northern United States on December 23, 1864, but it occasioned few “grim warnings” or “foreshadowings of doom.” Nor would any have been appropriate, because Sherman had taken Atlanta in early September 1864, leading to Lincoln’s re-election in November. For that matter, the boat’s situation was not grim, either. Hayes expected it to be “gripped,” but after the ice loosened, he brought it back to the United States in triumph.

Looking Down Yosemite Valley, California by Albert Bierstadt (1864)

According to the exhibition, no subject, however distant, remained untouched by the war. In 1864, the United States bought Yosemite and turned it into a park. Albert Bierstadt hastened to the valley, which he called “the most magnificent place I was ever in,” and made numerous paintings of it. On exhibit is Looking Down Yosemite Valley, California. According to the wall text, “Bierstadt’s views of Yosemite held out the promise of a place where all Americans could slough off the trauma of war and sectarian strife, a place of renewal and healing.” The book takes the interpretation even further: “Bierstadt’s painting represented a wartime yearning for sanctuary. But what appears to be the promise of redemption is in fact mostly an escape -- not a solution to the nation’s problems.” Surely this is the first time in history that Yosemite Valley has been criticized for not being “a solution to the nation’s problems,” and surely Bierstadt never thought it might be when he painted it. Interpretations like these reflect a deep unease about the ambiguous relationship between history and art. 

* * * * *

When the exhibit turns from the Civil War to Reconstruction, its history goes completely off the rails. According to the big piece of wall text that introduces the postwar gallery, “The ensuing bitterness that permeated Reconstruction in the South came from dual causes -- the realization that the North quickly went back to business as usual, having sustained little damage, while Federal promises to rebuild the South were more often broken than fulfilled.” Under the title “The Unraveling of Reconstruction,” the book states:

Reconstruction began as a well-intended effort to repair the obvious damage across the South as each state reentered the Union. It was an overwhelming task under ideal circumstances. Following Lincoln’s assassination, that effort soon faltered, beset by corrupt politicians, well-meaning but inept administrations, speculators, and very little centralized management for programs.

This exhibition is hardly alone in misconstruing Reconstruction as reconstruction, but it’s embarrassing to find such an elementary blunder in print at a national museum. To be sure, the war had ruined parts of the South. But Reconstruction had nothing to do with rebuilding this “obvious damage.” Reconstruction was a political process: the seceded states had to be reconstituted politically to be readmitted to the Union. How to do this occupied President Andrew Johnson and the Republican-dominated Congress from 1865 until Johnson left office in 1869; Ulysses Grant then oversaw Reconstruction until it ended in March of 1877.

Comparing Georgia and Florida provides an easy way to grasp the matter. The Civil War raged across Georgia for the better part of two years, including General Sherman’s burning of Atlanta. In Georgia, Reconstruction lasted until October 1871 -- six years. Florida escaped the Civil War almost completely unscathed, but Reconstruction there lasted until 1877 -- eleven years. Again, Reconstruction was not about physical reconstruction, but about race: would African Americans be allowed to share power? In less than one year in just one state -- Louisiana -- white Democrats killed more than 1,000 people over this issue. I know of no murders committed over “centralized management” or any other matter connected with rebuilding.

Reconstruction still puzzles many Americans. The art journalist Tyler Green, reviewing the book published with the exhibition, says it “deserves to win awards in two disciplines: Art history and American history.” Like Green and Harvey, many Americans never learned in school what Reconstruction was really about. Even today, some of our K–12 history textbooks maintain the confusion. The American Journey, for example, begins:

The war had left the South with enormous problems. Most of the major fighting had taken place in the South. Towns and cities were in ruin, plantations burned, and roads, bridges, and railroads destroyed. ... People in all parts of the nation agreed that the devastated Southern economy and society needed rebuilding. They disagreed bitterly, however, over how to accomplish this. This period of rebuilding is called Reconstruction. This term also refers to the various plans for accomplishing the rebuilding.

This chapter is allegedly by James McPherson, our foremost Civil War historian, but McPherson would never have written that passage and allowed such a mistake to stand.

The Bright Side by Winslow Homer (1865)

If Harvey had gotten the history of Reconstruction right, the meaning of the art from that period on display in the exhibition would have resonated more clearly. In addition to the political reconstruction of Southern state governments, with the attendant societal transformation as Confederate leaders were disfranchised and African Americans were enfranchised, Reconstruction was also an ideological movement. Across the entire country, many white people came to favor black voting and even equal rights. Paintings by Eastman Johnson and Winslow Homer, including some in this exhibit, show this equal treatment of African Americans. The curator does recognize that the place of African Americans was the galvanizing issue behind secession and the Civil War. But she cannot effectively tie Johnson’s and Homer’s works to Reconstruction as an ideology, because she thinks Reconstruction refers mainly to physically rebuilding the South.

The Girl I Left Behind Me by Eastman Johnson (1875)

Instead of perceiving Reconstruction in Johnson’s pictures of African-Americans, she sees it in his 1872 portrait of a white girl with windswept hair, The Girl I Left Behind Me, calling it “a compelling, if complicated, commentary on Reconstruction-era America.” “Complicated,” indeed: Johnson first titled it The Foggy Day, then Young Maidenhood, because it had nothing to do with Reconstruction. The girl wears a ring, possibly a wedding ring, which prompts Harvey to ask: “Is Johnson referring to her personal life or to the Union as the nation?” So an artist cannot picture a ring during or even well after the war and have it just be a ring! Such commentary reminds me of how, back when I was a lad, my Presbyterian Bible “explained” the sex passages in the Old Testament’s “Song of Solomon” as being about “Christ’s love for his church.”

Despite all this bad history, there are two reasons the exhibition is worth seeing. First, there is some gorgeous art. The Hudson River School is on beautiful display, including four large landscapes by Church: The Icebergs, Cotopaxi, Aurora Borealis and Rainy Season in the Tropics. The wall text is absurd. Cotopaxi depicts a volcano in Ecuador. “Although Cotopaxi is not specifically about the Civil War, it is suffused with it,” Harvey pontificates. “Race slavery was North America’s volcano, a simmering force, hidden and suppressed, but waiting to erupt explosively.” Too many visitors to art museums spend more time reading the wall texts than looking at the works on display; this is a reminder of why it’s essential to look at the art.

Cotopaxi by Frederic Edwin Church (1862)

The second reason for seeing the exhibition is that Harvey has done a good job of assembling nineteenth-century paintings that treat race. She rightly notes that service in the US armed forces brought many white Americans into contact with African Americans for the first time. Certainly Winslow Homer and Eastman Johnson came to know African Americans during the war, and the results are well represented here.

So, go. Look. Think. But don’t read anything on the wall. Instead, go home and read Eric Foner’s Reconstruction. That way, you’ll get good art and good history.

________________

Copyright James Loewen

New Opposition to Old Sports Mascots

On February 7, 2013, the National Museum of the American Indian (NMAI) hosted a day-long seminar, "Racist Stereotypes in American Sports." The handout used to promote the event paired two graphic images: a stereotypical black doll on a base saying "Not. Cool." and the stereotypical Cleveland Indians mascot on a base saying "Go Tribe!" It made a stunning impact. So did the symposium, which drew considerable attention from the Washington Post, including the entire front page of its free "Express" edition the next morning.

The images used to market the seminar, "Racist Stereotypes in American Sports," at the National Museum of the American Indian continue a tradition of bitter Native humor. Prior examples included sports pennants pictured in Native newspapers honoring the "Atlanta Niggers," "New York Kikes," "Chicago Polacks," and "Washington Redskins," prompting readers to realize that, in a democracy, none of the above would be conceivable. [Picture courtesy of the author.]

The Native mascot is an issue I've long thought about and spoken about too, especially at the University of Illinois, where I hold a visiting professorship in African American studies. Chief Illiniwek at the "U of I" was one of the key sticking points among collegiate mascots. It happens that as a Cub Scout in Decatur, Illinois, I learned Indian lore from none other than A. Webber Borchers, then in his early 50s. If not quite the founder of Chief Illiniwek, Borchers was responsible for establishing the tradition.  Quite an entrepreneur, Borchers traveled to Pine Ridge Reservation in South Dakota in 1930 and hired three Native women to make the costume that he used as Chief Illiniwek. That the Dakotas were Plains Indians while the Illini were prairie farmers and hunters made no difference; for mascot purposes, one Indian was as good as another. Better, even, for a warrior on a stallion is certainly more charismatic than a farmer on foot. 

Webber Borchers on his pony, c.1930. Courtesy University of Illinois.

Borchers was always controversial.  After WWII, it was said that he led his Boy Scout troop through Germany, looting it of Nazi memorabilia. He became a right-wing Republican state legislator, elected and re-elected because Illinois allowed cumulative voting in its three-member districts. His supporters -- relatively few but absolutely passionate -- gave him all three of their votes, so he never came in lower than third. Eventually he was found guilty of theft of state funds and official misconduct and received a sentence of weekend jail time and a $5,000 fine. 

Jay Rosenstein's In Whose Honor showed nationally on PBS, helping spark the elimination of many Indian names and mascots, including Chief Illiniwek, pictured here.

To be fair, however, Borchers was respectful and serious to us Cub Scouts about American Indian culture. Later, as a young adult, I never questioned Chief Illiniwek ... until I saw In Whose Honor, Jay Rosenstein's fine video about Charlene Teters, a Spokane Indian and mother of two small children who found herself in Champaign/Urbana as a graduate student at the U of I. Gradually she began to oppose and even protest the mascot, partly because of the embarrassment the Chief's halftime performance caused her children. As well, on football weekends the campus was overrun by adult fans wearing silly orange and blue face paint and childish colored feather headdresses -- cartoonish approximations of Native American culture. Plastic foam tomahawks and the "WHOO-whoo-whoo-whoo" drumbeat remain equally silly at Atlanta Braves games. (Native Americans use a "WHOA-whoa WHOA-whoa" beat that they say imitates the human heart.)

Like Chief Illiniwek, most Natives used as symbols are Plains Indians, a short-lived culture that arose around 1680 and was ended two centuries later. No woodland Indian could wear such a headdress for more than a few seconds before a branch would knock it off. Besides the headdress, most Native mascots are half-naked. Sometimes they brandish a spear.

In this typical image, a well-dressed Dutchman, perhaps Peter Minuit, hands $24 worth of beads to a Native American wearing only a breechcloth and a full feathered Plains headdress.  This sculpture is located on the exact spot in lower Manhattan where this transaction never took place.  If this transaction that never took place took place in August, the Dutchman is sweating; if in February, the Native is freezing.  To put it another way, no two people have ever dressed like that on the same day on the same point on the earth's surface.  The sculptor did not strive for realism, of course; rather, he followed an artistic convention about clothing showing "primitive" on the left, "civilized" on the right.  Plains Indian culture had not even been created when this transaction never took place, but again, one Indian is as good as another. [Picture courtesy of the author.]

The University of Illinois went through a wrenching process to rid itself of its Chief. I played a minor role, giving talks on campus to the effect that the Chief could never again play a unifying role. I pointed out that, historically, Illiniwek and many other mascots dated to the Nadir of race relations -- that terrible era, 1890 to about 1940, when white Americans became more racist in their thinking than at any other time. (I cannot stop to justify "more racist" in this short piece but refer you to Chapter 2 of Sundown Towns or Chapter 10 of Teaching What Really Happened.) Whites named Indiana for American Indians precisely as they were driving American Indians from the state. Similarly, during the Nadir whites made use of Native symbols precisely as they were driving Native people into despair. The most ironic example was the fraternal organization "The Improved Order of Red Men," founded in the 1840s, which reached its zenith just as Native Americans reached their nadir. By 1920, the Red Men claimed more than half a million members, not including their female auxiliary, "Daughters of Pocahontas." Meanwhile, the census that year showed just 244,000 Native Americans. Of course, no Native American could be a "Red Man"; that privilege was reserved for Caucasians.

The most grandiose Nadir scheme for remembering Indians as they disappeared was the plan to build a huge statue on Staten Island “to the memory of the North American Indian,” in the language of the “Indian Monument Law.” It was to symbolize and eulogize "the departed Indian." It would have stood taller than its neighbor, the Statue of Liberty, with which it would have formed an ironic coupling: welcome to the new (whites) and goodbye to the old ("reds"). Congress set aside land for the monument and President Taft presided at its ground-breaking. The funds never materialized, however, so the monument was never built.

During the Nadir, whites used the Dawes Act to turn Indian reservation land into white homesteads. Many whites expected Native Americans to disappear entirely. Population figures provided some grounds for this belief: the number of American Indians in what is now the United States declined from about 5,000,000 at first contact to a low of 245,000 in 1920.

In 1915 James Earle Fraser produced his most famous work, "The End of the Trail." It does not portray a tired Native American at the end of the day, of course, but the Nadir for American Indians as an entity. In the 1990s, a company still sold smaller replicas of "The End of the Trail," advertising them in Smithsonian magazine as Fraser's "tribute to the American Indian."

During the Nadir, not just colleges but also many high schools chose Native names and symbols. Few had any Native students or faculty. It's a telling point that the use of Native Americans as mascots was part of the Nadir. Indeed, the most important problem with using Native Americans as mascots is not their effect on American Indians such as Teters's children, but rather their impact on non-Indians. Mascots appropriate Native symbols as if they were in the public domain, which implies Natives no longer exist. So do the symbols whites select. Indian mascots are not only frozen onto the Plains, they are also frozen in time -- about 1876 (Little Big Horn). This encourages non-Indians to conclude that Native Americans, too, are frozen in time or no longer exist nowadays, at least not in any significant numbers. No twentieth-century Indian ever served as a mascot, because such a person might wear a business suit or hard hat or clerical collar, depending upon his (or her?) job. That wouldn't do. 

Nor can high schools or colleges control the use of their names and mascots by others, including both fans and opponents. If newspapers want to say, "Braves Scalp Titans," they are free to do so, implying that Natives in warfare were less civilized than Europeans, precisely the opposite of the truth. If fans want to wear T-shirts sporting buck-toothed caricatures, they are free to do so. If in the process non-Indians infer that Native Americans are savage, go around half-clothed, are indeed "primitive," they are free to do so. 

With the advent of Title IX, high schools now field girls' teams in basketball, volleyball, track, and other sports. Some changed "Braves" to "Lady Braves"; some resorted to "Squaws," a term with special problems of its own. "Squaw" is plainly a derogatory term. It may derive from a French corruption of an Iroquois epithet for vagina, similar to "cunt" in English. It may derive from an Algonquian suffix simply meaning "female." Either way, over the centuries it has taken on contemptuous overtones. "Squaw" cannot be an honor. Neither can "Redskins." Even less loaded terms like "Braves" and "Indians" usually "otherize" Native Americans as different from "us." 

To see this problem, imagine if an overwhelmingly Muslim high school in, say, Queens, New York, or Dearborn, Michigan, named its athletic team the "Christians." Suppose they displayed a mascot who wore a clerical collar, crossed himself, and raised a chalice and took communion as part of an interesting and "culturally accurate" dance at halftime. Many Christians would not be amused. If the Muslims told the Christians, "We are only paying homage to you and your costumes," many Christians would still not be satisfied. Yet the Native symbols that white schools commandeer -- dance, face paint, eagle feathers, and ceremonial dress -- are sacred items in many Native cultures and religions.

Since most non-Native Americans have little contact with Native Americans, stereotypical "Indians" like mascots provide much of what they "know" about them. Social psychologist Stephanie Fryberg found that "priming" people with mascot images like Chief Wahoo and Chief Illiniwek lowered scores on a self-esteem scale among Native Americans but boosted self-esteem among non-Natives. Self-esteem is fine, of course, but surely it should not be sought at others' expense.

"Redskin" in particular is hard to defend. The term itself dehumanizes, labeling a person by the color of his/her skin. Some argue its roots go back to the requirement that a "settler" (read European American) had to bring in a scalp with enough "red skin" to prove it was an Indian to collect the bounty for killing a "savage." (Note the projection in the term "savage.") Hitler picked up on "redskin." He admired how the U.S. and Canada had wiped out most of the original settlers of those nations. He often referred to Russians as "Redskins" and suggested that Germans needed to do the same: "to look upon the natives [of Russia] as Redskins." Then they would see "There's only one duty: to Germanize this country by the immigration of Germans." [quoted in James Pool, Hitler and His Secret Partners, 254-55]

Yet another problem with using Native Americans as mascots has to do with power. Of course other groups serve as mascots too. Think of the Minnesota Vikings, the Fighting Irish of Notre Dame, the Vancouver Canucks. The last is even an ethnic slur. But each of those names was chosen with major input from the group to which it refers. Proportionately more Norwegian Americans live in Minnesota and adjacent states than anywhere else in the United States. Irish Americans have always had a special affinity for Notre Dame and have often headed the school. The NHL was born in Montreal; French Canadians have long been a large proportion of its skaters. If Norwegian Americans, Irish Americans, or French Canadians wanted to change those names, they would do so whenever they pleased. In the meantime, they can and do take ownership of them. 

What would we think if Brandeis University called its teams the "Jews" or even "Fighting Jews?" (Actually, they are the Judges.) We might think less of the school -- yet Brandeis is majority Jewish, so at least Jewish Americans would have done it to themselves. Earlham College does call its teams the "Quakers," sometimes even the "Fighting Quakers." Again, Earlham began life as a Quaker school and Quakers still run it.  They can change their moniker at any time. College of the Holy Cross, a Catholic school, calls its teams "Crusaders." Wake Forest uses "Demon Deacons," it being a Baptist school. Again, it's under their control. 

Not so with "Redskins," "Savages," "Squaws," "Braves," or until recently the "Fighting Sioux" of that other NDU, North Dakota University. Native Americans have almost no power in the United States, NFL, Major Leagues, or our mainstream colleges and universities. So non-Indians are free to say, "We don't mean anything bad by naming the team after you. We mean it as an honor. If you don't see it that way, tough." As one of my students at Catholic University of America put it in an e-discussion in 2002 about the local NFL team name: "It is only a minority of the people that want the name changed and the U.S. as a country is ruled by a majority." Obviously de Tocqueville's warning about the tyranny of the majority never reached this student. Or, as a Redskins fan wrote after the NMAI seminar, "The team should not change their name. I certainly don't deny that the name is a racial slur, I just don't care." Thus using mascots coarsens non-Indians, teaching them to disregard others' views if they are a minority. This is the very opposite of the cultural sensitivity our young people need to learn to succeed in today's global economy.

The U.S. Commission on Civil Rights had called for non‑Indian schools to end the use of American Indian mascots back in April 2001. After a decade of hearing arguments like the above, the National Collegiate Athletic Association (NCAA) came out against Native mascots. "As we reached our decision," said Delise O'Meally, Director of Governance and International Affairs for the Association, speaking at NMAI, "there was overwhelming evidence of the harm these mascots" caused. Colleges hastened to comply. The vast majority have now dropped their Indian-related names, whether "Braves," "Redmen," "Redskins," "Savages," or "Warriors." Five schools have managed to finesse the issue by retaining their names but de-Indianizing them and dropping their Indian mascots or symbols:

  * Alcorn State University (Braves and Lady Braves)
  * Bradley University (Braves)
  * Carthage College (Red Men and Lady Reds)
  * University of Illinois at Urbana-Champaign (Illini)
  * College of William and Mary (Tribe).

This has led to at least temporary success in avoiding sanctions from the NCAA. 

Five colleges won at least temporary approval from the NCAA for their mascots because they showed that local American Indian groups approved: 

  * Catawba College (Catawba Indians)
  * Central Michigan University (Chippewas)
  * Florida State University (Seminoles)
  * Mississippi College (Choctaws)
  * University of Utah (Utes).

What's different about, say, Central Michigan's use of "Chippewas" or the Florida State "Seminoles" is that those universities have consulted and continue to talk with nearby Native Americans, ceding them some power at the table. Not all Native Americans are of the same mind about the use of Indian names and symbols as mascots. Why should they be? Moreover, Chippewas do remain a presence in central Michigan, as Seminoles do in Florida.

The real Osceola, at right; “Chief Osceola” at Florida State, at left.

Even so, issues remain. For example, "Osceola" is the current symbol of Florida State; he rides in triumphantly on his horse, "Renegade," at halftime, and throws a flaming spear into the ground. "Osceola" is certainly an improvement on FSU's prior mascots, which included "Sammy Seminole," "Chief Fullabull," and later "Chief Wampumstompum." All remain inevitably "white man's Indians," however, even "Osceola," literally the creation of whites. The real Osceola, never defeated, was invited to enter peace negotiations by the U.S. Army, which then arrested and incarcerated him. This dishonorable behavior caused an uproar among civilians, but the debate that followed did Osceola no good: he died after three months in jail. Army officers then cut off his head and took it and his possessions as souvenirs. The half-time performance at a football game cannot appropriately include his tragic end, of course. Yet without it -- indeed, without any hint of the real man -- "Osceola" cannot represent Osceola.

To be sure, many Native Americans, especially in the West, don't care about the mascot controversy. I have spoken with Winnebagos and Navajos and others of this opinion. However, national Native leadership verges on unanimous disapproval. In addition to NMAI, the National Congress of American Indians, the National Indian Education Association, and fifty other Native organizations have come out against the practice. Native thought is changing. So is non-Native thinking and practice. The state school board of Maryland came out against Indian mascots by a vote of 10 to 2. Proponents of retaining "Fighting Sioux" at the University of North Dakota forced a statewide referendum on the matter in 2012 but were stunned when citizens voted two to one to drop the name. Several major newspapers, including the Duluth News-Tribune, Kansas City Star, Minneapolis Star-Tribune, Portland Oregonian, and Seattle Times, no longer use "Redskin." Washington's arts weekly, the City Paper, switched to "Pigskins." The others just say "the Washington NFL team," as will I henceforth. Perhaps movement on this matter is akin to the rapid change we are seeing about same-sex marriage. Still, among major professional sports teams, the Atlanta Braves, Kansas City Chiefs, Cleveland Indians, and Chicago Blackhawks show no signs of giving way, although the Blackhawks' mascot is a hawk, not an Indian (its logo remains an Indian head).

In the aftermath of the seminar at NMAI, officials of the Washington NFL franchise cited high school teams as justification for their offensive nickname. The reasoning seems to be: if high schools also call their teams "Redskins," then the term can't be racist, so the franchise's use of it is legitimate. Of course, the defense lacks logic. As well, the NFL team owner apparently does not realize that high schools across America have been giving up the name. Monroe Gilmour and the North Carolina Mascot Education & Action Group have helped persuade more than half of all schools in North Carolina that once used "Redskin" and other such names and symbols to abandon them. Moreover, on February 8, 2013, the day after NMAI's seminar, the Michigan Department of Civil Rights filed a discrimination complaint with the Office for Civil Rights at the U.S. Department of Education. The state agency charged some thirty-five school districts in Michigan with discrimination against American Indians because they named their athletic teams "Redskins," "Warriors," "Chiefs," etc., and used Indian mascots. According to the Civil Rights Department, this "creates a hostile environment."

Michigan deliberately chose to file on February 8 because February 8, 1887, was the dark day in Indian history when Congress passed the Dawes Act. In a fascinating invocation of the power of history, the Civil Rights Department stated in its Supporting Argument:

[W]e believe that no school where students, teachers, parents, and administrators knew and taught America's history well enough to recognize the Dawes Act would want to use the cartoonish imagery, sacred objects, disrespectful nick names, or other questionable imagery of American Indian's that many use today.

The complaint singled out "Redskins" for its "particularly negative connotations," since it "has historically been used as a racial slur." The state asserted that studies by Fryberg and others "empirically, objectively, and conclusively establish" that the continued use of American Indian mascots harms Native students.  Moreover, Michigan continued,

  "officially sanctioned use of such imagery conveys a message that stereotyping is acceptable. This has an indirect negative impact on all students when they later must deal with diverse workplaces, a diverse society and a global marketplace."

Surely NMAI, Gilmour, the Kansas City Star, the Michigan Department of Civil Rights, and all the rest have a point.  America's high schools and colleges need to prepare students with sensitivity, so they can work with people from other countries and other cultures. Otherizing other people by naming teams "for" them does not help. Nor does "honoring" American Indians as mascots help us remember American Indian history as it was.

________________

Copyright James Loewen

The Nature (or is it Nurture) of Color

Image via Shutterstock.

Years ago, when I was teaching sociology at the University of Vermont, a colleague introduced me to a classroom exercise that he found useful for showing students gender differences. Eventually I wrote it up and Teaching Sociology published it.

The exercise had two parts. The first, labeled "Colors," listed eight colors -- not the easy ones like red, white, and blue, but puce, taupe, teal, mauve, magenta, chartreuse, ochre, and sienna. Students were to match each with one of ten definitions, such as "brilliant yellow" or "brownish gray."

The second half listed eight "Football terms": safety, screen, curl, trap, touchback, lateral, touchdown, and clip. Again we supplied ten definitions, such as "to block out a defensive player from the side after he has crossed the line of scrimmage."

In class, after students answered, I gave them the right answers. Then they filled in a little scoring form. They wrote down the number of color terms they got right, subtracted the number of football terms they got right, and added 8, so that no total could fall below zero. Then I reseated the class, with the highest scorers in front.
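For readers who like to see the scoring rule as arithmetic, here is a minimal sketch; the function name and the sample numbers are mine, purely for illustration, not part of the original form:

    # A sketch of the scoring form described above; names and numbers are illustrative.
    def exercise_score(colors_right, football_right):
        # Knowing colors raises the score; knowing football lowers it.
        # The +8 keeps every possible total at zero or above.
        return colors_right - football_right + 8

    print(exercise_score(colors_right=7, football_right=1))  # 14 -- front of the room
    print(exercise_score(colors_right=1, football_right=7))  # 2  -- back of the room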

Even though readers here surely see the unfairness of the procedure, in class most students didn't. Of course, the result was, women students wound up overwhelmingly in the front, men in the rear. Most students assumed that women had shown more knowledge; only a few noticed that test-takers were actually penalized for knowing football terms. Men in particular were upset; many did not relish the possibility that they might be seated this way for the rest of the semester, according to their performance on such a "stupid" test.

Some male students who wound up in the front seemed embarrassed. I think some may have been gay, although one certainly need not be homosexual to be more interested in colors than football. Back then, and even now, I would not ask a student's sexual orientation publicly. Girls who wound up in the back usually credited their brothers or fathers for fostering an interest in football; rarely did they seem embarrassed. Perhaps this showed that male spheres of knowledge were more valued, or maybe it showed that girls' "tomboy" interests are not stigmatized while boys' "sissy" interests are.

My colleague and I used the exercise as a jumping-off point for discussing gender in society. We pointed out, for example, that in some executive lunchrooms, professional sports are a much more common topic of conversation than interior decorating. (That's certainly the case here in Washington, D.C., especially on Monday mornings during the National Football League season.) Indeed, the woman executive with nothing to contribute about sports can feel alienated or be viewed as aloof. Some years ago, The Learning Annex put out a self-help book to remedy this problem, How to Talk Sports to Men. "Does ERA mean only the Equal Rights Amendment to you?" it asked. Women who expressed their ideas in sports imagery, it suggested, became "easier for [men] to understand, even if the ideas ... remain the same."

Of course, some women didn't care. During the spring semester in 1993, coincidentally after I had used the exercise successfully in class, I watched the Super Bowl on television with three other people. An interesting play had just taken place on the field, and the other male and I were discussing it, when I noticed that the two women were conversing intensely about the name for the color of the sport coat worn by Dallas coach Jimmy Johnson, striding up and down on the sidelines. "Life imitates sociology class," I concluded.

The matter is not really trivial, however. Seemingly inconsequential differences in knowledge can lead to significant stratification by gender. Years ago, for example, working with Phyllis Rosser and John Katzman, I examined the serious gap in female performance on the math part of the SAT. We looked at performance by gender on old SAT items. Simply setting one item at a girls' camp instead of a boys' camp prompted girls to do better, boys worse. The item with the largest male/female gap of all involved sports statistics. Even my son -- no math whiz in high school and now an English teacher -- mastered sports statistics. They were important to him. By deleting that item and altering others, we were able to come up with a math test on which girls performed as well as boys.

Incidentally, "SAT" is not an abbreviation for "Scholastic Aptitude Test," even though most Americans think it is. Partly as a result of arguments I made to Nancy Cole, then a vice president of the Educational Testing Service (ETS), purveyor of the SAT, the company removed "aptitude" from the test's title after Cole became president of ETS (see The Validity of Testing in Education and Employment), renaming it the "Scholastic Assessment Test." A few years later, painfully aware that "Assessment Test" was redundant, ETS renamed the SAT once more. Now it merely stands for "S.A.T." -- the initials mean nothing at all!

Of course, our little teaching exercise was hardly the last word about gender and color. Indeed, a recent article in Smithsonian magazine (March 2013), "50 Shades of Gray Matter," claims that women may have a genetic ability to distinguish colors better than men. The author, Libby Copeland, notes that "women possess a larger vocabulary than men for describing colors." This, she believes, already provides indirect evidence that it's genetic. (I would have thought exactly the opposite: the larger vocabulary suggests that girls have spent more time learning about colors and thinking about colors.)

As well, she cites research by Israel Abramov, "a psychologist and behavioral neuroscientist" at Brooklyn College. He found that "an object that women experience as orange will look slightly more yellowish to men." He deduces tentatively that Darwinian evolution is responsible: "males needed to see distant, moving objects, like bison, while females had to be better judges of color when scouring for edible plants." Copeland concludes, "Someday, further studies could reveal whether these traits could have implications for how men and women perform in fields such as the arts or athletics."

I suggest before we search for biological causes of gender differences, we first rule out likely social causes. Usually these exist in abundance. It's hardly rare to come upon a girl who by age twelve has been given dolls' outfits or girls' clothing in pink, scarlet, crimson, lake (yes, boys, "lake" too is a shade of red), vermillion, and magenta. It's hardly common to find a boy who by age twelve has not been given a football.

Looking for causation in the "hard" sciences may be trendier. And of course, it's hard to disprove that some eons-ago differentiation in occupation by gender led to higher survival rates for women with more color acuity. It is also difficult to prove such a causal argument, however.

And it's dangerous. When "hard scientists" conclude that girls have more ability in a given sphere by nature, feminists may applaud. They may not applaud when scientists conclude that girls have less ability by nature in some other sphere and therefore perhaps advise them against entering some important field of human endeavor. Certainly we should see where the evidence takes us. However, too-easy reporting like "50 Shades of Gray Matter" merely exemplifies lay credulity toward "science." Ms. Copeland, along with Dr. Abramov, may need a course in introductory sociology -- may even need to experience our little color/football exercise.

Wabash Cannonball

"Wabash Cannonball" is a light-hearted yet serious country-music song. It celebrates a train that went past my house at the southern edge of Decatur, Illinois, throughout my childhood. Maybe for that reason, my father bought Roy Acuff's recording of it shortly after Columbia's invention of the "LP" (long-playing record). I've heard this song since about 1950. I sang it myself -- in public -- in 2009. Now, I cannot get it out of my head.

Remembering Mark Twain's famous short story "Punch, Brothers, Punch!" -- about a man who could not get a catchy jingle out of his head until he infected another person with it -- I turn to you in desperation. In the process of my passing the "Wabash Cannonball" on to you, we might enjoy a little railroad history together, perhaps a sort of baseball story, and even a bit of economic history from the 1890s.

Like several other railroads, the Wabash connected Chicago to St. Louis. It also went to Detroit and Kansas City. Perhaps its #2 claim to fame (after the song) was that its freight trains went fast, often averaging 55 and even 60 MPH. On many other railroads, then and now, freight trains trundle(d) along at 40 or even 20 MPH.

At this point I must acknowledge inadequate research for this article. I welcome corrections on any point in the first paragraph or any later paragraph. Nevertheless, I shall press onward, because the state of scholarship on "Cannonball" is not adequate, at least so far as I can find.

Along with "Orange Blossom Special," "Midnight Special," "City of New Orleans," and "The Wreck of the Old 97," "Wabash Cannonball" is one of the all-time best-selling songs about a specific train. So far as I can tell, it began as a different song about a different railroad, "The Great Rock Island Route," written by a J. A. Roff in 1882. (1) Roff wrote two versions, the first of which contained some of the lyrics we know today, including this version of the chorus:

Now listen to the jingle, and the rumble, and the roar,
As she dashes thro' the woodland, and speeds along the shore,
See the mighty rushing engine, hear her merry bell ring out,
As they speed along in safety, on the "Great Rock Island Route."

Sheet music for one version of J.A. Roff's "The Great Rock Island Route," predecessor to the "Wabash Cannonball."

Some of the place names also carry over, such as the Atlantic ocean and "Pacific shore," Chicago, and Minnesota, but most of the cities he mentioned were particular to the Rock Island Line -- Peoria, Rock Island, Davenport, and Council Bluffs. As well, both versions are paeans of praise to the "mighty corporation, called the great Rock Island Route."

As the song changed, it became neutral toward corporations or even anti-railroad. William Kindt copyrighted it in 1904 under the title "Wabash Cannonball." The Carter Family recorded it in 1929. Roy Acuff & his Smoky Mountain Boys recorded it at least twice, followed by Doc Watson, Pete Seeger, Willie Nelson, Leon Russell, Johnny Cash, Tennessee Ernie Ford, and many others. The Rock and Roll Hall of Fame in Cleveland compiled a list of 500 "songs that shaped rock and roll"; "Wabash Cannonball" is the oldest song on the list.

One of the first LPs that Columbia Records ever released, this is the jacket for 9004, a ten-inch LP. Until then, 78 RPM records lasted only two to four minutes. Twelve-inch LPs, mostly for classical music and Broadway shows, playing at 33 RPM, could last half an hour. The ten-inch format was reserved for collections of popular music and played for twelve to fifteen minutes per side. Eventually, twelve-inch records won out.

At more than 10,000,000 copies, Acuff's recording was also one of the best-selling records of all time. Here is his version on Columbia 9004.

From the great Atlantic ocean to the wide Pacific shore,
From the green [queen?] of flowing mountains to the South belle by the shore,
She's mighty tall and handsome, and known quite well by all --
She's the combination on the Wabash Cannonball.

She came down from Birmingham one cold December day.
As she rolled into the station, you could hear all the people say,
"There's a girl from Tennessee; she's long and she's tall.
She came down from Birmingham on the Wabash Cannonball."

Our eastern states are dandy, so the people always say,
From New York to St. Louis, and Chicago by the way,
From the hills of Minnesota where the rippling waters fall,
No changes can be taken on the Wabash Cannonball.

Here's to Daddy Claxton; may his name forever stand,
And always be remembered 'round the courts of Alabam.'
His earthly race is over; the curtains 'round him fall.
We'll carry him home to victory on the Wabash Cannonball.

Chorus

Listen to the jingle, the rumble, and the roar,
As she glides along the woodland, through the hills, and by the shore.
Hear the mighty rush of her engine; hear that lonesome hobo's call.
You're travelling through the jungles on the Wabash Cannonball.

According to railroad historian Mike Schafer, the Wabash named one of its passenger trains for the song much later, rather than the song being named for the train, the usual case. (2) In the song, the train took on more glamour and more destinations than its flat, boring route from Detroit to Fort Wayne, Indiana, Decatur, Illinois, and on to St. Louis. Schafer believes that the popularity of the song helped prompt the public outcry that prevented the Wabash from ending the train; it survived until Amtrak took over passenger service across the U.S. in 1971. Charles Kuralt of CBS-TV went along on that last run, with a version of the song as the soundtrack.

Places the song mentions include the Atlantic and Pacific Oceans, an implicit reference to the Gulf of Mexico, New York, Chicago, St. Louis, "the hills of Minnesota," Birmingham, Tennessee, and Alabama. Indeed, one line even claims "You're travelin' through the jungle on the Wabash Cannonball."

On closer examination, however, the song never really says that the train went to all those places. "The hills of Minnesota" is a revision of "the lakes of Minnesota" from the predecessor version, "The Great Rock Island Route," which did go to Minnesota. The girl hails from Tennessee, but "she came down from Birmingham." That reference almost certainly is to Birmingham, Michigan, a suburb of Detroit that is still the terminus of Amtrak's run to Detroit, probably because the yard is there. Otherwise, the line, "She came down from Birmingham one cold December day," would make little sense, because one would come up from Birmingham, and cold is not a major feature of Birmingham, Alabama. "Chicago by the way" also makes sense, since the Cannonball didn't go to Chicago; it was "by the way"; but other Wabash trains did, from both Detroit and St. Louis.

The line, "You're travelin' through the jungle on the Wabash Cannonball," does not refer to the Amazon or Congo but to "hobo jungles," the areas at the edge of town, often at the edge of the railroad yard, where hoboes camped -- and still do. Indeed, Wikipedia credits anarchist folksinger Utah Phillips with the statement that the "Wabash Cannonball" was a mythical train that hobos imagined would appear at their death to carry their soul to its reward. The "Wabash Cannonball" also gave rise to several other songs using its tune, including "Hail! Ye Brave Industrial Workers" and Woody Guthrie's "The Grand Coulee Dam." (3)

I had always thought that "lonesome hobo's call" referred generically to the mournful whistle of steam locomotives, commemorated in so many train songs as pulling at men whose attachment to their families and communities was fragile. Indeed, another song by Acuff on the same record, "Freight Train Blues," offers an example of this genre:

I got the freight train blues. Lordy, lordy, lordy!
I got 'em in the bottom of my rambling shoes.
And when that whistle blows, I gotta go. Oh, lordy!
Guess I'm never gonna lose those freight train blues.

But maybe it carries the more specific meaning of calling the hobo to paradise. We can no longer ask Utah Phillips, who died in 2008.

Roy Acuff was the Republican nominee for governor of Tennessee in 1948 and won more votes than any Republican candidate in the twentieth century to that point. Still he lost, two-to-one, unlike country singer Jimmie Davis, who served two terms as governor of Louisiana.

The last stanza of Acuff's version also has considerable historical significance. Daddy Claxton was a farmer in Alabama; I think he was African American. Like many farmers at that time -- maybe 1890, maybe as late as 1910, I'm not sure -- he was hurting, because the railroad had a monopoly. Crops and livestock went to market by rail. Cars and trucks did not exist. Only one railroad served most counties. Even when two did, they did not compete; they agreed upon a common rate. That rate almost bled their customers -- farmers -- dry, almost bankrupted them.

The Farmers Alliance, an interracial organization that predated the Populist Party, protested, but usually to no avail. Not knowing what to do, Claxton took matters into his own hands. He stole a train! Of course, his was only a partial solution, since he had no tracks. Eventually they caught him, of course, charged him with theft, and brought him to trial. I think he got off owing to jury nullification, but I'm not sure. I also no longer remember where I read or heard this story. But it cogently explains the lyrics: of course he would be remembered 'round the courts of Alabam' for this escapade, which encapsulated and publicized the plight of so many people.

Until the mid-1950s, baseball teams traveled by train. Here is the schedule for the New York Yankees in early September, 1950, for example:

Train to Washington, D.C., 9/9;
Play Senators, 9/10, followed by double-header, 9/11;
Overnight train to Cleveland, play Indians, 9/12, 9/13;
Train to Detroit, play Tigers, 9/14, 9/15, 9/16;
Overnight train to St. Louis, play Browns, double-header, 9/17;
Train to Chicago, 9/18;
Play White Sox, 9/19...

This meant that baseball teams went past my house, invisibly, on the Cannonball and other Wabash trains.

Only the Yankees were not always invisible. Twice during my childhood, I think -- once for sure -- the Decatur newspaper had a small story about how the Wabash put some Yankee baseball players off the train in Decatur owing to their bad behavior. The players, including their stars, Whitey Ford and Mickey Mantle, then had to hire a taxi at their own expense to take them the 120 miles to St. Louis and hope they made it to the game on time and sober enough to play.

Ford and Mantle were both notorious alcohol abusers. Indeed, their drinking became an issue during contract negotiations for 1958. Ford's contract that year required him to promise "to obey all of the club's rules, not miss any trains or planes, and, as Casey Stengel phrased it, be able to tell midnight from noon," according to baseball writer Harold Friend. Naturally, these escapades in Decatur gave rise to a new verse for the "Cannonball." It sees print (4) below for the first time. (I did sing it in public once, at a community center for homeless persons in Moline, Illinois, as part of a speaking tour of the Quad Cities, but the less said about that, the better.) (5)

The Yankees -- Mickey Mantle, Whitey Ford, and all the boys,
Were drinking in the club car in Decatur, Illinois.
They all got thrown off the train; they could not play baseball,
'Cause they were drunk and disorderly on the Wabash Cannonball.

The organizer of the concert for homeless people did not fully follow my introduction of the song. After my performance, he said to me, "I must have heard that song a hundred times, but I never heard that verse before." I just nodded sagely. Now, in the interest of more complete and accurate history -- baseball, train, and substance abuse -- I pass it on, without copyright, to the folksinging and hobo communities. I look forward to the day when I might hear it sung by, say, Emmylou Harris? Ricky Skaggs? Garrison Keillor? Jay-Z??

* * * * *

1: J. A. Roff, 1882, "The Great Rock Island Route!!", lyrics courtesy of the Rock Island Technical Society

From a rocky bound Atlantic, to a mild Pacific shore,
From a fair and sunny southland to an ice-bound Labrador,
There's a name of magic import and 'tis known the world throughout,
'Tis a mighty corporation, called the great Rock Island Route.

Chorus:

Now listen to the jingle, and the rumble, and the roar,
As she dashes thro' the woodland, and speeds along the shore,
See the mighty rushing engine, hear her merry bell ring out,
As they speed along in safety, on the "Great Rock Island Route."

All great cities of importance can be found along its way,
There's Chicago and Peoria and Rock Island so they say,
With Davenport, and westward still is Council Bluffs far out,
As a western termination of this Great Rock Island Route.

To the great southwest another, and a mighty line they run,
Reaching far famed Kansas City, Leavenworth and Atchison,
Rich in beauty, power, and grandeur, and they owe it all no doubt,
To the fact that they are stations, on the Great Rock Island Route.

There's the "Northern Route," a daisy as you all can plainly see,
To St. Paul, and Minneapolis, 'tis the famous "Albert Lea,"
To the lakes of Minnesota, and all points there 'round about
Reached directly by no other, than the "Great Rock Island Route."

Now let music soft and tender, in its mystic power reveal,
Praises to the "Great Rock Island," that the heart can only feel:
And to swell the mighty chorus -- comes the glad re-echoing shout,
That for safety, time and comfort, take the "Great Rock Island Route."

The foregoing lyrics clearly morphed into part of the song, "Wabash Cannonball." Meanwhile, in the same year, Roff wrote entirely different words, which a correspondent to the Rock Island Technical Society provided, in the form of a four-page promotional newspaper published by the Rock Island Passenger Department in September of 1882:

Have you ever heard it rumored,
As you journeyed to the West,
Of the many mighty railroads,
Which was greatest and the best?
The public long have said it,
And 'tis true, beyond a doubt,
That for safety, time and comfort,
Take the "Great Rock Island Route".

Chorus

Only listen to the jingle, and the rumble, and the roar,
As she dashes through woodland and skims along the shore!
See the mighty, rushing engine -- hear the merry bell ring out,
As they speed along in safety, on the "Great Rock Island Route"!

In her crowded palace coaches
All is happiness and joy,
From the father and the mother
To the little girl and boy;
And a sweet look of contentment
From every face shines out --
For the people all are happy
On the "Great Rock Island Route".

Chorus

Through darkest hour of midnight
Hear the rumble and the roar,
As she glides like bird of spring time
Past the humble cottage door;
On, on into the darkness,
With headlight streaming out
For the safety of the people
On the "Great Rock Island Route".

Chorus

Through prairies, rich and fertile
With cities covered o'er;
On, through broad hills and valleys,
to the great Missouri's shore.
Her name's in every household;
'Tis known the world throughout,
So procure at once your tickets
By the "Great Rock Island Route".

Chorus

Clearly the last set is an advertising jingle, not a folk song, whatever one's definition of "folk song" might be. It also had no influence after 1882.

2: Mike Schafer, More Classic American Railroads (Mendota, IL: Andover Junction Publ., 2000), 145.

3: Robert B. Waltz and David G. Engle, "The Ballad Index" (2013), which may be a compilation of songs at California State University, Fresno.

4: Is HNN "print?"

5: There are five Quad Cities: Moline, East Moline, and Rock Island in Illinois, and Davenport and Bettendorf in Iowa. Only a great country could have five Quad Cities! East Moline is smallest in population but at its city limits displays a large sign defiantly proclaiming "East Moline / One of the Quad Cities," and indeed, it was one of the original four.

________________

Copyright James Loewen

George Zimmerman, Harmony Stair, and the Elephant in the Room

Credit: Wiki Commons.

After the Florida jury found George Zimmerman not guilty of second-degree murder or manslaughter for killing Trayvon Martin, everyone had an opinion about the verdict. Many people sought my opinion, including a major radio station in Jamaica and Al Jazeera. I knew that my own knowledge about the killing was no more than that of most of my listeners or readers. Hence I could not add to their knowledge base. I could only give them my opinion about the matter. So it would be about me and my opinion. What's the point of that?

I had expected the jury to find Zimmerman guilty of involuntary manslaughter. In my mind, I had compared his action to the fairly notorious 2011 case of Harmony Stair, a 33-year-old woman from Blacksburg, Virginia, who was 7½ months pregnant. Stair had been making jello shots for a party and sampled her product. (Jello shots are small servings of jello made with vodka or grain alcohol.) She then drove drunk, causing a crash that fatally injured her fetus. Like Zimmerman, she was charged with manslaughter -- in her case, for her child's death. Unlike Zimmerman, she was not charged with second-degree murder.

Involuntary manslaughter is the proper charge when a person has been unlawfully killed, but the perpetrator had no intention to kill. Stair did not intend to harm her unborn child. (Since the fetus was old enough to survive outside the womb with standard medical procedures, I use "fetus" and "unborn child" interchangeably.) Whether Zimmerman had any intention to harm Trayvon Martin is not known. Certainly he suspected that Martin was a criminal.

Stair never went to trial. She pleaded guilty last April, will be sentenced this September, and is out on bail until then. As the world knows, Zimmerman is a free man, able to resume his neighborhood watch activities, should he so choose.

I suggest you try discussing these two outcomes with acquaintances of differing politics. Many on the right defend both Stair's conviction and Zimmerman's not-guilty verdict. About Zimmerman's verdict, for instance, right-wing rocker Ted Nugent called Martin "a troublemaker who brought about his own demise." He suggested Zimmerman should sue Martin's parents over the injuries that Martin caused him. Many on the left question both verdicts. Reverend Al Sharpton called Zimmerman's acquittal "an atrocity." Again, these reactions tell more about the persons reacting than about the cases under discussion.

In the Washington Post, columnist Richard Cohen faulted those politicians (and others) who wore hoodies as a sign of solidarity with the deceased. "Where is the politician who will own up to the painful complexity of the problem and acknowledge the widespread fear of crime committed by young black males?" Thus Cohen's answer to the question, "Would Zimmerman have followed a young white male?" is, "Of course not," but he hardly denounces such racial profiling. Instead, he goes on to defend Zimmerman and the practice as justified owing to black males' greater propensity to commit crimes.

Cohen complains that no one will talk honestly about black crime. It's the elephant in the room. So, let's examine the elephant.

Without a doubt, young black men commit crimes at a higher rate than young white men. According to Census Table 325, Arrests by Race in 2009, the difference is four to one. No doubt, some of this difference results from different treatment by race and class within the criminal justice system. No doubt, a white middle-class family with a lawyer is more likely to get their son back without an arrest record than a black single mother living in public housing. But even if we suppose that the black crime rate is four times the white crime rate, we still face a logical problem. Arrests of African Americans in 2009 amounted to 0.46% of the black population, while arrests of whites amounted to 0.12% of the white population. Of course, George Zimmerman and Richard Cohen would point out that they don't propose profiling all African Americans, just youngish black males. We can accommodate that point by assuming that all arrests of African Americans were of males (they weren't) and that all those arrested were between 15 and 44 years old (they weren't), and then dividing the number of African Americans arrested by the black male population in just that age bracket. The result, 2.1 percent, obviously overstates the proportion of young black males who are arrested. Nevertheless, about 98 percent were not arrested.

So the George Zimmermans of the world, the Richard Cohens of the world, are wrong 98 percent of the time in their "justified" racial profiling. True, doing the same statistics on white males would show that only about 0.5 percent were arrested. The issue becomes, does 2.1 percent versus 0.5 percent justify profiling? Or should we treat the next young black man we see as a person deserving of respect unless he gives us reason to believe he is not?
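For readers who want to check the arithmetic, here is a minimal sketch that uses only the rounded figures quoted above; the exact counts would come from Census Table 325 itself, and the variable names are mine:

    # Rounded figures from the text above (derived from Census Table 325, 2009).
    black_arrest_share = 0.021   # share of black males 15-44 arrested, deliberately overstated
    white_arrest_share = 0.005   # comparable (also overstated) figure for white males

    ratio = black_arrest_share / white_arrest_share   # the roughly four-to-one gap profilers cite
    not_arrested = 1 - black_arrest_share             # how often profiling a young black man "misses"

    print(f"arrest-rate ratio: about {ratio:.0f} to 1")
    print(f"young black males not arrested: {not_arrested:.0%}")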

For those still not convinced, let me add one more point: about three men get arrested for every woman arrested. Should we therefore profile men? (Yes, I am aware that Shulamith Firestone, Andrea Dworkin, and some others suggest exactly this. I hope most readers disagree.)

________________

Copyright James Loewen

Race Relations in Black and White, Color, and Digital

For two summers as a lad, back in 1961 and 1962, I was camp photographer at Region Seven Explorer Canoe Base near Boulder Junction, Wisconsin. Canoe Base, as we called it, served Explorers, the division of the Boy Scouts for older boys aged fourteen through eighteen. "Posts" of eight to twelve boys would arrive, say late on a Monday afternoon, get training that evening and the next morning, then depart after lunch on Tuesday. They would paddle, camp, and cook for six days. On Sunday afternoon, they returned to the Base, checked in their gear, and enjoyed dinner and a campfire. Monday they left right after breakfast, and the cycle repeated.

It was a great place for a college student to work. I spent five summers there, working my way up from dishwasher to assistant director. During my time as photographer, the largest part of my duties consisted of taking a photo of each group posed at the Canoe Base gateway sign shortly after their arrival. I then worked feverishly to print 8" x 10" glossy photos of each group to show them at dinner. Then I took orders for copies, charging $1 each.

Overwhelmingly, Canoe Base served white boys. During my five years on staff, we served about 7,500. Not one group had a single black member, so far as I can recall, except one post from the South Side of Chicago, which was all black and came two consecutive summers.

This racial imbalance affected me personally, because it was hard to photograph whites when posed in bright sunlight. Their faces washed out. Talking with professional photographers, I learned that this was a long-standing problem that Kodak had spent a lot of money trying to solve. Perhaps their best solution was Plus X, a rather slow black-and-white film with fine grain, coupled with Medalist print paper. Ansel Adams and other art photographers often chose Medalist because it produced a nuanced range of tones from pure white to deep black. Even so, in the darkroom, I often had to "burn in" the faces of white folks.

This photo of the Canoe Base Camp Staff in 1961 shows the problem of washed-out features in white faces, even using Kodak's best products. The author is at lower left.

Since an increasing proportion of today's readers have no idea what film is, let alone "burning in," let me describe the process. When exposed to light in a camera and then developed, film yielded a negative image. With black and white photography, this meant the negative -- the "film" -- was mostly black where the subject was white. In a darkroom, that film was then placed in an enlarger -- a bright lamp enclosed in a light-proof housing with a lens on its bottom to focus the light as it passed through the film and project it onto the photographic paper. Where no light passed through the film, the paper remained pure white. Since white people are not pure white, if their negative image on the film was extremely dense, I had to put a piece of cardboard with a hole in it over the light image between the lens and the paper. The cardboard would block the light except that coming through the hole, which I would direct toward the Scouts' faces. Giving a few extra seconds of light to the faces meant they would not turn out pasty white. Shaking the cardboard while I held it prevented any tell-tale dark line of greater exposure on the print.

I mentioned that Kodak had spent a lot of money trying to solve this problem and thus minimize how much burning in would be needed. An unfortunate side-effect of balancing its products for white skin was that it became hard to photograph blacks when posed in shadow or at night. Plus X film and Medalist paper often lost definition at the other end of the spectrum, doing a poor job on items in deep shadow. African American faces registered too dark, especially dark-skinned faces, except when posed in bright sunlight. "Dodging" helped. The opposite of burning in, this technique involves wiggling a flat circular piece of cardboard on the end of a rigid but fine wire in front of part of the image. Dodging kept too much light from hitting the paper and rendering it too dark. But dodging is harder to do than burning in.

Historians and sociologists see the results of Kodak's inadequate attention to black skin tones when we examine such primary sources as old newspaper photos. Photos at black or interracial events could result in faces so dark that expressions -- even features -- did not register well, especially in night shots.

The front page of the African American Seattle newspaper The Northwest Enterprise, from May 1940. Note the difficulty in discerning the features of the photographed man.

Plus X and Medalist were not the only culprits. Indeed, Kodak's faster products, like Tri-X film, were even worse, as were its competitors' products. I am using Plus X and Medalist as synecdoches -- the part standing in for the whole, in this case the whole of photography. The problem was, simply, whites dominated the market, so film was balanced for Caucasian skin tones -- just like "skin-colored" Band-Aids and "flesh colored" crayons. (Crayola relabeled the latter "peach" during the Civil Rights Movement.)

Hattie McDaniel as Mammy in Gone with the Wind.

Kodachrome and other color films did a little better but introduced their own problems: dark complexions sometimes registered with a bluish tinge that they did not have in real life. In famous movies like Gone with the Wind, black actors like Hattie McDaniel and Butterfly McQueen don't look quite right, but movie-goers cannot always articulate what is unnatural about their skin tones.

No deliberate racism prompted Kodak to do a better job representing white skin tones than black skin tones. Nevertheless, photography, like some other American institutions, "otherized" black folks. To the photographer, blacks seemed different, problematic, harder to photograph. And it seemed to be their fault, not Kodak's: "they" were harder to photograph. To members of the general public -- newspaper readers, movie-goers -- African Americans again looked different, subtly unnatural, other, darker than they were in real life. This posed a particular problem in segregated America (and South Africa, Germany, Australia ...) because so few whites saw blacks in real life.

Thus without any intentional prejudice on the part of anyone, photography provided a textbook example of racism: "treating people differently and worse because of their racial group membership." The use of "because" in that definition does not require intent. Regardless of intent, race did determine how accurately and humanely Kodak products depicted one. And that difference made a difference when it prompted whites to conceive of blacks as different.

Time took a lot of heat for its retouched photo of O.J. Simpson, but perhaps they reveled in it, believing there is no such thing as bad publicity.

Still photography has now gone digital. Moving pictures are rapidly following suit. This new technology carries new potential for harm -- remember Time Magazine's infamous manipulation of O.J. Simpson's skin tones for its cover back in 1994? By making Simpson darker, Time made him seem more sinister, thus hoping to outsell Newsweek, which used the same mugshot photo unretouched. Readers are already familiar with ways that computers can alter photographs to falsify what they claim to depict. Not only can the commissar vanish, as in Soviet photography of yore; politicians can also add whatever the polls suggest they need. In 2010, campaigning for a U.S. Senate seat from Colorado, for example, Andrew Romanoff worried that his campaign rallies looked too white, so his campaign photoshopped a black woman and at least two Latinos into a crowd shot. Unfortunately, the stunt backfired when word of it leaked out, only emphasizing Romanoff's narrow appeal.

But if digitization can be used badly on the racial front, it can also have serious benefits, at least in photography by professionals. In a recent Washington Post article, film critic Ann Hornaday tells how moviemakers now compensate for skin tone problems during the "digital intermediate" stage of post-production. She notes that 12 Years a Slave, the new movie about Solomon Northup, benefits as a result: even in night scenes, the faces of African Americans "are clearly defined." She also tells of sophisticated digital cameras that can capture African Americans and European Americans of varying hues all in the same shot, with no manipulation required afterward. (I think this results from software that compresses the brightness range when it senses loss of detail.) As this technology spreads to all digital cameras and smart phones, we may no longer confront a trade-off between the accurate rendition of whites versus blacks.
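The parenthetical guess above can be illustrated with a toy example. What follows is a minimal sketch of generic brightness-range compression, not the algorithm any particular camera or studio actually uses; the gamma value and sample pixel levels are mine alone:

    # Toy illustration of compressing a wide brightness range (an assumption,
    # not a documented camera algorithm): a simple gamma curve lifts deep
    # shadows a great deal while leaving highlights almost untouched, so
    # detail at both ends of the range can survive in the same shot.
    def compress(linear_value, gamma=2.2):
        # linear_value is scene brightness scaled to the range 0.0-1.0
        return linear_value ** (1.0 / gamma)

    for level in (0.02, 0.10, 0.50, 0.95):
        print(f"{level:.2f} -> {compress(level):.2f}")

Real in-camera processing is more sophisticated and often adaptive, but the basic trade -- compressing the range so shadow detail survives -- is the idea the parenthetical points to.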

So let us rejoice in this positive result of the digital revolution. Surely this change is all to the good!

________________

Copyright James Loewen

"Confronting" Mitch Daniels at Purdue

On Tuesday evening, November 5, 2013, I speak at Purdue University. This is hardly startling news. After all, various other institutions around Indiana have engaged me to speak and lead workshops for years, including Ball State, Indiana University, Indiana University/Purdue University of Indianapolis (IUPUI), Notre Dame, Southern Indiana, and at least three liberal arts colleges. As well, I've worked with other Indiana organizations, like Bloomington United, the South Bend Center for History, and the Indianapolis Public Schools. I've also spoken to national or regional conferences meeting in Indiana, such as the General Assembly of the Unitarian Universalist Church, the National Association of Student Personnel Administrators, and the Great Lakes Council for the Social Studies.

Especially in this context, however, Purdue's invitation was newsworthy. That's because it didn't exactly come from Purdue. Instead, it came from a handful of faculty members in various departments, using what seems to be their own money, at least in part, and donations from "across Indiana and across the country," according to one of the event organizers. In fact, almost no money seems to be involved. Purdue offered me exactly $4,000 less than my usual speaking fee, which is $4,000. They are paying my travel expenses.

When they first asked me to speak, I asked the professors to get Purdue to invite me as a regular speaker, the way other institutions do. Although I speak for nothing at times, Purdue hardly qualifies as a shoestring community organization in need of my charity. Moreover, all the other Indiana schools paid my fee, some more than once.

The professors replied, "Funding your visit would be a challenge," as one put it. Another was even clearer: "Some senior administrators at Purdue expressed 'concern' that having you come to campus could be considered 'confrontational' to our President." These included at least two department chairs, I believe.

Who is "our President?" Why, Mitch Daniels, formerly the right-wing Republican governor of Indiana. When Purdue's Board of Trustees, all of whom he had appointed, chose him as president, people at Purdue and across the nation raised questions about whether Daniels would respect academic freedom. The first threat to academic freedom at Purdue, however, seems not to be Mitch Daniels as president, but the idea of Mitch Daniels as president. Just having him as president means no speaker deemed "antagonistic" (another word used) to Daniels or his ideas will be considered. Mr. Daniels need do nothing. The censorship lies upstream of him.

This chilling response to Daniels's mere existence reminds me of all too many authors of K-12 textbooks in U.S. history. They told me that they rarely experienced censorship from publishers or editors. Of course, they rarely wrote anything that might be considered "confrontational." As Mark Lytle, co-author of one textbook, told me, explaining why a major publisher had sought out him and James Davidson, relative unknowns, "They didn't want famous people, because we'd be more tractable."

"Were you?" I asked.

"We were reasonably tractable," he replied.

Again, the censorship comes upstream of the publishers. Few editors ever have to censor anything, few authors must resist any demand to tone anything down, and the resulting textbooks will never offend anyone. Of course, since they never say anything critical about the United States, they can never treat some subjects accurately, but who cares? We don't want eighteen-year-olds who can think for themselves anyway.

Howard Zinn was never tractable. This upcoming Purdue event is newsworthy in some other ways, because I am part of an array of speakers for an evening billed as a "Howard Zinn Read-In." Some other campuses in Indiana will host events in solidarity with Purdue's. The evening celebrates the work of the controversial political scientist and historian who wrote, most famously, A People's History of the United States.

By now it has become public knowledge that President Daniels, while governor, tried to keep the ideas of Howard Zinn from being taught anywhere in the state of Indiana. Daniels claimed this was no infringement of academic freedom, because there is no right to academic freedom in K-12. That is not precisely correct, but in addition, Daniels also tried to stop Zinn from being taught in state-funded colleges of education, which do have academic freedom.

My experience of Howard Zinn was mostly positive. Often, after I gave a talk lamenting how badly history is taught in most high schools, an audience member came up to tell me that their history teacher was different. "She assigned us People's History as well as the regular textbook, and her course was interesting." Howard and I only met three times, I think, but he was always generous in his praise of my work.

At times, Zinn did make glaring errors. Also, like the textbooks he despised, People's History has no footnotes. But to expunge him from Indiana amounts to the claim that he has nothing of value to teach. This is wrong. His work brings in many facts, voices, and points of view that mainstream textbooks deliberately leave out.

When Zinn pointed out that the United States intervened around the world not for the cause of "freedom," but to install anti-democratic dictators, he was not wrong, but right. When Zinn told the details of the 1877 labor revolt against the immense social inequality capitalism was then (and again is now) building up, he was not wrong, but right. Surely it is likely that Gov. Daniels tried to ban Zinn not because he was wrong, but because he was right.

I speak from experience about Mr. Daniels, because he also tried his best to keep me from speaking in Indiana. This attempted censorship came in the fall of 2007. I was scheduled to speak in a total of five venues in central Indiana, mostly on the subject of sundown towns in Indiana. Sundown towns are of course communities that for decades were — some still are — all-white on purpose. Indiana abounds in sundown towns. I estimate that a majority of all incorporated communities — and several entire counties — in the Hoosier state flatly kept out African Americans. As a result of this work, the Indiana Civil Rights Commission volunteered not only to have me speak to their agency (and other people in state government), but also to coordinate a modest speaking tour.

Then the governor, or at least his office, intervened. His intervention came in response to my writing an article about Honda's building a new $550,000,000 factory in Greensburg, Indiana, a sundown town that had driven out its black population in 1906. I pointed out Greensburg's unsavory past. Of course, Honda knew precisely what it was doing. Not only did it choose a sundown town, it also drew a circle with a 35-mile radius and stated that prospective employees had to live within that circle. Indianapolis, with its black community — the only black community anywhere near Greensburg — "happens" to lie 50 miles away.

This turns out to be traditional Honda behavior. In 1988, according to James Treece, news editor at Automotive News, Honda paid what was then the largest EEOC settlement ever — $6,000,000 — owing to discriminatory hiring patterns at its Marysville, Ohio, factory. Honda had similarly red-lined Columbus, Ohio, and its black residents.

I suggested that Honda should be asked, "Did you choose Greensburg because Greensburg was a sundown town or despite Greensburg being a sundown town?" And if Honda answered, as it surely would, "the latter," then the next question should be, "OK, then, what are you going to do about it?" (1)

I think Gov. Daniels should have responded to my article by putting those questions to Honda. He might then have gone on to suggest that all sundown towns in Indiana need to take distinct steps to move beyond their racist pasts.

Instead, he tried to stop my speaking tour, already scheduled across central Indiana. His office ordered the Indiana Civil Rights Commission to cancel all five events. At the last moment, I managed to reinstate four of them, two at IUPUI and two at Ball State. Of course, I could not reinstate the canceled event at the Indiana Government Center. At the main IUPUI event, Prof. Florence Wagman Roisman, who is Michael McCormick Professor of Law at IUPUI's Law School, introduced me with an eloquent five-minute disquisition on the First Amendment.

Surely Gov. Daniels tried to stop me from speaking, not because I was wrong about sundown towns in Indiana, but because I was right.

We can safely infer that Mitch Daniels has little regard for the rights or intrinsic value of African Americans. He has close ties to the Bradley Foundation, a right-wing institution that paid Charles Murray $1,000,000 to write The Bell Curve, according to education reporter Barbara Miner. The Bell Curve caused a sensation when it came out in 1994. It argues, inter alia, that the median IQ of black Africans is 75, just below Forrest Gump, whose IQ was 76. (2) Tom Hanks's Gump, although a Hollywood portrayal, is a reasonably accurate depiction of a person with such an IQ. Such people are noticeable. They do not appear "normal." During my trips to Guinea, Ghana, Burkina Faso, and Mali, I never noticed one. Yet Murray holds that half of all black Africans have lower intelligence than Forrest Gump! Had Murray used just $2,000 of his grant to fly to Africa and meet some ordinary people there, he might have concluded something was wrong with the IQ test, rather than with more than half of Africa's population.

Bradley has funded many other projects; I think it's safe to say that none has ever had the best interests of black people at heart. Daniels was a board member at Bradley; last June he accepted its "Bradley Prize," a $250,000 award.

Like the connection with Bradley, Daniels's ties with Purdue's board are complex. He appointed eight of its ten board members and reappointed the other two. Last summer, after he had spent just six months on the job, the board granted him a raise of more than $50,000.

Might we call these interlocking relationships and payoffs a form of Affirmative Action?

I do hope that Mr. Daniels comes to hear me and the other speakers. I look forward to asking him about his efforts to censor Zinn and to censor me. I also look forward to the other speakers of the evening. They include Staughton Lynd, who played an important role as director of the "Freedom Schools" during the 1964 Mississippi Summer Project. Lynd also knew Howard Zinn from his teaching days at Spelman College. Then he became an activist against the Vietnam War. Former U.S. diplomat and peace activist Ann Wright will speak. So will Anthony Arnove, co-editor with Zinn of Voices of a People's History of the United States.

It promises to be an evening to remember. If you live anywhere near West Lafayette, I hope you will drop by.

* * * * *

1    Incidentally, negotiations with the Indianapolis chapter of the Urban League eventually resulted in Honda's extending its hiring radius to 65 miles. Of course, that still does not deal with the fact that Honda chose to locate its 2,000 new jobs 50 miles away from black residences, guaranteeing an overwhelmingly white workforce. Some speculate that Honda does such things from simple racial prejudice, viewing blacks as inferior; others suggest that its management thinks African Americans are likely to be pro-union.

2    African Americans are less stupid, according to the book, averaging perhaps 85, owing to their admixture of white genes.

________________

Copyright James Loewen

]]>
Fri, 19 Apr 2024 07:39:22 +0000 https://historynewsnetwork.org/blog/153195 https://historynewsnetwork.org/blog/153195 0
The History of Composting in America Composting first appears in the historical record of what is now the United States in 1621, when Squanto showed the "Pilgrims" how to put a fish in each corn hill, so the maize and squash would thrive. Even that statement is contested, however. Some Wampanoags today say their ancestors never dedicated entire fish to the role of fertilizer; they only used the skin and guts. Some historians agree. I think the primary sources got this one right, however, partly because Pilgrim authors were hardly squeamish about discussing blood and guts and would surely have said so, had skin and guts been all that went into the corn hills. As well, when the little fish -- variously called "alewives," "herring," "menhaden," "pogies," and "shad" -- came upstream in the spring, they were so plentiful as to be catchable en masse. There was no need to save and savor their edible parts.

Moreover, the languages themselves offer evidence. Narragansetts called the fish "munnawhatteaûgs," which means "fertilizer" or "that which enriches the land," a word the English corrupted into "menhaden." The Abenakis of Maine called them "pauhagens," which also means "fertilizer," a name the English shortened to "pogies." i

Like dogwoods further south, "shad blow" trees flower before their leaves appear. Their blossoms signal that the shad (or menhaden or alewives...) are running in nearby creeks. They were incredibly numerous. Not only the Pilgrims (and of course Native Americans) used these fish as fertilizer. So did farmers on Long Island, for example, at least as late as the early nineteenth century. In the second half of that century, menhaden oil outstripped whale oil as an American and worldwide industry. Even today, American fishing boats catch more menhaden than any other species. ii

Squanto was the first compost expert that European Americans consulted, but he was far from the last. In the twenty-first century, composting has become a pastime if not an art form. All kinds of people proclaim themselves experts at it and advisers on how to do it. Unfortunately, and unlike Squanto, many folks today don't know what they are talking about.

One such authority held forth on NPR not long ago. Asked if she was a purist who only made vegan compost, she replied, "Oh, no." She included "crushed eggshells," she went on, although she always rinsed them out first. Presumably unrinsed eggshells might offend the maggots, bacteria, worms, and larvae that convert table scraps to compost.

At our house, we put everything in our compost. Well, almost everything. No bones, except fish bones -- they take too long to break down. No industrial "food" (like Coca-Cola).

House guests are horrified. "You put milk in your compost?" one asks, as I rinse my cereal bowl and put the results in the compost can.

"What do cows eat?" I reply. "Ashes to ashes and grass to grass."

They remain unconvinced.

Our current composting mythology is a triumph of theory over experience, or rather, of theory over lack-of-experience. Those who follow the vegan rules wind up with compost, to be sure -- a bit inferior to mine but perfectly useful. Thus they are confirmed in their belief that the vegan rules are right. Those who simply make compost out of everything don't often write about it.

The compost rules include:

-- turn it ("ideally every day or two," according to "Compost A to Z");

-- layer green and brown ("the brown and green components should be layered throughout the bin," according to "Composting Rules" at E-How.com);

-- "avoid fruit as it will tend to attract fruit flies" (ibid.);

-- ban cooked food ("can attract vermin and should not be home-composted," according to Garden Organic, "the national charity for organic growing" in the U.K.);

and of course

-- no "oils, grease, lard, meat, bones, fish, dairy products..." (typical municipal composting ordinance).

Ages hence, historians of family life will look back on all these rules with wonder. Staffs at historic houses of the late twentieth century will show open-mouthed visitors kitchen shredders with "super-robust crank arms" for "pre-composting"; special plastic bags that consumers bought because they would break down right along with the compost placed inside them; and "Tumbling Composters" to ensure that the top and bottom of the pile all compost evenly.

The butter churns and chestnut roasters that furnish historic nineteenth-century kitchens embody skills most of us have lost. These twenty-first-century items are just the opposite: they exemplify no skills, indeed, no understandable purpose. "People bought rotators for their compost?" visitors will ask, wide-eyed. "They pre-shredded their garbage for the worms?" The loss of common sense from one century to the next will be palpable.

Possibly, like the future anthropologist who wrote "Body Ritual among the Nacirema," iii historians of the future may conclude that the compost movement was a religion. They will discover uniform clothing worn to annual days of sacrifice like "Earth Day" -- such as T-shirts saying "A rind is a terrible thing to waste" and "I heart composting." They will note that after every meal we performed a thanksgiving ritual, giving a portion of our food to the compost god at its kitchen shrine. They will learn that we took the temperature of our compost pile as if it were alive and sought professional advice if it was "too hot" or "too cold." iv We gathered at community colleges and garden stores to listen to experts lecture on composting.

To thwart such misperceptions, historians need to get busy now, writing the history of compost for ages hence. ProQuest lists some 219 theses and dissertations with "compost" in their titles. Not one is in history, American studies, cultural studies, or any related field. Considering all the dissertations on food, cooking, and eating, it is surprising to find not one on our treatment of uneaten food.

The sheer volume of expertise on compost is enormous. Of ProQuest's 219 dissertations and theses, 216 turn out to be in biology, plant and soil science, and similar fields. v They bear titles like "Suppression of Rhizoctonia solani and its interaction with Trichoderma hamatum in bark compost container media." Then another massive industry popularizes these findings for the public. Rodale distributes one tome, Compost Gardening, that runs 350 pages; each page is 8 1/2" x 11" and has two columns. Among its suggestions: sift your compost, so it looks better, and because "sifting compost is fun." vi

Sifting compost is not fun, however. Therefore, the main impact of these composting rules, gadgets, and books is to deter composting, except among those in desperate need of a time-consuming hobby. Even among the faithful who don't give up, the rules deter composting of about half of the stuff that could be composted. That's too bad, because composting is good for the planet. It's surprising how much stuff can go into a compost heap and how compact the dirt is that comes out. No need to truck all that stuff away!

As well, the rules serve no purpose. I live in an urban neighborhood with a small back yard. I violate all the rules. I never turn it. I never water it. I never take its temperature. I just put stuff in on the top and take dirt out from the bottom. Yet my compost composts fine. (What else might it do? It has no other skills!)

Banning fruit to avoid fruit flies is particularly wondrous. Fruit flies are exactly what you want! Fruit flies turn fruit into compost. It's their job, after all. The rule about cooked food is almost as silly. Cooked food is already partly-digested, for heaven's sake, ready to break down the rest of the way into compost. And dairy ... well, let me tell you how I learned that even bad dairy makes great compost.

It was 1964. I was a college student, working summers at Region Seven Explorer Canoe Base near Boulder Junction, Wisconsin. By now, my fourth year, I had worked my way up to "Service Director," in charge of the operation of the base itself, including the kitchen and dining hall. Like many summer camps, Canoe Base got "government surplus" -- food commodities bought and processed by the United States Department of Agriculture. This program was supposed to maintain good prices for farmers' products while helping nonprofit organizations.

Most of these commodities we wanted. We got hundreds of pounds of cheese, sacks of flour, boxes of frozen ground beef.

Then there was the milk. Powdered nonfat dry milk. Each box weighed four pounds and made five gallons of "reconstituted milk." "Reconstituted" it may have been, but milk it was not -- at least not milk that any self-respecting lad would ever drink. Reconstituting it hours ahead of time supposedly allowed its disgusting medicinal odor to dissipate. Not so. Pouring it from one container to another from a height of several feet supposedly aerated it, making it more palatable. We never found a palate for which that worked. Fifty years later, the government still distributes this stuff, and it suggests: "Use nonfat dry milk as directed in recipes requiring dry or reconstituted milk or as a substitute in a cooked product when fresh milk is specified." Perhaps it was a failure of our imagination, but we never came up with "recipes requiring dry or reconstituted milk." Our cook refused to risk her reputation substituting this powder for fresh milk in anything she cooked.

What could we do?

According to government regulations, we could not sell commodity foods. That would undercut local grocery stores. We could not give them away. We could not even store them over the winter -- we had tried that the previous year, and a USDA inspector had come by in February, found the milk, and fined us. All we could do was turn the milk back to the government. But the rumor among food recipients was that if you did that, the government would not only cut your allotment of milk for the next year, it would cut all your government surplus proportionately. We could not risk that.

I had a brainstorm. Why not line the volleyball court with it? The twine we tried to use to line the court broke. Marking just the corners led to arguments. So another staff member and I measured carefully and laid out two-inch lines of dried milk on each side of the net.

They worked perfectly, for two days. On the third day, it rained. The lines were still there, a bit yellowed perhaps, but functional. However, the entire court now smelled like spoiled milk. Not bad enough to deter play, but it does explain why Wimbledon prefers titanium dioxide.

I had thought we would have to reline at regular intervals, but almost immediately it became clear that we would not. A streak of deep green grass, taller and thicker than the rest, soon grew along every white line we had put down. This richer grass continued to mark the court for the next two summers.

The volleyball court proved the value of milk as soil enricher. But it used less than two packages of our massive supply. We still faced the problem of what to do with the bulk of our milk. Arlo Guthrie had not yet composed "Alice's Restaurant," but the crime scene in that song (and later, the movie) -- a roadside dump -- was hardly unique to western Massachusetts. I asked two staff members — "Moose" and "Little John" — to take all the rest to a dump in a ravine alongside a road about a mile from Canoe Base. There they were to dump it, but unlike Arlo, they were to leave no incriminating evidence. Instead, they were to bring back every carton, every box, even every plastic bag, for proper disposal. We wanted to bring off the perfect crime.

Moose and John loaded up a vehicle which, like its contraband cargo, was itself government surplus. A carryall painted olive drab, it had been condemned by the Army years before and given to the Boy Scouts. Its steering was so loose that in a wide area like a parking lot, the driver could hold the wheel steady and the truck would lurch from left to right as it "caught" first on one side, then the other. Driving in a straight line required a certain Zen-like concentration: one had to anticipate which way it would next lurch and move the wheel several inches to the opposite side until resistance was encountered, then apply a tiny nudge before returning the wheel to the center. I reminded Moose to drive carefully.

Moose and John returned safely later that afternoon. I was told to come see them as soon as they arrived, and the sight was unforgettable. At 6'5", Moose was our tallest staff member; at 5'5", John was our shortest. They stood before me, one Mutt, one Jeff, entirely white. It seems that merely slicing open each bag and emptying it into the ravine had grown boring, so they developed a more interesting routine: they whooshed each bag's contents at each other. They now gave new meaning to the term "white folks." Their lips were white. Their eyebrows were white. Their eyelids were white. Of course their shoes, socks, belts, and all items of clothing were white. Only when they spoke did glimpses of their tongues provide any speck of color on their personages.

Fearing prosecution, I've never told this story before. Canoe Base has closed, however, so it is safe to tell now. Surely the statute of limitations has expired, so I no longer risk prosecution for violating USDA regulations. I fear, though, that I have veered off my main point. Like some ovo-lacto-vegetarians, I have let lacto dominate.

I learned from the movie American History X that all good essays need to end with a quotation. So I spent almost an hour searching for a good joke, to leave you smiling. To my sorrow, I learned that all the compost jokes were rotten.

* * * * *

i. On pp. 15-17 of his engaging book, The Most Important Fish in the Sea (DC: Island Press, 2007), H. Bruce Franklin tells of the various names and species involved.

ii. Franklin, pp. 56-57.

iii. Actually by Horace Miner; see American Anthropologist 58:3, 7/1956.

iv. Admittedly, a compost pile is a system of living beings.

v. The other three are in English. Two treat poetry.

vi. Barbara Pleasant and Deborah L. Martin, Compost Gardening (no place indicated: Rodale, 2008), 220.

________________

Copyright James Loewen

]]>
Fri, 19 Apr 2024 07:39:22 +0000 https://historynewsnetwork.org/blog/153202 https://historynewsnetwork.org/blog/153202 0
History of Sundown Suburbs Threatens the Existence of Diverse Suburbs Across the United States, diverse suburbs are in trouble. In 1990, for example, Dolton (pronounced "DAWL-ton"), southeast of Chicago, was nicely diverse: 58 percent white, 38 percent black, and 5.5 percent Hispanic. (1) By 2010, however, Dolton was 97 percent black. Today Cleveland Heights, just east of Cleveland, struggles to avoid the same fate; the 2010 census reports it to be 50 percent white, 42.5 percent black, 4 percent Asian, and 3.5 percent other. While African Americans per se are not the problem, when suburbs go overwhelmingly black, they lose prestige, even within the black community. Then they lose the ability to attract buyers, so housing prices slide. Then property taxes slide, as do municipal services, and a downward spiral ensues.

The residents of diverse suburbs want them to stay diverse. Some have, most famously Oak Park, just west of Chicago. Diverse suburbs concoct all kinds of strategies to stay diverse. Cleveland Heights, for example, does not supply its demographic data on its website, hoping to entice would-be purchasers by focusing on amenities and activities. Some diverse suburbs have forbidden “for sale” signs, hoping to forestall blockbusting. Others, including Oak Park, have engaged in more elaborate schemes.

The main problem that diverse suburbs face as they try to stay diverse is not located within the diverse suburbs themselves, however, but in former sundown suburbs. Sundown suburbs are communities that for decades were "all-white" on purpose. (2)

Around Chicago, sundown suburbs included predominantly working-class towns like Cicero, middle-class towns like Oak Park, and upper-class towns like Kenilworth. Some of Chicago's now-diverse suburbs, such as Oak Park, started as all-white sundown suburbs. So did some suburbs that are now majority black, such as Dolton. Sundown suburbs are often racially unstable. After all, part of their community ideology had been that blacks hurt property values, commit crime, and so on, so they must be kept out. "Naturally," then, as soon as more than a handful of black households move in, whites flee.

Home in Kenilworth, Illinois. Via Flickr.

Of course, when sociologists say "naturally," we mean that the causes are historical, indeed are so buried in our past that most of us just assume it must be that way. For suburbs to be white while inner-city neighborhoods were black was hardly “natural.” On the contrary, between about 1905 and 1968, about 80 percent of all suburbs of Chicago, Detroit, Los Angeles, and other northern cities went sundown. Various mechanisms, from restrictive covenants to blatant violence, achieved this result. The federal government famously invented three sundown suburbs itself: Greenbelt, Maryland; Greenhills, Ohio; and Greendale, Wisconsin.

Many sundown suburbs remain overwhelmingly white today. For example, Joseph Sears founded Kenilworth, just beyond Evanston on Chicago's North Shore, with four guiding restrictions:

1. Large lots.... 2. High standards of construction ... 3. No alleys. 4. Sales to Caucasians only (meant to exclude Jews too). (3)

Today, Kenilworth still has not a single African American household. One black family did live there for twelve years, even after whites burned a cross on their lawn.

Unfortunately, Kenilworth is the richest and most prestigious suburb of Chicago. "Unfortunately," because many white families who move to Kenilworth do not do so because it is a sundown town. They move there to share in its prestige. In the process, however, they undermine the efforts by interracial towns to stay interracial. Subtly, "interracial" comes to connote "working class" or even "struggling," while "white" connotes prestige, of course.

Incidentally, Kenilworth's wealth does not explain its racial makeup. Almost 7,000 black families in the Chicago area have more annual income than the median Kenilworth family. Yet not one of these families has chosen to live in Kenilworth. Surely that is due to Kenilworth's reputation as a sundown town, along with its continuing whiteness.

Please note that in this discussion I use Kenilworth as a synecdoche for all the former sundown suburbs in the Chicago area -- and across the U.S. -- that remain overwhelmingly white today. How does their whiteness impact interracial suburbs? When whites move to Kenilworth -- from Oak Park, for example -- they make it harder for Oak Park to stay stably interracial. Sometimes real estate agents abet the process, discouraging blacks (and Jews) from buying in Kenilworth on the grounds that "you won't be happy there, you'll stick out like a sore thumb." Or, as the head of Kenilworth Realty said to me, "Birds of a feather flock together." Many agents sincerely believe that they are doing African Americans a service by telling them the racial reality of sundown and former sundown suburbs.

As the map shows, sundown suburbs are often miles away from the places where the battle for integration seems to be taking place ... and often seems to be lost. Kenilworth is miles from Dolton and the other south suburbs that are now trending black. Nevertheless, Kenilworth -- along with Chicago's other sundown suburbs -- is the problem.

Our usual means of ensuring open housing, such as paired-testing, is not really relevant to the problems posed by sundown suburbs (and by independent sundown towns, for that matter). Testing is a good way to identify individuals -- landlords, home sellers, real estate agents -- who discriminate against minority would-be residents. Sundown towns are and have been all-white by community policy, formal or informal. As a result, sundown towns pose problems that traditional testing cannot uncover. These problems lie "upstream" and "downstream" of the actual process of renting or buying. Hence they lie upstream and downstream of the testing process itself. They lie in the corporate history of sundown towns.

"Upstream" Problems

Some landlords, home sellers, and real estate agents in a sundown town are willing to rent or sell to African Americans. Indeed, most may be. Some property owners and agents may even be pleased to do so, to help their community transcend its racial past.

Precisely owing to that racial past, however, few African Americans may seek housing in the community. The town or county has built a reputation as an entity, based on policies and incidents stretching back for decades. It is not easy for acts by individuals to undo this corporate character. Indeed, the town's actions as an entity, along with the reputation they have built up, may preclude the possibility of nondiscriminatory acts by individual would-be sellers or renters. Would-be fair-minded landlords, for example, cannot rent to African Americans if none ask.

"Downstream" Problems

After a landlord, home seller, or real estate agent in a sundown town rents or sells to an African American family, community actions or acts by individual members of the community may undo the occupancy. For example, black children may get called "nigger" when they go to school. Police may follow, stop, and question family members or relatives and friends who visit them. Or a handful of thugs may burn a cross on their lawn or throw rocks through a window.

If the town as an entity does not provide police protection, if the mayor does not make a strong statement supporting the rights of people of all races to live in the community, and if neighbors do not show solidarity with the beleaguered newcomers, then they may leave. Who could blame them? It follows, then, that the town's policies are at issue, not just an individual seller or even an individual thug.

Again, precisely owing to a community's racial past, police may feel they should challenge black newcomers, whose color by itself marks them as strangers. The 5 percent or 10 percent of the population who might shout racial slurs or harass school children feel empowered to do so in a sundown town. Again, testing cannot uncover this kind of "downstream" problem.

Solutions

Any former sundown town that still "boasts" overwhelmingly white demography should be asked to make three statements:

1. Admit it ("We did this.") 2. Apologize for it ("We did this, and it was wrong.") 3. Proclaim they now welcome residents of all races ("We did this; it was wrong; and we don't do it any more.")

Towns must then back that third statement by action. "We have set up a racial ombudsperson." "We are hiring African Americans to end our overwhelmingly white teaching staff and police force." Etc.

Only then will the "silent majority" of willing home sellers, agents, and ordinary citizens feel empowered to speak out for open housing. Only then will people of color in the metropolitan area become convinced that it is not foolish to seek to move in. Until then, a sundown town's reputation and past policies empower precisely the wrong people not only to speak but also to act against open occupancy by people of color.

State governments and HUD need to require former sundown towns to take these steps or in other ways prove they have transcended their white supremacist past. Until they do, federal and state governments might rescind the mortgage interest deduction from income taxes for residents of confirmed sundown towns that have not changed demographically or in explicit policy. After all, while governments do want to encourage home ownership, they do not want to encourage home ownership by white people in sundown towns. The day after losing the deduction for their mortgage interest, every white homeowner in town will seek African American neighbors, just to get it back!

By whatever method, enforcement of open housing in sundown suburbs will solve many of the problems of diverse suburbs. Readers of HNN can help. By sending information on sundown towns and suburbs to me (jloewen@uvm.edu), you will contribute to the interactive map at the sundown town website. Doing so helps towns take that first step: admit “we did this.” Steps two and three then become easier.

* * * * *

1 The total is greater than 100 percent because Hispanics who list themselves “white” or “black” rather than “other” get counted twice.

2 I place "all-white" in quotation marks because a community need not be quite all-white to be a sundown town.

3 Colleen Kilner, Joseph Sears and His Kenilworth (Kenilworth: Kenilworth Historical Society, 1990), 138, 143, her italics.

________________

Copyright James Loewen

]]>
Fri, 19 Apr 2024 07:39:22 +0000 https://historynewsnetwork.org/blog/153283 https://historynewsnetwork.org/blog/153283 0
Revising the SAT To Make It Even Worse Happily, the Educational Testing Service (ETS) — the folks who bring us the SAT — have heard the increasing protests against their product. To be sure, they never gave much heed to FairTest, the nonprofit in Cambridge, MA, that for two decades has mounted devastating critiques of their products. Nor did they pay attention to faultfinding writers like Nicholas Lemann, Howard Gardner, or James Crouse and Dale Trusheim. But they do pay attention to that #1 critic of all: the market.

More and more colleges and universities have been making the SAT and its rival, the ACT, optional. Even worse from the viewpoint of ETS, the SAT has been losing ground to the ACT, which already dominates college admissions in the Midwest and South. In Michigan, for example, according to the Washington Post, the number of people taking the SAT dropped by more than half from 2006 to 2013. In Illinois, the drop was 46%. The SAT has long been #1 on the West Coast, but that dominance is now in danger as well. Even in such Northeastern states as Pennsylvania, New Hampshire, and Vermont, fewer students in absolute terms now take the SAT than did in 2006. The Northeast is ETS's home ground.

In response, Educational Testing Service — which is neither educational nor a service — has announced a major revision in its marquee product, the SAT.

Some of ETS's changes will make the SAT even worse, not better. Removing the "penalty" for wrong (as opposed to blank) answers is one. Up to now, ETS has taken off 1/4 point for wrong answers on the SAT. No longer will it do so.

Why is this a change for the worse?

An example will illustrate. Imagine a student — let's call him "Ernie" — who does not even read the test. He just answers randomly. Maybe he simply shades "B" "B" "B," all the way down the answer sheet. Each item has five alternatives, A, B, C, D, and E, so Ernie would get about 20 of 100 items correct, on average. He would also get about 80 wrong. Using the old SAT formula, his score would be his number correct, 20, minus 1/4 times his number wrong, 80. Since (1/4) x 80 = 20, Ernie's total score would be 20 - 20 = 0.

That raw score has real meaning. Getting a zero is appropriate. Ernie has shown no knowledge. He did not even read the test. His score should be zero. He got no penalty for guessing; his guessing simply was not rewarded. That's why I put quotation marks around "penalty" in my first use of it. ETS then translates this zero into 200 on its arcane 200-to-800 scale.

Imagine a student who did have some knowledge and ability. Call her Suzie. Suzie can read, perhaps slowly, and she can think, perhaps deeply. She answers only 15 of the 100 questions on this imaginary SAT, but because she reads carefully and thinks deeply, she gets all 15 correct. Under the rules up to now, it makes no difference whether she guesses on all the rest, some of the rest, or none of the rest. On average, her raw score will be 15. Not a good raw score, but far better than the zero that random guessing provided Ernie.

Beginning in 2016, copying the ACT, ETS will no longer take off a fraction for wrong (as opposed to blank) answers on the SAT. As always, students will answer every item they reach and feel sure of. The tests are timed, of course. Students who read more slowly, are less familiar with the format, or are simply less verbally glib (including on the math test) will not reach every item.

In the future, test-wise students will use their final 30 seconds to answer every item, perhaps shading "B" "B" "B" all the way down, like Ernie, never even reading the rest of the items. Now ETS will reward guessing, so their scores will improve. Ernie, for example, will get 20, beating out Suzie's original score of 15, unless Suzie resorts to a similar strategy. Yet Suzie knew something. Of those items she answered, she got every one right. Ernie has not even demonstrated that he can read, only the ability to blacken the little ovals under column "B."
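For readers who want to check the arithmetic, here is a minimal sketch, in Python, of the expected-value calculation just described. It is my own illustration, not anything ETS publishes; it assumes 100 items with five choices each, the old rule (one quarter point off per wrong answer), and the new rule (wrong answers simply earn nothing).

```python
# A minimal sketch (my own, not an ETS formula) of the expected raw scores
# discussed above. Assumptions: 100 items, 5 choices per item,
# old rule = correct - 1/4 * wrong, new rule = count correct answers only.

ITEMS = 100
CHOICES = 5

def old_raw(correct, wrong):
    """Old SAT raw score: one quarter point off per wrong (not blank) answer."""
    return correct - 0.25 * wrong

def new_raw(correct, wrong):
    """New SAT raw score: wrong answers simply earn nothing."""
    return correct

# Ernie guesses blindly on every item: expects 1/5 right, 4/5 wrong.
ernie_correct = ITEMS / CHOICES      # 20 on average
ernie_wrong = ITEMS - ernie_correct  # 80 on average

# Suzie answers only the 15 items she is sure of and leaves the rest blank.
suzie_correct, suzie_wrong = 15, 0

print(old_raw(ernie_correct, ernie_wrong))  # 0.0  -- blind guessing not rewarded
print(old_raw(suzie_correct, suzie_wrong))  # 15.0 -- knowledge counts
print(new_raw(ernie_correct, ernie_wrong))  # 20.0 -- blind guessing now wins
print(new_raw(suzie_correct, suzie_wrong))  # 15.0 -- Suzie falls behind
```

The point is not the code but the incentive it exposes: under the new rule, the best strategy for someone who knows nothing is simply to answer everything.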

If you run out of time, before you put your pencil down, just circle “B,” “B,” “B,” all the way down the page.

Clearly the new policy is anti-intellectual. It tells students that right answers are important, even if achieved by gaming the system. It implies that the score is what counts, rather than the knowledge or thinking that it represents.

As well, the new policy turns out to be biased. It is anti-black, anti-female, anti-rural, and anti-poor. Indeed, it hurts everyone who does not match up well with the socioeconomic status of white male residents of Princeton, NJ, where ETS staffers live — whom we might call the "in-group."

That's because out-group members will not get taught to guess randomly — at least not to the degree that suburban white kids will. Most poor people, racial minorities, rural people, etc., do not take the Princeton Review, the coaching school that for decades has helped children of the Establishment "game" the SAT. Princeton Review alumni will guess "B" "B" "B." Others, not so much.

To be sure, the instructions will inform students that nothing is deducted for wrong answers. But they will not be convincing. Perhaps they will say, "Informed guesses can help your score. If you think you can rule out an alternative, you are advised to choose among the remaining answers even if you are not sure which is correct." But they will not say, "You are an idiot if you do not fill in something for every item." Such a statement would come across as too anti-intellectual, even though it is accurate. Nor will ETS suggest, "Simply fill in 'B' 'B' 'B' all the way down." Princeton Review will.

We can infer that ETS will do a bad job of telling students when to guess because in the past they have done a bad job of telling students when to guess. With the old (and still current) scoring system, it is mathematically certain that students should guess randomly whenever they can eliminate one or more alternatives as definitely wrong. If Ernie eliminates one wrong answer on each of 100 questions, then guesses randomly among the four alternatives that remain, he will get, on average, about 25 items correct (1/4 of 100). He will also get 75 wrong. At present, ETS will subtract ¼ of 75, or 18.75, for these wrong (as opposed to blank) replies. Ernie’s raw score will be 25 – 18.75 = 6.25, significantly better than the 0 he “earned” when he did not even read the items. If Ernie can eliminate two choices, he will get about 33.33 items correct (guessing randomly among the remaining three alternatives). ETS will subtract ¼ of 66.67, or 16.67, leaving Ernie with a raw score of about 16.67, much better than 0.
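The same back-of-the-envelope calculation shows why, under the old rule, guessing always paid once a student could eliminate even one choice. This sketch, again my own illustration under the same assumptions as before, reproduces the 0, 6.25, and 16.67 figures:

```python
# Expected raw score under the old rule when a student eliminates k wrong
# choices on every item and guesses randomly among the rest.
# My own illustration; assumptions: 100 items, 5 choices, -1/4 per wrong answer.

ITEMS = 100
CHOICES = 5

def expected_old_raw(eliminated):
    p_right = 1 / (CHOICES - eliminated)  # chance a random guess is correct
    correct = ITEMS * p_right
    wrong = ITEMS - correct
    return correct - 0.25 * wrong

for k in range(3):
    print(k, round(expected_old_raw(k), 2))
# 0 0.0    -- pure blind guessing breaks even
# 1 6.25   -- eliminate one choice per item
# 2 16.67  -- eliminate two choices per item
```

In other words, under the old rule intelligent guessing already paid handsomely; only blind guessing was a wash.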

So what has ETS been telling students to do, regarding this type of intelligent guessing? Here is an example, from Real SATs, a 400-page ETS publication intended to advise high school students how to take the SAT:

As a last resort, if you can eliminate any choices as definitely wrong, it may pay you to make a guess among the other choices.

Test preparation material from ETS and the College Board does not level with students the way material and courses from the Princeton Review do. Therefore unequal access to coaching is one more barrier that confronts "out-group" students like poor people, minorities, and rural residents.

This advice is so weak as to be misleading. Intelligent guessing should not be considered a “last resort.” It will – not “may” – “pay you to make a guess among the other choices.”

Why does ETS so downplay guessing? Well, ETS claims “the SAT I is designed to help predict your freshman grades, so that admissions officers can make better decisions…,” according to page 4 of Real SATs. To do this, the SAT supposedly tests “your reasoning,” “how well you will do in college,” “your abilities,” “your own academic development,” to quote from other early pages of the book. To emphasize that students should guess randomly when they can eliminate an alternative or two is not seemly. It’s a gaming tactic. It does not belong in the same conversation with these other important “abilities.” Surely that’s why ETS has been doing a bad job of telling students when to guess.

The new policy – rewarding guessing – is even less defensible intellectually, as we have seen. Hence ETS will surely be even less forthright about how to “game” it. Many high school counselors and college admissions staff will only compound the problem. Here is an example. Discussing the change, the dean of admissions at St. Lawrence University said, "It will encourage students to consider the questions more carefully and to attempt them, where before if a cursory glance at a question made it seem too complex to them, they may go ahead and skip that question." So he would have students “consider the questions more carefully” even when they don’t get to them. Of course, they don’t have time to do that! What they should do is not consider them at all, just blindly fill in answers. So he, like many others in the college admissions process, will be a source of misinformation to students seeking guidance on whether and how to guess.

Several strands of evidence suggest that many test-takers simply do not guess blindly, even though they should. For example, years ago, when preparing to testify in the important civil rights case Ayers v. Fordice (see Wikipedia for a short summary), I came upon ACT scores for students across the state of Mississippi. SAT and ACT scores both correlate strongly with social class and race. Since Mississippi is at once the poorest and blackest state in the U.S., I was not surprised to learn that many students scored abysmally on the ACT.

I was surprised to learn that in Mississippi in the 1980s, about 7% of white students and 13% of black students scored below random on the ACT.

Ordinary lack of ability cannot account for scores below random. As we saw with Ernie, if one does not read — perhaps cannot read — one still scores randomly, so long as one can shade the little ovals on the answer sheet. To score worse than random is truly a bizarre accomplishment.

There is only one likely way to score below random: probably the students didn't finish the ACT and didn't guess. (1)

Suzie might be an example. If she worked doggedly, read slowly, thought deeply, answered every item she reached, and never used the last 30 seconds to guess, her score would wind up below random.

The guessing issue does not only affect poor test takers, whose scores wind up below random. Many students with scores above random also don't finish and don't guess. More than a dozen years ago, ETS changed how it scored its Graduate Record Exam (GRE), removing the subtraction for wrong (as opposed to blank) answers. Immediately, ETS observed that many test takers were still leaving many items blank, thus artificially lowering their scores. So far as I can tell, ETS then did nothing about this problem. (2) In the material on the GRE online as of March, 2014, nowhere can I find advice to guess. The closest ETS comes is to tell would-be test-takers,

"For each of the two measures, a raw score is computed. The raw score is the number of questions you answered correctly."

That's not very close! The word "guess" appears nowhere in this section — indeed, nowhere on the entire website — not at "About the Test," "Scores," "How the Test Is Scored," nor even "Frequently Asked Questions." Yet anyone who has ever talked with a roomful of test takers knows that "Should we guess?" is perhaps their most frequently asked question.

Not only are minority students, poor people, and rural people less likely to get the word that they should guess randomly. Research shows they are also less likely to believe it. I have seen this myself, when trying to clue in African American students in Mississippi on how to "game" the GRE. Perhaps it's a matter of sophistication — whatever that is — or the narrower concept, test-wiseness. I concluded that students not in the "in-group" are more dutiful, more sincere in a way, perhaps more plodding. They are less likely to believe that one should do such a thing as answer "B" "B" "B" all the way down an answer sheet. Somehow it doesn't seem right to them.

I share their feeling. Blindly answering "B" "B" "B" all the way down an answer sheet isn't right. It's not an activity that should be rewarded. It shouldn't have anything to do with getting into college. It is anti-intellectual.

As well, "out-group" students are less credulous, less likely to believe what they're told. Sometimes this is good. It made them less likely to believe in the Vietnam War, for example.3 When taking standardized tests, however, it hurts them. Girls, too, are less likely than boys to follow advice to "game the system," which accounts for part of the gap between male and female scores on the GRE and ACT.

As a result of these differences in intellectual style, "out-group" students and girls will be even more disadvantaged by the new SAT than they are now by the old one. I have written elsewhere about how and why the SAT disadvantages African Americans and girls (see Eileen Rudert, ed., The Validity of Testing in Education and Employment [DC: US Commission on Civil Rights, 1993], 41-45, 58-62, 73-91, 161; and "Gender Bias on SAT Items," with Phyllis Rosser and John Katzman, Amer. Educ. Research Assn., 4/1988, ERIC ED294915). In brief, the statistical tests to which ETS submits proposed new items guarantee that no questions that favor blacks over whites can ever appear on the final SAT. Neither can an item on the math test ever favor girls over boys. As a result, African Americans do badly enough already! Rural people, compared to students living in the advantaged suburbs of the world, do badly enough already. So do girls, on the math test. To add yet another source of disadvantage by this rule change seems gratuitous, sort of "piling on."

If the SAT did its job well, that might be another matter. Its job is, of course, to predict first-semester college grades. At most colleges, the SAT adds almost nothing to the prediction obtained simply from high school grade point average alone. (4)

ETS has known for decades that the SAT does not measure "scholastic aptitude." Some years ago, "SAT" stood for "Scholastic Aptitude Test." No more. In 1993 the U.S. Civil Rights Commission published the testimony of Nancy Cole, then vice-president of ETS, admitting that the SAT does not measure "aptitude" for college (see Eileen Rudert, ed., The Validity of Testing in Education and Employment, 59 for Cole). The next year, Cole having become president, ETS changed the test's name to the "Scholastic Assessment Test." A few years later, painfully aware that "Assessment Test" was redundant, ETS renamed the SAT once more. Now it merely stands for "S.A.T." — the initials mean nothing at all!

The name change also amounts to nothing at all, however, because most people don't know it occurred. Even in 2014, when asked what "SAT" stands for, college audiences across the U.S. chorus "scholastic aptitude test." Indeed, ETS has done little to popularize the change. Quite the opposite: owing to the invisible use of "scholastic aptitude test" all over ETS's home page, Google sends searches for the term to that site, even though neither "scholastic" nor "aptitude" appears visibly on that page.

Establishment parents hire college placement tutors to tell their children that the SAT doesn't measure aptitude, so they should still apply to college even after getting poor scores. Again, then, African Americans, rural students, children from poor families, etc., remain particularly vulnerable — more likely to infer that their low test scores mean they have low aptitude. That's too bad, because some of them — perhaps Suzie, for one — would do fine in college. After 2016, when ETS rewards guessing, this problem will grow even worse. Some students will then score below random on the SAT, as they do on the ACT. Then they may infer that they are really stupid, even though part of the reason they tested so poorly is that they didn't shade "B" "B" "B" after they answered all they had time for.

Sigh!

1    To be sure, one might have such a perverse way of thinking that one systematically chooses wrong alternatives. Since SAT and ACT items usually do not focus on religious or political opinions but on word usage and math, however, scoring below random owing to perversity seems unlikely.

2    This section rests partly on work by Jeri Grandy, personal communication, 9/2000.

3    See Lies My Teacher Told Me (NY: Simon & Schuster, 2007), 345-54.

4    Neither does the ACT. In Mississippi, for example, adding ACT scores increased the correlation between HS GPA and first-semester GPA at Alcorn State University from .55 to .57, a trivial increase. At Mississippi State University, the increase was from .68 to .71.

]]>
Fri, 19 Apr 2024 07:39:22 +0000 https://historynewsnetwork.org/blog/153327 https://historynewsnetwork.org/blog/153327 0
Scalping Columbus

I just finished reading a new book by Adam Fortunate Eagle, formerly Adam Nordwall: Scalping Columbus and Other Damn Indian Stories: Truths, Half-Truths, and Outright Lies. The University of Oklahoma Press continues its tradition of publishing important books by and about Native Americans. But this one is not like any other university press book, at least none I've ever read.

The problem — or perhaps the fun — of the book begins with its preface by Fortunate Eagle, an Ojibwa:

Some of these stories are so outrageous they appear to be pure exaggeration, when in actual fact they are true; only the author knows which are fact and which are fiction.

Other stories are based on facts, which begged for embellishment. I will stand by the facts until someone accuses me of fibbing. At such time, I reserve the right to accept or reject that challenge based on my assessment of the level of intelligence and knowledge of my accuser....

Some of my stories are total fabrications disguised as the truth. These tales test not only the literary creativity of the author but also the gullibility of the reader. Personally, I find it impossible to distinguish the difference between outright fabrications and bullshit. You, gentle reader, must decide. But don't you agree that bullshit is the fertilizer of the mind?

I consider the preface hilarious, but also thought-provoking. Certainly some American history textbook authors cannot tell the difference between outright fabrications and bullshit.

Surely bullshit is not the fertilizer of the mind, however, until that mind recognizes that it is bullshit, or at least might be. Then, Fortunate Eagle is surely right. Consider A History of the United States, a textbook aimed at high school students and ostensibly written by Daniel Boorstin and Brooks Mather Kelley. It claims, inter alia, that the United States has sought peace around the world. Indeed, not only have we sought peace rather than dominance, we have also done so only reluctantly: "Still a superpower, the United States could not avoid some responsibility for keeping peace in the world." The eleventh-grade mind that believes this claim is not fertilized. As soon as the eleventh-grader questions the statement, however, then his/her curiosity may be fertilized by what the textbook left out. Eventually, s/he may go on to learn what we really did, and why.

The very first story in the book, "Moose on the Loose," prompted me to laugh so hard that I had to put the book down. Seems that Fortunate Eagle, back when he was Adam Nordwall and was only about ten, was out gathering wild rice on the rez in northern Minnesota. He spies a moose, eating some water lily stalks with its head under water. Trying to be a good provider, he fashions a noose from the rope on his boat and manages to lasso its antlers. The ensuing ride winds up in an upheaval that has to be read to be believed ... or perhaps disbelieved — the reader has already been warned.

This photo shows Fortunate Eagle with the Pope shortly after his discovery.

Some other stories are not so funny, because Fortunate Eagle is an Unfortunate Punster. Others treat serious issues that faced (and still face) Native Americans in this nation of ours. Readers should know that this is the man who, in 1973, flew Alitalia to Rome, planted a banner, and famously "discovered" Italy. He was also the leader, according to the FBI, of the American Indians who took over Alcatraz in 1969 on behalf of Native rights. Both of these stories are in the book, one factually, one as fable.

Scalping Columbus includes another novelty. Fortunate Eagle does not really leave "only the author" knowing which tales are fact, which fiction. Instead, he ends with an appendix that tells the curious reader the proportion of bullshit in each chapter. This innovation other authors, especially of U.S. history textbooks, might emulate. 

]]>
Fri, 19 Apr 2024 07:39:22 +0000 https://historynewsnetwork.org/blog/153349 https://historynewsnetwork.org/blog/153349 0
Here's What's Wrong with the Nixon Library and Museum

The Richard Milhous Nixon Presidential Library and Museum has been a source of controversy since it opened back in 1990. It went public in 2007, when NARA, the National Archives, took control. One of NARA's first acts, though it took a while to finish, was to redo the notorious Watergate exhibit.

The museum had mostly omitted Watergate. Its introductory video ended with eight covers of Time Magazine, each of Richard Nixon; none was about Watergate. (Nixon was on 54 Time covers, 11 of which concerned Watergate.) Of the 16 film clips on display at the library, none treated Watergate. When it simply had to discuss Watergate, the library mystified it: "The story of Watergate is enormously complex. Even today, basic questions remain unknown and perhaps unknowable." Issues of right and wrong were not involved: "Given the benefit of time, it is now clear that Watergate was an epic and bloody political battle. . . ." Labels minimized Nixon's role in the ensuing coverup, instead blaming Martha Mitchell and John Dean!

Actually, the "benefit of time" has revealed Nixon's participation ever more clearly. In October, 1997, for example, Walter Pincus and George Lardner, Jr., reported how additional tapes directly involved the president in "providing money that Nixon knew was being used as hush money for the Watergate burglars."1

When the Archives took over, it appointed a new director, Timothy Naftali. He told the Archives and the Nixon Foundation, "I can't run a shrine. I'm a historian." The redone Watergate exhibit opened in 2011, prompting a new controversy: the Nixon Foundation, formerly the sole owner/operator of the museum, objected to it. Later that year, Naftali resigned. Since then, the Nixon Foundation has blocked the appointment of a new director, most recently objecting to Mark Lawrence, professor of history at the University of Texas. His "perspective" on the Vietnam War, revealed in his Oxford University Press book, The Vietnam War: A Concise International History, was "different," in the words of Ron Walker, chair of the Foundation's board of directors. (See the recent story by Jon Wiener in The Nation.)

In a sense, continuing conflict at the Richard Milhous Nixon Presidential Library and Museum is good, because continuing controversy befits Richard Milhous Nixon. For that matter, all presidential museums might benefit from controversy. All of them wind up as neither libraries nor museums, for they invite people neither to learn nor to muse. Rather, they are exercises in spin control, seeking to convince visitors that their subject is both heroic and blameless and certainly blemish free. (I have not really earned the right to write "all of them," for I have not visited all the presidential museums. If you have a candidate for a good museum, one that honestly presents the tough issues related to its president, please comment below.)

I visited the Nixon Presidential Library and Museum in the late 1990s, before the National Archives had anything to do with it. (Therefore, I shall describe its exhibits in the past tense, although I think most of them are still intact.) When I left, having spent the afternoon, I felt that Richard Nixon had just lied to me once again, from the grave, no less. I had the same feeling about Kennedy when I left the JFK Library in Boston. Both museums reminded me of obituaries in third-rate small-town weeklies that feel they must speak only positively of the dead. Upon leaving the Sixth Floor Museum in Dallas, on the other hand, I actually felt better about JFK, partly because that museum had treated me as a knowledgeable adult.

The proliferation of presidential libraries is recent. Every president since Hoover now has one. No president before Hoover got one except Rutherford B. Hayes, who was wealthy enough to build his own. Abraham Lincoln got one, to be sure, but only in 2004. Jefferson Davis, hardly a president of the United States, also got his own Presidential Library and Museum, again recently, in 1998, supported by $4.5 million in state bond funds. It is located on the grounds of Beauvoir, his probable mistress's house, on the Mississippi Gulf Coast. (2)

Surely this proliferation of libraries is unfortunate. Among their alleged purposes is to gather the president's papers in one place for scholarly convenience. Archivists, librarians, and researchers have lamented the resulting scholarly inconvenience. Presidential papers used to go into the National Archives, along with those of cabinet members and other high officials. Pity the poor scholar who is trying to research, say, American invasions of small nations since 1930 — s/he must traipse to West Branch, Iowa (Hoover); Independence, Missouri (Truman); Abilene, Kansas (Eisenhower); Boston, Massachusetts (Kennedy); Austin, Texas (Johnson); Yorba Linda, California (Nixon); Ann Arbor, Michigan (Ford); Atlanta, Georgia (Carter); Simi Valley, California (Reagan); College Station, Texas (Bush I); Little Rock, Arkansas (Clinton); and Dallas, Texas (Bush II), as well as search the archives in Washington. (Researchers must also go to Hyde Park, New York, for FDR's papers, housed in the first of these presidential libraries.)

Interestingly, vice presidential museums seem to be trending in the opposite direction. To be sure, the Dan Quayle Center and Museum opened in 1993 in Huntington, Indiana, joining the Charles Curtis House Museum in Kansas. However, the Charles Dawes Museum and the Alben W. Barkley Museum have closed, a proposed Hubert Humphrey Museum has been abandoned, and the John Nance Garner Museum is undergoing renovation. There are no others, to my knowledge. (3)

When I visited, the Nixon Library was even less useful as a library than other presidential libraries. It didn't house Nixon's presidential papers or even his vice presidential papers. (Nixon's presidential papers were at the National Archives then, to prevent him or his minions from destroying them.) The only items ready for use at the Nixon Library when I visited dated from 1946 through 1952, before he was Vice President. The Nixon Library had not even put out a statement describing its collection, rules of use, and the like. The main activity of its staff seemed to be battling to keep the public away from Nixon's tapes and papers that were in the custody of the National Archives, most of which were unavailable for scholarly use "largely because of delays caused by legal wrangling with the Nixon camp," according to George Lardner, Jr., a journalist paraphrasing the acting director of the National Archives' Nixon project. (4) Moreover, the library's first director, Hugh Hewitt, "announced that researchers deemed unfriendly would be banned," singling out Bob Woodward of the Washington Post. In October, 1997, I asked Susan Naulty, the archivist at the Richard Milhous Nixon Library, to "kindly send me information as to your collection and how to make arrangements for using it." She never replied.

Most visitors never see the library part of a presidential library. They aren't supposed to. The libraries are for researchers. Connected with the libraries are "museums," but it is hard to use the word without quotation marks, for they invite people neither to learn nor to muse. Rather, they are exercises in spin control, seeking to convince visitors that their subject is blameless and blemish free. "Shrines" would be a better term for them.

The Nixon Museum exemplifies the problem. During Watergate, perhaps the most telling point made by Nixon defenders was their claim that Richard Nixon only did what other presidents were doing, just a little more obviously. (5) In a way, the Richard Milhous Nixon Library and Birthplace is the perfect monument to the Nixon Presidency, because it only does what other presidential libraries do, like those for Kennedy and Reagan, just a little more obviously.

Its museum part is even less accurate than those of its obvious competitors, the Kennedy, Johnson, and Reagan libraries. Consider this content-free claim about Nixon's crucial 1968 presidential campaign: "He was not afraid to take principled stands. By the time Election Day arrived, the electorate knew exactly where Nixon stood on the great issues of the day; he stood with them, and they stood with him." Such rhetorical fog is all too common during campaigns, but decades later, the museum still did not reveal what the issues of the day were, let alone what "principled stand" Nixon took on any of them.

On the Vietnam War, the JFK, LBJ, and Richard Milhous Nixon libraries offer no new information but provide object-lessons in spin-control. Recognizing that the war was a mistake, the Kennedy Library blames Eisenhower and Johnson, the Johnson Library blames Kennedy, and the Nixon Library blames Truman, Eisenhower, Kennedy, and Johnson. On this point, the Nixon Library is closest to the mark, because all four of his predecessors played important roles in expanding America's involvement in Southeast Asia. But then the Nixon Library blows its credibility by proclaiming, "[Nixon] brought peace with honor." Perhaps aware that some visitors might recall that America's exit from Vietnam was accompanied by neither peace nor honor, a later exhibit finds someone else to blame: Congress, for cutting off funds for the war! Whoever wrote these exhibit labels seems not to have noticed the incongruity in praising Nixon for peace while berating Congress for not supporting the war.

As well, nowhere did the museum admit that all that Nixon's Vietnam policies had achieved — indeed, all that he and Henry Kissinger even intended to achieve toward the end — was "a decent interval," as Kissinger put it, between American withdrawal and South Vietnamese capitulation. Of course, the public might think badly of Nixon (and Kissinger) if they understood that the enormous sacrifice of life during his presidency (22,000 American lives, about 500,000 Vietnamese, 50,000 Laotians, and 250,000 Cambodians), as well as many billions of dollars, was just to help Nixon/Kissinger look better politically.

Misrepresentation at the Nixon Library was so pervasive that the museum lost all credibility with me. Fawn Brodie said of Nixon that he told "unnecessary lies," and so did his museum. One whole room treated civil rights. Repeatedly its exhibits claimed that more schools were desegregated during Nixon's six years than in any other comparable period in American history. One label said that Nixon "brought the full authority of the White House to bear on desegregating southern schools without violence or coercion." Nonsense! More schools were indeed desegregated, but despite Nixon! Shortly after he took office, Nixon ordered the Justice Department to change sides and oppose desegregation before the Supreme Court. Most observers believed he had cut a deal with Mississippi Senator John Stennis, chair of the Senate Armed Services Committee: give me more funds for the Vietnam War and I'll stop desegregation in your state. He had cut no deal with the Court, however, which ordered full desegregation as of Christmas break, 1969-70. All Nixon really accomplished was a four-month holdup, informally known in Mississippi as the "Stennis delay."

Another exhibit praised Nixon for having progressive policies toward American Indians. He did. He appointed a Native American to head the Bureau of Indian Affairs. During Nixon's six years in office, BIA schools hired many more Native teachers and principals. To this day, many residents of Taos Pueblo revere him for returning their sacred Blue Lake to their ownership. But an institution that praises Nixon for school desegregation, falsely, may not be believed when it praises him for his Indian policies, accurately.

The Nixon Library and Museum was already large when I visited. It incorporated his birthplace house, a garden, and 55,000 square feet of exhibits. Yet it had almost no visitors. In almost every room, I found myself alone. As I left the museum, I asked the admissions clerk how many people visited in an average day. She glanced at my note pad and then replied, "Oh, we have no idea!" Of course, that's not true. Every museum tracks the number of visitors. I thought this display of secrecy, coupled with apprehension about who I might be, made a fitting final exhibit for its protagonist. But the sales clerk in the museum gift store had not gotten the word that the subject was taboo. "Less than a hundred on a weekday," she replied, not counting school groups. "Maybe 300 on a big weekend day."

Considering how poor the presentation of history at the Nixon museum was, surely it was good that so few people visited. That way, its counterfactual spin did less damage. After the new director, whoever s/he may be, has de-sanitized its portrayal of our 37th president, the Richard Milhous Nixon Presidential Library and Museum needs to mount exhibits that treat hard subjects, thus prompting controversy. Then people will come. Suggestions:

— Nixon's continuous connections with organized crime, including his friendships with Bebe Rebozo and Teamster leader Jimmy Hoffa. These connections go way back, even before he defeated Helen Gahagan Douglas for the U.S. Senate in 1950. Did they affect Nixon's direction of the Department of Justice and the FBI? His commutation of Hoffa's sentence? His anti-Communism? (Rebozo and other Nixon cronies were tied to gambling and other vices in the Cuba of dictator Fulgencio Batista, which Castro brought to an abrupt end.)

— Nixon's operatives sabotaged LBJ's last chance for peace in Vietnam by promising our puppet government in South Vietnam that Nixon would be far more supportive of it than Johnson, so it should wait for him to take office. Accordingly, during the last days of Nixon's 1968 campaign against Hubert Humphrey, the Saigon regime refused to participate in the peace talks that Johnson was trying to set up, eliminating the possibility that peace in Vietnam might give Humphrey the presidency.6

— Initially at least as progressive on civil rights as JFK, Nixon stumbled into his "Southern strategy" to ward off defections to Alabama Gov. George Wallace. Despite fine reporting by Thomas and Mary Edsall and others, most Americans still don't realize that Republican presidential candidate Barry Goldwater purposefully and effectively appealed to Dixiecrats in 1964. Nixon then maintained and solidified the Republican Party in 1968 as the party of overt white supremacy.

In 1972 Richard Nixon proclaimed, "When information which properly belongs to the public is systematically withheld by those in power, the people soon become ignorant of their own affairs, distrustful of those who manage them, and — eventually — incapable of determining their own destinies." Surely he was right. But Nixon never took that sentence seriously. Neither does his library.7 When it does, the resulting controversies will attract thousands, and their visits will be enlightening.

1    Walter Pincus and George Lardner, Jr., "President Nixon on Watergate Hush Money," Washington Post, 10/30/97.

2    Cf., inter alia, Carol Bleser, "The Marriage of Varina Howell and Jefferson Davis," in C. Bleser and L. Gordon, eds., Intimate Strategies of the Civil War: Military Commanders and Their Wives (NY: Oxford UP, 2001), 23-26. Whether Davis was sexually intimate with Sarah Dorsey seems not now recoverable.

3    There is a "Country Life Center" in Iowa that is the birthplace farm of Henry A. Wallace, but it is more about farming than about FDR's vice president.

4    Pincus and Lardner, op. cit.

5    This interpretation seems to have won: 70% of Americans in a survey in the late 1990s agreed he did nothing "worse than what other presidents have done."

6    Robert Dallek, "Three New Revelations About LBJ," Atlantic Monthly, 281 #4 (4/1998), 44.

7    1972 Presidential proclamation to strengthen the Freedom of Information Act, quoted in Tim Weiner, "The Cold War Freezer Keeps Historians Out," NY Times, May 23, 1993. 

The CIA Has a Museum?

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

Five years ago, traveling on the Capitol Limited overnight to Chicago, I chanced into a conversation that was remarkable even by Amtrak's usual high standards. After leaving DC at 4:05PM, we journeyed along the Potomac to Harpers Ferry. That part of the ride is so scenic that I always savor it with a craft beer in the "Viewliner Lounge," which boasts an all-glass upper level. At 6PM I walked to the diner (not the cafe car, no, a real dining car with selections like steak, fish, pasta, etc.). As usual, Amtrak seated me with other passengers.

First to sit opposite me was a man who introduced himself as Bert Sacks, the founder of IraqiKids.org, an organization devoted to making Americans aware of the plight of the children of Iraq before and during our second War on Iraq. He was going back home to Seattle (a three-day trip!) and inquired where I was going. I told him to a speaking engagement in Milwaukee, and he asked what about. I told him about my bestseller, Lies My Teacher Told Me, and he replied, "No kidding? You wrote that book? I love that book."

I actually get that a lot, so it no longer embarrasses me, and I thanked him. In the meantime, another man had joined us, seated next to me. He looked like a classic spy, complete with black eye patch on a black elastic strap over his head.

Mr. Sacks then bent my ear about his "favorite" lie in American history: a lone gunman killed JFK. He said no, it was the CIA. He cited a book by the chair of his board, James W. Douglass, JFK and the Unspeakable: Why He Died and Why It Matters. This book "proves the CIA did it," Sacks insisted.

I was not so sure. "I don't think Oswald acted all alone, and then Jack Ruby acted all alone in killing him," I replied agreeably, "but I'm not sure it was the CIA. Oswald and Ruby both had ties to the Mafia — indeed, to the same branch of the Mafia, I understand. Or it might have been Castro. Certainly he had motive, since Kennedy kept trying to kill him."

At this point my seatmate broke in. "The CIA didn't kill Kennedy," he said quietly. "The CIA isn't competent to kill Kennedy and get away with it. And I should know, because I retired last month as Inspector General of the National Reconnaissance Office of the CIA."

I was impressed, though Sacks was not. The former Inspector General, who turned out to be named Eric Feldman, said to me, "You need to see the material on JFK in the CIA Museum." But then he corrected himself: "Oh, but then you won't be able to see the CIA Museum."

Later in the conversation, realizing that I had academic credentials, Feldman said I might be able to see the CIA Museum after all. He suggested I try.

Last fall, I finally got around to doing so. I emailed the Office of Public Affairs, which the CIA's website emphasizes is "the single point of contact for all inquiries about the Central Intelligence Agency (CIA)." Molly Hale of the OPA replied on November 26, 2013, telling me my message had been received, giving me a confirmation number, and saying she had forwarded my request and would get back to me about it.

"I let a month pass, and then a few more days, to allow for the holidays," I emailed to Ms. Hale, following up on January 3, 2014. "It is appropriate for the CIA to respond to my request, and to respond favorably," I continued. "I am an honored sociologist and a 'Distinguished Lecturer' of the Organization of American Historians."

And I am! The American Sociological Association has given me no less than three awards, including in 2012 its Cox-Johnson-Frazier Award for "outstanding achievement in research, teaching, and service with ... particular focus on human rights and social justice."

That made no difference. Hale never got back to me.

So on April 28, 2014, I phoned the OPA. A woman answered; I repeated my request and referenced my earlier email from "Molly Hale." She said she would look into it and get back to me. As she was about to hang up, I said, "Wait a minute: who am I talking with? Please give me your name." "Molly Hale," she replied, with some embarrassment.

It being the CIA, it's possible that all their female employees use the name "Molly Hale," doubtless descendants of Nathan.1 If not, it was surprising that she had not identified herself earlier, when I'd told her about my e-correspondence with "Molly Hale."

Ten days later, on May 8, Molly called me back. The museum is not open, she said — not to anyone, except CIA employees. "We do have an exhibit," currently in Seattle, she went on, "60 Treasures of the CIA."2

"That's interesting," I replied, "but it's not about the Kennedy assassination."

"We do not have an exhibit on President Kennedy or his assassination," Hale replied. "Maybe you were thinking of another museum?"

So I told her of my meeting up with Inspector General Feldman. Of course, I did not have another museum in mind.

It made no difference; I was not getting into the CIA Museum. And neither are you. Yet our tax dollars pay for the museum.

That's too bad, not only for me (and you), but also for our country. The CIA Museum should be open to the public, or at least to the vetted public. After all, it's our museum. We paid for it.

Of course, you might reply, especially those of you who are with the CIA, if the CIA Museum were open to the public, and if it were any good, then anyone could learn secrets of the CIA. That would endanger the nation, including the very public that the CIA is supposed to protect.

Perhaps. I used to think like that, until events of the 1960s changed my mind. I recall television coverage from Hanoi in 1966, showing how the Vietnamese government had dropped off concrete sewer pipes in front of homes in nice tree-lined neighborhoods. Residents were then supposed to bury them in their yards as makeshift bomb shelters, in case the United States bombed them. I snorted at North Vietnam's paranoia — as if the United States would actually bomb their residential neighborhoods. Doing so was a war crime. Moreover, I knew it would only unite the populace against us.

Then came New York Times reporter Harrison Salisbury's pathbreaking trip to Hanoi later that year. On Christmas Day, he reported on residential areas the United States had bombed in Hanoi, giving the lie to American claims to have struck only military targets.3 The next day, the Pentagon conceded they had "accidentally struck civilian areas in North Vietnam."4

This process — events kept from the American public, events even directly denied, then proven to be true — kept happening. Over and over, in the 1960s and beyond, the United States government has been caught doing terrible things, usually after denying them. Nixon's "secret" war against Cambodia, one of his escalations of the war against Vietnam, provided another example. From whom was he keeping this war "secret?" Surely not from the Khmer people — they knew we were bombing them! Our war on Cambodia was secret only from the American public — again, the people paying for it. Most recently, we have the NSA spying, probably not secret from true cyberspies, but certainly unknown to the American people, who were paying for it.

Many Americans agree with Bert Sacks and James Douglass and blame the CIA for killing Kennedy. On the fortieth anniversary of his assassination, 37% of a Gallup poll sample chose my tentative thesis, the mafia, while 34% said the CIA. (Lyndon Johnson trailed with 18%, followed by the Soviet Union and Cuba tied at 15% each.) Another 37% did not believe any of the above; presumably most of them think Oswald acted alone.5 It would interest all Americans to know what the CIA thinks about the assassination, or at least what it says in its museum. I mentioned the CIA Museum to officials of the estimable Sixth Floor Museum, which provides a nuanced discussion of the various theories of Kennedy's and Oswald's deaths. They had not known that the CIA has a museum, let alone that it had an exhibit on their topic. Does Oliver Stone know? Did you? I did not — until Amtrak.

To summarize, let's see how this works. On the one hand, maybe the CIA has real knowledge about who killed Kennedy. At the very least, it certainly had a file on Lee Harvey Oswald. It certainly testified to the Warren Commission. It must know something. If so, if it speaks with authority and has important information to impart, then it should do so! People like Sacks, Douglass, Oliver Stone, the staff of the Sixth Floor Museum, and you and me need to know! (Of course CIA museum staff would redact names of people who might still be endangered by exposure. The CIA leads the world in redaction! Of course they would not display material that might disclose how the Agency learned some things. And of course they would vet who gets in.)

On the other hand, maybe the CIA did it! If so, surely its exhibit would obfuscate its involvement. In that case, if its interpretation is fraudulent, then people like Sacks, Douglass, and the rest of us need to see it so we can critique it. It certainly is not in our national interest for the CIA to make its own agents stupid by showing them wrong information in its own museum. For that matter, if the CIA didn't do it but its interpretation is nevertheless incompetent, then we all — including our best historians and political scientists — need to see and critique it, to help keep the CIA from misleading its own staff.

Either way, this is our museum, not "theirs" — the CIA's. For that matter, it's our CIA, not the CIA's CIA. Woody Guthrie once wrote a song about something like that.

1    After all, years ago every Pullman porter was required to identify as "George," taking the first name of CEO George Pullman.

2    In an email accompanying her phone call, Hale wrote, "Thank you for your interest in the CIA Museum. We are not open to the public. If you are in Seattle between now and 1 September we have 60 of our treasures traveling with 'Spy The Secret World of Espionage', currently at the Pacific Science Museum."

Maybe some of you have seen it and will comment on this article, describing and assessing it. From the CIA's announcement, it seems to be celebratory and self-congratulatory.

3    Harrison E. Salisbury, "A Visitor to Hanoi Inspects Damage Laid to U.S. Raids," NY Times, 12/25/1966.

4    Neil Sheehan, "Washington Concedes Bombs Hit Civilian Areas in North Vietnam," NY Times, 12/26/1966.

5    Obviously, respondents could choose more than one. 

I Don’t Like It, So It Must Be Art

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

On vacation in Europe, my wife and I visited several art museums. The exhibits were interesting, sometimes beautiful, except when they paid allegiance to "conceptual art" and its associates — performance art, installation art, minimalism, and the like. These works merit quotation marks around "art" because often their point is to question sardonically the definition, meaning, and worth of art itself.

Colleges have for years educated students about such classics as Marcel Duchamp's 1917 work "Fountain" and Kasimir Malevich's 1918 painting "White on White." "Fountain" was merely a urinal signed and displayed as a sculpture. "White on White" is now on display in the Museum of Modern Art (MOMA) in New York City. It is not simply a piece of canvas, painted white and framed, but actually shows brush strokes; it portrays a tilted bluish-white square on a background of grayish white. Some time later, artists took Malevich a step further by hanging simple pieces of canvas, painted white or another color1 with rollers to eliminate brush strokes. Some have a straight line across them, dividing one color from a second. Others contain an entire square in that second color.

This is, of course, "minimalism."

In the last half century, millions of museum-goers have stood before such works and muttered, "I could do that!" Few ever do, partly because they mean the sentence as a disparagement, not a plan.

Museum Ludwig in Köln displays such a piece, "Water," by Rosemarie Trockel. In 2004, "Water" won the Wolfgang Hahn Prize of the Gesellschaft für Moderne Kunst (Society for Modern Art), allied with the Ludwig. This award carries a lot of prestige as well as "up to" 100,000 euros (about $134,000). 

Rosemarie Trockel, “Water,” 2004

To the untrained eye, "Water" seems to be a large square of cloth, uniformly rust colored. However, as guest juror Dr. Silvia Eiblmayr tells it, it's not as simple as that:

Rosemarie Trockel had a knitted image made, this time by hand, in which, so to speak, various handwritings can be detected. A monochrome, but not quite homogeneously structured image, because of the different 'knitting temperaments' (its title is Water), which is mounted on a canvas-covered frame and then enclosed once again by a light-colored frame. Produced with the very specific conception of a 'painting-machine,' the knitted image will find its multiple references to the icons of Museum Ludwig, but also, with aesthetic wit, to the stone floor whose colour it will absorb.

My untrained eye, incidentally, saw no handwork in the knitting, let alone "different temperaments." I think Eiblmayr probably got it right the second time: a machine knit the cloth. On its lower floors, Museum Ludwig does have some fine works of art. Most are flat, some are square, and a few incorporate a rusty brown color somewhere. Other than sharing those characteristics, "Water" does not refer to them.

John Cage pioneered minimalism in music with his notorious 1952 composition "4'33"." A pianist comes out, sits at the keyboard for four minutes and thirty-three seconds, then stands to acknowledge the applause of the audience.

I realize that those of you in the intelligentsia, or at least those in that subset of the intelligentsia known as the cognoscenti, have already written me off as a barbarian if not a buffoon for the foregoing. "Merely a urinal" indeed! Maybe it was a urinal, but Duchamp's calling it art changed everything, you dolt! And yes, Loewen, while you could go home and paint a canvas white, you didn't, and even if you did, you didn't do it first. You could also sit at a piano for five whole minutes, but that doesn't count. That's not art.

Well! It happens that in 1967, when I was a graduate student in sociology at Harvard University, I bought some burlap cloth, also rust-colored. For some time, I kept it around. I was going to build a frame for it, stretch it over, then put thin black-painted slats around it as an outside frame, followed by shiny aluminum slats outside the black. The aluminum would form a "floating frame," adding some drama. I think I even finished the project, although I lost custody of it in divorce. But that does not matter. As the famed American artist Sol LeWitt put it, also in 1967, "In conceptual art the idea or concept is the most important aspect of the work." I had the concept first! As LeWitt himself notes, "the execution is a perfunctory affair."

The real problem is that no one has acknowledged that I am a great conceptual or minimalist artist who antedated Trockel by some 37 years. I do have an oeuvre, by the way. Three years after the burlap, I constructed a bas relief that I titled "I Saw the Figure Four in Pine." Made from children's blocks (by then we had two kids), it was wholly abstract, although some of the blocks that stood out seemed to form a "4." The title was a reference to "I Saw the Figure Five in Gold," a major work by Charles Demuth, but of course only the cognoscenti would get that. My friend Alice Walker, soon to become famous as poet and novelist, liked the work so much as to ask if I'd make one for her. I did, and in return she gave me a work of art illustrating a poem about Mississippi by Rosellen Brown.

Despite that body of work, no one has said of me what Eiblmayr said of Trockel:

Here, as in her entire oeuvre, Rosemarie Trockel's great art is her comprehensive critical epistemological and poetic knowledge of things and their gestalt. She is very close to them in a fascinating form, but then assumes distance again by displacements, inversions and refractions.

"Comprehensive critical epistemological and poetic knowledge of things and their gestalt!" Holy poopy! To think that all I saw was a piece of rust-colored cloth!

The Ludwig devotes an entire floor to such pretense. One work consisted of nine large panes of glass, mounted at a slight angle to the vertical, each separated by about half an inch. The artist titled it, as I recall, "Nine Panes of Glass." I could not argue with that. Many other works were simply titled "Untitled." Again, I could not argue, since their subjects and meanings eluded me, too.

This October, the Ludwig will mount an entire exhibit, "Stains, 1969," by the American artist Ed Ruscha. The Ludwig waxes enthusiastic about this attraction: "Instead of applying paint to the surface of a canvas he dripped fluids — a total of 75, from tap water and coffee to blood, nail polish, and sulfuric acid — on paper. They left behind the most diverse stains."

Ed Ruscha, Stains: Gunpowder, 1969

According to the Ludwig, this series was "pioneering for his oeuvre." I submit we all have done work like that! Had we just thought to save our collection of Bounty towel squares, such work might have been "pioneering for our oeuvre." Again, concept is key, along with some inflated commentary by an art critic.

In 2008, the science fiction writer John Scalzi displayed a related work of “art” and wrote a hilarious article about it, “I Don’t Know Art, But I Know What I Like.” The artwork is a roll of toilet paper shredded by his cat. Scalzi added the title, “I Will Be There At The End of All Things, or, The Shreddination.” His article provides some wonderfully ponderous explication, mocking the prose style of the Ludwig and other art museums. I would quote him, except part of my purpose in writing this article is to get you to visit his very brief web page yourself.

Unfortunately, galleries in lesser-known centers of art emulate rather than mock big-city institutions like the MOMA and the Ludwig. The Hafnarhús, part of the Reykjavik Art Museum,2 displays, inter alia, sheets of white paper that are not uniformly white. They contain faint smudges on one half. So it goes. Such art says nothing about Iceland, nothing about the artist's relationship to Iceland, indeed, nothing at all. Johannes Kjarval, featured in another building of the Reykjavik museum, famously said, “Art is too serious to be taken seriously.” Smudge artists ask us to take seriously stuff that they don’t really consider even to be art.

Of course, not all pictorial art need be local. Nor must it even be representational. Icelandic artists have as much right to participate in the currents of world art as do artists in New York City. The Icelandic smudges are merely visual versions of "4' 34," however. They don't "advance" art, if indeed art is progressive in the first place.3 They simply repeat a gesture. So do most of the examples of conceptual art in the Ludwig, Amsterdam's Stedelijk Museum, etc.4 They don't try to engage the viewer. They don't show what the artist finds interesting about his/her subject. On the contrary, these works are "post-engaging" and "post-interesting" as well as "post-modern." The artists seem to be engaged only in pissing us off, the audience, and not even to be seriously absorbed in that arid task.

Or maybe I'm a barbarian if not a buffoon. Up to you!

1    Yes, I know that strictly speaking white is not a color.

2 Can you guess which airline we took home?

3    Remember Picasso's exclamation upon first seeing the Lascaux cave pictures: "We have discovered nothing!"

4    For the record, other floors of the Ludwig and Stedelijk museums and other branches of the Reykjavik museum display much more interesting art, at least to me.

Let Our Seniors Go

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.  

Everyone knows how our prisons bulge with too many young people, mostly black, put there with too long prison terms, for nonviolent crimes like selling or possessing illegal substances. Our prisons also bulge with too many old people, mostly black, put there with too long prison terms, for violent crimes like murder and rape. In both cases, we waste money and other resources.

Luckily, the State of Maryland finds itself in the midst of an unplanned but useful sociological experiment on senior incarceration. A recent Washington Post article tells how, in 2012, Maryland's highest court, the Court of Appeals, ruled more than 250 trials conducted before 1980 unconstitutional. Judges had given juries flawed instructions.1

The flaw is complicated. Judges told juries they were to judge not only the facts of their case but also the law. In Unger, the trial that has lent its name to this whole class of cases, the judge said, "Anything which I may say about the law, including any instructions which I may give you, is merely advisory and you are not in any way bound by it." Unfortunately, he included the phrase "must be convinced beyond a reasonable doubt" within the list of instructions that juries might disregard. Juries must not disregard that phrase. That phrase is part of what we mean by "due process" and is guaranteed by the 14th Amendment to the U.S. Constitution. Accordingly, all Maryland trials in which judges gave such instructions have now been deemed unconstitutional, retroactively.

As might be inferred by the length of their sentences, most of these prisoners had been convicted of serious crimes. In 1975, Merle Unger shot and killed an off-duty police officer in the aftermath of robbing a store. The next year, a jury initially convicted him of felony murder, armed robbery, and using a handgun in the commission of a felony. Other prisoners had been locked up for kidnap and rape. Now they might all be freed, unless retried and reconvicted.

It's not easy to retry people 40 years after a crime took place. Police often discard physical evidence after 25 years. Eyewitnesses die. Defending prisoners so long afterward is hard too. Perpetrators2 have no better memories than the rest of us. Where were you on the afternoon of July 22, 1977?

Hence most of these prisoners and their prosecutors are making deals to get free so long as they accept parole. After all, they have moved on to another portion of their life span. They are old. Some are infirm. One came into court for his hearing on a gurney. At least two came in wheelchairs. Others walked, but with difficulty, suffering from arthritis and diabetes. They are unlikely ever to re-offend. Freeing them serves the public interest.

Other prisoners are unlikely to re-offend because they have changed mentally. Some have earned college degrees while behind bars. Karl Brown was convicted at 17 in the fatal shooting of a man during a robbery. In prison he changed his name to Kareem Hasan, converted to Islam, and claims to have changed his way of thinking. Released in May, 2013, Hasan, now 55, got a job working at a wastewater treatment plant. As of November, 2013, he was saving money to buy a car. As quoted by the AP, he said then that he and others were excited to get a "chance to show we're not animals" and "prove our worth." He also spoke to the families of the victims: "I don't want the victims to think that we are not remorseful." "I pray every day that God will forgive." As of June, 2014, he was still employed at the treatment plant.

Certainly 55 is a different phase in the life cycle than 17. According to Michael Millemann, law professor at the University of Maryland who has been working on these cases, most defendants show signs that they've reformed after decades in prison. Data bear him out: the correlation between age and recidivism is strong and negative. The older the parolee, the less likely he will re-offend.

Forced by the 2012 ruling, localities across Maryland must review these cases. In Baltimore, Gregg Bernstein, the state's attorney, said his office set up a "deliberate, thoughtful, comprehensive" process that weighs each case individually, determining whether the prisoner can be reconvicted while assessing what threat he poses to public safety. The state's attorney in Annapolis, on the other hand, doesn't plan on making any agreements. "There are other jurisdictions within the state of Maryland that are making deals with these individuals - our policy is to fight them," said Anne Colt Leitess. "We are not willing to allow people convicted of murder to simply walk out the door without a fight."

So far, the state has released 69 prisoners. Two, including Unger, have been reconvicted. Eight others faced new trials, of whom three have already pled guilty. At least one lost his chance for a new trial on the ground that the instructions given to his jury differed enough from those given Unger's jury to be constitutional. Some 120 cases are still under review. Three prisoners have died while waiting, and the statuses of some were unknown to the author of the Washington Post story.

The 2012 ruling reveals both a tragedy and an opportunity. The tragedy is that so many of these prisoners could have been released a decade ago, and that so many other prisoners who died in custody in the 1990s and 2000s could also have been released in their old age, well before they died.

This used to happen in America. It used to happen in Maryland. Then in 1993 a prisoner with a life sentence was out of jail on work release when he killed his estranged girlfriend and then himself. "Get tough on crime" was the public outcry, and Maryland Gov. Parris Glendening responded by changing the policy so anyone with a life sentence would die in prison.

Across the United States, more than 230,000 male prisoners are over the age of 50. That's about one in every six. In jurisdictions like Baltimore that have put thoughtful processes in place, at least 3/4 of these seniors are winning parole. Criminologists and penologists in Maryland now have a wonderful opportunity to study how they do. Hardliners might predict that these men, having been restricted for so long, face an impossible job in reconnecting to society and should never be let out. I suspect the evidence will prove otherwise.

Other states and the federal government should then use this evidence to free many of their own seniors. Not only is it sad and lonely to grow old, get sick, and die in a prison cell — it's also expensive. We can increase our humanity as a society while we decrease our expenditures. At the same time, we can take a step toward losing our current unfortunate title as the major country with the highest proportion of its citizens behind bars.

1    Lynh Bui, "A victim's family relives an old pain," Washington Post, 9/25/2014.

2    I use "perpetrators," "murderers," etc., rather than "alleged perpetrator," "accused rapist," etc., because in the overwhelming majority of these cases, guilt is not the issue. Moreover, this article is not about any alleged miscarriage of justice at trial.

Copyright James W. Loewen

This is What Flip-Flopping on States' Rights Looks Like

Same-sex marriage in San Francisco City Hall (Wikipedia)

Sociologist James W. Loewen is the author of Lies My Teacher Told Me. 

I have long claimed that whoever controls the federal government is against states' rights. Whoever loses control of the federal government favors states' rights. For example, in The Confederate and Neo-Confederate Reader, I noted that when South Carolina left the Union, it "was not for states' rights, but against them." This was only to be expected, I observed, "because Southern planters had been in power during the Buchanan administration, indeed, throughout most of our history."

In the 1850s, when Southern planters ruled Washington, they ran roughshod over the ability of voters in Kansas to choose to organize as a free territory and state. No "local control" if that control might be anti-slavery! They also got the United States to buy what is now southern Arizona and New Mexico (the Gadsden Purchase), hoping to make the first transcontinental railroad go west from New Orleans rather than Chicago. So much for limited federal government! They denounced New England states for letting African Americans vote, even though voter qualifications were a state matter until the Fifteenth Amendment passed during Reconstruction, two eras later. No states' right if exercising that right might violate white supremacy!

Recently, Ed Meese, longtime Republican campaign strategist, provided another graphic example of this principle. In the mid-1990s, he and most other Republicans argued for (and won passage of) the Defense of Marriage Act.1 Its purpose was to limit the ability of states to allow same-sex marriages. DOMA defined "marriage," for federal purposes, as "only a legal union between one man and one woman" and provided that no state need extend the "full faith and credit" usually accorded to other states' laws to another state's same-sex unions.

A decade later, during the George W. Bush administration, Meese and many other Republicans professed alarm that several states, not just Hawaii, were considering legalizing same-sex marriage. Again giving no weight to states' rights, they now tried to pass a constitutional amendment banning same-sex marriage. Had they succeeded, no state could have favored gay marriage.

Last week, Meese completed a stunning flip-flop. Fearing that the U.S. Supreme Court was about to decide that barring marriage to gays takes away a civil right (and civil rite) that should be constitutionally protected, he took to the op-ed page of the Washington Post. "The states should decide on marriage" was the title of his print article, co-authored by Ryan Anderson. In it, he now claimed that "the central legal question" is: "Who gets to determine marriage policy?" They went on to quote one of the judges who opposes same-sex marriage: "The Constitution is silent on the regulation of marriage; accordingly, that power is reserved to the States..." Shamelessly, they now asserted, "Nothing less than the future of our society, and the course of constitutional government in the United States, are at stake."

It isn't just right-wingers who flip-flop on states' rights. If the 5-4 conservative majority on the Supreme Court were to overturn Roe v. Wade next week, then those who advocate that pregnant women should have the right to choose will immediately argue that abortion should remain legal in the states that allow it. I do hope that they are a tad less hypocritical about it, but they may not be.

As for secession: shortly after their defeat in 1865, some ex-Confederates, led by their former president and vice president, started saying that they had seceded for states' rights. Despite the bald evidence of their state secession statements in 1860-61, all of which lambasted Northern states for exercising states' rights, these revisionists won the day in the 1890s. Nevertheless, there were still a few honest former Confederates who knew better and said so. John Mosby, the Gray Ghost of the Confederacy, was one. "In February 1860 Jeff Davis offered a bill in the Senate which passed making all the territories slave territory," he wrote in 1907. "He was opposed to letting the people decide whether or not they would have slavery." So much for the states' rights theory of what the war was about!

Perhaps some equally honest Republicans will call Meese on his flip-flop. Meese wants the states to decide on gay marriage, not because he favors states' rights, but because he opposes gay marriage. There! Did it hurt that much just to say it?

1    To be sure, many Democrats voted for DOMA too, and Bill Clinton signed it.   

Copyright James W. Loewen

More Bad History About Gay Marriage

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

Matthew J. Franck at National Review Online has transcended my discussion of similarities between same-sex marriage and secession. He compares judicial decisions in favor of same-sex marriage to Dred Scott! Both decisions, he claims, have “no connection to the Constitution’s words, historic meaning, or underlying principles.”

As with Ed Meese’s invocation of states’ rights as a cover for his opposition to gay marriage, Franck then goes on to let his own cat out of the bag. “Like Dred Scott, decisions for same-sex marriage rely on a false anthropology ... In Dred Scott it was the false idea that some human beings can own other human beings… In the same-sex marriage rulings it is the false idea that men can marry men, and women can marry women…"

As a sociologist who took his BA and later began his teaching career in departments of sociology-anthropology, I have read quite a bit of anthropology. The first thing that anthropology teaches is that human cultures vary enormously. About sexual orientation, for example, some Native American societies gave what we might call “full faith and credit” to gay men, called “berdache,” now an English-language word meaning “a man who adopts the dress and social roles traditionally assigned to women.”

The second principle that anthropology teaches is the term “ethnocentrism,” and why it’s a bad idea. Franck needs a bit of the humility that this teaching can provide.

Franck to the contrary, I know of no meaningful parallel between slavery and same-sex marriage. No one is forced into same-sex marriage, at least in our society, whereas slavery is all about force. Franck tries to invent a parallel:

Like Dred Scott, same-sex marriage rulings are a harbinger of further depredations, by courts and others, on human freedom in other dimensions. In 1857, it was the freedom to live in a country where slavery was minimized and at least arguably on its way to extinction. Today, it is the freedom to live, work, and learn in communities, schools, universities, and other organizations in which people can live the truth about marriage, for religious or other moral reasons.

How’s that for a forced parallel? In fact, Dred Scott was all about the rights of black folks in the United States – the Supreme Court stating that they didn’t have any. Yet Franck makes it about the right of white folks to live in a society where slavery was minimized! Same-sex marriage is all about the right of gay folks to enjoy the rites and rights of marriage. Yet Franck makes it about the right of straight folks to live in a society where they don’t have to endure the presence of others who are married but in their eyes shouldn’t be.

Finally, Franck ends with the same kind of broad claim that Meese used: “Like Dred Scott, same-sex marriage rulings … amount to a comprehensive threat to republican government.”

C’mon! Republican government will endure gay marriage! You, Mr. Franck, will endure having gay folks married in your own town. Maybe you’ll even come to realize that they should be married, because pair unions are a better way of life than the profligate sexuality that did mark gay culture in some circles before AIDS put a stop to it. Certainly pair unions – gay or straight – are superior from a public health viewpoint!

So why not declare victory – those gay folks finally “got” what we’ve been preaching about safe sex and monogamy – and find some other more plausible reason to worry about the future of republican government?

 Copyright James W. Loewen

The Pentagon's Commemorating the Vietnam War. So Should We.

According to the “DOD Vietnam War Commemoration Program,” the 2008 National Defense Authorization Act authorized the Department of Defense “to conduct a program to commemorate the 50th anniversary of the Vietnam War.” DOD is spending $30,000,000 to do so. Its “activities and ceremonies” are intended to “achieve the following objectives:

To thank and honor veterans of the Vietnam War, including personnel who were held as prisoners of war (POW), or listed as missing in action (MIA), for their service and sacrifice on behalf of the United States and to thank and honor the families of these veterans.

To highlight the service of the Armed Forces during the Vietnam War and the contributions of Federal agencies and governmental and non-governmental organizations that served with, or in support of, the Armed Forces.

To pay tribute to the contributions made on the home front by the people of the United States during the Vietnam War.

To highlight the advances in technology, science, and medicine related to military research conducted during the Vietnam War.

To recognize the contributions and sacrifices made by the allies of the United States during the Vietnam War."

Here are some objectives that these commemorations are not intended to achieve:

● To tell the truth about the war, including the fact that killing civilians was part of our national policy, even though it is a war crime.

● To explain why we fought. (These reasons are no more obvious than the reasons for George W. Bush’s Iraq War.) I believe I listed all the likely possibilities in my paragraph on “why” in Lies My Teacher Told Me:

Some people still argue that the United States fought in Vietnam to secure access to its valuable natural resources. Others still claim that we fought to bring democracy to its people. Perhaps more common are analyses of our internal politics: Democratic Presidents Kennedy and Johnson, having seen how Republicans castigated Truman for "losing" China, did not want to be seen as "losing" Vietnam. Another interpretation offers the domino theory: while we know now that Vietnam's communists are antagonists of China, we didn't then, and some leaders believed that if Vietnam "fell" to the communists, so would Thailand, Malaysia, Indonesia, and the Philippines. Yet another view is that America felt its prestige was on the line, so it did not want a defeat in Vietnam, lest Pax Americana be threatened in Africa, South America, or elsewhere in the world. Some conspiracy theorists go even further and claim that big business fomented the war to help the economy. Other historians take a longer view, arguing that our intervention in Vietnam derives from a cultural pattern of racism and imperialism that began with the first Indian war in Virginia in 1622, continued in the nineteenth century with "Manifest Destiny," and is now winding down in the "American century" of the present. A final view might be that there was no clear cause and certainly no clear purpose, that we blundered into the war because no later administration had the courage to undo our 1946 mistake of opposing a popular independence movement. "The fundamental blunder with respect to Indochina was made after 1945," wrote Secretary of State John Foster Dulles at the time of the Geneva Convention, when "our Government allowed itself to be persuaded" by the French and British "to restore France's colonial position in Indochina."

The historical evidence for some of these explanations is much weaker than for others, but the Department of Defense is unlikely to hold seminars to tease out which seems likeliest.

● To thank and honor those veterans of the Vietnam War who engaged in the extensive anti-war actions that grew and grew during the war, including resisting going out on pointless patrols, fragging gung-ho officers, writing letters home informing families and friends of the useless and even illegal nature of the war, supporting FTA, joining VVAW and other groups upon returning to “the world,” and to thank and honor the families of these veterans for supporting them.

● To pay tribute to the efforts made on the home front by the people of the United States to try to end the Vietnam War, including voting for candidate after candidate who promised to end it, then flip-flopped (Lyndon Johnson, Edward Brooke, Richard Nixon, etc.); composing and singing songs and poems that brought out the hapless nature of the enterprise ("Alice’s Restaurant," "I-Feel-Like-I'm-Fixin'-To-Die Rag," "Give Peace a Chance," "Wichita Vortex Sutra," etc.); dedicating years of their lives to getting out accurate information about the war (Daniel Ellsberg, David Halberstam, A. J. Muste, etc.); committing civil disobedience or refusing to pay taxes to try to stop the war (too many to list); fleeing to Canada or burning draft cards or in other ways refusing induction into the armed forces (Muhammad Ali and others too numerous to list); killing themselves or bombing or invading military installations in protest (Alice Herz, Norman Morrison, the Berrigans, etc.); and participating in all sorts of other efforts.

● To highlight the useless expenditures in money, munitions, and technology related to bombing and other military actions conducted during the Vietnam War.

● To recognize the role of other countries in trying to get the United States to stop its War on Vietnam, including Canada for providing asylum, Sweden for recognizing North Vietnam and hosting the Russell Tribunal, West Germany for allowing student protests, etc.

DOD’s website includes an interactive map showing 1,066 places where it will hold events that “commemorate” the Vietnam War. Here are some places that will not get onto that map:

● The University of Wisconsin’s Sterling Hall, home to the Army Math Center and bombed in 1970 in protest of the university’s connections to the Vietnam War.

● The Catonsville, Maryland, draft board office, where in 1968 Catholic protestors of the war burned draft files.

The "Catonsville Nine" burn draft files, May 17, 1968

● Alexander Hall, the dormitory at Jackson State University in Mississippi, raked by gunfire from law enforcement after anti-war protests in 1970.

● The parking lot memorial at Kent State University in Ohio and nearby areas where the Ohio National Guard killed four and injured nine anti-war protesters.

● Central Park in Manhattan, scene of the largest antiwar protest in U.S. history in 1967 [400,000 marched; the protest in Washington the next year drew fewer].

● Grant Park in Chicago opposite the Hilton Hotel, where antiwar protestors gathered and were clubbed by police during the 1968 Democratic National Convention.

Kent State University Memorial to Jeffrey Miller, Shot by National Guard

This year participants in the 1964 Freedom Summer – along with other people honoring and remembering them – staged marvelous commemorative events at Tougaloo College in Mississippi and Miami University in Ohio, held panels and premiered two new documentary films across the U.S., and thus guaranteed that this crucial series of events was not forgotten or, worse, misremembered.

The Vietnam War and its opposition are harder. Millions of people participated on each side. There is a danger that because so many took part, none will organize to remember. But the armed forces are organized. They have $30,000,000. They will remember. If we are not careful, their interpretation will carry the day.

Perhaps a better model for us, although further distant in time, is the Columbus Quincentenary of 1992. It was supposed to be a grand celebration. President George H. W. Bush appointed a committee to spearhead events in each state. But the government did not hold the stage by itself. Uninvited actors forced their way onto the stage. In DC, demonstrators splashed the huge Columbus sculpture in front of Union Station with red paint, leaving the message "500 years of genocide." In Denver, the American Indian Movement put up a "counter-memorial," consisting of 100 skeletal tepees, burned and scorched, accompanied by 29 official-looking historical markers with texts by Native American leaders. Columbus statues collected red paint and "murderer" graffiti from Boston and Newport, Rhode Island, to Pittsburgh and on to California.

Since the Vietnam War went on for a long time, we shall be commemorating its 50th anniversary until at least 2025, when Saigon “fell” – or was liberated. This gives you time to commemorate the park in your town where the little antiwar protest wound up, the draft counseling office where young people learned how to say no, the church that housed an antiwar service that caused its insurance rate to triple, the rehab clinic that tried to help paraplegics and quadriplegics, the home where a Vietnam veteran took his own life because he could not seem to forge his life anew. The sites of the Vietnam War and its attendant opposition movement are everywhere. Tell their stories! Put up a marker! Organize a panel! Publish an op-ed!

“If we do not speak of it,” as George Swiers, a Vietnam War veteran, put it years ago, “others will surely rewrite the script. Each of the body bags, all of the mass graves will be reopened and their contents abracadabraed into a noble cause.”1

1 Quoted in William Appleman Williams, et al., eds., America in Vietnam (New York: Norton, 1989), ix.

Copyright James W. Loewen

Here We Go Again: Tests for the Common Core May Be Unfair to Some and Boring To All          


           On October 16, 2014, the Center for American Progress (CAP), a "liberal" think-tank in Washington, D.C., presented a panel, "The Need for Better, Fairer, Fewer Tests." It starred Jeffrey Neilhaus, Chief of Assessment at PARCC, Partnership for Assessment of Readiness for College and Careers, Inc.

            In case you're wondering what PARCC does, these folks seem to have an inside track toward developing the testing that will undergird America's new Common Core Standards for K-12 education.

            Unfortunately, multiple-choice tests are often biased against African Americans, Native Americans, and Mexican Americans, and sometimes against females. My (extensive) experience studying the SAT and other "standardized" tests[1] has shown that items that favor African Americans systematically get dropped. Increasingly, so do items that favor Mexican Americans and also items that favor girls on math tests. This is not because racists or sexists at ETS, Pearson, etc., want to deny competence or college to black folks or women or whomever. Rather, it results from the standard statistical tests psychometricians apply to items, notably the point-biserial correlation coefficient (known in sociology as the item-to-scale correlation coefficient).

            Let me explain "point-biserial correlation coefficient." Testmakers seek to measure or predict student performance. For example, ETS wants the SAT to correlate positively and strongly with first-semester college GPA. Researchers at ETS are always developing new items. What they should do is see if getting those items right correlates with higher first-semester college grades. ETS has relationships with over a hundred institutions of higher learning and could obtain from them students' GPAs. Then they could match those GPAs with performance on each new item. But doing so would be expensive and would take most of a year. Instead, ETS researchers argue that an item "behaves" statistically if it correlates with students' overall ability. That claim would make sense, except for their unfortunate choice of measure: "overall ability" is students' scores on the test as a whole. (This is why sociologists call this the "item-to-scale correlation coefficient.")
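
            To make the mechanics concrete, here is a minimal sketch, in Python, of the item-to-scale (point-biserial) calculation described above. The numbers are invented for illustration, and the function name point_biserial is mine, not anything ETS or PARCC actually uses; operational item analysis is more elaborate, but the core statistic is simply a Pearson correlation between a 0/1 item column and examinees' total scores.

```python
import numpy as np

def point_biserial(item_correct, total_score):
    """Item-to-scale (point-biserial) correlation: the Pearson r between a
    0/1 vector of item responses and examinees' total test scores."""
    item_correct = np.asarray(item_correct, dtype=float)
    total_score = np.asarray(total_score, dtype=float)
    return np.corrcoef(item_correct, total_score)[0, 1]

# Hypothetical data: ten examinees, one trial item (1 = correct), and their
# total scores on the rest of the test. All numbers are made up.
item = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0])
totals = np.array([38, 42, 35, 55, 60, 40, 58, 62, 37, 57])

r = point_biserial(item, totals)
print(f"point-biserial r = {r:.2f}")
# Here r is strongly negative: the examinees who answered the item correctly
# scored lower overall, so screening on this statistic would drop the item.
```

            An item whose r comes out negative, as in this made-up example, is exactly the kind of item that, as explained below, can never survive the screening.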

            Why is that measure of overall ability problematic? Well, consider an item that uses the word "environment." ETS staffers showed decades ago that when white students hear the word "environment," most think first of the natural environment -- ecology, pollution, the balance of nature, and the like. Nothing wrong with that -- everyone knows that meaning of "environment." But when African American students hear the word, most think first of the social environment -- "what kind of environment does that child come from?" Again, nothing wrong with that -- everyone knows that meaning of "environment." In the pressure cooker conditions under which students take "standardized" tests, however, the first meaning that flashes into their minds when they encounter a word often influences whether they pick a "distracter" (wrong answer) or get the item right.

            It follows as night follows day that a potential item based upon the second meaning of "environment" could never make it to the SAT. Perhaps 75% of African Americans would get it right, but only 65% of European Americans. In addition, working-class and lower-class whites and Latinos living in inner-city neighborhoods would also be more likely to get it right, since they live in a majority-black subculture. Unfortunately, all of these students typically score well below most suburban whites. Therefore the item would fail the point-biserial correlation coefficient test. The people getting it right would have scored lower on the overall test than the people getting it wrong.[2] The "wrong people" would have gotten it right. In statistical terms, the item would have a negative point-biserial correlation coefficient. No one at ETS wanted nonwhites to score lower. No intentionality was involved. The process is purely statistical. Nevertheless, the result is: even though, within each group, the item may separate those with more ability from those with less, across all groups, the item "misbehaved" statistically. No such item — on which people who score badly overall do better than rich white males — can ever appear on the final exam.

            Like most other "standardized" tests given widely in the U.S., the SAT was originally validated on affluent white students. Affluent white students have always done better on it than have African Americans, Hispanics, Native Americans, Filipino Americans, or working-class whites. It follows that using point-biserial correlations increases test bias. Because new test items must correlate with old ones, items that favor blacks, Latinos, or poor people cannot pass this hurdle. Neither can items that favor girls on the math test.

            Knowing this, I asked the presenters at CAP how they were dealing with the possibility of biased items. In response, Neilhaus said that PARCC was doing two things. First, they subjected each item to scrutiny, to avoid language that might upset any race, ethnic group, gender, etc. Second, they subjected each item to DIF, differential item functioning, a statistical test developed by ETS, the company that puts out the SAT.

            Unfortunately, neither of these techniques has much to do with reducing bias.

            To be sure, we don't want to use tests with language or content that would upset any group. But most biased items have no such language or content. That's not a major problem.

            DIF is even less relevant to the issue of bias. DIF is a statistical technique that flags an equal number of outliers in each direction. Twenty years ago, I got staffers at ETS to admit that DIF is not a bias-reduction technique. It cannot be. Typically it brings to testmakers' attention as many items that favor blacks as those favoring whites. Indeed, since the number of items that favor African Americans is likely to be fewer than the number that favor whites, the test may flag more pro-black items than pro-white items, because the pro-black items will stand out as more distant from the mean.

            Moreover, an item on which whites outscore blacks can still get flagged as favoring blacks, simply because whites don't outscore blacks by a great enough margin! Thus if African Americans typically do 7% worse than whites on an item, and if a 6% deviation from that typical gap is the cutoff that triggers DIF, then items on which blacks do "only" 1% worse than whites will get flagged for scrutiny. To be sure, so will items on which whites do 13% better than blacks. If both outliers get dropped, the mean difference, here 7%, remains unchanged. If bias explains part of that difference, the testers will neither notice nor remove it. DIF sets aside the mean difference before it looks at individual items. Therefore DIF is irrelevant to the mean difference.
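
            The arithmetic is easy to check. Below is a toy sketch in Python with invented numbers; real DIF procedures match examinees on overall score and use more elaborate statistics, so this captures only the flavor of the argument, not ETS's actual method.

```python
# Invented white-minus-black percentage-point gaps on ten items.
gaps = [7, 8, 6, 7, 9, 5, 7, 1, 13, 7]

mean_gap = sum(gaps) / len(gaps)   # 7.0 points: the typical gap
cutoff = 6                         # flag items whose gap deviates from the mean by 6+ points

flagged = [g for g in gaps if abs(g - mean_gap) >= cutoff]
kept    = [g for g in gaps if abs(g - mean_gap) < cutoff]

print("mean gap before dropping flagged items:", mean_gap)        # 7.0
print("flagged items (their gaps):", flagged)                     # [1, 13]
print("mean gap after dropping them:", sum(kept) / len(kept))     # still 7.0
```

            Dropping the two symmetric outliers leaves the overall 7-point gap exactly where it was, which is why flagging outliers cannot, by itself, reduce whatever bias lives inside that gap.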

            PARCC does not propose to look at items on which blacks do much worse than whites (or Hispanics much worse than Anglos, or women much worse than men on math). PARCC only proposes to use DIF. Therefore, PARCC has no bias reduction procedure in place.

            Even worse than that criticism, however, is the simple fact that their tests apparently will be mostly multiple-choice. There is no excuse for this. Rhode Island and Vermont showed years ago that states can test and qualify students for promotion or graduation using a portfolio approach that demands various competencies of them. Students might be asked to give a persuasive talk, for example. They might have to write a ten-page paper, making a coherent argument and backing it up. They might have to do a chemistry lab project or use math to interpret a table.

            I asked Neilhaus if they had considered using portfolios. He replied that portfolios are not feasible, partly because of problems comparing across different states. However, portfolios have crucial advantages over "standardized" tests. First, they are fun for students to assemble. No one has ever accused "standardized" tests of being fun. Second, portfolios give a real picture of students' strengths and weaknesses. Third, they give students something useful and meaningful to do, to improve. When a student gives a disorganized talk, it's apparent to all, and organizing a talk is a skill worth acquiring. When a student computes the percentages in a table wrong, it's obvious upon reflection; again, doing it right is a skill worth acquiring.

            When a student scores low on a "standardized" test, on the other hand, usually no meaningful diagnostic comes back. Does the student read slowly? By itself, that can decrease test scores dramatically on such tests, but the test does not even tell the student that s/he reads slowly, let alone offer meaningful remediation. Moreover, the ability to choose "B" from among five alternatives quickly is not a skill of any use in later life. Perhaps the most effective way to improve one's score on an SAT-type exam is by taking a coaching class, particularly the Princeton Review, but the "skills" it teaches are not only not useful after the test, they are anti-intellectual. And of course they are also expensive.

            The key drawback to portfolios and the other alternative forms of examining student performance? It's harder for companies to monetize them.

            Comparing portfolios across different states isn't crucial, but if it became an issue, it would be solvable. After all, ETS hires capable people — including very good HS history teachers — to read and grade the essays that students write responding to the DBQ (Document-Based Question) on the APUSH (Advanced Placement U.S. History) exam. Yes, it's labor intensive, but it can be done, and on a large scale. ETS is, after all, a massive test-producer.

            If the Common Core is really going to have better, fairer, fewer tests, it needs to move away from overreliance on multiple-choice items, a.k.a. "standardized." Portfolios would be one solution. No Child Left Behind did not mandate multiple-choice tests, as Vermont and Rhode Island showed. Even less does the Common Core mandate such tests.

            For the record, students whose parents did not graduate from college, immigrant children, young people of color, etc., will find it harder to assemble a powerful portfolio, compared to affluent white students. The upper class will always find ways to advantage its children. They're even supposed to: it's a rule that parents should do what they can for their kids, and rich folks can do more. But at least portfolios and some other forms of assessment are less biased than multiple-choice tests. Moreover, since they test abilities that are useful in the workplace and in college, they possess an intrinsic fairness that multiple-choice tests lack.

            On March 9, 2014, the conservative columnist Kathleen Parker wrote "Simplifying the SAT." She bemoaned the loss of the analogy items back in 2005. Analogy items were particularly subject to bias. Hence I was glad to see them go, even though, like Parker, I personally enjoy them. Parker wrote, in passing, "Critics of the SAT maintain that the test is biased in favor of students from wealthier families. We all want a level playing field and equal opportunity for children. This is fundamental to who we are."

            This is an example of wishful thinking. "We all" do not want this, or we would have it! The affluent and influential want tests that advantage their children, and they have them.

            Here is just one of the many ways that affluent and influential families gain unfair advantage on "standardized" tests. They "game" the testing rules. Although rich and powerful families do not want their children in special education (except "gifted and talented" programs), they do want their children to get dispensations for special testing regimens from ETS. Across, say, the Los Angeles metropolitan area, these requests — each bolstered by a statement from a doctor or psychologist paid by the family — vary inversely with the prevalence of students assigned to special education. These requests also correlate strongly with median family income. And they correlate with the availability of coaching classes for the SAT.

            Based on the presentation at the Center for American Progress, the Common Core as tested by the Partnership for Assessment of Readiness for Colleges and Careers may be no fairer to "have-not" students than were the old multiple-choice tests that "No Child Left Behind" spiraled down into using in most states. No fairer than the SAT. Not fair at all. Not fun, either. No fun at all.

    [1]Since "standardized" implies much more to the lay reader than legitimately intended by the psychometrician, this word needs to be encased in quotation marks. Otherwise, many readers will infer that tests are somehow made "standard" or fair across population groups; just the opposite is usually the case.

    [2]See Loewen, "Statement," in Eileen Rudert, ed., The Validity of Testing in Education and Employment (DC:  U.S. Commission on Civil Rights, 1993), 42-43; cf. Loewen, "A Sociological View of Aptitude Tests," ibid., 73-89.

Copyright James W. Loewen
How Two Historians Responded to Racism in Mississippi

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

In 1963, I was a junior at Carleton College in Minnesota. My classmates who majored in French were spending their junior year in France. I was in sociology and had never lived outside the Midwest. "How is that competent?" I asked myself. I did not think it was competent. So I decided to spend part of my "junior year abroad" in Mississippi.

I went to Mississippi State University for the winter quarter, January through March. While there, I audited courses, talked with community leaders in Starkville and Clarksdale, and spent several days as an "exchange student" at Tougaloo College near Jackson and Tuskegee Institute in Alabama, historically black institutions. (I put "exchange student" in quotation marks because no one went the other way, of course. Mississippi State University was then the largest segregated all-white institution of higher learning outside South Africa.) I also got to know two members of Mississippi State's Department of History.

It was (and remains) the largest history department in the state. It was the only department that granted the doctorate. In U.S. history its faculty included two historians of regional renown, John K. Bettersworth and Glover Moore. They were contemporaries: Bettersworth was born in 1909, Moore two years later. For four decades they overlapped at Mississippi State. Each became president of the Mississippi Historical Society. Its website calls Moore "the Mississippi State University professor who from 1936 to 1977 served as mentor and guide to countless students and historians." It calls Bettersworth "the distinguished historian and author who served as professor and administrator at Mississippi State University for almost forty years."

The historical society still honors both of them: its John K. Bettersworth Award goes annually to an outstanding teacher of middle school or high school history, while its Glover Moore Prize goes biennially to the best M.A. thesis. Both prizes carry a monetary award of $300, but the two historians are hardly of equal merit, at least in my eyes.1

Beneath their external similarities, the two men differed in their core values. John K. Bettersworth wrote probably the most important book that children across Mississippi read in high school. Variously titled Mississippi: A History; Mississippi: Yesterday and Today; and finally Your Mississippi, it was the required textbook for the required ninth-grade course, "Mississippi History." It stayed in print from 1959 to at least 1986.

Robert Coles, famed psychiatrist and author of the best-selling Children of Crisis series, conveys the impact of these books upon black Mississippians through the example of a family he "came to know in 1964," when he "worked in Mississippi as a member of the civil rights Summer Project." He quotes the mother, speaking of her children:

They don't get much out of school. Our people aren't supposed to take their education too seriously. We're supposed to do all the dirty work, and the white man is supposed to do the learning. I've seen the books they give our children in school. Our own people--Negro teachers--use those books. One of them was fired for telling the kids that the books are no good.

Her daughter told Coles about the impact that the Freedom Schools, an important part of the Mississippi Civil Rights Movement, had on her:

If it hadn't been for the civil rights people coming here, maybe I wouldn't have worried; maybe I would have swallowed what those people want us to swallow. But why should we go to school and read books that tell us that racists like Ross Barnett and Bilbo were nice men and that the blacks and whites always got along real nice, except for the Yankees who came down here?

Indeed, Bettersworth wrote exactly that. He began his chapter on Reconstruction, which he titled "The Road to 'Redemption,'" with this summary:

The war was over, but the fighting was not; reconstruction [sic] was a worse battle than the war had ever been. Slavery was gone, but the problem of the free former slaves was not. To make matters worse, political adventurers from the North came in to make their fortunes off the troubled state. The struggle of the state to free itself in the war was hardly more difficult than the struggle to free itself during reconstruction. Yet by 1875 the old political order had returned, and white and black people set about the task of getting along together in the "New South" as they had in the "Old." ...2

Parse this paragraph sentence by sentence, and the first one wins a Pinocchio award. Reconstruction was hardly "a worse battle" than the Civil War. To be sure, white supremacists murdered more than 1,000 black and white Republicans in Mississippi during Reconstruction, but 20,000 men died in the campaign for Vicksburg alone. The only way that Reconstruction was "worse" than the Civil War was that it challenged white supremacy even more directly than did the role that United States Colored Troops played during the war.

Historian John K. Bettersworth as he looked in about 1970. 

 Photo courtesy of the Mississippi State University Libraries.

Indeed, this is what Bettersworth means, as his next sentence makes clear. It's quite a sentence: "Slavery was gone, but the problem of the free former slaves was not." If only they — the black population of the state, then slightly larger than the white population — had left! Then we'd have no problem! Robert E. Lee expressed similar sentiments, testifying in 1866 before the Joint Congressional Committee on Reconstruction. "I think it would be better for Virginia if she could get rid of them."3 What is a black high school student to make of Bettersworth's sentence? For that matter, what is a white student to do with it? It implies that the state would be better off after total racial cleansing!

The sentence should have read: "Slavery was gone, but slavery's handmaiden — the ideology of white supremacy — was not." Such a sentence would have provided black and white students with more accurate history and with a tool for understanding the past's relevance to the present.

Bettersworth goes on to invoke the stereotypical "carpetbagger" image: "To make matters worse, political adventurers from the North came in to make their fortunes off the troubled state." As if "adventurers" would seek riches from what had already become, as it remains, the poorest state in the United States! A more typical "carpetbagger" was the white female schoolteacher from New England, come into the state to help teach black children — and adults — to read.

Every sentence in the paragraph — indeed, almost every sentence in his book — merits critique. However, the final sentence is accurate, if understood ironically. By 1876 (Bettersworth misdates), "the old political order" had indeed returned. From then until the Civil Rights Movement began to cause improvement, whites and blacks indeed got along "as they had" in the Old South — that is, hierarchically, with whites on top, blacks on the bottom.

Bettersworth's own white supremacy distorted not only his treatment of Reconstruction. Throughout, his book simply omits African Americans whenever they did anything notable. Among its sixty images of people, for instance, just two included African Americans, and both were "Old South" images: a drawing of a white mistress reading from the Bible to a group of slaves and a painting of cotton pickers by a white artist.4

As a historian, Bettersworth had to know better about Reconstruction. Anyone who ever read primary sources in Mississippi history knew better. My own eyes were opened when I read Vicksburg newspapers published in the 1870s. One Democratic newspaper in 1871 or '72 noted that African American men were going to be voting from then on. (In the nineteenth century, the Democratic Party was the party of overt white supremacy and even called itself "the White Man's Party" into the 1920s.) Its editor said he was therefore becoming a Republican, because that was the surest way to recapture political influence. He thought he could lead black public opinion, and perhaps he could have, in time. However, events of 1875 and '76 persuaded him to flip-flop again. Violent attacks on black Republican voters and candidates convinced him that African Americans might not be voting permanently. So he became a Democrat again and urged all whites to do the same.

White Democrats opposed Reconstruction not because it was a failure, but because it was working. Today almost all historians of Reconstruction hold that view. Even in 1959, when Bettersworth wrote the first edition of his textbook, the standard secondary source on Reconstruction, Vernon Wharton's The Negro in Mississippi, 1865-1890, was already twelve years old. It tells a very different story about Reconstruction than Bettersworth does.

In 1971, Bettersworth showed that he too shared today's consensus view of Reconstruction. In the New York Times, he reviewed three books about Reconstruction written for young readers.5 He set up his review by noting, "the racial crisis of our times has found the black man still fighting for a freedom he once fleetingly enjoyed until the politics of Southern recalcitrance and Northern hypocrisy conspired to nullify it." This review made clear that Bettersworth knew that the interracial Republican coalition that governed Mississippi during Reconstruction had done a good job under difficult circumstances. Discussing the one book of the three that specifically treated his state, Milton Meltzer's exemplary Freedom Comes to Mississippi, Bettersworth wrote, "A century ago, freedom came to the black man, who experienced it for a few years — until the political bargain of 1877 ... left the whole business to be done over again a century later as the Second Reconstruction."

He approves of Meltzer's work, pointing out that his book is "relevant" today, "as Reconstruction was, and still is."

Logically, Bettersworth had to know that what he wrote about Reconstruction (and the rest of Mississippi history) in his textbook likewise made a difference in the present. He could not have believed that his textbook was only an innocent way to make a few thousand dollars without hurting anyone. At Mississippi State, he encountered the fruits of his labor in every class that he taught. He had to have known how appallingly racist some of his students could be. He had merely to talk with his own undergraduates, which I did when I was one. My best friend (!) at Mississippi State, for example, told me with pride what he had done when the black driver in front of him stopped suddenly for a red light in downtown Greenwood, and my friend didn't stop fast enough. He jumped out of his car, surveyed the damage he had caused, then turned to the black driver and said, "Nigger! Why'd you back up?!" Knowing the mores of Mississippi, the African American dared not contradict a white man, so he drove off without getting the information needed to file an insurance claim against the driver who had hit him. Bettersworth's state history textbook provided the intellectual justification for such outrageous acts of white supremacy. Most white students believed what his textbook said about the inferiority of African Americans. How were they to learn better? Southern society was structured to prevent them ever from having an equal interaction with a person of another race.

To be sure, in his textbooks Bettersworth "merely" wrote what the white power structure of Mississippi wanted all Mississippi youth to read. Ross Barnett, the segregationist governor who tried to keep James Meredith out of the University of Mississippi in 1962, and whose inflammatory rhetoric helped spark the white race riot that followed, told the Textbook Board in that year, "There is nothing so important as the molding of the hearts and minds of our young people." Toward this end, after the U.S. Supreme Court in 1954 required school desegregation, Mississippi had responded with a package of measures designed to thwart the Court and maintain "our Southern way of life." Every Mississippian knew that meant segregation and white supremacy. One new law passed in 1956 required all fifth- and ninth-graders to take a year-long course in "Mississippi History." That law created the market for the textbook that Bettersworth supplied, and I mean that sentence ideologically as well as economically. We the people have to believe that what we do is right. Bettersworth supplied the history that enabled white Mississippians to justify their racist acts. White children who learned that Reconstruction was "worse ... than the war had ever been" because blacks had the vote could be counted on to oppose black voting now. Thus Bettersworth’s textbooks bolstered the thinking of Ross Barnett and even Byron de la Beckwith, murderer of civil rights leader Medgar Evers. Conversely, black children who learned that same message would hardly be able to throw themselves wholeheartedly into the struggle for civil rights in the present.

Is it too harsh to imply that John Bettersworth had blood on his hands because of the murder of Medgar Evers? History is important. What historians say to thousands of children makes a difference.

Could Bettersworth have done otherwise? Could he have written an accurate history of the state? He later implied that he could not: "The times determined what textbooks would be published."6 But this is self-serving. In 1959, John K. Bettersworth was perhaps the most eminent historian in the state. Certainly he was the most eminent historian at Mississippi State, which had the most eminent history department in the state. Had he written an honest history of Mississippi, possibly some right-wing historian at Ole Miss or a junior college might have written a competitor, but Mississippi typically adopted three to five books in each subject. It is not conceivable that the State Textbook Board would have chosen only the competitor and rejected a book by the dean (literally!) of Mississippi historians. No, he determined what textbook would be published.

I don’t know what Dr. Bettersworth thought as he wrote and revised his textbooks. We can surmise that he knew that Ross Barnett surely did not read the New York Times, which was not then available within the state except by mail. Conversely, he knew that the OAH and AHA did not (and still do not) review high school U.S. history textbooks, let alone state histories. Thus his reputation as a white supremacist in Mississippi would not be undermined by his book review, while his reputation as a historian would not be sullied by his racist and unprofessional textbook.7

I also know that in the early 1970s, he wrote the text for Shrines To Tomorrow: A Photographic Study of More than 100 Historical Churches in Mississippi. He and the photographer, Bob Moulder, chose the churches, Bettersworth stated, for their historical and architectural significance. The first photo was an evocative image of a small Choctaw Indian Protestant church in rural Neshoba County. The Port Gibson Presbyterian Church, its steeple crowned not by a cross but by a hand dramatically pointing heavenward, won a prominent spot. Bettersworth and Moulder also included practically every First Baptist, First Presbyterian, and Episcopal church in every large town in the state — but not a single black church.

This was inexcusable. Everyone in the state knew that the one institution that African Americans built and took pride in above all others was their church. Entire black denominations were born in Mississippi, including the enormous Church of God in Christ, which had perhaps five million members around the world when Bettersworth wrote. Nevertheless, he did not see fit to include its original building or the chapel at its junior college in Lexington. Architecturally, the interesting black tradition of the twin steeples in front went unexamined because it went unrepresented.

The chapel at Mississippi State University 

Photo courtesy of the Mississippi State University Libraries.

 Bettersworth included the chapel at Mississippi State, but not Woodworth Chapel at Tougaloo. The only historic or architectural distinction of State's chapel was that it had been faced with bricks recycled from Old Main, a campus building that had burned. 

 Woodworth Chapel, Tougaloo College

Photo courtesy of Tougaloo College.

Woodworth Chapel, on the other hand, was the scene of the Social Science Forum, organized by Dr. Ernst Borinski, professor of sociology (1946-83) and a subject of the museum exhibit "From Swastika to Jim Crow" now touring the United States. Guest speakers ranging from the Communist historian Herbert Aptheker to novelist Ralph Ellison, and from economist and ambassador John Kenneth Galbraith to local civil rights leader Fannie Lou Hamer spoke there. It also housed historic meetings of the Mississippi Civil Rights Movement. Bettersworth even left out the Chinese Baptist Church in Cleveland, with its unique history of providing both a spiritual home and a route for racial upward mobility for this important minority in the Delta. Obviously, to win Dr. Bettersworth's attention, a church did not have to be important, but it did have to be white. And in this enterprise, Bettersworth wrote to please himself. No State Textbook Board loomed over his shoulder.

Although Bettersworth probably never talked with one, his textbooks particularly disadvantaged black students. I found this out when I taught first-year students at Tougaloo College in Mississippi on the first day of the spring semester in January 1969. I asked them, "What was Reconstruction? What images come to your mind about that era?" Sixteen of my seventeen students told me, "Reconstruction was that time, right after the Civil War, when African Americans took over the governing of the Southern states, including Mississippi, but they were too soon out of slavery, so they messed up, and reigned corruptly, and whites had to take back control."

This was straight Bettersworth. In reality, African Americans never took over the Southern states. All Southern states had white governors and all but one had white legislative majorities throughout Reconstruction. Moreover, the Reconstruction governments did not "mess up." Mississippi in particular enjoyed less corrupt government during Reconstruction than at any later point in the century. Across the South, governments during Reconstruction passed the best state constitutions the Southern states have ever had, including their current ones. They started public school systems for both races. The Reconstruction governments tried out various other ideas, some of which proved quite popular.

Unfortunately, my Tougaloo students were good students. They had learned what they had been taught, in all-black high schools with all-black teaching staffs who blindly taught what was in the textbook. Robert Coles's interviewee had told him what happened to black teachers who let slip "that the books are no good."

What must it do to my students, I wondered on that January afternoon, to believe that they were "too soon out of slavery?" That the one time their group stood center-stage in the American past, they "messed up?" It couldn't be good for them.

Glover Moore was very different. His course, "Mississippi and the South," was the one course I audited faithfully during my months at Mississippi State, except when I left to visit other campuses. He had a unique lecturing style. He dictated his notes to the class: "Then in 1866, Andrew Johnson took his case to the people. Then in 1866, Andrew Johnson took his case to the people." I was appalled. Well before 1492, I learned, the Sorbonne had outlawed spending class time doing this sort of thing. Mississippi State students wrote it all down verbatim, without protest. Then Prof. Moore would look up from his notes, smile, and proceed to share a fascinating anecdote from the time that made his staid dictation come alive. All over the room, I could hear clicks as students put down their pens to listen. That's when I picked mine up! Often the source was memorable and important and I took notes as rapidly as I could.

I also asked questions, so I came to Moore's attention as a student who might be worth getting to know. He invited me to a little group, mostly history majors and graduate students but also others, that met occasionally in the late afternoon. In class, Moore's own viewpoint never surfaced. When he presented the words or deeds of Nathan Bedford Forrest, first national leader of the Ku Klux Klan, he sounded like a Klansman. When he told of a leader of the Republican Party during Reconstruction, he sounded like a Republican. He could never have got in trouble for being unsound on segregation, as historian James W. Silver did at Ole Miss, because he was inscrutable. Still, his offering of original sources did give ammunition to any student interested in challenging his/her own prior education. In his afternoon discussion group, Moore was at least interested in my viewpoint — and that of my Carleton peers — on the unfolding events of the Civil Rights Movement, such as the enrollment of James Meredith at Ole Miss the previous fall. Unlike Jim Silver, Moore did not have the benefit of Meredith's extraordinary courage, which freed white professors at Ole Miss to talk about race much more honestly than was yet possible at Mississippi State.

Not until after I went to Tougaloo College to teach in the fall of 1968 did I learn Glover Moore's true colors. Ironically, it turned out that he too wrote a textbook while teaching at Mississippi State — a very different textbook. In 1969-70 he sent me a copy. Not only did he write it, he published it himself. He titled it Afro-American History. He intended it as a corrective to the usual all-white textbook, including those written by his colleague, John Bettersworth. Moore produced maybe a hundred copies — 8½" x 11", 300 pages long, and amateurishly bound. He provided them free, I think, as a resource for Freedom Schools and teachers who wanted to do better in Mississippi's newly integrated public schools. I skimmed it but did not read it — my impression was that it broke no new ground — but I was moved that he had written it, unbidden. I hope he got it to some people who used it, at least in northeast Mississippi.8

Since I left the campus in March, 1963, my life's path has taken me back to Mississippi State University several times, but always to sociology, not history. Balkanized as are most campuses, that meant that I never saw John Bettersworth or Glover Moore again. I regret that I never told Glover Moore what I thought of him and his work. For that matter, I regret that I never told John Bettersworth what I thought of him and his work. For what it's worth, for whatever you make of it, now I have told you.

As I thought about these men, I found useful a poetic portrait of one "Richard Bone," a gravestone cutter. You might too. It is by the poet Edgar Lee Masters, from his famous Spoon River Anthology, written about the time that these two historians were born. It ends:

[L]ater, as I lived among the people here,

I knew how near to the life

Were the epitaphs that were ordered for them when they died.

But still I chiseled whatever they paid me to chisel

And made myself party to the false chronicles

Of the stones,

Even as the historian does who writes

Without knowing the truth,

Or because he is influenced to hide it.9

1    I should emphasize that I express only my own view here. This essay is a memoir of my experience with these two historians and an assessment of their impact within the state of Mississippi. Thus I do not review their books intended for a national audience.

2    John K. Bettersworth, Mississippi: Yesterday and Today (Austin: Steck Vaughn, 1964), 222.

3    Robert E. Lee, "Testimony of Lee before the Congressional Joint Committee on Reconstruction," 2/17/1866, Report of the Joint Committee on Reconstruction at the First Session Thirty-Ninth Congress (Washington: Government Printing Office, 1866), 135-36.

4    In addition, an illustration of boys on the deck of a steamboat may include a black boy dancing a jig; since he is just 3/8" tall, we cannot be sure of his race. Bettersworth’s next edition included two photographs of black Mississippians: head-and-shoulder portraits of politician Charles Evers and opera star Leontyne Price.

5    Bettersworth, "After the War Was Over," New York Times, 7/25/1971.

6 Andy Kanengiser, “Outdated Textbooks Give State’s Schools Bad Image,” Jackson Clarion-Ledger, 11/23/1986. According to “Guardians of Historical Knowledge: Textbook Politics, Conservative Activism, and School Reform in Mississippi, 1928-1982,” a Ph.D. dissertation by Kevin B. Johnson (Starkville: MS State, 2014), the state was well mobilized to ensure white supremacist textbooks.

7 This is no longer true. The lead article in The Journal of Mississippi History, 77 #1 (Spring 2010), “The Three R’s – Reading, ‘Riting, and Race: The Evolution of Race in Mississippi History Textbooks,” by Rebecca Miller Davis, 14, 16, 21, 36, and 40, takes Bettersworth’s textbooks to task for providing a pro-owner account of slavery, “the Confederate myth of Reconstruction,” a “vague and scanty” account of the struggle for civil rights, and in general “mythologized history.” About 35 years earlier, Robert B. Moore did so as well: Bettersworth’s book “overtly and covertly reinforces white chauvinism and racism through omission, distortion, and falsification of reality…. Bettersworth repeats the most outdated and discredited myths about Reconstruction, ignoring modern scholarship.” See Robert B. Moore, “Two History Texts: A Study in Contrast” (NYC: Racism and Sexism Resource Center for Educators, 1976), 3, 6. Of course, Moore published in New York City; Davis published in Jackson, Mississippi.

8 Unlike Bettersworth, who did not seem to have advised any doctoral candidates at Mississippi State, Moore advised at least ten. Their dissertations show a suggestive broadening in topics beginning about 1974. Earlier dissertations did not treat race, except one general history of the Mississippi Choctaws. After 1974 his students took on more controversial topics like "Hodding Carter: Southern Liberal" and "The 'Loyalist Democrats' of Mississippi: Challenge to a White Majority."

9 Edgar Lee Masters, Spoon River Anthology (Clayton, DE: Prestwick House, 2007 [1919]), 124.

Copyright James W. Loewen
Windshield Pockmarks and Police Behavior

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

  In the spring of 1954, beginning in Seattle, motorists started noticing mysterious pockmarks in their windshields. Officials blamed nuclear fallout, volcanic ash, extraterrestrials, even seagull shit. Soon motorists in California, Illinois, even Ohio noticed that their windshields, too, were suddenly sprouting pits.

            Of course, the answer was that windshields had always been picking up pockmarks. The main cause was tiny bits of gravel, picked up by the turbulence of vehicles ahead of us, then striking our windshield. The difference was not in our windshields but in ourselves. In the aftermath of the Seattle headlines, car owners and journalists were now looking at their windshields instead of merely looking through them.

            On August 9, 2014, sixty years later, African Americans in Ferguson, Missouri, noted that a police officer there had shot a black suspect to death, even though the man, Michael Brown, was at a considerable distance from him, unarmed, perhaps with his hands up, when the last and fatal fusillade was fired. They protested.

            Since then, a spate of killings by police of unarmed African Americans in other places generated headlines across the nation:

             — Eric Garner on a Staten Island sidewalk;

             — John Crawford in a Walmart toy department in Beavercreek, Ohio;

             — 12-year-old Tamir Rice at a Cleveland park pavilion;

             — and others.

            Of course, it turns out that, like windshields collecting pockmarks, police have been killing unarmed black men all along. During the previous decade, just in the vicinity of Ferguson, police in St. Louis and St. Louis County had shot at least forty men to death, mostly black, mostly unarmed. For that matter, Garner had been killed three weeks before Michael Brown, and Crawford shot four days before.

            The difference was not in new police behavior, but in ourselves. In the aftermath of the Ferguson protests, journalists and the public looked at police behavior instead of merely looking through police eyes at the alleged behavior of the deceased.

            The analogy between windshield pits and police killings can go only so far. When people looked at their windshields, they saw the pits. The pits were (and are) real. But the pits did not result from ominous forces, terrestrial or not. Ultimately, the pits were benign and unimportant. We have no reason to continue to scrutinize our windshields.

            The police killings were (and are) real, too. But they are neither benign nor unimportant. To some degree, they do result from ominous forces. Hence we as a people, our journalists and media people as our eyes and ears, and our governments and legal systems have good reason to continue to scrutinize our police. 

Copyright James W. Loewen
How 20th Century: Going to the Library to Get a Book

The Library of Congress

Sociologist James W. Loewen is the author of "Lies My Teacher Told Me."

This morning, even though it was Presidents' Day, I walked to the library and checked out a book. As I walked home, I considered that this was a twentieth-century errand, an action already historic. Hence I shall record it here, for future historians to ponder.

            A friend of mine, the famed radical librarian Sanford Berman, had mailed me an article about Claudia Rankine, who had spoken recently in Minneapolis. Yes, Sandy mailed me the article, in the form of a "Xerox" or photocopy. Sandy does not "do" email. For that matter, Xerox no longer does photocopies much. Its website exhorts, "Go Beyond Print: Your Customers Communication Needs Are Evolving," and it is in charge of, inter alia, the Smart Passes that turnpikes use. So everything about this story smacks of the twentieth century.

            The article described Rankine's book, Citizen: An American Lyric, in ways that made me want to read it. In vain, I searched for it online at my friendly neighborhood library, the Library of Congress; either they don't have it or they have not yet processed it. Catholic University, even closer, did have it, so I put on my coat and walked over to Mullen Library. I shall share some of the experiences I had.

            First, it was cold out, 12°F. I had to dress for it. Dressing for the weather is another twentieth-century custom. Several years ago a time-and-motion study showed that Americans average less than two minutes outdoors in a typical work day (not including time in their cars, which is not outdoors). If you imagine for a minute the typical suburbanite, getting in her car, already at 60°F in its attached garage, poking the remote opener to her garage door, driving off to the underground garage attached to her office building, then retracing her way home, you realize that two minutes may be an overstatement.

            As I walked — the only pedestrian — along the sidewalk, I noted, and not for the first time, an accident waiting to happen. Pepco, DC's desultory public power company, has a pole near the curb, held upright by two naked guy wires that cross the sidewalk diagonally, en route to the ground. At the right edge of the sidewalk, the lower wire is less than seven feet from the ground; a foot to the right of the sidewalk, it is less than five feet.

            Probably this is illegal; certainly it is dangerous. At night, the wires disappear from view. It is only a matter of time before someone, walking or running at the right edge of the sidewalk, clips the lower wire and is injured. Even more problematic, cyclists use the sidewalk here, because Michigan is such a busy street and has no bike lane or even shared-lane markings painted on the pavement. A tall cyclist may be killed if his face or neck hits that guy wire. Other guy wires sport bright yellow plastic shields. I made a mental note to write Pepco about the matter, and after I returned, I did so. I sent a copy to the on-line discussion list serving my neighborhood.

            Also very twentieth-century, no? Actually performing a civic action!

            I crossed the railroad tracks and reached campus, where I saw more than a hundred students walking around in groups of fifteen, many carrying red folders. I asked two laggards if they were prospective students, and they said yes. Presidents' Day weekend is a big time for high school students to visit college campuses, it turns out. Thus I tallied another twentieth-century activity: engaging others — strangers! — in face-to-face conversation.

            Arriving at the library, I went directly to the stacks, another act that grows increasingly rare if not impossible as the twenty-first century progresses. I discovered that, at least in poetry, the stacks were four-fifths empty of books. Evidently considerable de-accessioning has been going on. In the handful of books that remained, I could not find Citizen: An American Lyric.

            I went to the circulation desk, where a work-study student had actually heard of the book and even recalled checking it out and then back in recently. She asked her supervisor, who went off and found the book on the "to be shelved" cart. Again, twentieth-century processes: memory (human, not digital), face-to-face work associates, physical movement.

            Buoyed by my success, I returned home. As I walked back, now on the other side of the street, my way was nearly blocked by a limb from a junk tree growing in a parking lot separated from the sidewalk by a high chain-link fence. I knew that the staff at the bar, whose lot it was, would never notice the blockage; they never even pick up trash in the lot itself. So I pulled the limb all the way to the sidewalk side, then pitched it back over into the parking lot where it belonged. Another civic act, making for a better walk home for commuters from the metro. How twentieth-century!

            Arriving home, I realized that I had been gone for 40 minutes. I could have saved most of that time by simply downloading Citizen into my Kindle or Nook, if I had one, or perhaps onto my iPhone, which I do have. But then I would not have gotten any exercise. I had walked just over a mile and a half. Of course, I might have set up my iPhone or my Kindle in front of a treadmill in my basement and walked in place for 40 minutes while reading Citizen. Doing so would have enlarged America's Gross Domestic Product (GDP), because I would have spent $9.95 for the download plus electricity for my treadmill. Indeed, I would have made a major contribution to the GDP, since I would first have had to buy a treadmill. That would have been twenty-first century of me.

            The monetary nexus may explain why twenty-first century "folkways" are winning. It is in some people's material interest to persuade me to walk indoors rather than out, download a book rather than use the library, etc. I suspect it also just seems to be the right thing to do. Certainly it's the hip thing to do. Who wants to be twentieth-century, after all?

            Our grandchildren, having grown used to nature-deficit-disorder, having adjusted to a society without civic acts, having abandoned face-to-face interaction even when face to face, won't even know what this essay is about. Indeed, rereading it convinces me it's merely a fogey-rant.

            Nevertheless, I shall send it out in the world, to tell them what we did back in the distant past, in the late twentieth century.

Copyright James W. Loewen

Lincoln's Second Inaugural on Its 150th Birthday

Sociologist James W. Loewen is the author of "Lies My Teacher Told Me."

            On Sunday, March 7, 2015, on the steps of the Lincoln Memorial, the National Park Service and the Lincoln Group of DC held a formal program remembering (and repeating) Abraham Lincoln's Second Inaugural Address.[1]

Abraham Lincoln (Michael Krebs) and Chief Justice Salmon P. Chase (John O’Brien) listening to learned analyses of Lincoln’s Second Inaugural.

            It was altogether fitting and proper that they should have done so. At least two speakers claimed that it was the most important speech ever given in this country, and I think they were right.[2] Moreover, this recognition followed more than a century of deliberate misreading or ignoring of this address, by our history textbooks and in our national culture.

            Bob Vogel, Director of the National Capital Region of National Park Service, began the day with remarks directly on point. He referenced Bloody Sunday at Selma, Alabama, whose 50th anniversary was being celebrated the same day, as part of the legacy of slavery of which Lincoln spoke. It would not be the day's only reference to Selma, which was appropriate, because the Civil Rights Movement, including the speeches given at this memorial during the March on Washington, sparked the shift in America's view of Lincoln that made Sunday's event possible.

            Vogel also referenced Robert Russa Moton, Booker T. Washington's successor at Tuskegee Institute, who gave the major address at the dedication of the Lincoln Memorial in 1922. He noted that Moton had invoked the "new birth of freedom" from Lincoln's other famous address, at Gettysburg, and had acknowledged, correctly, that Lincoln was referring of course to black freedom. Other speakers at the event, including Chief Justice William Howard Taft and President Warren Harding, only spoke of Lincoln as the savior of the Union.

            Vogel did not mention that Moton's remarks had been censored by the white people in charge of the dedication ceremony. Moton had planned to speak eloquently about the then-increasing tide of racial discrimination against which African Americans tried to swim. In the 1890s, white Americans, North and South, had reunited under the common bond of white supremacy. The '90s began with Mississippi's new constitution, which removed African Americans from citizenship — from voting, serving on juries, etc. Yet the United States simply ignored this flagrant defiance of the 14th and 15th amendments. As a result, every other Southern state and states as distant as Oklahoma had followed suit by 1907.

            So segregated had the United States become by the time the Memorial was dedicated that African Americans were restricted to a section across the road from the white audience, even in the nation's capital. Twenty-one African American guests left the dedication in protest.[3] The contrast between this seating and Lincoln's own practice at the White House reception after his inauguration shows in microcosm the deterioration in race relations during the Nadir.

            Moton had planned to reference Lincoln's famous "House Divided" speech at Springfield, Illinois, in 1858: "This nation cannot endure half slave and half free: it will become all one thing or all the other." Moton then went on:

            With equal truth, it can be said today: no more can the nation endure half privileged and half repressed; half educated and half uneducated; half protected and half unprotected; half prosperous and half in poverty; half in health and half in sickness; half content and half in discontent; yes, half free and half yet in bondage.

            

In this new era, after 1890, a new view of Lincoln had become popular, North and South, a synthesis historian Scott Sandage calls "bowdlerized" and "contrived."[4] Lincoln was now seen as "the best friend the [white] South ever had" as white Mississippians told me earnestly in the 1960s, because he would never have insisted on "Negro domination" the way the "radicals" in Congress did during Reconstruction. This new Lincoln was solely interested in holding the United States together, not in doing anything about slavery. Moton's invocation of a different Lincoln, one who did care about black rights, simply did not fit the new consensus. Moton was allowed to retain his ending, which referenced Lincoln's Gettysburg Address: "We dedicate ourselves and our posterity, with you and yours, to finish the work which he so nobly began, to make America the symbol for equal justice and equal opportunity for all."[5]

            Art critic Royal Cortissoz wrote the inscription that looms above the seated Lincoln. In keeping with this new interpretation of Lincoln, he deliberately omitted slavery: "In this temple, as in the hearts of the people for whom he saved the Union, the memory of Abraham Lincoln is enshrined forever." Cortissoz explained, "The memorial must make a common ground for the meeting of [white] North and South. By emphasizing his saving the union you appeal to both sections. By saying nothing about slavery you avoid the rubbing of old sores." To this day most high school textbooks in U.S. history echo Cortissoz's interpretation of Abraham Lincoln. One of the ways they do this is by omitting his Second Inaugural altogether or by quoting only its final paragraph, "with malice toward none."

            It's obvious to anyone who reads Lincoln's Second Inaugural that its last paragraph is a sort of epilogue — a brief passage at the end of a play or speech that brings the audience down from its high emotional and intellectual climax. That climax comes in the penultimate paragraph:

            If we shall suppose that American slavery is one of those offenses which, in the providence of God, must needs come, but which, having continued through his appointed time, he now wills to remove, and that he gives to both North and South this terrible war, as the woe due to those by whom the offense came, shall we discern therein any departure from those divine attributes which the believers in a living God always ascribe to him? Fondly do we hope — fervently do we pray — that this mighty scourge of war may speedily pass away. Yet, if God wills that it continue until all the wealth piled by the bondman's two hundred and fifty years of unrequited toil shall be sunk, and until every drop of blood drawn with the lash shall be paid by another drawn with the sword, as was said three thousand years ago, so still it must be said, 'The judgments of the Lord are true and righteous altogether.'

           

Two of those sentences are astonishing in their length alone, as well as in their content. Politicians don't talk like that nowadays. When my college students read it aloud, slowly and deliberately, they do not fail to perceive it as a searing indictment of America's sins against black people. The Civil War was by far the most devastating experience in our history. Yet we had it coming, Lincoln says here. And in his rhetorical context, sin or crime, not mere tragedy, is the fitting and proper term. Indeed, this indictment of U. S. race relations echoes the last note left by John Brown on his way to the gallows: "I, John Brown, am now quite certain that the crimes of this guilty land will never be purged away, but with Blood. . ."

            During the Nadir of Race Relations, which began in 1890 and began to loosen only around 1940, the United States went so racist in its ideology that the notion that Lincoln might actually have cared about black rights or ending slavery became an embarrassment. So let's leave that out! If the Second Inaugural is the most astonishing pronouncement about slavery ever uttered by any American president, let's just not quote it. Let's just quote from the epilogue. "With malice toward none, with charity for all" — who can be against that? Caring for widows and orphans — that's like being for the Ronald McDonald House — who could oppose that?

            The portrait of Lincoln that had been painted during the Nadir — free of content about slavery — still influenced our national commemoration of the centennial of the Civil War in the early 1960s. The centennial of his Second Inaugural went largely unremarked. Indeed, treatments of Christopher Columbus, secession, Abraham Lincoln, the presidency of U.S. Grant, that of Woodrow Wilson, and much else in our national U.S. history textbooks still remain under the thrall of interpretations that found favor during the Nadir of Race Relations. Thus only one textbook of the eighteen I studied for Lies My Teacher Told Me, published from 1975 to 2007, quotes anything from Lincoln's Second Inaugural about slavery. Seven quote a phrase or two from the epilogue. Ten ignore the speech completely.

            Not so, at the time. In 1865, it made quite an impact. Black listeners said "Bless the Lord" quietly after almost every sentence. Frederick Douglass told the president that evening that he deemed it "a sacred effort." Charles Francis Adams prophesied that it would be "for all time the historical keynote" of the war.[6] Six weeks later, farmers in New York and Ohio met Lincoln's funeral train with placards bearing phrases from this speech.

            Last Sunday, the Mall looked beautiful, covered with snow. African American re-enactors played the part that U.S.C.T. had played 150 years earlier. Abraham Lincoln, in the person of Michael Krebs, at 6'4" the perfect size and shape to play the part, read his address. And last Sunday, the speech again made its full impact. Sally Jewell, Secretary of the Interior, referred to one of the two long key sentences in the address. She also referenced Harriet Tubman and her memorial in Maryland and various other innovative sites the National Park Service is opening, including Cesar Chavez National Monument and locales related to LGBT history.

Chuck Todd at the podium.  Lucas Morel, seated next to Lincoln, with United States Colored Troops behind him. Bob Vogel of NPS is to Todd's right; Karen Needles of the Lincoln Group of DC is in blue.

            Then came the first key speaker, and the best, Dr. Lucas Morel, who ironically hails from Washington & Lee University ("Lee" added of course to honor the Confederate general in 1870). Morel focused entirely on the key paragraph, calling it "the centerpiece of the Second Inaugural." In that paragraph, according to Morel, Lincoln provided a "national memory" of slavery, a "common understanding" of what the war had been about. North and South might now agree that even if this war were to cease immediately, after "only" four years, the nation would be getting off easy, so great was the stain of slavery. The ringing applause that greeted Morel's talk gave hope that in 2015, 150 years after Lincoln's death, perhaps we can recapture the content of his character, the anti-racism of his last speeches, and the meaning of his life for our time.

            Later, Chuck Todd of "Meet the Press" spoke of polarization and referenced Selma. Historian Edna Greene Medford of Howard University also referred to Selma at some length. She closed as Moton had closed almost a century ago: "Let us pledge to recommit ourselves to the principles [Lincoln] championed." Bobby Horton, known for his Civil War era musical scholarship, sang Lincoln's campaign song and also "Dixie." The Washington Performing Arts' Children of the Gospel Choir, not children but young men and women, sang several pieces. They ended with the "Battle Hymn of the Republic," which in a way gave John Brown the last word — perhaps a fitting finale. I noticed that Secretary Jewell let a tear fall during that hymn.

 Bobby Horton playing “Dixie”  for Pres. Lincoln and the audience.

            Afterward, Vogel invited the audience to come up and be photographed with the re-enactors. I didn't go. I was content just to look down the Mall on that beautiful day, now becoming comfortably warmer. Beyond the reflecting pools, behind the Washington Monument, I could see parts of the Grant sculptures and the wings of the Capitol behind them. It was all very imposing, as befits a great nation. In the aftermath of the morning's program, I was free to imagine, now that we let ourselves remember all of Lincoln's Second Inaugural, what if the United States could live up to its moral implications? What if we did construct a society with no unrequited toil? What if we did achieve a just and lasting peace with all nations?

            An impossible dream? Well, it was a patriotic occasion — and at a place where dreams have been dreamed before.

    [1]Karen Needles, president of the Lincoln Group, worked untiringly to make the program happen. Full disclosure: I think I am a member of the Lincoln Group.

    [2]Admittedly, the competition is fierce, from William Jennings Bryan's "Cross Of Gold" to Martin Luther King Jr.'s "I Have A Dream." Certainly Lincoln's Second Inaugural is the most important speech by an American president, notwithstanding even Lyndon B. Johnson's "We Shall Overcome" address to Congress in support of the Voting Rights Act.

    [3]Their walkout was specifically triggered when a Marine spoke abusively to them. When a fellow Marine admonished him, he replied, "That's the only way you can handle these damned 'niggers.'" At that point, the 21 left. See Albert Boime, The Unveiling of the National Icons (Cambridge:  Cambridge UP, 1998), 295-96.

    [4]Scott Sandage, "A Marble House Divided," Journal of American History, 80 #1 (6/93), 142.

    [5]Adam Fairclough, "Civil Rights and the Lincoln Memorial: the Censored Speeches of Robert R. Moton (1922) and John Lewis (1963)," Journal of Negro History, 82 #4 (1997), 410-12. Some sources claim that Moton himself was not allowed to sit on the speakers' platform, but photographs show him seated with other dignitaries. Cf. Sandage, ibid.

    [6]Adams quoted in Merrill D. Peterson, Lincoln in American Memory (NY: Oxford UP, 1994), 12.

Copyright James W. Loewen

Acknowledging History to Right a Wrong

Near Goshen, highlighted in pink, are other probable sundown towns. 

Sociologist James W. Loewen is the author of "Lies My Teacher Told Me" and "Sundown Towns:  A Hidden Dimension of American Racism."

 On March 17, 2015, the mayor and city council of Goshen, Indiana, passed a resolution to acknowledge and transcend its past as a sundown town. This resolution is unusual in its direct citation and use of historical materials. It does this in a series of “Whereas” statements that lead to a final conclusion: “It Happened, It Was Wrong, It’s A New Day.”

 Between 1890 and 1940, more than 200 towns and counties in Indiana became sundown towns -- places that were "all-white" on purpose. Goshen was one of these towns. To be sure, it stopped enforcing its ban both formally and informally some years ago. Now this resolution clearly moves Goshen beyond its sundown past.

 Goshen begins by citing the Census, which “reported that the ‘Negro’ population of Goshen in 1890 was 21, but by 1910 it was 2.” Then the city directly admits its sundown past:

 "WHEREAS historical studies by multiple independent researchers confirm that Goshen was a “sundown town” for approximately the first two-thirds of the 20th century;"

The city also admitted that it used to brag about its racial composition:

"WHEREAS the Goshen City Utility, the Goshen Mayor’s Office, and the Goshen Chamber of Commerce put the City’s exclusionary reputation in writing in a number of publications from the mid-1930s to the late 1970s;"

Goshen then cited more recent and more positive events, such as establishing a “Diversity Day” in 1996 and placing “We Promote Tolerance” signs at its corporate limits in 2000. These “whereases” then lead to a final section:

 “NOW, THEREFORE, AS THE COMMUNITY OF GOSHEN, INDIANA,

“WE HEREBY:

“Acknowledge the racist and exclusionary aspects of Goshen’s “sundown town” history, along with the pain and suffering that these practices caused;

“…. [seek] equality and justice for all;

“Pledge to work toward the common good in building a community where people of all races and cultural backgrounds are welcome to live and prosper;

“And summarize this resolution in nine words: It happened, it was wrong, it’s a new day.”

The entire resolution is available here. Other former (or persisting) sundown towns in Indiana and across the United States now have an example to follow, a way to acknowledge and transcend their white supremacist pasts, and a step to take toward racial harmony. Had Ferguson, Missouri, for example – a suburb that tried mightily to become a sundown town between 1940 and 1960 but did not quite succeed –  debated and then passed a similar resolution, it might have given up its “DWB” policing years ago. Then it might have avoided the racial polarization that has now made the place notorious.

Copyright James W. Loewen

Then Verne Gagne Moved!

Sociologist James W. Loewen is the author of "Lies My Teacher Told Me."

            When I was maybe eight, Dad took me to see my first (and only) professional wrestling bout. It was incredible! There were various events, including two tag-team bouts, which were the most exciting. One guy was getting clobbered, almost pinned, almost pinned, almost pinned, and then he managed to reach the corner and tag his partner, who jumped on top of his opponent and saved the day! I still remember it, 65 years later!

            The other tag-team match was even more memorable: it was bisexual! No, wait, that's not the word — it was "mixed." Two women started out, but each was paired with a man, and various interesting things happened (probably they would have been more interesting had I been twelve instead of eight) during the periods when a man and a woman grappled in the ring.

            These preliminaries were mere warm-ups, however. The main event featured Verne Gagne, American Wrestling Association "World Champion," and some opponent who looked nasty, complete with a black scowl, black trunks, black robe, and maybe a black mask. (So far as I recall, all the contestants were white, but I'm not sure.)

            To my knowledge, this was the first (and surely the last) time that Decatur, a city of about 65,000 right in the center of Illinois, had ever hosted a World Championship in any field of human endeavor! Surely it was held (no, I don't actually recall) in the Masonic Temple Auditorium, where all major events in Decatur took place. Surely it was sold out. Surely the crowd was yelling itself hoarse. Certainly I was.

            The first minutes passed in a blur; I can no longer recount them to you reliably. But I still remember the fabulous climax. Both wrestlers were standing in the center of the ring, exhausted from their heroic efforts. Then the masked challenger stepped back and launched a sneak attack! He bounced off the elastic ropes, and catapulted himself feet-first at Verne Gagne! Our hero, caught unaware, crashed down like a redwood. Soon he regained his feet, however, only to have the evil challenger do it again! Again, though, Mr. Gagne managed to get up.

            And then ... then, Verne (may I use his first name?) had endured enough! When he saw the challenger go back to bounce off the ropes for a third and possibly fatal attack, he bounced off the ropes and launched himself feet-first as well!

            Both pairs of feet met in mid-air, some five feet above the mat! Perhaps 500 pounds of male flesh fell to the mat with an enormous thud. Both men lay stunned, possibly unconscious. A hush descended upon the crowd. The referee started his countdown, "One, two," but had to break off in confusion — how could he count out both men? Who would be the winner?

            And then ... then, Verne Gagne moved! He rose up! Well, at least he turned over and got onto hands and knees. Groggily, he moved over to his supine opponent and fell on top of him. He could do no more.

            "One, two, three," called out the referee, and it was over. Stunned, we filed out.

            The next day, I told the story to all my friends at school. They too were impressed. Who wouldn't have been? Evil had been vanquished! Truth and justice had triumphed!

            Yes, looking back, I am astounded that even at age eight, I failed to see through it. Why hadn't Verne thought to move his head to the side — just a few inches — when the challenger launched his feet at him? At least the second time? The third time, why did simply falling to the mat from five feet above it knock out both wrestlers? For that matter, how in the world might Decatur, Illinois, be hosting a World Championship wrestling bout in the first place? On a Tuesday night, at that? And didn't Mr. Gagne successfully defend his championship about as often as the Harlem Globetrotters — who also came through Decatur — defeated the College All-Stars?

            Did my Dad see through it? And just not say anything, since disclosing the fraud would have taken away my immense pleasure in recounting the saga to my friends? I don't know — after all, decades later, many grown-ups still claimed to believe that Hulk Hogan, Jesse Ventura, and all the rest were for real. Indeed, Verne Gagne, who retired finally in the 1980s (!), became a wrestling promoter and wound up as the sole owner of the American Wrestling Association! In turn, the AWA was "the breeding ground" for Hogan, Ventura, and all the rest, according to the Associated Press. 

            Let's just say that the 1950s were simpler times.

            Yes, there is a parallel to the simpler times that historians presented, back then, in their "consensus accounts" of Columbus, the Cold War, and everything in-between. Vietnam and Watergate have made us all a bit more sophisticated, a lot less naive.

            But that's not where I'm going with this essay. Verne Gagne died last week, age 89. That AP article is his obituary. Gagne started out as a real wrestler (high school, college, and AAU) before he became a fake wrestler. Some textbook authors were real historians, too, before they became fake historians for Pearson or Houghton Mifflin. Let us raise a glass of Bud Lite to Verne Gagne tonite! 

Copyright James W. Loewen

The Vietnam War Revisioned by Those Who Opposed It

Sociologist James W. Loewen is the author of "Lies My Teacher Told Me."

            On Friday, May 1, and Saturday, May 2, 2015, 700 veterans of the protests against the Vietnam War gathered at the New York Avenue Presbyterian Church ("Lincoln's Church") in downtown Washington, D.C. The meeting had been called by the "Vietnam Peace Commemoration Committee," initiated by Tom Hayden, John McAuliff, and David Cortright.

            The Department of Defense provided the impetus for the formation of the Peace Commemoration Committee when it revealed its plans to spend some $30-60,000,000 that Congress had given it to commemorate the Vietnam War. I cannot just write "Department of Defense" without noting that it had been called the War Department until after World War II. Soon after its name change, DOD abandoned all pretense of limiting itself to defense, built bases all around the world including within gunshot of countries it defined as enemies, and has remained almost continuously at war ever since.

            About Vietnam, DOD's plans showed how prescient George Swiers, a Vietnam veteran, was, decades ago, when he said, "If we do not speak of it, others will surely rewrite the script. Each of the body bags, all of the mass graves will be reopened and their contents abracadabraed into a noble cause."[1]

            To head this off, the Vietnam Peace Commemoration Committee formed and met with the Pentagon. They challenged what they had heard about the K-12 school curriculum ideas that DOD planned. DOD's fall-back position was that we should now "honor our servicemen" by inviting them to speak in schools. Fine, said the Commemoration Committee, but they pointed out that schools should also invite veterans of the peace movement. After all, the protestors were right: our War in Vietnam was not in the best interests of the United States or Vietnam and was morally as well as politically wrong.

            I did not intend to write about the intense and well-planned reunion of protestors that took place in Washington. Apparently no one else has, however. Why no reporters attended I do not know. So I shall quickly record some of the things that took place. The plenary proceedings were video-recorded; hopefully they will show up soon on a website.

            The Friday evening program was called "Honoring Our Elders." Phil Donahue (still alive! still alert!) emceed. He proved to be a fine stand-up comedian! (Who knew?) For example, at the end of the program, trying to herd participants together for a photo, he said, "You know what Christ told his disciples at the Last Supper, right? If you wanna be in the picture, you gotta get on this side of the table!"

            Before the elders held forth, Bill Ehrhart read an interesting poem. Then Congresswoman Barbara Lee from Oakland (CA) spoke at length. She had replaced the legendary Ron Dellums and was the only member of Congress to vote against the blank-check 2001 authorization for the use of military force after the September 11 attacks. Then Peter Yarrow, of Peter, Paul, & Mary, sang a heartfelt and complex antiwar song.

            Then came the impressive array of elders. Each was introduced by a "younger," a young activist against our current wars or ongoing injustices. The elders were Daniel Ellsberg, Dick Fernandez, Judith Lerner, Staughton Lynd, Dave McReynolds, Marcus Raskin, George Regas, Arthur Waskow, and Cora Weiss. Quite a collection — if any are unfamiliar to you, look them up on the web! Donahue complimented them for figuring out that the war was wrong long before he did and proceeded to ask them useful general questions. Each person got to say something interesting; audience members asked questions; and suddenly it was time for Yarrow to be joined by his daughter and son-in-law for two antiwar songs that I did understand.

            Saturday began with an invocation by Monsignor Ray East, a song by Holly Near, and a welcome from Heather Booth and Marge Tabankin. Then a panel featured ten-minute presentations by Tom Hayden, Wayne Smith (a veteran who told his own story of becoming antiwar "in country"), and former Members of Congress Pat Schroeder and Ron Dellums himself. Each was effective, even eloquent. Comments and questions from the audience filled the rest of the time until 11:15AM. Simultaneous break-out sessions then lasted an hour, on such topics as "How to Teach about Vietnam K-12" (led by Julian Hipkins and me), "The War and the Women's Movement" (Heather Booth), and "Vietnam Era Authors and Poets" (Jan Barry and Bill Ehrhart).

            After lunch, there were "Simultaneous Mini-Plenaries," such as "Opposing Our Country's Agenda," with Todd Gitlin, David Hawk, Judith LeBlanc, Rosalio Munoz, and Taylor Branch. I attended "American Foreign Policy: From Then To Now," featuring Phyllis Bennis, Daniel Ellsberg, Michael Klare, Larry Korb, and Marilyn Young. It was interesting, but not what I'd hoped for; I wanted to hear Ellsberg, in particular, analyze our adventures in Lebanon, Panama, Iraq, etc., etc. Instead, we got ideas about what we should have done to oppose our militaristic foreign policy.

            Then Tom Hayden offered a reflective, even pensive, summary. He told how the antiwar movement had splintered over small differences, hence didn't accomplish as much as it might have. He issued no clarion call to arms — which might have worked, had he done so — but his remarks were honest and informative.

            In the late afternoon, we walked to the King Memorial. Supposedly we were going to do so via the Vietnam Veterans Memorial, which would have been appropriate, but we did not. We should have had a flier to hand out to onlookers, who were appropriately curious. Our fogy status was confirmed when one person collapsed on the walk and others took the bus alternative. To some degree, the weekend was about passing the torch, but an inadequate number of young (or even younger) people attended. Passing the torch requires someone to pass it to. No DC high school students attended. I met no one from Howard, GW, AU, or any other local college. At the King Memorial, Danny Glover presided, along with Julian Bond and others. Back at the church, we enjoyed a dinner of Cambodian, Laotian, and Vietnamese food, followed by an evening program that I did not attend.

            A dinner companion commented on the fact that no one had spoken of the sex or drugs that permeated some happenings of the antiwar movement half a century earlier. I saw no sex or drugs at the reunion. (One sticker, "Make Out / Not War," was widely handed out. I took one and wore it, but I had no idea why it was changed from the original, "Make Love / Not War.") The weekend was also surprisingly sober in spirit.

            The various fiftieth reunions of the Civil Rights Movement, such as for Freedom Summer last year at Tougaloo College, were at least somewhat triumphant. Although they recognized the continuing injustices in our society, they also took pride in the changes that they had wrought, especially in the South. This peace reunion was much more subdued. Speakers were careful not to praise the peace movement for ending the Vietnam War — credit for that went primarily to the Viet Cong and North Vietnamese Army. Two speakers did note that Richard Nixon himself said he would have used nuclear weapons against North Vietnam, had it not been for the peace movement and the furor he knew he would trigger.

            There are reasons why peace movement alumni find it harder to congratulate themselves, compared to Civil Rights Movement alumni. Our society has formed a consensus that it is wrong to deny people citizenship, voting rights, jury duty, the use of hospitals and restaurants, etc., based on race. We have not formed a consensus that it is wrong to invade other countries thousands of miles away that pose no threat to our existence. On the contrary, we do it all the time.

            Similarly, our society commemorates on its landscape leaders of the Civil Rights Movement. Every city has its Martin Luther King Avenue. Medgar Evers gets an airport in Mississippi; Thurgood Marshall gets one in Maryland. Serious museums treat the Civil Rights Movement in Memphis, Selma, Birmingham, Montgomery, Greensboro, and elsewhere. But no major museum treats the peace movement anywhere in the United States.

            Nevertheless, we were right. The Vietnam War was wrong. The United States was wrong to "ask" its young men to travel around the world to "stop Communism" in Vietnam. The war killed almost 60,000 Americans and more than a million Vietnamese. It cost a fortune, imperiling LBJ's "War on Poverty." It also changed how Americans view their government, because as the edifice of lies that U.S. leaders constructed about the war collapsed, Americans' trust in government tumbled as well. It has yet to recover.

            HNN has posted a handout written by the committee and distributed at the conference, "Fighting on the Battlefield of Memory: Lessons from the Vietnam War." I understand the key authors were Tom Hayden and David Cortright. Most of its points are both useful and indisputable. I invite other attendees to flesh out this report, here or elsewhere on the web. I also invite readers to get involved in remembering and teaching about Vietnam. We cannot let the Pentagon, with its millions, tell the story of this war and its impact on America, Vietnam, and the world, all by itself.

[1] Quoted in William Appleman Williams, et al., eds., America in Vietnam (New York:  Norton, 1989), ix.

Copyright James W. Loewen

Wrong and Racist at Duke

Sociologist James W. Loewen is the author of "Lies My Teacher Told Me."

  Jerry Hough, Professor of Political Science at Duke, got into trouble by making "noxious, offensive" comments about a New York Times editorial, "How Racism Doomed Baltimore." A Duke vice president used those adjectives, and Duke has placed him on leave. (Hough is 80 and planned to retire after next year anyway.)

            I would suggest that he should leave Duke, on grounds of incompetence. He knows nothing about the history of race relations, yet opines on it anyway. He claims, "No one has said I was wrong, just racist." Well, I do say he was wrong — as well as racist.

            Chronologically, Hough's first error is his astounding statement, "In 1965 the Asians were discriminated against as least as badly as blacks." In 1967, I wrote my doctoral dissertation, The Mississippi Chinese: Between Black and White, on exactly this point. I compared the Chinese American population in the Mississippi Delta — the largest in the South — with the majority population there, African Americans. I found exactly what Prof. Hough complains about: Chinese Americans had achieved social mobility while African Americans had not.

            To Hough, this proves black inferiority — not necessarily genetic, just behavioral. Asians "worked doubly hard" while blacks "felt sorry for themselves." On the contrary, my research showed  that the different results stemmed from different positions in social structure, leading to different degrees of white racism.

            The Chinese found a niche: grocers to the black underclass. In this position, they conflicted with no whites other than a handful of grocers who themselves had low status because they served a mostly black clientele. As Chinese Americans progressed economically, they were able to progress racially. By 1955, they were voting and had gained entry into "white" schools and hospitals. Whites then cited successful Chinese merchants to argue that Southern society was not racist. Just like Hough, they said that African Americans were just lazy, while Chinese were industrious.

            In reality, African Americans had no chance to make such progress. They constituted the workforce from which white landowners made their living — and some plantation owners made very good livings indeed. This aristocracy then flexed its political power to exclude agricultural and domestic workers from minimum wage standards and other labor laws. Any African American who nevertheless managed to become successful thereby became a target — such as the merchant in Shaw, Mississippi, whose shoe store was burned out by whites in the late 1950s. About voting, surely even Hough, a professor of political science, recalls that blacks could not register in most of the Mississippi Delta! Two Chinese Americans, meanwhile, served as mayors of their small Delta towns.

            Hough's next error concerns racial intermarriage. "Asian-white dating is enormous," he writes, "and so surely will be the intermarriage." He thinks this is because Asians favor integration while African Americans do not. Here his comments reveal ignorance about what sociologists call hypergamy: the tendency for men in the "higher" group to date and marry "down." It works like this: even today, men initiate most social interaction between genders. Men of the "upper" group choose women of the "lower" group far oftener than they do women of the "upper" group. Thus if a high school senior goes out with a sophomore, the senior is the male. If a doctor dates a nurse, the doctor is the male — and not just because most nurses are female. And yes, if a Caucasian dates an Asian, most often the Caucasian is the male and initiates the relationship.

            Way back in 1937, Romanzo Adams studied hypergamy in about the only place in the United States that then displayed much interracial marriage — Hawai'i. Adams found that hypergamy held perfectly across racial lines, with one marked exception: black/white. These couples were overwhelmingly composed of black males and white females. Why? Because white males resisted initiating social contact with black females. Since then, other investigators have found the same pattern across the U.S., although in the last few years the ratio has decreased somewhat. Here is Hough's explanation: "Black-white dating is almost non-existent because of the ostracism by blacks of anyone who dates a white." Yet by ratios of as much as five to one, blacks initiate black-white dating!

            Hough then suggests a staggeringly original analysis of black naming patterns, a topic with a long, rich, and troubled history. In slavery, owners rather than parents named slaves. Some owners, including George Washington, gave "their" slaves pretentious names like Pompey and Caesar, making fun of their powerlessness. Jefferson took a different tack, recording his slaves by diminutives, like "Jenny" for a woman who called herself "Jane Gillette." Some enslaved parents fought back by giving their children secret names. Those lucky enough to be freed, including by the Civil War, reverted to such names, if they had them.

            During the Nadir of race relations — that terrible period between 1890 and 1940 when white Southerners removed blacks from citizenship — a new consideration affected black naming customs. As racial subordination intensified, every element of social interaction became codified. Now whites called blacks — even older and more senior African Americans — by their first names while demanding to be called "Mr." or "Mrs." in return. To avoid this disrespect, some parents named their children initials, like "T. J." Daughters might get named with positive adjectives, like Patience or Precious. A few parents even named their first-born sons "Mister," giving white supremacists no way to disrespect them, other than "Boy!" The Civil Rights and Black Power movements opened up new ways to claim respect, including African names like Jamal. African Americans’ increased interest in their African past, symbolized in the bestseller Roots and its ensuing smash television miniseries, sparked white Americans’ renewed interest in their own ethnic pasts. In turn, social psychologist LaFrances Rose wrote a fascinating paper about African Americans' willingness to give their children original names with original spellings like Shimiqua and Cheniqua.

            About this entire history Prof. Hough seems ignorant, as well. Bestowing unusual names merely "symbolizes their [blacks'] lack of desire for integration." Nonsense! My daughter, half Irish American, married an Irish American; they named their children Seamus and Bridget. So we can infer they don’t desire integration?

            The most famous research about black names, done at Chicago and M.I.T., shows that whites discriminate against them. "White names like Emily Walsh or Greg Baker drew 50% more [job] callbacks than those with African-American sounding names like Lakisha Washington or Jamal Jones," according to Kenji Yoshino, summarizing in the New York Times.[1] Prof. Hough seems to suggest that such discrimination may be perfectly fair. After all, these kids' parents didn't value success in white society anyway, else they would have given their children white names.

            To sum up, Jerry Hough shows no knowledge about relative discrimination, racial intermarriage, or black naming patterns, though he is willing to write about all three.

  "Ignorant I am not," he nonetheless claims in his defense, in an email to the Duke Chronicle defending his Times comments. To prove he's not ignorant, he notes that he used to live on the West Coast and visited Asheville, NC, several times between 1940 and 1960! Since he finished his B.A. in 1961, we can infer that he was not a student of race relations during these years. Instead, like many white Americans, Hough feels he can simply cite his life as evidence of his knowledge about race relations in the United States.

            This won't do. Like most whites in those years, Hough spent his life in a white cocoon. Near Asheville, for example, three entire counties flatly prohibited African Americans from living in them, except for two small enclaves. Did he ever learn that? I don't know where Hough lived on the West Coast, but the same policy held true for 80% of the suburbs of Los Angeles, 80% of the Bay Area, and many independent towns in California. Has Hough ever interviewed a single black person about race relations in North Carolina? in California?

            Harvard was almost all-white when Hough was there. I know, because I got my doctorate in sociology there just after he left. At that time, Hough encountered not a single African American professor. Harvard's only faculty member who knew anything about race relations, the estimable Thomas Pettigrew in Social Relations, told me in 1966 that he "felt completely isolated."

            Even after finishing his degrees, Hough seemed to learn nothing about race relations. After leaving Harvard, he taught political science at the University of Illinois from 1961 to 1968, for example. At that time, Champaign/Urbana was surrounded by sundown towns — communities that did not let African Americans spend the night. Within about 25 miles, these included Deland, Farmer City, Mahomet, Monticello, Paxton, St. Joseph, Saybrook, Tolono, Villa Grove, White Heath, and several others. In those years, some even posted signs at their city limits announcing that blacks were not welcome or sounded sirens at 6PM telling them to leave. Did Hough even notice? None of these towns kept out Chinese Americans, as far as I know. Yet Hough claims that Chinese faced discrimination equal to blacks! Did he ever interview a single black person about race relations in central Illinois?

            Prof. Hough does not even know enough about race relations to know that he does not know anything about race relations!

            What would we make of a "distinguished" professor of geology at Duke who wrote the New York Times to say that the world is flat? Should he be fired? To be sure, geology is not exactly geography, but geologists who think the world is flat are likely to do bad work. Similarly, race relations is sociology, not exactly political science, but political scientists whose knowledge of race relations is as faulty as Hough's are going to do bad work.

            Finally, what about the fact that Prof. Hough seems not to care about race relations at Duke? Every student he might teach is black, white, Asian, or "other." Now they know that Hough is completely clueless about race. Should they take his course about the Soviet Union? Probably they should, if they want to learn about that subject, but I would not want Hough cluttering up faculty meetings or the Web with his nonsensical views on race. Duke has a troubled history of race relations, including recent incidents. I would have to conclude that Hough’s "collegiality" potential would be limited, and his possible excellence about the U.S.S.R. should be weighed against his incompetence in other areas that he thinks are also his domain. Retirement seems appropriate, on grounds of incompetence.

    [1]Kenji Yoshino, "The Pressure To Cover," NY Times Magazine, 1/15/2006, 36. Incidentally, although Asian, Yoshino's parents also must have been uninterested in integration, according to Hough, naming him "Kenji" instead of "Ken." 

Copyright James W. Loewen

What Do You Do When a Review Is Dishonest?

Sociologist James W. Loewen is the author of "Lies My Teacher Told Me."

My book, Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong, is probably the best-selling book by a living sociologist. Mostly it got glowing reviews. Some were negative. As I wrote to Bill Ott, editor of Booklist, the review service of the American Library Association, however, one was flatly dishonest.

Gilbert Taylor wrote it. Ironically, Taylor graduated from Carleton College, my own alma mater. According to the biography he posted on LinkedIn, reviewing for Booklist was his full-time job from 1991 to 2011. He lists no employment since then. In those twenty years he wrote thousands of reviews for Booklist, of books about history, current events, science, art, literature, sports, and fiction.

Here's his primary complaint about Lies: “To account for the deplorable situation [of poor textbooks], he [Loewen] offers this quasi-Marxist explanation: ‘Perhaps we are all dupes, manipulated by elite white male capitalists who orchestrate how history is written as part of their scheme to perpetuate their own power and privilege at the expense of the rest of us.’ "

That is simply not true. I made no such offer. Rather, I wrote the quoted sentence as a hypothetical, a straw-man of sorts, and then went on immediately to dispute it and offer better explanations. Indeed, the sentence reads like a caricature of a true Marxist explanation. In Lies, after making the explanation look less absurd by quoting some of its adherents, including Jonathan Kozol, Paulo Freire, and Henry Giroux, I went on to wonder “if it is appropriate to lay this bundle on the doorstep of the upper class. To blame the power elite for what is taught in a rural Vermont school or an inner-city classroom somehow seems too easy. If the elite is so dominant, why hasn't it also censored the books and articles that expose its influence in education?”

A paragraph later, I pointed out, "One of the glories of capitalism is that somewhere there are publishers who will publish almost any book, so long as they look to make a profit from it." I concluded: "In sum, power elite theories may credit the upper class with more power, unity, and conscious self-interest than it has." I went on to suggest other explanations and offer reasons why they are more persuasive.

None of this surfaces in Taylor's review. Instead, after presenting only my straw explanation, he then goes on to suggest alternative explanations of his own.

As I wrote to Bill Ott:

It follows that either Mr. Taylor is a knave or a fool. If he did not notice that I dismissed that explanation in favor of others, then he is incompetent. Probably he DID notice it but prefers to pretend to readers — who after all will not go on to read the book, not after such a negative review — that I advanced my "quasi-Marxist explanation" seriously. Then he is a knave.

In short, "[T]his is a dishonest review that does not serve your constituency well." I invited him "to examine the passage in question" and offered to ship him another copy.

Today I realize that it is possible that Taylor was so incensed by my ten chapters critiquing how history textbooks glide over any bad actions the U.S. has ever taken that he "knew" I must be Communist. In that case, he might have simply ignored the evidence of the next five pages of my book after my "quasi-Marxist explanation" because they did not fit his preconception of me. I suppose this is a version of the "fool" alternative — perhaps "honest fool" or "motivated fool" would be the term.

To his credit, Ott eventually replied:

It appears that you make a valid point about our reviewer quoting a passage from your book out of context and then drawing conclusions from that passage that amount to a misreading of your intentions. Therefore, we will remove the review from Booklist Online and instruct those vendors to whom we license our reviews to do likewise (you should know, however, that our licensees may or not respond to such requests in a timely fashion).

Ott went on to note, however, that if I insisted on his doing this, I might be shooting myself in the foot:

I would point out to you, however, that our review, despite its criticisms of your text, very likely did contribute positively to the book’s library sales. Librarians attempt to collect materials on both sides of controversial issues, [so] the fact that your book appeared in the pages of Booklist will certainly have prompted orders.

This is the old "there's no such thing as bad publicity" reasoning. Quite likely, it's accurate. A bad review in an important site like Booklist may be better than no review.

"Thank you for recognizing my point," I replied. "I don’t throw the term ‘dishonest’ around lightly, and at my age, I’ve received negative reviews more than once. His misreading had to be deliberate."

However, rather than merely winding up with no review, I asked for two other responses: a new review in Booklist, and that the managing editor who sent the book to Taylor follow up by asking Taylor about his review. If Taylor had no cogent explanation, then I suggested he "should never again be engaged by ALA." I argued, "You do owe that to me, and to your readers as well."

The editor refused any additional redress. My choice remained: a scurrilous review or no review.

At Amazon, for most of the time since it first came out, Lies My Teacher Told Me has led all books in sales in its category, historiography. Over time, more than 2,000 reviews of my book have appeared at Amazon, but Taylor's is still the second one under the category of "Editorial Reviews." His falsely framed quotation from me still sits front and center. As well, as of May 2015, his review lies on at least 450 web sites besides Amazon. My reputation therefore continues to be widely besmirched by him.

What should I do about it?

There is no letters column at Booklist, as there is at Journal of American History. So far as I can tell, Gilbert Taylor has no reputation to lose, certainly none in history. I know no way to respond at Amazon to the two essays, usually from Publishers Weekly and Booklist, printed above "Product Details."

This essay is a bit of an answer, because HNN does reach some historians. As well, from now on anyone who Googles phrases like "quasi-Marxist explanation" or "perhaps we are all dupes" might get this essay along with Taylor's dishonest review.

Any other ideas?

Copyright James W. Loewen

Why Do People Believe Myths about the Confederacy? Because Our Textbooks and Monuments Are Wrong.

Sociologist James W. Loewen is the author of "Lies My Teacher Told Me." This article was first published by the Washington Post. 

Inside the Texas State Capitol in Austin

History is the polemics of the victor, William F. Buckley allegedly said. Not so in the United States, at least not regarding the Civil War. As soon as Confederates laid down their arms, some picked up their pens and began to distort what they had done, and why. Their resulting mythology went national a generation later and persists — which is why a presidential candidate can suggest that slavery was somehow pro-family, and the public believes that the war was mainly fought over states’ rights.

The Confederates won with the pen (and the noose) what they could not win on the battlefield: the cause of white supremacy and the dominant understanding of what the war was all about. We are still digging ourselves out from under the misinformation that they spread, which has manifested in both our history books and our public monuments.

Take Kentucky. Kentucky’s legislature voted not to secede, and early in the war, Confederate Gen. Albert Sidney Johnston ventured through the western part of the state and found “no enthusiasm as we imagined and hoped but hostility … in Kentucky.” Eventually, 90,000 Kentuckians would fight for the United States, while 35,000 fought for the Confederate States. Nevertheless, according to historian Thomas Clark, the state now has 72 Confederate monuments and only two Union ones.

Neo-Confederates also won western Maryland. In 1913, the United Daughters of the Confederacy (UDC) put a soldier on a pedestal at the Rockville courthouse. Montgomery County never seceded, of course. While Maryland did send 24,000 men to the Confederate armed forces, it sent 63,000 to the U.S. Army and Navy. Nevertheless, the UDC’s monument tells visitors to take the other side: “To our heroes of Montgomery Co. Maryland / That we through life may not forget to love the Thin Gray Line.”

In fact, the Thin Grey Line came through Montgomery and adjoining Frederick counties at least three times, en route to Antietam, Gettysburg and Washington. Lee’s army expected to find recruits and help with food, clothing and information. They didn’t. Maryland residents greeted Union soldiers as liberators when they came through on the way to Antietam. Recognizing the residents of Frederick as hostile, Confederate Gen. Jubal Early demanded and got $200,000 from them lest he burn their town, a sum equal to several million dollars today. Today, however, Frederick boasts what it calls the “Maryland Confederate Memorial,” and the manager of the Frederick cemetery — filled with Union and Confederate dead — told me in an interview, “Very little is done on the Union side” around Memorial Day. “It’s mostly Confederate.”

In addition to winning the battle for public monuments, neo-Confederates also managed to rename the war, calling it “the War Between the States.” Never mind that while it was going on, no one called it that. Even Jeopardy! accepts it.

Perhaps most perniciously, neo-Confederates now claim that the South seceded for states’ rights. When each state left the Union, its leaders made clear that they were seceding because they were for slavery and against states’ rights. In its “Declaration Of The Causes Which Impel The State Of Texas To Secede From The Federal Union,” for example, the secession convention of Texas listed the states that had offended them: Maine, Vermont, New Hampshire, Connecticut, Rhode Island, Massachusetts, New York, Pennsylvania, Ohio, Wisconsin, Michigan and Iowa. These states had in fact exercised states’ rights by passing laws that interfered with the federal government’s attempts to enforce the Fugitive Slave Act. Some also no longer let slaveowners “transit” through their states with their slaves. “States’ rights” were what Texas was seceding against. Texas also made clear what it was seceding for: white supremacy.

We hold as undeniable truths that the governments of the various States, and of the confederacy itself, were established exclusively by the white race, for themselves and their posterity; that the African race had no agency in their establishment; that they were rightfully held and regarded as an inferior and dependent race, and in that condition only could their existence in this country be rendered beneficial or tolerable.

Despite such statements, during and after the Nadir, neo-Confederates put up monuments that flatly lied about the Confederate cause. For example, South Carolina’s monument at Gettysburg, dedicated in 1965, claims to explain why the state seceded: “Abiding faith in the sacredness of states rights provided their creed here.” This tells us nothing about 1863, when abiding opposition to states’ rights as claimed by free states provided South Carolinians’ creed. In 1965, however, its leaders did support states’ rights. Indeed, they were desperately trying to keep the federal government from enforcing school desegregation and civil rights. The one constant was that the leaders of South Carolina in 1860 and 1965 were acting on behalf of white supremacy.

So thoroughly did this mythology take hold that our textbooks still stand history on its head and say secession was for, rather than against, states’ rights. Publishers mystify secession because they don’t want to offend Southern school districts and thereby lose sales. Consider this passage from “The American Journey,” the largest textbook ever foisted on middle-school students and perhaps the best-selling U.S. history textbook:

The South Secedes

Lincoln and the Republicans had promised not to disturb slavery where it already existed. Nevertheless, many people in the South mistrusted the party, fearing that the Republican government would not protect Southern rights and liberties. On December 20, 1860, the South’s long-standing threat to leave the Union became a reality when South Carolina held a special convention and voted to secede.

Teachers and students infer from that passage that slavery was not the reason for secession. Instead, the reason is completely vague: [white] Southerners feared for their “rights and liberties.” On the next page, however, “Journey” becomes more precise: [White] Southerners claimed that since “the national government” had been derelict “by refusing to enforce the Fugitive Slave Act and by denying the Southern states equal rights in the territories — the states were justified in leaving the Union.”

“Journey” offers no evidence to support this claim. It cannot. No Southern state made any such charge against the federal government in any secession document I have ever seen. Presidents Buchanan and before him, Pierce, were part of the pro-Southern wing of the Democratic Party. For 10 years, the federal government had vigorously enforced the Fugitive Slave Act. Buchanan had supported pro-slavery forces in Kansas even after his own minion, the Mississippi slave owner Robert Walker, ruled that they had won only by fraud. The seven states that seceded before February 1861 had no quarrel with “the national government.”

Teaching or implying that the Confederate states seceded for states’ rights is not accurate history. It is white, Confederate-apologist history. It bends — even breaks — the facts of what happened. Like other U.S. history textbooks, “Journey” needs to be de-Confederatized. So does the history test we give to immigrants who want to become U.S. citizens. Item 74 asks, “Name one problem that led to the Civil War.” It then gives three acceptable answers: “slavery, economic reasons, and states’ rights.” If by “economic reasons” it means issues about tariffs and taxes, which most people infer, then two of its three “correct answers” are wrong! No other question on this 100-item test has more than one “right” answer. The reason is not because the history is unclear, but because neo-Confederates still wielded considerable influence in our culture and our Congress until quite recently, when a mass of politicians rushed to declare the Confederate flag unsuitable for display on government grounds.

Now the dean of the National Cathedral in Washington, D.C., has noted that the cathedral needs to de-Confederatize its stained glass windows. That would be a start for D.C., which also needs to remove its statue of Albert Pike, Confederate general and leader of the Arkansas Ku Klux Klan, from Judiciary Square. The Pentagon also needs to de-Confederatize the Army. No more Fort A.P. Hill. No more Fort Bragg, named for a general who was not only Confederate but also incompetent. No more Fort Benning, named for a general who, after he had helped get his home state of Georgia to secede, made the following argument to the Virginia legislature:

What was the reason that induced Georgia to take the step of secession? This reason may be summed up in one single proposition. It was a conviction … that a separation from the North was the only thing that could prevent the abolition of her slavery…. If things are allowed to go on as they are, it is certain that slavery is to be abolished. By the time the north shall have attained the power, the black race will be in a large majority, and then we will have black governors, black legislatures, black juries, black everything. … The consequence will be that our men will be all exterminated or expelled to wander as vagabonds over a hostile earth, and as for our women, their fate will be too horrible to contemplate even in fancy.

With our monuments lying about secession, our textbooks obfuscating what the Confederacy was about, and our army honoring its generals, no wonder so many Americans supported the Confederacy until last week. We can literally see the impact Confederate symbols and thinking had on Dylann Roof, but other examples abound. In his mugshot, Timothy McVeigh, who bombed the Murrah Building in Oklahoma City in 1995, wore a neo-Confederate T-shirt showing Abraham Lincoln and the words, “Sic semper tyrannis!” When white students in Appleton, Wis., a recovering sundown town that for decades had been “all white” on purpose, had issues with Mexican American students in 1999, they responded by wearing and waving Confederate flags, which they already had at home, at the ready. Across the country, removing slavery from its central role in prompting the Civil War marginalizes African Americans and makes us all stupid.

De-Confederatizing the United States won’t end white supremacy, but it will be a momentous step in that direction.

Copyright James W. Loewen

It's Time to De-Confederatize "The American Pageant": An Open Letter to David Kennedy

Sociologist James W. Loewen is the author of "Lies My Teacher Told Me."

            

             Like our textbooks, this monument to Jefferson Davis in Richmond, VA, mystifies the Confederate cause.

Dear David,

            Recently you and I crossed swords in a way, in an article in the Washington Post. Education reporter Emma Brown interviewed both of us and then wrote, "150 Years Later, Schools Are Still A Battlefield For Interpreting Civil War."

            Having read eighteen U.S. history textbooks closely for my book, Lies My Teacher Told Me:  Everything Your American History Textbook Got Wrong, I was able to criticize them for, as Brown put it, "failure to quote from key primary sources: the Southern states' declarations of secession, which made clear that they were leaving the union to protect white citizens' right to own slaves."

            From one of these sources, "Declaration of the Immediate Causes Which Induce and Justify the Secession of the State of Mississippi from the Federal Union," she then quoted a key passage: "Our position is thoroughly identified with the institution of slavery."[1]

            Brown went on to report what I had said to her about your book, American Pageant, one of the best-selling U.S. history textbooks for high-school students. I called Pageant unusual in that it actually (in Brown's words) "quoted directly from South Carolina's secession document. That's admirable, Loewen said, but the quotation leaves out the document's direct language about the role of slavery in driving South Carolina's decision."

            Then she reported her conversation with you. "Loewen is nitpicking. 'I would defy anybody who read our text to conclude that we were unaware of slavery as the cause of the Civil War.' "

            Did I nitpick? Let's see.

            Here's what you quote:

            We affirm that the ends for which this [Federal] Government was instituted have been defeated, and the government itself has been destructive of them by the action of the non-slaveholding States.... For twenty-five years this agitation has been steadily increasing, until it has now secured to its aid the power of the common Government. Observing the forms of the Constitution, a sectional party has found within that article establishing the Executive Department, the means of subverting the Constitution itself.

Notice that "slavery" does not appear. Nor is it implied. Instead, some vague "ends" for which the Government was formed "have been defeated," and "the Constitution" has somehow been subverted. Something has been agitated, but "this agitation" is about what? Logically, no reader can infer that slavery lurks behind these vague words, because the "non-slaveholding States" could be upset about tariffs, taxes, internal improvements — who knows?

            It isn't easy to quote the "Declaration of the Immediate Causes" at this length without mentioning slavery, because it comes back to slavery over and over. Let's investigate how you manage this feat.

            At the beginning, you make one error in transcribing the document. You write, "We affirm that the ends for which this [Federal] Government was instituted," but the document says "these ends." This error could not be random. Had you transcribed the document correctly, "these ends" would have alerted readers that something important had been referred to, just upstream. "These ends?" What ends?

            In the passage just before "these ends," South Carolina tells how the Constitution confers "the right of property in slaves" and "the rendition of fugitives from labor." "Rendition" means, of course, "legal return." A bit upstream of that, the document quotes the fugitive slave clause of the Constitution and complains that Northern states have not done their duty to fulfill it. That's what this quoted passage is all about. Indeed, that's what the entire document is about. Your excerpt offers no hint, however.

            Next, let's explore those four little dots you insert into the middle of the quote. Let's put back what you leave out.

            Those States have assumed the right of deciding upon the propriety of our domestic institutions; and have denied the rights of property established in fifteen of the States and recognized by the Constitution; they have denounced as sinful the institution of Slavery; they have permitted the open establishment among them of societies, whose avowed object is to disturb the peace of and eloign the property of the citizens of other States. They have encouraged and assisted thousands of our slaves to leave their homes; and those who remain, have been incited by emissaries, books, and pictures, to servile insurrection.

That is some ellipsis! Doesn't that make my point? Did you not "leave out the document's direct language about the role of slavery in driving South Carolina's decision?" Had you included any of it, then no reader could be confused. Agitation about what, indeed! About slavery, of course.

            Surely that's why you left it out, no? That is, surely Houghton Mifflin did not want to include anything directly on point, lest neo-Confederates on Southern state textbook adoption boards take offense.

            David, I do realize that the foregoing is unduly harsh in that I personalized my criticism. I suspect that you had nothing to do with deciding what goes into the various boxes, "Varying Viewpoints," "Chronology," and all the rest of the interruptions to the main narrative. If so, then let me ask, who did? Do you even know? What are the qualifications of the person who chose to include the key document but then left out everything important in it? Moreover, if you had nothing to do with deciding what goes into the various boxes, why not? Who better than you to select readings for students of U.S. history? Why delegate the task, when it affects millions of students and thousands of teachers across the country?

            After all, teachers need all the help they can get, concerning secession. The issue isn't whether you "were unaware of slavery as the cause of the Civil War," as you put it to Brown. I'm sure that you know that slavery was why the South seceded, David.[2] But most teachers do not. I know, because over the last ten years I have asked several thousand K-12 teachers of social studies and U.S. history this question: "Why did the South secede?" Teachers always generate four answers:

            (1) slavery

            (2) states' rights

            (3) the election of Lincoln

            (4) tariffs and taxes.

Then I ask them to vote. The results have not been pretty, although they are remarkably uniform across the country.

Why did the South secede?

            This graph should sadden everyone in the United States who cares about history. A century and a half after the Civil War, most teachers — a clear majority — still get secession 180° wrong. South Carolina and all other states seceded because they were against, not for, states' rights. They seceded for slavery (and also owing to the election of Lincoln, to be sure).

            Then I ask, "What would be the best evidence to resolve the matter?" Individuals volunteer "diaries from the time" and "newspaper articles," a bit vague. Sometimes I banter with them: "The diary of an 1859 dairy farmer in Michigan?" Immediately they respond, no, diaries or articles from Charleston — better answers, but still hardly the best. Eventually, someone asks, "Wasn't there some sort of convention? Didn't it say why South Carolina was leaving the Union?" Yes, I reply, and I read to them the title of the document we are discussing here.

            Once they learn of South Carolina's "Declaration," teachers realize that it is the best single source. Its title alone convinces them that it is the smoking gun.[3] Hence they will believe it, in your book, above your own words, which do present slavery as the primary cause. Then they will continue to misteach the cause of secession as states' rights, or perhaps present multiple causes — "sectionalism, states' rights, and slavery," in that order, to quote the infamous new standards in Texas.

            From time to time, I have made lists of the ten most important documents in the history of the United States. It's a fun exercise. Of course the Declaration of Independence and the Constitution stand at the top, perhaps not in that order. What next? Lincoln's Gettysburg Address? I don't think so, since it announces no new policy beyond "a new birth of freedom," and that birth — of black freedom, of course — had already been announced in the Emancipation Proclamation. Maybe then the Emancipation Proclamation itself? Richard Hofstadter famously quipped that the Emancipation Proclamation has "all the moral grandeur of a bill of lading," and he had a point. Also, African Americans were freeing themselves at a remarkable rate, both in the border states and whenever U.S. troops came near them in the South. Still, it did change national policy and led directly both to the United States Colored Troops and the Thirteenth Amendment. So, OK, let's include it.

            I don't know what else you might nominate, David, but surely this document, the one we are "nitpicking," must rank among the top five. After all, it prompted the Civil War — the most important single event since the formation of the nation. As well, it provided the direct model for similar declarations by Mississippi and Texas, as well as the arguments made by most other states when they seceded. (See The Confederate and Neo-Confederate Reader for these documents.)

            Thus you leave out the most important passage in the document, and the document is one of the most important that students need to see in a year-long course in U.S. history.

            So much for nitpicking!

            David, in all, your quotation runs 81 words. Here is an alternative selection:

            [T]he non-slaveholding States .... have denounced as sinful the institution of Slavery; they have permitted the open establishment among them of societies, whose avowed object is to disturb the peace of and eloign the property of the citizens of other States. They have encouraged and assisted thousands of our slaves to leave their homes; and those who remain, have been incited by emissaries, books, and pictures, to servile insurrection.

Substitute these words, and no reader will ever again be confused. They total only 70, so they'll fit fine. Will you ask Houghton Mifflin to make this change?

            Thank you for your consideration — Jim Loewen

[1] She mistranscribed "thoroughly" as "clearly."

[2] Slavery of course did not drive the Union cause. The U.S. went to war to hold the nation together (and because Confederates attacked). Soon enough, however, on the ground and after the Emancipation Proclamation, ending slavery became a U.S. war aim.

[3] Why do you retitle it ("declaration of independence")? Isn't the original title dramatic enough? 

What should Charleston do with John C. Calhoun?

Sociologist James W. Loewen is the author of "Lies My Teacher Told Me." This article was first published in the Charleston Post and Courier.

“Don’t dishonor Calhoun,” wrote H. Lee Cheek Jr. and Sean R. Busick in these pages recently. At least they recognized that John C. Calhoun’s monument in Marion Square and his name on Charleston’s major east-west artery are more matters of honor than history.

They then go on to claim that ceasing to honor Calhoun would somehow get history all wrong. In the process, however, they get Calhoun’s history wrong. Getting the history right precludes honoring him.

They note he graduated from college and was an able Secretary of War. Such claims miss the arc of his life. That’s not why we remember or honor him. As he matured, Calhoun increasingly placed the interests of his region as he perceived them ahead of the national interest. After 1820, he took ever more extreme positions favoring the South as a region and slavery as a cause. In 1832 he threatened secession; in return, President Andrew Jackson threatened to hang him for treason. In 1837 Calhoun told the Senate, “[Slavery] cannot be subverted without drenching the country in blood, and extirpating one or the other of the races.” Nor should slavery ever be ended, he went on, because it is “a positive good.” This theory relies openly on racism — slavery is good for black people because they are “low, degraded, and savage,” in Calhoun’s words. Should we honor him for that?

Repeatedly, Calhoun threatened disunion to blackmail national leaders to get what he wanted. He explained his strategy to a friend in 1827:

“You will see that I have made up the issue between North and South. If we flinch we are gone, but if we stand fast on it, we shall triumph either by compelling the North to yield to our terms, or declaring our independence of them.”

Opposed to the high tariffs of 1828-1832, Calhoun prompted a national crisis when he got South Carolina to “nullify” them. Jackson refused to back down, declaring, “Our Federal Union — it must and shall be preserved.”

Calhoun’s threats did get congressional leaders to lower the tariff, however. Having browbeaten his way on the tariff, Calhoun later threatened disunion if Texas was not annexed, if the United States extended diplomatic recognition to Haiti, and even if citizens in northern states continued to agitate for abolition. Should we honor him for that?

As the last point shows, Calhoun also pushed to make South Carolina (and all of the South) a closed society. He argued that petitions about slavery should not even be received by Congress and that sending abolitionist materials through the mail should be a crime. Should we honor him for that?

The last point also shows his shift away from states’ rights. By the 1840s, Calhoun was insisting that because the Constitution protected slavery, slave-owners had the right to take their property into any of the territories. He now claimed that Congress had had no Constitutional right to pass the Northwest Ordinance, outlawing slavery northwest of the Ohio River. Since some of the same people voted for the Northwest Ordinance who later voted for the Constitution, Calhoun could hardly claim he knew the founders’ intent. Now he also called the Missouri Compromise, which he had supported in 1820, “unconstitutional,” because it banned slavery from territories north of the northern boundary of Arkansas.

Calhoun’s purpose had to have been disunion. In 1837, he had written that states’ rights allowed Northerners to distance themselves morally from slavery. “A large portion of the Northern States believes slavery to be a sin, and would consider it as an obligation of conscience to abolish it if they should feel themselves in any degree responsible for its continuance.” Now he opposed states’ rights, again threatening secession unless the federal government passed and enforced a harsh fugitive slave law and required slavery throughout the territories. Should we honor him for that?

In 1850 his agitating for secession won that fugitive slave law. Nevertheless, on his deathbed he opposed the Compromise of 1850, claiming that it didn’t go far enough toward guaranteeing slavery forever. By this point, it is doubtful that any compromise would have satisfied Calhoun. Yet Cheek and Busick write, “His last years were spent attempting to unify the country.”

I disagree. John C. Calhoun is remembered — and honored in Charleston — for what he did in the latter half of his adult life. In those years, he provided the intellectual scaffolding that rationalized slavery, suppressed freedom of speech and legitimized secession.

Surely that legacy should persuade residents of Charleston to rename his street and move his statue, or at least to put up an accurate historical marker in front of it to provide context.

Every year that he dominates Marion Square, every year that Calhoun Street remains named for him, Charleston declares on its landscape that he was a hero worthy of honor. That declaration insults every black resident and every white resident who does not believe that treason on behalf of slavery made moral or political sense then or now.

After Charleston renames the street and perhaps removes the statue, it can put up good markers explaining what had been there and why it was changed.

South Carolina historical markers allow enough words to tell about Calhoun’s career, positive and negative.

Then history, not honor, will result.

Copyright James W. Loewen

What Does Rockville, Maryland's Confederate Monument Tell Us About the Civil War? About the Nadir? About the Present?

Sociologist James W. Loewen is the author of "Lies My Teacher Told Me."


            In 1913, the United Daughters of the Confederacy (UDC) put a soldier on a pedestal in front of the Montgomery County courthouse in Rockville. Today this monument has something to teach us about three eras: the war it commemorates, the period when it went up, and our present day (when, hopefully, it will come down).

            Historical monuments play at least two roles in society. They may prompt us to remember the past, maybe even to learn more about our history. They can also distort our knowledge of the past and warp our view of the world. This essay examines the roles played by Montgomery County's Confederate monument in each of these eras.

What Does Rockville's Confederate Monument Teach About the Civil War?

            This monument glorifies those who fought to keep African Americans in chains and many of whom, after Reconstruction, worked to put them back into second-class citizenship. It does not just memorialize the dead. It says that they were "our heroes," Montgomery County's heroes. It also tells us how to think about the Confederate cause: we are "to love the thin grey line."

            This was hardly the view of most residents of Montgomery County or adjoining Frederick County at the time. Neither county seceded, of course. While Maryland did send perhaps 24,000 men to the Confederate armed forces, it sent 63,000 to the U.S. army and navy. The Thin Grey Line came through Montgomery and Frederick counties at least three times, en route to Antietam in 1862, Gettysburg in 1863, and Washington in 1864. Lee's army expected to find recruits and help with food, clothing, and information. This did not happen, although the army did kidnap every African American it came upon, dragging them back into Virginia as slaves. In a further irony, a historical marker on the courthouse grounds, not far from the Confederate monument, tells of J.E.B. Stuart's 1863 raid nearby, in which he captured "as many as a hundred" African Americans and enslaved them. They are invisible on the marker, which mentions only the capture of "150 U.S. wagons."[1] During the first invasion, Maryland residents greeted Union soldiers "as liberators" when they came through on their way to Antietam, according to historian William F. Howard. During the last invasion, when Confederate cavalry leader Jubal Early came through, he demanded and got $300,000 from the leading merchants of Frederick, lest he burn their town, a sum equal to at least $5,000,000 today.[2]

            The phrase "thin grey line" alludes to part of the "Lost Cause" mythology about the Civil War: that the North won the war mainly or solely due to its numerical superiority. There is some truth to this claim: the U.S. Army outnumbered the Confederate Army about two to one. However, since the United States had to conquer and hold the seceding states to win, this "thinness" did not by itself determine the outcome. See my essay, "Appomattox:  Getting Even The Numbers Wrong,"[3] for a discussion of how the UDC overemphasized this same point at another historic site.

            Opponents of moving this monument cannot claim that doing so does violence to history or to our knowledge of the Civil War, since it says almost nothing about that war. On the contrary, this monument makes it harder for white residents of Montgomery County to emulate or even learn about their ancestors who enlisted in the Union army or worked for black rights after the War. No one seeing it at the Rockville Courthouse would imagine that Maryland sent more than 2½ times as many men to the U.S.A. as to the C.S.A. armed forces. No one reading its words would understand that the Confederate cause amounted to treason on behalf of slavery. This monument teaches us nothing about the Civil War except that Confederates should be honored as "our heroes."

            All Confederate monuments intrinsically imply that the Confederacy was a noble cause. Unfortunately, the cause was not noble. All of the secession documents that the Southern states issued as they left the Union state that their cause was slavery on behalf of white supremacy.[4] Therefore, even before people put up a monument like this one, they had to transform the Confederate cause into something more noble. This happened during the period known as the Nadir of Race Relations. "Nadir" means "low point."

What Does Rockville's Confederate Monument Teach About the Nadir of Race Relations?

            What can this monument teach us about when it went up? First, it is interesting that it went up in 1913. Why so late? Most United States monuments to Civil War dead went up between 1865 and 1890. Most Confederate monuments went up after 1890. Why?

            People usually put up monuments after they win, and the Confederates — we should say neo-Confederates because they were mostly a new generation by 1890 — won the Civil War in 1890. They won it in several ways. First, in that year they won what it was about: white supremacy. The state of Mississippi passed its new constitution. There had been nothing wrong with its 1868 constitution except that it let African Americans vote. In 1890, at their constitutional convention, white Mississippians were clear. As one delegate put it, "Let's tell the truth if it bursts the bottom of the Universe. We came here to exclude the Negro. Nothing short of this will answer." The key provision to do so was Section 244, requiring that voters must be able to give a "reasonable interpretation" of any section of the state constitution. White registrars would judge "reasonable." Other states across the South copied what came to be called "the Mississippi Plan," including Oklahoma by 1907.

            Second, neo-Confederates got to rename the Civil War to the "War Between the States." Not one person called it that while it was going on, so this was a complete anachronism.

            They also now claimed that they had seceded for states' rights, not for slavery. This claim stands history on its head. Every document from 1860-61 explaining secession refers to slavery — its expansion, enhancement, and maintenance — as the chief cause. During the Nadir of Race Relations, however, that terrible period from 1890 to 1940 when race relations deteriorated and whites grew more and more racist, Northerners found it embarrassing to think about the cause they had abandoned. They had fought for something, after all. At first, they went to war to prevent the breakup of the United States. As the war ground on, it became a struggle to end slavery. As early as Antietam, Union soldiers were going into battle singing "Battle Cry of Freedom" and "John Brown's Body."

            During the Nadir, however, black freedom turned out to have been stillborn. What was going on in Maryland during the Nadir? Sundown towns formed — places that for decades were "all white" on purpose.[5] An entire county — Garrett, farthest west — expelled its black population. Other sundown towns in Maryland included Tilghman Island, Mount Rainier, and Greenbelt. In Montgomery County, sundown towns included Washington Grove and the four Chevy Chases. Cities like Silver Spring and Washington, D.C., grew more segregated residentially. Schools were segregated, of course, and became less equal in quality.[6]

            In 1892 Grover Cleveland won the Presidency with a campaign that derided Republicans as "nigger lovers." Four years later, the United States Supreme Court granted official approval to racial segregation in Plessy v. Ferguson. Around that time, racial segregation became required by custom if not law throughout the North. No longer were Americans "dedicated to the proposition that all men are created equal," as Lincoln had said we were at Gettysburg.

            During those decades, our popular culture celebrated the antebellum plantation South, from minstrel shows to movies like Birth of a Nation and Gone with the Wind. Now the South and North came to be considered morally equal.

            During the Nadir of Race Relations, neo-Confederates also won the Civil War on the ground. White supremacists had the power to determine how the War would be remembered on the ground in Maryland. That's why Montgomery County boasts no United States monuments, even though most of its young men fought on the Union side. We can be sure that at the dedication of the Rockville monument in 1913 the mood was celebratory. I suspect every speaker was white and the audience was overwhelmingly white as well.

            Thus the UDC's erection of this Confederate monument in Rockville signals a time and a way that the United States went astray as a nation. So this monument has something important to teach us about 1913. To the UDC and the SCV (Sons of Confederate Veterans), monuments were the continuation of the Civil War by other means. They knew that having a Confederate landscape makes it easier to have a Confederate mindset, even a Confederate heart.

What Does Rockville's Confederate Monument Teach About the Present?

            Monuments and markers also point to unresolved issues in a third era — our own. This Rockville monument shows the continuing power of neo-Confederates in Montgomery County right up to last month. Conversely, it shows the lack of power of African Americans as well as the lack of regard for their feelings or their history.

            But the key issue is not just African Americans. Confederate monuments make an impact on whites. They are meant to. "Do not forget to love the Thin Gray Line." Dylann Roof loved the Thin Gray Line. Dylann Roof shows what can happen when one loves the Thin Gray Line too much.

            Or, if Roof is too extreme an example, consider the Sigma Alpha Epsilon scandal just three months earlier. Its members love the Thin Gray Line less than Dylann Roof did, so they merely display Confederate flags proudly in each chapter house and teach each other songs like

            There will never be a nigger in SAE.

            There will never be a nigger in SAE.

            You can hang him from a tree,

            But he'll never sign with me,

            And there'll never be a nigger in SAE.

Maryland boasts at least two chapters of SAE.

            Or consider White's Ferry, 23 miles west of Rockville. It too flies the Confederate flag and its ferry is named Jubal Early! Its owner thinks this is appropriate, because Early had a "rebellious, no surrender attitude.” It is certainly true that after the Civil War, Early never changed his attitude. He still defended slavery as appropriate for African Americans, since "The Creator of the Universe had stamped them, indelibly, with a different color and an inferior physical and mental organization." He was an early proponent of the "Lost Cause" mythology and helped organize the Southern Historical Society to spread its biased views. Perhaps this discussion about Montgomery County's Confederate monument will help change the thinking behind Montgomery County's only ferry operation.

            Confederate monuments and ideology make an impact on all too many of us. Residents of Montgomery County need to unlearn the myths they learned in school about secession and the Confederacy. I know they learned them because I have asked some 5,000 people across the United States why the Confederacy seceded. In most audiences, 65% vote for states' rights! They sit open-mouthed as I read to them the key documents, showing them to be 180° wrong. The documents convince them, but millions more still believe the error, partly owing to our Confederate landscape.[7]

What needs to happen now.

            Even though monuments are written in stone, they are not permanent. Americans have forever been talking back to their landscape. On the whole, it is a healthy process. The history written on the American landscape was written by people, after all, and we the people have the power to take back the landscape and make it ours. Montgomery County and Rockville are to be congratulated for setting up this forum to begin this process here.

            We Americans share a common history that unites us. But we also share some more difficult events — a common history that divides us. These things too we must remember, for only then can we understand our divisions and work to reduce them. Monuments could help, except too often they suffer from the same forces that created the divisions in the first place. Certainly this Rockville monument was put up by the same people (or their descendants) who seceded on behalf of slavery in the first place.

            Our landscape has always been contested. Writing in 1999,[8] I noted that people were shooting historical markers full of holes in West Virginia, sawing a foot off a statue in New Mexico, and covering Columbus statues with red paint across the country. Since the Charleston murders, this tendency has only increased, with vandalism of Confederate monuments in North Carolina, Texas, and many other places. Rockville's Confederate monument will be contentious from here on. It will never be "safe" again.

            We have seen that this monument doesn't tell much history about the Civil War. It just tells us which side we should be on, in 1913. What should be done with it?

            Montgomery County needs to put this monument in a museum setting. A place at the Beall-Dawson House might work, so long as it was visible to visitors to the house, not to drivers going past. Then those who see it could learn, from a label or marker put up by the historical society, what it teaches us about all three eras. Then the monument can at last play a positive role in educating everyone about the history of white supremacy in Montgomery County and in this country.

            Neo-Confederates may charge that removing this monument violates their "heritage." This emphasis on heritage, as historian Michael Kammen wrote, is "an impulse to remember what is attractive or flattering and to ignore all the rest."[9] Thus history and heritage are not the same; indeed, the two are often at odds. Neo-Confederates do not want to put the Confederacy into its proper historical context. They simply want to maintain its symbols as sites for homage in the present. Moving the monument is a gain for history, because Rockville can then tell more about the Confederate cause and the history of the monument itself. As well, the public can develop a more sophisticated understanding of the nature of history from the change on the landscape itself. The only heritage that we lose is the tradition of decades of honoring a repulsive cause. Losing this legacy is precisely the point.

            Has Montgomery County reached a point where African Americans have at least as much political and moral influence as neo-Confederates? To put this more broadly than a mere clash of interest groups, there is a reciprocal relationship between justice in the present and truth about the past. Has Montgomery County reached a level of justice in the present that it can now tell the truth about its past? This Confederate Monument can provide an effective tool for that telling — when it is moved to a museum and when its relationship to its three key eras is explained.

            We in 2015 can take back the landscape. The landscape does not belong only to the dead, but also to the living. Monuments are messages to the future, and the future does not belong to the neo-Confederates, but to Americans on the right side of history. We can no longer allow the "love the Thin Grey Line" message of the United Daughters of the Confederacy to stand.

  [1]David G. Smith, “Race and Retaliation,” in Peter Wallenstein and Bertram Wyatt-Brown, eds., Virginia’s Civil War (Charlottesville: U of VA Press, 2005), 137-38, 142.

    [2]William F. Howard, "Lee's Lost Orders," Civil War Quarterly, 9 (6/87), 27; Stephen E. Wilson, "Antietam — Death Knell of the Confederacy," ibid., 8; interview with cemetery manager, 5/99.

    [3]James W. Loewen, Lies Across America (NY: Simon & Schuster, 1999), 297-300.

    [4]See Loewen & co-editor, The Confederate and Neo-Confederate Reader (Jackson, MS: University Press of Mississippi, 2010), Chapter 2, for these documents.

    [5]I tell about them in my book, Sundown Towns (NY: New Press, 2004).

    [6]At least, this happened in Virginia and throughout the former Confederacy. I confess that I have not studied Maryland's black schools in the years 1865-1890, so I cannot say if they then deteriorated during the Nadir.

    [7]These results are even worse than those of an admittedly more professional recent Pew poll. It showed that 51% of respondents who had an opinion thought the South seceded for states' rights; another 10% blamed states' rights and slavery equally.

    [8] Lies Across America.

    [9] Michael Kammen, Mystic Chords of Memory (NY: Knopf, 1991), 626.

Copyright James Loewen

Celebrating John C. Calhoun in Minnesota!

Sociologist James W. Loewen is the author of "Lies My Teacher Told Me."

            Surely the northernmost commemoration of John C. Calhoun is Lake Calhoun, Minneapolis’s largest lake. Hopefully, a group of Minnesotans now numbering in the thousands will be able to change its name. They’re working on it.

            It is true that John C. Calhoun was an able Secretary of War, from 1817 to 1825. Some Minnesotans defend "Lake Calhoun" for that reason: he was in office when Fort Snelling was built, helping to fortify white settlements against Native Americans.

            Such arguments miss the arc of Calhoun's life. At the same age, Harry S. Truman ably oversaw the erection of twelve Madonna of the Trail monuments honoring pioneer women, coast to coast. So what? That's not why we remember or honor Truman.

            After he was Secretary of War, Calhoun took ever more extreme positions favoring the South as a region and slavery as a cause. He called the Missouri Compromise, which he had supported at the time, "unconstitutional," because it banned slavery from territories north of Arkansas. Eventually, he came to place the interests of his region as he perceived them ahead of the national interest, ahead even of national unity.

            Repeatedly, Calhoun threatened disunion to blackmail national leaders to get what he wanted. He explained his strategy to a friend in 1827:

You will see that I have made up the issue between North and South. If we flinch we are gone, but if we stand fast on it, we shall triumph either by compelling the North to yield to our terms, or declaring our independence of them.

            

            Opposed to the high tariffs of 1828-1832, Calhoun prompted a national crisis when he got South Carolina to "nullify" them. President Jackson refused to back down, declaring, "Our Federal Union — it must and shall be preserved." Calhoun's threats did get congressional leaders to lower the tariff, however. Having browbeaten his way on the tariff, Calhoun later threatened disunion if Texas was not annexed, if the United States extended diplomatic recognition to Haiti, and even if citizens in northern states continued to agitate for abolition. Should Minnesota honor him for that?

            In 1837 he told the Senate, "[Slavery] cannot be subverted without drenching the country in blood, and extirpating one or the other of the races." Nor should slavery ever be ended, he went on, because it is "a positive good." This theory relies openly on racism — slavery is good for black people because they are "low, degraded, and savage," in Calhoun's words. Should Minnesota honor him for that?

            At that time, Calhoun had written that states' rights let Northerners distance themselves morally from slavery. "A large portion of the Northern States believes slavery to be a sin, and would consider it as an obligation of conscience to abolish it if they should feel themselves in any degree responsible for its continuance." By the 1840s, however, he opposed states' rights when those rights had anything to do with freedom, a move calculated to sow sectional discord.

            By the 1840s, Calhoun had no more use for democracy. He pushed to make the South a closed society. He argued that Congress should not even receive petitions about slavery; sending abolitionist materials through the mail should be a crime. And he insisted that because the Constitution protected slavery, slaveowners had the right to take their property into any territory, even if its residents had voted slavery down.

            Therefore Congress had had no Constitutional right to pass the Northwest Ordinance, which had outlawed slavery northwest of the Ohio River, including part of Minnesota. Since some of the same people voted for the Northwest Ordinance who later voted for the Constitution, Calhoun could hardly claim to know the founders' intent. Nevertheless, he provided the intellectual scaffolding for Dred Scott in 1857, which reached Calhoun's conclusion seven years after his death. Minnesotans will recall that Dred Scott also references Fort Snelling, since Scott argued that his stay there made him free. Calhoun would have shouted "No!"

            In 1850 he again threatened secession unless the federal government passed and enforced a harsh fugitive slave law and required slavery throughout the territories. His agitating for secession helped win that fugitive slave law. Nevertheless, on his deathbed he opposed the Compromise of 1850, claiming that it didn't go far enough toward guaranteeing slavery forever. By this point, it is doubtful that any compromise would have satisfied Calhoun.

            John C. Calhoun is remembered for what he did in the latter half of his adult life. In those years, he rationalized slavery, suppressed freedom of speech, and legitimized secession. Surely that legacy should persuade Minnesotans to rename Lake Calhoun.

            Every year that the lake remains named for him, Minnesota declares on its landscape that John C. Calhoun was a hero worthy of honor. That declaration insults every black resident and every white resident who does not believe that treason on behalf of slavery made moral or political sense then or now. After Minnesotans rename the lake, they can put up a marker telling what its previous name had been and why it was changed. Then history, not honor, will result.

Copyright James W. Loewen

On the Amazing Similarity Between the New Texas Textbook Standards and the Textbook, "The Americans": An Open Letter to Gerald A. Danzer

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

Dear Professor Danzer:

            Although I pass through Chicago regularly and you teach at UIC, I don't think we've met. Our main connection is that I spent a lot of time with your high school U.S. history textbook, The Americans, while writing the second edition of Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong. The edition I read came out in 2007. I found that one major weakness of such textbooks is their treatment of secession. It's about this problem that I write you today.

            Most teachers cannot plug this hole, because they too are confused. See this chart for details. About 65% reply "states' rights" when asked why the South seceded. Another 10% answer "issues about tariffs and taxes."

            Trying to correct that deficiency, I wrote Teaching What Really Happened, which includes an entire chapter titled "Why Did The South Secede." Also, I wrote the introductions to the documents in The Confederate and Neo-Confederate Reader and got that book out in time for the sesquicentennial of the Civil War, hoping it would help teachers get the Confederacy right and neutralize textbooks that get it wrong.

            Recent events have convinced me, however, that we must go further. Surely historians have bent the facts about secession to avoid disturbing neo-Confederates for far too long. Glorifying the Confederate cause makes it easier for young people to get ensnared by the neo-Confederate cause today. Although the results are rarely as horrific as the recent murders in Charleston, increasing the number of neo-Confederates never leads to better race relations. Surely it is time to de-Confederatize U.S. history textbooks, including The Americans.

            I write you, Prof. Danzer, because Houghton Mifflin McDougal Littell[1] lists you as senior author, and none of your co-authors have expertise in the Civil War, according to their descriptive paragraphs at the beginning of the textbook.

            Here is your main treatment of the question of why the South seceded.

            Southern Secession

            Lincoln's victory convinced Southerners that they had lost their political voice in the national government. Fearful that Northern Republicans would submit the South to what noted Virginia agriculturalist Edmund Ruffin called 'the most complete subjection and political bondage,' some Southern states decided to act. South Carolina led the way, seceding from the Union on December 20, 1860. Four days later, the news reached William Tecumseh Sherman, superintendent of the Louisiana State Seminary of Learning and Military Academy. In utter dismay, Sherman poured out his fears for the South.

Then comes an inset passage with a paragraph by Sherman in which he predicts the "country will be drenched in blood."

            Then you resume:

            Even Sherman underestimated the depth and intensity of the South's commitment. For many Southern planters, the cry of "States' rights!" meant the complete independence of Southern states from federal government control. Most white Southerners also feared that an end to their entire way of life was at hand. Many were desperate for one last chance to preserve the slave labor system and saw secession as the only way. Mississippi followed South Carolina's lead and seceded on January 9, 1861. Florida seceded the next day. Within a few weeks, Alabama, Georgia, Louisiana, and Texas had also seceded.

            

Let's consider this extended passage. The first paragraph lists sectionalism as the reason — fear that the South as a section will be outvoted by the North. The second paragraph lists states' rights, followed by fear "that an end to their entire way of life was at hand." Last, "many were desperate ... to preserve the slave labor system."

            In 2010, the state of Texas passed new standards for U.S. history textbooks. They list the causes of the Civil War as "'sectionalism, states' rights, and slavery' — written deliberately in that order," according to journalist Emma Brown. These new standards have prompted some rewriting in the history textbook publishing industry. Brian Belardi of McGraw-Hill, for example, said recently that the revised textbooks McGraw-Hill produces for Texas will not be used in other states.

            Note that Houghton Mifflin McDougal Littell will not have to change a word in The Americans. It already conforms!

            I must ask you, Prof. Danzer, do you think this is a good thing? Do you yourself think Texas seceded for those three reasons, in that order?

            If you do, let me share words from "A Declaration of the Causes Which Impel the State of Texas to Secede from the Federal Union," published by the secession convention as it took Texas out of the Union. The Declaration begins with three paragraphs outlining how Texas came to join the United States. This history emphasizes:

            She was received as a commonwealth holding, maintaining and protecting the institution known as negro slavery — the servitude of the African to the white race within her limits — a relation that had existed from the first settlement of her wilderness by the white race, and which her people intended should exist in all future time.

It begins to look as if slavery was the #1 cause, not #3, and was the concern of the entire state, not just "many."

            The Declaration then proceeds to charge the federal government with something almost no other Southern state charges: making California a free territory.[2]

            The controlling majority of the Federal Government, under various pretences and disguises, has so administered the same as to exclude the citizens of the Southern States ... from all the immense territory owned in common by all the States on the Pacific Ocean, for the avowed purpose of acquiring sufficient power in the common government to use it as a means of destroying the institutions of Texas and her sister slaveholding States.

This is nonsense. Even South Carolina, always the most extreme, never claimed in its secession documents that California should have or would have been a slave state, absent federal interference. In reality, Southern settlers in California at the time agreed that it should become a free state. As John S. Mosby, the Gray Ghost of the Confederacy, pointed out long after the war, "Now in the Convention wch. Gen. Taylor has called to form a Constitution for California, there were 51 Northern & 50 Southern men — but it was unanimous against slavery."[3]

            Texas then makes a charge against the federal government that is even more singular:

            The Federal Government ... has for years almost entirely failed to protect the lives and property of the people of Texas against the Indian savages on our border, and more recently against the murderous forays of banditti from the neighboring territory of Mexico.

Here Texas is upset with the federal government for not acting enough, not for acting too much.

            Is Texas really seceding because ten years earlier, the Compromise of 1850 let California enter as a free state? And because it thinks the government hasn't sent enough troops to quell the Native Americans? Of course not. Indeed, the document goes on to say that while these issues with the federal government have been real, Texas has "patiently borne" them. Finally, Texas gets to its real reasons for seceding.

            When we advert to the course of individual non-slave-holding States, and that [of] [sic.] a majority of their citizens, our grievances assume far greater magnitude. The States of Maine, Vermont, New Hampshire, Connecticut, Rhode Island, Massachusetts, New York, Pennsylvania, Ohio, Wisconsin, Michigan and Iowa, by solemn legislative enactments, have deliberately, directly or indirectly violated [Fugitive Slave Clause of the Constitution].

Texas makes various other charges against Northern states. In particular, Texas fulminates against their support for the Republican Party, which it calls an "abolitionist organization" (it wasn't, at least not yet) that demands "political equality between the white and the negro races" (it didn't and wouldn't until Reconstruction). Thus Texas makes clear that it is seceding against states' rights when exercised by Northern states on behalf of freedom and because of the November victory of the Republican Party, which is against slavery.

            A few pages before your treatment of secession, The Americans has a two-page "Tracing Themes" box titled "States' Rights." Such boxes repeatedly interrupt the main narrative of your book, as they do in other textbooks. A box on states' rights could be a good idea. My essay on Gettysburg in Lies Across America, "South Carolina Defines the Civil War in 1965," shows how that state did favor states' rights when it put up its monument at Gettysburg, though it did not when it fought there. Tracing this history could introduce your readers to "historiography," among other benefits.

            Your box, however, begins:

            The power struggle between states and the federal government has caused controversy since the country's beginning. At its worst, the conflict resulted in the Civil War.

Surely most readers will infer that the South's insistence on states' rights led to secession and war. Surely they will not realize that the South's opposition to states' rights (and territories' rights) when exercised on behalf of freedom led to secession and war.

            That box then has four paragraphs treating four key years. "1787" treats the various Constitutional compromises. "1832" treats the Nullification crisis, certainly a states' rights issue. The fourth paragraph, "1957," treats the Little Rock crisis about school desegregation, also about states' rights. The third, "1860," is subtitled "South Carolina's Secession." You claim "South Carolina seceded after the election of Abraham Lincoln" — true enough — "whom the South perceived as anti-states' rights and antislavery," only half true. Again, let's refer to the key document, here the "Declaration of the Immediate Causes Which Induce and Justify the Secession of South Carolina from the Federal Union." It does mention the election of Lincoln, whom it immediately denounces because his "opinions and purposes are hostile to Slavery." Nowhere does it say anything about Lincoln's position on states' rights. Why would it? Lincoln had sworn repeatedly, "We must not disturb slavery in the states where it exists, because the Constitution, and the peace of the country both forbid us ...." Again, then, The Americans obfuscates rather than clarifies the position of the South on states' rights. South Carolina was furious with Lincoln's victory because he was anti-slavery. Period.

Prof. Danzer, can you defend this box? I know you may reply that you had nothing to do with these "Tracing Themes," "Great Debates," Section Reviews, Chapter Reviews, Further Readings, "Presidential Lives," and other boxes. Why not? They make up maybe half of your book. Who other than you should bear the responsibility for this content? Surely not the clerks who grind it out! Neither you nor I even know their names!

            If Texas is clear that it is seceding against states' rights, its secession document also makes clear that Texas is seceding for slavery and the ideology that undergirds it, white supremacy:

            We hold as undeniable truths that the governments of the various States, and of the confederacy itself, were established exclusively by the white race, for themselves and their posterity; that the African race had no agency in their establishment; that they were rightfully held and regarded as an inferior and dependent race, and in that condition only could their existence in this country be rendered beneficial or tolerable.

             That in this free government all white men are and of right ought to be entitled to equal civil and political rights; that the servitude of the African race, as existing in these States, is mutually beneficial to both bond and free, and is abundantly authorized and justified by the experience of mankind, and the revealed will of the Almighty Creator, as recognized by all Christian nations; while the destruction of the existing relations between the two races, as advocated by our sectional enemies, would bring inevitable calamities upon both and desolation upon the fifteen slave-holding States.

            

            Surely, Prof. Danzer, Texas seceded mostly over slavery (and white supremacy). If you do not feel you can say so on your own and still get adopted in Texas, why not just quote Texas's "Declaration of Causes"? Or quote from some other document! Is it good use of precious space to let the only "secession document" that you include be a comment from Sherman to a professor at the school he was running? Is it good pedagogy to tell students why the South seceded, rather than let them find out for themselves from reading what the leaders said as they left?

            There are some good things in your book, Prof. Danzer. For instance, you discuss "slave resistance in the Confederacy." You mention Confederate war crimes, such as Fort Pillow, which many textbooks duck. But your treatment of the essence of the Confederate cause — why it seceded in the first place — simply won't wash. Indeed, your book presents an even bigger problem than McGraw-Hill promises to produce. McGraw-Hill plans to lie about and obfuscate the reasons for secession only in Texas. You do so across the nation.

            The Texas standards, listing slavery third, are indefensible. So are the treatments of secession and of states' rights in The Americans. Teaching or implying that Texas seceded for states' rights is simply wrong. Avoiding the role of white supremacy in our past is itself a form of white supremacist history. I studied the index in The Americans. "White supremacy" does not appear. Neither does "racism" or any related term, such as "prejudice," "discrimination," "racial prejudice," "racial" anything, or even "race." Nor does this demonstrate mere bad indexing.

            If you agree, Prof. Danzer, that it is important to get the history of secession and the Confederacy right, what will you do about it? Will you ask the publisher to make immediate changes to align your book's treatment with the facts, the documents, rather than the Texas standards?

Sincerely,

James W. Loewen

            PS: I wrote Prof. Danzer on 7/15/2015. He replied on 7/21 and reminded me that we had met, when I was "the commentator on a panel on textbooks at the AHA meeting in San Francisco about 2002." He confirmed he had not had anything much to do with The Americans for some years; even in 2002 he had moved on to "global history." Danzer did imply that he had had something to do with the textbook's first edition, although "As you know," he wrote, "The Americans had a 'cast of thousands' in its original development and each major revision." I do not infer that he wrote even the central narrative, let alone the boxes and other material.

            Danzer did not speak to any substantive matter, such as his book's treatment of secession. Nor did he offer to make an effort to improve the textbook now, implying that it is a dead horse: "My impression is that The Americans is reaching the end of its life."

    [1]In 2008, Houghton Mifflin McDougal Littell seems to have merged with Harcourt Brace Jovanovich, which had acquired Holt, Rinehart & Winston. The resulting part of this behemoth that produced the 2012 edition of The Americans is now known as Holt McDougal for short. I have not read the 2012 edition.

    [2]Isham Harris, Governor of Tennessee, "Message to the Legislature," January 7, 1861, also complained that the Compromise of 1850 excluded "the Southern people from California." (See The Confederate and Neo-Confederate Reader [Jackson: University Press of Mississippi, 2010], 160, 162.)

    [3]John Singleton Mosby, "Letter to Sam Chapman," 7/4/1907, reprinted in The Confederate and Neo-Confederate Reader, 305. 

Copyright James W. Loewen

Time to De-Confederatize the Textbook, "The American Journey": An Open Letter to James McPherson

Sociologist James W. Loewen is the author of Lies My Teacher Told Me and The Confederate and Neo-Confederate Reader.

[Photo caption: Why doesn't James McPherson's textbook say the same things about secession and the Civil War that his famed history of the war does? Maybe he never wrote it?]

Dear Jim McPherson:

On May 13, 2015, I heard you at Politics & Prose, the independent bookstore in Washington, D.C. Perhaps you saw me in the audience and later in the question line. (We have met several times, most recently two years ago, when we walked together from one part of Arlington Cemetery to another for the burial with military honors of two bodies recovered from the wreckage of Monitor.) Eventually I abandoned the question line, however, because my question was going to be critical, even embarrassing, and it wasn't appropriate to embarrass you in front of your book-tour audience.

Recent events have convinced me, however, that I must ask you more than one question, not about your most recent book, but about your middle-school textbook, The American Journey. I shall ask them here, in this letter sent to you and to History News Network, HNN, where at least some of the historical profession comes to learn about itself.

Maybe you won't be embarrassed. Let's see.

Let me start with the exact opposite of criticism. Decades ago, I bought your one-volume history of the Civil War, Battle Cry of Freedom, in hardbound. It is my "go to" book on the war — indeed, for years I have filed all my other general histories of the war next to it, under "McPher," no matter who wrote them. That way I don't have to remember those other authors' names; I just remember yours.

About secession and the Confederacy, you said at Politics & Prose as well as in Battle Cry that the Southern states seceded over slavery and its extension, not for states' rights. I didn't take notes in the bookstore, but in Battle Cry (p. 214) you write:

The Alabama Democratic convention [instructed] its delegates to walk out of the national convention if the party refused to adopt a platform pledging a federal slave code for the territories. Other lower-South Democratic organizations followed suit. In February, Jefferson Davis presented the substance of southern demands to the Senate in resolutions affirming that neither Congress nor a territorial legislature could 'impair the constitutional right of any citizen of the United States to take his slave property into the common territories....'

Thus you set the stage, noting that Southern politicians did not favor states' rights on the matter of slavery. Instead, they insisted that the federal government require slavery in the territories, even if the citizens of a territory felt and voted otherwise.[1]

The rest of your chapter on secession in Battle Cry tells how secessionist rhetoric was not only proslavery but also rested on white supremacy. You quote (p. 243) a South Carolina minister, "Abolition preachers will be at hand to consummate the marriage of your daughters to black husbands." And of course you quote (p. 244) Alexander Stephens's famous "Cornerstone Speech," given shortly after he had become vice-president of the Confederacy:

Our new government is founded upon exactly the opposite idea; its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery ... is his natural and normal condition.[2]

You also note (p. 245) that the initial seven states seceded well before the central government had done anything against slavery anywhere — "indeed," you point out, "several months before Lincoln even took office." Thus Confederates could have no complaint with the central government, and in their documents explaining secession, they make none.

In your textbook, however, which has shaped the views of secession, the Confederacy, and the Civil War of millions of middle-school children, you tell quite a different story. The American Journey is perhaps the largest single book ever inflicted upon a middle-school child. I included it as one of six new textbooks I reviewed for the second and current edition of Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong, even though it is intended for middle school rather than high school, because of the prominence of its authors: Joyce Appleby, Alan Brinkley, and you.[3] Your names are on the cover and on the title page.

To my sorrow, of all six of the new histories, all written after 2000, your book, The American Journey, is substantially the worst in its treatment of why the South seceded. Indeed, it shows the clear influence of neo-Confederates, rather than historical evidence. This astonished me, because I know your biography as a historian, beginning with your initial focus on African Americans and abolitionists during the Civil War era. Blurbs on Journey's copyright page describe Appleby as focusing on the 17th and 18th centuries and Brinkley as a scholar of the Depression and New Deal, while you are described as "author of 11 books about the Civil War era." Surely I am justified in concluding that you — not Appleby nor Brinkley — are responsible for "Unit 6: Civil War and Reconstruction." So I must ask you a total of ten questions about The American Journey.

Here is Journey's main treatment of secession:             

            The South Secedes

            Lincoln and the Republicans had promised not to disturb slavery where it already existed. Nevertheless, many people in the South mistrusted the party, fearing that the Republican government would not protect Southern rights and liberties. On December 20, 1860, the South's long-standing threat to leave the Union became a reality when South Carolina held a special convention and voted to secede.

From that passage most readers infer — and I know, because I have asked some of them — that slavery was not the reason for secession. (See this chart, which reports the results of a survey of several thousand K-12 teachers.) After all, "Republicans had promised not to disturb slavery." Instead, the reason is, [white] Southerners feared for their "rights and liberties." What might this mean?

In Battle Cry of Freedom (p. 241), you ask precisely this question: "What were these rights and liberties for which Confederates contended?" There, you answer immediately: "The right to own slaves; the liberty to take this property into the territories...."[4]

Question #1: Why don't you say this in The American Journey? You leave it vague to the point of mystification.  On Journey's next page, however, you do say why Southerners seceded. Here is your "because" sentence:

Now because the national government had violated that contract — by refusing to enforce the Fugitive Slave Act and by denying the Southern states equal rights in the territories — the states were justified in leaving the Union.[5]

Question #2: When and where had "the national government" ever been "refusing to enforce the Fugitive Slave Act?" I did not know that it had ever refused. I thought that the United States under James Buchanan and earlier under Franklin Pierce and Millard Fillmore had robustly tried to enforce the Fugitive Slave Law of 1850, the most draconian national law passed in the United States to that time. Certainly no Southern state claims that the national government refused to enforce it in any secession document (see The Confederate and Neo-Confederate Reader). On the contrary, South Carolina takes pains to point out that it had no quarrel with "the national government": 

 The general government, as the common agent, passed laws to carry into effect these stipulations .... For many years these laws were executed. But an increasing hostility on the part of the non-slaveholding states to the institution of slavery has led to a disregard of their obligations....

Indeed, some Northern states and cities had failed to meet the fugitive slave act's requirements, which infuriated the seceding states, as document after document in our collection makes clear, but the federal government? South Carolina voiced no complaint with any anti-slavery policy of the federal government. Why would it? Lincoln had not even taken office, and Southern slaveowners dominated the Buchanan administration. The South also enjoyed a majority on the Supreme Court.

Question #3: When and where had "the national government" ever been "denying the Southern states equal rights in the territories?" Of course, "states" don't have any rights in territories; presumably you meant to say "Southerners" rather than "Southern states." But beyond that, did this ever happen?

            Let me give you my take on white slaveowning Southerners' changing positions about equal rights in the territories.

             ●  1820: Missouri Compromise: slaveowners accept a law dividing the western territories of the United States along the latitude of the Missouri/Arkansas border, south to be slave, north to be free.

             ●  1850: Texas restricts its land claim to accord with the Missouri Compromise line, but slavery might be allowed north of that line in Utah Territory, if residents of that territory so desired. Certainly some African Americans lived enslaved in Utah and nothing was done to free them.

             ● 1854: slaveowners get the Kansas-Nebraska Act, specifically repealing the Missouri Compromise and letting white male settlers in Kansas and Nebraska decide whether to allow slavery.

             ● 1857: slaveowners now get a ruling that denies free states equal rights in the territories. Dred Scott, supported by the Buchanan administration, requires the federal government to guarantee slaveowners the right to take slaves into any territory, regardless of the views of its residents.[6]

             ● 1860: slaveowners now demand (in their wing of the Democratic Party) a slave code for the territories, spelling out how the federal government will guarantee slavery as required by Dred Scott. This demand helped split the Democratic convention that year. The Douglas wing of the Party favored giving [white] Southerners equal rights to win majorities in the territorial legislatures to make territories slave or free. By 1860, however, as you point out in your Battle Cry passage quoting Jefferson Davis, equal rights were not enough for Southern slaveowners. The Buchanan administration was part of the radical proslavery wing of the Party and did everything it could to guarantee slavery in Kansas, including condoning vote fraud and violence.

Isn't the above pretty much correct, Jim? But in your textbook, you claim "the national government" was "denying the Southern states equal rights in the territories!" Again, please tell me when and where! Again, no Southern state makes this claim in any secession document in The Confederate and Neo-Confederate Reader. 

Your next chapter, on the war itself, obfuscates secession yet again. There, after telling how Jefferson Davis had "suspend[ed] habeas corpus," you say "Davis's action outraged Southerners who feared that they would lose the liberties for which they had gone to war." 

Question #4: When and where had Southerners said they were going to war for civil liberties? Again, this is news to me. I cannot find in any secession document a statement that the South was seceding to secure individual liberties like habeas corpus. The only civil liberty I find mentioned is the liberty to take one's slaves into any territory of the United States, even any state (at least temporarily), and have them protected by the U.S. government. About civil liberties, surely South Carolina took the opposite tack: by 1860 it had become life-threatening in that state to advocate racial equality or even to receive abolitionist material in the mail.

This brings me to my fifth question. You don't quote a single document as to why South Carolina or any other state seceded.

Question #5: The states make perfectly clear that they are seceding for slavery and against states' rights. Why not quote them? 

When South Carolina seceded, for example, its convention said why. On Christmas Eve, 1860, the delegates passed the "Declaration Of The Immediate Causes Which Induce And Justify The Secession Of South Carolina From The Federal Union." A key sentence cites "an increasing hostility on the part of the non-slaveholding states to the institution of slavery." At some length, the delegates complain about what Northern states have done, such as no longer allowing temporary slavery ("slave transit"), refusing to return fugitive slaves to their owners, and allowing anti-slavery societies to exist and to publish literature. That last might seem to be protected under the First Amendment, but not to South Carolinians, not if the free speech in question opposes slavery. No one reading The American Journey gets any hint of the above. Why not?

South Carolina secedes because it is against states' rights, at least whenever Northern states try to exercise those rights in ways that undermine slavery. The delegates even name the states and the states' rights that offend them:

The States of Maine, New Hampshire, Vermont, Massachusetts, Connecticut, Rhode Island, New York, Pennsylvania, Illinois, Indiana, Michigan, Wisconsin, and Iowa, have enacted laws which either nullify the acts of Congress, or render useless any attempt to execute them. In many of these States the fugitive is discharged from the service or labor claimed ....

Your textbook has plenty of room to include this passage. Adjacent to your discussion of secession, Journey devotes 40 percent of p. 455 to a silly box, "From Hardtack to Unmeltable Chocolate," that leads only to the study question, "Why is it important for modern soldiers to have dehydrated foods?" This box isn't even in the right chapter! (It belongs in the next chapter, which treats the war.) Why not put some primary source on secession in that box instead? Don't students need to see primary sources? Why not show them Stephens's "Cornerstone Speech," as well? 

Maybe your answer to the above will be that you had nothing to do with deciding what goes into the various boxes, the "Time Line Activities," "Assessments," etc. Well, why not? Who better than you to select readings and formulate questions for students about secession and the Civil War? But my next question is more basic. 

Question #6: Did you write the basic narrative on the Civil War and Reconstruction in The American Journey? In my discussion of your textbook in my recent Washington Post article, "Why Do People Believe Myths About The Confederacy? Because Our Textbooks And Monuments Are Wrong," I don't mention your name, because I cannot believe you wrote it. Did you?

If you did not, Question #7 then is: Who did? And what do you think of their qualifications? A veteran editor of U.S. history textbooks put it this way to me, describing the period when the 2007 edition of your textbook was being written: "Here's $3,000 for a free-lance writer.... They pick things up pretty quickly, and in a couple of days, they're up on the Civil War." Do you agree?

Question #8: If you did not write it, did you even read "your" material on the Civil War and Reconstruction in Journey? In my discussion of your textbook in The Confederate and Neo-Confederate Reader, I assumed you did not:

This paragraph in Journey is not what we would expect from America's premier Civil War historian. It is hard to believe that McPherson even read it and impossible to believe that he wrote it.

But if you did not, again, I ask, why not? Don't you think it's important to get it right, since it has influenced and will influence the minds of millions of Americans?

If you did read it, then why does the 2007 edition say exactly what the original 2000 edition said? After all, in 2000, the same year your textbook came out, you published "What Caused the Civil War?" in North & South magazine (11/2000). There, discussing the various possible causes, you wrote, "Of all these interpretations, the state's-rights [sic] argument is perhaps the weakest." Ironically, in 2007 you republished this essay in This Mighty Scourge; clearly you still believe it. Should not schoolchildren learn this too?

Question #9: Do you think it makes any difference, what schoolchildren learn about secession, the Confederacy, and the Civil War? At Politics & Prose, and again the next day on National Public Radio, you said that it is important for Americans to understand the Civil War, including secession. Indeed, you have spent much of your professional life in the service of this cause. I agree that it's important; that's why I co-edited The Confederate and Neo-Confederate Reader and wrote the introductions to each selection. Moreover, recent events in Charleston, South Carolina, have underlined some of the costs we all bear resulting from misunderstanding secession and glorifying the Confederacy.

Surely teaching or implying that the Confederate states seceded for states' rights is not accurate history. It is white supremacist history. It bends — even breaks — the facts of what happened. It rationalizes and defends the white South. It valorizes the Confederate cause. It makes us all stupid, because we learn something that isn't so. It encourages everyone — black children, white children, Hmong children, everyone — to believe that African Americans showed no agency, not even when it came to their own freedom. This estranges students of color from school, especially in history and social studies, which in turn widens the gap between white (and Asian) performance and black (and Native American) performance. As I show in Lies My Teacher Told Me, this gap is smallest in math, larger in English, but by far largest in history and social studies. Could this be because history is harder than, say, Faulkner? Trigonometry? Or might the usual way we teach history alienate nonwhite children?

Certainly neo-Confederates have long deemed it important to mystify secession and valorize the Confederate cause. You noted this in 2004 in "The Southern Textbook Crusade," your chapter in The Memory of the Civil War. There you cited Mildred Rutherford, Historian General of the United Daughters of the Confederacy, who in 1919 published A Measuring Rod to Test Textbooks.... One of her requirements was, "Reject a book that says the South fought to hold her slaves." You then noted that "every one" of her requirements "was false."[7] Now, almost a century after she published it, your textbook falsifies history to meet her requirement!

Of course, publishers, then and now, don't want to offend neo-Confederates. Editors and marketers imagine that the South is still so backward that textbooks still must lie about secession and the Confederate cause to get adopted. That may be the case in Texas, whose state board recently passed standards for textbooks that list the causes of the Civil War as " 'sectionalism, states' rights, and slavery' — written deliberately in that order," according to journalist Emma Brown.[8] Certainly that was the case in 1975, when we had to sue the Mississippi State Textbook Board to reverse their rejection of Mississippi: Conflict and Change. But we won that case (Loewen et al. v. Turnipseed et al.[9]). Publishers can use that precedent on behalf of accurate books today. Moreover, the South and the nation are changing. The reactions to the Charleston murders prove that. There is no excuse for pandering to Mildred Rutherford in 2015. Recently I was on NPR's Diane Rehm Show. Her producer had phoned the American Historical Association seeking someone to balance me, some reputable historian who would claim the Southern states seceded for states' rights. The AHA spokesperson said he could not come up with anyone; that would be like asking earth scientists to supply a "reputable climate denier."

If you agree that this history is important to get right, then I must ask one more question.

Question #10: What will you do about it? You have a great deal of influence. If you tell Glencoe/McGraw-Hill that the secession and Civil War sections of The American Journey have to be fixed, they'll fix them! Surely that is a minimal response, going forward.

Might I ask you to do something else to help make up for the harm that mystifying secession has done to all the children (and their teachers) who have read The American Journey over the last fifteen years? I don't know what to suggest. An op-ed by you about how this came to pass? An article for Social Education, the journal of the National Council for the Social Studies? Trouble is, those teachers who simply teach the textbook, rather than using the textbook as a tool (among others) for teaching history, aren't often members of NCSS. Nor do they read the New York Times or wherever you might place an op-ed. Maybe you have better ideas about what to do?

Finally, feel free to phone or email me about this letter. I pledge not to quote what you might say or write to me without your permission. I have long respected your intellect and your integrity and look forward to learning from your response.

Sincerely,

James W. Loewen

PS: I wrote Prof. McPherson on 7/15/2015. He replied two days later, saying he could see why I was so concerned about the treatment of secession and the Confederacy in the passages I quoted. He implied that the treatment of these subjects was better in the first edition and that he had written that edition, though he was not sure. Like Gerald Danzer with The Americans (see here), McPherson confirmed he had not had much to do with American Journey "for at least the last ten years." I suspect he may not have written even the first edition, and I shall find that edition and report later.

Like Danzer, McPherson did not offer to make any effort to improve the textbook now. Nor did either author express any interest in doing anything to remedy the harm that their textbooks had done over the past years to students who thereby mislearned the history of secession and the Civil War.

    [1]I realize that a state is not a territory, but the principle — federal control over local rights — is the same.

    [2]James W. Loewen and Edward H. Sebesta, eds., The Confederate and Neo-Confederate Reader (Jackson: University Press of Mississippi, 2011), 187-90.

    [3]The current edition (2011) lists five authors, including Albert Broussard and Donald Ritchie. Broussard focuses on race relations in California, Ritchie on twentieth-century political history, so I doubt they had anything to do with what American Journey says about secession and the Civil War.

    [4]In Battle Cry you add a third cause: "freedom from the coercive powers of a centralized government." I differ with that point, which contradicts your own analysis quoted above, where you note that secessionists demanded that the "centralized government" use its "coercive powers" to prevent local liberty in the territories. On reflection, you might agree with me, for you go on to argue (p. 241) that the key problem was not the central government itself, which of course was still under James Buchanan, a member of the proslavery wing of the Democratic Party, when the first seven states left the nation. Rather, it was that after March 4, 1861, Southerners would no longer control that central government.

    [5]I did leave out the first half of the paragraph. In it, you state: "Southerners justified secession with the theory of states' rights. The states, they argued, had voluntarily chosen to enter the Union. They defined the Constitution as a contract among the independent states." This is, of course, the "compact theory" of the nation's formation. Some Confederates did claim it, although Jefferson Davis did not, on at least one important occasion. Some Northerners agreed. Some, on both sides, did not think it was valid. But this compact theory says nothing about why the South seceded. It only speaks to mechanism, claiming Southern states had the right to secede.

            Thus, in a literal and minimalist sense, the passage quoted above is accurate. In its "Declaration Of The Immediate Causes Which Induce And Justify The Secession Of South Carolina From The Federal Union," South Carolina does point out that the original thirteen states had voluntarily chosen to enter the Union and does claim secession as a state's right.

    [6]By late 1857 the Buchanan administration's newspaper, the Washington Union, favored extending Dred Scott to the free states! "What is recognized as property by the constitution of the United States, by a provision which applies equally to all the states, has an inalienable right to be protected in all the states." If Buchanan or Taney had accomplished this goal, free states would have ceased to exist, at least legally, although the article does suggest that local "sentiment" might still suffice to make slave owners feel unwelcome. The article goes on to denounce the ending of slavery in Northern states, which had mostly happened decades before, as "a gross outrage on the rights of property." (---, "Free-Soilism," Washington Union, 11/17/1857.) Buchanan's was indeed a pro-Southern administration.

    [7]James McPherson, "The Southern Textbook Crusade," in Alice Fahs and Joan Waugh, eds., The Memory of the Civil War (Chapel Hill: University of North Carolina Press, 2004), 64-78, reprinted as "Long-Legged Yankee Lies: The Lost Cause Textbook Crusade," in McPherson, This Mighty Scourge (NY: Oxford University Press, 2007), 102-03.

    [8]Emma Brown, "150 Years Later, Schools Are Still A Battlefield For Interpreting Civil War," Washington Post, 7/6/2015, washingtonpost.com/local/education/150-years-later-schools-are-still-a-battlefield-for-interpreting-civil-war/2015/07/05/e8fbd57e-2001-11e5-bf41-c23f5d3face1_story.html.

    [9]488 F. Supp. 1138.

Copyright James W. Loewen

Doonesbury as Documentary: Or, Comic Strip Imitates Life

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

Last Sunday's "Doonesbury" comic strip shows a high school teacher of U.S. History in Texas struggling to control his class while staying within the state guidelines about Southern secession. In two panels omitted from the Washington Post print edition, he first asks himself, "I have to teach this?" He answers, "You do, George. You have a family to support."

In class, he then asks his students a question: "So why did Texas secede?" He rushes to answer it: "Because of sectionalism, states' rights, and slavery, in that order," exactly as mandated by the new Texas guidelines. Recently here at HNN, I discussed this Texas requirement and a national textbook "by" Gerald A. Danzer that already teaches secession this way. 

A student rises to challenge him, by reading from Texas's "Declaration of the Causes Which Impel the State of Texas to Secede from the Federal Union." (This document and similar documents from other Confederate states are available in The Confederate and Neo-Confederate Reader.) Of course, what he reads is all about slavery. Texas's Declaration is all about slavery. 

The teacher stops his student from reading more from the document. 

The teacher then reminds his student to stick to the textbook. "What did I say about outside sources?" 

"That they're liberal," the student replies. "But this was written by racists!"

In February, 2005, I spoke to a large public audience in Columbia, South Carolina. A sizable contingent of neo-Confederates attended, drawn by an interview I had given to The State, Columbia's daily newspaper, stressing how I would be telling the truth about secession. My long (70-minute) talk included quoting from South Carolina's "Declaration of the Immediate Causes Which Induce and Justify the State of South Carolina to Secede from the Federal Union." Since South Carolina attacks states' rights throughout that document, no one who has considered it can honestly claim that the state seceded for states' rights. Like Texas's declaration, South Carolina's is all about slavery. 

The neo-Confederates sat quietly and attentively through my talk but then dominated the question period. I had no quarrel with that. They made short statements that did end with questions; if I could not answer, I should not have been speaking. 

Afterward, however, in the book-signing line, a woman came up to harangue me. She told me she had home-schooled her children to protect them from the influence of the Richland County Public Schools (which were under Dixiecrat/Republican control). Then she said, "I don't care what you say, the Confederacy seceded for states' rights!" 

"But, but what about the document?" I stammered, referring of course to South Carolina's "Declaration of Causes." 

"You can find documents to prove anything!" she replied. "That was a liberal document," she added, with a withering emphasis on "liberal." 

I sat speechless. Never before on the planet had South Carolina's "Declaration of the Immediate Causes Which Induce and Justify the State of South Carolina to Secede from the Federal Union" been called "a liberal document." Eventually, an organizer of the event came over to lead her away so the booksigning might continue. 

Garry Trudeau nailed it, this past Sunday. Texas now teaches history in the service of stupidity.

 Copyright James W. Loewen

Another Comic Strip Imitates (My) Life

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

Related Link: "Doonesbury as Documentary: Or, Comic Strip Imitates Life," by Jim Loewen

In last Sunday's Washington Post, the section "Book World" miraculously reappeared. "Book World" had come out every Sunday until the Post cut back several years ago, a retrenchment symptomatic both of the crisis facing newspapers and the dramatic slide in printed book sales. The occasion for its reappearance this week is the upcoming National Book Festival, sponsored by the Library of Congress and taking place September 5 at the Washington Convention Center.

Aficionados of books know that the two areas of book publishing that have not suffered declines parallel to those afflicting newspapers are "young adult fiction" (actually aimed mainly at teens and 'tweens) and "graphic novels" (many of which are nonfiction). On the back cover of the "Book World" section are three panels by cartoonist Stephan Pastis showing the characters from his comic strip "Pearls Before Swine," including the author/artist himself, celebrating Pastis's invitation to speak at the Book Festival.

Well, maybe "celebrating" is too strong. "Hey guys!" Pastis says to his characters as he rushes into the strip excitedly. "I got invited to speak as an author at the National Book Festival." In the second panel, Rat, famously dyspeptic, pours cold water on the occasion by asking, "Did their first fifty choices for speaker die?" 

Cartoonist Stephan Pastis is taken aback; maybe he wasn't the first choice for the National Book Festival. 

Pastis had not considered this possibility. But when I got invited to keynote the Second Biennial Writers Fair in Decatur, Illinois, my home town, way back in October, 2001, I did.

 I had vaguely been aware that Decatur, an industrial city of about 75,000 in the center of Illinois, had a biennial writers fair. I knew that Decatur's best-selling author, Richard Peck, had keynoted the first writers fair. Peck has now written a total of 41 books, mostly young adult fiction. At that point, 2001, they had sold perhaps ten million copies. (Now Peck is up to twenty million.)

I was acutely aware that Decatur's second-best-selling author was not me. My bestseller, Lies My Teacher Told Me, had sold only half a million as of 2001. Stephen E. Ambrose, not then dead, and not then tainted by plagiarism charges, was surely much better known and more widely read than I.

 When the book fair contacted me, I did not suggest that they should have asked Ambrose first. When I flew into Decatur, however, I learned that they already had. Conversing with my host as he drove me in from the airport, I asked, "Why didn't you engage Stephen Ambrose? Isn't he the second-best-selling author from Decatur?"

"Well, yes, he is," came the reply, "but we have the answer to your question. Stephen Ambrose charges $40,000 plus a private jet each way!"

"Gosh," I replied. "I saved you more than $38,000!"

"Yes you did," said my host, happily.

Shortly thereafter, Stephen Ambrose indeed died, just as his plagiarism scandal broke. Perhaps he remains, even now, more interesting than I. Nevertheless, I had a wonderful time at the Second Biennial Decatur Writers Fair.

Copyright James W. Loewen

The Tallest Mountain – The Silliest Naming – Reversed on August 31, 2015

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

Related Links

●   Ohio delegation blasts Mount McKinley name change

●  A Misnamed Mountain, a Misunderstood President By Lewis Gould

Since people probably reached Alaska before any other part of the Western Hemisphere, they probably named North America's tallest mountain thousands of years ago. They didn't call it Mt. McKinley.

Replacing Native American names with those of European Americans is a form of cultural imperialism. The practice declares that the new rulers of the landscape can afford to ignore what Native names mean and connote in favor of new names that typically have no resonance with what is named.

The mountain is Denali, "the great one." It never deserved to be named for William McKinley, who was never "the great one."

Low-profile conflicts have raged for many years between those who want to change the names of localities and geographic features back to their original Native names and those who want them to be named for European American people, towns, or words. To some degree this is a contest between Native Americans and European Americans, but European Americans are usually found on both sides of the arguments. The battles might also be characterized as between traditionalists and those desiring change, except that both parties claim tradition on their side. Denali, or Mount McKinley, dramatically embodied these disputes about names all across America, not only because it is such a spectacular place but also because the controversy at Denali has gone on for more than 40 years.

William A. Dickey named the peak, the tallest point in North America, Mount McKinley in 1896. Why he got to name it is hard to fathom. Dickey had come to Alaska spurred by discoveries of gold in Cook Inlet. With three companions, he made it to Talkeetna and saw Denali, "the great one" in the language of the nearby Tanaina Indians. According to C. H. Merriam, testifying before the U. S. Geographical Board in 1917, "The right of the discoverer to name geographical features has never been questioned," but Dickey was no discoverer. People had discovered the huge mountain thousands of years earlier. Even if only white people "discover," Russians saw it in the 1770s or 1780s and named it Bulshaia Gora, "big mountain." Even if only English-speaking white people "discover," British Captain George Vancouver saw Denali in 1794. Dickey was not even the first white American to see it; other Americans had preceded him by a quarter century.

 Dickey had no serious reason to name the mountain as he did. William McKinley had not yet been martyred when he received the honor; indeed, he had not even been elected president. Nor had McKinley ever been to the mountain or even to Alaska. William Dickey favored conservative fiscal policies, while most people in the West wanted to expand the amount of money in circulation by minting more silver coins and certificates. Dickey was irritated by arguments he had lost with "free silver" partisans on his trip and decided to retaliate by naming Denali after the gold-standard champion.

"The original naming was little more than a joke," according to George R. Stewart, author of American Place-Names. From the first, some people preferred the native name, and Dickey's frivolous reason for choosing McKinley gave them ammunition. Nevertheless, probably because he wrote about his trip in the New York Sun, Dickey's choice began to catch on. McKinley defeated William Jennings Bryan in 1896, so at least the mountain turned out to be named after a president, and, when McKinley was shot in Buffalo in 1901, after a martyred president.

Today, however, many Americans consider the Native name more melodious and object to "McKinley" on aesthetic grounds — as if the Mississippi River had been renamed for, say, Zachary Taylor. Others support Native efforts to gain more acceptance, including recognition on the landscape. "It's time we listened to the Native people of Alaska," declared Senator Ted Stevens of Alaska back in 1991. "This mountain is the largest in North America. It was named by the Natives long before we arrived."

Nationally, a lone congressman from Ohio prevented the renaming of the mountain for decades. Rep. Ralph Regula from Canton, William McKinley's home town, found a way to block any change. His aide told me, back in 1997, "The Board of Geographic Names won't change names so long as legislation on the subject is pending. Congressman Regula always has legislation pending." The legislation never got anywhere, but it sufficed to prevent action by the Board.

In 1975, Regula blocked a compromise proposed by the Alaska legislature to name the mountain Denali and leave the national park named for McKinley. Five years later the National Park Service agreed to a compromise Regula couldn't block: it changed the name of Mount McKinley National Park to Denali National Park, but the mountain stayed Mount McKinley. This resolution proved unstable, however. Finding its Native lobby more persuasive than Ohio's McKinley lobby, Alaska changed its name for the mountain to Denali, relegating the 25th president to the parenthetical statement, "(also known as Mt. McKinley)."

When the Board on Geographic Names was considering a proposal renaming the mountain in 1977, Congressman Regula testified, "This action would be an insult to the memory of Pres. McKinley and to the people of my District and the nation who are so proud of his heritage." But Americans aren't! That's the problem: most Americans don't rank William McKinley very high in the pantheon of presidents. They remember him, if at all, as a creation of political boss Mark Hanna, beholden to big business, and addicted to high tariffs. He also got us bogged down in a seemingly endless colonial war in the Philippines. Such facts did not deter Regula, who portrayed McKinley as "a champion of the working class" and credited him for "settlement of the long-standing Spanish-American conflict." McKinley was not a proponent of war with Spain but gave in to the war fervor in late April, 1898. That war lasted only three months and was allegedly anti-colonialist. America’s next war, against the Philippines, lasted until Teddy Roosevelt declared victory on July 4, 1902. As in Iraq today, that declaration proved premature; hostilities continued for eleven more years.

Naturally, the congressman's office claimed that higher principles, not mere local pride, motivated Regula to block renaming the mountain. "The congressman feels that a lot of money goes into maps," emphasized aide Barbara Wainman, "and names shouldn't be changed lightly." Moreover, she noted, if they win Denali, Native groups will want to change other names.

On that last point, Wainman is right. Native groups do want to change other names all across the American landscape. American Indians are winning some of these battles. Memphis renamed DeSoto Bluff "Chickasaw Heritage State Park." "Custer's Last Stand" is now "The Little Bighorn Battlefield." The U.S. Board on Geographic Names adopted a policy in 1990 to favor names derived from American Indian, Inuit, and Polynesian languages.

The original version of this essay was the lead geographic chapter in my book, Lies Across America, which came out in 1999. I ended it, “Eventually Natives will outlast Ralph Regula and rename Denali.” Now that day has come. Regula retired in 2009, after 36 years in Congress. Other Ohio Congressmen continued their opposition to a name change, however. Today, John Boehner manifested the usual Ohio Republican opposition to the announced name change. His district lies considerably west of Regula’s, but hey, McKinley was a Republican and from Ohio, so he must be supported.

It remains to be seen whether Boehner can undo today’s action by the Department of the Interior, especially since Republican leaders from Alaska support the original name, Denali, over McKinley.

Copyright James W. Loewen

The Monument to White Power that Still Stands in New Orleans

Sociologist James W. Loewen is the author of Lies My Teacher Told Me. This essay was modified from a chapter in his book, Lies Across America, which was first published in 1999. 

            "The Central Theme of Southern History," according to the Southern historian U. B. Phillips in a much-quoted article by that title, has been "a common resolve indomitably maintained — that it shall be and remain a white man's country." Not only in the South but all across America, the landscape commemorates this mentality still, commemorating white racists while obliterating the memory of Americans who fought against white supremacy.

            In 1891 at the foot of Canal Street in New Orleans, where the business district meets the Mississippi River, the white civic leadership of New Orleans erected the most overt monument to white supremacy in the United States. The monument celebrates the White League in what it called "The Battle of Liberty Place." Its checkered history offers something of a barometer showing the relative power of blacks and whites in this part of America and the importance each group places on control of the landscape.

            The monument celebrates a chilling battle: an armed insurrection against the city and state governments that took place in 1874. During Reconstruction, a biracial Republican coalition had won election to most state and city offices. The White League, composed of white New Orleans Democrats, sought to replace those officials with their own men. They had planned their takeover at the elite Boston Club. Their platform made their objective clear: "Having solely in view the maintenance of our hereditary civilization and Christianity menaced by a stupid Africanization, we appeal to the men of our race . . . to unite with us . . . in an earnest effort to re-establish a white man's government in the city and the State."

            On the morning of September 14, 1874, thousands of white Democrats gathered at a statue of Henry Clay then located in the Canal Street median at St. Charles Street. After incendiary speeches, at four in the afternoon about 8400 whites attacked 3000 black members of the state militia, 500 mostly white members of the Metropolitan Police, and 100 other local police officers, all under the command of Gen. James Longstreet. Longstreet had been a Confederate general; indeed, he was Lee's senior corps commander at Gettysburg. After the war, he came to believe, in accord with the 14th and 15th amendments, that African Americans should have full rights as citizens, including voting rights.

            In fifteen minutes, the White Leaguers routed Longstreet's forces, capturing Longstreet. Eleven Metropolitans and their allies were killed and 60 wounded. Twenty-one White Leaguers were killed, including two bystanders, and 19 were wounded. White League officials then took charge of all state offices in New Orleans and appealed to President Ulysses Grant for recognition.

Engraving in Harper's Weekly, October, 1874

            Grant refused to recognize the new group, and a few days later, Federal troops restored the Republican governor to office. The League had no choice but to vacate the government posts they had seized. However, the "Battle of Liberty Place" was an important event that presaged the end of Reconstruction, which white Democrats accomplished in 1876-77 using similarly violent methods.

            In 1882, with the city now under white Democratic control, the median strip at the foot of Canal Street was renamed "Liberty Place." The Orwellian name celebrates the liberty to suppress black voting that racist whites had finally seized in 1877. The City Council passed an ordinance to erect a monument there commemorating the events of September 14, 1874, and for several years on September 14 white supremacists paraded through the streets to the site where the White League had met. But crowds dwindled as the event receded in memory, and no monument was erected.

            In 1891, another "racial crisis" hit New Orleans. Nineteen Italian immigrants, accused of the 1890 Mafia-style slaying of the police chief, were acquitted. White League veterans called for a mass meeting, and on March 14 a huge crowd gathered again at the Clay monument on Canal Street. "Not since the 14th day of September 1874 have we seen such a determined looking set of men assembled around this statue," shouted a White League descendant. "Then you assembled to assert your manhood. I want to know whether or not you will assert your manhood on the 14th day of March." The mob responded by marching on the city jail and shooting nine of the prisoners, dragging two others outside, and hanging them in view of the crowd. This mass lynching caused an international incident that did not end until the United States government paid Italy an indemnity of about $25,000. It also gave a boost to the movement to erect a monument at "Liberty Place."

"NOLAWhiteLeagueMonumentByTracks" by Infrogmation of New Orleans - Photo by Infrogmation. Licensed under CC BY 2.5 via Commons.

            That year an obelisk went up, supported by a shaft and four columns, inscribed with the names of sixteen White Leaguers killed in the battle and the date, September 14, 1874. Perhaps no additional text appeared because white Democrats felt they had better mute any overt expression of white supremacy. African Americans were still voting, and Democrats were still wary in 1891 of upsetting Northern Republicans, who had recently almost passed a strong voting rights bill.

            In 1932, in a reflection of the further deterioration of black rights, the monument acquired an overtly racist text. By then, few African Americans could vote, and the United States government was clearly not going to do anything to help them, so the white supremacist regime was now more secure than in 1891, when the monument had been erected. Upper-class white citizens of New Orleans faced a new threat, however: Huey Long seemed to be putting together a coalition of white farmers and workers, including also those blacks who could vote. Again, the New Orleans elite appealed to white supremacy to ward off the threat. A commission of white citizens appointed by the Mayor added the following inscription to the White League monument:

[Democrats] McEnery and Penn having been elected governor and lieutenant-governor by the white people, were duly installed by this overthrow of carpetbag government, ousting the usurpers, Governor Kellogg (white) and Lieutenant-Governor Antoine (colored). United States troops took over the state government and reinstated the usurpers but the national election of November 1876 recognized white supremacy in the South and gave us our state.

            

As segregation began to come under attack after World War II, the monument was again used as a symbol of intolerance. Disgusted by the civil rights plank in the Democratic platform and by President Harry Truman's desegregation of the armed forces, in 1948 racist Democrats ran "Dixiecrats" Strom Thurmond of South Carolina for President and Fielding Wright of Mississippi for Vice President. Their supporters rallied at the monument, and they carried the state. After the Supreme Court outlawed school segregation in 1954, the monument was invoked to remind white Louisianans of what allegedly took place during Reconstruction, the last time the federal government "meddled" in Southern affairs. White supremacists rallied repeatedly at the monument in the 1950s and 1960s.

            By the 1974 centennial of the "Battle of Liberty Place," however, African Americans again were voting in New Orleans, thanks to the Civil Rights Movement and the 1965 Voting Rights Act. New Orleans businessmen even hosted the national NAACP meeting that year. Now the city government felt compelled to add a "counter-marker" next to the monument, which said:

Although the "Battle of Liberty Place" and this monument are important parts of New Orleans history, the sentiments in favor of white supremacy expressed thereon are contrary to the philosophy and beliefs of present-day New Orleans.

Over the 1932 white supremacy language, the city cemented marble slabs.

            When I saw the monument in April, 1988, however, it was clear that the continuing battle of Liberty Place was far from resolved and the "philosophy and beliefs" of New Orleans were still in dispute. White supremacists had removed the marble slabs, and the 1932 inscriptions showed through the thin cement. African Americans had covered its phrases with spray paint: "Black Power" and "Fuck You White People."

            A year later, major street construction on Canal Street gave New Orleans an excuse to remove the obelisk "for safe keeping." For two years it languished in a warehouse. Then a New Orleans druggist filed suit to force it back up on the landscape. His lawsuit led to interesting debates. Supporters of the monument claimed they weren't racist but merely good historians, while its detractors were "revisionists," trying to erase history. The monument's opponents pointed out that knowing about the incident was different from celebrating it. Some politicians proposed what they termed a compromise: instead of re-erecting the obelisk at its prominent place at the foot of Canal Street, place it in a "more appropriate location" in a white residential neighborhood — implying that different histories are appropriate for different races and white supremacy memorials are OK in white areas.

            In February, 1993, the city finally re-erected the monument at the foot of Iberville Street, only a block from the old location but out of the way, behind a parking garage. First workers obliterated the 1932 lettering. A new inscription honored those "on both sides of the conflict" and concluded vaguely, "A conflict of the past that should teach us lessons for the future.” Fifty supporters of the obelisk attended the rededication ceremony in March, 1993, while almost as many protesters demonstrated against it. Speakers, including former Ku Klux Klan grand wizard David Duke, strained to be heard over demonstrators' shouts and spirituals. A local reporter recorded the event:

Organizers began the ceremony . . . by waving a Confederate flag alongside American and Louisiana flags. That led to shouts of "Down with white supremacy!" from protesters, who tried to push their way to the monument, but were held back by eight police officers.

In the melee, officers put protester Avery Alexander in a chokehold, even though he was a state representative and also 82 years old.  Afterward, Duke said, "We may be a minority in this city, but I tell you, we still have rights." Alexander called the monument "a badge of slavery" and said "it should be removed." In all, one person attending the ceremony and four people protesting it were arrested.

            Despite the tranquil new text, the controversy continued. The druggist sued to remove the new wording. Vandals have torn out two of the four columns that, along with a central shaft, support the obelisk. In retaliation, a man representing the "Monument Preservation Army" put white paint on a bust of Martin Luther King, Jr., in February, 1993. He said he would continue to deface black monuments "until they leave ours alone." Soon after that, African Americans asked the city to remove the White League monument, saying it met the criteria for a "public nuisance." Later that year, the city's Advisory Committee on Human Relations held hearings and found that it did meet those criteria,[1] since it:

             — honors those who took part in killing city or state public employees;

             — suggests the supremacy of one ethnic, religious, or racial group;

             — praises actions wrongfully taken to promote ethnic, religious, or racial supremacy of one group over another;

             — has been or may become the site of violent demonstrations that threaten life or property; [and]

             — will present an unjustifiable expense to maintain or secure.

David Duke, self-professed admirer of Adolf Hitler, enlivened one hearing by protesting that the city was using "Nazi tactics" to remove the monument.

            Later that summer the city council ordered the monument removed to a museum. No major museum volunteered to house it, however, and more legal battles loomed immediately. The obelisk still stands at the foot of Iberville Street.

            However, the second battle of Liberty Place, the battle over the monument, is far from over. On September 2, 2015, the Vieux Carre Commission, which controls aesthetics in the historic French Quarter, voted 5-1 to remove the obelisk. The New Orleans City Council must still approve the removal, but Mayor Mitch Landrieu has called for it to go, along with statues of Confederate leaders.

            Since it is one of the few monuments to the Nadir of Race Relations on our landscape, as well as one of the few sites that mentions Reconstruction, I hope it will wind up on display in a museum. It is also the most overt monument to white supremacy in the United States – and that’s saying a lot! The nearby Louisiana State Historical Museum has shown an increasing willingness to face the state’s white supremacist past, so it would be a likely location.[2]

  [1] It should; the public nuisance ordinance had been written with the monument specifically in mind.

  [2] Lawrence Powell, "A Concrete Symbol," Southern Exposure, Spring, 1990, 41; John Wilds, Charles Dufour, and Walter Cowan, LA Yesterday and Today (Baton Rouge: LA State UP, 1996), 57-58, 185-89; Herbert Aptheker, Afro-American History: The Modern Era (NY: Citadel, 1971), 18; ----, "Anniversary of Battle Ending Carpetbag Reign to Be Marked Wednesday," New Orleans Times-Picayune, 9/11/32; Judith K. Schafer, "The Battle of Liberty Place," Cultural Vistas, v. 5 #1 (Spring, 1994), 9-17; Edward J. Cocke, Monumental New Orleans (Jefferson, LA: Hope Publications, 1974), 16-17; Bruce Eggler, "Judge Lets Liberty Statue Gather Dust in City Storage," Times-Picayune, 2/20/92; Kevin Bell, "Council Takes Step Against Monument," Times-Picayune, 4/16/93; Michael Perlstein, "5 Arrested at Monument Ceremony," Times-Picayune, 3/8/93; Eggler, "Liberty Statue is Replaced," Times-Picayune, 2/11/93; ---, "Rev. King Statue Spray-Painted White," Times-Picayune, 3/22/93; Susan Finch, "Duke Blasts 'Nazi' Tactics to Remove Monument," Times-Picayune, 6/16/93; Finch, "Liberty: Store Monument, Panel Says," Times-Picayune, 7/1/93; Cain Burdeau, "French Quarter Agency Backs Removal of White League Monument," ABC News, abcnews.go.com/US/wireStory/french-quarter-agency-backs-removal-white-league-monument-33502248.

Copyright James W. Loewen

Once More, A Comic Strip Imitates (My) Life

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

In the Washington Post for Tuesday, November 17, 2015, Stephan Pastis's comic strip "Pearls Before Swine" shows two characters on barstools talking about history. Pig tells Goat he has a question about the Civil War. Goat says, "Great. I love history." We pick up here:

            I knew it was supposed to be funny, but I couldn't laugh. Once again, a comic strip had imitated my life.

            In the fall semester of 1993 at the University of Vermont, I was trying to finish my book, Lies My Teacher Told Me, which I had been working on for years. I decided to teach a course that I titled "Lies My Teacher Told Me: Issues in Secondary Social Studies Education." I could hand out some of my chapters and get feedback; I could also get students involved in wrestling with some of the issues.

            Perhaps owing to its flamboyant title, the seminar was oversubscribed. I wound up with more than 40 students — juniors and seniors majoring in history, the social sciences, and education.

            On the first day of class, I gave out a quiz. I often do this, seeking to learn more about what my students know and don't know. One question asked, "The War in Vietnam was fought between ________ and ________." I wanted to see how many students said "between North and South Vietnam" compared to "between the United States and Vietnam." To my consternation, 22% of my students replied, "between North and South Korea!"

            History and social studies education in the United States isn't working. I'm not the only person to say that. About the Civil War, Diane Ravitch found that about half of all high school graduates don't know in which half-century it was fought. More than 75% don't know why the South seceded, claiming Confederates favored states' rights or were upset about tariffs and taxes.

            Still ... North and South Korea?

            At the next class period, I denounced the 22% (not by name, of course). They pointed out that they were not born when the War in Vietnam ended, which was quite true and another way of making me feel 97 years old. I replied, "Well, what about the Civil War? Shouldn't you know something about that, even if you weren't alive?"

            "Well, yes," they replied, reasonably. But then they pointed out that their teachers in U.S. History in high school never got anywhere near the Vietnam War. Their courses petered out somewhere around the Korean War.

            One student even wanted half credit for getting "north" and "south" correct!

            "What if you had put North and South Dakota?" I replied.

            That's why I couldn't laugh at "Pearls Before Swine." 

Copyright James W. Loewen

BLM and Me

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

            In early October, 2015, I flew from Washington, DC, to Seattle, to speak at the annual meeting of the National Association of State Judicial Educators. A little incident marred the beginning of my trip, in Reagan Washington National Airport. I tell it in present tense to give some sense of how it unfolded to me.

            The TSA line is longer than I have ever seen at National Airport. It zig-zags through the holding area, guided by stanchions and cloth tapes, then shoots in a straight line across half the length of the terminal. I walk to the end of the line. Shortly thereafter a black man joins the line ahead of me, standing behind a family group with a cocker spaniel that is not really in the line, just accompanying someone who is. I wait to see if he is connected to the group or the person behind, then conclude he simply butted in.

            For some reason, I often take it upon myself to deal with these situations. When I'm in a generous mood toward myself, I credit my "improve the world" mindset. Those not so generous might simply call me bossy. Either way, I walk up past the four or five people between me and him, gently touch his arm, and say, "Sir, this is one line."

            "First of all, speak to me," he responds angrily. "Don't touch me."

            "I did, I called you 'sir,' " I reply.

            "And second," he goes on, "there are two lines," gesturing ahead.

            "It's a single line," says a man standing next to me. The interloper looks again and realizes we are right. Wordlessly, he starts turning his carry-on around to go to the end of the line.

            I go back to my spot in the line. As he passes me, he repeats angrily, "Don't touch me." "Yes sir!" I reply, with some sarcasm.

            I'm steamed. What flashes into my mind is one of the many videos of bad police behavior toward African Americans that have gone viral throughout the summer and autumn of 2015, this one from St. Paul. An African American man, Chris Lollie, was sitting in one of the "skyways" — enclosed passageways that connect downtown buildings so pedestrians don't have to go outside in Minnesota winter weather. Lollie recorded his own arrest and put it on YouTube, titled "Black Man Taken to Jail for Sitting in Public Place." Lollie was waiting for his children to be dropped off by a bus from their daycare center. As the police begin to take him into custody, he says, "Please don't touch me" repeatedly. Nothing happened to the police, of course, and nothing tragic happened to Mr. Lollie either, beyond the disruption of life and dignity caused by an unjustified arrest.

            I think this black man at National has placed me in the role of the harassing police officer simply for touching him lightly to get his attention. This makes me angry, because I've done nothing wrong. Owing to BLM, black folks are increasingly touchy, I conclude, deliberately using the pun to myself.

            Then I recall my only prior unpleasant interaction at Reagan National Airport, maybe three years earlier. I had walked up to a Dunkin' Donuts kiosk to see if they had muffins, not wanting to waste my time in line if they did not. They did, so I turned toward the back of the line. Just then a white man in the line snarled at me, "You'd better get to the back of the line!" Stunned, I continued walking toward the back of the line. While waiting, I noticed where my antagonist sat, after completing his purchase. After eventually buying my muffin, I walked over toward him, planning to say I was sorry to have upset him but was only trying to learn if they had muffins. He saw me coming and waved me off angrily. I looked at him but didn't approach.

            Recalling this incident helps my frame of mind. Just as I could never generalize from that angry white man to all whites (because I myself am white), I should not generalize from one angry black man to all blacks. I know better; I teach better.

            About then, a black woman a couple of persons in front of me darts out of line to get a spoon to eat the yogurt she was carrying, having been warned that she could not bring it through security. (No one knows why not!) I make a point of saying "that's fine" as she asks the person in front of me if she can step out of the line for that moment.

            When my antagonist behind me finally enters the zig-zag part of the line, the stanchions guide us so we face and come near each other. We stare at each other wordlessly. The next time that this happens is near the scanning area. I'm not looking in his direction but rather out a window, and as he passes me, in a line going to a different scanner, he touches me. "I want to apologize for what happened back there in line," he says. Immediately I put forth my hand to accept and we shake hands. "I'm a vet," he continues, "43 years in the Marine Corps, and I suffer from PTSD. In fact, I'm on my way to a treatment center." I put my hand on his forearm in sympathy, then snatch it away as if I'd touched a hot stove. "I'm sorry," I say, for touching him. "That's all right," he says. I then tell him I taught for eight years at a black college and have spent my life in race relations ever since. "This wasn't about you," he replies. "This wasn't personal. This is on me." We shake hands again, and I reach security.

            Afterward, the woman who ate the yogurt asks me, "Did he apologize?" I realize she overheard the original incident. "Magnificently," I reply.

            What do I conclude from this encounter?

            First, that the "Black Lives Matter" movement touches us all. I have been in favor of the movement, have spoken positively about it, and have signaled with a raised fist or a "thumbs up" on those few occasions when I have encountered it live. But I realize now that BLM may have affected the ways I think categorically about African Americans. And, occasionally, I do think categorically about African Americans. I think most African Americans occasionally think categorically about white folks, too.

            A word about how I have been thinking about "Black Lives Matter": I think BLM is elliptical for "Black Lives Matter Too." Thus the phrase intrinsically implies that white lives have always mattered; African Americans want the same regard for their humanity, dignity, and very existence that whites have always enjoyed. Whatever one might think about white privilege, there can be no question that it exists at the moment of confrontation with a police officer, often even a black police officer.

            Yet that's too simple too. None of us, of any race, feels at ease when a police car is on our tail, even if we know we are not speeding and have done nothing wrong. We know the officer can always make something up. Consider Sandra Bland, who pulled over for a police officer and was arrested for not signalling before she pulled over! Anyone might respond identically. To be sure, Ms. Bland did not enjoy white privilege. Consider then the white guy in Utah, Dillon Taylor, shot for walking away from an officer. Apparently he could not hear the command to stop because he had his ear buds blasting at high volume. Like Michael Brown, this young man was no angel. But as with Michael Brown, his character and prior actions were not the issue. And as with Michael Brown, although the victim had no gun and the officer was in no danger, the officer was cleared of wrongdoing.

            The unease that we whites have vis-a-vis the police may help explain the widespread sympathy for the BLM movement in white America, especially among young adults. Yet we know we have white privilege, particularly at the moment of arrest. We agree with Chris Lollie's expressed view that had he been white, he would not have been handcuffed and jailed in St. Paul.

            Regardless of the race or character of victim or officer, the issue remains reckless police behavior, sometimes more reckless vis-a-vis people of color, often more reckless vis-a-vis the poor and homeless. Perhaps more careful police recruitment coupled with better police training can help, but the problem is systemic, so it cannot be cured merely by making changes within or among individual officers. We need an independent agent to investigate questionable police conduct and bring some accountability to bear. Everywhere, police officers are crucial work partners of prosecutors, so the latter cannot easily investigate the former impartially. The recent shooting of Laquan McDonald in Chicago, not really “recent” since the prosecutor sat on it for a year, shows that. The Department of Justice provided an answer in Ferguson, Missouri, and will now investigate Chicago, but it can hardly police the police in every jurisdiction. Civilian review boards with teeth, including investigative power, might provide one answer.

            With such a system in place everywhere, we might all breathe easier (and yes, I am referencing "I can't breathe") when tailed by police. We and the officer would then know that unprofessional police behavior is likely to impact the officer, not just us. Then, as our justified paranoia about the police decreases, even in African American and Native American communities — those most heavily hit by unprofessional policing — unjustified paranoia might also decrease, making ordinary encounters across racial lines easier for all of us.

Copyright James W. Loewen 

]]>
Fri, 19 Apr 2024 07:39:22 +0000 https://historynewsnetwork.org/blog/153697 https://historynewsnetwork.org/blog/153697 0
The Stethoscope as Synecdoche

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

            In the first Sunday Washington Post of the new year, Lenny Bernstein (not related to the conductor/composer) has a front-page article, "After 200 Years, Time to Check the Pulse of a Medical Icon." He begins: “The stethoscope is having a crossroads moment. Perhaps more than at any time in its two-century history, this ubiquitous tool of the medical profession is at the center of debate over how medicine should be practiced.” 

            "Great!" I think. Long ago — I don't know where — I read about how the stethoscope was not an example of the progress of Western medicine, but of its regress. The author noted that a stethoscope does not clarify or amplify the sounds of the lungs breathing, the heart beating, or of whatever body part is being examined. On the contrary, asking the patient to take his/her clothing off that part and then putting one's ear flatly against it is clearer than any traditional stethoscope. My dad was an M.D. I had a stethoscope handy. I tried it, compared to my naked ear. My ear was much clearer and louder. 

            The stethoscope was not invented to be clear or loud. On the contrary, René Laennec, a French physician, was "reluctant" to place his ear on a patient's chest because she was young, female, and buxom. Back then (1816), few female doctors existed, so this discomfort was probably widespread on the part of physicians — and patients too. I suspect some doctors were also squeamish about placing their ear directly onto men's chests, abdomens, and worse. 

            A stethoscope helps a doctor look and feel more professional, with professional tools. It works worse, but the physician can feel better about himself[1] while using it. It reminds me of groundskeepers at the University of Vermont in the mid-'70s who used gas-powered blowers to get the leaves off sidewalks in the fall. Back then at least, blowers were heavy and inefficient. Using them took longer than vigorous sweeping with a push broom would have. But pushing a broom has no status. Operating machinery does. Similarly, the stethoscope has become a badge of office for physicians, to the point where they leave them around their necks or looped out of white coat pockets even when they are unlikely to need them. 

            Sociologically, the stethoscope allows the physician to distance himself from the patient, actually as well as figuratively. As Richard Selzer, the surgeon and renowned author of Mortal Lessons, put it, "The entire medical world continues to pay homage to Laennec for his gift of space interpersonal."[2] Stethoscopes also let doctors put a literal up/down dimension onto their patients, even patients who happen to be eminent artists, jurists, or executives. We have all had the experience of doctors calling us by our first names while we are to call them "Doctor _____" in return. The stethoscope is part of this process — sort of a synecdoche. 

            Laennec felt the need to remove the sexual aspects of his patients and to professionalize listening. In "Behavior in Pubic Places: The Sociology of the Vaginal Exam," sociologist James Henslin and obstetrical nurse Mae Biggs describe how male doctors prompt transitions within their patients "from person to pelvic" when doing pelvic examinations.[3] Again, the motivations were to remove the sexual aspects and professionalize examining — laudable goals probably desired on both ends of the speculum. The entire American uphill "delivery" system blossomed, peaked, and withered during the twentieth century. It similarly professionalized a natural process, childbirth, complete with stirrups, forceps, and anesthesia. 

            The stethoscope symbolizes the continuing trend in Western medicine to treat the illness, not the patient. As synecdoche, it stands for all the ways we now do this. Similarly, the forceps and anesthesia for childbirth convert the baby into a medical problem; the entire twentieth-century delivery protocol rested on an analogy between foetus and huge stomach tumor. 

            This practice has had success. Antibiotics do wipe out infections, even without taking a history or listening to an organ. Putting a stethoscope onto a chest objectifies that chest in a way that putting an ear on it does not. Doctors already take courses on how to gain distance from patients to get over their deaths and move on. Patient death is hard on doctors. Who am I to say they should get closer to their patients? 

            At the same time, Western medicine mislaid some things along the way. We may not all get fifteen minutes of fame, but fifteen minutes is all most of us get of a doctor's time, no matter what is wrong with us. On top of that, we usually see a different doctor each time, at the massive bureaucracy or hospital to which we go. The impersonality of it all — the social distance — causes failures of care as well as successes. Intangibles like trust and even humor facilitate the flow of information in both directions that leads to successful treatment. The placebo effect — better known as the body's ability to heal itself — also depends on a human-to-human bond. 

            It turns out that Lenny Bernstein's article in the Post missed the role of stethoscopes in inaugurating and symbolizing this process of distancing. Today, medicine has moved so far away from the person-to-person interaction of Laennec's time, in 1816, or even that of my childhood in 1950, that the stethoscope can now be viewed as a positive thing. Rather than separating the doctor from the patient, Bernstein says the stethoscope "narrows the physical distance" [my italics] and "compels human touch." In the context of medical practice today, surely he's right. Recently I saw my physician because my big toe was tender and inflamed. He never got within six feet of it or me.[4] When my wife broke her pelvis and two ribs, her doctors examined only her x-rays and MRI images, not her. As a result, they misdiagnosed her at first.[5]  

            Now stethoscopes are going higher-tech, letting doctors distance themselves still farther. Portable ultrasound imaging devices no bigger than cellphones are on the way. Perhaps the next step: a stethoscope app for our cellphones that sends our sounds to the doctor's office. Just as we no longer go to the library to read a journal article, we'll no longer go to the doctor's office to be examined. 

            As Elazer Edelman points out in the New England Journal of Medicine, "a stethoscope exam is an opportunity to create a bond between doctor and patient." Bernstein closes his essay with a quote from Edelman: "You can't trust someone who won't touch you."

    [1]I think this discussion is probably gendered, so I'll leave it "himself."

    [2]Richard Selzer, The Exact Location of the Soul (NY: St. Martin's Picador, 2001), 153.

    [3]James Henslin and Mae Biggs, "Behavior in Pubic Places: The Sociology of the Vaginal Exam," in Henslin, ed., Down to Earth Sociology (NY: Simon & Schuster Free Press, 2007), 229-41.

    [4]Luckily, my toe cleared up.

    [5]Luckily, she got better.

Copyright James W. Loewen

]]>
Fri, 19 Apr 2024 07:39:22 +0000 https://historynewsnetwork.org/blog/153711 https://historynewsnetwork.org/blog/153711 0
Reconstruction: The Sixth Myth

            Earlier this week, the Washington Post printed my newest contribution to its ongoing "Five Myths about ..." series, "Five Myths about Reconstruction." I wrote it because the United States is entering the sesquicentennial of Reconstruction.

            In my previous battles with Washington Post editors, the editing has been draconian. So this time I prepared: I sent them six myths. In the history of the world, the Post had never printed six myths — it is, after all, a "five myth" series — but I figured the editor was sure to bloody up at least one myth so it could no longer remain standing.

            I was right. The myth that perished in the process was: "Abraham Lincoln would have disapproved of Reconstruction." Here it is, in its original form:

            Some textbooks used today in high schools North and South still portray Reconstruction as a time when "vindictive Radicals" in Congress, who were "bitter against the Southern rebels" and "wanted to punish white Southerners," took control of Reconstruction away from Andrew Johnson, who was just trying to follow Abraham Lincoln's dictum, "With malice toward none." The quoted words come from the 2005 edition of A History of the United States, "by" Daniel Boorstin, former Librarian of Congress, and Brooks Mather Kelley, former Archivist at Yale.

            Of course, no one can say for sure what President Lincoln would have done, had he survived. But in his last two public speeches, he put himself on record as favoring the key "Radical" demand. Walking through Richmond on April 4, 1865, the day after U.S. troops had taken it, Lincoln drew an enormous crowd of excited spectators, mostly African American. "[A]s long as I live," he told them, "no one shall put a shackle on your limbs, and you shall have all the rights which God has given to every other free citizen of this Republic." A week later, he repeated these sentiments from the White House balcony. In this audience was John Wilkes Booth, who snarled to his companion, "That means nigger citizenship. Now, by God, I'll put him through. That is the last speech he will ever make." So it seems likely that Lincoln would have supported the 14th Amendment, calling for equal rights, and the 15th, calling for voting rights.

This political cartoon from Puck shows the stereotypical view of Reconstruction that some textbooks and Hillary Clinton still espouse.

            Although it was on my "top five" myths list, my WaPo editor demurred on two grounds. First, he did not think that I had rebutted the myth effectively. After my last sentence above, he wrote: "Does it necessarily follow, though, that [Lincoln] would have supported the methods the Radical Republicans used in order to implement those amendments?"

            I inferred that the editor still suffered from the very myth that he was editing — that during Reconstruction, "vindictive Radicals" were "punishing" the "defeated South." (Of course, black Southerners were hardly defeated. Neither were Unionist Southerners, and there were a lot of them, not just in West Virginia but also in the Ozarks, the Piney Woods in Mississippi, and Appalachia — indeed, all over the South, even in Richmond, its capital.)

            The main policies that Radical Republicans foisted on "the South" were free and fair elections that allowed adult males to vote without regard to race. From the standpoint of white supremacy, that was indeed vindictive. White supremacy was at something of a low ebb, however, owing in part to the military prowess shown by some 200,000 African Americans during the Civil War. So, to my editor's question I replied:

            What methods? Requiring Southern states to let African Americans vote as a condition of reconstituting their governments? Well, note the foregoing quotes by A.L.

I also conceded, "We cannot know, of course," since Booth had ended Lincoln's presidency. But I did not think I had to prove that Lincoln would have supported black enfranchisement, only that the assertion he would have disapproved of it was unfounded.

            My editor had a second criticism of my alleged myth: he didn't think that anyone still believed it, other than Boorstin and Kelley, the authors of one old-fashioned textbook. And both of them were dead!

            It turns out that Daniel Boorstin and Brooks Mather Kelley did not write "their textbook" when they were alive, either, as I got Kelley to admit back in July of 2006. That's why I had placed quotation marks around "by" in my original draft. Since it does not depend upon its "authors," A History of the United States came out again in 2005, with all prior copyright dates expunged, to look younger. It remains in print, though we can hope that Pearson/Prentice-Hall may now give it the euthanasia it has so long deserved.

            So I added this sentence: “The American Pageant, likewise, has Johnson ‘agreeing with Lincoln’ about Reconstruction policy.”

            Still, my editor was not convinced. "Is there another example," he wrote, "of someone suggesting Lincoln would have opposed Reconstruction that we can add here?" I didn't have a truly current example at my fingertips, so we went to press without this myth.

            The next day (Monday, January 25, 2016), in the Democratic presidential town hall, Hillary Clinton supplied the example. She repeated the myth about Lincoln and Reconstruction:

            You know, he was willing to reconcile and forgive.  And I don't know what our country might have been like had he not been murdered, but I bet that it might have been a little less rancorous, a little more forgiving and tolerant, that might possibly have brought people back together more quickly.

            

Actually, Andrew Johnson forgave the white Confederate elite for seceding. He tolerated, even encouraged their newly formed state governments when they reinstated white supremacy in the form of vicious "Black Codes." He did not "rancorously" require them to let African Americans vote, own property, etc. The result? Ex-Confederates concluded that they had a green light to impose serfdom upon African Americans. Presidential Reconstruction emboldened ex-Confederates to contest Congressional Reconstruction, which they did by violence and fraud until finally they ended it in 1876-77.

            Clinton is not to blame for having learned the old Dunning School myths about Reconstruction. They were taught to her in about 1962. She also doubtless imbibed them from Profiles in Courage, "by" John F. Kennedy. She is to blame for not having taken time to question the mythology she absorbed in her youth. Whoever wrote Boorstin and Kelley's textbook and whoever wrote The American Pageant are to blame for repeating these canards half a century later. And Boorstin, Kelley, and the "authors" of The American Pageant, David M. Kennedy, Lizabeth Cohen, and Thomas A. Bailey, are to blame for not writing or even reading "their" books, to see if they still tell old myths about Reconstruction.

]]>
Fri, 19 Apr 2024 07:39:22 +0000 https://historynewsnetwork.org/blog/153726 https://historynewsnetwork.org/blog/153726 0
Lies the Neo-Confederates Told Me            

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

Recently Amazon.com listed a new "book," Lies My Teacher Told Me: The True History of the War for Southern Independence, by Clyde Wilson. As the author of the "original" Lies My Teacher Told Me, whose subtitle is "Everything Your American History Textbook Got Wrong," I was naturally intrigued.

            It is not likely that Wilson, professor emeritus of history at the University of South Carolina, was ignorant of my book when he chose his title. Lies My Teacher Told Me has sold more than a million and a half copies, making it the best-selling book by a living sociologist. Moreover, Chapter 6 treats secession and other unseemly aspects of the Confederacy, putting me clearly in the historical camp that Wilson despises. He calls it "the current fashion in historical interpretation."

            So I suppose I'm flattered that Wilson has chosen to appropriate my title for his "book." But not really — because Wilson's "book" gives history a bad name. Book titles are not copyrightable, so I shall not sue, but I do want to give my opinion of this "book" that sounds so familiar. 

            First, let me explain why I keep calling it a "book," complete with quotation marks. Lies My Teacher Told Me: The True History of the War for Southern Independence is only 38 pages long. So far as I can tell, most of it is a single essay that Wilson published two years earlier on the website of the semi-clandestine Abbeville Institute.[1]

            Then there is Wilson's level of scholarship. He quotes not a single word from any secession document — indeed, from any source other than Robert E. Lee's farewell address to his troops at Appomattox. From that speech he quotes seven words: Lee's praise of the "valor and devotion" and "unsurpassed courage and fortitude" of Confederate soldiers. I have no quarrel with praising those qualities of the men. Grant paid them the same tribute, calling them "a foe who had fought so long and valiantly, and had suffered so much for a cause, though that cause was, I believe, one of the worst for which a people ever fought."

                  Wilson will have nothing to do with that last phrase, however. Instead, he claims, "Although their cause was lost it was a good cause and still has a lot to teach the world today." He then actually asserts that the South Carolina delegates seceded on behalf of states' rights! In Wilson's words, "the Union was no longer to their benefit but had become a burden and a danger. They said: We have acted in good faith and been very patient. But obviously you people in control of the federal government intend permanently to exploit our wealth and interfere in our affairs."

                  He does not have to contend with why South Carolina's leaders said they seceded, because he does not quote them. In fact, South Carolina's leaders seceded because they were upset with states' rights. In the key document, “Declaration Of The Immediate Causes Which Induce And Justify The Secession Of South Carolina From The Federal Union,” adopted on Christmas Eve of 1860, delegates to the South Carolina secession convention made this clear. We are seceding, they wrote, because “fourteen of the States have deliberately refused for years past to fulfill their constitutional obligations, and we refer to their own statutes for the proof.” Constitutional obligations? Sounds pretty vague! But the delegates go right on to spell out why they are leaving:

        The Constitution of the United States, in its fourth Article, provides as follows: “No person held to service or labor in one State under the laws thereof, escaping into another, shall, in consequence of any law or regulation therein, be discharged from such service or labor, but shall be delivered up, on claim of the party to whom such service or labor may be due.”

The “general government,” South Carolina goes on, “passed laws to carry into effect these stipulations. But an increasing hostility on the part of the non-slaveholding states to the institution of slavery has led to a disregard of their obligations.”

                  South Carolina went on to list the states whose attempts to exercise states’ rights deeply offended them:

        The States of Maine, New Hampshire, Vermont, Massachusetts, Connecticut, Rhode Island, New York, Pennsylvania, Illinois, Indiana, Michigan, Wisconsin, and Iowa, have enacted laws which either nullify the acts of Congress, or render useless any attempt to execute them. In many of these States the fugitive is discharged from the service or labor claimed....

                 

     South Carolina goes on to charge other states with other unpardonable offenses. New York, for example, no longer allows owners the right to take slaves through New York or use them temporarily there. South Carolina is outraged. Some states, South Carolina charges, let African Americans vote. Who could vote in America was at this time, of course, a state's right — and remained so until the passage of the 15th Amendment, two whole eras later — but the delegates refer to the Dred Scott decision and are offended that New Hampshire, for instance, lets blacks vote.

                  This key document from South Carolina is all about — and all against — states' rights. It makes no claim that the federal government has wronged the South. Why would it? Under Buchanan, South Carolina had no problem with the federal government.

                  What about Mississippi, next to secede? What do its leaders say about why they seceded?

                  They copied South Carolina’s title, passing “A Declaration of the Immediate Causes Which Induce and Justify the Secession of the State of Mississippi from the Federal Union” in January. “[I]t is but just that we should declare the prominent reasons which have induced our course,” they begin.

        Our position is thoroughly identified with the institution of slavery — the greatest material interest of the world. Its labor supplies the product which constitutes by far the largest and most important portions of the commerce of the earth. These products are peculiar to the climate verging on the tropical regions, and by an imperious law of nature none but the black race can bear exposure to the tropical sun. These products have become necessities of the world, and a blow at slavery is a blow at commerce and civilization.

                 

                 Secession is not the only subject that Wilson gets backward. He denies all agency to the Confederacy in the coming of the war. “The U.S. government, under the control of a minority party, launched a massive invasion of the South,” Wilson writes. Well, yes, eventually it did — but first the Confederacy attacked Fort Sumter. No administration could have simply turned the other cheek to that initiation of war and remained in office. James McPherson makes this point at length in his 1989 essay, “The War of Southern Aggression.”

                  Wilson also repeats the old “Lost Cause” claim that only overwhelming numbers caused the South’s defeat. Indeed, he takes that claim a step further: “Though they had four times our resources,[2] they were not able to defeat our men, so the U.S. government launched an unprecedentedly brutal war of terrorism again [sic] Southern women and children, white and black.” Nonsense! Even other neo-Confederate historians admit that Northern soldiers defeated “our men” at Appomattox, as well as at Fort Donelson, Vicksburg, Gettysburg, and many other battlefields, ending in Bentonville, North Carolina, in 1865.

                  As for the “unprecedentedly brutal war of terrorism against Southern women and children,” Wilson apparently does not realize why plantation mistresses often secreted gold and jewelry on their persons when U.S. forces came through. Because it worked! That is, U.S. soldiers rarely touched white women. Nor does Wilson seem to know that C.S.A. policy in the North was to seize all African Americans they met and sell them as slaves, whether they had ever been enslaved or not. Indeed, within sight of the unfortunate and soon-to-be-moved bronze Confederate soldier at the Montgomery County Courthouse in Rockville, Maryland, Confederates under J.E.B. Stuart seized over a hundred African Americans and dragged them to Virginia in chains.

                  Wilson’s project is blatantly anti-intellectual. “History is human experience,” he writes, “and you do not have to be an ‘expert’ to have an opinion about human experience.” To be sure, I am on record as favoring helping every student to do history, but not just based on their “opinion about human experience.” "People have a right to their own opinions, but not to their own facts," I wrote some years ago. (George Zimmerman made these sentences modestly notorious by using them on his website.) "Evidence must be located, not created," I went on, "and opinions not backed by evidence cannot be given much weight." One wonders how Wilson dealt with those occasional hapless students at the University of South Carolina who argued based on their own experience, rather than from oral history or documents. For that matter, what is one's "human experience" about the Civil War in 2016? Conversations with your dad? Does not one's view of the Civil War have to be grounded in research?

                  Wilson does not agree. “History is not some disembodied truth,” he assures us. “All history is the story of somebody's experience," he goes on. "It is somebody's history. When we talk about the War it is our history we are talking about, it is a part of our identity. To tell libelous lies about our ancestors is a direct attack on who we are.” Therefore, he implies, we don’t need evidence. There is "more than one perspective," he writes, and he implies that all perspectives are legitimate. So much for the discipline of history.

                  Early in his essay, Wilson does say something with which I can whole-heartedly agree. “It is useless to proclaim the courage, skill, and sacrifice of the Confederate soldier while permitting him to be guilty of a bad cause.” The Confederate cause was “bad.” Grant got that right. Wilson cannot, without traducing the evidence, make secession on behalf of slavery into a good cause. Therefore, it is time for him (and his neo-Confederate clique at Abbeville) to surrender. Further resistance is futile.

                  Across the South, from before the 1860 crisis down to now, some white Southerners have worked on behalf of the rights of Southerners of all races. Their stories go unsung today because from 1890 to 2015, the white South was busy singing the misbegotten praises of the miscreants who took it out of the Union. Now much of that activity has ceased, has even reversed. The new activity — taking down the statues of Jefferson Davis, J.E.B. Stuart, and Robert E. Lee in favor of people like James Longstreet, Elizabeth Van Lew, and Print Matthews — that is a historical project worthy of Dixie. This work helps us actually to realize “The True History of the War for Southern Independence.” Recognizing such people helps African Americans realize not all whites were racist, while providing white Southerners with positive role models. How much better for the world and the history profession this new activity is, compared to the false mythologies that Wilson still hawks — misusing my book title to do it.

    [1]The rest of this article is based on the Abbeville essay. Thus I do not have to buy the "book" by sending Amazon $5.38, some of which would trickle down to Abbeville and Wilson. Donald Livingston, a philosophy professor retired from Emory, operates the Abbeville Institute from his house. Besides Wilson, its other marquee member is Thomas DiLorenzo, the notorious anti-Lincoln writer.

    [2]See "Getting Even the Numbers Wrong" in Loewen, Lies Across America:  What Our Historic Sites Get Wrong, for a critique of this claim that overwhelming numbers caused the Confederate defeat. 

Copyright James W. Loewen 

]]>
Fri, 19 Apr 2024 07:39:22 +0000 https://historynewsnetwork.org/blog/153737 https://historynewsnetwork.org/blog/153737 0
Ten Questions for Yale President Peter Salovey

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

         Peter Salovey, President of Yale University, announced a surprising decision last week: Yale chose to continue to name a dormitory "Calhoun College," despite protests. One Yale graduate and resident of Calhoun College, Malcolm Pearson, who describes himself as an "old Southern white," called this decision "almost beyond comprehension."

         Surely Pearson is right. The decision prompted me to put ten questions to President Salovey. I emailed the essay to Salovey last week but have not yet received a response.

1. Have you ever read anything by John C. Calhoun?

         I ask because, years before his native state of South Carolina put his writings into effect, he was forthright in his white supremacy and his advocacy of disunion on its behalf. "Abolition and the Union can not coexist," he argued in the Senate way back in February, 1837. Therefore, if Northerners did not eliminate abolitionism within their states, so much for the nation. Meanwhile, slavery "cannot be subverted without drenching the country in blood, and extirpating one or the other of the races." He went on to claim that African Americans (and Africans) were so inferior that slavery was "a positive good" for them, because without it they are "low, degraded, and savage."

2. Do you understand Calhoun's role in U.S. history?

         Repeatedly, Calhoun threatened disunion to blackmail national leaders to get what he wanted. He explained his strategy to a friend in 1827:  "You will see that I have made up the issue between North and South. If we flinch we are gone, but if we stand fast on it, we shall triumph either by compelling the North to yield to our terms, or declaring our independence of them."

 At that time, Calhoun had written that states' rights let Northerners distance themselves morally from slavery. "A large portion of the Northern States believes slavery to be a sin, and would consider it as an obligation of conscience to abolish it if they should feel themselves in any degree responsible for its continuance." By the 1840s, however, he opposed states' rights when those rights had anything to do with freedom, a move he knew would sow sectional discord.

         Also by the 1840s, Calhoun had no more use for democracy. He pushed to make the South a closed society. He argued that Congress should not even receive petitions about slavery. Sending abolitionist materials through the mail, or even merely receiving them, should be a crime.

         As time passed, Calhoun took ever more extreme positions favoring the South as a region and slavery as a cause. He called the Missouri Compromise, which he had supported at the time, unconstitutional, because it banned slavery from territories north of Arkansas. Because the Constitution protected slavery, he insisted, slaveowners had the right to take their property into any territory. Obviously, since many of the same people who had voted for the Constitution had also voted for the Northwest Ordinance, this argument was neither historically nor judicially sound. Nor did it comport with states' rights, because it required the national government to enforce slavery, even if the residents of a territory had voted slavery down. Eventually, he came to place the interests of his region as he perceived them ahead of the national interest, ahead even of national unity.

         Again, in the words of Yale's Malcolm Pearson:

         "Calhoun was the proponent of a theory of the moral good of slavery thought ridiculous and self-serving in his own time. He was the intellectual father of nullification and secession, at whose feet we may lay the Civil War. It's not necessary to judge him by the terms of our culture. We can judge him best by the words of the President under whom he served as Vice President. Andrew Jackson said of Calhoun, 'I would hang him, if I could.' "

3. Do you think Calhoun at Yale is parallel to Woodrow Wilson at Princeton? Or to Edwin DeBarr at the University of Oklahoma?

         He is not. Princeton honored Wilson not because he was an arrant racist who segregated the federal government, including the Navy, but despite those things. At least he gave lip service to democracy. Not Calhoun. Oklahoma honored DeBarr not because he was the statewide leader of the Ku Klux Klan, but because, as the plaque on what used to be DeBarr Hall says, he "built the chemistry department from the ground up, heading it for 31 years, and was also the head of the School of Pharmacy ... the University's first Vice President ... and the longest-serving member of the original faculty." Calhoun did no service to or at Yale. As your own professor of history, Glenda Gilmore, put it, Calhoun's "fame came from his guiding role in a racial regime that enslaved people, inspired secession and formed the specious legal foundation for a century of discrimination." 

         The naming of Calhoun County in Alabama exemplifies Gilmore's point. A historical marker in the county seat, Jacksonville, tells how it got its name: "Calhoun Co. originally was Benton Co., named for Col. T. H. Benton, Creek War officer, later U. S. Senator from Missouri. Renamed in 1858 for John C. Calhoun, champion of South in U. S. Senate. Benton's views by then unpopular in South." Like Calhoun, Thomas Hart Benton was a wealthy slaveowner. Like Calhoun, Benton was an important United States Senator representing a slave state — Missouri in Benton's case. Both were national leaders of the Democratic Party, and both were considered for the presidency. Gradually, however, Calhoun and Benton diverged in political philosophy until they became arch enemies. Calhoun, as noted earlier, came to place slavery above all other causes, including nationhood. Benton, on the other hand, pointed out that Southern Democrats had opposed secession when New England Federalists had threatened it during the War of 1812. "The leading language. . . south of the Potomac was that no state had a right to withdraw from the Union," noted Benton, ". . . and that any attempt to dissolve it, or to obstruct the action of constitutional laws, was treason."

         In 1858, in keeping with the growing secessionist sentiment in the plantation areas of the Deep South, pro-slavery extremists in Alabama renamed Benton County for Calhoun. They took this step precisely because Benton stood for the United States, while Calhoun did not.

4. Do you know about the era when the naming took place?

         In U.S. history, the era from 1890 to about 1940 is known as the "Nadir of race relations." During these years, the U.S. grew more racist in its ideology, its thinking, than at any other time. I ask because, except for professional historians, most Americans don't even know the name of this consequential epoch. During this time, race relations worsened for Native Americans, African Americans, Chinese Americans, and Mexican Americans. Lynchings peaked. African Americans got thrown out of the Major Leagues, the Kentucky Derby, and broader job categories such as mail carrier. In the South, they lost the right to vote. The "sundown town" movement swept the North, including Connecticut, resulting in thousands of communities that kept out African Americans (and sometimes Jewish, Chinese, Japanese, Mexican, or Native Americans).

         Every historic site is a tale of two eras: what it’s about, and when it went up. In this case, Calhoun College is about Calhoun (c. 1824-1850), but it tells us more about when it went up (1931-33). During that era, toward the end of the Nadir, most white Americans saw nothing wrong with naming a building for someone who stood for white supremacy. Whites in Decatur, Alabama, named a junior college for Calhoun even later, in 1947. By then the Nadir was beginning to ease in the North, but not in Alabama.

         Yale would never have named a structure for Calhoun in 1880. Wager Swayne would never have let that happen.

Swayne Hall at Talladega College, Alabama

5. Do you know who Wager Swayne was?

         Wager Swayne was precisely the kind of graduate whom Yale would never honor with a building, owing to the Nadir of race relations. If you have never heard of him, let me refer you to the campus of another college in Alabama, Talladega, very different from Yale, that did name a building for him — indeed, its most important building. Swayne (Yale 1856) became an officer in the U.S. Army during the Civil War and lost a leg near the end of that conflict. During Reconstruction, Swayne headed the Freedmen's Bureau in Alabama, became military governor of Alabama, and helped found Talladega, a black college.

         After Reconstruction ended, Swayne became a lawyer in New York City and vice president of the Union League Club, an elegant institution that still stands in Manhattan. Republicans had organized the club to combat pro-secessionists who dominated New York City early in the Civil War. After the Emancipation Proclamation, its members organized and equipped a regiment of black troops and sent them to the front, first marching them triumphantly through the city streets. During Reconstruction, the club helped start Union Leagues across the South that helped African Americans and white Republicans organize politically. In 1880 the club still required prospective new members to "agree with the principles of the Republican Party as hitherto expressed."

         During the Nadir, Northern and Southern elites reunited under the banner of white supremacy. Plutocrats like J. P. Morgan and John D. Rockefeller joined for the club's prestige, not because of what it stood for. Soon it stood for nothing. Indeed, it began to stand for ideas antithetical to its founding ideals. Now members refused to admit Jews, even though Jews had helped to found the club during the war. In 1901, the club’s management committee, having put caste principles in place regarding membership, now turned to its employees. They decided to fire their black servants and go to an all-white staff. At this point, Swayne intervened. He got up a "petition to bring the matter to an open vote," in the words of a contemporaneous observer, “spoke in favor of the Negroes, and after several others had talked on the same side the ... decision was overthrown."[1]

         The deepening racism of the Nadir was not to be denied, however. After Swayne's death the next year, the club made all the wait staff black, which they still were in the late 1990s. This pattern perpetuates plantation race relations, implying that the races should be separate and blacks should serve whites. Other clubs and elite restaurants adopted this practice, including Pullman sleeping cars across the United States. Most of these institutions adopted Southern etiquette as well, calling the staff members by their first names while demanding that they use courtesy titles and "sir" or "ma'am" in reply. Anti-racists like Swayne were dying off. The Nadir was settling in. 

6. Do you understand the difference between heritage and history?

         Your email to the Yale community suggests that you do not: "Removing Calhoun's name obscures the legacy of slavery rather than addressing it. Erasing Calhoun's name from a much-beloved residential college risks masking this past, downplaying the lasting effects of slavery and substituting a false and misleading narrative, albeit one that might allow us to feel complacent or, even, self-congratulatory."

 The shallow comments to the Yale Daily News by many Yale alumni who support retaining "Calhoun" show that they don't grasp this distinction either. Putting his name on the dormitory in the first place was an act of heritage, not history. It told nothing about Calhoun except that he was great and we should honor him. Leaving his name on the building signifies that Yale thinks in 2016 that it is still appropriate to honor him. Taking his name off, on the other hand, and putting up a good plaque telling why, would teach future generations at Yale something about the history of the school, as well as the role of Calhoun. It could also enlighten as to the nature of the Nadir and send a message to the future about the changed racial environment of 2016.

7. What had you done, before the murders in Charleston, to make Calhoun College a flashpoint of knowledge seeking?

         We all know that questioning the naming of buildings after white supremacists skyrocketed after Dylann Roof's despicable acts. However, some colleges were already changing their racist names well before it became fashionable to do so. Not Yale. Moreover, if you did nothing to bring to the fore John C. Calhoun's execrable legacy, then your inaction itself undermines your claim that you keep Calhoun's name so as to keep alive the critique of him. You wrote:

Yale's motto is "light and truth," and we cannot seek the truth by hiding it. As a University, as students and faculty, we search out knowledge and pursue discovery. We cannot inhibit this pursuit by marking the ugliest aspects of our own nature "off-limits." We must confront even those ideas that disgust us in the search for progress and an honest understanding of the human condition. If we understand the past, and know ourselves, we can make positive change.

I suppose no one can dispute the last sentence, since it has no content. Putting material on the web, however, cannot adequately counteract the statement that "Calhoun College" makes on the landscape. Neither can a work of art to be named later.  

8. Wouldn't calling it "Nameless College" spark more continuing dialogue than leaving it "Calhoun"?

         If you study the long process Germany went through in deciding upon a proper monument to the victims of the Holocaust in Berlin, you will find that one (serious) suggestion was precisely that: a never-ending process of deciding upon a proper monument. Of course, it was not selected; in a sense it could not be. The proposal was meant to be paradoxical.

         Renaming Calhoun College "Nameless College," however, would not be contradictory and would provoke continuing discussion through the ages. "Nameless College" would also commemorate those who, in the words of Ecclesiasticus, "have no memorial; who are perished, as though they had never been; and are become as though they had never been born; and their children after them," such as the enslaved generations in America. Countless slaves have been lost to history, even to the census, because they had no last names, or because their owners did not bother to provide names to the enumerators but merely said, for example, "male, mulatto, age about 22..." Their first names too were not their own. They were bestowed upon them by their owners, rather than their parents, and were often deliberately chosen to be ridiculous.

         Of course Yale would install a plaque at the entrance, as noted above, telling that the dorm had been named for Calhoun, giving some facts about him, noting that no one at Yale in 1931 cared about his white supremacy, and describing the changes at Yale and in America that led to "Nameless" in 2016.

9. How do you propose Yale might suggest to people of color in the future that they should feel honored to come to Yale and to live in Calhoun College?

         I heard that you said something about how living in Calhoun's shadow will make students "better prepared to rise to the challenges of the present and the future." As an educator, you must know that black students show lower graduation rates at most colleges than do white students. You must know that research shows that people of color face additional hurdles, from a sense of not-belonging, to exposure to micro-aggressions, to lower expectations from some professors. In this context, affirming a decision made in the depths of the Nadir of race relations to name a building for perhaps the most unwelcoming white supremacist in Yale's or our national history cannot possibly be construed as sound educational practice.

         For that matter, how can you defend your decision to whites? Does it not invite whites to think Yale deems it appropriate to honor Calhoun in 2016? To be sure, your email denied this, but email is ephemeral. The name is chiseled in stone right over the entrance.

10. Does Yale honor any white male for his opposition to slavery or racism?

         In keeping with the racism of the Nadir, Yale has named no building for Gen. Swayne or, so far as I can tell, for any other white anti-racist. So Yale winds up with a landscape of white supremacists and black humanitarians (Pauli Murray), just like the University of Texas, Monument Avenue in Richmond, and so many other places.

         Maybe Yale should rename Calhoun College Swayne College!

Conclusion.

         The policies, writings, and beliefs of John C. Calhoun have caused much harm in the world. Yale can't change what its graduate did after he left Yale. Yale can change what it thinks and says about Calhoun's deeds. Naming an important building for Calhoun reveals Yale in 1931 to have been an active part of the Nadir of race relations, supporting white supremacy. Leaving it "Calhoun" affirms that position today.

         Every year that it retains the name Calhoun College, Yale declares on its campus that John C. Calhoun was a hero worthy of the honor of having a building named for him. That declaration insults every black resident and every nonblack resident who does not believe that treason on behalf of slavery made moral or political sense then or now.

Copyright James W. Loewen

    [1]Quoted in Michael W. Fitzgerald, The Union League Movement in the Deep South (LA State UP, 1989), 222, 234-42.

   ]]>
Fri, 19 Apr 2024 07:39:22 +0000 https://historynewsnetwork.org/blog/153767 https://historynewsnetwork.org/blog/153767 0
Tip for Journalists and Historians: When You Don't See Blacks in a Community, Ask Why

Huntington's #1 tourist attraction is its former Christian Science Church, now the Dan Quayle Museum, which also treats all vice-presidents in U.S. history.

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

Last week Eli Saslow wrote a 700-word story in the Washington Post about the economic woes of white America. He focused on Huntington, a formerly industrial city in northeast Indiana previously known mainly for its most famous son, Dan Quayle, vice-president under George H. W. Bush. 

It's an important topic. But Saslow never addresses why Huntington might be so white in the first place. It didn't happen by chance.

Like at least 300 other communities in Indiana, Huntington was until rather recently a sundown town — all-white on purpose. For decades, according to informants of Kathleen Blee, author of Women of the Klan, the entire county was sundown. Indeed, residents believed that it had passed an ordinance banning African Americans from spending the night, lest the land under the county courthouse revert to its original donor.

Ku Klux Klansmen at the courthouse in downtown Huntington    

In the 1920s, the Ku Klux Klan dominated the county. In 1923, 2,500 people attended a Klan rally at a city park in Huntington that ended with the lighting of a fiery cross. The next year, 250 Klansmen marched down Jefferson St. in downtown Huntington while hundreds of spectators lined the streets to cheer them. A rally in the rural part of the county drew 5,000. On occasion, as recently as the 1980s, African Americans could not find accommodations in hotels in Huntington, notwithstanding the 1964 Civil Rights Act.

Huntington's white supremacy dates to before the Civil War. During that war, it was famous as the home of Lambdin P. Milligan, Confederate sympathizer and member of the notorious Knights of the Golden Circle, a treasonous group supporting the Confederate cause. The Lincoln administration jailed him, and a military court found him guilty. He protested his trial in the military court system, however, leading to an important Supreme Court ruling, Ex parte Milligan, which held that civilians may not be tried by military courts while civilian courts remain open.

Lambdin P. Milligan Slave House, where captive escapees from slavery were held    

In 1985, Huntington dedicated "the Lambdin P. Milligan Slave House" as a historic site. Milligan used it to hold runaway slaves before the Civil War until he could return them to their owners, at least according to history as told by sources in Huntington. Information put out by the Huntington museum portrays Milligan as a local hero who "risked his life to protect freedom for us and all Americans." In fact, Milligan risked his life to protect slavery "for" African Americans.

In 1918, 328 citizens of Huntington signed a petition that demanded the removal of all "Negroes" from the city. The county's black population, only 16 in 1910, dwindled to just one by 1940, with none in the city itself. Until the late 1950s, Huntington displayed a sign saying, according to one informant, "Nigger, you don't live here so don't stay around to see the sun set." At least as recently as 2005, African and South Indian students at Huntington College reported being stopped by police for Driving While Black. To this day, many African Americans avoid driving in or through Huntington, if they can. 

I realize that race was not the focus of Saslow's story. Still, race was a theme: the story's focus was the decline of the white middle class. Hence his failure even to mention the absence of African Americans is strange. Did Saslow not notice that he saw no African Americans? Was it therefore a matter of "out of sight, out of mind"? 

Unfortunately, since he never mentioned that Huntington was a sundown town, Saslow also could not cover the rather heroic steps that the "Harmony Task Force," an anti-racist group in town, has gotten Huntington to take, including joining the "Inclusive Communities" movement. "The City of Huntington, Indiana is a community of civility and inclusion," the city now declares, "where diversity is honored and differences are respected." So this is a hopeful story, not just a tale of unrelenting white racism. For that matter, in 1880 Milligan became a member of the Republican Party and supported James Garfield's campaign for president. Garfield, you may recall, supported equal rights for all without regard to race. 

Another lesson historians, social scientists, and journalists can draw: when researching a town or county, if it is overwhelmingly monoracial, decade after decade, ask why. Most likely, even in out-of-the-way locations like Michigan's Upper Peninsula, it didn't get that way and stay that way by chance. Just because African Americans are out of sight, however, does not mean they should be out of the researcher's mind. They are never out of the minds of the residents of sundown towns, and that presence, along with the policy of exclusion, is therefore always part of the community's overall story.

]]>
Fri, 19 Apr 2024 07:39:22 +0000 https://historynewsnetwork.org/blog/153770 https://historynewsnetwork.org/blog/153770 0
Orlando Was Not “The Worst Mass Shooting in U.S. History”

Washington DC Vigil in honor of Orlando victims

Related Links

●  The Deadliest Mass Killings in American History by a Lone Shooter

● Historians, Police and Others Argue What Makes Orlando Massacre ‘Worst’ (NYT)

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

I mean no disrespect to the victims in the horrific massacre in Orlando and their friends and family members now left behind to grieve. We must all show solidarity with them. Yes, most were GLBT. Yes, many were likely in the U.S. illegally. Nevertheless. Nothing they did or failed to do justifies in the slightest the carnage visited upon them. They were sacred holy people. They are our kin.

Recently we have been struck by terrorists from various orientations:

● Muslim  (S. Farook and T. Malik, San Bernardino, CA; T. and D. Tsarnaev, Boston Marathon),

● anti-GLBT (O. Mateen, Orlando),

● rightwing (T. McVeigh, Oklahoma City),

● neo-Confederate (D. Roof, Charleston, SC),

● Amish hater (C. C. Roberts, Nickel Mines, PA),

● mentally ill (A. Lanza, Newtown, CT),

● or merely alienated teenagers (E. Harris and D. Klebold, Columbine, CO).

Some terrorists were two-fers: Mateen, for example, was Muslim and anti-GLBT. McVeigh was rightwing and neo-Confederate.

With such a panoply of potential enemies, none of us is safe. For example, I personally have an Amish-style beard, so I might have been construed as an “appropriate” target by any of the above perpetrators except the last, since I do not work at a high school or look like a teenager.

The point of this essay is not to tell anyone how to cope with terrorism, individually, as an institution or an entire nation. My point is to rescue us from white history.

It is white history to think for a moment that, as so many have put it, the Orlando massacre is “the worst mass shooting in U.S. history.”

Before refuting this amnesiac claim, I must make one qualification. Might we mean, by "mass shooting," one carried out by a single person? Surely we do not. In my list above (which of course includes two incidents carried out with bombs — Oklahoma City and Boston), two involved more than one perpetrator (San Bernardino and Columbine). Moreover, we may yet learn that Mateen or Roof had accomplices. Surely we do not desire a definition of "mass shooting" that excludes Columbine, for example.

So the answer has to be no. But in that case, just imagining that the murder of a "mere" 49 people might be the largest mass shooting in U.S. history shows a complete insensitivity to race relations.

Consider these ten events, in chronological order, with the number of victims in parentheses:

1.  The Gnadenhutten massacre of 1782 (100 Native Americans).

2.  The murders of African Americans in NYC in 1863 during the "NYC Draft Riots" (>120 African Americans).

3.  The Fort Pillow Massacre of 1864 (100-300, most African Americans).

4.  The Colfax Massacre of 1873 (150 African Americans).

5.  The Rock Springs, Wyoming, massacre of 1885 (78? Chinese Americans).

6.  The Massacre at Wounded Knee, 1890 (300 Native Americans).

7.  The East St. Louis race riot [by whites, against blacks] of 1917 (40-200 African Americans).

8.  The Elaine [AR] massacre of 1919 (>100 African Americans).

9.  The murders of black residents in the Tulsa race riot in 1921 (100? African Americans).

10. The Rosewood [FL] massacre of 1923 (150 African Americans).

True, in #1 the attackers (white Americans) used knives, not guns, so the victims died not in a “mass shooting” but a “mass killing.” True, some of the dead in #2, #5, #7, and #9 died when whites burned their houses, but many in #5 and most in #2, #7, and #9 died from gunfire. True, #3 and #6 could be considered war deaths, but both groups had surrendered when the massacres took place. Otherwise, there is a pattern in each of these mass murders or mass shootings: most of the victims were nonwhite.

In this context, any historian or other commentator who claims Orlando was the "deadliest mass shooting" in American history is also saying no to BLM — black lives (and Native American lives and Chinese American lives) do NOT matter. For if they mattered, then we would know we as a people have a long history, beginning long before 1782, actually, of mass shootings and mass murders of nonwhites.* Whites have also been massacred — 120 at Mountain Meadows in southern Utah in 1857, for instance, by Mormons.

The fix for this amnesia is easy. Just add one word, “recent,” before “history,” thus implicitly acknowledging our violent past.

*Whether Orlando should be included depends upon one’s view — and especially Mateen’s view — of the racial definition of Latinos.  

Copyright James W. Loewen

Image by Ted Eytan from Washington, DC, USA (2016.06.13 From DC to Orlando Vigils 06103) [CC BY-SA 2.0], via Wikimedia Commons

Future Shock and History

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

            Alvin Toffler, author of Future Shock and editor of Learning for Tomorrow, died last month. He was 87.

            Future Shock was a NY Times top ten bestseller for almost a year from October, 1970, through the end of the next summer. Worldwide, it has sold more than six million copies. It still ranks 4,975 among all books at Amazon. (Harry Potter and the Cursed Child ranks first, and it doesn't even come out until the end of July — talk about future shock!)

            Toffler claimed that by the 1960s, Western culture was changing so rapidly that it disoriented us. Historians of course had been using modernization and its discontents to explain everything from the French Revolution and before, to the rise of the KKK during Reconstruction and after.[1]

            In the 1960s, history textbooks, however, never discussed modernization or future shock. Indeed, they never invoked the future in any meaningful way. Nor do they today. Instead, they close by imparting vapid inanities. "The American tradition remains strong — strong enough to meet the many challenges that lie ahead," The American Adventure assures students; "the American adventure will surely continue." Americans "were convinced that their free institutions, their great natural wealth, and the genius of the American people would enable the U. S. to continue to be — as it always has been — THE LAND OF PROMISE," Land of Promise concludes. In short, all we need to do to prepare for the morrow is keep our collective chin up.

            History textbooks avoid serious consideration of the future because such reflection might be controversial. Not everyone would come to the same conclusion. Teachers might lose control of their classes or, more likely, get flak from some parents. Horrors! Of course, the traditional style of pedagogy, which we might call input/output, assumes that we want everyone in the room to give us the same "right answer" rather than develop the ability to think using historical information.

            There are other ways to end a history textbook. In 1972-74, I wrote a very different chapter to end Mississippi: Conflict & Change.[2] Titled "Into the Future: 1967-2000," it invited students to think analytically about what had happened in the past to cause Mississippi to be as it was in the present. Then they could project the state's likely future development. The chapter treated migration into and out of Mississippi and within the state, in which we predicted "rural life itself is ending," as everyone, farmers included, winds up intensively connected to urban America and the world. It also treated "Women in the Future of Mississippi," asking students to contemplate what jobs, if any, would still be men-only in 2000. Pages on the likely future of school desegregation, then just underway, and the effects of desegregation on politics invited students to think about the hottest-button topic of that era.

            Mississippi: Conflict & Change was the first "revisionist" state history. It was also the only history textbook to try to project into the future. For this reason, I sent "Into the Future" to Alvin Toffler, hoping for a blurb. He did not disappoint:

            For young people to grow into competent citizens in a time of change, education must connect their roots in the past with images of desirable futures. This history of Mississippi makes the connection between the living past and a livable tomorrow. Its use would make Mississippi a path-breaker in the teaching of local history.

Nevertheless, the State of Mississippi refused to adopt it by a 2 to 5 vote. The board had 2 blacks and 5 whites. You can do the math. Since Mississippi adopts statewide and provides approved books to students gratis, no school in the state ever uses any book not on the approved list. Usually the state adopts three to five books in a given subject, such as U.S. history. In Mississippi history, they only had two books, which might be termed “ours” and “theirs.”

            That attempt at censorship had a happy ending, however. The two editors, Charles Sallis and myself, along with three school systems as co-plaintiffs, filed suit, "Loewen et al v Turnipseed et al," in federal court. In 1980, we won, and the state was required to adopt the book. Pantheon published a second edition, and we added a new chapter, "Desegregation and Cultural Change, 1968-1980." Our retitled final chapter, "Into the Future: 1971-2000," required only minor adjustments, however, because most of our projections had been right.

            My point in this essay is not to tell that story, however. My point is to use Toffler’s death, and more importantly his work, to make an argument about all history textbooks. Surely all textbooks that reach the present should project into the future. Few do. Even today, more than 150 years after the end of slavery, most U.S. history textbooks avoid any treatment of the past that implies relevance to the present. Again, that would be controversial. Treating slavery is OK, because that's over. Treating slavery's handmaiden, racism, is not, because it continues. So textbooks leave the history of slavery, and of everything else, buried in the past, without relevance to the present, and with no implications for our future.

            No wonder most high school students rank history as their least favorite subject and find it "boring" and "irrelevant."

[1].Lacy K. Ford Jr., "One Southern Profile: Modernization and the Development of White Terror in York County, 1856 — 1876" (Columbia: U. of SC M.A., 1976).

[2].James W. Loewen, Charles Sallis, Jeanne Middleton, et al., Mississippi: Conflict & Change (NY: Pantheon, 1974), 309-35. 

Copyright James W. Loewen

Lessons from Mississippi for Police in Ferguson, Chicago, Baltimore, Milwaukee …             

Photo by Jamelle Bouie, CC BY 2.0, via Wikimedia Commons: https://commons.wikimedia.org/w/index.php?curid=35442328

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

In 1967 I lived in the Mississippi Delta, researching the lives of Chinese Americans. Mississippi had more Chinese residents than any other Southern state, and how they fit into a social structure built for two races posed important sociological questions. Most wound up running grocery stores, usually serving the black population, which was in the majority in Delta towns.

            A grocer in Vicksburg, whose store provided something of a social center for its neighborhood, told me what happened when customers created disturbances that warranted his calling the police. "They always ask about the race of the people. If it's white, they send a white officer; if it's black, they send black." This made sense to the grocer and to me as well. Many whites simply wouldn't put up with being arrested by a black officer in the Mississippi of 1967, and blacks too got along better with officers of their own race.

            What was obvious in Mississippi in 1967 remained to be discovered in Ferguson, Missouri, as we all learned in August 2014. Ferguson had just three African Americans among its 53 police officers when it became infamous for the way it policed its two-thirds-black citizenry. If Ferguson had been obeying the same rule that Vicksburg followed half a century ago, then we could infer that the average white person required 30 times as much policing as the average black person!
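
As a rough check on that inference, here is a back-of-the-envelope sketch (mine, not the essay's) under the simplifying assumption that each officer handles only residents of his own race, as under the Vicksburg rule; the population figure is purely illustrative and cancels out of the ratio:

```python
# Hypothetical sketch: 53 Ferguson officers, 3 Black and 50 white, serving a
# two-thirds-black citizenry. How much policing per capita does each group get
# if officers handle only residents of their own race?
population = 21_000                      # illustrative only; it cancels out
black_pop = population * 2 / 3
white_pop = population * 1 / 3

officers_per_black_resident = 3 / black_pop
officers_per_white_resident = 50 / white_pop

print(round(officers_per_white_resident / officers_per_black_resident))  # ~33
```

The ratio comes out a little above 30, consistent with the round figure used above.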

            Of course, Ferguson never bothered to learn what Vicksburg knew so long ago. In a practice stemming from its days attempting to become a sundown suburb (1940-60), the Ferguson police department routinely sent white officers to deal with black disturbances. By the way, 1967 Vicksburg was, like Ferguson, a majority black city controlled by white elected officials. But these officials were pragmatic. They knew that good relationships between police and community benefited the police and the community.

            Unfortunately, leaders of some other Mississippi communities thought differently. Allen Thompson, mayor of Jackson, the state's largest city, militarized his police. He oversaw the purchase of a helicopter, SWAT gear, and even an armored vehicle — an earlier version of the vehicles that the federal government has been helping communities get in the last few years. Quickly dubbed "Thompson's Tank," it became a flashpoint of racial confrontation during civil rights demonstrations and during disturbances.

            Jackson provided the policing model Ferguson and St. Louis County relied on after the shooting of Michael Brown. Not until the governor put Captain Ron Johnson of the State Police in charge of security did the Vicksburg model prevail in Ferguson. Even then, statements by Ferguson's police chief imply that he remained stuck in the Jackson mindset.

            I mentioned that Ferguson was a sundown suburb. Sundown towns are called that because they did not allow African Americans after dark. Ironically, from its earliest days, Ferguson had a few black residents. In 1940, for example, 38 African Americans called Ferguson home, and though some were live-in servants in white homes — which do not violate the taboo — others lived in their own households. Then, like suburbs across the United States, Ferguson moved toward becoming all white. St. Louis County got carved into dozens of small communities, reifying into law divisions based on race and class. Ferguson put a chain across the main street connecting it with Kinloch, the tiny black suburb to its west.  Realtors refused to show homes to black would-be buyers. Police followed motorists who "did not belong in Ferguson"; DWB (“Driving While Black”) became an offense. The tactics worked. Between 1940 and 1960, the black population of the St. Louis metropolitan area doubled. Meanwhile Ferguson cut its black population in half, to just 15 persons.

            In the 1960s, black population pressure combined with the 1968 "Fair Housing" law finally broke the barrier. By 1970, 165 African Americans lived in Ferguson. At this point, as in other former sundown suburbs like Riverdale, outside Chicago, or Hawthorne, near Los Angeles, whites had ideological reasons to leave. After all, they had defined blacks as inferior, problematic, to be kept out. Now African Americans had breached the city limits. Many white Ferguson residents responded by moving to sundown exurbs farther out.

            Like many former sundown towns, Ferguson now faces what we call "second-generation sundown town issues," the foremost of which is its overwhelmingly white police force. Ironically, the disturbances in Ferguson, the resulting scathing report on its police force from the U.S. Department of Justice, and Ferguson’s new black voting majority offer Ferguson a way out.

If Ferguson can transcend the legacy from its sundown past, it may set an example that might help overly white police forces like Baltimore’s and Chicago’s transcend their pasts. Of course, black cops are no panacea. An African American officer can be just as disrespectful, just as short-fused, and just as scared as a European American officer. Still, the generalizations about black folks that mark too many all-white conversations to this day – and stain too many police emails – are harder to utter in a setting that is, say, half black. Moreover, there is one way in which an African American officer cannot easily be as disrespectful toward African American citizens, and that is racially.

            In that sense, then, the Ferguson police force was not competent. No overwhelmingly white police force can be competent in a majority-black city. Please note: I am not arguing against the competence of a white individual. Nor do I suggest that police forces should be all black or even overwhelmingly black. I have been a fan of racial integration since I started considering the world thoughtfully, which was in 1954, and I remain one today. But given what Vicksburg knew, half a century ago, we can make no argument for “color-blind policing.” Not only do Ferguson, Baltimore, Chicago, Milwaukee, and a host of other cities need to seek civilian review, cams, training in community relations, etc. – they also need to integrate.

Copyright James W. Loewen 2016

Should Students Call Professors by Their First Name?

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

            A young professor of English (they're all young to me!) at Boston University, Carrie J. Preston, has just published an interesting article at the Chronicle of Higher Education. She tells of her journey from "Carrie" to "Professor Preston," a trip complicated by feminism and Japanese Noh theater. My own journey on this matter was complicated by my teaching at the Blackest and Whitest schools in America.

            In the Midwest, where I went to college (Carleton), all teachers were "Mr."[1] This, we were told, was a Midwestern tradition, deliberately nonhierarchical.

            Not so at Harvard, where I got my doctorate. I recall one of the few meetings that I attended in graduate school. It was on what if anything the Department of Social Relations, including sociology, should do or say about the ongoing Vietnam War. Sociologist Alex Inkeles chaired, and his selection of nouns of address was exquisitely nuanced. He called full and associate professors "Professor," and assistant professors and instructors "Dr.," unless they hadn't finished their doctorate, in which case "Mr."[2] Graduate students he called by first name. So did undergraduates, including my students in the "pro-seminar" in sociology, a crucial course for majors that Harvard fobbed off on third- and fourth-year graduate students. 

            In 1968 I left Harvard for Tougaloo College, a small predominately Black liberal arts college in Mississippi. There I observed the classroom interactions of Dr. Ernst Borinski, storied professor of sociology.[3] He called all students by their last names — "Mr. Jackson," "Miss Evans."[4] His reasoning was simple. At its heart, racial segregation is a system of etiquette, every element of which expresses White supremacy and Black inferiority. In Mississippi in 1968, Black adults called White adults "Mr." followed by their last names, only to get called by their first names in return. In intensely hierarchical situations, such as on plantations, African Americans might even say "Miss Ann" or "Mr. Charley," expressing even more deference, and would of course get called by their first names with no honorifics, back. The practice even extended to the U.S. mail: banks and utilities would send out statements addressed to "Mr. Curtis W. Shepard" if White, "Curtis Shepard" if Black.

            A few Black parents countered by giving their children no first names, only initials.[5] A few others named their children "Elder" or "Missy," forcing Whites to use courtesy terms while first-naming them. Most just gave their children the names they chose, like any other parents would do, but they hated the first-naming etiquette. Some women supplied only their husbands' names — "Mrs. Robert Walker." "I dare them to call me 'Bob'!" one said to me with a smile.

            Whites came to avoid "Mr." and "Mrs." during the Nadir of Race Relations, that terrible era, 1890 to 1940, when White America went more racist in its thinking than at any other time. During the Nadir, instead of "Mr." or "Sir," which might imply that the older and more senior African American was fully human, Whites used "Uncle," or "Aunt" or "Auntie" if the person was a woman. We still have these terms today, of course, in the form of atavistic survivals like Uncle Ben's Rice and Aunt Jemima Pancake Syrup.

This ad for Cream of Wheat epitomizes the racism of the Nadir of Race Relations. A little White boy whips an old Black man, shouting "Giddap, Uncle." The man is the boy's babysitter, of course, not his uncle. Whites said "Uncle" as a term of quasi-respect used across the color line because "Mr." would connote actual respect. The Cream of Wheat Company, in 1916, right in the middle of the Nadir, believed that this heartwarming scene would make most Americans warm and friendly inside and likely to buy their product. Probably they knew their market.

            In 1968, segregation was still in full force in Mississippi, although it was cracking. Its etiquette code covered all Black/White interaction. On two-lane highways, it was risky, hence rare, for Black motorists to pass White-driven cars. Black adults were not to look White adults in the eye while talking with them; nor were they to sit with Whites at the same table, even if friends. They were to step aside on sidewalks to let Whites pass.

            Chinese Americans had moved into the Mississippi Delta — the flat northwestern sixth of the state, from Memphis to Vicksburg, comprising the richest plantation land in the United States. They opened grocery stores, mostly serving the Black population, which was much larger (and poorer) than the White. In this niche they found economic success well beyond that realized even by White grocers. Nouns of address provided part of the reason why. As one Black customer told me, "They [Chinese grocers] don't worry the hell out of you about saying 'Mr.' or anything." Elsewhere, to speak without deference could be fatal, especially when speaking to White grocers already unhappy at the scorn they received from other Whites precisely because their store clientele was largely Black. Emmett Till was lynched because he claimed social equality in his brief interaction with a White Delta grocer; he may even have whistled at her.

            In central Mississippi, Tougaloo offered almost the only respite from such threats. Before his legendary Social Science Forums, which offered Mississippians of both races almost their only opportunity to hear important speakers together, Borinski hosted dinners. He invited Whites and Blacks from the community, along with Tougaloo students. They then found themselves sitting together conversing across racial lines, many for the first time in their lives.

            In Southern society, to do the usual — address students by first name while they addressed me by my last name — thus constituted inadvertent compliance with the norms of segregation.

            I went the opposite direction, asking that students call me "Jim," as did "Prof. Preston" (her choice, today). I recall one moment in 1969 when mirth ensued. Near the beginning of a class early in the fall semester, I made my usual request to be called "Jim" thenceforward. Later in the hour, a student with a full Afro, just getting into the Black Power movement, raised her hand excitedly with a question. "Mr." she started, and then stopped abruptly as she remembered what I had said. "Jim," she continued, and then stopped again, blushing in confusion.[6] Unthinkingly, she had mimicked the plantation folkway, the opposite of her intent. "It's OK, you can just call me 'boss man,'" I replied. The class laughed, and she asked her question.

            "Jim" worked, but then, I was in my twenties. Preston notes that it worked for her in her twenties, too, although she did face the additional complication of being female, which she shows presented difficulties. At 33 I moved to the University of Vermont. There the problem wasn't racial etiquette but sheer numbers. I had no problem continuing the use of first names for students in my upper-level seminars, but the educational philosophy of the school was quite different from Tougaloo's. At "UVM," as Vermont is known, a professor had to have permission from the dean to teach a lower-level course smaller than 40. At Tougaloo, a professor had to have permission from the dean to teach any course larger than 36. Tougaloo was structured for education; UVM for profit.

            The difference affected nouns of address. When I had 36 students, I knew the names of at least 30 by October. When I had 60, by the end of the semester I knew the names of maybe 10. It felt awkward to me to have students calling me "Jim" when I could not call them by any name. So I started suggesting that they use "Mr. Loewen" — that Midwestern equalitarian thing again — until they knew me well enough to use "Jim." That seemed to work, and if they said "Doctor," that was OK too.

            Now, at The Catholic University of America, where I sometimes guest lecture, students prefer "professor." Whatever! Mainly, we want to make it easier for students to address us, do we not, regardless of which noun of address they choose.

    [1]Carleton did have female professors, but only about ten, not counting Women's P.E. and Music, and I never had one in my four years there.

    [2]The Harvard Social Relations Department had one female professor, Cora DuBois, in anthropology. She was a full professor. Hence, no "Mrs.'s."

    [3]Borinski is a major subject of the book, video, and museum exhibit From Swastika to Jim Crow.

    [4]"Ms." had not been invented; this was 1968.

    [5]Mostly boys were named with initials, but L. C. Dorsey, pronounced "Elsie," provides a female example. Whites called her "Elsie," but in a sense they were not calling her by her first name.

    [6]Yes, African Americans can blush. Besides, she was light-skinned, so her blushing was easy to spot.

Copyright James W. Loewen

How Is It Still Possible for a Jury in South Carolina to Have Just One Black Member?

A 19th century jury, as depicted in 1861 by John Morgan in Wikipedia 

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

I wrote this essay the day after the hung jury in the trial of Michael Slager, the police officer in North Charleston. Slager repeatedly shot the Black driver, 50-year-old Walter Scott, until he was dead, even though the reason for the stop was a broken taillight, and even though Scott was running away from him, after an initial scuffle. The jury had eleven White members and one African American.

How is it possible, one might ask, for a jury in North Charleston, South Carolina, to have just one Black member?

Way back in 1969, I testified as an expert witness in Yazoo County, Mississippi, about the racial composition of its juries. Venires — the pool from which actual juries are drawn — were supposed to be selected randomly from the population of registered voters.[1] Each jury comprised fourteen people — twelve jurors and two alternates. The two were told they were alternates at the end of the trial, just before the jury retired to deliberate. This ensured all fourteen would pay attention throughout the proceeding. Usually the alternates were happy to be excused. It was a good system ... on paper.

In operation, however, juries in Yazoo County kept coming out with two or fewer Black jurors out of fourteen. Indeed, this had happened seven times in a row! Moreover, the two Black members often were the same two people! Neither of these outcomes was remotely likely by chance, which is why I put exclamation points after them.

This Yazoo County case was my first of more than fifty cases as an expert witness. It was perfect for a beginner, because my job was to make the statistics clear to laypeople, including the judge. In class, professors use coin flips to get students thinking about probability. The voter registration roll in Yazoo County happened to be 50% Black, 50% White, perfect for the coin flip analogy.

The probability of getting two heads (or fewer) in fourteen flips of an unbiased coin is about .006. Try it yourself, if you don't agree. If you flip a coin fourteen times, then do so again, and continue a thousand times, you will get two heads (or fewer) about six times — not very likely. Statisticians, social scientists, and historians use the "1% level of significance" to say that a hypothesis — in this case, race influenced jury selection — is solidly confirmed. That's one in a hundred. This one result beats that standard.

The probability that two consecutive series would come out this way is .006 times .006 or .000036, fewer than four times in 100,000. The likelihood that seven consecutive juries would have no more than two African Americans each is less than once in 4 billion, or .00000000025. Impossible!
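
For readers who want to reproduce the arithmetic, here is a minimal sketch of the binomial calculation behind those figures. It assumes the simple coin-flip model described above (a 50/50 registration roll, fourteen seats filled independently); the helper name p_at_most is mine, not anything used in court:

```python
from math import comb

def p_at_most(k, n=14, p=0.5):
    """Probability of at most k Black members on an n-person jury,
    filling each seat independently from a roll that is 50% Black."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

p2 = p_at_most(2)
print(p2)        # ~0.0065, the "about .006" figure from the testimony
print(p2 ** 2)   # two consecutive such juries: a few chances in 100,000
print(p2 ** 7)   # seven in a row: far below once in 4 billion

# The "exactly two" figure requested later on the stand is simply
# P(at most 2) minus P(at most 1):
print(p_at_most(2) - p_at_most(1))
```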

Our magistrate was a stereotypical White Southern judge: old, biased, incompetent. During my testimony, he even nodded off a couple of times, jerking back to consciousness when he heard the word "Objection!" Then he would peer down from the bench. If their side had objected, he would say "Sustained"; if ours had, "Denied."

He was awake, however, when I reached the crux of my testimony, which was that juries as White as those drawn in Yazoo "could almost never happen by chance — the likelihood is less than one time in four billion."

Excursus 1: At this point, I must pause to denounce my own attorney. In my experience, most civil rights lawyers, especially those working for the Lawyers Committee for Civil Rights, ACLU, NAACP, "the Inc. Fund" (NAACP Legal Defense and Education Fund, Inc.), and the DOJ are hardworking, idealistic, intelligent, and knowledgeable. Our team in this case included three different civil rights lawyers, because this was a test case designed to break open the unreasonable White bias of jury selection across the state. Unfortunately, the lead attorney knew himself to be intelligent, so he did not bother to prepare. The drive from Jackson to Yazoo City provided us with an hour together in the car, with someone else driving, so I suggested we might review my testimony. He could not be bothered.

I had supplied him with an outline of what I would say, and of course he was able to ask my name and address competently, explore my educational background and qualifications, and ask what data had been provided me — in this case, the racial composition and names of the last seven empaneled juries in Yazoo County. Unfortunately, he then went off the rails and found himself asking me to do something no expert should ever be asked to do by his own attorney: make new calculations on the stand.

For instance, he asked, "What is the likelihood of getting exactly two Black jurors, rather than two or fewer?" That is a silly question, because if a jury came in with just one Black member, that would certainly show support for the same hypothesis that two Black members showed, namely White bias. It also requires the expert to calculate the probabilities of just one and of no Black juror and then subtract those minuscule numbers from the also minuscule likelihood of getting two or fewer. Hard to do, error-free, with at least a dozen people watching and waiting.

Luckily, I had at my disposal a brand new invention: an electronic calculator. Today's youngsters (anyone under the age of 70) have no idea how hard it was to do square roots before the electronic calculator (and it was uphill both ways!). One had to use slide rules, which provided via physical methods approximations to three decimal places, or even worse, Monroe electro-mechanical calculators, which had to be tricked into doing approximations of square roots. In 1968, just two models of electronic calculator were available in the United States. Each did the four basic calculations — addition, subtraction, multiplication, and division — and would do square roots if you poked "divide" and then "equal." Each had a memory. Wang made one, which was a base station about the size of the stand-on-the-floor computers of twenty years ago, to which four different calculator keyboards attached. It cost $5,000. Sharp made the other, a stand-alone unit the size of a large laptop computer of today, for $1,300. I had made the purchase of the latter a condition of my employment at Tougaloo College, because I knew I did not want to have to teach my students in "Methods and Statistics of Social Research" how to use a Monroe calculator. According to the distributor, ours was just the third electronic calculator in the state, and when I brought it out in the courtroom it caused quite a stir.

Nevertheless, calculating the answer to my attorney's aimless queries misused the court's time, irritating both the judge and me. Then, when I reached my core finding, my attorney asked a question that revealed he did not understand for a moment the nature of statistical probability. The racial disparity would happen by chance "less than one time in four billion," I said, and he asked, "How much less?!"

This is akin to asking someone who has compared a single grain of sand to a cement truck to discuss portions of the grain of sand. I wanted to reply, "Look, you ignorant unprepared idiot, you cannot get 'much less' than one in four billion," but I did not see how that would help our case, so I simply said, "Oh, much less."

The other side was equally flummoxed by my testimony. They knew it had damaged them, but they had not engaged an expert of their own, and even if they had, s/he would have to abide by the laws of statistics. To disparage my testimony, they tried to disparage me. "Where are you from?" was the prosecutor's opening question on cross-examination. "Tougaloo College, Tougaloo, Mississippi," I replied, as I had at the beginning of my direct testimony. "No, I mean where are you really from?" he retorted, referring to my outsider status in Mississippi.

"I don't understand the question," I replied, although of course I knew exactly what he was driving at, and looked beseechingly at the judge.

"You'll have to rephrase," the judge grudgingly told the prosecutor.

"Where do your parents live?" he asked.

"Decatur, Illinois," I replied. And now I lied, for the first and only time ever, in court: "That's in southern Illinois." Actually, Decatur is in central Illinois. I could not resist.

Soon his questions dwindled to a close. But it turned out that all my testimony went for naught, because to the surprise of all the attorneys in the courtroom, the jury hung. Even more surprising, it did not hang 10 to 2 for conviction of the two Black defendants, but 7 to 5. For the first time in memory in Yazoo County, some White jurors had voted to acquit a Black defendant.

To understand this occurrence, you need to know the nature of the charge. Yazoo City was in the midst of a boycott. Its downtown merchants had uniformly refused to hire any African Americans as sales clerks or cashiers. The only job Blacks could get was janitor. Yet the stores relied on their Black clientele. Worse, African Americans could not try on clothing, not even hats. They could buy, but they were not allowed to use the changing rooms. Civil Rights leaders were tired of such dehumanizing treatment and urged the Black community to shop in Jackson, an hour away. Conditions there were no better in some stores, but at least boycotting gave the Yazoo City Black community some leverage.

Saturday was the big shopping day in the Mississippi Delta, so each Saturday high school students walked the streets, talking with Black shoppers, trying to persuade them to go to Jackson. A deputy sheriff overheard two young Black males in conversation, one saying to the other, "If they [the merchants] don't give us something, we're gonna shut this mother-fucking town down." "You're damn right," replied the other. It happens that Yazoo City had a municipal ordinance dating from the nineteenth century making it illegal to curse on the streets, so the deputy called for backup and arrested the two young men. Now they were on trial.

Our attorneys asked that all witnesses be sequestered. They then asked the sheriff if he had ever sworn while on duty. He admitted he had. Then they put the deputy on the stand and asked if he had ever sworn while on duty. "Of course not," was the response. "Have you ever heard Sheriff ________ swear on duty?" "Oh, no," came back. Then our attorney asked the court reporter to read back the sheriff's testimony, where he admitted doing so. "Would you like to amend your testimony?" the deputy was asked. Well, maybe once or twice, came back the reply. "Did you arrest him?"

A few other exchanges made clear the arrant silliness of the prosecution's case, affording Black jurors the courage to vote for acquittal and even persuading three White jurors to do likewise. Unlike the South Carolina matter, it was obvious that the case would not be retried, so justice had been served. Unfortunately, however, my testimony about the jury system was now moot. There was nothing to appeal.

As a result, a few months later I found myself in Wilkinson County, in the far southwest of the state, testifying all over again. Luckily the incompetent civil rights lawyer had gone back home to the North. The voter registration roll in Wilkinson was at least 70% Black, yet juries still wound up majority White, time after time. This time the judge was quite interested in my testimony, and although he did not decide the case then and there, he asked me afterward, "Most counties in Mississippi would show this kind of disparity, wouldn't they?" I think he did rule eventually that the juries would have to be redrawn.

It took years for White-biased juries to be eliminated in all counties in the state. Indeed, probably they have not been, just as they have not been in Charleston County, South Carolina. Even when venires are actually drawn fairly from the underlying population, attorneys can use challenges to strike African Americans. Each side only gets a limited number of what are called "peremptory challenges" — for which no reason need be given. But each side gets an unlimited number of challenges "for cause." If a prospective juror in the Charleston murder case was married to a police officer, for example, s/he got excused for cause. In Mississippi, some judges leaned toward letting the simple fact of being African American comprise a special interest group, so lawyers could excuse at least some African Americans for cause. Others got dropped via peremptory challenges.

Four years later, juries were still overwhelmingly White by design in many Mississippi counties. It seemed that each county had to be attacked individually. In 1973 I was again the expert witness in the criminal trial of two African Americans charged with theft in Hinds County, home of Jackson, the state capital. A private attorney, W. S. Moore, who had graduated from Ole Miss Law School back in 1954, had engaged me. Very few White Mississippians of that generation showed the courage or idealism to defend Black clients in the early 1970s. William S. Moore had been part of the White establishment, but now word was that he had had a conversion and was working for justice for all. He had even changed his name, now going by W. Sebastian Moore, or "Sea-Bass" in the Black community. In Hinds County, as I recall, even the venire was biased, compared to the proportion of African Americans among the registered voters, which again was close to 50%.

Again, my testimony was telling, and when we all got on the elevators at the lunch break, the two prosecutors vented their frustration on Moore. "What are you doing, being in this case?" one asked. Moore answered forthrightly, "Well, now, you have to understand, my clients ain't nothin' but a pair of nigger crooks." The other attorneys were astonished that a lawyer would speak thus of his clients. Then Moore let the other shoe drop: "But you still should let Black folks be on juries." The two sentences, including the two very different terms for African Americans, were in their way perfect, contrasting the old and the new, the setup and the kill.

Excursus 2: I began by saying I wrote this essay the day after the hung jury in North Charleston. That was December 5, 2016. On that date I also sent a letter by U.S. mail to Atty. Moore, who I had found was still living, still near Jackson, Mississippi. I told him I considered him "a remarkable case of a white Mississippian who 'saw the light' and became a crusader for justice." I also noted that he "understood the statistics I used and were fine to work with." I attached this essay and said, "If you wish, I can change your name completely, so no one would think it might be you." I know Mr. Moore to be elderly and think he recently moved to assisted living, so I do not know that he received or considered my letter. Since I cannot imagine that my praiseful account of his work might offend him, I decided not to edit out his name.

Across the United States, we have made some progress in jury selection. Since 1986, "Batson challenges" can be filed against the racial use of peremptory challenges, for example. Still, we will do well to examine the racial and ideological makeup of our juries. In any jurisdiction where African Americans are in a small minority, it's easy for prosecutors to use their peremptory challenges to exclude them totally from juries. Then defendants are not being tried before a "jury of their peers," the legal requirement, which means a reasonable cross section of the community, just as the defendants were not in North Charleston, Yazoo City, or Hinds County.

Such exclusion may play a role in the astounding racial imbalances in the criminal justice systems in Wisconsin, Minnesota, Iowa, and Nebraska. Those states annually show up as incarcerating about nine times as many African Americans as European Americans, compared to their proportions in the population. In Mississippi, the imbalance is only about two and a half to one. To be sure, juries in white states (to say nothing of sundown towns) will always be overwhelmingly white. Nevertheless, the presence of one person of color, compared to none, still has an effect. Often it causes a difference in tone, in rhetoric, just as Donald Trump would probably not use his "locker room" rhetoric in a group that also included a woman.

Another source of jury imbalance comes from the measures that Republican state legislatures have passed to make it harder to register to vote. Since jurors are picked from the universe of registered voters, whitening that universe also whitens juries. This is yet another reason to undo the voter-suppression measures that so many states, North as well as South, have passed in the last few years.

Who would have thought we still have to win, in 2017, the victories won in Mississippi more than 40 years ago!

    [1]Some courts use a three-step process, choosing venires from the underlying population, then dividing the venire on a given day into two or more panels if two or more courtrooms are in session, and then choosing actual jurors after calling groups from the panel into the courtroom for general instructions and questioning.

Copyright James W. Loewen

It's Time for a Shadow Cabinet

UK Shadow Cabinet

Sociologist James W. Loewen is the author of Lies My Teacher Told Me. An earlier version of this article appeared in the San Francisco Chronicle. 

            In view of Donald Trump's shocking nominations for cabinet-level positions, it is in our national interest that Democrats appoint a "shadow cabinet." A shadow cabinet can help mobilize public opinion to ward off the worst Republican excesses and help Democrats do better in the next election.

            "Shadow ministers" form an important feature of political life in many parliamentary democracies, such as Australia. Although the United States is not a parliamentary system, we need to import this feature. The Green Party announced a shadow cabinet in 2012, but no one noticed. The times have changed, however.

            This idea has nothing in common with Trump's appointment of close personal aides in each department, intended to shadow his own cabinet appointees and report evidence of disloyalty. Quite the contrary, a Democratic shadow cabinet will be publicly assembled; its job will be to report to the nation what Trump's appointees are doing to the agencies they direct.

            Except for Defense, Donald Trump has nominated cabinet secretaries who profoundly disagree with the missions of the departments they are supposed to head. Most obvious is Rick Perry, who promised to close the Department of Energy when running for president in 2011, even though he haplessly couldn't remember its name. Betsy DeVos, Trump's Secretary of Education, famously called our public schools a "dead end" and wants to give parents public money to send their children to private schools.

            Trump's choice for the Environmental Protection Agency, Scott Pruitt, Oklahoma's attorney general, has repeatedly sued the agency. The Sierra Club called his nomination putting "an arsonist in charge of fighting fires." Tom Price, Trump's pick to run Health and Human Services, has conflicts of interest owing to investments in drug and medical device companies. Of course he opposes Obamacare. For the Securities and Exchange Commission, Trump chose another fox in a hen house in the form of mergers and acquisitions lawyer Jay Clayton.

            Putting an oil man in charge of our relations with Middle Eastern nations as Secretary of State is also a conflict, since we have fought war after war in the Middle East owing to our oil interests there. Rex Tillerson is further compromised by his close business ties in Russia. For HUD, Trump nominated Ben Carson, who calls integrated housing "social engineering"; he's even called it "Communist."

            Worst of all is Trump's choice for Attorney General, Jefferson Beauregard Sessions III, who has never distanced himself from his neo-Confederate heritage.

            Trump's nominees represent a tiny slice of America, mostly hyper-rich white males. He has also rejected the tradition of appointing at least one cabinet member from the opposing party.

            Someone needs to bring to the fore the missions and activities of these various departments of government. A shadow Secretary of Energy could review Perry's performance, show what he is not doing, and suggest to the American people the agenda that a sensible Department of Energy would be pursuing. Congressional oversight will not do, especially since Republicans control Congress.

            Since our new administration does not value facts, the shadow cabinet must provide the information that the appointed secretaries cannot. Shadow secretaries can collect data about "their" agencies, convene hearings to take public testimony about problems in the Republican operation of them, and even coordinate litigation to help them survive the agendas of the directors who will be running them. Small staffs can help, funded by foundations and think tanks that do not agree with the implicit Republican view that the agencies are basically illegitimate.

            Equally important, the shadow cabinet can offer ideas and programs to give Democratic Congressional candidates a head start toward the 2018 elections. Opposition cabinet members do this in Australia, New Zealand, and other countries.

            Newt Gingrich's 1994 "Contract with America" showed the advantage coordination can provide. Democrats have long lagged the G.O.P. in sound bites and clarity. A shadow cabinet can tell the public what Democrats stand for and plan to do, as well as what Republicans are not doing.

            Certainly an array of talent is out there awaiting nomination. For Energy, Skip Laitner, president of the Association for Environmental Studies and Sciences. For Secretary of Education, how about Diane Ravitch? For Labor, maybe Lawrence Mishel of the Economic Policy Institute. For the SEC, HUD, or anything she wants, Elizabeth Warren. For Commerce, perhaps Nobel Prize winner Joseph Stiglitz. Congressional Democrats or the national Democratic Party leadership might make the choices. Alternatively, a think tank or foundation might take the lead.

            Regardless of the details, taking this step now will induce the public to volunteer time, money, and ideas, rather than doing nothing until the fall of 2018. And surely America needs a shadow cabinet today more than Australia ever has! 

Copyright James W. Loewen 

James Loewen discusses Confederate monuments and memorials

Related Link: Civil War symposium on monuments considers "America's Most Honored Traitor" and some "Modest Proposals"

In "Confederate Monuments and Memorials," James Loewen, author of Lies Across America: What Our Historic Sites Get Wrong and other works, talked about new perspectives on Confederate monuments.

“Confederate Monuments: Modest Proposals” was part of the American Civil War Museum’s 2017 annual symposium, “Lightning Rods of Controversy: Civil War Monuments Past, Present, and Future.” It was co-sponsored by the John L. Nau III Center for Civil War History at the University of Virginia and The Library of Virginia. 

Rockville’s Confederate Monument Belongs at White’s Ferry

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

A year ago Montgomery County, Maryland, decided to remove its Confederate statue from its position of power, next to the courthouse in Rockville. Finally they have figured out where to put it.

White's Ferry, the only ferry left on the Potomac, lies 24 miles west of the courthouse. The father of the ferry’s current owner named his ferry for Confederate general Jubal Early because Early had a "rebellious, no surrender attitude." It is certainly true that after the Civil War, Early never changed his attitude. He still defended slavery as appropriate for African Americans, since "The Creator of the Universe had stamped them, indelibly, with a different color and an inferior physical and mental organization." He was an early proponent of the "Lost Cause" mythology and helped organize the Southern Historical Society to spread its biased views. The statue will be in good company. Former Washington Post journalist Eugene L. Meyer recently called it "as fitting a place as any.”[1] 

However, Meyer professed to believe that the county is moving it "to pretend the whole dirty business of slavery and the county's role in trying to preserve it never happened."

In fact, every monument has something to teach us about three eras: what it's about, when it went up, and our present day. Rockville's Confederate monument tells us very little about the Civil War, however. Looking at it, no one would guess that Maryland sent 63,000 to the U.S. army and navy and less than 40% as many -- perhaps 24,000 -- to the Confederate side. The county has no monument to its Union dead, though it does have another, in Silver Spring, to Confederates. Nor does the monument say a word about slavery.

The monument does tell us how to think about the War: we are to revere the Confederate side:

To our heroes of Montgomery Co. Maryland

That we through life may not forget to love the Thin Gray Line

In actuality, the Confederate line came through Montgomery and Frederick counties three times during the war, en route to Antietam, Gettysburg, and Washington. The first two times, Confederates hoped to find succor, even provoke an uprising on their behalf, and were sorely disappointed.

En route to Gettysburg, J.E.B. Stuart's men kidnapped every African American they got their hands on, including about a hundred near the monument's current position, and dragged them into slavery in Virginia. Ironically, on the courthouse grounds a historical marker tells of his raid but keeps the enslaved people invisible, mentioning only the capture of "150 U.S. wagons."

Overtly, the monument says nothing about when it went up. Meyer says it was proposed for the Monocacy Cemetery, not far from White's Ferry, "but its symbolic importance in the county seat could not be denied." Exactly. What Meyer leaves unsaid is that during the Nadir of race relations, that ghastly era from 1890 to about 1940 when white Americans went more racist in their thinking than at any other time, neo-Confederate thinking changed. No longer were neo-Confederates satisfied to remember the Confederate dead in cemeteries. Rather, they desired to celebrate the Confederate cause at the seat of power. This is when triumphant monuments go up at state capitols and county seats across the South: white supremacist Democrats are symbolizing their control.

Moving the monument today says that members of all races are now again part of the body politic, as they had been during Reconstruction. It also shows that white residents no longer glory in proclaiming the "symbolic importance" of the Confederate cause at the county courthouse.

Around the U.S., sites that celebrated white supremacy, usually overtly when one examines their dedications, are now coming under attack. Around the U.S., defenders of these sites protest they are not defending white supremacy, oh no! Instead, they claim that removing them from their privileged positions on the landscape – at courthouses and state capitols, in important city squares, next to city hall in Dearborn, MI – is an effort to hide unpleasant history.

Quite the opposite! All we have to do, in the case of the Rockville monument, is to get the county to put up in its stead, visible from the sidewalk, a historical marker that hides nothing. It can tell accurately of the county's role in the Civil War, the statue's role in the Nadir of race relations, and the date that it moved, along with the reasons for its departure.

That tells far more history than the monument itself ever did.

[1] “Don't forget the past — just hide it,” Washington Post, March 19, 2017.

Copyright James W. Loewen 

"Why Was There the Civil War?" Here’s Your Answer.

Union soldiers before Marye's Heights, Second Fredericksburg

Sociologist James W. Loewen is the author of Lies My Teacher Told Me. This article was first published in the Washington Post and is reprinted with permission. 

President Trump yet again finds himself in possible Pinocchio land, this time for suggesting that Andrew Jackson could have — or would have — averted the Civil War. This claim is more complicated than the size of the crowd at Trump’s inauguration, however. Trump has touched a controversy that has engaged historians and the public for more than a century.

Trump’s view on Jackson is unlikely but not absurd. Jackson famously faced down his vice president, John C. Calhoun, on the possibility of states rebelling against the federal government after Congress passed a tariff that hurt the Southern plantation economy. Jackson got Congress to authorize him to use military force following South Carolina’s attempt to “nullify” the tariff, but the crisis was averted when Congress passed a compromise tariff in 1833.

Still, Trump’s comment about Jackson was in the service of his wider discussion of the Civil War. “People don’t ask that question,” he said in an interview with the Washington Examiner, “but why was there the Civil War?”

This is truly an important question, and we can only wonder what the president would have said had his interviewer asked, “What do you think?” All too many Americans reply vaguely, “states’ rights,” even though Southern leaders, as they left the Union, made it clear that they opposed states’ rights and even named the states and rights that offended them. Americans are vague because their textbooks are vague; publishers don’t want to offend white school boards in Dixie.

Trump’s conclusion about Jackson places him in a camp of 1930s historians who called it a “needless war,” in the words of James G. Randall, brought about by a “blundering generation.” That view is a product of its time, and that time is now known as the Nadir of Race Relations. The Nadir began at the end of 1890 and began to ease around 1940. It was marked by lynchings, the eugenics movement and the spread of sundown towns across the North. Neo-Confederates put up triumphant Confederate monuments from Helena, Mont., to Key West, Fla., obfuscating why the Southern states seceded. They claimed it was about tariffs or states’ rights — anything but slavery.

Earlier, everyone knew better. In 1858, William Seward, a Republican senator from New York, gave a famous speech titled “The Irrepressible Conflict,” referring to the struggle between “slave labor” and “voluntary labor.” When Mississippi seceded, it emphasized the same point: “Our position is thoroughly identified with the institution of slavery — the greatest material interest of the world.”

Simply to recognize this material interest renders improbable the “needless war” notion. Mississippi was right: Slavery was the greatest material interest in the United States, if not the world. Slaves made up an investment greater than all manufacturing companies and railroads in the nation. Never has an elite given up such a stake voluntarily. The North went to war to hold the nation together, not to emancipate anyone. But the Civil War did end slavery. When might that have happened otherwise?

Today, when slavery has no state sanction anywhere, it seems obvious that the institution could not have survived to the 21st century. But if the South had prevailed, cotton would have resumed its role as “the largest and most important portions of the commerce of the earth,” to quote Mississippi’s secession document. The Confederacy might have replaced France as the colonial ruler in Mexico and Spain in Cuba. Eyeing such a strong economic and military model, Brazil might never have abandoned slavery.

There is one more layer on this onion: The South did not quite secede for slavery, but for slavery as the mechanism to ensure white supremacy. On many occasions, its leaders made this clear. Trying to persuade fellow Texans to secede, John Marshall wrote in his Austin State Gazette in 1861: “It is essential to the honor and safety of every poor white man to keep the negro in his present state of subordination and discipline.” In 1863, William Thompson, founder of the Savannah Morning News, proposed a new, mostly white national flag for the Confederacy: “As a people, we are fighting to maintain the Heaven-ordained supremacy of the white man over the inferior or colored race; a white flag would thus be emblematical of our cause.” The government agreed and adopted his flag. Late in the war, trying to persuade Confederates to persevere, the Richmond Daily Enquirer asked, “What are we fighting for? We are fighting for the idea of race.”

Some Trump partisans are clearly still fighting for that idea. Unfortunately, the Civil War settled only the issue of slavery — not white supremacy. Getting the Civil War wrong was part of the program of white supremacy during the Nadir. Today, getting it right is not just Trump’s responsibility — it’s all of ours.

Slandering Native Americans this Spring

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

             Across the U.S. over the last two decades I have asked audiences, "What was the most important purchase in the history of the U.S. ever made for the exact sum of $24, in fact for $24 worth of beads?"

            Across the U.S., they chorus, "Manhattan." Asked, "Did you learn this 'fact' in college? in graduate school?" they chorus, "No." Most think a teacher first exposed them to this information back in elementary school.

            Acquaintance with this fable is hardly limited to the East Coast. The creators of the newspaper comic "Mother Goose & Grimm" know this; their strip for March 28, 2017, relies on Americans knowing it.

            In a sense, the comic offers a fresh take on this stale tale. Natives are recouping something by charging a lot for drinks at their casino. In another sense, however, it's one more put-down of American Indians by a "settler culture" that has been putting them down for more than 500 years.

            What is wrong with this little fable starts with the price. So far as I know, the $24 figure turns out to be arbitrary and bogus.[1] Schoolchildren have learned it for decades anyway, even though it becomes stupider every year. Consider: Abraham Lincoln bought his home in Springfield, Illinois, in 1844, for $1,200. He did add a second story to it, but the original home would probably sell today for about $100,000 — 83 times as much.[2] Even if $24 had been the price of Manhattan in 1844 dollars, it would be maybe $2,000 today. This $24 is the only price in the Western World that has never been touched by inflation!
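A quick back-of-the-envelope check of that scaling, using the Lincoln-home ratio from the paragraph above as a rough inflation multiplier (the 83-fold figure is only an estimate, not an official price index):

\[
\frac{\$100{,}000}{\$1{,}200} \approx 83, \qquad \$24 \times 83 \approx \$2{,}000 .
\]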

            Then there are the beads. So far as historians can tell, beads and trinkets were not involved. What the Native Americans wanted and could not make themselves were mainly five items: steel axes, steel knives, metal kettles (which they used as kettles but also as raw material for other things), guns, and brightly colored woolen blankets. For perhaps $2,400 worth of such trade goods, the Dutch bought the rights to Manhattan, probably from the Canarsies.

            You can go to New York City today and take the subway to Canarsie.  If you do, you will find yourself in Brooklyn, indeed east Brooklyn, at the end of the line. That's where the Canarsies lived.

            Why wouldn't they sell Manhattan?

            No doubt the Canarsies were as pleased with their deal as the legendary New Yorker who, centuries later, sold the Brooklyn Bridge to some hapless tourist, for they got paid for something that wasn't theirs in the first place.

            The Dutch didn't really care. They used the transaction to legitimize their presence to the next English ship that came by. The deal also made allies of the Canarsies, who otherwise might have joined with the British or other nearby American Indians against them.

            The Weckquaesgeeks, who actually lived on Manhattan, were not so pleased.[3] They warred sporadically with the Dutch for years – hence “Wall St.” Finally, around 1644 in Kieft's War, perhaps with help from the Canarsies, the Dutch exterminated them as a tribe. Survivors fled to what is now Westchester County.

            Almost no Americans know the Canarsies/Weckquaesgeeks story. On the other hand, so many of us know the nonsensical $24 myth that cartoonists can count on a laugh by invoking it. Why nonsensical? Well, consider what you'd do if you were the man on the left in the photograph below. Would you sell your share of your village, gardens, fields, your burial ground, and your gathering and hunting rights throughout and around Manhattan, in exchange for a few strings of beads? If you think you might, consider what you would then do the next day. Pack up and move, of course, but to where? New Jersey? People already live there, so you'd have to fight or negotiate with them before moving in. Then what? You'd face at least a year of hard work — clearing new fields, building new houses, planting new gardens ... all for a few beads?

In Battery Park at the lower tip of Manhattan, at the exact spot where this deal never took place, stands this monument. It's hard to believe that one scene can get so many things wrong, from the headdress to the difference in dress to the beads.

            The obvious falsity of the $24 story raises a question: why do teachers persist in teaching it?

            One way to answer this question is to think about what the story accomplishes. Sometimes how a cultural element functions is more important than whether it is true or false. The $24 story mythologizes much more than the taking of one small island. Manhattan is a synecdoche that symbolizes the taking of much of a continent. Indeed, like the Dutch, European Americans repeatedly paid the wrong tribe or paid off a small faction within a much larger nation. Often, like the Dutch, they didn't really care. Fraudulent transactions might even work better than legitimate purchases, for they set one tribe or faction against another while providing the newcomers with a semblance of legality to stifle criticism.

            The biggest single purchase from the wrong tribe took place in 1803, when Jefferson "doubled the size of the United States by buying Louisiana from France," as all the textbooks put it. And just like Manhattan, what a bargain! Just $15,000,000! Recent scholarship by Robert Lee [4] shows that the United States spent at least $2.6 billion in today's terms buying Louisiana from the Natives who owned it. All we got from France was the European rights to negotiate with them.[5] Nevertheless, Natives drop out of the textbook accounts of the Louisiana Purchase and off the maps of Louisiana Territory.

            The $24 myth has at least two important effects. First, it makes Native Americans look stupid. As the cartoon implies, $24 won't even buy two mixed-drink Manhattans at a nice bar today. Those idiotic Indians! They didn't know what they were doing! Second, it legitimizes the taking. We didn't really take the land or invade the continent; we bought it, fair and square. It didn't cost much, either! Thus the $24 myth sets us up to believe that acquiring Native lands was never very problematic.

            In reality, how European Americans got the country remains problematic. The U.S. and its predecessor colonies took other people's lands, uprooted their cultures, and in some cases moved them hundreds of miles. Then we kept them from acculturating and succeeding in our society. It can be hard to face these facts. The $24 tale is much more comforting.

            Surely these functions and the sense of entitlement and moral and intellectual superiority they engender help explain why the $24 story still gets passed on, even though it's so obviously absurd. At this point, however, it's important to consider who is "we" in the previous passage. Literally, it is everyone but Native Americans.

            Racism means treating people unfavorably because of their racial or cultural group membership. Sociologists identify three types of racism: individual, institutional, and cultural. Most Americans are familiar with the notion of individual racism — David Duke, for example. Institutional racism — unfavorable treatment by a social institution of a group of persons, based on group membership — may have no racist animus behind it. That would be true for the SAT, which uses a statistical process to discard items favorable to African Americans — but not on purpose.

            Perhaps most deep-seated of all, deeper even than the psychic racism of a KKK leader like Duke, is cultural racism. This is the ideology that one race is superior to others, expressed in etiquette, religion, law, in terms built into the language, and in countless other elements of our culture. The $24 story is an example. Soft-pedaling the invasion intrinsically entails making fools of Native Americans today. How could they be so dim-witted as to give away Manhattan for $24 of beads? The creators of "Mother Goose & Grimm" this spring took their stupidity for granted as the foundation upon which they based their strip.

            The butt of the bridge joke is always the naïve tourist, who does not realize that the seller has no rights to what he sells. Here, that would be the Dutchman. The Dutchman is not the butt of the $24 story, however. That would be the naïve resident, the Native, who does not realize that he is being swindled. What gets defined as funny and whose behavior gets defined as hapless depends on who holds the power today.

    [1] I review the evidence in "The $24 Myth," Chapter 7 of Teaching What Really Happened (NY: Teachers College Press, 2010), and "Making Native Americans Look Stupid," in Lies Across America (NY: Simon & Schuster, 2000).

    [2] Lincoln Home National Historic Site, Abraham Lincoln Online.

    [3] Although many writers call the Natives who lived on Manhattan Weckquaesgeeks, like most pre-contact American Indians in the East, they lived in villages only loosely organized into tribes. Some were called "Manahattans," variously spelled. Some may have been members of still smaller groups, such as the Reckgawawancs who were tributaries of the Weckquaesgeeks. The latter also lived in what is now the Bronx and Westchester County. No Indians may have been living on the southern tip of the island, for the Dutch moved in with no difficulty and lived there for a year with no treaty with anyone.

    [4] "Accounting for Conquest: The Price of the Louisiana Purchase of Indian Country, Journal of American History 103 #4 (3/2017), 931.

    [5] We also got the land France did control — much of the present state of Louisiana.

Copyright James W. Loewen 

My Cheating Memoirs #1: Harvard, 1966

Harvard's Memorial Hall

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

In 1964, I entered grad school at Harvard. At the time, Harvard was just entering the '60s. (It was not on the cutting edge.) All undergraduates were required to wear coats and ties for the dinner meal, so most dressed that way all day. All Harvard undergraduates were male; females allegedly attended "Radcliffe," a college with almost no faculty except in P.E.

Another hoary tradition afflicting Harvard was the "Gentleman's C." Professors and administrators openly acknowledged the split between "the gentlemen and the scholars." People like FDR and George W. Bush (oh all right, he attended Yale, then Harvard Business School, but close enough) never studied very hard, but Harvard passed them through anyway, hoping to get endowments from their rich parents and also hoping to bask in the reflected glow when they assumed their rightful place in the national upper class. "Donations from the rich make possible scholarships for the worthy," was the explanation.

At Harvard, cheating was a tradition too. To combat it, administrators took extraordinary steps. Half a century later, I still recall my disgust at Harvard for the way they tested graduate students at the end of our first semester. Fifty-five of us were taking the year-long introductory graduate course in "Social Relations," required for all students in sociology, social and cultural anthropology, and social and clinical psychology. They placed us in Memorial Hall, a huge auditorium honoring Civil War casualties, rather than our regular classroom. Then they had us sit in every third seat of every third row. To achieve economies of scale, they mixed us in with students from several other courses, who filled the rows between. I still remember that they placed Sanskrit students in the rows just ahead of us, so that anyone desperate enough to copy a neighbor's answers would be copying in another discipline, one written in a different alphabet!

Not one professor sat in the room. If students could not understand a question, even one with a misprint, they had no recourse. Only proctors were there, to make sure we did not cheat. The arrangements implied that no one expected an intellectual subculture among Harvard graduate students; only policing was in evidence.

Those of us from small liberal arts colleges where cheating was rare were also mystified by another Harvard final exam tradition. The "blue books" into which we were to write our essays all said, in large capital letters on their front covers:

DO NOT REMOVE PAGES FROM BOOKS.

DO NOT REMOVE BOOKS FROM ROOM.

 We could not fathom this. Suppose I wrote two pages of an essay and then decided I was on the wrong track. Why wouldn't I tear those pages out? We understood why the institution would not allow pages to be inserted into the blue books, but taken out? It was also off-putting to see those two sentences as the final words of every course.[1]

As their absence from final exam rooms implied, many Harvard professors did not take undergraduate education seriously. Some actually called it "a distraction" from "my work," which was, of course, their (mostly) abstruse research. Most professors taught as few undergraduate courses as they could, and when they did, it was in huge lecture classes. I shouldn't really have conflated Harvard and Yale above, because Harvard was then distinctly worse than Yale: although it had only a few more undergraduates, Harvard had ten times as many TAs. They did most of the real teaching, to the extent that anyone did.

In 1965, we grad students were growing upset that so many college professors were such terrible teachers. We knew there had to be better ways to learn than just sitting through droning lectures. We cared about teaching, but the Department of Social Relations didn't. For our important jobs as TAs, it gave us no preparation, not even an orientation as to what resources were available to students, TAs, and faculty.

My Cheating Memoirs #2: Freud's Four Stages of Sexuality in Mississippi

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

          The first exam I ever gave as a full-time teacher was in Introductory Sociology at Tougaloo College, a small black college in Mississippi, in late September, 1968. I had taught quite a bit in my last two years of grad school at Harvard University, however, and Harvard is a world leader in student fraud, so I knew to take an elementary precaution. While my students took the hour exam, I drew a seating chart.

            My class met in the Social Science Lab, space that Dr. Ernst Borinski, professor of sociology at Tougaloo, had reclaimed from the basement of Beard Hall. The Lab was mainly a long narrow room, maybe 18' wide by 40' long, with two sets of long narrow tables running its length. Students sat elbow to elbow at the tables.

Ernst Borinski holds forth in the Social Science Lab at Tougaloo College in the early 1960s. Borinski, a Jewish refugee from Hitler's Germany, is one of three subjects of From Swastika to Jim Crow, a book, film, and museum exhibit now circulating around the United States. As a result, this image is now widely known: I recall seeing it go past me on the street one Sunday morning in Philadelphia a couple of years ago. I was looking for a restaurant for brunch, and there it was, plastered on the side of a bus, part of an ad for the museum exhibit then showing at the new National Museum of American Jewish History. Photo courtesy Tougaloo College.

           One of the main subjects covered early in Introductory Sociology is socialization — the process by which unformed babies take in the folkways and mores of society and become, not merely humans, but Tahitians, Americans, whatever. This insight may seem obvious to you, reading this paragraph as an adult, but I still remember the impact it made on me as a first-year college student, way back in 1960. I had never given much thought to how we became human. It just seemed natural.

            Calling it "socialization" problematizes the process, however, and it comes to seem worthy of study, not "natural." As well, understanding socialization helps one become less ethnocentric. Students realize that eating three meals a day, having sex two times a week (the American average, sadly), and believing in one god is not natural. It's all cultural, and American culture is not the only or even always the best way to do things.

            The power of socialization is considerable. Indeed, sociologists aver that socialization makes society possible. How do we come to internalize the folkways and mores? and with such force that we believe with all our being that our neighbor, who turns out to have two spouses, is not only wrong but even evil?

            One way to understand the process is by teaching Freudian concepts: the id, the ego, and most especially the developing superego. I had done so at some length that fall, even to the point of including Freud's famous four stages of psychosexual development: oral, anal, Oedipal, and adult or genital. As a first-year teacher, I thought it obvious to create a test item:

            List Freud's four stages of psychosexual development:

                        1. __________________

                        2. __________________

                        3. __________________

                        4. __________________

Hopefully the rest of my test was of a higher order of complexity.

            The four stages question was complex enough for at least three of my students, however, all female. They all answered "oil" for the first stage. Now, my pronunciation of "r" is imperfect, so this might have been a simple misunderstanding, although when I went on to describe how tiny infants experience much of the world by putting it in their mouths, the term "oil fixation" does not make a lot of sense.

            For stage two, all three students replied "annual." I pronounce "anal" just fine, so there was no way that three students could all hear it as "annual." Nor did my description of the rigors of toilet training — according to Freud the first major behavioral demand that parents impose upon their children — imply anything yearly about the process.

            For stage three, all three students filled in "psychosexual," a place-holder of sorts, I suppose.

            Then, for stage four, all three answered with a stage of psychosexual development so high that I, personally, have never reached it: "nonmanual!"

Sigmund Freud, inventor of the four stages of psychosexual development, including (in Mississippi) nonmanual. Photo 1921.

            Of course, the three students "happened" to sit next to each other. One student in 32 had answered with these four choices. The likelihood that a second student would do so would surely be less than 1 in 10,000, because the answers are pretty wondrous, but even if we simply assigned each student the same likelihood — about 1 in 30 — then the probability that three students would do so — and would sit next to each other by purest chance — would be less than 1/30 cubed, or about .00004 — a very small number!
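A rough check of that figure, treating the three students' answers as independent and assigning each the same 1-in-30 likelihood (an assumption, as the paragraph above notes):

\[
\left(\tfrac{1}{30}\right)^{3} \;=\; \tfrac{1}{27{,}000} \;\approx\; 0.00004 .
\]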

            I felt justified in giving each student a zero for the entire exam, with a written explanation. After photocopying the exams, I handed them back. As a new teacher, that's all I did. Now I know to meet with each student individually, discuss why they were taking sociology, and then engage in heavy counseling with them, either to persuade them to drop the course or to take it seriously.

            Next semester, one of the three came to my office. "Mr. Loewen," she said, "I just wanted you to know that sociology is going to be my major, and I would never cheat in my major."

            I thought that showed some standard of decency. Moreover, I had every reason to believe her, not least because she sat in the middle of the three. She had to be the innocent cheatee, the others the cheaters. So I changed her exam grade to the score she would have earned, "D." It wasn't a good grade — unsurprising, I suppose, given her performance on the four stages. Nevertheless, 62 is so much better than 0 that it raised her overall course grade from a D+ to a C+. I made that change at the registrar's, and she received notice at the end of the semester. We never spoke about the incident again.

            Regrettably, I never asked her about the nonmanual stage. Now I'll never know what it is.

 Copyright James W. Loewen

My Cheating Memoirs #3: Ladysmith Black Mambazo Plays Vermont

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

             When I taught sociology at the University of Vermont (UVM), one course I offered repeatedly was "Race Relations." It had several requirements. One I called "Get Out of the White Cocoon." Students had to attend ten hours of events run by or featuring someone from a race different from their own. White students could go hear a jazz concert, a Native American speaker, even watch the new Spike Lee movie (back in his heyday).

            After each event, they wrote in their journals. "Summarize the event, for those who did not attend," I suggested. "Tell some important 'take-away' you had from the event. Or, depending on your reaction, critique at least one idea presented at the event."

            During the semester, my TAs and I suggested dozens of events from which students could choose. That might surprise you, since Vermont is not near a major black population center, but UVM drew a good share of important and diverse lecturers. As well, Burlington is often an early station on national tours for music groups, particularly those that had just played Boston or Montreal.

            One semester Ladysmith Black Mambazo came to the fine arts series. (If you don't know this fabulous a cappella male singing group from Ladysmith Township in South Africa, go here right now. You can come back in a few minutes.) I said the event would count toward the "white cocoon" requirement. I mentioned that they had performed on Paul Simon's famous "Graceland" album and had also done a recent music commercial for LifeSavers candy.

            Two girls (the term used at UVM for female students!) who were best friends each attended the Ladysmith Black Mambazo concert. Or so their journals told me when next I collected them. "Lizzie" wrote, "I went to hear Ladysmith Black Mambazo. I was excited, because I liked them on Paul Simon's 'Graceland' album and also enjoyed their recent music commercial for LifeSavers. The concert was very good." In terms of its review of the event, this was the shortest journal entry I had ever received. "Betsy" wrote a little more: "I went to hear Ladysmith Black Mambazo. The ladies wore beautiful dresses, and they were excellent on their instruments."

            Neither had attended, of course, but I needed to prove that. So I asked both to come in and see me, Betsy at 3PM the next afternoon, Lizzie at 3:10PM. To misdirect them, I asked both to bring in their most recent hour-exam.

            Betsy walked into my office. I asked her what kind of instruments the ladies played. She looked like a deer in the headlights, but she wasn't stupid — she knew they were from Africa. So, "Drums," she replied.

            I thought about asking about their dresses but decided she had twisted in the wind long enough. "Betsy," I said gently, "they don't have any instruments. They aren't ladies, either. Ladysmith Black Mambazo is all male, and they sing a cappella. Do you know what that means?"

            "Yes," she said.

            "Ladysmith is a place," I went on.

            "I didn't go," Betsy said, softly.

            "I know," I said. "I've never had anyone lie in their journals before. I'll have to think about what to do about it."

            I escorted her out of my office and admitted Lizzie. She proved a tougher nut to crack. She had learned that Ladysmith Black Mambazo was all-male. So I read her terse "concert review" to her. "Lizzie," I said, "in ten years of teaching this course, always with the White Cocoon requirement, this is the shortest review of an event I've ever gotten. It's content-free! Can you tell me more about the concert? What did they sing about? What difference did their songs make to you?"

            "To tell the truth," she replied, four words that had nothing to do with what she next said, "I couldn't stay. I only heard about fifteen minutes."

            Aside from her claiming two hours of the requirement for a fifteen-minute stay, this statement was not credible. Tickets cost $24, and this back in 1994! Pretty soon Lizzie, too, agreed that she had never gone at all.

            I was disgusted. These students had lied in their journals. Somehow that struck me as showing even less integrity than cheating on an exam. For one thing, I had never required them to attend Ladysmith Black Mambazo. There were many ways to fulfill the ten hour requirement. It was hardly onerous. On the contrary, years later, students would stop me on the street to thank me for the requirement, not just on account of race relations, but also because it got them into the habit of going to events on campus and in the community.

            So I turned both students in to UVM's Vice President in Charge of Cheating. I forget what he made them do. But I never forgot "The ladies wore beautiful dresses."

 Copyright James W. Loewen

On Dealing with Serious Problems

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

     Recently I had cause to contemplate some basic issues of human life. Exactly what spurred me to do so is idiosyncratic and need not concern us here. My point, rather, is to show how classical music can help us process whatever we face.

     Not just I have relied on classical music in this way. A favorite poet of mine, Edna St. Vincent Millay, wrote a sonnet, "On Hearing a Symphony by Beethoven," that captures the thought:

Sweet sounds, oh, beautiful music, do not cease!
Reject me not into the world again.
With you alone is excellence and peace,
Mankind made plausible, his purpose plain
....
Music my rampart, and my only one.

            Now, I must quickly note that music is not my only rampart. Nature is another, even more important. Poetry is a third. The romantic U.S. poet William Cullen Bryant combined these two in "Thanatopsis." Please pardon Bryant's use of "him" and "his" for the protagonist; after all, he does atone by using "she" and "her" for Nature, the central force in the piece. The poem begins:

To him who in the love of Nature holds
Communion with her visible forms, she speaks
A various language; for his gayer hours
She has a voice of gladness, and a smile
And eloquence of beauty, and she glides
Into his darker musings, with a mild
And healing sympathy, that steals away
Their sharpness, ere he is aware.

Those lines, combined with various spots on the surface of the earth special to me, have relieved at least the sharpest edge of my pain when I faced such past misfortunes as divorce and separation from my young children. The spots need not be all that special — I'm not talkin' Yosemite here. Just a turn in a brook will do, or a certain tree, small, misshapen, yet sturdy. 

            The title, "Thanatopsis," means contemplation of death. I learned that only while researching this essay. At this point the poem turns to death, but we shall leave it here. My recent problem was not impending death or thoughts thereof, and the relief I found was not in nature or poetry but in music.

             I turned to Anton Bruckner, composer of nine symphonies between 1863 and 1896, specifically to his massive Symphony #8, perhaps the longest purely orchestral symphony ever written. It lasts about an hour and a half, depending upon the conductor.

    My own introduction to the piece came via what we then called "LPs" — "long-playing records," now "vinyl." Each side of these twelve-inch disks usually lasts twenty to thirty minutes, so Bruckner's Eighth occupied two disks. In those days (1963!), important new recordings of classical music were not only reviewed in specialty magazines (High Fidelity, HiFi/Stereo Review, Stereophile) but also in newspapers. I asked the music critic of the Chicago Sun-Times, Robert C. Marsh, for his advice about the best recording of Bruckner's Eighth, and he wrote back to recommend the old monaural set by the famed Concertgebouw Orchestra of Amsterdam under Eduard van Beinum. I bought it and still play it, but for you I shall recommend a newer CD version.

            Two or three years later I heard it live, played by the Chicago Symphony Orchestra under visiting conductor Rafael Kubelik. That concert remains one of the most memorable I've ever attended. Let me tell you why. 

             Craig, my best friend from high school, was with me. I think we were in Chicago en route home from Harvard, where I was getting a Ph.D. in sociology and Craig was finishing his B.A. in astronomy. We had arranged to meet up with his parents. Time has obscured the details of all this. I do remember that my suggestion to hear the symphony carried the day. I also recall that Craig and I got very cheap tickets in the student balcony, nosebleed seats from which it seemed we could see down into the bassoon. Mr. and Mrs. Chester paid much more for their seats, but when it came time to go into the auditorium, theirs were just four rows ahead of us, in the same high balcony!

            I knew what to expect; they did not. The symphony was the only work on the program. The Chicago Symphony played wonderfully. It was then perhaps the best orchestra in the world, certainly the most precise. The work begins quietly but soon leads to an orchestral climax. Climaxes then alternate with hushed suspense. After a quarter hour, the movement ends quietly but with some grandeur.

            I'm sure some listeners can grow anxious along the way, because the structure is not obvious. That is, it contains no "passagework," no repeats, no traditional A-B-A form as found in symphonies from early Haydn through Beethoven. Instead, Bruckner builds it from shorter passages, often separated by pauses. It is as if he is thinking, "All right, I just told you something important. Now I'm going to tell you something else important." The more important the thought, the longer the pause. 

             Listening at home, I suggest you play the symphony all the way through without stopping, letting Bruckner make the impact he can upon you. Try not to do anything else while listening, however, such as writing this essay (sigh!).

             The second movement, scherzo, is faster. Again, it is a quarter hour long, and again, quieter passages lead to grand climaxes with kettle drums. It comes to a firm resolution, but as is the case with all symphonic movements before the final one, some unresolved tension is supposed to linger, and here we find no exception.

            The two final movements each last nearly half an hour. The slow movement begins almost statically, with a single chord, which Bruckner then repeats. He seems very assured — as if he knows he has your undivided attention and will now use it to provide you some very calm moments.[1] It has passages charged with loveliness that I find very meaningful, but it is again not clear exactly how they build. The movement has climaxes, but these are lovelier, rather than simply louder, and only the harp quite reaches their apogees. It was at these moments in Chicago that I started to hear something I have never otherwise heard at any classical music concert, and only once or twice at any venue: soft gasps from people so moved by the music that they did not know they were reacting audibly. The movement ends softly, as it began.

            The final movement begins grandly with a brass climax leading to four majestic blows on the tympani.[2] These blows can be life-changing: years ago, I read about a very successful lawyer or banker in New York City who, having heard them, determined he would conduct the symphony. After years of study and planning, he hired an orchestra, rented Carnegie Hall (I think), and did! As with the first two movements, there are repeated brass climaxes. In Chicago I looked down with amazement as the face of Adolph Herseth, the symphony's first-chair trumpeter, turned pink and then crimson, playing them. Again, immediately after some of these climaxes, soft gasps ricocheted about the hall. 

             Near the end of the movement, with the orchestra otherwise still, two soft short kettle drum rolls signal somehow to the listener that Bruckner is now going to tie everything together. Extraordinarily, he does. The finale references many of the motifs from all four movements, somehow connecting them into a climax so grand that at its end I conclude, as I do when I think about Labrador Retrievers, "Well, humankind cannot be all bad; we did produce that!"

            James Agee surely never heard much Bruckner. Long-playing record sets were still rare when he died in 1955, and American orchestral performances of Bruckner became commonplace only in the 1960s. Certainly he never mentioned Bruckner in his famous instructions on how to listen to serious classical music, found near the beginning of Let Us Now Praise Famous Men, written in 1936-37. However, I'm sure they apply equally to Bruckner's Eighth. 

Get a radio or a phonograph capable of the most extreme loudness possible, and sit down to listen to a performance of Beethoven's Seventh Symphony or of Schubert's C-Major Symphony (#9). But I don't mean just sit down and listen. I mean this: Turn it on as loud as you can get it. Then get down on the floor and jam your ear as close into the loudspeaker as you can get it, and stay there, breathing as lightly as possible, and not moving, and neither eating nor smoking nor drinking. Concentrate everything you can into your hearing and into your body.[3]

            

On the next page, Agee goes on to write a passage that resides, for some reason, inside single quotation marks:

'Beethoven said a thing as rash and noble as the best of his work. By my memory, he said: "He who understands my music can never know unhappiness again." I believe it....'

Beethoven did write something like that, variously translated. Like Bruckner, some Beethoven can help one deal with whatever life visits upon you. Lenin even said that Beethoven's music was dangerous because it made him want to be kinder to his fellow human beings! Hitler also loved Beethoven (and Bruckner), so I make no claim that having either Beethoven or Bruckner by one's side necessarily makes one a better human being. It does me, though.

            What is the meaning of Bruckner's Eighth? It is deep classical music — indeed, the deepest. It is not program music, like, say, Scheherazade, portraying a ship in storm and other images. It's ineffable — so you cannot expect me to eff it for you. Maybe it says, "It is all right." Or, at least, "It will be all right." As the Beatles put it, "There will be an answer; let it be." (Though Bruckner does not move me — or Hitler or Lenin! — toward passivity.) 

            My program for you is that after you have played the symphony all the way through, you play just the slow movement, with its utterly calm beginning. Play it at least twice more. My friend Craig (yes, we're still friends) thinks you need to know a piece well enough that you know what is coming next, so you can sort of "hum along" mentally, before it can work its magic on you. The CD set by the Berlin Philharmonic conducted by Daniel Barenboim is one reasonably good recording.[4] See if it reaches you. 

Anton Bruckner identified completely with music. He was a choir boy in this monastery in St. Florian, Austria, then its organist in the 1850s, and chose to be buried under its organ.

Now, let me stop here for a word of caution: Bruckner's Eighth is not for everyone. It was not for Mrs. Chester, in Chicago. She had not heard what many of the rest of us had heard. No gasps from her. Instead, when we reunited on the street afterward, her first words were, "That was so long!"[5] Recently I played the beginning of the slow movement for one of my closest friends, an artist herself, with exquisite taste in painting and sculpture. She said, "It is as if someone were speaking to me very intensely and very sincerely but in a foreign language." 

            If you cannot get into this gargantuan work, at least not immediately, try this alternative: go to Spotify (or wherever you listen to new music) and play the "little slow movement," Adagietto, from Gustav Mahler's Fifth Symphony. It'll take less than ten minutes. Try it several times, paying attention. If that too makes no impact upon you, then maybe you'll want to seek solace in nature or poetry. Or maybe meditation or prayer will help in your moment of need.

            My final advice: do it now. That is, learn how to pray or meditate now. Or find your own sacred spot in nature now. Memorize the poetry most important to you now. Or perhaps, learn what pieces of music reach you deeply now. 

            Then you will have the resources when you need them. And when that time comes, know that you have my best wishes, and maybe Anton Bruckner's as well. 

    [1]Bruckner's assurance is inferable in other ways, such as the sheer length of many of his works. But he was also insecure, as various stories about him attest. He never received the audience response his works deserved and sometimes struggled just to get them played at all. 

    [2]I need to note that the Symphony exists in different editions. Many Bruckner symphonies do. Usually the last draft shows a composer's intentions best, but with Bruckner we cannot be sure. Owing to his insecurity, mentioned in the earlier note, an important conductor or critic might persuade him to make changes to which he would acquiesce, perhaps to get the work played at all, even though he might really have preferred the original. 

            The final movement of Symphony #8 then builds immediately to a second brass climax and, in its original version, to a second, very different, tympani blast. In the revised versions, the second brass climax is not followed by tympani. I prefer the original, as did van Beinum. But it's not worth getting into a snit about.

    [3]Recall that in 1936, many "Victrola" phonographs did not even use electricity. Agee's contortions were necessary, to get adequate volume. I don't suggest that you turn a modern system to peak volume — not if you value your neighbors or your eardrums! 

    [4]At least you can hear the soft tympani strokes near the very end of the work; these are almost inaudible in some recordings. It also does contain the second tympani blast at the start of the final movement, mentioned in a previous note.

      [5] Of course, that was her first hearing of it. 

Copyright James W. Loewen

My Lynching Photo Problem, and Ours             

    On February 13, 2017, the University of North Carolina Press announced a new book, Civil Rights, Culture Wars: The Fight Over a Mississippi Textbook. Written by Ole Miss history professor Charles Eagles, Civil Rights, Culture Wars tells the history of another book, Mississippi: Conflict and Change, published in 1974, of which I was senior author and, with Millsaps College history professor Charles Sallis, co-editor.

            Eagles covers the entire saga of how our book came to be, from the background of the Mississippi educational system in the 1950s and the "Oh no!" moment that sparked the project, to the writing of our book and our attempts to get Mississippi to allow its use, leading to our lawsuit against the State Textbook Board, and including the court decision and its impact.

            Reading this book about our book provided me with an unexpected learning experience. Eagles points to an important error we made in Conflict and Change, a mistake other writers have made intentionally. I write this essay to confess this error, to commend Charles Eagles for his detective work, and also to bring attention to the tradition we historians and sociologists have of locating violent racism in the South.

            In 1970 I put together a team of students and faculty at Tougaloo College and nearby Millsaps, to write a new textbook of Mississippi history. The existing book, John K. Bettersworth's Mississippi: Yesterday and Today, was terrible. Bettersworth's textbook bolstered the thinking of Mississippi's notoriously racist white elected officials and even supplied the rationale underlying the actions of Byron de la Beckwith, convicted murderer of civil rights leader Medgar Evers. I discuss the particular "Oh no!" moment that sparked the project in the opening pages of Teaching What Really Happened,  a moment caused by the misinformation in Bettersworth's book.

            Our product, Mississippi: Conflict and Change, published in 1974, won the Lillian Smith Award for Best Southern Nonfiction.[1] Nevertheless, the Mississippi State Textbook Board rejected it. In turn, we sued the Board on First and Fourteenth Amendment grounds. We won a pathbreaking decision (Loewen et al. v. Turnipseed et al.),  hailed by the American Library Association as a landmark case protecting Americans' "right to read freely."

            In several ways Mississippi: Conflict and Change differed from other history textbooks, then and now. In places, it was self-critical. To my knowledge, no other textbook ever has been. It referred to women on second mention without courtesy titles: "Welty," not "Miss Welty," parallel to "Faulkner," not "Mr. Faulkner." Its 32 maps showed relationships among social variables, not just minutiae like the name of every county seat. It included pictures and accounts of history-makers of all races, not just whites. And it included a photo of a lynching.

            Until then, no history textbook, state or national, contained such a photo. To my knowledge, no other history textbook, state or national, does, even today. Most photos in the existing textbook were head-and-shoulders portraits of old white men. Such pictures are devoid of historic value unless one believes in phrenology. We vowed to do better. We wanted illustrations that showed history or that were themselves historic documents. At trial, Neil McMillen, professor of history at the University of Southern Mississippi and author of Dark Journey: Black Mississippians in the Age of Jim Crow, testified to his admiration of the pictures in our book.

One U.S. history textbook included this drawing for a while. It's not a drawing of an actual lynching but a Reconstruction-era political cartoon implying that if Democrats win, they will lynch "carpetbagger" Republicans. A current textbook has a photograph of a civil rights march; one man carries a placard with a drawing of a lynching victim. Otherwise, no lynching images appear in K-12 textbooks. Too controversial?

            The lynching photo was particularly hard to find. Mississippi had more lynchings than any other state, but as the poorest state in the nation, had hardly any cameras! We did our research the hard way, decades before the internet, and also decades before Jimmy Allen had compiled his collection of lynching photos, some of which appear in the book Without Sanctuary. Finally we found a photo, identified simply as taken in Mississippi, in Scott Nearing's 1929 book Black America. Nearing was still alive — indeed, he lived to be 100, passing away in 1983 — so I located him, wrote him, and asked permission to reprint the photo. Sure, he said, but he did not have the original; I'd have to take the image from the book. I may have asked for details as to where and when the picture was taken, but if I did, he no longer knew, so we titled the photo simply, "A Mississippi lynching, captured by the camera."

            As lynching photos go, it is "tasteful." In the foreground, in silhouette, a man is being burned. Behind him, well-dressed whites pose for the camera. It does not show the victim in close-up; no one is hacking parts off his body. Nevertheless, the photo was controversial. Indeed, it figured in the "Perry Mason moment" of our trial in 1980. This came when the Assistant Attorney General for the State of Mississippi asked John Turnipseed, lead defendant, why he had objected to Mississippi: Conflict and Change. He had the court turn to page 178, the page with the lynching discussion. Pointing to the photo, he said, "Now, you know some ninth-graders are pretty big, especially black male ninth-graders. And we worried, or at least I worried that teachers, especially white lady teachers, would have trouble controlling their classes, with material like this in the book." So our book would cause racial unrest in the classroom!

            We had pretested our book in an overwhelmingly white classroom and an overwhelmingly black classroom; both had preferred it overwhelmingly to Bettersworth's book, so we had material for rebuttal testimony at the ready, but we didn't have to use it. At that point, Judge Orma Smith — an 83-year-old white Mississippian but a man of honor — took over the questioning.

            "But that happened, didn't it?" he asked. "Didn't Mississippi have more lynchings than any other state?"

            "Well, yes," Turnipseed allowed. "But that all happened so long ago. Why dwell on it now?"

            Smith replied, "Well, it is a history book!" Charles Sallis and I nudged each other. "We're gonna win this case!" I murmured.

            We had used the photo to make this point in the book: "Although lynchings occurred in almost every state, most of them took place in the Deep South. More lynchings have been recorded in Mississippi than in any other state."

    It turns out, however, as Eagles shows, that the lynching photo is not of a Mississippi incident at all, but from the race riot in Omaha, Nebraska, during the "Red Summer" of 1919. (Almost all race riots in U.S. history before 1942 were of whites rioting against people of color.)

 

Not "A Mississippi lynching, captured by the camera." Rather, well-dressed white men and women pose behind the burning body of William Brown in Omaha, Nebraska, in 1919.

            Of course, at the time we had no idea we had mislabeled it. Luckily, neither did anyone else. (The State would have rejoiced to be able to point to such an error.) During the years since our trial, Sallis learned of the mistake at some point, but I learned it only from Eagles's book.

            In the intervening years, I have focused much of my research on racism in the North. My book and website Sundown Towns show that many more Northern communities flatly excluded African Americans during most of the twentieth century than did towns in the traditional South. These Northern towns usually went sundown between 1890 and about 1940, the "Nadir of race relations." In those years, white ideology grew more racist than at any other point in our nation's past, and that was true North as well as South. Writing of the day-to-day interactions of whites and blacks in Ohio, for example, Frank Quillin observed in 1913 that race prejudice "is increasing steadily, especially during the last twenty years."[2] This is when lynchings rose to their all-time high, and although most indeed took place in Dixie, on a per-black-capita basis, white Northerners may have committed as many.

            I write "may have" because librarians at Tuskegee Institute compiled the database used for most lynching studies from Southern weekly newspapers; they did not include data from Northern states. Like the database, the NAACP spent most of its time and resources exposing and arguing for an end to Southern lynching and segregation. Three of the most iconic lynching photos stem from Northern spectacle lynchings: Omaha, 1919; Duluth, Minnesota, the next year; and Marion, Indiana, in 1930. Often, these images have been used to illustrate Southern lynchings. For example, The Chamber, a Hollywood film, uses the photo of the Marion lynching, dubbing it "Lynching in Rural Mississippi in 1936." 

The bodies of Thomas Shipp and Abram Smith hang above a crowd of white people in summer dresses and straw hats in Marion, Indiana, August, 1930. One man points toward a body. 

"Part 1: Awakenings" from the famous documentary series Eyes On The Prize shows the same image while narrator Julian Bond says "There had been more than 500 documented lynchings in Mississippi alone." Today one web page titles the Duluth lynching photo "Alabama Wind Chimes."[3] 

A white mob has just lynched three black circus roustabouts, Elias Clayton, Elmer Jackson, and Isaac McGhie, in Duluth, Minnesota, in 1920. Two hang from a utility pole; the third lies on the ground beneath. Young white men crowd to get into the picture from either side. 

When I put "Southern" and "lynchings" into Google images on July 29, 2017, the Marion lynching came up second, fifth, and ninth. Duluth came up third and fourteenth. A different Omaha image (but of the same burned body) came up twelfth, and the image we used was #24.[4]

            At least since the Civil War, American culture has located extreme racial violence in the South. To be sure, many white Southerners have done what they could to deserve and even promote this reputation. Leaders like Theodore Bilbo of Mississippi, Rebecca Felton of Georgia, and Pitchfork Ben Tillman of South Carolina called openly for lynchings to keep African Americans subdued. However, Northern communities also resorted to violence, usually to drive African Americans out. These race riots rarely got reported. Between 1999 and 2004, as I told people that I was researching sundown towns, they often replied, "In Mississippi, right?" "In Alabama?" In fact, I found more than 500 sundown towns in Illinois, compared to just 3 in Mississippi.

            Even when whites take note of Northern sundown towns, they locate them in the South! During World War II, Malcolm Ross of the Fair Employment Practices Commission described Calhoun County, Illinois, as "a farming area on the Mississippi forty miles north" of St. Louis. "Calhoun County is recorded in the 1940 census as '8,207 whites; no Negroes; no other races,' " he went on to note. "This is not by accident. Calhoun people see to it that no Negroes settle there. This is ... an earthly paradise for those who hate Negro Americans." Calhoun County remains sundown today, so far as I can tell. Then Ross makes an astounding statement: "Along with the white boys from Calhoun County, and a hundred other counties of the South..."[5] Calhoun County is just 65 miles southwest of Springfield, the capital of Illinois. It's not even in Southern Illinois, let alone "the South."

            In 2007, reporter Elliot Jaspin wrote a book about sundown towns, Buried in the Bitter Waters (NYC: Basic Books). He focuses on twelve counties. Seven are in Confederate states. Three others are in the Border states of Kentucky and Missouri. The final two are in Indiana, but he notes that they lie in southern Indiana. Jaspin emphasizes the Southerness of the phenomenon, as does the movie Banished, which had him as an adviser.

            It turns out that, like the Tuskegee librarians, Jaspin started with census data from Southern states. Naturally he found sundown counties in the South! Even so, not one of his seven Southern counties lies in what we might call the traditional South. One is in Texas,[6] two in the Arkansas Ozarks, two in far eastern Tennessee, one in far western North Carolina, and one in the Appalachian Mountains in north central Georgia. Similarly, the cases he found in Missouri and Kentucky lie not in the "Southern" parts of those states — which in Missouri are the counties in an east-west band along either side of the Missouri River, along with the cotton lands along the Mississippi.

            In reality, there are more sundown towns and counties in Wisconsin than in North Carolina, more in Oregon than in Georgia. Within Indiana, sundown towns are at least as common in the north as in the south. And so it goes.

Kurt Vonnegut's drawing of a sundown town sign.

            I wrote "And so it goes" not only to mean "etc." but also to reference novelist Kurt Vonnegut. Growing up in central Indiana, he saw sundown town signs all over the state in his childhood. In Breakfast of Champions he wrote about the phenomenon, which he illustrated with his own drawing of a sundown town sign, which he then gave me permission to use.

            The three iconic photographs of Northern lynchings give the lie to the notion that these dastardly deeds were committed by lower-class deviants at the margins of society in the dark of night. Rather, they show upright members of the white community happy to have their images captured in the commission of a felony, because they believe that they will be commended, not prosecuted, for the act. Indeed, a lynching can be defined as a public murder, done with considerable support of the community. When done by whites to people of color, it is a particularly egregious expression of racism, because the entire community knows that the perpetrators will likely get away with it.

            When we mislabel these three lynching photos as Southern, we again marginalize the perpetrators. When we locate sundown towns in the South, we write Bad Sociology (BS!) that excuses the rest of the country.

            Acts of violent racism have historically occurred all over the nation. In this era of BLM, they continue to do so.

    [1] Besides the Lillian Smith Award, Mississippi: Conflict and Change has won considerable attention over the years. Robert B. Moore compared it systematically to Bettersworth's textbook in a 24-page booklet, "Two History Texts: A Study in Contrast" (NY: Council on Interracial Books for Children, 1975). Herbert Foerstel lauded it in Studied Ignorance: How Curricular Censorship and Textbook Selection Are Dumbing Down American Education (Santa Barbara: ABC-CLIO Praeger, 2013), 53-57. Rebecca Miller Davis commended it in the lead article in volume 72 of the Journal of Mississippi History (Spring, 2016), 1-45.

    [2] Frank U. Quillin, The Color Line in Ohio (Ann Arbor: Wahr, 1913), 120.

    [3]At https://memegenerator.net. This is an Israeli site!

    [4]At the pages referenced by Google, some of these images are identified correctly; "Southern" merely occurs elsewhere on the page.

    [5]Malcolm Ross, All Manner of Men (NY:  Reynal & Hitchcock, 1948), 66.

    [6]I brought that county, Comanche, to his attention. 

Books, Blacks, and Bigots

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

            For nine years (c.1991 — 2000), Vertigo Books operated near Dupont Circle in Washington, DC. It was always touch-and-go, and when the property owner raised the rent, Vertigo went — to nearby College Park, Maryland, near the University of Maryland. There it lasted another nine years before throwing in the towel to Amazon.

            Vertigo was always known for hosting book-signings by progressive authors. Shortly before it moved from DC, I attended such a talk. I no longer remember the speaker, but I vividly recall the conversation I had with another patron, a white male in his 60s who sat next to me. Somehow he brought the conversation to race and declared to me that African Americans don't read books.

            This claim had been made before. In fact, it used to be a cliché. In African American culture before the Civil Rights Movement, there was even a bitter joke about it, popularized no doubt by frustrated black intellectuals: "Want to keep something secret from a black man? Hide it in a book!" I googled the phrase in 2016 and got 529 hits, so the phrase lives on.

            My own experience has been very different. In 1963, as an undergraduate at Carleton College in Minnesota, I spent part of my junior year "abroad" at Mississippi State University. Mississippi State was then the largest "all-white" institution of higher learning in the world outside South Africa, as some people told me with pride, others with chagrin. (I placed quotation marks around "all-white" because Chinese Mississippians could attend, as could dark-complexioned students from south India. A better term might be "non-black.")

            I enjoyed my months at Mississippi State and learned a lot, but it was very different from Carleton. One difference related to books, or, rather, their absence. My impression was, students at Mississippi State didn't read books.

            I was a (budding) sociologist. We count things. So, to test my impression, I counted all the books owned by all the students in my dormitory wing. There were twelve double rooms, so I counted the books owned by 23 students. (I did not include myself.) I counted everything — pulp novels, even comic books — but not textbooks. I was interested in books bought voluntarily.

            The 23 students owned 51 books. One owned 42. He was an intellectual. Another owned maybe 5. A couple of others owned one or two.

            That was it. Most of my dorm-mates had no books in their rooms and may have never owned one, other than those required for class. Compared to Carleton, a monastery of pointy-headed intellectuals, the contrast was stunning. Many Carleton students owned 51 books all by themselves, doubtless with still more at home.

            During my stay in Mississippi, for four days I became an undergraduate at another college, Tougaloo. Although more than 90 percent of its students had graduated from black public schools in Mississippi, which the white power structure deliberately kept separate and unequal, Tougaloo had a thriving intellectual subculture. Again, I counted books — all the books owned by my four roommates, excluding textbooks. (One roommate was away on exchange at Oberlin College, but his stuff was still there, so I could count his books.) The four owned 48 books among them, about a dozen each. A mode of twelve is infinitely more than a mode of zero, both in arithmetic and in culture.

            So I knew better than to capitulate to the contention of my new acquaintance at Vertigo Books. I told him some of the foregoing, but he would not hear it. "Blacks never come in bookstores," he said. Obviously he had never been in a black bookstore. At the time, the DC area boasted three important ones. My book (with co-editor) The Confederate and Neo-Confederate Reader premiered at one in 2010, and although only about 30 people attended the small venue, the store sold about 30 of my books, including earlier titles.[1] "I come here all the time," my antagonist finished, "and you never see a black person here."

            A few seconds later, a stunning young African American woman sat down next to him on the other side from me, accompanied by her handsome boyfriend. Both were eminently employable as models. Before I could comment, Bridget Warren, co-owner of Vertigo, began to speak, introducing that day's author.

            After the talk and question period, I thought about bringing up to the man what had just happened, but I concluded that doing so would just rub it in. Besides, I imagined that his response would be to claim the couple as a unique exception that somehow "proved the rule." The problem with bigots is that they can always dismiss positive experiences with the "opposite race" as exceptions, leaving their negative generalization unscathed.

            People, especially white people, rarely generalize negatively about their own group, of course, because doing so would put themselves down. Moreover, they know other white people who don't conform to the generalization, so they dismiss the negative behavior as idiosyncratic.

            We have real data about book reading. In January 2014, the Pew Research Center asked a sample of adult Americans whether they had read a book in the past calendar year. 76 percent said yes. Interestingly, age and residence (urban/suburban/rural) made little difference. Income of course did, but even among households making less than $30,000/year, two-thirds said they had read a book. Gender mattered, as all booksellers know: 82 percent of women said yes, compared to 69 percent of men. So did going on to college, whether or not one graduated.[2]

            Eighty-one percent of African Americans said yes, compared to 76 percent of whites, a difference Pew said was not statistically significant. The difference was consistent, however, across formats (e-books, recorded books, print).
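            A back-of-the-envelope check shows why a five-point gap need not be significant. The sketch below, in Python, is my illustration, not Pew's own analysis; the subsample sizes are hypothetical, merely typical of a roughly 1,000-person national poll.

```python
# Rough sketch of a pooled two-proportion z-test. The subsample sizes are
# hypothetical (typical of a ~1,000-person poll), not Pew's actual counts.
from math import sqrt

def two_proportion_z(successes1, n1, successes2, n2):
    """Pooled two-proportion z-test statistic."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 81% of a hypothetical 150 black respondents vs. 76% of 700 white respondents.
z = two_proportion_z(round(0.81 * 150), 150, round(0.76 * 700), 700)
print(round(z, 2))  # about 1.4, below the 1.96 cutoff for significance at 5%
```

            With subsamples that small, a five-point gap falls short of the usual 1.96 cutoff; only with much larger samples would such a difference register as significant.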

            These findings went viral, in the form of the generalization "the most likely person to read a book — in any format — is a black woman who's been to college." (See also, for example, Sophie McManus, "Diving into the Sexist, er, sexy beach read," Washington Post, 8/6/2016.)  In its support, I would note that Essence magazine, aimed at black women, prints a list of best-selling books in black America and reviews books regularly. Few magazines aimed at young white women review books.

            Racial and sexual generalizations like these — even my own — set my teeth on edge. There are always social causes for these social phenomena. They do not prove anything about racial "essence." But at least this new one about black women is positive.

            Let me also undo the generalization I penned earlier about Mississippi State. Five years ago, its sociology department hosted me for two days as their "Alpha Kappa Delta speaker." The intellectual subculture, which in 1963 consisted of fewer than two dozen students among 12,000, has grown immensely at MSU. It is far from dominant on campus — the "collegiate" and vocational subcultures are dominant — but neither is it dormant. Not every college needs to be alike, after all.

This twelve-page (!) program was for Coates’s DC book launch. 

            I wrote most of the foregoing more than a year ago but somehow never posted it to HNN. Last week, an event in Washington, D.C., pushed me to do so. The Metropolitan A.M.E. Church in D.C. hosted the "book launch" of Ta-Nehisi Coates's new book, We Were Eight Years in Power: An American Tragedy. By 5PM, when the doors were supposed to open, the line stretched from the churchyard gate to the corner and down the next block to that corner. There were three levels of admission to this paid event: Tier I, Tier II, and "regular." Luckily I was the guest of a Tier I sponsor, Rodney Hurst, himself an author as well as a leader of the Civil Rights Movement in Jacksonville, FL. The large church rapidly filled to capacity; a leader exhorted us to squeeze together to accommodate more people still in line.

            Hundreds of people bought books. Some came pre-signed via stickers on the inside front covers. I had never seen anything like it. Admittedly, I had never stayed up to 12:01AM for the bookstore release of the latest Harry Potter novel, but still, I have been to many book launches. To be sure, the crowd was "only" about 85 percent black. To be sure, it had become an "in" event, although Coates is no media star and does not entertain so much as educate. Talk show host Kojo Nnamdi merely conversed with him, followed by audience questions.

            As an important black intellectual, Coates is hardly solitary. Among his peers are Michael Eric Dyson and Cornel West. The next generation back might feature Alice Walker and Henry Louis Gates, and before them, Ralph Ellison and Maya Angelou, in a line that stretches back to W. E. B. DuBois and Frederick Douglass. American culture would be impoverished without these authors — and so many more. There has also been a long line of authors who wrote primarily for working-class African Americans — people like Langston Hughes and his "Simple Tales," Carter G. Woodson and ASALH, and Pullman porter J. A. Rogers, selling his books across the country as he rode the rails.

            The baseless claim that African Americans are anti-intellectual hurts race relations, as does the assertion that "they" are stupider than "us." That's why I write, hoping that publicity about Coates's massive turnout — for a book talk! — can help put to rest both canards at once.

    [1]What the paucity of African Americans in Vertigo showed was mostly residential segregation. After Vertigo moved to College Park, in majority-black Prince George's County, MD, its customer base became blacker, even though its immediate neighborhood was still white.

    [2]Pew relied on self-reported data. Reading a book is the socially approved response, so it is possible that some people said "yes" who had never opened one. It's not clear why this possibility would mess up comparisons across groups, although it might. A moment's thought will surely convince you, however, that exaggerated reading due to false reporting would probably be larger among white, female, more educated, and richer respondents, who might be predicted to "feel" the social pressure more. 

Copyright James W. Loewen

Florida Is Doing the Right Thing. May Other States Follow Quickly.

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

On January 31, 2018, the Florida Senate voted to replace Confederate Gen. Edmund Kirby Smith with educator Mary McLeod Bethune in the United States Capitol. Smith's removal is certain. The Alachua County (Florida) Public School District has also stripped Smith's name from its administrative building, formerly Kirby Smith Elementary School. Since the Senate vote was unanimous, we can assume the substitution of Bethune in the Capitol will be made.

Farewell to the U.S. History Textbook?

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

Pearson has announced its intention to sell its K-12 division. Let us hope it can find no buyer and simply closes it down.

Pearson, incorporating Prentice-Hall, has long been a dominant publisher of K-12 U.S. history textbooks. In the second edition of Lies My Teacher Told Me:  Everything Your American History Textbook Got Wrong, I show how two sets of Pearson "authors," Allan Winkler, et al., and Daniel Boorstin and Brooks Mather Kelley, plagiarized each other. Actually, Pearson hired a clerk to write both of their high school U.S. History textbooks. However, instead of hiring two clerks, one for each book, as is standard practice, Pearson hired just one and used his/her work twice. As a result, for page after page the two books are almost identical. (Probably they went through separate copy-editing.)

Perhaps one reason for this economy move was Pearson's financial problems. Maybe Pearson was under pressure to cut costs, even though the list price for U.S. history textbooks has soared to more than $100/copy.

Certainly both Winkler and Kelley deplored Pearson's frugality. Initially they both implied that they wrote their respective books. (Well, Kelley actually said "Boorstin did it.") Then, after I told them that "their book" was identical to another Prentice-Hall book for page after page, each said, and I quote, "Oh no! That's terrible!"

But the problem wasn't just Pearson's false economy. The real issue is: Pearson has no integrity. It lists as authors famous historians who didn't write "their" textbooks, never knew who did, and didn't even bother to read them. So if Pearson sells its K-12 division, good riddance!

But again, the problem isn't just Pearson. The buyer would likely be no better. Other textbook publishers also release books by "authors" who didn't write them or even bother to read them.

Now imagine that Pearson does not find a buyer and simply closes the division. That would be an advance! The next step would be for other publishers to abandon the huge textbook entirely, whether in print or on line. Its time has long passed. In olden times (such as 1991), students in Tchula, Mississippi, had few resources for learning history other than their textbook. Now, almost every school in America has the web, so it has hundreds of thousands of books, photographs, the census, etc., available for students and teachers to use.

Despite the invention of the web, the books have actually grown. The twelve textbooks I examined for the first edition of Lies My Teacher Told Me, all published between 1975 and 1991, averaged 888 pages. The books I studied for the second edition, all published between 2000 and 2007, averaged 1,152 pages. There is no excuse for this bloating. Textbooks should be shrinking. Teachers, districts, and entire states shouldn't choose any of these behemoths. They kill the excitement of history. By trying to cover everything, they don't uncover anything. Students suffering through courses based on these books never get to discover an answer for themselves. Even to questions such as when and how people first got to the Americas, where no consensus exists, textbooks don't invite thought; instead they choose one answer and present it for students to "learn."

We don't need these ponderous boring tomes any more. The web has made them superfluous. The textbook industry has been disrupted. It just doesn't know it yet, owing to the inertia built into its customer base — educational institutions. A skeletal 200-page paperback will do, indeed will do better, forcing teachers and students to go beyond the textbook instead of simply memorizing twigs.

A historian or a team might show the way by writing a good 200-page paperback U.S. history. A bad one used to exist, put out by the federal government and aimed at immigrants studying for the test required for becoming a naturalized citizen. It seems to have disappeared, leaving a vacuum. This is my million-dollar idea for you, the reader — free! Let me know when yours comes out!

Copyright James W. Loewen

Coming of Age in the Heartland

Postcard image of the A. E. Staley Co. c.1940,  showing its characteristic soybean plume.

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

            I was born in Decatur, an industrial city in the middle of Illinois. I didn't know it then, because it wasn't true then, but Decatur is also in the middle of what we now call "flyover country," a term suggesting one merely wants to get past it, not stop and experience it. For those few readers who want to spend a few moments with me in the Midwest, I offer this little memoir. It is history, I suppose, but definitely not History. It might help humanize red flyover people for blue coastal people, a little bit.

            When I was growing up in the '40s and '50s, flyover country was called "Middle America," a more positive term implying some centrality to American life, even some importance. Chicago was clearly America's "second city," as its theatrical troupe of that name still claims, partly because people had to change trains there, simply to get from one coast to the other.[1] Even though O'Hare is still a busy airport, it's just not the same.

            When I was a lad, Decatur had train service. The Wabash Cannonball connected it to St. Louis and Detroit and the Wabash Banner Blue and other trains provided the fastest train service between St. Louis and Chicago. Now of course, like most U.S. cities, Decatur has no train service.

            The combination of Interstate Highways and increased air service replaced train service in most towns, but they barely reached Decatur. When Pres. Eisenhower announced the Interstate Highway System, Decatur was the second largest city in the United States not to be on it. Later, Decatur did get added, via a spur from Champaign/Urbana, but the damage was done. Now Decatur seemed somehow to lag behind its competitors in central Illinois — Peoria, Springfield, Bloomington/Normal, and Champaign/Urbana.

            When I was a lad, Decatur also had an airport. Ozark Airlines provided fairly frequent service to St. Louis, its hub, and to Chicago. Ozark then verged on bankruptcy and got taken over by TWA, whose hub was also St. Louis. Today, Ozark's Wikipedia entry lists all known Ozark destinations – except somehow Decatur. Then TWA went bankrupt (three times!) and got taken over by American. Again, TWA's Wikipedia entry lists all known destinations for TWA and its feeder airlines, but somehow not Decatur. Trying to avoid bankruptcy, American Airlines cut its legacy Decatur service down to just two flights a day, both to St. Louis. Later it eliminated its St. Louis hub altogether, along with all service to Decatur, but alas, it went bankrupt anyway. US Airways took it over, retaining the larger company's name. Hoping to avoid financial problems, the "new" American has avoided Decatur ever since.

            For a while, no airline served Decatur. However, last month the government came to Decatur's rescue.[2] A federal boondoggle called Essential Air Service pays Cape Air, an airline centered on Cape Cod (!), to fly nine-passenger Cessnas to St. Louis and Chicago four times a day. The government will pay the airline about $100/passenger, by my calculation, to subsidize this service.

            Decatur has come down in some other ways too. It used to be known as "the fly-swatter capital of the world," because of a factory that made fly-swatters, and it housed the Hi-Flier Kite Co., America's only mass producer of kites. These kites, made of tissue paper and strips of balsa wood, sold in thin rolled-up form for just a dime. Nevertheless, some Asian company, first Japanese and later probably Chinese, undersold Decatur, and now Decatur makes neither fly-swatters nor kites.

            When I was growing up, Decatur also claimed to be "the soybean capital of the world." Even its radio station was WSOY. Much of the year, the entire eastern half of the city enjoys (endures?) the distinct smell of soybeans cooking. The A. E. Staley Co. made all kinds of things out of soybeans, even one (experimental) sailboat. They gave it to an employee, Mr. Boyer, for testing. His son Bill and I were friends, so I got to be his crew. It was a Thistle, as I recall, a popular class of sailboat. Thistle owners held races on Lake Decatur.

            The one time Bill and I entered a race, we were in third place when suddenly we capsized. I don't know why — I was only the crew. I only know that there we were, running with the wind, gaining on the leaders, and suddenly we were upside down, with our mast stuck in the many feet of silt that made up the bottom of Lake Decatur. Indeed, the silt went all the way up to the surface; the line between lake and bottom was indefinite, like that between good and evil here in DC. Of Lake Decatur, we said, "too thick to swim in, too thin to plow."

            Bill and I had met in the Boy Scouts. Decatur posed a problem to would-be Boy Scouts, however. Boy Scouts go on hikes, you see, and in Central Illinois these proved to be truly boring hikes. The topography around Decatur, except the Sangamon River valley, is totally flat – much flatter even than Kansas. Indeed, residents of Decatur consider Kansas to be mountainous. As a result, no matter how long the Boy Scout hike – 5 miles, 16 miles, even one of 23 miles – we could always see our destination when we set forth ... it just slowly grew bigger.

            Until recently, ADM, the Archer Daniels Midland Co., competitor to Staley's, maintained its corporate headquarters in Decatur. To do this, ADM had to create its own air service, of course. ADM also provided Decatur with one of its few brushes with fame: its participation in a notorious commodities price-fixing scandal. This led to the movie “The Informant!,” starring Matt Damon, which was actually (gasp!) filmed in Decatur. In 2014, ADM gave up and moved its headquarters to Chicago.

            Besides ADM's, Decatur housed one other important corporate scandal. Its Firestone Tire and Rubber plant made tires that became notorious for tread separation and bursting. Firestone blamed owners for under-inflating its tires, but the National Highway Traffic Safety Administration uncovered a company email in which an executive wrote, "We are making an inferior quality radial tire which will subject us to belt-edge separation at high mileage." Later, these tires got blamed for rollover problems with Ford's huge Explorer SUVs. Partly owing to its shattered reputation -- shattered by products made in Decatur -- Firestone got acquired by the Japanese company Bridgestone.

            About 15 years ago, Illinois held a contest to name the "most boring city," and Decatur came in second.

            What an outrage!

            We Decaturites (Decatureans?) knew that such a vote had to have been rigged. In any fair contest, Decatur would have won. Instead, Rockford won.

            You can tell from the name alone that Rockford is more interesting. It has a rock, for example. Decatur has no rock. And it has a ford, which implies it has a river that flows. Decatur does have a river, but it does not flow. So on the face of it, Rockford could never possibly beat Decatur in any fair contest. Indeed, Decatur is the smallest city of its size in the United States!

            Decatur does have Decatur jokes, however. They take the form of the familiar "You Might Be From ____ If...." Here is just a sampling. Some of the references will be obscure to most of you, but that just goes to show that Decatur, like, say, Venice, has its own culture, complete with terminology.

You Might Be From Decatur If ....

  You have never met a celebrity.

  You think Chicago is a completely different state from Illinois.

  You refer to a toasted cheese sandwich as a "cheese toastie."

  Your school classes were canceled because of cold.

  Your school classes were ever canceled because of heat.

  You know what's "knee high by the Fourth of July," but it's much taller than that.

  Detasseling was your first job.

  Your idea of a traffic jam is eight cars waiting for a freight train on Eldorado St.

  You have no problem spelling or pronouncing "Moweaqua."

  You have ever "warshed" your car.

  You see people wearing bib overalls at funerals.

  You can locate Decatur on the map of the United States.

  You wore your favorite white t-shirt while swimming in Lake Decatur; now it is your favorite brown t-shirt.

  You can hold your breath for more than 5 minutes, having practiced while driving over the Staley viaduct.

  You've rushed to the golf course when it snowed.

  You know the real home of the Chicago Bears, and the real name, too.

  You've used the air conditioning and heater in your car, both on the same day.

  You've used the air conditioning and furnace in your house, both on the same day.

            Although they never recorded any Decatur jokes, social scientists and historians have repeatedly studied Decatur. C. Wright Mills, author of The Power Elite, wrote a well-known paper, "The Middle Classes in Middle-Sized Cities," while doing field research in Decatur for Katz and Lazarsfeld's Personal Influence, a study of opinion leadership in Decatur.[3] Criminal Justice in Middle America is about Decatur, as is the sad book about the decline of organized labor, Three Strikes.

            Some years ago, I was the keynote speaker for the second annual Decatur Writers Conference, because I am the third-best-selling author from Decatur. For the first conference, they engaged Richard Peck, the celebrated author of such 'tween and teen novels as Father Figure and Lost in Cyberspace. So far as I can tell, none of his novels actually takes place in Decatur, including two that are said to have been set there. Nevertheless, or perhaps therefore, his books have sold maybe twenty million copies, and he certainly deserved to keynote the first Decatur Writers Conference.

            For the second annual Decatur Writers Conference, I wondered why the organizers had not engaged the second best-selling writer from Decatur -- none other than Stephen Ambrose, the famous historian! (And this was before he died, and before his plagiarism scandal.) So I asked my host.

            "Well, we can tell you the answer to that question," came the reply. "Stephen Ambrose charges $40,000, plus a private jet both ways."

            "Gee," I said, "I saved you more than $36,000!"

            "Yes, you did!"

            Nevertheless, I had a fine time. Mr. Ambrose missed out.

            Nowadays, fewer and fewer people are visiting Decatur. According to a Wall St. Journal story, unemployment is down in Decatur, but solely because unemployed people have been moving away, not because they have been finding work.

            Who can blame them? I moved away, partly for work, partly because I had learned that other places weren’t so flat. The Decatur Staleys, our original National Football League team, moved to Chicago almost a century ago and renamed themselves the Bears. We don’t think they’re ever coming back.

            Sadly, people leave Decatur today not just for work but because it is too black for them. Decatur is only 20% black. These racist white people flee to dinky little sundown towns like Oreana, Forsyth, and Mt. Zion, sometimes “for the children.” As if homogeneity is what children need.

            Sociologically, however, Decatur was a good place to grow up. The working class was not then as depressed as it is now, in Decatur and elsewhere. Working class Democrats sometimes won political office. As well, Decatur was able to bus white children from the Mound School attendance zone to French Elementary School, just southeast of the business district, to relieve overcrowding at Mound while also preventing French from going majority black. (Busing of white kids to mixed or black neighborhoods was almost unheard-of in the 1950s, and this without any court order.) There was racism, to be sure. African Americans couldn't get regular jobs at places like Staley's and ADM — they could only be janitors and part-time summer workers. The teaching staff was all-white (except for two Japanese Americans).[4] Still, many of us grew up thinking that people were basically equal, as it said in the Declaration of Independence, and as Abraham Lincoln (who had lived nearby) had pointed out. We drove American cars. Some of us still do. We didn't want to be sophisticated. Some of us still don't. We dated and even married across social class lines, although not racial lines.

            I admit, I’d rather be from Decatur than in Decatur. Nevertheless, it’s still a good place to be from.

    [1]Interestingly, no trains went all the way across the country. (At times cars did, but almost everyone changed in Chicago, St. Louis, or New Orleans.)

    [2]Actually, it awarded a new contract last month; the rescue, with another airline, came a few years back.

    [3]Personal Influence basically failed, owing to the lack of today's computing power that might have made sense of their data.

    [4]The school district took steps to change that after I graduated, including trying to recruit teachers at Tougaloo College around 1970.

 Copyright James W. Loewen

They Thought He Was Black, so They Claimed the Inn Was Full.

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

            Donald Trump and family are building, or at least branding, two new motels in the Mississippi Delta, in Bolivar County.[1]

            The Mississippi Delta is not the delta of the Mississippi River. It is the primordial flood plain between the Yazoo and Mississippi rivers. Its most famous description is by writer David Cohn, who lived in Greenville, in the heart of the Delta: "The Mississippi Delta begins in the lobby of the Peabody Hotel in Memphis and ends on Catfish Row in Vicksburg." Cohn encapsulated not only its geographic extent but also the racial and socioeconomic divide for which the Delta is so infamous. 

            Back in 1970 when I lived in Bolivar County, the next county north of Greenville, it had just two motels, both in Cleveland, its main county seat.[2] The Colonial Inn was larger and boasted an outdoor pool, along with four white columns in front, referencing antebellum plantation architecture.[3] The Holiday Inn Express was more utilitarian. It had no pool. 

            I had a grant from the Ford Foundation to engage several students and fellow faculty members from Tougaloo College to do a community study of Mound Bayou, often called the nation's oldest all-black town. For the students, I had rented the large brick home built by Isaiah T. Montgomery, the founder of the town. Montgomery had been born enslaved, on Joseph Davis's plantation at Davis Bend in southwest Mississippi, in 1847. His father Ben, a brilliant man who had made several inventions while a slave, taught him to read. He then became Joseph Davis's private secretary. The Davis families (Joseph and Jefferson) fled the area when Grant's forces came near, en route to taking Jackson and then Vicksburg during the Civil War. They left the Montgomerys in charge of their plantation. Isaiah later served as a cabin boy in the Union navy during the battles of Grand Gulf and Vicksburg. 

            From 1866 through 1878, the Montgomerys ran the former Davis plantations, which they had bought. In 1873, Montgomery & Sons was the third largest cotton producer in the South. But in 1878, Jefferson Davis sued his own relatives and won back control of the plantations. 

            Nine years later, Isaiah Montgomery led several of the Davis Bend settlers to land he had bought in Bolivar County. They founded Mound Bayou, which claims to be the South's first all-black town, although it often was not quite all black. Certainly it wasn't while I was living there. 

            By 1970, the Knights and Daughters of Tabor, a black self-help and fraternal/sororal organization, owned Montgomery's home. They also owned and ran the hospital, which for decades had been the only hospital serving African Americans in Bolivar County and adjoining counties. On its grounds was a modern mobile home. My wife and two-month-old son and I lived in the mobile home for the summer. 

            My then wife was Catholic at the time. Hence, in a few weeks it came time for my son's christening at the Catholic Church in Mound Bayou. My parents decided to come down for the ceremony. We had no room for them in our mobile home, and students occupied each bedroom in Montgomery's mansion, so I needed to reserve a room for them. 

            I phoned the Colonial Inn and made the reservation. Then I thought about the pool. I knew my students had never been in a swimming pool before, since Mississippi cities had closed their public pools rather than allow African Americans to use them. Since they were not staying there, however, the motel might legitimately deny them the pool, and I did not want to expose them to such embarrassment. So I called back to ask if my students might use the pool. 

            "I don't think so," the clerk replied. "You know how things are down here." I realized she had noticed my Northern accent. 

            The next morning, my wife awakened with a problem: a clogged milk duct. Luckily, Mound Bayou had an almost-new medical facility, the Tufts Delta Health Center, founded by Dr. H. Jack Geiger in 1965. All three of us – mother, father, and baby – drove over to the clinic. While my wife waited to be seen, an announcement came over the intercom: "Telephone call for James Loewen. Will James Loewen please come to the front desk." 

Tufts Delta Health Center today, with plaque about its founding.

            I was astounded. How did anyone know I was at the Health Center? I had only been there one other time in my life! Twenty minutes earlier, we had made a spur-of-the-moment decision to drive over. Professors at Tougaloo knew some of us were being watched by the Mississippi State Sovereignty Commission, but I didn't think they were this efficient. Drones were not even on drawing boards, after all. I walked to the front desk, was handed a phone, and said a hesitant "Hello? This is Jim Loewen." 

            "This is the manager at the Colonial Inn," a man replied. He had surmised I must work at Tufts Delta Health Center. Where else might a professional-sounding black Northerner work in Mound Bayou? "I'm sorry to report to you that our clerk made an error, the other day. You wanted a reservation for Friday, June 12, right?" 

            "I did," I replied, "for my parents, and I was given one."

            "Well, that's the thing," he replied. "You see, my clerk took down your reservation on the wrong day, and on the right day, we're all full up!"

            "Really!" I replied, knowing immediately what had happened. Living up to its name, the Colonial Inn wanted to head off any use of its pool by black folks. I could not argue with him, of course, because he could see his reservation sheet and I could not, so I hung up courteously. 

            Back home later that morning, I reserved my folks a room at the Holiday Inn Express. 

The Colonial Inn c.1970, its swimming pool in the center, its four columns visible to the left.

            Friday June 12th arrived. So did my folks and my sister, who was going to sleep in our mobile home. I showed them the way to the Holiday Inn and then drove my sister to the Colonial Inn. We both walked in, unannounced of course. "My folks are in the car," I said to the desk clerk, "and they need a room for tonight. Do you have any rooms?" 

            "Yes, of course," she replied. "What do you need?" She proceeded to list the various options and prices. I asked her to write down the rate for a two-bed double. "My father never thinks I get it right," I offered as an excuse for getting her to do the writing. Of course, I was collecting evidence. That's also why my sister was at my side, as a witness. The clerk complied. 

            Once outside, we drove off, no longer needing the room. We enjoyed the weekend with my folks, including the christening in Mound Bayou. I knew the priest; he was also the Catholic chaplain at Tougaloo. After the ritual, we all adjourned to our mobile home for refreshments, including the Tougaloo students. 

            On our next trip to Jackson, the state capital, I conferred with my friend Frank Parker, a peerless civil rights lawyer who would in 1980 help us win Loewen v. Turnipseed, our lawsuit on behalf of a new textbook in Mississippi history that the state had rejected. Parker threw cold water on my plan to turn my Colonial brush-off over to the FBI. "You haven't got a case," he said.

            "Why not?" I inquired. "Clearly the game they played was racial." 

            "Of course it was," Frank replied. "But here's the problem. Imagine that you are the wife of a really awful plutocrat. Everyone hates him – his employees, his associates, and especially you, his wife. Finally, one night, you have had enough. You go out to the mall and buy a pistol. Returning, you tiptoe up to the master bedroom, find your husband lying asleep in bed, and fill him full of lead.

            "However, unbeknownst to you, earlier that evening a work associate, also fed up with him, had come to your home, found the door unlocked, sneaked upstairs to the bedroom, and shot your husband. 

            "Are you guilty of murder?" 

Frank Parker in 1996 (Wikipedia, courtesy Anne Lawver)

            I didn't get a chance to answer before Frank plunged ahead. “You're not! You tried to murder your husband, but all you did was shoot a corpse, because he was not alive. Similarly, the Colonial Inn tried to commit racial discrimination, but they failed, because you are not black!" 

            I had to admit Frank's logic but decided to persevere anyway. I wrote up an account of the episode and sent it to the local FBI office, along with a photocopy of the clerk's statement of room availability and price on the day in question. The problem was hardly unique to the Colonial Inn, after all. According to a civil rights case decision, "blacks who travel the country, though entitled by law to the facilities for sleeping and dining that are offered all tourists, may well learn that the 'vacancy' sign does not mean what it says, especially if the motel has a swimming pool."[4]

            The FBI did take my complaint seriously. Even though Parker was doubtless correct about the law, still, as a bureaucracy, the FBI wanted to satisfy me, so they sent an agent to talk with me. They also visited the Colonial Inn and talked with its manager. Then they reported back to me that the manager would like to meet me and make an apology. 

            I went to see the manager. He did not admit that his whole charade was racially motivated. He vowed that the Colonial Inn was open to all races. "We have one boy, he stays with us every month on his route," he assured me, unaware that his use of "boy" for a salesman who was undoubtedly at least twenty and probably forty undercut his denial of racism. 

            "Does he use the pool?" I asked. 

            "Well, no," he admitted. "But he could." 

            I could accomplish no more, but it did seem as if the experience had chastened the manager, at least a little. Six years after the 1964 Civil Rights Act had become law, it was having some effect. 

            That fall, I was teaching a sociology course at Tougaloo and somehow found myself recounting the episode. My students found it riveting. At the end, wound up, I found myself exclaiming, "I've had just about enough of being discriminated against because I'm black!" The room erupted in laughter. 

            I have no doubt that Trump's motels in Cleveland will be racially integrated. Bolivar County, however, is not. Until last fall, there were three high schools in Cleveland: Cleveland High School, East Side High School, and Bayou Academy, the segregated all-white private school founded in 1964. Bayou Academy doubled in size when the public schools desegregated in January, 1970. Its student population swelled again in September, 2017, when Cleveland and East Side high schools finally had to integrate.[5] Today it claims to be not racist: 

            "Bayou Academy School admits students of any race, color, national or ethnic origin to all the rights, privileges, programs, and activities generally accorded or made available to students and does not discriminate on the basis of race, color national or ethnic origin [sic] in the administration of its educational policies, admission policies, scholarship and loan programs, and athletic or other school administered programs."

   Meanwhile, its promotional video shows only white people. According to the National Center for Education Statistics, Bayou had one Asian student, four black students, seven Hispanic students, and 343 white students last fall. It was 98.9% non-black. The chance of drawing a student body that white from an underlying population that was less than 33.5% non-black is infinitesimal, unless race or characteristics tightly associated with race influenced admission. For readers who understand statistics, the "Difference of Two Proportions Test" yields a "t-value." When t = 1.96, a difference as great or greater would happen 5 times in 100 trials, the 5% significance level. When t = 2.58, a difference that great would happen 1 time in 100 cases, the 1% level. T = 3.3 indicates an occurrence so extreme that it might happen 1 time in 1,000 trials. More extreme levels are rarely reported, because they rarely occur. Bayou Academy's whiteness generated a t = 24.67! The chance that a student body so white occurred by chance is less than the likelihood that the sun will not rise tomorrow morning! The hypocrisy of the academy's nondiscriminatory statement is truly breathtaking. 
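            For readers who want to run the numbers themselves, here is a minimal sketch in Python. It uses the figures quoted above (351 non-black students out of 355, against a county population taken to be 33.5% non-black) but the simpler one-sample form of the proportion test, so it will not reproduce the 24.67 exactly; the function name and the rounding are mine.

```python
# Minimal sketch: test Bayou Academy's non-black share (351 of 355, per the
# NCES figures quoted above) against a county assumed to be 33.5% non-black.
# One-sample version of the proportion test, so it won't match 24.67 exactly.
from math import sqrt

def proportion_z(successes, n, p0):
    """z-statistic for an observed proportion tested against p0."""
    p_hat = successes / n
    se = sqrt(p0 * (1 - p0) / n)
    return (p_hat - p0) / se

z = proportion_z(successes=351, n=355, p0=0.335)
print(round(z, 1))  # about 26; recall that 1.96 is the usual 5% cutoff
```

            Whether the statistic comes out 24.67 or 26, the conclusion is the same: a student body this white does not happen by chance.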

            Just as Bayou Academy declares formal nonracism while race still determines who attends, so Bolivar County now claims to be formally nonracist, while race still determines life chances. The median per capita income of African Americans in the county in c.2015 was $15,901, almost $10,000 below the national black average. Meanwhile, the white median was $31,711, slightly more than the national white average. One in four African American would-be workers in the county was unemployed, compared with one in seventeen whites. In longevity, the situation was even worse: "The rural Mississippi Delta may be the first place in the United States where health stopped improving," according to a 2005 conference held in Bolivar County.[6] Data on U.S. life expectancy in 2010 published by the Institute for Health Metrics and Evaluation show that males in Bolivar County could expect to live just 65.0 years, the second shortest for any county in the United States. The figure for African American males would be considerably lower.[7] To avoid these bleak statistics, people, especially African Americans, have been leaving the county for decades. Bolivar County's total population has fallen to about 33,000, about two-thirds what it was when I lived there. 

            So why build new motels? Of course, Trump will "receive city and county tax breaks" for at least the next seven years, to minimize the family's risk. As well, Bolivar County has two draws: Delta State University, a modest attraction, and tourism, mostly tied to the blues. Indeed, the Washington Post titled its story about the new Trump motels, "Trump's Sons See Green in the Blues."[8] Just down the street from one of the motels is the new Mississippi branch of Los Angeles's Grammy Museum, which exceeded attendance expectations in its first year. 

            The irony that mainly whites will benefit from a mainly black cultural tradition is not lost on African American leaders in the Delta. One of the two new motels was originally intended to repeat the plantation architecture pioneered in Cleveland by the Colonial, but local Indian American motel partners are now questioning that concept. 

            The Post story questions whether African Americans will choose to stay at Trump hotels regardless of their design. But if the management lets them use the pool, probably they will. I would have. 

    [1]Jonathan O'Connell, "Trump's Sons See Green in the Blues," Washington Post, 10/23/2017. 

    [2]Several Mississippi counties are cursed with two county seats. Rosedale, a dying town on the Mississippi River, is also a county seat of Bolivar. 

    [3]Unlike Natchez and Vicksburg, Bolivar County had no antebellum plantations, since its cotton fields were swampland until well after the Civil War. 

    [4]"Heart of Atlanta Motel v. United States," 379 U. S. 241, quoted in "Jones v. Alfred H. Mayer Co.," 392 US 409 © Supreme Court 1968. 

    [5]Edwin Rios, "A Mississippi Town Finally Desegregated Its Schools, 60 Years Late," Mother Jones, 11/2017, motherjones.com/politics/2017/10/a-mississippi-town-finally-desegregated-its-schools-60-years-late/.

    [6]Alan W. Barton, Proceedings from the Delta in Global Context Workshop, 5/27-28/2005 (Cleveland, MS: Delta State U, 2005), ntweb.deltastate.edu/abarton/DeltaGlobalContext/DGC%20Proceedings.pdf. 

    [7]"Life Expectancy, Obesity, and Physical Activity". Institute for Health Metrics and Evaluation. 2010. 

    [8]Jonathan O'Connell, "Trump's Sons See Green in the Blues," Washington Post, 10/23/2017.  

Copyright James W. Loewen

Dinesh D’Souza Lied About My Work

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

Dinesh D’Souza is back in the news. Ted Cruz, Republican senator from Texas, claims D’Souza got convicted of a campaign finance violation because he “was the subject of a political prosecution,” targeted because “he’s a powerful voice for freedom, systematically dismantling the lies of the Left – which is why they hate him.”[1] Cruz proposed to President Trump that he pardon D’Souza, and on the last day of May, Trump complied. 

I can’t speak for “the Left.” No leftists think I’m on the left. (Right-wingers do.) I can speak for myself, however. While I don’t hate D’Souza, I surely do dislike him heartily. Why? For precisely the opposite of what Ted Cruz tweeted. 

In my experience, D’Souza is the liar. Certainly he lied about my work, and he did so deliberately. 

I had written about the bias that is built into every SAT test, ACT test, and IQ test, by the nature of the statistical hoops their creators impose upon new items. I learned about that bias from teaching first at Harvard University (1966-68) and immediately afterward at Tougaloo College (1968-75). The contrast was dramatic.

My best students at Tougaloo, a black college in Mississippi, had extraordinary ability. They would have stood out in the Sociology Pro-Seminar I had just been teaching at Harvard. One Tougaloo graduate went on to finish her doctorate in sociology at the University of Wisconsin in just three years. Others earned doctorates from Harvard, Berkeley, and other sociological powerhouses. Yet the SAT scores of these outstanding students averaged around 560. (Probably you know that SAT scores range from 200 to 800. 500 is about median.) 

A 560 at Harvard, on the other hand, was quite different. My pro-seminar in 1966-67 included one student who simply could not do the work. (Sociology is hard.) Plenty of undergraduates did not do the work, but Peter could not. I know, because I met with him, talked with him about how much and how he studied, and, in that simpler time, looked up his records. He had been admitted as a legacy; scored 520/560 on the SAT, in Harvard’s lowest quintile; and simply could not do high-level college work, at least not in sociology. (He transferred to an easier major.) 

This experience convinced me that "aptitude tests" like the SAT did not really measure aptitude, at least not across different subcultures and positions in social structure. Years of research on the SAT showed me how and why it failed. In a word, it was (and remains) biased.

An example can make the bias clear. Imagine a verbal item built around the word “environment.” This word has several meanings. I have cold-called on scores of people in my audiences and asked, “What does ‘environment’ mean, to you?” Usually I get replies emphasizing the natural environment – ecology, pollution, etc. Although the person I called on is nervous, I immediately say “Yes, the natural environment, we’ve all used the word that way, haven’t we?” The room nods. 

Then I ask for another meaning of environment or another context for the word. Almost immediately, someone will volunteer the social environment – “what kind of environment did that child grow up in?” Again, I summarize, “Yes, the social environment, we’ve all used the word that way, haven’t we?” The room nods.[2]

A study by researchers at ETS, the creator of the SAT, shows, however, that white folks are particularly likely to think of the natural environment first when encountering “environment” in print, while black folks are somewhat more likely to think of the social environment. 

What does that difference mean for the SAT? It means that an item that uses “environment” in its social context will subtly favor black students. White students outscore black students by about 7 percentage points on most items. On this item, blacks might outscore whites a bit: maybe 81% get it right, compared to just 78% of whites. Immediately the item gets discarded. It has “misbehaved” statistically. In psychometric terms, it has a “negative point-biserial correlation coefficient.” The people getting this item correct have, on average, lower overall test scores than the people getting this item wrong. In short, the wrong people get it right. 
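A toy computation can make the screening rule concrete. The sketch below, in Python, uses made-up item responses and total scores, nothing from ETS; the point is only that when lower-scoring test-takers are the ones getting an item right, the point-biserial correlation comes out negative and the item is flagged.

```python
# Toy illustration (invented data, not ETS's) of the point-biserial screen:
# the correlation between getting one item right (1/0) and the total score.
from math import sqrt

def point_biserial(item_correct, total_scores):
    """Pearson correlation between a 0/1 item score and the total score."""
    n = len(item_correct)
    mean_i = sum(item_correct) / n
    mean_t = sum(total_scores) / n
    cov = sum((i - mean_i) * (t - mean_t)
              for i, t in zip(item_correct, total_scores)) / n
    sd_i = sqrt(sum((i - mean_i) ** 2 for i in item_correct) / n)
    sd_t = sqrt(sum((t - mean_t) ** 2 for t in total_scores) / n)
    return cov / (sd_i * sd_t)

# Here the lower-scoring test-takers are the ones who answer this item right.
item   = [1, 1, 1, 1, 0, 0, 0, 0]
totals = [450, 480, 500, 520, 620, 650, 680, 700]
print(round(point_biserial(item, totals), 2))  # about -0.95: item gets cut
```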

For this reason, despite the best efforts of staffers at ETS (and ACT et al.), no item favoring African Americans ever appears on the final edition of an SAT test (or GRE, or test in a specific discipline, or IQ test). 

I invented a little demonstration, sometimes called “the Loewen Low-Aptitude Test” or “the Collegiate IQ Test,” to illustrate the problem to a lecture hall full of students.[3] Its point was to persuade my largely upper-middle-class white audiences that these tests do not really measure aptitude by giving them a test that was biased against them, rather than against “others.” My test had an item that favored the highest 1%, a “black” item, a working-class item, etc. Most people wound up scoring at or just above random, translating to an IQ of 60 to 90. They felt stupid. 

In his breakout book Illiberal Education: The Politics of Race and Sex on Campus, Dinesh D'Souza picked on my black item, which used vocabulary more familiar to African Americans – in this case two urban slang terms – to form an analogy on which whites score lower. My point was to show that no item truly tests intelligence, aptitude, reasoning, or any other higher mental process if its underlying vocabulary is unfamiliar to the test-taker. In his summary, D’Souza deliberately misled his readers and misstated my point:

"[T]his line of criticism stereotypes blacks. [Loewen's] model presumes that blacks are most at home in the world of slang, womanizing, and drugs. Why a familiarity with this vocabulary is a good preparation for college, Loewen does not say."

When I first pointed out that D’Souza had misrepresented me, I titled the essay “Dinesh D'Souza: Knave or Fool?”[4] But D’Souza is no fool. He certainly knew my intent; his misrepresentation was deliberate. 

Of course, lying on behalf of white supremacy is not considered, shall we say, a “black mark” on one’s character to our Pardoner-in-Chief. Neither are the lies D’Souza tells in his “documentary” films. Quite the contrary: since D’Souza lies on behalf of the right, that is a good reason (to Cruz and Trump) for pardoning him. D’Souza’s sentence was hardly onerous. He should have served out the few months of probation remaining on it. He was not pardoned as an act of mercy. His pardon was a political statement, on behalf of lying and against true scholarship. 

Ironically, even Trump’s pardon involved lying: the president denied anyone ever asked him to pardon D'Souza, while both Cruz and D’Souza claim Cruz did. Trump also claimed he had never met D’Souza, while a former aide recalled that they had talked at Trump Tower in 2012, shortly before the release of D’Souza’s movie, “2016: Obama’s America.” Trump agreed to help promote the movie and tweeted later that summer that it was “an amazing film.”[5] For that matter, D’Souza was not exactly convicted; he pled guilty. If he was not guilty, then he lied. 

Also ironically, D’Souza’s most recent book makes use of the very term we’ve been discussing: The Big Lie: Exposing the Nazi Roots of the American Left. I hope this essay makes clear why I’m not going to buy it or even read it. The first sentence introducing D’Souza on “his” page at Amazon states: “Dinesh D'Souza has had a 25-year career as a writer, scholar, and public intellectual.” I don’t think so. “Writer,” yes. “Polemicist” would also be accurate. “Scholar” – not so much. You cannot misrepresent someone’s clear written work on purpose and call that “scholarship.” 

I’m not alone in my view of D’Souza’s probity. Historian Paul Finkelman calls D’Souza’s The End of Racism “a parody of scholarship.” Michael Bérubé, who is an intellectual, says D’Souza’s work “doesn't meet a single known standard for intellectual probity.”[6] And so it goes. And now this menace to scholarship – or even to clear thinking – has been unleashed upon us again, all cleaned up and ready to go. 

[1]Ted Cruz’s father officiated at D’Souza’s most recent wedding. 

[2]Another meaning is economic: “the environment of the firm.” Ironically, the field of psychometrics itself uses “environment” in yet another way, to mean “all factors other than genetic inheritance.” Still other uses occur in computer science, music, and other fields.

[3]"Introductory Sociology for the Privileged:  Four Classroom Exercises,” TeachingSociology, 6#3 (4/1979), 238-44; cf. James Fallows, "The Tests and the Brightest," Atlantic Monthly cover article, 2/1980. 

[4]Loewen, “Dinesh D'Souza: Knave or Fool?” HNN 9/3/2012, historynewsnetwork.org/blog/148052. 

[5]Lauren Fox, “How Ted Cruz Helped Get Dinesh D'Souza His Presidential Pardon,” CNN, 6/1/2018, cnn.com/2018/06/01/politics/ted-cruz-dinesh-dsouza/index.html; Philip Rucker, et al., “With Pardon, President Sends Signal,” Washington Post, 6/1/2018. 

[6]Michael Bérubé, “Review of Dinesh D'Souza, The End of Racism,” Transition, 69 (1996), 90-98, at bradford-delong.com/2014/09/weekend-reading-michael-berube-1996-review-of-dinesh-dsouza-the-end-of-racism.html.

Copyright James W. Loewen

How Charlottesville Transformed the Confederate Monuments Debate

The base of the double equestrian monument to Robert E. Lee and Stonewall Jackson plainly shows the influence of Charlottesville. So does the absence of Lee and Jackson.

James W. Loewen is a sociologist.  The New Press has just brought out new paperbacks of Loewen's bestseller, Lies My Teacher Told Me, and Sundown Towns, about places that were/are all-white on purpose. 

Thankfully, not much happened on the just-past first anniversary of the white supremacist riot in Charlottesville. It is clear, however, what a difference the events of a year ago made to the debates about Confederate monuments that still rage across America. The impact was transformative. Indeed, we can refer to “B.C.” (“Before Charlottesville”) and after. 

In Baltimore, for example, Mayor Stephanie Rawlings-Blake had set up a commission to advise what to do with that city’s four main Confederate monuments, after at least one got vandalized following the death of Freddie Gray while in police custody. Soon it became clear, however, that neither she nor the commission chair wanted to consider removing any of them. The commission delivered a mixed report to the mayor, who then left office without doing anything. Three days after Charlottesville, Baltimore’s new mayor took down all four.

Helena, Montana, boasted a fountain proclaiming, “A loving tribute to our Confederate Soldiers." Of course, Montana never had any Confederate soldiers.  For that matter, Montana hardly had any Union soldiers.  Montana was still Indian country during the Civil War and for some time thereafter, as George Armstrong Custer found out at the Little Big Horn some fifteen years after the Civil War started. No matter; during the Nadir of race relations, 1890-1940, the United Daughters of the Confederacy knew exactly what they were doing. Across the South, and even within sight of the Montana state capitol, the UDC used these monuments to declare their dominion over the landscape and the respectability of the Confederate cause.

B.C., we who argued against this landscape of white supremacy merely seemed to be whistlin’ “John Brown's Body." After Charlottesville, the Native American caucus in the Montana legislature asked for the fountain’s removal; within five days, it was gone.

A few weeks B.C., I participated in symposia in Charlottesville and Richmond about what to do about both cities’ Confederate landscapes. The mayors of both cities said ahead of time that removal was not on the table. After Charlottesville, removal was up for discussion and, in Charlottesville at least, seems likely.

After Charlottesville, mayors across the country realized their Confederate monuments were indefensible on the basis of good governance. They had become flashpoints of conflict. Leaving them alone was simply asking for trouble. If the city did nothing, then at the least, people would paint them again and it would fall to the government to clean them up. (Of course, another course of action would be to give both sides their say by letting the tags remain.) After Charlottesville, officials did not have to take a stand on the meaning of the Confederacy. They could act without having to take a stand on their values.

To be sure, some did take a stand. Mayor Landrieu of New Orleans, in a speech he titled “Truth," eloquently explained the values that underlay his removal of that city’s four Confederate monuments.

But even if cities remove these celebrations of white supremacy for the sake of expedience, so long as they put up markers to explain what happened, all will be well. These markers need to tell what monument stood here, why and when it went up, and why and when it came down. Then citizens will learn that we began getting secession and the Confederacy wrong during the Nadir. After the murders by a neo-Confederate in Charleston in June 2015, we began to get this history right, on our landscape. After Charlottesville, they can infer that their city no longer celebrates white supremacy, at least not in its important public spaces.

Copyright James W. Loewen

The Lawyers Are At It Again

The dreaded cookies.

James W. Loewen is a sociologist.  The New Press has just brought out new paperbacks of Loewen's bestseller, Lies My Teacher Told Me, and Sundown Towns, about places that were/are all-white on purpose. 

America has too many lawyers. Or at least, higher ed does.  

Consider Princeton. This week I gave a talk there, hosted by its Department of History and co-sponsored by the Fields Center for Equality and Cultural Understanding. But first I had to sign a contract. 

Princeton sent me a doozy. First, it had one of those infamous “hold harmless” clauses: 

Speaker/Performer will defend, indemnify and hold harmless University, its officers, employees, trustees, agents and representatives from and against any and all claims, demands, damages, liabilities, expenses, losses of every nature and kind, including but not limited to attorney’s fees and costs, sustained or alleged to have been sustained in connection with or arising out of the Engagement, even in the event the University is alleged or found to be partially negligent. However, Speaker/Performer will not be obligated to so indemnify University if University is proven to be solely negligent.

Way back in the 1970s, when clauses like this were spreading throughout book publishing, writer Victor Navasky led the charge against them. Some publishers backed off, recognizing their unfairness. Now, however, they are spreading in academia. 

What’s wrong with them, you might ask. Well, suppose that Princeton served chocolate chip cookies at my talk. Suppose that the chips contain a microbe that causes paralysis. The chocolate supplier was at fault; the cookie-maker could not have known of the problem; Princeton only served them. So of course Princeton cannot be "proven to be solely negligent." 

According to the hold harmless clause, I am responsible! Moreover, I am to pay for "all claims, demands, damages, liabilities, expenses, losses of every nature and kind, including but not limited to attorney’s fees and costs, sustained or alleged to have been sustained in connection with or arising out of the Engagement"! So even if no one got sick, even if no harm resulted, so long as someone alleged paralysis, I have to pay! 

And I had nothing to do with the chocolate chips! 

To show how serious Princeton is about these matters, the contract goes on to demand that I prove I carry “Commercial General Liability insurance coverage for personal injury, bodily injury and property damage with a minimum combined single limit of $1,000,000 per occurrence/aggregate. All such policies must be underwritten by a carrier rated at least "A-" in Best’s Key Rating Guide.” Moreover, I have to name “The Trustees of Princeton University” as additional insureds “before the Engagement begins.” I must even promise not to “enter upon University property” unless I am so insured. 

Interestingly, some of the language Princeton’s lawyers used elsewhere in their contract was identical to that used by Salish Kootenai College, the tribal college in northwestern Montana where I spoke last month. This leads me to conclude that Princeton’s lawyers, like SKC’s, merely downloaded phrases from College Law for Dummies. 

I refused to sign Princeton’s contract. I asked my host, “Did you read it? Do you really think it fair? Should speakers take on responsibility for damages and costs for things that they had nothing to do with?” Moreover, I noted, my legal jeopardy was unlimited. I stood to make $4,000 from speaking at the college, but I could lose everything – my home, savings, even future earnings – and all because of a few chocolate chips that I didn’t even make!

It happens that I do carry insurance. How many professors do? A school system demanded that I be insured several years ago, and I gave in, partly because they did not try to inflict a hold harmless clause upon me. But it is “only” for $100,000/$300,000. Moreover, I knew no way to modify it to incorporate each new host institution as an additional insured. 

I managed to negotiate Princeton down to accepting my insurance “as is,” and I got this phrase added to the hold harmless clause: “in the event he is found to be negligent.”* I suggest you do too. 

The common phrase, used by publishers and universities, to “defend” these hold harmless clauses is, “Don’t worry, we’ve never enforced it.” A reasonable reply then is, “Oh, well, then you won’t really miss it, so let’s strike it out.” 

I gave my talk.  I spoke to about twenty people, mostly grad students. First, we had sandwiches and, yes, chocolate chip cookies! No one showed the slightest distress from my talk as the event came to a close. Forty-eight hours have now passed without any sign of poisoning from the cookies. Princeton can breathe a sigh of relief. And so can I. 

* Salish Kootenai College used this language: “indemnifies and holds harmless the tribes from and against all damages … that may arise in whole or in part from his acts, errors, or omissions.” That I could sign. 

Copyright James W. Loewen

Censorship at Amazon

Neo-Confederate pamphlet on left. Loewen book on right. 

James W. Loewen is a sociologist.  The New Press has just brought out new paperbacks of Loewen's bestseller, Lies My Teacher Told Me, and Sundown Towns, about places that were/are all-white on purpose. 

In 2016 I wrote here about Clyde Wilson’s scurrilous neo-Confederate pamphlet, “Lies My Teacher Told Me: The True History of the War for Southern Independence.” This 38-page pamphlet mostly reprints an article he wrote for the pretentiously titled Abbeville Institute, which actually amounts to another neo-Confederate’s house. 

Recently friends visited the Gettysburg Emporium, a Civil War re-enactors’ store in Gettysburg, where they saw copies for sale along with Confederate and Union uniforms and paraphernalia. Appalled at both the rip-off of my title and the content of the pamphlet, they told me about it. That spurred me to go to Amazon.com, where I read several glowing reviews of the work by other neo-Confederates. 

I decided to post my own not-so-glowing review. “He stole my book title and ignores the primary sources in The Confederate and Neo-Confederate Reader,” I wrote. 

“In order to believe this pamphlet, you must conclude that ALL the leaders of the Southern states, as they left the U.S., lied as to why. All explain that they are AGAINST states' rights and are leaving precisely because Northern states have tried to use states' rights to interfere with slavery. All explain that they are leaving in order to safeguard and expand slavery forever.

“If in fact they are seceding for other reasons, why lie about it? We must infer, first, that they are NOT seceding for other reasons. But if you want to claim they ARE, then you have to concoct a theory of lying. Would it be that the leaders, unlike the audiences to which they appeal in their secession arguments, DO favor states' rights and ARE thinking about tariffs, but they know such causes would never convince rank-and-file white Southerners to fight? So they concoct slavery as the reason? If that is your argument, then it amounts to claiming white Southerners are so racist and pro-slavery that we have to lie to them!

“In short, buy the REAL book by this title, not this pamphlet. Confirm your choice with The Confederate and Neo-Confederate Reader.”

And I gave the pamphlet one star.

Every word of my comments is accurate. Moreover, they address the product they purport to review. Nevertheless, Amazon censored my comment. “After carefully reviewing your submission,” Amazon replied, “your review could not be posted to the website. While we appreciate your time and comments, reviews must adhere to the following guidelines:

● “Your review should focus on specific features of the product and your experience with it. Feedback on the seller or your shipment experience should be provided at www.amazon.com/feedback.

[I had not commented on the seller or shipping experience.]

●  “We do not allow profane or obscene content. This applies to adult products too.

[My comments were not profane or obscene.]

●  “Advertisements, promotional material or repeated posts that make the same point excessively are considered spam.

[I had not posted repeatedly or sent promotional material or ads.]

●  “Please do not include URLs external to Amazon or personally identifiable content in your review.

[I had not sent any URLs. I had mentioned “my book title,” but that’s not exactly “personally identifiable content.” I had signed my review, but I always do, as do many reviewers.]

●  “Any attempt to manipulate Community content or features, including contributing false, misleading, or inauthentic content, is strictly prohibited.

[I wrote nothing false, misleading, or inauthentic.]

Amazon also provided me with a link to a longer set of “don’ts.” Comments must not be libelous, defamatory, harassing, threatening, inflammatory, obscene, profane, pornographic, or lewd. Comments must not express hatred or intolerance for people on the basis of race, ethnicity, nationality, gender or gender identity, religion, sexual orientation, age, or disability. Don't post other people's phone numbers, email addresses, mailing addresses, or other personal information. Don't engage in name-calling or attack people.

Finally, Amazon gives us something we can do:

●  “You may question the beliefs and expertise of others as long as it is relevant and done in a respectful and non-threatening manner.”

Readers can decide for themselves whether my content falls afoul of any of the “don’ts.”

Well, readers at HNN can decide for themselves. Readers at Amazon cannot decide anything for themselves, because Amazon has kept them from seeing my comments. 

Copyright James Loewen

That Other Dick Cheney Movie

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

Last week, instead of watching “Vice,” my wife and I watched that other Dick Cheney movie, “The World According to Dick Cheney.” R. J. Cutler made this documentary in 2013. 

It’s true that the actors in “Vice” look amazingly like Cheney, Lynne Cheney, and George W. Bush, but in Cutler’s film, they look even more like the originals! 

Please pardon that moment of forced humor; my excuse is that the rest of this brief review won’t be funny. But since HNN never reviewed “The World According to Dick Cheney” when it came out, and since the documentary will draw new viewers now, spurred by “Vice,” I want to get my critique of it out into the world. 

Although I didn’t actually time it, about half of the film’s running time seems to be the real Dick Cheney on screen. Of course, the title promises as much. One would hope, however, that with a character as controversial and important as Cheney, the interviewer would ask hardball questions. Instead, he mostly tosses batting-practice lobs.  

No one asks Cheney anything about Halliburton, for instance. What role did the company play in the Iraq War and its continuing aftermath? How did Halliburton fare financially? Does Cheney still own Halliburton stock? How did he fare financially?

The narrator or one of the authorities the film interviews does note that the reason Saddam Hussein pretended to have “weapons of mass destruction” was to intimidate his neighbors, notably Iran. But it never points out that, threatened by Bush/Cheney, he had reversed himself and allowed United Nations inspectors full access to his country, so they might investigate Cheney’s claim of WMD. After seeing the movie, people would never guess that the inspectors had to get out of the country to avoid being killed during Bush/Cheney’s “shock and awe” demonstration that kicked off the war. 

The film never bothers to show a graph or give any other accounting of the Iraqis killed and wounded in our war and in the later civil war it triggered, which in a way is still going on. Nor does it tell how many Americans died or how much we spent. One of Cheney’s most absurd prewar assertions was his claim that the war would pay for itself, from oil revenues we would somehow get. None of this gets into the movie. 

To its credit, the movie does contest Cheney’s claims that our torture of POWs was not torture and was legal. It makes some other important points as well. But it never explicates clearly what went wrong in Iraq and why. Two key decisions by Bush/Cheney caused the U.S. occupation to become a quagmire: their dismissal of the Iraqi army and their dismissal of the police. 

During World War II, when Germany occupied, say, Holland, it did so through the Dutch state. When we occupied Japan, we did so through the Japanese state, even including the emperor. That’s how it’s done. Only … not in Iraq. Somehow Cutler never mentions these preposterous decisions to Cheney, never asks him to defend them. 

Before Bush/Cheney upended Iraq, Hussein was in a box. He had few allies; Iran, Kuwait, Saudi Arabia, and Israel were his enemies. He could not use his air force, not even against his own people, as was his wont, and was facing internal opposition from Shiites and Kurds. We had him where we wanted him. Hussein never supported al Qaeda and never sponsored terrorist attacks in the U.S. or Europe. 

By 2013, when the movie was made, where were we? Bush/Cheney’s war had sparked the creation of ISIS in western Iraq, which also spread to Syria (and later to the Philippines and other countries). ISIS had linked with al Qaeda. Meanwhile, the Iraqi government that we created in Baghdad to replace Saddam Hussein was Shiite, like Iran’s, increasing Iran’s influence in the region. Cheney’s primary defense of all his policies is that they made America safer, but Cutler never asks him how these developments could possibly have done so.  

“Vice” may be a one-sided take-down of Cheney, “The World According to Dick Cheney’s Enemies,” but it’s needed to set the record straight after “The World According to Dick Cheney.” Unfortunately, historians ages hence may not credit a docudrama against a documentary. That will be their loss, and our nation’s. At the end of “The World According to Dick Cheney,” he goes off contentedly fly-fishing into the Wyoming sunset.  Under Cheney our foreign policy failed both on humanitarian grounds and with regard to our realpolitik interests. Our culture needs to show that we understand this. We cannot afford to have an affable view of this vice-presidency. 

Copyright James W. Loewen

Mennonite Values

Menno Simons

I claim to be "genetically Mennonite." Of course, since Mennonites are a religious group, not a racial/ethnic group, the claim is oxymoronic. Nevertheless, I mean it seriously as well as tongue-in-cheek. It does turn out, I think, that all Loewens in the world, at least all I have ever met, are of Mennonite origin. Add anything -- "Loewenberg," "Loewenstein" -- and it's Jewish. Subtract -- "Loewe," "Lowe" -- and it's likely Jewish but not always. But "Loewen," ironically meaning "lions," is usually Mennonite. 

Mennonites are followers of the Protestant minister Menno Simons, who lived in Holland 1496-1561. Mennonites were the first group in the Western World to come out against slavery and against war. Particularly that last stand -- against military service -- has caused them centuries of hardship and grief. 

"Old Order Mennonites" are also called Amish, and they famously forbear modern technology. Most "regular" Mennonites look like everyone else. "My" Mennonites, in Mountain Lake, MN, were good farmers, among the first to electrify. Besides, my dad stopped being a Mennonite and a believer when he was about 24. I was born when he was 39. So I was definitely "regular." Indeed, I grew up Presbyterian, since that church was closest to my house, and since Mom was a Christian. 

Nevertheless, my sister and I recently talked with each other about these matters, and we agreed that some Mennonite values seeped into our upbringing. We both seem to favor the underdog, for example. We both have worked for social justice. We are not impressed by mansions or BMWs. Today I am happy to choose my Mennonite heritage, if not religiously, well, then, as a statement of my values.

In particular, on the last page of the coffee-table book, In the Fullness of Time: 150 Years of Mennonite Sojourn in Russia, by Walter Quiring and Helen Bartel (3rd edition, 1974), are nine lines. Perhaps they are by Menno Simons; I have asked Mennonite scholars but they do not know. They sum up Mennonite values for me. I am particularly taken with the two words "we hope." What a modest claim! We hope that the good and the mild will have the power. Surely they ought to! 

Whose is the Earth?

Whose is the Earth? The toiler's.
Who rules the earth? The wise one.
Who has the might?
Only the good, we hope, and mild.
Vengeance and fury devour themselves.
The peaceful abide and save.
Only the wisest shall be our guide.
The chain does men no honour
and even less the sword.

At the end of my life, I publish these lines thinking that they may come to be meaningful to you. You can claim them just as well as I can! You don't have to be genetically Mennonite to do so! Remember, genetically Mennonite is a contradiction anyway. You don't even have to attend a Mennonite church. (I go Unitarian. But that's another story.) "The chain does men no honor and even less the sword." 

Gresham's Law of Reading: Bad Reading Drives Out Good

James W. Loewen is a sociologist.  The New Press recently brought out new paperbacks of Loewen's bestseller, Lies My Teacher Told Me, and Sundown Towns, about places that were/are all-white on purpose. 

Gresham's Law, as I'm sure you recall from Econ. 101, states, "Bad currency drives out good." It works like this. Suppose you have $100 in gold coins and $100 in paper bills. You want to buy a sport coat for $99. (I did buy a sport coat for $99, just before Christmas.) Are you going to hand over your gold coins or your paper bills? 

You're going to hand over your paper bills. At least most of us will.

After all, the paper bills depend upon the backing of the government. The gold coins have intrinsic value. If North Korea or an ISIS terrorist sets off a nuclear bomb in D.C., where I live, I can escape in my car, camp out in southern Pennsylvania, and maybe trade a gold coin for some bread and cheese from the nearest Amish farmer. Even without the threat of societal breakdown, the gold coins also look nice, so I derive pleasure from merely owning them. From the paper, not so much. 

As a result, gold coins don't work as currency. People don't exchange them. They hoard them. By definition, "currency" is "a medium of exchange." Bad money has driven out good. 

So it goes with reading, at least for me. My current fiction read is Cloud Atlas, a complex, remarkable novel by David Mitchell that takes place in 1841, 1931, more-or-less the present, and several future eras. I recommend it to you. 

I've been reading it for years. First, I used it as bedtime reading. This didn't work, because to the annoyance of my spouse, I fall asleep within 30 seconds of opening it. Then I switched to taking it on trips with me. 

Cloud Atlas has now been to, in chronological order, West Virginia, Indiana, Colorado, Montana, Minnesota, Georgia, California, Wisconsin, Philadelphia, New York City, Switzerland-to-Amsterdam on the Rhine, the United Kingdom, the Bahamas, New York City again, Vermont (twice), and Massachusetts (three times). A year ago it visited the Azores (which were excellent, by the way). This past April, it went down the Nile (a bucket-list trip, fascinating in many ways). Just last month, it ventured to Portland, Oregon, and then to Minnesota. Still, I didn't finish it.  

What is going on? 

It's Gresham's Law of Reading. Bad reading drives out good. 

Specifically, it's the newspaper, in my case, the Washington Post. It's Time, Smithsonian, and Multicultural Perspectives. It's The National Museum of the American Indian. (Yes, that's a magazine as well as the institution that puts it out.) God help me, it's AARP the Magazine and whatever the magazine is called that AAA sends me. I am always behind on reading them, so I always pack a stack of them on my trips. Since I don't want to bring them back home, I always read them first, so I can throw them out. Consequently I rarely get to the gold. 

This pattern does have one payoff: I do catch up on my magazines. This saves me from the fate of a Time subscriber whose letter I still recall from about 1952, when I was ten years old, reading my father's magazine. From memory, it went, 

I really like your magazine. You're doing a fine job. However, it is too much material for me. I file each new issue on my bookshelf on the right, and I read them from the left. Right now I'm in the middle of 1943, and I can't wait to see how it all turns out!

On my last day on earth, however, I shall be sad if I have not finished Cloud Atlas. I doubt I'll lament not having finished the latest AARP. 

Could this perhaps be a metaphor? On that day, might I also be sad, not having taken care of the important things — the gold — while wasting my time on tasks that have currency, but no real value? 

My Life with Books

The New Press just brought out Jim Loewen’s public history book, Lies Across America, completely revised and with a new chapter, “Public History After Charlottesville.” 

Recently, Shelf Awareness, an interesting website new to me, interviewed me, mostly about books that have had an impact on my thinking. The conversation was intelligent (at least on their part), and they published it today, https://www.shelf-awareness.com/issue.html?issue=3588#m45942.

Since I seem to have moved to the memoir phase of life, I was happy to participate, and I offer the result to you below. Slightly altered to read as an essay, it works well, I think. 

What’s on your nightstand now? 

For years I have been reading a fine dystopian fictional "history of the future," the well-known Cloud Atlas by David Mitchell. It's traveled with me to Egypt, the Azores, the Bahamas, all the countries on the Rhine (yes, I took that cruise), Iceland, and at least 20 states! My problem is, when I read in bed, I fall asleep immediately. That's not Mitchell's fault.

Favorite book when you were a child:

I must admit, it was the Dr. Dolittle series. Although I have not looked at them since attaining adulthood, I'm sure they were racist, even colonialist, since a white doctor knew just what to do with and for the animals and people in Africa. That I didn't think about such things probably made their impact all the more insidious, but still, I devoured the books.

Your top five authors:

Mark Twain. He's the only humorist from so many generations ago who is still consistently funny when reread today. And he can be deep, too.

William Faulkner. Yes, I went through my Faulkner period, and though I haven't reread him in years, I'm still happy to remember many passages, both for his values and his prose style.

Walt Whitman. As Stephen Vincent Benét put it, in "Ode to Walt Whitman," "You're still the giant lode we quarry/ For gold, fools' gold, and all the earthy metals,/ The matchless mine."

Vine Deloria. As a Native American, he writes from a different worldview, but he makes it accessible to all.

Edna St. Vincent Millay. Perhaps my mom's favorite poet, she became one of mine too, especially her sonnets that sing of love and lament its loss.

Did you ever fake reading a book?

Yes, György Lukács, History and Class Consciousness. In grad school at Harvard in 1966, I took Barrington Moore's famously difficult course in social theory. Moore taught by the Socratic method, and when he queried you, you'd best be prepared. The time came to study Lukács, but his book was translated into English only in 1971. We were to read chapter 1 of Geschichte und Klassenbewusstsein in German. Supposedly I knew German, having taken two years in high school and two in college and then having scored 720 on the SAT German test. Actually, I knew better. I spent the next afternoon trying to read Lukács. After five hours, I had translated a page and a half. Doing the math, I realized that the whole chapter would take me another 70 hours! I had four other courses! So, when the seminar reassembled a week later, I hunkered down behind the guy in front of me, avoided eye contact with Moore, and thus avoided making a fool of myself about a book I'd not read.

Even after the translation came out, I never read the book. Ironically, reading about it while preparing this answer, I now realize I probably would have enjoyed it and learned from it. Sigh.

Book you're an evangelist for:

The only historical novel I recommend without reservation: Okla Hannali by R.A. Lafferty. Even though it is by a white author, I credit it as a Choctaw history of the 19th century, in the form of a biography of a fictional Choctaw leader who was born in Mississippi around 1801 and died in Oklahoma in 1900. I realize such a statement creates all sorts of problems for me--expropriation of Native knowledge, white arrogance, etc. My only defense is the work itself. I have no idea how Lafferty, otherwise known for science fiction, learned so much about Choctaws (and white folks), but every time I have checked out any fact in Okla Hannali, no matter how small, Lafferty got it right. And what a read! Only a little over 200 pages long, but an epic, nevertheless.

Book you should have hidden from your children:

Thomas Berger's Regiment of Women. I read it when it came out (1973) and enjoyed it. It seemed to me to be a pioneering feminist book and funny as hell. Then my son read it, followed by my daughter. Conversation with them reminded me that the book also contained seriously awry sex scenes that perhaps should not be read by kids age 13 and 11, especially when their mother sought to use any excuse to deny me contact with them. Luckily no complications ensued, either legal or psychological, so far as I know.

Book that changed your life:

Let Us Now Praise Famous Men by James Agee, photos by Walker Evans. Agee's nakedly emotional prose helped me feel what sociologists helped me understand: most poor people are not to be blamed for their poverty. As Agee put it, in the voice of his white sharecropper subjects: "How were we trapped?"

Favorite line from a book:

As I confront the end of my own life: "Come, lovely and soothing death, Undulate round the world, serenely arriving, arriving..." in "When Lilacs Last in the Dooryard Bloom'd," from Leaves of Grass by Walt Whitman.

Five books you'll never part with:

Leaves of Grass
Millay, Collected Poems
Okla Hannali
Let Us Now Praise Famous Men
Louis Untermeyer, ed., Modern American Poetry; Mid-Century Edition. This collection contains many poems that have meant a lot to me, from Whitman and Dickinson down to Langston Hughes and Kenneth Patchen.

Book you most want to read again for the first time:

T-Model Tommy and other books from my childhood. Not Dolittle, though.

Book that played a crucial role in resolving a family disagreement:

Thorstein Veblen's classic The Theory of the Leisure Class. The occasion was a serious conversation my Dad initiated during my sophomore year of college. He was upset that I had changed my major from chemistry to sociology. He confronted me with a challenge: "Just name me one person who ever graduated from Carleton College and made a name for himself in sociology." I was about to reply, "Just name me one person who graduated from Carleton and made a name in anything," but I knew he would come up with some Mayo Clinic doctor who was arguably well-known. Suddenly it came to me: in its 99 years, Carleton College had produced just one truly well-known person, famous for his book, The Theory of the Leisure Class. "Thorstein Veblen," I crowed triumphantly. He was silent.

Let me add, The Theory of the Leisure Class deserves its fame. It is as relevant today as when it came out in 1899. It explains how we model our behavior and our standards of success--even of morality--on the class next above us in social structure, all the way up to "the wealthy leisure class," his name for what we call the 1%. One chapter, "Devout Observances," also contains a new and even hilarious sociology of religion. If you aren't motivated to go read Lies Across America to understand how Americans misconceive the social world (a grievous mistake!), then can I persuade you to read Veblen?

Tax Protesting on the Cheap

My years in grad school (1964-68) coincided with LBJ’s escalation of the Vietnam War. The more I learned about our actions, the less defensible they seemed. Continuing and even expanding the war was neither humane nor in our country’s Realpolitik best interest. Finally, I vowed not to support it with my tax dollars.

By my calculation, about half of our budget went to the military, especially if one included interest on the national debt, which had largely stemmed from World War II, the Korean War, and now the Vietnam War.

But how to protest? From my Harvard teaching stipend, the government already took out more tax dollars than I owed.

Luckily, I was the “proctor” of the “Harvard Co-op House,” which meant I lived and ate free, so I had saved a few hundred dollars from my fellowship and invested them in stocks. My choices had made money, so I owed taxes on my profits. Indeed, I owed exactly $9.96.

I filled out my IRS tax return and paid exactly half. Then to my tax return I attached a letter explaining why my check was for $4.98.

For weeks I heard nothing. Well, each month I got a reminder notice that my obligation had gone up, with penalties and interest, first to $5.20 and then I think $5.50. Then one morning an undergraduate rushed up to my room. “Loewen, Loewen!” he gasped, “there’s an IRS guy downstairs in a trench coat and he’s asking for you!”

His anxiety was contagious. I combed my hair, straightened my clothing, and came down the stairs, heart pounding.

The man indeed wore an ominous dark trench coat, but he proved to be a sweetheart. “Sorry to bother you,” he began, “but it’s about your taxes.” He went on to tell me that he understood my motivation perfectly; his own son was doing the same thing! He went on to explain that he wasn’t a collector. His job was to explain to me my options, which were three:

-- “You could pay what you owe.” I was considering doing just that, but he went on immediately, “I don’t suppose you’ll do that, or you would have paid in the first place.” “Oh, right,” I replied.

-- “You could do what Joan Baez and all those other celebrity tax resisters do,” he went on. “Tell us where you assets are, and the government will seize the money. That way, you’ll be off the hook, but not through any act of yours.”

That sounded a bit hypocritical to me, so I asked, “What’s the third option?”

“Oh, we’ll hound you to death,” he replied.

That actually sounded interesting. “I’ll take number three,” I said.

“OK, then,” he said, explaining again that his job that morning was to give me my options and record my response.

Later that day I went to my two banks, one where I had my checking account and the other a small savings and loan where I had my savings, and closed both accounts. I figured why make it easy for the government to seize my assets?

I had forgotten about my paycheck, however. At the end of the month, my usual envelope came from Harvard with my salary for being House Tutor. It contained not just my usual check, but a letter on government letterhead, signed, or at least stamped, by the Treasurer of the United States. He explained that the government had garnished some $6 from me, which was why my check was short that month. Since a citizen cannot sue the government without its consent, the letter went on, my only recourse was to ask the Treasurer to give it back to me.

That would be unlikely to succeed, I concluded. But what if the government had made a simple error in computing my taxes and instead of $6 claimed I owed $6,000,000? Could they garnish all my pay? Forever? With no recourse? A scary imbalance of power!

Although I had lost $6, I was satisfied. Surely I had cost the government hundreds of dollars to collect my six. That had to have registered somewhere in the bureaucracy.

After finishing my degree, I moved to Mississippi to teach at Tougaloo College. One or two war protesters had flatly refused to pay taxes in Mississippi and were facing prison terms in Parchman as I recall, even though Parchman Penitentiary is a state facility. Again, however, that option was not open to me, because at year’s end, the government would owe me money back from my withholding. However, I now had my own phone, which meant I now had my own phone bill. Every month I paid not only Southern Bell (the temptation to add an “e” is almost overwhelming) for my calls but also an “excise tax” to the federal government. This 10% levy was put on phone bills during World War II, explicitly as a war tax. After the Korean War ended, it was being phased out, when Lyndon Johnson reimposed it to help pay for the Vietnam War. Across the country, antiwar activists were refusing to pay it.

At first, phone companies simply carried over the unpaid amount as if it were part of the phone bill. When the amount grew too large and went unpaid for too long, they shut off the phone. Having no phone made it hard to participate in modern society. By 1968, most phone companies were taking the reasonable position that they were not a collection agency for the federal government.[i] They reported the unpaid assessment to the government, then dropped the amount from the next bill. Then they repeated the process for that month’s new excise tax.

In Mississippi, no one had raised the issue, so Southern Bell kept piling up my unpaid amounts. I could see where this was headed, so I phoned them. (Back then, you could phone the phone company.) As soon as they heard my plea, with the precedent of other companies elsewhere, they changed their policy to match. Now my excise tax obligations were piling up where they belonged, between the federal government and me.

Again, the amounts were minuscule – less than $2/month. Again the government spent much more money threatening me each month. I honestly don’t remember how this tempest wound down, but I think I might have taken the easy way out. I was now engaged in social change in Mississippi, a full-time calling, as well as teaching at Tougaloo, and didn’t have time or energy to spare. But I still salute those heroes, from Joan Baez and Jane Fonda to lesser-known folk like Steve Trimm and John McAuliffe, who opposed the war with their whole being. They played a major role in ending it. Me, only a minor one.

 

[i] This position may have resulted from a court ruling.

A Renaming Everyone Can Get Behind

For a decade at least, Washington, D.C., has been stuck in ugly political gridlock. As a step toward renewed bipartisanship, I offer this modest proposal. 

Around the turn of this century, Republicans engaged in a wave of memorials and renamings for President Ronald Reagan. In 1998, a Republican Congress passed a bill requiring that Washington National Airport be renamed for Reagan. The airport authority and many D.C. residents pointed out that it was already named for one president, but "Ronald Reagan Washington National Airport" went into effect nevertheless. 

After George W. Bush entered the White House in 2001, the renaming went on in earnest. Historians know that memorials in the U.S. have often sprouted in waves. Union monuments began to go up immediately after the Civil War. Most Confederate memorials were dedicated much later, in the period 1890 to 1940. Why? Because victors usually put up memorials, and in about 1890, the Confederacy — or more accurately, since it was a new generation, neo-Confederates — won the Civil War. And, Republicans argued, had not Reagan similarly won the Cold War?

A year or so after the breakup of the Soviet Union, I heard an interview about it with Eduard Shevardnadze, who had been Foreign Minister of the U.S.S.R. Asked if Ronald Reagan deserved partial credit in some way for the downfall of Communism and the breakup of the U.S.S.R., he was momentarily struck dumb. Clearly he had never thought of that hypothesis. Having considered it, he rejected it out of hand, citing more basic economic, societal, and ideological contradictions within the Communist system. 

But this made no difference to Republicans. Years ago Walt Kelly had mocked such thinking in his comic strip Pogo, in a scene in which Albert Alligator, claiming some political mantle at the time, took credit for the weather, a fine sunny day. "Why not?" he protested. "It happened during my administration, didn't it?"

The resulting mania for memorializing Reagan thus reflected a political rather than historical judgment. Historically, Ronald Reagan surely ranks no higher than the third best Republican president of the twentieth century, well below Teddy Roosevelt and Dwight Eisenhower. No matter. Grover Norquist, leader of the Ronald Reagan Legacy Project and even more famous for his no-tax-increase pledge, called for a monument to Reagan in each of America's 3,067 counties and on the national mall in Washington, D.C.; his face on the $10 bill, replacing Alexander Hamilton's; and perhaps his profile added to Mount Rushmore! "Or we could have our own mountain," suggested Norquist.  

An article by Greg Kaza in National Review called Mt. McKinley a "precedent" for renaming some other peak for Ronald Reagan. Of course, more recently McKinley has given way to Denali, its aboriginal name, but at the time the example made sense. 

I have a suggestion for a Mount Reagan that I think will never get renamed for someone else.

Each of our United States has by definition its highest point. The highest point in Reagan's home state of California is already named, of course, for Josiah Dwight Whitney, who founded the California Geological Survey. So is the highest point in Reagan's native state, Illinois, 1,235' high Charles Mound. In fact, the highest point in every state is already named, even Florida's Britton Hill, a mere 345' from sea level — except Delaware's.

Indeed, Delaware's tallest spot was misidentified until recently. It was thought to be marked by a National Geodetic Survey azimuth on Ebright Road between Brandywine and Brandywood in far northern Delaware. The Ebright Azimuth turns out not to be the highest point in Delaware, however. The actual highest point, at 451', a full two feet above the Ebright Azimuth, is in a mobile home park some 300 yards west. It is "the elevation in front of the first trailer," according to William S. Schenck of the Delaware Geological Survey. 

Of course, Ronald Reagan had nothing particularly to do with Delaware. But then he had nothing to do with aviation, either, except for smashing the air traffic controllers' union, which didn't stop Republicans from renaming Washington National Airport Ronald Reagan Washington National Airport. 

And, like McKinley with Spain, Reagan did win a war — with Grenada. So perhaps he does deserve to have a mountain named for him. Delaware's tallest spot — “Mount Reagan” — is perfectly appropriate. It matches exactly the size of the war Ronald Reagan won. 

Don’t Tear Down the Wrong Monuments; Don’t Attack Every Holiday

The United States defeated the Confederacy's Army of Northern Virginia at Gettysburg on July 3, 1863. That same evening, General John Pemberton agreed to surrender the Confederate army holding Vicksburg to Ulysses Grant. The next day, when the news of both Union victories began to spread through the nation, was surely the most memorable Independence Day in American history after the first one, four score and seven years earlier. Pemberton’s men stacked their arms and went home. Lee’s men withdrew from Gettysburg and made their way south. 

Historians have argued ever since over which victory was more important.

Both battlefields are now under the care of the National Park Service. This past Fourth of July, few people visited either, but when travel becomes easier, if you're within range, I suggest you visit whichever park is closer. Both parks are beautiful, especially in mid-summer. However, do not ask the question several NPS rangers submitted to me as their nomination for the dumbest query ever received from a visitor: "How come they fought so many Civil War battles in parks?" Instead, if you're at Vicksburg, suggest to the ranger that Gettysburg was the more important victory; if at Gettysburg, suggest Vicksburg. Probably you and your fellow tourists will be informed as well as entertained by the response. 

Perhaps the Mississippi victory was more telling, for several reasons. Vicksburg had been called "the Gibraltar of the Confederacy." After Richmond, the Confederate capital, it was surely the most strategic single place in the South, because it simultaneously blocked United States shipping down the Mississippi River and provided the Confederacy with its only secure link to the trans-Mississippi West. Vicksburg's capture led to the capitulation of the last Confederate stronghold on the Mississippi, Port Hudson, Louisiana, 130 miles south, five days later. This reopened the Mississippi River, an important benefit to farmers in its vast watershed, stretching from central Pennsylvania to northwestern Montana. Abraham Lincoln announced the victory with the famous phrase, "The Father of Waters again goes unvexed to the sea." In the wake of the victory, thousands of African Americans made their way to Vicksburg to be free, get legally married, help out the Union cause, make a buck, do the laundry and gather the firewood, and enlist in the United States Army. No longer was slavery secure in Mississippi, Arkansas, or Louisiana. Many whites from these states and west Tennessee also now joined the Union cause. 

But perhaps the Pennsylvania victory was more important. It taught the Army of the Potomac that Robert E. Lee and his forces were vincible. Freeman Cleaves, biographer of General George Gordon Meade, victor at Gettysburg, quotes a former Union corps commander, "I did not believe the enemy could be whipped." Lee's losses forced his army to a defensive posture for the rest of the war. The impact of the victory on Northern morale was profound. And of course it led to the immortal words of the Gettysburg Address. 

If you go to Vicksburg on the Fourth, be sure to visit the Illinois monument, a small marble pantheon that somehow stays cool even on the hottest July day. In Gettysburg, don't fail to take in the South Carolina monument. It claims, "Abiding faith in the sacredness of states rights provided their creed here" — a statement true about 1965, when it went up, but false about 1863. After all, in 1860, South Carolinians were perfectly clear about why they were seceding, and "states rights" had nothing to do with it. South Carolina was against states’ rights. South Carolina found no fault with the federal government when it said why it seceded, on Christmas Eve, 1860. On the contrary, its leaders found fault with Northern states and the rights they were trying to assert. These amounted to, according to South Carolina, “an increasing hostility on the part of the non-slaveholding States to the institution of slavery.” At both parks, come to your own conclusion about how the National Park Service is meeting its 1999 Congressional mandate "to recognize and include ... the unique role that the institution of slavery played in causing the Civil War."

The twin victories have also influenced how Americans have celebrated the Fourth of July since 1863. Living in Mississippi a century later taught me about the muted racial politics of the Fourth of July. African Americans celebrated this holiday with big family barbecues, speeches, and public gatherings in segregated black parks. Even white supremacists could hardly deny blacks the occasion to hold forth in segregated settings, since African Americans were only showing their patriotism, not holding some kind of fearsome “Black Power” rally. Both sides knew these gatherings had an edge, however. Black speakers did not fail to identify the Union victories with the anti-slavery cause and the still-unfinished removal of the vestiges of slavery from American life. This coded identification of the Fourth with freedom was the sweeter because in the 1960s, die-hard white Mississippians did not want to celebrate the Fourth at all, because they were still mourning the surrender at Vicksburg. We in the BLM movement can take a cue from the past. We can be patriotic on July 4 without being nationalistic. As Frederick Douglass put it, by my memory, “I call him a true patriot who rebukes his country for its sins, and does not excuse them.” And true patriots can also take pleasure from their country’s victories against a proslavery insurrection.

Muted racial politics also underlie the continuing changes on the landscape at both locations. In 1998 Gettysburg finally dedicated a new statue of James Longstreet, Lee's second in command. For more than a century, neo-Confederates had vilified Longstreet as responsible for the defeat. He did try to talk Lee out of the attack, deeming the U.S. position too strong, and his forces did take a long time getting into place.

James Longstreet had to wait to appear on the Gettysburg landscape until the United States became less racist.

Hopefully BLM protesters are informed enough to know not to tag or topple this Confederate monument. 

But the criticisms of Longstreet really stemmed from his actions after the Civil War. During Reconstruction he agreed that African Americans should have full civil rights and commanded black troops against an attempted white supremacist overthrow of the interracial Republican government of Louisiana. Ironically, ideological currents set into motion by the Civil Rights movement help explain why Gettysburg can now honor Longstreet. No longer do we consider it wrong to be in favor of equal rights for all, as we did during the Nadir. 

When I lived in Mississippi in the 1960s and '70s, bad history plagued how Grant’s campaign was remembered on the landscape. For example, a state historical marker stood a few miles south of Vicksburg at Rocky Springs:

Union Army Passes Rocky Springs

Upon the occupation of Willow Springs on May 3, 1863, Union Gen. J. A. McClernand sent patrols up the Jackson road.

These groups rode through Rocky Springs, where they encountered no resistance beyond the icy stares of the people who gathered at the side of the road to watch.

Actually, the area was then and remains today overwhelmingly black. "The people," mostly African Americans, supplied the patrols with food, showed them the best roads to Jackson, and told them exactly where the Confederates were. Indeed, support from the African American infrastructure made Grant's Vicksburg campaign possible. 

In about 1998, Mississippi took down this counterfactual marker. Or maybe a vigilante stole it — no one claims to know. Either way, the landscape benefits from its removal. Six years later, with funding from the state and from Vicksburg, a monument to the roles African Americans played in support of Grant’s campaign went up at Vicksburg. It shows a wounded U.S.C.T. (United States Colored Troops) soldier being helped to safety by another member of the U.S.C.T. and by a black civilian. 

Now, if we can just fix that pesky South Carolina monument... 

"Have You Lived Your Whole Life in Vermont? Well, Not Yet!": One State's Joke Culture

Having never seen this T-shirt within the state, I don’t think it’s really a Vermont joke. 

Why do some states develop a joke culture while others don’t?

For that matter, some states develop much stronger state identities than others. One is or is not a native Vermonter, for example. Indeed, I probably should have written “Native Vermonter.” 

I moved to Vermont in 1975. Shortly after arriving, I went to a potluck dinner at the home of the minister of the Burlington Unitarian Church, hoping to make new friends. Two dozen people came, and we went around the room introducing ourselves. Perhaps the third person to speak was also the oldest, 84 years of age, and he spoke for several minutes, but his point was not to tell of any of the interesting jobs he had held or experiences he had had. No, he went back in time to explain and lament that he was not a native Vermonter, having lived in the state for only the last 82 years. From that point on, around the room, everyone took care to note whether they were or were not native to the state. 

Mississippi is like that. So is Texas. 

Compare Illinois, my home state. We don’t even know how to pronounce it! Some say “Illinoisan” without pronouncing the “s”; some say “Illinoisian,” and in that camp no one knows whether to pronounce the “s” or not; and no one cares anyway. Massachusetts does not even have a name for it—“Massachusettsian”? “Massachusettser”? [there is in fact a widely-shared nickname for Bay State residents, popular among residents of the northern New England states, but it is profanity-adjacent--ed.]

Vermont, you may know, is also a state that has jokes. Most states do not. Some states are known to be the butt of others’ jokes, such as the cruel and misleading anecdotes told about West Virginians by people in neighboring states. But Vermonters tell their own jokes—wry, a bit sly, sometimes showing some wisdom. 

Often these jokes tie to the notion of the Vermont native and his[i] character. For example, a “flatland” tourist finds herself asking a Vermonter in at least his 80s, “Have you lived in Vermont your whole life?” Comes the laconic reply, “Not yet.”

Here are all the Vermont jokes I ever heard told within the state, starting with the worst. 

            Flatlander during dreary all-day drizzle: “Think this rain’ll stop?”

            Vermonter: “Always has.”

            Flatlander (me) during dreary all-day drizzle: “Think this rain’ll keep up?”

            Lumber yard worker: “Hope so.”

            Me: “You hope so?”

            Lumber yard worker: “Ayup. Then it won’t come down!”

            Vermonter to flatlander: “Don’t like our weather? Wait a minute.” 

A flatlander wants to take a shortcut across a field, but is worried by the bull he sees grazing in the middle of it. So he asks the nearby farmer, “Say, is that bull safe?” 

Vermonter: “Sure, he’s safe.” 

So the flatlander jumps the fence and starts across the pasture. 

Vermonter: “Can’t say the same for you, though.”

An elderly Vermont farm couple sits on their porch, watching a typical yet stupendous Vermont sunset over the low scarlet-tinged hills. Slowly pink turns to crimson turns to vermilion. It is literally breathtaking. At last, as they turn to go in, the husband says quietly, “We’ll pay for that.” 

Driving through one of our many intersections with no stop signs, a flatlander and a Vermonter both reacted too slowly for the icy conditions and had a fender bender. As they were exchanging insurance information, the Vermonter suggested they retreat to the pub that happened to be at the corner, to get out of the cold. As they entered, the Vermonter called to the waitress, “A beer for my new friend here.” 

“How nice,” thought the New Yorker. “This would never happen in the city.”

They called the authorities, and each wrote down the other’s insurance company and phone number. Then the flatlander realized that the Vermonter hadn’t bought himself anything to drink. “Let me get you a beer!” he said.

“Oh, no thank you,” replied the Vermonter. “I’ll just wait until the police have come and left.” 

No one had won the Vermont Lottery for several weeks in a row, so it had grown to more than $3,000,000. Finally a winner was announced, and it turned out to be a native Vermonter, a farmer all his life, so a newspaper reporter from Burlington was sent to interview him.[ii] “What do you plan to do with the winnings?” she asked.

“Nothin’,” replied the farmer.

“$3,000,000?! You must have some plans!”

“Nope,” said the farmer. “Reckon we’ll just stay right here and farm, ‘til it’s gone.” 

                       

The Burlington Free Press followed the custom of interviewing couples celebrating their 50th anniversaries and then writing little human interest stories about them.[iii] Then came the news that an old couple up in the Northeast Kingdom had just passed their 75th anniversary! Of course a reporter went to interview them. She knocked on the door and the husband let her in. They both sat down in the front room and the wife joined them. 

“You’ve been married longer than anyone else in the state, so far as we can tell,” gushed the reporter. “Do you have any secrets to share as to what helped you stay married so long?”

“Nope,” said the wife. “We hate each other.”

The reporter blanched. “Really?”

“Ayup,” said the husband. “For years we’ve had an in-house separation. And look – here she is, in my part of the house. It’s an outrage.” He glared at her. 

“You’re one to talk,” said the wife. “Last Sunday you ate in the kitchen.” And she shot him a look of sheer malevolence.

The reporter was shaken by the hatred spewing from both sides. With a tremulous voice she asked, “Well, if you hate each other so much, why haven’t you divorced?”

In unison, they both replied, “We’re planning to. We just wanted to wait ‘til the kids were dead.” 

A young woman schoolteacher in Boston decided to make a major change in her life. Her boyfriend had just moved out, and she wanted to leave old memories behind. So she applied for a teaching job in Brattleboro, the town closest to Boston but nevertheless in Vermont, and she got hired. Moving to her new state, she decided to go for the full Vermont experience and found a cabin in the woods for rent. It was an elegant modern cabin, but still, it was rural Vermont, in a lovely wood and with a beautiful view. 

The second week of school, she was driving home when suddenly a deer leaped out in front of her. Unprepared, she hit it, crumpling her fender against her wheel, and had to get roadside assistance. 

She was telling the tow truck man how the deer had surprised her. He pointed out the yellow sign with the dancing deer. “That means ‘deer crossing,’” he said. “Oh,” she replied. From then on she drove more carefully. 

Nevertheless, after the first PTA meeting, coming home late in the evening, another deer was in the roadway, and again she hit it. Another repair bill. Colleagues at work told her dusk was a particularly dangerous time for deer to be out. 

She drove still more carefully at dusk. 

As October passed, however, she failed to realize that dusk-like conditions also occur at dawn, and dawn came later every day. Setting forth early one morning, planning to have her wake-up coffee at school, she was stunned when a big buck jumped in front of her. This time her car was totaled, although she was not hurt. 

That evening, irate, she wrote a letter to the Vermont Department of Transportation asking – no, demanding – that they move the sign. 

It was foliage season. A wealthy Texas rancher on holiday was driving along one of our quaint two-lane highways and came upon a farmer, tinkering with his tractor by the side of the road. “I’m a farmer,” he reasoned to himself. “I’ll have a conversation with the fellow.”

So he stopped and introduced himself. “This your place here?”

“Ayup.”

“How big a spread you got, then?”

“Well, my land begins up there by the potato shed, takes in the woodlot, comes down along the creek there, and then back up along the road.” 

The rancher had never heard of anything so dinky. He just had to reply, “Y’know, back home in West Texas, I can get in my truck and drive west all day and never reach the end of my property.”

“Ayup,” said the Vermonter. “Had a truck like that.”

Down in southwestern Vermont, the New York/Vermont border isn’t Lake Champlain but a man-made line, and indeed, there had been a dispute about the exact location of that line since the formation of Vermont back in the eighteenth century. Finally the selectmen of the Vermont town got together with their counterparts across the state line and hired a surveyor to resolve the matter. Much of the dispute was on Ebenezer Jones’s property, and after the surveyor finished, part of his land, including the farmhouse, proved to be in New York state.

Who would tell him? Who would tell a fifth-generation Vermonter that in fact he was not, had never been, a native Vermonter? Finally they decided to go as a group. 

It was a warm August evening. The head selectman knocked. “Eb, you recollect we got this border dispute with New York?”

“Ayup,” he replied. “Had surveyors on my property.”

“Yes,” said the selectman. “That’s what we want to talk to you about. It turns out that the line was bad. Actually, most of your land, including the house here, is not in Vermont at all, but in New York state.”

To their astonishment, a broad smile came across Ebenezer’s face.

“You’re smiling!” the selectman exclaimed. “We thought you’d be downcast. You realize this means you’re not a native Vermonter, don’t you? Never have been? Why are you smiling?”

“Well, boys,” replied Eb, “it’s like this. You see, I’m gettin’ on in years. Passed my 82nd birthday last May. I just don’t think I can take another Vermont winter!”

That’s it. Well, there were a few more, having to do with giving directions to tourists, but they were truly terrible. 

Hoping to supplement my own haphazard in-person research on Vermont jokes, I went to the web. The first URL listed by Google, “Vermont Jokes” at Jokes4Us.com,[iv] proves to be a dead end. Not one of its jokes has ever been told within the state.[v] They simply demean the state, and most are generic, applicable to any state (or college, city, etc.). For example,

“Q: Did you hear about the fire in University of Vermont's football dorm that destroyed twenty books?

“A: The real tragedy was that fifteen hadn’t been colored yet.”

Since the University of Vermont gave up football in 1974 – the only flagship state school ever to do so – it’s safe to say that this joke has never even been told about Vermont, let alone in the state. 

At another website, I did come upon one possible Vermont joke that was new to me: 

A flatlander was visiting Vermont and stopped at a farm stand to buy some apples. As he stood in the barnyard talking with the farmer, a three-legged pig walked by. “I’ve never seen a three-legged pig,” said the tourist. “How’d he get that way?”

"Last summer I was out plowing and my tractor overturned and pinned me,” the farmer replied. “That pig came running and all by himself dug the dirt out from around my head and then ran back to the farmhouse and got help. That pig saved my life!"

"But how'd he lose the leg?"

"Well," said the farmer, "a pig that good you don't eat all at once." 

Further research reveals versions of that joke in many other locales, however, including England, “the country,” and even Australia. In the Australian version, the pig is even more amazing: it herds the farmer’s sheep, milks the cows, collects the eggs, and even does his taxes!

So I concluded it’s not a Vermont joke. 

Also, I made up a Vermont joke myself: “I’ve been stealing my neighbor’s maple sap,” said Tom, surreptitiously. But my native Vermont friends assured me it was not really a Vermont joke, indeed that no Tom Swiftie or other pun could ever be a Vermont joke. 

However, if you know a Vermont joke or two – real ones – please send them to me. Eventually they’ll wind up on my website.

I did substantial research – of sorts – and I’ve reached the end of this essay, but I’m no closer to understanding why some states develop state jokes while others don’t. I did enjoy collecting the jokes, though. Maybe you enjoyed them too? 

 

[i] Usually “his,” not “hers,” I’m sorry to report.

[ii] Back when the Free Press had reporters.

[iii] Back when the Free Press wrote stories.

[iv] http://www.jokes4us.com/miscellaneousjokes/worldjokes/vermontjokes.html.

[v] I realize this is an impossibility theorem, but I stand by it. You read them. All of them demean the state and most are generic, to be applied to any state (or college, city, etc.). As well, few are funny.

The Donald J. Trump Presidential Library Joins a Proud Tradition

James W. Loewen

jloewen@uvm.edu

Some progressives on Twitter argue that there must never be a Trump Presidential Library. They don’t realize that it already exists. Visit it virtually to experience its many storied exhibits.

However, it tells us nothing about the books. Libraries are supposed to have books, are they not?

Of course, Donald J. Trump was notorious for never even getting to the end of a two-page intelligence report, let alone finishing a book, so perhaps this is appropriate.

Trump did have his own library, consisting of just three volumes: The Bible, The Art of the Deal, and Golf for Dummies. It is rumored that he never read any of them, even the one he “wrote.” All three are still in print, however, so the new Trump Library can easily remedy its deficiency by buying them, shelving them in a closet, and labeling the door “Library Stacks.”

Presidential libraries have the same relationship to history as a dog to a fire hydrant. The John F. Kennedy Presidential Library and Museum says you can “Immerse yourself in the dynamic history of the Kennedy Administration.” Just don’t expect to get a balanced assessment of his presidency, fatally flawed on Civil Rights and Vietnam. Go to the Sixth Floor Museum, in Dallas, for that.

Don’t visit the Richard M. Nixon Presidential Library and Museum to learn about Watergate. When it opened, none of the sixteen film clips on display treated Watergate. When it simply had to discuss Watergate, the library mystified it: “The story of Watergate is enormously complex. Even today, basic questions remain unknown and perhaps unknowable.”

Ronald Reagan’s “Library & Museum” displays its partisanship in the very first words on its website: “In a storied career that spanned more than five decades, Ronald Reagan inspired Americans to act and achieve even more than they imagined. His legacy thrives at The Reagan Library where events and exhibits rediscover his values, actions and spirit of determination.” You won’t find much about Iran-Contra, but you will encounter at least five different Reagan busts and statues, not counting several more in the gift shop.

A special feature of the new Trump Library is its Rooftop COVID Cemetery for VIPs in Trump’s orbit who fell ill and died of this horrible disease.

Presidential libraries exist to get visitors to think well of their namesakes, not to think about them. In this company, the new Trump Library and Museum is a breath of fresh air. It boasts exhibits on such difficult topics as “Failed COVID-19 Response” and “Access Hollywood Tape,” and it gives Trump full credit for the attack on the Capitol.

Too bad it’s a spoof.
