History People Are Talking About Archives 5-26-03 to 7-16-03
James Dao, writing in the NYT about the reunion at Monticello of descendants of the clans of Hemings and Jefferson (July 14, 2003):
[B]eneath the uplifting veneer of this weekend's reunion lies an increasingly rancorous battle between the Hemings clan and some of Jefferson's descendants over who can claim the Jefferson birthright. At its heart, the fight is a metaphor for Americans' deeply conflicted views on race, family and Jefferson himself.
On one side, many of the Hemingses have argued for an all-inclusive definition of family that would encompass the offspring of all seven of Sally's children. Some have argued that the group should be expanded to include all the descendants of Elizabeth Hemings as well.
The DNA test concluded that there was strong evidence that a Jefferson male, probably Thomas himself, fathered one of Sally's sons, Eston. The Thomas Jefferson Foundation, the nonprofit organization that runs Monticello, issued a report in 2000 saying that the DNA results, combined with other historical evidence, indicated "a high probability" that Jefferson fathered Eston, and possibly five of Sally's other children.
That conclusion has been endorsed by the National Genealogical Society and a number of prominent Jefferson scholars, many of whom had rejected the Hemingses' claim before.
"Prior to the DNA, I'd say the case against Jefferson didn't reach beyond reasonable doubt," said the historian Joseph J. Ellis, the author of a Jefferson biography, "American Sphinx." "Jefferson is now regarded by most serious scholars as having clearly had a sexual relationship with Sally Hemings."
But the Monticello Association, which operates the Jefferson cemetery and represents descendants of Thomas Jefferson and his wife, Martha, has not accepted the DNA findings as conclusive. And the Thomas Jefferson Heritage Society, which includes some of Jefferson's descendants, commissioned its own panel, which concluded in 2001 that Jefferson's younger brother, Randolph, was more likely to have been the father of Sally's children.
"The reason we don't think Jefferson did this is that his reputation meant everything to him, and he would not have risked it on a young slave woman," said Nathaniel Abeles, president of the Monticello Association. "He had everything to lose and nothing to gain, especially when there were plenty of other available women at that time."
The fight between the groups has at times taken on the orchestrated nastiness of a political campaign.
Led by Mr. Abeles, the association set a limit on guests to this year's association meeting in Monticello, held in May, after he learned that Hemings family members were conspiring with sympathetic association members to send a large contingent.
The Hemings group later discovered that Mr. Abeles's wife, Paulie, had monitored their plotting by joining their Yahoo e-mail group, posing as a 67-year-old black woman named Cassandra Lewis. Mrs. Abeles has admitted to the ruse, claiming she was monitoring the Hemings group's efforts to infiltrate and perhaps disrupt the association's meeting.
"We found out about a lot of things that people were trying to do to get around rules for our meetings," Mr. Abeles said in a telephone interview.
The dispute has clearly created bitter divisions within the once sleepy association. During today's service at Monticello's slave graveyard, Susan Hutchison, a descendant of Martha Randolph, Jefferson's daughter, read a statement apologizing to the Hemings family for the association's exclusive policies and expressing "deep regret" that Jefferson owned slaves.
Brent Staples, writing in the NYT (July 16, 2003):
All roads at Monticello lead back to Elizabeth Hemings (1735-1807). Nearly all of the household servants who cared for Thomas and Martha, right up to the moments of their deaths, were her children and grandchildren. This great matriarch, whose progeny included about 80 of the 600 or so slaves that Jefferson owned during his lifetime, was initially a slave of Martha Jefferson's father, John Wayles. Elizabeth bore six children by Wayles. When he died, the Jeffersons inherited 11,000 acres and more than 130 slaves, including Elizabeth and her children, some of whom were Martha Jefferson's half-brothers and half-sisters. The historical record is silent on what Martha thought about keeping her flesh and blood in bondage. The Hemings clan appears to have discussed it often (as well as the bond that developed between Thomas and Sally), keeping the story alive even as the Jeffersons suppressed and conspired to kill it.
The country has been fascinated for the better part of two centuries with the question of whether Jefferson followed his father-in-law's example and fathered children by Elizabeth's daughter, Sally. Leading historians who doubted this have done an about-face since genetic evidence linked Jefferson to one Hemings child. There is a growing consensus that Jefferson fathered most, if not all, of Sally's children, just as Madison Hemings claimed in a now-famous newspaper interview published in 1873.
The emergence of the genetic evidence has shown historians who believed otherwise that the black oral tradition is sometimes more reliable than the official "white" version of history. It has also cleared the way for us to focus on the wider Hemings clan and the hundreds of other slaves who passed through Jefferson's plantation. One central figure is Elizabeth's son John, the skilled woodworker who helped build Jefferson's paradise and took charge of the Monticello carpentry shop (called "the joinery") after 1809.
It stands to reason that John Hemings and Jefferson worked closely together, given that Jefferson's main passions were building and tearing down. Jefferson's notes on furniture and general construction contain several references to Hemings's pieces, including a writing desk that the former president compared to the work of one of his favorite poets, Virgil. Away from home and in need of a comfortable chair, Jefferson wrote to his family, "I long for a Siesta chair. . . . I must therefore pray you to send by Henry the one made by Johnny Hemmings."
John Hemings, with his woodworking genius, became a Jefferson interlocutor who gave life and depth to his master's sometimes vague ideas. He was also a designer who adapted furniture made in France and elsewhere in a style that has come to be known as Franco-Piedmont. Susan Stein, curator at Monticello, has dubbed a small cabinet almost certainly made by Hemings "the Rosetta stone" for what it illuminates about overall construction at the plantation.
The story of Elizabeth Hemings and her progeny is far from finished. Even so, it is shifting Monticello's mission and deepening what we know of the enslaved people who lived there. As Ms. Swann-Wright, the historian, said recently of this new research: "It isn't about Thomas Jefferson at all. It is about a community of African-Americans and how they lived their lives."
William Safire, writing in the NYT (July 14, 2003):
A 5,500-word diary in President Harry Truman's handwriting, unnoticed for decades, recently turned up at the Truman Library in Independence, Mo. Three pages were mysteriously loose and interleaved in the journal.
On these detached and reinserted pages was this entry: "6:00 P.M. Monday July 21, 1947. Had ten minutes conversation with Henry Morgenthau about Jewish ship in Palistine [sic]. Told him I would talk to Gen[eral] George Marshall about it."
On that day, news reached the world that 4,500 Jewish refugees seeking entry to Palestine aboard the ship Exodus 1947 had been seized by British soldiers. These "displaced persons" had been placed on three vessels ostensibly headed to nearby Cyprus for detention until permitted entry to the Holy Land, where other Jews waited to welcome them. Instead, the homeless families, including a thousand children, were encaged on decks being taken back to a hostile Europe.
"He'd no business, whatever to call me," Truman wrote. Morgenthau, who had served as F.D.R.'s treasury secretary, was telephoning Truman as chairman of the United Jewish Appeal, and had an obligation to get through to the president to stop this further atrocity.
"The Jews have no sense of proportion," wrote the incensed Truman after he hung up, "nor do they have any judgement on world affairs. Henry brought a thousand Jews to New York on a supposedly temporary basis and they stayed." These refugees were welcomed in Oswego, N.Y., just after the war, and Truman saw political implications in Gov. Thomas E. Dewey's support for Jewish immigration: "When the country went backward -- and Republican in the election of 1946, this incident loomed large on the D[isplaced] P[ersons] program."
Then the president vented his spleen on the ethnic group trying desperately to escape from Europe's hatred: "The Jews, I find are very, very selfish. They care not how many Estonians, Latvians, Finns, Poles, Yugoslavs or Greeks get murdered or mistreated as DP as long as the Jews get special treatment. Yet when they have power, physical, financial or political neither Hitler nor Stalin has anything on them for cruelty or mistreatment to the under dog."
After equating the cruelty of Jews with that of Hitler and Stalin, Truman waxed philosophic about ingratitude: "Put an underdog on top and it makes no difference whether his name is Russian, Jewish, Negro, Management, Labor, Mormon, Baptist he goes haywire. I've found very, very few who remember their past condition when prosperity comes."
Truman wrongly assumed that the plight of all of Europe's displaced was the same -- ignoring the "special treatment" Hitler had inflicted on the Jews of the Holocaust, resulting in six million murdered, genocide beyond all other groups' suffering. The homeless survivors now faced sullen populations of former neighbors who wanted no part of the Jews' return.
This diary outburst reflected a longstanding judgment about the ungrateful nature of the oppressed; in a letter to Eleanor Roosevelt, he repeated that "Jews are like all underdogs. When they get on top they are just as intolerant and as cruel as the people were to them when they were underneath."
Did this deep-seated belief affect Truman's policy about taking immigrants into the U.S., or in failing to urge the British to allow the Exodus refugees haven in Palestine? Maybe; when the National Archives release was front-paged last week in The Washington Post, historians and other liberals hastened to remind us that the long-buried embarrassing entry was written when such talk was "acceptable." The director of the U.S. Holocaust Memorial Museum dismissed it as "typical of a sort of cultural anti-Semitism that was common at that time."
For decades, I have refused to make such excuses to defend President Nixon for his slurs about Jews on his tapes. This is more dismaying.
Lest we forget, Harry Truman overruled Secretary of State George Marshall and beat the Russians to be first to recognize the state of Israel. The private words of Truman and Nixon are far outweighed by their pro-Israel public actions.
But underdogs of every generation must disprove Truman's cynical theory and have a duty to speak up. I asked Robert Morgenthau, the great Manhattan D.A., about Truman's angry diary entry, and he said, "I'm glad my father made that call."
Richard Johnson, writing in the NY Post (July 16, 2003):
MEL Gibson's pet project "The Passion" is doomed to box-office oblivion, insiders say - not because the movie is in Aramaic and Latin, but because of its "violent and graphic nature."
Gibson, who has held several small private screenings, is "insisting that the movie holds true to what happened the last 12 hours of Christ's life," a pal of his said, "including the horrific depiction of the crucifixion. It is worse than the graphic scenes in 'Braveheart.' "
The stubborn star directed "The Passion" but does not appear in it. He's ignoring conventional studio wisdom and taking advice only from a small group of clerics who have seen the film. "Mel is making notes and small changes on the advice of the bishops and rabbis who have seen it, in order that he can assure accuracy," we're told.
The rough cut Gibson has been showing to his advisors and friends, however, has subtitles, which Gibson plans to remove from the final print.
"Mel won't listen to anybody on this," the pal said. "We are hoping he keeps the subtitles in, or there really is no chance for the movie. No one will go see it, especially if they can't understand it. His friends are working on him but so far, nothing can get through to him."
Still, on other post-production issues, "Mel has been more open-minded than we thought he would be."
Gibson shelled out $25 million on the "vanity" project, which many feared would be anti-Semitic. Gibson is a "traditionalist" Catholic, a splinter group that rejects the reforms of Vatican II. Some of its adherents embrace a 16th Century papal decree that blamed Jews for the death of Jesus, although Gibson does not.
"In the movie [as in the New Testament], the Romans killed Jesus, not the Jews," Gibson's friend declares. "It is in no way anti-Semitic."
If fans want to judge for themselves, a small trailer from the flick has been posted on aintitcoolnews.com.
The early buzz on the movie has also been skewed by Gibson's father. Hutton Gibson , who has railed against the Vatican for more than 30 years, told a New York Times magazine writer several months ago that Vatican II was "a Masonic plot backed by the Jews."
He also called Pope John Paul II "Garrulous Karolus, the Koran-kisser," and questioned official accounts of the Holocaust.
Scott McLemee, writing in the Chronicle of Higher Education (July 18, 2003):
When Hans-Georg Gadamer lectured on philosophy -- as he continued to do until shortly before his death, last year, at the age of 102 -- he attracted crowds as large as a thousand people. They were drawn by his renown as the author of Truth and Method (1960), a dense and sizable volume that explored the very foundations of the humanities and social sciences. In it, Gadamer emphasized "hermeneutics," the art of interpretation. Once an obscure theological term, it caught on as part of the common stock of scholarly ideas.
Listeners who expected something severe and ponderous were often surprised to find that Gadamer was a relaxed speaker, improvising his talks rather than delivering them from on high. He seemed not so much to analyze Plato, Aristotle, Hegel, and Heidegger as to hold conversations with them -- as if he were able to hear the questions posed by their writings, and to ask them questions in turn. It was hermeneutics in action. Those who eavesdropped on one of Gadamer's dialogues with the illustrious dead often refer to it as the greatest pedagogical experience of their lives.
"He spoke freely," recalls Jean Grondin, a professor of philosophy at the University of Montreal, "and an audience always finds that engaging. It was a habit Gadamer developed in the 1930s, when he had to teach the entire spectrum of philosophy at the University of Leipzig."
An English translation of Mr. Grondin's Hans-Georg Gadamer: A Biography is just out from Yale University Press. When the German edition appeared, in 1999, it became part of a bitter controversy -- one that shows every sign of continuing on this side of the Atlantic. For there are some awkward questions about just what Gadamer was doing in Germany during the 1930s, besides developing a memorable classroom presence. And those questions, in turn, raise troubling concerns about his influential work in philosophy.
Scholars began looking into Gadamer's activity under the Third Reich during the late 1980s, following a prolonged debate over the Nazi Party membership of his mentor, Martin Heidegger. Mr. Grondin's professed intent is to clear Gadamer of any charge of totalitarian sympathies. "Gadamer was part of an older generation of university professors that was apolitical," he says. "To delve into politics is to go down into a realm that is a bit messy." For a "mandarin" scholar such as Gadamer -- who once prided himself on never reading a book that was less than 2,000 years old -- Heidegger's enthusiasm for Hitler "was not criminal," says Mr. Grondin. "It was just embarrassing."
Yet in tracing Gadamer's career throughout the 1930s and '40s, Mr. Grondin also documents just how adept the philosopher was at turning the darkest period of the 20th century to his own professional advantage. While Jewish friends and colleagues were being "furloughed," as the official euphemism had it, Gadamer moved into positions they left vacant. He never joined the Nazi Party. But he did enroll in a Nazi indoctrination camp in 1935, once it became clear that doing so would open certain academic doors.
Sharon Ann Holt, writing in Common-place.org (July 2003):
Between January 2002 and April 2003, the interpretation and future of the Liberty Bell and of the Philadelphia property occupied in turn by presidents George Washington and John Adams changed fundamentally and permanently. The official story of the Liberty Bell was newly imbued with social and political context. As a result, the site of Philadelphia's presidential mansion, which was rented for the purpose from financier Robert Morris, may become a fully interpreted landmark for the first time in its history. Visitors on their way to the new Liberty Bell Center would move across the footprint of the President's House, buried since the 1950s under the mall's public ladies' room. What is more, the story told within the house would include not only Washington, Adams, and the development of the presidency, but also and especially the stories of eight enslaved Africans who lived there in bondage to Washington, including two who escaped, one with the help of Philadelphia's free black community. The real breakthrough is that these would not be told as separate stories but as one. On the very doorstep of the Liberty Bell, and within the Liberty Bell Center itself, visitors would see and experience the troubling interdependence of slavery and freedom in the lives of the founding generation, black and white, and in the nation that emerged from their work. This is all possible, but there is no guarantee at this moment that it will happen. People who have been deeply involved differ profoundly on how the changes came about, on whether the new plan can be considered a success, and on what the whole struggle will mean for this site and others in the future.
How did Independence National Historical Park become the location for what may become the most powerful commemoration of the impact, achievements, and aspirations of people held in slavery ever built in the United States? And what remains to be done to ensure that the commemoration happens, that the content is accurate, and that this site does not become a solitary aberration on the margin of the "traditional" American story? These questions confront everyone involved in the process, and the answers depend upon what view one takes of what has happened so far.
From one perspective, an extraordinary David and Goliath drama unfolded in 2002, as a spontaneously organized group of historians and citizens, self-appointed to safeguard the integrity of this major historic site, took on the Park Service and changed its course. From another standpoint, the story concerns a team of hard-working, well-meaning public servants making practical decisions that made sense, only to be sideswiped by an eruption of public passion. From a third point of view, what happened is that African Americans suffered yet another enormous official betrayal, and can prevent a worse one only by organizing themselves and mobilizing both media and congressional attention. The affair makes fascinating public history because all these versions are true already, and the process is still unfolding. More important, had any element of the story been missing, the affair would have ended quickly in frustration and disappointment instead of enduring to produce a promising draft design and the possibility of a new relationship between the park and the city, as well as between the nation and its history.
Eric Scigliano, writing in the Seattle Weekly (July 11, 2003):
THIS BICENTENNIAL is just approaching the starting gate, and already it's in overdrive. Exactly 200 years ago this month, Capt. Meriwether Lewis set out on his epochal expedition across a mysterious continent and made it all the way to Pittsburgh. There he gathered supplies and tried to get a boat built to float down the Ohio River and prepare for the real journey. The boatbuilder balked and fumbled -- a less than glorious beginning. Never mind. Two hundred years later, the bicentennial hype is chugging along under full steam.
No paltry National Sacagawea Week or Lewis and Clark Month here; President Bush has proclaimed a four-year celebration of the Voyage of Discovery, through 2006. Nearly every state, town, and visitors center in the journey's path is scrambling to clamber aboard and, in the words of USA Today, "grab a piece of the huge tourism pie associated with the 200th anniversary." In October, Louisville, Ky., will stage a re-enactment of Lewis and Clark's rendezvous. Next May, Wood River, Ill., will re-enact the Corps of Discovery's departure after wintering. St. Charles, Mo., will re-enact the start up the Mighty Mo. Leavenworth, Kan., will re-enact its stop-off to celebrate July 4, 1804, with a blast of the swivel gun and an extra whiskey ration. (Don't expect free drinks in what's now one of the driest states in the union.) And so on, with pageants, festivals, symposia, fireworks, and flyovers at 12 more stations of the crossing designated to host "national signature events," all the way to Fort Clatsop, Ore., and back. Maya Lin, designer of the Vietnam War Memorial and the closest thing this country has to an official artist, will create four sculptures at key Snake and Columbia river confluences for the bicentennial Confluence Project. Where the festivities go, newspapers and networks will surely follow; The Seattle Times, which maintains a Web page for breaking Lewis and Clark news, got the jump with nearly seven broadsheet pages in one May week.
Even places with more tenuous connections to the expedition will milk it for all it's worth. Harper's Ferry, W.Va., where Lewis stopped to shop, will hold a party and open a permanent Lewis and Clark exhibit. As for this Lewis and Clark heartland, never mind that the heroes passed their most wretched months in the Northwest, grumbling about the rain and rot and "thievishly inclined" natives (who were already expert at dealing with thievishly inclined white traders). Never mind that they fled the Washington side of the Columbia River for Oregon, where the game was better. We can expect a continuing rich diet of Lewis lore and Clark kitsch.
At untold points between, more Lewis and Clark centers will open, and more reincarnated explorers -- many more than composed the original Corps -- will turn out in three-corner hats, buckskin tunics, and elk-hide moccasins to sample the joys and a few of the ardors of roughing it 1805-style. Even before the bicentennial boom, the Corps of Discovery was second only to the Civil War as a refuge for those generational cross-dressers known as historical re-enactors.
THE PILGRIMS and Pocahontas are passé, and Columbus has been tarnished. But the Lewis and Clark Expedition endures as our favorite national creation myth -- more cherished even than the Revolution and constitutional birth pangs that established the republic, perhaps even more sacrosanct than the Civil War. If the Civil War is our Iliad -- a sprawling tragedy of war and purgation -- then the Voyage of Discovery is our Odyssey: a more intimate epic of discovery, survival, and redemption, and everything else Hollywood loves.
It's easier to like The Odyssey than The Iliad. Like The Iliad and the Civil War, the Revolution and Constitutional Convention are messy affairs, crowded with murky characters, mixed motives, and political crosscurrents. Though they happened just a few years before the expedition, they seem much more distant. Their heroes -- Washington, Franklin, Madison, Lincoln -- are marble eminences, lofty and unapproachable.
But Meriwether Lewis and William Clark are perfect epic heroes -- ordinary guys summoned to an outsized mission, the first Boy Scouts, our own Frodo Baggins and Sam Gamgee. Theirs seems a pure and simple quest, moral clarity incarnate. They appeal to both the anarchic and patriotic elements of American vanity: They light out for the country like Huck Finn and, at the same time, bear the nation's destiny in their steady hands. Their heroism is tempered and highlighted by their sympathetic weaknesses: Lewis' fierce mood swings and Clark's notoriusly haphazzird spelinge. As George W. Bush well knows, we like our heroes to trip over the language now and then. Brings 'em down to our level.
FOR BUSH, PRESIDING over this patriotic joyfest is one more in a run of lucky breaks. Lewis and Clark suit his administration's agenda better than any spectacle Karl Rove could concoct. When the Corps of Discovery reached the Pacific, staking a claim to the "Ouragon country," the United States became a continental power and embarked on the path to global power. The first steps toward Texas and California, Cuba and the Philippines, Kabul and Baghdad, were paced along the Missouri Valley in the spring of 1804. Thomas Jefferson's proto-imperialist venture, so contrary to his anti-imperialist principles, glows brighter than ever in an era of unabashed neo-imperialism. His continental ambitions, and the stratagems and rationalizations employed to achieve them, offer a legitimizing precedent to the global-supremacist ambitions of Cheney, Rumsfeld, Rice, and Wolfowitz.
Not that exploiting the explorers for latter-day ends is anything new. Lewis and Clark have gone in and out and back into fashion; each era exalts, forgets, revives, or remakes them according to its needs. They returned in 1806 as national heroes, but Lewis died three years later, mired in political and financial woes, and Clark became better known for his subsequent work as superintendent of Indian affairs in the West. A tardy distillation of their voluminous journals, ghostwritten by Nicholas Biddle, appeared in 1814 and flopped. The journals disappeared from view for eight decades. Then, in 1893, Biddle's version was reprinted. The full journals were finally published in 1904. Blockbuster expositions in St. Louis and Portland marked the centennials of the Louisiana Purchase and the expedition. The twin captains were heroes again; their expansive mission suited both the populist and the imperialist currents of Teddy Roosevelt's America.
L. Gordon Crovitz, a Rhodes scholar and senior vice president of Dow Jones, writing in the Wall Street Journal (July 11, 2003):
As the U.S., Britain and others in what Rhodes would have called the "English-speaking union" consider the meaning of empire these days--and as President Bush tours an Africa littered with states that have failed since independence--Rhodes offers some revisionist guideposts. Not all my fellow scholars will agree, but I think Rhodes deserves to be seen as the great champion of a well-intentioned and often effective imperial ambition. His vision of what it took "to lead the world's fight"--as his scholarship's statement of purpose puts it--could not be more timely....
Rethinking Rhodes is part of a broader reconsideration. Historian Niall Ferguson's book "Empire" reminds us that by the Victorian era, imperialism was about more than trade routes and spices. Imperialists "dreamed not just of ruling the world, but of redeeming it." In the quarter of the world it ruled, Britain established common law, property rights, representative assemblies and what Mr. Ferguson calls simply "the idea of liberty." It measured nation building in decades, not months.
Today there are no doubt some people in Zimbabwe, devastated by economic ruin and lorded over by Robert Mugabe, who might prefer a return to "Rhodesia" and the chance to start again. In West Africa, the British Empire has been quietly reborn in Sierra Leone, which London has saved from a civil war. That war itself was instigated by warlords from neighboring Liberia, whose citizens now plead for America to come and restore order. Even the French have been shamed into accepting resumed colonial authority for the Ivory Coast.
Hostility to Rhodes-style imperialism runs deep in U.S. history, back to our founding. So it is important to recall that the U.S. became what it is by building its own North American empire from the East Coast to the West, adding occasional colonies in Asia, the Caribbean and elsewhere. The hostility to empire endured, however. During World War II, as FDR and Churchill met to plan what they hoped would be a victory, FDR demanded that Britain dismantle its empire, even giving Hong Kong to China. Churchill refused and stormed out of the meeting, saving millions in Hong Kong from communism, at least for a time.
Today, "empire" is no longer a conversation stopper. But what about Rhodes and race? Rhodes's great error was abandoning the prevailing British view in the Cape Colony, which favored equal protection of the laws. He was intent on doing a favor for his allies among the Boers, who sought racial laws as a way to protect white workers against blacks. Many years after Rhodes's death, this mutated into apartheid.
Even so, the will creating the Rhodes scholarships--now offered in some 30 countries--includes a clause remarkable for its era. It says that no one will be "qualified or disqualified on account of his race or religious opinions." Some biographers of Rhodes have tried to argue that this tolerance extended only as far as the Dutch, and some Americans, but his contemporaries thought otherwise.
Ian Bell, reviewing a new BBC2 documentary, "Rebels and Redcoats," by historian Richard Holmes; in the Herald (Glasgow) (July 9, 2003):
Richard Holmes believes he knows. Irritated as only a historian can be by Mel Gibson's The Patriot, a movie both dewy-eyed and psychotically violent, Holmes has set out to dispel America's cherished myths. In place of the homespun settler with a musket in one hand and The Rights of Man in another, he finds vested interests on the make. In place of the Jeffersonian ideal he finds faction-fights, communities turned against themselves, and venal men provoking conflict for their own advantage. A civil war, in short.
As theses go, it isn't a bad one. America's revolt, like France's revolution, was never an egalitarian jamboree. Both the Jacobins and the leading colonists were men of property whose assaults on monarchism had more to do with privilege than the woes of the little people. Equally, Holmes brings to the independence war the unsentimental eyes of a military historian who cares nothing for political romance. For him, the point about the war against "the mother country" was that it was entirely deliberate, even preconceived.
As with most revisionist history, the desire to be contrary is a little overdone. But having just sat through a week of the History Channel's reverential Founding Fathers, it is good to see a curmudgeon from this side of the pond refusing to exchange truth for another nation's fables.
Mark Steel, writing in the London Independent (July 10, 2003):
It seems to me that we're not supposed to like the French Revolution very much. My introduction to the subject was on an unemployed afternoon in the late 1970s, slouched in front of Blue Peter. I think it was Peter Purves who introduced an item on Marie Antoinette. She loved beautiful clothes, he said, and was admired for her exquisite taste in jewellery. As a result, she was loved by the people of France.
Then the mood changed, and we were told how "outside agitators" spread untrue stories about the queen's greedy habits. And we were shown a silhouette of a cloaked man on a horse throwing leaflets in a cobbled street; this, apparently, led to the revolution, depicted as a shadowy crowd with pikes, while five or six actors shouted, "Down with ze Queen!". I can't recall what followed, though presumably someone showed you how to make your own guillotine using a shoebox, an elastic band and a Stanley knife.
But Blue Peter was only following the generally accepted version of the event, that the French Revolution was a dreadful episode with no redeeming features. To most people in Britain, suggesting that there was anything positive about it must seem as peculiar as saying that there was a good side to the plague. An honest appraisal of the revolution hasn't been helped by the majority of accounts in novels, documentaries and films, from Charles Dickens's A Tale of Two Cities to the two most famous British films covering the period, The Scarlet Pimpernel and Carry on Don't Lose Your Head (of which the Carry On film is by far the more realistic). But many serious historians are scarcely less one-sided, explaining the affair by saying, in effect, that everyone just went mental.
Nathan Guttman, writing in Ha'aretz (July 10, 2003):
New documents released this week by America's National Security Agency support Israel's version of a long-festering controversy between the two countries: Israel's sinking of an American spy ship, the USS Liberty, off the coast of Gaza during the 1967 Six-Day War.
Israel has always said it had no idea the ship was American, but conspiracy theorists and anti-Israel propagandists still claim Israel sank the ship in the full knowledge that it was American.
The documents, originally defined as top secret, were made public by Florida Judge Jay Cristol, who has been investigating the Liberty incident for years and published a book on the subject last year. On Monday, the NSA gave him a transcript of conversations held by two Israeli Air Force helicopter pilots who were hovering over the Liberty as it was sinking, and these tapes confirm Israel's claim that the sinking of the ship, which killed 34 American servicemen and wounded 171, was a tragic error.
After the Liberty was bombed by both the Israel Air Force and the Israel Navy, the two helicopter pilots were summoned from their base to assess the damage and evaluate the possibility of rescuing the surviving crew members. An American spy plane, which had been sent to the area as soon as the NSA learned of the attack, recorded their conversations, which took place between 2:30 and 3:37 P.M. on June 8, the third day of the war.
The spy plane also recorded the orders radioed to the pilots by their supervisor at Hatzor Base, which instructed them to search for Egyptian survivors from the "Egyptian warship" that had just been bombed - thus supporting Israel's claim that it had believed the ship was Egyptian when it ordered it attacked. "Pay attention. The ship is now identified as Egyptian," the pilots were told.
Nine minutes later, Hatzor informed the pilots that it was not an Egyptian warship, but an Egyptian cargo ship. Only at 3:07 were the pilots first informed that the ship might not have been Egyptian at all: Hatzor told them that if they found Arabic-speaking survivors, they should be taken to El-Arish, but if they found English-speaking survivors, they should be taken to Lod. "Clarify by the first man that you bring up, what nationality he is, and report to me immediately," the supervisor instructed, according to the transcript. "It's important to know."
Then, at 3:12, one of the pilots informed Hatzor that he saw an American flag flying over the damaged ship. He was asked to investigate and determine whether it was really an American ship.
This is not the first time such transcripts have been made public: Israel gave its own recordings of the pilots' conversations to the British television station Thames in 1987. But conspiracy theorists charged that Israel had doctored the tapes before handing them over to the station in order to hide the fact that it sank the Liberty intentionally. No such imputation can be made about these new transcripts, as they were never in Israeli hands.
Israel has always said it attacked the Liberty, which America sent to the region to gather intelligence on the progress of the war, because it believed it was an Egyptian supply ship ferrying supplies to the Egyptian troops that Israel was then fighting. When it discovered the error, it immediately informed the Americans, apologized and paid compensation to the victims' families.
The incident was investigated by inquiry commissions in both Israel and the United States, and both concluded that it had, indeed, been a tragic error. Nevertheless, the controversy never died. In 1979, one of the survivors, James Ennes, published a book accusing Israel of bombing the American ship deliberately. Ennes claimed an Israeli spy plane had hovered over the ship all morning and had surely identified it as American, since the American flag was clearly visible.
A later book, written by James Bamford, charged that Israel sank the ship in order to keep America from learning of its plans to attack Syria, and further claimed that the NSA had tapes of conversations among Israeli pilots that not only confirmed this, but also proved that the tapes released by Israel had been doctored.
Another claim that appears frequently on the dozens of Internet sites devoted to the affair is that Israel sank the ship to conceal a mass murder of Egyptian soldiers on the Sinai peninsula.
In its letter to Cristol, the NSA stressed that, contrary to the claims that often appear in such books and Web sites - that the agency has tapes from both the Liberty and from a nearby American submarine that confirm Israel's guilt - the only tapes that exist were those made by the spy plane and given to Cristol this week.
"It's the last piece of intelligence that remained classified, and every rational person that will read it will understand that there is no truth in these conspiracy theories against Israel," Cristol said Tuesday. But he added: "Those who hate Israel, who hate Jews, and those who believe in conspiracy will not be convinced by anything."
Cristol, a former U.S. navy pilot and legal officer, began investigating the Liberty incident 14 years ago. Since publishing his book, which vindicates Israel, he has received threats and been accused of being an Israeli agent. "I take this lightly, but I am saddened to learn that there is this kind of hate toward Israel," he said.
Carlin Romano, writing in the Chronicle of Higher Education (subscribers only) (July 10, 2003):
Do philosophers respect the history of their field? In the heyday of 20th-century analytic philosophy -- rigidly designated by its true believers as the ahistorical probe of piecemeal issues in logic and language -- you didn't have to look far for the answer. Or, to put it another way, it was everywhere you looked.
The canon of early modern philosophy, for one thing, consisted of a handful of white, male figures organized like Motown singing groups (Locke, Berkeley, and Hume formed the Empiricists; Descartes, Spinoza, and Leibniz toured as the Rationalists). That structure flourished even though the great historian of philosophy Paul Oskar Kristeller decried the leaps of faith in the standard philosophy survey course, such as the giant step taken from Aquinas to Descartes.
At Harvard, regarded by the analytic establishment as a premier department despite its weaknesses in history, W.V.O. Quine, analytic epistemology's towering figure, declared philosophy and history of philosophy to be separate fields. A Harvard professor teaching early modern philosophy could unabashedly ask his charges, as one did, "Descartes -- was he before Newton or after Newton?"
At Princeton, similar to Harvard in its ahistorical orientation, a philosophy professor famously posted a sign on his door, "Just Say No to the History of Philosophy!" Folks there frequently referred to major figures from the past as "Locke starred" or "Hume starred" to signal that the version of the philosopher cited wasn't historically accurate. "Locke starred" could be stipulated (for argumentative convenience) to hold a particular theory about color or consent, even though the real Locke didn't. It was a kind of "Do asterisk, don't tell" policy.
Over the past 20 years, however, a new generation of philosophers -- including a surprising number trained at those two institutions -- have tried, in the words of Princeton's own current expert in early modern philosophy, Daniel Garber, "to find a more historical way of doing the history of philosophy." Now the question is whether the news will ever trickle down to deans and undergraduate courses, still locked into models imposed by the analytic epistemologists.
Louis P. Masur, writing in the Chronicle of Higher Education (subscribers only) (July 10, 2003):
In 1991, the historian Simon Schama published a book that stirred great controversy among scholars. Dead Certainties (Unwarranted Speculations) consisted of two parts. The first, "The Many Deaths of General Wolfe," examined various accounts of the death of British Gen. James Wolfe at the Battle of Quebec in 1759. The second, "Death of a Harvard Man," probed the 1850 trial of John White Webster, a Harvard chemistry professor, for the murder of George Parkman, a physician turned affluent landlord and moneylender. Now a new prime-time program, Murder at Harvard (airing on July 14 as part of PBS's American Experience series) explores the Webster case and Schama's telling of the story.
What angered many historians about Dead Certainties was that its author opened with a brief fictional account of a soldier at the Battle of Quebec and then, throughout, freely offered speculations about what may have occurred and why in each story. Looking to provoke, he referred to the two parts of his book as "historical novellas." It is understandable that the line between history and fiction would fascinate scholars, but what does the film make of it?
As I once noted in a review, except for the explicit invention at the beginning, Dead Certainties firmly belonged to the genre of history and not fiction. Writing about the Webster case, for example, Schama both told a riveting story about the city of Boston's transformation in the first half of the 19th century and reflected on the historian's craft. If, as he put it, the "inventive faculty -- selecting, pruning, editing, commenting, interpreting, delivering judgments -- is in full play," then it was so much the better.
In Murder at Harvard, Schama steps before the camera to examine once again Parkman's death and raise issues about knowing the past. No doubt the filmmakers chose to focus on Parkman, and to exclude Wolfe, because trials both make for good television and speak with special urgency to historians: What happened? Whose story to believe? The director, Eric Stange, an independent documentary filmmaker and writer, and Melissa Banta, whose previous work focuses on history and images, have both said that they wanted to explore the craft of the historian.
Schama, who co-wrote the script and serves as on-screen presenter, has had previous success as the host of the 16-part BBC series A History of Britain, which was adapted from his best-selling book of the same title. But what happens in this second go at telling the story of Parkman's murder suggests some of the pitfalls of transforming a written account into a different genre, as well as some opportunities.
"What really happened is still debated today," Schama declares, somewhat disingenuously, early in the program, and an attempt is made in Act I to set the case up as a mystery. Was it the Harvard professor, owing Parkman money, who murdered him? Or the janitor Ephraim Littlefield, who discovered the body by digging into the privy below Webster's locked laboratory? For all his posturing of uncertainty, however, Schama the historian knows that Webster committed the crime. By Act II, Littlefield has become less a suspect than a witness and a vehicle to uncover increasing class tensions in Brahmin Boston, discuss the decline of the Puritan City on a Hill, and bring the past to life.
Webster, Schama tells the audience, "was trying desperately to cling to the gentility into which he was born," while Littlefield was driven by "bitterness toward Webster and toward his own lot." Schama is known as a dynamic lecturer, but he is featured too much here. In his book, there is a revealing moment of insight into class pretensions when the janitor takes the stand, states his name, and volunteers, "I have no middle name." Inexplicably, the film does not make use of that confession. It takes confidence to let scenes and moments speak for themselves, but all too often Schama and others tell viewers what to think. The trial transcript, newspaper accounts, letters, diaries, even a confession from Webster all provide more than enough words for the historian to use. What Schama did brilliantly in Dead Certainties was give meaning to those words and tell the story with literary pizazz. That is sometimes lost in the film.
In the book, moreover, Schama dispatched his self-conscious musings about what he was doing to a brief afterword, but in the film those reflections take center stage. "I knew I was crossing a line historians don't usually cross, the line that separates history from fiction," Schama tells us in the film. "I felt free to let my imagination work to get me closer to the truth."
To unpack those thoughts, the filmmakers enlist a group of distinguished historians, including Natalie Zemon Davis, James E. Goodman, Karen Halttunen, and Pauline Maier, who offer keen insights into the problem of how we know the past. Again, however, talking heads, no matter how articulate, cannot get the casual viewer interested in scholarly debates.
Leslie Casimir, writing in the New York Daily News (July 4, 2003):
The two tiny rooms - tucked away in the corner balconies at St. Augustine's Episcopal Church - resonate with an ugly part of New York history that is mostly ignored.
They are known as the slave galleries, where African-American worshipers were permitted to stand while their white counterparts sat comfortably down below.
After New York State lawmakers abolished slavery on July 4, 1827 - 176 years ago today - this milestone was not celebrated, and the segregated section at St. Augustine's remained in use until 1930.
"They couldn't celebrate on July 4 for fear of white backlash," explained the Rev. Deacon Edgar Hopper, 74, of St. Augustine's, the 175-year-old lower East Side church at 290 Henry St. "So people held marches and gave speeches the next day, so as not to interfere with the national holiday."
July 5 became the unofficial day associated with independence for black New Yorkers, but the tradition of parades and special church programs did not take hold. In New York City, widespread discrimination remained, and blacks in the South were still enslaved.
People figured there was not much to celebrate, said Christopher Moore, a historian at the Schomburg Center for Research in Black Culture.
"Slavery was a painful memory, and people didn't want to remind themselves or associate themselves with slaves," said Moore, who also is a co-author of "The Black New Yorkers: 400 years of African-American History."
"We tended to have a tradition where we didn't want to talk about it," Moore said. Until now.
A growing group of local black historians and clergy is hoping to revive this obscure holiday - often referred to as Manumission Day.
Tomorrow, St. Augustine's will host a ceremony of songs and drums to honor the tremendous contributions slaves made to building the infrastructure of New York City. And on Staten Island, the Sandy Ground Historical Society, 158 Woodrow Road, will pay homage to its ancestors with an exhibit that will depict how black New Yorkers used to celebrate Manumission Day.
"We think that it is important to bring back that tradition," said Rodger Taylor, 50, an archivist at the Hamilton Fish Park Branch of the New York Public Library on the lower East Side. "It was a holiday that was celebrated for a half a century, and then it just slipped away."
Nicholas Pyke, writing in the Guardian (July 5, 2003):
More than three decades since the last pink-tinged maps of the colonies were hauled down from classroom walls across Britain, the empire looks like striking back.
Leading historians addressing the Prince of Wales summer school for English and history specialists this week argued that Britain's imperial past has been ignored for too long, and should be reinstated at the core of the secondary school curriculum.
Professor Niall Ferguson, who recently presented Empire - How Britain Made the Modern World on Channel 4, described the subject as "the big story of British history in the modern period". Teaching British history without it, he said, is like "Hamlet without the prince". The call for the empire to make a comeback in history lessons was a main theme of the prince's gathering, which saw writers and historians brought together in Dunston Hall hotel in Norwich, a venue normally associated with golf rather than intellectual endeavour.
The week saw poet Seamus Heaney rubbing shoulders with historian David Starkey, playwright Tom Stoppard lining up alongside detective writer PD James, and historian Simon Schama flying in from New York. Prince Charles himself briefly attended, arriving by helicopter to criticise the "fashionable ideas of experts and educationists" who, he claimed, have left many young people culturally disinherited.
The guests, nearly 100 teachers from schools in eastern England, heard Professor Schama speak on the importance of "visual literacy", citing Picasso's Guernica. Dr Starkey inveighed against leftwing views of historical inevitability, Mr Heaney made some well-publicised remarks about US rapper Eminem, while Antony Beevor, author of the bestseller Stalingrad, warned against Hollywood's glamorisation of death and war.
But the subject which gripped the conference was how, in the view of several delegates, the story of the British empire has been airbrushed out of history.
Michael Wood, historian and TV presenter, said the empire was a cornerstone of Britain's "national narrative", a view endorsed by Scott Harrison, history adviser to Ofsted, who told the private audience that the empire deserved a much greater share of classroom time.
A previous Prince of Wales summer school, held at Dartington Hall in Devon, heard complaints that secondary school exam courses were dominated by Hitler, Stalin and Henry VIII, an argument which helped prompt the education secretary, Charles Clarke, into calling for a review of secondary school history last month.
St James's palace has denied reports that Prince Charles is mounting a personal campaign for the return of imperial or Commonwealth studies. But academic interest in the imperial past has been steadily growing, with or without his support. Books on the empire by Linda Colley and David Cannadine are judged to be at the cutting edge of historical debate. Prof Ferguson's television series attracted audiences averaging more than 2 million.
Maureen Dowd, writing in the NYT (July 6, 2003):
America has A.A.D.D. [Adult Attention Deficit Disorder] The country has always had a pinball attention span, even before the Internet and cable TV accelerated it.
The New Republic recently dubbed this "historical attention deficit disorder," when a country gets distracted from focusing on any one place for very long. Our scattered consciousness is the reason we're so bad at empire, too impatient to hang around hot climes trying to force cold natives to like us.
Telegram from C.I.A. headquarters to the leaders of the agency-backed coup that toppled Guatemalan president Jacobo Arbenz Guzman (June 30, 1954):
Heartiest congratulations upon outcome developments past forty-eight hours. A great victory has been won.
NYT commentary on the outcome of the coup (July 6, 2003):
The coup brought Col. Carlos Castillo Armas to power and set off more than three decades of civil conflict and repression in which hundreds of thousands of Guatemalans were killed.
Bill Kauffman, writing in the Wall Street Journal (July 8, 2003):
Louisiana was the largest re-gift in North American history. France ceded it to Spain in 1762; in 1801 the Spaniards gave it back. Remote colonies drain the treasury, and besides, the Europeans could read the handwriting on the Mississippi River. American settlement of the "wilderness so immense" was inevitable. As the Spanish governor of Louisiana said in 1794: "A new and vigorous people, hostile to all subjection, [are] advancing and multiplying with a prodigious rapidity." Not to mention bellicosity. Swallowing hard, the Francophile Thomas Jefferson warned: "The day that France takes possession of New Orleans . . . we must marry ourselves to the British fleet and nation."
But the French never came. Mosquitoes and machetes were decimating the French army in St. Domingue, the site of a slave revolt. Desperate for francs to fuel his militarism, Napoleon negotiated the sale of Louisiana with U.S. diplomats James Monroe and Robert Robert Livingston. (The repetition is no typo, just a typical Hudson Valley conceit.)
Livingston would call the agreement "the noblest work of our whole lives," though the plural was a courtesy: He and Monroe engaged in what Thomas Fleming calls, in "The Louisiana Purchase" (Wiley, 186 pages, $19.95), an "ugly quarrel about who deserved credit for buying Louisiana," with Robert Robert even backdating a key document.
President Jefferson admitted that the Purchase was "beyond the Constitution." He fiddled with an authorizing amendment before concluding that "the less that is said about any constitutional difficulty, the better." This was not his finest hour but rather his imperial moment. As Jon Kukla writes in "A Wilderness So Immense" (Knopf, 430 pages, $30): "Only five years earlier, Jefferson's party had championed states' rights and strict construction in the Virginia and Kentucky Resolutions of 1798. Now their words could have been scripted by the Hamiltonian Federalists."
Almost all the opposition to the Purchase came from New England, and what is most interesting in Mr. Kukla's and Mr. Fleming's books are the voices of dissent. The Yankees had read their Montesquieu, who wisely wrote: "It is natural for a republic to have only a small territory, otherwise it cannot long subsist." The country was already too large, perhaps, and further expansion would swell it past the point of viability.
To the splenetic Federalist Fisher Ames, the U.S. was "rushing like a comet into infinite space." Even Jefferson's allies wondered if the enlargement of the territorial U.S. might lead inevitably to a larger and less responsive central government. Was Jefferson signing the death warrant for Jeffersonianism? Or, as an editorialist asked in September 1803: "Will republicans, who glory in their sacred regard to the rights of human nature, purchase an immense wilderness for the purpose of cultivating it with the labor of slaves?"
Alas, yes, answers Roger G. Kennedy in "Mr. Jefferson's Lost Cause" (Oxford, 350 pages, $30). That lost cause -- a South of free and independent yeomen -- was sold out by Presidents Jefferson, Madison and Monroe, who were simply "planters serving other planters," according to Mr. Kennedy.
Mr. Kennedy frankly despises the planter class, gentlemen whose country manors and courtly manners rested on man-owning and on the short-sighted exploitation of the land through such cash crops as tobacco and cotton. "Yeomen were kinder to the land than planters," he writes, because they themselves, and not uprooted slaves, worked their little patches of earth. Unlike vagabond planters, yeomen also exhibited the virtue of "sedentism," or staying in one place.
Mr. Kennedy is fashionably hard on Jefferson, arguing that, by acquiring Louisiana and refusing to insist on the prohibition of slavery in the new territory, Jefferson doomed his South. Those "who had worn out the productivity of their soil for their chosen staple crops were provided new land to wear out and new markets for the sale of their surplus slaves."
Mr. Kennedy's Jefferson is an expropriator of Indian lands, disrespecter of black intellects and unconscious dupe of British textile interests. He is a hypocritical lover of liberty and owner of slaves, the architect not of an Empire of Liberty but of an "empire of servitude." (The anti-Jefferson current of our age has gone too far: The "Wall of Shame" in my daughter's rural New York public school featured Lee Harvey Oswald, Charles Manson and Thomas Jefferson.)
Rogers M. Smith, professor of political science at the University of Pennsylvania, writing in the Chronicle of Higher Education (July 8, 2003):
President George W. Bush's Inaugural Address, delivered January 20, 2001, was a perfect example of a "story of peoplehood," an account offered by a leader to define and inspire allegiance to the political community he seeks to lead.
"We have a place, all of us, in a long story -- a story we continue, but whose end we will not see. It is the story of a new world that became a friend and liberator of the old, a story of a slaveholding society that became a servant of freedom, the story of a power that went into the world to protect but not possess, to defend but not to conquer. It is the American story."
Bush went on to elaborate that story in ways that included the three types of narratives that I believe are in all such messages. He offered an economic story, suggesting that his policies would make America a society that would materially "reward the effort and enterprise of working Americans." He also delineated what I term a political power story, promising that government would fulfill its "great responsibilities" for public safety by strengthening "our defenses beyond challenge," and confronting "weapons of mass destruction." As a result, all Americans would have both personal protection and a share in great collective power.
But Bush emphasized most what I call an ethically constitutive story: an account explaining why membership in a political community is intrinsic to who its members truly are. There are many kinds of ethically constitutive stories -- cultural, historical, geographic, linguistic, ethnic, racial, and more -- and they can be dangerous, imbuing chauvinistic nationalisms with moralistic fervor. Bush was careful to stress that, in his view, "America has never been united by blood or birth or soil. We are bound by ideals that move us beyond our backgrounds, lift us above our interests, and teach us what it means to be citizens." And he maintained that these "democratic" ideals were "more than the creed of our country." They represented "the inborn hope of our humanity, an ideal we carry but do not own." This is what scholars term a "civic" view of political community, in which membership rests on voluntary agreement as to political procedures and principles, not on unchosen "ethnic" components of national identity that can serve invidious ends.
Ultimately, however, President Bush made clear that his American story is not really so "civic" after all. It is a religious story. He recalled how during the American Revolution a friend wrote to Thomas Jefferson, "Do you not think an angel rides in the whirlwind and directs this storm?" And Bush concluded his version of the American story by asserting that, despite our democratic creed, "We are not this story's author, who fills time and eternity with his purpose. Yet his purpose is achieved in our duty, and our duty is fulfilled in service to one another. ... This work continues. This story goes on. And an angel still rides in the whirlwind and directs this storm."
According to President Bush, then, Americans are really Americans as part of a providential plan, and their purposes are defined by that plan. Many scholars have argued that America is a purely "civic" nation. I disagree, and think that American leaders have often defined America's meaning in more-comprehensive ideological terms, like religious traditions. So I confess that, as a researcher, I was delighted to find in Bush's speech fresh evidence that "civic" themes still make up only one set of threads in the American political fabric, along with ethically constitutive religious stories and much else. Many contemporary political philosophers, led by the late John Rawls, are uncomfortable when government leaders seek to advance religious visions of shared political membership, instead of stressing agreement on rationally grounded principles of justice. I find it healthier and more democratic, however, when a leader like Bush makes his deep religious commitments clear.
Still, I strongly disagree with the president's religious story of American peoplehood, all the more so since the angel in the whirlwind now seems to have directed a basically unilateral extension of Operation Desert Storm. Yet, I believe that such ethically constitutive accounts are philosophically and politically necessary dimensions of political life, so that it is pointless to lament their existence. Those who disagree with stories like Bush's must contest those narratives with, among other things, rival ethically constitutive stories of their own.
Robin McKie, writing in the Guardian (July 6, 2003):
Stonehenge has dominated the Wiltshire landscape for more than 4,000 years and is one of the world's most important heritage sites, but its purpose has remained a mystery.
Some researchers have claimed the stone circles were used as a giant computer; others that Stonehenge was an observatory for studying stars and predicting the seasons; and a few have even argued that its rings acted as a docking pad for alien spaceships.
Now a University of British Columbia researcher who has investigated the great prehistoric monument for several years has announced he has uncovered its true meaning: it is a giant fertility symbol, constructed in the shape of the female sexual organ.
'There was a concept in Neolithic times of a great goddess or Earth Mother,' says Anthony Perks, a gynaecologist who decided to investigate the idea that the circles could have symbolic anatomical links. 'Stonehenge could represent the opening by which the Earth Mother gave birth to the plants and animals on which ancient people so depended.'
According to Perks's analysis, published in the Journal of the Royal Society of Medicine, the critical events in the lives of the builders of Stonehenge - who began their work around 3,000 BC - were births and deaths in their families and community. But there is no evidence of any burials near Stonehenge, Perks adds. 'There is little sign of death; there are no tombs, because Stonehenge was a place of life and birth, not death, a place that looked to the future.'
Evidence that the monument was dominated by ideas about creation and regeneration has been overlooked until now, says Perks.
Take the inner circle, which consists of pairs of massive capped rock pillars, one of which is rough and the other carefully smoothed. 'To a biologist, the smooth and rougher stones arranged in pairs, united by heavy lintels, suggest that male and female, father and mother, joined together,' he states.
Even more convincing, says Perks, is the similarity between Stonehenge seen from above and the anatomy of the female sexual organ. His article includes a map of the former, which is compared, point by point, with a detailed diagram of the latter. Of these features, the most important concern the central empty area that is enclosed by the monument's inner circle of giant bluestones.
'This central area is empty because it represents the opening to the world, the birth canal,' says Perks. Stonehenge was therefore constructed to honour the Earth Mother for 'giving both life and livelihood'.
As to Stonehenge's alignment with various astronomical events such as the rising of mid-winter and mid-summer sun - discovered by astronomers many years ago - these fit with notions of an Earth Mother partnered with a Sun Father, says Perks. Stonehenge celebrated their association, a place where people celebrated the Sun's closest approach to Earth in summer, while in winter they prayed for the pair to reunite.
It is an intriguing theory, though it has failed to impress experts. David Miles, chief archaeologist for English Heritage, which owns the site, said Perks's theory, although interesting, was essentially untestable. 'You can come up with just about any idea to explain a structure like Stonehenge if you stare at it for long enough. And if Stonehenge was built so that it looked like a female sexual organ when viewed from above, how were people supposed to see that? As far as we have been able to tell, they didn't have hot-air balloons in prehistoric times.'
Richard Brookhiser, writing in the Wall Street Journal (July 3, 2003):
[Gouverneur] Morris's work on the meat of the Constitution, its seven Articles, is a superb job of smoothing, organizing, and clipping unruly verbal vines. Anyone who has graded term papers or edited copy will appreciate his labor. One small simplification can stand for all the rest. An Article in the Committee of Detail's draft said: "The Government shall consist of supreme legislative, executive, and judicial powers." Morris struck this out, beginning his first three articles by announcing that "all legislative powers," "the executive power," and "the judicial power" shall be vested in a Congress, a president, and the courts. Morris's alteration pruned a needless statement. The accumulation of such changes over the entire Constitution makes for a document that is light and limber. As an old man, James Madison declared that "the finish given to the style and arrangement of the Constitution fairly belongs to the pen of Mr. Morris . . . A better choice could not have been made, as the performance of the task proved."
The most finished sentence Morris wrote is the Preamble. He did not have much to guide him. The closest thing to a statement of purpose in the Articles of Confederation, the first, failed constitution that the convention was replacing, says that the states will enter "a league of friendship . . . for their common defence, the security of their Liberties, and their mutual and general welfare . . . ." Madison had come to the convention with a plan of his own, which defined the objects of government as "common defence, security of liberty and general welfare." The draft of the Committee of Detail opened with a bald announcement: "We The People of the States of New Hampshire, Massachusetts" and so on, through Georgia, "do ordain, declare, and establish the following Constitution for the Government of Ourselves and our Posterity."
Morris preserved pieces of these forerunners in his Preamble -- yet he transformed them. "We the People of the United States, in order to form a more perfect Union, establish Justice, insure domestic Tranquility, provide for the common defence, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity, do ordain and establish this Constitution for the United States of America."
The first thing that leaps out is the parade of strong verbs -- no bland "to be" forms, no passive voices. Two verbs alliterate -- provide, promote. Two rhyme -- insure, secure. The repetition of "establish" links the Constitution itself with the idea of justice. Such verbal echoes make prose readable, and memorable. In their absence State of the Union messages sag like wet tarps, supported only by mindless ovations.
But Morris wrote like a philosopher as well as a poet. He lists six purposes of government. His last three come from the Articles of Confederation and from Madison, but by changing their order he makes "Liberty" open out to the future; he further sweetens it by speaking of its "blessings."
Where did his other purposes of government come from? "Domestic tranquility" had a personal meaning for him. Morris spent the beginning of the Revolution in his native New York, where the war was particularly ugly. The British invaded the state four times; loyalists and patriots fought each other in a virtual civil war. Morris's own family was split down the middle: his mother and sisters were loyalists; one half-brother signed the Declaration of Independence, while another was a general in the British army. Morris knew well the importance of a peaceful country, and a peaceful home.
Establishing justice was also a goal shaped by his own experience. Morris was the son and grandson of colonial judges. Of the three branches of government, the judiciary was the stepchild, receiving the least discussion at the convention and immediately afterward. Morris did not make this mistake. "In some parts of this Union," he warned a fellow founder, "justice cannot readily be obtained in the state courts."
Morris's greatest contribution came at the very beginning, when he shrank the list of the states in the Committee of Detail's draft to "We the People of the United States." There was a practical reason for his terseness: some states were unlikely to ratify any time soon (Rhode Island had sent no delegates to the convention). But Morris was moved by conviction as well. He was one of the doughtiest nationalists at the convention. "Among the many provisions which had been urged," he complained at one point, he had seen none "for supporting the dignity and splendor of the American empire." By speaking in the name of the people of the nation, Morris subtly but momentously changed the focus of government. The keenest critics of the Constitution saw what he was doing: Patrick Henry would pounce on "that poor little thing -- the expression, 'We the people." Abraham Lincoln would embrace it: the Gettysburg Address begins by recalling the Declaration, "four score and seven years ago," but its concluding invocation of "government of the people, by the people, and for the people" echoes the Preamble.
American history after the Constitutional Convention would be filled with many strange detours -- none stranger than Morris's decision, 25 years later, to repudiate his handiwork. As a northern Federalist during the War of 1812, he thought the government was in the grip of wicked bumblers, and he wanted the country split up and the Constitution scrapped. But literary critics know that authors are not the best judges of what they write. The Constitution has outlived the doubts of its draftsman -- sturdy, simple, and, at moments, beautiful.
Ken Ringle, writing in the Washington Post about the Folklife Festival on the Washington Mall (July 3, 2003):
Make no mistake about it. The Scotland and Appalachia sections of the Folklife Festival on the Mall are not just showcases of rural cutesy-quaint for the cultural voyeurs of the urbanized 21st century. They are direct pipelines into a major wellspring of the American character.
If the New England Pilgrims, Virginia Cavaliers and Pennsylvania Quakers shaped our national institutions, argues historian David Hackett Fischer of Brandeis University, it was the Scotch-Irish of Scotland and northern Ireland who most defined our culture and who define it still.
They arrived later than the others and settled in the mountain backcountry of pre-Revolutionary America (richer, earlier settlers held the fertile lowlands), and carved out a hardscrabble existence that for all its hardship and terrors was as proud as it was independent. There were a lot of them. Puritan immigrants numbered about 21,000, Fischer says. The Scotch-Irish numbered 275,000. Their heirs have fought our wars, written our music, shaped our churches and otherwise most defined our essence as a people for the past 200 years, Fischer says.
Shouldn't we maybe say thank you?
Down on the Mall, the festival participants aren't looking for thank-yous. They would probably be puzzled by them if not embarrassed. As much pride as they have as individuals, the Scotch-Irish have never chosen to leverage their collective ethnicity into political power, unless you count the election of Andrew Jackson as president -- their first great political hurrah.
Asked, for example, if the Appalachian foodstuffs she was hawking on the Mall were produced by local agricultural cooperatives, Phyllis Deal of Clintwood, down in Virginia's mountainous southwestern toe, said, "No, there's a traditional resistance to cooperatives in our area. We're just not very cooperative."
In his landmark 1989 study "Albion's Seed: Four British Folkways in America," Fischer traces the fractious independence of the Appalachian Scotch-Irish to the centuries of warfare along the borderlands of northern Ireland and southern Scotland from which Appalachian settlers emigrated in the mid-1700s. It was, apparently, a sort of 18th-century Hibernian Middle East: The fighting never stopped.
Since they were subject to violence and raids from both warring factions, Fischer says, the Scotch-Irish developed a distrust of all governments and most institutions other than their own family or clan. Loyalty to clan twinned with suspicion of strangers and with a cultural conservatism that clung to traditional beliefs and folkways. It also produced, Fischer says, an evangelical passion in religion that emphasized one's powerlessness to shape the future. And a land-hunger that spread them across the continent.
"Albion's Seed" argues persuasively that the instability of that ancestral homeland shaped an Appalachian culture that lent its distinctive character to everything from marriage customs and costumes to speech patterns, gender roles and food.
Carole Grout, writing in the Times-Picayune (July 3, 2003):
With the Fourth of July tomorrow, it is an appropriate time to look into the history of Uncle Sam, the character long a symbol of this nation.
Historians have been unable to agree on the person for whom Uncle Sam was named, though many think the name came from Samuel Wilson.
Wilson was born in Massachusetts in 1766, later moving as a child with his family to New Hampshire. As a young adult, Wilson moved to Troy, N.Y., where he later owned a slaughtering business and meat packing establishment.
During the War of 1812, Wilson's business provided much of the meat for the American Army in barrels on which was stamped "U.S.," probably denoting that it was packed for the United States government.
The story goes that someone looking at a barrel of meat, knowing its source and reading the letters "U.S." joked that the meat came from Uncle Sam. And so, it is said, in time Uncle Sam came to be equated with our country.
The traditional depiction of Uncle Sam was the creation of political cartoonists. Many of the earliest depictions were drawn by political cartoonist Thomas Nast in the 19th century. Nast also created the far less well known cartoon character of Yankee Doodle, a clean-shaven man with a feather in his hat.
Some believe that the outfit and top hat worn by Nast's Uncle Sam were inspired by clown Dan Rice, who wore a similar costume in his act. However, Nast's earliest Uncle Sam cartoons appeared in 1838 and Rice did not begin performing until six years later.
The best-known depiction of Uncle Sam was a poster by artist James Montgomery Flagg: the famous World War I "I Want You" recruiting poster.
Uncle Sam has become a symbol recognized throughout the world as standing for the United States. And tomorrow we again celebrate the birthday of this great nation.
David Greenberg, writing in the NYT (June 29, 2003):
EVERY year as Independence Day draws near, we debunk old myths -- pointing out that Betsy Ross didn't sew the first flag, or that the Continental Congress actually proclaimed independence on July 2. But historians say that the real misunderstandings of history run deeper than a botched date or the unmerited canonization of a Philadelphia seamstress. Here are a few of what scholars describe as the true myths of Revolutionary history.
The Declaration of Independence was an original work by Thomas Jefferson.
In national lore, no Revolutionary leader except George Washington looms larger than Jefferson. "People seem to think that if not for Jefferson, we would not be created equal and we wouldn't have inalienable rights," said Pauline Maier, a historian at the Massachusetts Institute of Technology.
But the Declaration was hardly Jefferson's solitary work. He drafted it as part of a five-man committee. John Adams and Benjamin Franklin edited his version, and the Continental Congress substantially revised the document (to Jefferson's irritation), excising a fierce condemnation of slavery.
In addition, the ideas didn't originate with Jefferson. Americans had been issuing similar calls for independence for months. As Professor Maier described in her 1997 book "American Scripture: Making the Declaration of Independence," records of at least 90 proto-declarations have survived, put out by towns, counties and groups of local tradesmen or soldiers. "These documents in some ways are much more effective than Jefferson's draft," Professor Maier said. "They tell the story of how the colonists loved the king and how things got worse and worse."...
It's commonly thought that American colonists fought to separate themselves from the British monarchy but, unlike the French revolutionists of 1789, not to remake their society. There were no bloody purges, no significant redistribution of wealth and, heaven forbid, no class warfare.
"But the American Revolution was far more radical than people realize," said John E. Ferling, the author of "A Leap in the Dark: The Struggle to Create the American Republic" and a historian at the State University of West Georgia.
In 1909 the historian Carl Becker famously wrote that the war was not just about "home rule" but also about "who should rule at home." "If you emphasize home rule, you emphasize the repudiation of state power," said Isaac Kramnick, a professor of government at Cornell. "If you emphasize who rules at home, you see the revolution as a historic milestone in overthrowing traditional elites."
Jefferson borrowed words and arguments from these documents, notably George Mason's draft of Virginia's Declaration of Rights. Mason wrote: "All men are born equally free and independant, and have certain inherent natural rights, . . . among which are the enjoyment of life and liberty, with the means of acquiring and possessing property, and pursuing and obtaining happiness and safety."
In fairness, Jefferson conceded that the Declaration didn't embody his original ideas; he intended it, he said, "to be an expression of the American mind."
Frank Bruni, writing in the NYT (June 29, 2003):
The specter of Communism has long dominated the discourse of Silvio Berlusconi, who casts himself as Italy's last line of defense against a tenacious scourge.
It has also had a leading role in his legal stratagems, which portray the prosecutors who have charged him with corruption as left-wing zealots wielding hammers and sickles.
But the prime minister's fear of the Red Menace has crept unexpectedly into a new sphere of Italian life, and some of his political opponents are wondering how it got there.
The evils of Communism appear front and center in one of the themes that hundreds of thousands of Italian high school seniors could choose to write about in graduation exams given this month. That topic invited students to ponder "terror and the political repression in the totalitarian systems" of the 20th century and gave brief descriptions of Fascism in Italy, Nazism in Germany and Communism in the former Soviet Union and other countries.
Communism is blamed for the executions of about 100 million people, five times the number of killings attributed in the exam to Nazism.
In the wording of the topic, it takes one sentence to denigrate Fascism. It takes four to vilify Communism.
Some historians and teachers have complained that the balance of the question is out of whack. "I teach my students that of course Communism must be seen in a negative light, but the goal of Nazism was to kill people, and the goal of Communism was to unite them," said Giuseppe Costantino, 61, who teaches history in a high school in Naples.
A few of Mr. Berlusconi's political opponents have suggested that he or his allies might be trying to mold young minds. "There's been an increase -- a boost -- in historical revisionism since the center-right came to power," said Enzo Carra, a center-left member of Parliament who follows education issues.
David Whitehouse, writing for the BBC (June 25, 2003):
A team of geologists believes it has found the incoming space rock's impact crater, and dating suggests its formation coincided with the celestial vision said to have converted a future Roman emperor to Christianity.
It was just before a decisive battle for control of Rome and the empire that Constantine saw a blazing light cross the sky and attributed his subsequent victory to divine help from a Christian God.
Constantine went on to consolidate his grip on power and ordered that persecution of Christians cease and their religion receive official status.
In the fourth century AD, the fragmented Roman Empire was being further torn apart by civil war. Constantine and Maxentius were bitterly fighting to be the sole emperor.
Constantine was the son of the western emperor Constantius Chlorus. When Constantius died in 306, his troops proclaimed Constantine emperor.
But in Rome, the favourite was Maxentius, son of Constantius' predecessor, Maximian.
With both men claiming the title, a conference was called in AD 308 that resulted in Maxentius being named as senior emperor along with Galerius, his father-in-law. Constantine was to be a Caesar, or junior emperor.
The situation was not a stable one, however, and by 312 the two men were at war.
Constantine overran Italy and faced Maxentius at the Milvian Bridge over the Tiber a few kilometres from Rome. Both knew it would be a decisive battle with Constantine's forces outnumbered.
It was then that something strange happened. Eusebius - one of the Christian Church's early historians - relates the event in his Conversion of Constantine.
"...while he was thus praying with fervent entreaty, a most marvellous sign appeared to him from heaven, the account of which it might have been hard to believe had it been related by any other person.
"...about noon, when the day was already beginning to decline, he saw with his own eyes the trophy of a cross of light in the heavens, above the Sun, and bearing the inscription 'conquer by this'.
"At this sight he himself was struck with amazement, and his whole army also, which followed him on this expedition, and witnessed the miracle."
Spurred on by divine intervention, Constantine's army won the day and he gave homage to the God of the Christians, who he believed had helped him.
Stephanie Simon, writing in the LA Times (June 25, 2003):
The steel arcs rising in a broken rainbow on the riverfront are meant to resemble giant gears bursting out of the earth. They honor the working-class men and women who built Detroit.
But the sculpture is not just about remembering. It's a statement of defiance.
Unions represent only 13% of American workers these days -- the lowest level in six decades. The manufacturing jobs that traditionally have been the backbone of organized labor are vanishing, often going overseas. More than 2 million jobs disappeared in the last three years alone.
Yet unions around the world raised $1.5 million to build this sculpture in downtown Detroit. When completed next month, the Labor Legacy Landmark will be the Western Hemisphere's largest tribute to workers.
To workers here, the project is a proud beacon. It is a symbol that organized labor will not fold, despite its weakness. And they hope it will inspire future generations of activists.
"Building a memorial on this scale says something about the energy, about the vitality of the labor movement today," said Donald Boggs, president of the Metro Detroit AFL-CIO.
Others, though, see it as a eulogy.
"With the decline in union membership, there's been a real emphasis on making sure that the stories [of the labor struggle] are shared with young people," said John Revitte, a professor of labor history at Michigan State University. "It seems almost ironic."
New York, Chicago, St. Louis and a number of other cities have begun promoting labor tourism, with guided trips to old factories, union halls and other historic sites. Museums are revising their exhibits on the Industrial Age to capture the stories not just of the great titans of industry, but also of the anonymous masses who toiled in filthy factories for as little as $2.34 a day. Even academics have begun to take the subject seriously; there's now a national association of labor historians, Revitte said.
"The decline in manufacturing ... has meant that a way of life has disappeared," said John Russo, director of the Center for Working Class Studies at Youngstown State University in Ohio. That loss, Russo said, has sparked a new interest in understanding workers: "Who were they? What happened when they were displaced?"
Detroit labor leaders plan to develop a school curriculum for field trips to the monument; they want to teach a new generation about the fight to win rights such as paid sick time and a 40-hour workweek. But they want the monument to be more than a history lesson.
In fact, they resist even calling it a monument.
"That word denotes the past. This is a forward-looking landmark, laying out labor's vision for a better world," said Dave Elsila, editor of Solidarity, the United Auto Workers magazine. "It's about passing the torch, about learning from the past to create a better future."
Artists David Barr and Sergio DiGuisti designed the 63-foot-high arch, which they call "Transcendence," to be broken at the top. The open space, glinting in sunlight, represents the energy of workers everywhere -- and the unfinished struggle of the labor movement.
Below the stylized gears, 14 bronze sculptures on granite boulders will depict the toil of laborers past and present, from fur traders to airline pilots. The path that winds past the artwork will be engraved with quotations from labor leaders, politicians and civil rights activists.
From antislavery crusader Frederick Douglass: "If there is no struggle, there is no progress."
From the placards of sawmill workers striking for a shorter workday: "Ten hours or no sawdust."
From Cesar Chavez, who organized farm workers: "The people united will never be defeated."
James Bone, writing in the London Times (June 25, 2003):
Fifteen years ago, Clifford Long Sioux trekked up Last Stand Hill with other American Indian activists and placed a plaque commemorating the natives' famous victory over Lieutenant-Colonel George Custer and his men at the Battle of Little Bighorn.
The unauthorised memorial honoured the "Indian patriots who fought and defeated the US Cavalry in order to save our women and children from mass murder".
The plaque was later removed from the site, where headstones mark the mass grave of 260 dead from Custer's defeated 7th Cavalry, and it is now in a museum.
But today, 127 years after Custer's celebrated last stand, the American Indians who won the battle will get their own memorial at the windswept site on the Montana prairie.
In a controversial move to reinterpret one of the most famous engagements in American history, the Lakota Sioux, Cheyenne and Arapaho bands who slew Custer, as well as the Crow and Arikara scouts who guided him, will be recognised with a sunken stone circle and a sculpture near the memorial to the slain cavalrymen.
Mr Long Sioux, a drug and alcohol counsellor on a nearby Cheyenne reservation, said: "For years it has been a one-sided story. Indians were never asked to be present there until the American Indian Movement started protesting and even placed a metal plate up there to draw attention.
"Some tried to ask for a memorial to their grandfathers who died there, but it had always fallen on deaf ears."
The Battle of Little Bighorn of June 25, 1876, was perhaps the greatest victory of Sitting Bull's Lakota Sioux against the encroaching US troops. But it is traditionally remembered for Custer's heroic last stand, an event that inspired numerous works of art.
The engagement began when Custer attacked an Indian village along the Little Bighorn River. His troops met unexpected resistance from up to 2,000 Sioux, Cheyenne and Arapaho warriors, who lost fewer than 100 men.
The US forces soon recovered and, within months, had renewed the military campaign against the Indians and began forcing them on to reservations. In 1881, the Government built a granite obelisk to honour the cavalry dead.
There was an effort to effect a reconciliation on the anniversary of the battle in 1916, and in 1926 the 7th Cavalry and the American Indians symbolically buried a hatchet. But the centennial in 1976 led to American Indian protests at what was then called the "Custer Battlefield National Monument".
After the plaque-laying in 1988, Congress changed the site's name to the Little Bighorn Battlefield National Monument and authorised a memorial to the battle's "Indian participants" to be built with private funds. Last year, at the urging of Senator Ben Nighthorse Campbell, a native American, Congress finally approved $2.3 million to build it.
Chris Weinkopf, writing in frontpagemag.com (June 25, 2003):
WHITENESS STUDIES, which began as a small fringe of the academic world only eight years ago, has since blossomed into the latest academic fad. The Washington Post reports that at least 30 institutions - from Princeton University to the University of California at Los Angeles - teach courses in the subject. WS has its own think tank (the Center for the Study of White American Culture), its own journal (Race Traitor, whose motto is "treason to whiteness is loyalty to humanity"), periodic national conferences, and a veritable library of books and tracts.
The essence of the discipline can be summed up in two words: Hating Whitey.
Now, we didn't have Whiteness Studies back when I was in college. Then, all the rage was multiculturalism, of which I got more than I could handle when, as a freshman, a scheduling snafu forced me into a section of the mandatory freshman-English program bearing the ominous title of Differences. There we studied literature through - to use the most pervasive cliché in academia - the lens of race, class, gender, and sexual orientation.
What that meant, in application, was that in the first weeks of class, we read books by African-Americans, the theme of which, unfailingly, was hatred for white people. Next we moved on to books by Hispanics, the theme of which was hatred for white people. From there it was books by Asians and Native Americans on - you guessed it - hatred for white people. There were a few variations, including some readings on anti-Semitism and homophobia, but otherwise, the theme was constant. This was a study of oppression, and the oppressors were always white guys.
Of course, all this took place way back in the early 1990s, eons ago for the modern-day Ivory Tower. Multiculturalism, once the primary fetish of academia, is now old hat in a culture that values the avant-garde above all else. Its permutations are spent. There are no more -isms to define; no more ethnic groups to balkanize; no more victims to patronize. That leaves academics looking for the next Big Thing, and they think they've found it in WS.
The focus has changed from multiculturalism, but the hating whitey theme remains.
Whiteness, as its would-be studiers see it, is the underlying cause of most every conceivable social ill. As David Horowitz has observed, Whiteness Studies is different in kind from other ethnocentric disciplines: "Black studies celebrates blackness, Chicano studies celebrates Chicanos, women's studies celebrates women, and white studies attacks white people as evil."
Scott McLemee, writing in the Chronicle of Higher Education (June 27, 2003):
Foner's magnum opus [History of the Labor Movement in the United States] may be his most characteristic work. When he began the project in the 1940s, he meant it as a rebuttal to the four-volume History of Labor in the United States prepared by John R. Commons and other scholars at the University of Wisconsin in the early decades of the 20th century. For "the Wisconsin school," labor organizations did not challenge the fundamental values of industrial capitalism. Rather, workers used unions to improve their position within the existing order. For Foner, by contrast, unions were part of a broader movement for democratization -- a means of struggling for political and social goals such as equality and power, as well as better wages, hours, and working conditions.
"He was a pioneer in the development of labor history as a discipline, in moving it out of the economics department," says Nelson N. Lichtenstein, a professor of history at the University of California at Santa Barbara. He notes that Foner chronicled the struggles of black and female workers at a time when the constituency of unions was assumed, by default, to be white and male. By the late 1960s -- despite his marginality, or perhaps because of it -- Foner was an acknowledged influence on the younger generation of historians studying the labor movement.
Meanwhile, Foner was in turn being influenced by lesser-known scholars, to put it as kindly as possible.
The first sign of trouble came in 1971, when James O. Morris published an article in Labor History charging that Foner's book The Case of Joe Hill (International Publishers, 1965) contained extensive plagiarism from an unpublished master's thesis that Mr. Morris wrote in the 1950s. "About one quarter of the Foner text is a verbatim or nearly verbatim reproduction of the Morris manuscript," he wrote. That was a low estimate, because Mr. Morris also noted that many of the primary sources quoted in his thesis also appeared in Foner's book -- passages that "begin at the same word in a broken sentence, involve the same pattern of dots for omitted material, end at the same point. ..."
In his reply, published along with Mr. Morris's article, Foner listed the archives and sources he had consulted. He acknowledged reading the thesis, but said he did so only toward the end of his research. He did not respond to Mr. Morris's documentation, in side-by-side columns, that compared Foner's book to the thesis and showed extensive borrowing, much of it word for word.
It was not to be the only time. Melvyn Dubofsky, a professor of history and sociology at the State University of New York at Binghamton, found "large chunks" of his dissertation incorporated, without attribution, into the fourth volume of Foner's History of the Labor Movement. "Later, I discovered he did the same with other dissertations too numerous to mention," he told H-Labor. Without noting the parallel with Mr. Morris's complaint, Mr. Dubofsky likewise points out that citations from primary sources in Foner's work tend to be exactly the same as those found in unpublished work by graduate students.
Other questions about Foner's documentation prove even more troubling. "I had a student working on the fur and leather workers' union, which Foner had written a book about," says Mr. Dubofsky. "She could not find the materials" in union records that Foner cited in his notes. "What happened? Did they exist?" In a book on the Industrial Workers of the World, a radical union, Foner claimed to have consulted government records that Mr. Dubofsky says he could never have actually examined, because they were classified and unavailable to researchers.
Such allegations were well known within labor history during the 1970s and 1980s, according to scholars in the field, including some who remain sympathetic to Foner.
"The tragedies of his life were multiple," says Santa Barbara's Mr. Lichtenstein. "He was on the margins of academic life, and even when he got back in, he didn't interact much with the mainstream. So I don't think he ever really held himself to academic standards."
Mr. Lichtenstein recalls hearing Foner lecture on the Molly Maguires -- the Irish-American labor organization that emerged in Pennsylvania's coalfields in the 1870s. "It felt like I was in the presence of someone who was from the 19th century himself, when being a historian meant, first of all, just assembling tremendous amounts of documents. So yes, we knew there were various problems in his own writing. You knew you wouldn't want to rely on him as a source, but would need to check it. On the other hand, I take the collections of documents that he edited at face value. Once you get past the 'gotcha' plagiarism stuff, he still has an important place in the development of labor history."
Another scholar, David R. Roediger, a professor of history at the University of Illinois at Urbana-Champaign, worked with Foner more directly -- collaborating with him on Our Own Time: A History of American Labor and the Working Day, published by Greenwood Press in 1989. "The nature of our collaboration was that he gave me boxes with all sorts of material in it, almost all primary sources." Foner himself drafted one chapter of the book. "I felt like I needed to check to make sure that everything in it was original," Mr. Roediger says, "because I knew the work would be scrutinized."
Foner shrugged off the charges of impropriety, Mr. Roediger recalls. "He basically said the same things that Ambrose later did: 'I write a lot of books, I have research help' -- mostly the women in his life -- 'I have a mountain of notes, and sometimes can't tell what's what.' That was his reasoning -- it was just a processing error that crept in every once in a while. He also had a photographic memory, so that may have been a factor."
Richard Pipes, in a letter to the editor of the NYT (June 21, 2003):
In "The Boys Who Cried Wolfowitz" (column, June 14), Bill Keller discusses the Central Intelligence Agency "Team B" of 1976, which I had the honor to lead and of which Paul Wolfowitz was a valuable member. He calls it "famous"; others have said "infamous."
But Team B did not come up "with estimates of Soviet military strength that we later learned to be ridiculously inflated." It did not deal with "Soviet military strength" at all, but with Soviet nuclear strategy: whether the Soviet Union shared the dominant American strategy of mutual assured destruction.
Team B concluded, on the theoretical and physical evidence, that the Russians had instead adopted a "war-fighting and war-winning" doctrine, which was confirmed after the Soviet Union's collapse. Whatever objections Mr. Keller may have to the Iraq intelligence, Team B's findings are not relevant since they were proved correct.
David Snyder, writing in the Washington Post (June 22, 2003):
William F. Chaney took a look at the numbers and didn't like what he saw: Ninety-nine of the 104 monuments at the Antietam National Battlefield commemorated Union forces. The Confederacy had just four, and one monument was for both sides.
So he commissioned a 24-foot-tall statue of Robert E. Lee, at a cost "in the six figures," and last week erected it on private land, within spitting distance of the Antietam National Battlefield.
"I thought it was important that the Southern side be more represented," said Chaney, who lives in Lothian, in southern Anne Arundel County, and is a distant relative of the general. "And I couldn't think of anyone better to represent the South than Robert E. Lee."
For some, Chaney's statue, dedicated last night in a ceremony at the base of the monument, is an affront to efforts to preserve the Antietam National Battlefield as it was during the Civil War.
The bronze-and-granite statue, depicting a stern-visaged Lee astride his horse Traveller, is also a powerful evocation of the Confederacy at a time when the Old South's symbols are under fire from those who say they glorify a dark chapter in American history....
The National Park Service has not allowed new monuments on park land since 1991, said park superintendent John Howard. A monument to the Union Army's famed Irish Brigade was erected in 1997, but it had been in the planning stages before the moratorium.
Chaney, 57, bought 100 acres of land adjacent to Antietam National Battlefield in 1999. He sold about 60 acres to the National Park Service, keeping 40 acres with a circa 1790 farmhouse. He turned the house into a museum, which gives roughly equal space and attention to the Confederate and Union sides. Chaney would not say exactly how much he paid for the statue, created by Arkansas sculptor Ron Moore. A portion of the six-figure price tag came from groups including the Sons of Confederate Veterans, which made donations to Southern Heritage at Antietam, a nonprofit group formed by Chaney, he said.
Howard said he viewed the statue as "appropriate," given Lee's presence in the battle.
"I think it is a very well-done monument; the quality of the work is outstanding," Howard said. "There are certain things that can't be denied in history, and one is that Lee was here."
For some in Sharpsburg, population 691, arguments over whether Lee should be depicted on the battlefield are "a little strange," said Sallie Cornell, owner of the American Deli, at Main and Mechanic. Portraits of Lee and George Washington hang on the walls.
"I can understand wanting no new buildup," said Cornell, 42. "But a statue, for goodness sake? We've already got his name everywhere -- it's okay to show his name but not his figure?"
The battle, on Sept. 17, 1862, claimed more than 3,650 lives and remains the bloodiest single day on American soil. Historians view the fight as a draw -- though important to the Union because it stopped Lee's advance.
For Chaney, protests about Confederate iconography are "political correctness that's gone berserk."
"You have to have balance in history," Chaney said. "You can't tell history accurately if it's not balanced."
Kate Coleman, writing in the LA Times (June 22, 2003):
Huey P. Newton and Eldridge Cleaver may be dead, but the Black Panthers have never really gone away. This bunch of thugs continues to capture the imagination of American intellectuals. In the last couple of weeks, the group has been celebrated at a Wheelock College conference titled "The Black Panther Party in Historical Perspective" and on a National Public Radio program that considered the group's place in American life.
The conference in Boston was called, said one of its organizers, because "far too often in looking at the Panthers, people have relied on kind of a negative portrayal of the party." Some 40 new papers examining the Panther legacy were presented, but few of them dealt with the dark side of the Panther movement.
That's the side I know well, having documented and written about it since the late 1970s, along with a number of other journalists. Newton was a thug even before he co-founded the organization with Bobby Seale in 1966. In 1964, he was convicted of assault after stabbing an unarmed man at a party with a steak knife. He bragged in his autobiography, "Revolutionary Suicide," of hanging out in hospital parking lots to pull off strong-arm robberies....
If the Wheelock conference wanted to examine the real legacy of the Panthers, its participants should have pored over the cold statistics showing a spike in drive-by shooting deaths and gang warfare that took place in Oakland in the decade following the Panthers' demise. The Black Panther Party had so fetishized the gun as part of its mystique that young men in the ghetto felt incomplete without one.
But that's not the legacy most scholars want to examine. On Monday, after the Wheelock conference, a caller to National Public Radio's "Talk of the Nation" attempted to inject a note of skepticism into a discussion of the conference. "My only experience with the Black Panthers was back in the '60s in Oakland, Calif.," the caller said. " ... and my experience wasn't a good one at all because they would get on the buses and intimidate everyone on the bus. They wouldn't pay. As far as I could tell, they were a bunch of thugs."
Former Panther Kathleen Cleaver quickly cut in to suggest that the bus thugs weren't actually members of the group at all: "There were at that time," she said, "a large number of people who were sent by the United States government and paid to be informants and infiltrators who behaved as thugs" while posing as Panthers.
"That's baloney," the caller interjected.
At that point, conference organizer Yohuru Williams jumped in: "This is why we were so concerned about having historians look at this. You're looking there at an issue of memory... There's nothing to say that the people that you encountered on that bus that day were Panthers."
It has too often come down to this: The faults uncovered about the Panthers are dismissed as slander from the FBI or more specifically Cointelpro, the FBI's domestic counterintelligence programs. Panther apologists want to see the group as having been a heroic force against racial injustice. But the party's criminal underpinnings give that tale the lie.
In response, the organizers of the Black Panther conference -- Jama Lazerow, professor of history at Wheelock College, and Yohuru Williams, associate professor of history at Delaware State University -- wrote the following letter protesting Ms. Coleman's characterizations and conclusions:
For the second time in a week, Kate Coleman ("Just a Pack of Predators") has used our conference ("The Black Panther Party in Historical Perspective," Wheelock College, June 11-13) as a way of discrediting professional historians, who, for the first time in nearly forty years, are finally subjecting the Black Panther Party to serious academic scrutiny.
Coleman was not in attendance at the three-day conference, and so far as we know, has read none of the fifty papers and commentaries, though she has been apprised in writing of the nature of the more than twenty-four hours of formal presentations and discussion, and she well knows that our purpose was precisely the opposite of that which she attributes to us in her op-ed piece. The conference was not called because "people have relied on kind of a negative portrayal of the party," nor because perhaps equal numbers have relied on a romanticization of the Party, but both: our goal was to reconsider the historical significance of a key movement of what historians call the late Sixties.
Thus, your readers might be interested to know, two panels explored in great depth and with much vigorous academic debate the way the Panthers have been portrayed in history and popular memory; two sessions examined the enormous diversity of the Panther experience at the local level in places as far away from Oakland as New Bedford, Massachusetts; and an entire session was devoted to the issue of violence in the life and rhetoric of the Party, a matter that generated so much raucous debate that the session ran over by nearly a half-hour.
In hosting this first-ever history conference on the Black Panther Party, our goal was to, in the words of one of our participants, find a space between the vilification peddled by writers like Kate Coleman and the hagiography still being promoted by some ex-Panthers and even by some in the academy. Both are exercises in myth-making, and we take seriously our responsibility as historians to engage the Panther story as we would any other: to move beyond myth and seek to accurately describe and analyze the past in its historical context. We believe we began that latter process at our conference.
Steve Connor, writing in the London Independent (June 23, 2003):
THEY RAN the biggest empire of their age, with a vast network of roads, granaries, warehouses and a complex system of government. Yet the Inca, founded in about AD 1200 by Manco Capac, were unique for such a significant civilisation: they had no written language. This has been the conventional view of the Inca, whose dominions at their height covered almost all of the Andean region, from Colombia to Chile, until they were defeated in the Spanish conquest of 1532.
But a leading scholar of South American antiquity believes the Inca did have a form of non-verbal communication written in an encoded language similar to the binary code of today's computers. Gary Urton, professor of anthropology at Harvard University, has re-analysed the complicated knotted strings of the Inca - decorative objects called khipu - and found they contain a seven-bit binary code capable of conveying more than 1,500 separate units of information.
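The arithmetic behind "a seven-bit binary code capable of conveying more than 1,500 separate units" can be puzzling, since seven pure binary digits yield only 2^7 = 128 combinations. Press accounts of Urton's work describe six binary construction choices per knotted cord combined with a choice among roughly 24 thread colors, which does produce a figure over 1,500. The specific feature names below are assumptions for illustration, not details given in this article:

```python
# Back-of-the-envelope sketch of the khipu sign count attributed to
# Gary Urton's analysis. Feature labels are illustrative assumptions.

binary_features = 6            # e.g. fiber type, spin direction, ply direction,
                               # knot direction, attachment direction, number class
binary_states = 2 ** binary_features   # 64 combinations of the six binary choices

color_choices = 24             # distinct thread colors, treated as a seventh "digit"

distinct_signs = binary_states * color_choices
print(distinct_signs)          # 1536 -- "more than 1,500 separate units"
```

On this reading, the "seventh bit" is not binary at all but a 24-way color choice, which is how the total climbs from 128 to roughly 1,536.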
In the search for definitive proof of his discovery, which will be detailed in a book, Professor Urton believes he is close to finding the "Rosetta stone" of South America, a khipu story that was translated into Spanish more than 400 years ago. "We need something like a Rosetta khipu and I'm optimistic that we will find one," said Professor Urton, referring to the basalt slab found at Rosetta, near Alexandria in Egypt, which allowed scholars to decipher a text written in Egyptian hieroglyphics from its demotic and Greek translations.
Kris Axtman, writing in the Christian Science Monitor (June 19, 2003):
On June 19, 138 years ago, African-Americans in Texas learned of their freedom - nearly 2-1/2 years after slavery had officially ended with President Abraham Lincoln's Emancipation Proclamation. So thrilled were those blacks who listened to Union Gen. Gordon Granger deliver the news after landing in Galveston and placing the city under martial law, that they raced into the streets, leaping and singing.
On that day, the story goes, they proclaimed they would never forget the date they were freed. And as Texans have migrated across the country over the last century, they've taken that vow along. While some have ribbed Texans for celebrating a dispatch so long deferred, the day's significance here is clear.
"To some extent, it is a sad day, like when an anticipated letter is lost. But the contents of that letter are not devalued by the delay in delivery," says Clifton Taulbert, an African-American author based in Tulsa. And the delay, he insists, should not diminish the hoopla. "We look at Juneteenth as: Our most distant cousin has finally gotten the letter - and now we can all celebrate together."
Juneteenth celebrations haven't evolved much in 138 years. Most still include free barbecue and red soda, music, and parades. But some national groups are trying to include an educational component, so that younger generations won't forget the history of the date.
"The holiday is going to come; it needs to come. But what is more important is the re-education of our people," says Lula Briggs Galloway, president of the National Association of Juneteenth Lineage in Saginaw, Mich.
She doesn't believe that African-Americans need permission to celebrate something that's already theirs. Just take the day off, she suggests - regardless of whether your state recognizes Juneteenth or not.
That's what many blacks in Texas did before Juneteenth was official, says Houston historian Patricia Smith Prather. "When I was growing up, it was an accepted fact that nobody black went to work on Juneteenth. Maids and butlers, we all stayed home," she says. "It was a day of jubilee, a day of prayer and singing."
Interest waxes, wanes
But enthusiasm for the celebrations began to wane in the 1960s with the birth of the civil rights movement and its focus on integration and unity - until the 1970s, when an emphasis on black history and cultural heritage swung the pendulum back.
Finally, in 1979, Texas became the first state to officially recognize Juneteenth, and other states followed suit on the momentum of precedent and popular demand. In 1997, the US Congress passed a resolution designating June 19 as Juneteenth Independence Day. A presidential proclamation would make it a national holiday - something no president has been willing to do.
Activists were hoping that President Bush would be sympathetic to the idea, having presided over the holiday many times as governor of Texas. So far, though, the thousands of petitions have fallen on deaf ears.
Clyde Haberman, writing in the NYT (June 20, 2003):
THE young landscaper stood yesterday over the graves of Julius and Ethel Rosenberg, and decided that the shrubbery needed a trim. He would definitely get to it, he said. People might visit, and he wanted everything to look right.
He understood that the Rosenbergs were somehow important, the man said in his Spanish-accented English. What exactly they had done in life to make them famous, he had no idea. He knew them only in death. Even then, they were not much more than a name on a broad headstone: Rosenberg. One more name out of thousands, representing all those souls on their journey through forever at Wellwood Cemetery, along the border between Nassau and Suffolk Counties.
The Rosenbergs' first names were engraved on separate stone markers at the foot of the graves. There was some other writing, too, but the landscaper could not read it. It was the Hebrew rendering of the couple's names. His was Yonah, hers Etel.
From the dates of their births and deaths, the man could tell that she had been the elder, 37 to his 35. Very young, he said. That is so, he was told. Look, he said, they both died on the same day: June 19, 1953. Was there, he asked, an accident? Not quite, he was told.
Across the language barrier, it was not so easy to explain why it was a gross oversimplification to say that the Rosenbergs died.
They were executed. Fifty years ago yesterday, having been found guilty of conspiring to pass American atomic secrets to the Soviet Union, they went to the electric chair at Sing Sing -- first Julius, then Ethel. They died as the sun was setting.
Usually at Sing Sing, the death penalty was carried out at 11 p.m. But that June 19 was a Friday, and 11 p.m. would have pushed the executions well into the Jewish Sabbath, which begins at sundown. The federal judge in Manhattan who sentenced them to death, Irving R. Kaufman, said that the very idea of a Sabbath execution gave him "considerable concern." The Justice Department agreed. So the time was pushed forward.
Killing the couple was one thing. But to do the deed on the Sabbath, apparently, was quite another.
Janadas Devan, writing in the Straits Times (Singapore) (June 20, 2003):
Homo sapiens is a curious specimen: It is the only creature on Earth that can narrate its own history, and yet it is hardly conscious of it most of the time. The reason is simple: History and evolution can be known, but they cannot be experienced directly like this morning's breakfast. Consequently, we live in the main as though 'history is bunk', to quote Henry Ford, thinking our little spot in time were eternity. Change is what happened in the past; the present is permanent; the future, a simple extrapolation from now.
The Romans thought that, the Ming Chinese assumed the same, as did the Ottomans and the British. Consider what the last assumed over 100 years ago. Victoria was still on the throne in 1900 (she died in 1901); England was the workshop of the world; the British Navy ruled the waves; and the Union Jack flew over a fifth of the globe. Lord Curzon, then Viceroy of India, wrote in 1900: 'The message is carved in granite, it is hewn in the rock of doom, that our work is righteous and it shall endure.'
In that same year, when the old queen was knocked about in her yacht by rough waves, she commanded her doctor: 'Go up at once, give the Admiral my compliments and tell him the thing must not occur again.' But she, no more than King Canute, could stop the waves. Already in 1900, Britain had to suppress rebellions on two continents: the Boers in South Africa, and the Boxers in China....
In August 1914, it all came crashing down in the Great War; and in August 1947, Britain lost its jewel in the crown, India. Mohandas Karamchand Gandhi, who had actually organised an Indian Ambulance Corps to support the British empire in the Boer War, turned into a 'half-naked fakir of a type well-known in the East', and booted Britain out of India. Mao Zedong was only 10 years old in 1903 and Deng Xiaoping was still in his mummy's tummy (he was born in 1904). But within 50 years, Mao accomplished what the Boxers couldn't, and 30 years later, Deng invited the gwai lo back - as investors.
THE present, like all previous presents, seems solid: America reigns supreme, its power is impregnable. But all that's solid always melts finally. The following are some of the challenges the American empire will face, which I'll take up in the coming weeks:
Its military might is not matched by financial might; the world's most powerful country is also its biggest debtor. Like Britain in 1903, the US may be over-stretched today.
The world has never tolerated for long a unipolar power structure. It didn't in 1903 - with Germany, Russia, the US and soon Japan, challenging Britain - and it probably won't now.
A global economy exists now, as it did in 1903, but a global polity or society doesn't, as it didn't in 1903. That combination helped create 1914.
And finally, overarching everything is a looming environmental crisis of catastrophic proportions which will shape all relations between nations.
Think Homo sapiens cannot become an endangered species? Tell that to Homo neanderthalensis.
NPR, "All Things Considered," remembering the 50th anniversary of the bus boycott that gave rise to the more famous Montgomery protest:
Fifty years ago in Baton Rouge, La., black citizens banded together to fight the segregated seating system on city buses. They quit riding for eight days, staging what historians believe was the first bus boycott of the budding Civil Rights movement.
The Baton Rouge episode inspired the Montgomery, Ala., bus boycott led by the Rev. Martin Luther King, but was largely forgotten. But as NPR's Debbie Elliott reports, organizers of a commemoration of the original bus boycott this week hope to change that.
Willis Reed, 88, publisher of the Baton Rouge Post, now takes a seat at the front of the bus that stops at the newspaper offices. The World War II veteran says doing that 50 years ago would have meant trouble.
"They'd put me in jail," he tells Elliott. "And it's wrong. Definitely wrong." Reed was the founder of a group challenging segregation on Baton Rouge buses. Reed and a local clergyman, the Rev. T.J. Jemison, were the leaders of the bus boycott, which began June 20, 1953.
In 1953, 80 percent of bus riders were black -- and Reed knew that a boycott would send an economic message.
"Historians believe it was one of the first times blacks in the South organized to challenge segregation," Elliott says. "Yet most people here -- even the African-American bus drivers -- don't know about the Baton Rouge bus boycott."
Jemison, now 84, says he got involved in the boycott 50 years ago after watching buses pass by his church and seeing black people standing in the aisles, not allowed by law to sit down in seats reserved for whites.
"I thought that was just out of order, that was just cruel," he tells Elliott.
After eight days of boycotting the buses, the Baton Rouge City Council agreed to a compromise that opened all seats -- except for the front two, which would be for whites, and the back two, for black riders.
That wasn't good enough for some protesters, but Jemison called off the boycott anyway, arguing they had achieved what they set out to do.
"When we started we didn't start to end segregation on buses," he tells Elliott, "we just started to get seats."
Marc Sternberg, who is 30 years old and white, grew up in Baton Rouge but found out about the boycott by accident, reading an account of the action in a book about King's success in Montgomery. Sternberg organized two days of events to highlight the 50-year anniversary of the Baton Rouge boycott.
"Before Dr. King had a dream, before Rosa kept her seat, and before Montgomery took a stand, Baton Rouge played its part," Sternberg says.
Kate Connolly, writing in the London Daily Telegraph (June 17, 2003):
HALF a century after more than 100 East German workers were shot or crushed to death by Soviet tanks and thousands arrested in the first mass uprising against Communist rule, a new book claims that the British Government "shamefully" failed to back the revolt.
In 17 June 1953 - A German Uprising, Hubertus Knabe, a historian, argues that in his determination to retain the status quo in post-war Europe, the then prime minister, Winston Churchill, failed to help the East German protesters because he feared it risked a unified Germany.
The West's failure to act foreshadowed the passive response to the Russian invasions of Hungary in 1956 and Czechoslovakia in 1968.
East Germany's five-day insurgency involved more than a million people in 700 towns and villages. It began on June 16 when, emboldened by Stalin's death three months previously, about 5,000 workers marched in peaceful protest against longer working hours.
The next day some 17,000 demonstrated, rising to about 50,000 by midday. They were met by East German and Soviet troops who confronted the crowds with gunshots. By mid-afternoon a state of emergency was declared, mass arrests began and the insurgency was swiftly halted.
Most historians have accepted that the West's options were limited. Britain was worried about events inside Russia, then convulsed by the power struggle after Stalin's death. There was also fear of resurgent German nationalism only eight years after the Second World War.
But Mr Knabe argues that Britain offered virtually no assistance and sought even to dampen attempts to aid the uprising. He points to numerous official telegrams demonstrating the Churchill government's detachment when faced with Russia's successful attempts to smash the demonstrations.
In a memorandum to a Foreign Office diplomat just days after thousands of arrests had been made and a number of people had been killed by Russian troops, Churchill wrote: "I had the impression that in the light of the increasing unrest they acted with considerable restraint."
In a memorandum to the Foreign Office which illustrated the Government's anxiety that the strikes would spread uncontrollably, a British general wrote: "I've spoken twice to the [West Berlin] police president in order to effect an end to the incitement, stressing in particular the distribution of leaflets via balloons."
Barry Gewen, an editor at the New York Times Book Review, writing in the NYT (June 15, 2003):
THE turning point may have come in 1985 with "Shoah," Claude Lanzmann's nine-and-a-half-hour epic of death camp survivors, Nazi officials, Polish bystanders, righteous gentiles and meticulous historians hunched over aging documents. It marked -- if it did not initiate -- the moment when documentary filmmakers started giving their full attention to Hitler's planned extermination of the Jews. "When I began exploring how films have grappled with the Holocaust in 1979, there were merely a few dozen titles to warrant attention," Annette Insdorf writes in her encyclopedic study "Indelible Shadows: Film and the Holocaust." But for the book's third edition, published this year, she lists, together with the fiction films, 69 documentaries made since 1990 alone -- a rate of almost one every two months. Elsewhere she estimates that there are at least six completed Holocaust documentaries that do not get distribution for every one that does. And the stream has continued at flood tide into 2003. Last month "Secret Lives," Aviva Slesin's emotionally complex film about Jewish children hidden by gentile families during the Nazi era, opened in New York. Shortly after, PBS showed Charles Guggenheim's "Berga: Soldiers of Another War," about Jewish-American soldiers captured by the Germans. "Bonhoeffer," Martin Doblmeier's intellectual, spiritually suffused account of the anti-Nazi German theologian Dietrich Bonhoeffer, is opening on June 27, two days before A & E broadcasts Liz Garbus's "Nazi Officer's Wife," the biography of a Jewish woman who survived by assuming an Aryan identity and marrying a Nazi party member.
But simply listing these new films raises a troubling question: Are too many Holocaust documentaries now being made? Has supply outstripped demand? It's a question that makes people uncomfortable. Who would want to appear callous in the face of such suffering, or, worse, anti-Semitic? Yet there are definite signs of Holocaust fatigue. Perhaps because she is a survivor, Ms. Slesin is more forthright than most. "I can't bear to see evil over and over again," she says. "Even I roll my eyes when I hear about another Holocaust documentary" -- but then she quickly adds, "until I see what it's about."
Stephen Feinstein, the director of the Center for Holocaust and Genocide Studies at the University of Minnesota, has sat on a selection committee for a Jewish film festival when more than 15 Holocaust documentaries were submitted. With each year bringing still more films, he says, "you can't see them all." Many of the films have become formulaic, using the same German footage, the same static interviewing techniques. "Get out of the talking-head format," Mr. Feinstein advises. Raye Farr, the director of the Steven Spielberg Film and Video Archive of the United States Holocaust Memorial Museum, says that filmmakers are too often taking the easy way out, showing an "increasing inclination to go for sentimentality." With an undertone of exasperation in her voice, she says, "Crying is not very edifying."
Patricia Cohen, writing in the NYT (June 14, 2003):
A scientist financed by, say, the tobacco industry is expected to declare whose wallet is behind his research. But what about a historian?
The question may seem odd, but it has suddenly become more urgent as medical historians are becoming witnesses in some of the country's most important and expensive lawsuits.
This practice is causing a fierce debate among historians over the ethics of testifying for industries accused of endangering the public's health. At the normally sedate annual meeting of the American Association for the History of Medicine last month, one panel erupted in angry recriminations over the paint industry's role in the lead poisoning of children, the subject of lawsuits around the country in which historians work as consultants on both sides.
"The historical profession has really not been prepared for this," said Robert N. Proctor, a professor of the history of science at the University of Pennsylvania, who in 1999 became the first historian to testify against the tobacco industry. "We don't have disclosure rules for publications, we haven't had discussions about the ethics of whether to testify or not to testify."
Passions cover the spectrum, from historians who say consulting is a private matter to those who insist it inevitably taints objectivity. Other objections are purely political: historians shouldn't represent "bad guys," like tobacco or toxic polluters.
Aware of the growing hoopla, the medical journal Lancet has invited Mr. Proctor to write a "comment" about the debate. In his manuscript, he writes: "Historians who render expert advice to the industry are playing a dangerous game. The industry does not ask us to lie, but they do ask us to research only those topics that will help them in their defense."
But John C. Burnham, a historian at Ohio State University who has served as a consultant for a number of companies, including tobacco, asbestos, lead and soda, scoffs at colleagues who claim the moral high ground against industry consultants. In lead paint cases, for example, Mr. Burnham declares "everyone has a financial interest": the lead paint manufacturers, the landlords and municipalities who don't want to pay for a cleanup, and the lawyers who stand to earn high fees.
"This would mean you couldn't testify for anyone in the lead cases, because everyone is contaminated at this point," Mr. Burnham said. Historians can maintain their integrity no matter whose side they're on, he maintained, adding, "Even large corporations are entitled to justice."
Richard Bernstein, writing in the NYT (June 16, 2003):
Fifty years ago this Tuesday, hundreds of thousands of workers took to the streets in 272 cities and towns across what was then the German Democratic Republic, the eastern half of divided Germany.
Within the space of that single day, they raided jails to release political prisoners, made and listened to speeches outlining a possible better future, issued manifestos calling for both democracy and better conditions for themselves and threw a scare into the East German leadership from which it never completely recovered.
At the end of the day, Soviet troops and the East German police, backed by tanks, put down demonstrations and arrested many of the movement's leaders. The number of people killed in the process was estimated at between 25 and 300. Brief as it was, the June 17 uprising remained a treasured and inspiring memory for thousands, for whom, when East Germany finally did die in 1989, it seemed a precursor, a herald of what was to come.
The 50th anniversary has prompted some national attention. A television documentary about the event aired nationally here last week, and a large meeting will be held in Berlin on the anniversary day itself. But a good dozen or so local exhibitions are under way, including one here in this old East German rust-belt town of Bitterfeld, a fast 90-minute drive on the autobahn from Berlin.
"It was the first uprising in the Communist camp," said Mr. Wagner, now an organizer of cultural events who lives in Berlin. "And it was an uprising that clearly showed that the claim of the Communist Party to be a party of the people was false."
"It's remarkable that one of the demands of the workers was for a free country, which was translated into a reality in 1989," he said.
During the Communist years, Bitterfeld was perfectly representative of East Germany's big, inefficient, highly polluted industrial cities. Here, 30,000 workers in state-operated plants produced film, paint, aluminum and pesticides.
The protests were set off on June 16, 1953. On that day, the trade union newspaper, Die Tribune, defended a decree by the East German Council of Ministers that raised production quotas in industry and construction by at least 10 percent, according to a history of the German Democratic Republic by Martin McCauley, a British historian.
The next day, construction workers in East Berlin marched in protest. The news, reported by RIAS, a West German radio station that broadcast from the American-controlled sector of West Berlin and could be heard throughout East Germany, spread quickly, and within hours 300,000 workers all across the country had taken to the streets.
Sam Roberts, writing in the NYT (June 15, 2003):
Fifty years ago Thursday, Julius and Ethel Rosenberg were executed in the electric chair at Sing Sing. Their execution, originally set for 11 p.m. on Friday, June 19, 1953, was rescheduled for 8 p.m. to avoid conflict with the Jewish sabbath.
"They were to be killed more quickly than planned," the playwright Arthur Miller wrote, "to avoid any shadow of bad taste."
A shadow lingers.
"I grew up believing Ethel and Julius were completely innocent," Robert Meeropol, who was 6 years old in 1953, says of the Rosenbergs, his parents. "By the time I completed law school in 1985, however, I realized that the evidence we had amassed did not actually prove my parents' innocence but rather only demonstrated that they had been framed."
After digesting newly released American decryptions of Soviet cables a decade later, Mr. Meeropol came to a revised conclusion. "While the transcriptions seemed inconclusive, they forced me to accept the possibility that my father had participated in an illegal and covert effort to help the Soviet Union defeat the Nazis," he writes in his new memoir, "An Execution in the Family: One Son's Journey" (St. Martin's Press).
Of course, the Rosenbergs weren't executed for helping the Soviets defeat the Nazis, but as atom spies for helping Stalin end America's brief nuclear monopoly. They weren't charged with treason (the Russians were technically an ally in the mid-1940's) or even with actual spying. Rather, they were accused of conspiracy to commit espionage, including enlisting Ethel's brother, David Greenglass, through his wife, Ruth, to steal atomic secrets from the Los Alamos weapons laboratory where he was stationed as an Army machinist during World War II. Mr. Greenglass's chief contribution was to corroborate what the Soviets had already gleaned from other spies, which by 1949 enabled them to replicate the bomb dropped on Nagasaki. (He confessed, testified against his sister and brother-in-law and was imprisoned for 10 years; Ruth testified, too, and was spared prosecution.)
As leverage against Julius, Ethel was also indicted on what, in retrospect, appears to have been flimsy evidence. The government didn't have to prove that anything of value was delivered to the Soviets, only that the participants acted to advance their goal.
"When you're dealing with a conspiracy, you don't have to be the kingpin, you have to participate," says James Kilsheimer, who helped prosecute the Rosenbergs. "You can't be partially guilty any more than you can be partially pregnant."
But to justify the death penalty, which was invoked to press the Rosenbergs to confess and implicate others, the government left the impression that the couple had handed America's mightiest weapon to the Soviets and precipitated the Korean War.
Records of the grand jury that voted the indictment remain sealed. But we now know the Soviet cables decoded before the trial provided no hard evidence of Ethel's complicity. And Mr. Greenglass has recently admitted that he lied about the most incriminating evidence against his sister. The government's strategy backfired. Ethel wouldn't budge. The Rosenbergs refused to confess and were convicted.
Michael Winerip, writing in the NYT (June 18, 2003):
TODAY, more than 100,000 New York high school students will take the state global history exam. Richard P. Mills, the state education commissioner, has such faith in this test and the state tests in English, math, American history and science that in recent years, he has made passage of all five exams mandatory for a high school diploma.
New York now has one of the most test-driven education systems in the nation. For global history teachers like Dalia Hochman at La Guardia High School in Manhattan, it's a whirlwind tour through the centuries as she tries to cover everything on the state test. "In one stretch," Ms. Hochman said, "we do the scientific revolution, French Revolution, revolution in Haiti, Simón Bolívar and Latin American independence movements, the Napoleonic period, 19th-century nationalism in Italy and Germany, Zionism, back to the Industrial Revolution; it's a race to finish."
Nor does the state commissioner take kindly to high schools that want to teach history differently or test students' knowledge in other ways. Two dozen schools in the state use the International Baccalaureate, a curriculum that teaches history by emphasizing a more in-depth look at fewer eras, requires students to do research with primary sources and uses international specialists to grade the work. The program is considered so rigorous that students often accumulate enough credits to skip their freshman year of college. But it's not good enough for Dr. Mills. He has refused to approve the International Baccalaureate history program as an alternative to state tests.
Dr. Mills has also rejected a request by a consortium of 32 small high schools, including Urban Academy in Manhattan, to assess their students' history research and oral presentations by using real historians like Eric Foner of Columbia University and Mikal Muharrar of the New-York Historical Society instead of state tests. Urban Academy may be a national model for small, successful high schools, featured most recently in Newsweek magazine; it may take minority children who have failed at other New York City high schools and get virtually all of them into four-year colleges. But that is not good enough for Dr. Mills.
So Ann Cook, co-director of Urban Academy, wondered: Since the state is so dismissive of other assessment approaches, just how good are the state's own tests? Using a Gates Foundation grant, Ms. Cook assembled several panels of specialists to critique the New York State Regents exams.
In global history, the panelists, mainly history professors, actually took recent state tests. They were struck by the gap between how rigorous state standards sound (students must "evaluate evidence," "probe ideas and assumptions," "ask and answer analytical questions") and how "vapid" the test essay questions were....
Ms. Hochman of La Guardia High, a 2000 Yale graduate in history, spent a recent weekend writing comments on 170 term papers that examined an international response to a world problem like AIDS in Uganda. The papers were brilliant and deep, she says, but it bothers her that they are not part of the state evaluation of her students or her school. Though she loves their lively classroom discussions, she feels constant pressure to cut them off, to keep up with a test-driven curriculum that, she believes, goes a mile wide and an inch deep. After Sept. 11, 2001, her students begged for special lessons on Islam. But at a Regents training conference, she said, a veteran mentor teacher warned her, "Honey, spend two days on the Byzantine Empire and three days on Islam, and then you've got to move on."
Roger Boyes, writing in the Times (London) (June 17th, 2003):
THE remaining secrets of the East German Stasi, torn into shreds and stored in 16,000 sacks, may soon be pieced together by a new computer software program.
Designed to solve the world's biggest jigsaw puzzle, the program could shed light on love affairs between agents and Nato secretaries, contract murders and even the recruitment of foreign academics by one of the most thorough of the Communist secret police forces.
In the dying days of the East German state, agents at the Magdeburg archives were ordered by Erich Mielke, the head of the Stasi, to destroy tens of thousands of personal files. But the East German shredding machines were not up to the job and local officials could not organise the transport needed to create a huge bonfire of secrets on the outskirts of town.
Instead they tore each page into neat quarters and stuffed them into 16,000 brown paper sacks. After the Berlin Wall fell, Stasi headquarters were stormed by East Germans and the sacks were taken to a former refugee centre in Zirndorf in Bavaria. There, with painstaking thoroughness, about 50 civil servants have taken eight years to put together the destroyed files from 300 sacks. Fragments of paper are spread across large desks, much as one would solve a jigsaw puzzle: names, handwriting and signatures are matched. At the present rate, the authorities say, it will take another 450 years to process the remaining sacks.
The Berlin Fraunhofer Institute of Production Facilities and Construction Technology, however, has worked out a software program that will match the paper fragments and order them correctly according to the various secret police operations.
The files stretch back several decades. The advantage of the new program is clear: if the documents can be reconstructed in months rather than decades, outstanding murder or treason cases can be solved and the guilty brought to trial. Victims of the Stasi will also be able to lodge compensation claims and seek rehabilitation.
The pieces of paper are scanned electronically, then fed into the system. The program is being refined and a prototype should be ready for use by October. Then the German parliament has to decide whether to pay for it.
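The matching step the article describes can be pictured with a toy sketch. This is purely illustrative and not the Fraunhofer program itself (which works on scanned images of the fragments); the data structures, the edge-profile idea, and all numbers below are invented for the example. The intuition: a page torn in two leaves complementary edges, so two fragments belong together when their torn-edge profiles sum to a constant strip width.

```python
# Illustrative sketch of fragment matching (not the actual Fraunhofer system).
# Each fragment's torn edge is modeled as a list of depth measurements along
# the tear; a fragment and its true neighbor have edges that add up to a
# constant width at every point.

def edge_match_score(edge_a, edge_b):
    """Lower is better: how far edge_b is from complementing edge_a."""
    if len(edge_a) != len(edge_b):
        return float("inf")
    # For a perfect pair, a + b is the same strip width everywhere.
    width = max(a + b for a, b in zip(edge_a, edge_b))
    return sum(abs((a + b) - width) for a, b in zip(edge_a, edge_b))

def best_match(target_edge, candidates):
    """Pick the candidate fragment whose edge best complements the target."""
    return min(candidates, key=lambda frag: edge_match_score(target_edge, frag["edge"]))

# Toy data: one torn right edge and three candidate left edges.
fragments = [
    {"id": "A", "edge": [4, 6, 5, 7]},  # complements [6, 4, 5, 3] at width 10
    {"id": "B", "edge": [9, 1, 2, 8]},
    {"id": "C", "edge": [5, 5, 5, 5]},
]
torn_right_edge = [6, 4, 5, 3]
print(best_match(torn_right_edge, fragments)["id"])  # → A
```

The real system would score on many more cues (paper color, handwriting, typeface, tear shape in two dimensions), but the core task is the same: rank candidate pairs and keep the best-scoring joins.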
That could prove troublesome. When the German parliament first initiated the search for an appropriate software program two years ago, members of the former Communist Party of Democratic Socialism opposed it. The party no longer plays a significant role in parliament, but political analysts say that mainstream parties might also try to block the computer plan.
Some of the reconstituted files are likely to contain records of conversations between Western politicians and the East German authorities from the detente years of the late 1960s to the 1980s. Neither the Social Democrats nor the Christian Democrats (who ruled Germany between 1982 and 1998) have a strong interest in such revelations.
But the sacks should provide remarkable insights for historians, who are still trying to unravel the precise relationship between the East and West German states...
...Fifty years ago today, for example, East Berlin workers held a protest march that was crushed violently by Soviet troops. A road that led to the Brandenburg Gate, at the Berlin Wall, was renamed June 17 Street in defiance of the Communist state.
Memorial services, wreath-laying and historical tours are planned. However, there is still no agreement on whether the march was the beginning of a genuine rebellion against Communist rule.
Still, many details are unclear: the number of deaths was estimated at between 25 and 300 and the fate of many of the prisoners is still murky, as is the relationship between the Soviet and East German security forces. The torn Stasi files could provide a clue to this and to many other supposedly defining historical moments.
Kathy Sawyer, writing in the Washington Post (June 17th, 2003):
WASHINGTON -- Vincent van Gogh's 1889 painting Evening Landscape with Rising Moon has been eclipsed, so to speak, by his famous Starry Night -- both painted 114 summers ago in the French village of Saint-Remy. But now his moon is about to rise again.
Astronomers at Southwest Texas State University used topographic maps, aerial photos, trigonometry, weather records and the painter's letters, as well as astronomical calculations, to solve a riddle that has long bedeviled art historians: When did van Gogh see that moon?
The answer is 9:08 p.m., July 13, 1889.
His painting depicts a moon as fat and juicy as a fresh orange lodged against a shimmering sky and almost impaled on a strange, overhanging cliff. Experts initially mistook the orb for a setting sun.
The researchers, whose work was described in the July issue of Sky & Telescope magazine, found the odd cliff and took relevant measurements, even though a forest of tall pine trees has grown up and obscured some of the landscape. The condition of the wheat fields in the painting, arrayed in golden stacks, was another clue. Described by the artist in a letter as very green in May, the grains by June had "the warm tones of a bread crust" and by July were "all yellow."
Because of a cosmic synchronicity, the calendar dates of lunar phases this year -- the 150th anniversary of van Gogh's birth -- nearly repeat those of 1889. Therefore, on July 13 modern observers might see the van Gogh moon, nearly full, rise in the southeast, "much as it did on July 13, 1889, when van Gogh stood among the wheat stacks," according to Sky & Telescope.
Kate Connolly, writing in the Telegraph (London) (June 17th, 2003):
In 17 June 1953 - A German Uprising, Hubertus Knabe, a historian, argues that in his determination to retain the status quo in post-war Europe, the then prime minister, Winston Churchill, failed to help the East German protesters because he feared it risked a unified Germany.
The West's failure to act foreshadowed the passive response to the Russian invasions of Hungary in 1956 and Czechoslovakia in 1968.
East Germany's five-day insurgency involved more than a million people in 700 towns and villages. It began on June 16 when, emboldened by Stalin's death three months previously, about 5,000 workers marched in peaceful protest against longer working hours.
The next day some 17,000 demonstrated, rising to about 50,000 by midday. They were met by East German and Soviet troops who confronted the crowds with gunshots. By mid-afternoon a state of emergency was declared, mass arrests began and the insurgency was swiftly halted.
Most historians have accepted that the West's options were limited. Britain was worried about events inside Russia, then convulsed by the power struggle after Stalin's death. There was also fear of resurgent German nationalism only eight years after the Second World War.
But Mr Knabe argues that Britain offered virtually no assistance and sought even to dampen attempts to aid the uprising. He points to numerous official telegrams demonstrating the Churchill government's detachment when faced with Russia's successful attempts to smash the demonstrations.
In a memorandum to a Foreign Office diplomat just days after thousands of arrests had been made and a number of people had been killed by Russian troops, Churchill wrote: "I had the impression that in the light of the increasing unrest they acted with considerable restraint."
In a memorandum to the Foreign Office which illustrated the Government's anxiety that the strikes would spread uncontrollably, a British general wrote: "I've spoken twice to the [West Berlin] police president in order to effect an end to the incitement, stressing in particular the distribution of leaflets via balloons."
Selwyn Lloyd, then a junior Foreign Office minister, made crystal clear the British position in a confidential memorandum to Churchill on June 22: "At the present time, a divided Germany [is preferable because it is] more secure. But none of us wants to say this clearly because of the possible effects on public opinion in Germany. Therefore publicly we should all support a unified Germany."

Mr Knabe told The Daily Telegraph yesterday: "The Western policy towards the protests was shameful. For years politicians in the West preached about liberating the East and encouraging easterners to stand up against Communists, but the moment hundreds of thousands went on to the streets they turned their backs on them."
Mr Knabe, director of the Stasi (east German secret police) museum in Berlin, says Britain even blocked the issue from being taken before the United Nations. The events of June 1953 caused shockwaves around the world, angering even long-standing Communist supporters such as the German playwright Bertolt Brecht, who wrote: "Would it not have been simpler for the government to have dissolved the people and chosen them anew?"
The uprising's 50th anniversary has generated an unprecedented amount of soul-searching in Germany, inspiring new films, publications and public debates about what really happened.
Dinitia Smith, writing in the NYT (June 17, 2003):
In late June and early July 1928, Hitler dictated a sequel to "Mein Kampf." It was taken down by Max Amann, the director of the Nazi party's publishing company.
The 1928 elections had just taken place, and the Nazis had fared dismally. One reason, Hitler thought, was his policy on the South Tyrol, which was transferred from Austria to Italy by the peace settlement after World War I. Hitler, hoping to make an alliance with Mussolini, was willing to leave the area and its ethnic Germans under Italian rule, angering ultra-right-wing Germans. Hitler wanted to explain his foreign policy on the South Tyrol and the United States, but "Hitler's Second Book," as it came to be known, was never published during his life. One theory is that the second volume of "Mein Kampf," published in 1927, was selling badly, and that Amann persuaded Hitler that bringing out a new book would further cut into sales.
That book is known to specialists and historians of Nazi Germany. The historian Ian Kershaw, for instance, refers to the German text in his biography of Hitler, but for the most part it has been ignored, partly because it has been largely unavailable in the English-speaking world.
This October, however, Enigma Books, a small press specializing in modern European history, is to publish a new English translation titled "Hitler's Second Book: The Unpublished Sequel to `Mein Kampf.' " The translator is Krista Smith, and the editor is Gerhard L. Weinberg, an emeritus professor of 20th-century history at the University of North Carolina at Chapel Hill. He also wrote an introduction and annotations. The book largely repeats Hitler's familiar obsessions with racial politics and Germany's need for more territory, or Lebensraum.
"What is new," Mr. Weinberg said in a telephone interview from Chapel Hill, "is the spelling out in greater detail than `Mein Kampf' how an alliance with Italy fits into this and why the long-term German objectives of conquest require a war with the United States."
Steven E. Landsburg, writing in Slate (June 13, 2003):
Why did Jews and only Jews take up urban occupations, and why did it happen so dramatically throughout the world? Two economic historians, Maristella Botticini (of Boston University and Università di Torino) and Zvi Eckstein (of Tel Aviv University and the University of Minnesota), have recently been giving that question a lot of thought.
First, say Botticini and Eckstein, the exodus from farms to towns was probably not a response to discrimination. It's true that in the Middle Ages, Jews were often prohibited from owning land. But the transition to urban occupations and urban living occurred long before anybody ever thought of those restrictions. In the Muslim world, Jews faced no limits on occupation, land ownership, or anything else that might have been relevant to the choice of whether to farm. Moreover, a prohibition on land ownership is not a prohibition on farming: other groups facing similar restrictions (such as Samaritans) went right on working other people's land.
Nor, despite an influential thesis by the economic historian Simon Kuznets, can you explain the urbanization of the Jews as an internal attempt to forge and maintain a unique group identity. Samaritans and Christians maintained unique group identities without leaving the land. The Amish maintain a unique group identity to this day, and they've done it without giving up their farms.
So, what's different about the Jews? First, Botticini and Eckstein explain why other groups didn't leave the land. The temptation was certainly there: Skilled urban jobs have always paid better than farming, and that's been true since the time of Christ. But those jobs require literacy, which requires education, and for hundreds of years, education was so expensive that it proved a poor investment despite those higher wages. (Botticini and Eckstein have data on ancient teachers' salaries to back this up.) So, rational economic calculus dictated that pretty much everyone should have stayed on the farms.
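The "rational economic calculus" here is an ordinary investment comparison, and a toy calculation makes it concrete. All of the numbers below are invented for illustration (Botticini and Eckstein's actual data on teachers' salaries and wages are not reproduced here): education pays only if the discounted urban wage premium over a working life exceeds the up-front cost of schooling.

```python
# Toy version of the education-investment logic (all numbers invented):
# compare the net present value of schooling against staying on the farm.

def education_payoff(school_cost, farm_wage, urban_wage, years, discount=0.05):
    """NPV of choosing education: discounted wage premium minus schooling cost."""
    premium = sum((urban_wage - farm_wage) / (1 + discount) ** t
                  for t in range(1, years + 1))
    return premium - school_cost

# Antiquity: schooling is very expensive relative to the wage premium,
# so the investment does not pay.
print(education_payoff(school_cost=500, farm_wage=10, urban_wage=15, years=30) > 0)  # → False

# Later centuries: cheaper schooling and a larger premium flip the calculus.
print(education_payoff(school_cost=50, farm_wage=10, urban_wage=20, years=30) > 0)   # → True
```

On these stylized numbers, everyone rationally stays on the farm in the first regime, which is the authors' point: Jews paid the schooling cost anyway, for religious reasons, and so held a head start when the second regime arrived.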
But the Jews (like everyone else) were beholden not just to economic rationalism, but also to the dictates of their religion. And the Jewish religion, unique among religions of the early Middle Ages, imposed an obligation to be literate. To be a good Jew you had to read the Torah four times a week at services: twice on the Sabbath, and once every Monday and Thursday morning. And to be a good Jewish parent you had to educate your children so that they could do the same.
The literacy obligation had two effects. First, it meant that Jews were uniquely qualified to enter higher-paying urban occupations. Of course, anyone else who wanted to could have gone to school and become a moneylender, but school was so expensive that it made no sense. Jews, who had to go to school for religious reasons, naturally sought to earn at least some return on their investment. Only many centuries later did education start to make sense economically, and by then the Jews had become well established in banking, trade, and so forth.
The second effect of the literacy obligation was to drive a lot of Jews away from their religion. Botticini and Eckstein admit that they have little direct evidence for this conclusion, but there's a lot of indirect evidence. First, it makes sense: People do tend to run away from expensive obligations. Second, we can look at population trends: While the world population increased from 50 million in the sixth century to 285 million in the 18th, the population of Jews remained almost fixed at just a little over a million. Why were the Jews not expanding when everyone else was? We don't know for sure, but a reasonable guess is that a lot of Jews were becoming Christians and Muslims.
So which Jews stuck with Judaism? Presumably those with a particularly strong attachment to their religion and/or a particularly strong attachment to education for education's sake. (The burden of acquiring an education is, after all, less of a burden for those who enjoy being educated.) The result: Over time, you're left with a population of people who enjoy education, are required by their religion to be educated, and are particularly attached to their religion. Naturally, these people tend to become educated. And once they're educated, they leave the farms.
Robert Bartley, writing in the Wall Street Journal (June 16, 2003):
The Vietnam War haunted the American political psyche for three decades, until the ghost was exorcised on September 11, 2001. The other bookend of the era, at least in my mind, is November 1963, a month that opened with the assassination of President Ngo Dinh Diem in South Vietnam and closed with the assassination of President John F. Kennedy in Dallas.
Those of us who think this way, and I am by no means the only one, naturally looked forward to a new biography of the martyred president, Robert Dallek's "An Unfinished Life: John F. Kennedy, 1917-1963." As it turns out, Mr. Dallek asserts that JFK would have withdrawn from Vietnam if he had served a second term. This notion has been assiduously spread by Kennedy acolytes for three decades now, and Mr. Dallek's uncritical acceptance of it raises again the issue of why he was selected for privileged access to the Kennedy papers....
My own preoccupation, Vietnam, was the subject of the second article. Mr. Dallek discusses the long debate within the administration over whether to sanction the coup that ultimately resulted in Diem's murder. In contrast with his clarity during the Cuban Missile Crisis, the president is conflicted and indecisive. Immediately after the coup, he taped a memo, particularly regretting an August cable that first suggested a coup. "I should not have given my consent to it without a roundtable conference at which McNamara and Taylor could have presented their views."
In fact, the key Aug. 24 cable was approved by the president after a briefing by George Ball, who interrupted his shower on a Hyannis weekend. At least, this was the contemporary report of Marguerite Higgins in "Our Vietnam Nightmare" (Harper & Row, 1965). But this is missing from Mr. Dallek's bibliography, as is Ellen Hammer's "A Death in November" (Dutton, 1987). These anti-coup books are essential balance to the acolytes.
President Eisenhower briefed the incoming president the day before the inaugural. The principal subjects included Laos and, we know from other sources, the balance of payments, which unwound as a crisis during the Nixon administration. The outgoing president favored American intervention in Laos, predicting that unless the U.S. resisted there South Vietnam and Cambodia would also fall.
In the event, President Kennedy negotiated the Laos accords, a coalition arrangement that gave the Communists de facto control of the Ho Chi Minh trail vital to infiltration into South Vietnam. By 1963 the South erupted in crisis, with conflicting battlefield reports and political turmoil in the Buddhist crisis and burning bonzes. The notion spread in the Saigon press corps and a Kennedy administration faction that Diem, an inflexible Catholic, had to go in order to win the war. After the coup, the military situation deteriorated rapidly.
Mr. Dallek lists the reasons JFK was reluctant to withdraw from Vietnam: failure at the Bay of Pigs, the Vienna summit with Khrushchev, defending Laos, the Berlin Wall, the Soviet resumption of nuclear testing. He feared the international and domestic reaction to another defeat. By November, sanctioning a coup against an ally in the name of winning the war had been added.
Then withdraw? Joe Kennedy's competitive kid? The "green berets" guy? The "bear any burden" guy? Give me a break.
Acolytes love this myth dearly, of course, and Mr. Dallek was writing not a focused examination of it but a broad portrait valuable in its own right. But he need not adopt the withdrawal notion so uncritically or champion it in magazines. For the purpose of the myth is to obscure a salient truth. To wit, Vietnam was John F. Kennedy's war.
Victor Davis Hanson, writing for the website of the Claremont Institute (Summer 2003):
Why do so many western intellectuals excuse thuggery and whitewash the crimes of megalomaniacs? I have received more angry mail, for example, over a brief article I published a few years ago called "Alexander the Killer" than about anything I have ever written. And the myth of Napoleon, like that of Alexander the Great, is also deeply enshrined in our collective romance; to question either risks real outrage.
Both dictators were eerily similar in ways that go beyond being military geniuses who ruled entire continents by their early 30s. In each case ghastly records of slaughter were carefully masked by a professed concern for the arts and sciences: e.g., silly tales of Alexander sleeping with a copy of the Iliad under his pillow and his real efforts to bring a legion of Greek natural scientists with him eastward; or Napoleon's patronage of Vivant Denon (author of the monumental 24-volume Description de l'Egypte) and his gifts of Egyptian booty to a generation of French scholars. Like Hitler's Speer and de Gaulle's Malraux, Denon was one of a long line of gifted toadies dating back to Alexander's Callisthenes, court intellectuals who simultaneously worshiped and loathed the powers that be, who at least noticed them.
Napoleon and Alexander were money-driven thieves par excellence, perhaps the difference being only that the looted imperial treasuries at Susa, Babylon, and Persepolis yielded more specie than the Swiss banks at Berne. The Great's "Brotherhood of Man" was about as genuinely utopian as the Code Napoléon. Both strongmen dazzled their immediate circle with lapidary self-infatuation; for example, Napoleon's "At twenty-nine years of age I have exhausted everything. It only remains for me to become a complete egoist." Or Alexander's reply to Parmenio's urging before the battle of Gaugamela to take the terms offered by Darius III: "And I would accept them too, if I were Parmenio."
In the end, their real legacies were millions dead and empires that crumbled the second they were gone. I suppose the only real difference was that Alexander loved horses and named a city after his steed Bucephalas, whereas Napoleon rode to death dozens of mounts and exhausted Europe of its horseflesh.
Paul Johnson's polemical Napoleon, an entry in the Viking/Penguin series of brief biographies, is not impressed with the little corporal or anything he did. After all, the military record is unquestioned: 17 years of wars, perhaps six million Europeans dead, France bankrupt, her overseas colonies lost. And it was all such a great waste, for, as Johnson shows, when the self-proclaimed tête d'armée was done, France's "losses were permanent" and she "began to slip from her position as the leading power in Europe to second-class status; that was Bonaparte's true legacy."
Tim Rutten, writing in the Los Angeles Times (June 14, 2003):
Among all the unforeseeable permutations of the Jayson Blair affair, none is more unexpected -- or more problematic -- than its role in reviving the 13-year-old campaign to strip the New York Times' Walter Duranty of the Pulitzer Prize he won in 1932.
American journalism has thrown up more than its share of vile characters; Duranty certainly was among the worst. As the Times' Moscow correspondent in the 1920s and '30s, he was an active agent of Soviet propaganda and disinformation -- probably paid, certainly blackmailed, altogether willing. For years, Duranty lied, distorted and suppressed information to please Josef Stalin. One of his reportorial reputation's cornerstones, in fact, was the exclusive interview the Soviet dictator granted him in 1929....
When the board of the Ukrainian Congress Committee of America (UCCA) met to discuss commemoration of the famine's 70th anniversary, it decided "a campaign to revoke Walter Duranty's 1932 Pulitzer Prize" would be an "integral component" of that effort. Their initiative quickly was joined by Canadian and British Ukrainian émigré associations, which set up Web sites through which visitors could e-mail the Pulitzer Board and the New York Times. . . .
Curiously, the same organizations and commentators who are pressing the issue of Duranty's prize have been resolutely silent about one of the Holocaust's darkest chapters -- the collaboration by tens of thousands of Ukrainians with the Nazi murderers of Eastern European Jewry. The Waffen SS raised an entire brigade from among the Galician Ukrainians. Ukrainian POWs volunteered to serve as guards in the German death camps. Followers of the Ukrainian nationalist Stepan Bandera enthusiastically joined the Nazis in carrying out massacres of Jews throughout the Ukraine and adjoining regions.
According to Rabbi Abraham Cooper of the Simon Wiesenthal Center in Los Angeles, "there is no doubt at all of their participation in genocide."
There is a clear moral claim to be made on behalf of those who died in the Ukrainian famine. So, too, for those Ukrainian Jews who died at the hands of the Nazis and their own countrymen. This week, the Los Angeles Times asked officials of the leading U.S. and Canadian Ukrainian émigré organizations whether they ever had censured or condemned the Galician Brigade or Bandera's followers for their participation in genocide.
"It depends on what you mean by genocide," said the UCCA's Gallo. "To my knowledge, well, I'm not really sure. We may have sent out statements in the past on these things, but I would have to look for them and get back to you." As of Friday's deadline, she had not.
John Gregorovich, chairman of the Ukrainian Canadian Civil Liberties Assn., said, "No, these are controversial things, and there is no evidence of war crimes by the Galician Division or Bandera's Organization of Ukrainian Nationalists. They were not motivated by anti-Semitism. Those who charge they were are mainly Jewish correspondents and scholars, who fail to differentiate between anti-Semitism and Ukrainian nationalism."
And so it goes.
Meanwhile, a special subcommittee of the Pulitzer board is once again reviewing Duranty's prize. Whatever it decides, it is hard not to agree with at least one thing Gregorovich said:
"When it comes to genocide, every crime forgotten or denied is a victory for its perpetrators."
Gary Cox, in a post on H-Labor (June 12, 2003):
As I sat at the base of our desecrated monument in Ludlow, Colorado, it was quiet and peaceful. Only the birds were chirping in the cottonwoods above me. These trees must have been planted to shade this beautiful sculpture of a miner, his wife, and their small frightened child. Trees are as scarce as moisture on the high semi-desert of Eastern Colorado. There was a slight breeze bringing me the smell of tiny flowers and sage competing for the water from a precious rain the night before. I had arrived early to meet Mike Romero, President of Local 9856, U.M.W.A., to view the recent damage to the sculpture and to get the Union's view of who may have vandalized this site.
My thoughts were of the 2 women and 11 children who had suffocated in the "black hole" there next to me, trying to escape their burning tents and militia machine gun fire; and of Louie Tikas who had been shot three times in the back by the Colorado State militia only a few feet from where I sat. If the Twin Towers in New York City symbolize wealth and power, this monument symbolizes the courage and solidarity of working people to resist exploitation, and to struggle for civil liberty, for freedom, and for dignity. The husband and wife team represented on the monument probably knew from past experience that wealthy lawyers, masquerading as their representatives in Washington, D.C., would never deliver on the Constitution and Bill of Rights for mere miners in Colorado when it was Rockefeller who was violating those rights. They knew also that Governor Elias Ammons would not enforce the state 8 hour day law for the same reason. A politician does not bite the hand that supplies him/her campaign funds and flowery press. Laws are selectively enforced. The miners' families learned the hard way that freedom, civil rights, and justice must be won by working people through struggle, tenacity, and courage, and then maintained by constant vigilance. Nothing has ever been "given" to the powerless. Freedom never comes to the timid nor to those too comfortable to sacrifice. These thoughts drifted up to me from the "black hole" along with the words of Woody Guthrie in the famous song he wrote after he had visited this site, "God bless the mine workers' union, and then I hung my head and cried." This monument is our "twin towers."
This isolated 40 acres, which had been the Ludlow tent colony site, was purchased by the United Mine Workers of America in 1917 and this monument was built next to the "black hole" to memorialize the tragic 1913-1914 U.M.W.A. strike. The Ludlow tent colony was the largest of several tent colonies spaced strategically to block the canyons leading up into the Sangre de Cristo mountains where the coal mines were located. The monument was officially dedicated at a large gathering of mostly miners and their families on May 30, Memorial Day, 1918. It was a magnificent sculpture and has witnessed a yearly memorial service for the past 85 years in the quiet, peaceful spot nestled at the foot of the majestic Sangre de Cristos, unmolested until May 7, 2003.
If your knowledge of Ludlow history needs brushing up, go to web sites for used and rare books, e.g., alibris.com or abebooks.com, and look for "Out of the Depths," by Barron Beshoar, son of the only doctor who would care for striking miners or their families at Ludlow, and "The Great Coalfield War," by George McGovern, written as his doctoral thesis. The latter is the better of the two, in my opinion, but harder to find and more expensive. For some strange reason, all the accurate books on the history of Ludlow are out of print??? Mike tells me George McGovern has also been invited to come on the 29th but hasn't responded yet. He's no spring chicken. Look who's talking. See you on the 29th. If you have time, stop at the old county jail in Walsenburg, just 15 miles north of Ludlow. The jail has been converted into a delicious two-story mining museum. It's on 5th Street, behind the county courthouse. Mother Jones slept here.
Richard Owen, writing in the Times (London) (June 16, 2003):
A mysterious "sword in the stone" said to have been thrust into a rock near Siena by a medieval knight proves that the legend of King Arthur, Excalibur and the Holy Grail originated in Tuscany, not Cornwall or Brittany, an Italian scholar claims.
The sword, of which only the hilt and an inch or two of blade is visible, is preserved at the Gothic abbey of San Galgano at Montesiepi, about 19 miles (30 km) southwest of Siena. The Cistercian abbey, now ruined, was built to honour St Galgano, a 12th-century Tuscan nobleman named Galgano Guidotti who renounced a life of "arrogance, lust and violence" to become a hermit after seeing a vision of the Archangel Michael.
To symbolise his rejection of war, he supposedly plunged his sword into the rock, which miraculously "parted like butter", leaving only the hilt exposed to form the shape of the Cross.
It has been assumed that the Tuscan "sword in the stone" is a fake, made to echo the Celtic legend of King Arthur as told by Geoffrey of Monmouth and Chrétien de Troyes and by Thomas Malory in his celebrated 15th-century Le Morte Darthur.
But a study by the medieval historian Mario Moiraghi suggests that the story of St Galgano and his sword was the origin of the myth of King Arthur and the Knights of the Round Table, embellished by medieval troubadours as it spread from Tuscany.
In The Enigma of St Galgano, Moiraghi -- noted for his work on the Templars -- claims that writers such as de Troyes were inspired by the tale of Galgano and not the other way round. "The dates support this," he said. "Galgano died in 1181, and the story of his miraculous act swiftly became widely known when he was canonised."
De Troyes wrote Perceval in 1190, and Wolfram von Eschenbach wrote the German version of the Holy Grail myth between 1210 and 1220, also focusing on Perceval (or Parzival), the knight of humble origin who finds the Grail. Richard Wagner based his text for Parsifal, his last stage work, premiered in 1882, on von Eschenbach.
Moiraghi said that the testimony of Dionisa, St Galgano's mother, to the panel of cardinals considering his canonisation in 1190 contained "all the essential elements of the Round Table myth": a knight who overcomes all obstacles to reach his ideal; his search for a Holy Grail (in Galgano's case an indecipherable text he saw in a vision rather than the cup from the Last Supper); and the "central role of the sword". Tales of chivalry brought back from Persia by merchants became popular in Tuscany at about the same time.
Moiraghi said that the Arthurian "round table" may have been inspired by the shape of the chapel built over the sword in the stone at Montesiepi (the Rotunda). Even the name "Galgano" may have been corrupted into "Galvano" by later writers, giving birth to the figure of Gavin or Gawain, Arthur's nephew and at one stage his ambassador to Rome.
The theory that the legend of St Galgano predates rather than copies the story of Arthur is supported by tests on the sword at the Tuscan abbey. Scientists say that its metal and style are "compatible with the era of St Galgano". Luigi Garlaschelli, a research scientist at the University of Pavia who helped to conduct the tests, said that there appeared to be a cavity beneath the rock. The church authorities had not yet given permission for an excavation to show whether this contained further evidence, such as the saint's remains.
Jack Malvern, writing in the Times (London) (June 16, 2003):
A neglected song could be the only surviving scrap of a lost Shakespeare play, an academic has concluded.
The historian and broadcaster Michael Wood believes that the song Woods Rocks and Mountains by Robert Johnson was written for Cardenio. The play was performed by Shakespeare's company in 1613 and credited to him and John Fletcher in 1653.
No evidence of it survives, however, except for the statement by the 18th century playwright Lewis Theobald that his play Double Falshood was based on it. If Shakespeare did write Cardenio, it is very likely that he would have had songs written for it. Woods Rocks and Mountains would fit a scene in Double Falshood in which the heroine sings of her sorrows.
Mr Wood said:"It gives us a precious insight into how the play would have been staged." A re-creation of the song by the Royal Shakespeare Company will be broadcast in Mr Wood's series In Search of Shakespeare on BBC Two on June 28.
Experts on Shakespeare reacted cautiously. Lisa Jardine, of Queen Mary, University of London, said:"It is the kind of all-purpose song that might equally well have been used in a play by a contemporary, for instance Thomas Middleton."
The academic Charles Hamilton has published a text previously identified as a Middleton play, The Second Maiden's Tragedy, which he says is Cardenio.
Academic opinion is divided on the claim.
David Keys, writing in the Independent on Sunday (June 15, 2003):
A semi-pornographic royal seal, discovered in a field in East Anglia, is providing historians and archaeologists with vital clues to the life of one of the Dark Ages' most bizarre celebrities.
Queen Balthild is now thought to have been born an Anglian aristocrat, who was then sold into slavery. She married the King of the Franks, became a ruthless ruler and murderer, but was finally made a saint before she died.
With her somewhat intimidating name - Balthild means literally"Bold Battle" in Anglo-Saxon - she has long been an enigma to scholars of Dark Age history. But the discovery, by a metal-detector enthusiast, of her royal seal matrix buried in a field in East Anglia is shedding new light on her extraordinary story. The gold seal matrix, which was originally attached to a ring, is one of the most important Dark Age artefacts ever found in Britain. On one side is a human face with her name inscribed around it in Frankish form. On the other side are two naked figures thought to portray Balthild and her husband, the Frankish (French) king, having sex. The respectable side, according to this month's BBC History magazine, was used to seal official documents, while the reverse was no doubt used to seal more private correspondence between royal husband and wife.
An analysis of her name suggests that Balthild was a member of one of the Anglian (rather than Saxon) tribes and therefore almost certainly came from an Anglian area, namely Suffolk or Norfolk.
Second, the field in which the seal matrix was found - just a few miles east of Norfolk's county town, Norwich - has been yielding further Anglo-Saxon finds, suggesting that the matrix came from a long-vanished settlement, conceivably associated with her descendants.
Reconstructing Balthild's early life has long been a challenge to scholars, but new research now suggests that she was born around 627 and that she may well have been connected in some way to the last pagan king of East Anglia, a usurper called Ricberht who was ousted by his Christian rival Sigabert, the rightful heir to the throne, with French help. The victorious Sigabert (whose name, aptly, means"Shining Victory") had invaded East Anglia after spending several years at the court of the Frankish king.
As a young girl, Balthild was sent to the same French royal court as a slave - perhaps as a relative of the defeated Ricberht.
She joined the household of the king's chief administrator, Erchinoald, whose unwanted sexual overtures she rapidly learnt to resist. Just as well - for she soon met the French king, Clovis II. The pair appear to have fallen for each other and were married in 648. They had three sons, each of whom later became a Frankish king.
In 657 Clovis died, and Balthild took over as regent until her son came of age. By all accounts she was a ruthless ruler: as part of a continuing struggle with the church, she seems to have been responsible for the murder of at least nine French bishops. When her son Clothar came of age in 664, Balthild's rule ended - and she was virtually imprisoned in a convent. There she dedicated herself to a life of unexpected piety until her death in 680.
The wedding present from Clovis - the royal seal ring - must have been one of her most treasured and intimate possessions. How it ended up in a field near Norwich is a mystery. But it is conceivable that it was returned to her East Anglian family estate after her death. An analysis of all the other finds from the field - brooches, a finger ring, a pendant, belt fittings - does indeed hint that a high-status Anglo-Saxon residence once stood on the site.
For Dr Andrew Rogerson, a leading archaeologist at Norfolk Museums and Archaeology Service, which has recorded all the finds from the area, the seal is simply "the most extraordinary single object" he has ever examined.
Philip Dine, writing in the Washington Post (June 12, 2003):
Convinced that the nation could benefit if its leaders had a better grounding in military history and thinking, Rep. Ike Skelton, D-Mo., has done something unusual for a politician.
He's not proposing a law or seeking funds. He's urging people to read books.
A senior member of the House Armed Services Committee, Skelton has compiled a 50-book National Security Book List that he says should be "required reading" for defense officials, young officers, members of Congress and the nation's war colleges. He mailed out about 600 copies of the list this week, with more set to go.
The books center on actual wars and defense strategy, with a focus on World War II and the Civil War, but the list's aim goes well beyond military art, with Skelton hoping to help produce future generations schooled in leadership and character.
"I should have done it years ago," said Skelton, who was motivated in part by the postwar chaos in Iraq, which he said shows the relevance of historical context.
The United States is having trouble winning the peace in Iraq, and Skelton said that occupation experiences of the Americans in Saigon, where U.S. troops were hit with grenades and sniper fire, and of the French in Indochina, could shed light on today's difficulties....
Skelton spent weeks putting together and reshuffling a collection of titles, all of which he has read. At the top of his list, which is posted on his Web site, www.house.gov/skelton, is the Constitution. Twenty are biographies, including ones on Alexander the Great, Hannibal, Winston Churchill and Harry S. Truman, and autobiographies by Ulysses S. Grant and Douglas MacArthur; he cited these as valuable for citizens wanting a somewhat briefer introduction.
The most important book, Skelton said, is Edward Shepherd Creasy's "Fifteen Decisive Battles of the World: From Marathon to Waterloo," written in 1850, because it outlines key battles that shaped the world.
His personal favorites: "Daniel Boone: The Life and Legend of an American Pioneer" by John Mack Faragher, and "Tecumseh: A Life" by John Sugden. Skelton's great-great-great-grandfather, Squire Boone, was a nephew of Daniel Boone.
Some books are ancient, such as Sun Tzu's "The Art of War" (fourth century B.C.); others merely old, including Carl von Clausewitz's "On War" (1832); and some recent, such as Stephen Ambrose's "Undaunted Courage" (1996).
Rick DelVecchio, writing in the San Francisco Chronicle (June 11, 2003):
The Black Panther Party is the subject of growing academic interest as historians born after the 1960s take a new look at a movement known to their generation mostly from movies, memoirs and negative government reports.
Scholars from around the country will meet in Boston beginning today to give more than 40 new papers on the Panthers. Organizers are saying the conference breaks new ground as a scholars-only assessment of the group, which emerged in Oakland in 1966, spread nationally and internationally, and faded out in 1980.
The conference, called "The Black Panther Party in Historical Perspective," is part of a trend to look at the Panthers in the web of American social history rather than merely as one of the most sensational groups to emerge in the upheaval of the '60s.
"If historians don't ignore them, they tend to focus on the militant side of black people brandishing guns, or they focus on the negative side of cocaine and brutalization and violence and all the rest," said Jama Lazerow, a professor of history at Wheelock College, where the conference will be held.
"The Panthers are the most radical version of the '60s stories, and the way people tend to interpret the '60s even now is in a negative or positive light," he said.
In 1968, the FBI branded the Panthers the nation's most violence-prone extremist group and began a campaign to disrupt the party from within and prosecute its leaders and active members. The government's record still clouds the Panthers' image, but young scholars see that as just one version of the story.
"We're not out to celebrate the Panthers," said Yohuru Williams, 31, a Delaware State University history professor and one of the organizers of the conference. "We're out to correct the historical record."
Shane Green, writing in the Australian Age (June 13, 2003):
A newly released document has revealed that Emperor Hirohito may have been planning to apologise and take responsibility for Japan's march across East Asia in World War II.
In a 1948 draft speech written in the hand of the Emperor's top aide, Hirohito told his subjects of his "mental agony" over Japan's war losses, and the "flame of anxiety" that "burns my body".
Significantly, the draft speech is an apology to the Japanese people, rather than the countries the Japanese occupied.
After Japan surrendered, Hirohito took no responsibility for the war, despite what many historians say was his central role. He was protected by the United States, his image remade, and kept on the Chrysanthemum throne as part of Washington's strategy to rebuild Japan.
The lack of public remorse by Hirohito, who died in 1989, is often cited as a crucial reason for the undercurrent of hostility in the region towards Japan for its wartime aggression.
The document, discovered by writer Kyoko Kato and published yesterday by Bungeishunju magazine, indicates that the emperor privately, and almost publicly, accepted his key role.
Richard Bernstein, writing in the NYT (June 13, 2003):
These days, the claw-shaped island in the Baltic Sea called Usedom is a family-oriented vacation resort, crowded especially on holiday weekends by members of the German middle class, playing in the sea or riding rented bicycles through the fragrant pine forests.
But the island, the very eastern part of which crosses into Poland, has a freighted history. Some vacationers, interested in mixing their pleasure with edification, visit the museum in Peenemünde on the island's northern claw, where the most important bit of history was enacted.
The museum is in a huge industrial shed that once housed the power plant for the top-secret complex where the Nazis developed and tested the V-1 missile, or buzz bomb, and the V-2 missile. The two missiles were, along with the atom bomb, the most momentous weapons devised during the war.
In the decades when the island was part of East Germany, this was a restricted military area. The rest of the island, once a preserve of fashionable people from Berlin, about four hours' drive away, was converted into a vacation area for workers, and some nonworkers, even if, as one person here put it, the hotels then were "East German standard," meaning not very good.
After the reunification of Germany, local people in Peenemünde organized a first version of the museum, using the vastness of the power station to assemble exhibits on the work that took place there. Six thousand people came in the first month, and, in the mid-1990's, the museum was expanded by the state government and opened in its new condition a year ago.
"Before, there was no overall concept; now there is," Peter Profe, the museum's deputy director, told a recent visitor. "The concept is to represent the two ends of the rocket's parabola, the takeoff at the beginning and what happened where it landed at the end. We try not to glorify the technical aspect."
In other words, the museum does what one would want it to do. It presents a largely technological story while making it clear that the technology was pursued on behalf of an evil regime that "craved world domination," as an explanatory sign in one of the first exhibition rooms puts it. There is a room dedicated to showing how the two halves of the Peenemünde population lived, the soldiers and engineers very well, the thousands of forced laborers who did the manual work badly.
Perhaps the moral heart of the Peenemünde Museum is a dark room with a simple circular installation of urban rubble, starkly illuminated by a spotlight on the ceiling.
"We couldn't really show the suffering the rockets caused," Mr. Profe said, "so we made this room a place of contemplation."
But there is much else in the museum, including a reminder that the scientists who first dreamed of creating rockets in Germany wanted to devise a faster means of travel and to open up space for exploration, which, paradoxically, they did. The museum displays a group portrait of 118 German scientists who once worked at Peenemünde, including Wernher von Braun, its technical director.
Rick Perlstein, writing in Dissent (June 2003):
Matthew A. Crenson and Benjamin Ginsberg have a creepy theory about what popular democracy is: in the nineteenth century, the powerful granted the powerless such privileges as the right to vote, civil rights, some small voice in politics, and the opportunity to buy government bonds; in return, the powerless-now classed as citizens-gratefully served in the military, paid their taxes, and allowed themselves to be administered. This is how Crenson and Ginsberg define their golden age-a time, now past, when "citizens were the backbone of the Western state, providing it with the administrative, coercive, and extractive capabilities to conquer much of the world." It's an unpromising way to begin a book whose title-Downsizing Democracy: How America Sidelined Its Citizens and Privatized Its Public-would tempt many of us to pick up.
On this shaky foundation they propound a thesis at least a little more promising. It is that America's elites have lately learned that they can conquer the world without bothering about citizens at all. "In one public setting after another," Crenson and Ginsberg write, "government disaggregates the public into a mass of individual clients, consumers, and contributors," leading to "new and nonparticipatory ways of doing business." Elites have exploited that development to counter a structural flaw within the old model: namely, that a mobilized citizenry cannot be controlled. With pesky citizens out of the way, the powerful can defend their interests in less risky ways-in courtrooms, "by manipulating administrative procedures," through privatization. Citizens are left subject to a "personal democracy" of individual access to government services and redress-which is, to these authors, always bad. This is a bit of a creepy theory as well. Because personal democracy is not all bad, any more than their golden age was all good.
What Crenson and Ginsberg, both Johns Hopkins professors, have produced is a series of not-so-well-linked portraits of a great number of governmental and quasi-government sieves that divert the possibility of ordinary citizens exerting influence over their workings. And it certainly can be said, to get the praise out of the way, that some of these portraits are impressive. The section on those innocent-sounding monsters called "government sponsored enterprises"-Sallie Mae, Freddie Mac, and so on-is devastating: they are revealed as mere conduits for middle-class welfare, for-profit corporations that receive all kinds of government-granted advantages and offer very little public good in return.
I also like one of their ideas about the transformation of the American party system: they point out that the sixty million presidential nonvoters would seem a political bonanza for whichever party would endeavor to tap it, but that the parties have little interest in doing so-as shown by Walter Mondale's advisers telling him "that the idea of mobilizing new voters was 'backward thinking.'" The authors then argue that the parties' antique goal of mobilizing the masses has been partly supplanted by a system of "new patronage," whereby parties rely on the power of their built-in base of loyal activists ensconced within sectors of the government they control: the social service bureaucracy for the Democrats and the military-industrial complex for the Republicans.
It's an interesting insight, and yet . . . Take one of their examples of the retreat of mass mobilization: the Florida Recount Show. Once upon a time, recounts could become festivals of popular democracy, with candidates mobilizing grassroots networks to swarm the canvassing boards to vouchsafe a favorable outcome. In 2000, by contrast, Al Gore and Joe Lieberman "spent hours on the telephone each day contacting contributors"-to help pay for the lawyers, of course. "I'm quite sure that the polls don't matter in this, because it's a legal question," Gore told a television reporter. Better he should have found a way to make the polls matter, because they showed that an overwhelming number of Americans favored the full recount that would have given him the victory. But that would have meant putting grassroots troops in the street; which, for today's parties, is even more backward than mobilizing habitual nonvoters. Instead, to the party that best activated its professional cadres went the spoils.
Crenson and Ginsberg don't mention that the Republicans were able to raise $13.8 million for the fight, Gore and Lieberman only a quarter of that. The distortion that campaign finance introduced into the system-weighted heavily toward the more corporate-friendly Republicans-is exactly what comes to mind first when most of us think about how democracy has been downsized. It doesn't figure here. But then, acknowledging institutionalized entrenchments of unequal power has never been a strong suit of American political science. Crenson and Ginsberg ignore such bedrock realities in favor of a rickety structural scheme, rooted in a beggared reading of history: that the mobilizing of the masses through party discipline is the only variable that matters when measuring democracy.
From the Associated Press (June 11, 2003):
The Pulitzer Prize Board said yesterday that it was reviewing a prize awarded in 1932 to a correspondent for The New York Times who has been accused of ignoring a forced famine in Ukraine that killed millions.
The review is the second by the board into the work of the correspondent, Walter Duranty, who covered the Soviet Union for The Times from 1922 to 1941, earning acclaim for an exclusive 1929 interview with Stalin. A similar inquiry in 1990 ended with a decision to let Mr. Duranty's Pulitzer stand.
Members of the Ukrainian Congress Committee of America urged the new review to coincide with the 70th anniversary of the famine, which claimed as many as seven million Ukrainians as Stalin imposed collectivization on a largely resisting populace.
"Like any significant complaint, we take them seriously," Sig Gissler, administrator of the Pulitzer Board, said yesterday of the accusations against Mr. Duranty. "They are under review by a board subcommittee."
The review was begun in April.
A 1990 book by S. J. Taylor, "Stalin's Apologist," found that Mr. Duranty had known of the famine but had ignored it to preserve his access to Stalin.
The Times has distanced itself from Mr. Duranty's work. His Pulitzer is displayed at the newspaper's headquarters with this caveat: "Other writers in The Times and elsewhere have discredited this coverage."
Toby Usnik, director of public relations at The Times, said, "The Times has reported often and thoroughly on the defects in Duranty's journalism as viewed through the lens of later events."
Mr. Gissler, of the Pulitzer Board, pointed out that while Mr. Duranty won the prize in 1932, the year the famine began, it was for reports he had written a year earlier.
In addition, Mr. Gissler noted, the Pulitzer is awarded for work in a single year rather than "a winner's body of work over time."
David Whitehouse, writing in BBC.com (June 9, 2003):
Humans may have come close to extinction about 70,000 years ago, according to the latest genetic research.
The study suggests that at one point there may have been only 2,000 individuals alive as our species teetered on the brink.
This means that, for a while, humanity was in a perilous state, vulnerable to disease, environmental disasters and conflict. If any of these factors had turned against us, we would not be here.
The research also suggests that humans (Homo sapiens sapiens) made their first journey out of Africa as recently as 70,000 years ago.
Unlike our close genetic relatives - chimps - all humans have virtually identical DNA. In fact, one group of chimps can have more genetic diversity than all of the six billion humans alive today.
It is thought we split from a common ancestor with chimps 5-6 million years ago, more than enough time for substantial genetic differences to develop.
The absence of those differences suggests to some researchers that the human gene pool was reduced to a small size in the recent past, thereby wiping out genetic variation between current populations.
Evidence for that view is published in the American Journal of Human Genetics.
From Hong Kong's South China Morning Post (June 7, 2003):
"Why do all my students admire Adolf Hitler?" This is the sort of remark which can effortlessly interrupt several surrounding conversations in the staff canteen.
Faced with a sizeable audience, my historian colleague, who has taught in a local university for the best part of 20 years, retracted some of it. The European History students were all right, apparently, because there was time to put them right about Herr Hitler's drawbacks as a national leader. European Civilisation students, though, did not get the full treatment and thus tended to emerge from the course with an intact respect for Adolf, based on the notion that he "made the country strong". This is distressing.
Taking a charitable view of Hitler's career you could perhaps say that if he had had the good sense to die in 1939, or even early in 1941, he might now be regarded as someone who was, like Chairman Mao, "right more than half of the time". But that is not what happened. By the time he did die he had not made the country strong. He had made it a basket case, the name of which stank in the nostrils of decent people everywhere. The problem, I suspect, is in Chinese history, which is colourful and interesting but prone to stereotypes. Most of China's lengthy recorded history is official history written by official historians, who would not have enhanced their career prospects had they opined that Emperor X would have done a better job if he had spent less time on bashing the Uighurs and more on putting a chicken in every pot.
So the scale of values is crude. Anything can be forgiven the man who makes the country strong. There is no more serious crime than to be weak. This view of politics, which we may call the "Hero" syndrome, after the film which makes much the same point, still permeates many Hong Kong schools, not to mention its government.
This leads in turn to a misunderstanding of the purpose of democracy, which is not to allow us to choose our own dictator, but to allow people collectively to make better decisions than any one of them could make on his or her own. And it leads to the misconception that the most effective leader is the one who gives the most vigorous orders.
Sandi Dolbee, writing in the San Diego Union Tribune (June 5, 2003):
Scott Farrell would like to bring back the Middle Ages. Or at least a part of it. The part about chivalry.
The 38-year-old Santee writer is the creator of Shining Armor Enterprises and a program called Chivalry Today, which includes a Web site (ChivalryToday.com) and seminars on "a reawakening of the code of chivalry" to improve our behavior. He's also working on a book about how chivalry can be used to infuse ethics into the 21st century.
Farrell, who over the last two decades has become a self-taught aficionado on the subject, envisions a society in which knights are not only pieces of a chess board, the thought of round tables conjures up more than images of a pizza joint, and noble conduct isn't something to joust about.
"What I want to try to portray is that the ideals of chivalry can be brought to light today and not just enshrined in the past," says Farrell, who with his goatee and dark locks looks as if he could step right into King Arthur's castle....
Despite Farrell's enthusiasm, the history of knighthood comes with some not-so-knightly baggage.
After all, it was the Middle Ages that brought us the Crusades and the Inquisition, neither of which was exactly a model of ethical behavior. And knights themselves were pretty much white Christian men who were part of a caste system that also left much to be desired.
Peter Arnade, an associate professor of history at California State University San Marcos, suggests that those smitten with that era are dabbling in revisionist history.
"They romanticize the period. They don't want to understand the period as it unfolded," he says.
Chivalry got started in southern France in the 11th and 12th centuries with a movement known as "courtly love," according to Arnade, with romantic poetry and songs that straddled the language of religion and love. Out of this evolved a code of ethics and honor for military people.
"The historians' take on this is that chivalry mostly is a fiction," Arnade says. "It was a feel-good movement among the military to make them feel like they were doing something other than smashing people's brains out, which is what they were doing.
"When somebody says they want to revive chivalry, I think what they mean by that is the person has taken the romantic fiction of chivalry and said, 'This is an honorable code of courtesy, of manners, and shouldn't we all have more manners?' To which I would reply, 'yes,' but I would go looking somewhere else for my code of manners."
Duncan Spencer, writing in the Washington Times (June 9, 2003):
Warren Getler, formerly a Wall Street Journal reporter and now editor-at-large for Bloomberg News, and Arkansas treasure hunter Bob Brewer have written an intriguing yet infuriating book about buried Rebel gold.
Mr. Getler is a skilled hand at storytelling. He spins a tale here of deep mystery and occult practices and Southern mysticism, slowly pulling back the curtain on a strange world.
The thesis is a legend. At the end of the Civil War, a certain faction among Southern leaders, both military and political, refused to accept surrender. Some fled to Mexico, others dispersed, either going underground deep in the rural South or pretending acquiescence to the Confederate surrender.
But, the legend goes, these die-hards were convinced that the South would rise again and, with the connivance of top leaders, hid millions of dollars in gold, jewels and precious metals in a number of "mother lodes" (as Mr. Brewer puts it), throughout the South and Southwest.
These men knew that they probably would never live to see the rebirth of the pro-slave South... but how to hide and safely pass on the knowledge and location of the treasure?
That, the authors argue, was one of the tasks of a secret organization known as the Knights of the Golden Circle.
The Knights have always been an obscure, almost mythical organization dwelling in the shadows of those who worship at the shrine of the "noble Confederate." Mr. Getler and Mr. Brewer manage to bring it to life to form the background of their adventure. In short, the two claim the Knights devised hiding places for the Confederate treasure, and knowing that it would be vigorously sought, designed a system of obscure symbols, maps and marks on stones and certain trees to translate the exact location of the treasure troves to generations to come.
The original Knights appointed a ring of "watch-keepers" who had the duty of protecting the lore and the location of the treasure and of handing the duty on to new generations.
Mr. Brewer, a retired Navy helicopter crewman, returned to his hometown of Hatfield, Ark., in 1977, a stranger to all but the outlines of this Southern lore. He had served two tours in Vietnam and was glad to settle back into the community where two previous generations of Brewers had lived out quiet lives. He thought the store-porch stories he heard while catching up with old-timers were simply fables. Except for one thing: As a boy he had heard the same tales of buried gold hinted at by his uncle, Odis Ashcraft, who had heard them from his father.
The tales and the talismans soon fascinated Mr. Brewer: marks on ancient beech trees, crude pictures carved on boulders. A skilled navigator, he was soon working with detailed topographical maps, a pair of dividers and a metal detector, drawing lines between local landmarks, then walking the lines and probing the earth.
The hobby became a passion when in the early '90s he found a pint jar stuffed with old coins, including gold pieces, all dating from 1802 to 1889, and valued at about $28,000. Mr. Brewer found another such trove a few months later. The hobbyist would become obsessed with deciphering the Confederate codes.
It's here that the authors begin to pull punches, however. After a long buildup of mysterious connections (the Knights apparently are related to the Masonic movement, and the famed outlaw Jesse James is strongly believed to have been one of the Knights' watch-keepers), Mr. Brewer leads the reader on treasure hunt after treasure hunt but never reveals (aside from a few caches of coins that might have been buried for a myriad of reasons) whether the mystic maps have led him to a "mother lode."
In one adventure, he's outmaneuvered by a duplicitous partner, who steals the Oklahoma "Wolf Map" treasure Mr. Brewer discovers. In another important treasure hunt, the so-called Dutchman treasure in Arizona, he finds what the maps and markers tell him is the spot, only to conclude that it would be too difficult to retrieve because it's on government land.
Louis Fisher, writing in the Duke Law Journal (November 2002):
No constitutional language authorizes the president to withhold documents from Congress, nor does any provision empower Congress to demand and receive information from the executive branch. The Supreme Court has recognized the constitutional power of Congress to investigate,1 and the president's power to withhold information,2 but those powers would exist with or without judicial rulings. Over the past two centuries, Congress and the President have insisted that the powers are necessarily implied in the effective functioning of government. No doubt they are. The difficult and unpredictable issue is how to resolve two implied powers when they collide. Some judicial opinions provide guidance, but most of the disputes are resolved through political accommodations.
A lengthy study in 1949, expressing the executive branch position, asserted that federal courts "have uniformly held that the President and the heads of departments have an uncontrolled discretion to withhold . . . information and papers in the public interest, and they will not interfere with the exercise of that discretion."3 That statement, incorrect when written, is even less true today as a result of litigation and political precedents over the past half century. Similarly inaccurate is the claim that "in every instance where a President has backed the refusal of a head of a department to divulge confidential information to either of the Houses of Congress, or their committees, the papers and the information requested were not furnished."4 Congress and its committees have enjoyed a more successful record than that. Finally, the 1949 study seriously understated the coercive powers of Congress when it claimed that the heads of departments "are entirely unaffected by existing laws which prescribe penalties for failure to testify and produce papers before the House of Representatives or the Senate, or their committees."5 Congress may hold both executive officials and private citizens in contempt.
What informs the process of congressional access to executive branch information is the constitutional structure of separation of powers and the system of checks and balances. Neither political branch has incontestable authority to withhold information or force its disgorgement. When these executive-legislative clashes occur, they are seldom resolved judicially. Accommodations are usually discovered without the need for litigation. On those rare occasions where these disputes enter the courts, judges typically reject sweeping claims of privilege by elected officials while encouraging the two branches to find a satisfactory compromise.6 Courts look to legal precedent, "and legal precedent is much too inflexible to apply in individual cases of executive-legislative disputes."7 The outcome is more likely decided by the persistence of Congress and its willingness to adopt political penalties for executive noncompliance. Congress can win most of the time -- if it has the will -- because its political tools are formidable.
Although the congressional power to investigate is not expressly provided for in the Constitution, the framers understood that legislatures must oversee the executive branch. At the Philadelphia Convention, George Mason emphasized that members of Congress "are not only Legislators but they possess inquisitorial powers. They must meet frequently to inspect the Conduct of the public offices."8 Charles Pinckney submitted a list of congressional prerogatives, including: "Each House shall be the Judge of its own privileges, and shall have authority to punish by imprisonment every person violating the same."9 The Constitution, however, provided no express powers for Congress to investigate or to punish for contempt. What was left silent would be filled within a few years by implied powers and legislative precedents.
Michael Janofsky, writing in the NYT (June 5, 2003):
For more than 120 years, Pat Garrett has enjoyed legendary status in the American West, a lawman on a par with Wyatt Earp, Bat Masterson, even Matt Dillon. As sheriff here in Lincoln County in 1881, Garrett is credited with shooting to death the notorious outlaw known as Billy the Kid, a killing that made Garrett a hero. For years, a patch bearing his likeness has adorned uniforms worn by sheriff's deputies here.
But now, modern science is about to interrupt Garrett's fame in a way that some say could expose him as a liar who covered up a murder to save his own skin and reputation.
Officials in New Mexico and Texas are working out plans to exhume and conduct genetic tests on the bodies of a woman buried in New Mexico who was believed to be the Kid's mother and a Texas man known as Brushy Bill Roberts, who claimed to be the Kid and died in 1950 at the age of 90. If test results suggest that the two were related, it would add new evidence to a long-held alternative theory that Garrett shot someone other than the Kid and led a conspiracy to cover up his crime.
Such skepticism is hardly uncommon. Disputes over major events in the Old West have engaged historians almost since they happened. The debate over Billy the Kid is one of the longest-running.
Beyond renewing interest in the Kid saga, the possibility that testing could enlarge Garrett's reputation or destroy it has even caught the fancy of Gov. Bill Richardson of New Mexico, who has offered state aid for the investigation and a possible pardon that an earlier New Mexico governor had once promised to the Kid for a murder he committed.
"The problem is, there's so much fairy tale with this story that it's hard to nail down the facts," said Steve Sederwall, the mayor of Capitan, N.M., who is working with Lincoln County's current sheriff, Tom Sullivan, to resolve the matter. "All we want is the truth, whatever it is. If the guy Garrett killed was Billy the Kid, that makes him a hero. If it wasn't, Garrett was a murderer, and we have egg on our face, big time."
No matter what the genetic testing may show (and it might not show much of anything), it is hard to overstate the prominence of Garrett and the Kid in Western lore, especially here in southeastern New Mexico, where their lives converged during and after the gun battles for financial control of the region that were known as the Lincoln County War. The Kid's notoriety grew after he and friends on one side of the conflict killed several men in an ambush, including Garrett's predecessor, Sheriff William Brady. For that, the Kid was hunted down, captured by Garrett, found guilty of murder and taken to the Lincoln jail, where he was placed in shackles to await hanging. He was only 21.
Today, the tiny town of Lincoln, population 38, is a memorial to what happened next. More than a dozen buildings, including one that housed the jail, have been preserved as a state monument that attracts as many as 35,000 visitors a year.
Historians generally agree that the Kid, born Henry McCarty and known at times as William H. Bonney, escaped after it became apparent that Gov. Lew Wallace had reneged on a promise to pardon him in exchange for information about other killings in the county war. On April 28, 1881, the Kid managed to get his hands on a gun, kill the two deputies assigned to watch him and leave the area on horseback.
But then stories diverge, providing fuel for two major theories of where, when and how the Kid's life ended.
The version embraced here and supported by numerous books and Garrett relatives is that the Kid made his way to a friend's ranch in Fort Sumner, about 100 miles northeast of Lincoln. The ranch owner, Pete Maxwell, was also a friend of Garrett's and somehow got word to Garrett that the Kid was in the area. After arriving, Garrett posted two deputies at the door.
As the Kid approached on the night of July 13, he spoke a few words in Spanish to the deputies, who did not recognize him. But Garrett, waiting inside, knew the voice. When the Kid walked in, Garrett turned and shot him in the heart....
But just as the story of Garrett as hero has flourished over the years, so have others, including the tale of Brushy Bill of Hico, Tex. His trip to New Mexico in 1950 to seek the pardon he said he was denied nearly 70 years before gave new life to an alternative possibility, that Garrett had not killed the Kid at all, but a drifter friend of the Kid's named Billy Barlow.
This story holds that Garrett and the Kid may have been in cahoots for some reason and that Garrett had stashed a gun in the outhouse at the jail that the Kid used to kill the deputies. Even if only part of that is true, it would strongly suggest that Garrett killed the wrong man.
Fred Kaplan, writing in Slate (June 3, 2003):
When tomorrow's historians go to write the chronicles of decision-making that led to Gulf War II, they may be startled to find there's not much history to be written. The same is true of Clinton's war over Kosovo, Bush Sr.'s Desert Storm, and a host of other major episodes of U.S. national security policy. Many of the kinds of documents that historians of prior wars, and of the Cold War, have taken for granted (memoranda, minutes, and the routine back-and-forth among assistant secretaries of state and defense, or among colonels and generals in the Joint Chiefs of Staff) simply no longer exist.
The problem is not some deliberate plot to conceal or destroy evidence. The problem (and it may seem churlish to say so in an online publication) is the advent of e-mail.
In the old days, before the mid-to-late 1980s, Cabinet officials and their assistants and deputy assistants wrote memos on paper, then handed them to a secretary in a typing pool. The secretary would type each one on a sheet of paper backed by two or three carbon sheets, then file the carbons. Periodically, someone from the National Archives would stop by with a cart and haul away the carbons for posterity.
Nobody does this today. There are no typing pools to speak of. There are few written memos.
Eduard Mark, a Cold War historian who has worked for 15 years in the U.S. Air Force historian's office, has launched a one-man crusade to highlight, and repair, this situation. He remembers an incident from the early '90s, when he was researching the official Air Force history of the Panama invasion, which had taken place only a few years earlier. "I went to the Air Force operations center," Mark says. "They had a little Mac computer on which they'd saved all the briefings. They were getting ready to dump the computer. I stopped them just in time, and printed out all the briefings. Those printouts I made are the only copies in existence."
That was a decade ago, when computers were not yet pervasive in the Pentagon and many offices still printed important documents on paper. The situation now, Mark says, is much worse.
Almost all Air Force documents today, for example, are presented as PowerPoint briefings. They are almost never printed and rarely stored. When they are saved, they are often unaccompanied by any text. As a result, in many cases, the briefings are incomprehensible.
The new, paperless world has encouraged a general carelessness in official record-keeping. Mark says that J5, the planning department of the Joint Chiefs of Staff, does not, as a rule, save anything. When I talked with Mark on the phone Tuesday, he said he had before him an unclassified document, signed by the Air Force chief of staff and the secretary of the Air Force, ordering the creation of a senior steering group on "transformation" (the new buzzword for making military operations more agile and more inter-service in nature). The document was not dated.
Laurence Brahm, writing in the South China Morning Post (June 2, 2003):
Many a prank call to the Foreign Ministry begins: "Hello, is Li Hongzhang there?" But operators at the ministry will have a hard time finding Li - as he was a foreign minister in the Qing dynasty nearly 100 years ago.
Many of the prank callers are young people disappointed with what they perceive as China's money-centred foreign policy - the joke being that Li has, for years, been officially reviled as a traitor who sold out China's interests to foreigners.
In contrast with these publicly accepted views of Li is a shocking turnabout that is already under way in the Chinese media. Li's new, improved image is being presented in the television series Marching Towards a Republic, which explores sensitive issues, such as why systemic corruption led to dynastic collapse during the Qing era.
The series recently ended on CCTV 1, the mainland's most prissy, politically correct national station, and has left people wondering what is going on. Li's character is the good guy through the 50 episodes, an unprecedented rewriting of history.
Li's era, spanning late Qing and early Republican history, was a time of regional warlordism, dynastic corruption and foreign control over industry and trade. It was a period which has been officially denounced in education and media circles since 1949, and nobody would dare compare it with China's situation today.
But Marching Towards a Republic is sending some confusing messages about the Li legacy, making the programme the hottest topic of conversation since Sars. After 50 years of official disdain, Li is being relabelled a "patriot with historic responsibility".
Official press commentaries call Li an outstanding diplomat who tried his best to reduce China's losses by signing many unfair treaties. What is going on? The programme contains more shockers. Empress dowager Cixi is presented as a nice, vulnerable woman whose weaknesses were having her head in the clouds and trusting her family members too much - starkly contrasting with all the post-1949 depictions of her as an evil, scheming empress.
The programme's message seems to be that the Qing dynasty's collapse was precipitated by Cixi's adherence to centralised control, while Yuan Shi-kai, father of China's warlord era, is depicted graciously uniting the regional warlords, effectively promoting a republic, or federal system.
Many people criticise regional economic "warlordism" in China today, where local officials disregard central policies and behave like mafia - features hauntingly reminiscent of China in the 1920s and 1930s. Could CCTV 1 have a sequel in the works?
The surprises do not end there. The programme associates Li with opening China to foreign investment through the Yangwu Yundong, or Foreign Affairs Movement. Even Sun Yat-sen, the father of modern China, recognised by both communists and nationalists, is depicted as not opposing foreign involvement in national affairs.
In fact, four-fifths of Li's historical records have never been published, which means even historians know little about him. He did sign more than 30 treaties with foreigners, and these are generally accepted by foreign and local historians as unfair to China.
History professor Fang Delin, of Peking University, says: "History is not so easy to turn upside down ... Li Hongzhang - representing current rational diplomacy as a tool to solve country-to-country disputes - is, from today's point of view, a correct, positive historic figure. But you cannot deny that Li Hongzhang surrendered. You cannot say because he paid compensation to Japan as they asked, to avoid war, that he contributed to China's national interest.
"Bargaining isn't done that way. When Li Hongzhang negotiated with Russia, he instructed his negotiators to accept whatever Russia asked for. Isn't that a sellout?"
Nicholas Wade, writing in the NYT (May 27, 2003):
History books favor stories of conquest, not of continuity, so it is perhaps not surprising that many Englishmen grow up believing they are a fighting mixture of the Romans, Anglo-Saxons, Danes, Vikings and Normans who invaded Britain. The defeated Celts, by this reckoning, left their legacy only in the hinterlands of Ireland, Scotland and Wales.
A new genetic survey of Y chromosomes throughout the British Isles has revealed a very different story. The Celtic inhabitants of Britain were real survivors. Nowhere were they entirely replaced by the invaders and they survive in high proportions, often 50 percent or more, throughout the British Isles, according to a study by Dr. Cristian Capelli, Dr. David B. Goldstein and others at University College London.
The study, being reported today in Current Biology, was based on comparing Y chromosomes sampled throughout the British Isles with the invaders' Y chromosomes, as represented by the present-day descendants of the Danes, Vikings (in Norway) and Anglo-Saxons (in Schleswig-Holstein in northern Germany).
The survey began as a request from the British Broadcasting Corporation to look for genetic signatures of the Vikings in England, later broadened to include the Danes and Anglo-Saxons. Dr. Goldstein said that not enough money was available to study two other invaders, the Romans and the Normans, but that he felt that their demographic contribution had probably been small.
He assumed the original inhabitants of Britain could be represented by men living in Castlerea, in central Ireland, a region not reached by any foreign invader. In a study two years ago Dr. Goldstein and colleagues established that Y chromosomes of Celtic populations were almost identical with those of the Basques.
The Basques live in a mountainous refuge on the French-Spanish border and speak a language wholly unrelated to the Indo-European tongues that swept into Europe some 8,000 years ago, bringing the agricultural revolution of the Neolithic period. Hence they have long been regarded as likely remnants of the first modern humans to reach Europe some 30,000 years ago, during the Paleolithic.
By this chain of reasoning, the Celtic-speaking men, since genetically very close to the Basques, must also be drawn from the original Paleolithic inhabitants of Europe, and probably represent the first modern human inhabitants of Britain who settled the islands some 10,000 years ago, Dr. Goldstein said. These original Britons must later have adopted from Europe both the Celtic culture, evidence of which appears from some 3,000 years ago, and the Celtic language, which is a branch of the Indo-European language family.
Having identified Y chromosomes assumed typical of the original Britons, Dr. Goldstein and his team could assess the demographic impact of the invaders. They found that the Vikings left a heavy genetic imprint in the Orkneys, the islands off the northeast coast of Scotland, which were a center of Viking operations between A.D. 800 and 1200. Many men in York and east England carry Danish Y chromosomes. But surprisingly, there is little sign of Anglo-Saxon heritage in southern England.
Sarah Honig, writing in the Jerusalem Post (May 27, 2003):
[A few days ago] I read that Canada had rejected the immigrant visa application of a Lebanese man identified only as Mr. X. He was disqualified due to an Israeli connection, arising from his service in the now-defunct South Lebanese Army. The information he passed on about Hizbullah could have helped democratic Israel foil some of the most bloody-minded terrorists ever, and even apprehend them.
But tipping Israel off rendered Mr. X undesirable in Canadian eyes and suspected of crimes against humanity.
Canada, he was informed, doesn't admit war criminals.
This is where you could have knocked me down with a feather.
Jewish groups, and not they alone, have long accused Canadian governments of complicity in harboring numerous suspected Nazi criminals. The 1986 Deschenes Commission of Inquiry on War Criminals painted a dismal picture of how Nazis were virtually welcomed into postwar Canada and how successive administrations ignored warnings that they were offering safe haven to war criminals.
Justice Jules Deschenes wanted the 580-page report to gain wide readership, but Ottawa first withheld it and then heavily censored its contents, presumably because it documents persistent anti-Jewish bias within its federal bureaucracy and a lack of vigor in scrutinizing immigrants.
It wasn't difficult for umbrella groups of Canadians of German, Ukrainian, Latvian and Lithuanian extraction to lobby friendly politicians to overlook the dubious pasts of visa applicants. A tireless campaign by influential Quebecois won waivers for French Nazi collaborators. The lobbyists represented sizable voting blocs and caving in to their pressure paid off.
But while ex-Nazis and their assistants enjoyed Canadian asylum, it was a different story for those of their Jewish victims who managed to stay alive.
They were decidedly unwanted. Ample evidence is available in the book None Is Too Many by historians Irving Abella and Harold Troper.
Canadian restrictions on Jewish immigration predate Hitler's fall. According to the Wiesenthal Center, "from 1933 to 1948 Canada's doors remained closed to Jews. Canada had arguably the worst record among all Western states in granting sanctuary to refugees from Nazi Germany."
Kitchener's native son served as Canada's premier (for the third time) from 1935 to 1948.
IT DIDN'T take long for Jewish "new Canadians" to realize that their tormentors lived literally next door. Thus back in 1948 the Canadian Jewish Congress informed the Royal Canadian Mounted Police that Lithuanian immigrant Antanas Kenstavicius had supervised the massacres of thousands of Jews when he was stationed as police chief in Svencionilla from 1941 to 1944. But the Mounties didn't rush to get their man.
Only on January 22, 1997, did deportation hearings begin against Kenstavicius, and he died on that very day, aged 90, having lived a full life and never paying for crimes he occasionally bragged about.
Siddhartha Deb, an Indian writer, in the Washington Post (June 1, 2003):
EVERY NATION IMPARTS self-serving myths and legends to its young, but in recent years few countries have done so quite as avidly as India. In classrooms from Kashmir to Karnataka, a new history is being produced by a resurgent right-wing Hindu movement. One finds a number of curious stories being peddled to schoolchildren: Aryans sallying out from India to settle Iran, Homer adapting ''The Iliad'' from the Ramayana, Christ roaming the Himalayas in search of Hindu wisdom.
These claims will sound unlikely even to the hardened Indophile, but they are being promoted in government-sponsored textbooks, columns by right-wing journalists, and paintings commissioned to adorn public spaces. The Hindu fundamentalist vision of history presumes that the Indian subcontinent is an exclusive Aryan-Hindu preserve. Hindu ideologues dismiss strong historical evidence that the area once contained a mix of peoples, and that the Aryan people migrated there from central Asia.
Their purist idea of India has become visible even in America. On March 25, a vocal group of well-dressed Indian nationalists disrupted a Columbia University panel on India and Pakistan, forcing the moderator to abruptly halt the discussion. In April, some 2,000 signatures appeared on an online petition protesting the appointment of Romila Thapar, a secular scholar of ancient India, to a research chair at the Library of Congress.
Such incidents are old news in India, where liberal, secular, and left-wing historians have been under attack since the early 1990s, when the right-wing Bharatiya Janata Party (BJP) began its rise to power. The BJP originally focused its complaints on a 16th-century mosque in Ayodhya, a symbol to them of the long history of Muslim conquest and plunder. The BJP, along with its allied organizations the Rashtriya Swayamsevak Sangh (RSS) and the Vishva Hindu Parishad (VHP) (known collectively as the ''Sangh family''), claimed the mosque had been built on the birthplace of the Hindu god Rama. Historical evidence was hard to come by, but the BJP leader L.K. Advani (who has since become deputy prime minister) gave public appearances as Rama riding his chariot. Subsequently, the mosque was demolished by Sangh cadres.
In 1998, the BJP won national elections and began taking control of the country's leading scholarly bodies. A national curriculum for schools run by the central government was proposed the same year, with the objective of replacing history textbooks by the country's most reputed scholars, many of whom have a secular or left-wing orientation.
In May 2002, the education ministers of 16 states walked out of a conference to protest the right-wing bias of the new curriculum, while three leading scholar-activists filed a petition with the Supreme Court challenging the publication of new textbooks. The petition was turned down, however, and ''India and the World'' and ''Contemporary India'' made their appearance last year.
At first sight, the new textbooks seemed notable only for their bad photographs, cluttered maps, occasional typos, and the insouciance of introductory statements like ''The twentieth century world witnessed umpteen developments of far reaching consequences.'' Once the liberal press had subjected the textbooks to close readings, however, a pattern emerged from the ''umpteen developments'' left out.
The anti-left bias, as in the offhand description of Lenin as the leader of a ''coup,'' was expected. But some other distortions were less so. The Holocaust, for example, is significantly absent from the discussion of Nazi Germany in ''India and the World,'' and Gandhi's assassination by a right-wing Hindu isn't mentioned in ''Contemporary India.'' The books criticize German nationalism not for genocide, war, pogroms, and book-burning, but merely for a false superiority complex premised on ''so-called Aryan blood.'' The real Aryans, or Hindus, omnipresent in all aspects of the Indian subcontinent, are a different matter altogether.
Robert Matthews, writing in the Sunday Telegraph (June 1, 2003):
LONDON -- It was one of the most famous experiments in science: Generations of schoolchildren have been taught how Benjamin Franklin, the 18th-century American inventor and statesman, risked his life flying a kite in a thunderstorm to prove that lightning was a form of electricity.
Franklin's success brought worldwide fame, but a new study of his work suggests that the inventor actually invented the story.
According to the official version of events, in the summer of 1752 Franklin devised a simple way of testing his theory that lightning was caused by an electrical buildup. He constructed a kite fitted with a metal spike and flew it during a thunderstorm.
Textbook accounts say that electricity ran down the kite's cord to a key tied near the end, creating a spark when Franklin brought his knuckle close to it.
His work led to the invention of the lightning conductor, which has since saved countless lives. He was made a member of the Royal Society in London, the world's most prestigious scientific academy, and received the society's premier award, the Copley Medal, in 1753 "on account of his curious experiments and observations on electricity."
According to a new study of the historical evidence, however, the experiment that proved the theory took place only in Franklin's imagination.
Tom Tucker, a lecturer and historian at the Isothermal Technical College in North Carolina, has examined the original documents describing the experiment, and found differing accounts of it by Franklin that were vague about when or where it was performed.
"There was no witness identified in the announcement, no location named -- and nowhere does Franklin say he actually performed the experiment," said Mr. Tucker.
Mr. Tucker's suspicions were confirmed when he tried to recreate Franklin's experiment exactly -- using materials available in the mid-18th century.
"I followed the design of the kite and tried it several times -- and it just wouldn't fly."
According to Mr. Tucker, even if it had got off the ground, there was no way it could have reached the heights needed to draw electricity from thunderclouds. He then tried the experiment using a modern kite, but that did not work, either.
Mr. Tucker sets out his evidence in Bolt of Fate, the first detailed analysis of Franklin's kite-flying claims, to be published June 24.
While he debunks the experiment, Mr. Tucker stressed that Franklin's theory was entirely correct. "I think he invented the story to claim some active involvement in the science -- to show that he was not just making a suggestion."
Alexander Stille, writing in the NYT (May 31, 2003):
When Shiite Muslims in Iraq took to the streets to protest the presence of American troops as well as Saddam Hussein, was the world witnessing the birth of nationalism? When President Bush used the term crusade to describe the war on terrorism, was he inadvertently revealing religious roots in American patriotism? In short, is religious sentiment, long considered the prime enemy of nationalism, actually one of its founding elements?
This iconoclastic theory has been gaining ground among historians. Until recently, there was a growing scholarly consensus that nationalism was a distinctly modern phenomenon, a product of post-Enlightenment culture. Public celebrations of the Fatherland, the creation of national anthems and devotion to the flag all occurred in the wake of the French and American Revolutions.
As several essayists show in the 1995 collection "The Invention of Tradition," edited by Eric Hobsbawm and Terence Ranger (Cambridge University Press), many of the great national traditions we tend to think of as originating in the mists of the distant past like the clan tartans of the Scottish highlanders were 19th-century inventions, meant to generate national pride.
But Peter Sahlins, a historian at the University of California at Berkeley, who is working on a book on the nature of citizenship in early modern France, says the idea that religious intolerance is the "original sin" of nationalism is getting more and more attention. "I think it's a healthy corrective to the modernist consensus," he said.
Mr. Sahlins notes that prevailing theories of nationalism have a way of following the mood of the times. When Serbs, Croats and Muslims were killing one another in the Balkans, many commentators originally pointed to the eternal and atavistic origins of ethnic violence, not recognizing that the different groups had lived in relative harmony under the Ottoman Empire and even under Tito.
"Now the context in which we see nationalism has completely changed," he said. Faced with the threat of Islamic fundamentalism, the West is more open to looking at the role of religion in the formation of nationalism.
One of the most recent contributions to this trend is "Faith in Nation: Exclusionary Origins of Nationalism" (Oxford University Press, 2003), by Anthony W. Marx, a professor of political science at Columbia University, who was recently named president of Amherst College. Mr. Marx insists that the birth of nationalism dates to a time when religious intolerance ravaged Europe. He begins his book in 1492, the year that King Ferdinand and Queen Isabella, who united Castille and Aragon to form the new kingdom of Spain, ousted the Moors from Southern Spain and decided to expel the Jews from their territory. The Spanish Inquisition, Mr. Marx writes, was a central mechanism in consolidating power and conferring legitimacy on the new Spanish state. ...
Mr. Marx quotes the British historian Lewis Namier, who once wrote that "religion is a 16th-century word for nationalism."
Catherine de Medici, Queen Regent of France, similarly exploited religious passion. After trying to mediate between the Catholics and the Huguenots, she manipulated anti-Huguenot feeling and in 1572 helped plot the infamous St. Bartholomew's Day Massacre, in which more than 15,000 Protestants were slaughtered in and around Paris.
"Nationalism thus began to emerge by piggybacking on the passion of religious conflicts," Mr. Marx writes.
Still, many scholars are skeptical.
"If the point is that state building enlisted religion, then that makes good sense, but I am not sure we are talking about nationalism," said Eugen Weber, a professor emeritus of history at the University of California at Los Angeles and an expert on French nationalism.
Modernists like Mr. Weber and others insist that the early modern states were fundamentally different, multilingual, multiethnic entities in which the sense of nation had not yet been firmly established. "The kings of Spain governed over modern Belgium and Austria, as well as parts of what are now France, Italy, Slovenia and Croatia," said David A. Bell, a historian who argues that the French Revolution was the critical event in establishing French nationalism.
Marjane Ambler, writing in the Tribal College Journal of American Indian Higher Education (Spring 2003):
Many of us non-Indians try to find books with diverse heroes for our children and grandchildren. Our eyes have been opened to the sins of our ancestors by writers such as the late Dee Brown (Bury My Heart at Wounded Knee), Vine Deloria, Jr. (Custer Died for Your Sins), and Howard Zinn (A People's History of the United States). However, we might not realize how biased we are in remembering various American figures. Years ago I read a column by Tim Giago (now publisher of Lakota Journal) pointing out that his Lakota people did not share our heroes. While I lost my copy of the column, its message stuck with me and led to the theme for this issue.
"Rational" people often question tribal creation stories, saying they are merely mythology. We are just beginning to realize that the history textbooks we rely upon for a scientific accounting of our pasts are actually filled with half-true creation stories, written from the viewpoint of a particular ideology. Our history lessons are not much more complicated than the cowboy and Indian movies we watched on television; they make it much too easy to tell the black hats from the white hats.
It is deeply disquieting when we encounter the other side of people whom we had classified as villains or heroes:
* One of the most startling, for example, is Richard Nixon. Vilified for his corruption in the Watergate scandal, Nixon was a hero in the context of advancing Indian policy. In an address to Congress in 1970, Nixon introduced the self-determination policy that for the first time recognized the right of Indian tribes to control government programs, including schools. (Interview with Helen Schierbeck and Tom Davis, who were involved in Indian-controlled schools in the 1970s.)
* Dakota people remember Abraham Lincoln with mixed feelings for his role after the 1862 Dakota conflict in Minnesota. The Dakota Sioux were starving, and the federal agent said, "Let them eat grass." In the ensuing battle, hundreds were killed on both sides. More than 300 Dakota were condemned to death by hanging, but Lincoln reduced the list to 38, feeling compelled by political pressure to punish that many despite the lack of evidence. At about the same time that he signed the Emancipation Proclamation, Lincoln signed their death warrant. (Interview with Dr. Elden Lawrence, Dakota historian.)
* When Thomas Jefferson listed the offenses of King George III in his draft of our country's Declaration of Independence, he said George incited the "merciless Indian Savages." The man who sent Meriwether Lewis and William Clark on their peaceful exploration voyage vacillated between calling the Indians "my children" and calling for their extermination. (Richard Drinnon's Facing West: the Metaphysics of Indian Hating and Empire Building)
* Lewis and Clark were army officers who met each other fighting Indians in the Ohio Valley before they embarked on their voyage of discovery. Clark owned York, his body servant during the voyage. Although York played an important role in making the Indians friendlier toward the explorers, Clark and our nation mistreated him after the journey. He did not get the double pay and land grants that others in the group received after their return. (July 8, 2002, Lewis and Clark special issue, Time.)
In 1992, the United States celebrated the Christopher Columbus Quincentenary Jubilee. American Indians felt they had to force their message on event planners and the public: Columbus was no hero to them. Protestors poured blood-red paint on statues of Columbus and hanged him in effigy. Columbus is reviled not just as the icon of change, when the Old World of Europe met the Old World of the Americas. He personally led the genocide and enslavement of the indigenous peoples.
From 2003-2006, the United States will be "commemorating" the Lewis and Clark Corps of Discovery Bicentennial. Organizers are not calling it a "jubilee" or even a "celebration." American Indians, including people associated with tribal colleges, have taken significant roles in planning the events from the very beginning. Gerald Baker (Mandan-Hidatsa) is the National Park Service superintendent of the Lewis and Clark National Historic Trail. Both he and Amy Mossett (formerly of Fort Berthold Community College) grew up on the Fort Berthold Indian Reservation in North Dakota. Mossett is co-chair of the Circle of Tribal Advisors for the Lewis and Clark Bicentennial, which formed in October 2000 because they were "wary of yet another anniversary of discovery." Dr. Rudi Mitchell, president of Nebraska Indian Community College, also serves on the circle.
In a column last year, Oglala Lakota journalist Tim Giago accused the participating tribes of being sellouts. By participating, however, they intend to use the bicentennial as an opportunity for education and reconciliation. The Circle of Tribal Advisors wants to do more than just clarify the important role of the tribes in the explorers' work. Along with their state and federal partners, they can promote a legacy that will outlive the four-year commemoration, such as cultural sensitivity, protection of sacred sites along the route, and language perpetuation programs. They clearly are having an impact: Each of the 15 "signature events" along the trail must involve tribes; and funding guidelines for state and federal grants related to the bicentennial reflect the circle's goals.
John Pilger, writing in the New Statesman (May 26, 2003):
British imperial power has been second to none in covering up, even romanticising, its crimes, projecting itself as benign and wise, even a gift to humanity. With every generation comes new mythologists. 'When a well-packaged web of lies has been sold gradually to the masses over generations,' observed the American sage Dresden James, 'the truth will seem utterly preposterous and its speaker a raving lunatic.' A brilliant, exciting and deeply disturbing book, published this month, unwraps the whole package, layer by layer, piece by piece. This is Web of Deceit: Britain's real role in the world by Mark Curtis (Vintage).
Curtis's history could not be more timely, for not in my memory has there been such an expose of private revelations and true intentions, told largely from official files. I know of no other living historian who has mined British foreign policy archives as devastatingly. From Africa to south-east Asia, Chechnya to Iraq, Curtis provides documented evidence of British foreign policy as 'one of the leading supporters of terrorism in the world today . . . a simple fact never mentioned in the mainstream political culture'. Most of his primary sources have long been in the public domain: a fact that shames silent, mainstream journalism.
It was Mark Curtis who was among the first to reveal the scale of British complicity in the bloodbath that brought General Suharto to power in Indonesia in 1965-66 (and had difficulty getting a newspaper to publish his findings).
He describes a total silence in the 1960s when the Labour government of Harold Wilson supplied warships, logistics and intelligence in support of Suharto. The slaughter of up to a million people was simply ignored in Britain; the headlines said that communism had been defeated in Indonesia and 'stability' restored.
What has changed? Not much. At the Labour Party conference in 2001, Tony Blair declared his 'moral commitment' to the world. 'I tell you,' he said, 'if Rwanda happened again today as it did in 1993, when a million people were slaughtered in cold blood, we would have a moral duty to act.' The following day, as Curtis points out, Blair's statement was reported without a single journalist reminding the British people that their government had contributed to the slaughter in Rwanda.
From official files, Curtis describes how the British government 'used its diplomatic weight to reduce severely a UN force that, according to military officers on the ground, could have prevented the killings. It then helped ensure the delay of other plans for intervention, which sent a direct green light to the murderers in Rwanda to continue. Britain also refused to provide the capability for other states to intervene, while blaming the lack of such capability on the UN. Throughout, Britain helped ensure that the UN did not use the word 'genocide' so the UN would not act, using diplomatic pressure on others to ensure this did not happen.' Not a word about this appeared in the British media at the time.
A similar silence has shrouded the shocking story of Diego Garcia. Last year, a report in the Washington Post alleged that the United States had 'rendered' alleged al-Qaeda prisoners for interrogation (tortured them) at the US base on Diego Garcia in the Indian Ocean. This is British territory 'leased' by the United States without the agreement of the inhabitants. As Curtis documents, the 1,500 Ilois people were, to use the official term, 'removed' from their homeland in the Chagos island group in 1966 by the Wilson government. This ruthless dispossession, secretly executed so that the largest island, Diego Garcia, could be handed to the American military, was, as the files show, 'the subject of systematic lying by seven British governments over near four decades'. The Ministry of Defence even denied that the island had been populated at all. BBC newsreaders routinely echo this. A high court action giving the people the right of return has been ignored by the Blair government. 'Violating international law,' writes Curtis, 'has become as British as afternoon tea.'
The final chapter, 'The Mass Production of Ignorance', describes a virulent media censorship by omission that is not conspiratorial, more a celebration of 'one key concept: the idea of Britain's basic benevolence . . . the idea that Britain promotes high principles - democracy, peace, human rights and development - in its foreign policy'.
Ronald Radosh, writing in National Review (June 2, 2003):
On May 5, Senators Susan Collins, Maine Republican, and Carl Levin, Michigan Democrat, stood together in Sen. Joseph McCarthy's old hearing room to announce the release of previously closed transcripts of executive-session hearings conducted by McCarthy in 1953 and 1954. Collins said she hoped "the excesses of McCarthyism will serve as a cautionary tale"; Levin added that "history is a powerful teacher." But it's important to relay accurately what history tells us -- and here, both the press coverage of the new material and the statements of Donald A. Ritchie, the Senate historian who edited the hearings, leave a great deal to be desired. According to Ritchie, the hearings serve only to "confirm what most people thought" about McCarthy -- that he created a paranoid and irrational system, one in which (as Sen. Levin said) innocents were deprived of "due process and respect for individual rights." The press echoed this view....
Much of the press coverage assumes that most of those called before McCarthy's committee were complete innocents, whose lives were possibly ruined by having been called to testify. But let's look at an example: McCarthy's investigation of possible Communist activity at Fort Monmouth in New Jersey, the Signal Corps base at which executed atom spy Julius Rosenberg had once worked. McCarthy and his aide Roy Cohn were convinced that Communist sympathizers at the base were still supplying information to the Soviets. Forty-two civilian employees were suspended; after investigation, all but two had their jobs reinstated. Ritchie notes that most of those who testified, even those who were obviously Communists, were involved only in union organizing for a Communist front union, and were not guilty of subversion or espionage.
But this analysis begs the question of whether, in a period of extreme Cold War tension, Communists and their sympathizers had a right to government jobs in which they might affect national security. Back in 1941 -- in the era of the Nazi-Soviet Pact -- Communists precipitated a political strike at the North American Aviation Company in California, purely for the purpose of interfering with America's defense effort. Communist-led unions could easily, at a moment of crisis, have attempted similar actions.
Furthermore, many of the witnesses called before the committee were picked by Cohn -- who had been one of the government's lawyers in the Rosenberg trial. Cohn was aware that since Julius and Ethel Rosenberg had never talked, many of those involved with their espionage network managed to either escape to the Soviet Union or otherwise avoid prosecution. Hence many of the people called by McCarthy were those peripherally involved with the Rosenbergs, people whom Cohn reasonably suspected of involvement in the spy network. What Cohn had in mind was to use the hearings to provide further evidence about that conspiracy. Most of the witnesses, like Max Finestone, invoked the Fifth Amendment when asked about their probable participation in the espionage ring. McCarthy, for example, asked Finestone whether it was "correct that you are in touch with the remainder of the Rosenberg ring . . . ?" When I interviewed Finestone decades later, in the late 1970s, he still refused to answer any substantive questions.
McCarthy's questioning of two of Rosenberg's couriers, Michael and Ann Sidorovich, is also treated by Ritchie as proof of his somewhat cartoonish version of McCarthy; but the Venona evidence from Moscow shows that the Sidoroviches were in fact full-time KGB agents, who moved to Cleveland on KGB instructions in order to be close to government scientist William Perl, a major Rosenberg source. The Sidoroviches were given a $500 bonus and a Leica camera for the filming of documents.
When the couple testified, they invoked their Fifth Amendment rights when asked about espionage. McCarthy and Cohn knew the Sidoroviches were guilty, but they did not have the necessary evidence -- or the Venona decrypts -- that would have made a compelling case. So McCarthy dropped his interrogation of some of the small-fry guilty like the Sidoroviches, and -- not caring about the truth himself -- tried to make a splash by attacking people like Owen Lattimore, who, although a fellow-traveler, was not a spy at all....
History is only rarely simple and neat; the new revelations about the McCarthy hearings show it to be its usual messy and complicated self. The zeal to use the recent disclosures merely to once again bash Joe McCarthy -- and then even ricochet that McCarthy-bashing into a condemnation of the Bush administration -- sets back the effort to achieve an honest understanding of the McCarthy chapter.
William F. Buckley, writing in National Review (May 20, 2003):
What John F. Kennedy did was despicable.
Never mind his abstract indifference to adultery. What he did was to seduce a 19-year-old girl working in the White House under his command. A Great King, seeking that day's vessel for his runaway appetite. The commander in chief opportunizing on his rank in order to overwhelm a teenager who, as Sidey reports, was once spotted in the presidential limousine in Bermuda, "sitting on the floor of the car like a child playing hide-and-seek." It is simply disgusting, to use a word which, like virtue, has lost its license.
Just how will the touted reduction in reputation reveal itself? Not in historian Robert Dallek's book, which outed the grandmother -- he is the most indulgent Camelotian in town, going so far in his book as to express certainty that if JFK had lived, Vietnam would never have happened.
It's bad stuff.
E.J. Dionne, writing in the Pittsburgh Post-Gazette (May 29, 2003):
Last month, a CNN/USA Today/Gallup Poll asked Americans whom they regarded as the greatest American president. Not surprisingly, Abraham Lincoln came in first at 15 percent. But Kennedy was right behind him at 13 percent. In 2000, a Gallup Poll found that Kennedy actually ranked first, with a four percentage point lead over Lincoln.
What accounts for Kennedy's appeal?
At a dinner last week, Dallek -- his biography of Kennedy, "An Unfinished Life," is already a best seller -- noted all the reasons why Kennedy shouldn't hold the position he does.
While he was president, almost all of Kennedy's domestic initiatives "fell by the wayside," Dallek says. He was responsible for the catastrophe of the Bay of Pigs in Cuba. He substantially increased the number of American advisers in Vietnam, setting the course that Lyndon B. Johnson carried through to full escalation. He covered up his many health problems and all the medication he was taking, something Dallek believes no president should ever be allowed to do again. And in his personal life, as Dallek noted, Kennedy was "so reckless."
Yes, Kennedy was a martyr. But as Dallek notes, William McKinley was assassinated in 1901, and he did not enjoy the same esteem 40 years after his death as Kennedy enjoys now.
Kennedy certainly profits from being the first president to understand the power of television. There is spectacular footage of Kennedy that has been replayed over and over since his death in 1963. What we see are the moments of vigor and wit, grace and strength. Kennedy would turn 86 on May 29, but "he is frozen in our minds at the age of 46."
But image, Dallek insists, isn't the only factor at work. There is, first, Kennedy's standing as a hero to America's vast array of ethnic groups. As the first Catholic elected to the presidency, Kennedy conferred "a sense of legitimacy" upon all minority groups. And while Kennedy was the privileged kid who didn't have to serve in World War II -- his many health problems would have gotten him out -- he actively sought combat duty.
As for Kennedy's domestic initiatives that didn't get passed while he was alive, many of them eventually did become law under Lyndon Johnson -- the civil rights bill, the tax cut, the war on poverty, Medicare, federal aid to education. Though Dallek notes that LBJ went beyond Kennedy's program, Kennedy shares the credit for much that became the Great Society. A slow convert to civil rights, Kennedy is well-remembered by African Americans for finally embracing the cause.
And as he did in life, Dallek says, "Kennedy works both sides of the street to this day and has both conservative support and liberal support." Conservatives still use Kennedy's tax cut and his tax-cutting rhetoric to confer a blessing on their own contemporary proposals (to the consternation of his brother, Sen. Edward M. Kennedy). Liberals remember much of the rest of his record, including his battle for a nuclear test ban treaty and his aversion to war, demonstrated by his careful handling of the Cuban missile crisis. As for Vietnam, Dallek believes that Kennedy was seeking a way out, though he acknowledges that the argument about what Kennedy would have done about the war "will go on forever."
NYT (May 29, 2003):
After 18 years of almost daily lectures about surviving the atomic bomb dropped here on Aug. 6, 1945, Setsuko Iwamoto's stories to classrooms full of students have a finely limned quality about them, as smooth as pebbles in a creek.
There is no straining for melodrama as the 71-year-old woman recounts how her skin seemed to melt and pour off her arms after the flash, or how whatever scraps of cloth that could be found were used by people to protect themselves from the black rain that fell afterward.
Stories of survival do not get much more compelling. But Ms. Iwamoto worries now, with Japan inching toward rearmament, that the spirit of Hiroshima and the moral power of her story are fading.
Each year, she said, the stares of the students she faces from the podium grow blanker, just as their questions about the atomic bombing grow more stilted, appearing rehearsed rather than heartfelt.
"Just a few years ago, most schoolteachers had direct memories of the war," said Ms. Iwamoto, who said she was found to have cancer last year but appeared hale. "That's not the case at all anymore, though, and I wonder once this kind of lecture ends, how effectively the experience of war is taught.
"In my day we had trouble just surviving every day, whereas these days everyone in Japan is comfortable," Ms. Iwamoto added. "Children learn about war through manga [comic books] and think it is kind of cool. They have no particular sensation of Japan's defeat."...
Hiroshima's entire image and economy are linked to the horrendous final days of World War II, and city officials say visits by Japanese travelers are locked in a serious, long-term decline, broken only by a modest spike since the Sept. 11, 2001, terrorist attacks in the United States.
Commissions have been formed to reverse the trend. A museum on the grounds of the Peace Park, near ground zero, has been expanded and modernized. In the hope of popularizing visits here, even a manga has been created to celebrate the memory of Sadako Sasaki, a 12-year-old who died of blood cancer years after the bombing.
"We are faced with the challenge of conveying this experience to the next generations," said Noriyuki Masuda, associate director of the Hiroshima Peace Memorial Association. "At some point we realized that what we had was a crisis involving young people's consciousness. We have been facing a change in attitudes and a decline of interest in Japan as a nation."
Born on May 25, 1803, Emerson is closer to us than ever on his 200th birthday. In America, we continue to have Emersonians of the left (the post-pragmatist Richard Rorty) and of the right (a swarm of libertarian Republicans, who exalt President Bush the second). The Emersonian vision of self-reliance inspired both the humane philosopher, John Dewey, and the first Henry Ford (circulator of The Protocols of the Learned Elders of Zion ). Emerson remains the central figure in American culture and informs our politics, as well as our unofficial religion, which I regard as more Emersonian than Christian, despite nearly all received opinion on this matter.
In the domain of American literature, Emerson was eclipsed during the era of TS Eliot, but was revived in the mid-1960s and is again what he was in his own time, and directly after, the dominant sage of the American imagination. I recall sadly the American academic and literary scene of the 1950s, when Emerson was under the ban of Eliot, who had proclaimed: "The essays of Emerson are already an encumbrance." I enjoy the thought of Eliot reading my favourite sentence in the essay, "Self-Reliance": "As men's prayers are a disease of the will, so are their creeds a disease of the intellect."
It delights me that, in 2003, there is an abundance in Emerson to go on offending, as well as inspiring, multitudes. "O you man without a handle!" was the exasperated outcry of his disciple, Henry James Sr, father of William the philosopher-psychologist and Henry the novelist. Like Hamlet, Emerson has no handle, and no ideology. And like another disciple, the greatest American poet, Walt Whitman, Emerson was not bothered by self-contradiction, since he knew he contained endless multitudes: "A foolish consistency is the hobgoblin of little minds."
Throughout the Depression and until the "stagflation" of the 1970s, Keynes' ideas were ascendant. (The University of Chicago was a free-market outpost.) In the early 1980s, Margaret Thatcher in Britain and Ronald Reagan in the United States implemented principles of Hayek's philosophy.
"Commanding Heights" shows how the 1982 Falklands War "saved" Thatcher. Before the war, she had the lowest poll ratings of any prime minister since they started taking polls, Yergin said. But the victory gave her the political capital to de-nationalize British industry and sell shares to the public.
The Thatcher reforms wrote the economic script for large parts of the world, Yergin told UPI. "It's very striking that of the major European economies, Britain is doing best."
A dramatic revelation is Thatcher's decisive role in liberating Poland from communist rule. The prime minister made seeing Solidarity leader Lech Walesa a precondition of her 1988 trip to Poland. The two met at a dinner at the home of Walesa's priest, an event captured by a Solidarity cameraman and shown in "Commanding Heights."
"That film had never been seen until we found it," Yergin said.
"You don't say no to Mrs. Thatcher," said Walesa, interviewed for the series. He called her visit "crucial," an event without which he and his Solidarity partners might have been "destroyed."
In August 1990, after Solidarity won a national election, the head of Poland's Communist Party called Soviet leader Mikhail Gorbachev for instructions. Let the election stand, Gorbachev said in the phone call that ended the Cold War.
NYT (May 25, 2003):
History abounds in what-ifs. If Lincoln had not been shot, the South, and the nation, might have accepted full racial equality a century earlier. If Wilson had not been felled by a stroke, the United States might have joined the League of Nations and prevented World War II. And if John F. Kennedy had not been killed in Dallas, could the Vietnam War have been avoided?
In "An Unfinished Life: John F. Kennedy, 1917-1963" (Little, Brown), the presidential historian Robert Dallek again raises the old question. He argues that Lyndon Johnson misread Kennedy's intentions regarding Vietnam, increasing the American commitment rather than seeking the gradual withdrawal that Kennedy may have intended.
People have powerful reasons for wanting to believe this argument. Not long after Kennedy's death the country went from one crisis to another, stumbling through years of race riots and civil disobedience, the counterculture revolt, the crumbling of authority and the erosion of traditional values.
Above all, the war consumed the nation in anger, disillusion and self-doubt. Shock waves generated in the 1960s have never fully abated, and are seen today in the rhetoric of political correctness and neoconservatism.
Mr. Dallek, citing newly available documents and his reappraisal of the existing record, maintains that Kennedy was planning to wind down the Vietnam commitment during his second term, and instructed Defense Secretary Robert S. McNamara to chart a possible withdrawal by 1965. Kennedy, he argues, was becoming skeptical of the hawkish advice he was getting from his military and political advisers. The ill-fated Bay of Pigs landing, which his advisers urged on him, gave good reason for that caution.
At the height of the Cuban missile crisis, it later became known, Kennedy overruled his advisers and accepted a secret deal with the Soviet Union that ended a confrontation that threatened nuclear war. Kennedy even mused in private conversations with journalists about the possibility of lifting the embargo on Cuba in return for Castro's neutrality in the cold war.
If this was Kennedy's intention, why did Johnson continue the impasse with Cuba and substantially expand the involvement in Vietnam? Because, Mr. Dallek maintains, the dead president's chief foreign policy aides, Mr. McNamara, Secretary of State Dean Rusk and the national security adviser, McGeorge Bundy, told him that was what Kennedy wanted. Since Kennedy had never shared any doubts with his vice president, Johnson, a novice at foreign affairs and eager to prove that he was marching in Kennedy's footsteps, did what he was persuaded he must do.
That Kennedy was planning to pull out of Vietnam after his re-election is, Mr. Dallek admits, hotly contested by other historians, an old argument that "can never fully be put to rest."
NYT (May 24, 2003):
Now that a new book about Douglas says he lied about (or at least embellished) many parts of his life, from having had childhood polio to graduating second in his law school class, some scholars are again asking what such information reveals about the legacy and work of judges, especially those on the Supreme Court.
In "Wild Bill: The Legend and Life of William O. Douglas" (Random House, $35), Bruce Allen Murphy, a professor of civil rights at Lafayette College in Easton, Pa., argues that Douglas's sloppiness in framing and writing Supreme Court decisions and his penchant for falsehood stemmed from boredom with the high court and a deep need to invent the person he wanted to be: a politician.
Yet after romping through more than 700 pages about Douglas's lies and his meanness to his wives and children, some scholars take issue with the idea that Douglas's, or any other judge's, personal biography is crucial in determining a judicial legacy.
"These traditional biographies are pointless if you're interested in understanding the significance of the judge as a judge," said Richard A. Posner, a judge for the United States Court of Appeals for the Seventh Circuit and a senior lecturer at the University of Chicago Law School.
Citing a book of his, Mr. Posner continued: "I said what are needed are critical studies, as opposed to biographies. The relation between personal character and professional reputation is often nonexistent. People operate on different tracks. Very successful people, when you look at them, they very often turn out to be psychological basket cases. They were successful in part because they were driven by some deep personal insecurity."...
Professor Murphy stands by his work, saying: "I tried to look at Douglas's life and connect the image with the decisions he was writing. They do connect.
"If you read who he says he was in his autobiographies, you can see the iconoclast, left-wing, anti-government jurist. But much of what he wrote about himself cannot be confirmed. Now, who was Douglas really? He was a very frustrated and disappointed politician."
That frustration translated into sloppiness on the high court, Professor Murphy said. Douglas's inattention to detail can be found even in one of his most influential opinions, Griswold v. Connecticut, which defended the right of marital privacy and is seen as a precursor to Roe v. Wade, which affirmed a woman's right to abortion, Professor Murphy writes.
The first draft of Douglas's opinion about the sacred association of marriage (coming from a much-married and unfaithful judge) was hastily written and narrowly constructed, he says. It was Justice William J. Brennan Jr. and his law clerks, he said, who expounded on the right of privacy in this case, which was brought to the court by the state's Planned Parenthood League.
UPI (May 22, 2003):
In these days of cynicism about corporate America, is it possible to make even the robber barons fashionable again? Thomas Kessner, a historian at the City University of New York, is giving it a shot in a new book on the subject that reads like a Horatio Alger throwback: those guys in the 19th century, he says, were BOLD, VIGOROUS, SELF-SUFFICIENT, and they MADE US STRONG. You can almost hear Marv Albert saying "Yes!"
You certainly can't accuse Kessner of trying to be trendy. In the 1990s, it suddenly became fashionable among American businessmen to quote Adam Smith again -- which doesn't say much for Smith, since the decade ended with dot-com bubbles, Enrons and recession. But there was one Big Gorilla philosopher, Kessner points out, who was even more uncompromising than Smith.
Nobody much reads him anymore, but there was a 19th century English philosopher named Herbert Spencer who applied Charles Darwin's theories of evolution to morality. Actually he decided that there WAS no morality, only "survival of the fittest," and that when people are too weak to survive, or the unemployed starve to death, well, too bad, that's the way the universe works and it's nature's way of keeping humanity strong.
The businessmen of America's Gilded Age -- Carnegie, Rockefeller, Morgan, Vanderbilt, Gould -- weren't the kind of guys who normally read philosophy, but they loved Spencer, and they feted him like a king when he showed up in New York in 1873, the panic year, when a lot of the unfit ended up on the scrap heap of history. Andrew Carnegie, who wasn't known as a bookish man at the time, idolized Spencer and said his writings proved that anyone who tampers with society -- or, heaven forbid, the markets! -- is trying to turn us back to the dark ages.
In other words, greed is the engine of our salvation. So they believed, so they acted, and so, remarkably, Kessner pretty much backs them up in his new book, "Capital City: New York City and the Men Behind America's Rise to Economic Dominance, 1860-1900" (Simon & Schuster, 396 pages, $27).
In a way it's refreshing to find someone who is willing to boldly sing the praises of the old coots in the swallow-tail coats. It hasn't been fashionable to do that for about, oh, 90 years now. But Kessner is convinced that J. Pierpont Morgan, the architect of the modern monopoly, is the savior of the country.
He makes a good case for the modern Wall Street corporation being basically Morgan's invention and says the rules haven't changed very much. He goes even further to say that, if Morgan had not stepped in with various schemes to combine industries and prop up the gold supply, the whole edifice of American capitalism might have come tumbling down. (No matter that Morgan's commissions on all those deals were exorbitant even by the standards of contemporary CEOs. This is the man who built ships for the Spanish-American War that he sold to his country at breathtaking markups.)
The problem with ALL books about the robber-baron era, though, is that there's just no way to make any of these guys sympathetic. In a way, Kessner proves as much by making the absolute best case for the achievements of the business titans -- they brought us back from the devastation of the Civil War, they brought order out of chaos by inventing new kinds of securities, bonds, futures and corporate paper, they propelled us past Europe in manufacturing and finance, and they created capital in such vast amounts that the country was able to leap forward in one generation from provincial backwater to world economic leader. And yet ...
OK, fine, all of that's true. But how do you get around the fact that the biggest liars, cheats and charlatans tended to win all these "survival of the fittest" battles?