Did You Know? Hot Topics
Ben Hoyle, writing in the London Times (Jan. 28, 2004):
WHEN the historians are let loose in the Bill Clinton presidential library, they will have four million White House e-mails to trawl through. Fortunately for impatient biographers, it seems that the great man sent only two.
Perhaps one of the great political communicators in recent history preferred the personal touch or a telephone call, according to Skip Rutherford, president of the Clinton Presidential Foundation, which is raising money for the library. "He's not a technoklutz," Mr Rutherford said.
One of the Clinton messages was a test e-mail to see if the Commander-in-Chief could press the "send" button -- which leaves John Glenn, the former Ohio senator and the first American to orbit the Earth, with a further distinction. He was the only person to receive an e-mail from President Clinton.
The President e-mailed Glenn, then 77, when he returned to space, after 36 years, aboard the space shuttle Discovery in 1998. "We are very proud of you and the entire crew, and a little jealous," the message read. "Back on Earth, we're having a lot of fun with your adventure."
From the London Times (Jan. 30, 2004):
Who is, or was, the world's most prolific author in terms of words written?
... [T]he ancient scholar Didymus of Alexandria (1st century BC) is credited with having written between three and four thousand books (of which a single papyrus, found in a rubbish heap, survives).
Rhazes, polymathic Persian author of several hundred works, once recalled: "In a single year I have written as many as 20,000 pages in a script as minute as that used for amulets. I was engaged fifteen years upon my great compendium, working night and day, until my sight began to fail and the nerves of my hands were paralysed".
His contemporary, the Arab historian Tabari, is said to have written forty pages every day throughout his long life. Voltaire left behind 15 million words; his letters alone fill 100 volumes. Jean-Paul Sartre wrote up to 10,000 words a day.
The bibliography of Bertrand Russell lists more than 3,000 published items and itself fills three volumes; he also wrote 40,000 letters. A.C. Benson, Master of Magdalene College, Cambridge, published 100 books and kept a diary of 5 million words, filling 180 manuscript volumes, enough to fill 40 volumes of print; after his death, whole shelf-fuls of unpublished books, stories, essays and poems were consigned to the flames.
Hugh Levinson, producer of BBC Radio 4's "The Blood Libel," writing in BBC News (Jan. 23, 2004):
For hundreds of years, it's been said that Jews kill Christian children and drain their blood for ritual purposes. Why has this myth persisted for so long?
Raya Beilis still remembers the day in 1911 that her father was arrested by the Kiev secret police, and put on trial for the alleged Jewish ritual murder of a Christian boy.
"I was too young to understand," says Raya, who now lives in a nursing home in New York City."All I knew was that they said if he's guilty, they're going to kill every one of you."
The authorities in Kiev said Mendel Beilis had lured a teenager called Andrey Yustschinsky away from his family, killed him and drained his blood for the production of matzah, the unleavened bread eaten at Passover.
The court threw out the charges, which were clearly fabricated. Mendel Beilis was freed and the feared pogrom against the Kiev Jews never happened. But where did this bizarre accusation come from?
The origins of this anti-Semitic myth, known as the blood libel, lie in medieval England. In 1144 a skinner's apprentice called William went missing in Norwich. When his body was found, the monks who examined the corpse claimed that the boy's head had been pierced by a crown of thorns.
Some years later, a monk called Thomas began to gather evidence about William's death. His main aim was to establish the boy as a holy martyr and draw pilgrims to the cathedral. Almost as an incidental matter, he accused the Jews of Norwich of killing the boy.
"The unforeseen outcome of what Thomas did was to create the blood libel, which then itself takes on a life of its own," says Dr Victor Morgan, of the University of East Anglia.
Hysteria, not evidence
The accusation that Jews would drain the blood of children and then use it for ritual purposes is bizarre, as Judaism has a powerful taboo against blood. Indeed, kosher butchering is meant to remove all blood from meat. But the idea seems to have had a powerful hold on the mediaeval imagination.
"It's not just an act of murder and of a ritual murder," says Professor Robert Wistrich, of the University of Jerusalem.
"Removing the blood from the body and then using it for a ritual or religious purpose - there is something horrific, but yet as fascinating as it is repulsive in this notion."
The blood libel spread across England and Continental Europe over the centuries, with hundreds of accusations, all based on hysteria rather than evidence. There were notorious blood libel cases in Lincoln in 1255 and Trento, Italy, in 1475. Many Jews were executed. Others were killed by mobs seeking revenge.
There was another rash of accusations in the late 19th and early 20th centuries in Eastern Europe - societies gripped by economic transformation and political uncertainty, climaxing with the Beilis case of 1913.
Even though the blood libel has been disproved countless times, it refuses to fade away. Racist groups in the US still sell videos which maintain that Jews commit ritual murder.
Preston Lerner, writing in the LAT Magazine (Jan. 18, 2004):
In 1961, then an ambitious, irrepressible 22-year-old flight instructor, [Wally] Funk was the youngest of 13 women who were secretly evaluated as candidates for NASA's space program. In several tests, she and her cohorts outperformed the men--the Mercury 7--who would rocket so famously into history. But America wasn't ready for female astronauts. "The time wasn't right," Funk says. "And the old-boy network didn't want us." The program was killed before it got off the ground, and the female pilots, who much later were dubbed the Mercury 13, faded into obscurity.
Yet more than 40 years after her brief stint as an understudy, Funk still hungers for a star turn on an astral stage, and there's nothing she won't risk to achieve her lifelong dream of rocketing into space. Her life savings? Check. Her reputation? No problem. Her life? In a heartbeat. She has signed on as a test pilot for Interorbital Systems, a tiny Mojave-based company with grand plans to make her the first human to fly into space in a privately funded spacecraft. This unprecedented launch could occur within a year if adequate funding is secured--a really big if.
For now, Funk has the publicity machine cranked up to redline. This summer morning she's at Santa Monica Airport to fly the flag while competing in the Palms to Pines Air Race from Southern California to Oregon. Seventy-five years ago, this was the starting point of the country's first transcontinental air race for women. Amelia Earhart and Pancho Barnes were among the celebrated aviatrixes (as they were known in those days) who flew in the inaugural Air Derby. Today, for better or worse, female pilots no longer fascinate the public. So instead of the star-studded crowd on hand in 1929, the atmosphere at the airport is as sedate as lunchtime at a laundromat--except for the whirlwind being kicked up by the human tornado with a shock of short white hair.
Although Funk reluctantly admits to being 64, she doesn't look or act her age. Dressed in red cargo pants and work boots, she's trim, athletic, gregarious and immensely likable--think instant confidante, a ball of fire who greets acquaintances with hearty hugs and refers to friends old and new as "babe." But the first thing most people notice is her tireless energy and boundless enthusiasm for the task--usually tasks--at hand. Her teammate in the race, Lou Ann Gibson, smiles indulgently as Funk wipes down their Cessna 172 while orchestrating photos, conducting interviews and mingling with a group of well-wishers large enough to constitute an entourage.
Gibson will pilot while Funk navigates as they compete against 19 other two-woman teams over the next two days. Gibson, an American Airlines pilot, is one of 800 or so students who have soloed under Funk's tutelage. In 1958, when Funk earned her wings, pilot instructor was just about as high as a woman could go in aviation. But she soared higher still with pioneering jobs as an inspector for the Federal Aviation Administration and an accident investigator for the National Transportation Safety Board in Los Angeles. She also completed her astronaut training on her own after the Mercury 13 program fizzled, even though NASA never showed the slightest inclination to send her into space....
Funk's Cessna looks puny and insignificant as it waits on the wide expanse of blacktop, and her dream of spaceflight seems far, far away. Then again, could Orville Wright have imagined, as he skimmed along the sand dunes of Kitty Hawk in 1903, that Charles Lindbergh would cross the Atlantic in 1927? That Chuck Yeager would break the sound barrier in 1947? That Neil Armstrong would walk on the moon in 1969? Can Wally Funk fly into space in 2004? She's got the ability. God knows she's got the drive. She's in the right place. Who's to say it's not the right time?
Ben Franklin's Musical Invention Enjoys a Revival (posted 1-14-04)
Brendan Miniter, writing in the WSJ (Jan. 14, 2004):
One of America's Founding Fathers invented a musical instrument that inspired original scores from Mozart, Beethoven and other greats. That instrument is the glass armonica (named after the Italian word for harmonic), devised by Benjamin Franklin in 1761. And out of all of his inventions, Franklin once said it was the one that gave him the "greatest personal satisfaction." But for more than a century and a half this once-popular instrument -- which employed glass bowls stacked horizontally inside one another and mounted on a small table -- sat in disrepute, nearly lost to history.
Now, however, it's enjoying a bit of a revival. Thanks to the hard work of a handful of men and women over the past 20 years, the glass armonica is being heard at festivals, at elementary-school concerts and in at least one movie score. And Philadelphia's Franklin Institute will mark the inventor's 298th birthday on Saturday by having the instrument played during its celebration....
Franklin found he could make beautiful, haunting music using glass bowls if they had a hole in their center and were stacked inside one another while mounted on a horizontal rod. He dipped his fingers in water, spun the bowls using a foot treadle and then played them almost like a piano. Except that he could sculpt each note by varying the speed of the bowls and the amount of pressure he applied -- similar to how a violinist uses a bow.
The idea of using glass to make music didn't originate with Franklin. It was already centuries old when he watched music being made from drinking glasses -- tuned by being filled with varying amounts of water -- while in England in the late 1750s and early '60s. But Franklin wanted to make the process less cumbersome. So with the help of London glass blower Charles James he figured out how to tune a glass bowl by varying its thickness.
Franklin spent much of the American Revolution as a diplomat in France and often played his instrument for parlor audiences. Soon Europeans fell in love with it and began building their own. One story has Franklin curing Polish Princess Izabella Czartoryska of "melancholia" by playing the armonica for her. She liked it so much he gave her lessons. Marie Antoinette is said to have studied the instrument. Mozart and his father, Leopold, heard the armonica in Vienna in the 1770s. Wolfgang "has played upon it," Leopold wrote his wife. "How I should love to have one." And in 1791, the younger Mozart composed an Adagio for solo armonica and the Adagio and Rondo for armonica, flute, oboe, viola and cello.
By the 1820s and '30s, however, the armonica was gaining a reputation for driving musicians out of their minds. Marianne Kirchgessner eventually went insane after touring Europe playing it. J.C. Muller warned of its effect on the "temperament" in a 1788 instruction manual. Today many suspect the armonica's leaded glass and paint to be the real culprit, perhaps even contributing to Beethoven's likely lead poisoning.
How the Zip Code Changed America (posted 12-30-03)
John Schwartz, writing in the NYT (Dec. 28, 2003):
When the Postal Service introduced its Zone Improvement Plan in 1963, the mundane goal was to identify the mail delivery station associated with an address. It drew a border between past and present, says Edward Tenner, the author of "Why Things Bite Back: Technology and the Revenge of Unintended Consequences." What resulted was a more efficient mail system, but also "a new style of demographic and social analysis, marketing and clustering" that shapes everything from the allocation of bargain fliers and mail-order catalogs to the placement of stores.
Does the Stock Market Do Better Under Democrats or Republicans? (posted 12-17-03)
Stephen J. Glain, writing in the Boston Globe (Dec. 12, 2003):
According to an article published in the October issue of The Journal of Finance, share prices have for much of the last century fared better under Democratic presidents than Republicans. Using a broad index of stock prices, professors Pedro Santa-Clara and Rossen Valkanov of the University of California at Los Angeles found that from 1927 to 1998 the stock market returned about 11 percent more a year than Treasury bills under Democrats, versus about 2 percent more under Republicans.
Treasury bills also performed better under Democratic presidents, yielding 5.3 percent, compared with 3.7 percent under Republicans.
The cause of the disparity is uncertain, according to the study, which suggested share prices may have more influence over the outcome of presidential elections than a sitting president has over share prices.
"In sum," the authors write, "the market seems to react very little, if at all, to presidential election news."
Why Is Washington DC So Much Smaller than the Founders Envisioned? (posted 12-16-03)
Derrill Holly, writing for the AP (Dec. 13, 2003):
On Dec. 12, 1800, the federal government officially moved to the District of Columbia. And 203 years later, historians say the congressionally created seat of government would be a far different place if politicians had remained true to the vision of the founding fathers.
Had the nation's capital remained 100 square miles -- and included what are today the city of Alexandria and Arlington County, Va. -- it might have nearly 900,000 residents and be a commercial trading center rivaling New York or Philadelphia.
"A wide swath of what is now northern Virginia was actually part of the district," noted Robert Bernstein, a Census Bureau spokesman. "In 1800 it was just a small area of 14,000 people and a lot of it was rural."
President George Washington personally took part in the positioning of the south cornerstone for the "seat of government at Jones Point" in 1791. The stone, eight miles north of his Mount Vernon estate, was the first marker placed as surveyors plotted a federal site, measuring 10 miles on each side, as authorized by the first Congress earlier that year.
Several other stone blocks along Virginia Route 7 also are among the surviving markers. Others exist at the boundary of the district and Maryland.
"We have a piece of pottery here that's marked with the maker's mark that says Alexandria, D.C., which strikes people as odd," said Jim Mackay, director of the Lyceum, Alexandria's history museum.
Until 1847, what was then known as Alexandria County was one of three major jurisdictions in the District of Columbia. The others were Georgetown and what was originally called Washington County, both formerly part of Maryland.
"Washington was very adamant that the capital be located as close to Alexandria as possible," said Mackay. The first president and others who promoted the site on the banks of the Potomac River envisioned the region becoming the cultural and economic epicenter of the young nation.
But members of Congress with seaport constituencies opposed appropriating federal funds to build public wharves and other facilities that would have benefited the capital area. Congress also passed the Residence Act, precluding construction of government buildings on the west bank of the Potomac.
"Alexandria didn't receive any benefits that Georgetown and Washington County received," said T. Michael Miller, the Alexandria City historian. All of this fed retrocession sentiments that had existed as early as 1801.
The effort finally succeeded in 1846, when George Washington Parke Custis -- the grandson of Martha Washington through her first marriage -- reversed his long-standing opposition to breaking up the district. Custis opposed retrocession for years as counter to the wishes of George Washington, who became his stepfather after his parents died.
"If that hadn't occurred, the district would be a fiscally more viable city, not as dependent on the federal government," said Kenneth R. Bowling, a historian at George Washington University. The city's tax base would also be larger, and its population would be more diverse.
Another useful bit of information is the origin of "spider hole," a phrase used by Lt. Gen. Ricardo Sanchez to describe the dugout hiding place in which the fugitive Saddam was cowering.
This is Army lingo from the Vietnam era. The Vietcong guerrillas dug "Cu Chi tunnels" often connected to what the G.I.'s called "spider holes" — space dug deep enough for the placement of a clay pot large enough to hold a crouching man, covered by a wooden plank and concealed with leaves. When an American patrol passed, the Vietcong would spring out, shooting. But the hole had its dangers; if the pot broke or cracked, the guerrilla could be attacked by poisonous spiders or snakes. Hence, "spider hole."
They Can't Both Be Right (posted 12-12-03)
From History Today (Dec. 5, 2003):
The Edward Jenner Museum in Gloucestershire and the George Marshall Medical Museum in Worcester are both displaying the horns of a cow used in the development of the smallpox vaccine by Dr Edward Jenner in the late 18th century. The George Marshall museum’s Dr Frank Crompton acknowledged: “I have communicated with the Jenner Museum and we have come to the conclusion that we cannot be absolutely certain which ones are genuine.”
How the Smithsonian Finally Got an African American History Museum (posted 12-11-03)
Bruce Craig, writing in his newsletter on behalf of the National Coalition for History (Dec. 11, 2003):
Perhaps the most significant history-related accomplishment of this Congress ... is enactment of legislation (H.R. 3491) to establish within the Smithsonian Institution the National Museum of African American History and Culture. This legislation is the culmination of a 15-year effort by the principal sponsor of the bill -- civil rights leader, Rep. John Lewis (D-GA).
Since 1988 Lewis has introduced legislation creating the museum, but for one reason or another his bills failed in the House or Senate. Politics makes for strange bedfellows: because public opinion polls suggested low popularity of Republicans within the African-American community, the Republican leadership took direct action to boost support within this community. Consequently, under orders from their leaders and the White House, rank-and-file Republican congressmen enthusiastically embraced various funding and legislative proposals designed to benefit the African-American community, including Lewis's long-ignored bill. Republicans have reason to be proud for enacting this legislation, which repeatedly failed for partisan reasons when the Democrats controlled Congress.
Why Is There a Pyramid on U.S. Money? (posted 12-10-03)
From the newsletter of the American Revolution Round Table (Dec. 2003):
According to British scholar David Ovason, the Great Seal on the dollar bill reveals America's destiny. Ovason, who wowed reviewers with The Secret Architecture of Our Nation's Capital a couple of years ago, claims that there are two significant images on the dollar bill -- the truncated pyramid and the American eagle with the shield at its midsection, both framed in circles. The greatest secret is the pyramid, which includes the irradiated triangle that seems to complete the larger structure below it.
The pyramid, with its lopped-off capstone -- the Egyptians revered the top -- was an historic reality. "The Islamic invaders, once they captured Egypt, removed the face of the pyramid at the top and used it to build their mosques in Cairo," Ovason says.
"It means that for man to return to his spiritual heritage, as we must do eventually, the Americans are charged with the destiny of replacing that pyramid. It's the destiny of the U.S. to build on the foundations already given. The stones on the foundation bear the date 1776. So that means the pyramid is specific to the United States of America."
Editor's Note: In response to this posting, we received the following email from Vern Bullough, SUNY Distinguished Professor Emeritus of History and Social Science:
I have lived and traveled in Egypt and I never saw a lopped-off pyramid. The pyramids were often robbed of their covering stone, but the thieves started from the bottom and never reached the top. I don't know where the explanation given in one of your releases comes from. It was probably, if anything, a Masonic symbol, but I am not certain. The explanation you printed is just not true, at least from anything I have seen and observed. Egyptians did not build on top of pyramids, although they robbed them for building material.
The Anasazi Ate Turkey Long Before the Pilgrims (Who of Course May or May Not Have Eaten Turkey) (posted 12-4-03)
Brett Prettyman, writing in the Salt Lake Tribune (Nov. 27, 2003):
Long before the famous pilgrim feast of 1621, residents of what would later be called southern Utah gathered in redrock canyons and ate their own turkey dinner.
And while historians say turkeys were not on the menu of the first Thanksgiving celebration, archaeologists have physical evidence that Meleagris gallopavo merriami was part of the Anasazi diet as far back as A.D. 700. No word on side dishes of the time.
"They had them in pens, they used them for food and they used the feathers for ornaments and blankets," said Ron Rood, Utah's assistant state archaeologist.
"The early archeologists talk about digging through turkey poop in Grand Gulch sites," said Dale Davidson, an archaeologist at the Bureau of Land Management's Monticello field office, which manages the Grand Gulch Plateau Primitive Area in southeastern Utah. "It is pretty dry in those alcoves. Backpackers complain about fresh cow turds, but there hasn't been any grazing [in Grand Gulch] since 1972. Things just don't go away down here."
Ancient petroglyphs and pictographs also depict the turkey as part of Anasazi daily life in Utah. There is a ruin in Grand Gulch called Turkey Pen, although there is debate about whether the small enclosure was actually used for turkeys, and petroglyph panels in Nine Mile Canyon east of Price contain turkey tracks, turkey pens with birds in them and what appears to be a male turkey displaying his tail feathers for all to see.
Who Invented Port? (posted 12-2-03)
Isambard Wilkinson, writing in the London Independent (Nov. 29, 2003):
HIGH in the Douro valley [in Oporto, Portugal] amid the endless slopes of vine-studded hills, the dwindling band of port dynasties will gather in the coming weeks to quietly celebrate Britain's love affair with the fortified wine.
The merchants will raise a glass to a little known, 300-year-old trade agreement, the Treaty of Methuen, which assured their future and the beginning of a national obsession: tippling the ruby nectar.
"I suppose we will quietly gather round and drink a bottle of port," said Paul Symington, managing director of a British family concern. "It was the Treaty of Methuen that really first encouraged port to be sold in large quantities."...
The original port shipping pioneers left Britain in the 17th century in search of their fortune.
They fortified the middling-quality local red with brandy to create port, which quickly replaced French claret, regularly unavailable because of war.
As a result of the Treaty of Methuen in 1703 and the shortage of French competitors, port became, as one author noted, "as British as Roast Beef and God-damn". William Hogarth's victims gained bile from its juice. "Claret is for liquor; port for men," declared Dr Johnson, a noted three-bottle man....
Yet the treaty is a source of some bitterness with Britain's oldest ally. Portuguese historians maintain that it destroyed the Portuguese textile industry by allowing cheap British imports and that British traders dealt with the country as if it were a colony.
There is a lingering, mild resentment because of the British dominance of trade....
Britons drink £53 million worth of port per year, nearly half of which is consumed by women. Only the French drink more, though they prefer a cheaper, less gouty variety.
For the remaining Britons of the port trade the Treaty of Methuen has become a quiet symbol of survival.
Lincoln Never Said That (posted 12-1-03)
Recently, the Illinois Historic Preservation Agency began collecting spurious Lincoln quotes. They are being published on the agency's website:
Anyone who has glanced at a cereal box, herbal tea package, inspirational book, or restaurant place mat has probably encountered a Lincoln quotation that rings hollow. Lincoln is often quoted and misquoted by public officials and celebrities. Members of Congress have access to researchers at the Library of Congress to keep right with Lincoln's words. But even this resource cannot keep spurious Lincoln quotations from being uttered by members of Congress....
The "Ten Points" appear every February 12 in newspaper ads honoring Abraham Lincoln. In fact, these aphorisms are from the pen of Reverend William John Henry Boetcker (1873-1962).
* You cannot bring about prosperity by discouraging thrift.
* You cannot strengthen the weak by weakening the strong.
* You cannot help small men by tearing down big men.
* You cannot help the poor by destroying the rich.
* You cannot lift the wage-earner by pulling down the wage-payer.
* You cannot keep out of trouble by spending more than your income.
* You cannot further the brotherhood of man by inciting class hatred.
* You cannot establish sound security on borrowed money.
* You cannot build character and courage by taking away a man's initiative and independence.
* You cannot help men permanently by doing for them what they could and should do for themselves.
Undoubtedly the most famous questioned utterance of Abraham Lincoln was allegedly part of a speech delivered in Clinton, Illinois, in September 1858:
"You can fool all the people some of the time and some of the people all the time, but you cannot fool all the people all the time."
So the Pilgrims Celebrated the First Thanksgiving? (posted 11-26-03)
George Allen, the former senator from Virginia, writing in the Washington Post (Nov. 23, 2003):
As families come together this week, it is time to tell the truth about America's first Thanksgiving.
For decades, children across America have donned the buckle-topped hats and plain dress of the Puritan pilgrims who landed near Plymouth Rock in 1620. As the old story goes, William Bradford, Miles Standish and the rest of the pilgrims held a harvest festival and were joined by their Indian friends, Samoset and Squanto, in 1621. Thankful for their safe journey and good harvest, and in celebration of their friendship with the neighboring Indians, the pilgrims feasted on turkey, venison, fish, berries and Indian corn meal. This is a good and honorable story, but it was not America's first Thanksgiving.
Here, as Paul Harvey might say, is the rest of the story: America's first Thanksgiving occurred in what is now Charles City County, Va., on land that became part of the Berkeley Plantation on the James River. There, 38 men landed after a 10-week voyage across the Atlantic Ocean aboard the ship Margaret. The London Company, which had sponsored the expedition, sent explicit instructions for the settlers:
"Wee ordaine that the day of our ships arrivall at the place assigned for plantacon in the land of Virginia shall be yearly and perpetually keept holy as a day of thanksgiving to Almighty God."
On Dec. 4, 1619, a year before the pilgrims set foot on Plymouth Rock, the first Thanksgiving was held at Berkeley Plantation as Capt. John Woodlief and his band of settlers planted roots upriver from Jamestown in the growing colony of Virginia and gave thanks for their good fortune.
In 1863, Thanksgiving became a national holiday. At that time there was no official connection between Abraham Lincoln's proclamation and the 1621 event held in Massachusetts; that would come later. The reasons for affiliating our November holiday with the pilgrim feast and not the day of thanksgiving observed by Capt. Woodlief and his men are uncertain. My good friend Ross MacKenzie, who was raised in Illinois and now serves as the editor of the editorial pages of the Richmond Times-Dispatch, surmised that this myth is the result of a "northern bias." Shenandoah University history professor Warren Hofstra says New England historians were just "quicker on the jump." But in 1963, President John F. Kennedy recognized Virginia's claim in his Thanksgiving Proclamation, and Berkeley Plantation is proud to make the claim today.
Visitors at Berkeley Plantation can find a plaque on the plantation grounds with the words of the London Company's instructions. The plantation was the birthplace of Benjamin Harrison as well as the home of President William Henry Harrison. It was also the site where Union Gen. Daniel Butterfield composed the melody for taps while camped on the grounds in 1862. Berkeley Plantation is truly one of our nation's historical jewels, and an important part of our Thanksgiving history.
When Was Cloture Devised? (posted 11-26-03)
Elaine S. Povich, writing in Newsday (Nov. 25, 2003):
Before 1917, there was no such thing as "cloture" in the U.S. Senate. Senators could debate as long as they wanted, mounting an endless filibuster, with no parliamentary way to stop them.
Today, cloture is used nearly weekly, as Senate leaders scramble to assemble the 60 votes needed to bring debate to a close on everything from judicial nominations to the Medicare bill. Cloture, which limits debate to a maximum of 30 hours, was invoked on the Medicare bill yesterday by a 70-29 vote.
At the suggestion of President Woodrow Wilson, the Senate in 1917 adopted Rule 22, which allowed the Senate to shut off debate by invoking cloture. At the time, a two-thirds majority -- 67 votes -- was required.
"It was a rarely used device until 1964, when President Lyndon Johnson organized the effort to invoke cloture after the 87-day debate against the Civil Rights Act of 1964," Senate Historian Donald Ritchie said. But, he said, once the issues of integration and civil rights were off the table, cloture began to be used more often.
By 1975, in the post-Watergate reform era, Democrats succeeded in reducing the number of senators needed for cloture from a two-thirds to a three-fifths majority, or 60 votes.
Today, even the threat of a filibuster prompts the majority leader to file a cloture motion. He needs only 16 senators to join in filing the motion, and sometimes just that is enough to stop debate.
George McGovern's Faux Pas (posted 11-26-03)
Eric Alterman, on his blog:
An historical aside: You know, you can trace the entire history of neoconservatism to the time, about three decades ago, when the then-still-liberal Norman Podhoretz was having lunch with George McGovern. As they were picking a table, McGovern said something unkind about the looks of a woman at a nearby table spoiling his appetite, a remark that I fear even included a canine reference. The woman turned out to be Decter, Podhoretz's wife, and the rest is history. The story originally appeared in Sid Blumenthal's book, "The Rise of the Counter-Establishment," and was repeated in a Washington Post review of it. With a perfect talent for making an already ugly situation even uglier, Podhoretz wrote a letter demanding a retraction, thereby calling attention to what must have been a horrifying situation for Decter, only to have McGovern confirm the story for everybody. And yes, this does explain a lot about John P. "Normanson" Podhoretz too, but let's leave that for another day.
JFK Was Almost Killed as President-Elect (posted 11-26-03)
Robin Erb, writing in the toledoblade.com (Nov. 21, 2003):
On a bright Sunday morning nearly 43 years ago, a ramshackle Buick crept through the posh streets of Palm Beach, Fla., toward a sprawling, Mediterranean-style mansion.
At the wheel was a disheveled, silver-haired madman. His aged right hand rested near a switch wired to seven sticks of dynamite.
Inside the two-story stucco home was his target - president-elect John F. Kennedy - readying for morning Mass.
Richard Pavlick stopped a short distance from the house and waited, unnoticed by U.S. Secret Service agents outside.
It was decades before today's proliferation of suicide bombers, but Pavlick's plan on Dec. 11, 1960, was simple: ram the president-elect's car and detonate the dynamite.
Pavlick's suicide note had been written to the people of the United States, reading in part: "it is hoped by my actions that a better country ... has resulted."
The mansion's door opened. Mr. Kennedy emerged.
But the 73-year-old Pavlick hesitated, then relaxed his fingers.
What saved the future president from assassination that day was neither the intervention of law enforcement nor a malfunction of Pavlick's device - a bomb that the Secret Service chief later said would have "blown up a small mountain."
It was timing and perhaps a moment of conscience for Pavlick.
Just steps behind the president, Jacqueline Kennedy appeared with toddler Caroline and newborn John, Jr.
"I did not wish to harm her or the children," Pavlick would later explain. "I decided to get him at the church or someplace later."
Pavlick never got the chance: He was arrested the following Thursday by authorities acting on information about his deep hatred for Kennedy. Sticks of dynamite were found in his vehicle.
The Woman Behind Thanksgiving (posted 11-25-03)
Candy Sagon, writing in the Washington Post (Nov. 25, 2003):
Sarah Josepha Hale was relentless. She wanted a national Thanksgiving Day holiday and, by God, she would use every iota of her personality, prestige and power to get it.
It was 1846, and Hale was editor of a highly popular women's magazine, Godey's Lady's Book. The North and the South were inexorably squaring off over the issue of slavery, and Hale believed that a nationally recognized day of thanksgiving could have a unifying effect. So she wrote letters, hundreds of them, during the next 17 years, to the governors of each state, to presidents, to secretaries of state, urging them to proclaim the last Thursday in November as Thanksgiving Day.
"She was a marketing genius," says New Jersey writer Laura Schenone, author of A Thousand Years Over a Hot Stove (Norton, 2003), a new history of women and food in America. "She used her magazine to create an emotional aura around Thanksgiving that focused on home, hearth and family. She ran tear-jerker stories and gave advice on what to cook. She pushed a New England menu, with pumpkin pie, a roast turkey at the center of the table and vegetables in cream sauce."
She also wrote impassioned editorials, urging all of the states and territories to celebrate the holiday on the same day "so that there will be complete moral and social reunion of the people of America," as Hale wrote in 1860. Until Hale began her crusade, says Schenone, Thanksgiving had been an erratic event, if celebrated at all. It was largely unknown in the South; in the North, it varied from state to state, held sometimes in October or November, but also in December, depending on the whims of the governors.
An indomitable woman, Hale was widowed when she was pregnant with her fifth child. Her oldest at the time was 7. As editor of Godey's, Hale was sort of the Martha Stewart (minus the financial scandals) of her day.
"She was a trendsetter and arbiter of national good taste," says Schenone.
At least one writer mentions that Hale may have even visited with President Lincoln about the holiday. Whether or not that happened, Lincoln must have liked her idea. In 1863, as the Civil War raged on, he declared a national day of Thanksgiving on the last Thursday of November and asked the country to be thankful for its bounties of nature and to come together in peace.
So Where Did the First Thanksgiving Take Place? (posted 11-25-03)
Randy Boswell, writing in the Ottawa Citizen (Oct. 12, 2003):
Everyone knows the story of the first Thanksgiving in the New World: how a group of English settlers who sailed to America on the Mayflower gathered in the fall of 1621 to celebrate the bountiful harvest in their Massachusetts colony.
Although a much-mythologized tale, its essential outline is true -- except for the part about being first.
Forty-three years earlier, on a tiny, windswept island in the Canadian Arctic, a group of ill-starred English sailors who survived a stormy Atlantic crossing knelt in prayer on a desolate shore 5,000 kilometres from home. At the urging of Martin Frobisher, leader of the 1578 expedition, and guided by a fiery ship's preacher named Robert Wolfall, 100 or more men gathered to give thanks for their deliverance from death, then devoured hearty meals of salt beef, biscuits and peas.
It's a little known nugget of authentic Canadiana, lost in a sea of cranberry sauce and warmed-over stories from the south. But during the past decade, archeologists have been quietly exploring the place where a beloved holiday tradition has Canadian roots.
They've not only rediscovered the romance and folly of an epic voyage of exploration. They've also unearthed tangible traces of the 16th-century adventure, including fragments of clay, a small basket and even bits of food left behind by Frobisher's party -- 425-year-old leftovers from North America's real first Thanksgiving.
From a Reuters report published by the Weekend Australian (Nov. 15, 2003):
THE US death toll in Iraq has surpassed the number of American soldiers killed during the first three years of the Vietnam War, the brutal Cold War conflict that cast a shadow over US affairs for more than a generation.
A Reuters analysis of US Defence Department statistics showed yesterday that the Vietnam War, which the army says officially began on December 11, 1961, produced a combined 392 fatal casualties from 1962 to 1964, when American troop levels in Indochina stood at just over 17,000.
By comparison, a roadside bomb attack that killed a soldier in Baghdad yesterday brought to 397 the tally of American dead in Iraq, where US forces number about 130,000 troops -- the same number reached in Vietnam by October 1965.
The casualty count for Iraq apparently surpassed the Vietnam figure last Sunday, when a US soldier killed in a rocket-propelled grenade attack south of Baghdad became the conflict's 393rd American casualty since Operation Iraqi Freedom began on March 20.
Larger still is the number of American casualties from the broader US war on terrorism, which has produced 488 military deaths in Iraq, Afghanistan, the Philippines, southwest Asia and other locations.
Statistics from battle zones outside Iraq show that 91 soldiers have died since October 7, 2001, as part of Operation Enduring Freedom, which US President George W. Bush launched against Afghanistan's former Taliban regime after the September 11, 2001, attacks on New York and Washington killed 3000 people.
Jim Stingl, writing in the Milwaukee Journal Sentinel (Nov. 17, 2003):
People are always trying to correct Bill Upham.
"You mean your grandfather fought in the Civil War," they insist.
"It would seem more true if it was my grandfather. But it was my father," the Milwaukee man says right back.
When you hear Bill Upham's story, the first thing you do is the math. His father, William Henry Upham Sr., was born in 1841. That was 162 years ago.
Bill and his brother, Frederick, both very much alive, were born in 1916 and 1921 respectively.
"My brother is always saying we should be on 'Good Morning America' telling our story," said Frederick, now 82, still working as a geologist and living in Fort Collins, Colo. "I feel like I should be in a jar of formaldehyde in some medical school."
A few years ago, a momentarily clueless television reporter asked Frederick, "What is your father doing now?"
William Upham Sr. was good, but even he wasn't that good.
The elder Mr. Upham - a Union soldier, successful businessman and for two years the governor of Wisconsin - lost his wife and married a much, much younger woman when he was 75. A year later, Bill showed up. And when William Upham was 80, he begot Frederick.
Between the father and his sons, they have lived every second of American history save the country's first 65 years.
The last Civil War veteran was buried nearly a half century ago. The last Civil War widow, who was 21 when she married an 81-year-old veteran, is 96 and living in Alabama. Offspring of the fighters in America's war between the states are more plentiful, but are thought to number only in the hundreds.
Bill Upham was just 8 when his father died, but he said he remembers him well and with great fondness. As a boy, he was so devastated by his father's death that he was sent to live with an aunt in North Carolina rather than remain at home and disrupt his mother's new marriage, which came rather quickly after William died.
Bill Cotterell, writing in tallahassee.com (Nov. 17, 2003):
Appalled by worldwide news reports that a rural Florida bridge bore the offensive name of a character in Mark Twain's "The Adventures of Huckleberry Finn," a veteran South Florida legislator wants public agencies to check their maps for any racial slurs.
State Sen. Steve Geller, D-Hallandale, filed a bill after seeing a Reuters news report in a South Florida newspaper that said there are 144 places throughout the country with names that use the word "nigger" in some fashion. As an example, the British wire service cited "Nigger Jim Hammock Bridge" in Hendry County, on a two-lane road near Clewiston.
The news story was picked up on several Web sites featuring political commentary.
"It's not the highest priority on my or anybody's agenda, but there is no reason today that anybody ought to have 'Nigger Jim Bridge,'" Geller said. "If there was a 'Long-nosed Jew Highway' somewhere in the state, I'd feel the same."
But some conservatives are worried that a politically correct witch hunt could result from his bill. And the Hendry County manager says he's never heard of a "Nigger Jim Hammock Bridge."
Typing in the pejorative name on the index of the home page of the U.S. Board of Geographic Names, the federal agency cited in the Reuters report, brings up a place map for "Negro Jim Hammock Bridge" and a site map showing a location southwest of Moore Haven.
Roger Payne, executive secretary of the federal board, said it officially changed all such names to "Negro" in 1963 and changed "Jap" to "Japanese" wherever it occurred in U.S. Geological Survey records in 1971. But he said "the records retain the variant or former name" of all 144 places as a secondary reference in federal databanks.
Payne said there are 13 places in Florida with names like "Negro Cove, Negro Island, Negro Camp Island." But he said some might be rooted in the Spanish word for black, rather than referring to a race of people.
Only four of the 13 Florida places in the USGS National Mapping Information site list the pejorative word as an original name - the Hendry County bridge and Negrotown Knoll and Negrotown Marsh, both in Highlands County, and Negro Head, a cape in Lee County.
Janadas Devan, writing in the Straits Times (Nov. 9, 2003):
The former Malaysian prime minister, Tun Dr Mahathir Mohamad, said recently that Jews rule the world by proxy. Opinion polls in Europe show a majority of Europeans feel Israel is a threat to world peace. Anti-Semitic 'hate speech' and 'hate acts' seem more frequent lately. But as Janadas Devan finds out, anti-Semitism has a long, persistent and troubling history.
CONSIDER the following examples of anti-Semitism:
'Reasons of race and religion combine to make any large number of free-thinking Jews undesirable.'
'You may as well do anything most hard/ As seek to soften that - than which what's harder? -/ His Jewish heart.'
'How I hated marrying a Jew.'
'Down in a tall busy street he read a dozen Jewish names on a line of stores... New York - he could not dissociate it now from the slow, upward creep of these people.'
'Jew York'. 'Jewnited States.' 'Franklin Delano Jewsfeld.'
Who uttered these statements?
Dr Josef Goebbels? Some Nazi poet? A blond Aryan, expressing regret for marrying a Jew during the Holocaust? A member of the lunatic Ku Klux Klan?
None of the above.
They were made by some of the most prestigious figures in Anglo-American culture: T.S. Eliot, William Shakespeare, Virginia Woolf (who, of course, married Leonard Woolf, a Jew), F. Scott Fitzgerald and Ezra Pound.
Similar examples of anti-Semitism can be easily multiplied.
In French literature - Emile Zola, Guy de Maupassant, Maurice Barres.
In English literature - Rudyard Kipling, Hilaire Belloc, G.K. Chesterton.
In American letters - Henry Adams, H.L. Mencken. Among industrialists - Henry Ford.
Among 'All-American heroes' - Charles Lindbergh. Among royalty - King Edward VIII, later the Duke of Windsor.
And on and on, ad infinitum.
But these are only examples of 'hate speech'.
The list of 20th century anti-Semitic 'hate acts' is more gruesome.
The Holocaust, when six million Jews were exterminated by Hitler, was only the final act.
Pogroms during and after the 1917 Russian Revolution resulted in the death of 75,000 Jews.
In Germany, after World War I, Jewish communities in Berlin and Munich were terrorised by anti-Semitic organisations.
After the Munich Soviet was crushed, all foreign-born Jews were expelled from the city.
The Holocaust didn't happen out of the blue; Europe was well-primed for the 'Final Solution'. And it was not the work of only a few decades, but of centuries.
As historian Paul Johnson points out in his History Of The Jews, though the term 'anti-Semitism' was not coined until 1879, anti-Semitism, 'in fact if not in name', undoubtedly existed from 'deep antiquity'.
Ed Turner, writing in the Washington Times (Nov. 6, 2003):
Among the most indelible images of American history is the caisson bearing President Kennedy's body during his funeral on Nov. 25, 1963. Black Jack, the Army's riderless horse, pranced restlessly and majestically behind the military carriage bearing the fallen president's casket as it was being taken to Arlington National Cemetery for burial.
Black Jack and the four soldiers and seven horses that led the caisson came from the Caisson Platoon of the U.S. Army's 3rd Infantry Regiment, also known as the Old Guard, the oldest active infantry unit in the Army. The Caisson Platoon, which has been stationed at Fort Myer Army Post in Arlington since 1948, takes part in some 1,500 full honor military funerals each year at Arlington Cemetery and participates in parades, ceremonies and pageants in the Washington area.
Black Jack and the Caisson Platoon became national icons after Kennedy's funeral. In fact, after Black Jack died in 1976, his ashes were placed in a memorial at Summerall Field at Fort Myer, just blocks from the stable where the horse was kept during its 21 years of service as a riderless horse. Black Jack was famous enough to visitors who toured the post that the Army created a special museum inside the John C. McKinney Memorial Stables in memory of him.
People who come to the stables still ask about Black Jack.
"Visitors usually ask what Black Jack did and when he died, whether he was the one in the Kennedy funeral," says Alan Bogan, director of the Old Guard Museum. "He's still the most famous horse. I doubt if anyone can name any other one."
More than 10,000 people visit the Caisson Platoon's stables each year to see the caissons and horses and where Black Jack resided. The Old Guard Museum down the street also houses artifacts and memorabilia from the full regiment, which provides sentinels at the Tomb of the Unknowns, demonstrations by its U.S. Army Drill Team, performances by its Fife and Drum Corps, and presentations of the colors by its Continental Color Guard.
Visitors who come to the John C. McKinney Memorial Stables, where the Caisson Platoon keeps many of its 44 horses, three of them "riderless" horses like Black Jack, can receive a guided tour from a soldier or explore the premises on their own. The stable, which was built in 1908, consists of tack rooms, a farrier room, caisson rooms and the Black Jack Museum in honor of the famous riderless horse that took part in President Kennedy's funeral march.
"Black Jack was the last horse that was bred and issued by the Army," says Spc. Matthew Moore, who has been in the Old Guard for 13 months. "The horses that we get now are either donated to us or purchased."
From BBC News (Oct. 30, 2003):
The oldest known condoms in the world - 17th Century creations made of animal and fish intestine - are to leave the UK to be displayed at a Dutch sex exhibition. The five contraceptives were excavated from a medieval toilet in Dudley Castle in 1985 - they are thought to have lain there since before 1646.
A spokesman for Dudley Council, which has care of the rare items, said they would be on show at the Drents Museum in the province of Drenthe from 11 November to 8 February.
Because the sheaths are so fragile, Dr Vincent Vilsteren, keeper of archaeology, is making a special visit this weekend to collect them.
The museum is staging an exhibition called 100,000 Years of Sex.
Adrian Durkin, exhibitions officer at Dudley Council, said: "It is very rare for such items to survive so well.
"Indeed the next oldest condoms in the world are over 100 years younger and will also be on display in the exhibition."
Councillor Charles Fraser Macnamara, lead member for culture and leisure, added: "This exhibition certainly has the opportunity to put Dudley on the map."
Tom Walker, writing in the Atlanta Journal-Constitution (Nov. 2, 2003):
If you own stocks, you'd best hope that President Bush is re-elected next year. Politics aside, that would be better for the market than his defeat.
That's history's message, according to the best-known compendium of Wall Street statistics and information, the "Stock Trader's Almanac."
The 37th edition of this deep mine of data is just out. Its focus, naturally, is on next year's presidential election.
"Positive market action usually accompanies re-election of a president," says Jeffrey A. Hirsch, who with his father, Yale Hirsch, compiles and edits the annual almanac.
This time, however, Hirsch detects a note of caution in the numbers that may or may not bode well for Bush as the election approaches in 2004. It has to do with whether the stock market is performing as well at this time next year as it is now, and Hirsch wonders whether it will.
The theorizing starts with the premise that bear markets are more likely in the first two years of the four-year presidential cycle, with bull markets more likely in pre-election and election years.
There's no mystery to this. Incumbents pull out all the stops to stimulate the economy so that voters feel good and prosperous at election time.
If that's right, then Bush's administration is running true to form. The market was in the tank from early 2000 until October 2002 and has been on a tear ever since.
But Hirsch says this pre-election year has been unusually strong and not likely to continue at that pace beyond the early part of 2004.
That means year-over-year returns in 2004 "are likely to be more tame," says Hirsch, right about the time of the presidential campaign.
Corinne Atkins, writing in History Today (Nov. 11, 2003):
On August 29th, 2003, a huge car bomb went off in the central Iraqi town of Najaf, killing more than 100 people, including the Shiite cleric Ayatollah Mohammed Baqr al-Hakim. Coming hard on the heels of an equally devastating explosion at the UN headquarters in Baghdad, it emphasised the dangers inherent in the reconstruction of Iraq, and the tensions within the country, many of them derived from the country's political and religious past.
It was in this region, then known as Mesopotamia, that some of the most significant and tragic events of early Islam occurred. The three towns of Kufa, Najaf and Kerbala, which all lay relatively close to each other, south of Baghdad, became pivotal to what is now known as the Shia branch of Islam.
The Sunni-Shia schism in Islam can be traced back to the issues that arose over the leadership of the Muslim community shortly after the death of the Prophet Mohammed in AD 632.
Since Mohammed's only daughter, Fatima, could not step into her father's shoes, three caliphs (deputies) assumed control for the brief period AD 632-656. Some, however, refused to recognise them. Known as the Shias, they were followers of Mohammed's charismatic son-in-law, Ali ibn Abi Talib (AD 600-661).
The word Shia is an abbreviation of the phrase Shiat Ali, meaning the partisans of Ali. Arguing that only the blood line could be the recipient of Mohammed's divine guidance, they believed the Prophet had designated Ali as his political successor and had imparted to him the power of interpreting religious knowledge. Ali and his descendants were therefore the only rightful successors of the Prophet.
Their opponents, the Sunnis, supported the view that the Prophet's legitimate successor could be chosen by man and should be an elected member of the Prophet's own tribe.
In AD 656, after the assassination of Uthman, the third caliph, Ali ascended to the caliphate. Seven months after taking charge, he moved the capital of the caliphate from Medina in Arabia to Mesopotamia.
Michael Ollove, writing in the Baltimore Sun (Oct. 26, 2003):
Jews liked to think that Charlie Chaplin was Jewish. Nazis liked to think Charlie Chaplin was Jewish. McCarthyites liked to think Charlie Chaplin was Jewish. Sometimes, Charlie Chaplin liked to think Charlie Chaplin was Jewish.
Charlie Chaplin was not Jewish.
That he wasn't did not stop people from conceiving of him and the characters he played as Jewish, which colored the way they experienced his films. Or, maybe they recognized something "Jewish" in his films and that gave rise to their assumption that he was as well.
Whatever the original spark, Jewish audiences delighted in Chaplin's Little Tramp, viewing him as a heroic stand-in for their own painful immigrant experiences. "What do they want from him, the goyim," one woman was overheard crying out in a theater while watching The Gold Rush.
Hitler's minions denounced Chaplin as a Jew and banned his movies. Anti-Communist witch hunters in the 1950s uttered asides about his alleged Jewishness as a way to discredit him.
Chaplin himself was coy about his suspected Jewish lineage. "If they wanted me Jewish," he once said, "they would have me Jewish."
So it is with ironic intent that authors J. Hoberman and Jeffrey Shandler identify Chaplin as the first "Jewish" superstar in modern American entertainment.
In their book Entertaining America: Jews, Movies and Broadcasting, Chaplin is a beginning point in a fascinating conversation about Jewish identity in the context of American entertainment. The book accompanies an exhibit that was staged earlier this year at the Jewish Museum in New York. Today, it opens here at the Jewish Museum of Maryland, where it will remain until early next year.
John Ezard, writing in the Guardian (Oct. 18, 2003):
However foul it has got, the language of television soaps pales beside the sexual insults traded publicly on the streets of Britain for three centuries, according to a new book.
The real-life street theatre in the 16th to 18th centuries drew on a richer, far lewder lexicon, according to Professor Bernard Capp, of Warwick University.
It included the insults jade, quean, baggage, harlot, drab, filth, flirt, gill, trull, dirtyheels, draggletail, flap, naughty-pack, slut, squirt and strumpet.
All of these words were synonyms for whore, which had been weakened by massive overuse. The nouns were "generally heightened by adjectives such as arrant, base, brazenfaced or scurvy".
Prof Capp's book, When Gossips Meet, draws on court documents showing that prostitution was seen as a far worse disgrace than fornication.
"Venereal disease, especially syphilis or the pox, also featured prominently in abusive language," he adds. "Taunts such as 'burnt-arsed whore' and 'pocky whore' were familiar throughout the country.
"At Bury St Edmunds, Faith Wilson told her neighbour in 1619 to 'pull up your muffler higher and hide your pocky face, and go home and scrape your mangy arse'."
Insults and gossip had a function: to give "women some control over erring husbands, abusive employers or sexually disreputable women. When someone is gossiped about, they restrict their behaviour".
But it could tear apart families and parishes. According to archdeaconry court papers, "Joan Webb of Wittlesford, Cambs, was rumoured in 1596 to be worse than any whore" because she allegedly paid men to have sex with her... The stories prompted a man who had been planning to marry her to break off the match, giving her £5 "to be rid of her".
Paul Collins, writing in the Australian Financial Review (Oct. 17, 2003):
The Catholic church is by far the largest multinational institution in the world, with more than one billion adherents. As John Paul II's papacy approaches its end, intense interest is building in the process by which popes are elected.
The papacy of John Paul II [1978- ] has been the third longest in history (if we leave out St Peter, the length of whose papacy is unknown). John Paul would overtake Leo XIII [1878-1903] on March 11 next year. The longest papacy was that of Pius IX [1846-78]. But John Paul's physical weakness indicates that the end of this pontificate may come soon.
The pope remains pope until he dies or resigns. There have been very few resignations; the last was that of Celestine V in 1294. There has been much discussion of John Paul II resigning. This is unlikely because he sees his illness and sufferings as uniting himself with the agony of Jesus on the cross, and failure to see this through would be an abandonment of God's will for him.
Largely because pre-modern medicine was primitive and dangerous for the patient, sick popes usually died quickly. A number were murdered. The average length of a papacy is about seven years.
There is a danger that with contemporary medicine keeping people alive so much longer, a pope could become totally incapacitated by dementia, Alzheimer's disease or another form of mental or physical deterioration, leaving the church without leadership for a considerable period.
This could create a constitutional nightmare. There is nothing in canon law about removing a senile, sick or crazy pope unless he ultimately resigns. For instance, Urban VI [1378-89] was clearly mad, but he was still pope when he died. Despite the fact that the Council of Constance [1414-18] said that in extraordinary circumstances a general church council was superior to a pope, most church lawyers deny that a council can depose a pope. However, it may be the only way to resolve the impasse created by an unhinged incumbent.
John Paul will almost certainly have entrusted a written resignation to either the dean of the College of Cardinals, Josef Ratzinger, or to the Camerlengo of the Holy Roman Church, Cardinal Eduardo Martinez Somalo, or to his personal secretary, Archbishop Stanislaw Dziwisz. This letter will state something along the lines that if, in the opinion of the College of Cardinals, he has become mentally incapable of continuing in office, his resignation will automatically come into effect.
The church is run by the College of Cardinals during the 15- to 20-day interregnum between the death of the previous pope and the conclave. The period is technically called Sede vacante, which means "the (papal) chair being empty". In this period the cardinals operate according to strict rules laid down by popes Paul VI in October 1975 and John Paul II in February 1996. These rules cannot be changed by the cardinals during the interregnum.
The most important person in a papal interregnum is the cardinal camerlengo, or chamberlain. He is assisted by the Apostolic Camera, a small office, originating in the 11th century, that helps him in the administration of the temporal goods of the papacy during a Sede vacante.
A rotating committee of three cardinals is chosen by the cardinal electors to assist the camerlengo in preparing for the conclave, and in making day-to-day decisions that cannot be deferred. Cardinals are strictly bound not to make important decisions, above all rulings that would be binding on the next pope.
Daily meetings of all the cardinals are held, presided over by the dean of the College of Cardinals. When the pope dies, all cardinals who are heads of Vatican departments cease to hold office, except the camerlengo and the major penitentiary, the American Cardinal Francis Stafford. Since this cardinal deals with confessional matters the idea is that forgiveness should always be available. Also the vicar-general of the diocese of Rome, Cardinal Camillo Ruini, remains in office so that the government of the local church may continue.
During the Sede vacante the cardinals will spend time getting to know each other, and quietly discussing the profile of the kind of man they want, and think that the church needs, as the next pope. Cardinals over the age of 80 can participate in these discussions, but are excluded as soon as their colleagues enter the conclave.
Pre-existing ad hoc groupings of cardinals with theological, political and regional interests in common will have been discussing the issue of the next pope among themselves. Those working in the Vatican will be most active in this type of discussion because of their proximity to each other and their common interests.
They will have been doing this very discreetly and obliquely, and will always deny that anything like this is happening, especially if asked by the media. Most argue that the aim of the secrecy is to avoid party politics in the church, but the real reason is that Vatican politics, still very much influenced by the Latin mentality, are always played out obliquely and behind closed doors.
The pope is elected on the basis of an extremely narrow franchise: those members of the College of Cardinals under the age of 80. Since about the middle of the 12th century the popes have almost always been elected by the cardinals. The only real exception to this was at the end of the Great Western Schism, when all three papal pretenders were dismissed by the Council of Constance [1414-18]. Martin V [1417-31] was elected by a mixed group of cardinals, bishops and others representing the council.
The fundamental role of the pope is to be bishop of Rome. During the first 700 years of church history it was usually the clergy and laypeople of Rome, as well as bishops from nearby towns, who played the major role in electing the pope.
By the 8th century the franchise had become limited to the senior clergy of Rome. These were the priests who ministered at the "titular" churches, that is the oldest churches in the city. The title "cardinal", from the Latin cardo meaning "hinge", or "door", was first applied to these parish priests from as early as the 7th century. They became known as "cardinal priests".
The title was slowly extended to the senior deacons of Rome. These were ordained men who were not priests, but who were in charge of church administration and the distribution of social welfare to the poor.
In the 8th century the title of cardinal was also extended to the bishops of the central Italian dioceses immediately around Rome. With the pope, these bishops formed the Roman Synod, advising and assisting him in the administration of the Roman church. They eventually evolved into "cardinal bishops".
As the senior pastors and administrators, cardinal priests, deacons and bishops gradually assumed control of the Roman church during a papal vacancy. They also had an increasing say in the election of the new pope. In order to break the influence of secular rulers in papal elections, Stephen III [768-72] decreed in 769 that only cardinal deacons and priests of the Roman church were eligible for election as pope, and that the laity should have no vote. Lay participation had sometimes led to riots and vicious factional in-fighting.
Despite this, in the 9th and 10th centuries the papacy came under the influence of lay forces, especially the Mafia-like clans who controlled parts of Rome and its immediate surroundings from fortified mansions. Many of the popes of this period were members of these families, and they were often utterly unworthy of office.
From about 1030 onwards a reform movement permeated Rome. The greatest figure in the campaign to break lay control of ecclesiastical office was Gregory VII [1073-85]. Reformers saw that the papal election process was the key to ensuring that a worthy person was elected.
Since the 12th century the College of Cardinals has elected the pope in a closed meeting called a "conclave", from the Latin cum clave, meaning "with a key". This referred to the fact that the cardinals were locked up, sometimes with graduated fasting, until they elected the pope.
Most modern conclaves have been held in the Sistine Chapel, surrounded by Michelangelo's now gloriously restored paintings of the Creation and the Last Judgement. After the early 14th century the cardinals were isolated from outsiders in uncomfortable circumstances until the new pope was elected. Even in 20th-century conclaves the cardinals and their assistants did not always have separate rooms. They resided in the cramped and very inconvenient makeshift area surrounding the Sistine Chapel.
The purpose of locking them away was to guard against outside influence and to hasten papal elections. In the next and subsequent conclaves cardinals and their assistants will reside in the purpose-built and comfortable Domus Sanctae Marthae, a motel-style building of 130 suites and single rooms with dining facilities, erected in 1996 within the Vatican.
The election decree of the Third Lateran Council of March 1179 required that for a valid election a two-thirds majority of cardinals must vote for a candidate. The purpose of this was to force cardinals to compromise in order to preclude the danger of a disputed election. It also avoided the problem of an elected pope's authority being weakened by having to deal with a large minority of disgruntled cardinals who had opposed his election. This rule remained in force until 1996, when it was modified suddenly and without apparent reason by John Paul II.
On February 22, 1996, John Paul issued a new set of rules governing the election process. Firstly, only election by scrutiny, that is, by secret, written ballot, was permitted. There was also a seemingly small but extremely significant modification to the two-thirds majority requirement.
John Paul decreed that ballots in the conclave were to proceed at the rate of four per day, two in the morning and two in the afternoon. If, after three days, no-one has been elected, a day of prayer and discussion is to be held. If, after a further 21 ballots, no-one has received the two-thirds vote required, the camerlengo can invite the cardinals to vote for another election procedure. The cardinals can then decide to drop the two-thirds majority requirement and elect by an absolute majority, that is, elect the cardinal who gets more than half of the votes.
The problem with this is that in contested elections there is no incentive to compromise. What has actually happened in most modern conclaves is that two candidates have emerged relatively quickly with large blocs of cardinals supporting them. But neither has had the required majority. The two-thirds requirement forced a compromise, and persuaded the great electors (the leading cardinals from various factions) to seek a compromise candidate, someone who would eventually be acceptable to a large majority from both blocs. What John Paul's change does is encourage a small majority to hold out against a large minority. It could prove disastrous in a strongly contested election.
Since the start of the 20th century the composition of the College of Cardinals has become more and more internationalised. Italians no longer hold the majority. In the first conclave of the 20th century, which elected Pius X, more than half of the cardinal electors (38 of the 62) were Italian.
As of this October 1, there are 135 electors from 59 countries. There are 23 from Italy. It is often forgotten that the pope's primary title is bishop of Rome, and it could be argued that it is appropriate that he be an Italian, or at least that he be able to speak excellent, idiomatic Italian and be completely at home in western European culture.
From the website of Connecticut's TV station, WTNH (Oct. 15, 2003):
His name is synonymous with traitor. Still, Benedict Arnold is Norwich's most famous native son.
Bill Stanley, president of the Norwich Historical Society, will place a memorial stone at Arnold's gravesite at Saint Mary's Church in London on the Thames River.
The granite memorial is being made by a stone-crafting company in Vermont and will be shipped to Norwich before its final destination.
Stanley says he and his wife are paying the costs.
He says he wants to correct a mistake on the memorial that is painted on the wall at the British church. It reads that Arnold was born in 1801 and died in 1951.
He was born in 1741 and died in 1801.
Stanley says the new memorial also will correct the name of Arnold's wife.
He says he will accompany the stone to London in May.
From the Sydney Morning Herald (Oct. 9, 2003):
A New York gangster family that was said to be the inspiration for the hit television drama The Sopranos used so-called double-decker coffins to dispose of murder victims, a Manhattan court has been told.
Anthony Rotondo, a defector from "the Mob", told jurors on Tuesday that caskets with false bottoms were sold to trusting customers of a well-known New Jersey undertaker.
He said the DeCavalcante crime family had used the coffins to secretly bury the victims of mob executions, along with the bodies of people who had died from natural causes.
"The family would put the body of the murdered victim below the regular customer, thus disappearing forever," Rotondo explained, while giving evidence in the trial of the reputed DeCavalcante family boss, Girolamo "Jimmy" Palermo.
The ruse risked exposure at times because of the surprise of pall bearers as they carried the two-for-one coffins.
"Everyone would kind of look at one another," Rotondo recalled. "There would be six grown men carrying someone's 80-pound [36-kilogram] grandmother, and they looked like they were having a problem."
Rotondo said the double-decker coffins had been used as early as the 1920s, and were the brainchild of Carlo Corsentino, an undertaker member of the DeCavalcante family.
Corsentino's son, Carl, still runs the family's funeral home in Elizabeth, New Jersey.
From the Ottawa Citizen, a list of noteworthy suicides (Oct. 6, 2003):
Socrates, Greek philosopher
Mark Antony and Cleopatra; Roman politician/general, Egyptian queen
Judas Iscariot, disciple of Jesus Christ
Lucius Domitius Nero, Roman emperor
Vincent van Gogh, painter
Virginia Woolf, writer
Adolf Hitler, Nazi leader
George Reeves, actor, 1950's TV Superman
Ernest Hemingway, Nobel Prize-winning writer
Marilyn Monroe, actress
Sylvia Plath, poet
Thich Quang Duc, Buddhist monk, self-immolated on a Saigon street, creating an enduring and infamous image of the Vietnam War
Brian Epstein, Beatles manager
Bobby Sands, imprisoned Irish Republican Army hunger striker
John Robarts, former Ontario premier
Abbie Hoffman, '60s counter-culture figure, founder of Yippie movement, Chicago Seven defendant
Margaux Hemingway, model, actress, granddaughter of Ernest.
Michael Hutchence, rock musician with INXS
Vince Foster, adviser to U.S. president Bill Clinton
Kurt Cobain, rock musician with Nirvana
David Kelly, embattled British government weapons expert