William Barr Needs a History Lesson

As the Senate Judiciary Committee holds its confirmation hearings for William Barr, the current nominee for Attorney General of the United States, it is clear Barr needs to brush up on his constitutional law, as well as U.S. history.

 

During yesterday’s hearing, Senator Mazie Hirono (D-HI) asked Barr whether or not he believed birthright citizenship was guaranteed by the 14th Amendment. The question is important as the idea of birthright citizenship has come under increasing attack from the right in recent years. From the Republican primaries onward, Donald Trump has repeatedly asserted that birthright citizenship is unconstitutional, should be eliminated, and can be ended by executive order. While some on the right have balked at the last claim, Trump has tapped into an ever-present disdain among conservatives for birthright citizenship. 

 

For his part, Barr seemingly tried to sidestep the politically divisive issue. However, his answer to Senator Hirono’s question was not only vague, it also suggested that the soon-to-be Attorney General doesn’t know basic constitutional law or history.

 

“I haven’t looked at that issue legally. That’s the kind of issue I would ask OLC [Office of Legal Counsel] to advise me on, as to whether it’s something that appropriate for legislation. I don’t even know the answer to that,” Barr answered.

 

There are a couple of worrying signs in this response. First, birthright citizenship is part of the 14th Amendment, meaning any change to it would require a constitutional amendment, not legislation. This is a basic tenet of constitutional law. The fact that Barr, who previously served as Attorney General under George H.W. Bush, thinks action can be taken against birthright citizenship through simple legislation shows one of two things: (1) he isn’t competent enough to understand basic constitutional processes in the United States or (2) he was, rather insidiously, actually answering Senator Hirono’s question.

 

The latter point warrants a bit of explanation. Barr quite visibly looked like he was attempting to simply move past the question and not answer Senator Hirono. However, if Barr does in fact think that birthright citizenship can be dealt with through congressional legislation, then the only logical explanation for this, barring the above first option, is that he doesn’t believe the 14th Amendment guarantees this status. Whereas the first possibility of incompetence warrants a refresher in constitutional law, this second one demands a lesson in history. 

 

History is quite clear on the intent of the 14th Amendment: it was meant to create birthright citizenship in the wake of emancipation. The 14th Amendment was created to guarantee that freed slaves, free blacks, and their posterity would forever be considered American citizens. Before its adoption, citizenship was a murky, ill-defined status. The Constitution only mentions citizenship a few times, and does not provide a concrete definition of what a citizen is or who can be a citizen. To this day there is no legal definition of what citizenship actually is.

 

From the Constitution’s ratification to the adoption of the 14th Amendment, black Americans had repeatedly claimed they were citizens because of their birth on American soil. Scholars such as Elizabeth Stordeur Pryor and Martha S. Jones have shown the myriad ways in which black Americans made claims on this status, only to be rebuffed in many cases. Citizenship could provide black Americans with a recognized spot in the nation’s political community. It represented hope for a formal claim to certain rights, such as suing in federal court.

 

This leads to the infamous 1857 Supreme Court decision Dred Scott v. Sandford, when Chief Justice Roger Taney crafted an opinion that quite consciously attacked the very possibility of black citizenship. Taney concluded that Dred Scott, an enslaved man, could not sue in federal court because he was not a citizen. He was not a citizen, in Taney’s words, because black people “are not included, and were not intended to be included, under the word ‘citizens’ in the Constitution… On the contrary, they were at that time considered as a subordinate and inferior class of beings who had been subjugated by the dominant race, and, whether emancipated or not, yet remained subject to their authority, and had no rights or privileges but such as those who held the power and the Government might choose to grant them.”

 

Taney went out of his way to create a Supreme Court decision that attempted to put the legal nail in the coffin of black citizenship. The 14th Amendment was, quite consciously, crafted to upend Dred Scott, which was still the law of the land after the Civil War.  Thus when conservatives rail against birthright citizenship and claim that it is not, in fact, a part of the Constitution, they are ignoring America’s long history of slavery, discrimination, and segregation. 

 

When the soon-to-be Attorney General William Barr states that he thinks legislation can be used to make changes to birthright citizenship, it is because he does not believe the 14th Amendment guarantees it. And when he and other conservatives espouse such an opinion, it is because they are once again willfully ignoring American slavery's legacy of racism. This is, admittedly, not surprising. Barr also expressed the opinion during his confirmation hearing that the justice system “overall” treats black and white Americans equally, despite mountains of information proving otherwise. 

 

While the attack on birthright citizenship from the right deserves attention and should be fought at every turn, the underlying historical erasure of slavery and discrimination also requires our attention. This willful amnesia is why the potential next Attorney General of the United States can, in one day, ignore so many aspects of America’s fraught history with race. And it is why we all must be on guard.

 

The Defect that Brexit and MAGA Share

 

In June 1945, Winston Churchill, who had just overseen the British contribution to victory in the Second World War, was voted out of office in one of the most unexpected election outcomes of the twentieth century. Remarkably, the very soldiers who had enabled victory on the battlefield were central to the routing of the Conservative Party at the ballot box; they, and their social networks, voted overwhelmingly for Labour.

Churchill was a man of grand vision, of big ideas, and there was no greater mission in his life than the defeat of Nazi Germany and Imperial Japan and the retention of Britain’s special place in the world. The necessity to defeat the Axis and maintain the Empire, however, blinded him to a key dynamic: he fundamentally undervalued and misunderstood the central aspiration of his citizen soldiers in a second world war; they desired immediate and profound social change. For the ordinary citizen soldier, his participation in the war was, at heart, about building a better post-war world at home – a world with better housing, health care provision and jobs.

Churchill’s inability to fully empathize with his citizen soldiers was to have profound implications for his great mission. Churchill, and others in charge of strategy, were convinced that ordinary English, Welsh, Scots, Irish, Africans, Australians, Canadians, Indians, New Zealanders and South Africans would fight with the required determination and intensity to guarantee victory and save the Empire in its hour of need. This was the key assumption, or understanding, that drove British strategy during much of the first half of the war. It was accepted that in a newly raised citizen army men would be inadequately trained and might not be provisioned with the theoretically ideal scale or quality of materiel. But, it was expected, in this great crisis, the “great crisis of Empire,” that in spite of these drawbacks they would rise to the challenge. 

As we know, they did not always meet these lofty ambitions. The defeats in France in 1940 and at Singapore and in the desert in 1942 put a nail in the Imperial coffin. In the end, Britain did not even achieve her initial reason for going to war: the restoration of a free, independent Poland. As the war dragged on, the British and Commonwealth Armies played a proportionally ever-smaller role in fighting the Axis. On 1 September 1944, with the Normandy campaign completed, General Bernard Montgomery reverted to commanding an army group of roughly fifteen divisions. His erstwhile American subordinate, General Omar Bradley, on the other hand, rose to command close to fifty in what clearly signaled a changing of the guard. By the end of March 1945, of the 4 million uniformed men under the command of General Dwight D. Eisenhower in North-West Europe, over 2.5 million were American, with fewer than 900,000 British and about 180,000 Canadian. The British and Commonwealth Armies advanced across Europe into an imperial retreat.

The result was that the post-war empire was a “pale shadow” of its former self. The cohesion of its constituent parts had been irretrievably damaged. Much of its wealth had been lost or redistributed. Nowhere was this more apparent than in the Far East. The loss of Singapore was not only a serious military defeat, but it was also a blow to British prestige. Barely two years after the war, and only five years after Churchill had uttered his famous words, that he had “not become the King’s first minister in order to preside over the liquidation of the British Empire,” Britain’s Imperial presence in India had ended. The loss of the subcontinent removed three-quarters of King George VI’s subjects overnight, reducing Britain to a second-rate power.

The history of Britain and the Commonwealth cannot, therefore, be understood outside of the context of the performance of British and Commonwealth soldiers in the Second World War. The Empire failed not only because of economic decline, or a greater desire for self-determination among its constituent peoples, but also because it failed to fully mobilize its subjects and citizens for a second great world war. Africans, Asians and even the citizens of Britain and the Dominions demonstrated, at times, an unwillingness to commit themselves fully to a cause or a polity that they believed did not adequately represent their ideals or best interests. This manifested in morale problems on the battlefield, which, in turn, influenced extremely high rates of sickness, battle exhaustion, desertion, absence without leave and surrender in key campaigns. When the human element failed, the Empire failed.

Today, the Anglo-Saxon world is enmeshed in another era of what some might term big ideas, although thankfully not in a world war. In the United Kingdom, instead of imperial unity we talk about Brexit, and in the United States, President Trump’s vision for America, to make it Great Again, has captured the imagination of a significant cohort of the population. Whether one agrees with these movements or not, they are certainly radical; however, in a similar vein to Churchill’s grand vision of the 1940s, they are vulnerable to ignoring the needs of the many in preference to the visions of the few. In Britain, there is hardly a week that goes by without the announcement of a new set of figures outlining the collapse of basic public services and amenities. Violent crime is up, hospital waiting times have risen and child poverty is increasing, to mention just a few social metrics. It seems evident to many that leaving Europe will not address the fundamental issues faced by the people who voted for Brexit. Trump’s presidency, the evidence suggests, will harm the welfare of those who were most likely to vote for him; tax cuts for the wealthy and building a wall along the Mexican border will not bring back a lost prosperity to middle America.

Trump and the Brexiteers might heed lessons from the Second World War. The American President, Franklin D. Roosevelt, believed that the efforts demanded by the state to meet the global cataclysm that was the Second World War required legitimacy, accorded by citizens who were invested materially and ideologically in that same state. In this sense, he intimately linked the questions of social change and social justice with the performance of American armies on the battlefield. By comparison, in his obsession with defeating the Axis, the British Prime Minister lost sight of the goals and ambitions of the ordinary man, the smallest cog in the “machinery of strategy,” but a vital one all the same. For the citizen soldier, the war was not an end in itself; it was a step on the road toward a greater aspiration: political and social reform. To succeed, big ideas had to take account of the little stories of ordinary people. When they do not, they are very likely to fail.

Martin Luther King Day: What Historians Are Saying

The Myth of the Liberal Professoriate

 

For years conservative broadcasters and the right-leaning print media have denounced liberal control of American higher education. This assertion is based on the large number of academic instructors who belong to the Democratic Party and on actions taken at some colleges to promote a sense of inclusiveness and toleration to a point that, some say, discourages free speech. Both of these observations about academe are to a limited extent true, but they indicate that college faculty and administrators are conservative, not liberal, at least in the philosophical sense. And for intellectuals, liberal and conservative social and political values have traditionally rested on philosophical views of human nature.

Liberal programs initially emerged from the belief that human beings are inherently good or are attracted to the good. Seventeenth century religious liberals like the Quakers used the term Inner Light, or the presence of the Creator in all humans, to explain this, while later liberal theorists like Henry David Thoreau used the term conscience. On the other hand, early classical conservatives rooted their policies in the idea that people are by nature either evil or selfish. Religious conservatives like the Puritans believed that Original Sin left all with a powerful inclination to evil, whereas secularly oriented conservatives like Adam Smith, the father of capitalism, asserted that innate “self-love” drives human action.

Although we often lose sight of the philosophical origins of liberal and conservative policy, today’s public agendas reflect those roots. Conservatives have traditionally supported powerful militaries believing that strong nations selfishly prey on weak ones, while liberals downplayed the need for military spending and substituted investment in education and social programs in order to help individuals maximize their latent moral and intellectual capabilities. Similarly, conservatives advocated criminal justice systems characterized by strict laws and harsh punishment to control people’s evil or selfish impulses, while liberals favored systems that focus on rehabilitation to revitalize latent moral sensibilities. Conservatives traditionally opposed welfare spending believing its beneficiaries will live off the labor of society’s productive members, while liberals believed such investments help those who, often through no fault of their own, find themselves lacking the skills and knowledge needed to succeed. Though the philosophical roots of these policies are frequently forgotten today, these agendas continue to be embraced by liberals and conservatives.

College professors are philosophical conservatives. This is a product of their daily experiences, and it shapes their professional behaviors. First, the realm of experience: senior members of the profession are intimately familiar with the old excuse for failure to complete an assignment, “my dog ate my essay,” and its modern replacement, “my computer ate my essay.” Years ago, missed exams were blamed on faulty alarm clocks; today that responsibility has been shifted to dying iPhone batteries. Term papers cut and pasted from Wikipedia are an innovation; plagiarism is not. A philosophically liberal view of humanity is difficult to sustain amidst such behaviors.

The clearest manifestation of philosophical conservatism in the teaching profession is seen in tests and grades. Testing is based in part on the assumption that individuals will not work unless confronted by the negative consequences of failure, an outlook that is steeped in philosophical conservatism. (Historians, who spend inordinate amounts of time examining wars and economic depressions that often resulted from greed and avarice, find that their academic discipline itself encourages a philosophically conservative outlook.)

How then can academicians be accused of being liberal? As noted, this is partly because the majority of faculty are registered Democrats. Counterintuitively, this reflects a philosophically conservative and not a liberal outlook, especially with regard to the all-important matter of economic policy. During the debate over the tax bill last year, Republicans continued their traditional support for supply-side or trickle-down economics by proposing to lower taxes on high earners and corporations, whereas Democrats continued to advocate demand-side economics by proposing to shift the tax burden from the large middle to the small upper class and to provide tax credits for workers. Supply-side economics is based on the assumption that reducing the tax burden on the rich will lead them to invest in plant and equipment which will create jobs, the advantages of which will trickle down to workers in the form of wages and benefits.

This is a philosophically liberal notion. It assumes that people and corporations will invest in plant and equipment even when wages are stagnant, leaving many people without the income needed to purchase the goods new factories will produce. Demand-side economics, on the other hand, is partially based on the philosophically conservative notion that no rich person or corporation will build a plant if the masses lack the income needed to buy the product. In support of this position today, demand-siders note that many corporations are using most of the surplus capital from last year’s tax cuts to buy back stock instead of investing in capital assets, because wealth and income are more concentrated in the hands of the few than at any point in recent history, which minimizes the purchasing power of the many. In supporting tax schemes and other economic policies that put money in the pockets of the many, demand-siders embrace the conservative idea that such programs will stimulate selfishly based investment spending by corporations in an attempt to tap the rising wealth of the majority of consumers. Moreover, demand-side policies will unleash the selfishly oriented entrepreneurial inclinations of working people by giving them the wherewithal to open small businesses that spur economic growth.

College professors and administrators are also attracted to Democratic economic policy because they are aware of the successes of demand-side economics. There has not been a major depression since the New Deal, though there have certainly been recessions. This is because that movement largely achieved its goal of shifting the weight of the government from supporting a supply-side to a demand-side approach to economics by institutionalizing minimum wages, overtime pay for work beyond forty hours a week, unemployment insurance, Social Security, and strong labor unions. This was not part of some left-wing socialist agenda. The goal was to put money into the hands of the many and thus incentivize the capital class to invest in new productive capacity, and more importantly to maintain demand and spending when the economy slows.

Academicians generally, and historians especially, realize that prior to the New Deal, depressions (aka panics) occurred every ten to twenty years and were exacerbated by wage cuts which reduced demand and led to further layoffs and wage reductions. Minimum wages and union contracts which guarantee a wage for the life of the contract have slowed the downward economic spiral that turned recessions into depressions by limiting wage cuts, while Social Security and unemployment insurance also slowed economic downturns by helping sustain demand as the economy slackened. Though the contemporary right often argues that New Deal programs sought to create a liberal safety net for the poor, academics realize those programs were less attempts to help individuals directly and more attempts to jump-start a stalled economy and to keep it humming in part by incentivizing the capital class to continue to invest in productive capacity.

Conservatives also label academics as liberals because of their attempts to encourage inclusiveness and discourage what some term hateful speech on campuses. To the extent that this is true, and it often seems exaggerated, it is rooted in philosophical conservatism. Academics realize that language has great symbolic power, and symbols have a tendency to generate emotional as opposed to rational responses which colleges and universities rightly scorn. Academics also recognize that negative symbolism, including language, has served to dehumanize groups, and dehumanization has often led to discrimination and persecution. Only philosophical conservatives can have so little faith in human reason and goodness as to believe that emotionally laden language has the power to perpetuate injustice.

Ironically, the right, in supporting both supply-side economics and in tacitly accepting ethnically insensitive and sexist language, is embracing policies rooted in liberal not conservative thought, while the university – in favoring the opposite – adheres to a philosophically conservative outlook. Indeed, a traditional conservative would argue that the appeal of supply-side economics and insensitive speech lies in their ability to protect the wealth of the rich and to sustain the increasingly fragile sense of dignity of the middle class.

 

Roundup Top 10!

Teacher strikes can’t fix the core problems with our schools

by Diana D'Amico

The forces that once led to the growth of suburban schools have led to the decay of their urban counterparts.

 

Kruse and Zelizer: It's 'Network' nation: How our media became overrun by polarization, outrage and attitude

by Kevin Kruse and Julian Zelizer

How the news has become sensationalized.

 

 

What We Can Still Learn From American History's First Special Prosecutor

by Andrew Coan

The Mueller investigation grew out of a rich, complicated and not always edifying history that even most legal scholars and historians have largely forgotten.

 

 

Here’s How Democratic Presidential Contenders Should (Not) Talk About Russia

by David S. Fogleson

Candidates gearing up for 2020 may be blazing new trails on domestic issues, but when it comes to engagement with Russia, they haven’t moved beyond the counterproductive status quo.

 

 

Math And Science Can't Take Priority Over History And Civics

by Natalie Wexler

In our rush to prioritize STEM subjects, we’re overlooking other fields that are even more important.

 

 

Trump’s Trade Policy Threatens US Consumer as Much as China

by Paul Ropp

Trump’s China policy ignores the complete interdependence of the US and Chinese economies.

 

 

Angela Davis and the Jewish Civil War

by Marc H. Ellis

The Black-Jewish alliance, at least what’s left of it, faces a common challenge of how memorialization works and for whom.

 

 

The Radical Tradition of Student Protest

by Mike Jirik

The student protests against anti-black racism at UNC Chapel Hill are part of a long history of student protest against racism that includes individuals like John Brown Russwurm.

 

 

Why Study History?

by Elizabeth A. Lehfeldt

To answer that question, Elizabeth A. Lehfeldt tells a pedagogical story in two parts.

 

 

Why Americans trust technology but not science

by Joyce Chaplin

Benjamin Franklin understood that the two go hand-in-hand.

 

 

State of the Union: What would Jefferson do?

by Karen Tumulty

Pelosi's proposal was not as radical as it might sound.

 

 

Lots of People Won New Rights in the 1960s, but Not College Women Athletes

Chris von Saltza, Olympic champion (photo by Harry Pot, Dutch National Archives / Anefo, CC BY-SA 3.0 nl)

 

Today the #MeToo movement puts the spotlight on young women in college who have been abused without much recourse. Most media attention exposes flagrant violations by men, from date rape on campus to gender harassment by executives in the workplace. Looking back to the 1960s, however, another pervasive abuse was the benign neglect of women by colleges and universities. Women as students were treated inequitably in campus activities, especially in intercollegiate sports. Graphic examples can help us remember and learn from past practices.

Between August 26th and September 11th in 1960 Chris von Saltza stood on the victory podium at the Olympic Games in Rome four times to receive swimming medals, a total of three gold and one silver. She then entered Stanford University and graduated in 1965 with a bachelor’s degree in Asian history, gaining prominence in her long career as a computer scientist. After the 1960 Olympics Chris never had an opportunity to swim competitively for a team again. Stanford, after all, did not offer varsity athletics teams for women. What was a young woman to do? There was no appeal. For better or worse, this was the way things were in American colleges back then. 

In contrast to Chris von Saltza’s experience, over a half-century later another high school senior, American swimmer Katie Ledecky, won five medals at the 2016 Olympics held in Rio de Janeiro. She, too, was about to graduate from high school and would enroll at Stanford as a freshman in the fall of 2016. The historic difference was that she had a full athletic grant-in-aid plus a year-round national and international schedule of training and competition along with prospects for substantial income from endorsements and a professional athletics career. 

In 2018 there are pinnacles of success that indicate changes since the 1960s. Katie Ledecky has excelled as a student and works as a research assistant for a Stanford psychology professor. Ms. Ledecky also led her Stanford women’s swimming team to two National Collegiate Athletic Championships and recently signed a $7 million professional swimming contract.

Connecting the dots to explain the comparisons and contrasts of these two Olympic champion women swimmers who were students at Stanford requires reconstruction of the condition of college sports in the decade 1960 to 1969. The lack of collegiate opportunities for Stanford’s Chris von Saltza was not an isolated incident. Following World War II American women had triumphed in the Olympic Games every four years - but with little base provided by high school or college sports.

At the 1964 Olympic Games in Tokyo, the women’s swimming star was Donna DeVarona, who won two gold medals. In 1964, she was featured on the covers of both Time and Life magazines and named the outstanding woman athlete of the year. Despite her achievements, her competitive swimming career was over, as she and other women athletes had few if any options for formal training and participation in intercollegiate sports or elsewhere.

Young women from the U. S. won gold medals in numerous Olympic sports. A good example was Wilma Rudolph, who won three gold medals in track and field at the 1960 Olympics in Rome. Rudolph benefitted from one of the few college track and field programs for women in the U. S., coached by Ed Temple. Most of their competition was at Amateur Athletic Union (AAU) meets, with no conference or national college championship meets available. Furthermore, at historically black Tennessee State University, funding and facilities were lean. 

The limits on women’s sports are revealed in college yearbooks of the era. A coeducational university campus yearbook devoted about fifty pages to men’s sports, especially football and basketball. In contrast, women’s athletics typically received three pages of coverage. In team pictures, the uniforms often were those of gym class gear. The playing format was for one college to sponsor a “play day” in which five to ten colleges within driving distance gathered to sponsor tournaments in several sports at once. Softball, field hockey, basketball, and lacrosse were foremost.

Coaches, usually women, typically received minimal pay. Most held staff appointments in physical education, where they taught activity classes. The women’s gym had few, if any, bleachers for spectators. Coaches of the women’s teams usually lined the playing fields with chalk, mopped and swept up the gymnasium floors, and gathered soiled towels to send to the laundry. One indispensable piece of equipment for a woman coach was a station wagon, as players and coaches piled in with equipment to drive to nearby colleges for games and tournaments. The women’s athletic activities often had their own director – yet another example of “separate but unequal” in intercollegiate athletics and all student activities. There was a perverse equality of sorts: all women students were required to pay the same mandatory athletics fee as male students, even though the bulk of it went to subsidize varsity teams that excluded women.

Despite the lack of intercollegiate sports for women in the 1960s there were some signs of life. One was the creation of alliances that eventually led to the chartering of a national organization, the Association for Intercollegiate Athletics for Women (AIAW), in 1971, with over 280 colleges as members. The first action the Division for Girls and Women’s Sports (DGWS) took was to establish the Commission on Intercollegiate Athletics for Women (CIAW) to assume responsibility for women’s intercollegiate sports and championships.

One heroic figure associated with women’s sports to emerge in the decade was Donna Lopiano, who graduated with a degree in physical education from Southern Connecticut State University in 1968. She excelled in sports as a girl and was the top player picked in the Stamford, Connecticut Little League local draft. However, she was forbidden to play baseball with the boys due to gender restrictions in the league’s by-laws. Lopiano started playing women’s softball at the age of sixteen. After college, she was an assistant athletics director at Brooklyn College, coached basketball, volleyball, and softball, and then took on leadership roles in national women’s sports associations. Eventually she was Director of Women’s Athletics at the University of Texas along with appointments in sports policies and programs. She also was one of the most honored athletes of her era. Her experiences, including exclusion from teams, shaped her dynamic leadership over several decades.

The bittersweet experiences of women athletes such as Donna Lopiano, Chris von Saltza, Wilma Rudolph, and Donna DeVarona show that although the 1960s has been celebrated as a period of concern for equity and social justice, colleges showed scant concern for women as student-athletes. One conventional analysis is that the passage of Title IX in 1972 ushered in a new era for women in scholastic and college sports. That was an unexpected development. In congressional deliberations around 1970, neither advocates nor opponents of Title IX mentioned college sports. All sides were surprised when the issue surfaced in 1972. The National Collegiate Athletic Association opposed inclusion of women’s sports -- until it made an unexpected reversal in 1978. Many colleges were slow to comply with the letter or spirit of Title IX. As late as 1997 the burden was on women as student-athletes to file lawsuits against their own colleges, pitting them against athletics directors, presidents, boards, and university legal counsel.

Title IX eventually demonstrated how federal legislation could prompt universities to provide programs and services accessible to women that they would not have provided if left to their own volition. It has required contentious oversight of resources for student athlete financial aid, training facilities, coaching salaries and other parts of a competitive athletics team. It includes television coverage of women’s championships in numerous sports. Equity and opportunity across all college activities, ranging from sports to fields of study along with hiring and promotion, remain uneven. The caution is that the experience of a Katie Ledecky at Stanford, including her professional swimming contract, is exceptional. Sixty years after Chris von Saltza won her four Olympic medals and entered Stanford, inclusion of women as full citizens on the American campus is still an unfinished work in progress.

Separating Children from Their Parents Is an Anglo-American Tradition

 

The separation of children from their illegally migrant parents in the USA is seen as an aberrant and inhumane deviation from American tenderness for the family. This orphaning, as a matter of policy, is not “who we are,” as many liberals and some conservatives despairingly say.

But in many ways it is, and indeed, has long been. For the state, in the United States and earlier in Britain, has been a formidable creator of orphans. Perhaps this helps to explain the ambiguity in the attitude to the orphan: great display is made of theoretical pity and piety, but the way such children have been actually treated has frequently been punitive and repressive. Whether in orphanages, asylums, schools or other receptacles for those guilty of losing their parents, the extent of abuse by those in whose power such unfortunates have fallen is only now becoming clear.

Whatever charitable sentiments are kindled by the plight of orphans, such compassion has rarely prevented countries from making yet more of them by waging war, or by failing to prevent it, in those places – Syria and Yemen – where the indiscriminate harvesting of human life yields its sorry crop of abandoned children.

But it has not required war for governments, charities and even private individuals to rob children of their parents. From the first time a ship sailed from London to Virginia in the early 17th century taking “a hundred children out of the multitude that swarm about the place” until the last forced child migrants from Britain to Australia in 1967, thousands of young people were orphaned, not only of parents but of all ties of kinship, country and culture. The orphans sent from Britain to Australia alone numbered some 180,000.

A long association of derelict and orphan boys with the sea was formalized in a statute of 1703, which ordered that “all lewd and disorderly Man Servants and every such Person and Persons that are deemed and adjudged Rogues, Vagabonds and Sturdy beggars…shall be and are hereby directed to be taken up, sent, conducted and conveyed to Her Majesty’s Service at Sea.” Magistrates, overseers of the poor were empowered to apprentice to marine service “any boy or boys who is, are or shall be, of the age of ten and upwards or whose parents are or shall be chargeable to the parish or who beg for alms.”

Transportation removed 50,000 felons – among them many juveniles – to the American colonies, and in the process robbed many more children of at least one parent. In the 1740s, recruiting agents in Aberdeen sowed fear by luring children to service in the plantations. Peter Williamson and his companions, shipped to Virginia in 1743, were sold for sixteen pounds each. In 1789 the first convict ship consisting entirely of transported women and girls, the Lady Juliana, set sail for Australia.

The historical fate of a majority of orphans is unknown. Many were taken in by kinsfolk or neighbors, and while many must have been fostered out of duty or affection, others were certainly used as cheap labor, for whom their foster-parents were accountable to no one.

It was not until the industrial era that the policy of removing children from their parents in the interests of society became widespread. The Poor Law Amendment Act permitted parishes to raise money to send adults abroad. One of the Assistant Commissioners claimed that “workhouse children had few ties to their land, and such as there were could be broken only to their profit.” In 1848, Lord Salisbury also advocated emigration for slum children.

Annie McPherson, Quaker and reformer, was the first private individual to organize mercy migrations, the rescue of children from their “gin-soaked mothers and violent fathers.” She set up a program of emigration in 1869. Dr Barnardo used Annie McPherson’s scheme, before implementing his own in 1882. He referred to “philanthropic abduction” as the rationale behind this disposal of the offspring of misery. 

At the same time, “orphan trains” carried children from New York and Boston to the open plains of the West, under the auspices of the Children’s Aid Society, established in 1853 by Charles Loring Brace. Sometimes children were “ordered” in advance, others were chosen as they left the train, or paraded in the playhouses of the small towns where farmers could assess their strength and willingness to work. These “little laborers” responded to a shortage of workers on farms. Between 1854 and 1929 a quarter of a million children were dispatched in this way.

In Britain, what were referred to as “John Bull’s surplus children” were promised a future of open air, freedom and healthy work. Some were undoubtedly well cared for; but others were exposed to exploitation, life in outhouses and barns, freezing in winter, stifling in summer, isolation and deprivation of all affection. The proponents of such schemes argued that this would provide the children with a fresh start in life; but the cost of a one-way journey to Canada was far less than their maintenance by payers of the poor-rate.

Joanna Penglase has called “orphans of the living” those babies and infants taken from their mothers’ care for “moral” reasons, or simply because it was regarded as socially impossible for a woman to raise a child on her own.

In 2010, the then British Prime Minister Gordon Brown apologized for the removal of children from their parents under the Fairbridge scheme, which took them to Australia, a practice which continued into the late 1960s. In 2008 Kevin Rudd, then Prime Minister of Australia, apologized to indigenous families whose children had for generations been removed. In 2013 the Irish Taoiseach apologized for the abuse of orphans and illegitimate children by the Magdalene laundries from 1910 until 1970.

It is in this context that former Attorney General Jeff Sessions declared zero tolerance of illegal immigration in April 2018. All such people would be prosecuted. Families were broken up, because detention centers were “unsuitable” for children. In June, after harrowing scenes of forcible separations, Trump signed an executive order that families should be kept together. All children under five were to be reunited with their families within 14 days, and those over five within 30 days.

It might have been thought that the creation of orphans by government had been consigned to history. Was it amnesia or dementia that made the administration, in its determination to be tough on illegal migration, separate parents from their children in its retreat to a tradition of punitive indifference to the most vulnerable? 

And then, what of the orphans of addiction, of mass incarceration, the abductions of the offspring of the marriage of technology with commerce, orphans of the gun-fetish and the multiple social estrangements created by social media and the engines of fantasy which lure children from their parents, protectors and guardians? The orphan-makers have never been busier in this era of wealth and progress.

A Choir Sings Out Loud and Strong

The Charles R. Drew Prep School has been graduating gifted music majors for fifty years. It opened when the Vietnam War was raging in Southeast Asia. Throughout all of those years, and under all of its headmasters, it has maintained its prestige mainly because of its well-known choir, made up of some of the most skilled singers in America. Year after year, the boys would graduate after four years of superior singing and studies to enroll at America’s very best colleges.

Until now.

This year all hell breaks loose behind the ivy-covered walls of Drew. Gay boys in the choir fight with each other and the straight singers, too. Jealousies and hatreds rise to the surface. One talented singer, Pharus, insists that he is better than everybody else and struts across the stage all night. The problems are so great that a teacher is brought in to teach ‘creative thinking’ in an effort to restore calm to the choir and he fails at this job. What to do?

The play, which opened last week at The Samuel J. Friedman Theater in New York, is the story of eight singers - seven African Americans and a white boy, David, who has a fascination with Biblical heritage - an overly strong fascination. They sing together, they argue together and they make amends together in this very impressive play by Tarell Alvin McCraney, with soaring music and singing.

The play is a roller coaster of emotions and says a lot about youth history and racial history, and in many new ways. As an example, there is a marvelous discussion between the boys over what slave-era African American songs meant to the slaves in the 1850s and what they mean to African Americans today. Did the lyrics cry out for an escape from bondage then and now, or were they just lyrics and nothing more?

The highlight of the play is a searing argument over the ‘n’ word. The old, white creative thinking professor flies into a rage when one African American boy uses the word in yelling at another African American. The white professor tells them that they don’t know their history and the great struggle that has been going on for racial equality for hundreds of years.

The richness of the play, that also has some sharp humor, is not any one scene or one actor, though. It is the choreography by Camille A. Brown and the choir’s joyous singing of music by Jason Michael Webb.  It is one of the best choreographed plays I have ever seen and the choreography is really, really different. The conclusion of each song brings joyous roars from the audience.

Choir Boy could be a drama about any prep school, or any high school, in America. It is about teenage boys growing up between classes amid a myriad of racial and sexual tensions. In the end, too, these teenagers, who thought school and life were so easy, are confronted with the severe penalties they have to pay for their behavior.

Director Trip Cullman has done a wonderful job of telling a taut drama full of angst and hope and, at the same time, weaving in the song and dance numbers. The result is a very pleasing show. He gets fine work from his performers -- Chuck Cooper as the headmaster, John Clay III as Anthony, Nicholas Ashe as Junior, Caleb Eberhardt as David, the only white singer, and J. Quinton Johnson as Bobby. The extraordinarily gifted Jeremy Pope plays Pharus. He is a wonder both as a singer and actor and as a young man struggling with his homosexuality.

Veteran actor/director Austin Pendleton is sensational as the old white professor.  

The play has some minor problems. The plot is a bit choppy and you must pay careful attention to the story as it unfolds. There does not seem to be a strong reason to bring in the white teacher. There are pieces of the story that are left out. You never learn, as an example, whether this is a prep school that has a choir or a music school whose choir is an important part of the program. You are told that the white kid has to keep his grades up to stay in school but that the others, for some reason, do not. It is stressed that they are “legacies,” or students whose parents attended the school, and that they are safe no matter what they do (that’s not really true). The sex in the play comes and goes and you are not sure of people’s relationships and how they developed until late in the play.

Even so, Choir Boy is a powerful drama about the coming of age of a group of superbly talented and at the same time supremely distraught young men.

It is a song to remember.

PRODUCTION: The play is produced by the Manhattan Theatre Club. Scenic and Costume Design: David Zinn, Lighting: Peter Kaczorowski, Fight Direction: Thomas Schall, Music Director: Jason Michael Webb, Sound: Fitz Patton. The play is choreographed by Camille A. Brown and directed by Trip Cullman. It runs through February 24.

Is There a Statesman in the House?  Either House?  

 

It seems likely that 2019 will be one of the most challenging and consequential years in recent American history, perhaps on par with 1974. When Robert Mueller’s investigation of President Donald Trump concludes and his report is, in all likelihood, sent to Congress, a time of reckoning will be upon us.

The United States will need leaders in both parties to display a quality that has been in short supply in our country in recent years: statesmanship.

Statesmanship is a pattern of leadership, and an approach to public service, that is characterized by vision, courage, compassion, civility, fairness, and wisdom. Statesmanship can involve bipartisanship, but it is not the same as bipartisanship. History proves that there can be a strong bipartisan consensus to enact harmful policies or to evade difficult alternatives. 

When statesmen consider public policy issues, their first question is, “what is in the public interest?” Personal and partisan considerations can follow later, but hopefully much later. If the national good is not identical to, or even clashes with, personal and partisan considerations, the former must prevail. 

Genuine statesmanship requires leaders to dispassionately consider issues, carefully weigh evidence, and fairly render verdicts, even if they go against personal preferences or are contrary to the desires of their political base. Given our current political climate it is easy to forget that statesmanship, while unusual, has been a critical feature of American politics and history. 

Republican senator Arthur Vandenberg played a pivotal role in the late 1940s in securing congressional approval of key elements of President Harry Truman’s foreign policy including the Marshall Plan and NATO. Margaret Chase Smith, a first term GOP senator from Maine, broke from party ranks in 1950 and challenged Joseph McCarthy and his demagogic tactics. Senate Republican Leader Howard Baker damaged his chances for the 1980 GOP presidential nomination by supporting the Panama Canal treaties that were negotiated by President Jimmy Carter. Republican Richard Lugar, then the chairman of the Senate Foreign Relations Committee, defied President Ronald Reagan in the mid-1980s and pushed for economic sanctions on the apartheid regime in South Africa. 

Decisions of similar gravity are likely to face political leaders in Washington in 2019. Democrats should fairly evaluate Mueller’s report as it pertains to alleged Russian collusion and obstruction of justice by the president. They should not overreach because of their antipathy to Trump, nor, if Mueller’s findings suggest impeachable crimes, should they avoid their responsibilities out of fear that such an action would complicate their 2020 prospects. Republicans must end their reflexive and unworthy tendency to overlook the president’s frequently egregious and possibly criminal behavior because Trump remains hugely popular with the GOP base.

Senator Paul Simon, a consequential and successful public official in Illinois for more than four decades, worried during his final years that statesmanship appeared to be at low ebb. “We have spawned ‘leadership’ that does not lead, that panders to our whims rather than telling us the truth, that follows the crowd rather than challenges us, that weakens us rather than strengthening us,” he wrote. “It is easy to go downhill, and we are now following that easy path. Pandering is not illegal, but it is immoral. It is doing the convenient when the right course demands inconvenience and courage.”

Decades earlier, Senator John F. Kennedy wrote eloquently on the subject. In Profiles in Courage, he argued that politicians sometimes face “a difficult and soul-searching decision” in which “we must on occasion lead, inform, correct and sometimes even ignore constituent opinion.” Kennedy added that being courageous “requires no exceptional qualifications, no magic formula, no special combination of time, place and circumstance. It is an opportunity that sooner or later is presented to all of us. Politics merely furnishes one arena which imposes special tests of courage.” 

Those tests are coming in 2019.

Maestro Is Out of Tune

Arturo Toscanini was one of the greatest symphonic conductors in the history of the world. He first raised his baton at the age of 19 and pretty much kept conducting until near the end of his long life. The Italian genius led orchestras in numerous nations and even the famous NBC Symphony Orchestra in New York. His orchestras played some of the greatest classical music ever written and had some of the planet’s great musicians in them. Among the singers he worked with were superstars Ezio Pinza and Enrico Caruso. A little-known chapter to most Americans was his running battle with Benito Mussolini, the fascist Italian dictator, Adolf Hitler and their goons from the early 1930s to the end of World War II.

Now that intriguing story is finally being told in a new play, Maestro, that opened last night at the Duke Theater, 229 W. 42d Street, in New York. It stars John Noble, includes a small orchestra that plays compositions by some of the great composers, and features a vivid show of historical news clips that tracks Mussolini’s takeover of Italy.

Maestro does not work, not at all. The legendary Maestro drops his baton in this drama, and that is sad because the story is so fascinating and inspiring. The play, though, is really out of tune.

In history, Toscanini was publicly critical of both German Chancellor Adolf Hitler and Italian dictator Benito Mussolini, sided with the Jews all over Europe against the fascists and was condemned by the Italian government. The conductor eventually fled the country. Generally speaking, that story is told in the play, but most of the interesting aspects of his war with the fascists are left out by playwright Eve Wolf.

Toscanini refused to conduct in Italy because of his hatred for Mussolini and so he led orchestras in other countries nearby and his audiences there were enormous. That’s not covered sufficiently in the play.

The program notes tell the vivid story of how Toscanini helped Italian Jews escape transport to camps and how he even got them out of Italy and helped find them jobs in the U.S. That is not in the play.  

In 1931, six years into Mussolini’s reign, Toscanini was physically assaulted by fascists, but that story is not in the play. Musicians are quoted in the program notes recalling how Toscanini not only made them better musicians, but better people. That is not in the play.

He was one of the star conductors of Germany’s Bayreuth Music Festival, but he quit in 1931 as the Nazis marched towards power. In Italy, he refused to play the new fascist national anthem that Mussolini insisted upon, infuriating the Italian strongman. Again, not in the play.

Toscanini’s defiance of Mussolini never ended. As an example, he refused to ever show Mussolini’s photograph at concerts, as just about everyone else did. An angry Mussolini had Toscanini’s phone tapped and had him followed. Not in the play.

There are several problems with Maestro besides its thin history. First, it is a one-man show and Noble has to carry the whole drama on his shoulders. The tale cries out for other characters, such as his wife, Carla, the NBC brass who created their symphony just for Toscanini in 1937, his many musicians, music lovers and, of course, Il Duce himself. This is a great play waiting to happen. It does not happen at the Duke.

For some bizarre reason, playwright Wolf slips in the small orchestra to play numerous classical music pieces by Verdi, Wagner, Gershwin, Tedesco, and others. The music goes on and on and on. Actor Noble could have lunch between each piece. The musical interludes seem longer than Mussolini’s nearly 20-year reign in Italy. The orchestra has absolutely nothing to do with the story (the musicians in the orchestra, Mari Lee, Henry Wang, Matthew Cohen, Ari Evan, Maximilian Morel and Zhenni Li, are quite good, though).

The book is choppy. It starts with Toscanini yelling at “musicians” who are audience members and then he is off discussing the last years of his career, skipping over all the years, the decades, that made him so famous. 

Questions are never answered. Toscanini tells the audience that he and his wife had not slept with each other for over twenty years, that he did not see her from 1938 to 1945 and did not get a single letter from her for seven years. When he returns to Italy at the end of the war, he tells the audience he is stunned that she was not there to greet him. Huh? In the play, Toscanini sort of left out the fact that during his marriage he had affairs with half the women in Italy and one third of the women in America. And the wife did not want to see him?

We are told that Mussolini has taken his passport, which means he can’t leave the country, but then he pops up in New York City with the NBC Symphony. How did he do that? At one point in the story he shows up in Palestine to lead an orchestra made up of Jewish refugees. How did he get there? Did someone invite him? Did he get off the train at the wrong stop? What?

He returns to conduct at La Scala, in Milan, but says he won’t walk the streets of Milan. Why?

Noble does a decent job portraying Toscanini in this play directed by Donald T. Sanders, but, unfortunately, he has a very weak book to work with and is unable to tell much about the fabled conductor whom Mussolini hated so much. You could have learned much about history here, but you do not.

This play is like a concert in which much of the music is missing. 

PRODUCTION: The play was produced by the Ensemble for the Romantic Century. Scenic and Costume Design: Vanessa James, Lighting Design: Beverly Emmons & Sabastian Adamo, Sound: Bill Toles, Projection Design: David Bengali. The play is directed by Donald T. Sanders. It runs through February 9.

What I’m Reading: An Interview With Historian Carla Pestana

 

Carla Gardina Pestana is Professor of History at the University of California, Los Angeles and the Joyce Appleby Endowed Chair of America in the World. 

What books are you reading now?

I have a number of different books going at the moment. Disaffection and Everyday Life in Interregnum England by Caroline Boswell, a book I agreed to review, came to me because I listed myself as a military historian (among other categories) on the website Women Also Know History. This site makes information about women historians’ areas of expertise available in order to promote their work. Since it was the first request I had received that mentioned having found me there, I felt compelled to agree. Otherwise I seldom review books these days.

For recreational reading, I just finished Tigerbelle, The Wyomia Tyus Story, an autobiography of an Olympian. I don’t generally read either auto/biographies or modern history, but Ty is a family friend so I made an exception. It’s an interesting account, especially for what it shows about her world, which became dramatically wider (she was raised in the rural south but traveled extensively as a result of her athletic expertise) as well as for the gender dynamics prevailing in the era when she was coming up as an Olympian. 

More work-related but equally enjoyable has been reading Elena Schneider’s The British Occupation of Havana. I’ve been awaiting this study of the 1762 occupation of the supposedly impregnable Cuban port city. Through Schneider’s treatment we can see clearly that imperial boundaries were frequently crossed (in wartime as well as during periods of peace), and regional residents routinely failed to cooperate in efforts to close off one empire from another. That and her treatment of the role of slaves and free people of color in the defense and subsequent occupation of the city are profoundly illuminating. Hers is one of a spate of excellent new books on the early Caribbean. 

All this is not to mention the doctoral dissertation on Haitian independence and land that I just finished or the many books and articles I am rereading in order to decide if I should assign them for my winter quarter class. There’s always too much reading to do. 

What is your favorite history book? 

This seems like an impossible question, because there are so many amazing books. I have recommended certain books to many people, so I guess that is one measure. Mr. Bligh’s Bad Language is a great book, and Greg Dening was a scholar I always admired as well as a lovely person. I used to teach Natalie Zemon Davis’s The Return of Martin Guerre, along with the related debates. I liked that book for how it shows the way the historian does her work. When I was a graduate student I read (in the same week in my first term) Christopher Hill’s The World Turned Upside Down and Perry Miller’s The New England Mind: From Colony to Province. This conjunction set me to thinking: how could both these realities have coexisted? My M.A. thesis (which was published in the New England Quarterly in 1983) represented my first attempt to answer that question; my dissertation—on religious radicalism in early New England—followed and extended my effort to understand how England produced Quakerism and other forms of radicalism even as newly founded New England embraced orthodoxy and policed its borders with violent results. 

Recently, I have read a number of wonderful books about Caribbean history: Elena Schneider’s book on Havana, but also David Wheat’s Atlantic Africa and the Spanish Caribbean, 1570-1640 and Molly Warsh’s American Baroque: Pearls and the Nature of Empire, 1492-1700. I’m preparing to teach a new course on Atlantic history, so I have an enormous stack of such books to go through.   

Why did you choose history as your career?

The simple answer: my undergraduate teachers suggested I try graduate school. Growing up I knew no professors, indeed nobody with a Ph.D. But I loved reading and history, so I was amenable to the suggestion. I have never looked back: I went directly into graduate school from undergrad, and carried right on through M.A., Ph.D., and—somewhat miraculously—to a first position that was a tenure-track job at a good school. It all worked out amazingly well, although it seemed a crazy path, an unimaginable future, at the time when I took it up. If I had known I would become so passionate about being a historian that I would go live in another state for decades, away from family and my beloved Los Angeles, I wonder what I would have done. But, after a long haul, I managed to come home, and take a position at my graduate institution. I now work 15 miles from where I was born, and not too many academics can say that. 

The more complex answer: I find entering an alternate world an interesting way to use my intellectual abilities. Immersing oneself in a particular time and place in order to come to know it well is a fascinating process. At times in my career I have become so thoroughly immersed in my research that I have gotten mixed up about the number properly assigned to the current month; during the era I study, the first of the year was in March, which made December (quite sensibly) the tenth month. On the rare occasion that I can stay in the seventeenth century for an extended time, I have to remind myself that September is not in fact (any longer) the seventh month. 

I once read an essay by Edmund Morgan, who suggested that we focus our research questions on what doesn’t make sense. That insight and instruction strike me as apt in that it’s the disjunctions, the perplexities, which draw the eye and demand to be explored. When we complete that exploration, we so often find something unexpected and revealing. I would take his observation one step further, to say that as we come to know a time and place well, we become more sensitive to the unexpected. Some projects, of course, dig into unknown topics and archives but most re-consider (through deeper research or new questions) already studied topics. I find my work always shifts back and forth between verities (often contained in the historiography) that need to be challenged, and archival sources that open up the possibility of answers. That tension keeps the intellectual life of the historian interesting. 

What qualities do you need to be a historian?

Well, I don’t know about all historians, but I am tenacious, organized, and detail-oriented. I have trouble taking no for an answer, so I just keep digging and trying to figure out what I want to know. While I have never minded (indeed I cherish) time spent alone at my desk or in an archive, struggling with writing and with research, I also thoroughly enjoy the opportunities that my work gives me for thinking with others. I enjoy talking to people—students, colleagues, or the public—about ideas and about the past. I am equally pleased by the solitary and the communal aspects of this work. Ideally you can do both, work alone and with others. I feel that being able to support oneself through work as a historian is a great privilege. 

Who was your favorite history teacher? 

Another difficult question, since I have benefited from the teaching and guidance of so many great history teachers. Besides my father, who was not a history teacher but was prodigiously intelligent and would answer all my childhood questions about history, a high school teacher leaps to mind. Milton (Mickey) Sirkus was a great teacher. When I think back now about his pedagogical approach, I have to laugh. In an honors history class I took with him in high school, he teased each of his students about her or his heritage as a way to engage us in U.S. history. It is hard to imagine a teacher today who would use such a hook, and even at the time it struck me as rather edgy. Because he engaged us on a personal level, we defended our immigrant ancestors and relatives, and their contributions to U.S. history, more energetically than we might have otherwise. Until that class I had never thought much about what my ancestors and older relatives had faced as the children of immigrants. He used sarcasm and teasing in a way I would not feel comfortable doing. As one example: when I wrote to him many years later to explain that I had gone on from his history class through college and graduate school to become a historian, he wrote back to welcome me to the ranks of the unemployed. It was in fact a terrible time for finding a history position in a university, but luckily he was eventually proved wrong on that score.  

After my high school history class, I had excellent teachers at my undergraduate alma mater (people who pointed me toward graduate school) and in my graduate program too. I was fortunate to go to UCLA to pursue a graduate degree in early American history in the 1980s. I started working with Gary Nash, who was an amazing lecturer, galvanizing the undergraduates in his big classes, and an excellent editor, giving the best readings of my written work that I have encountered anywhere. My second year at UCLA, Joyce Appleby joined the faculty, and she was stunningly accomplished in all aspects of the work we do. She served as a role model for so many of us—so smart and so no-nonsense. I am so pleased to have a chair at UCLA now named in her honor. 

What is your most memorable or rewarding teaching experience?

I used to employ a first-day exercise in smaller classes that both the students and I enjoyed. I’d ask them to write down and pass up their earliest historical memory, and then I would write all their answers on the board. Then we’d discuss the list from numerous angles, starting with what criteria they had used to decide an event was historical. We discussed what guides us in making that sort of call, and what examples of formal historical writing might align with their choices. The exercise always resulted in a great first-day discussion, one that often ranged widely. I have to say, doing it also brought home to me the ages of my students, as I watched their earliest events move forward in time. I haven’t done that in a while, but I do remember it fondly. 

These days I am enjoying the work I do at UCLA with transfer students. Some of the best undergrads I have taught here have been from the local community colleges, transferring in as juniors. They undergo a bit of culture shock, but at the same time they are eager, smart and enthusiastic. I have thoroughly enjoyed overseeing undergraduate honors theses with a handful of them.  

What are your hopes for history as a discipline? 

I work with so many smart, engaged young people that I am able to remain hopeful. It is easy to bemoan these anti-intellectual times and to worry about what will happen to the American university system and to our ability as a society and a culture to engage intellectually. Yet many people care deeply about learning, including learning about the past, and they work at the thinking and writing that we—whether as producers or consumers—need to keep history going as a discipline and as a form of knowledge. So in spite of the gloomy prognostications, I remain hopeful. History is a foundational component of a humanist education, and it is something that many people beyond the academy know to be valuable. I’m toying with writing a book for a popular audience in part to try to make some of the work we do in the academy more accessible and interesting to those outside it.  

Do you own any rare history or collectible books? Do you collect artifacts related to history?  

I don’t own any particularly rare or collectible books, although I do still have my beloved print copy of the OED—The Oxford English Dictionary—in two volumes, with its magnifying glass in the little drawer that allows me to read the many pages printed on each sheet.  

As for artifacts, I have received some fun items as gifts from former students. One gave me an old nautical sextant—appropriate to my work on maritime history and privateers—while another gave me a framed sheet out of an early edition of John Foxe’s The Actes and Monuments (better known as Foxe’s Book of Martyrs)—a gift relevant to my work on religion. I also have a counted cross stitch sampler that replicates one from the late seventeenth century. The original maker was a New England girl who grew up to join the Quaker meeting in Lynn, Massachusetts, a meeting and a community that I wrote about in my first book. My mother stitched it for me as a gift while I was writing a dissertation that included this girl, Hannah Breed, and other people from her community. 

What have you found most rewarding and most frustrating about your career?

When you ask about my career, I assume you mean my own personal triumphs and trials. If that’s the intention, I have to admit that I have been extremely privileged and lucky, so both the high points and the low occurred in that context. 

One of the most rewarding aspects of my privileged position has been being able to take all the time I wanted and needed to write a second book. I got tenure based on the first book, so despite the pressure to publish again quickly (and the harsh strictures from one department chair in particular about “frozen associate professors” who didn’t finish a second book promptly), I produced a second book (The English Atlantic in an Age of Revolution, 1640-1661) that differed drastically from my first. It took me forever to learn all that I needed in order to be confident about that book and to send it out into the world, but it was a better book for it. I am glad I did not bend to the pressure (whether self-inflicted, institutional, or otherwise) to be fast, and the tenure system allowed me that opportunity. That book might still be my personal favorite of those I have so far written, because of how far I had to stretch to write it. It didn’t help matters that I had two children over the course of researching and writing it, either. 

That is the perfect lead-in to the frustrations. Like many women in my cohort, I experienced the challenges of having babies at an institution with no pregnancy leave policy. My female colleagues thought I should go ask what arrangements would be possible, but the chair of the department looked at me blankly. It was aggravating, but because I didn’t have tenure the first time around, I just thanked him and left. I didn’t become better at advocating for myself the second time, either, even though by then I did have tenure. My children are in their early to mid-20s, so this was not all that long ago. Most of the women academics I knew then who were older than me did not have children, and those who did often had them before they joined a department. If you found yourself in my situation, you were supposed to hope your baby arrived in the summer, best of all in early summer, so you could spend a little time at home; if the baby was born at a different time of the year, you might be allowed to teach an overload, bank some courses, and get a little time off that way. Some colleagues seemed to think that one should not try to be an academic and a mother. I managed, as did others, but the lack of support or even awareness was a source of frustration. 

How has the study of history changed in the course of your career? 

I have been at this a while, so it has changed in various ways. In my own original field of early American history, when I was in graduate school my fellow students were doing the “New Social History,” studying various groups in society, often using quantitative methods. The cultural turn had already overtaken literature departments but was just coming into historians’ awareness. Soon that became the dominant approach, but at the same time areas such as Native American history were blossoming too. Today it seems that some of those early seeds of social history scholarship—especially its engagement with race, class and (eventually) gender—have paid big dividends, reshaping the ways we think about so many topics.  

In my own historical scholarship, I have been most conscious of the shift in geographical frames. Today Atlantic history seems a bit passé, but the shift out from British North America felt startlingly true and profound at the time. When I was a graduate student, colonial America meant the thirteen colonies that became the United States, and the only external links that mattered ran back to Britain. Most projects were framed within a single colony, and the bent toward social history meant detailed archival work within a relatively narrow geographical framework. Looking up from that narrow landscape to perceive the connectedness of various places not in North America, and indeed not within the English imperial boundaries, felt like a revelation.  

What is your favorite history-related saying? Have you come up with your own?

I do not like the usual history sayings, because they often assume some simple connection between the past and the present that I perceive to be wrong. For instance, I don’t agree that history repeats itself, or, in George Santayana’s version, that “those who cannot remember the past are condemned to repeat it.” Even Karl Marx’s variant, “History repeats itself, first as tragedy and then as farce,” doesn’t strike me as entirely accurate. The factors that shape our present are so complex and multifaceted that attempts to achieve or avoid a particular outcome usually set into motion numerous unintended consequences—that (more than the repetitive nature of history or our ability to remember it and thereby keep it from repeating) is what strikes me most often as I study the intentions of historical actors. 

I am rather more enamored of the L.P. Hartley observation, which points in the opposite direction: “the past is a foreign country; they do things differently there.” The opening line of his novel could actually be read as a caution to those who look for repetition or simple lessons, since finding them often involves ignoring the differences. 

For the sheer pleasure of following its twisted history in our popular culture, I do rather like Laurel Thatcher Ulrich’s “Well-behaved women rarely make history.” I read it in its original context, before it developed a life of its own, in an article about what was considered proper behavior for women. Laurel meant it as a straightforward description of the cultural ideal: women were not to draw attention to themselves but to remain quietly in their prescribed roles. She was not issuing a call to revolution or advocating that women should misbehave and make history. But the quotation got picked up and flipped from its original meaning to its opposite. That reversal is fascinating, and I particularly love how people attribute it to various women (such as Eleanor Roosevelt) who purportedly said it to advocate that women make trouble and call attention to the need for change. Laurel has written a book on the whole phenomenon, in part to get all those who know her to stop sending her pictures of it misattributed on t-shirts, coffee mugs, and protest signs. 

The strange history of that history quotation makes it fun. It remains ubiquitous, and I bet people still email Laurel about its odder appearances. I’ve long since quit doing so, although I continue to see it around. 

What are you doing next?

Well, I am chair of my department, so I am doing a great deal of university service. I care deeply about my department and my university, so I don’t mind giving some of my time over to this work. But that obligation does mean that I will produce less scholarship in the short term. I do have a book manuscript on Plymouth Plantation that I am trying to finish for the 400th anniversary of the Mayflower landing. It differs from anything I have done before, in that it is aimed at a popular audience. I was inspired to write it by an extended visit to the living history museum that reenacts Plymouth, having been brought in along with others to help the staff there update their historical coverage. That experience got me thinking about Plymouth and how we Americans envision it. My impulse to create this work owes something to the fact that I wrote for a few years for the Huffington Post. Writing for a popular audience about the intersections between the past that I study and current events proved a challenging discipline; 800 words are very few (at least for a historian who writes 200-page books), and I found the need to respond quickly and in a focused fashion invigorating. I am trying to bring what I learned doing that to this new project. 

As usual—as has been the case since the start of my career—I also have a little Quaker piece I am mulling over. My very first research project as a graduate student was on the Quakers, and I keep coming back to them with various questions and ideas. And finally, I mean to get back into the Jamaican archives, to follow up some of what I was doing with my previous book. So, lots to do, but not enough time to do it all. Isn’t that always the case?

Trump’s Bullshit-Savant Moment on Afghanistan

Once again, the President put his factually challenged relationship with the past on public display. In a January 3rd Cabinet meeting, Trump offered a tour de force of fanciful alternative history about Afghanistan. According to him, the Soviets invaded in late 1979 because of cross-border terror attacks. The subsequent decade-long war, the President insisted, bankrupted the USSR and led to the collapse of the Soviet Union. Trump clearly had no idea what he was talking about. If the past is a foreign country, then, in Trump’s parlance, he is an illegal immigrant trespassing upon it. 

 

To briefly correct the President’s (mis)understanding of Afghan history: The Soviet Union invaded the country on December 26, 1979, ostensibly to support a friendly communist government under threat from a domestic insurgency provoked by unpopular reforms and the violent suppression of political dissent. Fearing the collapse of an allied regime on its southern border, the Soviets replaced the Afghan communist leadership with a more moderate and pliable cadre. Though initially planning for a swift withdrawal, Soviet forces soon found themselves sucked into a quagmire which proved impossible to escape. Over the next decade, they deployed roughly 100,000 troops, losing 15,000 of them, in a bloody counter-insurgency against the so-called mujahideen, the American-supported ‘freedom fighters’. The war forced over 7 million to flee as refugees, created an unknown number of internally displaced persons, and killed, maimed and wounded an untold number of Afghans. The Soviet war ended with the Geneva Accords in 1988, allowing the USSR to feign ‘peace with honor’ as cover for an ignominious retreat. 

 

The United States and its allies immediately denounced the Soviet invasion, which made Afghanistan a battleground in the increasingly hot Cold War. American policy-makers saw the potential of turning Afghanistan into the Soviet Vietnam. Beginning with the Carter administration, and ramping up significantly under Reagan, the US secretly funneled $3 billion to the Afghan mujahideen. By bleeding the soft underbelly of the beast, American Cold Warriors hoped to strike a mortal wound to the evil empire. Following the end of the Cold War, some conservative commentators characterized the Soviet defeat as a consequence of Reagan’s tough stance, which forced the Soviets into incessant and unsustainable defense spending. These analysts contend that the Afghan war, along with the cost of Soviet military aid to Central America and the US deployment of Pershing missiles in Europe, bankrupted the Soviet Union and led to its collapse. 

 

It was this interpretation of history to which Trump’s stream-of-consciousness soliloquy rather clumsily tipped its hat. Nevertheless, the President’s alternative history almost immediately earned him a scathing rebuke from no less august a stalwart of the right than the editorial page of the Wall Street Journal. The Journal’s willingness to take him to task for a position loosely held by many on the American right over the years is notable. Doubly so for a publication which has repeatedly proven reluctant to fact-check the man. 

 

Yet the Journal’s response is a non sequitur. What has been lost in the consternation provoked by the President’s remarks is the fundamental question which remains unanswered – namely, what the hell is the US doing in Afghanistan? Though he got his facts wrong – Trump does not seem to care about them anyway, and is thus the bullshitter-in-chief in the Harry Frankfurt sense – the essence of his question is correct. The US has lacked a clear policy on and purpose in Afghanistan since the early 2000s, making the President’s rambling, historically uninformed remarks something of a bullshit-savant moment. 

 

With the war now entering its eighteenth year and ranking among the costliest in American history, the President has reportedly grown frustrated with a continuing conflict which he seemingly does not understand. While his ignorance provides fodder for detractors and evokes the concern of the national security establishment, it also allows him to ask basic questions regarding the purpose of that war which have long been considered settled within Washington circles of power. The President’s ignorance of Afghanistan, though extreme, is far from unique amongst the American policy establishment. Such ignorance is the consequence of a larger failing of American policy in the country – the lack of a clear, publicly pronounced purpose and end-goal for the continued American presence in Afghanistan. 

 

Despite nearly two decades of war in the country, American policy is largely driven by a noxious combination of inertia and sunk costs. A large part of the problem is that America’s civilian political leadership long ago abdicated its war-fighting responsibilities regarding Afghanistan. It is the role of civilian elected officials to formulate, articulate and communicate the fundamental purpose of an armed conflict and to direct the military and security apparatus of the government to execute that vision. But this has not been the case with Afghanistan. After the quick victory over the Taliban in 2001, America’s political attention quickly wandered elsewhere, most importantly to Iraq. The Afghan war has thus largely been farmed out to generals who have never been instructed in its aims and purpose. The military has continued to do what the military knows best – fight a war. It is no wonder, then, that this conflict goes on with no end in sight. 

 

What the hell is the US doing in Afghanistan? The President clearly does not know. But this is the central question, the one that the President himself, along with the other elected officials of the US Government, needs to answer. It is neither the responsibility nor the place of the US military leadership to do so. In his bullshit-savant moment, Trump has set himself a challenge. Sadly, it is one he has demonstrated little interest in or ability to rise to. 

 

From Birth of a Nation to Silent Sam: What History and Popular Culture Can Teach Us About the Southern "Lost Cause" and Confederate Monuments Today Steve Hochstadt is a writer and a professor of history at Illinois College. 

The thousands of monuments to the Confederacy and its leaders scattered across the South have become a national political controversy that shows no signs of abating. The decision of the City Council of Charlottesville, Virginia, to remove the statue of Robert E. Lee, mounted on his horse on a 20-foot high pedestal in the center of town, prompted three public rallies of white supremacists in 2017. At the Unite the Right rally in August last year, James Alex Fields Jr. drove his car into a crowd of counterprotesters, killing one woman and injuring dozens of people. He has just been convicted of first-degree murder. The statue still stands.

 

Of the approximately 1,700 public memorials to the Confederacy, fewer than 100 have been removed in the past few years. These visible symbols represent the persistence of a cherished historical myth of American conservatives, the honor of the “Lost Cause” of the Civil War. Developed immediately after the defeat of the South in 1865, the Lost Cause relies on two claims: the War was caused by a conflict over states’ rights, not slavery, and slavery itself was an honorable institution, in which whites and blacks formed contented “families”.

Thus the political and military leaders of the Confederacy were engaged in a righteous struggle and deserve to be honored as American heroes.

 

This interpretation of the Civil War was a political tool used by Southern whites to fight against Reconstruction and to disenfranchise and discriminate against African Americans. Northern whites generally accepted this mythology as a means to reunite the nation, since that was more comfortable for them than confronting their own racial codes.

 

During most of the more than 150 years since the end of the Civil War, the Lost Cause reigned as the official American understanding of our history. The glorification of the Ku Klux Klan in the 1915 film “Birth of a Nation” (originally titled “The Clansman”) was a landmark in the nationalization of this ideology. The newly formed NAACP protested that the film should be banned, but President Woodrow Wilson brought it into the White House, and the KKK sprang to life again that year in both North and South.

 

Not as overtly supportive of white supremacy as “Birth of a Nation”, “Gone With The Wind” in 1939 reinforced the Lost Cause stereotypes of honorable plantation owners, contented slaves unable to fend for themselves, and devious Northerners. It broke attendance records everywhere, set a record by winning 8 Academy Awards, and is still considered “one of the most beloved movies of all time”.

 

Generations of professional historians, overwhelmingly white, transformed the Lost Cause into official historical truth, especially in the South. Textbooks, like the 1908 History of Virginia by Mary Tucker Magill, white-washed slavery: “Generally speaking, the negroes proved a harmless and affectionate race, easily governed, and happy in their condition.” This idea prevailed half a century later in the textbook Virginia: History, Government, Geography, used in seventh-grade classrooms into the 1970s: “Life among the Negroes of Virginia in slavery times was generally happy. The Negroes went about in a cheerful manner making a living for themselves and for those for whom they worked.” A high school text went into more fanciful detail about the slave: “He enjoyed long holidays, especially at Christmas. He did not work as hard as the average free laborer, since he did not have to worry about losing his job. In fact, the slave enjoyed what we might call collective security. Generally speaking, his food was plentiful, his clothing adequate, his cabin warm, his health protected, his leisure carefree. He did not have to worry about hard times, unemployment, or old age.” The texts were produced in cooperation with the Virginia state government.

 

The Civil Rights struggles of the 1960s not only overturned legal segregation, but they also prompted revision of this discriminatory history. Historians have since thoroughly rejected the tenets of the Lost Cause. All the leaders in the South openly proclaimed that they were fighting to preserve slavery, based on their belief in the inherent inferiority of the black race. Both official and eyewitness sources clearly describe the physical, psychological and social horrors of slavery.

 

But the defenders of the Lost Cause have fought back against good history with tenacious persistence. In the international context of the Cold War, the local journalists and academic historians and forthright eyewitnesses, who investigated and reported on the real race relations in American society, became potential traitors. These “terrorists” of the 1950s cast doubt on the fiction of a morally superior America, as it battled immoral Communism. The dominance of white Americans in every possible field of American life was also threatened by a factual accounting of slavery before, during, and after the Civil War.

 

Bad history persists because those in power can enforce it by harassing its critics. It was easy for the FBI and conservative organizations to pinpoint those academics, journalists, and film directors who dissented from the Lost Cause ideology. They could then be attacked for their associations with organizations that could be linked to other organizations that could be linked to Communists. These crimes of identification were made easier to concoct because of the leading role played by American leftists in the fight against racism during the long 20th century of Jim Crow.

 

Thus did Norman Cazden, an assistant professor of music at the University of Illinois, lose his job in 1953. The FBI had typed an anonymous letter containing what Cazden called “unverified allegations as to my past associations,” and sent it to the University President. Cazden was among 400 high school and university teachers anonymously accused by the FBI between 1951 and 1953.

 

The defenders of the Lost Cause switched parties in my lifetime. Shocked by the white supremacist violence of the Civil Rights years, popular movements and popular sentiment forced both parties to end Jim Crow, using historical and political facts to attack all facets of white supremacist ideology, including the Lost Cause.

 

The shift of Dixiecrat Democrats to loyal Republicans is personified in the party shift of Strom Thurmond, Senator from South Carolina and most prominent voice in favor of segregation, from Democrat to Republican in 1964.

 

It still seemed appropriate in 2002 for the Senate’s Republican leader, Trent Lott, to toast Thurmond on his 100th birthday by saying he was proud to have voted for Thurmond for President in 1948, and that “if the rest of the country had followed our lead, we wouldn’t have had all these problems over the years, either.” None of the major news outlets, the “liberal media,” reported the remark, dwelling instead on the pathos of the old, famous, rich racist. Only a groundswell of criticism forced the mainstream media to recognize Lott’s words as a hymn to white supremacy.

 

By then, generations of Americans, both in the South and in the North, had absorbed the bad historical lessons that remain the basis for racist beliefs today. 

 

The Lost Cause lives on in the South, supported by federal and state tax dollars. An investigative report published in Smithsonian magazine revealed that the official sites and memorials of the history of the Confederacy still “pay homage to a slave-owning society and serve as blunt assertions of dominance over African Americans.” During the past decade, over $40 million in government funds have been spent to preserve these sites, originally created by Jim Crow governments to justify segregation. Schoolchildren continue to be taught Lost Cause legends.

 

Politics keeps bad history alive, because of the political expediency of the false narratives it tells. American white supremacists have been created and encouraged by this version of American history. 

 

So the struggle over history goes on. Most recently, several dozen graduate teaching assistants at the University of North Carolina announced a “grade strike” to protest the University’s plan to spend $5 million constructing a new building to house a Confederate monument that protesters had pulled down in August. They are refusing to turn in students’ grades.

 

The Lost Cause story itself deserves an “F”, but it will persist as long as political leaders find its fictions convenient.

What Historians Are Saying About the Wall, the Shutdown, and Trump's Primetime Speech

Roundup Top 10!  

A wall can’t solve America’s addiction to undocumented immigration

by Julia G. Young

For more than 70 years, undocumented immigrants have shaped the U.S. economy.

 

Nixon in Fiction: A Bizarre, Complex History

by Alan Glynn

And What We Should Expect for Our Current President

 

 

Brazilian Politics and the Rise of the Far-Right

by Daniela Gomes

What history can teach us about the election of President Jair Bolsonaro in Brazil.

 

 

Alexandria Ocasio-Cortez isn’t the first self-described socialist elected to the House.

by David Greenberg

Here's some historical perspective on socialists in Congress.

 

 

The hole in Donald Trump’s wall

by Tore Olsson

As long as Americans continue to flood into Mexico, the wall will do little to deter crossings.

 

 

Kevin Kruse and Julian Zelizer: Trump's Demise Will Be "Worse Than Watergate"

by Kevin Kruse and Julian Zelizer

If the multiple charges against Trump prove out, he’ll easily displace Nixon at the top of the Crooked Modern Presidents list.

 

 

The history of science shows how to change the minds of science deniers

by Ephrat Livni

By understanding the history of science, we can “keep the world from falling apart.”

 

 

Lincoln's Legacy and the Government Shutdown: The Suicide of a Great Democracy

by George Packer

A shutdown looks like the beginning of the end that Lincoln always knew was possible.

 

 

The Crisis of Imperialism And Why It Will Only Get Worse

by Tom Engelhardt

A tale of imperial power gone awry that could hardly have been uglier. Yet, it’s hard to imagine how things won’t, in fact, get uglier still.

 

 

Why Is Trump Spouting Russian Propaganda?

by David Frum

The president’s endorsement of the U.S.S.R.’s invasion of Afghanistan echoes a narrative promoted by Vladimir Putin.

 

 

No, Trump Cannot Declare an ‘Emergency’ to Build His Wall

by Bruce Ackerman

If he did, and used soldiers to build it, they would all be committing a federal crime.

 

Congresswoman Virginia Foxx Thinks We Should Abandon The Term "Vocational Training." I Disagree. Last week, Congresswoman Virginia Foxx condemned some unnamed folks as “classist” for placing the “stigma” of inferiority on those who opt for a vocational or technical education. Indeed, even to place the adjective “vocational” in front of the term “education” implies such inferiority, Ms. Foxx asserts. But the Congresswoman misses the real import of education and misses, therefore, the distinction between education and training.  

 

Unfortunately, Ms. Foxx is not alone in missing this distinction. In its true meaning education must be about introducing young people to a knowledge of themselves and a knowledge of the relation that they have to society, the world, and the universe in which they live. We are all historical creations.  We inherit our attitudes, beliefs, and values from the world around us.  It may well be, for example, that the United States is the greatest country in the world, or that Christianity is the one true and only religion, or that the limits capitalism places on democratic control are divinely or naturally ordained.  But young people born and raised in this country will believe this not because they have chosen such beliefs, but simply because the whole of the world around them tells them this is so.  Only by deeply studying the real history of this country, only by understanding evolution and the history and the immensity of the universe, only by delving into literature and psychology, can young people begin to comprehend who they are as human beings in this country today.  And only by sampling the wealth of human knowledge in all its varied fields can young people freely decide to pursue one path or another in making their life and making their living.  

 

Of course, every state legislature in this country, in underfunding public education and pushing the agenda Ms. Foxx pushes here – let’s call training for a career the same thing as an education – effectively seeks to deny young people knowledge of themselves and the world in which they live. In so doing, they disempower our youth and simply make “education” into a means of churning out compliant men and women who will work for fifty or sixty years, and then die. 

 

No, this country desperately needs educated young people. Those of us who insist that our students get a real education are not the “classists” condemned by Ms. Foxx. No, the real “classists” are those who would condemn our youth to the perpetual role of servants in a world ruled by wealth.

Yesterday was the 100th Anniversary of Theodore Roosevelt's Death. Here's How His Legacy Still Shapes the United States Today. The beginning of the year 2019 marks the centennial of the death of the 26th President of the United States, Theodore Roosevelt, who passed away at age 60 on January 6, 1919. The impact of Roosevelt was massive, and continues to be so on America a century later. Here are five ways that Teddy Roosevelt’s legacy still shapes the United States today. 

 

The first and most significant contribution of Theodore Roosevelt to his country was his commitment to and advocacy of conservation of the environment, including the promotion of national parks and national monuments, the protection of our natural resources for the long term, and an insistence that government and the people alike show respect and awe for the great natural wonders of the North American continent.  Roosevelt is regarded as the premier figure who inspired the environmental movement, which fortunately was encouraged and accelerated by many of his White House successors, including Woodrow Wilson, Franklin D. Roosevelt, John F. Kennedy, Lyndon B. Johnson, Richard Nixon, Jimmy Carter, Bill Clinton and Barack Obama.

 

Second, Roosevelt emphasized the need for social justice and encouraged “progressivism” from the White House. He was committed to the cause of workers and consumers both in and out of office.  The need for responsible government regulation of corporations was a driving force in his life.  He sincerely believed that many problems in American society could not be resolved just on the state and local level, but needed a national voice for all of the American people—not just the wealthy and privileged.

 

Roosevelt also shaped the modern presidency as he revived the presidential office after its decline in power and influence following Abraham Lincoln’s assassination. In doing so, he became the model for many future presidents, including Wilson, FDR, Harry Truman, Kennedy, Johnson, Nixon, Carter, Clinton and Obama. Presidential scholars in history and political science regularly rate Theodore Roosevelt as a “Near Great” president, ranked only behind Lincoln, George Washington, and FDR. It is quite a feat to hold such scholarly admiration and public renown for an entire century.

 

Fourthly, Roosevelt saw the absolute need to build the defenses of the United States against any future foreign threat. In particular, he loved and was fascinated with the US Navy. He believed war was at times necessary to protect the great experiment in democracy and the constitutional framework set up by the Founding Fathers.  As part of his perception of world affairs, Roosevelt saw the need for the building of the Panama Canal, and for the assertion of American authority over the Western Hemisphere, going past the wording of the Monroe Doctrine of 1823 with his Roosevelt Corollary of 1904 and his Big Stick policy toward Latin America.  Unfortunately, this created a long-term image of the United States as an imperialist power, not well regarded or appreciated by the independent nations of the hemisphere.

 

Finally, Roosevelt, while promoting military and naval buildup for the protection of the nation, was also a great diplomat. His expansion of American diplomacy and relations with foreign nations helped expand American power in the early 20th century. He became very close to nations that would later become our allies—particularly Great Britain and France—and set a new standard for presidential engagement by negotiating the Treaty of Portsmouth, which ended the Russo-Japanese War of 1904-1905 and won him the Nobel Peace Prize in 1906.  He also took a moral stand against signs of aggression in the world, warning of the danger of German aggression at the time of the Morocco Crisis of 1905-1906. He spoke out against the pogroms going on in Czarist Russia during his presidency, and worked to promote a peaceful coexistence between Japan and the United States in the Far East, due to his concerns over our territories of Hawaii, Guam, and the Philippine Islands. 

 

These five positive contributions of Theodore Roosevelt have lasted and will continue to have an impact on the American Presidency and the future of the American nation.

 

Fox News’s Nefarious Role in Misinforming Trump Voters

 

During the past two years in which Donald Trump has stumbled through his presidency, critics have been asking why so many Americans continue to back him despite mounting evidence of deeply flawed leadership. Often, these critics express contempt for the millions of Americans who constitute Trump’s “base.” They complain that Trump’s partisans are uninformed people who refuse to acknowledge that the president’s lying, ethical lapses, and failed policies are harming the nation. Trump remains in power, these critics argue, largely because starry-eyed followers ignore the facts.

These critics cast blame in the wrong place. Trump’s supporters, representing 38% of the electorate according to a recent poll, do not deserve all the censure that is directed at them. They did not create the pro-Trump narrative. They are its recipients. Conservative media have been especially influential in promoting optimistic judgments about Trump’s leadership. Fox News serves as command central for this perspective. It draws a large audience. In October 2018, according to Nielsen’s research, Fox racked up its 28th consecutive month as the No. 1 basic cable news channel. Fox drew more viewers than CNN and MSNBC combined. 

Millions of Americans who wish to be informed about current events tune in regularly to Fox. Once they are Fox fans, they tend to stick with the channel. The hosts and reporters on Fox News encourage loyalty. Frequently, they make damning references to CNN (a favorite target) as well as CBS, ABC, NBC, MSNBC, the New York Times and the Washington Post. They hint that only Fox can be trusted. Don’t look elsewhere for information, they warn, because “liberal” networks hawk “fake news.”

What kind of coverage does the Fox News viewership receive through prime-time reporting and commentary? Consider the lessons viewers learned on Thursday, December 20, 2018, an extraordinary day of troubles for Trump’s presidency. Leading print and television journalists outside of Fox expressed shock that Trump suddenly announced plans to withdraw U.S. troops from Syria and quickly draw down half of U.S. troops in Afghanistan. They warned that an American exodus from Syria could benefit the Russians, Syrians, Iranians, and Turks at the expense of the Kurds who fought bravely against ISIS. Military leaders and foreign policy experts blasted Trump’s decision as ill-advised and dangerous. 

General Jim Mattis’s decision to resign as Secretary of Defense also received abundant commentary on December 20. Mattis’s letter of resignation communicated strong disagreement with the direction of U.S. foreign policy. Mattis, in a clear rebuke of the president, noted that during his four decades of “immersion in the issues” he had learned the importance of treating allies with respect and “being clear-eyed about malign actors and strategic competitors.” Both American and international leaders were alarmed that the last general who seemed capable of taming the erratic president planned to leave his post.

On the home front, President Trump led congressional leaders to believe that a compromise was workable on temporarily funding the government. But Trump suddenly reversed his position, insisting there would be no settlement unless Congress provided $5 billion for a border wall. On December 20 the stock market tanked on this news and other developments. The next day Wall Street closed with its worst week since the financial crisis of 2008.

Mainstream journalists focused on the chaos associated with these developments. Several Republican leaders joined them in expressing concern, including Senators Lindsey Graham and Bob Corker. Senate Majority Leader Mitch McConnell backed compromises aimed at averting a shutdown, but Trump and his backers in the Freedom Caucus made compromise unworkable.

Fox News television viewers got almost no sense of this mounting crisis when watching prime-time programming on the night of December 20. Shows hosted by Martha MacCallum, Tucker Carlson, Dan Bongino (sitting in for Sean Hannity) and Laura Ingraham directed viewers’ attention to other matters. The programs focused on a subject that had already received extensive coverage on Fox in previous months and years: undocumented immigrants. Hosts and guests warned repeatedly that dangerous foreigners threatened to overrun American society. 

Even though the real “news” on December 20 was about struggles in Congress to keep the government running, Fox’s prime-time programming highlighted stories about an immigrant invasion. Commentators asserted falsely that Democrats advocated “open borders.” They accentuated a report about a violent undocumented immigrant in California. In each program hosts and commentators left viewers with an impression that the big news of the day concerned security threats from aliens. Speakers praised President Trump for his determination to build a wall.

What other topics dominated the night’s discussions on Fox, essentially eclipsing any discussion of the big stories of the day about blow-back from the president’s controversial actions? 

Martha MacCallum’s program featured a lengthy interview with Susan Collins. The senator from Maine talked at length about some extremist critics of Brett Kavanaugh’s nomination to the Supreme Court (individuals who harassed her in disgusting ways). 

Tucker Carlson drew attention to the work of Robert Shibley, who maintained that America’s universities have been pushing aggressively against free speech. Carlson also took shots at a “Climate Tax” and made fun of claims about Russian interference. He maintained that China was the real threat. 

Dan Bongino and his guests blasted Deputy Attorney General Rod Rosenstein’s defense of the “so-called Russia investigation.” Bongino praised a “terrific book” by Gregg Jarrett called The Russia Hoax: The Illicit Scheme to Clear Hillary Clinton and Frame Donald Trump.

Laura Ingraham, like all the other prime-time hosts, devoted considerable time to immigration, but she encountered difficulty when a sheriff objected to some of her arguments. The officer acknowledged the difficulty of holding a violent undocumented man in California because of laws pertaining to sanctuary cities, but he noted that many immigrants in his community were good citizens and sought help from law enforcement when troubled by criminals. “That’s a lie!” Ingraham responded. She ended the interview quickly by mentioning “violence, rape, burglary, robbery and other offenses against property and people.”

A pattern in the reporting and commentary on Fox was evident in these four prime-time programs. Because they focused on old news stories that had been red meat in right-oriented media commentary for years, there was little time left for an analysis of the stunning developments of the previous 24 hours. General Mattis’s resignation letter hardly got a nod. The outcry by national and international leaders regarding President Trump’s plan to withdraw from Syria and Afghanistan received little or no attention. The impact of a government shutdown on the American economy and the American people was almost completely ignored. There was hardly a word about the stock market’s plummet that day or the huge slide of recent weeks. Instead, viewers heard about scary threats from immigrants, Democrats, university administrators, and Chinese hackers. They were reminded often that President Trump fights tenaciously for ordinary Americans.

Much of the discussion did not reflect what used to be identified as mainstream Republican stands on economic and political affairs. Instead, viewers got an earful of analysis from individuals who spoke from the margins of political debates. Hosts and commentators seemed eager to please their most important viewer, the President of the United States. Most of them endorsed and celebrated Donald Trump’s statements and actions, despite their sharply controversial nature. Dan Bongino, substituting as host for Sean Hannity, provided an example of the slant by promoting his co-authored book, Spygate: The Attempted Sabotage of Donald J. Trump. 

Endorsing controversial and often questionable ideas has, of course, been evident in Fox’s broadcasting over many years. A decade and a half ago, the channel sounded a drumbeat for war in program after program, convincing many viewers that Saddam Hussein had been responsible for the 9/11 tragedy, threatened the world with WMDs, including nukes, and needed to be removed. On the same channel, host Bill O’Reilly devoted considerable time to warning viewers about a “war on Christmas.” Anti-religious forces were trying to replace “Merry Christmas” with “Happy Holidays,” O’Reilly cautioned. During those years, Roger Ailes, then Chairman and CEO of Fox News, designed the channel’s modus operandi, applying a heavy-handed slant to the news.

Complaints about Donald Trump’s enthusiasts are often misplaced. Many citizens who are regular patrons of Fox News and other right-oriented programming want to be well-informed about current events. They take civic engagement seriously. Those listeners and viewers should be more discerning, of course, but they are not fully to blame for making judgments about national and international affairs that raise the eyebrows of Trump’s critics. Fox News and other opinion sources on television and radio are primarily responsible for the base’s narrow and skewed viewpoints. Millions of Americans have turned to the Fox News Channel and other sources seeking knowledge about current events. They have been let down by the manipulators of “news.”

How a Harlem Immigrant Views What’s Happening on the Border

 

The confluence of seemingly random things can sometimes trigger the need to re-examine who we are, where we come from, what we believe and what we really know and understand. Not just about ourselves individually, but, often of greater significance, about our country.

On the eve of the November mid-term elections, the President of the United States returned to the same racist, anti-immigrant diatribe with which he launched his Republican Party presidential campaign three years earlier. With over-wrought, fear-mongering warnings, he harangued the electorate, inveighing against a “massive” horde: a caravan of brown people replete with Middle Eastern terrorists poised to invade the US southern border, take our jobs and assault our women. Press coverage of the several thousand people slowly streaming from various Central American countries toward the US-Mexican border showed a somewhat different picture: poor, often ill-shod people—many of them women and children—slowly walking northward in hopes of gaining asylum and a better life in the US. Despite the President’s seemingly racially biased pre-election exhortations to his core followers, his political party lost 40 seats and control of the US House of Representatives. Two weeks later the same US President authorized US troops—possibly illegally sent by him to that southern border—to use lethal force against those “invaders” if and when they tried to cross the border. By the last week in November we had been treated to the sight of desperate people trying to do just that being indiscriminately tear-gassed: men, women with toddlers, even pregnant women. In a nutshell: a rather ugly, quite dismaying sight. That same week my good friend and journalist-colleague Raymond Peterson sent me a link to a story about the death of one of the last Navajo Code Talkers of WWII at the age of 94. And late on the evening of 30 November, news broke that the 41st President of the US, George Herbert Walker Bush, had died, also at the age of 94. 

Definitely an interesting confluence of events in the month of November, one that worked in sinuous ways within my conscious/unconscious psyche. Especially since, as a veteran practicing journalist with a rather unique background and perspective, I’ve long considered myself the ultimate insider-outsider “observer” of the foibles and follies of this country. Among those converging “strands of confluence” that triggered re-assessment and realignment of knowledge and understanding: the consummate patriotism of that 94-year-old Navajo, emblematic of all those Code Talkers whose native language became the US military’s impenetrable communications code, one the Japanese could not break, and thus helped assure the Allied WWII victory in the Pacific. Navajo leaders reportedly say there are now fewer than ten of the Code Talkers still alive. Many of them had to “jump through hoops” just to be allowed to “perform their civic duty” for their country. (So too with the Nisei Japanese-American 442nd Infantry Regiment—which became the most highly decorated military unit in US history—as well as the Tuskegee Airmen and the Harlem Hellfighters of WWI.) 

Tie this Navajo Code Talker’s service and passing to the equally patriotic, love-of-country WWII fighter-pilot deeds and passing of George H. W. Bush, reportedly the youngest Navy fighter pilot in that war: a bona fide decorated war hero who continued to serve his country as a dedicated citizen and public political figure (a US Congressman from Texas, US Ambassador to the UN, Director of the CIA, 43rd Vice President of the US and, of course, our 41st President). But then you must also follow the equally connected threads and sinews. The same George H. W. Bush rightly lauded for his astute handling of the end of the Cold War (the successful dissolution of the Soviet Union and the smooth reunification of Germany after the fall of the Berlin Wall), for the swift, timely assembly and leadership of a UN coalition to stop and repel Saddam Hussein’s invasion of Kuwait, and for advocating “a thousand points of light” in a “kinder, gentler America,” is the same George H. W. Bush whose 1988 bid for the presidency launched the modern era of vicious, negative, divisive, race-baiting political campaigning with the infamous “Willie Horton” campaign ad, a tactic the current occupant of the White House still relies upon quite heavily. He is the same George H. W. Bush who was fully complicit in the Reagan administration’s Iran-Contra Affair and its cover-up: illegally selling missiles to Iran, via Israel, to raise funds for the CIA-backed Contra forces fighting the Sandinista regime in Nicaragua, thereby subverting and defying laws passed by the US Congress terminating funding for those Contras. As stated in the National Security Archive’s 25 November 2011 posting: “Independent Counsel Lawrence Walsh continued to consider filing criminal indictments against both Reagan and Bush.” Neither man was indicted, but it should be noted that before he left office, the 41st President pardoned every member of the Reagan administration indicted or convicted for their part in Iran-Contra. One individual had not even come to trial, but was pardoned anyway. Closer to home for me, and equally negative, was the invasion of Panama that George H. W. Bush ordered in 1989, in which at least 3,000 civilians lost their lives so that General Manuel Noriega could be “arrested” for aiding, abetting, and profiting from the drugs the cartels were funneling into the US. Keep in mind this fact: people already knew General Noriega was in bed with those drug runners when the CIA recruited him as an “asset,” primarily to aid the agency in its war against the Sandinistas in Nicaragua. The individual running the CIA at the time: George H. W. Bush.

So those disparate threads and sinews triggered interesting synaptic connections.  One was simply this, as Vice writer Cole Kazdin explained last summer: “Many historians and policy experts are quick to point out that much of the troubles in Central America were created or at least helped by the US’s interference in those countries going back decades. In other words, the foreign policy of the past has profoundly shaped the present immigration crisis.” And there have been similar reports in the New York Times within the past two months. 

But those are basically just reminders of “causal events” I fully understand, having lived through and covered some of them. I was the ABC TV News Foreign Assignment Editor on the afternoon of June 20, 1979, when our cameraman, Jack Clark, called with the shocking news that correspondent Bill Stewart had been executed by one of Nicaraguan dictator Anastasio Somoza’s National Guardsmen. Jack had captured the horrific event on film, film we smuggled out and shared with CBS, NBC, and, by extension, the international news community. The upshot: the US (Carter administration) withdrew support for Somoza’s dictatorial regime, which was quickly overthrown by the insurgent, revolutionary Sandinistas.

You see where this is going, right? The incoming Reagan administration abhorred the allegedly “Communist” Sandinistas and did everything in its power, legal and illegal, to thwart and destroy them. Thus we get the CIA-backed Contras and the subsequent Iran-Contra web of lies, deceit, cover-up, prosecutions, and pardons outlined above. And yes, out of that turmoil and violence, caused or influenced by the US in Nicaragua and in similar “adventures” elsewhere in Central America, we are now “graced” with our current debacle: a militarized southern border, shuttered legal border crossings, and a nonexistent amnesty policy. We are now graced with a White House occupant threatening to shut down the government if he doesn’t get funding for a “Border Wall” instead of demanding comprehensive immigration reform legislation. We are now graced with stories of seven-year-old girls dying of dehydration while in the custody of US Customs and Border Protection. This is who we are now?

What, you’re now wondering, do all these intersecting, interwoven people, places, and events have in common, and what do they mean to any of us? For me it’s about belonging or not belonging, about inclusion and exclusion, about who is welcomed and who is not, and thus ultimately about who we are as a people and a nation. The members of that caravan of Central Americans are brown people. That Navajo Code Talker would be called a “Redskin” by some in this country. Those Nisei warriors, the Tuskegee Airmen, and the Harlem Hellfighters were “yellow” and “black” people of color who many residents of this country felt did not “belong.” And as the stoker-of-fear-and-division in the White House fully understands, some still feel this way.

From its very beginnings this country has wrestled with this question. Our various attempts at legislating an answer have always been race-based, heavily inflected toward the white, Northern European. Asians, Blacks, Browns, the darker-skinned Eastern Europeans, even the country’s Native Americans, were all deemed of lesser worth and not to be truly welcomed into the fabric of the nation’s citizenry. This I observed and analyzed day to day, living and growing up in the village of Harlem after my 9th birthday. Yours truly, me, the ultimate insider-outsider: a native-born immigrant. Oxymoron? Not really.

If the late Senator John McCain, a white male from the state of Arizona who was born in the Panama Canal Zone, was a “native born” son eligible to run for the presidency of the US, why not a black kid from Harlem, born in a US hospital in Ancon, Canal Zone, with, ahem, a birth certificate to prove it? Equally “native born,” right? Unfortunately for a number of people, back in June of 1952, in another of those virulently anti-immigrant waves reminiscent of the current xenophobic crest we seem to be drowning in, the McCarran-Walter Immigration bill was passed over President Harry Truman’s veto.

I took great pains and immense personal pleasure in highlighting its inhumane, racist, anti-immigrant nature in Our World, Winter 1952: Fear and Frustration, a documentary I produced in 1986 for Our World (ABC TV News), a short-lived but outstanding documentary series. It was a horrendous piece of legislation, as Mr. Truman bluntly stated in exercising his veto power:

“it discriminates, deliberately and intentionally, against many of the peoples of the world ... The idea behind this discriminatory policy was, to put it baldly, that Americans with English or Irish names were better people and better citizens than Americans with Italian or Greek or Polish names…. Such a concept is utterly unworthy of our traditions and our ideals. It violates the great political doctrine of the Declaration of Independence that "all men are created equal." 

It also made citizenship and legal immigration a very murky and difficult prospect not just for the darker-hued folks of Southern and Eastern Europe, but also for certain Asians and the black and brown people of Latin America and the Caribbean. This current 21st-century wrangling over illegal and legal immigration, from the now “back-burnered” Deferred Action for Childhood Arrivals (DACA) struggle to the current fear-mongering about the “Caravan Invasion,” is just another revisitation of the age-old inclusionary/exclusionary struggle that forever plagues this country.

Mavis, my wise and proactive mother, took no chances with the citizenship of her children. She used her status as a legal immigrant and naturalized citizen to have us naturalized. Thus this black kid from Harlem’s unique native-born-immigrant status, with birth certificate and naturalization papers to prove it. The ultimate insider-outsider.

In history class I learned this lesson as a high school student: the U.S. Constitution as originally written was a morally flawed document that not only condoned and protected the ownership of human beings by other humans, but actively supported slavery.

Fr. Tiffany, my Cardinal Hayes High School American history teacher, imparted in-depth, eye-opening awareness and insights to us in that history class. Among the key ones: the country’s founding document was written to protect and safeguard the rights and property of white, male oligarchs. Fortunately it was written with the foreseen and unforeseen flexibility to eventually make it applicable to everyone, not just those oligarchs. 

Yes, as we are all too well aware, who tells the story, from what perspective, and what is included or left out is central to who we are as a people and as a society. It is central to a full understanding of the underpinnings and bedrock of that society. Learn and accept the fact that this continent and country have always been a place of and for immigrants. Eons ago the ancestors of the Native American First Peoples migrated from Asia across the Bering Strait land bridge (“Beringia”) and eventually populated this land mass from north to south. It’s no longer a hypothesis: recent DNA testing has confirmed the Siberian and Beringian origins of these tribes. Later immigrants into what’s now North America, whether Vikings, Italians, Dutch, Spaniards, English, and other Europeans, or free and enslaved Africans, simply added to an ongoing, continuous influx. It has always been a diverse, multicultural land. Of all the works delineating and exploring this, especially as it pertains to the United States, my favorite by far is A Different Mirror: A History of Multicultural America by the late, and for me really great, Dr. Ronald Takaki. It’s a must-read for anyone seeking to understand the root causes of this country’s racial problems and constantly recurring “exclusionary” immigration tendencies.

Those flawed Founding Fathers were not exceptions in fomenting these national ills. Yet even as there are significant anti-immigrant advocates today, there were also “inclusionary” voices back then: men like John Jay and Alexander Hamilton, who pushed hard for educating free and enslaved Africans so they too could enjoy the full fruits of the freedom and self-government promised by that recently won American Revolutionary War. Note well: from the very beginning, there has always been that exclusionary/inclusionary battle, resurfacing over and over again.

We need to re-examine and redefine ourselves to become that beacon of refuge and light we’ve only been pretending to be for a select few. We need to take a page from the positive side of George H. W. Bush and seek a “kinder, gentler America” that re-emphasizes what President Harry Truman also stated in his 1952 veto of the McCarran-Walter Immigration Bill: “It denies the humanitarian creed inscribed beneath the Statue of Liberty proclaiming to all nations, ‘Give me your tired, your poor, your huddled masses yearning to breathe free.’”

Cliché or no, if that’s not who we are now, that’s exactly who and what we need to strive to be.

Technology's Poor Step-Sister: Maintenance

Festival of Maintenance have joined Instagram! Follow us for more #maintenance, #repair and updates for the 2019 Festival https://t.co/DuCly3urEv pic.twitter.com/yPdE9oPExI

— Festival of Maintenance (@MaintenanceFest) December 19, 2018

This is Jonathan Coopersmith’s history of technology blog. An Associate Professor of History at Texas A&M University, Coopersmith is the author most recently of FAXED: The Rise and Fall of the Fax Machine (Johns Hopkins University Press, 2015).

 

Like Rodney Dangerfield, maintenance, whether of physical or social infrastructure, gets little respect despite its importance. That importance has historically remained mostly invisible and unappreciated, until something breaks. Keeping things going without a reduction in service, or, even more challenging, keeping them operating while modifying them, is a challenge that applies to airplanes and organizations alike.

 

As the Festival of Maintenance conference in London this fall demonstrated, the concept can be expanded to almost any area, depending on how you define maintenance. One of the delights of this one-day event was the big tent of participants. Attending were academics, museum curators, professional maintainers, a range of maker organizers and activists, an architect, organization and advertising executives, and the Guerilla Groundsman (and those were just the speakers). The result was a vibrant salad of very different ingredients, which was both a strength and a weakness. The strength came from the range of perspectives and approaches. I had multiple “I had never thought of that” moments, always the sign of a good conference.

 

The weakness came from trying to pull everything together into a semi-coherent whole.  The widely ranging approaches and topics also reflect the youth and diversity of maintenance as an exciting area of study and action, as organizer Laura James’s article in Medium covering all the presentations illustrates. 

 

Maintenance has not received its due academic and public appreciation because it is considered “normal” and is often invisible; much of the history of technology focuses on innovation rather than on application, diffusion, and operation. Budgets and life-cycle costing, however, ignore maintenance at their peril, as do societies. The public prestige of maintainers is often low, reflected in the historical downward trajectory of words like “technician.” This conference, together with the American Maintainers conferences of 2015 and 2017, is part of a growing professional and academic interest in maintenance worthy of serious study.

 

Perhaps the Festival’s most unconventional talk came from the Guerilla Groundsman who, wearing a pixelated box over his head, described his individual efforts to move beyond picking up trash to actually maintaining neglected parts of the Cambridge public infrastructure.  He cleaned bridges, cut shrubbery, painted bollards and repaired wooden benches, fences, and a sign.  Never did anyone stop his acts of civic reconstruction.   His actions could be seen as quixotic or a sign of local budget cuts in Britain since 2009.

 

The Guerilla Groundsman raises interesting questions about the potential expansion of active citizen participation in physical maintenance.  His endeavors demanded skill, the appropriate tools, and a willingness to buy wood, paint, and other materials.  In contrast, the civic groups I’ve seen in Japan, the then Soviet Union, and America only picked up litter (though the Japanese efforts were far more intensive), which requires no training or real skill.

 

More importantly, how do you move beyond the local actions of an individual to a larger organizing of individuals into coordinated actions in cooperation with governments?  Or should that even be tried? 

 

King’s College London professor David Edgerton described the discipline and resources demanded by maintenance regimes while wondering why societies regard maintenance and its close relative, manufacturing, as less interesting and prestigious than invention. One benefit of Brexit and the Trump-provoked trade wars is that they render explicit the complexity of contemporary supply chains, not just for making things but also for maintaining them.

 

Mike Green of the Central London Maintenance Association, which has a majority of the London facilities management market, described a changing institutional world of property maintenance.  A major challenge is reshaping institutions to ensure their compliance with standards in an internet-based world. 

 

As both Edgerton and Green emphasized, maintenance demands compliance and discipline. And, as the history of many technologies illustrates, what was once seen as liberating may become imprisoning. The Victoria & Albert Museum’s Natalie Kane described how adult disposable diaper use is spreading from incontinent adults to fully functional ones whose employers mandate that they wear diapers to maximize their work efficiency or time before taking a break. Is this mission creep or, in the cases of gamers maximizing their time before a screen and of Chinese travelers packed into overfilled train cars during the holidays, self-decided?

 

The number of speakers on sustaining voluntary maker groups reflected both the attention this conference received in that community and the challenges these mostly volunteer groups face in creating and sustaining themselves. Adrian McEwen modified a P. J. O’Rourke quote, “everyone wants to save the earth, but no one wants to do the dishes,” to illuminate the challenges of sustaining a volunteer organization.

 

Institutional infrastructure clearly needs as much attention as physical infrastructure to ensure its functioning, resilience, and relevance, but is maintenance the correct prism through which to view organizational competence? If not, what are better approaches? Judging by the common challenges the maker groups faced, their survival and effectiveness would benefit from maintenance-like analytic guidance and a toolkit to motivate members and get the dishes done by improving their social functioning.

 

Certainly, thinking of goals and actions from a maintenance perspective can stimulate ideas for a long-term viewpoint.  Alex Mecklenberg, a business consultant at Doteveryone, suggested, “Let’s think about imagining the maintenance of your bank instead of imagining the future of the bank.”  That’s a very different perspective. 

 

Speaking by Skype, an activity now so common as to go unnoticed (unless the connection is poor), Lee Vinsel provided a brief history of the Maintainers, or “How a Group of Bureaucrats, Standards Engineers, and Introverts Made Technologies That Kind of Work Most of the Time,” in contradistinction to Walter Isaacson’s The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution. The Virginia Tech professor also briefly described an upcoming project, the Maintenance Community Framework, designed to make maintenance activities more effective and more visible to academics and the public.

 

Maintainers, both academic and applied, need to study the broader context in which organizations operate and political decisions occur. Simon Elmer of Architects for Social Housing provided this background in his overview of London council housing. Compared with renovating and refilling buildings, tearing existing housing down and replacing it with new, larger buildings is more profitable for everyone (builders, architects, and others who bill as a percentage of a project, and local councils) except the householders. How can groups seeking to upgrade, not uproot, existing buildings frame their arguments in ways that are economically and politically attractive to the council members who will decide what course to take?

 

Studying and acting on maintenance are clearly resonating with academics and practitioners.  The Alfred P. Sloan Foundation recently announced it will support the Maintenance Community Framework for “InfoMaintainers - people working in libraries and digital preservation - and Maintainers in the Workforce, a group of experts on the law and policy of labor and poverty.”

 

This mixing will continue:  The 2019 Festival of Maintenance will occur September 28 in Liverpool.  For more information, go to https://tinyletter.com/FestivalOfMaintenance or https://twitter.com/MaintenanceFest

UPDATED Highlights of the 2019 AHA Annual Meeting

Dear #Twitterstorians; my kid in attendance at #AHA19 announced she wants to be a historian and activist. I think my work here is done.

— Naomi Rendina (@NaomiRendina) January 6, 2019

Key Links

Interesting choice Macmillan... not sure I’d have brought those to the #AHA19 but sure... pic.twitter.com/FBqbGRAqv6

— Adam Domby (@AdamHDomby) January 5, 2019

 

PSA: Women with PhDs in history are not "little girls." #AHA19 #twitterstorians #womenalsoknowhistory https://t.co/xIcbjjIYpX

— Coordinating Council for Women in History (@TheCCWH) January 6, 2019

News

.@JLWeisenfeld opens her paper by noting that she presented her first scholarly paper as a grad student at ASCH 29 yrs ago. At that conference, she was 1 of 2 scholars of color on the program + the only one presenting on AfAmerican religion. #ASCH19 pic.twitter.com/n0v2bS8xm2

— Christopher Jones (@ccjones13) January 4, 2019

 

What Historians Are Tweeting (most recent entries at top)

 

Q from audience on the progress of women’s history: LKK says that women’s progress in historical profession is linked symbiotically to the existence of women’s history—there is no either/or. #AHA

— Dr. Ann M. Little (@Historiann) January 5, 2019

 

Linda Kerber speaking RN at the #AHA19 Cmmee on Gender Equity breakfast talking ‘bout her generation of scholars graduating in 1969. pic.twitter.com/IC0Mr0VFaB

— Dr. Ann M. Little (@Historiann) January 5, 2019

Sessions Relevant to Topics in the News (Links to tweets may not be active until after sessions have begun)

Thursday, January 3, 2019

Friday, January 4, 2019

Saturday, January 5, 2019

Sunday, January 6, 2019

 

“I was told there would be no math.” -@KevinMKruse this quote is how you know you are at a conference of Historians#aha19 #s25

— Thomas Harvell-DeGolier (@DeGolierThomas) January 3, 2019

 

 

Do historians miss the ideals of assessment, as some have suggested? https://t.co/VXkK9PnLGb

— History News Network (@myHNN) January 4, 2019

 

 

 

What I’m Reading: An Interview With Historian Sharlene Sinegal-DeCuir

 

Sharlene Sinegal-DeCuir is an Associate Professor of History at Xavier University of Louisiana. She received her PhD from Louisiana State University in 2009 and her areas of concentration are in American, African American, and Latin American history.

What books are you reading now?

I am reading several books at the moment, including: The New Jim Crow: Mass Incarceration in the Age of Colorblindness by Michelle Alexander; Witness to Change: From Jim Crow to Political Empowerment by Sybil Morial; and White Rage: The Unspoken Truth of Our Racial Divide by Carol Anderson.

What is your favorite history book?

My favorite history books are W. E. B. Du Bois’s Black Reconstruction in America, 1860-1880 and Carter G. Woodson’s The Mis-Education of the Negro. There are many more history books that I enjoy and consider my favorites, but for me, those two are the classics.

Why did you choose history as your career?

I don’t actually think I chose a career in history, it chose me. I began my college career as a pre-med major at Xavier University of Louisiana, a small private liberal arts college that is known nationally as being number one for placing African-Americans into medical schools. After the second semester of my sophomore year, I found myself questioning if I wanted to be an MD or if it was the path my parents chose for me. I decided to change my major to history because I remembered how much I liked it in high school and I had decided that I was going to be a lawyer. Fast forward to senior year, I took the LSAT and began applying to law schools, all the while questioning if that was the right choice for me because I realized I LOVED history.

At the last minute, I decided I would go to graduate school instead of law school. I told myself I was only going to get a master’s degree and then go to law school. I had not taken the GRE, had not applied to any programs, and had no clue how I was going to get into any program. So, being a Louisiana girl, I took myself over to LSU, walked into the graduate admissions office, and told them I wanted to earn a master’s degree in history. I was allowed to take courses as a non-matriculating student for a semester while I prepared for the GRE and applied to the actual master’s program in history at LSU. I got accepted into the program and thought, okay, after this, law school. Well, literally right after I defended my master’s thesis, the department chair asked me if I would like to stay at LSU and complete my PhD. Without even hesitating, I said yes, and the rest is history. Best decision I ever made!

What qualities do you need to be a historian?

You most definitely have to be open-minded and passionate. You must be willing to allow history to speak to you, no matter how difficult the subject. You also have to have a passion for history because I promise you, those long nights of research, preparing for conferences, writing articles, teaching, etc. can be daunting. If you have a passion for the subject, especially the subjects you teach/focus on, the little things don’t bother you and that passion will be felt by others. I have students every semester that enter my classroom telling me how much they hate history. I always tell them, “no, you don’t hate history; you hate the way it has been taught to you.” If you are passionate about what you teach, the students become passionate, active learners ready to soak up everything. Over the years of teaching, I have had several students change their majors to history after taking one of my classes. I think that’s the most fulfilling part of being a professor.

Who was your favorite history teacher?

I have had several amazing history teachers. Many of my undergraduate professors at Xavier University of Louisiana became my colleagues -- a few have since retired -- but Drs. Jonathan Rotondo-McCord, Gary Donaldson, and Shamsul Huda, along with Sr. Barbara Hughes, have all contributed to my love for history.

My PhD advisor at Louisiana State University, Dr. Gaines Foster, is amazing. I remember thinking that if I could be half the researcher and professor he is, I would be okay. Dr. Foster just had that quality about him. He was extremely helpful but very stern. With him, I couldn’t cut corners and get away with it. I must admit I was a bit intimidated. I have since had the pleasure of serving on a panel with him. After the panel, he complimented me on my research and my ability to engage the audience. He said I had a presence. I thought to myself…wow! That compliment meant so much to me because I truly value his opinion.

Another graduate school professor that had an impact on me was Dr. Leonard Moore. Dr. Moore had a captivating teaching style that was both engaging and passionate. I served as his teaching assistant for a few years in graduate school and learned a lot about lecturing and the delivery of information.

What is your most memorable or rewarding teaching experience?

Receiving my first student evaluations is one of my most memorable experiences. As a new professor you often question yourself about content and teaching: am I teaching relevant information? Do the students have a clear understanding of the information? And how can I make the information relatable? I remember reading those evaluations and feeling a sense of accomplishment because the majority of the students spoke about how I had changed their perceptions of history. I fondly remember one student mentioning that I made a “boring” subject interesting and relatable. Another student said, and I quote, “Dr. Sinegal-DeCuir knows her sh*t” -- when I am having a hard day, I think of that comment. It makes me smile every time.

It is also very rewarding to get emails, cards, and letters from current and former students expressing the impact I have made on their lives as a professor.

What are your hopes for history as a discipline?

I hope that the discipline continues to embrace diversity. Diversity in interpretations of historic events and diversity in scholars and scholarship. The facts of history don’t change but as long as we are accepting of diverse interpretations, history will forever be relevant.

Do you own any rare history or collectible books? Do you collect artifacts related to history?

I don’t have any rare historic collectibles, but I do have a few sad irons and one antique coffee grinder. I also have a large number of books pertaining to history and a small collection of Christmas ornaments from every museum I’ve visited across the U.S. and overseas. I am trying to start a Clementine Hunter collection; right now I have several prints, and my plan is to replace the prints with the original paintings.

What have you found most rewarding and most frustrating about your career? 

The most rewarding thing about my career is becoming known as a scholar in my field. I have written a New York Times op-ed, appeared on MSNBC with Al Sharpton, and done a few local interviews about historic events. I enjoy putting myself out there. What is frustrating is that many people dismiss the field of history as not being important or as prestigious a field as medicine or law.

How has the study of history changed in the course of your career?

It has become more inclusive of different interpretations of events and of diverse fields. The field of history is no longer limited to a cookie-cutter view of the past. Historians are uncovering amazing new stories and revisiting old ones through the lenses of gender studies and minority studies, to name a few.

What is your favorite history-related saying? Have you come up with your own?

I have not come up with my own saying but I really like this one, “History is not a burden on the memory but an illumination of the soul,” by Lord Acton. We should all be open to understanding the events of the past whether it is our own family history or the history of our country. Don’t hide the ugly truths and only embrace the good, happy times; we should learn from it all.

What are you doing next?

I am continuing to put myself out there through my scholarship. I am in the process of researching and writing a book proposal. 

In Our Polarized Times, We Can All Use a Friend

 

Friendship in the universe of social media is generally based on similarity. Finding friends through kinship of interests or personal likenesses can put us at ease. It’s comfortable to pal with people who share your foodie tastes or enthusiasm for Star Trek. 

The magnetic pull of similarity is also at work in our political clustering. Americans are sorting by values and ideology not only in jobs and churches, but also in how they find friends and even in those they choose to marry.

The Pew Center reports that while only 8% of Americans maintain that “interracial marriage is a bad thing,” twice that percentage of political party members (15% of Democrats and 17% of Republicans)  “would be unhappy welcoming someone from the other party into their family.”

Is finding a similar person the only way to get close to a fellow human being? It seems a truism: birds of a feather flock together. The “Truly Me” dolls from American Girl are even designed to be personal replicas, so “girls [can] show exactly who they are—inside and out.”

And Americans are flocking together, from inside the Beltway to Main Street USA, with ever-sharper polarizations around cultural and party differences. An increasing number of citizens are choosing where to live based on the predominant ideologies in neighborhoods. And after President Donald Trump’s meeting in Helsinki, Finland, with Russian Federation President Vladimir Putin, the American Polarizer-In-Chief commanded strong approval ratings of 64% among Republicans, despite all the criticisms from within his own party, according to an NBC/WSJ poll, while a whopping 80% of Democrats strongly disapproved of Trump’s performance. 

Our taste for similarity in friendship is percolating through the body politic. This trend produces comfort in immediate circles, solidarity with the like-minded across the land, and lots of anger directed toward everyone outside of those circles.

There’s a place for the friendship of similarity, but it’s not the only source of comradeship, or even the strongest way to bond. The psychologist and philosopher William James (1842-1910) offers suggestions toward finding kinship through difference.

Early in his life, James declared, people “differ, thank heaven”! His attitude of gratitude came from realizing that similarity reinforced the ideas and assumptions he already had, but difference brought enrichment. He liked to say, quoting a carpenter friend, “There is very little difference between [people]; but what little there is, is very important.” 

When James launched his academic career in the 1870s, he put together his very own Facebook—literally a scrapbook of portraits depicting scholars he was reading. By his late twenties, he had already developed “quite a decent nucleus” of his “Anthropological Collection.” He hunted out distinctive human traits in friends and colleagues, and he would needle his friends when traveling for “any further contributions” to the collection.

Shortly after starting to read the work of contemporary English philosopher Shadworth Hodgson, he wrote to him saying, “I think I shall understand your books better for having this vision of your face.” And despite his reservations about German scientist Carl Ludwig, who proposed reductionist chemical and physical explanations for human psychology contrasting with James’s own humanistic approaches, he said, “I have enjoyed Ludwig’s face very much; he must be a good fellow.”

And yet, despite all the riches of diversity, differences can breed hostility. The siren song of anger directed at those other guys who just don’t get it leads right back to polarization.

Friendship can be tricky. How to find enough similarity to gain a point of connection, but not so much that it stifles growth. Call it the Goldilocks Problem For Finding Friends: how to get it just right. When it works, friends find a sweet spot in relationships with variations on a theme—similar general goals, but different and even complementary ways of achieving them.

The Disney brothers were very different. Walt was creative and outgoing; Roy was quiet and businesslike. They tapped their differences in building the Disney Brothers Studio. Years later, the avuncular Walt Disney liked to say of their entertainment empire, “it was all started by a mouse.” But five years before the 1928 debut of Mickey Mouse (after some trial runs as the less-cute Mortimer Mouse, who would become the unfriendly rival to Mickey in later cartoons), the animation studio itself started with the complementary skills of Walt and Roy. And those became bywords in the company: the artsy ones were “Walts”; the accountants were the “Roys.”

Counselors have a saying about good marriages and good friendships: you don’t have to see eye to eye, but it’s crucial to be looking in the same direction. So the key is to have enough similarity to generate a common enterprise, but not so much sameness that you stifle growth—or each other. Difference adds the juice of challenge and creativity.

James called this balance genius. He praised a friend of his, the popular philosopher Thomas Davidson, as a genius, for his ability to be a “friend of very different people.”

A close relationship does not mean agreement on everything; and frustrations over differences can be real storm warnings. However, the richest of friendships will grow among those willing to step up to the challenge of learning from differences. There’s the personal genius: remaining steadfast to yourself while finding the opportunities in the obstacles generated by human diversity.

There’s Another Analogy for Trump  

Many journalists, including Bob Woodward in his new book, Fear, find echoes of “Nixonianism” in the chaotic environment of the Trump White House. Richard Nixon’s paranoia, his denials of illegal activities, and his attempted persecution of his enemies are all found to have their counterparts in Donald J. Trump’s managerial style. The two presidencies may be superficially similar, but the men are rather different. Trump’s adept manipulation of the news media, his displacement of facts by self-serving fantasy, and his government by humiliation and intimidation are best understood through a deeper historical perspective. The original of Trump is not Nixon, but William Bligh, Lieutenant in the Royal Navy and Captain of HMS Bounty.

As recounted in Caroline Alexander’s recent history, The Bounty, Captain Bligh sailed the Bounty to Tahiti in 1788 to transport more than a thousand breadfruit trees to the British West Indies, where their fruit was to be used to provide a cheap source of food for slaves. At Tahiti, a dozen men were flogged on Bligh’s orders for neglect of duty and acts of desertion, which Bligh took increasingly as insults to himself. To his officers, who could not be flogged, Bligh became verbally abusive, to the extent that Fletcher Christian, one of the mates, was heard to say repeatedly that he was “in Hell!”

On the morning of April 28, 1789, a small party of men led by Christian seized control of the ship. Bligh and 18 loyal members of his crew were forced into the ship’s 23-foot launch and set adrift, while Christian and 24 other men returned to Tahiti in the Bounty. Bligh was incredulous that his men could rise up in revolt against him. In his personal log, he wrote that he had “taken the greatest pains and bore a most anxious care” to ensure the “most perfect order” on his ship and the “most perfect health” of all on board. He protests that “no cause could justify such an effect”—that is, since the mutiny was without cause, it must be diabolical. But since he must name a cause, he conjectures that the men had made “connections” with the women of Tahiti to whom they wanted to return. “The Women are handsome—mild in their Manners and conversation—possessed of great sensibility, and have sufficient delicacy to make them admired and beloved”—in short, Tahiti was “the finest Island in the World where [the men] need not labour, and where the allurements of dissipation are more than equal to anything that can be conceived.” According to Bligh, it must have been the sexual lasciviousness of the women of Tahiti, not his deficiencies of leadership, that led his men astray. 

For two months, Bligh drove his small boat 3600 miles through the stormy waters of the South Pacific, limiting his men to rations of a mouthful of bread and a swallow of water per day. He can only have survived this epic journey by denying the peril of his situation and believing unequivocally in his ultimate triumph over what he called his “calamity.” Upon reaching the Dutch island of Timor, Bligh copied out his personal log of the mutiny, which he intended to present to the King on his return to England. Excerpts from this log he sent on ahead to correspondents in London, who passed them on to the newspapers, not unlike a modern Twitter feed.

Bligh himself reached London on the Ides of March, 1790, where he received a hero’s welcome. He laid his personal journal at the feet of the King and was invited to a private audience. He gave another copy to the Admiralty (though he had sent one before). With his loyal ship-mates, he attended a play at the Royalty Theatre, titled “The Pirates; or, the Calamities of Capt. Bligh.” The newspapers called on him to publish a detailed account of “his miraculous voyage, that mankind may be impressed with the mercy of Providence, in endowing those devoted to a good cause, with virtue, proportionate to their difficulties!” This 88-page account was duly published ten weeks later as “A Narrative of the Mutiny, on Board His Majesty’s Ship Bounty,” authorized by the Admiralty and printed by the King’s bookseller. Thanks to his management of the media, Bligh’s triumph was complete.

As other accounts of the mutiny were published, however, the full story of Bligh’s verbal abuse of his officers and men, his corporal punishments for small matters, and his rages over perceived offenses became clear. He had kept a list of his officers’ transgressions for a court-martial at the end of the voyage. He accused his men, especially the petty officers, of stealing from a pile of cocoanuts he kept on deck. “God dam you you Scoundrels you are all thieves alike, and combine with the men to rob me,” Bligh raved; “I suppose you’ll Steal my Yams next, but I’ll sweat you for it you rascals I’ll make half of you jump overboard before you get through the Straits!” To the officers, he must have seemed like a madman who could no longer be entrusted with the command of the ship. Their duty to themselves, and to the ship, required them to take control. On the morning of the 28th, none of the officers or sailors attempted to prevent the mutiny, including those thought to be loyal to Bligh.

In the years 1790-97, when fervor over the French Revolution was high, some sailors were inspired to think of themselves as men with natural rights. In a sort of “Me Too” moment, British seamen recalled the threats and punishments they had suffered, leading to mutinies at Spithead and the Nore in 1797. Though suppressed, these mutinies resulted in some permanent and far-reaching reforms that improved the lives of seamen aboard Royal Navy ships. It is not a reach to say that the reforms of 1797 had their roots in the mutiny on the Bounty and the courts-martial that followed it. Bligh himself was court-martialed, at his request, for the loss of the Bounty, and acquitted. 

A court-martial, in the Navy, is the equivalent of an impeachment in civilian service. It is extremely unlikely that Trump will seek his own impeachment as a means of proving his fitness for office. But however initiated, the trauma of his removal, if it comes, may lead to reforms as deep and lasting as those that followed the mutiny on the Bounty. 

Do We Really Need Billionaires?

 

According to numerous reports, the world’s billionaires keep increasing in number and, especially, in wealth.

In March 2018, Forbes reported that it had identified 2,208 billionaires from 72 countries and territories.  Collectively, this group was worth $9.1 trillion, an increase in wealth of 18 percent since the preceding year.  Americans led the way with a record 585 billionaires, followed by mainland China which, despite its professed commitment to Communism, had a record 373. According to a Yahoo Finance report in late November 2018, the wealth of U.S. billionaires increased by 12 percent during 2017, while that of Chinese billionaires grew by 39 percent.

These vast fortunes were created much like those amassed by the Robber Barons of the late nineteenth century. The Walton family’s $163 billion fortune grew rapidly because its giant business, Walmart, the largest private employer in the United States, paid its workers poverty-level wages. Jeff Bezos (whose fortune jumped by $78.5 billion in one year to $160 billion, making him the richest man in the world) paid pathetically low wages at Amazon for years, until forced by strikes and public pressure to raise them. In mid-2017, Warren Buffett ($75 billion), then the world’s second richest man, noted that “the real problem” with the U.S. economy was that it was “disproportionately rewarding to the people on top.”

The situation is much the same elsewhere.  Since the 1980s, the share of national income going to workers has been dropping significantly around the globe, thereby exacerbating inequality in wealth.  “The billionaire boom is . . . a symptom of a failing economic system,” remarked Winnie Byanyima, executive director of the development charity, Oxfam International.  “The people who make our clothes, assemble our phones and grow our food are being exploited.”

As a result, the further concentration of wealth has produced rising levels of economic inequality around the globe.  According to a January 2018 report by Oxfam, during the preceding year some 3.7 billion people―about half the world’s population―experienced no increase in their wealth.  Instead, 82 percent of the global wealth generated in 2017 went to the wealthiest 1 percent.  In the United States, economic inequality continued to grow, with the share of the national income drawn by the poorest half of the population steadily declining.  The situation was even starker in the country with the second largest economy, China. Here, despite two decades of spectacular economic growth, economic inequality rose at the fastest pace in the world, leaving China as one of the most unequal countries on the planet.  In its global survey, Oxfam reported that 42 billionaires possessed as much wealth as half the world’s population.

Upon reflection, it’s hard to understand why billionaires think they need to possess such vast amounts of money and to acquire even more.  After all, they can eat and drink only so much, just as they surely have all the mansions, yachts, diamonds, furs, and private jets they can possibly use.  What more can they desire?  

When it comes to desires, the answer is:  plenty!  That’s why they drive $4 million Lamborghini Venenos, acquire megamansions for their horses, take $80,000 “safaris” in private jets, purchase gold toothpicks, create megaclosets the size of homes, reside in $15,000 a night penthouse hotel suites, install luxury showers for their dogs, cover their staircases in gold, and build luxury survival bunkers.  Donald Trump maintains a penthouse apartment in Trump Tower that is reportedly worth $57 million and is marbled in gold.  Among his many other possessions are two private airplanes, three helicopters, five private residences, and 17 golf courses across the United States, Scotland, Ireland, and the United Arab Emirates.

In addition, billionaires devote enormous energy and money to controlling governments. “They don’t put their wealth underneath their mattresses,” observed U.S. Senator Bernie Sanders; “they use that wealth to perpetuate their power. So you have the Koch brothers and a handful of billionaires who pour hundreds of millions of dollars into elections.” During the 2018 midterm elections in the United States, America’s billionaires lavished vast amounts of money on electoral politics, becoming the dominant funders of numerous candidates. Sheldon Adelson alone poured over $113 million into the federal elections.

This kind of big money has a major impact on American politics. Three billionaire families―the Kochs, the Mercers, and the Adelsons―played a central role in bankrolling the Republican Party’s shift to the far Right and its takeover of federal and state offices. Thus, although polls indicate that most Americans favor raising taxes on the rich, regulating corporations, fighting climate change, and supporting labor unions, the Republican-dominated White House, Congress, Supreme Court, and regulatory agencies have moved in exactly the opposite direction, backing the priorities of the wealthy.

With so much at stake, billionaires even took direct command of the world’s three major powers.  Donald Trump became the first billionaire to capture the U.S. presidency, joining Russia’s president, Vladimir Putin (reputed to have amassed wealth of at least $70 billion), and China’s president, Xi Jinping (estimated to have a net worth of $1.51 billion).  The three oligarchs quickly developed a cozy relationship and shared a number of policy positions, including the encouragement of wealth acquisition and the discouragement of human rights.

Admittedly, some billionaires have signed the Giving Pledge, promising to devote most of their wealth to philanthropy. Nevertheless, plutocratic philanthropy means that the priorities of the super-rich (for example, the funding of private schools), rather than the priorities of the general public (such as the funding of public schools), get implemented. Moreover, these same billionaires are accumulating wealth much faster than they donate it. Philanthropist Bill Gates was worth $54 billion in 2010, the year the pledge was announced, and his wealth stands at $90 billion today.

Overall, then, as wealth is concentrated in fewer and fewer hands, most people around the world are clearly the losers.  

Government Shutdowns Illustrate the Pragmatism of the Founding Fathers

Government shutdowns, while never preferable, provide an opportunity to examine the framework of the United States. Of course, no one applauds a context where disagreements are so profound that the government can no longer function. Yet, it also exemplifies the success of our Republic. Allow me to explain this seemingly counterintuitive statement. 

Our founding fathers were aware of how easily corruption and influences contrary to the public good may infiltrate the mindset of any representative. “If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary,” wrote James Madison in Federalist 51. “In framing a government which is to be administered by men over men, the great difficulty lies in this: you must first enable the government to control the governed; and in the next place oblige it to control itself.” Lobbyists, special interest groups, donors, political factions, and those who wield astounding influence often detract from interests of both the minority as well as the majority at times. 

There was mindfulness of that human fallibility during the period of the Constitutional Convention and ratification. Consider the founders’ perception that the War for Independence had been provoked by a corrupt government. Concentrated power evoked the words prevalent in Revolutionary-era rhetoric: despotism, corruption, and tyranny.

As a result, the founders distributed power within the Constitution to more fully represent those who otherwise might lack a voice. They did so in several ways, one being the creation of the three branches and their distinct roles. Defending the Constitution’s framework and degree of the separation of branches, Madison noted in Federalist 47, “The accumulation of all powers, legislative, executive, and judiciary, in the same hands, whether of one, a few, or many, and whether hereditary, self-appointed, or elective, may justly be pronounced the very definition of tyranny.” 

The founders placed a significant emphasis on the role of the legislative branch, and their pragmatism is why two different houses were created in Congress, along with an executive and a judicial branch. Federalist 51 also explained the need for the legislature to check its own power and how it would do so: “The remedy for this inconveniency is to divide the legislature into different branches; and to render them, by different modes of election and different principles of action, as little connected with each other as the nature of their common functions and their common dependence on the society will admit.”

The Convention made all parts of government independent as well as dependent on one another. The effect is that power is diluted, resulting in a greater probability that a plurality, both within the federal government and those constituents outside of it, will be influential. 

One must remain cognizant that government workers are harmed by this less than desirable effect of our revered system. Government shutdowns were obviously not the intent of the founders; their intent, rather, was that those serving would compromise, just as the Constitution itself was a compromise. Nevertheless, temporary stalemates in some form have always been inherent in such a framework, as any new legislation requires collaboration and concessions where there is a separation of powers.

Independence in our branches offers the ability to dissent and propose diverse courses of action, which in turn reinforces our democratic ideals. That ideological and legal framework of the late 18th century still endures. The irony is that while we feel elements of failure and despair during a shutdown, it is also an illustration of the fundamental brilliance of our Republic.

That Other Dick Cheney Movie  

Sociologist James W. Loewen is the author of Lies My Teacher Told Me.

 

Last week, instead of watching “Vice,” my wife and I watched that other Dick Cheney movie, “The World According to Dick Cheney.” R. J. Cutler made this documentary in 2013. 

 

It’s true that the actors in “Vice” look amazingly like Cheney, Lynne Cheney, and George W. Bush, but in Cutler’s film, they look even more like the originals! 

 

Please pardon that moment of forced humor; my excuse is that the rest of this brief review won’t be funny. But since HNN never reviewed “The World According to Dick Cheney” when it came out, and since the documentary will draw new viewers now, spurred by “Vice,” I want to get my critique of it out into the world. 

 

Although I didn’t actually time it, about half of the film’s running time seems to be the real Dick Cheney on screen. Of course, the title promises as much. One would hope, however, that with a character as controversial and important as Cheney, the interviewer would ask hardball questions. Instead, he mostly tosses batting-practice lobs.  

 

No one asks Cheney anything about Halliburton, for instance. What role did the company play in the Iraq War and its continuing aftermath? How did Halliburton fare financially? Does Cheney still own Halliburton stock? How did he fare financially?

 

The narrator or one of the authorities the film interviews does note that the reason Saddam Hussein pretended to have “weapons of mass destruction” was to intimidate his neighbors, notably Iran. But it never points out that, threatened by Bush/Cheney, he had reversed himself and allowed United Nations inspectors full access to his country, so they might investigate Cheney’s claim of WMD. After seeing the movie, people would never guess that the inspectors had to get out of the country to avoid being killed during Bush/Cheney’s “shock and awe” demonstration that kicked off the war. 

 

The film never bothers to show a graph or give other accounting of the number of dead and wounded Iraqis we caused, with our war and the later civil war we triggered, which is still going on in a way. Nor does it tell how many Americans died or how much we spent. One of Cheney’s most absurd prewar assertions was his claim that the war would pay for itself, from oil revenues we would somehow get. None of this gets into the movie. 

 

To its credit, the movie does contest Cheney’s claims that our torture of POWs was not torture and was legal. It makes some other important points as well. But it never explicates clearly what went wrong in Iraq and why. Two key decisions by Bush/Cheney caused the U.S. occupation to become a quagmire: their dismissal of the Iraqi army and their dismissal of the police. 

 

During World War II, when Germany occupied, say, Holland, they did so through the Dutch state. When we occupied Japan, we did so through the Japanese state, even including the emperor. That’s how it’s done. Only … not in Iraq. Somehow Cutler never mentions these preposterous decisions to Cheney, never asks him to defend them. 

 

Before Bush/Cheney upended Iraq, Hussein was in a box. He had few allies; Iran, Kuwait, Saudi Arabia, and Israel were his enemies. He could not use his air force, not even against his own people, as was his wont, and was facing internal opposition from Shiites and Kurds. We had him where we wanted him. Hussein never supported al Qaeda and never sponsored terrorist attacks in the U.S. or Europe. 

 

By 2013, when the movie was made, where were we? Bush/Cheney’s war had sparked the creation of ISIS in western Iraq, and the group later spread to Syria (and eventually to the Philippines and other countries). ISIS had linked up with al Qaeda. Meanwhile, the Iraqi government we created in Baghdad to replace Saddam Hussein is Shiite, like Iran, increasing Iran’s influence in the region. Cheney’s primary defense of all his policies is that they made America safer, but Cutler never asks him how these developments could possibly have done so.

 

“Vice” may be a one-sided take-down of Cheney, “The World According to Dick Cheney’s Enemies,” but it’s needed to set the record straight after “The World According to Dick Cheney.” Unfortunately, historians ages hence may not credit a docudrama against a documentary. That will be their loss, and our nation’s. At the end of “The World According to Dick Cheney,” he goes off contentedly fly-fishing into the Wyoming sunset.  Under Cheney our foreign policy failed both on humanitarian grounds and with regard to our realpolitik interests. Our culture needs to show that we understand this. We cannot afford to have an affable view of this vice-presidency. 

 

Copyright James W. Loewen

Roundup Top 10!  

Taking Stock of Gender History at the American Historical Association Annual Meeting 2019

by Monica L. Mercado

Why are so many scholars taking stock of gender history and women in the historical profession now?

 

 

How Democracies Die

by Steven Levitsky and Daniel Ziblatt

Two Harvard social scientists, studying history and contemporary events, share their frightening insights.

 

 

Nativism and Who Is Considered a "Real American"

by Caleb Elfinbein and Peter Hanson

A new survey explores American attitudes toward their national identity. The results should give us pause.

 

 

The Dangerous Rise of the IUD as Poverty Cure and the History Behind It

by Christine Dehlendorf and Kelsey Holt

The notion that limiting women’s reproduction can cure societal ills has a long, shameful history.

 

 

Why it took a century to pass an anti-lynching law

by Louis P. Masur

A century of political organizing could not overcome a powerful tool of white supremacy — until now.

 

 

Tenure and the Invisible Faculty

by Joseph G. Ramsey

By not standing up for adjuncts, tenure-track professors have undermined their own power.

 

 

The Louvre is returning sculptures to West Africa. Here’s how and why it's happening.

by John Warne Monroe

France is finally facing up to its colonial legacy.

 

 

Why Nancy Pelosi Was Reelected Speaker of the House

by Kathryn L. Pearson

Why has Nancy Pelosi been able to hold on to power for so long?

 

 

The Dick Cheney of ‘Vice’ just craves power. The reality was worse.

by James Mann

The former veep’s ideological agenda did far more damage than his quest for clout.

 

 

Tectonic Shifts in Attitudes toward Israel

by Daniel Pipes

As Arabs and Muslims warm to Israel, the Left grows colder. These shifts imply one great imperative for the Jewish state.

 

"Vice" – Looking Back on the Cheney – Bush Administration

On February 11, 2006, my late wife and I were driving home to New Jersey from Washington, D.C., on the New Jersey Turnpike. It was snowing outside and bitterly cold. Suddenly, a radio announcer interrupted the music channel to tell the world that “Vice President Dick Cheney has shot somebody.”

I roared with laughter. This man had turned over just about every apple cart he could find in his life, and now the fool had gone off and shot someone.

I thought about that moment over the weekend when I saw Vice, the new biographical film about Dick Cheney that has been earning numerous awards and award nominations since it opened a few weeks ago. Last week it opened nationally.

This article’s title refers to the Cheney-Bush Administration and not, properly, the Bush-Cheney Administration, because director/screenwriter Adam McKay makes you think that Cheney ran the country. Bush was apparently out playing baseball somewhere for eight years.

Despite all of its early accolades, Vice, produced by Plan B Entertainment, is not a very good movie. Well, it is if you simply hate Dick Cheney or George W. Bush.

In the movie, Christian Bale (who is superb in the role and deserves an Oscar nomination) portrays Cheney as a four-star dunce. At first, he is a man going nowhere. Then his overly ambitious wife Lynne reads him the riot act about their marriage and tells him to make something of his life. He does. He gets himself a job as a Washington intern, then becomes an aide to Donald Rumsfeld (“Rummy”) and moves up the ladder, holding down the jobs of Chief of Staff to President Ford and Secretary of Defense for President George H.W. Bush. In between, he spent ten years representing his home state, Wyoming, in Congress. He was asked to be George W. Bush’s running mate in 2000 and turned him down (he didn’t want to give up his high-paying private-sector job as head of the controversial Halliburton Company). Bush pushed him, though, and he accepted. After that, Cheney becomes a battering ram for the Republicans. Cheney certainly was a powerful Vice President and gave George Bush plenty of advice, but it was Bush, and not Cheney, who made all of the key decisions, all of which are harpooned by the director in the film.

If you accept director McKay’s view, Bush was really just an aide in Cheney’s White House. The director would have you believe that Cheney orchestrated every single move in the administration.

Bush and Cheney made some colossal mistakes and did dupe the public. McKay should get high marks, for example, for the way he portrayed the disgraceful manner in which Bush and Cheney forced General Colin Powell to deliver that famous U.N. speech loaded with misstatements of fact on Iraq and prodded him, and everybody else, to believe that Saddam Hussein had warehouses full of nuclear weapons when he actually had none. Did they do everything wrong? They did get re-elected.

McKay gets a splendid performance from Bale. He is young and tough at the start of the film and old and tough at its conclusion. In the film, Bale looks and acts exactly like Cheney. Amy Adams is just as impressive as Lynne Cheney, the driving force in Cheney’s life. Sam Rockwell, as George W. Bush, appears only briefly between long scenes about Cheney. He is unimpressive. Steve Carell is quite good as Donald Rumsfeld.

The history in the film is vague, choppy and unconvincing. According to the movie, the history of Bush’s tenure is one long carnival-style conspiracy full of governmental villains around every corner and at the end of every phone call. Cheney, the Dark Prince, and his henchmen ran the country and ran it into the ground. They stripped citizens of their rights, tortured everyone short of George Clooney and tapped the phones of all the Muppets.

Surprisingly, nowhere in the film does anybody call Cheney the “Darth Vader” of politics, the term his enemies used so frequently during his years in office. He is seen as the evil lord of the universe. McKay constantly comes up with reasons why Cheney had so much power, and they make no sense. I was startled by the explanation that he had great power because he had his Vice President’s office, two offices in the Senate, one in the House of Representatives and one in CIA headquarters. Hey, I know lawyers who have offices in numerous cities and are still bad lawyers. So what?

McKay takes every opportunity to make fun of Cheney, especially in scenes when he is sleeping or brushing his teeth. OK, the guy’s job approval rating was as low as 13%, but he could still brush his teeth.

On many accusations, though, McKay is right. How did Cheney manage to get away with all of these shenanigans? Wasn’t anybody watching? Weren’t the lights on? 

A Salute to Rick Shenkman and His Contribution to History

Gil Troy is a professor of history at McGill University. His latest book — his tenth — is The Age of Clinton: America in the 1990s.

 

Related Link: Farewell: An interview with Rick Shenkman on his exit from HNN

 

When I was in graduate school and attended my first American Historical Association annual meeting, I was shocked. I could not believe how little the historical establishment did to make us graduate students feel welcomed, to feel part of a broader historical community, to feel some sense of nobility, of excitement, of camaraderie, regarding our joint mission as pastmasters, as truth-seekers and tellers. Over the years, I regret to say, I have seen few colleagues do all that much in any context to foster that sense. I, for one, harbor great guilt that so many of our graduate students seem stuck in the same trauma cycles of depression, discouragement, loneliness, anomie, alienation, and, sometimes, abuse by the system or their elders that too many of us endured.

 

Our collective failure – our outrageous negligence – makes Rick Shenkman’s achievement with HNN, the History News Network, all the more impressive.

 

Day in, day out, year in, year out, Rick has not just been our community builder -- he’s been our town crier, our cheerleader, our scold, our coach, our teacher in so many ways. His website has gotten scholars thinking thoughtfully, critically, substantively at a time when we historians have stopped interacting with one another on so many levels. His efforts have resisted so many of the trends that have proven so toxic to free, open dialogue in today’s academe. You click on HNN to find compelling, relevant articles in a time of scholarly monasticism, to find lively, accessible articles in an era that prizes stylistic turgidity, to find robust, respectful debate in a moment that prefers finger-pointing and virtue-shaming to free-thinking and open-minded learning.

 

The fact that Rick has done this with so little support – financial, institutional, emotional, existential – from the profession at large is scandalous. The fact that he has done this while continuing his legendary career as a thoughtful historian who can sell tons of books while generating important, substantive academic debates is miraculous. And the fact that he is passing “his baby,” in such good, robust shape, to his successor is a magnanimous act on his part.

 

We look forward to continuing to learn from HNN, to engage with HNN, to feel a sense of historical camaraderie thanks to HNN, under the able leadership of the new editor-in-chief Kyla Sommers.  And we hope that Rick continues to challenge us, to teach us, to inspire us – on these pages and the pages of the future books he will undoubtedly start writing in earnest, after nearly twenty years of double-triple-quadruple duty, overworked, underpaid – but not unappreciated!
