News at Home — articles from History News Network. Fri, 18 Jan 2019 03:25:43 +0000

William Barr Needs a History Lesson

As the Senate Judiciary Committee holds its confirmation hearings for William Barr, the current nominee for Attorney General of the United States, it is clear Barr needs to brush up on his constitutional law, as well as U.S. history.


During yesterday’s hearing, Senator Mazie Hirono (D-HI) asked Barr whether or not he believed birthright citizenship was guaranteed by the 14th Amendment. The question is important as the idea of birthright citizenship has come under increasing attack from the right in recent years. From the Republican primaries onward, Donald Trump has repeatedly asserted that birthright citizenship is unconstitutional, should be eliminated, and can be ended by executive order. While some on the right have balked at the last claim, Trump has tapped into an ever-present disdain among conservatives for birthright citizenship. 


For his part, Barr seemingly tried to sidestep the politically divisive issue. However, his answer to Senator Hirono’s question was not only vague, it also suggested that the soon-to-be Attorney General doesn’t know basic constitutional law or history.


“I haven’t looked at that issue legally. That’s the kind of issue I would ask OLC [Office of Legal Counsel] to advise me on, as to whether it’s something that’s appropriate for legislation. I don’t even know the answer to that,” Barr answered.


There are a couple of worrying signs in this response. First, birthright citizenship is part of the 14th Amendment, meaning any action to change it would have to come through a constitutional amendment, not legislation. This is a basic tenet of constitutional law. The fact that Barr, who previously served as Attorney General under George H.W. Bush, thinks action can be taken against birthright citizenship through simple legislation shows one of two things: (1) he isn’t competent enough to understand basic constitutional processes in the United States or (2) he was, rather insidiously, actually answering Senator Hirono’s question.


The latter point warrants a bit of explanation. Barr quite visibly looked like he was attempting to simply move past the question and not answer Senator Hirono. However, if Barr does in fact think that birthright citizenship can be dealt with through congressional legislation, then the only logical explanation for this, barring the above first option, is that he doesn’t believe the 14th Amendment guarantees this status. Whereas the first possibility of incompetence warrants a refresher in constitutional law, this second one demands a lesson in history. 


History is quite clear on the intent of the 14th Amendment: it was meant to establish birthright citizenship in the wake of emancipation. The 14th Amendment was created to guarantee that freed slaves, free blacks, and their posterity would forever be considered American citizens. Before its adoption, citizenship was a murky, ill-defined status. The Constitution mentions citizenship only a few times, and does not provide a concrete definition of what a citizen is or who can be a citizen. To this day there is actually no legal definition of what citizenship is.


From the Constitution’s ratification to the adoption of the 14th Amendment, black Americans repeatedly claimed they were citizens because of their birth on American soil. Scholars such as Elizabeth Stordeur Pryor and Martha S. Jones have shown the myriad ways in which black Americans made claims on this status, only to be rebuffed in many cases. Citizenship could provide black Americans with a recognized place in the nation’s political community. It represented hope for a formal claim to certain rights, such as suing in federal court.


This leads to the infamous 1857 Supreme Court decision Dred Scott v. Sandford, when Chief Justice Roger Taney crafted an opinion that quite consciously attacked the very possibility of black citizenship. Taney concluded that Dred Scott, an enslaved man, could not sue in federal court because he was not a citizen. He was not a citizen, in Taney’s words, because black people “are not included, and were not intended to be included, under the word ‘citizens’ in the Constitution… On the contrary, they were at that time considered as a subordinate and inferior class of beings who had been subjugated by the dominant race, and, whether emancipated or not, yet remained subject to their authority, and had no rights or privileges but such as those who held the power and the Government might choose to grant them.”


Taney went out of his way to create a Supreme Court decision that attempted to put the legal nail in the coffin of black citizenship. The 14th Amendment was, quite consciously, crafted to upend Dred Scott, which was still the law of the land after the Civil War.  Thus when conservatives rail against birthright citizenship and claim that it is not, in fact, a part of the Constitution, they are ignoring America’s long history of slavery, discrimination, and segregation. 


When the soon-to-be Attorney General William Barr states that he thinks legislation can be used to make changes to birthright citizenship, it is because he does not believe the 14th Amendment guarantees it. And when he and other conservatives espouse such an opinion, it is because they are once again willfully ignoring American slavery and its legacy of racism. This is, admittedly, not surprising. Barr also expressed the opinion during his confirmation hearing that the justice system “overall” treats black and white Americans equally, despite mountains of evidence proving otherwise.


While the attack on birthright citizenship from the right deserves attention and should be fought at every turn, the underlying historical erasure of slavery and discrimination also requires our attention. This willful amnesia is why the potential next Attorney General of the United States can, in one day, ignore so many aspects of America’s fraught history with race. And it is why we all must be on guard.


A Tyrant's Temper Tantrum

King Charles I of England, frustrated at the limitations of his otherwise powerful position, decided to dissolve Parliament in March of 1629 and to clap several of the opposition’s leaders in irons. The monarch had come to an impasse over issues as lofty as religious conformity and as mundane as the regulations concerning tonnage, eventually finding it easier to simply dissolve the gathering than to negotiate with them. Historian Michael Braddick explains that the “King was not willing to assent to necessary measures” in governance, and that Charles was intent on “upholding his right to withhold consent” as he saw it, believing that “without that power he was no longer a king in a meaningful sense.” Charles was a committed partisan of the divine right of kings, truly believing himself to be ennobled to rule by fiat, and regarding legislators as at best an annoyance, and at worst as actively contravening the rule of God. 

Though its assent was legally required to levy taxes, at this point in history Parliament was still an occasional institution; indeed, this was the fourth Parliament that Charles had dissolved. Yet the separation of powers still made it impossible for the king to directly collect taxes of his own accord, and so he adopted byzantine means of shuffling numbers around to draw income into the treasury. Such was the “Period of Personal Rule,” or to critics the “Eleven Years’ Tyranny,” in which Charles’ power became ever more absolute. Royalists may have seen the dissolution as a political victory, yet the ultimate loss would be Charles’, to spectacular effect. Historian Diane Purkiss explains that the “events that were ultimately to lead to the Civil War were set in motion by a royal tantrum.” 

Royal tantrums are very much on all of our minds this new year, as we approach the fourth week of the longest government shutdown in U.S. history. Just as the state coffers of England were depleted after Parliament was dissolved and continued solvency required creative means of reorganization, redefinition, and shifting of funds, so too do we find government agencies forced by extremity to demand work of essential employees without pay. Garbage piles up in federal parks and at the National Mall, TSA agents and air-traffic controllers work for free, yet the president, under the influence of right-wing pundits, refuses to end the current shutdown. With shades of Charles’ tantrum, Speaker Nancy Pelosi explains Donald Trump’s current obstinacy as a “manhood thing for him.” 

Meanwhile, Trump claims that his proposed border wall with Mexico is a national security issue, and after two years of inaction on his unpopular signature campaign promise has decided, not coincidentally following the election of a Democratic House, that he’ll invoke sweeping emergency powers to construct said wall, which, as Jennifer De Pinto and Anthony Salvanto of CBS News reported last month, 59% of Americans oppose. At the time of this writing it’s unclear whether Trump will invoke those broad executive powers, in an audacious power grab not dissimilar to Charles’ petulant dissolution of Parliament. 

Trump’s proposal calls to mind the pamphleteer and poet John Milton’s appraisal of Charles in his 1649 Eikonoklastes that the monarch did “rule us forcibly by Laws to which we ourselves did not consent.” Milton denounced the royalists whom he saw as an “inconstant, irrational, and Image-doting rabble,” this crowd who wished to make the kingdom great again and who are “like a credulous and haples herd, begott’n to servility, and inchanted with these popular institutes of Tyranny.”

Yet as much fun as it is to draw parallels between the events of the 17th century and our current predicament, we must avoid the overly extended metaphor. Trump is not Charles I; Pelosi is not the anti-Royalist John Pym; the Republicans are not Cavaliers and the Democrats are not Parliamentarians. Treating history as a mirror can obscure as much as illuminate, and yet I’d argue that the past does have something to say to the present, especially as we understand the ways in which American governance is indebted to understandings of those earlier disputes.

Political pollster and amateur historian Kevin Phillips argued that the English civil wars set a template for perennial political conflict in his 1998 book The Cousins’ Wars: Religion, Politics, & the Triumph of Anglo-America. With much controversy, Phillips argued that a series of conflicts between the 17th and 19th centuries should best be understood as connected to one another, analyzing how “three great internal wars seeded each other,” with the “English Civil War… [laying] the groundwork for the American Revolution” which “in turn, laid the groundwork for a new independent republic split by slavery” that would be torn asunder during the American Civil War. For Phillips, modern Anglo-American history should be interpreted as a Manichean battle between two broad ideologies, which manifested themselves differently in each conflict while preserving intellectual continuities with their forebears. Basing his analysis on geography and demography, Phillips sees in Charles’ claims of Stuart absolutism and religious conformity the arguments of King George III in the American Revolution, or the aristocratic defenses of inequity offered by the Southern planter class in the American Civil War. As a corollary, in the Parliamentarians he sees the language of “ancient liberties” as embraced by the American revolutionaries, or the rhetoric of New England abolitionists in the antebellum era. The first position historically emphasizes order, hierarchy, and tradition; the second, individualism, justice, equality, and liberty. 

There’s much that is convincing in Phillips’ claims. The American revolutionaries certainly looked back to thinkers like Milton; the Puritanism of the Parliamentarians was crucial in both revolutionary and antebellum New England in terms of crafting a language of rebellion. The Southern aristocrats and apologists of slavery during the American Civil War consciously compared themselves to Charles’ Cavaliers, and rejected the creed spelled out in the United States’ founding documents as evidence of heretical non-conformism. Thus applying Phillips’ framework to the current divisions in the United States has a logic to it. If the American Revolution continued the same debates from the 17th century English civil wars (and it in part did), and the American Civil War was born from the contradictions of the Revolution (which is undeniably true), then it might follow that the current divisions in our country are a continuation of the American Civil War by other means. In this perspective, Trump is a kind of Copperhead president, a northern Confederate sympathizer, as Rebecca Solnit has argued convincingly in The Guardian.

While acknowledging that there is much that’s valuable in Phillips’ interpretation, I prefer to draw a different lesson entirely. Without comment as to the causal relationships between those conflicts, I note instead a particular structure by which each of them unfolds, an ur-narrative that progressives must be aware of, as we may soon be facing a period of unrivaled opportunity for enacting profound change. 

Returning to the 17th century, parallels to today can be seen in the Parliamentarian view that Charles was both an incompetent monarch and an aspiring tyrant, an illegitimate ruler enraptured by foreign influence. Had it not been for his own petulant intransigence, Charles might have been able to weather those political storms, but it was precisely his own sense of inviolate authority which made his downfall inevitable. Charles’ fall from power, in turn, heralded a period of incredible potential for radical change in English history. Historian David Horspool writes that this discourse was “of a kind never before witnessed in England: an open debate” on how the new Republic should be governed. Occasions like the Putney Debates, held by the New Model Army, put front and center issues of republican liberties that had been marginal before, as when the participant Thomas Rainsborough declared that “Every person in England hath as clear a right to elect his Representative as the greatest person in England.” Meanwhile, radical groups like the Levellers and the Diggers, the former of whom had sizable support in both the army and Parliament, suggested communitarian economic arrangements whereby the commons would be restored as the property of all Englishmen, views that would still be radical today. 

Such is the primordial narrative: an ineffectual and reactionary leader grasps at ever more power, triggering a crisis that leads to his downfall while presenting the opposition with the opportunity for unprecedented, radical political change. Had Charles been less vainglorious, perhaps the civil wars could have been avoided, but he was, and as a result what ultimately presented itself was the possibility of something far more daring than mere incremental change. The same template is in evidence during the American Revolution. Had moderate voices like Prime Minister William Pitt been heeded, had George III been less intemperate regarding the imposition of the Intolerable Acts, then perhaps America would still simply be part of the British Empire. As it was, the hardening of George’s position allowed for the introduction into the world of the radical democratic philosophy which defined the American Revolution, and which flowered during the era of the Articles of Confederation, when many states adopted shockingly egalitarian constitutions. Similarly, on the eve of the American Civil War, most northerners were not abolitionists, yet increasing belligerence from the Southern slave-owning class, in the form of the Missouri Compromise and especially the Fugitive Slave Act, rapidly radicalized the northern population. In the years following the Civil War there was radical possibility in Reconstruction, when true democratic principles were installed in southern states for the first time. 

We’ve already seen the arrival of new radical possibilities in opposition to the reactionary leader. Does anyone credibly think that we’d have elected several Democratic Socialists were it not for Trump? Does anyone believe that we’d finally be able to consider policy proposals like Representative Alexandria Ocasio-Cortez’s Green New Deal, and the restitution of a proper marginal tax rate, had it not been for the rightful frustration and anger at the reactionary Republican agenda? Suddenly the Democrats are suggesting actual ideas and not just the furtherance of the collapsing neo-liberal consensus; suddenly it seems as if actual change might be possible. In this sense, Trump has ironically accomplished something that the Democrats themselves haven’t been able to do – he’s pushed them to the left. 

But I must present a warning as well, for there is another part to those narratives. Writing of the English civil wars, historian Frank McLynn explains that those years “undoubtedly constituted a revolutionary moment, a time when, in principle, momentous changes were possible.” Yet the English civil wars’ radical promise was never realized, betrayed by the reactionary Lord Protector Oliver Cromwell, and the radical participants in that revolution were done in by “the besetting sin of the Left through the ages – internal factionalism and squabbling instead of concentrating on the common enemy.” The result would be the demagoguery of the Interregnum and finally the Restoration of the monarchy. A similar preclusion of democratic possibility occurred in the 18th century United States, when the radical politics of the Revolution was tempered at the Constitutional Convention of 1787, with the drafting of a document that abolitionist William Lloyd Garrison famously described as “an agreement with Hell.” Post-Civil War Reconstruction, often cynically and incorrectly presented as a manifestation of draconian Yankee opportunism, was a hopeful interlude drawn to a close by Congress’ 1877 betrayal, the ramifications of which define our politics today. 

Consequently, there is a central question which the left must ask itself. It is no longer whether Trump will fall, but what opportunities progressives will seize once he does. Trump’s gross incompetence and unpopularity have done more to discredit right-wing ideas than decades of liberal punditry. Clearly, we cannot afford to retreat to bland centrist moderation when the tide of history seems to call for more radical proposals. But the historical template provides a warning, especially about how quickly hopeful moments can be squandered and reversed. A king’s greatest weakness is that he too often actually believes in his divine right. To be effective we can never be so stupidly arrogant. Now, what will we do with this moment? 







Congresswoman Virginia Foxx Thinks We Should Abandon The Term "Vocational Training." I Disagree.

Last week, Congresswoman Virginia Foxx condemned some unnamed folks as “classist” for placing the “stigma” of inferiority on those who opt for a vocational or technical education. Indeed, even to place the adjective “vocational” in front of the term “education” implies such inferiority, Ms. Foxx asserts. But the Congresswoman misses the real import of education and misses, therefore, the distinction between education and training.


Unfortunately, Ms. Foxx is not alone in missing this distinction. In its true meaning education must be about introducing young people to a knowledge of themselves and a knowledge of the relation that they have to society, the world, and the universe in which they live. We are all historical creations.  We inherit our attitudes, beliefs, and values from the world around us.  It may well be, for example, that the United States is the greatest country in the world, or that Christianity is the one true and only religion, or that the limits capitalism places on democratic control are divinely or naturally ordained.  But young people born and raised in this country will believe this not because they have chosen such beliefs, but simply because the whole of the world around them tells them this is so.  Only by deeply studying the real history of this country, only by understanding evolution and the history and the immensity of the universe, only by delving into literature and psychology, can young people begin to comprehend who they are as human beings in this country today.  And only by sampling the wealth of human knowledge in all its varied fields can young people freely decide to pursue one path or another in making their life and making their living.  


Of course, every state legislature in this country, in underfunding public education, and pushing the agenda Ms. Foxx pushes here – let’s call training for a career the same thing as an education – effectively seeks to deny young people knowledge of themselves and the world in which they live. In so doing, they dis-empower our youth and simply make “education” into a means of churning out compliant men and women who will work for fifty or sixty years, and then die. 


No, this country desperately needs educated young people. Those of us who insist that our students get a real education are not the “classists” condemned by Ms. Foxx. No, the real “classists” are those who would condemn our youth to the perpetual role of servants in a world ruled by wealth.

Government Shutdowns Illustrate the Pragmatism of the Founding Fathers

Government shutdowns, while never preferable, provide an opportunity to examine the framework of the United States. Of course, no one applauds a situation in which disagreements are so profound that the government can no longer function. Yet a shutdown also exemplifies the success of our Republic. Allow me to explain this seemingly counterintuitive statement. 

Our founding fathers were aware of how easily corruption and influences contrary to the public good may infiltrate the mindset of any representative. “If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary,” wrote James Madison in Federalist 51. “In framing a government which is to be administered by men over men, the great difficulty lies in this: you must first enable the government to control the governed; and in the next place oblige it to control itself.” Lobbyists, special interest groups, donors, political factions, and others who wield astounding influence often detract from the interests of the minority and the majority alike. 

There was mindfulness of that human fallibility during the period of the Constitutional Convention and ratification. Consider the founders’ perception that the War for Independence was the result of a corrupt government. Fear of concentrated power suffused Revolutionary-era rhetoric, in words such as despotism, corruption, and tyranny. 

As a result, the founders distributed power within the Constitution to more fully represent those who otherwise might lack a voice. They did so in several ways, one being the creation of the three branches and their distinct roles. Defending the Constitution’s framework and degree of the separation of branches, Madison noted in Federalist 47, “The accumulation of all powers, legislative, executive, and judiciary, in the same hands, whether of one, a few, or many, and whether hereditary, self-appointed, or elective, may justly be pronounced the very definition of tyranny.” 

The founders placed a significant emphasis on the role of the legislative branch, and their pragmatism is why two different houses were created in Congress, along with an executive and a judicial branch. Federalist 51 also explained the need for the legislature to check its own power and how it would do so: “The remedy for this inconveniency is to divide the legislature into different branches; and to render them, by different modes of election and different principles of action, as little connected with each other as the nature of their common functions and their common dependence on the society will admit.” 

The Convention made all parts of government independent as well as dependent on one another. The effect is that power is diluted, resulting in a greater probability that a plurality, both within the federal government and those constituents outside of it, will be influential. 

One must remain cognizant that government workers are impacted by this less than desirable effect of our revered system. Government shutdowns were obviously not the intent of the founders, but rather for those serving to compromise, just as the Constitution itself was a compromise. Nevertheless, temporary stalemates in some form have always been inherent in such a framework, as any new legislation requires collaboration and concessions where there is a separation of powers.

Independence in our branches offers the ability to dissent and propose diverse courses of action, which in turn reinforces our democratic ideals. That ideological and legal framework of the late 18th century still endures. The irony is that while we feel elements of failure and despair during a shutdown, it is also an illustration of the fundamental brilliance of our Republic. 

The Myth of the Liberal Professoriate


For years conservative broadcasters and the right-leaning print media have denounced liberal control of American higher education. This assertion is based on the large number of academic instructors who belong to the Democratic Party and on actions taken at some colleges to promote a sense of inclusiveness and toleration to the point which – some say – discourages free speech. Both of these observations about academe are to a limited extent true, but they indicate that college faculty and administrators are conservative, not liberal, at least in the philosophical sense. And for intellectuals, liberal and conservative social and political values have traditionally rested on philosophical views of human nature. 

Liberal programs initially emerged from the belief that human beings are inherently good or are attracted to the good. Seventeenth century religious liberals like the Quakers used the term Inner Light, or the presence of the Creator in all humans, to explain this, while later liberal theorists like Henry David Thoreau used the term conscience. On the other hand, early classical conservatives rooted their policies in the idea that people are by nature either evil or selfish. Religious conservatives like the Puritans believed that Original Sin left all with a powerful inclination to evil, whereas secularly oriented conservatives like Adam Smith, the father of capitalism, asserted that innate “self-love” drives human action.

Although we often lose sight of the philosophical origins of liberal and conservative policy, today’s public agendas reflect those roots. Conservatives have traditionally supported powerful militaries believing that strong nations selfishly prey on weak ones, while liberals downplayed the need for military spending and substituted investment in education and social programs in order to help individuals maximize their latent moral and intellectual capabilities. Similarly, conservatives advocated criminal justice systems characterized by strict laws and harsh punishment to control people’s evil or selfish impulses, while liberals favored systems that focus on rehabilitation to revitalize latent moral sensibilities. Conservatives traditionally opposed welfare spending believing its beneficiaries will live off the labor of society’s productive members, while liberals believed such investments help those who, often through no fault of their own, find themselves lacking the skills and knowledge needed to succeed. Though the philosophical roots of these policies are frequently forgotten today, these agendas continue to be embraced by liberals and conservatives.

College professors are philosophical conservatives. This is a product of their daily experiences, and it shapes their professional behaviors. First, the realm of experience: senior members of the profession are intimately familiar with the old excuse for failure to complete an assignment, “my dog ate my essay,” and its modern replacement, “my computer ate my essay.” Years ago, missed exams were blamed on faulty alarm clocks; today that responsibility has been shifted to dying iPhone batteries. Term papers cut and pasted from Wikipedia are an innovation; plagiarism is not. A philosophically liberal view of humanity is difficult to sustain amidst such behaviors.

The clearest manifestation of philosophical conservatism in the teaching profession is seen in tests and grades. Testing is based in part on the assumption that individuals will not work unless confronted by the negative consequences of failure, an outlook that is steeped in philosophical conservatism. (Among historians, who spend inordinate amounts of time examining wars and economic depressions, which often resulted from greed and avarice, their academic discipline itself encourages a philosophically conservative outlook.)

How then can academicians be accused of being liberal? As noted, this is partly because the majority of faculty are registered Democrats. Counterintuitively, this reflects a philosophically conservative and not a liberal outlook, especially relating to all-important economic policy. During the debate over the tax bill last year, Republicans continued their traditional support for supply-side or trickle-down economics by proposing to lower taxes on high earners and corporations, whereas Democrats continued to advocate demand-side economics by proposing to shift the tax burden from the large middle class to the small upper class and to provide tax credits for workers. Supply-side economics is based on the assumption that reducing the tax burden on the rich will lead them to invest in plant and equipment, which will create jobs, the advantages of which will trickle down to workers in the form of wages and benefits.

This is a philosophically liberal notion. It assumes that people and corporations will invest in plant and equipment even when wages are stagnant leaving many people without the income needed to purchase the goods new factories will produce. Demand-side economics, on the other hand, is partially based on the philosophically conservative notion that no rich person or corporation will build a plant, if the masses lack the income needed to buy the product. In support of this position today, demand-siders note that many corporations are using most of the surplus capital from last year’s tax cuts to buy back stock instead of investing in capital assets because wealth and income are more concentrated in the hands of the few than in recent history, which minimizes the purchasing power of the many. In supporting tax schemes and other economic policies that put money in the pockets of the many, demand-siders embrace the conservative idea that such programs will stimulate selfishly based investment spending by corporations in an attempt to tap the rising wealth of the majority of consumers. Moreover, demand-side policies will unleash the selfishly oriented entrepreneurial inclinations of working people by giving them the wherewithal to open small businesses that spur economic growth.

College professors and administrators are also attracted to Democratic economic policy because they are aware of the successes of demand-side economics. There has not been a major depression since the New Deal, though there have certainly been recessions. This is because that movement largely achieved its goal of shifting the weight of the government from supporting a supply-side to a demand-side approach to economics by institutionalizing minimum wages, overtime pay for work beyond forty hours a week, unemployment insurance, Social Security, and strong labor unions. This was not part of some left-wing socialist agenda. The goal was to put money into the hands of the many and thus incentivize the capital class to invest in new productive capacity, and more importantly to maintain demand and spending when the economy slows.

Academicians generally, and historians especially, realize that prior to the New Deal, depressions (aka panics) occurred every ten to twenty years and were exacerbated by wage cuts, which reduced demand and led to further layoffs and wage reductions. Minimum wages and union contracts, which guarantee a wage for the life of the contract, have slowed the downward economic spiral that turned recessions into depressions by limiting wage cuts, while Social Security and unemployment insurance have also slowed economic downturns by helping sustain demand as the economy slackened. Though the contemporary right often argues that New Deal programs sought to create a liberal safety net for the poor, academics realize those programs were less attempts to help individuals directly and more attempts to jump-start a stalled economy and to keep it humming, in part by incentivizing the capital class to continue to invest in productive capacity.

Conservatives also label academics as liberals because of their attempts to encourage inclusiveness and discourage what some term hateful speech on campuses. To the extent that this is true, and it often seems exaggerated, it is rooted in philosophical conservatism. Academics realize that language has great symbolic power, and symbols have a tendency to generate emotional as opposed to rational responses which colleges and universities rightly scorn. Academics also recognize that negative symbolism, including language, has served to dehumanize groups, and dehumanization has often led to discrimination and persecution. Only philosophical conservatives can have so little faith in human reason and goodness as to believe that emotionally laden language has the power to perpetuate injustice.

Ironically, the right, in supporting supply-side economics and in tacitly accepting ethnically insensitive and sexist language, is embracing policies rooted in liberal, not conservative, thought, while the university – in favoring the opposite – adheres to a philosophically conservative outlook. Indeed, a traditional conservative would argue that the appeal of supply-side economics and insensitive speech lies in their ability to protect the wealth of the rich and to sustain the increasingly fragile sense of dignity of the middle class.


Fri, 18 Jan 2019 03:25:43 +0000 0
Trump’s White Evangelicals are Nostalgic for an American Past that Never Existed for Blacks and Others

In 2013, I received an email from Rev. Ray McMillan, the pastor of Faith Christian Center, a conservative evangelical and largely African American congregation in Cincinnati, Ohio. McMillan was writing to ask me if I might be interested in participating on a panel at an upcoming conference on evangelicals and racial reconciliation, to be held later that year at Wheaton College, a Christian liberal arts college in western suburban Chicago. I was initially surprised by the invitation. I cared about racial reconciliation, but I had never spoken at a conference on the subject. I was not an expert in the field, and even my own historical work did not dive explicitly into race or the history of people of color in the United States.

I was even more confused when Rev. McMillan asked me to be part of a plenary presentation about my recent book Was America Founded as a Christian Nation? I thought I could probably say a few things about race and the American founding, but I also wondered if someone more prepared, and perhaps more of an activist in this area, might be better suited to speak in my time slot. After a follow-up phone conversation with Rev. McMillan, I began to see what he was up to. He told me that he and other Cincinnati pastors were noticing a disturbing trend in their African American and interracial congregations. Many of their parishioners had accepted the idea, propagated by the Christian Right, that the United States was founded as a Christian nation. McMillan believed that such an understanding of history was troubling for African American evangelicals. The promoters of this view were convincing many African Americans in Cincinnati that they needed to “reclaim” or “restore” America to its supposedly Christian roots in order to win the favor of God.

McMillan could not stomach the idea that a country that was committed to slavery, Jim Crow laws, and all kinds of other racial inequalities could ever call itself “Christian.” Why would any African American want to “reclaim” a history steeped in racism? If America was indeed built on Judeo-Christian principles, then its Founders would one day stand before God and explain why they did not apply these beliefs to African Americans. And if America was not founded as a Christian nation, McMillan needed to tell his congregation that they had been sold a bill of goods.

I often think about Rev. McMillan and the Wheaton conferences on racial reconciliation whenever Donald Trump says that he wants to “make America great again.” I assume that most people, when they hear this phrase, focus on the word “great.” But as a historian, I am much more interested in Trump’s use of the word “again.” For white Americans, “making America great again” invokes nostalgia for days gone by. America was great when the economy was booming, or when the culture was less coarse, or when the nuclear family looked like the Cleaver family on Leave It to Beaver, or when public school children prayed and read the Bible at the start of each day. But as I listened to the African American ministers at the Wheaton conference, I came face to face with the reality that African Americans have very little to be nostalgic about. As one of those preachers observed, “The best time to be black in the United States is right now!” When African Americans look back, they see the oppression of slavery, the burning crosses, the lynched bodies, the poll taxes and literacy tests, the separate but unequal schools, the “colored-only” water fountains, and the backs of buses. Make America great again?

When many conservative evangelical supporters of Donald Trump first heard the phrase “make America great again,” they probably assumed that America was indeed great until the Supreme Court, through a series of cases, removed God from public life. If America was founded as a “shining city on a hill” (as Ronald Reagan taught them) and continued to exist in a unique, exclusive, and exceptional covenant relationship with God long after the decline of Puritanism, then the Christian Right might have a legitimate case. But if America was not founded as a Christian nation, the entire foundation of their political agenda collapses. Christians would still be justified in fighting against abortion and gay marriage, or advocating for religious liberty; but it would be a lot more difficult to use American history to make their case.

As I argued in Was America Founded as a Christian Nation?, until the 1970s, Americans—evangelicals and non-evangelicals alike—believed that they were living in a Christian nation. This is merely a historical statement. I do not mean to suggest that such a view was right or wrong. Neither is it a statement about whether those who made this claim interpreted the Founding Fathers correctly on the matter. To ask whether America was founded as a Christian nation is to take a debate that did not reach any degree of intensity until recently and to superimpose it on the eighteenth-century world of the white men who built the American republic. The Founding Fathers lived in a world that was fundamentally different from our own. It was a world in which Christianity was the only game in town. To be sure, there were some small Jewish communities located in coastal cities, and it is likely that a form of Islam was practiced among some African slaves. But Christianity, especially Protestant Christianity, held unrivaled influence.

The Founding Fathers also had very divergent views of the relationship between Christianity and the nation they were forging. We need to stop treating them as a monolithic whole. Thomas Jefferson and James Madison, for example, were strong advocates for the complete separation of church and state. John Adams and George Washington, like their fellow Federalists, believed that religion was essential to the cultivation of a virtuous citizenry. It is true that the Founders, by virtue of the fact that they signed the Declaration of Independence, probably believed in a God who presided over nature, was the author of human rights, would one day judge the dead, and who governed the world by his providence. Those who signed the US Constitution endorsed the idea that there should be no religious test—Christian or otherwise—required of those wishing to hold federal office. Those responsible for the First Amendment also championed the free exercise of religion and rejected a state-sponsored church.

Yet anyone who wants to use these documents to argue against the importance of religion—in the America of the time of the founding—must reckon with early state constitutions, such as those in Pennsylvania, Massachusetts, and South Carolina, that required officeholders to affirm the inspiration of the Old and New Testaments, to obey the Christian Sabbath, or to contribute tax money to support a state church. Some of the Founders believed that Christians, and only Christians, should be running their state governments. Other Founders rejected the idea of the separation of church and state.

And so, was America founded as a Christian nation? A close examination of the American past makes it very difficult to answer with a definitive “yes” or “no.”

This leads us to a second question: Is America a Christian nation today? It all depends on what one means by “Christian nation.” In terms of the religious affiliation of its population, the United States is unquestionably a Christian nation—in the sense that most Americans identify with some form of Christian faith. Yet while Christianity has had a defining influence on American culture, that influence has waned dramatically in the last fifty years. Moreover, from a legal and constitutional standpoint, it is impossible to suggest that the United States is now a Christian nation. Article 6 of the US Constitution still forbids religious tests for office. The First Amendment still does not allow a religious establishment and still secures religious freedom for all Americans. The fact that some of the individual states at the time of the founding upheld test oaths or supported state churches became irrelevant to this conversation when the Supreme Court, in Everson v. Board (1947), applied the due-process clause of the Fourteenth Amendment to the establishment clause of the First Amendment. In other words, the Supreme Court made it clear that states now had to abide by the US Constitution and the Bill of Rights in matters of religion, much in the same way that states no longer have the right to make their own decisions about whether slavery is legal. Many white evangelicals, especially those who champion the right of states to chart their own course on matters pertaining to religion and political life, are not happy about what the court did in Everson. But it remains the law of the land.

It is easy for white evangelicals to look back fondly on American history. There is, of course, a lot to celebrate. We are a nation founded on the belief that human beings are “endowed by our Creator with certain inalienable rights, namely, life, liberty, and the pursuit of happiness.” We have established some of the greatest colleges and universities in the world. Our standard of living exceeds that of most other countries. When we have failed to live up to our ideals, we have made efforts to correct our moral indiscretions. Those who fought tirelessly to end slavery, curb the negative effects of alcohol, defend human life, and deliver rights to women and the less fortunate come to mind. Americans have proven that they can act with a sense of common purpose and unity. We have seen the American character on display, for example, during two World Wars and in the wake of the September 11, 2001 terrorist attacks. And the United States has always been a place where immigrants can come and start new lives.

At the same time, America is a nation that has been steeped in racism, xenophobia, imperialism, violence, materialism, and a host of other practices that do not conform very well to the ethical standards that Christianity calls followers to live up to. Christians should be very careful when they long for the days when America was apparently “great.” Too many conservative evangelicals view the past through the lens of nostalgia. Scholar Svetlana Boym describes nostalgia as a “sentiment of loss and displacement” that “inevitably reappears as a defense mechanism in a time of accelerated rhythms of life and historical upheavals.” In this sense, nostalgia is closely related to fear. In times of great social and cultural change, the nostalgic person will turn to a real or imagined past as an island of safety amid the raging storms of progress. In other words, to quote Boym again, “progress didn’t cure nostalgia but exacerbated it.” Sometimes evangelicals will seek refuge from change in a Christian past that never existed in the first place. At other times they will try to travel back to a Christian past that did exist—but, like the present, was compromised by sin.

Nostalgia is thus a powerful political tool. A politician who claims to have the power to take people back to a time when America was great stands a good chance of winning the votes of fearful men and women. In the end, the practice of nostalgia is inherently selfish because it focuses entirely on our own experience of the past and not on the experience of others. For example, people nostalgic for the world of Leave It to Beaver may fail to recognize that other people, perhaps even some of the people living in the Cleavers’ suburban “paradise” of the 1950s, were not experiencing the world in a way that they would describe as “great.” Nostalgia can give us tunnel vision. Its selective use of the past fails to recognize the complexity and breadth of the human experience—the good and the bad of America, the eras that we want to (re)experience (if only for a moment) and those we do not. Conservative evangelicals who sing the praises of America’s “Judeo-Christian heritage” today, and those who yearn for a Christian golden age, are really talking about the present rather than the past.

Alice Walker: In Her Own Words

Alice Walker (Image Courtesy Virginia DeBolt)

Related Links

● Alice Walker's Not Guilty of Antisemitism By Robert Cohen
● Alice Walker's Shame By Gil Troy

In the past weeks the mainstream media have carried articles casting Alice Walker, the Pulitzer Prize-winning novelist and poet, as an anti-Semite. Many of these articles, including the one by Gil Troy on HNN, have used the same few, select quotations from her poetry, with no contextualization, in order to indict Walker. They have ignored evidence from both her life history and her writings that contradicts the charge of anti-Semitism.

My view is that anti-Semitism is such a loathsome and dangerous form of bigotry that one should be careful and judicious -- examining all the relevant evidence, past and present -- before labeling someone an anti-Semite. All the more so in the case of Walker, a lifelong foe of bigotry. But instead there has been a rush to judgment, and a failure to examine Walker in a fair-minded way, so that the nuances of her thought and writing have been simply ignored.

Hoping that it is possible to restore some semblance of fairness and balance to this discussion, I offer below, complete and unedited, Alice Walker's poem "Conscious Earthlings," written earlier this year. The poem offers fierce criticism of Israel yet conveys warmth towards Jewish friends; it praises Jews who demonstrate solidarity on behalf of a world free of bigotry and share "the dream of one humanity ... one United Tribe of Conscious Earthlings." By reading the poem you can judge for yourself whether it offers evidence that critics have erred in denouncing Alice Walker as anti-Semitic.




Is There a Statesman in the House?  Either House?  


It seems likely that 2019 will be one of the most challenging and consequential years in recent American history, perhaps on par with 1974. When Robert Mueller’s investigation of President Donald Trump concludes and his report, in all likelihood, is sent to Congress, a time of reckoning will be upon us. 

The United States will need leaders in both parties to display a quality that has been in short supply in our country in recent years: statesmanship.

Statesmanship is a pattern of leadership, and an approach to public service, that is characterized by vision, courage, compassion, civility, fairness, and wisdom. Statesmanship can involve bipartisanship, but it is not the same as bipartisanship. History proves that there can be a strong bipartisan consensus to enact harmful policies or to evade difficult alternatives. 

When statesmen consider public policy issues, their first question is, “What is in the public interest?” Personal and partisan considerations can follow later, but hopefully much later. If the national good is not identical to, or even clashes with, personal and partisan considerations, the former must prevail. 

Genuine statesmanship requires leaders to dispassionately consider issues, carefully weigh evidence, and fairly render verdicts, even if they go against personal preferences or are contrary to the desires of their political base. Given our current political climate it is easy to forget that statesmanship, while unusual, has been a critical feature of American politics and history. 

Republican senator Arthur Vandenberg played a pivotal role in the late 1940s in securing congressional approval of key elements of President Harry Truman’s foreign policy, including the Marshall Plan and NATO. Margaret Chase Smith, a first-term GOP senator from Maine, broke from party ranks in 1950 and challenged Joseph McCarthy and his demagogic tactics. Senate Republican Leader Howard Baker damaged his chances for the 1980 GOP presidential nomination by supporting the Panama Canal treaties that were negotiated by President Jimmy Carter. Republican Richard Lugar, then the chairman of the Senate Foreign Relations Committee, defied President Ronald Reagan in the mid-1980s and pushed for economic sanctions on the apartheid regime in South Africa. 

Decisions of similar gravity are likely to face political leaders in Washington in 2019. Democrats should fairly evaluate Mueller’s report as it pertains to alleged Russian collusion and obstruction of justice by the president. They should not overreach because of their antipathy to Trump, nor, if Mueller’s findings suggest impeachable crimes, should they avoid their responsibilities out of fear that such an action would complicate their 2020 prospects. Republicans must end their reflexive and unworthy tendency to overlook the president’s frequently egregious and possibly criminal behavior because Trump remains hugely popular with the GOP base. 

Senator Paul Simon, a consequential and successful public official in Illinois for more than four decades, worried during his final years that statesmanship appeared to be at low ebb. “We have spawned ‘leadership’ that does not lead, that panders to our whims rather than telling us the truth, that follows the crowd rather than challenges us, that weakens us rather than strengthening us,” he wrote. “It is easy to go downhill, and we are now following that easy path. Pandering is not illegal, but it is immoral. It is doing the convenient when the right course demands inconvenience and courage.”

Decades earlier, Senator John F. Kennedy wrote eloquently on the subject. In Profiles in Courage, he argued that politicians sometimes face “a difficult and soul-searching decision” in which “we must on occasion lead, inform, correct and sometimes even ignore constituent opinion.” Kennedy added that being courageous “requires no exceptional qualifications, no magic formula, no special combination of time, place and circumstance. It is an opportunity that sooner or later is presented to all of us. Politics merely furnishes one arena which imposes special tests of courage.” 

Those tests are coming in 2019.

Fox News’s Nefarious Role in Misinforming Trump Voters


During the past two years in which Donald Trump has stumbled through his presidency, critics have been asking why so many Americans continue to back him despite mounting evidence of deeply flawed leadership. Often, these critics express contempt for the millions of Americans who constitute Trump’s “base.” They complain that Trump’s partisans are uninformed people who refuse to acknowledge that the president’s lying, ethical lapses, and failed policies are harming the nation. Trump remains in power, these critics argue, largely because starry-eyed followers ignore the facts.

These critics cast blame in the wrong place. Trump’s supporters, representing 38% of the electorate according to a recent poll, do not deserve all the censure that is directed at them. They did not create the pro-Trump narrative. They are its recipients. Conservative media have been especially influential in promoting optimistic judgments about Trump’s leadership. Fox News serves as command central for this perspective. It draws a large audience. In October 2018, according to Nielsen’s research, Fox racked up its 28th consecutive month as the No. 1 basic cable news channel. Fox drew more viewers than CNN and MSNBC combined. 

Millions of Americans who wish to be informed about current events tune in regularly to Fox. Once they are Fox fans, they tend to stick with the channel. The hosts and reporters on Fox News encourage loyalty. Frequently, they make damning references to CNN (a favorite target) as well as CBS, ABC, NBC, MSNBC, the New York Times and the Washington Post. They hint that only Fox can be trusted. Don’t look elsewhere for information, they warn, because “liberal” networks hawk “fake news.”

What kind of reporting does the Fox News viewership receive through prime-time reporting and commentary? Consider the lessons viewers learned on Thursday, December 20, 2018, an extraordinary day of troubles for Trump’s presidency. Leading print and television journalists outside of Fox expressed shock that Trump suddenly announced plans to withdraw U.S. troops from Syria and quickly draw down half of U.S. troops in Afghanistan. They warned that an American exodus in Syria could benefit the Russians, Syrians, Iranians, and Turks at the expense of Kurds who fought bravely against ISIS. Military leaders and foreign policy experts blasted Trump’s decision as ill-advised and dangerous. 

General Jim Mattis’s decision to resign as Secretary of Defense, also received abundant commentary on December 20. Mattis’s letter of resignation communicated strong disagreement with the direction of U.S. foreign policy. Mattis, in a clear rebuke of the president, noted that during his four decades of “immersion in the issues” he had learned the importance of treating allies with respect and “being clear-eyed about malign actors and strategic competitors.” Both American and international leaders were alarmed that the last general who seemed capable of taming the erratic president planned to leave his post.

On the home front, President Trump led congressional leaders to believe that a compromise was workable on temporarily funding the government. But Trump suddenly reversed his position, insisting there would be no settlement unless Congress provided $5 billion for a border wall. On December 20 the stock market tanked on this news and other developments. The next day Wall Street closed with its worst week since the financial crisis of 2008.

Mainstream journalists focused on the chaos associated with these developments. Several Republican leaders joined them in expressing concern, including Senators Lindsey Graham and Bob Corker. Senate Majority Leader Mitch McConnell backed compromises aimed at averting a shutdown, but Trump and his backers in the Freedom Caucus made compromise unworkable.

Fox News television viewers got almost no sense of this mounting crisis when watching prime-time programming on the night of December 20. Shows hosted by Martha MacCallum, Tucker Carlson, Dan Bongino (sitting in for Sean Hannity) and Laura Ingraham directed viewers’ attention to other matters. The programs focused on a subject that had already received extensive coverage on Fox in previous months and years: undocumented immigrants. Hosts and guests warned repeatedly that dangerous foreigners threatened to overrun American society. 

Even though the real “news” on December 20 was about struggles in Congress to keep the government running, Fox’s prime-time programming highlighted stories about an immigrant invasion. Commentators asserted falsely that Democrats advocated “open borders.” They accentuated a report about a violent undocumented immigrant in California. In each program hosts and commentators left viewers with an impression that the big news of the day concerned security threats from aliens. Speakers praised President Trump for his determination to build a wall.

What other topics dominated the night’s discussions on Fox, essentially eclipsing any discussion of the big stories of the day about blow-back from the president’s controversial actions? 

Martha MacCallum’s program featured a lengthy interview with Susan Collins. The senator from Maine talked at length about some extremist critics of Brett Kavanaugh’s nomination to the Supreme Court (individuals who harassed her in disgusting ways). 

Tucker Carlson drew attention to the work of Robert Shibley, who maintained that America’s universities have been pushing aggressively against free speech. Carlson also took shots at a “Climate Tax” and made fun of claims about Russian interference. He maintained that China was the real threat. 

Dan Bongino and his guests blasted Deputy Attorney General Rod Rosenstein’s defense of the “so-called Russia investigation.” Bongino praised a “terrific book” by Gregg Jarrett called The Russia Hoax: The Illicit Scheme to Clear Hillary Clinton and Frame Donald Trump.

Laura Ingraham, like all the other prime-time hosts, devoted considerable time to immigration, but she encountered difficulty when a sheriff objected to some of her arguments. The officer acknowledged the difficulty of holding a violent undocumented man in California because of laws pertaining to sanctuary cities, but he noted that many immigrants in his community were good citizens and sought help from law enforcement when troubled by criminals. “That’s a lie!” Ingraham responded. She ended the interview quickly by mentioning “violence, rape, burglary, robbery and other offenses against property and people.”

A pattern in the reporting and commentary on Fox was evident in these four prime-time programs. Because the hosts focused on old news stories that had been red meat in right-oriented media commentary for years, there was little time left for an analysis of the stunning developments of the previous 24 hours. General Mattis’s resignation letter hardly got a nod. The outcry by national and international leaders regarding President Trump’s plan to withdraw from Syria and Afghanistan received little or no attention. The impact of a government shutdown on the American economy and the American people was almost completely ignored. There was hardly a word about the stock market’s plummet that day or the huge slide of recent weeks. Instead, viewers heard about scary threats from immigrants, Democrats, university administrators, and Chinese hackers. They were reminded often that President Trump fights tenaciously for ordinary Americans.

Much of the discussion did not reflect what used to be identified as mainstream Republican stands on economic and political affairs. Instead, viewers got an earful of analysis from individuals who spoke from the margins of political debates. Hosts and commentators seemed eager to please their most important viewer, the President of the United States. Most of them endorsed and celebrated Donald Trump’s statements and actions, despite their sharply controversial nature. Dan Bongino, substituting as host for Sean Hannity, provided an example of the slant by promoting his co-authored book, Spygate: The Attempted Sabotage of Donald J. Trump. 

Endorsing controversial and often questionable ideas has, of course, been evident in Fox’s broadcasting over many years. A decade and a half ago, the channel sounded a drumbeat for war in program after program, convincing many viewers that Saddam Hussein had been responsible for the 9/11 tragedy, threatened the world with WMDs, including nukes, and needed to be removed. On the same channel host Bill O’Reilly devoted considerable time to warning viewers about a “war on Christmas.” Anti-religious forces were trying to replace “Merry Christmas” with “Happy Holidays,” O’Reilly cautioned. During those years, Roger Ailes, then Chairman and CEO at Fox News, designed the channel’s modus operandi, applying a heavy-handed slant on the news.

Complaints about Donald Trump’s enthusiasts are often misplaced. Many citizens who are regular patrons of Fox News and other right-oriented programming want to be well-informed about current events. They take civic engagement seriously. Those listeners and viewers should be more discerning, of course, but they are not fully to blame for making judgments about national and international affairs that raise the eyebrows of Trump’s critics. Fox News, and other opinion sources on television and radio are primarily responsible for the base’s narrow and skewed viewpoints. Millions of Americans have turned to the Fox News Channel and other sources, seeking knowledge about current events. They have been let down by the manipulators of “news.”

Alice Walker's Not Guilty of Antisemitism

Across the internet, poet and novelist Alice Walker this week has been harshly denounced as anti-Semitic on the basis of an interview she did with the New York Times Book Review last Sunday, for her comments on And The Truth Shall Set You Free, by David Icke, whose work has been condemned as anti-Semitic. In the interview, Walker was asked what books were on her nightstand. She mentioned a number of books, one of which was Icke's, and said that "In Icke's books there is the whole of existence, on this planet and several others, to think about. A curious person's dream come true." On her website, in response to the tidal wave of criticism of her for not criticizing Icke, Walker wrote that she did not regard him or his work as anti-Semitic, and defended her freedom to read whatever works interested her. 


Whatever the merits of Walker’s reading of Icke, her life history has been one in which she has consistently and eloquently battled bigotry since her teenage college years at Spelman College, where she was active in the Atlanta movement against racial discrimination and the Jim Crow system. As one who has studied Walker’s history of political activism, I find no trace of anti-Semitism, but instead find a humane identification with the oppressed, including Palestinians, and a dedication to battling war, poverty, and hatred.


The period in Walker’s life that I have studied most closely is her time as a Spelman college student in the early 1960s.  Though a great source of student activists, Spelman, a historically black women’s college in Atlanta, was socially conservative in the extreme, presided over by a campus administration that often sought to stifle student and faculty activism.  The most prominent faculty activist at Spelman back then was history department chair Howard Zinn, who served as both a mentor and supporter of the students protesting both racism in downtown Atlanta and paternalistic regimentation of student life on campus.  Zinn’s role in supporting the student movement led to his firing in 1963.   


Walker, a student of Zinn, took a courageous stand on behalf of this Jewish teacher she respected and loved, risking her educational career to protest his firing. Coming from an impoverished rural family, Walker needed her scholarship funds to continue her education and knew that Spelman's authoritarian president could easily have revoked that scholarship in response to the open letter she wrote on behalf of Zinn's reinstatement. Yet she stood up for her teacher even though doing so jeopardized her own future. Zinn and Walker would remain lifelong friends.


Yes, this was a long time ago: 1963. But Walker is as proud today as she was back then to have risked it all on behalf of her radical Jewish teacher. I know this because this year I published a book on Howard Zinn, Spelman, and the Atlanta student movement, Howard Zinn's Southern Diary: Sit-Ins, Civil Rights, and Black Women's Student Activism, and Walker wrote a foreword to it that described in moving terms how her and her family's love of education and reverence for teachers, along with her passion for freedom and justice, motivated her to stand up for her beloved teacher. Walker back then, as my late mother would say (in Yiddish), was a mensch who acted selflessly. She still is, and she still does.

Fri, 18 Jan 2019 03:25:43 +0000 0
Do We Really Need Billionaires?


According to numerous reports, the world’s billionaires keep increasing in number and, especially, in wealth.

In March 2018, Forbes reported that it had identified 2,208 billionaires from 72 countries and territories.  Collectively, this group was worth $9.1 trillion, an increase in wealth of 18 percent since the preceding year.  Americans led the way with a record 585 billionaires, followed by mainland China which, despite its professed commitment to Communism, had a record 373. According to a Yahoo Finance report in late November 2018, the wealth of U.S. billionaires increased by 12 percent during 2017, while that of Chinese billionaires grew by 39 percent.

These vast fortunes were created much like those amassed by the Robber Barons of the late nineteenth century.  The Walton family’s $163 billion fortune grew rapidly because its giant business, Walmart, the largest private employer in the United States, paid its workers poverty-level wages.  Jeff Bezos (whose fortune jumped by $78.5 billion in one year to $160 billion, making him the richest man in the world), paid pathetically low wages at Amazon for years―until forced by strikes and public pressure to raise them. In mid-2017, Warren Buffett ($75 billion), then the world’s second richest man, noted that “the real problem” with the U.S. economy was that it was “disproportionately rewarding to the people on top.” 

The situation is much the same elsewhere.  Since the 1980s, the share of national income going to workers has been dropping significantly around the globe, thereby exacerbating inequality in wealth.  “The billionaire boom is . . . a symptom of a failing economic system,” remarked Winnie Byanyima, executive director of the development charity, Oxfam International.  “The people who make our clothes, assemble our phones and grow our food are being exploited.”

As a result, the further concentration of wealth has produced rising levels of economic inequality around the globe.  According to a January 2018 report by Oxfam, during the preceding year some 3.7 billion people―about half the world’s population―experienced no increase in their wealth.  Instead, 82 percent of the global wealth generated in 2017 went to the wealthiest 1 percent.  In the United States, economic inequality continued to grow, with the share of the national income drawn by the poorest half of the population steadily declining.  The situation was even starker in the country with the second largest economy, China. Here, despite two decades of spectacular economic growth, economic inequality rose at the fastest pace in the world, leaving China as one of the most unequal countries on the planet.  In its global survey, Oxfam reported that 42 billionaires possessed as much wealth as half the world’s population.

Upon reflection, it’s hard to understand why billionaires think they need to possess such vast amounts of money and to acquire even more.  After all, they can eat and drink only so much, just as they surely have all the mansions, yachts, diamonds, furs, and private jets they can possibly use.  What more can they desire?  

When it comes to desires, the answer is:  plenty!  That’s why they drive $4 million Lamborghini Venenos, acquire megamansions for their horses, take $80,000 “safaris” in private jets, purchase gold toothpicks, create megaclosets the size of homes, reside in $15,000 a night penthouse hotel suites, install luxury showers for their dogs, cover their staircases in gold, and build luxury survival bunkers.  Donald Trump maintains a penthouse apartment in Trump Tower that is reportedly worth $57 million and is marbled in gold.  Among his many other possessions are two private airplanes, three helicopters, five private residences, and 17 golf courses across the United States, Scotland, Ireland, and the United Arab Emirates.

In addition, billionaires devote enormous energy and money to controlling governments. ”They don’t put their wealth underneath their mattresses,” observed U.S. Senator Bernie Sanders; “they use that wealth to perpetuate their power.  So you have the Koch brothers and a handful of billionaires who pour hundreds of millions of dollars into elections.”  During the 2018 midterm elections in the United States, America’s billionaires lavished vast amounts of money on electoral politics, becoming the dominant funders of numerous candidates.  Sheldon Adelson alone poured over $113 million into the federal elections.  

This kind of big money has a major impact on American politics.  Three billionaire families―the Kochs, the Mercers, and the Adelsons―played a central role in bankrolling the Republican Party's shift to the far Right and its takeover of federal and state offices. Thus, although polls indicate that most Americans favor raising taxes on the rich, regulating corporations, fighting climate change, and supporting labor unions, the Republican-dominated White House, Congress, Supreme Court, and regulatory agencies have moved in exactly the opposite direction, backing the priorities of the wealthy.

With so much at stake, billionaires even took direct command of the world’s three major powers.  Donald Trump became the first billionaire to capture the U.S. presidency, joining Russia’s president, Vladimir Putin (reputed to have amassed wealth of at least $70 billion), and China’s president, Xi Jinping (estimated to have a net worth of $1.51 billion).  The three oligarchs quickly developed a cozy relationship and shared a number of policy positions, including the encouragement of wealth acquisition and the discouragement of human rights.

Admittedly, some billionaires have signed the Giving Pledge, promising to devote most of their wealth to philanthropy.  Nevertheless, plutocratic philanthropy means that the priorities of the super-rich (for example, the funding of private schools), rather than the priorities of the general public (such as the funding of public schools), get implemented.  Moreover, these same billionaires are accumulating wealth much faster than they donate it.  Philanthropist Bill Gates was worth $54 billion in 2010, the year the pledge was announced, and his wealth stands at $90 billion today.

Overall, then, as wealth is concentrated in fewer and fewer hands, most people around the world are clearly the losers.  

How a Harlem Immigrant Views What’s Happening on the Border


The confluence of seemingly random things can sometimes trigger the need to re-examine who we are, where we come from, what we believe and what we really know and understand. Not just about ourselves individually, but, often of greater significance, about our country.

On the eve of the November mid-term elections the President of the United States again revisited the same racist, anti-immigrant diatribe with which he launched his Republican Party presidential campaign three years earlier. With overwrought, fear-mongering warnings he harangued the electorate, inveighing against a "massive" horde: a caravan of brown people, replete with Middle-Eastern terrorists, poised to invade the US southern border, take our jobs and assault our women. Press coverage of the several thousand people slowly streaming from various Central American countries toward the US-Mexican border showed a somewhat different picture: poor, often ill-shod people, many of them women and children, slowly walking northward in hopes of gaining asylum and a better life in the US. Despite the President's racially biased pre-election exhortations to his core followers, his political party lost 40 seats and control of the US House of Representatives. Two weeks later the same President authorized US troops, possibly illegally sent by him to that southern border, to use lethal force against those "invaders" if and when they tried to cross the border. By the last week of November we had been treated to the sight of desperate people trying to do just that being indiscriminately tear-gassed: men, women with toddlers, even pregnant women. In a nutshell: a rather ugly, quite dismaying sight. That same week my good friend and journalist-colleague Raymond Peterson sent me a link to a story about the death of one of the last Navajo Code Talkers of WWII at the age of 94. And late on the evening of 30 November news broke that the 41st President of the US, George Herbert Walker Bush, had died, also at the age of 94.

Definitely an interesting confluence of events in the month of November, one that worked in sinuous ways within my conscious/unconscious psyche, especially since, as a veteran practicing journalist with a rather unique background and perspective, I've long considered myself the ultimate insider-outsider "observer" of the foibles and follies of this country. Among those converging "strands of confluence" that triggered re-assessment and realignment of knowledge and understanding: the consummate patriotism of that 94-year-old Navajo, emblematic of all those Code Talkers whose native language became the US military's impenetrable communications code the Japanese could not break, and thus helped assure the Allied WWII victory in the Pacific. Navajo leaders reportedly say there are now fewer than ten Code Talkers still alive. Many of them had to "jump through hoops" just to be allowed to "perform their civic duty" for their country. (So too with the Nisei Japanese-American 442nd Infantry Regiment, which became the most highly decorated military unit in US history; the Tuskegee Airmen; and the Harlem Hellfighters of WWI.)

Tie this Navajo Code Talker's service and passing to the equally patriotic WWII fighter-pilot deeds and passing of George H. W. Bush, reportedly the youngest Navy fighter pilot in that war: a bona fide decorated war hero who continued to serve his country as a dedicated citizen and public political figure (a US Congressman from Texas, US Ambassador to the UN, Director of the CIA, 43rd Vice President of the US and, of course, our 41st President). But then you must also follow the equally connected threads and sinews. The same George H. W. Bush rightly lauded for his astute handling of the end of the Cold War (the successful dissolution of the Soviet Union and smooth re-unification of Germany after the fall of the Berlin Wall), for the swift, timely assembly and leadership of a UN coalition to stop and repel Saddam Hussein's invasion of Kuwait, and for advocating "a thousand points of light" in a "kinder, gentler America," is the same George H. W. Bush whose 1988 bid for the presidency launched the modern era of vicious, negative, divisive, race-baiting political campaigning with the infamous "Willie Horton" campaign ad, a tactic the current occupant of the White House still relies upon quite heavily. He is also the same George H. W. Bush who was fully complicit in the Reagan administration's Iran-Contra Affair and its cover-up (illegally selling missiles to Iran, via Israel, to fund the CIA-backed Contra forces fighting the Sandinista regime in Nicaragua), subverting and defying laws passed by the US Congress that terminated funding for those Contras. As stated in the National Security Archive's 25 November 2011 posting: "Independent Counsel Lawrence Walsh continued to consider filing criminal indictments against both Reagan and Bush." Neither man was indicted, but it should be noted that before he left office, the 41st President pardoned every member of the Reagan administration indicted and convicted for their part in Iran-Contra.
One individual had not even come to trial, but was pardoned anyway. Closer to home for me, and equally negative, was the George H. W. Bush-ordered invasion of Panama in 1989, in which at least 3,000 civilians lost their lives so General Manuel Noriega could be "arrested" for aiding, abetting and profiting from the drugs the cartels were funneling into the US. Keep in mind this fact: folks already knew General Noriega was in bed with those drug runners when the CIA recruited him as an "asset," primarily to aid them in their war against the Sandinistas in Nicaragua. The individual running the CIA at the time: George H. W. Bush.

So those disparate threads and sinews triggered interesting synaptic connections.  One was simply this, as Vice writer Cole Kazdin explained last summer: “Many historians and policy experts are quick to point out that much of the troubles in Central America were created or at least helped by the US’s interference in those countries going back decades. In other words, the foreign policy of the past has profoundly shaped the present immigration crisis.” And there have been similar reports in the New York Times within the past two months. 

But those are basically just reminders of “causal events” I fully understand, having lived through and covered some of them. I was the ABC TV News Foreign Assignment Editor the afternoon of June 20th 1979 when our cameraman, Jack Clark, called with the shocking news that correspondent Bill Stewart had been executed by one of Nicaraguan dictator Anastasio Somoza’s National Guardsmen. Jack had captured the horrific event on film. Film we smuggled out and shared with CBS, NBC and by extension the international news community. The upshot: The US (Carter Administration) withdrew support for Somoza’s dictatorial regime, which was quickly overthrown by the insurgent, revolutionary Sandinistas. 

You see where this is going, right? The incoming Reagan Administration abhorred the allegedly "Communist" Sandinistas and did everything in its power, legal and illegal, to thwart and destroy them. Thus we get the CIA-backed Contras, and the subsequent Iran-Contra web of lies, deceit, cover-up, prosecutions and pardons outlined above. And yes, out of that US-caused or US-influenced turmoil and violence in Nicaragua and other similar "adventures" in Central America, we are now "graced" with our current debacle: a militarized southern border, shuttered legal border crossings and a non-existent amnesty policy. We are now graced with a White House occupant threatening to shut down the government if he doesn't get funding for a "Border Wall" instead of demanding comprehensive immigration reform legislation. We are now graced with stories of seven-year-old girls dying of dehydration in the custody of US Customs and Border Protection. This is who we are now?

What, you're now wondering, do all these intersecting, interwoven people, places, and events have in common, and what do they mean to any of us? For me it's about belonging or not belonging, about inclusion and exclusion, about who is welcomed and who is not, and thus ultimately who we are as a people and a nation. That caravan of Central Americans is made up of brown people. That Navajo Code Talker would be called a "Redskin" by some in this country. Those Nisei warriors, the Tuskegee Airmen, and the Harlem Hellfighters were "yellow" and "black" people of color who many residents of this country felt did not "belong." And as the stoker-of-fear-and-division in the White House fully understands, some still feel this way.

From its very beginnings this country has wrestled with this question. Our various attempts at legislating an answer have always been race-based, heavily white/Caucasian, Northern European-inflected. Asians, Blacks, Browns, the darker-skinned Eastern Europeans, even the country's Native Americans were all deemed of lesser worth and not to be truly welcomed into the fabric of the nation's citizenry. This I observed and analyzed day to day, living and growing up in the village of Harlem after my 9th birthday. Yours truly, me, the ultimate insider-outsider: a native-born immigrant. Oxymoron? Not really.

If the late Senator John McCain, a white male from the state of Arizona born in the Panama Canal Zone, was a "native born" son eligible to run for the presidency of the US, why not a black kid from Harlem, born in the US hospital at Ancon, Canal Zone, with, ahem, a birth certificate to prove it? Equally "native born," right? Unfortunately for a number of people, back in June of 1952, in another of those virulently anti-immigrant waves reminiscent of the current xenophobic crest we seem to be drowning in, the McCarran-Walter Immigration bill was passed over President Harry Truman's veto.

I took great pains and immense personal pleasure in highlighting its inhumane, racist, anti-immigrant nature in Our World, Winter 1952: Fear and Frustration, a documentary I produced in 1986 for Our World (ABC TV News), a short-lived but outstanding documentary series. It was a horrendous piece of legislation, as Mr. Truman bluntly stated before exercising his veto power:

“it discriminates, deliberately and intentionally, against many of the peoples of the world ... The idea behind this discriminatory policy was, to put it baldly, that Americans with English or Irish names were better people and better citizens than Americans with Italian or Greek or Polish names…. Such a concept is utterly unworthy of our traditions and our ideals. It violates the great political doctrine of the Declaration of Independence that "all men are created equal." 

It also made citizenship and legal immigration a very murky and highly difficult prospect not just for the darker-hued folks of Southern and Eastern Europe, but also for certain Asians and the black and brown people of Latin America and the Caribbean. This current 21st century wrangling over illegal and legal immigration—the now “back-burnered” Deferred Action for Childhood Arrivals (DACA) struggle and the current fear-mongering about the “Caravan Invasion”—are just another re-visitation of the age-old inclusionary/exclusionary struggle that forever plagues this country.

Mavis, my wise and proactive mother, took no chances with the citizenship of her children. She used her status as a legal immigrant and naturalized citizen to have us naturalized. Thus this black kid from Harlem's unique native-born-immigrant status, with birth certificate and naturalization papers to prove it. The ultimate insider-outsider.

In history class as a high school student I learned this lesson: the U.S. Constitution as originally written was a morally flawed document that condoned and protected the ownership of human beings by other humans. It did not merely condone slavery; it actively supported it.

Fr. Tiffany, my Cardinal Hayes High School American history teacher, imparted in-depth, eye-opening awareness and insights to us in that history class. Among the key ones: the country’s founding document was written to protect and safeguard the rights and property of white, male oligarchs. Fortunately it was written with the foreseen and unforeseen flexibility to eventually make it applicable to everyone, not just those oligarchs. 

Yes, as we are all too well aware, who tells the story, and from what perspective, what's included or left out, is central to who we are as a people and as a society. It is central to a full understanding of the underpinnings and bedrock of that society. Learn and accept the fact that this continent and country have always been a place of and for immigrants. Eons ago the ancestors of the Native American First Peoples migrated from Asia across the Bering Strait land-bridge ("Beringia") and eventually populated this land mass from north to south. That is no longer hypothesis: recent DNA testing has confirmed the Siberian and Beringian origins of these tribes. Later immigrants, whether Vikings and other Europeans (Italians, Dutch, Spaniards, English) or more free and enslaved Africans, especially into what's now North America, just added to an ongoing, continuous influx. It has always been a diverse, multicultural land. Of all the works delineating and exploring this, especially as it pertains to the United States, my favorite by far is A Different Mirror: A History of Multicultural America by the late, and for me really great, Dr. Ronald Takaki. It's a must-read to fully understand the root causes of the racial problems and constantly recurring "exclusionary" immigration tendencies of this country.

Those flawed Founding Fathers were not exceptions in fomenting these national ills. Yet even as there are significant anti-immigrant advocates today, there were also "inclusionary" voices back then: men like John Jay and Alexander Hamilton, who pushed hard for educating free and enslaved Africans so they too could enjoy the full fruits of freedom and self-government promised by the recently won American Revolutionary War. Note well: from the very beginning, always that exclusionary/inclusionary battle that resurfaces over and over again.

We need to re-examine and redefine ourselves to become that beacon of refuge and light we’ve only been pretending to be for a select few. We need to take a page from the positive side of George H. W. Bush and seek a “kinder, gentler America” that re-emphasizes what President Harry Truman also stated in his 1952 veto of the McCarran-Walter Immigration Bill: “It denies the humanitarian creed inscribed beneath the Statue of Liberty proclaiming to all nations, ‘Give me your tired, your poor, your huddled masses yearning to breathe free.’ "

Cliché or no, if that’s not who we are now, that’s exactly who and what we need to strive to be.

Lots of People Won New Rights in the 1960s, but Not College Women Athletes

Chris von Saltza, Olympic champion (photo: Harry Pot, Dutch National Archives/Anefo, CC BY-SA 3.0 NL)


Today the #MeToo movement puts the spotlight on young women in college who have been abused without much recourse. Most media attention exposes flagrant violations by men, from date rape on campus to gender harassment by executives in the workplace. Looking back to the 1960s, however, another pervasive abuse was the benign neglect of colleges and universities. Women students were treated inequitably in campus activities, especially in intercollegiate sports. Graphic examples can help us remember and learn from past practices.

Between August 26th and September 11th in 1960 Chris von Saltza stood on the victory podium at the Olympic Games in Rome four times to receive swimming medals, a total of three gold and one silver. She then entered Stanford University and graduated in 1965 with a bachelor’s degree in Asian history, gaining prominence in her long career as a computer scientist. After the 1960 Olympics Chris never had an opportunity to swim competitively for a team again. Stanford, after all, did not offer varsity athletics teams for women. What was a young woman to do? There was no appeal. For better or worse, this was the way things were in American colleges back then. 

In contrast to Chris von Saltza's experience, over a half-century later another high school senior, American swimmer Katie Ledecky, won five medals at the 2016 Olympics in Rio de Janeiro. She, too, enrolled at Stanford as a freshman, in the fall of 2016. The historic difference was that she had a full athletic grant-in-aid plus a year-round national and international schedule of training and competition, along with prospects for substantial income from endorsements and a professional athletics career.

In 2018 there are pinnacles of success that indicate changes since the 1960s. Katie Ledecky has excelled as a student and works as a research assistant for a Stanford psychology professor. Ms. Ledecky also led her Stanford women's swimming team to two national collegiate championships and recently signed a $7 million professional swimming contract.

Connecting the dots to explain the comparisons and contrasts between these two Olympic champion swimmers who were students at Stanford requires reconstructing the condition of college sports in the decade from 1960 to 1969. Chris von Saltza's lack of collegiate opportunities at Stanford was not an isolated incident. Following World War II, American women triumphed in the Olympic Games every four years, but with little base provided by high school or college sports.

At the 1964 Olympic Games in Tokyo, the women's swimming star was Donna DeVarona, who won two gold medals. In 1964 she was featured on the covers of both Time and Life magazines and named the outstanding woman athlete of the year. Despite her achievements, her competitive swimming career was over, as she and other women athletes had few if any options for formal training and participation in intercollegiate sports or elsewhere.

Young women from the U.S. won gold medals in numerous Olympic sports. A good example was Wilma Rudolph, who won three gold medals in track and field at the 1960 Olympics in Rome. Rudolph benefitted from one of the few college track and field programs for women in the U.S., coached by Ed Temple at historically black Tennessee State University. Most of the team's competition came at Amateur Athletic Union (AAU) meets, with no conference or national college championship meets available. Furthermore, at Tennessee State, funding and facilities were lean.

The limits on women’s sports are revealed in college yearbooks of the era. A coeducational university campus yearbook devoted about fifty pages to men’s sports, especially football and basketball. In contrast, women’s athletics typically received three pages of coverage. In team pictures, the uniforms often were those of gym class gear. The playing format was for one college to sponsor a “play day” in which five to ten colleges within driving distance gathered to sponsor tournaments in several sports at once. Softball, field hockey, basketball, and lacrosse were foremost.

Coaches, usually women, typically received minimal pay. Most held staff appointments in physical education, where they taught activity classes. The women's gym had few, if any, bleachers for spectators. Coaches of the women's teams usually lined the playing fields with chalk, mopped and swept the gymnasium floors, and gathered soiled towels to send to the laundry. One indispensable piece of equipment for a woman coach was a station wagon, as players and coaches piled in with equipment to drive to nearby colleges for games and tournaments. The women's athletic activities often had their own director, yet another example of "separate but unequal" in intercollegiate athletics and all student activities. There was a perverse equality of sorts: all women students were required to pay the same mandatory athletics fee as male students, even though the bulk of it went to subsidize varsity teams that excluded women.

Despite the lack of intercollegiate sports for women in the 1960s, there were some signs of life. One was the creation of alliances that eventually led to chartering a national organization, the Association for Intercollegiate Athletics for Women (AIAW), in 1971, with over 280 colleges as members. The first action the Division for Girls and Women's Sports (DGWS) took was to establish the Commission on Intercollegiate Athletics for Women (CIAW) to assume responsibility for women's intercollegiate sports and championships.

One heroic figure associated with women's sports to emerge in the decade was Donna Lopiano, who graduated with a degree in physical education from Southern Connecticut State University in 1968. She excelled in sports as a girl and was the top player picked in the local Little League draft in Stamford, Connecticut. However, she was forbidden to play baseball with the boys due to gender restrictions in the league's by-laws. Lopiano started playing women's softball at the age of sixteen. After college, she was an assistant athletics director at Brooklyn College, coached basketball, volleyball, and softball, and then took on leadership roles in national women's sports associations. Eventually she became Director of Women's Athletics at the University of Texas, along with appointments in sports policies and programs. She was also one of the most honored athletes of her era. Her experiences, including exclusion from teams, shaped her dynamic leadership over several decades.

The bittersweet experiences of women athletes such as Donna Lopiano, Chris von Saltza, Wilma Rudolph, and Donna DeVarona show that although the 1960s have been celebrated as a period of concern for equity and social justice, colleges showed scant concern for women as student-athletes. One conventional analysis is that the passage of Title IX in 1972 ushered in a new era for women in scholastic and college sports. Yet that was an unexpected development: in congressional deliberations around 1970, neither advocates nor opponents of Title IX mentioned college sports, and all sides were surprised when the issue surfaced in 1972. The National Collegiate Athletic Association opposed inclusion of women's sports until it made an unexpected reversal in 1978. Many colleges were slow to comply with the letter or spirit of Title IX. As late as 1997 the burden was on women student-athletes to file lawsuits against their own colleges, pitting them against athletics directors, presidents, boards, and university legal counsel.

Title IX eventually demonstrated how federal legislation could prompt universities to provide programs and services accessible to women that they would not have provided if left to their own volition. It has required contentious oversight of resources for student-athlete financial aid, training facilities, coaching salaries and other parts of a competitive athletics program, and it extends to television coverage of women's championships in numerous sports. Still, equity and opportunity across all college activities, ranging from sports to fields of study along with hiring and promotion, remain uneven. The caution is that the experience of a Katie Ledecky at Stanford, including her professional swimming contract, is exceptional. Sixty years after Chris von Saltza won her four Olympic medals and entered Stanford, the inclusion of women as full citizens of the American campus remains a work in progress.

Separating Children from Their Parents Is an Anglo-American Tradition


The separation of children from parents who migrated illegally to the USA is seen as an aberrant and inhumane deviation from American tenderness for the family. This orphaning as a matter of policy is not "who we are," as many liberals and some conservatives despairingly say.

But in many ways it is, and indeed, has long been. For the state, in the United States and earlier in Britain, has been a formidable creator of orphans. Perhaps this helps to explain the ambiguity in the attitude to the orphan: great display is made of theoretical pity and piety, but the way such children have been actually treated has frequently been punitive and repressive. Whether in orphanages, asylums, schools or other receptacles for those guilty of losing their parents, the extent of abuse by those in whose power such unfortunates have fallen is only now becoming clear.

Whatever charitable sentiments are kindled by the plight of orphans, such compassion has rarely prevented countries from making yet more of them by waging war, or by failing to prevent it in those places – Syria and Yemen – where the indiscriminate harvesting of human life yields its sorry crop of abandoned children.

But it has not required war for governments, charities and even private individuals to rob children of their parents. From the first time a ship sailed from London to Virginia in the early 17th century taking “a hundred children out of the multitude that swarm about the place” until the last forced child migrants from Britain to Australia in 1967, thousands of young people were orphaned, not only of parents but of all ties of kinship, country and culture. The orphans sent from Britain to Australia alone numbered some 180,000.

A long association of derelict and orphan boys with the sea was formalized in a statute of 1703, which ordered that “all lewd and disorderly Man Servants and every such Person and Persons that are deemed and adjudged Rogues, Vagabonds and Sturdy beggars…shall be and are hereby directed to be taken up, sent, conducted and conveyed to Her Majesty’s Service at Sea.” Magistrates and overseers of the poor were empowered to apprentice to marine service “any boy or boys who is, are or shall be, of the age of ten and upwards or whose parents are or shall be chargeable to the parish or who beg for alms.”

Transportation removed 50,000 felons – among them many juveniles – to the American colonies, and in the process robbed many more children of at least one parent. In the 1740s, recruiting agents in Aberdeen sowed fear by luring children into service in the plantations. Peter Williamson and his companions, shipped to Virginia in 1743, were sold for sixteen pounds each. In 1789 the Lady Juliana, the first convict ship carrying only transported women and girls, set sail for Australia.

The historical fate of a majority of orphans is unknown. Many were taken in by kinsfolk or neighbors, and while many must have been fostered out of duty or affection, others were certainly used as cheap labor, for whom their foster-parents were accountable to no one.

It was not until the industrial era that the policy of removing children from their parents in the interests of society became widespread. The Poor Law Amendment Act permitted parishes to raise money to send adults abroad. One of the Assistant Commissioners claimed that “workhouse children had few ties to their land, and such as there were could be broken only to their profit.” In 1848, Lord Shaftesbury also advocated emigration for slum children.

Annie McPherson, Quaker and reformer, was the first private individual to organize mercy migrations, the rescue of children from their “gin-soaked mothers and violent fathers.” She set up a program of emigration in 1869. Dr Barnardo used Annie McPherson’s scheme, before implementing his own in 1882. He referred to “philanthropic abduction” as the rationale behind this disposal of the offspring of misery. 

At the same time, “orphan trains” carried children from New York and Boston to the open plains of the West, under the auspices of the Children’s Aid Society, established in 1853 by Charles Loring Brace. Sometimes children were “ordered” in advance; others were chosen as they left the train or were paraded in the playhouses of the small towns, where farmers could assess their strength and willingness to work. These “little laborers” answered a shortage of workers on farms. Between 1854 and 1929 a quarter of a million children were dispatched in this way.

In Britain, what were referred to as “John Bull’s surplus children” were promised a future of open air, freedom and healthy work. Some were undoubtedly well cared for, but others were exposed to exploitation: life in outhouses and barns, freezing in winter, stifling in summer, isolation and deprivation of all affection. The proponents of such schemes argued that emigration would provide the children with a fresh start in life; but the cost of a one-way journey to Canada was also far less than their maintenance by payers of the poor-rate.

Joanna Penglase has called babies and infants taken from their mothers’ care – for “moral” reasons, or simply because it was regarded as socially impossible for a woman to raise a child on her own – “orphans of the living.”

In 2010, the then British Prime Minister Gordon Brown apologized for the removal of children from their parents under the Fairbridge scheme, which took them to Australia, a practice which continued into the late 1960s. In 2008 Kevin Rudd, then Prime Minister of Australia, apologized to indigenous families whose children had for generations been removed. In 2013 the Irish Taoiseach apologized for the abuse of orphans and illegitimate children by the Magdalene laundries from 1910 until 1970.

It is in this context that former Attorney General Jeff Sessions declared zero tolerance of illegal immigration in April 2018. All such people would be prosecuted. Families were broken up, because detention centers were “unsuitable” for children. In June, after harrowing scenes of forcible separations, Trump signed an executive order that families should be kept together. All children under five were to be reunited with their families within 14 days, and those over five within 30 days.

It might have been thought that the creation of orphans by government had been consigned to history. Was it amnesia or dementia that made the administration, in its determination to be tough on illegal migration, separate parents from their children in its retreat to a tradition of punitive indifference to the most vulnerable? 

And then, what of the orphans of addiction, of mass incarceration, the abductions of the offspring of the marriage of technology with commerce, orphans of the gun-fetish and the multiple social estrangements created by social media and the engines of fantasy which lure children from their parents, protectors and guardians? The orphan-makers have never been busier in this era of wealth and progress.

Why Conservatives Are Right to Stress Moral Education

In 1993 conservative William Bennett, former secretary of education under Ronald Reagan, wrote The Book of Virtues “to aid in the time-honored task of the moral education of the young,” which meant training their hearts and minds “toward the good.” Such a task was also important to Founding Fathers such as Washington, Adams, Jefferson, and Franklin. 

But “wait a minute,” progressives might object. “What kind of ‘moral education’ are we talking about?” Good question. They are right to raise it, but wrong if they ignore the importance of such education. 

Similarly, Barack Obama, before he became president, wrote: “I think Democrats are wrong to run away from a debate about values.” But progressives tend to be suspicious of any talk about values or “moral education.” They identify such emphases with conservative causes such as favoring religious and charter schools.

Take for example, the National Heritage Academies (NHA), founded in 1995 by Christian businessman J.C. Huizenga and later supported by Trump’s secretary of education, Betsy DeVos. According to the NHA website, it is today “a network of 87 public charter schools serving more than 59,000 [K-8] students in 9 states,” mostly in Michigan. In a 2003 essay favorable to NHA, conservative advocate Robert Holland states that it is “among those charter-school companies that consider moral education to be central to the classroom experience.” 



Such an emphasis on moral education, if the right type of morality is taught, is not far from the wishes of the Founding Fathers. These early Americans grew up in a world in which moral education was stressed. As Daniel Boorstin in his discussion of higher education pointed out, “By the time of the Revolution nearly every major Christian sect had a degree-granting institution of its own.” Protestants of various denominations had established the most prominent of them, including most of the Ivy League colleges and universities and William and Mary. In 1792, classes began at the first Catholic institution of higher learning, Georgetown.

Even though in 1749 Benjamin Franklin proposed for Pennsylvania an institution that would not be religiously affiliated, he still emphasized the importance of moral education. Although proposing a diverse curriculum, he quoted favorably English philosopher John Locke: “Tis VIRTUE, then, direct VIRTUE, which is to be aim'd at in Education. All other Considerations and Accomplishments are nothing in Comparison to this.” 

For Franklin such an emphasis on virtue was not new. More than a decade earlier, in the words of historian Edmund Morgan, “he formulated his most lasting definition of the virtues he sought to attain in his own life, without the aid of any church or minister.” And Morgan adds that one virtue he left off his list of 13, charity, “was actually the guiding principle of Franklin’s life.”

In 1779, John Adams drafted the Constitution of the Commonwealth of Massachusetts. It stated that “Wisdom and knowledge, as well as virtue,” were “necessary for the preservation of their [people’s] rights and liberties,” which in turn depended “on spreading the opportunities and advantages of education.” Therefore, appropriate state officials were “to countenance and inculcate” various moral principles “among the people.” Several years later, when his son and future president, John Quincy Adams, was a student at Leiden University (in the Netherlands), Adams wrote to him, “You will ever remember that all the end of study is to make you a good man and a useful citizen.”

As a recent HNN op-ed stated, Thomas Jefferson consistently championed “moral instruction in formal education” and thought such “instruction of some sort was necessary throughout one’s life.” In his Notes on the State of Virginia (1787), he wrote that “the first elements of morality” can be instilled in elementary education. And more than three decades later, in his post-presidential years, he still believed so and held that higher education could “cultivate their [students’] morals, and instill into them the precepts of virtue and order.”

Although George Washington had less to say about education than Franklin, Adams, and Jefferson, he was also a believer in moral education. In his Farewell Address (1796), with his second presidential term nearing its end, he advised the nation that “virtue or morality is a necessary spring of popular government” and that promoting “Institutions for the general diffusion of knowledge” was of “primary importance.”

Eight years earlier, in 1788, Noah Webster typified the educational thinking of many educated citizens of the time when he wrote in his long essay on “The Education of Youth in America”: “It is an object of vast magnitude that systems of Education should be adopted and pursued, which may not only diffuse a knowledge of the sciences, but may implant, in the minds of the American youth, the principles of virtue and of liberty.”

A year earlier the Northwest Ordinance, which provided guidance for U.S. expansion, stated that “religion, morality, and knowledge, being necessary to good government and the happiness of mankind, schools and the means of education shall forever be encouraged.”

Although much changed in the half-century following these words, a young Abraham Lincoln -- like Franklin, primarily self-educated -- conveyed similar views on education. When running for the Illinois General Assembly, he expressed his “desire to see the time when education, and by its means, morality, sobriety, enterprise and industry, shall become much more general than at present.” In his mature years, as Fred Kaplan has indicated, Lincoln’s most valued reading and sources of inspiration were books that imparted important moral lessons such as the Bible, the plays of Shakespeare, Aesop’s Fables, and the poetry of Robert Burns.

From the early nineteenth century until today, great changes have taken place in U.S. education. In his Anti-Intellectualism in American Life, Richard Hofstadter indicates some of them. For example, regarding the late nineteenth century and the first six decades of the twentieth, he writes: “On two matters there was almost no disagreement: education should be more ‘practical’; and higher education, at least as it was conceived in the old-time American classical college, was useless as a background for business. Business waged a long, and on the whole successful, campaign for vocational and trade education at the high-school level and did much to undermine the high school as a center of liberal education.”

In the 1970s, the English economist and environmentalist E. F. Schumacher complained that the main purpose of education had become career preparation for work in our modern industrial societies. It did not help students to answer questions like “What is our purpose in life?” and “What are our ethical obligations?” “Education can help us,” he wrote, “only if it produces more wisdom.”

Today, in an attempt to attract students, many colleges and universities advertise that their various offerings will best prepare students for good [i.e., well-paying] jobs of the future. Most public education, including public universities, is more committed to preparing students to fit into U.S. society than to making them truth-seekers who ask themselves the most profound questions about their own values and those of their society.

Many private schools at all levels claim that they do a better job of teaching values. Conservatives often believe that too many public schools and higher educational institutions—in their view, too dominated by liberal professors—fail to inculcate proper values and often lead students astray. But what values are conservatives advocating?

In his The Book of Virtues, Bennett stresses self-discipline, compassion, responsibility, friendship, work, courage, perseverance, honesty, loyalty, and faith. Conservatives also like to talk about “family values,” patriotism, and respect for authority. In general, conservatives favor values that they believe are in keeping with and supportive of our free-market capitalism. But they often ignore the importance of truth-seeking, open-mindedness, and tolerance. During the last century, from the Scopes Trial of 1925 up to the more than 80-percent support of white evangelical voters for Donald Trump in 2016, we often sense little respect for the scientific evidence of evolution and human-caused global warming.

We often see in such Christians the flaw that Pope Francis spoke of in a 2013 sermon: ideological rigidity. He warned against letting faith pass “through a distiller” and become an ideology, for “ideologies are rigid, always . . . without kindness.”

Most of our Founding Fathers were not ideologues. They tended to take a pragmatic approach to achieving the common good (at least of whites—attitudes and actions toward Afro and Native Americans are a whole other subject). In his biography of Franklin, Walter Isaacson highlights Franklin’s “pragmatic humanism,” which included charity, an “emphasis on reason and observable experience, mistrust of religious orthodoxy and traditional authority,” and a willingness to compromise. Isaacson further notes, “Few people have ever worked as hard, or done as much, to inculcate virtue in themselves and their communities,” and Franklin’s “religious outlook [was] based on humility and openness.”

Washington, Jefferson, and Adams also shared a tolerant religious outlook. Ron Chernow cites Washington’s letter to a Hebrew Congregation as evidence of his “religious toleration, showing that he had no notion of foisting a Christian state on the nation.” Jefferson, in his Notes on the State of Virginia (1787), praised “unbounded” religious tolerance.

If we agree with the Founding Fathers that “moral education” is important, and that education should not be seen as exclusively, or even primarily, career preparation, then what values should we teach and how should we teach them?

The best answer I have come across is in a 2002 article by psychologist Robert Sternberg, “It's Not What You Know, but How You Use It: Teaching for Wisdom.” It reinforces many of the sentiments of the Founding Fathers and the point Schumacher made in the 1970s—we should help students become wiser people. Wisdom has various values associated with it that are almost universally recognized as good, such as love, empathy, humility, respect for truth, honesty, justice, and toleration.

Sternberg believes that “people are wise to the extent that they use their intelligence to seek a common good. They do so by balancing, in their courses of action, their own interests with those of others and those of larger entities, like their school, their community, their country, even God.” Another important goal is to teach students to see “things from others' perspectives as well as one's own,” to tolerate “other people's points of view, whether or not one agrees with such views.” Unlike many educational programs, this one stresses not so much “the acquisition of knowledge but . . . how such knowledge will be used.” 

Even though this program was developed at Yale for pre-college students in U. S. history, Sternberg maintains that “teaching for wisdom can be made part of any subject matter, because wisdom is a way of looking at the world.” 

From three years teaching at Wheeling Jesuit University and then four decades at Eastern Michigan University, I also know that it’s possible to teach at least some wisdom in both private and public universities. At Wheeling in the late 1960s, we demonstrated tolerance by inviting a rabbi to teach a course on Judaism. At Eastern for several years, I taught a Comparative Religions course in which I included a segment on atheistic humanism. My approach was not to discuss which religion was best, but to emphasize that we owed respect and empathy to all beliefs, even atheist humanistic ones. In history courses dealing with Russia or other parts of the world, I stressed the importance of truth-seeking and empathy, qualities historians should possess. In a 2010 essay, I indicated other ways that the humanities can teach us important values and make us wiser.

Practical realities – financial ones, student choices, differing value systems, and many more – complicate teaching wisdom and its associated values, and I do not underestimate them. But my heart is with the words of the Founding Fathers. If the Trump presidency teaches us nothing else, it should teach us that in both our private and public activities morality (or the lack thereof) matters.

A Less-Kind and Less-Gentle Grand Old Party


The death of George Herbert Walker Bush symbolizes the end of the Republicans as the GOP, the “Grand Old Party.” He dipped his toes into the new Republican Party that emerged during his leadership, but that new party was not his cultural home. He was in that party, but not of it. 

George H. W. Bush as Federalist 

Despite the Republican Party nickname, the Democratic Party is far older. That old party began in opposition to the grandeur that the Federalists brought to American politics in the first years of constitutional democracy in the 1790s. The Federalists endorsed the Constitution, ratified in 1788, as a structure to institutionalize power to the people—once duly refined and enlarged, as James Madison insisted. The Federalists presented themselves as the rightful custodians of governmental power, the best-educated citizenry, the new world equivalents of old world aristocrats. As the son of a Senator, raised with a spirit of public service, Bush could have been at home with the Federalists.

The first Democrats objected to these faux aristocrats, arguing an 18th century equivalent of the recent advertisement: to have a democratic republic, Just Do It; yes, really give more power to the people. They began as Democratic-Republican Clubs, and their leader, Thomas Jefferson, still believed in “natural aristocrats.” However, his elitism allowed for very democratic social and economic mobility, with more power to average citizens – but only whites and men. For over a century after Jefferson’s time, the Democratic Party would continue to endorse more democracy for white citizens and fewer rights for African Americans.

George H. W. Bush as Radical Republican

The Republican Party began as a reform movement in the 1850s against slavery and the concentrated power of slaveholders in the South. The party championed free enterprise as a path to opportunities for all citizens, from the African Americans eager to earn the fruits of their labors to enterprising whites. During the Civil War, many northern soldiers fought for “the best government on God’s footstool,” as a Minnesota soldier declared shortly before dying at Gettysburg, where President Abraham Lincoln would define the war effort as a commitment to citizen opportunity. During the Civil War and Reconstruction, Republicans led the charge to avoid pegging people according to the status of their birth. Lincoln became an icon of this creed, as the poor kid made good, a self-made man. When President Bush endorsed the Americans with Disabilities Act in 1990, he followed in the tradition of this first Republican Party.

George H. W. Bush as Champion of Big Business

In the last decades of the nineteenth century, Republicans continued their commitment to free enterprise, but now in support of the enterprising of large corporations, with the reformist spirit overwhelmed by support of the already-made men of wealth. And indeed, almost all the Republican leaders of politics and business were men; even Miriam Leslie, the wealthy and brilliant publisher of Leslie’s Illustrated Weekly, changed her name to “Frank Leslie” to maintain the appearance of gender norms. This is when the nickname of the GOP first emerged, to keep alive the memory that this “Grand Old Party” led in political and military defense of the Union in the recent war.

Reform-minded Republicans in this era did little for African Americans, but they defied the Republican “Stalwarts” who used their power to bestow patronage on those willing to do their bidding. For their defiance, the reformers became known as “Half-Breeds” for not being fully Republican. Patrician Bush left his Connecticut roots to light out for Texas, where he struck it rich in the offshore oil-drilling business. Texan Bush would have felt right at home with Gilded Age Republicans, even if his willingness to compromise on taxes would make many fellow Republicans call him a 1990s version of a Half Breed.



George H. W. Bush as Global Helmsman

In the early twentieth century, most Republicans focused on domestic economic interests with endorsement of isolationism in foreign affairs, from resistance to joining the League of Nations after the “Great War” in 1919, to their hesitancy in the 1930s to embroil the nation in “foreign wars” to fight the expansionist totalitarian states Germany, Italy, and Japan. Once the United States entered World War Two, that America First focus turned toward internationalism, based on prioritizing American interests, especially business interests, around the world. Decisive American victories confirmed American global leadership, and in the Cold War against Communism, beginning in the late 1940s, the United States for the first time did not demobilize after war but maintained a permanent wartime footing. The Republican Party supported the military and denounced Communism most vigorously, but both parties endorsed the national security state with the US as a kind of Federalist Party to the world. Lieutenant Bush lived out this American mission in his service as the youngest Navy pilot during the Second World War, and he continued this commitment as United Nations Ambassador and Director of Central Intelligence and as president during the collapse of the Soviet Union.

George H. W. Bush Not at Home with the New Religious Right

For all of George Bush’s keen alignment with the traditions of the Republican Party, he showed signs of being out of step when it gained new affiliations with the Religious Right starting in the 1970s and 1980s. By contrast with those traditionalist moral standards, Bush supported funding for Planned Parenthood early in his career, even though as Vice President under President Ronald Reagan, from 1981 to 1989, he turned to support of the anti-abortion policies of the National Right to Life Committee. Then as president, Bush’s history and style kept him under a cloud of suspicion with the religious traditionalists who were for the first time voting heavily for the Republican Party. The Religious Right welcomed Bush’s conservative choices, including Dan Quayle as his running mate and Clarence Thomas on the Supreme Court, but their endorsements of Bush in his presidential runs – over their own Pat Robertson in 1988, and again in 1992 – were practical steps for electability, not hearty endorsements. The Democratic Party may show less discipline when falling in love with their candidates, in the words of an old saying, but Republicans were still more ready to fall in line in their support of Bush. Despite their impatience with him, the Religious Right was not displeased by Bush’s continued support for the War on Drugs and his minimal actions during the AIDS epidemic.

George H. W. Bush in a Strange New World

In 1960, Democrats were indeed smitten with John Kennedy, and his popularity ushered in an age of charisma- and media-driven politics, and yet his successor, Lyndon Johnson, was a product of an older style of Congressional deal making. In a similar way, George Bush’s presidency was a remnant of an older GOP after his predecessor, Ronald Reagan, had already set trends for a new Republican Party, more bluntly committed to free enterprise and military power, and newly open to the issues and styles of the Religious Right.

The Religious Right has supplied not only some of the major issues that animate the Republican Party of today, but also its intellectual style, with the moral tone and uncompromising attitude that often emerges in politics infused with religion. While President Donald Trump is not himself a deeply religious man, his approaches to politics parallel that style, with readiness to divide the world into supporters and enemies. Republicans seem to have fallen in love with this knight of righteous anger, in hopes that his goal of “making America great again” can bring together the party’s commitment to corporate enterprise with its populist support of security and tradition. The religiously and temperamentally moderate George Herbert Walker Bush would find this a strange new world. He ran hard and hired aggressive campaigners, including Lee Atwater and Roger Ailes, but his own instincts were those of a patrician serving his democratic nation, expecting deference and ready to mediate the disparate voices beneath him.

Throughout his career, the recently departed George Bush exhibited major aspects of every part of the Republican Party’s traditions except its current edition. Even while the Republican president has brought the party into uncharted waters, Trump actually displays many features of the GOP heritage. There are Republican precedents for the unprecedented qualities of the current White House occupant. Trump maintains downright Federalist expectations to serve as the nation’s custodian, even if his style in claiming that “I alone can fix” America’s problems would make the founders blush. The businessman turned politician harbors an instinct for profit worthy of Gordon Gekko’s brash insistence in the movie Wall Street (1987) that “greed is good.” America First for national self-interest is the first rule in Trump’s international playbook. And he presents himself with a blunt populism, speaking directly to popular fears with promises that increases in corporate and military power will serve their interests.

Toward a Fighting Style of Politics 

For decades, Trump’s type of politics had been in the margins of the Republican Party but would occasionally gain public attention. It was the style of the John Birch Society in the 1950s insisting that containment of communism was not enough. In 1970, a cigarette company made use of this style to defy effete intellectuals when asking their customers bluntly, “What do you want, good grammar or good taste?” And in 1984, with urban crime on the rise, Bernie Goetz gained fame as the “subway vigilante” when he shot four young African American men who seemed to be attacking him. Trump embodies this tone of righteous anger, a tone that Bush did not share.

George Bush said some of the words of this new way of being Republican, but he never sang the tune. The biggest change has been in style. Bush never gloated. As a child, he rushed home from a baseball game to boast to his mother about his home run; his mother said that was good, but how did the team do? As president, he was still his mother’s son; while Americans gloried in the fall of the Soviet Union, Bush showed little enthusiasm. By contrast, Trump is eager to talk about his strengths—and even to exaggerate them. Bush hoped to govern on behalf of all citizens, while Trump is content to appeal only to his base of supporters, aiming not for 50% + 1 of voters but for only about 40%, with the others as targets of his supporters’ fear and anger. While Bush was ready to take conservatism in a “kinder, gentler” direction, Trump emphasizes fighting. And while Bush, the 41st president, was usually ready to listen, mediate, and even compromise, as he did with tax increases for some fiscal discipline, the 45th president scorns such conciliations as signs of weakness.

Trump is taking some of the most aggressive features of the Republican Party in some less-than-grand new directions. Republicans face a moment of truth about their identity: Are they willing to give up much of their heritage as represented by George Herbert Walker Bush? And a key question for Democrats is to find ways to counter the fear and fighting in the new Republican Party while deciding whether to campaign with their own versions of fear and fighting.


The Firing of Marc Lamont Hill Raises This Question


When noted black intellectual Marc Lamont Hill spoke at the UN last month about justice for the Palestinian people, critics like those in the Anti-Defamation League (ADL) were quick to condemn him. They said his words implied support for the “one state solution” to the Israeli-Palestinian conflict, which his detractors claimed was an anti-Semitic and even genocidal notion. Just one day after Hill made his comments, CNN responded to the furor by firing Hill from his post as a commentator on the network. Soon thereafter both the president and chair of the board of trustees of Temple University, where Hill teaches, denounced him and his “hate speech.” Civil libertarians were quick to defend Hill and his right to free speech, and supporters of the Palestinians groused about yet another public figure silenced for evidencing sympathy with the Palestinians. 

Yet some of the most insightful criticisms of the way Hill was treated pointed out the controversy’s racial context: Hill’s was just the most recent case in a long history of blacks being publicly excoriated for “daring” to speak out on the great issues of the day in ways that defy white conventions. This was particularly true when discussing the Arab-Israeli conflict in a manner that challenges the carefully circumscribed discourse enforced by strongly pro-Israeli groups like the ADL.

This has happened before. Indeed, next year, 2019, marks the fortieth anniversary of a similar brouhaha that erupted when another black man very much in the public eye dared to challenge the rigidly pro-Israeli understanding of Americans’ approach to the Middle East: the Andrew Young Affair.

In August 1979, President Jimmy Carter forced the American ambassador to the UN, Andrew Young, to resign following revelations that Young had secretly met once with an official from the Palestine Liberation Organization (PLO) in violation of an American pledge to Israel not to deal with the PLO in any way. Young, the highest-ranking black official in the Carter administration, had met the official to advance American policy aims but nonetheless was fired after facing a barrage of hostile public criticism, notably by American Jewish organizations.

When it was soon revealed that the American ambassador to Austria, a Jewish industrialist from Cleveland named Milton Wolf, also had met several times with PLO officials earlier that year but without similar repercussions, African-Americans exploded in fury and rallied behind Young, demanding to know why the double standard existed. Some, like the Southern Christian Leadership Conference’s Joseph Lowery and Operation PUSH’s Jesse Jackson, quickly announced they would continue Young’s efforts by talking to the PLO themselves. They then separately traveled to Lebanon, met PLO Chairman Yasir Arafat, and presented him with their ideas for resolving the Israeli-Palestinian conflict. Lowery and Jackson insisted that black Americans had a legitimate role to play in American foreign policy discussions and decisions, just like any other ethnic or religious group in the country. They were not going to be muzzled by a refusal to meet with any particular side in the conflict.



What happened to both Marc Lamont Hill and Andrew Young speaks volumes about race, foreign policy, and American positions on the Middle East. Both instances hearken back to long-held black complaints that their leadership voices are not welcome, whether in fields of life long dominated by well-educated whites or even in their own Civil Rights organizations. One of the key demands of the Black Power movement in the 1960s in fact was that African-Americans must take charge of their own destinies and their own groups, and formulate their own strategies and tactics for liberation. This was exemplified when the Student Nonviolent Coordinating Committee (SNCC) asked white members to leave the group in 1966, leaving blacks to lead SNCC and whites to go out and organize their own communities. Other black forces in the 1960s and 1970s similarly demanded political and cultural autonomy.

Another way that this black autonomy was expressed during that heady period of time was by asserting blacks’ right to speak out on the great foreign policy issues of the day regardless of whether or not the white establishment approved. Martin Luther King, Jr. accordingly defied his critics by denouncing the Vietnam War in 1967.

No topic proved more controversial in this regard than the Arab-Israeli conflict. Black Power advocates hailed the various Third World liberation movements underway in the 1960s, and accordingly saw themselves and the Palestinians as kindred peoples of color each fighting against a racialized system of imperialism and domination. Malcolm X visited East Jerusalem in 1959 and Gaza in 1964, and publicly denounced Israel and Zionism. SNCC issued a newsletter article that spoke out forcefully in support of the Palestinian struggle against Israel shortly after the 1967 Arab-Israeli War. A few weeks later black militants at the National Conference for New Politics in Chicago arranged for the gathering to issue a statement against Israel.

Thereafter the floodgates of black pro-Palestinianism opened. Stemming first and foremost from their Black Power internationalism, the increasingly vocal stances in favor of the Palestinians emerging from activists in the Black Panther Party, the Black Arts Movement, even individuals like the boxer Muhammad Ali, also emanated from blacks’ insistence that their voices mattered. They no longer would sit in the back of the foreign policy bus lest they upset their white benefactors and political allies who urged them to stick to speaking about race relations only.

In the 1970s these attitudes began moving from Black Power radicals into the African-American mainstream. The Congressional Black Caucus and politicians associated with it like Shirley Chisholm and Walter Fauntroy, for example, spoke sympathetically about the Palestinian experience. The National Black Political Assembly in Gary, Indiana issued a statement critical of Israel. In the wake of the Andrew Young Affair, religious groups like the Black Theology Project and the National Black Pastors’ Conference affirmed Palestinian rights, as did secular organizations like TransAfrica. They once again were affirming both the need for a just and peaceful resolution of the Israeli-Palestinian conflict that included talking to the PLO as well as the right of black Americans to speak publicly about the Middle East, even if such speech angered pro-Israeli forces.

African-Americans well know the dangers of publicly speaking their minds. Yet many also have felt for decades that the Palestinians, like themselves, are a people of color seeking to be free who deserve black Americans’ public support. This is why blacks from Ferguson, Missouri and West Bank Palestinians visited one another in 2014 and 2015, and tweeted back and forth about how to deal with the guns and tear gas used against them by security forces in their respective homelands. This is why rappers like Method Man, Jasiri X, Boots Riley, Mos Def, and Talib Kweli perform songs about the Palestinians. This is why the 2015 Black Solidarity Statement with Palestine garnered over 1,100 signatures, including those of groups like the Dream Defenders and Hands Up United, and individuals like Angela Davis and Cornel West. People may be trying to silence Marc Lamont Hill, but the long history of black support for the Palestinians suggests that voices like his will continue to be raised.


The GOP Wasn’t Always the Party of Right-wingers


The Republican Party has been in existence now for 164 years. It was founded in 1854 in opposition to the expansion of slavery, as permitted under the Kansas-Nebraska Act of that year, which also drew the opposition of abolitionists as well as “Free Soilers.” What began as a reform-oriented party with a real commitment to principle remained so for about a generation; by the mid-1870s, it had become a party openly connected to the massive growth of monopoly capitalism. By then, beholden to the status quo, it had lost interest in the issues of racial equality and racial justice. 

For twenty years the Republican Party had inspiring leadership. It wasn’t just Abraham Lincoln. Republicans in Congress brought about the 14th and 15th Amendments, the Civil Rights Acts of 1866 and 1875, and the Ku Klux Klan Act in the years of Reconstruction. GOP President Ulysses S. Grant fully embraced the fight for equality.



Among those Radical Republicans who stand out in history are the following US Senators: Charles Sumner; Salmon Chase (later Secretary of the Treasury under Lincoln and Chief Justice of the Supreme Court); Benjamin Wade (later President Pro Tempore of the Senate at the time of the Andrew Johnson impeachment trial); Henry Wilson (the second Vice President under Grant); Hannibal Hamlin (Lincoln’s first Vice President); John P. Hale; Oliver P. Morton; and Jacob Howard. 

In the House of Representatives, we had such luminaries as Thaddeus Stevens; John A. Bingham; James A. Garfield (later the 20th President of the United States); Schuyler Colfax (later Speaker of the House and first Vice President under Grant); Henry Winter Davis; Elihu Washburne; John A. Logan; Benjamin Butler; and James F. Wilson.

After a twenty-five year period of conservative dominance of the Republican Party, as the 20th century began, the party experienced the rise of a “Progressive” wing, which had a major influence in the first generation of the century, generally labeled “the Progressive Era.” Theodore Roosevelt initiated the national commitment to progressivism. He backed the regulation of corporations, conservation, labor rights, and social justice. GOP members of Congress and some state governors also championed progressive ideas.

Among the major figures who promoted progressivism were Senator Robert LaFollette, Sr. of Wisconsin (who had been the first major state governor to advocate reform and change); Senator Hiram Johnson of California (who had promoted progressivism as the state’s governor); Charles Evans Hughes of New York (who had been a reform governor in the Empire State earlier and ran for President in 1916 against Woodrow Wilson); Senator William Borah of Idaho; Senator George Norris of Nebraska; and New York Congressman Fiorello LaGuardia. 

After 1920, progressivism declined in the Republican Party in the era of Warren G. Harding, Calvin Coolidge, and Herbert Hoover, but Johnson, Borah, and Norris remained progressive leaders in the Senate. They were joined by Robert LaFollette, Jr., James Couzens, Bronson Cutting, Charles McNary, Gerald Nye, and Lynn Frazier. But growing isolationism in the era of fascism, and opposition to internationalism, caused their decline by the time of the formation of the America First Committee in 1940-1941 and America’s entrance into World War II.

Reform commitments, however, still existed in the hearts and minds of many, and a primarily Eastern “Liberal” Republican wing of the party began to emerge in the battle over the 1952 Republican presidential nomination between General Dwight D. Eisenhower and conservative icon Senator Robert Taft of Ohio, son of former President William Howard Taft. As Eisenhower was about to leave office, Rockefeller Republicans became a strong force in the party. 

New York Governor Nelson Rockefeller became the acknowledged leader of liberals in the GOP in three bids to win the party’s presidential nomination in 1960, 1964 and 1968. But others who were part of this group included Governors George Romney and William Scranton and Senators Jacob Javits, Clifford Case, Charles Mathias, Lowell Weicker, Charles Percy, and Mark Hatfield. All of these Republicans favored New Deal programs, including business regulation and social welfare, as well as civil rights, infrastructure improvements and investments in education. Internationalists, they supported foreign aid and close ties to the North Atlantic Treaty Organization to combat communism. 

The term “liberal Republican” is now an oxymoron, but a few members of the party are rightly described as moderates: Three Maine senators—former Senators William Cohen and Olympia Snowe and present Senator Susan Collins; former Massachusetts Senator Scott Brown; former Massachusetts Governor and 2012 GOP Presidential candidate Mitt Romney; former Arizona Senator and 2008 GOP Presidential nominee John McCain; as well as present day Governors Charlie Baker of Massachusetts and Larry Hogan of Maryland.

But since the age of Ronald Reagan, the concept of “radical,” “progressive” or “liberal” Republicans is basically a part of the past. The party today has been captured by the most extreme elements, particularly since President Donald Trump’s election. The question is whether reform oriented Republicans will ever arise again.

It’s Watergate All Over Again in So Many Ways

Paul Manafort and Howard Hunt


If matters were not so serious, one would think that current events involving Paul Manafort, Roger Stone, Jerome Corsi and Julian Assange are nothing but a bad redux of Watergate and the events of January 1973.

The script has been updated but the tactics remain the same. In January 1973, the Watergate burglars faced the prospect of serving long prison terms for crimes involving the break-in of the Democratic National Committee headquarters. In 2018, alleged electronic burglars and their conspirators are staring down what appear to be virtual life sentences for hacking into the computers of the Democratic National Committee.

There are important differences. In the case of Watergate, we had Americans committing crimes against other Americans. Today, the allegations are that Americans conspired with a foreign enemy to commit crimes against Americans. With Watergate, the burglars were unsuccessful in obtaining any damaging information against the Democrats. Richard Nixon on tape said, “It wasn’t a third-rate burglary; it was a third-rate attempted burglary.” In 2016, electronic burglars were successful—they did steal emails and those emails were used to great effect in influencing the outcome of a presidential election.

Despite these differences, there are striking similarities. Paul Manafort is a modern-day Howard Hunt and Stone, Corsi and Assange are the Watergate burglars.

Howard Hunt was a CIA operative who helped organize the disastrous Bay of Pigs invasion of Cuba in 1961 by recruiting Cuban exiles living in Miami. Some of those same men would become Hunt’s Watergate team in June 1972 working for the Committee to Re-Elect the President. When they were arrested during the bungled burglary, Hunt and his partner G. Gordon Liddy fled (they were in the attached Watergate hotel) but left behind incriminating evidence like Hunt’s check for country club dues and an address book with White House telephone numbers in it.

Soon enough Hunt and Liddy were indicted, but by that time a major cover-up was underway. None of the burglars or their superiors were talking. The men in jail were being paid “hush money” to assure their silence and to cover attorney fees, family costs and bail money.

After Nixon’s reelection in November 1972, Howard Hunt called Charles Colson, a Nixon advisor and lawyer, to complain that “the ready,” meaning the support money, was coming in “dribs and drabs” and that the whole matter could blow apart if somehow the burglars were forgotten by the White House after Nixon’s massive victory. Colson, eager to have Hunt acknowledge that he was not involved in the planning of the Watergate operation, secretly recorded Hunt on a White House dictabelt.



Colson played the dictabelt recording for White House Counsel John Dean, and Dean immediately knew they all had a problem. The payment of money to assure silence in the face of a criminal investigation, Dean concluded, constituted an obstruction of justice under federal law. Instead of blowing the whistle (he himself had been a linchpin of the cover-up), Dean did what most do when facing certain loss: he doubled down. After meetings with Nixon’s top advisors, Haldeman and Ehrlichman, and Nixon’s former Attorney General, John Mitchell, the White House used leftover campaign money (called a “slush fund”) to keep the hush money flowing.

But the noose began to tighten. The burglars’ trial was set for January 1973 before federal judge John Sirica. Sirica, a conservative Eisenhower appointee, had read enough in the Washington Post and elsewhere to be highly skeptical that the conspiracy to infiltrate the DNC stopped with Hunt and Liddy. He smelled, as he put it, a “whitewash.” So he made sure the burglars and their lawyers knew that if a jury found them guilty, forty-year sentences could be expected.

This is essentially the same predicament faced by Manafort, Stone, Corsi and Assange. Corsi recently said on MSNBC that he fully expects to spend the rest of his life in prison.

During December 1972 one event propelled matters such that the White House and Nixon had to surreptitiously intervene, further enmeshing them in obstructing justice. In a moment of the highest drama Howard Hunt’s wife, carrying $10,000 in fresh hundred dollar bills on a flight from Washington, DC, to Chicago, was killed when her plane crashed just outside Midway Airport. She had been the paymistress for the hush money.

Hunt was devastated. He had young children and now they would be effectively orphaned if he spent decades in prison. Hunt again reached out to his friend Colson (they knew each other as Brown University alums). Through his lawyer, Bill Bittman, Hunt sought assurances that he would be pardoned if he remained silent and pled guilty to the break-in crimes. Colson met with President Nixon in the first week of January 1973 and microphones in the EOB office recorded the exchange. Nixon agreed that Hunt’s case was special given his wife’s sudden death and that he could expect his sentence to be commuted if he kept his mouth shut.

There were lots of winks and nods. Colson met with Bittman, a former Department of Justice prosecutor who helped Bobby Kennedy nail Jimmy Hoffa. Colson told Bittman that “Christmas comes once a year.” Bittman understood the code. Hoffa had been pardoned by Nixon the day before Christmas a year earlier.

Bittman had his answer. His client Hunt, who had said he would fight the charges against him, folded and pleaded guilty to the entire indictment. But what about the others?

There is no evidence that Nixon offered pardons or commutations to the other burglars. But the Cuban exiles took the cue from Hunt that they, too, could expect executive clemency if they pled guilty, so they also threw in the towel. Liddy and wireman James McCord thought they could beat the prosecutors, so they continued the fight and went to trial, losing on the last day of January 1973.

The question is whether this history is repeating itself. One logical explanation for Paul Manafort’s bizarre behavior is that he has a wink and nod, just like Howard Hunt did in 1973. We know Manafort’s lawyers have been communicating with President Trump’s attorneys, just as Bill Bittman did with Charles Colson. The defiant response of Stone and Corsi also seems to be the result of cues taken from the Manafort situation and President Trump’s encouraging tweets.

If all this is true, these acts are just as much an obstruction of justice as the promises Nixon allowed to be given to Hunt in order to assure his silence. During John Dean’s “cancer on the presidency” talk with Nixon in late March 1973, when the wheels of the conspiracy were coming off, Nixon can be heard to agree with Dean that dangling pardons to assure silence “would be wrong, that’s for sure.” The use of the pardon power to interfere with a criminal investigation is an obstruction of justice and this abuse of presidential power became one of the articles of impeachment that eventually drove Nixon from office.

These Two Midwestern Democrats Could Be Serious Contenders for the Presidency in 2020


The midterm elections of 2018 have brought to the forefront two Midwestern Democratic senators who offer the possibility of regaining the Midwest for the Democrats in the 2020 presidential election cycle. Not only did Democrats regain governorships in Michigan, Wisconsin, and Illinois in 2018; the midterms also added stature to two members of the Senate, both reelected to their third terms, and both with an appeal to the white working class that Hillary Clinton lost in 2016, leading to the victory of President Donald Trump.

Sherrod Brown of Ohio and Amy Klobuchar of Minnesota stand out as appealing “liberal” politicians who have managed to thrive in an atmosphere of extreme partisanship, yet have reputations for being able to “cross the aisle” and work with Republicans. At the same time, they have solid records of accomplishment, without the histrionics too often present in today’s politics.



Sherrod Brown has had a long, distinguished career, having spent eight years in the Ohio legislature; eight years as Ohio Secretary of State; eight years as a Congressman; and now starting his 13th year in the US Senate, after an impressive victory in November. 

Brown has been a strong progressive over his career and is in the tradition of two earlier progressive Democratic Senators who inspired many, namely John Glenn, who served four terms in the Senate from 1975-1999, and briefly sought the presidency in 1984, and Howard Metzenbaum, who had three terms in the Senate from 1977-1995, and was always an exceptional liberal firebrand. 

Brown has won strong support from the electorate and is seen by many as a younger version of Joe Biden, a full decade the former Vice President’s junior. Both have the ability to appeal to working-class whites, a crucial demographic in elections. Brown has been a leading member of important Senate committees, including Banking, Housing and Urban Affairs, where he is the ranking member. He is also a member of the Committees on Agriculture, Nutrition and Forestry; Finance; and Veterans Affairs. Brown is considered a “progressive populist”; he has long been a critic of free trade agreements, making him very appealing to the working class of his state and region.

Brown is, however, not as dramatic and outspoken a politician as Howard Metzenbaum was, although he can fight for his causes with vigor. His appeal to working class whites has drawn national attention. In a tough state for Democrats historically, Brown has won 56, 51 and 53 percent of the vote in his races in 2006, 2012, and now 2018, respectively. Do not count him out.

Also gaining attention is Amy Klobuchar, who pursues the liberal working class tradition of the Minnesota Democratic Farmer Labor Party, which led to the historic careers and accomplishments of Hubert Humphrey (Senator and Vice President), Eugene McCarthy (Senator and presidential candidate), Walter Mondale (Senator, Vice President and presidential candidate), and Paul Wellstone (Senator). 

Klobuchar has the advantage of being the only potential woman nominee for President in 2020 who is not from the Atlantic or Pacific Coast, which are considered automatic areas of Democratic support in recent times. While Minnesota has been generally Democratic in recent years, Klobuchar offers the opportunity to put a successful Midwesterner on the ballot, and has a longer and more substantive career of accomplishment in the Senate than other potential women candidates, including Elizabeth Warren of Massachusetts, Kirsten Gillibrand of New York, and Kamala Harris of California. 

Klobuchar has an appealing family story. She overcame an impoverished and troubled childhood to become a successful and renowned prosecutor in Hennepin County (Minneapolis) for eight years. She is now beginning her third term in the Senate. Klobuchar sponsored or cosponsored 98 pieces of legislation that became law through early 2018. She serves on the Judiciary Committee, where she performed well in her questioning of Supreme Court nominee Brett Kavanaugh. She also serves on the Joint Economic Committee and on the Committees on Commerce, Science, and Transportation and on Agriculture, Nutrition, and Forestry, and is the ranking member of the Rules and Administration Committee. She was up for consideration as Attorney General under Obama and has been mentioned as a potential future Supreme Court nominee under a Democratic President. Many have believed she would make a calming yet tough presidential candidate for 2020. She has a warm personality but dogged determination to promote the progressive causes she has fought for in her career. She has won her Senate seat by vast margins of 58, 65, and 60 percent in 2006, 2012, and 2018, respectively.

Of course, either Sherrod Brown or Amy Klobuchar could also be a potential Vice President in 2020 if they were unable to win the presidential sweepstakes. At this early stage of the 2020 campaign, this author would imagine that Klobuchar is more likely than Brown to succeed in her quest; she seems at this point certain to run, while Brown may or may not. With close to a dozen US senators planning to announce for President, Brown’s own declaration that he did not grow up with the motivation and desire to be President could convince him to stand aside. 

But, as politics always proves, there is no way to predict the fallout from the multicandidate race that is about to begin in earnest in the new year of 2019. In any case, the Midwest is a crucial battleground for 2020, and both Brown and Klobuchar will command attention.
