The Roundup Top Ten for July 14, 2023

Child Labor Was Essential to Early Capitalism—Don't be Shocked it's Coming Back

by Steve Fraser

As employers seek new possibilities to exploit labor, state legislatures are going back to the future with rollbacks of child labor bans. 

Colleges Must Follow the Law, but they Don't Need to Aid SCOTUS's Resegregation Agenda

by Richard Thompson Ford

From the architects of Jim Crow to William Rehnquist to John Roberts, conservatives have been able to use "color blind" principles to actively defend segregation. Colleges must consider this history in deciding how they adjust their admissions practices in response to SCOTUS's affirmative action ruling. 

In Post-Soviet Russia, Children Have Been Propaganda Instruments

by Clementine Fujimura

Russian regimes since the fall of Communism have inherited and created crises of mass orphanage; their policy responses to parentless children have been informed by politics and nationalism at the expense of child welfare. Removal of orphans from Ukraine to Russia is just the latest instance. 

The Body Mass Index Grew out of White Supremacy, Eugenics and Anti-Blackness

by Sabrina Strings

Beneath the statistical and scientific imprimatur of the measurement lie a host of assumptions that the bodies of affluent white people are normal and those of others are deviant and deficient. 

Our Amicus Brief Against Florida's Stop WOKE Act

by Amna Khalid and Jeffrey Aaron Snyder

"The Supreme Court’s rejection of affirmative action in college admissions will provoke widespread debate. But not in the classrooms of Florida’s public colleges and universities, because the Stop WOKE Act prohibits it."

Ozempic is the Latest Vain Pursuit of a Scientific Solution to Addiction

by Simon Torracinta

Now that the diabetes drug has been used off-label to suppress appetite, scientists are speculating about its use to suppress neurological aspects of addictive behavior. History suggests this is misguided. 

What "Crackhead" Really Meant in 1980s America

by Donovan X. Ramsey

The memories of politicians and police have been allowed to dominate our understanding of the emergence of crack cocaine in the 1980s. A new book seeks to elevate the voices of urban Black Americans and others who experienced it directly and still live with its effects.

The Entanglement of Art and Slavery in the Work of Juan de Pareja

by Rachel Hunter Himes

Diego Velázquez painted the portrait of Juan de Pareja in 1650. An art historian considers what more we can learn about the painting and the world in which it was made by examining the paradox of a dignified and beatific portrayal of a man painted by another man who enslaved him. 

Why Has Medicine Looked at PCOS Through the Lens of Fertility Instead of Pain?

by Alaina DiSalvo

Polycystic Ovarian Syndrome has had a complicated history in medicine. But its path toward recognition has been unfortunately colored by a concern for preserving fertility instead of improving women's quality of life—even in groundbreaking feminist health guides like Our Bodies, Ourselves. 

In Memphis, Tyre Nichols's Killing Echoes 1866 Massacre

by Isaiah Stafford and Kathy Roberts Forde

In the aftermath of the Civil War, Memphis was a city in political upheaval in which policing became a method of reasserting white supremacy. 

Why Has Medicine Looked at PCOS Through the Lens of Fertility Instead of Pain?

“The medical industry is always trying to preserve women’s ovaries to have a baby,” lamented Erin Barnett, a woman with endometriosis and polycystic ovarian syndrome (PCOS). “But I want them to put the same amount of effort into helping me be pain-free.” In an article in the Canberra Times and in her book, Endo Unfiltered, Barnett describes how, for decades, doctors completely disregarded her level of pain and discomfort while dealing with PCOS in favor of preserving her fertility. This is the case for many women living with PCOS and other related conditions. Doctors need to start focusing on the whole-body impact PCOS has on the many women with the condition and to prioritize patients’ stated desires above the preservation of their fertility. Physicians often deprioritize and belittle women’s quality of life, and PCOS is an excellent case study through which we can see the prevalence of this issue within the medical community.

Although not immediately life-threatening, PCOS is a chronic, incurable condition affecting at least 1 in every 10 women. This number is likely inaccurate, because researchers at the Robinson Institute of the University of Adelaide estimate that up to 70% of women with PCOS around the world remain undiagnosed. With PCOS, women can suffer from several symptoms: a highly irregular and especially painful menstrual cycle, facial hair, male-pattern baldness, cystic acne, dark and velvety patches of skin, obesity, extreme difficulty losing weight, insulin resistance and diabetes, sleep apnea, a higher risk of uterine cancer, and ovarian cysts that vary in size and severity. But most doctors only focus on one thing: PCOS can also cause infertility.

This focus on infertility has emerged from a historical contestation over defining PCOS. Doctors have historically disagreed about what causes PCOS. There is a popular rumor that it is caused by a modern culture that promotes obesity and eating processed foods, but studies have found this probably isn’t true. In fact, some scientists believe that PCOS has been around since the Paleolithic era and remained prevalent because it gave women certain advantages, such as sturdiness, strength, and lower mortality rates. In 1721, doctors began investigating polycystic ovaries and hyperandrogenism, concentrating their studies on whether the women in question were able to successfully conceive after experimental treatments. They mostly focused on the abnormal physical appearance of the ovaries and infertility. In the following centuries, doctors and other medical researchers reported on the presence of cysts and enlarged follicles on women’s ovaries. Only in 1935 did two gynecological researchers, Irving Freiler Stein and Michael Leventhal, report on the different effects these types of ovaries could have on the female body. The illness was dubbed Stein-Leventhal Syndrome, and the researchers’ work constituted the first significant investigation into PCOS. Since then, however, there has been a serious lack of research into the condition as well as a lack of public knowledge about its existence.

Over the course of the 20th century, the medical community took an increased interest in and performed more research into women’s health. Our Bodies, Ourselves, originally published in 1970 by the Boston Women’s Health Collective, was one of the first books that comprehensively detailed conditions and illnesses facing women. The publication began as a pamphlet on women’s health and grew into a book as its authors researched more medical conditions affecting women. In the 1998 edition, PCOS is mentioned only once. It would make sense for PCOS to be mentioned in other sections relevant to the symptoms the condition causes, such as insulin-related conditions, menstrual cycles, or hirsutism. But it isn’t mentioned in those places at all. Rather, it is given two sentences in the section entitled: “A woman may experience infertility because…”[1] Our Bodies, Ourselves is a groundbreaking text intended to provide women with detailed medical information about their bodies amidst a male-dominated culture of dismissal and secrecy. But even this book was not infallible and was infected by the cultural standard of over-focusing on women’s fertility.

Formal diagnostic criteria for the condition were not created until the 1990s. This is due to many things, including the fact that the pain and discomfort of women have consistently been ignored or dismissed by medical professionals. In addition, many women with PCOS retain the ability to become pregnant, putting the issue even further down the medical field’s list of priorities. Today, most doctors go by the Rotterdam Criteria and will diagnose patients if they meet at least two out of the three listed points. The list includes irregular or absent periods, physical or chemical signs of hyperandrogenism, and polycystic ovaries as shown on an ultrasound. This list is not at all comprehensive and leads to ignorance about other markers of PCOS, such as acanthosis nigricans (dark, velvety patches of skin), obesity, and insulin resistance.

In Post-Soviet Russia, Children Have Been Propaganda Instruments  

People in Brussels attend a memorial for the Ukrainian children who have been forcibly taken to Russia. Thierry Monasse/Getty Images

Clementine Fujimura, United States Naval Academy

Since Russia launched a full-scale invasion of Ukraine in February 2022, Russian soldiers have forcibly taken an estimated 16,000 Ukrainian children to Russia. Over 300 children have since returned home, but it is not clear what happened to most of the rest.

The mass abductions led prosecutors at the International Criminal Court to issue arrest warrants in March 2023 for Russian President Vladimir Putin and Russia’s children’s rights commissioner, Maria Lvova-Belova. Moscow counters that the children it has brought to Russia – its estimates are as many as 744,000 Ukrainian children – have been evacuated from conflict zones.

I am an anthropologist who studies marginalized communities, including youth subcultures in Russia and other places, including the United States and parts of Europe.

The kidnapping of Ukrainian children offers a reminder of how Putin and other Russian leaders have historically used children as pawns in international politics.

A Soviet promise to children

I explore the lives of homeless and abandoned Russian children, including kids in orphanages and other similar institutions in Moscow, in my 2005 co-authored book, “Russia’s Abandoned Children: An Intimate Understanding.”

My research included numerous trips to Russian orphanages between 1990 and 2000, as well as time spent living and volunteering in an orphanage and shelter for babies.

It’s helpful to understand that before the Soviet Union collapsed in 1991, the Soviet government presented a myth that all children – including those in institutions – would receive excellent care. The Soviet government promised these children that their futures were bright and that they would receive an education and have help getting a job.

Other than adults who worked in these Soviet orphanages or psychiatric hospitals, no one was allowed to see what went on inside.

The myth of these orphaned children’s perfect childhood calmed citizens’ potential concerns, my research shows.

However, the public began to realize Russian orphans’ plight once the Soviet Union broke apart. Orphans and otherwise abandoned children in orphanages began to escape the institutions when possible. They formed their own version of kinship groups, gathering on city streets and in underground train stations.

I discovered in my research that many abandoned children preferred being homeless to living in orphanages.

This trend of youth vagrancy became a sore spot for the Russian government, as it tried to grow its economy and rebrand itself in the West.

Soviet orphans play in a crib in 1991, the year the Soviet Union fell. Peter Turnley/Corbis/VCG via Getty Images

Russia’s struggle to care for kids

Russia’s decision to end adoptions to American families in 2012 offers another example of how the Russian government has used children for nefarious purposes in the past few decades.

The Russian government first opened the doors for international adoption in 1991. Citizens from the U.S. and other Western countries eagerly responded, welcoming the new openness of Russia.

This helped boost Russia’s image in the West as a kinder country than it was during the Cold War. At the time, around 371,700 Russian children were growing up in state institutions. Most of these kids had at least one living parent.

In some cases, the government deemed parents unfit and moved their children to an institution.

U.S. citizens adopted more than 60,000 Russian orphans from the early 1990s until 2013.

During my time spent with teachers, doctors and children in Russian orphanages and shelters, it was clear that Russia struggled to care for abandoned and otherwise institutionalized children, including those taken from parents.

There were also widespread reports of the children being neglected and mistreated.

In the orphanage I studied, children did not eat fresh fruits and vegetables, and the caretakers often lamented the food’s lack of nutritional value. I was asked to bring vitamins, diaper rash cream and other basic necessities.

The fact that the Russian government could not handle its orphans was a source of embarrassment. Putin, who served as president from 2000 through 2008 and again starting in 2012, saw the need to change the narrative of the poor Russian orphan, if only for the sake of the country’s public image.

‘It’s hard to believe’

In 2008, a Russian toddler born with the name Dima Yakovlev died of heatstroke while left unattended in his adoptive father’s parked car in the Washington, D.C., area.

This news made international headlines. Some Russian officials pointed out the lack of oversight and abuse that adopted Russian children experienced in the U.S. This narrative helped weaken the U.S. in the eyes of Russian citizens, thereby strengthening the image of the Russian government.

“When we give our children to the West and they die, for some reason the West always tells us it was just an accident,” Russian politician Tatyana Yakovleva reportedly said in 2009. “It’s hard to believe.”

This case and other news stories about a few U.S. adoptive families treating Russian children poorly coincided with another political controversy.

Russian police arrested attorney Sergei Magnitsky on questionable grounds. Magnitsky had uncovered a tax fraud worth US$230 million. Magnitsky died while in custody in 2009, before he could stand trial.

In 2012, the U.S. Congress approved new legislation, called the Magnitsky Act, which identifies and imposes sanctions on Russian officials who are accused of human rights violations.

A halt to adoptions

In 2012, Putin signed the law banning international adoptions to the U.S.

Putin’s law, which went into effect in early 2013, halted thousands of adoptions already in progress with American families.

U.S. scholars and journalists have argued that Putin’s adoption ban was a direct retaliation for the Magnitsky Act and was not about Putin’s concern for Russian orphans. Putin promised to improve the Russian child welfare system in 2013. Some outside analyses by groups like the World Bank have documented positive changes at Russian institutions for children, such as more funding. But there remain challenges – including the fact that Russia has a much higher rate of institutionalized children than other middle- to high-income countries.

While some abducted Ukrainian children have come home to their families, most remain unaccounted for. Pierre Crom/Getty Images

A similar playbook

In the face of evolving battlefield failures in Ukraine, Putin has pivoted to a familiar playbook of using and abusing children, continuing to call for the “evacuation” of Ukrainian children, both from Ukrainian orphanages and from their families. These children are being moved to Russian orphanages and camps, where they learn how to be Russian.

In order to become citizens of Russia, these children have been forced to abandon their Ukrainian heritage, both physically and mentally, and to get a new education in Russian propaganda and history.

Russian citizens, in turn, are once again presented with the myth that children in Ukraine are being saved from the war and offered a better life.

But for Ukrainian families and orphanage staff involved, these abductions amount to a form of torture, with parents and caretakers clamoring to find their children and bring them home.

Clementine Fujimura, Professor of Anthropology, Area Studies and Russian, United States Naval Academy

This article is republished from The Conversation under a Creative Commons license. Read the original article.

The Body Mass Index Grew out of White Supremacy, Eugenics and Anti-Blackness

Today, we treat “obesity” (measured as a body mass index [BMI] ≥ 30) with a surprising seriousness, given its history. Fear of fatness did not begin as a medical concern. In fact, it took off in the mid-18th century. At that time, several race scientists began arguing that being “too fat” was bad specifically because it had been linked to women of color. Renowned scientist Georges-Louis Leclerc, Comte de Buffon, for instance, repeated claims made by other scientists that Chinese people, while not all “fat and bulky … consider being so as an ornament to the human figure,” adding that one could therefore find many Chinese women with enormously “big bellies.” Big bellies were also, according to Buffon, a noticeable deficit among the women of some African tribes.1

These ideas crept into medicine through eugenics. Eugenics was a late-19th to mid-20th century movement to promote so-called better breeding by identifying qualities of the human race to be cultivated and defects of the human population to be eradicated through selective breeding. Race and weight were intrinsic to their concerns. In the United States, eugenicists like the zoologist Charles Davenport argued that fatness was a constitutional flaw. The “low” types betrayed this form of embodiment. Chinese and Jewish people, for instance, were thought to be prone to a lamentable “racial obesity.”2

Davenport and other eugenicists, by combining race science and medical science, were inventing what I call a “white bannerol of health and beauty.” This bannerol pseudoscientifically bundles attractiveness and healthfulness. Peoples and physical proportions that had been held in high or low esteem by race theorists and philosophers of beauty were, with the eugenics movement, subject to a new form of medical penalty. These faux-scientific notions about body size, health, and desirability (especially for women) would ultimately make their way into the medical mainstream.

Davenport had frontloaded race in his pseudoscientific understanding of the link between weight and health. He’d also embraced the latest science for identifying how much fat was believed to make a person sick. Such notions had arrived by way of the insurance companies.

The insurance industry had long been creating so-called “standard weight” tables. These tables gave the average weight by age and height for thousands of people judged by insurance companies’ medical examining boards to be sufficiently healthy to be acceptable life insurance risks.3 Most of the insured were white, but the insurance industry’s primary concern was not in identifying racial differences but in demonstrating a link between weight and health. This was the mechanism used to delimit potential policyholders and, by extension, potential monetary payouts. Yet the insurance industry’s ignoring race did not stop Davenport and others from continuing to make racialized assertions about body weight. Davenport was known to use the weight tables to advance his eugenic claims about a racial factor in obesity.4

Still, as the 20th century wore on, eugenic claims were becoming less tenable. The devastation of the Holocaust led some postwar scientists to publicly admit that race was not biological.5 During the 1940s and 1950s, the medical community downplayed the overt role of race in questions of health, even those about obesity.6 A new emphasis was placed on disciplining the bodies of all people based on the insurance industry weight tables, which unfortunately still relied on an implied white standard.

Child Labor Was Essential to Early Capitalism—Don't be Shocked it's Coming Back

An aged Native American chieftain was visiting New York City for the first time in 1906. He was curious about the city and the city was curious about him. A magazine reporter asked the chief what most surprised him in his travels around town. “Little children working,” the visitor replied.

Child labor might have shocked that outsider, but it was all too commonplace then across urban, industrial America (and on farms where it had been customary for centuries). In more recent times, however, it’s become a far rarer sight. Law and custom, most of us assume, drove it to near extinction. And our reaction to seeing it reappear might resemble that chief’s — shock, disbelief.

But we better get used to it, since child labor is making a comeback with a vengeance. A striking number of lawmakers are undertaking concerted efforts to weaken or repeal statutes that have long prevented (or at least seriously inhibited) the possibility of exploiting children.

Take a breath and consider this: the number of kids at work in the United States increased by 37 percent between 2015 and 2022. During the last two years, fourteen states have either introduced or enacted legislation rolling back regulations on the number of hours children can be employed, lowering restrictions on dangerous work, and legalizing subminimum wages for youths.

Iowa now allows those as young as fourteen to work in industrial laundries. At age sixteen, they can take jobs in roofing, construction, excavation, and demolition and can operate power-driven machinery. Fourteen-year-olds can now even work night shifts and once they hit fifteen can join assembly lines. All of this was, of course, prohibited not so long ago.

Legislators offer fatuous justifications for such incursions into long-settled practice. Working, they tell us, will get kids off their computers or video games or away from the TV. Or it will strip the government of the power to dictate what children can and can’t do, leaving parents in control — a claim already transformed into fantasy by efforts to strip away protective legislation and permit fourteen-year-old kids to work without formal parental permission.

In 2014, the Cato Institute, a right-wing think tank, published “A Case Against Child Labor Prohibitions,” arguing that such laws stifled opportunity for poor — and especially black — children. The Foundation for Government Accountability, a think tank funded by a range of wealthy conservative donors including the DeVos family, has spearheaded efforts to weaken child-labor laws, and Americans for Prosperity, the billionaire Koch brothers’ foundation, has joined in.

Nor are these assaults confined to red states like Iowa or the South. California, Maine, Michigan, Minnesota, and New Hampshire, as well as Georgia and Ohio, have been targeted, too. Even New Jersey passed a law in the pandemic years temporarily raising the permissible work hours for sixteen- to eighteen-year-olds.

What "Crackhead" Really Meant in 1980s America The crack epidemic has passed into legend since its end in the mid-1990s, and the further we get from its height, the larger it looms in the collective imagination. That’s partly a product of memory itself, it seems, but it’s also a consequence of which memories of the epidemic have been prioritized.

For more than three decades, the accounts of law-enforcement officials, politicians, and pundits have dominated the conversation. Most of those people were never touched in personal ways by the epidemic, except perhaps for what they did at work or saw in the news or experienced in passing. For those people, the crack epidemic was and continues to be an idea that encapsulates everything bad about the ’80s and ’90s—the poverty, crime, gangs, violence, everything the ghetto represented in America after the civil-rights movement.

But for the community members who came face-to-face with the crack epidemic, it was as real as flesh and blood. Crack and its attendant misery permeated every aspect of our lives. For us, the crack epidemic was more than a collection of statistics used in an article or speech. It was embedded in our neighborhoods and homes. It was in some people’s childhoods, interrupted constantly by trauma, tragedy, threats, and stress.

Michelle lived just a few doors down the block from my family, in Columbus, Ohio. I hardly ever saw her, though. In fact, I don’t remember ever actually meeting Michelle, but I was taught to be afraid of her. My mom, a cautious woman who otherwise avoided gossip, would drag our house phone room to room by its long white cord and talk at length with her friends about Michelle From Down the Street.

She had too many strange people going in and out of her house. The neighborhood could hear her parties at all hours of the night. She looked “a mess.” It was all “just so sad,” my mom would say with a slow shake of her head. She would move on to other topics, but I stayed fixed on Michelle and tried to imagine what might be going on just a few feet away.

One Sunday afternoon, I was sitting on our front porch with my older sister when a van pulled up and parked in front of Michelle’s place. Out of it came an older woman and a young girl, each resembling our mysterious neighbor in her own way. Because my sister knew everything, I asked her who the strangers were. “Duh! That’s Michelle’s family,” she said, adding that the small girl was Michelle’s daughter. “Why don’t she live with her mom?” I asked. My sister shrugged her shoulders, annoyed, like it was the kind of inconsequential question she’d never ask, and answered, “I don’t know. Probably because Michelle is a crackhead.”

It was 1993 or 1994. I was just 5 or 6 years old but had heard the word crackhead countless times, usually from other kids. Crackhead was a go-to insult—so-and-so was “acting like a crackhead”; “yo mama” was a “crackhead.”

It was popular, I assume, because it belonged to the grown-up world, and using it made us feel grown. I suppose we made crackhead a slur because we feared what it represented, a rock bottom to which any of us could sink. That’s what children do when they’re in search of power over things that frighten them: They reduce them to words, bite-size things that can be spat out at a moment’s notice.

I couldn’t make sense of the fact that Michelle was a crackhead. She lived just down the street, after all, and she had a family. Crackheads were supposed to be foreigners from some netherworld, whose main activities were begging for money and otherwise disrupting community life. Then they were supposed to return to wherever they came from—alleyways, sewers, wherever the trash went after we threw it out.

Ozempic is the Latest Vain Pursuit of a Scientific Solution to Addiction

Will 2023 be the year that the battle against addiction is finally won? The key, it seems, may be the new diabetes drug, Ozempic. While doctors and pharmaceutical companies hail it as a powerful new weight loss drug, early reports also suggest that GLP-1 analogue drugs like Ozempic, Wegovy and Mounjaro may suppress a variety of other appetites, from smoking and alcohol to shopping and nail-biting.

GLP-1 analogues were developed to control diabetes by triggering insulin secretion from the pancreas. Some evidence suggests these drugs also affect the brain’s dopamine pathways, which appear so significant in rewarding our behavior that some neuroscientists have labeled them the “wanting” system. The hope is that GLP-1 analogues might target this wanting system directly, eliminating or reducing cravings — including those associated with addiction. Are we around the corner from a major advance in the science of desire?

Not so fast. The history of many failed attempts to generate a reliable biological treatment for thorny problems like addiction should caution us against anticipating another magic bullet. Explaining desire through simple biology fails to grasp the variety of motivations for drug-taking, and previous medical fixations on extinguishing “cravings” alone typically did more harm than good.

The search for a scientific solution to cure addiction dates back over 150 years. In the mid-19th century, the fast-developing science of neurophysiology believed that the source of all human motivations could be traced back to the brain. “All [man’s] desires and motives are experienced in and act upon this important apparatus,” the pioneering brain physician Thomas Laycock declared in 1860. Only “perfect knowledge” of this neurological system, Laycock added, would allow man “to direct and control the internal or vital forces, just as he directs and controls the physical or external forces.”

Inspired by these developments, many doctors became convinced that disorders like “opium poisoning,” “inebriety” and “morphinism” were linked by a common “disease of the will” that required a physiological solution. By the turn of the 20th century, the language of “addiction” had developed as a catchall term to describe this group of afflictions that had previously been considered separate conditions.

In Memphis, Tyre Nichols's Killing Echoes 1866 Massacre

Three years after the police murder of George Floyd, the Department of Justice has released a searing report on the Minneapolis Police Department, finding widespread use of excessive force, including deadly force, and discrimination against Black and Indigenous people.

In May the DOJ announced it is investigating police “use of force” policies in yet another city, Memphis, plus a separate review of specialized units used in law enforcement across the country. Five Memphis police officers, members of a special anti-crime strike team named Scorpion, brutally attacked 29-year-old Tyre Nichols after a stop for an alleged traffic violation on Jan. 7. Nichols died three days later from head injuries suffered during the police beating, according to the autopsy report.

Just three months after Nichols’s killing at the hands of police, Memphis recorded 40 murders in March alone, double the January figure. As David A. Graham reported in The Atlantic, this surge in murder is part of a U.S. pattern following “highly publicized killings by police officers.” Experts aren’t sure why this happens and why it’s happening in Memphis this year, especially given the significant decline in national murder rates in 2023 thus far. But it’s a social fact that makes life even more precarious in this majority Black city, where, the Marshall Project reports, rates of police arrests of, and violence against, Black residents are disproportionately high.

Police violence is the leading cause of death for young Black men in the United States. It is an urgent and heartbreaking national problem, but with specific local expressions.

Memphis has been a racialized hot spot for police violence for more than 150 years, with roots reaching back to a massacre that occurred soon after the Civil War ended.

By the mid-19th century, the steamboat industry in Memphis, a city perched along the Mississippi River, was thriving, as was railroad construction, providing livelihoods for those willing to endure the dangerous work. Irish immigrants flowed into Memphis, fleeing famine and mass unemployment back home, to take these unskilled jobs and build new lives as Irish Americans. As a fairly young city, just 40 years old when the Civil War began, Memphis was largely a city of newcomers, including White and free Black Southerners.

Black people freed from slavery during and after the Civil War moved to Memphis, occupied early on by Union forces during the war, and competed for the same positions. The labor situation was toxic, pitting the formerly enslaved against Irish immigrants not considered fully White in an evolving racial hierarchy. In short order, the Irish in Memphis adopted the white supremacist ideas and racial grievances of the former Confederates and aligned with the Democratic Party, a strategic move meant to cement their whiteness and Americanness. The population of Memphis had exploded from roughly 20,000 in 1860 to 35,000 by the war’s end, with 20 percent being Irish.

Ex-Confederates, barred from voting, left a hole in the electorate that the new Irish Americans quickly filled as they became active in Memphis politics. Because many had not fought with the Confederates, they were eligible to vote and dominated the elections. (Black men had yet to receive the vote in Tennessee.) As a result, the Irish gained political power in Memphis and parlayed that power in the labor market. By 1865, the Irish held 162 of the 177 positions in the Memphis police force, with the Irish American mayor appointing many based not on their qualifications but on their political affiliation.

Our Amicus Brief Against Florida's Stop WOKE Act

The Supreme Court’s rejection of affirmative action in college admissions will provoke widespread debate. But not in the classrooms of Florida’s public colleges and universities, because the Stop WOKE Act prohibits it.

A pillar of Governor Ron DeSantis’ campaign against alleged “woke indoctrination,” the Stop WOKE Act, signed in 2022, stipulates that students in Florida’s K-20 public education system cannot be subjected to instruction that “espouses” or “advances” eight blacklisted concepts. 

One of these concepts is that “a person by virtue of his or her race, color, national origin, or sex should be discriminated against or receive adverse treatment to achieve diversity, equity, or inclusion.” This provision disqualifies any material that is pro-affirmative action. How, then, will students at Florida’s state universities “contribute to the democratic process” and “function as engaged community citizens” (to quote the mission statement of the University of South Florida) if they can’t discuss the pros and cons of vital issues like this?  

We just completed a year-long fellowship with the University of California National Center for Free Speech and Civic Engagement, where we investigated the effects of the Stop WOKE Act and other laws targeting higher education in Florida. From our research, which included interviews with over a dozen faculty at Florida public universities, we learned how the Stop WOKE Act has trampled on academic freedom, significantly restricting the ability of teachers to deliver accurate, engaging, and effective classroom instruction.  

That’s why we were happy to submit an amicus brief last Friday to support the plaintiffs—seven faculty members and a student group—seeking to strike down the Stop WOKE Act. The law is subject to a preliminary injunction, pending appeal from Florida. The federal appeals court for the Eleventh Circuit is expected to rule in the next six months.  

The crux of our brief is that by compelling professors to commit educational malpractice, the Stop WOKE Act—its full title is Stop the Wrongs to Our Kids and Employees Act—undermines public higher education’s mission to develop students’ critical thinking skills and prepare them to be informed citizens.  

“The Stop WOKE Act is an eerie combination of Orwell and Kafka,” University of Florida history professor Jeffrey Adler told us, adding that there are “mandates about what we’re not supposed to do and about what we’re not supposed to say,” but the specifics remain frustratingly amorphous.  

Frank Fernandez, an assistant professor of higher education administration and policy at the University of Florida, said, “The law is vague, but the message is clear.” And the message is that faculty members should avoid topics related to race, racism, and social inequality.  

Colleges Must Follow the Law, but they Don't Need to Aid SCOTUS's Resegregation Agenda

On cue, the Supreme Court has invalidated race-conscious affirmative action in higher education. The majority opinion was preordained, but the dishonesty and cowardice of that opinion and the concurrences are still breathtaking. It’s especially galling that the court cloaked an attack on integration in the equal-protection clause of the 14th Amendment and the Brown v. Board of Education decision. And make no mistake: Opposition to affirmative action is part of a segregationist agenda that began with opposition to school desegregation in the 1950s and has taken new and more-underhanded forms as evolving public opinion made an open defense of segregation untenable.

Chief Justice John G. Roberts Jr.’s mentor, the late Chief Justice William H. Rehnquist, opposed Brown v. Board of Education when he was a clerk to Justice Robert H. Jackson. In a memo to Jackson, Rehnquist urged that the doctrine of “separate but equal” be upheld, writing: “I think Plessy v. Ferguson was right and should be reaffirmed.” By 1971, when Rehnquist was nominated to the Supreme Court, Brown was sufficiently well established that he felt compelled to deny his position and attribute it to his deceased former boss, Justice Jackson. (Jackson in fact voted with a unanimous majority in Brown to overturn Plessy and segregation.) Rehnquist made the same claim in his 1986 confirmation hearing to be chief justice.

But the historical record available today makes it clear that the memo endorsing segregation reflected Rehnquist’s own views. In 1987, when Ronald Reagan nominated Robert H. Bork to the court, Democrats blocked his nomination, pointing out that his originalist judicial philosophy would overturn Brown and pave the way to a return of racial segregation. From then on, opponents of integration would adopt the covert approach pioneered by Rehnquist: Rather than attack Brown directly, they would quietly undermine it by limiting its scope and distorting its meaning.

Over time they turned Brown against itself, supplanting the egalitarian imperative of desegregation with one that was compatible with continued segregation: colorblindness. It had been well understood that colorblindness was compatible with segregation. Long before Brown, Jim Crow states used formally colorblind laws to lock in racial hierarchy. For example, after Reconstruction, they enacted literacy tests and poll taxes to disenfranchise recently emancipated slaves but added “grandfather clauses” that exempted anyone descended from prewar voters from the new requirements. Later, segregated school districts tried to circumvent Brown with colorblind “school choice” plans that assigned most students to the (segregated) schools they had previously attended unless they objected to the assignment. Today’s most prominent advocate of the colorblind interpretation of Brown and the 14th Amendment is a former Rehnquist clerk, Chief Justice Roberts.

Those tactics worked. Today, many K-12 schools are as racially segregated as they were in the 1960s, and for most students in those schools, college offers their first significant exposure to people of other races. With the Students for Fair Admissions cases, the Supreme Court has taken a big step toward ensuring that selective colleges and universities become as segregated as the typical K-12 school in a wealthy suburb.

The Next Culture War Battle? College Accreditation

The War on Accreditation

Former president Donald Trump has vowed to “fire” them. Republicans in Congress would like to restrict them. Florida governor Ron DeSantis wants the courts to break them. And Christopher Rufo, the chief architect of today’s “Critical Race Theory” panic, recently declared them his “next target.”

The culture wars have come for university accreditors.

For many Americans, the process by which colleges and universities are accredited may seem obscure or unimportant, but accreditation agencies matter. There are seven major bodies in the United States, the so-called “regional” accreditors, that accredit higher education institutions. Many majors and disciplines, such as nursing and engineering, also have their own accrediting bodies.

Simply put, accreditation is one of the principal guarantors of quality in America’s higher education system and one of the major ways students can differentiate between a reputable institution and a diploma mill. Each accreditor has its own standards for issues like graduation rates, financial health, and curricula. Accredited institutions themselves help develop these standards as well as plans for their implementation and evaluation. Ultimately, public and private universities must meet these standards to acquire and maintain accreditation.

And they have a very good reason to do so. Under the Federal Higher Education Act, unless a college is accredited by a federally recognized agency, its students are ineligible for Pell grants, federal loans, and work-study funds; nearly 84% of all college students rely on this financial support. Without accreditation, they may also have difficulty getting their degrees recognized by prospective employers or licensing boards. For the majority of universities, de-accreditation amounts to a death sentence.

This is the dynamic that some lawmakers and commentators want to change, and for one very specific reason: because accrediting bodies are a shield against government censorship. 

The Entanglement of Art and Slavery in the Work of Juan de Pareja

The Black figure is currently in a sustained spotlight. For some time now, curators, scholars, and critics have wrestled with the representation of people of African descent in art, grappling with the interpretive problems and possibilities presented by subjects who were once objects, cargo, and commodities. The rise of a Black figurative turn in contemporary art reflects this interest. In the past six years, Kehinde Wiley, Mickalene Thomas, Kerry James Marshall, and Toyin Ojih Odutola—all of whom have made Black figures their central subject—have each had a solo exhibition at a major museum.

Galleries, too, have capitalized on the Black figure’s new presence in the public eye. Business has been particularly brisk among art institutions seeking to remediate the relentless whiteness of their holdings. And many museums have followed suit, mining their own collections for Black subjects and engaging with paintings, prints, sculptures, and works of decorative art anew in their efforts to bring to light the histories of race, slavery, and colonialism. Such attention has been a long time in coming.

At the Metropolitan Museum of Art in New York City, a portrait by Diego Velázquez has served as a starting point for a new exhibition and catalog exploring the tangled history of art production, race, and enslaved labor. The portrait, completed in 1650, shows a man named Juan de Pareja. Captured in a dignified pose, he meets our gaze with a sensitive regard. The fluid and shimmering brushwork of Velázquez evokes the light gleaming on Pareja’s brow and glinting from his dark eyes. He appears in the dress of a Spanish nobleman, with a broad lace collar and a sash across his chest. Yet while nothing in the painting would suggest it, the power that Velázquez holds over Pareja exceeds the typical relationship of artist to subject or portraitist to sitter. Velázquez, the Old Master, is a master in another sense: the master of the man he has painted, who is his slave.

In their 2013 book Slave Portraiture in the Atlantic World, Agnes Lugo-Ortiz and Angela Rosenthal ask: If the Western visual tradition insists on portraiture’s affirmation of the subject, can there really be a portrait of a slave? Or do portraits of enslaved individuals intrinsically undermine the objectifying project of slavery? Pareja’s dignified presence here stands as a visual counterpoint to what typically turns up in the search for Black figures in collections of European art: fantastically attired blackamoor pages, sometimes with silver slave collars, crouching at the knees of the white subjects of European portraiture, offering a tonal contrast between ethereal whiteness and inky blackness, and a conceptual contrast between power and subservience, dominance and subjugation. Unlike these anonymous Black figures, however, Pareja has a history. He was a painter himself. After his manumission, he went on to found his own workshop as a free man, executing paintings that were displayed in the private and ecclesiastical spaces of Madrid. Several major examples of his work appear in the Met’s exhibition alongside paintings attributed to Velázquez, many of which reflect Pareja’s contributions to the Old Master’s output. Also in the exhibit are polychrome sculptures, metalwork, and ceramics that further reveal the breadth of enslaved and emancipated artistic labor in 17th-century Spain. Together, these works allow us to glimpse the milieu into which Pareja entered, first as enslaved assistant and then as independent artist.

We don’t know a great deal about Juan de Pareja—but then again, we know more about him than we do many other European artists of the early modern period, some of whom we can name only with epithets like “Master of Ávila” or (a personal favorite) “Master of the Drapery Studies.” In his catalog essay, David Pullins, a curator of the exhibition alongside Vanessa K. Valdés, lays out what we do know of Pareja’s life. Born around 1608 in Antequera, a small city about 90 miles east of Seville, he was perhaps the child of a Spanish man and an enslaved African woman, or then again maybe a Morisco, a descendant of the North African Muslims who were forcibly converted to Catholicism after the end of Muslim rule on the Iberian Peninsula. He was a member of a substantial population of enslaved men, women, and children of African descent living and working in Spanish urban centers, where it was common for households to own one or two, but usually not more than three, slaves. His duties in Velázquez’s workshop would have included grinding pigments and preparing canvases—but as the show reveals, he also made far more significant contributions to the paintings that today bear his master’s name.

The Roundup Top Ten for July 7, 2023

SCOTUS's Affirmative Action Decision Caps a Decades-Long Backlash

by Jerome Karabel

A scholar of university admissions says that the decision will be a "monumental setback for racial justice" that is rooted in myths about the policy that have surfaced through decades of opposition to affirmative action. 

After Brown v. Board, Segregationists also Attacked "Woke" Businesses

by Lawrence B. Glickman

When two TV networks decided in 1956 to no longer air racist lyrics to popular songs by Stephen Foster and other minstrelsy holdovers, some southern segregationists took the move as an attack on the very foundations of civilization. 

Macron's Statements on Police Killing Show France has Far to Go in Acknowledging Racism

by Crystal M. Fleming

Histories of official violence against nonwhite citizens confound the nation's official policy of universalism; President Macron's description of riots as "inexplicable" shows that official colorblindness won't help the French move toward justice. 

Why We are Still Debating Birthright Citizenship

by Martha S. Jones

Opposition to birthright citizenship has, historically and today, reflected opposition to the idea of equal membership in the political community of the nation and has been inextricable from the idea that white Americans should be privileged citizens, argues the leading historian of the subject. 

The Pendulum of Queer History

by Samuel Huneke

As the Republican Party embraces aggressive transphobia as a political wedge issue, there is historical reason to believe that the strategy will provoke organizing, reform, or even revolution for queer liberation. 

University of California is Cracking Down on Workers and Dissent

by Rafael Jaime

The arrest of three UC labor activists on vandalism charges related to chalking a sidewalk with pro-labor slogans shows that the university will throw its commitment to free speech out the window when it comes to playing hardball with graduate workers, says one grad union leader. 

History Shows Debt Relief is All-American

by Chloe Thurston and Emily Zackin

Throughout American history state legislatures have passed debt relief over the objections of creditors and the courts, responding to the economic needs of the citizenry and defying the idea that indebtedness was a personal failing. 

July 4 Was Once a Day of Protest by the Enslaved

by Matt Clavin

The public declarations of freedom and political equality that accompanied Independence Day were a prompt for protest, escape, and rebellion for the enslaved. 

John Roberts's Tragedy is of His Own Making

by Jeff Shesol

John Roberts has the power to arrest the Court's slide into disrepute, extremism, and trollishness. He's chosen not to use it. 

The Declaration of Independence Sealed a Shotgun Wedding

by Eli Merritt

If the founding is to inspire us today, it should be for the way that the journey from the Declaration to the Constitution reflected the ability to overcome bitter and pervasive division. 

History Shows Debt Relief is All-American

On Friday the Supreme Court ruled that the Biden administration’s plan for student loan debt relief exceeded its statutory authority. In addition to the legal challenges the plan faced, prominent Republicans — who cheered the ruling — have argued that the concept of student debt relief runs counter to our country’s deepest commitments. As Arkansas Gov. Sarah Huckabee Sanders tweeted, for example, we should not require taxpayers to “pay off $300 billion of other people’s debts … It’s un-American.”

But history reveals that such claims are false. For much of the country’s history, Americans have pressed their governments for relief from debts — and often, legislators granted it. This long tradition suggests that today’s ruling won’t put an end to the debate over debt relief, and the activism associated with it may yet pave the way for new protective policies.

Even before the U.S. Constitution was adopted, indebted Americans sought relief. Over a six-month period between 1786 and 1787, thousands of armed farmers, calling themselves Regulators, marched on courthouses in organized regiments, preventing judges from issuing judgments against debtors and freeing them from prison. Although this insurgency, known as Shays’ Rebellion, ended in military defeat, the Massachusetts state legislature ultimately issued a moratorium on debt collection, as did the legislatures of several other states.

Other debtors pursued similar ends through less violent means. In 1819, for instance, when the United States entered an economic depression, Kentucky was particularly hard hit, and many residents urged the state legislature to provide debt relief. Lawmakers responded with zeal. In 1820, they chartered a new bank, and authorized it to lend up to $1,000 (almost $26,000 in 2023 dollars) to individuals for the repayment of “just debts.”

The Kentucky legislature also went further, stipulating that creditors must accept repayment of debts in the form of (badly depreciated) notes from one of two state banks. If they refused, a debtor could delay repayment for two full years. When the state Supreme Court declared this law unconstitutional, the legislature passed a statute creating an entirely new court. Unsurprisingly, the old court deemed this move yet another violation of the state's constitution, and for three years, both high courts remained in operation, each refusing to acknowledge the legitimacy of the other.

While Kentucky’s court standoff was extraordinary, the legislative effort to protect over-indebted borrowers was common throughout the 19th century during times of economic hardship. State legislatures also prohibited the sale of mortgaged properties at too great a discount from their former value, and passed “redemption laws,” extending the time available to debtors to buy back their mortgaged properties once they had entered foreclosure. Some even closed their courts temporarily to prevent the issuance of judgments against debtors. State legislatures were also responsible for the proliferation of homestead exemptions, which shielded a portion of debtors’ assets from collection even if they could not repay their debts.

Courts deemed many of these measures unconstitutional, but state legislatures continued to pass them, calculating that the political benefit to passing these laws outweighed the risk of a judicial determination that they overstepped. One pro-debtor newspaper in Kansas urged its legislature to enact a stay law in 1890, remarking that “if in the opinion of the Supreme Court it should prove unconstitutional, no harm will have been done.” Chiding legislators for their “fear,” the columnists pointed out that, “Legislatures [had] passed unconstitutional laws before.”

Crucially, farm organizations repeatedly mobilized their members to fight for debt relief. They also worked to ensure that Americans did not perceive over-indebtedness as a personal failing.

These efforts shaped the politics of the issue.

The Declaration of Independence Sealed a Shotgun Wedding

During our own time of troubles, it is important to remember that in 1776 the men in powdered wigs who drew up the Declaration of Independence overcame political difficulties far more perilous than our own. Throughout the 1770s and 1780s, in fact, one wrong move in the Continental Congress might well have led to the secession of a regional bloc of states, whether New England, the Middle states, or the Southern states, with possibly cataclysmic consequences for all thirteen: continental civil war.

An apt metaphor capturing the spirit of the early republic is that of a shotgun wedding. If the New England, Middle, or Southern states had split apart into separate confederacies, civil wars would have broken out over finances, commerce, and, most of all, land. For this reason—the prevention of bloody mayhem—the founders reluctantly bound themselves into one Union. They united with the guns of civil war pointing at their backs, but, to their everlasting credit, they also practiced the art of cooperative political maneuvering in order to save the Union from self-destruction.

Consider early July 1776. American founding myth would have us believe that after years of British abuses and usurpations, the leaders of the thirteen colonies amiably adopted the “unanimous” Declaration of Independence.

In fact, the battle over the Declaration of Independence was an epic tale of secession threats, fears of intercolonial bloodshed, and furtive politics engineered to rescue the Middle states from tragically falling into civil war against New England and the Southern states.

Observers of the American scene had long since forecast that an independent America would self-destruct in civil wars. As James Otis, patriot and author of the influential “The Rights of the British Colonies Asserted and Proved,” averred in 1765, “Were these colonies left to themselves tomorrow, America would be a mere shambles of blood and confusion.” An English traveler visiting the colonies in 1759 and 1760 concurred, warning, “Were they left to themselves, there would soon be civil war from one end of the continent to the other.”

These were hardly isolated forebodings. After the British Parliament enacted the Coercive Acts in 1774 to punish Boston for its wreckage of more than 340 chests of tea in the harbor, newspapers were filled with anxious prognostications about the consequences of independence: American civil wars.

“Whenever the fatal period shall arrive,” Reverend Samuel Seabury of New York asserted that year, “in which the American colonies shall become independent on Great Britain, a horrid scene of war and bloodshed will immediately commence. The interests, the commerce of the different provinces will interfere: disputes about boundaries and limits will arise. There will be no supreme power to interpose; but the sword and bayonet must decide the dispute.”

Why We are Still Debating Birthright Citizenship

When my Google Alerts sounded this past week, I knew that birthright citizenship was again lighting up in the news. My interest in debates over birthright is professional and abiding: I’m a historian who in 2018 published a book, Birthright Citizens, that traced this approach to national belonging from its origins in debates among Black Americans at the start of the 19th century to 1868, when the ratification of the Fourteenth Amendment established that, with a few exceptions, anyone born on U.S. soil is a citizen.

On Monday, Florida Governor Ron DeSantis, looking to advance his presidential campaign, promised to reverse more than a century and a half of law and policy and, as he put it in a statement, “end the idea that children of illegal aliens are entitled to birthright citizenship if they are born in the United States.” A few days later, a spokesperson for another GOP presidential candidate, Nikki Haley, said she “opposes birthright citizenship for those who enter the country illegally,” and the entrepreneur Vivek Ramaswamy’s campaign said he would reform birthright by adding new citizenship requirements. Having lived through more than one such outburst in recent years—the first in 2018, when then-President Donald Trump proposed to do away with birthright—I know that any promise to transform our citizenship scheme is sure to set off a debate.

But what, we should ask, is that debate really about? Why does it keep coming up? When we talk about birthright citizenship, we are talking about democracy—its fundamental component that grants equal status to every person born in this country and affords them all the same rights of citizenship.

Let’s briefly review. Although the 1787 Constitution did not bar Black Americans from citizenship, it also did not plainly state what made any person a citizen. The result was that Black Americans received profoundly uneven treatment before the law; most authorities leaned toward the view that color, with its implied links to slave status, disqualified Black Americans from citizenship. Black activists waged a long campaign arguing that, on the face of the Constitution and as a matter of natural rights, Black people were citizens by virtue of their birth on U.S. soil.

Notoriously, the U.S. Supreme Court, in the 1857 case Dred Scott v. Sandford, concluded that citizenship was beyond the reach of Black Americans; their race disqualified them. During the Civil War and Reconstruction, lawmakers remedied this circumstance: first in an 1862 opinion from Attorney General Edward Bates, then in the Civil Rights Act of 1866, and finally in the first clause of the Fourteenth Amendment, which installed birthright in the Constitution, guaranteeing that Black people and all those born in the United States were citizens.

Amendments are the Key to Avoiding Constitutional Extinction

Every Fourth of July, Americans celebrate independence, but it might be more significant, more pregnant with meaning, to celebrate amendment — the writing, ratifying and especially the amending of constitutions. Except lately there hasn’t been much to celebrate, with amendment having become a lost art. And a constitution that can no longer be amended is dead.

The U.S. Constitution hasn’t been meaningfully amended since 1971. Congress sent the Equal Rights Amendment to the states for ratification in 1972, but its derailment rendered the Constitution effectively unamendable. It’s not that people stopped trying. Conservatives, especially, tried.

In 1982, President Ronald Reagan endorsed a balanced-budget amendment. In the 1990s, Republicans proposed anti-flag-burning amendments, fetal-personhood amendments and defense-of-marriage amendments. Lately, amendments have been coming from the left. “Nationally, Democrats generally wish to amend constitutions and Republicans to preserve them,” The Economist proclaimed last month, on the same day that California’s Democratic governor, Gavin Newsom, proposed a federal constitutional amendment that would regulate gun ownership. “I don’t know what the hell else to do,” he said, desperate.

The consequences of a constitution frozen in time in the age of Evel Knievel, “Shaft” and the Pentagon Papers are dire. Consider, for instance, climate change. Members of Congress first began proposing environmental rights amendments in 1970. They got nowhere. Today, according to one researcher, 148 of the world’s 196 national constitutions include environmental protection provisions. But not ours. Or take democratic legitimacy. Over the last decades, and beginning even earlier, as the political scientists Daniel Ziblatt and Steven Levitsky point out in a forthcoming book, “The Tyranny of the Minority,” nearly every other established democracy has eliminated the type of antiquated, antidemocratic provisions that still hobble the United States: the Electoral College, malapportionment in the Senate and lifetime tenure for Supreme Court justices. None of these problems can be fixed except by amending the Constitution, which, seemingly, can’t be done.

It’s a constitutional Catch-22: To repair Senate malapportionment, for instance, you’d have to get a constitutional amendment through that malapportioned Senate.

While it’s true that Americans can no longer, for all practical purposes, revise the Constitution, they can still change it, as long as they can convince five Supreme Court justices to read it differently. But how well has that worked out? That’s what happened, beginning in the early 1970s, with abortion and guns, the north and south poles of America’s life-or-death politics, in which either abortion is freedom and guns are murder or guns are freedom and abortion is murder. Chances are that if you like the current court, you like this method of constitutional change and if you don’t like the current court, you don’t like this method. But either way, it’s not a great boon to democracy.

Troublingly, our current era of unamendability is also the era of originalism, which also began in 1971. Originalists, who now dominate the Supreme Court, insist that rights and other ideas not discoverable in the debates over the Constitution at its framing do not exist. Perversely, they rely on a wildly impoverished historical record, one that fails even to comprehend the nature of amendment.

John Roberts's Tragedy is of His Own Making

In June 2012, at the end of a contentious Supreme Court term that decided, among other things, the fate of the Affordable Care Act, Chief Justice John Roberts prepared to leave for Malta, to teach a course on the court. “Malta, as you know, is an impregnable island fortress,” he joked on the eve of his trip. “It seemed like a good idea.”

Eleven years later, Malta no doubt retains its allure. The term that just ended must have been a torment for the chief. The court’s popularity has plunged to record lows; its members bicker on and off the bench; calls for the court to be packed are commonplace. Such circumstances would pain any chief justice, this one more than most. From the start of his tenure in 2005, he has painted himself as an institutionalist whose paramount concern is the court’s integrity. He conducts himself accordingly: He is decorous, almost regal; he speaks of moderation and judicial minimalism. He keeps a sovereign’s distance from modern life. In 1867, in a classic book on the English constitution, Walter Bagehot wrote that in times of change, “the most imposing institutions of mankind” maintain influence by demonstrating an “inherent dignity.” It is ironic, perhaps bitterly so, that a collapse in public esteem has become a hallmark of the Roberts court. Rarely, in recent decades, has the institution seemed less worthy of reverence.

The chief justice is portrayed by some as a tragic figure, powerless to save his court from itself. But the tragedy of John Roberts is that he does have the power to restore some measure of the court’s reputation — he just hasn’t used it. He has attempted, here and there, to restrain the court’s crusaders — by siding with liberals in the Alabama voting rights case, for example, and soundly rejecting the “independent state legislature” theory — but mostly, he has suggested that their methods and conduct are above reproach. His idea of integrity, it turns out, is a brittle thing, and self-defeating. It has put the court’s reputation at greater risk; it has made the court more, not less, vulnerable to public scrutiny and to encroachment by Congress and the White House.

This term will likely be remembered as the year the Supreme Court, led by its chief justice, ended race-conscious admissions at the nation’s colleges and universities. But the larger story of this term has been one of ethical rot and official indifference. Justices Samuel Alito, Neil Gorsuch and Clarence Thomas drew attention — not for the first time — for their close ties to wealthy benefactors who have business before the court. Reports by ProPublica and in The New York Times show justices accepting gifts and blandishments as monarchs might: free vacations at luxury resorts; undisclosed trips on private jets and yachts; and, in Justice Thomas’s case, largess in the form of private-school tuition for a family member, secret real estate deals and donations to pet projects of the justice and his wife, Virginia Thomas. This is hardly a complete list.

As their conduct has grown more unrestrained, so has the tenor of their public statements. Justice Alito’s peremptory, self-exculpatory op-ed in The Wall Street Journal in June, denying even a hint of an appearance of impropriety, was shocking — unless you happen to have caught his comments in the right-wing echo chamber. At conferences and galas, the justice unspools his grievances — against nonbelievers, same-sex marriage, the 21st century — sounding less like a jurist than “a conservative talk-radio host,” as Margaret Talbot wrote in The New Yorker.

America Broke its Own Military Industrial Complex

After Russia invaded Ukraine in February 2022, the United States pledged its “unwavering support for Ukraine’s sovereignty.” This support has materialized in over $75 billion in security assistance to date, with the United States committed to aiding Ukraine until the fighting stops. As U.S. Secretary of State Antony Blinken said in announcing a new installment of weapons to Ukraine: “The United States and our allies and partners will stand united with Ukraine, for as long as it takes.”

These unlimited commitments to furnishing Ukraine with weapons to counter Russian aggression have invoked parallels to World War II. Weeks after the fighting began, the New York Times columnist Paul Krugman argued that the United States and its allies are “serving as the ‘arsenal of democracy,’ giving the defenders of freedom the material means to keep fighting” in Ukraine. The journalist Elliot Ackerman then wrote that the workers building missiles for Ukraine’s defense “are a key component of America’s arsenal of democracy.” President Joe Biden has also embraced the “arsenal of democracy” analogy. When he visited a Lockheed Martin plant in Troy, Alabama, in May last year, Biden told the audience that the United States “built the weapons and the equipment that helped defend freedom and sovereignty in Europe years ago” and is doing so again today.

But this lofty rhetoric does not match the reality on the ground. Shortages in production, inadequate labor pools, and interruptions in supply chains have hamstrung the United States’ ability to deliver weapons to Ukraine and enhance the country’s defense capabilities more broadly. These problems have much to do with the history of the U.S. defense industry since World War II. Creeping privatization during the Cold War, along with diminished federal investment and oversight of defense contracting since the 1960s, helped bring about the inefficiency, waste, and lack of prioritization that are complicating U.S. assistance to Ukraine today.

After the Berlin Wall fell, major players in the U.S. defense industry consolidated and downsized their operations and labor forces. They also pursued government contracts for expensive, experimental weaponry to obtain larger profits to the detriment of small arms and ammunition production. As a result, the industry has been underprepared in responding to the Ukraine crisis and unmoored from the broader national security needs of the United States and its allies. Although reforms are possible, there are no quick fixes to these self-inflicted injuries.

Today’s defense industry bears no resemblance to the U.S. system of military production during World War II. Back then, the industry was predominantly a government-run business. President Franklin Roosevelt’s New Deal emphasized economic regulation and relied on “alphabet agencies” such as the Works Progress Administration to boost employment, paving the way for later wartime contracting. New Deal agencies inspired the creation of the War Production Board in 1942, which mobilized business and rationed resources for the battlefront. Weapons production was concentrated in shipbuilding and aircraft, with companies based mainly in industrial centers in the Northeast and Midwest in government-owned, government-operated facilities known as GOGO plants. The government owned nearly 90 percent of the productive capacity of aircraft, ships, and guns and ammunition. This is in contrast to today’s climate, where commercial items have made up over 88 percent of new procurement awards since 2011, and private capital invests over $6 billion a year in the defense industry.

SCOTUS's Affirmative Action Ruling no Coincidence; Court Seeks to Preserve Power of Small Elite

On Thursday, in a 6-3 decision, the US supreme court ruled against affirmative action in American colleges and universities. The obvious concern now is whether the ruling will significantly reduce the number of Black, Latinx and Indigenous students enrolled at elite institutions. But a more dire reality undergirds the court’s decision: it reflects a decades-long drive to return higher education to white, elite control.

That movement predates affirmative action by at least a century, because no entity impacts American life more than higher education. During the Reconstruction era following emancipation, Black people were allowed to advance in political and various other roles, but white powerbrokers drew a hard line at higher education. On 28 September 1870 the chancellor of the University of Mississippi, John Newton Waddel, declared: “The university will continue to be, what it always has been, an institution exclusively for the education of the white race.”

Waddel was not alone in his appraisal. Following the civil war, many white academic leaders and faculty members believed higher education was designed solely to educate white people. Waddel and other white academics maintained that the University of Mississippi’s faculty “never, for a moment, conceived it possible or proper that a Negro should be admitted to its classes, graduated with its honors, or presented with its diplomas”.

Over the past century, Black Americans’ struggles to secure equal educational opportunity have always been met with white resistance. The recent lawsuits filed by Students for Fair Admissions – an organization led by anti-affirmative-action activist Edward Blum – against Harvard University and the University of North Carolina are not about academic merit or even the mistreatment of white or Asian American students; they are an extension of this movement to ensure American higher education can be used to maintain social norms.

This is why, in defending affirmative action, the argument for campus diversity falls short. Rather than make wealthy, majority-white campuses more diverse, affirmative action was intended to acknowledge and address the nation’s history of racism and atone for past racial harms that disproportionately affected descendants of enslaved Black people.

This was made plain in 1963 – one of the most racially tumultuous years of the civil rights movement. By summer, John F Kennedy – a Harvard University alumnus in his third year in the White House – was forced to take immediate action about racial segregation, in part because it had become a foreign policy embarrassment to the United States that belied the nation’s stated commitment to democracy.
