Nepotistic Legacy Admissions Are the Gaping Loophole of American Meritocracy
tags: higher education, meritocracy, class, nepotism, inequality
Richard V. Reeves is a senior fellow at the Brookings Institution, where he directs the Future of the Middle Class Initiative.
The United Kingdom has a hereditary monarchy and a hereditary aristocracy, but strong norms against nepotism in education and the workplace.
The U.S. is a republic, a nation founded on anti-hereditary principles, where nepotism is not only permitted but codified—most obviously in the practice of legacy preferences in college admissions. My eldest son has two parents who went to the University of Oxford, but if that fact had made a difference to his own chances of getting in, both he and we would have been appalled, as would all the other applicants. (He did not get in.)
While Jared Kushner was safely ensconced at Harvard in 2001, Oxford’s Trinity College rejected the application of a wealthy donor’s son, leading to the cancellation of his planned gift. Michael Beloff, the president of the college, contrasted the two countries’ approaches in an op-ed for The Times of London, explaining, “With fewer resources, less space, strict government quotas and substantial dependence on public monies, we do things differently here.”
But this wasn’t always the case. Up until the late 1950s, Oxford colleges gave a very clear preference to the sons of “gentlemen” (i.e., aristocracy) and of alumni. But in the more egalitarian postwar era, this kind of privilege became socially toxic. There was rhetorical pressure from politicians, certainly. The postwar Labour governments of the 1940s and ’50s were determined to make society less constricted by social class, and Oxford and Cambridge were important symbols in this crusade. As early as 1949, Walter Moberly, the chair of the University Grants Commission, lamented the role of Oxbridge in “buttress[ing] the existing social order.” But it was the colleges themselves, sensing a shift in public and political opinion, that ended legacy preferences.
In the U.S., the history of legacy preferences is the other way around. Elite colleges adopted legacy preferences in the early part of the 20th century, largely to keep down the number of Jews filling the lecture halls. In the 1960s, the dean of admissions at Yale, R. Inslee Clark, reduced the weight of legacy status and halved the proportion of legacies in the freshman class, from 24 to 12 percent. Outrage ensued. The conservative author and commentator William F. Buckley Jr. complained that without legacy preference, “a Mexican-American from El Paso High with identical scores on the achievement test … has a better chance of being admitted to Yale than Jonathan Edwards the Sixteenth from Saint Paul’s School.” Clark lost his fight, and today legacy preferences are treated as business as usual.
Even people who feel some qualms about the practice can say, accurately, that “everybody does it.” It’s the norm. And that is what has to change. The reshaping of norms can be a powerful driver of social change, as recent history confirms: Think about smoking, or drunk driving, or attitudes toward same-sex relationships. When norms shift, so does the social acceptance of certain kinds of behavior. What was once “just what everyone does” becomes “simply not done.” Norms are often more powerful than rules and laws because they are regulated by, as the legal scholar Cass Sunstein argues in his book How Change Happens, socio-emotional responses such as “pride (a kind of informal social subsidy) and guilt or shame (a kind of informal social tax).”