
Nepotistic Legacy Admissions are the Gaping Loophole of American Meritocracy

It’s odd.

The United Kingdom has a hereditary monarchy and a hereditary aristocracy, but strong norms against nepotism in education and the workplace.

The U.S. is a republic, a nation founded on anti-hereditary principles, where nepotism is not only permitted but codified—most obviously in the practice of legacy preferences in college admissions. My eldest son has two parents who went to the University of Oxford, but if that fact had made a difference to his own chances of getting in, both he and we would have been appalled, as would all the other applicants. (He did not get in.)

....

While Jared Kushner was safely ensconced at Harvard in 2001, Oxford’s Trinity College rejected the application of a wealthy donor’s son, leading to the cancellation of his planned gift. Michael Beloff, the president of the college, contrasted the two countries’ approaches in an op-ed for The Times of London, explaining, “With fewer resources, less space, strict government quotas and substantial dependence on public monies, we do things differently here.”

But this wasn’t always the case. Up until the late 1950s, Oxford colleges gave a very clear preference to the sons of “gentlemen” (i.e., aristocracy) and of alumni. But in the more egalitarian postwar era, this kind of privilege became socially toxic. There was rhetorical pressure from politicians, certainly. The postwar Labour governments of the 1940s and ’50s were determined to make society less constricted by social class, and Oxford and Cambridge were important symbols in this crusade. As early as 1949, Walter Moberly, the chair of the University Grants Committee, lamented the role of Oxbridge in “buttress[ing] the existing social order.” But it was the colleges themselves, sensing a shift in public and political opinion, that ended legacy preferences.

In the U.S., the history of legacy preferences is the other way around. Elite colleges adopted legacy preferences in the early part of the 20th century, largely to keep down the number of Jews filling the lecture halls. In the 1960s, the dean of admissions at Yale, R. Inslee Clark, reduced the weight of legacy status and halved the proportion of legacies in the freshman class, from 24 to 12 percent. Outrage ensued. The conservative author and commentator William F. Buckley Jr. complained that without legacy preference, “a Mexican-American from El Paso High with identical scores on the achievement test … has a better chance of being admitted to Yale than Jonathan Edwards the Sixteenth from Saint Paul’s School.” Clark lost his fight, and today legacy preferences are treated as business as usual.

Even people who feel some qualms about the practice can say, accurately, that “everybody does it.” It’s the norm. And that is what has to change. The reshaping of norms can be a powerful driver of social change, as recent history confirms: Think about smoking, or drunk driving, or attitudes toward same-sex relationships. When norms shift, so does the social acceptance of certain kinds of behavior. What was once “just what everyone does” becomes “simply not done.” Norms are often more powerful than rules and laws because, as the legal scholar Cass Sunstein argues in his book How Change Happens, they are regulated by socio-emotional responses such as “pride (a kind of informal social subsidy) and guilt or shame (a kind of informal social tax).”

Read entire article at The Atlantic