


Why we’re letting Americans vote, marry and drink far too young

Roundup
tags: public history, voting history



Holly N.S. White is assistant editor at the Omohundro Institute of Early American History and Culture and author of "Negotiating American Youth: Age, Law, and Culture in the Early Nineteenth Century" (forthcoming, University of Virginia Press).

Does it make sense to tie rights and privileges — to vote, to marry, to drink — to age?

Age is a biological reality. Science has shown that milestones of skeletal, reproductive and mental growth occur around predictable ages. For example, carpal bones in the wrist begin to overlap at the age of 5 for girls and 7 for boys; most women complete puberty by 14 and men by 16; wisdom teeth show up by 21; and our brains reach full maturity at 25.

But environmental factors, nutrition and psychological traumas have been shown to stunt or accelerate a person’s physical and mental development. And historians, alongside sociologists and psychologists, have consistently observed that the significance of a person’s age is a social and cultural construct informed by race, gender, class, religion and geographical affiliation.

So why, despite an abundance of evidence showing that it is unwise to use age to define maturity, does our age-based legal system persist?

Because assumptions about age are a legal pillar of American society reaching back almost 250 years to our nation’s founding. Age laws were a tool to advance ideas about equality and fairness, and they persist because, on the surface, they seem to work well. It’s only when we stop to consider the science behind age, or how race, class and gender intersect with the application of these laws, that we see how troubling their existence is in 2019.

At 18, Americans can claim legal independence from their parents or guardians and vote in federal elections; at 21, they can drink alcohol. These age-based laws operate as rites of passage for American youths as they become adults. But they reflect 18th-century definitions of maturity and age, rather than scientific understandings of how age and growth actually work.

Americans’ earliest legal-age-related concerns revolved around when a person should be permitted by law to marry, consent to contract, vote, and testify in court, as well as when they became culpable for crimes. Using age to set these boundaries was an explicit rejection of the British system, in which inherited status governed privileges. Americans wanted to build a system governed by experience and informed consent, one with seemingly neutral markers for attaining the rights of citizenship. While not everyone could acquire land or an education, theoretically, everyone could reach the age of 18 or 21.

This system might have achieved its promised equality had it applied only to white men, the people whose rights were paramount in the minds of the framers of these new laws. But when applied to the entire population, it fell woefully short. The rigid racial and gender hierarchies that prevailed for much of U.S. history interacted with this system in complicated ways that to this day result in racially biased juvenile criminal sentencing and an outrageously high number of girls marrying before they reach adulthood, to give just two examples.

During the 19th century, new ideas and scientific understandings about childhood and its fundamental differences from adulthood emerged. By the early 20th century, states began to turn these ideas into new laws meant to shield children from the growing demands of an industrial society. Laws regulating workplace hours and conditions, requiring children to attend school and protecting them from abuse and neglect all shifted the responsibility of children’s socialization and safety from the family to the state. These evolved understandings also resulted in the formation of new institutions like juvenile courts.

Read entire article at Washington Post
