The World Turned Inside Out: American Thought and Culture at the End of the 20th Century
But Roger Kimball, who coined the term “tenured radicals,” was on to something when he concluded that “the real battle that is shaping up is not between radicals and conservatives but between radicals and old-style liberals.” Any inspection of the battlefield bears out his conclusion. Just look at the most consequential examples. They all involve committed, activist, academic Leftists going after what they perceived as radicals who challenged Enlightenment ideals. Think of Nancy Fraser, Seyla Benhabib, and Martha Nussbaum going after Judith Butler. Think of Alan Sokal, Richard Rorty, and In These Times going after the “Cultural Left” represented by “assassins of objectivity” like Andrew Ross. Think of Fredric Jameson, David Harvey, and Frank Lentricchia going after postmodernism and post-structuralism; to be sure, all three were amused by dropouts from the Frankfurt School like Jacques Derrida and Michel Foucault, but they were also appalled.
The critique of “tenured radicals” was, then, an intramural sport on the Left in the late-20th century. It was most definitely not a vast right-wing conspiracy—but it was in many ways conservative, because it sought to rehabilitate 19th-century notions of individualism, agency, and objectivity. There was, in fact, a right-wing critique of academic excess, but its producers lived in a state of exile, far from the debauched ivory tower and in protest against the reckless hedonism of the larger society; this self-imposed distance made their complaints sound as if they were radically distorted by the wrong amplifier.
For example, Robert Bork, the outspoken jurist, claimed in 1996 that the only conceivable explanation for a progressive income tax was envy; that gun control laws were the cause of violent crime, because they disarmed the law-abiding population (if criminals knew everybody carried a gun, they would be deterred from using their own); that the Declaration of Independence was a big mistake because, by enfranchising the insane individualism of the Enlightenment, it ignored the problem of social order; and that—oops—Western civilization was to blame for what happened in the abominable 1960s.
No, really. Here is what Bork sincerely stated in Slouching Towards Gomorrah: Modern Liberalism and American Decline (1996): “This [the shrinking number of required courses in college curricula after 1914] confirms a pattern repeatedly suggested in this book: trends slowly moving through an area of life, in this case higher education, until the Sixties when those trends accelerated rapidly. This [antecedent unknown] suggests, as noted earlier, that we would in any event have eventually arrived where the Sixties took us but perhaps two or three decades later. Which [antecedent absconded] in turn suggests that we are merely seeing the playing out of qualities—individualism and egalitarianism—inherent in Western civilization and to some degree unique to that civilization.”
Bork probably should have let Lynne Cheney read the manuscript, especially since they overlapped at the American Enterprise Institute in the early 1990s. She could have warned him off this alarming attack on the pillars of western civilization. The world-weary William Bennett, Ronald Reagan’s Secretary of Education, then George Bush’s “drug czar,” and then Bill Clinton’s scold-in-chief, could have, too. In The De-Valuing of America, a book of 1992, Bennett, like Cheney, insisted that “we must study, nurture, and defend the West”—that is, Western civilization—in large part because “the West is good.” In the same book, he praised Judge Bork as “perhaps the finest legal mind in America.” Yet he chose to endorse Bork’s doubts about both the egalitarian imperatives of the Declaration and the individualistic premises of Western civilization. Indeed, Bennett’s paperback blurb for Slouching Towards Gomorrah makes it sound like a masterpiece of either social history or pornography: “A brilliant and alarming exploration of the dark side of contemporary American culture.”
The culture wars inflamed by such determined combatants were fought, for the most part, on a battlefield controlled by the Left—that is, on campus, in the new public sphere of higher education. That is why Robert Bork and William Bennett—Mrs. Cheney, too—were railing against a culture galvanized, since the 1960s, by the campuses. Their purpose was to reform (or debunk) the university, and thus to redeem that larger culture. But the evidence suggests that Americans needed no prodding from tenured radicals as they moved in the 1980s and 90s toward acceptance of equity between races, classes, genders, and sexual preferences, on their way to bemused tolerance of almost anything, including animal rights.
To be sure, education as such remained the object of cultural critique from self-conscious conservatives. Pat Robertson, for example, the televangelist and presidential contender—he participated forcefully in the Republican primaries and debates of 1988—claimed that the public school system in the U.S. was “attempting to do something that few states other than the Nazis and Soviets have attempted to do, namely, to take the children away from the parents and to educate them in a philosophy that is amoral, anti-Christian and humanistic and to show them a collectivist philosophy that will ultimately lead toward Marxism, socialism and a communistic type of ideology.” Jimmy Swaggart, another televangelist, was more succinct: “the greatest enemy of our children in this United States. . .is the public school system. It is education without God.”
Even so, the larger culture exhibited unmistakable signs of rapid, irreversible, and enormous change. The educators themselves, high and low, were being educated by that change. How, then, should we summarize that change—how should we gauge its symptoms?
One way is to import Samuel P. Huntington’s notion of a “clash of civilizations” as the characteristic divide of the late-20th century. Huntington was the Harvard political scientist who made his political bones back in the 1960s by planning “forced draft urbanization” in Vietnam—if the peasants aren’t out there in the countryside helping the guerrillas (the Viet Cong) because you have removed them to existing cities or concentrated them in newly constructed “urban centers,” he surmised, you can then stage direct military confrontations between American soldiers and communist insurgents. In keeping with his policy-relevant duties in the aftermath of Vietnam, he suggested in a 1996 book that impending global conflicts would turn on cultural (read: religious) divisions rather than the older divisions of political economy, which had placed capitalism and socialism at the opposite extremes of diplomatic decisions and developmental options. The domestic analogue would be the so-called culture wars, which dispensed, for the most part, with arguments about economic arrangements and instead engaged the problems of moral values, civic virtues, and familial integrity—“cultural indicators,” as Bennett called them in a flurry of articles and books.
Another way to summarize the same great divide is to enlist Daniel Bell, and to propose that the “cultural contradictions of capitalism,” as he calls them, reached their apogee in the early 1990s, when the so-called culture wars got formally declared. In the sequel to The Coming of Post-Industrial Society, Bell argued that the American social structure—the mundane routine of work, family, and daily life—“is largely bourgeois, moralizing, and cramped,” the arid domain of “traditional values,” but meanwhile the culture “is liberal, urban, cosmopolitan, trendy, fashionable, endorsing freer lifestyles, and permissive.” In other words, the bourgeois values inherited from the 19th century became problematic if not merely obsolete in the post-industrial rendition of 20th-century consumer capitalism. To borrow the terms invented by Raymond Williams, the residual (bourgeois) culture was still committed to deferring gratification, saving for a rainy day, and producing character through hard work, whereas the dominant (capitalist?) culture was already animated by a market-driven hedonism—a “consumer culture”—in which such repressive commitments seemed quaint.
But notice that the Cultural Left ensconced in the universities was aligned with the bohemian values validated by a post-industrial consumer capitalism, whereas the New Right enfranchised by the churches and the think tanks was opposed to these same values. In this sense, once again, conservatism in the late-20th century was not a blanket endorsement of what free markets make possible; like the radicalism of the same moment in American history, it was a protest against the heartless logic of the market forces created and enforced by consumer capitalism.
From either standpoint, however, Huntington’s or Bell’s, we witness a 19th-century version of self, family, and nation competing with a 20th-century version. From either standpoint, bourgeois society struggles to survive against the global tentacles of post-industrial consumer capitalism. Perhaps the impending conclusion of this struggle, the impending extinction of bourgeois society, is what we mean—and is all we can mean—by the end of modernity. The modern world, the “era of the ego,” was, after all, created by bourgeois individuals eminently capable of deferring gratification.
But most Americans were not reading Huntington or Bell in the 1980s and 90s. Nor were they using Judith Butler’s post-structuralist vocabulary to understand what was happening to them. How, then, did they experience and explain the end of modernity? The question can be asked in more specific ways. Were these academic theorists just making it up? Or were they making sense of new realities—of fundamental changes? Was there a colloquial, vernacular idiom in which these changes were anticipated, recorded, codified? To answer, let us revisit some hugely popular movies of the late 20th century—let us see what they require us to experience and explain—and then, in the next chapter, turn to some equally popular cultural forms, TV and music.
Big Movies, Big Ideas
To begin with, let us have a look at “The Matrix,” “Terminator II,” and “Nightmare on Elm Street,” each a part of a movie “franchise” in which increasingly intricate—or ironic—sequels retold the same story from new angles. The preposterously complicated plot of the original “Matrix” (1999) is almost beside the point. But for those of you who haven’t seen it, here goes. In a post-holocaust future that resembles the scorched earth of the “Terminator” movies, machines have taken over the world: technological hubris has finally put an end to progress. Human beings have been reduced to dynamos whose metabolism is converted into the energy the machines need to—what?—go about their evil business. These benighted human beings just think that they’re going to work on those familiar city streets (the “city” looks like Chicago). In fact, they’re only holograms projected by the machines to keep their energy source happy. As in the “Nightmare” franchise, appearance and reality are identical, at least in the beginning.
But an underground movement exists to wake these unwitting creatures up, by bringing them out of the Matrix and teaching them how to fight the power on its own holographic terms. This movement recruits “Neo,” a young blank slate of a man—played of course by Keanu Reeves, a young blank slate of a man—onto whom the underground leader has projected his millennial ambitions. “Neo” (his screen name) turns out to be the “chosen one” after all; he quickly surpasses his teacher, the leader, and becomes a virtual martial artist who kicks virtual ass.
“Neo” learns to enter and disable the Matrix, thus revealing the awful reality beneath the normal, hopeful images that sustain the physical life of the dynamos down on the energy farm. The assumption here is that human beings can’t stay alive without hopes and dreams: if they knew that they were merely cogs in a vast energy-producing machine, they would surely die. By the same token, if they lived in a perfect world, they would know from their experience of western religion—which insists that you can’t get to heaven until your body expires—that they were dead. In both settings, they would be unhappy, but their hopes for a brighter future that is somehow different from the abiding present would keep them alive; the evil designers of the Matrix introduce imperfection into the grid when they realize this simple truth of human nature.
In “The Matrix,” the artificial finally overpowers the real, or rather the natural; meanwhile the expectation of an end to the illusions of the holographic world finally becomes a religious urge that displaces any residual pretense of science fiction. The monstrous agents of the Matrix are shape-shifting, indestructible machines that inhabit and impersonate human beings. But “Neo” has no oppositional force or effect against them unless he’s “embodied” as a slice of computer code and inserted into a holographic “reality”—until he’s “embodied” in recognizable human form as a part of a machine. And his triumph over these agents of dehumanization is a result of his belief in himself as the messiah (the “chosen one”), which requires first a consultation with “the Oracle”—a woman who, by the way, inhabits the Matrix, not the scene of resistance—and then the loss of his corporeal form. At any rate the laws of gravity and mortality no longer apply to our hero by the end of this movie: he has become a god-like creature who can soar like Superman.
“Terminator II” (1991; the original was 1984) has no less of an appetite for biblical gestures and sacrificial rites. But the cyborg from the future who helps save the world from the bad machines isn’t an immaterial, possibly immortal presence like “Neo.” He’s always embodied. And even though he’s mostly machine—his apparent humanity is only skin-deep—he’s a better father to young John Connor, the leader of the coming rebellion against “Skynet,” than anyone else in view. “This machine was the only thing that measured up,” his mother, Sarah, says while watching the son and the cyborg do manly, mechanical things under the hood of a car.
In the original installment of the franchise, Sarah Connor is impregnated by a soldier sent back from the post-apocalyptic future to protect her from the cyborg intent upon killing her; somehow everybody knows that her offspring will some day lead the rebellion against the machines. In “Terminator II,” the stakes are even higher. Sarah wants to abort the apocalypse, and her son pitches in with the help of the same model of cyborg that, once upon a time, came after his mother. In aborting the apocalypse, Sarah is of course relieving her son of his heroic duties in the dreaded future—in the absence of real fathers in the flesh, after all, mothers have to do what’s right.
The apocalypse is finally aborted in three strokes. The Connors and their protector destroy the computer chip from the original cyborg of “T1,” which has fueled research and profits at the malevolent corporation that invented “Skynet,” the digital universe of knowledge to be captured by the bad machines on August 29, 1997. Then they defeat a new, more agile and flexible cyborg sent back to kill young John by dipping the thing in molten metal—the end of the movie is shot in what looks like a cross between a foundry and a steel plant, both throwbacks to an imaginary, industrial America where manly men worked hard and earned good pay (Freddy Krueger stalks his teenage victims in a strikingly similar dreamscape, as if what haunts them, too, is an irretrievable and yet unavoidable industrial past). Finally, the old, exhausted, even dismembered protector cyborg lowers himself into the same vat of molten metal that had just dispatched his robotic nemesis, thus destroying the only remaining computer chip that could restart the train of events that led to “Skynet.”
So the narrative alternatives on offer in “T2” are both disturbing and familiar: Dads build machines—or just are machines—that incinerate the world, or they get out of the way of the Moms. Like the cowboys and outlaws and gunfighters of the old West, another imaginary landscape we know mainly from the movies, such men might be useful in clearing the way for civilization, but they probably shouldn’t stick around once the women and children arrive.
The endless sequels to the original “Nightmare on Elm Street” (1984) follow the trajectory of the “Terminator” franchise in one important respect—the indomitable villain of the 1980s evolves into a cuddly icon, almost a cult father figure, by the 1990s. But the magnificent slasher Freddy, who punctures every slacker’s pubescent dreams, always preferred the neighborhood of horror, where apocalypse is personal, not political: it may be happening right now, but it is happening to you, not to the world.
Here too, however, the plot is almost irrelevant because it turns on one simple device. It works like this. The violence and horror of your worst nightmares are more real than your waking life; the dreamscapes of the most insane adolescent imagination are more consequential than the dreary world of high school dress codes and parental aplomb: welcome to Columbine. Freddy teaches us that the distinction between appearance and reality, the distinction that animates modern science—not to mention the modern novel—is not just worthless, it is dangerous. If you don’t fight him on his own post-modern terms, by entering his cartoonish space in time, you lose your life. If you remain skeptical, in the spirit of modern science or modern fiction, you lose your life.
The enablers of every atrocity in sight are the parents and the police (the heroine’s father, for example, is the chief of police), who are complacent, ignorant, and complicit, all at once. They killed the child molester Freddy years ago, when he was freed on a legal “technicality”—or at least they thought they killed him—and so his revenge on their children seems almost symmetrical: the vigilantes in the neighborhood are now victims of their own extra-legal justice. And their hapless inertia in the present doesn’t help the kids. In fact, their past crimes have disarmed their children. The boys on the scene aren’t much help, either—they’re too horny or too sleepy to save anybody from Freddy’s blades, even when the girls explain what will happen if they don’t stand down, wake up, and get right with their bad dreams.
The Cultural Vicinity of the Matrix
So what is going on in the cultural vicinity of these hugely popular, truly horrific scenarios? At least the following. First, representations are reality, and vice versa. The world is a fable, a narrative machine, and that’s all it is. The directors of “The Matrix” make this cinematic provocation clear by placing a book in the opening sequences—a book by Jean Baudrillard, the French theorist who claimed a correlation between finance capital and the historical moment of “simulacra,” when everything is a copy of a copy (of a copy), not a representation of something more solid or fundamental. At this moment, the reproducibility of the work of art becomes constitutive of the work as art: nothing is unique, not even the artist, and not even you, the supposed author of your own life. Everything is a sign of a sign. The original “Nightmare” had already proved the same post-modern theorem with more gleeful ferocity and less intellectual pretension: it performed the same filmic experiment and provided the same audience experience. “T2” accomplishes something similar by demonstrating that the past is just as malleable as the future: again, the world is a fable waiting to be rewritten.
Second—this follows from Baudrillard’s correlation of finance capital and the historical moment of “simulacra”—the world is, or was, ruled by exchange value, monopoly capital, and their technological or bureaucratic offspring. The apocalypse as conceived by both “The Matrix” and “T2” is a result of corporate-driven greed (in the latter, the war that arms the machines is fought over oil). An ideal zone of use value beyond the reach of the market, a place where authentic desires and authentic identities are at least conceivable, is the coast of utopia toward which these movies keep tacking. The movies themselves are of course commodities that could not exist without mass markets and mass distribution; but there is no hypocrisy or irony or contradiction lurking in this acknowledgment. Successful filmmakers understand and act on the anti-capitalist sensibilities of their audiences—none better than Steven Spielberg. Even so, they know as well as we do that there’s no exit from the mall, only detours on the way.
Third, the boundary between the human and the mechanical, between sentient beings and inanimate objects, begins to seem arbitrary, accidental, inexplicable, and uncontrollable. “Blade Runner” (1982) and “Robocop” (1987), perhaps the two best movies of the 1980s, are testament to this perceived breakdown of borders, this confusion of categories: the good guys here are conscientious machines that are more human than their employers. That these heroes are both victims of corporate greed and street gangs does not change the fact that, like the tired old cyborg of “T2,” their characters and missions were lifted directly from the westerns of the 1930s and 40s—they’re still figuring out what it means to be a man while they clean up Dodge City, but now they know that machines, not lawyers, might do the job better. Again, the artificial overpowers the natural and remakes the world. A fixed or stable reality that grounds all representation and falsifies mere appearance starts to look less detailed, and to feel less palpable, than the imagery through which we experience it; or rather the experience just is the imagery. So the end of Nature, conceived by modern science as an external world of objects with its own laws of motion, is already at hand, already on display. The world has been turned inside out.
That is why the eviscerated world on view in these movies seems “post-historical”: technological progress can no longer look like the horizon of expectation, not even for the citizens of the most advanced capitalist nation on the planet. Even here the machines are taking over, downsizing every sector, but particularly manufacturing, making good jobs in the factory or the foundry—or for that matter in the back offices—a thing of the past. When the machines do everything, the prospect of getting a better job than your father (if you have one) becomes unlikely, and the prospect of human civilization looks no better than bleak. Put it another way. If the future of “Man” doesn’t look so good because the difference between sentient beings and inanimate objects has become arbitrary, accidental, inexplicable, and uncontrollable, the future of men looks even worse.
Fourth, the self, the family, and perhaps the nation are at risk in a world ruled by “simulacra”—that is, where you can’t specify the difference between appearance and reality, between machines and men, or when you realize that everything, maybe even your own self, is a sign of a sign. We’ve already noticed that John Connor’s adoptive father is a cyborg; and we’ve noticed that the parents in the original “Nightmare” are a big part of the problem our pubescent heroine faces.
We should also notice that only two of the small band of heroes that recruits “Neo” to the cause have been born outside the Matrix—you can tell because they don’t have metal inserts in their necks and arms—but there’s no explanation of who Mom and Pop are, except that, like the leader, they’re African-American. This is a family? We must assume so, because these two are designated as “brothers.” Meanwhile the others are trying to figure out where they begin and the computer code ends (we in the audience are as well, especially when the traitor decides he wants to go back into the Matrix and eat virtual steak). Their creaky old craft—it, too, looks like a remnant of industrial America—is named “Nebuchadnezzar,” after an ancient king of Babylon who had conquered Judaea in accordance with a cranky God’s wishes, but the key to their survival is “Zion,” the mainframe that unites the resistance.
This naming of the thing that keeps them together is significant because it is shorthand for a nation that is imminent but not extant—it’s an idea whose time has not yet come, as in the “promised land.” The question it raises is, how are we to imagine a productive relation between these three categories (self, family, nation) now that we have put them at risk, that is, in motion, in cyberspace, where the weakened condition of a fixed, external reality loosens all ties?
So the end of modernity was not the intellectual property of academics isolated in their ivory tower, lacking any connections to the “real world.” It was deeply felt and widely perceived in the popular culture organized by film (and by TV and music, of course, which we’ll get to later). One way to measure the breadth of this feeling, this perception, is to notice how it informed really bad movies as well as really good ones, and how it reanimated—in the most literal sense—the politics of cartoons. Or, to put it in the terms proposed by Carol Clover, the brilliant analyst of horror films, one way to measure the widespread panic induced by the end of modernity is to watch how the thematics and sensibilities of truly awful movies entered, and altered, the mainstream.
Let’s begin with the panic.
Many historians and critics have pointed to the profound sense of an ending that permeated American film of the 1970s, 80s, and 90s. But it was not just the American century that was waning in Oscar-winning movies like “The Deer Hunter” (1978), which dramatized the military defeat of the US in Vietnam as a crushing blow to American manhood. The fin-de-siecle feeling built into the approach of a new millennium was compounded and amplified by realistic reports—and hysterical fears—of pervasive criminality, random yet universal violence, maybe even ineradicable evil; by the decline of patriarchy which accompanied the decomposition of the traditional nuclear family and the de-industrialization of the American economy; by the rise of the new, “post-feminist” woman whose bodily integrity, moral capacity, and sexual autonomy were validated by the Supreme Court in the Roe v. Wade decision of 1973, then contested by the emergence of the religious right; by corporate malfeasance, government corruption—incessant scandal, public and private—from Watergate to Gary Hart, on toward Iran-Contra and the dangerous liaisons of the Clinton years; by damning revelations of the uses to which American power had been put during and after the Cold War from Iran to Chile to Nicaragua, where revolutions in the 1970s were designed to discredit and disarm the Great Satan, the Whited Sepulchre based in Washington, D.C.; and by the public, determined, sometimes flamboyant display of homosexual affection and solidarity in the name of gay rights, a movement both complicated and magnified in the 1980s by the eruption of a deadly new sexually transmitted disease, HIV-AIDS.
When everything—law and order, manhood, fatherhood, womanhood, family, heterosexuality, even national honor—is ending, the apocalypse is now. At any rate that is the feeling that permeates the atlas of emotion etched by American culture in the late-20th century. To illustrate this feeling, let us take a look at what happens generically in American film from the late 1970s to the late 1990s.
Probably the most important trend is the ascendance of the horror genre, in all its weird permutations (slasher, possession, occult, etc.). It remained a lowbrow, B-movie genre from the early 1930s into the 1970s, but then, with the rapid expansion of the “Halloween” (1978) and “Friday the 13th” (1980) franchises in the 1980s, it became the stuff of blockbuster box office. As Mark Edmundson and others have noted, when “Silence of the Lambs,” a tasteful, muted, sublimated—almost stuffy—slasher film won the Oscar for Best Picture in 1991, horror had become the mainstream of American film. It had meanwhile reshaped every other genre, even westerns, as for example Clint Eastwood’s “High Plains Drifter” (1973).
Another important trend is an integral part of the ascendance of the horror genre. Where once female protagonists were hapless victims of violence unless they could rely on their fathers, husbands, and brothers—or the law—to protect them from the slashers, psychopaths and rapists, they now take the law into their own hands, and exact a new kind of revenge on a world of pervasive criminality coded as male. Here the thematic movement “from the bottom up,” from truly awful to pretty good movies, is unmistakable. A terrifically bad movie called “I Spit on Your Grave” (1978) first installs the female victim of rape in the role of single-minded avenger, for example, and it thereafter presides, in spirit, over more polished, upscale films like “Silence of the Lambs.”
Yet another important trend in late-20th century movies is the hypothesis that the family as such is dysfunctional, perhaps even destructive of social order and individual sanity. As Robin Wood has argued, the horror genre is the laboratory in which this indecent hypothesis has been tested most scientifically, from “The Texas Chain Saw Massacre” (1974) to “The Omen” (1976) and “Poltergeist” (1982), all movies about families permeated or penetrated by unspeakable evil—families confused by the modern liberal distinction between private and public spheres. But the return of the repressed gangster begun by “The Godfather” cycle in the 1970s, magnified in the 1983 remake of “Scarface”—the original appeared in 1931—and completed by “The Sopranos” on cable TV in the late 1990s, also demonstrated, in the most graphic terms, that strict devotion to family makes a man violent, paranoid, and finally unable to fulfill his obligations to loved ones.
If all you inhabit or care for is your family, both these genres keep telling us, you are the most dangerous man alive. At the very least you’ll forget your loyalties to a larger community, contracting your commitments until they go no further than the boundary of your own home; at that point, you will have destroyed your family and broken the rules that regulate life out there where everybody else lives. But how do you situate your self in relation to a larger community—to the state, the nation—in the absence of this middle term, the family? It was an urgent political question in late-20th century America, as the decomposition of the traditional nuclear family accelerated, and it was raised most pointedly on screen, by a culture industry supposedly out of touch with “traditional values.”
A fourth important trend in the movies of the late-20th century is an obsession with the ambiguities and the consequences of crime. Film noir of the 1940s and 50s was predicated on such ambiguities and consequences, of course, but the sensibility of that moment seems to have become a directorial norm by the 1980s. The difference between the good guys and the bad guys is at first difficult to discern in the battle between the criminals and the deputies staged by Arthur Penn in “Bonnie and Clyde” (1967), in part because it is clear from the outset that our heroes are deranged. It gets more and more difficult in the 1970s and 80s, when Clint Eastwood’s “Dirty Harry” franchise makes the detective less likable than his collars; when drug dealers, pimps, and prostitutes become lovable characters in “blaxploitation” movies (“Sweet Sweetback,” “Shaft,” “Superfly”); when gangsters become the unscrupulous yet dutiful bearers of the American Dream (“The Godfather,” “Scarface”); when Custer’s Last Stand becomes a monument to imperial idiocy (“Little Big Man”), even as the Indians become the personification of authentic America (“Dances With Wolves”); when the origin of civic renewal is a crime that appears as both domestic violence and foreign policy—it begins as incest and ends as the colonization of what was once exterior to the city fathers’ domain (“Chinatown”); and when the assassination of a president becomes comparable to the “secret murder at the heart of American history” (“JFK”: this is the District Attorney talking to the jury!).
That not-so-secret murder is of course the American Dream itself—the dream that allows you to become father of yourself, to cast off all the traditions and obligations accumulated in the “Old World,” to treat the past as mere baggage. If you are father to yourself, you don’t have a father except yourself: you don’t have a past to observe or honor, or, more importantly, to learn from. But when you’re on your own in this fundamental sense, as Americans like to be, you lean toward radical visions of the future and radical resolutions of problems inherited from the past. As D. H. Lawrence noted in his studies of classic American literature almost a hundred years ago, the masterless are an unruly horror.
And when you know that every cop is a criminal—and all the sinners saints—sympathy for the devil becomes your only option as a viewer of movies. The lawful and the unlawful intersect in startling ways in this social and cultural space. So do the natural and the supernatural, as witness Quentin Tarantino’s easy transition from “Reservoir Dogs” (1992) and “Pulp Fiction” (1994)—movies about the redemption of the most callous criminals—to the vampire flick written with Robert Rodriguez, “From Dusk ‘Til Dawn” (1996), a movie that mixes so many genres it seems as contrived as a cocktail invented in SoHo. Witness as well the epidemic of celestial messengers, angry demons, impossible conspiracies, and talented witches on TV after Tony Kushner, an avowed Marxist, won the 1993 Pulitzer Prize for his two-part Broadway play, “Angels in America.” “Buffy the Vampire Slayer” was waiting in the wings, Stage Left. “The X-Files” entered earlier, Stage Right.
One more important trend, which tracks the other four quite closely, is the remarkable increase in spectacular violence done to heroes, victims, and villains alike. The analogy with video games is not very useful on this score, however, because the recipient of excruciating violence in the movies of the late-20th century is typically a female who then exacts revenge (“I Spit on Your Grave,” “Ms. 45”), or a male who revels in the physical torture he’s “taking like a man,” presumably because this debilitating experience equips him with the moral authority he will later need to vanquish the enemy without ceremony or regret. The “Rocky” (1976) and the “Rambo” (1982) franchises sponsored by Sylvester Stallone are the founding fathers of the latter movement, in which masochism finally becomes unmistakably male.
The “Lethal Weapon” (1987) franchise animated by Mel Gibson’s jittery impersonation of Norman Mailer’s “White Negro”—Gibson’s cop character has to teach his African-American partner (Danny Glover) how to live in the moment, how to be existential if not suicidal—is the parallel film universe in which guys get crazy because they have to, because the world has excluded them from the theater of good wars and good jobs, where boys once learned how to be men. “Fight Club” (1999) is the final solution to this fear of male irrelevance, and the apogee of male masochism at the movies. In its moral equivalent of war, men keep trying to mutilate themselves, but we know it’s OK because they use their bare hands: until the ugly and inexplicable ending, they’re purposeful artisans, not mindless machine herds.
Experience and Explanation at the Cineplex
Let us work backward in this list of filmic trends of the late-20th century, to see if we can make historical sense of them, to see if they have anything in common. The increase of spectacular violence at the movies has of course been explained as a result of the recent decline in the median age of the audience—adolescents, it is said, have always experienced the onset of their pubescence and then their reluctant graduation to adulthood in the unholy images of dismemberment. More scenes of carnage, more rivers of blood, are what these hormone-fueled maniacs need, and what Hollywood gladly delivers. It is an argument that works pretty well until you realize that adults still buy more tickets than the teenage crowd, and that the violence on view increased exponentially in every genre toward the end of the 20th century, beginning with westerns and war movies—for example, “The Wild Bunch” (1969), “Apocalypse Now” (1979), “Platoon” (1986), and “Saving Private Ryan” (1998)—where teenagers did not tread unless accompanied by their parents.
The better arguments are offered by film theorists who suggest that the extreme fury inflicted on the human body in the movies since the 1970s should be understood in terms of a general unsettlement of subjectivity—of selfhood—and who suggest that, by the late-1980s, the signature of this unsettlement had become male masochism. In The Philosophy of Horror, a groundbreaking book of 1990, Noel Carroll suggests that the ever more elaborate violence visited upon the characters of his favored genre constitutes an “iconography of personal vulnerability.” Horror as such, he insists, is “founded on the disturbance of cultural norms.” The late-20th century festival of violence in movies is, then, a visual depiction, a pictorial externalization, of the anxieties necessarily attached to the end of modernity, when “an overwhelming sense of instability seizes the imagination in such a way that everything appears at risk or up for grabs.” But the crucial cultural norm in question is the father of himself—the modern individual, the American Adam.
That is why Carroll correlates the “death of ‘Man’” postulated by post-modern theory with the “demotion of the person” expressed by the extraordinary violence of recent horror film—the popular, colloquial, vernacular version of academic elocution can be seen at the Cineplex, he suggests, long before (or after) you are forced to read Foucault and Derrida by your demented professors. Carroll summarizes his argument as follows: “What is passing, attended by feelings of anxiety, is the social myth of the ‘American’ individualist, which, in the case of horror, is enacted in spectacles of indignity, [and is] directed at the body.” What is passing, right before our very eyes in the artificial night of the local theater, is that remnant of the 19th century, the bourgeois proprietor of himself. It is a violent business, this cinematic execution of our former self, and it can never be finished. No wonder we want to prolong the agony on screen.
What is also “passing” in the torrent of violence that floods every genre in the late-20th century, is manhood as it was conceived in the “era of the ego,” ca. 1600-1900, as it was then embalmed in the canonical novels and the literary criticism of the 1920s—Ernest Hemingway and Lewis Mumford come to mind—and as it was reenacted in movies, mainly westerns, of the 1930s, 40s, and 50s. The strong, silent types who inhabited that imaginary American space west of everything give way, by the 1980s and 90s, to male leads who are anything but. All they want is to talk about their psychological afflictions, as if we—the audience—can cure them. Tony Soprano is the culmination of this cinematic species. And it is no accident that the back story informing every episode is Tony’s search for meaning in a world turned inside out by race and gender (“Woke up this morning, the blues [that is, the blacks] moved in our town,” as the song goes over the opening credits). For it is here, in the world of therapy and thus the language of psychoanalysis, that the problem of male masochism at the movies becomes visible and, in the most old-fashioned sense, remarkable.
Kaja Silverman and Carol Clover are among the accomplished film theorists who have deployed the language of psychoanalysis to interpret the systematic abuse and abjection of males in late-20th century movies (by then, a film theorist who did not trade in the currency of psychoanalysis was an anomaly, something like a chaperone at a bachelor party; Noel Carroll resisted the urge and found a voice by falling back on the Marxoid rhythms of Fredric Jameson). Like their counterparts—David Savran and Barbara Creed are probably the best known among them—both Silverman and Clover rely on two famous essays of 1924 by the founding father, Sigmund Freud, in which masochism is defined as the psychological space that permits, maybe even demands, male experimentation with an imaginary femininity.
In all the clinical/case studies Freud cites, it is men who are being masochistic, but the passivity that allows their penetration, laceration, etc., is coded as female. “In the case of the girl what was originally a masochistic (passive) situation is transformed into a sadistic one by means of repression, and its sexual quality is almost effaced,” he declares. “In the case of the boy,” on the other hand, “the situation remains masochistic.” For he “evades his homosexuality by repressing and remodeling his unconscious phantasy [of being beaten, penetrated, by his father]; and the remarkable thing about his later conscious phantasy is that it has for its content a feminine attitude without a homosexual object-choice.” In these psychoanalytical terms, masochism on screen looks and feels like men trying to be women—men trying to identify as women—but without cross-dressing, and without coming out of a closet to renounce heterosexuality. Again, it is the psychological space in which an imaginary femininity becomes actionable.
At any rate it is the cultural space in which the mobility—the increasing instability—of masculinity can be experienced. Clover has shown that the predominantly male audience for crude horror films like “I Spit on Your Grave” is not indulging its sadistic fantasies by identifying with the rapists, as pious mainstream critics would have it; instead, that male audience is placing its hopes and fears in the resilient character of the Last Girl Standing, the young woman who ignores the law because she has to, the gentle female who comes of age by killing the slashers and the psychopaths. Violence is the cinematic medium in which this transference, this out-of-body experience, gets enacted. Violence is the cinematic medium in which male subjectivity gets tested, in other words, and is finally found wanting except as a form of emotional solidarity with the female character who outlasts her tormentors.
So male masochism at the movies looks and feels bad—it is hard to watch, particularly when Mel Gibson’s William Wallace is getting tortured in “Braveheart” (1995), when Sylvester Stallone’s “Rocky” is being beaten to a pulp, or when Brad Pitt is begging for more punishment in “Fight Club”—but it accomplishes something important. Its violent sensorium lets us experience the end of modernity as the dissolution of male subjectivity and the realignment of the relation between what we took for granted as feminine and masculine (keeping in mind that this realignment may well prove to be regressive and destructive). Freud was on to something, then, when he suggested that by way of male masochism, “morality becomes sexualized once more [and] the Oedipus complex is revived.” Translation: the identities we discovered as we detached ourselves from a primal, physical, emotional connection to our parent(s)—as we worked through the Oedipus complex—are perturbed and perplexed, perhaps even reconstructed, by the horrific experience of masochistic violence at the movies. These identities now become fungible, divisible, negotiable, recyclable, in a word, scary.
The criminal element of late-20th century film is of course related to the increase of spectacular violence done to heroes, victims, and villains alike. The American fascination with crime runs deep because rapid change is normal in this part of the world—here “crisis becomes the rule,” as a famous philosopher, John Dewey, once put it. His admirer Kenneth Burke explained that “any incipient trend will first be felt as crime by reason of its conflict with established values.” It’s hard to distinguish between criminals and heroes because they both break the rules and point us beyond the status quo, and they’re always urging us to expect more (the heroes of sports are heralds of this type, from Bill Russell and Mickey Mantle to Michael Jordan). They’re like the revolutionaries of the college textbooks—Max Weber’s “charismatic” leaders—but they’re more rooted in everyday routine, in what we call “practice.” They’re more visible, more approachable, more likable than, say, Lenin, Mao, Castro, or Che, because they don’t want to change the world, they want to change the rules.
So crime, like violence, is as American as apple pie. But the late-20th century filmic rendition of criminal behavior departs from its antecedents, especially in depicting gangsters. Where such felons were once represented as deviations from a norm of manhood and domesticity, and thus as a threat to the peace of the city and the integrity of the nation—think of Paul Muni, Jimmy Cagney, and Edward G. Robinson in their founding roles of the early 1930s—by the 1970s and 80s, Gangsters R Us. By the 1990s, accordingly, crime as committed at the movies became the origin and insignia of everything American. This is a useful notion, mind you. It forces us to acknowledge that the western hemisphere was not a “New World” when Europeans invaded America, and that the idea of original sin still has explanatory adequacy. At any rate it lets us know that our country was not born free: it is no exception to the rules of history, no matter who—whether Marx or Freud or Weber—wrote them up as the laws of motion that regulate modernity.
It also lets us know that the private space of contemporary home and family is not exempt from the public atrocities of the political past. “Poltergeist” (1982) and the remake of “The Haunting” (1999) suggest, for example, that the extermination of Indians on the frontier of American civilization and the exploitation of child labor during the industrial revolution cannot be forgotten, not even by the most ignorant individuals and the most insulated, intimate, social organisms of the present—they suggest that the return of the repressed is always already underway from within the family. The political is personal in the late 20th century USA. Otherwise it is almost invisible.
The “traditional” family was, after all, breaking down in the late-20th century, and crime rates were, in fact, climbing in the 1960s and after: for many observers, such as George Gilder and Charles Murray, the relation between these two phenomena was clearly, and simply, cause and effect. And the extrusion of women from the home—from familial roles and obligations—looked to them like the proximate cause of the cause. Feminism signified “sexual suicide,” in Gilder’s hysterical phrase. That phrase takes on new meanings in the context of awful yet rousing movies like “I Spit on Your Grave” and “Ms. 45.” Here the female protagonists, young professional women who are victims of brutal and repeated rape, decide to kill the perpetrators instead of waiting on the law made by their fathers, husbands, or brothers. In doing so, they broaden the scope of their vengeance to include male supremacy itself. They’re carefully killing off the idea that men should have control of women’s bodies. So the figurative mayhem on view is not suicide but homicide—it is patricide, the adjournment of the law of the father, the inevitable result of giving women the weapons they need to protect themselves against violent men.
And that brings us back to where we began, to the ascendance of the horror genre, wherein violence, crime, and family are typically represented, and sometimes rearranged, by the suffering of women at the hands of men. What links our five filmic trends of the late-20th century, in this sense, is gender trouble—that is, “the disturbance of cultural norms” which derives from the social (and thus political) problem of the new, “post-feminist” woman, and which redraws the perceived relations, the effective boundaries, between males and females.