Actors as Historians
John Wayne as Ethan Edwards in The Searchers. Credit: Wikimedia Commons.
The story of this book begins in 2001, when I left academe and began working as a high school teacher. In the process of planning the first semester of a U.S. history survey, I made a curious discovery about the slate of movies I intended to show over the course of the fall: every one of them starred Daniel Day-Lewis. There was The Crucible. And The Last of the Mohicans. And The Age of Innocence. Later I added Gangs of New York and There Will Be Blood. All told, I ran the annual event I dubbed "The Daniel Day-Lewis Film Festival" nine times.
Maybe it's not surprising that my predilections would express themselves without conscious effort. But keep in mind that we're talking about Daniel Day-Lewis here. As anyone vaguely familiar with his work knows, Day-Lewis is legendary for the extraordinary variety of characters he has played, and the vertiginous psychological depth with which he has played them. I first became aware of Day-Lewis in early 1986, when, in the space of a week, I watched him portray the priggish Cecil Vyse in the tony Merchant-Ivory adaptation of E.M. Forster's A Room with a View and then saw him play Johnny, the punk East End homosexual, in Stephen Frears's brilliantly brash My Beautiful Laundrette. Day-Lewis went on to have a distinguished career, winning an Academy Award for his portrayal of the handicapped Irish poet Christy Brown in My Left Foot (1989), and between 1988 and 2007 he played a string of American figures that ranged from a seventeenth-century Puritan to a twentieth-century art collector.
What, I wondered, could this mean? Every year like clockwork, I watched these films again with my students, marveling at the inexhaustible nuances of Day-Lewis's performances. Gradually I discerned a thread that connected the Puritan to the gangster, the pioneer to the lawyer. But perhaps the more important outcome of the experience is that it got me thinking: Could it make sense to think of actors as historians? Could people whose job is not primarily a matter of interpreting the past nevertheless be performing such an interpretation? And could doing so repeatedly over the course of a career articulate an interpretive version of American history as a whole?
Of course, such people are aware when they're dealing with historical situations (or contemporary situations with historical resonances), and they may make real efforts to exercise historical imagination as part of their work. But that's the point: it's part of their work. We all understand that there are many people out there who "do" history without writing books -- archivists, curators, and, of course, filmmakers, documentarians as well as writers and directors of feature films, who work consciously and conceptually to craft an interpretive experience for their audiences. What intrigues me about actors, though, are the obvious limitations and obstacles to executing a purely historical function. Their work is always embedded in a larger context in which their control of the material is limited -- actors do not typically write their own lines -- and their craft is collaborative, part of enterprises that will always be at least as much aesthetic and commercial as they are historical.
Now I must acknowledge that there is less to the distinction I'm making than meets the eye. Archivists, curators, and documentary filmmakers also labor under limitations of various kinds; they collaborate; they embark on enterprises that are very often aesthetic and commercial, too: they can't afford not to. So do academic historians. But there's a powerful mythology surrounding academic work -- a mythology that extends, for example, to procedures for hiring and promotion at research universities -- suggesting that scholarship should exist outside such considerations, that it has its own intrinsic value and should be pursued independently of them. This is a powerful proposition, and it has led to work of enormous value that has enriched our understanding of the past. I'd never want to see it go away, and I understand that it cannot be taken for granted in a society under great financial pressure and subject to long-standing anti-intellectual influences.
But I'm after something a little different: to apprehend the way history is absorbed into the fabric of everyday life—messy, fragmented, more suggestive than direct. In the words of one scholar who has compared cinematic history with more traditional kinds, “its point is not to have a point but to point.” In the ensuing pages I follow such cues, tracing sometimes faint, and always contestable, master narratives as they emerge in bodies of work.
All works of art essentially say the same thing: this is the way the world works. They usually say it implicitly rather than explicitly (in modes of harmony or dissonance, optimism or pessimism, naturalism or artifice), and as often as not they point toward an alternative to the set of arrangements they depict. In making such assertions, works of art refer directly or indirectly to other works of art -- they say, in effect, the world doesn't work that way; instead, it works this way. Or they say: yes, the world works that way, but with this caveat or corollary. But all works of art must start, if not end, with an assertion about the world as it is. No work of art claims to represent reality in its totality -- it could not, for then it would be life and not art -- but every work of art claims to capture something essential, which is to say something shared.
The lifeblood of art is choices. To create is to edit, and editing is a process (usually conscious, but sometimes not) of making decisions about what to include, which inevitably means decisions about what to exclude. Representing reality -- which is to say, using one thing to stand for another -- is at least as much a matter of subtraction as of addition. And, if you will permit one more theoretical statement here, representation is a matter of abstraction, the transubstantiation of substance into concept.
Works of art vary in their degree of abstraction (think of the difference between a Michelangelo and a Picasso painting), and I think it's fair to say that some forms of art tend to be more abstract than others (think of the difference between a symphony and a building). If you were to somehow chart a spectrum of verisimilitude from the abstract to the concrete, the medium of film would fall on the concrete end. Though film, even more than the other arts, rests on an illusion (namely a neurological quirk of the human brain in which images shown in rapid succession create a perception of motion), it is regarded as among the most mimetic of the arts. At the same time, because a film is typically experienced in a finite segment of time -- unlike media such as television, a more open-ended enterprise measured in seasons -- we tend to think of films as fully realized worlds in themselves that we experience in a single sitting.
For all their perceived transparency, however, we all understand that movies -- I'm going to make a semantic switch now, both because in a digital age the word film is losing its precision, and because the word movie has a vernacular immediacy that corresponds to the larger point I'm about to make -- have traditionally been expensive and complicated to produce. Every year at the Oscars, the Academy of Motion Picture Arts and Sciences (note the double plural) hands out a bevy of awards to remind us of this fact. One reason the Academy has to remind us -- for all our increasing cultural sophistication about the film industry, the attention to box office grosses, for example, or the celebrity status of directors and producers like Steven Spielberg, who typically work behind the camera -- is that there are few things in life that immerse us to the degree a good movie does. We watch what's before us. And what's before us, the overwhelming majority of the time, is the people we call "actors." Movies are among the most mimetic of the arts, and actors are among the most mimetic aspects of the movies.
I so love that word: actor. To act is to pretend, to make believe. But it's also to commit, to execute. As we have been reminded since the time of the ancient Greek philosopher Heraclitus, character is destiny: an actor embodies a set of ideas, the value of which is very often bound up in the fate of the character that actor plays. (A case in which this is not so -- when the good guy gets punished, when the bad gal literally or figuratively gets away with murder -- becomes a statement in its own right.) The immediacy and clarity of this widely available performance art, an art that slices across linguistic lines and educational levels, make it -- paradoxically, given the vast sums and hierarchies with which it has always been associated -- thrillingly democratic.
Actors vividly display the act of choice central to the artistic process. Putting aside the fact that any acting performance includes countless renditions that are shot out of sequence or discarded on the cutting room floor, watching a movie involves witnessing an inexhaustible array of choices in language, posture, expression, and setting. A century of experience has taught us that some people make these choices so strikingly that we will watch them repeatedly not only in the same movie, but in movie after movie. One is reminded of the words of F. Scott Fitzgerald’s narrator Nick Carraway, who, in the process of explaining what made his friend Jay Gatsby great, defined personality as “an unbroken series of successful gestures.” Writing almost a century later, the rock critic Greil Marcus, in a characteristically roaming exegesis of the rock band The Doors, notes that “when actors migrate from movie to movie, traces of their characters travel with them, until, regardless of the script, the setup, the director’s instructions, it’s partly the old characters speaking out of the mouths of the new ones, guiding a new character’s hand into a gesture you remember from two or twenty years before.”
This is what the best actors do -- or at any rate, what a certain kind of successful actor does. In his now-classic study Acting in the Cinema, James Naremore defines acting as "the transposition of everyday behavior into the theatrical realm." Acknowledging the surprisingly thin line between acting done on a stage or in a studio and the roles -- with varying degrees of staginess -- we all play in everyday life, Naremore notes that the key challenge for people in television and movies, who must often move and act in highly artificial ways in order to appear "natural" in front of a camera, involves "a compromise between 'obviousness' and 'doing nothing.'" Some actors (Naremore cites Spencer Tracy as a quintessential example) are so deft at manipulating this duality that we have a hard time distinguishing between an onscreen and an offscreen persona, even as we know there must be a difference, and we find ourselves fascinated by the attempt to discern it.
We have a term for such people: we call them movie stars. More than other artists, movie stars fascinate us because they exhibit a series of intriguing frictions. One set of frictions involves the relationships among the actual person, the character that person plays in a given movie, and the variations on that character across a career of movie roles. Everyone but a child recognizes that each of these is distinct, but a star wouldn't be a star if there weren't some connection between them. Moreover, such connections are perceived to matter. In addition to connecting the star to the role, they also connect the star to the fan -- which in turn creates another set of frictions, because the fan experiences something shared with the movie star while at the same time experiencing a sense of awe-inspiring distance; hence the metaphor of an astronomical object in the sky. Bruce Springsteen, a cinematic songwriter if ever there was one, captures this friction in his classic song "Backstreets": "Remember all the movies, Terry, we'd go see/Trying to learn how to walk like the heroes we thought we had to be." Seeking liberation through, and yet being oppressed by, the set of choices made by a movie star (who in turn can feel oppressed by all the attention of fans) is among the great conundrums of cinematic life.
And here's one more friction that's particularly germane: the tension between the power of choice at the heart of acting and the limits of control intrinsic to appearing in a movie. For, as any veteran will tell you, acting is also reacting -- to your co-star, to the director, and to the technical demands of the immediate task at hand, not to mention the professional apparatus of agents, managers, studios, and the like. This sense of obvious as well as subtle enmeshment (we know what that's like!) helps explain the intensity of identification the public sometimes feels with actors, a kinship fostered by coverage in other media.
Here we must return to the distinction between actors and the subset of that species we know as movie stars, acknowledging that the line is porous. Actors need work, and although they may have standards or priorities about the jobs they take, a professional’s code very often includes a commitment to flexibility and variety. Movie stars, by contrast, tend to think in terms of roles. They have more power than actors to choose the parts they play—which in its most potent form is the power to say no repeatedly—and to convert that power into other kinds, like directing or producing. Our democratic impulses lead us to honor actors, whose work ethic (typically exhibited on a daily basis in theaters, as opposed to episodic stints on sets) we admire. But it’s stars that capture our imaginations.
That said, my focus on movie stars is to a great degree a utilitarian one. In the way their work is embedded in a web of considerations, they mimic the manifold complications and compromises of everyday life. But to the extent that they have more power over the conditions of their work than most people, they make it possible to discern, even isolate, strands in their thinking that are powerful because they are widely shared -- very often at the level of presumption more than explicit argument. Indeed, it’s precisely their uncanny capacity to project these shared presumptions and put them in a new light that allows such people to become stars in the first place.
Perhaps a small example will help illustrate my point. Consider one of the most famous images in Hollywood history: the final shot of John Ford's classic 1956 Western The Searchers, which shows John Wayne standing in a doorway. Wayne's character, Ethan Edwards, is an unreconstructed Confederate soldier who has returned to Texas in 1868 and visits his brother's family shortly before most of that family is massacred by Comanche raiders, who carry off Ethan's niece Debbie. An avowed racist, Ethan spends the next five years trying to avenge this atrocity, assisted, much to his dismay, by Debbie's adoptive -- and part-Cherokee -- brother. When Ethan learns that she has married a Comanche chief and wants to live as a Native American, he tries to kill her: better dead than red. Ultimately, however, he seizes her and returns her to "civilized" life with an adoptive white family. His self-appointed mission complete, Ethan remains at the threshold of the house, seemingly unwilling or unable to enter. And then he turns and walks away, with that much-imitated, never-equaled loping walk that Wayne made famous.
Now, there’s much that has been, and still can be, said about The Searchers. Some commentary focuses on the accuracy of the film’s scenario, based on the 1954 novel by Alan Le May, with a screenplay by Frank S. Nugent. It looks at the story’s resonances with a long tradition of captivity narratives dating back to Mary Rowlandson’s 1682 account of her time with the Narragansett Indians during King Philip’s War; it views The Searchers as a document of American racism; and so on. A variant on this approach traces the morphology of the film: what did Le May do with the factual events on which his novel is loosely based, and what in turn did director John Ford do with Le May? Still another line of inquiry looks at the internal cinematic logic of The Searchers. It’s worth noting, for example, that the final shot of the movie is the culmination of a whole string of threshold sequences, from doorways to tepees to caves, that runs through the film.
While I'm interested in such questions, and have pursued them myself at different points in my career, my real interest here is in The Searchers as a John Wayne movie. Definitive as his performance now seems to generations of film fans, there was nothing ordained about casting him in the film; indeed, at different points in his career Ford kept Wayne at arm's length. It's possible to imagine, say, Clark Gable or Gary Cooper playing a tortured soul like Ethan Edwards, though most of us would say this only by way of a concession -- Wayne just seems so right for the part. But one reason Wayne seems so right for the part is that he had already, literally and figuratively, established himself as the man on the threshold. He did it in Red River (1948), and he would do it again in The Man Who Shot Liberty Valance (1962), among other pictures. Far more than Gable or Cooper, Wayne repeatedly portrayed tortured souls who do dirty work, and yet in the process of doing so create or preserve a life of decency for others, even if they cannot cross over into the promised land themselves. This is not just a statement that John Wayne made about John Wayne's world; it's a statement he makes about the world he and the rest of us live in as well.
There’s a long history of film criticism that analyzes the choices of directors or film moguls in terms of such patterns, going back to the auteur theory that dominated cinematic studies in the second half of the twentieth century. Less common, for some of the reasons I’ve suggested, is thinking about a body of work on the part of an actor as something more than the embodiment of a personal archetype. But that may be changing. “There is a slowly developing vocabulary adequate to an accurate and objective discussion about film acting: what it is and how it affects the film and its audience, how an individual creates a presence on the screen, what the presence is, and what the viewer’s relationship is to it,” veteran critic Robert Kolker notes in the most recent edition of his classic study, A Cinema of Loneliness. Kolker doesn’t cite any examples of such emerging scholarship, however. I will cite Garry Wills’s 1997 book John Wayne’s America as a pioneering example of the kind of work I’m pursuing here.
Of course, directors remain major, even decisive influences on these people, especially in their early work. But in the last third of the twentieth century movie stars increasingly became important shapers of the overall cinematic product themselves; in many cases it was a star's decision to participate that determined whether a particular project got made at all. Stars also often shape scripts and casting, and it's no accident that many stars become directors themselves. Moreover, many stars and directors -- consider the relationship between Steven Spielberg and Tom Hanks, for example, or Clint Eastwood and Don Siegel -- have related to each other as peers.
Whatever the relationship, I'm looking at a specific kind of statement a star makes in amassing a body of work that consists of a series of roles: historical ones. Whether in a Western like The Searchers or a World War II movie like Sands of Iwo Jima (1949), Wayne repeatedly chose to depict restless, isolated men who must effectively become Moses figures. Wayne hardly invented this line of interpretation; indeed, its very power and influence derive from the way he tapped a vein of thought that runs back through the scholarship of Frederick Jackson Turner, the novels of James Fenimore Cooper, and the religious testimony of Mary Rowlandson. But if Wayne offered his viewers a specific master narrative of American history, it was never the only one available. Indeed, by the end of his life in 1979 its influence was fading fast, supplanted by the vision of a younger generation that saw little redemptive character in the Moses figures he portrayed, and that questioned the decency or efficacy of those "Israelites" for whom Wayne's characters made sacrifices. But Wayne's persona lingers -- and, more importantly, so does his point of view. His history is also historiography: not merely a specimen or a symptom, but an interpretation available for multiple uses, factual and otherwise, laudable or not.
Do people like Wayne consciously make such historical statements? Do they really think of themselves as historians? The answer, for the most part, is no -- which is precisely why they're so interesting. To paraphrase an old John Lennon song, history is what happens when you're busy making other plans. These plans, when realized, become movies that are in effect "chapters" of a "book" that is the body of work as a whole. When you analyze a literary text, you're very likely to care what formative influences shaped it, and the analogy here would be the other players in the filmmaking process: a production designer or a screenwriter is to an actor what a real person or a historical event is to a novelist. These influences matter, but they're not central: the text itself is. Similarly, you're likely to care what the author of a text may have had to say about her work, but at the end of the day it is what's between the covers (or on the screen) that matters, not what the author (or actor) says on a talk show. When artists assert, as they often do, that a work "speaks for itself," this is what they mean: that a finished piece offers its audience a proposition that can be accepted, rejected, or revised as a discrete statement in its own right, whatever the creator(s) may think, or whatever others may say about the creator(s). Think of this as the "literary" side of the (old-fashioned) American studies equation.
The "historical" side of that equation rests on a different proposition: that what actors do comes closer than what historians do to capturing the ways ordinary people actually think and feel about the past. For most of us, history is a series of propositions that from an academic standpoint can seem disconcertingly primitive: of progress; of decline; of cycles; of the rich foiling the poor; of a break that occurred when X happened. The stories that get told on the basis of these ideas prove nothing, and are almost always imaginary. But the most successful of them embody mythic truths that bear some relationship to fact, and to a shared collective memory, even if there is much to disagree with in the particulars. Movies give us a sometimes hopelessly jumbled collage that most people don't even begin to untangle unless they're somehow provoked into doing so, whether by an unexpected experience in a place like a movie theater or by being coaxed into articulating its contours in a place like a classroom. But it's nevertheless a part of their everyday lives -- a sense of time comparable to the senses of sight and sound that also orient us. I mean "sense" in another way as well: as an experience that's not quite conscious -- or, to the degree that it is, is ineffable. To sense the past is to feel in time.