Tuesday, November 30, 2010

Bruce Zirilli (Springsteen)


The Boss circles back to an (ethnic) road not taken in The Promise


Like all great artists, Bruce Springsteen has always been first and foremost a great editor. Any Springsteen album of the last four decades has begun with many more songs than actually ended up on the record, and no doubt much of his work has ended up on a proverbial cutting room floor for good. In the last dozen years, however, Springsteen has begun opening his vault to release what he regards as worthy work. Tracks (1998) was a broad, multi-CD collection that spanned his whole career, as was an intriguing bonus CD of rarities that accompanied The Essential Bruce Springsteen (2003), part of the fine Sony Records Essential series. Springsteen has also taken to repackaging his work -- like Born to Run, released in a 30th anniversary edition in 2005 -- and releasing live performances, audio and video. He's taken this process a step further this fall with The Promise, a package that exists in multiple iterations: a documentary on the making of his classic 1978 album Darkness on the Edge of Town, a reissue of the album itself, and -- the focus here -- a two-disc collection of material recorded at the time of the Darkness sessions that did not actually make it to the album.

Of course, even merely talking about this music in such a way dates it. With the advent of digital downloads, owning recordings has become a virtual experience more than a literal one. The album, vinyl or CD, as an artifact is a largely generational phenomenon, and Springsteen is tapping a large, relatively prosperous and middle-aged fan base here with a product line well-insulated from the fickleness of mass taste or the recent upheavals of the music industry. Notwithstanding some real gems that have surfaced in this segment of Springsteen's oeuvre -- like his beautiful tribute to his mother, "The Wish" (Tracks), or the deceptively comical “From Small Things (Big Things One Day Come)” (Essential Bruce Springsteen) -- this is not really the best way to meet The Boss.

Nor is The Promise. If the first Springsteen songs you ever heard were "Racing in the Street ('78)," "Come On (Let's Go Out Tonight)" or "Candy's Boy," you would plausibly regard them as fine pieces of songcraft. But they're just not as good as the "Racing in the Street" that ends up on Darkness, or the transformation of “Come On” into "Factory," or that of “Candy’s Boy” into "Candy's Room." The final “Racing” and “Factory” have a more stripped-down arrangement and stately pace; “Candy” is more frenetic. Each makes more emotional sense in its final version. It's fascinating to hear the insertion of a reference to Elvis Presley's death in "Come On," or the reference to a 1932 Ford in "Racing," given that a 1969 Chevy is what (more plausibly) made it onto the album. But this makes these songs curios, not great pieces of music.

That said, The Promise is also an intriguing document of a road not taken. Darkness on the Edge of Town marked a decisive turn in Springsteen's work in opening up an imaginative landscape with settings that included the Dakotas ("Badlands") and the Utah desert ("The Promised Land"). He wrote hard, lean songs stripped of the ornate embellishments that had characterized his first two albums, and the baroque arrangements that dominated Born to Run, typified by the ten-minute epic of "Jungleland." This new sensibility is evident in the packaging of this music, which has always emphasized its western orientation.

But Springsteen's core influences, much more apparent in his live shows than in his recordings, had always been the classic rock of the late 1950s and early 1960s. That meant Elvis, of course, and Buddy Holly, whose spirit is palpable here in "Outside Looking In." It also meant Gary U.S. Bonds, whose career Springsteen helped revive in the eighties, represented here by "Ain't Good Enough for You." And Phil Spector, whose girl groups receive homage in "Someday (We'll Be Together)."

More specifically, though, there was an east coast, urban -- and, more specifically still, an Italian -- strain in Springsteen's music of the seventies. Ethnically, he's pure American mutt; "Springsteen" is a Dutch name, but his father was mostly Irish. His mother Adele's maiden name was Zirilli -- Irish/Italian marriages were what passed for multiculturalism in the mid-twentieth century -- and it's his Zirilli side that decisively shaped his persona. That may be why one influence that hovers over many of the songs in this collection is Dion (as in DiMucci), who enjoyed a string of hits in the late fifties with the Belmonts and as a solo act in the early sixties with songs like "Runaround Sue" and "The Wanderer." The Promise also evokes another Jersey Boy, Frankie Valli, whose work with the Four Seasons is discernible in tracks such as "Gotta Get that Feeling," even if Springsteen is smart enough not to even try emulating Valli's unforgettable falsetto.

This vein in Springsteen's body of work did not entirely disappear with Darkness on the Edge of Town. He gave some of these songs away. "Talk to Me" became a signature tune for Southside Johnny and the Asbury Jukes, who carried this metropolitan torch into the eighties. Patti Smith had a Top Ten single with "Because the Night" in 1978, as did the Pointer Sisters with "Fire," which Springsteen himself released as a single a decade later. "Rendezvous" became a staple of his live shows, much beloved by Boss cognoscenti. So it's a real pleasure to see this material resurface in this collection.

Though none of it appears on The Promise, some of the material from this phase of Springsteen's career did enter his canon, principally on The River (1980), an album far more emotionally and musically variegated than Darkness on the Edge of Town, and for that reason a more satisfying experience (and a good place for a Springsteen novice to begin). In the documentary on the making of Darkness recently broadcast on HBO and included in some versions of the package, there is footage of Springsteen performing "Sherry Darling," which ended up on The River, with bandmate and buddy Steve Van Zandt -- born Steven Lento, and known by millions of Sopranos fans as Silvio Dante, the character he played in that series. With its boisterous crowd noise and hand-clapping in the background, redolent of life on a Brooklyn stoop, "Sherry Darling" encapsulates this moment in Springsteen's musical life.

In recent years Springsteen has been seen as an American icon in the Woody Guthrie tradition, in large measure because he's positioned himself that way, typified by his 2006 album The Seeger Sessions, and a carefully sculpted public image as a liberal godfather in the Age of Obama. The Promise reminds us of the vast sprawl in his lifetime of music-making, and of Springsteen's legitimate claim to the Ellis Island branch of American identity. If Springsteen's body of work is a meal, The Promise is a cannoli.

Friday, November 26, 2010

The Old Frontier




The following post is part of an ongoing work in progress about actors as historians. Earlier posts below. Comments welcome at jcullen@ecfs.org

Daniel Day-Lewis’s decision to make The Crucible in 1996 coincided with a new level of stability in his life. His previous film, In the Name of the Father (1993), had earned him an Academy Award nomination for Best Actor (he lost to Tom Hanks in Philadelphia) and ratified his standing as one of the premier actors of his generation. He also settled down, marrying screenwriter and director Rebecca Miller, daughter of playwright Arthur Miller, whom he met after shooting the movie and married shortly thereafter. (The fast courtship and marriage followed a relatively long-term relationship with French actor Isabelle Adjani, with whom he had a son, and a series of appearances on gossip pages with figures ranging from Julia Roberts to Madonna.)
The fact that a Hollywood version of The Crucible ever got made is surprising. It is, of course, a bright star in the firmament of American theater, and a staple of high school English classes. It also single-handedly turned the Salem witch-trials into a form of allegorical shorthand for McCarthyism in the 1950s and any form of social pressure through scapegoating ever after.  But even as one might expect some form of independent film production to surface sooner or later, a big-budget shoot on location in Massachusetts – on Hog Island in Ipswich, not far from where the original event took place in 1692 – would be very unlikely under any circumstances.
Except for the fact that Miller’s son, Robert, was friendly with Joe Roth, then the head of 20th Century Fox. As it was, it took years to get the film into the production pipeline. And it could very plausibly have run aground when Roth left Fox for Disney. But the new management under Bill Mechanic left the project in place. There had been hopes that this prestige project – Arthur Miller himself adapted the screenplay, and the cast included highly regarded actors like Joan Allen and the legendary Paul Scofield, directed by Briton Nicholas Hytner – would generate Oscar, and thus box office, buzz. Neither materialized. But this remains a beautifully rendered, if austere, work of art, in which the work of the performers is greatly amplified by the beautiful sets, evocative costumes, and other high production values.
The Crucible is a story that’s typically read one of two ways. The first, and perhaps primary, one is what prompted Arthur Miller to write it: as a warning about the dangers of social conformity and letting irrational fears – in particular a fear of Communism that dominated American public life at the time of the play’s premiere – govern everyday life. The second tends to see the story in terms more specific to its time and place: seventeenth-century New England. Such an angle of vision leads one away from viewing it as a tale about American character generally, and more about a particular sub-species: Puritanism. Ever since the seventeenth century, North American settlers, including some in New England, have pointedly contrasted themselves with what they considered the arrogant piety of these religious dissidents (some inside, others outside the Church of England) – and their secular heirs. In this telling, the Salem witch trials are the logical, if not inevitable, outcome of a subculture in which religious fanaticism runs amok and consumes those who burn with self-righteousness.
Both of these views have cogency, of course. But I’m not particularly interested in tracing them here. In part, that’s because it’s been done so often. In part, too, it’s because I think the Puritans have gotten a bit of a bum rap, as if they were the only people who ever burned witches, intolerance was invented in New England, or the people who finally stood up to the witch hunters and ended this tragic event, sometimes via their own martyrdom, were not themselves Puritans (of the very best kind). Instead, I see The Crucible through a different lens, one that comes into focus through the character of John Proctor, played by Daniel Day-Lewis. And that is to see it as the story of a frontiersman.
Actually, there are some good historical reasons to do so. Of course Salem, Massachusetts is not typically seen as a frontier town; after all, it was founded in 1626, even before Boston, and thus was 66 years old when the witch trials took place. But if Salem could not plausibly be seen as a frontier, it was in fact quite close to one: the district of Maine, which would remain part of Massachusetts until 1820. For most of the seventeenth century, the beaver and timber trade of northern New England was a major source of prosperity for Massachusetts. But the outbreak of King Philip’s War in southern New England in 1675, which spread northward and lingered until later in the decade, broke a relatively long stretch of peaceable relations with the region’s Indians, specifically the Abenaki peoples of northern New England. The outbreak of another war in 1689 – popularly known as King William’s War, but known in the region as the Second Indian War – destabilized the region. This war had geopolitical origins, in that it was partially the result of the Glorious Revolution in England, which brought the Dutch-born William of Orange to the throne and tilted the nation against France, launching three-quarters of a century of imperial conflict that would culminate in what came to be known as the French and Indian War (the setting of another Daniel Day-Lewis movie, Last of the Mohicans). But the war also had complex local dimensions in lingering hard feelings between the Abenaki and colonists. There was also a high degree of distrust of colonial officials and military officers, some of whom had strong ties to the former governor, Sir Edmund Andros, who was overthrown when William came to power, and whom some settlers believed put imperial interests ahead of their own in his military conduct.
The Indian wars of the late seventeenth century destroyed lives, livelihoods and homes, and created a significant number of refugees, some of them ending up in Essex County, where Salem is located. The noted U.S. historian Mary Beth Norton has documented that a significant number of accused witches as well as their accusers had ties that can be traced to Maine in the 1670s and 80s. Just how decisive a factor Indian war really was is open to debate; few events in U.S. history have been as variously explained as the witch trials. But it is certainly plausible to see frontier-related stresses as a factor in what went wrong in Salem in 1692.
The ambiguities involved in using the term “Salem” point to another problem – and another way in which it can be viewed as a kind of frontier. People at the time made a distinction between the incorporated district of Salem Town, and the surrounding area known as the Farms (later Salem Village, and still later the town of Danvers). Residents of the Farms resented their civic obligations to the Town, and lobbied the General Court of Massachusetts for political autonomy as early as the 1670s. They would not get it until 1752, though they did gain the right to have their own church and minister. Yet this proved to be deeply divisive, and indeed, the outbreak of the witch trials was directly connected to the controversy over the ministry of Samuel Parris, whose daughter Betty and niece Abigail Williams were the first people to exhibit the ailments that would be attributed to witchcraft. (One of Parris’s predecessors, George Burroughs, who had lived on the Maine frontier and returned to Salem, would be among the nineteen accused witches who were put to death.) The lack of clear political boundaries and the absence of organized means for resolving personal conflicts fostered a climate in which Farm residents settled scores by resorting to what amounted to vigilante justice, making accusations based on spectral evidence. The decision on the part of trial judges to suspend normal rules for substantiating that evidence effectively allowed them to deflect attention and blame away from their own failures in the political and military upheaval of the late 17th century.
As far as the makers of The Crucible were concerned, this is all inside baseball. In the original script for the play, Miller has Abigail Williams threaten Betty Parris by saying “I saw Indians smash my dear parents’ heads on the pillow next to mine, and I have seen some reddish work done at night, and I can make you wish the sun had never gone down!” (The line also appears in the movie.) This (fictive) contextual information is important in establishing a basis for the core malignancy of Williams’s character. But it’s more in the spirit of background information than a proximate explanation for her behavior. The real Williams was a child in 1692; Miller makes her a young woman whose jealousy over a severed sexual relationship with John Proctor leads her to accuse his wife Elizabeth Proctor of witchcraft. (The real Elizabeth Proctor’s brother-in-law had Maine ties, which may be one reason why she and her sister were accused of witchcraft.) The romantic triangle remains the core of the movie, with Winona Ryder as Williams, Day-Lewis as John Proctor, and Joan Allen as Elizabeth.
That said, the adaptation of the play into a movie 43 years later resulted in a subtle shift in the story’s frame of reference. “In a sense, The Crucible is the first western, because the frontier was there and they were the first pioneers,” director Nicholas Hytner explained in 1996. “So there’s an austerity to the look that we tried to capture.” It is perhaps inevitable that Miller, in adapting his work for a new medium, would literally open it up. For example, the opening and closing scenes of the play, which depict the girls’ participation in an occult ritual and the hanging of important characters respectively, take place offstage. In the movie, they are shot in exterior locations. But the most important element in establishing a frontier dimension for the film version is the portrayal of John Proctor.

Next: John Proctor as frontiersman.

Tuesday, November 23, 2010

Shooting Star


The long road to Hollywood -- by way of Ireland -- for a child of the British artistic aristocracy


The following post on Daniel Day-Lewis is part of a work-in-progress.

As a general proposition, there's nothing particularly surprising about a foreigner doing a notably good job at explaining the United States to Americans. A long tradition stretches from the proto-sociological musings of Frenchman Alexis de Tocqueville (whose 1835/1840 study Democracy in America continues to be widely read) to the Taiwanese-born filmmaker Ang Lee (whose movies like The Ice Storm, Brokeback Mountain, and the overlooked Ride with the Devil are pitch-perfect in their evocation of their times). Like countless travelers to America, Daniel Day-Lewis left his homeland. But his adopted country is not the United States but Ireland, a nation of emigrants more than immigrants.

Day-Lewis was born in London on April 29, 1957, of distinguished lineage. His father, Cecil Day-Lewis, was a leftist poet who went on to become Poet Laureate of Great Britain. (Born in Ireland in 1904, he declared British citizenship after the creation of a separate Irish republic in 1948.) Daniel's mother, Jill Balcon, was a film and radio actor, and the daughter of Michael Balcon, a British-born son of Baltic Jewish immigrants who entered the nascent British film industry early in the twentieth century and rose to become head of Ealing Studios, having earlier collaborated with figures such as Alfred Hitchcock. The marriage between Cecil Day-Lewis and Jill Balcon was the product of a May-November romance, and their daughter Tamasin and son Daniel were in effect the groom's second family (he had two sons from a previous marriage). Biographical accounts suggest both affection and distance between father and son, with his mother a presence until the end of her life in 2009.

Daniel Day-Lewis spent the first years of his life in the fairly posh Greenwich section of London in a childhood that is described in multiple accounts as "middle class." But this is a bit hard to believe. While it is certainly true that actors and poets do not typically have huge incomes, the mere fact that he was raised with live-in nannies suggests some degree of affluence. So does the string of literary celebrities (Vanessa Redgrave, Kingsley and Martin Amis, et al.) who appear at his home in accounts of his youth, as does the fact that he attended elite boarding schools. The first of these, Sevenoaks, was traditional, and Day-Lewis hated it. He was later allowed to transfer to the more progressive Bedales, where his thespian aspirations first took root. (As a young teenager, he had a cameo appearance in the 1971 John Schlesinger film Sunday Bloody Sunday.)

Day-Lewis appears to have had a relatively happy childhood, but there are indications of restlessness and turmoil in his adolescence. He spent a fair amount of time scouring South London in what would prove to be a longstanding fascination with working-class life. Cecil Day-Lewis died in 1972, when Daniel was fifteen years old; the following year he was admitted to a psychiatric hospital for a drug overdose, which he claimed then and since was a misunderstanding. Though he had a long-term relationship that lasted for many years after high school, he was viewed by his peers as a loner. Uncertain of a career path after his graduation from Bedales and participation in Britain's National Youth Theatre program, he considered a career in carpentry. But he ultimately decided to follow in his mother's footsteps, gaining admission to the prestigious Bristol Old Vic Theatre School.

Over the course of the next decade, Day-Lewis embarked on the career trajectory of a professional stage actor. He appeared in a series of school productions, and upon his graduation endured the feast-or-famine existence that characterizes the life of people struggling to make a living as artists. He appeared in high-profile Royal Shakespeare Company and National Theatre productions, and also did television work for BBC movies and television shows. There were also occasional cameo appearances in big-budget productions like the 1982 Academy Award-winning Best Picture Gandhi (he played a racist South African) and a supporting role in a 1984 remake of Mutiny on the Bounty (which flopped). Family connections were a factor in Day-Lewis's ability to land such parts, though his talent was never in doubt. Day-Lewis was also in good company: his peers included actors like Kenneth Branagh, Rupert Everett, Gary Oldman, and Tim Roth, who would all go on to have distinguished stage and screen careers.

But Day-Lewis had a secret ambition that was not quite legitimate in this elite milieu: he wanted to become a bona fide movie star. "Where I come from, it was a heresy to say you wanted to be in movies, leave alone American movies," he later explained. "We were all encouraged to believe that the classics of the theater were the fiery hoops through which you'd have to pass if you were going to have self-esteem as a performer." [Hirschberg] Day-Lewis made a major step toward realizing his goal in the mid-1980s as a result of his wildly divergent performances in two films whose simultaneous release generated international attention. The first of these, My Beautiful Laundrette, started out as a BBC television drama written by Pakistani-British screenwriter Hanif Kureishi and helmed by veteran director Stephen Frears. A vivid document of Thatcherite Britain, the film is a cross-class and interracial gay love story between an upwardly mobile Pakistani entrepreneur with strong family connections and a London punk reconsidering his thuggish ways. Day-Lewis, who had earned acclaim as a gay character in a 1982 West End production of Another Country, campaigned for the part by sending Frears threatening letters [Jenkins, 164], an early indication of what would prove to be a uniquely intense approach to method acting. Greeted enthusiastically at the Edinburgh film festival in the fall of 1985, the film was diverted to theatrical release, reaching the United States in early 1986. In a rave review of the movie as a whole, Vincent Canby of the New York Times singled out Day-Lewis for a "performance that has both extraordinary technical flash and emotional substance."

Shortly after finishing his work on My Beautiful Laundrette, Day-Lewis landed a role in another British film, this one an adaptation of the 1908 E.M. Forster novel A Room with a View. This project, a collaboration of the famed team of producer Ismail Merchant, director James Ivory and screenwriter Ruth Prawer Jhabvala, was a world away from My Beautiful Laundrette in its period detail, burnished production values, and literary pedigree. Interestingly, the part Day-Lewis sought and won was not the lead figure in a love triangle (that would go to Julian Sands), but rather that of the insufferable prig Cecil Vyse, whose snobbery costs him an engagement to the young woman played by Helena Bonham Carter. As he later explained, the character of Vyse was that of "a man whose skin I could occupy in some of my worst nightmares." (Indeed, a morning television interview he gave at the time, while lacking the arrogance of Vyse, nevertheless conveys the patrician unease that has made Day-Lewis an elusive figure when it comes to dealing with the media.) Yet Day-Lewis's ability to humanize this almost cartoonish character in a crucial scene where he is rejected by his fiancée was a principal reason why he won a New York Film Critics Circle award for Best Supporting Actor.

Again: the synergy here is important in transforming Day-Lewis from promising journeyman to overnight success. My Beautiful Laundrette and A Room with a View were both released in the United States on March 7, 1986, and the experience of seeing them both in close proximity was, as I can personally testify, almost overpowering. Canby called Day-Lewis's performance in View "spectacular," noting "a style and wit all the more remarkable when compared with his very different characterization in My Beautiful Laundrette." Chicago Sun-Times reviewer Roger Ebert was even more emphatic. "Seeing these two performances side by side is an affirmation of the miracle of acting: That one man could play these two opposites is astonishing," he wrote.

While the two movies certainly raised Day-Lewis's profile internationally, they did not transform his career immediately. He followed up his work on them with a small role in an Anglo-French production, Nanou (1986), and appeared onstage in Futurists, a National Theatre production of a play about the Russian Revolution. His next film project was a starring role as Tomas, a doctor, in a 1988 screen version of Milan Kundera's 1984 novel The Unbearable Lightness of Being, a love story set amid the Prague Spring of 1968 and directed by Philip Kaufman (who had worked with Clint Eastwood before being fired from the set of The Outlaw Josey Wales). Clocking in at almost three hours, Lightness is a beautifully made film (Day-Lewis worked with the accomplished Lena Olin and Juliette Binoche, among other actors), but a bit plodding in its fidelity to the novel. It was not a commercial success. Nor was his next project, Stars and Bars (1988), in which Day-Lewis plays Henderson Dores, a hapless London art dealer who comes to America in the hope of procuring a valuable painting from an eccentric patriarch (Harry Dean Stanton). His love interest, played by Joan Cusack, makes this one of the most interesting casts of any film Day-Lewis has made. But the creaky, stereotypical plot -- borderline offensive in its portrayal of Southern whites -- helped ensure it virtually disappeared without a trace.

Clearly, Day-Lewis was a successful actor. But it was much less clear that he had truly attained movie-star status, notwithstanding his evident sex appeal in the erotically charged Lightness. Ironically, the role that helped him reach it was that of an unlikely real-life character: the handicapped Irish writer and poet Christy Brown (1932-1981). Teaming up with the seasoned stage director but film neophyte Jim Sheridan, Day-Lewis gave a tour de force performance in My Left Foot (1989), which won him an Academy Award for Best Actor in 1990. It was in this role that Day-Lewis's growing penchant for staying in character for an entire production became a subject of widespread comment -- and, for the crew that had to treat this able-bodied man as if he were paralyzed, some irritation. But there seemed to be no arguing with the results. Day-Lewis would team up again with Sheridan for In the Name of the Father (1993), in which he played Gerry Conlon, an Irishman wrongly jailed with his father for terrorism, and The Boxer (1997), movies that capture the claustrophobic character of Ireland at the time of "The Troubles." The intensity of Day-Lewis's identification with the country is suggested by his decision to reaffirm his connection with the land of his father's birth by becoming an Irish citizen in 1987.

Following My Left Foot, Day-Lewis made an off-beat foray to Patagonia, where he played an itinerant dentist in Eversmile, New Jersey (1989), a British-Argentine production. He then embarked on a Royal National Theatre production of Hamlet. This turned out to be a disaster, not so much in terms of reviews (which were mixed) but in the emotional strain the role seemed to impose. There was much speculation at the time that the performance led Day-Lewis to dredge up unresolved feelings about his own father. In October of 1989, about three-quarters of the way through the run, Day-Lewis suddenly froze on stage. He never returned to the show -- or the stage.

It was at this point that Day-Lewis crossed paths with Michael Mann, a Chicago-born director who rose to fame on the strength of his fashionable 1980s television series Miami Vice. Mann was leveraging his commercial Hollywood power for a surprising project: a big-budget blockbuster version of James Fenimore Cooper's 1826 novel The Last of the Mohicans. Certain that Day-Lewis was perfect for the lead, Mann succeeded in recruiting him for it. In some sense, at least, it probably wasn't all that hard. Long a fan of American movies -- and a devotee of method-actor predecessors like Robert De Niro -- Day-Lewis was fascinated by the United States. "I didn't know America," he said in 2007, explaining why he went to see Taxi Driver over and over again upon its release in 1976, "but that was a glimpse of what America might be, and I realized that, contrary to expectation, I wanted to tell American stories." And so he did.

We now need to break the chronology of this story. Day-Lewis made a string of movies about the United States between 1992 and 2007, ranging from the seventeenth-century world of the Puritans in The Crucible (1996) to the aftermath of the 1960s in The Ballad of Jack & Rose (2005). Though they were not made in order, and though it's very unlikely there was much in the way of a conscious design, it is the contention of this chapter that they trace a thematically unified narrative arc. So we're going to follow them in the sequence of their settings, not their production. And begin, as well as end, in New England.

More to come.

Friday, November 19, 2010

Integration in shades of gray


James C. Cobb depicts a Dixie as American as . . . Lynyrd Skynyrd

The following review was posted earlier this week on the Books page of the History News Network site. 

Is the South -- still -- a place apart? Thirty years ago, in Place Over Time, Carl Degler argued for the persistence of a distinctive regional identity notwithstanding the successive waves of modernity that followed the "New South" of the post-Civil War era. Earlier in this decade, in Still Fighting the Civil War, David Goldfield argued that many Southerners insisted on seeing themselves as apart from the rest of the Union. Meanwhile, an array of scholars from Bruce Schulman to Michael Lind see recent American history as essentially a process of Southernization. In this ably written synthetic account of the region, veteran University of Georgia professor James C. Cobb shows how all these views can be seen as credible in a narrative trajectory that moves from that of a backward, isolated region to an assertive national presence. But for Cobb, the region and the nation have always been deeply intertwined.

The first half of The South and America moves at a brisk pace, describing the quickening effect of the Second World War on the region, the dawn of the Civil Rights movement, and the emergence of a steady -- and increasingly sophisticated -- strategy of resistance to it. We meet a familiar gallery of characters, from Gunnar Myrdal to Emmett Till, and a political spectrum that runs from the daring novels of Lillian Smith to the whites who said of their returning veterans, "Our heroes didn't die in Europe to give Negroes the right to marry our wives." (Those Negroes, for their part, had their own ideas about what they were fighting for.)

Cobb pays particular attention to the economic dimensions of the Civil Rights movement and its aftermath. He notes that the business community was anxious lest segregationist intransigence interfere with commerce, and that this concern was a factor in the willingness of many whites to accept, though not embrace, what was ultimately seen as an inevitable move toward racial integration. It did not take long, however, for corporate leaders to conclude that the new status quo was not only acceptable but necessary for the maintenance of a low-wage, low-regulation economic climate. A full-throated liberal, Cobb is insistent -- and convincing -- in tracing a persistent double standard on the part of white Southerners, who disproportionately benefit from federal spending while remaining reluctant to tax themselves, and who warn about the corrosive effects of welfare for the poor while distributing lots of pork to the rich. This hypocrisy is not unique to the South, but it is notably widespread there.

The second half of The South and America takes a more thematic approach, covering cultural, gender, and racial history. What this part of the book lacks in cohesion it makes up for in useful segmentation. It is also notably up to date, covering topics like controversies over the Confederate flag, gay rights, and the growing Latino presence in the region. Cobb occasionally betrays his generational roots (he's much better on pop music of the fifties than later, for example, omitting what would seem to be a necessary discussion of soul music, though OutKast does make a cameo). But there's plenty of grist for an undergraduate mill here. Cobb also shows a sharp eye for the resonant detail. It's moving, for example, to learn that while supporters of the vicious segregationist U.S. Senator James O. Eastland could not raise the funds for his portrait to hang in the U.S. Capitol, the once obscure Fannie Lou Hamer is rightly lionized as one of the true heroes of American history (there's a Bronx high school named for her a few miles from where this review was written, where her legacy -- and the social inequality that made her labors necessary -- live on).

By this point, the discourse on the place of the South, in the broadest sense of that phrase, is vast. Indeed, Cobb has spent the better part of a lifetime mastering it. The South and America is a useful place for the neophyte to begin.

Tuesday, November 16, 2010

A television appearance

Last summer, I taped a talk show for EBRU, a television network that stretches from New England to the Midwest, on the state of the American Dream. The show, Fresh Outlook, was broadcast recently. You can view it here.

Friday, November 12, 2010

The significance of "Significance"


A few observations from the post-Frontier frontier


The following post represents notes toward an essay on Daniel Day-Lewis, meant to be part of a larger work-in-progress.

Much to my surprise, I've been thinking a lot about Frederick Jackson Turner lately. Turner is to the historical profession what Sigmund Freud is to psychology: an emergent figure in the late nineteenth century who dominated the first half of the twentieth, but a figure whose ideas are now consciously rejected by just about everybody in their respective professions -- and unconsciously absorbed by just about everybody else. Turner's 1893 essay "The Significance of the Frontier in American History" is probably the single most important piece of historical scholarship ever published in the United States. Written at a time when the modern research university was just emerging in the United States, it was an example of a literary genre -- the analytic essay -- that was just coming into its own. Turner was one of the founding fathers of the American Historical Association (AHA) and its journal, the American Historical Review, which remains at the apex of the discipline.

A Wisconsin native, Turner first delivered "Significance" at an AHA meeting in Chicago, held amid the fabled World's Columbian Exposition, staged in that city to celebrate the 400th anniversary of Christopher Columbus's arrival in America. It seems almost comical to imagine the 32-year-old Turner (then, as now, young for a historian) standing at the front of a room talking to an indeterminate number of scholars while thousands of his fellow Americans were taking amusement park rides, surveying the huge temporary stucco buildings of the so-called White City, and watching the "exotic" foreign exhibits (like the "hootchy-kootchy" dance of a Syrian performer named "Little Egypt") at a site artificially lit thanks to the technological innovations of the Westinghouse Corporation. Yet the so-called "Turner Thesis" unveiled in Chicago would prove more durable than any of these fleeting material realities, in large measure because it was so succinctly stated at the end of the first paragraph of his paper: "The existence of an area of free land, its continuous recession, and the advance of American settlement westward, explain American development."

From the vantage point of over a century later, it may be hard to appreciate just how edgy an assertion this really was. Turner had been trained back east at Johns Hopkins, under the tutelage of the legendary Herbert Baxter Adams. Adams was a proponent of the then-dominant "germ" theory, which argued that western civilization owed its origins to the forests of Germany, out of which emerged a Teutonic character that swept western Europe, jumped to America, and now dominated the world. Like so much academic thought of the time, this approach to history was modeled on science, both in its new emphasis on empirical research and its use of a biological model -- more specifically a (Social) Darwinian model -- to explain historical change. Like his predecessors, Turner embraced a process-driven empirical approach to history (colleagues and students remember him as an obsessive collector of data and maps), and invoked science as fact and metaphor. But Turner's own inclinations were decidedly on the environmental side of the Darwinian equation: he was fascinated not by fixed destiny, but by protean adaptability. America was a place that did something to people, he said: it made them Americans. Which is to say it turned them into something new and unique in historical experience. And that's because they had lots of room to evolve through a renewable cycle of scouts giving way to traders, farmers, and capitalists in scattershot sequences that stretched from sea to shining sea.

Over the course of ensuing decades, the Turner Thesis itself evolved from maverick idea to common sense, ratified by Turner's appointment at Harvard in 1910. By mid-century, it had a wide impact on subsequent historians, like Walter Prescott Webb, an early environmental historian who drew on Turner's ideas. But in the second half of the century the thesis came under increasing attack on a variety of fronts. Some scholars questioned Turner's data; others its implications, particularly his assertions that the frontier was the engine of U.S. democracy. The most serious challenge came from those historians, notably Patricia Limerick, who rejected the racial assumptions underlying the very idea of the frontier and the implicit omissions involved in discussing "empty" land that was in fact inhabited by multicultural populations. In their view, Turnerism was little more than a racist fantasy.

I think I know two things at this point. The first is that these critics are surely right that Turner made some serious omissions in formulating his theory, omissions that reflect his limited perspective and that now make it impossible to accept the thesis at face value. The second is that it's only a matter of time before Turner makes a comeback in one form or another, because that's just the way these kinds of arguments go, particularly ones in which questions of nature versus nurture figure in some way, as they clearly do here. But I'm less interested in whatever objective truth there may be in the thesis than in its cultural power: how Turner described the frontier, what it meant to him, and how his conception of it entered the bloodstream of our national life.

The first point to be made in this context is something people don't recognize about "Significance": it's a gorgeous piece of writing, deeply imbued with love, an emotion that's not supposed to surface in academic writing but without which such writing is lifeless. "Significance" is a document with over 50 footnotes that cite a variety of scholarly sources, but its dominant mode is poetic. Take, for example, this stretch of prose, in which I hear Whitmanic accents:

The wilderness masters the colonist. It finds him a European in dress, industries, tools, modes of travel, and thought. It takes him from the railroad car and puts him in the birch canoe. It strips off the garments of civilization and arrays him in the hunting shirt and moccasin. It puts him in the log cabin of the Cherokee and Iroquois and runs an Indian palisade around him.

Turner's acute visual sensibility is also evident in an ecological analogy: "As successive terminal moraines result from successive glaciations, so each frontier leaves traces behind it." And in a bibliographic one: "The United States lies like a huge page in the history of society. Line by line as we read this continental page from West to East we find the record of social evolution . . . Particularly in eastern states this page is a palimpsest. What is now a manufacturing State was in an earlier decade an area of intensive farming." Turner's frontier is a living thing: it "leaped" over the Allegheny Mountains, "skipped" over the Great Plains, and "pushed" into Colorado, with "tongues of settlement" reaching into the wilderness. Ever situated in specific locations, it is ever restlessly moving forward, recreating itself in what Turner called "this perennial rebirth, this fluidity of American life." My favorite expression in this regard is Turner's evocation of the Mississippi River: "On the tide of the Father of Waters, North and South mingled into a nation."

Turner did not consider the frontier an unalloyed good. While he viewed it as a truly nationalizing phenomenon -- people living near frontiers tended to cast their lot together, while those far from it took on stronger regional identities -- as well as a wellspring of democracy, he also recognized that a frontier mentality tended to resist even benevolent forms of outside control and foster a grasping materialism. It also led to a lax approach to government that fostered the creation of a spoils system. Moreover, Turner clearly understood, even if he didn't dwell on it, that the extension of the frontier was a matter of conquest, for which he used the correct imperial term of "colonization."

But the biggest problem Turner has with the frontier in 1893 is that it's dead. He makes this clear in the first sentence of "Significance," which discusses recently updated information from the U.S. Census Bureau indicating the disappearance of an unbroken frontier line in the American West, a development he described as "the closing of a great historic movement." What the Mediterranean was to the Greeks, the frontier had been to the Americans, he concluded. "And now," he wrote in a closing sentence laced with melancholy, "four centuries from the discovery of America, at the end of a hundred years of life under the Constitution, the frontier has gone, and with its going has closed the first period of American history." The Turner Thesis, in effect, was the frontier's obituary.

What would take its place? Turner did not say. As Richard Hofstadter, himself a distinguished historian, would write 75 years later, the latent pessimism of the frontier thesis was in sharp contrast to the ebullient optimism attributed to frontier communities. But while Turner never offered an alternative -- indeed, he had considerable trouble writing books, and while he eventually articulated an influential view on the role of sections in American history, he never quite realized the huge potential suggested by "Significance" -- his politics were considered generally consonant with those of his friend and colleague Woodrow Wilson, who of course became President of the United States and a leader of the Progressive movement. Turner himself has long been considered a Progressive historian in the mold of his contemporaries Charles Beard and Vernon Parrington, which generally meant an empirically-minded, skeptical approach to inherited intellectual constructs and a belief in the power of ideas to improve society. For such people, the frontier was less a living reality -- as it had been for the previous generation of political reformers, the Populists -- than a metaphor that denoted opportunity on a large scale in a new domain. When, following the death of his wife in 1884, Theodore Roosevelt went out to the Dakota Territory to play the role of a cowboy, he proved to be a financial failure as a cattle rancher. But the episode paid vast dividends in advancing his own political career as a maverick in the nation's capital. Roosevelt, for his part, praised Turner in 1893 for "having put into shape a good deal of thought that has been floating around rather loosely."

The frontier remained fertile symbolic terrain for much of the twentieth century, nowhere more obviously than in the 1960 presidential campaign of John F. Kennedy, whose slogan was "The New Frontier." But its appeal went a good deal beyond politics, evident in the rhetoric of the space program as well as that of the Internet. Nowhere, however, was its appeal more evident than in U.S. cultural life. Turnerism is the bedrock of assumptions for the whole genre of the Western, for example, and the Western, in turn, is the seedbed of other cultural genres stretching from sci-fi to hip-hop. Along with the legacy of slavery, the frontier is what makes American culture American.

But if the people of the early twentieth century experienced what they considered the transformation of the frontier from reality into myth, we are presiding over the transformation of myth into memory. Now the belief in the frontier as a living symbol is itself receding in our imaginations. The proximate cause is our current economic situation, which has cast doubt on the upward mobility that so many of us have so long considered our birthright, and which is so deeply intertwined with our sense of a frontier. This sense of doubt is not new; it has recurred periodically throughout American history, as it did in the Great Depression and amid the political scandals and economic stagflation of the 1970s. But the perception that this doubt is part of a larger narrative of geopolitical decline has become increasingly impossible to ignore.

But I am speaking in terms of elusive abstractions now. A notion of decline is surprisingly slippery, subject to multiple definitions and conflicting data. Nor is it necessarily an objective reality; I'm more interested in a perception of decline as a historical artifact than I am in ascertaining its accuracy (and its maddeningly elusive timing). In recent years my apprehension of it has gradually taken shape through a set of six films about American history that have been released in the last twenty years. The thread that connects them is a British-born, adoptively Irish actor named Daniel Day-Lewis.

More to come.


Monday, November 8, 2010

Taylor-made

Swift steps up with Speak Now

My principal point of access to the inner life of adolescent females has long been pop music. Not an ideal one, perhaps. But as a demographically challenged high school teacher who nevertheless hopes not to be entirely clueless -- or rely entirely on the now-dated Clueless -- for information about people with whom I work every day, I have to take my intelligence where I can get it.

Fifteen years ago, Alanis Morissette's Jagged Little Pill showed me that hell hath no fury like a teenager scorned. "You told me you'd hold me 'til you died/But you're still alive," she raged at a former lover with an angry amazement that I still find amazing (and very funny). Yet Morissette also gave us the jaunty confidence of "Hand in My Pocket" and the wry wisdom of "You Learn." In 2002, Avril Lavigne brought peer relationships to life in the vivid exasperation of "Complicated." The omnipresence of these songs at the time of their release, and the joy with which they were greeted when I heard them in the company of teens, told me that they connected with kids on a deep level. As they still do for big kids like me.

Last year I belatedly discovered the charms of Taylor Swift. Swift lacks the bracing edginess of the young Morissette and Lavigne, but the saccharine way her music is packaged cannot dim the emotional clarity of her songs. Sometimes this clarity takes the form of full-throated joy, as it does in her hit "Love Story," its yearning most potent in the way she conveys aching in unarticulated sounds, or the way her voice catches on the word "real" (as in "this love is difficult/but it's re-eal"). But Swift does anger well, too, as attested by "Should've Said No" or the frustration of "You Belong with Me," all the more compelling because such expressions of pain come from nice girls who finally decline to shut up.

This struggle to overcome diffidence is the motif that runs through Swift's new album Speak Now. I picked it up recently at my local Starbucks, a fact which suggests that Swift's tide extends all the way to CD-buying dinosaurs. My expectations for this record were relatively low. By definition, ingenues don't age particularly well, and if emotional complexity is the goal, Emmylou Harris is my go-to singer-songwriter (and Liz Phair my favorite peer). But I enjoyed Swift's last album, Fearless, so much that even a few good songs would make an impulse purchase worthwhile.

I was right. There's plenty of forgettable music on this album, whether in the cloying "Never Grow Up," or a string of familiar sentiments that run through songs whose very titles ("Haunted," "Last Kiss") are redolent of cliche. Not coincidentally, perhaps, they dominate the second half of the album and suggest that it perhaps should have been shorter. But there's some really terrific music here too, music that suggests we may be at the start, rather than the end, of a pop music career.

The best songs on Speak Now are those that manage to fuse some of the emotions Swift expressed so vividly in her earlier music and in so doing give her music a new sense of texture. The opening track on the album, "Mine," begins with melismatic expressions of pleasure and chiming guitars, but the mood of glee that suffuses the song -- "you made a rebel of a careless man's careful daughter" -- is anchored by an almost grim satisfaction in the knowledge that this character's life could have turned out very differently. Conversely, the anguish of looming breakup in "The Story of Us" is all the more potent because the woman who is desperately trying to reach an emotionally barricaded partner is incredulous that the narrative she's written in her head has gone so far awry. "The end," she says in disgust as the song cuts off.

But the best song by far on Speak Now is the title track. Like "Love Story," there's a cinematic quality to the piece, in which a young woman crashes a wedding in an effort to convince a groom to abandon the altar. What's striking, even potentially worrisome, is just how bitchy this character is, snarkily taking note of the pastels in the wedding party and observing that the bride looks like a pastry. (So this is how girls talk.) But this particular girl -- this woman? -- is utterly irresistible and she knows it, like a gender-inverted version of Dustin Hoffman at the end of The Graduate.

But let's be clear: the principal asset of "Speak Now" is the music, which shimmers like an early Beatles song (and which, if I'm not mistaken, features a hand-clapping track, just as so many great early Beatles songs, like "I Want to Hold Your Hand," do). And here we have to give Swift credit for a decidedly adult dedication to craft, because she and co-producer Nathan Chapman have cultivated a genuine gift for catchy songs. Whatever reservations one may have about autobiographical acts of reprisal, like "Mean" or "Dear John," their hooks are impossible to shake. (If there was any doubt that the latter is addressed to former paramour John Mayer, the bluesy guitar part serves as a pointed finger.) Swift entered the mainstream through a Nashville channel, but there's a surprising rock muscularity running through much of the album.

To say that Taylor Swift has room to grow is partially to make a complaint: her artistic range remains fairly narrow. Moreover, her cattiness may yet grow stale. But Swift's talent and ambition remain abundant. She's moved quickly, yet she's built for the long haul. Long may she run.

Wednesday, November 3, 2010

Queen C


In Cleopatra: A Life, Stacy Schiff renders a thoroughly modern monarch

Over the course of the past fifteen years, Stacy Schiff has emerged as one of the nation's most esteemed biographers. With France as a geographic crossroads, her subjects have ranged widely: Antoine de Saint-Exupéry, Vera Nabokov (a portrait that won her the 2000 Pulitzer Prize), and Benjamin Franklin. But Schiff's latest book takes her far afield in time and place. It's an audacious move, and as such is a form of fidelity to the life she limns.

In a way, Schiff's body of work is less a set of individual lives than an extended exercise in different kinds of biographical problem-solving. Saint-Exupéry was a shrewd choice of subject in that he's both famous and little-known to the general reading public that is Schiff's chosen domain; Vera is a foray into the fascinating life lived in the shadow of a powerful mate. Benjamin Franklin, by contrast, is almost too well-known (an issue Schiff finessed by focusing on his diplomatic career). So is Cleopatra; but while the problem for Franklin is essentially one of too much documentation, that of Cleopatra is a matter of having so little.

But of course this is also an opportunity, because the ambiguities surrounding Cleopatra's life give a biographer lots of license for informed speculation, a stratagem Schiff employs frequently and boldly. (Was Caesarion really Julius Caesar's biological child by Cleopatra? Schiff acknowledges this long-running controversy in a footnote, but considers the child his and moves on.) In an important sense, the facts are really beside the point anyway; if ever there were a case where the truth resides in legend, this would be it.

But living legends are moving things. So it is that Schiff gives us a Cleopatra for the third millennium. I was not surprised to learn that Angelina Jolie is already said to be involved in a possible film version of the book; I kept thinking of her while reading it (perhaps because of the evocative jacket). Schiff is insouciant in deploying anachronism for dramatic effect, whether in comparing Cleopatra's wealth to that of the most successful hedge fund manager in history, or by emphasizing her Hellenic heritage by asserting that she was about as Egyptian as Elizabeth Taylor.

It's only natural, then, that this Cleopatra is a feminist icon: intelligent, powerful, and patriotic. She uses her sexuality, but she isn't defined by it. Faced with a difficult geopolitical situation, she navigates it not infallibly, but nevertheless with an acumen that has largely escaped previous writers -- who are, to Schiff's credit, often classical ones. But she's not one to defer to antiquity, and she's pointedly critical of Cleopatra's critics. "Cicero had two modes: fawning and captious," she says, calling him a Roman John Adams. (Given Schiff's Francophile orientation, we can safely conclude the comparison is not flattering.) Plutarch "sniffs" that Cleopatra pretends to be in love with Mark Antony; Dio's account of her meeting with Octavian is "so cinematic as to be suspect, too purple even for a Hellenistic queen." (A somewhat odd criticism coming from Schiff, who is nothing if not colorful.) Some readers will no doubt be thrilled with such prose; others may be unsettled by the glee with which she settles sexist scores -- which, of course, will not make her unhappy. But it may not be an accident that none of the impressive promotional accolades that accompany the book come from Egyptologists.

This Cleopatra is thoroughly contemporary in other ways as well. Schiff filters her material through a postmodern sensibility. Whether or not Cleopatra actually believes she's the incarnation of the Egyptian goddess Isis, playing the part certainly confers strategic advantages. A sense of indeterminacy shapes relationships in which the personal is political; as far as Julius Caesar is concerned, "Cleopatra was in many respects similar to her country: a shame to lose, a risk to conquer, a headache to govern." (The ironically allusive language here, which evokes "veni, vidi, vici," is one of the many ways Schiff shows herself to be a master prose stylist.) Even those parts of the book that are presumably meant to showcase the distance between the ancient world and ours, like those that talk at some length about the literally incestuous character of the Ptolemy dynasty of which Cleopatra was the culmination, exhibit a multiculturalist's hearty embrace of Difference.

It is ironic, then, that the truth apparently remains: in front of every great woman is a man -- or, in this case, two: Caesar and Mark Antony. At different points in the story it almost seems they're going to run away with the book. Partly this is a function of the fact that Hellenistic Egypt was closer to a pawn than a queen in Roman geopolitics. And partly it's a function of the fact that most of the extant sources are Roman, not Egyptian, and as Schiff notes, echoing observers like Edward Said, the West has long been constructed as masculine, and the East as feminine. Sigh, one might plausibly conclude. It's still a man's world.

Early reviews of this book have hailed it as myth-shattering, as uncovering lost truths about Cleopatra. It is probably more accurate to see it as a feisty piece of mythography that resonates with the spirit of its time. As another Francophile once said, "the earth belongs to the living." And Cleopatra, whoever she was, is now -- amid the struggles and uncertainties of history that provide the indispensable ballast for myth -- ours to claim. Schiff has done so with gusto.