Monday, August 31, 2009
Reading books in a time of technological transition
Like a lot of devoted readers, I reacted to the roll-out of Amazon's new Kindle two years ago with a sense of wary fascination. As someone with a deep feeling for bound books -- and as someone who quite rationally considers the book the most powerful and user-friendly media technology ever invented -- I wasn't crazy about the idea of the printed page becoming obsolete. On the other hand, there were features of the Kindle, like the ability to download books instantly and store multiple books on the same device, that I found undeniably arresting. And I'll admit that I liked the idea of finding myself at the vanguard of a technological revolution (for once).
So at various points in the last two years, I was on the very cusp of making the leap into the new format, particularly when the Kindle 2 debuted earlier this year (of course by then I'd largely forfeited my media vanguard bragging rights). Even the well-publicized limits of the Kindle in terms of Amazon's control of downloaded books and the inability to move them across e-book platforms didn't deter me; I actually kind of liked the idea of having an electronic device solely devoted to reading. Nor did Amazon's involuntary recall of downloaded copies of George Orwell's 1984 (of all books!). No new medium comes without a price, and the old one would be around for a while in any case. Besides, books are always being banned, confiscated, and getting people into trouble. I'd already made the decision to go electronic with textbooks in my U.S. history survey course; now, I decided, it was time to go the rest of the way with books to be read for pleasure.
As it turned out, the biggest obstacle I had in acquiring a Kindle was the Kindle itself, which I would periodically have a look at courtesy of friends who had acquired them. No one would ever call it a lovely piece of machinery. Whatever Amazon's claims to cutting-edge innovation, it's hard to shake the perception that the Kindle is little more than a gray piece of plastic, even if you can't plausibly call it cheap plastic. I understand the bona fide achievement in developing non-backlit electronic ink, but that too is depressingly gray. I will admit here to being an avowed sensualist when it comes to books -- touch, smell, even the sound of flipping pages matters to me. And I do judge books by their covers, without apology. All of these qualities get bleached out of existence with a Kindle. Even at $9.99 per hardcover, modernity seemed to have a very stiff price.
And yet for all this I was still going to make my move and hope for the best when I read Nicholson Baker's recent piece on the Kindle in The New Yorker. I began that piece with a sense of skepticism; I'm always interested to hear what Baker has to say, but have long regarded him as a bona fide weirdo in terms of his off-beat take on subjects ranging from phone sex to the coming of World War II. But he confirmed a lot of my still inchoate unease about the Kindle, and made a surprising case for the iPod Touch as a better way to make a transition to e-book reading.
I resisted that advice. I also seriously considered staying on the sidelines pending the introduction of a new wave of devices, among them a larger reader to be unveiled by Plastic Logic as well as larger Apple e-document readers to be pitched to the academic and business market in the coming months. (Kindle also has a larger version, the DX, though the reviews are no better than mixed.) In the end, I decided to take the plunge and go for the iPod Touch. In my Verizon family of multiple lowest-common-denominator cheap-o cell phones, the iPhone was too rich for our blood. So an iPod Touch was the next best thing.
There were three reasons why I finally went this route. The first is that I decided smaller was better. I liked that an iPod would fit in my pocket, being even smaller than a rack-sized paperback, while its screen is in the end not that much smaller than the Kindle's, much of whose bulk is actually devoted to its casing, keyboard, and pagination buttons, none of which appealed to my chronically carpal-tunnel-ravaged hands. Baker claimed that neither the size nor the backlighting of the iPod Touch proved to be a problem -- in fact for him the latter was a genuine asset, particularly when reading beside a sleeping spouse -- and I have found he's right on both counts. (It's staring at a screen at a distance, not reading something in your hand, that I find difficult.)
Second, I realized that a multi-tasking device really does make more sense. Being able to play music and games and browse the web all on the same device is great, even if reading books on that device was at best an afterthought in its design. That Apple CEO Steve Jobs would famously assert last year that "people don't read anymore" continues to bemuse me. But he may soon laugh that one off all the way to the bank.
Finally, as has been widely noted, the iPod Touch is a beautiful object, both in form and function. Crucially, that elegance exists in multiple formats when it comes to reading e-books. Amazon, already half-conceding that the Kindle may lose the race to become the go-to hardware option, now offers free Kindle software for reading on iPods and iPhones. (Even if the Kindle ultimately flops as a piece of hardware, the software is the gateway to untold riches as a means of selling e-books.) So does Barnes and Noble. Other purveyors, notably Stanza, offer readers that are widely considered better. There are various features and trade-offs involved in any reader when compared with another, but they all offer a legitimate reading experience, and they all allow you to turn pages with a painless swipe of a thumb.
I got my iPod Touch just before leaving on a family vacation, which took me to a New England location where I did not have easy Internet access. This limited my ability to play with my new toy (which of course my kids stole from me anyway), though I didn't mind because I had a stack of old media books to work through. As it turned out, my Barnes & Noble e-book reader came pre-stocked with Jane Austen's 1811 novel Sense and Sensibility, which I'd never read. I mentioned this to my wife, an Austen fan who has read all her work, and she suggested we read the book together -- she the old-fashioned way she has no intention of surrendering, and me on my new device. Since her multiple copies of the novel were at her office, she dropped by our local Barnes & Noble and bought another, in what turned out to be a Barnes & Noble edition. So it was that giving away content for free redounded to a retailer's (and publisher's) benefit. I'm certain this is not a unique scenario, even as I understand that electronic books are very likely to do to the publishing business what music downloads did to the record business. (Why do you think that I, a ten-time book author, am now blogging away?)
I always assumed that reading on an electronic device would take some getting used to, though I got comfortable with e-book reading surprisingly quickly. (In another ironic twist, I found myself comfortably ensconced in a bookstore section of a Massachusetts supermarket, reading my e-book while my kids watched a video and my wife picked up catfish for dinner.) I developed the requisite interest in the marital prospects of Elinor and Marianne Dashwood, thinking, as so many Austen readers already had, about how much of her fiction is really about the vagaries of real estate. But I also found myself repeatedly wondering what Austen herself would make of e-books, and the way her body of work -- which she published anonymously in her lifetime -- will extend her global fame. No doubt she would be pleased, but no doubt too she would find it very, very strange, and perhaps not entirely appropriate.
I think this question has special significance for a writer such as Austen. She came of age in the late eighteenth century, when the novel had emerged as the premier form of literary expression in Western society, and book publishing was about to become a truly mass medium. Books were physical objects, and prized as such, but they were mass-produced physical objects, whose uniqueness was now, in contrast to the illuminated manuscripts of the medieval era, a contradiction in terms. Today, Jane Austen's artistry exists as an abstraction that may get embodied in multiple forms in multiple media. In a way, she's become more like a composer such as her contemporary, Beethoven. Beethoven's music exists as scores on paper, records on plastic (of various kinds), wavelengths over the radio, and live performances. Austen's work is embodied in books, audiobooks, e-books, and a veritable cottage industry of movies. (In my teaching I've paired her 1816 novel Emma with the 1995 Amy Heckerling film Clueless, which is a very clever form of homage to the novel.) Austen is more insubstantial than ever, and yet more potentially immortal than ever. It's all a little intoxicating to contemplate.
But like many intoxicating perceptions, there's probably less to this than I think. Here Amazon.com's withdrawal of Orwell's 1984 is a reminder that what the web giveth the web can take away. But for now, at least, let's end on a note of hope. A new day is coming -- it's not here yet, but it's coming. Let's greet it with open minds and open hearts. And in the meantime, let's turn real pages as well as virtual ones.
Friday, August 28, 2009
In Runaway Dream, Louis Masur draws a series of concentric circles around Bruce Springsteen's landmark album Born to Run
The following review was published yesterday in the book section at the History News Network. This is the second in a trilogy of Springsteen posts. To see the last one, click here.
We are now in the second generation of commentary on the life and art of Bruce Springsteen, a man now widely considered one of the defining artists of our time. The first generation of Springsteen commentators was dominated by a new kind of writer who emerged in the late 1960s and who played a surprisingly large role in shaping popular taste for the next quarter century: rock critics. These figures, who include Robert Santelli, Greil Marcus, Robert Hilburn and future Springsteen biographer Dave Marsh, were there for the creation, as it were (as was Ariel Swartley, who, along with Ellen Willis, was among the handful of women who elbowed their way into this men’s club). They witnessed Springsteen on the club scene, reviewed the early albums as they were released, and discussed Springsteen’s work in high-profile venues like Rolling Stone. These people did not all feel the same way about Springsteen, but they all took him seriously. And he took them seriously, granting them access in a way that would become increasingly rare.
Beginning in the 1990s, a second wave of writers, many with academic training (myself among them), emerged to produce a body of Springsteen scholarship. Their backgrounds have varied. Liberal journalist Eric Alterman has a Ph.D. in History; Daniel Cavicchi, author of a fine study of Springsteen fans, is an ethnomusicologist. Psychologist Robert Coles published an underrated oral history of listeners; Colleen Sheedy, a curator, mounted a traveling exhibition of visual art about Springsteen. By and large, this cohort of commentators is younger (Coles, who will turn 80 later this year, is a generational outlier). They were children or adolescents when Springsteen came of age. Lacking the experience and access of the rock critics, their work tends to be contextual.
Louis P. Masur falls into this group. A cultural historian with a penchant for choosing specific subjects (the year 1831, baseball’s first world series, a famous photograph from the Boston busing crisis of the 1970s) and situating them against the larger background of their times, he combines scholarly rigor and journalistic accessibility. These talents are vividly on display in Runaway Dream, which uses Born to Run as ground zero for understanding Springsteen's career as a whole.
Perhaps not surprisingly, Masur breaks little new interpretative ground. Although this is the first book devoted to the subject, Born to Run has been the subject of extensive commentary, notably by Springsteen himself, who released a richly documented thirtieth anniversary edition in 2005 (and, one can see in retrospect, has been positively cunning in crafting master narratives of his work that he dispenses to media outlets when he releases albums). An avowed fan, Masur sees Born to Run as a shot of adrenalin in the culturally enervated 1970s, when economic malaise, political exhaustion, and commercial hucksterism dominated the national mood. What he also sees, whether by virtue of his perspective as a historian, the benefit of hindsight, or both, is that the questions the album raises about love and aspiration do not simply capture a moment in time or a period in one's life, but also provide an opportunity for recurrent reflection that's more than memory or nostalgia. As Masur explains, "Born to Run was connected to the times but also timeless, built to last, so that thirty years later it still speaks to those who were eighteen when it came out, and those who turned eighteen in 1985, 1995, or 2005."
Runaway Dream draws a series of concentric circles around Born to Run in a set of six brief chapters, beginning with the circumstances of its creation and rippling outward to the legacy of the album 35 years later. Of particular note is his chapter on “The Geography of Born to Run,” in which he interprets the concept of place broadly – in his analysis, night is a “location” – and shows how it operates on the album. (Masur presented this material as a work in progress at “Glory Days: A Bruce Springsteen Symposium” in 2005; the second such symposium will be held in September 2009.)
The unanswered question at this point is how Springsteen will fare in the coming generation. We’re at a point now when it’s possible to imagine good work produced by writers born post-Born to Run. Will they maintain the high degree of enthusiasm that has marked the first two cycles of Springsteen scholarship? How will their Springsteen be different from ours? Will he have the durability of a Woody Guthrie, a Billie Holiday, or an Elvis Presley? It is perhaps the uncertainty surrounding these questions more than anything else which will determine whether the conversation around Springsteen remains interesting. In the meantime, those of us who are living with Springsteen will go on trying to make sense of what has now become the experience of a lifetime.
Wednesday, August 26, 2009
In which we see Ms. Bradstreet navigate her way through a curricular block
The Maria Chronicles, #7
"Damn! This traffic jam/How I hate to be late/Hurts my motor to go so slow . . . ."
Maria is stuck on the parkway. She thinks of the James Taylor song she loves because the radio is playing "You've Got a Friend," which she hates: too maudlin. She punches the radio off. It's 7:49. Under normal conditions she'd be at school in about 15 minutes. Still time before the first class at 8:30, but this will cut into her zone to set up, adjust, and generally decompress.
She thinks about other things that are irritating her. Her hair. The fact that she didn't stop at Dunkin' Donuts, which means she's marooned without coffee. The fact that she hasn't heard from her son Evan.
And then there's her ongoing inability to resolve her assessment problem in the colonial history unit in her survey class. It's a long stretch, about four weeks. At the end of it, Maria intends for the students to compare empires and/or colonies in the Americas. The problem is what comes in between. She needs time for them to get information, from her and from the readings. The question then becomes what to get them to do with it. Maria hates assigning busy work -- worksheets, mini-essays, and the like. The kids resent it, and it loads her up with grading when she's trying to juggle all her other courses. A quiz doesn't really make sense to her, either. The colonial era isn't like, say, the sequence of events leading up to the American Revolution, where keeping facts straight matters. It's a big stretch of time, and the point is to sketch broad themes that will stick. Maria herself loves the period, but she also knows it's the most likely to seem remote to the students. Back at Derry High, there was a map assignment that was pretty good, but Maria is restless with it. Still, she may just have to go back to it.
Maria crawls past the exit for the convention center. It's now 7:52. She decides she will panic in eight minutes. Then, coming around a bend, she sees the problem. Accident up ahead in the left lane. Thank God. The end is in sight.
Hmmm. Convention center. Maybe she could do some kind of convention? A colonial meeting of some kind. Maria begins imagining a character. She'll call him Nicolo Pragmatica, a Jesuit priest. Father Pragmatica has distinctly ambiguous relations with the Vatican, as Jesuits so often do. Let's say it's 1650, right after the Treaty of Westphalia, which has finally brought stability to Europe. It's Fr. Pragmatica's idea -- his motives are murky -- to finally do something comparable for America. To that end, he's invited Spanish, French, Dutch, and English teams in an unofficial capacity (though these will be people hoping to have influence with their governments) to his estate in Tuscany. As a result of some intense back-room jockeying, representatives from the Huron and Iroquois have also been invited to the parley. Father Pragmatica will host an elegant dinner, with spaghetti and his best Chianti, and ask members of the teams just what their long-term goals in America are. The Spanish will no doubt talk about security for their southern possessions, the Dutch their trading interests in New Amsterdam, the Huron their friendship with the French and skepticism about the English, and so on.
The traffic is beginning to pick up momentum. CRV ditched by the side of the road. Police, two passengers milling around on cell phones.
Maria resumes her train of thought: This will kill a lot of birds with one stone. Simulations are fun. It will get the kids to do group work. Some light research -- she'll expect them to rely on the textbook and whatever they can pick up on Google as homework. They can IM if they can't get together. She'll tell them she's giving them group grades, and she will. But this will also allow her to identify class leaders, as well as who's passive. Depending on how it goes, she might prod a student or two, or purposely sow dissension in a particular delegation (maybe she can use her cap and gown from her master's degree to play Fr. Pragmatica). Be a tight squeeze; she can't make it more than a night's homework or give more than a class session to it, or maybe a class and a half, which might be a problem. But maybe she'll drop that draggy DVD on King Philip's War she used last year.
The scenario will need work, and Maria isn't certain she'll be able to fit it in. But it's worth a try. Maybe something to tinker with tonight. For now, though, there are more immediate issues -- here's her exit. She just hopes to God she can make the Smart Board work.
Monday, August 24, 2009
In which we see Ms. Bradstreet conduct a wearing conversation (with herself)
The Maria Chronicles, #6
Now she's having second thoughts. She was briefly tempted by the sleeveless black blouse -- probably the coolest top she has, in more ways than one -- but decides this is too much, too soon. Actually, she thinks she should dial back the gender statement entirely and go with the gray slacks, which are also on the bed. Goes just fine with the new red blouse she's got. Has a new black belt, too. Shoes a bit of a problem, but there are a couple pairs in her tiny walk-in closet she could dig out. Certainly a safe choice . . . .
"Are the men doing this right now?" Maria says aloud, irritation in her voice. She realizes this is a silly thing to say on a number of levels. For one thing, there are only a handful of people, male or female, about to start the first day of classes at Hudson High. For another, she's got pretty clear pictures in her head of her soon-to-be ex Mark, as well as her son Evan, gazing at mirrors, asking for advice, and complaining about how a particular shirt looks. Neither one of them was particularly vain -- though, come to think of it, Mark got a little more so over time -- and neither was she. But you can't really escape this stuff, she thinks, pinching her belly. One of the ironies of feminism is the way men now get objectified the way women have all along. Welcome to our world, guys. So much for "progress."
Maria adjusts her gaze to the alarm clock at the head of the bed: 6:42. Time to decide. No point in getting agitated about being agitated: it's built into the situation. What does she really want to do? Wear the khaki skirt. Maria remembers all those exams she's graded in which the student chose the correct answer, changed it to something else, and lost points. Let's go with the gut, she thinks. She grabs the slacks and blouse she's not wearing -- actually, they look pretty good together, maybe tomorrow? -- and lays them aside on top of her hamper. She pulls some new underwear out of her dresser drawer. Gotta get moving. A little makeup, some work on her hair, a little breakfast, a look at the Times, out the door by 7:30. She wants to have lots of extra time today.
Speaking of breakfast -- should she go light, or eat something filling that will get her through the morning? Maybe a stop at Dunkin' Donuts? Coffee yes, doughnut no? What does her gut say now? "Oh Maria," she says aloud, "sometimes it's impossible to live with you!"
Friday, August 21, 2009
In Creating a Nation of Joiners, Johann Neem reveals the nation's often overlooked love-hate relationship with voluntary organizations
The following review was posted yesterday in the book section at the History News Network.
From publication of the first volume of Alexis de Tocqueville’s Democracy in America in 1835 to that of Robert Putnam’s influential Bowling Alone in 2000, Americans have long been understood to be, in the memorable 1944 words of Arthur Schlesinger, “the world’s greatest example of joiners.” Tocqueville and Schlesinger celebrated this national trait; Putnam feared its disappearance. But all three observers, and many others, took it for granted that joining was a good thing.
Johann Neem doesn’t necessarily disagree. But in Creating a Nation of Joiners: Democracy and Civil Society in Early National Massachusetts, he shows that all these figures reflect a longstanding collective amnesia about U.S. history. Americans, he says, became a nation of joiners in fits and starts, with a good deal of anxiety on the part of players in the political system. And the fears they had stemmed from concerns that were, and are, by no means trivial.
Neem begins his story in the wake of the American Revolution. The Federalists who dominated both Massachusetts and national politics understood themselves as representatives of a new people who had constituted themselves in protest against a corrupt British empire. Since the new government was the embodiment of the people, any group that sought to advance a parochial interest was, by definition, against the people, an idea that was strengthened in the wake of the Shays and Whiskey Rebellions of the 1780s and '90s.
The idea of government-as-people continued to be challenged in this period, albeit in less violent ways. Opposition to the Federalist regime in Massachusetts took the form of political clubs and other kinds of organizations, gathering momentum as Jeffersonian Republicans came to power nationally circa 1800. Once they did so in Massachusetts, Federalists effectively made a U-turn, an opportunism most obvious in their attempt to move control of Harvard College, at the time a public institution, away from elected officials. In the decades that followed, Harvard, along with newer colleges like Williams and Bowdoin, became political footballs as politicians, alumni, and ministers jockeyed for power over them.
One might think that self-interest would have led the Republicans, and their Democratic successors, to embrace the chartering of government-sponsored bodies like banks, bridge companies, and other corporations. And to some degree, this was true. But suspicion of concentrated power was a core tenet of the Jeffersonian tradition, and not one that yielded easily to the often confusing exigencies of antebellum politics. Indeed, many Democrats viewed the new party of Andrew Jackson as the locus of public will; all else was in effect a special interest. This was true even as grassroots organizations sprang up in Massachusetts and elsewhere to promote causes that ranged from Antimasonry to sabbatarianism (and, later, temperance and abolition). Democratic partisans were not necessarily hostile to the goals of these movements, but they were often skeptical of, if not explicitly opposed to, their increasingly effective organizational acumen.
Which is not to say that Whigs were necessarily happy with such associations, either. To the extent that they made their peace with non-government organizations – which, ultimately, they did with more intellectual clarity and operational cohesion than Democrats – they were avowed elitists who explicitly tried to build in electoral circuit breakers in the form of civic boards, credentialing systems, and other methods of constraining majoritarian electoral tyranny. An excellent case study, limned by Neem, is the career of Horace Mann, who tried to overcome the localism and political intervention endemic to public education to fashion a proto-Progressive teaching profession marked by state-wide standards and insulation from political patronage.
The success of such efforts, in turn, prompted some Democrats to take the somewhat counterintuitive position that U.S. society needed more, not fewer, chartered corporations and other bodies to create competition and check the growth of concentrated power represented by figures such as Mann. (Such logic, rightly attributed to the Anti-Federalist Elbridge Gerry, calls to mind that of James Madison in the Federalist #10, which, curiously, goes unmentioned here.)
By the end of the antebellum era, voluntary associations were here to stay, much as political operatives may have loathed them (which both parties certainly did in the case of Garrisonian abolitionists). But while a durable consensus took root about the legitimacy of activities that ranged from public-interest lobbying to commercial incorporation, the tensions first grappled with in the early national era remain with us still. Indeed, it has been remarkable to see how partisans of the left who mobilized in favor of Barack Obama last year have been wringing their hands over protests on the right over health care legislation this year. Clearly – or, more accurately, not-so-clearly – one man’s grassroots organizing is another woman’s astro-turfing.
Creating a Nation of Joiners is an example of that beleaguered publishing-industry genre, the academic monograph, at its best. (In a spirit of disclosure, I will mention that the book is dedicated in part to my father-in-law, Professor Theodore R. Sizer of Brown University, a fact that biases me in its favor, but one I did not realize when I embarked on the review.) Exhaustively researched and yet tightly focused, it is a notably resonant piece of scholarship.
I will say that I would have liked to see a bit more in the way of context. While Neem is nimble in navigating the complex ecclesiastical terrain of New England – particularly in his discussion of the strange bedfellows of evangelical-minded Democrats and Orthodox Congregationalists, who teamed up to disestablish religion in Massachusetts – I found myself wondering if or how the political fault lines he describes had their origins in the separating/non-separating tensions in seventeenth-century Puritan churches. And while I understand his decision not to trace the well-known emergence of the Second Party System of the early nineteenth century, I wished I had a better sense of the degree to which Massachusetts politics reflected or diverged from the nation as a whole. (Were Democrats really as strong there as they were in New York or Tennessee, for example?) But these are minor reservations in what is an elegantly conceived and executed book.
Wednesday, August 19, 2009
Notes toward a keynote address on the multifaceted legacy of a landmark album, 25 summers later
Part I of a trilogy of Springsteen posts
Next month, Virginia Tech University will be sponsoring "Glory Days: A Bruce Springsteen Symposium" at a series of locations near Asbury Park, New Jersey. This is actually the second Springsteen symposium; the first, held four years ago, drew hundreds of participants from around the world. (I had a lovely lunch with a Springsteen fan from
As you could probably guess, this is one of those titles that got cooked up months ago, before I had the faintest idea of what I wanted to say. The conference organizer noted that this year marks the 25th anniversary of the release of Springsteen's landmark album Born in the U.S.A.
One of the ironies for me in this particular talk is that Born in the U.S.A. is among my least favorite Bruce Springsteen albums. Springsteen himself seems to agree. "I put a lot of pressure on myself to reproduce the intensity of
Of course, bad Springsteen is relative. I bought BUSA (on vinyl) immediately upon its release in June of 1984, and played the hell out of it all summer long -- a summer of Springsteen, and Prince, and Cyndi Lauper, that I will cherish for the rest of my life. But music that's remembered fondly is not quite the same thing as great music. I never got a compact disc version of BUSA during that transition period so many of us went through with our record collections in the eighties, and since the most important songs on the album -- notably the title track, from those Nebraska sessions -- ended up on various anthologies in the years that followed, I can't really say I ever missed it. It was only this week, for the sake of the conference, that I downloaded it onto my newly acquired iPod Touch. I greeted it like an old friend (so good to hear "
The standard gambit in cases like these is the trope of "continuity vs. change." Certainly it's possible to speak of both in terms of Springsteen's place in American culture since 1984. But the thing that stands out for me as I reflect on the matter is the way in which the continuities in Springsteen's career are in effect a facade for more fundamental changes. Consider the following examples:
Springsteen as a concert act. Seems simple enough: Springsteen was a stage legend in 1984 (and, for that matter, in 1974), and he's still one now. The Born in the USA tour of 1984-85 was a turning point in terms of Springsteen filling the biggest arenas; since then he's never looked back -- and indeed is on such a tour at this very moment. But the industry in which Springsteen works has been turned on its head, as the rise of the Internet and music downloads has destroyed the traditional economic foundation of the recording industry. Back then, you toured to support the record. Now the record is a promotional tool for the tour. As John Seabrook reports in his piece on the concert business in the August 10/17 issue of The New Yorker, Springsteen's new album, Working on a Dream, has sold about a half-million copies, which by my calculations would gross a little over $10 million this year in the unlikely event all copies sold at the list price of $18.98. His tour last year to support the 2007 release Magic, by contrast, grossed over $200 million.
There's also been a shift in Springsteen's relationship to touring that's more specific to him. As we all know, there are some popular musicians, like the Beatles, whose principal work has been defined in terms of records. There are others, like the Grateful Dead, whose reputation rests on live performance. Springsteen has been notable in that his appeal rests on both these pillars. Yet in terms of his ongoing vitality as an artist, there's been a clear tilt toward the latter. Springsteen has recorded some very good music in the last twenty years. But not since the brief frenzy surrounding Live 1975-85 -- a three-disc set, it's worth emphasizing, of live performances -- has any Springsteen release generated excitement resembling that of his still instantly sold-out concerts.
Springsteen as pop star. BUSA was an important record as a music industry phenomenon. Springsteen had his first top-ten single in 1980 with "Hungry Heart," but BUSA put him in the stratosphere. Arriving at the peak moment when albums, helped by MTV, spawned multiple hit singles (see: Michael Jackson, Lionel Richie, Billy Joel, and of course, Madonna), it generated a remarkable seven top-ten singles on the Billboard pop chart: "Dancing in the Dark," "Cover Me," "Born in the USA," "I'm on Fire," "Glory Days," "I'm Goin' Down" and "My Hometown." (None reached number one; ironically, the only Springsteen-written song that has, "Blinded by the Light," was a cover version by Manfred Mann's Earth Band in 1976.) In the decade that followed, Springsteen made occasional appearances as a pop artist, notably with a trio of hits from his 1987 album Tunnel of Love and his Academy Award-winning single "Streets of Philadelphia" in 1994. He's been even more dominant on the album chart. As Billboard reported back in February, Springsteen has nine #1 albums, including Working on a Dream. Only the Beatles, Elvis Presley and Jay-Z have more.
Yet Springsteen's prominence as a popular recording artist has receded steadily since BUSA. Note that it's been 15 years since he's had a top-ten single, notwithstanding the occasional minor hit.
Springsteen as cultural icon. In 1984, Ronald Reagan invoked Springsteen on the campaign trail; in 2008 Barack Obama invoked Springsteen on the campaign trail (and made "The Rising" a fixture of his rallies). Of course, the circumstances were quite different. Reagan invoked Springsteen disingenuously (though, as I plan to argue at the symposium, less inaccurately than many believe). Obama, by contrast, has a dappled imagination comparable to Springsteen's own.
Perhaps a more important difference, though, is Springsteen's changed stature. For Reagan, Springsteen was a current event; invoking him was a way of showing he was in touch with the culture of the moment. By the time of Obama's inauguration in 2009, by contrast, Springsteen had become a legend who bridged and blended a heritage that extended back to Woody Guthrie and Pete Seeger, with whom he performed at Obama's inauguration. The Rising may not have sold many copies compared to BUSA, but it has become perhaps the best-known and most culturally resonant art to come out of the 9/11 catastrophe. In the quarter century since BUSA, Springsteen has consolidated and extended his place in the fabric of American national life. His reputation is secure. In this regard, then, his position has strengthened, not weakened, since the release of BUSA 25 years ago.
This reality leads me to an important point, one that reflects the larger tenor of my remarks here: that Springsteen's importance is as someone who recapitulates what I called in my 1997 book "the American Tradition." Unlike, say, the Rolling Stones or Bob Dylan, the legacy of his work lies more in the way he strove to preserve a vanishing past than in blazing a path in a new cultural direction. But this is a topic for another day. I'll have more to say about it as the symposium approaches.
Monday, August 17, 2009
Slices of summer --
and the seasons of a life
I suspect that I'm not unusual in having running mental obsessions to occupy my time when doing fairly mindless tasks like mowing the lawn, folding laundry, or taking a shower. Perhaps because I'm in the history business, these mental obsessions typically involve segmenting time.
For instance, I've spent a fair amount of time mapping out the components of a month. You could simply divide one into two halves: early and late. Or you could slice it into thirds: early, middle, and late. But this is more complicated than it appears, because 31-day months don't cleave evenly into halves. On the other hand, 30-day months don't have the nice fulcrum of the 16th, with fifteen days on either side; in a thirty-day month, every day is either early or late.
I also make a distinction between early and beginning as well as late and end. The 6th of a month is the beginning of the month because it's less than a week old; the 27th is the end of the month because there's less than a week left. The 9th, by contrast, is early (not the beginning) and the 22nd is late (not the end), because while they fall in the first or third segments of the month, respectively, they have more than seven days behind or in front of them. I also make distinctions between, say, early mid-month (e.g. the 12th) and late mid-month (the 18th). After a while the whole thing got tedious, so I stopped doing it. But it was something to get me through a vacuum cleaning or leaf-raking session.
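For the programmers in the audience, the labeling scheme above boils down to a few comparisons. Here's a minimal sketch in Python; the function name is mine, and the exact cutoffs (a seven-day "week," thirds of the month) are my best reading of the examples given, not anything stated as a formula:

```python
# A sketch of the month-segmenting scheme described above. The cutoffs
# follow the post's examples: day 6 is "beginning" (less than a week old),
# day 27 is "end" (less than a week left), day 9 is "early," day 22 "late."

def label_day(day, month_length=31):
    """Label a day of the month: beginning, early, middle, late, or end."""
    third = month_length / 3
    if day <= 7:                       # within the first week
        return "beginning"
    if day > month_length - 7:         # within the last week
        return "end"
    if day <= third:                   # first third, but past the first week
        return "early"
    if day > month_length - third:     # last third, with a week-plus remaining
        return "late"
    return "middle"

for d in (6, 9, 16, 22, 27):
    print(d, label_day(d))
```

In a 30-day month the 16th still comes out "middle" under this sketch, even though, as noted above, such a month lacks a true fulcrum.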
This is the year I finally figured out how to segment a summer. What's surprising when you start to think about it is how many standards there are. The simplest is meteorological: as a matter of temperatures in the Northern Hemisphere, summer = June, July and August. As a matter of astronomy, i.e. the amount of light in the day, the season is bounded by the summer solstice (the longest day of the year) and the autumnal equinox (halfway to the shortest): it begins at slightly varying points in the day on June 20th or 21st and ends at some point on September 22nd or 23rd. Culturally speaking, summer in the United States is widely understood to commence on Memorial Day weekend and end on Labor Day weekend. Independence Day on July 4th is more or less the middle.
For me, none of these periodizations is satisfactory. I think we'd all agree that June 21 is at best a very late time to begin summer, and that the party is long since over by September 20. Some public school students and teachers might go along with the former, but even they wouldn't be satisfied with the latter. College students and teachers are typically done by mid-May; the exodus back to school begins in mid-August. Indeed, as I reflect on all these variations, the thing I find most surprising is the way we all seem to assume there's a consensus about what we mean by summer. July is about the only stretch that falls squarely into everybody's definition.
My school year usually ends around June 15, give or take a couple of days depending on the year. I now call the three weeks or so that follow "Early Summer." Early Summer ends with the arrival of the July 4th weekend, which can fall any time in that first week, and ushers in Mid-Summer, or as I call it, "High" Summer. High Summer is six weeks long, ending sometime around August 15 (actually, my anniversary falls on the 12th, so I consider it a holiday of sorts). The next three weeks, leading up to Labor Day, constitute Late Summer. It's taken me a while to decide on all of this, but it's my story, and I'm sticking with it. I'd be curious to know whether it corresponds to yours. I think I can say that by about any standard, we are now in Late Summer.
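Translated into code, the calendar above looks something like this. The boundary dates (June 15 for the end of school, July 4, August 15) are approximations of the scheme as described, and the function names are my own:

```python
# A sketch of the summer calendar described above: Early Summer from the
# end of school (taken here as June 15) up to July 4, High Summer through
# August 15, Late Summer through Labor Day. Exact cutoffs are approximate.

from datetime import date, timedelta

def labor_day(year):
    """First Monday of September."""
    sept1 = date(year, 9, 1)
    return sept1 + timedelta(days=(0 - sept1.weekday()) % 7)

def summer_segment(d):
    """Return the summer-segment label for a date, or None outside summer."""
    year = d.year
    if date(year, 6, 15) <= d < date(year, 7, 4):
        return "Early Summer"
    if date(year, 7, 4) <= d <= date(year, 8, 15):
        return "High Summer"
    if date(year, 8, 15) < d <= labor_day(year):
        return "Late Summer"
    return None

print(summer_segment(date(2009, 8, 17)))  # the date of this post
```

Run on the date of this post, it agrees that we are in Late Summer.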
I know: this is all a bit neurotic. But I like to think it's not all that different from what serious historians do all the time. What marks the effective end of the Roman Empire: the military disaster at Adrianople in 378 CE, the sacking of Rome by the Visigoths in 410, or the abdication of Romulus Augustulus in 476? When did that period we call "The Sixties" actually begin? How about the "Middle Ages?" The fact that answers to these questions vary -- vary over time, and vary at different locations at the same time -- is a vivid reminder that time itself is a historical artifact. Time is something we make. But we never make it alone, and we never entirely control it. And while we may make time, time ultimately unmakes us. We resist that process in ways that range from children to blogs, with varying success. Part of what defines that success is our skill in manipulating time for maximum advantage.
For now, at least, we have some more time. What would you like to do with it?
Thursday, August 13, 2009
Steve Hely's hilarious How I Became a Famous Novelist exposes painful truths about publishing, the blockheads who write for money, and the people who actually read their "work"
"Nor do I cut book reviewers any slack for 'advancing the culture' or 'calling our attention to good work' or keeping the culture of letters alive.' If a guy drove a truck around your neighborhood, pointing out which people were too fat, he would be advancing wellness, and calling fitness to our attention, and keeping public health alive. But you would hate him. You would throw rocks at him, as well you should."--Pete Tarslaw, The Tornado Ashes Club
When Pete Tarslaw, who ghostwrites college and graduate school admissions essays for a living, finds out that his college sweetheart is going to marry another man, he decides he needs to take revenge by becoming rich and famous and showing her what she's going to miss. To do this, Tarslaw decides to write a bestselling book. After all, he reasons, any idiot can produce a huge commercial success. And, in this richly comic novel, Steve Hely shows us an idiot who does.
That's not to say that Tarslaw doesn't lay the groundwork for his eventual "triumph." He scopes out the marketplace to see what will work. He knows he can't write a memoir, because, he explains, "I had a tragically happy childhood. Mom never had the foresight to hit me or set me to petty thieving or to enlist us in a survivalist cult. I wasn't even from the South, which would've bought me a few dozen pages. Lying wouldn't work; these days memoir police seem to emerge and make sure you truly had it bad. And the bar for bad is high -- reviewers have no patience for standard-issue alcoholics or battered wives anymore."
Instead, he haunts bookstores, where he encounters titles like the presumably fascinating micro-history Cumin: The Spice that Saved the World or the inspirational coaching missive Jockstraps Ain't For Eating. Tarslaw knows from the start that his goal is to write an awesomely bad novel; he wants to surpass the drivel he correctly sees as dominating the business, and early on targets Kindness to Birds, a mindlessly inspirational feel-good book by a pompous ass named Preston Brooks, as his standard. To that end, he stitches together every cliche he can think of and stuffs them into a novel he calls The Tornado Ashes Club, about a young man wrongly accused of a crime who goes on the lam with his grandmother, whose long-lost lover fought in World War II (and Peru, among other places), and the beautiful singer they pick up along the way. A friend who works in the imploding publishing industry convinces her desperate boss to take Tarslaw on, and then tries, with uneven success, to make the book coherent.
And: Not much happens. When the novel unaccountably gets the attention of a reviewer at the San Francisco Chronicle, he dismisses it as "a slurry of mixed images and tiresome characters, in language as worn out and withered as [a] sixty-some-odd bar slattern." Tarslaw is wounded. But the slam also marks the beginning of a steep ascent that will rocket him up the Amazon.com rankings, lead to sexual encounters on the convention circuit, and get him meetings with screenwriters and talk show producers, as well as a few less conventional developments. And, of course, a memorable moment at his ex-girlfriend's wedding (though not one Tarslaw himself will be able to remember).
Hely, a veteran television writer whose credits include The Late Show with David Letterman, has clearly seen, up close, more than his share of mediocrity in show business (he's also the author of a comic travelogue of a trip around the world, The Ridiculous Race, written with one of his colleagues). He has a pitch-perfect ear in mocking both the rituals and products of the publishing industry, as well as the various rules and examples his protagonist generates in preparation for writing the book (the actual writing, Tarslaw finds out, "is a tremendous pain in the ass"). How I Became A Famous Novelist itself, however, unravels as it proceeds, because Hely can't seem to integrate his goals of mocking Tarslaw, mocking the industry, and leaving open some space to acknowledge truly good work. Like his main character, Hely has difficulty with the execution.
Still, this book is good fun most of the way. And like a lot of fiction these days, it's been issued as a paperback original, so it's $14 (at most). All in all, a nice way to finish the summer. If nothing else, get it for Hely's mock New York Times bestseller list, with its capsule summaries of bestsellers like Indict to Unnerve or Empenadas in Worcester. That stuff is priceless.
Wednesday, August 12, 2009
A master novelist is back with That Old Cape Magic, a salty tale of marriages made and unmade in coastal New England
Summer's here and the time is right for another Richard Russo novel.
Over the course of the last decade or so, new Russo fiction has arrived in time for summer every few years, perfect vacation reading. Every one of his eight books, which date back to 1986, is like an excursion, and although the destinations are not always prepossessing -- his early novels had rundown upstate New York settings -- you're typically plopped down into small communities with a wonderfully witty (or entertainingly witless) cast of rogues, led by an amiable protagonist you can't help but like immensely as he makes his rounds while engaged in various kinds of shuttle diplomacy among parents, children, siblings, or other scoundrels. Russo novels typically feature hapless men and generally sharper, and thus exasperated, women. Sometimes a dowager figure looms over the proceedings, difficult but whip-smart. By the time they're all over, the main characters (boys in the early going, boyish men more recently) have figured out a thing or two, and so have you. The best book of Russo's early phase is Nobody's Fool (1993), which was made into an unjustly overlooked Robert Benton movie starring Paul Newman.
In recent years, the locus of Russo fiction has shifted to New England -- Maine and, more recently, Cape Cod -- and there's been a demographic shift as well toward more upscale characters. His 1997 send-up of academic life, Straight Man (this one set at a Pennsylvania university), was laugh-out-loud hilarious. Empire Falls (2001), about a declining Maine mill town, won a Pulitzer Prize in 2002 and was made into an HBO miniseries, also starring Newman in one of his last roles. Russo's last novel, Bridge of Sighs (2007), essentially fused a series of strands in his literary persona by telling a multi-generational story of three upstate New York friends, two of whom stay behind while the other becomes a successful painter in Venice.
Russo's latest offering, That Old Cape Magic, has familiar accents but gets reconfigured in a new permutation. This is, in its entirety, a story about relatively affluent people, some of whom make their money in Hollywood (a motif that emerged in his 2003 short-story collection, The Whore's Child). It also features a set of college professors, returning us to the realm of Straight Man. Cape Cod and Maine resurface as settings, as they did in Empire Falls. Once again, the main theme is inter-generational tension, and a child's desire -- this time relatively late in life -- to finally come to terms with family legacies.
That Old Cape Magic is the most slender of Russo's novels, perhaps in more ways than one. And it rides a line of bitterness further than he has before, risking his readers' sympathies. I don't think it ranks with his best work. But even Russo on a bad day is always worth reading.
The story this time, bounded by a pair of weddings, concerns Jack Griffin, a former hack screenwriter turned college writing teacher. When the story opens, he's on his way to the Cape to attend the wedding of his daughter's best friend -- and to spread the ashes of his recently deceased father. The trip brings back two sets of memories. The first concerns his parents, a pair of professors frustrated at having to settle for jobs in the "Mid-fucking-west" (the only way they ever refer to it), who sought a balm for their psychic wounds in the perfect Cape rental every summer -- with the result that no vacation house was ever quite right. The second concerns the origins of Jack's own marriage 34 years earlier, specifically a "Great Truro Accord" informally forged on the Cape between Jack and his wife Joy: the two would eventually leave Los Angeles and settle down with a family (in Connecticut, as it turned out). Unsettled on both counts -- and importuned by his impossible mother's aggravating cell phone calls -- Jack begins a downward spiral.
The second half of the book occurs a year later, when Jack's own daughter gets married in Maine. By this point, his marital life has been turned upside down, and he's lugging around two sets of ashes instead of one, somehow unable to spread either. He has to parry the hostility of various in-laws, and juggle relationships with characters he met or saw at the last wedding. It's obvious to just about everyone what Jack needs to do -- including Jack himself -- but he's somehow paralyzed, belatedly realizing that the choices he's made in his life, which he's understood as a repudiation of his parents, have in fact effectively replicated them. Will he finally come to terms with his past and truly move forward? (Take a wild guess.)
That Old Cape Magic is probably Russo's funniest book since Straight Man. Sometimes the humor is gentle (though not without an edge), as in this passage from early in the novel:
Joy's relationship to the English language was not without glitches. She was forever mixing metaphors, claiming that something was "a tough line to hoe." Row to hoe? Line to walk? Her sisters, Jane and June, were even worse, and when corrected all three would narrow their eyes dangerously and identically. If they'd had a family motto, it would be You Know Perfectly Well What I Mean.
Other times the book gets downright zany, with pratfalls that include a defecating seagull, multiple fender benders, and well-deserved sluggings. One suspects Russo's increasing involvement in the film industry has given his storytelling an increasingly punchy and visual quality, though there's always been an unpredictably anarchic quality to it.
The portrayal of Jack's mother gives one pause. She's perfectly awful: caustic, self-centered, and ruthless in hunting down and exposing the vulnerabilities of others. If Russo didn't have a relatively balanced track record in his portrayal of women, I'd be tempted to call it downright misogynistic. Yet it's also clear as the novel unfolds that she serves a meta-narrative function about the reliability of memory, a motif sustained via a story-within-a-story subplot. This stratagem raises another concern, however: that Russo's fiction, which for all its myriad variations nevertheless rests on an autobiographical foundation, may be retreating into a smaller compass in which a successful writer writes about writing.
Let me be clear about the origins of such a notion on my part: greed. I've been so happy with Russo's work that I hate to contemplate anything suggesting he can't keep delivering such satisfying books indefinitely. (That, and the prospect of his mortality reminding me of my own.) So let me end on a note of gratitude. The fiction of Richard Russo has enriched my life at those times I'm most likely to savor it, i.e. my summer vacations. So I recommend him in the hope that you might enjoy similar pleasure -- and put a little extra money in his pocket. Russo reports in his acknowledgments that his own daughters recently got married. If only in his wallet, that's gotta hurt.
Monday, August 10, 2009
In Empire of Illusion: The End of Literacy and the Triumph of Spectacle, Chris Hedges argues that American Culture is a Force with No Meaning
The following review was published on the Books page of the History News Network on Saturday night.
According to Chris Hedges, American civilization is coming apart at the seams. The masses are corrupted by deadening vices like professional wrestling and violent pornography. Higher education has become a craven handmaiden to corporate power. Happiness has been conflated with adjustment to the relentlessly capitalist status quo. And a tottering economy is on the verge of bringing a self-indulgent society to the point of collapse.
But the most self-indulgent figure in Empire of Illusion may well be Hedges himself.
Don't get me wrong: many of the social ills described by Hedges -- whose urgently eloquent 2002 book War Is a Force that Gives Us Meaning has become one of the most influential and widely cited works of its kind in recent years -- are real enough. (That's part of the problem: much of what he describes here has been discussed at length elsewhere.) And he brings a reporter's immediacy to many of the scenes he describes; his on-the-ground accounts at wrestling matches or interviews with X-rated film actors are often painstaking -- and at times just plain painful -- in their attention to detail. But he also brings a sledgehammer sensibility to making sense of the phenomena he describes, and forfeits his credibility with his hectoring tone and refusal to consider alternative arguments. He also shows a surprising ignorance in his use and understanding of history.
Take, for example, his rant against celebrity culture. In the space of less than a page, he jumps from a tawdry crowd at a female wrestling match to citing Plato's Republic -- a how-far-we-have-fallen rhetorical gambit that sidesteps Plato's hostility to the democratic values that Hedges himself upholds, not to mention the not-exactly-prudish sexual culture of the Greeks -- and then leaps to citing Daniel Boorstin's 1961 book The Image: A Guide to Pseudo-Events in America.
Now, Boorstin's book is an important document of its time, and his observations about popular culture are not without ongoing relevance (even if his politics were a good deal more conservative than those of Hedges, something it's not clear Hedges grasps). But to uncritically use a source like that as a self-evident description of reality a half-century later strikes me as lazy thinking. For Hedges, television means reality shows like Survivor or The Jerry Springer Show. He never acknowledges that television also means The Sopranos and Mad Men, never mind Bill Moyers or Charlie Rose. And while he complains that the level of discourse in the Lincoln-Douglas debates was about twice as high as that of the 2000 presidential debate, he overlooks the often mind-numbing repetition, innuendo, and explicit racism of St. Abraham himself in 1858, evident to anyone who has actually read those debates.
Again: the coarseness, even brutality, Hedges describes in modern popular culture is real and may even be growing in relative prominence. But most of the culture of any time and any place is mediocre at best, and if Hedges assumes that the viewers of pornographic movies or wrestling matches uncritically accept everything they're shown as a transparent description of reality, he betrays a lack of respect for ordinary Americans not all that different from conservative critics of popular culture (like Jose Ortega y Gasset, also unselfconsciously invoked here) who are at least clear in their own minds about their contempt for the masses whose culture they decry.
At the other end of the social spectrum, Hedges's critique of the academy is similarly ham-fisted. "Our elites replicate, in modern dress, the elaborate mannerisms and archaic forms of speech employed by calcified, corrupt, and dying aristocracies," he writes. "They cannot grasp that truth is often relative. They base their decisions on established beliefs, such as the primacy of the unregulated market or globalization, which are accepted as unquestioned absolutes." Maybe so. But most critics of university culture would say that if there is one thing that characterizes academic life in the humanities, at least, it is skepticism about the market economy and a doctrinaire insistence on the constructed nature of reality, a form of relativism that Hedges also decries (indeed he's quite bitter about the state of the humanities on this and other counts). He also manages to lump together the oft-commented upon narcissism of our contemporary meritocratic elite of Ivy-League universities with the old-boy network of George W. Bush -- a conflation of two admittedly unattractive, but hardly interchangeable, demographic segments.
Part of what makes all of this hard to take is the severe and remorseless tone of Hedges's indictment of American society. Take this paragraph of unsupported assertions unrelieved even by the mercy of a comma:
At no period in American history has our democracy been in such peril or the possibility of totalitarianism as real. Our way of life is over. Our profligate consumption is finished. Our children will never have the standard of living we had. This is the bleak future. This is reality. There is nothing President Obama can do to stop it. It has been decades in the making. It cannot be undone with $1 trillion or $2 trillion in bailout money. Nor will it be solved by clinging to the illusions of the past.
Well, then, what should we do? In a general sense, it's clear enough: Wean ourselves from our attachment to materialism and self-gratification and learn to accept a life of limitations. Yet there's a curiously abstract quality to this prescription, because Hedges never seems to describe a three-dimensional country in which such a vision might approach reality. His native land seems to consist of rapacious corporate executives and their victims. The only thing in between seems to be the world of his grandparents, hard-working New England folk who knew the value as well as limits of a dollar, and of learning -- and who appear to be invisible in any recognizable form today.
Frankly, I'm puzzled by Hedges. As a war correspondent in places like Latin America and the Balkans, he's seen the worst human beings can do to each other. I'm not such an exceptionalist that I assume it would be self-evident that the state of our union would be a simple inversion of, say, the Serbia of Slobodan Milosevic. But he seems incapable of registering any distinctions. He argues at the end of the book, quite plausibly, that economic distress is really the single most important factor in political disintegration. But he seems to treat cultural decay, which dominates the opening of the book, as a cause rather than a symptom. I wonder whether it is either. In any case, the inexorable economic logic of imperial decline would seem to make complaining about the state of American pornography beside the point.
Which brings me to another source of confusion. Hedges was trained as a seminarian, and a fierce moral energy is what gave War Is a Force that Gives Us Meaning its intensity. Righteous anger has its place, and it really worked there. He has also written jeremiads against evangelical Christians and atheists alike. At some point, though, it seems to me that an effective social critique has to move beyond complaining about what you hate to describing what you love, because, as Hedges made clear in that book, love is a force that gives us meaning, too. The power of positive example -- in generous and engaged writing no less than in the subject of that writing -- can furnish a powerful lesson in its own right. After all these years on the front lines, I really wish Chris Hedges would finally come home to the place that made him -- and the place that sustains him still. If he could map those coordinates, perhaps a few more of us would find ourselves with him.