Wednesday, July 25, 2007

Opening Acts, Narrowcasting, and Whether Our Music Really Says Who We Are (or, Porter Wagoner Plus Nick Cave Kinda Does Equal Jack White)

The White Stripes played Madison Square Garden Tuesday night, which in itself might be worth a post--e.g., Can two people make a sound big enough to fill a place that large? Yes, if they're divorced and sound like concentrated Led Zeppelin--but what fascinated me most were the opening acts.

I got an email the day before the concert telling me that the Stripes would be going on after 10. I noticed that Porter Wagoner was opening for them, backed by Marty Stuart and his band, but my eye skipped over the fact that there was a second opening act, Grinderman. Only when the frontman of the band started singing did my friend say, "Oh my god that's Nick Cave." The juxtaposition of the tidily entertaining Wagoner and Cave's willfully chaotic new band was odd to say the least, but most of the audience probably didn't notice. The seats were half empty at 8 and barely more filled at 9. At 10, though, everyone was in their places and on their feet for the next two hours. Clearly, they had gotten the email.

I actually look forward to opening acts. (Yes, as a matter of fact, I do like the trailers before the movie too.) I liked the fact that I got the chance to hear three different styles of music in one long evening. I was especially happy to hear Wagoner, a number of whose songs I like--especially "The Cold Hard Facts of Life"--but whom I never would have gone to see just by himself. Nevertheless, I was reminded again that I would never want to be an opening act. I've performed in restaurants while people were eating, and even that threw me off my game. When you perform, there's a certain expectation that the audience at least wants to pay attention. After all, they came there to see you. If they're not paying attention, then you work harder to make sure that they do. But when you're an opening act, there's a built-in resistance that must sometimes seem insurmountable: I do know who I came to see, mister, and it sure as hell ain't you.

There seem to be three or four theories behind the selection of opening acts. Sometimes, they're mostly there to provide variety, the way Frank Sinatra and other old school Vegas singers would have comedians open for them rather than musicians. Other times, the opening acts are meant to serve as more obvious stylistic appetizers for the star attraction. Over the years, the Rolling Stones have caught at least a dozen rock bands just before they were about to break wide and gotten them to open for them, most notably Guns N' Roses in the late 1980s. Many times, the not-yet-broken acts are on the same record label as the main attraction, and the label is using various opening slots simply to get them known outside of their familiar regional base. Back in the early 1970s, for example, between his first and second albums, Bruce Springsteen opened for The Beach Boys, Chuck Berry, Blood, Sweat & Tears, Paul Butterfield, Chicago, The Eagles, Hall & Oates, Richie Havens, New Riders of the Purple Sage, Lou Reed, Sha Na Na, and Stevie Wonder, all in the space of a few months. One can argue that Springsteen's music shared specific traits with each of those acts, but the sheer musical range of the performers on that list, particularly within the popular music industry back then, is a little daunting. This was obviously about exposure pure and simple, about getting him on the road and in front of people who wouldn't listen to him otherwise.

My favorite opening acts, though, are the ones that the main attractions themselves choose on purpose in order to educate their audience. You like us, we like these people, why don't you see if you like them too? This is especially true in cases like Tuesday night when the stars are trying to get their audience to listen to something a little bit different. The most infamous case of this was back in the 1960s when the Monkees asked Jimi Hendrix to open for them for a number of dates. (Eyewitness accounts suggest that Davy Jones fans were in fact not yet ready to kiss the sky.) Even if the artist is established, however, mixing genres can still meet with audience indifference at best or hostility at worst. One of the hardest-working performances I've ever seen was Kanye West opening a stadium date for the Rolling Stones last fall. Unlike the Wagoner/Grinderman/Stripes bill, I'm not sure that West has much in common with the Stones musically, but he openly embraced the challenge of performing for an audience that almost certainly knew his name more than they knew his songs. He saved the one they were sure they'd know ("Gold Digger") until about two-thirds of the way through the performance but alternated styles and arrangements for over an hour, building his act as the stadium filled. I'm not sure how many fans he won over that night, but if he didn't, he sure went down swinging.

The truth is, most successful musicians have more eclectic tastes than their fans. In the late 1970s, when rock fans were shouting "Disco Sucks" and burning truckloads of allegedly offending vinyl in public, many of the acts to which they pledged their allegiance (like the Stones or Springsteen) were listening to, loving, and openly promoting dance records, even wondering whether the 12-inch remixed single was a format that might enhance rock music too. When a hip-hop performer like Nelly collaborates with Tim McGraw, fans of both acts may scratch their heads. What the typical country fan prefers is not open collaboration between generically separate artists but rather the more stereotyped use of one genre's markers within another, as when Big & Rich positioned themselves several years back as ersatz "country rappers."

Think how often people listen to music without listening to it: they catch a repeated phrase, a rhythm, a combination of instruments. The impression matters more than the substance, especially the first impression. For the last twenty-five years, our whole system of radio stations in the United States has been based on this premise. For the most part, satellite radio has not represented an advance over these conditions. Fans of satellite radio praise it as a way to find music that you'd never hear on terrestrial radio, but your options are inevitably divided up by genre, by sound.

Except for rare personality-based radio shows (like the one Bob Dylan has been doing lately on XM), you don't get the sort of mix that was more common on radio in the 1960s and 1970s. Those madly eclectic shows, vestiges of which you can still find on some college stations, flourished in that accidental golden age that fell between the time when television made radio seem irrelevant and the time when demographic breakdowns taught ad executives how to make money off niche culture as well as mass culture.

But it's not just The Industry that is to blame. The same complaints, I believe, could be made about iPods. Yes, people have all this music at their fingertips, but how generically varied are the contents of most of the iPods you've seen, particularly those belonging to users over 35? We find our niches, be they generic, chronological, or both, and they all too frequently become ruts. Our musical taste, to a certain extent, is betrayed by our guilty pleasures, not the patently good stuff we embrace in all genres but the mediocrities we still enjoy even though we can't defend them. Like our larger lifestyles, all our music appears to fit together; we accept that Paul McCartney, Joni Mitchell, and books by refugee African child soldiers all go together with strong coffee; and I'm not entirely sure that all this assumed neatness is such a good thing.

Particularly when dealing with older audiences, our popular culture frequently perpetuates the myth that all subcultures are separate and somehow individually "pure," but the truth is that all American mass culture springs from common sources and reflects the presence of multiple strains. Even though so many commentators have insisted for the last forty years that ideology trumps all, Clint Eastwood can direct Tim Robbins and Sean Penn in a film like Mystic River and everyone can get along famously, because familiar forms and genres can contain and even dilute ideology. At their most widely popular, male-dominated crime films can enlist both conservative and liberal sympathies to their cause. In fact, many of the most successful works of all kinds of American popular culture bind seemingly opposed ideologies together within singular mythic structures.

In an aesthetic form like film, this sort of ideological containment occurs on the level of story, mixing characters, scenes, and narrative outcomes. In a more abstract form like music, the containment is much harder to tease out. Sometimes, lyrics and music can appeal to different segments of the audience, as in Martina McBride's 1994 recording of Gretchen Peters's "Independence Day," a song that mixed a familiarly rousing country arrangement with an ironically patriotic title to tell the story of an abused wife who stands up to her spouse. At the time of its release, the song didn't meet with the backlash that would greet the Dixie Chicks' "Goodbye Earl," which dealt with the same theme, five years later. The Chicks rocked murder in self-defense a little harder and made it sound like a blast. Peters's music and McBride's performance on the earlier track, however, gave the clear impression that this was a solemn business. The lyrics of the two tracks differed, but the music differed more, and that was what made the controversial subject matter more or less respectable.

Most of the time, I love my country and I love its culture in all its varied messiness. As I hope for its future, I dream of a nation where that messiness is clearer. I'm not talking about "diversity" here or "multiculturalism," two words that have lost their initial utility as each has come to mean "pluralism" rather than something more wonderfully swirling and mixing. If we are a rainbow, it is not the sort of rainbow you see in a kindergarten classroom, in which each of the seven colors (ROYGBIV, remember?) is sharp and distinct. We are a rainbow as it exists in nature. Each shade blurs into the next, and when you truly see it, you know it's not seven things but one thing. You can't see where one of its supposed segments ends and the next one begins. To say that we're one is not to say that we're monochromatic, but we're not just two colors either. We're all the shades, and we are always modulating into each other.

If, as my musician friends keep telling me, the old New Orleans will probably never exist again, then we need the virtual equivalent of that world, maybe even in cyberspace. We need people bumping up against each other and listening to each other's music, whether they want to or not, finding the lost root chords and the potential for counterpoint. Culture may be based in ideology, but only in stasis. When ideology changes, it is usually because it is following culture's lead.

Is it too much to think that a revolution like that can begin with listening to Porter Wagoner, Grinderman, and the White Stripes in one sitting? Maybe all you need to listen to is the Stripes' version of "Jolene," one of the greatest songs by Wagoner's even more talented longtime duet partner Dolly Parton. I don't know where the Stripes' version would find its proper home, but I can tell you it's not in a red state, and it's not in a blue state either. The way I hear it, it's pure purple.

Wednesday, July 18, 2007

Harry, Buffy, and the Post-1980s Adolescent Hero (or, The Boy Who Lived and The Girl Who Died--a Lot)

Let me get the most automatic business out of the way first, and lay out my still unspoiled predictions in advance of 12:01 a.m. Saturday: Harry dies. The lightning scar on his forehead is the last of the horcruxes that needs to be destroyed to turn Voldemort mortal again. Snape went undercover with the Death Eaters at the end of Harry Potter and the Goblet of Fire and was acting on Dumbledore's orders all the while, even when he killed him at the end of Harry Potter and the Half-Blood Prince. (There could be some fudging here with Dumbledore's eventual resurrection via his phoenix Fawkes, but I hope not, for reasons that should be clear after you read what follows.) Depending on how far forward Rowling takes the story in the last chapter, I also expect that we may learn that Hermione eventually becomes a teacher at Hogwarts, possibly of Transfiguration, and maybe even Headmistress.

As I said, that's all the automatic business, but aside from the sheer mechanics of tying up the plot, what does this all mean? Over the last few weeks, journalists have been poring over the last decade of Pottermania, deciding that it really didn't permanently change adolescent reading habits as much as many had hoped. They have also discovered, most tellingly, that much of the interest among younger readers dropped off after the first three books--which are, after all, the shortest, least grim, and most self-contained stories in the series. Even allowing for all this hedging, though, the books do seem to have created a sizable cult. In my own lifetime, I think only the world of Star Trek has achieved this level of widespread aesthetic communion, but unlike the Star Trek universe, which stretches across centuries and light-years, the world of Harry Potter remains within very distinct boundaries of time and space. If Star Trek in all its incarnations captured for its adherents a particular vision of the perfect society, the Potter novels, even with all their shadow governmental agencies and extended backstories, ultimately trace the transit of a single life. Somehow, the books that relate that life have resonated with readers, particularly with the generation of readers who have grown up with them since the publication of Harry Potter and the Philosopher's Stone in 1997. The depth of that resonance should make us wonder where it comes from.

In that regard, the most useful comparison to Harry Potter may not be any of the members of the United Federation of Planets, but Buffy Anne Summers, more popularly known as Buffy the Vampire Slayer, from the film (1992) and television series (1997-2003) that bore that name. I'm not sure if anyone has ever noted it before, but Harry and Buffy are almost the same age. Based on information given in J. K. Rowling's books and the Joss Whedon-supervised television series, Harry was born 31 July 1980, and Buffy was born less than six months later, on 19 January 1981. Although Buffy is younger than Harry, she seems older, in part because her story was told earlier, and in part because the key events in her life happen at later ages than those in Harry's. As every loyal reader knows, Rowling's novels tell the story of Harry's development from age 11 to age 17. By contrast, the core of Buffy's story covers almost exactly the succeeding period, from age 16 to age 22. In other words, Rowling's last two Potter novels cover the same period as the first two seasons of Whedon's television series, both in terms of historical chronology and in terms of their protagonists' ages. Yes, I know both protagonists are fictional, but somehow I find it oddly fitting that in the spring of 1997, just about a month before Harry Potter witnessed Dumbledore's tragic death at Hogwarts, Buffy Summers herself died (for the first of three recorded times) in the caves under the Hellmouth-ridden town of Sunnydale.

The similarities between the two heroes are fascinating--both created by authors writing across gender lines, both seconded by clever witches who made it cool for girls to be bookworms, etc.--but it's that age difference that I think is probably most telling. In a sense, Buffy's story begins when Harry's ends. Harry Potter's story is a story of adolescence. It almost functions as the English equivalent of many of the early Marvel comics (Spider-Man, The Incredible Hulk, X-Men), asking the question: what would you do with powers if you had them? The strongest attraction of such fantasies always lies in their underlying emotional reality. This is what adolescence feels like to the vast middle of teenagers, neither under- nor over-privileged, in a relatively affluent society. If fairy tales offer younger children fantasies of alternative households and parents, superhero stories offer adolescents fantasies of alternative talents and fates: this is what I'm really like, the consumer thinks, I'm just killing time while I wait to fulfill my destiny. Rowling is obviously keenly aware of this and has made her characters' choices of eventual vocations within the magical world a minor but compelling subplot in her series. In the early books, the characters are just having fun with the discovery that they can do magic. As they advance through their education, however, the magic moves beyond mere play, and they begin asking: What do I want to do with this ability?

If Harry's story details a process of adolescent discovery, Buffy's traces her slow acceptance of adult responsibility. One of the most reiterated situations in Buffy's story is her refusal to take on the duties that come with her unique position as Slayer. From her initial awareness of her powers and her move to a second high school at age 16, through her running away from home at age 17, to the aftermath of her unwilling resurrection from the dead at age 20, right down to her eventual comfort with her role as counselor, leader, and teacher at age 22, Buffy's path to full adulthood is persistently stymied by a longing to be "normal." She regularly dreams of what it would be like not to be special, not to have powers, not to have a unique destiny. She longs for the sense of play that Harry left behind around the time of the Triwizard Tournament, just as he longs for the clear sense of purpose that she gained around the time that she graduated from high school.

Yet even though these two stories focus on different parts of growing up, they share a common historical grounding. There have been repeated attempts to determine the historical referents for the major events in Rowling's world--from 1930s homegrown fascism, to the rise of Thatcherism, to Tony Blair's role in the war on terror--but the genius of Rowling's creation is that this world is not merely a one-for-one allegory for ours but rather comprises its own internally consistent alternate reality. To my knowledge, there have been no analogous attempts to find historical referents for Buffy's Sunnydale, but even without a specific historical peg on which to hang a sociopolitical analogy, there is always the most accidental thing that the two heroes have in common: their birthdates. Both grew up in the world of the 1980s and 1990s, and their careers and separate paths through adolescence and young adulthood suggest that they are shaped by a shared generational psychology that is apparently transatlantic rather than national in origin.

As generational heroes, Harry and Buffy most clearly stand apart from the analogous characters who preceded them in popular culture by how they regard authority. In Buffy's case, this attitude is perhaps not so striking. In western society, we expect stories of the late high school and college years to entail a certain amount of vague "rebellion." But as parents who have read Harry's adventures to pre-teen children will tell you, it is striking how many rules Hogwarts students break in Rowling's books. When Dorothy Gale broke rules, there was always a consequence. Tom Swift, Nancy Drew, and the Hardy Boys would never even have dreamt of breaking a rule to reach their goals. From their very first adventure, however, Harry, Ron, and Hermione have to break rules, to undermine the authority of nearly every adult around them (up to and including Albus Dumbledore) in order to achieve a satisfactory conclusion. This is not to say that they don't get in trouble for breaking some rules. It would be more accurate to say that one of the most important skills that Harry and his friends acquire over the series is learning which rules you should break and which rules you shouldn't. Unlike earlier adolescent heroes, the students of Hogwarts are not being taught to conform by their adventures.

They are not being taught to rebel either, though, and neither are Buffy and her friends. Rebellion and isolation inevitably lead to bad results, as nearly all of Whedon's characters realize during their first year of college. The appropriate response to oppressive, blind, even destructive conformity for both sets of heroes is not pure independence but the building of new societies: the creation of the armed resistance group Dumbledore's Army in Harry Potter and the Order of the Phoenix; the student-led counterattack at Sunnydale High's 1999 graduation ceremony, which ends up being a dry run for the Slayer army that Buffy trains in the last season of the television series and after. The rules cannot be trusted, and neither can the adults who made or enforce them. You can listen to these adults, you can learn from them, but ultimately you have to make your own world from the pieces that they leave you.

This ambivalence toward adults and what they might have to offer adolescents by way of example is the most compelling aspect of the two series' shared emotional world. Barring anything we might learn Saturday morning from Harry Potter and the Deathly Hallows, with the exception of saintly Albus Dumbledore, no adult in either series can be fully trusted. Politicians and parents prove consistently unreliable, but even the characters' most valued mentors lose their tempers, drink to excess, and can practice unthinking cruelty unless prevented from doing so by the stories' more responsible adolescents. One episode from the third season of Buffy, entitled "Band Candy," showed this perhaps better than any other, when enchanted chocolate made the teachers and parents of Sunnydale revert to their own high school years. If anything, they acted worse than the members of Buffy's generation, with Buffy's usually buttoned-down mentor Rupert Giles the worst offender of the lot. He slacked off on his Watcher duties, smoked and had sex with Buffy's mother, stripped down to his T-shirt, and indulged in casual violence and vandalism at every chance he got. The voice of order had reverted to his secret origin: a sneering punk who was only interested in the quickest kicks.

Sirius Black serves as a similarly ambivalent mentor for Harry Potter. In emphasizing that Black was wrongfully imprisoned in Azkaban for over a decade, many readers ignore the hints that he was something of a juvenile delinquent before he got there: a darkly born, flying-motorcycle-riding, unregistered Animagus who played a prank on Severus Snape during their own time at Hogwarts that almost got his classmate killed. Along with his friends (including Harry's father James), he felt that the rules did not apply to him, and that the magical world was made for his amusement. At least once, he seriously considers killing Peter Pettigrew, the man responsible for his imprisonment, in cold blood, and it is only the more responsible Harry who can talk him out of it. Both Giles and Black have a great deal to offer as mentors, but part of their mentees' maturation requires that they learn to do as they say and not always as they do.

Around the time that both Harry and Buffy were born, there was a considerable vogue for wearing tiny buttons with ironic messages. Like many other subcultural tics of the late 1970s and early 1980s, these buttons were both an inheritance from and a refutation of the counterculture of the 1960s. To be precise, the buttons were a continuation of the proudly declared politics of that earlier era, but their ironic messages were clear signs of the more alienated, less forthright era close at hand. My favorite button from this period you can still see around, although I sometimes wonder how many people got the joke. It bore only two words: QUESTION AUTHORITY. When I wore it, I was always surprised how many people read it as if it were a straight-ahead hippie message. Yeah, man, they thought it meant, don't let the establishment get away with anything, man. When I bought it, though, I had assumed that the first thing people were supposed to notice about the button was that it bore a direct order. Who's telling me to question authority, I thought it meant, and why should I do what they say?

I can't know for sure, but my sense is that if you were born after 1980, you don't need to have the irony of that button explained to you. The most lasting children's fantasies of the 1960s, stories like the film version of Mary Poppins and Roald Dahl's original novel Charlie and the Chocolate Factory, suggest that all a displaced child needs is to be taken under the wing of nice countercultural types and everything will turn out fine. Even their families will eventually get turned on. But the story changes once the countercultural types become the parents: then rebellion itself can become authority and even a form of legacy. Once you buy your five-year-old a T-shirt with an anarchy symbol on it, cultural radicalism coexists with psychological conformity. That kind of unresolvable contradiction is exactly the sort of condition that is ripe for the creation of new pop myths, not just the ones I've traced in Harry's and Buffy's stories but those in a number of other post-1980 books, films, and TV series too, perhaps most arrestingly in Wes Craven's original A Nightmare on Elm Street and Richard Kelly's film Donnie Darko.

Joss Whedon and others are currently writing a series for Dark Horse Comics that is frankly labeled Buffy: Season 8, but for me, the final episode of the TV series four years ago was the only ending I needed. J. K. Rowling has built up enough goodwill with me that I trust that her seventh book will round out her protagonist's story just as well, even if I'm wrong and he doesn't definitively die. As time goes on, it will be interesting to see if the stories of these characters prove powerful for new readers or simply remain touchstones for the generation that grew up with them. From all reports, the currently rising generation is less worried about having an embarrassing ex-hippie or ex-punk for a parent than they are about the rise of a form of blank conformism that is very different from the kind that ruled during the 1950s. Somewhere right now, someone is glimpsing precisely how different the conformity of our own time is from that of the earlier era. If we're very lucky, they'll use that sense of difference as the cornerstone of a brand new, fully imaginary world, one that finally helps us to clearly see what's going on in our own.