Music is supposed to be the language of peace and brotherhood, a force that can bring the world together in harmony. But is it any freer from politics than anything else in our lives?
Just look at the 2008 Eurovision Song Contest. The musical competition has taken place since 1956, produced under the auspices of the European Broadcasting Union (EBU). The event is broadcast in 43 countries; the residents of each country call in to vote for the entry of any country other than their own. The EBU is a global association of national broadcasters; it has nothing to do with the European Union, which is why there were entries from Israel and Russia.
As an American, I’d never seen the Eurovision contest before, but I had always had the impression that it only generated mediocre pop songs, light and insubstantial in nature. But on a recent trip to London, I ran across the live broadcast on my hotel TV and I was excited about getting a first-hand look.
Venerable British television personality Sir Terry Wogan narrated the official broadcast from Belgrade. I was surprised at the overall quality of the entries – not great, but no worse than your average run-of-the-mill pop. The musical numbers had a greater range than I had expected. Spain’s “Baila El Chiki Chiki” was a silly dance number with less substance than your average cell ringtone. Latvia’s number looked like pirates run amok. Finland’s entry was heavy metal. Turkey and Azerbaijan entered rock songs. Sébastien Tellier’s entry for France was clearly inspired by the Beach Boys.
Tellier’s song, like most of the others, featured lyrics sung in English (apparently this caused a stir among French conservatives). Almost the entire international broadcast took place in English.
I was a little surprised by the narration of play-by-play announcer Wogan, who has done the British broadcast for decades. There were moments of gentle irony and sardonic humor, but also hints of real bitterness. He predicted that blocs of countries, such as the Baltic nations, would all vote for each other. He predicted that the Russians had it in the bag.
After the performance portion ended, the call-in voting began. Fifteen minutes later, the voting results started to trickle in, as each country announced its results in turn. In addition to points from one to seven allocated from the phone calls, each country could award 8, 10 and 12 points to its three most deserving countries.
Russia started dominating early. Greece and Ukraine also pulled ahead. Former Soviet Republics all voted for Russia and each other. The Nordic countries (Finland, Sweden, Norway, etc.) all voted for each other. England came in dead last.
It took a long time for the votes to be reported. During this lengthy slog, Terry Wogan grew progressively more bitter. He suggested that no one would support “us” – in other words, England – and it felt to me like he wasn’t just talking about the Eurovision Song Contest. Towards the end of the broadcast, he threatened to never do the show again.
This attitude, shared by Wogan and some other Britons, seems rooted in England’s long-standing skepticism towards the rest of Europe. In part, this stance can be attributed to the physical gulf separating the United Kingdom from the European mainland, but it’s a cultural gap as well, no doubt rooted (in part) in Britain’s former dominance in world affairs. You can see it in England’s reluctance to accept the Euro as its currency and in Ireland’s recent rejection of the Lisbon Treaty.
I’ll admit, I got a little worked up myself at the voting in the Eurovision Song Contest. It did seem unfair. It wasn’t just the showing for England, a country I love. There were other votes that seemed politically motivated. Why didn’t Sébastien Tellier’s song do better? Why was Turkey’s rock number “Deli” (7th place) beaten by more mainstream pop songs? Why did anyone vote for that stupid “Chiki Chiki” song from Spain?
This may seem like a ludicrous issue, but it’s still being hotly debated in Britain even now. Wogan has charged that the competition is “racist,” and that the other countries will never give England a fair shake. Ireland suggested that England has a bad attitude, assuming it can’t win, and so does not enter sufficiently competitive songs. England is one of the prime sponsors of the competition, raising the question of whether it should continue to support a contest it can’t win. (It also raises the question of whether assuming that everybody hates you endears you to anyone – if you assume you will lose, does that handicap your ability to win?)
The truth is that the Russian song, while not my cuppa tea, wasn’t too bad. It was produced by the American producer Timbaland. It garnered lots of votes from countries that aren’t necessarily political allies. So maybe the most popular song just won on its own.
And while England might complain about losing a battle, Western culture may have won the war. The Eurovision rules were relaxed a few years ago and now entries can be in languages other than the native tongue of the entering country. Most of the songs had English lyrics. Most of the songs weren’t Balkan folk songs or German polkas, but were typical pop fare you might hear on the radio in Britain or America.
So perhaps the European Union has been brought together, under the unifying flag of crappy middle-of-the-road pop songs.
Posted by P.J. Rodriguez at 7:00 PM | Permalink
The connection between contemporary pop music and politics tends to be quite prickly. There’s a lot of power available any time a pol can make a link to popular culture. Witness Walter Mondale’s use of the phrase “Where’s the beef?” in 1984 or Hillary Clinton’s Sopranos video last year.
But the connection is often made gingerly. As evidence, check out some examples of politicians answering the softball “What’s on your iPod?” question and you’ll see nothing but caution and calculation.
Once upon a time, about the best example of a joint venture between pop music and politics was the infamous meeting between Elvis Presley and Richard Nixon in 1970. But since Elvis was trying to become a “Federal Agent-at-Large” in the Bureau of Narcotics and Dangerous Drugs, it’s not a very rock ‘n’ roll moment. In the collision between pop music and politics, the former has to bend to the latter. For example, consider James Brown’s endorsement of Hubert Humphrey in 1968. I don’t think Humphrey displayed the slightest bit of soul in response.
More successfully, in 1992, Bill Clinton’s campaign used Fleetwood Mac’s 1977 hit “Don’t Stop” as a kind of theme song. Fleetwood Mac (at least the mid-Seventies version) was a pretty soft-rock sort of band. That song’s lyrics are uplifting, not rebellious or anti-authoritarian. Recall that the campaign had briefly used Jesus Jones’ song “Right Here, Right Now” and that Fleetwood Mac ended up reuniting (once again) and playing live at Clinton’s inaugural ball. The first Baby Boomer Presidential candidate rode to victory (in part) on the back of a Baby Boomer hit single.
And what did Hillary end up selecting as her campaign theme song? As previously discussed: Celine Dion’s “You and I.” Now, there’s a tune that says “experience” and “moderate.”
That means we have to look at Barack Obama to break the mold. After all, he’s the candidate who keeps promising he’s not going to do things in the traditional Washington fashion. He’s the guy who’s supposedly captured the youth vote and the progressive vote. But has he done so?
Well, yes and no. Obama’s campaign rallies never shied away from music others might consider controversial. Have a look at this playlist from a San Francisco rally last fall. When he gave his concession speech on Tuesday from Indiana, after losing the Pennsylvania primary, Obama’s remarks were followed by “R.O.C.K. in the U.S.A.,” as the song’s author John Mellencamp stepped forward to shake the candidate’s hand in congratulations. That’s a photo-op, but not much of a rock ‘n’ roll revolution.
But let’s go back a few days earlier. On April 16, the last Democratic debate was held and moderators George Stephanopoulos and Charlie Gibson raked Obama over the coals pretty well. Hillary Clinton got one hit on her account of a Bosnia trip, but Obama took a barrage of blows. Does Jeremiah Wright love this country? Do you? How come you don’t love the flag? How come you don’t love white people? Aren’t we loveable enough for you? Left, right, left, right – BAM! Upper cut to the jaw.
Two days later, Obama referred to this pounding during a speech. He acknowledged the incident, classified it as politics, tried to move past it. Then came the bold move: He said, “You’ve just got to…” And he flipped his hand with a dismissive gesture, as if brushing a little lint off his shoulder. This wasn’t arrogance, at least not the garden variety sort. He stole that damn move right out of the Jay-Z playbook.
The rapper’s song “Dirt Off Your Shoulder” is the typical theme of Me-Against-the-World, but the chorus offers advice that if you are “feelin’ like a pimp,” then you ought to “go and brush your shoulders off.” Jay-Z clarifies that “Ladies is pimps, too,” and they should likewise “brush your shoulders off.” The message pumps louder: “You gotta get / that / dirt off your shoulder.”
There’s a reason the rock/politics equation usually doesn’t work. Rock is often loud, rude, chaotic, antiauthoritarian. If that’s true, hip-hop is that same attitude cranked up to 11. But in this instance, hip-hop didn’t conform to politics. Obama stepped over to hip-hop and borrowed the attitude unadulterated.
Despite the fact that hip-hop has been continuously under attack as an art form over the last 30-plus years; despite the sexism, homophobia, violence, and materialism often found in hip-hop; despite Jay-Z’s own controversial nature and his use of “pimp” and the N-word in this song – despite all that, Obama was trying to communicate a response to an attack with a move that was (in many ways) rude, rebellious and anti-authoritarian.
To really appreciate this event, you need to see the video version that showed up quickly on YouTube. Set to the beats of Jay-Z, you see Hillary Clinton hammering away at him on numerous occasions; Stephanopoulos and Gibson take their turns. Then Obama speaks and little cartoon heads of his attackers pop up on his shoulder – he brushes them off. They pop up on the other shoulder and are brushed off again. Finally, a little kitchen sink is thrown at him, to no avail.
There is a danger in embracing hip-hop. It’s an undeniably controversial form, with sex, drugs, violence, and race. Any sane politician would keep this stuff at arm’s length. And yet, for one moment, danger was embraced: A perfect marriage of pop culture and politics.
Posted by P.J. Rodriguez at 5:51 AM | Permalink
Americans act like they love to get access to the real thing. Not the fake plastic manufactured version, but the real deal. Reality shows, gossip blogs that rip the lid off Hollywood, autobiographies of former drug addicts: we eat it up even though much of it is positioned, scripted and about as “real” as the special effects in your average blockbuster. Still, sometimes it seems the worst charge you can hurl at some famous people is that they’re a “phony.”
Why does this charge seem to stick sometimes, while at other times it doesn’t even come up?
In part because we assume that the more “authentic” view is the better one. Rapper Vanilla Ice was unmasked as a mere wannabe gangsta, while 50 Cent survived getting shot nine times. Compare the lyrics of their respective hit singles “Ice Ice Baby” and “Candy Shop” and then try to figure out which is the more thoughtful lyric.
Of course, our focus on authenticity in our popular culture is flawed. Gangsta rap and punk are supposed to be authentic, but bubble gum pop and teeny boppers are fake. There are music fans who don’t care, listening to whatever strikes their fancy, and I suppose you could charge that they are lacking in artistic values. But you could just as easily charge certain discriminating hipsters and intellectuals with being snobs.
This quest for authenticity – which provides the framework for arguing that a novel about gang life is not as compelling as an autobiographical account of rising up from the street – leads to cases like James Frey and Margaret B. Jones (a.k.a. Margaret Seltzer), authors who mixed personal facts, accounts from others and a heavy dose of artistic license. The crime was being caught lying, but why did they feel that an autobiography was superior to an acknowledged work of fiction?
The world of politics isn’t immune from this tendency.
I’m thinking of Al Gore in 2000 and John Kerry in 2004. Once Gore got the “phony” tag slapped on him, and that label stuck, then everything he did and said got viewed through that prism. He was supposedly a congenital liar, who claimed he invented the Internet and was the inspiration for Love Story. He was a fake, who relied on consultants to dress him and help him be an Alpha male. He was an opportunist, who manufactured a ridiculous crusade around unproven claims about the environment. It was the same for Kerry, far beyond the successful Swift Boat project that turned a war record against a military veteran. Kerry also had the windsurfing, Lambeau Field and Swiss cheese on his Philly cheesesteak. Both fakes, phonies and opportunists.
You may recall that part of George W. Bush’s campaign strategy in 2000 was that he was authentic. Love him or hate him, what you see is what you get. He doesn’t put on airs. He doesn’t pretend to be what he isn’t. Or so the theory goes. But again, it’s much like raw gangsta versus bubbly boy bands: Is the “fake” artist worse than the “real” one? If so, why? I believe that the charges against Gore were false, sometimes manufactured and sometimes overblown. And Bush, of course, is the son of a wealthy family of New England WASPs who happened to have settled in Texas to do business – not, as he’d have us believe, a true Texan through and through.
In this election cycle, you can see the “phony” charge bubbling under the surface. Were Hillary Clinton’s tears in New Hampshire real or fake? Is Barack Obama a visionary or an empty suit? Did John Edwards’ $400 haircut show he wasn’t sincere about fighting poverty? Mind you, John McCain gets to reverse course on most of his stands from eight years ago, but that’s okay, since he’s no phony. He’s the real deal, the captain of the Straight Talk Express, right? Or so the theory goes.
The “phony” charge can be incredibly insidious. Once you get stuck with it, everything proves you’re a phony. It reminds me of the work of Dr. John Gottman, whose studies of marital stability and divorce prediction were profiled in Malcolm Gladwell’s best-seller Blink. Gottman says in that book that people are typically in one of two states in a relationship. In “positive sentiment override,” positive emotion acts as a buffer. One spouse will do something potentially irritating and the other spouse will let it slide. In “negative sentiment override,” even a relatively neutral act is perceived as negative. In this state, Gottman says, “[if a] spouse does something positive, it’s a selfish person doing a positive thing.”
Perhaps this reminds you of some events in the current Democratic contest for the presidential nomination. It’s a contest that is partly about who can be more real and authentic. One side is convinced that its candidate is the genuine article, while the other candidate is a charlatan and a fraud. Every word out of that candidate’s mouth proves it, whether it concerns the sermons of Dr. Jeremiah Wright or peacemaking trips to Bosnia.
As for me, I’m not so convinced that we can ever really know what goes on in a famous person’s head. “Authenticity” is low on my list of qualifications, partly because of the difficulty of judging this state and partly because I’m not sure how much it really matters. With politicians, I tend to judge them like songs. Does it have a good beat and can you dance to it?
Posted by P.J. Rodriguez at 5:13 AM | Permalink
There’s no denying that there’s a relationship between performers and audience, on and off-stage. Performers give us enjoyment, insight, entertainment, enlightenment. We give them financial rewards, adulation, the pleasures of someone who will listen. But what do they owe us and what is our responsibility to them? Is this just quid pro quo or do we owe each other more?
Let us take, for example, British singer Amy Winehouse. In many ways, the last year has been very good for her: best-selling album, hit singles, five Grammys. But she’s become just as well-known for drinking and drug-taking. So do we link these two sides together – the talented musician and the troubled woman – or keep them apart? By supporting the singer, am I enabling the addict?
She’s hardly the first musician with what are now politely called “issues”. In the 1950s, it was an open secret that many talented jazz musicians abused drugs and alcohol. Some of their followers thought these substances contributed to their art. Nobody seemed to shun them for their failings. The late 1960s and early ’70s saw a parade of rock musicians who burned brightly. Some died and some survived. Again, there were those who thought these personal habits contributed to their music.
But we live with a parade of headlines and video of star-powered train wrecks and celebrity “acting out” on a weekly basis. And it’s natural to feel one’s patience snap at some point (say, when a performer runs offstage to vomit). So, let’s go straight to the bottom line: Does Amy Winehouse deserve to be “rewarded” with awards, critical acclaim, the adoration of audiences, and the financial rewards of music sales and concert tours? Are we complicit in her behavior?
The balancing act is difficult. If someone we know acts in a way that’s harmful – to himself/herself or others – we can argue that we have a responsibility to not support that individual. Otherwise, we have to bear some of the responsibility for their actions. In the case of the addict, the dividing line between gifted and tormented is fuzzy – note this anecdote where Winehouse tells her producer Mark Ronson about the incident that led to the song “Rehab”. He quickly goes from being troubled to noticing that Winehouse has come up with a great idea for a song.
But I can just as easily argue the opposite point of view: It’s a dangerous spiral when you don’t separate art from artist. You can end up assessing everything from music to movies to fashion based on whether you feel that you can support the creator behind those works. Must everything pass an acid test of social responsibility?
Sometimes these choices seem like no-brainers, but it’s a sliding scale from black/white to shades of grey. When Ike Turner died last year, the headline in the L.A. Times read: “Rock pioneer was known for abusing wife Tina Turner.” He spent years in bitterness because that sort of headline summed up his long music career. Gary Glitter was charged as a sex offender and his career fell apart. But these are serious crimes against others; it’s easy to agree that O.J. Simpson shouldn’t get our support. Lindsay Lohan is ridiculed for her rehab efforts; Robert Downey Jr. seemed to be supported for his multiple efforts. Amy Winehouse caroused, stumbled and fell. She’s done some time in rehab and we’ll see if she changes her ways.
Suppose you find that the guy behind your favorite pop culture product is a racist or once killed a kid while driving drunk. How can you in good conscience support such a person? Does this mean we have to investigate the background of everybody? And what do we do when we figure out who the bad guys are?
In the end, it seems that we each have to make that decision for ourselves. Does the behavior rise to a sufficient level that I should consider withholding my entertainment dollars? Or do I decide that an actor might be a nutjob, but I love his movies? It seems like a fairly tricky calculation to me and I am loath to offer easy answers. Sometimes, pop culture asks us to pay a price.
Posted by P.J. Rodriguez at 4:08 PM | Permalink
From both the political right and the left, this is the time of year that people moan that we’ve lost our way. We need to get back to our roots, they say, back to the “true” meaning of Christmas. According to one academic, a renowned expert on the holiday, if you insist on finding an essential purpose to the celebration – the one going back centuries – it can be identified. But it’s not what you think.
Whether it’s Bill O’Reilly and John Gibson on Fox News discussing the “war” on Christmas as an assault on traditional values, or the new documentary What Would Jesus Buy?, which profiles Reverend Billy and The Church of Stop Shopping as they protest the commercialization of Christmas, materialism and the overextension of personal credit – conservatives and progressives, religious and secular forces all insist we’ve lost our way. They say that we’ve gotten away from how Christmas used to be celebrated.
That academic, historian Stephen Nissenbaum, notes that our current notion of Yuletide is of fairly recent vintage. Western culture seems to have had some sort of Christmastime festivities going back centuries. The Winter Solstice has long been a time for celebration. If you’re a farmer, by December, the agricultural cycle has ended. The working aspects of the farm have closed down. The beer and wine are ready for consumption. The weather becomes cool enough to slaughter animals, but it’s before the deep freeze of winter. It’s time to kick back and have a party.
As the feudal system developed, new traditions emerged, such as wassailing, where less fortunate peasants would go to the lord’s house, begging for food and drink. Eating and drinking to excess was a big part of Christmas. The holiday, like Halloween and Mardi Gras, was a time of what sociologists refer to as “ritualized social inversion.” This roughly translates into a socially sanctioned time to go nuts and break the rules.
As the 19th Century began, cities grew in size and industrial capitalism increased in influence. The wealthy began to physically withdraw from the lower classes, setting up communities like Boston’s famous Beacon Hill. The Christmas celebration became a time to stay in the house with your family and celebrate. Christmas also played an important role in commercializing the American economy, as people made luxury purchases and gave gifts to family members.
Nissenbaum marks our modern version of Christmas as beginning around 1823 with the publication of Clement Moore’s poem “A Visit from St. Nicholas” (more popularly known as “The Night Before Christmas”). He notes that by the late 1820s, images of Santa Claus – a clearly prosperous, fat, generous old man meant to embody the spirit of the season – had begun to appear in advertising.
Today, spending (and overspending) during Christmas has become a part of our popular culture, as news outlets cover Black Friday, the day after Thanksgiving and the beginning of the holiday shopping season, with breathless anticipation. How will the economy do? Will consumers do their part and go out and spend?
But even the notion of consumers’ responsibility to go out and buy is not new. The Panic of 1837 led to a five-year depression in America. Nissenbaum found two newspaper editorials that argued in 1840 that people ought to go out and spend during the holidays in order to cure the economy – shades of President George W. Bush’s exhortations to do the same after 9/11.
But perhaps you remain unconvinced by economic history and sociological references. In your mind, Christmas represents one thing: the birth of Jesus Christ. But it wasn’t until the 4th Century that December 25th was designated as the date of the Nativity. In fact, it’s fairly clear just from Biblical text that this is incorrect, since shepherds would be watching their flocks at night during the spring. Some current scholarship also suggests that it didn’t happen in Bethlehem.
In a conversation with me, Professor Nissenbaum said that if one wants to find an essential meaning, then that meaning is “consumption.” Before 1800, that meant eating and drinking, often to excess. After 1800, it became shopping, often to excess.
I am not in favor of gluttony, whether physical or financial. I’m as stressed and stretched during the holidays as most people. I love the notion of a time to think of our friends and family and wish goodwill towards others. But history tells me that there’s never been a time when it was picture perfect and pristine. The consumerism, materialism and mix of sin and sanctimony are, for better and for worse, for richer and poorer, our tradition. And we seem to be keeping it up rather nicely.
Posted by P.J. Rodriguez at 5:00 AM | Permalink
There are many ways of examining the Writers Guild strike as it heads into its second month. You could examine some of the business issues – do the studios have any credibility in claiming that new media holds no profit for the foreseeable future? Or you could look at the power of American labor and ask if the idea of an honest day’s pay for an honest day’s work still means anything in the global economy. Or you could take the flippant approach and assume that the writers are simply another overpaid sector of the entertainment industry asking for even more money.
But there’s a theme bubbling under the surface here, a belief that the impact of professional writers in today’s media environment has been lessened. The suggestion is that much of today’s entertainment either isn’t written at all or is nothing more than the re-writing of old ideas – shows and movies from Hollywood’s glory days – and thus not a truly creative product. There’s also that buried assumption that these so-called “writers” are already paid so much that it’s simply selfish to ask for more.
You could find some of these attitudes in stories in the media, such as the one about the “good news” of the strike: newsmagazine shows might benefit. Peter Chernin, president of News Corp., crowed about how good the strike would be for Fox, saving money on cancelled deals and unshot pilots, while allowing the network to make money on American Idol and other reality fare. A reluctantly striking writer sent an e-mail to The National Review, claiming that with “football, The Next Iron Chef, and Law and Order re-runs” who needs writers? (This attitude – ironically, from a Guild member – ignores the fact that fictional fare has long competed with sports and that the heart of the strike is precisely about residuals from repeats and new platforms.)
But while so-called “reality” shows offer competitions, game shows and human train wrecks, giving you the sense that it’s all just unspooling before the cameras, the truth is that almost nothing you see on television is presented in a raw, unedited form. (The writers employed by reality shows are not covered by the Guild, an issue which has come up in the current negotiations.) And in the face of YouTube and other amateur online video sources, it’s possible to assume that craft is no longer required to create content.
But that assumption is incorrect.
Within the entertainment industry, writing is simultaneously the most and least valued aspect of the process. Since just about anybody can operate a pen or keyboard, there is often the perception that anyone can write. Yet nothing gets made without it: whether your favorite show is a sit-com, a reality show or even a YouTube video, somebody had to sit down and figure out, “What’s going to happen this week?”
Content doesn’t happen by accident. It happens as a result of determining what kind of things will happen, who will be doing those things and what they will say as they do those things. And while it’s possible for talent and creativity to come from anywhere, online video (which is getting better all the time) has yet to produce a consistent stream of content as good as The Simpsons or Lost. While the studios may think the answer is that they can make money off of amateur online videos, saving themselves some production costs, they shouldn’t forget that the writers could also ditch the studios and head straight for the Internet.
Whether you call it content or story or anything else, it’s a skill to create it. And whether you’re J.K. Rowling or a guy with a webcam, it’s the same set of creative muscles being flexed. The writers’ strike is about the value we place on that effort. The answer ought to be that writers will always matter as long as people want to be amused and excited. We ought to acknowledge that writing is embedded throughout our daily consumption of entertainment and information, regardless of the media platform.
Once that premise is accepted, then the studios and the writers can figure out the fair compensation. But let’s not pretend the craft of writing no longer matters.
Posted by P.J. Rodriguez at 7:18 PM | Permalink
As the father of two young women, let me go on the record: Not a fan of princesses. But I am crazy for a new Disney movie about a fairytale princess because it illuminates what’s so wrong with the enduringly popular Princess mythology.
Enchanted is a new film that builds on the popularity of Princess movies, but it also subtly undermines their foundation, suggesting that real life is preferable to living in a myth. I’m sure there are fans of the hit movie that don’t see that subtext, but I found it a delightful antidote to the Princess Myth, a mythology filled with True Love, but based on simplistic notions of relationships.
Now, my problem isn’t with actual princesses, although I’m not convinced of the symbolic worth of a monarchy to a democratic nation. I also don’t have a problem with the original fairytales from which the classic princesses come – such as those of the Brothers Grimm – since those stories are very much rooted in their time and are filled with historical details. My beef is with the modern Princess Myth as it’s exemplified by those princesses and princess-wannabes found in American film, including such Disney classics as Cinderella and The Little Mermaid and the plucky characters found in movies like Pretty Woman and The Prince and Me.
There are two halves to the appeal of the Princess Myth, one classic and one contemporary. The older half is the image despaired of by feminists for years: Our young heroine sits around, looking pretty and doing little more than longing for happiness until Prince Charming swoops in and saves her, presumably then taking her off to a castle where she will spend the rest of her days doing little more than looking pretty and being happy. While the notion that a man “saves” a woman is still around, there’s also the more material modern half to the myth: Oh My God, wouldn’t it be cool to have all those clothes and live in a fancy house and have butlers and stuff?
In this version of the myth, women can have their cake and eat it too. You can be free-thinking and independent and all that good stuff, but still benefit from having a Prince who supplies that American Express Black Card (annual spending required: $250,000) you need to complete your life.
I’m convinced this is an essential part of the huge appeal of Princess Diana, a fandom that approached hysteria upon her death. There is an element to Diana’s celebrity that relies on her fans believing an inaccurate but basic message: “She was one of us and she made it. She achieved the dream.” Yes, the dream every woman has of being in line to ascend the throne, having your husband cheat on you and dying in a fiery crash in Paris. The stuff of the Brothers Grimm. And, of course, Diana was part of a line of distinguished English aristocrats, the Spencers, who can trace their lineage (and land holdings) back to the 17th century. She was hardly starting life with little more than the rags on her back.
In Enchanted, Amy Adams plays Giselle, a classic Princess type. In the initial animated sequence, which perfectly nails both the classic Disney films of the Forties and those of the Nineties, Giselle sings and plays with her animal friends while waiting for her Prince to come along, whereupon the pair will instantly fall in love and live happily ever after. The happy couple is split asunder when the evil Queen throws Giselle into the real world of Manhattan, where she must try to make her way in a world of diminished expectations.
But it’s interesting how things wind up (here come the blessed spoilers). At some point, Giselle begins to prefer the real world. While she’s dismayed to discover the concept of divorce – she hooks up with a divorce attorney, played by Patrick Dempsey – there’s also a delightful scene where she discovers the emotion of anger, a completely foreign concept to her.
Even though she finally is rescued by her handsome Prince, who follows her to the real world, Giselle finds she is unsatisfied. Her Prince loves her utterly and without reservation, but Giselle has also discovered the concept of the “date,” of going out to talk and getting to know the other person. Why is it those animated heroines always seemed to love their dashing Princes at first sight, almost never knowing much about them?
Giselle battles the wicked Queen and saves her love. She chooses to stay in New York and open a business. She becomes a step-mother to a young girl. She elects to live in the real world, a place where things don’t always work out. Another New Yorker in the film, a rival female character, does choose the fairy tale and heads off for her happily-ever-after with the Prince. But which of these two women has found real happiness?
Editor’s note: P.J. Rodriguez isn’t the only Spot-on writer who’s not a fan of the princess phenomenon. Deborah Klosky’s taken a look at the “real” life version of this story in this post “Princesses are Us.”
Posted by P.J. Rodriguez at 4:53 PM | Permalink
Political humor has become a contact sport.
Stephen Colbert, D.F.A., a South Carolina native and host of the television program The Colbert Report (the two ending T’s are silent), has filed papers asking the state’s Democratic Party to put his name on the ballot.
It didn’t work – party officials said “no” – but it’s still a brilliant piece of political theater. Colbert, of course, is the creation of comic actor Stephen Colbert, former correspondent for The Daily Show, and a recently published author whose first book, I Am America (And So Can You!) easily climbed the best-seller lists. His group of “friends” on the popular social networking site Facebook surpassed 1 million within a few days of its debut.
America has not always proven to have the most subtle sense of humor when it comes to satire. And journalists and politicians are prone to being either disdainful or starstruck when someone like Colbert enters the arena. We’re seeing both as Colbert mounts his campaign – think he’ll ask voters to write him in? – and it’s almost as funny as the man himself.
Political wise men Tim Russert and Larry King gave Colbert very respectful and amused receptions on their programs, with Russert playing along with the gag and treating Colbert as a serious candidate. Maureen Dowd turned her N.Y. Times column over to Colbert. In the Kansas City Star, a high school student wrote an op-ed that declared, “In a country already divided politically, the last thing we need is for the results of an important presidential election to be skewed by a late-night comedian.” MarketWatch columnist Jon Friedman called Colbert (and Jon Stewart) “failed actors” who kind of stumbled into the talk-show gimmick.
But before condemning Colbert as some sort of clown who has no business messing in the serious business of either journalism or politics, note Colbert’s explanation for abandoning his initial plan to file for inclusion on both parties’ ballots. The Democratic primary has a $2,500 filing fee, while the GOP’s is $35,000. As they say, you can’t make this stuff up.
That’s what makes Colbert the perfect remedy for the age of Bush. It’s all the nuttiness you may desire, leavened with humor and humanity. For example, in a 2004 article by Ron Suskind, a Bush aide mocked “what we call the reality-based community,” declaring that “We’re an empire now, and when we act, we create our own reality.” Colbert is famous for saying, “Reality has a well-known liberal bias.”
A key part of the appeal of the character is that the audience is invited into the process – and not just by casting their election-day votes. “Stephen Colbert” is a very arrogant character, very demagogic and messianic, and it’s a lot of fun for the audience to join in on the process of feeding his outsized ego and delusions of power. For example, Colbert introduced the concept of “wikiality” and encouraged his viewers to edit an entry on African elephants; so many did so that Wikipedia had to lock the page down. He encouraged his viewers to vote online to name a Hungarian bridge after him, and handily won the contest. My sense is that his candidacy is not a serious effort, nor is he (at this time) an actual symbol of voter dissatisfaction. But it doesn’t matter. His campaign – even the idea – is hilarious.
The brilliance of Stephen Colbert is not just how well he parodies conservative commentators – because he’s not just tweaking the FOX News crowd, but also such crusading CNN anchors as Anderson Cooper and Lou Dobbs – but because he provides an antidote to our times. There’s much that is bad in the daily news and you can either find the humor or wallow in despair. I know where I prefer to go: straight to the Good Book (in this case, Colbert’s).
If a Harp Seal needs money that badly, it should do what I do. I hold a little fundraiser every day. It’s called Going to Work… And don’t give me “Harp seals can’t survive in an office habitat,” because that excuse doesn’t hold water any more, thank you very much, Americans with Disabilities Act.
This is the other key difference between the actual pundits and the fake one: wit, a wit that comes from actual caring and insight. In a 2006 interview with TV critic Tim Goodman, Colbert, the actor, described his character as living a “completely unexamined life,” which means that “he can indict himself with what he says and constantly say things that prove the falsity of his beliefs without knowing it.” In a 2005 interview with Fresh Air‘s Terry Gross, Colbert said that the key to his satire is simple: “If you maintain your humanity, if you don’t think like a joke is more important than being humane, like not talking about tragedy or not questioning someone’s dearly held beliefs religiously, if you can keep in mind a certain level of humanity, then that’s a good guide to what you can or cannot talk about.” Think conservative commentator Ann Coulter knew of this principle when she said of outspoken 9/11 widows: “I’ve never seen people enjoying their husbands’ deaths so much”? Probably not.
But while Stephen Colbert provides solace for liberals, I would be a little worried about him if I were a conservative – and not because of the South Carolina primary. I would be concerned that a fake conservative TV pundit could say such crazy things and sound so much like the real pundits that people would start to blur the difference. One last bit from Colbert’s book provokes the question: Is this really satire?
As gay people are increasingly integrated into society and accepted as friends and coworkers, there is a new threat looming on the horizon.
The threat that we will forget to feel threatened by them.
On this final battlefield, the greatest casualty of all may be our anger.
So I, for one, am delighted by the whole thing. Just imagine if Colbert actually got enough votes to become a spoiler – someone who could draw votes away from a front-runner. If that happens, Colbert will have made a very important point: A vote for his bombastic, insensitive, over-the-top moronic egotist is an entertaining way for voters to say “none of the above,” rather than simply staying at home and not voting at all.
Would his critics be able to handle the truthiness of that reality?
Posted by P.J. Rodriguez at 4:18 PM | Permalink
There are plenty of clichés about children. They are the future. They are beautiful, innocent little beings that will inherit this world. Or as Jack Handey once said, “The face of a child can say it all, especially the mouth part of the face.”
And there’s something really fascinating about watching children engage in adult pursuits, to see what they do differently and what they do exactly the same – horrifyingly, the same. Forget the current reality show Kid Nation. For a look at children that also rips the lid off the flaws of the democratic process, you won’t see a program more hilarious and frightening than Please Vote For Me, this coming week’s episode of Independent Lens, the public television series that showcases independent films and videos.
Please Vote For Me is an hour-long documentary by Chinese filmmaker Weijun Chen. He was asked to contribute to the Why Democracy? project, which solicited films from all over the world with an eye toward examining contemporary democracy. Making a film on such a subject can be very dangerous in China, especially when involving adults. Then Chen heard from a friend and colleague about an election that was being held for third graders to elect a class monitor, a position previously appointed by the teacher. Please Vote For Me documents that election and captures some very familiar political techniques.
The teacher had selected the three candidates. Xiaofei is a smart but sensitive girl. Her mother is divorced. Luo Lei is small and wiry. He was the previous class monitor and is viewed as a bully who abused his power. His father is the police chief. Cheng Cheng is physically bigger and more outgoing than the other two. His father is Weijun Chen’s friend, and works as a TV producer. These kids are all eight years old.
Their school is located in Wuhan, which is about the size of London and is the most populated city in central China. The children and their parents are part of China’s new urban middle class, which only makes up about one-fifth of the population of that country, but is an influential group. Thanks to China’s One Child Policy, parents are very focused on their children’s ambitions and achievements, and each candidate’s parents get right to work as campaign managers.
They push the kids to be aggressive, to be crafty. They contribute bribes, such as when Luo Lei’s father takes the class on a trip on a new monorail system. As Chinese citizens, they have little direct experience of democracy, but they clearly know what it takes to be elected.
But even though the parents start things off, most of the program focuses on the children, in the classroom and on the campaign trail. The kids figure out what works quickly. Xiaofei is clearly the candidate that’s too “nice” to be participating in politics and she’s quickly crushed. Cheng Cheng manages to intimidate Xiaofei in class and then blame it on Luo Lei. The two boys debate face-to-face and you see one trick the other into taking an untenable position. The candidates dredge up each other’s past behavior. They accuse their opponents of running nasty campaigns, while simultaneously doing the exact same thing. It’s the filthiest, dirtiest, nastiest campaign you’ve ever seen – except for almost every U.S. election ever held.
Some of what we see is a product of Chinese culture. A Chinese adult’s success is mostly reflected in the achievements of his or her child. Then those children grow up and are expected to support their parents and grandparents. Clearly, the candidates in this election are pushed by their parents. But there are plenty of scenes of them in school where they’re making decisions on their own and doing what they think they need to do to win. That’s what’s funny about kids regardless of the culture in which they’re raised. Adults can push and prod them into behavior and then children will run with it. Where do adult desires end and children’s begin? Sometimes it’s hard to tell.
This onscreen campaign reflects our own democratic process filtered through layers and then shot back to us from across the globe. It’s unnerving how much bad behavior and ruthless campaigning feels ripped from the headlines. It’s The Little Rascals meets The Candidate. I urge you and every voting adult you know to watch this show. As another cliché puts it, “Out of the mouths of babes…”
Editor’s Note: For another look at how China’s culture filters Western ideas and ideals, see Spot-on’s Jonathan Ansfield’s report from Beijing, “Where Less is More.” To find out when Independent Lens airs in your area, check your local listings.
Posted by P.J. Rodriguez at 3:07 PM | Permalink
Once upon a time – a time not that long ago, one I am old enough to remember – Michael Eisner, the man who ran The Walt Disney Company, was a feared and respected media executive. Arianna Huffington was the wife of a wealthy man spending hard on a run for the U.S. House of Representatives to represent California’s 22nd District.
But, as they say, the Internet changes everything. Last week, I attended the WebbyConnect conference, the first summit organized by the folks behind the Webby Awards. Huffington and Eisner were among the participants and the differences in how they were received by attendees are instructive in what it takes to make it in today’s environment.
Author Marc Prensky has popularized the idea of “digital natives” versus “digital immigrants” to distinguish between those people born into the Digital World, who are completely familiar with its workings, and those who have had to enter into it and learn the geography. There are young people who don’t know a world without color television, remote controls, digital cable, the Internet, high-speed data, personal computers, and so on. Prensky argues that these younger people “think and process information fundamentally differently from their predecessors,” because they were born into the Digital World and are native speakers of its inherent language. There are also older people who have learned to make the transition into the new world, even though that world may not always be completely natural for them.
In theory, Eisner is the big dog of the two. After stints at ABC and Paramount, he went to Disney, which he ran with an iron fist. He was forced out two years ago and is now playing in the digital space. His investment firm The Tornante Company launched Vuguru, which has had some success with the online series Prom Queen. Arianna Huffington came to public notice as the wife of millionaire Michael Huffington as he ran for Congress as a conservative. After his defeat and their divorce, Huffington got into political commentary, teaming up with the likes of Al Franken and Bill Maher, ran for governor of California (really, a book tour in disguise) and now takes progressive reformist stances. In the Spring of 2005, with backing and support from former Time Warner executive Ken Lerer, she launched The Huffington Post, a website offering news and group blogging, which quickly grew in popularity.
Both are unlikely independent media moguls of the digital age; they regularly hobnob with the rich and famous. Eisner told a story of biking through steep Italian hills when his company was being launched; “tornante” was on the road signs, indicating hairpin curves. Huffington dropped such establishment names as historian Arthur Schlesinger and Hollywood agent Ari Emanuel (the real world model for the Ari character in the HBO show Entourage). And even though Huffington and Eisner are roughly equal in terms of star power, the responses of the crowd were very different. The crowd seemed to like Huffington just fine, and to respect her as well. But the response to Eisner was decidedly cool. Fun was made of the hairpin turns. Some thought Eisner was lecturing them. He came across as old-fashioned.
Huffington talked a good game. She said the Old Media had Attention Deficit Disorder, covering flashy stories and then quickly moving on. She spoke about new forms of journalism that shattered the old model of the men and women covering political campaigns “on the bus,” spouting conventional wisdom while caught in the echo chamber. She suggested that rather than choosing between print and the Internet, we embrace them both in a three-way, what she called “Promiscuity for profit.”
In contrast, Eisner seemed a little lost in the future. He related an incident in which he took a flight on JetBlue; when he landed, he found a blog post, complete with a photo of him from a camera phone, suggesting that he must be in financial trouble for choosing that airline. He said that content providers had an “obligation to exercise good taste” in order to ensure that the government didn’t step in with regulation. He argued that there is a place in the new order for the editor, a place for culture, humor, filtering. My impression is that some audience members heard the talk as a lecture from a parent. To me, he sounded more like a dad struggling to be cool and responsible at the same time.
I made a note for myself at the end of his remarks: “He knows how to make entertainment. He knows how to make money.” I think both of these factors are important, even in today’s digital environment. I’m glad he’s in the game. I wonder if anyone else cares.
Arianna Huffington spoke to an audience made largely of natives, and she was seen as an immigrant, one who has learned the native tongue. Poor Michael Eisner – the man generally credited with revitalizing almost every aspect of The Walt Disney Company, from its animation business to its broadcast and cable TV offerings – was nothing but a digital tourist.
Posted by P.J. Rodriguez at 9:28 AM | Permalink