Culture, media

Getting out of the bubble

NBC’s 90th anniversary show last weekend featured a heavy dose of former and current stars sharing memories of how certain shows were so “important” or “ground-breaking.”

“Come on,” I found myself thinking at various times. “This is television. This is passive entertainment we watch because it’s easier than reading and we don’t feel like putting on pants and going out.”

On Medium, I wrote about NBC’s inflated perspective – and how such a mentality might bleed over into the news division. But it isn’t hard to see how this would happen – and it doesn’t come from a place of arrogance. Anyone who works in a field, or in a given place, runs the risk of an altered perspective. People who work at NBC for years, and develop an understanding of its history, could be excused for over-inflating its importance (especially on a program designed to showcase the network’s programming). Similarly, it’s understandable why someone in the news division might conflate any attack on a media outlet with a full-on assault on the First Amendment.

Cultural bubbles exist. And while they may not pop easily, you can at least see outside of them, if you’re looking. For reporters, that’s going to become even more important in the coming years.

That’s not to say that television shows have not had meaningful cultural impact, nor that criticisms of the press could devolve into the erosion of press freedoms. It just means that the occasional dose of bubble-popping perspective is healthy and necessary.

Culture, Uncategorized

The First Black President

Here’s a real “check your privilege” moment. Did you know that, in 1971, Bill White became the first black play-by-play announcer in sports when he took to the mic for the New York Yankees? It took until 1971 for that to happen.

It makes sense when you think about it: Teams tend to hire former athletes as their sportscasters, and until 1947 there weren’t any black players in the major leagues. So seeing a black sportscaster 24 years later seems about right – except, of course, that neither of those color lines should ever have existed in the first place.

Still, I had no idea that White was so significant until I read this post on The Undefeated. (I just knew him as Phil Rizzuto’s former broadcast partner.) The piece uses White’s legacy to point out how Barack Obama’s presidency has changed the perception of the remaining color barriers: Obama has made those barriers feel temporary. If a black person can be President, we assume there will be a “first black [INSERT ANYTHING HERE]” at some point. Time, more than prejudice, is the enemy now.

Whatever you think about now-former President Barack Obama (I have some opinions), that legacy alone means something. As a white guy, I can’t fully appreciate it myself, just as I took for granted growing up hearing Joe Morgan and Ken Singleton call baseball games. When I was young, my parents told me that if I tried hard enough, I could do or be whatever I wanted. It’s hard to imagine a parent having to tell their child the opposite – that no matter how good you are, some doors will be closed. Whether it was always true or not, that was a legitimate feeling in communities of color.

Among the debates surrounding the legacy of our 44th President, this accomplishment is worth celebrating. It’s sad that there was once a color barrier on the baseball field, or in the broadcast booth, or any number of other places. Now, hopefully, we can know there will never be a time like that again.


Culture

Faith Hill’s “Where Are You Christmas?” is the worst kind of bad song

Christmas music doesn’t have to be good to be enjoyable. It’s fun to hear the mixed bag of it all. Pointless, upbeat ditties like “Rockin’ Around the Christmas Tree” share the radio dial with the reverent “O Holy Night” and the wistful “I’ll Be Home for Christmas.”

Which brings us to the song I’m picking on: Faith Hill’s “Where Are You Christmas?” It’s a swing-and-a-miss of a Christmas song.

You’ve heard the song – in fact, you’ve probably heard it a few times just this year. It was, of course, the signature song for Ron Howard’s 2000 adaptation of “How the Grinch Stole Christmas.”

The tune isn’t bad. The lyrics ruin what could be a great song.

The first stanza:

Where are you Christmas?
Why can’t I find you?
Why have you gone away?
Where is the laughter
You used to bring me?
Why can’t I hear music play?

Hill goes on to sing about how time has changed her as a person, and wonders whether Christmas will ever bring the same joy it did in her youth.

So far, so good. There could be a resonant story in here: Hill is singing about something many people go through. As we grow up, Christmas means different things to us. It’s not a bad setup, with the potential to become a relevant, affirming song.

After two stanzas of that, we hear this bridge:

Christmas is here
Everywhere, oh
Christmas is here
If you care, oh

If there is love in your heart and your mind
You will feel like Christmas all the time

Okay, I guess? Surely there must be more to this journey. But no, there’s just the last stanza.

Oh, I feel you Christmas
I know I’ve found you
You never fade away…

And that’s pretty much it. To paraphrase the song’s main narrative points:

  • I don’t feel the Christmas spirit this year.
  • It’s Christmas.
  • Okay, I feel the Christmas spirit this year.

This is a Lucy Van Pelt level of holiday psychiatry.

It works a little better viewed as a companion piece to the Grinch movie, in which the Whos down in Whoville are wrapped up in the material trappings of Christmas at the expense of the Christmas spirit.

Still, this touches a nerve. It could be a really good, unique song. Many people have a soft spot for the Christmases they celebrated as kids, when the magic just seemed to happen all around them. Growing to adulthood brings the assorted stress points of the holiday season. (Sidebar: That topic was covered in another carol, “The 12 Pains of Christmas.”)

The payoff for being an adult at Christmas is getting to be the magician who makes the season wonderful for others. Being a musical soliloquy, the song doesn’t tackle that. At the beginning and end of the song, Hill sings about feelings the audience can identify with, but she skips the transition that would connect them.

There’s a story in there, one that audiences would hear and identify with. The songwriters should have had Hill sing about watching her kids at Christmas, or about bringing joy to others. They could have created something with depth that spoke to contemporary audiences. The potential was there to create a true modern classic in the tradition of The Waitresses’ “Christmas Wrapping” or Dan Fogelberg’s “Same Old Lang Syne.”

Instead, they skipped the depth and crapped out a shallow, schmaltzy song to promote a mediocre movie. Like a half-assed Christmas gift, it leaves you wishing they hadn’t bothered in the first place.


Culture, media

Ghostbusters 2016?

In the run-up to the new Ghostbusters movie, much of the marketing had a clear undertone: “Go see this movie so the anti-woman internet trolls won’t win,” it seemed to say. In fact, in an odd parallel with the 2016 presidential campaigns, this message has eclipsed any discussion of the movie’s actual quality.

Lost in the discussion about whether a female-led Ghostbusters franchise reboot can succeed is this: Why is “Ghostbusters” considered a franchise at all? There was the excellent original movie in 1984 and a cash-grab sequel in 1989. There were tie-ins: the toy-driven kids’ cartoons that ran from the mid-1980s through the early 1990s, and the 2009 video game with a plotline that, on the big screen, could have been the third part of a trilogy. Importantly, most of these center on the same characters as the original movie.

But media coverage of this year’s reboot seems to accept the idea that Ghostbusters is on par with the likes of Star Wars, Star Trek, the Marvel Cinematic Universe, Superman, and other film properties with long track records of success. That’s just not true. As an example, when Star Wars: The Force Awakens hit theaters last year, it was the seventh movie in a lineup that enjoyed mixed critical reviews but scored big box office numbers across multiple decades, and – this is important – inspired an expanded universe of new characters. Ditto for the recent Star Trek movies, which recast the characters while, incredibly, keeping the old ones. And that’s in a universe that has enjoyed multiple successful spinoffs only tangentially related to the adventures depicted in the original television series. Again, until last week just about every successful incarnation of the Ghostbusters centered on the same four original characters.

That creates unreasonable expectations for Paul Feig’s 2016 Ghostbusters, which flushes the old story completely in a very limited universe where the old story was pretty much the only story.

If the new Ghostbusters sees its box office returns dip, don’t blame sexism. Blame Sony Pictures for green-lighting a new house on a pretty shaky foundation.


Culture, Funny Stuff

Funny First

“Indeed, work whose Christianity is latent may do quite as much good and may reach some whom the more obvious religious work would scare away. The first business of a story is to be a good story.” – C.S. Lewis

C.S. Lewis might have been talking about religion, but his words apply to politics, too. Overt politics makes for bad entertainment.

It’s a lesson America’s political conservatives certainly ought to have learned by now. Right-leaning would-be entertainers have spent years trying to counter the left’s dominance of the culture with movies that clumsily and unsubtly push conservative ideas. The list of failures is considerable. The awful 2011 film adaptation of Ayn Rand’s Atlas Shrugged bludgeoned audiences with bad acting, forced dialogue, and an anti-government message. In 2008, David Zucker’s An American Carol pushed unabashed patriotism with poor satire and awkward slapstick. Fox News tried to counter The Daily Show’s bias with “The Half Hour News Hour” in 2007 – a Weekend Update wannabe whose laugh track was the only way viewers would know where the jokes were. There are enough examples to prove that artists who put political messages first and their art second will lose their audiences.

That lesson applies on the left, too.

Will Ferrell caught heat recently after media reports linked him to the title role in a project titled Reagan. The satirical comedy reportedly revolved around staff members coaxing the former President through his second White House term as the fog of Alzheimer’s disease set in.

Dementia is comedy gold, right?

Enough people thought otherwise – including Reagan’s family – that Ferrell backed out of the project not 48 hours after those reports hit the mainstream news.

Unfortunately, the screenplay’s apparent goals went beyond political satire. Positioning Reagan – still the champion of so many on the center-right – as a witless buffoon comments negatively not only on conservatism, but on people living with Alzheimer’s as well. The would-be filmmakers (including Ferrell, screenwriter Mike Rosolio, and others attached to the project) seem to have allowed politics to cloud their judgment about what audiences would laugh at. Blinded by ideology, they lost sight of comedy.

It’s too bad, because there’s a nugget of value in that plot. Imagine this alternative: A party leader, desperate to win some race (maybe a state legislative seat, or even Congress), hatches a plan. He recruits an aging, politically uninvolved former actor – one who doesn’t watch much TV or pay attention to social media – into appearing in a “movie” about running for Congress. Except the actor isn’t filming a movie; he’s filming commercials and participating in actual debates rather than staged scenes. Now imagine Ferrell, playing a comically demanding prima donna actor past his prime, as the hapless, unwitting candidate. (Maybe Steve Carell could play the unscrupulous party leader.)

In this version, the objects of satire are party leaders and political image-makers. The film doesn’t target anyone suffering from Alzheimer’s, or cast the voters and supporters of any particular side as easy dupes. It wouldn’t have the major buzz that controversial subject matter attracts, but with smart, witty writing and a tight plot, it could achieve the kind of cult-hit status that films like Dave or Thank You for Smoking have enjoyed in political circles.

The film was early in its development. Perhaps, had news of the project not been so widely reported, smarter minds would have revised the concept as the script went through re-writes. More likely, the production would have suffered the same insular groupthink that made it acceptable to use dementia for laughs because of the patient’s political party. The most probable result would have been a disastrous finished film that inadvertently spent two hours making fun of people stricken with Alzheimer’s.

Audiences don’t want movies that sacrifice a story in pursuit of political points. Ferrell, Rosolio, and company should be happy they learned this lesson before they sank any more time and money into a sure box office bomb.

Culture, Uncategorized

That mad world of blood, death, and fire

Each March, someone on Facebook posts a video of Liam Clancy singing “And the Band Played Waltzing Matilda” for St. Patrick’s Day. The song’s protagonist is an Australian World War I soldier and its writer is Scottish-born, but Irish singers seem to do it the most justice. It was written in 1971, so you’d be excused for categorizing it with the anti-war songs of that era. But give it a listen after you’ve spent about 24 total hours listening to Dan Carlin talk about World War I, or read any of the grisly accounts from the period, and the song takes on a much different tone.

World War I was called “The War to End All Wars” because of the near-universal realization that modern warfare sucks. As the song suggests, killing technology has been getting much more efficient over the past century and a half, and WWI was the first chance to observe that trend.

Since World War I, most long-term conflicts have had some sort of moral reasoning behind them. World War II fought Adolf Hitler’s plan for world domination; the Cold War fought the Soviet plan for world domination; the War on Terror fights jihadis who use radicalized Islam to justify their plan for world domination; and so forth. World War I was, in many ways, a local territorial war that expanded because of alliances and agreements among great powers. For example, if France and Russia hadn’t had an “I Got Your Back If You Got Mine” treaty, Germany might not have invaded France – heck, maybe Great Britain wouldn’t have been in the war at all.

When you think about how much of that war was triggered by paper and handshakes, and then read or listen to how ill-prepared the military leaders and troops were for the shift from horses and swords to tanks and machine guns, “And the Band Played Waltzing Matilda” becomes that much sadder.

Culture, media, Politics and Grassroots

Obama doesn’t have to go to Nancy Reagan’s funeral, but I wish he would

Vice Presidents are supposed to be the U.S. Government’s designated funeral attendees. There’s no reason President Obama should feel obligated to spend his time there. The demands that he drop everything to pay respects to Nancy Reagan, and before that to Justice Antonin Scalia, are shrill and senseless. They delegitimize the numerous valid criticisms of the President.

With all that said, don’t you wish he had gone?

After winning the 2008 campaign with soaring rhetoric of ushering in a new era of cooperation in Washington, Obama promptly reminded Congressional Republicans, “I won” when they expressed concern over his policies. His reelection was far from a rousing national endorsement; his campaign’s groundbreaking GOTV efforts squeezed every ounce of support from an electorate with mixed feelings.

This is the current President, but it could just as easily have been our former President. The left despised George W. Bush just as the right despises Obama, and W similarly squeaked through a close reelection relying on base voters. The man who claimed he was “a uniter, not a divider” saw a more fractured Washington in his rear view mirror when he left office than the one he had found eight years prior.

It adds up to 16 years of acidic national politics, and the choices for 2016 don’t appear likely to end the cycle.

With his days in the White House slipping into history, a warm gesture by the President to the other side would offer some glimpse of the idealistic young Senator we got to know in 2008 – and, perhaps, bandage some of the wounds. Scalia was beloved by thinking conservatives; Reagan was the First Lady to the man who, as more time passes, may prove to be the last pinnacle of post-World War II Republican Party success. Showing up at these funerals would have symbolized more than condolences; it would have clearly told the other side, “Hey, nothing personal and no hard feelings.” President Obama probably didn’t understand the significance of these two figures to his opponents across the aisle; otherwise he might have rethought his schedule.

(From a calculating, partisan perspective, it would also give the digital cheerleaders and opinion leaders within his base some motivation. “Look how magnanimous our Dear Leader is,” they could crow on Twitter.)

With eight years of sins on his record and almost two decades of political acrimony as a backdrop, surely these overtures would be rejected by some and ignored by still more. That doesn’t make them any less right. Eight years later, it would be nice for the President to go the extra mile and stand up for real change – especially because he doesn’t have to.