
Thursday, October 7, 2010

Today In Bullshit

Probably the biggest downside to having professional training in psychology is learning to cope with the cringe response as mainstream news and opinion types contort the discipline in idiotic ways to support arguments that mostly aren't worth making. A sterling example is an article published today on Slate, part of an ongoing series on "how your unconscious mind shapes you," that attempts to explain today's heated political climate by comparing the collective partisanships of the left and right to a married couple seeking counseling. The author, Shankar Vedantam, draws on research on predictors of marital conflict and dissatisfaction (conducted by John Gottman, the biggest name in the marital therapy field) to enlighten us on the fact that the "right" is expressing anger toward the "left" while the "left" is expressing contempt toward the "right," which, by the way, is provably more toxic to the health of a marriage, and therefore worse for society by the logic of Vedantam's incredibly tortured analogy.

The first point to make here is one so obvious that Vedantam acknowledges it himself in the second-to-last paragraph of his article: opposing political persuasions are nothing like a marriage. The point of a marriage is to facilitate bonds of love and support between partners, which can be threatened by an excess of disagreement and dispute. Politics is about disagreement and dispute. If it weren't, there'd be no need for multiple political points of view.

I suppose Vedantam might argue that our political discourse today is uniquely marked by anger on one side and contempt on the other, and that the emotional tone is baked unconsciously into one's political leanings (this may be the point he's making in the article, but it's difficult to tell because it's such an incoherent piece of work). That doesn't wash, though, because political tone, like everything else in politics, varies dramatically based on who's in power and who's out of it. Think back to the bygone days of the 2004 election, when Republicans controlled the executive and legislative branches. At that time, the Democratic base was at the peak of a nearly decade-long angry fist shake at George W. Bush. Meanwhile, the Republican base was sneering at John Kerry for having the sheer balls to be a decorated Vietnam veteran. Do those emotions sound familiar?

The thing that really gets me about this article is that it pulls the old trick of analyzing our "political discourse" without much, if any, actual reference to those who hold political office. I suppose if your sample of liberal thinking is a smattering of blogs and Keith Olbermann, you could make the argument that contempt for the right is a dominant emotion, but wouldn't it be a good idea to mention President Barack Obama, who ran on promises to pursue bipartisan compromise and has, with severely limited success, actually tried to do so? This "both sides are at fault" thinking has gotten almost comical in an age when Senate Republicans have filibustered close to a hundred bills in the past 20 months.

So, no, marriage counseling can't tell us anything about liberals and conservatives.

Tuesday, October 5, 2010

The Social Network review

As you may or may not have already heard, The Social Network is an enormous critical hit. I honestly can't recall the last time I saw a movie that was so widely acclaimed, which is particularly impressive considering it's a feature-length movie about motherfucking Facebook. It actually reminds me a bit of when Brokeback Mountain was announced and endured 15 months of gay cowboy snark before being rapturously received upon its actual release. Granted, Brokeback Mountain seems to have since had more of a shelf life as a punchline, because making jokes about gay people never goes out of vogue, and maybe in five years no one will remember why everyone thought that a movie about Facebook was so great, but right now, it's a pretty big deal.

I saw The Social Network over the weekend. I liked it a lot, and I think it's a great movie. In the couple of days since I saw it, though, I've been doing a lot of thinking about exactly why it's a great movie, and I've found the answer pretty difficult to pinpoint. Part of the issue is that The Social Network is supposed to be a movie about the founding of Facebook, but it's not primarily concerned with telling that story as a dramatic narrative. It's really more of a character piece that focuses tightly on Jesse Eisenberg's portrayal of Mark Zuckerberg. One of the most interesting things about the movie is the way it purposefully declines to depict the larger context and effects of Facebook, even though the exploding popularity of the site is the major driver of the narrative. There's no montage of college students at their computers getting hooked into the Facebook phenomenon, nor anything comparable to dramatize the network's broadening impact beyond snatches of dialogue and other exchanges.

It's a smart decision, because no moviegoing audience in 2010 needs to be told that Facebook is a big deal. It also reflects the clear fact that nobody involved in the making of this movie gives two shits about Facebook. That's understandable, but where it really gets interesting is that The Social Network also doesn't seem to be terribly concerned with being about Mark Zuckerberg, insofar as Mark Zuckerberg is an actual human being who exists, founded Facebook, and runs it today. Comparing The Social Network to Citizen Kane already feels like something of a cliche, but thinking about The Social Network as something of a goof on the narrative structure of Kane is really the most useful framework I can conjure to discuss it. Both movies tell the story of the rise of a wealthy man, but do so mainly through the perspectives of others. This isn't entirely obvious in The Social Network, mostly because Zuckerberg's character is alive and present during the telling of the story while C.F. Kane is dead, but the framing device of The Social Network (two depositions regarding lawsuits filed against Zuckerberg) signals that the storytelling reflects the biases of the plaintiffs on key points, rather than objective reality. Basically, The Social Network has two main characters: "Mark Zuckerberg," an asshole computer genius who may or may not have screwed over other people on his way to creating a world-beating Internet company, and Mark Zuckerberg, an asshole computer genius who points out various flaws and inconsistencies in the story of the first character as it's being told.

The reason I called The Social Network a goof on Citizen Kane's narrative is that while Kane explores the flaws and complexity of its main character in an ultimately futile quest to arrive at a larger understanding of his identity, The Social Network doesn't really ask any questions about Mark Zuckerberg at all. Eisenberg's Zuckerberg is a fascinating character to watch onscreen, but more because of his lack of complexity than the presence of it. The character can be essentially summarized by extremes of two traits, intelligence and self-absorption, and it's the latter that seems to be of the most interest to the filmmakers. I think it's entirely fair to argue that The Social Network is about solipsism more than it's about anything else. The genius move is that the movie explores this by focusing entirely on the founder(s?) of Facebook while ignoring the users entirely. If Fincher and Sorkin explicitly said that social network addicts are disappearing up their own asses, The Social Network would probably have come off as reactionary bullshit. Instead, by weaving a creation myth by which Facebook was born out of a series of interlocking acts of self-absorption, they make the argument by proxy. The Social Network's Mark Zuckerberg isn't really a person as much as he is an avatar of a perceived generational flaw. As arresting as the closing image of the film is, it struck me as more of a red herring than a character insight - I don't believe Fincher or Sorkin think they're explaining anything substantial with it, just as Charlie Kane's secrets weren't really unlocked by that sled. (It may be a similar added "fuck you" to the character's real-life counterpart, though: The Social Network is pretty blatantly drawing on shopworn computer geek stereotypes, and "Rosebud" was reputedly William Randolph Hearst's secret nickname for his mistress's vagina.)

I realize that all of this was probably pretty incoherent if you haven't seen The Social Network yet, so here are a couple of general sentiments about the movie itself: the acting is phenomenal; the composition and cinematography are stunning, doubly so considering it's an entirely dialogue-driven movie about computers; the score is great; and it features movie history's hands-down most convincing use to date of one actor playing both halves of a pair of twins. Go see it already.

Wednesday, June 23, 2010

Alan Wake review

Above: Memo to prospective Alan Wake players: Hope you like woods.

I was quite looking forward to Alan Wake, partly because of my admiration for the Max Payne noir-shooter diptych by the same creators, and partly because it promised to borrow narrative inspiration from pulp thriller novels and TV rather than the standard video game muses (respectively: Aliens and other video games). Briefly, Alan Wake is an action game in which you play a famous author who retreats to a secluded community in the Pacific Northwest to cope with a chronic case of writer's block. Shortly after his arrival, his wife is kidnapped under mysterious circumstances, and he blacks out for a week. When he comes to, he finds that he has written a manuscript he has no recollection of composing, and further discovers that the town has been taken over by a shadowy presence that possesses people and objects and imbues them with murderous intent. As you might guess, your role is to take control of Alan Wake, confront these people/objects, and shoot your way to the truth.

The gameplay hook in Alan Wake is that the possessing force renders the enemies impervious to injury, so you can't just shoot them outright. You first have to use a flashlight or other light source to burn away the darkness that protects them. This isn't the most mindblowingly original conceit, but it's a clever way of fulfilling several gameplay functions. Most significantly, it amps up the tension by increasing the amount of time between spotting an enemy and being able to kill it, and does so without gimping the controls, which is the route most other horror-shooters take. It also allows your flashlight beam to double as a crosshair, which goes a long way toward minimizing the HUD. Third, it gives a gameplay excuse for the constant showcasing of Alan Wake's lighting graphics, which are quite impressive.

The thing about Alan Wake is that it's such a solid and well-crafted game that the few shortcomings it has seem all the more nagging as a result. Gameplay-wise, there isn't a whole lot to be mad at: the balance between burning away shadows with your flashlight, shooting, and keeping track of multiple enemies is fun and challenging, and the controls are very solid. The dodge button, which needs to be combined with a directional press, is particularly well implemented. When you pull off a successful dodge, which takes enough skill that you can't just spam the button, the game shifts into slow-mo for a second to showcase just how close you came to getting nailed by an axe aimed at your head or what have you, which leads to any number of memorable close-call moments. The graphics are great and do a lot of heavy lifting in terms of creating a spooky atmosphere. It's an enjoyable game, and it has a lot to recommend it on that level.

Alan Wake, however, has set its sights a bit higher than "enjoyable game." This much is clear from the unique structure that divides the game into six episodes, which begin with 'previously on' recaps and end with cliffhangers. This is a narratively focused affair that wants to be a bold statement of purpose for gaming as a storytelling medium. And it's actually fairly effective at it; I liked playing the game half an episode at a time, and the plot twists and the pull of finding out what happened next were a big part of what kept me engaged. It's refreshing to see a game put a clear emphasis on story and pacing.

The problem with Alan Wake as a narrative is that it can't balance its aspirations toward originality with its desire to pay homage to its influences, and the latter too often overwhelms the former. As reviews of Alan Wake never fail to note, the game is heavily inspired by the works of Stephen King and David Lynch. The Stephen King angle isn't really so bad, even though King is actually mentioned by name at least twice in the game's dialogue, but the constant cribbing from David Lynch in general and Twin Peaks in particular becomes actively distracting very early on in the game. Now, if this were limited to the 'unsettling things happening in a bucolic Northwestern town' aspect, I'd say fair play and leave it at that. However, Alan Wake has the gall to deploy naked facsimiles of the characters of Shelly Johnson and the Log Lady from Twin Peaks. It uses coffee thermoses as hidden collectible items, with the inevitable associated Achievement titled Damn Good Cup of Coffee. There was a part early in the game where a character told me to go to a lodge, which made me groan audibly, although fortunately the lodge in question proved more concrete than the one from the show. The game's boner for David Lynch is such that the song soundtracking the first end-of-episode title card is "In Dreams" by Roy Orbison, and although I'm sure I probably don't need to jog your memory as to why that's relevant, I'd be seriously remiss if I didn't take the chance to embed:

[Embedded video: Roy Orbison, "In Dreams"]
This is probably much less of an issue for the vast majority of Alan Wake players, who likely don't care about the subtle line between a deft professional homage and a vaguely embarrassing fanboyish one. My issue is less about Alan Wake trying to punch above its weight class and more about a serious missed opportunity to incorporate its influences on a deeper level. The brilliance of Twin Peaks was the way it placed its unsettling and avant-garde elements within a wholehearted embrace of the formal strictures of the primetime soap opera format. Given that video games live and die by convention, there was a huge opening for Alan Wake to do the same thing within the milieu of third-person shooters. Instead of balancing the base gameplay against something more experimental that takes advantage of the interactive form, however, Alan Wake too often opts to cut-and-paste David Lynch. The only point at which I felt Alan Wake was doing something truly different comes in a playable sequence that closes out the game, and that's tucked safely away after the final boss fight, causing it to feel set apart from the "real" game.

To be fair, there's a lot to applaud about the way Alan Wake approaches the narrative-gameplay fusion. For one, the game works a subtle, non-superfluous rationale for the existence of scattered ammo and supplies into the narrative as it progresses. There's also a significant aspect of the plot that struck me as inspired by Diary, one of Chuck Palahniuk's best novels; if this is intentional, it's carried out with the kind of grace I wish had been used in incorporating the influences I mentioned above. For another, although the final boss is rather limp, Alan Wake has one of the better endings to a game story I've seen in a while; it goes out on an ambiguous note without skimping on a sense of resolution. Granted, the former probably has more than a little to do with the impending DLC bonus episodes (of which the first is free to retail buyers who keep the voucher packed into the box; classy move there), but it still works within the context of the core game.

All told, Alan Wake is a worthy game. Given the focus on story and atmosphere, it seems like it might be one of those games that's fun to watch as well as play. Despite my quibbles with some of the choices, I'm looking forward to checking out the downloadable bonus episodes later this year, and I do hope that it does well enough to fund a sequel where the designers can hopefully broaden their palette some more.

Saturday, June 5, 2010

Prince of Persia: The Sands of Time: The Movie, or An Object Lesson In Why Hollywood Can't Make A Decent Video Game Adaptation

Above: The hands-on-hips pose makes Jake Gyllenhaal look less like a fearsome Persian warrior and more like he's waiting impatiently for bar service at a leather club. Who does a boy have to blow to get a vodka and Red Bull in this place?

Last weekend, when I was visiting my family and girlfriend in St. Louis, we all went out to see Prince of Persia: The Sands of Time. My sister was the main person interested in seeing it, but I was kind of curious myself, seeing as this is probably the highest-profile and most expensive video-game-to-movie adaptation to date, having been midwifed by blockbuster merchant du jour Jerry Bruckheimer in a thinly veiled attempt to replicate the success of the Pirates of the Caribbean movies. Plus, I've played the 2003 Xbox/PS2 game, which I enjoyed and which is widely regarded as a minor classic to boot, so I had a pretty good point of comparison against which to judge it.

I'm not going to waste a lot of time reviewing the movie itself - it sucked, but if you saw the ads, you probably guessed that already. I would like to point out that while I admire Jake Gyllenhaal's ability to Bowflex himself into the $10 million abs you see on display above, he's really not right for this type of part. Gyllenhaal works best when he can break out that look of slight naive confusion that he employed to such good effect in Donnie Darko and most-underrated-movie-evar Zodiac. He can't really conjure the mocking insouciance that his character in Prince of Persia is clearly intended to have. Come to think of it, most of the under-40 A-list male crowd in Hollywood these days is lacking in the smart-ass factor - that was always the weakest part of Tobey Maguire's performance as Spider-Man as well.

Anyhow, the point I want to raise is that adapting a video game into a crowd-pleasing blockbuster shouldn't be nearly as hard as the dismal results of the many attempts to do so would seem to indicate. As I see it, this is a classic Hollywood problem: lack of respect for the source material. Check out this trailer for the original Sands of Time game:

[Embedded video: trailer for Prince of Persia: The Sands of Time (2003)]

You can basically summarize what the game's like from it: you play as a prince who performs amazing acrobatic feats and can rewind time with a dagger powered by magical sand. He spends a lot of time swordfighting with monsters possessed by the same magical sand that powers his dagger. This is kind of a stupid plot, but the plot isn't really the point. The cool stuff you can do in the game is the point, and the plot is a means to that end.

The Prince of Persia movie, by contrast (I'd embed the trailer for comparison if it weren't pretty misleading about the actual content of the movie), does away with the idea of possessed monsters, barely has any time-rewinding at all, and stages the action scenes mostly in spatially confusing medium-close shots stitched together with quick-cut editing. Most of the movie is divided between watching Jake Gyllenhaal and Gemma Arterton walk through the desert engaging in limply written bickering, and listening to various characters spout boring expository dialogue about court intrigue and the rules for protecting the dagger. To add insult to injury, whereas the game was renowned for its lighthearted storybook aesthetic, the tone of the movie veers erratically between goofiness and self-seriousness.

Prince of Persia would have been a much better movie if it had been built around the same stuff that went into the game instead of all the superfluous crap thrown in as a desperate attempt to have a story to focus on. Summer blockbusters get a lot of crap for being overly reliant on action set pieces, but I think that criticism speaks more to the mediocrity of today's action set pieces than to the basic template. Put it this way: Raiders of the Lost Ark is just a bunch of set pieces with the barest minimum of exposition connecting them, and everybody in the world loves that movie. Prince of Persia was obviously never gonna come close to that, but why not try? Why not hire some parkour experts to try and top the foot chase sequences in Casino Royale or District B-13? Why not keep the sand monsters idea and turn the swordfights into a PG-13-friendly version of the battles in 300? Who decided that this movie needed to be a slow-witted homage to Romancing the Stone? Making a video-game-based blockbuster movie doesn't entail re-inventing the wheel, but it ought to entail careful consideration of how to carry whatever made the game appealing in the first place over into the film.

Saturday, May 1, 2010

Me and Ayn Rand

Define irony: a stamp issued by the federal government bearing the likeness of Ayn Rand.

One of the most interesting things for me personally in seeing the resurgence on the right of the parts of libertarian ideology that oppose government spending for the purposes of saving the economy and increasing access to health care (the libertarian influence on matters involving limiting the security state and reining in defense spending being curiously MIA) is Ayn Rand's return to semi-relevance in the national conversation. For the uninitiated, Ayn Rand was a Russian emigre who rose to prominence as a novelist and philosopher in the 1950s and 60s. Her great theme was the persecution of the individual by society, mostly by government and religion, which she believed needed to be fought by celebrating the moral importance of self-interest and by implementing an unrestricted economic system of laissez-faire capitalism.

Rand's return to scrutiny was probably inevitable given the circumstances; she's by far the most accessible anti-regulation thinker around, and her apocalyptic streak fits well with the prevailing emotional tone of modern conservative populism, making her a natural avatar for the right on economic issues. For the left, Rand's relevance is more tied up with her acolyte Alan Greenspan, whose deregulatory reforms during his lengthy tenure as Chairman of the Federal Reserve contributed substantially to the current economic crash. As such, the default attitude toward Ayn Rand among in-the-know liberals tends to involve eye-rolling and sneering, which isn't really a new development, but is more pronounced these days. So it's from a bit of a strange position that I admit that Ayn Rand was a major influence on my intellectual development.

Hear me out - I'm not the type that'll be at the next Tea Party rally, and I never exactly was. In fact, what drew me to Rand initially was her strident atheism. It's mentioned fairly infrequently these days, but Rand's contempt for the religious makes Richard Dawkins sound like Thomas Aquinas, and I first read her shortly after I realized that religious faith held no meaning for me. This was in 1999, around the time of Bill Clinton's impeachment, and it felt to me like half the nation was suddenly stewing in moralistic outrage and pious theatrics. When I read Ayn Rand, I felt very acutely that she was the sort of thinker who would go blow for blow against the Jerry Falwells and Bob Barrs and match or exceed them in fury. The initial appeal of reading Ayn Rand (dissected brilliantly in this recent GQ article) comes from the sheer force of her stridency over all other factors, which spoke to me because I began reading her during a strident period of history and at the point in the lifespan (late adolescence) when force of passion seems most like a legitimate form of argument.

My attraction to Rand's social ideas made me more interested in her economic ones, which is actually fairly hard to avoid given her insistence that her philosophy is an irreducible whole. Now, this is where things got challenging, because I was raised in a solid Democratic household devoted to 1950s and 60s-vintage mainstream liberalism. By no means was it a radical milieu - in point of fact, my namesake is Robert F. Kennedy, famed for his efforts in elbowing out Eugene McCarthy - but it was enough to the left that Rand's jaundiced eye toward progressive social doctrine and unabashed championing of selfishness and capitalism were a fair shock to the system. Since the whole package was framed in terms that I found quite attractive - the importance of individuality and independence, the rewards that come from developing one's talents and capacities - I engaged with it in a serious way. In fact, in the span of about a year, I read both of Rand's major novels, the lengthy The Fountainhead and the gargantuan Atlas Shrugged, and probably four book-length collections of her essays. (I haven't picked up a Rand book since that time, and I probably never will again - thematically speaking, the sheer amount of internal redundancy built into her writings more or less obviates the benefits of revisitation, and only a masochist would read her for the prose.)



Before then, I had never really immersed myself in a topic intellectually the way I did with Ayn Rand's philosophy, and that was a formative experience in and of itself. I'd never felt the sense of immediacy and relevance that can accompany the act of thinking deeply about something (public high school is extraordinarily ill-suited to facilitating this kind of experience), and this was my first hint of how fulfilling and rewarding that can be. That's really more of a developmental milestone than something that can be attributed to Rand specifically - I'm sure I would have had it even if I had never read her. What Rand added to the mix was the insight that intellectual engagement is particularly valuable and important when ideas are being challenged.


The knee-jerk liberal critique of Rand's work is that it essentially carries the water for conservative establishment ideas, and post-Reagan and Greenspan, this isn't totally inaccurate, although it probably reverses the direction of influence. What this leaves out, though, is that Rand was essentially a pugilist and a contrarian rather than a supporter of any political establishment or popular line of argument. My favorite Rand essay is probably "Racism," written in 1963, which combines one of the most brutally frank excoriations of the practice of racial prejudice that I've ever read with a pre-emptive strike against the Civil Rights Act of 1964; although I don't agree with her about the legislative aspects, it's impossible for me not to be impressed with someone who composed the following paragraph while George Wallace was busily amassing a large national following and four-plus years before the notion of the Republican "Southern Strategy" emerged:
One of the worst contradictions, in this context, is the stand of many so-called "conservatives" (not confined exclusively to the South) who claim to be defenders of freedom, of capitalism, of property rights, of the Constitution, yet who advocate racism at the same time. They do not seem to possess enough concern with principles to realize that they are cutting the ground from under their own feet. Men who deny individual rights cannot claim, defend or uphold any rights whatsoever. It is such alleged champions of capitalism who are helping to discredit and destroy it.

Above everything else, I came away from Rand convinced of the value of considering things from a rational and independent viewpoint. It's not remotely a stretch to say that the time I spent with her works taught me how to think critically. I think I learned this lesson in a far better way than I would have by relying on my university education (which was excellent) alone - liberal arts curricula seem to have a way of explicitly encouraging students to "think and analyze material critically" while implicitly adding as long as you reach the conclusion I want you to or, more insidiously, as long as you don't "offend" anyone in the process.



It's practically a law of nature that reading Ayn Rand in late adolescence tends to turn one into an insufferable asshole. In fairness, that could be said of practically anyone getting into politically oriented thought at that time of life - try holding a conversation with a college sophomore who's read Naomi Klein - but I was certainly no exception to the rule. In retrospect, I was extremely fortunate that I didn't fall in with a crowd of Rand devotees during my college years, which would have worsened things considerably; one of the odder things surrounding Ayn Rand - which is really saying something - is the manner in which she ju-jitsued her philosophy of bold independent thought into a rigidly enforced cult of personality, which is sadly still very much in existence.

As I got deeper into college, I became a lot less attached to what Rand thought, but I never really lost my appreciation for how she thought. This is a distinction that is too often obliterated by our discourse's relentless focus on categorizing people and their ideas into columns marked "acceptable" and "unacceptable." I've found developing and maintaining a critical focus and a distrust of consensus to be extremely valuable in every area of my life. I should note that Rand isn't the only route to this conclusion (and very probably not the best); I recently read through Christopher Hitchens' Letters to a Young Contrarian, a far more compact volume than anything Rand ever put together, yet one that concludes with a beautiful summary of exactly the type of mentality I've been attempting to describe:
"So I have no peroration or clarion note on which to close. Beware the irrational, however seductive. Shun the "transcendent" and all who invite you to subordinate or annihilate yourself. Distrust compassion; prefer dignity for yourself and others. Don't be afraid to be thought arrogant and selfish. Picture all experts as if they were mammals. Never be a spectator of unfairness or stupidity. Seek out argument and disputation for their own sake; the grave will supply plenty of time for silence. Suspect your own motives, and all excuses. Do not live for others any more than you would expect others to live for you."  

Since it's nearly impossible these days to admit to any affection for Ayn Rand without discussing politics, I'll say that I remain a self-identified liberal and a registered Democrat, but I'm not so identified with party affiliation that I'd feel pressured to refrain from criticizing, say, Obama's shameful continuation of indefinite detention policies or his implementation of targeted assassination programs. I will admit to some sympathy for and interest in libertarian thinking, which can and does promote things like a genuine respect for data (check out Megan McArdle's analysis of whether or not Toyota's cars were actually accelerating due to mechanical defects) and a commitment to challenging the long-held ideas of ideological allies (David Boaz's takedown of the myth that America "used to be more free" is truly praiseworthy). I'm very interested in the appearance of libertarian thinkers like Will Wilkinson who advocate replacing the longstanding conservative-libertarian alliance (the existence of which never made any sense to me) with a liberal-libertarian one, as elaborated in this essay by Cato's Brink Lindsey.

All of which is to say, I find these to be interesting times, for more reasons than the sport of speculating on whether or not the Tea Party is racist.

Friday, April 23, 2010

Video Games as Art: A Proposition

At the end of the post I did a month ago on video game addiction, I left an opening for future posts on the subjects of whether gaming is healthy and whether games are art. As it happens, this past week has presented an ideal context to address the second question, as famed film critic Roger Ebert posted a substantive rebuttal to a talk claiming the mantle of art for gaming. Ebert's critique builds off of an earlier exchange on the topic between himself and Clive Barker (yes, the one you're thinking of), and his position is stated boldly in the title of his latest post: video games can never be art. I won't link to any of the responses from gaming press and enthusiasts, but suffice it to say they range from polite engagement to petulant dismissal and none (that I've read) agree with Ebert.

I do agree with him, and I think it would benefit the status of gaming as a cultural phenomenon immensely if more people immersed in gaming did, too. The essence of Ebert's case against gaming as an art form rests on two related observations. The first, which invokes the tradition of auteur theory in film criticism, notes that video games as an experience are not generally the product of a singular creative vision (i.e., there's no 'artist' whose work one can be said to be taking in while playing a game). The second is that games are ill-suited to producing emotional or intellectual insights about the human condition, which he states most explicitly in his 2007 reply to Barker:
(T)he real question is, do we as their consumers become more or less complex, thoughtful, insightful, witty, empathetic, intelligent, philosophical (and so on) by experiencing them?
Ebert openly admits that the vast majority of ostensibly artistic works, including those in his favored medium, fail to clear this bar, but contends that for the reasons summarized above, no video game will ever make the cut.

The typical response to Ebert from gaming aficionados centers on two points: (a) the subjectivity and malleability of how one defines "art" and (b) the fact that gaming is in its relative infancy and no one can tell what the future of the medium will hold. I'm not going to engage these contentions, because I find (a) to be tedious and pedantic and (b) to be impossible to discuss in any informed way, because time, not argument, will settle the score.

In fact, in agreeing with Ebert, I'm going to sidestep the particulars of the debate entirely and instead attack the underlying assumption behind it. As a jumping-off point, I want to expand on a point Ebert makes in the closing paragraphs of his most recent post:
"Why aren't gamers content to play their games and simply enjoy themselves? They have my blessing, not that they care. Do they require validation? In defending their gaming against parents, spouses, children, partners, co-workers or other critics, do they want to be able to look up from the screen and explain, "I'm studying a great form of art?""
To put it simply: pretty much. I'd venture that the average adult hardcore gamer who feels that he (or possibly she, but let's be real about the demographics here) has skin in the "are games art" debate is motivated at least in part by defensiveness over a lifetime of having their enthusiasm dismissed as childish. I think this is more of a human trait rather than something specific to gamers, although male nerds of all stripes seem more susceptible to it - consider the type of person who insists he's reading graphic novels, not comic books. There's a very strong, but mostly unspoken, rule in our culture that things which we do to Improve Ourselves are fundamentally superior to things we do just because we like to. The definition offered by Ebert that I excerpted above pretty much states that art and self-improvement are inseparable from one another, and all semantic debates aside, I think that tracks fairly well with the views of most people.

This idea regarding the preferability of Improving Ourselves, when filtered through the intense moralistic streak that manages to be at once one of American culture's greatest strengths and one of its greatest weaknesses, inevitably comes out as the idea that we should constantly be Improving, and should never consider passing on the opportunity to do so. Ask a gamer if any of these statements sound familiar:
"How can you waste the day inside playing video games when it's so beautiful out?"
"Why would you play Guitar Hero when you could be learning how to play real guitar?"
"Why don't you get together with your friends and do something, instead of just playing video games?"
I've played a lot of video games, and barring some sort of thumb-incapacitating incident in the near future, I'll probably play a lot more. To answer Ebert's challenge: no video game has given me any sort of experience that has expanded me intellectually, emotionally, or culturally. I am perfectly OK with this, because I never picked up a controller expecting anything like that. I have many other sources of acculturation and learning in my life that more than compensate. What's more, with the time I've spent playing video games, I almost certainly could have learned another language, read more great literature, and cultivated a unique and interesting hobby of some sort. I chose to play video games instead, and I'm not sorry about that. I like playing video games, and that's a good enough reason for me to do it. It ought to be a good enough reason for anybody to do anything in their leisure time.

I understand the impulse to go on the defensive and try to stick games with the tag of "art," thus marking them as something with the potential to fulfill the holy task of Improving Ourselves, but it's the wrong path. Instead, I want to see some pushback against the idea that there's a moral obligation to maximize our exposure to things designed to Improve Ourselves, and that we should feel guilty about choosing to do things simply because we find them pleasurable. I want to be clear that this isn't about making different choices - we pretty much wind up doing the things we find pleasurable regardless of how anyone feels about them - but about consciously stating that our choices in entertainment, whether that's playing video games, watching pornography, or knitting, don't need to be transformative or Important to be worthy of respect. And I think we should get started on this before somebody decides to fuck around and try to make the Un Chien Andalou of games in an attempt to prove Ebert wrong, because I definitely don't want to play that shit.

Sunday, April 18, 2010

Male Studies: A Good Idea That Needs To Be Saved From Itself

As somebody with a dilettante's interest in gender issues, I was pretty interested to hear about the announcement of a new academic discipline called "male studies" last week. The tidbit that particularly caught my attention was the focus on biological differences and their influence on masculinity, which the academics highlighted as a feature distinguishing their vision from that of contemporary academic gender studies. This topic in particular is something that I've been fascinated by ever since I read Steven Pinker's The Blank Slate, a book-length argument for the influence of biology on human behavior that I found extremely compelling and would recommend to anybody.

So the argument that biological differences are insufficiently considered in gender studies is one that I buy completely. In fact, I think a lot of the critical commentary directed toward "male studies" in the blogosphere from writers (presumably) steeped in gender studies goes a long way to proving this point. The main line of argument, typified here by Mother Jones' Titania Kumeh and the Washington City Paper's Amanda Hess, is that Men's Studies already exists and already incorporates biological perspectives. The first point is definitely true: the American Men's Studies Association has a website and a president, who is quoted in the NYT labeling the proposed "Male Studies" discipline "kind of a Glenn Beck approach," which doesn't make any sense taken literally, but can be safely assumed to be an expression of disapproval given the context.

The second point, about biological differences being covered in Men's Studies, seems pretty dubious to me. Here's how the Men's Studies association defines the spectrum of topics covered by the discipline:
"Men’s studies includes scholarly, clinical, and activist endeavors engaging men and masculinities as social-historical-cultural constructions reflexively embedded in the material and bodily realities of men’s and women’s lives."
If you didn't understand any of that, good for you! You probably spent your postsecondary education pursuing marketable skills. The gist of it is that Men's Studies looks at the ways society, history, and culture affect the way men understand masculinity. Which is good! All those things are important. Notice, however, that biology doesn't make the cut, which would seem to contradict the argument for the redundancy of Male Studies. Kumeh approvingly states that the Men's Studies curriculum "investigates society's standards for masculinity in men and boys. It covers the effects a hyper-masculine status quo has on the XY-chromosomed among us." Again, no mention of biology until later in the article (after analogizing the idea of male studies to excluding slavery from history courses, which strikes me as something less than a logical and restrained analysis), when she claims that "(b)iology is covered in men's studies, but not in a vacuum that discredits nature/nurture arguments."

What I take this to mean, and I don't think I'm being unfair here because I've heard similar sentiments expressed in the past, is "biological differences may exist, but we're not interested in talking about them because they don't seem to be something we can influence as easily as social standards." Later in her post, Kumeh fears that teaching about biological differences "lacks context and conscience" and "gives people an excuse not to change." This basic idea, which again I believe to be fairly widespread among gender studies scholars and students, is exactly why we need a disciplined and intellectually serious examination of biological differences. I've never read, in either the popular or scientific press, any argument that social and cultural factors are irrelevant to gender differences.

Instead, most of the discussion (again, I can't recommend Pinker's book strongly enough) centers on the idea that the timeless nature/nurture argument is essentially a false dichotomy and that biology deserves to be considered seriously as a key influence on how and why the social context develops, including gender norms. Again: no credible commentator that I'm aware of on the topic is agitating for the strict "anatomy is destiny" hypothesis, or endorsing the idea that biology excuses discrimination or injustice. There are plenty of populist idiots beating that drum, but I'd argue that makes the case for more education about what biological differences imply and do not imply, not less.

Having said all this, there's a fine line to be walked in making this point, and based on the published accounts, the people behind Male Studies aren't doing a very good job of walking it. In a nutshell: the least productive thing possible in this instance is rhetorical mudslinging directed at feminism, which is exactly what one of the architects behind the discipline, Rutgers anthropologist Lionel Tiger (apparently his real name), does in referring to it as "a well-meaning, highly successful, very colorful denigration of maleness as a force, as a phenomenon." This is a terrible idea for a very simple reason: "feminism" isn't an easily defined thing. It's a multifaceted and complex tradition that covers a diverse array of political, intellectual, and philosophical questions and features a continually evolving internal debate. I find it extremely unlikely that the Male Studies set categorically opposes all things identified as feminist; for instance, I doubt that Dr. Tiger is agitating for the repeal of women's suffrage or the decriminalization of marital rape. Rather, they're pushing back against one very specific component of some feminist thought: the idea that biological differences should be marginalized in discussions of gender.

Framing this argument as a broadside against the abstract notion of feminism is essentially an invitation to be dismissed summarily by anyone who self-identifies as feminist (if it weren't for my prior interest in the topic, I would have done so myself). It also opens Male Studies proponents to the ad hominem charge of misogyny, which many gender studies stalwarts are quick to deploy. If the discipline of Male Studies wants to define itself as an antidote to feminism writ large, we can expect the level of rational discourse on both sides to roughly resemble that of an Internet forum debate between fans of Star Wars and The Lord of the Rings.

Much as I'd like to believe otherwise, I think this is probably the most likely scenario, which doesn't bode well for Male Studies as an intellectual undertaking. I don't think there's a future for ways of thinking about masculinity that gather steam from anti-feminist grievance (see also the Men's Rights "movement," a disastrous amalgamation of embittered men who claim that their child support payments are evidence of a vast conspiracy against the male gender). I do think that a broader conversation about biology and gender than the one currently taking place in the academic-activist spectrum is welcome and needed. In order to succeed, Male Studies has to figure out how to do the second while avoiding the first.

Sunday, April 11, 2010

The Curious Anthropology of Insane Clown Posse

[Embedded video: Insane Clown Posse, "Miracles"]
In case you haven't seen it yet, "Miracles," the new Insane Clown Posse video, is posted above. It's getting a serious amount of chatter from the lulz crowd, and deservedly so, because even when judged against the ridiculously low standards of the Insane Clown Posse oeuvre, it's laughably bad. It's basically a mishmash of curse words inserted into a list of things that ICP considers "miracles," none of which are actually miracles, similar to how nothing in Alanis Morissette's "Ironic" is actually ironic. There's also a part where one of the duo talks about hating scientists. Since the video came out a few days ago, I'm sure that by now there are twenty thousand blog posts snarking on it, so in the interest of preventing redundancy, I'll forgo commentary in favor of encouraging you to read Daniel O'Brien's hilarious post.

Instead, I want to draw attention to the most fascinating thing about Insane Clown Posse besides the awesome badness of their music: the remarkably robust subculture that the group literally stands at the center of. Unless you know a fan of ICP personally or are atypically immersed in music culture, you've probably never heard any of their music before clicking on that video (or never at all if you didn't click on the video). Hell, I was part of ICP's strongest demographic (socially unpopular Midwestern male teenagers) during the group's brief period of mainstream semi-relevance in the late 90s, and counting "Miracles" I've heard maybe four of their songs in my entire life. ICP is not a major mainstream phenomenon by any stretch of the imagination, and they're one of the most critically reviled musical acts in recent memory, if not of all time.

Despite all this, Insane Clown Posse runs a vertically integrated media enterprise that, between music, concerts, and licensed merchandise, grosses somewhere around 10 million dollars per year. Given the current climate in the music industry, that's an amazing accomplishment. Considering that ICP are regarded as a punchline by the vast majority of people who are even aware of their existence, it's downright miraculous.

Except it's not, because ICP have made their bones the old-fashioned way: by cultivating an intense and personal relationship with their fanbase. If you ever encounter an ICP fan (or a "Juggalo," as they refer to themselves), you'll know it, because they'll probably be wearing an ICP T-shirt. In fact, they'll probably have one for every day of the week. Not only that, they'll have a surprisingly large vocabulary of ICP-centric slang words and rituals (should you dare to click through, this dictionary website somehow manages to convey all of the irritating, ridiculous, and ignorant aspects of the Juggalo mentality within the first 30 seconds of loading it up).

Pretty much every last dollar of ICP's lucrative enterprise comes from this guy and the thousands of other diehards just like him. One of ICP's ventures is "The Gathering of the Juggalos," a multi-day yearly music festival headlined by the band and affiliated acts, held in a remote state park in Southern Illinois not terribly far from where I went to graduate school. It draws up to 20,000 fans. In 2007, writer Thomas Morton went to the festival and wrote up his experience as an article for Vice, a piece that probably stands as the definitive anthropological study of Insane Clown Posse fans to date, not that I have a comprehensive grasp of the alternatives. Anyhow, last fall, Morton wrote a stunning rant decrying the practice of smirking at ICP fans from a great height. In the process, he openly admits how writing the article challenged his preconceptions of his subject, makes a (convincing) case that ICP is the new Grateful Dead, and most importantly, nails down the essence of Insane Clown Posse fandom. Here's the key paragraph of his argument:
As for the big one, the joke about who would voluntarily be into this music and save up to go to this festival and be excited about the helicopter rides and Rowdy Roddy Piper and cheeseburgers, here’s your punchline: Poor midwestern kids from mostly broken homes with absolutely no prospects of material success who even goth and punk kids make fun of.
And that's pretty much the crux of it. ICP built an empire by embracing with both arms the exact segment of society that everyone else goes out of their way to avoid or ignore. Think about Abercrombie and Fitch forcing a (very attractive) female employee to sort hangers in the back where customers couldn't see her because she had a prosthetic arm, and you get a fairly decent idea of the extremes modern consumer society goes to in order to reinforce prevailing ideas of desirability and success, both as a goal to be achieved and an illusion to be created. The people who wind up in ICP fandom don't have a prayer of fulfilling anyone's conventional idea of those things, or even making a convincing pretense at it, and they know it. In a lot of cases, they've been told it their entire lives.

And then along comes Insane Clown Posse, themselves none-too-bright burnouts from a city that has become a sort of American shorthand for failure (Detroit, if you're wondering), with music that combines the sort of barely articulated rage common to every teenage outcast since time immemorial with a goofy aesthetic that screams "I'm not even trying to impress the cool kids." See, the stupidity of Insane Clown Posse is a feature, not a bug; it means that only the people who see themselves as having nothing to lose in the eyes of society will embrace it. Imagine you fell into that category: would you rather go to a show put on by an up-and-coming buzz band and stand in a crowd of sneering poseurs who'll go out of their way to find fault with you, or would you rather go to a raucous and profane ICP show filled with people who are just happy to have a place to be accepted? Fuck Bruce Springsteen. Insane Clown Posse are the torchbearers of the real American underclass.

I'm a big, big music fan. I could rattle off a list of dozens of albums that have touched me greatly and, I believe, impacted the course of my life in a tangible way. But truthfully, I don't think any musical artist will ever mean as much to me as Insane Clown Posse does to its ardent fans. So by all means, laugh and gape at the video for "Miracles" (I watched it twice, just to make sure it was real) and any of the other dumbass ICP-related stuff you come across, because there sure as hell isn't any shortage of that.

Just don't wonder so hard about what type of person would like this kind of thing.

Thursday, March 25, 2010

Video Game Addiction and the Psychology of Gaming

Above: Fake Magic Johnson and Fake Alan Alda enjoy a competitive videogame on a television that is not turned on.

There's a lengthy but compelling piece in the Guardian online about video game addiction that's well worth a read. It chronicles author Tom Bissell's journey from prolific writer to what amounts to a video-game-addicted cokehead, and centers on his obsession with Grand Theft Auto IV. The thing that grabbed me about this piece is that there are a million cliches that could have gone into this story, from analogizing video game makers to drug pushers to assuring readers he's given up games to spend more time sitting outside or some such, but Bissell avoids pretty much all of them. Instead, he gives one of the most frank and thoughtful depictions of games and their appeal that I've ever read.

A large part of this is that Bissell devotes several paragraphs to a stream-of-consciousness description of the experience of playing a GTA game (Vice City, in this instance), and writes it in a way that really captures the sense of freedom and possibility the games provide. Later, when he's started abusing cocaine heavily, Grand Theft Auto IV becomes his go-to activity while high, and the game and the drug form a sort of symbiosis. This leads up to the climactic meditation of the piece:
What have games given me? Experiences. Not surrogate experiences, but actual experiences, many of which are as important to me as any real memories. Once I wanted games to show me things I could not see in any other medium. Then I wanted games to tell me a story in a way no other medium can. Then I wanted games to redeem something absent in myself. Then I wanted a game experience that pointed not toward but at something. Playing GTA IV on coke for weeks and then months at a time, I learned that maybe all a game can do is point at the person who is playing it, and maybe this has to be enough.
My experiences with life and with video games are hardly identical to Bissell's, to put it mildly, but I think he absolutely captures something vital about gaming with his point about games providing real experiences. This is a point that I do not believe non-gamers fully understand: so much of the quality of a game, especially a modern game, is tied to its ability to break down the sense of separation between the physical activity of playing (read: pressing buttons) and the actions onscreen. In short: to the extent that a game can make you feel personally involved and empowered in what's happening onscreen, you'll probably like it.

In some ways, this is a grotesque oversimplification: there are many, many factors that have to come together to provide that experience. However, in great or even merely enjoyable games, the whole is more than the sum of the parts in a way that's difficult to capture with objective description. Here's the really interesting thing, though: despite all this complexity, games are getting much, much better at providing this quality of experience on a consistent basis. Think about this: a short game is roughly 5-7 hours of playtime, and a long game can easily run over 100 hours (I played Grand Theft Auto IV for at least 140 hours, and I imagine that Bissell played for triple that amount or more) or be essentially endless if the focus is on competitive multiplayer. That means a game has to keep your attention for much longer than a feature film does, and likely as long as an entire season of a TV program. What's more, almost all games are built around a fairly simple set of actions that repeat themselves over and over again, usually with little more than minor variations over time. Even in expansive, free-form games like the Grand Theft Auto series, you'll get the main essentials of play in the first few hours.

By all rights, keeping someone interested in a video game ought to be an impossible task, but it turns out that it isn't. In fact, over the past several years, I've found that video games as a medium are not only more consistently compelling, in my opinion, than pretty much any other form of entertainment, but getting better all the time, and I think Bissell has zeroed in on the main reason why with his statement about experiences. A misconception that has plagued popular thinking about video games for some time now is the idea that the appeal of games is something like a more participatory movie or television show: that the structured narrative is at the core and that the interactivity serves to make the narrative more compelling for the player. In fact, the reverse is true: games get most of their appeal from coming up with cool things for the player to do and letting the player control his or her experience of those things. Nobody plays video games for the story; if they did, nobody would ever play video games, because video game stories, with punishingly few exceptions, are terrible. GamesRadar wrote an article on the plot holes of Modern Warfare 2 that's three goddamn pages long (granted, mostly to maximize the number of pages clicked on, and thus ads viewed - welcome to the world of Internet games writing), and that's a game that's grossed more than one billion dollars since last November. MW2's narrative flaws didn't even stop it from amassing widespread critical acclaim. Hell, I'll even throw in my two cents: Modern Warfare 2's story was completely idiotic, and I still played through the game twice, spent a solid two months with the multiplayer, and loved about every minute of it.

The contrast between games and other forms of entertainment has really hit home for me these past couple of weeks as I've been playing Bioshock 2. A brief recap for the uninitiated: Bioshock is a first-person shooter that came out in 2007 and became a massive critical and commercial hit. It is far and away one of the most original first-person shooters ever created, partly because of the setting (a failed underwater city resplendent in 1930s Art Deco architecture, created as a libertarian utopia by a thinly veiled version of Ayn Rand), and partly because of the rich and well-thought-out narrative, which actually came to a definitive resolution at the climax. This last point is important because virtually all major video games follow the modern Hollywood blockbuster model of openly planning for a multi-sequeled franchise in pretty much all aspects of production, none more glaringly obvious than the plot. Bioshock, however, felt self-contained from the get-go.

It also (deservedly) made a shitload of money, so when Bioshock 2 was announced, I was scornful. Here was an unnecessary cash-in sequel to an original and complete work, which, to top it off, wasn't even being made by the creators of the first. It's the sort of thing that drives fans crazy. The thing is, though, that Bioshock 2 didn't turn out to be the Blues Brothers 2000 of the video game world (fun challenge: come up with a snappier analogy and post it in the comments! I couldn't!); it's actually an immense amount of fun to play (less surprisingly, it's also selling by the bucketload).

Now, I don't want to give short shrift to the makers of Bioshock 2, who, all things considered, did a fine job with the narrative and tonal aspects of the game, but most of what makes Bioshock 2 work the way it does are small improvements to the core gameplay and a few neat additions. In fact, the phrase "small improvements to the core gameplay and a few neat additions" is essentially a comprehensive summary of the philosophy behind video game sequeling. Simple as it is, this approach really works in games in a way that I don't think it can in other media. If somebody told me that the upcoming sequel to a blockbuster movie was being hailed as "pretty much the same thing as the first one, only the hero shoots somebody with a speargun this time," I probably wouldn't make seeing it a priority. However, when I found the speargun in Bioshock 2 (it pins dead enemies to the wall, and you can pull out the spears to reuse them, which makes the corpse fall to the floor!), I was genuinely jazzed. I can't rule out the possibility that this speaks to a certain lack of sophistication on my part, but I think it has more to do with the fact that the experience of playing games is qualitatively very different from our experience of other media. Playing Bioshock 2 made me realize that the first game's hard-to-replicate aspects, its originality and narrative focus, were less important than I had previously thought, while its more reproducible elements, the combat and the exploration of the game world, were much more important.

This has been a bit of a digression, but I think it relates profoundly to Bissell's piece on game addiction and the core role of experience in it. There's a reason we can talk seriously about gaming as an "addiction," even if it doesn't fit the technical parameters of the term, in a way that's harder to apply to an equally avid consumer of movies, TV, or novels: video games give us feedback. They reward good play, punish bad play, and create a sense of improvement over time. Cracked's David Wong wrote a great piece a few weeks back on how video games use basic principles of behavioral psychology to hook people and keep them playing. Wong's focus is more on massively multiplayer RPGs like World of Warcraft, which I've never played and have no interest in (although my girlfriend and I have sunk 80+ hours into Borderlands, which is essentially a scaled-back, post-apocalyptic shooter version of the same basic ideas), but I think the basic framework he elaborates can be made to fit just about any type of video game to one degree or another. Simply put, the appeal of video games is ultimately a behavioral one, rather than an intellectual one.
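
For the technically inclined, here's a minimal sketch of the variable-ratio reward schedule at the heart of the behavioral principle Wong describes. To be clear, the code, the function names, and the numbers are entirely my own illustration (nothing here is from Wong's piece or any actual game): the idea is simply that rewards arrive after an unpredictable number of actions, the same schedule slot machines use, which is remarkably good at keeping people pulling the lever.

```python
import random

def play_session(actions, drop_chance=0.1, seed=42):
    """Return the action numbers on which a (hypothetical) loot drop paid out."""
    rng = random.Random(seed)
    drops = []
    for action in range(1, actions + 1):
        # Each kill/chest/quest turn-in has an independent chance of a reward,
        # so the gap between rewards varies unpredictably - the variable-ratio
        # schedule that behavioral psychology says is the hardest one to quit.
        if rng.random() < drop_chance:
            drops.append(action)
    return drops

if __name__ == "__main__":
    drops = play_session(actions=100)
    gaps = [b - a for a, b in zip([0] + drops[:-1], drops)]
    print(len(drops), "drops in 100 actions; gaps between rewards:", gaps)
```

Run it with a few different seeds and you'll see streaks and droughts; the droughts are what keep you playing "just one more."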

I'm not sure what the full implications of this are, although I'm sure that the hoary old chestnuts of "are video games healthy?" and "are video games art?" will resurface fairly rapidly. I may have more to say on this topic in general, and those two questions in particular, at some future date. I am fairly sure, however, that it means the current cultural ascendancy of video games will probably be a prolonged one, and may even be just getting started, given the explosion of platforms like Facebook and the iTunes App Store. Look for some 40-year-old college student to publish an essay in Newsweek about the intermingling of her FarmVille and OxyContin habits by December or so.

Sunday, February 21, 2010

Shutter Island review

A note: I'm not going to get into spoilers in this review, but I'll be discussing some things about Shutter Island that'll probably affect the viewing experience going in, so if you haven't seen it yet and are hell-bent on going in pristine, you might want to save this for later.

As a Martin Scorsese diehard, I was really looking forward to Shutter Island. I'd probably go see anything Scorsese puts out, but I was especially excited by the trailers for this one, which promised an unapologetic, atmospheric thriller with a great cast, topped off by Scorsese's unmatched visual command. And that's pretty much what it is. So why didn't it blow me away like I was hoping it would?

Here's the problem with Shutter Island: it's a genre picture. Specifically, it's what's often referred to as a "psychological thriller," which these days is essentially shorthand for "a plot-driven drama constructed to set up at least one major third-act twist in the narrative." It'll come as no surprise that I love this type of movie, when it's done well. That last caveat is important, because post-The Sixth Sense American cinema has been inundated with terrible twist-based movies. To my mind, there are two main determinants of whether this type of movie "works." It's obviously best if both are present, but if a movie doesn't have the first one, it really, really ought to nail the second. They are:

(a) The twist is something truly original.
(b) The film is so tightly plotted that the twist, although foreshadowed, catches the audience by surprise, usually because of clever misdirection created by emphasizing some other aspect of the plot.

Shutter Island doesn't pass either of these tests. Without spoiling it, the plot twist in Shutter Island is a variant of something I've seen so many times that I can't even associate it with just one other piece of work (although a few candidates come to mind). Again, that's not a make-or-break thing; the success of a thriller has much more to do with execution than with concept. But here's the problem: if somebody put a gun to Martin Scorsese's head and threatened to pull the trigger unless he made a movie with a running time of under two hours, Scorsese would, without fucking question, be dead. Shutter Island is 138 minutes long, which is at least 20 minutes too many, and probably 30. The bloat doesn't really become apparent until early in the third act, when Leonardo DiCaprio's character has two back-to-back interactions with characters who essentially reiterate thematic undertones that were fairly unsubtly voiced by completely different characters an hour earlier. This kind of flab is lethal to a thriller plot.

The issue with Shutter Island, in a nutshell, is that Martin Scorsese isn't really a director who focuses on plot; he's a director who focuses on visuals and theme. These strengths are on full display in Shutter Island. The set design and cinematography are typically stellar. From the opening of the film, Scorsese uses a truly brilliant array of cinematic tricks to foreshadow the climactic twist, and there's an impressive use of historical allusions and parallels woven into the story throughout. The highlights of the film are the set-piece flashbacks that Leonardo DiCaprio's character experiences continually; they're magnificently conceived and poetically executed. Unfortunately, they also telegraph the ending so heavily that they drain much of the ambiguity the film desperately needs to sustain narrative tension through the third act. The result is somewhat like watching a magician who performs a clever trick but can't quite sell the illusion to the audience. (Although I have to add that the very last scene in the movie is fantastic and nearly redeems the disappointing elements of the twist.)

I'd kind of like to see Scorsese put out two different versions of Shutter Island. One would be about 100 minutes long and would jettison the abstract visuals in favor of tightening up the plot around the central twist, as a traditional thriller would. The other would downplay the plot even further and go to town on the visual and thematic aspects to create an ambiguous tone. I think that what's in theaters now plays like a compromise between elements of both of these "movies" that doesn't quite resolve the tensions between them in a satisfactory way.

I'm being rough on Shutter Island, but I actually think it's a very interesting film, and one I'd like to see again unburdened of the need to focus on the plot. Even though it's not a very efficient film, it is a very well-constructed one on a number of levels, which I think I might appreciate better on another viewing. As it stands, though, it's not as good on first viewing as I had hoped it would be. Your mileage may vary.

Wednesday, November 4, 2009

Let's parse last night's elections!

A disclaimer: I think that off-year elections, particularly those coming a scant 10 months into the first term of a presidency, aren't incredibly meaningful, and discussing them as such is really more akin to masturbation than serious analysis. Then again, you could say that about nine-tenths or more of contemporary political discussion and you wouldn't likely be wrong. I like masturbation, though. I also like politics, so let's do this.

The dominant spin on last night's elections is probably going to be that the Republicans taking the governorships of Virginia and New Jersey represents a Republican resurgence and a rebuke of sorts to the Obama Administration. Exhibit A of this line is Karl Rove's op-ed in today's Wall Street Journal, in which he essentially chalks up the Democratic losses to voter unease over the potential costs of Obama's health care reform proposals. I'm not sure that this argument holds much water, since it isn't as if Obama decided to reform the health care system sometime in March of this year. It was a major part of his presidential campaign, and it didn't seem to dissuade a lot of people from voting Democratic then. It seems more plausible to me that the lack of forward momentum on health care depressed Democratic turnout and created a lane for the energized Republican opposition, but I don't think that's what happened either.

Rather, I think people voted in the gubernatorial elections based on their satisfaction or dissatisfaction with the governance of their state, not on their antipathy or affection for Barack Obama. Chris Christie, the newly elected Republican governor of New Jersey, says that he's looking forward to "working with President Obama" - not exactly words that will get him a headliner slot at any upcoming Tea Party rallies. It also bears mentioning that Jon Corzine is a Wall Street billionaire who bought his way into office, and "Wall Street billionaire" sits a shade below "Roman Polanski" in the hierarchy of things people, let alone the Democratic base, are positively disposed toward at present. If I lived in New Jersey, I sure as shit wouldn't take twenty hard-earned minutes out of my day to throw a vote Corzine's way.

As for the whole three-way ordeal in New York's 23rd district, I think more than anything it demonstrates that winning Congressional elections is more a matter of convincing people that you'll be fully dedicated to kicking loose those sweet, sweet federal pork dollars than of demonstrating your overarching allegiance to a philosophical theory of governance. From what I've read, Hoffman was running far more on the latter and was notably weak on the former. It's no accident that pretty much everybody in Congress gets re-elected for decades despite the fact that the voting public pretty much unanimously hates the House of Representatives as a collective entity. As such, I don't think it's really valid to draw a larger inference about the electoral future of the conservative movement from this one instance. However, I did come across a quote today from noted conservative intellectual Glenn Beck that gave me pause:

And here's what the ‑‑ forget about the Democrats. Here's what the Republicans should learn. The tea party movement, if you think you're going to run people that are going to be, you know, ACORN wannabes and they're just part of the corruption, part of the system, if you're going to run those people, you can expect a tea party guy to come out, and the tea parties, they'll help you lose every single election. Every single election. Because I for one am not ‑‑ if I believe in the Republican, I'll vote for the Republican. But if you're running somebody who's like part of the system, I'm not interested. I'm not interested. And I think that a lot of Americans are like that. So the Republicans have a choice to make. You can either spend a million dollars trying to destroy a third party accountant, or you could say, wow, this accountant probably would come in within three points of beating the Democrat if we combined our efforts, Republicans and Democrats, spent a fortune, had our candidate then drop out and campaign for the Democrats, we might be able to come in with about a 3‑point margin. You might want to just say, "Maybe we should go with the accountants. Maybe we should go with the regular people."
Remember two months ago when I suggested that the right was succumbing to fallacies that had long plagued the left by mounting strident and inane protest marches? What noted conservative intellectual Glenn Beck is suggesting here is literally a replica of the modern American left's worst idea: running ideological protest candidates to "send a message" to the mainstream party. Let's review the two most prominent examples. The first is the successful defeat of Joe Lieberman in Connecticut's Democratic Senate primary. Rather than ushering in a new wave of unabashed legislative progressivism, Lieberman simply won re-election as an Independent, proceeded to campaign wholeheartedly for John McCain in 2008, and was most recently seen vowing to help the Republican minority fuck over any meaningful healthcare reform bill by keeping it from being brought to a vote in the Senate. The second is Ralph Nader's presidential bid in 2000, which was aimed squarely at siphoning votes from Democratic nominee Al Gore. Despite a rather pathetic nationwide showing, Nader still managed to accrue more than enough votes to cover the small margin separating Bush from Gore in Florida, clearing the way for Bush to win both the state and the election ("win," of course, being shorthand for "U.S. Supreme Court decision barring the completion of the vote recount," although I think Bush would have wound up winning anyway). Suffice it to say that the Bush presidency isn't exactly what the average Nader voter had in mind on his or her way to the ballot box in 2000. I should know; I was a freshman in college at the time and was acquainted with quite a few of them. If a similar situation forms on the right in 2010/2012 (or the Republicans nominate Sarah Palin for president), I don't imagine it'll turn out much better. I'm somewhat skeptical that this will actually happen, but the idea's obviously percolating out there.

From my point of view, the only thing about last night that should inspire anger or fear among liberals is the narrow passage of yet another gay marriage ban, this time in Maine. Specifically, I'm extremely disappointed that Obama and/or the DNC didn't lift a finger to suggest that Democrats should turn out to prevent rights from being stripped from gay citizens. I know that Obama's against gay marriage and that the Democrats as a national political entity have absolutely no spine when it comes to taking a stand for social liberties, but this is really fucking shameful. Legal discrimination against gays is the defining civil rights issue of our time. These state constitutional bans are not going to last forever. They're going to fall, either by being repealed by less-bigoted future electorates (which I'd prefer) or by federal action (which I'll accept, despite the fact that it'll kick off yet another generations-long political battle à la Roe v. Wade). And eventually, Americans will look back at these laws with the same revulsion that we (or most of us) look back on Jim Crow laws today. I expect better of Obama than the half-assed thumb-twiddling we're getting from him on these kinds of issues, and I hope that I'm far from alone in that view.

Wednesday, October 21, 2009

This just in: we're winning the culture war

I didn't get much of a chance to read up on the blogosphere today, but I noticed that several of the writers I read commented in amazement at the Washington Post's decision to publish an editorial by Bill Donohue, president of the Catholic League, entitled "America's Secular Saboteurs." These bloggers quite accurately pointed out that the content of this piece is unhinged and naked bigotry, and argued that its publication reflects extremely poorly on the Post's editorial standards. The truth is quite the contrary. The Post has done a great public service in giving this column such a prominent platform, as it ably illustrates the desperation and intellectual bankruptcy of religious conservatism in 2009. It would take far too much time to point out every instance of ignorance and historical contradiction in Donohue's piece (read it yourself and they'll likely jump right out at you), so I'll only address what I see as the highlights. The editorial's first paragraph:
"There are many ways cultural nihilists are busy trying to sabotage America these days: multiculturalism is used as a club to beat down Western civilization in the classroom; sexual libertines seek to upend the cultural order by attacking religion; artists use their artistic freedoms to mock Christianity; Hollywood relentlessly insults people of faith; activist left-wing legal groups try to scrub society free of the public expression of religion; elements in the Democratic party demonstrate an animus against Catholicism; and secular-minded malcontents within Catholicism and Protestantism seek to sabotage their religion from the inside."
The standard practice in this type of writing is to identify your ideological opponents as a marginalized but devious band of schemers seeking to deceive the larger body of honest citizens into complicity in their agenda. By contrast, Donohue rattles off a laundry list of conspirators: educators, sex enthusiasts, artists, civil libertarians, the Democratic Party, and even unnamed fifth columnists within Christianity itself. Note that this 'paragraph' is actually only a single hysterical sentence. One can almost feel Donohue's paranoia rising with each successive semi-colon. Indeed, given the vagueness of his language, Donohue may well be including up to half of the U.S. population under his "cultural nihilist" rubric.

Shortly thereafter, Donohue pinpoints his villains' sinister logic:

"If societal destruction is the goal, then it makes no sense to waste time by attacking the political or economic structure: the key to any society is its culture, and the heart of any culture is religion. In this society, that means Christianity, the big prize being Catholicism. Which explains why secular saboteurs are waging war against it."
The magnitude of ignorance displayed in Donohue's equation of Catholicism with the essence of American culture is nothing short of breathtaking. The heyday of American anti-Catholicism came during the nativist movement of the mid-to-late 19th century, when Catholic immigrants began arriving en masse in the United States from Europe. The reaction from the largely Protestant populace was to mount a campaign of violence and economic and social marginalization against Catholic immigrants for - you guessed it - their perceived lack of allegiance to American culture. (Feel free to watch Gangs of New York for a fictionalized primer on the religious politics of the era, but be sure to fast-forward through the scenes where Cameron Diaz has speaking roles.) In fact, as recently as 1960, John F. Kennedy, still the only Catholic to have held the U.S. presidency, had to take pains during his campaign to assure the electorate that he would not be beholden to papal authority in making decisions as President. Beyond the typical religious-right "religion is the primary arbiter of culture" fallacy (which I'll return to later), it's nearly impossible to argue that American history and culture are synonymous with Catholicism without ignoring a great many salient historical facts.

Donohue later follows with this bit of revisionism:

"There was a time when Hollywood made reverential movies about Christianity. But those days are long gone. Now they just insult. And when someone finally makes a film that makes Christians proud, he is run out of town. Were it not for Mel Gibson, there would have been no "Passion of the Christ." But for every Harvey Weinstein who likes to bash Catholics, there is always someone else waiting in the wings to do the same."
Mel Gibson was not "run out" of any town for making The Passion of the Christ. Donohue conveniently neglects to mention the very public incident in which Gibson was caught driving while intoxicated and proceeded to sexually harass a female arresting officer, all the while spewing the kind of rank anti-Semitic beliefs he had so vigorously denied holding during the run-up to The Passion of the Christ's release. Isn't it interesting, in this context, that Donohue prefers to pin Gibson's downfall on the likes of Harvey Weinstein? (What kind of last name is that, anyway?)

"The ACLU and Americans United for Separation of Church and State harbor an agenda to smash the last vestiges of Christianity in America. Lying about their real motives, they say their fidelity is to the Constitution. But there is nothing in the Constitution that sanctions the censorship of religious speech. From banning nativity scenes to punishing little kids for painting a picture of Jesus, the zealots give Fidel a good run for his money."
No. What Donohue and those who think like him fail to understand is that these groups and their supporters, with very few exceptions, have no designs on censoring private religious expression. Rather, they push back forcefully against the fiction that religious belief, in general or in particular forms, is an intrinsic part of American society and should receive official sanction and support as such. Preventing public property and money from being employed to display a nativity scene or a statue of the Ten Commandments is not religious censorship and is in no way equivalent to denying private citizens the right to do the same with their private property. Rather, it's a judicious assertion of the First Amendment to the United States Constitution, which remains deeply ingrained in American culture no matter how loudly the Donohues of the world disdain it.
"Catholics were once the mainstay of the Democratic Party; now the gay activists are in charge. Indeed, practicing Catholics are no longer welcome in leadership roles in the Party: the contempt that pro-life Catholics experience is palpable. The fact that Catholics for Choice, a notoriously anti-Catholic front group funded by the Ford Foundation, has a close relationship with the Democrats says it all."
I'm sure that the gay activist overlords of the Democratic Party are quite pleased with President Obama's speedy and bold moves to overturn the federal Defense of Marriage Act and the military's "Don't Ask, Don't Tell" ban on homosexuals serving in the armed forces. More fascinating is the rapid sequence of assertions that (1) the Democratic Party does not welcome "practicing Catholics" (2) there exists a group called Catholics for Choice, which in fact has a close relationship with the party, and (3) Catholics for Choice is "notoriously anti-Catholic." The message here is fairly obvious: personal religious belief and identification mean nothing when it comes to determining whether or not a person "counts" as a Catholic, while toeing the Church's anti-abortion hardline means everything.

Donohue concludes with this gem:

"The culture war is up for grabs. The good news is that religious conservatives continue to breed like rabbits, while secular saboteurs have shut down: they're too busy walking their dogs, going to bathhouses and aborting their kids. Time, it seems, is on the side of the angels."

Let's leave aside, for the time being, the more explicitly disturbing connotations of asymmetrical breeding as a strategy for achieving cultural and political goals, and focus on the real implications of what Donohue is saying here. In a single sentence, he's managed to neatly encapsulate his view that children are little more than empty vessels to be indoctrinated with an unaltered version of their parents' religious, cultural, and political beliefs, for the purposes of continued engagement with a vaguely defined and ever-changing enemy on a metaphorical field of battle. It is inconceivable to Donohue that, absent some nefarious outside influence, children raised in a conservative religious family could grow up to become atheists, homosexuals, members of Catholics for Choice, or any of the other myriad forms of "deviancy" that populate his worldview. To him, these are not their choices to make. They are to be made for them by authority; specifically, by a glorious singularity of parental power and religious dictate.

Bill Donohue is a grubbing fascist without one shred of respect for the United States of America's rich and dynamic culture of individualism, mutual tolerance, and democracy. He makes no secret of his profound contempt for American citizens who fail to reflect in full his personal prejudices. All of this is abundantly clear in every sentence of his idiotic and vile editorial. We are fortunate to live in a time when this can be made plain, and even more fortunate that Donohue has chosen to discredit himself so thoroughly and nakedly on the public stage. It's clear that Donohue imagines himself a holy warrior leading a vast army of the devout to a divinely ordained victory. His writing reveals him to be little more than a cheap dictator huddled in a bunker, cursing the names of imagined conspirators under his breath, while promising his dwindling camarilla a glorious triumph in a war he has already lost.