Video Games Killed The Media Stars
By James Pinkerton
Everyone knows that the videogame industry has eclipsed the movie industry in terms of total revenues. Still, in terms of cultural impact and influence, movies predominate. Yet that could be changing.
How do I know that such change is in the Zeitgeistal wind? I read it in The New York Times, so it must be true. You don't believe me? You want to perform your own fair and balanced test? OK, then please let me report a bit, and then you decide.
But first we might take note of the current cultural mismatch. As just one indicator of the movie-heavy status quo, let's consider the difference in the cultural weight between movie-related and game-related awards shows. A game such as Halo 2 might bring in record amounts of money in its 2004 premiere, far more than any competing film, but when it comes to ceremonies, there's no competition.
The Academy Awards show, for example, is a huge event, covered live on broadcast television, as it has been for the past 52 years. By contrast, the Video Game Awards are consigned to the testosteronal boondocks of cable, specifically, the second-tier SpikeTV.
Applying Marxist terminology (all those years in lefty schools in the '70s ought to be put to some good use!) to this situation, we might say that while the economic substrate for videogames is larger than that of movies, videogames' cultural superstructure is still dwarfed by the flicks' superstructure. And yet if we are going to be good materialists, we must believe that situation determines consciousness. Or, to put it another way, the media phylum with the greater mode of production will ultimately produce the greater superstructure. Therefore, according to the dialectical laws of history, this anomalous situation will be reconciled, as videogames empower many a flower of the new super-culture.
In the meantime, another interesting question is this: Why has Hollywood proven to be so far behind the cutting edge of entertainment? After all, Tinseltown has been in the business of manipulating images -- and audiences -- for a century now. With that much of a head start, surely the studios could have snuffed out, or co-opted, the game movement. That makes sense as a theory, but it hasn't worked out that way in practice. As another German philosopher, Immanuel Kant, explained, "The actual proves the possible." Or, in this particular instance, if Kant were around today and reading PC Gamer, he might explain, "Dude! If it didn't happen, it couldn't have happened."
Three explanations for Hollywood's failure to make the jump from storytelling to interactivity come to mind:
First, studio execs as a group have never known anything about videogames, and can't ever be expected to learn. Movie makers, after all, think of themselves as being in the movie business, as opposed to being in the overall entertainment business. And so just as the railroads ignored the automobile -- because railroad men could not see that their business was transportation overall, as opposed to simply railroading -- and just as autos similarly ignored airplanes, so it is that movies are mostly clueless about the new entertainment platform, videogames. Long ago, management guru Peter Drucker made the point that a new technique would not be adopted until it was obviously and demonstrably ten times better than the old technique. Since then, much work has been done about the impact of "disruptive technology"; the general lesson seems to be that it's the rarest of companies that can make the jump from one way of generating profits to another.
Second, almost no matter what they do, no matter how clueless they might be, the studios are still making money. Let's face it: the studios do a good job of producing entertainment, of a certain kind. So there's no sense of crisis, or at least not a sufficient sense of crisis, to force a new receptivity to new ideas, even money-making ideas. Hollywood is notoriously adept at hiding its profits, but even as movie moguls keep poormouthing to outsiders, including the IRS, the plain fact is that show biz is a good biz; real estate prices in West Los Angeles keep rising. Let's face it: this is a rich country; the GDP is north of $11 trillion annually. With so much money to go around, an existing market-player can lose share, on a relative basis, and still gain ground, on an absolute cash-in-pocket basis. And so while studios are happy enough to take money from gamers for the occasional hot property such as Spider-Man, for the most part Hollywood doesn't really care; "top ten" lists of videogames are almost completely free of movie-originated titles. Meanwhile, one should fear for the shareholder value of those companies, such as Disney, that belatedly try to buy their way into gaming relevance.
Third, the cultural superstructure of movies is stubbornly persistent, even hegemonistic. In the same way that, say, Paris -- that's Paris, France -- is still riding on the sheer cultural fabulousness of the 19th century and before, so it is that movies are riding on their past glory days, too. Across the country, film festivals, film schools, and film critics create a thick blanket of propaganda for movies.
But that deep system of embedded cultural legacy is being silted over; a new tide of silicon-based sedimentation is coming, and coming fast. On Friday, The New York Times' review of the new Nicole Kidman movie, "The Interpreter," clocked in at 978 words -- I counted. That same day, April 22, the Times printed a 1272-word review of a new videogame, "Jade Empire". That's right: the videogame review was nearly a third longer than the movie review.
Of course, that's only fair, since "Jade" is a lot more entertaining than "Interpreter." Yet still, movies have a vastly bigger place in the Times' pantheon: the review of "Interpreter" was on page E1 of the entertainment section in the hard copy, with a color photo, plus another black and white photo on the jump page, while the "Jade" review was on page E36, with just a single small black and white photo.
However, in the brave new search-world of cyberspace, old notions of prominence and placement are less important, because Googlers aren't looking "above the fold"; they are looking instead in dialogue boxes. The notion of what constitutes good "real estate" for the eye will thus have to shift from the physical/visual relationship with the viewer to the linked relationship among key words.
And while I sorta doubt that too many gamers are looking to the Times for guidance on games, I do have to give the Gray Lady credit for taking "Jade Empire" seriously; the Timesters are at least trying to keep up with that youthful "demo" -- a demographic group that is mostly unfamiliar with newspaper reading.
Parenthetically, one might note that newspapering is an industry that's more threatened with extinction than the movies. Just as railroads forgot that they were in the transportation business, so newspapers have mostly failed to figure out that they're in the eyeball biz. And so the eyeballs are going elsewhere, lured by cool technology and cooler content. Today, as newspaper circulation falls, and as mindshare for key profit centers is lost to out-of-nowhere upstarts, some pundits are contemplating the outright death of newspapers, in the same way that Hamlet regarded the skull of poor Yorick -- as a demise that's already happened.
Which is to say, the Times might fail at upgrading itself. It might, for example, fall victim to some disruptive new-media juggernaut, such as "Googlezon" -- the hypothetical merger of Google and Amazon -- within the decade. But give the Times credit; it's a tangible paper publication that is diligently trying to intangibilize itself in cyberspace. And who knows? If, by chance, the Times gets it right and makes the necessary techno-paradigmatic leap, maybe one day, game reviewer Charles Herold will be as significant a cultural arbiter as movie reviewer A.O. Scott.
After all, anything can happen in this Schumpeterian environment. As The Los Angeles Times' Andres Martinez observed on Wednesday, Google might well be wise to gobble up an old-media content provider to anchor its own brand, especially since it could buy, say, Dow Jones -- owner of the cash-poor but name-rich Wall Street Journal -- for comparative chump change: $3 billion, next to Google's market cap of $60 billion.
Meanwhile, regardless of legacy-cultural overhang and lack of critical acclaim, videogames are sailing along in 2005, even as movies -- and newspapers -- are sagging.
The sentimentalist in me would love to see some videogame tycoon create a whole new superstructure of awards and honorifics, especially if it would also spawn a Vanity Fair-like after-party that I could be invited to.
But maybe the Marxists are wrong about the inevitable close linkage between economic substrate and cultural superstructure -- and that would make sense, as they've been wrong about almost everything else.
Or maybe videogames will come up with a new kind of superstructure that owes more to Moore's Law than it does to Marx. Maybe the new culture will be entirely virtualized, as seen in the online community of games such as EverQuest, or in other interactive/collective media, such as augmented reality.
But on the other hand, even the most virtual of environments have a way of turning physical, eventually. So sign me up, movie fan that I am, for the actual, tangible Videogame Academy Awards show. I'll want to go to the Oscar show, too -- unless, of course, they're being held at the same time. In which case, loyal TechCentralStationeer that I am, I'll go with the new flow.
And speaking of flow, videogames may be notorious for their unflinching depictions of flesh and blood, but I will presume that the Videogame Awards will feature ample displays of healthy, sexy flesh, not spattery blood. That's one legacy from the silver-screen awards that I hope stays real forever, no matter what the evolving technological substrate.