Thursday, December 22, 2011

Rudolph, the red-nosed alcoholic

This Christmas season at the movies has left me feeling, well, slightly hoarse. Based on that pun alone you might have an idea of what I'm intending to talk about here, and if so, I apologise at the outset for inflicting such a stinker upon ye good people of the Internet straight out of the gate. If not, well, that doesn't matter, that just means I got away with it. Twice. Phew, I guess.

So it's summer blockbuster territory for those of us who are south-of-the-border, while it's non-denominational holiday blockbuster territory for all those, ahem, normal people up north. You may have noticed a certain picture centred on a roving young stallion being sent off to fight and die in a miserable, horrific conflict in Europe's past has been the recipient of a surprisingly intense marketing push given its competition with other, higher-profile sequels and followups due to hit theatres this season. Hmmm, I suppose that description reveals quite a bit in itself. Violence! People like violence. Give the people blood and guts and pinata parties gone horribly wrong in the Colosseum sands, I say. Right?

The point is, the comparatively aggressive promotion for Steven Spielberg's War Horse has been the catalyst for me doing no small amount of head-scratching and wondering: are we seeing the emergence of a “horse movie” subgenre within the drama film category, and if so, what is it about the “horse movie” that holds so much appeal to a wide audience?

As soon as I started considering the former question I had a couple of early examples I could rattle off in admittedly flimsy support of the whole sordid affair, and here they are: The Horse Whisperer, Hidalgo and Sea-fucking-biscuit. Not much to build an empire on, for sure, but the Mongolians did it with less. Near as I can see, the narratives contained within these movies all hold the following in common:

They focus first and foremost on the horse as a character in its own right, if not the primary protagonist altogether. Perhaps given that a studio can't predict the subjective appeal/bankability of a particular lead human actor to one member of the audience or another, it's safer just to assume that everyone likes horses and will thus rally in support of an anthropomorphic protagonist. Animals can't speak (parrots and dolphins please place yourselves in the temporary self-designated Exclusion Zone visible in the corner) so to some extent, we as audience can project onto them whatever personality we relate to. Nonetheless...

The horse is an animal with recognisably human idiosyncrasies. This is what I could and will call the inevitable “cute factor”. Horses are also not unlike us in other ways - they're temperamental, they can feel pain and we can empathise with their suffering. There is also a sense of unshakeable loyalty in their actions, particularly in movies such as Hidalgo where a bond is shown gradually developing between horse and master (notably, this is subverted for comic effect in the Zorro movies). But...

Horses also possess attributes we ourselves would gladly claim. Strength, speed, majesty – who wouldn't want these as physical traits? Or, in a more abstract sense, the ability to literally just focus on a goal and run full tilt at it, bravely shrugging off whatever life throws in your way. We also project onto ourselves the desire to possess the characteristics of the person riding the horse. Aristocratic dashing cavalry officers, cowboys, knights in armour, impossibly regal elven queens moving soundlessly through the forest while a hapless courtier rides in front to throw his cloak or himself over muddy potholes so she can cross. None of these guys is exactly chewing the proverbial plankton off the floor of the food chain's country franchise outlet. (God, that was a terrible metaphor, even by my standards.) The horse functions as a symbol of speed, power, courage and grace, and it shares these attributes with its rider – they are ultimately inseparable as an image of subjective, abstract desire.


We recognise in the horse a yearning for the pastoral ideal. My childhood memories of playing The Legend of Zelda: Ocarina of Time largely consist of gritted teeth, smashed TV screens (blatantly untrue) and generic little-shit frustration at being unable to win that bloody Golden Scale by coughing up a 20-pound hake to the capitalistic owner. But that doesn't erase a certain nostalgia for the time spent larking in Hyrule Field, galloping Epona in circles and firing fire arrows into the distance to hit some poor unsuspecting subsistence farmer's prize pumpkins, or riding to the crest of a hill to simply sit and watch the sun go down in all its 1998 graphics-era lens-flarey goodness. Do all horses live such carefree lives as this? I would guess not.

But that doesn't really matter. We assign the horse an idyllic existence, a place in the mind with the boundless freedom of wide-open plains. This is of course a whole world away from the cramped-cubicle office compartments, car interiors or concrete Fuhrerbunkers that many of us inhabit in cold stark reality. This imagined life of the horse seems somehow more “natural” or “essential” than what a certain cross-section of human life seems to consist of these days. It connects with us at a primal level, because we recognise our own yearnings to be out on those open plains with the freedom to go anywhere and do anything we want without a care in the world. It's a powerful association. 


Now, my immediate conclusion about the next logical place Hollywood could go, as far as specific-animal movie genres are concerned, was, naturally, the dinosaur film, and I even had a snide joke lined up with exactly this slant. It would have involved me having a good chuckle over the outlandishness of such a thing and posting a jpeg of the Philosoraptor as guest star to end on a vaguely Platonic note. But then I realised it wasn't outlandish, and how close to the truth that idea actually was, and it kind of wasn't really funny anymore, if it ever was to begin with. Jurassicparkgodzilladinosauriceageserieskingkongjustaboutanymonstermovieevermade. Um, yes.

So there we have it, the outlines of what I would consider the horse genre in popular Western cinema. I hope that's something to build the foundations of a taxonomy on. Meanwhile, I've got to go clean my room and vacuum the carpet. Fresh straw doesn't replace itself, you know.

Tuesday, December 13, 2011

Your nuclear existence

A scant observation I'd like to borrow thirty seconds of your time to share, if I may: no, not to tell you the word about Jesus Christ, but to note that an increasing number of consumer technology advertising hooklines are deploying a very specific phrase as breezily as if it's the latest model battle tank.

Last year, you were buying gifts for “your family”; this Christmas season it's “the family”. I thought I was just getting a pair of socks for dad, but whatever. Oops, I guess I just indirectly invalidated my claims from three sentences earlier. Bugger, I guess. You'll have to forgive me, my record for delivering on lofty assertions that I have no stake whatsoever in upholding has been faltering as of late.

Now in this techno-savvy context, what does the use of “the family” as opposed to “your family”, or “my family”, connote?

First and foremost, the family appears as a kind of obligatory accessory, a mandatory ingredient in the life of the sophisticated, hip urban consumer. It's something that “everyone” has, whether they want it or not – regardless of and totally divorced from the reality that not everyone has a family, or even one they particularly want to associate with.

If you belong to a certain youthful socio-economic bracket that can afford hip-and-happening consumer technology like iPads/Galaxy S IIs/miniature pocket keyring hydrogen bombs et al., then you have “the family” that has provided you with the lodgings, education, nourishment and home life throughout the course of your privileged upbringing, presumably necessary to attain the standing in society whereby you have the disposable income to purchase such items.


“The family” is successful in its own right. It made you what you are, it got you to where you are today. You're successful, you've made it, and now you're going to build on that success by buying one of whatever Mr. Shitface is dangling like a droopy marshmallow on the end of his long pointy stick over your proverbial cage.

To be sure, I guess the element of depersonalisation in such a phrase is necessary for exactly the same reasons outlined above – the marketing boffins can't know for sure that whoever is reading their advert is going to fall into the aforementioned consumer demographic, so saying “buy a gift for your family” might ruffle some feathers and/or possibly be a bit un-PC.

But really, I guess the whole feel-good-about-yourself-because-you're-a-wealthy-urban-guerilla connotation that comes with it is just an unexpected bonus – like finding that your laptop doubles as a place setter. Bon appétit, and don't you go handling those crackers irresponsibly, you hear?

Thursday, December 1, 2011

Built to last

Christmas time always seems to bring out the techno-head in me, so it is with healthy suspicion that I say that for no particular reason, I decided today I would write a blog entry about Toughbooks. I may as well have chosen to write one about Siberian snow leopards. But I've spent the past year working with the damned things, and for all their interminable bloody quirks, they have grown on me, when I'm not curtailing my urge to put the manufacturer claims to the test in the most spectacular way possible and hurl them into the nearest solid surface or Atlantic maelstrom at warp factor 5. I like their old-fashioned quality. They're solidly built and well made, which I guess is appropriate and a real breath of fresh air when every other notebook and/or prosumer product in general these days is crappy plastic or some substitute thereof, cooked up in a chemical vat somewhere next door to the WMD factory. And they incorporate all the best elements of the wave of trendy tablety-stylusey computing-type thingy consumer devices that have hit the shelves in recent years. I don't think I would be making too outlandish a statement to say that they basically channel the spirit of the mouse-bot from Star Wars in notebook form. I won't use the word cute. That would be committing political suicide. But anyway, hopefully you get what I'm on about. I'm really not sure I do, so some help might be appreciated in this department.


This, in conjunction with the venerable Lumix dynasty of compact cameras, convinces me that Panasonic are quite capable of making some good stuff, and quite possibly one of the key players that will cause DSLRs to become a thing of the past for the enthusiast or hobbyist in the conceivable future. I splashed out on a Canon 550D approx. 7 months ago (just before the 600D landed and caused the price to drop, grr) and now kind of wish I hadn't – the range of compact cameras that are on par with DSLRs in terms of image quality is steadily increasing every time I turn my head to look, like some kind of aggressive mutant algae on the colourfully exaggerated mural of my perceptions of the camera market.

Funny how neither of the invincible heavyweight duo (Cannikon)'s offerings in this area have caught my eye quite as readily. Powershot and Coolpix are names that are known to me, but they just don't have the same up-there or trendy connotations of something like a Lumix, Olympus EP-series or a Finepix X100 (drools). Which reminds me... Anyone want to trade? Kiss X4 plus two lenses, Tamron 17-50 & Canon 50mm 1.8, both with hoods & 52mm polarising filter for the latter. Chuck Fujifilm's second-to-latest offering, duly mentioned above, in my direction and I'll consider. No, I'm not talking about the X10. None of that popcorn-snack-sized Roman rubbish. (Disclaimer: Above offer may not actually be proffered in seriousness and may possibly be the result of the author talking glibly out of his ass, as usual)

I do digress, don't I? I may have to amend my initial statement about this blog post being about Toughbooks, because as it turns out, it's really not. I would have bored you all senseless if it were, anyway, because no-one's heard of the damn things who doesn't shoot terrorists or chase dinosaurs for a living. But I suppose there is a common thread here, and it goes something like this: build quality is a drawcard for the lay consumer as well as the professional, and when you mix quality build with antique retro aesthetics, more often than not you get something really rather nice. My 550D is not really built for outdoor shooting. I know the 5D Mark II (now there's something I really would trade my entire camera kit and quite possibly a substantial number of other valuable things in my possession for as well) has that beautifully constructed magnesium-alloy body and all the other trappings of professional ruggedness, much like the 1D Mark IV, the Nikon D700, D3s or what have you. Whenever it starts raining, I run to put the nearest solid object between me and the man upstairs taking a leak. I guess it's not all about the camera body; there's no way you can keep an expensive lens swaddled in cotton wool forever, and if you're going to be shooting in downpours or thunderstorms or lava flows or biblical apocalypses you would probably be best off throwing in your lot with the Canon camp and dropping multiple weeks' worth of paychecks on an L lens that is specifically weather-sealed. But I like things that are metal or alloy or wood or solid fibreglass, things that seem to be well-made, things that seem to come from the real world, that you can picture some old guy sitting in a room spending hours polishing and buffing to the extent that he can look in the front panel and see his own reflection and remember that he forgot to put his monocle on that morning and shrivel up in embarrassment.


It's not just about the functional properties of the build. When you own an item that you perceive to be quality, you are inclined to look after it better, you anticipate the results it delivers with a more positive outlook. There is a definite market out there for lay-level consumers who like products that affect an air of quality construction, whether or not they actually are, or whether they have any idea of the technical intricacies that define it as well-made or otherwise. I definitely bloody don't. The Leica M9 could be stuffed with newspaper and old Cheezels inside and I wouldn't have a clue.

As a side note, I'm wondering if this has anything to do with an infusion of the steampunk aesthetic popularised in recent works like Sucker Punch or the BioShock franchise? The technology in those textual worlds has a common, definite sense of the internal workings of devices being made transparent. You can see all their gears and knobs and levers going like the inventors were using The Incredible Machine as their prototype simulator. Mousetrap hits see-saw which strikes match which lights rocket fuse which blows hole in wall causing water to flood through and power turbine. You might not get exactly the same effect with a CF-18 Toughbook or a Fujifilm X100, but you still get the undeniable impression that it was made using real materials not mined on the Moon, that it has a real weight and substance to it. It's part of the built environment; it will last when your 50mm 1.8s or school-lunchbox notebooks are biting the carpet.

Oh and Digitalrev fixed their site! This is worth celebrating. I am impressed mucho.

Monday, October 31, 2011

The sound of silence

If games are their gameplay, then the essence of the first person shooter genre is noise and violence. It's things happening, multiple threats coming from opposite directions, a player who is required to think and adapt to a rapidly evolving situation in real time.

Half-Life 2 is really rather unique in this respect, even as an entry in its own franchise. Not many first person shooters are confident enough to let players take a step back and drink in the silence every now and then. To allow them to stand alone on a clifftop and just do nothing, taking in the stark vista of the abandoned surroundings until they are ready to press on. This stands in contrast to a number of more modern shooter games where the noise, action and violence is omnipresent to the point of being numbing.

The examples that spring to mind for me occur primarily in the vehicular chapters “Water Hazard” and “Highway 17”, but I don’t think it’s especially specific to the road trip-style vignette gameplay that these sections engender. Think of the section where you battle it out with that carpet-bombing attack chopper in the lagoon just before Black Mesa East. After ten minutes of intense combat, adrenaline, concentration, there’s a thrilling rush of exhilaration as you watch the bloody thing spiral out of the sky in flames to land in pieces at your feet, the stinger music providing a sudden spike of dramatic tension and relief.

Then comes a brief instant of confusion. For those past ten minutes you had a single-minded purpose in the game – you knew without question what you had to do: escape from, or preferably destroy, the most immediate threat, i.e. the chopper. But now, for just a split second, you’re unsure of what to do next.

And in that moment, that’s when the game’s peculiar silence takes on a life of its own. You realise where you are, the sight of the setting sun over the lake dawns on you (no pun intended), and it compels you to stop and marvel at this strange oasis of picturesque beauty hidden just on the outskirts of technocratic dystopia. Barrels bobbing in the water look like lifebuoys, and stone monoliths cast long shadows on the surface. There's a chirp of evening crickets coming from some unidentifiable source (crickets, presumably, or perhaps someone making cricket noises). In this light, Half-Life 2’s crumbling post-apocalyptic earthscape takes on a weird sense of peace.


The fact that it’s placed immediately after a balls-to-the-wall action sequence is not accidental either. The player has a chance to catch their breath, remember why they are here and proceed at their own pace (although sometime today would be nice, I’m sure).

Highway 17 uses silence a tad more liberally and without any such memorable set-pieces; it’s more in the creaking of an old boat-shack door, a tyre swinging from a tree, the chilly wind blowing over the cliffs, the vastness of the seascape that reminds you just how alone you are. Even the Combine patrols encountered at regular intervals seem like quaint anomalies rather than symbols of all-powerful authority over this Hebridean landscape.


The absence of any soundtrack is crucial to these moments. It creates an immersiveness that is impossible to fake, one which is further authenticated by the lack of characterisation of the player character (G. Freeman esq, this guy). The seemingly natural progression of time, the sky darkening as day turns to night and vice-versa, is similarly a key component here.

What moments of silence such as these achieve, in a functional sense, is to remind us that the essence of this game is more than the noise and violence of its core gameplay elements. We are more than just players; at another level, we are participants in a fully-realised fictional drama, and beyond that, we inhabit a fully realised fictional world where we have the freedom to do as we will, to sod the hero's journey and go dune-buggy joyriding squashing antlions till judgement day if we so choose.

Half-Life 2 is, in its own way, a uniquely lonely game. (Yes, I get that this may have something to do with a significant chunk of its target market being perceived as possessing minimal to borderline-functional social skills.) But a starkly beautiful one, too. I’ve yet to see any subsequent entries in the genre, even those self-touted as “post-apocalyptic”, that have quite managed to replicate the same effect.

No doubt there are plenty of recent shooters out there with technically more impressive visuals which ooze atmosphere (dammit, I thought I'd get through this spiel without making a single use of that word) in every single facet of their art design. But Half-Life 2's silence is all its own. It's in the uncanny strangeness of the emptiness of this world, which hits you all at once when you least expect it. It's of a type that players will seek out voluntarily rather than having it bludgeoned into them at the end of a Combine stun-stick.

Ok, I’m done licking Valve’s boots now, but you get the point. It’s still a bloody well-made game in terms of art direction, even when the Source engine is going on seven candles on its birthday cake (which is not a lie).

Oh, and it was Halloween last night. Probably should make some mention of that. When I snap my fingers you will awaken and re-read this article all over again, except this time it will be scary. Blah.

Friday, September 16, 2011

Lie or no lie, it's still one for the metaphor hall of fame

Broadly speaking (the best kind of speaking. Or not actually. Probably comes in somewhere about seventh, on a good day) there are a couple of different ways of evoking atmosphere. Not to trot out the dead horse that is the second-most long-suffering metaphor currently in use in modern everyday conversation after automobile engines, but it’s vaguely analogous to baking a cake. You can drop that extra sprinkle of cocoa into the pan last of all as topping, or mix it into the batter from the word go so you get a garden-variety (vanilla, even?) chocolate cake.

Really, it doesn’t matter if we’re talking cinema, games, literature, dance or what have you. Structurally, the narrative and atmospheric elements of a text should be so seamlessly integrated as to make the audience unaware of the process by which each supports the other. However, there are always creative decisions to be made with regard to the specifics, which of course is where things get interesting.

Today I want to look at something a bit different: Sloth racing on Saturn. I couldn't find much on that topic, though, so instead I'm going to put Tom Waits’ spoken word piece from the 1985 album Rain Dogs, titled "9th and Hennepin" under the microscope. This is quite a compact and economical snippet of dark musical poetry that begins by painting broad strokes, quite literally rounding out the scenery on all sides to create an effectively realised picture in the listener’s mind’s eye:

Well it’s 9th and Hennepin
All the donuts have names that sound like prostitutes
And the moon’s teeth marks are on the sky
Like a tarp thrown all over all of this, and the broken umbrellas like dead birds
And steam comes out of the grill like the whole goddamn town’s ready to blow
And the bricks are all scarred with jailhouse tattoos
And everyone is behaving like dogs
And horses are coming down Violin Road, and Dutch is dead on his feet
And all the rooms, they smell like diesel, and you take on the dreams of the ones who slept there.
And I’m lost in the window, I hide in the stairway and I hang in the curtain and I sleep in your hat.
And no-one brings anything small into a bar around here, they all started out with bad directions
And the girl behind the counter has a tattooed tear. One for every year he’s away, she said.
Such a crumbling beauty, ah
There’s nothing wrong with her a hundred dollars won’t fix
She has that kind of razor sadness that only gets worse with the clang and the thunder
Of the Southern Pacific going by
And a clock ticks out like a dripping faucet, till you’re full of ragwater, bitters and blue ruin
And you spill out over the side to anyone who will listen
I’ve seen it all. I’ve seen it all, through the yellow windows of the evening train.

The stream-of-consciousness flow and overall structure are maintained by the continual use of “And” at the beginning of each line, which gives the piece as a whole the air of a hazy vignette, one intended to be mainly texture and atmosphere without much basis in reality. There’s a sense of disconnectedness in which the speaker realises all of his worst suspicions and fears about the place (a real street corner in Minneapolis), but he is imagining without seeing, as Waits himself has commented in his explanation of the piece.

What’s really interesting here I think is the way Waits begins. He starts out by describing the broader picture and then moves into the closer, more personal. If this were a scene from a film it would be a long continuous crane shot that starts out above the city skyscrapers, swooping down into the streets, past the streetlights and shadows of people and into the doorway of the hotel in the story, eventually meeting the “girl with the tattooed tear” at the bar. What we have here is a sense of progression from the disconnected abstract to the specific. We shift gradually from broad strokes of disjointed atmospherics to concrete realities of character and narrative. Waits fully fleshes out the world he has created before plunking us squarely in the middle of it as one of the spun-out drunks at the bar. 

This prominent foregrounding of atmospherics is a popular technique in modern cinema because it allows the filmmaker to fully immerse the viewer in the imaginary universe on the screen before introducing the who-what-where-why of the story itself. It becomes immeasurably easier for the audience to accept and be emotionally involved in what they are seeing if they perceive that which they are seeing to be reality, or are at least superficially unaware of its fictionality. I suppose it makes more sense to guarantee your audience will be totally under the spell and ready to listen to what you have to say when you finally start to say it. Examples? I’m thinking along the lines of Wall-E. I’m well and truly sick of talking about that film though, so I’m going to shut up and leave that one up to you to expand upon for the time being. Maybe after a decent interval I’ll be able to discuss it critically again without feeling like a malfunctioning gramophone. Seven posts or so should do it.

On the other side of the pancake, consider a futuristic science fiction film. Something like your Minority Report, or its similarly styled spiritual father, Blade Runner. Here we have injections of atmospheric detail at regular intervals throughout the movie. In this case, it’s largely made possible by the constant visual onslaught of what the filmmakers have imagined everyday life in the XXnd century (insert random post-noughties date of choice here) will look like. Indeed for the science fiction genre, this is precisely what makes the regular maintenance (I use that phrase completely ironically) of atmospheric detail so easy - technological change is one of those things that is all-encompassing, and will be evident in virtually every facet of the on-screen activity, from projecting quarterly reports to feeding the cat. Flying cars, or colossal urban billboards that advertise personalised consumer goods, help keep the suspension of disbelief constant, allowing the viewer to fully accept the theatrical illusion. In places these immersive footnote details are even used directly as plot devices – I’m thinking, for example, of the tiny spider scout police robots in Minority Report that hunt Tom Cruise through a skid-row tenement block.

But in both of the films used as examples above, there isn’t really a tutorial half-hour of atmospheric acclimatisation before the plot proper starts. We’re launched head-first into the who-what-where specifics of the story and characters and expected to acclimatise to the reality of the onscreen world as we go. Those little atmospheric flourishes sprinkled throughout are the one-percenters that enable us to do this. It’s a somewhat different approach to the one outlined in the Waits example, and I would say a less risky one, to be sure. Putting all the texture and atmosphere up front and then launching into specifics risks audience disconnection once the spell wears off, but a seamless transition between the two is like a well-made gâteau: a culinary one-two punch with delicious piped chocolate icing on the top and outer sides and something exponentially more awesome in the middle. I don’t know what.


 

Saturday, August 13, 2011

"He's more machine, now, than man..."

So I’m back…again.
I spent the frigid winter in a manor by the coast after my court physician told me a spell in warmer climates might help “slayeth the bad humours”.
Actually, I’ll tell you a secret. That last sentence wasn’t quite true. Nonetheless. I haven’t been doing any writing in quite some time because, to be as honest as a sodium pentothal elemental, I’ve been less inspired than the architecture in Soviet-colonised Boredomland. That’s two flighty metaphors in quick succession… Caught those? Still alive and kicking? That’s good. Anyway, with a change of colour scheme comes a change of topic and, appropriately (I guess) for such a lengthy interlude (also in the interests of writing something of slightly broader interest than my usual choice of material), I’m going to take the focus away from the pop-culture banquet platter to wax philosophical for a bit.

I spend a lot of time working around machines. Specifically, machines marketed to the general public consumer base. Like the card-carrying consumer junkie I am, I also tend to spend a lot of my disposable income on technology. When you go to buy a new toy, whether it be a computer, camera lens, graphics tablet, cassette player (no! Bad Cameron! Shove this paragraph bum-first back through the time portal and get out of the 1980s) or such, you’ll frequently hear the remark “It’s a lovely machine”. I’ve heard that same comment thrown around once or twice when showing a customer a computer at work.

So my question for today is this: can a machine be beautiful?

I should qualify. When I use the b-word, I’m not referring to an objective idea of beauty promoted by popular culture, but rather a personal or subjective one. I say that here not with the intent to moralise against the former. I want to draw attention to the fact that in the late nineties and in the era since, we have seen an increasing push toward consumer devices becoming smaller, curvier and more stylised, a phenomenon which, notably, has gone hand-in-hand with the increasing ability of virtual graphics technologies to render extremely realistic and lifelike scenes. Clearly, this streamlined aesthetic is the direction in which consumer-culture notions of beauty seem firmly set on heading.

However, this fact is relevant to our question for the following reason: Man constructs his inventions in imitation of nature. And nature is, traditionally, the ideal of beauty.
You only need to take a quick look around your immediate surroundings to recognise the extent to which this is true – unless you are reading this blog post from the slopes of an ascetic retreat in the Himalayas, but maybe even then. Cars, planes, boats, music players, iPhones: they all possess an unnerving familiarity, a ghost of a resemblance to things from the natural world.

And nature is not all sharp edges and protractor-iffic 90 degree angles. Nature is curves, the curling form of a leaf, the figure-8 twist of an elephant’s trunk. Nature is messy, though of course it exhibits a subtle, at-times-literally-mathematical elegance as anyone who has read The Da Vinci Code will be quick to shout out. Nature is also the sleek outline of a cheetah’s hide, which carries with it associations of speed, cunning, athletic prowess and general badassery. This goes some of the way toward explaining the marketing drive of the modern sports car and its more primal connections with the masculine psyche (of which the entire Fast & Furious movie franchise is basically an extension). Many such campaigns, unfortunately, often choose to emphasise the twin elements of speed and aggression, to the detriment of their audience (not least their audience’s safety!) and perceptions of automotive culture in general.  


Machines resemble nature, yes, and at the same time, they resemble people. There are many devices out there in the everyday which seem to have a literal or abstract “face”, or other apparatuses that are vaguely but distinctly humanoid in character. Some examples are handheld smartphones, old clothes dryers, ghetto blasters and tape decks, forklifts – the list goes on and on. There is a simultaneously eerie and comforting familiarity to seeing devices such as these that I will talk about in greater detail later on.

Certainly there’s an element of the practical in this type of design. For example, what might be analogous to “fingers” - i.e. the keypad on a smartphone, or the stylus on a tablet - facilitate physical human interaction with such devices; LCD monitors or “faces” allow interaction at the level of the visual i.e. through our eyes, and the speakers or “ears” either side of my computer monitor transmit sound waves to my own ears.

But are such practical considerations merely a side effect of an inherent human instinct to construct things in the image of nature, in the traditional (abstract) locus where beauty is to be found? Or is “the natural” or the “beautiful” in a machine just a by-product of the practical?

After having written several posts with reference to the Pixar movie Wall-E, this is one of the most fascinating ideas to come out of that film, one that continues to haunt me to this day: When does a machine become invested with a soul? The answer, or the only answer I could find that seemed consistent with the impression given by that movie, was that a machine gains a soul only when its creators have ceased to exist. But I don’t think this is objectively true - rather it seems just one angle of looking at a much larger picture.

To a lesser extent, but still relevant, I’m thinking also of the scene in American Beauty where Carolyn arrives home to find Lester’s shiny new 1970 Pontiac Firebird in the driveway, a vignette which is reprised and given additional depth during Lester’s final black-and-white monologue at the film’s conclusion. The image of a man buying his dream car – a symbol of wealth, prestige and material happiness – is excellent material for dissection in line with the film’s argument, that true beauty is something altogether more subtle.

But the inclusion of the Firebird in Lester’s dialogic coda suggests there is a deeper spirituality to the experience, one that transcends the banalities of Western consumer or car culture. What is it? Is it the simple joy of getting what you want? Is the Firebird, for Lester, a symbol of something else, of youthful innocence, of boyhood lost? Is it the car itself? Is there a beauty to this particular machine that requires deeper divination?

I’m not sure of the answers to any of these questions, but I do have my own experience to relate. Many nights as of late I’ve found myself going for a walk around the garden to get a breath of fresh air, and I’ll be alone outside, with my brother’s new car parked in the driveway. At some subconscious level, I’m sure, I’ve always wondered if that car wasn’t alive somehow, if those headlights weren’t a pair of eyes looking back at me. It wasn’t a malevolent presence. Just, watchful, I suppose. I could never shake the feeling that the machine had a soul, somehow, that there was an uncanny awareness, like a passive dog dozing with its head in my lap. I’d see the porch light glimmer in shades on the bonnet, and I caught the thought mid-flight making its way across my brain, “That’s a beautiful car”. I knew I wasn’t using the word beautiful in the purest sense that I usually associated with things I found attractive, or wonderful – but all the same, there was something there quite distinct from the shallow commercialised culture of car-worship perpetuated by advertising and media. What was that? I still don’t have the answer. But perhaps you do? I’d be really interested to hear other people’s insights on this. Can machines be truly beautiful? What machines do you find have a kind of beauty? The truth is out there. I think. I could be wrong about that. Maybe. But probably not. I dunno. I haven’t checked recently. Only on a Tuesday. Or was that every second Sunday? Yes. No. Possibly. Hmmm.

Monday, April 18, 2011

Let’s go back, back….

Does anyone remember Quake? I’m talking about the original, bunny-hop-around-medieval-castles-obliterating-Lovecraftian-horrors-with-a-nailgun-that-should-really-only-function-when-pressed-flush-to-a-flat-surface-due-to-manufacturer-safety-regulations-seminal id Software first person shooter. It’s been somewhat obscured in collective gaming memory by its more conventional hulking space marines vs. ugly mutant bodybuilder aliens succession of sequels, but you can still grab it off Steam for 10 bucks. Play it, and as you do so, marvel to yourself: this is what shooters used to be.

All the seemingly intrinsic mechanics we take for granted as part of the genre today are unknown to this beast. Cover-based combat, regenerating health, evil PMCs featuring as the excuse to throw wave after wave of human cannon fodder into successive linear corridors of conveniently placed debris, barbed wire, broken down tanks or crashed jam donut vans, none of them has any business being at this shindig. It’s a fascinating meld of 3D puzzle platforming, dodging traps a la Indiana Jones (I refuse to use the words “tomb” or “raider”), fast paced combat against otherworldly monsters and rudimentary survival horror. Visually it’s oozing with atmosphere, with advanced (for the time) lighting sources that are actually quite effective in complementing the recurring dark and demonic imagery throughout.

Yet it offers a very different kind of fear from that which we encounter in many games today. Not that horror is its strongest suit; in comparison to psychological mindfuckery such as that transpiring in Amnesia: The Dark Descent, Quake’s scares seem somewhat tame. And there’s no plot beyond a very basic, “you have four runes to collect and there are lots of nasty things standing in your way,” blurb.

But what Quake evokes quite well is the sudden, visceral feeling of fighting for your life. This is a true relic from the old days of gaming, where players weren’t babied around. Completion is certainly achievable, but Nightmare difficulty means exactly that. Frustration and endless reloads will comprise a major part of your playtime. On the topic of which, quicksave is conspicuously absent, leaving only a hastily scrawled explanatory note on the kitchen benchtop, tearfully apologising by way of the fact that it hasn’t been invented yet, which I suppose is a somewhat more reasonable excuse than saying you had to go home to feed the cat. All saves and reloads must be done through the pause menu, the comparatively lengthy process of which naturally discourages the use of the save function altogether. As a result, when you do manage to come out of a particularly trying battle by only the skin of your teeth, there’s a genuine sense of satisfaction. Your character is surprisingly agile for a pre-noughties shooter, so learning to move and shoot while maintaining environmental awareness are the skills that will allow you to survive. This lends itself to fast-paced gameplay that hurls you unsuspectingly from one threat to another in quick succession, barely giving you time to breathe in between.

And coming face to face with a mighty Dimensional Shambler or a pack of Fiends lurking in the shadows out of sight can be genuinely unnerving, as can running down a corridor toward a tantalizingly placed Quad Damage powerup only to have the floor open beneath your feet and drop you into a murky dungeon full of things with claws and teeth and filthy hygiene habits. Whenever this happened to me I’m sure that somewhere, an old id level design veteran was chuckling evilly.

It all adds up to a game environment where between the oppressive atmosphere, truly tough enemies and old-fashioned adventure-serial crypt-pinching trickery, you rarely if ever feel completely at ease. And this was 2011 me playing. I can’t imagine what a splash (damage, natch) it must have made back in 1996, with its full 3D environments and immersive lighting. I first remember playing this game when I was 8 years old at a department store, hogging the demonstration computers they had set up and thinking what a thrill it was to be playing an MA15+ rated game chock full of zombies that exploded into clouds of gory gibs and warped mutant spider camels with human heads and serious dental problems. Come to think of it, that might explain a great deal about why I turned out the way I did today. Um, yes. Moving on.

Perhaps the greatest compliment I can bestow upon Quake is that it harks back to a time when first person games still held promise as a genre with real creative potential for the medium. All the checkboxes of the modern “realistic”, real-life-conflict-based “tactical” shooter, with its painstakingly rendered human opponents and plots lifted straight from Hollywood blockbusters by people with scriptwriting degrees, which the genre has been progressively channelled into ticking off like a mandatory list of creativity-killing selling points, only serve to infantilize and dumb down first-person action and indeed gaming as a whole. This is a proverbial shipping container we need to think outside of if we’re ever to fully realize the tremendous immersive and story-telling potential of first person games more generally. With that goal firmly in mind, it’s fun to look back at old shooters like Quake and remember that developers, too, were once able to just let their imagination (and their rockets) fly.

Sunday, March 13, 2011

No, not the Bruce Willis sci-fi movie of the same name.


I can’t help but wonder if there’s a reason for the increasing proliferation of audience surrogates popping up like enterprising gophers in well-known and less-well-known Hollywood films as of late. When I use this term I am talking about characters in movies for whom at least one of their roles in the narrative is to metaphorically represent “the audience”. Straight off the bat and flying out into the tropopause at supersonic velocity, there’s a couple of examples I’d like to talk about in the subsequent blather: Ariadne in Inception, Neo in The Matrix, Upham in Saving Private Ryan (Hornaday 2010, Jameson 1998: 23). Mostly the first, but some of the others.

There’s a common thread in all of these movies. Some of them are high-concept, others less easily summarised, but they all require viewers to accept worlds outside the boundaries of the everyday, where accepted norms of reality have been transgressed, rewritten or completely turned upside down and faceplanted into the sand.

There’s a practical need for audience surrogates, of course: particularly when the protagonists of a film are acclimatised to the rules of the onscreen universe, it would make no sense for them to explain those rules for the audience’s benefit without a trainee member to rationalise the brief (i.e. Inception, Saving Private Ryan). The rules of Inception’s shared dreamscapes are so complex and elegantly layered that the film demands at least some level of verbal exposition, much of which is directed at Ariadne. Likewise, the first forty minutes or so of The Matrix feature Neo’s discovery of, and orientation within, the “world within a world” of the Matrix itself.

In Saving Private Ryan, by contrast, the in-film reality is the battlefield and the grossly distorted version of the everyday this engenders. The other members of Miller’s squad are soldiers whose experience of battle makes Upham (Spielberg’s audience surrogate) seem naïve and out of his depth by comparison, symbolically pointing to the helplessness of the civilian audience when confronted with the horrific realities of war (Jameson 1998: 23). To some extent, they act as mentors to Upham (i.e. Private Caparzo’s ruffled advisory that “every time you salute the captain, you make him a target for the Germans”) but for the most part, the audience is forced to adjust to the in-film reality themselves (Jameson 1998: 23).

It’s worth pointing out at this stage there’s a purely superficial aspect to the presence of the audience surrogate, as well, one which is much more commonly associated with mmopugers (Yahtzee’s phrase) such as World of Warcraft or Everquest than popular narrative cinema. Given that we are being asked to symbolically inhabit a fictional universe, we want (and indeed have the unfettered freedom to) imagine ourselves as the perceived ideal. So we project ourselves into the popular culture multiverse as the visually idealised versions of ourselves, complete with good looks, nice clothes and an ever-present air of confident capability. The audience surrogate, to use a very 21st-century-specific term of endearment, is the “avatar” of collective desire. There is a great deal of literature on the real-world concept of “celebrity” status and its similar significance as the embodiment of a collective set of ideas which people hold dear, which is probably worth checking out in lieu of me expanding on it and boring everyone to sobs. I could say a bit about Inception specifically here, though, and that bit goes something like this: if we’re inhabiting the Freudian dreamscape of repressed desires (or the collective dream space of the darkened cinema), it makes sense that, perhaps involuntarily, we would dream of ourselves in our idealised forms. This is comparable to what Morpheus in The Matrix called “residual self-image”, and indeed, the same dynamic is at play in that movie with its leather trenchcoat-clad, noir-styled pastiche of high fashion.

But there’s another angle which I find far more interesting for the purposes of this discussion: why do we want, or need, an extraneous character serving as a “projection” of ourselves? Particularly if they’re not the emotional/dramatic locus of the story (i.e. the character onto which the audience is traditionally expected to project their identity, hopes, fears etc), would this not just displace and/or weaken emotional identification with the film’s protagonist?

Theoretically, yes, but not always. Giving the audience a voice in a peripheral character can actually be used to create some clever polemic situations. For example, observing the events of Inception metaphorically through the eyes of Ariadne distances the protagonist Cobb, making him something of a mystic. He becomes an aloof, unknowable mentor figure. This in turn preserves the element of suspense around the film’s climax as it hinges upon the interference of Mal, who is herself an enigma of whom Cobb alone has knowledge. At the same time, Cobb becomes dramatically significant to us as half father-figure, half seer; a conflicted anti-hero who we are forced to trust to help guide us through the labyrinth of the dream. We may not identify with him as a projection of ourselves, but rather, Nolan makes us closely dependent on him from the outset and this fosters a kind of emotional identification.

This looks like a good spot for me to shift the topic to end on a question mark and possibly trot out my favourite I-word into the bargain. Does having a character and a personality we can project ourselves onto specifically for that purpose (as opposed to the implied physical presence of the spectator-audience, mediated through the camera lens) wreck the sense of immersion at all?

In what we might call the “traditional” mode of viewing, the audience is the implied observer, a position which gives us no visible signs of existence in the onscreen world but instead naturally replicates the process of seeing to the point of being self-effacing. Before we’re introduced to Ariadne, we watch the opening scenes of Inception as easily and naturally as if we were just another person sitting silently at the dinner table listening to Cobb and Saito talk. With the “audience surrogate” model, we possibly lose that sense of situated, first-person immersion but gain a sense of empowered dramatic involvement. The sense that we are an actual in-universe identity with the ability to directly influence or change the course of the narrative adds poignancy and pathos to the action.

So I suppose it ultimately boils down to the question: which is more important in the cinemagoing experience? The simple act of seeing, or being emotionally involved in the narrative? Incidentally, being a longtime fan of the Thief game franchise (as I think I may have mentioned once or twice before), I’m reminded of all those debates surrounding “first person vs. third person” that cropped up after Thief: Deadly Shadows went postal with its third person mode back in 2004. But this is a whole different basket of beetles. Well, perhaps not so different. I contend tentatively and somewhat necessarily ambiguously that it depends what the requirements of the movie are and what you’re personally trying to achieve as a director. But that was predictable, so here’s something that’s not: BGloooghgooopooolsdgfsdgsreaaargh.

Sources referenced in this entry:

Hornaday, Ann (2010) “Inception's' dream team weaves a mesmerizing tale”, Washington Post. http://www.washingtonpost.com/gog/movies/inception,1158861/critic-review.html
(accessed 14 March 2011)

Jameson, Richard T (1998) “History’s Eyes: Saving Private Ryan” in Film Comment 34.5: 20-23.

The Matrix (1999) Director: Larry & Andy Wachowski, Warner Bros.

Saving Private Ryan (1998) Director: Steven Spielberg, Dreamworks. 

Sunday, February 13, 2011

Playing God, or one of God's low-order subordinates. Also RTSs haunt my nightmares.

City sim games don’t generally evoke memories of mind-blowing graphics or death-defyingly cinematic action set pieces. They tend to appeal more to those with a slightly obsessive personality, who enjoy seeing things come to fruition over time. With this in mind, here’s a game you should be playing: http://www.mobygames.com/game/afterlife

If you were to go over LucasArts’ nineties-era back catalogue with a fine-toothed comb then certainly you’d find long-established hit titles like Day of the Tentacle or the Monkey Island series, but you might also notice this little-known gem which (then) represented a radical departure from the status quo for the studio. Afterlife is a high-concept city-building game which cast the player as a Demiurge (an all-powerful deity only a few office floors down on the quantum-dimensional being pecking order from God himself, in this case collectively represented by the mystically aloof Powers That Be from which the player receives periodically humorous directives throughout) tasked with the responsibility of building a fictional Heaven and Hell for the mortal denizens of an alien planet many, many light years away, but with a society and beliefs very similar to our own.

You could find out all of the above from reading the Wikipedia entry, though – what you wouldn’t find out is that it’s actually fun. Sure, there’s quite a steep learning curve and a veritable barnyard of highly specialised gameplay mechanics and rules to get your head around - at points it resembles an RTS but without the offensively brutal combative edge that more often than not takes all the hard work you spent building fifteen Carrier battle groups, crinkles it into a ball with the density of a neutron star and stuffs it into the wastepaper basket. However, that just makes it all the more satisfying when you do actually manage to navigate the minefield of early-game money struggles and get a respectable little Hades or Elysium springing into being before your very eyes.

Most importantly, though, what Afterlife has that so many contemporary games are sorely lacking is a) a sense of humour, and b) a tendency not to take itself too seriously. It does everything with a flourish and an overall sense of savoir faire created through its low-key wit, not least the relentlessly charming banter between the two player adviser characters, Jasper the demon and Aria the angel. And to be perfectly honest, I think that’s the main thing that offsets the mind-numbing boredom that usually comes from playing a game such as this which requires excessive micromanagement, such as an RTS. Have I mentioned RTSs suck recently?

There's such a variety of blatant and subtle references and in-jokes strewn throughout Afterlife that you'll find yourself actually wanting to unlock more expensive buildings just to read the amusing little description texts that come with them. Anyway, enough out of me: go play it, if you can find a copy.

Sunday, February 6, 2011

I would have written "Myst-ified" here, but then I would have been summarily condemned to the ninth circle of portmanteau hell

On the subject of major game franchises that we haven’t heard so much as a blip on the sonar system from recently, the Myst franchise is a notably now-deceased example that managed to spawn four bratty offspring, a spin-off and a corresponding cult MMO before it flopped. Alright, point of clarification: the sequels weren’t bad. Actually, they were quite good, in fact, some going so far as to surpass the quality and timeless charm of the original. I’ve been trying to map out a point of analysis for the recurring formula of this game series for a short while now, and I think I’ve finally arrived at one.

What Myst, all its sequels and spin-offs (i.e. Uru and Uru Live) have in common is that they rely on a primarily visual reward system. In return for exploring, completing puzzles and learning more about the story and characters, the player is allowed to advance further and thus behold the latest pretty techno-surrealistic vista the art department has managed to conjure out of its collective sleeve.

It’s by no means unique in this respect. Early 3D platform games such as Super Mario 64 or Banjo-Kazooie spring to mind as other examples that have operated using a similar mechanic. The latter even directly parodies this fact by showing the player incomplete paintings of worlds that must be filled in using “jigsaw pieces”, the game’s chief currency in trade, to unlock the entryway to that world.

Travelling back even further down the poetic oak-tree lined boulevard of memory lane we encounter the classic retro side-scrolling 2D platformers blocking our paths like a police cordon around a crime scene. Because of the limitations of the technology and the 2D perspective, only a screen-sized slice of the game world could be displayed at any one time, and the camera moved as your character did, so moving your character forward was directly linked to revealing more of the game environment. Spatial progress and visual gratification were, in this genre, completely synonymous. Therefore, many in-game obstacles were geared toward literally impeding the player’s physical progress, as opposed to the more abstract and complex challenges observable in later 3D platformer titles (i.e. collecting parts of a dismantled spaceship in preparation for the end-game confrontation in Jet Force Gemini).

This is why games with essentially (sometimes necessarily) repetitive or bland aesthetic environments usually focus on an alternative reward system for exploration and advancement. Consider some examples of non-visual player reward systems:

  • Early FPSs with low graphical capabilities (Wolfenstein, Doom etc.): reward is the novelty of the introduction of new gameplay mechanics (i.e. use of new weapons, enemies etc.)
  • Action RPGs (Diablo, World of Warcraft): the game is player-character-centric; therefore “levelling up” of one’s character is the reward as it records effort and time spent through a numerical syntax. That is, the player character itself becomes a work in progress and a motivating factor for continuing play.
  • First-person horror thrillers (F.E.A.R., Cryostasis, Amnesia: The Dark Descent, Doom 3): reward is emotional stimuli, i.e. the momentary cinematic shock and awe of the paranormal encounter.

Myst and its progeny are notable, however, for taking the visual reward system to its logical extreme. More specifically, here it has become a self-consciously parodied trope that is an integral part of the game world and backstory. This occurs through the plot device of “linking books” that are the only means for players to travel to and from in-game levels (Ages). In the Myst lore, Ages must be “written” by authors (read: surrogate level designers). Learning to write Ages is a closely guarded art which takes years to master. Each Age has a distinct atmosphere bearing the mark of its creator, composed of any number of things: a uniform colour palette, the use of recurring geometric shapes, methodological consistency of puzzles, a polarisation toward technocracy or the pastoral, and more often than not a corresponding textural identity in the soundtrack vis-à-vis unusual or unique instrumentation.

Myst is unique in that it represents the characterisation of environment. This is a game where levels are given veritable personalities in their own right. When players achieve the objective of unlocking an Age by locating or activating the Linking Book, they are motivated by the promise of a unique aesthetic reality at least as much as that of discovering new plot details or the simple sense of achievement generated by the completion of an intellectually challenging task.

Here the game posits entire worlds as singular acts of authorial self-indulgence. Myst’s “Ages” are self-contained works of art within a single product. The basis for this probably lies in the series’ origins; Myst was at the cutting edge of 3D rendering technology when the original game was released back in 1993, and so the pressure of expectation has weighed like a poorly designed diving helmet on each subsequent entry in the pantheon to push the envelope in terms of graphical sophistication. And even though the developers have concurrently fleshed out the Myst universe and given it a sense of identity beyond that of a glorified tech demo, there continues to be a lingering perception of the series as a river barge for a comfortable tourist ride up the Amazon of fantastically surreal extraterrestrial worlds. But in this regard, it’s an interesting example of a game franchise that taps an often marginalised or underplayed function of games in general – transporting us to simulations of places we could never actually visit IRL, rendered with astonishing depth and attention to detail.

Thursday, January 27, 2011

Black Ops (cont’d), and why release dates are as meaningless as a Yellow Pages in Klingon.

Time for a proper review of Call of Duty: Black Ops’ single player. (If you've read my other posts, you've probably figured out that I don’t do multiplayer.) Following my illogical and somewhat off-the-rails outburst in the previous instalment on the subject I feel compelled to advise my loyal reader base, however depressing a sum total that might be, that I did not in fact jam up my computer’s fan with a donut as punishment for it refusing to run aforementioned game properly and in fact have subsequently had the opportunity to do a complete run of the story mode. And I would just like to note at the outset that I did almost forgive the game its shortcomings, upon which I shall soon expand, for the line, “Your president needs slugs!” delivered by the dev team’s resident JFK ghost-writer in the post-credits zombie mode. Timeless.

All the standard-issue set-piece cinematic “wow” moments that have defined the high points in the Call of Duty franchise are there, as you’d expect, and similarly it’s during these that Black Ops has its finest moments. Pelting parkour-style across the virtual rooftops of Hong Kong, dodging sniper fire on all sides is as breakneck-exciting as it sounds. But it’s when the game stamps its foot and locks you into doing something an FPS isn’t designed for, like driving a motorbike while firing a sawn-off shotgun (Terminator-style) or hammering open a submerged helicopter door, that problems begin to arise. These scripted events are occasionally somewhat haphazardly or awkwardly implemented (one section where you guide a stealth bomber off the runway is particularly abstract/pointless).

Moreover, you can’t shake the feeling that there are basically just too many of them. The game seriously tests the boundaries of interactivity by yanking you unceremoniously out of the action every five minutes, whether it’s to impart some new tidbit of knowledge unravelled by your mysterious interrogators or to ask you to press Space to abseil down a cliff. This naturally leads to the overall impression that there’s not really a great deal for you to do in this game. It’s like showing up to audition for a movie on the second-last day before filming wraps, and rather than strike you off altogether they somewhat grudgingly re-write the script to include an extra character who, I don’t know, has to push the button to detonate the bomb or something. The outcome is already pre-determined; you’re just required to press the button as a token gesture to interactivity. Because of the sheer number and implementation of its predetermined scripted sequences, I’m leaning toward the interpretation that Black Ops is not so much a game as first-person cinema masquerading as a game.

The game also has some pacing issues. After a gleefully enjoyable first couple of missions the campaign was let down by a run-of-the-mill midsection and especially lacklustre finale littered with clichés. See, after watching the fairly excellent 21 Grams the other day I was growing attached to the idea that it wasn’t actually possible to make a nonlinear narrative that was boring.

But in Black Ops, the back-and-forth mission structure that allows Treyarch to take us on a heavily stylized highlight reel tour of the various historical conflicts and flashpoints of the Cold War also works splendidly to defuse any narrative tension that ignites, more effectively than an automated sprinkler system at a pyromaniac convention. Before we can actually muster up any inclination to care about whichever life-and-death cliffhanger situation Mason and his pals have been left… erm, hanging in, we’re whisked away to a different time and place, either years before or after the point where we have just been biting our nails into nonexistence. Annoying and then some.

See, this mightn’t have been such an issue if the various character sub-plots (i.e. Mason, Hudson) had been taking place simultaneously within a localised time period and had progressed in something approaching chronological order, but the apparent modus operandi of jumping back and forth wherever and whenever the explosions are results in a truly migraine-tastic sense of dyslexia. Note the word LOGICAL in CHRONOLOGICAL. Are you paying attention, developers?

The entire experience reminded me of reading a Choose Your Own Adventure book where whenever I got an ending I didn’t like I’d go back to an earlier story branch and pick page 34 instead. (Always 34.) While this way meant I got to see all the different narrative possibilities (which I would have been able to do anyway if the book had just progressed as a linear sequence of events) it also meant I had to mark fifteen different pages at any given time. Thus the first casualty of the “scraping plot elements off the linoleum of the creative ether and compressing them in a Powerball machine” mode of narrative was a shortage of available fingers making the physical task of reading rather annoying. The second was all sense of what the hell was going on in the narrative or why I should care. I was just reading words.

But at least in the above example it was somewhat justified, because I was reading a Choose Your Own Adventure book. I knew beforehand that what I was getting was a non-linear narrative, so I’ve really no right to complain. But I do have a right to complain about you, Black Ops. The time period is perhaps the most ambitious the series has attempted to cover, due to the need to create an entirely new set of game assets (textures/models/sounds etc) for each stop on the Cold War highlight reel tour, and I must admit I was impressed by the lengths to which the developers went to keep things historically accurate. But really, was it worth it for a narrative structure that basically shot your own game in the foot?

Something else that nonlinear narrative done badly achieves is to completely nullify the foundation of a three-act dramatic structure. This is probably partially responsible for the game’s finale being so weak (and predictable) and the midsection having a tendency to meander a bit. The story reaches Vietnam and effectively bunks down there for the majority of the rest of the campaign, using it as a springboard to jump back and forth between time periods as you investigate further into what the Soviets are doing behind NVA lines. Sure it’s fun to appreciate the minutiae of references to various classic Hollywood depictions of Vietnam (The Deer Hunter, Apocalypse Now to name a few) but I really felt this was one of the clearly identifiable areas where the pacing got bogged down.

Not that the gameplay doesn’t achieve this with flying colours on its own, as the regenerating health/autosave system the series has embraced with open arms since Call of Duty 2 naturally lends itself to replaying the same fricking area 25-odd times as you attempt to cross a wide open space without getting shot to pieces, or blasted into a low-earth orbit with that mysterious exploding bulldozer that a second ago was just sitting there innocently apparently waiting for you to advance within a five-metre proximity of it. I found myself particularly dreading the large open-area fights that were scattered at intervals throughout the level design. Nine times out of ten Mason’s reason for kicking the bucket was a stray bullet from someone I couldn’t even see. 

And then there’s the super-size serving of glitches. This is the most unpolished CoD effort I’ve seen. I understand the pressure to meet release dates, particularly with a big-budget blockbuster like Black Ops, is greater than ever, but at the same time those of us who constitute the ever-dwindling minority of the PC gaming market would like to actually be able to play the damned thing we’ve just paid $90 for rather than just admire the pretty box with its enigmatically shadowed dual-pistol wielding Batman impersonator. Are we reaching the edge of some slippery downhill slope where it’s acceptable to ship a product which is effectively still in beta? Where downloadable patches and content become a band-aid afterthought for doing a shoddy job with the initial release? It’s as if the line between internal and external playtesting has blurred to the point that developers now apparently expect the players to find out what’s wrong with their game.

To be fair, this isn’t necessarily a bad thing as you’re likely to get a far broader indication of potential glitches when a product is released out in the community than within the limited parameters of the internal playtesting environment. But you have to draw the line somewhere. Namely, at problems that render the software unplayable, as was rather incisively documented in my previous article on the subject.

So, to summarise: disjointed and pedestrian story, clunky scripted events, occasionally thrilling gameplay moments interspersed with frustratingly repetitive gunfights and a road-train-sized truckload of bugs. What can you learn from all this, Black Ops? Well, I'd appreciate it if you'd start by giving us some consistency in narrative context. Please. There’s a reason time goes forward, not backward!

Sunday, January 23, 2011

Grey kingfisher.

This time the film that I saw on the weekend was Black Swan. I knew virtually nothing about it, but the damned thing kept popping up in my Facebook feed so often that I couldn’t help but arrive at the deduction that my own personal film critic homunculus was trying to tell me something. So I took the hint, rang up a friend and barrelled off to see the aforementioned stretch of celluloid. And let me start by saying that Darren Aro-I can’t-possibly-pronounce-the-remaining-few-syllables’ psychological ballet thriller Black Swan is an arty, arty film. In fact, it couldn’t possibly be artier if it sellotaped itself to the back of a Rembrandt and had itself smuggled into the Louvre by a disaffected professor of 19th century humanist literature dressed in a beret and a coat made entirely of paintbrushes.

This isn’t to say it isn’t watchable. It’s just that it wears its artiness so self-consciously on its sleeve that it may as well be shouting, “Look at us, we know how to make a movie cleverly! Critics of the world, cough up the stars”, with every ominous crystalline chime of the soundtrack that accompanies Barbara Hershey’s tyrannical mother bursting in on Natalie Portman’s darling daughter in the middle of doing something embarrassing. Which, incidentally, happens more often than you’d think. But that’s enough about that.

It’s shot in grainy desaturated colours that turn highlights to brilliant ivory white and cast shadows around the unknown edge of the screen, playing light off dark in a somewhat too-neat visual parallel of the struggles in its heroine’s inner psyche. Only when Portman’s Nina ventures outside the world of the theatre and into the New York nightlife do more vivid colours begin to dominate, and then only for a very brief period. Then there’s the alternately grandiose and minimalist soundtrack. And, of course, the surrealist nightmare episodes that offer insights into the mind of Nina as it cracks under the pressure of her own psychological metamorphosis.

If you told me beforehand that Black Swan was going to include freaky Japanese horror-movie F.E.A.R.-esque sequences, I’d reply that I’d be interested to see how they’d fit in without seeming artificially grafted on, and probably also mention something about how it would be kind of hard to achieve that. Yes, I just compared this self-consciously super-arty film to a first person shooter game series, go and have a cry about it and try and drown me in the tears if you require a topically relevant means of exacting revenge. But they do work and it’s because they segue effortlessly into a cinematic sea of relentless tension, where you’re never quite at ease. The predictable main arc of the storyline is the only comfort the audience is allowed in a film where anything could happen next, one which – like the ballet itself - is so heavily stylized that everything or nothing that occurs might be real or just a figment of the heroine’s imagination.

Sure, there are a couple of cheap shocks. The moments where Nina spins round and the camera lurches across to reveal – surprise, someone standing right behind her, usually Mila Kunis’ deadly rival in the main role stakes, or her mother – didn’t sit too well with me. They’re squatting there self-impudently on the border fence between being overdone and staying just under the quota. But the actors are so good in their respective roles that we’re sufficiently immersed to let this kind of potentially awkward stylistic transition pass without noticing. And there are even a few instances of subversive humour, so low-key as to be non-existent, like when one of Nina’s fellow stars clad in full monster costume shambles past her and mumbles an affable “Hey.”

So what else can I say about it? Hmmm. In a sense, Nina’s transformation into the Black Swan mirrors the journey of an adolescent into full maturity through the discovery of art. When we first encounter her at the beginning of the film, she’s still a child; naïve, sheltered, innocent, repressed. The sexuality, self-expression of adulthood: both have been swallowed wholesale in the yawning singularity of discipline enforced by her mother. Nina’s allowing the Black Swan to flourish within herself is akin to reaching a belated maturity – or perhaps just unlocking a part of herself that has remained latent. She knows the choreography, the technique perfectly, with mathematical precision; but true feeling is unknown to her. Learning to exert control over one’s emotions is a skill that is generally associated with growing up – but recast in this light, the film seems to suggest that learning when it is appropriate to relinquish control over emotion can be just as much a part of reaching true maturity.

So on one hand, maybe it’s essentially a tale of growth. But the Swan Lake subtext seems to suggest that both the child and the adult can coexist within one person, and be adopted at a moment’s notice according to whatever the situation calls for. Like actors on the stage, we too can switch wholesale back and forth between innocence and maturity as we wish. This is a film full of chameleons.

Friday, January 21, 2011

Sunshine and daisies and rainbows

Ok, Black Ops: I'll cut you a deal. You stop locking up halfway through the first fricking mission, and then maybe I won't go play Medal of Honor: Airborne instead. Not out of any desire to play that aforementioned piece of pectoral-punching hyperpatriotic crap, mind you, just a bloody-minded and in all probability completely futile desire to piss you off to the most distant horizons of my potential in this department by voting with my two rational choice-making-agent-of-free-will-consumer feet. Or in this case, the WASD keys.

Alright, that rather vitriolic opening was uncalled for. Well, at least 50% of it. Sorry. Back to nice me now. Technically, Medal of Honor: Airborne is actually a halfway decent game, for a comparative relic from (gasp) 2007. And I did have a bit of fun playing it. Notably (for the purposes of this post), the myriad of cinematic flourishes I found scattered within the single-player campaign like Fantales on the linoleum after a piñata massacre smacks of a post-Call of Duty: Modern Warfare FPS environment. And it’s because of this that I’m left suitably confuzzled. Despite the collective of similar gaming franchises doing their best to move with the times I thought it unlikely that the thematically stubborn MoH would ever be prised loose from its firmly concreted, Saving Private Ryan-Band of Brothers-heyday gung-ho-patriotic origins.

The reason for my thinking this has a lot to do with the franchise’s age and the fact that it’s been relegated to the margins of FPS gaming for so long under the all-consuming Call of Duty reign of popular acclaim. It’s like the MoH franchise went into hibernation for five years after the release of Allied Assault, popping up to offer only the occasional token bad entry like a half-choked snore before waking up properly when the alarm went off in 2007 only to discover that notions of what an FPS was had changed considerably. The entire WW2 fad-phase in first person shooters had well and truly come and gone to be replaced by “Modern Warfare” as the new bullet-scarred hellhole to be in if you wanted to make it on the modern FPS scene.

Perhaps it’s for this reason that Airborne seems almost nostalgic and quaint for 50% of the time you’re playing it. The other 50% of the time is spent reminding yourself that this is not in fact another Call of Duty game. The choose-your-drop-zone gimmick is occasionally cool (i.e. when you manage to pull off an obscenely difficult skill drop like parachuting in through a window to land on top of an unsuspecting French peasant’s ice cream cake) but ultimately of marginal use. We also find ourselves with a Gears of War-style motion blur effect whenever you sprint (sprinting itself being one of many Call of Duty/Halo-esque mechanics to be found within, along with grenade warning indicators, regenerating health, iron sights, gun attachments, nonlinear gameplay and multiple objective sites). To level the charge at a WW2 shooter that it’s generic is like accusing a continent of being too big to conveniently steal, but then a game like, say, World at War manages to duck that particular spinning razor blade of accusation by a half-millimetre and just avoid getting its Padawan braid sliced off at the knot. Airborne feels like the average of every WW2 game that’s come before it, and carries an appropriately average level of fun.

More recently, the MoH franchise even loyally followed CoD into the controversy mosh pit by allowing players to select the Taliban as one of the two sides in its multiplayer mode for the more recent modern-warfare themed, generically titled Medal of Honor (2010), although unlike CoD they ultimately bowed to pressure from a variety of angles and replaced it with the necessarily ambiguous title “Opposing Force”. It’s somewhat baffling, although I must admit I’m not particularly emotionally invested in the matter, to see MoH exhibiting so many similarities to CoD. It’s a case whereby the forefather has been eclipsed by its tackier, prodigally hedonistic digital offspring.

Whatever. I suspect, although I could be completely barking up the wrong tree here, that where I’m heading with all this is to say that Call of Duty is overrated. Where the chief pitfall of the Medal of Honor series is to indulge itself with chest-thumping patriotism, the Call of Duty franchise revels in violence for its own sake; violence as sheer mindless spectacle.

Mind you, I didn’t always hold that view. The series’ major watershed moment was the release of Modern Warfare, and the subsequent branching off into a sub-franchise that it entailed marked the beginning of a downward spiral for a previously well-regarded name in first person shooters. That isn’t to say that the signs weren’t there in Call of Duty 2 – a number of bad habits were apparent in that game that would grow into ever-more detrimental issues with the iterative repetition that each subsequent entry in the series demanded. I won’t start rattling on about regenerating health, wave after wave of respawning enemies or lack of a quicksave key, but I do think some questionable design decisions were made in Call of Duty 2 from a purely ludic perspective that would funnel the series into the spectacle-oriented gameplay standard that it ultimately carries into the epic battle for our consumer dollars today.

Even putting this aside, there are at present some dire problems with the CoD series that need to be addressed, and the largest and most glaringly obvious of these is easily summarised. The series has lost the freshness that made the original Call of Duty such a hit. The stock-in-trade cinematic set-pieces have been taken to such an epic scope and scale, and overdone to such an extent, that nothing can impress us anymore. From the moment I started playing Black Ops, I couldn’t shake a feeling of déjà vu that has been the defining atmosphere of every single CoD game I’ve played since Call of Duty 2. It’s a sense of the generic. Every single mission is the same, just with slightly different shades of window dressing.

It’s compounded by some very fundamental realities of the series, which are not necessarily specific to Black Ops. At the basic level, these games are about shooting people. People; virtual people to be sure, but there are no robotic alien microwave death turrets or baked-bean-powered cyborg zombie stegosauruses to be found here. When you distil it down, the range of actions the player is able to perform is so one-dimensional as to be telling. You can move, sprint, jump, go prone, shoot, throw grenades, perform melee attacks. That’s about it. And every task you complete serves the ultimate objective of being able to advance to the next area so that you can shoot more people. Albeit in slightly different settings, or wearing slightly different uniforms. Or in slightly different ways, such as using thermal imaging to strafe enemies from an aerial gunship. The essence of war is violence, and war is the essence of this game series. It takes precedence over plot, gameplay or, somewhat paradoxically, even realism. The end result is more than a tad mind-numbing.

Granted, occasionally there’s a car, boat, airplane or helicopter chase thrown in just to mix things up a bit. Indeed, the promise of that next big blockbuster cinematic scripted sequence just around the next corner was the only thing that prevented me quitting to Windows every time Mason decided to eat paving stones when playing Black Ops (which, incidentally, happens quite a bit whenever I play a CoD game as I invariably select Hardened difficulty or above, each time illogically reasoning to myself that I’m getting more of my money’s worth that way. I never learn.) The repetition is utterly maddening. The repetition is utterly maddening. The repetition is utterly maddening. The repetition is utterly maddening. The repetition is utterly maddening. The repetition is utterly maddening.

At least within the WW2 setting, there was the basis for a claim that Call of Duty had some value as a historical simulator, even if only via a very stylized rendering. But with the shift to fictional real-world modern scenarios (not to mention the Frederick Forsyth ’60s spy-thriller-type pulp material that the Black Ops plot appears to have been hewn wholesale from like an eco-friendly tea towel) the series has trodden further and further outside the boundaries of realism into the smelly dank jungles of sheer mindless spectacle.

With Modern Warfare 2, perhaps the most spectacularly brain-dead entry in the entire series - epitomised in the “No Russian” mission and its aforementioned attendant controversy surrounding the massacre of virtual citizens in an airport - the mandate of violent spectacle above all else has reached its logical zenith. Yet curiously, for all the controversy, there was very little actual outcome, but plenty of noise was made (read: spectacle). Not exactly of the violent variety, but. The phrase “life imitating art” is one I’m often a little too eager to trot out of the stable like a prized show pony with a jewel-encrusted saddle but I think it’s justified here. Though somewhat downplayed in comparison to its predecessor, Black Ops attracted its own fair share of the magic C-word because of its opening mission where the player is tasked with the assassination of a living real-life figure (Fidel Castro). I hope we’re not seeing the beginning of a trend. Spectacle and violence. Where does it end? (Oh look, I made it rhyme.)

None of the above really matters, though. Activision have found a winning formula for success, and it’s unlikely they’ll mess with it too radically in the next instalment of CoD, wherever or whenever that may be set. But at the same time, just as species evolve or get eaten, the series needs at the very minimum to keep pace (in both ludic and narrative arenas) with its competitors in the FPS marketplace. Which makes me think that Medal of Honor may well have just been on something vaguely resembling the right track after all.

But pulling a U-turn at the lights and screeching back in the opposite direction to Call of Duty: Black Ops. In the meantime, I just wish the blasted thing would stop bringing the fireworks show screeching to a halt just as I’m running through Cuban cane sugar fields in the middle of a CIA bombing raid. Does it have some aversion to sucrose, I wonder? Is my computer trying to help me lose weight by systematically censoring all references to junk food? Stupid thing, we’ll see if a Krispy Kreme shoved in your fan improves your attitude any. (Don’t look now, but it looks like not-nice me has returned with no particular aplomb.) That should make you eat your words. Or your digits, rather. Your 1s and 0s. Your binary. Yes. Tasty binary code…