Friday, March 30, 2018

The Saudade of Blade Runner

Blade Runners, models 2019 and 2049 (Harrison Ford, Ryan Gosling)
In an interview with The Hollywood Reporter, Dutch actor Rutger Hauer recently gave his reaction to Blade Runner 2049. Hauer was one of the principals of Ridley Scott's 1982 film who was absent from the sequel. His answer perhaps reflects why he never joined the reunion, and is worth quoting in full:


I scratch and sniff at it. It looks great but I struggle to see why that film was necessary. I just think if something is so beautiful, you should just leave it alone and make another film. Don't lean with one elbow on the success that was earned 30 years ago in the underground. In many ways, Blade Runner wasn't about the replicants, it was about what does it mean to be human? It's like E.T.. But I'm not certain what the question was in the second Blade Runner. It's not a character-driven movie and there's no humor, there's no love, there's no soul. You can see the homage to the original. But that's not enough to me. I knew it wasn't going to work. But I think it's not important what I think.

Hauer was never alone in his doubts. When plans for the sequel were confirmed in 2015, social media wasn't altogether jazzed about the prospect; the evidence of Ridley Scott's uneven Alien prequels left many fans skeptical that a new Blade Runner would succeed. Reverence for the original prompted a kind of dread. Like Hauer, they wondered "why is this necessary?"--an anxiety only partly resolved by Denis (Arrival, Incendies) Villeneuve's attachment to the project. 
For most, the arrival of 2049 has retired that discussion. The critical response has handily exceeded that of the new Alien films. In retrospect, Hauer's appraisal has the rare rhetorical distinction of being absolutely true in all its parts, yet wholly false (including the proposition "It doesn't matter what I think"). Demanding that films be "necessary" is of course a category error--no movie is "necessary", strictly speaking. But along with a look and a vibe, Blade Runner 2049 does have a "question". It is admittedly less explicit than the original's, though, and needs some archaeology to uncover it.
The original Blade Runner emerged from the cultural context of the late 1970s--a time that, in the "one week seems like a year" climate of the Trumpian era, feels as remote as bardic Greece. America was deep in the throes of its post-Vietnam, post-Watergate funk, with stagflation, gasoline crises, and a cardigan-wearing President who came to personify an age of diminished expectations. There was an efflorescence of "national declinism", prompted by the brute military muscle of the Soviet Union (having invaded Afghanistan in 1979), and the emergence of Japan as a serious industrial and technological rival of the US. Science fiction was exploring the territory lying between the progressivism of the Arthur C. Clarke-Isaac Asimov mainstream, and the Frankensteinian nightmares of pulp. The synthesis, soon to be called "cyberpunk", envisioned a future where (at risk of oversimplification) neither the wonders nor the nightmares predominated, but existed simultaneously. In that sense, our futures very much resembled our checkered present--or any other era.
Blade Runner precisely embodied all these developments and anxieties. While harkening back to familiar tropes of movie sci-fi (towering cities, flying cars, homicidal robots), it presented a future that was thoroughly Asian in culture and person, and in a way so incidental that it requires no remark. Of course Asia will colonize the future, it seemed to shrug. Isn't that writing already on the wall?
The urbanization of the world was well underway by 1982, so Scott's vision of 2019 is all about the city. Though ostensibly set in Los Angeles, there is nothing plausibly LA about the place. It is really post-war New York. It is Metropolis, and Gotham City, with some London weather too. It is dense and glowering, but it also glitters with a Tokyo-esque affinity for neon. In its density, in the way its towers reach up and out of the darkness, there is an element of glamor.
Like much of what is now called film noir, it presents a twilit world, a place of compromised ideals, that nonetheless gestures at a lighter, purer sphere that still exists somewhere, offscreen, "off-world". In 2019, we hardly see beyond city limits, so it's still possible to imagine there's a normal landscape beyond them. We get an accidental evocation of it at the end of the film, in a scene reportedly tacked on at the insistence of the studio: there, like a just-married couple on the highway to suburbia, Deckard and Rachel drive into a sun-dappled countryside somewhere "up north".
2049 punctures that hope irrevocably. In K's flyover on his way back to the police station, we see the only hints of the glitter of 2019. The flashy avenues are just thin facades between vast, low-rise blocks with barely a light burning. We imagine either thousands of people shivering in the dark, or swathes of depopulated cityscape haunted by vagrants and scavengers. What lies beyond city limits isn't left to the imagination: to the south, there's a garbage dump that was once San Diego. To the west, a dead ocean. And to the east, the bones of nuked Las Vegas, choked by conspicuously unnatural orange dust.
Villeneuve's world is colder, more brutal. That sheen of drizzle has frozen to snow. The Asian patina is muted; there's an element of Russian too--and not just Russian but Soviet (evidenced by the film's use of Budapest locations, and the "Soviet Happy" holographic ballerina pirouetting above Bibi's Bar). It's as if Villeneuve conceives future California to resemble Siberia, or some kind of ticky-tacky Russian research outpost. If the world is on a knife edge in 2019, by 2049 it has been shoved over.
The films' respective anti-heroes mirror the cosmic decline. Rick Deckard is a conscious evocation of the gumshoe literature of the mid-20th century. His name vaults off the tongue like “Philip Marlowe”, “Sam Spade”, “Joe Friday”. Despite years of debate in the fandom, there's no firm evidence in any of the various versions of the film that he is anything but human. (For argument's sake: if he is a replicant why is he tossed around by every other replicant in the film? Are we to believe the cops assigned a particularly puny replicant to the dangerous task of hunting down other 'skin-jobs'?) Deckard, the flesh-and-blood private dick in a trench coat, is clearly a figure out of the past, a fulcrum of familiarity in a jarring setting.
“Officer K”, on the other hand, doesn’t even get a real name. Like most of the film's 'synthetic' characters (“Joi”, “Luv”), he gets only a truncated name, as if bits of orthography are a luxury resource, to be dribbled out sparingly.
Instead of the classic gumshoe, K's literary antecedent is the character “K” from Kafka's novel The Castle—a character who does not inhabit his environment so much as endure it. The direct inverse of Deckard, he is physically unbeatable but temperamentally passive, meekly submitting to the taunts of the other (human) cops. In what passes for the "love scene" in 2019, Deckard bullies Rachel into confronting the reality of her desires. In 2049, K is propositioned in his apartment by Joshi, his female boss—and barely summons a response.
At the end of the film, K dies alone on the steps of Ana Stelline's lab as Deckard rushes inside. Unlike Roy Batty, K gets no one to watch his tears wash away. We wonder, would it have killed Deckard to pause a moment to bear witness to the death of the guy who saved his life? It seems like a prime opportunity to give the film the "heart" Hauer and others complain is missing. But Villeneuve declines it. 
The films are also at odds in visual language. 2019, as captured by Jordan Cronenweth, was as close to black and white as a color film could be. Deep blacks were split by livid, god-like spears of light. Lens-flare was not just used expressively, but almost as a sacrament, as when it sanctified Rachel's tears after the fatal encounter with Leon. The look was so distinctive that NYU film students in the 1980s spoke reverently of achieving "Blade Runner light" in their theses.
The light in 2049 isn't livid and it isn't sanctifying. It always seems tea-like, tainted, like used dishwater. Even in Niander Wallace's eyrie, Roger Deakins makes the light indirect, baroque in its busy refraction. Some suggest that, in this future, access to clean light and air are matters of privilege, as if only rich people can afford them. But this light isn't just dirty--it is dissolving. Everything and everyone, rich and poor, human and replicant, seems to be fading, disintegrating. 
2019 is about how the future is like the past. 2049 says the past is dead and will never return. 2019 is cultural nostalgia. 2049 is a more profound savoring--not just pain, not just pleasure--of people and things permanently lost. Cognates of this yearning have been felt and named in cultures all over the world. In Portugal and Brazil, it is saudade, a melancholic state variously described as "the presence of absence", and "a pleasure you suffer, an ailment you enjoy" (Manuel de Melo). In Japan, it is mujo-kan, a heightened sense of the world's impermanence. In Bosnia, the folk musical genre of sevdah is characterized by such "missingness". In Istanbul, it is huzun, a sense of longing for something one cannot exactly name. Appearing five times in the Koran, the word was taken by the Sufis to connote remoteness from God. Orhan Pamuk likens it to "the emotion a child might feel while looking through a steamy window." There's more than a little 2049 in the way he evokes it:

To feel this huzun is to see the scenes, evoke the memories, in which the city itself becomes the very illustration, the very essence of huzun. I am speaking of the evenings when the sun sets early; of fathers under street lamps in the back streets returning home carrying plastic bags. Of the old Bosphorus ferries moored to deserted stations in the middle of winter...of the simit vendors on the pier who gaze at the view as they wait for customers; of everything being broken, worn out, past its prime; I speak of them all.

In a future marred by cataclysmic climate change, and the conflicts and dislocations that will inevitably attend it, what will humans' collective sadness be called? What obscure sorrows will be coined to describe our feelings when we remember a gentler, more normal world we may never regain? 
2049 has no answer, but it poses the question.
The dream-like quality Pamuk describes, "like looking through a steamy window", at last gets us to the biggest "question" Hauer misses about 2049. Where 2019 asks "What is human?", the sequel is haunted by the fear that reality itself is lost, or unknowable.  Deckard and K both insist "I know what's real," the former when confronted with the replica of Rachel, the latter with the truth of his "memory" of the furnace. At Bibi's, Mariette realizes that K "doesn't like real girls"; when Joi is downloaded to the Emanator, she understands that she risks deletion, "just like a real girl". 
Perhaps the most ominous hint comes after the ménage between K, Joi, and Mariette; before she is dismissed, the replicant Mariette disparages the holographic Joi, saying "I've been inside you. There's not so much there as you think." On the question of what qualifies as a "real" person, we'd think that replicants would be more broadly accepting than the humans who discounted them. Joi may lack a physical body, but her programming almost certainly shares much with the software that runs the bio-mechanical replicants. Yet Mariette doesn't think there's much to Joi's humanity--a prejudice that fairly gushes with epistemic irony. If Joi overestimates her own "realness", who's to say the replicants don't also, or the humans too?
  The closest analog to 2049's questioning of lived reality is probably the Matrix films. When Morpheus speaks of the "desert of the real" with its "scorched sky", he might as well be describing 2049’s slow planetary unwinding. Yet while Villeneuve's treatment of this theme is more subtle, it is arguably more radical. The Matrix, after all, is a deliberate construct, a tool of political control.  It has an architect, an eminence grise patterned on the "father" of the internet, Vint Cerf, who by appearance might as well be God. Unlike Los Angeles in 2049, the Matrix is designed to be more or less a livable place. Moreover, Morpheus offers a way out of the illusion--a "red pill" that enables the hero to see the Matrix as a kind of game, with rules that can be bent. 
The sense of unreality in 2049 isn't the consequence of some game. It can't be "won".  When Joshi speaks of "keeping order" in a world about to spin apart, she is not giving voice to any architect, implicit or otherwise. In this, Villeneuve raises a prospect that is arguably more frightening than being at the whims of hidden puppet-masters: the terror that there is no one in control at all. 
© 2018 Nicholas Nicastro


Wednesday, August 16, 2017

Why I Lean Lannister


In the ongoing War of the Queens in HBO’s Game of Thrones, the choice seems obvious. Between the ambitious but fundamentally decent Daenerys (“Dany”) Targaryen, and the murderously scheming Cersei Lannister, the fandom is all in for Dany. The show's writers seem to be, too: fan-favorite Tyrion Lannister, who pointedly believes in nothing, declares “I believe in you, Daenerys Targaryen”; Ser Davos Seaworth, a voice for compassion in a kingdom otherwise soaked in blood, jokes about “changing sides” to Dany--in the very presence of his liege lord Jon Snow, in fact. Meanwhile, from the dour looks on the face of Cersei’s brother-lover Jaime, even he seems resigned that Cersei's reign over war-torn Westeros must end. 
      But the writers and the fans have it all wrong. Their mistake doesn't lie in the characters of the Queens themselves. It lies instead in the legacies of their families, and what they mean in the context of history--including medieval-ish, semi-plausible, made-up history. To pick wisely, the right question is not "Whom would you prefer to have a cup of mead with?" but "Which side is better for Westeros in the long term?" And the answer to that is--very arguably--the detested Lannisters.
      Exploring why demands a deep dive into fake history. In the canon of George R.R. Martin’s original novels, the Targaryens are the realm’s dynastic ruling house. Hailing from across the Narrow Sea, they conquered the continent using strategic weapons of mass destruction--giant fire-breathing dragons--to subdue six of Westeros's seven independent kingdoms (and exact the submission of the seventh). The Iron Throne itself was forged from the surrendered weapons of the Targaryens' enemies.
      For three centuries, they brooked no defiance. Whole castles, like the impregnable Harrenhal, were literally melted into submission. Targaryen power could easily be called "pharaonic" in the depth of its absolutism. Indeed, the Targaryens also resembled the pharaohs in their manner of succession--institutionalized incest within the royal house.
      The predictable climax of all this inbreeding was the reign of the “Mad King” Aerys II, who planned to put down a rebellion by incinerating the entire capital and everyone in it with chemical weapons (aka “wildfire”, a kind of medieval napalm). The tyranny of the Targaryens was nominally ended by the military victory of Robert Baratheon. But the real coup de grace was administered by the Lannisters, who became the power behind Robert’s throne. This is where Martin’s first novel picks up the story. 
      Unlike the Targaryens, who ruled by blood, the Lannisters are most associated with finance, political gamesmanship, and clever use of diplomacy. Their family motto is “A Lannister always pays his debts”--a useful trait when forging alliances. The paterfamilias, Tywin Lannister, is a gimlet-eyed master of realpolitik who does not bed his daughter Cersei, but marries her off strategically. His son Jaime, the “Kingslayer”, was the one who finally put old Aerys out of his misery. True, he’s sleeping with his sister. But the fact that he killed a king and survived attests to a simple yet profound political development: that no one, not even the monarch, is above responsibility for his acts. 
     These two clans, admittedly, are not ideal options. But like the election of 2016, these are the choices we have. Is it better to be ruled by a closed dynastic family whose absolute power is backed up by magic creatures only it can control? Or one where power is negotiated, shared, and contested? 
      In actual history, the trajectory is clear: Europe became modern because its versions of the Lannisters, not its Targaryens, ultimately triumphed. Instead of god-kings, the continent evolved into a system of multiple, mutually-suspicious centers of power. Those rivalries led to competition between kingdoms that sparked revolutionary changes in technology and culture. And it was because of them that Lannister-esque Europe soon surged ahead of global rivals like India and China (ruled, appropriately enough, from its own “Dragon Throne”). 
     No doubt, Dany is the more immediately appealing choice. But there’s also no doubt about what kind of system she would seek to restore. It might be a just world, but only because Dany opts to be just. It might be a peaceful one, but only because Dany alone possesses the ultimate weapons. There would be no guarantees about her heirs. They might well be more like the Mad King--or her insufferable brother Viserys, who was so stridently entitled he was killed by his own ally. Of the benevolence of kings, Thomas Jefferson was understandably skeptical: “Sometimes it is said that man cannot be trusted with the government of himself. Can he, then, be trusted with the government of others? Or have we found angels in the forms of kings to govern him? Let history answer this question.”
     The Lannisters, meanwhile, are a withered branch. All of Cersei's children are dead. Petty and ruthless as she is, she still lives in a world defined by politics, not magic. As soon as she can't pay her debts, she will pass from the scene. 
     The Lannisters’ plutocratic rule at least contains the promise of modernity. I can more easily see one of Cersei’s descendants, mortgaged and weak, signing a Westerosi version of the Magna Carta. The sons of platinum Dany, born to rule, backed up by dragons? Not so much.
      Faced with Dany’s forever dynasty on one side and Cersei's short-term tyranny on the other, the choice is clear. When the armies of Westeros clash next, I’ll be rooting not for the Dragon, but for the Lion.

Nicholas Nicastro's latest novel, Hell's Half-Acre, was published in 2015 by HarperCollins.
© 2017 Nicholas Nicastro

Wednesday, November 11, 2015

Spectre of the Gun

Bond (Daniel Craig) brings a plane to a car chase in Spectre.

   Spectre. Written by John Logan, Neal Purvis, Robert Wade, & Jez Butterworth. Directed by Sam Mendes. At local theaters. 

Most criticism of the James Bond franchise emphasizes how the series stays relevant by adapting to the times. And sure enough, in Spectre it officially enters the post-Snowden era, as our hero fights a Big Data cabal plotting to turn our planet into a digital panopticon. But what’s equally important are the ongoing legacies—the “product DNA”, as developers like to call it—that make the Bond films stand apart. While people have been predicting the demise of the franchise for decades, it won’t truly be dead until it is indistinguishable from those Mission: Impossible and Jason Bourne movies. 

In the nine years since Daniel Craig donned the Savile Row suit, the series has veered all over the map. 2006’s Casino Royale rekindled the class and excitement of the franchise, and remains the best. Quantum of Solace (2008) seemed barely a Bond film at all, while Skyfall (2012) seemed too earnest to recover the nostalgia Solace had jettisoned. Its other flaws and virtues aside, Spectre at last seems comfortable being what it is, and sees no reason to strain for anything different.

Director Sam Mendes (American Beauty, Skyfall) opens the film with a visually stunning sequence, set in old Mexico City on the Day of the Dead, that is heavy on sugar skulls and foreboding and the kind of sledge-hammer irony not approached since Roger Moore donned clown makeup in Octopussy (1983). Indeed, the dead cast long shadows here, as Bond goes maverick to fulfill a posthumous “contract” from his dearly departed boss (Judi Dench as “M”). 

The cause sends him from London to Rome to Austria to Morocco in a quest to stop a secret organization called SPECTRE from hijacking the world’s electronic intelligence. As in the scene here where Bond comes to a car chase with a large airplane, it all seems like a lot invested to modest effect. While this is the most expensive 007 movie in history, rumored to cost $350 million, it's hard to say all that money reached the screen. (Edward Snowden, by contrast, thwarted Big Data for the cost of a thumb drive and a ticket to Russia.) Mostly, it seems that the makers of Spectre spent hard not to fail, and as far as that goes, succeeded.

No Bond film can truly be judged without looking at the villain and the women, and in these departments Spectre is sturdy but not spectacular. Christoph Waltz (The Zero Theorem, Django Unchained) seems to fulfill his personal destiny to play a Bond adversary. And while he conveys a certain satiny menace here, he’s nowhere near as compelling as he was as the two-faced Nazi fixer in Inglourious Basterds. Nor does the film need the overwrought psycho-backstory that somehow implicates his character in all of Bond’s misfortunes since his orphanhood.

At fifty years old, Monica Bellucci (Matrix Reloaded, The Passion of the Christ) made headlines for being the oldest actress to be cast as a “Bond girl”. (The previous record holder was Honor Blackman, who appeared in Goldfinger at 38.) No doubt, Bellucci still looks terrific. It’s just too bad the writers forgot to give her anything much to do except tumble in the face of Craig’s savoir faire. While the Bond films give a good impression of worldly sophistication, Bellucci—if given the chance— could have delivered more than an impression.

Léa Seydoux (Blue is the Warmest Color) fares best of all as Madeleine, the daughter of one of Bond’s former adversaries. There’s nothing in Spectre as torrid as the Sapphic kisses in Blue, but Seydoux presents a combination of freshness and worldliness that is hard to resist. She also benefits from the possibility that Craig may not return to the series, in a way that won’t be spoiled here.

An essential part of that “product DNA” is Bond’s roots in the Mad Men era, when gender roles were first beginning to transition to whatever place we are going now. For women, the narrative is relatively straightforward—toward more freedom, more options. For men, the story isn’t so simple; while they are also promised more options, it isn't clear those are ones most men would want to take. Not a few of the men who come to see Spectre might spend their days pushing strollers, keeping house for their wives, not exactly sure what a “dirty martini” really means. They’re told—and perhaps even mostly believe—that what they have is “freedom.” But they wonder.

 Bond embodies the old virtues of strength and purpose and lack of debilitating self-consciousness. (In Spectre, when Madeleine asks how he copes with his life, “…living in the shadows? Hunting, being hunted? Always alone?”, Bond replies “I don't stop to think about it.”) Old virtues, but also divisive, because they are lately presented as brutality, narrowness, and emotional inertia. As sex roles converge, perhaps 007’s larger cultural role is to keep those old virtues on ice, ready for when the times demand a different kind of man. 

© 2015 Nicholas Nicastro

Tuesday, October 6, 2015

Blues for a Red Planet

Saving Matt Damon in The Martian.

★ ★ ★ 1/2  The Martian. Written by Drew Goddard, based on the novel by Andy Weir. Directed by Ridley Scott. 

About twenty years ago, the late Carl Sagan, during a speech at Cornell, likened some natural phenomenon to “coupled differential equations”. When the students hissed at this reference to their math homework, Sagan remarked, “Why all the hissing? One day differential equations will save your life.”

Ridley Scott’s The Martian is the fictional realization of Sagan’s prediction. Based on the bestselling page-turner by Andy Weir, it concerns Mark Watney (Matt Damon), an American astronaut who accidentally gets left behind on Mars after an emergency evacuation by his crew. Literally the only man left on a barren, freezing planet, Watney has three months of supplies but a three-year wait before a rescue. After making the conscious decision to try not to die, he declares his only option: “I have to science the shit out of this.”

  And “science the shit out of it” he does. Weir’s book is a throwback to the heyday of “hard” science fiction—the kind that reveled in the physics and chemistry of its stories. In it, we learn how to make water from jet fuel, explore the botany of raising crops in otherwise lifeless soil, and follow the astrodynamics of sending humans and machines to other planets. That Weir makes all these technicalities not only palatable, but gripping, is a real triumph.

For the movie, director Ridley Scott (Alien, Gladiator, Prometheus) is wise to stay close to Weir’s winning formula. But the film is better than the book in one important respect. Weir is not an accomplished prose stylist, and The Martian is his first published novel. While engrossed in the story, this reader sometimes had to remind himself it was set on an alien planet, because Weir offers precious few details on how it actually looks and feels to be on Mars. Now that we are roving Mars by robotic proxy, it’s not enough to be vague about all that. Mars is no longer just an astronomical object—it’s a piece of turf, as real as Yuma or Marrakech.

  What’s the sky like there? The stars? The sunset? What’s the sound of Martian wind from inside Watney’s lab? What’s it like to get all that fine Martian dust in the crotch of your space suit? Such are the kind of impressions that make a story vivid, and alas, they seem beyond Weir’s skills as a storyteller. Visual flair is in Scott’s wheelhouse, however, and The Martian is replete with the kind of details that make Mars come to life as the implacable antagonist it is supposed to be.

        No surprise that Matt Damon is a fun and appealing choice in the lead. As written, Mark Watney might as well have been designed for Damon to play. Other casting choices—such as Jeff Daniels as the Machiavellian NASA administrator—feel predictable. That a woman with the apparent youth of Jessica Chastain would be selected as mission Commander is hard to swallow, however. More likely a far more seasoned astronaut—female or male—would be chosen for that job.

  The Martian is hard science fiction, but it strains suspension of disbelief in one respect. The premise that the United States of America, in what seems to be only a few years' time, would actually agree to fund the manned exploration of Mars is utter fantasy. In a time when the US Congress can’t agree to fund the government for more than a few months at a time, and a leading Presidential candidate declares evolution and the Big Bang theories “of the devil”, it seems almost quaint to expect our fractured nation to back a program of pure exploration that would last a decade and cost hundreds of billions of dollars. That, of course, would take patience, and vision, and the kind of trust in Big Science that we clearly lost a long time ago.

  More likely, a real “Martian” would not be named “Mark”, but “Chen”. And it would not be the Chinese space agency lending a small assist to NASA, but the other way around. That is, unless somebody finds a way to “science the shit out of” our national plunge into stupidity. 
© 2015 Nicholas Nicastro






Monday, January 19, 2015

The End of Criticism



In his Gettysburg Address, Lincoln predicted "The world will little note, nor long remember what we say here…" It's hard to know if he really believed this, but there's no doubt it applies to critics. Of all the professions disrupted by the internet, professional art criticism was an early casualty. With their profit margins under siege, many newspapers and magazines simply eliminated their critics. Once, every daily newspaper employed at least one full-time movie critic from the local community. Now most just print reviews from syndicated, out-of-town "celebrity" critics—if they print them at all.
          In place of authentic, local voices, most people now get their impressions of new films from social media, or from sites like Rotten Tomatoes and Metacritic, which aggregate "yea or nay" verdicts from users, bloggers, and columnists. The vast majority of movie bloggers proffer their precious insights for free. Netflix claims to have an algorithm that will predict whether its customers will like a piece of "content". Such data-driven approaches have their uses—I've used them myself to scope out which films might be worth seeing, week to week.
          But no algorithm can do actual criticism. None can ferret out and discuss what might be interesting themes, or read subtext, or place a film in its particular political, social, or artistic context. None of them have a sense of history. For those things, we must resort to individual experts.
          Alas, the rise of the internet—of "user-provided content"—has meant a decline in public regard for experts of any kind. For many, the act of googling a subject makes anyone feel qualified to argue with much, much more informed people who may have devoted their whole lives to studying something. On the serious side, this has deeply obfuscated public debate on issues like health care and climate change. On the less serious, it has reduced art criticism to something like the human tailbone—a vestige of life in a bygone environment.
          All this is reflected in lack of public engagement. When I was the weekly critic for The Ithaca Times from 1985 to 1990, it was common for my editor to receive feedback from readers, commenting on (more often, disagreeing with) something I wrote. My files are filled with those old letters, which I cherish regardless of whether they praised or torched me. By contrast, after 352 columns to date in Tompkins Weekly, almost no one has written in more than eight years (since October 2006). The precise number—exactly one letter—is telling.
          It's probably not because the issues have become less provocative. More likely, it's because, in lives of unrelenting activity, with stagnant wages and social media and DVRs and Tinder and YouTube, people simply have no energy to think very deeply about anything they read—much less summon the effort to put together a coherent response to it. In a time when "clicks" count more than the most crafted arguments, why bother?
          All of which occurs in the context of the ongoing juvenilization of movies in general. The Bible enjoins us to put aside childish things, but never Hollywood. None of that is new, but it is fortunately still possible to find challenging fare. This column has tried to highlight some of it, including the superb programming at Cornell Cinema and Cinemapolis. Trouble is, while you can lead people to better stuff, you can't necessarily make them care. According to Indiewire, the total US box office for foreign language films has plummeted 61% in the last six years. Despite promises of a brave new world of wide-open access to anything, the advent of video-on-demand has not reversed this trend. Instead, in practice, it has merely reinforced the domination of the same old thing.
          This will be my last regular posting for VIZ-arts, partly because I will be pursuing other opportunities as a novelist. Despite the fact that this column has been largely a one-way conversation, publisher Jim Graney and editor Jay Wrolstad have been consistently supportive of it—for which I am eternally grateful. I'm also fortunate to have had the opportunity to address the Ithaca community, which is (and I hope will continue to be) one of the most film-friendly in the nation.
          When I ended my column at the Times, I wrote "If any of you have felt a connection to the soul behind the words, I've been talking to you." I can think of no better way to end this one too.
© 2015 Nicholas Nicastro

Wednesday, January 7, 2015

What Were You Looking At?

Emily Blunt as the "Full Metal Bitch" in Edge of Tomorrow.

As a rule, this writer doesn't go in for "best of the year" lists. When it comes to movies, the exercise verges on the pointless, because many of the most prestigious releases don't even arrive in local theaters until early the following year—and who wants to read about "the best of 2014" in March of 2015? Nor do the 35-40 movies reviewed here in a typical year amount to more than a tiny fraction of what should be considered.
          But what's a rule if it isn't worth being broken? Here's a list of some of the movies I'll remember from this otherwise forgettable year:
Edge of Tomorrow:  This Tom Cruise vehicle with the deathly dull title was a truly pleasant surprise—clever, fresh, with a wicked wit dwelling beneath the action. As mismatched soldiers coping with an invasion by time-bending aliens, Cruise and Emily Blunt make an appealing pair. And at least the producers got to fix the title when they started calling it Live, Die, Repeat.
Nymphomaniac, Part 2: Certified bad boy of European cinema Lars von Trier explores the wilder shores of sexual monomania—albeit more memorably in Part 2 of this epic than in Part 1. If lead Charlotte Gainsbourg were an athlete, we'd say she left everything on the field here. And she's a pretty terrific drummer too.
Fed Up: Stephanie Soechtig's exposé on America's true drug of choice—sugar—may not be the most artful documentary released this year, but it may be the most consequential. The short version: it's not your fault you can't exercise off those extra pounds, and it's not an accident.
Snowpiercer: This preposterous, outrageous, hurtling mess of a movie may carry the heaviest load of allegorical meaning of any film this year—but Korean visionary Bong Joon-ho pulls it off, right up to the last scream of metal-on-metal.
Gone Girl: David Fincher delivers the twists in this tense, almost faultless thriller. The film makes perfect use of Ben Affleck's congenital smugness, twisting it around his neck in a way that is virtually impossible to dislike. But the real story is Rosamund Pike as his wife/nemesis. If she isn't nominated for an Oscar, expect NYC cops to turn their backs on the Academy.
Interstellar: Christopher (Batman Begins, Inception) Nolan is known for turning out popular but overcooked thrillers. This time he's on the side of the angels, as he gives us a heartfelt, visually-stunning poem to the promise of a future among the stars—if we're rational enough to choose it.
Force Majeure: This double-diamond Swedish family drama came out of nowhere to sweep this critic away with its powerful story of failure and (maybe) redemption.
© 2015 Nicholas Nicastro 

Wednesday, December 17, 2014

Encumberbatched

Benedict Cumberbatch cracks the code in The Imitation Game.

★★½ The Imitation Game. Written by Graham Moore, based on the book by Andrew Hodges. Directed by Morten Tyldum. At selected theaters.

The appearance of high-gloss bio-pics about scientists is the best evidence yet of the peaking cultural cachet of nerdiness. Along with the Stephen Hawking story in The Theory of Everything, we now have The Imitation Game, Morten Tyldum's account of the brilliant, sad career of British mathematician Alan Turing.
          The "human interest" hook with Hawking is his lifelong struggle with the disease that has left him a prisoner in his own body. Turing led the team that broke Nazi Germany's Enigma code—a key step in assuring Allied victory in World War II; he also laid much of the theoretical groundwork for modern computer science. But he was a closeted gay man at a time when homosexuality was a crime in Britain. His punishment for "indecency" is widely supposed (but never proven) to be the reason he committed suicide in 1952. That outrageous contrast—the genius and war hero unjustly persecuted merely for whom he loves—has made Turing a secular saint in these times.
          From a certain point of view, whether Hawking is crippled or Turing a victim of homophobia shouldn't really figure in our interest in them. Turing's work for British intelligence is said to have shortened the war by two full years and saved 14 million lives. Hawking has made seminal contributions to our understanding of the universe. In a better kind of world, such towering intellectual achievements would alone be enough to make these men fascinating. But we don't live in that world, and the story of saving 14 million lives isn't necessarily worth telling without the subsequent, tawdry downfall. Nerdiness may be cool, but ideas still aren't.
          The irascible Hawking would no doubt tell us where to stick our "human interest" for his plight—he prefers to be remembered for his science, not for being a disabled scientist. No doubt Turing would also object to going down in history as "that mopey gay codebreaker."
          The Imitation Game is a well-crafted work of reverential biography that takes absolutely no chances with Turing's complex legacy. The script by Graham Moore presents him as the usual genius with precious few social skills, alienating everyone around him as he almost single-handedly drags Britain to victory. Benedict Cumberbatch—who seems to be everywhere these days—is poignant and convincing in the lead. While Keira Knightley seems to be here purely as evidence of Turing's orientation (as in "anybody not interested in her must truly be gay"), she's also fresh in a way none of the other actors (Matthew Goode, Downton Abbey's Allen Leech) manage.
          Unfortunately, everything about The Imitation Game seems tailored not to upset anyone. Turing's identity as a gay man is invested entirely in a chaste schoolboy crush he had on a fellow student—an experience he likely had in common with a good number of not-so-gay English males at the time. Tyldum never risks rattling the teacups by presenting the adult Turing in the act of being intimate with an adult man. In that sense, the movie seems every bit as uneasy with homosexuality as the benighted era it depicts.
          For American audiences, there should be an extra level of ambivalence attached to Turing's career. After his team cracked the Enigma code, the British government didn't trust their American allies to keep that fact secret. Turing was obliged to lie to his US counterparts, keeping them in the dark about technology that might have saved thousands of American lives. Of course, when there are American Oscars to win, Tyldum dares not touch any of that.
© 2014 Nicholas Nicastro