Wednesday, August 16, 2017

Why I Lean Lannister


In the ongoing War of the Queens in HBO’s Game of Thrones, the choice seems obvious. Between the ambitious but fundamentally decent Daenerys (“Dany”) Targaryen and the murderously scheming Cersei Lannister, the fandom is all in for Dany. The show's writers seem to be, too: fan-favorite Tyrion Lannister, who pointedly believes in nothing, declares “I believe in you, Daenerys Targaryen”; Ser Davos Seaworth, a voice for compassion in a kingdom otherwise soaked in blood, jokes about “changing sides” to Dany--in the very presence of his liege lord Jon Snow, in fact. Meanwhile, judging from the dour look on the face of Cersei’s brother-lover Jaime, even he seems resigned that Cersei's reign over war-torn Westeros must end.
      But the writers and the fans have it all wrong. Their mistake doesn't lie in the characters of the Queens themselves. It lies instead in the legacies of their families, and what those legacies mean in the context of history--including medieval-ish, semi-plausible, made-up history. To pick wisely, the question should not be "Who would you prefer to have a cup of mead with?" Rather, it should be "Which side is better for Westeros in the long term?" And the answer to that is--very arguably--the detested Lannisters.
      Exploring why demands a deep dive into fake history. In the canon of George R.R. Martin’s original novels, the Targaryens are the realm’s dynastic ruling house. Hailing from across the Narrow Sea, they conquered the continent using strategic weapons of mass destruction--giant fire-breathing dragons--to subdue six of Westeros’s seven independent kingdoms (and exact the submission of the seventh). The Iron Throne itself was forged from the surrendered weapons of the Targaryens’ enemies.
      For three centuries, they brooked no defiance. Whole castles, like the impregnable Harrenhal, were literally melted into submission. Targaryen power could easily be called “pharaonic” in the depth of its absolutism. Indeed, the Targaryens resembled the pharaohs in the manner of their succession as well--by institutionalized incest within the royal house.
      The predictable climax of all this inbreeding was the reign of the “Mad King” Aerys II, who planned to put down a rebellion by incinerating the entire capital and everyone in it with chemical weapons (aka “wildfire”, a kind of medieval napalm). The tyranny of the Targaryens was nominally ended by the military victory of Robert Baratheon. But the real coup de grace was administered by the Lannisters, who became the power behind Robert’s throne. This is where Martin’s first novel picks up the story. 
      Unlike the Targaryens, who ruled by blood, the Lannisters are most associated with finance, political gamesmanship, and clever use of diplomacy. Their family motto is “A Lannister always pays his debts”--a useful trait when forging alliances. The paterfamilias, Tywin Lannister, is a gimlet-eyed master of realpolitik who does not bed his daughter Cersei, but marries her off strategically. His son Jaime, the “Kingslayer”, was the one who finally put old Aerys out of his misery. True, he’s sleeping with his sister. But the fact that he killed a king and survived attests to a simple yet profound political development: that no one, not even the monarch, is above responsibility for his acts. 
     These two clans, admittedly, are not ideal options. But as in the election of 2016, these are the choices we have. Is it better to be ruled by a closed dynastic family whose absolute power is backed up by magic creatures only it can control? Or by a house in which power is negotiated, shared, and contested?
      In actual history, the trajectory is clear: Europe became modern because its versions of the Lannisters, not its Targaryens, ultimately triumphed. Instead of god-kings, the continent evolved into a system of multiple, mutually suspicious centers of power. Rivalry among those kingdoms sparked revolutionary changes in technology and culture. And it was because of that competition that Lannister-esque Europe soon surged ahead of global rivals like India and China (ruled, appropriately enough, from its own “Dragon Throne”).
     No doubt, Dany is the more immediately appealing choice. But there’s also no doubt about what kind of system she would seek to restore. It might be a just world, but only because Dany opts to be just. It might be a peaceful one, but only because Dany alone possesses the ultimate weapons. There would be no guarantees about her heirs. They might well be more like the Mad King--or her insufferable brother Viserys, who was so stridently entitled he was killed by his own ally. Of the benevolence of kings, Thomas Jefferson was understandably skeptical: “Sometimes it is said that man cannot be trusted with the government of himself. Can he, then, be trusted with the government of others? Or have we found angels in the forms of kings to govern him? Let history answer this question.”
     The Lannisters, meanwhile, are a withered branch. All of Cersei's children are dead. Petty and ruthless as she is, she still lives in a world defined by politics, not magic. As soon as she can't pay her debts, she will pass from the scene. 
     The Lannisters’ plutocratic rule at least contains the promise of modernity. I can more easily see one of Cersei’s descendants, mortgaged and weak, signing a Westerosi version of the Magna Carta. The sons of platinum Dany, born to rule, backed up by dragons? Not so much.
      Faced with Dany’s forever dynasty on one side and Cersei's short-term tyranny on the other, the choice is clear. When the armies of Westeros clash next, I’ll be rooting not for the Dragon, but for the Lion.
© 2017 Nicholas Nicastro

Wednesday, November 11, 2015

Spectre of the Gun

Bond (Daniel Craig) brings a plane to a car chase in Spectre.

   Spectre. Written by John Logan, Neal Purvis, Robert Wade, & Jez Butterworth. Directed by Sam Mendes. At local theaters. 

Most criticism of the James Bond franchise emphasizes how the series stays relevant by adapting to the times. And sure enough, in Spectre it officially enters the post-Snowden era, as our hero fights a Big Data cabal plotting to turn our planet into a digital panopticon. But what’s equally important are the ongoing legacies—the “product DNA”, as developers like to call it—that make the Bond films stand apart. While people have been predicting the demise of the franchise for decades, it won’t truly be dead until it is indistinguishable from those Mission: Impossible and Jason Bourne movies. 

In the nine years since Daniel Craig donned the Savile Row suit, the series has veered all over the map. 2006’s Casino Royale rekindled the class and excitement of the franchise, and remains the best. Quantum of Solace (2008) seemed barely a Bond film at all, while Skyfall (2012) seemed too earnest to recover the nostalgia Solace had jettisoned. Its other flaws and virtues aside, Spectre at last seems comfortable being what it is, and sees no reason to strain for anything different.

Director Sam Mendes (American Beauty, Skyfall) opens the film with a visually stunning sequence, set in old Mexico City on the Day of the Dead, that is heavy on sugar skulls and foreboding and the kind of sledge-hammer irony not approached since Roger Moore donned clown makeup in Octopussy (1983). Indeed, the dead cast long shadows here, as Bond goes maverick to fulfill a posthumous “contract” from his dearly departed boss (Judi Dench as “M”). 

The cause sends him from London to Rome to Austria to Morocco in a quest to stop a secret organization called SPECTRE from hijacking the world’s electronic intelligence. As in the scene pictured here, where Bond brings a large airplane to a car chase, it all seems like a lot invested to modest effect. While this is the most expensive 007 movie in history, rumored to cost $350 million, it’s hard to say all that money reached the screen. (Edward Snowden, by contrast, thwarted Big Data for the cost of a thumb drive and a ticket to Russia.) Mostly, it seems that the makers of Spectre spent hard not to fail, and as far as that goes, succeeded.

No Bond film can truly be judged without looking at the villain and the women, and in these departments Spectre is sturdy but not spectacular. Christoph Waltz (The Zero Theorem, Django Unchained) seems to fulfill his personal destiny to play a Bond adversary. And while he conveys a certain satiny menace here, he’s nowhere near as compelling as he was as the two-faced Nazi fixer in Inglourious Basterds. Nor does the film need the overwrought psycho-backstory that somehow implicates his character in all of Bond’s misfortunes since his orphanhood. 

At fifty years old, Monica Bellucci (The Matrix Reloaded, The Passion of the Christ) made headlines for being the oldest actress to be cast as a “Bond girl”. (The previous record holder was Honor Blackman, who appeared in Goldfinger at 38.) No doubt, Bellucci still looks terrific. It’s just too bad the writers forgot to give her anything much to do except tumble in the face of Craig’s savoir faire. While the Bond films give a good impression of worldly sophistication, Bellucci—if given the chance—could have delivered more than an impression.

Léa Seydoux (Blue is the Warmest Color) fares best of all as Madeleine, the daughter of one of Bond’s former adversaries. There’s nothing in Spectre as torrid as the Sapphic kisses in Blue, but Seydoux presents a combination of freshness and worldliness that is hard to resist. She also benefits from the possibility that Craig may not return to the series, in a way that won’t be spoiled here.

An essential part of that “product DNA” is Bond’s roots in the Mad Men era, when gender roles were first beginning to transition to whatever place we are going now. For women, the narrative is relatively straightforward—toward more freedom, more options. For men, the story isn’t so simple; while they are also promised more options, it isn't clear those are ones most men would want to take. Not a few of the men who come to see Spectre might spend their days pushing strollers, keeping house for their wives, not exactly sure what a “dirty martini” really means. They’re told—and perhaps even mostly believe—that what they have is “freedom.” But they wonder.

 Bond embodies the old virtues of strength and purpose and lack of debilitating self-consciousness. (In Spectre, when Madeleine asks how he copes with his life, “…living in the shadows? Hunting, being hunted? Always alone?”, Bond replies “I don't stop to think about it.”) Old virtues, but also divisive, because they are lately presented as brutality, narrowness, and emotional inertia. As sex roles converge, perhaps 007’s larger cultural role is to keep those old virtues on ice, ready for when the times demand a different kind of man. 

© 2015 Nicholas Nicastro

Tuesday, October 6, 2015

Blues for a Red Planet

Saving Matt Damon in The Martian.

★ ★ ★ 1/2  The Martian. Written by Drew Goddard, based on the novel by Andy Weir. Directed by Ridley Scott. 

About twenty years ago, the late Carl Sagan, during a speech at Cornell, likened some natural phenomenon to “coupled differential equations”. When the students hissed at this reference to their math homework, Sagan remarked, “Why all the hissing? One day differential equations will save your life.”

Ridley Scott’s The Martian is the fictional realization of Sagan’s prediction. Based on the bestselling page-turner by Andy Weir, it concerns Mark Watney (Matt Damon), an American astronaut who accidentally gets left behind on Mars after an emergency evacuation by his crew. Literally the only man left on a barren, freezing planet, Watney has only three months of supplies but a three-year wait before a rescue. After making the conscious decision to try not to die, he declares his only option: “I have to science the shit out of this.”

  And “science the shit out of it” he does. Weir’s book is a throwback to the heyday of “hard” science fiction—the kind that reveled in the physics and chemistry of its stories. In it, we learn how to make water from jet fuel, explore the botany of raising crops in otherwise lifeless soil, and follow the astrodynamics of sending humans and machines to other planets. That Weir makes all these technicalities not only palatable, but gripping, is a real triumph.

For the movie, director Ridley Scott (Alien, Gladiator, Prometheus) is wise to stay close to Weir’s winning formula. But the film is better than the book in one important respect. Weir is not an accomplished prose stylist, and The Martian is his first published novel. While engrossed in it, this reader sometimes had to remind himself that the story was set on an alien planet, because Weir offers precious few details on how it actually looks and feels to be on Mars. Now that we are roving Mars by robotic proxy, it’s not enough to be vague about all that. Mars is no longer just an astronomical object—it’s a piece of turf, as real as Yuma or Marrakech.

  What’s the sky like there? The stars? The sunset? What’s the sound of Martian wind from inside Watney’s lab? What’s it like to get all that fine Martian dust in the crotch of your space suit? Such are the kind of impressions that make a story vivid, and alas, they seem beyond Weir’s skills as a storyteller. Visual flair is in Scott’s wheelhouse, however, and The Martian is replete with the kind of details that make Mars come to life as the implacable antagonist it is supposed to be.

        No surprise that Matt Damon is a fun and appealing choice in the lead. As written, Mark Watney might as well have been designed for Damon to play. Other casting choices—such as Jeff Daniels as the Machiavellian NASA administrator—feel predictable. That a woman with the apparent youth of Jessica Chastain would be selected as mission commander is hard to swallow, however. More likely, a far more seasoned astronaut—female or male—would be chosen for that job.

  The Martian is hard science fiction, but it strains credulity in one respect. The premise that the United States of America, by what seems to be a few years in the future, would actually agree to fund the manned exploration of Mars is utter fantasy. In a time when the US Congress can’t agree to fund the government for more than a few months at a time, and a leading Presidential candidate declares the theories of evolution and the Big Bang “of the devil”, it seems almost quaint to expect our fractured nation to back a program of pure exploration that would last a decade and cost hundreds of billions of dollars. That, of course, would take patience, and vision, and the kind of trust in Big Science that we clearly lost a long time ago.

  More likely, a real “Martian” would not be named “Mark”, but “Chen”. And it would not be the Chinese space agency lending a small assist to NASA, but the other way around. That is, unless somebody finds a way to “science the shit out of” our national plunge into stupidity. 
© 2015 Nicholas Nicastro

Monday, January 19, 2015

The End of Criticism



In his Gettysburg Address, Lincoln predicted "The world will little note, nor long remember what we say here…" It's hard to know if he really believed this, but there's no doubt it applies to critics. Of all the professions disrupted by the internet, professional art criticism was an early casualty. With their profit margins under siege, many newspapers and magazines simply eliminated their critics. Once, every daily newspaper employed at least one full-time movie critic from the local community. Now most just print reviews from syndicated, out-of-town "celebrity" critics—if they print them at all.
          In place of authentic, local voices, most people now get their impressions of new films from social media, or from sites like Rotten Tomatoes and Metacritic, which aggregate "yea or nay" verdicts from users, bloggers, and columnists. The vast majority of movie bloggers proffer their precious insights for free. Netflix claims to have an algorithm that will predict whether its customers will like a piece of "content". Such data-driven approaches have their uses—I've used them myself to scope out which films might be worth seeing, week to week.
          But no algorithm can do actual criticism. None can ferret out and discuss what might be interesting themes, or read subtext, or place a film in its particular political, social, or artistic context. None of them have a sense of history. For those things, we must resort to individual experts.
          Alas, the rise of the internet—of "user-provided content"—has meant a decline in public regard for experts of any kind. For many, googling a subject makes anyone feel qualified to argue with far better-informed people who may have devoted their whole lives to studying it. On the serious side, this has deeply obfuscated public debate on issues like health care and climate change. On the less serious, it has reduced art criticism to something like the human tailbone—a vestige of life in a bygone environment.
          All this is reflected in a lack of public engagement. When I was the weekly critic for The Ithaca Times from 1985 to 1990, it was common for my editor to receive feedback from readers, commenting on (more often, disagreeing with) something I wrote. My files are filled with those old letters, which I cherish regardless of whether they praised or torched me. By contrast, after 352 columns to date in Tompkins Weekly, almost no one has written in more than eight years (since October 2006). The precise number—exactly one letter—is telling.
          It's probably not because the issues have become less provocative. More likely, it's because, in lives of unrelenting activity, with stagnant wages and social media and DVRs and Tinder and YouTube, people simply have no energy to think very deeply about anything they read—much less summon the effort to put together a coherent response to it. In a time when "clicks" count more than the most crafted arguments, why bother?
          All of which occurs in the context of the ongoing juvenilization of movies in general. The Bible enjoins us to put aside childish things, but Hollywood never does. None of that is new, but it is fortunately still possible to find challenging fare. This column has tried to highlight some of it, including the superb programming at Cornell Cinema and Cinemapolis. Trouble is, while you can lead people to better stuff, you can't necessarily make them care. According to Indiewire, the total US box office for foreign-language films has plummeted 61% in the last six years. Despite promises of a brave new world of wide-open access to anything, the advent of video-on-demand has not reversed this trend. Instead, in practice, it has merely reinforced the domination of the same old thing.
          This will be my last regular posting for VIZ-arts, partly because I will be pursuing other opportunities as a novelist. Though this column has been largely a one-way conversation, publisher Jim Graney and editor Jay Wrolstad have been consistently supportive of it—for which I am eternally grateful. I'm also fortunate to have had the opportunity to address the Ithaca community, which is (and I hope will continue to be) one of the most film-friendly in the nation.
          When I ended my column at the Times, I wrote "If any of you have felt a connection to the soul behind the words, I've been talking to you." I can think of no better way to end this one too.
© 2015 Nicholas Nicastro

Wednesday, January 7, 2015

What Were You Looking At?

Emily Blunt as the "Full Metal Bitch" in Edge of Tomorrow.

As a rule, this writer doesn't go in for "best of the year" lists. When it comes to movies, the exercise verges on the pointless, because many of the most prestigious releases don't even arrive in local theaters until early the following year—and who wants to read about "the best of 2014" in March of 2015? Nor do the 35-40 movies reviewed here in a typical year amount to more than a tiny fraction of what should be considered.
          But what's a rule if it isn't worth being broken? Here's a list of some of the movies I'll remember from this otherwise forgettable year:
Edge of Tomorrow:  This Tom Cruise vehicle with the deathly dull title was a truly pleasant surprise—clever, fresh, with a wicked wit dwelling beneath the action. As a pair of mismatched soldiers coping with an invasion by time-bending aliens, Cruise and Emily Blunt make an appealing team. And at least the producers got to fix the title when they started calling it Live, Die, Repeat.
Nymphomaniac, Part 2: Certified bad boy of European cinema Lars von Trier explores the wilder shores of sexual monomania—albeit more memorably in Part 2 of this epic than in Part 1. If lead Charlotte Gainsbourg were an athlete, we'd say she left everything on the field here. And she's a pretty terrific drummer too.
Fed Up: Stephanie Soechtig's exposé on America's true drug of choice—sugar—may not be the most artful documentary released this year, but it may be the most consequential. The short version: it's not your fault you can't exercise off those extra pounds, and it's not an accident.
Snowpiercer: This preposterous, outrageous, hurtling mess of a movie may carry the heaviest load of allegorical meaning of any film this year—but Korean visionary Joon-ho Bong pulls it off—right up to the last scream of metal-on-metal.
Gone Girl: David Fincher delivers the twists in this tense, almost faultless thriller. The film makes perfect use of Ben Affleck's congenital smugness, twisting it around his neck in a way that is virtually impossible to dislike. But the real story is Rosamund Pike as his wife/nemesis. If she isn't nominated for an Oscar, expect NYC cops to turn their backs on the Academy.
Interstellar: Christopher (Batman Begins, Inception) Nolan is known for turning out popular but overcooked thrillers. This time he's on the side of the angels, as he gives us a heartfelt, visually-stunning poem to the promise of a future among the stars—if we're rational enough to choose it.
Force Majeure: This double-diamond Swedish family drama came out of nowhere to sweep this critic away with its powerful story of failure and (maybe) redemption.
© 2015 Nicholas Nicastro 

Wednesday, December 17, 2014

Encumberbatched

Benedict Cumberbatch cracks the code in The Imitation Game.

★ ★ 1/2  The Imitation Game. Written by Graham Moore, based on the book by Andrew Hodges. Directed by Morten Tyldum. At selected theaters.

The appearance of high-gloss bio-pics about scientists is the best evidence yet of the peaking cultural cachet of nerdiness. Along with the Stephen Hawking story in The Theory of Everything, we now have The Imitation Game, Morten Tyldum's account of the brilliant, sad career of British mathematician Alan Turing.
          The "human interest" hook with Hawking is his lifelong struggle with the disease that has left him a prisoner in his own body. Turing led the team that broke Nazi Germany's Enigma code—a key step in assuring Allied victory in World War II; he also laid much of the theoretical groundwork for modern computer science. But he was a closeted gay man at a time when homosexuality was a crime in Britain. His punishment for "indecency" is widely supposed (but never proven) to be the reason he committed suicide in 1954. That outrageous contrast—the genius and war hero unjustly persecuted merely for whom he loved—has made Turing a secular saint in these times.
          From a certain point of view, whether Hawking is crippled or Turing a victim of homophobia shouldn't really figure in our interest in them. Turing's work for British intelligence is said to have shortened the war by two full years and saved 14 million lives. Hawking has made seminal contributions to our understanding of the universe. In a better kind of world, such towering intellectual achievements would alone be enough to make these men fascinating. But we don't live in that world, and the story of saving 14 million lives isn't necessarily worth telling without the subsequent, tawdry downfall. Nerdiness may be cool, but ideas still aren't.
          The irascible Hawking would no doubt tell us where to stick our "human interest" for his plight—he prefers to be remembered for his science, not for being a disabled scientist. No doubt Turing would also object to going down in history as "that mopey gay codebreaker."
          The Imitation Game is a well-crafted work of reverential biography that takes absolutely no chances with Turing's complex legacy. The script by Graham Moore presents him as the usual genius with precious few social skills, alienating everyone around him as he almost single-handedly drags Britain to victory. Benedict Cumberbatch—who seems to be everywhere these days—is poignant and convincing in the lead. While Keira Knightley seems to be here purely as evidence of Turing's orientation (as in "anybody not interested in her must truly be gay"), she's also fresh in a way none of the other actors (Matthew Goode, Downton Abbey's Allen Leech) manage.
          Unfortunately, everything about The Imitation Game seems tailored not to upset anyone. Turing's identity as a gay man is invested entirely in a chaste schoolboy crush he had on a fellow student—an experience he likely had in common with a good number of not-so-gay English males at the time. Tyldum never risks rattling the teacups by presenting the adult Turing in the act of being intimate with an adult man. In that sense, the movie seems every bit as uneasy with homosexuality as the benighted era it depicts.
          For American audiences, there should be an extra level of ambivalence attached to Turing's career. After his team cracked the Enigma code, the British government didn't trust their American allies to keep that fact secret. Turing was obliged to lie to his US counterparts, keeping them in the dark about technology that might have saved thousands of American lives. Of course, when there are American Oscars to win, Tyldum dares not touch any of that.
© 2014 Nicholas Nicastro

Tuesday, December 9, 2014

The Blitzkrieg on Stupitude

Man on a roll in The Colbert Report.

★ ★ ★ ★ ★  The Colbert Report. Monday through Thursday at 11:30 p.m. on Comedy Central. Ends December 18.

The most important cultural event this month not involving Benedict Cumberbatch is the finale of Comedy Central's The Colbert Report. Since its debut in 2005, the show has arguably outdone its parent, The Daily Show, as the benchmark in late-night fake news. Colbert (who, like his TV incarnation, leaves the "t" in his last name silent) will air his last show on December 18, before taking over The Late Show on CBS next year.
          For those very late to this party, Colbert's show is an impeccable parody of Fox blowhards like Bill O'Reilly and Glenn Beck. His alter-ego "Stephen Colbert" is one of those gut-level patriots who is proud to think with his red, white and blue balls instead of—you know—that gray and white stuff between his ears. Where Jon Stewart's main mode is self-deprecation, Colbert's character revels in the perfection of his ignorance. His interviews with guests are couched as ambushes by the forces of righteousness, predicated on "nailing" people—which he loudly celebrates whether he has accomplished it or not. And it's all been done on a consistently high level for more than 1340 episodes.
          The comparison with Stewart is key to appreciating how great the Report has become in its nine years. Though The Daily Show gets credit for epitomizing a broader trend toward satirical news, it is not, strictly speaking, satirical. Stewart's sharp, funny, and quite often true commentary is always delivered from a distance. That distance might be moralizing, or exhortatory, or just plain mean, but it is always there. What Stewart does is best described as clever snark, not satire.
          The only truly satirical material on The Daily Show comes from its staff of fake "correspondents", who impersonate the self-importance and showmanship of network field reporters. Colbert (along with Steve Carell, Ed Helms, John Oliver, Samantha Bee and many more over the years) came out of this fine tradition, and has arguably raised it to true art.
          Make no mistake—I'm a fan of Stewart. Fact is, though, you can see the range of his comic repertoire in about a week. By contrast, it's taken almost a decade just to sample Colbert's full menu. Within a single episode, he'll veer from deep-fried pomposity to vacillating schoolboy to weepy narcissist. He'll pour scorn on bears, shake his fist at Heisenberg's uncertainty principle, put the British Empire "on notice". Where Stewart plays the comedic equivalent of a kazoo, Colbert works with a full symphony orchestra.
          The end of the Report will leave a big hole in our weeknights. So big, in fact, that it's hard to believe some version of "Stephen Colbert" won't make regular appearances on the new Late Show. We'll find out next year.
          The real legacy of late-night satire won't be told in 2015, but during the next Presidential election, and the one after that. On the plus side, Stewart and Colbert continue to be wildly popular among younger viewers (and quite a few older ones too), and their shows have become major sources of news for whole segments of the voting population. Viewers who watched Colbert's comedic take on campaign finance laws were shown to be objectively better informed on that critical issue than viewers of Fox, CNN or any other major outlet. A 2007 study found regular viewers of Stewart's and Colbert's shows to be better informed on all issues than viewers of the PBS Newshour (surprising) and Bill O'Reilly (not surprising).
          Trouble is, none of this is necessarily translating into greater voter involvement. Both Comedy Central shows extensively covered the 2014 midterms, but turnout was dismal, the worst in 72 years. So the question becomes: is making the news funny an incentive to participate in the political process—or a substitute for it?
© 2014 Nicholas Nicastro