How many movie-goers will see Jake Gyllenhaal’s truly terrifying Louis Bloom in Dan Gilroy’s Nightcrawler? ...
... Probably not nearly as many as will see this subordinate Nightcrawler from X-Men 2.
Bryan Singer has moved from real people in The Usual Suspects to the mutants of X-Men.
In 2009, Marc Webb made a sparkling RomCom about a young man obsessing over a young woman. ...
In his next film, he had another young man obsessing over another young woman. Only this time, the young man wore a spider mask and shot webs from his wrists.
In Marc Forster’s Monster’s Ball, a harrowing adult drama about loneliness and prejudice, the “monsters” are figurative.
In his World War Z, the zombies are real. (Author’s note: to the best of my knowledge, zombies actually are not real.)
Christopher Nolan’s first movie, Following, had a reported budget of $6,000.
Nolan’s latest movie, Interstellar, cost 27,500 times as much as his debut.
J.C. Chandor has managed to buck the trend, remaining a new director committed to smaller original stories about real people. A Most Violent Year is his third such movie in four years.
Despite some recent successes, documentaries continue to struggle to find screens in American theaters. Glen Campbell: I’ll Be Me, one of 2014’s most compelling movies, faced that struggle. Whether it will find success on smaller screens remains to be seen.
In the years to come, will American filmmakers continue to offer compelling and challenging stories about real people, like Jeremy Saulnier’s Blue Ruin?
Or is this all we have to look forward to?
Should this matter? If Anderson was right, then should we care that the multiplexes are devoting more and more screen space to derivative, spectacle-oriented work? If you can watch the latest low-budget art house gem on your new, improved iPhone, then what does it matter?
It matters a lot. It may be true that real artistic innovation can come from a couple of kids with a camcorder and a dream. They funnel that dream through whatever cheap editing software they have and put it up as a short-form Internet program. It lights a spark. The ultra-hip youth of Portland or Memphis or Manchester pick up on it at first, and soon it has gone viral. Then what happens? Either it flames out, because the market is so glutted with other kids with other dreams and other editing suites that, without financial backing, our original dreamer has to fold up her tent. Or she goes after the cash. And by cash, I mean the deep-pocketed Hollywood studios that have always been quick to buy up whatever is young and hip and hot. But if those Hollywood studios are so entrenched in the culture of spectacle and sequel, how are they possibly going to foster new movies without having them fall into the spectacle-sequel trap?
The long tail theory is predicated upon more accessible methods of production and distribution. What it never accounted for, or at least what it never properly accounted for, was marketing. In the free-for-all world of modern distribution, rising above the multitudinous din requires some magical potion of savvy and luck. And money helps. The world of popular music has a longer track record with the long tail and it is hard to see where positive musical change has come about. Anita Elberse’s book notes that approximately one third of all music tracks that are downloaded are downloaded a single time. A sizeable piece of that long tail could blow away in a modest wind. It is far cheaper and less time-consuming to record a three-minute pop song than to produce a feature film. The musician is better positioned to throw out more product in hopes of catching a spark.
For the independent filmmaker trying to catch that same spark, social media cleverness and an MBA’s sense of guerrilla marketing become essential. Notice that these skills have little to do with cinematic originality. There have been some start-ups that have attempted to fill in the marketing gap for new, out-of-the-mainstream filmmakers. The From Here to Awesome (FHTA) Festival, begun in 2008 by independent filmmakers Lance Weiler, Arin Crumley, and Mike Belmont, was a noble effort to create a community amongst the would-be up-and-comers. But its method for selecting worthy projects was predicated on Internet voting, and Internet voting favors the savvy marketer. A few years later, NeoFlix, a fulfillment house designed to empower independent filmmakers by providing marketing and distribution services, was launched with great fanfare. Within a few years, it had to close up shop. Its president, John Chang, in announcing the collapse, wrote, “The long tail concept did not track for most clients as most films would receive a burst of sale in the initial weeks or perhaps even months, and tail off sharply after that.” What is most interesting about this statement is that it is equally true for almost all movies, whether produced by your next-door neighbor or Time Warner. Time Warner, however, is diversified. One hit pays for a lot of failures. They can cross-promote and market those “failures” into new lives. It’s likely that your next-door neighbor cannot.
Dan Gilroy’s 2014 Nightcrawler has gotten very solid reviews, which tend to focus on the film’s indictment of modern media practices. Those reviews miss what is truly important about the movie. Haskell Wexler’s Medium Cool (1969) and the prescient Sidney Lumet/Paddy Chayefsky collaboration Network (1976) nailed the media element 40 years ago. What’s different about Nightcrawler (and please remember, we are talking about the movie and not the namesake X-Man mutant superhero with blue skin and a prehensile tail in X-Men 2) is the way its hero Louis Bloom aggressively markets himself. Louis is a freelancer who has taken modern business classes and understands the value of branding. He doesn’t run from his amorality. He cultivates it. He is not looking to reinvent journalism. He just wants as big a piece of the existing industry as he can get. He knows marketing is the way to get it.
Back to the only really important question: Is the United States producing better or worse movies?
Christopher Nolan once made Memento. Now he makes Batmen. Bryan Singer made The Usual Suspects before moving to the land of X-Men. Marc Webb was responsible for (500) Days of Summer. Now he’s responsible for not one, not two, but three amazing Spidermen. How about the Russo Brothers, going from the TV gem Community to Captain America, or Marc Forster, going from Monster’s Ball to World War Z, or James Mangold, going from Walk the Line to Wolverine (and its sequel)? James Wan was the most stylish new purveyor of horror when he made Saw. Now he’s the stylish new purveyor of the seventh Fast & Furious. Need I go on? I’m not saying some of these spectacles aren’t fine films. I am saying that Hollywood is rapidly approaching the point at which it will turn everything into a spectacle. Or a remake. Or a remake of a spectacle.
The muted voice of independence
Look at it another way. The Independent Spirit Awards were established in 1984, in large measure as a reaction against that tidal wave of blockbusters that had been launched a decade before. The goal of the awards was to recognize and promote artists working outside the mainstream of Hollywood production. There were special awards for first features and micro-budgeted films. And though you can quibble with individual tastes and selections, there is no question that throughout their first fifteen years, the awards played a major role in developing powerful, independent-minded voices. If you consider the men and women who have been nominated in the First Feature category (the up-and-comers, the future of the industry), the Independent Spirit Awards have a staggering track record. There was no “First Feature” award in 1986, but the co-recipient of the Best Director award was Joel Coen, winning for his first movie, Blood Simple. Since then, and prior to 2000, here are some other names that have either won or been nominated for their first features: Spike Lee, Todd Haynes, Richard Linklater, Quentin Tarantino, Robert Rodriguez, David O. Russell, Kevin Smith, Paul Thomas Anderson, Darren Aronofsky, Spike Jonze. They are among the most important directors working in America over the last 25 years. This time period was dubbed by film critic Jeffrey Sconce as the era of “new smart cinema,” and it may be the last great period of U.S. film.
Look at the same list post-2000. Obviously these directors would not have had as much time to develop their careers, but many of the names on the list above had followed up their initial success with something of note within five years. Todd Solondz, who was nominated for his withering debut portrait of U.S. youth, Welcome to the Dollhouse, in 1995, was able to direct the equally withering portrait of suburbia, Happiness, just three years later. Alexander Payne, who wasn’t even nominated for his satire Citizen Ruth (1996), was still able to build on the promise of his debut with the equally penetrating and more accessible satire Election (1999). If you focus on the more recent crop of Independent Spirit nominees who have had more than five years since their debut, you reach a disheartening conclusion. There are some excellent movies represented on the list, but where are the careers for Richard Kelly (Donnie Darko) and Dylan Kidd (Roger Dodger)? Karyn Kusama (Girlfight) and Patty Jenkins (Monster)? Goran Dukic (Wristcutters: A Love Story) and Vadim Perelman (House of Sand and Fog)? It is hard to find anyone who was nominated for a first feature between 2000 and 2009 who has developed a significant film career. Arguably the most successful director of the lot has been Catherine Hardwicke, who exploded on the scene with the in-your-face coming-of-age drama Thirteen in 2003. Her best-known work since then was 2008’s crowd-pleasing but non-threatening vampire fantasy Twilight. Many writers, like Paul Haggis, or actors, like Julie Delpy, launched directing careers during the decade, but to this point, none has created what we might consider a significant directing career. Maybe the most intriguing name on the list of recent nominees is Lena Dunham, who has yet to make a second film after Tiny Furniture in 2010. Dunham, like Patty Jenkins and Nicole Kassell (The Woodsman) before her, has found a more welcoming home on television than in the world of feature filmmaking.
Or consider the following list of names: Henry Bean, Rebecca Miller, Shari Springer Berman, Robert Pulcini, Shane Carruth, Richard Glatzer, Christopher Zalla, Courtney Hunt, Debra Granik. Those are the directors of eight of the ten Sundance Film Festival Grand Jury Prize-winning movies from the first decade of the 21st century. Sundance, even more so than the Independent Spirit Awards, was once a harbinger of new independent voices. The movies represented on this list, like Carruth’s mind-blowing Primer (2004) and Granik’s brilliant drama Winter’s Bone (2010), are not lesser films than earlier Sundance winners. But their directors have not been able to use this recognition to build successful filmmaking careers. The two filmmakers I did not list are Ira Sachs and Lee Daniels. Daniels managed mainstream success with Lee Daniels’ The Butler (2013), and Sachs has gotten some attention for 2014’s Love is Strange. The fact that I personally find Love is Strange very disappointing doesn’t particularly bother me. I am just grateful that Sachs had the opportunity to film a story about aging homosexuals, a subject rarely confronted by mainstream Hollywood.
Why have these particular barometers ceased to be a predictor of future success in U.S. film? Is it because the directors simply aren’t as good? Is it because the public has soured on films with an “independent spirit”? Independent filmmakers from earlier eras, from the Italian Francesco Rosi, maker of such seminal movies as Salvatore Giuliano (1962) and Chronicle of a Death Foretold (1987), to producer Maggie Renzi, who worked with indie icon John Sayles right through the heart of that New Smart Cinema era, blame the relative ease of access to both production materials and money for weakening the crucial winnowing-out process that existed back when it was harder to make a movie. Whether or not that’s true, the answer, which is almost certainly a combination of many factors, matters less than the result. Whatever the reason, the mainstream of U.S. filmmaking seems less able to foster emerging talent than at any time in its history. And that, regardless of distribution methods, is a scary proposition.
Last year, when accepting an award from the Washington DC chapter of Women In Film & Video, veteran producer/director Penny Marshall told the audience that the movie for which she was being honored (National Film Preservation Board selection A League of Their Own) could not be made today. It had no trailer moments (read: aliens or explosions) on page 1. It was not animated, and thus not easily translatable into foreign languages and markets. It was not based on a very popular novel or play. It was not a sequel. It was just a good story.
The case of Interstellar
Christopher Nolan, who began his career with the micro-budget thriller Following in 1998, released Interstellar in 2014 with a reported budget of $165 million. There is the suggestion in Interstellar that mankind’s capacity for survival is virtually limitless. So too are the metaphors about the current state of U.S. film that can be drawn from Nolan’s new movie. The one that involves J.C. Chandor may be the most telling.
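The scale of that career arc is easy to check. A quick sketch, using only the budget figures as reported above (and not independently verified), confirms the 27,500-times multiple cited earlier:

```python
# Budget figures as reported in this article (not independently verified).
following_budget = 6_000           # Following (1998), micro-budget debut
interstellar_budget = 165_000_000  # Interstellar (2014)

# How many copies of Following could Interstellar's budget have financed?
multiple = interstellar_budget / following_budget
print(f"{multiple:,.0f}x")  # → 27,500x
```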
Chandor, along with Scott Cooper, suggests that the future for new U.S. filmmakers may not be as bleak as what I outlined above. Within the last five years, both men have put out strong debuts (Margin Call and Crazy Heart, respectively) and then been able to follow them up with promising second features (All is Lost and Out of the Furnace). Of course, the jury is still out on their long-term career prospects. Chandor presents a particularly intriguing case because of his direct run-in with a major studio blockbuster.
Chandor’s second movie, 2013’s All is Lost, had a premise similar to that of the same year’s blockbuster Gravity, but a radically different approach to storytelling. His third film, A Most Violent Year, was scheduled for release on December 31, 2014. The crime story, starring Oscar Isaac and Jessica Chastain, received generally good early buzz. Chastain received particular praise. As the film began playing at festivals, her performance garnered Oscar talk. Under normal circumstances, distributor A24 would aggressively promote such a performance by having Chastain make the media rounds, appearing at events, doing interviews, etc. But Chastain is also co-starring in Paramount’s Interstellar, which is vying for box office and Oscar nominations in the same year. As detailed in The New York Times on November 5, 2014, Christopher Nolan and Paramount have effectively prevented Chastain from doing publicity for A Most Violent Year. They have her devoting all of her “marketing” energy to Interstellar. The final chapter is yet to be written on the two films, but this situation certainly provides a clear picture of the way in which marketing prowess can trump other considerations. Regardless of access to production and distribution, marketing gives the major studios a seemingly insurmountable advantage.
And the marketing of Interstellar provides a window into what Hollywood considers its most important asset in 2014. Early trailers for Nolan’s movie were fairly standard. They stressed the intriguing premise—that Earth is a used-up planet and that in order to survive, mankind must find a new home—and the characters involved—primarily Matthew McConaughey as the hero torn between wanting to save mankind and not wanting to abandon his daughter. There was plenty of spectacle on display, as you might expect in a space-travel movie. But as the release date grew closer, the trailers went through a fascinating change. No longer was the story front and center. In fact, the premise, the plot, and the characters were virtually invisible in the new trailers. They were replaced by the technicians who worked on the film, from Nolan himself to his designers and technical crew, touting the “reality” of the production. Chastain was replaced by costume designer Mary Zophres, who noted how all the costumes were designed so that they could really be worn by space travelers. It was as clear an example as I have ever seen of spectacle replacing story as the central element in the creation and marketing of U.S. film.
That is, until I saw a screening of the documentary Glen Campbell: I’ll Be Me (2014). The remarkable movie tells the story of country music legend Campbell’s final tour, covering 151 shows. Shortly before the tour, Campbell was diagnosed with Alzheimer’s Disease, and the film provides an intimate portrait of the man, his family, and his devilish adversary. The movie, which was still struggling to find wide distribution, had a special one-time screening at a multiplex in suburban Maryland. Interstellar was playing in the next theater, and on at least three occasions, the explosive bass from Interstellar’s booming soundtrack invaded the neighboring theater like a freight train trundling by your window in the middle of the night. I’ll Be Me’s producer Trevor Albert could only laugh at how perfectly those moments summed up the way in which big-movie spectacle was trampling smaller independent cinema.
Whither goest Aristotle?
The studios have always been greedy. The original moguls—Zukor, Mayer, Warner—never considered themselves artists. They saw a way to make a quick buck and they ran with it. But there was a time when making a quick buck meant developing stories that adhered to the dramatic elements identified by Aristotle more than two thousand years ago. Of the six elements he identified, plot and character came at the top of the list. Spectacle was at the bottom. Aristotle allowed that certain types of plays elevated spectacle above that sixth-place ranking.
We have entered an era of U.S. film where spectacle is clearly considered to be the most important of the elements by the people who are in a position to choose what will be filmed. When that is combined with an over-reliance on the sequel, the future of the industry is in peril. Producers want to make sequels to popular movies because they consider that to be a safer bet. With the budget of the average film skyrocketing past $100 million, producers are understandably hesitant to take risks. Like their forefathers in the ’50s who tried to spend their way out of the war with television, today’s producers have adopted the mindset that more money spent on lavish spectacle will save film. And like those forefathers, in the short term, they appear to be onto something. Profits are strong. But those profits are propped up by multiple factors. Exhibitors have maximized consumer spending through attractive food and beverage displays and by converting their screens into advertising media. Producers are seeing great rewards in international markets, where spectacle and animation overcome the language barrier more readily than do other genres. But over the last ten years, actual ticket sales have declined markedly. I contend that the emphasis on spectacle and its cousins, sequel and cartoon, is undermining the infrastructure of an industry built upon clever, moving, and innovative stories. How many more superhero origin stories—and sequels to superhero origin stories—can we really watch? How many more times can the zombies attack? How many more times can the world nearly end?
Zach Braff directed his first movie, Garden State, in 2004. It took him ten years to film his next project, 2014’s Wish I Was Here. Did Braff, a very popular television actor, particularly amongst the young demographic in which Hollywood seems most interested, have to wait so long because his first film grossed only $25 million in domestic receipts? It made back ten times its estimated production budget, but still grossed less than half the cost of an average Hollywood production. Or did he run into roadblocks because Garden State was the first mainstream drama to level an accusatory eye at the over-medication of America’s youth, which by extension could be read as a criticism of the vast U.S. medical/pharmaceutical industry? In other words, is Hollywood’s neglect benign or malignant? Regardless of how you see it, I think we can all agree that making one picture every ten years is not a reliable recipe for artistic growth. I have faith in Hollywood’s capacity to be part of the solution (even if it is a major part of the problem) because Hollywood has done it before. The numbers may be bigger in 2014 than they were in 1914, but the imperative to make a profit is no different today than it was when Adolph Zukor battled Thomas Edison for control of a new industry. Zukor and his cohorts won by investing in stories and marketing the hell out of them. Can it happen again?
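The Garden State arithmetic is worth spelling out. A small sketch, using only the figures as stated above (the budget is inferred from the “ten times its estimated production budget” claim, and the $100 million average production cost comes from earlier in this article):

```python
# Figures as stated in the article; the budget is inferred, not verified.
domestic_gross = 25_000_000        # Garden State's reported domestic gross
estimated_budget = domestic_gross / 10   # "made back ten times its budget"
average_production_cost = 100_000_000    # the article's average-film figure

print(f"Estimated budget: ${estimated_budget:,.0f}")
# A profitable hit by its own measure...
print(f"Return multiple: {domestic_gross / estimated_budget:.0f}x")
# ...yet still less than half of what an average production merely costs.
print(domestic_gross < average_production_cost / 2)
```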
The USA has produced some remarkable movies in the past few years: deeply felt, well-crafted adult stories that explore important thematic questions. Movies like Denis Villeneuve’s Prisoners, Jeremy Saulnier’s Blue Ruin, and Damien Chazelle’s Whiplash. They have been distributed through mainstream channels such as Warners, Sony, and the Weinstein Company. All of these companies have smaller, more independently minded subsidiaries, which seem constantly on the verge of merger, reorganization, or outright shuttering. It has happened to Warner’s Picturehouse, Paramount Vantage, and Capitol Films’ THINKFilm. Such producers are always at risk during times of economic turmoil, and 2008–2009 was particularly difficult for these types of production houses. But hopefully, the mainstream studios will see value in maintaining an industry that promotes well-crafted adult stories.
I do not have an all-encompassing solution to how this might resolve. But if I have learned anything from the long tail, it is that small voices like mine don’t need to solve all the major problems. We just need to contribute a thought. Then, hopefully, a million of those thoughts all join forces at the back end of that tail and the best ideas can emerge, polished and evolved, ready to change the world. So here’s my thought: a two-tiered Academy Award system. Identical awards for big budgets and for medium/small budgets. Draw the dividing line wherever you want. Start at $100 million. This way, smaller movies will get the marketing weight of the Academy behind them. I know what you are going to say. The Academy has been giving big prizes to modestly-budgeted movies for a while now. Why would two tiers improve the situation? Maybe it would get audiences to focus more specifically on smaller budgets and see them as a group of films worth applauding. Or maybe it wouldn’t. It’s just a thought. Go ahead and add yours. Don’t let Nightcrawler (and here, we are in fact talking about the X-Man) have the only meaningful tail in modern U.S. film.
On December 18, 2014, as this article was preparing for publication, Sony Pictures decided to cancel the scheduled release of The Interview. The comedy, starring James Franco and Seth Rogen, was premised on the fictional assassination of North Korean leader Kim Jong-un. After threats of violence against theaters intending to present the film, all major exhibitors cancelled their screenings and Sony withdrew the release. This is a very early chapter in what is likely to be a complex story about business, government, philosophy, and personal expression. But one thing became clear almost immediately. There were a great many calls for Sony to release The Interview online. Whether it proves major or minor over time, the distribution avenues on which the long tail is predicated are here to stay. Might we see a two-tiered system in which difficult material is relegated to online distribution while spectacle fills traditional movie theatres – a paradigm not unlike what has developed in the U.S. theater world (Off-Broadway and regional theatres showing the challenging material while Broadway specializes in spectacle)? Check back in another decade to examine how events none of us see coming will continue to remake the system.