copyright 2014, Jump Cut: A Review of Contemporary Media
Jump Cut, No. 56, winter 2014-2015

The tail wags: Hollywood’s crumbling infrastructure

by Jonathan Eig

Long ago—so long, in fact, that few alive still can recall it—there was a mighty giant who ruled in benevolence over a grateful populace. All was good in the giant’s land, even in times of economic turmoil and armed conflict. The giant stood firm and the people were content. Then came a usurper. It came invisibly through the sky, with charismatic leaders who went by the menacing names of Uncle Milty and Hopalong and Lucy. And the giant, fearful for his domain, responded as best he could. He used his formidable power to literally stretch the dimensions of the world, to bring color to the colorless, to part the seas and span the globe.

What do you think? Could I sell that story to Hollywood studios today? They are constantly on the lookout for the epic battles of unreal characters. This is the age of spectacle and wonder. This is the CGI call, the way the camera captures all our motions. This is the eve of the fall.

A brief historical overview

It has been ten years now since Wired magazine’s Chris Anderson first published his essay “The Long Tail,” in which he described a new world order for filmed entertainment. With affordable avenues for distribution multiplying day by day, Anderson argued that the era of the blockbuster was over. Easier access to distribution would empower filmmakers of all styles. No longer would the deep pockets of Hollywood studios control what we could see.

It’s a nice theory.

There has been much analysis already as to the accuracy of Anderson’s prediction. Five years ago, writing for Reuters, Yinka Adegoke began an essay with “Far be it for us to be the umpteenth person to assail Wired editor Chris Anderson’s much quoted and yet much maligned book, The Long Tail …”[1] Harvard Business School professor Anita Elberse’s recent analysis in Blockbusters: Hit-making, Risk-taking, and the Big Business of Entertainment suggests that though the tail Anderson foresaw may have indeed grown longer, it has also grown thinner.[2] That means there are more products available, but only a minuscule part of the population is exposed to most of them. With technology changing at such a rapid pace, it is unwise to make any bold pronouncements about what the landscape will look like ten years hence. But we can ask one very pertinent question about the last ten years: Regardless of whether the long tail is thriving or is a myth, are the movies getting better?

I want to look at the current state of U.S. cinema and discuss crucial cracks in infrastructure that have led to a decline in quality, and which, if left unchecked, may have disastrous consequences in the not-too-distant future.

It is useful to look back at Hollywood in the years after WWII when considering the current state of affairs. Throughout the 1920s and ‘30s, U.S. film had ruled the world. Of all the leading film-producing nations before WWI, America was the one country that was not devastated by the Great War. Hollywood studios took advantage of that situation, and by the eve of WWII, U.S. film held dominion. The product was homogenized. Producers, more than directors or screenwriters, influenced the artistic output. The studio system may have muted individual expression, but as an economic model, the system worked magnificently. However, trouble was lurking. Despite getting a temporary reprieve during the early ‘40s, as soon as the war ended the U.S. Justice Department essentially declared the major studios monopolies and forced them to sell off the exhibition arms of their empires (their theater chains). The economic model began to crumble. At the same time, the Red Scare led to blacklisting, which sucked much of the artistic vibrancy out of a frightened industry. But the big blow came from the usurper—television.

Hollywood initially laughed off the new medium, but as the ‘50s went on, it became increasingly obvious that television was no laughing matter for the film industry. Television set ownership increased some fiftyfold throughout the decade. People were staying home to watch their Westerns. Hollywood had to respond.

The studios responded by trying to offer the viewing public the types of experiences television could not yet duplicate. This was the era of presentational innovation—or gimmickry, depending on your particular tastes. The previous major innovation in the presentation of movies—the advent of synchronized sound in the late ‘20s—had come shortly after another new medium, radio, began challenging movies for audience share. In the ‘50s, color film became standard. Early forms of 3D were rolled out. Wide screen formats were aggressively marketed. And in terms of content, bigger and more lavish movies became common. The Production Code, which had been putting a damper on adult content in U.S. film since the early 1930s, was still in effect, but its impact was waning. Sex—and not just procreative sex between husband and wife—found an increasingly friendly home on the Hollywood screen, whether salacious (as in the case of Mark Robson’s soap opera Peyton Place, 1957) or sophisticated (as in the case of Richard Quine’s very adult romance Strangers When We Meet, 1960). Far from retrenching, the major studios began spending more than ever to convince the U.S. viewer to get off the sofa and come to the theater.

And it worked. At least until it didn’t.

It’s an oversimplification to say that U.S. filmmakers gave up on well-crafted stories and complex characters in favor of spectacle. But it is clear that the emphasis shifted. More value was placed on the spectacular, whether it was in a period piece or a musical. More emphasis was also placed on derivative work—work that was based on previously established material and offered producers a safer proposition. Representative blockbusters of the decade included Peter Pan (a youth-based spectacle from J.M. Barrie’s stories and stage play) and The Ten Commandments (remade by the same man, Cecil B. DeMille, who had scored a major hit with it in the silent era). As studio monopolies weakened, a new crop of independent producers, often using United Artists as a distributor, emerged. Some of their movies, like Aldrich & Associates’ The Big Knife (directed by Robert Aldrich in 1955) and the Hill-Hecht-Lancaster production of Sweet Smell of Success (directed by Alexander Mackendrick and starring Burt Lancaster in 1957), seemed to enjoy turning a scathing eye toward the mainstream entertainment industry with small, well-crafted screenplays and minimal visual extravagance. Meanwhile, at the studio level, relatively less emphasis was placed on developing original dramas and comedies that could appeal to an adult audience.

What followed was a brief period of success, and then the doldrums. The ‘60s. The Dead Ball Era in baseball. Even the Yankees weren’t any good. And by almost every metric, U.S. films hit new lows as well. The overall artistic quality fell off dramatically. The period of studio neglect—of failing to develop original adult stories—caught up with the industry in a major way. Alexander Mackendrick, though only 45, was essentially done as a feature film director after Sweet Smell of Success, in part because he didn’t have the support of a studio behind him.
Robert Aldrich would align himself more closely with the studios and make a series of successful movies after The Big Knife, but by the mid-1960s, when he was not yet 50, his career as a director of note was also largely over. Things got so bad in the early part of the ‘60s that film critic Danny Peary, when putting together his popular collection of “Alternate Oscars,” had to simply leave 1963 as a void, concluding there were no “best pictures” that year. For 1965, he had to turn to a Polish director filming in England (Roman Polanski’s Repulsion) to find the only movie he considered worthy.[3]

Things began to turn around late in the decade. Economic disaster is a great motivator, and the creaky old studios were desperate to find a way to speak to the new generation. They were open to experiment. Mark Harris’s excellent book Pictures at a Revolution chronicles the 1967 Oscar race for Best Picture in which two stodgy old-school relics met up with two new and exciting pictures.[4] History would rank the new-school films, Bonnie and Clyde and The Graduate, far higher than the old-school entries, Guess Who’s Coming to Dinner and Doctor Dolittle. This seemed to usher in an era of intriguing new movies throughout the late ‘60s and early ‘70s. Then came the dual punch of Jaws (1975) and Star Wars (1977), and the blockbuster era was born. You might say that over the last forty years, a battle has raged in the U.S. film landscape between those blockbusters and the smaller, more indie-oriented style of film. If Chris Anderson was correct in 2004, then we should be seeing the smaller films flourishing. They can’t be expected to outdraw studio-backed blockbusters, but could more accessible avenues of distribution, combined with more affordable methods of production, actually allow for more diverse types of films to play a major role in the industry? And more importantly, can the overall crop of movies, large and small, get better?

I do not dispute the underpinning of Anderson’s analysis. There is no question that the industry has changed. But I, like Adegoke and Elberse among others, have difficulty sharing his optimism. Major studios are more dependent today on spectacle (such as the Marvel comics franchises) and on derivative stories (see Tak3n) than at any point in film history. As Time Warner CEO Jeff Bewkes said back in 2009, the success of The Dark Knight resulted in the following takeaway: “The obvious thing we’re going to take from (the film’s success) is more Dark Knight.”[5] This lack of innovation at the top of the food chain has disastrous implications. No matter how many Lena Dunhams (Tiny Furniture, 2010) or Cary Fukunagas (Sin Nombre, 2009) emerge, it is highly likely that at some point they will become engulfed by a system that seems intent on recycling unreality.

The smaller, more indie-oriented films have indeed won the awards. The blockbusters and the remakes and the sequels have won the box office. It should not come as a great revelation that bigger, more spectacle-oriented movies, or sequels to popular movies, do better at the box office than smaller-scale original works. The bigger movies cost more. You would hope they earn more back. Historically, major film producers have been willing to spend big bucks on extravagant projects because the potential return is so great. The red ink from many failed movies can be washed away with one mega-hit. If you look at the top ten lists for virtually any decade, they will be dominated by spectacle-oriented films. But in past decades, even mainstream studios offered a more balanced and diversified roster of films. Therefore, if you go a little deeper into box office returns—say, the second ten in a given decade—a disturbing trend emerges.[6]

In the 1960s, that second ten had a fairly even mix of spectacle and story. Big budget spectacles like Cleopatra and 2001: A Space Odyssey were balanced by dramas like the aforementioned Guess Who’s Coming to Dinner and Bonnie and Clyde. Even the top ten, which was skewed toward big-budget musicals, included Butch Cassidy and the Sundance Kid and The Graduate.

The 1970s, which witnessed the brief resurgence of original drama and comedy, were a gold mine for fans of those types of movies. The top ten boasted four original movies not primarily based on spectacle: The Sting, Animal House, The Godfather, and Smokey and the Bandit. A little something for everyone. And the second ten was made up almost entirely of similar non-blockbusters. Only The Towering Inferno, Jaws 2, and Airport could be called out-and-out spectacle blockbusters.

Moving into the 1980s, blockbuster culture was clearly on the rise. And sequels were also gaining support in the board rooms of the major studios. Even so, original comedies Beverly Hills Cop and Back to the Future cracked the top ten, and the second ten was again dominated by original adult-oriented fare like Tootsie, Rain Man, and Fatal Attraction. By the end of the decade, the major studios had essentially cast their lot with spectacle. “Packaging” and “franchises” were the concepts of the day. The symbolic knockout punch came in 1989, when Time-Warner rolled out the new Batman franchise. [7] The blockbusters had won.

The 1990s, though clearly within the recognized era of the blockbuster, had three top ten movies—Forrest Gump, The Sixth Sense, and Home Alone—which were not initially conceived of as blockbusters. The fact that they all did exceptionally well should not erase the fact that they were developed as original stories (or, in the case of Forrest Gump, adapted from a novel), primarily based on plot and character. Special effects may have figured into them, but they were not what we would consider special effects movies. The second ten had a few movies—Mrs. Doubtfire, Ghost, and to a certain degree Saving Private Ryan—that were conceived of in terms of plot and character as well. Private Ryan clearly has a great many effects, but I consider it an original story in which effects played an important, but not a defining, role.

As the millennium turned over, non-spectacles and non-sequels all but vanished from the top of the box office lists. Amongst the top twenty films of the new century’s first decade, you would be hard-pressed to find a title that is not primarily dependent on spectacle, or is not a sequel. Often, it is both. The closest you can come to an exception would be movies like The Passion of the Christ or the first Harry Potter film, neither in the top ten. There are some very good movies in that top twenty—movies which boast good plotlines and intriguing characters. Movies like The Dark Knight and The Return of the King. But it needs to be pointed out that unless you really stretch the definition, for the first time in its history, Hollywood did not produce a single top twenty original movie in the broad genres of drama or comedy. So far, in our current half-completed decade, that trend has continued. There is not a single original adult-oriented story not predicated upon spectacle amongst the box office giants. There have been movies based on comic books (The Dark Knight Rises, 2012) and cartoons geared towards kids (Frozen, 2013). There have been movies about dystopian rebellions (The Hunger Games, 2012) and movies about superheroes (Marvel’s The Avengers, 2012). And there have been sequels galore—provided they were about dystopian rebellions and superheroes (The Hunger Games: Catching Fire, 2013; Toy Story 3, 2010; Iron Man 3, 2013). But where are the economically successful suspense stories (like The Sixth Sense) or the comedies (like Tootsie) or the dramas (like Kramer vs. Kramer)? If they exist at all, they are being drowned out by an ever-increasing flood of Fast & Furious Despicable Iron Men, Pt. 3.

What it means

Should this matter? If Anderson was right, then should we care that the multiplexes are devoting more and more screen space to derivative, spectacle-oriented work? If you can watch the latest low-budget art house gem on your new, improved iPhone, then what does it matter?

It matters a lot. It may be true that real artistic innovation can come from a couple of kids with a camcorder and a dream. They funnel that dream through whatever cheap editing software they have and put it up as a short-form Internet program. It lights a spark. The ultra-hip youth of Portland or Memphis or Manchester pick up on it at first, and soon it has gone viral. Then what happens? Either it flames out, because the market is so glutted with other kids with other dreams and different editing suites that, without financial backing, our original dreamer has to fold up her tent. Or she goes after the cash. And by cash, I mean the deep-pocketed Hollywood studios that have always been quick to buy up whatever is young and hip and hot. But if those Hollywood studios are so entrenched in the culture of spectacle and sequel, how are they possibly going to foster new movies without having them fall into the spectacle-sequel trap?

The long tail theory is predicated upon more accessible methods of production and distribution. What it never accounted for, or at least what it never properly accounted for, was marketing. In the free-for-all world of modern distribution, rising above the multitudinous din requires some magical potion of savvy and luck. And money helps. The world of popular music has a longer track record with the long tail and it is hard to see where positive musical change has come about. Anita Elberse’s book notes that approximately one third of all music tracks that are downloaded are downloaded a single time. A sizeable piece of that long tail could blow away in a modest wind. It is far cheaper and less time-consuming to record a three-minute pop song than to produce a feature film. The musician is better positioned to throw out more product in hopes of catching a spark.

For the independent filmmaker trying to catch that same spark, social media cleverness and an MBA’s sense of guerrilla marketing become essential. Notice that these skills have little to do with cinematic originality. There have been some start-ups that have attempted to fill in the marketing gap for new, out-of-the-mainstream filmmakers. The From Here to Awesome (FHTA) Festival, begun in 2008 by independent filmmakers Lance Weiler, Arin Crumley, and Mike Belmont, was a noble effort to create a community amongst the would-be up-and-comers.[8] But its method for selecting worthy projects was predicated on Internet voting, and Internet voting favors the savvy marketer. A few years later, NeoFlix, a fulfillment house designed to empower independent filmmakers by providing marketing and distribution services, was launched with great fanfare. Within a few years, it had to close shop. Its president John Chang, in announcing the collapse, wrote, “The long tail concept did not track for most clients as most films would receive a burst of sale in the initial weeks or perhaps even months, and tail off sharply after that.”[9] What is most interesting about this statement is that it is equally true for almost all movies, whether produced by your next-door neighbor or Time Warner. Time Warner, however, is diversified. One hit pays for a lot of failures. They can cross-promote and market those “failures” into new lives. Your next-door neighbor likely cannot.

Dan Gilroy’s 2014 Nightcrawler has gotten very solid reviews, which tend to focus on the film’s indictment of modern media practices. Those reviews miss what is truly important about the movie. Haskell Wexler’s Medium Cool (1969) and the prescient Sidney Lumet/Paddy Chayefsky collaboration Network (1976) nailed the media element 40 years ago. What’s different about Nightcrawler (and please remember, we are talking about the movie and not the namesake X-Man mutant superhero with blue skin and a prehensile tail in X-Men 2) is the way its hero Louis Bloom aggressively markets himself. Louis is a freelancer who has taken modern business classes and understands the value of branding. He doesn’t run from his amorality. He cultivates it. He is not looking to reinvent journalism. He just wants as big a piece of the existing industry as he can get, and he knows marketing is the way to get it.

Back to the only really important question: Is the United States producing better or worse movies?

Christopher Nolan once made Memento. Now he makes Batmen. Bryan Singer made The Usual Suspects before moving to the land of X-Men. Marc Webb was responsible for (500) Days of Summer. Now he’s responsible for not one, not two, but three Amazing Spider-Men. How about the Russo brothers, going from the TV gem Community to Captain America, or Marc Forster, going from Monster’s Ball to World War Z, or James Mangold, going from Walk the Line to Wolverine (and its sequel)? James Wan was the most stylish new purveyor of horror when he made Saw. Now he’s the stylish new purveyor of the seventh Fast & Furious. Need I go on? I’m not saying some of these spectacles aren’t fine films. I am saying that Hollywood is rapidly approaching the point at which it will turn everything into a spectacle. Or a remake. Or a remake of a spectacle.

The muted voice of independence

Look at it another way. The Independent Spirit Awards were established in 1984 in large measure as a reaction against that tidal wave of blockbusters that had been launched a decade before. The goal of the awards was to recognize and promote artists working outside the mainstream of Hollywood production. There were special awards for first features and micro-budgeted films. And though you can quibble with individual tastes and selections, there is no question that throughout their first fifteen years, the awards played a major role in developing powerful, independent-minded voices. If you consider the men and women who have been nominated in the First Feature category—the up-and-comers, the future of the industry—the Independent Spirit Awards have a staggering track record. There was no “First Feature” award in 1986, but the co-recipient of the Best Director award was Joel Coen, winning for his first movie, Blood Simple. Since then, and prior to 2000, here are some other names that have either won or been nominated for their first features: Spike Lee, Todd Haynes, Richard Linklater, Quentin Tarantino, Robert Rodriguez, David O. Russell, Kevin Smith, Paul Thomas Anderson, Darren Aronofsky, Spike Jonze. They are among the most important directors working in America over the last 25 years. This time period was dubbed by film critic Jeffrey Sconce the era of “new smart cinema,” and it may be the last great period of U.S. film.[10]

Look at the same list post-2000. Obviously these directors would not have had as much time to develop their careers, but many of the names on the list above had followed up their initial success with something of note within five years. Todd Solondz, who was nominated in 1995 for his withering debut portrait of U.S. youth, Welcome to the Dollhouse, was able to direct the equally withering portrait of suburbia Happiness just three years later. Alexander Payne, who wasn’t even nominated for his satire Citizen Ruth (1996), was still able to build on the promise of his debut with the equally penetrating and more accessible satire of Election (1999). If you focus on the more recent crop of Independent Spirit nominees who have had more than five years since their debut, you reach a disheartening conclusion. There are some excellent movies represented on the list, but where are the careers for Richard Kelly (Donnie Darko) and Dylan Kidd (Roger Dodger)? Karyn Kusama (Girlfight) and Patty Jenkins (Monster)? Goran Dukic (Wristcutters: A Love Story) and Vadim Perelman (House of Sand and Fog)? It is hard to find anyone who was nominated for a first feature between 2000 and 2009 who has developed a significant film career. Arguably the most successful director of the lot has been Catherine Hardwicke, who exploded on the scene with the in-your-face coming-of-age drama Thirteen in 2003. Her best-known work since then was 2008’s crowd-pleasing but non-threatening vampire fantasy Twilight. Many writers, like Paul Haggis, or actors, like Julie Delpy, launched directing careers during the decade, but to this point, none has created what we might consider a significant directing career. Maybe the most intriguing name on the list of recent nominees is Lena Dunham, who has yet to make a second film after Tiny Furniture in 2010. Dunham, like Patty Jenkins and Nicole Kassell (The Woodsman) before her, has found a more welcoming home on television than in the world of feature filmmaking.[11]

Or consider the following list of names: Henry Bean, Rebecca Miller, Shari Springer Berman, Robert Pulcini, Shane Carruth, Richard Glatzer, Christopher Zalla, Courtney Hunt, Debra Granik. Those are the directors of eight of the ten Sundance Film Festival Grand Jury Prize winning movies from the first decade of the 21st century. Sundance, even more so than the Independent Spirit Awards, was once a harbinger of new independent voices. The movies represented on this list, like Carruth’s mind-blowing Primer (2004) and Granik’s brilliant drama Winter’s Bone (2010), are not lesser films than earlier Sundance winners. But their directors have not been able to use this recognition to build successful filmmaking careers. The two filmmakers I did not list are Ira Sachs and Lee Daniels. Daniels managed mainstream success with Lee Daniels’ The Butler (2013), and Sachs has gotten some attention for 2014’s Love is Strange. The fact that I personally find Love is Strange very disappointing doesn’t particularly bother me. I am just grateful that Sachs had the opportunity to film a story about aging homosexuals, a subject rarely confronted by mainstream Hollywood.

Why have these particular barometers ceased to be predictors of future success in U.S. film? Is it because the directors simply aren’t as good? Is it because the public has soured on films with an “independent spirit”? Independent filmmakers from earlier eras, from the Italian Francesco Rosi,[12] maker of such seminal movies as Salvatore Giuliano (1962) and Chronicle of a Death Foretold (1987), to producer Maggie Renzi,[13] who worked with indie icon John Sayles right through the heart of that New Smart Cinema era, blame the relative ease of access to both production materials and money for weakening the crucial winnowing-out process that existed back when it was harder to make a movie. Whether or not that’s true, the answer, which is almost certainly a combination of many factors, really doesn’t matter. Whatever the reason, the mainstream of U.S. filmmaking seems less able to foster emerging talent than at any time in its history. And that, regardless of distribution methods, is a scary proposition.

Last year, when accepting an award from the Washington DC chapter of Women In Film & Video, veteran producer/director Penny Marshall told the audience that the movie for which she was being honored (National Film Preservation Board selection A League of Their Own) could not be made today. It had no trailer moments (read: aliens or explosions) on page 1. It was not animated, and thus not easily translatable into foreign languages and markets. It was not based on a very popular novel or play. It was not a sequel. It was just a good story.

The case of Interstellar

Christopher Nolan, who began his career with the micro-budget thriller Following in 1998, released Interstellar in 2014 with a reported budget of $165 million. There is the suggestion in Interstellar that mankind’s capacity for survival is virtually limitless. So too, it seems, is the number of metaphors about the current state of U.S. film that can be drawn from Nolan’s new movie. The one that involves J.C. Chandor may be the most telling.

Chandor, along with Scott Cooper, suggests that the future for new U.S. filmmakers may not be as bleak as what I outlined above. Within the last five years, both men have put out strong debuts (Margin Call and Crazy Heart, respectively) and then been able to follow them up with promising second features (All is Lost and Out of the Furnace). Of course, the jury is still out on their long-term career prospects. Chandor presents a particularly intriguing case because of his direct run-in with a major studio blockbuster.

Chandor’s second movie, 2013’s All is Lost, shared a premise with that year’s blockbuster Gravity, but took a radically different approach to storytelling. His third film, A Most Violent Year, was scheduled for release on December 31, 2014. The crime story, starring Oscar Isaac and Jessica Chastain, received generally good early buzz. Chastain received particular praise. As the film began playing at festivals, her performance garnered Oscar talk. Under normal circumstances, distributor A24 would aggressively promote such a performance by having Chastain make the media rounds, appearing at events, doing interviews, etc. But Chastain is also co-starring in Paramount’s Interstellar, which is vying for box office and Oscar nominations in the same year. As detailed in the New York Times on November 5, 2014,[14] Christopher Nolan and Paramount have effectively prevented Chastain from doing publicity for A Most Violent Year. They have her devoting all of her “marketing” energy to Interstellar. The final chapter is yet to be written on the two films, but this situation certainly provides a clear picture of the way in which marketing prowess can trump other considerations. Regardless of access to production and distribution, marketing gives the major studios a seemingly insurmountable advantage.

And the marketing of Interstellar provides a window into what Hollywood considers its most important asset in 2014. Early trailers for Nolan’s movie were fairly standard. They stressed the intriguing premise—that Earth is a used-up planet and that in order to survive, mankind must find a new home—and the characters involved—primarily Matthew McConaughey as the hero torn between wanting to save mankind and not wanting to abandon his daughter. There was plenty of spectacle on display, as you might expect in a space-travel movie. But as the release date grew closer, the trailers went through a fascinating change. No longer was the story front and center. In fact, the premise, the plot, and the characters were virtually invisible in the new trailers. They were replaced by the technicians who worked on the film, from Nolan himself to his designers and technical crew, touting the “reality” of the production. Chastain was replaced by costume designer Mary Zophres, noting how all the costumes were designed so that they could really be worn by space travelers. It was as clear an example as I have ever seen of spectacle replacing story as the central element in the creation and marketing of U.S. film.

That is, until I saw a screening of the documentary Glen Campbell: I’ll Be Me (2014). The remarkable movie tells the story of country music legend Campbell’s final tour, covering 151 shows. Shortly before the tour, Campbell was diagnosed with Alzheimer’s disease, and the film provides an intimate portrait of the man, his family, and his devilish adversary. The movie, which was still struggling to find wide distribution, had a special one-time screening at a multiplex in suburban Maryland. Interstellar was playing in the next theater, and on at least three occasions the explosive bass from Interstellar’s booming soundtrack invaded the neighboring theater like a freight train trundling by your window in the middle of the night. I’ll Be Me’s producer Trevor Albert could only laugh at how perfectly those moments summed up the way in which big-movie spectacle was trampling smaller independent cinema.

Whither goest Aristotle?

The studios have always been greedy. The original moguls—Zukor, Mayer, Warner—never considered themselves artists. They saw a way to make a quick buck and they ran with it. But there was a time when making a quick buck meant developing stories that adhered to the dramatic elements identified by Aristotle more than two thousand years ago. Of the six elements he identified (plot, character, thought, diction, song, and spectacle), plot and character came at the top of the list. Spectacle was at the bottom. Aristotle allowed that certain types of plays elevated spectacle above that sixth-place ranking.

We have entered an era of U.S. film in which spectacle is clearly considered to be the most important of those elements by the people who are in a position to choose what will be filmed. When that is combined with an over-reliance on sequels, the future of the industry is in peril. Producers want to make sequels to popular movies because they consider that the safer bet. With budgets on major studio films skyrocketing past $100 million, producers are understandably hesitant to take risks. Like their forefathers in the ‘50s who tried to spend their way out of the war with television, today’s producers have adopted the mindset that more money on lavish spectacle will save film. And like those forefathers, in the short term, they appear to be onto something. Profits are strong. But those profits are propped up by multiple factors. Exhibitors have maximized consumer spending through attractive food and beverage displays and by converting their screens into advertising media. Producers are seeing great rewards in international markets, where spectacle and animation overcome the language barrier more readily than other genres do. But over the last ten years, actual ticket sales have declined markedly. I contend that the emphasis on spectacle and its cousins, sequel and cartoon, is undermining the infrastructure of an industry built upon clever, moving, and innovative stories. How many more superhero origin stories—and sequels to superhero origin stories—can we really watch? How many more times can the zombies attack? How many more times can the world nearly end?

The end

Zach Braff directed his first movie, Garden State, in 2004. It took him ten years to film his next project, 2014’s Wish I Was Here. Did Braff, a very popular television actor, particularly among the young demographic in which Hollywood seems most interested, have to wait so long because his first film grossed only $25 million in domestic receipts? It made back ten times its estimated production budget, but still grossed less than half the cost of an average Hollywood production.[15] Or did he run into roadblocks because Garden State was the first mainstream drama to level an accusatory eye at the over-medication of America’s youth, which by extension could be read as a criticism of the vast U.S. medical/pharmaceutical industry? In other words, is Hollywood’s neglect benign or malignant? Either way, I think we can all agree that making one picture every ten years is not a reliable recipe for artistic growth. I have faith in Hollywood’s capacity to be part of the solution (even if it is a major part of the problem) because Hollywood has done it before. The numbers may be bigger in 2014 than they were in 1914, but the imperative to make a profit is no different today than it was when Adolph Zukor battled Thomas Edison for control of a new industry. Zukor and his cohorts won by investing in stories and marketing the hell out of them. Can it happen again?

The USA has produced some remarkable movies in the past few years: deeply felt, well-crafted adult stories that explore important thematic questions. Movies like Denis Villeneuve’s Prisoners, Jeremy Saulnier’s Blue Ruin, and Damien Chazelle’s Whiplash have been distributed through mainstream channels such as Warners, Sony, and the Weinstein Company. All of these companies have smaller, more independently minded subsidiaries which seem constantly on the verge of merger, reorganization, or outright shuttering. It has happened to Warner’s Picturehouse, to Paramount Vantage, and to Capitol Films’ THINKFilm. Such producers are always at risk during times of economic turmoil, and 2008-2009 was particularly difficult for these types of production houses.[16] But hopefully the mainstream studios will see value in maintaining an industry that promotes well-crafted adult stories.

I do not have an all-encompassing solution for how this might be resolved. But if I have learned anything from the long tail, it is that small voices like mine don’t need to solve all the major problems. We just need to contribute a thought. Then, hopefully, a million of those thoughts join forces at the back end of that tail, and the best ideas can emerge, polished and evolved, ready to change the world. So here’s my thought: a two-tiered Academy Award system. Identical awards for big-budget and for medium/small-budget films. Draw the dividing line wherever you want; start at $100 million. This way, smaller movies would get the marketing weight of the Academy behind them. I know what you are going to say: the Academy has been giving big prizes to modestly budgeted movies for a while now, so why would two tiers improve the situation? Maybe it would get audiences to focus more specifically on smaller-budget films and see them as a group worth applauding. Or maybe it wouldn’t. It’s just a thought. Go ahead and add yours. Don’t let Nightcrawler (and here, we are in fact talking about the X-Man) have the only meaningful tail in modern U.S. film.


On December 18, 2014, as this article was being prepared for publication, Sony Pictures decided to cancel the scheduled release of The Interview. The comedy, starring James Franco and Seth Rogen, was premised on the fictional assassination of North Korean leader Kim Jong-un. After threats of violence against theaters intending to present the film, all major exhibitors cancelled their screenings and Sony withdrew the release. This is a very early chapter in what is likely to be a complex story about business, government, philosophy, and personal expression. But one thing became clear almost immediately: there were a great many calls for Sony to release The Interview online. Whether it proves major or minor over time, the distribution avenues on which the long tail is predicated are here to stay. Might we see a two-tiered system in which difficult material is relegated to online distribution while spectacle is similarly relegated to traditional movie theaters, a paradigm not unlike the one that has developed in the U.S. theater world, where Off-Broadway and regional theaters show the challenging material while Broadway specializes in spectacle? Check back in another decade to examine how events none of us see coming will continue to remake the system.


1. Adegoke, Yinka.  “Time Warner: It’s the Hits, Stupid.”  Blog.reuters.com. U.S. version. Reuters. 8 January, 2009.  Web.  3 November, 2014.

2. Elberse, Anita.  Blockbusters: Hit-making, Risk-taking, and the Big Business of Entertainment. New York: Henry Holt & Co., 2013.

3. Peary, Danny. Alternate Oscars. New York: Dell Publishing, 1993.

4. Harris, Mark. Pictures at a Revolution. New York: Penguin Books, 2008.

5. Lewis, Hilary. “Ready for ‘The Dark Knight’ Parts III, IV, V, and VI?” Businessinsider.com. Business Insider. 4 February, 2009. Web. 5 November, 2014.

6. Box office numbers are sometimes difficult to confirm.  I have tended to rely on Tim Dirks’ website www.filmsite.org.

7. Zoller Seitz, Matt. Defining Moments in Movies. Ed. Chris Fujiwara. London: Cassell Illustrated, 2007: 619. Seitz’s short essay defines what he calls the “fully vertically-integrated corporate blockbuster.”

8. Weiler, Lance. “Virtual Discovery.”  Filmmaker. Vol 17, No. 2. Winter 2009.

9. Macaulay, Scott. “Blog.” Filmmakermagazine.com. Filmmaker. 5 February, 2011. Web. 5 November, 2014.

10. Sconce, Jeffrey. “Irony, Nihilism, and the New American ‘Smart’ Film.” Academia.edu. Winter, 2002. Web. 6 November, 2014.

11. Many female writer/producers have found a more welcoming home on television than in the world of feature films. Writer Winnie Holzman, creator of the teen drama My So-Called Life, speaking on a panel at the 1993 Austin Heart of Film Screenwriting Conference, explained some of the advantages of television. One advantage she described was that television operates on a much faster work schedule, so feedback from the audience comes much faster than it does for a movie. The learning curve for a television writer is therefore much steeper: they learn very quickly what works and what doesn’t, and can grow and improve as a result.

12. From the Criterion Collection DVD of Salvatore Giuliano. In commentary, film critic Peter Cowie relates the following from Francesco Rosi: “Our generation was impressed and dominated by the camera because the camera was a means of expression. Nowadays, with digital cameras and very small cameras, it’s all changed. At the time of Salvatore Giuliano, the camera was a mystery and had to be wielded with great skill. The choice of composition or movement was very important. Today, young directors are much more casual in their attitude toward the camera.” Date unknown.

13. Carson, Diane. “Let’s Not Compromise: An Interview with Independent Film Producer Maggie Renzi.” Journal of Film and Video. Spring, 2010. Renzi, discussing the current crop of smaller, independent-minded films, says “... they're all competing for the same space in art house theaters and in the mind of the moviegoers. What happens with this glut of films (where there's no real winnowing system early on, and there's the huge impulse toward anything new and reaching the youth audience) is that we're getting a lot of unfinished films, films where the bar isn't set high enough for storytelling, for universality of story, of theme, and particularly where the production values are really not good enough.”

14. Cieply, Michael. “One Star, 2 Films and Conflict.” Nytimes.com. The New York Times. 5 November, 2014. Web. 6 November, 2014.

15. Corliss, Richard. “Billion Dollar Babies.” Content.time.com. Time Magazine. 1 January, 2010. Web. 8 November, 2014. http://content.time.com/time/specials/packages/article/

As with box office figures, budget numbers can be hard to verify.  Corliss notes that the average movie budget doubled in ten years, from $53 million in 1998 to $106 million in 2008.

16. Zeitchik, Steven. “Specialty Film Business Reeling After Cutbacks.” Reuters.com. Reuters. 6 June, 2008. Web. 5 November 2014.

Additional Reading

Denby, David. Do the Movies Have a Future? New York: Simon & Schuster, 2012.  New Yorker film critic David Denby discusses a variety of issues facing the modern film world, including the concept of “platform agnosticism.”

Turan, Kenneth. Sundance to Sarajevo: Film Festivals and the World They Made. Berkeley and Los Angeles: University of California Press, 2002. Turan, film critic for the Los Angeles Times, discusses the impact of film festivals on the film world in a personal account of his travels.

Various. History of American Cinema Series. Ed. Charles Harpole. Berkeley and Los Angeles: University of California Press, various dates. Harpole’s ten-volume history, each book written by a different author and focusing on a different decade, is an excellent resource for tracking the evolution of the American film industry.


Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 2.5 License.