Technology In Today’s Movies & The Future of Film


There are any number of great blockbusters from the last two decades, marvels at the time, that simply do not visually hold up. We as audiences have become so used to great cinematic spectacles with bizarre creatures, wondrous worlds, and dazzling special effects, that the only way to realize how accustomed to it we’ve become is to look to the past, examining what we remember as equally compelling. Film is, and needs to be, constantly evolving. 

For example, the Star Wars prequels set out, among other things, to herald and champion an era of CGI, using new technology to create a fuller, more diverse world. There were many disappointments, but at least they were somewhat interesting visually. Yet, if you were to watch them today, they would no doubt feel dated; those once-impressive computer images may now, in light of more recent technological cinematic achievements, seem flat, dull, and distracting.

These films were made a relatively short time ago, but such is the rapid-fire evolution of our entertainment landscape. While streaming services are running rampant and audiences bemoan the cinematic releases of sequels, prequels, and other world-building franchise fare, blockbuster films have the opportunity––and certainly the financial incentive––to get increasingly creative.

Let’s take a look at the newer Star Wars entries and what they’ve done to alter filmmaking practices; let’s also consider other ways in which going to the movies can, and cannot, be made more engaging for the viewer.

Star Wars 

In Rogue One: A Star Wars Story, Disney essentially brought an actor back from the dead. Peter Cushing played Grand Moff Tarkin in A New Hope in 1977, and the character was so necessary to the 2016 film that Disney used a combination of digital technology and a stand-in actor to resurrect him.

This film also featured a cameo by a younger –– roughly forty years younger –– Princess Leia, another digital feat. Since then, Disney has de-aged a slew of actors in its films, most notably Robert Downey Jr. in Captain America: Civil War, and most recently Michael Douglas, Michelle Pfeiffer, and Laurence Fishburne in Ant-Man and the Wasp. In these cases, the actors are some twenty or thirty years younger. What’s more, it’s been announced that for Captain Marvel, a superhero film set decades earlier, Samuel L. Jackson will reprise his role of Nick Fury, but as a much younger man (with two working eyes!).

So clearly, with money and effort, we can make actors much younger, or even bring them back from the dead. Naturally, this leads to a whole bunch of ethical questions as we ponder the future of film. Guy Henry, who played Tarkin (sort of), told the Hollywood Reporter: “I think and hope it won’t be a commonplace thing.” So there you have it.

4DX

While digital technologies have the potential to change and essentially reanimate the actors and characters we know and love, theatres have made attempts to better engage viewers in the physical space. 

Enter 4DX, an ongoing attempt by theatre chains to enhance the sensory experience of films; that is, instead of just watching and listening to a movie, one can smell and feel it (unsure about taste just yet). Regal Cinemas and others have introduced this at select theatres; moviegoers can expect moving chairs, blowing wind, and aromas rising in the air.

Theme parks have long utilized this tech, but now movies (mainly those action and adventure films catering to the under-thirty-five crowd) are working with it. Reviews so far are middling to poor, but there does seem to be some potential not yet fully realized.

Choose Your Own Adventure 

There is something especially fun and satisfyingly gimmicky about the pop culture phenomenon that is ‘choose your own adventure.’ It seems mostly relegated to kids’ books and anything that falls under the genres of science fiction, fantasy, or horror, but there are a lot of interesting possibilities and considerable allure in the idea that the user can affect the outcome.

Now more than ever, the timing is ideal for such an endeavour. Firstly, this concept would (almost) only work at home; it doesn’t seem practical in a theatre with a bunch of people. With Netflix and others regularly releasing new films at home as opposed to cinematically, interacting with a film has never been easier.

Secondly, plenty of filmmakers and studios have found ways to make cheap movies that earn a lot of money –– particularly within the horror and romance genres, and a lot of films geared towards young adults. Blumhouse famously made the Paranormal Activity films, among many other horrors, and found success in shocks without expensive spectacle. There is even a rumour that its newest film, Unfriended: Dark Web, has two different theatrical endings. We’re already on our way.

Thirdly, with careful planning, the filming and editing of extra scenes doesn’t necessarily have to break the bank. For example, a claustrophobic horror film employing this tactic could have the viewer decide which doors to go through, which people to trust, and other decisions that don’t drastically change the setting or momentum of the film. Sure, this eliminates the so-called vision of the director, but for a decent number of films, it’s perhaps fair to say that audiences are investing in an experience, and don’t necessarily expect to be dramatically moved.

3D Without the Glasses

3D technology has endured various waves of innovation since the dawn of cinema; most of them were gimmicky, but there was always a curiosity. Films from as early as the 1920s worked with 3D technology, though it is debatable whether some of the early attempts qualify. The ‘50s made genuine attempts as well: Vincent Price led a horror genre that regularly used the technology to enhance scares and thrills, while Disney also employed the style for family-friendly fare.

The 1980s saw another boom in 3D, mainly for horror films again, as did the 2000s, a decade that remade a lot of popular horrors from said ‘80s. It was still mostly a novelty then, as studios realized they could make more money off 3D, and even more money if they did a poor job retrofitting films in 3D. It wasn’t particularly good or meaningful. Films have gotten better at it, though; those studios that go to the lengths to make 3D films – Marvel, for example – do a solid, if unnecessary, job of giving their films some depth.

In recent years, a team of scientists from MIT have developed a movie screen that projects in 3D and requires no glasses. While there are already TVs in existence that allow for glasses-free 3D viewing, the process for cinematic screens is much more complex; it requires the development of a product that not only functions on a larger scale, but also works with all viewers sitting at various depths and angles in a theatre.

While it’s been noted that the process isn’t entirely practical just yet, a major breakthrough came in 2016. But don’t tell filmmaker James Cameron what’s practical. Naturally, he has been working to incorporate new technology into his long-awaited, much-anticipated, opus of a follow-up to Avatar (the sequels have been almost a decade in the works). He has claimed that his films will be in 3D without the glasses, so we will see. And while this perhaps signals a looming end for companies that print 3D glasses, hopefully it means a better experience for moviegoers and less waste for everyone.

Anthony Marcusa
Anthony Marcusa is a Toronto-based freelance journalist whose writing dabbles in film, TV, music, sports, and relationships – though not necessarily in that order. He’s simultaneously youthfully idealistic and curmudgeonly cynical. But he’s always curious.