How special effects transformed the movies

To create movies of the quality we now expect, special effects houses have to use every trick in the book, from classic green-screen technologies to the creation of full artificial intelligence systems. It's no wonder that names like Industrial Light and Magic are as important in Hollywood as any producer's or director's.

While there's no question that you need advanced software techniques on your side to produce Hollywood effects, most of what's needed comes down to raw processing power. Many of the day-to-day tools used in the industry are mainstream applications available to enthusiasts and smaller studios.

Companies such as Softimage and Autodesk lead the way with suites of titles designed to cover everything from green-screen imaging and compositing to character animation, lip synching and lighting.

Custom code

These applications are fine for the basics, but the larger effects houses spend as much time on software engineering as they do on the artistic side, writing custom code to fix specific problems and bringing new effects to life.

Sometimes these become products in their own right, as happened with Pixar's RenderMan, the engine behind not only the company's own films such as Ratatouille and WALL-E, but also most major Hollywood blockbusters, including Harry Potter and I Am Legend.

Mental Ray is another common industry render engine, and it's used on all manner of Hollywood blockbusters. Essentially acting as an API, Mental Ray allows batch mode rendering within common software environments. This means that designers can render their output via their favourite software package, be it Maya, 3DS Max, Softimage XSI or Side Effects Software's Houdini.

The advantage of this is that designers and artists can use a common rendering file format – a '.mi' scene file – across different applications, using each app's own shading methods, procedural textures, bump and displacement maps, atmosphere and volume effects, environments, camera lenses and light sources.
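To make the batch-rendering idea concrete, here's a minimal sketch in Python of the kind of wrapper a small studio might use to push a directory of exported '.mi' scene files through a standalone command-line renderer. The 'shots' directory, the 'ray' binary name and its invocation are illustrative assumptions for this sketch, not documented Mental Ray options.

# Minimal batch-render sketch: hand every exported '.mi' scene file in a
# directory to a standalone command-line renderer, one shot at a time.
# ASSUMPTIONS: the host apps (Maya, 3DS Max, XSI, Houdini) have already
# exported their shots as .mi files into ./shots, and a renderer binary
# called "ray" is on the PATH -- the binary name and its arguments here
# are placeholders for illustration, not documented flags.

import subprocess
from pathlib import Path

SHOT_DIR = Path("shots")   # one exported .mi scene file per shot


def render_shot(scene_file: Path) -> int:
    """Run the (assumed) standalone renderer on a single scene file."""
    cmd = ["ray", str(scene_file)]      # placeholder invocation
    print(f"Rendering {scene_file.name} ...")
    return subprocess.run(cmd).returncode


if __name__ == "__main__":
    failed = [shot.name
              for shot in sorted(SHOT_DIR.glob("*.mi"))
              if render_shot(shot) != 0]
    if failed:
        print("Shots that did not render cleanly:", ", ".join(failed))
    else:
        print("All shots rendered.")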

The level of complexity involved here is closer to an engineering project than a standard artistic one, but it's wasted if the artistic side falls flat. Pixar is a great demonstration of the two working side by side. When Toy Story came out, the relatively primitive state of 3D graphics didn't allow for the complex effects we're now used to seeing – cloth effects, convincing human animation and photorealistic backgrounds, for example.

So the company focused on the type of effects it could pull off – rigid-body toys, where any weaknesses would simply contribute to the charm. Each subsequent release followed a similar pattern, introducing more realistic animation in A Bug's Life, mastering fur in Monsters Inc and coming up with the cartoon humans that made The Incredibles so much fun to watch. Every movie raised the stakes. Every movie was a hit.

The state of the art

The history of CGI in live-action films hasn't always been smooth. The earliest practical application of CGI is generally agreed to be the point-of-view sequences of Yul Brynner's robot gunslinger in the 1973 futuristic western Westworld.

The producers employed 2D computer-generated animation to simulate the robot's vision. For the 1976 follow-up Futureworld, the producers went one stage further and introduced 3D elements via rendered polygonal models, a technique which has now become standard. Not all the effects of the time were so complicated. In many cases, it was easier to cheat.

The TV version of Douglas Adams' The Hitchhiker's Guide To The Galaxy (1981) appeared to use computer graphics for the pages of the Guide, but in fact these were hand-drawn scenes created to mimic the style of contemporary computer animation. Other artists found that the technology available simply wasn't able to produce what they wanted.

The Japanese anime film Golgo 13 (1983) was one of the first movies of its kind to intersperse proper computer animation with traditional techniques, leading to a hysterical sequence in which the film keeps cutting away from its cel-animated hero to a blocky, untextured helicopter gunship.

It's therefore not surprising that the first truly legendary CGI-heavy film was, like Pixar's films, designed to play to the technology's weaknesses as well as its strengths. 1982 saw the release of Tron, complete with real actors and the first fully computer-generated 3D scenes.

"One of the difficult tasks on Tron was to create a unified look for both the real world and the electronic world," said producer Donald Kushner several years after the film's release. "Like in The Wizard Of Oz, there are two worlds. The difficult part was integrating both of them. We used computer simulation, we used backlit techniques and we used conventional live action. The challenge was to make it all look cohesive."

Beyond Tron

After Tron, a succession of watershed films employed ever more impressive CGI, from Indiana Jones and the Last Crusade, which featured the first all-digital composite scene, to Terminator 2: Judgment Day's startling visuals of the T-1000. The latter marked the first use of natural human motion for a computer-generated character. Its liquid-metal effects, particularly in conjunction with the then-revolutionary morphing technology that would soon take over every film and commercial in sight, were a particular eye-opener, giving us a villain that combined the best technology from both 1991 and a post-apocalyptic 2029.

It was Toy Story, though, that really cemented CGI's place in the industry. While producing the film, Pixar grew from just 10 people to 150 – an unheard-of number for a computer graphics project. Between 50 and 70 of them were on the technical team, working under technical director Bill Reeves and animator John Lasseter.

Their job was to push the company's own RenderMan software further than it had ever gone before. "If you have a good story and good characters, you can use CGI to create a movie that does $200 million at the box office and accolades up the wazoo," Reeves said, noting the importance of choosing the right project instead of just relying on effects. As for Lasseter, it's tough to argue with his recent description of his field: "Computer animation's an art form that grew out of a science."