Scientists had been experimenting with computer animation since the 1960s, but the birth of 3D animation is widely attributed to pioneers Ed Catmull and Fred Parke, who created the world’s first 3D rendered movie, A Computer Animated Hand, in 1972. The techniques used by Catmull and Parke, such as creating a wireframe skeleton and then rendering surfaces onto the model, formed the basis of 3D animation as we know it today. Catmull went on to co-found Pixar and served as president of Pixar and Walt Disney Animation Studios.
Interestingly, the original Star Wars (1977) was only the third movie ever to use this kind of technology, featuring CGI wireframe models in several scenes.
However, it wasn’t until 1982’s Tron that audiences around the world were introduced to solid 3D animation, such as digital terrain and digital vehicles. Since then, thanks to huge advances in animation software and the success of fully 3D movies like Toy Story, 3D animation has become just another part of life in the 21st century.
Introduction Of Computer Animation
Computer animation is believed to have originated in the 1940s and 1950s, when experiments in computer graphics were just beginning and computers themselves were still in their infancy. John Whitney is known as one of the first artists to delve into computer graphics. However, it took much longer before the field took off. By the early 1960s, digital computers had become widely available and computer graphics began to blossom; artistic experimentation with the medium followed in the mid-1960s.
Around the mid-1970s, computer graphics were just starting to enter public media. At this time, virtually all graphics were two-dimensional. As computers and associated technology continued to improve, more and more emphasis was placed on achieving three-dimensional realism. It was not until the late 1980s that photo-realistic 3D images started appearing in cinema. By the mid-1990s, the technology had improved so greatly that it became possible to create a complete feature film using nothing but 3D animation.
The Improvements Of The 80s
Major improvements in the 80s paved the way for 3D animation. Silicon Graphics and Quantel released new hardware and software that enabled the development of 3D animation at the time. In 1982, Japan took a leap forward: Osaka University built a supercomputer, the LINKS-1 Computer Graphics System, specifically for rendering 3D computer graphics, and by 1984 it was considered the world’s most powerful computer. The University of Montreal also played a role in bringing 3D animation to the forefront, producing three short 3D films with 3D characters in the 1980s.
In 1982, the public was introduced to 3D CGI. Walt Disney’s Tron was the first cinema movie to use solid 3D CGI heavily. The film, directed by Steven Lisberger, was considered a milestone for the industry, although it should be noted that less than 20 minutes of animation were incorporated into it. Two years later, Universal Pictures and Lorimar released The Last Starfighter, which used CGI extensively: 27 minutes of CGI footage were produced for the film, and it went on to be a box office success.
3D inbetweening and morphing became popular around the same time. In 1986, Star Trek IV: The Voyage Home included a dream sequence in which the characters travel back in time and their faces transform into one another; this was one of the first successful implementations of 3D scanning and 3D inbetweening. Two years later, Ron Howard’s Willow (1988) became the first movie to use digital morphing, in a scene where the main character, Willow, uses magic to transform a sorceress through a series of animals.
By the 1990s, 3D animation had really taken off. Numerous 3D programs were in use at the time: Wavefront was popular, but most Hollywood studios relied on Alias Research’s PowerAnimator, which was used on numerous films, including Jurassic Park, Batman Returns and Terminator 2. In 1993, the company began developing Maya, software that would go on to be used by Sony, Warner Bros., Pixar, and Walt Disney.
In the 2000s, 3D animation took another step forward as motion capture became immensely popular. In 2002, Peter Jackson shocked the world with the release of The Lord of the Rings: The Two Towers, the first film to utilize a real-time motion capture system, enabling Andy Serkis to directly control a 3D CGI character, Gollum. The same technology is now very popular in video games and is still frequently used in modern cinema. In the 2000s, most studios relied on three programs: Blender, Poser and Pointstream were commonly used by enthusiasts and professionals alike.
Improvements In The 2010s
In the 2010s, major improvements were made to CGI. In 2013, USC and Activision collaborated to create Digital Ira, a real-time digital representation of a human face, using the USC Light Stage X. Techniques that were once only available in high-end virtual cinema have since been adopted by video game developers, a trend that will likely continue well into the future.
While 3D animation has evolved significantly, it is safe to say that it is still in its infancy. There is no doubt that 3D animation technologies will continue to evolve and grow well into the future. Indeed, it will be very interesting to see what the industry comes up with next.