I started my professional art career in the game industry shortly after receiving my undergraduate degree in late 2007. My first work on a game was creating cinematic sequences for Blitz the League 2 at Midway Games. At that time we used game assets in a Maya environment to create renders that would later become full-motion video (FMV) played during particular parts of the game. The time it took to produce those renders was a particular concern, because schedules in the game industry are always tight. After Blitz I left Midway and started my second game job at High Voltage Software (HVS), where I would stay for over five years. During my first project at HVS, I made an alarming discovery: the typical pipeline they used involved little to no preproduction. I saw various issues arise from skipping this critical step, and I personally believed it was one of the core reasons many of our games seemed to fail commercially. Before my departure from HVS I had the privilege of teaming up with the Saints Row team at Volition in Champaign, Illinois. I went with a group of VFX and environment artists (like myself) and trained on Volition's engine, as we were slotted to help with VFX, lighting, and environment creation. I was pleased to find that Volition invested heavily in preproduction (about eight months' worth), and it seemed they made every effort to create a smooth pipeline for their entire staff. Fast-forward: as we started working on the game and receiving builds, I was pleased to see some of the assets I created being used, as in Blitz the League, during the cinematic sequences. What had changed in the six years since Blitz was that the cinematic sequences were no longer rendered in Maya; they were rendered in engine at the highest possible settings for lighting, shadows, geometry, and texture resolution. This prompted me to consider the applications a game engine might serve outside of its typical roles.
One of my great passions is feature animation. In animation production, preproduction is a must. It includes many disciplines such as concept design, storyboarding, animatics, and pipeline tests (and more, depending on the production house). During preproduction there is, of course, a working environment that allows the creative disciplines to share their work via source control, file referencing, and dailies. All of these methods work to connect the team and process together, but they could be better. Perhaps an environment that allows creatives to see progress coming together in one place, at one time, in real time, and closer to a final vision would enhance the preproduction process. This thesis postulates that utilizing real-time game engine technology will change the dynamics of television and feature animation preproduction pipelines by creating a holistic workflow that better represents the finished visualization before production begins. In essence, seeing a close-to-final look as early as possible will allow for less guesswork and give more direction to all creatives involved in the pipeline process.
Creating a preproduction pipeline, starting at concept design and ending in a real-time pipeline test, will help evaluate the pros and cons of using the game engine environment as a visualization tool within the animation pipeline. Discussing the shortcomings and strengths of game technology in an animation application will reveal how to improve the current tools and pipelines. Forgoing the final render in favor of a real-time render solution will also be evaluated, and the potential for game and film teams to work together toward a synergistic end will be explored.
Daniel Triplett is an artist who worked in game development for over six years and now teaches in the Computer Graphics Technology (CGT) department at Purdue University.