A production artist opens the editor, and then opens the scene she is working on. Before getting started she checks whether there have been any updates to the storyboards since she last worked, by accessing the repository and sorting by date. She opens a list of production items to look through; from that list she can see Story, Character, Art Direction, Storyboard, Vocal Tracks, Sound Effects and Music, and 3D Animatic, all of which, in their latest incarnation, may be accessed right there within the editor. The artist is working on animation and looks at an update to the animatic for direction. The layout of the environment art is already loaded; some pieces are gray-boxed, while others are finished. The animator looks at the scene to get a grasp of the lighting laid out by the lighting director. She then opens the Maya scene file where the animations live; the same gray-boxed assets from the environment art in the engine are referenced into the Maya scene. The animator begins work on the animations for that scene. After she finishes a bit of work she exports the animation to the engine and plugs it into the scene. She takes note of all of the elements coming together: lighting, environment art, sound design, and of course her animations. She pulls up the animatic again to see if her timing matches. It does, and she feels the work is ready for review, so she checks in her assets.
Shortly after she checks in her work, the sound designer updates his build and sees that the new animations for the sounds he is working on are in. He quickly exports his sounds, opens the editor, and drops the sounds where he wants them. He hits play and hears how the sounds work with the new animations, all in real time.
The animator shoots an email to the lighting director letting him know that the scene he was asking about is checked in. He updates his source control and is ready to begin lighting the scene in real time. His lighting setups happen before his eyes, and his previs for this scene has been set up in record time.
Character Previs Vision
While working in ZBrush, the character artists have already exported lower-poly versions of their characters to be auto-rigged for the upcoming real-time final test before production begins. As preproduction proceeds, the character artists get closer to the final review of their finished models. An artist exports a medium/high-poly character out of ZBrush and plugs it into the real-time visualization tool, in this case Marmoset Toolbag, where the file can be handed off to the lighting director. The lighting director lights the characters in little time, with instant feedback for the director to look at. The same process happens with environment and prop art. Using DirectX 11, the models even show displacement, further improving turnaround times. The director is pleased that the speed and quality of preproduction have improved simultaneously.
The team of artists, producers, and the director sit down to watch a preproduction scene for review. As they examine the scene, the director and producers marvel at the near-production quality preproduction has taken on; the characters are normal-mapped, fully colored, and lit within the scene, yet no extra time has been added to the preproduction cycle, so the current budget remains intact.
PART 1: Animation Preproduction
Goal: to create a short animation, produced in and exported to a game engine, to establish the viability of using a commercial game engine for preproduction. The hope is to create a holistic environment where preproduction and visualization are enhanced by the use of real-time technology.
1. Character Models
a. Duncan- mostly modeled
2. Characters Rigged
3. Environment Modeling
a. Blueberry Prop-Finished
b. Corn stalks- finished, upres for final render
c. Trees- Deciduous, and Conifers
d. Bushes-various bushes (at least 3)
e. Rocks-1 complete, need other variation
f. Mushrooms-3 varieties of shapes
g. Hatches Fence
h. Blueberry bushes
i. Blueberry Pie
j. Mound of dirt
k. Dirt hole
l. Matte paintings-sky, tree lines
m. Stream-water, stream bottom
n. Sky dome
o. Various flowers
4. Effects
a. Dirt clouds-from running
b. Grass particles
c. Air particulates
d. Blueberry bush leaves
e. Sparkle off BB
5. Animations
a. Duncan running-quadruped
b. Duncan Jumping over stream
c. Duncan opening latch and door
d. Walking up to Blueberry bushes
e. Jumping into Blueberry bush
f. Lying down, looking up, and picking blueberries
g. Sitting up
h. Looking over
i. Noticing BB
j. Walking around bush
k. Exclamation over Huge BB
l. Thinking of BB Pie
m. Pushing BB Pie towards Home
6. Storyboards and Animatic
a. Creation of boards
b. Creation of Animatic
c. Animatic as UI option-Animatic Viewer
7. Reference Art
a. Collect and/or create finished Character Sheet for Duncan and Garret
b. Collect photographic references
c. Implement references as a UI option- reference viewer
8. Sound library
a. Feet pitter-patter
b. Grass crunching
c. Bush crunching
d. Old door opening (creak)
e. Bush rattle
f. Grunt struggle pushing BB
g. Water flow
h. Birds chirping
Testing will need to be done to see the capabilities, struggles, and overall viability of game engine technology in this area.
PART 2: Visualization
1. Character Models
a. Duncan- mostly modeled
b. Garret- in progress
c. Picnic- in progress
2. Characters Auto-Rigged
3. Environment Models and props
a. Duncan in example environment-In Blueberry patch sitting on BB, next to fence and bushes
b. Garret in example environment-Playing violin, bottle cap on a wood floor, lightning bugs, lamp
c. Picnic in example environment-nailing an Ollie off a stump, mushrooms off the side of stump
ANZOVIN RIG TOOLS: $299.00 – professional auto-rigging tool for Maya; exports to game engines.
Unreal Engine 4: $19/month
Marmoset Toolbag: $129
Michael Sporn Animation -VFX of Fantasia
James Bodrero- artist interview, concept designer of Pastoral Symphony
Game Engine Tech for previs:
Crytek’s Cinebox – an update
The Holy Grail of Previs: Gaming Technology
How previs helped make Maze Runner
Open Subdiv 3
Manuka: Weta Digital’s new Renderer
Marino, Paul. The Art of Machinima. Scottsdale, AZ: Paraglyph Press, 2004. Print.
Cantor, Jeremy, and Pepe Valencia. Inspired 3D Short Film Production. Boston, MA: Thomson Course Technology, 2004. Print.
Thomas, Frank, and Ollie Johnston. The Illusion of Life: Disney Animation. New York, NY: Walt Disney Productions, 1995. Print.
Osipa, Jason. Stop Staring: Facial Modeling and Animation Done Right. Alameda, CA: Sybex, 2003. Print.
Williams, Richard. The Animator's Survival Kit. New York: Faber, 2001. Print.
Fantasia was a huge influence on me growing up. In fact, while I sat in the theater watching Fantasia for the first time, at 9 or 10 years of age, I made the decision that I wanted to work in animation. My favorite scene in Fantasia back then was The Pastoral Symphony. Above are cupids from Fantasia's Pastoral Symphony; below is a render of a character I am developing, Duncan. As I looked at Fantasia I saw some influences coming through in the body shapes of Duncan and the cupids. More important than the artistic design of a character will be matching the playfulness and the sense of discovery found in The Pastoral Symphony.
I have always enjoyed Norman Rockwell's work, as much for the presentation of nostalgia as for the beautiful renderings. Here one of my characters, Garret Longhopper, is posed next to this famous Rockwell rendering of Abraham Lincoln. The relationship between these two is multi-threaded. When I designed Garret I was thinking of an older, wise grasshopper, but with some mannerisms of Walt Disney's Goofy. I wanted Garret to look "nostalgic," and I was picturing a Norman Rockwell feel for him, with an Abe Lincoln type of build (the long chin was an homage to Lincoln's beard). After I had already designed Garret, I found this Norman Rockwell Lincoln; it pictured Lincoln in rural Midwest farmland, doing farm work, similar to the farmer backstory I gave Garret. Even Lincoln's wrinkles and clothes reminded me of Garret's. I felt like I really accomplished what I was going for.
All good things start with a great story. I take inspiration from "The Gruffalo," a great children's story that was brought to life on the TV screen. Like Axel Scheffler, the Gruffalo's illustrator, I am illustrating a children's book, and I am also a 3D artist. When I came upon "The Gruffalo" while looking at 3D art, I had no idea it was first a book. I grabbed screenshots of the movie because I felt the 3D artists achieved the storybook style of rendering and modeling I am looking to achieve when I bring my illustrations to life.
By evaluating the integration of game engine environments as a lighting preproduction tool, it is possible for real time visualization to create an interactive workflow for feature animation and television production.
I started my professional art career in the game industry shortly after receiving my undergraduate degree in late 2007. My first work on a game was on cinematic sequences for Blitz the League 2 at Midway Games. At that time we used game assets in a Maya environment to create the renders that would later become full-motion videos (FMVs) played during particular parts of the game. The time it took to do renders was a particular concern, because in the game industry the schedules are always tight. After Blitz I left Midway and started my second game job at High Voltage Software (HVS), where I would stay for over five years. During my first project at HVS I made an alarming discovery: the typical pipeline they used involved little to no preproduction! I saw various issues arise from skipping this critical step, and I personally believe it was one of the core reasons many of our games failed commercially. Before my departure from HVS I had the privilege of teaming up with the Saints Row team in Champaign, Illinois. I went with a group of VFX and environment artists (like myself) and trained on Volition's engine, as we were slotted to help with VFX, lighting, and environment creation. I was pleased to find that Volition really invested in preproduction (about eight months' worth), and it seemed they made their best effort to create a smooth pipeline for the entire staff. Flash forward: we started working on the game, and as builds came in I was pleased to see that some of the assets I created were, as in Blitz the League, being used during the cinematic sequences. What had changed in the six years since Blitz was that now, instead of cinematic sequences rendered in Maya, they were rendered in engine at the highest possible settings for lighting, shadows, geometry, and texture resolution. This prompted me to consider the applications a game engine might serve outside its typical roles.
One of my great passions is feature animation. Preproduction in animation production is a must. Preproduction includes many disciplines, such as concept design, storyboarding, animatics, and pipeline tests (and more, depending on the production house). During preproduction there is, of course, a working environment that allows the creative disciplines to share their work via source control, referencing of files, and dailies. All of these methods work to connect the team and process, but it could be better. Perhaps an environment that allows the creatives to see progress coming together in one place, at one time, in real time, and closer to a final vision would enhance the preproduction process. This thesis postulates that utilizing real-time game engine technology will change the dynamics of television and feature animation preproduction pipelines by creating a holistic workflow that better represents the finished visualization before production begins. In essence, seeing a close-to-final look as early as possible will allow for less guesswork and give more direction to all creatives involved in the pipeline.
Creating a preproduction pipeline, starting at concept design and ending in a real-time pipeline test, will help evaluate the pros and cons of using the game engine environment as a visualization tool within the animation pipeline. Discussing the shortcomings and strengths of game technology in an animation application will uncover how to improve the current tools and pipelines. Forgoing the final render in favor of a real-time render solution will also be evaluated, and the possibility of game and film teams working together toward a synergistic end will be explored.
I have always been interested in art; as far back as I can remember I would draw alongside comics and storybooks. As I matured as an artist I realized how important it was to study the works of the artists and craftsmen of our history and learn from them. I live and work in an art-rich environment here in Chicagoland; The Art Institute of Chicago, The Museum of Science and Industry, The Field Museum, and other institutions such as these have always inspired me. In light of my interest in institutions such as those mentioned, I have realized that access to such places is limited by many factors: money, proximity, and some people's lack of mobility are just a few that might keep one from taking part in the rich experience these places have to offer. How do we allow those who face such limitations to experience the enriching potential of our history? And how does a museum, a place of learning and reflection, expand beyond its physical borders to grow with time? The advent of virtual reality in a feasible and tangible form (i.e., Oculus Rift and Project Morpheus) has the potential to bring immersion into these great halls of history without the aforementioned limitations. Creating a virtual interactive museum environment with photoreal visuals is not outside our reach. The beauty of such an endeavor is that it never need be limited to walls, nor does it need to replace the current infrastructure; rather, it could exist alongside, in concert with, the current establishment.
Topic 1 Method
A Virtual Museum would begin with a central hub, a foyer if you will. Within the foyer, a site map would allow the visitor to see the topics they can visit within the facility. Just like in a real museum, the visitor would choose one of the topics and advance to that area. Through the halls, accurate representations of historic objects would be placed, each with a description as well as information on the artist who created the piece. When a visitor reaches a particular area of study, they would enter another level, and that study could be much like a traditional museum, but it would not have to be; for instance, if a student wanted to learn about medieval castles, the level loaded could be an actual representation of a castle, with interactive descriptions of its construction. Much of the building of a level like this castle would be appropriate work for an environment artist. The museum hub and the different subject packages could be downloaded from the web and launched from any computer.
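As a rough sketch of the structure described above, the hub could be modeled as a site map of downloadable subject packages. Everything below (class names, fields, the example URL) is a hypothetical illustration of the idea, not any real engine's API:

```python
# Hypothetical model of the museum hub: a foyer whose site map points at
# downloadable "subject packages" (levels). All names here are assumptions.
from dataclasses import dataclass, field

@dataclass
class Exhibit:
    title: str
    artist: str
    description: str          # interactive description shown to the visitor

@dataclass
class SubjectLevel:
    name: str                 # e.g. "Medieval Castles"
    package_url: str          # where the downloadable package would live
    exhibits: list = field(default_factory=list)

class MuseumHub:
    """The central foyer: holds the site map of visitable topics."""
    def __init__(self):
        self.site_map = {}

    def register(self, level: SubjectLevel):
        self.site_map[level.name] = level

    def enter(self, topic: str) -> SubjectLevel:
        # In a real build this would trigger a level load in the engine.
        return self.site_map[topic]

hub = MuseumHub()
castle = SubjectLevel("Medieval Castles", "https://example.org/castles.pak")
castle.exhibits.append(Exhibit("Portcullis", "Unknown smith",
                               "Interactive description of its construction"))
hub.register(castle)
print(hub.enter("Medieval Castles").exhibits[0].title)  # Portcullis
```

The point of the sketch is the separation it suggests: the hub stays small and generic, while each subject package carries its own art and interactivity and can be added without touching the rest.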
Many of the challenges of creating such an infrastructure remain to be discovered. Embarking on the journey of digitizing the halls of history and adding interactive elements to them is a noble endeavor, one with the potential to offer a new way of visiting them.
Living in Chicago has its ups and downs. The culture here is great, but the weather, at times, can be rough.
As I write this, my son is in the other room playing at home because school was called off today due to the extreme cold (-5 is the low today). The old adage "cabin fever" is very applicable after spending a harsh winter here. To combat cabin fever I try to get adequate exercise; without it I feel stagnant and less alive. I have a treadmill, but in all honesty, exercise could be more interesting. I have seen people try to make repetitive workouts on stationary equipment more interesting through virtual apps; some of those can be seen here: Virtual apps. After reviewing the options on the current market, I see a need for something more immersive; something with the potential to really let one forgo the immediate environment and become lost in the scenery. Virtual reality has the potential to offer a greater level of immersion than ever before. Beyond just a VR environment, creating a VR game that networks people together to allow some competition would without a doubt be a motivator for those of us stuck indoors on the treadmill, bike, or elliptical. Within the parameters of a three-month project, I believe that a virtual environment created to be walked or run through on a treadmill while wearing the Oculus Rift, delivering a more immersive experience, is an attainable goal. I would like to test the capability of VR to help people escape the monotony and physical anguish of static exercise machines, as well as inquire into its potential to motivate people to exercise more.
Topic 2 Method
Creating a prototype level that allows the user to input a speed they wish to travel is attainable. The camera would be set on a predetermined path, and after setting the speed a player would simply put on the VR headset and walk or run. Starting with this simple prototype would be an adequate way to test the hoped-for potential (total immersion), and it would allow us to understand the roadblocks, like disorientation and motion sickness, that would need to be addressed before this idea could be feasible for the general population. Using test subjects of different age groups, genders, and various health levels, I will gather research and feedback.
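The prototype described above reduces to one piece of math: given a user-chosen speed, where along the predetermined path is the camera after a given number of seconds? A minimal, engine-agnostic sketch (the waypoints and units are illustrative assumptions):

```python
# Move a camera along a fixed waypoint path at a user-chosen speed.
# Waypoints are 2D here for simplicity; an engine would use 3D plus height.
import math

def camera_position(waypoints, speed, elapsed):
    """Return the camera's (x, y) after `elapsed` seconds at `speed` m/s."""
    distance = speed * elapsed
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if distance <= seg:
            t = distance / seg          # fraction of this segment covered
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        distance -= seg
    return waypoints[-1]                # path finished: hold at the end

path = [(0, 0), (10, 0), (10, 5)]       # a simple L-shaped trail
print(camera_position(path, speed=1.5, elapsed=4))   # (6.0, 0.0)
```

In practice the per-frame call would use the time since the last frame, and the fixed speed could later be replaced by a live treadmill reading, but the interpolation stays the same.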
I have long been interested in feature animation, and I have worked on a number of cinematics for titles on the market: Blitz the League 2 and Saints Row 4. In previous iterations we used Maya for rendering, and full-motion video was added to the build of the game via Bink Video. Recently the potential of next-gen game technology has impressed the general population, but I believe that in days to come it will be possible to use game engine technology in television, and perhaps eventually in feature film. Taking the current technology and putting it to the test, to see if the quality bar of game engines is now at a level that could match or surpass the look of the typical cable-TV 3D cartoon (e.g., on Nickelodeon), would serve as a proving ground for a new application of game technology. If the real-time game engine could be a viable alternative to the classic rendering we are used to, it could deliver "what you see is what you get" instant feedback and dramatically speed up the animation pipeline.
Topic 3 Method
To test the theory, I would create a small animatic, and then create a character and an environment at film quality to push the boundaries of what a game engine can handle. Some of the questions to be addressed come in areas like rigging limitations, post-processing pipelines, visual effects, camera controls, resolution limits, texture resolution limits, real-time lighting limitations, rendering in layers, and of course poly counts. The specialized tools needed to smoothly bridge games and film would be researched and discussed. Upon completion, the final animation would be recorded for comparison against current market examples. A production cost analysis must be performed to determine whether creating on a real-time engine is fiscally responsible. Bringing the two disciplines, game and animation, together may carry with it discussions of transferable skills and of greater freedom in a challenging job market. A look at emerging, unreleased technology should cap off the study and point to the future of real-time technology.
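The cost question above can be framed with simple arithmetic: a real-time engine must deliver each frame within a fixed per-frame budget, while an offline renderer can take minutes per frame. A quick sketch of that comparison (the two-minutes-per-frame offline figure is an illustrative assumption, not a measured number):

```python
# Back-of-envelope comparison: real-time frame budget vs. offline render time.
def frame_budget_ms(fps):
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

def render_hours(frames, seconds_per_frame):
    """Total machine time to render a shot offline, in hours."""
    return frames * seconds_per_frame / 3600.0

shot_frames = 24 * 60                      # a one-minute shot at 24 fps
print(round(frame_budget_ms(24), 1))       # 41.7 ms per frame in real time
print(render_hours(shot_frames, 120))      # 48.0 hours at 2 min/frame offline
```

The gap between roughly 42 milliseconds and two minutes per frame is exactly where the "what you see is what you get" feedback loop comes from, and it is what the cost analysis would quantify with real production numbers.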
So that is not really a pun ("Let the Games Begin"), because I am currently pursuing an MFA in Interactive Media and Game Development through the Savannah College of Art and Design. I am very excited about the first class, Studio 1, which focuses on developing a thesis. I have a few thesis ideas floating around in my thoughts, and it is time to throw them onto paper (or pixels) and start to see which ones hold the most value.
Well, I'll save you, the reader, the long story about how I've been playing games my whole life and how, when I was a kid, I always dreamed of making games... blah blah blah; pretty much every kid who got into games has the same story. Where my story became different was around 2001: I met a pretty nurse at the doctor's office where I facilitated the lab and the blood draws (yes, I am a vampire, professionally). My now-wife and I started dating, and in my free time I would do sketches for her. I had been drawing for years, but I had lost sight of art as a career. Sarah, who loved Disney films (we still have all her VHS tapes), encouraged me to take my skills and go back to school for an art degree. I enrolled and got interested in the game program that was just starting up at the Ai in Schaumburg, IL. I did very well in school; I won Artimation 7, an industry-judged art contest, for best level design. The next year my team and I won Best Team Game, and then in 2010 we also won Best Team Game of the Decade for our work in Artimation 8. Needless to say, things were looking great, and I was looking forward to graduating and getting out there.
I graduated in the summer of 2007, and through a contact made at our portfolio show I was given an art test for Midway Games. I interviewed and was hired as an art intern! I worked on Blitz the League 2, mostly on modeling and texture work, but I also had my hands in lighting and in rendering some of the team captain intros. I stayed at Midway for a year, and though I wanted to work there full time, the company's days were numbered, and I was let go shortly before it went under. I used my contacts to get an interview at High Voltage Software, and I was hired full time! I spent over 5 years at HVS, and I will always remember the great people I worked with, who are still my friends. I chose to leave HVS in order to get a Master's degree and to plan the next step in my career. To see some of the work I did during my 6+ years working in games, you can check out the portfolio page: Portfolio
Beyond my aspirations to create innovative and beautiful art, I also study photography, and have professionally shot a few weddings and engagements; here are examples of my photography work. I plan to continue to hone my photography skills as a hobby. I have a great interest in martial arts; I am one of the coaches for my son's wrestling team, and have practiced Brazilian Jiu-Jitsu, boxing, and kickboxing for some years now. One of my goals has been to earn a black belt in Brazilian Jiu-Jitsu. I love feature animation; in fact, I would say it is my favorite medium right now, and I have always dreamed of working at Pixar, Disney, or the like, so this is something I would like to pursue.
Interests in Interactive Design and Game Development
I started out playing games, as I mentioned before; that is what led me to consider this field of study when I enrolled in school. I think one of the most interesting things about games is the immersion I can achieve while playing them, especially when competing. Though I find myself drawn to film as my main pastime these days, I still enjoy the artificial confines and rules of the "game." I will fully admit that I consider myself an artist above all other career titles; whether "game" is tacked on before "art" or not, I am still a student (and teacher) of art. Game art is unique and intriguing to me because it is efficient, and the end user gets some control over what you create; I am Geppetto in a sense, and you are the puppet master controlling my creation. This is a unique relationship, and it is all in the name of fun; how cool is that? Inspiration is a great topic to bring into the subject of interest in interactive design and game development; I was, and still am, inspired by games, and now I hope to be a part of inspiring others. The list could go on and on, but suffice it to say that these two topics, interactive art and inspiration, top my reasons for being here pursuing this field.
Daniel Triplett is an artist who worked in game development for over 6 years and now teaches in the Computer Graphics Technology (CGT) department at Purdue University.