1. Environment creation simplified
a. Problem Statement
b. Thesis Statement
2. Teach Thyself: Teaching others to teach themselves.
a. Digital Tutors
b. YouTube
3. Software Choices
a. Substance Designer
b. Unreal 4
c. Bitmap 2 Material
a. Goals of reference collecting
b. Early days of Architecture
c. Form follows function
d. Form follows fiction
4. Minimalistic Environments Training
a. Creating Simple Layouts (designing the scene)
b. Creating Simple UV layouts
c. Creating Simple Materials
d. Creating Simple Textures
e. Borrowing from the web
f. Lighting the Scene
g. Post Processing the Scene
h. Utilizing technology for rapid results (video tutorial creation and when to watch)
5. Working as a Team
a. Brainstorming (for ownership)
b. Working collaboratively, while still remaining an individual
c. Sharing and Version Control
d. Sharing knowledge encouraged
6. Presentation of the work
a. Camera Composition
b. Defining Contribution
c. Response to Feedback
d. Areas of Improvement
a. Examining results
b. How do the scenes of the non-artists look compared to their real-world counterparts?
c. What is the next step?
d. Reiteration: have the thesis goals been satisfied?
“Many believe that the holy grail of previs lies in the use of videogame production technology, especially that of the game engine, the core of a game that allows the movement and manipulation of modeled people and creatures within 3D sets, together with lighting and camera moves, a process essentially the same as what previs supervisors do before a major movie starts production.”
“The benefit of being able to use a game engine is obvious: instead of the complex scripting usually necessary to move characters around a virtual set to act out a proposed scene, previs teams could use game controls such as joysticks to move characters around, simply and in realtime.”
“A second reason to take the advent of gaming technology for previs seriously is because many experts are pointing in this direction. Game engines will become a critical part of previs, according to Scott Ross, the CEO of Digital Domain, who sees future filmmakers sitting at gaming consoles, making choices for characters and scenes and lighting and movements.”
“Digital Domain used a videogame (a flight simulator) and gaming interface for the movie Stealth, which involves a lot of aerial combat scenes similar to the movie Top Gun. Digital Domain was able to create footage of proposed flight scenes in Stealth for director Rob Cohen this way in previs, before the scenes were committed to and rendered in full resolution.”
“Peristere (Loni Peristere, co-founder of Zoic Studios and VFX supervisor of both films (Serenity, Zathura, Pathfinder) and TV shows (CSI, Angel, Firefly, Buffy the Vampire Slayer, Battlestar Galactica)) believes that use of game engines such as the Unreal engine (which at present seems to be the engine of choice for previs) will be part of a growing trend of moving previs more and more into the hands of the director.”
“A third reason is that games and movies are becoming closer to each other, in many ways. Movies are increasingly loaded with game-like effects. Games are increasingly based on movies.”
“Why not create the two (Games and Film) at the same time, with some of the same tools?”
“ILM is developing previs with the LucasArts game engine… ‘What we’re saying is, let’s make this like photography; do it in realtime,’ suggests Lucasfilm CTO Cliff Plumer. ‘This is something we’ve been developing in conjunction with LucasArts to hand the previs to the director. It’s almost like a game. The director can plan how to shoot a live-action or block a CG scene. Contained in the application are libraries of lenses and so forth. But we can also record the camera moves, create basic animations and block in camera angles. And instead of handing rendered animatics to the CG pipeline, we have actual files: camera files, scene layout files, actual assets that can feed into the pipeline. It gives the crew input into what the director is thinking.’”
“A recent development being used for previs is machinima, the use of game engines to create story sequences.”
Harz, Christopher. “The Holy Grail of Previs: Gaming Technology.” AWN.com, 31 Jan. 2006. Web. 18 Feb. 2015. <http://www.awn.com/vfxworld/holy-grail-previs-gaming-technology>
On Dawn of the Planet of the Apes… “They were able in realtime to make tweaks to adjust the lighting and the shadows, and change things. It gave them a way to create concept art in realtime. It meant there was a portal for everyone to communicate through. I hope that’s a way we can work in pre-production from now on” (John Griffith).
“Griffith says in some ways these Cinebox projects are more like ‘advanced previsualizations.’ ‘It made sense that the previs pipeline I’d established was no different than a final VFX pipeline. It’s allowing you to light, render, create an atmospheric world in a shorter period of time than it would take to do it in a standard way.’”
Griffith says his workflow still uses Maya, ZBrush and other 3D tools on the front end of the pipeline to create assets, with everything then mirrored in Cinebox. “I can then set up a pipeline with Maya animators and they can animate just like they normally would for any production,” explains Griffith, “with the exception that their scenes are much lighter and much faster than normal, because they don’t have to deal with the final assets. So those two worlds are mirrored, and all the environments, props and everything are created in the engine. It’s also an amazing renderer: it renders in realtime and rivals a lot of final render quality work.”
“The benefits of this approach, says Griffith, is the ability to sit with a director and make fast changes and, if necessary, plug into a VFX pipeline. “My previs pipeline was very Maya-centric so all my shots were created in Maya and mirrored in Cinebox,” notes Griffith. “So the engine in previs is a renderer and a viewer - in realtime. Those shots still exist in Maya, so I can provide the exact same shot in Maya to the production if they need it for technical purposes.”
(John Griffith) My goal has always been to merge concept art and previs into one discipline, so that we can help design the look of the characters and environments and props in previs, while we’re also designing the story and the action.”
Failes, Ian. “Crytek Cinebox – An Update.” FXGuide.com, 02 July 2014. Web. 19 Feb. 2015. <http://www.fxguide.com/featured/cryteks-cinebox-an-update/?ua=ipad>
This schedule takes into account the workflow diagram found under the visual component section of this blog. It is a living document, subject to the needs and incidents that may arise during the next two years.
Real-time rendering in game engine technology is breaking the bounds of its interactive game development roots and emerging into areas like product and architectural visualization. Continuing this trend of stretching the application of real-time environments, it is possible to create an interactive nucleus for lighting within feature animation pre-production using current commercial game engines. Numerous artists are involved in the pre-visualization stage of animation production known as “pre-production,” and while there have always been models for sharing progress made by the team, a current game engine would allow an interactive environment where assets can be catalogued, viewed, arranged, and lit in real time. Sitting alongside a lighting artist, a director will be able to see changes come together interactively as they propose their vision.
By creating pieces of the pre-production cycle, the relevance of real-time game technology for lighting pre-production will be examined more closely. Concept art of environments and character development will precede the creation of 3D assets. Once modeled and textured, the scenes will be lit with a real-time render engine to test the validity of real-time integration. By forgoing the restrictions and overhead of gameplay, the integration of characters, environments, and lighting will allow visuals to be pushed to their limits. Pipelines that do away with the common restrictions faced by game artists will also be integrated and evaluated. While the time needed to implement the assets in the engine will add to production effort, the payoff of the renderer’s instantaneous feedback can be expected to boost the final visual quality of pre-production. The game engine’s holistic environment hub, where all production personnel can access and view the lighting team’s efforts, will make for better-informed and more unified individuals. The near-final look of the assets will allow directors to feel comfortable in making a decisive approval.
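As a minimal sketch of the interactive hub described above — assuming hypothetical names throughout, since this is illustrative structure rather than any actual engine API — the cataloguing and real-time lighting tweak loop might look like:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: class and method names are illustrative,
# not part of any real game engine's API.

@dataclass
class Light:
    """A scene light whose parameters a director can tweak interactively."""
    name: str
    intensity: float = 1.0               # arbitrary engine units
    color: tuple = (1.0, 1.0, 1.0)       # linear RGB

@dataclass
class SceneHub:
    """Catalogue of assets and lights, shared by the pre-production team."""
    assets: dict = field(default_factory=dict)
    lights: dict = field(default_factory=dict)

    def catalogue(self, asset_id: str, path: str) -> None:
        # Register an asset so any team member can view and arrange it.
        self.assets[asset_id] = path

    def add_light(self, light: Light) -> None:
        self.lights[light.name] = light

    def tweak(self, light_name: str, **params) -> Light:
        # Apply a director's change; in-engine this would re-render in real time.
        light = self.lights[light_name]
        for key, value in params.items():
            setattr(light, key, value)
        return light

hub = SceneHub()
hub.catalogue("bridge_set", "assets/env/bridge.fbx")
hub.add_light(Light("key", intensity=5.0))
tweaked = hub.tweak("key", intensity=2.5, color=(1.0, 0.8, 0.6))
print(tweaked.intensity)  # 2.5
```

The point of the sketch is the shape of the workflow: assets and lights live in one shared catalogue, and a parameter change is a single cheap operation whose result the director sees immediately, rather than a re-render queued through a traditional offline pipeline.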
Daniel Triplett is an artist who worked in game development for over six years and now teaches in the Computer Graphics Technology (CGT) department at Purdue University.