Animex Week – Day One

While the school had enough interest to take students to Animex 2017 for the Thursday and Friday game talks, a few other students and I decided to go to the Monday and Tuesday AVFX talks on our own. There were a number of interesting speakers on the first day and, while I might not be interested in going into animation or (possibly) VFX, I feel like I definitely learned some very interesting things.

The schedule for today was as follows:

  1. The FX of Lego Batman – Matt Estela, VR supervisor, Animal Logic
  2. Animating Ethel and Ernest – Peter Dodd, Animation director, Lupus Films
  3. Creating the Characters of Fantastic Beasts and Where to Find Them – William Gabriele, Rigging TD, Framestore
  4. The Animation of Fantastic Beasts and Where to Find Them – Colin McEvoy, Animation supervisor, Double Negative
  5. The VFX of Rogue One: A Star Wars Story – Bradley Floyd, 2D Sequence supervisor, and John Seru, Generalist lead, Industrial Light and Magic

Following these talks, there was an evening lounge meet and greet to speak with the guests from the day and the ones who would be speaking the next day, too. It was all a really impressive experience and I thought it would be a good idea to comment a little on the highlights of each of the talks.

The FX of Lego Batman

We were told about how The Lego Batman Movie wanted to be different from the original Lego Movie – Ninjago was actually already on the go right after The Lego Movie and The Lego Movie 2 was in the pipeline, but a pitch package was put together for a Lego Batman movie idea. The concept art was greenlit and the entire production was moved up to release before Ninjago, which meant Animal Logic only had two years to complete the film rather than the usual four. As a result, the script, story, and art departments all ran in parallel.

While The Lego Movie tended to look like a desktop model, where edges and ends could be seen and the world felt much smaller, the team wanted the Batman movie to feel much bigger and have a lot more detail. Matt Estela mentioned a quote that the “movie should be as colourful as the Joker, but as dark as the Dark Knight”, with the entire production just feeling a lot more atmospheric.

They used a number of references, such as Citizen Kane, the Batman animated TV series, American mansions of the Gilded Age, the ’60s Batman TV show, the Batman comics, and many others. The Lego brick database was invaluable: Lego had provided the 3D models, so Animal Logic could use Lego Digital Designer to grab a piece, drop it into the pipeline, and render out the model for the movie.

The movie used an in-house renderer called Glimpse, customised for their own use and apparently similar to Arnold. They used Glimpse as a plug-in to RenderMan; it would trace the rays and hand the results back to RenderMan to use. The software still needed a lot of work, so only a few shots in the movie were fully rendered by Glimpse, while the rest were processed using RenderMan. ShaderToy was another application mentioned, especially for the real-time fractal looks achieved in the movie.

Animating Ethel and Ernest

This sounded very much like a labour of love – while the movie took a year to make, it was eight years in the making before it got off the ground. Based on a Raymond Briggs book about his parents, this was the first full-length feature film by Lupus Films (who had made short films such as The Snowman and TV shorts like We’re Going on a Bear Hunt). It was co-produced with Studio 352 (who helped with the backgrounds) and Cloth Cat (who helped with the colour and compositing).

Raymond Briggs’ art style is very recognisable now. He uses a retractable pencil to sketch out the scenes in the book, photocopies them with a harder line, and then uses a combination of gouache and pencil for the final drawing. Lupus Films knew they wanted to keep that iconic look as much as possible, but it still had to be adapted for film and they didn’t really have the time to dedicate to that specific method.

At the time, they ran initial tests with pencil-and-paper drawing, but it would have been too costly, so drawing digitally had to be the way forward. Initially, Peter Dodd was reluctant to take that step into the digital age, but the process was refined and now, he said, he wouldn’t go back to the traditional drawing methods.

Using LazyBrush and TVPaint, they created an almost textured look to the drawings: building their own pencils to look “pencily” enough and creating character stamps on a 360° rotation to cut down on re-drawing. Animbrushes were made within TVPaint to help with the difficult strokes and model issues. It was all very educational to see how they transitioned so nicely from a more traditional artistic method to something modern while still keeping that Raymond Briggs feel.

Creating the Characters of Fantastic Beasts and Where to Find Them

Framestore worked very closely with J.K. Rowling to create the characters seen in the film, and William Gabriele mostly showed us how they came up with the various goblins and house elves in the movie. More specifically, we looked at the rigs of the goblin singer and Gnarlak.

Overall, there were 500 artists, 36 characters, 400 shots delivered, and about 47 shots delivered in one week – a lot of work! There was a lot of back and forth between the various departments so they could get the look of the characters just right within the various shots, using all the feedback to correct the look, lighting, etc. All told, there were 15 million CPU hours put into the characters’ final result… which would mean about 1,700 years on a single CPU! The total data they ended up with was 274 TB on disk. It’s absolutely mind-boggling sometimes to think about how big even just smaller parts of a whole feature film are.
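Just to check that 1,700-years figure for myself, using nothing beyond the numbers quoted in the talk, 15 million CPU hours really does work out to roughly that:

    # 15 million CPU hours converted to years (24 hours x 365 days)
    print(15e6 / (24 * 365))   # ~1712 years on a single CPU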

For the goblin singer, they used an actress to recreate the scene, as well as motion capture, and kept it all as reference footage for the building, rigging, and animation of the in-movie model. The riggers would take the motion capture, put it all through validation, and then lock in the proportions of the character for the animators to use. We were shown how they would start with a skeleton, pick out all the landmarks to emphasise or take note of, then add the layers of muscle and tendons on top, before the “skin” was added last.

With Gnarlak, they did many of the same things (including using Ron Perlman for the motion capture and stand-in scenes!) but used a normal male body structure as a starting point and then adapted it to the character.

The Animation of Fantastic Beasts and Where to Find Them

In contrast, this talk let us learn a bit more about the animation done for the movie, specifically the process of bringing “Frank” the Thunderbird to life. This felt like one of the more interesting talks of the day, just because of the animations we’d finished up in our own classes and how we could apply some of this knowledge to what we’d just done.

Of course, as always, Colin McEvoy heavily emphasised the importance of using references – it didn’t even matter that there’s no such thing as a Thunderbird in real life, since they could look at various birds of prey, horses, and even jellyfish for the movements Frank would have. I found this a very good idea to take on board: even if the animal doesn’t exist in real life, or if the source of inspiration is nothing like the animal you’re creating, it’s all about the visual language you’re trying to capture.

Curves and arcs were mentioned, and they continuously tried to look at the negative space being created by the arc of the wings and Frank’s flight pattern. Using layers to animate was a HUGE thing we all picked up on, and afterwards at the lounge we asked Colin about this more. By using layers, you keep a clean slate of your original animation and always have something to revert back to – he said they would key the start and finish, then go on fours, then twos, and eventually even on ones in a straight-ahead method. He also used silhouetting to really help him focus on the movements and the things that mattered, rather than being blinded by the model as a whole.
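To make sense of that workflow for myself, here’s a minimal sketch of blocking a refinement pass on a separate animation layer in Maya Python. This is my own illustration, not Double Negative’s actual setup: the control name “wing_ctrl” and the frame range are made up, and the same pattern would simply be repeated for the later passes on twos and ones.

    import maya.cmds as cmds

    CTRL = "wing_ctrl"      # hypothetical animation control
    START, END = 1, 49      # hypothetical shot range

    # Base pass: key only the start and the finish of the move.
    for frame in (START, END):
        cmds.setKeyframe(CTRL, attribute="rotateZ", time=frame)

    # Refinement pass on a separate anim layer, keyed on fours first,
    # leaving the base keys untouched so there is always something to revert to.
    layer = cmds.animLayer("wingRefine")
    cmds.animLayer(layer, edit=True, attribute=CTRL + ".rotateZ")
    for frame in range(START, END + 1, 4):
        cmds.setKeyframe(CTRL, attribute="rotateZ", time=frame, animLayer=layer)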

The field chart was also something interesting to learn about. It’s a grid overlay in Maya that sits over the camera view, independent of anything in the scene, and it lets you track how the character moves across the frame without needing any sort of background. Colin said it’s definitely an animator’s friend and helps you check your character is travelling exactly where you want! It’s also very good for tracking arcs… and everything should have arcs – Frank had plenty, from his beak to his head, his feet to his wings.
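As a rough, homemade way to check arcs outside the viewport (not something shown in the talk), you can step through the frame range and sample a control’s world-space position, or have Maya draw the path as a motion trail; “beak_ctrl” and the frame range below are invented names.

    import maya.cmds as cmds

    CTRL = "beak_ctrl"      # hypothetical control whose arc we want to check

    # Print the world-space position every frame so the path can be eyeballed.
    for frame in range(1, 49):
        cmds.currentTime(frame, edit=True)
        x, y, z = cmds.xform(CTRL, query=True, worldSpace=True, translation=True)
        print("frame %03d: %.2f %.2f %.2f" % (frame, x, y, z))

    # Or let Maya draw the same path in the viewport as an editable motion trail.
    cmds.snapshot(CTRL, motionTrail=True, startTime=1, endTime=48, increment=1)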

The VFX of Rogue One: A Star Wars Story

This presentation from ILM was definitely another favourite of the day and the two gentlemen were extremely interesting to talk to afterwards at the lounge meet-up. Rogue One had a long history of films to live up to, so I felt really lucky to hear from two people who worked on the movie directly. While the other ILM branches shared the work for the visual effects in the movie, the London branch was responsible for anything seen on the planet Jedha.

Ralph McQuarrie’s name came up once again and they were still using his concept art to inspire the look for Rogue One – they also used inspiration from the Middle East, Africa, Egypt, and more to make up the look of Jedha City.

Nuke, Maya, and sometimes ZBrush were used to build the city landscape, and I couldn’t believe it when they told us that only two people modelled the majority of what we see. They really tried to keep the modelling and texturing simple and as re-usable as possible, so a little more than 30 buildings, used from different angles, made up the districts, which were then added to the city with the walls built up around them. While they had a building asset library, there was also a smoke library which helped bring some life to the rooftops of the city. Anywhere there might have been a pipe or grate, they added in a small smoke effect.
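Just to illustrate the reuse idea (this has nothing to do with ILM’s actual pipeline), a handful of library buildings can go a long way once they’re instanced and turned to different angles; the building names and ranges below are invented:

    import random
    import maya.cmds as cmds

    # Hypothetical library meshes already sitting in the scene.
    BUILDINGS = ["building_%02d" % i for i in range(1, 6)]

    for _ in range(30):
        src = random.choice(BUILDINGS)
        copy = cmds.instance(src)[0]    # instance rather than duplicate geometry
        cmds.move(random.uniform(-100, 100), 0, random.uniform(-100, 100), copy)
        cmds.rotate(0, random.choice([0, 90, 180, 270]), 0, copy)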

For the final destruction, they had to research nuclear explosions to see what approaching volumetric clouds would be like, how the ground would be affected, what the heat and exposure would be like, etc. The effects department was responsible for procedurally shattering the raw renders given to them from Arnold, as well as making up a number of FX sims for the smaller explosions, debris, and dust.
