When a devastating attack shatters Mark Hogancamp (Steve Carell) and wipes away all his memories, no one expects him to recover. Piecing together fragments of his old and new lives, Mark meticulously creates a wondrous town where he can heal and be heroic. As he builds an astonishing art installation, a testament to the most powerful women he knows, he draws from his fantasy world the strength to triumph in the real one. A bold, wondrous and timely film, Welcome to Marwen shows that when your only weapon is your imagination, you’ll find courage in the most unexpected place. When it came to the production data workflow for this project, however, DIT Chris Bolton left nothing to the imagination. Working with an incredible team from production through post, Chris outlined his approach to the show and its overall workflow.
What was it like working with such a technical director as Robert Zemeckis? Was DoP C. Kim Miles just as technical, or did you have to help out a lot?
When I learned I was awarded Welcome to Marwen under the direction of Robert Zemeckis and the keen eye of C. Kim Miles, I knew I had to step up my work to another level as Robert expects the very best from his crew.
Even though this was not an action film with difficult locations and a vast array of cameras, like some of my previous projects, Welcome to Marwen took place in two worlds: one based in reality and one in fantasy, so it was important to know which world we were shooting in. Robert lived and breathed this project, and being such an incredible storyteller and visual thinker, he knew what was technically possible with the motion picture gear and expected his crew to be masterful in each of their professions.
C. Kim Miles is an incredibly technical Director of Photography when it comes to lighting and the aesthetics of good photography. Kim knew how to plan ahead and hire the right people to complement his strengths, so he didn't have to spend time on small details that would take him away from the storytelling process. Though this was my first time working with him, his reputation preceded him: beyond his masterful eye, I heard he is also an amazing leader, and I knew I had to work with him. With my broad range of experience in camera, photography and post workflows, along with the amazing camera team led by A-camera 1st AC Douglas Lavender, Kim knew the story was in good hands.
How many A65 cameras were used on this show?
Our main unit utilised two Alexa 65s, with a third body for splinter and 2nd units. The last four weeks of production were on a motion capture stage, where we carried an additional body for the mo-cap scenes. Robert Zemeckis and C. Kim Miles prefer to shoot with a minimal number of cameras, without limiting camera movement, allowing for the best possible lighting and composition in each scene. Shooting with too many cameras can compromise the photography.
Were you managing colour on set as well as data? Or did you have a utility assist for DIT work and Data was handled by someone else?
Shooting with the Alexa 65 creates massive RAW files, which made it essential to hire a digital loader to take care of backing up the camera original footage and to create a detailed database of metadata and scene information for every shot. The Codex Vault-XL offload system lived on a cart on the camera truck for all locations, which proved a smart plan. Our camera team was very fortunate to have Gaelle Jego behind our precious footage. She was essentially our camera department's best boy: along with taking care of the data, she handled ordering daily gear, kept all of the department's scheduling and paperwork sorted, and found extra crew for additional camera days and splinter units.
Having Gaelle on our crew meant I could focus on exposure, focus, colour and all the details that help make a great image. We decided to use a standard grading LUT and CDL workflow on this project, as we didn't see a big advantage in trying to roll out ACES on set. The entire movie was to be shot on just one type of camera, and even though ACES has its advantages in post, ARRI's Log C curve and large colour gamut are so good that you really don't see a massive difference when linearising and viewing in Rec 709. If we had been viewing in HDR, then I could see ACES being a big benefit.
For applying the colour changes to the live SDI feed we used Pomfort LiveGrade along with Flanders Scientific BoxIO hardware. The software allowed us not only to generate and apply the LUTs and CDLs, but also to create non-destructive reference grabs thanks to the BoxIOs. This let me regrade retroactively if the look altered at all during the scene; being able to go back and refresh a still with new look parameters was very useful. The library compare feature was also very handy when shooting scenes out of order, or picking up portions of a scene at a later time.
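For readers unfamiliar with CDLs: the ASC CDL that tools like LiveGrade exchange is just ten numbers per grade: slope, offset and power for each channel, plus saturation. As a rough illustrative sketch of the per-pixel math (the parameter values below are invented for demonstration, not from the show):

```python
# Illustrative sketch of the ASC CDL transfer function (SOP + saturation).
# The grade values used here are made up, not from the production.
def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """Apply ASC CDL slope/offset/power per channel, then saturation."""
    # Per channel: out = (in * slope + offset) ** power, clamped before power.
    out = [max(c * s + o, 0.0) ** p for c, s, o, p in zip(rgb, slope, offset, power)]
    # Saturation blends each channel towards Rec 709 luma.
    luma = 0.2126 * out[0] + 0.7152 * out[1] + 0.0722 * out[2]
    return [luma + saturation * (c - luma) for c in out]

graded = apply_cdl([0.5, 0.4, 0.3],
                   slope=[1.1, 1.0, 0.95],
                   offset=[0.01, 0.0, -0.01],
                   power=[1.0, 1.0, 1.05])
```

Because the grade is just these parameters, a CDL travels easily from set to dailies to the final DI without baking anything into the image.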
Were you cloning and processing direct to Transfer Drives and shipping these to near-set or local post in Vancouver?
Gaelle cloned and processed the Alexa 65 footage from the 2TB media to 8TB SSD Transfer Drives using the Codex Vault-XL parked on the camera truck. The Transfer Drive was then shipped to our production office, where Technicolor had set up a dailies system. They used another Codex Vault-XL for ingesting the data from the Codex Transfer Drive to a RAID array and LTO, attached by 40Gb fibre.
This process of backing up the footage and generating the LTO tapes typically took about 36 hours, so the production had to carry enough camera media to last us two full shooting days, as the Codex Capture Drive media acted as our on-set backup.
This was a slightly different workflow from what I had become used to. Normally I would have an on-set RAID to hold a duplicate of the footage for a couple of days, but it just wasn't financially viable to build or rent a system that could live on the truck and do this in a timely manner.
THE CODEX VAULT XL OFFLOAD SYSTEM WAS ON A CART ON THE CAMERA TRUCK FOR ALL LOCATIONS AND WAS A SMART PLAN
How much data did you generate on a given day?
Our footage volume varied a fair amount. Some days we would shoot as little as 6TB and on other days it would go as high as 24TB. With the Alexa 65 this is not totally unusual.
Thankfully Robert Zemeckis doesn't like rolling resets, which helped keep down the amount of data that was shot. We call this the “old school approach”, where the director saves the film in the camera whenever possible. This, in my opinion, shows a lot of respect for the craft and the production team.
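To put those daily totals in context, here is a back-of-envelope estimate of the Alexa 65's appetite for storage. The resolution and bit depth below are approximate assumptions for a roughly 6.5K open-gate uncompressed ARRIRAW stream, not official specifications:

```python
# Back-of-envelope ARRIRAW data-rate estimate for the Alexa 65.
# Resolution and bit depth are approximate assumptions, not exact specs.
width, height = 6560, 3100       # ~6.5K open gate (approximate)
bits_per_photosite = 12          # uncompressed 12-bit raw (assumed)

bytes_per_frame = width * height * bits_per_photosite / 8
mb_per_frame = bytes_per_frame / 1e6
tb_per_hour = bytes_per_frame * 24 * 3600 / 1e12   # at 24 fps

print(f"{mb_per_frame:.1f} MB/frame, {tb_per_hour:.2f} TB per recorded hour")
```

At roughly 2.6TB per recorded hour, a 6TB day works out to a little over two hours of rolling, and a 24TB day to around nine, which is why a director who saves the film in the camera makes such a difference to the data pipeline.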
Did you manage the motion capture data as well, or was this and pre-vis a whole separate entity for data?
No, I only assisted the VFX team with a few of the data points that they needed from the camera. During the live action components, we had a small VFX team taking care of the necessary notes needed by the VFX houses.
During our weeks in the Motion Capture volume, our VFX team ballooned to a large team taking care of the witness cameras, calibrating tracking markers on actors, cameras, and set objects. The team also produced a live representation of the CG dolls and virtual set to help formulate the shooting angles and lenses to be used. This also helped show the actors the environment that they would be interacting with.
Besides camera, what other “metadata” is passed through your department? Camera Reports? Sound Reports? Any stills from the DP? Any technical data from VFX that you needed to back-up along with the RAW data?
Our department took care of the camera footage, the audio department's Broadcast Wave files, DIT colour notes, CDLs, reference grabs and the odd piece of alternate camera footage used for monitor playback. A complete VFX team took care of all the witness cameras, motion capture data and camera position information. One thing that team really liked about the Prime 65 lenses was that the camera displayed the focus position and T-stop right on the SDI display. They kept all the playback video so they could check notes when needed.
Was there near-set dailies and post? Did you work with this team closely throughout the show, or with the local post facility in any capacity?
Our near-set dailies were taken care of by a team from Technicolor Los Angeles, who set up a small dailies facility at our production office. We communicated at least once a day, and I was their liaison on set if any questions were raised. Derek Hogue from ImageMovers oversaw the whole production-to-post process. We had a lot of discussions early on about the area we would extract from the sensor, based on reverse pixel-level math, so we had the ideal capture area for downsampling and optimum delivery of all the sidecar files.
Were you using a “show” LUT when grading with Pomfort LiveGrade and then creating CDLs that were used for dailies?
We did consider using ACES on-set on this project, but shooting with the ARRI Log C curve and large colour gamut, we decided to utilise 33-point cube grading LUTs and CDLs for camera matching, moulding the look to each specific scene with the CDL controls. During the camera tests I created a 3D LUT based on a Kodachrome film emulation, and a K2S2 LUT that got us into the ballpark of what C. Kim Miles wanted. Kim wanted to create two distinctive looks: a real-world look, shot on the ARRI Vintage Primes, using the K2S2 starting point with some custom modifications; and a Doll World look, shot with the Prime 65 lenses, using the custom Kodachrome. This created the doll world that came out of Mark Hogancamp's mind. Kim wanted each portion of the film to feel uniquely different, to help convey the realities Mark was experiencing between his real life and his fantasy world.
Kim wanted the real world to feel less attractive, with a cooler colour palette, just more honest, while the Doll World would feel warm and vibrant, with the look of a “feel good” 1950s movie, hence the Kodachrome emulation. Later we took the footage into a remote grading session with legendary finishing colourist Maxine Gervais from Technicolor LA. We sat in a P3-calibrated DCI 2K grading theatre and recreated the looks in a proper DI environment, so we could see how the image would look on a cinema screen. We also tested a fair amount of diffusion filters, and we both fell in love with one specific set that I'll let Kim reveal if he wants to. All this gave us confidence in what we would shoot over the two months of production. After our grading session, Technicolor converted from the P3 colour space to the various flavours of grading LUTs we would need for near-set dailies, offline editorial and the grades used on set. I feel that working from where the image will end up, and working backwards, gives the most precise and reliable outcomes. It also eliminates possible colour pipeline issues later.
What monitors are you using on-set? What scopes?
I provided both Flanders Scientific CM-250 and Sony A250 25” OLED monitors, calibrated to the Rec 709 standard. I have my own spectrophotometer to check the monitors' calibration once a week. For image analysis, my go-to Leader 5330 was our primary scope, and we also used the built-in scopes on the OLEDs, and in LiveGrade, to double-check results.
Was HDR a consideration on-set for monitoring or were you working solely in ITU709?
HDR was certainly a consideration, but we were all in agreement that it was really only practical as a finishing process and not necessary for monitoring on set, as long as we protected the highlights where we thought it would be advantageous for HDR. Besides, carrying around a 65” Dolby Vision Pulsar monitor would have been difficult and impractical.
What were the biggest challenges during production?
The biggest challenge was our shooting schedule. Robert and Steve Carell like to work short days, and who can blame them! We did what's called Pacific North West hours, which meant we only really had 11 hours a day to set up, shoot and wrap. Though it was really nice to go home at a decent time every day, we also had to have everything planned and be as efficient as possible. That also meant zero downtime due to technical issues.
Thankfully everything on the show went very well, because of the amazing planning by Robert Zemeckis and our amazing 1st AD, Lee Gromett. We also had an internationally recognised, top-notch crew, so we wrapped a day early on the production schedule.
How about locations? It seems like you were on stages but also outside, and lots of green screen. Did this present any production hurdles for digital imaging monitoring?
This project was very tame when it came to difficult locations, but even so, precision was absolutely critical for Robert. The biggest hurdle during production was the motion capture component, where we had a massive array of images on set. For each camera feed we also had a live 3D animated preview of the doll characters, plus a composite of the live action and the 3D dolls that placed the actors' faces on the CG dolls, to see how the lighting and action would line up. At times we had up to nine feeds going at once across three different video villages. Our playback operator Justin Johns and his assistant Erica Fabian were working very hard indeed.
Did you wire up directly to camera or use wireless monitoring with all of the Steadicam work?
We really did a mix of hardwire and wireless. Any time we could go hardwired we would, as you just get a cleaner, more reliable image. I had some custom SDI repeater boxes made that allowed runs of up to 400 feet over copper cable, so we often didn't have to use fibre. We did have a couple of days where a splinter unit was shooting in the next stage, about 2000’ away, and we utilised a fibre system to reach that set. Our HD transmitters were a mix of Paralinx and Teradek, which mostly worked well.
Were there additional cameras used besides the A65? High Speed (Phantom or RED)? Alexa Mini?
Every single live action frame in Welcome to Marwen was shot on the Alexa 65. We had an Alexa Mini on hot standby in case we needed a small form factor camera, but with Peter Wilkie, our A-camera operator, and his dolly grip Chris Jones at the helm, we never had to jump to another camera platform. So it's safe to say that 100% of the project was shot on the Alexa 65, and it certainly shows! We also used the Alexa 65s on the motion capture stage to capture the facial performances from the actors; the CGI team would then incorporate the mouths and eyes onto the CG dolls. The only other cameras we used were an array of six Alexa XTs, frame-synced together for capturing facial elements for CG texture mapping, and a little Sony Handycam for shooting a TV playback element.
How did the Codex portion of the workflow work for you and the data management? What gear were you supplied in the end?
ARRI Rental supplied our main unit and our near-set post office each with a Codex Vault-XL, along with six 8TB SSD Codex Transfer Drives for transferring the footage to post. Splinter and 2nd units utilised a Codex Vault-S with the Alexa 65-specific hardware and software, since there was only one camera. The Vaults also converted the footage from the native sensor data to an ARRIRAW file sequence, directly readable in post, which was processed onto the Codex 8TB SSD Transfer Drives. At the mid-point and the end of each day, we would pull a transfer report and send it on an SSD shuttle along with sound media, DIT reports, frame grabs and CDL files. We also used Pomfort Silverstack on a Mac Pro tower connected to the Vault by 10Gb Ethernet, creating a full library of metadata, scene information, notes and clip thumbnails that we could reference at the DIT station on set. Our Codex Vault-XL also had an SDI output card, so I would run back from time to time to spot-check files or look at details that the SDI output from the Alexa 65 can't show in full 6K glory. On some occasions we would offload 4K QuickTime files and frame grabs to look at certain shots on set, if we expected any image artifacts due to moiré, aliasing, lighting flicker or phasing.
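The heart of every offload step described above, whether performed by a Vault or by Silverstack, is a checksum-verified copy: the bytes are hashed on both sides so a drive can be cleared with confidence. A minimal sketch of the idea follows, using SHA-256 from Python's standard library; production tools typically use faster hashes such as xxHash, and this is an illustration of the principle, not the actual Codex or Pomfort implementation:

```python
# Minimal sketch of a checksum-verified offload, in the spirit of what
# data management tools automate. Hashing with SHA-256 for portability.
import hashlib
import shutil
from pathlib import Path

def hash_file(path, chunk=8 * 1024 * 1024):
    """Hash a file in 8MB chunks so large camera files don't fill RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def verified_copy(src, dst_dir):
    """Copy src into dst_dir, then re-read both sides to confirm the
    bytes landed intact before the source media is considered safe."""
    dst = Path(dst_dir) / Path(src).name
    shutil.copy2(src, dst)
    if hash_file(src) != hash_file(dst):
        raise IOError(f"Checksum mismatch for {src}")
    return dst
```

Verifying by re-reading the destination, rather than trusting the copy call, is what lets a loader sign off that camera media can be recycled for the next day.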
Was this a fun production overall?
I have to say this was one of the best productions I have ever worked on. If anyone ever wanted to do a case study on the most organised, efficient and professional production sets, this would be it. If you look this project up on IMDb, you would be very surprised at the depth of experience our crews have. The biggest component of this was the fact that we had Olympian-level heads of department who worked and communicated with the crew with confidence, clarity and humanity. I don't think I ever heard anyone raise their voice or freak out once on this project, and it's pretty normal on big projects for someone, at some point, to lose their cool. It was an absolute honour and a pleasure to work with some of the most talented people I have ever known in my 17 years in film. We really had an amazing team!
What have you done since this production and what are you working on now?
After Welcome to Marwen I was very fortunate to work with the legendary Director of Photography Russell Carpenter on a Disney project called Noelle, starring Anna Kendrick, Bill Hader and Shirley MacLaine, to name a few. It was my first Christmas feature, with impressive set builds and a challenging winter location, and it comes out Christmas 2019. After that I worked on a number of commercials for Hyundai, Kia, Pfizer and many more, then landed 2nd unit on Sonic, Paramount Pictures' live action adaptation of the popular Sonic the Hedgehog video game, where I had the pleasure of working with Peter Collister. Just recently I finished a pilot for Fox Networks, shot by the talented cinematographer and storyteller Checco Varese.
Images courtesy of their respective owners.