In Part 1 of our Lens Metadata series, we provided some history and background on gathering and using lens metadata on set. Collecting and using all this information on set is wise, but what happens after that? Lens metadata often gets lost along the way, yet if carried through to post-production, this valuable information can save time and money, particularly when compositing complex VFX shots. Taking LDS as our example, the original camera negatives from ARRI cameras support the transport of lens metadata in ARRIRAW, ProRes, or MXF/ARRIRAW files. The ARRI Metadata Bridge can then carry that metadata into the DPX or OpenEXR files supplied to VFX companies.
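To see why OpenEXR is such a convenient carrier for lens data, it helps to know that an EXR header is simply a list of named, typed attributes, so tools can attach extra metadata without touching the image itself. The sketch below is a minimal, illustrative pure-Python serializer for that header layout (magic number, version, then name/type/size/value records); it is not the ARRI Metadata Bridge, and the attribute names shown are hypothetical stand-ins for real lens fields.

```python
import struct

MAGIC = b"\x76\x2f\x31\x01"           # OpenEXR magic number
VERSION = struct.pack("<i", 2)        # file-format version field

def pack_attributes(attrs):
    """Serialize string attributes the way an EXR header lays them out:
    null-terminated name, null-terminated type, little-endian size, value."""
    out = bytearray(MAGIC + VERSION)
    for name, value in attrs.items():
        data = value.encode("utf-8")
        out += name.encode() + b"\0" + b"string\0"
        out += struct.pack("<i", len(data)) + data
    out += b"\0"                      # an empty name terminates the header
    return bytes(out)

def unpack_attributes(blob):
    """Parse the attribute list back out of the header bytes."""
    assert blob[:4] == MAGIC
    pos = 8                           # skip magic + version
    attrs = {}
    while blob[pos] != 0:             # header ends at an empty name
        end = blob.index(b"\0", pos)
        name = blob[pos:end].decode(); pos = end + 1
        pos = blob.index(b"\0", pos) + 1          # skip the type string
        (size,) = struct.unpack_from("<i", blob, pos); pos += 4
        attrs[name] = blob[pos:pos + size].decode("utf-8"); pos += size
    return attrs

# Hypothetical lens attributes riding alongside a frame
lens = {"lensModel": "Master Prime 35", "focalLength": "35.0", "tStop": "2.1"}
assert unpack_attributes(pack_attributes(lens)) == lens
```

A real EXR header also carries mandatory image attributes (data window, channels, compression, and so on); the point here is only that per-shot lens data can travel as ordinary named attributes that any downstream tool can read.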

Before the dailies and metadata head off to post-production, it is paramount to ensure this data is organized in an easily accessible fashion. Of course this varies from vendor to vendor, but as stated earlier, this process needs a protocol set up before shooting. Once organized, the files are sent off to post where the lens data may be extracted and used within applications like The Foundry’s Nuke or Adobe After Effects.

VFX compositor and compositing supervisor Brian Sales (Stranger Things, Black Sails) explains that he uses lens data, if available, when camera tracking in post is needed to get an accurate 3D track. Often the lens data is provided on line-up sheets; however, he further explains, “We often have to delve into the shot’s metadata to find lens data - the software can estimate field of view data, but I generally prefer to give it actual data if it’s available.” 
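The field-of-view estimate Brian mentions comes from a standard relationship between focal length and sensor (gate) width, which is why recorded lens data beats guessing: with the true focal length, the angle of view follows directly. A small sketch, assuming an approximate Super 35 gate width for illustration:

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm):
    """Horizontal angle of view from focal length and sensor gate width."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# e.g. a 35 mm lens on a roughly 24.9 mm-wide Super 35 gate
fov = horizontal_fov_deg(35.0, 24.9)   # ≈ 39°
```

If the tracker instead has to solve for focal length from the image alone, any error in that estimate propagates into the 3D solve, which is Brian's point about preferring actual metadata.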

Brian went on to share, “I’ve honestly never worked at a company that has a set workflow for using that data, especially shading - that doesn’t mean that these aren’t aspects we have to deal with, they are just generally eyeballed.” This note seems remarkable considering how long metadata has been available, the complexity of VFX in modern films, Brian’s extensive resume and the extraordinary work of so many other VFX artists. Alas, as Les Zellan lamented, his dream has yet to be fully realized, and perhaps that is due to a disconnect between production, post, VFX supervisors and the artists completing VFX. “A lot of compositors don’t come from a production background, so dealing with sensor sizes and lenses is not what they want to focus on. They are just concerned that it looks right and doesn’t get too complicated,” Brian continued. “I am hoping that there’s more usage of metadata in VFX as that info becomes more readily available…I have checked out what Zeiss is starting to offer with their eXtended lens data and that looks really promising. They offer tools that could make the use of lens data more routine.”

All that said, the insight shared by Brian and others is nothing short of turning on a light in a dark room - the technology is there for the taking, and perhaps those who make a concerted effort to implement lens data will be leaps and bounds ahead of the competition.

Let’s look closer at how lens metadata has been used on some projects. It’s hard to show examples from Hollywood blockbusters, but we do have some cases from a pair of auto commercials and a project funded by Zeiss to explore and show proof of concept for their eXtended data.

Director Andrew Sinagra helmed two spots, one for the Lincoln MKZ and another for the Ford Escape. For both spots, Ntropic utilized an innovative technology, shooting with the ARRI ALEXA Plus camera. This process entailed pulling metadata from the lens and embedding information (such as focal length, F-stop, etc.) into the footage. Ntropic first fully pre-visualized each spot and then, in cooperation with ARRI and cinematographer Bill Bennett ASC, Sinagra and Ntropic captured all the plates required to composite cars that were not yet available on the market. In place of the actual vehicles, the team used cars of similar size and design as a placeholder. They shot full camera moves, including lens flares, dynamic pans, tilts and rolls, and of course tracking shots with frame-by-frame accuracy. In both spots, the cars were fully replaced in post, including reflections on the hood, body, windows and wheels, and all the human elements of an actual spot were either left in or embellished to keep the spot ‘real.’

See the Ford Escape and Lincoln spots here and here.

While the use of the metadata was essential to deliver these groundbreaking spots, Sinagra explained that work was still needed, “When we did those projects, the lens metadata system had the information we needed, but it was a bit cumbersome to access. We had to jump through a couple of conversions to get the data in a format that our tracking software would like. It would be great to see tools by the various camera tracking packages that would support reading the data directly, and knowing the best use of the information, remove a bit of the trial and error.”

Now let’s look at the short film Stucco, directed by Janina Gavankar, shot by Quyen Tran and created through a collaboration between ZEISS, RED and EFILM. The filmmakers used ZEISS Supreme Primes and the RED DSMC2 MONSTRO 8K camera and utilized a groundbreaking post workflow, designed by EFILM’s Joachim Zell, that unlocks new opportunities to simplify and increase the accuracy of the image capture and processing workflow. The ZEISS XD workflow is the next step in innovative solutions for VFX. It’s based on Cooke /i but focuses on lens shading and distortion, removing the need to shoot lens grids by working with shading and distortion data straight from the lens itself.

For Stucco, VFX artist Caleb Knueven explains that having access to the lens characteristics in Nuke, through EFILM’s technique of injecting lens data into OpenEXR files, will make the creation of complex VFX more streamlined: it provides a flatter palette for placing and tracking 3D objects, with the knowledge that the distortion and shading will be added back when effects are done. This ensures a seamless transition between VFX shots and original camera shots. VFX artist Gene Warren III states that in the past, while using After Effects or Nuke, you couldn’t easily apply the metadata. Now, with the plugins from ZEISS that he used for Stucco, there is a reliable method for utilizing lens characteristics. Using the newest Nuke plugin, after the composite was complete, Warren says he could apply the lens data back onto the shot as quickly as dropping in a color timing node. He reiterates how much time and money this will save through quicker compositing, with the proper pipeline. Ideally, the VFX artist will get a trimmed version of the shot in OpenEXR format with metadata attached.
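The undistort-then-redistort round trip that Knueven and Warren describe can be illustrated with a toy one-coefficient radial distortion model. This is a generic sketch of the technique, not the proprietary ZEISS XD model: the plate is flattened for clean 3D placement, then the same distortion is reapplied so the composite matches the original photography.

```python
def distort(x, y, k1):
    """Apply a simple one-coefficient radial distortion to a normalized point."""
    r2 = x * x + y * y
    s = 1 + k1 * r2
    return x * s, y * s

def undistort(x_d, y_d, k1, iterations=10):
    """Invert the model by fixed-point iteration (no general closed form)."""
    x, y = x_d, y_d
    for _ in range(iterations):
        r2 = x * x + y * y          # radius of the current undistorted guess
        s = 1 + k1 * r2
        x, y = x_d / s, y_d / s     # refine against the observed point
    return x, y

# Round trip: undistort for flat comp work, re-distort to match the plate
xd, yd = distort(0.4, 0.3, k1=-0.05)
xu, yu = undistort(xd, yd, k1=-0.05)
assert abs(xu - 0.4) < 1e-6 and abs(yu - 0.3) < 1e-6
```

With per-frame coefficients read straight from the lens metadata instead of solved from a shot lens grid, this reapplication step becomes as mechanical as Warren’s “dropping in a color timing node” comparison suggests.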

If you see a TV show like Game of Thrones or Stranger Things or movies like Star Wars, Avengers or Jumanji, the seamless VFX you see, or even better don’t see, were probably made that much better because lens data was applied. The lines between image capture and VFX magic have been blended perfectly because a guy named Les had a dream some 20 years ago.

Thanks to DP Jimmy Matlosz for assistance in crafting this article.


Images courtesy of their respective owners.
