ACES UPDATE: PAST, PRESENT, AND FUTURE
When motion pictures were primarily acquired and finished on film, the workflow was relatively simple – film negative went to the lab, where it was developed, color timed, and printed. An answer print served as the color reference for mastering and re-mastering, and an interpositive (IP) was used for home video mastering. Disruptive technologies and workflows – first digital intermediate, then digital distribution and digital acquisition – changed this over a reasonably short period. Although these changes brought significant benefits, they also opened up a Pandora's box of look-up tables (LUTs) and color workflows.
In November 2007, the Academy of Motion Picture Arts and Sciences published a paper called The Digital Dilemma, which examined the motion picture industry's practices around digital storage and archiving. ACES (Academy Color Encoding System) is intended to address a facet of this by standardizing image formats and ensuring a consistent color experience that preserves the filmmaker's creative vision from the point of capture through editing, VFX, mastering, distribution, archiving, and into the future.
ACES 1.0 was released in December 2014 after about ten years of research, testing, and discussion by some of the world's leading color scientists. Now that ACES is five years old and has been used on many productions all over the world, it's a good time to assess where it stands and what the future holds.
The Academy had these goals for ACES –
- Create a digital image file interchange and color management system that was not dependent on any specific camera, display, or production or post-production tool.
- Create a system that could enable innovations, such as high dynamic range (HDR) and wide color gamut.
- Encourage manufacturers to integrate this new system into their products.
- Enable producers and studios to create digital masters suitable for long-term archiving, so they would no longer have to remake a movie each time display capabilities advanced.
- Enable standardized workflows to reduce digital laboratory costs.
- Provide education and support for the motion picture and television communities to incorporate these benefits into their workflows.
- Make this new system available for free.
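At the heart of these goals is a device-independent, scene-referred color encoding. As a rough illustration of what that means in practice, the sketch below converts a linear Rec.709 pixel into ACES2065-1 (the AP0-primaries interchange encoding) using the commonly published 3×3 matrix; the function and variable names are our own for illustration, not part of any ACES tool.

```python
# Sketch: converting linear Rec.709 RGB to ACES2065-1 (AP0 primaries).
# The 3x3 matrix is the widely published Rec.709 (D65) -> ACES2065-1
# transform with Bradford chromatic adaptation to the ACES white point.

REC709_TO_ACES2065_1 = [
    [0.4397010, 0.3829780, 0.1773350],
    [0.0897923, 0.8134230, 0.0967616],
    [0.0175440, 0.1115440, 0.8707040],
]

def rec709_to_aces(rgb):
    """Convert one linear Rec.709 pixel (r, g, b) to ACES2065-1."""
    return tuple(
        sum(m * c for m, c in zip(row, rgb))
        for row in REC709_TO_ACES2065_1
    )

# Reference white should remain (approximately) white after the transform,
# since each matrix row sums to ~1.0.
white = rec709_to_aces((1.0, 1.0, 1.0))
```

Because every camera vendor publishes (or embeds) a transform like this into ACES2065-1, downstream tools can treat footage from any source identically – which is exactly the interchange goal listed above.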
Some of the first feature film projects that used ACES included Big Eyes, Dolphin Tale 2, and Pan. Forward-thinking filmmakers realized that utilizing ACES could save money, improve image quality, and create a future-proof archival master. In a world where nothing stays constant for long, it is critical to think about the future when creating a master, given that technologies like HDR and 8K resolution continue to change how we view content.
More and more feature films are using ACES, with studios like Marvel and Universal making it an integral part of their production, post-production, and VFX pipelines. Marvel has brought their VFX pulls in-house and delivered OpenEXR files to VFX facilities around the world on recent projects like Avengers: Endgame, Captain Marvel, and Spider-Man: Far From Home. One of the first ACES projects for Marvel Studios was Guardians of the Galaxy Vol. 2, which was also the first movie to be shot with the 8K RED DRAGON VV sensor, by cinematographer Henry Braham. Codex worked closely with Marvel to design an ACES pipeline that began on-set and continued through to the digital intermediate.
Companies like Mission, a UK-based DIT and digital services company, are designing ACES workflows and making ACES the default on every project. For Downton Abbey, Mission designed an ACES workflow for the Sony Venice camera. For Cats, they developed an ACES workflow for the ARRI ALEXA 65, delivering OpenEXR VFX pulls with embedded CDL information and the LMT (Look Modification Transform) that cinematographer Chris Ross BSC had approved. This means that all the VFX artists see the image as it's supposed to be viewed, and the need for guesswork as to what the DP intended is removed.
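The CDL information embedded in those pulls travels as a small set of numbers defined by the ASC CDL specification: per-channel slope, offset, and power, plus an overall saturation. A minimal sketch of that transfer function (the function name is ours; real pipelines apply this inside grading or compositing tools):

```python
# Sketch of the ASC CDL transfer function carried alongside VFX pulls:
# per-channel slope/offset/power (SOP), then a saturation adjustment
# computed with Rec.709 luma weights, per the ASC CDL specification.

def apply_cdl(rgb, slope, offset, power, saturation):
    # SOP stage: out = (in * slope + offset) ** power, per channel,
    # clamping negatives before the power function.
    sop = [
        max(c * s + o, 0.0) ** p
        for c, s, o, p in zip(rgb, slope, offset, power)
    ]
    # Saturation stage: blend each channel toward the pixel's luma.
    luma = 0.2126 * sop[0] + 0.7152 * sop[1] + 0.0722 * sop[2]
    return tuple(luma + saturation * (c - luma) for c in sop)

# An identity CDL (slope 1, offset 0, power 1, saturation 1)
# leaves the pixel unchanged.
graded = apply_cdl((0.18, 0.4, 0.1), (1, 1, 1), (0, 0, 0), (1, 1, 1), 1.0)
```

Because the math is this simple and fully specified, a DP-approved look can be reproduced identically at every VFX facility that receives the pull.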
Although ACES was developed under the auspices of the Motion Picture Academy, it has had a significant impact on television production as well. As an example, Netflix has embraced the use of ACES for their projects – they don't mandate it, but they recommend it as a simple color pipeline that achieves their goal of creating an HDR, wide color gamut master. David Fincher's Mindhunter is just one example of a television series using ACES. Season 2 was finished in Dolby Vision, and DP Erik Messerschmidt worked with David Fincher's in-house colorist, Eric Weidt, to design a color pipeline and on-set monitoring system that enabled them to monitor in HDR using an ACES workflow in Dolby PQ gamma and Rec.2020 color. Unlike on many productions, Messerschmidt monitored HDR on-set with the Canon DP-2420 as his primary monitor. Production on Mindhunter takes place in Pittsburgh, but color grading occurs in Hollywood, along with editorial and VFX, with David Fincher using PIX to send notes back and forth while he is still shooting.
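Monitoring in Dolby PQ means encoding absolute luminance with the perceptual quantizer curve standardized as SMPTE ST 2084. A minimal sketch of the PQ inverse EOTF – nits in, normalized code value out – using the constants from the standard (the function name is illustrative):

```python
# Sketch: SMPTE ST 2084 (PQ) inverse EOTF, mapping absolute luminance
# in cd/m^2 (nits) to a normalized 0-1 signal. Constants per the standard.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    """Encode luminance (0 to 10,000 nits) as a normalized PQ signal."""
    y = max(nits, 0.0) / 10000.0   # normalize to the 10,000-nit PQ ceiling
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1.0 + C3 * y_m1)) ** M2
```

Unlike a display-relative gamma, each PQ code value pins down one absolute brightness, which is what makes an HDR image graded in Hollywood match the on-set reference monitor in Pittsburgh.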
Work on refining and improving ACES continues. ACES 1.2 is being actively developed, with Virtual Working Groups for CLF (Common LUT Format) and AMF (ACES Metadata File, the new name for ACESclip) finalizing the specification stage, with the 4th quarter of 2019 as the target date for publication of the specifications. Work on ACES 2.0 is also underway, with industry leaders like Annie Chang of Universal Studios and Joachim Zell of EFILM leading the way. The Academy has also launched the Academy Software Foundation (ASWF) in collaboration with the Linux Foundation as a neutral forum for open source software developers to share resources and collaborate on technologies for image creation, visual effects, animation, and sound. ASWF aims to ensure that projects such as ACES and OpenEXR are maintained and continued.
Related products and workflows
Images courtesy of their respective owners.