



LOS ANGELES, June 15, 2022 – Award-winning video game developer Supermassive Games has earned a reputation for creating some of the most immersive gaming experiences ever made, in part thanks to a heavy emphasis on building from real performances that bring its digital characters to life. For its latest title, The Quarry, it went further than ever before and turned to Oscar-winning VFX powerhouse Digital Domain to help create one of the most visually immersive, photorealistic games of all time. But to get there, the VFX studio first had to expand its award-winning tools, and invent a new one along the way.

“Right from the start we were blown away by Supermassive Games’ ambition, and we knew our
experience with feature films put us in a unique position to help,” said Aruna Inversin, creative
director and visual effects supervisor for Digital Domain. “We started with the same tools we use
to create movies and episodics, then adapted them for game development. Basically, we came
up with a new, faster way to animate photorealistic digital characters.”

The Quarry is an interactive survival-horror experience fueled by the performances of an award-winning ensemble cast. Photorealistic digital versions of several well-known Hollywood veterans react throughout the story with the same level of nuance and emotion seen in a live-action film, but with the added bonus of having the players control the circumstances and actions. Will they choose to investigate the sounds in the distance, or hide? Will they help the others, or try to survive on their own? There are several outcomes, and survival is far from guaranteed.

“It was important for us to make every movement and reaction as realistic as possible, to create
an immersive experience where players truly feel the weight of their decisions and connect to
the characters,” said Will Byles, game director for The Quarry. “We wanted to get as close to the
likeness of the actors as possible, both in terms of the super high detail scans and the subtleties
and nuances of their performances. With its history of award-winning VFX and pioneering digital
human work, Digital Domain was the best choice for this.”

Expanding on the idea of The Quarry as an interactive cinematic experience, there is even a
“Movie Mode” option that allows users to tweak the personality traits and actions of each
character, and then sit back and watch it all play out. Additional filters can further enhance the
experience by adding an 8mm-style “Indie Horror” film grain, a retro VHS “’80s Horror” look and a black-and-white “Classic Horror” filter.

Building a Better Slasher

To help bring photorealistic facial animation to a new level, the VFX studio famous for its work
on Spider-Man: No Way Home, Stranger Things, WandaVision and countless others, began by
conducting a series of facial scans of every cast member. That gave the team at Digital Domain
an accurate template to work from, as well as a comprehensive library of facial shapes and
expressions for each performer. The data was then sent on to the artists at Supermassive
Games, who created the looks for the in-game characters.

Next, it was time for the performances. Each cast member reported to Digital Domain’s
performance capture stage in LA, where they slipped on a full mocap suit and a facial capture
rig. They then performed their roles individually or in small groups (in adherence with COVID safety protocols), acting out several versions of the story to create multiple branching narratives, including dozens of death scenes for each character.

Game developers are no strangers to using mocap and facial rigs to help create digital
characters, but with help from Digital Domain, The Quarry went several steps further than most.
In total, the cast underwent an incredible 42 days of motion capture, resulting in 32 hours of
captured footage.

From Thanos to The Quarry
With the digital characters from Supermassive Games ready and the live-action performances recorded, the combined data was then fed through Digital Domain’s Masquerade facial capture system, a tool initially developed for Thanos in Avengers: Infinity War. Since its critically acclaimed debut, the system has evolved for use in projects beyond feature films.

Masquerade 2.0 can analyze live-action footage, combine it with the individual and unique library of that performer’s facial shapes, and then blend it all together with the CG character. Utilizing machine learning, it can then create a photorealistic CG version of the performance and fully animate it, all in record time. For The Quarry, that ensured that each performance seen in the game includes every facial movement, every emotive reaction and every bit of nuance from the talented cast.
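At a very high level, the step described above amounts to expressing each captured frame as weights over the performer’s scanned library of facial shapes, which then drive the CG character. The sketch below illustrates that general idea only; all names and the normalization scheme are invented for illustration and do not reflect Digital Domain’s proprietary Masquerade implementation.

```python
from dataclasses import dataclass

# Illustrative sketch only: hypothetical names, not Digital Domain's API.

@dataclass
class PerformanceFrame:
    """One frame of head-cam footage with tracked facial measurements."""
    landmarks: dict  # e.g. {"jaw_open": 0.42, "brow_raise": 0.1}

def solve_frame(frame: PerformanceFrame, shape_library: dict) -> dict:
    """Express a captured frame as weights over the actor's scanned
    facial shapes (a blendshape-style solve) to drive the CG rig."""
    weights = {}
    for shape_name, (lo, hi) in shape_library.items():
        raw = frame.landmarks.get(shape_name, 0.0)
        # Normalize the tracked value into this shape's calibrated range.
        weights[shape_name] = min(max((raw - lo) / (hi - lo), 0.0), 1.0)
    return weights

# Usage: a per-actor library maps each expression to its min/max range.
library = {"jaw_open": (0.0, 1.0), "brow_raise": (0.0, 0.5)}
frame = PerformanceFrame(landmarks={"jaw_open": 0.42, "brow_raise": 0.1})
print(solve_frame(frame, library))  # weights driving the CG character
```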

“Masquerade is an incredibly powerful system, but ultimately it’s the performances that really
make The Quarry stand out,” said Paul “Pizza” Pianezza, Digital Domain senior producer. “We
wanted to ensure that our tools amplified what the cast brought, so each performance, down to the subtlest movement, is as true to the live-action performance as possible.”

But for a game like The Quarry that is built around interactivity, simply having film-quality digital characters wasn’t enough. The team needed to be able to edit those digital characters in real time, and have those edited results be instantly ready to go without the need for additional touch-ups from artists. The only problem was that the tools Digital Domain needed didn’t exist… yet.

Solving Common Problems
In order to create digital characters that could be edited in real time, Digital Domain first needed to address two common challenges with facial capture that typically require post-capture cleanup by artists: eye tracking and helmet stabilization. First, Digital Domain started with an open-source eye-tracking solution known as “Gaze ML,” and heavily modified it over the course of three years. By using machine learning to identify the unique nature of each cast member’s eyes, and pairing it with a high-frame-rate head-mounted camera, Gaze ML was able to produce more accurate tracking and improve the look of the digital eyes.

Next up, Digital Domain created a new way to stabilize the data from the head-cam. Facial tracking solutions typically require the wearer to remain relatively immobile to avoid shaking and blurring. But using a new, proprietary technique developed in-house, machine learning algorithms in Masquerade 2.0 were able to analyze the captured footage and compensate for any jostling. That gave the cast the freedom to move around normally and unrestricted, even to run and jump while wearing a head rig, elevating their performances.
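Digital Domain’s actual stabilization technique is proprietary, but the general idea of compensating for head-cam jostle can be sketched simply: align each frame to a reference using landmarks that are rigid relative to the skull (forehead, nose bridge), so that any remaining motion reflects expression rather than camera shake. Everything below is a hypothetical toy version under that assumption.

```python
import numpy as np

# Hypothetical sketch of head-cam stabilization, not Digital Domain's
# method: cancel per-frame camera translation by anchoring landmarks
# that barely move relative to the skull.

def stabilize(frames: np.ndarray, stable_idx: list) -> np.ndarray:
    """frames: (T, N, 2) landmark tracks over T frames; stable_idx:
    indices of landmarks assumed rigid relative to the skull.
    Returns tracks with per-frame camera drift removed."""
    ref = frames[0, stable_idx].mean(axis=0)  # reference centroid
    out = frames.astype(float).copy()
    for t in range(frames.shape[0]):
        drift = frames[t, stable_idx].mean(axis=0) - ref
        out[t] -= drift  # cancel the jostle for every landmark
    return out

# Usage: two landmarks, the first one rigid; the whole frame shifts by
# (5, 5) at t=1 (pure camera shake), and stabilization removes it.
tracks = np.array([[[0.0, 0.0], [1.0, 1.0]],
                   [[5.0, 5.0], [6.0, 6.0]]])
print(stabilize(tracks, stable_idx=[0])[1])
```

A production system would also need to handle rotation and scale (a full rigid alignment rather than translation only), but the translation-only version keeps the idea visible.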

Introducing Chatterbox
Tying it all together, Digital Domain went one step further and introduced a new technology called “Chatterbox,” a tool initially developed by its internal Digital Human Group to help lay the groundwork for the future of autonomous humans. Powered by machine learning algorithms, Chatterbox can analyze the live-action facial expressions unique to each performer, then determine the best possible options for seamlessly altering facial expressions while still maintaining quality and lifelike movement. By streamlining the amount of data and focusing on only the best facial options, the digital characters can then be uploaded and edited directly in a game engine.
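The core idea described here, narrowing a performer’s full expression library down to the few best candidates so they can be tweaked in-engine in real time, can be illustrated with a simple nearest-match selection. The names and scoring below are invented for the sketch; Chatterbox itself uses machine learning and is not publicly documented.

```python
# Hypothetical illustration of the selection step: from a performer's
# library of captured expressions, keep only the closest matches to a
# requested edit, so a small, high-quality subset is loaded into the
# game engine. Not Digital Domain's actual algorithm.

def best_expression_options(library, target, k=3):
    """library: {name: feature_vector}; target: desired feature_vector.
    Returns the k expression names closest to the target (Euclidean)."""
    def dist(vec):
        return sum((a - b) ** 2 for a, b in zip(vec, target)) ** 0.5
    ranked = sorted(library, key=lambda name: dist(library[name]))
    return ranked[:k]

# Usage: pick the 2 captured expressions nearest a "slight smile" target.
lib = {
    "neutral":    [0.0, 0.0],
    "half_smile": [0.5, 0.1],
    "big_smile":  [1.0, 0.3],
    "frown":      [-0.8, 0.0],
}
print(best_expression_options(lib, target=[0.4, 0.1], k=2))
```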

When paired with Masquerade 2.0 for The Quarry, Chatterbox was able to access the library of
facial expressions from each cast member, opening up a host of possibilities for Supermassive
Games. Once the data was running in Unreal Engine, the devs were able to see the characters in polished digital environments and make real-time adjustments to the performances as needed, all based on each performer’s actual unique facial movements and mannerisms. That eliminated the need for costly reshoots, while still ensuring the most accurate and faithful performances possible.

Overall, less than 0.5% – only 27 out of a staggering 4,500 shots deemed game-quality – required minor post-production alterations by an artist. In total, Digital Domain rendered over 250 million frames for The Quarry, helping to create a new high-water mark for high-fidelity, interactive cinematics.

“Creating high-quality, photorealistic visual effects used to be unique to feature films, but the technology has evolved, and given our decades of history and ongoing success, we are in a unique position to bring professional-quality VFX to any screen,” said John Fragomeni, global president of Digital Domain. “We have some of the best artists in the world, so there’s really no limit to where and how we employ our tools, and we were honored to work with Supermassive Games to help bring The Quarry to market.”

The Quarry will be available on June 10 in both physical and digital formats for PlayStation 4,
PlayStation 5, Xbox One, Xbox Series X|S, and digitally on Windows PC via Steam.



LOS ANGELES, June 3, 2022 – In Doctor Strange in the Multiverse of Madness, Marvel Studios pushes the boundaries of reality and then keeps going. With broken universes, environments wrapped in illusion and magical traps that shatter the world, the film introduces visual elements unlike anything ever seen before. To help show audiences a new look at reality, Marvel Studios turned to longtime VFX partner and Oscar-winning studio Digital Domain.

“In Doctor Strange in the Multiverse of Madness, the visuals play an important role in the storytelling process, as the characters hop from one incredible reality to another, seeing impossible things as they go,” said Joel Behrens, VFX supervisor for Digital Domain. “That gave us the opportunity to create worlds and new realities for audiences to enjoy, which is something most artists can only dream of doing.”

Worlds and Worlds
During his desperate, reality-hopping flight through the “Multiverse,” Doctor Strange (Benedict Cumberbatch) at one point finds himself in the center of an “incursion,” highlighting the consequences of two realities colliding. Although the people are gone, pieces of the decimated universes continue to exist, with shattered buildings no longer held in check by gravity and the ghostly remnants of dead civilizations flying by. At the heart of these dead realities, one structure remains standing: a warped version of Doctor Strange‘s home, the Sanctum Sanctorum.

To create the scene of metaversal destruction, the filmmakers utilized a combination of physical and digital effects, starting with a practical two-story stage with bluescreens above and around it. As the performers move down the ruined, prop-filled streets of a familiar-looking version of New York City, digital artists added the sky, including shattered buildings and wreckage from two realities floating by.

Digital Domain began by designing pieces from two separate realities, each with its own distinct flair. Using Houdini and V-Ray for lighting and rendering, artists created several CG assets ranging from debris to vehicles to pieces of buildings. For the first of the two Earths, the materials looked familiar and would be right at home in our own New York City (and even included a CG version of a certain 1973 Oldsmobile Delta 88 that fans of director Sam Raimi’s films will no doubt appreciate). The larger assets, including the buildings, were then broken into countless pieces of debris, which were animated to suggest a flowing motion from the shattered structures into the dead void above.

With the familiar assets complete, Digital Domain then began constructing the remnants of a Victorian-inspired version of New York, complete with gothic spires and ornate metalwork. To highlight this alternate world, artists adapted familiar structures, including the Brooklyn Bridge and several other altered landmarks. After reimagining them through a steampunk lens, the CG assets were then broken into pieces and animated before being added to the crowded sky.

With the ruined, moving worlds created, Digital Domain then combined the digital content with
the live-action footage of the incursion, adding in the objects in the sky and the extension of the
streets. The scene was then topped with a haunting gray sky reminiscent of fog, but reflecting
the absence of anything beyond it. With the broken materials in motion, artists then took things
one step further and showed pieces of the ruined cities colliding with each other, creating even
more debris. The VFX team also digitally recreated part of the physical set to bring the collisions
down to the street level.

With the world collapsing around him, Doctor Strange heads to the one place he thinks might
help him find a way back home: an alternate version of Sanctum Sanctorum. But once he
arrives, it’s clear that the alternate version of his home is a very different place than the one he
left behind. To create the “Sinister Sanctum,” Digital Domain began by digitally reconstructing the once-inviting structure from the ground up to make it feel more like a mausoleum than a sanctuary. Artists then built on that by extending and animating the crumbling roof as it trails off into the void, while also adding CG skulls to the physical skull props scattered around the ground. In total, Digital Domain used only one existing asset for the sequence: the shattered remnants of the iconic window.

Once inside, the alternate Sanctum Sanctorum further highlights the broken nature of the world,
with a physical set standing in for the floor and the walls, and bluescreens for the rest. Utilizing
plates taken from a beach in Iceland, Digital Domain created a CG ocean in Houdini, then
flooded the scene digitally. The VFX studio then took images of the physical staircase on set
and extended it to show it leading off into a seemingly endless march upward, before being lost in the fog-like haze.

Apples and Illusions
With the Multiverse in play, Doctor Strange seeks help from ally and fellow magic user, Wanda
Maximoff (Elizabeth Olsen), who is still dealing with the emotional fallout of the events depicted
in the Disney+ series, WandaVision. She and Strange discuss the Multiverse while walking
through an idyllic apple orchard, filmed on location at a real farm in England. Things change
quickly though, and the orchard is replaced by a warped and twisted environment, digitally
created to visually convey the consequences of dark magic.

The transformation sequence begins with an effect similar to the hex wall magic effect seen in
WandaVision. To create that look for the series, Digital Domain worked with the showrunners to
find a look that felt like a natural extension of Wanda’s powers, but with a more chaotic and
powerful feel. After multiple iterations, the new hex power was introduced along with the Scarlet
Witch persona.

Once the true environment is revealed, the production moved from on-location filming to a bluescreen set. Teams at Digital Domain developed the corrupted version of the orchard, highlighted by twisted, ruined trees, dusty ground and a red sky covering it all. The once-inviting farmhouse, created digitally, was also replaced by a ruined husk of itself. Artists also added swampy gases and the sun, filtered through the dark, nearly impenetrable clouds.
It’s a scene straight from a nightmare, setting the tone for the events still to come.

Mirrors and Traps
After the film’s villain has been revealed, the heroes head to the magical stronghold of Kamar-Taj, where they prepare for a battle that pits magic against magic. During the battle, Wanda finds herself caught in a unique and powerful trap, where mirrors and reflections become a weapon.

Digital Domain started at the very beginning with the sequence’s previs to establish the full look and movements. Artists then began to create a smaller, more contained version of the mirror realm initially introduced in 2016’s Doctor Strange. As Wanda moves into the mirror realm, she launches a hex bolt, created using the same techniques Digital Domain used for Wanda’s evolving magic in WandaVision. The mirror then shatters, creating multiple shards generated in Houdini.

For the live-action component, Olsen recorded several versions of the scene, offering the filmmakers and VFX artists several reactions to work with. The VFX team then referenced the multiple shots to create several looks for Wanda, which they then animated, each highlighting a different reaction within the mirror realm. The action then heads back into Kamar-Taj, with digi-doubles of Wanda pushing through mirrors and reflections. The digi-doubles of Olsen, along with digital models of Cumberbatch created from digital scans of the actor, were then shared with other VFX studios for use throughout the film.

“Part of the reason Marvel Studios has had so much success is that they have a clear vision,
and an understanding of what needs to be done to make that become a reality,” said John
Fragomeni, global president of Digital Domain. “Creating something as unique as a Multiverse
and pairing it with reality-bending magic isn’t a simple task, but we’ve been collaborating with
Marvel for years now. Understanding and executing on their vision has been key to our success,
and we can’t wait to show audiences what comes next.”

Digital Domain’s work on Doctor Strange in the Multiverse of Madness marks the latest in a long
line of collaborations between Marvel Studios and the VFX studio. The team-up will continue with the episodics Ms. Marvel and She-Hulk, and the feature Black Panther: Wakanda Forever.

Doctor Strange in the Multiverse of Madness is playing now in theaters, and will begin streaming
exclusively on Disney+ June 22.



STUTTGART, Germany, May 6, 2022 – Today at FMX, Oscar-winning VFX studio and leader in digital human technology Digital Domain announces “Zoey,” the world’s most advanced autonomous human. Powered by machine learning and created using an advanced
version of the technology and process that helped bring Thanos to the big screen, the
photorealistic Zoey can engage in conversations with multiple participants at once, remember
people, access the internet to answer questions and more, paving the way for the next step in
the evolution of AI.

“With the potential of the metaverse and continued advancements in AI, the desire to interact
with autonomous humans face-to-face is becoming more important and part of modern life,”
said Daniel Seah, Global CEO of Digital Domain. “For decades, Digital Domain has been
pioneering digital human technology, and Zoey takes the concepts of virtual assistants like
Alexa and Siri several steps further, creating a digital helper you can truly interact with.”

Building on its proof-of-concept autonomous human, “Douglas,” Zoey is the result of years of research and development from the award-winning visual effects house, spearheaded by its internal Digital Humans Group. The physical appearance of Zoey is based on actress Zoey Moses, who worked with Digital Domain to create a comprehensive set of facial movements and mannerisms, as well as a full range of emotive expressions. Using that data and its proprietary facial animation tool “Charlatan” – recently seen in blockbuster films, including the Oscar-nominated Free Guy – artists were able to take the real footage of Moses and create a flexible digital face, capable of reacting in real time.

To give Zoey her personality, Digital Domain used multiple forms of machine learning to ensure
she could understand most questions and formulate proper replies, and then access the internet
(or stored data) for its answer. But rather than simply give a verbal or text response, Zoey can
fully emote. If a question is confusing, she will look perplexed; if she is told a joke, she’ll smile.
Throughout a conversation Zoey will also move and fidget in accordance with her answers, and
even get annoyed when interrupted. She can engage with participants, asking them questions
and opinions, and facial recognition software lets her recognize and remember people. Chatbots
can also be added, giving Zoey the potential to quickly assume the role of an established virtual
assistant for deployment in hospitality, the service industry and much more.

To provide Zoey with a robust and flexible voice system capable of more than just reciting scripted answers, Digital Domain employed AI-powered text-to-speech technology from WellSaid Labs. Using WellSaid Labs’ tools, Zoey can access a vast vocabulary of words and phrases, ensuring that no answer is beyond her capacity to vocalize. There are also multiple options to control the tone of her replies, including different levels of enthusiasm to reflect the goal of the interactions. A retail setting, for instance, might call for a more upbeat approach, while a business conversation would warrant a more level tone. Zoey can also expand her speech capabilities by incorporating multiple language packs, making her multilingual.

Once Zoey is “educated” and ready to interact with the real world, she can then be added to
multiple platforms, including custom systems designed by potential users, or game engines like Unity and Epic Games’ Unreal Engine. For today’s introductory presentation at FMX, Zoey can be seen running in real time in Unreal Engine 4.

“For the last 30 years, Digital Domain has been one of the world leaders behind some of the
most sophisticated and memorable visual effects ever seen, so it was a natural progression for
us to become a leader in digital and virtual humans and to create ‘Zoey,’ the world’s most
advanced autonomous human,” said John Fragomeni, global president of Digital Domain. “We
will continue to push the boundaries of visual effects in all fields and on any screen, as we look
for ways to offer people a better experience, regardless of the medium or delivery platform.”

Zoey will be available to license from Digital Domain in the near future.