Get a rare insight into the making of Nuke at Digital Domain, including a Nuke script for a classic Titanic VFX shot.
Many befores & afters readers may already know how Foundry’s node-based compositor Nuke originally came to be. It began life as an in-house compositing tool at Digital Domain in 1993, before being productized by Foundry in 2007. In fact, you can read previous befores & afters pieces about the history of Nuke here and here.
But, a recent session at FMX with Jonathan Egstad, who became a principal architect of the third-generation Nuke while at Digital Domain, shed new light on how Nuke came to be so ubiquitous in the visual effects industry.
Read on to discover how Egstad, in those early days of Digital Domain, built on Nuke’s original foundations to add a 3D system, 32-bit floating point color data and multiple channels. You’ll also learn about his current work at Foundry, where he’s involved in revamping the 3D architecture in the software.
How compositing was done originally at Digital Domain
Egstad worked at Digital Domain as an artist and supervisor at the time Nuke was born. While compositing on films like True Lies, Apollo 13, The Fifth Element and Titanic, he was right at the coalface as the then-new visual effects studio developed its digital compositing workflows. Egstad originally started at the studio, which was formed by Scott Ross, James Cameron and Stan Winston, as a video engineer in 1993. “I helped build the facility,” he says. “I installed all the video equipment, built the rooms, the editorial bays, all that kind of stuff.”

When Egstad began at Digital Domain, the studio mostly used Flame (from Discreet Logic, which was later acquired by Autodesk) for compositing. “Flame [used to] run on Silicon Graphics mainframes. We had eight SGI Onyxes running Flame. The Onyx mainframe was a full rack in height. Your phone probably has more graphics horsepower than the Onyx did, but at the time, it was the only machine that had enough real-time graphics performance to be able to do real-time compositing or that kind of image manipulation.”
“They were extremely expensive,” continues Egstad. “One machine was half a million dollars, and the software Flame at the time was a hundred and some thousand dollars. So, each machine was three quarters of a million dollars in the end. We had eight of them and that was a huge investment in money.”

At the time, only one person could use a Flame machine at once, and processing could only be done locally on the Flame; work could not be sent out to a renderfarm. “It was a very serial process,” notes Egstad. “That really constricted the number of shots you could work on simultaneously.”
Enter the idea of Nuke. Initially, Nuke was a command-line scripting system called ‘Nuka’, written by engineer Phil Beffrey. “It allowed what was fairly typical at the time: when the CG artists were rendering things and wanted to assemble them, they would run a script to assemble those pieces,” explains Egstad. “Generally it was A over B with maybe some simple color corrections or blurs, and that was about it. There was no UI, no graphical user interface. It was simply you typing commands into a terminal, setting up a script, and executing it to see if something came out the other end.”
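To give a flavor of that kind of script-driven assembly, here is a minimal sketch of a premultiplied ‘A over B’ with a simple gain, written in illustrative Python/NumPy. It is not Nuka’s actual syntax, and the element sizes and colors are invented.

```python
# Illustrative sketch only: a script-style "A over B" composite in the spirit
# of early command-line tools like Nuka. Not Nuka's actual syntax.
import numpy as np

def over(a_rgb, a_alpha, b_rgb):
    # Porter-Duff "over" with a premultiplied foreground A:
    # result = A + B * (1 - alpha_A)
    return a_rgb + b_rgb * (1.0 - a_alpha)

# Synthetic stand-ins for rendered elements: H x W x 3 color, H x W x 1 alpha.
fg = np.zeros((480, 640, 3), np.float32)
fg[100:300, 100:300] = [0.8, 0.2, 0.1]
fg_a = np.zeros((480, 640, 1), np.float32)
fg_a[100:300, 100:300] = 1.0
bg = np.full((480, 640, 3), 0.3, np.float32)

comp = over(fg * 1.1, fg_a, bg)  # a simple gain on A, then A over B
```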

At some point, the tool became known as Nuke. There has been much conjecture over the years about how it got that name. Some say it was short for ‘New compositor,’ while others have mentioned it was an intentional reference to being larger or better than ‘Flame’. Egstad explains what he recalls as the origins of the name ‘Nuke’. “At the time, we were using Flame heavily, and so we had a relationship with Discreet Logic. Apparently there was a meeting that did not go well with Discreet Logic, and it really put the fire into some of the people at Digital Domain to come up with another solution. So that was really the genesis of why we got to write our own compositor as a tool. From that meeting, a bunch of riled up young guys were like, ‘What are we going to call it? It’s got to be bigger than Flame, bigger than Inferno! And because the term Nuka had already been used, it was like ‘Nuke! Nuke! What’s bigger than a Nuke?’”
Version 2 of Nuke implemented a UI and built out the basics: compositing, color corrections, blurs, assembly and transforms. “Bill Spitzak was the primary engineer behind Nuke 2 and Nuke 3, and really built the main engine,” says Egstad. “He actually went to USC [and MIT], so he had an artistic background as well as being an amazing engineer. One of the key things that made it successful was that he was able to take that artistic ability and apply it.”

A major advancement in these earliest versions of Nuke was that it became able to transform an image effectively in 3D. “It was able to rotate that image and put it in perspective,” outlines Egstad. “We used that a lot to do what are called pan-and-tile setups. We would take still photography from a single position, and we would then position tiles of those images in what would be the correct rotations. Then we could reconstruct backgrounds so we could pan around this virtual environment. That was used heavily on early shows like Apollo 13, Titanic and The Fifth Element.”
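As a rough illustration of the pan-and-tile idea, and not the actual Nuke 2 node setup, here is a small NumPy sketch: each photographic tile is treated as a card sitting one unit in front of the nodal point, rotated by the pan and tilt the camera had for that exposure. The angles are invented values.

```python
# Hypothetical sketch of pan-and-tile placement: orient a card per photo
# using the pan/tilt of the camera at the moment of exposure.
import numpy as np

def rot_y(deg):
    # Pan (yaw) rotation about the vertical axis.
    r = np.radians(deg)
    return np.array([[ np.cos(r), 0, np.sin(r)],
                     [ 0,         1, 0        ],
                     [-np.sin(r), 0, np.cos(r)]])

def rot_x(deg):
    # Tilt (pitch) rotation about the horizontal axis.
    r = np.radians(deg)
    return np.array([[1, 0,          0         ],
                     [0, np.cos(r), -np.sin(r)],
                     [0, np.sin(r),  np.cos(r)]])

# Corners of a unit card one unit in front of the nodal point (as columns).
card = np.array([[-0.5, -0.5, 1], [0.5, -0.5, 1],
                 [0.5, 0.5, 1], [-0.5, 0.5, 1]]).T

# Each tile gets the rotation its photo was shot with (made-up angles).
for pan, tilt in [(0, 0), (30, 0), (60, 0), (30, 20)]:
    corners = rot_y(pan) @ rot_x(tilt) @ card  # 3D positions for this tile
```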
Nuke’s relationship to Flame at this time also included a tool called ‘flame2nuke’, which converted Flame’s Action, Keyer and ColorCorrector modules into Nuke 2 scripts. This enabled farm rendering, since a Flame’s disk arrays could be accessed remotely from the farm CPUs.
The impact of Titanic on Nuke
Digital Domain’s largest project then arrived: James Cameron’s Titanic, released in 1997. Many shots of the ocean liner about to set sail, or on the ocean, required the combination of hundreds of different elements—live-action, miniatures, greenscreen crowds, water, sky, birds, various CG renders and more. One particular shot, TT18, moves from the present-day wreck of the Titanic to the Titanic about to depart from Southampton. Egstad was responsible for compositing that shot (see the Nuke script for it below) in Nuke 2.

Titanic was, of course, a massive success and won the Oscar for Best Visual Effects. It also became the impetus at Digital Domain to re-think the Nuke architecture. One challenge was that there were only four channels in Nuke to work with. Another key frustration was that there was, at the time, no way of working in 3D space inside Nuke, as Egstad describes.
“If you look at the greenscreen sections of the Titanic Nuke script, they go into these little transform nodes and there’s a transform and a camera node, and each of the camera nodes essentially is the shot camera. So there, I’m trying to shoot through the shot camera, and I have to transform the card. But, there was no 3D system, so this was all done through coordinates. You had to visualize in your head where the camera was in 3D space, and then place things.”

“We knew where some of the things were because we had to construct objects, but there was no way you could pull out and see the scene,” adds Egstad. “So if the card didn’t render, you just got a black screen and you didn’t know if the card was behind you, above you, below you. In your head you had to keep the coordinates straight. That was a pain. There’s a lot of [node] clones going on because that same camera had to be used over and over and over and over and over and over again for all of these.”
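Egstad’s black-screen point is easy to make concrete with a toy perspective projection (illustrative Python, not Nuke code): a card placed by raw coordinates only shows up if its corners land in front of the camera, and without a 3D viewer there is no feedback when they don’t.

```python
# Toy pinhole projection: points behind the camera never reach the screen,
# so a mis-placed card simply renders as a black frame.
import numpy as np

def project(points_cam, focal=1.0):
    # Keep only points in front of the lens (z > 0), then divide by depth.
    visible = points_cam[:, 2] > 1e-6
    pts = points_cam[visible]
    return pts[:, :2] * (focal / pts[:, 2:3])

card_in_front = np.array([[-1, -1, 5], [1, -1, 5],
                          [1, 1, 5], [-1, 1, 5]], float)
card_behind = card_in_front * [1, 1, -1]  # same card, behind the camera

print(project(card_in_front))  # four screen-space corner positions
print(project(card_behind))    # empty array -> a black frame, no feedback
```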
Egstad also notes that the team was finding Nuke 2 would operate very slowly due to some challenges with cache management. “With a script as complex as [the one on Titanic], it could take 30 seconds for Nuke to change frames. There was a fundamental bug in the way that Nuke was evaluating that graph. It got to a point where it was really affecting our day-to-day.”

To find a solution, Egstad went to Spitzak, who was working as a compositor on Titanic at the time, for advice on how to fix it. “Bill said, ‘Well, here’s the source code. If you can figure out what the bug is, I’ll put it in.’ And so I did,” recalls Egstad. “I had enough software experience using Modula-2 and BASIC, so I knew how to do print statements. I put print statements everywhere. I found the loop that was causing the problem and fixed it. To me it was like a drug. It was like, oh my gosh, I was able to change the tool that I work on for the better and actually change how it functioned. That’s what gave me the enthusiasm and hope to do the new structure and the new design.”
A new architecture
Egstad then set out to help the small existing Nuke team craft an architectural re-design of Nuke, essentially incorporating an interactive 3D environment inside the software along with multi-channel support. Work began on designing and writing Viewer code before the entire 3D system had even been imagined. The intention was to allow the user to visualize 3D space from outside the ‘camera’, and also to add a multi-channel workflow and full floating point operation to Nuke.

For the re-design, Egstad took inspiration from other tools, including Flame and Side FX’s Houdini (Digital Domain had been an early and heavy adopter of Prisms and then Houdini). Egstad and fellow DD compositing supervisor Carey Villegas also evaluated Nothing Real’s Shake, which was the only other major node-based compositor available at the time. “While it had some tools that we didn’t have in Nuke, it was also a lateral move. It didn’t give us more than four channels. It didn’t give us a 3D system. It didn’t give us a full floating point workflow. We were wanting to step up, not step to the side.”
“That was really what caused us to work on the new architecture,” says Egstad, “taking inspiration from the way Flame worked, taking inspiration from the way Houdini worked and using those as the basis for this new workflow that we were trying to do for Nuke 3.”

Ultimately, three ‘killer’ features went into the release of Nuke 3: 32-bit floating point color data, a 3D system and multiple (64) channels. Other components included a new scanline image engine and a new UI and Viewer. Egstad notes that creating the 3D system involved not just the Viewer, but also geometry processing, rendering, importing and translating. As noted, existing tools and workflows were jumping-off points. For example, Egstad says the reason the 3D system has round nodes is inspiration from Flame. “I love the way Flame allowed you to easily do parenting hierarchies. It was really easy as a non-3D artist to think about that.”
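A loose sketch of what the multi-channel and floating point ideas buy you, in illustrative Python/NumPy rather than anything from the Nuke codebase: a frame becomes a set of named 32-bit float channel planes, and a scanline-style engine pulls one row of the requested channels at a time instead of whole frames. The channel names here are illustrative.

```python
# Hypothetical multi-channel float image: named float32 planes rather than
# four fixed channels, so depth, motion vectors, mattes, etc. travel with RGBA.
import numpy as np

H, W = 480, 640
image = {
    "rgba.red":   np.zeros((H, W), np.float32),
    "rgba.green": np.zeros((H, W), np.float32),
    "rgba.blue":  np.zeros((H, W), np.float32),
    "rgba.alpha": np.zeros((H, W), np.float32),
    "depth.z":    np.full((H, W), 1e6, np.float32),  # values far beyond 1.0
    "motion.u":   np.zeros((H, W), np.float32),
    "motion.v":   np.zeros((H, W), np.float32),
}

def scanline(img, y, channels):
    # A scanline-style engine requests one row of specific channels at a time.
    return {c: img[c][y] for c in channels}

row = scanline(image, 240, ["rgba.red", "depth.z"])
```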
Simon Wells’ The Time Machine, released in 2002, became the first project to demonstrate the new capabilities of Nuke. “That was the first time we had done full floating point, high-dynamic range compositing, except for one shot,” states Egstad. “One shot was left over from Nuke 2, and it can be identified by its clipped highlights from the lack of a linear-light workflow.”
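That clipped-highlights tell is simple to demonstrate with made-up numbers (not data from the film): once values are clamped at 1.0, any filtering that follows averages the energy away, while linear-light floating point keeps super-white values hot.

```python
# Toy example: a blur is a local mean, so clamping before filtering dulls
# highlights, while linear-light float keeps them above 1.0.
import numpy as np

highlight = np.array([0.2, 4.0, 0.2], np.float32)  # a hot specular sample

print(np.clip(highlight, 0.0, 1.0).mean())  # ~0.47 -> clipped, dull highlight
print(highlight.mean())                     # ~1.47 -> stays hot through a blur
```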
A big change
In 2002, Digital Domain started D2 Software with the aim of developing and selling Nuke and other software. Suddenly, Nuke went from being a piece of software internal to Digital Domain to being available to other studios. “When we went and designed the new architecture in Nuke 3, we did it with an eye towards selling it,” outlines Egstad. “We architected it so that it would be easier to take it out of Digital Domain and sell it as a separate product. D2 Software spun up to manage selling it as a commercial product, but Digital Domain was not a software company. [The challenge was], we were selling to our competitors. We were Digital Domain and we were competing with ILM and Sony. So there was a certain resistance to our competitors buying a product from one of their major competitors.”

“There were early adopters, however, mostly in Europe, who were brave,” notes Egstad. “I remember getting an email from someone, I think in Sweden or in Denmark, asking about the 3D system. This would’ve been around 2003. I was shocked that somebody outside the company was actually using this tool. It was like, ‘Oh my God. Now when I change something, I have to think about other people outside the company.’”
Another change occurred in 2007 when Foundry took over the productization of Nuke from D2 Software, brought about by Digital Domain effectively purchasing Foundry (within two years, Foundry returned to being an independent company).
Around this time, Egstad moved on from Digital Domain to ImageMovers Digital, before transitioning to DreamWorks Animation, where he helped the studio bring in Nuke as a third-party product to replace its in-house compositing tool.
In 2020, Egstad began work at Foundry as Senior Product Innovation Manager. “I’m now working on the new 3D system,” he relates. “I had written the original 3D system in the early 2000s, and by the time I left Digital Domain, it was already showing its age, and this was before Alembic and before USD. It was all designed for a whole different paradigm. The whole 2000s and 2010s went by with not a lot of change to the 3D system. It really needed to be completely updated to work with modern file formats like USD. So that’s been my primary project. And then on the side I’ve been working to bring in more multi-shot functionality, primarily for feature animation, but it turns out that a lot of visual effects companies are now doing more multi-shot type of work, too.”
Reflecting on Nuke
Egstad has received two Academy Scientific and Engineering Awards in relation to Nuke. The first was in 2002 (Technical Achievement Award), shared with Bill Spitzak, Paul Van Camp and Price Pethel for “their pioneering effort on the NUKE-2D Compositing Software.” The second was in 2018 (Scientific and Engineering Award) “for the visionary design, development and stewardship of the Nuke compositing system,” shared with Bill Spitzak.

Looking back at the earliest development on Nuke, while also witnessing the large-scale adoption of the toolset in the visual effects community, Egstad is proud of what he and others achieved at Digital Domain, especially since Nuke was born out of the needs of a busy production environment. “What a lot of people don’t realize is that because Nuke was an in-house production tool, a lot of the design of Nuke was just happenstance. Some things that may look like a bug are actually a feature and some things that look like a feature are actually a bug. There’s a good example in the 3D system where you take a card and then you split it into two and you transform it differently, then you merge it back together again and you get two cards. That was never supposed to happen. That’s actually a bug. But people started using that in production as a feature. It’s very useful, but that was never intended to work that way.”
“People just assume that somebody actually spent time designing it,” adds Egstad. “[For example], the UI back in the day was very, very simplistic and very, very clean. And that was because we didn’t want Bill Spitzak and the other engineers to spend any time working to make the UI nice. It was a tool to get work done. It was not supposed to look nice. When Foundry took it over, they made a really nice update of the UI to make it look the way it is now. But we were never interested in that as artists. We were concerned with getting our work done efficiently and fast and having a fast tool.”
Thank you to Foundry, Digital Domain, FMX and of course Jonathan Egstad for making this session at FMX 2025 possible.