Podcasts

VFX Firsts: What was the first film to use a digital composite?

And did it involve a dolphin…

Most people will know that something BIG happened in visual effects in the mid-to-late 80s and early 90s, with the shift to digital VFX techniques. A lot tends to be made of the use of ‘CGI’, but just as crucial was the move to digital compositing.

So, which film was the first to use digital compositing? Well…

To help answer that question, check out the latest VFX Firsts podcast, featuring expert guest Laura J. Hill, now a visual effects supervisor at Crafty Apes VFX, with significant experience in compositing.

This week’s hosts: Laura J. Hill and Ian Failes.

There are major spoilers in the show notes below, but you can listen first to the podcast on Apple Podcasts or Spotify. It’s also embedded directly below. Also, here’s the RSS feed.


1. Young Sherlock Holmes (1985)

Director: Barry Levinson
Relevant sequence: The Stained Glass Knight
VFX: Industrial Light & Magic

2. The Fruit Machine (1988)

Director: Philip Saville
Relevant sequence: Dolphin transition
VFX: CFC

BONUS: See the detailed notes from former CFC artist Paddy Eason about the early days of digital compositing at the studio, at the end of this post.



Image from Richard Rickitt’s ‘Special Effects: The History and Technique’.

3. Indiana Jones and the Last Crusade (1989)

Director: Steven Spielberg
Relevant sequence: Death of Donovan
VFX: Industrial Light & Magic

Note: A single shot in The Abyss (also released in 1989, but after The Last Crusade) made use of digital compositing by ILM as well. It was the shot where the door closes on the pseudopod, an ILM CG creation, which crashes to the floor in a watery splash.

4. Where to learn more about digital compositing

Alvy Ray Smith on the alpha and the history of digital compositing

Ron Brinkmann’s The Art and Science of Digital Compositing

Eran Dinur’s The Filmmaker’s Guide to Visual Effects: The Art and Techniques of VFX for Directors, Producers, Editors and Cinematographers

BONUS: notes from former CFC artist Paddy Eason

Eason shared these notes about CFC’s early digital compositing solutions. He joined not that long after The Fruit Machine was released.



CFC, when I joined in 1990, was very much based around the film scanner. It was CFC’s USP, raison d’être and chief weapon. It often felt, in the early days, that VFX work was only carried out in order to earn money to fund scanner R&D!

The company had been formed with the aim of using digital imaging techniques on film-originated material. ‘Video’ was a dirty word. The scanner wasn’t even called a film scanner – it was ‘the Input Device’. It seemed as if this might be a computer nerd joke – with the scanner being treated as a thing along the lines of other input devices such as keyboards, tablets and mice.

The whole company was based in a small basement office under Berwick St (after an early pure R&D phase in a shed in Shepherd’s Bush). The scanner at Berwick St was in its own glass-walled room, with sticky rubber floor and positive-pressure air conditioning to keep the dust out. You had to don a clean ‘forensics’ suit to go in. The scanner was very much a Heath Robinson/Rube Goldberg affair, comprising light source, filters, film transport and pin-registered gate, stepper motors and computer control, and it took up the entire room. I recall one vital adjustment in the scanner being carried out by tapping a wedge-shaped metal pencil sharpener into a gap. One of the company’s founders, Mike Boudry, was an optical physicist, and so the scanner was very much his baby. I recall conversations about which parts were special, and why – though of course most of it went over my early-20s head.

The scanner didn’t scan negative – at least, not for the early years. An interpositive was made prior to the scanning. I recall the request to the film lab was that the interpos was ordered ‘3 points light’ (this may have varied from job to job). Interpos, normally used as an intermediate step towards the creation of a duplicate negative, was for some reason the chosen medium for CFC’s scanning. Some of this might have been safety – if the original neg was going to be damaged, better it happen at the lab than at CFC! But also a factor might have been that interpos has a smaller density range than original neg, making it easier to scan.

In the early years, CFC’s images were 1280×960 (‘hi res’) for the 35mm neg area. The scanner could be reconfigured for scanning the whole neg area, or the ‘academy’ area, not including the soundtrack area. The resolution was later increased to 2560×1920, called ‘x res’. In the mid-to-late 90s, a 5120×3840 65mm scanning capability was created, as used on Bertolucci’s Little Buddha and several IMAX projects.

Early 90s images at CFC were 8-bit linear, with a special gamma designed to maximize the use of the limited precision. Images were gradable, but not by much. Logarithmic encoding was introduced after Kodak/Cinesite announced their log Cineon image format (CFC’s colour scientists were initially sceptical – ‘They haven’t used enough precision for the highlights’ – but they came around). CFC designed their own 8-bit log, which was a great improvement on the previous system.
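
To make the linear-versus-log trade-off concrete, here is a minimal numerical sketch in Python. The curve below is invented for illustration (the assumed black point, white point and shape are not CFC’s or Kodak’s actual parameters); it simply shows why spacing 8-bit code values evenly in exposure preserves shadow detail that a straight linear encoding throws away.

```python
import numpy as np

def encode_linear_8bit(light):
    """Naive linear quantisation of scene-linear light into 8 bits."""
    return np.clip(np.round(light * 255), 0, 255).astype(np.uint8)

def encode_log_8bit(light, black=0.001, white=1.0):
    """Hypothetical log quantisation: code values are spaced evenly in
    exposure (stops) between an assumed black point and white point."""
    light = np.clip(light, black, white)
    return np.round(255 * np.log(light / black) / np.log(white / black)).astype(np.uint8)

# Two shadow values exactly one stop apart:
shadows = np.array([0.002, 0.004])
print(encode_linear_8bit(shadows))  # -> [1 1]   the whole stop collapses to one code
print(encode_log_8bit(shadows))     # -> [26 51] the stop survives quantisation
```

Cineon itself used 10-bit log code values, and CFC’s own 8-bit curve will have differed in its details, but the principle is the same: spend the scarce code values where the film actually carries information.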



The light source in the scanner was quite intense, and I recall that it was described as ‘highly specular’ – which helped reduce the visibility of dust for some reason. The (hot) white bulb shone through dichroic filters – successively, red, green and blue. The whole system was controlled by a series of Unix scripts, running off a very basic laptop with a tiny screen. Once film reels were loaded onto the scanner, and threaded through the film gate, the scanner could be set in motion. It was slow! The process of scanning long shots could take hours and was quite a taxing responsibility for the (mostly very young) operators (Ros Lowrie and Pete Hanson come to mind). The heart of the scanner was a little surprising – it was an off-the-shelf document scanner, designed to make monochrome copies of paper documents. I believe it was an area array. This had been highly modified by CFC’s boffins (eggheads) to create full-colour digital scans of 35mm film.

The original input device was later redesigned as a very beautiful system that became commercially available as the ‘Northlight’ scanner, designed by Theo Brown. It was housed in a huge slab of marble to aid optical stability (and it also looked very cool).

The digital images created by the scanner travelled to CFC’s other secret weapon – the GIPs. GIP stood for Graphic Image Processor. These, again, were modified versions of commercially available image processing boards, normally used (I believe) in the processing of high-resolution (for those days) satellite images by the military, in air traffic control, medical imaging and other high-end applications. I believe its full name was the Dupont Pixel (Benchmark Technologies) GIP.

CFC had two of these GIP systems when I joined, the ‘Production System’ and the ‘Development System’. When a GIP wasn’t being used for scanning, it would be used by a digital artist (called the ‘operator’) for designing VFX shots, or to drive the film recorder (the ‘Output Device’). Another couple of GIPs were added over the years, but there were never very many! They were housed in another air-conditioned room, with a sliding glass front for easy access – with its own freon-based fire extinguishing system. The GIPs were waist-height racks of densely packed boards. They would occasionally blow a capacitor, resulting in black gunk spraying out. Each GIP had 4 similar boards – one each for Red, Green, Blue and Alpha. The GIPs, though digital devices, had something of an analog feel – when they were overheating or otherwise unhappy, you would see the effects in the images – scattered green dots, which we called ‘GIP fungus’. One of the key advantages of the GIPs over more ‘general purpose’ computers was said to be memory bandwidth, which I believe was 500MB a second. The GIPs were basically big image stores that allowed efficient coding of image processing at the microcode level.

Much of a CFC digital artist’s life was spent managing disks. Disks constantly shuttled between the GIPs and long-term image storage (‘backup’). The backups were done to Exabyte (Hi8 video) tapes, a process managed via a basic Windows PC, painfully slowly. In 1990, the biggest disks in use at CFC were 650MB ‘Winchester Disks’, which were roughly the size and weight of a house brick. The company only had a handful of these. 1GB disks came soon after. The disks were connected to the GIPs via SCSI cables – one disk each for R, G, B and A. The management of data from scanner to GIP, to disk, to Exabyte and finally to the film recorder was very much CFC’s business, and, given the extreme slowness of each process, efficient scheduling by coordinators and producers was key.

The actual VFX design system at CFC was, as we have seen, based on the GIP, which displayed film images on big (neurotically calibrated) CRT monitors. The artist interacted with the GIP via a small ‘Unix host’ which controlled the actual image processing boards. The Unix host was very basic, running on an 8086 chip. The operator used a keyboard to type commands, as displayed on a WYSE text terminal. There was no mouse. This command-line input caused the GIP to carry out graphics commands. ‘Begin’ would reset the GIP. ‘Getim’ would load an image off SCSI disk, into the GIP image boards and into the CRT display. ‘Putim’ would write to disk. Commands often required several parameters. An operator could experiment with how to create the effects he or she required by typing successive commands, and viewing the accumulating results interactively on screen. These commands could then be combined (using a simple text editor such as Micro Emacs) into a Unix shell script. This would be a loop script that would run once for each frame in a shot, and sometimes quite complex programming work was required by artists. The primitive WYSE text terminals were replaced in the early 90s by HP terminals running the HP-UX GUI.
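
To give a flavour of the ‘loop script’ pattern Eason describes: the real thing was a Unix shell script issuing GIP commands, but the shape translates directly into a few lines of Python. The stub functions below only print what each step would do; the command names ‘begin’, ‘getim’ and ‘putim’ come from Eason’s description, while their arguments, the ‘comp’ step and the frame range are invented for illustration.

```python
# Toy recreation of a per-frame "loop script". The originals were Unix shell
# scripts sending commands to the GIP; these stubs only print what each step
# would do, and the arguments are invented for illustration.

def begin():
    print("begin: reset the GIP")

def getim(name):
    print(f"getim {name}: load image from SCSI disk into the GIP image store")

def comp(fg, bg):
    print(f"comp {fg} over {bg}: hypothetical composite operation")

def putim(name):
    print(f"putim {name}: write the result back to disk")

FIRST_FRAME, LAST_FRAME = 1, 120   # assumed frame range for the shot

for frame in range(FIRST_FRAME, LAST_FRAME + 1):
    begin()
    getim(f"fg.{frame:04d}")
    getim(f"bg.{frame:04d}")
    comp(f"fg.{frame:04d}", f"bg.{frame:04d}")
    putim(f"comp.{frame:04d}")
```

The point is less the syntax than the workflow: experiment interactively until the recipe looks right, then wrap it in a per-frame loop and let it grind through the whole shot.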



As well as being able to control the GIPs via command line or shell scripts, operators were able to use a pen and tablet paint system. This offered a simple set of menus, visible on the image CRT, floating over the film image. The tablets were huge, perhaps 45 cm across, and the pens were pressure sensitive. A series of paint strokes (including cloning) could be recorded (‘journalled’) and replayed from within a shell script. The paint strokes could be observed on screen, as if a ghostly hidden hand were operating. As well as the pen and tablet, the system also offered image viewing control via a heavy trackball and button box. The trackball (like a big upturned mouse) allowed for roaming around an image and adjusting clone offsets, while the buttons allowed for zooming in and out, and flicking between the colour channels and alpha. The pen and tablet and trackball/button box system was extremely fast and interactive, and superior to any paint system I have used since.

The range of image processing operations available to the GIP operator would probably seem quite primitive to a Nuke user nowadays, but the basics were all there. Blurs and medians, color grading, sharpening, image transforms including warps, excellent keying operations (CFC’s ‘QMatte’ bluescreen algorithm forms the basis of today’s Keylight in Nuke), automated motion tracking, and so on. Creative uses of shell scripting allowed for quite complex work – for example, 2D particle systems or water refraction effects.
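
For a sense of what a keying operation of that era boiled down to, here is a plain, textbook colour-difference bluescreen key in Python. This is emphatically not CFC’s QMatte (whose actual maths isn’t described here), just the generic idea such keyers share: derive the alpha from how far blue rises above the other channels, then suppress the leftover blue spill.

```python
import numpy as np

def colour_difference_key(rgb):
    """Generic colour-difference bluescreen key (not CFC's QMatte).

    rgb: float array of shape (H, W, 3), values in the 0..1 range.
    Returns (despilled foreground, alpha matte)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Alpha: opaque where blue does not dominate, transparent where it does.
    alpha = 1.0 - np.clip(b - np.maximum(r, g), 0.0, 1.0)
    # Simple despill: pull stray blue down to the level of the other channels.
    despilled = rgb.copy()
    despilled[..., 2] = np.minimum(b, np.maximum(r, g))
    return despilled, alpha

# A strongly blue pixel keys out almost entirely; a neutral one stays opaque.
test = np.array([[[0.1, 0.2, 0.9], [0.5, 0.4, 0.4]]])
fg, alpha = colour_difference_key(test)
print(alpha)  # -> [[0.3 1. ]]
```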

Once an operator had built a loop script for a shot, he or she would run it as a render – often taking many hours, or even overnight. A key innovation was that early tests could be run at half, quarter or even sixteenth res. Renders would often be run one after the other, all night, a process overseen by the night shift of ‘baby sitters’ – key members of CFC staff who would also move disks around as required, run backups, and so on.

The end result – a series of rendered frames on disk – could only be viewed in motion via a quite primitive flipbooking system on the GIP. Only a few seconds at a time could be run. Shots could only really be checked by shooting to film…

The final piece in the CFC puzzle was the film recorder/output device. This also had its own clean room, with blackout capability (for loading film). The output device was laid out along a railed optical bench. A totally traditional – indeed vintage – 35mm film camera was at one end. This was a Mitchell, employed for the excellence of its pin-registered gate. A look at the serial number of CFC’s Mitchell showed that it was originally owned by Hollywood star Mary Pickford! At some point in the mid-90s, the Mitchell’s lens became a custom-made Cooke, commissioned by CFC for this specialised application. Next in line was a set of coloured filters – R, G and B, set into a little motor-controlled filter wheel that was, like the whole system, controlled by the Unix host. The images were presented to the camera on a CRT. In 1990 this was still a fairly standard 1280×960 colour CRT (similar to those used by the operators). A few years later this was replaced by a small, highly specialised monochrome CRT. This only showed the image one line at a time, in white, and looked like a tiny sparkling ray scanning down the screen. The full-colour image was built up via the R, G and B filters.

Once the film recording had been completed (often by the night shift), the negative was canned up, labelled and run down to the collection desk for the film lab, where it was whisked off for the ‘night bath’. In the morning the print was delivered. In 1990, CFC had an old Moviola film viewer (actually, an Acmeola), which was soon replaced by a Steenbeck.



‘Dailies’ could also be shown on a projector, initially housed in a rather small room downstairs (with an ingenious ‘loop box’ designed by Theo Brown), and later upstairs in a proper preview theatre.

I joined just after The Fruit Machine, but I remember the shots well – they were in the showreel and on brochures etc at the time (no websites then!).

The VFX artists at that time were (apart from the boffins such as Mike Boudry and Wolfgang Lempp, who would occasionally get involved in shots) Janek Sirrs and Val Wardlaw. Nick Brooks joined very shortly before I did, so I guess that makes me CFC artist #4.

Finally, did we get it wrong? Is there a different film you think was the first to use a digital composite? Let us know in the comments below. Thanks for listening!




Join the discussion

  1. Richard Green

    I vividly remember visiting CFC around the time Paddy discusses. Pretty sure I spoke with Mike Boudry, who was desperately searching London for people who could use Photoshop! ‘I would have so much work for them if I could find anyone’ – I talked myself out of it as I felt too inexperienced. FWIW I was shown a looping shot from Memphis Belle of a gunner being sucked out of the nose of a damaged bomber, which I recall them declaring as the first digital composite, but that is post-Fruit Machine.

  2. Craig Barron

    First digital composite I’m aware of predates your examples. It was for a wire removal shot on “Howard the Duck” (1986). The negative was scanned to digital, wires painted out, and recorded back out to film. I remember it because the shot could not be completed with traditional optical effects, and it was a unique solution at the time. You could ask Dennis Muren to confirm my memory – he would know for sure.

    • Ian Failes

      Thanks Craig, yes you are of course right about this. There is a whole episode coming up about that wire removal work with incredible inside information. (Can’t wait for people to hear it)

  3. Craig Barron

    I remember the early introductions to digital effects at ILM at the time. I was fascinated with the technology.
