C1&C3: Visual Effects


Visual effects or special effects?

People often confuse two things: Visual Effects and Special Effects.

In the industry, special effects are real, physical things: a stunt car pile-up on set, a carefully rigged fire, or blood capsules popping under someone’s shirt. They are generally done on set and are not digital, and they are sometimes referred to as practical effects. Before the advent of digital technology, everything was a special effect, which is why older people in particular tend to refer to anything of that nature in a film as ‘Special Effects’.

Visual effects, however, are far wider in scope. They are not just ‘special’ (like digital versions of a stunt car pile-up, a fire or blood emerging from a bullet wound) but often quite mundane, like digitally removing an electricity pylon from a period drama, or getting rid of a boom microphone that strayed into shot. These days VFX (as we’ll refer to visual effects from now on, for convenience) are always digital interventions rather than something in front of the lens: the integration of camera footage with imagery generated elsewhere. VFX are things done to the footage LATER.

TERMINOLOGY

VFX and Post Production

Post Production refers to everything that happens after the footage has been shot - sound design, titles and editing, for instance. VFX used to sit squarely in that bracket too: something you created and added to the footage after the camera crew had disbanded and everything was ‘in the can’.

Some of the work still happens that way, but most productions now plan their VFX from the pre-production stage, and many will have a VFX Supervisor on set to make sure the green screen is correctly set up, to collect textures, and to take measurements and record information about the camera lens and lighting. This helps fool the eye and makes the VFX integrate with the footage. It can also save a lot of money.

Compositing


Compositing is an old term, and comes from the early days of photography, when artists started to experiment by layering prints, double exposing pictures and building ‘composite’ pictures. Today in VFX a composite or ‘comp’ is any image which has at least two layers of combined imagery. Some comps in Hollywood films can be made of hundreds of elements. As another historical throwback, today’s VFX artists will refer to a single film image as a ‘Plate’ which alludes to photographic plates. It’s surprising how these pre-digital terms have survived even in today’s cutting edge practice.
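To make the idea concrete, here is a minimal sketch of the classic ‘over’ operation that stacks one layer on top of another, written in Python with NumPy. It isn’t how HitFilm or any particular package does it internally, and the comp_over name and toy arrays are purely illustrative.

```python
import numpy as np

def comp_over(foreground, alpha, background):
    """Standard 'over' composite: layer a foreground element on top of a
    background plate using the foreground's alpha (transparency) channel.

    foreground, background: float arrays of shape (height, width, 3), values 0-1
    alpha: float array of shape (height, width, 1), where 1 = fully opaque
    """
    return foreground * alpha + background * (1.0 - alpha)

# Toy example: a half-transparent red square layered over a black 4x4 plate.
background = np.zeros((4, 4, 3))            # black background plate
foreground = np.zeros((4, 4, 3))
foreground[1:3, 1:3] = [1.0, 0.0, 0.0]      # red square element
alpha = np.zeros((4, 4, 1))
alpha[1:3, 1:3] = 0.5                       # 50% opacity

comp = comp_over(foreground, alpha, background)
print(comp[2, 2])                           # [0.5 0.  0. ] - red at half strength
```

A Hollywood comp made of hundreds of elements is, at heart, many of these ‘over’ operations (plus colour work, blur and so on) applied in sequence.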


Green Screen and Blue Screen

Because human flesh tones, whatever our colour or ethnicity, sit towards the red end of the spectrum, blue and green are excellent colours to shoot against when we want to automatically separate a subject from the background. This separation is called ‘keying’.
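As a rough illustration of what a keyer does, the sketch below (Python with NumPy again; the green_screen_matte name and threshold value are made up for this example) marks a pixel as backdrop when its green value clearly dominates its red and blue values. Real keyers in packages like HitFilm are far more sophisticated, handling soft edges, green spill and motion blur.

```python
import numpy as np

def green_screen_matte(image, threshold=0.15):
    """Very crude green-screen key.

    image: float array of shape (height, width, 3), RGB values in 0-1
    Returns an alpha matte (height, width, 1): 1 = keep (subject), 0 = key out.
    """
    r, g, b = image[..., 0], image[..., 1], image[..., 2]
    greenness = g - np.maximum(r, b)               # how strongly green dominates
    matte = (greenness < threshold).astype(float)  # subject where green does NOT dominate
    return matte[..., np.newaxis]

# A pure green backdrop pixel is keyed out; a reddish skin tone is kept.
pixels = np.array([[[0.0, 1.0, 0.0], [0.8, 0.6, 0.5]]])
print(green_screen_matte(pixels)[0, :, 0])         # [0. 1.]
```

Because flesh tones are never strongly green, the subject survives the key while the backdrop drops away - which is exactly why green and blue were chosen in the first place.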


Tracking

Tracking is about digitally attaching an object or image to some other moving footage. Examples might be putting a new number plate on footage of a car on the road, or adding dirt to a building facade. 

CGI

CGI stands for Computer Generated Imagery. VFX artists don’t just composite film elements; they’ll often composite CGI material too – sometimes complex work like Godzilla (in the later films), sometimes much less noticeable things like a car or a particular kind of building that wasn’t available to be filmed. When people complain of too much CGI in modern films, it’s usually the fantasy spectacles they are talking about; well-executed CGI elements shouldn’t really be noticed at all.


A little bit of history
We consume images all the time and often don’t question them.

Ideally, that’s how we want audiences to treat visual effects too: to accept them without a second thought.


Take the Flare shot you are going to create in the Flare Project: making the flare move convincingly in sync with the actor moving the stick would be hard to achieve without tracking - you would need to approximate its position frame by frame, and the result would look very wobbly.

Historically this effect would have been done by re-filming shots, or ‘plates’, frame by frame under a film rostrum camera. Back in celluloid days, tracking was done by hand: someone would (in our case) have had to reposition an image of the flare onto the stick frame by frame under the rostrum camera. It was never wholly convincing, because our eyes are hyper-critical of every wobble or misalignment on film. It was only when computers allowed us to track pixels that smooth and convincing attachments became really possible.

In today’s digital world we let the computer do the donkey work: the tracking software looks for the same shape, or a similar group of pixels, in every ensuing frame. In the Flare Project, it’s the group of pixels you chose on the end of the stick to which you attach the flare.
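For the curious, here is a deliberately simple sketch of that idea in Python with NumPy: it follows a small group of pixels from one frame to the next by brute-force comparison, keeping the position in the new frame where the pixel values differ least. The track_patch name and parameters are illustrative, and production trackers are vastly more refined, but the principle - find the group of pixels that best matches the reference - is the same.

```python
import numpy as np

def track_patch(prev_frame, next_frame, top, left, size=15, search=10):
    """Follow a small patch of pixels from one frame to the next.

    prev_frame, next_frame: 2D float arrays (greyscale frames)
    (top, left): corner of the tracked patch in prev_frame
    size: patch size in pixels; search: how far to look in each direction
    Returns the (top, left) of the best match in next_frame.
    """
    patch = prev_frame[top:top + size, left:left + size]
    best_pos, best_err = (top, left), np.inf
    for dy in range(-search, search + 1):            # slide the patch around a window
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > next_frame.shape[0] or x + size > next_frame.shape[1]:
                continue                             # skip positions outside the frame
            candidate = next_frame[y:y + size, x:x + size]
            err = np.sum((candidate - patch) ** 2)   # sum of squared differences
            if err < best_err:
                best_err, best_pos = err, (y, x)
    return best_pos
```

Run on every successive pair of frames, the returned positions trace the path of the stick’s tip, and the flare layer can simply be offset by the same amounts.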

This ability to digitally attach one object to another is relatively new. Like a lot of technologies we use every day, it has a military origin - in missile guidance systems - which probably won’t surprise you. It wasn’t long, though, before enterprising post production companies in the early 90s saw the research papers and adopted the technology.

ILM (Industrial Light and Magic, the company originally behind Star Wars) first tried a form of laborious manual tracking on 1992’s Death Becomes Her (Zemeckis). The film used extensive digital retouching to remove the head of one actress and later place a talking head on another body - and that head needed to be tracked in place. ILM would improve on this to create a ‘3D’ tracking system that could calculate depth, letting them track and place computer generated dinosaurs in three dimensions (not just the two dimensions we use in the Flare shot, where we only move up/down and sideways).

It was clear in the 1990s that the public’s appetite was moving towards more spectacular and fantastic themes, so there was demand for hardware that could deliver such magic. In 1992 a company called Discreet built a system called Flame that had an inbuilt tracker. The first film Flame created VFX for was Super Mario Bros. (Jankel and Morton).

In the 1993 Sylvester Stallone movie Cliffhanger (Harlin), tracking was used to cover wooden supports on the studio set of a cliff-face, by aligning rock images over the visible wooden struts and carpentry.

From our privileged position these days it’s hard to appreciate what a breakthrough this was. It was magic! As Peter Webb, an early Flame Artist, said of those times:

“I remember demoing the tracker myself in the early 90s at trade shows and we always used this footage of some Russian soldiers marching. 
We would track the Russian emblem on their fur hats and then replace it with a flame fish (the old Flame logo). 
The punters LOVED it as it had never been seen before”.

By 1994 other systems were adding tracking to their range of functions. Various processing techniques were invented - changing the luminance of the pixels so they were easier to follow, or tracking the Red, Green and Blue channel values separately and then combining them to average out errors. One common problem with tracking a group of pixels over time is that they change; and if your film footage is grainy, the computer isn’t intelligent enough to recognise this, so it ‘loses’ the group of pixels it was tracking. By 1995, machines allowed the track point (the group of pixels) to be refreshed periodically, restarting the track every so many frames. This allowed longer, smoother tracks and therefore longer and more complex effects sequences.

In 1998 Flame was recognised with a Scientific and Technical Academy Award. A ‘tracking arms race’ ensued - everyone wanted the best tracker technology for their films. Consider that a Hollywood film frame might be four thousand pixels across so it looks good on the big screen: a tracked object drifting by just a couple of pixels might be noticeable to the audience, so you can appreciate the commercial importance.

As film-goers demanded more sophisticated VFX, 3D camera and object match-moving systems proliferated. Clever software could read film footage, work out the 3D space it originally portrayed, and then provide accurate data allowing 3D computer generated models to be placed into moving image shots and perform complex moves as if they were really there. Software like 3D Equalizer and, later, Boujou revolutionised how VFX could be integrated into filmed footage. Films like The Matrix, Lord of the Rings, Harry Potter and other spectaculars relied on this technology.

Before you begin the Flare Project, which will give you a glimpse of some of the core concepts of VFX, like layers and tracking, you need to be aware of how important it is to shoot your footage in a way that helps the VFX fit in nicely, rather than just hoping your effects will ‘sit on top’ of the footage!

For instance, how might you shoot and light a character who is going to be composited onto a yacht on a sunny Caribbean day? Or how might you shoot and light a model rocket that will have exhaust smoke and flames added later? Making sure the tracked object and the image you are attaching it to seem to share similar lighting will help ‘sell’ the shot to the viewer.

Watch this video and think about the attention to detail that helps us to accept (and the VFX team create) the scene:

GuerrillaDiarybehindthescenes.mp4


In the video, Justin built a set for the background, which barely registers when you see the shot, but because it’s on set it reflects the light and we accept it as real, almost subliminally.

In fact, a trick we noticed is that Justin put little strips of tin foil around the edge of the barrel and the pipe to hint at water reflecting off those edges. Now that’s what we call attention to detail.

Notice also that Justin talks about testing the footage in HitFilm to make sure it all fits together while he is still on set - it’s problematic (and expensive) to find out later that things don’t fit.

Can you think of a different scenario where you would use real lights and real objects in a creative way, as Justin did? I’ll start you off: imagine I had an actor in a deckchair dressed in a T-shirt and Hawaiian shorts. I might pour a bit of sand on the floor and place some ‘tropical’ greenery around him. I might use a fan to blow a gentle breeze through his hair, and have someone wave a filter in front of the light to mimic the shadows of palm leaves; then I might cut to some tropical beach stock footage. A shot like that might save a pile of money in airfares.


Complete this QUIZ then read the CASE STUDIES

Thanks to: Future Learn and Norwich University of the Arts