[Tig] IR pollution on Alexa shot material

Richard Kirk richard at filmlight.ltd.uk
Fri Jun 26 22:38:05 BST 2015


Hi.

This is not an Alexa-specific solution, but I hope it may help.

Our eyes have no violet sensor, and we also have a poor sense of black. Our optic nerve cells fire in 'tonic mode', which roughly means the gain is turned up so that cells fire even when there is no light coming in. This gives us the best sensitivity, but at the expense of an accurate black level. If our eyes did not fire for zero stimulus, then RGB black would be no signal, but we would see very little shadow detail.

This brings us to point two: why do we see violet (a very short wavelength) as the sum of blue (a slightly longer wavelength) and red (a much longer wavelength)? The answer is probably that we are seeing a blue signal with very little green signal. We probably see very little red signal too, but we are used to compensating for the illuminant: sometimes we see very little red and sometimes very little blue, and we adapt by tweaking the red and blue gain a lot. So when we see a violet, we see blue, very little green, and an amount of red we cannot easily quantify, because our eye has no reliable black level to measure the red against. Our best guess is that we have blue, zero green, and some red to account for the unexpectedly low level of green.

So much for our eyes. A camera can measure RGB, but when it detects a pure blue, we know we will see it as purple, even though the camera physics doesn't allow for the possibility of a purple. So if you make a camera that 'sees' colours as we do, you are forced to add a bit of red to intense blues to reproduce our sense of purple. This is so even though it doesn't make sense from the camera's point of view, because the camera, unlike our eyes, has a very good sense of the dark level in all three RGB channels.
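
To make that concrete, here is a minimal sketch in Python/numpy. The 3x3 camera-to-display matrix below is entirely made up for illustration, not ARRI's or anyone else's real values; the only point is the positive red coefficient in the blue column, which is what turns a saturated camera blue into something that reads as purple.

    import numpy as np

    # Hypothetical camera-to-display matrix: illustrative coefficients only,
    # not any real camera's. The positive red term in the blue column (0.15)
    # means a pure camera blue picks up a little display red.
    M = np.array([[ 1.55, -0.70,  0.15],
                  [-0.10,  1.30, -0.20],
                  [ 0.05, -0.35,  1.30]])

    camera_blue = np.array([0.0, 0.0, 1.0])   # pure blue as the sensor measures it
    display_rgb = M @ camera_blue
    print(display_rgb)   # [0.15, -0.20, 1.30]: blue plus some red; green clamps to zero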

Suppose you add an ND filter to a camera. This makes everything darker, so all the RGB signals are smaller, but the noise stays the same. Noise here can mean electronic noise from the neurons firing in our eyes, noise in the camera A/D, cross-coupling or flare, or all sorts of other things. The filter takes us from the good-signal regime, where we have a strong blue signal and no significant green or red, into the shadow regime, where we have blue, surprisingly little green, and a small red that is hard to interpret. So what we see will shift from blue towards violet, and what a good camera measures ought to do the same to match what we see.
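
A toy simulation of that shift, again with invented numbers: a nearly pure blue patch is scaled down stop by stop by an ND, while a fixed noise floor is added on top. In the shadows the red-to-blue ratio the camera reports is essentially noise.

    import numpy as np

    rng = np.random.default_rng(0)

    # Invented numbers purely to illustrate the argument: a bright, nearly
    # pure blue patch (R, G, B) and a fixed noise floor per channel.
    signal = np.array([0.02, 0.01, 0.80])
    noise_sigma = 0.01

    for nd_stops in (0, 2, 4, 6):
        attenuated = signal / (2 ** nd_stops)                          # the ND cuts the signal
        measured = attenuated + rng.normal(0.0, noise_sigma, size=3)   # the noise stays put
        ratio = measured[0] / max(measured[2], 1e-6)                   # apparent red relative to blue
        print(f"ND {nd_stops} stops: RGB = {np.round(measured, 4)}, R/B = {ratio:.2f}")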

It could be IR or UV pollution too. I have no hard evidence either way, but I doubt it. We can make good near-visible dichroic passband filters, and there are good deep IR and UV glass filters. ARRI have been making cameras for a while, and they probably know more about this than I do.

So, what do you do? The obvious answer is to shoot the scene at the right level, rather than leaving the lens cap on and trying to remove it in post. If you can't do that, then it is quite reasonable to pick a mask on the purple colours and make them black. Real purple is fairly rare in real scenes. If you have real purple objects in your shot, then you will have to use your wits and do something artistic. But for most material, you can probably make all purples black with a clear conscience.
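
For what it is worth, here is roughly what that looks like as a crude RGB key in Python/numpy. The function name and thresholds are my own invention; in a real grade you would key on hue with a proper qualifier and soften the mask rather than hard-cutting to black.

    import numpy as np

    def kill_purples(img, green_max=0.08, red_min=0.05, blue_min=0.15):
        """Push 'purple' pixels (significant red and blue, almost no green)
        in a float RGB image to black. Thresholds are illustrative and
        would need tuning per shot."""
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        mask = (g < green_max) & (r > red_min) & (b > blue_min)
        out = img.copy()
        out[mask] = 0.0
        return out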

I don't know if that convinces you, but I buy it. But that is on Friday after FilmLight Wine Time, so that may not mean much.

Cheers.
Richard Kirk
---
FilmLight Ltd, Artists House, 14-15 Manette Street, London W1D 4AP
Tel: +44 (0)20 7292 0400  Fax: +44 (0)20 7292 0401
