Old Lenses, Chromatic Aberration, Monochrome, Filtering

Discussion in 'Mirrorless Cameras' started by VidThreeNorth, Apr 4, 2019.

  1. Tim Tucker 2

    About 4% of light is reflected at a single air/glass transition.

    The problem comes when you have two or more air/glass transitions, because the total reflection then lies anywhere between 0% and 16%. This is a function of the relationship between the glass and the wavelength where, for want of a better model, the *interference* of the wave function can be used to visualise it. The reflection off the second surface can cancel or amplify the reflection off the first surface, (this is of course not what actually happens, since when you use a detector to count the reflected photons it registers more or fewer clicks of the same volume, indicating particles and not waves...).
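The 4% and the 0%–16% range above can be sketched with the Fresnel reflectance formula for normal incidence; the glass index of 1.5 is an assumed typical value, and the glass is assumed lossless:

```python
# Sketch of the numbers above: Fresnel reflectance at normal incidence
# for a single air/glass boundary, plus the interference range for two
# boundaries. Assumes n_glass = 1.5 and lossless glass.

def fresnel_reflectance(n1, n2):
    """Fraction of intensity reflected at a boundary between media n1 and n2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

n_air, n_glass = 1.0, 1.5
R = fresnel_reflectance(n_air, n_glass)   # ~0.04, i.e. ~4% per surface

# The amplitude reflection coefficient is r = sqrt(R) = 0.2 per surface.
# With two surfaces the two reflected amplitudes add with a phase that
# depends on element thickness and wavelength, so the total reflected
# intensity swings between (r - r)^2 and (r + r)^2:
r = R ** 0.5
R_min = (r - r) ** 2   # fully destructive interference: 0%
R_max = (r + r) ** 2   # fully constructive interference: 16%

print(round(R, 4), R_min, round(R_max, 4))   # 0.04 0.0 0.16
```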

    You can see this effect in thin films such as oil on water, where the colours you see are the reflections amplified by the relationship between the film's thickness and the wavelength of the light. It was also Newton who measured and detailed the effect with glass.

    The effect is wavelength-dependent, but with a few special coatings of the right refractive index and thickness it is possible to reduce the reflections quite considerably.
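A minimal sketch of how a single quarter-wave coating does this; the numbers are illustrative assumptions (crown glass n = 1.5, magnesium fluoride coating n = 1.38, design wavelength 550 nm), not anything from the post:

```python
import math

# Quarter-wave anti-reflective coating sketch. Assumed values:
# crown glass n = 1.5, MgF2 coating n = 1.38, design wavelength 550 nm.

n_air, n_glass = 1.0, 1.5
n_coat = 1.38                 # MgF2, a common single-layer coating material
wavelength_nm = 550.0         # green light, near peak eye sensitivity

# A quarter-wave layer makes the reflection off the glass surface travel
# half a wave further than the reflection off the coating surface, so the
# two arrive out of phase and partly cancel.
physical_thickness_nm = wavelength_nm / (4 * n_coat)

# Residual reflectance at the design wavelength for a single layer:
R_coated = ((n_air * n_glass - n_coat ** 2) /
            (n_air * n_glass + n_coat ** 2)) ** 2

# The cancellation would be complete if n_coat were sqrt(n_air * n_glass):
n_ideal = math.sqrt(n_air * n_glass)

print(round(physical_thickness_nm, 1))   # ~99.6 nm thick
print(round(R_coated, 4))                # ~0.0141: ~1.4% vs ~4% uncoated
print(round(n_ideal, 3))                 # ~1.225, lower than durable coatings
```

Stacking several such layers tuned to different wavelengths is what multi-coating does, which is why modern coated lenses reflect far less across the visible band than the single-layer figure here.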

    Hmmm...

    The trouble with most photographers on forums is that they equate colour with the physics of light. Lenses are designed and work entirely in conjunction with the physics of light. Colour and colour reproduction are entirely perceptual and work by understanding the mechanics of the human eye/brain.

    Walk out of a bar lit by warm artificial light and into the heavy blue of midday under blue skies. Do you see the bar as being particularly yellow, or the outside as being heavily blue? No, because your eye adjusts colour to enhance the contrast between colours. This is an evolutionary thing: we survive because we have the ability to see the orange stripes of a tiger even in heavily blue-biased light. Our eyes naturally cease to record as much blue; since blue de-saturates the orange, we therefore see a bigger contrast between colours.

    A number of very important things are happening here, and are mimicked in photographic images. The trouble is that because things remain constant to our eyes we assume that what we see is constant and can be explained by what we see rather than how we see.

    • When we take a photographic image we equalise exposure so the SAME amount of light falls on the sensor in all conditions, (this is what the calibration of exposure means). We no longer reproduce absolute intensity of light; we equalise the brightness much as the eye does, and record the same intensity as dictated by the sensitivity of the film/sensor and the brightness of the output media.
    • We also apply a WB. In other words we no longer record absolute colour or colour temperature but convert it to a *Reference WB* which is fundamentally derived from how the human eye tries to balance primaries to increase visual contrast.
    That excessive blue de-saturates orange is entirely due to the workings of the human eye and has nothing to do with the physics of light. Combining the light of the corresponding wavelengths does NOT produce grey light. Light doesn't have the property of colour; colour is entirely how the human eye interprets wavelength.

    So if you combine blue and orange light, what happens is that you produce a (roughly equal) signal in the three receptors in your eye, which the brain interprets as *grey*. If you reduce the amount of blue, the proportion between the signal generated by the **blue and the **red/green cones shifts towards the red/green, and so you see a colour that's more orange. (** The cones in our eyes are labelled red-green-blue, and the mistake is to apply your understanding of the word *red* to the function of the cone. The red cone actually has a much broader spectral response that includes some blue, and as a result we classify and order colour perceptually from indigo through magenta to red, where no such relationship exists in physics.)
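A crude illustration of that point, using RGB channel values as stand-ins for the three cone signals (a big simplification, since real cone responses overlap broadly, as the footnote above says; the specific orange/blue values are assumed):

```python
# RGB channels used as a rough stand-in for the three cone signals.

def mix_lights(*lights):
    """Additively combine light sources, channel by channel."""
    return tuple(min(255, sum(ch)) for ch in zip(*lights))

orange = (255, 165, 0)   # strong "red/green" signal, no "blue"
blue   = (0, 90, 255)    # the complementary blue for this orange

# Complementary lights drive all three channels equally, which is read
# as white/grey rather than as some intermediate wavelength:
print(mix_lights(orange, blue))            # -> (255, 255, 255)

# Halve the blue light and the balance tips towards red/green, so the
# mix is perceived as a (de-saturated) orange instead:
dim_blue = tuple(ch // 2 for ch in blue)   # (0, 45, 127)
print(mix_lights(orange, dim_blue))        # -> (255, 210, 127)
```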

    RGB and additive colour systems are an illusion so attuned to the three receptors in the human eye that they only work with humans. Even CMYK is fundamentally based on the workings of the human eye. This is because we always see yellow as the brightest colour, and in any subtractive system it's impossible to produce that brightness by subtraction; it has to sit at the top of the tree and be the colour subtracted from, (yellow stimulates maximum signal in the *red/green* cones and minimum in the *blue*, and so yellow always triggers more cones than blue and is always seen as a brighter colour).
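The "yellow is always the brightest colour" claim can be checked with the standard Rec. 709 luma weights, which themselves encode the eye's uneven sensitivity to the three primaries (the gamma/linear-light distinction is ignored here for simplicity):

```python
# Rough check of the claim above using Rec. 709 luma weights, which are
# derived from human perception, not from the physics of the light.

def relative_luminance(rgb):
    """Perceived brightness of a colour on a 0..255 scale
    (linear-light conversion skipped for simplicity)."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

yellow = (255, 255, 0)   # drives the red and green cones hard, blue barely
blue   = (0, 0, 255)

print(round(relative_luminance(yellow)))   # -> 237 of 255: nearly white
print(round(relative_luminance(blue)))     # -> 18 of 255: always looks dark
```

Note that the weights sum to 1 but are wildly lopsided towards green and red, which is exactly the perceptual asymmetry the paragraph above describes.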

    Using colour filters to subtract light is essentially filtering wavelength, but the resulting *colour* we see is not an accurate recording of that wavelength but a perceptual interpretation of wavelength that is due to the mechanics and chemistry of the eye/brain.

    Conclusion: Colour does NOT describe the physics of light; only wavelength does. Colour is our perceptual interpretation of wavelength, and all colour reproduction systems are based on that perceptual representation and NOT the physics of light or the absolute wavelength or intensity of light in any scene. I have seen posters on forums *prove* the *absorption of fragile blue* by comparing jpegs on computer screens, which is of course a ridiculous concept and completely misunderstands the nature of colour and colour reproduction. All colour we see is composed of wavelengths, and the way they combine to produce the colour we see, the way the saturation and intensity of that colour changes, and the way we categorise and order colour are all defined by the mechanics of the human eye and cannot be understood through the physics of light alone. The complementary colours are entirely a product of the cones in our eyes and again have nothing to do with the physics of light.

    All colour reproduction systems are based on how we see and interpret colour through human eyes, what is known as Perceptual Colour Theory, and not the actual recording and reproduction of wavelength. Even the RGB of cameras is a specific reference to the cones in your eyes and nothing to do with the actual physics of the light transmitted by the lens.

    So although CA is a product of the physics of light and how a lens fails with different wavelengths at the edges of the field it projects, the actual representation of it in an image is a perceptual model. It is a mistake to assume that you are looking at, and can interpret, the actual physics of the light and the way a lens transmits it when you view an image on any colour reproduction system. You DO NOT SEE the actual light that passed through the lens, because you will ALWAYS view with the human eye and its perceptual interpretation of colour. Which is why it would be utterly pointless for any colour reproduction system not to be based on the perceptual model.

    Colour film is also a perceptually based reproduction system, and so the effects of filters have to be understood in line with the perceptual model; images cannot be interpreted in relation to the actual physics of light and the transmission of a lens.

    BTW, it was Newton who correctly guessed the nature of light, and especially that white light is composed of all the colours. Before this it was assumed that white lacked the *purity* of colour, and that the purer the colour the more divine the light.

    It was Thomas Young who, about a century later, correctly realised that the way colour and colour filters behave when combining light actually describes the mechanics of the human eye and has nothing to do with the physics of light.


     
    Last edited: Jul 25, 2019