Old Lenses, Chromatic Aberration, Monochrome, Filtering

Discussion in 'Mirrorless Cameras' started by VidThreeNorth, Apr 4, 2019.

  1. VidThreeNorth

    VidThreeNorth No longer a newbie, moving up!

    Joined:
    Oct 21, 2016
    Messages:
    406
    Likes Received:
    46
    Can others edit my Photos:
    Photos NOT OK to edit
    Since last September I have used two high-quality old lenses: my Nikkor 28mm F3.5 AI and my Asahi SMC Takumar 300mm F4.0. The 300mm is the older design; according to PentaxForums.com, the optical design goes back to 1965, but this "SMC" version came out in 1971, incorporating full-aperture metering and Asahi's "Super-Multi-Coating".

    According to Ken Rockwell's website, my 28mm F3.5 Nikkor AI was produced from 1977 to 1981, which implies that it was probably designed around 1976. Its 6-elements-in-6-groups optical design and multicoating were contemporary for the day, but not really class-leading. Nikon had brought out a 28mm F2.8 around 1974 with 7 elements in 7 groups, which was overall larger, better and higher priced. Moreover, Nikon had offered 24mm F2.8 lenses as far back as its earliest 28mm F3.5 lenses. So all 28mm F3.5s could be looked on as targeting a more "popular-priced" market. But this is a relative idea, and no doubt, if you checked the bags of a lot of pro photographers, some would have carried a 28mm F3.5 in their kit.

    What they have in common is fairly high chromatic aberration. Later versions of the 300mm reduced this problem. The Nikkor line had higher-quality models which might have had less chromatic aberration even when this one was introduced; I don't know, as I haven't checked around for any recorded tests.

    But it is fair to say that chromatic aberration was likely less of a priority in pre-1970s lenses. The reason is that colour photography, though available, was not as popular as black and white until around the mid-1960s. The cost and availability of colour prints were prohibitive for casual photography until that 1960s-to-1970s period. Even for pros, black and white was the staple for news (including sports), corporate photography (including portraits), and architecture.

    So chromatic aberration was not going to be as serious an issue until "later", and "later" varied from company to company. Again, I will point out that the Nikkor 28mm F3.5 AI was not Nikon's top lens at its focal length, so the more expensive lenses might already have been better.


    Optimizing Monochrome With Filters

    Since monochrome was widely used by pros, there were well-known methods of getting the most out of it. These were no secret. If you asked any camera-store salesman, you'd be taught this stuff in about an hour. A good salesman could give you a good education, and had good motivation, because he or she could legitimately advise you to spend a little money on some filters.

    A new professional or hobbyist could be shown examples of using Red, Green, Blue and Yellow filters. Most stores had example pictures. You would be taught that for sharper pictures you would do well to remove the blue spectrum, which scatters more in haze and fog; "anything but blue" was good for this. You would also see that each of the RGB filters resulted in a distorted representation of the scene. Red was generally the most dramatic, darkening the sky and plants. Green was often called the most "natural", though green leaves tended to be brightened. Yellow was called the "fog or mist" filter. Blue was, well, actually kind of useless, but if you bought one, the salesman would take the money. . . .

    It might also be mentioned that this was a good way to sharpen a picture, because of chromatic aberration: restricting the light to a narrower band of the spectrum reduces the colour fringing. The salesman might not mention this, because you'd already expect the picture to be sharper, so it was to an extent redundant information. But a more experienced expert or pro would probably carry red, green and yellow. The red was "sharpest"; the green was "almost as sharp".

    If there was a "pro trick", it was the yellow filter. Yellow only got rid of the blue content, so it did not sharpen quite as much as red or green, but it also did not affect the exposure as much, so shutter speeds could be higher or apertures smaller, either of which can also contribute to sharper pictures. And it resulted in pictures that looked comparatively natural.


    Translated to Digital

    If I take a picture with normal settings, I can convert it to monochrome. That much will probably not affect the sharpness of the picture, but some people might prefer it simply because it gets rid of the chromatic aberration. However, some (probably most) processing programs can make colour separation files. They might support RGB, or CMYK, or both, so it is possible to simulate using these black and white filters. The red and green files are fairly straightforward. The yellow is not so clear, at least to me. The cyan, magenta and yellow files are meant for colour printing, and they are meant to be used along with the others, including the black ('K') file. There might be a useful way to combine some of these to create a good monochrome image, but I have never tried to do so. Not yet, anyway. . . .
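    As a rough sketch of what those separation files amount to, here is a toy example in plain Python (no imaging library; the pixel values are made up for illustration, and the 0.299/0.587/0.114 weights are the common ITU-R 601 luma formula, which a given editor's default grey conversion may or may not use):

```python
# Tiny synthetic "scene": a blue "sky" pixel and a warm "red subject"
# pixel, each as an (R, G, B) tuple. Values are illustrative only.
pixels = [(30, 60, 200), (200, 60, 30)]

# Default grey conversion, using the common ITU-R 601 luma weights.
grey = [round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in pixels]

# Channel separations: viewing one band as a greyscale image roughly
# mimics shooting B&W through a strong filter of that colour.
red_file   = [r for r, g, b in pixels]
green_file = [g for r, g, b in pixels]
blue_file  = [b for r, g, b in pixels]

# Through the simulated "red filter" the blue sky goes dark and the
# red subject goes bright -- the classic dramatic-sky effect.
print(grey, red_file)  # [67, 98] [30, 200]
```

    Note how the red separation pushes the "sky" far darker than the default grey conversion does, which is the look the red-filter example prints in the camera store were demonstrating.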


    Simulated v Real Filters

    One last issue is whether using the colour separation files is really the same as using real filters. At this point, all I can say is "maybe". I can go further and say that if there is a difference, I do not expect it to be very big, and it is probably different for different bodies.

    Most colour sensors have small filters which define each photosite's sensitivity as red, green or blue. But the purity of these filters is not absolute, and in fact the degree of overlap might differ from one model to the next within a camera line.

    If this is true, then the colour separation files might also not be "pure". So yes, there is a chance that a "better" file could be made by starting with a purer filter on the lens. The only way to know for sure is to do the tests, and for now I am not interested in doing so. If you are, go ahead.


    Uploaded Files:

    This is a file created on the YiM1 with my Nikkor 28mm F3.5 AI on my shift adapter, so the chromatic aberration is representative as far off-axis as would be expected when the lens is used on a 35mm film camera. It isn't a "horrible" amount of chromatic aberration, and it could be handled by most chromatic aberration correction functions, but the alternative of converting to monochrome was also available, as was the creation of colour separation files.

    P9110009.DNG
    [A version of this file was previously uploaded.]

    Partial EXIF:
    Sept 11, 2018, 16:51:55
    Image width 5200
    Image height 3902
    Components per pixel 1
    Pixel height 3888
    Pixel width 5184
    Component configuration YCbCr
    Color space Uncalibrated [AdobeRGB]
    Exposure mode Manual exposure [actually P]
    Exposure bias 1.00ev
    Exposure time 1/80 sec.
    F number f/0.0 [probably F11]
    Max aperture f/1.0 [actually f/3.5]
    Focal length 0.0mm [actually 28mm]
    ISO speed 200
    Metering mode Center weighted average
    Gain control Low gain up

    The uploaded files were made from the DNG (raw) file with only a white balance adjustment. No other changes (sharpening, noise reduction, etc.) were made. The first is the normal colour render. The "Grey" is the default grey conversion from Corel PaintShop Pro X9. The Red, Green and Blue files are the separation files that Corel normally makes.

    Conversion White Balance:

    Temperature 5830
    Tint 27

    "P9110009 -1b-Colour-C1.jpg"

    "P9110009 -1c-Grey-C1.jpg"

    "P9110009 -1d-Red-C1.jpg"

    "P9110009 -1e-Green-C1.jpg"

    "P9110009 -1f-Blue-C1.jpg"

    Both the Red and Green files are visibly sharper than the straight grey conversion, but as expected, the Green file looks more "natural".


     

    Attached Files:

  2. Derrel

    Derrel Mr. Rain Cloud

    Joined:
    Jul 23, 2009
    Messages:
    44,565
    Likes Received:
    16,948
    Location:
    USA
    Can others edit my Photos:
    Photos OK to edit
    Your 28mm f/3.5 has what, today, many people would call "very bad" CA issues.

    The Green Filter effect looks good to me.

    I saw your tests of the 300/4 Takumar in the snow last week. A lens design from 50 or so years ago, yes, and well-built no doubt, but not good compared to newer, better-corrected lenses.

    The performance of MANY older, film-era Nikkor wide-angle lenses is not that great when paired with modern, high-resolution digital sensors. I applaud you for doing your own ACTUAL tests.
     
    • Like Like x 1
  3. VidThreeNorth

    VidThreeNorth No longer a newbie, moving up!

    Joined:
    Oct 21, 2016
    Messages:
    406
    Likes Received:
    46
    Can others edit my Photos:
    Photos NOT OK to edit
    Thanks Derrel. As I've said before, mainly I do tests because I need to know something. I keep notes as I go, and if I think they might be useful for others, I write them up and post them somewhere. This was different. I knew the theory, and yes, I actually did take specific pictures for this post using the 300mm. But then I thought it would be best to avoid having to explain the "focus issues" on that lens, so I just looked at what I had from the 28mm. Aside from the focus on the 300mm, neither lens performed outside my expectations, and the 300mm can be fixed.

    I agree that, generally, the older lenses really are not that wonderful. We have better glasses today (extra-low dispersion), aspheric elements, probably better grinding and polishing, and computers to help with the design work. On top of all that, many of the companies have simply been at this a long time and have decades of experience. But occasionally you can find something competitive with more modern products.

    Mainly, this project was just curiosity and fun. I still do not have a great need for the 300mm. But in retrospect I'm glad I got it.

    One thing I didn't mention above was that I was surprised that the separation files were noisier than the straight conversion file. I had not expected that. The Red file is pretty bad; there's no point in that one. If I noise-reduce it, all the sharpness and detail will be gone anyway. I guess that was a lesson. Maybe there is a good reason to use real filters on the lenses?
     
    • Like Like x 1
  4. Tim Tucker 2

    Tim Tucker 2 No longer a newbie, moving up!

    Joined:
    Jun 8, 2017
    Messages:
    263
    Likes Received:
    183
    Location:
    Scotland
    Can others edit my Photos:
    Photos OK to edit


    Excuse the longish post; it's required to explain a basic and common misconception about colour which actually renders the whole test quite *inconclusive*.

    The nature of light is defined by *wavelength*; the visible spectrum is a linear scale from the longest wavelengths of red through to the shortest of indigo.

    When Newton split light into its separate wavelengths and revealed the true nature of light, he also revealed a problem or two. Not least of these was the problem of the three primary colours, because he had just proved that they didn't really exist. So why are there three primaries of Red, Green and Blue?

    It was Thomas Young who made the breakthrough by realising that the three primaries describe the way the human eye works and have nothing to do with the nature of light. The human eye doesn't have numerous types of receptors capable of detecting slight nuances in colour; it has only three. They are named after the primary colours and *not* after their spectral sensitivity, so be very careful about jumping to a conclusion simply by reading the label *red*. In the eye the three receptors overlap heavily, a result of evolution and our need to understand the space we occupy in order to survive.

    As an example, when life first started to evolve organic visual systems, the most useful ability was differentiating the harsh, UV-laden light of midday from the softer, warm yellow/orange of mornings and evenings. This is still hard-wired into the human eye, and yellow/blue remains the main contrast around which we understand colour. Also, if you look at the spectral sensitivity of the eye you will notice that the *red* receptors actually react a little to blue, and so it is that we understand colour as a wheel where indigo fades through magenta (a pure hue that doesn't exist as a single wavelength) to red.

    Basically, colour is not a property of light but the way our visual system differentiates wavelength. The way it has evolved to do this is not the way a calibrated scientific instrument would, but in a way that skews colour and contrast to exaggerate the differences that are most useful to us.

    Now, your RGB colour system based on the three primaries uses the fact that you have only three types of receptors to produce a remarkable illusion. If you see light of the *yellow* wavelength, it triggers a pattern of response in the three receptors that your brain interprets as yellow. What a screen does is use narrow spectrums of red, green and blue that are specifically tuned to the receptors in your eye, so that when a certain combination of RGB is shone into your eye it triggers exactly the same pattern of response, which your brain interprets as *yellow*. Yellow is not a combination of red and green; rather, because of the way the receptors in your eye work, your brain can be tricked into seeing yellow by a combination of red and green.

    There is not a colour reproduction system designed by man that is not purely designed around the way the human eye works; they are all *perceptual*, and none of them are based on the physics of light and reproducing *absolute* wavelength. Even the spectral sensitivity of the RGB array on your camera sensor is balanced as closely to the spectral sensitivity of the eye as possible. Digital cameras vary only in their solution to this problem, not in their intent.

    Colour filters in B&W work on the absorption of *wavelength*, not on the colour wheel, which is perceptual only. The *common* colour filters are:

    Yellow: Because we still retain the yellow/blue perceptual contrast as a result of the way the eye evolved, we still see yellow as being brighter than blue. In fact, if you look at the spectral sensitivity of the eye, the receptors pretty much give near-maximum response to both red and green (i.e. yellow) and near minimum to blue. When you reduce the image to a monochromatic one you remove that perceptual skew from the cones, and so your perception of luminosity is also altered; you finally see blue and yellow as being closer to their *absolute* values, and so bright blue skies look closer to white in monochrome than the perceptual darkening of blue suggests when you look at the actual colour scene. So a yellow filter is used because it restores the luminance of colours closer to the perceptual rendering of the eye, by darkening the blue and lightening the yellow.

    Orange: More pronounced than yellow.

    Red: At the exact opposite end of the *linear wavelength scale of light*, it is really cutting out blue and cyan. Remember that the red/green opposition is a *perceptual contrast that comes from the cones in your eye and does not describe how light works*; a red filter doesn't darken green to the same extent, and the darkening is more pronounced towards a viridian green than a yellow/green.

    Green: Lightens foliage and darkens both blue and red to some extent; more pronounced and slightly more biased than the yellow, but similar.

    Blue: Really not perceptually very accurate, as it reverses the eye's tendency to see blue as the darker colour.

    EDIT, having re-read the process: If you want to simulate the effect of colour filters, you must use an adjustment of *luminosity* to simulate the filtering out of certain *wavelengths* along the linear scale of the visible spectrum. Do not use the RGB channels, which are simply the luminosity maps of the three primaries necessary to produce the *illusion* of colour on an additive display. They do not represent anything close to the effects of a filter on the front of a lens; they are based not on the linear scale of wavelengths that filters absorb but on the way the three kinds of receptor in the eye work.

    In actual fact, and slightly above my understanding at present, the Green channel is closely related to our perception of luminosity. But it MUST be remembered that on a computer screen the channels mimic the way the eye works, and so what you are shown is the *perceptual* difference, not the effect of a green filter in front of the lens. Below is a comparison: on the left is the effect of a green filter; note that it darkens blue and red (some of the late-autumn grass has a red/brown content). On the right is the green channel, which is more of a *perceptual luminosity map* based on how the eye separates and rebuilds an understanding of colour and luminosity.

    ex-1.jpg
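    The distinction can be sketched numerically (a toy Python example; the pixel value and the transmission weights are made up for illustration and do not correspond to any real filter or camera):

```python
# One warm red/brown pixel, e.g. late-autumn grass, as (R, G, B).
pixel = (180, 120, 60)

# Method 1: the green separation file -- G copied, R and B discarded.
green_channel = pixel[1]

# Method 2: a hypothetical "green filter" simulation -- green passed,
# red and blue attenuated but not removed, then collapsed to a single
# luminosity value, roughly as a physical filter would act on the
# light before it reaches the film.
w_r, w_g, w_b = 0.2, 0.7, 0.1   # made-up transmission weights
green_filter = round(w_r * pixel[0] + w_g * pixel[1] + w_b * pixel[2])

# The filter simulation keeps some of the red/brown luminosity that
# the bare green channel throws away entirely.
print(green_channel, green_filter)  # 120 126
```

    The point is not the particular numbers but that the two methods disagree wherever a pixel has content outside the green band, which is exactly what the grass in the comparison above shows.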

    BTW, by using stronger filters in front of the lens you bias the composition of the light passing through the lens towards a single portion of the spectrum. This is far less prone to chromatic aberration, simply because the range of wavelengths passing through the lens is tighter and less affected by differences in refractive index.

    When you simulate this on a computer screen with a colour digital file, the chromatic aberration is already there; you simply adjust its luminosity, and in many cases you will increase the contrast, making it more visible.
     
    Last edited: Apr 5, 2019
    • Informative Informative x 1
  5. VidThreeNorth

    VidThreeNorth No longer a newbie, moving up!

    Joined:
    Oct 21, 2016
    Messages:
    406
    Likes Received:
    46
    Can others edit my Photos:
    Photos NOT OK to edit
    @Tim Tucker 2: Many thanks for your input! I think your example shows that the filters built into camera sensors allow much more significant "overlap" than I had thought, and I expect your camera is about the same as most of ours. So it does seem that if I want to reduce chromatic aberration, putting "real" photographic filters on the lenses would help more than just using the colour separation function. That becomes a hard choice for me. Earlier in my life I did a lot of black and white photography. Now that I have returned to it, it is hard for me to get enthusiastic about black and white. If I put the filter on, it is a commitment that the picture will be black and white. That is not as nice as simply choosing to make it black and white later. Well, we'll see.
     
  6. Derrel

    Derrel Mr. Rain Cloud

    Joined:
    Jul 23, 2009
    Messages:
    44,565
    Likes Received:
    16,948
    Location:
    USA
    Can others edit my Photos:
    Photos OK to edit
    I NEVER use yellow, orange, red, or green filters in the field with digital...filter effects in LR are good, faster, and give me the greatest number of options.
     
    • Agree Agree x 1
    • Useful Useful x 1
  7. Tim Tucker 2

    Tim Tucker 2 No longer a newbie, moving up!

    Joined:
    Jun 8, 2017
    Messages:
    263
    Likes Received:
    183
    Location:
    Scotland
    Can others edit my Photos:
    Photos OK to edit
    A lot of people make the connection (because of the words or labels, and the belief that colour explains the nature of light) that a camera sensor splits the light into its RGB components, which are then shone back out of your screen in the correct proportions to reproduce the colour we see.

    I can't stress just how wrong this is.

    Light is not composed of red, green and blue, but of the full spectrum of visible wavelengths.

    The sensor in your camera is designed to respond to the full spectrum: when light of a certain colour hits it, a certain proportional pattern across the three sensor types is recorded. This only works because it mimics the way your eyes work. If you changed that significantly, a camera would not be able to record colour in the way you see it, or *correctly*. Colour describes the way the eye works, not the nature of light, so accurate colour is what the eye sees and not the literal recording of wavelength.

    A screen does not then reconstruct those components in the correct proportions. The combination of narrow bands of RGB light (and it is RGB this time) does not reproduce the colour; it triggers an illusion so tuned to the way the human eye works that it doesn't work for any other animal, and the colour they would see would be completely wrong. If you do something that never happens with subtractive colour and shine narrow bands of red and green light into your eyes, at wavelengths carefully chosen to attune to your receptors, you can trigger the same response in those receptors as yellow does, and the eye simply cannot tell the difference...

    Visible Light and the Eye's Response

    The RGB colour separation on your computer is not a reference to the *input* data but a specific reference to the *output* data necessary to display correct colour on an additive display, in exactly the same way that the CMYK colour separation is related to the *output* data necessary to reproduce colour through a subtractive CMYK press.

    I wouldn't use colour filters for colour digital; B&W film only. Digital is trying to separate and record colour in a similar way to the eye so that it can be reproduced in a variety of different colour processes. By using colour filters you are working against this and skewing it in a fairly non-linear way; for instance, a camera may use the *green* channel as the main reference to rebuild the luminosity. Once the emulsion is set, B&W film no longer distinguishes or separates colour; it is just a luminosity map defined by the emulsion's sensitivity and the composition of the light falling on it.

    Why not just use the lens correction in the RAW editor? Or do what I do which is shoot B&W film with a LF camera? ;);););)
     
    Last edited: Apr 6, 2019
    • Informative Informative x 1
  8. vin88

    vin88 TPF Noob!

    Joined:
    Jan 29, 2018
    Messages:
    191
    Likes Received:
    18
    Can others edit my Photos:
    Photos NOT OK to edit
    I think the older Nikkors (with glass lenses) are better than the plastic lenses of today.
     
  9. Harry Liston

    Harry Liston TPF Noob!

    Joined:
    May 9, 2019
    Messages:
    2
    Likes Received:
    1
    Location:
    Ottawa, Canada
    Can others edit my Photos:
    Photos OK to edit
    It is true that human vision is much more sensitive in the green range. Having worked in a mini photofinishing lab for a while, I can say it was always the magenta/green balance that caused the most redos, hehe. But I'm not so sure that a Nikkor 28mm f/3.5 has an excessive amount of chromatic aberration. It seemed fine to me, but that was a great while ago, with Kodachrome film. Granted, the CA could still have been there, just not so noticeable due to the more forgiving film medium. Could there be other optical shortcomings besides CA that cause such a big loss of sharpness with digital?
     
  10. Soocom1

    Soocom1 Been spending a lot of time on here!

    Joined:
    Feb 27, 2006
    Messages:
    1,738
    Likes Received:
    523
    Location:
    Southwest US
    Can others edit my Photos:
    Photos NOT OK to edit
    I posted a thread on this some time back.
    The images I shot were taken with both real and "fake" filters (in my case, Canon DPP).
    But the effect is basically the same.

    Old Soul: A B&W test. Red filter or not.

    Here is what I had discovered:
    1: The camera can create the monochrome effect in-camera, and the end result is a B&W image that, once put through contrast, brightness, et al. in post-processing, can "mimic" the effects of the various filters.
    The filter I put in front of my Canon was used with the camera set to "shoot" monochrome. That was when I discovered that in reality (at least for the camera I was using) the red was actually recorded, because DPP can "undo" the monochrome information provided by the camera; it was actually able to take a "monochrome" image shot in camera and restore the full colour.
    I realized this when I saw in the DPP program that the data from the camera showed "Monochrome", and once I switched it to "Portrait", "Faithful" and other settings, voila: full colour, except the filter image was all red!

    2: What most people who do not study photography or light (or their application in digital photography) miss is that CMYK and RGB were based on tables and scales produced well before the 20th century, when some folks discovered that colour could be altered with various coloured glass. E.g., red glass allowed some green through (as a different shade of red) and made blue essentially black (at least mostly).
    It was also here that colour temperature (the Kelvin scale) was first applied and understood. Ergo why red sits at the lower end of that scale and blue higher, and why White Balance adjustment has a greater effect on the final image than most really understand.

    3: What I DID discover with my experiment was that applying red, green, blue, yellow (and so on) filters to a digital camera set to monochrome has a direct effect on the image that is totally different from the faux filters in various programs.
    This is because when the filter is actually used, guess what is happening...
    The same effect as though it were B&W film, except the camera records mostly the red colour. I discovered that some shades of green nearly burn out to a wholly yellow tone, but that's beside the point. The filter is only allowing red through, and thus brings the digital image much closer to a "real" B&W image than pushing it through post-processing would.

    One book in my collection, "Successful Color Photography" by Andreas Feininger, goes into quite a lot of detail about this very subject. It is definitely worth the read. (Not digital, but the effects of filters.)
    https://www.amazon.com/Successful-Color-Photography-Andreas-Feininger/dp/B000O7VX8M/ref=sr_1_1?keywords=Successful+Color+Photography+Andreas+Feininger&qid=1563162356&s=gateway&sr=8-1.

    But more relevant to the OP is this: CA was obviously not a big issue with older lenses, at least to the degree that the professional photographer was concerned.
    But also remember that early lens coatings helped save many a Hollywood movie shot in Technicolor, and I would dare someone to find bad colour rendition in prints of El Cid, The Wizard of Oz or Gone with the Wind.

    However, in my copy of the aforementioned book, Mr. Feininger points out that lens coatings were in part an issue of light transmission rather than CA, because of the amount of light lost to the tunnel effect and the inverse-square law. By having the coating on the lenses, overall light transmission increased, and the lenses were typically clearer and the images much sharper.

    One of the aspects involved was actually diaphragm calibration and "T-stops". But I digress.

    Older lenses, especially ones purchased on eBay and other sites, may have histories unknown to the current owner.
    Ergo: one does not know whether a lens suffered from the use of Windex on all the elements and the near-complete wiping away of its coatings.

    Except if flare and CA suddenly become a huge issue.
    But also remember that coating formulations have greatly evolved since the 1960s and '70s, and even the structure of lens elements has evolved. Thus a lens whose history is not otherwise in question could still be a transitional design, made between coating types and application methods.

    As for B&W, even the use of modern lenses will show an at-times dramatic increase in quality over older lenses, due to contrast and other aspects.
     
  11. vin88

    vin88 TPF Noob!

    Joined:
    Jan 29, 2018
    Messages:
    191
    Likes Received:
    18
    Can others edit my Photos:
    Photos NOT OK to edit
    good show !
     
  12. Derrel

    Derrel Mr. Rain Cloud

    Joined:
    Jul 23, 2009
    Messages:
    44,565
    Likes Received:
    16,948
    Location:
    USA
    Can others edit my Photos:
    Photos OK to edit


    My idea was that anti-reflective coating greatly reduced light losses at air-to-glass surfaces and made it possible to design multi-element zoom lenses without a huge loss of light. In the old days it was commonly accepted that each air-to-glass surface lost around 3% of the light. Imagine today's 70 to 200mm zooms with 21 to 23 elements; imagine if at each air-to-glass surface 3% of the light were lost. A 21-element zoom lens can have perhaps 10 to 15 air-to-glass surfaces.
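
    The arithmetic behind that compounds per surface; a quick sketch (using the 3% per-surface figure from the post above, purely as a back-of-envelope illustration):

```python
# If each uncoated air-to-glass surface passes ~97% of the light,
# overall transmission is 0.97 raised to the number of surfaces.
def transmission(surfaces, loss_per_surface=0.03):
    return (1 - loss_per_surface) ** surfaces

# 10 and 15 surfaces, as in the 21-element zoom example above:
t10 = round(transmission(10), 3)   # ~0.737, i.e. roughly a quarter of the light lost
t15 = round(transmission(15), 3)   # ~0.633, i.e. well over a third lost
print(t10, t15)
```

    That is on the order of a half-stop to two-thirds of a stop gone before coatings, which is why multicoating mattered so much for complex zoom designs.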

    Today the most recent advances have virtually eliminated lens flare when shooting right into bright light sources. Nikon invented this improvement, which they called nano coating; it was only a few years before Canon released an almost identical form of lens coating.
     
    Last edited: Jul 24, 2019
