Calling all portrait photographers!

85mm lens. Desired: an 8.47-foot-tall field of view, to show a man and a woman standing full-length, with some space below their feet and above their heads.

With a Full Frame sensor, this field of view height (8.47 feet) is achieved at a 20-foot camera-to-subject distance; with a 1.6x APS-C sensor it's right around 34.5 feet distant to get the same exact height (a 1.53x Nikon or Sony sensor is a little, tiny bit closer).
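For anyone who wants to check the framing arithmetic, here is a rough sketch using the simple thin-lens approximation (field height ≈ distance × sensor dimension ÷ focal length, with the long 36mm side of the frame vertical). It lands close to, though not exactly on, the distances quoted above, which presumably came from a particular calculator:

```python
# Rough framing check: field height ≈ distance * (sensor dimension / focal length).
# Portrait orientation assumed, so the 36 mm side of the frame is vertical.
FOCAL_MM = 85.0
TARGET_FIELD_FT = 8.47   # desired field-of-view height in feet

def distance_for_field(field_ft, sensor_dim_mm, focal_mm=FOCAL_MM):
    """Camera-to-subject distance (feet) that frames field_ft with the given sensor dimension."""
    return field_ft * focal_mm / sensor_dim_mm

print(distance_for_field(TARGET_FIELD_FT, 36.0))         # full frame  -> ~20.0 ft
print(distance_for_field(TARGET_FIELD_FT, 36.0 / 1.6))   # 1.6x crop   -> ~32 ft
print(distance_for_field(TARGET_FIELD_FT, 36.0 / 1.53))  # 1.53x crop  -> ~30.6 ft (a bit closer)
```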

Soooooo: guess what! Moving the APS-C camera BACK, farther from the subject, causes the depth of field to increase compared to the 20.0-foot camera-to-subject distance that the full-frame camera requires. That distance change 1) creates a greater (longer) camera-to-subject distance, 2) tends to alter how far away the background is in relation to the camera's sensor, and 3) moves the focus distance closer to the hyperfocal distance.

The smaller the sensor is, the CLOSER the camera-to-subject distance at which background objects move into a fairly deep DOF range. At the closest ranges, DOF is skimpy; as camera-to-subject distance increases beyond, say, 20 feet, DOF increases at a VERY rapid rate. DOF is NOT a linear thing!
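To put illustrative numbers on that (these are the standard DOF formulas with an assumed 0.030 mm circle of confusion for full frame, not anyone's test figures), total DOF roughly quadruples each time the camera-to-subject distance doubles, and it runs away toward infinity as focus approaches the hyperfocal distance:

```python
# Standard DOF math, full-frame circle of confusion c = 0.030 mm assumed.
# Shows how fast total DOF grows as camera-to-subject distance increases.
def total_dof_ft(focal_mm, f_number, subject_ft, coc_mm=0.030):
    s = subject_ft * 304.8                                 # feet -> mm
    H = focal_mm**2 / (f_number * coc_mm) + focal_mm       # hyperfocal distance (mm)
    near = s * (H - focal_mm) / (H + s - 2 * focal_mm)
    far = s * (H - focal_mm) / (H - s) if s < H else float("inf")
    return (far - near) / 304.8

for feet in (5, 10, 20, 40):
    print(f"85mm f/2.8 at {feet} ft: total DOF ≈ {total_dof_ft(85, 2.8, feet):.2f} ft")
# -> roughly 0.17, 0.69, 2.8, and 11.5 ft: nowhere near linear.
```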

There's a point where the full-frame 24x36mm sensor and normal, common lenses like the 50mm, 70-200, 85mm, 100mm, and 135mm offer a combination of shallow DOF and a high degree of background blurring, which can be used for creative effects. What happens with smaller sensors, like m4/3, APS-C, and smaller, is that there is a LOT of depth of field beginning in the 10- to 20-foot camera-to-subject range. The toughest thing to wrap one's head around when using online DOF calculators is this: DOF is easy to quantify with numbers, and it might seem that there's little difference between one format size and another, but the degree of defocus in the background (meaning the amount of de-focus, not the bokeh quality) is tricky to describe, delineate, or define with numbers. The problem is that the human brain and eye can "recognize" many things in a backdrop unless that backdrop is really, really defocused.

A second problem relating to DOF is that with smaller sensors, the wider-angle focal lengths are VERY short! This means shooting at 17, 18, or 19mm on APS-C gives almost totally hyperfocal focus at CLOSE ranges, so getting OOF backdrops on wide-angle shots is impossible, or very difficult, or requires crazy-short, crazy-fast wide-angle lenses.
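A quick hyperfocal sketch shows why; the 0.020 mm circle of confusion used here for a roughly 1.5x crop sensor is an assumed, commonly used value:

```python
# Hyperfocal distance for a short wide-angle focal length on APS-C.
# Assumed circle of confusion for a ~1.5x crop sensor: c = 0.020 mm.
def hyperfocal_m(focal_mm, f_number, coc_mm=0.020):
    return (focal_mm**2 / (f_number * coc_mm) + focal_mm) / 1000.0

for f_number in (2.8, 5.6, 8):
    print(f"18mm at f/{f_number}: hyperfocal ≈ {hyperfocal_m(18, f_number):.1f} m")
# Focus at the hyperfocal distance and everything from half that distance to
# infinity is acceptably sharp -- which is why OOF backdrops are so hard to get.
```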

There are basically three DOF ranges: close-in, moderate, and longer-range. DOF does NOT behave in a linear, perfect manner. Small-sensor cameras CAN create some (note: 'some') shallow-DOF effects at macro and close-up ranges, but at medium and longer ranges, small-sensor cameras tend toward deep DOF and easily-recognizable backdrops. Medium- and large-format cameras (6x6, 6x7, 6x8, and 6x9 cm images being medium format, and 4x5, 5x7, and 8x10 inch being large format) create images with shallow to incredibly shallow DOF.

The FORMAT's SIZE, and the lenses used to make each and every picture angle of view, or every single picture frame size, have a MAJOR influence on DOF, in most cases in close-up to medium-range situations.

Seriously... listen to me, listen to Ysarex. This has been covered here a dozen times or more over the past decade. Listen to Ysarex, not idiots on YouTube who cannot tell the difference between an APS-C shot, an iPhone shot, and a medium format film shot. If you have ***ever*** shot a 6x6 cm film camera, you KNOW that format size can play a HUGE part in the kind of DOF you get, and the types of images you can create.

Bob Atkins has some fantastic articles online that show the differences between FX (FF) and APS-C cameras in regard to DOF, as well as an excellent field-of-view and DOF calculator.

Hi Derrel. Here we go again -- kinda like trying to slay a hydra. ;)

Joe
 
Destin: I'll do the same as a control - D750, AF 50mm f/1.8D Nikkor.

Darryl: Destin and I are looking at this from different angles - I believe the sensor size has a direct effect on DoF, while Destin believes it is indirect. I will make my shots tomorrow evening. I think we've hijacked this thread enough, so we should probably post results in a new thread.

Sounds good. I’ll be doing mine late tonight if I’m awake enough.

To keep it even I’ll use my 50mm 1.8G.
When a Nikon full-frame camera is switched to crop (DX) mode, is only the portion of the sensor matching the size of the APS-C sensor being used, or is the full-frame image just cropped to APS-C size via software/firmware? If the sensor's zone of activation changes, that may be a real way to test (or demonstrate) it.

The zone of activation changes, which is how the crop factor is formed. The camera essentially just crops the image in software after you press the shutter but before you save it.

That’s exactly how I’ll test it on my D750 when I get home.

If I take an image in Fx mode and then crop it to match an image taken in Dx mode, I’m guaranteeing that I’ll get the same exact image.

You can't crop the FX image! If you do then it's not an FX image is it?
When you do your testing: It's not a valid comparison if you don't use the two sensor sizes to take the same photograph. If you take different photos with the different sensor sizes you have no comparison. If you crop an FX image to DX size then it's a DX image, duuuuh! Again you are thinking about DOF as though it is something that occurs or is defined on the sensor. It is not. Both sensors must be used without cropping to take the same photograph.

Joe

And I’ll take the same image with my D500 as a safety net to compare.

Extremely interested to see the results. I’ll post them when I can.

If there's a visible difference in depth of field I'll completely admit that I'm wrong.

No. Not even close.

So now you’re telling me that cropping an image after the fact is going to change DOF?

My entire point here is that if all other variables are kept completely the same, sensor size will have no effect on the depth of field as viewed in the final image.

Obviously if I move the full frame camera closer to maintain the same field of view, then I’m going to get a thinner depth of field. But the depth of field in that case isn’t getting thinner because of the sensor size directly, but rather because the subject distance was reduced.

Therefore the only way to compare this precisely is to take the photos with all other variables identical and then crop the full frame image to match for comparison sake. If the full frame sensor really causes thinner depth of field as you’re stating, even after cropping the background will appear more blurred than it did on the crop sensor. But I’m calling bull. It’ll end up looking exactly the same. And therefore the depth of field is not affected directly by the sensor size.
 

So now you’re telling me that cropping an image after the fact is going to change DOF?

In fact it does, but that's not what you're doing. You're cropping the sensor before the fact.

My entire point here is that if all other variables are kept completely the same, sensor size will have no effect on the depth of field as viewed in the final image.

Think. If that's your point then what you're saying is that the DOF from a 50mm lens at f/8 focused on a subject 10 feet away shot with a DX size sensor is the same as the DOF from a 50mm lens at f/8 focused on a subject 10 feet away shot with a DX size sensor. Why YES! and duuuh. If you crop an FX sensor down to a DX sensor then it's a DX sensor.

Obviously if I move the full frame camera closer to maintain the same field of view, then I’m going to get a thinner depth of field. But the depth of field in that case isn’t getting thinner because of the sensor size directly, but rather because the subject distance was reduced.

You can't move the FX camera closer. Then you're not taking the same photograph. For the two photographs to be the same they have to be taken from the same place. Otherwise perspective is changed by the camera move.

Therefore the only way to compare this precisely is to take the photos with all other variables identical and then crop the full frame image to match for comparison sake.

Then you're comparing DX with DX -- no variation at all.

If the full frame sensor really causes thinner depth of field as you’re stating, even after cropping the background will appear more blurred than it did on the crop sensor.

You're still trying to define DOF as something that happens on the sensor. DOF is not defined at the sensor or film level. There is no such thing. The effect that sensor size has on DOF isn't some alchemy that takes place on the sensor surface because the sensor dimensions are different. DOF is measured ONLY on the final image and you must use the entire uncropped sensor image from both cameras or sensor sizes to make the comparison.

Joe

But I’m calling bull. It’ll end up looking exactly the same. And therefore the depth of field is not affected directly by the sensor size.
 
Destin said:
So now you’re telling me that cropping an image after the fact is going to change DOF?

OMG--YES. That **fact** is usually found on the second page of the chapter dealing with photography and optics. Cropping an image after the fact **will alter** the depth of field.

YES. YES, yes, yes, yes. Cropping a negative or digital image down from one size to another, and then making a printed image, can and will alter the depth of field. Think of it like the yeast in a pizza crust: more or less yeast makes a thicker or a thinner crust even when the finished pizza remains 16 inches in diameter; likewise, cropping changes the DOF even though the final print stays the same size. Your incredulity is understandable, but it comes from not following the image all the way through to that final, same-sized print.

I cannot blame you for not understanding this, since you grew up in the era of the Internet and YouTube, where everybody is an 'expert', and the science underlying so many things has been obscured by bullspit.

Like killing a hydra indeed...
 
You get a blurry background by:

1. First and foremost separating your subject from the background -- distance -- the more the better.
2. Second, separate yourself from the subject and use a long lens to reduce the field of view -- that's again distance -- the more the better.
3. Third, use a large aperture opening on the lens (e.g. f/2 instead of f/11).
4. Fourth, use a larger as opposed to smaller recording media -- larger sensor/film camera.

Use all four of the above together for maximum effect. I listed them in priority order, although it's arguable whether 2 should come before 1. (A rough numeric sketch of the relative sizes of these effects follows below.)

Joe
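Here is a rough numeric sketch of the four factors above. The numbers are my own illustrative assumptions (not anything from the posters' tests), using the thin-lens background-blur formula b = (f/N) · m · (d_bg − d_subj)/d_bg, expressed as a fraction of picture height:

```python
# Background blur disc as a fraction of picture height (thin-lens sketch).
# b = (f / N) * m * (d_bg - d_subj) / d_bg, with subject magnification m = f / (d_subj - f).
# All numbers below are illustrative assumptions.
def blur_fraction(focal_mm, f_number, d_subj_mm, d_bg_mm, sensor_h_mm):
    m = focal_mm / (d_subj_mm - focal_mm)
    b = (focal_mm / f_number) * m * (d_bg_mm - d_subj_mm) / d_bg_mm
    return b / sensor_h_mm

M = 1000.0  # mm per metre
print("baseline: 85mm f/2.8, subject 3m, background 6m, FF :", round(blur_fraction(85, 2.8, 3*M, 6*M, 24), 4))
print("1. background pushed back to 30m                    :", round(blur_fraction(85, 2.8, 3*M, 30*M, 24), 4))
print("2. 200mm f/2.8 from 7m (same framing, same bg gap)  :", round(blur_fraction(200, 2.8, 7*M, 10*M, 24), 4))
print("3. open up to f/1.8                                 :", round(blur_fraction(85, 1.8, 3*M, 6*M, 24), 4))
print("4. ~1.5x crop, 56mm for the same framing            :", round(blur_fraction(56, 2.8, 3*M, 6*M, 16), 4))
```

A bigger number means a bigger blur disc relative to the frame: the first three moves all increase it, while the smaller sensor (used to take the same photograph from the same spot) decreases it.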
Great minds! Our second point differs, but I'm sure you have set it forth better.

On that 2nd point - I think a hybrid of what you're both trying to say works best. I've found that using a long focal length and then zooming in or framing in close to the subject gives a more blurred background.

I've found that with the 50mm you have to get in really close to achieve that full blur. The 200mm end of a zoom or the 85mm or 105mm primes will be a big improvement.

You don't need to get that close with a 50mm to blur the background; you just need a bigger subject-to-background distance.
 

Derrel, if I may provide an example:

dof_one.jpg


Only the boy in the back is visibly out of DOF. But in a cropped version (same print size) from the same film negative we see the girl with the white collar is out while the girl with the plaid scarf is questionable. In a larger print the girl with the plaid scarf definitely goes out:

dof_two.jpg


I think it really stems from wanting to define DOF as happening at the film/sensor level rather than at the final print level. They're thinking: once it's recorded on the film/sensor it must be permanently fixed. Destin (hey Destin, sorry, I don't want to sound like I'm talking about you behind your back -- Derrel & I have done this a dozen-plus times now, for literally years) is going through the exact same progression we've seen many times before. "The lens fixes the DOF at the moment of exposure, and if we crop a section out of an FX image that's the same size as a DX image, the DOF must be the same." They're thinking about what's happening in the camera without following through with the process. DOF is defined ONLY on the final print/display image.

Joe

Destin, above all other factors and in simplest terms, DOF is a function of magnification. Lens FL + distance is magnification. Enlarging the film/sensor image to a print/display is magnification. Recording the same image on different size sensors is different levels of magnification. Cropping is magnification. Viewing the image as a print in your hands versus a print on the wall is different levels of magnification. All variables of magnification involved in realizing the final print/display image must be (and are) accounted for in the math that calculates DOF. You find sensor size in the math in the variable c (circle of confusion). DOF calculations require a value for c, and the value we use for c is predicated upon (changes with) sensor/film size. As such we're making allowances in the math for required variations in magnification due to sensor size. You have to make the final image and use the whole sensor -- cropping changes DOF.
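A small sketch of that math, with c tied to the diagonal of the frame area actually used (c = diagonal/1500 is one common convention -- an assumption here, as are the example numbers). Shrink the frame you use and c shrinks, so the computed DOF shrinks too:

```python
import math

# DOF with the circle of confusion c derived from the *used* frame size.
# Convention assumed here: c = frame diagonal / 1500.
def total_dof_ft(focal_mm, f_number, subject_ft, frame_w_mm, frame_h_mm):
    c = math.hypot(frame_w_mm, frame_h_mm) / 1500.0
    s = subject_ft * 304.8                                  # feet -> mm
    H = focal_mm**2 / (f_number * c) + focal_mm             # hyperfocal distance
    near = s * (H - focal_mm) / (H + s - 2 * focal_mm)
    far = s * (H - focal_mm) / (H - s) if s < H else float("inf")
    return (far - near) / 304.8

# Same 50mm lens, same f/8, same 10 ft subject distance -- only the frame area used changes:
print(total_dof_ft(50, 8, 10, 36.0, 24.0))   # full FX frame  -> about 6 ft of DOF
print(total_dof_ft(50, 8, 10, 23.5, 15.6))   # DX-sized crop  -> about 3.7 ft of DOF
```

Same capture, different amount of enlargement to reach the same final print, different DOF -- which is the point of the two example prints above.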
 

I’ll agree to disagree.

The area of the image that has acceptable sharpness, measured perpendicular to the sensor plane (depth of field), will absolutely, 110%, not be altered by cropping the image.

So you're telling me that if I take a portrait with a full frame camera and the depth of field is so shallow that only the eye is in focus... that by cropping the image my depth of field will magically grow and cause more of the image to be in acceptably sharp focus?

No. Because cropping does nothing but trim the image. You might be cropping off some of the out-of-focus area... but that does not change the depth of field recorded in the original image.

There is zero chance that I will ever agree with you on this. Period.

If there is a 4.3 foot depth of field in the original image, and I crop the image.. the depth of field is still 4.3 feet. You’ve just cropped off some of the image.
 

No. You aren't altering depth of field in that case (cropping and enlarging the print).

You’re magnifying the image and making the edges of the original depth of field more apparent by essentially zooming in on the image, but you ARE NOT actually changing the physical depth of field with which the image was captured. It’s physically impossible.

The girl was equally out of focus in the uncropped image, you just couldn’t see it as clearly because you weren’t looking as closely.

You may be changing the *effective* depth of field in the print. You are NOT physically changing the depth of field that the lens originally captured.
 
Obviously when you *zoom* in on an image you get a better view of what was *actually* sharp in the first place.

At lower magnification levels you can’t make out the minor imperfections as well.

But that doesn’t mean the imperfections don’t exist. It just means you aren’t looking close enough to see them.

It’s the same reason that a sensor with higher megapixel count makes your focus accuracy more critical: the higher resolution makes the flaws that were *originally there* more apparent.

By your example, the depth of field would also have to become thinner with higher resolution sensors because you can better see which areas are in focus. But that isn’t the case; sensor resolution has no effect on depth of field in the same way that cropping to change the pixel size has no effect on depth of field.
 

There is no physical depth of field with which the image was captured. DOF is not defined on the film/sensor. I think I've typed this a few times now: DOF is defined ONLY on the final print/display image. This is the root of the misunderstanding. DOF is well understood and has been defined in this discipline for at least 100 years before you were born. If you want to talk about DOF then you have to talk about DOF and not some other notion that you've recently concocted.

Joe


Here is what we are going to test:
1. D750 with 50mm on tripod focused on stationary objects - I'm going to use cars in a parking lot.
2. Keep image size at FX to start and shoot whole stop apertures (2.8, 4, 5.6 . . . 22)
3. Change image size to DX, and run through apertures like above.
4. Running auto WB, constant ISO, JPEG Fine and Aperture priority.
5. Compare the FX to the DX with the same aperture without further processing.

Will there be a difference in DOF?
If I crop the FX image to the same area as the DX image, will there be the expected difference (FX DoF < DX DoF)?
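For what it's worth, here is what the standard formulas predict for exactly that test, before any cropping. It's a sketch with assumed values (c = 0.030 mm for the FX frame, 0.020 mm for DX, subject assumed at 15 ft), not actual results. Note that with the lens, distance, and aperture fixed, the tables give the FX frame more DOF than the DX frame, because the smaller frame gets enlarged more to reach the same final image size; and by the same math, cropping the FX file to the DX area simply reproduces the DX column.

```python
# Predicted DOF for the proposed test: 50mm lens, fixed subject distance (assumed
# 15 ft), whole-stop apertures. The only thing that differs between the two
# image-area settings is c: 0.030 mm (FX) vs 0.020 mm (DX) -- commonly used
# values, assumed here. 'inf' means focus is past the hyperfocal distance.
def total_dof_ft(focal_mm, f_number, subject_ft, coc_mm):
    s = subject_ft * 304.8
    H = focal_mm**2 / (f_number * coc_mm) + focal_mm
    near = s * (H - focal_mm) / (H + s - 2 * focal_mm)
    far = s * (H - focal_mm) / (H - s) if s < H else float("inf")
    return (far - near) / 304.8

for N in (2.8, 4, 5.6, 8, 11, 16, 22):
    fx = total_dof_ft(50, N, 15, 0.030)
    dx = total_dof_ft(50, N, 15, 0.020)
    print(f"f/{N:<4}: FX ≈ {fx:6.1f} ft   DX ≈ {dx:6.1f} ft")
```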
 
Obviously when you *zoom* in on an image you get a better view of what was *actually* sharp in the first place.

At lower magnification levels you can’t make out the minor imperfections as well.

But that doesn’t mean the imperfections don’t exist. It just means you aren’t looking close enough to see them.

And that is how our discipline defines DOF: It's defined by what you can see in the final print/display, not by what you can't. Talk about DOF if you want to talk about DOF but don't change the subject and talk about something else. Learn how DOF is in fact defined. You're not in a position at this point to be taken seriously if you're trying to redefine it.

Joe

 
Will there be a difference in DOF?
If I crop the FX image to the same area as the DX image, will there be the expected difference (FX DoF < DX DoF)?

If you crop the FX image to the same area as the DX image the expectation is there will be no difference. A DX sensor produces the same DOF as a DX sensor -- duuuh.

You're not testing the right thing. In order to run a valid test you have to use the two different size sensors to take the same photograph. Otherwise you don't have a comparison.

Joe
 
But, if I am using an FX sensor in DX mode, isn't that the same as using two different size sensors (and thus changing the Circle of Confusion)?
 
Learn how DOF is in fact defined. You're not in a position at this point to be taken seriously if you're trying to redefine it.

You act like I actually care if anyone on an Internet forum takes me seriously.

I've researched this topic to exhaustion in the last hour; it seems that half of the truly respected professional photographers agree with you and the other half agree with me.

There are textbooks that teach it one way, and textbooks that teach it another.

Like I said, we’re essentially saying the same thing and wording it differently. By cropping the image you may change the *effective* or *apparent* depth of field.

But the amount of the image that was physically sharp when you captured the image does not actually change, no matter how much it appears to have done so.

If you want to get technical about what existed when the photo was taken, we should be comparing 1:1 crops of the original image, which again would be affected by sensor resolution. And yet sensor resolution is never taken into account in a DOF calculator.

Even doing that, the results will end up being extremely subjective because we all define differently what we consider to be sharp.

And while we're applying your fluid definition of depth of field, it would also change based on the subjective judgment of whoever is viewing it. Therefore you could never actually measure it, because we'd all see different limits on where the sharpness of the in-focus area ends.
 
