I'm writing a spreadsheet that will allow me to enter the timed shutter speed of a camera, then compare that to the actual time set. I'm able to easily convert the difference to a percentage, but what I'd like to do is go one step further and have a formula that converts that into how many stops the shutter speed is deviating from the ideal.
Meaning: Say the shutter speed is set to 1/60, which is 0.01666667 sec. I take a shutter timer and find the shutter is actually open 0.02387 sec. It's easy to divide the two and say the shutter is roughly 50% slower than ideal, but how do I convert that to an f/stop?
Is it overexposing 1/4 of a stop? 1/3? 1/2? My brain just ain't wired to be able to convert the linear percentages to the non-linear world of f-stops.
I could probably calculate out all the various deviations for each shutter speed (1/4, 1/3, 1/2, 2/3, 3/4 etc.) then nest a boat-load of IF-THEN statements to figure it out, but that seems like the long route. I'd think there's a formula or two that could spit out the result.
I hope this makes sense.
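For what it's worth, one way to do this without any nested IFs is a base-2 logarithm of the ratio: since each stop is a doubling of exposure time, the deviation in stops is log2(measured / nominal). In spreadsheet terms that would be something like =LOG(A2/B2, 2), with A2 and B2 standing in for hypothetical measured/nominal cells. A quick Python sketch of the same idea, using the numbers from the post:

```python
import math

def stops_deviation(measured: float, nominal: float) -> float:
    """Exposure deviation in stops.

    Each stop doubles the exposure time, so the deviation is the
    base-2 log of the ratio. Positive = shutter open too long
    (overexposing); negative = too short (underexposing).
    """
    return math.log2(measured / nominal)

# Example from the post: nominal 1/60 sec, timer reads 0.02387 sec
dev = stops_deviation(0.02387, 1 / 60)
print(f"{dev:+.2f} stops")  # about +0.52, i.e. roughly half a stop over
```

So a shutter that runs about 43% long works out to roughly half a stop of overexposure, which matches the intuition that +1 stop would be a full doubling (100% long).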