Reduce Noise! New Way!

If you want to capture light your eyes can't see, this technique helps bring out detail in subjects such as the Milky Way. You need a high ISO to record the faint light, and by taking several exposures and stacking them you get a cleaner result. The Earth's rotation can become a problem when the process takes that long.
 
Aye, manual or motorized equatorial mounts are used to counter the Earth's rotation, so the stars stay points instead of trailing into lines.
 
I have read the postings so far and I am intrigued by the possibilities for low-light landscape imagery. One question: I have noticed that when I do HDR, it shows up the dust on my sensor like you wouldn't believe (I didn't believe it the first time it happened). Does this approach to "noise reduction" make dust on the sensor stand out more than it would if you were just shooting one shot? Just curious.

WesternGuy
 
Dust on the sensor normally shows up when you use small apertures (as in typical landscape or macro photography). Sadly, the only cures are cleaning the sensor, trying to keep it clean, and using the spot-heal tool. Since most stacking programs hunt for and remove randomly appearing data, repeat shots will only reinforce dust spots when the frames are merged, because the spots always appear in the same place. So it may be something you have to go through and remove manually from each shot (I'm not sure whether Lightroom lets you apply spot-heal corrections across multiple shots at once?)
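To make that concrete, here's a quick NumPy sketch (my own illustration, not output from any particular stacking program, with made-up sky and noise levels): random noise averages away when you mean-stack frames, but a dust shadow that sits in the same place in every frame survives at full strength.

```python
import numpy as np

rng = np.random.default_rng(42)

N_FRAMES = 50
H, W = 64, 64
SKY = 100.0        # uniform "sky" signal level (arbitrary units)
NOISE_STD = 20.0   # per-frame random sensor noise

frames = []
for _ in range(N_FRAMES):
    frame = SKY + rng.normal(0.0, NOISE_STD, size=(H, W))
    frame[30:34, 30:34] -= 40.0   # dust shadow, identical in every frame
    frames.append(frame)

stack = np.mean(frames, axis=0)

# Random noise shrinks roughly by sqrt(N_FRAMES)...
print(round(np.std(stack[:20, :20]), 2))          # clean corner, far from the dust
# ...but the dust shadow survives the average at full depth (~40 units).
print(round(SKY - stack[31:33, 31:33].mean(), 1))
```

The stack's clean regions end up with roughly 1/7th of the single-frame noise, while the dust spot is exactly as dark as before — which is why it ends up looking *more* obvious against the smoother background.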
 
I think black frame noise reduction can help with both hot pixels and sensor noise. I'm not sure, though.
 
Black frame noise reduction really only helps eliminate sources of noise that are determined by the duration of the exposure, that is, hot pixels and sensor bleeding. Other noise is quite random, and the result of averaging a black frame with a normal frame is simply a picture that's half as bright. Astrophotography stacking software uses black frames for the same reason: to eliminate hot pixels and bleeding. Black frames shot at the camera's fastest shutter speed are used to eliminate errors from the read-out process, and the rest of the noise is averaged out across the light frames.
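A quick sketch of the dark-frame part (my own toy numbers, assuming the hot-pixel pattern repeats exactly between a light frame and a lens-cap frame of the same duration): subtracting the dark frame cancels the repeatable pattern but does nothing for the random component.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W = 32, 32

# Hot pixels accumulate charge with exposure time, so they show up
# in a dark frame (same exposure, lens cap on) as well as the light frame.
hot = np.zeros((H, W))
hot[5, 7] = 180.0
hot[20, 11] = 150.0

light = 60.0 + rng.normal(0, 8, (H, W)) + hot   # scene + random noise + hot pixels
dark = rng.normal(0, 8, (H, W)) + hot           # lens-cap frame: noise + hot pixels

# Subtracting (not averaging) the dark frame cancels the repeatable
# hot-pixel pattern; the random noise is untouched (in fact the dark
# frame's own random noise gets added in).
calibrated = light - dark
print(calibrated[5, 7])   # back near the 60-unit sky level, not near 240
```

That last point is why stacking still matters after dark subtraction: the subtraction only handles the exposure-dependent part, and the random remainder has to be averaged out across the light frames.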

There's a real benefit to noise in astrophotography. Because the noise is random, once your sensor's average level is above the black point (when you average many high-ISO pictures together you won't have a single black pixel in the final result), any photon that hits a pixel statistically raises its chance of recording a lighter value. This is why, when you look at the result of a night's shooting, you'll see 100 very dark frames full of noise, but when you stack them you'll get all sorts of wonderful colours of space that didn't exist in any individual shot, no matter how much you play with curves in Photoshop.
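The statistics of this are easy to demonstrate (a minimal sketch with invented numbers, treating each pixel as signal plus independent Gaussian noise): a signal far below the single-frame noise floor becomes visible once you average enough frames, because the noise shrinks like the square root of the frame count while the signal doesn't.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 100            # number of light frames
SIGNAL = 3.0       # faint nebula: far below the per-frame noise floor
NOISE_STD = 25.0

# Each frame is the same faint signal plus independent random noise.
frames = SIGNAL + rng.normal(0.0, NOISE_STD, size=(N, 1000))

single_snr = SIGNAL / frames[0].std()
stacked = frames.mean(axis=0)
stacked_snr = SIGNAL / stacked.std()

print(round(single_snr, 2))   # well under 1: invisible in any one frame
print(round(stacked_snr, 2))  # roughly sqrt(N) times higher after stacking
```

With 100 frames the signal-to-noise ratio improves about tenfold, which is exactly the "colours that didn't exist in the individual shots" effect.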

Anyway, your process in action: http://www2.garbz.com/stars.jpg This image is the result of a bunch of 25-second exposures shot at ISO 1600 with a camera widely regarded as one of the worst performers in the high-ISO noise department.

For once your coffee-fuelled over-thinking has led to an incredibly sound theory :)
 
Garbz said:
Black frame noise reduction really only helps eliminate sources of noise that are determined by the duration of the exposure. [...]

For once your coffee-fuelled over-thinking has led to an incredibly sound theory :)

I drink tea. :)
 
Garbz said:
[...] the result of averaging a black frame and a normal frame is a picture that's half as bright. [...]

You don't get an image half as bright though. I think there's a blending mode that just removes the noise.
 
I just thought of a new way to reduce noise (maybe it's old, but it's new to me). Since noise is random, shooting more of the same shots should neutralise it. I think that if we stack the photos, we will get less noise. I'll try it out tomorrow.
I wonder how the NCAA will react when I ask teams for do-overs so I can stack shots at a night football game or basketball game?
 
I just thought of a new way to reduce noise (maybe it's old, but it's new to me). Since noise is random, shooting more of the same shots should neutralise it. I think that if we stack the photos, we will get less noise. I'll try it out tomorrow.
I wonder how the NCAA will react when I ask teams for do-overs so I can stack shots at a night football game or basketball game?

This is a good point: if a technique isn't useful for shooting a basketball game, how can it have any relevance to any photographer shooting anything else?
 
I just thought of a new way to reduce noise (maybe it's old, but it's new to me). Since noise is random, shooting more of the same shots should neutralise it. I think that if we stack the photos, we will get less noise. I'll try it out tomorrow.
I wonder how the NCAA will react when I ask teams for do-overs so I can stack shots at a night football game or basketball game?
This is a good point: if a technique isn't useful for shooting a basketball game, how can it have any relevance to any photographer shooting anything else?
Wait... What... You mean there are other kinds of photography besides sports photography??? Who would have thunk it!
 
You don't get an image half as bright though. I think there's a blending mode that just removes the noise.

That would clobber shadow detail. The only way to remove noise that is statistically random (like sensor noise) without affecting the detail of the image is to use two light frames with identical detail, which lets you average out the noise sample. As soon as one of the frames contains no data, it's impossible to separate the random noise from the signal. The dark frame, as mentioned, can only be used for noise that is statistically repeatable; in essence, the dark frame makes the noise on your image MORE random. And even then it doesn't always work properly: take an image with an incredible amount of light pollution and the in-camera long-exposure noise reduction will actually convert your hot pixels into cold pixels instead.
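That last failure mode is worth spelling out (a toy sketch with invented levels, assuming the light frame clips at the sensor's ceiling while the dark frame doesn't): when the polluted sky pushes a hot pixel into clipping, the subtraction removes the hot pixel's *full* dark-frame value from a *clipped* light-frame value, leaving it darker than the surrounding sky.

```python
import numpy as np

# Heavily light-polluted sky sitting near the top of the sensor's range.
SKY = 240.0
HOT = 120.0        # extra charge a hot pixel accumulates over the exposure
FULL_WELL = 255.0  # sensor ceiling

light = np.full((4, 4), SKY)
light[1, 2] += HOT                      # hot pixel in the light frame...
light = np.clip(light, 0, FULL_WELL)    # ...but it clips at the ceiling

dark = np.zeros((4, 4))
dark[1, 2] = HOT                        # the dark frame records the full hot value

# In-camera long-exposure noise reduction subtracts the dark frame.
result = light - dark

print(result[0, 0])   # 240.0: surrounding sky unchanged
print(result[1, 2])   # 135.0: the hot pixel is now darker than the sky
```

The 360-unit hot pixel clipped to 255, so subtracting its true 120-unit dark value leaves 135 — a "cold" pixel sitting in a 240-unit sky.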
 
