Hello, I'm pretty sure this has been asked before.
How short can an exposure be for photometry, considering that we need to average out the effects of scintillation while also avoiding saturated pixels?
Could I capture one-second exposures, say, and stack 10 of them to get a reliable light frame that can then be calibrated using biases, flats and darks?
A single three-, five-, or ten-second frame could have saturated stars. Any suggestions? Thanks.
Yes, that’s the way to do it. The advice I have heard for averaging out scintillation ranges from five to ten seconds of total exposure time. Five to ten one-second exposures would work well if the one-second sub keeps you from nearing saturation. I’d play it safe and stay well back from saturation: the max pixel values for your comp, check, and target stars can move around a lot as seeing changes from second to second. If your system goes nonlinear at, say, 50,000 ADU, I’d watch a number of subs and keep the highest under 30,000. That’s particularly true if you are doing a time series. As the target gets higher in the sky, extinction goes down and seeing improves, and it can be a big effect.
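As a rough sketch of that headroom check, here is how you might scan the peak pixel value across several subs before committing to an exposure time. The 50,000/30,000 ADU numbers come straight from the advice above; the function name and the synthetic frames are just illustrative.

```python
import numpy as np

# Numbers from the advice above: sensor assumed nonlinear near 50,000 ADU,
# so keep the brightest pixel in every sub under a 30,000 ADU ceiling.
NONLINEAR_ADU = 50_000
SAFE_CEILING = 30_000

def peaks_within_ceiling(sub_frames, ceiling=SAFE_CEILING):
    """Check the brightest pixel of each sub against the safety ceiling.

    sub_frames: iterable of 2-D arrays of raw ADU counts.
    Returns (all_ok, list_of_peaks). Seeing changes second to second,
    so look at a run of subs rather than a single frame.
    """
    peaks = [int(frame.max()) for frame in sub_frames]
    return max(peaks) < ceiling, peaks

# Synthetic one-second subs whose stars top out around 25,000 ADU:
rng = np.random.default_rng(0)
subs = [rng.integers(500, 25_000, size=(100, 100)) for _ in range(10)]
ok, peaks = peaks_within_ceiling(subs)
```

If `ok` comes back False, the natural move is to shorten the sub exposure and stack more of them.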
I suggest calibrating each sub and then stack the calibrated images for your final data image.
Andy Young wrote a nice series of articles about scintillation in the Astronomical Journal about 30 years ago. Radu Corlan programmed the main equation and created a set of tables of typical scintillation values as a function of telescope aperture, exposure time, elevation and signal/noise. These tables can still be found at http://astro.corlan.net/gcx/scint.txt
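For reference, the approximation usually quoted from Young's work scales scintillation noise with aperture, airmass, site altitude and exposure time. This sketch uses the commonly cited coefficients (0.09, the 8000 m scale height, the 1.75 airmass exponent), not the exact equation or tables Radu programmed, so treat the absolute numbers as ballpark:

```python
import math

def scintillation_sigma(aperture_cm, exptime_s, airmass, altitude_m):
    """Fractional scintillation noise from the often-quoted Young approximation:
        sigma ~ 0.09 * D^(-2/3) * X^1.75 * exp(-h/8000) / sqrt(2*t)
    D is aperture in cm, t is exposure time in seconds, h is site
    altitude in meters. Coefficients are the textbook values, an
    approximation rather than Radu's tabulated results.
    """
    return (0.09 * aperture_cm ** (-2.0 / 3.0)
            * airmass ** 1.75
            * math.exp(-altitude_m / 8000.0)
            / math.sqrt(2.0 * exptime_s))

# 20 cm scope, airmass 1.2, 500 m site: compare 1 s vs 10 s exposures.
one_sec = scintillation_sigma(20, 1, 1.2, 500)
ten_sec = scintillation_sigma(20, 10, 1.2, 500)
```

Note the 1/sqrt(t) dependence: ten seconds of total exposure, whether one frame or ten stacked one-second subs, cuts the scintillation noise by about a factor of three relative to a single second.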
but I don’t guarantee their existence for much longer - Radu has to be as old as I am by now! Basically, what Walt says is correct - you can stack images to remove scintillation. Stacking in general can decrease the signal/noise slightly because you have a read noise hit from every image, but with modern detectors and a bright target, this is rarely important.
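The read-noise hit from stacking can be put in numbers with a simplified CCD signal-to-noise equation (photon noise plus one read-noise term per sub; dark current ignored). The electron counts below are made up to represent a bright target:

```python
import math

def stacked_snr(signal_per_sub, sky_per_sub, read_noise_e, n_subs):
    """SNR of n averaged subs using a simplified CCD equation:
    photon noise from star and sky, plus read noise once per sub.
    All quantities in electrons; dark current omitted for brevity.
    """
    total_signal = n_subs * signal_per_sub
    total_noise_sq = n_subs * (signal_per_sub + sky_per_sub + read_noise_e ** 2)
    return total_signal / math.sqrt(total_noise_sq)

# Hypothetical bright target: 100,000 e- per one-second sub,
# modest sky, 5 e- read noise.
snr_stacked = stacked_snr(100_000, 1_000, 5.0, 10)   # ten 1 s subs
snr_single = stacked_snr(1_000_000, 10_000, 5.0, 1)  # one 10 s frame
penalty = 1 - snr_stacked / snr_single
```

With numbers like these the fractional SNR penalty from paying read noise ten times instead of once is well under 0.1%, which is the point above: with modern low-read-noise detectors and a bright target, stacking costs essentially nothing.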