
Image weights

Which weighting algorithm is best? When should images be discarded?

This page aims to provide the information you need to make an informed judgment.

Photon noise

[Image: photon noise simulation. Mdf, CC BY-SA 3.0, via Wikimedia Commons]

A photon noise simulation, using a sample image as a source and a per-pixel Poisson process to model an otherwise perfect camera (quantum efficiency = 1, no read noise, no thermal noise, etc.). Going from left to right within each row, the mean number of photons per pixel over the whole image is:

(top row)     0.001     0.01      0.1
(middle row)  1         10        100
(bottom row)  1,000     10,000    100,000

Note the rapid increase in quality past 10 photons/pixel. (The source image was collected with a camera with a per-pixel well capacity of about 40,000 photons.)

In the low photon count images the signal consists mainly of isolated pixels, which may be downsampled to black; view the image at full size to see them.

Photon noise is the dominant source of noise in the images collected by most digital cameras on the market today. Better cameras can operate at lower light levels (specialized, expensive cameras can detect individual photons), but ultimately photon shot noise determines the quality of the image.
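
The figure's simulation is easy to reproduce. The sketch below is a minimal version (not the code used for the figure), assuming a 2D source image and an otherwise perfect camera; NumPy's Poisson generator plays the role of the per-pixel photon process.

```python
import numpy as np

def simulate_photon_noise(source, mean_photons, rng=None):
    """Simulate photon shot noise for an otherwise perfect camera
    (quantum efficiency = 1, no read noise, no thermal noise).

    source       -- 2D array of relative pixel intensities
    mean_photons -- mean number of photons per pixel over the whole image
    """
    rng = np.random.default_rng() if rng is None else rng
    expected = source / source.mean() * mean_photons  # expected photons per pixel
    detected = rng.poisson(expected)                  # per-pixel Poisson process
    return detected / mean_photons                    # rescale for display

# The nine exposure levels used in the figure above:
source = np.random.default_rng(0).random((128, 128))  # stand-in for a real image
for mean in (0.001, 0.01, 0.1, 1, 10, 100, 1_000, 10_000, 100_000):
    noisy = simulate_photon_noise(source, mean)
    residual = noisy - source / source.mean()  # the shot noise alone
    print(f"{mean:>9} photons/pixel: noise std {residual.std():.3f}")
```

The printed noise falls as the square root of the photon count, which is why the bottom row of the figure looks so much cleaner than the top row.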

Stacking images

We already know that if we stack enough images that are typical of the top left image, the result will be equivalent to the bottom right image. The process of image integration is equivalent to adding the images together.

How does this change if the image quality varies? If we add the middle left image to the middle right image without weighting, the result would be worse than the middle right image alone. However, provided we weight the images by precisely the correct amount, in theory the image quality will always improve. To achieve this signal to noise ratio improvement, the weighting formula must be:

(Signal to Noise ratio)²

This is the formula used by NSG.
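
In code, the resulting weighted average looks like the sketch below. Here `images` holds registered, normalized frames and `snr` their estimated signal to noise ratios; estimating those values from the data is NSG's job and outside the scope of this sketch.

```python
import numpy as np

def weighted_stack(images, snr):
    """Average registered, normalized frames with weights of (SNR)².

    images -- sequence of 2D arrays, registered and normalized to the
              same signal level
    snr    -- estimated signal to noise ratio of each frame
    """
    weights = np.asarray(snr, dtype=float) ** 2  # weight = (SNR)²
    weights /= weights.sum()                     # normalize to sum to 1
    stack = np.zeros_like(images[0], dtype=float)
    for image, w in zip(images, weights):
        stack += w * image
    return stack
```

For frames normalized to the same signal level, (SNR)² weighting is equivalent to inverse variance weighting, and the stacked signal to noise ratio then satisfies SNR² = SNR₁² + SNR₂² + ..., which is why, in theory, even a very noisy frame never makes the stack worse.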

Minimum weight

In practice, a minimum weight criterion is useful:

  1. A very noisy image will interfere with data rejection. It becomes difficult to differentiate between the shot noise and image artefacts (for example, satellite trails). For this reason, images with weights less than 0.25 should usually be rejected.
  2. An image affected by clouds will have less signal (lower transmission) and higher noise (light pollution reflected by the clouds), which will reduce its weight. In this case, not only will it interfere with data rejection, but the clouds may also produce complex gradients that are difficult to remove. Images with a transmission of less than 0.75 should usually be rejected.

NSG has the ability to reject images by both (1) weight and (2) transmission to provide the optimum image rejection.

If light pollution is minimal, the transmission cutoff dominates. In this case a transmission of 0.75 corresponds to a weight of 0.5, which may explain why a weight cutoff of 0.5 is often recommended. However, this is not suitable for light polluted sites, where it will often reject too many images. The NSG defaults of 0.75 for transmission and 0.25 for weight cover both dark sites and heavily light polluted ones.
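
The rejection logic itself is simple. The sketch below is a minimal illustration using the default cutoffs quoted above; the data layout is hypothetical, not NSG's actual interface.

```python
def select_frames(frames, min_weight=0.25, min_transmission=0.75):
    """Keep frames that pass both the weight and transmission cutoffs.

    frames -- iterable of (name, weight, transmission) tuples, with
              weight and transmission measured relative to the
              reference frame
    """
    kept, rejected = [], []
    for name, weight, transmission in frames:
        if weight < min_weight or transmission < min_transmission:
            rejected.append(name)  # too noisy, or too much cloud
        else:
            kept.append(name)
    return kept, rejected
```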

Star FWHM

FWHM image weights

A popular weighting strategy is to use star Full Width Half Maximum (FWHM) to provide a sharper, higher resolution final image. However, there are some significant drawbacks to this technique that you should be aware of.

From the section Stacking images, it is clear that weights should never be calculated from FWHM (or any other 'quality based' weighting system). Such a suboptimal algorithm will produce a lower signal to noise ratio. For optimum signal to noise ratio, the weighting algorithm must be (Signal to Noise ratio)².

Discarding high FWHM images

Rather than calculating weights from FWHM, we could instead use FWHM as a criterion for rejecting images. In theory this could produce sharper stars, but at the cost of lower resolution within fainter areas of the image (nebulae or galaxies). This lower resolution might surprise you; it's due to the relationship between signal to noise ratio and detail. As you reduce the number of images to be stacked, the signal to noise ratio will fall. As we can see from the Photon noise section, the top left image's low signal to noise ratio has catastrophically affected its resolution. The resolution gradually improves as the signal to noise ratio increases.
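
To put a number on that cost: for similar, shot-noise limited frames, the stacked signal to noise ratio grows with the square root of the frame count, so every discarded frame has a quantifiable price in the faint regions. A quick sketch:

```python
import math

def stacked_snr(snr_single, n_frames):
    """SNR of an average of n similar, shot-noise limited frames."""
    return snr_single * math.sqrt(n_frames)

# Discarding half of 100 similar frames costs a factor of sqrt(2),
# about 29% of the stacked signal to noise ratio:
print(stacked_snr(1.0, 100))  # 10.0
print(stacked_snr(1.0, 50))   # 7.07...
```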

The resolution of fainter areas within our images is usually limited by photon shot noise, not the arc-second seeing.

FWHM accuracy

Another potential problem is FWHM measurement accuracy. To explain this problem, let's assume we have perfect optics, and the FWHM is due to seeing and tracking errors. In this case, physics tells us that all stars should have the same FWHM.

However, the FWHM measurements usually don't reflect this truth. Fainter stars end up with better FWHM values. This systematic error is a problem; it rewards images with fainter stars. For example, an image affected by light cloud may score better than an image with better transparency, simply due to this measurement error. Used without caution, FWHM image rejection might reject our best images!
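
One way such a bias can arise is easy to demonstrate with a toy Monte Carlo: the peak pixel of a faint star is biased upward by noise, so a half-maximum threshold derived from that peak sits too high on the true profile, and the measured width shrinks. The sketch below is purely illustrative (real tools fit PSF models rather than thresholding pixels); in a typical run the faint star measures well below the true FWHM.

```python
import numpy as np

def measure_fwhm(cutout):
    """Crude FWHM estimate: count pixels at or above half the peak pixel
    value and convert that area to the diameter of an equal-area circle."""
    peak = cutout.max()
    area = np.count_nonzero(cutout >= peak / 2)
    return 2.0 * np.sqrt(area / np.pi)

def mean_measured_fwhm(amplitude, noise_sigma=1.0, psf_sigma=2.0,
                       size=33, trials=500, seed=0):
    """Average measured FWHM of a Gaussian star of the given amplitude."""
    rng = np.random.default_rng(seed)
    y, x = np.mgrid[:size, :size] - size // 2
    star = amplitude * np.exp(-(x**2 + y**2) / (2 * psf_sigma**2))
    return np.mean([measure_fwhm(star + rng.normal(0, noise_sigma, star.shape))
                    for _ in range(trials)])

print(f"true FWHM:    {2.3548 * 2.0:.2f} px")           # 2.3548 * sigma
print(f"bright star:  {mean_measured_fwhm(100):.2f} px")
print(f"faint star:   {mean_measured_fwhm(6):.2f} px")  # measures too small
```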

Note that while high humidity can produce excellent seeing at the cost of transparency, this relationship does not exist for clouds. Clouds typically degrade seeing. An image taken through light clouds might appear to have sharper stars, but this may be due to the star bloat being hidden by extra noise / light pollution.

Data rejection

A tracking error that lasts for 1/100th of the exposure time can produce a highly visible artefact on bright point-like objects such as stars. However, spreading such a small proportion of the light within areas that lack point sources has a negligible effect (a tiny reduction in contrast).

Fortunately it is often possible to remove the effects of tracking errors (or star bloating from seeing) visible around stars without discarding the whole image. During stacking, data rejection will reject detected light that only appears in a small proportion of the images. This does a surprisingly good job of removing the seeing and tracking errors contained within the minority of images that you would otherwise have discarded. On an individual image, the star artefacts might look catastrophic, yet the star bloat or artefacts may have little or no effect on the final image.
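
Data rejection is typically implemented as some form of per-pixel sigma clipping across the stack. The sketch below is a minimal, unweighted illustration of the idea; real integration tools combine rejection with the (Signal to Noise ratio)² weighting discussed above.

```python
import numpy as np

def sigma_clip_stack(images, kappa=3.0, iterations=3):
    """Average a stack per pixel, iteratively rejecting outliers.

    images -- array of shape (n_frames, height, width), registered
              and normalized
    kappa  -- rejection threshold in standard deviations

    A satellite trail or tracking artefact appears in only a few frames,
    so its pixels lie far from the per-pixel mean and are rejected.
    """
    data = np.asarray(images, dtype=float)
    mask = np.ones(data.shape, dtype=bool)  # True = pixel still included
    for _ in range(iterations):
        kept = np.where(mask, data, np.nan)
        center = np.nanmean(kept, axis=0)
        spread = np.nanstd(kept, axis=0)
        mask &= np.abs(data - center) <= kappa * spread + 1e-12
    return np.nanmean(np.where(mask, data, np.nan), axis=0)
```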

When to discard high FWHM images

There is no right or wrong answer here. Astrophotography is part science, part art.

If the image is all about the stars (a star cluster or globular cluster) then you may decide to concentrate on star FWHM. On the other hand, if you want to reveal the faintest detail within spiral arms, it may be best to only discard images with low transmission or low weight, and fix the stars in post-stack processing.