
Conquering gradients: get the most out of this powerful technique in PixInsight.

Gradients. They're the one thing every deep-sky astrophotographer has to deal with sooner or later. They appear in images as an uneven background, often introducing unwanted color biases and overpowering the signal of your subject galaxy, nebula, or star cluster. In an ideal world, astrophotographers wouldn't have to address gradients, and we wouldn't have to worry about things such as noise, either. But unlike noise, gradients can be removed from our images without compromising the underlying data.

Understanding the Problem

Gradients are often due to unwanted light sources such as light pollution, but many natural light sources can also introduce gradients into your deep-sky images, including the zodiacal light, aurorae, and even natural airglow. Each of these light sources adds signal to your images--an unwanted signal.

Because gradients are unwanted signal in our photos, the most accurate way of removing them is to mathematically subtract them from the image. Ideally, this subtraction should be the first thing you do to your calibrated images before any stretching or sharpening. At this stage your image is linear (unstretched), which is the best time to achieve the most accurate gradient removal.
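To make that arithmetic concrete, here's a minimal NumPy sketch (every array name is a hypothetical stand-in, not PixInsight code): the gradient model is subtracted pixel by pixel from the linear image, and the model's median is added back so the corrected background keeps a sensible pedestal instead of clipping toward zero.

```python
import numpy as np

# Hypothetical linear image and gradient model of the same shape.
rng = np.random.default_rng(0)
image = 0.1 + 0.02 * rng.random((300, 400))                 # stand-in linear frame
gradient = np.tile(np.linspace(0.0, 0.05, 400), (300, 1))   # smooth left-to-right ramp
image += gradient                                           # "polluted" image

# Gradient removal is a per-pixel subtraction on the linear data...
corrected = image - gradient
# ...plus the model's median added back, so the background level
# stays at a reasonable pedestal rather than near zero.
corrected += np.median(gradient)
```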

To get the best image under less-than-ideal conditions, you need to subtract the gradients and nothing else, but you can't produce an image that contains only gradients. You can, however, get close enough to carry on with your science or artistic rendition. Currently there are two methods for getting pretty close to producing a faithful image of the gradients. The first is to shoot "superflats" during your imaging session. Superflats are flat-field calibration images recorded near the time and sky area where you captured your target data. For the majority of amateurs this is impractical and takes away valuable imaging time. Furthermore, superflats don't account for changing conditions such as encroaching twilight.

The other method is to create an artificial representation (model) of the gradients in your image. So how can you create an artificial gradient? The most plausible solution is to measure small areas (samples) from strategic locations in the original image and produce a simulation of the gradient. Fortunately, there are tools that make this process much easier than it might seem.
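One simple way to build such a model is to fit a smooth, low-order surface to the sample measurements with least squares. The sketch below shows the idea with a 2D polynomial (illustrative only; DBE itself uses a more sophisticated surface interpolation, and all names here are hypothetical):

```python
import numpy as np

def fit_background(xs, ys, values, shape, degree=2):
    """Fit a low-order 2D polynomial to sparse background samples
    (xs = column coords, ys = row coords) and evaluate it over the
    full frame. A conceptual sketch, not PixInsight's algorithm."""
    # Design matrix of polynomial terms x^i * y^j with i + j <= degree.
    terms = [(i, j) for i in range(degree + 1) for j in range(degree + 1 - i)]
    A = np.column_stack([xs**i * ys**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, values, rcond=None)
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]].astype(float)
    return sum(c * xx**i * yy**j for c, (i, j) in zip(coeffs, terms))

# Example: six samples along a synthetic left-to-right gradient.
xs = np.array([10., 200., 390., 10., 200., 390.])
ys = np.array([10., 10., 10., 290., 290., 290.])
vals = 0.1 + 0.0001 * xs                  # brighter toward the right edge
model = fit_background(xs, ys, vals, shape=(300, 400))
```

A handful of well-placed samples is enough because the surface being fit is, by construction, smooth.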

All other conditions being equal, the larger your camera's, telescope's, or lens's field of view, the more obvious the gradients will be, but even images only 1° wide or less can and do show gradients.

PixInsight

Although a few programs today offer point-measured gradient removal, I prefer the tools available in PixInsight 1.8.1 from Pleiades Astrophoto (http://pixinsight.com). I find that this program offers much more control and produces better results than the alternatives. Here's how I achieve my best results.

I usually begin with a calibrated and combined image with any non-overlapping edges cropped off. My first step is to apply an automatic ScreenTransferFunction (STF) by selecting Image > STF AutoStretch from the pulldown menu to increase the image contrast, revealing any gradients in the image.

Once you've identified the gradients, PixInsight offers two tools to suppress them: AutomaticBackgroundExtractor and DynamicBackgroundExtraction (ABE and DBE for short). ABE, as its name implies, is mostly automatic and doesn't let you interact with the image. Launch the ABE module via PROCESS > BackgroundModelization > AutomaticBackgroundExtractor. You can then adjust a few parameters that control the sample-generation and background-modeling engines. Set the Target Image Correction to Subtraction, then click the blue square at the bottom left to produce a background model and a corrected image. Unfortunately, this tool doesn't always discriminate between the subject and the background, often resulting in inaccurate gradient corrections.
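Conceptually, an automatic extractor lays a regular grid of sample boxes over the frame and rejects any box that lands on a bright object. The following Python sketch shows that idea in miniature (it is not ABE's actual engine, and the parameters are invented for illustration):

```python
import numpy as np

def auto_samples(image, grid=10, k=2.0):
    """Place a grid x grid array of sample boxes and keep only those
    whose local median is close to the global background estimate --
    a rough analogue of automatic sample rejection."""
    h, w = image.shape
    bg, sigma = np.median(image), np.std(image)
    samples = []
    for r in range(grid):
        for c in range(grid):
            y0, x0 = r * h // grid, c * w // grid
            box = image[y0:y0 + h // grid, x0:x0 + w // grid]
            m = np.median(box)
            if abs(m - bg) < k * sigma:   # reject boxes on bright objects
                samples.append((x0, y0, m))
    return samples
```

The weakness described above follows directly from this kind of logic: a fully automatic threshold can't always tell a bright gradient region from a large, faint subject.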

Dynamic Background Extraction

My preferred method uses the DBE tool (PROCESS > BackgroundModelization > DynamicBackgroundExtraction), which I find to be the most flexible and user-friendly gradient-removal method available to amateurs today.

The best way to know whether you've removed the gradient and nothing else after applying ABE or DBE to an image is to examine not the corrected image, but the background model that the tool created and then subtracted.

Regardless of how wonderful your image might look after your first application of ABE or DBE, you should carefully check that the process removed only the gradient signal--nothing more, nothing less. A gradient shouldn't have a lumpy texture with bright areas scattered around the image like the background model example seen above; if yours does, DBE probably placed a sample point on a galaxy or bright star. As a result, the model has removed desired signal, preserved unwanted signal (gradients), or a bit of both. Getting the best out of the DBE tool takes a little finessing of a few settings.

After opening the DBE tool, click within your target image. This places symmetry guide lines across the picture and links the image to anything you do with the DBE tool. Start by defining a small number of samples per row in the Samples Generation menu. Although the default of 10 samples per row works, I prefer a smaller number. Gradients usually present smooth transitions that are well defined with only a few samples, though severe cases may require more. For an image that's roughly 4,000-by-3,000 pixels and that displays a smooth gradient when the STF AutoStretch is applied, six samples per row is a good number to start with. Once you've made this change, click the Generate button.

The DBE tool often refuses to place samples in certain areas of an image because they're too bright or too dark, but the goal is to have the image's entire surface well sampled. One way to make sure DBE places samples in brighter and darker areas is to increase the Tolerance value to 2 and decrease the Minimum Sample Weight value to 0. This tells DBE to accept almost every sample, no matter how bright or dark it is.
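A toy version of that acceptance test might look like the sketch below. The parameter names are borrowed from the dialog purely for illustration; DBE's real weighting math isn't published in this article, so treat this as an assumption about the general idea, not the actual implementation.

```python
import numpy as np

def accept_sample(box, bg_median, bg_sigma, tolerance=2.0, min_weight=0.0):
    """Accept a candidate sample box if its median lies within
    'tolerance' sigmas of the background and enough of its pixels
    survive rejection (its 'weight'). Raising tolerance and lowering
    min_weight admits brighter and darker samples, as described above."""
    deviation = abs(np.median(box) - bg_median) / max(bg_sigma, 1e-12)
    kept = np.abs(box - np.median(box)) < tolerance * bg_sigma
    weight = kept.mean()                  # fraction of unrejected pixels
    return deviation <= tolerance and weight >= min_weight
```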

Click the Generate button again to see where DBE places the samples after these changes.

With the additional sample points generated, make sure they're sampling areas free of stars or other subjects you want to keep in the image, such as faint nebulosity. Take a look at the sample point in the DBE window at the top of the page. Although the star is rejected (the area in black), the sample is poor: many pixels in the sample are rejected, and the pixels around the edge of the black area, which are usually a bit brighter than the real background (darker in the inverted sample image shown by DBE), may affect the reading. It's best to move this sample point or remove it altogether by clicking the red X at the top right of the window.
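The underlying idea is to discard pixels on a star before taking the sample's median, and to distrust samples where too much was discarded. A simple iterative sigma-clipping sketch (not DBE's exact rejection scheme) captures this:

```python
import numpy as np

def sample_value(box, k=2.5, iters=3):
    """Estimate a sample's background level with iterative sigma
    clipping, so pixels on a star (and the slightly brighter ring
    around its rejected core) are discarded before the median is
    taken. Returns the estimate and the surviving-pixel fraction."""
    data = box.ravel().astype(float)
    for _ in range(iters):
        med, sig = np.median(data), np.std(data)
        kept = data[np.abs(data - med) < k * sig]
        if kept.size == 0:
            break                  # everything rejected: keep last estimate
        data = kept
    # A low surviving fraction flags a poor sample that should be
    # moved or deleted, as described above.
    return np.median(data), data.size / box.size
```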

When you're satisfied with the sample placement, make sure that the Target Image Correction is set to Subtraction, and click the execute icon (green check mark). After a few seconds, two new images are created. One is the corrected result, the other a background model.

Again, examine the background model, this time side-by-side with the original image and the location of the samples. I often adjust the STF shadow/midtone/highlight sliders on the background model image to increase the contrast of the model to better display differences in illumination (PROCESS > IntensityTransformations > ScreenTransferFunction).
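For reference, the midtone slider in STF applies a midtones transfer function (MTF). A minimal Python version of that curve, as commonly documented for PixInsight, follows; a midtones balance below 0.5 brightens faint structure, which is exactly what you want when inspecting a background model:

```python
def mtf(m, x):
    """Midtones transfer function: m is the midtones balance and x a
    pixel value, both in [0, 1]. At m = 0.5 the curve is the identity;
    m < 0.5 brightens, m > 0.5 darkens."""
    if x in (0.0, 1.0):
        return x
    return ((m - 1.0) * x) / (((2.0 * m - 1.0) * x) - m)

# A strong inspection stretch for a background model:
print(mtf(0.05, 0.2))   # -> roughly 0.83; faint differences become obvious
```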

To illustrate what is going on in my background model, I'll place the samples from my corrected image over the background model image. This is accomplished by dragging DBE's New Instance icon (the blue triangle at bottom left) to the program workspace, canceling the DBE process (red X icon at bottom left), clicking on the background model image, and then double-clicking on the DBE process I dragged onto the workspace.

See that sample over the brighter area at the bottom right of the image above? That's the sample responsible for the bright lump there, which tells me I should remove it. The best practice is to remove samples from bright areas that break a smooth transition, and to add samples in dark areas. Besides adding and removing samples, you can also move them to more ideal locations to produce the best model of the background sky.

Once you're satisfied with all the sample points, apply the changes to the original image. Make sure to transfer the adjusted DBE process back over to the original image using the same steps you used to move it to the background model. You can then examine both the corrected image and the new background model based on the adjusted set of samples.

If you're still seeing lumps in the background model, continue adding, removing, or moving samples and reapply. After a few iterations with DBE, you should end up with a background model that closely resembles the gradients in your target image. At that point, you can apply an STF stretch to the corrected image and inspect it for any additional gradients.

In my example on the left of the Virgo Galaxy Cluster, the resulting background isn't evenly illuminated. But my background model transitions very smoothly across the entire field, so these differences in "background illumination" can't be an artifact introduced by my efforts to remove gradients. This is the extremely faint dust and nebulosity that permeates the Milky Way, and accurately removing the gradients enabled me to stretch the image aggressively enough to reveal it.

If your original data amounts to just a couple of hours of exposure under suburban skies or in subpar conditions, chances are that any existing faint signal wasn't detected, or is buried under several sources of noise. In other words, a poor signal-to-noise ratio makes it impossible to reveal faint objects. In my example, the image comprises more than 25 hours of luminance captured under very dark skies--conditions that allowed me to record sufficient signal from these extremely faint objects.
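As a rough rule of thumb, in sky-limited exposures SNR grows with the square root of total integration time (and a darker sky helps further still, which this simple estimate ignores). Comparing the two-hour and 25-hour cases mentioned above:

```python
import math

# Sky-limited approximation: SNR scales with sqrt(total exposure time).
suburban_hours = 2.0       # the hypothetical short integration above
dark_site_hours = 25.0     # the luminance total quoted in the text

gain = math.sqrt(dark_site_hours / suburban_hours)
print(f"SNR advantage: about {gain:.1f}x")   # ~3.5x deeper into the noise
```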

It might be daring for an amateur with a small telescope to claim that the uneven illumination reveals faint interstellar dust or nebulosity. And although we can never be 100% certain that every background-intensity variation corresponds to actual objects in our images, with this accurate gradient removal technique, we can be very confident that most leftover variations are indeed data-driven, and not processing artifacts.

Rogelio Bernal Andreo often images elusive nebulosity from the darkest locations in the southwest United States.
