
Natural color with six filters: try this alternative technique to enhance your color images made with broadband and narrowband filters.

[ILLUSTRATION OMITTED]

Astrophotographers striving to produce natural-looking color images of deep-sky targets often enhance conventional tri-color images with narrowband data to better reveal the structure and extent of faint nebulosity. Author Debra Ceravolo used her new technique to spectrally map Ha, S II, and O III imagery into a conventional RGB photo of M8, revealing subtle hues at their proper spectral wavelengths.

Narrowband imaging has become a popular method of showcasing nebula photographs in interesting and new ways. This technique got its start from the famous "Pillars of Creation" image of M16 taken with the Hubble Space Telescope back in 1995. This one image greatly influenced the world of amateur astronomical imaging. Today, it's common to see images of nebulae displayed in a variety of vibrant colors, so much so that many non-astronomers have come to accept this as reality. Often they're surprised to learn that this isn't how the universe really looks.

Amateur astrophotographers have embraced narrowband imaging mostly because the filters block many of the strongest sources of light pollution and allow one to shoot deep-sky objects from less-than-perfect observing sites. These filters have a very narrow passband, often just 3 to 10 nanometers (nm) of the visible spectrum, where specific ionized gases fluoresce. Images taken with these filters and combined into a color image tell us the predominant gas in a nebulous region, and highlight subtle structures within these glowing clouds. In contrast, traditional broadband red, green, and blue filters pass approximately 100 nm of the visible spectrum each. These filters originated as a way to create natural-looking color images of daylight scenes.

Astrophotographers who strive for natural-appearing celestial images often augment their RGB images with narrowband data that is blended into the RGB exposures. Generally, Ha and S II are blended into the red exposure. Because O III lies at the crossover between the blue and green filters, it's equally blended into both respective exposures. The original red, green, and blue image is thereby enhanced with narrowband detail while preserving the original color appearance.
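For readers who like to experiment outside Photoshop, here is a minimal sketch of this conventional channel blend in Python. It assumes each calibrated, stretched frame is a NumPy array scaled from 0 to 1; the blend weight is purely illustrative and would normally be tuned by eye.

import numpy as np

def blend_channels(r, g, b, ha, s2, o3, w=0.3):
    """Blend narrowband frames into their nearest broadband channels."""
    red   = (1 - w) * r + w * np.maximum(ha, s2)  # Ha and S II reinforce the red channel
    green = (1 - w) * g + w * o3                  # O III contributes equally...
    blue  = (1 - w) * b + w * o3                  # ...to both green and blue
    return np.clip(np.dstack([red, green, blue]), 0.0, 1.0)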

The shortcoming of this blending technique is that it forces both the Ha and S II images to be the same color as the broadband red image. O III is forced to have the same color as the broadband green image, as well as the blue. The narrowband images aren't highlighted in their proper color assignment; they are only contributing to the overall structural detail in the object.

Understanding Color

As Andrew Young of San Diego State University pointed out in his article "What Color is the Solar System?" in the May 1985 issue of Sky & Telescope, color can be defined by three parameters: hue, saturation, and brightness. Hue is the dominant wavelength of a particular color, though in imaging we tend to use red, green, and blue filters to mimic our eyes' response to light. Saturation refers to the purity of the color (spectral colors are 100% saturated), and brightness describes how dark or light the color is. Changing any one of these parameters changes the color perceived by our eyes. A good example is an orange peel and a chunk of chocolate: they look distinctly different, yet both have the same hue and saturation and differ only in brightness. Similarly, Mars is not really the reddish orange often shown in photographs. The images of Mars we have become accustomed to were lightened to better reveal detail. Mars reflects only about 10% of the Sun's light, so when its brightness is properly adjusted, the planet, like the chocolate, appears dark brown. In summary, when you change an object's brightness, you also change its apparent color.


Color Mapping

In conventional RGB astrophotography, filtered images are monochrome until we designate each a particular color. Red, green, and blue images are assigned their colors according to their positions on a color wheel. Red is given a hue value of 0 or 360°, green 120°, and blue 240°. When assigning these images to the color channels in Adobe Photoshop, the program automatically makes their saturation 100% and assigns a medium brightness value of 50. These values are the software's default settings and not necessarily calibrated to the specific filters used. Once the color images are combined, there are several methods of balancing the color ratios to show natural color. The most common method used by astrophotographers is to select a spectral class G2V star in the image as a reference for white balance. G2V stars are similar to our Sun, and appear white to our eye.
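To see numerically what those default hue, saturation, and brightness assignments correspond to, here is a quick sketch using Python's standard colorsys module; its HSV model is close enough to Photoshop's HSB for this purpose, and the printed values are only for illustration.

import colorsys

for name, hue_deg in [("red", 0), ("green", 120), ("blue", 240)]:
    r, g, b = colorsys.hsv_to_rgb(hue_deg / 360.0, 1.0, 0.5)  # S = 100%, B = 50%
    print(f"{name:5s}  H = {hue_deg:3d} deg  ->  RGB = ({r:.2f}, {g:.2f}, {b:.2f})")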


With my spectral color mapping technique, all six images are placed in separate layers, and each is assigned its own unique color using a Gradient Map adjustment layer. We can then assign the proper hue, saturation, and brightness values to each layer.

Matching the Visible Spectrum

Ideally, the RGB and narrowband image layers should be assigned colors that closely represent their natural wavelengths. Red's central wavelength is 640 nm, green's is 540 nm, and blue's is 440 nm. Additionally, Ha is centered at 656 nm, S II at 672 nm, and O III at 500 nm. To create a natural color image using all six filters, the problem becomes one of converting each image's central wavelength to the appropriate hue, saturation, and brightness values.

Andrew Young developed a technique to render spectra for realistic display on RGB monitors (http://mintaka.sdsu.edu/GF/explain/optics/rendering.html). His work allows us to determine the proper values of hue, saturation, and brightness for any wavelength in the visible spectrum. By measuring these values in Young's spectrum in Photoshop, we can create naturally colored images of any celestial object by properly representing the color of each narrowband image.

To do this in Photoshop, open the image of Young's spectrum and using the Eyedropper Tool, click on an area of the spectrum corresponding to the filter's central wavelength. The Foreground Color Box at the bottom of the Tools palette now contains all the values of the selected color. By double-clicking on the Foreground Color box, the Color Picker window appears and displays the three important values of hue (H), saturation (S), and brightness (B).
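If you would rather compute starting values than pick them off the screen with the Eyedropper, the short Python sketch below uses a common piecewise-linear wavelength-to-RGB approximation (it is not Young's algorithm, and it ignores the brightness falloff toward the ends of the visible range), then converts the result to hue, saturation, and brightness with the standard colorsys module.

import colorsys

def wavelength_to_rgb(nm):
    """Rough piecewise-linear mapping from wavelength (nm) to an RGB triplet."""
    if   380 <= nm < 440:  r, g, b = (440 - nm) / 60.0, 0.0, 1.0
    elif 440 <= nm < 490:  r, g, b = 0.0, (nm - 440) / 50.0, 1.0
    elif 490 <= nm < 510:  r, g, b = 0.0, 1.0, (510 - nm) / 20.0
    elif 510 <= nm < 580:  r, g, b = (nm - 510) / 70.0, 1.0, 0.0
    elif 580 <= nm < 645:  r, g, b = 1.0, (645 - nm) / 65.0, 0.0
    elif 645 <= nm <= 780: r, g, b = 1.0, 0.0, 0.0
    else:                  r, g, b = 0.0, 0.0, 0.0
    return r, g, b

for name, nm in [("blue", 440), ("O III", 500), ("green", 540),
                 ("red", 640), ("Ha", 656), ("S II", 672)]:
    h, s, v = colorsys.rgb_to_hsv(*wavelength_to_rgb(nm))
    print(f"{name:6s} {nm} nm:  H = {h * 360:5.1f} deg  S = {s * 100:5.1f}%  B = {v * 100:5.1f}%")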


Spectral Color Mapping

Before combining color and narrowband exposures, each image should be properly calibrated, registered, stacked, and stretched using your favorite CCD processing software. All six images, which at this point are 16-bit monochrome TIFF files, are then layered together in Photoshop. Start with the red image as the base and convert its color profile from grayscale to RGB color (Image/Mode/RGB Color). Then copy the green image on top of the red and set its blending mode to "Lighten." Next, copy the blue image and paste it over the green layer, and continue doing the same with the Ha, S II, and O III images until all six layers are in one image. Each layer should have its blending mode set to "Lighten" except for the red layer on the bottom. Name all the layers accordingly.
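For those scripting the process instead, Photoshop's Lighten mode keeps the brighter of the two values in each channel, so the six-layer stack amounts to a per-channel maximum. Here is a minimal sketch, assuming each layer has already been colorized and stored as an RGB NumPy array scaled 0 to 1. Because the maximum is the same regardless of order, the stacking order of the upper layers does not change the result.

import numpy as np

def lighten_stack(layers):
    """Combine colorized layers bottom-up with a per-channel maximum (Lighten)."""
    result = layers[0]                      # the red layer serves as the base
    for layer in layers[1:]:                # green, blue, Ha, S II, O III on top
        result = np.maximum(result, layer)  # keep the brighter value in each channel
    return result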

Next, select an image layer, click on the adjustment layer icon, and choose the Gradient Map adjustment tool. A Gradient Map window will appear; clicking on the gradient shown opens the Gradient Editor. Click anywhere along or below the gradient in this window and a Color Stop icon appears. Double-click on the Color Stop icon to open the Select Stop Color window. Here we enter the hue, saturation, and brightness values we obtained for the filter using the Color Picker and Young's spectrum. Lastly, enter the brightness percentage in the Location window of the Gradient Editor. We have now colorized the image with the spectral color that corresponds to the wavelength of that filtered image. Do this for each layer. To make sure each Gradient Map affects the correct layer, right-click on the Gradient Map layer and create a clipping mask. A downward-facing arrow will appear, indicating that the Gradient Map adjustment layer is now directly associated with the image layer below it.
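In script form, a simple black-to-color Gradient Map is just a linear ramp from black up to the filter's spectral color. The sketch below assumes a monochrome layer scaled 0 to 1 and a target color such as one returned by the wavelength sketch above; the per-filter placement of the color stop is omitted for brevity.

import numpy as np

def colorize(mono, rgb_color):
    """Map a grayscale layer onto a black-to-color gradient."""
    color = np.asarray(rgb_color, dtype=float)     # e.g. Ha -> (1.0, 0.0, 0.0)
    return mono[..., None] * color[None, None, :]  # result has shape (height, width, 3)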


Final Color Balance

The introduction of narrowband images to your RGB layers often causes the sky background and star colors to shift. One remedy for this is to erase the sky background in the narrowband images using layer masks to allow only the nebulae in these image layers to contribute to the overall final image.

One way to do this is to highlight one of the narrowband image layers and create a layer mask using the drop-down menu (Layer/Layer Mask). Select and copy the Ha layer; we'll use Ha because most nebulae are dominated by hydrogen-alpha emission, making it the best layer for masking. Next, hold down the Alt key and click on the layer mask to make it active. The image will appear white because you are now viewing the layer mask instead of the main image. Paste the Ha image into the layer mask and apply a strong Gaussian Blur (a value of about 20, depending on the size of the image). Then open the Curves window and adjust the curve to clip out the faintest and brightest areas of the mask.

To see the image layer again, simply click on the main image thumbnail to the left of the layer mask. Do this for all of the narrowband layers so that only the nebulae in the narrowband images contribute to the overall image, while the sky background and star colors are preserved.
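A rough script equivalent of this masking step, assuming the stretched Ha frame is a 2-D NumPy array scaled 0 to 1, is shown below. The blur width and clip points are illustrative stand-ins for the Gaussian Blur setting and the Curves adjustment described above; multiplying a narrowband layer by the resulting mask passes its nebulosity while holding back the sky background and stars.

import numpy as np
from scipy.ndimage import gaussian_filter

def nebula_mask(ha, sigma=20, low=0.1, high=0.6):
    """Build a soft mask that passes nebulosity and suppresses the sky background."""
    blurred = gaussian_filter(ha, sigma=sigma)  # strong blur, standing in for Gaussian Blur ~20
    mask = (blurred - low) / (high - low)       # Curves-style stretch between the clip points
    return np.clip(mask, 0.0, 1.0)              # clip the faintest and brightest ends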

This new technique brings out more narrowband structure than the channel-blending method, while displaying the natural colors of the emissions intrinsic to the nebula. It should be mentioned that colors are not portable; in other words, an image can be balanced and finessed on one monitor and yet appear slightly skewed on others. Monitor calibration is important, but problems can still arise from variation among color-management programs, browsers, and projectors. Printing your image is another matter entirely; color saturation on a monitor exceeds what is possible in printed material. If your end goal is to print your images, you'll need to calibrate your monitor to match your printer's results.


Still, this technique is a big step toward many astrophotographers' goal of accurately portraying the vibrant beauty of the universe.

Debra Ceravolo and her husband Peter work together as an imaging team. Peter records long exposures of deep-sky objects that Debra processes into stunning celestial portraits.