Comparison of the vignetting effects of two identical fisheye lenses.
1.1 HDR TECHNIQUES
High dynamic range (HDR) imaging is popular in the field of photography. This technique is also increasingly used by lighting researchers in building applications because it offers the opportunity to rapidly capture a larger field of real-world luminances than a spot luminance meter [Anaokar and Moeck 2005; Inanici 2006; Jacobs 2007]. In addition, coupling HDR imaging techniques with fisheye technology makes it possible to acquire luminances over a hemisphere. This is a way to assess the luminance distribution and ratios in the human field of view (FOV), which can then be used to evaluate the risks of glare, or to capture sky vault luminance distributions [Beltran and Mogo 2005; Inanici and Navvab 2006; Stumpfel and others 2004]. HDR imaging can also be used to characterize illuminance [Moeck and Anaokar 2006; Santa Clara 2009] and even to determine the luminous flux passing through large areas [Mardaljevic and others 2009].
However, to be reliable, measurements realized using HDR imaging techniques must go through a strict calibration procedure: the accuracy of luminance measurements obtained with this technique depends on its calibration [Jacobs and Wilson 2007]. To obtain precise values, it is necessary to perform a response curve calibration, a photometric calibration and, in some cases, a photogrammetric calibration [Cai and Chung 2011].
This paper discusses one of the steps of the photometric calibration procedure: the correction of the brightness decrease observed from the center of the picture to its periphery, commonly called the vignetting effect, as illustrated in Fig. 1.
[FIGURE 1 OMITTED]
When using HDR techniques and fisheye lenses for luminance measurements, it is necessary to correct for the vignetting effect in order to obtain reliable data. Indeed, previous studies have shown that vignetting is accentuated with wide-angle lenses (such as fisheye lenses) and that it is not negligible since it can reach, with some fisheye lenses and some settings, a 55 percent loss of luminance at the periphery of the picture [Inanici 2010].
1.2 DETERMINATION OF VIGNETTING EFFECT
Different methodologies have been explored in the literature to determine the vignetting effect of lenses. Some studies have set out to determine the light falloff of conventional lenses [Anaokar and Moeck 2005; Moeck and Anaokar 2006] and others worked more particularly on the light falloff of fisheye lenses [Inanici 2006; Jacobs and Wilson 2007]. In all these studies, the vignetting effect was determined using HDR imaging techniques.
Anaokar and Moeck [2005] determined the vignetting effect of a non-wide-angle lens for a single aperture (f/7.9). In their study, a white and diffuse sheet of paper was placed to cover the entire field of the camera, and pictures were taken at various shutter speeds in order to recompose the HDR image. Finally, the vignetting effect of the lens was determined by comparing the luminance at each point of the HDR picture with the luminance at its center.
In an evaluation of the potential of HDR photography as a luminance data acquisition tool, Inanici described the way to determine the vignetting effect of a fisheye lens [Inanici 2006]. In this study, a fixed single target was used and the camera, fitted with an equidistant fisheye lens, was rotated in five-degree steps around its rotation axis, in order to capture the target from different angles, covering the 190[degrees] FOV of the camera. For each camera position, HDR pictures were taken at various shutter speeds, keeping a constant aperture of f/4. The vignetting effect was then determined by comparing the physical luminance measurements and the luminances extracted from the HDR pictures.
Another study determined the vignetting effect of a similar fisheye lens [Jacobs and Wilson 2007]. The authors also rotated the camera, in three-degree steps, to capture a luminous target. Unlike Inanici, they did not perform physical luminance measurements of the target: no absolute photometric calibration was applied to the picture, and the light falloff was determined similarly to Anaokar and Moeck [2005].
1.3 FISHEYE LENS
This section presents existing fisheye projection types and their particular advantages [Bettonvil 2005; Schneider and others 2009]. The most common type of fisheye projection encountered in daylighting research is the equidistant one. However, the equisolid, orthographic, and stereographic types also present some interesting characteristics. Figure 2 summarizes fisheye projection types and matches them with lenses of different brands.
Equidistant projection is an equal-angle projection for which the radial distance on the projected image is proportional to the angular distance. This type of projection can be used when determining zenith and azimuth angles because the altitude lines are evenly spaced on the captured picture. Distortion of the objects is the same at the center of the projected picture and at its periphery.
Fig. 2. Characteristics of several fisheye projections, where r is the distance from the optical axis, f is the focal length, and [theta] is the entrance angle measured from the optical axis. [Schemes omitted.]

* Equidistant (also called equiangular): r = f x [theta]. Lenses: Sigma 8mm f/3.5 [Fan and others 2009; Inanici 2009; 2010; Stumpfel and others 2004; Van Den Wymelenberg and others 2010], Nikon FC-E8 [Jacobs and Wilson 2007], Nikon FC-E9 [Inanici 2006].
* Equisolid (also called equal-area): r = 2 x f x sin([theta]/2). Lenses: Sigma 4.5mm F2.8, Nikkor 8mm f/2.8.
* Orthographic (also called hemispherical): r = f x sin([theta]). Lens: Nikkor 10mm f/5.6 OP.
* Stereographic (also called planispheric): r = 2 x f x tan([theta]/2). Lens: Samyang 8mm.
Equisolid projection keeps constant the ratio between the real world solid angle and the area on the captured picture, which makes it appropriate to work with the 145 Tregenza patches. Distortion of the objects is more or less the same at the center of the projected picture and at its periphery.
Orthographic projection is the projection where the point of view is at an infinite distance. It presents parallel projection lines but does not preserve area or angle, unlike equidistant or equisolid projection. Orthographic projection makes it possible to easily estimate the sky view factor (SVF) which is the portion of sky visible from a particular point (for example, a workplane or window). A high distortion of objects (that is, a reduction of their height) is observed at the periphery of the picture.
Stereographic projection is the projection onto a plane tangent to the zenith, where the center of the projection is the nadir (that is, the point diametrically opposite the zenith). This projection is conformal, which means that it preserves angles. Stereographic projection is often combined with sun path diagrams to study sun penetration and availability of light in a building or to evaluate shadowing from surrounding buildings. This projection amplifies the height of objects at the periphery of the picture.
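For a quick numerical comparison, the four mapping formulas of Fig. 2 can be evaluated directly. The sketch below is illustrative only (function names are ours); angles are in radians and r comes out in the units of f:

```python
import math

# Radial distance r on the image plane as a function of the entrance
# angle theta (radians) and focal length f, for the four fisheye
# projection types summarized in Fig. 2.
def equidistant(theta, f):
    return f * theta

def equisolid(theta, f):
    return 2.0 * f * math.sin(theta / 2.0)

def orthographic(theta, f):
    return f * math.sin(theta)

def stereographic(theta, f):
    return 2.0 * f * math.tan(theta / 2.0)

# Example: at the edge of a 180-degree field (theta = 90 degrees) with
# f = 4.5 mm, the orthographic projection compresses the periphery
# (r = 4.5 mm) while the stereographic one stretches it (r = 9.0 mm).
```

This makes the qualitative remarks above concrete: the orthographic image radius saturates at the periphery (high distortion), while the stereographic radius grows fastest (amplified object height).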
Researchers using HDR techniques as a luminance data acquisition tool determine the proper vignetting corrections to apply to their lens.
The main objective of this study is to evaluate whether vignetting curves determined for a specific lens mounted on a particular camera can be used by other researchers using identical photographic material (that is, the same brand and model). If this hypothesis is true, researchers who do not have laboratory tools to determine accurately the vignetting curves of their lens can still use those determined precisely by other researchers having an identical camera and lens.
Some secondary objectives of this study are to check the radial symmetry of the lens vignetting, the noninfluence of the reflectance of the targets on the determined vignetting effect, and, finally, to verify the mapping function of the fisheye lens.
To achieve these aims, the vignetting characteristics of four similar devices (lens + camera) are determined in a field of 180 degrees, on the basis of gray and white targets, using HDR imaging techniques.
Measurements were realized under an artificial sky reproducing the CIE overcast sky [Bodart and others 2006]. Illuminance meters were placed in a mirror box to record, during the experiment, slight variations of horizontal illuminance. The luminance of diffuse targets was alternately measured with a luminance meter and captured, using HDR imaging techniques, with each camera fitted with a fisheye lens.
2.1 CAMERA AND LENS
Two CANON 40D cameras were used in this study. They were fitted with two SIGMA 4.5 mm F2.8 fisheye lenses. As specified in the manufacturer's technical documentation, the SIGMA 4.5 mm F2.8 fisheye lens presents an equisolid projection. However, since in practice the projection of a lens does not exactly comply with the theoretical formula [Bettonvil 2005], one of the auxiliary aims of this study was to redefine it (see Section 5.1).
Each of the two identical fisheye lenses was successively mounted on each of the two cameras as illustrated in Fig. 3.
[FIGURE 3 OMITTED]
Four devices (camera + lens) were obtained in this way:
* a first camera fitted with a first fisheye lens (CAM#1FE#1);
* a second camera fitted with a second fisheye lens (CAM#2FE#2);
* the first camera fitted with the second fisheye lens (CAM#1FE#2);
* the second camera fitted with the first fisheye lens (CAM#2FE#1).
2.2 EXPERIMENTAL SETUP
The experimental setup, placed under the artificial sky, consisted of 49 white and gray squares arranged in a semi-circle as illustrated in Fig. 4. Initially, squares were painted every five degrees. As a result of some pre-tests, this interval was reduced, at the center and the periphery of the arc, to 2.5 degrees, in order to improve the precision of the obtained vignetting curve.
[FIGURE 4 OMITTED]
Finally, 25 diffuse white square targets (78 percent reflectance) were alternated with 24 diffuse gray ones (26 percent reflectance), the central target being white. Reflectance of these targets was determined using a Konica Minolta Chroma Meter CR-140, which is calibrated each time it is turned on, using a white calibration plate provided by the manufacturer.
2.3 ILLUMINANCE METERS
Two Hagner illuminance meters were placed on the experimental setup, in the mirror box, to record the variation of horizontal illuminance and balance luminance measurements (see Fig. 6(c)). These illuminance meters are calibrated every two years and checked every six months at the Belgian Building Research Institute (BBRI), where the experiment took place. This calibration at BBRI is done according to a reference illuminance meter, calibrated by a private German company (LMT Lichtmesstechnik GmbH Berlin).
2.4 LUMINANCE METER
Luminance measurements of each square were taken with a Minolta LS 110 spot luminance meter, with a 1/3[degrees] FOV and a declared accuracy of [+ or -] 2 percent [Konica Minolta 2011]. This instrument is calibrated every two years by Minolta and was last calibrated in August 2010.
3 MEASUREMENT METHOD
Each device (CAM#1FE#1, CAM#2FE#2, CAM#1FE#2 and CAM#2FE#1) was successively positioned in the center of the semi-circle setup, to capture the 180[degrees] field as illustrated in Fig. 5.
[FIGURE 5 OMITTED]
As the vignetting effect is a function of the lens aperture width [Inanici 2006; Jacobs and Wilson 2007], all the possible apertures of CAM#1FE#1 and CAM#2FE#2 were studied, that is, f/2.8, f/3.2, f/3.5, f/4, f/4.5, f/5, f/5.6, f/6.3, f/7.1, f/8, f/9, f/10, f/11, f/12, f/14, f/15, f/16, f/18, f/20 and f/22. For CAM#1FE#2 and CAM#2FE#1, only four apertures were analyzed: f/2.8, f/4, f/10 and f/22.
The image capturing process (Fig. 6(a)) alternated with physical measurements with the luminance meter (Fig. 6(b)) as explained in Fig. 7.
[FIGURE 6 OMITTED]
[FIGURE 7 OMITTED]
At the beginning and at the end of each series of pictures (that is, for each tested aperture), the horizontal illuminance in the mirror box was recorded. The illuminance was also recorded for each physical luminance measurement realized with the luminance meter.
3.1 CAMERA POSITION
Fisheye lenses do not have a single center of projection: the perspective center, which is the point from which the camera captures the scene, corresponds to the entrance pupil point, which moves as the incident angle varies [Gennery 2006]. When capturing panoramic views, it is necessary to rotate the camera around the entrance pupil point to reduce parallax errors. When capturing luminances with HDR imaging techniques and a fisheye lens, however, the entire human FOV is generally captured in one series of shots, without moving the camera. That is why this study assumes a single center of projection corresponding to the entrance pupil point for [theta] = 0[degrees] (center of the semi-circle setup).
The camera was oriented towards the central white square ([theta] = 0[degrees]) of the experimental setup, in order to capture the entire semi-circle. Its horizontality was checked with a double axis spirit level.
3.2 CAMERA SETTINGS
In their study, Goldman and Chen showed that the intensity of the vignetting effect depends on some lens settings: the focal length, the aperture, and the lens focus [Goldman and Chen 2005]. During pre-tests, the influence of some other parameters on the recorded data was observed: the focus setting influences the image circle diameter and the metering mode influences the way the data are recorded. That is why all the pictures were taken with the settings presented in Table 1.
TABLE 1. Settings of the Camera

Parameters          Mode
White balance       Daylight
Sensitivity         ISO 100
Metering mode       Spot
Focusing setting    Infinity
Picture quality     MEDIUM / normal (2816*1880 pixels)
Number of f-stops   [+ or -] 1 stop
Number of shots     7
The white balance was fixed to daylight as suggested by Inanici [Inanici 2006], whereas the lowest sensitivity (ISO 100) was chosen to reduce the noise in the HDR picture.
4 ANALYSIS METHOD
In order to analyze the vignetting effect, HDR pictures were first created, for each aperture and each device, on the basis of the series of low dynamic range (LDR) pictures taken in the mirror box. Luminances were extracted from these HDR pictures, and those measured with the luminance meter were then weighted according to the horizontal illuminances recorded in the mirror box. The HDR pictures were next calibrated according to physical measurements and, finally, the vignetting effect was calculated. The symmetry of the vignetting and the noninfluence of the reflectance of the target (white or gray square targets) on the determination of the vignetting were checked. Vignetting curves of the four devices were finally compared, for several apertures.
In addition to the determination of these vignetting functions, this experiment and its setup were an opportunity to check the projection formula given by the manufacturer, as the targets were arranged at known intervals.
4.1 CREATION OF THE HDR PICTURES
For each device (CAM#1FE#1, CAM#2FE#2, CAM#1FE#2 and CAM#2FE#1) and each tested aperture, a series of exposure-bracketed low dynamic range (LDR) pictures was taken. These pictures were combined into a high dynamic range (HDR) image using the hdrgen Radiance command-line tool on LINUX [LBNL 2010]. A specific response curve was generated for each camera (on the basis of the f/9 aperture) as this curve, establishing the relation between RGB pixel values and luminance values, can vary even between cameras of the same model [Jacobs 2007].
4.2 LUMINANCE EXTRACTION
The luminance of each square target was extracted from the HDR picture using the Radiance pvalue program (pvalue -o -h -H filename.pic > filename.txt) and a Matlab routine [The MathWorks 2009].
The luminance extracted from the pictures was calculated as follows:
[L.sub.HDR] = 179 x (0.2126R + 0.7152G + 0.0722B), (1)
where [L.sub.HDR] is the luminance (cd/[m.sup.2]) extracted from the picture before photometric calibration and R, G and B are the red, green, and blue primary values.
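Equation (1) is straightforward to apply per pixel. A minimal illustrative helper (the function name is ours):

```python
def luminance_from_rgb(r, g, b):
    """Luminance in cd/m2 from floating-point RGB primaries, following (1):
    L = 179 * (0.2126 R + 0.7152 G + 0.0722 B)."""
    return 179.0 * (0.2126 * r + 0.7152 * g + 0.0722 * b)

# A neutral pixel with R = G = B = 1 corresponds to 179 cd/m2,
# the luminous efficacy factor used by Radiance.
```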
4.3 VARIATION OF HORIZONTAL ILLUMINANCE IN THE MIRROR BOX
The physical luminances (measured with the luminance meter) were weighted according to the horizontal illuminances recorded during the experiment (image capturing process + luminance measurement), and using a rule of three, as follows:
[L'.sub.LM]([[theta].sub.i]) = [L.sub.LM]([[theta].sub.i]) x [E.sub.HDR] / [E.sub.LM]([[theta].sub.i]), (2)

where [L'.sub.LM]([[theta].sub.i]) is the physical luminance after the illuminance correction, [L.sub.LM]([[theta].sub.i]) is the physical luminance of the square at [[theta].sub.i], [E.sub.LM]([[theta].sub.i]) is the illuminance recorded during the physical luminance measurement at [[theta].sub.i], and [E.sub.HDR] is the illuminance recorded for the corresponding HDR picture. In this way, slight variations of horizontal illuminance in the mirror box were taken into account and reduced.
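The rule of three in (2) is a simple rescaling. An illustrative helper (name and sample values are ours, not from the study):

```python
def correct_luminance(l_meter, e_meter, e_hdr):
    """Weight a spot-meter luminance by the ratio of the horizontal
    illuminance recorded during the HDR capture (e_hdr) to the one
    recorded during the spot measurement (e_meter), as in (2)."""
    return l_meter * e_hdr / e_meter

# Example: a 1200 cd/m2 reading taken when the horizontal illuminance
# was 5000 lx, rescaled to an HDR capture made at 4900 lx:
# correct_luminance(1200.0, 5000.0, 4900.0) -> 1176.0
```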
4.4 PHOTOMETRIC CALIBRATION
In the framework of this study, separate calibration factors (CF) were used to calibrate the gray and the white targets specifically, such local calibration enhancing the accuracy, as observed by Cai and Chung [2011].
Those calibration factors ([CF.sub.white] and [CF.sub.grey]) were determined, for each device and each tested aperture, on the basis of the three central targets, in this way:
[CF.sub.white] = [L.sub.LM](0[degrees]) / [L.sub.HDR](0[degrees]), (3)

[CF.sub.grey] = 1/2 x ([L.sub.LM](-2.5[degrees]) / [L.sub.HDR](-2.5[degrees]) + [L.sub.LM](2.5[degrees]) / [L.sub.HDR](2.5[degrees])), (4)

where [CF.sub.white] and [CF.sub.grey] are respectively the calibration factors used for white and gray calibration; [L.sub.LM](0[degrees]), [L.sub.LM](-2.5[degrees]) and [L.sub.LM](2.5[degrees]) are the luminances physically measured with the luminance meter for, respectively, the central square (positioned at [theta] = 0[degrees]), the gray square positioned at -2.5[degrees], and the one positioned at 2.5[degrees]; and [L.sub.HDR](0[degrees]), [L.sub.HDR](-2.5[degrees]) and [L.sub.HDR](2.5[degrees]) are the luminances extracted from the HDR picture for the same squares.
According to the reflectance of the target, the luminance of each target was finally calibrated as follows:
[L.sub.cal]([[theta].sub.i]) = CF x [L.sub.HDR]([[theta].sub.i]), (5)

where [L.sub.cal]([[theta].sub.i]) is the luminance at [[theta].sub.i] after calibration, [L.sub.HDR]([[theta].sub.i]) is the luminance at [[theta].sub.i] extracted from the HDR picture, and CF is the calibration factor adapted to the reflectance of the target.
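The two-step calibration of (3)-(5) can be sketched as follows. This is illustrative only: the helper names and dictionary layout are ours, and (4) is assumed here to average the two gray-square ratios:

```python
def calibration_factors(l_meter, l_hdr):
    """White and gray calibration factors from the three central targets,
    following (3) and (4). l_meter and l_hdr map a target angle in degrees
    (0 = central white square, -2.5 and 2.5 = neighboring gray squares)
    to luminance in cd/m2."""
    cf_white = l_meter[0.0] / l_hdr[0.0]
    cf_grey = 0.5 * (l_meter[-2.5] / l_hdr[-2.5] + l_meter[2.5] / l_hdr[2.5])
    return cf_white, cf_grey

def calibrate(l_hdr_value, cf):
    """Apply the calibration factor matched to the target reflectance, as in (5)."""
    return cf * l_hdr_value
```

In use, the white CF would then be applied to every white target and the gray CF to every gray target before computing the vignetting ratios.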
4.5 VIGNETTING EFFECT
The vignetting effect was determined, for each square, as the ratio between the luminance extracted from the calibrated HDR picture and the physical luminance measured with the luminance meter:
V([[theta].sub.i]) = [L.sub.cal]([[theta].sub.i]) / [L'.sub.LM]([[theta].sub.i]), (6)

where V([[theta].sub.i]) is the vignetting effect at [theta] = i[degrees], [L'.sub.LM]([[theta].sub.i]) is the luminance measured at [theta] = i[degrees] with the luminance meter and weighted according to horizontal illuminance, and [L.sub.cal]([[theta].sub.i]) is the luminance captured with the camera at [theta] = i[degrees] and calibrated.
Due to the large distortion, the measurements at -90[degrees] and +90[degrees] were excluded from the sample.
4.5.1 EVALUATION OF DIFFERENCES
The analysis is based on root mean square errors (RMSE):
RMSE = [square root of ([summation over (i=1 to N)] [(([[mu].sub.i] - [[mu].sub.ref]) / [[mu].sub.ref]).sup.2] / N)] (7)
where [mu] is the value of the vignetting curve, [[mu].sub.ref] is the value of the reference, and N is the number of data points. The reference corresponds to a theoretical photographic material without vignetting effect ([[mu].sub.ref] = 1).
The intensity of the vignetting effect was evaluated for each aperture through the calculation of the RMSE.
Finally, to evaluate the difference between two vignetting curves (for example, for an aperture of f/2.8, the curves approximated for CAM#1FE#1 and CAM#2FE#2), the difference of their RMSEs was calculated. Vignetting effects were judged similar if this difference of RMSE was less than 2 percent.
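The RMSE of (7) and the 2 percent similarity criterion can be written compactly. Illustrative helpers (names ours):

```python
import math

def rmse(values, ref=1.0):
    """RMSE of a vignetting curve against a reference, as in (7).
    The no-vignetting reference is a constant value of 1."""
    return math.sqrt(sum(((v - ref) / ref) ** 2 for v in values) / len(values))

def similar(curve_a, curve_b, threshold=0.02):
    """Judge two vignetting curves similar if the difference of their
    RMSEs is below 2 percent."""
    return abs(rmse(curve_a) - rmse(curve_b)) < threshold

# A curve with values 0.9 and 1.1 deviates from the ideal by 10 percent RMSE.
```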
4.5.1.1 HYPOTHESIS OF RADIAL SYMMETRY
To evaluate the hypothesis of radial symmetry, two devices were studied: CAM#1FE#1 and CAM#2FE#2. For each device, measurements realized in the first quadrant were compared to measurements realized in the second quadrant. A set of vignetting values was determined in each quadrant, for the two studied cameras and all the tested apertures. For each device, each aperture and each quadrant, the RMSE between the vignetting effect and the reference (no vignetting) was calculated as in (7). The difference between the two quadrants was then evaluated by calculating, for each device and each aperture, the difference of RMSE calculated in each quadrant.
4.5.1.2 HYPOTHESIS OF NONINFLUENCE OF THE REFLECTANCE OF THE TARGET IN THE DETERMINATION OF VIGNETTING EFFECT
To confirm that reflectance of the target has no influence on the determination of vignetting functions, vignetting curves were approximated on the basis of white or gray targets, separately (see Fig. 8). These curves were then compared.
[FIGURE 8 OMITTED]
To achieve this aim, measurements realized in the first quadrant were superimposed on measurements realized in the second quadrant and mean vignetting values were calculated for each target. As white and gray targets did not have the same abscissae, it was not possible to work in an identical manner to that used for the control of the symmetry. Vignetting curves were here approximated with a polynomial of degree six. The RMSE of each approximated curve was then calculated and the difference of RMSE was calculated to evaluate the similarity of the vignetting effects.
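The degree-six polynomial approximation can be reproduced with a standard least-squares fit. In the sketch below the data are synthetic stand-ins for the measured vignetting values (illustrative only; the study used Matlab, NumPy is substituted here):

```python
import numpy as np

# Vignetting values v at target angles theta (degrees); the quadratic
# falloff is a synthetic placeholder for measured data.
theta = np.linspace(-87.5, 87.5, 47)
v = 1.0 - 2.0e-5 * theta**2

x = theta / 90.0                      # normalized abscissa, better conditioned
coeffs = np.polyfit(x, v, deg=6)      # degree-six polynomial, as in the study
v_fit = np.polyval(coeffs, x)         # approximated vignetting curve
```

The RMSE of such an approximated curve against the no-vignetting reference can then be compared between the white-target and gray-target fits.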
4.5.1.3 SIMILARITY OF VIGNETTING EFFECT BETWEEN DEVICES
The comparison between the vignetting effects of the four devices (CAM#1FE#1, CAM#2FE#2, CAM#1FE#2 and CAM#2FE#1) was realized on the basis of white squares only, by calculating differences of RMSE, for four apertures.
4.5.2 VIGNETTING FILTER
Generally, the evaluation of lens vignetting aims at determining a filter to apply to HDR pictures, in order to counterbalance the light fall off observable from the center of the picture to its periphery. This filter is determined, for different lens apertures, as the inverse of the approximated vignetting function:
F(A[v.sub.i]) = 1 / V(A[v.sub.i]), (8)

where F(A[v.sub.i]) is the filter determined for the aperture A[v.sub.i] and V(A[v.sub.i]) is the approximated vignetting function for this same aperture A[v.sub.i].
This filter is applied to HDR pictures to correct vignetting errors, as follows:
[L.sub.HDR,corrected] = [L.sub.HDR] x F(A[v.sub.i]), (9)

where [L.sub.HDR,corrected] is the vignetting-corrected HDR picture, [L.sub.HDR] is the original HDR picture, and F(A[v.sub.i]) is the filter determined for the aperture A[v.sub.i] used to take the picture.
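Equations (8) and (9) amount to a per-pixel multiplication. A minimal sketch (helper names ours):

```python
def vignetting_filter(v):
    """Filter value as the inverse of the approximated vignetting
    function, as in (8); v is the fitted vignetting value at a pixel."""
    return 1.0 / v

def correct_pixel(l_hdr, filter_value):
    """Vignetting-corrected luminance, as in (9)."""
    return l_hdr * filter_value
```

For example, a pixel whose vignetting value is 0.27 (the 73 percent peripheral loss reported for f/2.8) is boosted by a factor of about 3.7.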
5.1 PROJECTION FORMULA
The SIGMA 4.5 mm F2.8 fisheye lenses used in this study theoretically produce an equisolid projection. On the basis of the pictures of the experimental setup, this equisolid projection type was checked.
The radial position of each square target was measured and a mapping function was approximated using a nonlinear regression model (nls2 package) in R software [R Development Core Team 2010]. Then, the theoretical formula, which is:
r = 2 x f x sin([theta]/2), (10)
was adapted by the following approximated mapping function:
r = 1.888 x f x sin([theta]/1.997), (11)
where r is the distance from the optical axis, [theta] is the entrance angle measured from the optical axis, and f is the focal length, that is 4.5 mm. This new mapping function, which assumes a single center of projection as explained in Section 3.1, minimizes errors between the formula and the measured points.
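Inverting (11) gives the entrance angle corresponding to a radial image distance, which is useful when mapping pixels to directions. A sketch (helper names ours; angles handled in degrees):

```python
import math

FOCAL = 4.5  # focal length of the lens, in mm

def radius_from_theta(theta_deg, f=FOCAL):
    """Approximated mapping function (11): r = 1.888 * f * sin(theta/1.997),
    with theta the entrance angle; returns r in the units of f."""
    return 1.888 * f * math.sin(math.radians(theta_deg) / 1.997)

def theta_from_radius(r, f=FOCAL):
    """Inverse of (11): entrance angle in degrees for a radial distance r."""
    return math.degrees(1.997 * math.asin(r / (1.888 * f)))
```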
The theoretical and the approximated mapping functions for the equisolid projection as well as measured points are presented in Fig. 9, for two tested devices (CAM#1FE#1 and CAM#2FE#2).
[FIGURE 9 OMITTED]
5.2 HORIZONTAL ILLUMINANCE IN THE MIRROR BOX
Figure 10 presents the variation of horizontal illuminance in the mirror box during the experiment.
[FIGURE 10 OMITTED]
[FIGURE 11 OMITTED]
From the beginning to the end of the experiment, a 3 percent decrease of illuminance level inherent to the lamp flux variation is observed.
5.3 PHYSICAL LUMINANCES
Figure 11 presents physical luminances of white and gray squares, measured with the luminance meter. As explained in Section 3, four series of physical luminance measurements were realized during the experiment. The first series presents slightly higher values than the following ones.
The gray squares have a luminance around 250 cd/[m.sup.2] whereas the white squares have a luminance around 1,200 cd/[m.sup.2]. The standard deviation of luminances of white squares is 15 cd/[m.sup.2] whereas that for the gray squares is 6 cd/[m.sup.2].
5.4 CALIBRATION FACTOR
Table 2 presents the calibration factors (CF) calculated for each device, each type of target (white (W) or gray (G)), and the 19 tested apertures (f/2.8 to f/22), as well as their means, medians, and standard deviations (std).
For medium apertures, from f/4.5 to f/12, the difference between the white and gray calibration factors is generally less than 0.05, whereas the other apertures show greater differences.
TABLE 2. Calibration Factors

         CAM#1FE#1    CAM#2FE#2    CAM#1FE#2    CAM#2FE#1
Av        W     G      W     G      W     G      W     G
f/2.8    1.74  1.79   1.43  1.45   1.61  1.45   1.55  1.59
f/3.2    1.75  1.83   1.47  1.51    -     -      -     -
f/3.5    1.64  1.72   1.41  1.48    -     -      -     -
f/4      1.69  1.71   1.42  1.48   1.55  1.48   1.55  1.61
f/4.5    1.77  1.78   1.47  1.50    -     -      -     -
f/5      1.62  1.63   1.35  1.38    -     -      -     -
f/5.6    1.74  1.70   1.46  1.49    -     -      -     -
f/6.3    1.84  1.85   1.50  1.53    -     -      -     -
f/7.1    1.66  1.65   1.35  1.37    -     -      -     -
f/8      1.64  1.60   1.31  1.32    -     -      -     -
f/9      1.76  1.72   1.36  1.39    -     -      -     -
f/10     1.58  1.58   1.31  1.34   1.44  1.34   1.46  1.46
f/11     1.80  1.75   1.36  1.38    -     -      -     -
f/12     1.72  1.67   1.31  1.31    -     -      -     -
f/14     1.82  1.75   1.32  1.35    -     -      -     -
f/16     1.81  1.73   1.35  1.31    -     -      -     -
f/18     1.87  1.77   1.39  1.30    -     -      -     -
f/20     1.70  1.59   1.32  1.26    -     -      -     -
f/22     1.76  1.64   1.30  1.25   1.52  1.25   1.49  1.50
Mean     1.73  1.71   1.38  1.39   1.53  1.38   1.51  1.54
Median   1.74  1.72   1.36  1.38   1.54  1.39   1.52  1.55
Std      0.08  0.08   0.06  0.09   0.07  0.11   0.05  0.07
5.5 VIGNETTING EFFECT
Figures 12 and 13 present the vignetting values determined respectively for CAM#1FE#1 and CAM#2FE#2, using (6). For reasons of clarity, only five apertures are shown in the graphs, but 19 apertures have been studied.
In the first quadrant of CAM#1FE#1 (see Fig. 12), some values corresponding to smaller apertures do not follow the tendency of the curve.
Figures 12 and 13 show that luminance loss is enhanced for larger apertures. A slight asymmetry is observed between the first and the second quadrant.
[FIGURE 12 OMITTED]
[FIGURE 13 OMITTED]
[FIGURE 14 OMITTED]
5.5.1 EVALUATION OF DIFFERENCES
In order to judge the symmetry, the RMSE is calculated in each quadrant as explained in Section 4.5.1.1.
The differences between the RMSE calculated in the first quadrant and in the second quadrant are less than 2 percent for the two devices and all the apertures, except f/16, f/18, f/20, and f/22 for the first device and f/2.8 for the second one (see Fig. 14).
The mean difference of RMSEs for CAM#1FE#1 is 1.23 percent whereas it is 1.02 percent for CAM#2FE#2.
5.5.1.2 REFLECTANCE OF THE TARGET
Figure 15 presents the differences of RMSE calculated, for each device, on the basis of white or gray vignetting values.
The differences of RMSE are higher than 2 percent for the two smallest apertures of CAM#1FE#1 and the five smallest apertures of CAM#2FE#2.
5.5.1.3 SIMILARITY BETWEEN THE FOUR DEVICES
Figure 16 presents the RMSE of the vignetting curves approximated for each tested aperture of each device and calculated as in (7).
[FIGURE 15 OMITTED]
[FIGURE 16 OMITTED]
Whatever the device, RMSE is maximal for the largest aperture, decreases until the f/5.6 aperture, and remains roughly constant until f/22. For large apertures, RMSE is lower for CAM#1FE#1, whereas the opposite holds for smaller apertures. Finally, a side-by-side comparison was realized by calculating the differences of RMSE between devices, as summarized in Fig. 17.
[FIGURE 17 OMITTED]
Differences of RMSE between devices are always less than 2 percent and are globally higher for smaller apertures.
5.5.2 VIGNETTING FILTER
After analyzing the differences of vignetting effect between devices as well as the symmetry and the influence of the reflectance of the target on the luminance falloff, vignetting filters were calculated. For each aperture, a filter was determined as the inverse of the vignetting curve approximated on the basis of the luminances of the white targets, captured with the CAM#2FE#2 device (see Fig. 18). These vignetting curves were approximated by a sixth-degree polynomial. For reasons of clarity, only five apertures are shown in Fig. 18, but coefficients of the vignetting curves as well as the luminance loss at the periphery of the picture, for all the apertures, are specified in Table 3.
TABLE 3. Coefficients of the Polynomial Vignetting Functions (V = a[x.sup.6] + b[x.sup.5] + c[x.sup.4] + d[x.sup.3] + e[x.sup.2] + fx + 1)

Av      a        b        c        d        e        f        [R.sup.2]  Maximal Loss (at Periphery)
f/2.8   -4.6322  8.2363   -5.3033  2.031    -0.7423  -0.3243  1          73%
f/3.2   5.148    4.9714   4.6563   7.252    2.2996   0.1941   0.9998     67%
f/3.5   9.6619   19.082   12.07    1.5323   0.5931   0.1051   0.9995     63%
f/4     10.416   22.24    17.324   5.4708   0.4415   0.0379   0.9997     51%
f/4.5   16.995   38.535   31.706   11.233   1.499    0.0515   0.9928     38%
f/5     19.773   46.952   40.763   15.507   2.3204   0.0809   0.9809     32%
f/5.6   10.524   25.072   21.66    7.8575   0.8505   0.0533   0.9386     16%
f/6.3   1.8667   2.5246   0.5489   2.386    1.3558   0.2387   0.9354     6%
f/7.1   0.3065   0.9753   2.9796   2.6684   1.0975   0.17     0.9815     4%
f/8     1.1387   4.3744   5.842    3.7053   1.266    0.1882   0.9888     2%
f/9     3.5709   11.455   13.641   7.7084   2.1894   0.2546   0.9673     2%
f/10    2.5612   8.7685   11.139   6.7781   2.0969   0.2654   0.9655     1%
f/11    0.3919   0.6742   3.016    3.1818   1.4497   0.2431   0.9821     3%
f/12    1.5912   5.4085   6.7282   4.1054   1.3756   0.1985   0.9634     2%
f/14    0.3552   0.7969   2.9851   2.972    1.3246   0.2219   0.9793     4%
f/16    0.7516   3.6828   5.6796   4.0508   1.4906   0.2306   0.8653     4%
f/18    0.0303   1.2351   2.8527   2.5295   1.1092   0.1966   0.9391     3%
f/20    0.7736   0.7316   0.9932   1.7952   0.9881   0.1782   0.894      3%
f/22    0.9597   1.4964   0.3075   0.742    0.6078   0.1282   0.9461     3%
[FIGURE 18 OMITTED]
The luminance loss is maximal and equals 73 percent for an aperture of f/2.8. It decreases with smaller apertures, is strongly reduced for apertures from f/6.3 to f/22, and is minimal for f/8, f/9 and f/10.
Finally, vignetting filters were applied to original HDR pictures. A comparison between luminance values measured by the spot luminance meter and those captured in the HDR photography was carried out, before and after the correction of the vignetting effect.
This comparison showed that the determined filters have a positive impact for large apertures, from f/2.8 to f/5.6, whatever the device (CAM#1FE#1 or CAM#2FE#2). For apertures smaller than f/5.6, however, the application of the filter does not significantly reduce the RMSE between HDR luminances and spot luminance meter measurements for CAM#2FE#2, and slightly increases it for CAM#1FE#1.
Before the vignetting correction, RMSEs calculated for the seven largest apertures vary between 31.4 percent (f/2.8) and 7.4 percent (f/5.6) for CAM#1FE#1, and between 32.7 percent (f/2.8) and 5.8 percent (f/5.6) for CAM#2FE#2. After the application of the filters, these errors are reduced to 3 percent for CAM#1FE#1, and respectively to 3.3 and 2.3 percent for CAM#2FE#2. Without applying any filter, for apertures varying from f/6.3 to f/22, RMSEs vary between 2.4 and 6 percent for CAM#1FE#1 and between 2 and 3.3 percent for CAM#2FE#2, with the higher RMSEs corresponding to smaller apertures. If filters are applied, these RMSEs vary respectively between 2.9 and 8.2 percent for CAM#1FE#1, and between 1.9 and 3.3 percent for CAM#2FE#2.
6.1 EXPERIMENTAL SET-UP
The semicircular setup used in this study differs from the ones used in other studies evaluating the vignetting effects of fisheye lenses [Inanici 2006; Jacobs and Wilson 2007]. It was preferred for several reasons.
1. The setup used in this study makes it possible to avoid the rotation of the camera during the measurements, which is important because:
* The projection center of the fisheye lens is not a constant point but is a function of the incoming ray [Gennery 2006]. This center thus moves and does not coincide with the rotation axis of the camera;
* One of the interests in the capture of real world luminances with HDR imaging techniques coupled with a fisheye lens is to record luminances in the entire human FOV, rapidly, without rotating the camera;
* Only one set of shots is necessary (per aperture) to obtain luminances of the 180[degrees] FOV. Variation of illuminance is thus minimized during the capture process.
2. Because of the semicircular layout, even if the squares are probably not perfectly Lambertian surfaces, they are all captured at the same angle of view, so their luminance is more constant.
3. Measurements are realized under an artificial sky reproducing a CIE overcast sky luminance distribution which offers a stable lighting environment, evenly distributed on the horizontal plane.
4. Because of the semicircular layout, symmetry of the fisheye can be checked, as well as its mapping function.
5. Targets have two different reflectance coefficients (gray and white), which makes it possible to evaluate the vignetting effect for two different gray tones.
6.2 PHYSICAL LUMINANCES
In Jacobs and Wilson [2007], the vignetting effect of the fisheye lens was determined without comparing the captured values with physical luminances, in order to avoid an absolute photometric calibration.
In this experiment, it was decided to record the physical luminances to counter the fact that the luminances of the targets are not constant over time, as shown in Fig. 11. Indeed, during the experiment, the luminous flux of the lamps of the mirror box decreased: in Fig. 11, the fourth series of physical luminance measurements presents lower luminance values than the first series. Horizontal illuminances were therefore recorded in the mirror box in order to weight the measured luminances. After this correction, the luminances of targets situated at the periphery of the device remain slightly lower than those at the center. This can be due to the fact that the setup is not perfectly symmetric, as it is a semicircle and not a full circle, so targets at the periphery of the setup do not have the same surroundings as targets at its center (see Fig. 5).
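The illuminance weighting described above might be sketched as follows. Linear scaling of luminance with horizontal illuminance is an assumption on our part (the paper only says the recorded illuminances were used to weight the luminances), and all values are hypothetical:

```python
def weight_luminances(luminances, e_measured, e_reference):
    """Rescale measured luminances to a reference horizontal illuminance,
    compensating for lamp flux decay. Assumes luminance scales linearly
    with the horizontal illuminance in the mirror box."""
    k = e_reference / e_measured
    return [lum * k for lum in luminances]

# Hypothetical example: lamp decay dropped the horizontal illuminance
# from 1000 lx (first measurement series) to 950 lx (fourth series).
corrected = weight_luminances([95.0, 47.5], e_measured=950.0, e_reference=1000.0)
print([round(v, 6) for v in corrected])  # → [100.0, 50.0]
```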
As the luminances of the targets are not perfectly constant over the angle [theta], it is still necessary to evaluate the vignetting effect by comparing physical luminance measurements with luminances extracted from the HDR picture instead of evaluating the luminance falloff from the center of the image to its border.
6.3 CALIBRATION FACTOR
Cai and Chung [2011] showed that a local calibration improves the accuracy of HDR pictures, even if the differences with a global calibration are not significant. In this study, the targets are locally calibrated, according to their reflectance, in order to avoid introducing errors due to the photometric calibration into the determination of the vignetting effect. As it was not possible to place a white and a gray square at the exact center of the setup at the same time, a white square was placed at the center, surrounded by two gray ones at [theta] = +/-2.5[degrees], where the vignetting is still negligible. We observed that the vignetting curves are smoother when working in this manner than when using a mean calibration factor.
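A minimal sketch of such a local calibration, assuming it reduces to a per-reflectance ratio between the spot-meter reading and the HDR-derived luminance of the central target (the paper does not give the exact formula, and all values here are hypothetical):

```python
def local_calibration_factor(spot_center, hdr_center):
    """Per-reflectance calibration factor: ratio of the spot-meter
    luminance of the central target to its HDR-derived luminance.
    Treating local calibration as this simple ratio is an assumption."""
    return spot_center / hdr_center

# Hypothetical readings for the central white square (cd/m^2):
k_white = local_calibration_factor(spot_center=120.0, hdr_center=115.0)
print(round(k_white, 4))  # → 1.0435

# The factor is then applied to every white target of the same
# reflectance before deriving the vignetting curves:
white_hdr = [115.0, 108.0, 92.0]
white_calibrated = [k_white * lum for lum in white_hdr]
```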
6.4 VIGNETTING EFFECT
As expected, the vignetting effect of the Sigma 4.5 mm F2.8 lens is not significant at the center of the lens, is maximal at the periphery, and is accentuated for large apertures (for example f/2.8) (see Figs. 12 and 13). Vignetting effect is weak for apertures smaller than f/5.6 and increases for apertures larger than f/5.6. Minimal loss at the periphery of the fisheye is encountered for middle apertures f/8, f/9 and f/10.
Compared with the luminance loss of other fisheye lenses (see Table 4), the Sigma 4.5 mm F2.8 equisolid fisheye lens has a vignetting effect less pronounced than that of the equidistant Sigma 8 mm f/3.5, but more pronounced than that of the equidistant Nikon FC-E9 lens.
6.4.1 EVALUATION OF DIFFERENCES
A radially symmetric tendency was observed in Figs. 12 and 13, even if small differences were detected between measurements realized in the first quadrant and in the second one. These differences may be due to the fact that the camera was not perfectly centered at [theta] = 0[degrees]. Indeed, some difficulties were encountered in positioning the camera exactly at the center of the setup, because the center focus point visible in the viewfinder is not situated at the center of the fisheye lens and thus cannot be used as a reference point. During pre-tests, the camera position was checked by analyzing the captured image, and the position was adjusted for the experiment so that the entrance pupil was placed at the center of the semicircular setup. Through the calculation of the RMSE, the radial symmetry of the lens was checked and, for each target, a mean vignetting value was calculated from the values captured in each quadrant.
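The symmetry check and quadrant averaging can be sketched as follows. Using an RMSE between mirrored angles follows the paper's approach in spirit, but the per-target pairing and the data values below are hypothetical:

```python
import math

def quadrant_rmse(q1, q2):
    """RMSE between vignetting values measured at mirrored angles
    (+theta in the first quadrant, -theta in the second), used here
    as a simple radial-symmetry check."""
    d = [a - b for a, b in zip(q1, q2)]
    return math.sqrt(sum(x * x for x in d) / len(d))

def mean_vignetting(q1, q2):
    """Per-target mean of the two quadrant measurements."""
    return [(a + b) / 2.0 for a, b in zip(q1, q2)]

# Hypothetical relative luminances at theta = 0, 45 and 90 degrees:
q1 = [1.00, 0.95, 0.80]
q2 = [1.00, 0.93, 0.82]
print(round(quadrant_rmse(q1, q2), 4))              # → 0.0163
print([round(v, 2) for v in mean_vignetting(q1, q2)])  # → [1.0, 0.94, 0.81]
```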
TABLE 4. Comparison of Luminance Loss at the Periphery of Fisheye Lenses

Lens                              Projection type   f/4    f/5.6   f/16
Sigma 8 mm f/3.5 [Inanici 2010]   Equidistant       ~55%   ~40%    ~18%
Nikon FC-E9 [Inanici 2006]        Equidistant       23%    -       -
Canon 7.5 mm [Wagner 1998]        Equidistant       -      ~38%    -
Sigma 4.5 mm F2.8 (this study)    Equisolid         51%    16%     4%
Regarding the influence of the reflectance of the target, the analysis showed that the approximated vignetting curves were similar whatever the reflectance of the target and the device, except for the two smallest apertures of CAM#1FE#1 and the five smallest apertures of CAM#2FE#2 (that is, f/14, f/16, f/18, f/20, and f/22). Indeed, these apertures presented differences in RMSE higher than 2 percent, a difference judged to be significant. These differences can be due to lens diffraction, a phenomenon which appears at small apertures and affects image sharpness. In order to avoid determination errors due to lens diffraction, it was decided, whatever the aperture, to approximate the vignetting effect on the basis of the white squares only, which present the highest luminances.
Lastly, the vignetting effect of the four devices was judged to be similar, as the differences in their RMSEs were below 2 percent. Nevertheless, the vignetting effect of the CAM#1FE#1 device always seems slightly less pronounced than that of CAM#2FE#2, although an overestimation of luminance values was observed for its smaller apertures.
6.4.2 VIGNETTING FILTER
Although the number of devices studied is not sufficient for a valid statistical analysis, the study tends to show that the vignetting effects of the four devices can be judged similar. Therefore, vignetting filters were determined on the basis of the measurements of the white squares of one of the four devices. The CAM#2FE#2 device was chosen for two reasons:
* It presents the best symmetry and was thus probably better centered;
* It presents fewer overestimated luminance values for the smaller apertures; that is why the RMSEs of its curves are lower than those of CAM#1FE#1 (see Fig. 16).
RMSEs between captured HDR luminances and spot luminance measurements were calculated before and after applying the vignetting filters to the initial HDR pictures. This analysis showed that the vignetting filters reduce RMSEs for apertures between f/2.8 and f/5.6, but that for apertures smaller than f/5.6 it is not necessary to apply the filter, as the RMSEs are already low and the application of the filters does not significantly improve the results.
Finally, average error percentages were calculated in the same way as in Inanici [2006] and Cai and Chung [2011], so that they can be compared with the errors published in those studies. Vignetting filters were applied for apertures f/2.8 to f/5.6. After correction, average error percentages for these apertures vary from 2.3 to 6.3 percent for CAM#1FE#1 and from 1.8 to 4.4 percent for CAM#2FE#2. No vignetting filter was applied for apertures from f/6.3 to f/22, as the average error percentages between HDR luminances and spot luminance measurements were already low. Indeed, for these apertures, average error percentages vary between 1.9 and 3.6 percent for CAM#1FE#1 and between 1.6 and 2.7 percent for CAM#2FE#2, which is acceptable compared with those published by Inanici (5.8 percent for gray targets) or Cai and Chung (2.8 percent for gray surfaces). Moreover, the middle apertures f/8, f/9, and f/10, whose vignetting effect and luminance loss at the periphery of the picture are weak and which avoid the appearance of diffraction, present average error percentages of 1.9, 1.9 and 2 percent respectively (CAM#1FE#1) and 1.6, 2.1 and 1.9 percent (CAM#2FE#2).
As expected, the vignetting effect of the studied lens, which produces an equisolid projection, is not significant at the center of the lens, is maximal at the periphery, and is accentuated for large apertures. For the largest aperture, f/2.8, the luminance loss at the periphery of the fisheye is maximal and reaches 73 percent. For apertures smaller than f/5.6, the luminance loss at the periphery of the picture is almost nonexistent.
The hypotheses of radial symmetry and noninfluence of the reflectance of the target on the determination of vignetting effect have been checked. Furthermore, this first comparison of the vignetting effect of two identical lenses mounted on two identical cameras shows that vignetting curves determined for one device (CAM#2FE#2) can be reasonably used to correct the vignetting effect of the other device (CAM#1FE#1), made up of lens and camera of the same brands. More generally, vignetting curves determined accurately in this study for a SIGMA 4.5 mm F2.8 fisheye lens, mounted on a CANON 40D camera, can be used by other researchers using similar photographic materials. Nevertheless, the response curve of the camera should always be determined by the researcher as it can vary even between cameras of same make and model [Jacobs 2007].
The experiment highlighted the negative impact of the lens diffraction phenomenon on the accurate capture of luminance at smaller apertures (for example f/22). Small apertures are thus not recommended when the HDR technique is used as a luminance data acquisition tool, even though they cover a greater depth of field. A compromise is to work with middle apertures, which cover a depth of field adequate for lighting studies and sky luminance capture while avoiding both the vignetting effect (which appears with large apertures) and diffraction (which appears with small apertures). When capturing luminances with the HDR technique and a Sigma 4.5 mm F2.8 fisheye lens, three middle apertures are recommended: f/8, f/9 and f/10. Indeed, these apertures minimize the vignetting effect and luminance loss at the periphery of the lens, avoid the negative impact of lens diffraction, and cover a depth of field adequate for building analysis and sky capture. When working with these apertures, and more generally with apertures smaller than f/5.6, no vignetting filter is required. But if, for exposure reasons, apertures larger than f/5.6 are necessary, the corresponding vignetting filter should be applied to the HDR picture in order to reduce the vignetting effect (see coefficients in Table 3).
Finally, the experimental setup was an opportunity to correct the projection formula given by the manufacturer, as in practice the lens does not exactly comply with the theoretical formula. A new mapping function was approximated; it could be used, on the basis of LDR or HDR fisheye pictures, to determine the precise position of the sun in the sky, to introduce real sky luminances into simulation tools, or to analyze the distribution of luminances in the FOV.
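For reference, the theoretical equisolid-angle mapping that the lens is nominally built to, and from which the paper found the real lens deviates, is easy to evaluate. The fitted replacement mapping is not reproduced in this excerpt, so only the nominal formula is sketched here:

```python
import math

def equisolid_radius_mm(theta_deg, focal_mm=4.5):
    """Theoretical equisolid-angle mapping r = 2*f*sin(theta/2) for a
    lens of focal length f (default: the Sigma 4.5 mm F2.8), where
    theta is the angle of the incoming ray from the optical axis."""
    theta = math.radians(theta_deg)
    return 2.0 * focal_mm * math.sin(theta / 2.0)

print(round(equisolid_radius_mm(0.0), 3))   # → 0.0 (optical axis)
print(round(equisolid_radius_mm(90.0), 3))  # → 6.364 (rim of the 180° FOV)
```

Comparing this nominal r(theta) against radii measured from the semicircular target layout is, in essence, how a deviation from the manufacturer's formula can be detected.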
Coralie Cauwerts and Magali Bodart were supported by the Belgian Research National Foundation (FNRS) and Arnaud Deneyer by the Belgian Building Research Institute (BBRI).
Anaokar S, Moeck M. 2005. Validation of high dynamic range imaging to luminance measurement. Leukos. 2(2):133-144.
Beltran L, Mogo B. 2005. Assessment of luminance distribution using HDR photography. Proc. of ISES Solar World Congress, Orlando, USA.
Bettonvil F. 2005. Fisheye lenses. WGN The Journal of the IMO. 33(1):9-14.
Bodart M, Deneyer A, De Herde A, Wouters P. 2006. Design of a new single-patch sky and sun simulator. Light Res Technol. 38(1):73-87.
Cai H, Chung T. 2011. Improving the quality of high dynamic range images. Light Res Technol. 43(1):87.
Fan D, Painter B, Mardaljevic J. 2009. A data collection method for long-term field studies of visual comfort in real-world daylit office environments. Proc. of PLEA 2009. Quebec City, Canada.
Gennery D. 2006. Generalized camera calibration including fish-eye lenses. Int J Comput Vision. 68(3):239-266.
Goldman DB, Chen JH. 2005. Vignette and exposure calibration and compensation. 10th IEEE International Conference on Computer Vision. Beijing, China. 899-906.
Inanici M. 2006. Evaluation of high dynamic range photography as a luminance data acquisition system. Light Res Technol. 38(2):123-136.
Inanici M. 2009. Applications of image based rendering in lighting simulation: development and evaluation of image based sky models. Proc. of the 11th IBPSA Conference. Glasgow, UK. 264-271.
Inanici M. 2010. Evaluation of high dynamic range image-based sky models in lighting simulation. Leukos. 7(2):69-84.
Inanici M, Navvab M. 2006. The virtual lighting laboratory: per-pixel luminance data analysis. Leukos. 3(2):89-104.
Jacobs A. 2007. High dynamic range imaging and its application in building research. Advances in Building Energy Research. 1(1):177-202.
Jacobs A, Wilson M. 2007. Determining lens vignetting with HDR techniques. Proc. of Light'2007. Varna, Bulgaria.
Konica Minolta. 2011. LS-100/LS-110 Luminance Meters. <http://www.konicaminolta.com/instruments/products/light/luminance-meter/ls100-1s110/specifications.html>. [March 2011].
[LBNL] Lawrence Berkeley National Laboratory. 2010. Radiance 4.0. <http://radsite.lbl.gov/radiance/HOME.html>. [May 2011].
Mardaljevic J, Painter B, Andersen M. 2009. Transmission illuminance proxy HDR imaging: a new technique to quantify luminous flux. Light Res Technol. 41(1):27-49.
Moeck M, Anaokar S. 2006. Illuminance analysis from high dynamic range images. Leukos. 2(3):211-228.
R Development Core Team. 2010. R 2.11.1. <http://www.r-project.org/>. [May 2011].
Santa Clara M. 2009. Digital photography, a tool for lighting research: high-resolution sampling of spherical luminance maps with digital photographic technologies applied to diffuseness descriptors. Proc. of PLEA 2009. Quebec City, Canada.
Schneider D, Schwalbe E, Maas HG. 2009. Validation of geometric models for fisheye lenses. Int Soc Photogramme. 64(3):259-266.
Stumpfel J, Tchou C, Jones A, Hawkins T, Wenger A, Debevec P. 2004. Direct HDR capture of the sun and sky. Proc. of Afrigraph 2004. Cape Town, South Africa. 145-149.
The MathWorks. 2009. MATLAB The Language of Technical Computing. <http://www.mathworks.com/products/matlab/>. [May 2011].
Van Den Wymelenberg K, Inanici M, Johnson P. 2010. The effect of luminance distribution patterns on occupant preference in a daylit office environment. Leukos. 7(2):103-122.
Wagner S. 1998. Calibration of grey values of hemispherical photographs for image analysis. Agr Forest Meteorol. 90(1-2):103-117.
(1) Universite catholique de Louvain (UCL), Architecture & Climat, Belgium; (2) Belgian Building Research Institute (BBRI), Department of Lighting, Energy and Climate, Belgium.
* Corresponding Author: Coralie Cauwerts, Email: coralie. email@example.com
Coralie Cauwerts (1) (*), Magali Bodart PhD (1), and Arnaud Deneyer (2)
Publication date: January 1, 2012.