PLEA 2016 Los Angeles - 36th International Conference on Passive and Low Energy Architecture.
Cities, Buildings, People: Towards Regenerative Environments
Improving the Accuracy of Measurements in Daylit Interior
Scenes Using High Dynamic Range Photography
J. Alstan Jakubiec1, Mehlika Inanici2, Kevin Van Den Wymelenberg3, Alen Mahic3
1Singapore University of Technology and Design, Singapore
2University of Washington, Seattle, USA
3University of Oregon, Eugene, USA
ABSTRACT: Measuring the luminous environment enables researchers and practitioners to study perception and
visual comfort in existing environments in order to decipher the components that contribute to good or problematic
lighting characteristics. High dynamic range photography is commonly used to study visual perception and comfort.
This paper presents new findings and specific methods of capturing interior scenes that may include peaks caused by
a direct view of the sun and specular reflections. Methods are tested to improve the range of luminance values that
can be captured, and new guidelines are proposed to improve the accuracy of computed visual comfort indices.
Keywords: daylighting, high dynamic range imagery, visual comfort, glare, luminous overflow
INTRODUCTION
Daylighting design is a decision-making process that integrates the outdoor environment, building openings, material properties, lamps, luminaires, and lighting controls. These choices affect visual quality, human comfort, performance, and occupant well-being. Measurement of the luminous environment with affordable technologies such as high dynamic range (HDR) photography paired with relatively inexpensive luminance and illuminance sensors has become a common approach among researchers and practitioners to evaluate visual comfort and perception in this context.

HDR photography (Debevec and Malik, 1997) is a three-step technique where: 1) multiple photographic exposures are captured in the field in a successive manner; 2) they are fused into a single HDR image using computational self-calibration methods; and 3) post-processing is done to account for known aberrations (such as the vignetting effect) and to fine-tune the results using a sensor reading taken in the field. Single-aperture HDR photography techniques have been validated for capturing luminance maps of building interiors (Inanici, 2006), and two-aperture and filter-based HDR photography techniques have been successfully used in capturing the sky and simulating image-based lighting for interior and exterior environments (Stumpfel, et al. 2004; Inanici, 2010).

A common use of HDR photography is to evaluate visual comfort in daylit spaces (Fan, et al. 2009; Van Den Wymelenberg, et al. 2010; Konis 2013; Van Den Wymelenberg and Inanici 2014; Konis 2014; Hirning 2014; Van Den Wymelenberg and Inanici 2015; Jakubiec, et al. 2015; Jakubiec, et al. 2016). Being able to accurately measure luminance levels is crucial for the evaluation of existing lighting conditions and for application in lighting research on topics such as visual comfort, electric lighting, or shading controls. However, the authors observe that there are common shortcomings in indoor HDR captures that typically occur when the solar corona or specular reflections are visible in the field of view. When this occurs, typical HDR capture techniques may underestimate the luminance of extremely bright sources that cause discomfort. This inability to measure the full dynamic range of a lighting scene is called luminous overflow. Such shortcomings limit the accuracy of HDR photography in practice; therefore, the intent of this publication is to share important information with the lighting community on missteps in capturing HDR photographs. The paper includes guidelines for measuring bright sources and demonstrates a post-processing method for correcting HDR photographs which exhibit luminous overflow.

In addition, the results of a longitudinal study identifying potential pitfalls of typical HDR capture techniques are presented. Hemispherical HDR photographs were captured with paired vertical illuminance and luminance measurements under a wide variety of lighting conditions, sun positions, and space types. Care was taken to especially include the following conditions: overcast sky, artificial electric illumination, direct vision of the sun, bright specular reflections, intermediate sky, application of shading devices, and combinations of the preceding.

PREVIOUS WORK
This paper builds upon the best practices for HDR image capture laid out in Inanici (2006) and follows from previous work published by the authors (Jakubiec, et al. 2016). In all HDR captures in this paper, full-frame digital cameras outfitted with hemispherical fisheye lenses were used to take HDR photographs. The white
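The multi-exposure fusion in step 2 of the HDR technique can be sketched numerically. This is a deliberately simplified illustration that assumes an already-linear camera response; real pipelines first recover the response curve through self-calibration (Debevec and Malik, 1997). The function name and weighting choice are illustrative assumptions, not the authors' implementation.

```python
# Simplified HDR fusion sketch: estimate relative radiance for one pixel
# from a bracketed exposure sequence, assuming a linear camera response.
# Each exposure votes (pixel_value / exposure_time), weighted by a "hat"
# function that trusts mid-range values over clipped or noisy extremes.

def fuse_exposures(pixels, exposure_times):
    """pixels: 8-bit values of ONE pixel across the bracket;
    exposure_times: matching shutter times in seconds."""
    def weight(z):  # hat weighting: peak trust at mid-range
        return z if z <= 127 else 255 - z

    num, den = 0.0, 0.0
    for z, t in zip(pixels, exposure_times):
        w = weight(z)
        num += w * (z / t)  # radiance estimate from this exposure
        den += w
    return num / den if den else 0.0

# A pixel reading 100 at 1/60 s and 200 at 1/30 s implies a consistent
# relative radiance of 6000 from both exposures.
print(fuse_exposures([100, 200], [1 / 60, 1 / 30]))
```

In practice tools such as Photosphere or hdrgen perform this fusion together with response-curve recovery; the sketch only shows the weighted-average idea behind step 2.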
balance of the camera is set to daylight, and ISO sensor sensitivity is set to 100. Luminance measurements of a neutral grey target were taken using a Konica Minolta LS-110 luminance meter. Vertical illuminance measurements were taken at the front of the fisheye lens using a Konica Minolta T-10a illuminance meter. Both illuminance and luminance measurements were repeated before and after the image capture process in order to detect changing lighting levels during the capture.

Inanici (2006) utilized an aperture size of f/4 to capture luminance values within the range of 1–16,000 cd/m2. The authors found that this range is not wide enough to include the luminance values around the solar disk, which can be as high as a billion cd/m2 (Grondzik et al. 2006), nor those of intense specular reflections. Much of the authors' previous work therefore related to capturing this high luminance range (Jakubiec, et al. 2016).

Initially, aperture selection was investigated as a factor determining the dynamic range of a HDR photograph. Although any aperture size can be utilized as long as an aperture-specific vignetting correction is applied, the choice impacts both the captured luminance range and the potential for lens flare. The authors found that the expected dynamic range difference between f/4 and f/22 apertures is approximately 32-fold (Jakubiec, et al. 2016), which follows the geometric difference between the two settings. It was also found that the maximum captured luminance is approximately 100,000 cd/m2 at f/4 using a Canon EF 8-15mm f/4L USM lens and EOS 5D camera and 3,200,000 cd/m2 at f/22 using the same lens and camera. Maximum dynamic range may vary with camera or lens type and should be checked by researchers. Based on this finding, it stands to reason to use a smaller aperture (such as f/22) to increase the luminance range.

The authors also found that smaller apertures correlated with greater potential for lens flare (Jakubiec, et al. 2016). Lens flare is the result of light scattering in the lens, and it causes false luminance increases around bright light sources. An aperture size of f/11 was chosen as a compromise, with a maximum luminance range of approximately 850,000 cd/m2 while having less potential for lens flare using the aforementioned camera and lens combination.

Since the luminance value of the solar disc is beyond the range a normal HDR capture can achieve using any aperture size, neutral density (ND) filters were investigated to increase the captured range. ND-1, 2, and 3 filters transmit 10%, 1%, and 0.1% of incident light, respectively. A ND-3 filter has been used in capturing HDR images of the sky (Stumpfel et al., 2004), but it has not previously been utilized for interior captures. The maximum luminance capture of a f/11 aperture photograph with a ND-3 filter using the aforementioned example is expected to be approximately 850,000,000 cd/m2 (850,000 / 0.001).

It is important to note that ND filters are not truly neutral, and they introduce a chromatic shift that affects the middle luminous ranges substantially. As peak luminances are captured (sun and specular reflections), lower luminances (the rest of the scene) were found to be reduced as compared to a standard capture using Inanici's 2006 method. This effect can be observed in Fig. 1, which illustrates the various capture methods employed in this paper applied to a daylit scene with direct vision of the sun on a clear day. 1A shows the human perception of the scene using the Radiance 'pcond' tool (Ward, 1994), 1B shows a capture without a filter, and 1C shows a capture using a ND-3 filter. The ND-3 filter capture measures a significantly higher solar luminance compared to the normal capture (a nearly 200-fold increase), but it also lowers the average diffuse scene luminance by 33.1% and shifts the color range of the photograph noticeably compared to Fig. 1A.

The authors' previous work also found that the absence of high luminance values due to luminous overflow significantly impacts the calculation of human visual comfort indices such as Daylight Glare Probability (DGP) (Wienold and Christoffersen 2006) and the Unified Glare Rating (UGR) (CIE 1995). UGR values for images corresponding to vertical illuminances greater than 5,000 lx were found to increase on average by a value of 10.17, enough to move an evaluation from 'imperceptible' to 'disturbing.' DGP values for the same images were found to increase on average by 0.23, enough to move an evaluation from 'imperceptible' to 'intolerable.' In most—but not all—cases, vertical illuminances under 5,000 lx indicated a lack of overflow (Jakubiec, et al. 2016). Therefore, the previous study recommended the use of ND filters for conditions above 5,000 vertical lux measured at the camera lens in order to ensure that an adequate luminance range is captured.

METHODOLOGY
Seventy-six HDR photographic series have been captured under a wide variety of interior luminous conditions: overcast skies, intermediate skies, sunny skies, electrically lit scenes, and especially scenes that contain direct vision of the sun or bright specular reflections (those likely to experience luminous overflow and visual discomfort). The vertical illuminances corresponding to the HDR photographs taken range from 25.56 lx to 74,850 lx. In order to quantify the effects of luminous overflow, a photograph with no filter (NF) was quickly followed by a photograph with a ND-3 filter. Grey card luminances and vertical illuminances were measured before and after each capture sequence in order to ensure
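The arithmetic behind the extended range is simple: an ND filter that transmits a fraction of the incident light raises the camera's measurable ceiling by the reciprocal of that transmittance. A minimal sketch, using the 850,000 cd/m2 f/11 ceiling reported above (the function name is illustrative):

```python
# Effective maximum measurable luminance when an ND filter is added.
# ND-1/2/3 transmit 10%, 1%, and 0.1% of incident light, i.e. they
# attenuate by factors of 10, 100, and 1000, which multiplies the
# camera's unfiltered luminance ceiling by the same factor.

ND_ATTENUATION = {"ND-1": 10, "ND-2": 100, "ND-3": 1000}

def max_luminance_with_filter(base_max_cd_m2, filter_name):
    return base_max_cd_m2 * ND_ATTENUATION[filter_name]

# The ~850,000 cd/m2 ceiling at f/11 becomes 850,000,000 cd/m2 with ND-3.
print(max_luminance_with_filter(850_000, "ND-3"))  # → 850000000
```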
Figure 1: Comparison of image capture and processing techniques explored in this paper
that lighting levels did not change significantly. Large deviations in illuminance, greater than 10%, during a single capture or between the NF and ND-3 captures resulted in images being removed from the study. Therefore, all NF and ND-3 HDR images in this study are comparable. All images are corrected for vignetting, whereby wide-angle lenses exhibit a noticeable luminance decrease from the centre of the image to the perimeter (Inanici, 2006; Cauwerts and Deneyer 2012; Jakubiec, et al. 2016). Furthermore, each HDR image is calibrated based on the average of the before-and-after grey card luminance measurements.

It is impossible to know the true luminance of the sun with current technology. The luminance meters available for this study have a maximum range of 999,900 cd/m2, far short of the billion cd/m2 potential solar luminance (Grondzik, et al. 2006). Instead, vertical illuminance measurements taken at the lens are used to check the validity of HDR images. These independent illuminance
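The vignetting correction mentioned above divides each pixel by the lens's radial fall-off. A minimal sketch: the fall-off curve below is a hypothetical polynomial chosen for illustration, not a calibrated curve for any particular lens (real corrections are lens- and aperture-specific, per the references above).

```python
# Vignetting correction sketch: fisheye lenses darken toward the image
# perimeter, so each pixel's luminance is divided by the relative lens
# transmission at its radial position. The polynomial is illustrative
# only; real workflows use a measured, aperture-specific curve.

def falloff(r):
    """Relative transmission at normalized radius r (0 centre, 1 edge)."""
    return 1.0 - 0.25 * r ** 2  # hypothetical: 25% darker at the edge

def correct_vignetting(luminance_cd_m2, r):
    return luminance_cd_m2 / falloff(r)

# A perimeter pixel measured at 75 cd/m2 is restored to 100 cd/m2.
print(correct_vignetting(75.0, 1.0))  # → 100.0
```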
sensor measurements can then be compared to illuminance calculated from the fisheye image (Ev,image). This is only possible when a HDR image contains a full hemispherical view, and is calculated as per Equation 1 below as a summation across all pixels in a hemispherical image capture.

Ev,image = Σ (L ∗ ω ∗ cos θ)  (1)

θ is the incident angle of light arriving at the camera, L is the pixel luminance in cd/m2, and ω is the solid angle of the pixel in steradians. Computational methods of calculating illuminance from a hemispherical image using Equation 1 are included in sections A and B of the appendix.

As stated in the previous work section, using a ND-3 filter alone is not fully satisfactory, as the mid-range luminances in the scene are reduced. Therefore, two alternative methods are proposed to investigate strong measured luminances while simultaneously maintaining the validity of the rest of the scene luminances. The first method is simply to combine the NF and ND-3 images in an intelligent manner: all pixels in the ND-3 image that are greater than the maximum luminance of the NF image, as found using the Radiance 'pextrem' command (Ward 1994; appendix section C), are substituted into the original NF image. Thus, a new image is created where only pixel luminance values above the measurement capacity of the NF image capture are replaced by the more luminous pixels from the ND-3 image. In this way, the more accurate mid-range luminance data of the NF image is maintained while the total luminous range of the image is increased. An example of this process is included in Fig. 1D, where it can be seen that the maximum luminance is the same as in the ND-3 image but the mid-range luminances are identical to the NF image. The strength of this effect is striking, as the image-calculated vertical illuminance increases by 6,075 lx compared to the ND-3 image alone. In other words, 6,075 lx of illuminance was lost through the effects of the ND-3 filter on measured mid-range luminance values.

The method described in the previous paragraph is still not ideal, because two HDR photographs need to be taken. This is time consuming, taking up to 2 minutes in the field for an adept user. During those 2 minutes, lighting conditions can change substantially (Jakubiec, et al. 2016). Furthermore, the post-processing effort is increased, and the amount of data collected is doubled. An alternative method is to capture only a single image, but increase the luminance of the overflow pixels until the image illuminance is equal to the sensor-measured vertical illuminance. Pixel brightnesses, rounded based on the first digit of the NF maximum luminance value, are increased until the image illuminance is equal to the measured illuminance. In the case of a photograph of the direct sun, this spreads out the solar luminance into a larger solid angle area—as shown in Fig. 1E—where the peak luminance is less but covers a larger area. In this paper this post-processing method is referred to as overflow correction. The entire computational procedure for performing luminous overflow correction is documented in the paper appendix.

For each of the four capture methods (NF, ND-3, NF + ND-3, and overflow correction), peak luminances, image-derived illuminances, and visual comfort values—DGP and UGR—are calculated. In this way, the authors aim to analyze the phenomenon of luminous overflow and make recommendations for its identification and correction.

RESULTS
Fig. 2 (A–D) illustrates comparisons for all 76 image pairs—152 HDR images in total—as a stacked bar chart. In all cases, the colors indicate the capture or correction methodology employed: dark grey (■) indicates a NF photograph, lighter grey (■) indicates a ND-3 photograph, lightest grey (■) indicates a NF + ND-3 combined photograph, and red (■) indicates an overflow corrected image. The X-axis label value of each bar indicates the mean of 4 vertical illuminance measurements taken at the lens before and after the NF and ND-3 captures. The Y-axis indicates values derived from the images: 2A, peak luminances; 2B, the percentage of sensor-measured illuminance represented by the photograph; 2C, image-derived DGP visual comfort values; and 2D, image-derived UGR visual comfort values.

Fig. 2B illustrates that most images under 5,000 vertical lux do not experience overflow, as they are close to the sensor-measured lux value; on average, 88.6% of sensor-measured illuminance is represented by images where Ev < 5,000 lx. On the other hand, images above 5,000 vertical lux are highly likely to experience overflow. On average, only 28.6% of illuminance is represented in a NF photograph for scenes where Ev ≥ 5,000 lx. Using a ND-3 filter increases the mean percentage of measured illuminance to 48.0%, even though the mid-range luminances are attenuated. Using the NF + ND-3 combined image, on average 54.5% of measured illuminance is represented by the HDR image. These results suggest that for many high-illuminance situations, ≥5,000 lx, the direct solar component is dominant, representing on average 71.6% of the total visible light present in such scenes. It is also noteworthy that even with a dynamic range of between 10^8 and 10^9 cd/m2 (2A) for most extremely high illuminances (>25,000 lx), the NF + ND-3 images cannot come close to representing the incident illuminance.
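The overflow-correction idea can be sketched numerically using Equation 1. This toy example treats an image as a short list of (luminance, solid angle, incidence angle) pixels and scales the overflow pixels until the image illuminance matches a sensor measurement; all names and values are illustrative, and the simplification of replacing overflow pixels with one common luminance mirrors the appendix procedure only in spirit.

```python
import math

# Sketch of Equation 1 and overflow correction: compute the image
# illuminance as sum(L * omega * cos(theta)), then raise the overflow
# pixels to a common new luminance so the image matches a sensor Ev.

def illuminance(pixels):
    """pixels: list of (L in cd/m2, omega in sr, theta in radians)."""
    return sum(L * w * math.cos(t) for L, w, t in pixels)

def correct_overflow(pixels, ev_sensor, l_threshold):
    over = [(L, w, t) for L, w, t in pixels if L >= l_threshold]
    contrib = illuminance(over)                       # current share (lx)
    poten = sum(w * math.cos(t) for _, w, t in over)  # sum of omega*cos
    l_new = (ev_sensor - (illuminance(pixels) - contrib)) / poten
    return [(l_new if L >= l_threshold else L, w, t) for L, w, t in pixels]

# Toy scene: one clipped 'solar' pixel plus one diffuse pixel.
scene = [(850_000.0, 1e-5, 0.0), (2_000.0, 0.5, 0.0)]
fixed = correct_overflow(scene, ev_sensor=1020.0, l_threshold=800_000.0)
print(round(illuminance(fixed), 6))
```

After correction the image-derived illuminance equals the 1,020 lx sensor value: the diffuse pixel still contributes 1,000 lx, so the solar pixel is raised until it supplies the missing 20 lx.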
Figure 2: Comparison of image capture and processing techniques explored in this paper
Expanding the dynamic range of photographs to the extent seen so far in this paper does not fully represent the sensor-measured illuminance. This suggests either that further dynamic range is needed, such as by applying a ND-4 filter (0.01% of light transmitted) or using a f/22 aperture, or that measurements of luminance are not very accurate at such high values. However, a ND-4 filter is exceedingly difficult to use indoors, as it is nearly impossible to get a proper exposure, and it is likely to further reduce mid-range luminances. Choosing a f/22 aperture increases the risk of lens flare, as previously mentioned. Therefore, correcting for overflow using the NF photograph and measured vertical illuminance, as previously described and depicted in Fig. 1E, is a plausible option.

Overflow correction achieves photometric parity (Ev = Ev,image) by virtue of its method of adjusting the image until it equals the measured illuminance. A valid concern with overflow correction is whether it may give inaccurate results in terms of glare metrics such as DGP and UGR. Figs. 2C and 2D begin to address this concern, comparing the metrics for the four capture and correction methods. It is often the case that overflow correction increases glare metric results, especially for the aforementioned vertical illuminance conditions greater than 5,000 lx. However, the real question is whether the evaluation of the metric changes with the overflow correction. For DGP results, only two images (at 337 and 731 lx Ev) are moved from one subjective threshold to another beyond the value of the NF + ND-3 method. For UGR results, no single image was changed from one subjective threshold to another.

CONCLUSION AND DISCUSSION
It is likely acceptable to identify luminous overflow conditions from an illuminance mismatch between HDR photographic illuminance and a sensor measurement. It is plausible to correct for the overflow using the measured illuminance value without taking a second photograph with a wider range of luminance capture. The authors found that the overflow correction method proposed herein generates similar results when considering the subjective evaluation of the visual comfort metrics DGP and UGR as well as peak image luminances. In order to ensure the accuracy of HDR images against luminous overflow, the authors make several new recommendations:
1. Measure the luminous range of the specific camera, lens, ISO, and shutter speed combinations to be used in any measurement by capturing a very bright source such as the sun or a halogen lamp. The peak luminance of this image is near the value at which overflow will occur using the specific hardware and camera settings.
2. Measure vertical illuminance in addition to luminance with every HDR photograph. Illuminance derived from the resulting HDR image should be comparable to the sensor measurement if the image is accurately representing luminances in the scene.
3. When the peak luminance of any given HDR image is near the maximum luminous range of the image and the image-computed vertical illuminance is below the sensor-measured value, the image likely exhibits luminous overflow. These images should be corrected for the overflow condition or discarded.

Another observation from this study is that luminous overflow is more likely to occur at vertical illuminances greater than 5,000 lx; however, that is not deterministic, as seen in Fig. 3A and labeled in Fig. 2 with *3A, where overflow occurs through a fabric roller shade at only 731 lx. Likewise, overflow does not always occur for high-illuminance images, as seen in Fig. 3B, illustrating a large visible solid angle of diffuse sky at 7,405 lx (labeled *3B in Fig. 2). These observations are important to keep in mind when using automated capture methods during long-term visual comfort studies.

It is observable from Figs. 2C and 2D that DGP as a metric is highly sensitive to overflow. Twenty image pairs of the 38 that exhibit overflow (52.6% of overflow images) move the DGP value from one subjective threshold using a NF capture to another, higher discomfort threshold. Conversely, this only occurs for 4 of the same 38 images (10.5%) for the UGR metric. Therefore, UGR is more likely to evaluate 'as intended' even when luminous overflow is observable.

The time and monetary cost of verifying the accuracy of HDR photographs and ensuring against luminous overflow is relatively small. Using Inanici's (2006) capture method, it is recommended to always use an independent luminance measurement of a grey card. Taking measurements before and after a HDR capture and adding a vertical illuminance measurement as recommended in this paper is a relatively small time contribution for the photographer. A high-quality illuminance sensor costs roughly 1/8 as much as an equivalent-quality luminance sensor, so the monetary cost is also low. If a photographer is not using independent sensor measurements with each HDR capture, then the monetary barrier is high (a $6,000–$10,000 investment), but the user should realize that the uncertainty of the results is significantly increased without a measurement-based calibration. The analysis and correction process can be done either manually or with automated analysis scripts (as described in the appendix); this requires users to be organized with their data. The analysis and correction process is very fast, on the order of seconds, relative to glare evaluation programs such as Evalglare (Wienold 2015) and the time required to capture and analyze the input images.
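The recommendations above can be combined into a simple screening check for automated capture workflows. A sketch; the 0.9 proximity factor and 10% illuminance shortfall are illustrative thresholds, not values prescribed by this study.

```python
# Screening check for luminous overflow, following the recommendations
# above: flag an image whose peak luminance is near the hardware ceiling
# while its image-computed illuminance falls short of the sensor value.
# The peak_factor and shortfall defaults are illustrative assumptions.

def likely_overflow(peak_lum, max_range_lum, ev_image, ev_sensor,
                    peak_factor=0.9, shortfall=0.10):
    near_ceiling = peak_lum >= peak_factor * max_range_lum
    ev_deficit = ev_image < (1.0 - shortfall) * ev_sensor
    return near_ceiling and ev_deficit

# Peak at the f/11 ceiling while only 30% of the measured Ev is
# represented in the image: flag for correction or discard.
print(likely_overflow(850_000, 850_000, ev_image=2_400, ev_sensor=8_000))
```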
Figure 3: Example HDR images of atypical luminous overflow (or lack thereof) cases. A: Dim HDR with overflow (Ev = 731 lx); B: Bright HDR without overflow (Ev = 7,405 lx).

Further work is necessary in order to validate the simple illuminance-based correction for luminous overflow proposed in this paper. Specifically, it may be more accurate to place all of the 'overflow' luminous energy into a pixel source roughly the size of the solar disc centered within the overflow pixels, rather than adjusting all of the overflow pixels to be more luminous. The validation and study of such other corrective methods is the subject of future work for the authors.

ACKNOWLEDGEMENTS
The first author acknowledges support by the SUTD-MIT International Design Center (IDC). Any opinions expressed herein are those of the authors and do not reflect the views of the IDC.

REFERENCES
CIE, 1995. CIE 117-1995. Discomfort Glare in Interior Lighting. Vienna: CIE.

Cauwerts, C. and A. Deneyer, 2012. Comparison of the Vignetting Effects of Two Identical Fisheye Lenses. Leukos, 8(3), pp.181-203.

Debevec, P. and J. Malik, 1997. Recovering high dynamic range radiance maps from photographs. In SIGGRAPH Computer Graphics Proceedings, Los Angeles, CA, pp.369-378.

Fan, D., B. Painter and J. Mardaljevic, 2009. A Data Collection Method for Long-Term Field Studies of Visual Comfort in Real-World Daylit Office Environments. In Proc. of PLEA, pp.251-256.

Grondzik, W.T., A.G. Kwok, B. Stein and J.S. Reynolds, 2006. Mechanical and Electrical Equipment for Buildings. John Wiley & Sons.

Hirning, M.B., G.L. Isoardi and I. Cowling, 2014. Discomfort Glare in Open Plan Green Buildings. Energy and Buildings, 70, pp.427-440.

Inanici, M., 2006. Evaluation of high dynamic range photography as a luminance data acquisition system. Lighting Research and Technology, 38(2), pp.123-134.

Inanici, M., 2010. Evaluation of High Dynamic Range Image-Based Sky Models in Lighting Simulation. Leukos, 7(2), pp.69-84.

Jakubiec, J.A., C. Reinhart and K. Van Den Wymelenberg, 2015. Towards an Integrated Framework for Predicting Visual Comfort Conditions from Luminance-Based Metrics in Perimeter Daylit Spaces. In Proc. of Building Simulation. Hyderabad, India, Dec. 7-9.

Jakubiec, J.A., K. Van Den Wymelenberg, M. Inanici and A. Mahic, 2016. Accurate Measurement of Daylit Interior Scenes Using High Dynamic Range Photography. In Proc. of CIE Conference on Lighting Quality and Energy Efficiency. Melbourne, Australia.

Konis, K., 2014. Predicting Visual Comfort in Side-lit Open-Plan Core Zones: Results of a Field Study Pairing High Dynamic Range Images with Subjective Responses. Energy and Buildings, 77, pp.67-79.

Stumpfel, J., A. Jones, A. Wenger and P. Debevec, 2004. Direct HDR capture of the sun and sky. In 3rd International Conference on Virtual Reality, Computer Graphics, Visualization and Interaction in Africa. Cape Town, South Africa.

Van Den Wymelenberg, K., M. Inanici and P. Johnson, 2010. The Effect of Luminance Distribution Patterns on Occupant Preference in a Daylit Office Environment. Leukos, 7(2), pp.103-122.

Van Den Wymelenberg, K. and M. Inanici, 2014. A Critical Investigation of Common Lighting Design Metrics for Predicting Human Visual Comfort in Offices with Daylight. Leukos, 10(3), pp.145-164.

Van Den Wymelenberg, K. and M. Inanici, 2015. Evaluating a New Suite of Luminance-Based Design Metrics for Predicting Human Visual Comfort in Offices with Daylight. Leukos, DOI: 10.1080/15502724.2015.1062392.

Ward, G., 1994. The Radiance Lighting Simulation and Rendering System. In Proceedings of SIGGRAPH 94, Computer Graphics Proceedings, Annual Conference Series, pp.459-472.

Ward, G.J. Photosphere, computer software, [Online], Available: . [26 February 2016].

Wienold, J. and J. Christoffersen, 2006. Evaluation methods and development of a new glare prediction model for daylight environments with the use of CCD cameras. Energy and Buildings, 38(7), pp.743-757.

Wienold, J., 2015. Evalglare 1.17, computer software. Available from: . [15 October 2015].
APPENDIX: OVERFLOW CORRECTION
The process of correcting a HDR image for overflow is described in greater detail within this appendix. Specific command line functions are illustrated using the Radiance suite of tools (Ward 1994). This process was automated using a simple script for the presented research.

A. Masking a HDR Image
A monochromatic bitmap image of a white circle on a black background, with the same dimensions as the HDR photograph, can be converted to the Radiance RGBE format using the command,

ra_bmp -r mask.bmp > mask.hdr

And HDR images can have this mask applied by the following command,

pcomb -e "lo=mask*li(1); mask=if(li(2)-.1,1,0);" -o image.hdr -o mask.hdr > image_masked.hdr

It is necessary to append the original view information to the resulting file header, as the 'pcomb' command comments the information out of the image header. In the case of a typical hemispherical fisheye image, this is,

VIEW= -vta -vh 180 -vv 180

B. Calculating Illuminance from a Masked Image
Illuminance can be calculated from a circularly masked hemispherical fisheye image (as created in A) using the command,

pcomb -e "lo=L*Sang*cosCor; L=179*li(1); Sang=S(1); cosCor=Dy(1);" -o image_masked.hdr
| pvalue -d -b -h -H
| total

C. Finding the Maximum Pixel Luminance
The Radiance command,

pextrem image.hdr

will return the locations and RGB Radiance values of the extreme brightest and darkest pixels. The luminance of these pixels can be determined by Equation 2 using the standard 179 lm/W conversion factor in Radiance.

L = 179 * (0.2651*R + 0.6701*G + 0.0648*B)  (2)

D. Determining the Solid Angle of Pixels Above a Set Luminance Threshold
The solid angle of pixels over a set luminance threshold (L_threshold, set slightly below the maximum luminance of the image found in C) can be determined using,

pcomb -e "lo=if(li(1)-L_threshold/179, Sang, 0); Sang=S(1);" -o image_masked.hdr
| pvalue -d -b -h -H
| total

E. Increasing Overflow Pixel Luminances to Achieve a New Illuminance Value
In order to increase pixels over a certain luminance to match an image's calculated illuminance (Ev,image) to an illuminance measurement (Ev), first the current illuminance contribution (Ev,contrib) of the pixels to be corrected should be determined using the command,

pcomb -e "lo=if(li(1)-L_threshold/179, illContrib, 0); illContrib=L*Sang*cosCor; L=179*li(1); Sang=S(1); cosCor=Dy(1);" -o image_masked.hdr
| pvalue -d -b -h -H
| total

And the potential contribution factor (C_poten) of the same group of overflow pixels is the sum of the solid angle and cosine correction for each pixel. This can be determined by the following,

pcomb -e "lo=if(li(1)-L_threshold/179, illPoten, 0); illPoten=Sang*cosCor; Sang=S(1); cosCor=Dy(1);" -o image_masked.hdr
| pvalue -d -b -h -H
| total

It is then possible to select a new pixel luminance (L_new) to make the Ev,image equal to the measured value of Ev as in Equation 3 below,

L_new = (Ev - (Ev,image - Ev,contrib)) / C_poten  (3)

And finally this new luminance value (L_new) can be used to replace pixels over the luminance threshold,

pcomb -e "lo=if(li(1)-L_threshold/179, L_new/179, li(1));" -o image_masked.hdr > image_corrected.hdr

It is again necessary to append the original view information to the resulting file header. In the case of a hemispherical fisheye image, this is,

VIEW= -vta -vh 180 -vv 180
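The pixel-luminance conversion of Equation 2 is easy to verify outside Radiance. A minimal sketch (the function name is illustrative):

```python
# Equation 2: luminance from Radiance RGB radiance values, using the
# standard 179 lm/W luminous efficacy factor and the RGB weighting
# coefficients from the appendix.

def pixel_luminance(r, g, b):
    return 179.0 * (0.2651 * r + 0.6701 * g + 0.0648 * b)

# The weights sum to 1, so an equal-energy pixel of (1, 1, 1) Radiance
# units has a luminance of 179 cd/m2.
print(pixel_luminance(1.0, 1.0, 1.0))
```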