Article

Study of Three-Dimensional Image Brightness Loss in Stereoscopy

1 Department of Systems Engineering and Naval Architecture, National Taiwan Ocean University, 2 Pei-Ning Road, Keelung 20224, Taiwan
2 Institute of Computer Science and Information Engineering, National Taiwan Normal University, Taipei 10610, Taiwan
3 Electronics and Optoelectronics Research Laboratories, Industrial Technology Research Institute, Hsinchu 31040, Taiwan
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Appl. Sci. 2015, 5(4), 926-941; https://doi.org/10.3390/app5040926
Submission received: 26 July 2015 / Revised: 28 September 2015 / Accepted: 13 October 2015 / Published: 21 October 2015

Abstract:
When viewing three-dimensional (3D) images, whether in cinemas or on stereoscopic televisions, viewers experience the same problem: image brightness loss. This study investigates image brightness loss in 3D displays, with the primary aim of quantifying the image brightness degradation in the 3D mode. A further aim is to determine the brightness relationship between 3D images and the corresponding two-dimensional (2D) images, so that the 3D-image brightness values can be adjusted accordingly. In addition, the photographic principle is used in this study to obtain metering values by capturing 2D and 3D images on television screens. By analyzing these images with statistical product and service solutions (SPSS) software, the image brightness values can be estimated with a statistical regression model, which can also indicate the impact of various environmental factors and hardware on the image brightness. In the analysis of the experimental results, comparison of the image brightness between 2D and 3D images indicates a 60.8% degradation in 3D image brightness amplitude. The experimental values, from 52.4% to 69.2%, fall within the 95% confidence interval.


1. Introduction

One of the primary challenges encountered in the development of three-dimensional (3D) displays is image brightness. Even with current stereoscopic techniques, viewers experience a certain amount of brightness degradation when viewing 3D images in cinemas or on stereoscopic televisions. Regardless of whether the images are recorded with half-mirror or double-parallel type filming equipment, the brightness is impaired. Brightness degradation also occurs both in displays with polarizing film groups and when 3D images are viewed through 3D glasses. 3D displays must send separate images to the right and left eyes; however, image brightness loss occurs as a result of crosstalk, which manifests when the contrast between the two images is overly high. If the brightness is increased only to compensate for the 3D image brightness loss in stereoscopy, spectators will not have the same visual experience as with two-dimensional (2D) images. Furthermore, some of the light is scattered when spectators wear 3D glasses; only half of the light intensity from the screen reaches the eyes, and the image brightness is reduced by 30%–50%. Clearly, the brightness of 3D stereoscopic images is lower than that of 2D images during film playback. In addition, some spectators have a strong antipathy towards 3D films as a result of dizziness and other uncomfortable physiological phenomena [1,2,3,4,5].
Because the inter-ocular distance between human eyes is 5–6.5 cm, the two retinas receive images of the same scene from slightly different perspectives. This difference is called visual disparity. Rods and cones are the retinal cells that handle visual signals; they are responsible for converting brightness, color, and other information into optic nerve messages sent to the brain. Fusion of the two different perspectives generates depth perception, so that human eyes can discern 3D objects visually [6,7]. This study focuses on whether 3D images in cinemas or on stereoscopic televisions can be adjusted to compensate for the 3D brightness degradation.

2. Three-Dimensional Image and Brightness Loss

2.1. Theory of Stereo Vision

The 3D visual phenomenon is generated by the parallax effect, which includes both binocular parallax and motion parallax. Binocular parallax is due to the different perspectives of the eyes, resulting in slightly different image messages being received by the left and right eyes. The two received images are combined in the brain to synthesize the stereoscopic effect [8,9], as shown in Figure 1.
Motion parallax is due to changes in the observation point and is affected by the distance of an object relative to a moving background, as shown in the simplified illustration in Figure 2. When observed from point A, the object appears to be to the left of the butterfly, but when observed from point B, the object appears to be to the right of the butterfly.
Figure 1. Two images sent to the brain to synthesize the stereoscopic effect.
Figure 2. Motion parallax.
Using either of the parallax effects, human eyes can produce 3D visual effects when viewing an object. To construct an image display device capable of producing 3D visual effects, one can utilize the parallax technique to adjust the brightness of light, color, and viewing direction, so that the messages received by the left and right eyes differ in viewing angle, creating a stereoscopic effect. The brain activity during the synthesis process depends on complex factors related to human evolution, psychology, and medicine. However, current flat-panel 3D imaging technology remains focused on the concept of the “right and left eyes receiving different images,” which is related to binocular parallax. In other words, provided the viewing environment delivers different images to the right and left eyes, viewers can observe 3D images on a 2D plane screen [10,11]. Currently, on the basis of the binocular parallax theory, 3D image display technologies are roughly divided into stereoscopic and auto-stereoscopic types.

2.2. Three-Dimensional Display and Three-Dimensional Glasses

There are two kinds of stereoscopic displays: passive and active. Passive polarization glasses, also called passive 3D glasses, are an example of passive stereoscopic technology. The working principle of this technology is to laminate a micro-retarder layer onto the front of a general television or monitor, using the polarization direction of the light to separate the left- and right-eye images. The passive polarization glasses then ensure that each of the viewer's eyes sees the appropriate left or right image, producing a 3D effect. The advantage of this approach is its low cost, but the screen resolution is reduced to half the original 2D image resolution, and the overall brightness is also reduced. Color-coded anaglyph 3D glasses can be divided into those using red and cyan, green and red, or blue and yellow filters, as shown in Figure 3.
Figure 3. Classification by color-code: (a) red and cyan and (b) red and green anaglyph 3D glasses used to create 3D stereo images.
Active 3D glasses, shown in Figure 4, are another type of 3D glasses, also called shutter glasses. With this technology, the 3D screen continually alternates the display between the left- and right-eye images at a frequency of 120–240 Hz, while the shutter glasses quickly and alternately shield the right and left eyes in synchronization. Each eye therefore sees only its correct image, and the brain perceives a 3D image.
Figure 4. Active 3D glasses: (a) right and (b) left frame displays.
The main advantage of this technology is that the picture resolution, color, and stereoscopic 3D effect are not sacrificed. However, some users of active 3D glasses suffer from dizziness and discomfort. The other advantages are that there is less blur, this technology is low cost, and it is applicable to television or computer screens as well as projectors, provided their update frequencies can meet the requirements of the glasses. Therefore, active shutter glasses are being used in the majority of the 3D display systems being introduced onto the market at present, including 3D televisions, 3D glasses, and 3D cinema screens.

2.3. Three-Dimensional Image Brightness Loss

The sources of the loss in brightness can be divided into two categories: (a) those due to the half-mirror 3D camera rig used to film the 3D content; and (b) those resulting from the 3D glasses worn while viewing the 3D display. Note that the degrees of brightness degradation caused by polarization glasses and shutter glasses are almost identical, as both devices allow only half of the light to pass through the lenses, whether in the spatial or the temporal domain.
The sources of the brightness differences between 2D and 3D images that have a visual impact on the eye can also be divided into two categories: (a) control during filming of a scene, with adjustments made in accordance with the ambient light (for example, the aperture for the right-eye image may need to be increased so that the brightness of the images received by both eyes is within a similar range); this prevents the eyes from becoming rapidly fatigued, while also reducing the time and cost of dimming the film during post-production; and (b) screen brightness that is too low or too high during 2D or 3D viewing, in which case the eyes can begin to suffer significant discomfort within a short period of time. This study focuses on the brightness of 2D and 3D images within the range where a viewer experiences the same level of brightness for both eyes without suffering physical discomfort, such as eye soreness, dizziness, or vomiting, during the viewing of the 3D images.

3. Experiment Design for Brightness Loss Measurement

In general, commercially available 3D LCD televisions are calibrated by standard measurement methods before the products leave the factory [12]. Moreover, the average screen image brightness can be obtained with a light meter; precise block-by-block local measurement is not required, because human eyes discern the average brightness of the screen image. Hence, 3D LCD televisions were adopted in this study to compare the brightness difference between 2D and 3D films. When comparing 2D and 3D films with the same base brightness, viewers tend to find that 2D images are brighter than 3D images. The adjustments necessary to ensure similar image brightness perception by both eyes are decided mainly on the basis of experimental results. The increase or decrease in brightness is then controlled by tuning the aperture setting and shutter speed during filming, which can reduce the overall filming cost and time.
In this study, the ambient brightness was first set for the filming of a scene. Two lighting groups, providing 3200 and 7500 lx, were used as spatial lighting. Test images were then recorded in an indoor studio, as shown in Figure 5. The experimental and hardware conditions are listed in detail in Table 1. In the experiment, the camera sensitivity under normal ambient exposure was set to ISO 400, which is the most commonly used value. Five aperture values (F4, F5.6, F8, F11, and F16) were selected, depending on the experiment. The shutter speed (4–1/3200 s) employed in conjunction with each of the five aperture values was used as a cross-aperture experimental variable.
Figure 5. Recording environment.
Table 1. Detailed environmental and hardware parameters.
Aperture   EV       Shutter Speed (s)   ISO   Scene Recording Brightness (lx)
F4         −2–+2    1/50–1/3200         400   3200 and 7500
F5.6       −2–+2    1/6–1/500           400   3200 and 7500
F7.1       −2–+2    1/6–1/500           400   3200 and 7500
F9         −2–+2    3–1/500             400   3200 and 7500
F10        −2–+2    4–1/500             400   3200 and 7500
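For illustration only, the following Python sketch (not part of the original study) enumerates an experimental grid in the spirit of Table 1 and converts each aperture/shutter combination to an exposure value using EV = log2(N²/t), the conversion formula listed later in Table 3. The specific shutter values are hypothetical samples within the reported ranges.

```python
import numpy as np

def exposure_value(aperture_n: float, shutter_s: float) -> float:
    """EV = log2(N^2 / t), with N the F-number and t the shutter speed in seconds
    (the conversion formula given in Table 3)."""
    return np.log2(aperture_n ** 2 / shutter_s)

# Hypothetical sampling of the Table 1 ranges (ISO 400, two scene brightness levels).
apertures = [4.0, 5.6, 7.1, 9.0, 10.0]
shutters = [1/500, 1/250, 1/125, 1/50, 1/6]   # seconds, example values only
scene_lx = [3200, 7500]

for n in apertures:
    for t in shutters:
        for lx in scene_lx:
            print(f"F{n:<4} t={t:.4f} s  scene={lx:>4} lx  ->  EV={exposure_value(n, t):5.2f}")
```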
The experimental equipment included a CL200A illuminometer (Konica Minolta, Tokyo, Japan), a Canon 400D camera (Canon, Fukushima Prefecture, Japan), Vizio 32- and 47-inch television screens (WUSH, Irvine, CA, USA) with polarized glasses, and a Sony 52-inch television screen (Sony, Tokyo, Japan) with flash glasses, as listed in Table 2. The illuminometer is a brightness-measuring tool that produces readings in lx. The height of the darkroom was equivalent to three television heights, and the optical axis of the optical test equipment was oriented perpendicular to the center of the display screen, at a distance of three (high-definition television, HDTV) or four (standard-definition television, SDTV) times the display screen height, so that all of the light was received as the average light of a single image [13], as shown in Figure 6. A test image was displayed in both 2D and 3D modes on each 3D TV, and luminance measurements were performed for each mode; in this way, the luminance ranges of the 2D and 3D modes could be determined. In addition, only grayscale test images were used, as color images were not needed in this experiment; the grayscale test images are shown in Figure 7 and Figure 8, respectively.
Figure 6. Schematic of darkroom for display brightness measurement [13].
Table 2. 3D TV vendors.
Vendor   Model                 3D Glasses          Used Time (h)   Year
Vizio    VL320M 32-inch        Polarized Glasses   50              2012
Vizio    M420KD 42-inch        Polarized Glasses   45              2012
Sony     KDL-52XBR7 52-inch    Flash Glasses       70              2011
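As a rough aid (not taken from the paper), the sketch below estimates the prescribed measurement distance of three screen heights for HDTV panels [13], deriving the screen height from the diagonal size under an assumed 16:9 aspect ratio; the screen sizes are those listed in Table 2.

```python
import math

def hdtv_measurement_distance(diagonal_inch: float, multiple: float = 3.0) -> float:
    """Distance (in cm) of the light meter from an HDTV screen, taken as a
    multiple of the screen height; a 16:9 aspect ratio is assumed."""
    diagonal_cm = diagonal_inch * 2.54
    height_cm = diagonal_cm * 9 / math.hypot(16, 9)   # height = diagonal * 9 / sqrt(16^2 + 9^2)
    return multiple * height_cm

for size in (32, 42, 52):   # screen sizes from Table 2
    print(f"{size}-inch screen: measure from roughly {hdtv_measurement_distance(size):.0f} cm")
```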
Figure 7. F5.6 aperture 2D/3D test images.
Figure 8. F10 aperture 2D/3D test images.
Note that the CL200A illuminometer is one of the models commonly used in industry and is also relatively easy to obtain; with the aid of statistical analysis, the results of such experiments can therefore be applied in the industrial sector. Furthermore, the specifications of this device conform to Japanese Industrial Standards (JIS) C1609-1:2006 Class AA and closely match the International Commission on Illumination (CIE) standard observer curves [14].
After fixing the ISO and aperture conditions, images were recorded for different shutter conditions. To avoid obtaining different results for the same ambient light, the parameters were typically only adjusted after the last shooting iteration for a given setup. The camera was turned to aperture priority (AV) mode, and the camera function key “*” was pressed after the aperture adjustment. Then, the following steps were performed in order to obtain the optimal shutter value:
  • The camera was turned to M mode and images were recorded at the given shutter value, ensuring that the EV was 0.
  • The images were recorded within the EV −2 and EV +2 ranges. As a result, each group had 19 datasets.
  • The obtained images were presented on the television screen, and the image brightness was measured in the 2D and 3D modes.
  • The data was collected for analysis using statistical product and service solutions (SPSS) software (IBM, Chicago, IL, USA).

4. Experimental Results

An image brightness regression model was used to analyze the relationship between the screen image brightness and the following variables:
Dependent variable (Y): Screen image brightness;
Independent variables (X):
(1) Screen size;
(2) Scene recording brightness;
(3) Mode (2D or 3D);
(4) Photographic equipment EV;
(5) Interactions between variables, as shown in Table 3.
Based on the standardized regression residuals of the variables, it was determined whether the distribution of the sample was normal; a perfect bell curve would indicate a completely normal distribution. Because of sampling errors, there was a gap between the actual observed-value histogram and the normal distribution curve (Figure 9). However, no extreme values beyond three standard deviations were found in this experiment; as a result, the sample values correspond well with the normal distribution. The study then examined the standardized regression residuals on the normal P-P diagram, which exhibits a 45° line from lower left to upper right (Figure 10). Therefore, the sample observations are approximately in line with the basic assumptions, as shown in Table 4, Table 5 and Table 6.
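The authors performed this analysis in SPSS. Purely as an illustration of the same diagnostics, the sketch below fits an ordinary least-squares model to hypothetical measurements and produces a standardized-residual histogram and a normal P-P plot; the column names and synthetic data are assumptions, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.graphics.gofplots import ProbPlot
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Hypothetical measurement records (the real data were collected as described in Section 3).
n = 200
df = pd.DataFrame({
    "scene_lx": rng.choice([3200, 7500], n),
    "mode_3d": rng.integers(0, 2, n),          # 0 = 2D, 1 = 3D
    "ev": rng.uniform(4, 14, n),
    "screen": rng.choice(["32in", "42in", "52in"], n),
})
df["brightness_lx"] = (20 + 0.0005 * df["scene_lx"] - 6 * df["mode_3d"]
                       - 1.3 * df["ev"] + rng.normal(0, 0.5, n))

# Dummy-code the screen size, as in the paper's regression model.
X = sm.add_constant(pd.get_dummies(df[["scene_lx", "mode_3d", "ev", "screen"]],
                                   columns=["screen"], drop_first=True, dtype=float))
result = sm.OLS(df["brightness_lx"], X).fit()

# Standardized residuals: histogram (cf. Figure 9) and normal P-P plot (cf. Figure 10).
std_resid = (result.resid - result.resid.mean()) / result.resid.std()
plt.hist(std_resid, bins=20)
plt.title("Standardized residuals")
ProbPlot(std_resid).ppplot(line="45")
plt.show()
```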
Table 3. Dependent and independent variables.
Dependent variable (Y): Screen image brightness. Because the luminance variable is not normally distributed, a Box-Cox transform (λ = 0.3) is used to convert it, so that ε (Note 1) has a normal distribution.
Independent variables (X):
(1) Screen size: 32, 47, and 52 inch.
(2) Field brightness: 3200 and 7500 lx.
(3) 2D or 3D mode: 2D and 3D modes.
(4) Photographic equipment EV: converted from the camera's shutter/aperture combination using the formula EV = log2(N²/t), where N is the aperture (F value) and t is the shutter speed (s).
(5) Interactions between variables: the interactions between each pair of variables.
Note 1: ε denotes the error term, a random sample drawn from a normal distribution with a mean of 0 and a standard deviation of 1.
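To make the λ = 0.3 Box-Cox step concrete, here is a minimal sketch (my own, not the authors' code) applying the transform to a few positive luminance readings; the example values reuse numbers that appear in Tables 7 and 8, and scipy's boxcox with a fixed lmbda implements y = (x^λ − 1)/λ.

```python
import numpy as np
from scipy import stats

# Example luminance readings in lx (strictly positive, as Box-Cox requires);
# values borrowed from Tables 7 and 8 for illustration.
brightness_lx = np.array([0.7, 1.1, 2.2, 3.2, 6.6, 9.4, 19.8])

# Fixed-lambda Box-Cox transform, lambda = 0.3 as in Table 3:
# y = (x**lam - 1) / lam  for lam != 0
transformed = stats.boxcox(brightness_lx, lmbda=0.3)

print(np.round(transformed, 3))
```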
Figure 9. Standardized residuals histogram.
Figure 10. Standardized regression residuals of normal P-P diagram.
Table 4 shows the measures used in the brightness regression model. In this model, R² indicates the explanatory power of the entire model. However, this measure tends to overestimate the fit, and the smaller the sample, the more prone the model is to overestimation. Therefore, the majority of researchers use the adjusted coefficient of determination (adjusted R²), in which the error variance and the variance of the dependent variable (Y) are each divided by their respective degrees of freedom.
Table 4. Brightness regression model summary.
Brightness regression model: correlation coefficient R = 0.995; coefficient of determination R² = 0.990; adjusted coefficient of determination (adjusted R²) = 0.990.
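As a quick sanity check (not from the paper), the adjusted R² reported in Table 4 can be reproduced from R² and the degrees of freedom in Table 5 using the usual formula, adjusted R² = 1 − (1 − R²)(n − 1)/(n − p − 1).

```python
# Degrees of freedom from Table 5: regression df = 10, residual df = 1054, total df = 1064.
r_squared = 0.990
n = 1064 + 1          # total df + 1 observations
p = 10                # number of regression terms (regression df)

adj_r_squared = 1 - (1 - r_squared) * (n - 1) / (n - p - 1)
print(round(adj_r_squared, 3))   # ~0.990, matching Table 4
```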
Analysis of variance (ANOVA) is a particular form of statistical hypothesis testing that is widely applied in the analysis of experimental data (Table 5). Statistical hypothesis testing is a method of decision-making based on data. If the test results (calculated under the null hypothesis) are sufficiently unlikely to have occurred by chance, they are deemed statistically significant. For example, when the p-value calculated from the data is less than the defined critical significance level, the null hypothesis can be rejected. A regression coefficient of 0 would indicate that the variable has no effect on the model.
Table 5. Analysis of variance (ANOVA).
Source        Sum of Squares   Degrees of Freedom   Mean Square   F Value
Regression    5382.379         10                   538.238       10200.217
Residual      55.617           1054                 0.053
Total         5437.996         1064
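For completeness, the F value in Table 5 follows directly from the sums of squares and degrees of freedom; a minimal sketch (not from the paper):

```python
# Values from Table 5.
ss_regression, df_regression = 5382.379, 10
ss_residual, df_residual = 55.617, 1054

ms_regression = ss_regression / df_regression   # 538.238
ms_residual = ss_residual / df_residual         # ~0.0528 (printed as 0.053 in the table)
f_value = ms_regression / ms_residual           # ratio of mean squares

print(round(ms_regression, 3), round(ms_residual, 3), round(f_value, 1))   # ~10200
```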
The statistical coefficients of the linear regression model are presented in Table 6. Note that mode switching between 2D and 3D images has the most significant impact on the screen image brightness. That is, once the screen is switched from the 2D to 3D mode, the image brightness on the screen exhibits a very significant decline.
Table 6. Statistical coefficients of linear regression model.
Brightness Regression Variables      B (Non-Standardized)   Standard Error   β (Standardized)   T
(A) Constant                         21.259                 0.115                               184.892
(1) Screen Size 1                    1.905                  0.145            0.399              13.180
(2) Screen Size 2                    0.707                  0.145            0.147              4.890
(B) Shooting Scene Brightness        0.690                  0.020            0.153              35.235
(C) 2D or 3D Mode                    −5.749                 0.119            −1.270             −48.340
(D) Photographic Equipment EV        −1.342                 0.008            −1.037             −169.750
(E) Interaction Value of Variables
(1) and (C)                          −0.314                 0.030            −0.051             −10.499
(1) and (D)                          −0.096                 0.010            −0.293             −9.753
(2) and (D)                          −0.045                 0.010            −0.137             −4.549
(B) and (C)                          −0.137                 0.029            −0.026             −4.809
(C) and (D)                          0.311                  0.008            0.991              37.541
Each of the independent variables is explained and analyzed below.
(1) Screen size
After the 2D/3D mode variable, the screen size is the most important environmental variable when measuring brightness in the darkroom. The linear regression model assumes three screen sizes as separate dummy variables. The statistical results show a significant impact on the brightness, and larger screens have a positive effect on the brightness value, falling within the range of reasonable consideration. Therefore, the screen size and interactions between other factors significantly affect the brightness.
(2) Scene recording brightness
Only two brightness values were used in this study: 3200 and 7500 lx. The statistical results show that these two variables are within the range of reasonable consideration.
(3) Mode (2D or 3D)
This variable is the core consideration of the experiment, and its significance is apparent in the results of the statistical analyses. Thus, this statistical coefficient affects the screen image brightness very significantly. The experimental measurements and the linear regression model prove that the 3D image brightness is lower than that of the 2D images. To estimate the screen image brightness value, the linear regression model takes the 2D and 3D modes as dummy variables. The expected screen display mode is set in the linear regression model so that the estimated image brightness value can be obtained.
(4) Photographic equipment EV
In the experiment with different aperture and shutter conditions, this setting has a direct impact upon the image brightness. A larger EV indicates less exposure, and the statistical results are also consistent with this finding.
(5) Interaction values of variables
Each of the variables interacts with the others to a greater or lesser degree; however, the interaction coefficients have a smaller effect on the screen brightness than the main independent variables do.
Figure 9 is a standardized residuals histogram that shows the distribution of the standardized regression residuals. Figure 10 is a normal probability plot (P-P diagram). In statistical analysis, it is often necessary to determine whether a dataset comes from a normal population before applying regression analysis or multivariate analysis. Of all the available methods, the use of statistical graphics for this judgment is relatively easy and convenient. With a P-P diagram and a least-squares line, the user can easily determine whether or not the data come from a normal population, and the plot also aids researchers in interpreting the results. The least-squares line is the linear equation derived from the method of least squares, that is, the line that minimizes the sum of the squared residuals between the line and the data.
The regression equations adopted in this study are primarily based on concepts from reference [15], which is used as the mathematical model for deriving the basic theory. After the variables of the regression model are transformed via Box-Cox with λ = 0.3, ε matches the normal distribution, as does the regression model. The linear regression model is expressed as
Yi = β0 + β1 Xi + εi
where Yi is a random variable, Xi is a known fixed constant, εi is an unobservable error term, and i = 1, …, n indexes the i-th test (Yi is the response value corresponding to Xi). Expressing the main variables in the experimental linear model formula yields
Y = 21.259 + 1.905 X1 + 0.707 X2 + 0.69 XB − 5.749 XC − 1.342 XD + μ
where X1 is the size of screen 1, X2 is the size of screen 2, XB is the scene recording brightness, XC indicates the mode (2D or 3D), XD is the photographic equipment EV, and μ is the interaction value of the variables.
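The fitted equation can be evaluated directly. The sketch below encodes the reported coefficients; it treats the two screen-size dummies and the 2D/3D mode as 0/1 indicators and, for simplicity, omits the interaction term μ. These coding choices are my assumptions, and note that, per Table 3, the predicted Y is the Box-Cox-transformed brightness rather than raw lx.

```python
def predicted_brightness(screen1: int, screen2: int, scene_brightness: float,
                         mode_3d: int, ev: float) -> float:
    """Evaluate the paper's fitted linear model (interaction term omitted).

    screen1, screen2 : dummy variables for screen sizes 1 and 2 (0 or 1)
    scene_brightness : shooting scene brightness variable (XB)
    mode_3d          : 0 for 2D mode, 1 for 3D mode (XC)
    ev               : photographic equipment exposure value (XD)
    """
    return (21.259 + 1.905 * screen1 + 0.707 * screen2
            + 0.69 * scene_brightness - 5.749 * mode_3d - 1.342 * ev)

# Example: identical conditions in 2D and 3D mode differ by the -5.749 mode coefficient.
y_2d = predicted_brightness(1, 0, 1.0, 0, 8.0)
y_3d = predicted_brightness(1, 0, 1.0, 1, 8.0)
print(y_2d, y_3d, y_2d - y_3d)   # difference = 5.749
```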
From analysis of the image brightness using this linear regression model, different variables affect the screen brightness by different degrees, although the interactions between variables affect the screen brightness only minimally. However, changing from the 2D to 3D mode has the most significant effect on the brightness. Once the screen changes from the 2D to 3D mode, the screen brightness declines noticeably.
As detailed above, the linear regression model attempts to consider the effects of the main environmental and hardware factors when estimating the screen brightness value. The brightness values are affected by numerous external factors; however, conversion from the 2D to the 3D mode has the largest impact. In practice, 3D image professionals require a certain amount of time to adjust the 3D image brightness in such scenarios. Therefore, the linear regression model can help to estimate the image brightness if the other environmental factors and hardware conditions are controlled. The faster the shutter speed, the lower the screen image brightness; at very fast shutter speeds, images with brightness values too low to observe can even be obtained. The 3D image brightness degradation in response to increased shutter speed is clearly shown in Table 7; further, the 3D image brightness values are significantly lower than the 2D image brightness values for specific shutter conditions.
Table 7. F5.6 aperture experimental results.
Shutter Speed (s)   2D Mode Brightness Value (lx)   RGB Values               3D Mode Brightness Value (lx)
1/125               6.6                             R:125, G:131, B:126      2.2
1/250               2.2                             R:74, G:79, B:74         0.7
The experimental results show that, when the aperture is F5.6 (i.e., Table 7), the image brightness value is the same in 2D mode with a shutter speed of 1/250 s as it is in 3D mode with 1/125-s shutter speed. Therefore, the 3D display exhibits approximately 50% image brightness degradation. The experimental results for the F10 aperture are listed in Table 8. For this aperture setting, the image brightness value is the same in 2D mode with a shutter speed of 1/40 s as it is in 3D mode with 1/10-s shutter speed. Thus, the 3D display has only 50% of the 2D image brightness.
Table 8. F10 aperture experimental results.
Shutter Speed (s)   2D Mode Brightness Value (lx)   RGB Values               3D Mode Brightness Value (lx)
1/20                19.8                            R:193, G:198, B:194      6.6
1/40                9.4                             R:138, G:144, B:139      3.2
1/50                6.6                             R:117, G:123, B:117      2.2
1/80                3.2                             R:81, G:87, B:82         1.1
If the low-brightness (value 0) data are removed from the experimental dataset, the 3D image brightness averages approximately 39.2% of the 2D image brightness value. This means that, in comparing the polarizing 3D and 2D image brightness values, the 3D image brightness decreases by approximately 60.8%. For a 95% confidence interval, degradation values of 52.4%–69.2% are within the reasonable range of consideration, as shown in Table 9. Because the polarized 3D image brightness is approximately 60% less than that of the corresponding 2D image, the 3D image brightness must be increased in order to achieve the same image brightness as the 2D image.
Table 9. Comparison of 2D and 3D image experimental results.
Experimental Item                                                Values
3D Image Brightness Degradation                                  60.8%
3D Image Brightness Degradation within 95% Confidence Level      52.4%–69.2%
The above reference data are applicable to the production of 3D stereoscopic displays, and show that the 3D image brightness must be increased in order to achieve the same image brightness as a 2D image on a screen. The use of these data can reduce the time required for setting adjustments. Complementary adjustments taking the various environmental factors into consideration are recommended for practical imaging designs, so that the gap in display brightness between 2D and 3D images can be reduced. In summary, the main idea of this study is to measure the brightness loss (or difference) between filming a 3D movie and watching it. The results show a 60.8% brightness loss, meaning that only about 39.2% of the brightness remains; the lighting intensity during filming therefore has to be increased by roughly 2.5 times (1/0.392 ≈ 2.55) to reach the standard brightness, so that the 3D movie is viewed at the proper brightness.
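As a final illustration (my own sketch, using a handful of paired readings reused from Tables 7 and 8 rather than the full dataset, so it will not reproduce the 60.8% figure exactly), the degradation percentage and its 95% confidence interval can be computed from paired 2D/3D brightness measurements as follows.

```python
import numpy as np
from scipy import stats

# Paired (2D, 3D) brightness readings in lx; values reused from Tables 7 and 8.
brightness_2d = np.array([6.6, 2.2, 19.8, 9.4, 6.6, 3.2])
brightness_3d = np.array([2.2, 0.7, 6.6, 3.2, 2.2, 1.1])

degradation = 1 - brightness_3d / brightness_2d          # per-pair brightness loss
mean_loss = degradation.mean()

# 95% confidence interval for the mean loss (t-distribution, n - 1 degrees of freedom).
n = len(degradation)
sem = degradation.std(ddof=1) / np.sqrt(n)
t_crit = stats.t.ppf(0.975, df=n - 1)
ci_low, ci_high = mean_loss - t_crit * sem, mean_loss + t_crit * sem

print(f"mean loss = {mean_loss:.1%}, 95% CI = [{ci_low:.1%}, {ci_high:.1%}]")
```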

5. Conclusions

In order to address image brightness degradation in three-dimensional (3D) displays showing stereoscopic images in cinemas or on television, this study aimed to quantify the 3D image brightness degradation in such cases. It also aimed to estimate the image brightness relationship between 3D and two-dimensional (2D) images and, hence, to modify the brightness values of the former. Metering values were measured by capturing 2D and 3D images on television screens according to the photographic principle. Moreover, image brightness data were collected in the 2D and 3D modes and analyzed using statistical product and service solutions (SPSS) software, so that the image brightness values could be estimated by the statistical regression model for different environmental factors and hardware devices. Finally, a comparison of the polarizing 3D image brightness value with that of a 2D image based on the experimental results indicated that the 3D image brightness can decrease by 60.8%, and degradation values of 52.4%–69.2% lie within the 95% confidence interval.

Acknowledgments

This work was supported by the Ministry of Science and Technology of Taiwan (Grant No. MOST 103-2221-E-019-045-MY2).

Author Contributions

Hsing-Cheng Yu conceived and designed the experiment and wrote the main part of the manuscript. Xie-Hong Tsai and Ming Wu implemented and set up the experiment. An-Chun Luo and Sei-Wang Chen contributed to the corresponding data analysis. All authors contributed to polishing the paper to improve its fluency.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Banno, A.; Ikeuchi, K. Omnidirectional texturing based on robust 3D registration through Euclidean reconstruction from two spherical images. Comput. Vis. Image Underst. 2010, 114, 491–499.
2. Gao, Z.; Zhang, Y.N.; Xia, Y.; Lin, Z.G.; Fan, Y.Y.; Feng, D.D. Multi-pose 3D face recognition based on 2D sparse representation. J. Vis. Commun. Image Represent. 2013, 24, 117–126.
3. Zhang, Y.N.; Guo, Z.; Xia, Y.; Lin, Z.G.; Feng, D.D. 2D representation of facial surfaces for multi-pose 3D face recognition. Pattern Recognit. Lett. 2012, 33, 530–536.
4. Berretti, S.; Bimbo, A.D.; Pala, P. 3D face recognition using iso-geodesic stripes. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 32, 2162–2177.
5. Queirolo, C.C.; Silva, L.; Segundo, O.R.B.; Segundo, M.P. 3D face recognition using simulated annealing and the surface interpenetration measure. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 32, 206–219.
6. Teng, C.H.; Chen, Y.S.; Hsu, W.H. Constructing a 3D trunk model from two images. Graph. Models 2007, 69, 33–56.
7. Lenz, R.K.; Tsai, R.Y. Techniques for calibration of the scale factor and image center for high accuracy 3-D machine vision. IEEE Trans. Pattern Anal. Mach. Intell. 1988, 10, 713–720.
8. Michael, H.L.; Armstrong, T.J. The effect of viewing angle on wrist posture estimation from photographic images using novice raters. Appl. Ergon. 2011, 42, 634–643.
9. Lowe, B.D. Accuracy and validity of observational estimates of wrist and forearm posture. Ergonomics 2004, 47, 527–554.
10. Nawrot, M.; Joyce, L. The pursuit theory of motion parallax. Vision Res. 2006, 46, 4709–4725.
11. David, G.; Woods, V.; Li, G.Y.; Buckle, P. The development of the quick exposure check (QEC) for assessing exposure to risk factors for work-related musculoskeletal disorders. Appl. Ergon. 2008, 39, 57–69.
12. Zhang, J.; Li, S.; Shen, L.; Hou, C. A comparison of testing metrics between 3D LCD TV and 3D PDP TV. Commun. Comput. Inf. Sci. 2012, 331, 125–132.
13. Zhao, X.; Song, H.; Zhang, S.; Huang, Y.; Sun, Q.; Fan, K.; Hu, J.; Fan, G. 3D definition certification technical specifications for digital TV displays. CESI001-2011 2011, 5–8.
14. CL-200A Chroma Meter. Available online: http://sensing.konicaminolta.asia/products/cl-200a-chroma-meter/ (accessed on 20 February 2014).
15. Li, P. Box-Cox transformations: An overview. 2005. Available online: http://www.ime.usp.br/~abe/lista/pdfm9cJKUmFZp.pdf (accessed on 11 April 2005).

Share and Cite

MDPI and ACS Style

Yu, H.-C.; Tsai, X.-H.; Luo, A.-C.; Wu, M.; Chen, S.-W. Study of Three-Dimensional Image Brightness Loss in Stereoscopy. Appl. Sci. 2015, 5, 926-941. https://doi.org/10.3390/app5040926

