A Total Bounded Variation Approach to Low Visibility Estimation on Expressways
Abstract
1. Introduction
- (1) It is the first time that the image spectrum and total bounded variation (TBV) have been applied to characterize the features of fog and haze and to estimate visibility. From a practical standpoint, the study focuses chiefly on expressway visibility of less than 300 m caused by heavy fog and haze, the more dangerous case. As the visibility increases, the high-frequency (HF) coefficient ratios of the image discrete cosine transform (DCT) increase, the low-frequency (LF) coefficient ratios decrease correspondingly, and the TBV of foggy and hazy images climbs. For foggy and hazy images with visibility of less than 300 m, HF coefficient ratios are under 20% and LF coefficient ratios range from 100% to 120%. Based on this spectral feature, low-visibility foggy and hazy images can be sorted out, and the TBV trend is consistent with the trend of the foggy and hazy visibility.
- (2) By combining polynomial regression with piecewise stationary time series analysis, a nonlinear relationship between the TBV and the real visibility was established.
- (3) To overcome the effects of different road landscapes and sunshine luminance, relative ratios of the image spectrum and the total bounded variation were adopted.
- (4) Unlike current model-driven visibility estimation methods, the proposed method is a semi-data-driven approach. For the first time, a big dataset (1,782,000 frames) collected from the real world, the Tongqi Expressway in China, was used to train the semi-data-driven model, and the approach was validated on this big video dataset.
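Computing the HF/LF coefficient ratios described in contribution (1) can be sketched as follows. The corner-block band definitions, the 25% corner fraction, and all function names are illustrative assumptions; the paper's exact coefficient regions (its Formula (6)) are not recoverable from this excerpt.

```python
import numpy as np

def dct2(img: np.ndarray) -> np.ndarray:
    """Orthonormal 2-D DCT-II of a square image (no SciPy dependency)."""
    n = img.shape[0]
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    c[0, :] /= np.sqrt(2.0)          # scale the DC row for orthonormality
    return c @ img @ c.T

def band_energy(coeffs: np.ndarray, band: str, frac: float = 0.25) -> float:
    """Sum of |coefficients| in a corner block of the DCT plane (assumed bands)."""
    n = coeffs.shape[0]
    b = max(1, int(n * frac))
    if band == "LF":                 # block near the DC component F(0, 0)
        return float(np.abs(coeffs[:b, :b]).sum())
    return float(np.abs(coeffs[n - b:, n - b:]).sum())   # block near F(n-1, n-1)

def coefficient_ratio(fog_roi: np.ndarray, sunny_roi: np.ndarray, band: str) -> float:
    """Band-energy ratio (%) of a foggy ROI relative to a sunny-day reference ROI."""
    return 100.0 * band_energy(dct2(fog_roi), band) / band_energy(dct2(sunny_roi), band)
```

With this convention, a heavily blurred (fog-like) ROI yields an HF ratio far below 100%, consistent with the under-20% signature reported above.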
2. Related Works
3. Research Methods
3.1. Visibility Definition and Application
3.2. Pseudo-Blurred Image
3.3. Foggy and Hazy Image Spectrum
3.4. Total Bounded Variation
Algorithm 1: Total bounded variation approach to low visibility estimation

Input: Surveillance video, 990 min × 60 s/min × 30 frames/s = 1,782,000 frames
Output: S-TBV model, visibility Vis
Initialization: Sampling interval time
Steps:
1. Preprocess and sample the surveillance video;
2. Extract the ROI at each road point;
3. Search for low-visibility frames (less than 300 m):
   (1) Apply the DCT to sunny-day images captured at the same road point (50 min);
   (2) Analyze the HF and LF coefficients of the images processed in Step 3.1; their median values are used;
   (3) Apply the DCT to fog and haze surveillance images;
   (4) Analyze the image spectrum, e.g., the DC component F(0, 0) and F(n − 1, n − 1), and calculate the relevant values based on Formula (6) and Steps 3.1–3.4;
   (5) Search for the low-visibility frames on the basis of Fr(0, 0) and Fr(n − 1, n − 1);
   (6) Note: if the fog and haze visibility is less than 300 m, go to Step 4; otherwise, stop and output the message "more than 300 m".
4. Compute the foggy and hazy visibility:
   (1) Calculate the TBV values of sunny-day images at the same road point (50 min);
   (2) Use the median of the TBV values above;
   (3) Calculate the TBV values of foggy and hazy images as relative TBV;
   (4) Construct the piecewise stationary function with machine learning (polynomial regression):
       (a) Training set: the coefficients aₙ, bₙ, and cₙ in Formula (10) are obtained by training;
       (b) Testing set: video data of road points 2 and 4–6 are used as the testing sets, respectively;
       (c) Note: the training and testing sets are independent. For example, when the data of road point 2 are used as the testing set, the data of the other road points (1, 3–6) are used as the training set.
5. Optimize the algorithm parameters.
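Steps 4(1)–4(3) of Algorithm 1 can be sketched as below. The paper's TBV feature builds on the total-bounded-variation restoration literature it cites; this sketch substitutes the plain discrete isotropic TV seminorm as a stand-in, and the helper names are hypothetical.

```python
import numpy as np

def total_variation(img: np.ndarray) -> float:
    """Discrete isotropic total variation: sum of gradient magnitudes.
    Used here as a stand-in for the paper's TBV functional."""
    dx = np.diff(img, axis=1)[:-1, :]   # horizontal differences
    dy = np.diff(img, axis=0)[:, :-1]   # vertical differences
    return float(np.sqrt(dx ** 2 + dy ** 2).sum())

def relative_tbv(fog_roi: np.ndarray, sunny_rois: list) -> float:
    """TBV ratio (%) of a foggy ROI against the median sunny-day TBV
    at the same road point (Steps 4(1)-4(3))."""
    reference = float(np.median([total_variation(r) for r in sunny_rois]))
    return 100.0 * total_variation(fog_roi) / reference
```

Taking the ratio against the sunny-day median at the same road point is what lets the feature cancel out background and lighting differences, as contribution (3) describes.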
4. Results
5. Discussion
6. Conclusions
- (1) The situation of expressway fog and haze with visibility of less than 300 m was focused on first. This strategy for meeting the challenge of estimation accuracy is reasonable: the total bounded variation approach can handle this situation, and the verified results are encouraging. The relative errors of estimation are less than 10%, and 68.54% of the errors are less than 5%.
- (2) The total bounded variation approach provides an effective framework for low visibility estimation on expressways. Foggy and hazy images can be processed as pseudo-blurred images, whose texture features can be characterized by the TBV, which is correlated with the trend of the foggy and hazy visibility.
- (3) There are wide differences between the spectral features of sunny-day images and those of foggy and hazy images with visibility of less than 300 m. HF coefficient ratios of sunny-day images fluctuate around 100%, whereas those of foggy and hazy images fluctuate between 0% and 20%. LF coefficient ratios of foggy and hazy images fluctuate between 100% and 120%, and they decrease gradually as the visibility increases steadily.
- (4) A big dataset helps generate a valid S-TBV model. The dataset contains 1.78 million frames collected from an expressway.
- (5) Relative ratios, namely the spectral coefficient ratios (HF, LF) and the TBV ratio, are used in this paper. They eliminate some influencing factors, such as the road-point background and lighting, which improves the feasibility of the S-TBV model.
Acknowledgments
Author Contributions
Conflicts of Interest
References
Road Points No. | Chainage | District | Start and End Time | Duration | Maximum Visibility | Date |
---|---|---|---|---|---|---|
1 | K113 + 000 | Dasheng | 06:30–09:22 | 172 min | 306 m | 14 April 2016 |
2 | K148 + 150 | Haimen | 06:00–09:20 | 200 min | 306 m | 14 April 2016 |
3 | K159 + 950 | Haimen | 06:00–09:34 | 214 min | 315 m | 14 April 2016 |
4 | K106 + 980 | Dasheng | 06:00–08:57 | 177 min | 262 m | 14 April 2016 |
5 | K159 + 950 | Haimen | 06:00–08:06 | 126 min | 200 m | 13 April 2016 |
6 | K208 + 027 | Chenqiao | 06:00–07:41 | 101 min | 303 m | 15 March 2016 |
Road Points No. (Testing Set) | Duration (Testing Set) | Road Points No. (Training Set) | Duration (Training Set) |
---|---|---|---|
2 | 200 min | 1, 3, 4, 5, 6 | 790 min |
4 | 177 min | 1, 2, 3, 5, 6 | 813 min |
5 | 126 min | 1, 2, 3, 4, 6 | 864 min |
6 | 101 min | 1, 2, 3, 4, 5 | 889 min |
n | aₙ (Power = 2) | bₙ (Power = 1) | cₙ (Power = 0) | Intervals |
---|---|---|---|---|
1 | 14.06 | −169.68 | 548.13 | [0, 50) |
2 | 2.54 | −37.20 | 187.41 | [50, 60) |
3 | −6.52 × 10⁻⁴ | −5.31 × 10⁻² | 68.62 | [60, 70) |
4 | −6.71 × 10⁻⁴ | 1.06 × 10⁻¹ | 73.01 | [70, 80) |
5 | −1.17 × 10⁻² | 5.93 × 10⁻¹ | 79.23 | [80, 90) |
6 | −9.11 × 10⁻³ | 4.29 × 10⁻¹ | 90.52 | [90, 100) |
7 | 7.62 × 10⁻³ | −4.28 × 10⁻¹ | 112.77 | [100, 120) |
8 | −4.53 × 10⁻⁴ | 5.31 × 10⁻³ | 128.93 | [120, 140) |
9 | −3.27 × 10⁻³ | 4.52 × 10⁻¹ | 140.98 | [140, 160) |
10 | 3.22 × 10⁻³ | −2.16 × 10⁻¹ | 170.96 | [160, 180) |
11 | −2.45 × 10⁻³ | 1.34 × 10⁻¹ | 189.99 | [180, 200) |
12 | 5.07 × 10⁻³ | −3.27 × 10⁻¹ | 220.76 | [200, 250) |
13 | −1.11 × 10⁻² | 1.33 | 240.49 | [250, 300] |
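A sketch of how tabulated coefficients like these could be evaluated: each row defines a quadratic Vis = aₙx² + bₙx + cₙ on its interval (Formula (10)). Which quantity the intervals index — the relative TBV ratio or another variable — is not recoverable from this excerpt, so treating x as the interval variable is an assumption, and the table rows below are only the first three from the table for illustration.

```python
import bisect

# (a_n, b_n, c_n, interval start) -- first three rows of the coefficient table;
# interval ends are implied by the next row's start.
COEFFS = [
    (14.06, -169.68, 548.13, 0),      # n = 1, [0, 50)
    (2.54, -37.20, 187.41, 50),       # n = 2, [50, 60)
    (-6.52e-4, -5.31e-2, 68.62, 60),  # n = 3, [60, 70)
]
STARTS = [row[3] for row in COEFFS]
UPPER = 70  # end of the last interval included in this sketch

def estimate_visibility(x: float) -> float:
    """Evaluate the piecewise quadratic Vis = a*x**2 + b*x + c on x's interval."""
    if not 0 <= x < UPPER:
        raise ValueError("x outside the fitted range of this sketch")
    a, b, c, _ = COEFFS[bisect.bisect_right(STARTS, x) - 1]
    return a * x ** 2 + b * x + c
```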
DCT Coefficient Ratios | Foggy and Hazy Image Spectrum | Overall Trend |
---|---|---|
High frequency (HF) | Overall between 0% and 20%; fluctuating around a small value when the visibility is less than 200 m; increasing gradually when the visibility is between 200 m and 300 m. | As the foggy and hazy visibility improves, the HF coefficient ratios increase. |
Low frequency (LF) | Between 100% and 120%. | The higher the foggy and hazy visibility, the smaller the LF coefficient ratios. |
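The signatures in this table suggest a simple screening rule for Step 3 of Algorithm 1. Combining the two bands with a conjunction, and the exact thresholds, are assumptions read off the table rather than the paper's stated decision rule.

```python
def is_low_visibility(hf_ratio: float, lf_ratio: float) -> bool:
    """Screen a frame for visibility below 300 m using the spectral
    signatures tabulated above (assumed thresholds)."""
    return hf_ratio < 20.0 and 100.0 <= lf_ratio <= 120.0
```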
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Cheng, X.; Yang, B.; Liu, G.; Olofsson, T.; Li, H. A Total Bounded Variation Approach to Low Visibility Estimation on Expressways. Sensors 2018, 18, 392. https://doi.org/10.3390/s18020392