Robust Detection of Abandoned Object for Smart Video Surveillance in Illumination Changes
Abstract
1. Introduction
2. Related Works
3. Robust Detection of Abandoned Objects in Illumination Changes
3.1. General Detection Process
- Temporal rule: The luggage is declared an unattended object when the owner leaves it and disappears from the scene, and it is not re-attended within time T = n seconds.
- Spatial rule: When the distance between the owner and the luggage exceeds a pre-defined distance, an alarm event is triggered (a minimal sketch of both rules follows this list).
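The two rules can be combined into a simple per-frame decision routine. The following is a minimal sketch, not the authors' implementation; the time threshold, the distance threshold, and the helper names (`owner_position`, `luggage_position`, `left_since`) are illustrative assumptions.

```python
import math
import time

# Illustrative thresholds (assumptions, not values from the paper)
T_SECONDS = 30.0        # temporal rule: n seconds without the owner re-attending
DIST_THRESHOLD = 3.0    # spatial rule: pre-defined owner-luggage distance (e.g., meters)

def euclidean(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def check_abandonment(owner_position, luggage_position, owner_visible, left_since):
    """Return (alarm, reason) for one evaluation step.

    owner_position / luggage_position: (x, y) scene coordinates, or None if unknown.
    owner_visible: whether the owner is still tracked in the scene.
    left_since: timestamp when the owner last attended the luggage, or None.
    """
    now = time.time()

    # Temporal rule: owner left the scene and did not re-attend within T seconds.
    if not owner_visible and left_since is not None and now - left_since >= T_SECONDS:
        return True, "temporal rule"

    # Spatial rule: owner-luggage distance exceeds the pre-defined threshold.
    if owner_visible and owner_position is not None and luggage_position is not None:
        if euclidean(owner_position, luggage_position) > DIST_THRESHOLD:
            return True, "spatial rule"

    return False, None
```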
3.2. Illumination Change Handling
- Rapid detection and adaptation of illumination changes
- Template registration and presence authentication for candidate stationary objects
- Object comparison based on the largest contour
- Illumination Change Handling (an OpenCV-based sketch follows this pseudocode):
- if (the size of blob[i] in LF ≥ Th)
- then /* an illumination change is detected */
- Short_term_model→learning_rate(maximum);
- Long_term_model→learning_rate(maximum);
- Illumination_change_flag = true;
- if (the number of blobs in DF == 0 && Illumination_change_flag == true)
- then /* the adaptation is terminated */
- Short_term_model→learning_rate(original_short_term_learning_rate);
- Long_term_model→learning_rate(original_long_term_learning_rate);
- Illumination_change_flag = false;
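A minimal OpenCV-based sketch of this adaptation step is given below. It is an illustration under assumptions rather than the authors' code: it assumes LF denotes the foreground mask of the long-term model and DF the foreground obtained by differencing the short- and long-term results, and it stands in MOG2 subtractors for the paper's dual background models. The blob-area threshold and the learning-rate values are placeholders.

```python
import cv2

# Two background models with different adaptation speeds (parameters are illustrative).
short_term = cv2.createBackgroundSubtractorMOG2(history=100, detectShadows=False)
long_term = cv2.createBackgroundSubtractorMOG2(history=2000, detectShadows=False)

SHORT_LR, LONG_LR = 0.01, 0.001   # original learning rates (assumed values)
MAX_LR = 0.9                      # "maximum" learning rate used during adaptation
BLOB_AREA_TH = 0.5                # blob covering >= 50% of the frame (assumed Th)

illumination_change = False

def largest_blob_ratio(mask):
    """Area of the largest foreground blob as a fraction of the frame area."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0.0
    return max(cv2.contourArea(c) for c in contours) / float(mask.shape[0] * mask.shape[1])

def process_frame(frame):
    global illumination_change
    # Boosted learning rates take effect on the frame after a change is detected.
    short_lr = MAX_LR if illumination_change else SHORT_LR
    long_lr = MAX_LR if illumination_change else LONG_LR

    sf = short_term.apply(frame, None, short_lr)    # short-term foreground
    lf = long_term.apply(frame, None, long_lr)      # long-term foreground (LF)
    df = cv2.bitwise_and(lf, cv2.bitwise_not(sf))   # difference foreground (DF, assumed definition)

    # A very large blob in LF is taken as a global illumination change.
    if largest_blob_ratio(lf) >= BLOB_AREA_TH:
        illumination_change = True
    # Once DF contains no blobs, the adaptation is terminated and rates are restored.
    elif illumination_change and largest_blob_ratio(df) == 0.0:
        illumination_change = False

    return sf, lf, df
```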
- Presence Authentication (a sketch of the matching decision follows this pseudocode):
- if (matching_score(CSO.theLargestObjectContour, theCurrentLargestContour of CSO.area) ≥ Th)
- then /* abandoned object */ alarm AbandonedObjectDetection;
- else
- if (matching_score(CSO.theLargestBackgroundContour, theCurrentLargestContour of CSO.area) ≥ Th)
- then /* moved object */ discard CSO;
- else /* occluded */ repeat Presence Authentication after a pre-determined time passes;
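The sketch below illustrates this three-way decision with OpenCV's cv2.matchShapes applied to the largest contours. It is a hedged illustration, not the paper's implementation: matchShapes returns a dissimilarity (lower is better), so the score is compared against a dissimilarity threshold rather than the "≥ Th" similarity form of the pseudocode, and the CandidateStationaryObject container, the Canny thresholds, and the threshold value are assumptions.

```python
import cv2
from dataclasses import dataclass

SHAPE_DISSIM_TH = 0.1   # assumed threshold; cv2.matchShapes is lower-is-better

@dataclass
class CandidateStationaryObject:
    largest_object_contour: object      # contour registered when the object became stationary
    largest_background_contour: object  # contour of the background at the same area

def largest_contour(gray_roi):
    """Largest external contour extracted from the region's edge map."""
    edges = cv2.Canny(gray_roi, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None

def authenticate_presence(cso, current_gray_roi):
    """Return 'abandoned', 'moved', or 'occluded' for a candidate stationary object (CSO)."""
    current = largest_contour(current_gray_roi)
    if current is None:
        return "occluded"   # nothing comparable; retry after a pre-determined time

    # Object still present: the current contour matches the registered object contour.
    if cv2.matchShapes(cso.largest_object_contour, current,
                       cv2.CONTOURS_MATCH_I1, 0.0) <= SHAPE_DISSIM_TH:
        return "abandoned"
    # Object gone: the current contour matches the registered background contour.
    if cv2.matchShapes(cso.largest_background_contour, current,
                       cv2.CONTOURS_MATCH_I1, 0.0) <= SHAPE_DISSIM_TH:
        return "moved"
    # Neither matches: the area is presumably occluded; re-check later.
    return "occluded"
```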
4. Experiments
4.1. PETS2006
4.2. ABODA
4.3. Our Dataset
4.4. Challenging Issues
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Luna, E.; San Miguel, J.C.; Ortego, D.; Martínez, J.M. Abandoned Object Detection in Video-Surveillance: Survey and Comparison. Sensors 2018, 18, 4290.
- Wahyono; Jo, K.H. Cumulative Dual Foreground Differences for Illegally Parked Vehicles Detection. IEEE Trans. Ind. Inform. 2017, 99, 1–9.
- Bird, N.; Atev, S.; Caramelli, N.; Martin, R.; Masoud, O.; Papanikolopoulos, N. Real Time, Online Detection of Abandoned Objects in Public Areas. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA 2006), Orlando, FL, USA, 15–19 May 2006; pp. 3775–3780.
- Mahale, M.A.; Kulkarni, H.H. Survey on Abandoned Object Detection in Surveillance Video. Int. J. Eng. Sci. Comput. 2017, 7, 15595–15599.
- PETS2006 Benchmark Data. Available online: http://www.cvg.reading.ac.uk/PETS2006/data.html (accessed on 1 December 2018).
- Lin, K.; Chen, S.C.; Chen, C.S.; Lin, D.T.; Hung, Y.P. Abandoned Object Detection via Temporal Consistency Modeling and Back-Tracing Verification for Visual Surveillance. IEEE Trans. Inf. Forensics Secur. 2015, 10, 1359–1370.
- Fan, Q.; Pankanti, S. Modeling of temporarily static objects for robust abandoned object detection in urban surveillance. In Proceedings of the 8th IEEE International Conference on Advanced Video and Signal-Based Surveillance (AVSS), Klagenfurt, Austria, 30 August–2 September 2011; pp. 36–41.
- Tian, Y.; Feris, R.; Liu, H.; Hampapur, A.; Sun, M.T. Robust detection of abandoned and removed objects in complex surveillance videos. IEEE Trans. Syst. Man Cybern. C 2011, 41, 565–576.
- Tian, Y.; Feris, R.; Hampapur, A. Real-Time Detection of Abandoned and Removed Objects in Complex Environments. In Proceedings of the Eighth International Workshop on Visual Surveillance (VS2008), Marseille, France, 17 October 2008.
- Porikli, F.; Ivanov, Y.; Haga, T. Robust abandoned object detection using dual foregrounds. EURASIP J. Adv. Signal Process. 2008, 2008, 30.
- Evangelio, R.H.; Senst, T.; Sikora, T. Detection of static objects for the task of video surveillance. In Proceedings of the IEEE Workshop on Applications of Computer Vision (WACV), Kona, HI, USA, 5–7 January 2011; pp. 534–540.
- Lin, K.; Chen, S.C.; Chen, C.S.; Lin, D.T.; Hung, Y.P. Left-Luggage Detection from Finite-State-Machine Analysis in Static-Camera Videos. In Proceedings of the 22nd International Conference on Pattern Recognition, Stockholm, Sweden, 24–28 August 2014.
- Wahyono; Filonenko, A.; Jo, K.H. Unattended object identification for intelligent surveillance system using sequence of dual background difference. IEEE Trans. Ind. Inform. 2016, 12, 2247–2255.
- Filonenko, A.; Wahyono; Jo, K.H. Detecting abandoned objects in crowded scenes of surveillance videos using adaptive dual background model. In Proceedings of the International Conference on Human System Interactions, Warsaw, Poland, 25–27 June 2015; pp. 224–227.
- Shyam, D.; Kot, A.; Athalye, C. Abandoned Object Detection Using Pixel-Based Finite State Machine and Single Shot Multibox Detector. In Proceedings of the IEEE International Conference on Multimedia and Expo, San Diego, CA, USA, 23–27 July 2018; pp. 1–6.
- Xu, H.; Yu, F. Improved compressive tracking in surveillance scenes. In Proceedings of the 7th International Conference on Image and Graphics (ICIG 2013), Qingdao, China, 26–28 July 2013; pp. 869–873.
- Barnich, O.; Van Droogenbroeck, M. ViBe: A universal background subtraction algorithm for video sequences. IEEE Trans. Image Process. 2011, 20, 1709–1724.
- Liao, S.; Zhao, G.; Kellokumpu, V.; Pietikäinen, M.; Li, S.Z. Modeling pixel process with scale invariant local patterns for background subtraction in complex scenes. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Francisco, CA, USA, 13–18 June 2010; pp. 1301–1306.
- Cuevas, C.; Martínez, R.; Berjón, D.; García, N. Detection of stationary foreground objects using multiple nonparametric background-foreground models on a finite state machine. IEEE Trans. Image Process. 2016, 26, 1127–1142.
- Smeureanu, S.; Ionescu, R.T. Real-time deep learning method for abandoned luggage detection in video. In Proceedings of the 26th European Signal Processing Conference, Rome, Italy, 3–7 September 2018.
- Sidyakin, S.V.; Vishnyakov, B.V. Real-time detection of abandoned bags using CNN. In Proceedings of the SPIE Optical Metrology, Munich, Germany, 26 June 2017; Volume 10334.
- Horn, B.K.P.; Schunck, B.G. Determining optical flow. Artif. Intell. 1981, 17, 185–203.
- Chan, Y.T.; Hu, A.G.C.; Plant, J.B. A Kalman Filter Based Tracking Scheme with Input Estimation. IEEE Trans. Aerosp. Electron. Syst. 1979, AES-15, 237–244.
- OpenCV. Available online: https://docs.opencv.org (accessed on 29 October 2019).
- Suzuki, S. Topological structural analysis of digitized binary images by border following. Comput. Vis. Graph. Image Process. 1985, 30, 32–46.
- Zivkovic, Z.; Van Der Heijden, F. Efficient adaptive density estimation per image pixel for the task of background subtraction. Pattern Recognit. Lett. 2006, 27, 777–780.
- Dalal, N.; Triggs, B. Histograms of oriented gradients for human detection. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05), San Diego, CA, USA, June 2005; pp. 886–893.
- ABODA Dataset. Available online: https://github.com/kevinlin311tw/ABODA (accessed on 29 October 2019).
- Ilias, D.; El Mezouar, M.C.; Taleb, N.; Elbahri, M. An edge-based method for effective abandoned luggage detection in complex surveillance videos. Comput. Vis. Image Underst. 2017, 158, 141–151.
- Krusch, P.; Bochinski, E.; Eiselein, V.; Sikora, T. A consistent two-level metric for evaluation of automated abandoned object detection methods. In Proceedings of the IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; pp. 4352–4356.
- Liao, W.; Yang, C.; Ying Yang, M.; Rosenhahn, B. Security Event Recognition for Visual Surveillance. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 4, 19–26.
Video | GT | TP | FP | FN | Owner |
---|---|---|---|---|---|
S1 | 1 | 1 | 0 | 0 | 1 |
S2 | 1 | 1 | 0 | 0 | 1 |
S3 | 1 | 1 | 0 | 0 | - |
S4 | 1 | 1 | 0 | 0 | 1 |
S5 | 1 | 1 | 0 | 0 | 1 |
S6 | 1 | 1 | 0 | 0 | 1 |
S7 | 1 | 1 | 0 | 0 | 1 |
Video | Scenario | GT | Proposed TP | Proposed FP | Lin [6] TP | Lin [6] FP | Wahyono [13] TP | Wahyono [13] FP | Ilias [29] TP | Ilias [29] FP | Patrick [30] TP | Patrick [30] FP | Wentong [31] TP | Wentong [31] FP | Shyam [15] TP | Shyam [15] FP
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
V1 | Outdoor | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 |
V2 | Outdoor | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 |
V3 | Outdoor | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 |
V4 | Outdoor | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 |
V5 | Night | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 |
V6 | Illumination Change | 2 | 2 | 0 | 2 | 0 | - | - | 2 | 0 | 1 | 0 | 2 | 1 | 2 | 0 |
V7 | Illumination Change | 1 | 1 | 0 | 1 | 1 | - | - | 1 | 2 | 1 | 1 | 1 | 0 | 1 | 0 |
V8 | Illumination Change | 1 | 1 | 0 | 1 | 1 | - | - | 1 | 2 | 1 | 0 | 1 | 1 | 1 | 0 |
V9 | Indoor | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 |
V10 | Indoor | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 0 |
V11 | Crowded Scene | 1 | - | - | 1 | 3 | - | - | 0 | 1 | 1 | 0 | 1 | 1 | - | - |
Comparison | compareHist( ) on Color Histogram (higher is better) | matchShapes( ) on Largest Contour (lower is better)
---|---|---
Candidate Stationary Object (t) vs. Current Object (t+10 s) | Matching score: 97.8% | Matching score: 0.004
Video | Comparison | compareHist( ) on Color Histogram (higher is better) | matchShapes( ) on Largest Contour (lower is better)
---|---|---|---
KICV Video 1 | Candidate Stationary Object (t) vs. Current Object (t+30 s) | Matching score: 22.4% | Matching score: 0.540
KICV Video 1 | Candidate Stationary Object (t) vs. Current Object (t+40 s) | Matching score: 29.0% | Matching score: 0.059
KICV Video 2 | Candidate Stationary Object (t) vs. Current Object (t+30 s) | Matching score: 9.6% | Matching score: 0.057
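Scores of this kind can be computed with OpenCV's cv2.compareHist and cv2.matchShapes. The snippet below is a minimal sketch rather than the authors' exact pipeline: it assumes HSV color histograms compared with the correlation metric (reported as a percentage, higher is better) and Hu-moment shape matching on the largest contour (lower is better); the bin counts and Canny thresholds are illustrative.

```python
import cv2

def color_histogram(bgr_roi, bins=(32, 32)):
    """Normalized 2D Hue-Saturation histogram of a region of interest."""
    hsv = cv2.cvtColor(bgr_roi, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, list(bins), [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def largest_contour(bgr_roi):
    """Largest external contour extracted from the region's edge map."""
    gray = cv2.cvtColor(bgr_roi, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None

def compare_objects(candidate_roi, current_roi):
    """Return (histogram similarity in %, contour dissimilarity)."""
    hist_score = cv2.compareHist(color_histogram(candidate_roi),
                                 color_histogram(current_roi),
                                 cv2.HISTCMP_CORREL) * 100.0   # higher is better
    c1, c2 = largest_contour(candidate_roi), largest_contour(current_roi)
    shape_score = (cv2.matchShapes(c1, c2, cv2.CONTOURS_MATCH_I1, 0.0)
                   if c1 is not None and c2 is not None else float("inf"))  # lower is better
    return hist_score, shape_score
```

Cropping the candidate's registered template and the corresponding area of the current frame, then passing both crops to compare_objects, would produce score pairs comparable to those reported in the tables above.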
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).