Hyperspectral Image Mixed Noise Removal via Double Factor Total Variation Nonlocal Low-Rank Tensor Regularization
Abstract
1. Introduction
- First, under the LRTF framework, applying weighted TV regularization to the two small-sized factors (the spectral subspace basis and the representation coefficient image) not only reduces the computational cost but also accurately characterizes the spatial–spectral LS of HSIs.
- Second, we fully exploit the major priors of HSIs, namely LR, spatial–spectral LS, and NSS. The NLR model is introduced before the main loop to estimate the initial value of the coefficient factor, which indirectly characterizes NSS and avoids the high computational cost that introducing NLR into the main loop would incur.
- Third, the alternating direction method of multipliers (ADMM) and the augmented Lagrangian method (ALM) are used to solve the proposed DFTVNLR model. Experimental results show that DFTVNLR is superior to several state-of-the-art methods in mixed noise removal, spatial texture restoration, and spectral feature preservation.
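The ADMM/ALM machinery mentioned in the third contribution reduces, at each step, to closed-form proximal operators: singular-value shrinkage for low-rank terms and soft-thresholding for sparse terms. As an illustrative sketch only (not the authors' exact updates, whose variables, weights, and equations differ), a minimal low-rank-plus-sparse separation built from these two operators might look like:

```python
import numpy as np

def soft_threshold(A, tau):
    """Element-wise soft-thresholding: the proximal operator of the L1 norm."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def svd_shrink(A, tau):
    """Singular-value soft-thresholding: the proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(soft_threshold(s, tau)) @ Vt

def rpca_admm(Y, lam, mu=1.0, n_iter=100):
    """Toy scaled-dual ADMM that splits Y into a low-rank part X and a
    sparse part S (Y ≈ X + S).  Z is the scaled Lagrange multiplier."""
    X = np.zeros_like(Y)
    S = np.zeros_like(Y)
    Z = np.zeros_like(Y)
    for _ in range(n_iter):
        X = svd_shrink(Y - S + Z, 1.0 / mu)      # low-rank update
        S = soft_threshold(Y - X + Z, lam / mu)  # sparse-noise update
        Z = Z + (Y - X - S)                      # multiplier ascent on the residual
    return X, S
```

The proposed model replaces the single nuclear-norm term with the double-factor TV and NLR regularizers, but the alternating, proximal structure of the solver is the same.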
2. Notations and Preliminaries
3. HSI Mixed Noise Removal via DFTVNLR
3.1. Problem Formulation
3.2. Proposed Denoising Model
3.2.1. Spectral Global Low-Rankness (LR) Characterized by LRTF
3.2.2. Spatial–Spectral Local Smoothness (LS) Characterized by DFTV
3.2.3. Spatial Nonlocal Self-Similarity (NSS) Characterized by NLR
3.2.4. DFTVNLR-Based HSI Denoising Model
- A fuller exploration of the prior knowledge of clean HSIs and mixed noise. Compared to HSI mixed noise removal methods [2,29,30,31,41,46] that consider only partial priors, the proposed DFTVNLR comprehensively accounts for the LR, spatial–spectral LS, and NSS of the clean image, the data fidelity, and the sparsity of the sparse noise.
- A better balance between comprehensive prior representation and computational cost. Compared to applying regularization directly to the large-sized HSI [29,41] or to only one factor [2], we apply regularization to the two small-sized factors, which fully exploits prior knowledge while reducing the cost. Furthermore, we characterize NSS in an assisted loop instead of the main loop [30,41], another important measure to keep the cost under control.
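The cost argument above rests on the subspace idea: a B-band cube of spectral rank R is fully described by a B × R spectral basis and an R-channel coefficient image, both far smaller than the cube itself. A minimal sketch of such a factorization (function and variable names are ours, not the paper's notation; the paper's solver refines the factors jointly rather than by a single SVD) might be:

```python
import numpy as np

def subspace_factorize(hsi, rank):
    """Split an HSI cube (H, W, B) into a spectral basis U (B, rank) and a
    small-sized representation coefficient image V (H, W, rank)."""
    H, W, B = hsi.shape
    Y = hsi.reshape(-1, B)                       # pixels x bands
    _, _, Vt = np.linalg.svd(Y, full_matrices=False)
    U = Vt[:rank].T                              # orthonormal spectral subspace basis
    V = (Y @ U).reshape(H, W, rank)              # project each pixel spectrum onto it
    return U, V

def reconstruct(U, V):
    """Inverse map: each pixel spectrum is recovered as V @ U.T."""
    H, W, R = V.shape
    return (V.reshape(H * W, R) @ U.T).reshape(H, W, U.shape[0])
```

Regularizers applied to V and U then touch only R ≪ B channels, which is where the reported reductions in computational cost come from.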
3.3. Solving the Optimization Algorithm
3.3.1. Estimate the Initial Values of and
3.3.2. Update
3.3.3. Update
- (1)
- Update
- (2)
- Update
- (3)
- Update
3.3.4. Update
Algorithm 1. Pseudocode of the algorithm for solving the DFTVNLR model.
Input: The observed noisy HSI; the rank R (the dimension of the subspace); the regularization parameters; the penalty parameter of the ALM function; and the proximal error parameter.
Initialization: In the assisted-loop iteration, the two factors are estimated by SVD (under the rank-R condition), with PatSize = 5, PatNum = 200, SearchWin = 55, and Step = 5. In the main-loop iteration, the augmented Lagrangian function is minimized by alternating iteration, with a maximum tolerance of ε = 10⁻⁴.
Assisted loop: pre-estimation of the initial factor values.
1: The spectral subspace is obtained from the noisy HSI by the HySime algorithm and is taken as the corresponding initial value in the main loop.
2: The NLR regularization term is solved independently, and the result is taken as the corresponding initial value in the main loop.
3: while not converged do
4:   Update by Equation (14).
5:   Increment the iteration counter.
6: end while
7: Take the assisted-loop estimates as the initial values of the main loop.
Main loop: solution of the factor subproblems.
1: while not converged and the maximum number of iterations is not reached do
2:   Update by Equations (19) and (20).
3:   while the inner loop has not converged do
4:     Update by Equations (28) and (29).
5:     Update the auxiliary variable by Equation (31).
6:     Update the Lagrange multiplier by Equation (32).
7:     p = p + 1.
8:     Check the convergence condition.
9:   end while
10:  Collect the converged inner-loop variables.
11:  Update by Equation (34).
12:  l = l + 1.
13:  Compute the quantity used in the convergence check.
14:  Check the convergence condition.
15: end while
Output: The denoised HSI.
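Algorithm 1's assisted loop hinges on grouping nonlocal similar patches before the main loop runs. A self-contained sketch of the similar block search (SBS) step, with patch and window parameters in the spirit of the PatSize/SearchWin/PatNum settings above (the grouping criterion and all implementation details here are our assumptions, not the paper's code), could be:

```python
import numpy as np

def similar_block_search(img, ref_xy, pat=5, win=15, num=8):
    """Collect the `num` patches most similar (in squared Frobenius distance)
    to the reference patch at `ref_xy`, searched inside a (2*win+1)-wide
    window.  The returned group (num, pat, pat) is the stack whose unfolding
    is expected to be near low-rank, which is what the NLR term exploits."""
    H, W = img.shape
    y0, x0 = ref_xy
    ref = img[y0:y0 + pat, x0:x0 + pat]
    ylo, yhi = max(0, y0 - win), min(H - pat, y0 + win)
    xlo, xhi = max(0, x0 - win), min(W - pat, x0 + win)
    cands = []
    for y in range(ylo, yhi + 1):
        for x in range(xlo, xhi + 1):
            patch = img[y:y + pat, x:x + pat]
            cands.append((np.sum((patch - ref) ** 2), y, x))
    cands.sort(key=lambda t: t[0])           # most similar first
    return np.stack([img[y:y + pat, x:x + pat] for _, y, x in cands[:num]])
```

Applying singular-value shrinkage to each such group (as in the NLR term) and aggregating the denoised patches back gives the pre-estimate used to initialize the main loop.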
3.3.5. Computational Complexity Analysis
4. Experimental Results and Discussion
4.1. Simulated Data Experiments
4.1.1. Visual Quality Comparison
4.1.2. Quantitative Comparison
- The proposed DFTVNLR achieves the best results in most noise cases, but its MSSIM, MFSIM, or MSAM is slightly lower than that of LRTFDFR or WNLRATV in some cases. This may be because our model still cannot completely match the real degradation model, resulting in slight over-enhancement; nevertheless, the visual effect is improved to a certain extent.
- The principle of DFTVNLR is closest to that of LRTFDFR, yet our results are better overall, which shows that introducing the NSS prior is reasonable and effective. Likewise, the NLR term of DFTVNLR is derived from WNLRATV, yet our results are better overall, which shows that the double-factor strategy is more effective and robust.
- Compared to the methods (LRTDTV, FGSLR, and L0L1HTV) that do not characterize the time-consuming NSS prior but impose constraints on the large-sized HSI, our algorithm is roughly 1–2 times faster, thanks to our strategies of constraining the small-sized factors and characterizing NSS outside the main loop. Compared to the methods (WNLRATV and SNLRSF) that also consider the NSS prior and regularize the small-sized spatial factor, our computational cost is reduced by factors of about 4 and 10, respectively, owing to our strategy of solving the NLR model independently, with fewer iterations, outside the main loop.
- Although our approach makes an effort to reduce and control the computational cost, it is currently difficult to achieve the optimal cost because we exploit almost all major priors. Therefore, our cost is still higher than those of RCTV, FastHyMix, LRMR, and LRTFDFR, although our denoising performance is better. FastHyMix is faster because it introduces a deep image prior; RCTV is cheaper because it imposes fewer regularization terms, and only on the small-sized spatial factor; LRMR is fast because it mines only a few priors. However, the denoising performance of these approaches is clearly insufficient. The cost of LRTFDFR is about half of ours because it does not consider the NSS prior.
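For reference, the MPSNR and MSAM metrics discussed above follow standard definitions: band-wise PSNR averaged over the spectral bands, and the per-pixel spectral angle averaged over all pixels. A minimal sketch (the peak value and reshaping conventions are our assumptions; the paper's evaluation code may normalize differently):

```python
import numpy as np

def mpsnr(ref, rec, peak=1.0):
    """Mean PSNR over spectral bands (ref, rec: H x W x B arrays in [0, peak])."""
    mse = np.mean((ref - rec) ** 2, axis=(0, 1))          # per-band MSE
    return float(np.mean(10.0 * np.log10(peak ** 2 / mse)))

def msam(ref, rec, eps=1e-12):
    """Mean spectral angle (radians) between corresponding pixel spectra."""
    a = ref.reshape(-1, ref.shape[2])
    b = rec.reshape(-1, rec.shape[2])
    cos = np.sum(a * b, axis=1) / (
        np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1) + eps)
    return float(np.mean(np.arccos(np.clip(cos, -1.0, 1.0))))
```

Note that MSAM is scale-invariant per pixel, which is why it isolates spectral-shape distortion from brightness errors.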
4.2. Real Data Experiments
4.2.1. EO-1 Hyperion Dataset
4.2.2. AVIRIS Indian Pines Dataset
4.3. Discussion
4.3.1. Parameter Analysis
4.3.2. Convergence Analysis
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
HSI | hyperspectral image |
RS | remote sensing |
SAR | synthetic aperture radar |
RCI | representation coefficient image |
LR | spectral global low-rankness |
LS | local smoothness |
NSS | spatial nonlocal self-similarity |
S | sparsity of sparse noise |
TV | total variation |
SSTV | spatial–spectral TV |
ASSTV | adaptive SSTV |
DFTV | spatial–spectral double factor and total variation |
LRTF | low-rank tensor factorization |
LRMF | low-rank matrix factorization |
LRTV | low-rank TV |
LRMR | low-rank matrix recovery |
NLR | nonlocal low-rank tensor model |
PAM | proximal alternating minimization |
ADMM | alternating direction method of multipliers |
ALM | augmented Lagrangian method |
SVD | singular value decomposition |
FFT | fast Fourier transformation |
SBS | similar block search |
SoftThreshold | soft-thresholding operation |
MPSNR | mean peak signal-to-noise ratio
MSSIM | mean structural similarity
MFSIM | mean feature similarity
MSAM | mean spectral angle mapping
ERGAS | Erreur Relative Globale Adimensionnelle de Synthèse (relative dimensionless global error in synthesis)
References
- Green, R.O.; Eastwood, M.L.; Sarture, C.M.; Chrien, T.G.; Aronsson, M. Imaging Spectroscopy and the Airborne Visible Infrared Imaging Spectrometer (AVIRIS). Remote Sens. Environ. 1998, 65, 227–248.
- Peng, J.J.; Wang, H.L.; Cao, X.Y.; Liu, X.L.; Rui, X.Y.; Meng, D.Y. Fast Noise Removal in Hyperspectral Images Via Representative Coefficient Total Variation. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5546017.
- Zhao, X.; Tao, R.; Li, W.; Li, H.-C.; Du, Q.; Liao, W.; Philips, W. Joint Classification of Hyperspectral and Lidar Data Using Hierarchical Random Walk and Deep CNN Architecture. IEEE Trans. Geosci. Remote Sens. 2020, 58, 7355–7370.
- Ghamisi, P.; Yokoya, N.; Li, J.; Liao, W.; Liu, S.; Plaza, J.; Rasti, B.; Plaza, A. Advances in Hyperspectral Image and Signal Processing: A Comprehensive Overview of the State of the Art. IEEE Geosci. Remote Sens. Mag. 2017, 5, 37–78.
- Dian, R.W.; Li, S.T.; Guo, A.J.; Fang, L.Y. Deep Hyperspectral Image Sharpening. IEEE Trans. Neural Netw. Learn. Syst. 2018, 29, 5345–5355.
- Zhang, L.; Nie, J.T.; Wei, W.; Li, Y.; Zhang, Y.N. Deep Blind Hyperspectral Image Super-Resolution. IEEE Trans. Neural Netw. Learn. Syst. 2021, 32, 2388–2400.
- Lin, C.-H.; Bioucas-Dias, J.M. Nonnegative Blind Source Separation for Ill-Conditioned Mixtures Via John Ellipsoid. IEEE Trans. Neural Netw. Learn. Syst. 2021, 32, 2209–2223.
- He, W.; Zhang, H.Y.; Zhang, L.P.; Shen, H.F. Total-Variation-Regularized Low-Rank Matrix Factorization for Hyperspectral Image Restoration. IEEE Trans. Geosci. Remote Sens. 2016, 54, 176–188.
- Wang, Y.; Peng, J.J.; Zhao, Q.; Leung, Y.; Zhao, X.L.; Meng, D.Y. Hyperspectral Image Restoration Via Total Variation Regularized Low-Rank Tensor Decomposition. IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens. 2018, 11, 1227–1243.
- Zhou, L.; Ma, X.; Wang, X.; Hao, S.; Ye, Y.; Zhao, K. Shallow-to-Deep Spatial-Spectral Feature Enhancement for Hyperspectral Image Classification. Remote Sens. 2023, 15, 261.
- Deng, C.; Chen, Y.; Zhang, S.; Li, F.; Lai, P.; Su, D.; Hu, M.; Wang, S. Robust Dual Spatial Weighted Sparse Unmixing for Remotely Sensed Hyperspectral Imagery. Remote Sens. 2023, 15, 4056.
- Pan, H.; Jing, Z.; Leung, H.; Peng, P.; Zhang, H. Multiband Image Fusion Via Regularization on a Riemannian Submanifold. Remote Sens. 2023, 15, 4370.
- Deng, Y.-J.; Li, H.-C.; Tan, S.-Q.; Hou, J.; Du, Q.; Plaza, A. T-Linear Tensor Subspace Learning for Robust Feature Extraction of Hyperspectral Images. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5501015.
- Sun, S.; Bao, W.; Qu, K.; Feng, W.; Zhang, X.; Ma, X. Hyperspectral Image Super-Resolution Algorithm Based on Graph Regular Tensor Ring Decomposition. Remote Sens. 2023, 15, 4983.
- Ji, L.; Geng, X. Hyperspectral Target Detection Methods Based on Statistical Information: The Key Problems and the Corresponding Strategies. Remote Sens. 2023, 15, 3835.
- Chang, Y.; Yan, L.; Fang, H.; Zhong, S.; Liao, W. HSI-DeNet: Hyperspectral Image Restoration Via Convolutional Neural Network. IEEE Trans. Geosci. Remote Sens. 2019, 57, 667–682.
- Cao, X.-Y.; Fu, X.-Y.; Xu, C.; Meng, D.-Y. Deep Spatial-Spectral Global Reasoning Network for Hyperspectral Image Denoising. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5504714.
- Chen, Y.; Cao, X.; Zhao, Q.; Meng, D.; Xu, Z. Denoising Hyperspectral Image with Non-i.i.d. Noise Structure. IEEE Trans. Cybern. 2018, 48, 1054–1066.
- Peng, J.; Xie, Q.; Zhao, Q.; Wang, Y.; Yee, L.; Meng, D. Enhanced 3DTV Regularization and Its Applications on HSI Denoising and Compressed Sensing. IEEE Trans. Image Process. 2020, 29, 7889–7903.
- Zhuang, L.; Bioucas-Dias, J.M. Fast Hyperspectral Image Denoising and Inpainting Based on Low-Rank and Sparse Representations. IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens. 2018, 11, 730–742.
- Xie, Q.; Zhao, Q.; Meng, D.; Xu, Z.; Gu, S. Multispectral Images Denoising by Intrinsic Tensor Sparsity Regularization. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 27–30 June 2016; pp. 1692–1700.
- Zheng, Y.-B.; Huang, T.-Z.; Zhao, X.-L.; Jiang, T.-X.; Ma, T.-H.; Ji, T.-Y. Mixed Noise Removal in Hyperspectral Image Via Low-Fibered-Rank Regularization. IEEE Trans. Geosci. Remote Sens. 2020, 58, 734–749.
- Lin, J.; Huang, T.-Z.; Zhao, X.-L.; Ma, T.-H.; Jiang, T.-X.; Zheng, Y.-B. A Novel Non-Convex Low-Rank Tensor Approximation Model for Hyperspectral Image Restoration. Appl. Math. Comput. 2021, 408, 126342.
- Chang, Y.; Yan, L.-X.; Fang, H.-Z.; Liu, H. Simultaneous Destriping and Denoising for Remote Sensing Images with Unidirectional Total Variation and Sparse Representation. IEEE Geosci. Remote Sens. Lett. 2014, 11, 1051–1055.
- Fan, H.-Y.; Li, C.; Guo, Y.-L.; Kuang, G.-Y.; Ma, J.-Y. Spatial-Spectral Total Variation Regularized Low-Rank Tensor Decomposition for Hyperspectral Image Denoising. IEEE Trans. Geosci. Remote Sens. 2018, 56, 6196–6213.
- Jiang, T.-X.; Zhuang, L.-N.; Huang, T.-Z.; Zhao, X.-L.; Bioucas-Dias, J.M. Adaptive Hyperspectral Mixed Noise Removal. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5511413.
- Zhao, B.; Ulfarsson, M.O.; Sveinsson, J.R.; Chanussot, J. Hyperspectral Image Denoising Using Spectral-Spatial Transform-Based Sparse and Low-Rank Representations. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5522125.
- Chang, Y.; Yan, L.-X.; Zhao, X.-L.; Fang, H.-Z.; Zhang, Z.-J.; Zhong, S. Weighted Low-Rank Tensor Recovery for Hyperspectral Image Restoration. IEEE Trans. Cybern. 2020, 50, 4558–4572.
- Wang, M.; Wang, Q.; Chanussot, J.; Hong, D. L0-L1 Hybrid Total Variation Regularization and Its Applications on Hyperspectral Image Mixed Noise Removal and Compressed Sensing. IEEE Trans. Geosci. Remote Sens. 2021, 59, 7695–7710.
- Cao, C.-H.; Yu, J.; Zhou, C.-Y.; Hu, K.; Xiao, F.; Gao, X.-P. Hyperspectral Image Denoising Via Subspace-Based Nonlocal Low-Rank and Sparse Factorization. IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens. 2019, 12, 973–988.
- Zheng, Y.-B.; Huang, T.-Z.; Zhao, X.-L.; Chen, Y.; He, W. Double-Factor-Regularized Low-Rank Tensor Factorization for Mixed Noise Removal in Hyperspectral Image. IEEE Trans. Geosci. Remote Sens. 2020, 58, 8450–8464.
- Yao, J.; Meng, D.; Zhao, Q.; Cao, W.; Xu, Z. Nonconvex-Sparsity and Nonlocal-Smoothness-Based Blind Hyperspectral Unmixing. IEEE Trans. Image Process. 2019, 28, 2991–3006.
- Dian, R.-W.; Li, S.-T.; Sun, B.; Guo, A.-J. Recent Advances and New Guidelines on Hyperspectral and Multispectral Image Fusion. Inf. Fusion 2021, 69, 40–51.
- Miao, Y.-C.; Zhao, X.-L.; Fu, X.; Wang, J.-L.; Zheng, Y.-B. Hyperspectral Denoising Using Unsupervised Disentangled Spatiospectral Deep Priors. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5513916.
- Zha, Z.-Y.; Wen, B.-H.; Yuan, X.; Zhang, J.-C.; Zhou, J.-T.; Lu, Y.-L.; Zhu, C. Nonlocal Structured Sparsity Regularization Modeling for Hyperspectral Image Denoising. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5510316.
- Wang, S.; Zhu, Z.-B.; Liu, Y.-F.; Zhang, B.-X. Weighted Group Sparse Regularized Tensor Decomposition for Hyperspectral Image Denoising. Appl. Sci. 2023, 13, 10363.
- Han, J.; Pan, C.; Ding, H.-Y.; Zhang, Z.-C. Double-Factor Tensor Cascaded-Rank Decomposition for Hyperspectral Image Denoising. Remote Sens. 2024, 16, 109.
- Kolda, T.G.; Bader, B.W. Tensor Decompositions and Applications. SIAM Rev. 2009, 51, 455–500.
- Beck, A.; Teboulle, M. Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems. IEEE Trans. Image Process. 2009, 18, 2419–2434.
- Xu, L.; Zheng, S.-C.; Jia, J.-Y. Unnatural L0 Sparse Representation for Natural Image Deblurring. In Proceedings of the 26th IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Portland, OR, USA, 23–28 June 2013; pp. 1107–1114.
- Chen, Y.; Cao, W.-F.; Pang, L.; Cao, X.-Y. Hyperspectral Image Denoising with Weighted Nonlocal Low-Rank Model and Adaptive Total Variation Regularization. IEEE Trans. Geosci. Remote Sens. 2022, 60, 15.
- Xie, Y.; Qu, Y.-Y.; Tao, D.-C.; Wu, W.-W.; Yuan, Q.-Q.; Zhang, W.-S. Hyperspectral Image Restoration Via Iteratively Regularized Weighted Schatten P-Norm Minimization. IEEE Trans. Geosci. Remote Sens. 2016, 54, 4642–4659.
- Bioucas-Dias, J.M.; Plaza, A.; Dobigeon, N.; Parente, M.; Du, Q. Hyperspectral Unmixing Overview: Geometrical, Statistical, and Sparse Regression-Based Approaches. IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens. 2012, 5, 354–379.
- He, W.; Yao, Q.-M.; Li, C.; Yokoya, N.; Zhao, Q.-B. Non-Local Meets Global: An Iterative Paradigm for Hyperspectral Image Restoration. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 2089–2107.
- Zhuang, L.-N.; Bioucas-Dias, J.M. Hyperspectral Image Denoising Based on Global and Non-Local Low-Rank Factorizations. In Proceedings of the 24th IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; pp. 1900–1904.
- Zhuang, L.-N.; Ng, M.K. FastHyMix: Fast and Parameter-Free Hyperspectral Image Mixed Noise Removal. IEEE Trans. Neural Netw. Learn. Syst. 2023, 34, 4702–4716.
- Bioucas-Dias, J.M.; Nascimento, J.M.P. Hyperspectral Subspace Identification. IEEE Trans. Geosci. Remote Sens. 2008, 46, 2435–2445.
- He, W.; Yao, Q.-M.; Li, C.; Yokoya, N.; Zhao, Q.-B. Non-Local Meets Global: An Integrated Paradigm for Hyperspectral Denoising. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 16–20 June 2019; pp. 6861–6870.
- Chen, Y.; He, W.; Yokoya, N.; Huang, T.-Z. Hyperspectral Image Restoration Using Weighted Group Sparsity-Regularized Low-Rank Tensor Decomposition. IEEE Trans. Cybern. 2020, 50, 3556–3570.
- Boyd, S.; Parikh, N.; Chu, E.; Peleato, B.; Eckstein, J. Distributed Optimization and Statistical Learning Via the Alternating Direction Method of Multipliers. Found. Trends Mach. Learn. 2010, 3, 1–122.
- Wu, C.-L.; Tai, X.-C. Augmented Lagrangian Method, Dual Methods, and Split Bregman Iteration for ROF, Vectorial TV, and High Order Models. SIAM J. Imag. Sci. 2010, 3, 300–339.
- Donoho, D.L. De-Noising by Soft-Thresholding. IEEE Trans. Inf. Theory 1995, 41, 613–627.
- Zhang, H.-Y.; He, W.; Zhang, L.-P.; Shen, H.-F.; Yuan, Q.-Q. Hyperspectral Image Restoration Using Low-Rank Matrix Recovery. IEEE Trans. Geosci. Remote Sens. 2014, 52, 4729–4743.
- Chen, Y.; Huang, T.-Z.; He, W.; Zhao, X.-L.; Zhang, H. Hyperspectral Image Denoising Using Factor Group Sparsity-Regularized Nonconvex Low-Rank Approximation. IEEE Trans. Geosci. Remote Sens. 2021, 60, 3110769.
- Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image Quality Assessment: From Error Visibility to Structural Similarity. IEEE Trans. Image Process. 2004, 13, 600–612.
- Zhang, L.; Zhang, L.; Mou, X.-Q.; Zhang, D. FSIM: A Feature Similarity Index for Image Quality Assessment. IEEE Trans. Image Process. 2011, 20, 2378–2386.
- Wang, P.; Wang, Y.-L.; Huang, B.; Wang, L.-G.; Zhang, X.-W.; Leung, H.; Chanussot, J. Poissonian Blurred Hyperspectral Imagery Denoising Based on Variable Splitting and Penalty Technique. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5505414.
- Palud, P.; Thouvenin, P.-A.; Chainais, P.; Bron, E.; Le Petit, F. Efficient Sampling of Non Log-Concave Posterior Distributions with Mixture of Noises. IEEE Trans. Signal Process. 2023, 71, 2491–2501.
Prior Term | Subject | Algorithm | Computational Complexity
---|---|---|---
Spectral Global Low-Rankness (LR) | | SVD |
 | | SVD |
Spatial–Spectral Local Smoothness (LS) | | 2-D FFT |
 | | 2-D FFT |
 | | 1-D FFT |
Spatial Nonlocal Self-Similarity (NSS) | | SBS + SVD |
 | | SBS + SVD |
Sparsity of Sparse Noise (S) | | SoftThreshold |
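The 2-D FFT entries in the table above come from a standard trick for TV-type subproblems: under periodic boundary conditions the finite-difference operators are circulant, so the linear system in each LS update is diagonal in the Fourier domain and costs O(n log n). A hedged sketch for a quadratic (ℓ2) smoothness term is below; the paper's weighted/ℓ1 TV would wrap an extra ADMM splitting around this same FFT solve.

```python
import numpy as np

def tv_l2_fft(b, mu, lam):
    """Solve min_x  lam * ||D x||_2^2 + (mu/2) * ||x - b||_2^2 for a 2-D image
    under periodic boundaries.  The normal equations
    (mu * I + 2 * lam * D^T D) x = mu * b are diagonalized by the 2-D FFT,
    since D^T D is the periodic discrete Laplacian."""
    H, W = b.shape
    fy = np.fft.fftfreq(H)[:, None]   # vertical frequencies (cycles/sample)
    fx = np.fft.fftfreq(W)[None, :]   # horizontal frequencies
    # eigenvalues of D^T D: |1 - e^{-2*pi*i*f}|^2 summed over both axes
    lap = (2 - 2 * np.cos(2 * np.pi * fy)) + (2 - 2 * np.cos(2 * np.pi * fx))
    return np.real(np.fft.ifft2(mu * np.fft.fft2(b) / (mu + 2 * lam * lap)))
```

Because the LS terms act on the small-sized factors rather than the full cube, this per-iteration FFT cost is incurred on only R channels instead of all B bands.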
Method | Spectral Global Low-Rankness (LR) | Spatial–Spectral Local Smoothness (LS) | Spatial Nonlocal Self-Similarity (NSS) | Gaussian Noise (G) | Sparse Noise (S) |
---|---|---|---|---|---|
LRTFDFR | ___ | ||||
WNLRATV | ___ | ||||
RCTV | ___ | ||||
SNLRSF | ___ | ||||
L0L1HTV | ___ | ___ | |||
LRTDTV | ___ | ||||
LRMR | Minimize the rank of matrix | ___ | ___ | When rank(X) ≤ r | When card(S) ≤ r |
FGSLR | ___ | ___ | |||
DFTVNLR |
Metrics | Noisy | LRTFDFR | WNLRATV | RCTV | SNLRSF | FastHyMix | L0L1HTV | LRTDTV | LRMR | FGSLR | DFTVNLR |
---|---|---|---|---|---|---|---|---|---|---|---|
Case 1: Gaussian Noise | |||||||||||
MPSNR | 16.312 | 37.702 | 28.352 | 32.682 | 27.496 | 31.397 | 31.303 | 32.801 | 27.614 | 30.330 | 38.139 |
MSSIM | 0.3592 | 0.9823 | 0.9694 | 0.9680 | 0.9447 | 0.9642 | 0.9665 | 0.9647 | 0.8845 | 0.9486 | 0.9839 |
MFSIM | 0.4863 | 0.9960 | 0.9830 | 0.9812 | 0.9730 | 0.9782 | 0.9920 | 0.9763 | 0.9066 | 0.9656 | 0.9963 |
MSAM | 0.2763 | 0.0304 | 0.0797 | 0.0526 | 0.0872 | 0.0569 | 0.0654 | 0.0483 | 0.0859 | 0.0640 | 0.0284 |
ERGAS | 367.81 | 37.587 | 101.56 | 67.008 | 110.15 | 71.166 | 88.168 | 61.708 | 113.16 | 82.106 | 36.447 |
Time (s) | 0.0000 | 26.958 | 196.86 | 4.7124 | 560.02 | 5.0000 | 178.21 | 73.618 | 35.845 | 98.515 | 50.770 |
Case 2: Gaussian Noise + Salt-and-Pepper Noise |||||||||||
MPSNR | 13.747 | 37.687 | 27.659 | 32.593 | 26.015 | 27.813 | 31.980 | 32.034 | 26.439 | 29.582 | 38.197 |
MSSIM | 0.2181 | 0.9816 | 0.9622 | 0.9615 | 0.9107 | 0.9351 | 0.9637 | 0.9556 | 0.8537 | 0.9372 | 0.9840 |
MFSIM | 0.3870 | 0.9957 | 0.9769 | 0.9747 | 0.9451 | 0.9432 | 0.9900 | 0.9654 | 0.8807 | 0.9537 | 0.9958 |
MSAM | 0.3689 | 0.0289 | 0.0821 | 0.0525 | 0.0986 | 0.0820 | 0.0559 | 0.0473 | 0.1004 | 0.0710 | 0.0268 |
ERGAS | 486.39 | 37.496 | 109.02 | 67.773 | 125.93 | 106.98 | 81.603 | 65.330 | 134.78 | 92.077 | 35.186 |
Time (s) | 0.0000 | 28.341 | 226.70 | 4.6043 | 557.89 | 4.2800 | 175.70 | 75.821 | 38.530 | 159.56 | 53.312 |
Case 3: Gaussian Noise + Salt-and-Pepper Noise + Deadlines |||||||||||
MPSNR | 13.521 | 36.658 | 27.701 | 29.690 | 25.837 | 26.278 | 29.969 | 29.754 | 25.971 | 29.512 | 37.007 |
MSSIM | 0.2119 | 0.9825 | 0.9611 | 0.9312 | 0.9077 | 0.8757 | 0.9589 | 0.9423 | 0.8460 | 0.9243 | 0.9830 |
MFSIM | 0.3835 | 0.9953 | 0.9761 | 0.9509 | 0.9380 | 0.8873 | 0.9845 | 0.9559 | 0.8727 | 0.9415 | 0.9954 |
MSAM | 0.3803 | 0.0339 | 0.0814 | 0.0835 | 0.0932 | 0.1067 | 0.0701 | 0.0794 | 0.1030 | 0.0753 | 0.0323 |
ERGAS | 500.19 | 42.8395 | 106.77 | 103.35 | 126.83 | 138.39 | 95.602 | 99.151 | 137.78 | 94.809 | 41.345 |
Time (s) | 0.0000 | 28.868 | 220.43 | 4.5730 | 562.29 | 4.1600 | 176.19 | 73.441 | 35.975 | 437.95 | 53.962 |
Case 4: Gaussian Noise + Salt-and-Pepper Noise + Stripes |||||||||||
MPSNR | 13.630 | 35.671 | 27.573 | 30.489 | 25.749 | 26.742 | 29.473 | 30.814 | 26.129 | 29.270 | 35.954 |
MSSIM | 0.2137 | 0.9797 | 0.9586 | 0.9529 | 0.9086 | 0.9165 | 0.9589 | 0.9501 | 0.8479 | 0.9328 | 0.9805 |
MFSIM | 0.3831 | 0.9951 | 0.9753 | 0.9691 | 0.9400 | 0.9206 | 0.9882 | 0.9606 | 0.8744 | 0.9502 | 0.9951 |
MSAM | 0.3730 | 0.0362 | 0.0839 | 0.0712 | 0.1007 | 0.0957 | 0.0730 | 0.0560 | 0.1053 | 0.0749 | 0.0355 |
ERGAS | 494.81 | 47.767 | 116.31 | 94.314 | 132.61 | 121.42 | 99.085 | 75.942 | 138.41 | 99.220 | 46.485 |
Time (s) | 0.0000 | 25.833 | 228.73 | 4.6481 | 562.17 | 3.9400 | 175.14 | 73.202 | 37.221 | 171.56 | 55.544 |
Case 5: Gaussian Noise + Salt-and-Pepper Noise + Stripes + Deadlines |||||||||||
MPSNR | 13.379 | 34.921 | 27.497 | 28.394 | 25.577 | 24.792 | 29.784 | 29.921 | 25.364 | 28.118 | 35.269 |
MSSIM | 0.2069 | 0.9795 | 0.9582 | 0.9194 | 0.9005 | 0.8395 | 0.9560 | 0.9400 | 0.8375 | 0.9128 | 0.9790 |
MFSIM | 0.3801 | 0.9948 | 0.9749 | 0.9393 | 0.9336 | 0.8567 | 0.9819 | 0.9521 | 0.8638 | 0.9303 | 0.9943 |
MSAM | 0.3856 | 0.0410 | 0.0841 | 0.0932 | 0.1019 | 0.1268 | 0.0652 | 0.0713 | 0.1149 | 0.0867 | 0.0370 |
ERGAS | 509.64 | 52.076 | 115.46 | 120.25 | 133.71 | 162.50 | 91.164 | 92.999 | 150.12 | 112.97 | 47.555 |
Time (s) | 0.0000 | 28.918 | 199.38 | 3.7623 | 569.10 | 5.8200 | 126.32 | 75.541 | 36.245 | 502.65 | 53.756 |
Metrics | Noisy | LRTFDFR | WNLRATV | RCTV | SNLRSF | FastHyMix | L0L1HTV | LRTDTV | LRMR | FGSLR | DFTVNLR |
---|---|---|---|---|---|---|---|---|---|---|---|
Case 1: Gaussian Noise | |||||||||||
MPSNR | 13.168 | 29.001 | 28.513 | 27.960 | 30.601 | 29.935 | 22.821 | 26.430 | 26.998 | 27.282 | 30.725 |
MSSIM | 0.2864 | 0.9240 | 0.8878 | 0.8973 | 0.9426 | 0.9339 | 0.6131 | 0.8231 | 0.8643 | 0.8873 | 0.9463 |
MFSIM | 0.7176 | 0.9709 | 0.9304 | 0.9500 | 0.9745 | 0.9718 | 0.7426 | 0.9203 | 0.9440 | 0.9605 | 0.9760 |
MSAM | 0.5006 | 0.1954 | 0.1732 | 0.1968 | 0.1551 | 0.1644 | 0.3787 | 0.3222 | 0.2114 | 0.2047 | 0.1488 |
ERGAS | 1054.5 | 162.38 | 153.17 | 187.61 | 134.84 | 140.30 | 286.22 | 216.23 | 201.65 | 207.81 | 125.91 |
Time (s) | 0.0000 | 90.134 | 626.39 | 13.174 | 1692.8 | 10.130 | 478.94 | 156.18 | 105.53 | 1276.1 | 188.87 |
Case 2: Gaussian Noise + Salt-and-Pepper Noise |||||||||||
MPSNR | 12.731 | 27.871 | 28.248 | 25.751 | 26.228 | 24.490 | 22.696 | 26.112 | 26.429 | 24.987 | 29.338 |
MSSIM | 0.1659 | 0.9145 | 0.8642 | 0.8675 | 0.8591 | 0.8343 | 0.6074 | 0.8014 | 0.8426 | 0.8073 | 0.9252 |
MFSIM | 0.6305 | 0.9667 | 0.9179 | 0.9414 | 0.9382 | 0.9350 | 0.7360 | 0.9079 | 0.9327 | 0.9279 | 0.9615 |
MSAM | 0.5345 | 0.2320 | 0.1562 | 0.2921 | 0.2312 | 0.3102 | 0.3818 | 0.3391 | 0.2180 | 0.3757 | 0.1784 |
ERGAS | 1028.6 | 168.98 | 154.60 | 242.27 | 224.57 | 312.79 | 287.84 | 228.78 | 204.37 | 447.56 | 138.97 |
Time (s) | 0.0000 | 89.041 | 629.39 | 12.377 | 1691.5 | 16.210 | 462.36 | 157.67 | 118.98 | 6651.2 | 190.96 |
Case 3: Gaussian Noise + Salt-and-Pepper Noise + Deadlines |||||||||||
MPSNR | 12.764 | 27.965 | 28.311 | 25.801 | 24.283 | 23.878 | 22.721 | 25.537 | 26.130 | 24.054 | 28.716 |
MSSIM | 0.1649 | 0.9136 | 0.8638 | 0.8674 | 0.8377 | 0.8282 | 0.6017 | 0.7948 | 0.8401 | 0.7846 | 0.9170 |
MFSIM | 0.6294 | 0.9658 | 0.9188 | 0.9373 | 0.9353 | 0.9318 | 0.7339 | 0.9039 | 0.9293 | 0.9133 | 0.9628 |
MSAM | 0.5367 | 0.2449 | 0.1584 | 0.2806 | 0.3165 | 0.3416 | 0.3800 | 0.3743 | 0.2462 | 0.4275 | 0.2088 |
ERGAS | 1029.5 | 171.05 | 154.25 | 219.39 | 267.26 | 313.00 | 290.93 | 243.99 | 208.08 | 473.78 | 149.87 |
Time (s) | 0.0000 | 91.121 | 629.16 | 12.824 | 1687.8 | 17.130 | 472.53 | 156.49 | 112.65 | 7140.0 | 187.42 |
Case 4: Gaussian Noise + Salt-and-Pepper Noise + Stripes |||||||||||
MPSNR | 12.578 | 27.517 | 27.366 | 24.414 | 24.442 | 22.031 | 21.785 | 25.400 | 25.150 | 21.781 | 28.450 |
MSSIM | 0.1615 | 0.9085 | 0.8679 | 0.8443 | 0.8330 | 0.7566 | 0.5908 | 0.7899 | 0.8322 | 0.7502 | 0.9163 |
MFSIM | 0.6260 | 0.9659 | 0.9250 | 0.9372 | 0.9331 | 0.9103 | 0.7354 | 0.9053 | 0.9278 | 0.9040 | 0.9615 |
MSAM | 0.5378 | 0.2714 | 0.2072 | 0.3638 | 0.3155 | 0.4391 | 0.4011 | 0.3980 | 0.2908 | 0.5032 | 0.2274 |
ERGAS | 1048.0 | 182.88 | 178.34 | 290.17 | 288.67 | 461.74 | 323.23 | 264.59 | 223.34 | 538.91 | 166.86 |
Time (s) | 0.0000 | 89.871 | 638.22 | 12.990 | 1699.2 | 15.440 | 471.36 | 156.19 | 128.12 | 4645.8 | 189.11 |
Case 5: Gaussian Noise + Salt-and-Pepper Noise + Stripes + Deadlines |||||||||||
MPSNR | 12.657 | 28.652 | 28.094 | 24.615 | 24.104 | 21.210 | 23.119 | 24.907 | 25.639 | 24.320 | 29.631 |
MSSIM | 0.1610 | 0.9165 | 0.8648 | 0.8422 | 0.8255 | 0.7309 | 0.6072 | 0.7892 | 0.8312 | 0.7812 | 0.9235 |
MFSIM | 0.6253 | 0.9649 | 0.9185 | 0.9320 | 0.9307 | 0.8991 | 0.7339 | 0.9041 | 0.9258 | 0.9133 | 0.9594 |
MSAM | 0.5374 | 0.2162 | 0.1671 | 0.3042 | 0.3300 | 0.4835 | 0.2878 | 0.3907 | 0.2652 | 0.4081 | 0.1789 |
ERGAS | 1037.7 | 160.21 | 156.70 | 280.78 | 306.97 | 548.08 | 270.20 | 253.70 | 228.40 | 482.66 | 138.54 |
Time (s) | 0.0000 | 90.389 | 718.10 | 13.266 | 2640.6 | 14.440 | 369.64 | 157.34 | 123.19 | 7038.9 | 189.68 |
Parameter | ||||||||
---|---|---|---|---|---|---|---|---|
Dataset | ||||||||
Simulated HSI dataset | Indian Pines | 12 | 0.2 | 0.005–0.03 | 0.2–10 | 0.03–0.04 | 10,000–20,000 | ≥35
WDC Mall | 9 or 12 |
Real noisy HSI dataset | Hyperion | 12 or estimated by HySime | ||||||
Indian Pines |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wu, Y.; Xu, W.; Zheng, L. Hyperspectral Image Mixed Noise Removal via Double Factor Total Variation Nonlocal Low-Rank Tensor Regularization. Remote Sens. 2024, 16, 1686. https://doi.org/10.3390/rs16101686