A New Hybrid Descent Algorithm for Large-Scale Nonconvex Optimization and Application to Some Image Restoration Problems
Abstract
1. Introduction
2. Motivation, Algorithm, and Sufficient Descent Property
Algorithm 1 New descent hybrid algorithm of the MHS and DY methods (NMHSDY) for nonconvex functions.
Step 0. Input and Initialization. Select an initial point $x_0 \in \mathbb{R}^n$ and a parameter $\epsilon > 0$; compute $f(x_0)$ and $g_0 = \nabla f(x_0)$. Set $d_0 = -g_0$ and $k := 0$;
Step 1. If $\|g_k\| \le \epsilon$, then stop;
Step 2. Compute the step length $\alpha_k$ along the direction $d_k$ by some line search strategy;
Step 3. Let $x_{k+1} = x_k + \alpha_k d_k$;
Step 4. Compute the conjugate parameter $\beta_k$ by (7) and the search direction $d_{k+1}$;
Step 5. Set $k := k + 1$ and go to Step 1.
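To make the structure of Algorithm 1 concrete, the following minimal Python sketch implements the same loop. It is not the paper's method: the NMHSDY parameter defined in (7) is not reproduced here, so the sketch substitutes a classical HS/DY hybridization, beta_k = max(0, min(beta_HS, beta_DY)), together with a simple Armijo backtracking line search; all function names are illustrative.

```python
import numpy as np

def nmhsdy_sketch(f, grad, x0, eps=1e-5, max_iter=10000):
    """Generic hybrid CG loop in the shape of Algorithm 1.

    NOTE: the actual NMHSDY parameter of Eq. (7) is not shown in this
    excerpt; as a placeholder we clamp the Hestenes-Stiefel beta into
    [0, beta_DY], a classical HS/DY hybridization.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # Step 0: steepest-descent start
    for k in range(max_iter):
        if np.linalg.norm(g) <= eps:         # Step 1: stopping test
            break
        alpha = armijo(f, grad, x, d)        # Step 2: line search (sketch)
        x_new = x + alpha * d                # Step 3: iterate update
        g_new = grad(x_new)
        y = g_new - g
        dy = max(d @ y, 1e-16)               # safeguard: d'y > 0 holds under Wolfe,
                                             # not necessarily under pure Armijo
        beta_hs = (g_new @ y) / dy           # Hestenes-Stiefel parameter
        beta_dy = (g_new @ g_new) / dy       # Dai-Yuan parameter
        beta = max(0.0, min(beta_hs, beta_dy))  # placeholder hybrid, not Eq. (7)
        d = -g_new + beta * d                # Step 4: new search direction
        x, g = x_new, g_new                  # Step 5: k := k + 1
    return x

def armijo(f, grad, x, d, s=1.0, rho=0.5, c1=1e-4):
    """Backtracking Armijo search; assumes d is a descent direction.
    The paper's line search strategy may differ."""
    fx, slope = f(x), grad(x) @ d
    alpha = s
    while f(x + alpha * d) > fx + c1 * alpha * slope:
        alpha *= rho
    return alpha
```

In practice a Wolfe-type line search would be the natural companion here, since the descent and convergence theory of DY-type methods is usually stated under the Wolfe conditions.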
3. Convergence for General Nonlinear Functions
4. Numerical Performance
4.1. Performance on Benchmark Problems
4.2. Comparison for Stability
4.3. Application to Image Restoration
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
No. | Problem | No. | Problem |
---|---|---|---|
1 | Extended Freudenstein and Roth Function | 32 | ARWHEAD (CUTE) |
2 | Extended Trigonometric Function | 33 | NONDIA (Shanno-78) (CUTE) |
3 | Extended Rosenbrock Function | 34 | DQDRTIC (CUTE) |
4 | Extended Beale Function | 35 | EG2 (CUTE) |
5 | Extended Penalty Function | 36 | DIXMAANA (CUTE) |
6 | Perturbed Quadratic Function | 37 | DIXMAANB (CUTE) |
7 | Raydan 1 Function | 38 | DIXMAANC (CUTE) |
8 | Raydan 2 Function | 39 | DIXMAANE (CUTE) |
9 | Diagonal 3 Function | 40 | Broyden Tridiagonal |
10 | Generalized Tridiagonal-1 Function | 41 | Almost Perturbed Quadratic |
11 | Extended Tridiagonal-1 Function | 42 | Tridiagonal Perturbed Quadratic |
12 | Extended Three Exponential Terms | 43 | EDENSCH Function (CUTE) |
13 | Generalized Tridiagonal-2 | 44 | VARDIM Function (CUTE) |
14 | Diagonal 4 Function | 45 | LIARWHD (CUTE) |
15 | Diagonal 5 Function | 46 | DIAGONAL 6 |
16 | Extended Himmelblau Function | 47 | DIXMAANF (CUTE) |
17 | Generalized PSC1 Function | 48 | DIXMAANG (CUTE) |
18 | Extended PSC1 Function | 49 | DIXMAANH (CUTE) |
19 | Extended Powell Function | 50 | DIXMAANI (CUTE) |
20 | Extended Cliff Function | 51 | DIXMAANJ (CUTE) |
21 | Quadratic Diagonal Perturbed Function | 52 | DIXMAANK (CUTE) |
22 | Extended Wood Function | 53 | DIXMAANL (CUTE) |
23 | Extended Hiebert Function | 54 | DIXMAAND (CUTE) |
24 | Quadratic Function QF1 | 55 | ENGVAL1 (CUTE) |
25 | Extended Quadratic Penalty QP1 Function | 56 | COSINE (CUTE) |
26 | Extended Quadratic Penalty QP2 Function | 57 | Extended DENSCHNB (CUTE) |
27 | A Quadratic Function QF2 Function | 58 | Extended DENSCHNF (CUTE) |
28 | Extended EP1 Function | 59 | SINQUAD (CUTE) |
29 | Extended Tridiagonal-2 Function | 60 | Scaled Quadratic SQ1 |
30 | BDQRTIC (CUTE) | 61 | Scaled Quadratic SQ2 |
31 | TRIDIA (CUTE) | | |
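For concreteness, here is how one of the test problems above is typically coded for a solver like Algorithm 1. This sketch assumes the standard definition of problem 3 (Extended Rosenbrock) and its usual starting point, neither of which is restated in this excerpt.

```python
import numpy as np

def ext_rosenbrock(x):
    """Extended Rosenbrock in its standard pairwise form:
    f(x) = sum_i 100 (x_{2i} - x_{2i-1}^2)^2 + (1 - x_{2i-1})^2."""
    x = np.asarray(x, dtype=float)
    odd, even = x[0::2], x[1::2]
    return np.sum(100.0 * (even - odd**2) ** 2 + (1.0 - odd) ** 2)

def ext_rosenbrock_grad(x):
    """Analytic gradient of the function above."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    odd, even = x[0::2], x[1::2]
    g[0::2] = -400.0 * odd * (even - odd**2) - 2.0 * (1.0 - odd)
    g[1::2] = 200.0 * (even - odd**2)
    return g

# conventional starting point (-1.2, 1, -1.2, 1, ...), here at n = 10000
x0 = np.tile([-1.2, 1.0], 5000)
```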
MHSCG Method | Algorithm 1 |
---|---|---|---|---|---|---|---|---|
Dim | NI | NFG | Time | ‖g‖ | NI | NFG | Time | ‖g‖ |
5 | 5 | 22 | 0.000000 | 0.000000 | 5 | 22 | 0.000000 | 0.000000 |
6 | 13 | 48 | 0.031250 | 0.000000 | 13 | 48 | 0.000000 | 0.000000 |
7 | 21 | 72 | 0.000000 | 0.000000 | 21 | 72 | 0.000000 | 0.000000 |
8 | 21 | 72 | 0.000000 | 0.000000 | 21 | 72 | 0.015625 | 0.000000 |
9 | 27 | 92 | 0.000000 | 0.000006 | 24 | 82 | 0.000000 | 0.000000 |
10 | 41 | 143 | 0.000000 | 0.000007 | 29 | 97 | 0.000000 | 0.000000 |
11 | 75 | 246 | 0.000000 | 0.000009 | 55 | 192 | 0.000000 | 0.000000 |
12 | 58 | 203 | 0.000000 | 0.000003 | 70 | 249 | 0.000000 | 0.000008 |
13 | 57 | 190 | 0.000000 | 0.000004 | 70 | 246 | 0.000000 | 0.000000 |
14 | 86 | 276 | 0.031250 | 0.000008 | 123 | 429 | 0.000000 | 0.000000 |
15 | 75 | 264 | 0.031250 | 0.000009 | 89 | 314 | 0.000000 | 0.000009 |
16 | 98 | 330 | 0.000000 | 0.000008 | 52 | 177 | 0.000000 | 0.000001 |
17 | 24 | 82 | 0.000000 | 0.000002 | 66 | 232 | 0.000000 | 0.000002 |
18 | 65 | 215 | 0.031250 | 0.000009 | 28 | 97 | 0.000000 | 0.000000 |
19 | 31 | 105 | 0.000000 | 0.000007 | 27 | 94 | 0.000000 | 0.000000 |
20 | 111 | 373 | 0.031250 | 0.000009 | 131 | 452 | 0.000000 | 0.000010 |
21 | 117 | 379 | 0.031250 | 0.000008 | 31 | 108 | 0.000000 | 0.000000 |
22 | 96 | 333 | 0.000000 | 0.000010 | 66 | 236 | 0.000000 | 0.000000 |
23 | 68 | 233 | 0.000000 | 0.000009 | 85 | 303 | 0.031250 | 0.000000 |
24 | 50 | 175 | 0.031250 | 0.000000 | 43 | 151 | 0.000000 | 0.000000 |
25 | 96 | 318 | 0.000000 | 0.000009 | 103 | 377 | 0.000000 | 0.000000 |
26 | 113 | 370 | 0.000000 | 0.000009 | 112 | 410 | 0.000000 | 0.000004 |
27 | 22 | 78 | 0.000000 | 0.000000 | 129 | 473 | 0.031250 | 0.000001 |
28 | 104 | 372 | 0.000000 | 0.000009 | 24 | 87 | 0.000000 | 0.000000 |
29 | 156 | 513 | 0.000000 | 0.000010 | 130 | 449 | 0.000000 | 0.000001 |
30 | 54 | 185 | 0.000000 | 0.000001 | 121 | 446 | 0.000000 | 0.000007 |
31 | 39 | 144 | 0.000000 | 0.000003 | 97 | 341 | 0.000000 | 0.000008 |
32 | 26 | 97 | 0.000000 | 0.000001 | 79 | 292 | 0.031250 | 0.000001 |
33 | 87 | 309 | 0.000000 | 0.000010 | 66 | 234 | 0.031250 | 0.000002 |
34 | 58 | 213 | 0.000000 | 0.000006 | 60 | 217 | 0.031250 | 0.000000 |
35 | 97 | 327 | 0.000000 | 0.000009 | 25 | 95 | 0.000000 | 0.000002 |
36 | 24 | 86 | 0.000000 | 0.000000 | 132 | 481 | 0.000000 | 0.000003 |
37 | 23 | 84 | 0.000000 | 0.000004 | 55 | 192 | 0.000000 | 0.000001 |
38 | 93 | 312 | 0.046875 | 0.000010 | 40 | 142 | 0.000000 | 0.000003 |
39 | 148 | 487 | 0.046875 | 0.000005 | 124 | 442 | 0.140625 | 0.000010 |
40 | 66 | 231 | 0.000000 | 0.000007 | 136 | 496 | 0.031250 | 0.000007 |
41 | 76 | 267 | 0.031250 | 0.000005 | 65 | 239 | 0.000000 | 0.000005 |
42 | 130 | 438 | 0.046875 | 0.000008 | 85 | 302 | 0.000000 | 0.000010 |
43 | 164 | 546 | 0.031250 | 0.000009 | 136 | 488 | 0.000000 | 0.000009 |
44 | 184 | 594 | 0.031250 | 0.000007 | 29 | 103 | 0.000000 | 0.000006 |
45 | 17 | 64 | 0.000000 | 0.000000 | 17 | 64 | 0.000000 | 0.000000 |
46 | 155 | 519 | 0.031250 | 0.000010 | 62 | 222 | 0.000000 | 0.000000 |
47 | 123 | 421 | 0.031250 | 0.000010 | 107 | 379 | 0.031250 | 0.000010 |
48 | 20 | 74 | 0.000000 | 0.000000 | 68 | 250 | 0.015625 | 0.000010 |
49 | 135 | 473 | 0.031250 | 0.000010 | 102 | 375 | 0.031250 | 0.000010 |
50 | 178 | 594 | 0.062500 | 0.000010 | 151 | 542 | 0.031250 | 0.000009 |
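Solver comparisons like the table above are commonly summarized with Dolan-Moré performance profiles. The sketch below assumes a matrix of positive costs with one row per problem and one column per solver (e.g., the NI or NFG columns); the sample data are the first five NI pairs from the table, and the function name is illustrative.

```python
import numpy as np

def performance_profile(T, taus):
    """Dolan-More profile rho_s(tau) from an (n_problems, n_solvers)
    matrix T of positive costs (e.g., NI, NFG, or CPU time).
    Failures can be encoded as np.inf."""
    ratios = T / T.min(axis=1, keepdims=True)   # r_{p,s} = t_{p,s} / min_s t_{p,s}
    # fraction of problems each solver handles within a factor tau of the best
    return np.array([[np.mean(ratios[:, s] <= tau)
                      for s in range(T.shape[1])] for tau in taus])

# NI pairs (MHSCG, Algorithm 1) for the first five rows of the table above
T = np.array([[5, 5], [13, 13], [21, 21], [21, 21], [27, 24]], dtype=float)
print(performance_profile(T, taus=[1.0, 1.5, 2.0]))
```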
Algorithm 1 | MHSCG Method | ||||
---|---|---|---|---|---|
Image | Noise Level | PSNR | CPU Time | PSNR | CPU Time |
Barbara | | 29.6638 | 2.265625 | 29.5831 | 2.578125 |
Baboon | | 27.9223 | 2.609375 | 27.8455 | 3.250000 |
Barbara | | 23.1256 | 3.593750 | 23.1103 | 3.765625 |
Baboon | | 21.1836 | 3.484375 | 21.1582 | 3.671875 |
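The PSNR figures above follow the usual definition for 8-bit images, PSNR = 10 log10(255² / MSE), where MSE is the mean squared error between the restored and the original image; since the paper's exact formula is not reproduced in this excerpt, the sketch below assumes that standard form.

```python
import numpy as np

def psnr(restored, original, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10 log10(peak^2 / MSE).
    Assumes 8-bit images (peak = 255) unless stated otherwise."""
    err = np.asarray(restored, dtype=float) - np.asarray(original, dtype=float)
    mse = np.mean(err ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```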
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).