A Three-Dimensional Subspace Algorithm Based on the Symmetry of the Approximation Model and WYL Conjugate Gradient Method
Abstract
1. Introduction
2. The Search Direction
2.1. Conic Model
2.2. Quadratic Model
3. The Stepsize and Algorithm
3.1. Strategy for the Initial Stepsize
3.2. The Nonmonotone Line Search
3.3. Algorithm
Algorithm 1 WYL_TSCO
Require: initial point, initial stepsize, positive constants
Ensure: optimal point
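The full WYL_TSCO iteration (three-dimensional subspace direction, conic/quadratic model switching, and nonmonotone line search) is specified in the paper itself. As a point of reference, the classical Wei-Yao-Liu (WYL) conjugate gradient update that the method builds on can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the backtracking Armijo search and the steepest-descent restart safeguard are simplifying assumptions standing in for the nonmonotone line search and subspace machinery.

```python
import numpy as np

def wyl_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Nonlinear conjugate gradient with the Wei-Yao-Liu (WYL) beta.

    A plain backtracking Armijo search is used here as a simplified
    stand-in for the nonmonotone line search discussed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        if g.dot(d) >= 0:
            d = -g  # safeguard: restart with steepest descent
        # Backtracking Armijo line search.
        alpha, c, rho, fx = 1.0, 1e-4, 0.5, f(x)
        while f(x + alpha * d) > fx + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # WYL parameter:
        # beta = g+^T (g+ - (||g+|| / ||g||) g) / ||g||^2
        beta = g_new.dot(g_new - (np.linalg.norm(g_new) / gnorm) * g) / gnorm**2
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a simple separable quadratic with minimizer at the ones vector.
sol = wyl_cg(lambda x: float(np.sum((x - 1.0)**2)),
             lambda x: 2.0 * (x - 1.0),
             np.zeros(4))
```

A useful property of the WYL parameter (established in the references below) is that it stays nonnegative and yields sufficient descent under the strong Wolfe conditions, which is what makes it an attractive building block for subspace variants.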
4. Properties of the Proposed Algorithm
4.1. Some Lemmas
4.2. Convergence Analysis
5. Numerical Results
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Hestenes, M.R.; Stiefel, E.L. Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 1952, 49, 409–436.
- Fletcher, R.; Reeves, C.M. Function minimization by conjugate gradients. Comput. J. 1964, 7, 149–154.
- Polyak, B.T. The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 1969, 9, 94–112.
- Fletcher, R. Practical Methods of Optimization, Volume 1: Unconstrained Optimization; Wiley: New York, NY, USA, 1987.
- Liu, Y.L.; Storey, C.S. Efficient generalized conjugate gradient algorithms, part 1: Theory. J. Optim. Theory Appl. 1991, 69, 129–137.
- Dai, Y.H.; Yuan, Y.X. A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 1999, 10, 177–182.
- Wei, Z.; Yao, S.; Liu, L. The convergence properties of some new conjugate gradient methods. Appl. Math. Comput. 2006, 183, 1341–1350.
- Huang, H.; Wei, Z.; Yao, S. The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search. Appl. Math. Comput. 2007, 189, 1241–1245.
- Zhang, L. An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation. Appl. Math. Comput. 2009, 215, 2269–2274.
- Yao, S.; Qin, B. A hybrid of DL and WYL nonlinear conjugate gradient methods. Abstr. Appl. Anal. 2014, 2014, 279891.
- Huang, H.; Lin, S. A modified Wei-Yao-Liu conjugate gradient method for unconstrained optimization. Appl. Math. Comput. 2014, 231, 179–186.
- Hu, Y.; Wei, Z. Wei-Yao-Liu conjugate gradient projection algorithm for nonlinear monotone equations with convex constraints. Int. J. Comput. Math. 2015, 92, 2261–2272.
- Huo, J.; Yang, J.; Wang, G.; Yao, S. A class of three-dimensional subspace conjugate gradient algorithms for unconstrained optimization. Symmetry 2022, 14, 80.
- Wolfe, P. Convergence conditions for ascent methods. SIAM Rev. 1969, 11, 226–235; Erratum in SIAM Rev. 1971, 13, 185–188.
- Zhang, H.; Hager, W.W. A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 2004, 14, 1043–1056.
- Grippo, L.; Lampariello, F.; Lucidi, S. A nonmonotone line search technique for Newton’s method. SIAM J. Numer. Anal. 1986, 23, 707–716.
- Hager, W.W.; Zhang, H. A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 2005, 16, 170–192.
- Gu, N.Z.; Mo, J.T. Incorporating nonmonotone strategies into the trust region method for unconstrained optimization. Comput. Math. Appl. 2008, 55, 2158–2172.
- Dai, Y.H.; Kou, C.X. A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 2013, 23, 296–320.
- Huang, S.; Wan, Z.; Chen, X. A new nonmonotone line search technique for unconstrained optimization. Numer. Algorithms 2015, 68, 671–689.
- Ou, Y.; Liu, Y. A memory gradient method based on the nonmonotone technique. J. Ind. Manag. Optim. 2017, 13, 857–872.
- Ahookhosh, M.; Amini, K.; Peyghami, M.R. A nonmonotone trust-region line search method for large-scale unconstrained optimization. Appl. Math. Model. 2012, 36, 478–487.
- Ahookhosh, M.; Amini, K. An efficient nonmonotone trust-region method for unconstrained optimization. Numer. Algorithms 2012, 59, 523–540.
- Kimiaei, M.; Rahpeymaii, F. A new nonmonotone line-search trust-region approach for nonlinear systems. TOP 2019, 27, 199–232.
- Kimiaei, M. A new class of nonmonotone adaptive trust-region methods for nonlinear equations with box constraints. Calcolo 2017, 54, 769–812.
- Yuan, Y.X. Subspace techniques for nonlinear optimization. In Some Topics in Industrial and Applied Mathematics; Jeltsch, R., Li, D.Q., Sloan, I.H., Eds.; Higher Education Press: Beijing, China, 2007; pp. 206–218.
- Yuan, Y.X. A review on subspace methods for nonlinear optimization. In Proceedings of the International Congress of Mathematics, Seoul, Republic of Korea, 13–21 August 2014; pp. 807–827.
- Liu, D.C.; Nocedal, J. On the limited memory BFGS method for large scale optimization. Math. Program. 1989, 45, 503–528.
- Wang, Z.H.; Wen, Z.W.; Yuan, Y. A subspace trust region method for large scale unconstrained optimization. In Numerical Linear Algebra and Optimization; Yuan, Y., Ed.; Science Press: Beijing, China, 2004; pp. 265–274.
- Wang, Z.H.; Yuan, Y. A subspace implementation of quasi-Newton trust region methods for unconstrained optimization. Numer. Math. 2006, 104, 241–269.
- Lee, J.H.; Jung, Y.M.; Yuan, Y.X.; Yun, S. A subspace SQP method for equality constrained optimization. Comput. Optim. Appl. 2019, 74, 177–194.
- Gill, P.E.; Leonard, M.W. Reduced-Hessian quasi-Newton methods for unconstrained optimization. SIAM J. Optim. 2001, 12, 209–237.
- Liu, X.; Wen, Z.; Zhang, Y. Limited memory block Krylov subspace optimization for computing dominant singular value decompositions. SIAM J. Sci. Comput. 2013, 35, A1641–A1668.
- Yuan, Y.X. Subspace methods for large scale nonlinear equations and nonlinear least squares. Optim. Eng. 2009, 10, 207–218.
- Dong, Q.; Liu, X.; Wen, Z.W.; Yuan, Y.X. A parallel line search subspace correction method for composite convex optimization. J. Oper. Res. Soc. China 2015, 3, 163–187.
- Li, Y.; Liu, H.; Wen, Z.; Yuan, Y.X. Low-rank matrix iteration using polynomial-filtered subspace extraction. SIAM J. Sci. Comput. 2020, 42, A1686–A1713.
- Kimiaei, M.; Neumaier, A.; Azmi, B. LMBOPT: A limited memory method for bound-constrained optimization. Math. Prog. Comp. 2022, 14, 271–318.
- Cartis, C.; Roberts, L. Scalable subspace methods for derivative-free nonlinear least-squares optimization. Math. Program. 2023, 199, 461–524.
- Yuan, Y.X.; Stoer, J. A subspace study on conjugate gradient algorithms. Z. Angew. Math. Mech. 1995, 75, 69–77.
- Andrei, N. An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 2014, 65, 859–874.
- Yang, Y.; Chen, Y.; Lu, Y. A subspace conjugate gradient algorithm for large-scale unconstrained optimization. Numer. Algorithms 2017, 76, 813–828.
- Li, M.; Liu, H.; Liu, Z. A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer. Algorithms 2018, 79, 195–219.
- Dai, Y.H.; Kou, C.X. A Barzilai-Borwein conjugate gradient method. Sci. China Math. 2016, 59, 1511–1524.
- Barzilai, J.; Borwein, J.M. Two-point step size gradient methods. IMA J. Numer. Anal. 1988, 8, 141–148.
- Sun, W. On nonquadratic model optimization methods. Asia Pac. J. Oper. Res. 1996, 13, 43–63.
- Sun, W.; Yuan, Y. Optimization Theory and Methods: Nonlinear Programming; Springer: New York, NY, USA, 2006.
- Davidon, W.C. Conic approximations and collinear scalings for optimizers. SIAM J. Numer. Anal. 1980, 17, 268–281.
- Sorensen, D.C. The Q-superlinear convergence of a collinear scaling algorithm for unconstrained optimization. SIAM J. Numer. Anal. 1980, 17, 84–114.
- Ariyawansa, K.A. Deriving collinear scaling algorithms as extensions of quasi-Newton methods and the local convergence of DFP- and BFGS-related collinear scaling algorithms. Math. Program. 1990, 49, 23–48.
- Sheng, S. Interpolation by conic model for unconstrained optimization. Computing 1995, 54, 83–98.
- Di, S.; Sun, W. A trust region method for conic model to solve unconstrained optimization. Optim. Methods Softw. 1996, 6, 237–263.
- Li, Y.; Liu, Z.; Liu, H. A subspace minimization conjugate gradient method based on conic model for unconstrained optimization. Comput. Appl. Math. 2019, 38, 16.
- Sun, W.; Li, Y.; Wang, T.; Liu, H. A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization. Comput. Appl. Math. 2022, 41, 178.
- Yuan, Y.X. A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 1991, 11, 325–332.
- Dai, Y.H.; Yuan, J.Y.; Yuan, Y.X. Modified two-point stepsize gradient methods for unconstrained optimization problems. Comput. Optim. Appl. 2002, 22, 103–109.
- Liu, Z.; Liu, H. An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer. Algorithms 2018, 78, 21–39.
- Liu, H.; Liu, Z. An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization. J. Optim. Theory Appl. 2019, 180, 879–906.
- Nocedal, J.; Wright, S.J. Numerical Optimization; Springer: New York, NY, USA, 2006.
- Andrei, N. Nonlinear Conjugate Gradient Methods for Unconstrained Optimization; Springer: Cham, Switzerland, 2020.
- Hager, W.W.; Zhang, H. Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 2006, 32, 113–137.
- Andrei, N. Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization. Bull. Malays. Math. Sci. Soc. Second Ser. 2011, 34, 319–330.
- Andrei, N. An unconstrained optimization test functions collection. Adv. Model. Optim. 2008, 10, 147–161.
- Dolan, E.D.; Moré, J.J. Benchmarking optimization software with performance profiles. Math. Program. 2002, 91, 201–213.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, G.; Yao, S.; Pei, M.; Xu, J. A Three-Dimensional Subspace Algorithm Based on the Symmetry of the Approximation Model and WYL Conjugate Gradient Method. Symmetry 2023, 15, 1207. https://doi.org/10.3390/sym15061207