A Modified Hestenes-Stiefel-Type Derivative-Free Method for Large-Scale Nonlinear Monotone Equations
Abstract
1. Introduction
2. Algorithm and the Sufficient Descent Property
Consider the system of nonlinear monotone equations
$$F(x) = 0, \qquad x \in \mathbb{R}^n,$$
where $F : \mathbb{R}^n \to \mathbb{R}^n$ is assumed to satisfy the following two conditions (a brief numerical illustration is given after the list):
- F is a monotone function, i.e., $(F(x) - F(y))^{\top}(x - y) \ge 0$ for all $x, y \in \mathbb{R}^n$.
- F is a Lipschitz continuous function, namely, there exists a constant $L > 0$ such that $\|F(x) - F(y)\| \le L\|x - y\|$ for all $x, y \in \mathbb{R}^n$.
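To make these two conditions concrete, here is a minimal numerical check, assuming the illustrative mapping $F(x) = x + \sin(x)$ (chosen only because it is easy to verify that it is monotone and Lipschitz continuous with $L \le 2$; it is not claimed to be one of the paper's test problems):

```python
import numpy as np

# Illustrative monotone, Lipschitz-continuous mapping: F(x) = x + sin(x).
# Each component t -> t + sin(t) is nondecreasing, and |1 + cos(t)| <= 2, so L <= 2.
def F(x):
    return x + np.sin(x)

rng = np.random.default_rng(0)
n = 1000
worst_inner, worst_ratio = np.inf, 0.0
for _ in range(200):
    x, y = rng.standard_normal(n), rng.standard_normal(n)
    Fx, Fy = F(x), F(y)
    # Monotonicity: (F(x) - F(y))^T (x - y) >= 0
    worst_inner = min(worst_inner, (Fx - Fy) @ (x - y))
    # Lipschitz continuity: ||F(x) - F(y)|| <= L ||x - y||
    worst_ratio = max(worst_ratio, np.linalg.norm(Fx - Fy) / np.linalg.norm(x - y))

print("smallest (F(x)-F(y))^T(x-y):", worst_inner)    # stays nonnegative
print("largest  ||F(x)-F(y)||/||x-y||:", worst_ratio)  # stays below 2
```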
Algorithm 1: NHZ derivative-free method.
- Step 0: Given an initial point $x_0 \in \mathbb{R}^n$, a tolerance $\epsilon > 0$, and the constants required by the line search (5) and the direction formulas (10)–(12). Set k := 0.
- Step 1: Calculate $F(x_k)$. If $\|F(x_k)\| \le \epsilon$, stop the algorithm. Otherwise, go to Step 2.
- Step 2: Determine the search direction $d_k$ by (10), (11) and (12).
- Step 3: Calculate the search steplength $\alpha_k$ by (5). Let $z_k = x_k + \alpha_k d_k$.
- Step 4: Calculate $F(z_k)$. If $\|F(z_k)\| \le \epsilon$, stop the algorithm. Otherwise, calculate $x_{k+1}$ by using the projection (9). Set k := k + 1 and go to Step 1.
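Because the referenced formulas (5) and (9)–(12) are not reproduced in this excerpt, the sketch below only illustrates the general derivative-free projection framework that Algorithm 1 follows, in the spirit of Solodov and Svaiter's hyperplane-projection scheme. The backtracking test, its parameters, and the placeholder direction update are assumptions for illustration; they are not the paper's specific line search (5) or HS-type direction (10)–(12).

```python
import numpy as np

def projection_framework(F, x0, eps=1e-6, sigma=1e-4, rho=0.5, max_iter=100000):
    """Generic derivative-free projection framework (a sketch, not the NHZ method itself)."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx                                   # placeholder for the HS-type direction of (10)-(12)
    for k in range(max_iter):
        if np.linalg.norm(Fx) <= eps:         # Step 1: stop at x_k
            return x, k
        # Step 3 analogue: derivative-free backtracking (a common test, standing in for (5)):
        #   -F(x_k + alpha d_k)^T d_k >= sigma * alpha * ||d_k||^2
        alpha = 1.0
        z = x + alpha * d
        Fz = F(z)
        while -Fz @ d < sigma * alpha * (d @ d) and alpha > 1e-12:
            alpha *= rho
            z = x + alpha * d
            Fz = F(z)
        if np.linalg.norm(Fz) <= eps:         # Step 4: stop at z_k
            return z, k
        # Projection of x_k onto the hyperplane {x : F(z_k)^T (x - z_k) = 0}
        x_new = x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz
        F_new = F(x_new)
        # Placeholder HS-like update with a restart safeguard
        y = F_new - Fx
        denom = d @ y
        beta = (F_new @ y) / denom if abs(denom) > 1e-12 else 0.0
        d = -F_new + beta * d
        if -F_new @ d <= 1e-12 * (F_new @ F_new):
            d = -F_new
        x, Fx = x_new, F_new
    return x, max_iter

# Example on the illustrative monotone mapping from the previous sketch:
# x_star, iters = projection_framework(lambda v: v + np.sin(v), np.ones(1000))
```

The key property exploited by methods of this class is that, when F is monotone, every solution lies on the far side of the hyperplane $F(z_k)^{\top}(x - z_k) = 0$, so the projected iterate $x_{k+1}$ is never farther from the solution set than $x_k$; this Fejér-type monotonicity of the iterates is the standard engine behind global convergence results such as those developed in Section 3.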
3. Global Convergence Analysis
4. Numerical Experiments
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest

Dim. | SG Time | SG Iter | SG Feval | MPRP Time | MPRP Iter | MPRP Feval | NHZ Time | NHZ Iter | NHZ Feval
---|---|---|---|---|---|---|---|---|---
1000 | 1.16 | 16 | 37 | 0.81 | 12 | 39 | 0.65 | 10 | 29 | |
1000 | 1.16 | 16 | 37 | 0.81 | 12 | 39 | 0.65 | 10 | 29 | |
1000 | 0.77 | 13 | 26 | 0.55 | 8 | 27 | 0.48 | 7 | 22 | |
1000 | 1.24 | 15 | 43 | 0.85 | 9 | 42 | 0.78 | 7 | 35 | |
1000 | 1.15 | 14 | 44 | 0.53 | 6 | 28 | 0.48 | 5 | 24 | |
1000 | 1.15 | 14 | 44 | 0.53 | 6 | 28 | 0.48 | 5 | 24 | |
5000 | 6.96 | 18 | 37 | 5.32 | 13 | 42 | 4.55 | 11 | 32 | |
5000 | 6.96 | 17 | 37 | 5.01 | 12 | 39 | 3.85 | 11 | 29 | |
5000 | 4.16 | 11 | 22 | 3.01 | 7 | 24 | 2.43 | 6 | 20 | |
5000 | 10.45 | 18 | 62 | 7.38 | 12 | 60 | 5.99 | 11 | 48 | |
5000 | 6.88 | 14 | 44 | 3.40 | 6 | 28 | 2.84 | 5 | 24 | |
5000 | 6.96 | 15 | 45 | 3.40 | 6 | 28 | 2.84 | 5 | 24 | |
10,000 | 27.77 | 18 | 38 | 21.15 | 13 | 42 | 15.72 | 11 | 32 | |
10,000 | 27.62 | 17 | 36 | 19.65 | 12 | 39 | 14.36 | 11 | 27 | |
10,000 | 14.90 | 10 | 20 | 11.92 | 7 | 24 | 8.96 | 6 | 20 | |
10,000 | 44.65 | 20 | 69 | 35.15 | 14 | 72 | 30.98 | 12 | 68 | |
10,000 | 29.55 | 15 | 45 | 13.86 | 6 | 28 | 11.22 | 5 | 24 | |
10,000 | 29.55 | 15 | 45 | 13.86 | 6 | 28 | 11.22 | 5 | 24 |

Dim. | SG Time | SG Iter | SG Feval | MPRP Time | MPRP Iter | MPRP Feval | NHZ Time | NHZ Iter | NHZ Feval
---|---|---|---|---|---|---|---|---|---
1000 | 0.67 | 326 | 646 | 0.17 | 196 | 591 | 0.12 | 155 | 568 | |
1000 | 0.64 | 364 | 714 | 0.11 | 203 | 612 | 0.10 | 180 | 584 | |
1000 | 0.64 | 343 | 691 | 0.19 | 195 | 588 | 0.17 | 174 | 555 | |
1000 | 1.18 | 468 | 941 | 0.15 | 243 | 741 | 0.14 | 220 | 709 | |
1000 | 0.67 | 479 | 988 | 0.17 | 194 | 585 | 0.15 | 162 | 549 | |
1000 | 0.65 | 431 | 857 | 0.16 | 189 | 571 | 0.14 | 165 | 529 | |
5000 | 10.38 | 723 | 2047 | 8.19 | 487 | 1465 | 8.10 | 468 | 1448 | |
5000 | 16.87 | 753 | 2622 | 12.85 | 769 | 2310 | 12.63 | 744 | 2205 | |
5000 | 9.01 | 824 | 2849 | 7.98 | 469 | 1411 | 7.65 | 442 | 1324 | |
5000 | 18.74 | 1023 | 3261 | 15.96 | 929 | 2840 | 15.51 | 901 | 2781 | |
5000 | 10.05 | 1226 | 3053 | 7.59 | 453 | 1363 | 7.50 | 442 | 1320 | |
5000 | 11.32 | 836 | 2476 | 6.32 | 371 | 1122 | 6.20 | 338 | 1103 | |
10,000 | 47.90 | 960 | 2002 | 34.59 | 539 | 1622 | 10.01 | 460 | 1508 | |
10,000 | 82.62 | 1334 | 4066 | 65.06 | 1023 | 3072 | 60.79 | 1001 | 3003 | |
10,000 | 50.99 | 833 | 2469 | 34.30 | 516 | 1553 | 32.32 | 501 | 1502 | |
10,000 | 56.82 | 2042 | 6668 | 39.65 | 1668 | 5075 | 36.31 | 1602 | 5003 | |
10,000 | 49.63 | 832 | 2268 | 31.57 | 497 | 1497 | 30.25 | 436 | 1405 | |
10,000 | 45.70 | 850 | 1706 | 25.36 | 396 | 1202 | 22.24 | 375 | 1106 |

Dim. | SG Time | SG Iter | SG Feval | MPRP Time | MPRP Iter | MPRP Feval | NHZ Time | NHZ Iter | NHZ Feval
---|---|---|---|---|---|---|---|---|---
1000 | 1.16 | 16 | 37 | 0.81 | 12 | 39 | 0.62 | 10 | 28 | |
1000 | 1.17 | 17 | 36 | 0.83 | 12 | 39 | 0.71 | 11 | 28 | |
1000 | 0.77 | 11 | 24 | 0.57 | 8 | 27 | 0.49 | 7 | 28 | |
1000 | 1.25 | 14 | 44 | 0.88 | 9 | 42 | 0.75 | 7 | 32 | |
1000 | 1.16 | 13 | 42 | 0.56 | 6 | 28 | 0.48 | 5 | 22 | |
1000 | 1.16 | 13 | 42 | 0.57 | 6 | 28 | 0.48 | 5 | 22 | |
5000 | 6.98 | 17 | 36 | 5.42 | 13 | 42 | 4.63 | 11 | 32 | |
5000 | 6.98 | 17 | 36 | 5.11 | 12 | 39 | 3.95 | 11 | 30 | |
5000 | 4.29 | 10 | 22 | 3.12 | 7 | 24 | 2.34 | 6 | 20 | |
5000 | 10.57 | 19 | 64 | 7.46 | 12 | 60 | 6.25 | 11 | 52 | |
5000 | 6.99 | 13 | 42 | 3.52 | 6 | 28 | 3.92 | 5 | 24 | |
5000 | 6.99 | 13 | 42 | 3.52 | 6 | 28 | 3.92 | 5 | 24 | |
10,000 | 27.78 | 17 | 36 | 21.35 | 13 | 42 | 15.97 | 11 | 32 | |
10,000 | 27.79 | 17 | 36 | 19.75 | 12 | 39 | 15.86 | 11 | 30 | |
10,000 | 15.65 | 9 | 26 | 11.99 | 7 | 24 | 9.98 | 6 | 19 | |
10,000 | 44.85 | 20 | 69 | 35.36 | 14 | 72 | 29.98 | 12 | 60 | |
10,000 | 29.89 | 14 | 45 | 13.98 | 6 | 28 | 12.56 | 5 | 24 | |
10,000 | 29.89 | 14 | 45 | 13.98 | 6 | 28 | 13.59 | 6 | 24 |

Dim. | SG Time | SG Iter | SG Feval | MPRP Time | MPRP Iter | MPRP Feval | NHZ Time | NHZ Iter | NHZ Feval
---|---|---|---|---|---|---|---|---|---
1000 | 0.20 | 219 | 431 | 0.06 | 50 | 216 | 0.05 | 38 | 168 | |
1000 | 0.28 | 261 | 463 | 0.06 | 56 | 252 | 0.05 | 46 | 185 | |
1000 | 0.28 | 224 | 329 | 0.05 | 34 | 152 | 0.03 | 32 | 137 | |
1000 | 0.22 | 263 | 529 | 0.07 | 100 | 421 | 0.06 | 96 | 399 | |
1000 | 0.28 | 183 | 403 | 0.06 | 42 | 187 | 0.05 | 40 | 177 | |
1000 | 0.28 | 212 | 424 | 0.06 | 60 | 261 | 0.05 | 48 | 218 | |
5000 | 2.15 | 263 | 456 | 1.11 | 48 | 209 | 1.05 | 47 | 183 | |
5000 | 2.45 | 225 | 378 | 1.19 | 46 | 224 | 0.92 | 38 | 169 | |
5000 | 1.65 | 122 | 267 | 0.62 | 27 | 117 | 0.65 | 29 | 128 | |
5000 | 3.41 | 265 | 558 | 2.59 | 109 | 483 | 2.49 | 104 | 455 | |
5000 | 2.86 | 290 | 467 | 1.21 | 53 | 231 | 1.07 | 44 | 189 | |
5000 | 2.97 | 231 | 477 | 1.20 | 54 | 234 | 1.16 | 48 | 213 | |
10,000 | 5.30 | 278 | 502 | 3.96 | 45 | 195 | 3.83 | 42 | 185 | |
10,000 | 6.26 | 237 | 574 | 4.27 | 41 | 210 | 3.84 | 38 | 158 | |
10,000 | 5.62 | 275 | 585 | 1.93 | 42 | 96 | 2.62 | 35 | 142 | |
10,000 | 18.15 | 333 | 596 | 10.86 | 117 | 533 | 9.45 | 109 | 498 | |
10,000 | 13.52 | 341 | 595 | 4.34 | 49 | 212 | 3.85 | 44 | 186 | |
10,000 | 13.55 | 336 | 553 | 4.89 | 56 | 246 | 3.78 | 48 | 195 |

Dim. | SG Time | SG Iter | SG Feval | MPRP Time | MPRP Iter | MPRP Feval | NHZ Time | NHZ Iter | NHZ Feval
---|---|---|---|---|---|---|---|---|---
1000 | 0.89 | 119 | 289 | 0.66 | 47 | 199 | 0.44 | 38 | 168 | |
1000 | 0.78 | 122 | 263 | 0.45 | 22 | 105 | 0.44 | 24 | 98 | |
1000 | 0.69 | 130 | 235 | 0.35 | 48 | 209 | 0.28 | 38 | 120 | |
1000 | 0.85 | 190 | 249 | 0.47 | 37 | 165 | 0.34 | 35 | 98 | |
1000 | 0.75 | 194 | 248 | 0.55 | 94 | 237 | 0.45 | 66 | 192 | |
1000 | 1.22 | 225 | 462 | 0.79 | 174 | 396 | 0.75 | 142 | 372 | |
5000 | 2.32 | 113 | 260 | 1.22 | 51 | 221 | 0.98 | 42 | 172 | |
5000 | 2.92 | 128 | 270 | 0.56 | 22 | 105 | 0.58 | 28 | 96 | |
5000 | 3.80 | 228 | 412 | 1.11 | 47 | 200 | 0.79 | 44 | 144 | |
5000 | 3.50 | 216 | 424 | 1.20 | 48 | 206 | 0.79 | 44 | 142 | |
5000 | 3.00 | 226 | 443 | 1.17 | 47 | 206 | 0.81 | 44 | 122 | |
5000 | 6.57 | 461 | 881 | 5.06 | 308 | 707 | 4.25 | 262 | 628 | |
10,000 | 5.92 | 66 | 209 | 3.90 | 44 | 191 | 3.42 | 38 | 184 | |
10,000 | 6.86 | 68 | 218 | 51.1 | 22 | 105 | 49.1 | 21 | 98 | |
10,000 | 5.76 | 60 | 181 | 4.24 | 47 | 204 | 3.23 | 38 | 132 | |
10,000 | 11.84 | 69 | 227 | 10.5 | 48 | 209 | 8.52 | 44 | 148 | |
10,000 | 10.55 | 68 | 221 | 4.02 | 45 | 196 | 3.82 | 42 | 168 | |
10,000 | 12.46 | 89 | 326 | 10.1 | 74 | 262 | 7.83 | 68 | 232 |
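The tables above compare SG, MPRP, and NHZ by Time, iteration count (Iter), and number of function evaluations (Feval) at dimensions 1000, 5000, and 10,000. As a small illustration of one common way to aggregate such columns (not necessarily the comparison methodology used in the paper), the sketch below computes Dolan-Moré-style performance ratios, where values closer to 1 indicate the more efficient solver on that row:

```python
import numpy as np

def performance_ratios(costs):
    """Dolan-More performance ratios: each solver's cost divided by the best
    cost on the same problem; a ratio of 1 marks the winner on that row."""
    costs = np.asarray(costs, dtype=float)
    return costs / costs.min(axis=1, keepdims=True)

# Example with the Time columns (SG, MPRP, NHZ) of the first three
# 1000-dimensional rows of the first table above:
ratios = performance_ratios([[1.16, 0.81, 0.65],
                             [1.16, 0.81, 0.65],
                             [0.77, 0.55, 0.48]])
print(ratios.round(2))   # NHZ attains ratio 1.00 on each of these rows
```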