Iterative Methods for Solving Nonlinear Equations and Systems 2020

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Difference and Differential Equations".

Deadline for manuscript submissions: closed (15 December 2020) | Viewed by 27842

Special Issue Editors


Guest Editor
Institute for Multidisciplinary Mathematics, Universitat Politècnica de València, 46022 València, Spain
Interests: iterative processes; matrix analysis; numerical analysis

Guest Editor
School of Telecommunications Engineering, Universitat Politècnica de València, 46022 Valencia, Spain
Interests: numerical analysis; iterative methods; nonlinear problems; real and complex discrete dynamics

Special Issue Information

Dear Colleagues,

Solving nonlinear equations (scalar or matrix equations) and nonlinear systems is a non-trivial task that arises in many areas of science and technology. Such problems can rarely be tackled directly, so iterative algorithms play a fundamental role in their solution. This area of research has experienced exponential growth in recent years.

The main theme of this Special Issue includes, but is not limited to, the design, the analysis of convergence and stability, and the application to practical problems of new iterative schemes for solving nonlinear problems. This covers methods with and without memory, with derivatives or derivative-free, the real or complex dynamics associated with them, and the analysis of their convergence, whether local, semilocal, or global.
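For reference, the sketch below (purely illustrative, not taken from any paper in this issue) contrasts on an arbitrary test equation the three classical building blocks mentioned above: a derivative-based method (Newton), a derivative-free method (Steffensen), and a method with memory (secant).

```python
# Illustrative sketches of three classical scalar iterations:
# Newton (uses f'), Steffensen (derivative-free), secant (with memory).
# The test function f(x) = x**3 - 2 is an arbitrary example.

def newton(f, df, x0, tol=1e-12, maxit=50):
    x = x0
    for _ in range(maxit):
        x_new = x - f(x) / df(x)          # classical Newton step
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def steffensen(f, x0, tol=1e-12, maxit=50):
    x = x0
    for _ in range(maxit):
        fx = f(x)
        if abs(fx) < tol:
            return x
        g = (f(x + fx) - fx) / fx         # divided difference f[x, x+f(x)] replaces f'
        x_new = x - fx / g
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def secant(f, x0, x1, tol=1e-12, maxit=50):
    # "with memory": the previous iterate is reused to approximate f'
    for _ in range(maxit):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:
            return x1
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

if __name__ == "__main__":
    f = lambda x: x**3 - 2
    df = lambda x: 3 * x**2
    print(newton(f, df, 1.5), steffensen(f, 1.5), secant(f, 1.0, 2.0))
```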

Prof. Dr. Juan R. Torregrosa
Prof. Dr. Alicia Cordero
Dr. Francisco I. Chicharro
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • nonlinear systems
  • transcendental equations
  • nonlinear matrix equations
  • iterative methods
  • convergence
  • efficiency
  • chaotic behavior
  • complex or real dynamics
  • numerical resolution of nonlinear models in biology, chemistry, aerospace, communications, and other engineering areas

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (10 papers)


Research

5 pages, 311 KiB  
Article
A New Derivative-Free Method to Solve Nonlinear Equations
by Beny Neta
Mathematics 2021, 9(6), 583; https://doi.org/10.3390/math9060583 - 10 Mar 2021
Cited by 14 | Viewed by 2661
Abstract
A new high-order derivative-free method for the solution of a nonlinear equation is developed. The novelty is the use of Traub’s method as a first step. The order is proven and demonstrated. It is also shown that the method has far fewer divergent points and runs faster than an optimal eighth-order derivative-free method.
(This article belongs to the Special Issue Iterative Methods for Solving Nonlinear Equations and Systems 2020)
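Since the abstract does not reproduce the scheme itself, the following hedged sketch shows only a generic two-step, derivative-free Traub/Steffensen-type building block, in which the derivative is replaced by the divided difference f[x, x + f(x)] and the same divided difference is reused in the second step. It is not Neta's composite method, and no convergence order is claimed for it.

```python
# Hedged sketch: a two-step, derivative-free Traub/Steffensen-type iteration.
# Not the method of the paper; only the generic derivative-free building block.

def traub_steffensen_type(f, x0, tol=1e-12, maxit=50):
    x = x0
    for _ in range(maxit):
        fx = f(x)
        if abs(fx) < tol:
            return x
        w = x + fx
        dd = (f(w) - fx) / (w - x)     # divided difference f[x, w] ~ f'(x)
        y = x - fx / dd                # Steffensen-type first step
        x_new = y - f(y) / dd          # reuse dd: no derivative, no extra difference
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(traub_steffensen_type(lambda x: x**3 - 2, 1.5))  # ~ 2**(1/3)
```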

21 pages, 6306 KiB  
Article
Aboodh Transform Iterative Method for Spatial Diffusion of a Biological Population with Fractional-Order
by Gbenga O. Ojo and Nazim I. Mahmudov
Mathematics 2021, 9(2), 155; https://doi.org/10.3390/math9020155 - 13 Jan 2021
Cited by 26 | Viewed by 2980
Abstract
In this paper, a new approximate analytical method is proposed for solving the fractional biological population model, where the fractional derivative is described in the Caputo sense. The method is based upon the Aboodh transform, a modification of the Laplace transform, combined with the new iterative method. Illustrative cases are considered, and exact and numerical solutions are compared for different values of alpha. Furthermore, surface plots are provided in order to understand the effect of the fractional order. The advantage of this method is that it is efficient, precise, and easy to implement, requiring little computational effort.
(This article belongs to the Special Issue Iterative Methods for Solving Nonlinear Equations and Systems 2020)
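As background for readers unfamiliar with the transform, the sketch below numerically evaluates the Aboodh transform under the standard definition A{f}(v) = (1/v) ∫₀^∞ f(t) e^(−vt) dt, i.e. the Laplace transform divided by v. The test function is arbitrary, and the paper itself uses the transform analytically rather than numerically.

```python
# Hedged sketch: numerical evaluation of the Aboodh transform by quadrature,
# assuming the standard definition A{f}(v) = (1/v) * integral_0^inf f(t)*exp(-v*t) dt.
from math import exp, inf
from scipy.integrate import quad

def aboodh(f, v):
    """Approximate A{f}(v) for a decaying-enough f by adaptive quadrature."""
    val, _ = quad(lambda t: f(t) * exp(-v * t), 0.0, inf)
    return val / v

# Check against the closed form for f(t) = t: Laplace gives 1/v**2, so Aboodh gives 1/v**3.
v = 2.0
print(aboodh(lambda t: t, v), 1.0 / v**3)
```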

17 pages, 2217 KiB  
Article
A Family of Multiple-Root Finding Iterative Methods Based on Weight Functions
by Francisco I. Chicharro, Rafael A. Contreras and Neus Garrido
Mathematics 2020, 8(12), 2194; https://doi.org/10.3390/math8122194 - 9 Dec 2020
Cited by 7 | Viewed by 2271
Abstract
A straightforward family of one-point multiple-root iterative methods is introduced. The family is generated using the technique of weight functions. The order of convergence of the family is determined in its convergence analysis, which shows the constraints that the weight function must satisfy to achieve order three. In this sense, a family of iterative methods can be obtained with a suitable design of the weight function; that is, an iterative algorithm that depends on one or more parameters is designed. This family of iterative methods, starting with proper initial estimations, generates a sequence of approximations to the solution of a problem. A dynamical analysis is also included to study the long-term behavior of the family depending on the parameter value and the initial guess considered. This analysis reveals the good properties of the family for a wide range of values of the parameter. In addition, a numerical test on academic and engineering multiple-root functions is performed.
(This article belongs to the Special Issue Iterative Methods for Solving Nonlinear Equations and Systems 2020)
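For orientation, the sketch below implements the classical modified Newton step for a root of known multiplicity m, x_{k+1} = x_k − m f(x_k)/f′(x_k), which restores quadratic convergence at multiple roots. Weight-function families such as the one introduced in this paper replace the constant factor m by a designed weight function; the concrete family is given only in the article and is not reproduced here.

```python
# Hedged sketch: classical modified Newton for a root of known multiplicity m.
# Weight-function multiple-root families generalize the constant factor m.

def modified_newton(f, df, x0, m, tol=1e-12, maxit=100):
    x = x0
    for _ in range(maxit):
        fx = f(x)
        if abs(fx) < tol:
            break
        x = x - m * fx / df(x)
    return x

# f(x) = (x - 1)**3 has a triple root at x = 1 (multiplicity m = 3).
f = lambda x: (x - 1) ** 3
df = lambda x: 3 * (x - 1) ** 2
print(modified_newton(f, df, 2.0, m=3))
```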

15 pages, 529 KiB  
Article
Memory in a New Variant of King’s Family for Solving Nonlinear Systems
by Munish Kansal, Alicia Cordero, Sonia Bhalla and Juan R. Torregrosa
Mathematics 2020, 8(8), 1251; https://doi.org/10.3390/math8081251 - 31 Jul 2020
Cited by 4 | Viewed by 1969
Abstract
In the recent literature, very few high-order Jacobian-free methods with memory for solving nonlinear systems appear. In this paper, we introduce a new variant of King’s family with order four to solve nonlinear systems, along with its convergence analysis. The proposed family requires two divided difference operators and the computation of only one matrix inverse per iteration. Furthermore, we extend the proposed scheme up to sixth order of convergence with two additional functional evaluations. In addition, these schemes are further extended to methods with memory. We illustrate their applicability by performing numerical experiments on a wide variety of practical problems, including large-scale ones. It is observed that these methods produce approximations of greater accuracy and are more efficient in practice, compared with existing methods.
(This article belongs to the Special Issue Iterative Methods for Solving Nonlinear Equations and Systems 2020)
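As the scalar starting point of the construction, the sketch below implements the classical King's family with parameter β (fourth order for scalar equations). The Jacobian-free extension to systems, the divided-difference operators, and the memory acceleration are developed in the paper and are not reproduced here.

```python
# Hedged sketch: the classical scalar King's family,
#   y  = x - f(x)/f'(x)
#   x+ = y - [(f(x) + beta*f(y)) / (f(x) + (beta - 2)*f(y))] * f(y)/f'(x).

def king(f, df, x0, beta=0.0, tol=1e-12, maxit=50):
    x = x0
    for _ in range(maxit):
        fx, dfx = f(x), df(x)
        if fx == 0.0:
            return x
        y = x - fx / dfx
        fy = f(y)
        x_new = y - (fx + beta * fy) / (fx + (beta - 2.0) * fy) * fy / dfx
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(king(lambda x: x**3 - 2, lambda x: 3 * x**2, 1.5, beta=0.0))  # beta = 0 gives Ostrowski's method
```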

27 pages, 1079 KiB  
Article
Purely Iterative Algorithms for Newton’s Maps and General Convergence
by Sergio Amat, Rodrigo Castro, Gerardo Honorato and Á. A. Magreñán
Mathematics 2020, 8(7), 1158; https://doi.org/10.3390/math8071158 - 15 Jul 2020
Cited by 1 | Viewed by 2184
Abstract
The aim of this paper is to study the local dynamical behaviour of a broad class of purely iterative algorithms for Newton’s maps. In particular, we describe the nature and stability of fixed points and provide a type of scaling theorem. Based on those results, we apply a rigidity theorem in order to study the parameter space of cubic polynomials for a large class of new root-finding algorithms. Finally, we study the relations between critical points and the parameter space.
(This article belongs to the Special Issue Iterative Methods for Solving Nonlinear Equations and Systems 2020)
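The following sketch iterates the Newton map N_p(z) = z − p(z)/p′(z) of the cubic z³ − 1 over a grid of complex starting points and labels the resulting basins of attraction. The polynomial and the grid resolution are arbitrary illustrative choices; the paper studies a much broader class of purely iterative algorithms and their parameter spaces.

```python
# Hedged sketch: basins of attraction of the Newton map of z**3 - 1.
import numpy as np

p  = lambda z: z**3 - 1
dp = lambda z: 3 * z**2
roots = np.array([1.0, -0.5 + 0.8660254j, -0.5 - 0.8660254j])

# Grid of complex starting points.
re, im = np.meshgrid(np.linspace(-2, 2, 400), np.linspace(-2, 2, 400))
z = re + 1j * im
with np.errstate(divide="ignore", invalid="ignore"):
    for _ in range(40):                  # iterate the Newton map N_p
        z = z - p(z) / dp(z)

# Label each starting point by the root closest to its terminal iterate.
basin = np.argmin(np.abs(z[..., None] - roots), axis=-1)
print(np.bincount(basin.ravel()))        # rough size of the three basins
```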

12 pages, 1229 KiB  
Article
Two Iterative Methods with Memory Constructed by the Method of Inverse Interpolation and Their Dynamics
by Xiaofeng Wang and Mingming Zhu
Mathematics 2020, 8(7), 1080; https://doi.org/10.3390/math8071080 - 3 Jul 2020
Cited by 4 | Viewed by 1846
Abstract
In this paper, we obtain two iterative methods with memory by using inverse interpolation. Firstly, using three function evaluations, we present a two-step iterative method with memory, which has convergence order 4.5616. Secondly, a three-step iterative method of order 10.1311 is obtained, which requires four function evaluations per iteration. Herzberger’s matrix method is used to prove the convergence order of the new methods. Finally, numerical comparisons are made with some known methods by using basins of attraction and numerical computations to demonstrate the efficiency and performance of the presented methods.
(This article belongs to the Special Issue Iterative Methods for Solving Nonlinear Equations and Systems 2020)
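To illustrate the underlying tool, the sketch below runs a with-memory iteration built on inverse quadratic interpolation, the same building block used, for instance, in Brent's method: the last three pairs (x_i, f_i) are kept in memory, x is interpolated as a quadratic in f, and the interpolant is evaluated at f = 0. The two methods of the paper, with orders 4.5616 and 10.1311, are different and more elaborate constructions.

```python
# Hedged sketch: inverse quadratic interpolation with memory of three points.

def inverse_quadratic_step(xs, fs):
    """Fit x as a quadratic in f through the pairs (x_i, f_i) and evaluate at f = 0."""
    x0, x1, x2 = xs
    f0, f1, f2 = fs
    return (x0 * f1 * f2 / ((f0 - f1) * (f0 - f2))
            + x1 * f0 * f2 / ((f1 - f0) * (f1 - f2))
            + x2 * f0 * f1 / ((f2 - f0) * (f2 - f1)))

def inverse_interp_with_memory(f, x0, x1, x2, tol=1e-12, maxit=50):
    xs = [x0, x1, x2]
    fs = [f(x) for x in xs]
    for _ in range(maxit):
        x_new = inverse_quadratic_step(xs, fs)
        f_new = f(x_new)
        if abs(f_new) < tol or abs(x_new - xs[-1]) < tol:
            return x_new
        xs = xs[1:] + [x_new]           # memory: keep the three most recent pairs
        fs = fs[1:] + [f_new]
    return xs[-1]

print(inverse_interp_with_memory(lambda x: x**3 - 2, 1.0, 1.5, 2.0))
```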

9 pages, 254 KiB  
Article
The Newtonian Operator and Global Convergence Balls for Newton’s Method
by José A. Ezquerro and Miguel A. Hernández-Verón
Mathematics 2020, 8(7), 1074; https://doi.org/10.3390/math8071074 - 2 Jul 2020
Cited by 1 | Viewed by 2001
Abstract
We obtain results of restricted global convergence for Newton’s method from ideas based on the fixed-point theorem and using the Newtonian operator and auxiliary points. The results are illustrated with a nonlinear integral equation of Davis type and improve the results previously given by the authors.
(This article belongs to the Special Issue Iterative Methods for Solving Nonlinear Equations and Systems 2020)
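As a concrete setting for this kind of analysis, the sketch below applies Newton's method to a trapezoidal discretization of a nonlinear integral equation of the form x(s) = 1 + λ ∫₀¹ K(s,t) x(t)² dt. The kernel K(s,t) = s·t and the value λ = 0.25 are hypothetical placeholders; the Davis-type equation and the convergence-ball analysis of the paper are given in the article itself.

```python
# Hedged sketch: Newton's method for a discretized nonlinear integral equation.
import numpy as np

n = 21
t = np.linspace(0.0, 1.0, n)
w = np.full(n, 1.0 / (n - 1)); w[0] = w[-1] = 0.5 / (n - 1)   # trapezoidal weights
lam = 0.25                                # hypothetical parameter
K = np.outer(t, t)                        # hypothetical kernel K(s_i, t_j) = s_i * t_j

def F(x):
    return x - 1.0 - lam * K @ (w * x**2)

def J(x):
    return np.eye(n) - 2.0 * lam * K * (w * x)   # Jacobian of F

x = np.ones(n)                            # starting point x0(s) = 1
for _ in range(10):
    x = x - np.linalg.solve(J(x), F(x))   # Newton step for the discretized system
print(np.linalg.norm(F(x)))               # residual after the Newton iterations
```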
21 pages, 648 KiB  
Article
Least-Square-Based Three-Term Conjugate Gradient Projection Method for ℓ1-Norm Problems with Application to Compressed Sensing
by Abdulkarim Hassan Ibrahim, Poom Kumam, Auwal Bala Abubakar, Jamilu Abubakar and Abubakar Bakoji Muhammad
Mathematics 2020, 8(4), 602; https://doi.org/10.3390/math8040602 - 15 Apr 2020
Cited by 38 | Viewed by 3293
Abstract
In this paper, we propose, analyze, and test an alternative method for solving the ℓ1-norm regularization problem for recovering sparse signals and blurred images in compressive sensing. The method is motivated by the recently proposed nonlinear conjugate gradient method of Tang, Li and Cui [Journal of Inequalities and Applications, 2020(1), 27], designed based on the least-squares technique. The proposed method aims to minimize a non-smooth objective consisting of a least-squares data-fitting term and an ℓ1-norm regularization term. The search directions generated by the proposed method are descent directions. In addition, under monotonicity and Lipschitz continuity assumptions, we establish the global convergence of the method. Preliminary numerical results are reported to show the efficiency of the proposed method in practical computation.
(This article belongs to the Special Issue Iterative Methods for Solving Nonlinear Equations and Systems 2020)
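For context, the sketch below shows the generic derivative-free hyperplane-projection framework (in the spirit of Solodov and Svaiter) on which this class of conjugate gradient projection methods for monotone equations is built, using the simplified direction d = −F(x). The paper's least-square-based three-term direction and its line search are not reproduced here.

```python
# Hedged sketch: hyperplane-projection framework for monotone equations F(x) = 0,
# with the simplified direction d = -F(x) (NOT the paper's three-term CG direction).
import numpy as np

def projection_method(F, x0, sigma=1e-4, rho=0.5, tol=1e-8, maxit=500):
    x = x0.astype(float)
    for _ in range(maxit):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx                                   # simplified search direction
        alpha = 1.0
        while -F(x + alpha * d) @ d < sigma * alpha * np.dot(d, d):
            alpha *= rho                          # derivative-free backtracking line search
        z = x + alpha * d
        Fz = F(z)
        if np.linalg.norm(Fz) < tol:
            return z
        zeta = Fz @ (x - z) / (Fz @ Fz)           # project x onto the hyperplane through z
        x = x - zeta * Fz                         #   that separates x from the solution set
    return x

# Monotone test system F(x) = x + sin(x) (componentwise), solution x* = 0.
F = lambda x: x + np.sin(x)
print(np.linalg.norm(projection_method(F, np.full(5, 2.0))))
```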

15 pages, 1655 KiB  
Article
Multipoint Fractional Iterative Methods with (2α + 1)th-Order of Convergence for Solving Nonlinear Problems
by Giro Candelario, Alicia Cordero and Juan R. Torregrosa
Mathematics 2020, 8(3), 452; https://doi.org/10.3390/math8030452 - 20 Mar 2020
Cited by 23 | Viewed by 2894
Abstract
In the recent literature, some fractional one-point Newton-type methods have been proposed in order to find roots of nonlinear equations using fractional derivatives. In this paper, we introduce a new fractional Newton-type method with order of convergence α + 1 and compare it with the existing fractional Newton method with order 2α. Moreover, we also introduce a multipoint fractional Traub-type method with order 2α + 1 and compare its performance with that of its first step. Some numerical tests and an analysis of the dependence on the initial estimations are made for each case, including a comparison with classical Newton’s method (α = 1 in the first step of the class) and classical Traub’s scheme (α = 1 in the proposed multipoint fractional method). In this comparison, some cases are found where classical Newton’s and Traub’s methods do not converge and the proposed methods do, among other advantages.
(This article belongs to the Special Issue Iterative Methods for Solving Nonlinear Equations and Systems 2020)
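As a purely illustrative aid, the sketch below evaluates the Caputo fractional derivative of a polynomial via the power rule D^α x^p = Γ(p+1)/Γ(p−α+1) x^(p−α) and runs a naive iteration x ← x − f(x)/D^α f(x). This naive step is not the method proposed in the paper, whose corrections involve the Gamma function and attain orders α + 1 and 2α + 1; it only shows how the ordinary derivative is replaced by a Caputo derivative.

```python
# Hedged sketch: Caputo derivative of a polynomial (power rule, 0 < alpha < 1,
# base point 0, x > 0) and a NAIVE fractional Newton-type step for illustration only.
from math import gamma

def caputo_poly(coeffs, alpha, x):
    """Caputo derivative at x of sum_k coeffs[k] * x**k (the constant term drops out)."""
    return sum(c * gamma(k + 1) / gamma(k + 1 - alpha) * x ** (k - alpha)
               for k, c in enumerate(coeffs) if k >= 1)

def naive_fractional_newton(coeffs, alpha, x0, tol=1e-10, maxit=200):
    f = lambda x: sum(c * x**k for k, c in enumerate(coeffs))
    x = x0
    for _ in range(maxit):
        step = f(x) / caputo_poly(coeffs, alpha, x)   # f' replaced by D^alpha f
        x -= step
        if abs(step) < tol:
            break
    return x

# f(x) = x^3 - 2 written as coefficients [c0, c1, c2, c3]; root 2**(1/3) ~ 1.2599.
print(naive_fractional_newton([-2.0, 0.0, 0.0, 1.0], alpha=0.9, x0=1.5))
```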

14 pages, 281 KiB  
Article
A Modified Hestenes-Stiefel-Type Derivative-Free Method for Large-Scale Nonlinear Monotone Equations
by Zhifeng Dai and Huan Zhu
Mathematics 2020, 8(2), 168; https://doi.org/10.3390/math8020168 - 30 Jan 2020
Cited by 53 | Viewed by 3361
Abstract
The goal of this paper is to extend the modified Hestenes-Stiefel method to solve large-scale nonlinear monotone equations. The method is presented by combining the hyperplane projection method (Solodov, M.V.; Svaiter, B.F. A globally convergent inexact Newton method for systems of monotone equations, in: M. Fukushima, L. Qi (Eds.), Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, Kluwer Academic Publishers, 1998, pp. 355-369) and the modified Hestenes-Stiefel method of Dai and Wen (Dai, Z.; Wen, F. Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search. Numer. Algor. 2012, 59, 79-93). In addition, we propose a new line search for the derivative-free method. Global convergence of the proposed method is established provided that the system of nonlinear equations is Lipschitz continuous and monotone. Preliminary numerical results are given to test the effectiveness of the proposed method.
(This article belongs to the Special Issue Iterative Methods for Solving Nonlinear Equations and Systems 2020)
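The sketch below shows the classical Hestenes-Stiefel direction written with the residual map F in place of a gradient, which is the starting point that the paper modifies (following Dai and Wen) and couples with a new line search and the Solodov-Svaiter hyperplane projection step illustrated earlier for the ℓ1-norm paper.

```python
# Hedged sketch: classical Hestenes-Stiefel direction for a residual map F,
#   y = F_k - F_{k-1},  beta_HS = (F_k . y) / (d_{k-1} . y),  d_k = -F_k + beta_HS * d_{k-1}.
# The paper's modified beta and line search are not reproduced here.
import numpy as np

def hs_direction(F_k, F_km1, d_km1, eps=1e-12):
    """Hestenes-Stiefel search direction; falls back to -F_k when the
    curvature denominator d_{k-1}.y is too small to be trusted."""
    y = F_k - F_km1
    denom = d_km1 @ y
    if abs(denom) < eps:
        return -F_k
    beta = (F_k @ y) / denom
    return -F_k + beta * d_km1

# Tiny usage example with made-up residuals from two consecutive iterates.
F_km1, F_k = np.array([1.0, -2.0]), np.array([0.4, -0.7])
print(hs_direction(F_k, F_km1, d_km1=-F_km1))
```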