Matrix Equations and Their Algorithms Analysis

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Computational and Applied Mathematics".

Deadline for manuscript submissions: 10 July 2025 | Viewed by 11758

Special Issue Editors


Guest Editor
School of Mathematics and Computational Science, Xiangtan University, Xiangtan 411105, China
Interests: matrix equation; iterative algorithms; numerical analysis; control theory

Guest Editor
School of Mathematics and Computational Science, Xiangtan University, Xiangtan 411105, China
Interests: matrix equation; iterative algorithms; numerical analysis; control theory

Special Issue Information

Dear Colleagues,

As is well known, many problems in several areas, such as linear systems and control theory, can be described in terms of matrix equations or systems of such equations, which can take different forms: algebraic, differential, integral, linear, or nonlinear. For this reason, matrix equations constitute one of the most fruitful fields of study in pure and applied mathematics.

However, direct methods for their effective resolution are usually not available, hence the enormous interest in their numerical treatment.

This Special Issue is intended to collect recent advancements in this area. The topics of interest for the Special Issue include, but are not limited to, numerical methods and algorithm analysis for:

  • Linear matrix equations.
  • Nonlinear matrix equations.
  • Differential matrix equations.
  • Integral matrix equations.
  • Variational matrix equations.
  • Optimization problems.
  • Control problems.
  • Algebraic equations.

In addition to the above-mentioned topics, real-world applications of these methods are also especially welcome.

Prof. Dr. Juan Zhang
Prof. Dr. Jianzhou Liu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • matrix equations
  • numerical methods
  • differential, integral equations
  • optimization problems
  • variational equations
  • control problems
  • algebraic equations

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found on the MDPI website.

Published Papers (7 papers)


Research

15 pages, 284 KiB  
Article
New Results on the Unimodular Equivalence of Multivariate Polynomial Matrices
by Dongmei Li and Zuo Chen
Mathematics 2023, 11(12), 2745; https://doi.org/10.3390/math11122745 - 17 Jun 2023
Viewed by 1014
Abstract
The equivalence of systems is a crucial concept in multidimensional systems. The Smith normal forms of multivariate polynomial matrices play important roles in the theory of polynomial matrices. In this paper, we mainly study the unimodular equivalence of some special kinds of multivariate polynomial matrices and obtain some tractable criteria under which such matrices are unimodularly equivalent to their Smith normal forms. We propose an algorithm for reducing such nD polynomial matrices to their Smith normal forms and present an example to illustrate the applicability of the algorithm. Furthermore, we extend the results to the non-square case. Full article
(This article belongs to the Special Issue Matrix Equations and Their Algorithms Analysis)
19 pages, 311 KiB  
Article
Algebraic Characterizations of Relationships between Different Linear Matrix Functions
by Yongge Tian and Ruixia Yuan
Mathematics 2023, 11(3), 756; https://doi.org/10.3390/math11030756 - 2 Feb 2023
Cited by 1 | Viewed by 1249
Abstract
Let f(X1, X2, …, Xk) be a matrix function over the field of complex numbers, where X1, X2, …, Xk are a family of matrices with variable entries. The purpose of this paper is to propose and investigate the relationships between certain linear matrix functions that regularly appear in matrix theory and its applications. We shall derive a series of meaningful, necessary, and sufficient conditions for the collections of values of two given matrix functions to be equal through the cogent use of some highly selective formulas and facts regarding ranks, ranges, and generalized inverses of block matrix operations. As applications, we discuss some concrete topics concerning the algebraic connections between general solutions of a given linear matrix equation and its reduced equations. Full article
(This article belongs to the Special Issue Matrix Equations and Their Algorithms Analysis)
24 pages, 1524 KiB  
Article
A Data-Driven Parameter Prediction Method for HSS-Type Methods
by Kai Jiang, Jianghao Su and Juan Zhang
Mathematics 2022, 10(20), 3789; https://doi.org/10.3390/math10203789 - 14 Oct 2022
Viewed by 1563
Abstract
Some matrix-splitting iterative methods for solving systems of linear equations contain parameters that need to be specified in advance, and the choice of these parameters directly affects the efficiency of the corresponding iterative methods. This paper uses a Bayesian inference-based Gaussian process regression (GPR) method to predict the relatively optimal parameters of some HSS-type iteration methods and provides extensive numerical experiments to compare the prediction performance of the GPR method with other existing methods. Numerical results show that, compared with the currently available methods for finding the parameters of HSS-type iteration methods, using GPR to predict the parameters of matrix-splitting iterative methods offers lower computational effort, better predicted parameters, and wider applicability. Full article
(This article belongs to the Special Issue Matrix Equations and Their Algorithms Analysis)
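For context on the methods whose parameters the paper tunes, here is a minimal NumPy sketch of the classical HSS iteration of Bai, Golub, and Ng, which alternates solves with the Hermitian and skew-Hermitian parts of the coefficient matrix; the function name and the fixed shift `alpha` are illustrative assumptions, not taken from the article (the article's point is precisely that choosing `alpha` well is hard):

```python
import numpy as np

def hss_iteration(A, b, alpha, tol=1e-10, max_iter=500):
    """Classical HSS iteration for A x = b with A = H + S,
    H = (A + A*)/2 Hermitian, S = (A - A*)/2 skew-Hermitian.
    Converges for any alpha > 0 when H is positive definite."""
    n = A.shape[0]
    H = (A + A.conj().T) / 2
    S = (A - A.conj().T) / 2
    I = np.eye(n)
    x = np.zeros(n)
    for k in range(max_iter):
        # Half-step with the Hermitian part, then with the skew part.
        x_half = np.linalg.solve(alpha * I + H, (alpha * I - S) @ x + b)
        x = np.linalg.solve(alpha * I + S, (alpha * I - H) @ x_half + b)
        if np.linalg.norm(A @ x - b) <= tol * np.linalg.norm(b):
            return x, k + 1
    return x, max_iter
```

The convergence factor depends only on `alpha` and the eigenvalues of H, which is why data-driven prediction of a good `alpha`, as in the paper, can pay off.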

19 pages, 400 KiB  
Article
Solution Bounds and Numerical Methods of the Unified Algebraic Lyapunov Equation
by Juan Zhang, Shifeng Li and Xiangyang Gan
Mathematics 2022, 10(16), 2858; https://doi.org/10.3390/math10162858 - 11 Aug 2022
Viewed by 1515
Abstract
In this paper, applying some properties of matrix inequalities and the Schur complement, we give new upper and lower bounds on the solution of the unified algebraic Lyapunov equation, which generalizes the forms of the discrete and continuous Lyapunov matrix equations. We show that its positive definite solution exists and is unique under certain conditions. Meanwhile, we present three numerical algorithms, namely the fixed point iterative method, an accelerated fixed point method, and the alternating direction implicit method, to solve the unified algebraic Lyapunov equation. The convergence analysis of these algorithms is discussed. Finally, some numerical examples are presented to verify the derived upper and lower bounds and the numerical algorithms. Full article
(This article belongs to the Special Issue Matrix Equations and Their Algorithms Analysis)
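The unified equation of the paper is not reproduced here, but its discrete special case A^T X A - X + Q = 0 admits the textbook fixed point iteration that the first of the paper's three algorithms generalizes. A minimal NumPy sketch, assuming the spectral radius of A is below 1 (function and variable names are illustrative):

```python
import numpy as np

def discrete_lyapunov_fixed_point(A, Q, tol=1e-12, max_iter=1000):
    """Fixed point iteration X_{k+1} = A^T X_k A + Q for the discrete
    Lyapunov equation A^T X A - X + Q = 0. Converges (linearly, with
    rate rho(A)^2) when the spectral radius of A is less than 1."""
    X = Q.copy()
    for _ in range(max_iter):
        X_new = A.T @ X @ A + Q
        if np.linalg.norm(X_new - X, 'fro') < tol:
            return X_new
        X = X_new
    return X
```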

18 pages, 4185 KiB  
Article
Block Kaczmarz–Motzkin Method via Mean Shift Clustering
by Yimou Liao, Tianxiu Lu and Feng Yin
Mathematics 2022, 10(14), 2408; https://doi.org/10.3390/math10142408 - 9 Jul 2022
Cited by 3 | Viewed by 1921
Abstract
Solving systems of linear equations is a fundamental problem in mathematics. Combining mean shift clustering (MS) with greedy techniques, a novel block version of the Kaczmarz–Motzkin method (BKMS), where the blocks are predetermined by MS clustering, is proposed in this paper. The greedy strategy, which collects the row indices with nearly maximal distances in the linear subsystem at each iteration, makes the method an efficient extension of the sampling Kaczmarz–Motzkin algorithm (SKM). The new method converges linearly to the least-norm solution when the system is consistent. Several examples show that the BKMS algorithm is more efficient than other methods (for example, RK, Motzkin, GRK, SKM, RBK, and GRBK). Full article
(This article belongs to the Special Issue Matrix Equations and Their Algorithms Analysis)
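As a baseline for the block method above, here is the single-row randomized Kaczmarz method (RK) that BKMS and SKM build on: each step projects the iterate onto the hyperplane of one row, sampled with probability proportional to the squared row norm. A minimal sketch with assumed names, not the paper's implementation:

```python
import numpy as np

def randomized_kaczmarz(A, b, n_iter=5000, seed=0):
    """Randomized Kaczmarz for a consistent system A x = b.
    Rows are sampled with probability proportional to ||a_i||^2;
    each step projects x onto the hyperplane a_i^T x = b_i."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.sum(A * A, axis=1)
    probs = row_norms / row_norms.sum()
    x = np.zeros(n)
    for _ in range(n_iter):
        i = rng.choice(m, p=probs)
        # Orthogonal projection onto the i-th row's hyperplane.
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x
```

Starting from the zero vector, the iterates stay in the row space of A, which is why the method converges to the least-norm solution for consistent systems, the same solution the BKMS abstract refers to.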

11 pages, 617 KiB  
Article
Constructing a Matrix Mid-Point Iterative Method for Matrix Square Roots and Applications
by Javad Golzarpoor, Dilan Ahmed and Stanford Shateyi
Mathematics 2022, 10(13), 2200; https://doi.org/10.3390/math10132200 - 24 Jun 2022
Cited by 1 | Viewed by 1657
Abstract
In this paper, an improvement to the mid-point method is proposed for finding the square root of a matrix as well as its inverse. To this end, an iteration scheme for computing this matrix function is constructed, and its error and stability estimates are provided to show the theoretical rate of convergence. Our higher-order method can compete with existing iterative methods of a similar nature, as illustrated in numerical simulations of various sizes. Full article
(This article belongs to the Special Issue Matrix Equations and Their Algorithms Analysis)
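The paper's mid-point scheme itself is not reproduced here; as a reference point of "a similar nature", the classical Denman–Beavers iteration likewise delivers the matrix square root and its inverse simultaneously. A minimal sketch, valid for matrices with no eigenvalues on the closed negative real axis (names are illustrative):

```python
import numpy as np

def denman_beavers_sqrt(A, n_iter=25):
    """Denman–Beavers iteration: Y_k -> A^{1/2} and Z_k -> A^{-1/2},
    with quadratic local convergence. Each step averages an iterate
    with the inverse of its partner."""
    Y = A.astype(float).copy()
    Z = np.eye(A.shape[0])
    for _ in range(n_iter):
        # Both updates use the previous (Y, Z) pair.
        Y, Z = (Y + np.linalg.inv(Z)) / 2, (Z + np.linalg.inv(Y)) / 2
    return Y, Z
```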

13 pages, 348 KiB  
Article
Efficient Reduction Algorithms for Banded Symmetric Generalized Eigenproblems via Sequentially Semiseparable (SSS) Matrices
by Fan Yuan, Shengguo Li, Hao Jiang, Hongxia Wang, Cheng Chen, Lei Du and Bo Yang
Mathematics 2022, 10(10), 1676; https://doi.org/10.3390/math10101676 - 13 May 2022
Viewed by 1537
Abstract
In this paper, a novel algorithm is proposed for reducing a banded symmetric generalized eigenvalue problem to a banded symmetric standard eigenvalue problem, based on sequentially semiseparable (SSS) matrix techniques. It is the first time that SSS matrix techniques have been used in such eigenvalue problems. The newly proposed algorithm requires only linear storage cost and O(n²) computation cost for matrices of dimension n, and is also potentially well suited for parallelism. Experiments have been performed in MATLAB, and the accuracy and stability of the algorithm are verified. Full article
(This article belongs to the Special Issue Matrix Equations and Their Algorithms Analysis)
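The SSS-based reduction is not sketched here, but the classical dense reduction it accelerates is short: factor B = L L^T by Cholesky and form C = L^{-1} A L^{-T}, whose standard eigenvalues equal the generalized eigenvalues of (A, B). A minimal NumPy sketch under the assumption that B is symmetric positive definite (names are illustrative):

```python
import numpy as np

def reduce_generalized_to_standard(A, B):
    """Reduce the symmetric generalized problem A x = lambda B x
    (B symmetric positive definite) to the standard symmetric problem
    C y = lambda y via B = L L^T and C = L^{-1} A L^{-T}."""
    L = np.linalg.cholesky(B)
    W = np.linalg.solve(L, A)        # W = L^{-1} A
    C = np.linalg.solve(L, W.T).T    # C = W L^{-T}, still symmetric
    return C, L
```

The dense version costs O(n³) and destroys bandedness; keeping the reduced matrix banded at O(n²) cost is precisely what the paper's SSS representation achieves.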