Article

The Forward Order Law for Least Squares g-Inverse of Multiple Matrix Products

School of Mathematics and Computational Science, Wuyi University, Jiangmen 529020, China
* Author to whom correspondence should be addressed.
Mathematics 2019, 7(3), 277; https://doi.org/10.3390/math7030277
Submission received: 25 January 2019 / Revised: 9 March 2019 / Accepted: 13 March 2019 / Published: 19 March 2019
(This article belongs to the Section Mathematics and Computer Science)

Abstract

The generalized inverse has many important applications in theoretical research on matrices and in statistics. One of the core problems of the generalized inverse is finding the necessary and sufficient conditions for the forward order laws for the generalized inverse of a matrix product. In this paper, by using the extremal ranks of the generalized Schur complement, we obtain some necessary and sufficient conditions for the forward order laws $A_1\{1,3\}A_2\{1,3\}\cdots A_n\{1,3\}\subseteq(A_1A_2\cdots A_n)\{1,3\}$ and $A_1\{1,4\}A_2\{1,4\}\cdots A_n\{1,4\}\subseteq(A_1A_2\cdots A_n)\{1,4\}$.

1. Introduction

Throughout this paper, $\mathbb{C}^{m\times n}$ denotes the set of $m\times n$ matrices with complex entries and $\mathbb{C}^m$ denotes the set of $m$-dimensional complex vectors. $I_m$ denotes the identity matrix of order $m$, and $O_{m\times n}$ is the $m\times n$ matrix of all zero entries (if no confusion occurs, we will drop the subscript). For a matrix $A\in\mathbb{C}^{m\times n}$, $A^*$, $r(A)$, $R(A)$, and $N(A)$ denote the conjugate transpose, the rank, the range space, and the null space of $A$, respectively. Furthermore, for the sake of simplicity in the later discussion, we will adopt the following notation for the matrix products built from $A_i\in\mathbb{C}^{m\times m}$:
$$\mathcal{A}_i = A_nA_{n-1}\cdots A_{n-i}, \qquad i = 0,1,2,\dots,n-1. \tag{1}$$
Let $A\in\mathbb{C}^{m\times n}$. A generalized inverse $X\in\mathbb{C}^{n\times m}$ of $A$ is a matrix that satisfies some of the following four Penrose equations [1]:
$$(1)\ AXA=A,\qquad (2)\ XAX=X,\qquad (3)\ (AX)^*=AX,\qquad (4)\ (XA)^*=XA.$$
For a subset $\eta\subseteq\{1,2,3,4\}$, the set of $n\times m$ matrices satisfying the equations contained in $\eta$ is denoted by $A\{\eta\}$. A matrix from $A\{\eta\}$ is called an $\{\eta\}$-inverse of $A$ and is denoted by $A^{(\eta)}$. For example, an $n\times m$ matrix $X$ of the set $A\{1\}$ is called a $\{1\}$-inverse of $A$ and is denoted by $X=A^{(1)}$. One usually denotes any $\{1,3\}$-inverse of $A$ as $A^{(1,3)}$, which is also called a least squares g-inverse of $A$. Any $\{1,4\}$-inverse of the set $A\{1,4\}$ is denoted by $A^{(1,4)}$, which is also called a minimum norm g-inverse of $A$. The unique $\{1,2,3,4\}$-inverse of $A$ is denoted by $A^{\dagger}$, which is also called the Moore–Penrose inverse of $A$. We refer the reader to [2,3] for basic results on generalized inverses.
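The four Penrose equations above are easy to check numerically. The following sketch (in NumPy, over the reals rather than $\mathbb{C}$; the matrix is invented for the example) verifies that the Moore–Penrose inverse satisfies all four, which is what makes it the unique $\{1,2,3,4\}$-inverse:

```python
import numpy as np

# Check the four Penrose equations for X = pinv(A), the Moore-Penrose
# inverse of A (real case: conjugate transpose reduces to transpose).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
X = np.linalg.pinv(A)

eq1 = np.allclose(A @ X @ A, A)      # (1) AXA = A
eq2 = np.allclose(X @ A @ X, X)      # (2) XAX = X
eq3 = np.allclose((A @ X).T, A @ X)  # (3) (AX)* = AX
eq4 = np.allclose((X @ A).T, X @ A)  # (4) (XA)* = XA
print(eq1, eq2, eq3, eq4)            # prints: True True True True
```

A matrix satisfying only equations (1) and (3) (a least squares g-inverse) is generally not unique, which is why the paper works with the whole sets $A\{1,3\}$ and $A\{1,4\}$.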
Theory and computations of the reverse (forward) order laws for generalized inverses are important in many branches of the applied sciences, such as nonlinear control theory [4], matrix analysis [2,5], statistics [4,6], and numerical linear algebra; see [3,6]. Suppose $A_i\in\mathbb{C}^{m\times m}$, $i=1,2,\dots,n$, and $b\in\mathbb{C}^m$; the least squares (LS) problem:
$$\min_{x\in\mathbb{C}^m}\ \big\|(A_1A_2\cdots A_n)x-b\big\|_2 \tag{2}$$
is used in many practical scientific problems [2,4,6,7]. Any solution $x$ of the above LS problem can be expressed as $x=(A_1A_2\cdots A_n)^{(1,3)}b$. If the LS problem is consistent, the minimum norm solution has the form $x=(A_1A_2\cdots A_n)^{(1,4)}b$. The unique minimum norm least squares solution of the LS problem is $x=(A_1A_2\cdots A_n)^{\dagger}b$. One problem concerning the above LS problem is: under what conditions do the following reverse order laws hold?
$$A_n^{(i,j,\dots,k)}A_{n-1}^{(i,j,\dots,k)}\cdots A_1^{(i,j,\dots,k)}=(A_1A_2\cdots A_n)^{(i,j,\dots,k)}. \tag{3}$$
Another problem is: under what conditions does the following forward order law hold?
$$A_1^{(i,j,\dots,k)}A_2^{(i,j,\dots,k)}\cdots A_n^{(i,j,\dots,k)}=(A_1A_2\cdots A_n)^{(i,j,\dots,k)}, \tag{4}$$
where $\eta=\{i,j,\dots,k\}\subseteq\{1,2,3,4\}$.
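The LS interpretation above can be sketched numerically. In the following NumPy example (invented data; one factor is made rank-deficient so that the LS problem is genuinely non-trivial), the Moore–Penrose inverse of the product — which is in particular a $\{1,3\}$-inverse — reproduces the minimum norm least squares solution returned by a standard LS solver:

```python
import numpy as np

# Sketch: x = (A1 A2)^dagger b solves min ||(A1 A2) x - b||_2 with minimum
# norm. A2 is chosen rank-deficient so the product is singular.
rng = np.random.default_rng(1)
A1 = rng.standard_normal((5, 5))
A2 = np.diag([1.0, 2.0, 3.0, 0.0, 0.0])        # rank-3 factor
b = rng.standard_normal(5)

M = A1 @ A2                                    # product A1 A2, rank 3
x = np.linalg.pinv(M) @ b                      # x = (A1 A2)^dagger b
x_ls, *_ = np.linalg.lstsq(M, b, rcond=None)   # reference min-norm LS solution

assert np.allclose(x, x_ls)
```

Replacing `pinv(M)` by any other $\{1,3\}$-inverse of `M` would still minimize the residual, but generally without the minimum norm property.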
The reverse order law for the generalized inverse of multiple matrix products yields a class of interesting problems that are fundamental in the theory of the generalized inverse of matrices; see [2,3,5,8,9]. As one of the core problems concerning reverse order laws, finding the necessary and sufficient conditions under which the reverse order laws for the generalized inverse of a matrix product hold is useful in both theoretical study and practical scientific computing. This problem has attracted considerable attention, and many interesting results have been obtained; see [8,10,11,12,13,14,15,16,17,18].
The forward order law for the generalized inverse of multiple matrix products (4) originally arose in studying the inverse of multiple matrix Kronecker products. Let $A_i$, $i=1,2,\dots,n$, be $n$ nonsingular matrices; then the Kronecker product $A_1\otimes A_2\otimes\cdots\otimes A_n$ is nonsingular as well, and its inverse satisfies the forward order law $A_1^{-1}\otimes A_2^{-1}\otimes\cdots\otimes A_n^{-1}=(A_1\otimes A_2\otimes\cdots\otimes A_n)^{-1}$. However, this so-called forward order law is not necessarily true for the ordinary matrix product, that is, in general $A_1^{-1}A_2^{-1}\cdots A_n^{-1}\neq(A_1A_2\cdots A_n)^{-1}$. An interesting problem is, for a given $\eta=\{i,j,\dots,k\}$ and matrices $A_i$, $i=1,2,\dots,n$, with $A_1A_2\cdots A_n$ meaningful, when
$$A_1^{(i,j,\dots,k)}A_2^{(i,j,\dots,k)}\cdots A_n^{(i,j,\dots,k)}=(A_1A_2\cdots A_n)^{(i,j,\dots,k)}$$
holds, or when
$$A_1\{i,j,\dots,k\}A_2\{i,j,\dots,k\}\cdots A_n\{i,j,\dots,k\}\subseteq(A_1A_2\cdots A_n)\{i,j,\dots,k\}$$
holds.
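The contrast between the two settings is easy to reproduce. The following sketch (NumPy, invented generic matrices) confirms that the forward order law holds exactly for the Kronecker product but fails for the ordinary product of two generic nonsingular matrices:

```python
import numpy as np

# Motivating observation: forward order law for inverses holds for the
# Kronecker product, but fails in general for the ordinary matrix product.
rng = np.random.default_rng(2)
A1 = rng.standard_normal((3, 3))
A2 = rng.standard_normal((3, 3))   # generic, hence nonsingular a.s.

# Kronecker product: A1^{-1} (x) A2^{-1} = (A1 (x) A2)^{-1}
kron_forward = np.kron(np.linalg.inv(A1), np.linalg.inv(A2))
kron_inverse = np.linalg.inv(np.kron(A1, A2))
assert np.allclose(kron_forward, kron_inverse)

# Ordinary product: A1^{-1} A2^{-1} != (A1 A2)^{-1} for generic A1, A2,
# since (A1 A2)^{-1} = A2^{-1} A1^{-1} (reverse order).
forward = np.linalg.inv(A1) @ np.linalg.inv(A2)
true_inv = np.linalg.inv(A1 @ A2)
assert not np.allclose(forward, true_inv)
```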
Recently, Z. Liu and Z. Xiong [19,20] studied the forward order law for $\{1,3\}$-inverses of three matrix products by using the maximal rank of the generalized Schur complement [21], and some necessary and sufficient conditions for $A_1\{1,3\}A_2\{1,3\}A_3\{1,3\}\subseteq(A_1A_2A_3)\{1,3\}$ were derived. In this paper, we further study this subject, and some necessary and sufficient conditions in terms of the ranks, the ranges, or the null spaces of the known matrices are provided for the following forward order laws:
$$A_1\{1,3\}A_2\{1,3\}\cdots A_n\{1,3\}\subseteq(A_1A_2\cdots A_n)\{1,3\} \tag{5}$$
and:
$$A_1\{1,4\}A_2\{1,4\}\cdots A_n\{1,4\}\subseteq(A_1A_2\cdots A_n)\{1,4\}. \tag{6}$$
The main tools of the later discussion are the following lemmas.
Lemma 1
([21]). Let $A\in\mathbb{C}^{m\times n}$, $B\in\mathbb{C}^{m\times k}$, $C\in\mathbb{C}^{l\times n}$, and $D\in\mathbb{C}^{l\times k}$. Then:
$$\max_{A^{(1,3)}} r\big(D-CA^{(1,3)}B\big)=\min\left\{r\begin{bmatrix}A^*A & A^*B\\ C & D\end{bmatrix}-r(A),\ \ r\begin{bmatrix}B\\ D\end{bmatrix}\right\}.$$
Lemma 2
([5]). Let $A\in\mathbb{C}^{m\times n}$, $B\in\mathbb{C}^{m\times k}$, and $C\in\mathbb{C}^{p\times n}$. Then:
$$(1)\ \ r[A,\ B]=r(A)+r(E_AB)=r(B)+r(E_BA),$$
$$(2)\ \ r\begin{bmatrix}A\\ C\end{bmatrix}=r(A)+r(CF_A)=r(C)+r(AF_C),$$
where the projectors $E_A=I-AA^{\dagger}$, $E_B=I-BB^{\dagger}$, $F_A=I-A^{\dagger}A$, and $F_C=I-C^{\dagger}C$.
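The rank additivity in Lemma 2 can be sanity-checked numerically. The sketch below (NumPy, invented generic matrices) verifies formula (1) in both of its forms:

```python
import numpy as np

# Numerical check of Lemma 2(1): r[A, B] = r(A) + r(E_A B) = r(B) + r(E_B A),
# where E_A = I - A A^dagger projects onto the orthogonal complement of R(A).
rng = np.random.default_rng(3)
A = rng.standard_normal((4, 2))
B = rng.standard_normal((4, 3))
EA = np.eye(4) - A @ np.linalg.pinv(A)
EB = np.eye(4) - B @ np.linalg.pinv(B)

lhs = np.linalg.matrix_rank(np.hstack([A, B]))
rhs1 = np.linalg.matrix_rank(A) + np.linalg.matrix_rank(EA @ B)
rhs2 = np.linalg.matrix_rank(B) + np.linalg.matrix_rank(EB @ A)
assert lhs == rhs1 == rhs2
```

Formula (2) is the same identity applied to the column block $\begin{bmatrix}A\\C\end{bmatrix}$, with the projectors $F_A$, $F_C$ acting on the right.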
Lemma 3
([2,3]). Let $L$, $M$ be complementary subspaces of $\mathbb{C}^n$, and let $P_{L,M}$ be the projector on $L$ along $M$. Then:
$$(1)\ \ P_{L,M}A=A \iff R(A)\subseteq L,$$
$$(2)\ \ AP_{L,M}=A \iff N(A)\supseteq M.$$
Lemma 4
([2,3]). Let $A\in\mathbb{C}^{m\times n}$ and $X\in\mathbb{C}^{n\times m}$. Then:
$$(1)\ \ X\in A\{1,3\} \iff A^*AX=A^*,$$
$$(2)\ \ X\in A\{1,4\} \iff XAA^*=A^*.$$
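Lemma 4 is the key tool of Section 2, and it is easy to test. The sketch below uses the standard parametrization $A\{1,3\}=\{A^{\dagger}+(I-A^{\dagger}A)Z\}$ (an assumption of this sketch, not stated in the text) to generate a generic $\{1,3\}$-inverse of a rank-deficient matrix and checks the characterization $A^*AX=A^*$:

```python
import numpy as np

# Lemma 4(1): X is in A{1,3} iff A* A X = A* (real case).
# A{1,3} is sampled via X = A^dagger + (I - A^dagger A) Z, Z arbitrary
# (standard parametrization, assumed here).
rng = np.random.default_rng(4)
U = rng.standard_normal((4, 2))
V = rng.standard_normal((2, 5))
A = U @ V                               # a rank-2 4x5 matrix
Ap = np.linalg.pinv(A)
Z = rng.standard_normal((5, 4))
X = Ap + (np.eye(5) - Ap @ A) @ Z       # a generic element of A{1,3}

assert np.allclose(A.T @ A @ X, A.T)    # Lemma 4(1)
assert np.allclose(A @ X @ A, A)        # Penrose (1)
assert np.allclose((A @ X).T, A @ X)    # Penrose (3)
assert not np.allclose(X, Ap)           # ... yet X is not the M-P inverse
```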

2. Main Results

In this section, we will present some necessary and sufficient conditions for the forward order law (5) by using the maximal ranks of some generalized Schur complement forms. Let:
$$S_{(A_1A_2\cdots A_n)^*}=S_{\mu^*}=(A_1A_2\cdots A_n)^*-(A_1A_2\cdots A_n)^*(A_1A_2\cdots A_n)X_1X_2\cdots X_n=\mu^*-\mu^*\mu X_1X_2\cdots X_n, \tag{7}$$
where $A_i\in\mathbb{C}^{m\times m}$, $X_i\in A_i\{1,3\}$, $i=1,2,\dots,n$, and $\mu=A_1A_2\cdots A_n$.
For the convenience of the reader, we first give a brief outline of the basic idea. From formula (1) in Lemma 4, we know that the forward order law (5) holds if and only if
$$\mu^*\mu X_1X_2\cdots X_n=\mu^*$$
holds for any $X_i\in A_i\{1,3\}$, $i=1,2,\dots,n$, with $\mu=A_1A_2\cdots A_n$, which is equivalent to the following identity:
$$\max_{X_1,X_2,\dots,X_n} r\big(S_{(A_1A_2\cdots A_n)^*}\big)=\max_{X_1,X_2,\dots,X_n} r\big(S_{\mu^*}\big)=0. \tag{8}$$
Hence, we can present equivalent conditions for the forward order law (5) once the concrete expression of the maximal rank involved in the identity (8) is derived. The relevant results are included in the following lemma.
Lemma 5.
Let $A_i\in\mathbb{C}^{m\times m}$ and $X_i\in A_i\{1,3\}$, $i=1,2,\dots,n$. Let $\mathcal{A}_i$ be as in (1), let $S_{\mu^*}$ be as in (7), and let $E_{A_i}=I-A_iA_i^{\dagger}$, $i=1,2,\dots,n$, be $n$ projectors. Then:
$$\begin{aligned}
\max_{X_1,X_2,\dots,X_n} r\big(S_{(A_1A_2\cdots A_n)^*}\big)&=\max_{X_1,X_2,\dots,X_n} r\big(S_{\mu^*}\big)\\
&=r\big[\mu^*\mu-\mu^*\mathcal{A}_{n-1},\ \mu^*\mathcal{A}_{n-2}E_{A_1},\ \mu^*\mathcal{A}_{n-3}E_{A_2},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]\\
&=r\begin{bmatrix}
\mu^*\mu-\mu^*\mathcal{A}_{n-1} & \mu^*\mathcal{A}_{n-2} & \mu^*\mathcal{A}_{n-3} & \cdots & \mu^*\mathcal{A}_0 & \mu^*\\
O & A_1^* & O & \cdots & O & O\\
O & O & A_2^* & \cdots & O & O\\
\vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\
O & O & O & \cdots & A_{n-1}^* & O\\
O & O & O & \cdots & O & A_n^*
\end{bmatrix}-\sum_{i=1}^{n} r(A_i).
\end{aligned}$$
Proof. 
By Lemma 1 and formula (2) of Lemma 2, we have:
$$\begin{aligned}
\max_{X_n} r\big(S_{(A_1A_2\cdots A_n)^*}\big)&=\max_{X_n} r\big(S_{\mu^*}\big)=\max_{X_n} r\big(\mu^*-\mu^*\mu X_1X_2\cdots X_n\big)\\
&=\min\left\{r\begin{bmatrix}A_n^*A_n & A_n^*\\ \mu^*\mu X_1X_2\cdots X_{n-1} & \mu^*\end{bmatrix}-r(A_n),\ r\begin{bmatrix}I_m\\ \mu^*\end{bmatrix}\right\}\\
&=\min\left\{r\begin{bmatrix}O & A_n^*\\ \mu^*\mu X_1X_2\cdots X_{n-1}-\mu^*A_n & \mu^*\end{bmatrix}-r(A_n),\ m\right\}\\
&=r\big(\big[\mu^*\mu X_1X_2\cdots X_{n-1}-\mu^*A_n,\ \mu^*\big]\,F_{[O,\ A_n^*]}\big)\\
&=r\big[\mu^*\mu X_1X_2\cdots X_{n-1}-\mu^*A_n,\ \mu^*E_{A_n}\big]\\
&=r\big(\big[\mu^*A_n,\ \mu^*E_{A_n}\big]-\mu^*\mu X_1X_2\cdots X_{n-1}\,[I_m,\ O]\big).
\end{aligned}\tag{9}$$
According to Lemma 1, formula (2) of Lemma 2, and Equations (1) and (9), we have:
$$\begin{aligned}
\max_{X_{n-1},X_n} r\big(S_{(A_1A_2\cdots A_n)^*}\big)&=\max_{X_{n-1},X_n} r\big(S_{\mu^*}\big)=\max_{X_{n-1}} r\big(\big[\mu^*A_n,\ \mu^*E_{A_n}\big]-\mu^*\mu X_1X_2\cdots X_{n-1}\,[I_m,\ O]\big)\\
&=\min\left\{r\begin{bmatrix}A_{n-1}^*A_{n-1} & A_{n-1}^* & O\\ \mu^*\mu X_1X_2\cdots X_{n-2} & \mu^*A_n & \mu^*E_{A_n}\end{bmatrix}-r(A_{n-1}),\ r\begin{bmatrix}I_m & O\\ \mu^*A_n & \mu^*E_{A_n}\end{bmatrix}\right\}\\
&=\min\left\{r\begin{bmatrix}O & A_{n-1}^* & O\\ \mu^*\mu X_1\cdots X_{n-2}-\mu^*\mathcal{A}_1 & \mu^*A_n & \mu^*E_{A_n}\end{bmatrix}-r(A_{n-1}),\ m+r\big(\mu^*E_{A_n}\big)\right\}\\
&=r\big(\big[\mu^*\mu X_1X_2\cdots X_{n-2}-\mu^*\mathcal{A}_1,\ \mu^*A_n,\ \mu^*E_{A_n}\big]\,F_{[O,\ A_{n-1}^*,\ O]}\big)\\
&=r\big[\mu^*\mu X_1X_2\cdots X_{n-2}-\mu^*\mathcal{A}_1,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]\\
&=r\big(\big[\mu^*\mathcal{A}_1,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]-\mu^*\mu X_1X_2\cdots X_{n-2}\,[I_m,\ O,\ O]\big).
\end{aligned}\tag{10}$$
Again, by Lemma 1, formula (2) of Lemma 2, and the results in (1) and (10), we have:
$$\begin{aligned}
\max_{X_{n-2},X_{n-1},X_n} r\big(S_{\mu^*}\big)&=\max_{X_{n-2}} r\big(\big[\mu^*\mathcal{A}_1,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]-\mu^*\mu X_1X_2\cdots X_{n-2}\,[I_m,\ O,\ O]\big)\\
&=\min\left\{r\begin{bmatrix}A_{n-2}^*A_{n-2} & A_{n-2}^* & O & O\\ \mu^*\mu X_1\cdots X_{n-3} & \mu^*\mathcal{A}_1 & \mu^*\mathcal{A}_0E_{A_{n-1}} & \mu^*E_{A_n}\end{bmatrix}-r(A_{n-2}),\ r\begin{bmatrix}I_m & O & O\\ \mu^*\mathcal{A}_1 & \mu^*\mathcal{A}_0E_{A_{n-1}} & \mu^*E_{A_n}\end{bmatrix}\right\}\\
&=\min\left\{r\begin{bmatrix}O & A_{n-2}^* & O & O\\ \mu^*\mu X_1\cdots X_{n-3}-\mu^*\mathcal{A}_2 & \mu^*\mathcal{A}_1 & \mu^*\mathcal{A}_0E_{A_{n-1}} & \mu^*E_{A_n}\end{bmatrix}-r(A_{n-2}),\ m+r\big[\mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]\right\}\\
&=r\big(\big[\mu^*\mu X_1\cdots X_{n-3}-\mu^*\mathcal{A}_2,\ \mu^*\mathcal{A}_1,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]\,F_{[O,\ A_{n-2}^*,\ O,\ O]}\big)\\
&=r\big[\mu^*\mu X_1X_2\cdots X_{n-3}-\mu^*\mathcal{A}_2,\ \mu^*\mathcal{A}_1E_{A_{n-2}},\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]\\
&=r\big(\big[\mu^*\mathcal{A}_2,\ \mu^*\mathcal{A}_1E_{A_{n-2}},\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]-\mu^*\mu X_1\cdots X_{n-3}\,[I_m,\ O,\ O,\ O]\big).
\end{aligned}\tag{11}$$
Suppose $X_0=I_m$. We claim that, for $2\le i\le n-1$:
$$\begin{aligned}
\max_{X_{n-i},X_{n-i+1},\dots,X_n} r\big(S_{(A_1A_2\cdots A_n)^*}\big)&=\max_{X_{n-i},X_{n-i+1},\dots,X_n} r\big(S_{\mu^*}\big)\\
&=\max_{X_{n-i}} r\big(\big[\mu^*\mathcal{A}_{i-1},\ \mu^*\mathcal{A}_{i-2}E_{A_{n-i+1}},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]-\mu^*\mu X_1X_2\cdots X_{n-i}\,[I_m,\ O,\ \dots,\ O]\big)\\
&=r\big(\big[\mu^*\mathcal{A}_i,\ \mu^*\mathcal{A}_{i-1}E_{A_{n-i}},\ \mu^*\mathcal{A}_{i-2}E_{A_{n-i+1}},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]-\mu^*\mu X_1X_2\cdots X_{n-i-1}\,[I_m,\ O,\ \dots,\ O]\big).
\end{aligned}\tag{12}$$
We proceed by induction on $i$. For $i=2$, the equality (12) is exactly (11), so it has been proven. Assume that (12) is true for $i-1$ ($i\ge 3$), that is:
$$\begin{aligned}
\max_{X_{n-i+1},X_{n-i+2},\dots,X_n} r\big(S_{(A_1A_2\cdots A_n)^*}\big)&=\max_{X_{n-i+1},X_{n-i+2},\dots,X_n} r\big(S_{\mu^*}\big)\\
&=\max_{X_{n-i+1}} r\big(\big[\mu^*\mathcal{A}_{i-2},\ \mu^*\mathcal{A}_{i-3}E_{A_{n-i+2}},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]-\mu^*\mu X_1X_2\cdots X_{n-i+1}\,[I_m,\ O,\ \dots,\ O]\big)\\
&=r\big(\big[\mu^*\mathcal{A}_{i-1},\ \mu^*\mathcal{A}_{i-2}E_{A_{n-i+1}},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]-\mu^*\mu X_1X_2\cdots X_{n-i}\,[I_m,\ O,\ \dots,\ O]\big).
\end{aligned}\tag{13}$$
Next, we will prove that (12) is also true for $i$. In fact, by Lemma 1, formula (2) of Lemma 2, and the results in (13) and (1), we have:
$$\begin{aligned}
\max_{X_{n-i},\dots,X_n} r\big(S_{\mu^*}\big)&=\max_{X_{n-i}} r\big(\big[\mu^*\mathcal{A}_{i-1},\ \mu^*\mathcal{A}_{i-2}E_{A_{n-i+1}},\ \dots,\ \mu^*E_{A_n}\big]-\mu^*\mu X_1X_2\cdots X_{n-i}\,[I_m,\ O,\ \dots,\ O]\big)\\
&=\min\left\{r\begin{bmatrix}A_{n-i}^*A_{n-i} & A_{n-i}^* & O & \cdots & O\\ \mu^*\mu X_1\cdots X_{n-i-1} & \mu^*\mathcal{A}_{i-1} & \mu^*\mathcal{A}_{i-2}E_{A_{n-i+1}} & \cdots & \mu^*E_{A_n}\end{bmatrix}-r(A_{n-i}),\ r\begin{bmatrix}I_m & O & \cdots & O\\ \mu^*\mathcal{A}_{i-1} & \mu^*\mathcal{A}_{i-2}E_{A_{n-i+1}} & \cdots & \mu^*E_{A_n}\end{bmatrix}\right\}\\
&=\min\left\{r\begin{bmatrix}O & A_{n-i}^* & O & \cdots & O\\ \mu^*\mu X_1\cdots X_{n-i-1}-\mu^*\mathcal{A}_i & \mu^*\mathcal{A}_{i-1} & \mu^*\mathcal{A}_{i-2}E_{A_{n-i+1}} & \cdots & \mu^*E_{A_n}\end{bmatrix}-r(A_{n-i}),\ m+r\big[\mu^*\mathcal{A}_{i-2}E_{A_{n-i+1}},\ \dots,\ \mu^*E_{A_n}\big]\right\}\\
&=r\big[\mu^*\mu X_1X_2\cdots X_{n-i-1}-\mu^*\mathcal{A}_i,\ \mu^*\mathcal{A}_{i-1}E_{A_{n-i}},\ \mu^*\mathcal{A}_{i-2}E_{A_{n-i+1}},\ \dots,\ \mu^*E_{A_n}\big]\\
&=r\big(\big[\mu^*\mathcal{A}_i,\ \mu^*\mathcal{A}_{i-1}E_{A_{n-i}},\ \mu^*\mathcal{A}_{i-2}E_{A_{n-i+1}},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]-\mu^*\mu X_1X_2\cdots X_{n-i-1}\,[I_m,\ O,\ O,\ \dots,\ O]\big).
\end{aligned}\tag{14}$$
That is, the equality (12) has been proven for all $2\le i\le n-1$. In particular, when $i=n-1$, we have:
$$\begin{aligned}
\max_{X_1,X_2,\dots,X_n} r\big(S_{(A_1A_2\cdots A_n)^*}\big)&=\max_{X_1,X_2,\dots,X_n} r\big(S_{\mu^*}\big)\\
&=\max_{X_1} r\big(\big[\mu^*\mathcal{A}_{n-2},\ \mu^*\mathcal{A}_{n-3}E_{A_2},\ \mu^*\mathcal{A}_{n-4}E_{A_3},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]-\mu^*\mu X_1\,[I_m,\ O,\ \dots,\ O]\big)\\
&=r\big(\big[\mu^*\mathcal{A}_{n-1},\ \mu^*\mathcal{A}_{n-2}E_{A_1},\ \mu^*\mathcal{A}_{n-3}E_{A_2},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]-\mu^*\mu X_0\,[I_m,\ O,\ \dots,\ O]\big)\\
&=r\big[\mu^*\mu-\mu^*\mathcal{A}_{n-1},\ \mu^*\mathcal{A}_{n-2}E_{A_1},\ \mu^*\mathcal{A}_{n-3}E_{A_2},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big].
\end{aligned}\tag{15}$$
Combining (15) with Lemma 2, we finally have:
$$\begin{aligned}
\max_{X_1,X_2,\dots,X_n} r\big(S_{(A_1A_2\cdots A_n)^*}\big)&=\max_{X_1,X_2,\dots,X_n} r\big(S_{\mu^*}\big)\\
&=r\big[\mu^*\mu-\mu^*\mathcal{A}_{n-1},\ \mu^*\mathcal{A}_{n-2}E_{A_1},\ \mu^*\mathcal{A}_{n-3}E_{A_2},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]\\
&=r\begin{bmatrix}
\mu^*\mu-\mu^*\mathcal{A}_{n-1} & \mu^*\mathcal{A}_{n-2} & \mu^*\mathcal{A}_{n-3} & \cdots & \mu^*\mathcal{A}_0 & \mu^*\\
O & A_1^* & O & \cdots & O & O\\
O & O & A_2^* & \cdots & O & O\\
\vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\
O & O & O & \cdots & A_{n-1}^* & O\\
O & O & O & \cdots & O & A_n^*
\end{bmatrix}-\sum_{i=1}^{n} r(A_i). \quad \square
\end{aligned}$$
From Lemma 5, Lemma 2, and Lemma 3, we immediately obtain the following theorem by Equation (8).
Theorem 1.
Let $A_i\in\mathbb{C}^{m\times m}$, $i=1,2,\dots,n$, let $\mathcal{A}_i$ be as in (1), and let $\mu=A_1A_2\cdots A_n$. Then, the following statements are equivalent:
(1) 
$A_1\{1,3\}A_2\{1,3\}\cdots A_n\{1,3\}\subseteq(A_1A_2\cdots A_n)\{1,3\}$;
(2) 
$r\big[\mu^*\mu-\mu^*\mathcal{A}_{n-1},\ \mu^*\mathcal{A}_{n-2}E_{A_1},\ \mu^*\mathcal{A}_{n-3}E_{A_2},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]=0$;
(3) 
$\mu^*\mu-\mu^*\mathcal{A}_{n-1}=O$, $\mu^*\mathcal{A}_iE_{A_{n-i-1}}=O$ $(i=0,1,\dots,n-2)$, and $\mu^*E_{A_n}=O$;
(4) 
$\mu^*\mu=\mu^*\mathcal{A}_{n-1}$, $N(\mu^*\mathcal{A}_i)\supseteq N(A_{n-i-1}^*)$ $(i=0,1,\dots,n-2)$, and $N(\mu^*)\supseteq N(A_n^*)$;
(5) 
$$r\begin{bmatrix}
\mu^*\mu-\mu^*\mathcal{A}_{n-1} & \mu^*\mathcal{A}_{n-2} & \mu^*\mathcal{A}_{n-3} & \cdots & \mu^*\mathcal{A}_0 & \mu^*\\
O & A_1^* & O & \cdots & O & O\\
O & O & A_2^* & \cdots & O & O\\
\vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\
O & O & O & \cdots & A_{n-1}^* & O\\
O & O & O & \cdots & O & A_n^*
\end{bmatrix}=\sum_{i=1}^{n} r(A_i).$$
Proof. 
(1) ⟺ (2). It is easy to see that the inclusion $X_1X_2\cdots X_n\in(A_1A_2\cdots A_n)\{1,3\}$ holds for any $X_i\in A_i\{1,3\}$, $i=1,2,\dots,n$, if and only if:
$$\max_{X_1,X_2,\dots,X_n} r\big((A_1A_2\cdots A_n)^*-(A_1A_2\cdots A_n)^*(A_1A_2\cdots A_n)X_1X_2\cdots X_n\big)=\max_{X_1,X_2,\dots,X_n} r\big(\mu^*-\mu^*\mu X_1X_2\cdots X_n\big)=0. \tag{16}$$
From Lemma 5, we obtain (1)⟺ (2).
(2) ⟺ (3). In fact, $r(A)=0$ if and only if $A=O$, so (2) ⟺ (3) is obvious.
(3) ⟺ (4). From (3), we have $\mu^*\mu-\mu^*\mathcal{A}_{n-1}=O \iff \mu^*\mu=\mu^*\mathcal{A}_{n-1}$. On the other hand, by (3), we obtain:
$$\mu^*\mathcal{A}_{n-2}E_{A_1}=\mu^*\mathcal{A}_{n-2}\big(I_m-A_1A_1^{\dagger}\big)=O.$$
That is,
$$\mu^*\mathcal{A}_{n-2}=\mu^*\mathcal{A}_{n-2}A_1A_1^{\dagger}=\mu^*\mathcal{A}_{n-2}P_{R(A_1),N(A_1^*)}.$$
According to formula (2) of Lemma 3, the identity above is equivalent to:
$$N\big(\mu^*\mathcal{A}_{n-2}\big)\supseteq N\big(A_1^*\big).$$
Similarly, we can show that:
$$\mu^*\mathcal{A}_iE_{A_{n-i-1}}=O \quad (i=0,1,\dots,n-2)$$
is equivalent to:
$$N\big(\mu^*\mathcal{A}_i\big)\supseteq N\big(A_{n-i-1}^*\big) \quad (i=0,1,\dots,n-2).$$
Hence (3)⟺ (4).
(2) ⟺ (5). By Lemma 2, we have:
$$\begin{aligned}
&r\big[\mu^*\mu-\mu^*\mathcal{A}_{n-1},\ \mu^*\mathcal{A}_{n-2}E_{A_1},\ \mu^*\mathcal{A}_{n-3}E_{A_2},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]\\
&\quad=r\big[\mu^*\mu-\mu^*\mathcal{A}_{n-1},\ \mu^*\mathcal{A}_{n-2}F_{A_1^*},\ \mu^*\mathcal{A}_{n-3}F_{A_2^*},\ \dots,\ \mu^*\mathcal{A}_0F_{A_{n-1}^*},\ \mu^*F_{A_n^*}\big]\\
&\quad=r\begin{bmatrix}
\mu^*\mu-\mu^*\mathcal{A}_{n-1} & \mu^*\mathcal{A}_{n-2} & \mu^*\mathcal{A}_{n-3} & \cdots & \mu^*\mathcal{A}_0 & \mu^*\\
O & A_1^* & O & \cdots & O & O\\
O & O & A_2^* & \cdots & O & O\\
\vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\
O & O & O & \cdots & A_{n-1}^* & O\\
O & O & O & \cdots & O & A_n^*
\end{bmatrix}-\sum_{i=1}^{n} r(A_i).
\end{aligned}\tag{17}$$
From (17), we have:
$$r\big[\mu^*\mu-\mu^*\mathcal{A}_{n-1},\ \mu^*\mathcal{A}_{n-2}E_{A_1},\ \mu^*\mathcal{A}_{n-3}E_{A_2},\ \dots,\ \mu^*\mathcal{A}_0E_{A_{n-1}},\ \mu^*E_{A_n}\big]=0$$
if and only if:
$$r\begin{bmatrix}
\mu^*\mu-\mu^*\mathcal{A}_{n-1} & \mu^*\mathcal{A}_{n-2} & \mu^*\mathcal{A}_{n-3} & \cdots & \mu^*\mathcal{A}_0 & \mu^*\\
O & A_1^* & O & \cdots & O & O\\
O & O & A_2^* & \cdots & O & O\\
\vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\
O & O & O & \cdots & A_{n-1}^* & O\\
O & O & O & \cdots & O & A_n^*
\end{bmatrix}=\sum_{i=1}^{n} r(A_i).$$
The proof of Theorem 1 is completed.  □
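Theorem 1 can be illustrated on a family where the forward order law (5) demonstrably holds: take every factor equal to one orthogonal projector $P$, so $\mu=P^n=P$ and all $\mathcal{A}_i=P$. The sketch below (NumPy, real case; the parametrization $P\{1,3\}=\{P+(I-P)Z\}$ is a standard fact assumed here, not stated in the text) checks both the conditions of Theorem 1(3) and the conclusion directly:

```python
import numpy as np

# Forward order law example: A_1 = ... = A_n = P, an orthogonal projector.
rng = np.random.default_rng(5)
Q, _ = np.linalg.qr(rng.standard_normal((4, 2)))
P = Q @ Q.T                        # P* = P, P^2 = P, rank 2
n = 3
mu = np.linalg.multi_dot([P] * n)  # mu = P^n = P

# Theorem 1(3) for this family: mu*mu = mu*A_{n-1}, mu*A_i E = O, mu*E = O
E = np.eye(4) - P                  # E_{A_i} = I - A_i A_i^dagger = I - P
assert np.allclose(mu.T @ mu, mu.T @ P)
assert np.allclose(mu.T @ P @ E, 0) and np.allclose(mu.T @ E, 0)

# Direct check of (5): any product of {1,3}-inverses of P is in mu{1,3}.
Xs = [P + (np.eye(4) - P) @ rng.standard_normal((4, 4)) for _ in range(n)]
Y = np.linalg.multi_dot(Xs)        # X_1 X_2 ... X_n
assert np.allclose(mu @ Y @ mu, mu)     # Penrose (1) for mu
assert np.allclose((mu @ Y).T, mu @ Y)  # Penrose (3): Y is in mu{1,3}
```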
By Lemma 4, we know that $X\in A\{1,4\}$ if and only if $X^*\in A^*\{1,3\}$. Therefore, from the results obtained in Theorem 1, we can get the necessary and sufficient conditions for the forward order law (6), and hence we state the following theorem without proof.
Theorem 2.
Let $A_i\in\mathbb{C}^{m\times m}$, $i=1,2,\dots,n$, let $\mu=A_1A_2\cdots A_n$, and in this theorem let $\mathcal{A}_i=A_iA_{i-1}\cdots A_1$, $i=1,2,\dots,n$, with $F_{A_i}=I-A_i^{\dagger}A_i$. Then, the following statements are equivalent:
(1) 
$A_1\{1,4\}A_2\{1,4\}\cdots A_n\{1,4\}\subseteq(A_1A_2\cdots A_n)\{1,4\}$;
(2) 
$$r\begin{bmatrix}\mathcal{A}_n\mu^*-\mu\mu^*\\ F_{A_n}\mathcal{A}_{n-1}\mu^*\\ F_{A_{n-1}}\mathcal{A}_{n-2}\mu^*\\ \vdots\\ F_{A_2}\mathcal{A}_1\mu^*\\ F_{A_1}\mu^*\end{bmatrix}=0;$$
(3) 
$\mathcal{A}_n\mu^*-\mu\mu^*=O$, $F_{A_i}\mathcal{A}_{i-1}\mu^*=O$ $(i=2,3,\dots,n)$, and $F_{A_1}\mu^*=O$;
(4) 
$\mathcal{A}_n\mu^*=\mu\mu^*$, $R(\mathcal{A}_{i-1}\mu^*)\subseteq R(A_i^*)$ $(i=2,3,\dots,n)$, and $R(\mu^*)\subseteq R(A_1^*)$;
(5) 
$$r\begin{bmatrix}
\mathcal{A}_n\mu^*-\mu\mu^* & O & O & \cdots & O\\
\mathcal{A}_{n-1}\mu^* & A_n^* & O & \cdots & O\\
\mathcal{A}_{n-2}\mu^* & O & A_{n-1}^* & \cdots & O\\
\vdots & \vdots & \vdots & \ddots & \vdots\\
\mu^* & O & O & \cdots & A_1^*
\end{bmatrix}=\sum_{i=1}^{n} r(A_i).$$
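The duality underlying Theorem 2 — $X\in A\{1,4\}$ if and only if $X^*\in A^*\{1,3\}$ — can be sketched numerically. As in the earlier sketches, the parametrization $A\{1,4\}=\{A^{\dagger}+Z(I-AA^{\dagger})\}$ used below is a standard fact assumed for the example (not stated in the text):

```python
import numpy as np

# Duality behind Theorem 2: X in A{1,4} iff X* in A*{1,3} (real case).
rng = np.random.default_rng(6)
U = rng.standard_normal((4, 2))
V = rng.standard_normal((2, 5))
A = U @ V                           # a rank-2 4x5 matrix
Ap = np.linalg.pinv(A)
Z = rng.standard_normal((5, 4))
X = Ap + Z @ (np.eye(4) - A @ Ap)   # a generic {1,4}-inverse of A

assert np.allclose(X @ A @ A.T, A.T)     # Lemma 4(2): X A A* = A*
assert np.allclose((X @ A).T, X @ A)     # Penrose (4)

# X* is a {1,3}-inverse of A*: Lemma 4(1) applied to B = A*.
B = A.T
assert np.allclose(B.T @ B @ X.T, B.T)
```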

3. Conclusions

In this paper, we have studied the forward order laws for the $\{1,3\}$-inverse and the $\{1,4\}$-inverse of a product of multiple matrices. By using the expressions for the maximal ranks of the generalized Schur complement, we obtained some necessary and sufficient conditions for $A_1\{1,3\}A_2\{1,3\}\cdots A_n\{1,3\}\subseteq(A_1A_2\cdots A_n)\{1,3\}$ and $A_1\{1,4\}A_2\{1,4\}\cdots A_n\{1,4\}\subseteq(A_1A_2\cdots A_n)\{1,4\}$. In the near future, we will study more challenging problems concerning the applications of forward order laws in algorithms for computing least squares (LS) solutions of the matrix equation $(A_1A_2\cdots A_n)x=b$.

Author Contributions

All authors have equally contributed to this work. All authors read and approved the final manuscript.

Funding

This work was supported by the project for characteristic innovation of 2018 Guangdong University, the National Science Foundation of China (Nos. 11771159 and 11571004), and the Guangdong Natural Science Fund of China (Grant No. 2014A030313625).

Acknowledgments

The authors would like to thank the anonymous referees for their very detailed comments and constructive suggestions, which greatly improved the presentation of this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Penrose, R. A generalized inverse for matrices. Proc. Camb. Philos. Soc. 1955, 51, 406–413.
2. Ben-Israel, A.; Greville, T.N.E. Generalized Inverses: Theory and Applications, 2nd ed.; Springer: New York, NY, USA, 2002.
3. Wang, G.; Wei, Y.; Qiao, S. Generalized Inverses: Theory and Computations; Science Press: Beijing, China, 2004.
4. Campbell, S.L.; Meyer, C.D. Generalized Inverses of Linear Transformations; Dover: New York, NY, USA, 1979.
5. Marsaglia, G.; Styan, G.P.H. Equalities and inequalities for ranks of matrices. Linear Multilinear Algebra 1974, 2, 269–292.
6. Rao, C.R.; Mitra, S.K. Generalized Inverse of Matrices and Its Applications; Wiley: New York, NY, USA, 1971.
7. Golub, G.H.; Van Loan, C.F. Matrix Computations, 3rd ed.; The Johns Hopkins University Press: Baltimore, MD, USA, 1996.
8. Greville, T.N.E. Note on the generalized inverse of a matrix product. SIAM Rev. 1966, 8, 518–521.
9. Stanimirović, P.; Tasić, M. Computing generalized inverses using LU factorization of matrix product. Int. J. Comput. Math. 2008, 85, 1865–1878.
10. Cvetković-Ilić, D.; Harte, R. Reverse order laws in C*-algebras. Linear Algebra Appl. 2011, 434, 1388–1394.
11. Cvetković-Ilić, D.; Milošević, J. Reverse order laws for {1,3}-generalized inverses. Linear Multilinear Algebra 2018.
12. Cvetković-Ilić, D.; Nikolov, J. Reverse order laws for {1,2,3}-generalized inverses. Appl. Math. Comput. 2014, 234, 114–117.
13. Hartwig, R.E. The reverse order law revisited. Linear Algebra Appl. 1986, 76, 241–246.
14. Liu, D.; Yan, H. The reverse order law for {1,3,4}-inverse of the product of two matrices. Appl. Math. Comput. 2010, 215, 4293–4303.
15. Liu, Q.; Wei, M. Reverse order law for least squares g-inverses of multiple matrix products. Linear Multilinear Algebra 2008, 56, 491–506.
16. Liu, X.J.; Huang, S.; Cvetković-Ilić, D.S. Mixed-type reverse-order laws for {1,3}-inverses over Hilbert spaces. Appl. Math. Comput. 2012, 218, 8570–8577.
17. Tian, Y. Reverse order laws for generalized inverses of multiple matrix products. Linear Algebra Appl. 1994, 211, 85–100.
18. Wei, M. Reverse order laws for generalized inverses of multiple matrix products. Linear Algebra Appl. 1999, 293, 273–288.
19. Liu, Z.S.; Xiong, Z. A note on the forward order law for least square g-inverse of three matrix products. Comput. Appl. Math. 2019, 38, 48.
20. Liu, Z.S.; Xiong, Z. The forward order laws for {1,2,3}- and {1,2,4}-inverses of three matrix products. Filomat 2018, 32, 589–598.
21. Tian, Y. More on maximal and minimal ranks of Schur complements with applications. Appl. Math. Comput. 2004, 152, 675–692.

Share and Cite

Xiong, Z.; Liu, Z. The Forward Order Law for Least Squares g-Inverse of Multiple Matrix Products. Mathematics 2019, 7, 277. https://doi.org/10.3390/math7030277
