
Tsallis Entropy of a Used Reliability System at the System Level

by Mohamed Kayid 1 and Mashael A. Alshehri 2,*

1 Department of Statistics and Operations Research, College of Science, King Saud University, P.O. Box 2455, Riyadh 11451, Saudi Arabia
2 Department of Quantitative Analysis, College of Business Administration, King Saud University, Riyadh 11362, Saudi Arabia
* Author to whom correspondence should be addressed.
Entropy 2023, 25(4), 550; https://doi.org/10.3390/e25040550
Submission received: 10 February 2023 / Revised: 19 March 2023 / Accepted: 21 March 2023 / Published: 23 March 2023

Abstract

Measuring the uncertainty of the lifetime of technical systems has become increasingly important in recent years. This criterion is useful for measuring the predictability of a system over its lifetime. In this paper, we consider a coherent system consisting of n components with the property that, at time t, all components of the system are alive. We then apply the system signature to determine the Tsallis entropy of the remaining lifetime of the coherent system, a useful criterion for measuring the predictability of a system's lifetime. Various results, such as bounds and ordering properties for this entropy, are investigated. The results of this work can be used to compare the predictability of the remaining lifetime of two coherent systems with known signatures.

1. Introduction

For engineers, quantifying performance and uncertainty over the lifetime of a system is critical. The reliability of a system decreases as uncertainty increases, and systems with longer lifetimes and lower uncertainty are better systems (see, e.g., Ebrahimi and Pellerey [1]). Information theory, originating in Shannon's seminal work [2], provides a measure of the uncertainty associated with a random phenomenon and has found applications in numerous areas. If X is a nonnegative random variable with an absolutely continuous cumulative distribution function (CDF) $F(x)$ and density function $f(x)$, the Tsallis entropy of order $\alpha$ is defined by (see [3])
$$H_\alpha(X)=H_\alpha(f)=\frac{1}{1-\alpha}\left(\int_0^\infty f^\alpha(x)\,\mathrm{d}x-1\right)=\frac{1}{1-\alpha}\left[E\left(f^{\alpha-1}(X)\right)-1\right],$$
for all $\alpha>0$, $\alpha\neq 1$, where $E(\cdot)$ denotes the expected value. In general, the Tsallis entropy can be negative, although it is non-negative for suitable choices of $\alpha$. Clearly $H(f)=\lim_{\alpha\to 1}H_\alpha(f)$, so the Tsallis entropy reduces to the Shannon differential entropy in the limit. The Shannon differential entropy is additive: for two independent random variables X and Y, $H(X,Y)=H(X)+H(Y)$, where $(X,Y)$ denotes the joint random vector. The Tsallis entropy, in contrast, is non-additive: $H_\alpha(X,Y)=H_\alpha(X)+H_\alpha(Y)+(1-\alpha)H_\alpha(X)H_\alpha(Y)$. Because of this added flexibility compared to Shannon entropy, non-additive entropy measures find their justification in many areas of information theory, physics, chemistry, and engineering.
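To make the definition and the non-additivity identity concrete, here is a small numerical sketch (our own illustration, not part of the paper; the exponential densities and the values of the rates and of $\alpha$ are arbitrary choices):

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal rule (avoids the np.trapz / np.trapezoid rename)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def tsallis(pdf_vals, x, alpha):
    """H_alpha(f) = (int f^alpha dx - 1) / (1 - alpha)."""
    return (trapz(pdf_vals ** alpha, x) - 1.0) / (1.0 - alpha)

# Illustrative densities (our choice): Exp(1.5) and Exp(0.7), alpha = 2.
lam, mu, alpha = 1.5, 0.7, 2.0
x = np.linspace(0.0, 60.0, 400_001)
fx = lam * np.exp(-lam * x)
gy = mu * np.exp(-mu * x)

hx, hy = tsallis(fx, x, alpha), tsallis(gy, x, alpha)
# Closed form for Exp(lam): H_alpha = (lam^(alpha-1)/alpha - 1)/(1 - alpha).
assert abs(hx - (lam ** (alpha - 1) / alpha - 1) / (1 - alpha)) < 1e-4

# Non-additivity: for independent X, Y the joint integral factorizes, and
# H_alpha(X, Y) = H_alpha(X) + H_alpha(Y) + (1 - alpha) H_alpha(X) H_alpha(Y).
h_joint = (trapz(fx ** alpha, x) * trapz(gy ** alpha, x) - 1) / (1 - alpha)
assert abs(h_joint - (hx + hy + (1 - alpha) * hx * hy)) < 1e-6
```

The last assertion holds up to floating-point error because, for independent variables, the joint integral factorizes into the product of the two marginal integrals.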
If X denotes the lifetime of a new system, then $H_\alpha(X)$ measures the uncertainty of the new system. In some cases, however, agents know something about the current age of the system. For example, one may know that the system is still operating at time t and be interested in the uncertainty of its remaining lifetime, that is, of $X_t=[X-t\,|\,X>t]$. In such situations, $H_\alpha(X)$ is no longer useful. Accordingly, the residual Tsallis entropy is defined as
$$H_\alpha(X_t)=\frac{1}{1-\alpha}\left(\int_0^\infty f_t^\alpha(x)\,\mathrm{d}x-1\right)=\frac{1}{1-\alpha}\left[\int_t^\infty\left(\frac{f(x)}{S(t)}\right)^{\!\alpha}\mathrm{d}x-1\right]=\frac{1}{1-\alpha}\left[\int_0^1 f_t^{\alpha-1}\left(S_t^{-1}(u)\right)\mathrm{d}u-1\right],\quad \alpha>0,$$
where
$$f_t(x)=\frac{f(x+t)}{S(t)},\quad x,t>0,$$
is the probability density function (PDF) of $X_t$, $S(t)=P(X>t)$ is the survival function of X, and $S_t^{-1}(u)=\inf\{x:\,S_t(x)\le u\}$ is the quantile function of $S_t(x)=S(x+t)/S(t)$, $x,t>0$. Various properties, generalizations and applications of $H_\alpha(X_t)$ are investigated by Asadi et al. [4], Nanda and Paul [5], Zhang [6], Irshad et al. [7], Rajesh and Sunoj [8], Toomaj and Agh Atabay [9], and Mohamed et al. [10], among others.
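The equality between the direct and the quantile forms of $H_\alpha(X_t)$ is easy to check numerically. The sketch below (our own illustration; the exponential rate is an arbitrary choice) uses an exponential lifetime, for which memorylessness gives $f_t(S_t^{-1}(u))=\lambda u$:

```python
import numpy as np

def trapz(y, x):
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

lam, alpha = 1.3, 2.0  # illustrative rate and entropy order

# Direct form: f_t(x) = f(x + t)/S(t) equals f(x) for Exp(lam) by
# memorylessness, so H_alpha(X_t) does not depend on t.
x = np.linspace(0.0, 50.0, 200_001)
f_t = lam * np.exp(-lam * x)
h_direct = (trapz(f_t ** alpha, x) - 1) / (1 - alpha)

# Quantile form: int_0^1 f_t^{alpha-1}(S_t^{-1}(u)) du, with
# f_t(S_t^{-1}(u)) = lam * u for the exponential.
u = np.linspace(1e-9, 1.0, 200_001)
h_quantile = (trapz((lam * u) ** (alpha - 1), u) - 1) / (1 - alpha)

assert abs(h_direct - h_quantile) < 1e-4
```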
Several properties and statistical applications of Tsallis entropy have been studied in the literature; see Maasoumi [11], Abe [12], Asadi et al. [13], and the references therein. Recently, Alomani and Kayid [14] investigated additional properties of Tsallis entropy, including its connection with the usual stochastic order, as well as properties and bounds of the dynamic version of this measure. Moreover, they studied properties of the Tsallis entropy of the lifetime of coherent and mixed systems, which is suitable for describing the uncertainty of a new system. For other applications and research concerned with measuring the uncertainty of reliability systems, we refer readers to [15,16,17,18] and the references therein. In contrast to the work of Alomani and Kayid [14], the aim of this work is to study uncertainty properties of a coherent system consisting of n components with the property that, at time t, all components of the system are alive; in this sense, we generalize results published in the literature. To this end, we use the concept of the system signature to determine the Tsallis entropy of the remaining lifetime of a coherent system.
The results of this paper are organized as follows: In Section 2, we provide an expression for the Tsallis entropy of a coherent system under the assumption that all components have survived to time t. For this purpose, we use the concept of the system signature when the lifetimes of the components are independent and identically distributed. The ordering properties of the residual Tsallis entropy of two coherent systems are studied in Section 3, based solely on ordering properties of the system signatures and thus without detailed calculations. Section 4 presents some useful bounds. Finally, Section 5 gives some conclusions and further remarks.
Throughout the paper, $\le_{\rm st}$, $\le_{\rm hr}$, $\le_{\rm lr}$ and $\le_{\rm d}$ stand for the stochastic, hazard rate, likelihood ratio and dispersive orders, respectively; for more details on these orderings, we refer the reader to Shaked and Shanthikumar [19].

2. Tsallis Entropy of the System in Terms of Signature Vectors of the System

In this section, the concept of system signature is used to express the Tsallis entropy of the remaining lifetime of a coherent system with an arbitrary structure, assuming that all components of the system are functioning at time t. The signature of such a system is the n-dimensional vector $\mathbf{p}=(p_1,\dots,p_n)$ whose i-th element is $p_i=P(T=X_{i:n})$, $i=1,2,\dots,n$, where $X_{i:n}$ is the i-th order statistic of the n independent and identically distributed (i.i.d.) component lifetimes $X_1,\dots,X_n$ (that is, the time of the i-th component failure), and T is the failure time of the system; see Samaniego [20]. Consider a coherent system with i.i.d. component lifetimes $X_1,\dots,X_n$ and a known signature $\mathbf{p}=(p_1,\dots,p_n)$. If $T_t^{1,n}=[T-t\,|\,X_{1:n}>t]$ represents the remaining lifetime of the system under the condition that, at time t, all components of the system are functioning, then from the results of Khaledi and Shaked [21] the survival function of $T_t^{1,n}$ can be expressed as
$$P(T_t^{1,n}>x)=\sum_{i=1}^n p_i\,P(X_{i:n}-t>x\,|\,X_{1:n}>t)=\sum_{i=1}^n p_i\,P(T_t^{1,i,n}>x),$$
where $T_t^{1,i,n}=[X_{i:n}-t\,|\,X_{1:n}>t]$, $i=1,2,\dots,n$, denotes the remaining lifetime of an i-out-of-n system under the condition that all components are working at time t. The survival and probability density functions of $T_t^{1,i,n}$ are given by
$$P(T_t^{1,i,n}>x)=\sum_{k=0}^{i-1}\binom{n}{k}\left[1-S_t(x)\right]^k\left[S_t(x)\right]^{n-k},\quad x,t>0,$$
and
$$f_{T_t^{1,i,n}}(x)=\frac{\Gamma(n+1)}{\Gamma(i)\,\Gamma(n-i+1)}\left[1-S_t(x)\right]^{i-1}\left[S_t(x)\right]^{n-i}f_t(x),\quad x,t>0,$$
respectively, where Γ ( · ) is the complete gamma function. It follows that
$$f_{T_t^{1,n}}(x)=\sum_{i=1}^n p_i\,f_{T_t^{1,i,n}}(x),\quad x,t>0.$$
In what follows, we study the Tsallis entropy of the random variable $T_t^{1,n}$, which measures the uncertainty contained in the density of $[T-t\,|\,X_{1:n}>t]$ and thus the predictability of the remaining lifetime of the system. The probability integral transformation $V=S_t(T_t^{1,n})$ plays a crucial role in our goal. It is clear that $U_{i:n}=S_t(T_t^{1,i,n})$ follows a beta distribution with parameters $n-i+1$ and $i$, with the PDF
$$g_i(u)=\frac{\Gamma(n+1)}{\Gamma(i)\,\Gamma(n-i+1)}(1-u)^{i-1}u^{n-i},\quad 0<u<1,\ i=1,\dots,n.$$
In the following theorem, we provide an expression for the Tsallis entropy of $T_t^{1,n}$ using the earlier transformation formulas.
Theorem 1.
The Tsallis entropy of T t 1 , n can be expressed as follows:
$$H_\alpha(T_t^{1,n})=\frac{1}{1-\alpha}\left[\int_0^1 g_V^\alpha(u)\,f_t^{\alpha-1}\left(S_t^{-1}(u)\right)\mathrm{d}u-1\right],\quad t>0,$$
for all $\alpha>0$.
Proof. 
By using the change of variable $u=S_t(x)$, from (2) and (6) we obtain
$$\begin{aligned}
H_\alpha(T_t^{1,n})&=\frac{1}{1-\alpha}\left[\int_0^\infty f_{T_t^{1,n}}^\alpha(x)\,\mathrm{d}x-1\right]\\
&=\frac{1}{1-\alpha}\left[\int_0^\infty\left(\sum_{i=1}^n p_i\,f_{T_t^{1,i,n}}(x)\right)^{\!\alpha}\mathrm{d}x-1\right]\\
&=\frac{1}{1-\alpha}\left[\int_0^1\left(\sum_{i=1}^n p_i\,g_i(u)\right)^{\!\alpha}f_t^{\alpha-1}\left(S_t^{-1}(u)\right)\mathrm{d}u-1\right]\\
&=\frac{1}{1-\alpha}\left[\int_0^1 g_V^\alpha(u)\,f_t^{\alpha-1}\left(S_t^{-1}(u)\right)\mathrm{d}u-1\right].
\end{aligned}$$
In the last equality, $g_V(u)=\sum_{i=1}^n p_i\,g_i(u)$ is the PDF of $V=S_t(T_t^{1,n})$, which can be viewed as the corresponding transform of the lifetime of a coherent system with the same signature whose component lifetimes are i.i.d. uniform on $(0,1)$. □
In the special case of an i-out-of-n system with system signature $\mathbf{p}=(0,\dots,0,1,0,\dots,0)$, where the 1 appears in the i-th position, $i=1,2,\dots,n$, Equation (9) reduces to
$$H_\alpha(T_t^{1,i,n})=\frac{1}{1-\alpha}\left[\int_0^1 g_i^\alpha(u)\,f_t^{\alpha-1}\left(S_t^{-1}(u)\right)\mathrm{d}u-1\right],$$
for all $t>0$.
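Theorem 1 is straightforward to evaluate numerically. The following sketch (our own illustration) uses i.i.d. uniform(0,1) component lifetimes, for which $f_t(S_t^{-1}(u))=1/(1-t)$, and cross-checks the representation (9) against direct integration of the mixture density of $T_t^{1,n}$; the signature is the one used in Example 1 below:

```python
import numpy as np
from math import factorial

def trapz(y, x):
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def g(u, i, n):
    """Beta(n - i + 1, i) density g_i from (6)."""
    c = factorial(n) / (factorial(i - 1) * factorial(n - i))
    return c * (1 - u) ** (i - 1) * u ** (n - i)

n, p, alpha, t = 4, (0.0, 0.5, 0.25, 0.25), 2.0, 0.3

# Theorem 1 with uniform(0,1) components: f_t(S_t^{-1}(u)) = 1/(1-t).
u = np.linspace(0.0, 1.0, 200_001)
gV = sum(pi * g(u, i, n) for i, pi in enumerate(p, start=1))
h_thm1 = ((1 - t) ** (1 - alpha) * trapz(gV ** alpha, u) - 1) / (1 - alpha)

# Direct route: f_{T_t^{1,n}}(x) = sum_i p_i g_i(S_t(x)) f_t(x) on (0, 1-t),
# with S_t(x) = 1 - x/(1-t) and f_t(x) = 1/(1-t) for uniform components.
x = np.linspace(0.0, 1 - t, 200_001)
st = 1 - x / (1 - t)
fT = sum(pi * g(st, i, n) for i, pi in enumerate(p, start=1)) / (1 - t)
h_direct = (trapz(fT ** alpha, x) - 1) / (1 - alpha)

assert abs(h_thm1 - h_direct) < 1e-3
```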
The next theorem follows immediately from Theorem 1 and the aging properties of the components. We recall that X has an increasing (decreasing) failure rate, IFR (DFR), if $S_t(x)$ is decreasing (increasing) in t for all $x>0$.
Theorem 2.
If X is IFR (DFR), then $H_\alpha(T_t^{1,n})$ is decreasing (increasing) in t for all $\alpha>0$.
Proof. 
We only prove the result when X is IFR; the proof for the DFR case is similar. It is easy to see that $f_t(S_t^{-1}(u))=u\,\lambda_t(S_t^{-1}(u))$, $0<u<1$, where $\lambda_t(x)=f_t(x)/S_t(x)$ denotes the hazard rate of $X_t$. This implies that Equation (9) can be rewritten as
$$(1-\alpha)H_\alpha(T_t^{1,n})+1=\int_0^1 g_V^\alpha(u)\,u^{\alpha-1}\,\lambda_t^{\alpha-1}\left(S_t^{-1}(u)\right)\mathrm{d}u,$$
for all $\alpha>0$. On the other hand, one can conclude that $S_t^{-1}(u)=S^{-1}(uS(t))-t$ for all $0<u<1$, and hence we have
$$\lambda_t\left(S_t^{-1}(u)\right)=\lambda\left(S_t^{-1}(u)+t\right)=\lambda\left(S^{-1}(uS(t))\right),\quad 0<u<1.$$
If $t_1\le t_2$, then $S^{-1}(uS(t_1))\le S^{-1}(uS(t_2))$ for all $0<u<1$. Thus, when F is IFR, the hazard rate $\lambda$ is increasing, and for all $\alpha>1$ ($0<\alpha\le 1$) we have
$$\begin{aligned}
\int_0^1 g_V^\alpha(u)\,u^{\alpha-1}\,\lambda_{t_1}^{\alpha-1}\left(S_{t_1}^{-1}(u)\right)\mathrm{d}u
&=\int_0^1 g_V^\alpha(u)\,u^{\alpha-1}\,\lambda^{\alpha-1}\left(S^{-1}(uS(t_1))\right)\mathrm{d}u\\
&\le(\ge)\int_0^1 g_V^\alpha(u)\,u^{\alpha-1}\,\lambda^{\alpha-1}\left(S^{-1}(uS(t_2))\right)\mathrm{d}u\\
&=\int_0^1 g_V^\alpha(u)\,u^{\alpha-1}\,\lambda_{t_2}^{\alpha-1}\left(S_{t_2}^{-1}(u)\right)\mathrm{d}u,
\end{aligned}$$
for all $t_1\le t_2$. Using (11), we obtain
$$(1-\alpha)H_\alpha(T_{t_1}^{1,n})+1\ \le(\ge)\ (1-\alpha)H_\alpha(T_{t_2}^{1,n})+1,$$
for all $\alpha>1$ ($0<\alpha\le 1$). Since $1-\alpha<0$ for $\alpha>1$ and $1-\alpha>0$ for $0<\alpha<1$, both cases imply $H_\alpha(T_{t_1}^{1,n})\ge H_\alpha(T_{t_2}^{1,n})$ for all $\alpha>0$, and this completes the proof. □
The next example illustrates the results of Theorems 1 and 2.
Example 1.
Consider a coherent system with system signature $\mathbf{p}=(0,1/2,1/4,1/4)$. The exact value of $H_\alpha(T_t^{1,4})$ can be calculated from relation (9) once the lifetime distribution of the components is given. For this purpose, let us consider the following lifetime distributions.
(i) 
Consider a Pareto type II distribution with survival function
$$S(t)=(1+t)^{-k},\quad k,t>0.$$
It is not hard to see that
$$H_\alpha(T_t^{1,4})=\frac{1}{1-\alpha}\left[\left(\frac{k}{1+t}\right)^{\!\alpha-1}\int_0^1 u^{\frac{(\alpha-1)(k+1)}{k}}g_V^\alpha(u)\,\mathrm{d}u-1\right],\quad t>0.$$
This expression shows that $H_\alpha(T_t^{1,4})$ is an increasing function of time t: the uncertainty of the conditional lifetime $T_t^{1,4}$ grows as t increases. Recall that this distribution has the DFR property, so this is consistent with Theorem 2.
(ii) 
Let us suppose that X has a Weibull distribution with shape parameter k and survival function
$$S(t)=e^{-t^k},\quad k,t>0.$$
After some manipulation, we have
$$H_\alpha(T_t^{1,4})=\frac{1}{1-\alpha}\left[k^{\alpha-1}\int_0^1\left(t^k-\log u\right)^{\left(1-\frac{1}{k}\right)(\alpha-1)}u^{\alpha-1}\,g_V^\alpha(u)\,\mathrm{d}u-1\right],\quad t>0.$$
It is difficult to find an explicit expression for the above relation, so we evaluate it numerically. Figure 1 plots the entropy of $T_t^{1,4}$ as a function of time t for $\alpha=0.2$ and $\alpha=2$. Recall that X is DFR when $0<k\le 1$; as expected from Theorem 2, $H_\alpha(T_t^{1,4})$ is then increasing in t. The results are shown in Figure 1.
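Case (i) can also be checked numerically. The following sketch (our own illustration, with the arbitrary choices $k=2$ and $\alpha=2$; it is not the code behind Figure 1) confirms that the residual Tsallis entropy of the Example 1 system with Pareto II components increases with t, as Theorem 2 predicts for DFR components:

```python
import numpy as np
from math import factorial

def trapz(y, x):
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def g_V(u, p=(0.0, 0.5, 0.25, 0.25), n=4):
    """Mixture of the beta densities g_i, weighted by the signature."""
    out = np.zeros_like(u)
    for i, pi in enumerate(p, start=1):
        c = factorial(n) / (factorial(i - 1) * factorial(n - i))
        out += pi * c * (1 - u) ** (i - 1) * u ** (n - i)
    return out

def h_pareto(t, alpha=2.0, k=2.0):
    """Residual Tsallis entropy of T_t^{1,4} for Pareto II components."""
    u = np.linspace(0.0, 1.0, 200_001)
    integral = trapz(u ** ((alpha - 1) * (k + 1) / k) * g_V(u) ** alpha, u)
    return ((k / (1 + t)) ** (alpha - 1) * integral - 1) / (1 - alpha)

vals = [h_pareto(t) for t in (0.0, 1.0, 2.0, 5.0)]
assert all(a < b for a, b in zip(vals, vals[1:]))  # increasing in t (DFR case)
```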
Below, we compare the Tsallis entropy of the residual lifetime of a used coherent system with that of a new one.
Theorem 3.
Consider a coherent system with independent and identically distributed IFR (DFR) component lifetimes. Then $H_\alpha(T_t^{1,n})\le(\ge)\,H_\alpha(T)$ for all $\alpha>0$.
Proof. 
We prove the result when X is IFR; the proof for the DFR case is similar. Since X is IFR, Theorem 3.B.25 of Shaked and Shanthikumar [19] implies that $X_t\le_{\rm d}X$, that is,
$$f_t\left(S_t^{-1}(u)\right)\ \ge\ f\left(S^{-1}(u)\right),\quad 0<u<1,$$
for all $t>0$. For $\alpha>1$ ($0<\alpha<1$), we then have
$$\int_0^1 g_V^\alpha(u)\,f_t^{\alpha-1}\left(S_t^{-1}(u)\right)\mathrm{d}u\ \ge(\le)\ \int_0^1 g_V^\alpha(u)\,f^{\alpha-1}\left(S^{-1}(u)\right)\mathrm{d}u,\quad t>0.$$
Thus, from (9) and (15), we obtain
$$H_\alpha(T_t^{1,n})=\frac{1}{1-\alpha}\left[\int_0^1 g_V^\alpha(u)\,f_t^{\alpha-1}\left(S_t^{-1}(u)\right)\mathrm{d}u-1\right]\ \le\ \frac{1}{1-\alpha}\left[\int_0^1 g_V^\alpha(u)\,f^{\alpha-1}\left(S^{-1}(u)\right)\mathrm{d}u-1\right]=H_\alpha(T).$$
Therefore, the proof is completed. □
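Theorem 3 can be illustrated numerically. In the sketch below (our own illustration; the Weibull shape $k=2$ is IFR, and the signature and the choices $t=1$, $\alpha=2$ are arbitrary), the entropy of the used system does not exceed that of the new system:

```python
import numpy as np
from math import factorial

def trapz(y, x):
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def g_V(u, p=(0.0, 0.5, 0.25, 0.25), n=4):
    out = np.zeros_like(u)
    for i, pi in enumerate(p, start=1):
        c = factorial(n) / (factorial(i - 1) * factorial(n - i))
        out += pi * c * (1 - u) ** (i - 1) * u ** (n - i)
    return out

def h_weibull(t, alpha=2.0, k=2.0):
    """H_alpha(T_t^{1,4}) for Weibull(k) components, via (9)."""
    u = np.linspace(1e-12, 1.0, 400_001)
    ft = k * (t ** k - np.log(u)) ** (1 - 1 / k) * u  # f_t(S_t^{-1}(u))
    return (trapz(g_V(u) ** alpha * ft ** (alpha - 1), u) - 1) / (1 - alpha)

h_used, h_new = h_weibull(t=1.0), h_weibull(t=0.0)
assert h_used <= h_new  # IFR components: the used system is more predictable
```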
Theorem 4.
If X is DFR, then a lower bound for $H_\alpha(T_t^{1,n})$ is given by
$$H_\alpha(T_t^{1,n})\ \ge\ \frac{H_\alpha(T)}{S^{\alpha-1}(t)}+\frac{1}{1-\alpha}\left[\frac{1}{S^{\alpha-1}(t)}-1\right],$$
for all $\alpha>0$.
Proof. 
Since X is DFR, it is NWU, i.e., $S_t(x)\ge S(x)$ for all $x,t\ge 0$. This implies that
$$S_t^{-1}(u)+t\ \ge\ S^{-1}(u),\quad t\ge 0,$$
for all $0<u<1$. On the other hand, it is known that when X is DFR, the PDF f is decreasing, which implies that
$$f^{\alpha-1}\left(S_t^{-1}(u)+t\right)\ \le(\ge)\ f^{\alpha-1}\left(S^{-1}(u)\right),\quad 0<u<1,$$
for all $\alpha>1$ ($0<\alpha<1$). From (9), one can conclude that
$$\begin{aligned}
H_\alpha(T_t^{1,n})&=\frac{1}{1-\alpha}\left[\int_0^1 g_V^\alpha(u)\,\frac{f^{\alpha-1}\left(S_t^{-1}(u)+t\right)}{S^{\alpha-1}(t)}\,\mathrm{d}u-1\right]\\
&\ge\frac{1}{1-\alpha}\left[\int_0^1 g_V^\alpha(u)\,\frac{f^{\alpha-1}\left(S^{-1}(u)\right)}{S^{\alpha-1}(t)}\,\mathrm{d}u-1\right]\\
&=\frac{1}{1-\alpha}\left[\frac{(1-\alpha)H_\alpha(T)+1}{S^{\alpha-1}(t)}-1\right],
\end{aligned}$$
for all $\alpha>0$, and this completes the proof. □

3. Entropy Ordering of Two Coherent Systems

Given two coherent systems, this section discusses partial orderings of their conditional residual lifetimes in terms of Tsallis entropy. Based on various orderings between the component lifetimes and between the signature vectors, we obtain entropy-ordering results for two coherent systems. The next theorem compares the entropies of the residual lifetimes of two coherent systems.
Theorem 5.
Let $T_t^{X,1,n}=[T-t\,|\,X_{1:n}>t]$ and $T_t^{Y,1,n}=[T-t\,|\,Y_{1:n}>t]$ denote the residual lifetimes of two coherent systems with the same signature and with n i.i.d. component lifetimes $X_1,\dots,X_n$ and $Y_1,\dots,Y_n$ having CDFs F and G, respectively. If $X\le_{\rm d}Y$ and X or Y is IFR, then $H_\alpha(T_t^{X,1,n})\le H_\alpha(T_t^{Y,1,n})$ for all $\alpha>0$.
Proof. 
In view of relation (9), it suffices to show that $X_t\le_{\rm d}Y_t$. Since $X\le_{\rm d}Y$ and X or Y is IFR, the proof of Theorem 5 of Ebrahimi and Kirmani [22] yields $X_t\le_{\rm d}Y_t$, and this concludes the proof. □
Example 2.
Let us consider two coherent systems with residual lifetimes $T_t^{X,1,4}$ and $T_t^{Y,1,4}$ and common signature $\mathbf{p}=(1/2,1/4,1/4,0)$. Suppose that $X\sim W(3,1)$ and $Y\sim W(2,1)$, where $W(k,1)$ stands for the Weibull distribution with survival function given in (14). It is easy to see that $X\le_{\rm d}Y$; moreover, X and Y are both IFR. Thus, Theorem 5 yields $H_\alpha(T_t^{X,1,4})\le H_\alpha(T_t^{Y,1,4})$ for all $\alpha>0$. The Tsallis entropies of these systems are plotted in Figure 2.
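The comparison in Example 2 is easy to reproduce numerically. The sketch below (our own illustration at the arbitrary values $\alpha=2$ and $t=1$; it is not the code behind Figure 2) evaluates (9) for both Weibull shapes:

```python
import numpy as np
from math import factorial

def trapz(y, x):
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def g_V(u, p=(0.5, 0.25, 0.25, 0.0), n=4):
    out = np.zeros_like(u)
    for i, pi in enumerate(p, start=1):
        c = factorial(n) / (factorial(i - 1) * factorial(n - i))
        out += pi * c * (1 - u) ** (i - 1) * u ** (n - i)
    return out

def h_weibull(k, alpha=2.0, t=1.0):
    """H_alpha(T_t^{1,4}) for Weibull(k, 1) components, via (9)."""
    u = np.linspace(1e-12, 1.0, 400_001)
    ft = k * (t ** k - np.log(u)) ** (1 - 1 / k) * u  # f_t(S_t^{-1}(u))
    return (trapz(g_V(u) ** alpha * ft ** (alpha - 1), u) - 1) / (1 - alpha)

# X ~ W(3,1), Y ~ W(2,1): Theorem 5 predicts the X-system is more predictable.
assert h_weibull(3.0) <= h_weibull(2.0)
```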
Next, we compare the residual Tsallis entropies of two coherent systems with the same component lifetimes and different structures.
Theorem 6.
Let $T_{1,t}^{1,n}=[T_1-t\,|\,X_{1:n}>t]$ and $T_{2,t}^{1,n}=[T_2-t\,|\,X_{1:n}>t]$ represent the residual lifetimes of two coherent systems with signature vectors $\mathbf{p}_1$ and $\mathbf{p}_2$, respectively. Assume that the components are independent and identically distributed with common CDF F, and let $\mathbf{p}_1\le_{\rm lr}\mathbf{p}_2$. Then,
(i) 
if $f_t(S_t^{-1}(u))$ is increasing in u for all $t>0$, then $H_\alpha(T_{1,t}^{1,n})\le H_\alpha(T_{2,t}^{1,n})$ for all $\alpha>0$;
(ii) 
if $f_t(S_t^{-1}(u))$ is decreasing in u for all $t>0$, then $H_\alpha(T_{1,t}^{1,n})\ge H_\alpha(T_{2,t}^{1,n})$ for all $\alpha>0$.
Proof. 
(i) First, we note that Equation (9) can be rewritten as follows:
$$(1-\alpha)H_\alpha(T_{i,t}^{1,n})+1=\int_0^1 g_{V_i}^\alpha(u)\,\mathrm{d}u\int_0^1 g_{V_i}^*(u)\left[f_t\left(S_t^{-1}(u)\right)\right]^{\alpha-1}\mathrm{d}u,\quad i=1,2,$$
where $V_i^*$ has the PDF
$$g_{V_i}^*(u)=\frac{g_{V_i}^\alpha(u)}{\int_0^1 g_{V_i}^\alpha(v)\,\mathrm{d}v},\quad 0<u<1.$$
The assumption $\mathbf{p}_1\le_{\rm lr}\mathbf{p}_2$ implies that the ratio $g_{V_2}(u)/g_{V_1}(u)$ is decreasing in u (through the transformation $u=S_t(x)$, larger system lifetimes correspond to smaller values of u), and hence
$$\frac{g_{V_2}^*(u)}{g_{V_1}^*(u)}\ \propto\ \left(\frac{g_{V_2}(u)}{g_{V_1}(u)}\right)^{\!\alpha}$$
is decreasing in u for all $\alpha>0$, so that $V_2^*\le_{\rm lr}V_1^*$ and, in particular, $V_2^*\le_{\rm st}V_1^*$. Since $f_t(S_t^{-1}(u))$ is increasing in u, the function $f_t^{\alpha-1}(S_t^{-1}(u))$ is increasing (decreasing) in u for $\alpha>1$ ($0<\alpha<1$), and we obtain
$$\int_0^1 g_{V_1}^*(u)\left[f_t\left(S_t^{-1}(u)\right)\right]^{\alpha-1}\mathrm{d}u\ \ge(\le)\ \int_0^1 g_{V_2}^*(u)\left[f_t\left(S_t^{-1}(u)\right)\right]^{\alpha-1}\mathrm{d}u,$$
where the inequality in (17) follows because $V_2^*\le_{\rm st}V_1^*$ implies $E[\pi(V_2^*)]\le E[\pi(V_1^*)]$ for every increasing function $\pi$ (with the inequality reversed for decreasing $\pi$). Therefore, relation (16) gives
$$(1-\alpha)H_\alpha(T_{1,t}^{1,n})+1\ \ge(\le)\ (1-\alpha)H_\alpha(T_{2,t}^{1,n})+1,$$
or equivalently, $H_\alpha(T_{1,t}^{1,n})\le H_\alpha(T_{2,t}^{1,n})$ for all $\alpha>0$. Part (ii) follows similarly. □
The next example gives an application of Theorem 6.
Example 3.
Let us consider the two coherent systems of order 4 displayed in Figure 3, with residual lifetimes $T_{1,t}^{1,4}=[T_1-t\,|\,X_{1:4}>t]$ (left panel) and $T_{2,t}^{1,4}=[T_2-t\,|\,X_{1:4}>t]$ (right panel). It is not hard to see that the signatures of these systems are $\mathbf{p}_1=(1/2,1/2,0,0)$ and $\mathbf{p}_2=(1/4,1/4,1/2,0)$, respectively, so that $\mathbf{p}_1\le_{\rm lr}\mathbf{p}_2$. Assume that the component lifetimes are independent and identically distributed with survival function
$$S(t)=(1+t)^{-2},\quad t>0.$$
After some calculation, one can obtain $f_t(S_t^{-1}(u))=\frac{2u\sqrt{u}}{1+t}$, $t>0$. This function is increasing in u for all $t>0$. Hence, by Theorem 6(i), it holds that $H_\alpha(T_{1,t}^{1,4})\le H_\alpha(T_{2,t}^{1,4})$ for all $\alpha>0$.
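The ordering in Example 3 can be verified numerically; the sketch below (our own illustration at the arbitrary values $\alpha=2$ and $t=1$) evaluates (9) for both signatures using $f_t(S_t^{-1}(u))=2u\sqrt{u}/(1+t)$:

```python
import numpy as np
from math import factorial

def trapz(y, x):
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def g_V(u, p, n=4):
    out = np.zeros_like(u)
    for i, pi in enumerate(p, start=1):
        c = factorial(n) / (factorial(i - 1) * factorial(n - i))
        out += pi * c * (1 - u) ** (i - 1) * u ** (n - i)
    return out

def h_system(p, alpha=2.0, t=1.0):
    u = np.linspace(0.0, 1.0, 200_001)
    ft = 2 * u ** 1.5 / (1 + t)  # f_t(S_t^{-1}(u)) for S(t) = (1+t)^{-2}
    return (trapz(g_V(u, p) ** alpha * ft ** (alpha - 1), u) - 1) / (1 - alpha)

h1 = h_system((0.5, 0.5, 0.0, 0.0))    # left system in Figure 3
h2 = h_system((0.25, 0.25, 0.5, 0.0))  # right system in Figure 3
assert h1 <= h2  # Theorem 6(i)
```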

4. Some Useful Bounds

When the structure is complex and the number of components is large, it is difficult to compute $H_\alpha(T_t^{1,n})$ for a coherent system, a situation frequently encountered in practice. Under such circumstances, a bound on the residual Tsallis entropy of the system lifetime can be useful. For recent research on bounds on the uncertainty of the lifetime of coherent systems, we refer the reader to, for example, Refs. [15,16,23] and the references therein. In the following theorem, we provide bounds on the residual Tsallis entropy of the system lifetime in terms of the residual Tsallis entropy $H_\alpha(X_t)$ of the parent distribution.
Theorem 7.
Let $T_t^{1,n}=[T-t\,|\,X_{1:n}>t]$ represent the residual lifetime of a coherent system consisting of n independent and identically distributed component lifetimes with common CDF F and signature $\mathbf{p}=(p_1,\dots,p_n)$. Suppose that $H_\alpha(T_t^{1,n})<\infty$ for all $\alpha>0$. Then
$$H_\alpha(T_t^{1,n})\ \ge\ B_n^\alpha(\mathbf{p})\,H_\alpha(X_t)+\frac{B_n^\alpha(\mathbf{p})-1}{1-\alpha},$$
for all $\alpha>1$, and
$$H_\alpha(T_t^{1,n})\ \le\ B_n^\alpha(\mathbf{p})\,H_\alpha(X_t)+\frac{B_n^\alpha(\mathbf{p})-1}{1-\alpha},$$
for $0<\alpha<1$, where $B_n(\mathbf{p})=\sum_{i=1}^n p_i\,g_i(p_i^*)$ and $p_i^*=\frac{n-i}{n-1}$.
Proof. 
It can be verified that the mode of the beta distribution with parameters $n-i+1$ and i is $p_i^*=\frac{n-i}{n-1}$. Therefore, we obtain
$$g_V(v)\ \le\ \sum_{i=1}^n p_i\,g_i(p_i^*)=B_n(\mathbf{p}),\quad 0<v<1.$$
Thus, for $\alpha>1$ ($0<\alpha<1$), we have
$$1+(1-\alpha)H_\alpha(T_t^{1,n})=\int_0^1 g_V^\alpha(v)\,f_t^{\alpha-1}\left(S_t^{-1}(v)\right)\mathrm{d}v\ \le\ B_n^\alpha(\mathbf{p})\int_0^1 f_t^{\alpha-1}\left(S_t^{-1}(v)\right)\mathrm{d}v=B_n^\alpha(\mathbf{p})\left[(1-\alpha)H_\alpha(X_t)+1\right].$$
The last equality follows from (3); dividing by $1-\alpha$ (negative for $\alpha>1$, positive for $0<\alpha<1$) yields the desired result. □
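The key step of the proof is the pointwise bound $g_V(v)\le B_n(\mathbf{p})$. The sketch below (our own illustration, with a hypothetical 4-component signature rather than the one from Example 4 below) checks this bound on a grid:

```python
import numpy as np
from math import factorial

n, p = 4, (0.0, 0.5, 0.25, 0.25)  # hypothetical signature for illustration

def g(u, i):
    c = factorial(n) / (factorial(i - 1) * factorial(n - i))
    return c * (1 - u) ** (i - 1) * u ** (n - i)

# B_n(p) = sum_i p_i g_i(p_i*), with p_i* = (n-i)/(n-1) the mode of Beta(n-i+1, i).
B = sum(pi * g((n - i) / (n - 1), i) for i, pi in enumerate(p, start=1))

u = np.linspace(0.0, 1.0, 100_001)
gV = sum(pi * g(u, i) for i, pi in enumerate(p, start=1))
assert float(gV.max()) <= B + 1e-12  # the mixture density never exceeds B_n(p)
```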
The bounds given in (18) and (19) are very valuable when the number of components is large or the structure of the system is complicated. We now obtain a general lower bound using properties of the Tsallis information measure and Jensen's inequality.
Theorem 8.
Under the assumptions of Theorem 7, we have
$$H_\alpha(T_t^{1,n})\ \ge\ H_\alpha^L(T_t^{1,n}),$$
where $H_\alpha^L(T_t^{1,n})=\sum_{i=1}^n p_i\,H_\alpha(T_t^{1,i,n})$, for all $\alpha>0$.
Proof. 
Recalling Jensen's inequality for the function $x^\alpha$, which is convex for $\alpha>1$ and concave for $0<\alpha<1$, it holds for all $\alpha>1$ ($0<\alpha<1$) that
$$\left(\sum_{i=1}^n p_i\,f_{T_t^{1,i,n}}(x)\right)^{\!\alpha}\ \le(\ge)\ \sum_{i=1}^n p_i\,f_{T_t^{1,i,n}}^\alpha(x),\quad t>0,$$
and hence, we obtain
$$\int_0^\infty f_{T_t^{1,n}}^\alpha(x)\,\mathrm{d}x\ \le(\ge)\ \sum_{i=1}^n p_i\int_0^\infty f_{T_t^{1,i,n}}^\alpha(x)\,\mathrm{d}x.$$
Since $1-\alpha<0$ for $\alpha>1$ (and $1-\alpha>0$ for $0<\alpha<1$), multiplying both sides of (21) by $1/(1-\alpha)$ yields in either case
$$H_\alpha(T_t^{1,n})\ \ge\ \frac{1}{1-\alpha}\left[\sum_{i=1}^n p_i\int_0^\infty f_{T_t^{1,i,n}}^\alpha(x)\,\mathrm{d}x-1\right]=\frac{1}{1-\alpha}\left[\sum_{i=1}^n p_i\int_0^\infty f_{T_t^{1,i,n}}^\alpha(x)\,\mathrm{d}x-\sum_{i=1}^n p_i\right]=\sum_{i=1}^n p_i\,H_\alpha(T_t^{1,i,n}),$$
and this completes the proof. □
Notice that equality in (20) holds for i-out-of-n systems: if $p_j=0$ for $j\neq i$ and $p_i=1$, then $H_\alpha(T_t^{1,n})=H_\alpha(T_t^{1,i,n})$. When the lower bounds of Theorems 7 and 8 can both be computed (for $\alpha>1$, both are lower bounds), one may use the maximum of the two.
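The Theorem 8 bound is cheap to check numerically. The sketch below (our own illustration: uniform(0,1) components, a hypothetical signature, and the arbitrary choices $\alpha=2$, $t=0.3$) compares the system entropy with the signature-weighted mixture of i-out-of-n entropies:

```python
import numpy as np
from math import factorial

def trapz(y, x):
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

n, p, alpha, t = 4, (0.0, 0.5, 0.25, 0.25), 2.0, 0.3
u = np.linspace(0.0, 1.0, 200_001)

def g(i):
    c = factorial(n) / (factorial(i - 1) * factorial(n - i))
    return c * (1 - u) ** (i - 1) * u ** (n - i)

scale = (1 - t) ** (1 - alpha)  # f_t^{alpha-1}(S_t^{-1}(u)) for uniform components
gV = sum(pi * g(i) for i, pi in enumerate(p, start=1))
h_sys = (scale * trapz(gV ** alpha, u) - 1) / (1 - alpha)

# Signature-weighted mixture of the i-out-of-n residual entropies.
h_low = sum(pi * (scale * trapz(g(i) ** alpha, u) - 1) / (1 - alpha)
            for i, pi in enumerate(p, start=1))
assert h_low <= h_sys  # lower bound (20)
```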
Example 4.
Let $T_t^{1,5}=[T-t\,|\,X_{1:5}>t]$ represent the residual lifetime of a coherent system with signature $\mathbf{p}=(0,\frac{3}{10},\frac{5}{10},\frac{2}{10},0)$ consisting of $n=5$ independent and identically distributed component lifetimes, uniformly distributed on $[0,1]$. It is easy to verify that $B_5(\mathbf{p})=2.22$. Thus, by Theorem 7, the Tsallis entropy of $T_t^{1,5}$ is bounded for $\alpha>1$ ($0<\alpha<1$) as follows:
$$H_\alpha(T_t^{1,5})\ \ge\ \frac{(2.22)^\alpha(1-t)^{1-\alpha}-1}{1-\alpha},$$
for all $\alpha>1$, and
$$H_\alpha(T_t^{1,5})\ \le\ \frac{(2.22)^\alpha(1-t)^{1-\alpha}-1}{1-\alpha},$$
for $0<\alpha<1$. Moreover, the lower bound given in (20) can be obtained as follows:
$$H_\alpha(T_t^{1,5})\ \ge\ \frac{1}{1-\alpha}\left[(1-t)^{1-\alpha}\sum_{i=1}^5 p_i\int_0^1 g_i^\alpha(u)\,\mathrm{d}u-1\right],\quad t>0,$$
for all $\alpha>0$. Assuming uniformly distributed component lifetimes, we computed the exact value of $H_\alpha(T_t^{1,5})$ directly from (9), together with the Theorem 7 bound (dashed line) and the lower bound given by (22) (dotted line). The results are displayed in Figure 4. As we can see, for $\alpha>1$ the lower bound (22) (dotted line) is sharper than the lower bound of Theorem 7.

5. Conclusions

Intuitively, it is better to have systems that work longer and whose remaining lifetime is less uncertain; we can make more accurate predictions when a system has low uncertainty. The Tsallis entropy of a system is thus an important measure for system design. If we have some information about the lifetime of the system at time t, for example, that the system still functions at age t, then we may be interested in quantifying the predictability of the remaining lifetime. In this work, we presented an explicit expression for the Tsallis entropy of the system lifetime for the case where all components of the system are in operation at time t. Several properties of the proposed measure were discussed. In addition, partial stochastic orderings of the remaining lifetimes of two coherent systems were discussed in terms of their Tsallis entropy using the concept of the system signature. Numerous examples were given to illustrate the results.

Author Contributions

Conceptualization, M.K.; methodology, M.K.; software, M.A.A.; validation, M.A.A.; formal analysis, M.K.; investigation, M.A.A.; resources, M.A.A.; writing—original draft preparation, M.K.; writing—review and editing, M.A.A.; visualization, M.A.A.; supervision, M.A.A.; project administration, M.A.A.; funding acquisition, M.A.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Researchers Supporting Project number (RSP2023R392), King Saud University, Riyadh, Saudi Arabia.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

The authors acknowledge financial support from King Saud University. This work was supported by Researchers Supporting Project number (RSP2023R392), King Saud University, Riyadh, Saudi Arabia.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Ebrahimi, N.; Pellerey, F. New partial ordering of survival functions based on the notion of uncertainty. J. Appl. Probab. 1995, 32, 202–211.
2. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
3. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
4. Asadi, M.; Ebrahimi, N.; Soofi, E.S. Dynamic generalized information measures. Stat. Probab. Lett. 2005, 71, 85–98.
5. Nanda, A.K.; Paul, P. Some results on generalized residual entropy. Inf. Sci. 2006, 176, 27–47.
6. Zhang, Z. Uniform estimates on the Tsallis entropies. Lett. Math. Phys. 2007, 80, 171–181.
7. Irshad, M.R.; Maya, R.; Buono, F.; Longobardi, M. Kernel estimation of cumulative residual Tsallis entropy and its dynamic version under ρ-mixing dependent data. Entropy 2021, 24, 9.
8. Rajesh, G.; Sunoj, S. Some properties of cumulative Tsallis entropy of order alpha. Stat. Pap. 2019, 60, 583–593.
9. Toomaj, A.; Atabay, H.A. Some new findings on the cumulative residual Tsallis entropy. J. Comput. Appl. Math. 2022, 400, 113669.
10. Mohamed, M.S.; Barakat, H.M.; Alyami, S.A.; Abd Elgawad, M.A. Cumulative residual Tsallis entropy-based test of uniformity and some new findings. Mathematics 2022, 10, 771.
11. Maasoumi, E. The measurement and decomposition of multi-dimensional inequality. Econometrica 1986, 54, 991–997.
12. Abe, S. Axioms and uniqueness theorem for Tsallis entropy. Phys. Lett. A 2000, 271, 74–79.
13. Asadi, M.; Ebrahimi, N.; Soofi, E.S. Connections of Gini, Fisher, and Shannon by Bayes risk under proportional hazards. J. Appl. Probab. 2017, 54, 1027–1050.
14. Alomani, G.; Kayid, M. Further properties of Tsallis entropy and its application. Entropy 2023, 25, 199.
15. Toomaj, A.; Doostparast, M. A note on signature-based expressions for the entropy of mixed r-out-of-n systems. Nav. Res. Logist. 2014, 61, 202–206.
16. Toomaj, A. Renyi entropy properties of mixed systems. Commun. Stat.-Theory Methods 2017, 46, 906–916.
17. Toomaj, A.; Di Crescenzo, A.; Doostparast, M. Some results on information properties of coherent systems. Appl. Stoch. Model. Bus. Ind. 2018, 34, 128–143.
18. Baratpour, S.; Khammar, A. Tsallis entropy properties of order statistics and some stochastic comparisons. J. Stat. Res. Iran 2016, 13, 25–41.
19. Shaked, M.; Shanthikumar, J.G. Stochastic Orders; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2007.
20. Samaniego, F.J. System Signatures and Their Applications in Engineering Reliability; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2007; Volume 110.
21. Khaledi, B.E.; Shaked, M. Ordering conditional lifetimes of coherent systems. J. Stat. Plan. Inference 2007, 137, 1173–1184.
22. Ebrahimi, N.; Kirmani, S. Some results on ordering of survival functions through uncertainty. Stat. Probab. Lett. 1996, 29, 167–176.
23. Toomaj, A.; Chahkandi, M.; Balakrishnan, N. On the information properties of working used systems using dynamic signature. Appl. Stoch. Model. Bus. Ind. 2021, 37, 318–341.
Figure 1. The exact values of $H_\alpha(T_t^{1,4})$ with respect to t for the Weibull distribution, for $\alpha=0.2$ and $\alpha=2$ when $k>0$.
Figure 2. The exact values of $H_\alpha(T_t^{X,1,4})$ (blue) and $H_\alpha(T_t^{Y,1,4})$ (red) with respect to t, for $\alpha=0.2$ and $\alpha=2$.
Figure 3. Two coherent systems with likelihood-ratio ordered signatures.
Figure 4. Exact value of $H_\alpha(T_t^{1,5})$ (solid line), together with the corresponding lower bounds (18) (dashed line) and (22) (dotted line), for the standard uniform distribution with respect to time t.

