Article

Accelerating Update of Variable Precision Multigranulation Approximations While Adding Granular Structures

Changchun Li and Chengxiang Hu *
1 Information Engineering Institute, Chuzhou Polytechnic, Chuzhou 239000, China
2 School of Computer and Information Engineering, Chuzhou University, Chuzhou 239000, China
* Author to whom correspondence should be addressed.
Information 2022, 13(11), 541; https://doi.org/10.3390/info13110541
Submission received: 24 August 2022 / Revised: 23 October 2022 / Accepted: 8 November 2022 / Published: 15 November 2022

Abstract

In multigranulation environments, the variable precision multigranulation rough set (VPMGRS) is a useful framework that has a tolerance for errors. Approximations are basic concepts for knowledge acquisition and attribute reduction. Accelerating the update of approximations can enhance the efficiency of acquiring decision rules by utilizing previously saved information. In this study, we focus on the update mechanisms of approximations in VPMGRS when granular structures are added. By analyzing the basic changing trends of approximations in VPMGRS, we develop accelerated update mechanisms for acquiring approximations. In addition, an incremental algorithm for updating variable precision multigranulation approximations is proposed for the addition of multiple granular structures. Finally, extensive comparisons demonstrate the efficiency of the incremental algorithm.

1. Introduction

As a classical framework for representing and handling uncertain or vague data, rough set theory (RST) has been widely applied in diverse areas, from theoretical studies to applications such as rule induction, feature selection, and three-way decisions [1,2,3,4,5].
In real applications, various kinds of data vary over time. Owing to this dynamic change, naive methods for knowledge discovery often cost too much running time. Consequently, how to accelerate the computation of useful information effectively becomes an essential issue. To alleviate this problem, incremental knowledge discovery under various rough set models with varying information systems has gained continuous interest [6,7,8,9,10]. Usually, the objects [11,12], attributes [13,14], and attribute values [15] may change dynamically. Li et al. established a dynamic framework to obtain useful knowledge from missing data [16]. Yang et al. developed an efficient framework to acquire fuzzy probability three-way decision rules [17]. Chen et al. established an incremental model for handling noisy data while objects dynamically alter [18]. Furthermore, Niu et al. presented mechanisms for computing granular reducts and rules with newly available data [19]. Huang et al. established an effective strategy for computing reducts under time-evolving data [20]. Moreover, Hao et al. proposed an effective scale selection method for the addition of an object [21]. Li et al. introduced a unique approach to calculate dominated classes when a set of objects varies over time [22]. Luo et al. designed a matrix method under an evolving three-way decision framework [23]. To process neighborhood data, Zhang et al. put forward fast approaches for obtaining neighborhood-based approximations [24]. Luo et al. established an incremental set-valued rough framework [25]. In addition, Chen et al. solved decision rule updating problems when attribute values evolve [26]. In summary, these studies have attracted continuously growing interest.
Although RST has significant benefits in enhancing the performance of knowledge reasoning, several well-established soft computing frameworks have also been exploited for intelligent decision making [27,28,29]. As an effective extension of RST, the multigranulation rough set (MGRS) was proposed [30]. In MGRS, the approximations are established by a group of information granules derived from a set of equivalence relations, and there are two different models, called the optimistic and pessimistic MGRS, respectively. Since then, multigranulation rough sets have offered useful insights into data analysis, and the theory has progressed rapidly [31,32,33]. For example, Qian et al. developed a pessimistic MGRS-based decision model based on seeking common ground [34]. Zhan et al. developed two novel covering-based multigranulation fuzzy sets [35]. Zhang et al. developed a multi-attribute group decision mechanism in light of the multi-granularity three-way fuzzy context [36]. Li et al. investigated a method to construct three-way cognitive concepts in the presence of multi-granularity [37]. Dou et al. discussed two kinds of strategies for constructing the framework of VPMGRS [38]. In dynamic environments, the incremental update of multigranulation knowledge has attracted widespread concern [39,40,41,42,43]. For instance, Li et al. discussed local multigranulation rough frameworks to handle incrementally evolving ordered systems [44]. Zhang et al. presented updating mechanisms to acquire multigranulation knowledge in interval-valued approximate spaces [45]. The approximations of a concept, which can be further used for rule acquisition and attribute reduction, provide essential help in intelligence analysis [46,47]. Nevertheless, in variable precision multigranulation rough sets, how to update approximations dynamically and effectively has rarely been considered. Motivated by the above observations, we concentrate on an incremental approach to accelerate the updating of approximations in VPMGRS with the addition of granular structures.
The remainder of this study is structured as follows. Section 2 reviews essential knowledge of RST and VPMGRS. Section 3 proposes the accelerated update mechanisms and the algorithm for computing approximations in VPMGRS when adding multiple granular structures. Extensive comparisons are performed to validate the performance in Section 4. Section 5 concludes the paper with several remarks.

2. Preliminaries

This section briefly reviews some basic notions of RST and its extensions [33,34,38,48,49,50].

2.1. Rough Set and Variable Precision Rough Set

Definition 1
([49,50]). Let $S = (U, A, V, f)$ be an information system, where $U = \{x_1, x_2, \ldots, x_n\}$ is a non-empty finite set of objects, namely the universe; $A = \{a_1, a_2, \ldots, a_m\}$ is a set of attributes; $V = \bigcup_{a \in A} V_a$ is the set of attribute values, where $V_a$ denotes the domain of attribute a; and f is an information function such that $f(x, a) \in V_a$ for every $x \in U$ and $a \in A$.
Notably, an information system $S = (U, A, V, f)$ is abbreviated as $S = (U, A)$ in this study.
Definition 2
([49,50]). Given $S = (U, A)$ and $B \subseteq A$, an equivalence relation on U can be expressed by:
$R_B = \{(x, y) \in U \times U : f(x, a) = f(y, a), \forall a \in B\}$
Obviously, the pair $(U, R_B)$ is treated as an approximation space. The equivalence relation $R_B$ generates a partition of the universe, represented by $U/R_B = \{E_1, E_2, \ldots, E_m\}$. The equivalence class of x is denoted by $[x]_R = \{y \in U : (x, y) \in R_B\}$.
Definition 3
([49,50]). Given $S = (U, A)$ and $X \subseteq U$, let $[x]_R$ denote the equivalence class of x with respect to $R_B$. The lower and upper approximations of X are formalized by:
$\underline{R}(X) = \{x \in U \mid [x]_R \subseteq X\}$
$\overline{R}(X) = \{x \in U \mid [x]_R \cap X \ne \emptyset\}$
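To make the preliminaries concrete, the following minimal sketch, written in Java (the language used for the experiments in Section 4), shows one way to compute a partition and the approximations of Definitions 2 and 3; the class and method names are illustrative and not part of the authors' implementation.

import java.util.*;

// A minimal sketch of Definitions 2 and 3, assuming the information system is
// stored as an int matrix data[i][j] = f(x_i, a_j); all names are illustrative.
public class RoughSetSketch {

    // U/R_B: group object indices by their value vector restricted to the attribute set B.
    static Collection<List<Integer>> partition(int[][] data, int[] B) {
        Map<String, List<Integer>> blocks = new LinkedHashMap<>();
        for (int i = 0; i < data.length; i++) {
            StringBuilder key = new StringBuilder();
            for (int a : B) key.append(data[i][a]).append(',');
            blocks.computeIfAbsent(key.toString(), k -> new ArrayList<>()).add(i);
        }
        return blocks.values();
    }

    // Pawlak lower approximation: union of blocks E with E contained in X.
    static Set<Integer> lower(Collection<List<Integer>> part, Set<Integer> X) {
        Set<Integer> res = new TreeSet<>();
        for (List<Integer> E : part)
            if (X.containsAll(E)) res.addAll(E);
        return res;
    }

    // Pawlak upper approximation: union of blocks E with E meeting X.
    static Set<Integer> upper(Collection<List<Integer>> part, Set<Integer> X) {
        Set<Integer> res = new TreeSet<>();
        for (List<Integer> E : part)
            if (!Collections.disjoint(E, X)) res.addAll(E);
        return res;
    }
}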
Definition 4
([48]). Given $S = (U, A)$ and $X, Y \subseteq U$, the relative degree of misclassification $c(X, Y)$ is described by
$c(X, Y) = 1 - \frac{|X \cap Y|}{|X|}$ if $|X| > 0$, and $c(X, Y) = 0$ if $|X| = 0$,
where $|\cdot|$ denotes the cardinality of a set.
Definition 5
([48]). Given $S = (U, A)$ with $X \subseteq U$ and $0 \le \beta < 0.5$, the β-lower and β-upper approximations of X are defined by:
$\underline{R}_{\beta}(X) = \bigcup \{E \in U/R_B \mid c(E, X) \le \beta\}$
$\overline{R}_{\beta}(X) = \bigcup \{E \in U/R_B \mid c(E, X) < 1 - \beta\}$
Generally, the β-positive region $pos_{\beta}(X)$, β-boundary region $bnd_{\beta}(X)$, and β-negative region $neg_{\beta}(X)$ are expressed by: $pos_{\beta}(X) = \underline{R}_{\beta}(X)$, $bnd_{\beta}(X) = \overline{R}_{\beta}(X) - \underline{R}_{\beta}(X)$, and $neg_{\beta}(X) = U - \overline{R}_{\beta}(X)$.
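Continuing the sketch above, the relative degree of misclassification of Definition 4 and the β-approximations of Definition 5 reduce to a per-block computation; again, this is an illustration under the same assumptions, not the authors' code.

// Continuing the RoughSetSketch class above (illustrative).

// Relative degree of misclassification c(X, Y) of Definition 4.
static double c(Collection<Integer> X, Set<Integer> Y) {
    if (X.isEmpty()) return 0.0;                 // c(X, Y) = 0 when |X| = 0
    int inter = 0;
    for (int x : X) if (Y.contains(x)) inter++;  // |X ∩ Y|
    return 1.0 - (double) inter / X.size();      // 1 - |X ∩ Y| / |X|
}

// β-lower approximation: union of blocks E with c(E, X) ≤ β.
static Set<Integer> lowerBeta(Collection<List<Integer>> part, Set<Integer> X, double beta) {
    Set<Integer> res = new TreeSet<>();
    for (List<Integer> E : part)
        if (c(E, X) <= beta) res.addAll(E);
    return res;
}

// β-upper approximation: union of blocks E with c(E, X) < 1 - β.
static Set<Integer> upperBeta(Collection<List<Integer>> part, Set<Integer> X, double beta) {
    Set<Integer> res = new TreeSet<>();
    for (List<Integer> E : part)
        if (c(E, X) < 1.0 - beta) res.addAll(E);
    return res;
}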

2.2. Variable Precision Multigranulation Rough Sets

Qian and Liang investigated multigranulation rough sets [33,34]. In MGRS, two effective frameworks were established, namely the optimistic MGRS and the pessimistic MGRS.
Definition 6
([33]). Given $S = (U, A)$ with $a_1, a_2, \ldots, a_m \in A$ and $X \subseteq U$, the lower and upper approximations of X in optimistic MGRS are formalized by $\underline{\sum_{i=1}^{m} a_i}^{O}(X)$ and $\overline{\sum_{i=1}^{m} a_i}^{O}(X)$, where
$\underline{\sum_{i=1}^{m} a_i}^{O}(X) = \{x \in U \mid [x]_{a_1} \subseteq X \vee [x]_{a_2} \subseteq X \vee \cdots \vee [x]_{a_m} \subseteq X\}$
$\overline{\sum_{i=1}^{m} a_i}^{O}(X) = \sim \underline{\sum_{i=1}^{m} a_i}^{O}(\sim X)$
where $\sim X$ denotes the complement of X.
Theorem 1.
Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, and $X \subseteq U$, the following result holds:
$\overline{\sum_{i=1}^{m} a_i}^{O}(X) = \{x \in U \mid [x]_{a_1} \cap X \ne \emptyset \wedge [x]_{a_2} \cap X \ne \emptyset \wedge \cdots \wedge [x]_{a_m} \cap X \ne \emptyset\}$
Definition 7
([34]). Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, and $X \subseteq U$, the lower and upper approximations of X in pessimistic MGRS are formalized by $\underline{\sum_{i=1}^{m} a_i}^{P}(X)$ and $\overline{\sum_{i=1}^{m} a_i}^{P}(X)$, where
$\underline{\sum_{i=1}^{m} a_i}^{P}(X) = \{x \in U \mid [x]_{a_1} \subseteq X \wedge [x]_{a_2} \subseteq X \wedge \cdots \wedge [x]_{a_m} \subseteq X\}$
$\overline{\sum_{i=1}^{m} a_i}^{P}(X) = \sim \underline{\sum_{i=1}^{m} a_i}^{P}(\sim X)$
Theorem 2.
Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, and $X \subseteq U$, the following result holds:
$\overline{\sum_{i=1}^{m} a_i}^{P}(X) = \{x \in U \mid [x]_{a_1} \cap X \ne \emptyset \vee [x]_{a_2} \cap X \ne \emptyset \vee \cdots \vee [x]_{a_m} \cap X \ne \emptyset\}$
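The only difference between the optimistic and pessimistic models is the quantifier applied across granular structures, which the following hedged sketch makes explicit; the layout classes.get(i).get(x) = $[x]_{a_i}$ is an assumption for illustration, not the authors' data structure.

// Continuing the sketch: membership tests for Definitions 6 and 7.
// classes.get(i).get(x) holds [x]_{a_i} as a set of object indices.

// Optimistic lower: SOME granular structure has [x]_{a_i} ⊆ X.
static boolean inOptimisticLowerMGRS(List<List<Set<Integer>>> classes, int x, Set<Integer> X) {
    for (List<Set<Integer>> g : classes)
        if (X.containsAll(g.get(x))) return true;
    return false;
}

// Pessimistic lower: EVERY granular structure has [x]_{a_i} ⊆ X.
static boolean inPessimisticLowerMGRS(List<List<Set<Integer>>> classes, int x, Set<Integer> X) {
    for (List<Set<Integer>> g : classes)
        if (!X.containsAll(g.get(x))) return false;
    return true;
}

// By Theorems 1 and 2, the upper approximations swap the quantifier: the
// optimistic upper requires [x]_{a_i} ∩ X ≠ ∅ for EVERY i, the pessimistic for SOME i.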
In what follows, the optimistic VPMGRS is established by using multiple granular structures.
Definition 8
([38]). Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, $X \subseteq U$, and $0 \le \beta < 0.5$, the lower and upper approximations of X in optimistic VPMGRS are defined as $\underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$ and $\overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$, where
$\underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X) = \{x \in U \mid c([x]_{a_1}, X) \le \beta \vee c([x]_{a_2}, X) \le \beta \vee \cdots \vee c([x]_{a_m}, X) \le \beta\}$
$\overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X) = \sim \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(\sim X)$
Theorem 3.
Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, and $X \subseteq U$. Assume $0 \le \beta < 0.5$; then we have
$\overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X) = \{x \in U \mid c([x]_{a_1}, X) < 1 - \beta \wedge c([x]_{a_2}, X) < 1 - \beta \wedge \cdots \wedge c([x]_{a_m}, X) < 1 - \beta\}$
Proof. 
It is immediate from Definitions 4 and 8.    □
Similarly, in the following, we introduce pessimistic VPMGRS.
Definition 9
([38]). Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, and $X \subseteq U$. Assume $0 \le \beta < 0.5$; the lower and upper approximations of X in pessimistic VPMGRS are formalized by $\underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$ and $\overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$, where
$\underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X) = \{x \in U \mid c([x]_{a_1}, X) \le \beta \wedge c([x]_{a_2}, X) \le \beta \wedge \cdots \wedge c([x]_{a_m}, X) \le \beta\}$
$\overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X) = \sim \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(\sim X)$
where $\sim X$ is the complement of X.
Theorem 4.
Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, and $X \subseteq U$. Then, we have
$\overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X) = \{x \in U \mid c([x]_{a_1}, X) < 1 - \beta \vee c([x]_{a_2}, X) < 1 - \beta \vee \cdots \vee c([x]_{a_m}, X) < 1 - \beta\}$
Proof. 
It can be easily proved by Definitions 4 and 9.    □
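Taken together, Definitions 8 and 9 and Theorems 3 and 4 reduce all four VPMGRS approximations to threshold tests on the values $c([x]_{a_i}, X)$. The following sketch, which assumes a precomputed matrix misc[i][x] of these values, is essentially the core of the non-incremental computation; it is illustrative rather than the authors' implementation.

// Continuing the sketch: the four VPMGRS approximations from precomputed
// misc[i][x] = c([x]_{a_i}, X) (an assumed layout).
static void vpmgrsApproximations(double[][] misc, int n, double beta) {
    Set<Integer> optLower = new TreeSet<>(), optUpper = new TreeSet<>();
    Set<Integer> pesLower = new TreeSet<>(), pesUpper = new TreeSet<>();
    for (int x = 0; x < n; x++) {
        boolean someLeq = false, allLeq = true, allLt = true, someLt = false;
        for (double[] g : misc) {
            someLeq |= g[x] <= beta;        // some c ≤ β
            allLeq  &= g[x] <= beta;        // every c ≤ β
            allLt   &= g[x] < 1 - beta;     // every c < 1 - β
            someLt  |= g[x] < 1 - beta;     // some c < 1 - β
        }
        if (someLeq) optLower.add(x);       // optimistic lower (Definition 8)
        if (allLt)   optUpper.add(x);       // optimistic upper (Theorem 3)
        if (allLeq)  pesLower.add(x);       // pessimistic lower (Definition 9)
        if (someLt)  pesUpper.add(x);       // pessimistic upper (Theorem 4)
    }
    System.out.println(optLower + " " + optUpper + " " + pesLower + " " + pesUpper);
}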
Next, an illustrative example describes the process of computing approximations in VPMGRS.
Example 1.
Consider the information system for students' comprehensive quality evaluation outlined in Table 1, where $U = \{x_1, \ldots, x_8\}$ is a universe of eight objects, viewed as eight students, and $A = \{a_1, a_2, a_3, a_4\}$ is a multigranulation space consisting of four granular structures. These granular structures denote four courses, and the value of each granular structure indicates the grade level that a student achieves. Furthermore, d is a decision attribute. Assume $X = \{x_2, x_3, x_5, x_6\}$ is a target concept. Then, by Definitions 2 and 4, we obtain the partitions and relative degrees of misclassification with respect to each granular structure:
$U/a_1 = \{\{x_1, x_4, x_5\}, \{x_2, x_3, x_6\}, \{x_7, x_8\}\}$; $c([x_1]_{a_1}, X) = c([x_4]_{a_1}, X) = c([x_5]_{a_1}, X) = 2/3$, $c([x_2]_{a_1}, X) = c([x_3]_{a_1}, X) = c([x_6]_{a_1}, X) = 0$, $c([x_7]_{a_1}, X) = c([x_8]_{a_1}, X) = 1$.
$U/a_2 = \{\{x_1, x_3, x_4, x_5\}, \{x_2\}, \{x_6, x_7, x_8\}\}$; $c([x_1]_{a_2}, X) = c([x_3]_{a_2}, X) = c([x_4]_{a_2}, X) = c([x_5]_{a_2}, X) = 1/2$, $c([x_2]_{a_2}, X) = 0$, $c([x_6]_{a_2}, X) = c([x_7]_{a_2}, X) = c([x_8]_{a_2}, X) = 2/3$.
$U/a_3 = \{\{x_1, x_7\}, \{x_2, x_3\}, \{x_4, x_5, x_6, x_8\}\}$; $c([x_1]_{a_3}, X) = c([x_7]_{a_3}, X) = 1$, $c([x_2]_{a_3}, X) = c([x_3]_{a_3}, X) = 0$, $c([x_4]_{a_3}, X) = c([x_5]_{a_3}, X) = c([x_6]_{a_3}, X) = c([x_8]_{a_3}, X) = 1/2$.
$U/a_4 = \{\{x_1, x_4, x_5, x_6\}, \{x_2, x_3\}, \{x_7, x_8\}\}$; $c([x_1]_{a_4}, X) = c([x_4]_{a_4}, X) = c([x_5]_{a_4}, X) = c([x_6]_{a_4}, X) = 1/2$, $c([x_2]_{a_4}, X) = c([x_3]_{a_4}, X) = 0$, $c([x_7]_{a_4}, X) = c([x_8]_{a_4}, X) = 1$.
Assume $\beta = 0.3$. Based on Definitions 8 and 9 and Theorems 3 and 4, we obtain the approximations of VPMGRS as follows: $\underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X) = \{x_2, x_3, x_6\}$, $\overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X) = \{x_2, x_3, x_4, x_5, x_6\}$, $\underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X) = \{x_2\}$, and $\overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X) = \{x_1, x_2, x_3, x_4, x_5, x_6, x_7, x_8\}$.
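As a sanity check, feeding the c values of Example 1 into the sketch after Theorem 4 (with 0-based object indices, so printed index 1 corresponds to $x_2$) reproduces the four approximations; the matrix below is transcribed from the values computed in the example.

// Running the sketch on Example 1 with β = 0.3; rows are a_1..a_4, columns x_1..x_8.
// Expected output: [1, 2, 5] [1, 2, 3, 4, 5] [1] [0, 1, 2, 3, 4, 5, 6, 7],
// i.e., {x2, x3, x6}, {x2, ..., x6}, {x2}, and U, as in Example 1.
public static void main(String[] args) {
    double[][] misc = {
        {2.0/3, 0, 0, 2.0/3, 2.0/3, 0, 1, 1},            // c([x]_{a_1}, X)
        {0.5, 0, 0.5, 0.5, 0.5, 2.0/3, 2.0/3, 2.0/3},    // c([x]_{a_2}, X)
        {1, 0, 0, 0.5, 0.5, 0.5, 1, 0.5},                // c([x]_{a_3}, X)
        {0.5, 0, 0, 0.5, 0.5, 0.5, 1, 1}                 // c([x]_{a_4}, X)
    };
    vpmgrsApproximations(misc, 8, 0.3);
}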
From Example 1, for a fixed parameter β, the lower approximation of optimistic VPMGRS is larger than that of pessimistic VPMGRS, while the upper approximation of optimistic VPMGRS is smaller than that of pessimistic VPMGRS. These two strategies support decision analysis under different requirements. Based on the above discussion, we first present a traditional algorithm that calculates the approximations in VPMGRS directly from the above definitions.
We outline the procedure for non-incrementally calculating variable precision multigranulation approximations in Algorithm 1 (NACA). Obtaining the equivalence classes and relative degrees of misclassification in Lines 4–8 takes $O(mn^2)$, where m and n denote the numbers of granular structures and objects, respectively. Computing the approximations of optimistic VPMGRS based on Definition 8 in Lines 9–26 takes $O(2mn)$, and computing the approximations of pessimistic VPMGRS based on Definition 9 in Lines 27–44 takes $O(2mn)$. Outputting the approximations of VPMGRS in Line 45 takes $O(n)$. Accordingly, the overall complexity of Algorithm NACA is $O(mn^2)$.
Algorithm 1: A non-incremental algorithm for computing approximations in VPMGRS (NACA).

3. Accelerating Update of Approximations of VPMGRS

To reduce the computational cost, this section focuses on dynamic strategies for obtaining approximations when a single granular structure is added. With the traditional approach, the variable precision multigranulation approximations of the new data set are recomputed from scratch, which requires scanning all objects and granular structures and is therefore very time-consuming. To deal with this problem, a novel incremental approach is investigated that updates the variable precision multigranulation rough approximations by reusing previous computational results, thereby avoiding the recalculation over all objects and granular structures. Notably, the change of multiple granular structures is treated as the iterative change of a single granular structure. Assume $a_1, a_2, \ldots, a_m \in A$ are m granular structures in a multigranulation space. After adding a granular structure $a_{m+1}$, the approximations should be updated in an incremental manner. To facilitate the description, we denote the updated lower and upper approximations of X in optimistic VPMGRS by $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X)$ and $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X)$, respectively. Similarly, after adding a granular structure $a_{m+1}$, we denote the updated lower and upper approximations of X in pessimistic VPMGRS by $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X)$ and $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X)$, respectively.
Proposition 1.
Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, $0 \le \beta < 0.5$, and $X \subseteq U$. After adding a single granular structure $a_{m+1}$, the optimistic variable precision multigranulation approximations of X satisfy:
(1) $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) \supseteq \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$;
(2) $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) \subseteq \overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$.
Proof. 
(1) If $x \in \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$, then by Definition 8 there exists $i \in \{1, 2, \ldots, m\}$ such that $c([x]_{a_i}, X) \le \beta$. After adding $a_{m+1}$, such an $i \in \{1, 2, \ldots, m+1\}$ still exists. Therefore, $x \in \underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X)$, and hence $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) \supseteq \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$.
(2) After adding $a_{m+1}$, for $x \in \overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X)$, Theorem 3 implies $c([x]_{a_i}, X) < 1 - \beta$ for every $i \in \{1, 2, \ldots, m+1\}$, and thus for every $i \in \{1, 2, \ldots, m\}$. As a result, $x \in \overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$, and hence $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) \subseteq \overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$.    □
According to Proposition 1, the lower approximation of optimistic VPMGRS tends to grow, whereas the upper approximation of optimistic VPMGRS tends to shrink, as granular structures are added.
Proposition 2.
Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, $0 \le \beta < 0.5$, and $X \subseteq U$. After adding a single granular structure $a_{m+1}$, the pessimistic variable precision multigranulation approximations of X satisfy:
(1) $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) \subseteq \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$;
(2) $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) \supseteq \overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$.
Proof. 
(1) If $x \in \underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X)$, then by Definition 9 $c([x]_{a_i}, X) \le \beta$ holds for every $i \in \{1, 2, \ldots, m+1\}$, and thus for every $i \in \{1, 2, \ldots, m\}$. Therefore, $x \in \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$, which implies $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) \subseteq \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$.
(2) If $x \in \overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$, then by Theorem 4 there exists $i \in \{1, 2, \ldots, m\}$ such that $c([x]_{a_i}, X) < 1 - \beta$; such an $i$ also lies in $\{1, 2, \ldots, m+1\}$. Hence, $x \in \overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X)$, and consequently $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) \supseteq \overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$.    □
In light of Proposition 2, the pessimistic variable precision multigranulation upper approximation tends to grow, while the pessimistic variable precision multigranulation lower approximation tends to shrink.
In what follows, updating mechanisms for acquiring approximations of optimistic and pessimistic VPMGRS are demonstrated.
Theorem 5.
Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, $0 \le \beta < 0.5$, and $X \subseteq U$. When adding $a_{m+1}$, if $x \notin \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$, the following results hold:
(1) if $c([x]_{a_{m+1}}, X) \le \beta$, then $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) = \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X) \cup \{x\}$;
(2) otherwise, $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) = \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$.
Proof. 
(1) Since $x \notin \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$, Definition 8 implies $c([x]_{a_i}, X) > \beta$ for every $i \in \{1, 2, \ldots, m\}$. When adding $a_{m+1}$, if $c([x]_{a_{m+1}}, X) \le \beta$, then there exists $i \in \{1, 2, \ldots, m+1\}$ with $c([x]_{a_i}, X) \le \beta$, so $x \in \underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X)$. Hence, $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) = \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X) \cup \{x\}$.
(2) Since $x \notin \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$, we have $c([x]_{a_i}, X) > \beta$ for every $i \in \{1, 2, \ldots, m\}$ by Definition 8. When adding $a_{m+1}$, if $c([x]_{a_{m+1}}, X) > \beta$, then $c([x]_{a_i}, X) > \beta$ for every $i \in \{1, 2, \ldots, m+1\}$. Consequently, $x \notin \underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X)$, and thus $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) = \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$.    □
In light of Theorem 5, we can update the lower approximation of optimistic VPMGRS when a granular structure is added.
Example 2 (Continuation of Example 1).
Table 2 lists four candidate courses that may be added to the original multigranulation space. After adding a granular structure, the approximations used for decision analysis may change. To verify the incremental mechanism for updating approximations of VPMGRS, suppose the candidate granular structure $a_5$ shown in Table 2 is added to Table 1. Then, we can compute $U/a_5 = \{\{x_1, x_3, x_4\}, \{x_2, x_5\}, \{x_6, x_7, x_8\}\}$. Based on Definition 4, we obtain $c([x_1]_{a_5}, X) = c([x_3]_{a_5}, X) = c([x_4]_{a_5}, X) = 2/3$, $c([x_2]_{a_5}, X) = c([x_5]_{a_5}, X) = 0$, and $c([x_6]_{a_5}, X) = c([x_7]_{a_5}, X) = c([x_8]_{a_5}, X) = 2/3$. Because $x_5 \notin \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$ and $c([x_5]_{a_5}, X) = 0 < \beta = 0.3$, according to Theorem 5 we obtain $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) = \underline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X) \cup \{x_5\} = \{x_2, x_3, x_5, x_6\}$.
Theorem 6.
Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, $0 \le \beta < 0.5$, and $X \subseteq U$. When adding $a_{m+1}$ into the information system, if $x \in \overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$, the following results hold:
(1) if $c([x]_{a_{m+1}}, X) \ge 1 - \beta$, then $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) = \overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X) - \{x\}$;
(2) otherwise, $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) = \overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$.
Proof. 
(1) If $x \in \overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$, then by Theorem 3 $c([x]_{a_i}, X) < 1 - \beta$ for every $i \in \{1, 2, \ldots, m\}$. When adding $a_{m+1}$, if $c([x]_{a_{m+1}}, X) \ge 1 - \beta$, there exists $i \in \{1, 2, \ldots, m+1\}$ with $c([x]_{a_i}, X) \ge 1 - \beta$, so $x \notin \overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X)$. Hence, $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) = \overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X) - \{x\}$.
(2) If $x \in \overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$ and $c([x]_{a_{m+1}}, X) < 1 - \beta$, then $c([x]_{a_i}, X) < 1 - \beta$ for every $i \in \{1, 2, \ldots, m+1\}$. Thus, $x \in \overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X)$, and as a result $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) = \overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$.    □
Based on Theorem 6, we can effectively update the upper approximation of optimistic VPMGRS when a granular structure is added.
Example 3 (Continuation of Example 1).
Assume the granular structure $a_6$ in Table 2 is added to Table 1; then we obtain $U/a_6 = \{\{x_1, x_4\}, \{x_2, x_3, x_5, x_6\}, \{x_7, x_8\}\}$. By Definition 4, we have $c([x_1]_{a_6}, X) = c([x_4]_{a_6}, X) = 1$, $c([x_2]_{a_6}, X) = c([x_3]_{a_6}, X) = c([x_5]_{a_6}, X) = c([x_6]_{a_6}, X) = 0$, and $c([x_7]_{a_6}, X) = c([x_8]_{a_6}, X) = 1$. Because $x_4 \in \overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X)$ and $c([x_4]_{a_6}, X) = 1 \ge 1 - \beta = 0.7$, according to Theorem 6 it follows that $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{O}(X) = \overline{\sum_{i=1}^{m} a_i}_{\beta}^{O}(X) - \{x_4\} = \{x_2, x_3, x_5, x_6\}$.
Theorem 7.
Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, $X \subseteq U$, and $0 \le \beta < 0.5$. When adding $a_{m+1}$ into the information system, if $x \in \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$, we have:
(1) if $c([x]_{a_{m+1}}, X) > \beta$, then $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) = \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X) - \{x\}$;
(2) otherwise, $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) = \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$.
Proof. 
(1) If $x \in \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$, then by Definition 9 $c([x]_{a_i}, X) \le \beta$ holds for every $i \in \{1, 2, \ldots, m\}$. When adding $a_{m+1}$, if $c([x]_{a_{m+1}}, X) > \beta$, there exists $i \in \{1, 2, \ldots, m+1\}$ with $c([x]_{a_i}, X) > \beta$, so $x \notin \underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X)$. As a result, $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) = \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X) - \{x\}$.
(2) If $x \in \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$ and $c([x]_{a_{m+1}}, X) \le \beta$, then obviously $c([x]_{a_i}, X) \le \beta$ holds for every $i \in \{1, 2, \ldots, m+1\}$. Therefore, $x \in \underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X)$, and as a result $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) = \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$.    □
On the basis of Theorem 7, we can update the lower approximation of pessimistic VPMGRS when a granular structure is added.
Example 4 (Continuation of Example 1).
Assume the granular structure $a_7$ in Table 2 is added to Table 1; then we can compute $U/a_7 = \{\{x_1, x_2\}, \{x_3, x_5, x_6\}, \{x_4, x_7, x_8\}\}$. Based on Definition 4, we obtain $c([x_1]_{a_7}, X) = c([x_2]_{a_7}, X) = 1/2$, $c([x_3]_{a_7}, X) = c([x_5]_{a_7}, X) = c([x_6]_{a_7}, X) = 0$, and $c([x_4]_{a_7}, X) = c([x_7]_{a_7}, X) = c([x_8]_{a_7}, X) = 1$. Because $x_2 \in \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$ and $c([x_2]_{a_7}, X) = 1/2 > \beta = 0.3$, according to Theorem 7 we have $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) = \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X) - \{x_2\} = \emptyset$.
Theorem 8.
Given $S = (U, A)$, $a_1, a_2, \ldots, a_m \in A$, $X \subseteq U$, and $0 \le \beta < 0.5$. When adding $a_{m+1}$ into the information system, if $x \notin \overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$, we have:
(1) if $c([x]_{a_{m+1}}, X) < 1 - \beta$, then $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) = \overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X) \cup \{x\}$;
(2) otherwise, $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) = \overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$.
Proof. 
(1) If $x \notin \overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$, then by Theorem 4 $c([x]_{a_i}, X) \ge 1 - \beta$ for every $i \in \{1, 2, \ldots, m\}$. When adding $a_{m+1}$, if $c([x]_{a_{m+1}}, X) < 1 - \beta$, there exists $i \in \{1, 2, \ldots, m+1\}$ with $c([x]_{a_i}, X) < 1 - \beta$, so $x \in \overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X)$. Hence, $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) = \overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X) \cup \{x\}$.
(2) If $x \notin \overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$ and $c([x]_{a_{m+1}}, X) \ge 1 - \beta$, then $c([x]_{a_i}, X) \ge 1 - \beta$ for every $i \in \{1, 2, \ldots, m+1\}$. As a result, $x \notin \overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X)$, and therefore $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) = \overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$.    □
Based on Theorem 8, we can dynamically update the upper approximation of pessimistic VPMGRS when a granular structure is added.
Example 5 (Continuation of Example 1).
Assume the granular structure $a_8$ in Table 2 is added to Table 1; then we obtain $U/a_8 = \{\{x_1, x_2, x_3\}, \{x_4, x_6\}, \{x_5, x_7, x_8\}\}$. Based on Definition 4, we obtain $c([x_1]_{a_8}, X) = c([x_2]_{a_8}, X) = c([x_3]_{a_8}, X) = 1/3$, $c([x_4]_{a_8}, X) = c([x_6]_{a_8}, X) = 1/2$, and $c([x_5]_{a_8}, X) = c([x_7]_{a_8}, X) = c([x_8]_{a_8}, X) = 2/3$. For $x_2 \in U$, $c([x_2]_{a_8}, X) = 1/3 > \beta = 0.3$, so by Theorem 7 we obtain $\underline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) = \underline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X) - \{x_2\} = \emptyset$. Since every $x_i$ already belongs to $\overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X)$, there is no need to update the pessimistic upper approximation; according to Theorem 8, $\overline{\sum_{i=1}^{m+1} a_i}_{\beta}^{P}(X) = \overline{\sum_{i=1}^{m} a_i}_{\beta}^{P}(X) = \{x_1, x_2, x_3, x_4, x_5, x_6, x_7, x_8\}$.
In light of the proposed theorems, we present an incremental algorithm for efficiently obtaining approximations of VPMGRS when adding multiple granular structures.
We summarize the procedure for updating variable precision multigranulation approximations in Algorithm 2 (IAUA). Initializing the approximations in Lines 2–3 takes $O(n)$, where n is the number of objects. Calculating the equivalence classes and relative degrees of misclassification in Lines 4–8 takes $O(mn^2)$, where m is the number of added granular structures. Updating the lower approximation of optimistic VPMGRS by Theorem 5 in Lines 9–15 takes $O(mn)$; updating the upper approximation of optimistic VPMGRS by Theorem 6 in Lines 16–22 takes $O(mn)$; updating the lower approximation of pessimistic VPMGRS by Theorem 7 in Lines 23–29 takes $O(mn)$; and updating the upper approximation of pessimistic VPMGRS by Theorem 8 in Lines 30–36 takes $O(mn)$. Outputting the updated approximations of VPMGRS in Line 37 takes $O(n)$. Thus, the overall complexity of algorithm IAUA is $O(mn^2)$ with m counting only the added granular structures, which is lower than the cost of algorithm NACA, since NACA must recompute over all granular structures whenever new ones are added.
Algorithm 2: An incremental algorithm for updating approximations of VPMGRS while adding multiple granular structures (IAUA).
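The heart of IAUA is the per-object case analysis of Theorems 5–8. The following hedged sketch shows one incremental step for a single added granular structure $a_{m+1}$, assuming cNew[x] holds $c([x]_{a_{m+1}}, X)$ and that the four sets contain the previously saved approximations; it is an illustration of the update rules, not the authors' implementation.

// Continuing the sketch: one incremental step of IAUA (Theorems 5-8) for a
// single added granular structure; cNew[x] = c([x]_{a_{m+1}}, X) is assumed
// precomputed. The four sets are modified in place.
static void addGranularStructure(double[] cNew, double beta,
                                 Set<Integer> optLower, Set<Integer> optUpper,
                                 Set<Integer> pesLower, Set<Integer> pesUpper) {
    for (int x = 0; x < cNew.length; x++) {
        if (!optLower.contains(x) && cNew[x] <= beta)      // Theorem 5: x may enter
            optLower.add(x);
        if (optUpper.contains(x) && cNew[x] >= 1 - beta)   // Theorem 6: x may leave
            optUpper.remove(x);
        if (pesLower.contains(x) && cNew[x] > beta)        // Theorem 7: x may leave
            pesLower.remove(x);
        if (!pesUpper.contains(x) && cNew[x] < 1 - beta)   // Theorem 8: x may enter
            pesUpper.add(x);
    }
}

Adding multiple granular structures is then the iterative application of this step, one structure at a time, exactly as described at the beginning of Section 3.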

4. Experimental Analysis

This section evaluates the performance of the incremental algorithm IAUA through extensive experiments. Experimental data sets are downloaded from UCI (http://archive.ics.uci.edu/ml/datasets.html, accessed on 15 August 2022). A description of all data sets is listed in Table 3. The experiments are run on a PC with Windows 10, an Intel i5-1135G7 CPU @ 2.4 GHz, and 16 GB RAM, and the algorithms are implemented in Java.
In the subsequent sections, we evaluate the algorithms IAUA and NACA from three aspects. The first is to compare the efficiency of IAUA and NACA on different sizes of data sets. The second is to verify the efficiency of IAUA and NACA under different updating ratios of added granular structures. The last is to evaluate the influence of the parameter β on the computational time of IAUA and NACA. For the first two experiments, the parameter β is fixed at 0.3 when calculating the variable precision multigranulation approximations. Additionally, in our experiments, each attribute in a data set is regarded as a single granular structure.

4.1. Comparison between IAUA and NACA with Different Sizes of Universe

In this subsection, we test the algorithms IAUA and NACA on different sizes of the universe. To show their efficiency more intuitively, each data set is initially split into ten equal parts. The first test set consists of the first part, the second test set combines the first and second parts, and so on, until the tenth test set contains all ten parts. Regarding the partition of granular structures, the first 50% are treated as the original granular structures while the rest are treated as the added granular structures.
The computational times consumed by algorithms IAUA and NACA for acquiring the approximations of VPMGRS are depicted in Figure 1, where the x-axis denotes the size of the universe and the y-axis the time consumed for computing approximations. According to Figure 1, the running time of NACA increases dramatically, which indicates that NACA spends increasingly more computational time than IAUA. The computational time of IAUA remains below that of NACA because NACA computes the approximations of VPMGRS on the whole data set without using prior knowledge. On the contrary, IAUA updates the approximations by making use of the prior valuable knowledge and thus reduces the computational time.

4.2. Comparison between IAUA and NACA with Different Updating Ratios

In this subsection, we verify the incremental algorithm IAUA under different updating ratios of granular structures, with the number of objects fixed. We select 30% of all granular structures as the original part, and the remainder is regarded as the candidate part. The updating ratio is the ratio of the number of added granular structures to the number of candidate granular structures. We then add from 10% to 100% of the candidate granular structures in increments of 10%.
Figure 2 depicts the computational time of algorithms IAUA and NACA for updating approximations, where the x-axis denotes the updating ratio and the y-axis the time consumed for updating approximations. As observed from Figure 2, IAUA reduces the computational time of updating approximations on each data set in comparison with NACA. The main reason is that NACA has no updating mechanism to make full use of the prior useful information: when granular structures are added, NACA must be carried out from scratch to obtain the approximations of VPMGRS. Therefore, the incremental algorithm IAUA is more efficient for the dynamic maintenance of approximations of VPMGRS when adding granular structures.

4.3. Comparison between IAUA and NACA with Changing Values of the Parameter β

In this subsection, we elaborate on the influence of the parameter β on the computational efficiency of the algorithms IAUA and NACA. The value of β is varied from 0.1 to 0.45 in increments of 0.05. For algorithm NACA, we perform experiments on the whole universe of each data set from scratch. For algorithm IAUA, the granular structures are divided into two parts, namely the original granular structures (40% of the whole) and the added granular structures (the remainder). Furthermore, the size of the test data set is fixed to the whole universe when algorithm IAUA is carried out.
Figure 3 shows the time consumed by the algorithms IAUA and NACA on all data sets for changing values of β. In the two sub-figures of Figure 3, the x-axis denotes the value of β and the y-axis the computational time. Figure 3 clearly indicates that the time consumed by IAUA is consistently lower than that of NACA. For both algorithms, the computational times are stable across different values of β; that is, the time consumed for updating approximations in VPMGRS fluctuates only slightly with the variation of β.

5. Conclusions

In real applications, the granular structures in multigranulation environments evolve over time, and incremental techniques that make use of prior knowledge can efficiently maintain valuable knowledge in changing data contexts. Accordingly, this study exploited an efficient algorithm for the maintenance of approximations in VPMGRS after adding multiple granular structures. We developed dynamic mechanisms for obtaining approximations when granular structures are added, and investigated an incremental algorithm to enhance efficiency. Experimental results on publicly available data sets validated the feasibility of the proposed approach for updating approximations in VPMGRS. In an information system, the values of attributes may also change; accordingly, accelerated strategies for maintaining valuable knowledge that take the generalization of attribute values into account will be developed to further decrease the computational cost. Meanwhile, it is possible to apply the proposed algorithm to real-world uncertainty reasoning problems.

Author Contributions

Conceptualization, C.L.; Data curation, C.H.; Formal analysis, C.L. and C.H.; Funding acquisition, C.H.; Investigation, C.L. and C.H.; Methodology, C.L. and C.H.; Software, C.L. and C.H.; Supervision, C.H.; Validation, C.L. and C.H.; Writing—original draft, C.L. and C.H.; Writing—review and editing, C.H. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Anhui Provincial Natural Science Foundation (No. 2008085MF224), the Natural Science Foundation of the Educational Commission of Anhui Province (No. KJ2020A0992), the Provincial Quality Engineering Project of Anhui Province of China (No. 2021jxtd212), and the Excellent Young Talents Fund Program of Higher Education Institutions of Anhui Province of China (No. gxyq2022098).

Data Availability Statement

Data are available from the authors upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Hu, Q.H.; Liu, J.F.; Yu, D.R. Mixed feature selection based on granulation and approximation. Knowl. Based Syst. 2008, 21, 294–304.
2. Hu, X.M.; Yang, S.X.; Zhu, Y.R. Multiple-attribute decision making based on interval-valued intuitionistic fuzzy generalized weighted heronian mean. Information 2022, 13, 138.
3. Yao, Y.Y. Tri-level thinking: Models of three-way decision. Int. J. Mach. Learn. Cybern. 2020, 11, 947–959.
4. Liu, D.; Ye, X.Q. A matrix factorization based dynamic granularity recommendation with three-way decisions. Knowl. Based Syst. 2020, 191, 105243.
5. Kryszkiewicz, M. Rough set approach to incomplete information systems. Inf. Sci. 1998, 112, 39–49.
6. Huang, Q.Q.; Huang, Y.Y.; Li, T.R.; Yang, X. Dynamic three-way neighborhood decision model for multi-dimensional variation of incomplete hybrid data. Inf. Sci. 2022, 597, 358–391.
7. Shu, W.H.; Qian, W.B.; Xie, Y.H. Incremental neighborhood entropy-based feature selection for mixed-type data under the variation of feature set. Appl. Intell. 2022, 52, 4792–4806.
8. Ge, H.; Yang, C.J.; Xu, Y. Incremental updating three-way regions with variations of objects and attributes in incomplete neighborhood systems. Inf. Sci. 2022, 584, 479–502.
9. Chen, H.M.; Li, T.R.; Luo, C.; Horng, S.J.; Wang, G.Y. A decision-theoretic rough set approach for dynamic data mining. IEEE Trans. Fuzzy Syst. 2015, 23, 1958–1970.
10. Huang, Y.Y.; Guo, K.J.; Yi, X.W.; Li, Z.; Li, T.R. Matrix representation of the conditional entropy for incremental feature selection on multi-source data. Inf. Sci. 2022, 591, 263–286.
11. Lang, G.M.; Li, Q.G.; Cai, M.J.; Fujita, H.; Zhang, H.Y. Related families-based methods for updating reducts under dynamic object sets. Knowl. Inf. Syst. 2019, 60, 1081–1104.
12. Yang, L.; Qin, K.Y.; Sang, B.B.; Xu, W.H. Dynamic maintenance of variable precision fuzzy neighborhood three-way regions in interval-valued fuzzy decision system. Int. J. Mach. Learn. Cybern. 2022, 13, 1797–1818.
13. Liu, D.; Li, T.R.; Zhang, J.B. Incremental updating approximations in probabilistic rough sets under the variation of attributes. Knowl. Based Syst. 2015, 73, 81–96.
14. Hu, C.X.; Zhang, L. Incremental updating probabilistic neighborhood three-way regions with time-evolving attributes. Int. J. Approx. Reason. 2020, 120, 1–23.
15. Li, S.Y.; Li, T.R. Incremental update of approximations in dominance-based rough sets approach under the variation of attribute values. Inf. Sci. 2015, 294, 348–361.
16. Li, T.R.; Ruan, D.; Geert, W.; Song, J.; Xu, Y. A rough sets based characteristic relation approach for dynamic attribute generalization in data mining. Knowl. Based Syst. 2007, 20, 485–494.
17. Yang, X.; Liu, D.; Yang, X.B.; Liu, K.Y.; Li, T.R. Incremental fuzzy probability decision-theoretic approaches to dynamic three-way approximations. Inf. Sci. 2021, 550, 71–90.
18. Chen, H.M.; Li, T.R.; Ruan, D.; Lin, J.H.; Hu, C.X. A rough-set-based incremental approach for updating approximations under dynamic maintenance environments. IEEE Trans. Knowl. Data Eng. 2013, 25, 274–284.
19. Niu, J.J.; Chen, D.G.; Li, J.H.; Wang, H. A dynamic rule-based classification model via granular computing. Inf. Sci. 2022, 584, 325–341.
20. Huang, J.X.; Yu, P.Q.; Li, W.K. Updating the reduct in fuzzy β-covering via matrix approaches while adding and deleting some objects of the universe. Information 2020, 11, 3.
21. Hao, C.; Li, J.H.; Fan, M.; Liu, W.Q.; Tsang, E.C.C. Optimal scale selection in dynamic multi-scale decision tables based on sequential three-way decisions. Inf. Sci. 2017, 415, 213–232.
22. Li, S.Y.; Li, T.R.; Liu, D. Dynamic maintenance of approximations in dominance based rough set approach under the variation of the object set. Int. J. Intell. Syst. 2013, 28, 729–751.
23. Luo, C.; Li, T.R.; Zhang, Y.; Hamido, F. Matrix approach to decision-theoretic rough sets for evolving data. Knowl. Based Syst. 2016, 99, 123–134.
24. Zhang, J.B.; Li, T.R.; Ruan, D. Neighborhood rough sets for dynamic data mining. Int. J. Intell. Syst. 2012, 27, 317–342.
25. Luo, C.; Li, T.R.; Chen, H.M.; Lu, L. Fast algorithms for computing rough approximations in set-valued decision systems while updating criteria values. Inf. Sci. 2015, 299, 221–242.
26. Chen, H.M.; Li, T.R.; Qiao, S.J.; Ruan, D. A rough set based dynamic maintenance approach for approximations in coarsening and refining attribute values. Int. J. Intell. Syst. 2010, 25, 1005–1026.
27. Zadeh, L.A. Towards a theory of fuzzy information granulation and its centrality in human reasoning and fuzzy logic. Fuzzy Sets Syst. 1997, 90, 111–127.
28. Yao, Y. Three-way granular computing, rough sets, and formal concept analysis. Int. J. Approx. Reason. 2020, 116, 106–125.
29. Yao, Y.Y. The geometry of three-way decision. Appl. Intell. 2021, 51, 6298–6325.
30. Qian, Y.H.; Liang, J.Y.; Dang, C.Y. Incomplete multigranulation rough set. IEEE Trans. Syst. Man Cybern. Part A 2010, 40, 420–431.
31. Sun, B.Z.; Ma, W.M.; Chen, X.T. Variable precision multigranulation rough fuzzy set approach to multiple attribute group decision-making based on λ-similarity relation. Comput. Ind. Eng. 2019, 127, 326–343.
32. Zhang, C.; Li, D.Y.; Liang, J.Y.; Wang, B.L. MAGDM-oriented dual hesitant fuzzy multigranulation probabilistic models based on MULTIMOORA. Int. J. Mach. Learn. Cybern. 2021, 12, 1219–1241.
33. Qian, Y.H.; Liang, J.Y.; Yao, Y.Y.; Dang, C.Y. MGRS: A multi-granulation rough set. Inf. Sci. 2010, 180, 949–970.
34. Qian, Y.H.; Li, S.Y.; Liang, J.Y.; Shi, Z.Z.; Wang, F. Pessimistic rough set based decisions: A multigranulation fusion strategy. Inf. Sci. 2014, 264, 196–210.
35. Zhan, J.M.; Xu, W.H. Two types of coverings based multigranulation rough fuzzy sets and applications to decision making. Artif. Intell. Rev. 2020, 53, 167–198.
36. Zhang, C.; Ding, J.J.; Zhan, J.M.; Li, D.Y. Incomplete three-way multi-attribute group decision making based on adjustable multigranulation Pythagorean fuzzy probabilistic rough sets. Int. J. Approx. Reason. 2022, 147, 40–59.
37. Li, J.H.; Huang, C.C.; Qi, J.J.; Qian, Y.H.; Liu, W.Q. Three-way cognitive concept learning via multi-granularity. Inf. Sci. 2017, 378, 244–263.
38. Dou, H.L.; Yang, X.B.; Fan, J.Y.; Xu, S.P. The models of variable precision multigranulation rough sets. In Proceedings of the International Conference on Rough Sets and Knowledge Technology, Lecture Notes in Computer Science, Chengdu, China, 17–20 August 2012; Springer: Berlin/Heidelberg, Germany, 2012; Volume 7414, pp. 465–473.
39. Hu, C.X.; Zhang, L. A dynamic framework for updating neighborhood multigranulation approximations with the variation of objects. Inf. Sci. 2020, 519, 382–406.
40. Hu, C.X.; Zhang, L. Dynamic dominance-based multigranulation rough sets approaches with evolving ordered data. Int. J. Mach. Learn. Cybern. 2021, 12, 17–38.
41. Ju, H.R.; Yang, X.B.; Song, X.N.; Qi, Y.S. Dynamic updating multigranulation fuzzy rough set: Approximations and reducts. Int. J. Mach. Learn. Cybern. 2014, 5, 981–990.
42. Xu, W.H.; Yuan, K.H.; Li, W.T. Dynamic updating approximations of local generalized multigranulation neighborhood rough set. Appl. Intell. 2022, 52, 9148–9173.
43. Yang, X.B.; Qi, Y.S.; Yu, H.L.; Song, X.N.; Yang, J.Y. Updating multigranulation rough approximations with increasing of granular structures. Knowl. Based Syst. 2014, 64, 59–69.
44. Li, W.T.; Xu, W.H.; Zhang, X.Y.; Zhang, J. Updating approximations with dynamic objects based on local multigranulation rough sets in ordered information systems. Artif. Intell. Rev. 2022, 55, 1821–1855.
45. Zhang, X.Y.; Li, J.R.; Mi, J.S. Dynamic updating approximations approach to multi-granulation interval-valued hesitant fuzzy information systems with time-evolving attributes. Knowl. Based Syst. 2022, 238, 107809.
46. Mi, J.S.; Wu, W.Z.; Zhang, W.X. Approaches to knowledge reductions based on variable precision rough sets model. Inf. Sci. 2004, 159, 255–272.
47. Liu, D. The effectiveness of three-way classification with interpretable perspective. Inf. Sci. 2021, 567, 237–255.
48. Ziarko, W. Variable precision rough set model. J. Comput. Syst. Sci. 1993, 46, 39–59.
49. Pawlak, Z. Rough sets. Int. J. Comput. Inf. Sci. 1982, 11, 341–356.
50. Pawlak, Z.; Skowron, A. Rough sets: Some extensions. Inf. Sci. 2007, 177, 28–40.
Figure 1. A comparison of NACA and IAUA with different sizes of universe.
Figure 2. A comparison of NACA and IAUA with different updating ratios.
Figure 3. A comparison of NACA and IAUA with changing values of the parameter β.
Table 1. A description of the information system.

U     a1   a2   a3   a4   d
x1    1    1    1    1    1
x2    2    3    2    2    2
x3    2    1    2    2    2
x4    1    1    3    1    1
x5    1    1    3    1    2
x6    2    2    3    1    2
x7    3    2    1    3    1
x8    3    2    3    3    1
Table 2. A description of the added candidate granular structures.

U     a5   a6   a7   a8
x1    1    2    2    1
x2    2    1    2    1
x3    1    1    3    1
x4    1    2    1    2
x5    2    1    3    3
x6    3    1    3    2
x7    3    3    1    3
x8    3    3    1    3
Table 3. The description of data sets.

ID   Data Set       # Objects   # Attributes   # Classes
1    Sonar          208         60             2
2    SPECTF         267         44             2
3    Ionosphere     351         34             2
4    Libras         360         90             15
5    Dermatology    366         33             6
6    Wdbc           569         30             2
7    Diabetic       1151        19             2
8    Segmentation   2310        19             7
