On the Depth of Decision Trees with Hypotheses
Abstract
1. Introduction
- (i) Only attributes are used in decision trees;
- (ii) Only hypotheses are used in decision trees;
- (iii) Both attributes and hypotheses are used in decision trees.
2. Basic Notions
A decision tree over a problem z described by attributes f_1, …, f_n is a finite directed tree with a root in which:
- Each terminal node is labeled with an n-tuple from the set {0, 1}^n;
- Each node that is not terminal (such nodes are called working) is labeled with an attribute from the set {f_1, …, f_n} or with a hypothesis over z;
- If a working node is labeled with an attribute f_i from {f_1, …, f_n}, then two edges leave this node, labeled with the systems of equations {f_i = 0} and {f_i = 1}, respectively;
- If a working node is labeled with a hypothesis H = {f_1 = δ_1, …, f_n = δ_n} over z, then n + 1 edges leave this node: one is labeled with the hypothesis H itself, and the other n are labeled with the counterexamples {f_1 = ¬δ_1}, …, {f_n = ¬δ_n}, respectively (the two query types are simulated in the sketch after this list).
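The two query types can be made concrete with a small simulation. The sketch below is illustrative rather than code from the paper; the names Oracle, query_attribute, and query_hypothesis are assumptions. An attribute query returns one attribute value; a hypothesis query either confirms a proposed tuple or returns a counterexample that fixes one attribute to the opposite value.

```python
# A minimal sketch (not code from the paper) of the two query types for
# binary attributes. Oracle, query_attribute, query_hypothesis are
# hypothetical names.

class Oracle:
    """Answers queries about a hidden element's attribute tuple."""

    def __init__(self, hidden):
        self.hidden = hidden  # n-tuple of 0/1 values (f_1(a), ..., f_n(a))

    def query_attribute(self, i):
        # Attribute node: the computation follows the edge labeled
        # {f_i = 0} or {f_i = 1}.
        return self.hidden[i]

    def query_hypothesis(self, h):
        # Hypothesis node: either the hypothesis h is confirmed, or a
        # counterexample {f_i = 1 - h_i} comes back for some coordinate i
        # where h is wrong (here, the first such coordinate).
        for i, (guess, actual) in enumerate(zip(h, self.hidden)):
            if guess != actual:
                return ("counterexample", i, actual)
        return ("true",)
```

For example, Oracle((0, 1, 0)).query_hypothesis((0, 0, 0)) returns ("counterexample", 1, 1), which corresponds to following the edge labeled {f_2 = 1}.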
3. Main Results
- (a) If , then ;
- (b) If , then for any .
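A classical system where attribute-only decision trees stay logarithmic is the family of threshold attributes f_i(x) = [x ≥ c_i] on the real line: among the 2^n possible tuples, only the n + 1 monotone ones are consistent, and binary search finds the right one. The sketch below illustrates this behavior under these assumptions; it is not a construction from the paper.

```python
import math

def identify_threshold_tuple(a, thresholds):
    """For sorted thresholds c_1 < ... < c_n and attributes f_i(x) = [x >= c_i],
    only the n + 1 monotone tuples 1...10...0 are consistent, so binary
    search identifies the tuple of a with ceil(log2(n + 1)) attribute
    queries instead of n."""
    lo, hi = 0, len(thresholds)  # k = number of thresholds that a reaches
    queries = 0
    while lo < hi:
        mid = (lo + hi) // 2
        queries += 1             # one attribute query: is a >= thresholds[mid]?
        if a >= thresholds[mid]:
            lo = mid + 1
        else:
            hi = mid
    assert queries <= math.ceil(math.log2(len(thresholds) + 1))
    return tuple(1 if i < lo else 0 for i in range(len(thresholds)))
```

In contrast, for a system with n independent attributes all 2^n tuples are consistent, and no strategy can avoid querying every attribute (see the adversary sketch after the complete table at the end).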
- (a) If , then and ;
- (b) If , then , , and ;
- (c) If , then and for any .
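For trees that may use hypotheses, the standard halving idea from exact learning with equivalence queries (cf. the cited papers of Angluin and Littlestone) shows how logarithmic depth can arise: propose the coordinatewise majority of the remaining candidate tuples, so that any counterexample discards at least half of them. The sketch below is illustrative, assumes the hidden tuple is among the candidates, and reuses the hypothetical Oracle interface from Section 2.

```python
def identify_by_hypotheses(candidates, oracle):
    """Halving with hypothesis queries only: propose the coordinatewise
    majority tuple of the remaining candidates. A counterexample at
    coordinate i eliminates every candidate that agrees with the proposal
    there, i.e., at least half of them, so at most log2(len(candidates)) + 1
    queries are used. The proposed tuple need not itself be consistent,
    which matches hypotheses that are not required to be 'proper'."""
    candidates = set(candidates)
    n = len(next(iter(candidates)))
    while True:
        h = tuple(int(2 * sum(c[i] for c in candidates) >= len(candidates))
                  for i in range(n))
        answer = oracle.query_hypothesis(h)
        if answer[0] == "true":
            return h
        _, i, actual = answer
        # Keep only candidates matching the revealed value; the rest agreed
        # with the majority value h[i] and are eliminated together.
        candidates = {c for c in candidates if c[i] == actual}
```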
4. Proof of Theorem 2
- Each terminal node is not labeled;
- Each nonterminal node is labeled with an attribute f_i from F; two edges leave this node, labeled with the systems of equations {f_i = 0} and {f_i = 1}, respectively;
- The length of each complete path (a path from the root to a terminal node) is equal to d;
- For each complete path, the system of equations obtained as the union of the systems assigned to its edges is consistent (a brute-force existence check for such trees is sketched below).
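Such trees can be searched for exhaustively on a finite fragment of an information system. The following brute-force check is a sketch under the stated assumptions (elements given as dictionaries from attributes to values); it is not the construction used in the proof.

```python
def exists_consistent_tree(elements, attrs, d, path=()):
    """Brute-force check on a finite fragment of a system: does a depth-d
    tree exist in which every nonterminal node queries an attribute, every
    complete path has length d, and the union of the equations along each
    complete path is consistent, i.e., satisfied by at least one element?
    `elements` is a list of dicts mapping attribute -> 0/1."""
    # Elements satisfying the equations collected along the current path.
    live = [e for e in elements if all(e[f] == v for f, v in path)]
    if not live:
        return False   # the equation system of this path is inconsistent
    if d == 0:
        return True    # a consistent complete path of the required length
    # Some attribute must keep both of its extensions consistent.
    return any(exists_consistent_tree(elements, attrs, d - 1, path + ((f, 0),))
               and exists_consistent_tree(elements, attrs, d - 1, path + ((f, 1),))
               for f in attrs)
```

For instance, with three attributes whose eight value combinations all occur, as in the complete table at the end of the paper, a tree of depth 3 with all complete paths consistent exists.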
- (a) ;
- (b) .
5. Proof of Theorem 3
6. Conclusions
References
- Angluin, D. Queries and concept learning. Mach. Learn. 1988, 2, 319–342.
- Pawlak, Z. Rough sets. Int. J. Parallel Program. 1982, 11, 341–356.
- Pawlak, Z. Rough Sets—Theoretical Aspects of Reasoning about Data; Theory and Decision Library: Series D; Kluwer: Dordrecht, The Netherlands, 1991; Volume 9.
- Pawlak, Z.; Skowron, A. Rudiments of rough sets. Inf. Sci. 2007, 177, 3–27.
- Chegis, I.A.; Yablonskii, S.V. Logical methods of control of work of electric schemes. Trudy Mat. Inst. Steklov 1958, 51, 270–360. (In Russian)
- Azad, M.; Chikalov, I.; Hussain, S.; Moshkov, M. Minimizing depth of decision trees with hypotheses. In Proceedings of the International Joint Conference on Rough Sets (IJCRS 2021), Bratislava, Slovakia, 19–24 September 2021; Ramanna, S., Cornelis, C., Ciucci, D., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2021; Volume 12872, pp. 123–133.
- Azad, M.; Chikalov, I.; Hussain, S.; Moshkov, M. Minimizing number of nodes in decision trees with hypotheses. In Proceedings of the 25th International Conference on Knowledge-Based and Intelligent Information & Engineering Systems (KES 2021), Szczecin, Poland, 8–10 September 2021; Watrobski, J., Salabun, W., Toro, C., Zanni-Merk, C., Howlett, R.J., Jain, L.C., Eds.; Elsevier: Amsterdam, The Netherlands, 2021; Volume 192, pp. 232–240.
- Azad, M.; Chikalov, I.; Hussain, S.; Moshkov, M. Sorting by decision trees with hypotheses (extended abstract). In Proceedings of the 29th International Workshop on Concurrency, Specification and Programming (CS&P 2021), Berlin, Germany, 27–28 September 2021; Schlingloff, H., Vogel, T., Eds.; CEUR Workshop Proceedings; CEUR-WS.org: Aachen, Germany, 2021; Volume 2951, pp. 126–130.
- Azad, M.; Chikalov, I.; Hussain, S.; Moshkov, M. Optimization of decision trees with hypotheses for knowledge representation. Electronics 2021, 10, 1580.
- Azad, M.; Chikalov, I.; Hussain, S.; Moshkov, M. Entropy-based greedy algorithm for decision trees using hypotheses. Entropy 2021, 23, 808.
- Angluin, D. Queries revisited. Theor. Comput. Sci. 2004, 313, 175–194.
- Littlestone, N. Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm. Mach. Learn. 1988, 2, 285–318.
- Maass, W.; Turán, G. Lower bound methods and separation results for on-line learning models. Mach. Learn. 1992, 9, 107–145.
- Moshkov, M. Conditional tests. In Problemy Kibernetiki; Yablonskii, S.V., Ed.; Nauka Publishers: Moscow, Russia, 1983; Volume 40, pp. 131–170. (In Russian)
- Moshkov, M. On depth of conditional tests for tables from closed classes. In Combinatorial-Algebraic and Probabilistic Methods of Discrete Analysis; Markov, A.A., Ed.; Gorky University Press: Gorky, Russia, 1989; pp. 78–86. (In Russian)
- Moshkov, M. Time complexity of decision trees. In Transactions on Rough Sets III; Peters, J.F., Skowron, A., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2005; Volume 3400, pp. 244–459.
- Moshkov, M. Test theory and problems of machine learning. In Proceedings of the International School-Seminar on Discrete Mathematics and Mathematical Cybernetics, Ratmino, Russia, 31 May–3 June 2001; MAX Press: Moscow, Russia, 2001; pp. 6–10.
- Pawlak, Z. Information systems theoretical foundations. Inf. Syst. 1981, 6, 205–218.
- Naiman, D.Q.; Wynn, H.P. Independence number and the complexity of families of sets. Discr. Math. 1996, 154, 203–216.
- Sauer, N. On the density of families of sets. J. Comb. Theory Ser. A 1972, 13, 145–147.
- Shelah, S. A combinatorial problem; stability and order for models and theories in infinitary languages. Pac. J. Math. 1972, 41, 241–261.
| Row | f_1 | f_2 | f_3 |
|-----|-----|-----|-----|
| 1 | 0 | 0 | 0 |
| 2 | 0 | 1 | 0 |
| 3 | 0 | 1 | 1 |
| 4 | 1 | 1 | 0 |
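Reading the four rows above as the consistent tuples of a problem with three binary attributes, a single hypothesis query suffices to identify the row, since each row differs from (0, 1, 0) in at most one coordinate. The following hedged sketch uses the hypothetical Oracle from Section 2.

```python
# Illustrative: one hypothesis query identifies the hidden row, because
# every counterexample to (0, 1, 0) leaves exactly one candidate here.
ROWS = {(0, 0, 0), (0, 1, 0), (0, 1, 1), (1, 1, 0)}

def solve(oracle):
    answer = oracle.query_hypothesis((0, 1, 0))
    if answer[0] == "true":
        return (0, 1, 0)
    _, i, actual = answer
    # Exactly one row of this table carries the revealed value at i.
    (row,) = {r for r in ROWS if r[i] == actual}
    return row
```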
(Table with rows (0, 0, 0), (0, 1, 0), (0, 1, 1), (1, 1, 0); its further columns are not recoverable.)
| Row | f_1 | f_2 | f_3 |
|-----|-----|-----|-----|
| 1 | 0 | 0 | 0 |
| 2 | 0 | 1 | 0 |
| 3 | 0 | 1 | 1 |
| 4 | 1 | 1 | 0 |
| 5 | 1 | 0 | 0 |
| 6 | 0 | 0 | 1 |
| 7 | 1 | 0 | 1 |
| 8 | 1 | 1 | 1 |
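This table is complete: all eight tuples of {0, 1}^3 occur, so the three attributes are independent. In that situation no decision tree, even one mixing attribute and hypothesis queries, can have depth below 3, because an adversary can answer every query while revealing only one coordinate. The class below sketches this standard adversary argument; it is an illustration, not the paper's proof.

```python
class Adversary:
    """Adversary for a complete table (all tuples of {0, 1}^n occur, as in
    the 8-row table above with n = 3): every query reveals exactly one
    coordinate, so any strategy needs n queries. Names are illustrative."""

    def __init__(self, n):
        self.n = n
        self.fixed = {}  # coordinates committed so far

    def query_attribute(self, i):
        # Commit coordinate i only when asked (0 by convention if free).
        return self.fixed.setdefault(i, 0)

    def query_hypothesis(self, h):
        # While a free coordinate exists, answer with a counterexample on
        # it: the solver learns just that one coordinate.
        for i in range(self.n):
            if i not in self.fixed:
                self.fixed[i] = 1 - h[i]
                return ("counterexample", i, self.fixed[i])
        # All coordinates committed: compare h with the committed tuple.
        for i in range(self.n):
            if h[i] != self.fixed[i]:
                return ("counterexample", i, self.fixed[i])
        return ("true",)
```

After k queries at most k coordinates are committed, so for k < 3 at least two rows of the table remain consistent with all answers given so far, and no terminal tuple can yet be correct.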
© 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).