A Neural Probabilistic Graphical Model for Learning and Decision Making in Evolving Structured Environments
Abstract
1. Introduction
- the world is not stationary; rather, it changes over time;
- a learning machine may be applied under different conditions [4].
2. Materials and Methods
3. Results
- The hidden-to-output connection weights $c_1, \ldots, c_n$ of the RBF. In order to ensure the satisfaction of the axioms of probability, these weights must range over $[0,1]$ and sum to one. This is guaranteed by introducing $n$ unconstrained variables $\gamma_1, \ldots, \gamma_n$ such that, e.g., $c_i = \exp(\gamma_i) / \sum_{j=1}^{n} \exp(\gamma_j)$, a softmax-like reparameterization (see the first sketch after this list).
- The mean vector $\boldsymbol{\mu}_i$ and covariance matrix $\boldsymbol{\Sigma}_i$ of the generic $i$-th Gaussian kernel in the RBF. For each component of the encoding space, ML parametric estimation of Gaussian mixture models [11] is applied in order to compute the quantities $\boldsymbol{\mu}_i$ and $\boldsymbol{\Sigma}_i$ (see the second sketch after this list).
- The parameters of the encoding network. For each weight $v$ of the encoding network, the quantity $\partial C / \partial v$ (writing $C$ for the training criterion) is obtained via the chain rule as $\frac{\partial C}{\partial v} = \frac{\partial C}{\partial y} \frac{\partial y}{\partial v}$, where $y$ represents the output from the neuron that is fed from $v$. The partial derivative $\partial y / \partial v$ is computed as usual. As for $\partial C / \partial y$, the backpropagation through structures (BPTS) algorithm can be applied [18,19]. First, computation of the derivative is straightforward for the connection weights $v$ between hidden and output neurons. This yields the initialization of the $\delta$'s to be backpropagated via BPTS. Then, let us turn our attention to the hidden weights $v = w_{\ell m}$, where $\ell$ and $m$ are the hidden neurons connected by $v$. Relying on the aforementioned initialization of the $\delta$'s, the derivative $\partial C / \partial v$ is finally obtained via standard BPTS (a numerical sketch of the whole procedure is given after this list).
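The reparameterization in the first item above can be made concrete in a few lines of NumPy. The softmax mapping below is a minimal sketch of one standard way to realize the constraint; the names `mixing_weights` and `gamma` are illustrative and not taken from the paper.

```python
import numpy as np

def mixing_weights(gamma):
    """Map n unconstrained variables gamma_1..gamma_n to RBF output
    weights c_1..c_n that are positive and sum to one (softmax)."""
    g = gamma - gamma.max()   # subtract the max for numerical stability
    e = np.exp(g)
    return e / e.sum()

gamma = np.array([0.3, -1.2, 2.0])   # unconstrained, freely optimized
c = mixing_weights(gamma)
assert np.isclose(c.sum(), 1.0) and (c > 0).all()
```

Since the $\gamma_i$'s are unconstrained, any gradient-based optimizer can update them directly while the induced $c_i$'s automatically remain a valid set of mixing weights.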
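Likewise, the ML estimation of the kernel parameters in the second item can be sketched with scikit-learn's `GaussianMixture`, which fits a Gaussian mixture by expectation-maximization. The data matrix `X` below is synthetic and merely stands in for the collected vertex encodings; component counts and shapes are illustrative.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# X: one row per encoded vertex (synthetic stand-in data).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))

# ML estimation of an n-component Gaussian mixture via EM; the fitted
# means_ and covariances_ play the roles of mu_i and Sigma_i for the
# i-th Gaussian kernel of the RBF.
gmm = GaussianMixture(n_components=3, covariance_type="full").fit(X)
mu = gmm.means_            # shape (3, 4): one mean vector per kernel
Sigma = gmm.covariances_   # shape (3, 4, 4): one covariance per kernel
```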
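Finally, here is a minimal numerical sketch of BPTS over a toy tree-structured input, assuming a tanh encoding network with shared weights and a squared-error criterion standing in for the paper's likelihood-based criterion $C$; all names and shapes are illustrative. As in the text, the straightforward hidden-to-output derivative initializes the $\delta$'s, which are then backpropagated through the structure to obtain the gradients of the shared hidden weights.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h, d_out = 3, 5, 2

# Shared weights of the encoding network (illustrative shapes).
Wx = rng.normal(scale=0.1, size=(d_h, d_in))   # vertex label -> hidden
Wh = rng.normal(scale=0.1, size=(d_h, d_h))    # child state -> hidden
V  = rng.normal(scale=0.1, size=(d_out, d_h))  # hidden -> output

class Node:
    def __init__(self, x, children=()):
        self.x, self.children = x, list(children)

def forward(node):
    """Bottom-up pass: encode the structure recursively (stores node.h)."""
    s = Wx @ node.x + sum((Wh @ forward(c) for c in node.children),
                          np.zeros(d_h))
    node.h = np.tanh(s)
    return node.h

def backward(node, delta, grads):
    """Top-down BPTS pass; delta = dC/ds at this node's pre-activation."""
    grads["Wx"] += np.outer(delta, node.x)
    for c in node.children:
        grads["Wh"] += np.outer(delta, c.h)
        backward(c, (Wh.T @ delta) * (1.0 - c.h ** 2), grads)

# Toy structure (a root with two leaves) and a toy target output.
root = Node(rng.normal(size=d_in),
            [Node(rng.normal(size=d_in)), Node(rng.normal(size=d_in))])
y = V @ forward(root)
err = y - rng.normal(size=d_out)      # dC/dy for C = 0.5 * ||y - t||^2
# The hidden-to-output gradient is immediate and initializes the deltas:
grads = {"V": np.outer(err, root.h),
         "Wx": np.zeros_like(Wx), "Wh": np.zeros_like(Wh)}
backward(root, (V.T @ err) * (1.0 - root.h ** 2), grads)
```

Because the weights are shared across vertices, the gradients are accumulated over all nodes of the structure, exactly as in backpropagation through time but over a tree rather than a chain.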
4. Summary and Conclusions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
AI | artificial intelligence
HMM | hidden Markov model
pdf | probability density function
RBF | radial basis function
RNN | recursive neural network
RBF-RNN | probabilistic radial basis function variant of recursive neural networks
RE | random environment
-HMM | HMM over environments
ML | maximum likelihood
BPTS | backpropagation through structures
References
- Selman, B.; Brooks, R.A.; Dean, T.; Horvitz, E.; Mitchell, T.M.; Nilsson, N.J. Challenge problems for artificial intelligence. In Proceedings of the Thirteenth National Conference on Artificial Intelligence AAAI’96, Portland, OR, USA, 4–8 August 1996; Volume 2, pp. 1340–1345.
- Benuskova, L.; Kasabov, N. Computational Neurogenetic Modeling, 1st ed.; Springer: Berlin/Heidelberg, Germany, 2007.
- Angelov, P.; Filev, D.P.; Kasabov, N. Evolving Intelligent Systems: Methodology and Applications; Wiley-IEEE Press: Hoboken, NJ, USA, 2010.
- Shafiullah, N.M.; Pinto, L. One After Another: Learning Incremental Skills for a Changing World. arXiv 2022, arXiv:2203.11176.
- Sayed Mouchaweh, M. Fault Diagnosis of Hybrid Dynamic and Complex Systems; Springer: Berlin/Heidelberg, Germany, 2018.
- Overpeck, J.; Meehl, G.; Bony, S.; Easterling, D. Climate data challenges in the 21st century. Science 2011, 331, 700–702.
- Cont, R. Statistical Modeling of High-Frequency Financial Data. IEEE Signal Process. Mag. 2011, 28, 16–25.
- Mohammadi, M.; Yazdani, S.; Khanmohammadi, M.H.; Maham, K. Financial Reporting Fraud Detection: An Analysis of Data Mining Algorithms. Int. J. Financ. Manag. Account. 2020, 4, 1–12.
- Dada, E.G.; Bassi, J.S.; Chiroma, H.; Abdulhamid, S.M.; Adetunmbi, A.O.; Ajibuwa, O.E. Machine learning for email spam filtering: Review, approaches and open research problems. Heliyon 2019, 5, 180–192.
- Polikar, R.; Alippi, C. Guest Editorial Learning in Nonstationary and Evolving Environments. IEEE Trans. Neural Netw. Learn. Syst. 2014, 25, 9–11.
- Bishop, C.M. Pattern Recognition and Machine Learning; Springer: New York, NY, USA, 2006.
- Salem, F.M. Recurrent Neural Networks—From Simple to Gated Architectures, 1st ed.; Springer: Berlin/Heidelberg, Germany, 2022.
- Freno, A.; Trentin, E. Hybrid Random Fields—A Scalable Approach to Structure and Parameter Learning in Probabilistic Graphical Models; Springer: Berlin/Heidelberg, Germany, 2011; Volume 15.
- Trentin, E.; Lusnig, L.; Cavalli, F. Parzen neural networks: Fundamentals, properties, and an application to forensic anthropology. Neural Netw. 2018, 97, 137–151.
- Trentin, E. Soft-Constrained Neural Networks for Nonparametric Density Estimation. Neural Process. Lett. 2018, 48, 915–932.
- Trentin, E. Asymptotic Convergence of Soft-Constrained Neural Networks for Density Estimation. Mathematics 2020, 8, 572.
- Ghosh, J.; Nag, A. An Overview of Radial Basis Function Networks. In Radial Basis Function Networks 2: New Advances in Design; Howlett, R.J., Jain, L.C., Eds.; Springer: Heidelberg, Germany, 2001; pp. 1–36.
- Sperduti, A.; Starita, A. Supervised neural networks for the classification of structures. IEEE Trans. Neural Netw. 1997, 8, 714–735.
- Scarselli, F.; Gori, M.; Tsoi, A.C.; Hagenbuchner, M.; Monfardini, G. The graph neural network model. IEEE Trans. Neural Netw. 2009, 20, 61–80.
- Salazar, D.S.P.; Adeodato, P.J.L.; Arnaud, A.L. Continuous Dynamical Combination of Short and Long-Term Forecasts for Nonstationary Time Series. IEEE Trans. Neural Netw. Learn. Syst. 2014, 25, 241–246.
- Alippi, C.; Ntalampiras, S.; Roveri, M. A Cognitive Fault Diagnosis System for Distributed Sensor Networks. IEEE Trans. Neural Netw. Learn. Syst. 2013, 24, 1213–1226.
- Nakada, Y.; Wakahara, M.; Matsumoto, T. Online Bayesian Learning With Natural Sequential Prior Distribution. IEEE Trans. Neural Netw. Learn. Syst. 2014, 25, 40–54.
- Alippi, C.; Boracchi, G.; Roveri, M. Just-In-Time Classifiers for Recurrent Concepts. IEEE Trans. Neural Netw. Learn. Syst. 2013, 24, 620–634.
- Erdős, P.; Rényi, A. On Random Graphs. Publ. Math. Debrecen 1959, 6, 290–297.
- Hammer, B.; Micheli, A.; Sperduti, A. Universal Approximation Capability of Cascade Correlation for Structures. Neural Comput. 2005, 17, 1109–1159.
- Ross, S. Stochastic Processes; Wiley: New York, NY, USA, 1996.
- Bongini, M.; Freno, A.; Laveglia, V.; Trentin, E. Dynamic Hybrid Random Fields for the Probabilistic Graphical Modeling of Sequential Data: Definitions, Algorithms, and an Application to Bioinformatics. Neural Process. Lett. 2018, 48, 733–768.
- Rabiner, L.R. A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition. Proc. IEEE 1989, 77, 257–286.
- Castelli, I.; Trentin, E. Combination of supervised and unsupervised learning for training the activation functions of neural networks. Pattern Recognit. Lett. 2014, 37, 178–191.
- Bongini, M.; Rigutini, L.; Trentin, E. Recursive Neural Networks for Density Estimation Over Generalized Random Graphs. IEEE Trans. Neural Netw. Learn. Syst. 2018, 29, 5441–5458.
- Bengio, Y. Neural Networks for Speech and Sequence Recognition; International Thomson Computer Press: London, UK, 1996.
- Duda, R.O.; Hart, P.E.; Stork, D.G. Pattern Classification, 2nd ed.; John Wiley & Sons: New York, NY, USA, 2001.