Formal Analysis of Deep Artificial Neural Networks

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: closed (31 October 2022)

Special Issue Editors


Guest Editor: Dr. Friedhelm Schwenker
Institute of Neural Information Processing, Ulm University, James-Franck-Ring, 89081 Ulm, Germany
Interests: artificial neural networks; pattern recognition; cluster analysis; statistical learning theory; data mining; multiple classifier systems; sensor fusion; affective computing

Guest Editor: Dr. Edmondo Trentin
Università di Siena, Dipartimento di Ingegneria dell'Informazione e Scienze Matematiche, Via Roma 56, 53100 Siena, Italy
Interests: artificial neural networks (ANN); probabilistic interpretation of ANNs; density estimation via ANNs; probabilistic graphical models for sequences; machine learning for graphs; combination of ANNs and graphical models

Special Issue Information

Dear Colleagues,

Artificial neural networks (ANNs) are a fast-growing research field with a large impact on many application areas, such as computer vision, audio and video processing, bioinformatics, data science, pattern recognition, and natural language processing (to mention only a few). Despite long-established mathematical results on the approximation and modeling capabilities of traditional ANN architectures (in particular, shallow ANNs), the theoretical understanding of the behavior and properties of complex or quantum ANNs (including very deep multilayer ANNs and recurrent neural networks) is still limited. In recent years, the principles and methodologies of several mathematical disciplines have been proposed for the theoretical analysis of ANN architectures, dynamics, and learning. Such disciplines include approximation theory, complexity theory, and information theory, as well as the study of von Neumann entropy in quantum ANNs. The purpose of this Special Issue is to highlight the state of the art in the investigation of ANNs within these frameworks.
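As a small illustration of one of the quantities named above, the von Neumann entropy S(ρ) = -Tr(ρ log ρ) of a quantum state can be computed directly from the eigenvalues of a density matrix ρ. The following sketch is purely illustrative (the function name and the example state are our own choices, not taken from any submission):

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)     # rho is Hermitian, so eigvalsh applies
    eigvals = eigvals[eigvals > 1e-12]    # drop numerical zeros (0 log 0 := 0)
    return float(-np.sum(eigvals * np.log(eigvals)))

# A maximally mixed qubit state has entropy log 2 ~ 0.693 nats.
rho = np.eye(2) / 2
print(von_neumann_entropy(rho))
```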

This Special Issue welcomes original research papers on the analysis of ANNs based on mathematically well-founded methods. Review articles describing the current state of the art of ANNs in the aforementioned contexts are also highly encouraged. All submissions to this Special Issue must include substantial theoretical aspects of ANN research.

Dr. Friedhelm Schwenker
Dr. Edmondo Trentin
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • ANN architectures and learning in approximation and complexity theories
  • Cost functions and constraints in information-theoretic learning algorithms for ANNs
  • Complexity of deep, recurrent, or quantum ANN learning
  • Information-theoretic principles for sampling and feature extraction
  • Analysis of learning based on information-theoretic methods (e.g., the information bottleneck approach) in deep, recurrent, or quantum ANNs (a minimal sketch of the underlying quantities follows this list)
  • Applications of ANNs based on information-theoretic principles or quantum computing
  • Theoretical advances in quantum ANNs
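For concreteness, the information bottleneck approach mentioned in the list above trades compression of the input X against prediction of the target Y via mutual information terms. The sketch below is a minimal, hypothetical illustration (the discrete joint tables and the value of β are invented for the example):

```python
import numpy as np

def mutual_information(p_joint: np.ndarray) -> float:
    """I(A;B) in nats from a discrete joint probability table p_joint[a, b]."""
    p_a = p_joint.sum(axis=1, keepdims=True)
    p_b = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float(np.sum(p_joint[mask] * np.log(p_joint[mask] / (p_a * p_b)[mask])))

# Information bottleneck objective for a stochastic encoder T of the input X:
# minimize I(X;T) - beta * I(T;Y), i.e., compress X while staying predictive of Y.
p_xt = np.array([[0.25, 0.25], [0.25, 0.25]])  # hypothetical joint of X and T
p_ty = np.array([[0.40, 0.10], [0.10, 0.40]])  # hypothetical joint of T and Y
beta = 2.0
print(mutual_information(p_xt) - beta * mutual_information(p_ty))
```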

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (1 paper)


Research

14 pages, 681 KiB  
Article
Learnability of the Boolean Innerproduct in Deep Neural Networks
by Mehmet Erdal and Friedhelm Schwenker
Entropy 2022, 24(8), 1117; https://doi.org/10.3390/e24081117 - 13 Aug 2022
Cited by 1
Abstract
In this paper, we study the learnability of the Boolean inner product by a systematic simulation study. The family of the Boolean inner product function is known to be representable by neural networks of threshold neurons of depth 3 with only 2n+1 units (n the input dimension), whereas an exact representation by a depth 2 network cannot possibly be of polynomial size. This result can be seen as a strong argument for deep neural network architectures. In our study, we found that this depth 3 architecture of the Boolean inner product is difficult to train, much harder than the depth 2 network, at least for the small input size scenarios n ≤ 16. Nonetheless, the accuracy of the deep architecture increased with the dimension of the input space, to 94% on average, which means that multiple restarts are needed to find the compact depth 3 architecture. Replacing the fully connected first layer with a partially connected layer (a kind of convolutional layer, sparsely connected with weight sharing) can significantly improve the learning performance, up to 99% accuracy in simulations. Another way to improve the learnability of the compact depth 3 representation of the inner product is to add just a few additional units to the first hidden layer. Full article
(This article belongs to the Special Issue Formal Analysis of Deep Artificial Neural Networks)
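To make the task in the abstract above concrete, here is a minimal sketch (not the authors' code; the architecture and hyperparameters are stand-ins chosen for illustration) that enumerates the truth table of the Boolean inner product and fits a two-hidden-layer MLP to it with random restarts, echoing the restart strategy the abstract describes:

```python
import itertools
import numpy as np
from sklearn.neural_network import MLPClassifier

def boolean_inner_product(bits: np.ndarray) -> int:
    """IP(x, y) = parity of the bitwise AND of the two halves of the input."""
    half = len(bits) // 2
    return int(np.sum(bits[:half] * bits[half:]) % 2)

n = 8  # input dimension (the abstract's n); 2**n input patterns in total
X = np.array(list(itertools.product([0, 1], repeat=n)))
t = np.array([boolean_inner_product(row) for row in X])

# Stand-in for the compact depth-3 architecture: two small hidden layers.
best = 0.0
for seed in range(10):  # multiple restarts, as the abstract suggests
    clf = MLPClassifier(hidden_layer_sizes=(n // 2, n // 2), activation="tanh",
                        max_iter=2000, random_state=seed)
    clf.fit(X, t)
    best = max(best, clf.score(X, t))  # accuracy on the full truth table
print(f"best training accuracy over restarts: {best:.2f}")
```

Note that sklearn's sigmoidal units only approximate the threshold neurons discussed in the paper; the point of the sketch is the experimental setup, not a faithful reproduction of the reported numbers.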
