Comment

Is the Voronoi Entropy a True Entropy? Comments on “Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem”, Entropy 2017, 19, 48

Engineering Faculty, Chemical Engineering, Biotechnology and Materials Department, Ariel University, P.O.B. 3, 407000 Ariel, Israel
* Author to whom correspondence should be addressed.
Entropy 2019, 21(3), 251; https://doi.org/10.3390/e21030251
Submission received: 17 February 2019 / Revised: 1 March 2019 / Accepted: 5 March 2019 / Published: 6 March 2019
(This article belongs to the Section Thermodynamics)

Abstract

The goal of this comment note is to express our considerations about the recent paper by A. Ben-Naim (Entropy 2017, 19, 48). We strongly support the distinction between the Shannon measure of information and the thermodynamic entropy suggested in that paper. We demonstrate that the Voronoi entropy should also be clearly distinguished from the entropy of a two-dimensional gas. The Voronoi entropy, being an intensive quantity, is actually the averaged Shannon measure of ordering for a given pattern.

The above paper by Ben-Naim [1] criticized the identification of the Shannon measure of information (abbreviated SMI) with the thermodynamic notion of entropy. We quote: “The first (SMI) is defined on any probability distribution; and therefore it is a very general concept. On the other hand, entropy is defined on a very special set of distributions” [1]. Actually, the thermodynamic entropy is a special case of an SMI, corresponding to the distribution that maximizes the SMI, referred to as the equilibrium distribution [1]. Thus, the SMI should be clearly distinguished from the thermodynamic entropy. In our comment, we exploit the arguments suggested by Ben-Naim [1,2,3,4] for distinguishing between the Voronoi entropy [5] and the averaged Shannon measure of two-dimensional ordering (abbreviated further as SHMO), supplied by the Shannon-like expression introduced by Voronoi [5]. The Voronoi entropy (the underlying tessellation was already known to Kepler and Descartes [6,7]) is a useful notion, enabling the estimation of ordering for a set of points (also called seeds or nuclei) located in a plane [8,9].
A Voronoi tessellation, or diagram, of an infinite plane is a partitioning of the plane into non-overlapping convex polygonal regions based on the distance to a specified discrete set of points. For each seed, there is a corresponding region consisting of all points closer to that seed than to any other. In three dimensions, the Voronoi polyhedron of a point nucleus is the smallest polyhedron formed by the perpendicularly bisecting planes between a given nucleus and all the other nuclei [8,9]. To quantify the orderliness of a Voronoi tessellation, the so-called Voronoi entropy is defined as:
$S_{vor} = -\sum_{i} P_i \ln P_i$    (1)
where $P_i$ is the fraction of polygons possessing i sides (or edges) in a given Voronoi diagram [5,8,9]. Equation (1) has a form similar to that of the SMI and of the entropy in statistical mechanics [1]; that is why it was called “the Voronoi entropy”. Ben-Naim called the identification of the SMI with the thermodynamic entropy a “grievous mistake” [1]. The same is true for the identification of the Voronoi entropy $S_{vor}$ with the thermodynamic entropy of a 2D gas. This identification is erroneous for several reasons:
(1)
The Voronoi entropy may be calculated for any set of points whose number is N > 3. These points may represent a 2D gas, but this gas is not necessarily in thermal equilibrium [10], and it may obviously be time-dependent [10].
(2)
The Voronoi entropy is an intensive quantity. This means that the Voronoi entropy of a pattern characterized by a given, constant 2D order depends neither on the area of the pattern nor on the number of seed points (this holds, of course, when boundary effects are neglected; see the illustrative example following this list). In contrast, the thermodynamic entropy is an extensive quantity; in other words, it grows with an increase in the number of particles constituting the system [11,12].
(3)
The Voronoi entropy is not a relativistic invariant. The relativistic contraction changes the pattern and, simultaneously, the Voronoi entropy related to it, whereas the thermodynamic entropy is a relativistic invariant [13].
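As a purely illustrative example of point (2), consider a hypothetical pattern in which one half of the Voronoi cells are hexagons and one quarter each are pentagons and heptagons, i.e., $P_6 = 1/2$ and $P_5 = P_7 = 1/4$. Equation (1) then yields $S_{vor} = -(\frac{1}{2}\ln\frac{1}{2} + 2\cdot\frac{1}{4}\ln\frac{1}{4}) = \frac{3}{2}\ln 2 \approx 1.04$, and this value remains the same whether the pattern contains one hundred or one million seed points, since only the fractions $P_i$ enter Equation (1).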
What, then, is measured by the Voronoi entropy? Following Ben-Naim [1], we suggest that the Voronoi entropy is actually the averaged Shannon measure of ordering for a given pattern (SHMO). However, this definition also requires certain care. Indeed, it is reasonable to suggest that the maximal value of the Voronoi entropy corresponds to random 2D patterns, for which $S_{vor} = 1.71$ was established [14,15]. Note that we recently revealed ordered patterns, arising from points located on the Archimedes spiral (such as the one shown in Figure 1), demonstrating a Voronoi entropy that is markedly larger than the $S_{vor} = 1.71$ reported for random patterns [14,15,16]. Moreover, the Voronoi entropy may grow unrestrictedly with the number of kinds of polygons appearing in the pattern. The correct statement should therefore be formulated as follows: the Voronoi entropy quantifies the ordering of patterns demonstrating the same number of kinds of polygons.
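For illustration only, a minimal computational sketch of Equation (1) is given below. It assumes the NumPy and SciPy libraries; the function name voronoi_entropy and the crude exclusion of unbounded boundary cells are our own illustrative choices and are not part of the commented paper. For uniformly random seeds, the computed value is expected to approach the figure of approximately 1.71 quoted above for random patterns [14,15].

import numpy as np
from collections import Counter
from scipy.spatial import Voronoi

def voronoi_entropy(points):
    """Estimate the Voronoi entropy S_vor = -sum_i P_i ln(P_i) of a planar point set,
    where P_i is the fraction of bounded Voronoi cells possessing i edges."""
    vor = Voronoi(points)
    edge_numbers = []
    for region_index in vor.point_region:
        region = vor.regions[region_index]
        # Cells touching the open boundary contain the vertex index -1; they are
        # discarded here as a crude way of neglecting boundary effects.
        if region and -1 not in region:
            edge_numbers.append(len(region))
    histogram = Counter(edge_numbers)
    total = sum(histogram.values())
    fractions = np.array([count / total for count in histogram.values()])
    return float(-np.sum(fractions * np.log(fractions)))

# Numerical check against the value quoted for random patterns: for uniformly
# random seeds in the unit square, the result is expected to be close to ~1.71.
rng = np.random.default_rng(0)
random_points = rng.random((20000, 2))
print(voronoi_entropy(random_points))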
Following Ben-Naim [1], we conclude that the Voronoi entropy should be clearly distinguished from the thermodynamic entropy of a 2D gas; it actually represents the averaged Shannon measure of ordering for 2D patterns.

Acknowledgments

The authors are indebted to Yelena Bormashenko for her kind help in preparing this comment.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ben-Naim, A. Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem. Entropy 2017, 19, 48.
  2. Ben-Naim, A. Information Theory; World Scientific: Singapore, 2017.
  3. Ben-Naim, A. A Farewell to Entropy: Statistical Thermodynamics Based on Information; World Scientific: Singapore, 2008.
  4. Ben-Naim, A. Entropy, the Truth, the Whole Truth and Nothing but the Truth; World Scientific: Singapore, 2016.
  5. Voronoi, G. Nouvelles applications des paramètres continus à la théorie des formes quadratiques. Deuxième mémoire. Recherches sur les paralléloèdres primitifs. J. Reine Angew. Math. 1908, 134, 198–287.
  6. Descartes, R. Principia Philosophiae; Ludovicus Elzevirius: Amsterdam, The Netherlands, 1644; ISBN 978-90-277-1754-2.
  7. Liebling, T.M.; Pournin, L. Voronoi diagrams and Delaunay triangulations: Ubiquitous Siamese Twins. Doc. Math. 2012, ISMP, 419–431.
  8. Barthélemy, M. Spatial networks. Phys. Rep. 2011, 499, 1–101.
  9. Bormashenko, E.; Frenkel, M.; Vilk, A.; Legchenkova, I.; Fedorets, A.; Aktaev, N.; Dombrovsky, L.; Nosonovsky, M. Characterization of self-assembled 2D patterns with Voronoi Entropy. Entropy 2018, 20, 956.
  10. Fedorets, A.A.; Frenkel, M.; Bormashenko, E.; Nosonovsky, M. Small levitating ordered droplet clusters: Stability, symmetry, and Voronoi Entropy. J. Phys. Chem. Lett. 2017, 8, 5599–5602.
  11. Callen, H.B.; Greene, R.F. On a theorem of irreversible thermodynamics. Phys. Rev. 1952, 86, 702–710.
  12. Gilmore, R. Le Châtelier reciprocal relations and the mechanical analog. Am. J. Phys. 1983, 51, 733–743.
  13. Tolman, R.C. Relativity, Thermodynamics and Cosmology; Oxford University Press: Oxford, UK, 1934; ISBN 978-0486653839.
  14. Limaye, A.V.; Narhe, R.D.; Dhote, A.M.; Ogale, S.B. Evidence for convective effects in breath figure formation on volatile fluid surfaces. Phys. Rev. Lett. 1996, 76, 3762–3765.
  15. Martin, C.P.; Blunt, M.O.; Pauliac-Vaujour, E.; Stannard, A.; Moriarty, P.; Vancea, I.; Thiele, U. Controlling pattern formation in nanoparticle assemblies via directed solvent dewetting. Phys. Rev. Lett. 2007, 99, 116103.
  16. Frenkel, M.; Legchenkova, I.; Bormashenko, E. Voronoi diagrams generated by the Archimedes spiral: Aesthetics vs. mathematics. Entropy 2019, submitted.
Figure 1. A pattern of 80 points (total number N = 80), built from seven types of polygons and demonstrating the Voronoi entropy $S_{vor} = 1.8878$. (Color mapping: magenta, triangles; green, tetragons; yellow, pentagons; grey, hexagons; blue, heptagons; brown, octagons; deep green, nonagons.)
