Article

About the Entropy of a Natural Number and a Type of the Entropy of an Ideal

Faculty of Mathematics and Computer Science, Transilvania University, Iuliu Maniu Street 50, 500091 Braşov, Romania
* Author to whom correspondence should be addressed.
Entropy 2023, 25(4), 554; https://doi.org/10.3390/e25040554
Submission received: 17 January 2023 / Revised: 22 March 2023 / Accepted: 22 March 2023 / Published: 24 March 2023
(This article belongs to the Special Issue Mathematics in Information Theory and Modern Applications)

Abstract

In this article, we find some properties of certain types of entropies of a natural number. We study a way of measuring the "disorder" of the divisors of a natural number. We compare two of the entropies, $H$ and $\bar{H}$, defined for a natural number. A useful property of the Shannon entropy is additivity, $H_S(\mathbf{pq}) = H_S(\mathbf{p}) + H_S(\mathbf{q})$, where $\mathbf{pq}$ denotes the tensor product, so we focus on its study in the case of numbers and ideals. We mention that only one of the two entropy functions discussed in this paper satisfies additivity, whereas the other does not. In addition, regarding the entropy $H$ of a natural number, we generalize this notion to ideals, and we find some of its properties.

1. Introduction and Preliminaries

In information theory, entropy is defined as a measure of uncertainty. The most widely used entropy is the Shannon entropy ($H_S$), which is given for a probability distribution $\mathbf{p} = \{p_1, \ldots, p_r\}$ by
$$H_S(\mathbf{p}) = -\sum_{i=1}^{r} p_i \log p_i.$$
A useful property of the Shannon entropy is additivity, $H_S(\mathbf{pq}) = H_S(\mathbf{p}) + H_S(\mathbf{q})$, where $\mathbf{p} = \{p_1, \ldots, p_r\}$, $\mathbf{q} = \{q_1, \ldots, q_r\}$ and $\mathbf{pq} = \{p_1 q_1, \ldots, p_1 q_r, \ldots, p_r q_1, \ldots, p_r q_r\}$.
In [1], Sayyari gave an extension of Jensen's discrete inequality for the class of uniformly convex functions, obtaining lower and upper bounds for Jensen's inequality. He applied these results in information theory and obtained new and strong bounds for the Shannon entropy of a probability distribution. Recently, in [2], De Gregorio, Sánchez and Toral defined the block entropy (based on the Shannon entropy), which can determine the memory for systems modeled as Markov chains of arbitrary finite order.
We have found several ways to define the entropy of a natural number. Jeong et al., in [3], defined the additive entropy of a natural number in terms of the additive partition function. If d is a divisor of a natural number n, then we write $d \mid n$. If $\sigma(n)$ is the sum of the natural divisors of n, then it is easy to see that $\sum_{d \mid n} \frac{d}{\sigma(n)} = 1$. Thus, the ratio $\frac{d}{\sigma(n)}$ can be seen as a probability. As a result, we have a discrete probability distribution associated with a natural number. In [4], we found the following definition for the entropy of a natural number:
$$\bar{H}(n) := -\sum_{d \mid n} \frac{d}{\sigma(n)} \log \frac{d}{\sigma(n)} = \log \sigma(n) - \frac{1}{\sigma(n)} \sum_{d \mid n} d \log d,$$
where log is the natural logarithm. Unfortunately, we did not find this interesting definition of the entropy of a natural number in a book or paper, but on a website. This entropy has the following interesting property:
$$\bar{H}(mn) = \bar{H}(m) + \bar{H}(n),$$
when $m, n \in \mathbb{N}^*$ and $\gcd(m,n) = 1$. If p is a prime number and $\alpha \in \mathbb{N}^*$, then we have
$$\bar{H}(p^{\alpha}) = -\frac{(\alpha+1)\log p}{p^{\alpha+1}-1} + \log\frac{1-p^{-(\alpha+1)}}{p-1} + \frac{p\log p}{p-1}.$$
Taking the limit as $\alpha \to \infty$, we obtain
$$\lim_{\alpha\to\infty}\bar{H}(p^{\alpha}) = \frac{p\log p}{p-1} - \log(p-1).$$
We remark that, if p is a prime number and $q > 1$ is such that $\frac{1}{p} + \frac{1}{q} = 1$, then
$$H_S\!\left(\frac{1}{p},\frac{1}{q}\right) = \frac{p-1}{p}\left(\frac{p\log p}{p-1} - \log(p-1)\right) = \left(1-\frac{1}{p}\right)\lim_{\alpha\to\infty}\bar{H}(p^{\alpha}).$$
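To make the definition of $\bar{H}$ concrete, here is a minimal numerical sketch (our own illustration, not part of the original paper) that computes $\bar{H}(n)$ directly from the divisors of n, checks the additivity $\bar{H}(mn) = \bar{H}(m) + \bar{H}(n)$ for coprime arguments, and illustrates the limit of $\bar{H}(p^{\alpha})$; the helper name `entropy_bar` and the use of sympy for divisor enumeration are our own choices.

```python
import math
from sympy import divisors  # assumed available; any divisor enumeration would do

def entropy_bar(n: int) -> float:
    """Entropy of n built from the probabilities d / sigma(n), d | n (natural log)."""
    divs = divisors(n)
    sigma = sum(divs)
    return -sum((d / sigma) * math.log(d / sigma) for d in divs)

# Additivity for coprime arguments, e.g. m = 12, n = 35:
m, n = 12, 35
assert math.gcd(m, n) == 1
print(entropy_bar(m * n), entropy_bar(m) + entropy_bar(n))  # the two values agree

# The limit of entropy_bar(p**alpha) compared with p*log(p)/(p-1) - log(p-1):
p = 3
print(entropy_bar(p ** 40), p * math.log(p) / (p - 1) - math.log(p - 1))
```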
In the paper [5], Minculete and Pozna introduced the notion of entropy of a natural number in another way. Namely, if $n \in \mathbb{N}$, $n \geq 2$, then, by the fundamental theorem of arithmetic, n is written uniquely as $n = p_1^{\alpha_1} p_2^{\alpha_2} \cdots p_r^{\alpha_r}$, where $r \in \mathbb{N}^*$, $p_1, p_2, \ldots, p_r$ are distinct prime positive integers and $\alpha_1, \alpha_2, \ldots, \alpha_r \in \mathbb{N}^*$. Let $\Omega(n) = \alpha_1 + \alpha_2 + \cdots + \alpha_r$ and $p_{\alpha_i} = \frac{\alpha_i}{\Omega(n)}$, $i = \overline{1,r}$. The entropy of n is defined by
$$H(n) = -\sum_{i=1}^{r} p_{\alpha_i} \log p_{\alpha_i}.$$
Here, by convention, $H(1) = 0$.
Minculete and Pozna (in [5]) gave an equivalent form for the entropy of n, namely:
$$H(n) = \log \Omega(n) - \frac{1}{\Omega(n)} \sum_{i=1}^{r} \alpha_i \log \alpha_i.$$
For example, if $n = 6 = 2 \cdot 3$, we have
$$H(6) = \log 2 - \frac{1}{2} \cdot 2 \cdot \log 1 = \log 2 \approx 0.6931.$$
Another example: if $n = 24 = 2^3 \cdot 3$, we have
$$H(24) = \log 4 - \frac{1}{4} \cdot 3 \cdot \log 3 = \frac{1}{4} \log \frac{4^4}{3^3} \approx 0.5623.$$
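The following short sketch is our own illustration (not from [5]); it computes $H(n)$ from the prime factorization, using sympy's `factorint` (assumed available), and reproduces the two worked examples above.

```python
import math
from sympy import factorint  # returns {prime: exponent}

def entropy_H(n: int) -> float:
    """H(n) = log(Omega(n)) - (1/Omega(n)) * sum(alpha_i * log(alpha_i))."""
    if n == 1:
        return 0.0  # convention H(1) = 0
    exps = list(factorint(n).values())
    omega_big = sum(exps)  # Omega(n): number of prime factors counted with multiplicity
    return math.log(omega_big) - sum(a * math.log(a) for a in exps) / omega_big

print(entropy_H(6))   # log 2 ≈ 0.6931
print(entropy_H(24))  # (1/4) log(4^4 / 3^3) ≈ 0.5623
```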
Minculete and Pozna proved (in [5]) the following:
Proposition 1.
$$0 \leq H(n) \leq \log \omega(n), \quad \forall\, n \in \mathbb{N},\ n \geq 2,$$
where $\omega(n)$ is the number of distinct prime factors of n.
Remark 1.
(i) If $n = p^{\alpha}$, then $H(n) = 0$;
(ii) If $n = p_1 \cdot p_2 \cdots p_r$, then $H(n) = \log \omega(n)$;
(iii) If $n = (p_1 \cdot p_2 \cdots p_r)^k$, then $H(n) = \log \omega(n)$.
It is easy to see that $H(n^{\alpha}) = H(n)$, for every $\alpha \geq 1$.
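Proposition 1, Remark 1 and the invariance $H(n^{\alpha}) = H(n)$ are easy to confirm numerically; the following check is our own sketch (sympy's `factorint` assumed available), reusing the same formula for H as above.

```python
import math
from sympy import factorint

def entropy_H(n: int) -> float:
    # H(n) = log(Omega(n)) - (1/Omega(n)) * sum(alpha_i log alpha_i), as above
    e = list(factorint(n).values())
    O = sum(e)
    return math.log(O) - sum(a * math.log(a) for a in e) / O

assert abs(entropy_H(7**9)) < 1e-12                              # Remark 1 (i): H(p^alpha) = 0
assert math.isclose(entropy_H(2 * 3 * 5 * 7), math.log(4))       # Remark 1 (ii): H = log(omega(n))
assert math.isclose(entropy_H((2 * 3 * 5 * 7)**3), math.log(4))  # Remark 1 (iii)
n = 2**3 * 5**2 * 11
assert math.isclose(entropy_H(n**5), entropy_H(n))               # H(n^alpha) = H(n)
assert 0.0 <= entropy_H(n) <= math.log(len(factorint(n)))        # Proposition 1 bound
```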
The relevance of this entropy is given by the possibility of extension to ideals. The extension of some properties of the natural numbers to ideals was recently given in [6]. Some of the studied results can be transferred to other types of generalized entropies that can be defined later [7]. Entropy is generally used in mathematical physics applications, but it can constitute a new element of analysis in theoretical fields [8]. Recently, in [9], Niepostyn and Daszczuk used entropy as a measure of consistency in software architecture. Therefore, the area of studying different types of entropies in various fields is expanding.
Our motivation in this article is to study some properties of certain types of entropies of a natural number. We compare two of the entropies defined for a natural number. Additionally, regarding the entropy H of a natural number, introduced in [5], we generalize this notion to ideals, and we find some of its properties. We mention that the entropy of an ideal is generalized from the second notion of the entropy of integers.

2. A Comparison between the Entropies $H$ and $\bar{H}$

In this section, we compare the entropies $H$ and $\bar{H}$, looking at the similarities and differences between them.
Proposition 2.
$$\lim_{p\to\infty}\lim_{\alpha\to\infty}\bar{H}(p^{\alpha}) = 0.$$
Proof. 
As shown above, we have $\lim_{\alpha\to\infty}\bar{H}(p^{\alpha}) = \frac{p\log p}{p-1} - \log(p-1)$. Next, we use the following limit of functions:
$$\lim_{x\to\infty}\left(\frac{x\log x}{x-1} - \log(x-1)\right) = \lim_{x\to\infty}\frac{x\log x - (x-1)\log(x-1)}{x-1} = \lim_{x\to\infty}\left(\log x - \log(x-1)\right) = \lim_{x\to\infty}\log\frac{x}{x-1} = 0,$$
where the second equality is followed by an application of l'Hôpital's rule. Therefore, we obtain $\lim_{p\to\infty}\lim_{\alpha\to\infty}\bar{H}(p^{\alpha}) = \lim_{p\to\infty}\left(\frac{p\log p}{p-1} - \log(p-1)\right) = 0$. □
Remark 2.
Related to the entropy $\bar{H}$, we have
$$\lim_{\alpha\to\infty}\bar{H}(np^{\alpha}) = \bar{H}(n) + \frac{p\log p}{p-1} - \log(p-1),$$
when $\gcd(n,p) = 1$, with p being a prime number and $n, \alpha \in \mathbb{N}^*$.
It is easy to see that $\lim_{p\to\infty}\lim_{\alpha\to\infty}\bar{H}(p^{\alpha}) = 0 = H(p^{\alpha})$.
Proposition 3.
If $\gcd(n,p) = 1$, with p being a prime number and $n, \alpha \in \mathbb{N}^*$, then we have
$$\lim_{\alpha\to\infty} H(np^{\alpha}) = 0.$$
Proof. 
From the definition of H, we have
$$H(np^{\alpha}) = \log(\Omega(n)+\alpha) - \frac{1}{\Omega(n)+\alpha}\left(\sum_{i=1}^{r}\alpha_i\log\alpha_i + \alpha\log\alpha\right) = \log(\Omega(n)+\alpha) - \frac{\alpha\log\alpha}{\Omega(n)+\alpha} - \frac{\Omega(n)\log\Omega(n) - \Omega(n)H(n)}{\Omega(n)+\alpha}.$$
It follows that
$$H(np^{\alpha}) = \frac{\Omega(n)H(n)}{\Omega(n)+\alpha} + \log(\Omega(n)+\alpha) - \frac{\Omega(n)\log\Omega(n) + \alpha\log\alpha}{\Omega(n)+\alpha}.$$
By taking the limit as $\alpha \to \infty$, we deduce the relation of the statement. □
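As a numerical illustration of Proposition 3 (our own sketch, with sympy's `factorint` assumed available), the entropy of $np^{\alpha}$ indeed decays to 0 as $\alpha$ grows; the exponent list of $np^{\alpha}$ is built directly, so no large number has to be factored.

```python
import math
from sympy import factorint

n, p = 90, 7                                 # gcd(90, 7) = 1
exps = list(factorint(n).values())           # exponents in the factorization of n
for alpha in (1, 10, 100, 1000, 10**6):
    e = exps + [alpha]                       # exponents of n * p**alpha
    omega_big = sum(e)                       # Omega(n * p**alpha)
    H = math.log(omega_big) - sum(a * math.log(a) for a in e) / omega_big
    print(alpha, H)                          # the printed values decrease towards 0
```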
We also see that if $\gcd(m,n) = 1$, then, in general,
$$H(mn) \neq H(m) + H(n).$$
As a result, we ask ourselves what the relationship is between $H(mn)$ and $H(m) + H(n)$, where $m, n \in \mathbb{N}^*$, $m, n \geq 2$.
If $m = 22$ and $n = 105$, then $H(m) = \log 2$, $H(n) = \log 3$ and $H(mn) = \log 5$, so we have
$$H(mn) < H(m) + H(n).$$
If $m = 20$ and $n = 63$, then $H(m) = H(n) = \log 3 - \frac{2}{3}\log 2$ and $H(mn) = \log 6 - \frac{2}{3}\log 2$, which means that
$$H(mn) - H(m) - H(n) = \frac{1}{3}\left(5\log 2 - 3\log 3\right) = \frac{1}{3}\log\frac{32}{27} > 0,$$
so we have
$$H(mn) > H(m) + H(n).$$
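Both numerical examples can be checked directly; the sketch below is our own illustration (sympy's `factorint` assumed available), not part of the paper.

```python
import math
from sympy import factorint

def entropy_H(n: int) -> float:
    # H(n) = log(Omega(n)) - (1/Omega(n)) * sum(alpha_i log alpha_i), as in [5]
    e = list(factorint(n).values())
    O = sum(e)
    return math.log(O) - sum(a * math.log(a) for a in e) / O

for m, n in [(22, 105), (20, 63)]:
    lhs, rhs = entropy_H(m * n), entropy_H(m) + entropy_H(n)
    print(m, n, lhs, rhs, "<" if lhs < rhs else ">")
# (22, 105): log 5 < log 2 + log 3, while for (20, 63): H(mn) exceeds H(m) + H(n)
```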
Next, we study a general result of this type for the entropy H.
Proposition 4.
We assume that $m = p^k q$ and $n = p^k t$, where $p, q, t$ are distinct prime numbers and $k \in \mathbb{N}^*$. Then, the inequality
$$H(mn) < H(m) + H(n)$$
holds.
Proof. 
From the definition of H, we have $H(m) = H(n) = \log(k+1) - \frac{k}{k+1}\log k$ and $H(mn) = \log\left(2(k+1)\right) - \frac{k}{k+1}\log 2k$. Therefore, we obtain
$$H(m) + H(n) - H(mn) = \frac{1}{k+1}\left((k+1)\log(k+1) - k\log k - \log 2\right).$$
We consider the function $f : [1,\infty) \to \mathbb{R}$ defined by $f(x) = (x+1)\log(x+1) - x\log x - \log 2$. Since $f'(x) = \log\frac{x+1}{x} > 0$ for every $x \geq 1$, we deduce that the function f is increasing, so we have $f(x) \geq f(1) = \log 2 > 0$. Consequently, the inequality of the statement is true. □
Proposition 5.
We assume that $m = p_1^k p_2$ and $n = q_1^k q_2$, where $p_1, p_2, q_1, q_2$ are distinct prime numbers and $k \in \mathbb{N}^*$. Then, we have the following inequality:
$$H(mn) \geq H(m) + H(n).$$
Equality holds for $k = 1$.
Proof. 
For $k = 1$, we deduce that $m = p_1 p_2$ and $n = q_1 q_2$, which implies $H(m) = H(n) = \log 2$ and $H(mn) = \log 4$, so we have
$$H(mn) = H(m) + H(n).$$
For $k \geq 2$, we find $H(m) = H(n) = \log(k+1) - \frac{k}{k+1}\log k$ and $H(mn) = \log\left(2(k+1)\right) - \frac{k}{k+1}\log k$. Now, we obtain
$$H(mn) - H(m) - H(n) = \frac{1}{k+1}\left((k+1)\log 2 + k\log k - (k+1)\log(k+1)\right) > 0$$
for all $k \geq 2$, because the function $f : [2,\infty) \to \mathbb{R}$ defined by $f(x) = (x+1)\log 2 + x\log x - (x+1)\log(x+1)$ is strictly positive: indeed, $f'(x) = \log\frac{2x}{x+1} > 0$ for every $x \geq 2$, so f is increasing and $f(x) \geq f(2) = \log\frac{32}{27} > 0$. Therefore, for $x = k$, we obtain the relation of the statement. □
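Propositions 4 and 5 can be spot-checked numerically for several values of k; the sketch below is our own check (sympy's `factorint` assumed available), building m and n directly from the prime shapes used in the two statements.

```python
import math
from sympy import factorint

def entropy_H(n: int) -> float:
    # H(n) = log(Omega(n)) - (1/Omega(n)) * sum(alpha_i log alpha_i), as in [5]
    e = list(factorint(n).values())
    O = sum(e)
    return math.log(O) - sum(a * math.log(a) for a in e) / O

for k in range(1, 6):
    # Proposition 4: m = p^k q, n = p^k t with p, q, t distinct primes
    m, n = 2**k * 3, 2**k * 5
    assert entropy_H(m * n) < entropy_H(m) + entropy_H(n)
    # Proposition 5: m = p1^k p2, n = q1^k q2 with four distinct primes
    m, n = 2**k * 3, 5**k * 7
    diff = entropy_H(m * n) - entropy_H(m) - entropy_H(n)
    assert diff > -1e-12 and (k > 1 or abs(diff) < 1e-12)  # >= 0, with equality at k = 1
```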
We study another result for which we have
$$H(mn) \geq H(m) + H(n),$$
where $m, n \in \mathbb{N}^*$, $m, n \geq 2$.
Proposition 6.
Let m, n be two natural numbers such that $\gcd(m,n) = 1$, with the decompositions into prime factors of m, n given by $m = \prod_{i=1}^{r} p_i^{a_i}$ and $n = \prod_{j=1}^{s} q_j^{b_j}$, with $a_i, b_j \geq k$ for all $i \in \{1, \ldots, r\}$ and $j \in \{1, \ldots, s\}$, where $k \in \mathbb{N}^*$. Then, the inequality
$$H(mn) \geq H(m) + H(n) + \log\left(\frac{k}{\Omega(m)} + \frac{k}{\Omega(n)}\right)$$
holds.
Proof. 
Using the definition of H, we deduce the equality
$$H(mn) - H(m) - H(n) = \frac{\Omega(n)}{\Omega(m)\left(\Omega(m)+\Omega(n)\right)}\sum_{i=1}^{r} a_i\log a_i + \frac{\Omega(m)}{\Omega(n)\left(\Omega(m)+\Omega(n)\right)}\sum_{j=1}^{s} b_j\log b_j - \log\frac{\Omega(m)\Omega(n)}{\Omega(m)+\Omega(n)}.$$
Since $\log a_i, \log b_j \geq \log k$ for all $i \in \{1, \ldots, r\}$ and $j \in \{1, \ldots, s\}$, we obtain that $\sum_{i=1}^{r} a_i\log a_i \geq \log k\sum_{i=1}^{r} a_i = (\log k)\,\Omega(m)$ and $\sum_{j=1}^{s} b_j\log b_j \geq \log k\sum_{j=1}^{s} b_j = (\log k)\,\Omega(n)$. Using the above equality and inequalities, we show that
$$H(mn) - H(m) - H(n) \geq \log k - \log\frac{\Omega(m)\Omega(n)}{\Omega(m)+\Omega(n)}.$$
Consequently, the inequality of the statement is true. □
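The lower bound in Proposition 6 can likewise be verified numerically; in the sketch below (ours, with sympy's `factorint` assumed available), m and n are built with all exponents at least k, and the stated bound is checked.

```python
import math
from sympy import factorint

def entropy_H(n: int) -> float:
    # H(n) = log(Omega(n)) - (1/Omega(n)) * sum(alpha_i log alpha_i), as in [5]
    e = list(factorint(n).values())
    O = sum(e)
    return math.log(O) - sum(a * math.log(a) for a in e) / O

def Omega(n: int) -> int:
    return sum(factorint(n).values())

k = 2
m = 2**3 * 3**2           # exponents 3, 2 >= k, coprime to n
n = 5**2 * 7**4 * 11**2   # exponents 2, 4, 2 >= k
bound = entropy_H(m) + entropy_H(n) + math.log(k / Omega(m) + k / Omega(n))
print(entropy_H(m * n), ">=", bound)
assert entropy_H(m * n) >= bound - 1e-12
```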
Theorem 1.
Let m, n be two natural numbers such that $\gcd(m,n) = 1$ and $H(m), H(n) \geq \log 2$. Then, the following inequality
$$H(m) + H(n) \geq H(mn)$$
holds.
Proof. 
Using the equality established in the proof of Proposition 6 and the definition of H, we have
$$H(mn) - H(m) - H(n) = \frac{\Omega(n)}{\Omega(m)+\Omega(n)}\left(\log\Omega(m) - H(m)\right) + \frac{\Omega(m)}{\Omega(m)+\Omega(n)}\left(\log\Omega(n) - H(n)\right) - \log\frac{\Omega(m)\Omega(n)}{\Omega(m)+\Omega(n)}$$
$$= \frac{\Omega(n)\log\Omega(m) + \Omega(m)\log\Omega(n)}{\Omega(n)+\Omega(m)} - \frac{\Omega(n)H(m) + \Omega(m)H(n)}{\Omega(n)+\Omega(m)} - \log\frac{\Omega(m)\Omega(n)}{\Omega(m)+\Omega(n)}.$$
Using the concavity of the function log, we deduce the inequality
$$\frac{\Omega(n)\log\Omega(m) + \Omega(m)\log\Omega(n)}{\Omega(n)+\Omega(m)} \leq \log\frac{2\,\Omega(m)\Omega(n)}{\Omega(m)+\Omega(n)}.$$
Therefore, the above equality becomes
$$H(mn) - H(m) - H(n) \leq \log 2 - \frac{\Omega(n)H(m) + \Omega(m)H(n)}{\Omega(n)+\Omega(m)},$$
so we obtain
$$H(m) + H(n) - H(mn) \geq \frac{\Omega(n)H(m) + \Omega(m)H(n)}{\Omega(n)+\Omega(m)} - \log 2.$$
Therefore, taking into account that $H(m), H(n) \geq \log 2$ in the last inequality, we deduce the statement. □
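Theorem 1 can be illustrated with a brute-force search over coprime pairs whose entropies are at least $\log 2$; the sketch below is our own check (sympy's `factorint` assumed available), not part of the paper, and it finds no counterexample in the scanned range.

```python
import math
from sympy import factorint

def entropy_H(n: int) -> float:
    # H(n) = log(Omega(n)) - (1/Omega(n)) * sum(alpha_i log alpha_i), as in [5]
    e = list(factorint(n).values())
    O = sum(e)
    return math.log(O) - sum(a * math.log(a) for a in e) / O

LOG2 = math.log(2)
for m in range(2, 200):
    for n in range(2, 200):
        if math.gcd(m, n) == 1 and entropy_H(m) >= LOG2 and entropy_H(n) >= LOG2:
            assert entropy_H(m) + entropy_H(n) >= entropy_H(m * n) - 1e-12
```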
Next, our goal is to show that the entropy H is the more suitable of the two to extend to ideals.

3. The Entropy of an Ideal

In this section, we introduce the notion of entropy of an ideal of a ring of algebraic integers, and we find interesting properties of it.
Let K be an algebraic number field of degree $[K:\mathbb{Q}] = n$, where $n \in \mathbb{N}$, $n \geq 2$, and let $O_K$ be its ring of integers. Let $\mathrm{Spec}(O_K)$ be the set of the prime ideals of the ring $O_K$. Let p be a prime positive integer. Since $O_K$ is a Dedekind ring, by the fundamental theorem of Dedekind rings, the ideal $pO_K$ is written uniquely (up to the order of the factors) as
$$pO_K = P_1^{e_1} \cdot P_2^{e_2} \cdots P_g^{e_g},$$
where $g \in \mathbb{N}^*$, $e_1, e_2, \ldots, e_g \in \mathbb{N}^*$ and $P_1, P_2, \ldots, P_g \in \mathrm{Spec}(O_K)$. The number $e_i$ ($i = \overline{1,g}$) is called the ramification index of p at the ideal $P_i$.
Generally, according to the fundamental theorem of Dedekind rings, any ideal I of the ring $O_K$ decomposes uniquely as
$$I = P_1^{e_1} \cdot P_2^{e_2} \cdots P_g^{e_g}, \quad \text{where } g \in \mathbb{N}^*,\ e_1, e_2, \ldots, e_g \in \mathbb{N}^* \text{ and } P_1, P_2, \ldots, P_g \in \mathrm{Spec}(O_K).$$
We shall mostly work in this article with ideals of the form $pO_K$, since for such ideals there are known ramification results in the ring $O_K$ for many algebraic number fields K (for example, when K is any quadratic field, cubic field, cyclotomic field, or Kummer field).
The following result is known (see [10,11,12]):
Proposition 7.
In the above notation, we have:
(i) $\sum_{i=1}^{g} e_i f_i = [K:\mathbb{Q}] = n$,
where $f_i$ is the residual degree of p at $P_i$, meaning $f_i = \left[O_K/P_i : \mathbb{Z}/p\mathbb{Z}\right]$, $i = \overline{1,g}$.
(ii) If, moreover, $\mathbb{Q} \subseteq K$ is a Galois extension, then $e_1 = e_2 = \cdots = e_g$ (denoted by e) and $f_1 = f_2 = \cdots = f_g$ (denoted by f). Therefore, $efg = n$.
Let $J$ be the set of ideals of the ring $O_K$, and let $I \in J$ be written uniquely as above.
It is easy to see that $\sum_{i=1}^{g} \frac{e_i}{\Omega(I)} = 1$. Thus, the ratio $\frac{e_i}{\Omega(I)}$ can be seen as a probability; as a result, we have a discrete probability distribution associated with an ideal.
We generalize the notion of entropy to ideals as follows:
Definition 1.
Let $I \neq 0$ be an ideal of the ring $O_K$, decomposed as above. We define the entropy of the ideal I as follows:
$$H(I) = -\sum_{i=1}^{g} \frac{e_i}{\Omega(I)} \log \frac{e_i}{\Omega(I)},$$
where $\Omega(I) = e_1 + e_2 + \cdots + e_g$.
Immediately, we obtain the following equivalent form for the entropy of the ideal I:
$$H(I) = \log \Omega(I) - \frac{1}{\Omega(I)} \sum_{i=1}^{g} e_i \log e_i.$$
We now give some examples of calculating the entropy of an ideal.
Example 1.
Let ξ be a primitive 5th root of unity and let $K = \mathbb{Q}(\xi)$ be the 5th cyclotomic field. The ring of algebraic integers of the field K is $O_K = \mathbb{Z}[\xi]$. We consider the ideal $(1-\xi)\mathbb{Z}[\xi]$. It is known that $(1-\xi)\mathbb{Z}[\xi] \in \mathrm{Spec}(O_K)$ (see [10,13]). Consider the ideal $5\mathbb{Z}[\xi] = \left((1-\xi)\mathbb{Z}[\xi]\right)^4$. The entropy of the ideal $5\mathbb{Z}[\xi]$ is
$$H\!\left(5\mathbb{Z}[\xi]\right) = \log 4 - \frac{1}{4} \cdot 4 \cdot \log 4 = 0.$$
Example 2.
Let K be the pure cubic field $K = \mathbb{Q}(\sqrt[3]{2})$. Since $2^2 \not\equiv 1 \pmod 9$, it follows that the ring of algebraic integers of the field K is $O_K = \mathbb{Z}[\sqrt[3]{2}]$ (see [14]).
Since $29 \equiv 2 \pmod 3$, we have $29\mathbb{Z}[\sqrt[3]{2}] = P_1 \cdot P_2$, where $P_1, P_2 \in \mathrm{Spec}(\mathbb{Z}[\sqrt[3]{2}])$. Thus, the ideal $29\mathbb{Z}[\sqrt[3]{2}]$ splits in the ring $\mathbb{Z}[\sqrt[3]{2}]$. The entropy of the ideal $29\mathbb{Z}[\sqrt[3]{2}]$ is
$$H\!\left(29\mathbb{Z}[\sqrt[3]{2}]\right) = \log 2 - \frac{1}{2} \cdot 2 \cdot \log 1 = \log 2.$$
Example 3.
In the same field as in the previous example, $K = \mathbb{Q}(\sqrt[3]{2})$ with the ring of integers $O_K = \mathbb{Z}[\sqrt[3]{2}]$, we consider the ideal $31\mathbb{Z}[\sqrt[3]{2}]$.
Since $31 \equiv 1 \pmod 3$ and 2 is a cubic residue modulo 31, we have $31\mathbb{Z}[\sqrt[3]{2}] = P_1 \cdot P_2 \cdot P_3$, where $P_1, P_2, P_3 \in \mathrm{Spec}(\mathbb{Z}[\sqrt[3]{2}])$. Thus, the ideal $31\mathbb{Z}[\sqrt[3]{2}]$ splits completely in the ring $\mathbb{Z}[\sqrt[3]{2}]$ (see [14]). The entropy of the ideal $31\mathbb{Z}[\sqrt[3]{2}]$ is
$$H\!\left(31\mathbb{Z}[\sqrt[3]{2}]\right) = \log 3 - \frac{1}{3} \cdot 3 \cdot \log 1 = \log 3.$$
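Since $H(I)$ depends only on the ramification exponents $e_1, \ldots, e_g$, it is easy to evaluate once a factorization of the ideal is known. The sketch below is our own illustration (the helper name `ideal_entropy` is hypothetical): it takes the list of exponents as input and reproduces Examples 1–3, whose exponent patterns are (4), (1, 1) and (1, 1, 1), respectively.

```python
import math

def ideal_entropy(exponents):
    """H(I) = log(Omega(I)) - (1/Omega(I)) * sum(e_i * log(e_i)), with Omega(I) = sum(e_i)."""
    O = sum(exponents)
    return math.log(O) - sum(e * math.log(e) for e in exponents) / O

print(ideal_entropy([4]))        # Example 1: 5 Z[xi] = ((1 - xi) Z[xi])^4, entropy 0
print(ideal_entropy([1, 1]))     # Example 2: 29 Z[2^(1/3)] = P1 * P2, entropy log 2
print(ideal_entropy([1, 1, 1]))  # Example 3: 31 Z[2^(1/3)] splits completely, entropy log 3
```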
Remark 3.
Let K be an algebraic number field, and let $O_K$ be its ring of integers. Let p be a prime positive integer. If p is inert or totally ramified in the ring $O_K$, then $H(pO_K) = 0$.
Proof. 
To calculate the entropy of the ideal $pO_K$, we consider two cases.
Case 1: if p is inert in the ring $O_K$, then $pO_K$ is a prime ideal. Hence $\Omega(pO_K) = 1$ and $H(pO_K) = 0$.
Case 2: if p is totally ramified in the ring $O_K$, then $pO_K = P^n$, where $P \in \mathrm{Spec}(O_K)$ and $n = [K:\mathbb{Q}]$. It follows immediately that $\Omega(pO_K) = n$ and $H(pO_K) = \log n - \log n = 0$. □
Proposition 8.
Let n be a positive integer, $n \geq 2$, and let p be a positive prime integer. Let K be an algebraic number field of degree $[K:\mathbb{Q}] = n$ and let $O_K$ be its ring of integers. Then:
$$0 \leq H(pO_K) \leq \log \omega(pO_K) \leq \log n,$$
where $\omega(pO_K)$ is the number of distinct prime factors of the ideal $pO_K$.
Proof. 
The proof of the inequality $0 \leq H(pO_K) \leq \log \omega(pO_K)$ is similar to the proof of Proposition 1 (that is, Theorem 2 from the article [5]).
Since $O_K$ is a Dedekind ring, the ideal $pO_K$ is written in a unique way:
$$pO_K = P_1^{e_1} \cdot P_2^{e_2} \cdots P_g^{e_g},$$
where $g \in \mathbb{N}^*$, $e_1, e_2, \ldots, e_g \in \mathbb{N}^*$ and $P_1, P_2, \ldots, P_g \in \mathrm{Spec}(O_K)$. By applying Proposition 7 (i), we obtain that $\omega(pO_K) = g \leq n$. The equality $\omega(pO_K) = n$ is achieved when p splits completely in the ring $O_K$. It follows that
$$0 \leq H(pO_K) \leq \log \omega(pO_K) \leq \log n. \qquad \square$$
Proposition 9.
Let K be an algebraic number field, and let $O_K$ be its ring of integers. Let p be a prime positive integer. If the field extension $\mathbb{Q} \subseteq K$ is a Galois extension, then
$$H(pO_K) = \log \omega(pO_K).$$
Proof. 
By taking into account the fact that $O_K$ is a Dedekind ring and applying Proposition 7 (ii), it follows that the ideal $pO_K$ is uniquely written as follows:
$$pO_K = P_1^{e_1} \cdot P_2^{e_1} \cdots P_g^{e_1},$$
where $g \in \mathbb{N}^*$, $e_1 \in \mathbb{N}^*$ and $P_1, P_2, \ldots, P_g \in \mathrm{Spec}(O_K)$. According to the equivalent form of the entropy of an ideal, the entropy of the ideal $pO_K$ is
$$H(pO_K) = \log(g e_1) - \frac{1}{g e_1} \cdot g \cdot e_1 \cdot \log e_1 = \log g = \log \omega(pO_K). \qquad \square$$

4. Conclusions

The study of entropy in information theory provides a very important tool for measuring uncertainty. The most widely used entropy is the Shannon entropy. There are many studies regarding the characterization and application of the Shannon entropy (see, e.g., [1,2]). We studied a way of measuring the "disorder" of the divisors of a natural number. Since we have $\sum_{d \mid n} \frac{d}{\sigma(n)} = 1$, the ratio $\frac{d}{\sigma(n)}$ can be seen as a probability. As a result, we have a discrete probability distribution associated with a natural number. Similarly, there are some studies related to the entropy of a natural number; namely, Jeong et al., in [3], defined the additive entropy of a natural number in terms of the additive partition function, and in [4], we found the following definition for the entropy of a natural number:
$$\bar{H}(n) := -\sum_{d \mid n} \frac{d}{\sigma(n)} \log \frac{d}{\sigma(n)} = \log \sigma(n) - \frac{1}{\sigma(n)} \sum_{d \mid n} d \log d,$$
where $\sigma(n)$ is the sum of the natural divisors of n. Additionally, we considered the entropy H of a natural number, introduced in [5], which is another type of entropy of a natural number. Mainly, the discussion concerned the properties of the entropy H. In Proposition 6 and Theorem 1, we compared the magnitudes of $H(mn)$ and $H(m) + H(n)$.
In the equality $\sum_{i=1}^{g} \frac{e_i}{\Omega(I)} = 1$, the ratio $\frac{e_i}{\Omega(I)}$ can be seen as a probability. As a result, we have a discrete probability distribution associated with an ideal. Thus, we generalized this notion to ideals and found some of its properties. The proposed entropy of a natural number or of an ideal is of a purely theoretical nature.
In the future, we will look for other connections of entropy with ideals, studying a possible generalization of existing entropy types for natural numbers or for ideals. We will study some inequalities involving the entropy H of an exponential divisor of a positive integer and the entropy H of an exponential divisor of an ideal. Additionally, we shall try to study the entropy of more general ideals of the ring of algebraic integers $O_K$ of an algebraic number field K than the ideals of the form $pO_K$, with p a prime integer.

Author Contributions

Conceptualization, N.M. and D.S.; methodology, N.M. and D.S.; validation, N.M. and D.S.; formal analysis, N.M. and D.S.; investigation, N.M. and D.S.; resources, N.M. and D.S.; writing–original draft preparation, N.M. and D.S.; writing–review and editing, N.M. and D.S.; visualization, N.M. and D.S.; supervision, N.M. and D.S. All authors have read and agreed to the published version of the manuscript.

Funding

Both authors acknowledge the financial support from Transilvania University of Braşov.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors want to thank the anonymous reviewers and editor for their careful reading of the manuscript and for many valuable remarks and suggestions. The authors also want to thank Mirela Ştefănescu for useful discussions on this topic.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Sayyari, Y. New entropy bounds via uniformly convex functions. Chaos Solitons Fractals 2020, 141, 110360.
2. De Gregorio, J.; Sánchez, D.; Toral, R. An improved estimator of Shannon entropy with applications to systems with memory. Chaos Solitons Fractals 2022, 165, 1112797.
3. Jeong, S.; Kim, K.H.; Kim, G. Algebraic entropies of natural numbers with one or two factors. J. Korean Soc. Math. Educ. Ser. B Pure Appl. Math. 2016, 23, 205–221.
4. Entropy of a Natural Number. Available online: https://math.stackexchange.com/questions/2369779/entropy-of-a-natural-number (accessed on 1 August 2022).
5. Minculete, N.; Pozna, C. The Entropy of a Natural Number. Acta Tech. Jaurinensis 2011, 4, 425–431.
6. Minculete, N.; Savin, D. Some generalizations of the functions τ and τ(e) in algebraic number fields. Expo. Math. 2021, 39, 344–353.
7. Furuichi, S.; Minculete, N. Refined Young Inequality and Its Application to Divergences. Entropy 2021, 23, 514.
8. Tsallis, C. Generalized entropy-based criterion for consistent testing. Phys. Rev. E 1998, 58, 1442–1445.
9. Niepostyn, S.J.; Daszczuk, W.B. Entropy as a Measure of Consistency in Software Architecture. Entropy 2023, 25, 328.
10. Ireland, K.; Rosen, M. A Classical Introduction to Modern Number Theory; Springer: New York, NY, USA; Berlin/Heidelberg, Germany; London, UK, 1992.
11. Ribenboim, P. My Numbers, My Friends (Popular Lectures on Number Theory); Springer: New York, NY, USA; Berlin/Heidelberg, Germany, 2000.
12. Ribenboim, P. Classical Theory of Algebraic Numbers; Springer: New York, NY, USA; Berlin/Heidelberg, Germany, 2001.
13. Savin, D.; Ştefanescu, M. Lessons of Arithmetics and Number Theory; Matrix Rom Publishing House: Bucharest, Romania, 2008. (In Romanian)
14. Murty, M.R.; Esmonde, J. Problems in Algebraic Number Theory, 2nd ed.; Springer: New York, NY, USA; Berlin/Heidelberg, Germany, 2005.
