1. Introduction and Motivations
The work of Wojciech H. Zurek is universally recognized for its central importance in the field of quantum foundations, in particular concerning decoherence and the understanding of the elusive border between the quantum and classical realms [1]. Zurek emphasized the role of pointer states and environment-induced superselection rules (einselection). In recent years, part of his work has gone beyond mere decoherence and averaging, focusing on quantum Darwinism and envariance. The main goal of quantum Darwinism is to emphasize the role of multiple copies of information records contained in the local quantum environment. Envariance aims to justify the existence and form of quantum probabilities, i.e., to derive Born’s rule from specific quantum symmetries based on entanglement [2]. In recent important reviews of his work, Zurek stressed the importance of some of these concepts for discussing the measurement problem in relation with various interpretations of quantum mechanics [3,4]. Recent works showed, for instance, the importance of such envariance for the establishment of Born’s rule in the many-worlds and many-minds contexts [5,6]. While in his presentations Zurek generally advocated a neutral position, perhaps located between the Copenhagen and Everett interpretations, we believe his work on entanglement and decoherence could have a positive impact on other interpretations, such as the de Broglie–Bohm theory. We know that Zurek has always been careful concerning Bohmian mechanics (see for example his remarks in [7], p. 209), perhaps because of the strong ontological price one has to pay in order to assume a nonlocal quantum potential and surrealistic trajectories (present even if we include decoherence [3,8]). Nevertheless, the aim of this work is to discuss the pivotal role that quantum entanglement with an environment of “Bohmian pointers” could play in order to justify Born’s rule in the context of such a Bohmian interpretation. The goal is thus to suggest interesting and positive implications that decoherence could have on ontologies different from Everettian or consistent-histories approaches. In this work, we were strongly inspired and motivated by the success of envariance for justifying quantum probabilities. Moreover, as mentioned above, Zurek’s envariance, emphasizing the role of entanglement, is more “interpretation independent”. Therefore, for comparison, we also include in the conclusion a short summary of Zurek’s proof of the Born rule and compare the result with ours.
The de Broglie–Bohm quantum theory (BBQT), introduced by de Broglie in 1927 [9,10,11] and further discussed by Bohm in 1952 [12,13], is now generally accepted as a satisfactory interpretation of quantum mechanics, at least for problems dealing with non-relativistic systems [14,15,16]. Within this regime, BBQT is a clean, deterministic formulation of quantum mechanics preserving the classical concepts of point-like particles moving continuously in space-time. This formulation is said to be empirically equivalent to the orthodox description axiomatized by the Copenhagen school, meaning that BBQT is able to justify and reproduce the probabilistic predictions made by the standard quantum measurement theory. More specifically, this implies recovering the famous Born rule, which connects the probability $p_\alpha$ of observing an outcome $\alpha$ (associated with the quantum observable $\hat{A}$) to the amplitude $c_\alpha$ in the quantum state expansion $|\Psi\rangle=\sum_\alpha c_\alpha|\alpha\rangle$ (i.e., $|\alpha\rangle$ is an eigenstate of $\hat{A}$ for the observable eigenvalue $\alpha$) through the relation $p_\alpha=|c_\alpha|^2$.
This issue has been a recurrent subject of controversy since the early formulation of BBQT (see for example Pauli’s objection in [17,18]). It mainly arises because BBQT is a deterministic mechanics and therefore, as in classical statistical mechanics, probabilities in BBQT can only be introduced in relation to ignorance and uncertainty regarding the initial conditions of the particle motions. Moreover, after more than one and a half centuries of developments since the times of Maxwell and Boltzmann, it is well recognized that the physical and rigorous mathematical foundation of statistical mechanics is still debatable [19]. BBQT, which in some sense generalizes and extends Newtonian mechanics, clearly inherits these difficulties, which constitute strong obstacles to defining a clean basis for its statistical formulation. This fact strongly contrasts with standard quantum mechanics, for which randomness has been axiomatized as genuine and inevitable from the beginning.
Over the years, several responses have been proposed by different proponents of BBQT to justify Born’s rule (for recent reviews, see [20,21,22]). Here, we would like to focus on the oldest approach, which goes back to the work of David Bohm on deterministic and molecular chaos. Indeed, in 1951–1952, Bohm already emphasized the fundamental role of the disorder and chaotic motion of particles for justifying Born’s rule [12,13]. In his early work, Bohm stressed that the complexity of the de Broglie–Bohm dynamics during interaction processes, such as quantum measurements, should drive the system to quantum equilibrium. In other words, during interactions with an environment such as a measurement apparatus, any initial probability distribution $\rho(\mathbf{X},t)$ for N particles in the configuration space (here $\mathbf{X}$ is a vector in the N-particle configuration space) should evolve in time to reach the quantum equilibrium limit $\rho(\mathbf{X},t)\to|\Psi(\mathbf{X},t)|^2$ corresponding to Born’s rule. In this approach, the relaxation process would be induced both by the high sensitivity to changes in the initial conditions of the particle motions (one typical signature of deterministic chaos) and by the molecular thermal chaos resulting from the macroscopic nature of the interacting environment (i.e., with an enormous number of degrees of freedom). Furthermore, in this strategy, Born’s rule $\rho=|\Psi|^2$ should appear as an attractor, similar to the microcanonical and canonical ensembles in thermodynamics. In 1953, Bohm developed an example model [23] (see [24] for a recent investigation of this idea) where a quantum system randomly subjected to several collisions with external particles constituting a bath was driven to quantum equilibrium $\rho=|\Psi|^2$. In particular, during his analysis, Bohm sketched a quantum version of the famous Boltzmann H-theorem to prove the irreversible tendency to reach Born’s rule (for other clues that Bohm was already strongly fascinated by deterministic chaos in the 1950s, see [25] and the original manuscript written by Bohm in 1951 [26] and rediscovered recently).
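To make this notion of relaxation quantitative, one convenient measure of the departure from quantum equilibrium (written here in a generic notation of our own choosing, in the spirit of the Boltzmann-type H-theorem just mentioned) is the relative-entropy functional
$$ H(t) = \int d^{3N}\mathbf{X}\;\rho(\mathbf{X},t)\,\ln\!\frac{\rho(\mathbf{X},t)}{|\Psi(\mathbf{X},t)|^{2}} \;\geq\; 0, $$
which vanishes if and only if $\rho=|\Psi|^{2}$, i.e., exactly when Born’s rule holds. Any H-theorem of the kind sketched by Bohm amounts to showing that a suitable (coarse-grained) version of this quantity decreases towards zero during the interaction with the environment.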
However, in later works, especially in the work conducted with Vigier [27] and subsequently with Hiley [14], Bohm modified the original de Broglie–Bohm dynamics by introducing stochastic and fluctuating elements associated with a subquantum medium forcing the relaxation towards quantum equilibrium $\rho\to|\Psi|^2$. In this context, we mention that very important works have been done in recent years concerning “stochastic Bohmian mechanics” based on the Schrödinger–Langevin framework and the Kostin equation, and involving nonlinearities [28,29,30]. While this second, semi-stochastic approach was motivated by general philosophical considerations [31], proponents of BBQT have felt divided concerning the need for such a modification of the original framework. In particular, starting in the 1990s, Valentini has developed an approach assuming the strict validity of BBQT as an underlying deterministic framework and introduced mixing and coarse-graining à la Tolman–Gibbs in the configuration space in order to derive a Bohmian “subquantum” version of the H-theorem [32,33]. However, we stress that the Tolman–Gibbs derivation [34], and therefore Valentini’s deduction, can be criticized on many grounds (see for example [21] for a discussion). For instance, Prigogine already pointed out that the Tolman–Gibbs “proof” is a priori time-symmetric and cannot therefore be used to derive a relaxation. Furthermore, what the theorems show is that if we define a coarse-grained entropy $\bar{S}(t)$, we necessarily have (i.e., from the concavity of the entropy function) $\bar{S}(t)\geq S(t)=S(t_0)=\bar{S}(t_0)$ (the second equality $S(t)=S(t_0)$ comes from unitarity and Liouville’s theorem, and the third one $S(t_0)=\bar{S}(t_0)$ is an initial condition where the fine-grained and coarse-grained distributions are identical). However, this result cannot be used to directly prove the relation $\bar{S}(t_2)\geq\bar{S}(t_1)$ for $t_2>t_1>t_0$. In other words [21], one cannot show that the entropy is a monotonically growing function ultimately reaching quantum equilibrium (i.e., corresponding to the maximum of the entropy function [32]). Importantly, in his work on the “subquantum heat-death” (i.e., illustrated with many numerical calculations [35,36] often connected with cosmological studies [37,38]), Valentini and coworkers stressed the central role of deterministic chaos in the mixing processes, and this indeed leads to an increase of the entropy function in the examples considered. Moreover, deterministic chaos in BBQT is a research topic in itself (for recent reviews, see [39,40]) and many authors (including Bohm [14] and Valentini [35,36]) have stressed the role of nodal lines associated with phase singularities of the wave-function for steering deterministic chaos in BBQT [41,42,43]. However, it has also been pointed out [39,44] that this chaos is not generic enough to force the quantum relaxation $\rho\to|\Psi|^2$ for arbitrary initial conditions $\rho(\mathbf{X},t_0)$ (a reversibility objection à la Kelvin–Loschmidt is already sufficient to see the impossibility of such a hypothetical deduction [21,45]). Therefore, this analysis ultimately shows that the H-theorem can only make sense if we complete it with a discussion of the notion of typicality [45,46,47].
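For the reader’s convenience, the coarse-graining argument criticized above can be summarized in a few lines (we use a generic notation; the detailed assumptions can be found in [32,34]). If $\bar{\rho}$ denotes the fine-grained distribution $\rho$ averaged over small cells, and $S=-\int\rho\ln\rho$ and $\bar{S}=-\int\bar{\rho}\ln\bar{\rho}$ are the fine- and coarse-grained entropies, then the concavity of $-x\ln x$ (Jensen’s inequality applied cell by cell) gives $\bar{S}(t)\geq S(t)$ at every time, Liouville’s theorem gives $S(t)=S(t_0)$, and the assumption that the two distributions coincide initially gives $S(t_0)=\bar{S}(t_0)$. Chaining these relations yields
$$ \bar{S}(t) \;\geq\; \bar{S}(t_0), $$
but nothing in the argument orders $\bar{S}(t_2)$ and $\bar{S}(t_1)$ for two later times $t_2>t_1>t_0$; this is precisely why additional dynamical ingredients (mixing, chaos, and, as we argue below, entanglement) are required.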
In the present work, we emphasize the role of an additional ingredient that (together with chaos and coarse-graining) helps and steers the quantum dynamical relaxation $\rho\to|\Psi|^2$: quantum entanglement with the environment. The idea that quantum correlations must play a central role in BBQT for justifying Born’s rule is of course not new. Bohm already emphasized the role of entanglement in his work [13,14,23]. It has also been shown that entanglement could lead to Born’s rule using ergodicity [48]. Moreover, in recent studies motivated by the Vigier–Bohm analysis, we developed a Fokker–Planck [22] and a Langevin-like [49] description of relaxation to quantum equilibrium $\rho\to|\Psi|^2$ by coupling a small system S to a thermal bath or reservoir T inducing a Brownian motion on S. We showed that, under reasonable assumptions, we can justify a version of the H-theorem where quantum equilibrium appears as a natural attractor. Furthermore, at the end of [22], we sketched an even simpler strategy based on mixing together with entanglement and involving deterministic chaotic iterative maps. After the development of this idea, it came to our attention that a similar strategy had already been developed in an elegant work by Philbin [50], and therefore we did not include much detail concerning our model in [22]. Here, we present the missing part and provide a more complete and quantitative description of our scenario, which is presented as an illustration of a more general scheme. More precisely, we (i) analyze the chaotic character of the specific de Broglie–Bohm dynamics associated with our toy model, (ii) build a Boltzmann diffusion equation for the probability distribution, and finally (iii) derive a simple H-theorem from which Born’s rule turns out to be an attractor. We emphasize that our work, like that of Philbin, suggests interesting future developments for justifying Born’s rule and recovering standard quantum mechanics within BBQT.
2. The Status of Born’s Rule in the de Broglie–Bohm Theory
We start with the wave-function $\Psi(x,t)$ obeying Schrödinger’s equation
$$ i\hbar\,\partial_t\Psi(x,t) = -\frac{\hbar^2}{2m}\nabla^2\Psi(x,t) + V(x,t)\,\Psi(x,t) \qquad (1) $$
for a single nonrelativistic particle with mass m in the external potential $V(x,t)$ (we limit the analysis to a single particle, but the situation is actually generic). BBQT leads to the first-order “guidance” law of motion
$$ \frac{d}{dt}x(t) = v(x(t),t), \qquad (2) $$
where
$$ v(x,t) = \frac{\hbar}{m}\,\mathrm{Im}\!\left[\frac{\nabla\Psi(x,t)}{\Psi(x,t)}\right] \qquad (3) $$
defines an Eulerian velocity field and $x(t)$ is a de Broglie–Bohm particle trajectory. Furthermore, from Equation (2), we obtain the conservation rule:
$$ \partial_t|\Psi(x,t)|^2 + \nabla\cdot\!\big[|\Psi(x,t)|^2\,v(x,t)\big] = 0, \qquad (4) $$
where we recognize $|\Psi(x,t)|^2$ as the distribution which is usually interpreted as Born’s probability density. Now, in the abstract probability theory, we assign to every point $x$ a density $\rho(x,t)$ corresponding to a fictitious conservative fluid obeying the constraint
$$ \partial_t\rho(x,t) + \nabla\cdot\!\big[\rho(x,t)\,v(x,t)\big] = 0. \qquad (5) $$
Comparing with Equation (4), we deduce that the normalized distribution $f(x,t)=\rho(x,t)/|\Psi(x,t)|^2$ satisfies the equation
$$ \frac{d}{dt}f(x(t),t) = \big[\partial_t + v(x,t)\cdot\nabla\big]f(x,t)\big|_{x=x(t)} = 0. \qquad (6) $$
This actually means [23] that f is an integral of motion along any trajectory $x(t)$. In particular, if $f=1$ (i.e., $\rho=|\Psi|^2$) at a given time $t_0$ and for any point $x$, this holds true at any time t. Therefore, Born’s rule being valid at a given time will be preserved at any other time [11,12,23]. It is also important to see that the relation $\frac{d}{dt}f(x(t),t)=0$ plays the same role in BBQT for motions in the configuration space as Liouville’s theorem $\frac{d}{dt}\rho_{\mathrm{cl}}(q(t),p(t),t)=0$ in classical statistical mechanics (where $\rho_{\mathrm{cl}}(q,p,t)$ is the probability density in phase space $\{q,p\}$). Therefore, with respect to the measure $|\Psi(x,t)|^2\,d^3x$ (which is preserved in time along trajectories since $\frac{d}{dt}\big[|\Psi(x(t),t)|^2\,\delta^3x(t)\big]=0$), the condition $f=1$ is equivalent to the postulate of equiprobability used in standard statistical mechanics for the microcanonical ensemble. Clearly, we see that the inherent difficulties existing in classical statistical mechanics to justify the microcanonical ensemble are transposed in BBQT to justify Born’s rule; i.e., $\rho(x,t)=|\Psi(x,t)|^2$.
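To illustrate the preservation of Born’s rule concretely, the following minimal numerical sketch (ours, not part of the original derivation) uses a freely spreading Gaussian packet, for which the guidance velocity is known in closed form ($\hbar=m=1$): initial positions are sampled according to $|\Psi(x,0)|^2$ and transported with the guidance law, and the resulting ensemble still reproduces $|\Psi(x,T)|^2$ at a later time T.

```python
import numpy as np

# Freely spreading 1D Gaussian packet (hbar = m = 1), a standard exact solution:
# |Psi(x,t)|^2 is a Gaussian of width s(t) = s0*sqrt(1 + (t/(2*s0**2))**2),
# and the guidance law dx/dt = v(x,t) reduces to v(x,t) = x * s'(t)/s(t),
# so that exact trajectories satisfy x(t) = x(0) * s(t)/s0.
s0 = 1.0

def width(t):
    return s0 * np.sqrt(1.0 + (t / (2.0 * s0**2))**2)

def velocity(x, t):
    dwidth = (t / (4.0 * s0**3)) / np.sqrt(1.0 + (t / (2.0 * s0**2))**2)
    return x * dwidth / width(t)

rng = np.random.default_rng(0)
x = rng.normal(0.0, s0, size=200_000)   # initial ensemble distributed as |Psi(x,0)|^2

# Transport the ensemble with a simple Euler integration of the guidance law.
t, dt, T = 0.0, 1e-3, 4.0
while t < T:
    x += dt * velocity(x, t)
    t += dt

print("ensemble standard deviation:", x.std())
print("Born-rule prediction       :", width(T))   # the two agree to a few 0.1%
```

Starting instead from a non-equilibrium ensemble (e.g., positions drawn from a distribution narrower than $|\Psi(x,0)|^2$) would leave $f=\rho/|\Psi|^2$ frozen along the trajectories, in agreement with Equation (6): the exact one-particle dynamics alone does not relax towards Born’s rule.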
At that stage, the definition of the probability $\rho(x,t)\,d^3x$ of finding a particle in the infinitesimal volume $d^3x$ is rather formal and corresponds to a Bayesian–Laplacian interpretation where probabilities are introduced as a kind of measure of chance. Moreover, in BBQT, the actual and measurable density of particles must be defined using a “collective” or ensemble of N independent systems prepared in similar quantum states $\psi(x_i,t)$ with $i=1,\dots,N$. However, the concept of independency in quantum mechanics imposes the whole statistical ensemble with N particles to be described by the total factorized wave-function:
$$ \Psi_N(x_1,\dots,x_N,t) = \prod_{i=1}^{N}\psi(x_i,t), \qquad (7) $$
as a solution of the equation
$$ i\hbar\,\partial_t\Psi_N = \sum_{i=1}^{N}\Big[-\frac{\hbar^2}{2m}\nabla^2_{x_i} + V(x_i,t)\Big]\Psi_N. \qquad (8) $$
For this quantum state $\Psi_N$, BBQT allows us to build the velocity fields $v(x_i,t)$, where $x_i(t)$ define the de Broglie–Bohm paths for the uncorrelated particles (i.e., guided by the individual and independent wave functions $\psi(x_i,t)$ and Eulerian flows $v(x_i,t)$). Within this framework, the actual density of particles $n(x,t)$ at point $x$ is given by
$$ n(x,t) = \frac{1}{N}\sum_{i=1}^{N}\delta^3\big(x-x_i(t)\big), \qquad (9) $$
which clearly obeys the conservation rule
$$ \partial_t n(x,t) + \nabla\cdot\!\big[n(x,t)\,v(x,t)\big] = 0. \qquad (10) $$
Comparing with Equation (6), we see that if $\rho(x,t)$ plays the role of an abstract Laplacian probability, $n(x,t)$ instead represents the frequentist statistical probability. Both concepts are connected by the weak law of large numbers (WLLN), which is demonstrated in the limit $N\to+\infty$ and leads to the equality $n(x,t)\simeq|\psi(x,t)|^2$; i.e.,
$$ \frac{1}{N}\sum_{i=1}^{N}\delta^3\big(x-x_i(t)\big) \simeq |\psi(x,t)|^2, \qquad (11) $$
where the equality must be understood in the sense of a “limit in probability” based on typicality and not as the more usual “point-wise limit”. We stress that the application of the WLLN already relies on the Laplacian notion of measure of chance since, by definition, in a multinomial Bernoulli process the abstract probability density $|\Psi_N(\mathbf{X},t)|^2$ is used for weighting an infinitesimal volume of the N-particle configuration space $d^{3N}\mathbf{X}$. It can be shown that in the limit $N\to+\infty$, with the use of this measure $|\Psi_N|^2\,d^{3N}\mathbf{X}$, almost all possible configurations $\mathbf{X}(t)=[x_1(t),\dots,x_N(t)]$ obey the generalized Born’s rule of Equation (11) (the fluctuation varying as $1/\sqrt{N}$). It is in that sense that Equation (11) is said to be typical, where typical means valid for “overwhelmingly many” cases; i.e., almost all states in the whole configuration space weighted by $|\Psi_N|^2\,d^{3N}\mathbf{X}$. The application of the law of large numbers to BBQT is well known and well established [33,46,47] but has been the subject of intense controversies [45,46,51,52]. Issues concern (1) the interpretation of $|\Psi_N|^2$ as a probability density—i.e., in relation with the notion of typicality—and (2) the choice of $|\Psi_N|^2$ as natural and guided by the notion of equivariance [53]. To paraphrase David Wallace, the only thing the law of large numbers proves is that relative frequency tends to weight … with high weight [54]. However, even though there is a certain circularity in the reasoning, it at least shows that the axiomatic structure of the probability calculus allows us to identify an abstract probability such as $\rho$ with a frequency of occurrence such as $n$. Yet the WLLN alone is unable to guide us in selecting a good measure for weighting typical configurations (the condition of equivariance [53] is only a convenient mathematical recipe based on elegant symmetries, not a physical consequence of a fundamental principle). Therefore, the value of the f function is unconstrained by the typicality reasoning without already assuming the result [51]. In other words, it is impossible to deduce Born’s rule directly from the WLLN.
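As a simple illustration of what the WLLN does (and does not) provide, the short sketch below (our own, with an arbitrary choice of $|\psi|^2$ as a standard normal density) shows that the empirical frequency in a cell converges to its Born weight with fluctuations of order $1/\sqrt{N}$—provided the configurations are drawn with the Born weight in the first place, which is precisely the typicality assumption discussed above.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Take |psi(x)|^2 to be a standard normal density and consider the cell [0, 1].
born_weight = 0.5 * (math.erf(1.0 / math.sqrt(2.0)) - math.erf(0.0))

for N in (10**2, 10**4, 10**6):
    x = rng.normal(0.0, 1.0, size=N)            # N independent copies, Born-distributed
    frequency = np.mean((x > 0.0) & (x < 1.0))  # empirical frequency in the cell
    print(f"N = {N:>7d}: |frequency - Born weight| = {abs(frequency - born_weight):.2e}"
          f"   (1/sqrt(N) = {N ** -0.5:.2e})")
```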
However, it must be perfectly clear that our aim here is not to criticize the concept of typicality. Typicality, associated with the names of physicists such as Boltzmann or mathematicians such as Cournot and Borel, is, we think, at the core of any rigorous formulation of objective probability [55]. Our goal in the next section is to understand how natural and how stable the Born rule $\rho=|\Psi|^2$ is. For this purpose, our method is to consider entanglement between an environment of pointers, already in quantum equilibrium, and a not-yet-equilibrated system driven by chaotic Bohmian dynamics to the quantum equilibrium regime.
4. Conclusions and Perspectives
The proposal discussed in this work is certainly schematic, but it leads to several interesting conclusions. First, since the dynamical maps used here are deterministic and chaotic, randomness is unavoidable in BBQT. As stressed by Prigogine [62,63], we have two complementary descriptions: one with trajectories, associated with the evolution map, and a second with a probability density, i.e., as given by the Perron–Frobenius transformation. The two pictures are of course not independent, since for a single trajectory we have $\rho_n(x)=\delta(x-x_n)$ (i.e., the Perron–Frobenius evolution of a Dirac distribution simply transports it along the trajectory, $\rho_{n+1}(x)=\delta(x-x_{n+1})$). Moreover, for a trajectory, the probability distribution is singular and the convergence to equilibrium is infinitely slow (this is connected to the fact that the coefficients in Equation (51) are given by an integral which is badly defined for the singular Dirac distribution $\delta(x-x_n)$). Therefore, the infinite precision required to compute such a chaotic path (due to the exponentially growing deviation errors with time) leads all practical computations to the strong randomness previously mentioned. To quote Ford [61], “a chaotic orbit is random and incalculable; its information content is both infinite and incompressible”. Subsequently, because of the extreme sensitivity to the initial conditions associated with the predictability horizon and the positive Lyapunov exponent, the use of probability distributions in BBQT seems (at least in our model) unavoidable if we follow Prigogine’s reasoning. Indeed, for Prigogine, dynamical instability (and thus deterministic chaos) leads to probability. The necessarily finite precision $\delta x$ used to determine the position of a particle will grow exponentially with time to ultimately cover the whole segment. Therefore, if we assign a uniform ignorance probability over the small segment $\delta x$ in which the particle is initially located, then, after a few iterations, we will have a uniform probability over the whole segment.
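The two complementary pictures can be compared explicitly in the idealized case of the dyadic Bernoulli (doubling) map, which we use here only as a stand-in for the chaotic dynamics of our model (a sketch under that simplifying assumption): a smooth non-equilibrium density relaxes very rapidly to the uniform equilibrium density under the Perron–Frobenius operator, whereas the Dirac density attached to a single trajectory never spreads.

```python
import numpy as np

M = 2**10                                   # dyadic bins covering the segment [0, 1)
x = (np.arange(M) + 0.5) / M
rho = np.exp(-0.5 * ((x - 0.3) / 0.05)**2)  # smooth but strongly non-uniform density
rho /= rho.mean()                           # normalize so that the uniform value is 1

def perron_frobenius(r):
    """Exact Perron-Frobenius step for T(x) = 2x mod 1 acting on densities that are
    piecewise constant on the dyadic bins: (Lr)(x) = (r(x/2) + r((x+1)/2)) / 2."""
    i = np.arange(M)
    return 0.5 * (r[i // 2] + r[(i + M) // 2])

for n in range(12):
    distance = np.abs(rho - 1.0).mean()     # L1 distance to the uniform (equilibrium) density
    print(f"iteration {n:2d}: distance to equilibrium = {distance:.3e}")
    rho = perron_frobenius(rho)

# A single orbit x -> 2x mod 1, by contrast, corresponds to a Dirac density that is merely
# transported: its distance to equilibrium never decreases, which is the "infinitely slow
# convergence" of the trajectory picture discussed above.
```

On this dyadic grid the density becomes exactly uniform after $\log_2 M = 10$ iterations, illustrating how fast the density picture equilibrates.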
However, we stress that we do not share all the conclusions obtained by Prigogine concerning determinism and probability (for related and much more detailed criticisms, see e.g., [64]). Indeed, BBQT (like the classical mechanics considered by Prigogine in [62,63]) is a fully deterministic theory with a clear ontology in 3D space and in configuration space. Therefore, while a trajectory could be incalculable by any finite means or algorithm, the path still fundamentally exists for an idealized Laplacian demon, i.e., one having access to infinite computing power and precision for locating and defining the particle motion. This metaphor is the core idea of Einstein’s realism: postulating the existence of a world independent of the presence or absence of observers (even if the observers can be part of the world). From this ontic perspective, we need more than simply ignorance in order to justify the use of probability in statistical physics. Indeed, as emphasized long ago by Poincaré, the laws of the kinetic theory of gases still hold true even if we exactly know the positions of all molecules [65]. There is something objective in the laws of statistical mechanics that goes beyond mere ignorance: otherwise, how could parameters such as diffusion constants have objective physical content? This point was emphasized by Prigogine from the very beginning, and it constitutes the motivation for his program to justify the objectivity of thermodynamics in general and of the second law—i.e., irreversibility—in particular.
However, in our opinion, the missing point in Prigogine’s implication—“instability → probability → irreversibility”—is the recognition that in a deterministic theory, the laws (chaotic or not) are not complete but must be supplemented by specific initial conditions, ultimately with a cosmological origin. Indeed, if we suppose a universe made of only one electron, described initially by its wave-function, and of all the pointers involved in the iterative procedure sketched in Figure 4, then we must use the chaotic Bernoulli map for this system or, equivalently, the Perron–Frobenius evolution. As we have explained, this system is unstable due to the presence of a positive Lyapunov exponent. Moreover, if we want to make sense of Formulas (49) and (50), with their rapid convergence to equilibrium, we must consider a sufficiently regular distribution. Now, as mentioned in Section 2, the application of the WLLN to a statistical ensemble requires a “metric” of typicality associated with the Laplacian definition of probability. In BBQT, this metric reads $|\Psi_N|^2\,d^{3N}\mathbf{X}$, and the law of large numbers leads to Equation (11)—i.e., $n(x,t)\simeq|\psi(x,t)|^2$—defined probabilistically in the long term; i.e., for an infinitely long sequence or an infinite system. In our problem, this means that we consider an infinite Gibbs ensemble of copies similar to our system, as described in Figure 4. Here, the presence of an infinite sum of Dirac distributions is expected to lead to difficulties in connection with the chaotic map. In our problem, if the WLLN is used to specify the initial distribution at the initial time, this preserves the chaotic description associated with the positive Lyapunov exponent; therefore, Dirac distributions become problematic. In order to remove this unpleasant feature, one must introduce coarse-graining as proposed by Valentini [32,51]. In our case, this can be done by using a regular weighting function that smooths the singular Dirac distributions over small cells, which in connection with the WLLN leads to a regular coarse-grained density obeying Born’s rule. The coarse-graining of cells in the configuration space plays a central role in the work of Valentini for defining a “subquantum H-theorem” [32,33]. Here, we see that, in connection with Prigogine’s work, coarse-graining must be supplemented with a dose of deterministic chaos and entanglement in order to reach the quantum equilibrium regime. We believe that these two pictures complete each other very well.
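The role of coarse-graining can also be illustrated in the same simplified Bernoulli-map setting (again a sketch of ours, with an arbitrary choice of cell size and initial bunching): a finite sum of Dirac distributions—an ensemble of trajectories—never becomes a smooth function, but its coarse-grained histogram over cells of width $\delta$ relaxes towards the uniform equilibrium value, down to the residual statistical fluctuations expected from the WLLN.

```python
import numpy as np

rng = np.random.default_rng(2)

# A finite ensemble of trajectories, initially bunched far from equilibrium,
# each iterated with the chaotic doubling map x -> 2x mod 1.
n_traj = 50_000
x = rng.uniform(0.40, 0.45, size=n_traj)

n_cells = 50   # coarse-graining cells of width delta = 1/n_cells on [0, 1)

def coarse_grained_density(points):
    hist, _ = np.histogram(points, bins=n_cells, range=(0.0, 1.0), density=True)
    return hist

for n in range(10):
    rho_bar = coarse_grained_density(x)
    distance = np.abs(rho_bar - 1.0).mean()
    print(f"iteration {n:2d}: coarse-grained distance to equilibrium = {distance:.3f}")
    x = (2.0 * x) % 1.0        # the map acts on every (pointlike) trajectory

# The distance saturates at the ~1/sqrt(n_traj/n_cells) level: the fine-grained
# distribution remains a sum of Dirac peaks; only its coarse-grained version
# reaches the (uniform) quantum equilibrium.
```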
Before summarizing our work, it is important to go back to Zurek’s envariance, as discussed in the introduction, in order to see the connections with the derivation of Born’s rule presented in this article. We remind the reader that in 2003, Zurek [66] proposed an alternative proof of Born’s rule based on envariance—a neologism for environment-assisted invariance—i.e., a purely quantum symmetry based on the entanglement of a system with its environment. The importance of this elegant proof could perhaps only be compared with that of the proof presented by Gleason [67] in 1957. As stressed by Zurek, “Envariance of entangled quantum states follows from the nonlocality of joint states and from the locality of systems, or, put a bit differently, from the coexistence of perfect knowledge of the whole and complete ignorance of the parts” [66]. The proof is remarkably general and does not rely on any specific ontology, even though it has been used by advocates of the many-worlds interpretation to justify or recover Born’s rule (for a review and a comparison with the decision-theoretic deduction [5], see [6]).
In order to give a rough idea of the whole derivation, consider a Bell state
$$ |\Psi_{S,E}\rangle = \frac{1}{\sqrt{2}}\Big(|♡\rangle_S\,|\varepsilon_♡\rangle_E + |⟡\rangle_S\,|\varepsilon_⟡\rangle_E\Big) $$
between a system S and an environment E. Now, the main idea of envariance concerns symmetry: a local “swapping” (for example, on S for the two possible outcomes ♡ and ⟡) in the entanglement is irrelevant for the local physics of E (this is obvious a priori, since E is untouched by the swap). This (unitary) swap reads
$$ \hat{u}_S = |♡\rangle_S\langle⟡| + |⟡\rangle_S\langle♡| :\qquad \hat{u}_S|\Psi_{S,E}\rangle = \frac{1}{\sqrt{2}}\Big(|⟡\rangle_S\,|\varepsilon_♡\rangle_E + |♡\rangle_S\,|\varepsilon_⟡\rangle_E\Big). $$
The symmetry of the swap should a priori also impact the probabilities associated with the outcomes (whatever the definition used for a probability). In other words, if we are allowed to define a probability function $p(♡)$, $p(⟡)$ for the two correlated outcomes ♡ and ⟡ before the swap, then the previous equation imposes
$$ p'(♡) = p(⟡), \qquad p'(⟡) = p(♡), \qquad (60) $$
where $p'$ is a probability after the swap (i.e., defined for the state $\hat{u}_S|\Psi_{S,E}\rangle$). Moreover, the swap on S can be compensated by a “counterswap” acting locally on the subsystem E:
$$ \hat{u}_E = |\varepsilon_♡\rangle_E\langle\varepsilon_⟡| + |\varepsilon_⟡\rangle_E\langle\varepsilon_♡| :\qquad \hat{u}_E\,\hat{u}_S\,|\Psi_{S,E}\rangle = |\Psi_{S,E}\rangle. \qquad (61) $$
Now, again from symmetry (the counterswap acts only on E, leaving S untouched, while restoring the initial state), we must have the relation
$$ p'(♡) = p(♡), \qquad p'(⟡) = p(⟡). \qquad (62) $$
However, by comparing Equation (60) and Equation (62), we clearly deduce
$$ p(♡) = p(⟡) = \tfrac{1}{2}, $$
which implies equiprobability for the two branches in the state $|\Psi_{S,E}\rangle$. This equiprobability is clearly an illustration of Born’s rule for the entangled state $|\Psi_{S,E}\rangle$. Therefore, envariance can be used to derive Born’s rule (more general reasonings and deductions are given in [66]).
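The swap/counterswap reasoning can be checked directly on a two-qubit representation of the Bell state (a self-contained numerical sketch with our own encoding of the two outcomes as basis vectors; it is an illustration, not Zurek’s general proof). Note that extracting the probabilities from the reduced density matrix of S of course already presupposes Born’s rule: the point of envariance is that the symmetry argument alone forces the equiprobability.

```python
import numpy as np

# Encode the outcomes: |0> plays the role of "hearts", |1> the role of "diamonds",
# for both the system S (first factor) and the environment E (second factor).
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Bell state (|0>_S |0>_E + |1>_S |1>_E) / sqrt(2)
psi = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2.0)

X = np.array([[0.0, 1.0], [1.0, 0.0]])   # exchange of the two outcomes
I2 = np.eye(2)

swap_S = np.kron(X, I2)                  # local swap acting on S only
counterswap_E = np.kron(I2, X)           # local counterswap acting on E only

psi_swapped = swap_S @ psi
psi_restored = counterswap_E @ psi_swapped
print("counterswap restores the original state:", np.allclose(psi_restored, psi))

# Reduced density matrix of S (partial trace over E) and the two outcome probabilities.
rho_SE = np.outer(psi, psi.conj())
rho_S = np.trace(rho_SE.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print("p(hearts), p(diamonds):", np.real(np.diag(rho_S)))   # -> [0.5, 0.5]
```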
It is important to remark that the reasoning depends on the a priori existence of a probability function, and in order to justify this point, we should rely on a more precise definition of probability in a given ontology. Moreover, in the de Broglie–Bohm ontology, as in classical statistical mechanics, the concept of probability is related to a distribution of particles in ensembles or collectives and is therefore strongly rooted in the concepts of frequency and population. In other words, if we consider a large ensemble of copies of the entangled systems prepared in the quantum state $|\Psi_{S,E}\rangle$, then according to the Bernoulli WLLN, the probability $p(♡)$ (or $p(⟡)$) is simply a measure of the fraction of systems found in the state $|♡\rangle_S|\varepsilon_♡\rangle_E$ (or $|⟡\rangle_S|\varepsilon_⟡\rangle_E$). Now, in the de Broglie–Bohm theory (like in classical physics), the x-coordinates of particles define a “preferred basis” in the sense that particles are really located at some positions defining trajectories. Zurek’s envariance can thus be applied to the de Broglie–Bohm ontology if we consider systems S and E that are well located in the configuration space. Therefore, as in the model used in the present article, we can consider two non-overlapping wave-functions $\langle x_S|♡\rangle$ and $\langle x_S|⟡\rangle$ associated with the coordinates in the configuration space of the S-subsystem, and similarly two non-overlapping wave-functions $\langle x_E|\varepsilon_♡\rangle$ and $\langle x_E|\varepsilon_⟡\rangle$ for the E-subsystem. In this ontology, we can give a physical meaning to the invariance under the swap or counterswap conditions.
It is indeed possible to postulate that there are as many copies of the systems prepared in the state $|♡\rangle_S|\varepsilon_♡\rangle_E$ as in the state $|⟡\rangle_S|\varepsilon_⟡\rangle_E$ in the universe. The situation is similar to the one found in a classical gas of molecules, where correlated pairs can be defined by exchanging some properties and are present in equal numbers before and after the swap (this kind of symmetry played a key role in the deduction made by Maxwell and Boltzmann justifying the canonical ensemble distribution). Fundamentally, this symmetry in the population is related to some choices in the initial conditions of the whole ensemble. The full deduction of Zurek based on envariance is thus preserved, and this must lead to Born’s rule (at least if we assume that the population of de Broglie–Bohm particles is uniformly distributed in the spatial supports of the various wave functions).
Furthermore, it is important to stress that the envariance deduction is linked to the no-signaling theorem, as shown by Barnum [68]. This no-signaling theorem was also emphasized by Valentini [69] in the de Broglie–Bohm theory in order to protect macroscopic causality and to prohibit faster-than-light signaling. Crucially, Born’s rule appears as a necessary condition for the validity of the no-signaling theorem (this was also related to the second law of thermodynamics by Elitzur [70]). Interestingly, in the present work, we considered regimes of quantum nonequilibrium where the symmetry of the entangled wave-functions was not present in the particle distribution characterized by the f function. However, in the end, we showed that if the environment of pointers was already in quantum equilibrium, then the system would be driven to the quantum equilibrium $\rho=|\Psi|^2$ acting as an attractor under the chaotic Bohmian dynamics. Ultimately, this also shows how natural quantum equilibrium is in the de Broglie–Bohm dynamics, and how fragile and unstable physical deviations from the Born rule are. We believe that this confirms the deductions made by Zurek concerning the fundamental role of envariance.
There is another way to express the same concept: going back to our discussion about typicality at the end of Section 2, we see that in this article we indeed developed a model that does not assume quantum equilibrium for all particles. The system moving in the interferometer is initially out of quantum equilibrium, $\rho\neq|\Psi|^2$. However, it is quickly driven to quantum equilibrium due to (1) entanglement with pointers already relaxed in the regime $\rho=|\Psi|^2$ and (2) the presence of chaotic dynamics inducing fast mixing and thus a fast relaxation $\rho\to|\Psi|^2$. It is interesting that the number of iterations N, and therefore the number of pointers involved in the process, does not have to be large (i.e., we do not have to go to the thermodynamic limit $N\to+\infty$ associated with a quantum bath). As we have shown, the chaotic Bernoulli map drives the system to quantum equilibrium already after a few iterations. This demonstrates, we think, the robustness of this attractor leading to Born’s rule.
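The statement that a handful of iterations suffices can be made quantitative in the idealized case where the dynamics is modeled by the dyadic Bernoulli map (a simplifying assumption on our part, not the full dynamics of our toy model). Writing the density on the unit segment as a Fourier series, the Perron–Frobenius operator simply relabels the coefficients,
$$ \rho_n(x)=\sum_{k}c^{(n)}_{k}\,e^{2i\pi kx}, \qquad c^{(n+1)}_{m}=c^{(n)}_{2m}, $$
so that after $n$ iterations only the modes $k=2^{n}m$ of the initial density survive. For a smooth (coarse-grained) initial distribution whose Fourier coefficients decay as $|c^{(0)}_{k}|\leq C|k|^{-s}$, the deviation from the uniform equilibrium density is therefore bounded by $\sim C\,2^{-ns}$: it is suppressed exponentially with the number of iterations, i.e., with the number of entangled pointers.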
To summarize, in this work we have proposed a mechanism for relaxation to quantum equilibrium in order to recover Born’s rule in BBQT. The proposed mechanism relies on entanglement with an environment of “Bohmian pointers” allowing the system to mix. The scenario was developed for the case of a single particle in 1D motion interacting with beam splitters and mirrors, but the model could be generalized to several situations involving collisions between quanta and scattering with defects or other particles. The general proposal is thus to consider the quantum relaxation to Born’s rule as a genuine process in phases of matter where interactions between particles play a fundamental role. This includes usual condensed matter, or even plasmas and gases, where collisions are unavoidable. For example, based on our toy model, we consider that interaction with a beam splitter and entanglement with Bohmian pointers provide a good qualitative model for discussing collisions between molecules in the atmosphere. Recalling that nitrogen molecules at a temperature of 293 K and a pressure of 1 bar typically undergo collisions at a frequency of about $7\times10^{9}$/s, we thus have a huge number n of collisions per second, corresponding to a huge number of iterations in our Bernoulli-like process based on the Perron–Frobenius operator (which implies a fast dynamics for reaching quantum equilibrium). Compared to Valentini’s framework [32,33], where mixing and relaxation to quantum equilibrium are associated with coarse-graining à la Gibbs, our approach emphasizes the role of information losses due to entanglement with a local environment. In both cases, we obtain an increase of entropy and a formulation of the H-theorem for BBQT. These two views are certainly complementary, in the same way that the Gibbs and Boltzmann perspectives on entropy are related. This could have an impact on the efficiency of quantum relaxation in the early stages of the evolution of the universe [37,38].
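As a consistency check of the collision frequency quoted above, elementary kinetic theory gives $\nu=\sqrt{2}\,n\,\sigma\,\bar{v}$ with $\bar{v}=\sqrt{8k_{B}T/\pi m}$; the short sketch below (which assumes a textbook effective diameter of about 0.37 nm for the N$_2$ molecule, a value we supply for illustration) indeed reproduces a frequency of order $7\times10^{9}$ collisions per second.

```python
import math

k_B = 1.380649e-23               # Boltzmann constant, J/K
T = 293.0                        # temperature, K
P = 1.0e5                        # pressure, Pa (1 bar)
m = 28.0e-3 / 6.02214076e23      # mass of one N2 molecule, kg
d = 0.37e-9                      # assumed effective N2 molecular diameter, m

n = P / (k_B * T)                                  # number density (ideal gas law)
v_mean = math.sqrt(8.0 * k_B * T / (math.pi * m))  # mean molecular speed
sigma = math.pi * d**2                             # hard-sphere collision cross-section
nu = math.sqrt(2.0) * n * sigma * v_mean           # collision frequency per molecule

print(f"number density      : {n:.3e} m^-3")
print(f"mean speed          : {v_mean:.0f} m/s")
print(f"collision frequency : {nu:.2e} s^-1")      # about 7e9 collisions per second
```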