Friction is the price for moving too fast. An attempt to induce a rapid change in the state of a system is accompanied by additional entropy generation and encumbered by energy costs. As an example of friction, consider a body moving rapidly against a stationary background. Its kinetic energy is dissipated, generating heat and entropy in the environment, and the amount of dissipation is proportional to the velocity. Another archetypal case of friction is a driven gas-compression process. For a system that is thermally decoupled from its environment, rapid changes in the position of the piston compressing the gas will result in internal heating of the gas and entropy generation. To restore the system to the state reached by its slow, quasistatic equivalent, heat has to be removed from the gas. This additional heat is equivalent to extra work against friction.
Several important changes occur in the transition into quantum mechanics. First, the physics of an isolated system is strictly reversible and unitary. We will argue below that internal friction still occurs but its analytical description requires a proper extension of the classical concepts of entropy and temperature. Furthermore, we will point to a new mechanism that creates distinctly quantum internal friction.
Secondly, the quantum description of the system plus its environment allows for entanglement correlations that have no classical analog. We will describe the quantum theory of open systems in some detail, and argue that in the quantum domain it is difficult to separate the system from its environment. This difficulty has led to misunderstandings and incorrect attempts to characterize friction.
Finally, quantum fluctuations and zero-point energy change the nature of the environment, which leads to additional frictional forces.
Both processes can have quantum contributions. Quantum fluctuations will generally function as an extra dissipative environment, while quantum non-commutativity will ensure that the system cannot perfectly follow external changes.
Throughout our treatment we will consider quantum systems exclusively. We will refer to the classical limit when appropriate.
1. Internal Friction
Internal friction is induced in a system when its external constraints are changed rapidly. Consider a quantum system with a discrete energy spectrum (we will further assume non-degeneracy for simplicity). An ensemble will have some average energy $\langle E\rangle = \sum_n p_n E_n$ (where $p_n$ is the probability to find the system in the $n$-th energy eigenstate). A rapid change in external constraints corresponds to a change in some external semi-classical field in the system’s Hamiltonian. For the paradigmatic case of a gas in a piston, the field will be related to the location of the potential barrier confining the gas particles. If the change to the external field is slow enough, the quantum adiabatic theorem assures us that there will be no change in the energy populations $p_n$, and therefore the change in the energy will be due only to the change in the energy levels. The energy in such a “quasistatic” process changes minimally in this sense.
Now consider a faster change. The rapid change in energy levels can now lead to population changes, changing the energy beyond the “minimal” quasistatic change of the energy levels themselves. For simplicity, let us assume for now that we start from the ground state. Then we can only lose population to higher states, so that we can only reach a higher (or equal) energy compared to the quasistatic change. Thus we encounter a “resistance” to velocity: when we strive to drive the system quickly we need to invest more work.
This process, by itself, is reversible: simply reversing the field-change protocol will yield back the original state and the original energy. The evolution is reversible because the dynamics is unitary. However, consider appending a non-unitary step to the process: we now hold the field at its final value for a time, while bringing the system into contact with a heat bath at its original temperature (zero temperature, in this example). This will induce irreversible thermalization and loss of information about the original state. Such thermalization will convert the extra energy invested into extra heat in the environment.
This process can be generalized to a finite temperature. Allahverdyan and Nieuwenhuizen proved that, barring level-crossing, for a system initially in a thermal state the minimal work (energy) is reached by a quasistatic process [4,5]. The derivation is too long to be repeated here, but in general it hinges on realizing that the state’s eigenvalues do not change during unitary evolution, and that for a smooth enough field-change protocol the adiabatic theorem ensures the absence of transitions between states. Having an initial thermal state is not required, but it is necessary to have an initial state with decreasing occupations in the energy eigenbasis. Most importantly, the requirement of no level crossing is satisfied for a single varying field parameter, a result known as the non-crossing rule [6,7]. No level crossing, in turn, allows us to estimate the time scale required for the quasistatic limit as related to the inverse of the minimum energy level gap [8,9,10,11]. We emphasize that if level-crossing does occur, Allahverdyan and Nieuwenhuizen show that the quasistatic protocol may not be the optimal one. A quasistatic timescale for the adiabatic theorem can still be defined in this case [12,13].
For a concrete example we turn to the paradigmatic case of an ideal gas in a piston. For simplicity, we consider spinless particles in one dimension and furthermore replace the square-well confining potential with a harmonic well [14]. A change in piston size corresponds to changing the frequency of the harmonic potential. We are therefore dealing with an ensemble of time-dependent harmonic oscillators.
The harmonic oscillator has an adiabatic conserved quantity, $\langle \hat H\rangle/\omega$ (where $\hat H$ is the oscillator’s Hamiltonian and $\omega$ its frequency) [15]. This means that in the “quasistatic limit” of a slow change in the frequency $\omega$, this quantity will be conserved. A quasistatic change from $\omega_i$ and energy $E_i$ to $\omega_f$ will therefore lead to a final energy of $E_f = E_i\,\omega_f/\omega_i$. Since there is no heat exchange with any environment, it is only possible to interpret this energy change as work. This is the baseline quasistatic work that a faster change needs to be compared against.
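The classical counterpart of this adiabatic invariance is easy to verify numerically: for a slow ramp of the frequency, the quantity $E/\omega$ is approximately conserved, while a near-sudden ramp violates it. The following sketch is our own illustrative construction (the linear ramp, the integrator, and all parameter choices are assumptions, not taken from the text):

```python
import math

def omega(t, T):
    """Linear ramp of the frequency from 1 to 2 over total time T."""
    return 1.0 + min(t, T) / T

def rk4_step(t, x, p, h, T):
    # one RK4 step of x' = p, p' = -w(t)^2 x  (mass m = 1)
    def f(t, x, p):
        return p, -omega(t, T) ** 2 * x
    k1x, k1p = f(t, x, p)
    k2x, k2p = f(t + h/2, x + h/2*k1x, p + h/2*k1p)
    k3x, k3p = f(t + h/2, x + h/2*k2x, p + h/2*k2p)
    k4x, k4p = f(t + h, x + h*k3x, p + h*k3p)
    return (x + h/6*(k1x + 2*k2x + 2*k3x + k4x),
            p + h/6*(k1p + 2*k2p + 2*k3p + k4p))

def invariant_ratio(T, h=0.01):
    """Evolve one oscillator through the ramp; return (E/w)_final / (E/w)_initial."""
    x, p = 1.0, 0.0
    J0 = 0.5 * (p**2 + omega(0, T)**2 * x**2) / omega(0, T)
    t = 0.0
    while t < T:
        x, p = rk4_step(t, x, p, h, T)
        t += h
    Jf = 0.5 * (p**2 + omega(T, T)**2 * x**2) / omega(T, T)
    return Jf / J0

slow = invariant_ratio(200.0)  # quasistatic ramp: E/w nearly conserved
fast = invariant_ratio(0.1)   # near-sudden ramp: E/w roughly doubles here
```

The slow ramp reproduces the quasistatic baseline $E_f = E_i\,\omega_f/\omega_i$; the fast one overshoots it, which is exactly the excess work discussed below.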
Treating a faster, finite-time frequency-change protocol $\omega(t)$ requires more mathematical tools. We can begin by considering the system initially at zero temperature. Then, the energy at any later time will be given by [16]
$$E(t) = \frac{\hbar\omega(t)}{2}\,Q^*(t),$$
where $Q^*$ is a parameter introduced by Husimi [17] and is related to a quasistatic protocol: it is unity for a quasistatic change and increases the faster the change in frequency is [18]. This makes the effects of non-adiabaticity readily apparent, but we would like to deal with the more general initial thermal Gibbs state.
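Husimi's $Q^*$ has a simple phase-space analog: the ratio of the phase-averaged final energy of a fixed-action ensemble to the quasistatic final energy. A rough numerical sketch (ensemble size, ramp shape, and integrator are our own illustrative choices) shows this ratio approaching unity for slow ramps and exceeding it for fast ones:

```python
import math

def evolve(x, p, T, h=0.01):
    """RK4 integration of x'' = -w(t)^2 x, with w ramping 1 -> 2 over time T."""
    w = lambda t: 1.0 + min(t, T) / T
    f = lambda t, x, p: (p, -w(t)**2 * x)
    t = 0.0
    while t < T:
        k1x, k1p = f(t, x, p)
        k2x, k2p = f(t + h/2, x + h/2*k1x, p + h/2*k1p)
        k3x, k3p = f(t + h/2, x + h/2*k2x, p + h/2*k2p)
        k4x, k4p = f(t + h, x + h*k3x, p + h*k3p)
        x += h/6*(k1x + 2*k2x + 2*k3x + k4x)
        p += h/6*(k1p + 2*k2p + 2*k3p + k4p)
        t += h
    return x, p

def q_star(T, n_phases=32):
    """Mean final energy over a fixed-energy ensemble of initial phases,
    divided by the quasistatic final energy E_qs = (E0 / w0) * wf."""
    w0, wf, E0 = 1.0, 2.0, 0.5
    mean_E = 0.0
    for k in range(n_phases):
        phi = 2 * math.pi * k / n_phases
        # initial condition on the E0 energy shell: E = (p^2 + w0^2 x^2) / 2
        x0 = math.sqrt(2 * E0) / w0 * math.cos(phi)
        p0 = math.sqrt(2 * E0) * math.sin(phi)
        x, p = evolve(x0, p0, T)
        mean_E += 0.5 * (p**2 + wf**2 * x**2)
    mean_E /= n_phases
    return mean_E / (E0 / w0 * wf)

q_slow = q_star(60.0)  # approaches 1 (quasistatic)
q_fast = q_star(0.1)   # exceeds 1 (near-sudden)
```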
To properly consider an initial thermal state, we turn to the concept of a dynamical Lie algebra [19]. A dynamical Lie algebra is a Lie algebra generated by the Hamiltonian of the system. The idea is to look for a Lie algebra with elements $\{\hat B_k\}$ so that any operator within the algebra, $\hat O = \sum_k c_k \hat B_k$, will remain within the algebra under the dynamics generated by commutation with the Hamiltonian, $\dot{\hat O} \propto [\hat H, \hat O]$. This is most easily assured by taking as $\{\hat B_k\}$ all the operators within the Hamiltonian and then adding elements by commutation with the Hamiltonian until no new operators are generated. The harmonic oscillator Hamiltonian
$$\hat H = \frac{\hat P^2}{2m} + \frac{m\omega(t)^2}{2}\,\hat X^2$$
can be considered to be comprised of two time-independent operators with time-dependent parameters, $\hat B_1 = \hat P^2$ and $\hat B_2 = \hat X^2$. The minimal completion of this algebra is then obtained by adding the correlation operator $\hat C = \hat X\hat P + \hat P\hat X$; it can be verified that $\{\hat P^2, \hat X^2, \hat C\}$ forms a closed SU(1,1) Lie algebra under commutation. Alternative dynamical Lie algebras can be constructed by linear superpositions of these three operators. Using the creation and annihilation operators, one such algebra is $\{\hat a^2, \hat a^{\dagger 2}, \hat a^\dagger\hat a + \tfrac12\}$.
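The closure of this algebra can be checked numerically in a truncated Fock basis (with $\hbar = m = \omega = 1$; the truncation size and the interior comparison block are our own choices, made to stay clear of truncation artifacts at the top of the basis):

```python
import numpy as np

N = 30                                      # Fock-space truncation
a = np.diag(np.sqrt(np.arange(1, N)), 1)    # annihilation operator
X = (a + a.conj().T) / np.sqrt(2)           # position  (hbar = m = omega = 1)
P = 1j * (a.conj().T - a) / np.sqrt(2)      # momentum

X2, P2 = X @ X, P @ P
C = X @ P + P @ X                           # the completing element

def comm(A, B):
    return A @ B - B @ A

# closure relations of {X^2, P^2, XP+PX} (using [x, p] = i):
#   [X^2, P^2] = 2i C,   [X^2, C] = 4i X^2,   [P^2, C] = -4i P^2
k = 10  # compare only the interior block, away from the truncation edge
err1 = np.max(np.abs((comm(X2, P2) - 2j * C)[:k, :k]))
err2 = np.max(np.abs((comm(X2, C) - 4j * X2)[:k, :k]))
err3 = np.max(np.abs((comm(P2, C) + 4j * P2)[:k, :k]))
```

All three commutators land back inside the span of the three elements, which is the closure property the text describes.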
One advantage of using a dynamical Lie algebra is that it allows one to easily write down a state whose form is conserved by the dynamics. Consider a state of the form
$$\hat\rho = \frac{1}{Z}\, e^{\gamma \hat a^{\dagger 2}}\, e^{-\tilde\beta \hat H}\, e^{\gamma^* \hat a^{2}}.$$
Although this operator does not lie within the dynamical Lie algebra, its form is conserved in the dynamics. This can be verified by expanding the exponentials and examining the commutations order by order [14]. Note that the thermal Gibbs state is obtained for $\gamma = 0$; this corresponds to a state with no coherences and no position-momentum correlations.
Another advantage of the dynamical Lie algebra is that it allows one to easily identify dynamical invariants of the motion. “Dynamical invariants” are quantities that remain constant, but are explicit functions of time-dependent parameters and of (implicitly time-dependent) expectation values; for the harmonic oscillator, the first such invariant was noted by Lewis [20]. Korsch and Kaushal used dynamical algebras to derive the dynamical invariants which lie within the algebra [21,22] (i.e., invariants which are linear in the expectation values of the algebra’s operators), and Sarris and Proto demonstrated that for our state (Equation 2) it is possible to generalize further and derive dynamical invariants that are outside the algebra [23], consisting of higher powers of the algebra’s expectation values (they actually consider maximum entropy states, $\hat\rho = Z^{-1}\exp(-\sum_k\lambda_k\hat B_k)$, but a product form and an exponential sum are interchangeable (see [24], corollary to theorem 2)). One such invariant in our case was found to be [25]
$$X = \frac{\langle\hat H\rangle^2 - \langle\hat L\rangle^2 - \langle\hat C\rangle^2}{(\hbar\omega)^2},$$
where $\hat L = \hat P^2/2m - \tfrac12 m\omega^2\hat X^2$ and $\hat C = \tfrac{\omega}{2}(\hat X\hat P + \hat P\hat X)$ are elements of a superposed dynamical algebra. The normalization of the state (Equation 2) requires $\langle\hat H\rangle^2 \ge \langle\hat L\rangle^2 + \langle\hat C\rangle^2$, so that $X \le (\langle\hat H\rangle/\hbar\omega)^2$, and the maximum occurs only in the thermal state ($\langle\hat L\rangle = \langle\hat C\rangle = 0$).
We now have the tools to return to analyzing the time-dependent harmonic oscillator under a finite change of frequency. We assume an initial thermal state, so that $\gamma = 0$ and the state is diagonal in the energy eigenbasis. A quasistatic change from $\omega_i$ to $\omega_f$ will maintain the diagonal form, leading to another thermal state with $\gamma = 0$ and some energy $E_f^{qs} = \hbar\omega_f\sqrt{X}$, set by the constant $X$. Now consider what would have happened if we had changed the field quickly. In general we would not expect the final state to be thermal ($\gamma \neq 0$, equivalently nonzero $\langle\hat L\rangle$ and $\langle\hat C\rangle$), and the invariance of $X$ (Equation 3) now implies a higher energy. Reaching a final thermal state may be possible, but would require a specially-chosen protocol, and even then it will only lead to the same final energy, not a lower one. We conclude that for any initial thermal state a non-quasistatic protocol will generally lead to higher energies than the quasistatic baseline, $E_f \ge E_f^{qs}$. As noted above, appending a thermalization stage will convert this extra energy into excess heat.
We have demonstrated that driving the system at a finite rate requires excess energy to make the same change. But has it generated more entropy and increased the system’s temperature? To answer this we must first consider what entropy and temperature mean in the quantum domain. The entropy of quantum states is usually taken to be the von Neumann entropy, $S_{vN} = -\mathrm{Tr}(\hat\rho\ln\hat\rho)$. This quantity can be shown to be invariant under unitary transformations (e.g., [26], Section 2.3), and in particular under the Hamiltonian dynamics at hand. For our system, the von Neumann entropy can be shown to depend only on the dynamical invariant $X$ [25]. This entropy does not increase during the creation of the “internal friction”; in this respect it appears that there is no friction!
The paradox can be resolved by realizing that the von Neumann entropy is an idealized entropy, representing the information missing after all possible measurements have been made. This measure is not necessarily the relevant one. Since we are interested in the dissipation and dynamics of energy, it makes sense to consider an entropy that is related to energy. And indeed, if we look at the Shannon energy entropy, $S_E = -\sum_n p_n\ln p_n$ (where $p_n$ is the probability of finding the system in the $n$-th energy state), we find that a non-quasistatic change always increases it. The two entropies are equal only when the state is diagonal in the energy eigenbasis, i.e., when $\gamma = 0$ and the state is thermal. A finite rate of frequency change will generally lead to the development of coherences ($\gamma \neq 0$) and therefore to a deviation from the diagonal form in the energy eigenbasis. Since the von Neumann entropy is always smaller than any other Shannon entropy [27,28], the Shannon energy entropy will be higher in these cases: internal friction indeed generates entropy.
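The inequality between the two entropies is easy to verify numerically: for any density matrix, the Shannon entropy of its diagonal in a fixed basis (standing in here for the energy eigenbasis) is never below the von Neumann entropy. A small sketch with randomly generated states of our own choosing:

```python
import numpy as np

def von_neumann(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log(w)))

def shannon_energy(rho):
    # entropy of the populations in the (assumed) energy eigenbasis
    p = np.real(np.diag(rho))
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(0)
gaps = []
for _ in range(100):
    # random full-rank density matrix, generically with coherences
    A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    rho = A @ A.conj().T
    rho /= np.trace(rho).real
    gaps.append(shannon_energy(rho) - von_neumann(rho))

min_gap = min(gaps)  # should never be (significantly) negative
```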
What about increasing the system’s temperature? Note that when $\gamma \neq 0$, $\tilde\beta$ no longer serves as the (inverse) temperature. To understand what the temperature is, we have to go back to its thermodynamic definition as the ratio between energy and entropy change. The Shannon energy entropy of our system is the same as that of a thermal oscillator with the same energy [14],
$$S_E = \left(\frac{E}{\hbar\omega}+\frac12\right)\ln\left(\frac{E}{\hbar\omega}+\frac12\right) - \left(\frac{E}{\hbar\omega}-\frac12\right)\ln\left(\frac{E}{\hbar\omega}-\frac12\right).$$
This is a concave, monotonically increasing function of the energy. Since non-quasistatic dynamics leads to higher energies at the same frequencies, such dynamics will result in lower slopes. The inverse slope will therefore be higher, and the inverse slope is the temperature (see Figure 1). We arrive at higher temperatures under non-quasistatic dynamics.
Figure 1.
The Shannon energy entropy for a fixed frequency $\omega$, as a function of the energy $E$ in units of $\hbar\omega$. In a quasistatic process from some initial thermal state we will reach some final energy, and therefore entropy, at the final frequency, represented by point A. A faster process reaching the same final frequency will generally result in a higher energy, represented by point B. The slope at point B is lower than at point A, and therefore the inverse slope will be higher. Using the thermodynamic identification $T = (\partial S/\partial E)^{-1}$, this figure shows that non-quasistatic dynamics will yield higher temperatures.
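The entropy function above, and the identification of the temperature with the inverse slope, can be checked numerically (with $\hbar\omega = 1$; the particular $\beta$ value is an arbitrary choice):

```python
import math

hw = 1.0  # hbar * omega

def E_of_beta(b):
    # thermal oscillator energy at inverse temperature b
    return hw * (0.5 + 1.0 / (math.exp(b * hw) - 1.0))

def S_of_E(E):
    # the Shannon energy entropy of a thermal oscillator with energy E
    e = E / hw
    return (e + 0.5) * math.log(e + 0.5) - (e - 0.5) * math.log(e - 0.5)

beta = 0.7
E = E_of_beta(beta)
dE = 1e-5
# the slope dS/dE should equal the inverse temperature beta
slope = (S_of_E(E + dE) - S_of_E(E - dE)) / (2 * dE)
# concavity: at higher energy the slope is lower, i.e., the temperature higher
slope_hi = (S_of_E(2 * E + dE) - S_of_E(2 * E - dE)) / (2 * dE)
```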
We have obtained all the hallmarks of friction: excess work needed to enact the same change, entropy production, and temperature increase. The latter two, however, are only true from the appropriate (quantum) perspective. We note that these effects are purely due to the fact that the system could not adiabatically follow the change in frequency, which is in turn due to the fact that the system’s Hamiltonian does not commute with itself at different times, $[\hat H(t), \hat H(t')] \neq 0$, which is a purely quantum feature. Quantum internal friction, therefore, stems from the non-Abelian nature of the quantum algebra.
In our treatment we separated the thermalization phase from the driving phase. In realistic cases, however, driven systems will be at least weakly coupled to thermal environments. The two processes will then occur simultaneously, implying that any external driving will be accompanied by some dissipated quantum friction (although this effect may be negligible).
Furthermore, residual interactions with the environment can also lead to dephasing noise, as can imperfect control over the external field. Since pure dephasing is identical to a weak measurement of the momentary energy, one would expect it to draw the state towards the momentary energy eigenbasis, thereby approximating the quasistatic process and thus acting as a “quantum lubricant” that reduces friction. This indeed happens in some cases [29]. However, in at least some cases pure dephasing of this sort can decrease efficiency [30]. The effect of dephasing noise in general is still not sufficiently understood, but it does not appear to eliminate quantum internal friction entirely even when it does function as a lubricant.
It should be noted that, formally, there are frequency-change protocols that avoid generating friction (in these solutions the state returns to a diagonal, friction-free form at some finite time $t_f$). Using such protocols, it is seemingly possible to drive the system at a finite rate and still avoid friction. However, although it appears that such processes can occur in arbitrarily short times [30,31,32], this requires an arbitrarily large available energy. This can be understood in light of the energy-time uncertainty relation: an infinitely fast process would require an infinite variance in energy. An instantaneous frictionless solution is therefore not viable, and any finite-period solution will result in dissipative losses to the environment at the points where the state carries coherences. In at least some cases, frictionless solutions also seem unstable under dephasing noise [30]. In the realistic case of weak coupling, then, some frictional loss is unavoidable (although it may be negligible in practice).
Our results are not limited to the harmonic oscillator: a separate analysis reveals that they remain valid for spin systems [30,33], and hold under continuous coupling to the bath for a three-level system [34]. Since the underlying features that give rise to the phenomena are the non-commutative nature of the Hamiltonian at different times and the irreversible nature of thermalization, there is good reason to believe that this kind of quantum friction is endemic to all realistic systems.
A constraint on the universal applicability of our results is the assumption of a thermal environment. We have so far treated the thermalization and dephasing processes only roughly, so we will now remedy this omission by devoting the next section to examining more carefully how dissipation to the environment occurs.
2. Dissipation to the Environment
The description of interactions with the environment has been left intentionally vague in the above section, but since external friction involves dissipation into the environment, we need to consider this interaction in some detail. One way to think of the interaction of an (open) system with the environment is to formally consider both the system and the environment as a single closed “extended” quantum system, and then develop the dynamics of only the part involving the degrees of freedom of the open system, $\hat\rho_s = \mathrm{Tr}_B\,\hat\rho$, where $\mathrm{Tr}_B$ signifies tracing over the environment’s degrees of freedom, and $\hat\rho$ and $\hat\rho_s$ are the extended and open system’s density operators (i.e., states), respectively.
Consider some extended system, with a Hamiltonian $\hat H$ in the Hilbert space $\mathcal H$. The evolution of this system is given, with no approximation, by the Liouville-von Neumann equation
$$\frac{\partial\hat\rho}{\partial t} = -\frac{i}{\hbar}[\hat H,\hat\rho] \equiv \mathcal L\hat\rho,$$
where the Liouville operator $\mathcal L$ operates on the state to produce its dynamics.
Since we are interested only in the degrees of freedom of the open system, we can attempt to define a projection operator $P$ that projects any operator from the extended system’s Hilbert space to the “relevant” subspace of interest, namely the open system’s subspace. Since the dynamics operates on the extended Hilbert space $\mathcal H$, however, it is formally necessary to consider the operator $P$ as projecting from and to the extended Hilbert space. It is typically taken as $P\hat\rho = \hat\rho_B \otimes \mathrm{Tr}_B\,\hat\rho$, where $\hat\rho_B$ is a time-independent thermal state of the environment.
Following Zwanzig [35], we now further define a complementary projection $Q = 1 - P$ into the “irrelevant” part. By simply writing the Liouville-von Neumann equation in these terms, it is possible to formally solve it to obtain
$$\frac{\partial}{\partial t}P\hat\rho(t) = P\mathcal L\, e^{Q\mathcal L t}\, Q\hat\rho(0) + P\mathcal L P\hat\rho(t) + \int_0^t ds\; K(t-s)\, P\hat\rho(s),$$
with the memory kernel $K(\tau) = P\mathcal L\, e^{Q\mathcal L\tau}\, Q\mathcal L P$. This equation is known as the Nakajima-Zwanzig (NZ) equation [26]. Limiting the initial conditions to ones that are not altered by the projection, $P\hat\rho(0) = \hat\rho(0)$, removes the first term, leaving us with an exact closed equation for the dynamics of the relevant part. For the typical projection operator given above, this condition is fulfilled if the initial state is already separable, $\hat\rho(0) = \hat\rho_s(0)\otimes\hat\rho_B$. Formally, it is possible to present the NZ equation in time-convolutionless form,
$$\frac{\partial}{\partial t}P\hat\rho(t) = \mathcal K(t)\, P\hat\rho(t),$$
where $\mathcal K(t)$ is a complicated operator that is local in time (for details and a more rigorous derivation of the NZ equation, see [26]; for an alternative derivation, see [36]).
Solving the dynamics of the relevant part therefore involves a memory kernel or local operator $\mathcal K(t)$ which is very complex and hard to apply. However, the local operator can become much simpler under reasonable physical assumptions. It is common to consider an environment that acts as a heat bath, dissipating the energy the system gives to it on time scales $\tau_B$ that are much faster than the system’s time scale $\tau_S$. We therefore have a separation of time scales, $\tau_B \ll \tau_S$. Over time scales above $\tau_B$, we would then expect the environment to “forget” its past interactions with the system, leading to a loss of the memory in the NZ equation for “coarse grained” evolution (averaging over times below the graining). The resulting local-time propagator is now a constant, $\mathcal K(t) \to \mathcal L_{eff}$. It still generally depends on the initial state of the overall system, but if we assume that a single operator exists for a group of initial conditions (such as separable initial states), we can eliminate this dependence and obtain a Markovian dynamics,
$$\frac{\partial}{\partial t}\hat\rho_s = \mathcal L_{eff}\,\hat\rho_s.$$
Since the propagator is independent of time and of the initial state, we can formally solve the dynamics and present the time evolution propagator as a dynamical semigroup, $\Lambda_t = e^{\mathcal L_{eff}t}$, with $\Lambda_t\Lambda_s = \Lambda_{t+s}$.
Lindblad determined the most general structure of a “completely positive” quantum dynamical semigroup as [37]
$$\frac{\partial\hat\rho}{\partial t} = -\frac{i}{\hbar}[\hat H,\hat\rho] + \sum_k\left(\hat V_k\hat\rho\hat V_k^\dagger - \frac12\{\hat V_k^\dagger\hat V_k,\hat\rho\}\right),$$
where the $\hat V_k$ are the “generators” of the evolution, and we have dropped the subscript $s$. “Complete positivity” here is intended to capture the fact that the evolution must be compatible with a greater unitary dynamics, and specifically conforms to maintaining the free evolution of a dummy system (of arbitrary finite dimension) that does not interact with the open system [37,38]. Amongst other properties, a completely positive map maintains the positivity of the density. Equation 10 defines the acclaimed Lindblad form, which appears to be applicable under reasonable physical assumptions yet provides a very concrete and (as we will see below) useful form.
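As a minimal illustration of the Lindblad form, consider a two-level system with a single generator describing spontaneous decay (the Hamiltonian, rate, and integrator below are our own illustrative choices). The evolution preserves the trace and positivity while relaxing the excited population at the analytically expected rate:

```python
import numpy as np

w, g = 1.0, 0.5                                   # level splitting, decay rate
H = 0.5 * w * np.diag([1.0, -1.0])                # basis: |e>, |g>
V = np.sqrt(g) * np.array([[0, 0], [1, 0]], dtype=complex)  # sigma_minus

def rhs(r):
    # Lindblad right-hand side: commutator plus a single dissipator
    h = -1j * (H @ r - r @ H)
    d = V @ r @ V.conj().T - 0.5 * (V.conj().T @ V @ r + r @ V.conj().T @ V)
    return h + d

rho = np.array([[1, 0], [0, 0]], dtype=complex)   # start fully excited
dt = 0.001
for _ in range(5000):                             # RK4 integration to t = 5
    k1 = rhs(rho)
    k2 = rhs(rho + 0.5 * dt * k1)
    k3 = rhs(rho + 0.5 * dt * k2)
    k4 = rhs(rho + dt * k3)
    rho = rho + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

tr = np.trace(rho).real
pe = rho[0, 0].real                               # excited-state population
min_eig = np.linalg.eigvalsh(rho).min()
expected = np.exp(-g * 5.0)                       # analytic decay e^{-g t}
```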
Some cautionary words are in order. First, the Markovian approximation depended on a coarse-graining of the evolution in time, which makes it inapplicable to short time scales, where there is generally “slippage”. This was demonstrated by Gnutzmann and Haake [39]. Define the difference between the constant (Markovian) propagator and the local-in-time convolutionless one, $\Delta(t) = \mathcal K(t) - \mathcal L_{eff}$. We can now write the evolution equation as
$$\frac{\partial}{\partial t}\hat\rho_s = \left(\mathcal L_{eff} + \Delta(t)\right)\hat\rho_s.$$
Separating out the time-independent evolution, $\hat\rho_s(t) = e^{\mathcal L_{eff}t}\,\hat\sigma(t)$, one can formally solve for the evolution of the effective density,
$$\hat\sigma(t) = T\exp\left(\int_0^t ds\; e^{-\mathcal L_{eff}s}\Delta(s)\,e^{\mathcal L_{eff}s}\right)\hat\rho(0) \approx \hat S\,\hat\rho(0),$$
where $T$ denotes the time-ordered integral. The last approximation is valid since the integrand approaches zero for times longer than the bath relaxation time $\tau_B$. Equation 12 indicates that the evolution at long times is Markovian ($\hat\rho_s(t) \approx e^{\mathcal L_{eff}t}\hat S\hat\rho(0)$), but if this long-time approximation is applied to the initial time $t = 0$ it yields a wrong “initial state”, $\hat S\hat\rho(0)$. Specifically, this effective density may be non-positive at short times. The requirements of complete positivity are therefore excessive, as they guarantee positivity even for arbitrarily short times. Real Markovian dynamics that develop at long times need not be completely positive (see [39] for a more thorough discussion).
Similarly, an upper bound on the appropriate time scale is set by quantum recurrence. Further caution is urged by Pechukas [38] (but see also [40]), who demonstrates that beyond the weak coupling limit complete positivity is not guaranteed, and that even within this limit it requires a separable initial state.
Despite these misgivings, it should be emphasized that completely positive master equations have been derived from microscopic models ([41,42,43,44] and more), under suitable assumptions. Equations of Lindblad form can be relevant and correct approximations, and are often successfully applied to model thermalization and other processes. Indeed, sometimes several Lindblad forms are applicable for different conditions or from different perspectives.
In conclusion, the theory of quantum open systems allows us to explicitly see that the irreversibility induced by the environment is due to the fast time scale separation between the bath and open system, allowing the environment to quickly spread the effects of the interaction with the system over many degrees of freedom [
45]. This creates irreversible dynamics at not-too-long times, which (under suitable assumptions) will be of the Markovian kind. In other words, energy dissipation can be identified with situations where the environment acts as an energy sink, for particular modes of energy, within the relevant time-scales.
For driven systems, the combination of energy excitation by the driving and their irreversible dissipation by the large environment creates friction.
3. Brownian Motion
Brownian motion includes the damping of momentum, or velocity, dissipating the kinetic energy into the medium. But why does the dissipative environment lower the velocity, of all things? Consider a classical particle moving against some uniform background environment. Assuming that the environment around it is in thermal equilibrium, we could imagine the particle undergoing elastic collisions, so the dynamics would be a combination of free evolution and elastic collisions. In the laboratory frame the momentum distribution of the environment is assumed to be isotropic, $f(\mathbf p) = f(|\mathbf p|)$. A particle moving at velocity $\mathbf v$, however, will see an asymmetrical distribution in its co-moving frame, $f'(\mathbf p) = f(|\mathbf p + m\mathbf v|)$ [46] (with obvious notation). Since the environment is thermal, this shifted distribution means more (and more energetic) collisions oppose the motion than assist it, and this asymmetry will translate into the damping of the momentum. The existence of friction can therefore be attributed to symmetry breaking: a contrast between the isotropy of the interactions in the laboratory frame and the symmetry-breaking direction of the particle’s velocity. More abstractly, the environment acts to restore the symmetry of zero momentum, but is dissipative (irreversible) in the large-system approximation, wherein it retains its isotropy (an approximation that is broken by the quantum recurrence time).
Note that we would also expect thermal stochastic fluctuations, which will lead to diffusion in momentum. Overall, for movement in a homogeneous background one can expect a Fokker-Planck equation, with a friction (momentum damping) term and a (momentum) diffusion term [47].
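This momentum dynamics can be sketched with a classical Langevin simulation (the Euler-Maruyama scheme and all parameters are our own illustrative choices): friction damps the mean momentum, while diffusion sustains a stationary variance of $D/\gamma$:

```python
import numpy as np

rng = np.random.default_rng(2)
gamma, D = 1.0, 0.5          # friction and momentum-diffusion coefficients
dt, steps, n = 0.01, 500, 20000

p = np.full(n, 5.0)          # ensemble of particles, all starting at p = 5
for _ in range(steps):       # Euler-Maruyama: dp = -gamma*p*dt + sqrt(2D)*dW
    p += -gamma * p * dt + np.sqrt(2 * D * dt) * rng.normal(size=n)

mean_p = p.mean()            # friction drives the mean momentum toward zero
var_p = p.var()              # diffusion balances it at var = D / gamma
```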
This is a description of the classical mechanism of friction, and so we will not expand on it in this paper. However, it is worth bringing up because, to the extent that complete positivity is assumed, it has no exact quantum analog. Quantum mechanics requires interactions with the environment to include interactions with position as well as momentum, and connects them to diffusion in position as well as in momentum. We will devote the rest of this section to explaining why and when this is valid.
To expand on this idea, let us consider quantum Brownian motion specifically. In a seminal paper, Caldeira and Leggett considered a particle interacting with the environment solely through a pair-wise spatial interaction, $\hat x\sum_k c_k\hat q_k$ (where the summation is over the environment’s particles, and $\hat x$ is the particle’s position). Under suitable assumptions (such as a large environment), they find in the semiclassical regime a Fokker-Planck-like equation [48]
$$\frac{\partial\hat\rho}{\partial t} = -\frac{i}{\hbar}[\hat H,\hat\rho] - \frac{i\gamma}{\hbar}[\hat x,\{\hat p,\hat\rho\}] - \frac{2m\gamma k_B T}{\hbar^2}[\hat x,[\hat x,\hat\rho]].$$
However, this dynamics is not of Lindblad form [26], and is therefore not completely positive. Equation 13 can be “extended” to conform to Lindblad’s form by the addition of a spatial diffusion term, $-D_x[\hat p,[\hat p,\hat\rho]]$, which will complete the form if $D_x \ge \gamma/(8mk_BT)$ [26,49]. This reveals a quantum effect: friction, momentum diffusion and spatial diffusion are all connected; there cannot be one without the others. However, as an abstract requirement, its origin remains obscure.
The result can be understood in light of the so-called “unraveling” of the master equation. The Lindblad form (Equation 10) is mathematically equivalent to taking a stochastically-determined series of measurements on a pure state (defining a “quantum trajectory”), and then combining different trajectories statistically to recover the ensemble picture [26]. These measurements are precisely the Lindblad generators. The “extension” of the Fokker-Planck-like equation is equivalent to working with a generator that is a linear superposition of position and momentum [26]. Working from a linear combination $\hat V = \mu\hat x + i\nu\hat p$, Gao likewise obtains a Lindblad-form solution [50]. Similarly, Vacchini provided an alternative microscopic derivation that preserves the Lindblad form and results in generators that are linear combinations of $\hat x$ and $\hat p$ (with coefficients set by the temperature and a thermal wavelength).
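A generator proportional to such a linear superposition, $\hat V \propto \hat x + i\hat p \propto \hat a$, indeed produces damping of the mean momentum while keeping the evolution completely positive. A small truncated-oscillator sketch (truncation, rate, and initial coherent state are our own choices; the Hamiltonian is omitted so that the decay of $\langle\hat a\rangle$ is monotonic):

```python
import math
import numpy as np

N = 20
a = np.diag(np.sqrt(np.arange(1, N)), 1)   # truncated annihilation operator
ad = a.conj().T
g = 1.0                                    # damping rate

def rhs(r):
    # dissipator alone, with the single generator sqrt(g) * a  (a ~ x + ip)
    return g * (a @ r @ ad - 0.5 * (ad @ a @ r + r @ ad @ a))

# coherent state |alpha = 1>, which has a nonzero <a> (nonzero <x>, <p>)
alpha = 1.0
c = np.array([alpha**k / math.sqrt(math.factorial(k)) for k in range(N)],
             dtype=complex)
c /= np.linalg.norm(c)
rho = np.outer(c, c.conj())

dt = 0.005
for _ in range(400):                       # RK4 integration to t = 2
    k1 = rhs(rho); k2 = rhs(rho + 0.5*dt*k1)
    k3 = rhs(rho + 0.5*dt*k2); k4 = rhs(rho + dt*k3)
    rho = rho + dt/6 * (k1 + 2*k2 + 2*k3 + k4)

tr = np.trace(rho).real
min_eig = np.linalg.eigvalsh(rho).min()    # stays non-negative (CP evolution)
mean_a = np.trace(rho @ a).real            # decays as alpha * exp(-g*t/2)
```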
From this perspective, it seems necessary for the environment to measure, and hence interact with, both the momentum and the position of the particle. This approach was taken by Barnett and Cresser, who developed the Lindblad master equation associated with simultaneous measurement of both these observables [49]. Note that they maintain the classical assumption of elastic, momentum-exchange collisions. In the continuous measurement limit, they obtain a master equation of the Lindblad form. In a beautiful paper [51], Barnett et al. posit that measurement is therefore the underlying mechanism behind the phenomenon of quantum friction: the open system is measured by the environment, which induces both friction and the two types of diffusion.
Such a conclusion is not tenable. The reason that friction is not related to measurement is simply that not all conditions giving rise to friction correspond to such a measurement by the environment. It is possible that a Markovian but not completely positive master equation can be derived for the behavior of the subsystem. As explained in Section 2, this will lead to seemingly-unphysical results at short times, and indeed the Caldeira-Leggett equation can result in negative probabilities at short times [51,52,53]. But such short times should not be taken seriously, as they lie below the coarse-graining. The conclusion that friction and the two diffusions are connected therefore needs to be taken with a grain of salt: it is valid only when complete positivity is a good approximation, which need not generally be the case. The firm affirmation of their connection stated in the literature is not justified.
It is important to note the significance of this. Requiring completely positive (sub)dynamics led us to the conclusion that “in quantum theory we cannot have separate friction and diffusion terms” (Vacchini, [54]). Completely positive subdynamics is equivalent to measuring the system [55,56]. If Barnett et al. are wrong, this means that it is impossible to consider the open system as a separate system that is being continuously (and weakly) measured by the environment. Instead, quantum mechanics requires a more holistic treatment in which the dynamics of the observables in the open system’s subspace can be followed (a “subdynamics” exists), but the system cannot be truly considered independently of the extended system. To paraphrase Vacchini, “in quantum theory we cannot have separate system and environment”.
4. Quantum Fluctuations
One of the features that set quantum mechanics apart from classical mechanics are the ever-present quantum fluctuations, in addition to and beyond any thermal fluctuations. One way to see this is from the perspective of a Feynman–Vernon path integral [
57,
58]. In this formalism a quantum particle is considered to take all paths between initial and final points, and the probability amplitudes are summed up with the weight
, where
S is the path’s action. The dominant contribution to this sum is from the classical action,
, which is the minimum action. But there are sizeable contributions from all paths whose action is within
ħ of the minimum, so that one is likely to observe results in this range. This results in quantum fluctuations whose scale is determined by
ħ. Fluctuations are related to dissipation through the fluctuation dissipation theorem (most notably, [
47]), so that quantum fluctuations would lead to quantum dissipation—quantum friction.
Quantum fluctuations are usually drowned out by thermal fluctuations, but can still lead to measurable effects. Perhaps the most famous is the Casimir effect, i.e., an attraction between two conducting plates that can be attributed to the influence of the plates on the fluctuations of the electromagnetic field between and around them [58]. If the plates are allowed to move, the zero-point fluctuations will also create a “dynamical Casimir effect” of a dissipative force [59,60,61,62]. The dissipative force can be obtained by a consideration of the vacuum modes [63], but up to a numerical factor the result can be obtained by dimensional analysis. For a large and smooth enough body, we would expect the force to be proportional to the area $A$ of the body’s surface [64]. As noted before, the fluctuations (and therefore the dissipative force) should also be proportional to $\hbar$. The dissipative force is also related to electromagnetic effects and therefore to the speed of light $c$, and to some time derivative of the spatial position $x$. Balancing the dimensions of the dissipative force leads to only one possible scaling [65], $F \propto (\hbar A / c^4)\, d^5 x/dt^5$.
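This dimensional argument can be checked mechanically. The sketch below (illustrative code, not from the original text) encodes each quantity by its (mass, length, time) exponents and searches for the power of $c$ and the derivative order $n$ that give $\hbar A\, c^{g}\, d^n x/dt^n$ the dimensions of a force:

```python
# Dimensions encoded as (mass, length, time) exponents.
HBAR  = (1, 2, -1)   # kg·m²/s
AREA  = (0, 2, 0)    # m²
C     = (0, 1, -1)   # m/s
FORCE = (1, 1, -2)   # kg·m/s²

def solve_scaling():
    """Find integer (g, n) such that hbar * A * c**g * d^n x/dt^n
    has the dimensions of a force."""
    hits = []
    for g in range(-10, 1):          # expect a negative power of c
        for n in range(0, 10):       # order of the time derivative of x
            deriv = (0, 1, -n)       # d^n x/dt^n has units m/s^n
            dims = tuple(HBAR[i] + AREA[i] + g * C[i] + deriv[i]
                         for i in range(3))
            if dims == FORCE:
                hits.append((g, n))
    return hits

print(solve_scaling())  # [(-4, 5)]
```

The unique solution, $g = -4$ and $n = 5$, reproduces the scaling $F \propto (\hbar A / c^4)\, d^5 x/dt^5$ quoted above.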
What is the origin of this friction? In light of the above section, it can be appreciated that the situation is very much like Brownian motion, with the plate moving through the viscous quantum vacuum foam [58,66,67]. The reason that the other plate is needed is to set a reference point, a “stationary” laboratory frame. Physically, the plate is affected by the image charges induced on the other plate, which (because of causality) do not react instantaneously to the plate’s change of position and therefore act to slow it down [68]. Alternatively, one can see the entire system as a closed cavity that is driven externally, in which case one would expect to see internal friction for high rates of change. And indeed, the dynamical Casimir effect only occurs for high (relativistic) rates of change [69]. On this view, the internal friction is then dissipated through the luminosity and detection of the photons emerging from the cavity.
Another quantum aspect of this friction force is its relation to the non-commutativity of position and momentum. Barton [64] considers very large uncertainties, on a scale set by a typical measurement time $T$. He obtains that for a large body of linear dimension $a$, the regime $a \ll cT$ is inconsistent: the Heisenberg uncertainty relations make it impossible to measure deviations in position or momentum in this limit. Only large bodies, whose size is too large for light to travel their length during the measurement, would suffer from the above dissipative force. Causality therefore indicates the source of the force as residing in the time lapse between the object’s movement and the change of the electric field. This delay-based friction mechanism is purely classical, but its presence is required by the quantum uncertainty relations if the environment is to measure the system.
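To put rough numbers on this causality condition (an illustrative back-of-the-envelope script, not taken from Barton’s paper), the crossover size is simply $a \approx cT$:

```python
C_LIGHT = 299_792_458.0  # speed of light in vacuum, m/s

def crossover_size(T):
    """Linear dimension a = c*T above which light cannot cross the body
    within a measurement of duration T, so retardation across the body
    (and the delay-based friction) becomes relevant."""
    return C_LIGHT * T

# Even picosecond-scale measurements put the crossover at sub-millimetre sizes:
for T in (1e-12, 1e-9, 1e-6):
    print(f"T = {T:.0e} s  ->  a = c*T = {crossover_size(T):.3e} m")
```

For everyday measurement times the crossover size is macroscopic, which is one way to see why this dissipative force is so hard to observe.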
5. Discussion
We have discussed three examples of quantum friction. In the harmonic oscillator, we showed explicitly that driving the system created internal friction that was distinctly quantum in origin. The internal heating can be dissipated through an interaction with an environment, so that realistic systems, which are not ideally isolated, should suffer additional quantum friction when driven. The dissipative effect can be attributed to the dispersion of the interaction across many degrees of freedom, which creates effective irreversibility by a separation of time scales; it should not be confused with the irreversibility of completely positive maps, which is related to quantum measurements. Quantum friction needs to be understood as a combination of both the excitations induced by external driving and the more general irreversible dynamics. Although we treated only the case of a harmonic oscillator, this treatment can be generalized and has been applied to other systems.
We also considered systems that are already driven, assuming some velocity, as in the Brownian motion and dynamical Casimir effect examples. Brownian motion demonstrated that the dissipation of momentum stems from the symmetries of the interactions with the environment, combined with the asymmetry of the system’s velocity. These are largely classical effects, although the underlying mechanics (and therefore some of the details) are quantum. The dynamical Casimir effect also demonstrated that quantum fluctuations can act as a dissipative environment, and that external driving can lead to internal friction in a cavity too.
We conclude that quantum effects can, and generically do, create friction in two ways. They can set up quantum environments that create dissipative external friction, like the fluctuations underlying the dynamical Casimir effect; and they create excitations that are then dissipated by such means, as in the case of the driven harmonic oscillator. Both effects are essentially omnipresent.
Quantum friction may be ubiquitous, but when is it non-negligible? Quantum fluctuations are relevant when the action is of the order of Planck’s constant. They constitute external dissipation when there is an asymmetry, such as a velocity in a particular direction, with the scale of the dissipative force depending on the precise dimensionality and scales of the problem (as in the Brownian motion example). The relevance of this external friction therefore depends on the precision of the measurement compared to the scale of this force.
Internal friction will develop when the field is changed rapidly, and therefore depends on the dimensionless factor $\dot\omega/\omega^2$ (where $\omega$ is now some generic external field with units of rate). Since the direction of change need not matter, friction needs to be a second-order phenomenon in this parameter. The change in energy would, as an upper estimate, be proportional to some fraction of the initial energy $E$, but also to the time $t$. The time of the process, in turn, must be smaller than the inverse of the energy, $t \lesssim \hbar/E$, to maintain the quasistatic limit, setting the time scale to give $\Delta E \sim \hbar\,(\dot\omega/\omega^2)^2$. We should not expect to see relevant internal friction phenomena if our energy resolution is above this limit, but may encounter friction below it.
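The role of the rate parameter can be illustrated with a minimal classical analogue (a sketch under my own assumptions, not the article’s quantum calculation): a parametric oscillator $\ddot{x} = -\omega(t)^2 x$ whose adiabatic invariant $E/\omega$ is conserved for slow sweeps of $\omega$ but grows (the analogue of internal frictional heating) when $\omega$ is swept quickly:

```python
def sweep(tau, omega0=1.0, omega1=2.0, dt=1e-3):
    """Integrate x'' = -omega(t)^2 x while omega ramps linearly from omega0
    to omega1 over a time tau; return the final adiabatic invariant E/omega.
    For tau >> 1/omega it stays near its initial value; fast ramps overshoot."""
    def omega(t):
        return omega0 + (omega1 - omega0) * min(t / tau, 1.0)

    def deriv(t, x, v):
        return v, -omega(t) ** 2 * x

    x, v, t = 1.0, 0.0, 0.0           # start at a turning point: E0 = omega0**2 / 2
    for _ in range(int(tau / dt)):
        # classic fourth-order Runge-Kutta step for (x, v)
        k1x, k1v = deriv(t, x, v)
        k2x, k2v = deriv(t + dt / 2, x + dt / 2 * k1x, v + dt / 2 * k1v)
        k3x, k3v = deriv(t + dt / 2, x + dt / 2 * k2x, v + dt / 2 * k2v)
        k4x, k4v = deriv(t + dt, x + dt * k3x, v + dt * k3v)
        x += dt / 6 * (k1x + 2 * k2x + 2 * k3x + k4x)
        v += dt / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
        t += dt
    energy = 0.5 * v * v + 0.5 * omega1 ** 2 * x * x
    return energy / omega1

# The initial invariant is E0/omega0 = 0.5; a slow sweep preserves it,
# while a near-sudden sweep leaves excess energy behind:
print(sweep(200.0))   # slow ramp: close to 0.5
print(sweep(0.5))     # fast ramp: noticeably above 0.5
```

The excess of $E/\omega$ over its initial value in the fast case plays the role of the frictional heating discussed above; it vanishes as the sweep becomes quasistatic.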
We note that the connection found in the literature between diffusion and friction for Brownian motion cannot be maintained beyond the conditions of complete positivity, indicating that a quantum treatment requires a holistic approach.