Article

Causality Is an Effect, II

by
Lawrence S. Schulman
Physics Department, Clarkson University, Potsdam, NY 13699-5820, USA
Entropy 2021, 23(6), 682; https://doi.org/10.3390/e23060682
Submission received: 9 May 2021 / Revised: 20 May 2021 / Accepted: 24 May 2021 / Published: 28 May 2021
(This article belongs to the Special Issue Quantum Theory and Causation)

Abstract

Causality follows the thermodynamic arrow of time, where the latter is defined by the direction of entropy increase. After a brief review of an earlier version of this article, rooted in classical mechanics, we give a quantum generalization of the results. The quantum proofs are limited to a gas of Gaussian wave packets.

1. Introduction

The history of multiple time boundary conditions goes back—as far as I know—to Schottky [1,2], who, in 1921, considered a single slice of time inadequate for prediction or retrodiction (see Appendix A). There was later work by Watanabe [3] concerned with prediction and retrodiction. Then Schulman [4] used this as a conceptual way to eliminate “initial conditions” prejudice from Gold’s [5] rationale for the arrow of time, and Wheeler [6,7] discussed two-time boundary conditions. Gell-Mann and Hartle also contributed to this subject [8], including a review of some previous work. Finally, Aharonov et al. [9,10] proposed that this could solve the measurement problem of quantum mechanics, although this is disputed [11]. See also Appendix B.
This formalism has also allowed me to come to a conclusion: effect follows cause in the direction of entropy increase. This result is not unanticipated; most likely everything having to do with arrows of time has been anticipated. What is unusual, however, is the ability to prove the thesis mathematically.
The proof given in [12] applies only to classical mechanics. The present work extends it to quantum mechanics. Moreover, since the previous paper was buried in conference proceedings, and the present work uses similar arguments, I will repeat some parts of that proof. Note, though, that the present work is extremely limited: it applies to particles in a gas with Gaussian wave functions. I think it should apply more generally, but that is not what I prove.
A remark is in order on the arrow of time. Often the sequence of cause and effect is taken as primary [13], but in the present context this is shown not to hold. Our argument depends on two conditions: the nature of the perturbation and two-time boundary conditions. Both will be elaborated on. A future condition on quantum problems may make measurement deterministic but is hollow in that it does not lead (to my knowledge) to testable claims. There is the possibility of advanced effects [6,14], but so far this has not proved measurable.

2. Perturbation

A “cause” can be a change (or intervention, in the dynamics or in the initial conditions) or it can be a precursor (“pre” in the usual meaning of the term). In a change, you can compare the outcomes of two similar but not identical “causes.” The case of “precursor” is different. You can say one thing leads to another, A is a precursor of B, but you only consider A, not some variant, say A′. I will adopt the meaning “change.” This allows study of “effects,” whereas I do not know how to examine ordinary time evolution. One should bear in mind, however, that in ordinary language both meanings of “cause” can be used. This is also discussed in Appendix B of [12], where a rationale for this choice is given.

3. Review of Previous Work

The argument for causality to follow the direction of entropy increase depends on the dynamics. The principal assumption on the dynamics is that there is a relaxation time; call it $\tau$ (this is aside from the cosmological assumptions, discussed below). Let the dynamics be $\phi$, so that from time $t$ to $t+1$ a subset $\epsilon$ of the phase space goes from $\epsilon$ to $\phi\epsilon$. Then, by giving two times separated by more than $2\tau$ and low entropy at the beginning and end, one gets increase, then constancy, then decline of the entropy. Thus, the set of points satisfying both boundary conditions ($\epsilon_0$ at 0 and $\epsilon_T$ at $T$) at time 0 is
$$\epsilon = \epsilon_0 \cap \phi^{-T}\epsilon_T. \tag{1}$$
Note that both beginning and end are forced to have low entropy (by definition, assuming $\epsilon_0$ and $\epsilon_T$ are small). This is illustrated for the “cat map” in the first graph of Figure 1. See Appendix C for information on the “cat map” and on our procedure.
Then, we do something different! Two times for perturbation are contemplated: the first during the rise of the entropy, the second during its fall. By detailed calculation (not reproduced here, but see Section 5 below) it is found that causality, the macroscopic change in behavior, follows the direction of entropy increase; that is, in the falling-entropy case, the macroscopic behavior changes earlier in the time parameter (which is neutral, i.e., it has two different arrows and in between has no arrow).
Intuitively, this is simple. Let the perturbation be at $t_0$ with $0 < t_0 < \tau < T - \tau < T$. Between 0 and $t_0$ (before the perturbation) both perturbed and unperturbed systems have the same boundary conditions, hence their macroscopic behavior is similar (but the microscopic behavior is, in general, different). After the perturbation there is no constraint, i.e., no effective boundary condition: both systems go to equilibrium, from which there is adequate time (recall $T > 2\tau$) to reach (whether perturbed or not) a specific region within the unit square. There was however a perturbation, and there are different microscopic and macroscopic paths. Similarly, for $T - \tau < t_0 < T$ the boundary conditions at $\epsilon_T$ and $t_0$ (just after the perturbation in the neutral parameter $t$) are the same for perturbed and unperturbed systems, hence the macroscopic behavior is the same. However, they both go (in $-t$) to equilibrium; hence, the two systems (perturbed and unperturbed) have different macroscopic paths. (N.b., distinguish carefully the words “macroscopic” and “microscopic.”)
This is illustrated in the case of the cat map, both for $0 < t_0 < \tau$ and $T - \tau < t_0 < T$. See Figure 1, second and third images.

4. Quantum Version

The first step is to show that with low entropy conditions at both (distantly separated) times, the entropy increases in between. To calculate entropy in subspaces of Hilbert space, presence or absence in a given subspace must be defined and the number of states counted. This leads to using regions of 6-dimensional $x$–$p$ space—to mimic the classical phase space—and that is possible. What we do is take a region of phase space and select a basis of states whose maximum value is in this region. Coherent states will do the job. For convenience, the value of the spread used to define those states might be the fixed point of [15], but that is not necessary. Finally, the dimension of the Hilbert subspace is the number of states in that region. What this means is that one can go back to the classical method of using (the logarithm of) volume in phase space as a measure of entropy.
Therefore, as was done previously, we take coarse grains that are regions of phase space—coordinates and momenta. The density matrix is dominated by its diagonal elements: even for identical particles, the separation of locales as well as the separation of momenta makes the density matrix nearly diagonal. For more on this theme, see [16].
Even for identical particles this leads to cancellation. In one dimension the wave function for a pair of Gaussians is (not normalized)
$$\psi(x_1, x_2) = \exp\!\left[-\frac{(x_1 - x_\alpha)^2}{4\sigma_\alpha^2} - \frac{(x_2 - x_\beta)^2}{4\sigma_\beta^2} + i k_\alpha x_1 + i k_\beta x_2\right] \pm \exp\!\left[-\frac{(x_2 - x_\alpha)^2}{4\sigma_\alpha^2} - \frac{(x_1 - x_\beta)^2}{4\sigma_\beta^2} + i k_\alpha x_2 + i k_\beta x_1\right]. \tag{2}$$
(The variables are $x_1$ and $x_2$; all the others are constants.) The diagonal elements of the density matrix (calculated from Equation (2)) already show signs of cancellation, as follows:
$$\begin{aligned} \rho(x_1, x_2; x_1, x_2) = {}& \exp\!\left[-\frac{(x_1 - x_\alpha)^2}{2\sigma_\alpha^2} - \frac{(x_2 - x_\beta)^2}{2\sigma_\beta^2}\right] + \exp\!\left[-\frac{(x_2 - x_\alpha)^2}{2\sigma_\alpha^2} - \frac{(x_1 - x_\beta)^2}{2\sigma_\beta^2}\right] \\ & \pm \exp\!\left[-\frac{(x_1 - x_\alpha)^2 + (x_2 - x_\alpha)^2}{4\sigma_\alpha^2} - \frac{(x_2 - x_\beta)^2 + (x_1 - x_\beta)^2}{4\sigma_\beta^2}\right] \times 2\cos\!\big((k_\alpha - k_\beta)(x_1 - x_2)\big). \tag{3} \end{aligned}$$
As is evident, if $x_\alpha$ is significantly different from $x_\beta$, or $k_\alpha$ from $k_\beta$, then there is already cancellation or rapid oscillation. With more particles the effect is stronger. This, by the way, is the reason that isolated systems can be analyzed without paying attention to symmetrization with respect to all identical particles in the universe.
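As a numerical illustration of this cancellation, the following sketch (all parameter values are illustrative choices, not taken from the paper) evaluates the diagonal of the density matrix as $|\psi|^2$ from Equation (2) and isolates the exchange term by comparing the two symmetrizations:

```python
import numpy as np

def psi(x1, x2, xa, xb, ka, kb, sa=1.0, sb=1.0, sign=+1):
    """(Anti)symmetrized pair of Gaussian packets, Eq. (2), unnormalized."""
    direct = np.exp(-(x1 - xa)**2 / (4 * sa**2)
                    - (x2 - xb)**2 / (4 * sb**2)
                    + 1j * (ka * x1 + kb * x2))
    exchange = np.exp(-(x2 - xa)**2 / (4 * sa**2)
                      - (x1 - xb)**2 / (4 * sb**2)
                      + 1j * (ka * x2 + kb * x1))
    return direct + sign * exchange

x1, x2 = np.meshgrid(np.linspace(-10, 10, 401), np.linspace(-10, 10, 401))

for sep in (0.5, 8.0):  # overlapping vs. well-separated packets
    args = dict(xa=-sep / 2, xb=+sep / 2, ka=1.0, kb=-1.0)
    rho_plus = np.abs(psi(x1, x2, sign=+1, **args))**2   # diagonal of rho
    rho_minus = np.abs(psi(x1, x2, sign=-1, **args))**2
    # |d+e|^2 - |d-e|^2 = 4 Re(d e*): the exchange (cross) term alone.
    cross = np.abs(rho_plus - rho_minus).max() / rho_plus.max()
    print(f"separation {sep}: relative size of exchange term ~ {cross:.2e}")
```

For well-separated centers the exchange term is exponentially small, which is the content of the remark above about ignoring symmetrization with distant identical particles.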
To get low entropy at the beginning and the end, confine the wave functions—at those two times—to particular coarse grains. In between the two times considered, wave functions will spread. This sounds like the classical definitions, but once you have coarse grains the definitions are not all that different. However, to imitate [12] it is necessary that $\psi_{\mathrm{final}} = U_T\,\psi_{\mathrm{initial}}$, where $U_T$ is the propagator from the initial time 0 to the final time $T$. This imposes a significant constraint on the wave function; in particular, the wave function, under $U_T$, should not spread. If spreading were to happen, the entire available space, not just the target in phase space, would be occupied and the entropy would not drop.
In a recent article [15], we found that Gaussians that scatter do not spread. (This was explored further in [17], but the principal application to Gaussians was done in [15].) The idea is that scattering provides localization. In [17], it is argued that wave functions become Gaussian (often) but that involves decoherence, which is human perception. Is everything a Gaussian? Obviously not; atomic wave functions can be complicated and even the hydrogen atom is a different function. Nevertheless, for the purpose of this demonstration a Gaussian will be adequate, at least for showing that sometimes causality is an effect.
The requirement that $\psi_{\mathrm{final}} = U_T\,\psi_{\mathrm{initial}}$ and that both be confined to a (small) region of phase space (at $t = 0$ and $t = T$) is severe. However, based on the results of [15], it can be done. It is possible that the entropy would not be strictly zero (due to the Gaussian’s never vanishing) but it can be made small. The same holds in momentum space.
At this point we are back in classical mechanics and the proof is straightforward. Since non-standard definitions are used in [12] we repeat the proof (now using standard definitions).

5. Classical Proof

We present a précis of what has been done in our previous work ([12]).
On the phase space $\Omega$ let $\mu$ be a measure with $\mu(\Omega) = 1$. Let the dynamics be a measure-preserving map $\phi(t)$ on $\Omega$, with $\phi(t)(\omega)$ the time-$t$ image of an initial point $\omega \in \Omega$. The coarse graining of $\Omega$, providing a notion of “macroscopic,” is a collection of sets $\{\Delta_\alpha\}$, $\alpha = 1, \dots, G$, with $\bigcup_\alpha \Delta_\alpha = \Omega$ and $\Delta_\alpha \cap \Delta_\beta = \emptyset$ for $\alpha \neq \beta$. Let $\chi_\alpha$ be the characteristic function of $\Delta_\alpha$ and let $v_\alpha = \mu(\Delta_\alpha) = \int \chi_\alpha(\omega)\,d\omega$ ($\omega \in \Omega$). If $f$ is a function on $\Omega$, its coarse graining is defined as
$$\hat f(\omega) \equiv \sum_\alpha \chi_\alpha(\omega)\,\hat f_\alpha \quad\text{with}\quad \hat f_\alpha \equiv \frac{\int d\omega\,\chi_\alpha(\omega)\,f(\omega)}{v_\alpha}. \tag{4}$$
Let the system’s distribution in $\Omega$ be described by a density function $\rho(\omega)$. The entropy to be used for studying irreversibility involves coarse graining and is defined as
$$S(\rho) \equiv -\int_\Omega \hat\rho \log \hat\rho \, d\mu, \tag{5}$$
with $\hat\rho$ formed from $\rho$ as in Equation (4). (In other notation, $S = -\int_\Omega \hat\rho(\omega) \log \hat\rho(\omega)\,d\omega$.) The relative entropy (Kullback–Leibler divergence), to which $S(\rho)$ is related, was given in [12] with a different sign from the usual. Moreover, the illustration given in Figure 1 uses a different definition of entropy.
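As a concrete reading of Equation (5), here is a minimal sketch that evaluates $S$ for an empirical (point-sample) density on the unit square with a uniform grid of grains; the grid size, sample counts, and the choice of $\Omega$ are illustrative assumptions:

```python
import numpy as np

def coarse_grained_entropy(points, M=10):
    """S(rho) = -int rho_hat log(rho_hat) dmu, Eq. (5), for points in [0,1)^2.

    Each of the M*M grid cells is a grain Delta_alpha with measure 1/M^2."""
    N = len(points)
    counts, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                  bins=M, range=[[0, 1], [0, 1]])
    v = 1.0 / M**2                  # mu(Delta_alpha)
    rho_hat = counts / (N * v)      # coarse-grained density; integrates to 1
    nz = rho_hat > 0
    # Zero for the uniform density, negative for concentrated ones.
    return -np.sum(v * rho_hat[nz] * np.log(rho_hat[nz]))

rng = np.random.default_rng(0)
uniform = rng.random((5000, 2))
clumped = 0.1 * rng.random((5000, 2))        # confined to one corner grain
print(coarse_grained_entropy(uniform))        # ~ 0
print(coarse_grained_entropy(clumped))        # ~ -log(100) = -4.6
```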
Turning to the system at hand, it is required to start ($t = 0$) in a subset $\epsilon_0 \subset \Omega$ and end ($t = T$) in a subset $\epsilon_T \subset \Omega$. (The fact that Gaussian wave functions necessarily do not vanish anywhere may lead to a small correction.) The points of $\Omega$ satisfying this two-time boundary condition are
$$\epsilon = \epsilon_0 \cap \phi(-T)(\epsilon_T). \tag{6}$$
In [12,18,19] it is argued that for chaotic dynamics and for sufficiently long times $T$, $\epsilon \neq \emptyset$. (Whether this carries over to quantum mechanics will be dealt with later.) We assume that there is a relaxation time $\tau$, and that $T \gg \tau$. As a consequence,
$$\mu(\epsilon) \approx \mu(\epsilon_0)\,\mu(\epsilon_T). \tag{7}$$
(Recall that for mixing dynamics, $\phi(t)$ satisfies $\lim_{t \to \infty} \mu\big(A \cap \phi(t)(B)\big) = \mu(A)\,\mu(B)$. This is true in the $t \to \infty$ limit, but we assume that there is a time $\tau$ such that the decorrelation condition holds for intervals longer than $\tau$. Moreover, $\phi$ is measure preserving.) Under $\phi(t)$, $\epsilon$ becomes
$$\epsilon(t) = \phi(t)(\epsilon_0) \cap \phi(t-T)(\epsilon_T). \tag{8}$$
To calculate the entropy, the density, which was $\rho(0) = \chi_\epsilon / \mu(\epsilon)$ at time-0, must be coarse grained. The important quantity for the entropy calculation is
$$\rho_\alpha(t) = \frac{\mu\big(\Delta_\alpha \cap \epsilon(t)\big)}{\mu(\epsilon)} = \frac{\mu\big(\Delta_\alpha \cap \phi(t)(\epsilon_0) \cap \phi(t-T)(\epsilon_T)\big)}{\mu(\epsilon)}. \tag{9}$$
If $T - t > \tau$ then the following will hold:
$$\begin{aligned} \mu\big(\Delta_\alpha \cap \phi(t)(\epsilon_0) \cap \phi(t-T)(\epsilon_T)\big) &= \mu\big(\Delta_\alpha \cap \phi(t)(\epsilon_0)\big)\,\mu\big(\phi(t-T)(\epsilon_T)\big), \\ \mu(\epsilon) &= \mu(\epsilon_0)\,\mu\big(\phi(-T)(\epsilon_T)\big). \end{aligned} \tag{10}$$
Using the measure-preserving property of $\phi(t)$, $\mu\big(\phi(t-T)(\epsilon_T)\big) = \mu\big(\phi(-T)(\epsilon_T)\big) = \mu(\epsilon_T)$, so the factors $\mu(\epsilon_T)$ in the numerator and denominator of $\rho_\alpha$ cancel, leading to
$$\rho_\alpha(t) = \frac{\mu\big(\Delta_\alpha \cap \phi(t)(\epsilon_0)\big)}{\mu(\epsilon_0)}. \tag{11}$$
This is precisely what one gets without future conditioning, so that all macroscopic quantities, and in particular the entropy, are indistinguishable from their unconditioned values.
Working backward from time-$T$ one obtains an analogous result. Define a variable $s \equiv T - t$ and set $\tilde\epsilon(s) \equiv \epsilon(T - s)$. Then
$$\tilde\epsilon(s) = \phi(T-s)(\epsilon_0) \cap \phi(-s)(\epsilon_T). \tag{12}$$
If $s$ satisfies $T - s > \tau$, then when the density associated with $\tilde\epsilon(s)$ is calculated, its dependence on $\epsilon_0$ will drop out. It follows that
$$\rho_\alpha(s) = \frac{\mu\big(\Delta_\alpha \cap \phi(-s)(\epsilon_T)\big)}{\mu(\epsilon_T)}. \tag{13}$$
For a time-reversal invariant dynamics this gives the entropy the same time dependence coming back from $T$ as going forward from 0. Even in the absence of time-reversal invariance, the behavior should be roughly the same because of the absence of any dissipative dynamics.
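Before turning to perturbations, note that the decorrelation assumption behind Equations (7) and (10) can be checked numerically. The sketch below does so for the cat map of Appendix C; the regions, sample size, and times are illustrative choices, and the Monte Carlo estimate carries a few percent of sampling error:

```python
import numpy as np

def cat_inverse(pts, t):
    """Apply the inverse cat map t times: (x, y) -> (2x - y, -x + y) mod 1."""
    x, y = pts[:, 0].copy(), pts[:, 1].copy()
    for _ in range(t):
        x, y = (2 * x - y) % 1.0, (-x + y) % 1.0
    return np.stack([x, y], axis=1)

def in_box(pts, box):
    (x0, x1), (y0, y1) = box
    return (pts[:, 0] >= x0) & (pts[:, 0] < x1) & \
           (pts[:, 1] >= y0) & (pts[:, 1] < y1)

rng = np.random.default_rng(1)
omega = rng.random((200_000, 2))          # uniform sample of Omega
A = ((0.0, 0.2), (0.0, 0.2))              # a grain, mu(A) = 0.04
B = ((0.5, 0.7), (0.5, 0.7))              # epsilon_0, mu(B) = 0.04
muA, muB = 0.04, 0.04

for t in (0, 1, 3, 6, 10):
    # omega in phi(t)(B)  <=>  phi(-t)(omega) in B
    joint = np.mean(in_box(omega, A) & in_box(cat_inverse(omega, t), B))
    print(f"t={t:2d}  mu(A * phi^t(B)) = {joint:.4f}   mu(A)mu(B) = {muA*muB:.4f}")
```

For $t = 0$ the two (disjoint) boxes give zero; within a handful of steps the joint measure approaches the product $\mu(A)\mu(B)$, which is the operational meaning of the relaxation time $\tau$ used above.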
Now we turn to perturbations. Call the unperturbed system A. The microstates are in the set
$$\epsilon_A = \epsilon_0 \cap \phi(-T)(\epsilon_T) \tag{14}$$
(formerly called $\epsilon$). System $B$, the perturbed case, has an instantaneous transformation act on it at time-$t_0$. Call this transformation $\xi$. (This transformation is called $\psi$ in [12]; the letter $\xi$ is chosen here to avoid confusion with the wave function.) It is not dissipative—the arrow does not arise from such an asymmetry. $\xi$ is invertible and measure preserving. Successful solutions must go from $\epsilon_0$ to $\epsilon_T$ under the transformation $\phi(T-t_0)\,\xi\,\phi(t_0)$. The microstates for system $B$ are therefore
$$\epsilon_B = \epsilon_0 \cap \phi(-t_0)\,\xi^{-1}\,\phi(-T+t_0)(\epsilon_T). \tag{15}$$
Again, there is the assumption that $\epsilon_B$ is not empty, which will be taken up for quantum mechanics in Section 6. Clearly, $\epsilon_A$ and $\epsilon_B$ are different—at all times. But as will now be shown, for mixing dynamics and for sufficiently large $T$, the following hold: (1) for $t_0$ close to 0, the only differences in macroscopic behavior between $A$ and $B$ are for $t > t_0$; (2) for $t_0$ close to $T$, the only differences in macroscopic behavior between $A$ and $B$ are for $t < t_0$. The direction of causality follows the direction of entropy increase.
The proof is nearly the same as before. Again we use a time $\tau$ such that the mixing decorrelation holds for time intervals longer than $\tau$. First consider $t_0$ close to 0. The observable macroscopic quantities are the densities in grain-$\Delta_\alpha$, which are, for $t < t_0$,
$$\begin{aligned} \rho_\alpha^A(t) &= \mu\big(\Delta_\alpha \cap \phi(t)(\epsilon_0) \cap \phi(t-T)(\epsilon_T)\big)\big/\mu(\epsilon_A), \\ \rho_\alpha^B(t) &= \mu\big(\Delta_\alpha \cap \phi(t)(\epsilon_0) \cap \phi(t-t_0)\,\xi^{-1}\,\phi(t_0-T)(\epsilon_T)\big)\big/\mu(\epsilon_B). \end{aligned} \tag{16}$$
As before, the mixing property, for $T - t > \tau$, yields $\rho_\alpha^A(t) = \mu\big(\Delta_\alpha \cap \phi(t)(\epsilon_0)\big)/\mu(\epsilon_0)$, which is the initial-value-only macroscopic time evolution. For $\rho_\alpha^B$, the only difference is the added step $\xi^{-1}$. But this step is as measure preserving as $\phi$ itself, and therefore, as before, $\mu\big([\phi(t-t_0)\,\xi^{-1}\,\phi(t_0-T)](\epsilon_T)\big)\big/\mu(\epsilon_B) = 1\big/\mu(\epsilon_0)$. Thus $A$ and $B$ have the same macrostates before $t_0$.
For $t > t_0$, $\rho_\alpha^A(t)$ continues its behavior as before. For $\rho_\alpha^B(t)$ things are different:
$$\rho_\alpha^B(t) = \mu\big(\Delta_\alpha \cap \phi(t-t_0)\,\xi\,\phi(t_0)(\epsilon_0) \cap \phi(t-T)(\epsilon_T)\big)\big/\mu(\epsilon_B) \qquad (t > t_0). \tag{17}$$
Now, I require $T - t > \tau$. If this is satisfied, the $\epsilon_T$ dependence drops out and
$$\rho_\alpha^B(t) = \frac{\mu\big(\Delta_\alpha \cap \phi(t-t_0)\,\xi\,\phi(t_0)(\epsilon_0)\big)}{\mu(\epsilon_0)}. \tag{18}$$
This shows that the effect of ξ is the usual initial-conditions-only phenomenon.
If we repeat these arguments for $t_0$ such that $T - t_0$ is small, then, just as we showed in Section 3, the effect of $\xi$ will only be at times $t$ less than $t_0$.

6. The Set $\epsilon_0 \cap \phi^{-T}\epsilon_T$ Is Not Empty under Quantum Evolution

The assertion that the set $\epsilon_0 \cap \phi^{-T}\epsilon_T$ is not empty under quantum evolution is sometimes true. What has been shown is that Gaussian wave functions do not spread indefinitely, but rather are localized each time they scatter [15]. Thus, our arguments hold for a gas of particles that are Gaussians and occasionally scatter. As found in [15,17], wave functions overlap, but scattering usually takes place at distances on the order of $\ell$, the mean free path (for scattering).
What happens in more general situations is not known. One theory of the quantum measurement process ([19]; namely, that there is no such thing as a measurement) proposes that non-entanglement is usually the case. However, experiment has not yet ruled on this issue.

7. Conclusions

There are many defects in this quantum version of entropy increase determining the direction of causality. It is not even clear whether a quantum theory is needed at all: causality as usually understood is a macroscopic phenomenon, not a quantum mechanical one. Nevertheless, we point out the deficiencies of the present work.
There is uncertainty over black holes. If one considers two-time boundary conditions purely as a way to get rid of “initial conditions” prejudice, that is fine; but if there is any thought that the world is actually (approximately) symmetric, then that too must be dealt with. Do black holes evaporate? Do they become white holes? Are there black holes at all? (See the blog by Cramer [20].) We will not deal with any of these questions and allow our results to depend on the answers.
Our arguments are phrased in terms of chaotic dynamics. Chaos is problematic for quantum mechanics. Our previous paper ([12]) deals also with harmonic oscillators (which are not chaotic) but has other limitations. For the argument above to be meaningful all that is really required is a relaxation time. It is plausible that this exists in quantum theory.
Although we have briefly mentioned identity of particles, that has not been seriously dealt with.
Finally, there is the requirement that $\psi_{\mathrm{final}} = U_T\,\psi_{\mathrm{initial}}$ (where $U_T$ is the evolution operator for time $T$) and that both be low entropy, confined to a region of phase space. In some versions of quantum mechanics this is a natural requirement, but if that is wrong, there are severe limitations on the applicability of these results (see Section 6).
We mention in passing that our understanding of the “destiny” of Aharonov et al. does not fix the results of measurements (the “destiny” being a future value of the wave function). This is because general wave functions allow many outcomes to a given experiment.

Funding

This research received no external funding.

Data Availability Statement

Not Applicable.

Acknowledgments

I’d like to thank Stefan Kirchner for his help.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A. Justification of Schottky

The assertions of Schottky [1,2] were later justified in the context of Wheeler-Feynman (and predecessor) theory by Schulman [21]. The cited paper shows the existence problem only when two particles are involved. It is clear that if there are more, confusion reigns. This is not to say that the initial value problem of conventional electrodynamics (with the fields explicit) lacks a solution. But the field-free formulation presents further difficulties.

Appendix B. More on History

This is a bare-bones review of the history of two-time or two-state boundary conditions. In particular, there is other work by Bopp [22], Craig [23], Hegerfeldt [24], Schulman [25] and many others on related subjects.

Appendix C. The “Cat Map”

The “cat map” is a gas of $N$ non-interacting particles, each obeying $x' = x + y$, $y' = x + 2y$, both mod 1, with $x$ and $y$ in the unit square. See [26]. The entropy is $S = -\sum_\alpha p_\alpha \log p_\alpha$ with $p_\alpha = n_\alpha / N$, where $n_\alpha$ is the number of points in a box of a $\tfrac{1}{M}$-by-$\tfrac{1}{M}$ grid, $N = \sum_\alpha n_\alpha$ (=500 in the figure) and $M$ (=10) is an integer.
The name, due to Arnold, derives from an image of a cat, drawn in the unit square, that becomes highly distorted within a few time steps.
In the figures, somewhat more than $N M^2$ points are started in the first region and only those which land in the final region are kept. This turns out to be symmetric and is a realization of the set ($\epsilon = \epsilon_0 \cap \phi^{-T}\epsilon_T$, or versions involving $\xi$) of the analytic version.
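The following is a minimal sketch of this procedure (the region sizes, the horizon $T$, the sample count, and the particular measure-preserving perturbation $\xi$ are illustrative choices, so the curves will match Figure 1 only qualitatively):

```python
import numpy as np

def cat_step(pts):
    """One step of the cat map: x' = x + y, y' = x + 2y (both mod 1)."""
    x, y = pts[:, 0], pts[:, 1]
    return np.stack([(x + y) % 1.0, (x + 2 * y) % 1.0], axis=1)

def entropy(pts, M=10):
    """S = -sum_alpha p_alpha log p_alpha on an M-by-M grid, as in the text."""
    counts, _, _ = np.histogram2d(pts[:, 0], pts[:, 1], bins=M,
                                  range=[[0, 1], [0, 1]])
    p = counts.ravel() / len(pts)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def xi(pts):
    """An illustrative invertible, measure-preserving perturbation: shift x."""
    out = pts.copy()
    out[:, 0] = (out[:, 0] + 0.3) % 1.0
    return out

def conditioned_entropies(T=16, t0=None, n=200_000, box=0.1, seed=0):
    """Entropy history of points started in epsilon_0, conditioned on
    landing in epsilon_T, optionally perturbed by xi at time t0."""
    rng = np.random.default_rng(seed)
    pts = rng.random((n, 2)) * box              # epsilon_0 = [0, box)^2
    history = [pts]
    for t in range(1, T + 1):
        pts = cat_step(pts)
        if t == t0:
            pts = xi(pts)
        history.append(pts)
    keep = np.all(history[-1] < box, axis=1)    # epsilon_T = [0, box)^2
    return [entropy(h[keep]) for h in history]

S_plain = conditioned_entropies()
S_early = conditioned_entropies(t0=3)     # perturbation while entropy rises
S_late  = conditioned_entropies(t0=13)    # perturbation while entropy falls
for t, row in enumerate(zip(S_plain, S_early, S_late)):
    print(f"t={t:2d}  plain={row[0]:5.2f}  t0=3: {row[1]:5.2f}  t0=13: {row[2]:5.2f}")
```

Comparing the three runs shows the rise-plateau-fall shape of the conditioned entropy and, in the sense of the text, the macroscopic difference appearing after $t_0 = 3$ in the first perturbed case but before $t_0 = 13$ in the second.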

References

1. Schottky, W. Das Kausalproblem der Quantentheorie als eine Grundfrage der modernen Naturforschung überhaupt. Naturwissenschaften 1921, 9, 492–496.
2. Schottky, W. Das Kausalproblem der Quantentheorie als eine Grundfrage der modernen Naturforschung überhaupt. Naturwissenschaften 1921, 9, 506–511. (This is the portion of the Schottky article where the topic is discussed, despite the page number not being given in the Wikipedia article. Available online: https://en.wikipedia.org/wiki/Two-state_vector_formalism, accessed on 31 March 2021.)
3. Watanabe, S. Symmetry of Physical Laws. Part III. Prediction and Retrodiction. Rev. Mod. Phys. 1955, 27, 179–186.
4. Schulman, L.S. Correlating Arrows of Time. Phys. Rev. D 1973, 7, 2868–2874.
5. Gold, T. The Arrow of Time. Am. J. Phys. 1963, 30, 403–410.
6. Wheeler, J.A. Conference Summary: More Results than Ever in Gravitation Physics and Relativity. In General Relativity and Gravitation (G7); Shaviv, G., Rosen, J., Eds.; Israel Universities Press: Jerusalem, Israel; Wiley: New York, NY, USA, 1975; pp. 299–344.
7. Wheeler, J.A. Frontiers of time. In Rendiconti della Scuola Internazionale di Fisica “Enrico Fermi”, LXII Corso; Toraldo di Francia, G., Ed.; North-Holland: Amsterdam, The Netherlands, 1979; pp. 395–497.
8. Gell-Mann, M.; Hartle, J.B. Time-Symmetry and Asymmetry in Quantum Mechanics and Quantum Cosmology. In Physical Origins of Time Asymmetry; Halliwell, J.J., Perez-Mercader, J., Zurek, W.H., Eds.; Cambridge Univ. Press: Cambridge, UK, 1994; pp. 311–345.
9. Aharonov, Y.; Gruss, E. Two-time interpretation of quantum mechanics. arXiv 2005, arXiv:quant-ph/0507269v1.
10. Aharonov, Y.; Cohen, E.; Landsberger, T. The Two-Time Interpretation and Macroscopic Time-Reversibility. Entropy 2017, 19, 111.
11. Robertson, K. Can the Two-Time Interpretation of Quantum Mechanics Solve the Measurement Problem? Stud. Hist. Phil. Sci. Part B 2017, 58, 54–62.
12. Schulman, L.S. Causality is an effect. In Time’s Arrows, Quantum Measurements and Superluminal Behavior; Mugnai, D., Ranfagni, A., Schulman, L.S., Eds.; Consiglio Nazionale delle Ricerche (CNR): Rome, Italy, 2001; pp. 99–112.
13. Newton, R.G. Thinking about Physics; Princeton University Press: Princeton, NJ, USA, 2000.
14. Dirac, P.A.M. Classical Theory of Radiating Electrons. Proc. R. Soc. Lond. A 1938, 167, 148–169. Available online: http://rspa.royalsocietypublishing.org/content/167/929/148.full.pdf (accessed on 25 May 2021).
15. Gaveau, B.; Schulman, L.S. Reconciling Kinetic and Quantum Theory. Found. Phys. 2020, 50, 55–60.
16. Zeh, H.D. Related ideas and concepts. In Decoherence and the Appearance of a Classical World in Quantum Theory; Giulini, D., Joos, E., Kiefer, C., Kupsch, J., Stamatescu, I.-O., Zeh, H.D., Eds.; Springer: Berlin, Germany, 1996; Chapter 9; pp. 269–282.
17. Schulman, L.S. What is the size and shape of a wave packet? Symmetry 2021, 13, 527.
18. Schulman, L.S. Accuracy of the semiclassical approximation for the time dependent propagator. J. Phys. A 1994, 27, 1703–1721.
19. Schulman, L.S. Time’s Arrows and Quantum Measurement; Cambridge University Press: Cambridge, UK, 1997.
20. Cramer, J.G. Do Black Holes Really Exist? Available online: https://www.npl.washington.edu/av/altvw192.html (accessed on 25 May 2021).
21. Schulman, L.S. Some Differential-Difference Equations Containing Both Advance and Retardation. J. Math. Phys. 1974, 15, 295–298.
22. Bopp, F.W. Time Symmetric Quantum Mechanics and Causal Classical Physics. Found. Phys. 2017, 47, 490–504.
23. Craig, D.A. Observation of the Final Boundary Condition: Extragalactic Background Radiation and the Time Symmetry of the Universe. Ann. Phys. 1996, 251, 384–425.
24. Hegerfeldt, G.C. Remark on causality and particle localization. Phys. Rev. D 1974, 10, 3320.
25. Schulman, L.S. Opposite Thermodynamic Arrows of Time. Phys. Rev. Lett. 1999, 83, 5419–5422.
26. Arnold, V.I.; Avez, A. Ergodic Problems of Classical Mechanics; Benjamin: New York, NY, USA, 1968.
Figure 1. The entropy as a function of time for the “cat map.” In all three figures the unperturbed entropy is shown. In the second figure the perturbation (with circles around values) takes place at time $t = 3$, while in the third figure the perturbation is at time $t = 13$. In the latter two cases the change in macroscopic behavior is subsequent in the sense of increasing entropy. (The graphs are inverted from the usual definition of entropy.)
