# The Ergodic Problem
The ergodic problem or the problem of [ergodicity](https://en.wikipedia.org/wiki/Ergodicity) has a long history going back to [Ludwig Boltzmann](https://www.informationphilosopher.com/solutions/scientists/boltzmann/)'s development of statistical mechanics.
Today ergodic theory is a set of formal problems in mathematics arising from the dynamics of [deterministic](https://www.informationphilosopher.com/freedom/determinism.html) and _continuous_ classical mechanics. It is concerned with the path of a hypothetical infinitesimal particle in 6-dimensional phase space, or with the path of N particles in their 6N-dimensional phase space.
The "ergodic hypothesis" originated with Boltzmann and was used by [J. Willard Gibbs](https://www.informationphilosopher.com/solutions/scientists/gibbs/) and [Albert Einstein](https://www.informationphilosopher.com/solutions/scientists/einstein/) to replace time averages over the motions of a single dynamical system with average properties of an "ensemble" of identical systems, sometimes loosely described as replacing time averages with "space" averages.
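The replacement of time averages by ensemble averages can be illustrated with a toy model (a sketch, not Boltzmann's or Gibbs's actual systems): the irrational rotation _x_ → (_x_ + α) mod 1 is ergodic, so the long-time average of an observable along one orbit approaches its average over the whole space.

```python
import math

# A minimal ergodic system: the irrational rotation x -> (x + alpha) mod 1.
# Its orbits equidistribute over [0, 1), so the time average of an
# observable along one trajectory approaches its "space" average.

def time_average(f, x0, alpha, n_steps):
    """Average the observable f along one orbit of the rotation map."""
    x, total = x0, 0.0
    for _ in range(n_steps):
        total += f(x)
        x = (x + alpha) % 1.0
    return total / n_steps

alpha = (math.sqrt(5) - 1) / 2    # golden-ratio rotation angle, irrational
observable = lambda x: x          # its space average over [0, 1) is 1/2

avg = time_average(observable, x0=0.1, alpha=alpha, n_steps=100_000)
print(avg)                        # ≈ 0.5, the space ("ensemble") average
```

The rotation map is only a stand-in for a Hamiltonian flow, but it shows the content of the hypothesis: one long trajectory samples the space the way an ensemble does.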
The ergodic problem is intimately connected to the problems of [reversibility](https://www.informationphilosopher.com/problems/reversibility/) and [recurrence](https://www.informationphilosopher.com/problems/recurrence/) raised by 19th-century critics of Boltzmann's theorem about the increase of thermodynamic entropy.
Information philosophy suggests the solution to this difficult problem is to use [indeterministic](https://www.informationphilosopher.com/freedom/indeterminism.html) and _discrete_ [quantum](https://www.informationphilosopher.com/quantum/) physics to consider what happens when material particles approach one another.
The phenomenological thermodynamics of [Carnot](https://en.wikipedia.org/wiki/Sadi_Carnot) and [Clausius](https://en.wikipedia.org/wiki/Rudolf_Clausius) uses only macroscopic continuous variables like pressure, volume, temperature, and chemical potentials to define entropy.
The statistical mechanics of Boltzmann and [James Clerk Maxwell](https://www.informationphilosopher.com/solutions/scientists/maxwell/) studies the possible distributions of N particles in the 6N dimensions of their phase space. It usually studies gases in an equilibrium state to derive all the results of classical thermodynamics.
Maxwell was the first to calculate the distribution of velocities (and therefore energies) of "ideal gas" particles. In an "ideal gas" collisions between particles are assumed to be "elastic," so that all the energy is in the kinetic energy of the moving particles.
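Maxwell's result can be recovered numerically (a sketch, not Maxwell's derivation): in equilibrium each velocity component is Gaussian with variance kT/m, and the resulting speeds follow the Maxwell distribution. In units where kT/m = 1, the mean speed is √(8/π).

```python
import math
import random

# Sample ideal-gas speeds: each velocity component is Gaussian with
# variance kT/m (here set to 1), and speed = |v| follows the Maxwell
# speed distribution.

random.seed(0)
n = 50_000
speeds = [
    math.sqrt(sum(random.gauss(0.0, 1.0) ** 2 for _ in range(3)))
    for _ in range(n)
]

mean_speed = sum(speeds) / n
expected = math.sqrt(8 / math.pi)   # Maxwell's mean speed, sqrt(8kT/(pi m))
print(mean_speed, expected)          # the two agree to within sampling error
```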
Boltzmann's kinetic theory of gases analyzes the statistics of collisions between gas particles to draw conclusions about systems that may be distant from equilibrium. Boltzmann neglected three-particle collisions on the grounds that in a dilute gas even two-particle collisions are rare, making three-particle encounters rarer still.
Boltzmann's transport equation analyzes the flow of particles in and out of a certain volume. He studied two-particle collisions and assumed that after a collision the particles' paths and velocities were uncorrelated with those before the collision. He called this assumption "molecular chaos" or "molecular disorder" (_molekular ungeordnet_).
It uses the Liouville theorem that the phase-space distribution function is constant along the trajectories of the system. The phase-space distribution _ρ(p,q)_ determines the probability _ρ(p,q) dq dp_ that the system will be found in the infinitesimal phase-space volume _dq dp_. The density of system points in the vicinity of a given system point traveling through phase space is constant in time.
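In its standard form (using Hamilton's equations and the Poisson bracket with the Hamiltonian _H_), Liouville's theorem says the total time derivative of the density vanishes:

```latex
\frac{d\rho}{dt}
  = \frac{\partial \rho}{\partial t}
  + \sum_{i=1}^{3N} \left(
      \frac{\partial \rho}{\partial q_i}\,\dot{q}_i
    + \frac{\partial \rho}{\partial p_i}\,\dot{p}_i
    \right)
  = \frac{\partial \rho}{\partial t} + \{\rho, H\}
  = 0
```

This is the precise statement that the density of system points is constant along trajectories: the phase-space "fluid" is incompressible.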
Statistical mechanics assumes equal _a priori_ probabilities to give us the density. The underlying mechanics is the classical mechanics of Newton.
## Loschmidt's Paradox
In 1874, [Josef Loschmidt](https://www.informationphilosopher.com/solutions/scientists/loschmidt/) criticized his younger colleague [Ludwig Boltzmann](https://www.informationphilosopher.com/solutions/scientists/boltzmann/)'s 1866 attempt to derive from basic classical dynamics the increasing entropy required by the second law of thermodynamics.
Increasing entropy is the intimate connection between time and the second law of thermodynamics that [Arthur Stanley Eddington](https://www.informationphilosopher.com/solutions/scientists/eddington/) later called the [Arrow of Time](https://www.informationphilosopher.com/problems/arrow_of_time/). (The fundamental arrow of time is the expansion of the universe, which makes room for all the other arrows.) Although entropy has never been observed to decrease in an isolated system, attempts to "prove" that it always increases have been failures.
Loschmidt's criticism was based on the simple idea that the laws of classical dynamics are time reversible. Consequently, if we just turned the time around, the time evolution of the system should lead to decreasing entropy. Of course we cannot turn time around, but a classical dynamical system will evolve in reverse if all the particles could have their velocities exactly reversed. Apart from the practical impossibility of doing this, Loschmidt had shown that systems could exist for which the entropy should decrease instead of increasing. This is called Loschmidt's "Reversibility Objection" (_Umkehreinwand_) or "Loschmidt's paradox." We call it the _problem of_ microscopic [reversibility](https://www.informationphilosopher.com/problems/reversibility/).
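Loschmidt's point can be demonstrated in a few lines (a toy sketch, not his argument about gases): a time-reversible integration of a harmonic oscillator, run forward and then continued with the velocity reversed, retraces its path back to the initial state.

```python
# Classical dynamics is time reversible: integrate a harmonic oscillator
# forward with a reversible (leapfrog) scheme, reverse the velocity, and
# the system returns to its starting state (up to floating-point rounding).

def leapfrog_step(x, v, dt):
    """One kick-drift-kick step for H = p^2/2 + x^2/2 (force = -x)."""
    v -= x * dt / 2
    x += v * dt
    v -= x * dt / 2
    return x, v

x, v = 1.0, 0.0            # initial state
dt, n = 0.01, 1000

for _ in range(n):         # evolve forward in time
    x, v = leapfrog_step(x, v, dt)

v = -v                     # Loschmidt's velocity reversal

for _ in range(n):         # the same dynamics now runs the film backward
    x, v = leapfrog_step(x, v, dt)

print(x, -v)               # ≈ (1.0, 0.0), the initial state recovered
```

The leapfrog scheme is used here because, like the exact dynamics, it is time-symmetric; a non-reversible integrator would not return exactly.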
## [Zermelo](https://www.informationphilosopher.com/solutions/scientists/zermelo/)'s Paradox
Zermelo's paradox was a criticism of [Ludwig Boltzmann](https://www.informationphilosopher.com/solutions/scientists/boltzmann/)'s H-Theorem, the attempt to derive the increasing entropy required by the second law of thermodynamics from basic classical dynamics.
It was the second "paradox" attack on Boltzmann. The first was [Josef Loschmidt](https://www.informationphilosopher.com/solutions/scientists/loschmidt/)'s claim that entropy would be reduced if time were reversed. This is the [problem of microscopic reversibility](https://www.informationphilosopher.com/problems/reversibility/).
[Ernst Zermelo](https://www.informationphilosopher.com/solutions/scientists/zermelo/) was an extraordinary mathematician. He was (in 1908) the founder of [axiomatic set theory](http://en.wikipedia.org/wiki/Zermelo%E2%80%93Fraenkel_set_theory), which with the addition of the axiom of choice (also by Zermelo, in 1904) is the most common foundation of mathematics. The axiom of choice says that given any collection of sets, one can find a way to unambiguously select one object from each set, even if the number of sets is infinite.
Before this amazing work, Zermelo was a young associate of [Max Planck](https://www.informationphilosopher.com/solutions/scientists/planck/) in Berlin, one of many German physicists who opposed Boltzmann's efforts to establish the existence of atoms.
Zermelo's criticism was based on the work of [Henri Poincaré](https://www.informationphilosopher.com/solutions/scientists/poincare/), an expert in the three-body problem, which, unlike the problem of two particles, has no exact analytic solution. Whereas two bodies can move in paths that may repeat exactly after a certain time, three bodies may only come arbitrarily close to an initial configuration, given enough time.
Poincaré had been able to establish limits or bounds on the possible configurations of the three bodies from conservation laws. Planck and Zermelo applied some of Poincaré's thinking to the _n_ particles in a gas. They argued that given a long enough time, the particles would return to a distribution in "phase space" (a 6_n_-dimensional space of possible velocities and positions) that would be indistinguishable from the original distribution. This is called the Poincaré "recurrence time."
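The recurrence argument is easiest to see in a finite toy model (an illustration, not Poincaré's proof): any invertible map on a finite state space must eventually return every point to where it started. Arnold's cat map on an N × N grid is such a map, and the recurrence time can be found by direct iteration.

```python
# Toy Poincaré recurrence: an invertible map on a finite state space
# must return every point to its starting state.  Arnold's cat map
# (x, y) -> (2x + y, x + y) mod N on an N x N grid is invertible,
# so we can find the recurrence time of a point by iterating.

def cat_map(x, y, n):
    return (2 * x + y) % n, (x + y) % n

N = 101                    # grid size; any integer > 1 works
start = (1, 0)

x, y = cat_map(*start, N)
steps = 1
while (x, y) != start:     # guaranteed to terminate: orbits are cycles
    x, y = cat_map(x, y, N)
    steps += 1

print(steps)               # the "recurrence time" of this toy system
```

For a real gas the state space is continuous and the recurrence time is astronomically long, which was Boltzmann's reply: recurrence exists in principle but is never observed in practice.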
Thus, they argued, Boltzmann's formula for the entropy would at some future time go back down, vitiating Boltzmann's claim that his measure of entropy always increases, as the second law of thermodynamics requires. Poincaré described his view in 1890.