[Discussion]
[Physics] Statistical Mechanics: Ensembles, Ergodicity, and Other Topics
Statistical Mechanics: Ensembles, Ergodicity and other Stuff
--------------------------------------------------------------------------------
The physical foundation of simulation is statistical mechanics. Without statistical mechanics, it would be difficult to make the connection between the rather small systems (at most several million particles) simulated for short periods of time and measurements on macroscopic samples. Also, most physically important systems are in contact with the rest of the universe, which we call a "heat bath" (although it can also be a reservoir for pressure or particles, etc.). Statistical mechanics "coarse-grains": it separates the thermodynamic macroscopic variables, like pressure and temperature, from the microscopic variables, the individual atomic positions and velocities. As we will see, the arguments also apply to few-body systems in contact with a "bath".
Even though most of the simulations we will talk about are classical, it is actually easier to understand the fundamental principles if we use the basic concepts of stationary states in quantum mechanics. This is because the energy states (of a finite system) are a countable set while classically position and momentum are continuous variables.
The Phase Space: A classical system is composed of many particles, each with position q_i and momentum p_i, denoted collectively as q and p. Together these form the classical phase space. Any conservative mechanical system can be characterized by a function H(q, p; t), the Hamiltonian of the system, which determines the energy as a function of the coordinates.
Suppose that we have a system with fixed N and V (volume). Then:
There is a finite number of energy levels in any energy range (E, E+dE). We call this the density of states, g(E) = exp( S(E) ), where S(E) is the entropy (in units of k_B).
If we don't know which state the system is in, the natural assumption is that the system is equally likely to be in any of the states in that range. (A priori equal probabilities is what one assumes for a toss of a fair coin, or for a well-shuffled deck of cards in which each of the 52! arrangements is equally likely.) This was called the "principle of insufficient reason" by James Bernoulli (1713) at the start of probability theory, and was furthered by Bayes and Laplace with their non-frequency interpretation of probability. In classical statistical mechanics, the idea of phase space is harder to justify since one has continuous variables; quantum mechanics showed that conjugate variables like position and momentum are the fundamental variables for phase space.
Next consider the interaction between two weakly coupled systems (with energies E1 and E2 and particle numbers N1 and N2) that are allowed to exchange energy and nothing else. Then for each pair of states there exists a new combined state with energy E1+E2, so the density of states of the combined system is the product g1(E1) g2(E2). This implies that the entropy of the combined system is the sum of the entropies of the two parts: S(E1+E2) = S1(E1) + S2(E2).
If one of the two systems is large (N1 >> N2, with N1+N2=N), the density of states is enormous and energy will flow to maximize the total entropy. This maximum will occur when the temperatures of the two systems are the same. The temperature (corresponding to the energy E) is defined by 1/T = dS/dE. See notes on Gibbs Distribution (PDF).
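To make the entropy-maximization argument concrete, here is a small numerical sketch. The two Einstein solids, their sizes, and the total number of energy quanta are toy choices of mine, not from the notes; the point is only that the combined entropy S = ln g1 + ln g2 is maximized at the split where the two temperatures (dS/dE) agree, which here means equal energy per oscillator:

```python
import math

# Toy model (my choice, not from the notes): two Einstein solids
# sharing q_total energy quanta.  The multiplicity of one solid is
# g(q, N) = C(q + N - 1, q); the combined multiplicity is the product,
# so the combined entropy (in units of k_B) is the sum of the logs.
def log_g(q, N):
    """Log multiplicity of an Einstein solid with q quanta, N oscillators."""
    return math.lgamma(q + N) - math.lgamma(q + 1) - math.lgamma(N)

N1, N2, q_total = 3000, 300, 1000

# Combined entropy for every way of splitting the energy.
S = [log_g(q1, N1) + log_g(q_total - q1, N2) for q1 in range(q_total + 1)]
q1_star = max(range(q_total + 1), key=lambda q1: S[q1])

# The most probable split equalizes dS/dE, i.e. the temperatures;
# for this model that means q1/N1 ~= q2/N2.
print(q1_star, q1_star / N1, (q_total - q1_star) / N2)
```

Setting dS1/dq1 = dS2/dq2 for this model gives q1/N1 = q2/N2, so the maximum lands near q1 = 1000 * 3000/3300 ≈ 909, and the energy per oscillator is the same on both sides.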
The following is the important conclusion of Boltzmann: for a system in contact with a heat bath, the probability of being in state i with energy Ei is P(Ei) = exp(-βEi)/Z, where β = 1/kT.
Here, Z = Σ_i exp(-βEi) is a normalization constant known as the partition function. Usually it is more convenient to work with the (Helmholtz) free energy F = -kT ln(Z): since the free energy is proportional to the size of the system, it is an extensive quantity, rather than depending exponentially on the system size. This is also important in computer programs, to avoid exponential overflow. All thermodynamic properties can be calculated as various derivatives of the free energy, as we will see.
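A minimal sketch of these formulas, with a made-up set of energy levels (the function name and level values are mine). Working with ln Z rather than Z, via the standard log-sum-exp trick, is exactly the overflow precaution mentioned above:

```python
import math

# Sketch: Boltzmann probabilities P_i = exp(-beta*E_i)/Z and the free
# energy F = -kT ln Z, for an arbitrary list of energy levels.
# Factoring out the largest exponent (log-sum-exp) avoids overflow
# even when beta*E_i is large.
def boltzmann(energies, kT):
    x = [-e / kT for e in energies]
    m = max(x)                                        # largest exponent
    log_Z = m + math.log(sum(math.exp(xi - m) for xi in x))
    probs = [math.exp(xi - log_Z) for xi in x]
    F = -kT * log_Z                                   # Helmholtz free energy
    return probs, F

probs, F = boltzmann([0.0, 1.0, 2.0], kT=1.0)
print(probs, F)   # lower-energy states are more probable
```

Note that only energy differences matter for the probabilities; shifting all levels by a constant shifts F but leaves the P_i unchanged.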
The measured value of some observable (operator A) will equal Tr(exp(-βH) A) / Tr(exp(-βH)), where Tr = trace, the sum of the diagonal elements. This is the quantum version of the more familiar ⟨A⟩ = Σ_i P_i A_i.
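The trace formula is easy to evaluate for a small matrix. In this sketch the two-level Hamiltonian and the observable are toy matrices of my choosing; the trace is taken in the eigenbasis of H, where exp(-βH) is diagonal:

```python
import numpy as np

# Sketch: <A> = Tr(exp(-beta H) A) / Tr(exp(-beta H)) for a toy
# two-level system.  Diagonalizing H reduces the traces to sums over
# energy eigenstates, weighted by exp(-beta E_n).
def thermal_average(H, A, beta):
    E, U = np.linalg.eigh(H)              # H = U diag(E) U^T
    w = np.exp(-beta * (E - E.min()))     # shift by E.min() to avoid overflow
    A_diag = np.diag(U.T @ A @ U)         # <n|A|n> in the energy eigenbasis
    return float(w @ A_diag / w.sum())

H = np.array([[0.0, 0.5], [0.5, 1.0]])    # toy Hamiltonian
A = np.array([[1.0, 0.0], [0.0, -1.0]])   # toy observable
print(thermal_average(H, A, 1.0))
```

At β = 0 every state is weighted equally and the average reduces to Tr(A)/2; as β grows, the average approaches the ground-state expectation value.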
While we have discussed the constant energy ensemble (micro-canonical) we will soon encounter other ensembles, for example, constant pressure, constant temperature (canonical), grand canonical (where the system can exchange particles with the heat bath), etc.
The above discussion was formulated in terms of quantum states. Let us now take the classical limit of the Boltzmann formula. Suppose H = K + U, where K is the kinetic operator and U is the potential operator. For high enough temperatures we can write exp(-βH) ≈ exp(-βK) exp(-βU). (See the formula page.) One finds that the probability of being in a volume dp dr of phase space is proportional to exp(-βE), where E is the classical energy E = p^2/2m + V(R). To summarize, if a system is allowed to exchange energy and momentum with outside systems, its distribution will equal the canonical distribution. Quantum mechanics does introduce a new feature, an N! coming from the fact that the particles are often indistinguishable. Usually, but not always, the N! drops out in classical statistical mechanics.
The momentum part of the Maxwell-Boltzmann distribution (the Maxwell distribution) is just a "normal" Gaussian distribution. It is completely uncorrelated with the position part: if you know something about the positions, you have no knowledge about the momenta (and vice versa). The average kinetic energy is exactly (3/2) N k_B T (the law of equipartition) for a system of N particles, independent of the interactions, unless there are rigid constraints. After doing the momentum integrals, we are left with the configuration distribution and partition function.
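Equipartition is easy to check by sampling. In this sketch the particle number, temperature, and units (m = k_B = 1) are arbitrary choices of mine; each momentum component is an independent Gaussian, and the sampled average kinetic energy comes out at (3/2) k_B T per particle:

```python
import math
import random

# Sketch: draw momenta from the Maxwell (Gaussian) distribution and
# check the equipartition result <KE> = (3/2) N k_B T.
# Units with m = k_B = 1; N and T are arbitrary choices.
random.seed(0)                 # fixed seed so the estimate is reproducible
N, T = 10_000, 2.0
sigma = math.sqrt(T)           # each component p ~ Gaussian(0, m k_B T)

ke = sum(sum(random.gauss(0.0, sigma) ** 2 for _ in range(3)) / 2.0
         for _ in range(N))
print(ke / N, 1.5 * T)         # these agree to a fraction of a percent
```

Because the momenta factor out of the Boltzmann weight, this holds whatever the interaction potential is, which is why MD codes can estimate the temperature directly from the instantaneous kinetic energy.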
Time Average versus Ensemble Average
In experiments one typically makes measurements at a large number of successive times (t1, ..., tM), each of which reveals the current state of the system. Then ⟨A⟩ = Σ_i A_i P_i, where P_i = n_i/M and n_i is the number of times the system is found in state i out of M total observations. This is a time average over a single system, because the values of n_i were obtained by successive observations. We will model that by a classical simulation of a fixed number of particles with a given potential energy function V(R), isolated from any heat bath. Newton tells us the system will evolve according to F = -∇V = ma. We assign initial positions and velocities (implying an initial total energy and momentum). If we let it evolve, what will happen? Will it go to the canonical distribution?
First, a few constants of motion may be known: energy, momentum, number of particles, etc. These will not change, but in a system of more than 2 particles the number of degrees of freedom is greater than the number of constants of motion. Gibbs introduced the artifice of a large number of replicas of the system (i.e., an ensemble) and imagined them tracing out paths on the energy surface. Basically, we could follow A(t) for one system for infinite time and average, or just consider the value of A for each member of the ensemble and average. Under certain conditions (if the system is ergodic) the two averages should give the same value. Gibbs adopted the view from the outset that this was only a device to calculate the probable behavior of a system.
In other words, instead of starting from a single initial condition, we start from the canonical distribution of positions and momenta. By Liouville's theorem, it is easy to show that this distribution is preserved. What we are doing in molecular dynamics, since we do not have the heat bath which gets most physical systems into equilibrium, is to assume that we can interchange the average over initial conditions with the average over time. An MD simulation cannot run for infinite time, but hopefully an average over a large but finite number of steps M is satisfactory. Thus, ⟨A⟩ = (1/M) Σ_{t=1..M} A(t).
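A minimal MD-style sketch of such a time average. The system here is a 1D harmonic oscillator with m = k = 1 (my choice, not from the notes), integrated with the standard velocity Verlet scheme; the observable is the kinetic energy, whose time average should converge to half the conserved total energy (the virial theorem for this potential):

```python
# Sketch: the time average <A> = (1/M) sum_t A(t) over an MD trajectory.
# Toy system: 1D harmonic oscillator, V(x) = x^2/2, with m = k = 1.
def force(x):
    return -x                     # F = -dV/dx

x, v = 1.0, 0.0                   # initial condition: total energy E = 0.5
dt, M = 0.01, 100_000             # time step and number of MD steps
ke_sum = 0.0
for _ in range(M):
    v += 0.5 * dt * force(x)      # velocity Verlet: first half kick
    x += dt * v                   # drift
    v += 0.5 * dt * force(x)      # second half kick with updated force
    ke_sum += 0.5 * v * v         # accumulate the observable A(t)

print(ke_sum / M)                 # time-averaged kinetic energy, ~E/2 = 0.25
```

For this single oscillator the time average converges because the trajectory covers its whole energy surface; for many-particle systems that coverage is exactly the ergodicity assumption discussed above.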
Gibbs clearly recognized the relation of entropy to probability and observed that the most probable situation (equilibrium) corresponds to maximum entropy subject to the given constraints.
[ Last edited by sdlwwxb on 2005-12-15 at 17:30 ] |