
Thursday, May 14, 2015

StatMech (I) - Equiprobability Principle

In my statistical mechanics courses, we have always started from the very basic 'equiprobability principle', which states that
... in a state of thermal equilibrium, all the accessible microstates of the system are equally probable
It looks like a very innocent, almost trivial sentence, yet it is rarely pointed out that this is the very foundation of all equilibrium statistical mechanics, and I can assure you that it is far from trivial! I had a chance to tackle this notion this term while working on a computational project for my graduate Statistical Mechanics class, a 'Monte Carlo Simulation of Hard Spheres' focusing on the ergodicity of Boltzmann statistical mechanics and classical mechanics. I will try to share my 'enlightenment' in a couple of posts, starting with this one.

First of all, we should state that there is no complete axiomatic derivation of equilibrium statistical mechanics from the foundations of classical dynamics. We need additional inputs, such as the statement above, in order to get it. This is why the equiprobability principle plays such a significant role in statistical mechanics.

Consider an isolated, closed Hamiltonian system with no energy or particle exchange allowed with the surroundings, for example a gas in a closed container. If the dynamics has time-translational symmetry, we can characterize the system by the Hamiltonian function $H(q,p)$, where the $q$'s and $p$'s are the corresponding degrees of freedom (DOF) of the system. We can measure the value of $H$ only with some finite precision, say $\delta E$: \[H(q,p) = E + \delta E \tag{1}\]This specification of the energy of the system can be thought of as a constraint: once we specify $E$, we are constrained to move on an energy hypersurface (a shell in phase space) defined by $(1)$. In the absence of further information there is little one can do, except that our experience with generic classical systems with many DOF tells us that such a system is likely to be chaotic; it is likely to have positive Lyapunov exponents, and it wanders around so that the energy surface gets filled up (i.e. it is ergodic - more on that in the next post).
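To make this a bit more concrete, here is a minimal Python sketch of such a constrained wandering. It uses a hypothetical chain of coupled anharmonic oscillators (not the hard-sphere gas, and all parameter values are placeholders of mine): once the initial condition fixes $E$, a symplectic (velocity Verlet) integration moves around phase space while $H(q,p)$ stays within a tiny band around $E$, i.e. the trajectory never leaves the energy shell of eq. $(1)$.

```python
import random

# A minimal sketch (hypothetical oscillator chain, NOT the hard-sphere gas;
# all parameter values are placeholders). The point: the trajectory wanders,
# but H(q, p) stays pinned to the value E fixed by the initial condition.

N = 32        # degrees of freedom
dt = 0.01     # time step
beta = 0.5    # quartic coupling strength

def hamiltonian(q, p):
    kinetic = sum(pi * pi for pi in p) / 2
    potential = 0.0
    for i in range(N - 1):
        d = q[i + 1] - q[i]
        potential += d * d / 2 + beta * d ** 4 / 4
    return kinetic + potential

def forces(q):
    """F_i = -dV/dq_i for the chain potential above."""
    f = [0.0] * N
    for i in range(N - 1):
        d = q[i + 1] - q[i]
        g = d + beta * d ** 3
        f[i] += g
        f[i + 1] -= g
    return f

random.seed(0)
q = [0.0] * N
p = [random.uniform(-1, 1) for _ in range(N)]
E = hamiltonian(q, p)

f = forces(q)
max_drift = 0.0
for step in range(20000):
    p = [pi + 0.5 * dt * fi for pi, fi in zip(p, f)]   # half kick
    q = [qi + dt * pi for qi, pi in zip(q, p)]         # drift
    f = forces(q)
    p = [pi + 0.5 * dt * fi for pi, fi in zip(p, f)]   # half kick
    max_drift = max(max_drift, abs(hamiltonian(q, p) - E))

print(f"E = {E:.6f}, max |H - E| over the run = {max_drift:.2e}")
```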

A simple representation of the system's phase space. The full space in this case is 24-dimensional, so the collections of configuration and momentum axes are each depicted with a single axis for representational purposes (Source).

Due to this energy constraint, it is clear that the whole phase space is not accessible to the system. Our assumption is that any one portion of this energy shell is as likely to be occupied as any other. We can regard this statement as one of maximal ignorance: since we know nothing more about the system, we say that all outcomes are equally likely. Now we can state the fundamental postulate of equilibrium statistical mechanics again:
In a state of thermal equilibrium, all the accessible microstates of the system are equally probable.
This postulate actually suffices to derive all of equilibrium statistical mechanics; furthermore, thermodynamics comes out as a special case. Now let's go over the highlighted words in the definition one by one, starting with thermal equilibrium.

Thermal equilibrium is defined as the situation where time averages of macroscopic quantities are independent of time. In our gas-in-a-container example, if we look at the velocity of a particular gas molecule at an instant of time (a microscopic quantity), we don't expect it to be time-independent, since the molecule keeps colliding with other particles and with the walls. But the average velocity of all the particles (a macroscopic quantity) is zero for all time, since the velocities to the right are compensated by the ones to the left, and similarly for the other directions. Thus we say that the average velocity in thermal equilibrium is time-independent. Similarly, the average energy and the pressure are also time-independent macroscopic quantities.
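As a toy illustration of this microscopic/macroscopic distinction, here is a small sketch of my own (a hypothetical 'collision' model in which random pairs of equal-mass particles exchange momentum elastically, not the hard-sphere simulation itself). Starting far from equilibrium with all kinetic energy along $x$, a tagged particle's velocity keeps fluctuating forever, while the macroscopic average kinetic energy per particle along $x$ relaxes to its equipartition value and then stays put.

```python
import math
import random

# Toy 'collision' model: random pairs of equal-mass particles in 2D collide
# elastically, here implemented as a random rotation of their relative
# velocity, which conserves total momentum and total kinetic energy.
# Microscopic quantity: one tagged particle's v_x (never settles down).
# Macroscopic quantity: <E_x>, mean kinetic energy per particle along x
# (relaxes to half the energy per particle and then stays constant).

N = 1000
random.seed(1)
vel = [[random.choice([-1.0, 1.0]), 0.0] for _ in range(N)]   # all energy in x

def collide(v1, v2):
    """Elastic collision of two equal masses: rotate the relative velocity."""
    cmx, cmy = (v1[0] + v2[0]) / 2, (v1[1] + v2[1]) / 2
    rx, ry = v1[0] - cmx, v1[1] - cmy
    theta = random.uniform(0, 2 * math.pi)
    rx, ry = (rx * math.cos(theta) - ry * math.sin(theta),
              rx * math.sin(theta) + ry * math.cos(theta))
    return [cmx + rx, cmy + ry], [cmx - rx, cmy - ry]

for step in range(1, 20001):
    i, j = random.sample(range(N), 2)
    vel[i], vel[j] = collide(vel[i], vel[j])
    if step % 4000 == 0:
        ex = sum(v[0] ** 2 for v in vel) / (2 * N)   # mean KE per particle along x
        print(f"step {step:6d}  tagged v_x = {vel[0][0]:+.3f}  <E_x> = {ex:.3f}")
```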

Averages are taken with respect to some probability distribution. The only way a macroscopic quantity can be time-independent is for the probability distribution itself to be independent of time, because only then are all the moments of all the variables guaranteed to be independent of time.

The next term to be defined is an accessible microstate. A microstate is defined by specifying the state of each constituent of the system, i.e. all the coordinates $(q,p)$ of each particle. The accessible microstates are the ones allowed by the constraint that the total energy is $E$. If you increase $E$, more states become accessible; if you let $E \to 0$, almost all particles come to rest, so only a limited number of states remain accessible. (This collection of constraints defines the microcanonical ensemble.)
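To get a feel for how the number of accessible microstates depends on $E$, here is a tiny sketch using a discrete toy model of my own choosing (an 'Einstein solid' of $N$ oscillators sharing $E$ energy quanta, not the gas itself): the count grows steeply with $E$ and collapses to a single state as $E \to 0$.

```python
from math import comb

# Discrete toy analogue: a microstate assigns a non-negative number of energy
# quanta to each of N oscillators; the accessible microstates are those whose
# quanta add up to E. Their number is a simple binomial coefficient.

def omega(E, N):
    """Number of ways to distribute E indistinguishable quanta among N oscillators."""
    return comb(E + N - 1, N - 1)

N = 20
for E in (0, 1, 5, 20, 100):
    print(f"E = {E:3d} quanta  ->  Omega = {omega(E, N)}")
```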

Now comes the crucial part: equally probable. Before we define it, let us ask the question 'Can we derive the equiprobability from classical dynamics?'. Actually, we can derive its converse. By that I mean: if at any instant of time all the microstates of a system are equally probable, then we can show that they remain so for all time, i.e. the system is in equilibrium. But we are searching for the other way around: if we have thermal equilibrium, what is the probability distribution?

In classical dynamics, the phase space density $\rho$ of the system obeys the Liouville equation: \[\frac {\partial \rho}{\partial t} = \{H,\rho \} \tag{2}\] In equilibrium this expression is equal to zero, i.e. $\rho$ is independent of time: \[\frac {\partial \rho}{\partial t} = \{H,\rho \} = 0 \tag{3}\] Thus $\rho$ is a constant of the motion. Furthermore, eqn.$(3)$ suggests that $\rho$ must be a function of $H$ and possibly of other constants of the motion.
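As a quick consistency check of that last statement, any density that depends on the phase space coordinates only through $H$ is automatically stationary, since by the chain rule the Poisson bracket reduces to a bracket of $H$ with itself: \[\{H,\rho(H)\} = \sum_{i}\left(\frac{\partial H}{\partial q_i}\frac{\partial \rho}{\partial p_i} - \frac{\partial H}{\partial p_i}\frac{\partial \rho}{\partial q_i}\right) = \rho'(H)\sum_{i}\left(\frac{\partial H}{\partial q_i}\frac{\partial H}{\partial p_i} - \frac{\partial H}{\partial p_i}\frac{\partial H}{\partial q_i}\right) = 0 .\]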

Under the equal probability condition, all we know so far is that our closed, isolated system lies on the energy shell defined by $(1)$. With only this information in hand, our equilibrium phase space density $\rho_{eq}$ can only be a Dirac delta function of $H(q,p)$ peaked at the value $E$: \[\rho_{eq}=\rho_{eq}(H) = \delta(H(q,p) - E) \tag{4} \] If we take the energy shell on which our system lies and divide it into little patches, eqn.($4$) does not distinguish one patch from another. This shows that equal volumes of patches in phase space have equal probability, so our assumption of equal probability seems to be satisfied... but with one caveat! To describe a state we need to specify a point in phase space, and a point in a continuous space is a set of measure zero (in a continuum, the probability of finding any specific value is zero)!
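Incidentally, one way to make the 'equal patches' statement explicit (my notation, with $A$ a patch of the shell) is to write the probability of finding the system in $A$ as the ratio \[P(A) = \frac{\int_{A} \delta\big(H(q,p)-E\big)\, d^{3N}q\, d^{3N}p}{\int \delta\big(H(q,p)-E\big)\, d^{3N}q\, d^{3N}p},\] which depends only on how much of the shell's phase space volume the patch $A$ contains, not on where on the shell it sits.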

Coming back to the caveat, we need to make a further assumption, namely that we have a finite resolution, in order to assign a finite, equal probability to the patches in our phase space. We cannot pin down a point in phase space exactly; we don't have infinite precision available to us. So we divide our phase space into unit cells, and as long as our unit cell is finite, we are in good shape. The problem of having uncountably many points in phase space is tamed by discretizing, by introducing a resolution. We then define a microstate as an elementary volume element in phase space. But where does this finite resolution come from?

Nature comes to our rescue at this point: quantum mechanics tells us that, due to the uncertainty principle, we only have a resolution up to a constant times $h$, Planck's constant. This comes from the foundational fact that you cannot specify a coordinate $q$ and its conjugate momentum $p$ at the same time, so there is an inherent resolution in our $(q,p)$ phase space. We can therefore construct the whole theory of classical statistical mechanics pretending that there is a finite resolution in our phase space.
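Schematically, the uncertainty principle gives each conjugate pair a cell of area of order $h$, so for $N$ particles in three dimensions the elementary cell volume is \[\Delta q\, \Delta p \sim h \quad \Longrightarrow \quad \prod_{i=1}^{3N} \Delta q_i\, \Delta p_i \sim h^{3N},\] which is exactly the unit that will divide the phase space volume below.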

Finally, we can count the number of accessible microstates $\Omega (E)$: \[\Omega (E) = \frac{\mathcal{V}(E)}{h^{3N}}\tag{5}\]where $ \mathcal{V}(E)$ is the volume of the accessible phase space, divided into unit cells of volume $h^{3N}$ (one factor of $h$ for each $(q,p)$ pair). Since our probability density says that $H(q,p) = E$ and nothing more, each of these microstates carries as much weight as any other; hence they are equally probable, with probability \[\mathcal{P} = \frac{1}{\Omega (E)} \tag{6}\] Thus we have shown that the equiprobability of accessible microstates is the most crucial assumption of equilibrium statistical mechanics, and that it holds for a discrete phase space governed by the quantum mechanical restriction of finite resolution.
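Just to get a feel for the numbers in eq. $(5)$, here is a rough sketch of my own for a monatomic ideal gas. It uses the phase space volume of the region $H \le E$ as a stand-in for $\mathcal{V}(E)$ (harmless for large $N$, since the logarithm is dominated by the thin shell near $E$), deliberately omits Gibbs' $1/N!$ factor to match eq. $(5)$, and works entirely in logarithms because $\Omega$ itself is astronomically large; all the numerical inputs are approximate.

```python
from math import lgamma, log, pi

# Rough illustration for a monatomic ideal gas (no 1/N! Gibbs factor):
# phase-space volume with H <= E is V^N * (2*pi*m*E)^(3N/2) / Gamma(3N/2 + 1),
# and Omega is that volume divided by h^(3N), as in eq. (5).

h = 6.626e-34            # Planck's constant (J s)
kB = 1.381e-23           # Boltzmann's constant (J/K)
m = 6.6e-27              # mass of a helium atom (kg)
N = 6.022e23             # one mole of atoms
V = 0.0224               # container volume (m^3), roughly a mole of gas at STP
E = 1.5 * N * kB * 273.0 # equipartition guess for the total energy (J)

# ln Omega = ln[ V^N (2 pi m E)^(3N/2) / Gamma(3N/2 + 1) ] - 3N ln h
ln_omega = (N * log(V)
            + 1.5 * N * log(2 * pi * m * E)
            - lgamma(1.5 * N + 1)
            - 3 * N * log(h))

print(f"ln Omega ~ {ln_omega:.3e}")
print(f"S = kB ln Omega ~ {kB * ln_omega:.1f} J/K (without the 1/N! correction)")
```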

These notes have been composed with the help of the wonderful Classical Mechanics lectures of Prof. V. Balakrishnan on YouTube.

Saturday, April 18, 2015

Random Curiosities - I

There are always a lot of things I encounter that make me think I should write about them on the blog, but I've had no time to do so... In order to deal with this, and to keep some kind of 'interesting reading list' for myself and for other interested readers, I decided to share these 'random curiosities' in bulk. So here are this week's:

- The most significant and interesting thing that I've learnt this week was the so-called 'Benford's Law'. It is a counterintuitive law concerning the frequency distribution of the first digits of many kinds of data. I will definitely elaborate on it in one of my future posts, but there are a lot of great references where you can dive deep into it. For instance, this article from Plus Maths nails it down really nicely! If you only have 10 minutes, this video from Numberphile would work too..
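As a quick numerical taste (my own toy example, using the leading digits of the powers of 2, a classic data set spread over many orders of magnitude), the observed first-digit frequencies land right on Benford's prediction $\log_{10}(1+1/d)$:

```python
from collections import Counter
from math import log10

# Toy demonstration of Benford's law: leading digits of the powers of 2,
# compared with the predicted frequency log10(1 + 1/d).

first_digits = Counter(int(str(2 ** n)[0]) for n in range(1, 10001))

print("digit  observed  Benford")
for d in range(1, 10):
    observed = first_digits[d] / 10000
    benford = log10(1 + 1 / d)
    print(f"  {d}      {observed:.3f}    {benford:.3f}")
```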

- Moving on with statistics, there is a really curious article in Quanta Magazine called 'For Persi Diaconis’ Next Magic Trick'. It is about shuffling a deck of cards in order to make it 'random'. But the method in question is not ordinary riffle shuffling, for which it has already been proven that only seven shuffles are enough, but 'smooshing', which is basically spreading the cards out on a table, swishing them around with your hands, and then gathering them up. Stanford mathematician Persi Diaconis's analysis involves one of the fundamental ideas of statistical physics, namely mixing. Quite an inspiring read...
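For the curious, here is a small sketch of ordinary riffle shuffling under the Gilbert-Shannon-Reeds model (the model behind the 'seven shuffles' result; smooshing itself is much harder to simulate). As a crude proxy for randomness it only tracks where the original top card ends up, not the full permutation, and the trial counts are arbitrary choices of mine.

```python
import random

# GSR riffle shuffle sketch: binomial cut, then interleave by dropping the
# next card from a packet with probability proportional to its remaining size.
# We watch how the position of the original top card approaches uniformity.

def gsr_riffle(deck):
    n = len(deck)
    cut = sum(random.random() < 0.5 for _ in range(n))   # Binomial(n, 1/2) cut
    left, right = deck[:cut], deck[cut:]
    i = j = 0
    out = []
    while i < len(left) or j < len(right):
        n_left, n_right = len(left) - i, len(right) - j
        if random.random() < n_left / (n_left + n_right):
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out

def top_card_distribution(k, trials=20000, n=52):
    """Empirical distribution of the original top card's position after k shuffles."""
    counts = [0] * n
    for _ in range(trials):
        deck = list(range(n))
        for _ in range(k):
            deck = gsr_riffle(deck)
        counts[deck.index(0)] += 1
    return [c / trials for c in counts]

random.seed(2)
for k in (1, 4, 7, 10):
    dist = top_card_distribution(k)
    tv = 0.5 * sum(abs(p - 1 / 52) for p in dist)
    print(f"{k:2d} shuffle(s): distance of top card's position from uniform ~ {tv:.3f}")
```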

- One of my pastime activities, especially in the busy traffic of Istanbul, is listening to interesting podcasts on the way, and one of my favourites is ABC's 'Future Tense'. It is a kind of 'futuristic' program, but all the ideas are grounded in the present; it is not a sci-fi kind of future in that sense. Last week's program was themed 'Crowds and Motion', dealing with interesting problems and proposed solutions in urban planning, modelling crowds as particles, and the simulations related to it. There were two interesting models: one of pedestrians walking in open space while avoiding collisions, and the other the "Steffen Method" of airplane boarding.. Further details and the recorded audio are on the website.

- Web's first-class online science magazine Nautilus has 'Dominoes' as this month's theme, and the first week's installment includes a fascinating article called 'The Amazing, Autotuning Sandpile'. It is one of those 'simple to state but with jaw-dropping consequences' kinds of problems in statistical physics: a simple mathematical model of a sand pile creates very complex dynamical patterns and behaviour, namely avalanches. The article is accompanied by nice pictures and simulations..
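For a taste of the model itself, here is a minimal sketch of the Bak-Tang-Wiesenfeld sandpile on a small grid (the grid size, number of grains and the '50 topplings' cut are arbitrary choices of mine): sites topple when they hold four or more grains, and a single added grain can trigger avalanches of wildly different sizes.

```python
import random

# Minimal Bak-Tang-Wiesenfeld sandpile: a site with >= 4 grains topples,
# sending one grain to each neighbour; grains pushed past the edge are lost.

L = 21

def relax(grid):
    """Topple until stable; return the number of topplings (the avalanche size)."""
    topplings = 0
    unstable = True
    while unstable:
        unstable = False
        for i in range(L):
            for j in range(L):
                if grid[i][j] >= 4:
                    unstable = True
                    topplings += 1
                    grid[i][j] -= 4
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < L and 0 <= nj < L:   # off-grid grains are lost
                            grid[ni][nj] += 1
    return topplings

random.seed(3)
grid = [[0] * L for _ in range(L)]
sizes = []
for _ in range(5000):
    i, j = random.randrange(L), random.randrange(L)
    grid[i][j] += 1                    # drop one grain at a random site
    sizes.append(relax(grid))

big = sum(1 for s in sizes if s > 50)
print(f"avalanches with more than 50 topplings: {big} out of {len(sizes)} grains dropped")
```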

- On the biology side, another Quanta Mag. article released this week was 'How Structure Arose in the Primordial Soup', telling the story of the search for the origins of fundamental transitions, like the formation of the cell, the creation of the genetic code and the emergence of the energy-supplying metabolic functions, well before the last common ancestor. The main ideas resonate with Maynard Smith's wonderful book that I've been reading recently, The Origins of Life, and an interesting method is also proposed, involving tracing the amino-acid code through the Tree of Life. An inspiring read...

- On the academic tips and tricks side, I've read a very well-written, compact blog post on 'A few steps toward cleaner, better-organized code', something I desperately need more and more.

- And finally, something that seems like trivia but is one of the most complex things I've seen lately: solving a 17×17 Rubik's Cube. It is just beyond explanation...