SCIENCE AT THE SHINE DOME, Canberra, 30 April to 2 May 2003
Symposium: Nanoscience – where physics, chemistry and biology collide
Friday, 2 May 2003
Professor Denis Evans
Dean, Research School of Chemistry, Australian National University
Denis Evans is Dean, Research School of Chemistry, Australian National University. Prior to joining the Research School of Chemistry, he worked overseas at a number of laboratories, including the National Institute for Standards and Technology, USA, Cornell University, USA, and Oxford University, UK. He was a prime mover in introducing supercomputing to Australian science during the 1980s. Internationally he is recognised for his scientific work in statistical mechanics. He has won a number of national and international prizes for his contributions to nonequilibrium statistical mechanics. He is a Fellow of the Australian Academy of Science.
Thermodynamic limits to nanomachines
Prior to about 10 years ago I would have thought that there was not very much sensible that thermodynamics could say about extremely small systems. When I was an undergraduate, I was taught that one of the important things in thermodynamics is to look at large systems in the so-called thermodynamic limit, where intensive properties like temperature, pressure and so forth no longer depend on the number of particles or the mass of the total system that you are looking at.
As a summary of thermodynamics, most people would know that thermodynamics is a very, very general kind of subject. It is independent of the existence of atoms and molecules; indeed, it has even been applied to galactic formation. It is a very macroscopic subject. If ever there were a subject that shouldn’t apply to nanotechnology, it would be thermodynamics.
Thermodynamics relates energy to heat and temperature. One of the interesting ways in which to summarise thermodynamics would be to say that you cannot build a machine which violates energy conservation; that is, you cannot construct a perpetual motion machine of the first kind. Then there is a much more subtle statement that thermodynamics makes, that is much harder to come to grips with. You cannot construct a machine which converts ambient heat, the heat from your surroundings, into useful work. This would be called, if you could construct such a thing, a perpetual motion machine of the second kind.
I can illustrate what a perpetual motion machine of the second kind might look like.
Here we have a motorboat cruising around in the ocean. Sea water is not at absolute zero, so there is actually some energy content in the water.
So this boat sucks in warm sea water and extracts energy from the warm sea water to turn the propeller to propel the boat. But then, in order to conserve energy (the first law of thermodynamics), it chills the water that it takes in and throws ice cubes out the back. That is what thermodynamicists call a perpetual motion machine of the second kind.
These things can’t be constructed, and the US Patent Office routinely uses these two formulations of the laws of thermodynamics to throw out patent applications.
I am going to talk about a mathematical theorem that we discovered and first wrote down in 1993 (Evans, Cohen and Morriss), and first proved in 1994. It has been the subject of considerable mathematical investigation since then. It is called the Fluctuation Theorem.
It is basically a generalisation of the second law of thermodynamics to finite systems that are observed for finite times. One of the key quantities in thermodynamics is the so-called entropy production. The entropy production, roughly speaking (I am not being terribly rigorous here, so statistical mechanicians have to be gentle with me), is work, which is force times velocity, divided by the temperature that the system is exhausting its heat output to.
That is entropy production. It gets larger if the system size gets bigger. And one of the quantities that the Fluctuation Theorem is interested in is the time average of the entropy production. So you integrate the entropy production for a period of time, 0 to t, and divide by t to work out what the average rate of entropy production is.
Here is the equation. It is an extremely simple and extremely beautiful equation. What the Fluctuation Theorem says is that if you repeat an experiment (a finite number of particles observed for a finite amount of time) over and over again, and you construct a probability distribution of the time-averaged values of the entropy production and form a histogram, then you can talk about the probability that the time-averaged entropy production takes on a value A. Let’s just suppose that A is a positive number.
The Fluctuation Theorem states that the probability that the time averaged entropy production is equal to a value A, divided by the probability that it takes the value -A, is simply the exponential of At.
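The slide equation is not reproduced in the transcript; with the time-averaged entropy production written as $\bar{\Sigma}_t$, the statement just made reads, in standard notation:

```latex
\frac{P\!\left(\bar{\Sigma}_t = A\right)}{P\!\left(\bar{\Sigma}_t = -A\right)} = e^{A t}
```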
The founders of statistical mechanics, even the founders of thermodynamics, 100 or 150 years ago, knew that the second law was not, strictly speaking, valid for small systems. They knew that if you looked at just a small collection of particles, things like entropy would not always increase. Thermodynamics says that this probability ratio is +∞, but for a finite system, what this equation or the theorem says is that it is not +∞, it is the exponential of a positive number, because we assumed that A was positive. This number in the exponent here scales with the system size N and the observation time t. And this is what is completely new about this. Boltzmann, Ehrenfest, all the greats of statistical thermodynamics, knew that something like this equation would be true, but no-one knew how to quantify it.
Why is this important? If you deal with small systems (nanosystems and so forth) N is not infinity. In the thermodynamic limit this exponential becomes the exponential of N, where N becomes Avogadro’s number, 10²³ particles. It is essentially +∞. But the Fluctuation Theorem tells you what happens in finite systems for finite times, like nanosystems.
There is another thing this equation has also resolved, a problem that has been recognised for well over 100 years, although I don’t want to talk too much about it here. The laws of mechanics (it doesn’t matter whether you are doing quantum mechanics or classical mechanics) are all time reversible. The laws of thermodynamics are time irreversible. This thing, entropy production, is most likely to be positive, and this ratio here diverges to +∞. The Fluctuation Theorem senses a direction of time that is not present in dynamics, and so for over 100 years the question of how you get irreversible macroscopic behaviour from reversible equations of motion (whether it be quantum mechanics or classical mechanics makes no difference) has been unresolved. But it too is resolved by this same theorem.
There is another version of the theorem where you sum over all possible positive values of the time averaged entropy production and ask what is the probability that entropy production for a finite system for a finite time is positive, compared to negative. If you think about the mathematics, both of these quantities are greater than zero. And as the system has more and more particles, or the observation time gets longer, these quantities here diverge to +∞.
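The slide is again missing from the transcript; in the same notation, the integrated version just described can be written as (a reconstruction, not the slide itself):

```latex
\frac{P\!\left(\bar{\Sigma}_t < 0\right)}{P\!\left(\bar{\Sigma}_t > 0\right)}
  = \left\langle e^{-\bar{\Sigma}_t t} \right\rangle_{\bar{\Sigma}_t > 0}
```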
So we have got a proof of the second law of thermodynamics, but it also points out that in small systems, observed for short periods of time, you should be able to, as it were, violate the second law of thermodynamics. You can say there is no such thing as a violation of the second law of thermodynamics, because in a literal sense it only deals with infinite systems. Okay, if you want to be pedantic. But everyone knows what you mean when you say ‘violate it for a short time’. You mean that these ratios here are not +∞, but are actually finite numbers.
So we come to the experiment by Edith Sevick and Genmiao Wang. I got talking to Edith about her experimental apparatus and so forth, and we figured out that we could do a test of the Fluctuation Theorem, using her optical tweezers apparatus.
So what are optical tweezers? If you have a colloid particle, a little latex sphere of, say, 6 microns in diameter, in a watery medium, and if you suppose that the refractive index of that colloid particle is greater than that of the surrounding water, then according to the laws of electrodynamics, basically going back to Einstein, this particle will be attracted to a region where the intensity of light is greatest. Einstein was the first to point out that light can exert a force and, indeed, a pressure on material systems. The motion of a Brownian particle in an aqueous medium, the theory of Brownian motion, was also first elucidated by Einstein; so this is a double-Einstein type of experiment.
One of the things that Edith pointed out to me is that you can pull these colloid particles around in this aqueous medium by moving the laser beam. You move the focus of the laser beam around, and you can drag this little particle around with you just by light. Not by a spring, but by light.
If you focus the laser beam and you get the optics right, the force that the colloid particle experiences is that appropriate to a simple harmonic oscillator (first-year undergraduate physics). And so here is our harmonic force. Well, our spring constant is pretty small, and we are dealing with very, very weak forces here: Edith can resolve forces down to about 3 × 10⁻¹⁵ newtons.
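In symbols, the trap force described here is just Hooke’s law about the position of the beam focus, written here as $x_{\mathrm{trap}}$, with a very small stiffness $k$:

```latex
F(x) = -k\,(x - x_{\mathrm{trap}}), \qquad
V(x) = \tfrac{1}{2}\,k\,(x - x_{\mathrm{trap}})^{2}
```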
With a piezoelectric transducer it is possible to actually move the focus of the laser beam around the place, and it moves at the colossal velocity of about a micron per second, which is about a millionth of a kilometre per hour, if you use macroscopic units.
So we can do experiments on these colloid particles and pull them around, and we can test the second law of thermodynamics for these little suspended particles. Here is a picture of the apparatus. The actual focusing and so forth is done with a standard biological, commercial microscope. The laser beam comes in here and is reflected up through the sample, which is contained here. Here is a piezoelectric transducer; there is another one over on the other side to control the movement of the stage and so forth. And there is a detector over here, which processes the results.
Just before Christmas 2000, Genmiao Wang spent a lot of time in the lab (it wasn’t very automated at that time) and started repeating this experiment. You start off with the colloid particle still, and then you start to move the laser beam off at a millionth of a kilometre per hour to the right. You move it for 10 seconds, stop, and then repeat the experiment to the left. And you just go over and over.
He computed the entropy production, and the time average of the entropy production is the integral of the force times the velocity on the particle, divided by the temperature of the water. He produced histograms of this. In those days we couldn’t do an experiment on the Fluctuation Theorem directly; we could only handle the integrated form of the Fluctuation Theorem that I showed you as the second set of complicated equations.
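The analysis just described can be sketched numerically. The following is a minimal overdamped Langevin simulation of a bead dragged by a moving harmonic trap, in reduced units with illustrative parameters (not the experimental values). It accumulates the time-integrated entropy production per trajectory and checks the integrated form of the theorem, in which the fraction of "second-law-violating" trajectories is predicted by an average of exp(−Σ) over the entropy-producing ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative reduced units -- NOT the experimental parameters.
k = 1.0        # trap stiffness
gamma = 1.0    # Stokes drag coefficient
kBT = 1.0      # thermal energy
v = 0.5        # trap (laser focus) velocity
dt = 1e-3      # integration time step
steps = 2000   # trajectory length t = 2 in reduced units
ntraj = 5000   # number of repeated "experiments"

# Start each bead in equilibrium in the stationary trap.
x = rng.normal(0.0, np.sqrt(kBT / k), ntraj)
Sigma = np.zeros(ntraj)  # time-integrated entropy production per trajectory

for n in range(steps):
    xtrap = v * n * dt                 # trap moves off at constant velocity
    F = -k * (x - xtrap)               # harmonic (Hooke's law) trap force
    Sigma += v * F * dt / kBT          # entropy production rate: v.F / kBT
    # Euler-Maruyama step of the overdamped Langevin equation
    x += F * dt / gamma + np.sqrt(2 * kBT * dt / gamma) * rng.normal(size=ntraj)

frac_neg = np.mean(Sigma < 0)          # trajectories that consumed heat
ratio = frac_neg / np.mean(Sigma > 0)
prediction = np.mean(np.exp(-Sigma[Sigma > 0]))  # integrated FT prediction
print(f"mean entropy production: {Sigma.mean():.3f}")
print(f"P(neg)/P(pos) = {ratio:.3f}, predicted {prediction:.3f}")
```

With these parameters a substantial fraction of the two-second trajectories consume heat rather than dissipate it, and the two sides of the integrated relation agree within sampling error, mirroring the experimental test.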
So here is the probability ratio (sorry that the symbols have been changed from S to W) that the entropy production is negative compared to positive. Thermodynamics just says this probability ratio is identically zero. You can’t get negative entropy production. If the Fluctuation Theorem, or the integrated form of it, is correct, it should be equal to one of the expressions that I showed you.
So here are the two sets of data. If the Integrated Fluctuation Theorem is correct, these two datasets would be coincident with one another. And that, within statistical uncertainties, is the case.
One of the amazing things about this graph is: what is the time axis here? The time axis out here to the right is in seconds, not in milliseconds or microseconds. We can see observable examples of negative entropy production out to two seconds. Those are examples of that motor launch that I showed you: the thing is running backwards. You move the laser beam to the left, the particle doesn’t move with it, it goes to the right. Where does it get the energy to climb up the wall of the potential force field to enable it to move in the wrong direction? It gets it by extracting heat from the solvent, in violation of what you would call the second law if you assumed that it applied to such small systems.
When this work was published it attracted a certain amount of news. Edith Sevick had an amazing 10-day period: she gave birth to her second child, while she was in hospital she was doing an interview for a newspaper, and at the same time she was offered a permanent appointment at the Australian National University!
You may wonder why the Financial Times or the Wall Street Journal or the New York Times is interested in these sorts of things. One of the reasons (and I know this because of the interviews I did with the journalists) is that they were worried about the overhyping of some scientific developments. We were at the stage where the tech bubble had just burst, and they were interested in rigorous mathematical statements about the limits of nanotechnology. I will come back to that point.
More recently (just in the last few weeks, actually) we have been doing some different experiments. This is all first-year undergraduate physics again. If you take an optical trap with a given spring constant, you get a distribution of positions of particles in that harmonic force field. If the force field is harmonic, it is easy to see that the distribution is Gaussian. If you are not a statistician, don’t worry about it; it looks like that.
You then change the power of the laser. That changes the force constant for the optical trap. You can increase the power, and that will make a stiffer harmonic potential. But if you do this virtually instantaneously, there is no time for the distribution of positions to relax. Then you just sit there and wait, and watch this distribution relax through non-equilibrium states towards the final equilibrium distribution over here.
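That stiffness-jump protocol can also be sketched as an overdamped Langevin simulation, again in reduced units with made-up parameters: beads equilibrated in a weak trap are suddenly subjected to a stiffer one, and the position variance relaxes from kBT/k0 down to kBT/k1.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative reduced units -- NOT the experimental parameters.
k0, k1 = 0.5, 2.0   # trap stiffness before and after the laser-power jump
gamma = 1.0         # drag coefficient
kBT = 1.0           # thermal energy
dt = 1e-3
steps = 3000        # watch the relaxation for t = 3
npart = 5000

# Equilibrium (Gaussian) positions in the weak trap: variance kBT/k0.
x = rng.normal(0.0, np.sqrt(kBT / k0), npart)
var_start = x.var()

# Evolve under the suddenly stiffened trap k1.
for _ in range(steps):
    x += -k1 * x * dt / gamma + np.sqrt(2 * kBT * dt / gamma) * rng.normal(size=npart)

var_end = x.var()
print(f"variance: {var_start:.2f} -> {var_end:.2f} "
      f"(equilibrium values {kBT / k0:.2f} and {kBT / k1:.2f})")
```

The distribution stays Gaussian throughout; only its width relaxes, exponentially, towards the new equilibrium value.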
If you analyse the distribution mathematically, the quantity that corresponds to the entropy production involves the product of the difference in the spring constants and the difference in the squares of the positions. What we have been able to do in this case is to test the actual Fluctuation Theorem, the transient Fluctuation Theorem, itself, not just the integrated version: this is the logarithm of the ratio of the probability that the entropy production is +A to the probability that it is -A.
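In symbols (a reconstruction up to sign conventions, with $k_0$ the spring constant before the jump and $k_1$ after), the quantity just described is

```latex
\Omega_t = \frac{(k_0 - k_1)\left(x_t^{2} - x_0^{2}\right)}{2\,k_B T}
```

which has a positive average whichever way the stiffness is stepped.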
If the transient Fluctuation Theorem is true, then when this is plotted as a function of Omega you should just get a straight line of slope 1. And within the statistical uncertainties, that is what we have got. These results aren’t published at the moment but they will be shortly.
To summarise, one of the things that the Fluctuation Theorem does is that it generalises the laws of thermodynamics so that they apply to systems that are very, very small and observed for very small periods of time. Just like thermodynamics itself, it is a fundamental physical limitation that you cannot beat. It is just as fundamental, in some senses, as saying that you cannot beat the speed of light (you can’t produce particles that go faster than the speed of light) or that you cannot violate the Heisenberg uncertainty principle by simultaneously locating position and momentum with infinite precision. It is one of those kinds of laws.
Why the Financial Times, the New York Times and so forth were interested in it is that the Fluctuation Theorem places absolute limits on what you can do with nanotechnology, in a very precise mathematical way. It doesn’t say you cannot create nanomachines. It says, though, that if you make nanomachines smaller and smaller (if you take a Holden motorcar and you make it smaller and smaller and drive it down this micro-highway, then it becomes a nano-highway and then it gets smaller and smaller), keeping the temperature of the environment fixed, what will happen is that as it gets smaller and smaller it starts to move in a two-steps-forward, one-step-backward motion. And by backwards you mean the engine running backwards: sucking in heat, generating petrol and oxygen, and consuming CO₂ and so forth, for a short while. It really is the thing running backwards.
It doesn’t say you can’t do nanotechnology. You can. Biology has done it ever since life existed on this planet. But it is the way in which small engines and small machines and organelles themselves have to work: they have to obey this fundamental theorem of nature.
Questions/discussion
Question: This is not a question so much as a comment. The actual process of coagulation of two colloidal particles is, in a sense, similar, in that in a chemical reaction you ask how much energy is involved to get over the hump. And if you’ve got that much energy you will get there. But in the colloidal case, if you have a certain repulsion barrier, it turns out that if you had enough energy to be able to get over that barrier you would lose it in viscous drag and dissipation before you got through the barrier. So the particle has to actually creep up the hill and over, by diffusion in a field of force, and so it goes back more frequently than it goes forward. But it still gets there in the end. So it is rather like your nanomachine.
Question: The direction of increasing entropy gives us the direction of time. So where we have negative entropy production, are we dealing with time reversal?
DE: Yes. One of the things you can do (this takes a little bit of time to explain) is that you don’t just have to move the laser beam at a constant velocity; you could actually do something funny with it. You could move it for a while and then stop it. You could have a wave form for the motion. If the wave form has a definite parity under time reversal symmetry, either odd or even, you can actually bin the entropy production in terms of whether it is positive or negative and so forth, and the probabilities will be controlled by this theorem. But what you can then do is look at the time dependence of the entropy production in those bins that have +A and -A for their time-integrated entropy production. What happens is that they are time reversal maps of each other.
There is an old problem going back to Boltzmann, Ehrenfest and so forth: how do you reverse the motion of particles? Well, you map the coordinates to themselves and the momenta p to -p, and things will run backwards.
There is another way in which you can do it. You bin the results of these experiments in terms of entropy production and look at the time dependence of the average response in the +A bin and the -A bin. They will be time reversal maps of one another. And if one of them satisfies causality, the other one will look as though it is violating causality and so forth.
It would take a little bit longer to explain it, but that is actually how you derive the Fluctuation Theorem.
Question: In a one-dimensional electrical system, this sort of thing has been known for a long time. The fluctuation is called noise, and then there is the question: can you rectify noise, that is, extract all the times it is positive and separate them from the times it is negative? Feynman has a famous example of a micromachine in the form of a ratchet, where every time it fluctuates to the left you let it, and when it tries to fluctuate to the right you stop it. He shows how that cannot possibly, in the long term, work.
What is the analogue of the ratchet in your situation? Why can you not extract the times that you have positive entropy fluctuations from the times that you have negative entropy fluctuations, to make your ice cube machine work?
DE: What happens is that you always get an overwhelming preponderance of positive entropy production compared to negative entropy production. So on time average you can’t win. It is like going to the casino; you are always going to be cleaned up. You may be lucky for a while (in fact, it is the same kind of logic). What the Fluctuation Theorem does is to quantify your probabilities of losing at the casino as an ensemble average of the amount of time that you spend at the tables. Ultimately you are going to lose, and it just tells you what that probability of loss is ultimately going to be.