Monday, 12 October 2015

Thermodynamics, Entropy and the Arrow of Time

by Caitlin French

The Arrow of Time



Why does a pane of glass smash but never piece itself back together again? Why can an egg scramble but not unscramble? Why does an ice cube melt but never spontaneously freeze again? In our everyday macroscopic world, we experience time asymmetry. Time flows in only one direction – forwards – creating the arrow of time. However, physical laws at the microscopic level have time-reversal symmetry: it is theoretically possible for events to run both forwards and backwards. If you played a video of a pendulum swinging in a vacuum, you would not be able to tell whether it was running forwards or backwards.

Imagine you filmed a particle falling towards the ground, accelerating downwards due to gravity. If you then watched the film in reverse, you would see the particle decelerating as it moved upwards – which is perfectly possible, provided the particle is given an appropriate initial upward velocity. In the time-reversed scenario, that initial velocity would be supplied by the vibrating atoms of the ground, which absorbed the particle’s momentum when it landed in the original scenario, so momentum is conserved. Because those vibrating atoms also carry kinetic energy, energy is conserved in both scenarios as well. This is all theoretically possible. However, we don’t see these time-reversal effects in everyday life. Evidently, there is a conflict between time-reversible microscopic physics and the one-way time of the macroscopic world.
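To make this concrete, here is a minimal Python sketch (my own illustration, using a simple step-by-step integration and an arbitrary 10 m drop) of a particle falling under gravity. Re-running the motion with the final velocity reversed sends the particle back up to very nearly its starting height – exactly the time-reversed film described above:

# Illustrative sketch: free fall under gravity is symmetric under time reversal.
# Integrate a fall with simple (semi-implicit Euler) steps, then flip the final
# velocity and integrate again: the particle retraces its path.
g, dt, steps = 9.81, 0.001, 1000      # gravity (m/s^2), time step (s), step count

def integrate(x, v, n):
    """Advance height x (m) and velocity v (m/s) under gravity for n steps."""
    for _ in range(n):
        v -= g * dt
        x += v * dt
    return x, v

x_end, v_end = integrate(x=10.0, v=0.0, n=steps)   # forward run: drop from 10 m
x_back, _ = integrate(x_end, -v_end, steps)        # reversed run: flip velocity
print(round(x_end, 3), round(x_back, 3))           # x_back lands back near 10 m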


Thermodynamics


Thermodynamics is the mathematical physics of heat and energy, developed originally through the study of gases; the name comes from the Greek “therme” (heat) and “dynamis” (power). Its four laws define temperature, energy and entropy in thermodynamic systems.

The first use of steam to do work was a machine built by the Greek mathematician and engineer Hero of Alexandria. It worked by heating a water-filled sphere so that the water evaporated into steam, which escaped through tubes attached to the sphere and made it spin.

[Image: Hero of Alexandria’s steam engine. Source: https://www.grc.nasa.gov/www/k-12/TRC/Rockets/IMAGES/History1.gif]


Thermodynamics became extremely important in the Industrial Revolution as steam engines far more complex than this were developed by Savery, Newcomen, Watt and many others. Figures such as Clausius, Joule, Kelvin and Carnot then led the development of thermodynamics, driven by the task of maximising the efficiency of these steam engines.

Kinetic theory was developed by Bernoulli and later by Krönig and Clausius. Inspired by Clausius’ work, Maxwell and Boltzmann then derived a formula for the probability that a molecule will travel at a given speed – the Maxwell-Boltzmann distribution – built from the normal distribution (bell curve) followed by each velocity component. Boltzmann used kinetic theory to reinterpret thermodynamics, founding statistical mechanics and the statistical interpretation of entropy.
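As a purely illustrative sketch of the Maxwell-Boltzmann distribution, the short Python snippet below evaluates the speed distribution for a gas. The nitrogen molecular mass and room temperature used here are example values of my own, not figures from the article:

import numpy as np

k = 1.38e-23  # Boltzmann's constant in J/K

def maxwell_boltzmann(v, m, T):
    """Probability density f(v) for a molecule of mass m (kg) to have speed v (m/s)
    at temperature T (K): f(v) = 4*pi*(m/(2*pi*k*T))**1.5 * v**2 * exp(-m*v**2/(2*k*T))."""
    a = m / (2 * np.pi * k * T)
    return 4 * np.pi * a**1.5 * v**2 * np.exp(-m * v**2 / (2 * k * T))

# Example values: a nitrogen molecule (~4.7e-26 kg) at room temperature (300 K).
m_N2, T = 4.7e-26, 300.0
speeds = np.linspace(0.0, 1500.0, 4)   # a few sample speeds in m/s
print(maxwell_boltzmann(speeds, m_N2, T))
# The most probable speed sits near sqrt(2*k*T/m), roughly 420 m/s here.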


The Laws of Thermodynamics


0th law: If two systems are each in thermal equilibrium with a third system, they are in thermal equilibrium with each other. In other words, there is no net heat transfer between bodies at the same temperature; heat transfer is only possible between bodies at different temperatures. It is important to note that temperature is a property of a system and measures its state, whereas heat is a measure of energy transfer and measures changes. Also note that this is called the 0th law because it logically precedes the 1st law, but was historically recognised later.

1st law: A form of the law of conservation of energy: energy cannot be created or destroyed, only transferred. The change in internal energy of a system (∆U) is equal to the heat added to the system (Q) minus the work done by the system (W):

∆U = Q - W
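For example (with purely illustrative numbers): if a gas absorbs 500 J of heat while doing 200 J of work pushing a piston outwards, its internal energy rises by ∆U = 500 - 200 = 300 J.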

Entropy is a measure of the amount of energy unavailable to do work, or of the disorder of a system. The change in entropy of a closed system (∆S) is the amount of heat absorbed (∆Q) – strictly, heat absorbed reversibly – divided by the temperature (T) at which the heat transfer took place.

∆S = ∆Q/T
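For example (again with illustrative numbers): melting 1 g of ice at 273 K absorbs roughly 334 J of heat, so the entropy of the ice increases by about ∆S = 334/273 ≈ 1.2 J K⁻¹.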

2nd law: Going forward in time, the net entropy (disorder) of a closed system must stay the same or increase. There is a natural tendency towards disorder, because disorder can be achieved in many more ways than order. Note that, in systems which aren’t closed, entropy can decrease (for instance in planet formation) as long as the net entropy of the whole universe increases. Note that this is more a statistical principle than a fundamental law. This is summarised by:

∆S ≥ 0
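As a quick check with illustrative numbers: if 100 J of heat flows from a block at 400 K to a block at 300 K, the hot block's entropy falls by 100/400 = 0.25 J K⁻¹ while the cold block's rises by 100/300 ≈ 0.33 J K⁻¹, so the net entropy change is positive, just as the law demands. If the heat flowed the other way, the net entropy would fall – which is why it never happens spontaneously.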

3rd law: As temperature approaches absolute zero (0 Kelvin or -273.15 degrees Celsius), the entropy of a system approaches a constant value.

Boltzmann's statistical definition of entropy connects these laws to microstates: the entropy of a system (S) is equal to Boltzmann's constant (k = 1.38 x 10⁻²³ J K⁻¹) multiplied by the natural logarithm of the number of distinct microstates (W) that make up the overall macrostate.

S = k log W
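To see Boltzmann's formula in action, here is a tiny Python sketch with illustrative values of W (note that the "log" here is the natural logarithm):

import math

k = 1.38e-23  # Boltzmann's constant in J/K, as quoted above

def boltzmann_entropy(W):
    """Entropy S = k ln W of a macrostate made up of W microstates."""
    return k * math.log(W)

# Illustrative values: more microstates means more entropy.
for W in (1, 36, 720):
    print(W, boltzmann_entropy(W))
# W = 1 (a single, perfectly ordered arrangement) gives S = 0.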


The Consequences of the 2nd law


The 2nd law implies that perpetual motion (100% efficient) machines are impossible; instead only a maximum efficiency can be reached: the Carnot efficiency. It is also why, in a closed system, heat flows from hot to cold and not the other way around. Note that although refrigerators appear to break this rule, because they cool their contents to a lower temperature than the air around them, they are not isolated systems. A fridge requires input electrical energy: the coolant liquid is made to evaporate as it enters the fridge (its temperature falls, so it can extract heat from the contents) and is then compressed as it leaves (its temperature rises, so its heat can be released into the room). Finally, the arrow of time arises as a consequence of the 2nd law. The entropy of the universe will increase until it eventually faces a “heat death”, where entropy has reached a maximum value, everything is at the same temperature and no work can be done.
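As a quick illustration of the Carnot limit mentioned above (the reservoir temperatures are example values of my own):

def carnot_efficiency(t_hot, t_cold):
    """Maximum possible efficiency of a heat engine running between a hot and a
    cold reservoir, temperatures in kelvin: eta = 1 - T_cold / T_hot."""
    return 1.0 - t_cold / t_hot

# Example: steam at 450 K exhausting to surroundings at 300 K.
print(carnot_efficiency(450.0, 300.0))   # ~0.33, i.e. at most about 33% efficient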


Statistical Examples


Why should entropy increase? Probability dictates that, if there are more disordered states than ordered ones, it is more likely for something to be in a disordered state. Returning to our example of the particle falling towards the ground: if a tennis ball (made up of billions of atoms) were used instead of a single particle, the time-reversal of this scenario would seem far less plausible. This is because, for the tennis ball, there are vastly more microstates of the system – vastly more disordered possibilities. With a single particle, the time-reversed initial conditions are easy to implement, whereas the tennis ball scenario requires the very precise control of billions of atoms to get it to jump upwards. Using S = k log W, we can see that the greater the number of microstates (W), the greater the entropy (S).

Let's use another example, this time with six playing cards: 2, 3, 4, J, Q, K. If we split these into two piles, with the low-value cards (2, 3, 4) in one and the court cards (J, Q, K) in the other, we have an ordered (low-entropy) arrangement. However, if we shuffle both piles together it becomes more disordered (higher entropy) – for example Q42J3K. Relating this to Boltzmann's formula, there are 36 ways to arrange the cards as two such piles: 3! = 6 orderings for each pile, so 6 x 6 = 36 for both. However, there are 720 ways (6! = 1x2x3x4x5x6) to arrange all six cards in a single shuffled pile. The type of ordering (two piles or one) is analogous to the macrostate of the system, whilst the exact order is the microstate. The more ordered macrostate has 36 microstates, whereas the less ordered one has 720. Taking (natural) logarithms: log 36 = 3.58, but log 720 = 6.58. So, the more microstates there are, the less ordered (and the higher the entropy of) the macrostate.

Now considering the whole pack of cards, these effects become more obvious. If we split the 52 cards into a red pile and a black pile, there are (26!)² = 1.63 x 10⁵³ possible arrangements. Shuffling both piles together gives us 52! = 8.07 x 10⁶⁷ microstates. The logarithms are 122.52 and 156.36 respectively, and again the second is larger.
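These counts are easy to check with a few lines of Python; the sketch below simply reproduces the arithmetic of this section and feeds it through S = k log W:

import math

k = 1.38e-23  # Boltzmann's constant in J/K

def entropy(W):
    """S = k ln W for a macrostate with W microstates."""
    return k * math.log(W)

# Six cards: two sorted piles of three versus one shuffled pile of six.
two_piles, one_pile = math.factorial(3) ** 2, math.factorial(6)
print(two_piles, one_pile)                        # 36 and 720
print(math.log(two_piles), math.log(one_pile))    # 3.58 and 6.58

# Whole pack: a red/black split versus a full shuffle.
split, shuffled = math.factorial(26) ** 2, math.factorial(52)
print(f"{split:.2e}", f"{shuffled:.2e}")          # 1.63e+53 and 8.07e+67
print(math.log(split), math.log(shuffled))        # 122.52 and 156.36
print(entropy(shuffled) > entropy(split))         # more microstates, more entropy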

However, although extremely unlikely, it is still theoretically possible that a shuffled pile of cards could become ordered – there is nothing in the laws of physics which prevents this.




Entropy and the Big Bang


If entropy has been increasing ever since the Big Bang, then the universe must have started in a state of very low entropy. But from the viewpoint of entropy, such ordered states are rare. This makes the Big Bang the ultimate low-entropy, highly ordered event in need of explanation.

Interestingly, the statistical reasoning used to argue that entropy increases towards the future works just as well when applied towards the past. Not only is there a high probability of high entropy in the future; on purely statistical grounds we should also expect entropy to have been high in the past. This would suggest that the universe exists as a statistical fluctuation – a low-entropy fluctuation within a high-entropy background. Given enough time, a statistical fluctuation to a state of low entropy would be inevitable.



However, this idea is not very widely accepted. Sir Arthur Eddington dismissed it by pointing out that whilst low-entropy fluctuations are rare, large fluctuations are even rarer. So, either the whole universe is one enormous random fluctuation – which then resulted in intelligent humans developing, all before entropy could reach a maximum again – or just this very second is a far smaller statistical fluctuation. The more likely of these two possibilities is the latter, which would make us “Boltzmann Brains” that exist just long enough to perceive our own existence before disappearing again.

Many people appeal to inflation to explain the state of the early universe. From Hubble's discovery that space is expanding came the idea that the universe expanded from a singularity – the central concept of the Big Bang theory. But this still doesn't explain the initial low-entropy state of the singularity.

Maybe we should just accept a low-entropy beginning as a brute fact... Or maybe this low-entropy state is in need of a different explanation. Some scientists contemplate a time before the Big Bang, with quantum fluctuations producing new universes expanding out in both temporal directions. When you next open your fridge to get an egg, you could ask yourself why it exists in such an ordered state. How did the egg come to exist? The answer is that it is not a closed system: a chicken created the ordered egg, whilst the second law of thermodynamics was not violated as net entropy still increased. Extending this analogy to the universe, could there be some “universal chicken” or event before the Big Bang responsible for the low-entropy beginning and the arrow of time? The answer: we don’t know!


References

The Fabric of the Cosmos, Brian Greene
17 Equations that Changed the World, Ian Stewart
https://en.wikipedia.org/wiki/Laws_of_thermodynamics 
http://www.scientificamerican.com/article/2-futures-can-explain-time-s-mysterious-past/ 
http://www.exactlywhatistime.com/the-arrow-of-time/ 
http://preposterousuniverse.com/eternitytohere/faq.html
https://www.youtube.com/watch?v=rEr-t17m2Fo 
