Introduction
Two-cycle die
Fundamental Principle of Continuous Systems
Conservation of Distinctions, conservation of information – basic assumption
Trajectories do not Cross or Merge in an Adiabatic System
Momentum is not very important in E…
Liouville Theorem (53:00)
Ergodic Hypothesis of Phase Space
If a system is not Ergodic, there are additional conserved quantities
Phase Space Definition – Volume x Momentum
Energy (65:00)
The First Law of Thermodynamics is that energy is conserved.
The Second Law of Thermodynamics is that Entropy always increases.
Probability Distributions (67:00)
Consider a system with various probability states (75:00)
The number of possible states is a measure of ignorance
Entropy, an information-theoretic concept (80:00)
Examples of why Entropy is log M
Entropy is measured in bits (86:00)
The general definition of entropy (90:00)
Thermal Equilibrium for a system (99:00)
What is Temperature? (108:00)
Slogan: Apart from a factor of log 2, the temperature is the amount of energy needed to increase the entropy by one bit.
Appendix: Susskind Notes
1.2 The 1st Law of Thermodynamics: Energy is conserved
1.3 Entropy is an information-theoretic concept
Entropy is measured in bits
The general definition of entropy
1.4 Temperature
1.5 The 2nd Law of Thermodynamics: Entropy always increases
Statistical mechanics is often thought of as the theory of how atoms combine to form gases,
liquids, solids, and even plasmas and black body radiation. But it is both more and less than
that. Statistical mechanics is a useful tool in many areas of science in which a large number of
variables have to be dealt with using statistical methods. My son, who studies neural networks,
uses it. I have no doubt that some of the financial wizards at AIG and Lehman Brothers used it.
Saying that Stat Mech is the theory of gases is rather like saying calculus is the theory of
planetary orbits. Stat Mech is really a particular type of probability theory.
Coin flipping is a good place to start. The probabilities for heads (H) and tails (T) are both equal
to 1/2. Why do I say that? One answer is that the symmetry between H and T means their
probabilities are equal. Here is another example. Let's take a die (as in dice) and color the six
faces red, yellow, blue, green, orange, and purple (R, Y, B, G, O, P). The obvious cubic symmetry
of the die dictates that the probabilities are all equal to 1/6. But what if we don't have a symmetry
to rely on? How do we assign a priori probabilities?
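As a quick illustration (my own sketch, not part of the original notes): when symmetry is all we know, each face gets the same a priori probability, and a long run of simulated throws should agree with that assignment. The face labels and the simulation code are illustrative assumptions.

```python
# Illustrative sketch (not from the notes): a symmetric die assigns each
# face an a priori probability of 1/6; a long simulated run should agree.
import random
from collections import Counter

FACES = ["R", "Y", "B", "G", "O", "P"]

def throw_die(n, seed=0):
    """Simulate n throws of a symmetric six-faced die.

    Returns the empirical frequency of each face.
    """
    rng = random.Random(seed)
    counts = Counter(rng.choice(FACES) for _ in range(n))
    return {face: counts[face] / n for face in FACES}

freqs = throw_die(600_000)
# Each empirical frequency should sit close to the symmetry value 1/6.
assert all(abs(f - 1 / 6) < 0.01 for f in freqs.values())
```

With 600,000 throws the statistical fluctuation in each frequency is of order 0.0005, so the 1/6 assignment is easy to confirm; a weighted die, by contrast, offers no such shortcut.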
Suppose for example instead of the coloring scheme that I indicated above, I chose to color the
purple face red. Then there would be only five colors. Would the probability of throwing a given
color be 1/5? After all, if I just write (R, Y, B, G, O), the 5 names are just as symmetric as the
original 6 names. Nonsense, you say: the real symmetry is among the 6 faces, and that is so. But
what if there really is no obvious symmetry at all, for example if the die is weighted in some
unfair way?
In that case we would have to rely on a bunch of details such as the precise way the die was
thrown by the hand that threw it, the wind, maybe even the surface that the die lands on (can it
bounce?). As is often the case, we have to think of the system in question as part of a bigger
system. But what about the bigger system? How do we assign its probabilities?
Here is another idea that involves some dynamics. Suppose there is a law of motion (in this
example time is discrete) that takes a configuration and in the next instant replaces it by
another unique configuration. For example R→B, B→Y, Y→G, G→O, O→P, P→R. I can then ask
what fraction of the time the die spends in each configuration. The answer is 1/6. In fact
there are many possible laws for which the answer will be the same. For example, R→B, B→G,
G→P, P→O, O→Y, Y→R or R→Y, Y→P, P→G, G→O, O→B, B→R.
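The cyclic laws above can be checked directly. In this sketch (mine, not part of the notes), each law is written as a permutation of the six colors; because each one is a single 6-cycle, the die visits every configuration equally often, so each fraction comes out exactly 1/6.

```python
# Sketch (not from the notes): simulate the discrete deterministic laws
# above and measure the fraction of time spent in each configuration.
# Any law that is a single 6-cycle visits every color equally often.
from collections import Counter
from fractions import Fraction

COLORS = ["R", "Y", "B", "G", "O", "P"]

# The three example laws from the text, written as update dictionaries.
laws = [
    {"R": "B", "B": "Y", "Y": "G", "G": "O", "O": "P", "P": "R"},
    {"R": "B", "B": "G", "G": "P", "P": "O", "O": "Y", "Y": "R"},
    {"R": "Y", "Y": "P", "P": "G", "G": "O", "O": "B", "B": "R"},
]

def time_fractions(law, start="R", steps=6000):
    """Follow the law for `steps` ticks; return each color's visit fraction."""
    counts = Counter()
    state = start
    for _ in range(steps):
        counts[state] += 1
        state = law[state]
    return {c: Fraction(counts[c], steps) for c in COLORS}

for law in laws:
    fracs = time_fractions(law)
    # For a 6-cycle law, every color is occupied exactly 1/6 of the time.
    assert all(f == Fraction(1, 6) for f in fracs.values())
```

Since 6000 steps is an exact number of trips around the cycle, the fractions are exactly 1/6 regardless of the starting color, which is why `Fraction` rather than floating point is used here.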
[The body of the lecture notes — the sections listed in the contents, from "Two-cycle die" through "What is Temperature?" — is garbled beyond recovery in this extraction.]
[Figure: a system "A" in contact with a Heat Bath "B". The accompanying discussion is garbled beyond recovery.]
Appendix: Susskind Notes
[The appendix (sections 1.2–1.5 listed in the contents) is garbled beyond recovery in this extraction.]