Definition of Entropy
---------------------
Entropy is a measure of our ignorance regarding the exact state of a
system. Consider the example of a gas in a bottle. The smaller the
bottle, the less uncertainty there is about where the molecules will be,
and so the entropy will be lower. On the other hand, for a bottle of fixed
size the entropy will increase if heat is applied, because there will then
be a wider range of possible velocities for the molecules in the gas. If
the bottle is then opened in outer space, the gas will escape and diffuse
into the vacuum. There are now progressively more and more states that
the gas can be in, and so the entropy increases. Consequently, as the
system evolves in time it becomes more and more disordered, and our
ignorance regarding its exact state increases.
Definition:
We know the state of the system lies somewhere in a region defined by the
distribution P_i:
P_i
|
| . .
| . .
| . .
| . .
| . .
-------------------------------------------- State, i
Then,
S = -Σ_i P_i lnP_i
This is the average value of -lnP_i.
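As a quick numerical illustration, here is a minimal Python sketch (the
distribution values are invented for illustration) that evaluates S
directly from its definition:

    import math

    # A made-up distribution over 5 states (values must sum to 1)
    P = [0.1, 0.2, 0.4, 0.2, 0.1]

    # S = -Σ_i P_i ln P_i, i.e. the average value of -ln P_i
    S = -sum(p * math.log(p) for p in P)
    print(S)  # ≈ 1.47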
Consider a specific case where all the states have the same P_i:
P_i
|
|
| ----------
| | |
| | |
--------------------------- State, i
<-- M -->
S = -Σ_i P_i lnP_i
S = -Σ_i (1/M)ln(1/M)
S = -M(1/M)ln(1/M)
S = -ln(1/M)
S = lnM
Thus, in this case, S is just the logarithm of the number of states in the system.
Consider n coins, each with 2 possible states. The number of equally
likely states of the whole set is M = 2^n, so
S = lnM = nln(2)
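Both results can be checked numerically; a minimal sketch (M and n are
chosen arbitrarily):

    import math

    M = 1024                    # number of equally likely states
    P = [1.0 / M] * M
    S = -sum(p * math.log(p) for p in P)
    print(S, math.log(M))       # both ≈ 6.93

    n = 10                      # n coins give M = 2**n states
    print(n * math.log(2))      # ≈ 6.93, the same entropy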
Entropy
--------
The following discussion refers to distinguishable particles. Two particles
can be considered distinguishable if their separation is large compared
to their de Broglie wavelength.
For example, the condition of distinguishability is met by molecules in an
ideal gas under ordinary conditions. On the other hand, two electrons
in the first shell of an atom are inherently indistinguishable because of the
large overlap of their wavefunctions.
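To make the criterion concrete, here is a rough order-of-magnitude sketch
(the gas parameters are assumed values for nitrogen at roughly room
temperature and atmospheric pressure) comparing the thermal de Broglie
wavelength with the mean intermolecular spacing:

    import math

    h  = 6.626e-34    # Planck constant, J s
    kB = 1.381e-23    # Boltzmann constant, J/K
    m  = 4.65e-26     # assumed mass of an N2 molecule, kg
    T  = 300.0        # assumed temperature, K
    n  = 2.5e25       # assumed number density at ~1 atm, m^-3

    lam = h / math.sqrt(2 * math.pi * m * kB * T)  # thermal de Broglie wavelength
    d = n ** (-1.0 / 3.0)                          # mean intermolecular spacing

    print(lam)  # ~2e-11 m
    print(d)    # ~3e-9 m; d >> lam, so the molecules are distinguishable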
Consider N particles. Let n_i = the number of particles in the i-th energy state, E_i.
Energy
|
+
+
+<- i-th state with energy E_i
+
+
+
+
|
The number of possible ways to fit the particles into the
available states is called the multiplicity function.
The multiplicity function for the whole system, W, is the multinomial
coefficient:
W = N!/(Π_i n_i!) ... 1.
Consider the following example of how 6 particles can be distributed
amongst 10 energy states (empty states contribute a factor 0! = 1 and can
be omitted). For three sample occupation patterns, the number of ways
each can be realized is:
6!/(5!1!) = 6
6!/(3!2!1!) = 60
6!/(2!2!1!1!) = 180
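These counts are just multinomial coefficients, W = N!/(Π_i n_i!); a short
sketch verifying them (the helper name multiplicity is ours):

    from math import factorial

    def multiplicity(occupations):
        # Number of ways N distinguishable particles can realize
        # the occupation numbers n_i: W = N!/(n_1! n_2! ...)
        N = sum(occupations)
        ways = factorial(N)
        for n in occupations:
            ways //= factorial(n)
        return ways

    print(multiplicity([5, 1]))        # 6
    print(multiplicity([3, 2, 1]))     # 60
    print(multiplicity([2, 2, 1, 1]))  # 180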
We can simplify 1. above by taking the log of both sides,
lnW = lnN! - Σ_i ln(n_i!)
Now apply Stirling's approximation, i.e. lnA! ≈ AlnA - A.
So
lnW = NlnN - N - Σ_i n_i ln(n_i) + Σ_i n_i
Now,
P_i = n_i/N = probability that a given particle is in state i
Therefore, substituting n_i = NP_i and using Σ_i n_i = N,
lnW = NlnN - N - Σ_i NP_i ln(NP_i) + N
    = NlnN - (Σ_i NP_i lnN + Σ_i NP_i lnP_i)
    = NlnN - (NlnN + NΣ_i P_i lnP_i)
    = -NΣ_i P_i lnP_i
    = NS
where
S = -Σ_i P_i lnP_i
This is the definition of ENTROPY.
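Since the derivation leans on Stirling's approximation, lnW = NS holds
only approximately at finite N. A sketch (occupation numbers invented)
comparing the exact lnW against -NΣ_i P_i lnP_i:

    import math

    n = [500, 300, 150, 50]     # made-up occupation numbers n_i
    N = sum(n)                  # N = 1000 particles
    P = [ni / N for ni in n]

    # Exact ln W = ln N! - Σ_i ln n_i!, using lgamma(k + 1) = ln k!
    lnW = math.lgamma(N + 1) - sum(math.lgamma(ni + 1) for ni in n)

    # Stirling result: ln W ≈ N S
    NS = -N * sum(p * math.log(p) for p in P)

    print(lnW, NS)  # ≈ 1132 vs ≈ 1142; the relative gap shrinks as N grows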
Boltzmann Distribution
------------------------
From the discussion of entropy it was shown that:
lnW = NS
    = -NΣ_i P_i lnP_i where W is the multiplicity function.
Now maximize lnW by maximizing S. This can be achieved by using
the method of Lagrange multipliers under the constraints Σ_i P_i = 1
and E = Σ_i P_i E_i (the average energy). Form the function (the overall
factor of N has been dropped, since it can be absorbed into the
multipliers α and β):
L = -Σ_i P_i lnP_i - αΣ_i P_i - βΣ_i E_i P_i
Now, for maximization, the derivative with respect to each P_i must
equal 0, so each term in the sums can be treated separately.
∂L/∂P_i = -lnP_i - 1 - α - βE_i = 0
∴ lnP_i = -(1 + α) - βE_i
Or
P_i = exp(-(1 + α))exp(-βE_i)
Write this as
P_i = (1/Z)exp(-βE_i)
This is the BOLTZMANN DISTRIBUTION.
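A minimal sketch evaluating the Boltzmann distribution for a few invented
energy levels, checking the constraint Σ_i P_i = 1 and that lower energies
are more probable:

    import math

    E = [0.0, 1.0, 2.0, 3.0]  # made-up energy levels E_i
    beta = 1.0                # assumed inverse temperature

    Z = sum(math.exp(-beta * Ei) for Ei in E)
    P = [math.exp(-beta * Ei) / Z for Ei in E]

    print(P)       # ≈ [0.644, 0.237, 0.087, 0.032], decreasing with energy
    print(sum(P))  # ≈ 1.0, as the constraint requires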
The Partition Function
-----------------------
From the discussion of the Boltzmann distribution it was shown that:
P_i = (1/Z)exp(-βE_i)
Now impose the above constraints Σ_i P_i = 1 and E = Σ_i P_i E_i. Thus,
Σ_i P_i = 1:
(1/Z)Σ_i e^(-βE_i) = 1
∴ Z = Σ_i e^(-βE_i)
This is the PARTITION FUNCTION.
E = Σ_i P_i E_i:
Take the derivative of Z w.r.t. β:
∂Z/∂β = -Σ_i E_i exp(-βE_i)
Divide both sides by -Z:
(-1/Z)∂Z/∂β = Σ_i E_i exp(-βE_i)/Z
The RHS contains the Boltzmann distribution P_i = (1/Z)exp(-βE_i):
(-1/Z)∂Z/∂β = Σ_i P_i E_i
            = E
∴ E = (-1/Z)∂Z/∂β
This can be written as:
E = -∂lnZ/∂β
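The identity E = -∂lnZ/∂β can be checked numerically with a central
finite difference (same invented levels as before):

    import math

    levels = [0.0, 1.0, 2.0, 3.0]  # made-up energy levels E_i
    beta = 1.0

    def lnZ(b):
        return math.log(sum(math.exp(-b * Ei) for Ei in levels))

    # Average energy directly: E = Σ_i P_i E_i
    Z = math.exp(lnZ(beta))
    E_direct = sum(Ei * math.exp(-beta * Ei) / Z for Ei in levels)

    # Average energy via -dlnZ/dβ, central difference
    h = 1e-6
    E_deriv = -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)

    print(E_direct, E_deriv)  # both ≈ 0.507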
Entropy Revisited
-----------------
S = -Σ_i P_i lnP_i and P_i = (1/Z)exp(-βE_i)
Therefore S can be written as:
S = -Σ_i (1/Z)e^(-βE_i) lnP_i
Now,
lnP_i = -βE_i - lnZ. Therefore,
S = -Σ_i (1/Z)e^(-βE_i)(-βE_i - lnZ)
  = Σ_i (1/Z)e^(-βE_i)(βE_i + lnZ)
  = βΣ_i (1/Z)e^(-βE_i)E_i + lnZ(1/Z)Σ_i e^(-βE_i)
  = βΣ_i P_i E_i + lnZ(1/Z)Z
  = βE + lnZ
Now,
dS = βdE + Edβ + (∂lnZ/∂β)dβ
but ∂lnZ/∂β = -E so
dS = βdE
or,
β = dS/dE
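Continuing the same invented example, a check that the direct entropy
-Σ_i P_i lnP_i agrees with βE + lnZ:

    import math

    levels = [0.0, 1.0, 2.0, 3.0]  # made-up energy levels E_i
    beta = 1.0

    Z = sum(math.exp(-beta * Ei) for Ei in levels)
    P = [math.exp(-beta * Ei) / Z for Ei in levels]
    E_avg = sum(p * Ei for p, Ei in zip(P, levels))

    S_direct = -sum(p * math.log(p) for p in P)
    S_formula = beta * E_avg + math.log(Z)

    print(S_direct, S_formula)  # both ≈ 0.948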
Helmholtz Free Energy
---------------------
Temperature is defined as:
T = (1/k_B)dE/dS
Equivalently, since dE = k_B T dS, the quantity k_B T is the energy that
must be added to the system to increase the (dimensionless) entropy S
by one unit.
Alternatively,
dS/dE = 1/(k_B T)
By comparison with β = dS/dE above,
β = 1/(k_B T)
Now from before, S = βE + lnZ. Therefore,
S = E/(k_B T) + lnZ
Or
E - k_B T S = A = -k_B T lnZ
This is the HELMHOLTZ FREE ENERGY.
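A final sketch (same invented levels, working in units where k_B = 1 so
that T = 1/β) checking that E - k_B T S and -k_B T lnZ agree:

    import math

    levels = [0.0, 1.0, 2.0, 3.0]  # made-up energy levels E_i
    beta = 1.0
    T = 1.0 / beta                 # units with k_B = 1

    Z = sum(math.exp(-beta * Ei) for Ei in levels)
    P = [math.exp(-beta * Ei) / Z for Ei in levels]
    E_avg = sum(p * Ei for p, Ei in zip(P, levels))
    S = -sum(p * math.log(p) for p in P)

    A_def = E_avg - T * S          # A = E - k_B T S
    A_formula = -T * math.log(Z)   # A = -k_B T ln Z

    print(A_def, A_formula)        # both ≈ -0.440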
Summarizing:
P_i = (1/Z)exp(-βE_i)
Z = Σ_i e^(-βE_i)
E = -∂lnZ/∂β
k_B T = 1/β
S = βE + lnZ
A = -k_B T lnZ