Definition of Entropy

Entropy is a measure of our ignorance regarding the exact state of a
system. Consider the example of a gas in a bottle. The smaller the
bottle, the less uncertainty there is about where the molecules will be,
and so the entropy will be lower. On the other hand, for a bottle of
fixed size the entropy will increase if heat is applied, because there
will then be a wider range of possible velocities for the molecules in
the gas. If the bottle is opened in outer space, the gas will escape
and diffuse into the vacuum; there are then progressively more states
that the gas can be in, and so the entropy increases. Consequently,
as the system evolves in time it becomes more and more disordered,
and our ignorance regarding its exact state increases.
Definition:
We know the state of the system is somewhere in the region defined by the distribution P_{i}.
[Figure: a distribution P_{i} plotted against the state index, i]
Then,
S = -Σ_{i}P_{i}lnP_{i}
This is the average value of -lnP_{i}.
Consider a specific case where all the states have the same P_{i}:
[Figure: a flat distribution, P_{i} = 1/M, over M states, plotted against the state index, i]
S = -Σ_{i}P_{i}lnP_{i}
S = -Σ_{i}(1/M)ln(1/M)
S = -M(1/M)ln(1/M)
S = -ln(1/M)
S = lnM
Thus, in this case, S is just the log of the number of states in the system.
Consider n coins. Each coin has 2 states, so the number of possible states of the whole set is:
N = 2^{n}
S = lnN = nln(2)
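As a quick numerical check, here is a short Python sketch (the values of M and n are arbitrary choices for illustration) verifying that M equally likely states give S = lnM, and that n coins give S = nln2:

```python
import math

# M equally likely states: S = -sum P_i ln P_i reduces to ln M
M = 8  # arbitrary choice for illustration
P = [1.0 / M] * M
S = -sum(p * math.log(p) for p in P)
print(S, math.log(M))  # both equal ln 8

# n coins: N = 2^n states, so S = ln N = n ln 2
n = 3  # arbitrary choice
print(math.log(2 ** n), n * math.log(2))
```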
Entropy

The following discussion refers to distinguishable particles. Two particles
can be considered distinguishable if their separation is large compared
to their de Broglie wavelengths.
For example, the condition of distinguishability is met by molecules in an
ideal gas under ordinary conditions. On the other hand, two electrons
in the first shell of an atom are inherently indistinguishable because of the
large overlap of their wavefunctions.
Consider N particles. Let n_{i} = number of particles in the ith energy state, E_{i}
[Figure: a ladder of energy levels, with particles (+) occupying them; the ith level has energy E_{i}]

The number of possible ways to fit the particles into the available
states is called the multiplicity function. The multiplicity function for
the whole system, W, divides N! by the number of rearrangements within
each energy level E_{i}:
W = N!/Π_{i}(n_{i}!) ... 1.
Consider the following example of how 6 particles can be distributed amongst
10 energy states. For three sample distributions, the number of ways each
can be realized is given by:
6!/(5!1!) = 6
6!/(3!2!1!) = 60
6!/(2!2!1!1!) = 180
(States with zero occupancy contribute factors of 0! = 1 and are omitted.)
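These counts can be reproduced directly. The sketch below defines a hypothetical helper, multiplicity, that evaluates N!/Π_{i}(n_{i}!) for a list of occupation numbers:

```python
from math import factorial

def multiplicity(occupations):
    """W = N!/prod(n_i!) for distinguishable particles,
    where N = sum(n_i)."""
    N = sum(occupations)
    W = factorial(N)
    for n in occupations:
        W //= factorial(n)
    return W

# The three distributions of 6 particles from the text:
print(multiplicity([5, 1]))        # 6!/(5!1!)     = 6
print(multiplicity([3, 2, 1]))     # 6!/(3!2!1!)   = 60
print(multiplicity([2, 2, 1, 1]))  # 6!/(2!2!1!1!) = 180
```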
We can simplify 1. above by taking the log of both sides,
lnW = lnN! - Σ_{i}ln(n_{i}!)
Now apply Stirling's approximation, i.e. lnA! = AlnA - A
So
lnW = NlnN - N - Σ_{i}n_{i}ln(n_{i}) + Σ_{i}n_{i}
Now,
P_{i} = n_{i}/N = probability that a given particle is in state i
Therefore,
lnW = NlnN - N - Σ_{i}NP_{i}ln(NP_{i}) + N
= NlnN - (Σ_{i}NP_{i}lnN + Σ_{i}NP_{i}lnP_{i})
= NlnN - (NlnN + NΣ_{i}P_{i}lnP_{i})
= -NΣ_{i}P_{i}lnP_{i}
= NS
where
S = -Σ_{i}P_{i}lnP_{i}
This is the definition of ENTROPY.
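The derivation via Stirling's approximation can be tested numerically for a large N. The occupation numbers below are an arbitrary illustration; lgamma(n + 1) = ln(n!) is used to avoid enormous factorials:

```python
import math
from math import lgamma

# Arbitrary occupation numbers for N = 10000 particles
occupations = [5000, 3000, 2000]
N = sum(occupations)

# Exact ln W = ln N! - sum ln(n_i!), via lgamma(n + 1) = ln(n!)
lnW = lgamma(N + 1) - sum(lgamma(n + 1) for n in occupations)

# S = -sum P_i ln P_i with P_i = n_i / N
S = -sum((n / N) * math.log(n / N) for n in occupations)

print(lnW / N, S)  # nearly equal for large N, as Stirling predicts
```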
Boltzmann Distribution

From the discussion of entropy it was shown that:
lnW = NS
= -NΣ_{i}P_{i}lnP_{i} where W is the multiplicity function.
Now maximize lnW by maximizing S. This can be achieved by using
the method of Lagrange multipliers under the constraints Σ_{i}P_{i} = 1
and E = Σ_{i}P_{i}E_{i} (the average energy). Dropping the overall factor
of N, the function to maximize is:
f = -Σ_{i}P_{i}lnP_{i} - αΣ_{i}P_{i} - βΣ_{i}E_{i}P_{i}
For a maximum, the derivative with respect to each P_{i} must equal 0,
so each term of the Σ can be treated independently. Thus,
∂f/∂P_{i} = -lnP_{i} - 1 - α - βE_{i} = 0
∴ lnP_{i} = -(1 + α) - βE_{i}
Or
P_{i} = exp(-(1 + α))exp(-βE_{i})
Write this as:
P_{i} = (1/Z)exp(-βE_{i})
This is the BOLTZMANN DISTRIBUTION.
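A minimal sketch of the distribution for a hand-picked set of energy levels (the levels and β are arbitrary illustrative values, in units where the E_{i} are dimensionless):

```python
import math

def boltzmann(energies, beta):
    """P_i = exp(-beta * E_i) / Z over the given energy levels."""
    weights = [math.exp(-beta * E) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

E_levels = [0.0, 1.0, 2.0]  # arbitrary illustrative levels
P = boltzmann(E_levels, beta=1.0)
print(P, sum(P))  # probabilities sum to 1; lower energies are more likely
```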
The Partition Function

From the discussion of the Boltzmann distribution it was shown that:
P_{i} = (1/Z)exp(-βE_{i})
Now impose the constraints Σ_{i}P_{i} = 1 and E = Σ_{i}P_{i}E_{i}. Thus,
Σ_{i}P_{i} = 1:
(1/Z)Σ_{i}e^{-βE_{i}} = 1
∴ Z = Σ_{i}e^{-βE_{i}}
This is the PARTITION FUNCTION
E = Σ_{i}P_{i}E_{i}:
Take the derivative of Z w.r.t. β:
∂Z/∂β = -Σ_{i}E_{i}exp(-βE_{i})
Divide both sides by Z:
(1/Z)∂Z/∂β = -Σ_{i}E_{i}exp(-βE_{i})/Z
The RHS contains the Boltzmann distribution P_{i} = (1/Z)exp(-βE_{i}):
(1/Z)∂Z/∂β = -Σ_{i}P_{i}E_{i}
= -E
∴ E = -(1/Z)∂Z/∂β
This can be written as:
E = -∂lnZ/∂β
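The identity E = -∂lnZ/∂β can be checked against a central-difference numerical derivative (the levels and β below are arbitrary illustrative values):

```python
import math

def lnZ(beta, energies):
    """ln Z = ln sum_i exp(-beta * E_i)."""
    return math.log(sum(math.exp(-beta * E) for E in energies))

def avg_energy(beta, energies):
    """E = sum P_i E_i with P_i = exp(-beta * E_i) / Z."""
    Z = sum(math.exp(-beta * E) for E in energies)
    return sum(E * math.exp(-beta * E) for E in energies) / Z

energies = [0.0, 1.0, 3.0]  # arbitrary illustrative levels
beta, h = 0.7, 1e-6
# Central difference approximation to -d(ln Z)/d(beta)
numeric = -(lnZ(beta + h, energies) - lnZ(beta - h, energies)) / (2 * h)
print(numeric, avg_energy(beta, energies))  # the two agree closely
```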
Entropy Revisited

S = -Σ_{i}P_{i}lnP_{i} and P_{i} = (1/Z)exp(-βE_{i})
Therefore S can be written as:
S = -Σ_{i}(1/Z)e^{-βE_{i}}lnP_{i}
Now,
lnP_{i} = -βE_{i} - lnZ. Therefore,
S = -Σ_{i}(1/Z)e^{-βE_{i}}(-βE_{i} - lnZ)
= Σ_{i}(1/Z)e^{-βE_{i}}(βE_{i} + lnZ)
= βΣ_{i}(1/Z)e^{-βE_{i}}E_{i} + lnZ(1/Z)Σ_{i}e^{-βE_{i}}
= βΣ_{i}P_{i}E_{i} + lnZ(1/Z)Z
= βE + lnZ
Now,
dS = βdE + Edβ + (∂lnZ/∂β)dβ
but ∂lnZ/∂β = -E, so
dS = βdE
or,
β = dS/dE
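The result S = βE + lnZ can be verified by computing both sides independently for an arbitrary set of levels:

```python
import math

energies = [0.0, 0.5, 1.5, 2.0]  # arbitrary illustrative levels
beta = 1.3
weights = [math.exp(-beta * E) for E in energies]
Z = sum(weights)
P = [w / Z for w in weights]

S_direct = -sum(p * math.log(p) for p in P)      # S = -sum P_i ln P_i
E_avg = sum(p * E for p, E in zip(P, energies))  # E = sum P_i E_i
S_identity = beta * E_avg + math.log(Z)          # S = beta*E + ln Z
print(S_direct, S_identity)  # identical up to rounding
```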
Helmholtz Free Energy

Temperature is defined as:
T = (1/k_{B})dE/dS
That is, the temperature, T, measures the change in E that causes the
thermodynamic entropy, k_{B}S, to change by 1 J/K.
Alternatively,
dS/dE = 1/(k_{B}T)
By comparison with β = dS/dE,
β = 1/(k_{B}T)
Now from before, S = βE + lnZ. Therefore,
S = E/(k_{B}T) + lnZ
Or
E - k_{B}TS = A = -k_{B}TlnZ
This is the HELMHOLTZ FREE ENERGY.
Summarizing:
P_{i} = (1/Z)exp(-βE_{i})
Z = Σ_{i}e^{-βE_{i}}
E = -∂lnZ/∂β
k_{B}T = 1/β
S = βE + lnZ
A = -k_{B}TlnZ
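As a closing sketch, the summary formulas can be exercised together on a two-level system (a standard textbook example; the level spacing and temperature below are arbitrary, and units are chosen so that k_{B} = 1):

```python
import math

eps, T = 1.0, 2.0   # level spacing and temperature (k_B = 1 units)
beta = 1.0 / T

Z = 1.0 + math.exp(-beta * eps)            # Z = sum_i exp(-beta * E_i)
P = [1.0 / Z, math.exp(-beta * eps) / Z]   # Boltzmann probabilities
E = P[1] * eps                             # E = sum_i P_i E_i (ground level at 0)
S = beta * E + math.log(Z)                 # S = beta*E + ln Z
A = -T * math.log(Z)                       # A = -k_B T ln Z
print(Z, E, S, A)                          # A also equals E - T*S
```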