Gabriele Perfetto
Institut für Theoretische Physik,
Tübingen, DE
17 February 2022
gperfetto@sissa.it
gabriele.perfetto@uni-tuebingen.de
Entropy, Information, and Physics
Entropy in physics
Ice
Water
Vapor
Order in terms of Entropy?
Rudolf Clausius 1865
Entropy (more in general)
Order in terms of entropy?
Low
Medium
High
Which ball do we pick?
High Knowledge
Medium Knowledge
Low Knowledge
Low Entropy
Medium Entropy
High Entropy
Formula for the Entropy???
“With great entropy comes low knowledge” (paraphrased).
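A minimal Python sketch of the knowledge-to-entropy correspondence, anticipating the Shannon formula H = -Σ p log₂ p introduced later in the talk. The three ball distributions below are illustrative choices, not the ones from the slide:

```python
import math

# Hypothetical colour distributions for the three boxes (our choice, not the slide's):
# a certain outcome, a skewed mix, and a uniform mix over four colours.
boxes = {
    "high knowledge":   [1.0, 0.0, 0.0, 0.0],      # outcome is certain
    "medium knowledge": [0.7, 0.1, 0.1, 0.1],      # one colour dominates
    "low knowledge":    [0.25, 0.25, 0.25, 0.25],  # all colours equally likely
}

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping p = 0 terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

for name, probs in boxes.items():
    print(f"{name}: H = {shannon_entropy(probs):.3f} bits")
```

The ordering comes out as on the slide: high knowledge gives the lowest entropy (0 bits) and low knowledge the highest (2 bits for a uniform choice among four colours).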
Game
Which is the best box to play with?
Best
Medium
Worst
What is the probability of winning in each of the three games?
[Game 1: the winning probability worked out as a product of the probabilities at each draw]
“Probability theory is nothing but common sense reduced to calculation” Laplace
[Game 2: the same product of per-draw probabilities]
[Game 3: the same product of per-draw probabilities]
Probability of winning: in each game, a product of the per-draw probabilities.
Products bad
Sums good
QUIZ
Which function to use?
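The function the quiz points to is the logarithm: since log(a·b·c) = log a + log b + log c, it turns awkward products into convenient sums. A short Python sketch with made-up probabilities (powers of 1/2, so the numbers come out exact):

```python
import math

# Probabilities of three independent draws (illustrative values, not from the talk)
probs = [0.5, 0.25, 0.125]

# Probability of winning all three: a product, which shrinks fast
p_win = math.prod(probs)

# Taking logs turns the product into a sum: log2(a*b*c) = log2 a + log2 b + log2 c
log_product = math.log2(p_win)
sum_of_logs = sum(math.log2(p) for p in probs)

print(p_win)        # 0.015625
print(log_product)  # -6.0
print(sum_of_logs)  # -6.0
```

This is exactly why the entropy formula carries a logarithm: independent events multiply probabilities, but their information contents simply add.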
Entropy
Entropy in Information theory
“I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. [...] Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’”
Conversation between Claude Shannon and John von Neumann about what to name the uncertainty measure in Shannon's theory of communication
Claude Shannon
(bits)
John von Neumann
Entropy in Statistical mechanics
Ludwig Boltzmann
1875
Statistical mechanics: derive thermodynamic properties of macroscopic systems starting from the laws of the microscopic constituents using probabilistic methods.
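Boltzmann's celebrated formula S = k_B ln W connects the entropy of a macroscopic system to the number W of microstates compatible with it. A small sketch; the two-state-particle toy count below is our own illustrative choice, not from the slides:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value since 2019)

def boltzmann_entropy(n_microstates):
    """Boltzmann's formula S = k_B * ln(W): entropy from counting microstates."""
    return K_B * math.log(n_microstates)

# Toy example: N independent two-state particles have W = 2**N equally
# likely microstates, so S = N * k_B * ln(2) grows linearly with N.
for n in (1, 10, 100):
    w = 2 ** n
    print(f"N = {n:3d}: W = 2^{n}, S = {boltzmann_entropy(w):.3e} J/K")
```

A single microstate (W = 1) gives S = 0: perfect knowledge of the configuration means zero entropy, in line with the ball-box picture above.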
Part 2: Exercises, together!
Physics is 20% inspiration and 80% perspiration
1) Compute the binary entropy
H(p) = -p log2(p) - (1 - p) log2(1 - p)
and plot it as a function of p. Interpret and discuss the result.
2) Determine the entropy of the sum obtained when a pair of fair dice is rolled.
3) Binary encoding: compute the entropy and compare it with the average length L of the codewords. Can you find a more efficient, shorter representation? Discuss the result.
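For the dice exercise, a quick numerical check is easy to sketch in Python, assuming "entropy" here means the Shannon entropy in bits, consistent with the rest of the talk:

```python
from collections import Counter
from itertools import product
import math

# Distribution of the sum of two fair dice: 36 equally likely pairs,
# sums from 2 to 12 with multiplicities 1, 2, 3, 4, 5, 6, 5, 4, 3, 2, 1.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
probs = [c / 36 for c in counts.values()]

# Shannon entropy of the sum, in bits
H = -sum(p * math.log2(p) for p in probs)
print(f"H(sum of two dice) = {H:.4f} bits")
```

Note that H is strictly smaller than log2(11) ≈ 3.46 bits, the entropy of a uniform distribution over the 11 possible sums, because the sums near 7 are more likely.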
Take-home message
Entropy
The choice is yours!!!
Statistical Physics
Thermodynamics
Information theory
Biophysics
Computer science
Economics