1 of 19

Gabriele Perfetto

Institut für Theoretische Physik,

Tübingen, DE

17 February 2022

gperfetto@sissa.it

gabriele.perfetto@uni-tuebingen.de

Entropy, Information and Physics

2 of 19

Entropy in physics

Ice

Water

Vapor

Order in terms of entropy?

3 of 19

Entropy in physics

Ice

Water

Rudolf Clausius 1865
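Presumably shown alongside Clausius is his 1865 thermodynamic definition of entropy; the formula is standard, though its presence on the slide is an assumption:

% Clausius (1865): entropy change under reversible heat exchange at temperature T
\[
  \mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T}
\]

Melting and evaporation both absorb heat, so the entropy ordering is ice < water < vapor.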

4 of 19

Entropy (more generally)

Order in terms of entropy?

Low

Medium

High

5 of 19

Which ball do we pick?

High Knowledge ↔ Low Entropy

Medium Knowledge ↔ Medium Entropy

Low Knowledge ↔ High Entropy

Formula for the Entropy???

“With great entropy comes low knowledge” (paraphrased).

6 of 19

Game

7 of 19

Which is the best box to play with?

Best

Medium

Worst

What is the probability of winning in each of the three games?
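As a worked example of the kind of calculation the question asks for (the actual box contents were shown as images and are lost here, so the numbers below are purely hypothetical): with a box of 5 balls of which 3 are winning, three independent draws with replacement win with probability

\[
  P(\text{win}) = \frac{3}{5} \times \frac{3}{5} \times \frac{3}{5}
                = \left(\frac{3}{5}\right)^{3} = \frac{27}{125} \approx 0.22 .
\]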

8 of 19

Game

[Equation shown as images: the winning probability is the product of the three single-draw probabilities, $p_1 \times p_2 \times p_3 = P(\text{win})$]

“Probability theory is nothing but common sense reduced to calculation” (Laplace)

9 of 19

Game

[Equation shown as images: the product of single-draw probabilities, computed for another of the three boxes]

10 of 19

Game

[Equation shown as images: the product of single-draw probabilities for the remaining box]

11 of 19

Probability of winning

12 of 19

Products

[Equation shown as images: a product of many probability factors]

Products bad

Sums good

QUIZ

Which function to use?
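The function the quiz points to is the logarithm: it turns awkward products into friendly sums,

\[
  \log(p_1 \, p_2 \, p_3) = \log p_1 + \log p_2 + \log p_3 ,
\]

which is exactly the property the entropy formula on the next slide exploits.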

13 of 19

Entropy
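The formula this slide builds up to did not survive extraction; the standard Shannon entropy of a probability distribution $(p_1, \dots, p_n)$, measured in bits, is

\[
  H = -\sum_{i=1}^{n} p_i \log_2 p_i .
\]

It vanishes when one outcome is certain (full knowledge) and reaches its maximum, $\log_2 n$, when all outcomes are equally likely (lowest knowledge).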

14 of 19

Entropy in Information theory

“I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. [...] Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’”

Conversation between Claude Shannon and John von Neumann regarding what name to give to the attenuation in phone-line signals

Claude Shannon

John von Neumann

[Formula shown as an image: Shannon entropy, measured in bits]

15 of 19

Entropy in Statistical mechanics

Ludwig Boltzmann

1877

Statistical mechanics: derive the thermodynamic properties of macroscopic systems from the laws governing their microscopic constituents, using probabilistic methods.
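The relation presumably displayed here is Boltzmann's celebrated formula (the one engraved on his tombstone), where $k_B$ is Boltzmann's constant and $W$ counts the microscopic configurations compatible with the macroscopic state:

\[
  S = k_B \ln W .
\]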

16 of 19

17 of 19

Part 2: Exercises, together!

Physics is 20% inspiration and 80% perspiration

18 of 19

1) Compute the entropy

\[
  H(p) = -p \log_2 p - (1-p)\log_2(1-p)
\]

and plot it as a function of p. Interpret and discuss the result. (A short plotting sketch follows after this list.)

2) Determine the entropy of the sum obtained when a pair of fair dice is rolled.

3) Binary encoding [the table of symbols and codewords was shown as an image]: compute the entropy and compare it with the average length L of the codewords. Can you find a more efficient (shorter) representation? Discuss the result.
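A minimal Python sketch for exercises 1 and 2, assuming the binary-entropy reading of exercise 1 above (variable names and plotting choices are mine, not from the slides):

import numpy as np
import matplotlib.pyplot as plt
from collections import Counter
from itertools import product

# Exercise 1: binary entropy H(p) = -p log2 p - (1-p) log2 (1-p)
p = np.linspace(1e-9, 1 - 1e-9, 1000)   # avoid log2(0) at the endpoints
H = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
plt.plot(p, H)
plt.xlabel("p")
plt.ylabel("H(p) [bits]")
plt.title("Maximal at p = 1/2: the least predictable coin")
plt.show()

# Exercise 2: entropy of the sum of two fair dice.
# Count how often each sum 2..12 occurs among the 36 equally likely rolls.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
probs = np.array(list(counts.values())) / 36
H_dice = -np.sum(probs * np.log2(probs))
print(f"H(sum of two dice) = {H_dice:.4f} bits")   # about 3.27 bits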

19 of 19

Take-home message

Entropy

The choice is yours!!!

Statistical Physics

Thermodynamics

Information theory

Biophysics

Computer science

Economics