Radio announcer Bob Enyart takes me to task for not distinguishing between "heat entropy" and "information entropy" in my *American Journal of Physics* article "Entropy and Evolution". Any knowledgeable person could just look at the equations in my paper and see that I mean "thermodynamic/statistical mechanical entropy".

The word "entropy", like most words, has many meanings, and the meaning in use is determined from context. If I say "Run away from danger", you don't think "A run is a small stream, so I must follow a small stream away from danger".

Here I want to present some of the other meanings of the word "entropy", to emphasize that it would have been silly to say that I'm not talking about each of them:

- information entropy
- topological entropy
- Kolmogorov entropy
- Kolmogorov-Sinai entropy
- metric entropy
- Gibbs entropy
- Boltzmann entropy
- Tsallis entropy
- von Neumann entropy
- Shannon entropy
- Rényi entropy
- volume entropy
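In fact, some of these are closely related: Shannon's information entropy and the Gibbs entropy of statistical mechanics have the same mathematical form, differing only by Boltzmann's constant and a change of logarithm base. A minimal sketch (the toy four-state distribution is my own illustration, not from the paper):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (exact, by SI definition)

def shannon_entropy(probs):
    """Shannon (information) entropy in bits: H = -sum p log2 p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs (statistical-mechanical) entropy in J/K: S = -k_B sum p ln p."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# A toy probability distribution over four microstates
p = [0.5, 0.25, 0.125, 0.125]
H = shannon_entropy(p)  # 1.75 bits
S = gibbs_entropy(p)    # the same sum, rescaled: S = k_B * ln(2) * H
```

Same symbols on the page, entirely different physical dimensions: one is measured in bits, the other in joules per kelvin. That is exactly why the equations, not the bare word "entropy", settle which quantity is meant.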

If I had spent so much time talking about what I'm *not* going to talk about, the paper would have been quite long indeed!
