"I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"
In order to contemplate the definition of entropy, and perhaps gain an advantage in debates about it, I recommend this article about classical thermodynamics. In section 3 a 'one-dimensional classical analogue of Clausius' thermodynamic entropy' is constructed; the construction dates back to Helmholtz. (The properties of the one-dimensional classical gas were previously discussed on this blog here.)
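For readers curious what such a one-dimensional analogue looks like, here is a rough sketch of the Helmholtz-style construction in my own notation (not necessarily the article's): take a particle of mass \(m\) bouncing in a confining potential \(U(x;V)\) that depends on an external parameter \(V\), and define the entropy from the phase-space area enclosed by the periodic orbit at energy \(E\).

```latex
% Phase-space area enclosed by the orbit at energy E, with turning
% points x_1, x_2, and the entropy defined from its logarithm:
\Phi(E,V) \;=\; \oint p\,\mathrm{d}x
          \;=\; 2\int_{x_1}^{x_2}\!\sqrt{2m\,\bigl(E-U(x;V)\bigr)}\;\mathrm{d}x,
\qquad
S(E,V) \;=\; k_B \ln \Phi(E,V).

% Since dx/dt = p/m, one finds \partial\Phi/\partial E = \oint (m/p)\,dx = \tau,
% the period of the orbit, so the temperature conjugate to this entropy is
\frac{1}{T} \;=\; \frac{\partial S}{\partial E}
            \;=\; \frac{k_B\,\tau}{\Phi}
\quad\Longrightarrow\quad
k_B T \;=\; \frac{\Phi}{\tau}
      \;=\; \frac{1}{\tau}\oint \frac{p^2}{m}\,\mathrm{d}t
      \;=\; 2\,\overline{K}.
```

That is, \(k_B T\) comes out as twice the time-averaged kinetic energy \(\overline{K}\), and one can check along the same lines that \(\mathrm{d}E = T\,\mathrm{d}S - P\,\mathrm{d}V\) holds with \(P\) the time-averaged force conjugate to \(V\), so this mechanical entropy really does behave like Clausius' thermodynamic one.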
And then there is a pdf file available of the textbook 'Entropy, Order Parameters and Complexity'.
I found both the article and the book on Cosma's list of interesting links.