Shannon's information theory and foundations of mathematics
pratt at cs.stanford.edu
Sat Jul 9 23:20:32 EDT 2022
Dennis Hamilton wrote,
"I find it more interesting that the idea of someone's worldview being
something having Shannon entropy goes unchallenged."
Well before Shannon, linguists such as Jean-Baptiste Estoup, Felix
Auerbach, and George Zipf had noticed an inverse relationship between
length of words in any given language and frequency of their usage that has
come to be called Zipf's law.
Much more recently, evolutionary ecologists studying animal behavior have
noticed a similar phenomenon in non-human communication, for example in the
paper by evolutionary ecologists Arik Kershenbaum, Vlad Demartsev, et al. titled
"Shannon entropy as a robust estimator of Zipf's Law in animal vocal
communication repertoires".
Zipf's law itself is somewhat vaguely stated, whereas Shannon entropy is
well-defined. This paper claims a good match for the latter with some
animal vocal repertoires.
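To make the contrast concrete, here is a minimal sketch (my own illustration,
not from the paper) of how well-defined Shannon entropy is: for a repertoire
whose rank-frequency distribution follows Zipf's law with exponent s, the
entropy is a single definite number, computed directly from the probabilities.

```python
import math

def zipf_probs(n, s=1.0):
    # Zipf's law: the r-th most frequent item has probability proportional
    # to 1 / r**s. Normalize over the n ranks to get a distribution.
    weights = [1.0 / r**s for r in range(1, n + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def shannon_entropy(probs):
    # Shannon entropy H = -sum(p * log2(p)), measured in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Entropy of an idealized 1000-word Zipfian repertoire: one exact number,
# strictly below the log2(1000) bits of a uniform distribution.
H = shannon_entropy(zipf_probs(1000))
```

Zipf's "law" is only a statement about the shape of a fit, whereas the entropy
of any given distribution is a single unambiguous quantity.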
This observation that Shannon entropy arises naturally without human
intervention (other than the experiment itself making the observation)
makes it quite plausible that humans themselves would intuitively associate
less probable events with more information.
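That intuition is exactly Shannon's notion of surprisal: the information
content of an event with probability p is -log2(p) bits, so halving the
probability adds one bit. A two-line sketch:

```python
import math

def surprisal_bits(p):
    # Shannon's information content of an event with probability p:
    # rarer events carry more bits.
    return -math.log2(p)

# A coin flip (p = 1/2) carries 1 bit; an event with p = 1/1024 carries
# 10 bits -- the less probable the event, the more information it conveys.
one_bit = surprisal_bits(0.5)       # -> 1.0
ten_bits = surprisal_bits(1 / 1024)  # -> 10.0
```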
And it doesn't even need to be as "robust" a match as with Shannon
entropy. Any reasonable estimator for Zipf's law will show that effect,
which is all that my original point, namely that disinformation can carry
more information than the conventional wisdom does, really needs.
This of course depends on the audience failing to recognize that the
disinformation is actually false, because if they succeed they're likely to
reject it. Disinformation succeeds when it is believed, which is the
situation my point is about. Surprising truths are more interesting than
expected truths, and
the less critical the audience the more likely they are to believe them.