Shannon's information theory and foundations of mathematics

Vaughan Pratt pratt at cs.stanford.edu
Sun Jun 26 11:27:33 EDT 2022


Vladik Kreinovich proposes that disinformation about a proposition P
increases Shannon entropy, thereby decreasing audience A's information
about the state of the system.  He illustrates this with (essentially) the
example in which ~P is the only other proposition and A's probability p of P
is decreased from 1 to 0.5, raising the system entropy from zero to 1 bit
and implying that A has lost 1 bit of information.
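
(For definiteness, write H(p) for the Shannon entropy, in bits, of the
two-outcome distribution (p, 1-p), with the usual convention 0 log 0 = 0:

    H(p) = -p log_2 p - (1-p) log_2 (1-p).

Then H(1) = 0 and H(1/2) = 1 bit, the one-bit increase just described.)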

The difficulty I have with that proposal is that when p is reduced from 1
to 0 the entropy remains zero, since entropy vanishes whenever A is certain,
whether of P or of ~P.  The implication is that A has lost no information,
and hence that the disinformation has conveyed no information, contrary to
our intuition about disinformation.  All it has done is change A's mind
about P from true to false, i.e. convince A of ~P.
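
The point is easy to check numerically.  A minimal sketch in Python (H is
just the binary entropy written out above, nothing more):

    from math import log2

    def H(p):
        """Shannon entropy, in bits, of the two-outcome distribution (p, 1-p)."""
        return sum(-q * log2(q) for q in (p, 1 - p) if q > 0)

    print(H(1.0))   # 0.0 bits: A is certain of P
    print(H(0.5))   # 1.0 bit:  p lowered to 0.5, entropy rises by one bit
    print(H(0.0))   # 0.0 bits: p lowered to 0, entropy unchanged at zero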

This would seem to suggest that it is unsound to use Shannon entropy to
measure the information conveyed when probabilities are changed arbitrarily,
subject only to their sum being 1.  Entropy in that scenario seems to
measure certainty or confidence rather than information.

Vaughan Pratt