Shannon's information theory and foundations of mathematics

X.Y. Newberry newberryxy at
Wed Jul 13 00:50:09 EDT 2022

The Shannon information of a message m is defined as

H(m) = log_2(1/p(m))

where p(m) is the probability of m's occurrence within a fixed set of
potential messages. For example, given 256 equiprobable messages, each
message carries the familiar 8 bits:

           log_2(256) = 8
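
The calculation above can be sketched in a few lines of Python; the
function name `self_information` is my own label, not standard
terminology from the post:

```python
import math

def self_information(p):
    """Shannon self-information, in bits, of a message with probability p."""
    return math.log2(1 / p)

# 256 equiprobable messages: each has probability 1/256
print(self_information(1 / 256))  # → 8.0
```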

The messages could be anything: measurements of a physical quantity, images,
or indeed syntactically correct sentences of an interpreted language, say
English. You could call the meaning of these sentences 'information', but
it is NOT Shannon information.

The Shannon information of disinformation cannot be negative. Even if the
propositions coded by the sentences are false, they still had to be coded
in a positive number of bits. In other words, regardless of whether your
true knowledge of a subject matter increases or decreases, a positive
number of bits had to be pushed through the communication channel.
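
The non-negativity claim follows directly from the formula: any message
actually transmitted has probability 0 < p <= 1, so log_2(1/p) >= 0
whatever the message's truth value. A minimal check (again using a
`self_information` helper of my own naming):

```python
import math

def self_information(p):
    """Shannon self-information, in bits, of a message with probability p."""
    return math.log2(1 / p)

# True or false, a transmitted message has 0 < p <= 1, so its
# information content is never negative.
for p in (1.0, 0.5, 0.01):
    assert self_information(p) >= 0
```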

X.Y. Newberry

*There are two ways to be fooled. One is to believe what isn't true; the
other is to refuse to believe what is true.*
― Søren Kierkegaard