[FOM] Fuzziness

Kreinovich, Vladik vladik at utep.edu
Wed Jul 20 18:21:49 EDT 2016

While there does not seem to be a direct relation between fuzzy logic and neural networks, in both areas there are semi-empirical functions that (somewhat mysteriously) turn out to lead to the most efficient results.

* In neural networks, this is the "activation function" f(x) that, for each neuron, transforms a linear combination x of the inputs x1, ..., xn into the output y = f(x); the most efficient choice is the sigmoid function f(x) = 1/(1 + exp(-k*x)).

* In fuzzy logic, it is the selection of a membership function, which assigns to each value of a quantity the degree from [0,1] to which this value satisfies a given informal property like "small", and of the "and"- and "or"-operations f_& and f_\/ that transform our degrees of belief a and b in statements A and B into estimates f_&(a,b) and f_\/(a,b) of the degrees of belief in A & B and A \/ B.
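The two bullets above can be sketched in code. This is a minimal illustration, not the book's constructions: the sigmoid activation with the stated parameter k, plus two standard choices of "and"/"or" operations (min/max and product/probabilistic sum) commonly used in fuzzy logic.

```python
import math

def sigmoid(x, k=1.0):
    # Activation function: maps a linear combination x of inputs into (0, 1).
    return 1.0 / (1.0 + math.exp(-k * x))

# Two standard "and"/"or" (t-norm / t-conorm) pairs on degrees in [0, 1]:
def fuzzy_and_min(a, b):
    return min(a, b)          # f_&(a, b) = min(a, b)

def fuzzy_or_max(a, b):
    return max(a, b)          # f_\/(a, b) = max(a, b)

def fuzzy_and_prod(a, b):
    return a * b              # f_&(a, b) = a * b

def fuzzy_or_prob(a, b):
    return a + b - a * b      # f_\/(a, b) = a + b - a * b
```

For a neuron with weights w1, ..., wn and bias b, the output would then be y = sigmoid(w1*x1 + ... + wn*xn + b).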
It turns out that in both cases (and also for the similar selection of functions in evolutionary computation), the empirical selection of functions can be explained by the fact that all these functions are related to natural symmetries (re-scalings): linear transformations that come from changing the measuring unit or the starting point, or more general non-linear transformations. These results are described in detail, e.g., in our book
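One concrete instance of such a re-scaling symmetry, as a quick numeric check (a sketch illustrating the idea, not the book's derivation): for the sigmoid, the odds y/(1-y) equal exp(k*x), so changing the starting point x -> x + c simply multiplies the odds by the constant exp(k*c); the family of sigmoids is closed under such shifts.

```python
import math

def sigmoid(x, k=1.0):
    return 1.0 / (1.0 + math.exp(-k * x))

def odds(y):
    # Odds ratio of a degree y in (0, 1); for the sigmoid, odds = exp(k*x).
    return y / (1.0 - y)

k, c = 2.0, 0.7
for x in [-1.0, 0.0, 1.5]:
    # Shifting the input by c multiplies the odds by exp(k*c):
    lhs = odds(sigmoid(x + c, k))
    rhs = math.exp(k * c) * odds(sigmoid(x, k))
    assert abs(lhs - rhs) < 1e-9
```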

Hung T. Nguyen and Vladik Kreinovich, "Applications of continuous
mathematics to computer science", Kluwer, Dordrecht, 1997.


On 22 June 2016 at 03:07, Harvey Friedman <hmflogic at gmail.com> wrote:
"fuzzy logic" -- how does it relate to recent breakthroughs in machine learning,
deep learning, etcetera?

