P-T Probability Framework and Semantic Information G Theory Tested by Seven Difficult Tasks
Abstract
To extend information theory to more areas, the author has proposed the semantic information G theory, a natural generalization of Shannon's information theory. This theory uses the P-T probability framework so that likelihood functions, truth functions (or membership functions), and sampling distributions can all enter the semantic mutual information formula at the same time; hence, it connects statistics and (fuzzy) logic. When the distortion function is replaced with the semantic information function, the rate-distortion function R(D) becomes the rate-verisimilitude function R(G), where G is the lower limit of the semantic mutual information. The seven difficult tasks are: 1) clarifying the relationship between minimum information and maximum entropy in statistical mechanics; 2) compressing images according to visual discrimination; 3) multilabel learning, i.e., obtaining truth functions or membership functions from sampling distributions; 4) feature classification with the maximum mutual information criterion; 5) proving the convergence of the expectation-maximization algorithm for mixture models; 6) interpreting Popper's verisimilitude and reconciling the contradiction between the content approach and the likeness approach; and 7) providing practical confirmation measures and clarifying the raven paradox. This paper briefly introduces the mathematical methods used for these tasks and the conclusions reached. The P-T probability framework and the semantic information G theory appear to have survived these tests and should therefore have broader applications. Further studies are needed to combine them with neural networks for machine learning.
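For reference, a minimal sketch of the semantic mutual information formula mentioned above, written in the P-T notation of the G theory as the author has presented it elsewhere (the precise definitions appear in the body of the paper): P(x_i, y_j) denotes the sampling distribution over instances x_i and labels y_j, T(\theta_j \mid x_i) the truth (membership) function of the fuzzy set \theta_j, and T(\theta_j) = \sum_i P(x_i) T(\theta_j \mid x_i) its logical probability. The semantic mutual information is then

I(X;\Theta) \;=\; \sum_{j}\sum_{i} P(x_i, y_j)\,\log\frac{T(\theta_j \mid x_i)}{T(\theta_j)} .

Because the sampling distribution P and the truth functions T enter the same average, statistical data and (fuzzy) logical descriptions are combined in one measure; replacing the distortion function in R(D) with this semantic information function is what yields the rate-verisimilitude function R(G).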