Published: 2020
DOI: 10.3390/philosophies5040025

The P–T Probability Framework for Semantic Communication, Falsification, Confirmation, and Bayesian Reasoning

Abstract: Many researchers want to unify probability and logic by defining logical probability or probabilistic logic reasonably. This paper tries to unify statistics and logic so that we can use both statistical probability and logical probability at the same time. For this purpose, this paper proposes the P–T probability framework, which is assembled with Shannon’s statistical probability framework for communication, Kolmogorov’s probability axioms for logical probability, and Zadeh’s membership functions used as truth functions […]
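
The abstract distinguishes statistical probability (the P side, from Shannon and Kolmogorov) from logical probability defined by Zadeh-style membership functions used as truth functions (the T side). A minimal sketch of that distinction follows, assuming, as in Lu's related papers, that the logical probability of a predicate is the expectation of its truth function under P(x); the temperature example, the predicate "warm", and all numbers are hypothetical.

```python
import numpy as np

# Hypothetical discrete source: x is a temperature reading in degrees C.
x_values = np.array([0, 10, 20, 30])
p_x = np.array([0.1, 0.3, 0.4, 0.2])           # statistical probability P(x), e.g. from frequencies

# Truth function T(theta_j | x): membership of x in the fuzzy predicate "x is warm".
# Values lie in [0, 1] and, unlike a probability distribution, need not sum to 1.
t_warm_given_x = np.array([0.0, 0.2, 0.8, 1.0])

# Logical probability of the predicate: T(theta_j) = sum_x P(x) * T(theta_j | x).
t_warm = float(np.sum(p_x * t_warm_given_x))
print(f"Logical probability T('x is warm') = {t_warm:.2f}")   # 0.58
```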

Cited by 6 publications (15 citation statements). References: 40 publications.
“…If subjective predictions always accord with facts, the two information accounts are the same. As presented by Chenguang Lu in two articles in a Special Issue of Philosophies, the first approach deals with how information is acquired, a field often referred to as Informatics [5]. The second, different approach of Information Theory deals with information measurement.…”
Section: Probability Information and Truth (mentioning, confidence: 99%)
“…These conclusions accord with Popper's thoughts [37] (p. 294). For this reason, I(x_i; θ_j) is also explained as the verisimilitude between y_j and x_i [32]. We can also use the above formula to measure sensory information, for which T(θ_j|x) is the confusion probability function of x_j with x or the discrimination function of x_j [31].…”
Section: The Semantic Information G Measure (mentioning, confidence: 99%)
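
The excerpt above reads I(x_i; θ_j) as the verisimilitude between y_j and x_i, with T(θ_j|x) acting as the truth (or confusion) function. A short reconstruction of the measure it refers to, assuming Lu's usual definition of the semantic information and the G measure (the formula itself is not spelled out in the excerpt):

```latex
% Semantic information conveyed by y_j about x_i in the P-T framework
% (reconstructed from Lu's published definition; T(\theta_j \mid x) is the truth function of y_j):
I(x_i;\theta_j) = \log \frac{T(\theta_j \mid x_i)}{T(\theta_j)},
\qquad
T(\theta_j) = \sum_i P(x_i)\, T(\theta_j \mid x_i).
% Averaging over the joint distribution P(x_i, y_j) gives the semantic mutual information
% (the G measure):
G = I(X;Y_\theta) = \sum_j \sum_i P(x_i, y_j)\, \log \frac{T(\theta_j \mid x_i)}{T(\theta_j)}.
```

On this reading, I(x_i; θ_j) is large and positive when the truth value of y_j for the actual x_i is high relative to its average truth value, which is why it can be interpreted as a degree of verisimilitude.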
“…To increase G = I(X; Y_θ), we need to fix T(θ_j|x) and optimize control to improve P(x|y). The R(G) function tells us that we can improve P(y|x) with Equation (15) and increase G by amplifying s. In [32], the author improperly uses a different information formula (Equation (24) in [32]) for the above purpose. That formula seemingly only fits cases where the control results are continuous distributions.…”
Section: Considering Other Kinds of Semantic Information (mentioning, confidence: 99%)
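
The excerpt above argues that G = I(X; Y_θ) can be increased by keeping the truth functions T(θ_j|x) fixed and improving the channel P(y|x). A minimal numerical sketch of that idea, using the averaging formula reconstructed above; the function name, the two-label example, and all numbers are hypothetical:

```python
import numpy as np

def semantic_information_G(p_x, p_y_given_x, t_theta_given_x):
    """Average semantic information G = I(X; Y_theta) in the P-T framework.

    p_x:             shape (nx,)    statistical probability P(x)
    p_y_given_x:     shape (nx, ny) channel P(y_j | x_i)
    t_theta_given_x: shape (nx, ny) truth functions T(theta_j | x_i), values in [0, 1]
    """
    p_xy = p_x[:, None] * p_y_given_x            # joint distribution P(x_i, y_j)
    t_theta = p_x @ t_theta_given_x              # logical probabilities T(theta_j)
    info = np.log2(t_theta_given_x / t_theta)    # I(x_i; theta_j) in bits
    return float(np.sum(p_xy * info))

# Hypothetical two-label example: y1 = "cold", y2 = "warm", two temperature states.
p_x = np.array([0.5, 0.5])
T = np.array([[0.9, 0.1],                        # truth functions, kept fixed
              [0.2, 1.0]])
channel_rough = np.array([[0.6, 0.4],            # a rough labelling rule P(y|x)
                          [0.4, 0.6]])
channel_better = np.array([[0.9, 0.1],           # a rule better matched to the truth functions
                           [0.1, 0.9]])

print(semantic_information_G(p_x, channel_rough, T))    # about -0.31 bits
print(semantic_information_G(p_x, channel_better, T))   # about  0.51 bits: larger G, same T
```

With the truth functions held fixed, the better-matched channel raises the average semantic information, which is the direction of improvement the excerpt describes.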