Statistical Information and Uncertainty: A Critique of Applications in Experimental Psychology

Author: Donald Laming

Publisher: MDPI

E-ISSN: 1099-4300

ISSN: 1099-4300

Source: Entropy, Vol. 12, Iss. 4, 2010-04, pp. 720-771




Abstract

This paper presents, first, a formal exploration of the relationships between information (statistically defined), statistical hypothesis testing, the use of hypothesis testing in reverse as an investigative tool, channel capacity in a communication system, uncertainty, the concept of entropy in thermodynamics, and Bayes’ theorem. This exercise brings out the close mathematical interrelationships between different applications of these ideas in diverse areas of psychology. Subsequent illustrative examples are grouped under (a) the human operator as an ideal communications channel, (b) the human operator as a purely physical system, and (c) Bayes’ theorem as an algorithm for combining information from different sources. Some tentative conclusions are drawn about the usefulness of information theory within these different categories. (a) The idea of the human operator as an ideal communications channel has long been abandoned, though it provides some lessons that still need to be absorbed today. (b) Treating the human operator as a purely physical system provides a platform for the quantitative exploration of many aspects of human performance by analogy with the analysis of other physical systems. (c) The use of Bayes’ theorem to calculate the effects of prior probabilities and stimulus frequencies on human performance is probably misconceived, but it is difficult to obtain results precise enough to resolve this question.
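Point (c) of the abstract treats Bayes' theorem as an algorithm for combining prior probabilities with observed evidence. A minimal sketch of that combination rule follows; the two hypotheses and all the numbers are illustrative assumptions for exposition, not values taken from the paper.

```python
def bayes_update(priors, likelihoods):
    """Return posterior probabilities for an ordered set of hypotheses,
    given their prior probabilities and the likelihoods P(data | hypothesis)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]  # unnormalized posteriors
    total = sum(joint)                                    # P(data), the normalizer
    return [j / total for j in joint]

# Hypothetical example: 'signal' vs 'noise', with unequal prior probabilities
# standing in for the stimulus frequencies discussed in the abstract.
priors = [0.8, 0.2]
likelihoods = [0.3, 0.9]  # P(observation | hypothesis), assumed values
posterior = bayes_update(priors, likelihoods)
```

Note how a strong prior for 'noise' largely offsets the evidence favouring 'signal': whether human observers actually combine prior probabilities and evidence in this multiplicative way is exactly the question the abstract calls difficult to resolve empirically.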