

Authors: Daniele Cerra, Mihai Datcu
Publisher: MDPI
E-ISSN: 1099-4300
ISSN: 1099-4300
Source: Entropy, Vol.13, Iss.4, 2011-04, pp. : 902-914
Abstract
Information content and compression are tightly related concepts that can be addressed through both classical and algorithmic information theories, on the basis of Shannon entropy and Kolmogorov complexity, respectively. The definition of several entities in Kolmogorov’s framework relies upon ideas from classical information theory, and these two approaches share many common traits. In this work, we expand the relations between these two frameworks by introducing algorithmic cross-complexity and relative complexity, counterparts of the cross-entropy and relative entropy (or Kullback-Leibler divergence) found in Shannon’s framework. We define the cross-complexity of an object
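The abstract's central idea, that Kolmogorov complexity can be related to compression and that conditional and relative quantities have compression-based counterparts, can be illustrated with a minimal sketch. This is not the authors' formal definition of cross-complexity; it is a common practical approximation in which the size of a compressed string stands in for its Kolmogorov complexity, and K(x|y) is estimated as C(yx) − C(y). The choice of `zlib` and the test strings are illustrative assumptions.

```python
import zlib

def c(data: bytes) -> int:
    """Approximate Kolmogorov complexity K(x) by compressed size (zlib, level 9)."""
    return len(zlib.compress(data, 9))

def conditional_complexity(x: bytes, y: bytes) -> int:
    """Approximate conditional complexity K(x|y) as C(yx) - C(y),
    a standard compression-based estimate (not the paper's exact definition)."""
    return c(y + x) - c(y)

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance, a related compression-based
    similarity measure from the same algorithmic-information literature."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Illustrative data: two near-identical texts and one unrelated byte pattern.
a = b"the quick brown fox jumps over the lazy dog " * 40
b_similar = b"the quick brown fox leaps over the lazy dog " * 40
b_random = bytes(range(256)) * 8

# Similar objects should be closer under a compression-based measure
# than unrelated ones.
print(ncd(a, b_similar) < ncd(a, b_random))
```

In the same spirit, the paper's relative complexity plays the role that relative entropy (Kullback-Leibler divergence) plays in Shannon's framework: it measures the extra description cost incurred by encoding one object using a description optimized for another.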
Related content


Algorithmic complexity theory and the relative efficiency of financial markets
EPL (Europhysics Letters), Vol. 84, Iss. 4, 2008-11, pp. :


Efficiency of financial markets and algorithmic complexity
Journal of Physics: Conference Series, Vol. 246, Iss. 1, 2010-09, pp. :


FOCUS: a deconvolution method based on algorithmic complexity
Astronomy & Astrophysics, Vol. 454, Iss. 1, 2006-07, pp. :

