Kolmogorov Complexity and Algorithmic Randomness (Mathematical Surveys and Monographs)

Publication series: Mathematical Surveys and Monographs

Authors: A. Shen; V. A. Uspensky; N. Vereshchagin

Publisher: American Mathematical Society

Publication year: 2017

E-ISBN: 9781470440831

P-ISBN (Paperback): 9781470431822

Subject: TP301.5 Computational complexity theory

Keyword: None (uncategorized)

Language: ENG



Description

Looking at a sequence of zeros and ones, we often feel that it is not random, that is, it is not plausible as an outcome of fair coin tossing. Why? The answer is provided by algorithmic information theory: because the sequence is compressible, that is, it has small complexity or, equivalently, can be produced by a short program. This idea, going back to Solomonoff, Kolmogorov, Chaitin, Levin, and others, is now the starting point of algorithmic information theory. The first part of this book is a textbook-style exposition of the basic notions of complexity and randomness; the second part covers some recent work done by participants of the “Kolmogorov seminar” in Moscow (started by Kolmogorov himself in the 1980s) and their colleagues. This book contains numerous exercises (embedded in the text) that will help readers to grasp the material.
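The idea in the description can be illustrated with an ordinary compressor. True Kolmogorov complexity is uncomputable, so a practical compressor such as zlib only gives an upper bound on complexity; still, a sequence that compresses well is demonstrably non-random, while a pseudo-random one resists compression. The sketch below (with arbitrarily chosen lengths and seed) is an informal illustration, not a construction from the book:

```python
import random
import zlib

# A highly regular sequence: 1000 bytes of the pattern "01" repeated.
structured = b"01" * 500

# Pseudo-random bytes of the same length (seeded for reproducibility).
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(1000))

# zlib.compress gives an upper bound on description length: a short
# compressed form is a short "program" (compressed data plus decompressor)
# that produces the sequence.
len_structured = len(zlib.compress(structured))
len_noisy = len(zlib.compress(noisy))

print(len_structured, len_noisy)

# The regular sequence shrinks to a small fraction of its length, so it is
# compressible and hence not plausible as fair coin tossing; the
# pseudo-random one stays near its original size.
assert len_structured < len_noisy
```

Note the asymmetry emphasized in algorithmic information theory: compressing a string certifies low complexity, but failure to compress proves nothing, since complexity (and thus incompressibility) cannot be effectively verified.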

Chapter

Title page

Contents

Preface

Acknowledgments

Basic notions and notation

Introduction. What is this book about?

What is Kolmogorov complexity?

Optimal description modes

Kolmogorov complexity

Complexity and information

Complexity and randomness

Non-computability of C and Berry’s paradox

Some applications of Kolmogorov complexity

Chapter 1. Plain Kolmogorov complexity

1.1. Definition and main properties

1.2. Algorithmic properties

Chapter 2. Complexity of pairs and conditional complexity

2.1. Complexity of pairs

2.2. Conditional complexity

2.3. Complexity as the amount of information

Chapter 3. Martin-Löf randomness

3.1. Measures on Ω

3.2. The Strong Law of Large Numbers

3.3. Effectively null sets

3.4. Properties of Martin-Löf randomness

3.5. Randomness deficiencies

Chapter 4. A priori probability and prefix complexity

4.1. Randomized algorithms and semimeasures on ℕ

4.2. Maximal semimeasures

4.3. Prefix machines

4.4. A digression: Machines with self-delimiting input

4.5. The main theorem on prefix complexity

4.6. Properties of prefix complexity

4.7. Conditional prefix complexity and complexity of pairs

Chapter 5. Monotone complexity

5.1. Probabilistic machines and semimeasures on the tree

5.2. Maximal semimeasure on the binary tree

5.3. A priori complexity and its properties

5.4. Computable mappings of type Σ→Σ

5.5. Monotone complexity

5.6. Levin–Schnorr theorem

5.7. The random number Ω

5.8. Effective Hausdorff dimension

5.9. Randomness with respect to different measures

Chapter 6. General scheme for complexities

6.1. Decision complexity

6.2. Comparing complexities

6.3. Conditional complexities

6.4. Complexities and oracles

Chapter 7. Shannon entropy and Kolmogorov complexity

7.1. Shannon entropy

7.2. Pairs and conditional entropy

7.3. Complexity and entropy

Chapter 8. Some applications

8.1. There are infinitely many primes

8.2. Moving information along the tape

8.3. Finite automata with several heads

8.4. Laws of Large Numbers

8.5. Forbidden substrings

8.6. A proof of an inequality

8.7. Lipschitz transformations are not transitive

Chapter 9. Frequency and game approaches to randomness

9.1. The original idea of von Mises

9.2. Set of strings as selection rules

9.3. Mises–Church randomness

9.4. Ville’s example

9.5. Martingales

9.6. A digression: Martingales in probability theory

9.7. Lower semicomputable martingales

9.8. Computable martingales

9.9. Martingales and Schnorr randomness

9.10. Martingales and effective dimension

9.11. Partial selection rules

9.12. Non-monotonic selection rules

9.13. Change in the measure and randomness

Chapter 10. Inequalities for entropy, complexity, and size

10.1. Introduction and summary

10.2. Uniform sets

10.3. Construction of a uniform set

10.4. Uniform sets and orbits

10.5. Almost uniform sets

10.6. Typization trick

10.7. Combinatorial interpretation: Examples

10.8. Combinatorial interpretation: The general case

10.9. One more combinatorial interpretation

10.10. The inequalities for two and three strings

10.11. Dimensions and Ingleton’s inequality

10.12. Conditionally independent random variables

10.13. Non-Shannon inequalities

Chapter 11. Common information

11.1. Incompressible representations of strings

11.2. Representing mutual information as a string

11.3. The combinatorial meaning of common information

11.4. Conditional independence and common information

Chapter 12. Multisource algorithmic information theory

12.1. Information transmission requests

12.2. Conditional encoding

12.3. Conditional codes: Muchnik’s theorem

12.4. Combinatorial interpretation of Muchnik’s theorem

12.5. A digression: On-line matching

12.6. Information distance and simultaneous encoding

12.7. Conditional codes for two conditions

12.8. Information flow and network cuts

12.9. Networks with one source

12.10. Common information as an information request

12.11. Simplifying a program

12.12. Minimal sufficient statistics

Chapter 13. Information and logic

13.1. Problems, operations, complexity

13.2. Problem complexity and intuitionistic logic

13.3. Some formulas and their complexity

13.4. More examples and the proof of Theorem 238

13.5. Proof of a result similar to Theorem 238 using Kripke models

13.6. A problem whose complexity is not expressible in terms of the complexities of tuples

Chapter 14. Algorithmic statistics

14.1. The framework and randomness deficiency

14.2. Stochastic objects

14.3. Two-part descriptions

14.4. Hypotheses of restricted type

14.5. Optimality and randomness deficiency

14.6. Minimal hypotheses

14.7. A bit of philosophy

Appendix 1. Complexity and foundations of probability

Probability theory paradox

Current best practice

Simple events and events specified in advance

Frequency approach

Dynamical and statistical laws

Are “real-life” sequences complex?

Randomness as ignorance: Blum–Micali–Yao pseudo-randomness

A digression: Thermodynamics

Another digression: Quantum mechanics

Appendix 2. Four algorithmic faces of randomness

Introduction

Face one: Frequency stability and stochasticness

Face two: Chaoticness

Face three: Typicalness

Face four: Unpredictability

Generalization for arbitrary computable distributions

History and bibliography

Bibliography

Name Index

Subject Index

Back Cover