#### CS 687: Algorithmic Information Theory

Algorithmic Information Theory grew out of foundational investigations in probability theory. The main achievement of this area is the application of the theory of computation to define the notion of an individual random finite object (finite string). The most important tool used in this definition, Kolmogorov Complexity, is a formal analogue of the information-theoretic concept of entropy. It is thus a realization of Kolmogorov's vision that "Information Theory should precede Probability Theory, and not be based on it".

The aim of this course is to provide an introduction to the field of algorithmic information theory, in particular to Kolmogorov Complexity, and to its applications to other areas in computing. In addition to covering the standard theory, we hope to discuss more recent results in the literature, in particular, notions of resource-bounded Kolmogorov Complexity and a survey of the applications of Kolmogorov Complexity to computational complexity.

###### Course Outline

I'll provide you with a bibliography as the course progresses.

**Introduction** Motivation and History, Brief Review of Prerequisites.
**Kolmogorov Complexity** Plain Kolmogorov Complexity and its Problems, Self-delimiting machines and Kolmogorov Complexity, Symmetry of Information, the Coding Theorem, Algorithmic Probability.
**Randomness** Uniform Distribution, Computable Distributions, Definition of a finite random string, Abundance of randomness, Definition of an infinite random sequence, Martingale Characterizations.
**Information Theory** Information Theoretic Inequalities, Algorithmic Entropy, Randomness and Complexity.
**The Incompressibility Method** Some basic lower bounds, The incompressibility method, A lower bound for Shellsort.
**Resource-bounded Kolmogorov Complexity** Hartmanis' results, Levin's Optimal Search, A Hierarchy theorem for Probabilistic Classes with advice, Hutter's Optimal Search, Kolmogorov Complexity and Communication Complexity.
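
As a small taste of the material on randomness and the incompressibility method, the following sketch (in Python, not part of the course materials) illustrates the counting argument behind the abundance of random strings: there are $2^n$ binary strings of length $n$ but only $2^n - 1$ binary descriptions of length strictly less than $n$, so for every $n$ at least one string of length $n$ is incompressible.

```python
# Counting argument for incompressibility (illustrative sketch).
# Strings of length n: 2**n. Descriptions of length < n: 2**0 + ... + 2**(n-1) = 2**n - 1.
# Since no decompressor can map fewer descriptions onto more strings injectively,
# some string of length n has plain complexity C(x) >= n.

def num_strings(n: int) -> int:
    """Number of binary strings of length exactly n."""
    return 2 ** n

def num_short_descriptions(n: int) -> int:
    """Number of binary strings of length strictly less than n."""
    return sum(2 ** i for i in range(n))  # equals 2**n - 1

for n in range(1, 10):
    # Strictly fewer short descriptions than strings, for every n.
    assert num_short_descriptions(n) == num_strings(n) - 1
```

The same count shows that randomness is abundant: at most $2^{n-c+1} - 1$ strings of length $n$ can be compressed by $c$ or more bits, so all but a $2^{-c+1}$ fraction of $n$-bit strings have complexity above $n - c$.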