Algorithmic Information Theory grew out of foundational investigations in probability theory. The main achievement of this area is the application of the theory of computation to define the notion of an individual random finite object (finite string). The most important tool used in this definition, Kolmogorov Complexity, is a formal analogue of the information-theoretic concept of entropy. Thus it is a fruition of Kolmogorov's vision that "Information Theory should precede Probability Theory, and not be based on it".
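To give a first intuition for this idea: the Kolmogorov complexity of a string is the length of its shortest description, and while that quantity is uncomputable, the output length of any real compressor gives a computable upper-bound proxy. The sketch below uses Python's `zlib` for this purpose; the choice of compressor is purely illustrative and is not part of the formal theory.

```python
import random
import zlib

def c_upper_bound(s: bytes) -> int:
    # Length of a zlib-compressed encoding of s: a crude, computable
    # upper-bound proxy for the (uncomputable) Kolmogorov complexity.
    return len(zlib.compress(s, 9))

# A highly regular string has a short description ("repeat '01' 500 times"),
# so it compresses far below its raw length.
structured = b"01" * 500

# A string of seeded pseudo-random bytes shows no regularity that zlib can
# exploit, so its compressed length stays close to its raw length of 1000.
random.seed(0)
random_looking = bytes(random.getrandbits(8) for _ in range(1000))

print(c_upper_bound(structured))      # much smaller than 1000
print(c_upper_bound(random_looking))  # close to 1000
```

Note that the pseudo-random string is only *apparently* random: its true Kolmogorov complexity is tiny, since a short program (the generator plus the seed) produces it. This gap illustrates why compression yields only an upper bound on complexity, never the exact value.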
The aim of this course is to provide an introduction to the field of algorithmic information theory, in particular to Kolmogorov Complexity, and to its applications to other areas of computing. In addition to covering the standard theory, we hope to discuss more recent results from the literature, in particular notions of resource-bounded Kolmogorov Complexity and a survey of applications of Kolmogorov Complexity to computational complexity.
I'll provide you with a bibliography as the course progresses.