|CS 295 - Information and Complexity|
Information and complexity provide a broad theoretical framework that applies to a variety of problems in computer science, engineering, statistics, and other disciplines. Claude Shannon extended Ludwig Boltzmann's concept of entropy to quantify the amount of computer memory required to store a random message, as well as the maximum rate at which it can be reliably transmitted over a given communication channel.
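As a small illustration of Shannon's measure, the following sketch computes the empirical entropy of a message in bits per symbol (the function name and the use of symbol frequencies as probability estimates are illustrative choices, not part of the course materials):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average number of bits per symbol needed to encode the
    message, estimated from its empirical symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Four equally likely symbols require 2 bits each:
print(shannon_entropy("abcd"))  # 2.0
# A constant message carries no information:
print(abs(shannon_entropy("aaaa")))  # 0.0
```

A uniform distribution over an alphabet of size k gives the maximum entropy, log2(k) bits per symbol; any bias lowers the figure, which is why biased sources compress well.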
Andrey Nikolaevich Kolmogorov (1903--1987) developed algorithmic information theory to measure the complexity of a message as the "size" of the smallest computer program that generates it. This course will develop, analyze, and apply these and other measures of information and complexity in a variety of contexts, including communication theory, computer science, finance, physics, statistics, and complex systems.
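Kolmogorov complexity itself is uncomputable, but a standard practical proxy is the length of a compressed encoding: a highly regular message has a short generating program and compresses well, while random data does not. A minimal sketch using Python's standard zlib library (the helper name and test strings are illustrative assumptions):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length in bytes of the zlib-compressed data: an upper
    bound, up to an additive constant, on its Kolmogorov
    complexity relative to the zlib decompressor."""
    return len(zlib.compress(data, 9))

# Regular data: describable by the short program "repeat 'ab' 5000 times".
regular = b"ab" * 5000
# Random data: with high probability, no description shorter than itself.
random_bytes = os.urandom(10000)

print(compressed_size(regular) < compressed_size(random_bytes))  # True
```

The inequality above is the compression-based intuition behind algorithmic information theory: structure is exactly what a short program (or a good compressor) can exploit.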
Prerequisite: a course in probability or statistics, e.g., STAT 141, 143, 151, or 153.
Other: Information Theory
Not specific to any particular application domain
|Frequency: Not sure|