Entropy and information theory
Claude Shannon drew on ideas from thermodynamics to devise a rigorous mathematical theory for studying problems related to information. In an age when information abounds, his work is one of the pillars for solving some of the most complicated problems in industry and in other areas of knowledge. In this course we propose a mathematical study of the following concepts: transmitter, message, and receiver.
Syllabus

Computational motivation of entropy

Combinatorial entropy

Entropy of probability spaces

Entropy of random variables

Applications of entropy

Fundamental theorems

Relationship to information compression
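As a minimal illustration of the central quantity in this syllabus, the sketch below computes the Shannon entropy H(p) = -Σ pᵢ log₂ pᵢ of a discrete distribution in Python. The function name `entropy` is chosen here for illustration and is not taken from the course material.

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    # Terms with p = 0 contribute nothing (lim p*log p = 0), so skip them.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(entropy([0.5, 0.5]))    # → 1.0

# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))    # ≈ 0.469

# A uniform distribution over 4 outcomes carries log2(4) = 2 bits.
print(entropy([0.25] * 4))    # → 2.0
```

The uniform distribution maximizes entropy for a given number of outcomes, which is one way the course's later topics (fundamental theorems, compression) connect back to this definition.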