Entropy and information theory

Claude Shannon, a mathematician and engineer at Bell Labs, drew on ideas from thermodynamics to build a rigorous mathematical theory for studying problems related to information. Today, when information abounds, his work is one of the pillars for tackling some of the hardest problems in industry and other areas of knowledge. In this course we propose a mathematical study of the phenomenology of the following concepts: transmitter, message, and receiver.


  1. Computational motivation of entropy

  2. Combinatorial entropy

  3. Entropy of probability spaces

  4. Entropy of random variables

  5. Applications of entropy

  6. Fundamental theorems

  7. Relationship to information compression
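As a small preview of the topics above (in particular, entropy of probability spaces and its relationship to compression), the sketch below computes the Shannon entropy of a distribution in bits; the function name `shannon_entropy` and the sample message are illustrative choices, not part of the course material.

```python
from collections import Counter
from math import log2

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits.

    Terms with p == 0 are skipped, following the convention 0*log(0) = 0.
    """
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# Entropy of the empirical symbol distribution of a message: this value
# lower-bounds the average number of bits per symbol any lossless code
# can achieve for this source (Shannon's source coding theorem).
message = "abracadabra"
counts = Counter(message)
probs = [c / len(message) for c in counts.values()]
print(round(shannon_entropy(probs), 4))
```

A biased source (e.g. a coin with p = 0.9) has entropy below 1 bit, which is precisely why its output can be compressed.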