How much information does this seminar have?
Demian Cho
Associate Professor
Physics
Possibly quite a lot. (Caveat: I never said "useful" information!) But, seriously, what does it mean to say that a message carries a lot of information? Can we quantify information? In this introductory talk, I will informally introduce the standard measure of information, Shannon Entropy, along with related measures such as Relative Entropy. These relatively simple concepts have applications in many areas, including neuroscience, machine learning, biology, and economics.
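As a small taste of the quantities named above (an illustrative sketch of the standard definitions, not material from the talk itself): Shannon Entropy measures the average surprise of a distribution in bits, and Relative Entropy (the Kullback-Leibler divergence) measures the extra cost of describing one distribution while assuming another.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def relative_entropy(p, q):
    """Relative entropy D(p || q) = sum_i p_i * log2(p_i / q_i), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin toss carries exactly 1 bit of information per toss;
# a heavily biased coin is more predictable, so it carries less.
fair = [0.5, 0.5]
biased = [0.9, 0.1]
print(shannon_entropy(fair))            # 1.0 bit
print(shannon_entropy(biased))          # ~0.469 bits
print(relative_entropy(biased, fair))   # extra bits paid by wrongly assuming fairness
```

The guard `if p > 0` reflects the convention 0 log 0 = 0, so distributions with zero-probability outcomes are handled correctly.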