entropy

2023-02-27

cite: (Peter Norvig, Stuart J. Russell, 2020) entropy is a measure of uncertainty of a random variable

a random variable with only one value–a coin that always comes up heads–has no uncertainty, so its entropy is defined as zero; we gain no information by observing its value. a flip of a fair coin is equally likely to come up heads or tails, 0 or 1, so this counts as “1 bit” of entropy. the roll of a fair four-sided die has 2 bits of entropy, because it takes two bits to describe one of four equally probable choices. now consider an unfair coin that comes up heads 99% of the time. intuitively, this coin has less uncertainty than the fair coin–if we guess heads we’ll be wrong only 1% of the time–so we would like it to have an entropy measure that is close to zero, but positive.

the entropy of a random variable $X$ with values $v_k$, each with probability $P(v_k)$, is defined as

$$H(X) = -\sum_k P(v_k) \log_2 P(v_k)$$
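a minimal sketch of this definition in python (the helper name `entropy` and taking a plain list of outcome probabilities are my own choices, not from the cited text):

```python
import math

def entropy(probs):
    """shannon entropy in bits: -sum of p * log2(p), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))                # fair coin: 1.0 bit
print(entropy([0.25, 0.25, 0.25, 0.25]))  # fair four-sided die: 2.0 bits
```

this reproduces the examples above: the fair coin gives 1 bit and the four-sided die gives 2 bits.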

the entropy of a boolean random variable that is true with probability $q$ is

$$B(q) = -(q \log_2 q + (1-q) \log_2 (1-q))$$
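and the boolean case as its own helper (the name `B` just mirrors the formula above; this is my own sketch, not code from the book):

```python
import math

def B(q):
    """entropy in bits of a boolean random variable that is true with probability q."""
    if q <= 0 or q >= 1:
        return 0.0  # a certain outcome carries no uncertainty
    return -(q * math.log2(q) + (1 - q) * math.log2(1 - q))

print(B(0.5))   # fair coin: 1.0 bit
print(B(0.99))  # coin that is heads 99% of the time: about 0.08 bits, close to zero but positive
```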