see also: lesswrong

idea: quantifies the average level of uncertainty or information associated with a variable’s potential states or possible outcomes.

definition

Given a discrete random variable $X$, which takes values in a set $\mathcal{X}$ and is distributed according to $p : \mathcal{X} \to [0,1]$, the entropy $H(X)$ is defined as

$$H(X) \coloneqq - \sum_{x \in \mathcal{X}} p(x) \log p(x)$$

Base 2 gives the unit of “bits” (or “shannons”), the natural base gives “natural units” (or “nats”), and base 10 gives “dits” (or “bans”, or “hartleys”).
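
A quick numeric check of the definition (a minimal Python sketch; the `entropy` helper and the example distributions are illustrative, not part of the source):

```python
import math

def entropy(p, base=2):
    """Shannon entropy of a discrete distribution given as a list of
    probabilities; base=2 gives bits, math.e gives nats, 10 gives hartleys."""
    return -sum(px * math.log(px, base) for px in p if px > 0)

print(entropy([0.5, 0.5]))           # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))           # biased coin: ~0.469 bits
print(entropy([0.5, 0.5], math.e))   # same fair coin in nats: ~0.693
```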

joint

measure of uncertainty associated with a set of variables

namely, the joint Shannon entropy, in bits, of two discrete random variables $X$ and $Y$ with images $\mathcal{X}$ and $\mathcal{Y}$, defined as:

$$H(X,Y) = - \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x,y) \log_2 [P(x,y)]$$

where $P(x,y)$ is the joint probability of $X$ and $Y$ occurring together.
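
A minimal sketch of the two-variable case in Python (the `joint_entropy` helper and the example joint tables are assumptions for illustration):

```python
import math

def joint_entropy(P):
    """Joint Shannon entropy in bits, given a 2-D joint probability
    table P where P[x][y] = P(X = x, Y = y)."""
    return -sum(pxy * math.log2(pxy)
                for row in P for pxy in row if pxy > 0)

# two independent fair coins: H(X, Y) = H(X) + H(Y) = 2 bits
print(joint_entropy([[0.25, 0.25],
                     [0.25, 0.25]]))  # 2.0
# fully dependent coins (Y = X): H(X, Y) = H(X) = 1 bit
print(joint_entropy([[0.5, 0.0],
                     [0.0, 0.5]]))    # 1.0
```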

For more than two random variables, this expands to:

$$H(X_{1},\ldots,X_{n}) = - \sum_{x_{1} \in \mathcal{X}_{1}} \cdots \sum_{x_{n} \in \mathcal{X}_{n}} P(x_{1},\ldots,x_{n}) \log_2 [P(x_{1},\ldots,x_{n})]$$
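
A hedged sketch of the $n$-variable case, assuming the joint distribution is stored as an $n$-dimensional NumPy array (the helper name and example are illustrative):

```python
import numpy as np

def joint_entropy_n(P):
    """Joint Shannon entropy in bits of an n-dimensional joint
    probability array: the sum runs over every cell, whatever the
    number of axes."""
    p = np.asarray(P, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# three independent fair coins: 3 bits
print(joint_entropy_n(np.full((2, 2, 2), 1 / 8)))  # 3.0
```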