
Probability, information theory

event $A$ as a set of basic outcomes

$A \subseteq \Omega$

Probability, information theory

we can estimate the probability of event $A$ by experiment
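For concreteness (not spelled out on the card), the usual relative-frequency estimate: run the experiment $T$ times, count the occurrences $c(A)$, and take $p(A) \approx \dfrac{c(A)}{T}$.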

Probability, information theory

axioms
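The card leaves them implicit; the standard (Kolmogorov) axioms assumed here are: $p(A) \ge 0$ for every event $A$, $p(\Omega) = 1$, and $p(A \cup B) = p(A) + p(B)$ for disjoint events $A, B$.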

Probability, information theory

entropy

nothing can be more uncertain than the uniform distribution
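For reference, the definition used throughout and the formal version of this claim: $H(p) = -\sum_{x} p(x) \log_2 p(x)$, and for any distribution over $n$ outcomes $H(p) \le \log_2 n$, with equality exactly for the uniform distribution.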

Probability, information theory

perplexity

$G(p) = 2^{H(p)}$
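A quick sanity check in the uniform case: for a uniform distribution over 8 outcomes, $H(p) = \log_2 8 = 3$ bits, so $G(p) = 2^3 = 8$, i.e. perplexity equals the number of equally likely choices.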

Probability, information theory

chain rule

$H(X,Y) = H(Y \mid X) + H(X)$
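Since $H(X,Y)$ is symmetric, the rule can equally be written $H(X,Y) = H(X \mid Y) + H(Y)$, and for more variables it generalizes to $H(X_1,\dots,X_n) = \sum_{i=1}^{n} H(X_i \mid X_1,\dots,X_{i-1})$.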

Probability, information theory

coding interpretation

entropy is the least average number of bits needed to encode a message
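A small example of this reading: a fair 8-sided die has $H = 3$ bits, and 3 bits per outcome (codes $000$ through $111$) is the best achievable average code length; a more skewed distribution has lower entropy and admits a shorter average code.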

Probability, information theory

mutual information
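The card gives no formula; the standard definition in terms of the quantities above is $I(X;Y) = H(X) - H(X \mid Y) = H(X) + H(Y) - H(X,Y)$, the reduction in uncertainty about $X$ gained by knowing $Y$.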

Hooray, you're done! 🎉
If my flashcards helped you, you can buy me a beer.