Recall that the entropy (in bits) of a random variable $X$ which takes values in $\mathbb{N}$, the set of natural numbers, is defined as

\[
H(X)=\sum_{n=1}^{\infty} p_{n} \log_{2} \frac{1}{p_{n}},
\]

where, for $n \in \mathbb{N}$, $p_{n}$ denotes the probability that $X=n$.

Consider a fair coin (i.e., both sides have equal probability of appearing). Suppose we toss the coin repeatedly until both sides have been observed. Let $X$ be the random variable denoting the number of tosses made. What is the entropy of $X$ in bits?

- $1$
- $2$
- $4$
- Infinity
- None of the above
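One way to sanity-check the options numerically: the event $X = n$ requires the first $n-1$ tosses to show the same side and the $n$-th toss to show the other side, which gives $p_n = 2 \cdot (1/2)^n = 2^{1-n}$ for $n \ge 2$ (and $p_1 = 0$). A short Python sketch sums the entropy series under this distribution; the truncation point `N` is an arbitrary cutoff chosen well past double precision:

```python
import math

def p(n: int) -> float:
    # P(X = n): first n-1 tosses land on one side, the n-th on the other,
    # so p_n = 2 * (1/2)**n = 2**(1 - n) for n >= 2; X = 1 is impossible.
    return 2.0 ** (1 - n) if n >= 2 else 0.0

# Terms decay geometrically, so truncating at N = 60 (an arbitrary
# cutoff) leaves an error far below double-precision resolution.
N = 60
H = sum(p(n) * math.log2(1.0 / p(n)) for n in range(2, N + 1))
print(H)  # ≈ 2.0
```

The sum reduces to $\sum_{k=1}^{\infty} k \, 2^{-k} = 2$, consistent with what the numerical check reports.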