
Recall that the entropy (in bits) of a random variable $X$ taking values in $\mathbb{N}$, the set of natural numbers, is defined as

$$H(X)=\sum_{n=1}^{\infty} p_{n} \log _{2} \frac{1}{p_{n}},$$

where, for $n \in \mathbb{N}$, $p_n$ denotes the probability that $X = n$.
Consider a fair coin (i.e., both sides have equal probability of appearing). Suppose we toss the coin repeatedly until both sides have been observed. Let $X$ be the random variable denoting the number of tosses made. What is the entropy of $X$ in bits?

  1. $1$
  2. $2$
  3. $4$
  4. Infinity
  5. None of the above
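
As a sanity check, note that the process first sees both sides at toss $n \ge 2$ with probability $p_n = 2^{-(n-1)}$ (the first $n-1$ tosses must all show the same side and the $n$-th must show the other), so $H(X) = \sum_{n=2}^{\infty} (n-1)\,2^{-(n-1)} = 2$ bits. The Python sketch below evaluates the truncated series numerically; the helper name `entropy_bits` and the truncation point are illustrative choices, not part of the question.

```python
import math

def entropy_bits(probs):
    """Shannon entropy, in bits, of a distribution given as probabilities."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Tossing a fair coin until both sides appear: the earliest stopping
# time is n = 2, and P(X = n) = 2^{-(n-1)} for n >= 2, since the first
# n-1 tosses must all show the same side (probability 2^{-(n-2)}) and
# the n-th toss must show the other side (probability 1/2).
N = 60  # truncation point; terms beyond this are numerically negligible
probs = [2.0 ** -(n - 1) for n in range(2, N + 1)]

print(entropy_bits(probs))  # ~2.0, i.e. H(X) = 2 bits (option 2)
```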