Recall that for a random variable $X$ which takes values in $\mathbb{N}$, the set of natural numbers, its entropy in bits is defined as

\[H(X)=\sum_{n=1}^{\infty} p_{n} \log _{2} \frac{1}{p_{n}},\]

where, for $n \in \mathbb{N}$, $p_{n}$ denotes the probability that $X=n$. Now, consider a fair coin that is tossed repeatedly until a head is observed. Let $X$ be the random variable counting the number of tosses made. What is the entropy of $X$ in bits?

  1. $1$
  2. $1.5$
  3. $\frac{1+\sqrt{5}}{2} \approx 1.618$ (the golden ratio)
  4. $2$
  5. None of the above


Answer:
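The entropy can be computed directly from the definition above. For a fair coin, the number of tosses until the first head is geometrically distributed, so $p_n = 2^{-n}$ and $\log_2 \frac{1}{p_n} = n$. Substituting into the definition gives

\[H(X)=\sum_{n=1}^{\infty} 2^{-n} \cdot n = 2,\]

using the standard identity $\sum_{n=1}^{\infty} n x^{n} = \frac{x}{(1-x)^2}$ at $x = \tfrac{1}{2}$. So the answer is option 4. As a sanity check, the sum can be evaluated numerically by truncating at a large index (a quick sketch, not part of the original post):

```python
import math

# Entropy of X with p_n = 2**-n (fair coin tossed until the first head).
# Truncate the infinite sum at n = 199; the remaining tail is negligible.
H = sum((2.0 ** -n) * math.log2(1.0 / (2.0 ** -n)) for n in range(1, 200))
print(H)  # ≈ 2.0
```

The truncation error is on the order of $(N+2)/2^{N}$, far below floating-point precision for $N = 199$.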