Recall that for a random variable $X$ which takes values in $\mathbb{N}$, the set of natural numbers, its entropy in bits is defined as
\[H(X)=\sum_{n=1}^{\infty} p_{n} \log _{2} \frac{1}{p_{n}},\]
where, for $n \in \mathbb{N}$, $p_{n}$ denotes the probability that $X=n$. Now, consider a fair coin which is tossed repeatedly until heads is observed for the first time. Let $X$ be the random variable which counts the number of tosses made (including the final toss on which heads appears). What is the entropy of $X$ in bits?
- $1$
- $1.5$
- $\frac{1+\sqrt{5}}{2} \approx 1.618$ (the golden ratio)
- $2$
- None of the above
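
Since the coin is fair, $X$ follows a geometric distribution with $p_{n} = 2^{-n}$, so $H(X) = \sum_{n=1}^{\infty} n \, 2^{-n}$. One way to settle the question is to sum this series numerically; the sketch below does so by truncating after enough terms for the tail to be negligible (the function name and the cutoff of 200 terms are illustrative choices, not part of the problem statement):

```python
import math

def geometric_entropy_bits(p, terms=200):
    """Partial sum of H(X) = sum_{n>=1} p_n * log2(1/p_n),
    where p_n = (1-p)^(n-1) * p is the probability that the
    first success occurs on toss n."""
    h = 0.0
    for n in range(1, terms + 1):
        p_n = (1 - p) ** (n - 1) * p
        if p_n > 0:  # skip zero-probability terms (0 * log(1/0) -> 0)
            h += p_n * math.log2(1 / p_n)
    return h

# Fair coin: p = 1/2, so p_n = 2^(-n) and each term is n / 2^n.
print(geometric_entropy_bits(0.5))
```

Because $p_{n} = 2^{-n}$ decays geometrically, the truncated sum converges to the true entropy well within floating-point precision.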