
TIFR ECE 2023 | Question-4

Recall that the entropy (in bits) of a random variable $X$ which takes values in $\mathbb{N}$, the set of natural numbers, is defined as
$$H(X)=\sum_{n=1}^{\infty} p_{n} \log_{2} \frac{1}{p_{n}},$$
where, for $n \in \mathbb{N}$, $p_{n}$ denotes the probability that $X = n$.
Consider a fair coin (i.e., both sides have equal probability of appearing). Suppose we toss the coin repeatedly until both sides have been observed. Let $X$ be the random variable denoting the number of tosses made. What is the entropy of $X$ in bits?

  1. $1$
  2. $2$
  3. $4$
  4. Infinity
  5. None of the above
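As a sketch of how one might check the answer numerically: the event $X = n$ (for $n \geq 2$) means the first $n-1$ tosses all showed the same side and toss $n$ showed the other, so for a fair coin $P(X=n) = (1/2)^{n-1}$. Plugging this into the entropy definition above gives a truncated sum that can be evaluated directly (the truncation point of $n = 200$ is an arbitrary choice; the geometric tail beyond it is negligible):

```python
import math

# P(X = n): the first n-1 tosses all show the same side, toss n shows the other.
# For a fair coin this is (1/2)^(n-1), for n >= 2.
def p(n):
    return 0.5 ** (n - 1)

# Truncated entropy sum H(X) = sum p_n * log2(1/p_n); tail beyond n = 200 is negligible.
H = sum(p(n) * math.log2(1.0 / p(n)) for n in range(2, 200))
print(H)  # very close to 2.0
```

This agrees with the closed form $\sum_{k=1}^{\infty} k \, 2^{-k} = 2$ obtained by substituting $k = n - 1$.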

