A memoryless source emits $n$ symbols each with a probability $p$. The entropy of the source as a function of $n$

  1. increases as $\log n$
  2. decreases as $\log (1 / n)$
  3. increases as $n$
  4. increases as $n \log n$
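
A brief sketch of the entropy calculation, assuming the intended reading is that all $n$ symbols are equiprobable; since the probabilities must sum to 1, this forces $p = 1/n$:

$$H = -\sum_{i=1}^{n} p \log p = -n \cdot \frac{1}{n} \log \frac{1}{n} = \log n$$

Under this assumption the entropy grows as $\log n$ with the number of symbols, which corresponds to option 1.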