A memoryless source emits $n$ symbols, each with probability $p$. The entropy of the source, as a function of $n$,

1. increases as $\log n$
2. decreases as $\log (1 / n)$
3. increases as $n$
4. increases as $n \log n$
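Since all $n$ symbols share the same probability $p$ and probabilities must sum to one, $p = 1/n$, giving $H = -\sum_{i=1}^{n} \frac{1}{n} \log_2 \frac{1}{n} = \log_2 n$, which matches option 1. A quick numerical check (a sketch, not part of the original question):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2 p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# For a memoryless source with n equiprobable symbols (p = 1/n),
# the entropy equals log2(n): it grows logarithmically in n.
for n in (2, 4, 8, 16):
    h = entropy([1.0 / n] * n)
    print(f"n = {n:2d}  H = {h:.3f} bits  log2(n) = {math.log2(n):.3f}")
```

Doubling $n$ adds exactly one bit of entropy, so $H$ increases as $\log n$, not as $n$ or $n \log n$.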