A memoryless source emits $n$ symbols each with a probability $p$. The entropy of the source as a function of $n$

(A) increases as $\log n$
(B) decreases as $\log (1 / n)$
(C) increases as $n$
(D) increases as $n \log n$

Tags: gate2008-ec
asked Sep 17, 2022 by admin (46.4k points), edited Feb 14, 2023 by Lakshman Bhaiya
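A minimal sketch of the underlying calculation, assuming the $n$ symbols are equally likely (so $p = 1/n$ for each): Shannon entropy is $H = -\sum_i p_i \log_2 p_i$, which for equiprobable symbols reduces to $\log_2 n$. The helper name `entropy` is illustrative, not from the question.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i))."""
    return -sum(p * log2(p) for p in probs if p > 0)

# For a memoryless source with n equally likely symbols (p = 1/n),
# H = -n * (1/n) * log2(1/n) = log2(n), so entropy grows as log n.
for n in (2, 4, 8, 16):
    probs = [1 / n] * n
    print(n, entropy(probs))
```

Running this shows the entropy matching $\log_2 n$ (1, 2, 3, 4 bits), i.e. the entropy increases as $\log n$.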