Let $U$ and $V$ be two independent and identically distributed random variables such that $P(U=+1)=P(U=-1)=\dfrac{1}{2}.$ The entropy $H(U+V)$ in bits is

1. $\frac{3}{4}$
2. $1$
3. $\frac{3}{2}$
4. $\log_{2}3$

Tags: gate2013-ec, communications, entropy

asked Mar 25, 2018 by Milicevic3306, edited Nov 17, 2020 by soujanyareddy13
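A minimal sketch of the direct computation (not part of the original post): the sum $S = U + V$ takes the value $-2$ when both are $-1$, $+2$ when both are $+1$, and $0$ otherwise, giving probabilities $\frac{1}{4}, \frac{1}{2}, \frac{1}{4}$. The entropy is then $H(S) = \frac{1}{4}\cdot 2 + \frac{1}{2}\cdot 1 + \frac{1}{4}\cdot 2 = \frac{3}{2}$ bits, which matches option 3. The same arithmetic in Python:

```python
from fractions import Fraction
from math import log2

# pmf of U (and of V, identically distributed): +1 or -1, each w.p. 1/2
p_u = {+1: Fraction(1, 2), -1: Fraction(1, 2)}

# pmf of S = U + V via convolution of the two independent pmfs
p_s = {}
for u, pu in p_u.items():
    for v, pv in p_u.items():
        p_s[u + v] = p_s.get(u + v, Fraction(0)) + pu * pv

# S takes values -2, 0, +2 with probabilities 1/4, 1/2, 1/4
entropy = -sum(float(p) * log2(float(p)) for p in p_s.values())
print(entropy)  # 1.5
```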