A $1 \; \mathrm{mW}$ video signal having a bandwidth of $100 \; \mathrm{MHz}$ is transmitted to a receiver through a cable that has $40 \; \mathrm{dB}$ loss. If the effective one-sided noise spectral density at the receiver is $10^{-20} \; \mathrm{W/Hz}$, then the signal-to-noise ratio at the receiver is
- $50 \mathrm{~dB}$
- $30 \mathrm{~dB}$
- $40 \mathrm{~dB}$
- $60 \mathrm{~dB}$
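
A short worked check, assuming the received power is the transmitted power reduced by the cable loss and the noise power is the one-sided spectral density times the bandwidth:

$$
P_r = \frac{10^{-3} \; \mathrm{W}}{10^{40/10}} = 10^{-7} \; \mathrm{W}, \qquad
N = N_0 B = 10^{-20} \times 10^{8} = 10^{-12} \; \mathrm{W}
$$

$$
\mathrm{SNR} = \frac{P_r}{N} = \frac{10^{-7}}{10^{-12}} = 10^{5}
\;\;\Rightarrow\;\; 10 \log_{10}\!\left(10^{5}\right) = 50 \; \mathrm{dB}
$$

Under these assumptions the result corresponds to the $50 \; \mathrm{dB}$ option.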