
A $1 \mathrm{~mW}$ video signal having a bandwidth of $100 \; \mathrm{MHz}$ is transmitted to a receiver through a cable that has $40 \mathrm{~dB}$ loss. If the effective one-sided noise spectral density at the receiver is $10^{-20} \; \mathrm{W/Hz}$, then the signal-to-noise ratio at the receiver is

  1. $50 \mathrm{~dB}$
  2. $30 \mathrm{~dB}$
  3. $40 \mathrm{~dB}$
  4. $60 \mathrm{~dB}$


Answer:
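The arithmetic can be checked directly: the 40 dB cable loss attenuates the 1 mW signal by a factor of $10^4$, giving a received power of $10^{-7}$ W; the noise power is $N_0 B = 10^{-20} \times 10^8 = 10^{-12}$ W; so the SNR is $10^{5}$, i.e. 50 dB (option 1). A short script verifying this (a sketch; variable names are mine, values are from the question):

```python
import math

P_tx = 1e-3      # transmitted signal power: 1 mW
loss_db = 40     # cable loss in dB
B = 100e6        # bandwidth: 100 MHz
N0 = 1e-20       # one-sided noise spectral density, W/Hz

P_rx = P_tx / 10 ** (loss_db / 10)    # received power: 1e-7 W
N = N0 * B                            # noise power: 1e-12 W
snr_db = 10 * math.log10(P_rx / N)    # 10 * log10(1e5) = 50 dB
print(snr_db)  # → 50.0
```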