in Probability and Statistics

Let $x_{1}=-1$ and $x_{2}=1$ be two signals that are transmitted with equal probability. If signal $x_{i}, i \in \{1,2\}$ is transmitted, the received signal is $y=x_{i}+n_{i}$, where $n_{i}$ is Gaussian distributed with mean $\mu_{i}$ and variance $\sigma_{i}$. At the receiver, knowing $y$, your job is to detect whether $x_{1}$ or $x_{2}$ was sent. Let $\theta$ be the detection threshold, i.e., if $y<\theta$ then we declare that $x_{1}$ was transmitted; otherwise we declare that $x_{2}$ was transmitted. In general, which of the following is true?

  1. If $\sigma_{1}>\sigma_{2},$ the optimal detection threshold $\theta^{\star}$ to minimize the probability of error is $\leq 0$.
  2. If $\sigma_{1}<\sigma_{2},$ the optimal detection threshold $\theta^{\star}$ to minimize the probability of error is $\leq 0$.
  3. If $\mu_{1}>\mu_{2},$ the optimal detection threshold $\theta^{\star}$ to minimize the probability of error is $\leq 0$.
  4. If $\mu_{1}<\mu_{2},$ the optimal detection threshold $\theta^{\star}$ to minimize the probability of error is $\leq 0$.
  5. None of the above.
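To build intuition for how the threshold $\theta$ trades off the two error events, here is a minimal Monte Carlo sketch of the setup. The parameter values (`mu`, `sigma`, the number of trials) are hypothetical choices for illustration, not given in the question; the sketch treats `sigma[i]` as the noise standard deviation, which coincides with the variance for the unit-noise case used here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical noise parameters (not specified in the question):
# symmetric zero-mean, unit-variance Gaussian noise on both signals.
mu = {1: 0.0, 2: 0.0}
sigma = {1: 1.0, 2: 1.0}  # rng.normal takes the standard deviation

def error_probability(theta, n_trials=200_000):
    """Monte Carlo estimate of P(error) for threshold theta:
    declare x1 if y < theta, else declare x2."""
    errors = 0
    for i, x in ((1, -1.0), (2, 1.0)):
        # Each signal is transmitted with probability 1/2.
        y = x + rng.normal(mu[i], sigma[i], n_trials)
        if i == 1:
            # x1 was sent but y >= theta, so we wrongly declare x2
            errors += np.count_nonzero(y >= theta)
        else:
            # x2 was sent but y < theta, so we wrongly declare x1
            errors += np.count_nonzero(y < theta)
    return errors / (2 * n_trials)
```

With this symmetric choice of parameters, `error_probability(0.0)` comes out close to $Q(1) \approx 0.1587$, and moving $\theta$ away from 0 in either direction increases the estimate; changing `mu` or `sigma` asymmetrically shifts the best threshold, which is what the options above are probing.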
