Let $x_{1}=-1$ and $x_{2}=1$ be two signals that are transmitted with equal probability. If signal $x_{i}, i \in \{1,2\}$ is transmitted, the received signal is $y=x_{i}+n_{i}$, where $n_{i}$ is Gaussian distributed with mean $\mu_{i}$ and variance $\sigma_{i}$. At the receiver, knowing $y$, your job is to detect whether $x_{1}$ or $x_{2}$ was sent. Let $\theta$ be the detection threshold: if $y<\theta$, we declare that $x_{1}$ was transmitted; otherwise, we declare that $x_{2}$ was transmitted. In general, which of the following is true?

- If $\sigma_{1}>\sigma_{2},$ the optimal detection threshold $\theta^{\star}$ to minimize the probability of error is $\leq 0$.
- If $\sigma_{1}<\sigma_{2},$ the optimal detection threshold $\theta^{\star}$ to minimize the probability of error is $\leq 0$.
- If $\mu_{1}>\mu_{2},$ the optimal detection threshold $\theta^{\star}$ to minimize the probability of error is $\leq 0$.
- If $\mu_{1}<\mu_{2},$ the optimal detection threshold $\theta^{\star}$ to minimize the probability of error is $\leq 0$.
- None of the above.
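One way to build intuition for the options above is to compute the error probability $P_e(\theta) = \tfrac{1}{2}P(y \geq \theta \mid x_1) + \tfrac{1}{2}P(y < \theta \mid x_2)$ numerically and search for the minimizing threshold. Below is a minimal sketch (not part of the original question): the parameters `mu1`, `mu2`, `s1`, `s2` are example values I chose, and `s1`, `s2` are standard deviations rather than the variances $\sigma_i$ of the problem statement.

```python
import math

def norm_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def error_prob(theta, mu1=0.0, mu2=0.0, s1=1.0, s2=1.0, x1=-1.0, x2=1.0):
    # P(error) = 0.5 * P(y >= theta | x1 sent) + 0.5 * P(y < theta | x2 sent),
    # where y | x_i is Gaussian with mean x_i + mu_i and std dev s_i.
    p_err_given_x1 = 1.0 - norm_cdf((theta - (x1 + mu1)) / s1)
    p_err_given_x2 = norm_cdf((theta - (x2 + mu2)) / s2)
    return 0.5 * (p_err_given_x1 + p_err_given_x2)

def best_threshold(**kw):
    # Brute-force grid search over theta in [-3, 3] with step 0.001.
    grid = [i / 1000.0 for i in range(-3000, 3001)]
    return min(grid, key=lambda t: error_prob(t, **kw))

# Symmetric example: mu1 = mu2 = 0 and equal noise spread.
theta_star = best_threshold()
```

In the fully symmetric case the search returns $\theta^\star = 0$, and re-running it with unequal means or spreads shows how $\theta^\star$ moves, which is exactly what the answer choices ask about.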