
A monochrome video signal that ranges from $0$ to $8 \mathrm{~V}$ is digitized using an $8$-bit $\mathrm{ADC}$.

  1. Determine the resolution of the $\mathrm{ADC}$ in $\mathrm{V} / \mathrm{bit}$.
  2. Calculate the mean squared quantization error.
  3. Suppose the ADC is counter controlled. The counter is an up-counter, positive-edge triggered, with a clock frequency of $1 \; \mathrm{MHz}$. What is the time taken, in seconds, to get the digital equivalent of $1.59 \mathrm{~V}$?
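
Not a posted answer, but here is a minimal Python sketch of the standard approach, assuming a uniform quantizer (mean squared error $\Delta^2/12$) and a counter-type ADC that counts up from zero, advancing one step per positive clock edge until the internal DAC output first reaches the input:

```python
import math

# Given parameters from the question
full_scale = 8.0   # V, input range 0..8 V
n_bits = 8         # 8-bit ADC
f_clk = 1e6        # Hz, counter clock frequency
v_in = 1.59        # V, input voltage in part 3

# Part 1: resolution (step size) = full-scale range / 2^n
step = full_scale / (2 ** n_bits)       # 8 / 256 = 0.03125 V per step
print(f"Resolution: {step} V")

# Part 2: for a uniform quantizer, MSE = step^2 / 12
mse = step ** 2 / 12                    # ~8.14e-05 V^2
print(f"Mean squared quantization error: {mse:.3e} V^2")

# Part 3: counts needed before the DAC output first reaches v_in,
# assuming the counter starts at 0 and adds one count per clock edge
counts = math.ceil(v_in / step)         # ceil(1.59 / 0.03125) = 51
t_conv = counts / f_clk                 # 51 clock periods at 1 MHz
print(f"Counts: {counts}, conversion time: {t_conv:.2e} s")  # 5.10e-05 s
```

Under these assumptions the conversion takes $51$ clock cycles, i.e. $51 \; \mu\mathrm{s} = 5.1 \times 10^{-5} \; \mathrm{s}$; if instead the counter were assumed to stop at the nearest level below the input ($50$ counts), the answer would be $5.0 \times 10^{-5} \; \mathrm{s}$.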