A monochrome video signal ranging from $0$ to $8 \mathrm{~V}$ is digitized using an $8$-bit $\mathrm{ADC}$.
- Determine the resolution of the $\mathrm{ADC}$ in $\mathrm{V} / \mathrm{bit}$.
- Calculate the mean squared quantization error.
- Suppose the ADC is counter controlled, using an up-counter triggered on the positive clock edge with a clock frequency of $1 \; \mathrm{MHz}$. What is the time taken, in seconds, to obtain the digital equivalent of $1.59 \mathrm{~V}$?
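The three quantities can be checked numerically. This sketch assumes the common textbook conventions: resolution defined as full-scale range over $2^n$, mean squared quantization error $\Delta^2/12$ for uniform quantization, and a counter-type ADC that stops at the first count whose DAC output reaches the input (hence the ceiling), advancing one count per clock period.

```python
import math

V_FS = 8.0        # full-scale range in volts
N_BITS = 8        # ADC word length
F_CLK = 1e6       # clock frequency in Hz
V_IN = 1.59       # input voltage for part (c)

# Resolution (step size): full-scale range divided by 2^n
delta = V_FS / 2**N_BITS          # 0.03125 V/bit

# Mean squared quantization error for a uniform quantizer
mse = delta**2 / 12               # ~8.14e-5 V^2

# Counter-type ADC: counts needed until DAC output first reaches V_IN,
# one count per clock period
counts = math.ceil(V_IN / delta)  # 51 counts
t_conv = counts / F_CLK           # 5.1e-5 s

print(f"Resolution: {delta} V/bit")
print(f"MSE quantization error: {mse:.4e} V^2")
print(f"Conversion time: {t_conv:.2e} s ({counts} clock cycles)")
```

Note that $1.59/0.03125 = 50.88$ is not an integer, so the counter must advance to count $51$ before the comparator trips, giving a conversion time of $51 \times 1\,\mu\mathrm{s} = 51\,\mu\mathrm{s}$.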