I’ve noticed recently that the number of bits available in an ADC is slowly creeping up. That makes me excited… until I look at the system accuracy. How do you tell what you are actually reading with an integrated ADC? Usually, you measure the internal voltage reference (IVR). Then you do some math to figure out what your rails are. For single-ended measurements, this is usually ground and VCC. So, what good is 12 bits of ADC resolution if we only have six bits of accuracy from the IVR? Yes, you heard me right, only six bits. If you have an IVR with +/- 3% accuracy, that ends up being about six bits of accuracy. I’ll walk you through it.
Let’s say we have a 1.0 V IVR with a +/- 3% tolerance. That means it could be anywhere between 0.97 and 1.03 V, for a total range of 0.06 V. To find how many bits of resolution that gives us, we say 0.06 V >= VREF+/2^n, where n = the number of bits of resolution and VREF- = GND. If we have VREF+ = 3.3 V, that gives us n = -log2(0.06/3.3) = 5.78 bits. Working this backwards to verify, we see that for six bits of resolution the LSB is 3.3 V / 2^6 = 0.052 V, which is less than our margin of error. So, we have slightly less than six bits of absolute accuracy.
If you are using your ADC to monitor absolute voltage, any bits after six are wasted, just giving you a false sense of accuracy. This can be dangerous when we optimize an algorithm for a particular device. We may assume that the level of precision we have is also the level of accuracy we have. This will bite us later when the lot-to-lot or device-to-device deviations in manufacturing suddenly move the IVR by 30 mV when we designed our algorithm for 12 bits of precision (which equates to 0.8 mV with a 3.3 V VREF+).
If you need more absolute accuracy, you can connect an external precision voltage reference to VREF+. Depending on your application, you may be able to rely on the stability of VCC and assume its value. When you are running off a battery and VCC can change, that is not an option. Now you know.