Jitter-free 8-bit ADC

I have done quite a few embedded projects using 8-bit microcontrollers such as the ATmega328 and ATmega644, the kind of chips used on Arduino boards.

These chips have a 10-bit ADC for reading analog values. The usual problem when pulling data from these inputs is jitter. For example, you wire a potentiometer as a voltage divider and read it from your software, and you commonly end up with readings that toggle between a couple of different values.

One commonly proposed fix is to average a few samples taken in a row. While this improves the situation, you will still get jitter, and your signal will also react more slowly.
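For reference, averaging might look something like this (a sketch only; the pin name POT and the choice of eight samples are my assumptions, not from the original):

  const int POT = A0;  // assumed: potentiometer on analog pin A0

  int averagedRead() {
    long sum = 0;
    for (byte i = 0; i < 8; i++) {  // sum eight consecutive samples
      sum += analogRead(POT);
    }
    return sum / 8;  // the average jitters less, but not zero, and reacts 8x slower
  }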

Now the question is: these are 8-bit microcontrollers, so why is the ADC 10-bit? I’m not sure this is the real explanation, but coincidentally, 10 bits is just enough to implement a simple hysteresis algorithm that gives you a jitter-free 8-bit value with no added latency.

In fact, the ATmega328 datasheet states that the absolute accuracy of the ADC is between 2 and 4 LSB, which in practice means that the lowest two of the ten bits are expected to be treated as noise and handled in software.

The idea is to store the last “approved” 8-bit reading and compare each new reading to it. You update the stored value only when the new 10-bit reading, compared against the approved value scaled back up to 10 bits, differs by more than the noise margin. The code (Arduino/C++) would be something like:

const int POT = A0;  // analog pin for the potentiometer (adjust to your wiring)
byte approved;       // last accepted 8-bit reading

void setup() {
  approved = analogRead(POT) / 4;  // seed with an initial reading
}

void loop() {
  int p = analogRead(POT);            // raw 10-bit reading, 0..1023
  if (abs(approved * 4 - p) > 3) {    // outside the noise band of the approved value?
    approved = p / 4;                 // accept the new 8-bit value
  }
}

You might be able to optimize this a bit with some bit-math magic, but that would make the snippet messier to read.
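If you are curious, a shift-based variant of the loop (reusing POT and approved from above; this is my sketch, not the original code) could look like:

  void loop() {
    int p = analogRead(POT);
    if (abs((approved << 2) - p) > 3) {  // approved * 4 as a left shift
      approved = p >> 2;                 // p / 4 as a right shift
    }
  }

With avr-gcc the multiplication and division by four will most likely compile down to shifts anyway, so this is mostly cosmetic.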
