Hi,

I'm adding (or rather, want to add) some features to my battery charger.

The aim is to sum the milliampere-hours that the charger has delivered to the battery (a kind of charge counter).

The maximum instantaneous current is 1400 mA; the charging time can be up to six hours (depending on the battery's charge level at the start of the charging process), and the program samples the current every second.

If I store a 3600th of an mAh every second, then at full current I get 1400 mA / 3600 = 0.3888... mAh per sample, which is not easy to store in an integer variable unless it is scaled first.

In the worst case there will be 21,600 samples (3600 seconds * 6 hours).

I could simply multiply each measured current by 3 (the 3 comes from fitting the sum into a WORD-sized variable: 65,535 / 21,600 samples is about 3) and, when it is time to display the result, divide the summed samples by 3 again. But per sample that gives (1400 * 3) / 3600 = 1.1666, and since the stored value is truncated to 1, I lose a lot of precision...
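
In code, that scheme would look something like this (read_current_mA() is just a placeholder for my actual measurement routine):

    #include <stdint.h>

    /* Placeholder for the real per-second current measurement, in mA. */
    extern uint16_t read_current_mA(void);

    /* Scaled charge total; worst case 3 * 8400 mAh = 25,200, fits a WORD. */
    static uint16_t sum_x3;

    /* Called once per second. */
    void sample_x3(void)
    {
        /* 1400 mA gives (1400 * 3) / 3600 = 1, not 1.1666; worse, anything
         * under 1200 mA truncates to zero and contributes nothing at all. */
        sum_x3 += (uint16_t)(((uint32_t)read_current_mA() * 3u) / 3600u);
    }

    uint16_t total_mAh_x3(void)
    {
        return sum_x3 / 3u;   /* undo the x3 scaling for display */
    }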

Another way would be to multiply the measured current by 100, as in (1400 * 100) / 3600 = 38 (truncated from 38.888), then store the running total in two WORD-sized variables (one High, the other Low, although I don't know how to do this) and, as before, divide the sum by 100 when it is time to display the result. Here I also lose precision, less than before but still a loss.
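
Here is my attempt at that two-WORD accumulator, in case someone can confirm the carry handling is right (read_current_mA() is again just a placeholder):

    #include <stdint.h>

    /* Placeholder for the real per-second current measurement, in mA. */
    extern uint16_t read_current_mA(void);

    static uint16_t sum_lo, sum_hi;   /* two WORDs holding one 32-bit total */

    /* Called once per second. */
    void sample_x100(void)
    {
        /* 1400 mA gives (1400 * 100) / 3600 = 38 (truncated from 38.888). */
        uint16_t step   = (uint16_t)(((uint32_t)read_current_mA() * 100u) / 3600u);
        uint16_t before = sum_lo;

        sum_lo += step;
        if (sum_lo < before)   /* low WORD wrapped: carry into the high WORD */
            sum_hi++;
    }

    /* Recombine the two WORDs and undo the x100 scaling for display.
     * Worst case: 38 * 21,600 = 820,800, which is why one WORD is not enough. */
    uint16_t total_mAh_x100(void)
    {
        uint32_t total = ((uint32_t)sum_hi << 16) | sum_lo;
        return (uint16_t)(total / 100u);
    }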

What is the best mathematical way to add up these small numbers without too much computational overhead?
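
P.S. While writing this, it occurred to me that the two-WORD idea could maybe be used without any scaling at all: sum the raw mA readings (one per second, so each is exactly that many milliampere-seconds) and do the single division by 3600 only at display time. Something like:

    #include <stdint.h>

    /* Placeholder for the real per-second current measurement, in mA. */
    extern uint16_t read_current_mA(void);

    /* Total in milliampere-seconds; could also be split into two WORDs
     * with the carry trick above. */
    static uint32_t sum_mAs;

    /* Called once per second: 1 s at I mA adds exactly I mA*s, no rounding. */
    void sample_raw(void)
    {
        sum_mAs += read_current_mA();
    }

    /* 1 mAh = 3600 mA*s. Worst case 1400 * 21,600 = 30,240,000 fits in 32 bits. */
    uint16_t total_mAh(void)
    {
        return (uint16_t)(sum_mAs / 3600u);
    }

Would that be the right way, or is there something better?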