This is somewhat related to my ongoing pitch detection quest (in that it's what's driven me to ask - I'm seeing occasional glitches in the input signal, which causes a hiccup in the time measured between comparator interrupts once in a while), but it could relate to any situation, not just pitch detection, hence coming back to this thread.
My question relates to 'intelligent' averaging (and rejecting spurious values).
Let's say you have a series of data (counts, samples, pulses, whatever), but within the data occurs the odd sample that's way out of whack. For example...
2, 2, 2, 3, 2, 2, 3, 2, 2, 2, 8, 2, 2, 2, 3, 3, 2
What I'd like to do is average all those samples, but reject any one occurrence that's way out of whack (e.g. if 1 sample in 16 is out of whack, ignore it). So in that example all the 2s & 3s get averaged, but the 8 is ignored... else it'll really dick the average!
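For the simple single-level case, something like this might do - a minimal C++ sketch, untested on real hardware, where the robustAverage name and the 50% tolerance are my own assumptions. The idea is to take the median of the block as a reference (one wild value barely moves a median) and average only the samples inside a band around it:

#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Average a block of samples, ignoring anything more than `tol`
// (as a fraction) away from the block's median. A single wild sample
// barely moves a median, so it makes a safe reference point.
// Assumes positive values (counts, periods, etc.).
float robustAverage(const std::vector<float>& v, float tol = 0.5f)
{
    std::vector<float> sorted = v;
    std::sort(sorted.begin(), sorted.end());
    float median = sorted[sorted.size() / 2];

    float sum = 0;
    int   n   = 0;
    for (float x : v)
        if (std::fabs(x - median) <= tol * median) { sum += x; ++n; }

    return n ? sum / n : median;   // everything rejected: fall back to the median
}

int main()
{
    std::vector<float> block = {2,2,2,3,2,2,3,2,2,2,8,2,2,2,3,3,2};
    printf("average = %.2f\n", robustAverage(block));  // 2.25 - the 8 is ignored
}

On the example block above that gives 2.25: the 2s and 3s are averaged together and the 8 is thrown out.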
OK, if it were that simple I could probably kludge something together - BUT, let's say that the above series of data is then immediately followed by, say...
10, 10, 11, 10, 11, 10, 10, 11, 10
...therefore giving a stream of values like this: 2, 2, 2, 3, 2, 2, 3, 2, 2, 2, 8, 2, 2, 2, 3, 3, 2, 10, 10, 11, 10, 11, 10, 10, 11, 10
There's clearly a problem here, because the first couple of 10s in the second series would appear out of whack to any simple 'moving' check tuned to the first series.
Ideally, I'd like to put in place some form of intelligent averaging routine, where only samples that are close together (in percentage terms, say) get averaged and any spurious samples outside that range (say 1 in 8) are rejected... but if a threshold is breached, the spurious values stop being treated as spurious and instead become new valid data to be averaged.
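One possible shape for that - again only a sketch under assumptions, not a tested routine; the AdaptiveAverager name, the +/-50% band, the 3-in-a-row re-lock count and the 0.25 smoothing weight are all illustrative - is to keep a running average, fold in anything inside a percentage band, count anything outside it, and once enough consecutive out-of-band samples turn up, adopt the level they point at as the new baseline:

#include <cmath>
#include <cstdio>

// Running average that rejects out-of-band samples, but re-locks onto
// a new level once `relockN` consecutive outliers arrive. Assumes
// positive values; all thresholds here are arbitrary.
class AdaptiveAverager
{
public:
    AdaptiveAverager(float tolerance, int relockCount)
        : tol(tolerance), relockN(relockCount) {}

    float update(float x)
    {
        if (!seeded) { avg = x; seeded = true; return avg; }

        if (std::fabs(x - avg) <= tol * avg) {  // in band: normal sample
            avg += (x - avg) * alpha;           // simple exponential average
            outliers = 0;                       // any outlier streak is broken
        } else {                                // out of band: glitch, or a new level?
            candidate = outliers ? candidate + (x - candidate) * alpha : x;
            if (++outliers >= relockN) {        // enough in a row: treat as new valid data
                avg      = candidate;
                outliers = 0;
            }
        }
        return avg;
    }

private:
    float tol, avg = 0, candidate = 0;
    int   relockN, outliers = 0;
    float alpha  = 0.25f;   // smoothing weight; smaller = smoother
    bool  seeded = false;
};

int main()
{
    float stream[] = {2,2,2,3,2,2,3,2,2,2,8,2,2,2,3,3,2,
                      10,10,11,10,11,10,10,11,10};
    AdaptiveAverager a(0.5f, 3);   // +/-50% band, 3 outliers in a row to re-lock
    for (float x : stream)
        printf("in=%4.1f  avg=%5.2f\n", x, a.update(x));
}

Run over the combined stream above, the average sits near 2, shrugs off the lone 8 (one outlier never reaches the re-lock count), and after the third 10/11 in a row re-locks to about 10.25 and tracks the new level. One refinement worth considering: before re-locking, check that the consecutive outliers actually agree with each other, otherwise a few unrelated glitches in a row could drag the baseline somewhere meaningless.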
Clear as mud?