Quote Originally Posted by richard
...imho most of the time using ticks is pointless; by the time you process the reading and then display it, it's already meaningless and out of date. If you want to time events to that accuracy, trigger times need to be allowed for and properly synced; there are much better ways.
ps
The K22 interrupt vagaries are a separate issue and unrelated to ticks not being 100ths of a sec.
I disagree. This is the first time I've had a specialized need for a timer, and it requires knowing how long a tick really is. I want to debounce using interrupts: I press a button, measure how long it takes for the contact to close and open again, take a rough average, and add maybe 25%. That's my debounce time in milliseconds.
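In code it comes down to something like this. It's just a rough sketch in plain C (not PBP), and the names DEBOUNCE_MS, elapsed_ms and button_edge_is_valid are mine, not from Darrel's include; it also assumes a 1 ms counter driven by a timer interrupt. Once you have the measured bounce time plus the 25% margin, you simply ignore any further edges until that many milliseconds have gone by.

Code:
#include <stdint.h>
#include <stdbool.h>

/* measured bounce time plus ~25% margin, in milliseconds */
#define DEBOUNCE_MS  25u

/* bumped by a 1 ms timer interrupt (hardware setup not shown) */
volatile uint32_t elapsed_ms = 0;

static uint32_t last_edge_ms = 0;

/* call this from the button edge interrupt;
   returns true only for edges that land outside the debounce window */
bool button_edge_is_valid(void)
{
    uint32_t now = elapsed_ms;

    if ((now - last_edge_ms) < DEBOUNCE_MS)
        return false;            /* still bouncing, ignore this edge */

    last_edge_ms = now;          /* accept it and restart the window */
    return true;
}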

I know others use timers in what I consider a more complicated way. I have no clue how a preload value works and can never remember whether the timer counts up or down from there (I have a hard time remembering little details like that). To them it's easier; I prefer the elapsed-timer approach.

I turn an LED on and off, check the interval on the Saleae probe (I should get a commission every time I type that), and that's it. Ticks = 10 ms has been stuck in my brain ever since I first saw it in Darrel's include. I just count the ticks until I get my interval.
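For anyone who hasn't seen it, the idea looks roughly like this. Again, a sketch in plain C rather than actual PBP/DT_INTS code; timer_tick_isr() stands in for whatever 10 ms tick interrupt the include gives you, LED_Toggle() is a placeholder for the real port write, and the 500 ms interval is just an example.

Code:
#include <stdint.h>

#define TICK_MS        10u    /* one tick = 10 ms                         */
#define INTERVAL_MS    500u   /* the interval I want to see on the Saleae */
#define INTERVAL_TICKS (INTERVAL_MS / TICK_MS)

volatile uint16_t ticks = 0;  /* elapsed-tick counter, bumped every 10 ms */

void LED_Toggle(void);        /* placeholder: toggle the real port pin here */

/* hook this to the 10 ms timer interrupt (hardware-specific) */
void timer_tick_isr(void)
{
    ticks++;
    if (ticks >= INTERVAL_TICKS) {   /* 50 ticks x 10 ms = 500 ms elapsed */
        ticks = 0;
        LED_Toggle();                /* edge to measure on the analyzer   */
    }
}

If the edges on the probe really are 500 ms apart, the tick is the 10 ms you think it is; if not, it tells you exactly how far off it is.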

When you think about it, if we were that concerned with accuracy, we'd program in assembler.

Robert