Hello all,

Regarding the entry above:
Does anyone have an idea why Timer1 runs without a start command? Did I do something obviously wrong? Thanks for any feedback.
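Just to illustrate what I mean by a "start command" (a rough sketch only, assuming a PIC16F-style device with XC8-style register names, not my actual code): Timer1 should only count while the TMR1ON bit in T1CON is set.

```c
#include <xc.h>

/* Rough illustration only, assuming a PIC16F-style part and XC8 register
 * names. Timer1 should only count while TMR1ON in T1CON is set. */
void timer1_init(void)
{
    T1CON = 0b00110000;    /* Fosc/4 clock, 1:8 prescale, TMR1ON = 0 -> stopped */
    TMR1H = 0;
    TMR1L = 0;
    PIR1bits.TMR1IF = 0;   /* clear any pending overflow flag */
}

void timer1_start(void)
{
    T1CONbits.TMR1ON = 1;  /* only now should the timer start counting */
}

void timer1_stop(void)
{
    T1CONbits.TMR1ON = 0;
}
```

If the timer counts before timer1_start() is ever called, something must be setting TMR1ON earlier than intended.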

The application is basically a stopwatch. I want to control a loop of about ten minutes, and this should be very exact, meaning about 20 ms over the 10 minutes (roughly 33 ppm). To verify that, I need a good way to check it against real time. For that I send a short serial command after starting and another one after stopping the timed sequence, and I look at the serial terminal on the PC for the timestamps (resolution 1/100 s) when the messages arrive. Unfortunately the difference is not the same on every run. I think it is a buffering problem on the PC side; the PIC itself seems stable.
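Roughly what the marker scheme looks like on the PIC side (a sketch only; uart_putc() and run_timed_sequence() are placeholders for whatever routines the project actually uses):

```c
/* Sketch of the marker scheme. uart_putc() and run_timed_sequence() are
 * placeholders, not real project functions. A single marker byte keeps
 * the transmit time itself short (about 1 ms at 9600 baud). */
void uart_putc(char c);        /* placeholder: send one byte over the UART */
void run_timed_sequence(void); /* placeholder: the ~10 minute loop under test */

void measure(void)
{
    uart_putc('S');      /* start marker, sent just before Timer1 is started */
    timer1_start();

    run_timed_sequence();

    timer1_stop();
    uart_putc('E');      /* stop marker, sent right after Timer1 is stopped */
}
```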

Does anyone have a good idea how I could verify and measure this properly?

Thanks a lot.