Below is part of a simple little program I've been experimenting with to generate a 'random' pause with a known maximum and minimum period (as the program stands, the average maximum wait for the LED to flash is about 12 seconds).
The pause occurs randomly, but because RANDOM produces at most 65535 'lengths', if you will, of some fixed unit of time (clock cycles or whatever it may be; I'm not sure how RANDOM works), it should to my mind be possible to work out how long the longest and shortest pauses will be, since they occur at 1 and 65535 lengths respectively.
So my question is this: how long does it take the PIC processor to work through 1 length? At the moment the maximum and minimum pause times come out at about 12 seconds and 1 second.
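(For what it's worth, if I've read the manual right, PAUSE counts in milliseconds and RANDOM fills a 16-bit word, so PAUSE X / 4 ought to run from roughly 0 ms up to 65535 / 4 = 16383 ms, call it 16 seconds, but I may well be misreading it, hence the question.)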
I hope this makes sense
Code:
X VAR WORD

main:
    RANDOM X        ' load X with a new pseudo-random 16-bit value
    PAUSE X / 4     ' random pause of X / 4 units
    HIGH PORTA.2    ' flash the LED for one second
    PAUSE 1000
    LOW PORTA.2
    GOTO main       ' no colon after the label name in a GOTO
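In case it helps show what I'm after, here's an untested sketch of the sort of thing I had in mind for pinning the pause to a known window (roughly 1 to 12 seconds), assuming I have PBP's // modulus operator right:

Code:
X VAR WORD

main:
    RANDOM X                    ' X = new pseudo-random 16-bit value
    PAUSE 1000 + (X // 11000)   ' X // 11000 gives 0 - 10999, so wait 1000 - 11999 ms
    HIGH PORTA.2                ' flash the LED for one second
    PAUSE 1000
    LOW PORTA.2
    GOTO main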
Dave