Here's one that's got me scratching my head so far.
I'm interfacing to an LED PWM chip that, as part of its display routine, needs a constant pulse train fed to one pin and, on a second pin, one pulse for every 4096 pulses of the first.
Now, the first part is easy: I just enable CLKOUT and feed that to the chip. The second part is throwing me, though. I'll be the first to admit that the whole timer concept kind of throws me, but I have a feeling that's where the solution lies. I'd really like to avoid interrupts if at all possible, since they add overhead, but if that's how this cat must be skinned, then so be it.
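For what it's worth, the math seems to line up: at 4 MHz, CLKOUT runs at Fosc/4 = 1 MHz, which is the same rate TMR0 ticks when it's clocked internally, so an 8-bit TMR0 behind a 1:16 prescaler should roll over once every 256 x 16 = 4096 CLKOUT pulses. If I wanted to dodge ON INTERRUPT entirely, I'm guessing something like this polled version might work (totally untested):
Code:
' polled sketch: watch the TMR0 overflow flag instead of using ON INTERRUPT
' (T0IF sets on every rollover whether or not the interrupt is enabled)
' assumes OPTION_REG = %10000011 as in my test code below, prescaler on TMR0 at 1:16
start:
if INTCON.2 = 1 then ' T0IF: TMR0 rolled over, so 4096 clocks have gone by
PORTB.7 = 1 ' fire one pulse on the second pin
PORTB.7 = 0
INTCON.2 = 0 ' clear the flag for the next go-round
endif
goto start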
Here's my test code... it's cobbled together from various sources, so please don't nail me to the wall if it really doesn't make sense.
Code:
@ device pic16F648A, intrc_osc_clkout, mclr_off, protect_off, lvp_off, wdt_off
DEFINE OSC 4
TRISB = 0 ' all of PORTB set to outputs
CMCON = 7 ' comparators fully digital (they're on PORTA, so not strictly needed for PORTB, but harmless)
OPTION_REG = %10000011 '$83: prescaler assigned to TMR0, PS = 011 -> 1:16, so 256 x 16 = 4096 cycles per overflow
TMR0 = 2 'preload 2 to make up for the increments lost writing TMR0, if you're anal about it starting exactly on the 1st cycle
pause 20
on interrupt goto clkreset
INTCON = %10100000 '$A0: set GIE and T0IE to enable the TMR0 interrupt
start: ' spin here; the handler does the rest
goto start
disable ' keep PBP from checking for interrupts inside the handler
clkreset:
PORTB.7=1 ' one pulse on the second pin per 4096 clocks
PORTB.7=0
TMR0 = 2 'reload; corrects for the increments lost in the write
INTCON.2 = 0 'clear just T0IF; leave GIE and T0IE set
resume
enable
And so I'm (hopefully) being clear, here's what the code would look like if I didn't need it running in the background constantly (here PORTB.7 is the bit-banged clock and PORTB.6 gets the once-per-4096 pulse).
Code:
i VAR WORD ' loop counter; needs a word to count to 4095
start:
for i = 0 to 4095 ' 4096 clock pulses on PORTB.7
PORTB.7=1
PORTB.7=0
next i
PORTB.6=1 ' then one pulse on PORTB.6
PORTB.6=0
goto start
-Matt