I've searched through a stack of datasheets, errata and app notes without any success.
Short version: When using a hardware serial port, is a 2.3% difference between actual and expected baud rates enough to render received data unreliable?
Longer version:
PIC18F6310 at 20 MHz. A 'heartbeat' LED confirms that the chip is running at the proper speed.
The AUSART (port 2) is set up to receive.
Paper calculations and Mister-E's calculator both show two sets of register settings that give a 0% baud-rate error at the speed I'm receiving.
This makes me happy.
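For reference, this is roughly how I double-checked those figures. It's a minimal sketch assuming the standard PIC18 AUSART formulas from the datasheet (baud = Fosc/(64*(SPBRG+1)) with BRGH=0, or Fosc/(16*(SPBRG+1)) with BRGH=1); the 9600 target below is just a placeholder, not my actual rate.

```c
/* Hedged sketch: scan SPBRG/BRGH combinations for near-zero baud-rate error.
 * Assumes the standard PIC18 AUSART formulas:
 *   BRGH = 0: baud = Fosc / (64 * (SPBRG + 1))
 *   BRGH = 1: baud = Fosc / (16 * (SPBRG + 1))
 * TARGET is a placeholder value -- substitute the real incoming rate. */
#include <stdio.h>

static double brg_error_pct(double fosc, unsigned spbrg, int brgh, double target)
{
    double divisor = (brgh ? 16.0 : 64.0) * (spbrg + 1);
    double actual  = fosc / divisor;
    return (actual - target) / target * 100.0;
}

int main(void)
{
    const double FOSC   = 20000000.0;  /* 20 MHz, as on my board            */
    const double TARGET = 9600.0;      /* placeholder target baud rate      */

    for (unsigned spbrg = 0; spbrg < 256; spbrg++) {
        for (int brgh = 0; brgh <= 1; brgh++) {
            double err = brg_error_pct(FOSC, spbrg, brgh, TARGET);
            if (err > -0.5 && err < 0.5)   /* settings within 0.5% of target */
                printf("SPBRG=%3u BRGH=%d -> %+.2f%% error\n", spbrg, brgh, err);
        }
    }
    return 0;
}
```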
Expected bit time of the incoming data is 3,125 µs. The actual bit time, measured on the 'scope at 4 µs resolution, is 3,200 µs, which is about 2.34% too slow.
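As a rough sanity check on that mismatch, here is where the receiver's sample points land across one 8N1 frame (start + 8 data + stop). It assumes the usual async-UART model: resynchronise on the start-bit edge, then sample near the middle of each expected bit period; the AUSART's actual majority-sampling behaviour may differ slightly.

```c
/* Rough sanity check: with the receiver's bit clock 2.34% fast relative to
 * the incoming data, how far off-centre is each sample point in one frame?
 * Assumes resync on the start-bit edge and mid-bit sampling thereafter. */
#include <stdio.h>

int main(void)
{
    const double t_actual   = 3200.0;  /* measured bit time, us       */
    const double t_expected = 3125.0;  /* receiver's expected bit time */

    /* Bit 0 is the start bit, bits 1-8 are data, bit 9 is the stop bit. */
    for (int bit = 0; bit <= 9; bit++) {
        double sample     = (bit + 0.5) * t_expected;  /* receiver's sample time   */
        double centre     = (bit + 0.5) * t_actual;    /* true centre of that bit  */
        double offset_pct = (sample - centre) / t_actual * 100.0;
        int    ok = sample > bit * t_actual && sample < (bit + 1) * t_actual;
        printf("bit %d: sample %8.1f us, %+6.1f%% from centre, %s\n",
               bit, sample, offset_pct, ok ? "inside the bit" : "OUTSIDE the bit");
    }
    return 0;
}
```

On those numbers the last sample point lands about 22% of a bit away from centre, still inside the stop bit.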
I'm occasionally seeing the OERR (overrun) bit set in the RCSTA2 register, even when polling RCIF2 in an exceedingly tight loop.
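Roughly what that receive loop looks like, reduced to a minimal sketch with the usual OERR recovery (toggle CREN to re-enable the receiver). Register and bit names are assumed C18/XC8-style for the second port and may be spelled differently in other compilers' headers.

```c
/* Minimal sketch of the tight receive poll loop, with overrun recovery.
 * Register/bit names (PIR3bits.RC2IF, RCSTA2bits.OERR, RCSTA2bits.CREN,
 * RCREG2) are assumed and may differ between compiler header files. */
#include <xc.h>

void rx_poll_loop(void)
{
    unsigned char c;

    for (;;) {
        if (RCSTA2bits.OERR) {      /* overrun: FIFO filled before we read it */
            RCSTA2bits.CREN = 0;    /* clearing CREN clears OERR...           */
            RCSTA2bits.CREN = 1;    /* ...then re-enable continuous receive   */
        }
        if (PIR3bits.RC2IF) {       /* a byte is waiting in the receive FIFO  */
            c = RCREG2;             /* reading RCREG2 clears RC2IF            */
            /* handle the byte here, keeping this path as short as possible */
            (void)c;
        }
    }
}
```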
I can't really change the incoming data rate.
Is this difference enough to occasionally corrupt the data?
Thoughts?
Thanks in advance.