keithdoxey
- 12th February 2007, 20:44
Hi All,
Several of the posts here recently have been about problems receiving or transmitting serial data.
I was wondering what the maximum allowable error is to still retain reliable communications. I realise that this is probably a similar question to "how long is a piece of string" but I have been thinking about this quite a bit and playing with Mister_E's superb little utility.
I currently have a project with several PICs receiving IR data and forming messages to be sent to the main processor. The IR receiver PICs are 16F88s running at 4MHz and the main CPU is an 18F452 running at 20MHz. Comms are at 9600 baud at the moment, although I may speed this up once I have everything working OK.
It occurs to me that the actual error is not the overall consideration but the difference between the error rates.
e.g. if both PICs are running with an error of +8.5%, then the difference is zero and comms should be perfect; but if the two devices have different error rates, the situation could be totally different.
If device A was +8.5% but device B was -1.5%, then the overall discrepancy would be 10%.
How far out of sync can the two devices be before the error is too great?
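As a rough sketch of the relative-error idea, here is a quick check of whether two UARTs with given baud-rate errors can still frame a byte. It assumes a standard 10-bit frame (start + 8 data + stop) with the receiver re-syncing on the start-bit edge and sampling mid-bit; it ignores oversampling granularity and edge-detection jitter, so the real margin is a bit tighter than this suggests:

```python
def frame_ok(tx_error_pct, rx_error_pct, bits_per_frame=10):
    """Return True if the last bit is still sampled inside its bit cell."""
    # What matters is the *relative* error between the two clocks.
    rel = (tx_error_pct - rx_error_pct) / 100.0
    # The receiver re-syncs on the start-bit edge, so drift only
    # accumulates over one frame. The last sample point is nominally
    # at (bits_per_frame - 0.5) bit times; it must stay within
    # half a bit of the true bit centre.
    last_sample = bits_per_frame - 0.5
    drift = abs(rel) * last_sample
    return drift < 0.5

# Both PICs +8.5% fast: relative error is zero, so comms are fine.
print(frame_ok(8.5, 8.5))    # True
# +8.5% vs -1.5%: 10% relative error, last bit lands outside its cell.
print(frame_ok(8.5, -1.5))   # False
# The ideal limit works out to about 0.5 / 9.5 = 5.26% relative error.
print(frame_ok(3.0, -2.0))   # True (5% relative, just inside)
```

This matches the usual rule of thumb that each side should stay within roughly 2-3% so the combined error stays safely under ~5%.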
Thanks