Hi Folks,

I've read and read, and I'm in need of some advice. I am completely green, so I would appreciate a push in the right direction. Here is what I am trying to do: I have an electronic tilt sensor that outputs a 4-byte hex string on an RS232 line when an ASCII character is sent to it. For example, if I send it a "C" via a terminal program, it may respond with 85F (2143 decimal); the fourth byte isn't shown because it is a CR. To get an actual angle I then compute (2143 - 2048) * 0.0125, which works out to about 1.19°. BTW, 2048 and 0.0125 are fixed constants.
So far I've only successfully managed to send the ASCII character from my PIC to the tilt sensor, and what I get back is nonsense. What I'm really having a hard time with is how to set up my variables to receive the 4-byte string (ignoring the last byte), do the math, and then display the result correctly on my LCD.
Am I looking at setting up an array, naming the bytes in the array, doing the math, and then sending the result out to the LCD?

Currently I am using a 16F877A on a LAB-X1 board.

I'm sorry, what I have so far is at work; as soon as I get back on Wednesday I can post it. It is ugly, and I welcome any constructive advice.
Please be patient, I haven't messed with any of this stuff in about 6 years, and even then the most I did were the examples from the RCG Research class!