hi all,

so i've been having some very odd problems with signed math lately, and recently went to the trouble of building a pic serial calculator to test a few things. using the following code as an example of what i'm testing, i am quite confused by the pic's behaviour. in this example, assume a and b are always less than $80 (positive).

in hex (2's complement math), take the following example:
$00 - $05 = $FB
since the result is $80 or greater (bit 7 is set), it is negative.
to find the magnitude of a negative number, invert and add one:
result = ~$FB + 1 = $05

''''''''''''''''''''
a var byte
b var byte
result var byte
neg var bit

neg = 0
a = 0
b = 5

result = a - b 'the answer is -5, in 2's comp: $FB

if (result >= $80) then neg = 1 ' test for negative numbers (bit 7 set)

result = ~(result) ' this yields $04
result = result +1 ' this should yield 5 - the correct magnitude
''''''''''''''''''''

what i get instead is a 16-bit value for result. as soon as i NOT the result, the binary value dumped over serial is 16 bits: $FF04. if i hserout the decimal value, i get 65284. how is this happening?

if someone could please explain this, i'd really appreciate it. i can't see how the pic can allocate a byte for the variable and then write a word to it without crashing. i must be doing something wrong; the pic couldn't function if memory allocation were that untrustworthy - the stack would get hosed.