Hi guys, I was wondering how PBP handles BIT variable declarations relative to BYTE declarations, in terms of memory / code space usage.

When I need many flags, I have generally defined a byte and then aliased each bit of that byte to the flag names I require:


Code:
bit_byte1 VAR BYTE        ' one byte holds up to eight flags
  bit0 VAR bit_byte1.0    ' alias each bit to its own flag name
  bit1 VAR bit_byte1.1
  bit2 VAR bit_byte1.2
  bit3 VAR bit_byte1.3
  bit4 VAR bit_byte1.4
  bit5 VAR bit_byte1.5
  bit6 VAR bit_byte1.6
  bit7 VAR bit_byte1.7
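For context, the aliases then get used like any other variable; a minimal sketch, using the names declared above:

Code:
bit0 = 1              ' set a flag
IF bit3 = 1 THEN      ' test a flag
  bit3 = 0            ' clear it
ENDIF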

Where I only need a single flag, I declare it as a BIT on its own:

Code:
mybit VAR BIT

This is done on the assumption that, when the code is compiled, the memory use will be less than if I had defined a whole byte.

Does this hold true?