I have a device that accepts a voltage and squirts out two bytes of data representing it. I fed in some accurately known test voltages and got the readings below.
How are they getting those numbers from the volts and vice versa?
I have played around with various high-byte/low-byte ideas (multiplying the first byte by something, adding the second, etc.) and can get close, but not close enough.
I can get close at 50V, but it's wildly out at other voltages.
0V   = $0000
10V  = $0010
20V  = $0500
30V  = $0970
40V  = $0E70
50V  = $1360  (if we just take the decimal of this, 4960, it could be right)
60V  = $1860
70V  = $1D60
80V  = $2250
90V  = $2740
100V = $2C30  (however, the decimal of this, 11312, is off by miles)
110V = $3130
120V = $3620
130V = $3B10
140V = $4000
150V = $4470
160V = $4970
170V = $4E72
180V = $5352
190V = $5862
200V = $5B52
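In case it helps anyone spot the pattern, here's a quick Python sketch (my own diagnostic, nothing from the device docs) that converts each reading to decimal and prints the step between successive 10V test points; a plain linear encoding would give a constant step:

```python
# Hypothesis check: turn each two-byte reading into a decimal value and
# look at the step between successive 10V test points. Most steps hover
# around 1264-1280 counts per 10V, i.e. roughly linear after the start.
readings = {
    0: 0x0000, 10: 0x0010, 20: 0x0500, 30: 0x0970, 40: 0x0E70,
    50: 0x1360, 60: 0x1860, 70: 0x1D60, 80: 0x2250, 90: 0x2740,
    100: 0x2C30, 110: 0x3130, 120: 0x3620, 130: 0x3B10, 140: 0x4000,
    150: 0x4470, 160: 0x4970, 170: 0x4E72, 180: 0x5352, 190: 0x5862,
    200: 0x5B52,
}

volts = sorted(readings)
for lo, hi in zip(volts, volts[1:]):
    step = readings[hi] - readings[lo]
    print(f"{lo:3d}V -> {hi:3d}V : step {step}")
```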
If we can't find the formula/logic they are using, perhaps it's a look-up table, and I could plot these points on a graph to reproduce it in Excel?
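The Excel idea can also be done in a few lines of Python: even without the exact formula, linearly interpolating between the measured points should convert any raw reading back to volts within the stated accuracy. A sketch (my own helper, nothing official):

```python
# Lookup-table fallback: linear interpolation between the measured
# (raw, volts) points, so any raw reading maps back to a voltage.
import bisect

points = [  # (raw_reading, volts), sorted by raw_reading
    (0x0000, 0), (0x0010, 10), (0x0500, 20), (0x0970, 30), (0x0E70, 40),
    (0x1360, 50), (0x1860, 60), (0x1D60, 70), (0x2250, 80), (0x2740, 90),
    (0x2C30, 100), (0x3130, 110), (0x3620, 120), (0x3B10, 130),
    (0x4000, 140), (0x4470, 150), (0x4970, 160), (0x4E72, 170),
    (0x5352, 180), (0x5862, 190), (0x5B52, 200),
]
raws = [r for r, _ in points]

def raw_to_volts(raw):
    i = bisect.bisect_left(raws, raw)
    if i == 0:
        return points[0][1]           # at or below the first point
    if i == len(points):
        return points[-1][1]          # beyond the last point
    (r0, v0), (r1, v1) = points[i - 1], points[i]
    return v0 + (v1 - v0) * (raw - r0) / (r1 - r0)

print(raw_to_volts(0x1360))  # an exact table point -> 50.0
```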
There could be a flag bit in the data when it goes to 170V and above; that errant 2 on the end, maybe!
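One way to test the flag-bit idea (purely my assumption, not confirmed for this device): treat the low 4 bits as status flags and the upper 12 bits as the actual measurement, which is a common layout for left-justified 12-bit ADC readings. Shifting right by 4, the readings step by roughly 79-80 counts per 10V over most of the range:

```python
# Flag-bit hypothesis (assumption): low nibble = status flags,
# upper 12 bits = the measurement, like a left-justified 12-bit ADC.
def split_reading(raw):
    flags = raw & 0x000F  # the "errant 2" at 170V+ would live here
    count = raw >> 4      # 12-bit measurement, 0..4095
    return count, flags

count, flags = split_reading(0x4E72)  # the 170V reading
print(count, flags)                   # -> 1255 2
```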
The data it outputs is unlikely to be super accurate, but it should certainly be within a couple of volts.
In the past, a similar device put the data in a single byte which was doubled to give the voltage, so $32 = 50 decimal = 100V.
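For comparison, that old single-byte scheme is trivial to express (just restating the example above):

```python
# Old single-byte scheme from the earlier device: voltage = byte * 2.
old_byte = 0x32        # 50 decimal
print(old_byte * 2)    # -> 100 (volts)
```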
This one is more recent and probably more accurate.
Grateful for any ideas or thoughts. Thanks!