Woohoo, software will be here tomorrow!

OK, I'm trying to measure the voltage coming from a pressure sensor with an output range of 0.2 V to 4.9 V.
On my desk it reads 1.79-1.80 V. The device is supposed to read 0.2 V at 2.9 psi (no 0 psi reading is specified) and 4.9 V at 36.9 psi (max). That leads me to believe it's reading atmosphere as 14.7 psi and therefore outputting ~1.8 V. I can lower the output by pulling a vacuum on it.
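
As a quick sanity check on that (assuming the sensor is simply linear between its two stated points), interpolating to 14.7 psi gives roughly 1.83 V, which lines up with what I'm seeing:

#include <stdio.h>

int main(void)
{
    /* Stated points: 0.2 V at 2.9 psi and 4.9 V at 36.9 psi (absolute). */
    double volts_per_psi = (4.9 - 0.2) / (36.9 - 2.9);   /* ~0.138 V per psi */
    double v_atm = 0.2 + (14.7 - 2.9) * volts_per_psi;   /* ~1.83 V          */
    printf("expected output at 14.7 psi: %.2f V\n", v_atm);
    return 0;
}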
My question: I need the display to read 0 psi at atmosphere, so how do I offset 1.8 V's worth of 10-bit ADC counts in my program, and how do I make the display show a vacuum value for anything below 1.8 V? The vacuum side needs to be a different scale (inHg rather than psi).
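
Roughly the shape of what I'm picturing in code, purely as a sketch (the first two constants are placeholders taken from my possibly-wrong maths below, and I'm assuming the raw reading is just an int from 0 to 1023):

#include <stdio.h>

#define COUNTS_AT_ATMOSPHERE 400.0   /* placeholder: raw count at ~1.8 V, display 0 psi */
#define COUNTS_PER_PSI       28.1    /* placeholder: counts per display psi             */
#define INHG_PER_PSI         2.036   /* 1 psi is roughly 2.036 inHg                     */

/* Turn a raw 10-bit reading into the value the display should show. */
static void show(int raw_counts)
{
    if (raw_counts >= COUNTS_AT_ATMOSPHERE) {
        double psi = (raw_counts - COUNTS_AT_ATMOSPHERE) / COUNTS_PER_PSI;
        printf("%4d counts -> %5.1f psi\n", raw_counts, psi);
    } else {
        double inhg = (COUNTS_AT_ATMOSPHERE - raw_counts) / COUNTS_PER_PSI * INHG_PER_PSI;
        printf("%4d counts -> %5.1f inHg vacuum\n", raw_counts, inhg);
    }
}

int main(void)
{
    show(400);   /* atmosphere: should read 0.0 psi         */
    show(681);   /* ~10 psi above atmosphere by this scale  */
    show(300);   /* below atmosphere: shown in inHg         */
    return 0;
}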
Obviously, I need correct calibration from a display reading of 0 psi up to (36.9 - 14.7) = 22.2 psi, so do I do...

(4.9 V - 0.2 V) / 1024 = 0.0045 V per count.
0.2 V / 0.0045 V = 44.4 counts for the device's minimum output.
1.8 V / 0.0045 V = 400 counts for atmosphere, i.e. a "display" reading of 0 psi at 14.7 psi.
1024 - 400 = 624 counts from a "display" reading of 0 psi up to 22.2 psi.


So, 1 display psi = 624 / 22.2 = 28.1 counts, if anything below 400 counts is ignored.
By these calculations, atmosphere should then be 14.7 x 28.1 = 413.07 counts. Or, coming at it from the psi side: 36.9 - 2.9 = 34 psi, and 34 / 1024 = 0.033 psi per count, but then 400 counts x 0.033 = 13.2 psi and 413.07 counts x 0.033 = 13.63 psi, neither of which is 14.7.
Obviously, something is wrong here.
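
To make the mismatch concrete, here is the same arithmetic typed out as a throwaway C program (it just reproduces the rounded numbers above, nothing clever):

#include <stdio.h>

int main(void)
{
    /* Route 1: spread the sensor's 4.7 V span over all 1024 counts. */
    double volts_per_count = 0.0045;                          /* (4.9 - 0.2) / 1024, rounded  */
    double counts_at_atm   = 1.8 / volts_per_count;           /* 400 counts                   */
    double counts_per_psi  = (1024 - counts_at_atm) / 22.2;   /* ~28.1 counts per display psi */
    printf("atmosphere = %.1f counts\n", 14.7 * counts_per_psi);   /* ~413, not 400  */

    /* Route 2: spread the 34 psi sensor range over all 1024 counts. */
    double psi_per_count = (36.9 - 2.9) / 1024.0;             /* ~0.033 psi per count */
    printf("400 counts = %.1f psi\n", 400 * psi_per_count);        /* ~13.3, not 14.7 */
    return 0;
}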

Please help me prove I'm not insane. Why do I always try maths after a beer and past my bedtime?

Regards,

James