Quote Originally Posted by Darrel Taylor:
I've been playing with the NCD/DCD in Excel, and there's something I can't seem to make work in there.

Granted, I'm relying on 30-year-old memories from when I was working on microwave radios in the Air Force. But if memory serves, a change of +3 dB is double the power, and -3 dB is half the power. And since the impedance is constant, it also means the voltage is either doubled or halved.

So let's assume the signal is scaled so that the (maximum A/D reading + 1) corresponds to +7 dBm. Then half of that would be 512, or +4 dBm.

If you keep dividing that down, since there are only 10 bits, you end up with a minimum of -23 dBm at an A/D input of 1.

And it appears it would take 16-bit resolution to get down to -40 dBm.
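Here's that arithmetic as a quick C sketch, just to check my numbers. The +7 dBm full-scale point is only the assumed example above, the helper name is my own, and the 3 dB-per-halving step comes from treating the A/D count as proportional to power:

```c
#include <stdio.h>
#include <math.h>

/* Assumed scaling from the example above: a full-scale count
   (maximum A/D reading + 1) corresponds to +7 dBm. */
#define FULL_SCALE_DBM 7.0

/* dBm for a raw A/D reading, treating the count as proportional to
   POWER, so each halving of the count is a -3 dB step. */
double count_to_dbm(unsigned long reading, unsigned bits)
{
    double full_scale = (double)(1UL << bits);   /* 1024 for 10 bits */
    return FULL_SCALE_DBM + 10.0 * log10((double)reading / full_scale);
}

int main(void)
{
    printf("10-bit, count 512: %+.1f dBm\n", count_to_dbm(512, 10)); /* +4.0  */
    printf("10-bit, count   1: %+.1f dBm\n", count_to_dbm(1, 10));   /* -23.1 */
    printf("16-bit, count   1: %+.1f dBm\n", count_to_dbm(1, 16));   /* -41.2 */
    return 0;
}
```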



I think this applies however you do it: lookup table, calculation, ...

ADDED: And any noise or offset voltages will obliterate the low end.

But then, maybe I've got it all wrong???
I was always taught (having worked in the audio industry on digital mixing consoles, and having spent years designing synthesisers and samplers as a hobby) that the actual figure is around 6 dB per bit, so 10 bits gives you 60 dB, though in reality it's more like 4.5 dB. I might be wrong, but this is what I've always been told. Surely all you have to do is shift the levels?
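For what it's worth, the 6 dB-per-bit figure comes from treating each halving of the A/D code as a halving of voltage (amplitude), i.e. 20*log10(2), about 6.02 dB per bit. A quick sketch of that rule of thumb, using my own hypothetical helper:

```c
#include <stdio.h>
#include <math.h>

/* Ideal dynamic range of an n-bit converter, treating the code as a
   VOLTAGE (amplitude) ratio: 20*log10(2^n) = n * 6.02 dB,
   the usual "6 dB per bit" rule. */
double ideal_range_db(unsigned bits)
{
    return 20.0 * bits * log10(2.0);
}

int main(void)
{
    printf("10 bits: %.1f dB\n", ideal_range_db(10)); /* ~60.2 dB */
    printf("16 bits: %.1f dB\n", ideal_range_db(16)); /* ~96.3 dB */
    return 0;
}
```

On that reading, shifting a sample left or right by one bit moves its level by about 6 dB.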