Through software interpolation I think you'd be able to measure a repetitive waveform @500MHz, but for one-shot events I fail to see how.
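Roughly, the idea could look like this in numpy -- a sketch only, not anyone's actual design; the 500MHz signal, 2G samples/sec ADC and the perfectly-known trigger delay per shot are all assumptions for illustration:

```python
# Rough sketch of "software interpolation" / equivalent-time sampling of a
# REPETITIVE waveform captured with only ~4 real-time points per period.
# All numbers and the exactly-known trigger delay are assumptions, not a spec.
import numpy as np

F_SIG  = 500e6                 # repetitive signal under test
FS     = 2e9                   # real-time sample rate: 4 samples per period
PERIOD = 1.0 / F_SIG
N_SHOTS, N_SAMP = 200, 64      # many short triggered acquisitions

def signal(t):
    # stand-in for the repetitive waveform (a rounded "square" wave)
    return np.tanh(5 * np.sin(2 * np.pi * F_SIG * t))

rng = np.random.default_rng(0)
phases, values = [], []
for _ in range(N_SHOTS):
    t0 = rng.uniform(0, PERIOD)            # random trigger-to-sample delay,
    t  = t0 + np.arange(N_SAMP) / FS       # assumed to be measured exactly
    phases.append(np.mod(t, PERIOD))       # fold every sample into one period
    values.append(signal(t))

phase = np.concatenate(phases)
value = np.concatenate(values)
order = np.argsort(phase)                  # composite trace across one 2ns period
phase, value = phase[order], value[order]

print(f"{len(phase)} effective points per period "
      f"instead of {FS / F_SIG:.0f}")       # thousands of points vs 4
```

That only works because the waveform repeats; a one-shot event never comes back, so there is nothing to fold together -- which is the point above.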
@500MHz with 2G samples / second -- that's only 4 sample points per period. Without black magic, how in the world could you capture much with that? (it might suffice purely in the digital domain, but it certainly won't capture glitches)
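Just to spell out the arithmetic behind that (nothing clever, only the numbers quoted above):

```python
# Numbers from the posts above: 500 MHz signal, 2 GS/s ADC.
fs, f_sig = 2e9, 500e6
print(fs / f_sig)      # 4.0 -> only 4 real-time samples per period
print(fs / 2 / f_sig)  # 2.0 -> the signal sits only a factor of 2 under Nyquist
```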
I don't really know. But I thought if the "Golden Rule" of 5x would work, maybe silver was ok too.
And I guess I'm also trying to figure out what it would take to build what the original post describes.
Assuming you want a 5MHz "bandwidth" -- which way do you mean?
5MHz analogue bandwidth? Or 5M samples/sec?
DT
When I say x5 -- I generally mean that if you consider 100MHz to be your highest point of measurement, then ideally you'd probably be best off with a 500MHz scope. I think, however, this old saying is more applicable to analogue scopes than anything.
5MHz bandwidth for a DIY scope? -- personally, I'd be chasing 20M samples / second or more.
But without complex software algorithms (that I don't know enough about yet) -- I estimate the usable bandwidth to be about 1MHz @ 20M samples / second.
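For what it's worth, here's my reading of the rule-of-thumb arithmetic behind those figures (my interpretation, not a measured spec):

```python
# Rule-of-thumb arithmetic for a 20 MS/s DIY scope (interpretation of the
# estimates above, not a measurement).
fs = 20e6
print(fs / 2 / 1e6)    # 10.0 MHz - hard Nyquist ceiling
print(fs / 5 / 1e6)    # 4.0 MHz  - the "x5 golden rule" applied to sample rate
print(fs / 20 / 1e6)   # 1.0 MHz  - ~20 points per period, roughly the usable
                       #            bandwidth without clever reconstruction
```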
This statement has a lot to do with it, I feel ...
The fundamental AC waveform is the sine wave; a square wave is made up of many sine waves (the fundamental plus its odd harmonics). We certainly have a square wave @ 500MHz w/ 2G samples / second.
The question is: how do we extract it? (rough sketch below)
Qualified engineers like Melanie & Jerson should have little difficulty answering this.
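On the "how do we extract it" question, a rough numpy sketch of the Fourier-series point -- the numbers are the ones from this thread, but the code itself is only an illustration, not a recommended method:

```python
# A 500 MHz square wave = sum of sines at 500 MHz, 1.5 GHz, 2.5 GHz, ...
# At 2 GS/s the Nyquist limit is 1 GHz, so only the 500 MHz fundamental can be
# captured directly; everything above it aliases. Illustrative assumptions only.
import numpy as np

F_SIG, FS = 500e6, 2e9
NYQUIST = FS / 2

# Fourier series: square(t) = (4/pi) * sum over odd k of sin(2*pi*k*F_SIG*t)/k
t = np.linspace(0, 2 / F_SIG, 1000, endpoint=False)
square = np.zeros_like(t)
for k in range(1, 20, 2):                       # odd harmonics 1, 3, 5, ...
    f_k = k * F_SIG
    below = "below" if f_k <= NYQUIST else "above"
    print(f"harmonic {k}: {f_k/1e6:.0f} MHz "
          f"({below} the {NYQUIST/1e9:.0f} GHz Nyquist limit)")
    square += (4 / np.pi) * np.sin(2 * np.pi * f_k * t) / k

# 'square' approximates the ideal square wave from its harmonics; the printout
# shows only the fundamental sine survives a 2 GS/s real-time capture, which is
# why a 500 MHz square wave tends to come back out looking like a sine.
```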