Jul 16, 2005
I'm having an argument with this guy at work...
This is one of those "never possible in reality" things...
Given an analog waveform and an infinite number of sampling points, can't you choose your sampling points so that there are no gaps between them, i.e., produce a sampled function that, given any real number, returns sampled data without any interpolation?
My conjecture is that you can, and at that point your perfectly sampled function is the same as the original waveform. He believes there would still be gaps between your sample points. I said that yes, you could have an infinite number of sample points and still choose to have gaps (sampling only at the rational numbers, for instance), but you didn't need to.
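To make the rationals example concrete, here's how I'd formalize it (my notation, not anything we actually wrote down at work): sampling at a set S of points just gives you the original function restricted to S, and the rationals are an infinite, dense choice of S that still misses almost every real number.

```latex
% Sampling f at a set S of points gives the restriction f|_S:
\[
  f\big|_{S}(t) = f(t) \quad \text{for } t \in S,
  \qquad \text{undefined for } t \notin S .
\]
% S = \mathbb{Q} is countably infinite and dense in \mathbb{R},
% yet it has measure zero: almost every real t lands in a gap.
% Only S = \mathbb{R} leaves no gaps, and then f|_S is just f itself.
```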
This all started from an argument we were having about whether it was possible for digital representations to ever be as good as analog. He said it was not possible, I said it was.
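For the digital-vs-analog part, the Nyquist-Shannon sampling theorem already says a bandlimited signal is completely determined by a countable set of evenly spaced samples, so you don't even need the "sample everywhere" trick in that case. Here's a quick numpy sketch I put together to convince myself (the rate, tone, and sample count are arbitrary numbers of my own) that sinc interpolation recovers values at arbitrary real instants from discrete samples:

```python
import numpy as np

fs = 100.0                     # sample rate (Hz), well above twice the tone
T = 1.0 / fs
n = np.arange(100)             # 100 samples = 1 second of signal
f0 = 5.0                       # 5 Hz test tone, far below fs / 2
samples = np.sin(2 * np.pi * f0 * n * T)

def reconstruct(t):
    """Whittaker-Shannon formula: x(t) = sum_n x[n] * sinc((t - n*T) / T)."""
    return np.sum(samples * np.sinc((t - n * T) / T))

t = np.pi / 10                 # an irrational instant between sample points
print(reconstruct(t))          # ~ sin(2*pi*f0*t), up to truncation error
print(np.sin(2 * np.pi * f0 * t))
```

The catch, of course, is "bandlimited", and the exact theorem needs infinitely many samples; with a finite window you get a small truncation error near the edges.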
Correct me if I'm wrong about this, but in the real world there comes a point when, no matter how good your detector of the original waveform is, there is still a margin of error (uncertainty principle), and as soon as the distance between any two adjacent sample points becomes less than that margin of error, the digital representation becomes just as good as the analog one.
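Here's a toy numerical experiment along those lines (the detector noise figure and everything else are made-up numbers, purely for illustration): once the sample spacing is fine enough that interpolation error drops below the detector's error floor, cranking the sample density further doesn't buy you anything.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Analog" ground truth: a smooth 3 Hz waveform on [0, 1).
f0 = 3.0
truth = lambda t: np.sin(2 * np.pi * f0 * t)

t_fine = np.linspace(0.0, 1.0, 10_000)   # stand-in for "continuous" time
noise_rms = 1e-3                          # hypothetical detector error floor

for n_samples in (10, 100, 1_000, 10_000):
    # Detector readings: truth plus measurement noise at each sample point.
    t_s = np.linspace(0.0, 1.0, n_samples)
    readings = truth(t_s) + rng.normal(0.0, noise_rms, n_samples)
    # Rebuild the waveform from the samples (plain linear interpolation).
    digital = np.interp(t_fine, t_s, readings)
    err = np.sqrt(np.mean((digital - truth(t_fine)) ** 2))
    print(f"{n_samples:>6} samples: RMS error {err:.2e}")

# The error stops shrinking once interpolation error < noise_rms: past that
# density, the digital copy is as good as the detector itself.
```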