Originally posted by: n7
The only time aliasing becomes less apparent is with a smaller pixel pitch. Pixel pitch, not resolution, is what determines visibility of aliasing.
That's not accurate. Given the same pixel pitch, a higher resolution should exhibit less aliasing.
None of the posts so far have actually explained what aliasing is. Aliasing is caused by excessively fine detail (high frequencies) being sampled at too low a resolution (sampling rate).
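To make that concrete, here is a minimal numpy sketch (my own example, not from any of the earlier posts): a 9 Hz sine sampled at only 10 samples per second is below the Nyquist limit, and its samples are indistinguishable from a 1 Hz sine -- the fine detail "aliases" to something coarser.

```python
import numpy as np

fs = 10.0                       # sampling rate in samples/sec (Nyquist limit: 5 Hz)
t = np.arange(0, 1, 1 / fs)     # sample instants over one second

fine_detail = np.sin(2 * np.pi * 9 * t)    # detail too fine for this rate: 9 Hz
alias       = -np.sin(2 * np.pi * 1 * t)   # what the samples actually trace out: 1 Hz

# The sampled values are identical -- the 9 Hz detail masquerades as 1 Hz.
print(np.allclose(fine_detail, alias))     # True
```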
Now, in computer graphics an edge behaves much like an impulse when treated as a signal. Like an impulse, it carries energy (a brightness change) over what is technically zero width, so it contains infinitely high frequencies. The spectrum is a flat line, meaning the same amplitude at every frequency, and no finite sampling rate can capture all of that. So in this case, raising the resolution DOES NOT reduce the aliasing.
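You can see that flat spectrum numerically (again my own sketch, using numpy): the FFT of a discrete unit impulse has exactly the same magnitude in every frequency bin.

```python
import numpy as np

# Discrete unit impulse: all of the "energy" concentrated in one zero-width sample.
N = 64
impulse = np.zeros(N)
impulse[0] = 1.0

spectrum = np.abs(np.fft.fft(impulse))

# Every frequency bin has identical amplitude -- adding more sample points
# (higher resolution) never captures all of it.
print(np.allclose(spectrum, 1.0))   # True
```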
Raising the resolution does not reduce edge aliasing for the reason I just mentioned, but it does reduce object aliasing. For example, say your field of vision is 90 degrees and there is a fine object only 1/10 of a degree wide. With only 640 pixels of horizontal resolution, that object is only about 0.7 pixels wide, so as you move around at a constant speed while keeping the same distance, it lands on a pixel and gets drawn only about 70% of the time -- it flickers. With 1600 pixels of horizontal resolution, the object is about 1.8 pixels wide, so it covers 2 pixels roughly 80% of the time and 1 pixel the other 20%. Depending on the contrast between that object and its surroundings, this can be a huge reduction in aliasing.
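Those coverage numbers are easy to reproduce (a sketch assuming a uniform random sub-pixel offset; the 90-degree field of view and 0.1-degree object come from my example above):

```python
import math

fov_deg = 90.0   # horizontal field of view, degrees
obj_deg = 0.1    # object width, degrees

for h_res in (640, 1600):
    width_px = obj_deg / fov_deg * h_res   # object width in pixels
    lo = math.floor(width_px)              # pixels covered in the "unlucky" phases
    frac = width_px - lo                   # fraction of phases covering one extra pixel
    print(f"{h_res:4d} px: {width_px:.2f} px wide -> "
          f"{lo + 1} samples {frac:.0%} of the time, {lo} samples {1 - frac:.0%} of the time")

#  640 px: 0.71 px wide -> 1 sample ~71% of the time, 0 samples ~29% (flicker)
# 1600 px: 1.78 px wide -> 2 samples ~78% of the time, 1 sample ~22%
```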
When you turn on anti-aliasing, you increase the number of sampling points, so fine objects get sampled and drawn instead of flickering. The samples are usually also weighted (samples closer to the actual pixel location carry greater weight), so a fine black object gets drawn as gray. The weighting also blends an edge with its surroundings, reducing aliasing since the edge is no longer an impulse.
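Here is a toy version of that weighting (the scene, sample offsets, and weights are all made up for the sketch -- real hardware uses its own sample patterns): a thin black line that covers only part of a pixel comes out gray instead of flipping between solid black and solid white.

```python
import numpy as np

# Hypothetical 1-D scene: a thin (0.3 px wide) black line on a white background.
def scene(x):
    return 0.0 if 0.35 <= x <= 0.65 else 1.0    # 0 = black, 1 = white

# One pixel spans [0, 1). A single sample at the pixel center either hits the
# line (black) or misses it entirely (white) -- that is the flicker described above.
print("1 sample :", scene(0.5))

# 4x weighted supersampling: several sample points inside the pixel, with the
# samples nearer the center carrying more weight.
offsets = np.array([0.125, 0.375, 0.625, 0.875])
weights = np.array([1.0, 2.0, 2.0, 1.0])
samples = np.array([scene(x) for x in offsets])
print("4 samples:", np.dot(weights, samples) / weights.sum())   # ~0.33 -> gray
```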