Originally posted by: ArchAngel777
Originally posted by: xtknight
Originally posted by: ArchAngel777
Originally posted by: unfalliblekrutch
Rendering a game at 25x16 and having the TV sample it down to native resolution will produce far fewer jaggies than simply running the game at 10x7, right?
That is a good point. In that respect, yes, you are right.
No, not at all. The jaggies will likely be worse because of interpolation processing artifacts, and turning on AA on top of that will only compound them.
How would it not?
Edit ** Did not see the second sentence there until I hit reply. Anyway, you are right in that respect, but rendering a scene at a higher resolution and downscaling it with correct pixel mapping will result in fewer jagged edges. If you factor in the TV's interpolation, though, then you are definitely correct. Overall, it would be rather pointless to run a high resolution on a TV that just has to downscale it improperly.
Thanks for the clarification on interpolation. Something I did not consider when I responded.
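The supersampling claim in the thread, that rendering high and averaging down softens edges while a direct low-resolution render cannot, can be sketched numerically. This is a toy illustration (not from the thread, and the function names are my own): it renders a hard diagonal edge at 4x resolution, box-filters it down, and shows that the downsampled image gains intermediate gray values along the edge, which is exactly the smoothing a proper downscale provides and a bad interpolator can ruin.

```python
# Sketch: why "render high, downscale correctly" reduces jaggies,
# assuming the display averages sample blocks rather than
# interpolating poorly. Names here are illustrative, not a real API.

def render(width, height):
    """Render a diagonal edge: 1.0 where the pixel is above y = x."""
    return [[1.0 if y * width < x * height else 0.0
             for x in range(width)]
            for y in range(height)]

def downsample(img, factor):
    """Box filter: average each factor x factor block into one pixel."""
    h, w = len(img) // factor, len(img[0]) // factor
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            block = [img[y * factor + j][x * factor + i]
                     for j in range(factor) for i in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

low = render(8, 8)                    # rendered directly at low res
high = downsample(render(32, 32), 4)  # rendered 4x, then averaged down

# The direct render only contains hard 0/1 pixels (full jaggies);
# the downsampled render has intermediate grays along the edge.
low_values = {v for row in low for v in row}
high_values = {v for row in high for v in row}
print(sorted(low_values))                        # [0.0, 1.0]
print(any(0.0 < v < 1.0 for v in high_values))   # True
```

The intermediate grays are the antialiasing; a TV that interpolates instead of averaging throws that benefit away, which is the point being conceded above.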