I was bored and looking at some screenshots of Oblivion on IGN and noticed that there is some horrible aliasing going on in the HD version. This made me curious, as I'm playing at 480p and I have no aliasing whatsoever. So I downloaded the COD demo, because it's something I've played in stores on HD TVs and was pretty disappointed in the jaggies. I eventually loaded it up and found that they were not as apparent there either. Now, I have a good eye for detail, being an "artist", and I know this is not just because of the slight blurring that comes with going to 480p — aliasing is easy to spot on last-gen games.
What I'm thinking is that at the higher resolutions they just aren't able to do 4x or 6x FSAA, so some aliasing is left over, but at the lower 480p they may be able to max it out at 6x FSAA and a higher AF.
Now the question is: do you want a crisper picture if it's going to induce jaggies, which will seem all the more apparent because of the crispness?
Maybe I'm a nut, but jaggies are the biggest reality-jarring "effect" in video games now... yeah, graphics definitely aren't at reality level, but they are getting pretty good, and the lighting is getting spot on, which is something our brains pay a lot of attention to... the thing is, there's no place in reality for broken, jagged lines; it's one of the little things our brains pick up on as being out of place and not right...
Anyways, I think I'd rather have a nicely anti-aliased scene than a higher-resolution one... what I really wish is that there were some sort of test or actual statistic on this... it makes sense that if you're not displaying at as high a resolution, you can use the extra power for FSAA, but I'd like to know for sure, and whether all developers do it, or maybe the 360 itself does... maybe someone knows?
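Here's a rough back-of-the-envelope version of the idea, as a Python sketch. It just counts color samples per frame (resolution × MSAA level) as a crude stand-in for fill-rate/bandwidth cost — the "720p at 2x" budget and the assumption that cost scales linearly with sample count are mine for illustration, not anything the hardware docs guarantee:

```python
# Crude comparison: 480p with heavy MSAA vs. a hypothetical 720p-with-2x-AA budget.
# Assumes rendering cost scales with total samples shaded per frame, which is a
# big simplification (real GPUs compress MSAA samples, geometry cost differs, etc.).

RES_480P = (720, 480)    # widescreen 480p
RES_720P = (1280, 720)

def samples(res, msaa):
    """Total color samples per frame at a given resolution and MSAA level."""
    w, h = res
    return w * h * msaa

# Suppose the game targets 720p with 2x AA -- that's the power budget.
budget = samples(RES_720P, 2)

for msaa in (1, 2, 4, 6):
    cost = samples(RES_480P, msaa)
    print(f"480p @ {msaa}x MSAA: {cost:,} samples "
          f"({cost / budget:.2f}x the 720p/2x budget)")
```

By this (very rough) measure, 480p with 4x MSAA is actually cheaper than 720p with 2x, and 480p with 6x is only slightly over budget — which would fit with games cranking the AA up when you're on a 480p set.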
Just something to think about really :P
Oh, and here's an example...