Have you ever played Amnesia in 4K? The latest NVIDIA driver makes it possible for anyone who has at least a GeForce GTX 400 series card. You don't need a 4K TV. Just set the DSR Factor to 4.00 and DSR Smoothing to 0%, then change your display resolution to 3840x2160 pixels both on the desktop and in-game (the factor-to-resolution math is sketched at the end of this post).
It looks like extremely high-quality anti-aliasing.
This works for every game!
In other games with lots of detail in the world, like Gothic 3 (2006) for example, the effect is even greater.
Or Skyrim (2011) in 4K combined with 8x MSAA.
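In case the numbers seem odd: the DSR Factor of 4.00 multiplies the total pixel count, not the edge length, so each axis scales by the square root of the factor. A quick sketch of the mapping (plain Python; `dsr_resolution` is a hypothetical helper for illustration, not part of any NVIDIA tool):

```python
import math

def dsr_resolution(native_w: int, native_h: int, factor: float) -> tuple[int, int]:
    # The DSR Factor scales the pixel COUNT, so each axis scales by sqrt(factor).
    scale = math.sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

print(dsr_resolution(1920, 1080, 4.00))  # (3840, 2160)
print(dsr_resolution(1920, 1080, 2.25))  # (2880, 1620)
```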
Ahem... Sorry to ruin the party, but this post makes absolutely no sense to me.
Why would you want to play in 4K resolution without having a native 4K monitor? The image will just be downscaled to your monitor's native resolution while the GPU still renders the game in 4K, slowing the game down, because most GPUs (if any?) cannot run a game at 4K and 60 fps.
You've misunderstood something. The GPU renders the game in 4K resolution and then downsamples it to 1080p. The result is more detail on distant surfaces (an effect comparable to 128x anisotropic texture filtering) and almost no aliasing, especially on grass, trees, and other transparent textures.
It is like an extremely effective anti-aliasing mode!
And it runs at 60 Hz over a standard HDMI cable, since the output signal is still 1080p.
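To illustrate the principle: a 4.00x frame is exactly twice the width and height of 1080p, so each output pixel can be built from a 2x2 block of rendered pixels. A minimal sketch of such a downsample (assuming NumPy; NVIDIA's actual DSR filter is a Gaussian whose width the smoothing slider controls, so this plain box average only shows the idea):

```python
import numpy as np

def box_downsample(frame_4k: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of a (2160, 3840, 3) frame into one 1080p pixel.

    Edges that would alias at 1080p get four samples each, which is why
    the result looks like supersampling anti-aliasing.
    """
    h, w, c = frame_4k.shape
    return frame_4k.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Example: a synthetic 4K frame downsampled to 1080p.
frame = np.random.rand(2160, 3840, 3).astype(np.float32)
print(box_downsample(frame).shape)  # (1080, 1920, 3)
```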
@Rött
Exactly.
Give me some comparisons with numbers. Right now I'm pretty pessimistic about this method being more effective than regular anti-aliasing at a lower resolution.
(10-25-2014, 06:20 PM)IIPEE Wrote: Give me some comparisons with numbers. Right now I'm pretty pessimistic about this method being more effective than regular anti-aliasing at a lower resolution.
Googolplex is right. This kind of downsampling (essentially supersampling, or SSAA) is how anti-aliasing works in the first place, so increasing the resolution improves the quality of the anti-aliasing, and of the anisotropic filtering too. You can see some comparisons here.
Unless you have an absolutely kickass graphics card setup, the real question is: is the graphics quality gain worth the performance loss?
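For a rough sense of that performance loss: 4K has exactly four times the pixels of 1080p, so the fragment-shading workload roughly quadruples, whereas 4x MSAA shades each pixel once and only adds coverage samples along geometry edges. A back-of-the-envelope check:

```python
# 4K DSR shades every pixel of the 3840x2160 frame; MSAA shades each
# 1080p pixel once and only multisamples coverage at triangle edges.
ratio = (3840 * 2160) / (1920 * 1080)
print(f"4K renders {ratio:.0f}x the pixels of 1080p")  # -> 4x
```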
I created some comparisons. I chose Gothic 3, because it has a lot of vegetation.
Note that in-game, with the wind animation, the difference looks even better than in these static screenshots!
(10-25-2014, 06:20 PM)IIPEE Wrote: Give me some comparisons with numbers. Right now I'm pretty pessimistic about this method being more effective than regular anti-aliasing at a lower resolution.
Quote: This kind of downsampling (essentially supersampling, or SSAA) is how anti-aliasing works in the first place, so increasing the resolution improves the quality of the anti-aliasing, and of the anisotropic filtering too.
OK then. How does this 4K method compare in performance to, say, 4x/8x MSAA at 1080p?