Earlier this year, we took a detailed look at different anti-aliasing technologies in Part 1 of this series. In Part 2, we test all of those modes in order to get an accurate picture of the performance hit we can expect when it comes to using each one.
Back in April, we took an in-depth look at anti-aliasing technologies, the image quality they impart, and the driver settings needed to enable them. Imagine our surprise when we discovered that, in many cases, forcing a particular AA mode via the GeForce or Radeon driver panel simply does not work. In case you missed that piece, or simply need to bone up on the different anti-aliasing settings offered by AMD and Nvidia, check out Anti-Aliasing Analysis, Part 1: Settings And Surprises.
Clearly, it took us an inordinate amount of time to prepare this follow-up, partially because there was a mountain of data to amass in order to create a meaningful comparison. Any game we wanted to use for testing had to support the same anti-aliasing technologies on cards from both major vendors. As we learned from the previous story, this severely limits the playing field. Nevertheless, we managed to gather enough information to get a solid feel for how graphics cards perform with all of the different options available.
First, What's New?
Before we dig into the data, though, a couple of notable events have transpired since the publication of our Part 1 coverage.
Back in April, we gave Nvidia a hard time for using misleading naming to describe its coverage sample anti-aliasing modes. As an example, the GeForce driver’s 8xAA mode did not take eight multi-samples per pixel as its name implied, but rather four multi-samples, plus four coverage samples. As a result, 8xAA on a GeForce card was not comparable to 8xAA on a Radeon card.
We’re happy to see that the company modified its nomenclature, and coverage sample modes are now easily identified with CSAA, per the following chart:
**GeForce CSAA vs. Radeon EQAA Anti-Aliasing Levels**

| Old GeForce Driver Mode | New GeForce Driver Mode | Color Samples + Extra Coverage Samples | Radeon Driver Mode |
|---|---|---|---|
| 2x | 2x | 2+0 | 2x |
| N/A | N/A | 2+2 | 2xEQ |
| 4x | 4x | 4+0 | 4x |
| 8x | 8x CSAA | 4+4 | 4xEQ |
| 16x | 16x CSAA | 4+12 | N/A |
| 8xQ | 8x | 8+0 | 8x |
| 16xQ | 16xQ CSAA | 8+8 | 8xEQ |
| 32x | 32x CSAA | 8+24 | N/A |
| N/A | N/A | 16+0 | 16x |
Although AMD’s naming system still makes more sense (it indicates the number of multi- and coverage-samples), Nvidia’s new scheme is much better than the previous one. Despite combining both samples into a single number, the CSAA designation lets you know that the target mode represents the sum of both sample types. Now, 8x anti-aliasing always means eight multi-samples, regardless of graphics hardware, giving us all a degree of consistency we can appreciate.
Aside from this, Nvidia now has a post-process anti-aliasing filter it calls FXAA. The technology is similar to AMD's morphological anti-aliasing, but it's implemented in-game instead of through the driver, and it works with any vendor’s graphics hardware. This new option is only included in a few games thus far (such as Duke Nukem Forever, F.3.A.R., Elder Scrolls V: Skyrim, and Battlefield 3), but the code is purportedly easy to integrate. An independent programmer even released a non-commercial hack to enable FXAA in DirectX 9, 10, and 11 games; that's something you might want to hunt down on Google if you’re interested in toying with it.