
[PCPER] NVIDIA Multi-Frame Sampled Anti-Aliasing (MFAA) Tested on GTX 980

Frosty11

With the release of NVIDIA's 344.75 driver, support for MFAA has been added.

The goal of MFAA is to change the AA sample pattern in a way that produces near-4xMSAA quality at the effective performance cost of 2xMSAA.

Taking screenshots of MFAA doesn't work with Fraps or similar capture utilities, as what they grab is not the final image sent to the display. A hardware capture device is needed to see the results of MFAA.

List of Games supporting MFAA with this driver:

Assassin's Creed IV Black Flag

Assassin's Creed: Unity

Battlefield 4

Civilization V

Civilization: Beyond Earth

Crysis 3

DiRT 3

DiRT Showdown

F1 2013

F1 2014

Far Cry 3

Far Cry: Blood Dragon

GRID 2

GRID Autosport

Hitman: Absolution

Just Cause 2

Saints Row IV

Splinter Cell: Blacklist

Titanfall

Wargame: European Escalation

If you enable MFAA in the control panel but start up a game that doesn't support it, you will receive no warning or message to tell you that you won't be getting the improvements associated with the feature. There won't be any performance penalty either.

NVIDIA tells us that GeForce Experience support for MFAA is coming soon, so only games on that approved list will have the feature enabled.

MFAA is not yet supported with GTX 980 or GTX 970 SLI; support will be added in future driver updates.

Enabling MFAA in your control panel:

settings.jpg

A new line item shows up in the control panel if you have supported hardware, and the options are simply On and Off.

Comparisons:

MFAA is a temporal AA method, meaning the sample patterns change over time. According to NVIDIA the pattern changes on a frame-to-frame basis: two samples in frame 1, then two different sample locations in frame 2, two more in frame 3, and so on. These are then filtered back together through a custom section of the Maxwell hardware, resulting in a combined, smoother image. However, because these sample patterns are being combined frame to frame into a composited image, movement on the screen will affect the final output.
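That description can be sketched with a toy model. The snippet below is purely illustrative, not NVIDIA's disclosed algorithm: the sample offsets, the edge position, and the simple two-frame average are all assumptions. It shows how two samples per frame, with positions that alternate each frame, average out to the same edge coverage as taking all four samples in a single frame when the scene is static.

```python
# Illustrative sketch (assumed logic, not NVIDIA's actual filter):
# estimate edge coverage of one pixel using 2 samples per frame,
# with sub-pixel offsets that alternate on even/odd frames, then
# average two consecutive frames of a static scene.

# A hypothetical edge: everything left of x = 0.6 inside the pixel is covered.
def covered(x, y):
    return x < 0.6

# Two sample sets that alternate each frame (positions are made up).
PATTERN = [
    [(0.25, 0.25), (0.75, 0.75)],  # frame N
    [(0.75, 0.25), (0.25, 0.75)],  # frame N+1
]

def frame_coverage(frame_index):
    samples = PATTERN[frame_index % 2]
    return sum(covered(x, y) for x, y in samples) / len(samples)

# Combine two consecutive frames: 2 samples/frame, 4 effective samples.
mfaa_like = (frame_coverage(0) + frame_coverage(1)) / 2

# Same four positions evaluated in one frame (4xMSAA-like reference).
all_samples = PATTERN[0] + PATTERN[1]
msaa4_like = sum(covered(x, y) for x, y in all_samples) / len(all_samples)

print(mfaa_like, msaa4_like)  # both 0.5 for this edge
```

For a static scene the two results are identical, which is why the screenshots below are hard to tell apart in non-moving portions of a benchmark.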

During a static scene, where the character is not moving, the alternating pattern of the AA samples in 4xMFAA should be able to create a near-perfect match to the visual quality of 4xMSAA. However, as soon as things start to move on the screen, that changes. Imagine a strafing animation in a shooter where the character passes in front of a door. Inside the door is a very dark scene, but outside it's very bright. The edge between the two will show clear aliasing. With MFAA, and other temporal AA methods, issues arise when the pixels on your screen move from the bright part of the scene to the dark; since previous frames are being used to calculate the color of the current frame, there is the potential for degradation.

NVIDIA claims that its filtering algorithm is intelligent enough to take this into consideration, so that even in the worst-case scenario the quality of 4xMFAA will never be worse than that of standard 2xMSAA, while you still have the potential to see quality nearing that of 4xMSAA.
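That "never worse than 2xMSAA" floor can be sketched as a history-rejection step. This is a hedged guess at the behaviour described above (the comparison threshold and blend are assumptions; NVIDIA hasn't published the filter): reuse the previous frame's result only when the pixel hasn't changed much, so a moving pixel falls back to the current frame's 2-sample result.

```python
# Assumed fallback logic (not NVIDIA's disclosed algorithm): blend the
# current frame's per-pixel AA result with the previous frame's only
# when they roughly agree; otherwise keep the current result, so
# quality floors at the 2-sample (2xMSAA-like) level under motion.

def resolve(current, history, threshold=0.2):
    """Combine current and previous per-pixel AA results."""
    if history is None or abs(current - history) > threshold:
        return current              # motion detected: 2x-quality fallback
    return (current + history) / 2  # static: 4x-like combined result

# Static pixel: two frames agree, so the results are combined.
print(resolve(0.5, 0.5))  # 0.5
# Moving pixel crossing a bright-to-dark edge: stale history is rejected.
print(resolve(0.1, 0.9))  # 0.1
```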

Screenshots:

Grid 2-

comp-grid2.png

You can see that the MFAA implementation is definitely a step above the 2xMSAA screen capture. The quality of 4xMSAA and 4xMFAA is difficult to distinguish, as this was a static portion of the benchmark.

BF4-

comp-bf4.png

This screenshot is from a moving portion of the benchmark, and MFAA is definitely better than 2xMSAA and nearly identical to the 4xMSAA result.

Crysis 3-

comp-crysis3-2.png

Look along the top edge of the red pipe we are walking across and notice the jagged edges at 2xMSAA that are mostly gone at 4xMSAA. With 4xMFAA though, the result is better than 2xMSAA but still a clear step behind that of the standard 4xMSAA implementation.

Comparison video:

Performance:

Hardware used-

Sandy Bridge-E Core i7-3960X, 16GB of DDR3 and a reference GeForce GTX 980 4GB card.

Grid 2-

GRID2_2560x1440_PER.png

Battlefield 4-

BF4_2560x1440_PER.png

Crysis 3-

Crysis3_2560x1440_PER.png

The performance gains are quite significant in Battlefield 4 and Crysis 3 compared to Grid 2. It looks like the addition of MFAA is a success for NVIDIA.

Source: http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Multi-Frame-Sampled-Anti-Aliasing-MFAA-Tested-GTX-980


Not a whole lot different, in my opinion.

 

EDIT: I guess the whole point is to provide the same effect with less of a performance hit?

Case: Corsair 4000D Airflow; Motherboard: MSI ZZ490 Gaming Edge; CPU: i7 10700K @ 5.1GHz; Cooler: Noctua NHD15S Chromax; RAM: Corsair LPX DDR4 32GB 3200MHz; Graphics Card: Asus RTX 3080 TUF; Power: EVGA SuperNova 750G2; Storage: 2 x Seagate Barracuda 1TB; Crucial M500 240GB & MX100 512GB; Keyboard: Logitech G710+; Mouse: Logitech G502; Headphones / Amp: HiFiMan Sundara Mayflower Objective 2; Monitor: Asus VG27AQ


Maybe more of a benefit with a less powerful card?

Satan's buttcrack 4790K - MSI mpower ultramaxextreme AC - 16gb G.skillz Trident 2400 - ZLoLtac GTX 980  - Corsair H110 Overkill - Oculus Rift DK2 - Asus vg248qe7the144hzone


Still no FC4 compatibility? I had higher hopes when I saw this thread. :(

CPU: i9-13900k MOBO: Asus Strix Z790-E RAM: 64GB GSkill  CPU Cooler: Corsair H170i

GPU: Asus Strix RTX-4090 Case: Fractal Torrent PSU: Corsair HX-1000i Storage: 2TB Samsung 990 Pro

 


It's not a lot, but if you can get a 5% performance increase from every setting like shadows, AA, lighting, etc., it all adds up :D (in general, I know MFAA is only a new form of AA)


Have to hand it to Nvidia, it does look a lot like 4x but performs about as well as 2x. Nice.

.


You know, I had that idea a while back when I was still working on a game engine. It's weird to see it implemented now by a GPU maker rather than a software developer as it's actually not too hard to do. In fact, I don't think I came up with it myself but read about it somewhere, several years ago. I think some Googling is in order...

 

EDIT: Can't find it anymore. Anyway, the idea was to shift everything half a pixel up and to the right every other frame, so it'd alternate between different viewpoints very quickly and edges would be rendered slightly differently each time. In reality it flickers a little, but the human eye won't notice. What nVidia does is ever so slightly different, but it's based on that same idea. Basically they take two different samples for every frame.
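That half-pixel-shift idea can be sketched in a few lines. This is illustrative only: the 1-D framebuffer, edge position, and single centre sample per pixel are hypothetical, chosen just to show alternating jitter plus averaging softening an edge.

```python
# Illustrative sketch of per-frame half-pixel jitter: shift the sample
# grid by half a pixel on alternate frames, then average two frames so
# edge pixels end up with intermediate (anti-aliased) values.

WIDTH = 8  # pixels across a hypothetical 1-D framebuffer

def render(frame_index, edge=3.7):
    # Alternate a half-pixel shift of the sample positions each frame.
    jitter = 0.5 if frame_index % 2 else 0.0
    # One centre sample per pixel: 1.0 left of the edge, 0.0 right of it.
    return [1.0 if (x + 0.5 + jitter) < edge else 0.0
            for x in range(WIDTH)]

# Averaging two jittered frames softens the hard edge near pixel 3.
a, b = render(0), render(1)
blended = [(p + q) / 2 for p, q in zip(a, b)]
print(blended)  # [1.0, 1.0, 1.0, 0.5, 0.0, 0.0, 0.0, 0.0]
```

The edge pixel lands at 0.5 instead of snapping to 0 or 1, which is the flicker-then-average effect described above.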

I cannot be held responsible for any bad advice given.

I've no idea why the world is afraid of 3D-printed guns when clearly 3D-printed crossbows would be more practical for now.

My rig: The StealthRay. Plans for a newer, better version of its mufflers are already being made.


Is this a Maxwell thing, or is it going to be enabled for Kepler GPUs as well like DSR was?

Git Gud.


MFAA is a waste-of-time gap-bridger. AA is a stop-gap patchwork fix for a big problem: lack of resolution. If Nvidia can spend R&D money on MFAA, they can spend money in the hardware department to make extinct anything lower than 6GB of VRAM and at least a 384-bit memory bus. But they don't do that. Instead they flood us with "high end" 256-bit-bus 4GB cards that need ghetto solutions like these for games to not look like shit, when we should already have fully moved into 4K for games. We can't do that because Nvidia are holding us back.

 

AMD is interested in giving us enough VRAM and bandwidth, but they don't have the GPU horsepower to do anything with it. Nvidia has the GPU horsepower, but are putting those GPUs on PCBs made out of middle fingers and ass salad.

 

Fuck both of them. Star Citizen is almost into crucial beta stages, I have no time for this bullfuckingshit

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


--snip--

buy-led-light-bulbs-uk.jpg

You sir are soo much in the light, it's like an explosion of truthiness!


Is there support for 700-600 series? Or are they doing this Maxwell only?

PC: 5600x @ 4.85GHz // RTX 3080 Eagle OC // 16GB Trident Z Neo  // Corsair RM750X // MSI B550M Mortar Wi-Fi // Noctua NH-D15S // Cooler Master NR400 // Samsung 50QN90A // Logitech G305 // Corsair K65 // Corsair Virtuoso //


MFAA is a waste-of-time gap-bridger. AA is a stop-gap patchwork fix for a big problem: lack of resolution. If Nvidia can spend R&D money on MFAA, they can spend money in the hardware department to make extinct anything lower than 6GB of VRAM and at least a 384-bit memory bus. But they don't do that. Instead they flood us with "high end" 256-bit-bus 4GB cards that need ghetto solutions like these for games to not look like shit, when we should already have fully moved into 4K for games. We can't do that because Nvidia are holding us back.

AMD is interested in giving us enough VRAM and bandwidth, but they don't have the GPU horsepower to do anything with it. Nvidia has the GPU horsepower, but are putting those GPUs on PCBs made out of middle fingers and ass salad.

Fuck both of them. Star Citizen is almost into crucial beta stages, I have no time for this bullfuckingshit

AA is a technology so people don't have to replace their monitors just to play games and get a good image out of them. Screw off. Not everyone has the wallet for a 4K monitor, and you, who have no clue what engineering goes into that silicon, also have no clue how expensive it is to implement a 512-bit or 768-bit bus with so many RAM chips on board, all drawing power and generating heat, which then demands more powerful, more stable power delivery and more cooling.

If it were so easy to do, Intel would have done it already, or Qualcomm, or Apple, or Imagination Technologies.

Don't go shooting your mouth off just because you're rich enough to afford a pair of 295x2s and an E7 15-core Xeon. If electrical and computer engineering were easy, any half-wit in the backwoods of Appalachia could compete with Samsung, Nvidia, AMD, and Intel just fine.

You sir are soo much in the light, it's like an explosion of truthiness!

Don't encourage him.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


AA is a technology so people don't have to replace their monitors just to play games and get a good image out of them. Screw off. Not everyone has the wallet for a 4K monitor, and you, who have no clue what engineering goes into that silicon, also have no clue how expensive it is to implement a 512-bit or 768-bit bus with so many RAM chips on board, all drawing power and generating heat, which then demands more powerful, more stable power delivery and more cooling.

If it were so easy to do, Intel would have done it already, or Qualcomm, or Apple, or Imagination Technologies.

Don't go shooting your mouth off just because you're rich enough to afford a pair of 295x2s and an E7 15-core Xeon. If electrical and computer engineering were easy, any half-wit in the backwoods of Appalachia could compete with Samsung, Nvidia, AMD, and Intel just fine.

Don't encourage him.

You are correct but it's clear Nvidia is holding back. It's a good business strategy for sure but it's still frustrating for us consumers lol. Maybe I have overly high expectations of them..


Not a whole lot different, in my opinion.

 

EDIT: I guess the whole point is to provide the same effect with less of a performance hit?

Well, that wasn't really achieved; mind you, these games probably weren't designed for it, so maybe in the future we'll see a bigger difference.


Corsair 400C- Intel i7 6700- Gigabyte Gaming 6- GTX 1080 Founders Ed. - Intel 530 120GB + 2xWD 1TB + Adata 610 256GB- 16GB 2400MHz G.Skill- Evga G2 650 PSU- Corsair H110- ASUS PB278Q- Dell u2412m- Logitech G710+ - Logitech g700 - Sennheiser PC350 SE/598se


Is it just me or is Grammar slowly becoming extinct on LTT? 

 


snip

 

Don't know why I'm bothering to feed the corporate apologist but anyway:

 

  • On account of the tiny die size, we've never had this much PCB space, so my 384-bit 6GB minimum was extremely conservative.
  • I know you know Nvidia are holding back on purpose. You can't be this dumb, so I'll give you the benefit of the doubt and choose to believe you are simply defending them in doing so, which is a micron less intolerable.
  • The MFAA would be fine if it were brought in to make visual quality and performance better on older cards. Then it would truly be your imagined Robin Hood no-ulterior-motive nice gesture from Nvidia. But it's not. It was made exclusive to recent cards. Obvious ulterior motive is obvious.
  • Anything less than 6GB of VRAM should be extinct by now on the highest-end cards. If your problem is lack of money, you wouldn't be buying the highest end cards to begin with. So your whole motivation for this seems to be nothing but to be a devil's advocate for shits and giggles.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.



Don't know why I'm bothering to feed the corporate apologist but anyway:

  • On account of the tiny die size, we've never had this much PCB space, so my 384-bit 6GB minimum was extremely conservative.
  • I know you know Nvidia are holding back on purpose. You can't be this dumb, so I'll give you the benefit of the doubt and choose to believe you are simply defending them in doing so, which is a micron less intolerable.
  • The MFAA would be fine if it were brought in to make visual quality and performance better on older cards. Then it would truly be your imagined Robin Hood no-ulterior-motive nice gesture from Nvidia. But it's not. It was made exclusive to recent cards. Obvious ulterior motive is obvious.
  • Anything less than 6GB of VRAM should be extinct by now on the highest-end cards. If your problem is lack of money, you wouldn't be buying the highest end cards to begin with. So your whole motivation for this seems to be nothing but to be a devil's advocate for shits and giggles.

If developers actually require that much memory, they're not good enough to be developing games. AMD's 390x will also launch with 4GB VRAM, so clearly your idea is a farce. If both AMD and Nvidia think it's near pointless to have that much VRAM on gaming cards, maybe it's because that's true... You have to be using a ton of uncompressed textures to fill up that much memory. If you do that, you're not using the tools to their fullest extent. AMD and Nvidia both have hardware-implemented lossless compression and decompression which allows a developer to do more with less.

The only loads in the world which require that much data are bound up in scientific computing. End of story.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


[*]The MFAA would be fine if it were brought in to make visual quality and performance better on older cards. Then it would truly be your imagined Robin Hood no-ulterior-motive nice gesture from Nvidia. But it's not. It was made exclusive to recent cards. Obvious ulterior motive is obvious.

That would have been great, but it would have been possible only if MFAA were purely a software implementation, which it isn't.

MFAA is a temporal AA method, meaning the sample patterns change over time. According to NVIDIA the pattern changes on a frame-to-frame basis: two samples in frame 1, then two different sample locations in frame 2, two more in frame 3, and so on. These are then filtered back together through a custom section of the Maxwell hardware, resulting in a combined, smoother image. However, because these sample patterns are being combined frame to frame into a composited image, movement on the screen will affect the final output.


That would have been great but it would have been possible only if MFAA was purely a software implementation, which it isn't.

 

Just like how Nvidia tried to launch Shadowplay on just 6xx and newer cards, then 2 weeks later someone pointed out that "oh looks like the 470 also had an H.265 encoder" and BLAMMO! Shadowplay support for Fermi suddenly materialized as if to try and pretend it was there to begin with.

 

Not saying it's what is happening here. But Nvidia are usually pretty sleazy about these things.

PS: I must have missed the schematics showing where on the die or the PCB this MFAA prerequisite technology is located, could you point me to it?

 

EDIT: Don't forget about DSR either, they tried to keep that unsupported for anything other than 970 and 980 as well, but caved in the end.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


Mother F*cking Anti-Aliasing :P

#killedmywife #howtomakebombs #vgamasterrace


PS: I must have missed the schematics showing where on the die or the PCB this MFAA prerequisite technology is located, could you point me to it?

If NVIDIA's claims are to be believed, then there is a hardware requirement for MFAA, but PCPer did state that NVIDIA were kind of secretive about this custom Maxwell section. So who knows... maybe it's possible to implement MFAA on 700-series cards.


Tested it today with Titanfall for 10 minutes. Still looked awesome.

 

The only game where the performance impact could matter is Crysis 3. All the other games run at 60+ FPS all the time. :D

who cares...


-snip-

 

Just to point out a few things about your previous 2 posts:

 

- Nowhere in this new post or the source article does it say anything about limiting MFAA to only recent cards. 

- You have no idea about the hardware complications that Shadowplay, DSR and MFAA brought when used on cards that weren't in development at the time. You can't expect that 'if Shadowplay works on a 780, it's obviously going to work with no problems on a 480'. That's not how it works.

4930k @ 4.5GHz - EVGA 780ti Superclocked ACX - ASUS x79 Deluxe - 32GB Tactical Tracer @ 2133


  • 1 year later...

Just like how Nvidia tried to launch Shadowplay on just 6xx and newer cards, then 2 weeks later someone pointed out that "oh looks like the 470 also had an H.265 encoder" and BLAMMO! Shadowplay support for Fermi suddenly materialized as if to try and pretend it was there to begin with.

 

Not saying it's what is happening here. But Nvidia are usually pretty sleazy about these things.

PS: I must have missed the schematics showing where on the die or the PCB this MFAA prerequisite technology is located, could you point me to it?

 

EDIT: Don't forget about DSR either, they tried to keep that unsupported for anything other than 970 and 980 as well, but caved in the end.

 

 

Just to point out a few things with your previous 2 posts;

 

- Nowhere in this new post or the source article does it say anything about limiting MFAA to only recent cards. 

- You have no idea about the hardware complications that shadowplay, DSR and MFAA brought when using cards that weren't in development at the time. You can't expect that 'if shadowplay works on a 780 it's obviously going to work no problems on a 480'. That's not how it works.

 

MFAA never came to the 780 Ti or other 700-series cards.

 

(The 750 Ti has it, as it's a Maxwell card.)

 


AMD 5000 Series Ryzen 7 5800X| MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 * 16GB) DDR4 3200MHz CL16-18-18-38 | Asus GeForce GTX 3080Ti STRIX | SAMSUNG 980 PRO 500GB PCIe NVMe Gen4 SSD M.2 + Samsung 970 EVO Plus 1TB PCIe NVMe M.2 (2280) Gen3 | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master Box MB511 | ASUS TUF Gaming VG259Q Gaming Monitor 144Hz, 1ms, IPS, G-Sync | Logitech G 304 Lightspeed | Logitech G213 Gaming Keyboard |

PCPartPicker 


no pls 


This topic is now closed to further replies.
