Nvidia Pitches Advantages of G-Sync Over AMD's FreeSync

BiG StroOnZ

Engineer points out smoother experiences at low FPS and anti-ghosting technology

Nvidia engineer Tom Petersen was recently interviewed by Forbes. Here are the highlights (read the whole interview for a more in-depth understanding):
 

Forbes: You’ve said in the past that one of the crucial goals of G-Sync was to never introduce screen tearing, no matter what.

Tom Petersen: “You never want to introduce stutter, either. It’s a complex problem, which is why we think you need some of that secret sauce in both the driver and the module. In contrast, AMD’s not doing that. As you transition from the high frequencies to the low frequencies of FPS, they have some jarringly negative experiences coming out of their zone. If you take any of their panels and run it from whatever frequency is in the zone, to any frequency out of the zone at the low end, the experience is not good at all."

Tom Petersen: “There’s also a difference at the high frequency range. AMD really has 3 ranges of operation: in the zone, above the zone, and below the zone. When you’re above the zone they have a feature which I like (you can either leave V-Sync On or V-Sync Off), that we’re going to look at adding because some gamers may prefer that. The problem with high refresh rates is this thing called ghosting. You can actually see it with AMD’s own Windmill Demo or with our Pendulum Demo. Look at the trailing edge of those lines and you’ll see a secondary image following it."



Tom Petersen: “We don’t do that. We have anti-ghosting technology so that regardless of framerate, we have very little ghosting. See, variable refresh rates change the way you have to deal with it. Again, we need that module. With AMD, the driver is doing most of the work. Part of the reason they have such bad ghosting is because their driver has to specifically be tuned for each kind of panel. They won’t be able to keep up with the panel variations. We tune our G-Sync module for each monitor, based on its specs and voltage, which is exactly why you won’t see ghosting from us.

We also do support the majority of our GPUs going back to Kepler. The 650Ti Boost is the oldest GPU we support, and there’s a lot of gaps in their GPU support. It’s a tough problem and I’m not meaning to knock AMD, but having that module allows us to exercise more control over the GPU and consequently offer a deeper range of support.”
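Petersen's complaint about "coming out of the zone at the low end" is essentially the frame-rate floor problem: when a game renders more slowly than the panel's minimum refresh, something has to repeat frames to keep the panel in range. A rough sketch of that frame-repeating idea in Python (the panel limits and the doubling strategy here are illustrative, not Nvidia's or AMD's actual algorithm):

```python
def choose_refresh_hz(frame_time_ms, panel_min_hz=40, panel_max_hz=144):
    """Pick a refresh rate inside [panel_min_hz, panel_max_hz] that is an
    integer multiple of the content rate, so each rendered frame is simply
    shown 1..N times instead of the panel dropping out of variable refresh."""
    content_hz = 1000.0 / frame_time_ms
    if content_hz >= panel_max_hz:        # above the zone: cap at max refresh
        return panel_max_hz
    multiple = 1
    while content_hz * multiple < panel_min_hz:  # below the zone: repeat frames
        multiple += 1
    return min(content_hz * multiple, panel_max_hz)

# 25 fps content on a 40-144 Hz panel: show each frame twice, at 50 Hz
print(choose_refresh_hz(40.0))  # -> 50.0
```

Without something like this, a 25 fps game on a panel with a 40 Hz floor falls out of the variable-refresh window entirely, which is the "jarringly negative experience" Petersen describes.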

 

Some of you are probably wondering which technology is better and why, so you might be interested in hearing what Tom Petersen has to say about the differences between G-Sync and FreeSync. Regardless of Tom's input, the ghosting present on the FreeSync monitors in that PCPer video is quite noticeable, and may be unacceptable for those experiencing it.

 

Source: http://www.forbes.com/sites/jasonevangelho/2015/03/23/nvidia-explains-why-their-g-sync-display-tech-is-superior-to-amds-freesync/

 



The only thing I can say is it's DRM hell now. So FreeSync is still going to be my go-to.

|King Of The Lost|
Project Dark: i7 7820x 5.1GHz | X299 Dark | Trident Z 32GB 3200MHz | GTX 1080Ti Hybrid | Corsair 760t | 1TB Samsung 860 Pro | EVGA Supernova G2 850w | H110i GTX
Lava: i9 12900k 5.1GHz (Undervolted to 1.26v) | MSI z690 Pro DDR4 | Dominator Platinum 32GB 3800MHz | PowerColor Red Devil RX 6950 XT | Seasonic Focus Platinum 850w | NZXT Kraken Z53
Unholy Rampage: i7 5930k 4.7GHz 4.4 Ring| X99 
Rampage|Ripjaws IV 16GB 2800 CL13| GTX 1080 Strix(Custom XOC Signed BIOS) | Seasonic Focus Platinum 850w |H100i v2 
Revenge of 775: Pentium 641 | Biostar TPower i45| Crucial Tracer 1066 DDR2 | GTX 580 Classified Ultra | EVGA 650 BQ | Noctua NH D14


Reasons: 1. It's not. 2. You will buy it because it's Nvidia.

Perhaps you should take the red goggles off and take a good look at reality. AMD doesn't win big in markets because it just doesn't offer the same or better quality than Nvidia. AMD SOMETIMES wins in raw performance on some games at a cheaper price point, at the cost of average card reliability. Furthermore, AMD has a nasty habit of shooting itself in the foot when it comes to marketing.

No matter how much any fan child refuses to admit it, every company, including his or her own favorite, has serious, deep flaws. AMD's main flaws are its marketing and cooling-solutions teams, but they also extend to overly supporting open standards like FreeSync before making any money on the research it did. Nvidia, on the other hand, correctly develops a standard, implements it with minimal flaws, and proceeds to make its money back for future R&D. After that point Nvidia has a poor history of not opening up its standards, but that is its flaw. Still, Nvidia is in a better position to compete in the future. AMD, on the other hand, is left in the dust once again.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Reasons: 1. It's not. 2. You will buy it because it's Nvidia.

Pretty much. Of course Nvidia are going to say it's better; it costs more and makes them money. But you'd have to be extremely gullible to believe them.


Perfect example of a "closed minded" person right here ^^^

 

Well, no, he's entitled to his opinion, and he has as much info to base it on as you do yours. So this is a perfect example of a needless post existing only to belittle someone else's opinion. Congrats on having been a douche.

| CPU: i7-4770K @4.6 GHz, | CPU cooler: NZXT Kraken x61 + 2x Noctua NF-A14 Industrial PPC PWM 2000RPM  | Motherboard: MSI Z87-GD65 Gaming | RAM: Corsair Vengeance Pro 16GB(2x8GB) 2133MHz, 11-11-11-27(Red) | GPU: 2x MSI R9 290 Gaming Edition  | SSD: Samsung 840 Evo 250gb | HDD: Seagate ST1000DX001 SSHD 1TB + 4x Seagate ST4000DX001 SSHD 4TB | PSU: Corsair RM1000 | Case: NZXT Phantom 530 Black | Fans: 1x NZXT FZ 200mm Red LED 3x Aerocool Dead Silence 140mm Red Edition 2x Aerocool Dead Silence 120mm Red Edition  | LED lighting: NZXT Hue RGB |


Change the BenQ from performance mode to normal and change the AMA settings and the ghosting disappears.
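For context, AMA is BenQ's name for pixel overdrive, the anti-ghosting technique Petersen is talking about: the panel briefly drives a pixel past its target value to speed up the liquid-crystal transition. A toy sketch of the idea (the numbers and the linear-overshoot model are made up for illustration; real panels use per-panel tuned lookup tables, which is exactly the tuning Petersen says the G-Sync module does):

```python
def overdrive(current, target, strength=0.3):
    """Value actually driven to a pixel for one refresh (0-255 scale).
    Overshooting the target speeds up the transition; too much overshoot
    causes inverse ghosting (bright or dark trails behind moving edges)."""
    overshoot = strength * (target - current)
    return max(0.0, min(255.0, target + overshoot))

print(overdrive(0, 128))    # rising transition: driven above 128
print(overdrive(255, 128))  # falling transition: driven below 128
```

The catch with variable refresh is that the right overshoot depends on how long a frame stays on screen, which is why an overdrive table tuned for one refresh rate can ghost at another.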

I personally find it amusing how much Tom contradicts his own words and ties things like V-Sync into problems with ghosting.

Nvidia, the way you're meant to be extorted.

BS, I still see it.



So hardware is superior to software. Who would have thought.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


Perhaps you should take the red goggles off and take a good look at reality. AMD doesn't win big in markets because it just doesn't offer the same or better quality than Nvidia. AMD SOMETIMES wins in raw performance on some games at a cheaper price point, at the cost of average card reliability. Furthermore, AMD has a nasty habit of shooting itself in the foot when it comes to marketing.

No matter how much any fan child refuses to admit it, every company, including his or her own favorite, has serious, deep flaws. AMD's main flaws are its marketing and cooling-solutions teams, but they also extend to overly supporting open standards like FreeSync before making any money on the research it did. Nvidia, on the other hand, correctly develops a standard, implements it with minimal flaws, and proceeds to make its money back for future R&D. After that point Nvidia has a poor history of not opening up its standards, but that is its flaw. Still, Nvidia is in a better position to compete in the future. AMD, on the other hand, is left in the dust once again.

I think you may need to remove your green glasses and present some evidence of AMD cards being less reliable; otherwise you're being even more biased than he was.

AMD cooling solutions? I was unaware AMD was even in the cooling market; I'm yet to see an AMD cooler being sold. Both AMD and Nvidia reference coolers are loud and awful; if you buy one you're either an idiot or more worried about appearance than performance, so AMD running warmer shouldn't even bother you.

You're clearly an Nvidia fanboy; your last line sort of makes that obvious. AMD are far from being left in the dust, and sure, making things open won't make you as much money, but it's less of a douche move. AMD are like that nice guy who gets friend-zoned, lol; Nvidia are the abusive douche that every woman falls all over, lol.



I think you may need to remove your green glasses and present some evidence of AMD cards being less reliable; otherwise you're being even more biased than he was.

AMD cooling solutions? I was unaware AMD was even in the cooling market; I'm yet to see an AMD cooler being sold. Both AMD and Nvidia reference coolers are loud and awful; if you buy one you're either an idiot or more worried about appearance than performance, so AMD running warmer shouldn't even bother you.

You're clearly an Nvidia fanboy; your last line sort of makes that obvious. AMD are far from being left in the dust.

Well, recorded failure rates would suggest that AMD cards are more likely to fail. Take all statistics with a grain of salt though.

 

http://www.pugetsystems.com/labs/articles/Video-Card-Failure-Rates-by-Generation-563/



Well, no, he's entitled to his opinion, and he has as much info to base it on as you do yours. So this is a perfect example of a needless post existing only to belittle someone else's opinion. Congrats on having been a douche.

That's not an opinion. That's ignorance, given that you have nothing to base it on.

Linus hasn't even published his review of the FreeSync monitor and you already assume it's better because of the brand?

LOL, you need to calm down and wait for SOMEONE WHO ACTUALLY OWNS BOTH MONITORS to review it, aka Linus.

BTW, profanity is unnecessary if you actually have something to back up your claims other than bad language.

NEW PC build: Blank Heaven   minimalist white and black PC     Old S340 build log "White Heaven"        The "LIGHTCANON" flashlight build log        Project AntiRoll (prototype)        Custom speaker project


Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


I'm confused. He's saying that it won't work when it's outside the FreeSync FPS range, but isn't that the same as G-Sync? Also, even if G-Sync is a lot better than FreeSync, I don't want to pay 150 dollars extra; maybe 20 dollars extra, so they need to cut back on the price.


I'm confused. He's saying that it won't work when it's outside the FreeSync FPS range, but isn't that the same as G-Sync?

Also confused by this. What exactly is this guy trying to say?


I'm confused. He's saying that it won't work when it's outside the FreeSync FPS range, but isn't that the same as G-Sync?

Also confused by this. What exactly is this guy trying to say?

It's either 40-144 Hz with a TN panel or 48-75 Hz with an IPS panel. That's the FreeSync zone.

 

http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Inside-and-Outside-VRR-Wind
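Those windows give you Petersen's three ranges of operation directly. An illustrative helper (the two windows are the ones PCPer measured on early FreeSync panels; real windows vary per monitor):

```python
# FreeSync (Adaptive-Sync) windows PCPer measured on early panels.
FREESYNC_WINDOWS = {
    "TN 2560x1440":  (40, 144),
    "IPS 2560x1080": (48, 75),
}

def classify(fps, panel):
    """Classify a framerate against a panel's variable-refresh window."""
    lo, hi = FREESYNC_WINDOWS[panel]
    if fps < lo:
        return "below the zone (VRR off: stutter or tearing)"
    if fps > hi:
        return "above the zone (V-Sync on or off, user's choice)"
    return "in the zone (variable refresh active)"

print(classify(35, "TN 2560x1440"))  # falls below the 40 Hz floor
```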


Well, recorded failure rates would suggest that AMD cards are more likely to fail. Take all statistics with a grain of salt though.

 

http://www.pugetsystems.com/labs/articles/Video-Card-Failure-Rates-by-Generation-563/

It says that they use Asus DirectCU cards, which are not well recommended overall, at least for the 290/290X, so yeah, take it with a grain of salt.


That's not an opinion. That's ignorance, given that you have nothing to base it on.

Linus hasn't even published his review of the FreeSync monitor and you already assume it's better because of the brand?

LOL, you need to calm down and wait for SOMEONE WHO ACTUALLY OWNS BOTH MONITORS to review it, aka Linus.

BTW, profanity is unnecessary if you actually have something to back up your claims other than bad language.

I have more to back up my claims than you do. You called him closed-minded for not buying into Nvidia's BS marketing, nothing more. Even if you own a FreeSync monitor and a G-Sync monitor, unless both use the exact same panel and both are attempting to push the exact same frames at the exact same time, you're not going to get any evidence from a review about one being better than the other. So your comment was a needless attempt to belittle his opinion, or his lack of belief in Nvidia's claims, nothing more. Unless you have something more than Nvidia marketing to back up your claim, you were being just as bad as he was.



AMD cooling solutions? I was unaware AMD was even in the cooling market; I'm yet to see an AMD cooler being sold. Both AMD and Nvidia reference coolers are loud and awful; if you buy one you're either an idiot or more worried about appearance than performance, so AMD running warmer shouldn't even bother you.

Maybe not a team as big as something like CM, Corsair, or Noctua, but they do have some people working on the reference coolers, and of course neither vendor's reference cooler is going to beat the OEMs': first, they're supposed to be cheap, and second, providing a better cooler is one of the few ways OEMs differentiate from reference. The problem was that AMD released the worst cooler ever put on a reference card; the chip is fine, but that it slows down due to bad thermals is not acceptable.

this is one of the greatest thing that has happened to me recently, and it happened on this forum, those involved have my eternal gratitude http://linustechtips.com/main/topic/198850-update-alex-got-his-moto-g2-lets-get-a-moto-g-for-alexgoeshigh-unofficial/ :')

i use to have the second best link in the world here, but it died ;_; its a 404 now but it will always be here

 


That video just shows a different panel type, not a technology difference. To be fair you'd have to compare them on the same type of panel, one having G-Sync and the other FreeSync. And I've heard of some issues with both technologies; they still need more work to be perfected. We should see the same monitors come out later on, one with Nvidia's G-Sync module and the same one without it but with AMD's FreeSync support. Say, a ROG Swift without the G-Sync module but with FreeSync, and then compare them.

 

Also what about latency? No one tests that.

Anyway, I'll still keep an eye on this monitor tech before I get one.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


It's either 40-144 Hz with a TN panel or 48-75 Hz with an IPS panel. That's the FreeSync zone.

 

http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Inside-and-Outside-VRR-Wind

 

Actually it's 9-240Hz

It depends on the monitor manufacturer how they implement it in the end.



I think Tom definitely picked his words carefully in that interview, which I appreciate. And the fact remains that as of now, G-Sync is the superior choice for those of us who are brand agnostic. And I like what he said about adopting AMD's take on V-Sync behavior above the refresh rate.

 

I appreciate what AMD is trying to do with the open standard, but until they can deliver as well as a dedicated G-Sync module, there will be many such as myself that are willing to pay extra for that. It just sucks that it limits our future card options.

Turnip OC'd to 3Hz on air


Maybe not a team as big as something like CM, Corsair, or Noctua, but they do have some people working on the reference coolers, and of course neither vendor's reference cooler is going to beat the OEMs': first, they're supposed to be cheap, and second, providing a better cooler is one of the few ways OEMs differentiate from reference. The problem was that AMD released the worst cooler ever put on a reference card; the chip is fine, but that it slows down due to bad thermals is not acceptable.

But almost no one actually uses a reference cooler, so it's pretty much a complete waste of money even developing them. Nvidia clearly believe it's not worth it, hence the reason they've been using the same bad cooler for the last few years, even when it's pushed to its limit on the Titan X. So sure, AMD have bad reference coolers, but that has no effect on the consumer at all, since no one should be using either reference cooler.

Technically AMD's reference cooler is better than Intel's, but we all know Intel have a clear and undeniable lead in CPUs, and some people actually do use stock CPU coolers. Not a single person in the entire world has an excuse to use a reference AMD GPU cooler, other than the card being 100% free, lol.

What's funny is I was always an Intel/Nvidia user, and I still own far more Nvidia cards than AMD cards; I just don't like fanboys on either side, nor do I like throwing money away for nothing, and Nvidia's current pricing (in AUD) pretty much makes buying an Nvidia card exactly that.



I have more to back up my claims than you do. You called him closed-minded for not buying into Nvidia's BS marketing, nothing more. Even if you own a FreeSync monitor and a G-Sync monitor, unless both use the exact same panel and both are attempting to push the exact same frames at the exact same time, you're not going to get any evidence from a review about one being better than the other. So your comment was a needless attempt to belittle his opinion, or his lack of belief in Nvidia's claims, nothing more. Unless you have something more than Nvidia marketing to back up your claim, you were being just as bad as he was.

You realize Linus' camera can record at 500 fps, right? How do you think they make slow-motion videos timing the input lag and frames of a monitor?

Also, maybe you didn't read the OP, or maybe you forgot, but most of what was said was justified in the video.

And when you see these monitors in person you may think they look the same, but comparing them side by side, someone like Linus will be able to say clearly whether one is better than the other and why.

PS: I called him closed-minded for not even reading the OP; he just threw his biased opinion at the topic title. And you obviously just agreed with someone who doesn't think rationally.


 


Actually it's 9-240 Hz.

It depends on the monitor manufacturer how they implement it in the end.

 

No it's not; that's AMD quoting the spec's theoretical limits, not any shipping implementation. When FreeSync is enabled it is limited to 40-144 Hz on a TN panel or 48-75 Hz on an IPS panel. Read PCPer:

 

"The published refresh rate is more than a bit misleading. AMD claims that FreeSync is rated at 9-240 Hz while G-Sync is only quoted at 30-144 Hz. There are two issues with this. First, FreeSync doesn’t determine the range of variable refresh rates, AdaptiveSync does. Second, AMD is using the maximum and minimum refresh range published by the standards body, not an actual implementation or even a planned implementation from any monitor or display. The 30-144 Hz rating for G-Sync is actually seen in shipping displays today (like the ASUS PG278Q ROG Swift). The FreeSync monitors we have in our office are either 40-144 Hz (TN, 2560x1440) or 48-75 Hz (IPS, 2560x1080); neither of which is close to the 9-240 Hz seen in this table."

 

http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion


You realize Linus' camera can record at 500 fps, right? How do you think they make slow-motion videos timing the input lag and frames of a monitor?

Also, maybe you didn't read the OP, or maybe you forgot, but most of what was said was justified in the video.

And when you see these monitors in person you may think they look the same, but comparing them side by side, someone like Linus will be able to say clearly whether one is better than the other and why.

PS: I called him closed-minded for not even reading the OP; he just threw his bias

Clearly you're unable to understand that unless the two techs are both using the exact same display panel and attempting to push the exact same frames at the exact same time, you're not going to get a reliable result no matter what FPS you film at. The OP didn't compare two identical displays; it compared a bunch of random stuff for marketing purposes. You should learn not to trust anything any company says about its own product being better, because of course they want you to buy it.


