Nvidia G-Sync vs Adaptive-Vsync?

Gerr

Everyone is still on W8.1, bro. At least a few months more...

 

Also, the AMD drivers suck because VSR can only do 3200x1800, whereas I can do 8K with DSR.

 

Most people are on Windows 7, "bro".

 

Going 8K with DSR is just stupid. DSR and VSR are horribly inefficient AA anyway, plus you can do the same thing with third-party tools.


Most people are on Windows 7, "bro".

 

Going 8K with DSR is just stupid. DSR and VSR are horribly inefficient AA anyway, plus you can do the same thing with third-party tools.

 

Why is it "stupid"? I play games at 5K DSR on my 1440p 144Hz monitor all day now.

 

Can't you just admit that a 980Ti is better than a Fury X... lmao. My last GPU was an R9 290, so...

Open your eyes and break your chains. Console peasantry is just a state of mind.

 

MSI 980Ti + Acer XB270HU 


Display section: 

Nvidia G-Sync vs Adaptive-Vsync

 

Nvidia technology vs Nvidia technology

 

Nvidia

 

AMD 

 

980Ti v Fury X

 

Sounds about right...  :D

i5 4690k | GTX 980Ti G1 Gaming | 16GB RAM | MSI Z97 Gaming 7 | NZXT Kraken X61 | 850 EVO 250GB x2 | 1TB 850 Evo | NZXT Noctis 450 | EVGA 750W 80+ Gold

 

Ducky Shine 3 TKL (Browns) | LG 34UC87C | Logitech MX Master | ATH-M50x's + DT990 Pro's

 


Why is it "stupid"? I play games at 5K DSR on my 1440p 144Hz monitor all day now.

 

Can't you just admit that a 980Ti is better than a Fury X... lmao. My last GPU was an R9 290, so...

 

Because it's the most primitive type of AA, and it completely ruins performance. There's no way you'd run any demanding game at anywhere near 60 FPS, let alone 144.
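
For context on the "primitive AA" point: DSR and VSR are essentially ordered-grid supersampling, i.e. render at k× the native resolution and filter back down, so the GPU shades roughly k² as many pixels. Here's a toy sketch of the downscale step in plain Python, purely illustrative (the real drivers do this on the GPU with a smarter Gaussian-style filter, not a box average):

```python
def downsample(img, k):
    """Average k x k blocks of a 2D grid of grayscale values.

    This box-filter downscale is the conceptual core of supersampling AA:
    edges rendered at high resolution become intermediate shades at
    native resolution. Cost of the render scales with k^2 pixels.
    """
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, k):
        row = []
        for x in range(0, w, k):
            block = [img[y + dy][x + dx] for dy in range(k) for dx in range(k)]
            row.append(sum(block) / (k * k))
        out.append(row)
    return out

# A hard black/white edge rendered at 2x native resolution
# comes out partially blended after the downscale:
hi_res = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 1, 1, 1],
]
print(downsample(hi_res, 2))  # [[0.0, 1.0], [0.5, 1.0]]
```

The k² cost is exactly why "8K DSR" is so expensive: 8K onto a 1080p panel means shading 16× the pixels of native resolution.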


Because it's the most primitive type of AA, and it completely ruins performance. There's no way you'd run any demanding game at anywhere near 60 FPS, let alone 144.

 

I run The Witcher 3 at around 40 FPS at 5K with Hairworks off. I don't know what's up with The Witcher 3, but it looks horrible at 1440p for some reason; the lowest I can play at is 4K DSR to 1440p.

 

I still play a lot of old games, so I love DSR. I've tested all the AC games; everything from the first one to Liberation runs at 5K at 60 FPS or higher.

 

Mafia 3 just got announced; want to replay Mafia 1 and 2? Those both work at 5K, etc. However, I hate when the UI gets tiny due to bad scaling in games like Total War (which makes no sense, since they're PC exclusives).

 

 

 

[screenshot: Rf0Y1U2.jpg]



I run The Witcher 3 at around 40 FPS at 5K with Hairworks off.

 

Oh, okay. I didn't realize you lived under a bridge. My bad.


Oh, okay. I didn't realize you lived under a bridge. My bad.

 

You think a 980Ti is expensive?



You think a 980Ti is expensive?

 

Meh, sort of, but more importantly it's not going to run Witcher 3 at 40 FPS at 5K resolution. Especially not with Hairworks on.


Meh, sort of, but more importantly it's not going to run Witcher 3 at 40 FPS at 5K resolution. Especially not with Hairworks on.

 

Hairworks sucks; I actually like it off more than on.

But please stop defending AMD just because you own an R9 270. My overclocked MSI 980Ti performs 10% or more better in games, has 6GB of VRAM instead of 4GB of HBM, is quieter than the Fury X, and has a hell of a lot better availability. Currently you can't buy a Fury X anywhere due to a shortage of HBM memory.

 

It's a big improvement over my R9 290 in terms of performance, noise, driver crashes, and features like DSR and ShadowPlay. Hell, it even runs cooler than my R9 290, which used fewer watts and had three fans instead of two.



Hairworks sucks; I actually like it off more than on.

But please stop defending AMD just because you own an R9 270. My overclocked MSI 980Ti performs 10% or more better in games, has 6GB of VRAM instead of 4GB of HBM, is quieter than the Fury X, and has a hell of a lot better availability. Currently you can't buy a Fury X anywhere due to a shortage of HBM memory.

 

It's a big improvement over my R9 290 in terms of performance, noise, driver crashes, and features like DSR and ShadowPlay. Hell, it even runs cooler than my R9 290, which used fewer watts and had three fans instead of two.

 

I'm not. You're the one hardcore fanboying for Nvidia in a thread that wasn't even about comparing AMD and Nvidia. Your performance claims are absurdly out of whack, and you've got the driver crashing and noise claims the wrong way around too (Nvidia is currently having driver problems, and the Fury X is quieter than the 980 Ti).


I'm not. You're the one hardcore fanboying for Nvidia in a thread that wasn't even about comparing AMD and Nvidia. Your performance claims are absurdly out of whack, and you've got the driver crashing and noise claims the wrong way around too (Nvidia is currently having driver problems, and the Fury X is quieter than the 980 Ti).

 

And who are you to say that? Do you own a 980Ti or an R9 290? No, you don't... Also, my mistake: I thought the OP was talking about FreeSync. I didn't even know Adaptive-VSync existed.

 

You're talking about driver problems on W10, but why should I or anyone else care, since everyone tells us NOT to upgrade to Windows 10 for at least a few months?

 

The Fury X is way louder. Just imagine: a single 120mm radiator has to handle 250W of heat, whereas the AIO on my relatively low-TDP CPU was already as loud as a tornado. It's basically one 120mm fan vs two 120mm fans on the MSI card.

 

About the performance:

 

 

 

Sorry for going so off-topic... this comment that I found on the video above describes how I felt when the Fury X was released:

 

I insult inferior products! I insult companies who talk big but end up falling short! AMD said this was the "Titan X" killer. AMD fanboys kept saying "red will rise again." All I'm seeing is that AMD fell flat on its face once again! I'm not an Nvidia fanboy. If Nvidia were pushing inferior products I'd be all over them as well. I'm sick and tired of AMD fanboys protecting and damage controlling for AMD! They should be the main ones criticizing AMD for releasing such a disappointing card. I blame AMD fanboys for the Fury X's failure. AMD realizes that they can put anything out there on the market because they have a dedicated fan base who'll buy anything with their brand on it. AMD fanboys accept mediocrity and it's hurting the industry! Force AMD to do better instead of defending them!

 



And who are you to say that? Do you own a 980Ti or an R9 290? No, you don't... Also, my mistake: I thought the OP was talking about FreeSync. I didn't even know Adaptive-VSync existed.

 

You're talking about driver problems on W10, but why should I or anyone else care, since everyone tells us NOT to upgrade to Windows 10 for at least a few months?

 

The Fury X is way louder. Just imagine: a single 120mm radiator has to handle 250W of heat, whereas the AIO on my relatively low-TDP CPU was already as loud as a tornado. It's basically one 120mm fan vs two 120mm fans on the MSI card.

 

About the performance:

 

-snip-

 

Sorry for going so off-topic... this comment that I found on the video above describes how I felt when the Fury X was released:

 

Windows 10 has already been adopted by millions and millions of people; it's going to matter a heck of a lot to a heck of a lot of people.

 

The Fury X is not louder; it's quieter than any 980 Ti that existed when it launched. It wouldn't surprise me if some of the OEMs have come up with 980 Ti's that are as quiet or quieter since then, but of course that'll come at a premium.

Comparing graphics card cooling to CPU cooling is pointless. They're completely different and always have been. A Core i7-4790K has a die size of 177 mm², whereas Fiji and GM200 are a massive 596 mm² and 601 mm² respectively.
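
A quick back-of-the-envelope calculation shows why the die-size difference matters. These are rough published TDP figures (and board power isn't all die power), so treat this as illustrative only:

```python
def watts_per_mm2(tdp_watts, die_mm2):
    """Approximate heat flux through the die surface, rounded to 2 places."""
    return round(tdp_watts / die_mm2, 2)

# Rough numbers: i7-4790K is rated at 88 W over a 177 mm^2 die;
# the Fury X's ~250 W (the figure cited above) spreads over Fiji's 596 mm^2.
cpu = watts_per_mm2(88, 177)
gpu = watts_per_mm2(250, 596)

print(cpu, gpu)  # 0.5 0.42
```

So even though the GPU dissipates roughly three times the total power, it's spread over more than three times the die area, which is part of why a CPU AIO's noise tells you little about a GPU AIO's.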

 

Nice video. I could complain that most YouTube videos are unreliable at best, but I'll skip that and go right to the fact that they don't cover The Witcher 3, the game we're focusing on here.


Windows 10 has already been adopted by millions and millions of people; it's going to matter a heck of a lot to a heck of a lot of people.

 

The Fury X is not louder; it's quieter than any 980 Ti that existed when it launched. It wouldn't surprise me if some of the OEMs have come up with 980 Ti's that are as quiet or quieter since then, but of course that'll come at a premium.

Comparing graphics card cooling to CPU cooling is pointless. They're completely different and always have been. A Core i7-4790K has a die size of 177 mm², whereas Fiji and GM200 are a massive 596 mm² and 601 mm² respectively.

 

Nice video. I could complain that most YouTube videos are unreliable at best, but I'll skip that and go right to the fact that they don't cover The Witcher 3, the game we're focusing on here.

 

Yes dude, circlejerk more. While you're the one on an R9 270 and Windows 7 HP (wtf???), I'm the one with first-hand experience with a 980Ti.

Do you want to know why I threw my AIO cooler in the garbage? The pump died after just 6 months. It nearly fried my CPU, which hit 99°C.

Also, did you just call DigitalFoundry an "unreliable YouTuber"? Seriously? They have over 1,375 videos benchmarking every single graphics card, and even framerates on PS4/Xbox One. They're also the guys behind eurogamer.net.

 

I wish you good luck and goodnight, sir.



Yes dude, circlejerk more. While you're the one on an R9 270 and Windows 7 HP (wtf???), I'm the one with first-hand experience with a 980Ti.

Do you want to know why I threw my AIO cooler in the garbage? The pump died after just 6 months. It nearly fried my CPU, which hit 99°C.

 

Also, did you just call DigitalFoundry a "random YouTuber"? Seriously?

 

Your inept first-hand experience does not overrule clear evidence from multiple experienced and reliable reviewers. Your claims of 40 FPS at 5K in The Witcher 3 are just straight-up BS. It can barely do that at 4K.

 

https://www.techpowerup.com/reviews/AMD/R9_Fury_X/24.html

 

http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-5.html

 

http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,20.html


Your inept first-hand experience does not overrule clear evidence from multiple experienced and reliable reviewers. Your claims of 40 FPS at 5K in The Witcher 3 are just straight-up BS. It can barely do that at 4K.

 

https://www.techpowerup.com/reviews/AMD/R9_Fury_X/24.html

 

http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-5.html

 

http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,20.html

 

I also like how the 980Ti completely demolishes the Fury X in the benchmarks you've linked.

 

I honestly don't care anymore, even if the Fury X were $100-200 cheaper. I had 1000 euros in savings; instead of getting a new PC like everyone else seems to think is cool, I bought a 980Ti, slapped it into my 4.5-year-old PC, and it works just fine.

 

It's just the best graphics card at the moment, even beating the Titan X. AMD has failed to deliver after overhyping us for so long.

 

42MB uncompressed Fraps screenshot of The Witcher 3 @ 5K: 42 FPS, usually 35 in the heaviest areas and up to 50 indoors.

[screenshot: pndfuctnxk.bmp]

 

 

Do I now have permission to go to sleep, your royal highness?



I also like how the 980Ti completely demolishes the Fury X in the benchmarks you've linked.

 

I honestly don't care anymore, even if the Fury X were $100-200 cheaper. I had 1000 euros in savings; instead of getting a new PC like everyone else seems to think is cool, I bought a 980Ti, slapped it into my 4.5-year-old PC, and it works just fine.

 

It's just the best graphics card at the moment, even beating the Titan X. AMD has failed to deliver after overhyping us for so long.

 

Except it doesn't. I'll hand you two quotes and a graph:

 

 

The Radeon Fury X in most scenarios (depending on the game intensity) will perform close to the GeForce GTX 980 Ti, with the usual exceptions here and there. And then once Ultra HD kicks in, things equalize or get better real fast at very acceptable framerates.

 

 

Instead, the Radeon R9 Fury X delivers performance surreally similar to Nvidia’s 980 Ti. Sure, the GM200-based board tends to finish ahead at 2560x1440, while the Fury’s massive memory bandwidth gives it the advantage at 3840x2160. In either case, though, you’d have a tough time telling the two cards apart.

 

[benchmark graph: fV5wVNG.gif]

 

Demolished indeed...

 

Look, you're happy with your GTX 980 Ti. I get that. You have reason to be, it's a great card. The fact that the Fury X is competitive does not take anything away from the 980 Ti.


Except it doesn't. I'll hand you two quotes and a graph:

 

 

 

[benchmark graph: fV5wVNG.gif]

 

Demolished indeed...

 

Look, you're happy with your GTX 980 Ti. I get that. You have reason to be, it's a great card. The fact that the Fury X is competitive does not take anything away from the 980 Ti.

 

1) Did you look at my 5K screenshot, or did it not load again?

2) Those results are with the cards NON-OVERCLOCKED. Overclocking a 980Ti can yield a 20% difference compared to the reference card.

3) 4K is the worst thing that ever happened to technology. Three years ago nobody except professional cinema shooters used or talked about 4K. Now everyone screams 4K here and 4K there, when in two years it's all going to be pointless with 8K TVs and monitors. There's a rumour that Apple will release an 8K iMac this fall. Technology is moving SUPER FAST.

 

Heck, it all started in 2006:

 

At the 2006 NAB show, Jannard announced that Red would build a 4K digital cinema camera and began taking pre-orders. More than one thousand people put down a refundable deposit, and Red began work to fulfill the orders.

 

4) 1440p 144Hz is the sweet spot for gaming. I have a 55-inch IPS 4K TV and an Acer 4K TN G-Sync panel that I use as a secondary monitor for productivity; the Acer XB270HU is my main monitor.

 

5) It's 3:26 AM; let's stop this for now. We all know we can throw these Fury Xs and 980Tis in the trashcan next year with the Pascal and 400 series. HBM2 is badass compared to HBM1, which is pretty useless. Hopefully AMD won't overhype us for nothing again.

 

Goodnight  :P 



Heyyo,

I just wanna o7 @o0Martin, who so far seems to be the only one who bothered to read and understand what the OP was asking, lol...

 

Display section: 

Nvidia G-Sync vs Adaptive-Vsync

 

Nvidia technology vs Nvidia technology

 

Nvidia

 

AMD 

 

980Ti v Fury X

 

Sounds about right...  :D

This topic is about G-Sync vs NVIDIA's Adaptive-VSync. :P

 

And to make sure everyone is on the same page, I am referring to Adaptive VSync, not Adaptive-Sync (FreeSync). And I know G-Sync is better, but by how much?

@Gerr, by A LOT. Adaptive-VSync only reduces input lag; it has not fixed screen tearing any time I've tried using it. All Adaptive-VSync does is unlock your framerate when it drops below 60 FPS to prevent input lag... and once it hits 60 FPS? It just caps your framerate at 60. It's not true VSync, so you still get screen tearing. Screen tearing gives me headaches, so I don't bother with Adaptive-VSync. I just play with VSync on, and I'll even go as far as lowering some graphics settings to maintain 60 FPS at all times, so input lag isn't an issue for me.
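
The behavior described above boils down to a simple per-frame toggle. Here's a minimal sketch of the decision rule, as I understand Nvidia's description of the feature (names are made up; the real logic lives inside the driver):

```python
REFRESH_HZ = 60  # assumed display refresh rate

def vsync_enabled(current_fps, refresh_hz=REFRESH_HZ):
    """Adaptive-VSync decision: sync only when the renderer keeps up.

    At or above the refresh rate, VSync is on: no tearing, framerate
    capped at refresh_hz. Below it, VSync is turned off, so e.g. 48 FPS
    stays 48 FPS instead of collapsing to 30 (refresh_hz / 2) as plain
    double-buffered VSync would -- at the cost of visible tearing.
    """
    return current_fps >= refresh_hz

print(vsync_enabled(75))  # True  -> synced, capped at 60, no tearing
print(vsync_enabled(48))  # False -> tearing, but not locked to 30
```

This is also why it differs from G-Sync: G-Sync changes the monitor's refresh to match the GPU, so below 60 FPS you get neither tearing nor the stutter of a dropped sync divisor.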

Heyyo,

My PC Build: https://pcpartpicker.com/b/sNPscf

My Android Phone: Exodus Android on my OnePlus One 64bit in Sandstone Black in a Ringke Fusion clear & slim protective case


Heyyo,

I just wanna o7 @o0Martin, who so far seems to be the only one who bothered to read and understand what the OP was asking, lol...

 

This topic is about G-Sync vs NVIDIA's Adaptive-VSync. :P

 

 

I pointed that out on the first page...

