
Nvidia Pitches Advantages of G-Sync Over AMD's FreeSync

BiG StroOnZ

I haven't seen anyone say they want AMD to fail. I have seen a few say they are not convinced it is as good as others are making out. Given we have only a few reviews to go off, and one of them wasn't very positive, I'd say the response is mostly fair. With the exception, of course, of those few who are blindly saying one is superior without any real hands-on experience or evidence to support that claim.

There are a lot of people here who are just out to find reasons to smear mud on AMD.


While I disagree with Nvidia's tendency to lock technologies to their own cards, you have to understand they spend a lot of money and time developing this stuff. They have every right in the world to limit it to their own GPUs; I wouldn't call this anti-consumer. If AMD had a bigger market share, maybe Nvidia would feel pressured to include that part of the market as well, but as it is right now they don't need to. It's kind of an annoying circle, since a large market share means they don't have to make their technologies available to others, and those same technologies likely drive sales.

 

You disagree?

 

  • Advanced PhysX
  • G-Sync
  • GameStream (Shield)
  • Nvidia 3D Vision
  • CUDA
  • NVLink

All of these require buying proprietary hardware (with PhysX and CUDA, it is the GPU itself).

 

No one is denying Nvidia's right to income or to recouping their investment. But proprietary solutions always come at the cost of the consumer. They divide markets and undermine competition, primarily because you get locked into an ecosystem, which means you have a huge sunk cost that you will have to pay again if you change brand (new monitor, new 3D glasses, new streaming hardware, etc.). None of this is ever in the consumer's interest.

 

Again, Nvidia has every right to do so. But anyone in here should have the consumer's interest at heart, not the company's. Only investors, or maybe employees, should ever be interested in the latter.

 

Nvidia only has a large market share because people accept these anti-consumer and anti-competitive behaviours.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


With Nvidia, a larger market share means they don't have to properly support their older cards after they launch new ones. What better way is there to reward customer loyalty than doing this?

 

fc4-fps.gif

 

What is that supposed to show? Older, weaker cards doing worse in games? 

 

Is that your evidence of not supporting older cards? Are they supposed to give the 660 magical juice to become stronger? 


What is that supposed to show? Older, weaker cards doing worse in games? 

 

Is that your evidence of not supporting older cards? Are they supposed to give the 660 magical juice to become stronger? 

Really? Since when is the 770 weaker than a 280x? Or a 760 than a 270x?

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


What is that supposed to show? Older, weaker cards doing worse in games? 

 

Is that your evidence of not supporting older cards? Are they supposed to give the 660 magical juice to become stronger? 

This should help jog your memory. This is how they performed in games prior to the 900 series.

 

bl-fps.gif

 

ai-fps.gif



Company A says their product is better than Company B's. Must be true. Bottom line is this is new technology. Both are going to have growing pains, and no one will know for sure which one is better until a few years from now. Fanboys need to let the tech mature before drawing conclusions. Hell, fanboys need to mature too  :rolleyes:

PC Audio Setup = Beyerdynamic DT 770 pro 80 ohm and Sennheiser pc37x (also for xbox) hooked up to Schiit Fulla 3


This should help jog your memory. This is how they performed in games prior to the 900 series.

 

 

 

 

 

Do you have a benchmark showing us Far Cry 4 before the 900 launch and after, rather than just spitting out various unrelated games?

 

You are really, really reaching for reasons to hate Nvidia on this one. 


Do you have a benchmark showing us Far Cry 4 before the 900 launch and after, rather than just spitting out various unrelated games?

 

You are really, really reaching for reasons to hate Nvidia on this one. 

........ Far Cry 4 launched AFTER the 900 series came out. Even if it didn't, how is that reaching? The GTX 770 has never performed an average of 11 fps worse than a 280X, and neither has a 760 performed 13 fps worse than a 270X, even in games where the other card does better. Give me your take on what is causing the disparity.
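The size of the gap being argued here can be made concrete with a quick sketch: given average-fps results for two cards in the same title, the absolute and relative difference shows how far out of line a result like "11 fps worse" is. The numbers below are illustrative placeholders, not taken from any specific review.

```python
def fps_gap(card_a_fps, card_b_fps):
    """Return (absolute, percentage) gap of card A relative to card B."""
    diff = card_a_fps - card_b_fps
    pct = 100.0 * diff / card_b_fps
    return diff, pct

# Hypothetical example: card A averages 49 fps, card B averages 60 fps
diff, pct = fps_gap(49, 60)
print(f"{diff:+} fps ({pct:+.1f}%)")  # -> -11 fps (-18.3%)
```

An 11 fps deficit at these frame rates is close to a 20% gap, which is the kind of swing being called unusual for these two cards.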



........ Far Cry 4 launched AFTER the 900 series came out. Even if it didn't, how is that reaching? The GTX 770 has never performed an average of 11 fps worse than a 280X, and neither has a 760 performed 13 fps worse than a 270X, even in games where the other card does better. Give me your take on what is causing the disparity.

 

Games usually prefer one brand over the other. AMD doing better in Far Cry 4 does not prove that Nvidia stops supporting old graphics cards.



Games usually prefer one brand over the other. AMD doing better in Far Cry 4 does not prove that Nvidia stops supporting old graphics cards.

 

Exactly. And you can't compare different games as evidence of Nvidia gimping a card. That's flawed methodology and testing.

 

What he should've done is find benchmarks from a game pre-900 (say, BF4, since that gets tested a lot) and then benchmarks post-900 using the same game, THEN make such an accusation using proper facts that back up his position.

 

Giving me three different games doesn't tell me that Nvidia started gimping their older cards. 


Games usually prefer one brand over the other. AMD doing better in Far Cry 4 does not prove that Nvidia stops supporting old graphics cards.

Far Cry 4 is an Nvidia-partnered game. And yes, I know that games favor one brand over the other, but the performance difference in the games where the 280X beats a 770 (and the 270X beats a 760) has never been this huge. If it's about being favored, why is a 770 losing to a 960?



Exactly. And you can't compare different games as evidence of Nvidia gimping a card. That's flawed methodology and testing.

 

What he should've done is find benchmarks from a game pre-900 (say, BF4, since that gets tested a lot) and then benchmarks post-900 using the same game, THEN make such an accusation using proper facts that back up his position.

 

Giving me three different games doesn't tell me that Nvidia started gimping their older cards. 

Because they're not going to go back and change the drivers to make the cards perform worse in old games. They do so in new games, to show just how much better their new cards are than their older ones and why you should buy them.



Games usually prefer one brand over the other. AMD doing better in Far Cry 4 does not prove that Nvidia stops supporting old graphics cards.

 

 

Exactly. And you can't compare different games as evidence of Nvidia gimping a card. That's flawed methodology and testing.

 

What he should've done is find benchmarks from a game pre-900 (say, BF4, since that gets tested a lot) and then benchmarks post-900 using the same game, THEN make such an accusation using proper facts that back up his position.

 

Giving me three different games doesn't tell me that Nvidia started gimping their older cards. 

Is a 280x better than a 780 because games favor different cards?

 

(Sorry for the big image, I don't know how to add spoilers)

r1920.png



When did this thread get derailed into some hardcore tinfoil-hat conspiracy theories?

 

Here are some benchmarks with the GTX 980, 970, and 780, along with the 280X. Nowhere in them does the GTX 780 lose to the 280X.

 

http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_970_and_980_reference_review,17.html

 

Here's the Titan X review.

http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_titan_x_review,16.html

 

Why aren't you also adding the sources for these reviews? You seem to be picking only a single benchmark at a time, from different sites and different games.

 

Have a look through the Guru3D reviews; I don't see how NV is gimping their cards.

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


 

 

Nvidia only has a large market share because people accept these anti-consumer and anti-competitive behaviours.

Yes, exactly why I don't plan on supporting them any time soon. The available features are tempting, I just wish AMD would hurry up and launch some competition to the 900 series already.


Yes, exactly why I don't plan on supporting them any time soon. The available features are tempting, I just wish AMD would hurry up and launch some competition to the 900 series already.

Trust me, you don't want AMD to hurry up. We all know what happens when they're in a hurry. :D

CPU: AMD Ryzen 7 3800X Motherboard: MSI B550 Tomahawk RAM: Kingston HyperX Predator RGB 32 GB (4x8GB) DDR4 GPU: EVGA RTX3090 FTW3 SSD: ADATA XPG SX8200 Pro 512 GB NVME | Samsung QVO 1TB SSD  HDD: Seagate Barracuda 4TB | Seagate Barracuda 8TB Case: Phanteks ECLIPSE P600S PSU: Corsair RM850x

 

 

 

 

I am a gamer, not because I don't have a life, but because I choose to have many.

 


When did this thread get derailed into some hardcore tinfoil-hat conspiracy theories?

 

Here are some benchmarks with the GTX 980, 970, and 780, along with the 280X. Nowhere in them does the GTX 780 lose to the 280X.

 

http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_970_and_980_reference_review,17.html

 

Here's the Titan X review.

http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_titan_x_review,16.html

 

Why aren't you also adding the sources for these reviews? You seem to be picking only a single benchmark at a time, from different sites and different games.

 

Have a look through the Guru3D reviews; I don't see how NV is gimping their cards.

Because those are old games. And thank you for proving my point that a 280X is not better than a 780 (seriously, did you even read what I was saying?). I posted two different sources that show the 770 losing to a 280X in Far Cry 4. In YOUR benchmarks the 970 loses to the 780 Ti. Now let's look at games that have come out since the 900 release. (Once again, I don't know how to do spoilers, so I'll just post the links instead of giant images.)

 

Far Cry 4 is obviously above

 

Dying Light:

http://www.techspot.com/review/956-dying-light-benchmarks/page3.html

 

Evolve

http://www.techspot.com/review/962-evolve-benchmarks/page3.html

 

The 970 that lost to a 780 Ti in YOUR benchmark is now either on par with it or beating it in new games. I'm also still waiting for someone to tell me how a 770 losing to a 960 is because a game favors AMD.

 

EDIT: Just noticed I didn't provide the sources for the previous ones:

http://techreport.com/review/27702/nvidia-geforce-gtx-960-graphics-card-reviewed/5

 

http://www.techspot.com/review/917-far-cry-4-benchmarks/page3.html



There are a lot of people here who are just out to find reasons to smear mud on AMD.

I'll admit there are probably one or two anti-AMD comments that are unjustified in the last 11 pages, but you make it sound like anyone who isn't convinced about FreeSync right now is an Nvidia fanboy out to smear mud.

 

It is a simple fact that we are at one of those points in the tech cycle where AMD is chasing and Nvidia/Intel are leading. This isn't mud-smearing or wanting AMD to fail; it is just discussing the facts.

It will turn around. AMD will retake the crown again and Nvidia/Intel will be chasing; it's just going to take longer than it has in the past, because success for tech companies is strongly linked to R&D resources over time.

 

In future, when you see a post where someone states they want to see company X fail, die, or go bust and then offers nothing to the conversation, report them for trolling. I do it for a lot of negative comments that go beyond personal opinion. The mods are quite good at deleting BS posts.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


I'll admit there are probably one or two anti-AMD comments that are unjustified in the last 11 pages, but you make it sound like anyone who isn't convinced about FreeSync right now is an Nvidia fanboy out to smear mud.

 

It is a simple fact that we are at one of those points in the tech cycle where AMD is chasing and Nvidia/Intel are leading. This isn't mud-smearing or wanting AMD to fail; it is just discussing the facts.

It will turn around. AMD will retake the crown again and Nvidia/Intel will be chasing; it's just going to take longer than it has in the past, because success for tech companies is strongly linked to R&D resources over time.

 

In future, when you see a post where someone states they want to see company X fail, die, or go bust and then offers nothing to the conversation, report them for trolling. I do it for a lot of negative comments that go beyond personal opinion. The mods are quite good at deleting BS posts.

There's a guy who thinks that it's AMD's fault that monitor makers are not making 240Hz IPS monitors.


There's a guy who thinks that it's AMD's fault that monitor makers are not making 240Hz IPS monitors.

 

Still doesn't change the fact that the majority of people don't want AMD to fail, and haven't said as much either.



There's a guy who thinks that it's AMD's fault that monitor makers are not making 240Hz IPS monitors.

 

Let's not stretch the truth here. Nobody thinks it's anyone's fault that there are no 240Hz monitors available. My problem is that they are advertising and promoting FreeSync as being able to do up to 240Hz, when the actual FreeSync monitors available top out at 144Hz. Which is no better than G-Sync's current max of 144Hz (not a limit of G-Sync itself, but of the monitors). So let's not lie or make up bullcrap to make a point. AMD is using the VESA standard range of 9-240Hz to promote their products, when their actual products max out at 40-144Hz. This is the limitation of current panels. So this is a problem with AMD: they are falsely promoting their products to counter NVIDIA, but are basically lying in the process. Which shouldn't be acceptable to anyone with half a brain.

 

If you think it's acceptable to falsely promote products, then I don't know what to tell you. But based on the GTX 970 fiasco, I would assume people would have a problem with AMD lying about FreeSync. I guess it's ok since it's AMD, right?  :rolleyes:
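The distinction being argued can be sketched in a few lines: the usable variable-refresh window is the overlap between what the Adaptive-Sync spec allows and what a given panel actually supports. This is an illustrative sketch, not AMD's or VESA's implementation; the 40-144Hz panel figures are the ones quoted in this thread.

```python
# Range the DisplayPort Adaptive-Sync spec allows, per the thread's figures
ADAPTIVE_SYNC_SPEC_HZ = (9, 240)

def usable_vrr_range(panel_min_hz, panel_max_hz, spec=ADAPTIVE_SYNC_SPEC_HZ):
    """Return the refresh range a FreeSync setup can actually use:
    the intersection of the spec range and the panel's own range."""
    lo = max(spec[0], panel_min_hz)
    hi = min(spec[1], panel_max_hz)
    if lo > hi:
        raise ValueError("panel range does not overlap the spec range")
    return (lo, hi)

# A current 40-144Hz FreeSync panel: the panel, not the spec, is the limit
print(usable_vrr_range(40, 144))  # -> (40, 144)
```

The point of contention is exactly this gap: the spec ceiling (240Hz) only matters once a panel's own range reaches it.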


Let's not stretch the truth here. Nobody thinks it's anyone's fault that there are no 240Hz monitors available. My problem is that they are advertising and promoting FreeSync as being able to do up to 240Hz, when the actual FreeSync monitors available top out at 144Hz. Which is no better than G-Sync's current max of 144Hz. So let's not lie or make up bullcrap to make a point. AMD is using the VESA standard range of 9-240Hz to promote their products, when their actual products max out at 40-144Hz. This is the limitation of current panels. So this is a problem with AMD: they are falsely promoting their products to counter NVIDIA, but are basically lying in the process. Which shouldn't be acceptable to anyone with half a brain.

 

If you think it's acceptable to falsely promote products, then I don't know what to tell you. But based on the GTX 970 fiasco, I would assume people would have a problem with AMD lying about FreeSync. I guess it's ok since it's AMD, right?  :rolleyes:

How do you know that they can't support a 240Hz monitor if one comes out?


How do you know that they can't support a 240Hz monitor if one comes out?

 

They can support it, and so can NVIDIA, but there aren't any 240Hz panels planned anytime in the near future, and there probably won't be for three years minimum. So there's no point in AMD using the 9-240Hz spec, because that range is non-existent in actual products at the moment.


 

 

How do you know that they can't support a 240Hz monitor if one comes out?

He never said they wouldn't support it if it came out. He's saying that, as of right now, they are unable to support it, because at this point in time there is no 240Hz TN or IPS panel that is compatible.


They can support it, and so can NVIDIA, but there aren't any 240Hz panels planned anytime in the near future, and there probably won't be for three years minimum. So there's no point in AMD using the 9-240Hz spec, because that range is non-existent in actual products at the moment.

So people should be outraged at VESA for saying DisplayPort 1.3 can support 8K monitors, because there are no 8K monitors yet? -_-

