
AMD FreeSync VS Nvidia G-Sync (Tom's Hardware)

Rekx

The opposite is EXACTLY TRUE.

 

If you HAVE A FREESYNC-CAPABLE MONITOR, YOU CAN'T MAGICALLY GET G-Sync, so if you WANT TO KEEP FREESYNC, THEN YOU HAVE TO BUY AMD.

Don't pretend this isn't a two-way street.

You can use G-Sync monitors without Nvidia GPUs; G-Sync just won't be an option anymore.

 

That's exactly what he is saying. The post he quoted claimed you'd be stupid not to go G-Sync. Dale is arguing that the brand-locking nature of the technology, as it stands now, means that under that poster's mindset you'd technically be stupid regardless of which sync you chose. I.e., it's no more stupid to choose FreeSync than it is to choose G-Sync.

Personally, I recommend getting whatever suits you best at the time. For some people it will be governed by price, while others will trade CPU power and a flashy case for GPU power and a G-Sync/FreeSync monitor.


The opposite is EXACTLY TRUE.

 

If you HAVE A FREESYNC-CAPABLE MONITOR, YOU CAN'T MAGICALLY GET G-Sync, so if you WANT TO KEEP FREESYNC, THEN YOU HAVE TO BUY AMD.

Don't pretend this isn't a two-way street.

You can use G-Sync monitors without Nvidia GPUs; G-Sync just won't be an option anymore.

 

Wow, someone got triggered.


The opposite is EXACTLY TRUE.

 

If you HAVE A FREESYNC-CAPABLE MONITOR, YOU CAN'T MAGICALLY GET G-Sync, so if you WANT TO KEEP FREESYNC, THEN YOU HAVE TO BUY AMD.

Don't pretend this isn't a two-way street.

You can use G-Sync monitors without Nvidia GPUs; G-Sync just won't be an option anymore.

You missed my point entirely. @mr moose hit the nail on the head.

 

Who the hell is gonna buy a G-Sync monitor and then buy an AMD GPU? No one. The people who would do that are a fraction of a percent of total G-Sync monitor sales.

 

And yes, at the moment, AMD is the only GPU vendor offering Adaptive-Sync compatibility - but that is NVIDIA's fault, and no one else's. Adaptive-Sync is a completely open standard, with no license fee to implement on the GPU side. NVIDIA could add Adaptive-Sync support tomorrow if they wanted. But they won't, because they want you to buy G-Sync and suffer from vendor lock-in.

 

Intel could (and likely will, given time) offer Adaptive-Sync compatibility on their iGPUs too.

 

That's the entire point of Adaptive-Sync - no vendor lock-in. Anyone can adopt it if they so desire.

 

But even if you consider the fact that AMD is currently the only one to offer Adaptive-Sync compatibility, in the form of FreeSync, to be vendor lock-in too - well, vendor lock-in is never good, but at least with Adaptive-Sync there's the potential for all GPU vendors to get on board.

 

Good luck ever seeing an AMD GPU support G-Sync.

 

The end result is that if you buy a G-Sync monitor, you will likely never consider an AMD GPU again unless you eventually replace that monitor completely.

 



I hope G-Sync is better for an additional $150.


You are missing the point that because Nvidia will not support that open standard, FreeSync also becomes a locked standard. There is no other competitor - unless you are going to argue that maybe Intel will support it with their iGPUs.

That statement is literally pointless, because both standards lock people onto a GPU side.

It doesn't matter that it's an open standard if only one company supports it.



You're glossing over my point entirely.

 

EVEN IF you can consider FreeSync "Vendor Lock-in"... Guess what? THAT'S STILL BAD!

 

Vendor lock-in is bad. Period. The simple fact is, if you buy a specific unnamed monitor that requires a specific unnamed GPU brand to operate a highly desirable feature - regardless of what that brand is - you will continue to buy that same brand of GPU every time you upgrade, even if the unnamed alternative GPU vendor offers a better, faster, stronger, cheaper product, and even if that alternative GPU offers a "highly desirable" feature of its own that you can't use, because you bought the competitor's monitor.

 

Do you understand what I'm saying?

 

And yes, for the record, I am going to argue that Intel will support Adaptive-Sync on their iGPUs. It's coming, sooner or later.



OK, then for the record, say vendor lock-in is bad. Don't say G-Sync is bad. That sort of double standard is what pissed me off. I don't give a shit which way you want to go. I do, however, care about hypocrisy.

I would argue that even if Intel supports the standard (which I don't believe they ever will, because they do a very, very good job of not picking sides - just look at their API implementations), it won't be available for quite some time (Cannonlake or later), making it still a locked platform for the foreseeable future.

I do apologize for taking it out on you, but man, when something comes out against one company or another, so many fanboys on this forum don't even stop to read or think about what they've read before claiming bias (the first person in this thread I replied to was a clear example of this).



I do think G-Sync is even worse than regular old vendor lock-in, because AMD at least took FreeSync to VESA and made it an open industry standard.

G-Sync will never be an option for AMD, even if AMD wanted it.

That's not a double standard. That's AMD giving us an open standard, and NVIDIA choosing not to use it. NVIDIA didn't even offer G-Sync to AMD (from a business perspective, why would they?).


Still waiting for good 24-inch, 1080p, sub-$300, 144 Hz G-Sync monitors... Mass-produce them and people WILL buy them.


I do think G-Sync is even worse than regular old vendor lock-in, because AMD at least took FreeSync to VESA and made it an open industry standard.

G-Sync will never be an option for AMD, even if AMD wanted it.

That's not a double standard. That's AMD giving us an open standard, and NVIDIA choosing not to use it. NVIDIA didn't even offer G-Sync to AMD (from a business perspective, why would they?).

 

An open standard is completely irrelevant if it isn't used by more than one player... Then it is still a unique feature and locks people in either way.

Also, this was Nvidia developing a standard (note that it was/is also hardware-based), then, almost a year(?) later, AMD making an open standard - knowing full well that Nvidia would never abandon its own standard (hell, they haven't given up on PhysX yet...) - as a sheer publicity stunt to make themselves seem more 'open'.

It's like giving AMD credit for taking Mantle and handing it to the Khronos Group for Vulkan - they had no choice in the matter...

On a side note, I would like to see the G-Sync modules sold separately again so that more G-Sync options would appear, but it looks like the top monitor makers would rather keep them exclusive so they can charge a heftier premium for G-Sync.



If AMD doesn't push an open standard, regardless of their motivations, then one simply won't get adopted. NVIDIA will never propose an open standard - it's completely against their business practice, which is as much vendor lock-in as possible. Good for NVIDIA users right now, but bad for consumers in the long run.

Obviously AMD did FreeSync as a publicity stunt, but that doesn't mean it was a bad move, or wasn't good for consumers.

I do agree that NVIDIA should allow the modules to be sold separately, but I completely understand why they don't. G-Sync is all about a calibrated, tweaked experience. If they start selling modules, then they lose control over the experience. They won't be able to guarantee how well things run, and you'll just end up with all the same problems as FreeSync, but without the benefits (e.g. an open standard and a lower cost).

What I would LOVE is a monitor vendor who offers a single monitor with both Adaptive-Sync compatibility and G-Sync. That would truly be a step forward, and would allow vendor lock-in to be sidestepped.

I don't think NVIDIA would ever allow that, though.


An open standard is completely irrelevant if it isn't used by more than one player... Then it is still a unique feature and locks people in either way.

Also, this was Nvidia developing a standard (note that it was/is also hardware-based), then, almost a year(?) later, AMD making an open standard - knowing full well that Nvidia would never abandon its own standard (hell, they haven't given up on PhysX yet...) - as a sheer publicity stunt to make themselves seem more 'open'.

It's like giving AMD credit for taking Mantle and handing it to the Khronos Group for Vulkan - they had no choice in the matter...

On a side note, I would like to see the G-Sync modules sold separately again so that more G-Sync options would appear, but it looks like the top monitor makers would rather keep them exclusive so they can charge a heftier premium for G-Sync.

I don't think you know what 'standard' means. Also, FreeSync was announced 2.5 months after G-Sync was. It's hilarious that you're accusing people of fanboyism when your own is making you make shit up.


If AMD doesn't push an open standard, regardless of their motivations, then one simply won't get adopted. NVIDIA will never propose an open standard - it's completely against their business practice, which is as much vendor lock-in as possible. Good for NVIDIA users right now, but bad for consumers in the long run.

Obviously AMD did FreeSync as a publicity stunt, but that doesn't mean it was a bad move, or wasn't good for consumers.

I do agree that NVIDIA should allow the modules to be sold separately, but I completely understand why they don't. G-Sync is all about a calibrated, tweaked experience. If they start selling modules, then they lose control over the experience. They won't be able to guarantee how well things run, and you'll just end up with all the same problems as FreeSync, but without the benefits (e.g. an open standard and a lower cost).

What I would LOVE is a monitor vendor who offers a single monitor with both Adaptive-Sync compatibility and G-Sync. That would truly be a step forward, and would allow vendor lock-in to be sidestepped.

I don't think NVIDIA would ever allow that, though.

I don't think Nvidia has the say in that matter; monitor sellers have no incentive for a one-size-fits-all experience (they get more sales if people swap standards, and good lord, we all know TV makers have no excuse for the lack of DisplayPort except to drive sales of other products). They used to sell the modules, by the way, but I think they were actually sold at a loss, as a way to try to get people onto the hype train.

The real question in my mind is: is mobile G-Sync actually distinct from FreeSync? How would you know?


I don't think you know what 'standard' means. Also, FreeSync was announced 2.5 months after G-Sync was. It's hilarious that you're accusing people of fanboyism when your own is making you make shit up.

You are right. I confused the announcement of G-Sync (which included working prototypes in October) with the first demonstration of a FreeSync prototype monitor (which was nine months later, in June). The announcement that AMD was working on it came three months later. The first consumer product announcement for FreeSync was 13 months after G-Sync was announced (and 10 months after the first consumer G-Sync monitor was announced).

As to 'standard': no, I'm not. You have a particular common-usage sense of 'standard' in mind, to which I may or may not have subscribed (like the refresh vs. rebrand debate).


I don't think Nvidia has the say in that matter; monitor sellers have no incentive for a one-size-fits-all experience (they get more sales if people swap standards, and good lord, we all know TV makers have no excuse for the lack of DisplayPort except to drive sales of other products). They used to sell the modules, by the way, but I think they were actually sold at a loss, as a way to try to get people onto the hype train.

The real question in my mind is: is mobile G-Sync actually distinct from FreeSync? How would you know?

They did sell the module at first, but that was ONLY for the original G-Sync launch monitor (I think it was the Swift - it was the original G-Sync ASUS monitor, anyway). They only did that so that people who had already purchased the monitor didn't feel completely screwed.

As for mobile G-Sync vs. FreeSync? I doubt there really is any major difference.



Yeah, but you could still use it for other monitors - it was just sold on its own at all the different e-retailers. (Also, it's pretty amusing that apparently one of the new mobile G-Sync laptops is getting flak for not letting people who bought the same laptop, at the same price, just before the G-Sync version was announced get the software upgrade. I mean, I guess if Nvidia is charging a licensing fee on mobile G-Sync, but still...)

I doubt there is any major difference... The question is: is there ANY difference? Because for quite a while it sounded like G-Sync was specifically and intentionally hardware-bound, and it wasn't until after FreeSync had already been released to market that a software version of G-Sync was brought into play (I mean, it could be partially a money grab, or features planned years from now, but still...).

Didn't Nvidia originally say they wouldn't sell laptops with it? (I honestly don't remember if Linus's video covered it.)


Asus will be releasing a G-Sync monitor, the PG279Q, in the third quarter, and currently has the FreeSync MG279Q on the market. It's my understanding that both of these monitors use the same panel. For those who own a 980 or better, is it worth paying $200 more for G-Sync when your video card is more than capable of running 1440p at 144 Hz with little to no tearing?

Also, if you turn G-Sync off, do you lose any performance?


 

3) Streaming and GPU-based video encoding is available on AMD cards as well, via VCE.

Is it 1080p 60 FPS with very little quality loss?
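
For anyone curious how the VCE path is actually driven in practice: one common route on current systems is ffmpeg's AMF encoder. The sketch below is illustrative only - it assumes an ffmpeg build with AMF support and an AMD GPU, and the file names and bitrate are placeholders rather than a quality claim.

import subprocess

# Hypothetical example: hand H.264 encoding of a recording to the AMD hardware
# encoder (the VCE/AMF block) through ffmpeg. Whether 1080p60 looks "lossless enough"
# depends on the bitrate and the encoder generation, so treat these numbers as a
# starting point, not a benchmark.
cmd = [
    "ffmpeg", "-y",
    "-i", "gameplay_capture.mkv",   # placeholder input recording
    "-c:v", "h264_amf",             # AMD hardware H.264 encoder
    "-b:v", "6M",                   # ~6 Mbps, a common 1080p60 streaming target
    "-r", "60",
    "-c:a", "copy",                 # pass audio through untouched
    "encoded_1080p60.mp4",
]
subprocess.run(cmd, check=True)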


Asus will be releasing a G-Sync monitor, the PG279Q, in the third quarter, and currently has the FreeSync MG279Q on the market. It's my understanding that both of these monitors use the same panel. For those who own a 980 or better, is it worth paying $200 more for G-Sync when your video card is more than capable of running 1440p at 144 Hz with little to no tearing?

Also, if you turn G-Sync off, do you lose any performance?

 

Personally, if you already own a G-Sync-capable card which you don't intend to upgrade in the next few years, then it is absolutely better to buy a G-Sync monitor now. Otherwise you are going to spend a whole lot more to get both a new monitor and a new GPU if you swap to FreeSync.

Conversely, if you are buying both (GPU and monitor) from scratch, then the best thing to do is weigh up the total cost of getting the performance/features you want, and then decide based on that.

EDIT: also, turning G-Sync off won't lose performance, but it will introduce screen tearing.
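
To make that EDIT concrete, here is a toy model - purely illustrative, with made-up frame times rather than measurements of any real G-Sync or FreeSync panel - of why turning sync off costs no frame rate but tears, while an adaptive-sync display shows the same frames untorn by moving its refresh instead.

import random

# Toy model: a 144 Hz panel fed by a GPU with varying frame times. All numbers are
# illustrative assumptions, not measurements of real hardware.
REFRESH = 1.0 / 144          # fixed scanout period, in seconds
VRR_FASTEST = 1.0 / 144      # adaptive-sync upper bound: the panel cannot exceed 144 Hz

random.seed(1)
frame_times = [random.uniform(0.006, 0.012) for _ in range(1000)]  # roughly 83-166 fps

# Sync off on the fixed-refresh panel: each frame is flipped the instant it is done,
# so no frame rate is lost, but a flip landing mid-scanout shows up as a tear line.
t, tears = 0.0, 0
for ft in frame_times:
    t += ft
    if (t % REFRESH) > 1e-6:     # flip not aligned to a refresh boundary -> torn frame
        tears += 1

# Adaptive sync: the panel starts its scanout when the frame is ready (held only to the
# 144 Hz cap), so the same frames are shown whole, without waiting out a fixed refresh.
t, held = 0.0, 0
for ft in frame_times:
    wait = max(0.0, VRR_FASTEST - ft)   # frames faster than 144 Hz are held briefly
    held += wait > 0
    t += ft + wait

print(f"Sync off:      {tears} of {len(frame_times)} frames torn")
print(f"Adaptive sync: 0 torn, {held} frames briefly held to the 144 Hz cap")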



Yep, at the current price difference of $150, it's not worth getting Nvidia. Both are using the same panel, and I'm getting better results using the AMD card with FreeSync turned off.

"Used a 980 vs. a Fury for my comparison, with the Asus MG279Q and Asus PG279Q. Adding this info for the crazies."



This is probably one of the worst times to buy hardware, because there is no way of knowing what will be supported in just a year's time. I'm glad Mantle is gone and its assets have been consolidated into Vulkan. As much as I liked what they did/tried to do with Mantle, that kind of vendor lock-in is terrible. I predict that Nvidia will eventually support Adaptive-Sync, because it would be insane for them not to, but it would be driving me insane right now if I owned an Nvidia card and was trying to decide on a monitor.



I was going to upgrade, but the reality is that, for me, I am happy with 1050.

I think the reality is there is never a good time (so to speak) to buy tech; there will always be something better and cheaper in a few months or years. My philosophy is: breathe. The chance of spending big and ending up with something unusable after a short time is very rare, so just get what you can afford and be happy with it.


Yep, at the current price difference of $150, it's not worth getting Nvidia. Both are using the same panel, and I'm getting better results using the AMD card with FreeSync turned off.

"Used a 980 vs. a Fury for my comparison, with the Asus MG279Q and Asus PG279Q. Adding this info for the crazies."

 

It's most likely not worth it if you already own an AMD GPU or still need to buy a GPU, but if you already have a decent G-Sync-capable GPU then the extra $150 is worth it in my mind. Sure, it means you'll be stuck with Nvidia until you are ready to buy a new monitor, but the reverse is also true if you get FreeSync. Personally, I wouldn't count on things changing anytime soon.



All I know is it was a bitch shopping for a new monitor, looking over G-Sync and FreeSync monitors. For me, all I cared about was an IPS panel at 1440p and 144 Hz with a low enough GtG response time. I bought my video card after I got my monitor. To me, choosing the right monitor was most important.

