
The real difference between "FreeSync" and G-Sync

exyia

Businesses don't remain successful (and by that I mean they don't go belly up) by giving stuff away, especially to their main competitor. If we want to see GPUs get faster and cheaper, then AMD and Nvidia both need to be profitable and play the game. That means they won't do anything that doesn't net them a return, or invest in a technology they can't control.

OK, so having a module work for only one GPU manufacturer will be better than working with all of them.

Very sarcasm

CPU: Ryzen 2600 GPU: RX 6800 RAM: DDR4 3000MHz 4x8GB  MOBO: MSI B450-A PRO Display: 4K 120Hz with FreeSync Premium.


OK, so having a module work for only one GPU manufacturer will be better than working with all of them.

Very sarcasm

 

Depends on which side of the fence you're on. Sometimes you have to look at the whole picture instead of just standing in the consumer's boots, and things will make more sense.

Core - EVGA Classified 3 | i7 980X | 12GB Corsair Dominator GT | Lian Li P80 | Corsair 128GB Neutron GTX | 2 x WD 500GB VelociRaptor | Asus Xonar Xense | 2 x EVGA 590 | Enermax Platimax 1500


Water Cooling - Alphacool NexXxoS 360 Monsta | TFC 360 | Alphacool D5 Vario | Alphacool 250 tube res | EK Supreme HF Nickel Plexi | 2 x EK Nickel Plexi 590 WB | Aquaero 5 XT


OK, so having a module work for only one GPU manufacturer will be better than working with all of them.

Very sarcasm

Better is subjective; for some it is. As a consumer, if you want that module to exist at all, and to be a performance boost with the GPU you own, then you have to let that company make money from it; otherwise they won't be there to advance the GPU to the next level or introduce the next piece of innovative technology. It's really an all-or-nothing thing: if Nvidia doesn't make money from its R&D work, the lack of income will lead to stagnation in product development.

 

 

You only have to look at the history of AMD to see the impact of financial struggles on product development. In the last five years AMD has been losing money hand over fist; in fact, since the glory days of the Athlon 64 3200+ and 3500+ with the Winchester core, AMD has been on a downward financial spiral. The result is that their latest offerings trail their competitors' by up to nine months (a very long time in the tech world). In February Nvidia released the GK110; in October AMD released a core that could compete with it. It's not that they don't know how to produce a better product, they just don't have the resources to make it happen, and everyone has seen what Nvidia has done while their competition is handicapped: they have been sitting on their next-gen GPU design since who knows when. The 800 series could have been released already, except they have had no need to do it. Now imagine how slowly Nvidia would release new technology, and maybe even develop it, if they had no competition at all?

 

In short, all I am trying to say is that whatever product a company develops, it has to make money on it; otherwise we will all lose, not just the ones who can only afford a cheap monitor or GPU.

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean a lack of understanding.


Maybe it's my fanboyism, but every time I hear that NVIDEAHHHHHHHH sound before a game I want to just take my monitor and throw it.

That's just about how I feel when I see AMD, but then I have to remember that if it's made for AMD gaming, it just means I can max out the graphics, and then my anger goes away :)

Proc: Intel Core i9 9900K 5GHz OC MoBo: ASUS Maximus XI Formula Z390 RAM: Corsair Dominator Platinum 3200MHz 32GB  Vidcard: ASUS RTX 2080 ROG STRIX OC Sound: Creative Sound Blaster AE-5 SSD: Samsung 970 EVO 240GB  HDD: 4x WD Black 2TB PSU: Corsair HX850i Case: Corsair 760T Black Monitor: ASUS PG278QE 165Hz 1ms  Peripheral: Razer Huntsman Elite - DeathAdder V2 - Sabertooth Controller


That's just about how I feel when I see AMD, but then I have to remember that if it's made for AMD gaming, it just means I can max out the graphics, and then my anger goes away :)

I've actually never seen as many AMD splash screens as Nvidia ones. And they don't last 10 seconds, and you can skip them.

| Case: NZXT Tempest 210 | CPU: Intel Core i5 3570K @ 3.9 Ghz | GPU: ASUS ROG STRIX GTX 1070 | RAM: Crucial Ballistix Tactical 8GB |

| Mouse: Zowie FK1 | Monitor: Acer 21.5" | Keyboard: CoolerMaster Stealth w/ Brown Switches |

#KilledMyWife - #LinusButtPlug - #1080penis

 


I've actually never seen as many AMD splash screens as Nvidia ones. And they don't last 10 seconds, and you can skip them.

I know what you mean, but here are a few games I have that show the AMD splash screen: DmC, BioShock Infinite, Tomb Raider, Crysis 3, and Battlefield 4, and in all of those games I can absolutely max the settings out and have a smooth experience. It just seems that when AMD comes up as a splash screen, right off the mark I know the game was tamed down so it can run on AMD systems, whereas Nvidia games give you the hardcore, here's-everything eye candy, and I know I sometimes have to dial the settings back a little.

 

It's just my opinion and may not be true for others; it's just been that way for me and the systems I build.

Proc: Intel Core i9 9900K 5GHz OC MoBo: ASUS Maximus XI Formula Z390 RAM: Corsair Dominator Platinum 3200MHz 32GB  Vidcard: ASUS RTX 2080 ROG STRIX OC Sound: Creative Sound Blaster AE-5 SSD: Samsung 970 EVO 240GB  HDD: 4x WD Black 2TB PSU: Corsair HX850i Case: Corsair 760T Black Monitor: ASUS PG278QE 165Hz 1ms  Peripheral: Razer Huntsman Elite - DeathAdder V2 - Sabertooth Controller


Better is subjective; for some it is. As a consumer, if you want that module to exist at all, and to be a performance boost with the GPU you own, then you have to let that company make money from it; otherwise they won't be there to advance the GPU to the next level or introduce the next piece of innovative technology. It's really an all-or-nothing thing: if Nvidia doesn't make money from its R&D work, the lack of income will lead to stagnation in product development.

 

You only have to look at the history of AMD to see the impact of financial struggles on product development. In the last five years AMD has been losing money hand over fist; in fact, since the glory days of the Athlon 64 3200+ and 3500+ with the Winchester core, AMD has been on a downward financial spiral. The result is that their latest offerings trail their competitors' by up to nine months (a very long time in the tech world). In February Nvidia released the GK110; in October AMD released a core that could compete with it.

Other than that over-$1000 Titan, I haven't seen any gap in market offerings. I've seen an always-present Nvidia low-end "grap" (gap + crap). So... hahahaha. The only reason AMD has fewer sales is US consumers.

 

 

The 800 series could have been released already, except they have had no need to do it. Now imagine how slowly Nvidia would release new technology, and maybe even develop it, if they had no competition at all?

 

In short, all I am trying to say is that whatever product a company develops, it has to make money on it; otherwise we will all lose, not just the ones who can only afford a cheap monitor or GPU.

Lol, if Nvidia didn't have that much money, they would get off their lazy asses and make a difference by releasing that 800 series you believe is ready (just to go back to G-Sync limitations).

And with the exception of NA (mining), whatever is at the price point of an AMD card takes a big loss most of the time; it's not even an opinion anymore, unless you count Nvidia's cultivation of people's ignorant minds, which brings in most of its profits.

 

The only reason I got a GTX 670 is because it was $370 and performed better than a reference 680 at $500+. It also came close to the $450 7970, and I ignored the POWA of the HD 7950 at the time for some ignorant reason.

Long story short, the only reason I got a GTX 670 is because it owned at its price point, thanks mostly to MSI and not Nvidia, who had the beautiful record of two straight months of a defective control panel. I never experienced AMD problems; they were probably a thing of the past by the time I got into heavier gaming with my first real desktop PC, and I live in the present.

 

Seeing your contradiction, I've decided to claim that you're biased towards Nvidia. I checked whether you had a GTX Titan, and you have a GTX 570, so I don't see that offering being at all relevant to you.

I have seen nearly zero AMD-biased people compared to Nvidia fans, because they got the holy Titan, which blows their brains and lives, because most of them will never even get it!

Nvidia: such wow, such price, no better difference in your life.

 

-Doge

CPU: Ryzen 2600 GPU: RX 6800 RAM: DDR4 3000MHz 4x8GB  MOBO: MSI B450-A PRO Display: 4K 120Hz with FreeSync Premium.


I know what you mean, but here are a few games I have that show the AMD splash screen: DmC, BioShock Infinite, Tomb Raider, Crysis 3, and Battlefield 4, and in all of those games I can absolutely max the settings out and have a smooth experience. It just seems that when AMD comes up as a splash screen, right off the mark I know the game was tamed down so it can run on AMD systems, whereas Nvidia games give you the hardcore, here's-everything eye candy, and I know I sometimes have to dial the settings back a little.

 

It's just my opinion and may not be true for others; it's just been that way for me and the systems I build.

Or Nvidia games are just not optimized well...

 

Nvidia cards = the artist: people who game exclusively (and have deep pockets)

AMD cards = the businessman: a person who needs raw power and can enjoy some great games at a low price (and makes sound financial decisions)

| Case: NZXT Tempest 210 | CPU: Intel Core i5 3570K @ 3.9 Ghz | GPU: ASUS ROG STRIX GTX 1070 | RAM: Crucial Ballistix Tactical 8GB |

| Mouse: Zowie FK1 | Monitor: Acer 21.5" | Keyboard: CoolerMaster Stealth w/ Brown Switches |

#KilledMyWife - #LinusButtPlug - #1080penis

 


Or Nvidia games are just not optimized well...

 

Nvidia cards = the artist: people who game exclusively (and have deep pockets)

AMD cards = the businessman: a person who needs raw power and can enjoy some great games at a low price (and makes sound financial decisions)

 

 

 

Wow, that's just an ignorant thing to say. Raw power? As of now, the GTX 780 Ti is THE fastest graphics card on the market, and AMD has not, at this time, responded to it. That can't be disputed; it's just the truth. AMD and Nvidia both have graphics cards tailored to the professional market. Yes, oh boy, Nvidia does sell to the professional market, as does AMD, even to artists. As for your personal assumptions about sound financial decisions and businessmen? It's not even worth my breath to respond; you just seem blinded by your fanboyism. Sad, just sad. You have some growing up to do.


Wow, that's just an ignorant thing to say. Raw power? As of now, the GTX 780 Ti is THE fastest graphics card on the market, and AMD has not, at this time, responded to it. That can't be disputed; it's just the truth. AMD and Nvidia both have graphics cards tailored to the professional market. Yes, oh boy, Nvidia does sell to the professional market, as does AMD, even to artists. As for your personal assumptions about sound financial decisions and businessmen? It's not even worth my breath to respond; you just seem blinded by your fanboyism. Sad, just sad. You have some growing up to do.

Is the Hawaii chip used to its full power? No. Is the 780 Ti, and almost every other Nvidia card, overpriced? Yes. Are you saying you're not a fanboy? It takes a man to admit it, and a bigger man to accept facts no matter whose favor they're in.

| Case: NZXT Tempest 210 | CPU: Intel Core i5 3570K @ 3.9 Ghz | GPU: ASUS ROG STRIX GTX 1070 | RAM: Crucial Ballistix Tactical 8GB |

| Mouse: Zowie FK1 | Monitor: Acer 21.5" | Keyboard: CoolerMaster Stealth w/ Brown Switches |

#KilledMyWife - #LinusButtPlug - #1080penis

 


Is the Hawaii chip used to its full power? No. Is the 780 Ti, and almost every other Nvidia card, overpriced? Yes. Are you saying you're not a fanboy? It takes a man to admit it, and a bigger man to accept facts no matter whose favor they're in.

 

Considering I run both AMD and Nvidia: nah, I don't like fanboys. They don't base things on facts. See, I've only stated facts here, which has got you riled up. If you look in this very thread, I even corrected something I said in an edit; I left it in there and used the edit to correct myself. I am man enough to admit my mistakes. I'm MARRIED; I've got to admit my mistakes, or I wouldn't be married much longer. I've got nothing to lose by stating facts. BTW, where did you get the idea that the Hawaii chip isn't used to its fullest extent? I know there was a rumor going around that some guy did some strange math and some odd counting, but NOTHING was ever confirmed, and his assumption was iffy at best (a link is needed to back up your claim here, as I couldn't find anything myself).

 

Overpriced by your standards, maybe. See, that's the thing about arguing about price: it's subjective. You have to look at the bigger picture (which we all know you can't), not just the physical card. It's much more than the physical object. Alas, not everyone understands that, sadly. People throw this claim out all the time without taking into account support, reliability of hardware and software (i.e. drivers), extras (i.e. software), hardware extras, stuff like that.

 

Only thing I fanboy over, if you could put it that way, is my Porsche. I'll only buy them, but that's because they took good care of me when my 968 had issues right out of the dealer. By "took care of," you'd be shocked what they did. Ever since then, I've only owned Porsches. Can you blame me? Chevy was like pulling teeth when I had issues (airbag went off driving over a bridge, idled way too low, among other things). '94 'Vette, go figure, it had all kinds of problems. Funny thing: Chevy ended up selling it again without fixing it. I got a call several years ago since my name was listed as a previous owner.


Considering I run both AMD and Nvidia: nah, I don't like fanboys. They don't base things on facts. See, I've only stated facts here, which has got you riled up. If you look in this very thread, I even corrected something I said in an edit; I left it in there and used the edit to correct myself. I am man enough to admit my mistakes. I'm MARRIED; I've got to admit my mistakes, or I wouldn't be married much longer. I've got nothing to lose by stating facts. BTW, where did you get the idea that the Hawaii chip isn't used to its fullest extent? I know there was a rumor going around that some guy did some strange math and some odd counting, but NOTHING was ever confirmed, and his assumption was iffy at best (a link is needed to back up your claim here, as I couldn't find anything myself).

 

Overpriced by your standards, maybe. See, that's the thing about arguing about price: it's subjective. You have to look at the bigger picture (which we all know you can't), not just the physical card. It's much more than the physical object. Alas, not everyone understands that, sadly. People throw this claim out all the time without taking into account support, reliability of hardware and software (i.e. drivers), extras (i.e. software), hardware extras, stuff like that.

 

Only thing I fanboy over, if you could put it that way, is my Porsche. I'll only buy them, but that's because they took good care of me when my 968 had issues right out of the dealer. By "took care of," you'd be shocked what they did. Ever since then, I've only owned Porsches. Can you blame me? Chevy was like pulling teeth when I had issues (airbag went off driving over a bridge, idled way too low, among other things). '94 'Vette, go figure, it had all kinds of problems. Funny thing: Chevy ended up selling it again without fixing it. I got a call several years ago since my name was listed as a previous owner.

Can't find your system specs, so I can't be sure if this is true. Here's the link to our own forum thread, which has the source: http://linustechtips.com/main/topic/95643-r9-290x-does-not-utilize-a-fully-unlocked-hawaii-gpu/

 

Aren't account support and RMAs handled by the distributor, such as Newegg? Yes, AMD drivers are bad, and it's really sad; they could be so much more with better drivers. WAIT. IS THAT. A FAULT OF AMD POINTED OUT BY A FANBOY? ALERT THE NEWSPAPERS!

 

Good for Porsche, happy customers = happy business.

| Case: NZXT Tempest 210 | CPU: Intel Core i5 3570K @ 3.9 Ghz | GPU: ASUS ROG STRIX GTX 1070 | RAM: Crucial Ballistix Tactical 8GB |

| Mouse: Zowie FK1 | Monitor: Acer 21.5" | Keyboard: CoolerMaster Stealth w/ Brown Switches |

#KilledMyWife - #LinusButtPlug - #1080penis

 


Can't find your system specs, so I can't be sure if this is true. Here's the link to our own forum thread, which has the source: http://linustechtips.com/main/topic/95643-r9-290x-does-not-utilize-a-fully-unlocked-hawaii-gpu/

 

Aren't account support and RMAs handled by the distributor, such as Newegg? Yes, AMD drivers are bad, and it's really sad; they could be so much more with better drivers. WAIT. IS THAT. A FAULT OF AMD POINTED OUT BY A FANBOY? ALERT THE NEWSPAPERS!

 

Good for Porsche, happy customers = happy business.

 

Actually, that is an objective opinion, having dealt with driver issues on my HTPC (the one running AMD hardware), having to deal with them on friends' PCs, and listening to my brother-in-law (who works for a game company here) bitch about AMD and the issues he's had, personal and professional. You can call me a fanboy if you want, that's fine, but I've only stated known facts. I can't help it; like I said, it's not what you like or want to hear.

 

But I guess you saw the title and didn't read @kingduqc's post here, nor did you read the source article here to understand what is going on. You took the topic as fact without doing much research. To sum it up, it's all theoretical with NO facts to back it up: the chip wasn't opened up at all, no real measurements were taken, vague percentages were given for certain things, and vague rationale was used to estimate size with no measurements. Research would help you understand that this just hasn't been proven true. Don't state it like it's fact, basically. OMG, look at that, facts rule again. Seriously, do you ever research at all?


I know what you mean, but here are a few games I have that show the AMD splash screen: DmC, BioShock Infinite, Tomb Raider, Crysis 3, and Battlefield 4, and in all of those games I can absolutely max the settings out and have a smooth experience. It just seems that when AMD comes up as a splash screen, right off the mark I know the game was tamed down so it can run on AMD systems, whereas Nvidia games give you the hardcore, here's-everything eye candy, and I know I sometimes have to dial the settings back a little.

 

It's just my opinion and may not be true for others; it's just been that way for me and the systems I build.

Sorry, but the only reason Nvidia TWIMTBP games are hard to max out is that PhysX kills your framerate. TressFX does the same thing, and I'm pretty sure any AMD GPU will beat its Nvidia counterpart in Tomb Raider (2013) benchmarks, because AMD has far superior OpenCL performance.

 

Wow, that's just an ignorant thing to say. Raw power? As of now, the GTX 780 Ti is THE fastest graphics card on the market, and AMD has not, at this time, responded to it. That can't be disputed; it's just the truth. AMD and Nvidia both have graphics cards tailored to the professional market. Yes, oh boy, Nvidia does sell to the professional market, as does AMD, even to artists. As for your personal assumptions about sound financial decisions and businessmen? It's not even worth my breath to respond; you just seem blinded by your fanboyism. Sad, just sad. You have some growing up to do.

You do realise raw power refers to compute, as opposed to game frame rates, and in compute AMD has been on top since the 7970.

Console optimisations and how they will affect you | The difference between AMD cores and Intel cores | Memory bus size and how it affects your VRAM usage |
How much VRAM do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820K with Corsair H110 | 32GB DDR4 RAM @ 1600MHz | XFX Radeon R9 290 @ 1.2GHz | Corsair 600Q | Corsair TX650 | Probably too much Corsair, but meh, should have had a Corsair SSD and RAM | 1.3TB HDD space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball


You do realise raw power refers to compute, as opposed to game frame rates, and in compute AMD has been on top since the 7970.

 

Yes, in compute AMD has the edge, but when you look at the card as a gamer, who is better? When I mentioned raw power, I was referencing gaming; there, that would be Nvidia. I apologize though, I should have been a little clearer.

 

Edit: I thought "faster" would have clued it in, but reading it back, you're right, I probably wasn't as clear as I could have been.


Yes, in compute AMD has the edge, but when you look at the card as a gamer, who is better? When I mentioned raw power, I was referencing gaming; there, that would be Nvidia. I apologize though, I should have been a little clearer.

 

Edit: I thought "faster" would have clued it in, but reading it back, you're right, I probably wasn't as clear as I could have been.

RAW power is compute. There is no argument there; it's the truth. In terms of gaming, I agree Nvidia is ahead (except at higher resolutions :P ), and when I looked at it as a gamer when I bought my GPU (R9 290), it was £320 while the 780 was still hanging there at £500.

Console optimisations and how they will affect you | The difference between AMD cores and Intel cores | Memory bus size and how it affects your VRAM usage |
How much VRAM do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820K with Corsair H110 | 32GB DDR4 RAM @ 1600MHz | XFX Radeon R9 290 @ 1.2GHz | Corsair 600Q | Corsair TX650 | Probably too much Corsair, but meh, should have had a Corsair SSD and RAM | 1.3TB HDD space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball


Other than that over-$1000 Titan, I haven't seen any gap in market offerings. I've seen an always-present Nvidia low-end "grap" (gap + crap). So... hahahaha. The only reason AMD has fewer sales is US consumers.

 

 

Lol, if Nvidia didn't have that much money, they would get off their lazy asses and make a difference by releasing that 800 series you believe is ready (just to go back to G-Sync limitations).

And with the exception of NA (mining), whatever is at the price point of an AMD card takes a big loss most of the time; it's not even an opinion anymore, unless you count Nvidia's cultivation of people's ignorant minds, which brings in most of its profits.

 

The only reason I got a GTX 670 is because it was $370 and performed better than a reference 680 at $500+. It also came close to the $450 7970, and I ignored the POWA of the HD 7950 at the time for some ignorant reason.

Long story short, the only reason I got a GTX 670 is because it owned at its price point, thanks mostly to MSI and not Nvidia, who had the beautiful record of two straight months of a defective control panel. I never experienced AMD problems; they were probably a thing of the past by the time I got into heavier gaming with my first real desktop PC, and I live in the present.

 

Seeing your contradiction, I've decided to claim that you're biased towards Nvidia. I checked whether you had a GTX Titan, and you have a GTX 570, so I don't see that offering being at all relevant to you.

I have seen nearly zero AMD-biased people compared to Nvidia fans, because they got the holy Titan, which blows their brains and lives, because most of them will never even get it!

Nvidia: such wow, such price, no better difference in your life.

 

-Doge

 

 

You lost me; I couldn't really make sense of all that with regard to what I said. You have posted a lot of opinion and sensationalized what you "think" I think, but not much insight. I couldn't give a rat's arse who is the better company or who makes the absolute best GPU; I just see the correlation between product advancement and company profit. The fact is there was a product gap between Nvidia's top-end release and the time it took AMD to reach it, and that is what an underfinanced company's R&D looks like. That is why it is important that both companies remain profitable, so each company pushes the other to produce better products.

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean a lack of understanding.


I have seen nearly zero AMD-biased people compared to Nvidia fans, because they got the holy Titan, which blows their brains and lives, because most of them will never even get it!

 

You think fanboyism is a one-way street? Either that's indicative of your own fanboyism, or you're just choosing not to pay attention.


Sorry, but the only reason Nvidia TWIMTBP games are hard to max out is that PhysX kills your framerate. TressFX does the same thing, and I'm pretty sure any AMD GPU will beat its Nvidia counterpart in Tomb Raider (2013) benchmarks, because AMD has far superior OpenCL performance.

 

You do realise raw power refers to compute, as opposed to game frame rates, and in compute AMD has been on top since the 7970.

Well, it's all about who wants what in their setup: you want AMD and I want Nvidia. I just go with what works.

 

P.S. You get what you pay for...

Proc: Intel Core i9 9900K 5GHz OC MoBo: ASUS Maximus XI Formula Z390 RAM: Corsair Dominator Platinum 3200MHz 32GB  Vidcard: ASUS RTX 2080 ROG STRIX OC Sound: Creative Sound Blaster AE-5 SSD: Samsung 970 EVO 240GB  HDD: 4x WD Black 2TB PSU: Corsair HX850i Case: Corsair 760T Black Monitor: ASUS PG278QE 165Hz 1ms  Peripheral: Razer Huntsman Elite - DeathAdder V2 - Sabertooth Controller


Well, it's all about who wants what in their setup: you want AMD and I want Nvidia. I just go with what works.

 

P.S. You get what you pay for...

._. You can't completely disregard everything I just said when I addressed all your points; that's the same as giving up...

Console optimisations and how they will affect you | The difference between AMD cores and Intel cores | Memory bus size and how it affects your VRAM usage |
How much VRAM do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820K with Corsair H110 | 32GB DDR4 RAM @ 1600MHz | XFX Radeon R9 290 @ 1.2GHz | Corsair 600Q | Corsair TX650 | Probably too much Corsair, but meh, should have had a Corsair SSD and RAM | 1.3TB HDD space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball


You think fanboyism is a one-way street? Either that's indicative of your own fanboyism, or you're just choosing not to pay attention.

No, that is my experience. Paying attention doesn't mean you will find a 50-50 split, and that is what I have seen so far.

 

I stated what I have seen so far while paying attention, nothing else. Did your conclusions make you feel attacked? Pretty sure they did.

CPU: Ryzen 2600 GPU: RX 6800 RAM: DDR4 3000MHz 4x8GB  MOBO: MSI B450-A PRO Display: 4K 120Hz with FreeSync Premium.


You lost me; I couldn't really make sense of all that with regard to what I said. You have posted a lot of opinion and sensationalized what you "think" I think, but not much insight.

And so you forgot: "Lol, if Nvidia didn't have that much money, they would get off their lazy asses and make a difference by releasing that 800 series you believe is ready (just to go back to G-Sync limitations).

Going back to G-Sync limitations for you, because you're too lazy or blind. They don't need more money to innovate once you exclude the lazy-asses factor."

Not much insight, but enough to answer the question that started our discussion.

CPU: Ryzen 2600 GPU: RX 6800 RAM: DDR4 3000MHz 4x8GB  MOBO: MSI B450-A PRO Display: 4K 120Hz with FreeSync Premium.


And so you forgot: "Lol, if Nvidia didn't have that much money, they would get off their lazy asses and make a difference by releasing that 800 series you believe is ready (just to go back to G-Sync limitations).

Going back to G-Sync limitations for you, because you're too lazy or blind. They don't need more money to innovate once you exclude the lazy-asses factor."

Not much insight, but enough to answer the question that started our discussion.

Of course Nvidia has been lazy; they have had a period of no competition with their R&D well in front. Why would they risk losing income from current products by releasing a new product that has no market to defend? You had better go back and read my posts again, because as I said in my original statement, you need money in order to do R&D; no one said anything about having the money and not doing it. I also never said they need more money to innovate, only that they need money, which is vastly different from whatever it is you are trying to say about being lazy. I don't understand what you mean by G-Sync limitations; if you are referring to a financial limitation on the end user, then you don't seem to understand how corporate finance works and what keeps a business in the black.

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean a lack of understanding.


Of course Nvidia has been lazy; they have had a period of no competition with their R&D well in front. Why would they risk losing income from current products by releasing a new product that has no market to defend? You had better go back and read my posts again, because as I said in my original statement, you need money in order to do R&D; no one said anything about having the money and not doing it. I also never said they need more money to innovate, only that they need money, which is vastly different from whatever it is you are trying to say about being lazy. I don't understand what you mean by G-Sync limitations; if you are referring to a financial limitation on the end user, then you don't seem to understand how corporate finance works and what keeps a business in the black.

They don't need to limit G-Sync to their GPUs to get more innovation; that's all I said. Why do you attack the rest?

Let's just end it here, if you're okay with the above. I don't mean to attack or discuss what you said; I want to focus on the main subject we were on.

CPU: Ryzen 2600 GPU: RX 6800 RAM: DDR4 3000MHz 4x8GB  MOBO: MSI B450-A PRO Display: 4K 120Hz with FreeSync Premium.


They don't need to limit G-Sync to their GPUs to get more innovation; that's all I said. Why do you attack the rest?

Let's just end it here, if you're okay with the above. I don't mean to attack or discuss what you said; I want to focus on the main subject we were on.

All I meant is that Nvidia (as a company, like all others) needs to make money on each of its investments, and that includes G-Sync. If they release G-Sync as an open system, they risk losing profit and/or returns on investment, which has the ultimate effect of retarding product development and innovation.

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean a lack of understanding.

