
You thought GTX 1060 was confusing? You've seen nothing. Enter RTX 2060 and 6 variants of the same card

Bouzoo
Just now, mr moose said:

Humor aside, without knowing exactly what they are charging, it seems illogical to assume they would choose to sell the 4GB for less than what they could just because they aren't offering a 3GB.

No, it's totally logical, you've said it yourself: THE MARKET. Zero difference, the market doesn't care if it's 3GB or 4GB unless both exist at the same time.


1 minute ago, leadeater said:

No, it's totally logical, you've said it yourself: THE MARKET. Zero difference, the market doesn't care if it's 3GB or 4GB unless both exist at the same time.

It's logical to assume Nvidia would sell a GPU for less than what the market will pay? 

 

 

 



5 minutes ago, mr moose said:

It's logical to assume Nvidia would sell a GPU for less than what the market will pay? 

The market will only pay so much for the lowest configuration on offer (edit: at release), be it 1GB, 2GB, 20GB, or 20000GB, so long as it's the smallest one within the same model, i.e. GTX 1060. Had the GTX 1060 been 4GB rather than 3GB, the price would not have been different; the market price would have been the same.

 

I can make plenty of arguments for why the 3GB and 4GB are good for Nvidia and the AIBs; however, I cannot for the consumer.


1 minute ago, leadeater said:

The market will only pay so much for the lowest configuration on offer, be it 1GB, 2GB, 20GB, or 20000GB, so long as it's the smallest one within the same model, i.e. GTX 1060. Had the GTX 1060 been 4GB rather than 3GB, the price would not have been different; the market price would have been the same.

 

I can make plenty of arguments for why the 3GB and 4GB are good for Nvidia and the AIBs; however, I cannot for the consumer.

 

I just don't see any evidence. In my mind, fewer options are better for Nvidia because you don't have a choice: it's the expensive or the really expensive. Which is why I have another AMD GPU in my system (that, and I hate Nvidia software, but not enough to matter if the GPU is actually better and cheaper).



42 minutes ago, mr moose said:

I just don't see any evidence. In my mind, fewer options are better for Nvidia because you don't have a choice: it's the expensive or the really expensive. Which is why I have another AMD GPU in my system (that, and I hate Nvidia software, but not enough to matter if the GPU is actually better and cheaper).

Well, explaining it relies on agreeing that the lowest price point of the RTX 2060 would be the same whether it were 3GB or 4GB. I'll just run with that for now.

 

The x60 is traditionally the highest-volume product Nvidia and their AIBs sell, so they actually have enough room to offer different variants without being cost-burdened by them. Where it benefits the AIBs, and probably Nvidia, is this: working from the lowest price point being the same whether the card were 3GB or 4GB, releasing both variants onto the market at the same time makes it easier for the AIB to get a much better margin on the 4GB card. A 1GB (8Gb) GDDR5X memory module costs Nvidia and the AIBs around $8-12; applying a higher-than-reality 10% markup, the roughly $13 of increased cost (assuming everything else about the card is the same) means that if you price the 4GB card at a market-reasonable $20-$25 more, you make significantly more on the 4GB models than on the 3GB, or than if there were only a 4GB model priced lower.
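A back-of-envelope sketch of that arithmetic (the $8-12 module cost and the deliberately generous 10% markup are the figures assumed above; the rest is illustrative):

```python
# Rough margin math for the 3GB vs 4GB variants, using the post's figures.

module_cost = 12.0                       # upper end of the 1GB (8Gb) GDDR5X module cost, USD
markup = 0.10                            # deliberately generous component markup
extra_cost = module_cost * (1 + markup)  # added cost of the 4th gigabyte, ~$13

# Market-reasonable price uplift for the 4GB card, per the post.
for premium in (20.0, 25.0):
    print(f"${premium:.0f} premium -> ${premium - extra_cost:.2f} extra margin per card")
```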

 

Even if we go with the 4GB model costing more in the absence of a 3GB model, it would still need to be priced competitively, likely more so than if there were a 3GB model, so you would be getting a lower margin.

 

By releasing at the same time you avoid price-competing with your existing products, or having to find ways to cut product cost to bring a cheaper product to market later. Bring them all to market now, start with a higher margin, and then lower the margin to compete, but only as required. I can see a situation where the 3GB cards get retired from sale and the 4GB cards are lowered in price, depending on what AMD is able to bring to market.

 

It's the same for the GDDR5X and GDDR6 variants: releasing them at the same time makes it far easier to create a workable pricing structure than trying to do it later. I can definitely see a situation where certain models of the RTX 2060 have much better margins than others, and those will be the most available options, not necessarily the best/higher hardware configurations.

 

Get the product availability right and I can see the RTX 2060 generation being rather profitable for the AIBs. I'm not really that much against that, as they operate on extremely tight margins, which has led more of them to diversify into higher-margin computer accessories.

 

However, I still maintain that 6 variants of the RTX 2060 are unnecessary, that more options are not automatically good, and that these extra options don't result in a lower minimum price.

 

I hope I explained that well enough; my brain isn't exactly firing on all cylinders atm. 


7 minutes ago, leadeater said:

By releasing at the same time you avoid price-competing with your existing products, or having to find ways to cut product cost to bring a cheaper product to market later. Bring them all to market now, start with a higher margin, and then lower the margin to compete, but only as required. I can see a situation where the 3GB cards get retired from sale and the 4GB cards are lowered in price, depending on what AMD is able to bring to market.

That's all good and well until you take "competing with themselves" out of the picture (because no company releases products that reduce its own revenue). Also, as I said before, that only works if you assume Nvidia are starting their prices at rock bottom and want to keep all their GPUs as low as possible.

 

On top of all that, it's also just as likely that they are offering a 4GB model because there is a market for one. As you say, there is a huge market there in the middle, and people are picky. AMD are already offering some competitive products just under the 1060, and if AMD introduce all-new better stuff sooner rather than later (and I suspect they might, given Nvidia released all three top-tier 20s in the first go, then the Titan not long after, and we are already seeing the x60s in 6 variants off the bat to boot), then Nvidia might just be preparing to flood the market with options, because even though they have gained considerable market share in the last 12 months, they are not immune to an AMD swing if AMD pull off a Ryzen-esque GPU release.

 

Either way I look at it, I see the benefits for consumers. I'm not forced to buy anything, so with options comes choice.

 

 



2 minutes ago, mr moose said:

Also, as I said before, that only works if you assume Nvidia are starting their prices at rock bottom and want to keep all their GPUs as low as possible.

Not sure about Nvidia, but the AIBs don't have a lot of room to work with. Nvidia could keep the cost of the GPU die high enough to allow for a decent price drop later; I know more about what the AIBs have to work with than what it actually costs Nvidia and how they can price their parts. The AIBs don't exactly have a lot of sway to get GPU die prices down, so I can only see that coming from strong competition that would necessitate it. Generally the AIBs just have to eat cost increases in other areas, or increase the price, rather than being able to negotiate better prices from Nvidia or AMD.

 

We have some AIBs claiming to be getting only 4% margin on certain models of cards; you can't realistically do much with that. Any chance I got to bring a product to market with a significantly better margin than 4%, I would take it. I do think 4% is very much the low point, though; based on the commentary around it, I don't see more than 10% being common.
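To put those percentages in absolute terms, a minimal sketch; the card price here is a hypothetical figure, not one from the thread:

```python
# Absolute dollars per card at a 4% vs 10% margin.
# The $350 price is purely illustrative.
card_price = 350.0
for margin in (0.04, 0.10):
    print(f"{margin:.0%} margin on a ${card_price:.0f} card -> ${card_price * margin:.2f} per card")
```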


1 minute ago, leadeater said:

We have some AIBs claiming to be getting only 4% margin on certain models of cards; you can't realistically do much with that.

Margins that low wouldn't surprise me. Nvidia didn't get as big as they are by charging the minimum for their chips.



2 minutes ago, mr moose said:

Margins that low wouldn't surprise me. Nvidia didn't get as big as they are by charging the minimum for their chips.

I do also wonder if Gigabyte is just being kinda greedy and using all the possibilities; maybe other AIBs will only go with the few that make the most sense?


12 minutes ago, leadeater said:

I do also wonder if Gigabyte is just being kinda greedy and using all the possibilities; maybe other AIBs will only go with the few that make the most sense?

 

Nah, Asus and MSI will have all variants too.



Random thought: is the memory controller in the GPU backward/forward compatible across the options presented? My thinking is, Nvidia could potentially cover the whole of Gigabyte's SKU list with as little as one GPU SKU. This does require two assumptions: 1, they're not altering the core counts between them like with the 1060, and 2, that the RAM is backward/forward compatible from a single controller. Unless there is a major incompatibility, I'd imagine they'd try to make the memory controller accept both types of RAM to keep options open according to market circumstances. RAM quantity could be modified easily by changing chip capacity/quantity.
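For what it's worth, the six variants are just the cross-product of two memory types and three capacities, which is what makes the one-die-covers-all idea plausible; a trivial sketch:

```python
from itertools import product

# The six retail SKUs fall out of two independent axes.
memory_types = ["GDDR5X", "GDDR6"]
capacities_gb = [3, 4, 6]

for mem, cap in product(memory_types, capacities_gb):
    print(f"RTX 2060 {mem} {cap}GB")
```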



On 12/26/2018 at 10:31 AM, Bouzoo said:

Edit: So many reactions. Am I an influencer now? 

Yes, start a YouTube channel and sell your merch, it will sell like a god church... (kill me)


On 12/26/2018 at 10:18 AM, NunoLava1998 said:

Here are all the RTX 2060 models (this is assuming every part has a Max-Q (laptop) version and I might have gotten some things incorrect):

  • RTX 2060 GDDR5X 3GB
  • RTX 2060 GDDR5X 4GB
  • RTX 2060 GDDR5X 6GB
  • RTX 2060 GDDR6 3GB
  • RTX 2060 GDDR6 4GB
  • RTX 2060 GDDR6 6GB

 

So, let's assume they had done the opposite and released only one version.

 

  • RTX 2060 GDDR5X 3GB

The Internet: "OMG, only 3gb is not enough in 2019! Also not even GDDR6!"

  • RTX 2060 GDDR5X 4GB

The Internet: "OMG, not even GDDR6!"

  • RTX 2060 GDDR5X 6GB

The Internet: "OMG, not even GDDR6 and it won't be able to actually push 6gb anyways. Could have been cheaper!"

  • RTX 2060 GDDR6 3GB

The Internet: "OMG, only 3gb is not enough in 2019! Also, GDDR6 is hella expensive, should have used GDDR5x!"

  • RTX 2060 GDDR6 4GB

The Internet: "OMG, GDDR6 is hella expensive! Should have used GDDR5x!"

  • RTX 2060 GDDR6 6GB

The Internet: "OMG, GDDR6 is hella expensive! Also it won't be fast enough to push those 6gb anyways. Could have been so much cheaper!"

 

Also the Internet: "OMG, enough choices so all the common complaints don't fit. IT IS CONFUSING!"

 

On a more serious note:

Vacuum cleaners are confusing. They have plenty of versions, but you can't look up a benchmark to know which is faster. This holds true for almost everything that is not a benchmarkable tech item. For GPUs you actually can do that. It takes all of 2 minutes, if you read some text as well. Less if you just look at the bars.

I seriously don't get how Nvidia could win here, given the internet bashes everything they could have done. Going with all the possibilities for an easily comparable item is not exactly the worst idea, no?


7 minutes ago, porina said:

Random thought: is the memory controller in the GPU backward/forward compatible across the options presented? My thinking is, Nvidia could potentially cover the whole of Gigabyte's SKU list with as little as one GPU SKU. This does require two assumptions: 1, they're not altering the core counts between them like with the 1060, and 2, that the RAM is backward/forward compatible from a single controller. Unless there is a major incompatibility, I'd imagine they'd try to make the memory controller accept both types of RAM to keep options open according to market circumstances. RAM quantity could be modified easily by changing chip capacity/quantity.

Probably not?

 

Quote

GDDR6 has a new “packetized” command address (CA) bus. Command and address are combined into a single, 10-bit interface, operating at double data rate to CK. This eliminates chip select, address strobe, and write enable signals and minimizes the required CA pin count to 12 per channel (or 16 in pseudo-channel mode). The elimination of a CS aligns with the point-to-point nature of GDDR memory and reinforces the requirement that there is only a single (logical) device per memory interface (single DRAM or two DRAM back-to-back in byte mode, operating as a single addressable memory).

 


4 minutes ago, leadeater said:

Probably not?

Ok, that would make it more difficult, but not impossible. Either they'd have to have a different set of pinouts with both present, or multiplex some common lines depending on which is in use. At some point of complexity, they'll just go and make two variations.



7 minutes ago, Tech Enthusiast said:

but you can't look up a benchmark to know which is faster.

I would hope all 6 are equally fast, otherwise "RTX 2060" would lose a bit of its meaning.


Just now, leadeater said:

I would hope all 6 are equally fast, otherwise "RTX 2060" would lose a bit of its meaning.

Well, of course they won't be.

At least if GDDR6 is of any use! And unless more than 3GB of VRAM doesn't help in games.

 

On the bright side: If neither is of any use for FPS, we can just buy the cheapest version, since we get that option too.

Easily comparable options are a good thing for the consumer. No idea where the hate comes from.


1 minute ago, porina said:

Ok, that would make it more difficult, but not impossible. Either they'd have to have a different set of pinouts for both present, or multiplex some common lines when one or other is in use. At some point of complexity, they'll just go, make two variations.

Some of the changes in GDDR6 were introduced with GDDR5X as well, so I could see it being possible. Were GDDR5X controllers cross-compatible with GDDR5?


17 minutes ago, Tech Enthusiast said:

Well, of course they won't be.

At least if GDDR6 is of any use!

GDDR6 shouldn't actually do anything unless the GPU needs more memory bandwidth than the GDDR5X models can supply; GDDR5X was fast enough for the 1080/1080 Ti/Titan Xp, so I can't realistically see it being too slow for an RTX 2060.

 

Maybe at 1440p and higher resolutions it might give a few more FPS, or it might give better 0.1% and 1% lows if it has much better latency. Memory bandwidth is one of those things where if you need more of it then you need more, but if you don't, increasing it won't increase performance.

 

That's why I hope the performance of all 6 is for the most part the same; otherwise, the ones with a measurable difference should really come under a different model name to denote that performance difference. Edit: by that I mean across all resolutions, not just faster in select cases.


On 12/26/2018 at 10:17 AM, MysteriousAeon said:

I've been using an R9 390 for 3 years and it was issue after issue. Just switched to a GTX 1080 after the 390 died and I'm very happy with it. It's worth trying out if you have the money, but I would stay away from AMD GPUs if you're even just a little bit serious about gaming/streaming/stuff that needs a good PC. I don't know how their CPUs are.

Honestly, 99% of their GPUs are just fine. You likely just stumbled upon a bad card.

 

As for AMD's CPUs, they're pretty much all good, if not great. It's the cheap A320 motherboards that people need to look out for.


17 minutes ago, leadeater said:

GDDR6 shouldn't actually do anything unless the GPU needs more memory bandwidth than the GDDR5X models can supply; GDDR5X was fast enough for the 1080/1080 Ti/Titan Xp, so I can't realistically see it being too slow for an RTX 2060.

 

I just grabbed numbers from Wikipedia listings, and below is the ratio of RAM bandwidth (GB/s) to rated boost SP TFLOPS for x70 and higher Pascal and Turing cards. Higher is "better".

2070       60.0
2080       44.5
2080 Ti    45.8
Titan RTX  41.2

1070       39.6
1070 Ti    31.3
1080       36.1
1080 Ti    42.7
Titan Xp   45.1

 

Overall, I think it can be said that Turing has proportionately more bandwidth than Pascal. As a general trend, lower cards typically have a higher ratio than higher cards, so I can't imagine a 2060 being bandwidth-starved for gaming uses anyway. It might get a bit more questionable for the 4GB cards, if we assume they're running at 2/3 the bandwidth of the 3/6GB variants. Has anyone done a study on video RAM bandwidth impacts on gaming performance?
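A quick sketch of how ratios like these are computed; the spec figures below are my own reconstruction from public listings and may differ slightly from the numbers used above:

```python
# Bandwidth (GB/s) divided by rated boost SP TFLOPS, per card.
# Spec figures are approximate readings from public listings.
specs = {
    "RTX 2070":    (448, 7.46),
    "RTX 2080":    (448, 10.07),
    "RTX 2080 Ti": (616, 13.45),
    "GTX 1070":    (256, 6.46),
    "GTX 1080":    (320, 8.87),
    "GTX 1080 Ti": (484, 11.34),
}

for card, (bandwidth, tflops) in specs.items():
    print(f"{card:12s} {bandwidth / tflops:5.1f}")
```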

 

Part of the reason I got a 2070, apart from price, was that it had a relatively high ratio. I hadn't thought of this before, but elsewhere, in compute use cases, the 2070 isn't significantly behind the 2080 or 2080 Ti. Presumably that use case isn't entirely RAM-bandwidth-limited but is in that transition zone, and the higher cards are unable to make use of their potential. And they all blow away comparable Pascal cards; my 2070 is comparable to my 1080 Ti at a much lower power consumption.


52 minutes ago, leadeater said:

Some of the changes in GDDR6 were introduced with GDDR5X as well, so I could see it being possible. Were GDDR5X controllers cross-compatible with GDDR5?

GDDR5X is QDR, so a GDDR5X controller can probably do GDDR5, but not the other way around.


11 minutes ago, porina said:

Has anyone done a study on video RAM bandwidth impacts on gaming performance?

There was something ages ago, but I can't even remember which cards were being looked at; plus, it was more just looking at cards of the same model that had different bus-width variants.

 

Can you downclock GPU memory by, say, 50% and still have a stable card?

 

13 minutes ago, porina said:

Part of the reason I got a 2070, apart from price, was that it had a relatively high ratio. I hadn't thought of this before, but elsewhere, in compute use cases, the 2070 isn't significantly behind the 2080 or 2080 Ti. Presumably that use case isn't entirely RAM-bandwidth-limited but is in that transition zone, and the higher cards are unable to make use of their potential. And they all blow away comparable Pascal cards; my 2070 is comparable to my 1080 Ti at a much lower power consumption.

That would make sense, considering the compute cards feature HBM and much higher bandwidths. GDDR6 brings the bandwidth up to Titan V levels, but the full HBM configs are still 200+ GB/s more.

