Modded GeForce RTX 3070 with 16GB memory gets major 1% lows FPS boost

On 4/24/2023 at 4:02 PM, Zodiark1593 said:

The point I was trying to get across was that knowing what occurs in the background, and actually seeing the impact in practice so acutely, are somewhat different things. It appears to be a particular pain point with some more recent games, as it means that, owing to more VRAM, certain less powerful cards will end up actually performing better than their supposedly higher-end counterparts. Generally, demand increases over time have been more balanced, so by the time you actually need more VRAM, the GPU itself probably has its hands full as well. With the specific games in question, however, this is not the case.

 

VRAM requirements have increased disproportionately compared to the need for raw compute. In the specific situation posed, if games move to such high VRAM requirements, we could have a very strange case in which higher-end cards actually age substantially worse, and perform worse, than their lower-end counterparts. Regardless of the reasons, it's kind of bonkers if it plays out like this.

More like most recent (AAA) games tbh. 


2 hours ago, SteveGrabowski0 said:

3060 12GB was spanking the 3070 8GB at 1440p with RT almost two years ago with Doom Eternal RTX

 

https://gamegpu.com/action-/-fps-/-tps/doom-eternal-test-rtx

Tbh, this kind of brings to mind one of Steve's quotes: "A Waste of Sand."

My eyes see the past…

My camera lens sees the present…


Pointless bickering aside, I'm interested to see whether a usable version of this mod will materialize.

When I got my 3070 I knew its VRAM would become a limitation sooner rather than later, and that the 3060 I got thereafter would presumably end up being the better fit for VR use.

My 3070 happens to be the same as the one used by the modder, a Palit GamingPro non-OC model. I'd be lying if I said I'm not keen on getting ahold of the memory chips used to try out the mod myself...


19 minutes ago, ShirtyGamer said:

I'm interested to see whether a usable version of this mod will materialize.

What do you mean? This is the usable version: desolder the existing 8Gbit modules, replace them with 16Gbit ones, and change a single resistor so the GPU recognizes the new capacity.
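For anyone checking the arithmetic, the capacity change is simple chip math; a quick sketch (chip count and densities as discussed in this thread):

```python
# VRAM capacity from module count and per-chip density.
GBIT_PER_GB = 8

def total_vram_gb(num_chips: int, density_gbit: int) -> float:
    """Total VRAM in GB for num_chips modules of density_gbit each."""
    return num_chips * density_gbit / GBIT_PER_GB

print(total_vram_gb(8, 8))   # stock 3070: 8 x 8 Gbit  -> 8.0 GB
print(total_vram_gb(8, 16))  # modded:     8 x 16 Gbit -> 16.0 GB
```

Same eight module footprints, double the density per package, hence no PCB changes beyond the strap resistor.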

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


21 minutes ago, igormp said:

What do you mean? This is the usable version: desolder the existing 8Gbit modules, replace them with 16Gbit ones, and change a single resistor so the GPU recognizes the new capacity.

One wonders how cost-effective the mod is.

Usually when I quote prices, I'm quoting minimum orders of 1,200 chips. I'm not sure where you can go out and buy just eight trustworthy 16Gbit GDDR6 chips.

Sure, you can do this particular mod at home with a hot air gun and a stencil, but you're looking at maybe 50 dollars per chip, or perhaps a pack of 10 for 300 USD if a seller lets you buy small enough that you can afford a couple of mistakes and it's still half worth it to them. It's a hella expensive mod.

The resistors are the easy part.

I have a hard time saying a 300-dollar mod for a 3070 is usable.

Ayo, who wants to spend 35k to make 3k by selling 124 mod kits at 310 USD a pop? Though that doesn't include the stencil in the mod kit.
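The break-even arithmetic above can be sketched quickly (all figures are the rough ballpark numbers from this post, not real quotes):

```python
# Rough break-even check for a hypothetical mod-kit run.
# All figures are the poster's ballpark numbers, not real quotes.
MIN_ORDER_CHIPS = 1200
ORDER_COST_USD = 35_000
CHIPS_PER_KIT = 8
KIT_PRICE_USD = 310

max_kits = MIN_ORDER_CHIPS // CHIPS_PER_KIT  # 150 kits from one order
kits_sold = 124                              # the example above
revenue = kits_sold * KIT_PRICE_USD          # 38,440 USD
profit = revenue - ORDER_COST_USD            # ~3.4k USD before any other costs
print(max_kits, revenue, profit)
```

So even selling most of the minimum order barely clears a few thousand dollars, which is the point: the margins make a kit business unattractive.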


1 hour ago, starsmine said:

Usually when I quote prices, I'm quoting minimum orders of 1,200 chips. I'm not sure where you can go out and buy just eight trustworthy 16Gbit GDDR6 chips.

Trustworthy? Mouser; I found them for ~$30 per unit.

Wanna take your risks? They're on AliExpress for even cheaper.

1 hour ago, starsmine said:

It's a hella expensive mod.

Yup, it indeed is, especially for something like a 3070.

For a 3090 (Ti), on the other hand, where the comparison is against another 3090 or even an RTX A5500/A6000, the cost starts to sound way more reasonable.



With AI becoming ever more needed in datacenters, why don't Nvidia, AMD, and Intel agree to just standardize on HBM, and lots of it? Economies of scale and all that.


37 minutes ago, StDragon said:

With AI becoming ever more needed in datacenters, why don't Nvidia, AMD, and Intel agree to just standardize on HBM, and lots of it? Economies of scale and all that.

They sort of have, just only on datacenter-focused products.

 

Nvidia A30/A100/H100 HBM2

AMD MI8/MI25/MI50/MI60/MI100/MI210/MI250/MI250X HBM2/HBM2e

 

Intel will also do HBM for their equivalents; they have HBM options on Xeons currently.

 

I think it mostly comes down to the enormous memory controllers required; it might not be economical to put HBM on smaller dies. I'm interested to see where AMD takes their GPU MCD design idea and whether there will be HBM variants, maybe die-stacked on the MCD. I think MI300 is this, but details are few right now.


On 4/24/2023 at 9:02 PM, igormp said:

I can make use of the full 48gb of two 3090s in my applications (ML-related stuff), and don't need SLI at all (however I did look into a nvlink bridge since it'd net a ~10% increase in perf).

 

If you just want the memory for larger models, you can find old datacenter GPUs going pretty cheap (<1000 USD) with a lot more memory. Heck, even P40s and K80s are going for sub-300 dollars, which is absolutely wild imo. You could conceivably save money by using one of these instead of chucking stuff onto an EC2/GCE instance.


The sad thing about all of this is that, as time went on, we were finally starting to get decently capable 1440p/4K mid-range cards, and now they're struggling even at 1080p in new games due to lack of VRAM.

And to top that off... even the higher-end cards that could easily drive those resolutions at decent settings run into VRAM issues. It's just sad to have a capable card that's being held back by memory capacity.

 

For years the issue was "my card is not fast enough, maybe I should upgrade", but now it's "my card IS fast enough but I got screwed over on VRAM amount, F you GPU corp!" It's almost the same thing as not being able to print a black and white document because you're out of magenta.


10 minutes ago, WereCat said:

It's almost the same thing as not being able to print a black and white document because you're out of magenta.

 

Oh damn, don't get me started on that.


1 hour ago, WolframaticAlpha said:

If you just want the memory for larger models, you can find old datacenter GPUs going pretty cheap (<1000 USD) with a lot more memory. Heck, even P40s and K80s are going for sub-300 dollars, which is absolutely wild imo. You could conceivably save money by using one of these instead of chucking stuff onto an EC2/GCE instance.

The problem is that their performance is awful and they don't support fp16/fp8/int8/int4, so models take up far more memory at the precisions they can run, which makes the extra VRAM moot.

 

An EC2/GCE instance actually ends up cheaper depending on how long you plan on using it; I did exactly that for quite a while during the mining craze, after I sold my previous GPU.
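To illustrate why precision support matters for the VRAM argument, here is a rough sketch of weight memory for a hypothetical 13B-parameter model (sizes are illustrative, counting weights only and ignoring activations and overhead):

```python
# Weight memory for a hypothetical 13B-parameter model at different
# precisions (weights only; activations and overhead ignored).
def weights_gb(params_billion: float, bytes_per_param: float) -> float:
    return params_billion * 1e9 * bytes_per_param / 1024**3

for name, bytes_pp in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{name}: {weights_gb(13, bytes_pp):.1f} GB")
```

A card stuck at fp32 needs roughly twice the VRAM of one with working fp16 (and 4-8x that of int8/int4) for the same model, which is the point being made above.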



10 hours ago, leadeater said:

I think it mostly comes down to the enormous memory controllers required; it might not be economical to put HBM on smaller dies. I'm interested to see where AMD takes their GPU MCD design idea and whether there will be HBM variants, maybe die-stacked on the MCD. I think MI300 is this, but details are few right now.

The actual DRAM control logic is in the HBM stack (the base layer), so the interfacing overhead between HBM and the main die is very small compared to GDDR. Economies of scale are not a simple solution for a tech as complicated as HBM interposer integration; there are non-linear dependencies in the manufacturing process that will keep it in higher-tier product lines for now.


1 hour ago, DuckDodgers said:

The actual DRAM control logic is in the HBM stack (the base layer), so the interfacing overhead between HBM and the main die is very small compared to GDDR. Economies of scale are not a simple solution for a tech as complicated as HBM interposer integration; there are non-linear dependencies in the manufacturing process that will keep it in higher-tier product lines for now.

I think you need to re-check die images and labelling. A 5120/6144-bit memory bus for HBM is enormous; so is a 384-bit G6/G6X bus, but the HBM one is larger. Memory buses aren't nearly as dense as the logic portions of the die, so they end up taking up a lot of space.

 

[annotated die shots comparing HBM and G6/G6X memory PHY area]

 

HBM interfaces are a little more area-efficient than G6/G6X, as you can see here, but it's still quite a lot more total die area being used. Top that off with one of those HBM PHYs being there purely for binning/redundancy, so only 5 of the 6 are actually used.

 

Interposer and die connectivity isn't really that bad today; there are low-cost products doing this. But HBM is still expensive and comparatively low volume, and you have to (or should) put more of it on the product for defect redundancy, unlike GDDR, where defective modules can simply be replaced. Both of those together would make it a hard sell, but I still think it's largely down to the die-area requirements of HBM. HBM also doesn't scale down well in stack count for bandwidth, which makes it a poor choice for mid-range products. For a x70-class product with 2 active stacks (3 actual, or 4), 16GB of RAM, and ~620 GB/s, I tend to think the die would be at least 50mm² larger; that's 20% larger, and thus a greater-than-20% increase in cost just for this die.
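The "greater than 20% increase in cost" claim follows from yield scaling; a back-of-envelope sketch using a simple exponential (Poisson) yield model, with an illustrative defect density rather than any real foundry figure:

```python
import math

# Cost per good die under a simple Poisson yield model.
# DEFECT_DENSITY is an illustrative assumption, not a foundry figure.
DEFECT_DENSITY = 0.001  # defects per mm^2

def cost_per_good_die(area_mm2: float) -> float:
    yield_frac = math.exp(-DEFECT_DENSITY * area_mm2)  # fraction of good dies
    return area_mm2 / yield_frac  # arbitrary units (wafer cost ~ area used)

base, bigger = 250.0, 300.0  # a die growing by 50 mm^2 is +20% area
ratio = cost_per_good_die(bigger) / cost_per_good_die(base)
print(round(ratio, 3))  # noticeably more than 1.20
```

The larger die both consumes more wafer area and yields worse, so cost grows faster than area, matching the greater-than-20% estimate.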


On 4/26/2023 at 7:36 PM, igormp said:

What do you mean? This is the usable version: desolder the existing 8Gbit modules, replace them with 16Gbit ones, and change a single resistor so the GPU recognizes the new capacity.

The modder states in the video that the card and/or driver invariably crashes at some point during use.

I'm talking about a mod that can be done and used as though nothing but the VRAM capacity has changed, as far as 'issues' are concerned.

 

 


7 minutes ago, ShirtyGamer said:

The modder states in the video that the card and/or driver invariably crashes at some point during use.

That's why they had to change the power management mode to Performance; after that it worked flawlessly, without crashes. Maybe you got this wrong due to the language barrier.



How much do these RAM chips go for anyway? I doubt they add much to the actual cost of the card.


31 minutes ago, Rodinski said:

How much do these RAM chips go for anyway? I doubt they add much to the actual cost of the card.

A lot. Just search Digi-Key for GDDR6 and check. Like I said here, the mod is not cheap. The RAM chips together are often the single most expensive part on the BOM, not the GPU die.

 


1 hour ago, Rodinski said:

How much do these RAM chips go for anyway? I doubt they add much to the actual cost of the card.

$33 per unit at Mouser, and you can buy single units instead of entire lots:

https://www.mouser.com/c/semiconductors/memory-ics/dram/?type=SGRAM - GDDR6

 

Finding those is not that hard if you know where to look. You can get even cheaper ones from China (at your own risk, ofc).



16 hours ago, igormp said:

That's why they had to change the power management mode to Performance; after that it worked flawlessly, without crashes. Maybe you got this wrong due to the language barrier.

Might be. I was using the closed captions and relying on what some of the articles covering it said; a few claimed the modder reverted to 8GB because of lingering stability issues they suspected came from the drivers more than from the mod itself.


1 hour ago, ShirtyGamer said:

Might be. I was using the closed captions and relying on what some of the articles covering it said; a few claimed the modder reverted to 8GB because of lingering stability issues they suspected came from the drivers more than from the mod itself.

Being realistic, extremely few people would actually do this mod. I could likely attempt it with a decent chance of success; I've done a lot of hobby electronics and soldering before, plus firmware updates/swaps on devices. But even then, I'd look at lowering in-game settings to reduce VRAM requirements before poking around inside my GPU to do something unsupported that could stop working correctly at any point due to a driver update, because you just never know.

 

I put this in the filing cabinet of "awesome but practicality wise not worth considering".


18 hours ago, ShirtyGamer said:

I'm talking about a mod that can be done and used as though nothing but the VRAM capacity has changed, as far as 'issues' are concerned.

Not to mention Nvidia could pull the plug at any moment and simply exclude 16GB variants from the drivers.

Flip the 3070, and with the modding costs you can get something nice from AMD (or maybe Intel with Battlemage in the future). A 16GB mod doesn't add resale value, it's more expensive than comparable GPUs with more VRAM, and any driver update might kill it out of the blue.

19 minutes ago, leadeater said:

I put this in the filing cabinet of "awesome but practicality wise not worth considering".

Exactly.


5 hours ago, ShirtyGamer said:

Might be. I was using the closed captions and relying on what some of the articles covering it said; a few claimed the modder reverted to 8GB because of lingering stability issues they suspected came from the drivers more than from the mod itself.

Nope, he hasn't said that at all. 



The same folks also upgraded a 3060 8GB to 12GB:

 

Added benefit of extra memory bandwidth (128-bit to 192-bit bus) on top of the extra 4GB.
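The bandwidth gain is straightforward to compute from bus width and per-pin data rate (15 Gbps GDDR6 assumed here, as on the 3060 12GB):

```python
# Peak memory bandwidth from bus width and per-pin data rate.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8  # bits -> bytes

print(bandwidth_gbs(128, 15))  # 128-bit bus: 240.0 GB/s
print(bandwidth_gbs(192, 15))  # 192-bit bus: 360.0 GB/s
```

So the wider bus alone is a 50% bandwidth uplift at the same memory speed, unlike the 3070 mod, which only changes capacity.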


