RDNA2 VRAM Leaks: Go buy AMD cards if you want more than 10GB.

5 minutes ago, Blademaster91 said:

This is fine if you can always upgrade to the latest card, but not if you want to keep a GPU for at least 3 years.

The release cycle is roughly 2 years, so it wouldn't be a problem unless devs intentionally release games that current cards can't run at max (Flight Simulator).

 

5 minutes ago, Blademaster91 said:

After seeing the GN 3080 teardown, the FE has room for 2 extra chips; either Nvidia has a 3080 Ti planned

There is ALWAYS such a card planned.

 

5 minutes ago, Blademaster91 said:

Gigabyte of a 20GB 3080 is true.

This I doubt, but I'd have to look up the technical possibility. Double-capacity modules would have to exist; if they don't, then this is 100% impossible no matter what, so it's actually very easy to verify. The 3090 should be using such modules, so I would assume they do exist, but like I said, I haven't checked.
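For what it's worth, the feasibility check is just module count times module density. A minimal sketch (Python; the 1GB and 2GB GDDR6X densities here are my assumption, not confirmed specs):

```python
# Rough VRAM feasibility check: capacity = module count x per-module density.
# Densities are assumed (1GB and 2GB GDDR6X modules), purely illustrative.

def total_vram_gb(module_count: int, density_gb: int) -> int:
    """Total VRAM in GB for a board populated with identical memory modules."""
    return module_count * density_gb

# A 3080 has 10 populated module positions (320-bit bus / 32 bits per module).
print(total_vram_gb(10, 1))  # 10 -> the shipping 10GB card
print(total_vram_gb(10, 2))  # 20 -> a 20GB card requires 2GB modules to exist
```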


4 hours ago, leadeater said:

But also no, the difference some settings make is literally zero visually when going from Ultra to High.

That's something that really bothers me with benchmarks and the way people play in general.

They crank everything up to Ultra, run the game, and then complain about not having enough performance and needing higher-end hardware.

Dude, just turn the settings down a bit. Going from "Ultra" to "Very high" might lower image quality by 5% but you get 15% higher performance.

 

Typically, the higher the settings, the less optimized they are, and on top of that there are diminishing returns, which makes it even less worth it.
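To put rough numbers on that trade, here's a sketch; the 15% uplift is just the hypothetical figure from above:

```python
# Illustrative frame-time math for a hypothetical 15% performance uplift
# from dropping Ultra to Very High. All numbers are made up for the example.
baseline_fps = 60.0
uplift = 0.15

new_fps = baseline_fps * (1 + uplift)
saved_ms = 1000 / baseline_fps - 1000 / new_fps
print(f"{baseline_fps:.0f} fps -> {new_fps:.1f} fps ({saved_ms:.2f} ms saved per frame)")
# 60 fps -> 69.0 fps (2.17 ms saved per frame)
```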

 

 

A lot of tech channels like LTT love making videos about how awesome high-end stuff is, but I think a video on how much visual fidelity you gain and how many FPS you lose when going between settings would be really interesting.

Is cranking everything to ultra really a good idea and is it worth it?


1 minute ago, LAwLz said:

That's something that really bothers me with benchmarks and the way people play in general.

They crank everything up to Ultra, run the game, and then complain about not having enough performance and needing higher-end hardware.

Dude, just turn the settings down a bit. Going from "Ultra" to "Very high" might lower image quality by 5% but you get 15% higher performance.

 

Typically, the higher the settings, the less optimized they are, and on top of that there are diminishing returns, which makes it even less worth it.

 

 

A lot of tech channels like LTT love making videos about how awesome high-end stuff is, but I think a video on how much visual fidelity you gain and how many FPS you lose when going between settings would be really interesting.

Is cranking everything to ultra really a good idea and is it worth it?

It’s apparently highly variable by both game and user. I personally dislike the art in Skyrim; I find it weirdly bony. As such, I actually prefer it on lower settings. Fallout 4 and Witcher 3 I like more, though, but that’s just me. I saw someone mention once that ray tracing capability has to double again beyond the 3090 before it can really come into its own and shading can be abandoned.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


6 hours ago, leadeater said:

Nothing today would require you to; the future is the future. Like I said, everything could change, so worrying about what might happen later is pointless; just buy the next new GPU at that point.

OMG, you trying to tell us nothing is truly future proof?! 😱


4 hours ago, Blademaster91 said:

After seeing the GN 3080 teardown, the FE has room for 2 extra chips; either Nvidia has a 3080 Ti planned or the leak from Gigabyte of a 20GB 3080 is true.

It seems like Nvidia held back on adding memory either because they don't want people keeping their GPUs like so many did with Pascal cards, or because Nvidia knows they have no competition, so they can keep the memory low and get those who need the extra VRAM to buy a 3090.

Random thought: do we know if the 3080 and 3090 share the same PCB? We know they use the same die, so from a production perspective it would be advantageous to use the same PCB for both. The bus width difference would then explain the two unpopulated memory spaces on the 3080 relative to the fully populated 3090, and the width factors into the overall bandwidth. Could they fill 11 or 12 of those pads in a 3080 refresh? The possibility is there.
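The bandwidth side is easy to put numbers on. A minimal sketch using the published 3080/3090 figures (19 and 19.5 Gbps per pin); the 12-position refresh at the end is hypothetical:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(320, 19.0))  # 3080: 760.0 GB/s
print(bandwidth_gb_s(384, 19.5))  # 3090: 936.0 GB/s
# A refresh that populated all 12 positions at 19 Gbps would land at:
print(bandwidth_gb_s(384, 19.0))  # hypothetical: 912.0 GB/s
```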

 

4 hours ago, leadeater said:

This I doubt, but I'd have to look up the technical possibility. Double-capacity modules would have to exist; if they don't, then this is 100% impossible no matter what, so it's actually very easy to verify. The 3090 should be using such modules, so I would assume they do exist, but like I said, I haven't checked.

If the AMD rumours are at all credible, then I think high-VRAM Nvidia cards are a given. Maybe availability is in part why they haven't announced them from the start, if this RAM is almost but not quite available today; this is pure speculation on my part.

 

I've not kept up with one part of the rumours: does anyone know if AMD are supposed to be going GDDR or HBM with their impending cards? If AMD do go high capacity, how will they do so?

 

1 hour ago, LAwLz said:

That's something that really bothers me with benchmarks and the way people play in general.

They crank everything up to Ultra, run the game, and then complain about not having enough performance and needing higher-end hardware.

For performance comparisons, it is nice to have presets that allow for ease of repeatability and comparison across different devices. It doesn't necessarily follow that those settings make sense in a practical use case. You might have insane fps at low resolution/settings, and a slideshow at the other end.

 

1 hour ago, LAwLz said:

Dude, just turn the settings down a bit. Going from "Ultra" to "Very high" might lower image quality by 5% but you get 15% higher performance.

This is something more "value sensitive" gamers have been doing anyway. There will be a number of people who want the best possible visual quality, which is not the same as the best experience. Balancing fps against visual quality is part of that. 

 

1 hour ago, LAwLz said:

Is cranking everything to ultra really a good idea and is it worth it?

It depends on the game too much for a general answer. On my main timesink, FFXIV, I can leave it on max at 1440p, and I frame-rate limit so it isn't stupidly high. At the other end, Flight Simulator defaulted to Ultra when I started it, but I was getting an average of 20fps. I dropped it down to High and average 60fps or so (though not necessarily in the same area; variations may apply due to scenery). What about the visual quality? Looking at them one at a time, I can't tell the difference. In side-by-side comparison videos on YouTube, there's only a small difference between Ultra and High, and you only take a more noticeable hit in visual quality at Medium.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


My GTX 970 setup had the power to drive my high-res monitors but always ran out of memory. It was a PITA, and a sad story of a great card that could have been legendary.


3 hours ago, LAwLz said:

A lot of tech channels like LTT loves making videos about how awesome high end stuff is, but I think a more interesting video would be "how much visual fidelity do you gain and how many FPS do you lose from going between settings" would be a really interesting video.

I think Hardware Unboxed does this in their game optimization video series. I don't really watch those, but I know they have done such investigations.


1 hour ago, porina said:

Random thought: do we know if the 3080 and 3090 share the same PCB? We know they use the same die, so from a production perspective it would be advantageous to use the same PCB for both. The bus width difference would then explain the two unpopulated memory spaces on the 3080 relative to the fully populated 3090, and the width factors into the overall bandwidth. Could they fill 11 or 12 of those pads in a 3080 refresh? The possibility is there.

 

If the AMD rumours are at all credible, then I think high-VRAM Nvidia cards are a given. Maybe availability is in part why they haven't announced them from the start, if this RAM is almost but not quite available today; this is pure speculation on my part.

 

I've not kept up with one part of the rumours: does anyone know if AMD are supposed to be going GDDR or HBM with their impending cards? If AMD do go high capacity, how will they do so?

Nvidia has usually done this in the past; it's why you see empty GDDR footprints on the 2080 Ti, for example.

 

 

 

 

So here is the info we know for sure (based on kernel data):

3 dies: one with 80 CUs, mentions of both a 256-bit GDDR bus and HBM, 64 ROPs, and 4 shader engines; another with 40 CUs, a 192-bit GDDR6 bus, and 32 ROPs.

From rumors and leaks:

16GB of RAM for the biggest die, plus some sort of huge cache (128MB) to let the smaller 256-bit bus keep up. We have also seen the PCB for this GPU, which 100% matches the renders of the three-fan cooler AMD released.

12GB for the 192-bit bus card.

Die sizes: 500, 340, and 250 mm².

That's mostly it, I think.
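On the cache rumor: a simple hit-rate model shows why a big on-die cache could let a 256-bit bus keep up. This is pure speculation to match the rumor above; the cache bandwidth, data rate, and hit rates below are invented for illustration:

```python
# Speculative sketch: effective bandwidth of a narrow bus backed by a large
# on-die cache. Every number here is an assumption, not a leaked spec.

def effective_bandwidth(vram_gb_s: float, cache_gb_s: float, hit_rate: float) -> float:
    """Blend cache and VRAM bandwidth by the fraction of accesses the cache serves."""
    return hit_rate * cache_gb_s + (1 - hit_rate) * vram_gb_s

vram = 256 / 8 * 16   # 256-bit GDDR6 at an assumed 16 Gbps -> 512 GB/s
cache = 2000.0        # assumed on-die cache bandwidth in GB/s (pure guess)

for hit in (0.0, 0.3, 0.5):
    print(f"hit rate {hit:.0%}: ~{effective_bandwidth(vram, cache, hit):.0f} GB/s effective")
```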

 


6 hours ago, porina said:

Random thought: do we know if the 3080 and 3090 share the same PCB? We know they use the same die, so from a production perspective it would be advantageous to use the same PCB for both. The bus width difference would then explain the two unpopulated memory spaces on the 3080 relative to the fully populated 3090.

Possible, but the 3090 also has NVLink connectors on the PCB, unlike the 3080, so unless they're making a daughterboard for that...

Quote

I've not kept up with one part of the rumours, anyone know if AMD are supposed to be going GDDR or HBM with their impending cards? If AMD do go high capacity, how will they do so?

Actually Hardcore Overclocking took a look at the Fortnite renders of the 6000 series card and noted that the screw holes on either side of the GPU die look to be right where the memory traces would need to be for GDDR6, suggesting it might use HBM of some sort, maybe HBM2e?

That being said, it's highly likely anything below the top-tier card would be using GDDR6, as HBM would just be too expensive.

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


I don’t think this is much of an issue. If you turn everything up to Ultra and fill the memory, you are likely at around 80fps anyway. I am more likely to turn down settings to get 100fps, which gives back memory.

I still don’t understand why it’s not 12GB of memory instead of 10GB; they had the space for it. It would have given just that extra bit of buffer. But looking at the settings in games and how much memory is used as the FPS falls, realistically you wouldn’t have all of that turned on unless you are happy with 60fps.

CPU | AMD Ryzen 7 7700X | GPU | ASUS TUF RTX3080 | PSU | Corsair RM850i | RAM 2x16GB X5 6000Mhz CL32 MOTHERBOARD | Asus TUF Gaming X670E-PLUS WIFI | 
STORAGE 
| 2x Samsung Evo 970 256GB NVME  | COOLING 
| Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity  | MONITOR | Samsung G9 Neo


7 minutes ago, Maticks said:

I still don’t understand why it’s not 12GB of memory instead of 10GB; they had the space for it.

It would also increase the memory bus width to equal that of the 3090, and reduce the performance gap between the two. I suspect they're keeping this option in reserve for a future higher performance refresh of the 3080.
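The arithmetic behind that, as a sketch: each GDDR6X module contributes 32 bits of bus width, so capacity and bus width move together at a fixed module density:

```python
# Bus width scales with module count: each GDDR6X module adds 32 bits.
BITS_PER_MODULE = 32

for modules in (10, 12):
    print(f"{modules} x 1GB modules -> {modules}GB, {modules * BITS_PER_MODULE}-bit bus")
# 10 x 1GB modules -> 10GB, 320-bit bus (the shipping 3080)
# 12 x 1GB modules -> 12GB, 384-bit bus (matching the 3090's width)
```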

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


On 9/21/2020 at 2:01 PM, Moonzy said:

-sips tea-

 

internally: (hidden image)

 

call me a fanboy if you want, but all of their previous launches for the last few years were train wrecks

hope they can do it better this year

 

I mean, Nvidia pulled an AMD as well this year.

 


9 minutes ago, AndreiArgeanu said:

I mean, Nvidia pulled an AMD as well this year.

They also had a VRAM issue last year with their 2080 Ti.

Two launches in a row now.

Though I'm not sure if they can patch this with a driver update.

The 2080 Ti had to be RMA'd, IIRC.

-sigh- feeling like I'm being too negative lately


2 minutes ago, Moonzy said:

They also had a VRAM issue last year with their 2080 Ti.

Two launches in a row now.

Though I'm not sure if they can patch this with a driver update.

The 2080 Ti had to be RMA'd, IIRC.

They've had one with the 900 series also; the 970, 980, and 980 Ti all needed more.

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 3.0: R7 5700x @, B550A vision D, H105, 2x32gb Oloy 3600, Sapphire RX 6700XT  Nitro+, Corsair RM750X, 500 gb 850 evo, 2tb rocket and 5tb Toshiba x300, 2x 6TB WD Black W10 all in a 750D airflow.
GF PC: (nighthawk 2.0): R7 2700x, B450m vision D, 4x8gb Geli 2933, Strix GTX970, CX650M RGB, Obsidian 350D

Skunkworks: R5 3500U, 16gb, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6 Ubuntu 20. LTS

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Spirt  (NAS) ASUS Z9PR-D12, 2x E5 2620V2, 8x4gb, 24 3tb HDD. F80 800gb cache, trueNAS, 2x12disk raid Z3 stripped

PSU Tier List      Motherboard Tier List     SSD Tier List     How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."

Camera Gear: X-S10, 16-80 F4, 60D, 24-105 F4, 50mm F1.4, Helios44-m, 2 Cos-11D lavs


I mean, that's the plan: y'all buy these "Big Navi™" 16GB cards, and I'll get the RTX 3070 SUPER 12GB a couple of months later (for $499).


 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 

