
Nvidia back at their memory game - RTX 3080 20GB found and benchmarked

Thread started by williamcll
On 1/30/2021 at 5:39 AM, Kisai said:

The reason certain numbers show up has to do with the channels (eg "bits") of memory that can be accessed. The 3080 is listed as 320-bit with 10GB GDDR6X, the 3090 is 384-bit with 24GB GDDR6X, the 3070 is 256-bit with 8GB GDDR6, the 3060 Ti is 256-bit with 8GB GDDR6, and the 3060 is 192-bit with 12GB GDDR6.

 

The fewer bits, the narrower the bus, which effectively means that accessing larger amounts of memory induces more latency the more of that memory is utilized. It's like how your PC might operate in single-channel or dual-channel mode: it doesn't cut the memory speed in half, but it does noticeably impact performance when the memory is fully utilized.

I'm not sure that's the best explanation. GDDR modules tend to come in two commonly used and mass-produced sizes, and a wider bus means a greater number of memory modules. A graphics card could be 128-bit or 256-bit and have the same amount of VRAM; one of them would simply have twice as many memory modules at half the size each. The one with more memory modules would have far greater memory bandwidth with only a very tiny reduction in memory latency, something which isn't much of a concern for GPUs and the workloads they run.

 

Each GDDR6 module is 32 bits wide, so you can divide the bus width by 32 to get the number of memory modules, then divide the memory capacity by the number of modules to get the size of the modules being used, if you want to know those two things.
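
To make that concrete, here's a minimal sketch of that arithmetic, using the bus widths and capacities quoted above (note that clamshell boards like the 3090 actually hang two modules off each 32-bit channel, so the simple division gives per-channel capacity rather than the physical module count there):

```python
# Module math described above: each GDDR6/GDDR6X package has a 32-bit
# interface, so bus_width / 32 gives the channel count, and
# capacity / channels gives the memory hanging off each channel.
CARDS = {
    "RTX 3080": (320, 10),    # (bus width in bits, VRAM in GB)
    "RTX 3090": (384, 24),    # clamshell: two 1 GB modules per channel
    "RTX 3070": (256, 8),
    "RTX 3060 Ti": (256, 8),
    "RTX 3060": (192, 12),
}

for name, (bus_bits, vram_gb) in CARDS.items():
    channels = bus_bits // 32
    gb_per_channel = vram_gb / channels
    print(f"{name}: {channels} channels, {gb_per_channel:g} GB per channel")
```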


6 hours ago, leadeater said:

I'm not sure that's the best explanation. GDDR modules tend to come in two commonly used and mass-produced sizes, and a wider bus means a greater number of memory modules. A graphics card could be 128-bit or 256-bit and have the same amount of VRAM; one of them would simply have twice as many memory modules at half the size each. The one with more memory modules would have far greater memory bandwidth with only a very tiny reduction in memory latency, something which isn't much of a concern for GPUs and the workloads they run.

 

Each GDDR6 module is 32 bits wide, so you can divide the bus width by 32 to get the number of memory modules, then divide the memory capacity by the number of modules to get the size of the modules being used, if you want to know those two things.

In most benchmarks I've seen, be it CPU or GPU, the improvement with wider channels is never double; it's usually more like 10-20%. With GPUs, software often demands a specific level of bandwidth, so you really want to avoid the low-end parts.

https://knowledge.autodesk.com/support/civil-3d/learn-explore/caas/sfdcarticles/sfdcarticles/System-requirements-for-Autodesk-Civil-3D-2020.html

Quote
Display Card
Minimum: 1 GB GPU with 29 GB/s Bandwidth and DirectX 11 compliant
Recommended: 4 GB GPU with 106 GB/s Bandwidth and DirectX 11 compliant

Even though that says "Minimum" and "Recommended", no software since 1986 has ever been honest about its requirements. Usually "Minimum" means "this is what the software requires, and anything less we didn't test on", and "Recommended" means "this is the lowest-end thing we tested".

 

As I have a variety of systems at the office, that "Recommended" spec is exactly, or close to, a GP107 (Quadro P500/GeForce GTX 1050). You might think "oh, well, a GTX 1050 is sufficient for AutoCAD", but if you actually try to use the software, it certainly isn't, unless you want to hear the GPU fan sound like a jet engine the entire time. The 1050 with a 96-bit GDDR5 bus is about 84GB/sec, and the 1050/1050 Ti with a 128-bit bus is 112GB/sec. All the CAD systems have a minimum of a P1000, which has 96GB/sec on a 128-bit bus, falling slightly below the Recommended. All of the cards mentioned are GP107. The actual CAD users get laptops with GP106 or desktops with GP104, which aren't configured with such low memory bandwidth. Even those high-end laptops sound like a jet engine after about two minutes running AutoCAD; desktops don't break a sweat, however.
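
For reference, those figures drop straight out of bus width times effective data rate. A quick sketch (the 7 Gbps GDDR5 data rate is my assumption, picked because it reproduces the numbers above):

```python
# Peak memory bandwidth = (bus width in bytes) * effective data rate per pin.
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and per-pin data rate."""
    return (bus_bits / 8) * data_rate_gbps

print(bandwidth_gb_s(96, 7.0))   # 84.0  -> GTX 1050 3GB, 96-bit GDDR5
print(bandwidth_gb_s(128, 7.0))  # 112.0 -> GTX 1050 Ti, 128-bit GDDR5
```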

 

Neural-net stuff also demands higher GPU memory bandwidth. So my feeling here is that a 20GB 3060 may really be intended for something like that, but the lack of bandwidth doesn't impair it as much (even the bandwidth difference between a 2060 and a 2080 isn't that much, 336GB/sec vs 448GB/sec) compared to the sub-x60 parts. The 3060 Ti is 448GB/sec, which puts it at the same bandwidth as the 2080 on GDDR6. The 3080 is 760GB/sec, and the 3090 is 936GB/sec on GDDR6X.

 

Honestly, it kinda feels like nVidia didn't intend to release a 3080 10GB at all and was forced to by the lack of GDDR6X chips. For the same strange reason, the 3060 Ti's we're seeing with 20GB are probably there to soak up the GDDR6 inventory that was meant for the previous 20xx parts.

 


4 hours ago, Kisai said:

In most benchmarks I've seen, be it CPU or GPU, the improvement with wider channels is never double; it's usually more like 10-20%. With GPUs, software often demands a specific level of bandwidth, so you really want to avoid the low-end parts.

That's because memory bandwidth is not directly related to performance; it never has been. Memory bandwidth is only ever about whether you have enough for the compute resources. With enough memory bandwidth you get 100% performance, with more than enough you still get 100%, and with not enough you get something below 100%.
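
One way to picture that "enough vs. more than enough" behaviour is a roofline-style model (my illustration, not something from the post): throughput is capped by whichever of compute or memory runs out first, so bandwidth beyond the crossover point buys nothing.

```python
# Roofline-style sketch with illustrative numbers only: achievable throughput
# is the minimum of the compute ceiling and what the memory system can feed.
def attainable_tflops(peak_tflops: float, bandwidth_gb_s: float,
                      flops_per_byte: float) -> float:
    """flops_per_byte is the arithmetic intensity of the workload."""
    bandwidth_limited = bandwidth_gb_s * flops_per_byte / 1000.0  # in TFLOPS
    return min(peak_tflops, bandwidth_limited)

# A hypothetical 30 TFLOPS GPU running a workload at 40 FLOPs per byte:
for bw in (500, 760, 936, 1200):
    print(f"{bw} GB/s -> {attainable_tflops(30.0, bw, 40.0):.1f} TFLOPS")
```

Past the point where bandwidth stops being the limiter, adding more changes nothing, which is exactly the "with more you still get 100%" case.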

 

4 hours ago, Kisai said:

Honestly, it kinda feels like nVidia didn't intend to release a 3080 10GB at all and was forced to by the lack of GDDR6X chips. For the same strange reason, the 3060 Ti's we're seeing with 20GB are probably there to soak up the GDDR6 inventory that was meant for the previous 20xx parts.

Doubt it; Nvidia has never favored high-capacity VRAM on GeForce cards and has always been criticized for it. Nvidia only ever puts the more expensive memory modules on Titans and Quadros. For all we know there is an abundance of the expensive high-capacity GDDR6 modules, but that doesn't mean Nvidia would be willing to use them on GeForce cards.


Now the RTX 3080 Ti will maybe have 12GB of GDDR6X...

 

https://videocardz.com/newz/nvidia-geforce-rtx-3080-ti-with-12gb-memory-to-launch-in-april

 

12GB or 20GB? We'll have to wait and see when it releases...

PC #1 : Gigabyte Z170XP-SLI | i7-7700 | Cryorig C7 Cu | 32GB DDR4-2400 | LSI SAS 9211-8i | 240GB NVMe M.2 PCIe PNY CS2030 | SSD&HDDs 59.5TB total | Quantum LTO5 HH SAS drive | GC-Alpine Ridge | Corsair HX750i | Cooler Master Stacker STC-T01 | ASUS TUF Gaming VG27AQ 2560x1440 @ 60 Hz (plugged HDMI port, shared with PC #2) | Win10
PC #2 : Gigabyte MW70-3S0 | 2x E5-2689 v4 | 2x Intel BXSTS200C | 32GB DDR4-2400 ECC Reg | MSI RTX 3080 Ti Suprim X | 2x 1TB SSD SATA Samsung 870 EVO | Corsair AX1600i | Lian Li PC-A77 | ASUS TUF Gaming VG27AQ 2560x1440 @ 144 Hz (plugged DP port, shared with PC #1) | Win10
PC #3 : Mini PC Zotac 4K | Celeron N3150 | 8GB DDR3L 1600 | 250GB M.2 SATA WD Blue | Sound Blaster X-Fi Surround 5.1 Pro USB | Samsung Blu-ray writer USB | Genius SP-HF1800A | TV Panasonic TX-40DX600E UltraHD | Win10
PC #4 : ASUS P2B-F | PIII 500MHz | 512MB SDR 100 | Leadtek WinFast GeForce 256 SDR 32MB | 2x Guillemot Maxi Gamer 3D² 8MB in SLI | Creative Sound Blaster AWE64 ISA | 80GB HDD UATA | Fortron/Source FSP235-60GI | Zalman R1 | DELL E151FP 15" TFT 1024x768 | Win98SE

Laptop : Lenovo ThinkPad T460p | i7-6700HQ | 16GB DDR4 2133 | GeForce 940MX | 240GB SSD PNY CS900 | 14" IPS 1920x1080 | Win11

PC tablet : Fujitsu Point 1600 | PMMX 166MHz | 160MB EDO | 20GB HDD UATA | external floppy drive | 10.4" DSTN 800x600 touchscreen | AGFA SnapScan 1212u blue | Win98SE

Laptop collection #1 : IBM ThinkPad 340CSE | 486SLC2 66MHz | 12MB RAM | 360MB IDE | internal floppy drive | 10.4" DSTN 640x480 256 color | Win3.1 with MS-DOS 6.22

Laptop collection #2 : IBM ThinkPad 380E | PMMX 150MHz | 80MB EDO | NeoMagic MagicGraph128XD | 2.1GB IDE | internal floppy drive | internal CD-ROM drive | Intel PRO/100 Mobile PCMCIA | 12.1" FRSTN 800x600 16-bit color | Win98

Laptop collection #3 : Toshiba T2130CS | 486DX4 75MHz | 32MB EDO | 520MB IDE | internal floppy drive | 10.4" STN 640x480 256 color | Win3.1 with MS-DOS 6.22

And 6 other computers (Intel Compute Stick x5-Z8330, Giada Slim N10 WinXP, 2 classic Apples and 2 WinCE pocket PCs)


On 1/29/2021 at 11:06 AM, germgoatz said:

Why can't we just get a 12GB 3080? 10GB is not enough, but 20GB is way too much and a waste of money for most gamers.

10GB isn't enough?  When do you break this?  In real use, not allocation.

 

Btw, if 10 isn't enough, I'd want more than 12.  16GB would be a nice step up, not 12.

 

I think Nvidia, while being asshats, know exactly how to get into our wallets.  20GB VRAM?  I bet some 3080 owners are going to sell them and rebuy this.  Strong work, Nvidia.

"Do what makes the experience better" - in regards to PCs and Life itself.

 

Onyx AMD Ryzen 7 7800x3d / MSI 6900xt Gaming X Trio / Gigabyte B650 AORUS Pro AX / G. Skill Flare X5 6000CL36 32GB / Samsung 980 1TB x3 / Super Flower Leadex V Platinum Pro 850 / EK-AIO 360 Basic / Fractal Design North XL (black mesh) / AOC AGON 35" 3440x1440 100Hz / Mackie CR5BT / Corsair Virtuoso SE / Cherry MX Board 3.0 / Logitech G502

 

7800X3D - PBO -30 all cores, 4.90GHz all core, 5.05GHz single core, 18286 C23 multi, 1779 C23 single

 

Emma : i9 9900K @5.1Ghz - Gigabyte AORUS 1080Ti - Gigabyte AORUS Z370 Gaming 5 - G. Skill Ripjaws V 32GB 3200CL16 - 750 EVO 512GB + 2x 860 EVO 1TB (RAID0) - EVGA SuperNova 650 P2 - Thermaltake Water 3.0 Ultimate 360mm - Fractal Design Define R6 - TP-Link AC1900 PCIe Wifi

 

Raven: AMD Ryzen 5 5600x3d - ASRock B550M Pro4 - G. Skill Ripjaws V 16GB 3200Mhz - XFX Radeon RX6650XT - Samsung 980 1TB + Crucial MX500 1TB - TP-Link AC600 USB Wifi - Gigabyte GP-P450B PSU -  Cooler Master MasterBox Q300L -  Samsung 27" 1080p

 

Plex : AMD Ryzen 5 5600 - Gigabyte B550M AORUS Elite AX - G. Skill Ripjaws V 16GB 2400Mhz - MSI 1050Ti 4GB - Crucial P3 Plus 500GB + WD Red NAS 4TBx2 - TP-Link AC1200 PCIe Wifi - EVGA SuperNova 650 P2 - ASUS Prime AP201 - Spectre 24" 1080p

 

Steam Deck 512GB OLED

 

OnePlus: 

OnePlus 11 5G - 16GB RAM, 256GB NAND, Eternal Green

OnePlus Buds Pro 2 - Eternal Green

 

Other Tech:

- 2021 Volvo S60 Recharge T8 Polestar Engineered - 415hp/495tq 2.0L 4cyl. turbocharged, supercharged and electrified.

Lenovo 720S Touch 15.6" - i7 7700HQ, 16GB RAM 2400MHz, 512GB NVMe SSD, 1050Ti, 4K touchscreen

MSI GF62 15.6" - i7 7700HQ, 16GB RAM 2400 MHz, 256GB NVMe SSD + 1TB 7200rpm HDD, 1050Ti

- Ubiquiti Amplifi HD mesh wifi

 


7 minutes ago, Dedayog said:

10GB isn't enough?  When do you break this?  In real use, not allocation.

 

Btw, if 10 isn't enough, I'd want more than 12.  16GB would be a nice step up, not 12.

 

I think Nvidia, while being asshats, know exactly how to get into our wallets.  20GB VRAM?  I bet some 3080 owners are going to sell them and rebuy this.  Strong work, Nvidia.

I've seen many benchmarks at 1440p and 4K where all the VRAM is being used; that's one of the reasons why the RX 6000 series has more VRAM than its Nvidia counterparts.

But I do agree that 16GB would be better.

hi


1 hour ago, Dedayog said:

16GB would be a nice step up, not 12.

1 hour ago, germgoatz said:

16GB would be better

You want even slower transfer rates because they have to step down the interface width to 256 bits? Or something like a 970 with some parts of the memory being slower?

If they want to use the same GA102 for the 3080 Ti, they basically have to use either 12 GB with a 384-bit interface, or 20 GB with a 320-bit interface and memory modules on both sides of the card. I would prefer 12 GB. Cooling the modules on the back of the board (like on the 3090) seems to be a lot harder, and 12 GB should be enough for now.
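
A quick sketch of that capacity math (assuming 1 GB GDDR6X modules, one per 32-bit channel, or two per channel in a clamshell layout with modules on both sides of the board):

```python
# GA102 capacity options described above: 32 bits per module, 1 GB modules,
# optionally doubled up clamshell-style on the back of the board.
MODULE_GB = 1

def capacities(bus_bits: int) -> dict:
    channels = bus_bits // 32
    return {
        "single-sided": channels * MODULE_GB,
        "clamshell": channels * MODULE_GB * 2,
    }

print("384-bit:", capacities(384))  # {'single-sided': 12, 'clamshell': 24}
print("320-bit:", capacities(320))  # {'single-sided': 10, 'clamshell': 20}
```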

 

But it doesn't matter anyway. With the current rise of cryptocurrency we are not going to see any cards in the wild any time soon.

 


1 hour ago, germgoatz said:

I've seen many benchmarks at 1440p and 4K where all the VRAM is being used; that's one of the reasons why the RX 6000 series has more VRAM than its Nvidia counterparts.

But I do agree that 16GB would be better.

That's just VRAM allocation; it can be next to impossible to tell how much VRAM is actually in use. You typically have to go by secondary factors like frame time spikes and large losses in the 1% and 0.1% lows.
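
For anyone unfamiliar with those metrics, here's a rough sketch (my own, not from the post) of how 1% and 0.1% lows are usually derived from per-frame times; this is where VRAM pressure tends to show up first:

```python
# 1% / 0.1% lows: average FPS over the slowest 1% / 0.1% of frames.
# Stutter from VRAM spillover drags these down long before it moves the mean.
def percentile_low_fps(frame_times_ms, percent):
    worst = sorted(frame_times_ms, reverse=True)
    count = max(1, int(len(worst) * percent / 100))
    avg_ms = sum(worst[:count]) / count
    return 1000.0 / avg_ms

# 990 smooth frames at ~60 FPS plus 10 hitches of 50 ms each:
frames = [16.7] * 990 + [50.0] * 10
print(f"average FPS: {1000.0 * len(frames) / sum(frames):.1f}")  # ~58.7
print(f"1% low:      {percentile_low_fps(frames, 1.0):.1f}")     # 20.0
print(f"0.1% low:    {percentile_low_fps(frames, 0.1):.1f}")     # 20.0
```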

 

Nvidia has always been very good at putting just enough VRAM on their cards to do the intended job, and for the RTX 3080 that includes 4K. The gaming community has been down this path before, multiple times in fact, and it typically turns out that the lower VRAM compared to AMD only becomes a factor that gets talked about around two years after the release of the GPUs, right around the time Nvidia is about to release a new generation or refresh 😉


On 1/29/2021 at 11:04 AM, williamcll said:

A man from Bilibili managed to buy himself a 3080 with 20GB of memory from an anonymous person, paid in Bitcoin, and he benchmarked it:

[Screenshot of benchmark results: the 3080 20GB is in the middle, note the doubled VRAM size]

TBH, to me the 3080 20GB looks more like a cut-down 3090 than a beefed-up 3080 (yes, I know a 3080 is essentially a cut-down 3090), but given they both have the same CUDA core count, maybe it should be called the 3090 Base, or the 3090 SE, or the 3090 sub 3080ti super rev 2.1a.

"Put as much effort into your question as you'd expect someone to give in an answer"- @Princess Luna

Make sure to Quote posts or tag the person with @[username] so they know you responded to them!

 RGB Build Post 2019 --- Rainbow 🦆 2020 --- Velka 5 V2.0 Build 2021

Purple Build Post ---  Blue Build Post --- Blue Build Post 2018 --- Project ITNOS

CPU i7-4790k    Motherboard Gigabyte Z97N-WIFI    RAM G.Skill Sniper DDR3 1866mhz    GPU EVGA GTX1080Ti FTW3    Case Corsair 380T   

Storage Samsung EVO 250GB, Samsung EVO 1TB, WD Black 3TB, WD Black 5TB    PSU Corsair CX750M    Cooling Cryorig H7 with NF-A12x25


2 hours ago, TVwazhere said:

TBH, to me the 3080 20GB looks more like a cut-down 3090 than a beefed-up 3080 (yes, I know a 3080 is essentially a cut-down 3090), but given they both have the same CUDA core count, maybe it should be called the 3090 Base, or the 3090 SE, or the 3090 sub 3080ti super rev 2.1a.

Or 3090 Gimped.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280

