
Not quite gone yet - RTX 3080 20GB found

williamcll


Amongst all the 3060 Ti news, MSI made the unusual move of submitting trademarks for a 20GB memory model of the RTX 3080 to the Eurasian Economic Commission, alongside its 3060 Ti model names. There is no information on when this card will come out.

Quote

We are not sure why MSI has submitted the 20GB models to Eurasian Economic Commission today. It might be an automated submission. There could also be a recent change in the roadmap that we are not aware of. Either way, the RTX 3080 20GB SKU appears at the same time as RTX 3060 Ti, which launches next month. The original launch date for RTX 3080 20GB was also scheduled for the first week of December.

Source: https://videocardz.com/newz/msi-submits-geforce-rtx-3060-ti-and-geforce-rtx-3080-20gb-to-eec

https://portal.eaeunion.org/sites/odata/_layouts/15/Portal.EEC.Registry.UI/DisplayForm.aspx?ItemId=71930&ListId=d84d16d7-2cc9-4cff-a13b-530f96889dbc

Thoughts: Memory capacity was a problem for a lot of people who wanted to use the 3080 as a work card; 10GB of GDDR6X isn't enough in many rendering situations. 20GB should be just enough to run software like DaVinci Resolve at 4K or even higher with multiple video adjustments. Not sure how much higher they will hike the pricing, though I imagine it won't be too much, since the competing Radeon 6900 XT will already be out by then.
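To put rough numbers on that, here's a quick back-of-the-envelope sketch (purely illustrative Python; real usage in Resolve varies a lot with codec, node count, and caching):

```python
# Back-of-the-envelope: how many uncompressed working frames fit in VRAM.
# Assumes 16-bit-per-channel RGBA buffers, roughly what GPU-accelerated
# editors keep around internally; treat these as order-of-magnitude figures.

BYTES_PER_PIXEL = 4 * 2  # RGBA, 2 bytes per channel

resolutions = {
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

for vram_gb in (10, 20):
    budget = vram_gb * 1024**3
    print(f"--- {vram_gb}GB card ---")
    for name, (w, h) in resolutions.items():
        frame_bytes = w * h * BYTES_PER_PIXEL
        print(f"{name}: {frame_bytes / 1024**2:.0f} MiB/frame, "
              f"fits ~{budget // frame_bytes} frames")
```

At 4K that works out to roughly 63 MiB per frame, so doubling the VRAM from 10GB to 20GB doubles how much footage and intermediate node output can stay resident on the card.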

Specs: Motherboard: Asus X470-PLUS TUF Gaming (yes, I know it's poor, but I wasn't informed) RAM: Corsair Vengeance LPX DDR4 3200MHz CL16-18-18-36 2x8GB

CPU: Ryzen 9 5900X Case: Antec P8 PSU: Corsair RM850x Cooler: Antec K240 with two Noctua Industrial PPC 3000 PWM

Drives: Samsung 970 EVO Plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 Ti Black Edition


Nobody needs a 3070 or 3080 with 10GB. Those are gaming cards only; those are for the people who want them.

 

Some people will need these 20GB cards; these are the work cards.

I WILL find your ITX build thread, and I WILL recommend the Silverstone Sugo SG13B

 

Primary PC:

i7 8086k - EVGA Z370 Classified K - G.Skill Trident Z RGB - WD SN750 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G3 650W - dual booting Windows 10 and Linux - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

How many watts do I need | ATX 3.0 & PCIe 5.0 spec, PSU misconceptions, protections explained | group reg is bad


Various scenarios are possible:

1. The previous rumour that the 3080 20GB was cancelled was wrong.

2. The previous rumour that it was cancelled was right, but Nvidia has changed their mind since then (but then, why no new rumours of that?).

3. The submission was originally made before the cancellation and was only published recently.

4. Anything else we don't know.

 

7 minutes ago, williamcll said:

Not sure how much higher they will hike the pricing, though I imagine it won't be too much, since the competing Radeon 6900 XT will already be out by then.

Nvidia historically can get away with a price premium, or more correctly, AMD has generally tried to undercut them. Where it will fit remains to be seen. If it really is a 3080 with more VRAM, the 6800 XT would be the more obvious comparison point. If VRAM quantity is really the big thing for specific uses, 20GB beats 16GB, and even if the cores are on average slightly slower, this could be more attractive than a 6900 XT.

 

Advance congratulations to the two people who will be able to buy a 6900 XT this year.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


16 hours ago, Fasauceome said:

Nobody needs a 3070 or 3080 with 10GB. Those are gaming cards only; those are for the people who want them.

Some people will need these 20GB cards; these are the work cards.

If you are planning on playing at 4K, then 10GB of VRAM gets eaten up really fast, and once it is all gone you take a big performance hit. I would say that the 16GB AMD is offering makes more sense for a 4K card, as it will be plenty for any game for quite some time. The main issue is that newer games are already creeping up on 10GB of VRAM usage at 4K.


44 minutes ago, Brooksie359 said:

If you are planning on playing at 4K, then 10GB of VRAM gets eaten up really fast, and once it is all gone you take a big performance hit. I would say that the 16GB AMD is offering makes more sense for a 4K card, as it will be plenty for any game for quite some time. The main issue is that newer games are already creeping up on 10GB of VRAM usage at 4K.

NN training requires lots of VRAM; a 6GB card will not cut it at all. Just loading one NN I've been experimenting with, for inference rather than training, pulls 4GB of VRAM before even running a single thing on it. Right now a lot of NN experimentation seems to err on the side of low-quality inputs to conserve video memory. The underlying software for generating deepfakes, for example, can currently only deal with 512x512 images and takes about a week to train on previous-generation 2080 Ti hardware. So if you wanted to increase that to 1024x1024, the memory requirements quadruple, and the training time increases by the same factor.
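That scaling is just the pixel-count ratio; here's a tiny sketch of the arithmetic (the 4GB and one-week baselines are the figures above; the rest is illustrative, not taken from any particular deepfake tool):

```python
# Rough cost scaling for convolutional NN training as input resolution
# grows. Activation memory and per-step compute scale roughly with the
# pixel count, so doubling each side quadruples both.

def scaled_cost(base_side, new_side, base_vram_gb, base_train_days):
    factor = (new_side / base_side) ** 2  # pixel-count ratio
    return base_vram_gb * factor, base_train_days * factor

vram, days = scaled_cost(512, 1024, base_vram_gb=4, base_train_days=7)
print(f"1024x1024: ~{vram:.0f}GB VRAM, ~{days:.0f} days of training")
# -> ~16GB VRAM, ~28 days: beyond a 10GB card, within reach of a 20GB one
```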

 

Games, on the other hand, depend very much on the output settings. Sure, you could run a game at 4K or 8K, but it's a case of diminishing returns: if a game model isn't complex enough, a 720p-optimized model will look exactly the same at 4K and 8K, even with a 4K texture pack; only the overall output is rendered at 4K or 8K.

 

