
Not quite gone yet - RTX 3080 20GB found


Amongst all the 3060 Ti news, MSI made an unusual move by submitting trademarks for a 20GB memory model of the RTX 3080 to the Eurasian Economic Commission alongside its 3060 Ti model names. There is no information on when this card will come out.

Quote

We are not sure why MSI has submitted the 20GB models to Eurasian Economic Commission today. It might be an automated submission. There could also be a recent change in the roadmap that we are not aware of. Either way, the RTX 3080 20GB SKU appears at the same time as RTX 3060 Ti, which launches next month. The original launch date for RTX 3080 20GB was also scheduled for the first week of December.

Source: https://videocardz.com/newz/msi-submits-geforce-rtx-3060-ti-and-geforce-rtx-3080-20gb-to-eec

https://portal.eaeunion.org/sites/odata/_layouts/15/Portal.EEC.Registry.UI/DisplayForm.aspx?ItemId=71930&ListId=d84d16d7-2cc9-4cff-a13b-530f96889dbc

Thoughts: Memory capacity was a problem for a lot of people who wanted to use the 3080 as a work card; 10GB of GDDR6X wasn't enough in many rendering situations. 20GB should be just enough to run software like DaVinci Resolve at 4K or even higher with multiple video adjustments. Not sure how much higher they will hike the pricing, though I imagine it wouldn't be too much since the competing Radeon 6900 XT will already be out by then.

Specs: Motherboard: Asus TUF X470-Plus Gaming (yes, I know it's poor, but I wasn't informed)  RAM: Corsair Vengeance LPX DDR4 3200MHz CL16-18-18-36 2x8GB

CPU: Ryzen 5 3600 @ 4.1GHz  Case: Antec P8  PSU: G.Storm GS850  Cooler: Antec K240 with two Noctua Industrial PPC 3000 PWM

Drives: Samsung 970 EVO Plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168  GPU: EVGA RTX 2080 Ti Black Edition @ 2GHz


Nobody needs a 3070 or 3080 with 10GB. Those are gaming cards only; those are for the people who want them.

 

Some people will need these 20GB cards; these are the work cards.

I WILL find your ITX build thread, and I WILL recommend the Silverstone Sugo SG13B

 

Primary PC:

i7 8086k (won) - EVGA Z370 Classified K - G.Skill Trident Z RGB - Force MP500 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G2 650W - Black and green theme; Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

Linux Proliant ML150 G6:

Dual Xeon X5560 - 24GB ECC DDR3 - GTX 750 Ti - old Seagate 1.5TB HDD - dark-moded Ubuntu (and Win7, cuz why not)

 

How many watts do I need? | Seasonic Focus thread | Userbenchmark (et al.) is trash, explained | PSU misconceptions and protections explained | group reg is bad


Various scenarios are possible:

1. The previous rumour that the 3080 20GB was cancelled was wrong.

2. The previous rumour that the 3080 20GB was cancelled was right, but Nvidia has changed its mind since then (but then, why no new rumours of that?).

3. The submission was originally made before the card was cancelled and has only just been published.

4. Anything else we don't know.

 

7 minutes ago, williamcll said:

Not sure how much higher they will hike the pricing, though I imagine it wouldn't be too much since the competing Radeon 6900 XT will already be out by then.

Nvidia can historically get away with a price premium; or, more correctly, AMD generally tries to undercut them. Where it will fit remains to be seen. If it really is a 3080 with more VRAM, the 6800 XT would be the more obvious comparison point. And if VRAM quantity really is the big thing for specific uses, 20GB > 16GB, so even if the cores are on average slightly slower, this could be more attractive than a 6900 XT.

 

Advance congratulations to the two people who will be able to buy a 6900 XT this year.

Desktop Gaming system: Asrock Z370 Pro4, i7-8086k, Noctua D15, Corsair Vengeance Pro RGB 3200 4x16GB, Gigabyte 2070, NZXT E850 PSU, Cooler Master MasterBox 5, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p144 G-sync
TV Gaming system: Gigabyte Z490 Elite AC, i5-10600k, Noctua D15, Kingston HyperX RGB 4000@3600 2x8GB, EVGA 2080Ti Black, EVGA 850W, Corsair 230T, Crucial P1 1TB + MX500 1TB, LG OLED55B9PLA 4k120 G-Sync Compatible
Streaming system: Asus X299 TUF mark 2, i9-7920X, Noctua D15, Corsair Vengeance LPX RGB 3000 8x8GB, Asus Strix 1080Ti, Corsair HX1000i, GameMax Abyss, Samsung 970 Evo 500GB, Crucial BX500 1TB, BenQ XL2411 1080p144 + HP LP2475w 1200p60
Former Main system: Asus Maximus VIII Hero, i7-6700k, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, Gigabyte GTX 1650, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, Acer RT280k 4k60 FreeSync [link]
Gaming laptop: Asus FX503VD, i5-7300HQ, DDR4 2133 2x8GB, GTX 1050, Sandisk 256GB + 480GB SSD [link]


 

16 hours ago, Fasauceome said:

Nobody needs a 3070 or 3080 with 10GB. Those are gaming cards only; those are for the people who want them.

 

Some people will need these 20GB cards; these are the work cards.

If you are planning on playing at 4K, then 10GB of VRAM gets eaten up really fast, and when it is all gone you take a big performance hit. I would say that the 16GB that AMD is doing makes more sense for a 4K card, as it will be plenty for any game for quite some time. The main issue is that newer games are creeping up on 10GB of VRAM usage at 4K.
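To put a rough number on that, here is a back-of-the-envelope sketch (my own illustrative figures, not measurements from any specific game) of how much VRAM the render targets alone can consume at 4K, before any textures or geometry are counted:

```python
# Back-of-the-envelope VRAM estimate for 4K render targets.
# The target count and format below are illustrative assumptions,
# not measurements from any specific game or engine.

PIXELS_4K = 3840 * 2160

# e.g. a deferred G-buffer with 5 RGBA16F targets = 5 * 8 bytes per pixel
gbuffer_mib = PIXELS_4K * 5 * 8 / 2**20
print(round(gbuffer_mib))  # ~316 MiB just for the G-buffer
```

So render targets account for only a few hundred MiB; under this rough model, most of a 10GB budget goes to textures and streaming pools, which is why the ceiling creeps up as newer titles ship higher-resolution assets.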

44 minutes ago, Brooksie359 said:

If you are planning on playing at 4K, then 10GB of VRAM gets eaten up really fast, and when it is all gone you take a big performance hit. I would say that the 16GB that AMD is doing makes more sense for a 4K card, as it will be plenty for any game for quite some time. The main issue is that newer games are creeping up on 10GB of VRAM usage at 4K.

NN training requires lots of VRAM; a 6GB card will not cut it at all. Just loading one NN I've been experimenting with, for processing rather than training, pulls 4GB of VRAM before running a single thing on it. Right now a lot of NN experimentation seems to err on the side of low-quality inputs to conserve video memory. The underlying software for generating deepfakes, for example, can currently only deal with 512x512 images and takes about a week to train on previous-generation 2080 Ti hardware. So if you wanted to increase that to 1024x1024, the memory requirements quadruple and the training speed decreases by the same factor.
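The quadrupling follows directly from pixel count. A quick sketch of that scaling (my own arithmetic, nothing framework-specific), under the rough assumption that per-image activation memory grows linearly with the number of input pixels:

```python
# Activation memory for a convolutional net scales roughly with pixel
# count, so doubling each image dimension quadruples the VRAM needed.
# This is a rough model that ignores fixed costs such as the weights.

def memory_scale(base_res: int, new_res: int) -> float:
    """Ratio of per-image activation memory between two square resolutions."""
    return (new_res * new_res) / (base_res * base_res)

print(memory_scale(512, 1024))  # 4.0
print(memory_scale(512, 2048))  # 16.0
```

The same ratio applies to per-step compute on each image, which is the intuition behind training time slowing down by roughly the same factor.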

 

Games, on the other hand, depend very much on the output settings. Sure, you could run a game at 4K or 8K, but it's a case of diminishing returns: if a game's models aren't complex enough, a 720p-optimized model will look exactly the same at 4K and 8K even with a 4K texture pack; only the overall output is rendered at 4K or 8K.

 

