
Rumor: RTX 3080 Ti delayed to end of Feb. at the earliest, RTX 3060 12GB to be released at CES

Random_Person1234

Summary

According to Igor's LAB, the RTX 3080 Ti has been delayed to the end of February at the earliest, possibly even later, because Nvidia lowered the card's priority after seeing how the RX 6000 series performs. Igor's LAB also says the RTX 3060 12GB is to be released at CES 2021 (Jan. 11-14) to compete with the expected RX 6700 XT 12GB. Igor's LAB goes on to say that the RTX 3050 Ti is to be renamed the RTX 3060 6GB.

 

Quotes

Quote

Let's start with the long-awaited GeForce RTX 3080 Ti, which has become more and more probable, but whose appearance is likely to be further delayed. If I can believe my own sources, which refer to both NVIDIA and AMD, NVIDIA waited for the launch of the new RX 6000 series and then analyzed all the results very thoroughly. Although the current GeForce RTX 3080 only has 10 GB of graphics memory compared to the current AMD cards with 16 GB, the results in Ultra HD in particular have once again shown that this numerical disadvantage doesn't really matter in practice.

Quote

This, in turn, lets NVIDIA breathe a little easier and lowers the priority of a GeForce RTX 3080 Ti with a more generous memory configuration, pushing the card back in time. If the information I received from several sources (up to and including suppliers) is correct and consistent, the GeForce RTX 3080 Ti with 20 GB of graphics memory will launch only after the CNY holidays (Chinese New Year, 11 to 17 February), i.e. at the end of February at the earliest, maybe even later.

Quote

NVIDIA is expected to introduce the new GeForce RTX 3060 12 GB at the virtual CES 2021 (January 11th to 14th, 2021) and launch it quite soon afterwards. Production launch and nationwide retail availability are scheduled for the end of January, or by CNY at the latest. This would also counter the expected AMD Radeon RX 6700 XT 12 GB, which I have already reported on. Interestingly enough, there should be no 3050 Ti this time, because the RTX naming scheme is supposed to end with the 60 as usual.

Quote

The RTX 3050 Ti that was already planned and circulating is instead to be marketed as the RTX 3060 6 GB, relying on identical boards, a similar or even the same chip, and only 6 GB of GDDR6 memory. At exactly this point I have to protect my own sources, because there seem to be different specifications in circulation that only differ in details. Whether NVIDIA intentionally distributed different information or whether there were transmission errors or misunderstandings, I can't say, but it wouldn't be the first time they have tried to narrow down possible sources of leaks. For the time being, therefore, there is no further information on the exact details.

Quote

What I found interesting is that Igor also confirmed what we wrote earlier, that the RTX 3050 Ti will be renamed to the RTX 3060 6GB. The original specs for the 3050 Ti featured a GA106 GPU, and it appears NVIDIA thought it would better fit a 3060 6GB instead. NVIDIA has so far not confirmed whether it has plans for a 3050 Ti; however, there is an RTX 3050 non-Ti on the horizon that would feature a cut-down GA107 GPU. The Ti variant could use the full chip if it is ever released.

Both RTX 3060 models are still expected to launch next month. According to Igor, the 12GB model would launch around CES 2021 (Jan 11th to 14th), while the 6GB model could launch by the end of January. It is worth noting that according to the Chinese website ChannelGate, the RTX 3060 6GB is also delayed, meaning that only the RTX 3060 12GB would launch in January. We will do our best to confirm this information.

 

My thoughts

Interesting that a variant of the -60 series will have more VRAM than the -80 series. I wonder if Nvidia will release a lower-end 3000 series card, such as an RTX 3050 non-Ti, or if the RTX 3060 6GB will be the lowest-end 3000 series card.

 

Sources

https://www.igorslab.de/en/die-geforce-rtx-3080-ti-kommt-spaeter-die-rtx-3060-eher-und-die-gtx-1060-3gb-bekommt-eine-nachfolgerin-2/

https://videocardz.com/newz/nvidia-geforce-rtx-3080-ti-allegedly-postponed-till-february-rtx-3060-12gb-6gb-in-january

CPU - Ryzen 5 5600X | CPU Cooler - EVGA CLC 240mm AIO  Motherboard - ASRock B550 Phantom Gaming 4 | RAM - 16GB (2x8GB) Patriot Viper Steel DDR4 3600MHz CL17 | GPU - MSI RTX 3070 Ventus 3X OC | PSU -  EVGA 600 BQ | Storage - PNY CS3030 1TB NVMe SSD | Case Cooler Master TD500 Mesh

 


Just now, WereCat said:

If NVIDIA is really about to release a 12GB 3060 then I pity anybody who bought the 3080.

Again, it won't make a difference really. 12GBs of much slower VRAM with a much narrower bus will not equal 10GBs of much faster VRAM on a much wider bus.

 

This seems to be the same reason the RX 6800 XT does not trade blows with the 3080 at 4K in most titles (from what I've heard), instead falling behind more. 
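
To put rough numbers on the "wider bus" point: peak memory bandwidth is roughly bus width times per-pin data rate. A minimal sketch in Python; the 3080 figures are NVIDIA's published spec, while the 3060 12GB line is an assumption based on the rumored 192-bit bus and typical GDDR6 speeds:

```python
# Peak theoretical memory bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbps) / 8
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

cards = {
    "RTX 3080 (10GB GDDR6X, 320-bit @ 19 Gbps)": (320, 19.0),
    "RTX 3060 12GB (assumed 192-bit GDDR6 @ 15 Gbps)": (192, 15.0),  # rumored/assumed specs
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
# -> roughly 760 GB/s vs 360 GB/s
```

Even with 2GB more capacity on paper, the smaller card would have well under half the bandwidth to actually feed the GPU.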

Current System: Ryzen 7 3700X, Noctua NH L12 Ghost S1 Edition, 32GB DDR4 @ 3200MHz, MAG B550i Gaming Edge, 1TB WD SN550 NVME, SF750, RTX 3080 Founders Edition, Louqe Ghost S1


10 minutes ago, Random_Person1234 said:

RTX 3060 12GB

imagine having more VRAM than 3080 and 3070 lmfaoo

 

owners of those cards are gonna whine

-sigh- feeling like I'm being too negative lately


11 minutes ago, Hymenopus_Coronatus said:

Again, it won't make a difference really. 12GBs of much slower VRAM with a much narrower bus will not equal 10GBs of much faster VRAM on a much wider bus.

 

This seems to be the same reason the RX 6800 XT does not trade blows with the 3080 at 4K in most titles (from what I've heard), instead falling behind more. 

I'm not talking about game performance.

 


49 minutes ago, WereCat said:

I'm not talking about game performance.

Then what are you talking about?

 

 

If all officially announced nvidia and AMD cards were available at MSRP today, plus the choice of this "3060 12GB" presumably priced somewhere below 3070, I'd still get the 3080 without hesitation as the best all round feature and performance card. This is based on my personal priority which I'd rate roughly as 80% gaming, 20% compute use cases.

 

I find it comical that many view RTX and DLSS as future technologies while they are in use today. Yet those are often the same people claiming anything with less than 16GB is already obsolete, despite about three people who aren't scalpers owning them.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


Just now, porina said:

Then what are you talking about?

 

 

If all officially announced nvidia and AMD cards were available at MSRP today, plus the choice of this "3060 12GB" presumably priced somewhere below 3070, I'd still get the 3080 without hesitation as the best all round feature and performance card. This is based on my personal priority which I'd rate roughly as 80% gaming, 20% compute use cases.

 

I find it comical that many view RTX and DLSS as future technologies while they are in use today. Yet those are often the same people claiming anything with less than 16GB is already obsolete, despite about three people who aren't scalpers owning them.

My personal use is Blender 3D rendering, where some scenes need 15+GB. I have to limit myself with my 1080 Ti because I am often hitting the 11GB VRAM limit already. Once you run out of VRAM with GPU rendering, performance falls off a cliff.

Sure, these are gaming cards and not workstation cards, but still... there is absolutely no reason why NVIDIA downgraded the VRAM amount vs. the 1000 and 2000 series on their high-end cards.

Yes, you could argue that the 3080 is actually a 2GB upgrade from the 1080/2080 and the 3090 is the replacement for the 1080 Ti/2080 Ti, but it's not, really. The price of that card is absolutely out of whack.
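
On the VRAM-limit point: if you want to watch how close a render gets to the card's limit, here's a minimal sketch that polls nvidia-smi (assumes the NVIDIA driver tools are on the PATH; the one-second interval is arbitrary):

```python
# Poll per-GPU VRAM usage via nvidia-smi; memory.used/memory.total are standard query fields.
import subprocess
import time

def vram_usage_mib():
    """Return a list of (used, total) MiB tuples, one per GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return [tuple(int(v) for v in line.split(",")) for line in out.strip().splitlines()]

if __name__ == "__main__":
    while True:
        for i, (used, total) in enumerate(vram_usage_mib()):
            print(f"GPU{i}: {used}/{total} MiB ({100 * used / total:.0f}%)")
        time.sleep(1)  # arbitrary polling interval
```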

 

Which brings up the question: why are they suddenly going to release a 12GB 3060 if 10GB is enough for the 3080? (if that's actually true)


8 minutes ago, WereCat said:

My personal use is Blender 3D rendering, where some scenes need 15+GB. I have to limit myself with my 1080 Ti because I am often hitting the 11GB VRAM limit already. Once you run out of VRAM with GPU rendering, performance falls off a cliff.

I'm not so familiar with the "creative" use cases consumer-grade GPUs have; I wonder how big that market actually is. Certainly you are not alone in mentioning this on this forum. While big companies can throw money at pro hardware, I guess it is also an area where hobbyists and smaller operators rely on consumer hardware too.

 

BTW, do you have a specific reason to look only at nvidia? Would AMD's offerings be of any advantage to you here?

 

8 minutes ago, WereCat said:

Which brings up the question: why are they suddenly going to release a 12GB 3060 if 10GB is enough for the 3080? (if that's actually true)

AMD. They started a core war with Intel; now they're starting a VRAM war with nvidia. From a past, present and near-future gaming perspective, around 8GB is still going to be fine for a great experience, as that still covers the majority of the installed base. Of course, it is in AMD's interest now for game devs to go heavy on the assets for the "best" experience, as their current cards will have some advantage there until nvidia responds.



1 hour ago, porina said:

I'm not so familiar with the "creative" use cases consumer-grade GPUs have; I wonder how big that market actually is. Certainly you are not alone in mentioning this on this forum. While big companies can throw money at pro hardware, I guess it is also an area where hobbyists and smaller operators rely on consumer hardware too.

 

BTW, do you have a specific reason to look only at nvidia? Would AMD's offerings be of any advantage to you here?

 

Lack of OpenCL support. 

Look at the LTT Blender benchmarks with AMD's new RX cards. The renders look hideous.


Definitely interested to see where the 3060 falls in terms of performance. I actually bought a used 2080 Super as an interim card for about what I think the 3060 is going to retail for, so if the 3060 doesn't perform as well as that, especially for non-gaming workloads (I'm thinking DaVinci Resolve primarily for my use case), I might just stick with it. I guess it comes down to how useful those extra 4GB of VRAM will actually be.


I play HZD on my 1660 Ti and that game eats my 6GB VRAM easily at 1080p ultra. So I appreciate the extra VRAM on a mid-range product. But 12 GBs? Isn't that essentially 4K territory? Is that card even capable of running a current AAA title at 4K?


13 hours ago, WereCat said:

If NVIDIA is really about to release a 12GB 3060 then I pity anybody who bought the 3080. That's just a spit in the face directly from NVIDIA and scalpers.

 

You missed the whole RTX 2000 / Super episode then, didn't you? Because this really should surprise no one.

 

 

20 minutes ago, AldiPrayogi said:

6GB VRAM easily at 1080p ultr

 

20 minutes ago, AldiPrayogi said:

But 12 GBs? Isn't that essentially 4K

No, by that logic you would need 24GB for "4K". But luckily display resolution only plays a minor role in VRAM usage; the larger role is texture resolution, which "surprisingly" is independent of display resolution.
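
To put rough numbers on that: the render targets themselves are only tens of megabytes at 4K, while texture assets are where the gigabytes go. A back-of-the-envelope sketch (the buffer count and texture format here are illustrative assumptions, not any particular game's budget):

```python
# Rough VRAM arithmetic: render targets vs. textures.
def buffer_mib(width, height, bytes_per_pixel=4):
    """Size of one uncompressed buffer/texture level in MiB."""
    return width * height * bytes_per_pixel / 2**20

fb_4k = buffer_mib(3840, 2160)                  # one RGBA8 render target at 4K
print(f"One 4K RGBA8 buffer: {fb_4k:.1f} MiB")  # ~31.6 MiB
print(f"Ten such targets (G-buffer, depth, post, etc.): {10 * fb_4k:.0f} MiB")

# One uncompressed 4096x4096 RGBA texture plus its mipmap chain (~4/3 overhead):
tex_mib = buffer_mib(4096, 4096) * 4 / 3
print(f"One 4K texture with mips: {tex_mib:.1f} MiB")  # ~85 MiB before compression
```

So even a pile of full-resolution 4K render targets stays well under a gigabyte; it's the texture set, which scales with quality settings rather than display resolution, that decides whether 6, 10 or 12GB matters.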

 

 

Imo... 12GB is fine as a minimum and should have been standard from the get-go, but as usual Nvidia wants to catch all the double-dippers; they'd be stupid not to...

 

 

I actually have to cancel my 3070 Step-Up order now, don't I... *sigh*

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


14 hours ago, RejZoR said:

Having more VRAM on a crappy RTX 3060 than on the multiple-levels-superior RTX 3080 is just a moronic design decision.

That's what I was thinking. I mean, by the time 12 GB of VRAM is necessary to run a game, the 3060 will likely not run the game well anyway. 8 GB would have been fine.


1 minute ago, Brooksie359 said:

That's what I was thinking. I mean, by the time 12 GB of VRAM is necessary to run a game, the 3060 will likely not run the game well anyway. 8 GB would have been fine.

Whereas the RTX 3080 has enough grunt to utilize 12GB even today. Hell, 16GB would actually be the sweet spot for a card with so much grunt.


1 minute ago, RejZoR said:

Whereas the RTX 3080 has enough grunt to utilize 12GB even today. Hell, 16GB would actually be the sweet spot for a card with so much grunt.

I think 10 GB is enough for today but likely won't be a few years out. It's why I have no intention of buying a 3080 even if the stock issue is resolved.


Just now, Brooksie359 said:

I think 10 GB is enough for today but likely won't be a few years out. It's why I have no intention of buying a 3080 even if the stock issue is resolved.

I'm still unsure. In a way 10GB should be enough, because I plan on sticking with 1440p at most, where 10GB should suffice. Then again, the RX 6800 XT is similarly fast and has 16GB. As for ray tracing, I think I'm just going to stick with ReShade RTGI, so it ultimately almost doesn't matter if Radeon doesn't have ray tracing supported in games, since I can just bolt it onto any game myself and get amazing results. Ray-traced global illumination is pretty sweet, and the SSAO that comes with it adds such amazing depth to scenes that I almost don't care about RTX. Especially because that applies to nearly all games in my backlog and not a select few like with RTX...


Wait, now they are naming improved versions of their cards RTX 30XX Ti again? I'm confused.



14 minutes ago, Drama Lama said:

Wait, now they are naming improved versions of their cards RTX 30XX Ti again?

Well, in the case of the 3060 the better version is the non-Ti now... 🤔

 

 

14 minutes ago, Drama Lama said:

I'm confused

All according to leather jacket's evil plan! :)



I haven't been able to get a 3070 still and, at this point, I think I'll just wait for the 3070 Ti with more VRAM to land. Would've been nice to play Cyberpunk 2077 with ray-tracing enabled; such a shame 😢

Hand, n. A singular instrument worn at the end of the human arm and commonly thrust into somebody’s pocket.


2 hours ago, WereCatf said:

I haven't been able to get a 3070 still and, at this point, I think I'll just wait for the 3070 Ti with more VRAM to land. Would've been nice to play Cyberpunk 2077 with ray-tracing enabled; such a shame 😢

There are a bunch of them available in my country. At inflated prices, but available. You basically can't get RTX 3080s, except at super inflated prices. We're talking 1200€.


It's going to be fun to see people reacting to an XX60 card having more VRAM than their "flagship" card.


21 hours ago, Random_Person1234 said:

 

My thoughts

Interesting that a variant of the -60 series will have more VRAM than the -80 series. I wonder if Nvidia will release a lower-end 3000 series card, such as an RTX 3050 non-Ti, or if the RTX 3060 6GB will be the lowest-end 3000 series card.

 

 

I have a feeling that the lower-memory cards are targeted at 1080p gaming, while the higher-memory cards are targeted at 4K gaming. If all you're running is 1080p, then it logically makes sense that a 3060 6GB will suffice. It is, however, insufficient for neural net training and likewise 4K rendering.

 


I will likely buy one. The current cards aren't really an upgrade for me (on a cost-benefit basis) due to their lack of VRAM; there's no reason to pay an insane amount of money for the same amount of VRAM (3060 Ti/3070) or just 2GB extra (3080) compared to my current 2060 Super. A 3060 12GB would be way more useful for me whilst being cheaper.

 

A 16GB 3070 would be even more interesting, but I doubt prices will be reasonable enough for those until 2022.

 

22 hours ago, RejZoR said:

Having more VRAM on a crappy RTX 3060 than on the multiple-levels-superior RTX 3080 is just a moronic design decision.

While I agree to some point, we need to remember that the 3060 has regular GDDR6 memory while also being on a way smaller bus (much less bandwidth).

For nvidia to have more memory on the 3080, they have 3 options:

- Doubling the amount of vram chips/using 2GiB chips instead of 1GiB: this gets the GPU way too close to the 3090 memory-wise, and it's what we're going to see with the 3080ti, with almost no gains performance-wise (compared to the regular 3080) since the extra cores will starve for data due to the smaller bus (when compared to the 3090).

- Using a different die cut of the GA102: using a cut similar to the 3090 with the same bus (384 instead of the current 320bits) but half of the memory would make the 3080 really close to the 3090 in raw performance, since those GPUs are really being starved for data even with GDDR6X.

- Using an even smaller bus size: 9/18 or 11/22GiB models would be possible with a bus size of 288/352bits, but the former would leave the GPU starved for data and tank its performance to 3070-levels, while the latter would get it even closer to a 3090.

 

tl;dr: nvidia's current segmentation is whack; either the GPUs are way too close in performance or there's a huge gap between them, and it's almost impossible to have a sensible stack.

 

EDIT: oh, I forgot to add, 16Gbit (2GiB) GDDR6X modules aren't available yet, so the only way to add more vram is to double the amount of chips on the PCB.
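
For anyone who wants the arithmetic behind those options: each GDDR6/GDDR6X chip hangs off a 32-bit slice of the bus, so capacity is chip count times chip density, doubled if you mount chips on both sides of the board (clamshell, as on the 3090). A small sketch, assuming 1GiB chips since 2GiB GDDR6X modules aren't available:

```python
# Capacity options implied by bus width, assuming 32-bit-wide, 1 GiB GDDR6/GDDR6X chips.
def vram_config(bus_width_bits, chip_gib=1, clamshell=False):
    chips = bus_width_bits // 32 * (2 if clamshell else 1)  # one chip per 32-bit channel
    return chips, chips * chip_gib

for bus in (384, 352, 320, 288, 192):
    chips, cap = vram_config(bus)
    _, cap_clam = vram_config(bus, clamshell=True)
    print(f"{bus}-bit: {cap} GiB ({chips} chips) single-sided, {cap_clam} GiB clamshell")
# e.g. 320-bit -> 10 or 20 GiB (3080 / rumored 3080 Ti), 384-bit -> 12 or 24 GiB (3090),
# 192-bit -> 6 GiB, or 12 GiB with the chip count or density doubled.
```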

Edited by igormp

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga
