Yeah it's SUPER, super out of stock that is - new RTX card rumors.

williamcll


Nvidia might be repeating its previous-generation lineup design: a leaker recently claimed that SUPER versions of RTX cards may come out later on. How different these would be from their Ti counterparts is yet to be known.

Quote

The new information from Kopite claims that NVIDIA may already be considering refreshing its 3070 and 3080 SKUs. While the leaker is not sure about the naming, it is important to note that the company has already canceled many SKUs that had appeared on the roadmaps present in the documents for many weeks. We have seen multiple SKUs being canceled shortly after the Radeon RX 6000 series announcement. For this reason, SKUs such as the RTX 3080 20GB are no longer expected; instead, NVIDIA is now planning a 3080 Ti model. The RTX 3070 SUPER had already appeared in the rumors.

 

It is unclear how different this SKU would be from the (also rumored) RTX 3070 Ti. Kopite has been advocating that the latter might feature a new GA103 GPU that has not yet appeared anywhere else but in his tweets. Nor have we been able to confirm such a GPU yet. Assuming that Kopite’s information is correct, both the 3070 Ti and 3070 SUPER could feature a GA103 processor, which would act as a gap filler between the GA104 and GA102 GPUs. Meanwhile, the specifications of the rumored 3080 SUPER are not known. The graphics card could feature more memory than the original 3080; however, it is unclear if the configuration would be 12GB or something to match the Radeon RX 6800 series.

Source: https://videocardz.com/newz/nvidia-rumored-to-introduce-geforce-rtx-3080-super-and-rtx-3070-super-series
Thoughts: Considering the SUPER line of RTX cards did sell reasonably well last time, I think this will too. Though now that Radeon cards have performance that contests RTX cards at varying price levels, it would probably face resistance. Furthermore, what even could be the difference between a 3080S and a 3080Ti?

Specs: Motherboard: Asus X470-PLUS TUF Gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200MHz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctua Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition


I'll see them when I see them. The rumours just keep alternating between Supers and Tis at this point, it feels.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


2 minutes ago, williamcll said:

difference between a 3080S and a 3080Ti?

I would assume about the same difference as between a 3060 Ti and a 3070.
 


Super cards seem to happen when AMD out-adjusts Nvidia. If SUPER cards happen, it will be because the Ti cards failed to beat AMD’s offerings in price/performance. If the SUPER cards fail too, we will see KO cards. If the KO cards fail, something else will no doubt appear. Nvidia changes prices by changing models. A regular to a Ti to a SUPER to a KO is really nothing but an obfuscated Nvidia price change. Sure, they’ll flop around what chip with what parts disabled running at what MHz, but fps/dollar is fps/dollar.
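The "fps/dollar is fps/dollar" point can be sketched with some throwaway numbers. The cards, prices, and fps figures below are entirely hypothetical, just to show that renaming a SKU doesn't change the ratio:

```python
# Hypothetical fps and prices, purely for illustration.
# Whatever the suffix (Ti, SUPER, KO), only this ratio matters to the buyer.
cards = {
    "3080":       {"price": 699, "fps": 100},
    "3080 Ti":    {"price": 999, "fps": 110},
    "3080 SUPER": {"price": 799, "fps": 105},
}

def fps_per_dollar(card):
    """Normalize performance by price: the metric that survives rebranding."""
    return card["fps"] / card["price"]

# Rank the (made-up) lineup by value rather than by model name.
for name, card in sorted(cards.items(), key=lambda kv: -fps_per_dollar(kv[1])):
    print(f"{name:12s} {fps_per_dollar(card):.3f} fps/$")
```

With these invented numbers the plain 3080 comes out on top, which is exactly the kind of comparison a model-name shuffle obscures.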

Edited by Bombastinator

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


36 minutes ago, Bombastinator said:

Super cards seem to happen when AMD out-adjusts Nvidia. If SUPER cards happen, it will be because the Ti cards failed to beat AMD’s offerings in price/performance. If the SUPER cards fail too, we will see KO cards. If the KO cards fail, something else will no doubt appear. Nvidia changes prices by changing models. A regular to a Ti to a SUPER to a KO is really nothing but an obfuscated Nvidia price change. Sure, they’ll flop around what chip with what parts disabled running at what MHz, but fps/dollar is fps/dollar.

It is difficult for ANY manufacturer to significantly change price of a product after it is made available. The easy fix for that is to make a new different product. It might not be that much different, but just has to be different enough.

 

We've only had one generation of SUPER cards to go by, so it isn't exactly a huge amount to go on. A mid-cycle refresh is also not unusual, whatever it is called. It's still a bit early for one, but preparing for it now isn't out of the question. Remember the high-VRAM Nvidia card rumours were around even before the current ones were launched. It's just a matter of when. BTW, isn't KO the name used by one AIB? It's not an Nvidia term as far as I'm aware.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


9 minutes ago, porina said:

BTW isn't KO the name used by one AIB? It's not an nvidia term as far as I'm aware.

It was used solely by EVGA last gen, but long ago it was a name used by multiple AIBs, if I'm not wrong.

 

Edit: I can only find the EVGA 9800 KO, so it might be an EVGA thing.

-sigh- feeling like I'm being too negative lately


3080 Super: Now with 2GB of GDDR6X!

"Don't fall down the hole!" ~James, 2022

 

"If you have a monitor, look at that monitor with your eyeballs." ~ Jake, 2022


1 minute ago, Sarra said:

3080 Super: Now with 2GB of GDDR6X!

No, that's the RT 3000.

It's got 10 CUDA cores, 1 RT core, 1 tensor core (yep it supports DLSS 2.0), 1 HDMI 2.1 port, 1 VGA port, and 2GB of GDDR6X.

The RT 3001 is the same, except 15 CUDA cores, only a VGA port, and 42GB of HBM2 VRAM.

elephants


It's like Nvidia is desperately trying to beat AMD because they fear this gen will be one of the last of their dominance: once AMD starts to produce MCM GPUs, they will lose.


6 minutes ago, DoTr said:

It's like Nvidia is desperately trying to beat AMD because they fear this gen will be one of the last of their dominance: once AMD starts to produce MCM GPUs, they will lose.

NVidia screwed up by dropping their memory capacities.

 

Then they realized that, said 'Hold my beer!', and pissed the enthusiast community off (Hardware Unboxed).

 

Now, some people (ahem, myself included) refuse to touch their products, just out of disgust at what they did.

 

If they lose the performance crown, I won't cry. Hell, in rasterization, they've already either been matched, or outpaced, and I'm not sad at all about that.

"Don't fall down the hole!" ~James, 2022

 

"If you have a monitor, look at that monitor with your eyeballs." ~ Jake, 2022


1 minute ago, Sarra said:

NVidia screwed up by dropping their memory capacities.

 

Then they realized that, said 'Hold my beer!', and pissed the enthusiast community off (Hardware Unboxed).

 

Now, some people (ahem, myself included) refuse to touch their products, just out of disgust at what they did.

 

If they lose the performance crown, I won't cry. Hell, in rasterization, they've already either been matched, or outpaced, and I'm not sad at all about that.

I wish ROCm or some open-source standard would beat Nvidia's CUDA and become the industry standard, as CUDA is the only reason a lot of people use their cards/accelerators.


14 minutes ago, DoTr said:

I wish ROCm or some open-source standard would beat Nvidia's CUDA and become the industry standard, as CUDA is the only reason a lot of people use their cards/accelerators.

I’m in the market for a card right now actually. $500 is the limit, so I’m either looking at a 3070 at $499 that is listed as Coming Soon, or hopefully AMD launches something competitive at a similar price point (RX 6700?). Honestly, whatever I can get ahold of first gets my money. Blender isn’t super picky anymore.

My eyes see the past…

My camera lens sees the present…


47 minutes ago, porina said:

It is difficult for ANY manufacturer to significantly change price of a product after it is made available. The easy fix for that is to make a new different product. It might not be that much different, but just has to be different enough.

 

We've only had one generation of SUPER cards to go by, so it isn't exactly a huge amount to go on. A mid-cycle refresh is also not unusual, whatever it is called. It's still a bit early for one, but preparing for it now isn't out of the question. Remember the high-VRAM Nvidia card rumours were around even before the current ones were launched. It's just a matter of when. BTW, isn't KO the name used by one AIB? It's not an Nvidia term as far as I'm aware.

This is often seen in mattress and appliance sales. A given chain has the “best price” on a given product, but only because that SKU is sold only by that chain. There may be otherwise basically identical products sold in other stores, but because they have a different SKU they are technically different products. Video cards have more sliders to mess with than the SKU number, but the effect is the same.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 minute ago, Zodiark1593 said:

I’m in the market for a card right now actually. $500 is the limit, so I’m either looking at a 3070 at $499 that is listed as Coming Soon, or hopefully AMD launches something competitive at a similar price point (RX 6700?). Honestly, whatever I can get ahold of first gets my money. Blender isn’t super picky anymore.

Yeah, but for DL and a lot of other compute tasks, CUDA is still the standard, as it works better than anything else.


1 minute ago, DoTr said:

Yeah, but for DL and a lot of other compute tasks, CUDA is still the standard, as it works better than anything else.

Which doesn’t matter if all he wants it for is Blender. As far as “working better than anything else” goes, if you are talking about it being more commonly supported in some types of apps, I would say you have a point.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


3 minutes ago, DoTr said:

Yeah, but for DL and a lot of other compute task, CUDA is still the standard as it works better than anything else. 

Just looking at my own use case right now. I concede a lot of commercial renderers such as Octane and RenderMan currently only use Nvidia cards. I intend to stick with Eevee and Cycles, neither of which much cares about what card is in there.
 

Eevee is a rasterizer that uses OpenGL 3.3 for rendering (pretty much an extremely high quality real-time engine with some ray trace features), so any fast card with lots of VRAM will do, and Cycles (for the past couple years) supports both OpenCL and CUDA equally well. 

My eyes see the past…

My camera lens sees the present…


1 hour ago, DoTr said:

as they fear this gen will be one of the last gen of their dominance, as AMD starts to produce MCM GPUs, they will lose

Nvidia also has MCM GPUs coming (Hopper). We don't know which will be released first. I think they are creating these additional SKUs to prevent AMD from creating an RX 6800 situation where there is no comparison. This would mean that whatever AMD releases, Nvidia would already have a response at almost every price point (at $50 increments, like last gen).

Current System: Ryzen 7 3700X, Noctua NH L12 Ghost S1 Edition, 32GB DDR4 @ 3200MHz, MAG B550i Gaming Edge, 1TB WD SN550 NVME, SF750, RTX 3080 Founders Edition, Louqe Ghost S1


44 minutes ago, Zodiark1593 said:

Just looking at my own use case right now. I concede a lot of commercial renderers such as Octane and RenderMan currently only use Nvidia cards. I intend to stick with Eevee and Cycles, neither of which much cares about what card is in there.
 

Eevee is a rasterizer that uses OpenGL 3.3 for rendering (pretty much an extremely high quality real-time engine with some ray trace features), so any fast card with lots of VRAM will do, and Cycles (for the past couple years) supports both OpenCL and CUDA equally well. 

If you're interested in using Cycles, then its support for OptiX on Nvidia cards may interest you - allowing you to use the RT cores for acceleration. There are a few features missing from this though - OptiX support is still classed as 'experimental'.

 

Currently there isn't any support for AMD hardware raytracing in Cycles, but it's on the to-do list. That being said, given the superior RT performance of the Nvidia cards anyway, I personally would choose team green if hardware-accelerated raytracing was something I was interested in.

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


Super variants already?...

Well, considering I still haven't managed to get a 3060 Ti, much less a 3070, to finally replace my 7970... If it's true and it comes out soon, I may as well just continue waiting... not like I have much choice.

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


9 hours ago, TetraSky said:

Super variants already?...

Well, considering I still haven't managed to get a 3060 Ti, much less a 3070, to finally replace my 7970... If it's true and it comes out soon, I may as well just continue waiting... not like I have much choice.

Replacing that HD 7970 with something like a GTX 1080 Ti would already provide like 4x the performance... Going with an RTX 3080 or RX 6800 XT, for that matter, would make it like 6x...


23 hours ago, tim0901 said:

If you're interested in using Cycles, then its support for OptiX on Nvidia cards may interest you - allowing you to use the RT cores for acceleration. There are a few features missing from this though - OptiX support is still classed as 'experimental'.

 

Currently there isn't any support for AMD hardware raytracing in Cycles, but it's on the to-do list. That being said, given the superior RT performance of the Nvidia cards anyway, I personally would choose team green if hardware-accelerated raytracing was something I was interested in.

Might not even matter much due to lack of availability. I’m tempted just to toss out the idea until next gen tbh. :/

 

My eyes see the past…

My camera lens sees the present…


On 1/3/2021 at 9:57 PM, Sarra said:

3080 Super: Now with 2GB of GDDR6X!

That reminds me, I was wondering: couldn't we just solder more RAM onto these cards? It's honestly the only thing I regret about already buying a 3070... it's already being maxed out playing 2-year-old games... how will this be in the future... it can only get worse...

 

 

I mean, the modules would need to be the same size with double the capacity... is that possible, does that exist? (I'd definitely do it if it's just a simple solder job...)

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


50 minutes ago, Mark Kaine said:

That reminds me, I was wondering: couldn't we just solder more RAM onto these cards? It's honestly the only thing I regret about already buying a 3070... it's already being maxed out playing 2-year-old games... how will this be in the future... it can only get worse...

 

 

I mean, the modules would need to be the same size with double the capacity... is that possible, does that exist? (I'd definitely do it if it's just a simple solder job...)

I doubt it. The actual signalling wouldn't be there, and the controller on the die would be expecting specific capacity chips, if it saw double that capacity, you would either have half the bandwidth, half the capacity, or it would flat out reject the chips, and you'd basically have a card with no memory.

"Don't fall down the hole!" ~James, 2022

 

"If you have a monitor, look at that monitor with your eyeballs." ~ Jake, 2022


6 hours ago, Sarra said:

I doubt it. The actual signalling wouldn't be there, and the controller on the die would be expecting specific capacity chips, if it saw double that capacity, you would either have half the bandwidth, half the capacity, or it would flat out reject the chips, and you'd basically have a card with no memory.

Is the 3070 the 104 or the 102 die? We know the 102 die certainly supports the higher-density chips (3090 vs 3080), and I'd be surprised if the lower ones didn't also. Look at it like desktop RAM: you can put two modules on a channel instead of one. Max bandwidth is the same, although there are optimisations that often lead to two modules working better than one. Logically this can happen within a single higher-capacity chip also (like dual-rank vs single-rank modules).
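The bandwidth-vs-capacity point above is easy to check with back-of-the-envelope numbers. The figures below assume a 3070-like setup (eight 32-bit GDDR6 chips on a 256-bit bus at 14 Gbps per pin); treat them as illustrative rather than official specs:

```python
# Back-of-the-envelope GDDR math, assuming a 3070-like configuration:
# 8 chips x 32 bits each = 256-bit bus, 14 Gbps per pin.
BUS_WIDTH_BITS = 256
DATA_RATE_GBPS = 14  # per pin

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak bandwidth depends only on bus width and data rate, not chip density."""
    return bus_width_bits * data_rate_gbps / 8  # bits -> bytes

def capacity_gb(num_chips, chip_density_gb):
    """Total VRAM scales with per-chip density; the bus layout stays the same."""
    return num_chips * chip_density_gb

# 8x 1GB chips vs 8x 2GB chips: capacity doubles, peak bandwidth is unchanged.
print(bandwidth_gb_s(BUS_WIDTH_BITS, DATA_RATE_GBPS))  # 448.0 GB/s either way
print(capacity_gb(8, 1), "GB vs", capacity_gb(8, 2), "GB")
```

So swapping in denser chips (where the controller supports them) changes capacity without touching peak bandwidth, much like fitting a bigger DIMM on the same memory channel.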

 

 

 

Elsewhere:

https://www.igorslab.de/en/nvidia-geforce-rtx-3060-ultra-12-gb-schneller-als-eine-rtx-3060-ti-asus-tuf-gaming-geleakt-2/

What year is it? The "Ultra" name might be coming back with this rumoured 3060 Ultra 12GB. For those who aren't as ancient as me, Ultra used to be Nvidia's top model GPU, above GT.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

