
Nvidia RTX 40 Super Series Review

AdamFromLTT

Nvidia has released 3 new GPUs this month: the 4070 Super, the 4070 Ti Super, and the 4080 Super. But are they worth the Super name? Do they bring AMD to shame? How well do they game?

 

 


Meh

Gaming With a 4:3 CRT

System specs below

 

CPU: AMD Ryzen 7 5700X with a Noctua NH-U9S cooler 
Motherboard: Gigabyte B450 Aorus M (Because it was cheap)
RAM: 32GB (4 x 8GB) Corsair Vengeance LPX 3200MHz CL16
GPU: EVGA GTX 980 Ti SC Blower Card
HDD: 7200RPM TOSHIBA DT01ACA100 1TB, External HDD: 5400RPM 2TB WD My Passport
SSD: 1TB Samsung 970 EVO M.2 NVMe
PSU: Corsair CX650M
Displays: ViewSonic VA2012WB LCD 1680x1050 @ 75Hz
Gateway VX920 CRT: 1920x1440@65Hz, 1600x1200@75Hz, 1200x900@100Hz, 960x720@125Hz
Gateway VX900 CRT: 1920x1440@64Hz, 1600x1200@75Hz, 1200x900@100Hz, 960x720@120Hz (Can be pushed to 175Hz)
 
Keyboard: Thermaltake eSPORTS MEKA PRO with Cherry MX Red switches

I believe there is a little problem here...

How can the RTX 4070 Ti SUPER, which does not even touch 70°C, have a 79°C max temp? And how is the average 77?

 

image.thumb.png.ecf9f5291bd6e15eeee49a1cd84dd98b.png


1 hour ago, AdamFromLTT said:

Nvidia has released 3 new GPUs this month: the 4070 Super, the 4070 Ti Super, and the 4080 Super. But are they worth the Super name? Do they bring AMD to shame? How well do they game?

 

 

Why no XTX?

80+ ratings certify electrical efficiency. Not quality.


At 1:06, the axis label says "Frames per $0.01" where it should say "Frames per $100".
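To put numbers on why that label can't be right, here's a quick Python sketch; the $999 MSRP is the 4080 Super's list price, and the 200 FPS figure is a made-up example rather than anything from the chart:

msrp = 999        # USD, 4080 Super list price
avg_fps = 200     # hypothetical average frame rate, NOT a measured result

fps_per_dollar = avg_fps / msrp                 # ~0.20 FPS per dollar
fps_per_100_dollars = 100 * fps_per_dollar      # ~20 "frames per $100"
fps_per_cent = 0.01 * fps_per_dollar            # ~0.002 "frames per $0.01"

print(f"{fps_per_100_dollars:.1f} FPS per $100, {fps_per_cent:.3f} FPS per $0.01")

The ~20 values on the chart only make sense with the $100 denominator; an actual "frames per $0.01" axis would be sitting at around 0.002.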

Link to comment
Share on other sites

Link to post
Share on other sites

The 7900 XTX does feel like a rather big omission, but at the same time it feels like a pretty cut-and-dried conclusion.

 

The 7900XTX performs practically identically to the 4080 in rasterization, therefore it performs practically identically to the 4080 Super. It's the same price as the 4080 Super as well, at least here in the UK, but you get worse RT performance and no DLSS.

 

As such, I see no reason to buy the 7900XTX unless you can get one substantially cheaper than the 4080 Super. Maybe that's possible where you are, but here that's not really the case.

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


55 minutes ago, tim0901 said:

The 7900 XTX does feel like a rather big omission, but at the same time it feels like a pretty cut-and-dried conclusion.

 

The 7900XTX performs practically identically to the 4080 in rasterization, therefore it performs practically identically to the 4080 Super. It's the same price as the 4080 Super as well, at least here in the UK, but you get worse RT performance and no DLSS.

 

As such, I see no reason to buy the 7900XTX unless you can get one substantially cheaper than the 4080 Super. Maybe that's possible where you are, but here that's not really the case.

It seems like they could have just cut the price of the regular 4080 and not bothered with the 4080 Super at all...

 

As it stands, the Super moniker implies a superiority it doesn't really deliver, but it's priced lower than the card it's supposed to surpass.

 

 

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | PowerColor Red Devil RX 6900 XT | 4x8GB Crucial Ballistix 3600mt/s CL16


At 6:57, did Linus really just say he will build something that could help level the AI playing field for AMD?

 

What does it even mean? If it's not the case that LTT can help AMD have better AI performance in general, then why?

 

If AMD still cannot beat Nvidia in those tasks, and the team somehow "makes the benchmark fair", AMD might still not perform in the real world the way the benchmark results indicate.

 

What a benchmark should do is help people predict how the product will perform in THEIR workflow, not in something the LABS comes up with.

 

A benchmark does not need to be fair; it just needs to be accurate.

 

How about talking to the engineers from Nvidia or AMD about designing such a benchmark for "AI"?


What's up with the 7900 XT temps in the vid?

Just ran my own for 11 min; on default settings it's 62 stable.

 

Any specific settings used? And did the Nvidia cards run on equivalents then?

 

Sorry for the crude screenshot.

unknown_2024.02.02-00.46_1.png


What does this graph mean?

 

20 frames per 1 cent is 2,000 frames per dollar, or 2 million frames for $999 - the price of the 4080 Super.

 

What does this measure exactly? Will the card die after providing 2 million frames or what?

Screenshot_2024-02-02-01-46-45-335_com.google.android.youtube.jpg

Ryzen 9 5900X, 32 GB of 3200 RAM, Radeon RX 6800XT from VAG


Best thing about this review is that the test system they're using is literally the *exact* system I have lol... Gave up on my 4080/XTX dreams; wonder if a 4070 Ti Super or a 7900 XT would be better as an upgrade from a 3070?

PC: AMD 7800X3D, Gigabyte B650 Gaming AX, Noctua D15, G.SKILL Trident Z5 Neo 16GBx2 6000MHz CL30, EVGA RTX 3070 FTW3, NZXT H510i, Samsung 980 Pro 2TB, Samsung Pro 970 1TB, Samsung Pro 970 500GB


4 hours ago, nideii said:

At 6:57, did Linus really just say he will build something that could help level the AI playing field for AMD?

 

What does it even mean? If it's not the case that LTT can help AMD have better AI performance in general, then why?

 

If AMD still cannot beat Nvidia in those tasks, and the team somehow "makes the benchmark fair", AMD might still not perform in the real world the way the benchmark results indicate.

 

What a benchmark should do is help people predict how the product will perform in THEIR workflow, not in something the LABS comes up with.

 

A benchmark does not need to be fair; it just needs to be accurate.

 

How about talking to the engineers from Nvidia or AMD about designing such a benchmark for "AI"?

I took it to mean that they want to develop/adopt testing for AI that isn't specifically accelerated by CUDA. Having Nvidia win every benchmark due to CUDA acceleration is not very useful if you are buying cards for an AI application that is not accelerated by CUDA.


Another review, another review where they forgot to post the base specs. If you're not going to dedicate 15-30 seconds of video to saying the base specs, please, for the love of gawd, add a line in the graphs with the base specs of the bench(es).

 

I know that there is that 7800X3D video where you guys try to find the most identical 7800X3D out of a bunch of them, and you might have used those benches for this. However, it has to be repeated every time, on every review. Like, seriously.

 

LABS is all well and good, but if you can't get that right, it'll be kind of hard to trust LABS... because, right now, you can say "What are the base specs for the bench? A Pentium 4? My ass?"


6 hours ago, Anfros said:

I took it to mean that they want to develop/adopt testing for AI that isn't specifically accelerated by CUDA. Having Nvidia win every benchmark due to CUDA acceleration is not very useful if you are buying cards for an AI application that is not accelerated by CUDA.

First of all, what I mean by "AI" is GPGPU and hardware-accelerated tensor operations.

 

As far as I know, there are very few use cases that don't support CUDA and only support some other technology.

 

Major deep learning frameworks like Caffe, TensorFlow, and PyTorch all have CUDA and Tensor Core support. Common AI use cases, like LLMs and text-to-image/Stable Diffusion, have CUDA or even Tensor Core support.

 

Here are the cases I can think of:

1. OpenCL applications could have better performance on AMD, but normal gaming GPU support is a mess.

2. Compute shaders might be a thing, but they're usually used by games during post-processing, and they don't matter that much.

3. CPU-only applications.

 

Sounds like a lot, but if we really think about it, none of them are things that should be done on a gaming GPU.

You cannot put a 35W TDP cooler on a 125W TDP CPU and call it a bad cooler.

 

I would say, for "AI", test CUDA/Tensor Core performance as usual, use ROCm (or even PyTorch with Vulkan) for AMD, and include Apple M-series chips as well. Those kinds of numbers translate easily to real-world applications and can be compared against each other.
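For what it's worth, that kind of vendor-neutral number isn't hard to collect: ROCm builds of PyTorch expose AMD GPUs through the same "cuda" device string, and Apple M-series chips are reachable via "mps". A minimal sketch of the kind of thing I mean (raw fp32 matmul throughput only, so a toy, not a full AI benchmark):

import time
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")   # NVIDIA CUDA and AMD ROCm builds both land here
elif torch.backends.mps.is_available():
    device = torch.device("mps")    # Apple M-series
else:
    device = torch.device("cpu")

def sync():
    # Wait for asynchronous GPU work to finish before reading the clock.
    if device.type == "cuda":
        torch.cuda.synchronize()
    elif device.type == "mps":
        torch.mps.synchronize()

n, iters = 4096, 20
a = torch.randn(n, n, device=device)
b = torch.randn(n, n, device=device)

for _ in range(3):  # warm-up passes so we don't time one-off setup costs
    a @ b
sync()

start = time.perf_counter()
for _ in range(iters):
    a @ b
sync()
elapsed = time.perf_counter() - start

# One n x n matmul costs roughly 2*n^3 floating-point operations.
print(f"{device.type}: ~{2 * n**3 * iters / elapsed / 1e12:.1f} TFLOP/s fp32")

Tensor Core / matrix-engine paths would need fp16 or bf16 inputs on top of this, but even a toy like this sidesteps the "CUDA-only tool" problem entirely.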

 


How can you forget the 7900 XTX, which is the direct competition to the 4080 and 4080 Super? That is just bad...

You have so many people watching your reviews and you left out the most important GPU. NVIDIA must be very happy right now...

 

  

17 hours ago, tim0901 said:

The 7900 XTX does feel like a rather big omission, but at the same time it feels like a pretty cut-and-dried conclusion.

 

The 7900XTX performs practically identically to the 4080 in rasterization, therefore it performs practically identically to the 4080 Super. It's the same price as the 4080 Super as well, at least here in the UK, but you get worse RT performance and no DLSS.

 

As such, I see no reason to buy the 7900XTX unless you can get one substantially cheaper than the 4080 Super. Maybe that's possible where you are, but here that's not really the case.

image.thumb.jpeg.a3bf5b7dfe0ff8ba58b98daef7716858.jpeg

 


The omission of the 7900 XTX feels a bit sus, tbh.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


This is the second time in recent memory that LTT has bungled a major product release. While the Supers weren't all that great, this is still a GPU release. LTT is many things, but we all started watching them as a tech channel first and foremost. Whole-room water cooling, server management, etc. are all fun videos, but this is a tech review channel. If you can't get that right, why are you posting videos?

 

The omission of the XTX from the other benchmarks, only to include it in the last slide like you're slipping it in... well, we noticed. The 7600 XT video omitting any comparably performant cards, showing only the 4060 Ti but not a 6700 XT, or even the 4060 for that matter, was similarly weird. Whoever is writing the GPU videos needs to have a sit-down and realize that they screwed up here. No amount of labs data can obscure the fact that this video had several data representation typos and is missing the only GPU that would actually matter in this release: the direct competitor of the most anticipated card in this new lineup.


16 hours ago, Ruslanets said:

What does this graph mean?

 

20 frames per 1 cent is 2,000 frames per dollar, or 2 million frames for $999 - the price of the 4080 Super.

 

What does this measure exactly? Will the card die after providing 2 million frames or what?

Screenshot_2024-02-02-01-46-45-335_com.google.android.youtube.jpg

It means the 4080 Super is shit value.

Gaming With a 4:3 CRT

System specs below

 

CPU: AMD Ryzen 7 5700X with a Noctua NH-U9S cooler 
Motherboard: Gigabyte B450 Aorus M (Because it was cheap)
RAM: 32GB (4 x 8GB) Corsair Vengeance LPX 3200MHz CL16
GPU: EVGA GTX 980 Ti SC Blower Card
HDD: 7200RPM TOSHIBA DT01ACA100 1TB, External HDD: 5400RPM 2TB WD My Passport
SSD: 1TB Samsung 970 EVO M.2 NVMe
PSU: Corsair CX650M
Displays: ViewSonic VA2012WB LCD 1680x1050 @ 75Hz
Gateway VX920 CRT: 1920x1440@65Hz, 1600x1200@75Hz, 1200x900@100Hz, 960x720@125Hz
Gateway VX900 CRT: 1920x1440@64Hz, 1600x1200@75Hz, 1200x900@100Hz, 960x720@120Hz (Can be pushed to 175Hz)
 
Keyboard: Thermaltake eSPORTS MEKA PRO with Cherry MX Red switches

36 minutes ago, At0micMuff1n said:

This is the second time in recent memory that LTT has bungled a major product release. While the Supers weren't all that great, this is still a GPU release. LTT is many things, but we all started watching them as a tech channel first and foremost. Whole-room water cooling, server management, etc. are all fun videos, but this is a tech review channel. If you can't get that right, why are you posting videos?

 

The omission of the XTX from the other benchmarks, only to include it in the last slide like you're slipping it in... well, we noticed. The 7600 XT video omitting any comparably performant cards, showing only the 4060 Ti but not a 6700 XT, or even the 4060 for that matter, was similarly weird. Whoever is writing the GPU videos needs to have a sit-down and realize that they screwed up here. No amount of labs data can obscure the fact that this video had several data representation typos and is missing the only GPU that would actually matter in this release: the direct competitor of the most anticipated card in this new lineup.

I honestly haven't paid too much attention to LTT's specific hardware release videos, ever, as someone who's watched their channel for at least 12 years. There have almost always been better sources of raw benchmarks and product reviews *cough GN cough*, and I can see LTT realizing that it's not worth attempting to compete with GN's or HWUB's day-one reviews.

 

Regarding the lack of the 7900 XTX: not including one of the product's direct competitors, one that will generally outperform it for the same or lower price, almost screams "Nvidia told us not to when we wanted to do a single release video", which they talked about on the WAN Show as being a controversial choice.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


2 minutes ago, MadAnt250 said:

It means the 4080 Super is still shit value.

It's a comparable bin to the 3070 Ti from last generation; at least they're only upselling it by 66% and not 100% like the rest. The original RTX 4080 at $1200 was particularly bad at 2.4x the price of its comparable RTX 3000 part. No amount of inflation or fab cost could cause that much of a jump.

image.png.3ff33258d4ac20131fa83f4b742ae471.png

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


1 hour ago, Agall said:

It's a comparable bin to the 3070 Ti from last generation; at least they're only upselling it by 66% and not 100% like the rest. The original RTX 4080 at $1200 was particularly bad at 2.4x the price of its comparable RTX 3000 part. No amount of inflation or fab cost could cause that much of a jump.

image.png.3ff33258d4ac20131fa83f4b742ae471.png

Yup, no amount of inflation could cause a jump that big. In my opinion the 4080 should be a $750 base-MSRP GPU, but we all know this problem started back when the RTX 2000 series came out a chunk more expensive than the GTX 1000 series cards it replaced.
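That jump is easy to put in numbers. A quick Python sketch using the US launch MSRPs of the x80-class cards as I remember them (treat these as approximate and double-check before quoting; Founders Edition cards sometimes listed higher):

# Approximate US launch MSRPs, from memory: GTX 1080 $599, RTX 2080 $699,
# RTX 3080 $699, RTX 4080 $1199, RTX 4080 Super $999.
msrp = {
    "GTX 1080": 599,
    "RTX 2080": 699,
    "RTX 3080": 699,
    "RTX 4080": 1199,
    "RTX 4080 Super": 999,
}

cards = list(msrp)
for prev, curr in zip(cards, cards[1:]):
    # Generation-over-generation price ratio for the x80 tier.
    print(f"{prev} -> {curr}: {msrp[curr] / msrp[prev]:.2f}x")

Every step before Ada stays within about 1.2x; the 3080 -> 4080 step is roughly 1.7x in a single generation, which is the outlier being complained about here.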

Gaming With a 4:3 CRT

System specs below

 

CPU: AMD Ryzen 7 5700X with a Noctua NH-U9S cooler 
Motherboard: Gigabyte B450 Aorus M (Because it was cheap)
RAM: 32GB (4 x 8GB) Corsair Vengeance LPX 3200MHz CL16
GPU: EVGA GTX 980 Ti SC Blower Card
HDD: 7200RPM TOSHIBA DT01ACA100 1TB, External HDD: 5400RPM 2TB WD My Passport
SSD: 1TB Samsung 970 EVO M.2 NVMe
PSU: Corsair CX650M
Displays: ViewSonic VA2012WB LCD 1680x1050 @ 75Hz
Gateway VX920 CRT: 1920x1440@65Hz, 1600x1200@75Hz, 1200x900@100Hz, 960x720@125Hz
Gateway VX900 CRT: 1920x1440@64Hz, 1600x1200@75Hz, 1200x900@100Hz, 960x720@120Hz (Can be pushed to 175Hz)
 
Keyboard: Thermaltake eSPORTS MEKA PRO with Cherry MX Red switches

Weird to not include the 7900 XTX, especially when much smaller channels (in regard to both staff and sub count) were able to.

Wonder how he tries to justify it on the WAN Show.


1 hour ago, MadAnt250 said:

Yup, no amount of inflation could cause a jump that big. In my opinion the 4080 should be a $750 base-MSRP GPU, but we all know this problem started back when the RTX 2000 series came out a chunk more expensive than the GTX 1000 series cards it replaced.

Given the other variables, the launch price of the RTX 4080 16GB should've been $900, something I suggested at its launch too, with the 4070 Ti at ~$700-750. That would've also opened up the market for the Super variants to come in slightly higher, at their current pricing.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


Let me get this straight: they tested only one 4070 Super and one 4080 Super?

Did they use more than one card to get the benchmark results?

If it were me, I would test at least 5 GPUs of the same type for reliable results (IT COULD BE BAD LUCK WITH THE SILICON LOTTERY).

Is there any proof Linus actually used a high-end motherboard and RAM for the results?

A cheap budget or mid-range motherboard can visibly cost you lots of frames.

Where is the proof for that?

If I understand correctly, some other YouTube channel showed proof that Linus's channel puts out somewhat off results, and they were forced to cut down how many videos they release.

Maybe this also made them use a cheaper mobo etc., and that's why the frames are so poor.

Lastly, the computers are listed in the video info, but we never see the actual PC. Can I trust such benchmark results? Fishy.
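On the five-samples point: fair ask in principle, but here is roughly what the statistics would look like, with invented FPS numbers for five hypothetical cards of the same model (not real data):

import statistics

# Invented average-FPS results for five cards of the same model, purely to
# illustrate what silicon-lottery spread on a stock GPU tends to look like.
samples = [142.1, 140.8, 143.5, 141.9, 142.7]

mean = statistics.mean(samples)
stdev = statistics.stdev(samples)

print(f"mean {mean:.1f} FPS, stdev {stdev:.2f} FPS ({100 * stdev / mean:.1f}%)")

At stock clocks, card-to-card variance tends to be on the order of a percent or two, since the boost algorithms target the same clock and power limits; that's why most outlets treat a single sample as acceptable for FPS charts, with the lottery mattering mainly for overclocking headroom.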


@AdamFromLTT Any word on what is going on in this chart (after promising to improve after the GamersNexus video)?

 

The RTX 4070 Ti Super data is suspicious:

Eyeballing it, the average should be approx. 61°C (yellow trace), but it's listed as 77°C.

Eyeballing the max value gives 62°C, but it's listed as 79°C.

 

Sanity check against, for example, the RTX 4080 Super chart: I would eyeball the average as 71°C, and the number LTT published (and thus stands behind with their name) is 71°C. For this trace the chart matches the published number. Another sanity check? RX 7800 XT: eyeballing the average gives 85°C, and the published value is 85°C.

Another round of this with max values: RX 7800 XT eyeballed at 88°C, published 87°C | RTX 4070 Super eyeballed at 78°C, published 77°C. Both are a match.

 

TL;DR: The data is screwed up. Either the yellow trace isn't from an RTX 4070 Ti Super, or the published avg. and max. were taken from a different data set.
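If LTT ever publishes the raw trace data, the eyeball check becomes trivial to automate. A sketch with invented stand-in numbers hovering where the yellow trace appears to sit:

# Invented stand-in samples for the yellow trace (~61 C), degrees Celsius.
trace = [60.5, 61.2, 61.8, 60.9, 61.4, 62.0, 61.1, 60.7]

avg, peak = sum(trace) / len(trace), max(trace)
published_avg, published_max = 77, 79   # numbers printed on the chart

print(f"trace avg {avg:.1f} C vs published {published_avg} C")
print(f"trace max {peak:.1f} C vs published {published_max} C")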

 

On 2/1/2024 at 8:55 PM, Goodevil said:

How can the RTX 4070 Ti SUPER, which does not even touch 70°C, have a 79°C max temp? And how is the average 77?

 

image.thumb.png.ecf9f5291bd6e15eeee49a1cd84dd98b.png

 

People never go out of business.

