After more than 2 years, the 4090 is finally getting a successor. But is the new Blackwell architecture going to bring improvements over Ada Lovelace? Or is it all just AI magic? It's a bit of both. Join us as we delve into the world of Neural Rendering and DLSS 4 with the latest generation of Nvidia graphics. Is AMD cooked? Will Intel even be on the charts? I'm not gonna tell you here. Scroll up and watch the video!

https://linustechtips.com/topic/1597779-the-rtx-5090-our-biggest-review-ever/

I appreciate the inclusion of the 1080 Ti in the gaming benchmarks. Seeing the new card be X% faster than the 4090 is great and all, but seeing how it performs compared to the likes of the 1080 Ti and 2080 provides a lot more perspective for people looking to upgrade from those generations of cards.

 

I liked the approach with the different hosts for each section of the video.

CPU: Intel i7 6700k  | Motherboard: Gigabyte Z170x Gaming 5 | RAM: 2x16GB 3000MHz Corsair Vengeance LPX | GPU: Gigabyte Aorus GTX 1080ti | PSU: Corsair RM750x (2018) | Case: BeQuiet SilentBase 800 | Cooler: Arctic Freezer 34 eSports | SSD: Samsung 970 Evo 500GB + Samsung 840 500GB + Crucial MX500 2TB | Monitor: Acer Predator XB271HU + Samsung BX2450

Proportionally larger silicon that consumes proportionally more power and costs proportionally more money, with gatekept features to sell the generation.

Builder/Enthusiast/Overclocker since 2012 who spends way too much money on computer hardware.

13 minutes ago, Agall said:

Proportionally larger silicon that consumes proportionally more power and costs proportionally more money, with gatekept features to sell the generation.

Give it a month and people will have bypassed the restrictions in multiple games and gotten DLSS 4's feature set working on older cards. It's what happened with 3.5 too: frame gen runs on the 2000 and 3000 series in multiple games. Wukong and such have it, and it works well.

1 minute ago, jaslion said:

Give it a month and people will have bypassed the restrictions in multiple games and gotten DLSS 4's feature set working on older cards. It's what happened with 3.5 too: frame gen runs on the 2000 and 3000 series in multiple games. Wukong and such have it, and it works well.

My speculation that the RTX 4090 will end up as the 1080 Ti of RTX cards is looking better.

 

Considering the die is 22.2% larger and the shader core count is 32.8% higher, with nearly double the memory bandwidth, I was hoping I was wrong and it would be more than a 25-30% increase at 4K.
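For anyone wanting to sanity-check those percentages, here's a quick sketch. The die areas, core counts, and bandwidth figures below are assumptions pulled from public spec sheets (AD102 vs. GB202), not numbers from the review:

```python
# Rough generational scaling math for the RTX 5090 vs. RTX 4090.
# All spec figures are assumptions from public spec sheets.
ad102_die_mm2, gb202_die_mm2 = 608.5, 744.0  # die area, mm^2
ad102_cores, gb202_cores = 16384, 21760      # CUDA core counts
ad102_bw, gb202_bw = 1008, 1792              # memory bandwidth, GB/s

die_growth = gb202_die_mm2 / ad102_die_mm2 - 1  # ~22% larger die
core_growth = gb202_cores / ad102_cores - 1     # ~33% more shaders
bw_growth = gb202_bw / ad102_bw - 1             # ~78% more bandwidth

print(f"die: +{die_growth:.1%}, cores: +{core_growth:.1%}, bandwidth: +{bw_growth:.1%}")
```

Against those input growth figures, a 25-30% uplift at 4K means the extra silicon is scaling at roughly the same rate as the shader count, with the bandwidth gain mostly unused by raster workloads.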

4 minutes ago, Agall said:

My speculation that the RTX 4090 will end up as the 1080 Ti of RTX cards is looking better.

 

Considering the die is 22.2% larger and the shader core count is 32.8% higher, with nearly double the memory bandwidth, I was hoping I was wrong and it would be more than a 25-30% increase at 4K.

I'd actually give that crown to the 3090, due to its value dropping after the mining craze to €600 or less for some months. That, and it runs all the new DLSS features the 40-series has, as long as you spoof the game.

6 minutes ago, jaslion said:

I'd actually give that crown to the 3090, due to its value dropping after the mining craze to €600 or less for some months. That, and it runs all the new DLSS features the 40-series has, as long as you spoof the game.

When I use the 1080 Ti as a point of comparison, I'm comparing it to the 2080 Ti's performance and price. The RTX 4090 vs. 3090 was one of the largest generational jumps in performance, especially at 4K (and it wasn't even as good a bin as the 3090/3090 Ti).

 

Potentially time to make this chart again, especially with how spaced the binning between the 5080 and 5090 is:

 

 

26 minutes ago, Agall said:

When I use the 1080 Ti as a point of comparison, I'm comparing it to the 2080 Ti's performance and price. The RTX 4090 vs. 3090 was one of the largest generational jumps in performance, especially at 4K (and it wasn't even as good a bin as the 3090/3090 Ti).

 

Potentially time to make this chart again, especially with how spaced the binning between the 5080 and 5090 is:

 

 

Aaah, OK, in that sense I fully agree.

19 minutes ago, jaslion said:

Aaah, OK, in that sense I fully agree.

I have no idea how the RTX 5080 is going to compete with only half the core count. I wouldn't be surprised if the RTX 4090 gaps it by 50%+. I have a feeling tomorrow will be even worse when the RTX 5080 benchmarks come out and it's only something like 12% over the RTX 4080 Super.

1 minute ago, Agall said:

I have no idea how the RTX 5080 is going to compete with only half the core count. I wouldn't be surprised if the RTX 4090 gaps it by 50%+. I have a feeling tomorrow will be even worse when the RTX 5080 benchmarks come out and it's only something like 12% over the RTX 4080 Super.

I don't have my hopes up for the 5080. Since it's more power hungry, I have a feeling it's going to scale exactly like that in performance uplift, and that's it. Going to be a major disappointment.

1 hour ago, Agall said:

Proportionally larger silicon that consumes proportionally more power and costs proportionally more money, with gatekept features to sell the generation.

 

/thread

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | RTX 3080 ti Founders Edition | 4x8GB Crucial Ballistix 3600mt/s CL16

Seems like frame gen still needs a relatively high native frame rate to work best; this doesn't bode well for Nvidia's claim that a 5070 is on par with a 4090.
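A rough sketch of why the base frame rate matters: generated frames raise the displayed fps, but input is still sampled at the native rate, so responsiveness tracks the base frame time. The 4x multiplier below mirrors DLSS 4 multi frame generation; the numbers are purely illustrative:

```python
# Frame gen multiplies displayed fps, but the input-latency floor
# stays tied to the base (natively rendered) frame rate.
def framegen(base_fps: float, multiplier: int = 4) -> tuple[float, float]:
    """Return (displayed fps, base frame time in ms)."""
    displayed_fps = base_fps * multiplier
    base_frame_time_ms = 1000.0 / base_fps  # responsiveness floor
    return displayed_fps, base_frame_time_ms

for base in (30, 60):
    shown, latency = framegen(base)
    print(f"{base} fps base -> {shown} fps shown, ~{latency:.1f} ms frame time")
```

So a 5070 pushed to 4090-like displayed numbers from a low native rate would still feel like the low native rate, which is what the review's latency testing gets at.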

Also, amazing review. The even balance between theory and benchmarks and the smooth transitions between hosts were so good, and David presenting graphs was an amazing move lol

20 minutes ago, Agall said:

I'm sorry but I don't know what that means.

 

It means your post should end the thread because there's nothing else to say. 

Hmm, with frame gen not being hardware but an AI model that runs on the Blackwell Tensor cores, they should be able to update it via driver update. Awesome.

It's so nice having a high-end GPU that (hopefully) won't be unobtainium. Expensive, sure, but you can at least find it to buy if you have the cash. At least I hope so.

3 hours ago, Uttamattamakin said:

It's so nice having a high-end GPU that (hopefully) won't be unobtainium. Expensive, sure, but you can at least find it to buy if you have the cash. At least I hope so.

The 4090 and such were also not too hard to get some time after launch. The 5090s are going to get scalped for some months, and then availability will return.

I'm so glad the review covers the 3080 and the 2080. This review seems really well made.

 

I would have liked an N-game average and a cost-per-frame chart, but the picture is clear.

 

Skipping the 5xxx series... I can drop really big money on a GPU; I was prepared to drop even €2,000. Not €2,400 in the best-case scenario. Jensen isn't seeing any of my euros this time around either.

 

I'll be picking up a 7900 XTX, and I'll see if the 5080 Ti Super 24GB is worth it in a couple of years. I'll certainly not buy a 16GB card above €1,000.

 

I'm really disappointed there is NO manufacturer making a new 24GB card -.-

Power draw... yep, nope, I am not rewiring a room or two of my house for my rig just to avoid tripping the breaker.
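As a rough sketch of the breaker concern, assuming a North American 120 V / 15 A branch circuit, the common 80% continuous-load rule of thumb, and ballpark component wattages (none of these are measured figures):

```python
# Back-of-envelope check of a 5090-class build against one wall circuit.
# Assumes 120 V / 15 A and the 80% continuous-load guideline.
circuit_watts = 120 * 15                 # 1800 W breaker limit
continuous_watts = circuit_watts * 0.8   # 1440 W sustained guideline

# Ballpark assumptions: 575 W GPU, 250 W CPU, 150 W rest of system, 60 W monitor
gpu_w, cpu_w, rest_w, monitor_w = 575, 250, 150, 60
system_w = gpu_w + cpu_w + rest_w + monitor_w

print(f"system draw ~{system_w} W of {continuous_watts:.0f} W continuous budget")
print(f"headroom: {continuous_watts - system_w:.0f} W for everything else on the circuit")
```

A single rig fits, but transient spikes plus anything else sharing the circuit (space heater, second rig) is where the tripping worry comes from.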

MSI X399 SLI Plus | AMD Threadripper 2990WX all-core 3GHz lock | Thermaltake Floe Riing 360 | EVGA 2080, Zotac 2080 | G.Skill Ripjaws 128GB 3200MHz | Corsair RM1200i | 200TB raw | ASUS TUF Gaming mid tower | 10Gb NIC

Pretty much as expected: a 30% performance uplift in titles that use RT/AI, a nothingburger everywhere else.

 

And that is perfectly fine. Times are a-changin'. I have said this before: there is good reason for Nvidia to be all-in on ray tracing/AI.

 

1. There are realistic limits to what you can do with traditional lighting methods; making it look good is possible, but very time-consuming.

2. There is no reason to make a faster raster card; the hardware has been 'enough' for some time.

3. Making faster raster hardware is expensive in terms of silicon used and, most importantly, power. Lighting and shading are the most computationally expensive parts of rendering a 3D scene, and we have had hardware more than fast enough to satisfy any raw pixel-fill need you could want.

 

Ray tracing is here to stay. The age of raster-lite GI, light maps, and shader tricks is coming to a close. The king is dead, long live the king.

Eh, the AI benchmarks were pretty much useless, since they were using DirectML instead of CUDA.

But I guess there wasn't much they could have done, since the newest CUDA with Blackwell support wasn't available yet.

 

On 1/23/2025 at 12:19 PM, Agall said:

When I use the 1080 Ti as a point of comparison, I'm comparing it to the 2080 Ti's performance and price. The RTX 4090 vs. 3090 was one of the largest generational jumps in performance, especially at 4K (and it wasn't even as good a bin as the 3090/3090 Ti).

 

Potentially time to make this chart again, especially with how spaced the binning between the 5080 and 5090 is:

 

 

Do you mean like so?

 

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga
