
3090 vs 4070 super vs 4070 ti

Ed.Nev

Hi! 
I'm looking to purchase a new GPU, and these are my current options and their prices:

RTX 4070 Super = £580

RTX 3090 = £580

RTX 4070 Ti = £640

 

My concerns are the lack of frame generation on the 3090 - though I have tried frame gen and don't love it in all games. I'm also really worried about only 12GB of VRAM on both the 4070 Super and the Ti, as I want this card to last a few years.

 

Sadly, the 4070 Ti Super (and above) is just too expensive.

 

My PSU = Corsair TXM 750

My case has good cooling, and I have undervolted before, so I could do that to offset the heat of the 3090.

 

Any advice or guidance is greatly appreciated. 


As it stands, frame gen (in games where you really care about FPS) is kinda crap.

Neither the 4070 Super nor the 3090 will be limited in FPS anytime soon in normal gaming.

Basically, pick whether you want the VRAM (3090) or the extra tech at the expense of VRAM (4070 Super).


Website where I got this list from:

 

[image: screenshot of the list]

 

As far as I'm aware, the 3090 has DLSS, but maybe just a lower version of it or something.

 

Just now, tkitch said:

As it stands, frame gen (in games where you really care about FPS) is kinda crap.

Neither the 4070 Super nor the 3090 will be limited in FPS anytime soon in normal gaming.

Basically, pick whether you want the VRAM (3090) or the extra tech at the expense of VRAM (4070 Super).

So the 3090 is a no-brainer?

Note: Users receive notifications after Mentions & Quotes. 

Feel free to ask any questions regarding my comments/build lists. I know a lot about PCs but not everything.

PC:

Ryzen 5 5600 |16GB DDR4 3200Mhz | B450 | GTX 1080 ti

PCs I used before:

Pentium G4500 | 4GB/8GB DDR4 2133Mhz | H110 | GTX 1050

Ryzen 3 1200 3,5Ghz / OC:4Ghz | 8GB DDR4 2133Mhz / 16GB 3200Mhz | B450 | GTX 1050

Ryzen 3 1200 3,5Ghz | 16GB 3200Mhz | B450 | GTX 1080 ti


Just now, podkall said:

So the 3090 is a no-brainer?

On a PC with an 850W PSU or more?  I'd go 3090 any day.   

 

But a 750W is gonna be an issue on a 3090, as Nvidia says the requirement is 850W+.


Just now, tkitch said:

On a PC with an 850W PSU or more?  I'd go 3090 any day.   

 

But a 750W is gonna be an issue on a 3090, as Nvidia says the requirement is 850W+.

TechPowerUp doesn't (but maybe the OC versions are a bit hungrier):

 

[image: TechPowerUp RTX 3090 suggested PSU screenshot]


I'd still undervolt the 3090 at least slightly either way, to avoid potential crashes from transients.


3 minutes ago, podkall said:

As far as I'm aware, the 3090 has DLSS, but maybe just a lower version of it or something.

It's capable of the same things, minus FG. You still get DLSS and even Ray Reconstruction in games that support it.


I was slightly mistaken:

3090 Ti = 850W+
3090 = 750W

What CPU do you have, @Ed.Nev?

 


3 minutes ago, podkall said:

I'd still undervolt the 3090 at least slightly either way, to avoid potential crashes from transients.

Sorry, what are transients?


Just now, tkitch said:

I was slightly mistaken:

3090 Ti = 850W+
3090 = 750W

What CPU do you have, @Ed.Nev?

 

Apologies for not providing that. It's the i5-12600KF.

 

As far as I've seen elsewhere, it's good for any of the cards here. Would you agree? Thanks.


I ran a 5800x (PBO) and a 3090 with its power limit raised to 400W for two years on a Corsair SF750, never a hint of an issue. 


3 minutes ago, Ed.Nev said:

Sorry, what are transients?

Transients are very short-term power spikes that can go WELL above what the card normally draws.

E.g. the 3090 can draw 350W on its own, but can sometimes spike to 500-600W for a split second, and depending on the PSU that can really piss it off.
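To put rough numbers on why transients matter for PSU sizing, here's a back-of-envelope sketch. All the wattages are illustrative assumptions pulled from the discussion above, not measurements; check a review's measured spike figures for the exact card model.

```python
# Rough PSU headroom check, sustained draw vs. a transient spike.
# All figures are assumed/illustrative, not measured for any specific card.
PSU_WATTS = 750

gpu_sustained = 350   # ballpark RTX 3090 board power
gpu_transient = 600   # assumed worst-case millisecond spike
cpu_peak = 150        # i5-12600K turbo spec per Intel
rest_of_system = 75   # fans, drives, RAM, board (rough guess)

steady_load = gpu_sustained + cpu_peak + rest_of_system
spike_load = gpu_transient + cpu_peak + rest_of_system

print(f"steady: {steady_load}W ({steady_load / PSU_WATTS:.0%} of PSU)")
print(f"spike:  {spike_load}W ({spike_load / PSU_WATTS:.0%} of PSU)")
```

The steady load sits comfortably inside 750W; it's only the momentary spike that overshoots, which is why a quality PSU with good transient handling (or a mild undervolt) matters more than the headline wattage.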

 

2 minutes ago, Ed.Nev said:

Apologies for not providing that. It’s the I5 12600KF

 

As far as i’ve seen elsewhere it is good for any of the cards here. Would you agree? Thanks 

It'd probably do just fine.  

If you had an i9, I'd have said you need a bigger PSU.  


1 minute ago, GuiltySpark_ said:

I ran a 5800x (PBO) and a 3090 with its power limit raised to 400W for two years on a Corsair SF750, never a hint of an issue. 

Difference being that, at peak, that 5800X was still gonna be a fair bit under 200W.

Whereas an i9 today could be passing 300W (that's a pretty big difference in draw).


13 minutes ago, Ed.Nev said:

Hi! 
I'm looking to purchase a new GPU, and these are my current options and their prices:

RTX 4070 Super = £580

RTX 3090 = £580

RTX 4070 Ti = £640

My concerns are the lack of frame generation on the 3090 - though I have tried frame gen and don't love it in all games. I'm also really worried about only 12GB of VRAM on both the 4070 Super and the Ti, as I want this card to last a few years.

Sadly, the 4070 Ti Super (and above) is just too expensive.

My PSU = Corsair TXM 750

My case has good cooling, and I have undervolted before, so I could do that to offset the heat of the 3090.

 

Any advice or guidance is greatly appreciated. 

I think it depends on which models you're looking at, but I'd just go for the RTX 4070 Super. It's one of the most efficient GPUs when it comes to performance per watt. The RTX 4070 Ti is decent for another £60, but you'd have to do some math to figure out if it's worth it.
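That math can be as simple as price per unit of performance. A quick sketch; the relative-performance numbers below are placeholder assumptions, not benchmark results, so swap in averaged figures from reviews you trust at your resolution.

```python
# Back-of-envelope price-per-performance comparison.
# relative_perf values are placeholders (4070 Super = 100), not benchmarks.
cards = {
    "RTX 4070 Super": {"price": 580, "relative_perf": 100},
    "RTX 4070 Ti":    {"price": 640, "relative_perf": 105},  # assumed ~5% faster
    "RTX 3090":       {"price": 580, "relative_perf": 98},   # assumed; varies by game
}

# Sort cheapest-per-performance-point first.
ranked = sorted(cards.items(), key=lambda kv: kv[1]["price"] / kv[1]["relative_perf"])

for name, c in ranked:
    print(f"{name}: £{c['price'] / c['relative_perf']:.2f} per perf point")
```

With these placeholder numbers the extra £60 for the Ti costs more per frame than the Super; the conclusion flips only if the Ti's real-world lead is bigger than its price premium.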

 

If you're able to get the FE version of the RTX 4070 Super, I'd opt for that. Going by the RTX 3000 series, those tend to retain their value, and stock dries up shortly after release.

 

The RTX 3090 is great and all, but I imagine it's used, and it will also draw a lot more power. The extra VRAM might be nice, but unless you're running 4K textures, more than 12GB of VRAM isn't really necessary.

 

12 minutes ago, tkitch said:

As it stands, frame gen (on games where you really care about FPS) is kinda crap.

I've had a pretty good experience with frame generation so far. My primary use of it is in Diablo 4, where it's literally free performance. I haven't noticed any performance issues or stuttering that caused me to turn it back off. I have disabled DLSS upscaling, though, since that's noticeable, so I'm running 4K ultra with no upscaling but with frame generation.

 

When it first came out in Witcher 3's RTX update, it would cause the game to crash, but otherwise it worked just fine. I'm also running a 4K 240Hz QLED, so it's not a slow display by any means, where frame generation's input latency might be masked.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


1 minute ago, Agall said:

I've had a pretty good experience with frame generation so far. My primary use of it is in Diablo 4, where it's literally free performance.

I too think it works fine for a few titles. I'm curious about this one, though, since I've put quite a few hours into D4 with a 4090 at 4K and I just run DLAA. The game is so easy to run on this card that DLSS or even FG seemed completely unnecessary. I have a global frame cap of 118fps (LG C2), so maybe that's why; as it stands, the game just sits at that cap at all times.


3 minutes ago, tkitch said:

Difference being that, at peak, that 5800X was still gonna be a fair bit under 200W.

Whereas an i9 today could be passing 300W (that's a pretty big difference in draw).

For sure, but they look to have a 12600K in this case. Regardless, if it's a recent TX750, it'll probably be fine.


1 minute ago, GuiltySpark_ said:

For sure, but they look to have a 12600K in this case. Regardless, if it's a recent TX750, it'll probably be fine.

150W on turbo according to Intel's website; it could still be less if you're not using all cores.


Just now, podkall said:

150W on turbo according to Intel's website; it could still be less if you're not using all cores.

That's what I'd see in y-cruncher with a 5800x. Similar. 


1 minute ago, GuiltySpark_ said:

I too think it works fine for a few titles. I'm curious about this one, though, since I've put quite a few hours into D4 with a 4090 at 4K and I just run DLAA. The game is so easy to run on this card that DLSS or even FG seemed completely unnecessary. I have a global frame cap of 118fps (LG C2), so maybe that's why; as it stands, the game just sits at that cap at all times.

I try to get to 240fps with my 4K 240Hz, so it's free performance for that. I wouldn't be surprised if you'd save yourself a solid 30-40W of draw by enabling it with your framerate cap.

 

If they ever add it to Warframe, I'll be able to test it extensively, especially if FSR3 comes with that update for AMD's side. That's a LOT more latency-sensitive game than Diablo 4, where you're mostly limited by the game's mechanics. Frame generation might be more noticeable in an FPS/TPS.

 

Keep in mind that Diablo 4 gets RTX on the 26th of March, so your performance characteristics might change; as long as they've properly added RT, it'll probably look wonderful. I wouldn't be surprised, though, if it's 'RT by name' like they did in WoW Shadowlands, but hopefully not.


Just now, Agall said:

I try to get to 240fps with my 4K 240Hz, so it's free performance for that. I wouldn't be surprised if you'd save yourself a solid 30-40W of draw by enabling it with your framerate cap.

Ah, OK, yeah, the 240Hz part of the equation is what I was missing. Capping for 120Hz means that in some lighter-weight games this card has to work so little it's kind of funny. SO much less heat and power than the 3090, but anyway, I'm getting off-topic for the thread.


2 minutes ago, podkall said:

150W on turbo according to Intel's website, still could be less if we're not using all cores

I've seen reports as high as 226W. Apparently it can go as high as 320W, though; note that's the power-limits-removed, non-OC figure:

 

[image: TechPowerUp i5-12600K power consumption chart]

 

Intel Core i5-12600K Review - Winning Price/Performance - Power Consumption & Efficiency | TechPowerUp

 

Not that it'll regularly go that high, or even can without removing power limits, so it's closer to 190W at stock.


1 minute ago, GuiltySpark_ said:

Ah, OK, yeah, the 240Hz part of the equation is what I was missing. Capping for 120Hz means that in some lighter-weight games this card has to work so little it's kind of funny. SO much less heat and power than the 3090, but anyway, I'm getting off-topic for the thread.

Nvidia has done a good job of convincing me that they're putting the work in to expand the utility of their architecture and drivers. I find myself using the RTX enhancements to video through Super Resolution and HDR, and even their new Chat with RTX. All features they've added for free as part of their software suite.

 

The 'RTX tax' we all saw back in the RTX 2000 series is becoming worth it over time, considering something like an RTX 2060 Super can now be used in an AI-upscaling home theatre machine and/or local LLM server. That also comes with the expectation that they might not support their older GPUs with new features, or that an older card wouldn't resell as easily as something like an RTX 4070 Super, with its power efficiency and compact size.

 

We'll have to see what happens with the RTX 5000 series, if we even get high-end cards. If the AI performance is as insane as Nvidia is claiming, something like an RTX 4050 might end up being a DLSS/frame generation/AI/home theatre monster.


8 minutes ago, Agall said:

We'll have to see what happens with the RTX 5000 series, if we even get high-end cards. If the AI performance is as insane as Nvidia is claiming, something like an RTX 4050 might end up being a DLSS/frame generation/AI/home theatre monster.

 

I think this is a concept many have a real problem with, in theory. People's fear is that less effort will be put into optimization if the end result is that you're expected to always use some sort of machine-learning enhancement. Oversimplifying, but that's the gist of it.

 

Personally, I think the tech is exciting and I'm totally here for seeing how it progresses; I don't have the doom-and-gloom outlook yet.


22 minutes ago, Agall said:

I've seen reports as high as 226W. Apparently it can go as high as 320W, though; note that's the power-limits-removed, non-OC figure:

[image: TechPowerUp i5-12600K power consumption chart]

Intel Core i5-12600K Review - Winning Price/Performance - Power Consumption & Efficiency | TechPowerUp

Not that it'll regularly go that high, or even can without removing power limits, so it's closer to 190W at stock.

That's more of a Cinebench kind of application power, I'd assume; you're not drawing 200+W on a 150W-turbo-spec CPU if you're gaming.


1 minute ago, podkall said:

That's more of a Cinebench kind of application power, I'd assume; you're not drawing 200+W on a 150W-turbo-spec CPU if you're gaming.

For sure, but you want to consider the max load of all components at once. For gaming, yeah, it would be fine.

