
RTX 3080 10GB upgrade suggestions?

Tadziunia

Hello men of culture!

I can't decide whether I should upgrade or not, so here's my story (sorry for the block of text). I have a Gigabyte RTX 3080 Gaming OC 10GB that I've had since January 2021. Of course I overpaid a lot for it; pricing here in Europe was terrible. I bought it for 1240 EUR and it was in fact the cheapest model - the Vision OC was around 1450 EUR - and prices kept increasing. And that was the price with no markup: a friend of mine works at a PC parts shop, and it was simply the supplier selling cards to retail at extreme markups.

The card had the same issues as all RTX 3080/3090s from every manufacturer: overheating GDDR6X. 105°C on the memory during gaming isn't normal, even in an open case with high airflow. The chip itself sat at 64°C while the memory ran almost twice as hot. The Suprim X and others had the same issue out of the box. Since GPUs were out of stock and they were all manufactured with the same cheap thermal pads, the only option was to buy good thermal pads and change them myself. VRAM dropped from 105°C to 72°C. I did that in May 2021 and temps are still great.

However, the VRAM amount is not enough. Not being able to max out something as basic as textures drives me crazy. I paid so much back then precisely so I wouldn't have to care, because it was supposed to run anything maxed out, and it can't. Far Cry 5 with HD textures isn't possible - I get a muddy image - and the same goes for Far Cry 6. I couldn't run Warzone 1 with high textures at 4K; it would stutter a lot. Later I found out that to max it out without stutter I'd need at least a 16GB card, and even that would be at its limits - a 3090 ate around 18GB. Same story with Warzone 2: to play it with all the bells and whistles I'd need at least 24GB of VRAM; I've seen a 4090 chugging 22GB at 4K. I didn't play The Last of Us, but I've seen that it has issues too.

There are also weird issues with ray tracing. Even though I've had the card for almost two and a half years, I never really finished a game with it turned on. In Dying Light 2, for example, I can only use ray tracing for 15-30 minutes, then GPU usage drops by 20%; a bit later performance gets even worse with the GPU at 50%, and I have to fully restart the PC, after which it repeats. I have a 7950X at 5.5GHz, so I don't think it's a CPU bottleneck, and I'm at 3440x1440. Funnily enough, I finished the game with FSR 2 and high settings, without ray tracing.

I'm waiting for Dead Island 2 and am definitely getting it, but I'm pretty sure I won't be able to max out even the simplest setting like textures. Again, a mud bath. If the GPU had 12GB like Nvidia's later scam re-releases, I think I could live with that. So, should I sell the 3080 and get a 7900 XTX, or cough up an extra 600 EUR for a 4090?


23 minutes ago, Tadziunia said:


Hi bro 🙂 

Pardon me for calling you bro - it's because I also got a 3080 10GB in Q1 2021, the Gigabyte Eagle model. I had no choice either, paid 1200 EUR in Europe as well (France), and I use 3440x1440 too! 🙂

I also had to change the thermal pads, but it was still way too loud, so I eventually watercooled it by mid-2021.

It's worked like a champ since (with a 360 rad it was max 60°C, and quiet), but I couldn't do anything about the small VRAM buffer - irritating stutters in many games - so I ended up upgrading to a 7900XTX last week! 🙂

I didn't really consider the 4090, not only because of the huge price jump but also because my 5900X would bottleneck it, so the choice was between the 4080 and the 7900XTX.

Really, I chose AMD mostly because Nvidia are evil hyenas - I didn't want to pay them again for the planned obsolescence they built into the 3080 🤬 (they even tried it again with the "4080 12GB"...).

Now, the 7900XTX is nice - great rasterization performance and enough VRAM for the next 5 years 🙂 - but you get about the same RT performance on a 7900XTX as on a 3080... No DLSS either, and FSR seems worse (but I haven't used it yet).

The Adrenalin software is also quite nice, putting to shame the crappy Nvidia stuff that relies on a now-unpaid Russian guy who built Afterburner all by himself 😞

AMD cards can't do Stable Diffusion at the moment either (unless you do a lot of tweaking) - I had fun with that - but I'm sure it'll be compatible soon.

Overall, unless you really want good RT and DLSS, I'd say go red - no need to spend 600 bucks more for a 4090!

 

 

System: AMD R9 5900X / Gigabyte X570 AORUS PRO / 2x16GB Corsair Vengeance 3600CL18 / ASUS TUF Gaming AMD Radeon RX 7900 XTX OC Edition GPU / Phanteks P600S case / Eisbaer 280mm AIO (with 2x Arctic P14 fans) / 2TB Crucial T500 NVMe + 2TB WD SN850 NVMe + 4TB Toshiba X300 HDD drives / Corsair RM850x PSU / Alienware AW3420DW 34" 120Hz 3440x1440 monitor / Logitech G915TKL keyboard (wireless) / Logitech G PRO X Superlight mouse / Audeze Maxwell headphones


First off, when your GPU usage drops, the GPU is not the bottleneck. Low GPU usage means it's waiting for something else in the chain. Since it only happens after some time passes, my bet would be RAM. If the GPU or CPU weren't performing to spec, you'd notice it instantly.
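If you want to check that while you play, a rough sketch like this works (assuming an Nvidia card with nvidia-smi on the PATH, a single GPU, and the psutil package installed - it's just a quick logger, nothing official). It prints VRAM, GPU utilization, and system RAM every few seconds, so you can see what fills up right before the slowdown hits:

```python
# Minimal memory logger: run this in a second window while gaming and watch
# whether VRAM or system RAM creeps up before the GPU usage drop appears.
# Assumes a single Nvidia GPU, nvidia-smi on the PATH, and `pip install psutil`.
import subprocess
import time

import psutil

QUERY = [
    "nvidia-smi",
    "--query-gpu=memory.used,memory.total,utilization.gpu",
    "--format=csv,noheader,nounits",
]

while True:
    # One CSV line like "8234, 10240, 97" (MiB used, MiB total, GPU %)
    vram_used, vram_total, gpu_util = (
        subprocess.check_output(QUERY, text=True).strip().split(", ")
    )
    ram = psutil.virtual_memory()
    print(
        f"VRAM {vram_used}/{vram_total} MiB | GPU {gpu_util}% | "
        f"RAM {ram.used / 2**30:.1f}/{ram.total / 2**30:.1f} GiB"
    )
    time.sleep(5)
```

If the numbers are flat when the stutter starts, the culprit is probably something else in the chain.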

 

Second, imo the 3080 10G is still an insanely fast GPU. I have one myself and can play everything I want at or close to max settings. If it can't do ultra, I enable DLSS. If that's still not enough, I just drop down to high. Typically I don't bother with all the little options in the graphics settings. In both cases you're not missing out on much.

 

If I were to upgrade from a 3080, anything less than a 4090 is too small of a jump to convince me. I also hate that Nvidia made a high-end GPU with 10GB of VRAM. At the time of its release there was no game that needed more, but now there are some titles. Luckily I haven't run into a VRAM limitation yet with the games I play, and I don't have a problem dropping to high settings.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


4 minutes ago, PDifolco said:


I haven't really used ray tracing since day one of owning the 3080; the idea of getting less performance for prettier shadows or reflections is of no use to me. I originally wanted an RX 6800 XT, but when it arrived in Lithuania it cost 1700 EUR retail, so waiting wasn't an option. If I had an AMD GPU, I wouldn't be asking for advice on what to do with a 3080. Also, the difference between FSR 2 and DLSS is tiny. I've seen the HUB video where Tim says DLSS is noticeably better - I'd say lmao; playing at high resolution without pixel peeping, they look the same. I even prefer FSR 2 over DLSS in many games. AMD is also constantly improving with drivers: nowadays the 6800 XT is at least 10% faster in new games and much smoother thanks to the 16GB VRAM buffer, compared to our old 3080 10GB.


1 minute ago, Stahlmann said:


Agree with you too, buddy. Anything less than a 4090 is exactly that - not enough - considering most of us paid far over MSRP for the 3080, and most of us had to change the thermal pads and probably lost the warranty. The 3080 wouldn't be bad on its own. I had a 1080 Ti before it, and this was supposed to be a 2x gain, but it wasn't. The 1080 Ti was too good, and it took a while for Nvidia to dump Pascal, with drivers making it slower and slower; a 1080 Ti now barely performs like an RTX 2060 in new games, while it was on par with a 2080 back in the day. The RX 7900 XTX is fast and the price is right, but it's not 2x - only in Call of Duty does it touch the 4090; otherwise it's more like a 4080 (which is also bad value, 1.45k EUR here).


You have the right idea in that, from a 3080, the only upgrades right now are either a 4090 or 7900XTX - nothing else really makes any sense.

 

The 4090 is faster than the 7900XTX in general rasterization performance. There are a couple of outlier games where the 7900XTX pulls out a narrow win, but they are by far the exception, not the rule. So do note that, with the 7900XTX, it isn't like the 6900XT vs the RTX 3090, where you're getting about the same level of performance either way. The 4090 is about 25-30% faster for gaming even without RT. So if you need to max things out at any cost, the RTX 4090 will be more capable of doing that.

 

That's not to say the 7900XTX can't - I don't know of any game it can't run maxed out until RT comes into the mix - but the 4090 will age better, even without RT and DLSS. Whether or not it will age €600 better is a different question.

 

I think the next generation of graphics cards is going to be much faster again compared to the current one, and it seems like Nvidia's pricing strategy is starting to fall apart, so the 5000 series and 8000 series respectively should be in a better place in terms of price/performance. All that to say that I expect 4090 levels of performance to cost significantly less next gen. So you may have another upgrade option available for less than you're paying today by the time the 7900XTX starts to show signs of aging.

 

Another consideration in all of this is your CPU. If your CPU is holding you back some, the 7900XTX may be better overall for that use case. This is because Nvidia uses a multi-threading software scheduler that runs on the CPU via its driver, whereas AMD uses a hardware scheduler on the graphics card itself. So AMD cards put less strain on your CPU compared to Nvidia cards, making them sometimes 20-30% faster in CPU bound situations.

 

All that to say, I think both cards are viable choices - they both give you the 24GB of VRAM necessary to turn up textures to the max - but if you can afford the 4090, I think it will provide you with a more meaningful upgrade at 4K in terms of performance, unless your CPU is weaker than a 5700X or 11700K, in which case, you'll almost certainly be CPU limited and then the 7900XTX may be better until you get a CPU upgrade.


2 minutes ago, YoungBlade said:


Thank you for the huge response! I have a 7950X (I had a 5800X before it) and I'm at 3440x1440, so I should be fine with either choice. I really wish the 4090 cost a bit less, but indeed nothing else makes sense as an upgrade. The 4090 costs 1815 EUR here in my country, while the 4080 costs 1400-1500 EUR depending on the version, and this isn't even a mining craze - the MSRP is just too high and VAT adds up. The 7900 XTX costs around 1200 EUR and performs roughly on par with the 4080, with RT performance closer to the RTX 3090 from the previous generation. However, a 24GB buffer is nicer than 16GB when 16GB is already near its limit this generation. AMD is always improving; it's not perfect, but given the 'fine wine' effect their cards tend to age better. A good example is the GTX 780 Ti vs the R9 290: old Kepler can barely run new games, or when it can, it's almost twice as slow as the AMD card. These facts alone make me feel like I need an AMD GPU for the long run, while with the RTX 3080, one generation later, we're not getting DLSS 3, we're not getting AV1 encoding, plus it lacks VRAM for basic needs. So there are three options: the 7900 XTX, the 4090, or just keep the 3080 and cope with the fact that it can't run games at ultra textures and will perform decently with mud-quality visuals.


Using a 3080 10GB myself and have no issues. Water cooled the critter and play whatever I want.

Running super smooth on a 1440p 34" widescreen Nano IPS at 144Hz.

No way am I upgrading again for at least another 2 years. By then I might grab a bargain-basement 4090, LOL.

Visually I can run Cyberpunk with great graphics, and trying to squeeze that little extra reflection out of a water puddle is wasted on me if I'm fighting and trying not to die... no time to check how pretty my reflection is.

My main games are still Control, Deus Ex, Warframe, and Horizon Zero Dawn, plus bashing zombies in 7 Days to Die, and I still revisit games like Forsaken, Star Citizen, EVE Online, Metro Exodus, etc.

But if you want to upgrade, then go ahead - I just don't think anything outside of the 4090 is going to blow you away for the cost of a new card. That's just my opinion and my reasoning for staying on my 3080 for a good while longer.


I have a 3080 10GB and was planning on keeping it while I wait and see on the 7900 XT/7900 XTX (I'd prefer those because they're better at compute and video editing, though still not as good as Nvidia cards). I dual boot Windows and Linux, so I'm considering AMD if I upgrade.

But the card I have - an Asus TUF Gaming - has been great, no complaints. It runs cool, including when gaming. Best card I've had so far. I am worried, though, about whether these 3080 10GB cards will have much demand on the second-hand market if people are abandoning the lower-VRAM cards. Thoughts? I got a decent deal on it, but re-selling it... I dunno, the longer I hold on to it, the bigger the risk? I considered building two computers when I had extra money, but it's not an option right now.

I guess I'll probably just hang on to it. Even if I could sell it for close to what I paid, I'd still need an extra $500 at least, since that's what stepping up to a 7900 XT would cost. The 6900 XT isn't very good at Blender or video editing programs like DaVinci Resolve, so I think that would be a step back, even though it would be a step up for Windows gaming and Linux in general.


8 hours ago, YoungBlade said:


So the 7900 XT is not a good upgrade (or not good value for the money, in comparison)?


3 minutes ago, Paul17 said:

So the 7900 XT is not a good upgrade (or not good value for the money, in comparison)?

The regular XT just doesn't give a big enough performance boost. To me, unless it's a 50% increase, there's really no point in upgrading - it's just not that meaningful of an uplift.

 

Here's TechSpot's 16-game average from their 7900 XT review. There's only a 30% increase in performance coming from the 3080 10GB. The only cards that offer more than a 50% boost are the 4080 (51%), the 7900XTX (57%), and the 4090 (98%).

 

[Chart: TechSpot 4K 16-game average from the 7900 XT review]
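If it helps to put those percentages side by side, here's a quick back-of-the-envelope sketch using only the numbers quoted above (normalized so the 3080 10GB = 1.0); the 4090-vs-7900 XT ratio at the end is derived from those same figures, not a separate benchmark:

```python
# Uplifts over the RTX 3080 10GB, taken from the TechSpot 4K averages quoted above.
uplift_over_3080 = {
    "RX 7900 XT": 1.30,
    "RTX 4080": 1.51,
    "RX 7900 XTX": 1.57,
    "RTX 4090": 1.98,
}

for card, mult in uplift_over_3080.items():
    print(f"{card}: +{(mult - 1) * 100:.0f}% over the 3080 10GB")

# Derived comparison: 1.98 / 1.30 ≈ 1.52, i.e. the 4090 is roughly 52% faster
# than the 7900 XT, which is why the regular XT feels like a half-step.
ratio = uplift_over_3080["RTX 4090"] / uplift_over_3080["RX 7900 XT"]
print(f"RTX 4090 vs RX 7900 XT: +{(ratio - 1) * 100:.0f}%")
```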

 

As for the RTX 4080, the OP is worried about the amount of VRAM. The 7900XTX and 4090 both offer 24GB, but the 4080 has just 16GB. Since the OP is struggling with 10GB, and it seems like using 20GB+ is possible, 16GB doesn't seem sufficient.

