Is the 3080 still good in 2023?

2 minutes ago, xg32 said:

If you are buying a card, then the 6900 or 6950 XT is probably a better deal. After the recent games, 16GB of VRAM is almost a requirement for a new card (I can easily induce VRAM-related crashes on my 3080 Ti in RE4, 2077, and Spider-Man; I don't have TLOU or Jedi Survivor to test).

In RE4, if you don't do RT, there's no crashing even if you go over the VRAM amount.

14 minutes ago, Sol_NB said:

How many, though? And are they the recent, badly optimized, barely working ones? Because I don't know if that's a good test bed.

Many recent AAA titles like CP2077, Hogwarts Legacy, TW Warhammer 3... seems like Jedi Survivor as well.

That's clearly a trend in AAA games, and 3080 users usually want to play those.

 

System: AMD R9 7950X3D CPU / Asus ROG STRIX X670E-E board / 2x32GB G-Skill Trident Z Neo 6000CL30 RAM / ASUS TUF Gaming AMD Radeon RX 7900 XTX OC Edition GPU / Phanteks P600S case / Thermalright Peerless Assassin 120 cooler (with 2x Arctic P12 Max fans) / 2TB WD SN850 NVMe + 2TB Crucial T500 NVMe + 4TB Toshiba X300 HDD / Corsair RM850x PSU

Alienware AW3420DW 34" 120Hz 3440x1440p monitor / Logitech G915TKL keyboard (wireless) / Logitech G PRO X Superlight mouse / Audeze Maxwell headphones

7 minutes ago, PDifolco said:

Many recent AAA titles like CP2077, Hogwarts Legacy, TW Warhammer 3... seems like Jedi Survivor as well.

That's clearly a trend in AAA games, and 3080 users usually want to play those.

 

CP2077 I can get to run great on my 3070 if I use DLSS, and I can even do so with RT on at 1440p. Even for Warhammer, I saw benchmarks where the 3070 at least runs at 60 FPS on high or ultra, pretty sure.

I wonder if we're getting lost in the sauce, thinking we need to crank everything all the way up, and that if it doesn't run at like 120 FPS that way, it's a big deal and means these cards are dead.

Also, DLSS helps out a lot with this stuff.

Like, if you're buying new, sure, go for more VRAM. But let's not act like these 8GB cards are bad just because they're not getting 144 FPS at max settings.

18 minutes ago, Sol_NB said:

CP2077 I can get to run great on my 3070 if I use DLSS, and I can even do so with RT on at 1440p. Even for Warhammer, I saw benchmarks where the 3070 at least runs at 60 FPS on high or ultra, pretty sure.

I wonder if we're getting lost in the sauce, thinking we need to crank everything all the way up, and that if it doesn't run at like 120 FPS that way, it's a big deal and means these cards are dead.

Also, DLSS helps out a lot with this stuff.

Sure, we aren't talking about unplayable framerates, but lacking VRAM still degrades the experience with stutters and bad lows.

My personal experience in TW Warhammer 3: I had stutters and frame drops to the 30s when moving around or zooming the campaign map with my 3080; it's now way smoother on my new 7900 XTX.

The 3080 was good at launch but ages rather badly due to its low VRAM, that's my point.

I'm not at all a fan of upscaling. I mean, what's the point of having a higher resolution if the game is rendered at a lower one and needs some software sauce to get to your higher resolution?

Seems scammy to me ☺️ (and frame generation even more so).

5 minutes ago, PDifolco said:

Sure, we aren't talking about unplayable framerates, but lacking VRAM still degrades the experience with stutters and bad lows.

My personal experience in TW Warhammer 3: I had stutters and frame drops to the 30s when moving around or zooming the campaign map with my 3080; it's now way smoother on my new 7900 XTX.

The 3080 was good at launch but ages rather badly due to its low VRAM, that's my point.

I'm not at all a fan of upscaling. I mean, what's the point of having a higher resolution if the game is rendered at a lower one and needs some software sauce to get to your higher resolution?

Seems scammy to me ☺️ (and frame generation even more so).

Sure, but if you already have one of these 8GB cards, using upscaling and frame generation can mean you don't need to drop $600 or more on a newer card. Like, personally, I bought this 3070 in November. It's hard for me to believe that all of a sudden, 5 months later, it's just obsolete and not good anymore. And I'd rather not go out and buy a new one so soon.

And DLSS is honestly so good that most can't tell the difference anyway, so why not use it even if you have a beefier card? It's free frames.

1 minute ago, Sol_NB said:

Sure, but if you already have one of these 8GB cards, using upscaling and frame generation can mean you don't need to drop $600 or more on a newer card. Like, personally, I bought this 3070 in November. It's hard for me to believe that all of a sudden, 5 months later, it's just obsolete and not good anymore. And I'd rather not go out and buy a new one so soon.

Makes sense from your standpoint, but I really hate how Nvidia is voluntarily gimping the VRAM on their cards to push consumers to either upgrade fast or depend on their illusion-creating software.

That's why I recently switched to AMD (I've had only Nvidia cards these last 10 years); the cards may be a bit rougher, but the vendor is way more respectful of its customers, imo.

6 minutes ago, PDifolco said:

Makes sense from your standpoint, but I really hate how Nvidia is voluntarily gimping the VRAM on their cards to push consumers to either upgrade fast or depend on their illusion-creating software.

That's why I recently switched to AMD (I've had only Nvidia cards these last 10 years); the cards may be a bit rougher, but the vendor is way more respectful of its customers, imo.

Like, I don't know if I'm just huffing copium or something. And sure, if you're making a new purchase today, go ahead and get like a 6800 XT or, if you can swing more, a 4070 or something with more VRAM. I just can't imagine that if the price is right, or if you already have an 8GB card, you can't have a good experience anymore. If that's the case, then I guess I'm screwed, along with everyone else who has these cards.

43 minutes ago, PDifolco said:

Makes sense from your standpoint, but I really hate how Nvidia is voluntarily gimping the VRAM on their cards to push consumers to either upgrade fast or depend on their illusion-creating software.

That's why I recently switched to AMD (I've had only Nvidia cards these last 10 years); the cards may be a bit rougher, but the vendor is way more respectful of its customers, imo.

Same here. I wanted a card that could do RT well, but I was already seeing 8GB not being enough to do RT in games like Doom Eternal, Resident Evil Village, and Far Cry 6 last year, so I figured if I was turning off RT anyway, I'd save $150 and get 4GB more VRAM with a 6700 XT. I hate seeing all these Nvidia GPUs that would have been legendary cards with a proper amount of VRAM. The 3080 beats the 4070 at 4K right now in TechPowerUp's test suite of 20-25 games, but that lead will be short-lived once that test suite fills up with games designed for the current-gen consoles without being held back by last gen.

35 minutes ago, SteveGrabowski0 said:

Same here. I wanted a card that could do RT well, but I was already seeing 8GB not being enough to do RT in games like Doom Eternal, Resident Evil Village, and Far Cry 6 last year, so I figured if I was turning off RT anyway, I'd save $150 and get 4GB more VRAM with a 6700 XT. I hate seeing all these Nvidia GPUs that would have been legendary cards with a proper amount of VRAM. The 3080 beats the 4070 at 4K right now in TechPowerUp's test suite of 20-25 games, but that lead will be short-lived once that test suite fills up with games designed for the current-gen consoles without being held back by last gen.

I mean, if you don't do RT and just use high settings at 1440p, wouldn't the 8GB card hold up better?

4 hours ago, Sol_NB said:

I mean, if you don't do RT and just use high settings at 1440p, wouldn't the 8GB card hold up better?

On what planet is 8GB better than 12GB for 1440p? Especially when it costs more. Hogwarts has a huge VRAM bottleneck without RT. So does TLOU. On RE4 you're pretty much stuck with the 1GB textures from what I have seen, and I can run RT in RE games with a 6700 XT. Why would I pay more to turn down textures? Why would I want a card that'll age worse for more money?

8 minutes ago, SteveGrabowski0 said:

On what planet is 8GB better than 12GB for 1440p? Especially when it costs more. Hogwarts has a huge VRAM bottleneck without RT. So does TLOU. On RE4 you're pretty much stuck with the 1GB textures from what I have seen, and I can run RT in RE games with a 6700 XT. Why would I pay more to turn down textures? Why would I want a card that'll age worse for more money?

If you don't do RT in RE4, you can exceed the VRAM amount and be fine; you just can't do RT. Also, you can do about 2GB in that game and it doesn't exceed the 8GB when you do RT.

And when I said "better," I didn't mean better than 12GB. I just meant the 8GB card would do better at that lower setting than if you tried doing ultra on it.

I'm also not really talking about someone buying an 8GB card new anymore. Just if you already have one and maybe don't want to drop another $600 or so on a replacement right away, if you just got it.

Personally, I don't think RT is worth it overall.

Love this post from the GPU forums at AnandTech:

 

Reasons why 8GB nine years after first release isn't nVidia's fault.

  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. 4K is irrelevant.
  11. Texture quality is irrelevant as long as it matches a console's.
  12. Detail levels are irrelevant as long as they match a console's.
  13. There's no reason a game should use more than 640K 8GB, because a forum user said so.
  14. It's completely acceptable for 3070/3070TI/3080 owners to turn down settings while 3060 users have no issue.
  15. It's an anomaly.
  16. It's a console port.
  17. It's a conspiracy against nVidia.
  18. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  19. It's completely acceptable to disable ray tracing on nVidia cards while AMD users have no issue.
2 minutes ago, SteveGrabowski0 said:

Love this post from the GPU forums at AnandTech:

 

Reasons why 8GB nine years after first release isn't nVidia's fault.

  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. 4K is irrelevant.
  11. Texture quality is irrelevant as long as it matches a console's.
  12. Detail levels are irrelevant as long as they match a console's.
  13. There's no reason a game should use more than 640K 8GB, because a forum user said so.
  14. It's completely acceptable for 3070/3070TI/3080 owners to turn down settings while 3060 users have no issue.
  15. It's an anomaly.
  16. It's a console port.
  17. It's a conspiracy against nVidia.
  18. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  19. It's completely acceptable to disable ray tracing on nVidia cards while AMD users have no issue.

Except, you do realize AMD's RT is still markedly worse than Nvidia's, right? You don't buy an AMD card for RT.

Like, if you wanna say Nvidia isn't a good value and AMD has better raw performance for the money, sure, go ahead. But let's not act like FSR is as good as DLSS, and AMD's RT is not even close to what Nvidia is doing.

Like, I remember on the WAN Show they talked about "moats," meaning something a company has that separates it from everyone else and is always its advantage. What is AMD's? I mean that as a literal, serious question, not bait. Other than being cheaper and offering more value, what makes them special in GPUs? Nvidia could wake up one day, price match them, and offer the same amount of RAM. Then what for AMD? Their RT is still worse, and FSR still isn't as good as DLSS. What special sauce do they have over Nvidia if price and specs are equal?

14 hours ago, Sol_NB said:

Like, I remember on the WAN Show they talked about "moats," meaning something a company has that separates it from everyone else and is always its advantage. What is AMD's? I mean that as a literal, serious question, not bait. Other than being cheaper and offering more value, what makes them special in GPUs? Nvidia could wake up one day, price match them, and offer the same amount of RAM. Then what for AMD? Their RT is still worse, and FSR still isn't as good as DLSS. What special sauce do they have over Nvidia if price and specs are equal?

Better drivers would be the big one. For instance, the driver-overhead problem was the big culprit behind AMD cards outperforming Nvidia ones in Jedi Survivor at launch, since the game was massively CPU-bound.

On 5/1/2023 at 6:57 PM, Budget DIY said:

OP is requesting thoughts, and here are mine: it never was a good option. High-end cards tend to gobble up all the power just so your eyeballs can get flooded with some fancy photons. Smooth, with good enough quality to distinguish things @ low-ish power == perfection IMO. I wouldn't want such a card in my system, not even if I got it for free.

Sometimes I come to this forum just to laugh.
Thanks for this.

Ryzen 5 3600 | MSI B450 Tomahawk Max | Corsair Vengeance LPX 32GB 3600MHz | EVGA GeForce RTX 3080 FTW3 ULTRA GAMING | XPG Core Reactor 850W

On 5/1/2023 at 8:47 PM, Sol_NB said:

Like, I remember on the WAN Show they talked about "moats," meaning something a company has that separates it from everyone else and is always its advantage. What is AMD's? I mean that as a literal, serious question, not bait. Other than being cheaper and offering more value, what makes them special in GPUs? Nvidia could wake up one day, price match them, and offer the same amount of RAM. Then what for AMD? Their RT is still worse, and FSR still isn't as good as DLSS. What special sauce do they have over Nvidia if price and specs are equal?

DLSS, RT, and compute: the only reasons to go Nvidia (those 3 > AMD's FSR, etc.)?

I've been running an RTX 3080 10GB at 1440p and 4K for a long time. Older games are fine at 4K; newer stuff I run at 1440p.

I haven't had an issue with my 3080 that made me say, "Damn, I need a better card." I don't play on max settings unless I'm over the monitor refresh rate (165/120Hz).

No CPU, mobo, or RAM atm

2TB WD Black Gen 4 NVMe

2TB Seagate HDD

Corsair RM750x

be quiet! 500DX

Gigabyte M34WQ 3440x1440

Xbox Series X

  • 3 months later...

Got mine for 500 euros, and no issues so far with VRAM, but I haven't played those 2023 releases that everyone talks about. The most intensive game I've played is Cyberpunk. The only problem is the heat, but that might be my 7700X.

It is still an upper mid/high-end card. Whether it is "worth" it depends on the price.

You can find them for $400 USD if you're patient. Near that price, it's hard to beat on price/performance.
