Graphics card for 1440p gaming?

Sol_NB
49 minutes ago, GamerDude said:

Yes, I'm aware of that, I think either Daniel Cowen or HUB had talked about it... but isn't one of the selling points of the RTX cards the ability to have great graphics (high textures) AND RT? Don't you think having to sacrifice one or the other kinda defeats the purpose of the 'RTX' branding? Sure, for lower-end cards like the RTX 3050 and RTX 3060 8GB the RTX branding is more or less for show, but that's certainly NOT the case for the RTX 3070 series.

 

IF I were an owner of the RTX 3070, or worse yet the RTX 3070 Ti, I'd feel really pissed off that nVidia had done this. That modded 16GB RTX 3070 shows, very clearly I might add, that the RTX 3070 (let alone the RTX 3070 Ti) can and does benefit from the extra VRAM. With 16GB you don't have to turn down textures for RT, or disable RT in favor of higher textures... a 16GB RTX 3070 (with a proper driver) would let you use both RT and high texture settings.

 

I don't get it. I don't own an RTX 3070 series card, and yet I feel nVidia has stuck it to owners of such cards, yet those owners feel obliged to defend nVidia (or are willing to sacrifice graphics settings to net a playable framerate). I thought the selling points of the RTX cards were access to DLSS and RT at very high graphics settings.

Well, personally I don't care about RT that much. But I get your point. When I got the card in November I thought it was gonna last me for years; now I'm not so sure. And I'd rather not feel forced to buy a new GPU this soon after buying the 3070.


1 hour ago, Sol_NB said:

Well, personally I don't care about RT that much. But I get your point. When I got the card in November I thought it was gonna last me for years; now I'm not so sure. And I'd rather not feel forced to buy a new GPU this soon after buying the 3070.

IF RT isn't your cuppa, you should have gone with an RX 6800, but I understand that at that point in time, a few months back, the RTX 3070 appeared more attractive with features like DLSS and RT (before the VRAM shortfall came front and center as a concern). I'd gotten a Sapphire Nitro+ RX 6800 back when it launched, then within a month or so upgraded to the Nitro+ RX 6900 XT.

 

Both had 16GB of VRAM, so I could have settled for the RX 6800, but I always go for flagship or near-flagship AMD cards, hence the move to the 6900 XT. The MERC 310 RX 7900 XTX I have now has ample VRAM to last me a while, and I believe the GPU will still have the rasterization grunt a year or two down the road, so I'll prolly sit on these cards for the next couple of years...

Main Rig: AMD AM4 R9 5900X (12C/24T) + Tt Water 3.0 ARGB 360 AIO | Gigabyte X570 Aorus Xtreme | 2x 16GB Corsair Vengeance DDR4 3600C16 | XFX MERC 310 RX 7900 XTX | 256GB Sabrent Rocket NVMe M.2 PCIe Gen 3.0 (OS) | 4TB Lexar NM790 NVMe M.2 PCIe4x4 | 2TB TG Cardea Zero Z440 NVMe M.2 PCIe Gen4x4 | 4TB Samsung 860 EVO SATA SSD | 2TB Samsung 860 QVO SATA SSD | 6TB WD Black HDD | CoolerMaster H500M | Corsair HX1000 Platinum | Topre Type Heaven + Seenda Ergonomic W/L Vertical Mouse + 8BitDo Ultimate 2.4G | iFi Micro iDSD Black Label | Philips Fidelio B97 | C49HG90DME 49" 32:9 144Hz Freesync 2 | Omnidesk Pro 2020 48" | 64bit Win11 Pro 23H2

2nd Rig: AMD AM4 R9 3900X + TR PA 120 SE | Gigabyte X570S Aorus Elite AX | 2x 16GB Patriot Viper Elite II DDR4 4000MHz | Sapphire Nitro+ RX 6900 XT | 500GB Crucial P2 Plus NVMe M.2 PCIe Gen 4.0 (OS) | 2TB Adata Legend 850 NVMe M.2 PCIe Gen4x4 | 2TB Kingston NV2 NVMe M.2 PCIe Gen4x4 | 4TB Leven JS600 SATA SSD | 2TB Seagate HDD | Keychron K2 + Logitech G703 | SOLDAM XR-1 Black Knight | Enermax MAXREVO 1500 | 64bit Win11 Pro 23H2

 

 

 

 

 

 


Actually it does look OK, and generally you don't have to do anything: just set the game to 1080p and the monitor will upscale it, just like watching a YouTube video. Yes, there will be some artifacts, but it shouldn't be too bad.

 

That's the huge downside to 1440p: unlike 4K, it doesn't scale 1080p content cleanly, since 1440 isn't an integer multiple of 1080.

 

i.e. it's a good idea to have the right hardware for the resolution, otherwise it would be better to have a 1080p or even a 4K monitor; but again, the upscaling shouldn't be too bad.
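To put rough numbers on that (just an illustration, not tied to any specific game or monitor):

```python
# Why 1080p -> 4K scales cleanly but 1080p -> 1440p doesn't.
res_1080p = (1920, 1080)
res_1440p = (2560, 1440)
res_4k = (3840, 2160)

def scale_factor(src, dst):
    return (dst[0] / src[0], dst[1] / src[1])

print(scale_factor(res_1080p, res_4k))     # (2.0, 2.0) -> integer: each source pixel maps to a clean 2x2 block
print(scale_factor(res_1080p, res_1440p))  # (1.33..., 1.33...) -> non-integer: pixels get interpolated, hence the softness/artifacts
```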

 

 

On 4/22/2023 at 9:04 PM, Inception9269 said:

You'd keep your resolution at 1440p and enable FSR or DLSS in games.

 

If I remember right, the Radeon drivers do have an option to enable FSR across the board. Also, I believe Linus did a video about Nvidia having something similar, where you can enable AI upscaling across the board in the control panel as well.

This is a lot of confusing "info". Yes, there are software solutions, but not all games are supported, and generally you don't have to do anything: the monitor will automatically upscale, and typically that's the best you'll get. It doesn't really make sense to bother with software/driver stuff, which is probably mostly buggy and a hassle.

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


4 hours ago, Mark Kaine said:

Actually it does look OK, and generally you don't have to do anything: just set the game to 1080p and the monitor will upscale it, just like watching a YouTube video. Yes, there will be some artifacts, but it shouldn't be too bad.

 

That's the huge downside to 1440p: unlike 4K, it doesn't scale 1080p content cleanly, since 1440 isn't an integer multiple of 1080.

 

i.e. it's a good idea to have the right hardware for the resolution, otherwise it would be better to have a 1080p or even a 4K monitor; but again, the upscaling shouldn't be too bad.

 

 

This is a lot of confusing "info". Yes, there are software solutions, but not all games are supported, and generally you don't have to do anything: the monitor will automatically upscale, and typically that's the best you'll get. It doesn't really make sense to bother with software/driver stuff, which is probably mostly buggy and a hassle.

There's nothing confusing about it.

Also, like Steve said, AMD has Radeon Super Resolution in its settings, which allows you to enable FSR across the board, even for games that don't have the feature built in. Nvidia has its RTX video enhancement settings in the Nvidia Control Panel.



8 hours ago, GamerDude said:

IF RT isn't your cuppa, you should have gone with an RX 6800, but I understand that at that point in time, a few months back, the RTX 3070 appeared more attractive with features like DLSS and RT (before the VRAM shortfall came front and center as a concern). I'd gotten a Sapphire Nitro+ RX 6800 back when it launched, then within a month or so upgraded to the Nitro+ RX 6900 XT.

 

Both had 16GB of VRAM, so I could have settled for the RX 6800, but I always go for flagship or near-flagship AMD cards, hence the move to the 6900 XT. The MERC 310 RX 7900 XTX I have now has ample VRAM to last me a while, and I believe the GPU will still have the rasterization grunt a year or two down the road, so I'll prolly sit on these cards for the next couple of years...

Yeah, but I can't exactly go back in time. So I'm hoping the 3070 will still be OK at 1440p, at least a little while longer.


15 hours ago, GamerDude said:

Yes, I'm aware of that, I think either Daniel Cowen or HUB had talked about it... but isn't one of the selling points of the RTX cards the ability to have great graphics (high textures) AND RT? Don't you think having to sacrifice one or the other kinda defeats the purpose of the 'RTX' branding? Sure, for lower-end cards like the RTX 3050 and RTX 3060 8GB the RTX branding is more or less for show, but that's certainly NOT the case for the RTX 3070 series.

 

IF I were an owner of the RTX 3070, or worse yet the RTX 3070 Ti, I'd feel really pissed off that nVidia had done this. That modded 16GB RTX 3070 shows, very clearly I might add, that the RTX 3070 (let alone the RTX 3070 Ti) can and does benefit from the extra VRAM. With 16GB you don't have to turn down textures for RT, or disable RT in favor of higher textures... a 16GB RTX 3070 (with a proper driver) would let you use both RT and high texture settings.

 

I don't get it. I don't own an RTX 3070 series card, and yet I feel nVidia has stuck it to owners of such cards, yet those owners feel obliged to defend nVidia (or are willing to sacrifice graphics settings to net a playable framerate). I thought the selling points of the RTX cards were access to DLSS and RT at very high graphics settings.

You could see 8GB becoming a problem from a mile away, which is why I steered clear of the RTX 3060 Ti and RTX 3070 when I did a GPU upgrade last November (I got an RX 6700 XT instead, for $30 less than the cheapest Zotac 3060 Ti). The 8GB was already showing itself to be a problem at 1440p when Doom Eternal got the RTX update and the RTX 3060 12GB was beating the 3070 by 33%. It showed in Resident Evil Village too, where the 6700 XT could do 4K60 fine with RT on while the 3070 was pulling the low 40s and the 3060 the low 50s at 4K. You could see it in the way Far Cry 6 died at 4K with RT, with the 3070 pulling like 6 fps while the 6700 XT could do mid 40s. So I fully expected the 3070 to age like milk, but even I am taken aback that it happened less than six months after I bought my 6700 XT. Most of the big games released recently have choked on 8GB even at 1080p: Callisto, The Last of Us, Plague Tale Requiem, Resident Evil 4, Hogwarts Legacy. I would feel burned like hell if I had bought a 3060 Ti / 3070 / 3070 Ti in 2022 or later, and it would make me never want to buy Nvidia again.

 

But I figured that if RT is the big selling point for Nvidia, and the Nvidia GPUs in the $300 to $500 price class I was shopping in didn't have the VRAM to support RT, then effectively RT was off the table whether I went AMD or Nvidia. So give me the lower price, better drivers, more VRAM, and the free games from AMD (in my case Callisto and Dead Island 2); fuck Nvidia.
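Just to put some very rough numbers on why 8GB fills up so fast with high-res textures plus RT (back-of-the-envelope figures of my own, not from any of the benchmarks above):

```python
# Rough texture memory math (illustrative assumptions, not measured from any game).
MIB = 1024 ** 2

def texture_mib(width, height, bytes_per_texel, mip_overhead=1.33):
    # The mipmap chain adds roughly one third on top of the base level.
    return width * height * bytes_per_texel * mip_overhead / MIB

uncompressed_4k = texture_mib(4096, 4096, 4)  # RGBA8: ~85 MiB with mips
bc7_4k = texture_mib(4096, 4096, 1)           # BC7 block compression: ~21 MiB with mips

print(f"4K texture, uncompressed: {uncompressed_4k:.0f} MiB; BC7: {bc7_4k:.0f} MiB")
# A few hundred resident textures already eats most of an 8 GiB card,
# before framebuffers, geometry, and the BVH data that ray tracing adds on top.
print(f"300 BC7 4K textures ≈ {300 * bc7_4k / 1024:.1f} GiB")
```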


16 minutes ago, SteveGrabowski0 said:

You could see 8GB becoming a problem from a mile away, which is why I steered clear of the RTX 3060 Ti and RTX 3070 when I did a GPU upgrade last November (I got an RX 6700 XT instead, for $30 less than the cheapest Zotac 3060 Ti). The 8GB was already showing itself to be a problem at 1440p when Doom Eternal got the RTX update and the RTX 3060 12GB was beating the 3070 by 33%. It showed in Resident Evil Village too, where the 6700 XT could do 4K60 fine with RT on while the 3070 was pulling the low 40s and the 3060 the low 50s at 4K. You could see it in the way Far Cry 6 died at 4K with RT, with the 3070 pulling like 6 fps while the 6700 XT could do mid 40s. So I fully expected the 3070 to age like milk, but even I am taken aback that it happened less than six months after I bought my 6700 XT. Most of the big games released recently have choked on 8GB even at 1080p: Callisto, The Last of Us, Plague Tale Requiem, Resident Evil 4, Hogwarts Legacy. I would feel burned like hell if I had bought a 3060 Ti / 3070 / 3070 Ti in 2022 or later, and it would make me never want to buy Nvidia again.

 

But I figured that if RT is the big selling point for Nvidia, and the Nvidia GPUs in the $300 to $500 price class I was shopping in didn't have the VRAM to support RT, then effectively RT was off the table whether I went AMD or Nvidia. So give me the lower price, better drivers, more VRAM, and the free games from AMD (in my case Callisto and Dead Island 2); fuck Nvidia.

I mean, I haven't really had issues yet with the stuff I've played. But I was talking more about long-term potential worries.

Also, regarding RE4, I've had zero issues with it on 8GB of VRAM, so I'm not sure where you're getting that one.

Also, explain to me how they have BETTER drivers? I've never heard of drivers being an issue for Nvidia. If anything, I've always heard AMD's drivers were worse.


6 minutes ago, Sol_NB said:

I mean, I haven't really had issues yet with the stuff I've played. But I was talking more about long-term potential worries.

 

Also, explain to me how they have BETTER drivers? I've never heard of drivers being an issue for Nvidia.

Nvidia has higher driver overhead than AMD in DirectX 12. That's why AMD cards tend to win benchmarks at 1080p against Nvidia cards that should be stronger. For example, in TechPowerUp's benchmarks from about a year ago, the RX 6750 XT soundly lost to the RTX 3070 at 4K across their test suite (non-RT) but edged it out at 1080p.

 

https://www.techpowerup.com/review/msi-radeon-rx-6750-xt-gaming-x-trio/31.html

 

It used to be the opposite a few years ago, when AMD's DirectX 11 drivers had crippling overhead that let weaker Nvidia cards outperform them when CPU-bound.
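Here's a toy model of what "driver overhead" does to the numbers (the values are made up purely to illustrate the CPU-bound vs GPU-bound split, not taken from any review):

```python
# Frame rate is limited by whichever is slower: CPU work (game + driver) or GPU work.
def fps(cpu_ms, driver_ms, gpu_ms):
    return 1000 / max(cpu_ms + driver_ms, gpu_ms)

# Hypothetical "faster GPU, heavier driver" vs "slower GPU, lighter driver":
print(fps(cpu_ms=6.0, driver_ms=2.0, gpu_ms=7.0))   # ~125 fps at 1080p: CPU + driver is the limit
print(fps(cpu_ms=6.0, driver_ms=0.5, gpu_ms=7.7))   # ~130 fps: the "weaker" card wins when CPU-bound
print(fps(cpu_ms=6.0, driver_ms=2.0, gpu_ms=20.0))  # ~50 fps at 4K: GPU-bound, the overhead is hidden
print(fps(cpu_ms=6.0, driver_ms=0.5, gpu_ms=22.0))  # ~45 fps: raw GPU speed decides again
```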


I mean, it seems like only recently AMD has been gaining on Nvidia. And it seems like a lot of those issues would be solved by Nvidia not being crazy with pricing and not being stingy with VRAM. I really don't know what advantages AMD has over Nvidia beyond those two fairly easily fixable things, because their RT isn't as good as Nvidia's, DLSS is better than FSR, and Nvidia is doing more AI-related things that are probably gonna pay off big in the long term. Basically what I'm saying is I'm not sure what AMD's "moats" are.


10 minutes ago, Sol_NB said:

I mean, it seems like only recently AMD has been gaining on Nvidia. And it seems like a lot of those issues would be solved by Nvidia not being crazy with pricing and not being stingy with VRAM. I really don't know what advantages AMD has over Nvidia beyond those two fairly easily fixable things, because their RT isn't as good as Nvidia's, DLSS is better than FSR, and Nvidia is doing more AI-related things that are probably gonna pay off big in the long term. Basically what I'm saying is I'm not sure what AMD's "moats" are.

AMD's work pushing the low-level Mantle API in the mid-2010s is likely why they're better with DirectX 12 nowadays, since DX12 is also a very low-level API. Nvidia has put a lot of the onus on gamers to subsidize its branching into machine learning, by making them pay through the nose for tensor cores that aren't doing a whole lot for gaming. DLSS is nice, but paying $150 to $200 more for the tensor cores to run it, when FSR is mostly really good too, isn't worth it to me. Especially when they gimp the VRAM on their gaming cards to keep professional users from buying them.


Not saying AMD isn't an evil corporation just like Nvidia. I remember back in 2001, on the CPU front, AMD was absolutely crushing Intel not just on price-to-performance but on flat-out performance, so they started locking their CPUs. I had to use silver paint, and I think putty if I remember right, to fill in the holes they cut in the PCB to unlock the multiplier and overclock the CPU again. None of these corporations are our friends, so none of them deserve to have us defend them online. AMD is just less shitty at the moment when it comes to GPUs.


Like, the VRAM thing I can't defend. I hope my 8GB will last at least a year or so until I can stomach upgrading again. I just don't know if I could give up the other Nvidia features, which is maybe why I always go with them. But I'll evaluate when the time comes, I guess.


Like, I'm not writing off AMD either. When I upgrade again I'll compare the two.


13 hours ago, Mark Kaine said:

This is a lot of confusing "info". Yes, there are software solutions, but not all games are supported, and generally you don't have to do anything: the monitor will automatically upscale, and typically that's the best you'll get. It doesn't really make sense to bother with software/driver stuff, which is probably mostly buggy and a hassle.

You can still force upscaling via NIS on Nvidia cards or RSR on AMD cards. Obviously it's not as good as using FSR or DLSS most of the time, since you'd be upscaling HUD elements and such too, but it's probably still better than letting the monitor do the upscaling. At least it is for me with RSR.
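If it helps, here's a sketch (purely illustrative, not real API code) of why driver-level RSR/NIS upscales the HUD while in-game FSR/DLSS doesn't:

```python
NATIVE = (2560, 1440)   # monitor resolution
RENDER = (1920, 1080)   # internal render resolution (example values)

def in_game_upscale():
    # FSR/DLSS inside the engine: the 3D scene is upscaled first,
    # then the HUD/text is drawn on top at native resolution -> stays sharp.
    return [("3D scene", NATIVE), ("HUD/text", NATIVE)]

def driver_or_monitor_upscale():
    # RSR/NIS (or the monitor's scaler): the game outputs a finished low-res
    # frame with scene and HUD baked together, and the whole thing is stretched.
    return [("scene + HUD stretched to", NATIVE, "from", RENDER)]

print(in_game_upscale())
print(driver_or_monitor_upscale())
```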


32 minutes ago, SteveGrabowski0 said:

You can still force upscaling via NIS on Nvidia cards or RSR on AMD cards. Obviously it's not as good as using FSR or DLSS most of the time, since you'd be upscaling HUD elements and such too, but it's probably still better than letting the monitor do the upscaling. At least it is for me with RSR.

That was part of my point: why bother when the monitor "should" be able to handle it? Idk, 1080p on my 1440p monitor looks pretty sharp*, I know there are some artifacts but I don't really notice them much, or at all tbh (unlike on some other TVs/monitors, admittedly), hence it's better to try out how the monitor handles the scaling first.

 

*Actually I sometimes have to check whether a video is 1440p because it looks so sharp, and nope, usually it's 1080p. Same for some games (for example, DOA6 only supports 1080p and 4K, but it still looks sharp and clear at 1080p, despite being overall a pretty shitty port).

 

 



5 hours ago, Sol_NB said:

I mean, I haven't really had issues yet with the stuff I've played. But I was talking more about long-term potential worries.

Also, regarding RE4, I've had zero issues with it on 8GB of VRAM, so I'm not sure where you're getting that one.

Also, explain to me how they have BETTER drivers? I've never heard of drivers being an issue for Nvidia. If anything, I've always heard AMD's drivers were worse.

I think Hogwarts and RE4R were patched so that 8GB cards are usable at certain settings; try running them maxed out with RT enabled and issues might crop up. I saw a vid by HUB, and the texture pop-in that makes 8GB cards usable was... appalling. You don't even have to move your character: just standing still in one spot, you'll see textures around you popping in and out.

 

Yes, turning RT off, in RE4R for example, allowed for higher texture settings, even ones easily exceeding the 8GB VRAM buffer, but the moment you enabled RT it would crash (dunno if this issue has been fixed or not). One thing was clear to me even back then: 8GB of VRAM isn't enough. Whether it was deliberate, an oversight, or cost-saving by nVidia at the expense of future usability, I can't say... though I lean toward the belief that it was a deliberate move by nVidia to ensure RTX 3070/3070 Ti owners would have to upgrade sooner rather than later.

As for drivers, well, neither is perfect, but as of late (I was using a GTX 1080 until November last year) I've had no real issues with my AMD or nVidia cards. Issues tend to occur during installation, perhaps from conflicts and such, so it's hard to say either way. I can say that while the previous 2-3 drivers from AMD have been fine, I do have an issue with the screen staying blank after I've left my rig alone for a while (an hour or more), but other than that, no issues with the games I've played on the two systems (RX 6900 XT and RX 7900 XTX).



2 hours ago, GamerDude said:

I think Hogwarts and RE4R were patched so that 8GB cards are usable at certain settings; try running them maxed out with RT enabled and issues might crop up. I saw a vid by HUB, and the texture pop-in that makes 8GB cards usable was... appalling. You don't even have to move your character: just standing still in one spot, you'll see textures around you popping in and out.

Yes, turning RT off, in RE4R for example, allowed for higher texture settings, even ones easily exceeding the 8GB VRAM buffer, but the moment you enabled RT it would crash (dunno if this issue has been fixed or not). One thing was clear to me even back then: 8GB of VRAM isn't enough. Whether it was deliberate, an oversight, or cost-saving by nVidia at the expense of future usability, I can't say... though I lean toward the belief that it was a deliberate move by nVidia to ensure RTX 3070/3070 Ti owners would have to upgrade sooner rather than later.

As for drivers, well, neither is perfect, but as of late (I was using a GTX 1080 until November last year) I've had no real issues with my AMD or nVidia cards. Issues tend to occur during installation, perhaps from conflicts and such, so it's hard to say either way. I can say that while the previous 2-3 drivers from AMD have been fine, I do have an issue with the screen staying blank after I've left my rig alone for a while (an hour or more), but other than that, no issues with the games I've played on the two systems (RX 6900 XT and RX 7900 XTX).

Well, I mean, I'm kinda stuck with this 3070 for the time being because I really don't wanna plop down a grand on a GPU after just buying one 5 months ago. So I guess I just gotta hope I can make it work, lowering settings if needed, and still do RT.


So with these recent VRAM stories and how it's gonna be a potential issue, I've been debating lately whether I should sell my 3070 and get another GPU. I was thinking maybe the 6800 XT or 6950 XT. I could also do a 4070 Ti or 4070, but I'm not too sure about even the 12GB of VRAM on those. Like, right now I'm not having a ton of issues with the 8GB of VRAM on the 3070, but it seems like it's just gonna be an issue going forward, and I'm kinda not sure what to do.


3 minutes ago, Sol_NB said:

Like, right now I'm not having a ton of issues with the 8GB of VRAM on the 3070, but it seems like it's just gonna be an issue going forward, and I'm kinda not sure what to do.

Don't upgrade if you don't have to. You may be able to pick up those cards even cheaper once you actually need to upgrade. Or heck, maybe something new and better for the price than the (by then old) RDNA2 cards will have appeared.


1 minute ago, WereCat said:

Don't upgrade if you don't have to. You may be able to pick up those cards even cheaper once you actually need to upgrade. Or heck, maybe something new and better for the price than the (by then old) RDNA2 cards will have appeared.

I'm just very paranoid about the 8GB VRAM thing. Like, most stuff is fine right now, but games like RDR2 just crash when I put textures to Ultra.


Are you actually running into VRAM issues?

If yes:  Consider upgrading.

If no:  Don't waste your money.  

 

There are only a few situations where 8GB is an issue, and those can still be sidestepped by playing with settings.

 

2-3 years from now? It'll probably be different, but by then you'd also be getting a new tier of card.
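If you want to actually watch the number instead of guessing: MSI Afterburner's overlay shows per-game VRAM use, and on an Nvidia card you can also poll it with a couple of lines of Python (this assumes the nvidia-ml-py package; note it reports allocated VRAM, which isn't quite the same as what a game strictly needs):

```python
# Minimal VRAM check on an Nvidia card (assumes `pip install nvidia-ml-py`).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # used/total reported in bytes
print(f"VRAM used: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GiB")
pynvml.nvmlShutdown()
```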


1 minute ago, Sol_NB said:

I'm just very paranoid about the 8GB VRAM thing. Like, most stuff is fine right now, but games like RDR2 just crash when I put textures to Ultra.

Then don't set them to Ultra?


Between Ultra and High, once you're moving around in game it's next to impossible to see the difference.

In very fast-paced games, even with settings at Medium, it will be hard to notice the Ultra eye candy you're missing.

Ultra settings are good for standing still, looking around and taking screenshots; for actual gameplay they're a waste of time 99% of the time.


I mean, according to all those videos, even dropping to High you run into VRAM issues.


7 minutes ago, WereCat said:

Then don't set them to Ultra?

Sure. But even setting them to High, we're seeing these brand-new games like Hogwarts, Jedi, Dead Space and so on stutter and hit that 8GB, aren't we? I think it's because PC games are being built off the consoles. I'm not sure.

