
Is the GTX 970's 3.5GB VRAM really a problem at 1080p?

Matthew_C123

I just purchased a GTX 970 and already knew about the 3.5GB of VRAM. Is this really an issue or a bottleneck?


Not at 1080p, only in some rare, extreme instances.

Intel Core i7-5820K (4.4 GHz) | Gigabyte GTX 970 G1 Gaming | Corsair Vengeance LPX 16GB  | 2x 360mm Custom Loop (Noctua iPPC) | ASRock X99 Extreme6 | Samsung 840 EVO 250GB | Fractal Design Define S | Corsair HX750 | Windows 10 | Corsair M65 RGB PRO | Corsair K70 RGB LUX (CherryMX Brown) | Beyerdynamic Custom One Pro & Creative Sound Blaster Z | Nexus 6P (32GB Aluminium) | Check out my setup: Project Kalte Here!


No, unless you are running some really heavy mods that eat a lot of VRAM, like some Skyrim mods, for example. I am running a 2GB GTX 770 and I haven't encountered a lack-of-VRAM problem so far.


No. Another 0.5GB of VRAM probably won't mean the difference between playable and unplayable in anything, honestly. If a 3.5GB card can't handle it, you probably need to step up more than just to 4GB. I think I used about 2.5GB in TW3 (Novigrad) at 1440p, so I assume you'd have to try in order to use that much VRAM at 1080p.

 CPU:  Intel i7-4790K      Cooler:  Noctua NH-D14     GPU: ZOTAC GTX 1070 TI MINI     Motherboard:  ASUS Z97 Gryphon     RAM:  32GB G Skill Trident X     

Storage: 2x 512GB Samsung 850 EVO (RAID 0) / 2TB Seagate Barracuda     PSU: 850W EVGA SuperNova G2     Case: Fractal Design Node 804


No, you'd be fine at low-end 2k.

 

What does 2k even mean? Some use it to mean 1920x1080, some to mean 2560x1440, but in reality it is 2048×1080. It's similar to how TVs marketed as "4K" are usually UHD, while 4K is actually 4096x2160, not 3840x2160. If you mean 1440p, please stop, as that should be 2.5K if anything... I almost burst a blood vessel reading phone leaks last year.

 

/rant over



Main reason I went for an AMD 290X.

 

I considered a 970, and at the time they were about £50 more than a 290X, so the 290X won out. I also didn't like the feeling I would have had knowing I'd been short-changed on VRAM. No matter how you look at it, NVIDIA cocked up on this issue.

DISPLAYS: LG 27UL500 IPS 4k60hz + HDR and LG 27GL650F IPS 1080p 144hz + HDR

 

LAPTOP: Lenovo Legion 5 CPU: AMD Ryzen 7 5800H GPU: RTX 3070 8GB RAM: 16GB 3200MHz (2x8GB DDR4) STORAGE: 1TB Crucial P5 NVMe SSD + 2TB Samsung 970 evo plus NVMe SSD DISPLAY: 1080p 165hz IPS OS: Windows 10 Pro x64


I just purchased a GTX 970 and already knew about the 3.5GB of VRAM. Is this really an issue or a bottleneck?

Not really. When the 970 released, it was faster and cheaper than most GPUs; nowadays, the R9 390 is as cheap and as fast, with more VRAM.

 

I'm looking forward to the new NVIDIA GPUs coming out soon.

 

https://www.reddit.com/r/buildapc/comments/2tu86z/discussion_i_benchmarked_gtx_970s_in_sli_at_1440p/?ref=search_posts

 

This is a good post about it for 1440p and above. At 1080p it's still an issue, but only in some very rare cases.

Recommend what is best, not what you prefer.

"Like" comments to show your support of them or the idea they express.


It has 4GB, NOT 3.5GB.

 

There's a difference in speed, but it's still usable.

Not arguing, just saying... that isn't exactly true. Yeah, it has 4 gigs, but those last 500 megs are so brutally slow that once you start actually using that portion of the VRAM, your performance will tank hard. So saying it's "usable" isn't exactly true: it is usable, but it will hurt your performance horribly.

 

But, OP, it's fine. At 1080p it isn't an issue at all.
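To put rough numbers on the post above: the 970's first 3.5GB segment is commonly cited at about 196 GB/s and the last 0.5GB segment at about 28 GB/s. Here is a toy estimate of average read bandwidth once an allocation spills into the slow segment, assuming a naive uniform-access model (my assumption; the real driver keeps hot data in the fast segment, so this is a worst case, not how the card actually behaves):

```python
# Toy worst-case model of the GTX 970's segmented VRAM.
# Segment sizes and bandwidths are the commonly cited figures;
# uniform access across the whole allocation is an assumption.

FAST_GB, FAST_BW = 3.5, 196.0   # fast segment size (GB) and bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0    # slow segment size and bandwidth

def avg_bandwidth(used_gb: float) -> float:
    """Harmonic-mean bandwidth if accesses are spread uniformly
    over `used_gb` of allocated VRAM."""
    if used_gb <= FAST_GB:
        return FAST_BW
    fast_frac = FAST_GB / used_gb
    slow_frac = (used_gb - FAST_GB) / used_gb
    # Time per byte adds, so bandwidths combine harmonically.
    return 1.0 / (fast_frac / FAST_BW + slow_frac / SLOW_BW)

for gb in (3.0, 3.5, 3.8, 4.0):
    print(f"{gb:.1f} GB used -> ~{avg_bandwidth(gb):.0f} GB/s average")
```

Under this model, even a 0.3GB spill drags the uniform-access average from 196 down to roughly 133 GB/s, which matches the "performance will tank" observation above; real-world drops are milder because the driver prefers the fast 3.5GB for frequently touched resources.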

Rig: i7 13700k - - Asus Z790-P Wifi - - RTX 4080 - - 4x16GB 6000MHz - - Samsung 990 Pro 2TB NVMe Boot + Main Programs - - Assorted SATA SSD's for Photo Work - - Corsair RM850x - - Sound BlasterX EA-5 - - Corsair XC8 JTC Edition - - Corsair GPU Full Cover GPU Block - - XT45 X-Flow 420 + UT60 280 rads - - EK XRES RGB PWM - - Fractal Define S2 - - Acer Predator X34 -- Logitech G502 - - Logitech G710+ - - Logitech Z5500 - - LTT Deskpad

 

Headphones/amp/dac: Schiit Lyr 3 - - Fostex TR-X00 - - Sennheiser HD 6xx

 

Homelab/ Media Server: Proxmox VE host - - 512 NVMe Samsung 980 RAID Z1 for VM's/Proxmox boot - - Xeon e5 2660 V4- - Supermicro X10SRF-i - - 128 GB ECC 2133 - - 10x4 TB WD Red RAID Z2 - - Corsair 750D - - Corsair RM650i - - Dell H310 6Gbps SAS HBA - - Intel RES2SC240 SAS Expander - - TreuNAS + many other VM’s

 

iPhone 14 Pro - 2018 MacBook Air


What does 2k even mean? Some use it to mean 1920x1080, some to mean 2560x1440, but in reality it is 2048×1080. It's similar to how TVs marketed as "4K" are usually UHD, while 4K is actually 4096x2160, not 3840x2160. If you mean 1440p, please stop, as that should be 2.5K if anything... I almost burst a blood vessel reading phone leaks last year.

 

/rant over

If you're using 2k to mean 1080p, you need to stop. That is not 2k; 2k is 1440p.


I constantly hit 3.7 or 3.8GB while playing GTA V and notice no more than 5-10fps drops, and that's from the increased vegetation that's demanding the extra VRAM...
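If you want to watch VRAM usage the way the poster above describes, NVIDIA's `nvidia-smi` tool can report it. A minimal sketch that queries and parses its CSV output (the query flags are standard `nvidia-smi` options; the example numbers are illustrative, not measurements):

```python
import subprocess

def parse_vram_line(line: str):
    """Parse one 'used, total' CSV line from nvidia-smi,
    e.g. '3701, 4096' -> (3701, 4096) in MiB."""
    used, total = (int(field.strip()) for field in line.split(","))
    return used, total

def vram_usage_mib():
    """Return (used, total) VRAM in MiB for GPU 0 via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_vram_line(out.splitlines()[0])

# Illustrative example: a 970 sitting near the 3.7GB mark,
# as in the GTA V report above (hypothetical values).
used, total = parse_vram_line("3701, 4096")
print(f"{used}/{total} MiB ({used / total:.0%})")
```

Run in a loop (or via `nvidia-smi -l 1`) while playing, and you can see exactly when an allocation crosses the 3.5GB boundary.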

Gaming PC: CPU: i7 4770k@4.2GHz w/ CM Nepton 140xl, GPU: Gigabyte 1070 @2050, RAM: ADATA XPG V1 16GB@2133MHz, Mobo: MSI Z97 Gaming 7, Case: Corsair NZXT S340.


If you're using 2k to mean 1080p, you need to stop. That is not 2k; 2k is 1440p.

 

No, it is not. That is a very common misconception, though.

 

The naming comes from a crude approximation of horizontal pixels: 4096 (or 3840, I guess) is about 4,000, so we say 4K; therefore 2K should be half the horizontal pixels, implying 2048 (or 1920).

 

If you have followed monitors recently, 5K displays have a resolution of 5120x2880 (closest to 5,000, so we round to 5K); therefore 1440p would relate to 5K by having half the horizontal pixels, making it 2.5K... there is no reason to call 1440p 2K except that it is easy to say in marketing. It is not technically accurate in any way, and it is ambiguous to boot.

 

I am not trying to be a jerk so please don't read my words that way, I just feel the term gets tossed around and doesn't have a consistent meaning right now.

 

EDIT: It's wikipedia official https://en.wikipedia.org/wiki/2K_resolution

and another source I am less familiar with http://www.streamingmedia.com/Articles/Editorial/What-Is-.../What-is-2K-and-4K-Video-88297.aspx
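The rounding logic in the post above (label a resolution by its horizontal pixel count, to the nearest half-thousand) can be sketched in a few lines. The half-K granularity is my reading of how the poster is rounding, and the resolution list is illustrative:

```python
def k_label(width: int) -> str:
    """Label a resolution by horizontal pixels, rounded to the
    nearest half-thousand (so 2560 -> 2.5K, 3840 -> 4K)."""
    halves = round(width / 500)   # number of 0.5K units
    return f"{halves / 2:g}K"

for name, width in [("DCI 2K", 2048), ("1080p", 1920),
                    ("1440p", 2560), ("UHD", 3840),
                    ("DCI 4K", 4096), ("5K", 5120)]:
    print(f"{name:7s} {width:4d} px -> {k_label(width)}")
```

Under this convention, both 2048 and 1920 land on 2K, both 4096 and 3840 land on 4K, and 2560 lands on 2.5K, which is exactly the post's argument.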



No, you'd be fine at low-end 2k.

2k is 1080p.

Gaming PC: • AMD Ryzen 7 3900x • 16gb Corsair Vengeance RGB Pro 3200mhz • Founders Edition 2080ti • 2x Crucial 1tb nvme ssd • NZXT H1• Logitech G915TKL • Logitech G Pro • Asus ROG XG32VQ • SteelSeries Arctis Pro Wireless

Laptop: MacBook Pro M1 512gb


No it is not. That is a very common misconception though. 

 

The naming comes from a crude approximation of horizontal pixels: 4096 (or 3840, I guess) is about 4,000, so we say 4K; therefore 2K should be half the horizontal pixels, implying 2048 (or 1920).

 

If you have followed monitors recently, 5K displays have a resolution of 5120x2880 (closest to 5,000, so we round to 5K); therefore 1440p would relate to 5K by having half the horizontal pixels, making it 2.5K... there is no reason to call 1440p 2K except that it is easy to say in marketing. It is not technically accurate in any way, and it is ambiguous to boot.

 

I am not trying to be a jerk so please don't read my words that way, I just feel the term gets tossed around and doesn't have a consistent meaning right now.

Yeah, and the standard is 1440p=2k. The spec isn't changing. Everyone accepts it, even professionals.


If you are not playing games like a super-modded Skyrim with high-res textures, or Shadow of Mordor (at 1440p), you're fine. If you're playing Star Citizen, you might have issues.

lives on

BAKABT

 


No. If you're that worried, the 390 is another option. 

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


I just purchased a GTX 970 and already knew about the 3.5GB of VRAM. Is this really an issue or a bottleneck?

 

I don't think it's ever been proven to be an "issue" at 2160p, for that matter.

 

Though by "issue" in this context, I mean a situation that is unplayable because of the 3.5/0.5GB VRAM partition, but that would have been playable if all of the VRAM ran at the same speed. Obviously it's easy to find games/settings that are unplayable at that resolution on the 970 anyway.

 

Yeah, and the standard is 1440p=2k. The spec isn't changing. Everyone accepts it, even professionals.

 

Very rarely do I hear professionals use the term 2K to describe any resolution. It's very careless terminology. Even 4K isn't really "correct."


Yeah, and the standard is 1440p=2k. The spec isn't changing. Everyone accepts it, even professionals.

 

Only if by professionals you mean phone reviewers. I believe Linus actually says 2.5K, but I don't have a source for that. Here are two that legitimize 2K as 2048x1080. The consumer resolution is 1920x1080, so it could probably go by that name too, although few would say 2K since 1080p is so common a term.

 

 



 

Only if by professionals you mean phone reviewers. I believe Linus actually says 2.5K, but I don't have a source for that. Here are two that legitimize 2K as 2048x1080. The consumer resolution is 1920x1080, so it could probably go by that name too, although few would say 2K since 1080p is so common a term.

 

 

 

In all my days of watching Linus, I have not once heard him use the term '2.5K'. You also know there are 'professional' and 'commercial' resolution standards, right?


In all my days of watching Linus, I have not once heard him use the term '2.5K'. You also know there are 'professional' and 'commercial' resolution standards, right?

 

I said I didn't have a source for that, but there are a LOT of WAN Shows and monitor reviews, and I don't have time to check. I don't know why you question whether I know the difference between professional and consumer/commercial resolutions, since I have acknowledged those all along: 4096x2160 is professional and 3840x2160 is the consumer equivalent; 2048x1080 is professional and 1920x1080 is the consumer equivalent.

I think anyone would agree monitor resolution naming is an absolute mess, with HD, FHD, QHD, UHD, etc., plus 2K, 2.5K, 4K, 5K, and also 1080p, 1440p, 2160p, 2880p... there is little consistency and a lot of bad terms in general. But 1440p is 2.5K, if it is anything. I would be fine if 2K/2.5K were never said again, as 1080p is already ubiquitous as a term, and 1440p is moderately so. My problem with them is the ambiguity, evidenced by our disagreement.



NVIDIA messed up pretty badly, but the 970 is still a very strong card; that price-to-performance ratio is hard to beat (NVIDIA-wise), though.

