RTX 3080 **10GB VRAM ENOUGH?!**

makadee
Solved by Exeon:
On 10/30/2020 at 4:30 PM, Leisor said:

(which will definitely be the case with 10GB) than not being able to play poorly optimised games due to a small VRAM pool.

 

Facts are the only thing that really matters.

The fact is most games don't even reach 8GB, let alone 10GB, at 1440p.

 

The only game that currently reaches 10GB is Crysis Remastered; however, since the 3080 can't even reach 60FPS at max settings, that isn't a realistic scenario.

Upcoming games like Watch Dogs: Legion don't get to 10GB at 4K, let alone 1440p.

 

If you follow gaming trends, we won't reach 10GB at 1440p for some years to come, by which time one could consider upgrading again.

DLSS also cuts the amount of VRAM used by quite a bit, and we don't yet know how many titles will adopt the technology (AMD is also working on something similar).

 

"Not being able to play" is a big overstatement. When I bought the 980 Ti at release five years ago, I had to turn down settings on unoptimized games.

This wasn't because my card couldn't handle it; no card could, and even today's cards can't, not at max settings.

 

 

At this point people shouldn't pre-order; wait for AMD's benchmarks. I hope they are as promising as AMD makes them seem.

I'm still waiting to see how they perform with ray tracing turned on.

 

 

3 minutes ago, ewitte said:

Yes, 10GB is fine; there is barely anything that pushes 8GB, is what I said. *Need* vs. allocation.

Also, what do you think about games coming out in the next 6 months, the true next-gen games? Do you think VRAM could be an issue by then for someone playing at 1440p like me?

Ryzen 3900XT OC to 4.4Ghz -B550 Aorus Pro - 32Gb 3600Mhz Corsair - RTX 3080 ASUS TUF OC - O11D Dynamic Tempered Glass - 2x 1TB m.2 SSD - 850W GOLD Corsair - Alien Ware Ultra Wide 3440x1440p - MSI MAG Core Liquid 360R RGB - Razer X Chroma - Razer DeathAdder Elite - Windows 10 Pro


Well, if your VRAM is overfilled, the game will fall back to system RAM, which is much slower than VRAM, and then you get blurry textures and those kinds of problems.


1 minute ago, makadee said:

On my 1440p monitor I ran FS on high settings and was getting around 65FPS; sometimes it would dip to around 45FPS, but even an RTX 3090 has this issue with this game... Am I right about that?

There is much more to it than FPS: if the VRAM is overused, frametimes usually spike and the game doesn't feel smooth even though the framerate is high. It can also lead to the problems described by another user here (missing textures, pop-ins...).
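To make the frametime point concrete, here is a minimal sketch (illustrative numbers, not a real capture) of how 1%-low FPS exposes stutter that the average FPS hides:

```python
# Hypothetical sketch: a "smooth" 100 FPS run has ~10 ms frametimes;
# occasional 50 ms frames feel like stutter even though average FPS looks fine.

def frametime_stats(frametimes_ms):
    """Return (average FPS, 1%-low FPS) from a list of frametimes in ms."""
    ordered = sorted(frametimes_ms, reverse=True)        # worst frames first
    worst_1pct = ordered[: max(1, len(ordered) // 100)]  # slowest 1% of frames
    avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
    low_1pct_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))
    return avg_fps, low_1pct_fps

# Mostly 10 ms frames with a few 50 ms spikes (VRAM-overflow-style stutter)
capture = [10.0] * 195 + [50.0] * 5
avg, lows = frametime_stats(capture)
# average FPS stays high while the 1% lows collapse: "high FPS but not smooth"
```

This is the same idea benchmark overlays report as "1% lows" next to average FPS.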


4 minutes ago, Leisor said:

Horizon Zero Dawn, for example, is very grateful for more VRAM. I experienced frametime spikes at 1440p due to a shortage of VRAM.

https://www.techpowerup.com/review/horizon-zero-dawn-benchmark-test-performance-analysis/4.html

 

So you have a card with less than 8GB of memory?

 

6 minutes ago, Leisor said:

Flight Simulator

https://www.guru3d.com/articles-pages/microsoft-flight-simulator-(2020)-pc-graphics-performance-benchmark-review,4.html

 

Exactly, that's a simulator.

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


1 minute ago, Leisor said:

There is much more to it than FPS: if the VRAM is overused, frametimes usually spike and the game doesn't feel smooth even though the framerate is high. It can also lead to the problems described by another user here (missing textures, pop-ins...).

The only way to tell is to drop the texture resolution and rerun the benchmark. If there is just a small difference you're fine; you'll be seeing a 10-20%+ difference if it is memory limited.
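That test can be sketched as a simple comparison; the function name, numbers, and the 10% threshold are illustrative, taken from the rule of thumb above:

```python
# Sketch of the test described above: rerun the benchmark with lower
# texture resolution and compare. FPS figures here are made up.

def looks_vram_limited(fps_high_textures, fps_low_textures, threshold=0.10):
    """True if dropping texture quality alone recovers more than threshold FPS."""
    gain = (fps_low_textures - fps_high_textures) / fps_high_textures
    return gain > threshold

looks_vram_limited(60, 62)   # ~3% gain: probably fine
looks_vram_limited(60, 75)   # 25% gain: likely memory limited
```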

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


1 hour ago, ewitte said:

The only way to tell is to drop the texture resolution and rerun the benchmark. If there is just a small difference you're fine; you'll be seeing a 10-20%+ difference if it is memory limited.

What does a frametime spike mean?



2 hours ago, Jurrunio said:

My GPU (RTX 2080 Super) has 8GB of VRAM, and anything higher than medium texture settings (at 1440p in HZD) will use or allocate more and more VRAM the longer you play. After a while you definitely feel stutter due to inconsistent frametimes.

We should all hope that Nvidia is going to fill the huge VRAM gap between the 3080 (10GB) and the 3090 (24GB).

The RTX 3070 has the same amount of VRAM as a GTX 1070, even though it performs like an RTX 2080 Ti!

Right now I definitely could not recommend an RTX 3070 or RTX 3080 due to their small VRAM pools; it doesn't match the performance they deliver. AMD seems to have a big advantage this time.


21 minutes ago, Leisor said:

My GPU (RTX 2080 Super) has 8GB of VRAM, and anything higher than medium texture settings (at 1440p in HZD) will use or allocate more and more VRAM the longer you play. After a while you definitely feel stutter due to inconsistent frametimes.

We should all hope that Nvidia is going to fill the huge VRAM gap between the 3080 (10GB) and the 3090 (24GB).

The RTX 3070 has the same amount of VRAM as a GTX 1070, even though it performs like an RTX 2080 Ti!

Right now I definitely could not recommend an RTX 3070 or RTX 3080 due to their small VRAM pools; it doesn't match the performance they deliver. AMD seems to have a big advantage this time.

What about DLSS 2.0?

 

Also, I have been running my RTX 3080 in a few games and have not noticed these stuttering issues, etc. Sometimes it says 9.5GB of VRAM is being used, but I don't notice a single thing. The game runs very well at high FPS, and I'm actually really picky about noticing issues during gaming.

 

This is why I am confused about whether this whole VRAM thing is an actual issue or something that isn't very important. I have tried many games so far and have not noticed a single thing. Sometimes I even run games at 5K resolution, since I have an ultrawide, and everything runs extremely well!

 

So I am a bit confused about what's going on here, and so many people seem to have different opinions. I guess one thing that seems right is the distinction between how much VRAM is allocated vs. used. Maybe games don't need as much VRAM as we might think?



What is a game that uses a lot of VRAM? Let me know and I will put it to the test to see if I notice anything. I will post results here!



3 hours ago, makadee said:

.

The 3080 is gimped; wait for the RX 6xxx if you don't have a G-Sync monitor. There are multiple games with settings that I can crash on the 2080 Ti with its 11GB of VRAM, but there aren't many, and the settings can be adjusted. 12GB is the desired amount.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


1 minute ago, xg32 said:

The 3080 is gimped; wait for the RX 6xxx if you don't have a G-Sync monitor. There are multiple games with settings that I can crash on the 2080 Ti with its 11GB of VRAM, but there aren't many, and the settings can be adjusted. 12GB is the desired amount.

The thing is that I have a very expensive G-Sync monitor and I really don't want to get an AMD card... But at the same time, right now I don't feel so good about what I am hearing about the RTX 3080...

 

I spent so much money on my setup; I sure hope some of this information about the VRAM issues is false. But then again, we'll see soon!

 

Worst case, I can resell my GPU and get a 3080 Ti if something like that is available in about 6 months or less!

 



1 minute ago, makadee said:

The thing is that I have a very expensive G-Sync monitor and I really don't want to get an AMD card... But at the same time, right now I don't feel so good about what I am hearing about the RTX 3080...

I spent so much money on my setup; I sure hope some of this information about the VRAM issues is false. But then again, we'll see soon!

Worst case, I can resell my GPU and get a 3080 Ti if something like that is available in about 6 months or less!

 

Nvidia will need a 3080 Ti to counter the 6900 XT. Since the 3080 isn't really in stock to begin with, you have time for more research. I'm definitely not getting a 3080 10GB; I'd rather cap frame rates, go with a 6900 XT, and forgo RTX/DLSS for one generation. (DLSS looks blurry to me anyway compared to native.)



10 minutes ago, makadee said:

What is a game that uses a lot of vram? Let me know and I will put it to the test to see if i notice anything. I will post results here!

RDR2, that's a big VRAM user

 

11 minutes ago, makadee said:

What about DLSS 2.0?

DLSS essentially runs the game at a lower resolution and then upscales it, so overall VRAM usage is reduced compared to running at native resolution.
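A rough sketch of why that helps: render targets scale with the internal render resolution, not the output resolution. The bytes-per-pixel and target counts below are illustrative assumptions, not engine-accurate numbers:

```python
# Illustrative estimate only: real engines vary widely in target formats
# and counts. The point is the quadratic scaling with render resolution.

def render_target_mb(width, height, bytes_per_pixel=16, num_targets=6):
    """Approximate render-target memory (MB) for a hypothetical deferred renderer."""
    return width * height * bytes_per_pixel * num_targets / 1024**2

native_4k = render_target_mb(3840, 2160)     # render at native 4K
dlss_quality = render_target_mb(2560, 1440)  # DLSS 'Quality' internal res at 4K
savings = native_4k - dlss_quality           # memory the lower render res frees
```

Since 2560x1440 has 1/2.25 the pixels of 3840x2160, the resolution-dependent buffers shrink by the same factor; texture pools are a separate budget.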

 

11 minutes ago, makadee said:

So I am a bit confused with whats going on here and so many people seem to have different opinions.

The builds aren't similar either; even if the hardware is the same, the software side of things is different. It's one of those things where, unless you put everyone in front of the same screen, no one can persuade the other.

 

35 minutes ago, Leisor said:

We should all hope that Nvidia is going to fill the huge VRAM gap between the 3080 (10GB) and the 3090 (24GB).

The RTX 3070 has the same amount of VRAM as a GTX 1070, even though it performs like an RTX 2080 Ti!

Then how could they solve it? Take the 3080 as an example: they could open up more memory bus width and give us 12GB, which I think is the most viable option, though people like you may not be satisfied; go clamshell mode and give us 20GB (at a significantly higher price, since the card would then be much more capable for work); or cut the memory bus width a bit (which also hurts bandwidth) and give us 16GB through clamshell mode. No option will satisfy everyone.
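The options above come down to simple arithmetic: each GDDR6X chip sits on a 32-bit channel, and clamshell mode puts two chips on each channel. A small sketch, assuming 1GB chips as on the 3080:

```python
# Capacity = (bus width / 32-bit channel) chips, doubled in clamshell mode.
# chip_gb=1 matches the 8Gb GDDR6X chips the 3080 shipped with.

def vram_gb(bus_width_bits, chip_gb=1, clamshell=False):
    chips = bus_width_bits // 32 * (2 if clamshell else 1)
    return chips * chip_gb

vram_gb(320)                  # 3080 as shipped: 10 GB
vram_gb(384)                  # wider-bus option: 12 GB
vram_gb(320, clamshell=True)  # clamshell option: 20 GB
vram_gb(256, clamshell=True)  # narrower bus + clamshell: 16 GB
```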



1 minute ago, Jurrunio said:

RDR2, that's a big VRAM user

I run it at maximum settings at 1080p, and it doesn't even allocate all of my VRAM. (system in signature)

CPURyzen 7 5800X Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

MotherboardASRock X570M Pro4 GPUASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO Case: Antec P5 PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC Case Fans: 2x Arctic P12 PWM Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


2 minutes ago, BTGbullseye said:

I run it at maximum settings at 1080p, and it doesn't even allocate all of my VRAM. (system in signature)

1080p with an 8GB card; I wholeheartedly hope it doesn't.



9 minutes ago, Jurrunio said:

1080p with an 8GB card; I wholeheartedly hope it doesn't.

It sits around 6GB allocated. RDR2 is not a VRAM-intensive game. Ghost Recon Breakpoint uses way more. (I can't run it at max settings because it uses more than 8GB.)



Just now, BTGbullseye said:

It sits around 6GB allocated. RDR2 is not a VRAM-intensive game. Ghost Recon Breakpoint uses way more. (I can't run it at max settings because it uses more than 8GB.)

Oh yeah there's that one



Between 1440p and VR, I've seen my VRAM usage go over 10GB multiple times on my 1080 Ti, and going forward, 10GB on a card as powerful as the RTX 3080 is certainly a concern for me personally.

 

Nvidia knew AMD had an ace up its sleeve... this is the way they found to cut cost without cutting profit margin.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


2 hours ago, Jurrunio said:

RDR2, that's a big VRAM user

 

I can run that game without a single issue on all ultra settings at 3440x1440, usually around 80-100FPS; it sometimes drops to maybe 70, but no stuttering, texture issues, or any other problems. So does this mean the VRAM is OK?



2 hours ago, BTGbullseye said:

It sits around 6GB allocated. RDR2 is not a VRAM-intensive game. Ghost Recon Breakpoint uses way more. (I can't run it at max settings because it uses more than 8GB.)

What about the new Watch Dogs?



2 hours ago, i_build_nanosuits said:

Between 1440p and VR, I've seen my VRAM usage go over 10GB multiple times on my 1080 Ti, and going forward, 10GB on a card as powerful as the RTX 3080 is certainly a concern for me personally.

Nvidia knew AMD had an ace up its sleeve... this is the way they found to cut cost without cutting profit margin.

But there is really nothing that shows 10GB is not enough yet! Not one video showing a game not running properly because it is limited by VRAM, etc. That's the part I'm confused about. Because, as I learned today, there is a big difference between how much VRAM is allocated and how much is actually being used.

 

So far I have no issues. I can try Ghost Recon Breakpoint and let you guys know if I notice any stuttering.

 

I feel like, if VRAM were such a big deal, how come big YouTubers are not talking about it?



6 hours ago, Jarsky said:

Keep in mind VRAM is just the highest level of memory for your GPU. Textures and such will still live in system memory and virtual memory (the pagefile) if needed. You'll never "run out" of video memory.

That's pretty bad game design, imo. I play a lot of Monster Hunter World, which is basically almost open world (huge areas), and you never get any loading screens or pauses between different regions of these areas or when monsters appear, etc. It's all loaded into VRAM before you start.

 

I absolutely love it and I would do exactly the same if I were to make a game. 

 

However, yes, the downside is that if you go over the amount of RAM your GPU has, you will experience massive issues (most likely even a crash).

 

There's a dynamic setting, which I'm not sure exactly what it does, but I'm using it and I'm still running into issues when I go over the max VRAM usage. (I guess there's still stuff in the background that uses system RAM, maybe AI logic, etc.) But the fact is the level is loaded fully into VRAM from the get-go (usually around 4-6GB; depending on your settings it can be a lot more, of course).

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


9 minutes ago, Mark Kaine said:

That's pretty bad game design, imo. I play a lot of Monster Hunter World, which is basically almost open world (huge areas), and you never get any loading screens or pauses between different regions of these areas or when monsters appear, etc. It's all loaded into VRAM before you start.

I absolutely love it and I would do exactly the same if I were to make a game.

However, yes, the downside is that if you go over the amount of RAM your GPU has, you will experience massive issues (most likely even a crash).

There's a dynamic setting, which I'm not sure exactly what it does, but I'm using it and I'm still running into issues when I go over the max VRAM usage. (I guess there's still stuff in the background that uses system RAM, maybe AI logic, etc.) But the fact is the level is loaded fully into VRAM from the get-go (usually around 4-6GB; depending on your settings it can be a lot more, of course).

 

It's a bit trickier than that with the new cards. Remember that the RTX 3000 series has DirectStorage capability now, so it can load textures over the PCIe bus directly from your drive without going via the CPU and memory first. So if your game files are on a PCIe 4.0 NVMe drive, your card can effectively load texture data at up to 5GB/s, letting it swap in needed textures and discard unnecessary ones on the fly considerably more easily. It will be interesting to see how DirectStorage on the new generation of cards affects the need for huge VRAM.
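A back-of-the-envelope sketch of that streaming argument, using the 5GB/s figure above (compression and overhead ignored, so this is a best-case estimate):

```python
# At PCIe 4.0 NVMe speeds, refilling a sizeable chunk of a 10 GB pool
# is a sub-second job, which is why streaming can offset a smaller pool.

def refill_seconds(texture_gb, bandwidth_gbps=5.0):
    """Time (s) to stream texture_gb of data at bandwidth_gbps GB/s."""
    return texture_gb / bandwidth_gbps

refill_seconds(2.0)   # swapping 2 GB of textures takes 0.4 s at 5 GB/s
```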

 

Also, most massive AAA titles are fairly intelligent about memory management. Typically they buffer lower-quality textures in memory to use temporarily while larger, higher-resolution textures load. It doesn't look as pretty, but it generally mitigates stuttering and engine crashes.
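That fallback can be sketched roughly like this; the function, mip sizes, and budget are all hypothetical, not from any real engine:

```python
# Sketch of texture-quality fallback: keep the lowest mip always resident
# and upgrade to higher-quality mips only while they fit the VRAM budget.

def pick_mip(mip_sizes_mb, budget_mb):
    """Return the highest-quality mip level that fits the budget.
    mip_sizes_mb is ordered from highest quality (index 0) downward."""
    for level, size in enumerate(mip_sizes_mb):
        if size <= budget_mb:
            return level
    return len(mip_sizes_mb) - 1   # lowest mip always stays resident

sizes = [64, 16, 4, 1]   # hypothetical per-texture mip chain, in MB
pick_mip(sizes, 100)     # plenty of headroom: full-res mip (level 0)
pick_mip(sizes, 10)      # tight budget: falls back to level 2
```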

Spoiler

Desktop: Ryzen9 5950X | ASUS ROG Crosshair VIII Hero (Wifi) | EVGA RTX 3080Ti FTW3 | 32GB (2x16GB) Corsair Dominator Platinum RGB Pro 3600Mhz | EKWB EK-AIO 360D-RGB | EKWB EK-Vardar RGB Fans | 1TB Samsung 980 Pro, 4TB Samsung 980 Pro | Corsair 5000D Airflow | Corsair HX850 Platinum PSU | Asus ROG 42" OLED PG42UQ + LG 32" 32GK850G Monitor | Roccat Vulcan TKL Pro Keyboard | Logitech G Pro X Superlight  | MicroLab Solo 7C Speakers | Audio-Technica ATH-M50xBT2 LE Headphones | TC-Helicon GoXLR | Audio-Technica AT2035 | LTT Desk Mat | XBOX-X Controller | Windows 11 Pro

 

Spoiler

Server: Fractal Design Define R6 | Ryzen 3950x | ASRock X570 Taichi | EVGA GTX1070 FTW | 64GB (4x16GB) Corsair Vengeance LPX 3000Mhz | Corsair RM850v2 PSU | Fractal S36 Triple AIO | 12 x 8TB HGST Ultrastar He10 (WD Whitelabel) | 500GB Aorus Gen4 NVMe | 2 x 2TB Samsung 970 Evo Plus NVMe | LSI 9211-8i HBA

 


5 hours ago, Jurrunio said:

RDR2, that's a big VRAM user

 

DLSS essentially runs the game at a lower resolution and then upscales it, so overall VRAM usage is reduced compared to running at native resolution.

The builds aren't similar either; even if the hardware is the same, the software side of things is different. It's one of those things where, unless you put everyone in front of the same screen, no one can persuade the other.

Then how could they solve it? Take the 3080 as an example: they could open up more memory bus width and give us 12GB, which I think is the most viable option, though people like you may not be satisfied; go clamshell mode and give us 20GB (at a significantly higher price, since the card would then be much more capable for work); or cut the memory bus width a bit (which also hurts bandwidth) and give us 16GB through clamshell mode. No option will satisfy everyone.

The 3080 Ti has been rumoured to launch with 12GB of GDDR6X.

What do you reckon it could cost? I know this won't happen, but I'm hoping for something below $849; any higher and I'll go AMD, because I don't like the idea of having only 10GB of VRAM going into the next generation, which will use Unreal Engine 5.

I'm just hoping AMD releases its own DLSS equivalent before March 2021, as I'll be building a PC before then.


Just now, Hickaru said:

The 3080 Ti has been rumoured to launch with 12GB of GDDR6X.

Rumors are unreliable; they have already changed once, giving up on the 20GB card idea, for example.

 

1 minute ago, Hickaru said:

What do you reckon it could cost? I know this won't happen, but I'm hoping for something below $849; any higher and I'll go AMD, because I don't like the idea of having only 10GB of VRAM going into the next generation, which will use Unreal Engine 5.

If it's a 12GB model, maybe with more SMs enabled, I expect the MSRP to be around $700 to $750 (it won't be too high, in order to compete with the 6800 XT).

If it's a 20GB model, maybe $900-$1000, because the 6900 XT is $1000.


