
More evidence to throw onto the pile of why I question VRAM utilization reports as a measure of what a game actually needs in order to run smoothly (from https://www.guru3d.com/articles_pages/the_division_2_pc_graphics_performance_benchmark_review,6.html)

 

[attached benchmark chart from the linked Guru3D article]

If the game really needed and used 12GB of VRAM, why isn't it a slogfest on the RTX 2080 and GTX 1660?

 

Which brings up another point: we assume that when a game maxes out the video card's VRAM and wants more, it will start swapping data from VRAM to system RAM. And we assume that when that happens, it's like when the OS starts paging system RAM out to storage: the game starts hitching and becoming unresponsive.

 

However, in modern games I don't think everything in VRAM is strictly necessary to render any given frame. The only things I do know are absolutely necessary are the render targets and the base-quality assets for textures and meshes. The renderer doesn't need the higher-quality assets as long as it has the base-quality ones to render the image; the image will just look fugly. So I think there's an absolute minimum amount of VRAM needed at a given resolution, but beyond that, the game uses the rest simply as cache space, because locality still matters, and if the RAM is there, why not use it?
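To make that concrete, here's a minimal sketch of the idea (my own toy model, not how any real engine is written): VRAM as a pool where render targets and base mips are mandatory, while high-detail mips are cached opportunistically and can be evicted under pressure without ever blocking a frame; evictions only cost image quality. The class name and sizes are hypothetical.

```python
class VramPool:
    """Toy model: mandatory allocations vs. opportunistic high-mip cache."""

    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.mandatory = {}  # asset -> size: render targets, base-quality mips
        self.cache = {}      # asset -> size: high-detail mips, evictable
                             # (insertion order; FIFO eviction for simplicity)

    def used(self):
        return sum(self.mandatory.values()) + sum(self.cache.values())

    def add_mandatory(self, name, size):
        # This is the true minimum the game needs at a given resolution.
        assert self.used() + size <= self.capacity, "true VRAM minimum exceeded"
        self.mandatory[name] = size

    def add_cached(self, name, size):
        # Evict cached high-detail mips until the new one fits; if it
        # still doesn't fit, just skip it -- rendering continues regardless.
        while self.used() + size > self.capacity and self.cache:
            oldest = next(iter(self.cache))
            del self.cache[oldest]
        if self.used() + size <= self.capacity:
            self.cache[name] = size

    def quality(self, name):
        # Full quality if the high mip is resident, base quality otherwise.
        return "high" if name in self.cache else "base"
```

On a card with lots of VRAM, everything ends up cached and the utilization counter reads "full"; on a smaller card the same scene still renders, just with some assets stuck at base quality, which matches the benchmark numbers above better than the "needs 12GB" reading does.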
