
Is max playable resolution more vram or raw power dependent?

Solved by Blazepoint5

Just curious, since I've heard so much about Nvidia and how they limit VRAM. A quick Google search didn't bring up anything useful, so I thought I would come here. Thanks!


2 minutes ago, Fat Cat11997 said:

Just curious, since I've heard so much about Nvidia and how they limit VRAM. A quick Google search didn't bring up anything useful, so I thought I would come here. Thanks!

Max playable resolution depends on both VRAM and raw power. Here's a breakdown of how each contributes:

 

VRAM stores the frame buffer, which holds the image that is displayed on the screen, along with textures and other assets. Higher resolutions require more VRAM, as there are more pixels to store in the frame buffer. So if your GPU doesn't have enough VRAM, you may not be able to run games at the highest resolutions, even if the GPU core is powerful enough.

 

The raw power of your GPU, primarily its processing power, is crucial for rendering the game at high resolutions. More pixels mean more work for the GPU, as it needs to calculate the color and other properties of each pixel. If your GPU isn't powerful enough, it won't be able to render the game at high resolutions in real time, even if you have enough VRAM.


12 minutes ago, Blazepoint5 said:

Max playable resolution depends on both VRAM and raw power. Here's a breakdown of how each contributes:

 

VRAM stores the frame buffer, which holds the image that is displayed on the screen, along with textures and other assets. Higher resolutions require more VRAM, as there are more pixels to store in the frame buffer. So if your GPU doesn't have enough VRAM, you may not be able to run games at the highest resolutions, even if the GPU core is powerful enough.

 

The raw power of your GPU, primarily its processing power, is crucial for rendering the game at high resolutions. More pixels mean more work for the GPU, as it needs to calculate the color and other properties of each pixel. If your GPU isn't powerful enough, it won't be able to render the game at high resolutions in real time, even if you have enough VRAM.

Good example that is happening right now: the RTX 3070 8GB vs. the 3060 12GB. In games that use a lot of VRAM at, say, 1440p, the 3070 can be overwhelmed to the point that a 3060 is faster. The fix is to lower texture settings and see if that gets you within the limit.

 

Now the 3080 10GB is also starting to have this happen in a handful of titles that it can otherwise run just fine, but textures need to go down because its VRAM is full.


storing frames actually isn't *that* memory intensive, so as long as the game does nothing else under the hood when you change resolution (like, if bumping up to 1440p also loads higher-res textures), it's pretty much entirely core load.

 

for example:

a frame of 1080p with 8 bits per color channel is about 6 MB

a frame of 1440p with 8 bits per color channel is about 11 MB

a frame of 2160p with 8 bits per color channel is about 25 MB

 

so while this number rises significantly... it's a very small number compared to the VRAM available these days.
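if you want to sanity-check the arithmetic yourself, here's a quick sketch (assumes 3 color channels at 8 bits each, no alpha or depth buffer, and `frame_bytes` is just a name i made up):

```python
# Uncompressed frame buffer size: width x height x channels x bytes per channel.
# Assumes plain RGB (3 channels, 8 bits each); real swapchains often use RGBA
# and multiple buffers, so treat these as lower-bound estimates.
def frame_bytes(width, height, channels=3, bits_per_channel=8):
    return width * height * channels * (bits_per_channel // 8)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "2160p": (3840, 2160)}.items():
    print(f"{name}: {frame_bytes(w, h) / 1e6:.1f} MB")
# 1080p: 6.2 MB, 1440p: 11.1 MB, 2160p: 24.9 MB
```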

 

case in point: i've been doing both 1440p and 2160p gaming on my GTX 970, and while the vram limits how high i can put texture settings, this problem is fairly universal across resolution choices. on the flip side, core load scales pretty linearly with pixel count.
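to put numbers on that "scales pretty linearly with pixel count" bit, here's a rough sketch (assuming shading cost really does track pixel count one-to-one, which is an approximation):

```python
# Relative GPU shading load per resolution, assuming cost ~ pixel count.
pixels = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "2160p": 3840 * 2160}
base = pixels["1080p"]
for name, count in pixels.items():
    print(f"{name}: {count / base:.2f}x the pixels of 1080p")
# 1080p: 1.00x, 1440p: 1.78x, 2160p: 4.00x
```

so going from 1080p to 4K roughly quadruples the per-frame shading work, while the frame buffer itself stays tiny next to total VRAM.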


I have a 3070 Ti, and people smack talk them all the time for being an 8GB card. But it can do 4K gaming, you just gotta tweak things a bit.

AMD R7 5800X3D | Thermalright Aqua Elite 360, 3x TL-B12, 2x TL-K12
Asus Crosshair VIII Dark Hero | 32GB G.Skill Trident Z @ 3733C14 1.5v
Zotac 4070 Ti Trinity OC @ 3045/1495 | WD SN850 1TB, SN850X 2TB
Seasonic Vertex GX-1000 | Fractal Torrent Compact, 2x TL-B14, TL-D14X

