
Battlefield V with DXR on, tested by TechSpot (Hardware Unboxed)

I think the confusion is between "my GPU is stopping my CPU" and "well, no, it can bottleneck or hog resources". Both of those things do "stop" the CPU, but not in the same way. In fact, they are opposites: a CPU waiting on a too-slow GPU sits idle and does nothing, while a GPU polling the CPU too quickly pins the CPU at 100% resource allocation and hogs it. You are right, I need to get better at explaining that/conversing that way.
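The two failure modes can be sketched with a toy frame-loop model (my own illustration, not from the thread; the function and numbers are invented):

```python
def frame_profile(cpu_ms: float, gpu_ms: float) -> dict:
    """Toy model of one pipelined frame: the CPU prepares the next frame
    while the GPU renders the current one, so the slower side sets the pace."""
    frame_ms = max(cpu_ms, gpu_ms)
    if gpu_ms > cpu_ms:
        # GPU too slow: the CPU finishes early and blocks on the swap
        # chain, so CPU usage looks low even though FPS is low.
        return {"frame_ms": frame_ms, "bound": "gpu-bound",
                "cpu_idle_ms": gpu_ms - cpu_ms}
    # CPU too slow (or being polled flat out): the GPU starves waiting
    # for commands while the CPU sits at 100%.
    return {"frame_ms": frame_ms, "bound": "cpu-bound", "cpu_idle_ms": 0.0}

print(frame_profile(cpu_ms=4.0, gpu_ms=16.0))   # CPU waits on the GPU
print(frame_profile(cpu_ms=16.0, gpu_ms=4.0))   # CPU hogged, GPU starved
```

Same symptom (low FPS), opposite causes: one leaves the CPU idle, the other pegs it.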

As with Simpson's paradox: if I'm at the top looking down, or at the bottom looking up, I get different views. If I step back, I get the full view.

 

So the development cycle is not hiding/holding back/preventing better graphics at a lower resolution (hint: it was AMD suggesting we play at 720p, I wonder why ;)). It is a mix of complex hardware, software, artistic etc. limitations.

 

Look up some older games and how they scale LOD, and look up some of the old art assets. You may be amazed at the code they have to enable effects, resolutions, textures and LOD packs that were disabled because the graphics cards of the time could not push the pixels at any resolution.
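The kind of distance-based LOD scaling those older games shipped can be sketched like this (mesh names and distance thresholds here are invented for illustration):

```python
# Each entry pairs a maximum camera distance with the mesh used inside it.
LOD_LEVELS = [
    (10.0, "tree_high.mesh"),     # closest: full-detail model
    (50.0, "tree_medium.mesh"),
    (200.0, "tree_low.mesh"),
]
FALLBACK = "tree_billboard.mesh"  # beyond all thresholds: flat sprite

def select_lod(distance: float) -> str:
    """Pick the first LOD whose distance threshold covers the object."""
    for max_distance, mesh in LOD_LEVELS:
        if distance <= max_distance:
            return mesh
    return FALLBACK

print(select_lod(5.0))    # tree_high.mesh
print(select_lod(500.0))  # tree_billboard.mesh
```

Raising the thresholds (or adding a higher tier) is how the "disabled" high-detail packs get re-enabled once hardware catches up.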

 

RTX, likewise, is far too early at any resolution, 640p or 8K. :) We can already buy two 1080 Tis (no RTX, twice the performance) instead of one 2080 Ti (RTX, half the performance). It already scales to the hardware/software/cost/performance. What we *don't* get is greater performance *and* ray tracing for the same price. That will take time.

 


For example, AMD is not pushing RTX... yet they have not doubled GPU performance. If RTX stopped Nvidia from improving performance, then AMD would have stepped in and said "no RTX, but twice the performance of a 1080 Ti!". They have not, so something else is limiting Nvidia to roughly 10% incremental refreshes/releases. What could that be? ;)

17 hours ago, TechyBen said:

RTX reduces the cost of development, not increases it. I'm saying that in the development studio, the skew is the opposite of the GPU cost to the user (for RTX). The user pays more for RTX, the developer less (as it's a render pipeline: the game engine does a lot that the artist would previously have had to do by hand, so less artist time = less cost). The costs don't scale the same way for resolution increases, shader increases, poly count or texture count. They have different costs/benefits and different development (or testing) time sinks.

 

See Unity art assets as an example of quick, cheap, bulk "photorealistic" art to order. It is, however, taxing on the GPU. Ray tracing is similar: it's going to tax the GPU at any resolution. It's already being culled/upscaled and interpolated from a lower resolution.

 

Right, but this all started as an argument about the cost/benefit of implementing higher resolutions as compared to higher detail at a lower resolution. The monetary development cost of a graphical feature, while related to the cost/benefit of implementing it, is not directly proportional to it. Some things can be cheap but still not worth it, because they provide no tangible benefit to the end-user experience, and thus to current and future profits from good reviews and the brand loyalty that encourages sales of current and future products.

 

Thus I'm genuinely not sure what any of that has to do with how this whole argument started?

3 hours ago, CarlBar said:

Right, but this all started as an argument about the cost/benefit of implementing higher resolutions as compared to higher detail at a lower resolution. The monetary development cost of a graphical feature, while related to the cost/benefit of implementing it, is not directly proportional to it. Some things can be cheap but still not worth it, because they provide no tangible benefit to the end-user experience, and thus to current and future profits from good reviews and the brand loyalty that encourages sales of current and future products.

 

Thus I'm genuinely not sure what any of that has to do with how this whole argument started?

Well, it is a complicated issue, so fair enough for someone to try and point that out, though my fundamental point was that graphical detail and resolution are not one and the same.

 

It also does no good for me and my gaming experience if, say, the main character on screen is highly detailed, then hops into or onto a vehicle that is a simplistic model, and is thus both a different visual quality and unrealistic looking; I find this very jarring. It applies to everything on screen. Increasing resolution won't fix that. Boiling it down to "detailed assets exist" is not the issue at all.

 

Lighting is also very important to our visual perception, but is/can be computationally expensive, and I don't mean ray tracing. There are many different lighting techniques, all with their own pros and cons, so if you use the wrong one (in the sense of needlessly picking a complex one when a simple one achieves near the same result), or add an extra light source that isn't absolutely required, you greatly increase the GPU or CPU resource demand.

https://80.lv/articles/learning-lighting-for-video-games/

https://iq.intel.com/bringing-games-life-with-light/

https://unity3d.com/learn/tutorials/topics/graphics/choosing-lighting-technique
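As a rough illustration of why technique choice matters (a made-up cost model of my own, not taken from the linked articles):

```python
PIXELS_1080P = 1920 * 1080

def dynamic_light_cost(num_lights: int, pixels: int = PIXELS_1080P) -> int:
    """Toy forward-shading cost: one shading evaluation per light per pixel."""
    return num_lights * pixels

def baked_light_cost(pixels: int = PIXELS_1080P) -> int:
    """Toy baked-lightmap cost: one texture fetch per pixel, no matter
    how many lights were baked in offline."""
    return pixels

# Four static lamps rendered as dynamic lights cost 4x the shading work
# of the same four lamps baked into a lightmap, for near-identical output.
print(dynamic_light_cost(4) // baked_light_cost())  # 4
```

The point being that a needless dynamic light multiplies per-pixel work, while a baked one is effectively free at runtime.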

 

Or we could look at making hair look and act true to life, another extremely complicated effect to pull off convincingly.

 

So, for me, I would rather that what are currently considered non-critical game assets or effects get more attention: make that tree a more complicated model, make it look more real, "waste" some resources on that sort of thing, because for me it is not a waste. If doing so meant 4k were impossible on current GPUs, so be it.

 

4k resolution output is not actually a well-established thing; the hardware hasn't been around that long or at a generally available price point. However, there is a very big push towards it, mostly from marketing, not from any strong technical need. If 4k monitors and TVs did not exist then no one would be talking about it, and it wouldn't prevent graphical improvements because it wouldn't exist.

 

Games have been getting more visually detailed over a long period at 1080p; the hardware has been improving to allow it, the tools have been improving to allow it, and budgets in the industry have been increasing to allow it. What I personally don't want is for this progression to be slowed down to meet the demands of 4k, when everyone is saying that 4k, now and even potentially in the next generation, is unrealistic. We are now developing techniques to get around the limitations to enable 4k, or accepting certain sacrifices, and this is, in my opinion, not the area I would like to see development resources go into.

8 hours ago, leadeater said:

Well, it is a complicated issue, so fair enough for someone to try and point that out, though my fundamental point was that graphical detail and resolution are not one and the same.

 

It also does no good for me and my gaming experience if, say, the main character on screen is highly detailed, then hops into or onto a vehicle that is a simplistic model, and is thus both a different visual quality and unrealistic looking; I find this very jarring. It applies to everything on screen. Increasing resolution won't fix that. Boiling it down to "detailed assets exist" is not the issue at all.

 

Lighting is also very important to our visual perception, but is/can be computationally expensive, and I don't mean ray tracing. There are many different lighting techniques, all with their own pros and cons, so if you use the wrong one (in the sense of needlessly picking a complex one when a simple one achieves near the same result), or add an extra light source that isn't absolutely required, you greatly increase the GPU resource demand.

https://80.lv/articles/learning-lighting-for-video-games/

https://iq.intel.com/bringing-games-life-with-light/

 

Or we could look at making hair look and act true to life, another extremely complicated effect to pull off convincingly.

 

So, for me, I would rather that what are currently considered non-critical game assets or effects get more attention: make that tree a more complicated model, make it look more real, "waste" some resources on that sort of thing, because for me it is not a waste. If doing so meant 4k were impossible on current GPUs, so be it.

 

4k resolution output is not actually a well-established thing; the hardware hasn't been around that long or at a generally available price point. However, there is a very big push towards it, mostly from marketing, not from any strong technical need. If 4k monitors and TVs did not exist then no one would be talking about it, and it wouldn't prevent graphical improvements because it wouldn't exist.

 

Games have been getting more visually detailed over a long period at 1080p; the hardware has been improving to allow it, the tools have been improving to allow it, and budgets in the industry have been increasing to allow it. What I personally don't want is for this progression to be slowed down to meet the demands of 4k, when everyone is saying that 4k, now and even potentially in the next generation, is unrealistic. We are now developing techniques to get around the limitations to enable 4k, or accepting certain sacrifices, and this is, in my opinion, not the area I would like to see development resources go into.

 

 

Honestly, I feel like you're so caught up in your personal (if perfectly understandable) hate of pushing resolution above all else that you've let it get in the way of the point you're trying to make. And that's why you and TechyBen have been going back and forth: he's got caught up in all that personal stuff and started arguing against the personal opinion rather than the core factual argument.

 

At this point I wouldn't mind if @TechyBen would chime in and state what he's been trying to argue for. Hopefully then I can make sense of where we are right now.

17 minutes ago, CarlBar said:

Honestly, I feel like you're so caught up in your personal (if perfectly understandable) hate of pushing resolution above all else that you've let it get in the way of the point you're trying to make.

I don't actually hate it, I just don't think it's necessary. I'd absolutely like to see it happen, just when things are more in line to make it achievable. I hated the nonsensical argument itself more than the point about resolution. When you honestly don't feel like the other party in the debate is actually trying to listen to your point of view, you stop trying to make a well-structured argument or put forward proper reasoning, because what would be the point? After the middle of page 12 I stopped bothering. Did that then make it seem like I hated resolution more than I do? Yep.

6 hours ago, leadeater said:

Well, it is a complicated issue, so fair enough for someone to try and point that out, though my fundamental point was that graphical detail and resolution are not one and the same.

Yep. We agree

Quote

It also does no good for me and my gaming experience if, say, the main character on screen is highly detailed, then hops into or onto a vehicle that is a simplistic model, and is thus both a different visual quality and unrealistic looking; I find this very jarring. It applies to everything on screen. Increasing resolution won't fix that. Boiling it down to "detailed assets exist" is not the issue at all.

Yes. We agree. :)

 

Quote

 

Or we could look at making hair look and act true to life, another extremely complicated effect to pull off convincingly.

Yep.

 

Quote

If doing so meant 4k were impossible on current GPUs, so be it.

It does not work that way. I posted Quake 1, which can run at 4K. 4K did not prevent them adding bump mapping to it. A target resolution of (IIRC, back then) 1024x576 did not prevent them adding vertex shaders. Likewise, a target of 4K has not prevented them adding effects to Crackdown, Tomb Raider (it has the hair you asked for! XD) and BFV.

 

It does change the budget, the fidelity and the LOD scaling. But these things were always in flux. The art department and the development studio are always chasing things. The *game* design dictates whether the game gets "hair simulations"; if it's a driving game, it probably won't. No one is stopping us programming a 320p photorealistic game... the tech is there. But would you pay for/play it?

6 hours ago, CarlBar said:

 

 

Honestly, I feel like you're so caught up in your personal (if perfectly understandable) hate of pushing resolution above all else that you've let it get in the way of the point you're trying to make. And that's why you and TechyBen have been going back and forth: he's got caught up in all that personal stuff and started arguing against the personal opinion rather than the core factual argument.

 

At this point I wouldn't mind if @TechyBen would chime in and state what he's been trying to argue for. Hopefully then I can make sense of where we are right now.

That 4K (or any resolution) does not prevent art assets/tech/visuals. It may limit *2D* GUI elements, as these are currently drawn for each resolution, but 720p still exists, let alone 1080p and 1440p! So games are made with those LODs included too.

 

4K does not prevent "new tech" or "hair simulations", as these games/tech already exist. No one has held back the tech; it's in the games (Tomb Raider etc.). Or we have games like Grow Home, which ignore resolution entirely and only do art that scales to any size (620x300 to 16k :D). (PS: Grow Home has no hair simulations, and neither does COD7. This is not because of a render budget, but a "do we care about this art" budget. Maybe someone will look at COD7 and ask "can we add hair simulations?". But the GPU shader pipeline is not all pixels, and not all graphics settings prevent others; some do, but not all. So they will check CPU use, GPU use, FPS, resolution, shaders, AI agents etc. 4K does not prevent it; all the items working together each constrain the others.)

 

If Crackdown/BFV/COD decided "fck it, we are doing 24k with zero textures, no shaders, no artwork, no specular highlights, no ray tracing, at 15 FPS", a second developer would go "I'm doing 720p photorealistic gameplay". Like, no one is hiding/preventing this magic tech people think exists. :P

[Screenshot: Ghost of a Tale artwork]

This is a single developer's artwork. One person. They have not "held back" art, rendering or tech to hit a target. The game *scales* to any resolution. One programmer/artist can *scale* game code *over* or *under*. No one is preventing you, or breaking games, because "4K!!!".

3 minutes ago, TechyBen said:

It does change the budget, the fidelity and the LOD scaling. But these things were always in flux. The art department and the development studio are always chasing things. The *game* design dictates whether the game gets "hair simulations"; if it's a driving game, it probably won't. No one is stopping us programming a 320p photorealistic game... the tech is there. But would you pay for/play it?

This is the closest you've gotten to what I was actually talking about, which is good to see. Yep, it's always been in flux. I'm also not asking for true photorealism. I'm just saying that you can put assets, effects and lighting into a game such that the end result, on current hardware, would be too slow performance-wise at 4k on any GPU but just barely possible at 1080p. I'm not saying photorealistic, I'm saying better than we have now. So if you actually wanted your game to be runnable at 4k on current hardware, the total end result couldn't be as demanding.

 

Here, in the above hypothetical, the decision to aim for 4k would have impacted overall design choices, if the goal was the best visuals possible.

 

I don't think you're getting my point about making the actual end-result scene harder to render, to the point that current hardware can't run it at 4k and have anyone call it a playable game.

Just now, leadeater said:

Yep, it's always been in flux. I'm also not asking for true photorealism. I'm just saying that you can put assets, effects and lighting into a game such that the end result, on current hardware, would be too slow performance-wise at 4k on any GPU but just barely possible at 1080p.

But no one does. The 1080p to 4K difference is 25-50 FPS, which is within the acceptable performance range all the studios aim for (25/30 FPS or 50/60 FPS). :D So if it makes 4K look slow, it also makes 1080p look slow. Any improvement to 1080p can be "scaled" to 4K, and any improvement to 4K can be "scaled" to 1080p, with the exception of 2D-only assets.
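The resolution arithmetic behind this is easy to check (my own numbers, not from the post):

```python
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4k": (3840, 2160),
}

def pixel_count(name: str) -> int:
    width, height = RESOLUTIONS[name]
    return width * height

# 4K shades exactly four 1080p frames' worth of pixels, so pixel-bound
# work (shading, fill rate) scales 4x, while per-frame CPU and geometry
# work does not scale with resolution at all.
print(pixel_count("4k") / pixel_count("1080p"))  # 4.0
```

Which is why the same effect can ship at both resolutions: it is the same work per pixel, just more (or fewer) pixels.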

 

Your hypothetical is a boogeyman. It's not a real threat. :) As I said, I follow, and am really interested in, the render pipeline and the development process. I can show you where people are ADDING 1080p effects for you, not "holding them back for 4K".

 

As an example, Eevee ( https://cgcookie.com/articles/get-ready-for-eevee-blender-s-new-real-time-rendering-engine ) has been released for Blender. This might mean those using it aim for the *content creation* resolution of their main/development rig, because it's a live preview. But look at the Ghost of a Tale screenshot above. Did that game get made worse to hit a resolution budget of 1080p? Could it be made better if they aimed for 320p instead?

 

3 minutes ago, TechyBen said:

But no one does

I know! That's why I'm expressing my WISH for someone to do it.

 

3 minutes ago, TechyBen said:

Your hypothetical is a boogeyman. It's not a real threat.

I'm not saying it's a threat. This whole line you keep putting forward is born from your misunderstanding of what I'm wishing for, of what I want to see happen. I'm not saying I expect it to happen; is it so offensive of me to ask for it?

6 minutes ago, leadeater said:

I know! That's why I'm expressing my WISH for someone to do it.

 

I'm not saying it's a threat. This whole line you keep putting forward is born from your misunderstanding of what I'm wishing for, of what I want to see happen. I'm not saying I expect it to happen; is it so offensive of me to ask for it?

You took my positive as a negative. You misread my reply! Sorry!

 

Ask for what? What are you asking for?

[edit]

Quote

Yep, it's always been in flux. I'm also not asking for true photorealism. I'm just saying that you can put assets, effects and lighting into a game such that the end result, on current hardware, would be too slow performance-wise at 4k on any GPU but just barely possible at 1080p.

You are not asking for photorealism? Ok. So what do you want, more effects? You wish to see more effects?

15 minutes ago, TechyBen said:

Ask for what? What are you asking for?

For someone to start a new game, develop for the most demanding level of detail possible, and not have 4k as a consideration; to create something with a level of detail superior to anything before, such that if you increased the resolution to 4k you would hit limitations within the GPU hardware, such as memory bandwidth and total VRAM, as well as raw GPU performance.
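As a rough sketch of the VRAM/bandwidth point (a simplified model of my own; the target count and pixel format are invented, and real engines allocate many more intermediate buffers):

```python
def gbuffer_bytes(width: int, height: int,
                  targets: int = 4, bytes_per_pixel: int = 8) -> int:
    """Estimate memory for a deferred G-buffer: `targets` screen-sized
    render targets at `bytes_per_pixel` each (e.g. RGBA16F). Both
    defaults are illustrative assumptions."""
    return width * height * targets * bytes_per_pixel

MB = 1024 * 1024
# Moving from 1080p to 4K quadruples every screen-sized buffer, and the
# bandwidth needed to read and write it each frame.
print(round(gbuffer_bytes(1920, 1080) / MB))  # ~63 MB
print(round(gbuffer_bytes(3840, 2160) / MB))  # ~253 MB
```

So a scene tuned to saturate VRAM and bandwidth at 1080p really would run out of both at 4K, which is the wish being expressed here.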

 

You keep saying that I have been saying the push for 4k is limiting or preventing artists; that's not accurate to what I have been saying at all. I've been saying that the art that makes it into the game, the effects that are implemented, and the lighting choices are influenced when you develop for 4k: you might not aim for a level of detail that is not possible at 4k, and I also don't think they would allow/include it, so that if you were to run at 1080p instead, the level of detail would increase/scale above what is used at 4k.

 

Would you agree it would be rather odd to see a game get released where the level of detail went up when you decreased the resolution? Sure, I realize you would hit a tipping point where you wouldn't actually be able to see that the detail increased without also increasing resolution with it; I just don't think that something like 1080p or 1440p is at that point.

 

15 minutes ago, TechyBen said:

You took my positive as a negative. You misread my reply! Sorry!

Don't worry, I actually didn't. I think you're getting much closer to understanding what I was trying to express.

1 minute ago, TechyBen said:

What are you asking for?

Okay, so maybe a real-world example. Famously, The Witcher 3 was subject to a graphical downgrade when comparing the early gameplay trailer to the actual game. The devs responded that yes, they had to change the rendering, because what they prepared for trailer purposes, while rendered live on a PC, could not be rendered efficiently once the game was ready and the world was much bigger.

 

Now, part of it was definitely console influence (they did state that if the game had not been released on consoles it would not have been released, period), but part of it was definitely due to the fact that across all resolutions the performance would have been inefficient (they boasted that the game was well optimized after that).

 

But part of me (and, I think, @leadeater) is thinking that they could've thought "damn, with what we are doing people are gonna need a Titan or a 980 Ti to run it at 1080p, better dial it down or nobody will run it".

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones

Just now, leadeater said:

For someone to start a new game, develop for the most demanding level of detail possible, and not have 4k as a consideration; to create something with a level of detail superior to anything before, such that if you increased the resolution to 4k you would hit limitations within the GPU hardware, such as memory bandwidth and total VRAM, as well as raw GPU performance.

 

They do! They do "the most demanding game" at 1080p: Crysis, Crackdown, Tomb Raider, BFV!

If you follow game development, game engine development, game coding, the pipeline and the GPU hardware, you will see these things all being added.

You wish they would add more effects for 1080p? They are:

 

[Screenshot: in-game graphics settings menu with a long list of effect options]

That's a LOT of effects. Some will only run at 1080p, because no one has the hardware to run them in 4K.

 

VR? Yes, VR will prevent people adding effects to a game: any effect that increases latency or lowers FPS cannot be added to VR. But likewise, a 3D game cannot be rendered as a 2D game. Yet 1080p can be rendered like 4K, and 4K like 1080p. Resolution scales; VR/gameplay design/AI/etc. does not always scale.

 

Quote

I've been saying that the art that makes it into the game, the effects that are implemented, and the lighting choices are influenced when you develop for 4k: you might not aim for a level of detail that is not possible at 4k, and I also don't think they would allow/include it, so that if you were to run at 1080p instead, the level of detail would increase/scale above what is used at 4k.

I provided 10 links (and 2 others) to games where 4K did not prevent them adding the assets... They shipped the extra-high 1080p assets to players! :D

3 minutes ago, TechyBen said:

 

Sigh, just when I thought you were. Rest assured you didn't actually get my point or what I was asking for, but I'm not going to debate it further. You and I are just going to have to accept that mutual understanding is not going to happen.

6 minutes ago, Lathlaer said:

Okay, so maybe a real-world example. Famously, The Witcher 3 was subject to a graphical downgrade when comparing the early gameplay trailer to the actual game. The devs responded that yes, they had to change the rendering, because what they prepared for trailer purposes, while rendered live on a PC, could not be rendered efficiently once the game was ready and the world was much bigger.

 

Now, part of it was definitely console influence (they did state that if the game had not been released on consoles it would not have been released, period), but part of it was definitely due to the fact that across all resolutions the performance would have been inefficient (they boasted that the game was well optimized after that).

 

But part of me (and, I think, @leadeater) is thinking that they could've thought "damn, with what we are doing people are gonna need a Titan or a 980 Ti to run it at 1080p, better dial it down or nobody will run it".

 

Yes! Thank you. A solid, factual example. *Why* did they downgrade? What are you blaming?

What are you blaming for the downgrade? Colonial Marines was due to an engine swap (it was pre-rendered, faked gameplay).

 

Why was Witcher 3 downgraded? I agree it was. But somewhere in the development cycle, testing, shader pipeline or art asset conversion, it was downgraded. Why? (This is what we need to learn: not hypothetically blame, but factually find out.)

 

Quote

Now, part of it was definitely console influence

What part of a console prevents it? Is it the "resolution", or is it that the console has fewer shader cores? Less VRAM, less bandwidth?

 

Quote

But part of me (and, I think, @leadeater) is thinking that they could've thought "damn, with what we are doing people are gonna need a Titan or a 980 Ti to run it at 1080p, better dial it down or nobody will run it".

No idea; leadeater says "I did not mean that" every time I try to pin down the problem. But if that's what you and I wish to discuss, that's fine. :)

Yes, so they "dialed it down" to run 60 FPS at 1080p... could they "dial it up" to run 60 FPS at 524p? That's the assumption/hypothetical. Guess what the real answer is? (Coming from an artist who has tried rendering at different resolutions to "fix" FPS problems! XD)

11 minutes ago, leadeater said:

For someone to start a new game, develop for the most demanding level of detail possible, and not have 4k as a consideration; to create something with a level of detail superior to anything before, such that if you increased the resolution to 4k you would hit limitations within the GPU hardware, such as memory bandwidth and total VRAM, as well as raw GPU performance.

 

You keep saying that I have been saying the push for 4k is limiting or preventing artists; that's not accurate to what I have been saying at all. I've been saying that the art that makes it into the game, the effects that are implemented, and the lighting choices are influenced when you develop for 4k: you might not aim for a level of detail that is not possible at 4k, and I also don't think they would allow/include it, so that if you were to run at 1080p instead, the level of detail would increase/scale above what is used at 4k.

 

Would you agree it would be rather odd to see a game get released where the level of detail went up when you decreased the resolution? Sure, I realize you would hit a tipping point where you wouldn't actually be able to see that the detail increased without also increasing resolution with it; I just don't think that something like 1080p or 1440p is at that point.

I don't actually get that argument much: the resolution presented on screen doesn't have a direct impact on polygon count and lighting effects, which are what impact visuals the most. A modern game at even 720p looks far and away better than a game from 2003 at 4k, simply because there are orders of magnitude more polygons.

 

If anything, I think the limiting factor today is more on the side of getting artists who can actually produce such high polygon counts: when everything looked like shit, artists of many different skill levels could make assets that looked respectable enough, but as we greatly expand visual fidelity you can't just grab the same guys and demand they produce photorealistic or very complex and nuanced assets. Drawing is a skill, and while most people can draw, and most people given moderate training can make, say, passable cartoons or comic books (not great ones, but passable), you can't just grab a person, give them a bit of training and expect a Renaissance masterpiece. That shit takes years and years of training and skill development, and very few people will master it to such a degree, no matter how much assistance they get from a computer.

 

To me, this discussion about assets and 4k sounds like people saying "no, we only have 1,000 monkeys with 1,000 typewriters; we need 1,000,000 monkeys and 1,000,000 typewriters to go from incoherent nonsense to instant literary classic for the ages!" Um, no: the barrier to photorealism at 4k or beyond is a lot more than just technical considerations.


5 minutes ago, TechyBen said:

What part of a console prevents it?

I'd have to assume it wouldn't look good if the game were drastically different, graphically, from the console versions. The devs said as much: if they had made the game only for PC they could've made it better looking, but part of a successful release PR is not letting console players feel like they have the "shitty version". :D


2 minutes ago, Misanthrope said:

I don't actually get that argument much: the resolution presented on screen doesn't have a direct impact on polygon count and lighting effects, which are what impact visuals the most. A modern game at even 720p looks far and away better than a game from 2003 at 4k, simply because there are orders of magnitude more polygons.

I agree; I made the same point earlier. I have only been saying that increasing the resolution increases the demand on the GPU, which would have an impact on the polygon count and lighting effects that you would actually put into the scene.

12 minutes ago, Lathlaer said:

I'd have to assume it wouldn't look good if the game were drastically different, graphically, from the console versions. The devs said as much: if they had made the game only for PC they could've made it better looking, but part of a successful release PR is not letting console players feel like they have the "shitty version". :D

You know what they say about "assuming". But honestly... that's all I'm trying to say. All these people assuming just need to go and look at an actual game design, programming and art development article (not an interview with AMD's CEO and marketing directors) and see how these games are made, and what prevents better effects. It's not resolution. It's that the consoles have smaller/cheaper/slower GPUs.

 

I am only a hobbyist. I've done super basic stuff, but we were aiming for photorealism in 1995. We *could do it then*. But a "resolution target" never prevented it. The hardware, the pipeline, the art time sink, and the technology limits prevented it.

 

Ray tracing required an *entirely new silicon die* from Nvidia. It could not be added to the existing cards, because the maths transformations/pipelines are radically different from rasterising. :)

11 minutes ago, leadeater said:

I agree, I made the same point earlier. I have only been saying that increasing the resolution increases the demand on the GPU, which would have an impact on the polygon count and lighting effects that you would actually put into the scene.

Bolded the incorrect part. The game dynamically scales this content:

 

kspset.jpg.69ff6ad3779fe867ffbf28cdee81fc9f.jpg

 

This is from a team of fewer than 20, on a tiny budget (Mexican pay is horrid, apparently). Do you think the KSP dev team can add dynamic scaling to improve 1080p, but COD/BFV/Tomb Raider/Crackdown cannot, on budgets of millions?

1 minute ago, TechyBen said:

Bolded the incorrect part. The game dynamically scales this content:

 

kspset.jpg.69ff6ad3779fe867ffbf28cdee81fc9f.jpg

 

This is from a team of fewer than 20, on a tiny budget (Mexican pay is horrid, apparently). You are saying the KSP dev team can add dynamic scaling to improve 1080p, but COD/BFV/Tomb Raider/Crackdown cannot, on budgets of millions?

Because I can make that slider go past the maximum point, right? Same for Render Quality Level and Texture Quality Level. As I said, it's ok, you don't understand my point, it's fine. I'm not going to worry about it.

39 minutes ago, leadeater said:

Because I can make that slider go past the maximum point, right? Same for Render Quality Level and Texture Quality Level. As I said, it's ok, you don't understand my point, it's fine. I'm not going to worry about it.

Yes you can! With an ini edit, you can oversample it.

Can your PC run that many lights if you do?

 

(You can *never* run that many lights, ever, at 1080p. Your GPU does not have enough shader pipelines!)

[EDIT: AFAIK, such things are rendered on Nvidia cards through the CUDA cores (pixel shaders), https://www.gamersnexus.net/dictionary/2-cuda-cores ]

This is what limits the lighting count. Not programming. Not resolution. Not artists. Not a 4K target render quality.
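To put rough numbers on that: in a classic forward renderer, every light is evaluated in the pixel shader for every shaded pixel, so cost grows with pixels × lights. A toy model (the ops-per-light figure and the budget are made-up illustrations, not real GPU specs):

```python
# Toy cost model for classic forward rendering: every visible light is
# evaluated in the pixel shader for every shaded pixel.
def shading_ops(width, height, lights, ops_per_light=100):
    return width * height * lights * ops_per_light

# Hypothetical per-frame shader-op budget for lighting on a mid-range GPU.
BUDGET = 2_000_000_000

for lights in (8, 64, 512):
    ops = shading_ops(1920, 1080, lights)
    print(f"{lights:4d} lights at 1080p -> {ops:,} ops ({ops / BUDGET:.0%} of budget)")
```

Crank the light slider far enough and the cost blows past anything the shader cores can deliver, which is why the in-game maximum sits where it does.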


Ok, I think I've got a feel for where the discussion is now.

 

First things first, let's clear something up super quick.

 

 

Game development, in terms of graphical fidelity, works to a GPU budget. In effect, the game devs look at available info and go "this is how much grunt low/mid/high end gamers will have access to when this game releases".

 

And each time they look to implement a feature or effect, they'll ask "how much of the GPU budget is this taking up, and is the improvement worth the usage cost?" But of course nearly everything scales with resolution, and that means each GPU tier has a target resolution assigned to it. They're not going to stop anyone from going to a higher (or, generally speaking, a lower) resolution, but they aren't building the art assets for best performance or good appearance at the other resolution settings. They can also build things such that a given GPU range can run one combination at 60 FPS and another at high refresh rates (150 FPS or so).
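That budgeting decision can be sketched in a few lines (all effect names and millisecond costs here are invented for illustration):

```python
# Hypothetical frame-budget check: keep adding effects until the 60 FPS
# frame-time budget on the target GPU tier is exhausted.
FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 FPS

effects = [                      # (effect, measured cost in ms on target GPU)
    ("base geometry pass", 6.0),
    ("shadow maps",        3.5),
    ("screen-space AO",    1.8),
    ("volumetric fog",     2.5),
    ("extra light pass",   4.0),
]

spent = 0.0
for name, cost in effects:
    if spent + cost <= FRAME_BUDGET_MS:
        spent += cost
        print(f"keep {name}: {cost} ms (running total {spent:.1f} ms)")
    else:
        print(f"cut  {name}: {cost} ms would exceed the {FRAME_BUDGET_MS:.1f} ms budget")
```

In this toy run, the first four effects fit under ~16.7 ms and the last one gets cut: the feature isn't "hidden", it just doesn't fit the budget at the target frame rate.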

 

The key thing to remember is that, because of how resolution is implemented in terms of standardised elements, it's quite normal for games to be able to run at much higher resolutions than their target resolutions. I've got a fair few older games that long predate the availability of 1440p and 4K monitors that will run at those resolutions. The developers were obviously never targeting those resolutions, because they effectively (from a commercial game sales standpoint) did not even exist back then. But those games still allow you to wind that up if your hardware can do it.

 

Resolution is a bit of a special case, however, as it doesn't need any extra work from the developer to have an actual effect. Things like particle count limits, however (as an example of something else I can wind right up in older games), require that there actually be enough on-screen particles to exceed the limit in most situations. If that isn't the case, you can wind the particle count up all you want; it's not going to do anything, because the developers haven't implemented enough particles for you to see any real-world effect.
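That cap behaviour can be sketched in two lines (the emitted count is an invented example):

```python
# Raising a particle cap does nothing unless the game's effects actually
# emit more particles than the cap.
def visible_particles(emitted, cap):
    return min(emitted, cap)

EMITTED = 1500  # hypothetical: what the developers' effects actually spawn
for cap in (1000, 2000, 10_000):
    print(f"cap {cap:6d} -> {visible_particles(EMITTED, cap)} particles on screen")
```

Past the point where the cap exceeds what the content emits, raising the setting is a no-op: the bottleneck is the assets the developers shipped, not the slider.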

 

That means in most cases, regardless of how much you wind up certain scalables, you're actually reliant on the developers implementing the necessary in-game effects and assets to take advantage of said scalables.

 

And that's where the whole GPU budget comes into this again. Game developers are not going to waste any time, money, or effort implementing something if it cannot be run at one of their GPU performance targets. (Slippage can and does occur for various reasons, including but not limited to overestimation of available GPU power at release, and performance-reducing bugs they couldn't fix before release.) And given that most things that can appear on screen get more GPU-expensive to display at the same quality level as the display resolution goes up, there is naturally a roughly inversely proportional decrease in peak attainable on-screen quality, to ensure GPU usage doesn't get out of hand at the target FPS.

 

 

That said, I don't think 4K is limiting 1440p quality. Only the RTX 2080 can really do high-refresh-rate 4K gaming; everything else is either 4K at standard refresh or 1440p at high refresh, so 4K high refresh isn't really a practical target for developers atm. 1080p is almost certainly taking a quality hit on the highest-end hardware, as developers aren't generally going to put the effort into developing a "super ultra quality" spec for 1080p gamers on high-end hardware. They're going to assume people with top-end GPUs will have comparably expensive monitors.

 

 

