Battlefield V with DXR on, tested by Techspot (Hardware Unboxed)

kiska3

@TechyBen: What is so hard for you to grasp here?

 

The higher your target mainstream gameplay resolution on the same hardware, the less graphically intensive the actual scenes can be, because rendering for a 4K display takes more GPU resources. If you want the game to run at all on available hardware, you have to accept penalties in overall graphics quality so that current hardware can run it.

 

It's really basic stuff, game graphical design 101. Wherever possible you don't build a game whose intended normal-play graphics can't run on hardware available around its release. Obviously buggy games, bad optimization, or miscalculations can all make this not pan out, but no one deliberately builds a game that needs next-gen, or even next-next-gen, hardware to be seen running at its intended quality level.

 

They'll build in some level of fidelity beyond their intended target for people with top-end release-era hardware, but they don't deliberately go so far that no one can run it at reasonable graphical settings on release-era hardware.


33 minutes ago, TechyBen said:

You've literally done a Simpson's Paradox on the data. :P

I literally had to; you forced me to, because you were not addressing what I was saying at all and kept going off on your own points. Yes, I understand what you were saying, and yes, I agree it's heavily dependent on art budget, but as you yourself mentioned, we can only do what we do now because GPU resources have gotten better. That is the enabler, the reason artists and producers are willing to do more, because it's not wasted effort/money on the impossible.

 

So with that very thing in mind: GPU resources are not unlimited, and textures (FYI, not always the technically correct use of that word, nor the only thing I was talking about) are only going to be as good as it's possible to actually utilize.

 

Were those higher quality texture packs not used originally, or in the base game, because they actually wanted the game to be renderable at 4K by more people? Highly likely. There is no point doing something that is going to make the game nearly impossible for most people to render on their computer.

 

That's where choices come in: do we go for higher quality assets, making it impossible for any current GPU to render the game at 4K, or do we settle here so they can? Do we add more light sources, work on more realistic overall lighting, better shadows, reflections, etc.? Take your pick of any of these, or combinations, that result in it being impossible to render at 4K on the hardware available when the developers release their game. Wanting your game to be renderable at 4K will impact all of this.


12 hours ago, CarlBar said:

Alright, quick cliff-notes version of the three ray types used in raytracing and how they work.

 

All raytracing, even hybrid, starts by shooting one or more rays per pixel from the camera position through the pixels on the virtual screen (the contents of this virtual screen are what you actually see, by the way). These rays keep going until they hit something in the game's 3D render of the scene. We'll call these rays master rays; there's probably some real fancy name for them, but "master rays" will suffice for our purposes. What happens next depends on exactly what is being handled by raytracing. In a fully raytraced scene, several different types of ray then emanate from where each master ray touches the scene. Again, for simplicity's sake, lump all of these under the general classification of "secondary rays". For the purposes of what is implemented in hybrid raytracing we only really care about three, as those are the only types implemented.

 

The first of these secondary rays is properly called the reflection ray. This shoots off at an angle set by the angle between the surface and the master ray, just like light striking a mirror in real life. The reflection ray then bounces around picking up colour info, which is combined with the colour info from the master ray to produce the final pixel colour value.

 

The second is the shadow ray. I haven't seen a lot on this, so I'm not sure how the direction it goes off in is decided, but it detects the shadowing effect of any surfaces it passes through on its way out. Again, this is combined with the master ray data to decide the final pixel colour.

 

The third type is known as the light ray. Whilst hybrid ray tracing puts a cap on the number of light sources that can be used, this fires a ray off at every light source in the scene until it hits the ray cap, again picking up lighting factors as it goes, which are again combined with the master ray to get a final result.

 

The thing is, all three of these effects can be implemented using the same master ray (in fact they have to be if they're combined), and the results of all three are put together to produce the final pixel values.

 

What this also means, however, is that the master rays are a one-time overhead item; adding extra RTX effects doesn't necessarily double the workload. In fact, because of the amount of bouncing they can do (I believe BFV limits it to 4 bounces total), reflections are amongst the most intensive to do, unless you have a scene with an enormous number of lights, though I think current software caps it at the 3 closest light sources. As a result, implementing additional effects isn't a simple scalar.
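For anyone who wants to see how those three ray types hang together, here is a toy sketch of the flow in Python. Nothing in it comes from BFV or DXR: the one-sphere scene, the caps, the blend weight and the sky colour are all invented for illustration, and the shadow and light rays are merged into a single occlusion-plus-lighting loop for brevity.

```python
import math

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def add(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def mul(a, s): return (a[0] * s, a[1] * s, a[2] * s)
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a):
    length = math.sqrt(dot(a, a))
    return mul(a, 1.0 / length)

# Toy scene: one sphere and a couple of point lights (all values made up).
SPHERE = {"center": (0.0, 1.0, 5.0), "radius": 1.0, "albedo": (0.8, 0.2, 0.2)}
LIGHTS = [(-3.0, 4.0, 2.0), (4.0, 5.0, 3.0)]
MAX_LIGHT_RAYS = 3          # illustrative cap on traced light sources
REFLECTION_BOUNCES = 1      # real titles reportedly allow a few more

def intersect_sphere(origin, direction):
    """Distance along the ray to the sphere, or None on a miss."""
    oc = sub(origin, SPHERE["center"])
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE["radius"] ** 2
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def shade(origin, direction, bounces=REFLECTION_BOUNCES):
    # The "master" (primary) ray: camera -> virtual screen pixel -> scene.
    t = intersect_sphere(origin, direction)
    if t is None:
        return (0.2, 0.3, 0.5)                       # sky colour
    hit = add(origin, mul(direction, t))
    normal = norm(sub(hit, SPHERE["center"]))
    colour = (0.0, 0.0, 0.0)

    # Shadow + light rays: one ray per light up to the cap, all reusing the
    # same primary-ray hit point (merged into one loop here for brevity).
    for light in LIGHTS[:MAX_LIGHT_RAYS]:
        to_light = norm(sub(light, hit))
        if intersect_sphere(hit, to_light) is None:  # nothing blocks the light
            diffuse = max(dot(normal, to_light), 0.0)
            colour = add(colour, mul(SPHERE["albedo"], diffuse))

    # Reflection ray: mirrored about the surface normal, a limited number of
    # bounces, its colour blended into the primary result.
    if bounces > 0:
        refl = sub(direction, mul(normal, 2.0 * dot(direction, normal)))
        colour = add(colour, mul(shade(hit, norm(refl), bounces - 1), 0.3))

    return colour

# One pixel's worth of work: every effect above shares the single primary hit.
print(shade(origin=(0.0, 1.0, 0.0), direction=norm((0.0, 0.0, 1.0))))
```

The point the sketch makes is only the last comment: the primary hit is computed once per pixel, and every traced effect reuses it, which is why stacking effects isn't a simple multiplier on the workload.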

You make it sound like ray tracing is a lot more achievable than many would believe. I certainly hope that is the case, since it would mean better visuals are closer, but the demonstrations we have been shown would say otherwise. I guess the question is how much time the master-ray portion of ray tracing typically takes to compute. If 1080p ultra typically yields a frame every 6.7ms, and 1080p ultra with ray tracing enabled becomes 20.4ms (17.9ms with RT set to low), then it's fair to say it must be somewhere around 10ms, or maybe a tad less. My next question would be which of the three ray types is typically the most taxing to implement.
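A quick back-of-the-envelope with those quoted frame times, assuming (and it is only an assumption) that the low RT setting sits close to the fixed, quality-independent floor:

```python
# Frame times as quoted above; the fixed-vs-quality split is only a rough guess.
raster_only = 6.7    # ms per frame, 1080p ultra, DXR off
rt_ultra    = 20.4   # ms per frame, DXR ultra
rt_low      = 17.9   # ms per frame, DXR low

print("total RT overhead, ultra:", round(rt_ultra - raster_only, 1), "ms")  # 13.7 ms
print("total RT overhead, low:  ", round(rt_low - raster_only, 1), "ms")    # 11.2 ms
print("quality-dependent part:  ", round(rt_ultra - rt_low, 1), "ms")       # 2.5 ms
# If the low setting is near the floor, the shared quality-independent portion
# is bounded by roughly 11 ms, in the same ballpark as the ~10 ms guess above.
```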



7 hours ago, CarlBar said:

@TechyBen: What is so hard for you to grasp here?

 

The higher your target mainstream gameplay resolution on the same hardware, the less graphically intensive the actual scenes can be, because rendering for a 4K display takes more GPU resources. If you want the game to run at all on available hardware, you have to accept penalties in overall graphics quality so that current hardware can run it.

 

It's really basic stuff, game graphical design 101. Wherever possible you don't build a game whose intended normal-play graphics can't run on hardware available around its release. Obviously buggy games, bad optimization, or miscalculations can all make this not pan out, but no one deliberately builds a game that needs next-gen, or even next-next-gen, hardware to be seen running at its intended quality level.

 

They'll build in some level of fidelity beyond their intended target for people with top-end release-era hardware, but they don't deliberately go so far that no one can run it at reasonable graphical settings on release-era hardware.

Yes... But no one is making you (or leadeater) run the game at 4K. Thus you still have "Ultra" settings, higher art assets and better images than those running at 4K... in fact you're still wrong, because (as said, for example) GTAV 4K "Ultra" settings still exist (or choose any other PC game; I gave a list of 5 older games with ultra DLC artwork packs). No one made the game "less graphically intensive" because there is a 4K option there. What they have done is allow the game to *possibly* downscale graphics intensity for 4K.

 

So no one was forced into the low-res FF textures in that image. The PS2/3/4 may not be able to display those textures AND 4K/1080p/524p at the same time, but the games would also have either dynamically scaled content (LOD models/settings) or, in the case of the PS4, offered the option of 1080p OR 4K.

 

That's all I've tried to say. It's not one preventing the other. We have both. We have 1080p at low and high settings, we have 4K at low and high settings. Why? Because unless it's a games console, the hardware exists/will exist to drive the higher resolutions eventually.

 

The FF image would be blurred because texture units, VRAM, pixel pipelines/shader cores etc are not enough to drive it at a high resolution. This does not stop you running the game at a lower resolution.

 

Finally "looks worse" is subjective when comparing 2 different scaling options. For some people, their vision notices blurred textures/low poly more. For some people their vision notices strobing/pixel edges more. So, *both* are options for improving fidelity.

 

  • One can improve texture/poly resources but suffer FPS/resolution performance.
  • One can choose resolution improvement (better aliasing/colour gradients/smoothness of frame interpolation) but suffer texture/poly resource performance.

If you have the hardware, you can choose both, or you can choose one. But both are "improvements" to image fidelity, both are reductions in FPS, and they are not mutually exclusive. You have done a Simpson's paradox*, because you've concentrated on 4K lowering FPS/performance but not accepted that higher textures/poly count also lower FPS/performance.

 

Leadeater:

Quote

the reason artists and producers are willing to do more because it's not wasted effort/money on the impossible.

I gave an example of 5 games, from before 4K was a thing, where that "not wasted effort" was already being done. I gave an example of an art asset that could do a 16-bazillion-K screen resolution. The art assets exist. The GPUs do not. It's also not "impossible", as two 2080s in SLI exist. It's just expensive! XD

 

Quote

So with that very thing in mind: GPU resources are not unlimited, and textures (FYI, not always the technically correct use of that word, nor the only thing I was talking about) are only going to be as good as it's possible to actually utilize.

Yes, but as you say, GPU resources are not unlimited. This includes the texture budget *regardless of display resolution*, and includes pixel pipelines/shader cores *irrespective of display resolution*. You get a sliding scale between the two, but it's not always resolution bound (again, see Grow Home, or a really good multi-pipeline, well-scaled game engine).

 

Quote

Were those higher quality texture packs not used originally, or in the base game, because they actually wanted the game to be renderable at 4K by more people? Highly likely. There is no point doing something that is going to make the game nearly impossible for most people to render on their computer.

No. Because they were 85GB!!! Because they were over the texture VRAM of most cards. Because they carried 100-mil art department budget/timescale costs. Because they *did not even do 4K back then*. These were mainly 1080p games.

 

"highly likely", is your feelings. I agree with your feelings. I agree with your assessment and preference of the art, resolution and poly count. I disagree that you know why these things are limited in games.

 

 

*https://en.wikipedia.org/wiki/Simpson's_paradox

We have two scales. Blue would be "better art, poly, texture, effects" and red would be "better resolution, ray tracing, game engine, etc.". But if we compare the worst of red with the best of blue, we can think it "gets worse" when it actually gets better.


19 minutes ago, TechyBen said:

No one made the game "less graphically intensive" because there is a 4K option there. What they have done, is allow the game to *possibly* downscale graphics intensity for 4K.

I was not saying anyone made the game less graphically intensive because there is a 4K option. The point is that it is possible to use more geometrically complex models, more complex lighting and higher quality effects if you are willing to completely sacrifice the ability to run the game on current hardware at 4K. Game developers don't do that, because there is a current demand: it is popular in reviews and it is good marketing to have the game running well at 4K, so you, as a developer, are not going to make your game incapable of running at 4K.

 

And textures themselves are not all that demanding, and they aren't actually all of what you were talking about further on.

 

This is a texture file.

[image: example texture file]

 

This alone does not make something look more realistic or impressive. You can increase that texture resolution and apply it to a geometrically inaccurate non impressive model and it won't greatly improve the overall graphics quality of the game.

 

And most textures, even in high-res texture upgrade packs, are rarely 4096x4096 resolution. What matters for a texture is how much screen/image area it will take up on screen: if it is on something smaller, you don't need to go high-res on it. Some textures are also tiled over the model, like a brick wall, so they also do not need to be very large.
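As a rough illustration of that screen-coverage rule of thumb, here is a small sketch; the object sizes and the 1:1 texel-per-pixel target are made-up assumptions, not figures from any particular engine.

```python
# A texture only needs enough texels to cover the pixels the object actually
# occupies on screen; everything below is an invented example.

def texture_width_needed(screen_width_px, fraction_of_screen_width):
    """Smallest power-of-two texture width giving ~1 texel per covered pixel."""
    covered_px = screen_width_px * fraction_of_screen_width
    size = 1
    while size < covered_px:
        size *= 2
    return size

examples = [("small prop, ~5% of screen width", 0.05),
            ("character, ~25% of screen width", 0.25),
            ("wall filling the whole view", 1.00)]

for label, fraction in examples:
    print(f"{label}: 1080p -> {texture_width_needed(1920, fraction)} texels, "
          f"4K -> {texture_width_needed(3840, fraction)} texels")
# (As noted above, a tiled brick texture repeats, so the wall case in practice
# uses a much smaller texture than this naive estimate.)
```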

 

The fact that these very high quality art assets exist is not the issue, nor did I say they did not exist. The issue is the active choice, the requirement to lessen the quality when they're used in the game so that current hardware is able to handle them. How much you may have to reduce them can be affected by the design goal of the game, i.e. 4k@60 or 4k@30 or 1080p@60, etc.

 

Edit:

TL;DR: I want a game that is impossible to run at 4K on two 2080 Tis. I want to see an actual leap in graphics quality beyond making things look just a little sharper or blend a tiny bit better. I don't want the game to be made easier to run, very much the opposite: make it so hard you cannot run it at 4K. At least then, in a generation or two, there will be hardware that can run it at 4K and the game will not date as quickly.


21 minutes ago, leadeater said:

I was not saying anyone made the game less graphically intensive because there is a 4K option. The point is that it is possible to use more geometrically complex models, more complex lighting and higher quality effects if you are willing to completely sacrifice the ability to run the game on current hardware at 4K. Game developers don't do that, because there is a current demand: it is popular in reviews and it is good marketing to have the game running well at 4K, so you, as a developer, are not going to make your game incapable of running at 4K.

 

And textures themselves are not all that demanding, and they aren't actually all of what you were talking about further on.

 

This is a texture file.

[image: example texture file]

 

This alone does not make something look more realistic or impressive. You can increase that texture resolution and apply it to a geometrically inaccurate non impressive model and it won't greatly improve the overall graphics quality of the game.

 

And most textures, even in high-res texture upgrade packs, are rarely 4096x4096 resolution. What matters for a texture is how much screen/image area it will take up on screen: if it is on something smaller, you don't need to go high-res on it. Some textures are also tiled over the model, like a brick wall, so they also do not need to be very large.

 

The fact that these very high quality art assets exist is not the issue, nor did I say they did not exist. The issue is the active choice, the requirement to lessen the quality when they're used in the game so that current hardware is able to handle them. How much you may have to reduce them can be affected by the design goal of the game, i.e. 4k@60 or 4k@30 or 1080p@60, etc.

*Possible* to use those things. Guess what, it's also *possible* to use them at 4k. Or 8k. Or 16k. I don't understand what you mean by "games developers don't".

I gave a list of 5 games that do, from before 4K was even a thing. I can go back further if you wish, *but CD/DVD sizes were not big enough to provide that content 15 years ago*. It was not *screen resolution* preventing the art department from releasing it.

 

That's why they did not provide those options in older PC games. Games consoles did not do multiple resolutions *because of TVs* (being a fixed resolution, unlike PC monitors having options), not because they did not have the assets.

 

So now that multi-resolution consoles/TVs and download services exist, we can have both worlds.

 

You are:

  • Referring to an old development restriction that no longer exists
  • Blaming the wrong part of the development pipeline/budget/limit

So yes, there was a restriction. But it was not because 4k (or 1080p back in the day) lowered FPS... it was not because 4k maxes out and breaks a GTX1030...

 

If a user chooses to run a game at 15FPS because they have a GTX 1030 doing 4K on GTAV Ultra, then they chose that. The game developers never did. The content does not "look worse", it slideshows. But *the frame is still rendered the same as on a GTX 1080!* Both cards render the same, but one renders fewer FPS. A games console is different, but 4K and 1080p are still targets, and currently either one suffers at the expense of the other.

 

*No game developer wants 4K on a GTX 1030!* So no game developer has lowered the quality of the art assets to "hit 4k@60". They *add a game settings slider from low to ultra*.


 

4 minutes ago, TechyBen said:

*Possible* to use those things. Guess what, it's also *possible* to use them at 4k. Or 8k. Or 16k.

It may not be possible to use them if there is a requirement for the game to be playable at 4K. Why you ignore this critical part I have no idea. I'm not talking about you, the customer, sitting in front of the computer selecting 4K resolution. If 4K is a design requirement of the game, set before you can ever play it, then you aren't going to make the game incapable of achieving that goal, are you? You would just have failed one of the project scope requirements.

 

6 minutes ago, TechyBen said:

it was not because 4k maxes out and breaks a GTX1030...

And what if it were to do the same to two 2080 Tis? Because you made the game so graphically demanding, and actually impressive looking compared to anything else before, yet it worked perfectly fine on one 2080 Ti at 1080p?

 

You keep bringing this back to user choice, the user selecting in-game options, the user. I'm talking about developer choice before you ever see, know or hear about the game. Project scopes, choices that are made to achieve the final product that you eventually see.


14 minutes ago, leadeater said:

It may not be possible to use them if there is a requirement for the game to be playable at 4k.

Again. No. There is no *requirement*. It will go "We want the game to be 4K 60FPS please". So, is that a set-in-stone requirement? On PC there are scaling options, so they will go "oh, we can throw in 16K, 64K and 96K options, it costs us 2 lines of code and 50c in dev budget, please put that slider in". I gave 5 examples of this.

 

Yet you still say they hold back art budget/content!!!

 

Another 5 examples:

Smite:

https://store.steampowered.com/app/677370/SMITE__Texture_Pack/

Mass Effect:

https://www.pcgamesn.com/mass-effect/mass-effect-trilogy-4k-texture-pack (Not official, but now available due to *higher VRAM size!!!*)

Far Cry 5:

https://www.guru3d.com/news-story/far-cry-5-has-just-received-a-graphical-upgrade-with-a-hr-texture-pack.html (Again due to VRAM)

Shadow of Mordor:

https://middle-earthgamessupport.wbgames.com/hc/en-us/articles/360001096167-How-do-I-use-Ultra-Textures-on-Shadow-of-Mordor-for-PC-

Minecraft:

https://www.minecraftforum.net/forums/mapping-and-modding-java-edition/resource-packs/2848934-ultra-realistic-texture-pack-1-8 Minecraft, because LOLZ!

 

 

 

Quote

I'm talking about developer choice before you ever see, know or hear about the game. Project scopes, choices that are made to achieve the final product that you eventually see.

Go find that developer then... I found 10 who are not doing what you *imagine* they are doing. (But you are correct about what the users are choosing to do, and it's not my place to judge if they prefer low res or high res, low FPS or high FPS)

 

Again, I "liked" your comment. Because I do agree with your principles and thoughts. I'm just trying to show that the game development and GPU render pipeline is not quite the limiting factors here. It is not 4K or 1080p. We have both (see the Simpson's Paradox graph).


(Double posting because I don't want to look like I'm editing out my posts. But I realised another example to show how I agree with you, but wish to show where your understanding is partially wrong).

 

Say Ben and Leadeater both choose 1080p gaming and cheaper, budget GPUs (we have GTX 1060s). Linus and Luke choose 8K gaming and 2x 2080 Tis.

 

  • Ben and Leadeater get high FPS at low settings and low FPS at high settings
  • Linus and Luke get high FPS at low settings and Low FPS at high settings.*

This is why I gave the example of the Simpson's paradox. Both groups of gamers get the same improvements over changes in game settings, but if we compare Luke and Ben, or Linus and Leadeater, we mistakenly blame the resolution. Now, I also agree that resolution *also* contributes to FPS. But it is not instead of the texture/poly budget, it's inclusive of it.

 

*If we give Linus and Luke a GTX1060s, they *still* get higher FPS when reducing the poly count. Even if trying to play at 8k. However, the game engine/development/art team have not taken away any of the content (even dynamic LOD will try to load in textures/poly if possible). It would be Linus and Luke trying too much on their cards, not the game developers.


40 minutes ago, TechyBen said:

Yet you still say they hold back art budget/content!!!

I never said that, ever. I said the assets that make it into the game are not the highest possible; they may be limited because there is no point putting in something so highly detailed that it's not possible to render on current hardware. But what if it is possible, only at 1080p? Currently I do not see any games going with that choice, a choice that would mean no card in existence at the time could render the game at 4K.

 

40 minutes ago, TechyBen said:

I'll just pick this one for example's sake. At 1080p, VRAM usage is around 3GB, and at 4K it's around 4.2GB. So yes, VRAM was an issue and would be a reason not to ship the game with the HD textures now available; darn all those 4GB GPU users who wouldn't have been able to run the game at 4K regardless. The game could have shipped with the HD textures. You also know this is a 2018 game, right? How long have 8GB graphics cards existed?

 

Seems to me you have just shown an active choice, this year, to limit game graphics because of a developer requirement/target for a certain type of hardware and resolution setting.

 

40 minutes ago, TechyBen said:

Lets go with an older 2014 one then, 1080p ~3.4GB and 4k ~4GB (No AA). For this game I would agree that the hardware at the time would not allow anything better from that game, the ultra texture pack could not have been used in the base game.

 

 

14 minutes ago, TechyBen said:

Say Ben and Leadeater both choose 1080p gaming and cheaper, budget GPUs (we have GTX 1060s). Linus and Luke choose 8K gaming and 2x 2080 Tis.

 

  • Ben and Leadeater get high FPS at low settings and low FPS at high settings
  • Linus and Luke get high FPS at low settings and Low FPS at high settings.*

This is why I gave the example of the Simpson's paradox. Both groups of gamers get the same improvements over changes in game settings, but if we compare Luke and Ben, or Linus and Leadeater, we mistakenly blame the resolution. Now, I also agree that resolution *also* contributes to FPS. But it is not instead of the texture/poly budget, it's inclusive of it.

But this example in no way represents what I am talking about, nor what I am wishing for from games. I am not mistakenly blaming resolution; in the examples I am using, the resolution would be to blame, because I am talking about creating a game so demanding that no hardware on the market could render it at 4K but could at 1080p. Who's doing that? No one. Why? Because right now 4K is a demand from users that developers wish to meet.


11 minutes ago, asus killer said:

Thanks, but as far as I can see, it proves my point. The FF example was resolution of textures/poly count. The GamesRadar article *proves they scale FPS not poly/texture count*.

 

12 minutes ago, leadeater said:

I'll just pick this one for example's sake. At 1080p, VRAM usage is around 3GB, and at 4K it's around 4.2GB. So yes, VRAM was an issue and would be a reason not to ship the game with the HD textures now available; darn all those 4GB GPU users who wouldn't have been able to run the game at 4K regardless. The game could have shipped with the HD textures. You also know this is a 2018 game, right? How long have 8GB graphics cards existed?

 

Yes. Most people (Steam stats as an example) have cards with 3-4GB of VRAM *now*, so even fewer did at release. It was a VRAM budget for the developers to keep to (they could increase the textures more, but would then hit the 5-8GB range). It was not "4K" preventing them from hitting the FPS/effects/texture quality budget.
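As a rough sanity check on those numbers, here is a small sketch of how much of the 1080p-to-4K gap the resolution-dependent buffers alone can account for; the six-target layout is an invented example, not any particular engine's.

```python
# Rough look at where a ~1.2 GB gap between the quoted 1080p (~3 GB) and
# 4K (~4.2 GB) figures can come from.

def render_targets_mb(width, height, num_targets, bytes_per_pixel=4):
    return width * height * num_targets * bytes_per_pixel / 1024 ** 2

for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    print(f"{name}: ~{render_targets_mb(w, h, 6):.0f} MB of full-res render targets")

# ~47 MB vs ~190 MB: the resolution-dependent buffers explain only a fraction
# of the gap; most of the rest is texture streaming pulling in higher mips
# because more texels are visible at 4K, which is the budget being argued about.
```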

 

From GamesRadar:

Quote

“It hasn’t affected other aspects of the game because I can’t imagine spending that 10 per cent on anything that would give better value to the player.”

and

Quote

It won’t come as a surprise that Crackdown 3’s 4K mode will run at 30 frames per second

So, instead of choosing lower textures, lower poly count and lower art assets (which would be dynamic LOD toggle/switch out anyhow) they choose 30FPS. They never downgraded the art, they downgraded the FPS. :)

 

(Another example I thought of last night https://en.wikipedia.org/wiki/The_Order:_1886 They chose lower FPS to hit the poly/texture asset needs. They did not lower the poly/texture assets to hit 1080p or 4K... they increased it!)


2 minutes ago, TechyBen said:

Thanks, but as far as I can see, it proves my point. The FF example was resolution of textures/poly count. The GamesRadar article *proves they scale FPS not poly/texture count*.

 

Just trying to add to the discussion; I have a different view, not that you're wrong, but that it also isn't that simple. Still, take it up with @leadeater.


Just now, asus killer said:

Just trying to add to the discussion; I have a different view, not that you're wrong, but that it also isn't that simple. Still, take it up with @leadeater.

Yeah, I can get with that. If you wish to say it's not that simple, I'm all ears for where it gets more complex (I've not even gone into it all myself, and the GamesRadar article reminds me that render pipelines have a lot of competing factors, so if Grow Home leaves headroom spare for more resolution with less performance hit, and BF5 just gets tanked at 4K, then there is a multitude of reasons why).

 

And I also agree, I prefer a middle of the ground image, high res everything, 60fps, less shader effects. But others might prefer low res, more shader effects. Others super high texture, low poly 4k. Others have 4 1080tis, and are doing 4K with ultra settings.

 

I also know how Level of Detail or frame render time goals work, and have not gone into those details. But games from 2001 were doing it, and so were not resolution limited, but GPU power limited.


12 minutes ago, TechyBen said:

Yes. Most people (Steam stats as an example) have cards with 3-4GB of VRAM *now*, so even fewer did at release. It was a VRAM budget for the developers to keep to (they could increase the textures more, but would then hit the 5-8GB range). It was not "4K" preventing them from hitting the FPS/effects/texture quality budget.

How many of those 3GB-4GB cards could have rendered the game at 4K with acceptable frame rates? None of them. The higher-res pack could have been used from the start, because at 1080p the VRAM usage would have allowed it, just not at 4K on cards not capable of it in the first place.

 

And there is a lot in that article that supports both sides of this.

Quote

But he goes further: “I think it would be an interesting world if we stuck at 720p and then just kept on scaling performance. Then developers would have more opportunity to use the GPU for more than just filling more pixels.” On one hand, that excess GPU power could be put into perfecting every pixel to better close the distance to CG movie visuals, employing high-quality antialiasing and lighting effects such as the realtime ray-tracing showcased by Microsoft and Unreal at GDC this year.

 

What is clear, though, is that 4K is a target they are all trying to meet; they even say so. So from start to end of the development cycle, 4K is a forefront requirement. There is no way this is not going to affect choices and drive technology in a certain direction.


3 minutes ago, TechyBen said:

Yeah, I can get with that. If you wish to say it's not that simple, I'm all ears for where it gets more complex (I've not even gone into it all myself, and the GamesRadar article reminds me that render pipelines have a lot of competing factors, so if Grow Home leaves headroom spare for more resolution with less performance hit, and BF5 just gets tanked at 4K, then there is a multitude of reasons why).

 

And I also agree, I prefer a middle of the ground image, high res everything, 60fps, less shader effects. But others might prefer low res, more shader effects. Others super high texture, low poly 4k. Others have 4 1080tis, and are doing 4K with ultra settings.

 

I also know how Level of Detail or frame render time goals work, and have not gone into those details. But games from 2001 were doing it, and so were not resolution limited, but GPU power limited.

I think he's right, and it's not the article as such, but the developers quoted there give weight to what Leadeater was arguing (if I don't make this even more confusing): 4K is a choice, and 4K impacts the "cost" of development; there is a tradeoff you will have to accept and hard choices they have to make in choosing to develop for 4K.


8 minutes ago, leadeater said:

How many of those 3GB-4GB cards could have rendered the game at 4K with acceptable frame rates? None of them. The higher-res pack could have been used from the start, because at 1080p the VRAM usage would have allowed it, just not at 4K on cards not capable of it in the first place.

Frame rates =|= image quality. They don't prevent each other. You posted the FF image as image quality. Developer FPS goal =|= removal of content/image quality (I will link to LOD pipeline explanations and pre-engine art assets if you wish). Do you want to learn about how game art assets are created, and how they are loaded/unloaded from the game engine?

 

Because you seem to think they downgrade the assets to hit a 60FPS mark at 4K. I'm saying they *don't*. Can you provide the example of 1 game that downscales texture resolution to hit 60FPS in 4K?

 

Quote

On one hand, that excess GPU power could be put into perfecting every pixel to better close the distance to CG movie visuals

How? We already have the render pipelines for better shaders. We already have the render pipelines for better poly count/textures. We already have these (I linked to 10 of them!!!). Crackdown 3 has high poly/texture at 1080p. How could they improve it at 1080p? It already maxes out at 60FPS; any improvement in textures/poly/shaders/rendering at 1080p will *lower the FPS at 1080p, not increase it!!!*

 

Trying to hit 4K will increase the poly/texture/shader count of art assets and they will downscale them to hit 30fps. These are added back when you or I or Asus Killer chooses 1080p!

 

4K is a higher resolution, so you have to have higher original art assets. Even if you later find your FPS budget does not allow them (as in your FF8 image you posted).


2 minutes ago, asus killer said:

I think he's right, and it's not the article as such, but the developers quoted there give weight to what Leadeater was arguing (if I don't make this even more confusing): 4K is a choice, and 4K impacts the "cost" of development; there is a tradeoff you will have to accept and hard choices they have to make in choosing to develop for 4K.

The issue I think is cropping up here is what people value more and consider a graphically better game. My taste, what I measure by, is different, so my conclusion is going to be different from another person's. I could see a beautiful, realistic-looking game, be amazed by it and call it the best-looking game ever; someone else might look at it and say it's not, because it's not being rendered at 4K, so it has drawbacks that are more important to them.


4 minutes ago, leadeater said:

The issue I think is cropping up here is what people value more and consider a graphically better game. My taste, what I measure by, is different, so my conclusion is going to be different from another person's. I could see a beautiful, realistic-looking game, be amazed by it and call it the best-looking game ever; someone else might look at it and say it's not, because it's not being rendered at 4K, so it has drawbacks that are more important to them.

Again. Show me one game they took content out of to hit a higher resolution.

 

Finally: textures and poly count are limited by framebuffers and CUDA cores. Yes, higher resolutions do impact this, but they are not the reason we do not have "photorealistic" games at 300x200p!


2 minutes ago, TechyBen said:

Frame rates =|= image quality.

Yes, I know that; a frame rate target will affect the image quality. They aren't mutually exclusive. Raise one and the other lowers; they are linked.

 

3 minutes ago, TechyBen said:

You posted the FF image as image quality.

No, I used that to show that the asset being used in a game would result in a worse-looking game; the higher quality asset would result in a better-looking game. Where it's actually important to this discussion is if the low quality asset has to be used to achieve 4K on current hardware, but the high quality one could be used if the target was 1080p instead. There is no guarantee that the higher quality asset will be available in the game if the development goal from day 1 was 4K.


12 minutes ago, TechyBen said:

Again. Show me one game they took content out of to hit a higher resolution.

Far cry 5.

 

Edit:

Also, how the heck could you possibly know, unless a dev said they did? You really think you can anticipate every choice during development to know why things were done a certain way? Nor am I trying to show or prove this has been done.

 

You seem to fundamentally miss the point of what I'm saying.

 

What if a game were to start development today and 4K was completely off the table; let's go as far as to remove it as a game resolution option. How good, graphically, could the game be made to look under this set of conditions? In my view, better than if 4K were a forefront goal. Other choices would be made; the resultant product would be different if it were done.

 

16 minutes ago, TechyBen said:

How? We already have the render pipelines for better shaders. We already have the render pipelines for better poly count/textures. We already have these (I linked to 10 of them!!!). Crackdown 3 has high poly/texture at 1080p. How could they improve it at 1080p? It already maxes out at 60FPS; any improvement in textures/poly/shaders/rendering at 1080p will *lower the FPS at 1080p, not increase it!!!*

Take that up with the developer who said it in the article, he seems to have the same thought process I'm alluding to.

 

16 minutes ago, TechyBen said:

any improvement in textures/poly/shaders/render at 1080p will *lower the FPS at 1080p not increase it!!!*

And I'm not opposed to that at all; that is for the consoles anyway. You can make that statement but replace console with PC and replace 1080p with 4K, and my reply would be: fine, do it, now lower the resolution to 1080p.

 

15 minutes ago, TechyBen said:

4K is a higher resolution, so you have to have higher original art assets

Not true, that may not happen at all. It's a reason to do so, it's also a reason not to.


Quote

a frame rate target will affect the image quality

Where? Show me. I showed you 10 games that have the image quality set to "ultra stupid high", and did not allow frame rate target to lower image quality. Find me 1 game that does this (because I know they exist, but then I can show you the limiting factor, and if it is resolution, VRAM, cuda core pipeline, game engine, development budget etc).

 

Quote

No, I used that to show that the asset being used in a game would result in a worse-looking game.

Yes, but no one is using the lower assets. I gave an example of 10 games using the higher assets than were *possible* at the time. Show me the 1 game using lower assets. :)

 

Quote

Now where it's actually important to this discussion is if the low quality asset has to be used to achieve 4k on current hardware

Which no one is doing. The GamesRadar article on one game (Crackdown) says "we lower the FPS", not the art assets*. So can you find me the one game that lowered the art assets?

 

*They do mention a theoretical "better" game at 1080p, but they do not provide any numbers/facts to back that up, such as how much extra money to spend on the art team, how much VRAM is spare, or how many CPU/GPU cycles are spare.


5 minutes ago, leadeater said:

The issue I think is cropping up here is what people value more and consider a graphically better game. My taste, what I measure by, is different, so my conclusion is going to be different from another person's. I could see a beautiful, realistic-looking game, be amazed by it and call it the best-looking game ever; someone else might look at it and say it's not, because it's not being rendered at 4K, so it has drawbacks that are more important to them.

I do agree you seem to be arguing different points there, with some confusion I don't care to go into.

But I do get your point: 4K doesn't make a "beautiful, realistic-looking game" on its own; it makes for a good-looking game, and those aren't the same thing.


9 minutes ago, asus killer said:

I do agree you seem to be arguing different points there, with some confusion I don't care to go into.

But I do get your point: 4K doesn't make a "beautiful, realistic-looking game" on its own; it makes for a good-looking game, and those aren't the same thing.

Yes this. Image quality *is* improved with 4K. Art quality is not. Art quality is improved *at any resolution*. FPS count is not. These are separate things, and some game developers/Users push for them as if they are the same.

 

Leadeater

Quote

Far cry 5.

I posted the Ultra HD Texture pack... They did not lower the quality for 4K!!!

 

I'm not trying to prove you wrong. I'm saying you don't seem to realise the game's higher art assets are there. No one is downgrading to hit 4K. They are dynamically lowering the assets when you choose 4K, but increasing them back up for 1080p.

 

Show me 1 game that took out the 1080p art assets/shaders/textures/polycounts to hit a 4K 60FPS budget.

 

[Just 1 addition to your edit]

 

Quote

What if a game were to start development today and 4K was completely off the table; let's go as far as to remove it as a game resolution option. How good, graphically, could the game be made to look under this set of conditions? In my view, better than if 4K were a forefront goal.

But they cannot. Because no one can do a photorealistic game at 300x200p. Resolution is not the limiting factor!

 

Again, I don't disagree with you. I'm saying no one in the industry is doing this to you. It sometimes happened in the past due to mainly DVD size/VRAM/code/GPU limits. It sometimes happens on Consoles (see Crackdown), but even then, it does not remove the art assets, as these get released later when those limits drop.

 

Quote

And I'm not opposed to that at all; that is for the consoles anyway. You can make that statement but replace console with PC and replace 1080p with 4K, and my reply would be: fine, do it, now lower the resolution to 1080p.

No! Because you just fudged the numbers into the Simpson's Paradox again.

[image: Simpson's paradox graph from the linked Wikipedia article]

Blue is 1080p, red is 4K. Notice both increase. FPS, art assets, gameplay. If we say "4K is making 1080p worse" we are following the black line. But both the red and blue lines exist. No one took away the blue line, 1080p, when they invented the red line (the GamesRadar article says they MIGHT do that, but currently DO NOT and instead lower the FPS).
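For anyone who prefers the Simpson's paradox point in numbers rather than a picture, here is a toy version; all figures are invented for illustration.

```python
# Within each resolution group, more "art budget" raises the quality score;
# compare the low end of one group against the high end of the other and the
# trend looks like it reverses, which is the paradox being referred to.
import statistics

data_1080p = [(1, 60), (2, 70), (3, 80), (4, 90)]   # blue line: trend is up
data_4k    = [(1, 30), (2, 40), (3, 50), (4, 60)]   # red line: trend is also up

def slope(points):
    xs, ys = zip(*points)
    mx, my = statistics.mean(xs), statistics.mean(ys)
    return sum((x - mx) * (y - my) for x, y in points) / sum((x - mx) ** 2 for x in points)

print("within 1080p:", slope(data_1080p))            # +10: more art, better score
print("within 4K:   ", slope(data_4k))               # +10: more art, better score
mixed = data_1080p[:2] + data_4k[2:]                 # the misleading "black line"
print("mixed across groups:", slope(mixed))          # -2: looks like it got worse
```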


2 minutes ago, TechyBen said:

I posted the Ultra HD Texture pack... They did not lower the quality for 4K!!!

 

I'm not trying to prove you wrong. I'm saying you don't seem to realise the game's higher art assets are there. No one is downgrading to hit 4K. They are dynamically lowering the assets when you choose 4K, but increasing them back up for 1080p.

It was not in the base game, sooo..... art assets were reduced.... It's not dynamic if it's a DLC or a mod, later released outside of base game.

 

So Far Cry 5 is my example. It wasn't a serious example, because I'm not looking for games that have done it. I have no interest in trying to find one, because that is not a position I'm trying to say has happened, not in the way you are trying to represent it. The way you are trying to phrase it does not match what I'm wishing for, so there is no good reason to keep going down that path.

 

You don't seem to be able to acknowledge that targeting 4K will have an impact on the choices game developers make, choices that will affect game graphics, and not always for the better.

