Battlefield V with DXR on, tested by TechSpot (Hardware Unboxed)

17 minutes ago, TechyBen said:

No. They ended up like that because they could spend more money on the art budget. And because texture sizes could increase. Poly numbers could increase. Why? Hardware improved. You could not push that number of polys, texture pipelines etc, at ANY resolution on the older hardware.

Rendering at 4k is not free, it does require more GPU resources so you may not be able to use the higher poly assets because if you do at 4k you're going to get 10 FPS but at 1080p you might get 30 FPS.

 

That was purely a hypothetical point, read it again as written. Ignore that it's 2001, ignore the hardware then, ignore the hardware now. Read the point, game one looks like crap because 4k was the target, game two looks far better because 1080p was the target.

 

Very blunt point, so maybe I was correct. Maybe you are not able to listen to the point I'm making. I totally do not disagree with what you say about requiring a higher art budget but you are not listening to the fact that 4k may limit what those artists will do or can do. 4k is not free.

 

 

Edit:

AHHHH!! it's so bloody frustrating when someone doesn't want to listen to your point.

 

maxresdefault.jpg

5 minutes ago, leadeater said:

Rendering at 4k is not free, it does require more GPU resources so you may not be able to use the higher poly assets because if you do at 4k you're going to get 10 FPS but at 1080p you might get 30 FPS.

 

That was purely a hypothetical point, read it again as written. Ignore that it's 2001, ignore the hardware then, ignore the hardware now. Read the point, game one looks like crap because 4k was the target, game two looks far better because 1080p was the target.

The setting "1080p or 4K" is free. It makes no difference to the art quality. Thus, 4K makes no difference to the quality of gameplay for those who choose 1080p. You suggested we should instead concentrate on the art.

 

I say we are concentrating on the art, because the button to set "4K or 1080p" was 5 seconds of programming, done 5 years ago.

 

Quote

 it does require more GPU resources so you may not be able to use the higher poly assets because

Look at the image/s I posted. All of them are *independent of resources*. Each game comes with at least 3 resource packs: low, med, high. Texture resolutions, model densities, shader pipelines etc. Again, I am not sure you understand how art for these things is made, or how the render engine processes them.

 

Quote

Read the point, game one looks like crap because 4k was the target, game two looks far better because 1080p was the target.

[edit] Found it:

b55263e9_Yuna.png

NO!

Resolution =/= art budget!

Left is 4K WITH low art budget. Right is 1080p WITH high art budget.

You can render the right at 4K. You can do this at 60FPS with scene dynamics and/or texture limits, poly limits etc. OR you can lower the FPS. However, *you* can choose which, it does not affect how the game is released.

 

Show me 1 game where they chose the image on the left over the image on the right (as said, you can already download high quality art assets for most games, because these files *already exist*).

 

Every game is already as the right image. None are made as the left unless the hardware cannot push them even at lower resolutions.

9 minutes ago, TechyBen said:

Wait, where? You never posted any image of 4K vs 1080p? (I'm going back through the thread now, so sorry if I find it after this post).

Your two, of FFX and Spirits Within. I'm using those two as an unrealistic example of why aiming for 4k could matter.

 

Hypothetical 4K game: horrible, don't want it.


FFX_Party.JPG

 

Hypothetical 1080p game: excellent, focus on that.


Aki_Ross_(sample_image).jpg

 

 

9 minutes ago, TechyBen said:

I say we are concentrating on the art, because the button to set "4K or 1080p" was 5 seconds of programming, done 5 years ago.

I'm saying that what we do and can do with the art is limited when you aim for 4k, because rendering at 4k is not free; it increases GPU resource load.

12 minutes ago, leadeater said:

Your two, of FFX and Spirits Within. I'm using those two as an unrealistic example of why aiming for 4k could matter.

 

I'm saying that what we do and can do with the art is limited when you aim for 4k, because rendering at 4k is not free; it increases GPU resource load.

Exactly. Your example is unrealistic. Sorry, your example does not make sense. It does not follow the actual art development, or the render budget. Look to see if there is a game where you can say to me "it's worse because they aimed at 4K so had to reduce the poly count and texture resolution". Again, art development and game code do not work like that AFAIK; they do reduce the texture and poly count, but that does not impact FPS enough to make it worth it for resolution increases. The pictures you posted do look worse/better, but *because they can now render better textures/models*, not because anyone stopped painting the necklace because "who cares, 16k lolz". They stopped because "320p? Oh, I cannot even see the face, let alone the necklace, better stop painting".

[edit] Or in your example of the render in Blender/game, it would have to be an FPS goal. But guess what: for example GTA V 4K at low: 95 FPS, GTA V 4K at Ultra: 20 FPS... but the game *still has ultra settings*. It still has better artwork than the FF example! They did not remove the "ultra" to hit the 4K budget, but left in the low, and left in the 4K, but also left in the high, and left in the 1080p... actually, what is the minimum resolution? [/edit]
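To put that last point in concrete terms, here is a toy cost model (made-up numbers and hypothetical names, not how any real engine prices its settings): resolution and the quality preset are independent options that the same shipped game exposes side by side.

```python
# Toy sketch (hypothetical names, made-up cost numbers): resolution and the
# quality preset are independent settings in the same shipped game; offering
# a 4K option does not remove the Ultra assets from the 1080p player.
from dataclasses import dataclass

PRESET_COST = {"low": 1.0, "high": 2.5, "ultra": 4.0}  # invented relative GPU cost

@dataclass
class RenderSettings:
    width: int
    height: int
    preset: str  # "low" | "high" | "ultra"

    def relative_cost(self, baseline_pixels: int = 1920 * 1080) -> float:
        """Crude model: pixel count scales one part of the work, the preset the rest."""
        pixel_scale = (self.width * self.height) / baseline_pixels
        return pixel_scale * PRESET_COST[self.preset]

# The player, not the artist, picks the combination; every combination exists.
for w, h in [(1920, 1080), (3840, 2160)]:
    for preset in ("low", "ultra"):
        print(f"{w}x{h} {preset}: {RenderSettings(w, h, preset).relative_cost():.1f}x")
```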

 

You've literally done a Simpson's Paradox on the data. :P

 

Examples: 

Crysis high texture DLC shortly after release.

https://www.gamepressure.com/download.asp?ID=34398

 

Skyrim High quality Texture DLC shortly after release:

https://store.steampowered.com/sub/13437/

 

Fallout 4 High quality Texture DLC shortly after release:

https://store.steampowered.com/app/540810/Fallout_4__High_Resolution_Texture_Pack/

 

Watchdogs 2 Ultra Quality Texture Pack DLC (not sure when released)

https://support.ubi.com/en-GB/Faqs/000025708/Ultra-Texture-Pack-Installation

 

Again, the art already exists. The reason you don't get it, is not because of the monitor resolution.

 

Look at those art asset packs, and see if you can see the reason it's not on general release, or not put in the base game. Can you see one or two possible difficulties with it?


@TechyBen: What is so hard for you to grasp here?

 

The higher your target main gameplay resolution on the same hardware, the less graphically intensive the actual scenes can be, because rendering for a 4k display takes more GPU resources. Thus, if you want the game to run at all on available hardware, you have to accept penalties in overall graphics quality so the current hardware can run it.

 

It's really basic stuff and game graphical design 101. Wherever possible you don't build a game whose intended normal-play graphics can't run on hardware available around its release. Obviously buggy games, bad optimizations, or miscalculations can all make this not pan out, but no one deliberately builds a game that needs next gen, or even next-next gen, hardware to run at its intended quality level.

 

They'll build in some level of fidelity beyond their intended level for people with top-end release-era hardware, but they don't deliberately do so to such a point that no one can run it at reasonable graphical settings on release-era hardware.

33 minutes ago, TechyBen said:

You've literally done a Simpson's Paradox on the data. :P

Literally had to; you forced me to, because you were not addressing at all what I was saying and were going off on your own points. Yes, I understand what you were saying; yes, I agree it's heavily dependent on art budget. But, as you yourself mentioned, we can only do what we can now because GPU resources have gotten better, and that is the enabler, the reason artists and producers are willing to do more because it's not wasted effort/money on the impossible.

 

So with that very thing in mind: GPU resources are not unlimited, so textures (FYI not always technically correct usage of that word, nor the only thing I was talking about) are only going to be as good as it's possible to actually utilize.

 

Were those higher quality texture packs not used originally, or in the base game, because they actually want the game to be able to be rendered at 4k by more people? Highly likely. There is no point doing something which is going to make the game near impossible for most people to be able to render on their computer.

 

That's where choices come in: do we go for higher quality assets, making it impossible for any current GPU to render at 4k, or do we settle here so they can? Do we add more light sources, work on more realistic overall lighting, better shadows, reflections etc? Take your pick of any of these, or combinations, that result in being impossible to render at 4k on the currently available hardware the developers will be releasing their game to. Wanting your game to be able to be rendered at 4k will impact all of this.

12 hours ago, CarlBar said:

Alright, a quick cliff-notes version of the 3 ray tracing types and how they work.

 

All raytracing, even hybrid, starts by shooting 1 or more rays per pixel from the camera position through the pixels on the virtual screen (the contents of this virtual screen are what you actually see, by the way). These rays keep going till they hit something in the game's 3D render of the scene. We'll call these rays master rays; there's probably some real fancy name for them, but "master rays" will suffice for our purposes. What happens next exactly depends on what is being handled by raytracing. In a fully raytraced scene, several different types of ray then emanate from where each ray touches the scene. Again, for simplicity's sake, put all of these under the general classification of "secondary rays". For the purposes of what is implemented in hybrid raytracing we only really care about 3, as those are the only types implemented.

 

The first of these secondary rays is properly called the reflection ray. This shoots off at an angle set by the angle between the surface and the master ray, just like light striking a mirror in real life. The reflection ray then shoots off and bounces around picking up colour info, which is then combined with the colour info from the master ray to produce the final pixel colour value.

 

The second is the shadow ray. I haven't seen a lot on this, so I'm not sure how the direction it goes off in is decided, but it detects the shadowing effect of any surfaces it passes through on its way out. Again, this is combined with master ray data to decide the final pixel colour.

 

The third type are known as light rays. Whilst hybrid ray tracing puts a cap on the number of light sources that can be used, this fires a ray off at every light source in the scene until it hits the ray cap, again picking up lighting factors as it goes, which again combine with the master ray to get a final result.

 

The thing is, all 3 of these effects can be implemented from the same master ray (in fact they have to be, if combined), and the results of all 3 are put together to produce the final pixel values.

 

What this also means, however, is that the master rays are a one-time overhead item; adding extra RTX effects doesn't necessarily double the workload. In fact, because of the amount of bouncing they can do (I believe BFV limits it to 4 bounces total), reflection is amongst the most intensive to do, unless you have a scene with an enormous number of lights. But I think current software caps it at the 3 closest light sources. As a result, implementing additional effects isn't a simple scalar.
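A heavily simplified Python sketch of the ray layout described above (all names and values are hypothetical, colours are collapsed to a single float, and this is not any engine's actual API): one "master" ray per pixel, whose hit point is then reused by the reflection, shadow and light rays, which is why the master rays are a one-time cost.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

Vec = Tuple[float, float, float]

def reflect(d: Vec, n: Vec) -> Vec:
    """Mirror direction d about surface normal n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

@dataclass
class Hit:
    point: Vec
    normal: Vec
    colour: float        # simplified: a single brightness value
    reflectivity: float

def shade_pixel(
    trace: Callable[[Vec, Vec], Optional[Hit]],   # casts a ray, returns nearest hit
    shadow_blocked: Callable[[Vec, Vec], bool],   # shadow ray: is this light occluded?
    lights: List[Vec],
    camera_pos: Vec,
    pixel_dir: Vec,
    max_bounces: int = 4,   # BFV reportedly caps reflection bounces around here
    max_lights: int = 3,    # and light rays at the closest few light sources
) -> float:
    hit = trace(camera_pos, pixel_dir)            # the one-time "master" ray
    if hit is None:
        return 0.1                                # sky/background value
    colour = hit.colour

    # Reflection rays: keep bouncing from the shared hit point.
    origin, direction, weight = hit.point, reflect(pixel_dir, hit.normal), hit.reflectivity
    for _ in range(max_bounces):
        bounce = trace(origin, direction)
        if bounce is None:
            break
        colour += weight * bounce.colour
        origin = bounce.point
        direction = reflect(direction, bounce.normal)
        weight *= bounce.reflectivity

    # Shadow + light rays: reuse the same master-ray hit point, capped light count.
    for light in lights[:max_lights]:
        if not shadow_blocked(hit.point, light):  # shadow ray
            colour += 0.2                         # toy light-ray contribution
    return colour

# Toy usage: a "scene" whose trace always hits one grey, slightly reflective surface.
demo_hit = Hit(point=(0.0, 0.0, 5.0), normal=(0.0, 0.0, -1.0), colour=0.5, reflectivity=0.3)
print(shade_pixel(
    trace=lambda o, d: demo_hit if o == (0.0, 0.0, 0.0) else None,
    shadow_blocked=lambda p, l: l[0] > 0,
    lights=[(-3.0, 2.0, 1.0), (3.0, 2.0, 1.0)],
    camera_pos=(0.0, 0.0, 0.0),
    pixel_dir=(0.0, 0.0, 1.0),
))
```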

You make it sound like ray tracing is a lot more achievable than many would believe. I certainly hope that is the case, since that would mean better visuals should be closer, but the demonstrations we have been shown would say otherwise. I guess the question is how much time the master rays portion of ray tracing typically takes to compute. If 1080p ultra will yield a frame every 6.7ms typically, and at 1080p ultra with ray tracing enabled it becomes 20.4ms (17.9ms with RT set to low), then it's fair to say it must be somewhere around 10ms or maybe a tad less. Then my next question would be about which type of ray is typically the most taxing to implement of the three types of rays.
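Working through those frame times as a quick back-of-envelope (my own split of the numbers quoted above, not measured data), and assuming RT low and RT ultra share the same master-ray pass:

```python
# Back-of-envelope from the frame times above (assumed split, not measured):
# if RT low and RT ultra share the same master-ray pass, most of the jump
# from "RT off" to "RT low" is that shared pass.
raster_only = 6.7    # ms/frame, 1080p ultra, DXR off
rt_low      = 17.9   # ms/frame, DXR low
rt_ultra    = 20.4   # ms/frame, DXR ultra

shared_overhead = rt_low - raster_only   # ~11.2 ms: master rays + cheapest secondary work
extra_for_ultra = rt_ultra - rt_low      # ~2.5 ms: the additional secondary-ray work
print(shared_overhead, extra_for_ultra)  # consistent with "~10 ms or a tad less" for master rays
```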


7 hours ago, CarlBar said:

@TechyBen: What is so hard for you to grasp here?

 

The higher your target main gameplay resolution on the same hardware, the less graphically intensive the actual scenes can be, because rendering for a 4k display takes more GPU resources. Thus, if you want the game to run at all on available hardware, you have to accept penalties in overall graphics quality so the current hardware can run it.

 

It's really basic stuff and game graphical design 101. Wherever possible you don't build a game whose intended normal-play graphics can't run on hardware available around its release. Obviously buggy games, bad optimizations, or miscalculations can all make this not pan out, but no one deliberately builds a game that needs next gen, or even next-next gen, hardware to run at its intended quality level.

 

They'll build in some level of fidelity beyond their intended level for people with top-end release-era hardware, but they don't deliberately do so to such a point that no one can run it at reasonable graphical settings on release-era hardware.

Yes... But no one is making you (or leadeater) run the game at 4K. Thus you still have "Ultra" settings, higher art assets and better images than those running at 4K... in fact you're still wrong, because (as said, for example) GTAV 4K "Ultra" settings still exist (or choose any other PC game, I gave a list of 5 older games with ultra DLC artwork packs). No one made the game "less graphically intensive" because there is a 4K option there. What they have done, is allow the game to *possibly* downscale graphics intensity for 4K.

 

So no one was forced into the image with the low res FF textures. The PS2/3/4 may not be able to display those textures AND 4K/1080p/524p at the same time, but the games would have also either dynamically scaled content (LOD models/settings) or, in the case of the PS4, have the option for 1080p OR 4K.

 

That's all I've tried to say. It's not one preventing the other. We have both. We have 1080p at low and high settings, we have 4K at low and high settings. Why? Because unless it's a games console, the hardware exists/will exist to drive the higher resolutions eventually.

 

The FF image would be blurred because texture units, VRAM, pixel pipelines/shader cores etc are not enough to drive it at a high resolution. This does not stop you running the game at a lower resolution.

 

Finally "looks worse" is subjective when comparing 2 different scaling options. For some people, their vision notices blurred textures/low poly more. For some people their vision notices strobing/pixel edges more. So, *both* are options for improving fidelity.

 

  • One can improve texture/poly resources but suffer FPS/resolution performance.
  • One can choose resolution improvement (better aliasing/colour gradients/smoothness of frame interpolation) but suffer texture/poly resource performance.

If you have the hardware, you can choose both, or you can choose one. But both are "improvements" to image fidelity, both are reductions in FPS, and they are not mutually exclusive. You have done a Simpson's paradox*, because you've concentrated on 4K lowering FPS/performance but not accepted that higher textures/polycount lowers FPS/performance.

 

Leadeater:

Quote

the reason artists and producers are willing to do more because it's not wasted effort/money on the impossible.

I gave an example of 5 games, from before 4K was a thing, that "not wasted effort" was already done. I gave an example of an art asset that could do 16 bazillion K screen resolution. The art assets exist. The GPUs do not. It's also not "impossible", as 2 sli 2080s exist. It's just expensive! XD

 

Quote

So with that very thing in mind: GPU resources are not unlimited, so textures (FYI not always technically correct usage of that word, nor the only thing I was talking about) are only going to be as good as it's possible to actually utilize.

Yes, but as you say, GPU resources are not unlimited. This includes texture budget *regardless of display resolution*, and includes pixel pipeline/shader cores *irrespective of display resolution*. You get a sliding scale between the two, but it's not always resolution bound (again, see Grow Home, or a really good multi-pipeline and scaled game engine).

 

Quote

Were those higher quality texture packs not used originally, or in the base game, because they actually want the game to be able to be rendered at 4k by more people? Highly likely. There is no point doing something which is going to make the game near impossible for most people to be able to render on their computer.

No. Because they were 85GB!!! Because they were over the texture VRAM of most cards. Because they were 100mil art department budget/timescale costs. Because they *did not even do 4K back then*. These were mainly 1080p games.

 

"highly likely", is your feelings. I agree with your feelings. I agree with your assessment and preference of the art, resolution and poly count. I disagree that you know why these things are limited in games.

 

 

*https://en.wikipedia.org/wiki/Simpson's_paradox

We have 2 scales. Blue would be "better art, poly, texture, effects" and red would be "better resolution, ray tracing, game engine etc". But if we choose the worse of red, and the best of blue, we can think it "gets worse" and not gets better.

19 minutes ago, TechyBen said:

No one made the game "less graphically intensive" because there is a 4K option there. What they have done, is allow the game to *possibly* downscale graphics intensity for 4K.

I was not saying anyone made the game less graphically intensive because there is a 4k option. The point is that it is possible to use more geometrically complex models, more complex lighting, and higher quality effects if you are willing to completely sacrifice the ability to run the game on current hardware at 4k. Game developers don't do that because there is a current want; it is popular in reviews, it is good marketing to have the game running well at 4k, so you, as a developer, are not going to make your game incapable of being able to run at 4k.

 

And textures themselves are not all that demanding, and aren't actually completely what you were talking about further on.

 

This is a texture file.

t7kVK90.png

 

This alone does not make something look more realistic or impressive. You can increase that texture resolution and apply it to a geometrically inaccurate non impressive model and it won't greatly improve the overall graphics quality of the game.

 

And most textures, even in high res texture upgrade packs, are rarely 4096x4096 resolution. What matters for a texture is how much screen/image area it will take up on screen; if it is on something smaller in size then you don't need to go high res on it. Some textures are also tiled over the model, like a brick wall, so those also do not use very large textures.
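A rough, made-up illustration of that screen-coverage rule (my own numbers and function, not anything from an engine): the texture resolution worth shipping scales with how many screen pixels the surface actually covers, not with the preset name.

```python
# Back-of-envelope (invented example, not engine logic): the texture size worth
# shipping is bounded by how many screen pixels the surface covers.
import math

def needed_texture_width(screen_width_px: int, fraction_of_screen: float,
                         tiles_across: float = 1.0) -> int:
    """Smallest power-of-two texture width giving ~1 texel per covered pixel."""
    covered_px = screen_width_px * fraction_of_screen / tiles_across
    return 2 ** math.ceil(math.log2(max(covered_px, 1)))

# A brick wall filling half the screen width, with the texture tiled 4x across it:
print(needed_texture_width(1920, 0.5, tiles_across=4))  # 256 is enough at 1080p
print(needed_texture_width(3840, 0.5, tiles_across=4))  # 512 at 4K
# A small prop covering ~5% of the screen width:
print(needed_texture_width(3840, 0.05))                 # 256 even at 4K
```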

 

The fact that these very high quality art assets exist is not the issue; neither did I say they did not exist. The issue is the active choice, the requirement to lessen the quality when used in the game so current hardware is able to handle them. How much you may have to reduce them can be affected by the design goal of the game, i.e. 4k@60 or 4k@30 or 1080p@60 etc.

 

Edit:

TL;DR I want a game that is impossible to run at 4k on 2 2080 Tis. I want to see an actual leap in graphics quality beyond making things look just a little sharper, blend a tiny bit better. I don't want the game to be made easier to run, very much the opposite. Make it so hard you cannot run it at 4k. At least in a generation or two there will be hardware that could run it at 4k, and the game will not date as quickly.

21 minutes ago, leadeater said:

I was not saying anyone made the game less graphically intensive because there is a 4k option. The point is that it is possible to use more geometrically complex models, more complex lighting, and higher quality effects if you are willing to completely sacrifice the ability to run the game on current hardware at 4k. Game developers don't do that because there is a current want; it is popular in reviews, it is good marketing to have the game running well at 4k, so you, as a developer, are not going to make your game incapable of being able to run at 4k.

 

And textures themselves are not all that demanding, and aren't actually completely what you were talking about further on.

 

This is a texture file.

t7kVK90.png

 

This alone does not make something look more realistic or impressive. You can increase that texture resolution and apply it to a geometrically inaccurate non impressive model and it won't greatly improve the overall graphics quality of the game.

 

And most textures, even in high res texture upgrade packs, are rarely 4096x4096 resolution. What matters for a texture is how much screen/image area it will take up on screen; if it is on something smaller in size then you don't need to go high res on it. Some textures are also tiled over the model, like a brick wall, so those also do not use very large textures.

 

The fact that these very high quality art assets exist is not the issue; neither did I say they did not exist. The issue is the active choice, the requirement to lessen the quality when used in the game so current hardware is able to handle them. How much you may have to reduce them can be affected by the design goal of the game, i.e. 4k@60 or 4k@30 or 1080p@60 etc.

*Possible* to use those things. Guess what, it's also *possible* to use them at 4k. Or 8k. Or 16k. I don't understand what you mean by "games developers don't".

I gave a list of 5 games that do, before 4K was even a thing. I can go back further if you wish, *but CD/DVD sizes were not big enough to provide that content 15 years ago*. It was not *screen resolution* preventing the art department from releasing it.

 

That's why they did not provide those options on older PC games. Games consoles did not do multiple resolutions *because of TVs* (being a fixed resolution, unlike PC monitors having options). Not because they did not have the assets.

 

So now multi resolution consoles/TVs and download services exist, we can now have both worlds.

 

You are:

  • Referring to an old development restriction that no longer exists
  • Blaming the wrong part of the development pipeline/budget/limit

So yes, there was a restriction. But it was not because 4k (or 1080p back in the day) lowered FPS... it was not because 4k maxes out and breaks a GTX1030...

 

If a user chooses to run a game at 15fps because they have a GTX 1030 doing 4K on GTAV Ultra, then they chose that. The game developers never did. The content does not "look worse", it slideshows. But *the frame is still rendered the same as on a GTX 1080*! Both cards render the same, but one renders fewer FPS. A games console is different, but 4K and 1080p are still targets; currently either can suffer for the other.

 

*No game developer wants 4K on a GTX 1030!* So no game developer has lowered the quality of the art assets to "hit 4k@60". They *add a game settings slider from low to ultra*.


 

4 minutes ago, TechyBen said:

*Possible* to use those things. Guess what, it's also *possible* to use them at 4k. Or 8k. Or 16k.

It may not be possible to use them if there is a requirement for the game to be playable at 4k. Why you ignore the critical part of this I have no idea. I'm not talking about you, the customer, sitting in front of the computer selecting 4k resolution. If that is the design requirement of the game, before you can play it, then you aren't going to make the game impossible to achieve the goal, are you; you would have just failed to achieve one of the project scope requirements.

 

6 minutes ago, TechyBen said:

it was not because 4k maxes out and breaks a GTX1030...

And if it were to do the same to 2 2080 Tis? Because you make the game so graphically demanding and actually impressive looking compared to anything else before? But it worked perfectly fine on 1 2080 Ti at 1080p?

 

You keep bringing this back to user choice, the user selecting an in-game option, the user. I'm talking about developer choice before you ever see, know or hear about the game. Project scopes, choices that are made to achieve the final product that you eventually see.

14 minutes ago, leadeater said:

It may not be possible to use them if there is a requirement for the game to be playable at 4k.

Again. No. There is no *requirement*. It will go "We want the game to be 4K 60FPS please". So, is that a set-in-stone requirement? On PC, there are scaling options, so they will go "oh, we can throw in 16k, 64k and 96k options, it costs us 2 lines of code, 50c in dev budget, please put that slider in". I gave 5 examples of this.

 

Yet you still say they hold back art budget/content!!!

 

Another 5 examples:

Smite:

https://store.steampowered.com/app/677370/SMITE__Texture_Pack/

Mass Effect:

https://www.pcgamesn.com/mass-effect/mass-effect-trilogy-4k-texture-pack (Not official, but now available due to *higher VRAM size!!!*)

Far Cry 5:

https://www.guru3d.com/news-story/far-cry-5-has-just-received-a-graphical-upgrade-with-a-hr-texture-pack.html (Again due to VRAM)

Shadow of Mordor:

https://middle-earthgamessupport.wbgames.com/hc/en-us/articles/360001096167-How-do-I-use-Ultra-Textures-on-Shadow-of-Mordor-for-PC-

Minecraft:

https://www.minecraftforum.net/forums/mapping-and-modding-java-edition/resource-packs/2848934-ultra-realistic-texture-pack-1-8 Minecraft because LOLZ!

 

 

 

Quote

I'm talking about developer choice before you ever see, know or hear about the game. Project scopes, choices that are made to achieve the final product that you eventually see.

Go find that developer then... I found 10 who are not doing what you *imagine* they are doing. (But you are correct about what the users are choosing to do, and it's not my place to judge if they prefer low res or high res, low FPS or high FPS)

 

Again, I "liked" your comment. Because I do agree with your principles and thoughts. I'm just trying to show that the game development and GPU render pipeline is not quite the limiting factors here. It is not 4K or 1080p. We have both (see the Simpson's Paradox graph).


(Double posting because I don't want to look like I'm editing out my posts. But I realised another example to show how I agree with you, but wish to show where your understanding is partially wrong).

 

Ben and Leadeater both choose 1080p gaming and cheaper-budget GPUs (we have GTX 1060s). Linus and Luke choose 8k gaming and 2x 2080 Tis.

 

  • Ben and Leadeater get high FPS at low settings and low FPS at high settings
  • Linus and Luke get high FPS at low settings and Low FPS at high settings.*

This is why I gave the example of the Simpson's Paradox. Both groups of gamers get the same improvements over changes in game settings, but if comparing Luke and Ben, or Linus and Leadeater, we mistakenly blame the resolution. Now, I also agree resolution *also* contributes to FPS. But it is not instead of the texture/poly budget, it's inclusive of it.

 

*If we give Linus and Luke GTX 1060s, they *still* get higher FPS when reducing the poly count, even if trying to play at 8k. However, the game engine/development/art team have not taken away any of the content (even dynamic LOD will try to load in textures/polys if possible). It would be Linus and Luke trying too much on their cards, not the game developers.
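To put invented numbers on that comparison (the FPS figures below are entirely made up, purely to illustrate the confound): within a fixed resolution the preset costs FPS, while comparing mismatched pairs mixes both axes at once.

```python
# Made-up FPS figures purely to illustrate the comparison above: resolution and
# quality preset each cost frame rate independently.
fps = {
    ("1080p", "low"):   140,  # Ben / Leadeater, low settings
    ("1080p", "ultra"):  60,  # Ben / Leadeater, high settings
    ("8k",    "low"):    55,  # Linus / Luke, low settings
    ("8k",    "ultra"):  22,  # Linus / Luke, high settings
}

# Fair comparison: hold resolution fixed, vary only the preset.
for res in ("1080p", "8k"):
    print(res, "low -> ultra costs", fps[(res, "low")] - fps[(res, "ultra")], "FPS")

# Misleading comparison: Ben at 1080p/ultra vs Luke at 8k/low changes both axes
# at once, so the gap cannot be pinned on resolution (or the preset) alone.
print("1080p ultra:", fps[("1080p", "ultra")], "FPS vs 8k low:", fps[("8k", "low")], "FPS")
```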

40 minutes ago, TechyBen said:

Yet you still say they hold back art budget/content!!!

I never said that, ever. I said the assets that make it into the game are not the highest possible; they may be limited because there is no point putting in something so highly detailed that it's not possible to render on current hardware. But what if it is possible, but only at 1080p? Currently I do not see any games going with this choice, a choice that would mean no card in existence at the time could render the game at 4k.

 

40 minutes ago, TechyBen said:

I'll just pick this one for example's sake. At 1080p VRAM usage is around 3GB and 4k is around 4.2GB. So yes, VRAM was an issue and would be a reason to not ship the game with the HD textures now available; darn all those 4GB GPU users who wouldn't be able to run the game at 4k regardless. The game could have shipped with the HD textures. You also know this is a 2018 game, right? How long have 8GB graphics cards existed?

 

Seems to me you have just shown an active choice, this year, to limit game graphics because of a developer requirement/target for a certain type of hardware and resolution setting.

 

40 minutes ago, TechyBen said:

Let's go with an older 2014 one then: 1080p ~3.4GB and 4k ~4GB (no AA). For this game I would agree that the hardware at the time would not allow anything better from that game; the ultra texture pack could not have been used in the base game.

 

 

14 minutes ago, TechyBen said:

Ben and Leadeater both choose 1080p gaming and cheaper-budget GPUs (we have GTX 1060s). Linus and Luke choose 8k gaming and 2x 2080 Tis.

 

  • Ben and Leadeater get high FPS at low settings and low FPS at high settings
  • Linus and Luke get high FPS at low settings and Low FPS at high settings.*

This is why I gave the example of the Simpson's Paradox. Both groups of gamers get the same improvements over changes in game settings, but if comparing Luke and Ben, or Linus and Leadeater, we mistakenly blame the resolution. Now, I also agree resolution *also* contributes to FPS. But it is not instead of the texture/poly budget, it's inclusive of it.

But this example in no way represents what I am talking about, nor what I am wishing for from games. I am not mistakenly blaming resolution; in the examples I am using, the resolution would be to blame, because I am talking about creating a game that is so demanding no hardware on the market could render it at 4k but could at 1080p. Who's doing that? No one. Why? Because right now 4k is a demand from users that developers wish to meet.

11 minutes ago, asus killer said:

Thanks, but as far as I can see, it proves my point. The FF example was resolution of textures/poly count. The GamesRadar article *proves they scale FPS not poly/texture count*.

 

12 minutes ago, leadeater said:

I'll just pick this one for example's sake. At 1080p VRAM usage is around 3GB and 4k is around 4.2GB. So yes, VRAM was an issue and would be a reason to not ship the game with the HD textures now available; darn all those 4GB GPU users who wouldn't be able to run the game at 4k regardless. The game could have shipped with the HD textures. You also know this is a 2018 game, right? How long have 8GB graphics cards existed?

 

Yes. Most people (steam stats as an example) have cards with 3-4gb VRAM *now*. So less so at release. It was a VRAM budget for the developers to keep to (they can increase the textures more, but would then hit the 5-8gb range). It was not "4K" preventing them from hitting the FPS/effects/texture quality budget.
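A rough back-of-envelope on the resolution-dependent part of that VRAM (my own assumed buffer count and pixel format, not taken from any particular game):

```python
# Back-of-envelope (assumed buffer count/format, not from any real game): the
# part of VRAM that scales directly with screen resolution is small next to a
# multi-GB texture budget.
def fullscreen_buffers_mib(width, height, bytes_per_pixel=4, buffer_count=6):
    """Approximate VRAM for render targets (colour, depth, G-buffer, etc.)."""
    return width * height * bytes_per_pixel * buffer_count / 2**20

print(round(fullscreen_buffers_mib(1920, 1080)))  # ~47 MiB of render targets at 1080p
print(round(fullscreen_buffers_mib(3840, 2160)))  # ~190 MiB at 4K
# So a jump like ~3GB -> ~4.2GB between 1080p and 4k is mostly other
# resolution-scaled data (streaming pools, higher mips kept resident, and so
# on, varying by engine), while the texture pack itself is largely a budget
# choice independent of the display resolution.
```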

 

From GamesRadar:

Quote

“It hasn’t affected other aspects of the game because I can’t imagine spending that 10 per cent on anything that would give better value to the player.”

and

Quote

It won’t come as a surprise that Crackdown 3’s 4K mode will run at 30 frames per second

So, instead of choosing lower textures, lower poly count and lower art assets (which would be a dynamic LOD toggle/switch-out anyhow), they chose 30FPS. They never downgraded the art, they downgraded the FPS. :)

 

(Another example I thought of last night: https://en.wikipedia.org/wiki/The_Order:_1886 They chose lower FPS to hit the poly/texture asset needs. They did not lower the poly/texture assets to hit 1080p or 4K... they increased them!)

2 minutes ago, TechyBen said:

Thanks, but as far as I can see, it proves my point. The FF example was resolution of textures/poly count. The GamesRadar article *proves they scale FPS not poly/texture count*.

 

Just trying to add to the discussion; I have a different view, not that you're wrong but that it also isn't that simple. Still, take it up with @leadeater.


Just now, asus killer said:

Just trying to add to the discussion; I have a different view, not that you're wrong but that it also isn't that simple. Still, take it up with @leadeater.

Yeah. I can get with that. If you wish to say it's not that simple, I'm all ears for where it gets more complex (I've not even gone into it all myself, and the GamesRadar article reminds me also that render pipelines have a lot of competing factors, so if Grow Home leaves resources spare for more resolution and less performance hit, and BF5 just gets tanked at 4K, then there is a multitude of reasons why).

 

And I also agree, I prefer a middle of the ground image, high res everything, 60fps, less shader effects. But others might prefer low res, more shader effects. Others super high texture, low poly 4k. Others have 4 1080tis, and are doing 4K with ultra settings.

 

I also know how Level of Detail or frame render time goals work, and have not gone into those details. But games from 2001 were doing it, and so were not resolution limited, but GPU power limited.
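For reference, a tiny sketch of the distance-based LOD idea being alluded to (thresholds and asset names are invented for the example): detail is swapped per object at runtime, independently of whatever display resolution the player picked.

```python
# Tiny sketch of distance-based LOD selection (invented thresholds and asset
# names): detail is chosen per object at runtime, independent of the display
# resolution the player selected.
LOD_LEVELS = [              # (max distance in metres, asset variant)
    (10.0,  "lod0_full_poly_4k_textures"),
    (40.0,  "lod1_reduced_poly_1k_textures"),
    (150.0, "lod2_low_poly_256_textures"),
]

def pick_lod(distance_m: float) -> str:
    for max_dist, variant in LOD_LEVELS:
        if distance_m <= max_dist:
            return variant
    return "lod3_billboard"   # beyond the last threshold, draw a flat impostor

for d in (5, 25, 100, 500):
    print(d, "m ->", pick_lod(d))
```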

12 minutes ago, TechyBen said:

Yes. Most people (steam stats as an example) have cards with 3-4gb VRAM *now*. So less so at release. It was a VRAM budget for the developers to keep to (they can increase the textures more, but would then hit the 5-8gb range). It was not "4K" preventing them from hitting the FPS/effects/texture quality budget.

How many of those 3GB-4GB cards could have rendered the game at 4k with acceptable frame rates? None of them. The higher res pack could have been used from the start, because at 1080p the VRAM usage would have allowed it; just not at 4k, on cards not capable of it in the first place.

 

And there is a lot in that article that supports both sides of this.

Quote

But he goes further: "I think it would be an interesting world if we stuck at 720p and then just kept on scaling performance. Then developers would have more opportunity to use the GPU for more than just filling more pixels." On one hand, that excess GPU power could be put into perfecting every pixel to better close the distance to CG movie visuals, employing high-quality antialiasing and lighting effects such as the realtime ray-tracing showcased by Microsoft and Unreal at GDC this year.

 

What is clear though is that 4k is a target that they are all trying to meet; they even say so. So from start to end of the development cycle, 4k is a forefront requirement. There is no way this is not going to affect choices and drive technology in a certain direction.

3 minutes ago, TechyBen said:

Yeah. I can get with that. If you wish to say it's not that simple, I'm all ears for where it gets more complex (I've not even gone into it all myself, and the GamesRadar article reminds me also that render pipelines have a lot of competing factors, so if Grow Home leaves resources spare for more resolution and less performance hit, and BF5 just gets tanked at 4K, then there is a multitude of reasons why).

 

And I also agree, I prefer a middle of the ground image, high res everything, 60fps, less shader effects. But others might prefer low res, more shader effects. Others super high texture, low poly 4k. Others have 4 1080tis, and are doing 4K with ultra settings.

 

I also know how Level of Detail or frame render time goals work, and have not gone into those details. But games from 2001 were doing it, and so were not resolution limited, but GPU power limited.

I think he's right; not the article itself, but the developers quoted there give reason to what Leadeater was arguing (if I don't get this even more confusing): 4k is a choice, and 4k impacts the "cost" of development. There is a trade-off you will have to accept, and hard choices they have to make in choosing to develop for 4k.


8 minutes ago, leadeater said:

How many of those 3GB-4GB cards could have rendered the game at 4k with acceptable frame rates? None of them. The higher res pack could have been used from the start, because at 1080p the VRAM usage would have allowed it; just not at 4k, on cards not capable of it in the first place.

Frame rates =|= image quality. They don't prevent each other. You posted the FF image as image quality. Developer FPS goal =|= removal of content/image quality (I will link to LOD pipeline explanations and pre-engine art assets if you wish). Do you want to learn about how game art assets are created, and how they are loaded/unloaded from the game engine?

 

Because you seem to think they downgrade the assets to hit a 60FPS mark at 4K. I'm saying they *don't*. Can you provide the example of 1 game that downscales texture resolution to hit 60FPS in 4K?

 

Quote

On one hand, that excess GPU power could be put into perfecting every pixel to better close the distance to CG movie visuals

How? We already have the render pipelines for better shaders. We already have the render pipelines for better poly count/textures. We already have these (I linked to 10 of them!!!). Crackdown 3 has high poly/texture at 1080p. How could they improve it at 1080p? It already maxes out at 60FPS, any improvement in textures/poly/shaders/render at 1080p will *lower the FPS at 1080p not increase it!!!*

 

Trying to hit 4K will increase the poly/texture/shader count of art assets and they will downscale them to hit 30fps. These are added back when you or I or Asus Killer chooses 1080p!

 

4K is a higher resolution, so you have to have higher original art assets. Even if you later find your FPS budget does not allow them (as in your FF8 image you posted).

2 minutes ago, asus killer said:

I think he's right; not the article itself, but the developers quoted there give reason to what Leadeater was arguing (if I don't get this even more confusing): 4k is a choice, and 4k impacts the "cost" of development. There is a trade-off you will have to accept, and hard choices they have to make in choosing to develop for 4k.

The issue I think is cropping up here is what people value more and consider a better graphical game. My taste, what I use to measure by, is different, so my conclusion is going to be different to another person's. I could see a beautiful, realistic-looking game, be amazed at it and call it the best graphical game ever; someone else might look at it and say it's not, because it's not being rendered at 4k and so has drawbacks that are more important to them.

4 minutes ago, leadeater said:

The issue I think is cropping up here is what people value more and consider a better graphical game. My taste, what I use to measure by, is different, so my conclusion is going to be different to another person's. I could see a beautiful, realistic-looking game, be amazed at it and call it the best graphical game ever; someone else might look at it and say it's not, because it's not being rendered at 4k and so has drawbacks that are more important to them.

Again: show me 1 game they took content out of to hit a higher resolution.

 

Finally: textures and poly count are limited by framebuffers and CUDA cores. Yes, higher resolutions do impact this, but they are not the reason we do not have "photorealistic" games at 300x200p!

2 minutes ago, TechyBen said:

Frame rates =|= image quality.

Yes, I know that; the frame rate target will affect the image quality. They aren't mutually exclusive. Raise one and the other lowers; they are linked.

 

3 minutes ago, TechyBen said:

You posted the FF image as image quality.

No, I used that to show that that asset being used in a game would result in a worse-looking game graphically. The higher quality asset would result in a better-looking game. Now, where it's actually important to this discussion is if the low quality asset has to be used to achieve 4k on current hardware, but the high quality one could be used if the target was 1080p instead. There is no guarantee that the higher quality asset will be available in the game if the development goal from day 1 was 4k.

12 minutes ago, TechyBen said:

Again: show me 1 game they took content out of to hit a higher resolution.

Far Cry 5.

 

Edit:

Also, how the heck could you possibly know, unless a dev said they did? You really think you can anticipate every choice during development to know why things were done a certain way? Nor am I trying to show or prove this has been done.

 

You seem to fundamentally miss the point of what I'm saying.

 

What if a game was to start development today and 4k was completely off the table; let's go as far as to remove it as a game resolution option. How good graphically could the game be made to look under this set of conditions? In my view, better than if 4k was a forefront goal. Other choices would be made; the resultant product would be different if it were to be done.

 

16 minutes ago, TechyBen said:

How? We already have the render pipelines for better shaders. We already have the render pipelines for better poly count/textures. We already have these (I linked to 10 of them!!!). Crackdown 3 has high poly/texture at 1080p. How could they improve it at 1080p? It already maxes out at 60FPS, any improvement in textures/poly/shaders/render at 1080p will *lower the FPS at 1080p not increase it!!!* 

Take that up with the developer who said it in the article, he seems to have the same thought process I'm alluding to.

 

16 minutes ago, TechyBen said:

any improvement in textures/poly/shaders/render at 1080p will *lower the FPS at 1080p not increase it!!!*

And I'm not opposed to that at all, and that is for the consoles anyway. You can make that statement but replace console with PC and replace 1080p with 4k, and my reply would be: fine, do it, now lower the resolution to 1080p.

 

15 minutes ago, TechyBen said:

4K is a higher resolution, so you have to have higher original art assets

Not true, that may not happen at all. It's a reason to do so, it's also a reason not to.
