
Battlefield V with DXR on, tested by TechSpot (Hardware Unboxed)

kiska3
1 hour ago, TechyBen said:

Yeah, you COULD do the HUD at 4k and the game/3D content at 1440p (on a 4k panel so you still get 4k video etc). I don't see anyone supporting it though.

It's already supported, and I don't mean the HUD. The output resolution is just a container frame; you can render the game at lower than the output resolution. Fortnite is one of many games that have this now.

 

[screenshot: Fortnite video settings showing separate 3D Resolution and Display Resolution options]

 

3D resolution is what the GPU renders the game at; display resolution is the output to the monitor. Keeping the output at the panel's native resolution ensures you are displaying a native image, so you don't get pixels misaligned with the physical monitor pixels.

 

This way you can render at 1080p and display output at 4k.
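To make the mechanics concrete, here's a minimal sketch of how a render-scale setting like this works; the names and the 50% value are illustrative, not any game's actual settings API:

```python
# Minimal sketch of a "3D resolution" (render scale) setting, assuming a
# hypothetical engine that draws the 3D scene to an off-screen buffer and
# then upscales it to the monitor's native resolution. Illustrative only.

DISPLAY_RES = (3840, 2160)   # native output resolution of the panel
RENDER_SCALE = 0.5           # e.g. a Fortnite-style "3D Resolution" slider at 50%

def internal_render_resolution(display_res, render_scale):
    """Resolution the GPU actually shades the 3D scene at."""
    w, h = display_res
    return (int(w * render_scale), int(h * render_scale))

render_res = internal_render_resolution(DISPLAY_RES, RENDER_SCALE)
print(f"3D scene rendered at {render_res}, HUD and final frame output at {DISPLAY_RES}")
# -> 3D scene rendered at (1920, 1080), HUD and final frame output at (3840, 2160)
```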

 

Game assets, what I was talking about earlier, are a different thing altogether. Make those higher quality and they'll look better no matter which settings you pick, whatever the 3D resolution or the display resolution. Game assets are the source for the render; unlike CSI you can't "zoom and enhance", because that's not a thing (ok, it sort of is, but you're a million times better off starting with more detail than trying to create detail that isn't there).

 

Edit:

Game assets aren't "1080p" or "4k"; they're an entirely different thing. It sounds like you're mixing that up a bit.

 

Here's an example of what I mean: FFX on PS2 has two character models for each character, and it swaps them depending on how graphically demanding the scene is, or whether it's a scripted but engine-rendered scene.

[image: FFX Yuna character models side by side; the low-detail model on the left, the high-detail model on the right]

The right is obviously higher graphical quality; using the left and rendering at 4k won't make it look better than the right rendered at 1080p or 1440p. The left won't look better rendered at 8k, 16k, or 100000k either.
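For illustration only, the kind of model-swap logic being described might look like this; the poly/bone numbers and the budget rule are made up, since FFX's real heuristics aren't public:

```python
# Illustrative model-swap (LOD) logic: use the high-detail character model
# only when the scene can afford it. All numbers and names are hypothetical.

HIGH_DETAIL = {"polys": 6000, "bones": 60}
LOW_DETAIL  = {"polys": 1500, "bones": 20}

def pick_model(poly_budget_remaining, is_scripted_scene):
    # Scripted, engine-rendered scenes get the high-detail model; otherwise
    # fall back to the low-detail one when the scene is already expensive.
    if is_scripted_scene or poly_budget_remaining >= HIGH_DETAIL["polys"]:
        return HIGH_DETAIL
    return LOW_DETAIL

print(pick_model(poly_budget_remaining=2000, is_scripted_scene=False))  # low detail
print(pick_model(poly_budget_remaining=2000, is_scripted_scene=True))   # high detail
```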

 



That is not what I meant. LOL. HUD/GUI is VERY much rendered at a fixed resolution (often using jpg/gif/png/tiff etc., converted to the render engine's file format). Unless it's vector based, and I don't know of a game that does that. Any form of after-frame scaling will mess up a HUD. It will blur when upscaling, or could strobe when downscaling.

 

GUI. HUD. Text. I prefer them at native resolution. I don't care how/what/where a frame is rendered for analysing/resolution/scaling... but even then, I personally prefer pixel-to-pixel output. Let the software do the colour/edge/output control, not the monitor/screen space. (Your example is textures, and has nothing to do with video screen resolution or render resolution; you can render a 400px texture out to 4k on a tessellated 3D triangle mesh, for example.)

 

Look at text without ClearType/anti-aliasing. While at 4k it's less noticeable, it can still be prone to the strobe effect.

 

(I'm having trouble finding a texture map, but usually COD/BF would ship 3 GUI textures: 1 for 720p, 1 for 1080p and 1 for 4k, for example. Anything in between gets stretched or rescaled. See the problems when gaming at super-widescreen or 8K+ and the GUI *does not scale* to the screen, but is stuck rendered from 1080p textures. Alternatively, some games do allow customisation of the GUI, but not many.)
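A minimal sketch of that fixed-size HUD problem; the three atlas sizes and the nearest-match rule are assumptions for illustration, not any specific game's code:

```python
# Hypothetical HUD atlas selection: the GUI only ships at a few fixed sizes,
# so anything in between gets bitmap-scaled (and therefore blurred/stretched).

HUD_ATLASES = {720: "hud_720.png", 1080: "hud_1080.png", 2160: "hud_2160.png"}

def pick_hud_atlas(output_height):
    # Choose the closest authored HUD size...
    best = min(HUD_ATLASES, key=lambda h: abs(h - output_height))
    # ...then scale it to fit; any scale other than 1.0 means resampled bitmaps.
    return HUD_ATLASES[best], output_height / best

print(pick_hud_atlas(1080))  # ('hud_1080.png', 1.0)       -> pixel-perfect
print(pick_hud_atlas(1440))  # ('hud_1080.png', 1.333...)  -> upscaled, blurry
```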

 

[edit]

Found one: 

 

 

[screenshot: Age of Empires II, with its fixed-size GUI bar along the bottom of the frame]

 

So you can scale up that GUI at the bottom, but it would require bitmap scaling. It won't be rendered natively at the destination resolution, so it would look like upscaled 1080p content.

 

Also, your examples are of an old game... that has no 4k content. So of course "rendered at 16k" looks exactly the same. Add a tree, grass or anything with detail that could strobe at lower res, or have greater colour/branch/leaf detail, and it will look different per resolution. Only slightly, but it will be a gradual improvement, up until about 4k, where 8k+ is beyond human vision (we could choose 8K as the ultimate limit, but 4K seems about centred on the cut-off).


1 hour ago, TechyBen said:

That is not what I meant. LOL. HUD/GUI is VERY much rendered at a fixed resolution (often using jpg/gif/png/tiff etc., converted to the render engine's file format). Unless it's vector based, and I don't know of a game that does that. Any form of after-frame scaling will mess up a HUD. It will blur when upscaling, or could strobe when downscaling.

Well, tell that to the games already using it, like Fortnite, exactly as I showed you. I don't know what else to say when there are literally games already doing it. No blur, no mess, no issues.

 

1 hour ago, TechyBen said:

(Your example is textures, and has nothing to do with video screen resolution or render resolution; you can render a 400px texture out to 4k on a tessellated 3D triangle mesh, for example.)

Well no kidding because that is what I'm talking about. Games with higher quality assets WILL look better at 1080p than a game with low quality assets at 4k.

 

We can make far better looking games, with genuinely better visuals, by increasing asset quality, lighting effects, shadows and physics models, without going anywhere near the resource-wasting 4k render resolution, which is a very poor improvement when the aforementioned things are not improved along with it.

 

My example is not textures either; that is applied over the models I showed. What I showed was a low bone count, low poly asset versus a higher bone count, higher poly asset that also has a higher quality texture map.

 

1 hour ago, TechyBen said:

Also, your examples are of an old game... that has no 4k content. So of course "rendered at 16k" looks exactly the same.

Literally my point....


Quote

Well no kidding because that is what I'm talking about. Games with higher quality assets WILL look better at 1080p than a game with low quality assets at 4k.

Again. You may not have read what I said. You gave a close-up of a low poly, lower/medium res texture character model, but not the *entire scene*, which included grass and trees. These will render nicer at higher resolutions, with less strobing and less blur. While not as visible from 1080p to 4k (or 1440p), it is still a factor, as the previously linked YouTube video shows.

 

The tree in the background strobes in all the content, but shows almost no strobing at 4-8k. The higher the resolution, the greater the chance content blends together and, visually, becomes indistinguishable from analogue/natural images.

 

4K seems like the sweet spot. So solidifying it now will mean they stop pushing to higher res and, as you say, will hopefully concentrate on the rest of the game.

 

Quote

Literally my point....

But I'm still talking about BFV and Raytracing. I'm still talking about *current* games and render engines, not 10 year old ones. ???

Same with Raytracing. We don't need it now. We don't need it in FPS. It may work in games made for raytracing, or slower paced games. But it's being pushed in the wrong area. Higher resolutions are fine, if used well. For a large-scene FPS, it can look nice. For a close-up corridor simulator, it's pointless, yes.

 

But players can already choose a 4k monitor and 1080p settings. Nothing is stopping them doing that. But your example of "you cannot notice it in motion" is not entirely true. That's all I mean. It's noticeable, and people will like it (see the "tin can" effect on peaches/pineapples. ? ) at 4k over 1080p, and possibly 1440p, because interpolation is problematic.


31 minutes ago, TechyBen said:

Again. You may not have read what I said. You gave a close-up of a low poly, lower/medium res texture character model, but not the *entire scene*, which included grass and trees. These will render nicer at higher resolutions, with less strobing and less blur. While not as visible from 1080p to 4k (or 1440p), it is still a factor, as the previously linked YouTube video shows.

 

The tree in the background strobes in all the content, but shows almost no strobing at 4-8k. The higher the resolution, the greater the chance content blends together and, visually, becomes indistinguishable from analogue/natural images.

 

4K seems like the sweet spot. So solidifying it now will mean they stop pushing to higher res and, as you say, will hopefully concentrate on the rest of the game.

 

But I'm still talking about BFV and Raytracing. I'm still talking about *current* games and render engines, not 10 year old ones. ???

Um, may I remind you that the start of my conversation was that we can improve the visual quality of games far more by NOT trying to render them at 4k, and instead improving asset quality and other effects.

 

What I'm talking about and have shown demonstrates my exact point: those old-ass PS2 assets look terrible no matter how high you render them. Look at the HD remaster image I showed, where Wakka got the overhaul and the other NPCs did not. Are you going to tell me it's better to up the render resolution than to, say, give those 2 other characters the same treatment as Wakka?

 

My example is not invalid because it's an old game; it perfectly illustrates what I'm saying and why games look better now than back then, and it has little to do with render resolution.

 

Edit:

Because if we have a choice of either 4k render resolution or much more realistic lighting/reflections, and we can't have both because the GPU isn't powerful enough, then the better lighting/reflections is going to have a more noticeable impact on the overall visual quality of the game.

 

And the reason I have this opinion is that most current GPUs are barely able to render games at 4k, which leaves no room to make other improvements that would make games more graphically demanding. It's nice to want both, but if I have to pick then you know what my choice is.


1 hour ago, TechyBen said:

But players can already choose a 4k monitor and 1080p settings.

Yep, but when developers are pushing for 4k because that's what is popular, and that is restricting other areas of graphical improvement because we don't have infinite GPU resources to do everything, then that is where I say 4k is wasted resource. Going back to the FFX example, what I was trying to point out is that the low quality character will still look bad in the scene rendered at whatever resolution you like. If you use the higher quality one then visually it will look better. I agree all the other stuff matters, but if we have two games, one using the low quality models and one using the high quality models, and the low quality model game was at a higher render resolution, I will say the game with the high quality models is the superior graphical game.

 

When I actually played FFX on the PS2 it was noticeable when the models swapped in and out; it would have been a far less graphically impressive game back then if it had only used the low quality models.

 

This is the very heart of what I'm saying: if 4k is limiting the ability to make better graphical improvements to games, then stop trying for 4k. Wait until we have the hardware that can handle it, more than just 1 or 2 GPU models. It's the very same point people are making about RTX; examples of RTX are currently 1 game and a few tech demos, so it's rather early to completely write it off. Even without RTX we have the ability to make graphical improvements.

 

1 hour ago, TechyBen said:

But your example of "you cannot notice it in motion" is not entirely true.

I said you won't notice it when concentrating on the game, not that it's not possible to, due to motion and our cognitive/sensory limitations. I also said there are other things we can do that would be more noticeable than the difference between 1440p and 4k. A good implementation of HDR is going to be more noticeable, granted you won't be in a situation where you would only be able to choose one or the other due to GPU resources.

 

There's always going to be some point in the game where it slows down, or a cut scene etc., where you will take more notice of the image quality and go "Yes, this does look better in 4k". Are those times worth it if it's limiting other advancements? That may indeed be a personal and subjective answer to that question.

 

1 hour ago, TechyBen said:

That's all I mean. It's noticeable, and people will like it (see the "tin can" effect on peaches/pineapples. ? ) at 4k over 1080p, and possibly 1440p, because interpolation is problematic.

If someone buys a 4k monitor without a GPU that can do 4k (which is most of them), has to deal with interpolation, and that is a problem for them, then buying that 4k monitor was a poor decision.


@TechyBen Hitman 1 has upscaling. So do a number of other games in my library. You can even enable it in the NVIDIA control panel for the desktop. It's not some mythical 10-year-old stuff. It's pretty standard these days.


13 hours ago, Carclis said:

This could very well be true. However ray tracing was presented to us as something that "just works" so we would be led to believe that this is a typical implementation. Mind you this problem of implementing or not implementing effects outside of intended conditions is not exclusive to just the ray tracing technology.

 

I think the dirty environment is actually ideal in that case, because you then get reflections in a lot of cases where our current "cheating" methods would not produce a good effect. Furthermore, the dirtier environment means you would have fewer reflective surfaces and in theory incur less of a performance hit. So, performance wise, BFV might actually be a lot easier to run than other implementations, especially when you consider that reflections are the only effect.

 

For the future, sure. I can tell you that is not going to stick right now though. There isn't going to be a studio out there passing up on rasterisation to work on ray tracing tech just because it's easier to implement. Right now it's just another thing to devote time and resources to, which is clearly not trivial since DICE have been working on it for months now.

 

I have no doubt that this will be the case. The issue is that we're only seeing one of these effects at a time and the impact on performance is colossal. It would be nice to have something like this tech to reduce the overhead for smaller developers but I don't see it happening in the near future or it having any effect on the poor condition of the industry (cost savings will just go to the publishers).

 

I agree with this, but I'm also well aware that each individual ray tracing feature has a performance hit similar to other graphical feature releases. For that reason I'm not expecting ray tracing to do much over the next couple of years, which is why I'm so critical of the RTX products. They're charging through the nose for something that will never take off, at least within its useful product life, and by the time it does the hardware will be far better equipped for it. I'm still excited to see where the technology goes for sure, but right now it's not all that dissimilar to Intel claiming they have the world's first 5GHz CPU when in reality you will never see 5GHz, and if you do you're not benefiting at all.

 

Alright, a quick cliff-notes version of the 3 raytracing ray types and how they work.

 

All raytracing, even hybrid, starts by shooting 1 or more rays per pixel from the camera position through the pixels on the virtual screen (the contents of this virtual screen are what you actually see, by the way). These rays keep going until they hit something in the game's 3D render of the scene. We'll call these rays master rays. There's probably some real fancy name for them, but "master rays" will suffice for our purposes. What happens next depends on exactly what is being handled by raytracing. In a fully raytraced scene, several different types of ray then emanate from where each master ray touches the scene. Again, for the sake of argument, lump all of these under the general classification of "secondary rays". For the purposes of what is implemented in hybrid raytracing we only really care about 3, as those are the only types implemented.

 

The first of these secondary rays is properly called the reflection ray. This shoots off at an angle set by the angle between the surface and the master ray, just like light striking a mirror in real life. This reflection ray then bounces around picking up colour info, which is then combined with the colour info from the master ray to produce the final pixel colour value.
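That "angle set by the angle between the surface and the master ray" is just the standard mirror-reflection relationship; a tiny sketch with plain vectors, implying nothing about any engine or DXR API:

```python
# Mirror reflection of an incoming ray direction d about a surface normal n:
#   r = d - 2 * (d . n) * n      (d and n assumed normalised)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(d, n):
    k = 2.0 * dot(d, n)
    return tuple(di - k * ni for di, ni in zip(d, n))

# A master ray heading down-right onto a floor whose normal points straight up:
print(reflect((0.707, -0.707, 0.0), (0.0, 1.0, 0.0)))  # -> (0.707, 0.707, 0.0)
```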

 

The second is the shadow ray. I haven't seen a lot on this, so I'm not sure how the direction it goes off in is decided, but it detects the shadowing effect of any surfaces it passes through on its way out. Again, this is combined with the master ray data to decide the final pixel colour.

 

The third type are known as light rays. Whilst hybrid raytracing puts a cap on the number of light sources that can be used, this fires a ray off at every light source in the scene until it hits the ray cap, again picking up lighting factors as it goes, to combine with the master ray for the final result.

 

The thing is, all 3 of these effects can use the same master ray (in fact they have to if they're combined), and the results of all 3 are put together to produce the final pixel values.

 

What this also means, however, is that the master rays are a one-time overhead item; adding extra RTX effects doesn't necessarily double the workload. In fact, because of the amount of bouncing they can do (I believe BFV limits it to 4 bounces total), reflection is amongst the most intensive to do, unless you have a scene with an enormous number of lights. But I think the software currently caps it at the 3 closest light sources. As a result, implementing additional effects isn't a simple scalar.
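A minimal sketch of that shared-master-ray idea, with dummy stand-in functions for the actual ray work; this is purely illustrative and doesn't reflect DICE's or anyone else's real implementation:

```python
# Illustrative hybrid-tracing loop: one "master" (primary) ray per pixel, with
# the reflection/shadow/light work all reusing that same hit point, so adding
# an extra effect adds secondary rays but never more primary rays.

def trace_primary(pixel):          # stand-in: find what the camera ray hits
    return {"hit_point": pixel, "base_colour": (0.5, 0.5, 0.5)}

def reflection_term(hit):          # stand-in secondary rays, bounce-capped
    return (0.1, 0.1, 0.1)

def light_term(hit):               # stand-in: only the closest few lights
    return (0.2, 0.2, 0.2)

def shadow_term(hit):              # stand-in occlusion factor
    return 0.8

def shade(pixel, reflections=True, lights=True, shadows=True):
    hit = trace_primary(pixel)     # paid once per pixel, shared by all effects
    colour = list(hit["base_colour"])
    if reflections:
        colour = [c + r for c, r in zip(colour, reflection_term(hit))]
    if lights:
        colour = [c + l for c, l in zip(colour, light_term(hit))]
    if shadows:
        colour = [c * shadow_term(hit) for c in colour]
    return tuple(colour)

print(shade((0, 0)))  # all three effects share the one primary ray
```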

 

8 hours ago, xAcid9 said:

 

The fact that DICE calls it their denoising algorithm suggests they aren't using the tensor denoising, and from the sounds of it there are some serious hang-ups happening on the raster side in terms of getting the special second render to the RT cores in a timely manner, so the GPU is spending a lot of time sitting on its hands between frames waiting for things to happen. That fits with the lower power consumption too. If most of the GPU is only active for part of every second and then furiously busy the rest, you're going to see a lot of deactivated parts producing low power draw, but a high average clock speed.


4 hours ago, CarlBar said:

@TechyBen Hitman 1 has upscaling. So do a number of other games in my library. You can even enable it in the NVIDIA control panel for the desktop. It's not some mythical 10-year-old stuff. It's pretty standard these days.

Game or GUI? Again, I'm not talking about game upscaling... and even with upscaling, you're gonna get strobing and/or GUI blurring. Did you see the screenshot I posted?

 

Leadeater...

Quote

I said you won't notice it when concentrating on the game, not that it's not possible to

Strobing. You won't notice strobing? Then why does anyone bother with antialiasing? ?

 

Again, I did not say better graphics are not wanted. I did not say 4k is better than concentrating on animations + textures + poly meshes. I said 4K can reduce some strobing. It can improve some fidelity. And it can be noticed.

 

PS, and as above: 1080p or 4k and raytracing can work exactly the same, as (even as you put it) that individual effect can/could be upscaled. No need to render the entire game engine at 4k (Rocket League even has an option, in Unreal Engine IIRC, to downscale the render detail of individual objects/distances. It gives you low res on distant objects, which *is noticeable* at 1080p, and would IMO not be as noticeable with a 4k-to-1080p dynamic render). That's what I'm talking about: 4k as the "you don't need anti-aliasing" point, with content perfectly balanced within it (unlike 1440p, which is not a perfect multiple of 1080p, is not supported much in cinema/video, and is not supported as much in games with scaling of assets etc).
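A rough sketch of what rendering just one effect below output resolution and upscaling it could look like; this is a generic half-resolution buffer with nearest-neighbour upscaling, not how Rocket League, Unreal or BFV actually do it:

```python
# Hypothetical half-resolution effect buffer (e.g. reflections) upscaled with
# nearest-neighbour into the full-resolution frame before compositing.

def upscale_nearest(buffer, factor):
    """buffer is a list of rows; repeat each sample 'factor' times in x and y."""
    out = []
    for row in buffer:
        wide = [v for v in row for _ in range(factor)]
        out.extend([wide] * factor)
    return out

half_res_reflections = [[1, 2],
                        [3, 4]]                        # 2x2 effect buffer
for row in upscale_nearest(half_res_reflections, 2):   # 4x4, ready to composite
    print(row)
```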


23 minutes ago, TechyBen said:

Strobing. You won't notice strobing? Then why does anyone bother with antialiasing? ?

I don't bother with it much; I rarely turn it up far. It's another one of those things with very quick diminishing returns and rapidly increasing computational demand.

 

23 minutes ago, TechyBen said:

Again, I did not say better graphics are not wanted. I did not say 4k is better than concentrating on animations + textures + poly meshes. I said 4K can reduce some strobing. It can improve some fidelity. And it can be noticed.

I honestly don't think you are aware of the point I was trying to make. If we have a choice of only being able to target 4k at the expense of making real, more noticeable differences, then we should not do it. Trying to render at 4k will not inherently give a game a big graphical improvement, and it is very demanding.

 

I have seen many games rendered at 4k, on very good monitors, and it's not a big jump in game graphics. We can stand around all day arguing back and forth about how big a difference it makes, but games have fundamentally gotten more graphically impressive over time independent of the render resolution.

 

Targeting 4k, now, is a waste of resources and limits what we can do because we don't have unlimited GPU resources.


2 minutes ago, leadeater said:

If we have a choice of only being able to target 4k at the expense of making real, more noticeable differences, then we should not do it. Trying to render at 4k will not inherently give a game a big graphical improvement, and it is very demanding.

We don't. Selecting "output to 4k" is 1 line of code. It takes a couple of seconds and $5 of development budget. It results in fewer jaggies, less jitter/strobing and higher scene fidelity (while your FFX example was good, it was 1 character; the Dark Souls YouTube example had a whole scene, with trees and swords and stuff, and these all looked better as each resolution improved... with zero extra art/development budget). Your FFX example took an entire dev team, artists and months/years to "remaster" the game. Even doing it from the start (so not twice over) costs more in budget. I've seen 1 or 2 small indie devs do nice complex artwork (that game where you play as a mouse XD), but most go for stylised and simple, so they can lone-dev the art or do a small outsource of assets. See PUBG vs Fortnite (or PUBG vs COD :P) for the difference in art quality.

 

Yes, 4k means you need a better GPU. But no one is forcing people to buy 4k monitors! It's just an option for those who do, and who have the GPU to drive it. However, I am saying 4K is the cut-off. Not the "oh, 1080p is good enough, and no one can see better than 1440p", because people can see better than 1440p... but arguably cannot once we start to push beyond that (or 4K).

 

I agree we don't really need 4k, just as we don't need Dolby surround sound on mp3 players. But while walkmans/phones/iPods stopped at "stereo" sound as the general consensus of "good enough", I believe 4K will be that point for monitors/TVs. Not 1080p. Just because it's that little tiny push that is noticeable, and so it will naturally stop there. 1440p won't be, purely because cinema/TV does not support it *natively*.

 

The price of 4k will come down even more (it already has, massively) in time. Same with GPU price/performance. The eye's ability to see better will not improve. ;)


1 minute ago, TechyBen said:

We don't. Selecting "output to 4k" is 1 line of code. It takes a couple of seconds and $5 of development budget.

That's not rendering the game in 4k though, is it? Are you making the same point I made ages ago, where I showed Fortnite has this ability and why it's good?


10 minutes ago, TechyBen said:

The Dark Souls YouTube example had a whole scene, with trees and swords and stuff.

Dude, my point was inclusive of a whole scene; you're singularly focused on "but you didn't show it in a scene". Put that character in the Dark Souls scenes and it will look crap. Bad character models look bad, period, in any situation, all the time, in every scene, at every resolution. It's at this point I'm going to end this and walk away; you're just not getting it.


14 minutes ago, leadeater said:

That's not rendering the game in 4k though, is it? Are you making the same point I made ages ago, where I showed Fortnite has this ability and why it's good?

Wait. "output to 4k" is not rendering at 4k? What do you mean?

You can render most games at any resolution. Polygons = render at any resolution. See Linus doing 8k and 16k gaming.

 

Do you mean texture resolution? This also has nothing to do with screen resolution.

 

What do you mean?

 

I do get it. Take Grow Home:

[screenshot: Grow Home]

This game is stylised. It will look good at *any* resolution. Art assets are not dependent on their fidelity, but on how well they are made.

 

This game will look better as resolution improves, because you will get fewer jaggies and less strobing (though arguably there's very little grass/trees to do that). It will have nicer colour gradients (purely because there are more pixels to gradually shade). And I agree, between 1440p and 4K it may be impossible to tell visually. But I don't see support for 1440p from film/video/Netflix, so 4K is all we can choose, because 1080p IS a noticeable difference.

 

Again, rendering Grow Home at 4K costs zero development budget, and possibly little GPU resources. However, COD/BF4 etc. will use up resources. Ray tracing will use up resources. But higher resolution means less anti-aliasing. More ray tracing means less scene faking. It means the hardware can do things to lessen the art development time.

 

4K will be what people push for, even if you suggest we should not. Why? I can see the jaggies on my 1080p monitor. Fact. I can see the TrueType/anti-aliasing; even if perfectly set up, it cannot hide all the edges. On 4K, I doubt I could ever see a pixel's individual edge/square. So people will notice the difference, and will not wish to downgrade once they have it. *This is irrespective of what resolution games are rendered at*. However, your statement "you cannot notice it in motion" is not true. It is only less noticeable between 1080p/1440p/4k.


32 minutes ago, TechyBen said:

Again, rendering Grow Home at 4K costs zero development budget, and possibly little GPU resources. However, COD/BF4 etc. will use up resources. Ray tracing will use up resources. But higher resolution means less anti-aliasing. More ray tracing means less scene faking. It means the hardware can do things to lessen the art development time.

You are aware that when a game is developed and they set a development target of 4k render output, this will have an impact on the graphical quality of the game?

 

In a situation of limited GPU resources, if you pick 4k over other things that would also increase GPU demand but improve graphics quality, then that 4k choice just had an impact.

 

You can take an existing game now and set it to 4k, and yes, it will look better. You can take a team of developers now, set a 1440p render goal, and set out to make the most graphically impressive game possible. Now do that very same thing with a 4k goal; I'd posit that the resulting game with the 1440p goal will be more graphically impressive.

 

32 minutes ago, TechyBen said:

so 4K is all we can choose, because 1080p IS a noticeable difference.

The game development industry is very independent of the film and television industry.

 

32 minutes ago, TechyBen said:

4K will be what people push for, even if you suggest we should not. Why? I can see the jaggies on my 1080p monitor.

Because 1440p monitors exist, as do 2560x1600 ones like I currently use. There is no void of choice between 1080p and 4k.

 

32 minutes ago, TechyBen said:

However, your statement "you cannot notice it in motion" is not true. It is only less noticeable between 1080p/1440p/4k.

Now you're just changing what I said.


46 minutes ago, leadeater said:

You are aware that when a game is developed and they set a development target of 4k render output, this will have an impact on the graphical quality of the game?

Grow Home... Groooowww Hoooome.

 

Now. Does resolution matter? I can set my current games to 8k. I can set older games to 16k, HL2 or similar. No, the development budget of HL2 was not affected by the option for 4k resolution in the menu. The game artwork looks no better at 8k than at 640p. However, the higher resolutions do improve the render quality; HL2 renders fine at 4k with low GPU power. Resolution does increase render time and processing requirements. A game does not have to be made so photorealistic that the GPU can no longer render it at higher resolutions, though (again, see Grow Home).

 

The film industry only does 1080p and 4k (IIRC they did some content at 1440p for a while, but not much). Thus you will get monitors at 1080p and 4k more so than at 1440p. Also, people (like me) won't like 4k downsampled to 1440p. So when choosing a monitor, I will get 1080p or 4k for video content. I may get a 1440p for gaming... but if I can only have 1 (and guess how many I can afford), I'd choose the 4k because it can do both film + gaming (I can game at 1080p on a 4K panel with no scaling artifacts, due to exactly halving the res).
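The "no scaling artifacts due to halving the res" point comes down to integer scale factors; a quick arithmetic check (this says nothing about any particular monitor's scaler quality):

```python
# 1080p -> 2160p is an exact 2x scale on both axes, so each source pixel maps
# to a clean 2x2 block; 1080p -> 1440p is 1.33x and has to interpolate.

def scale_factor(src, dst):
    return (dst[0] / src[0], dst[1] / src[1])

print(scale_factor((1920, 1080), (3840, 2160)))  # (2.0, 2.0)       -> integer, crisp
print(scale_factor((1920, 1080), (2560, 1440)))  # (1.333..., ...)  -> interpolated
```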

 

Again, 4K is not a problem. It is a tool. How it is used can be a problem.

[Edit]

Here is an example of your FF games/art assets:

[image: FFX in-game party render (2001)]

(Sorry for Jpeggness)

VS

[image: Aki Ross render from Final Fantasy: The Spirits Within (2001)]

Both were from 2001, but 1 was at a much higher art budget. Plus, no matter how you scaled it, the hardware at the time could not output the texture/model/lighting fidelity of the film version. However, the hardware/game engine could arbitrarily change resolutions to anything you asked it to, at a performance hit.

 

Same with Raytracing. You can scale it up/down to what is possible or what is desired. I could easily see a game where the raycasting is constant and improves over time (so a turn-based/card game where graphics improve as you wait to play). But in an FPS, it's currently not the best implementation. Increasing/decreasing resolution, or now the new "raycasting", costs little to no extra art budget to scale. However, it costs the user, who has to scale their rig/PC/GPU to match their desired settings/output.


31 minutes ago, TechyBen said:

Now. Does resolution matter? I can set my current games to 8k. I can set older games to 16k, HL2 or similar. No, the development budget of HL2 was not affected by the option for 4k resolution in the menu.

Because the ability to pick a resolution has nothing to do with my point. Allowing the ability to select an output resolution is not the same as setting a 4k target during game development, which has an overarching impact on the choices you make while developing the game, from assets to lighting models to shadows etc. All the way through development that 4k choice has an impact; this has nothing to do with your ability to select a render output resolution.

 

The choices you get in the game graphics menu for resolution are not what I'm talking about.

 

31 minutes ago, TechyBen said:

Resolution does increase render time and processing requirements. A game does not have to be made so photorealistic that the GPU can no longer render it at higher resolutions, though (again, see Grow Home).

And if your goal is photorealism, as I have said, and you just said, rendering at 4k does nothing to achieve this. By aiming for 4k it could prevent you from achieving it though.

 

Bringing game artistic choice into this really is not that relevant; the only thing it shows is the impact of developer choice.

 

31 minutes ago, TechyBen said:

The film industry only does 1080p and 4k (IIRC they did some content at 1440p for a while, but not much).

The game development industry is not limited by the choices of film industry? How does this apply?

 

31 minutes ago, TechyBen said:

Thus you will get monitors at 1080p and 4k more so than at 1440p. Also, people (like me) won't like 4k downsampled to 1440p.

There are very many 1440p monitors, there are also many 2560x1600 monitors. There are others available too.

 

31 minutes ago, TechyBen said:

So when choosing a monitor, I will get 1080p or 4k for video content.

But we are talking about games


11 minutes ago, leadeater said:

Because the ability to pick a resolution has nothing to do with my point. Allowing the ability to select an output resolution is not the same as setting a 4k target during game development, which has an overarching impact on the choices you make while developing the game, from assets to lighting models to shadows etc. All the way through development that 4k choice has an impact; this has nothing to do with your ability to select a render output resolution.

 

The choices you get in the game graphics menu for resolution are not what I'm talking about.

 

And if your goal is photorealism, as I have said, and you just said, rendering at 4k does nothing to achieve this. By aiming for 4k it could prevent you from achieving it though.

 

Bringing game artistic choice into this really is not that relevant; the only thing it shows is the impact of developer choice.

 

The game development industry is not limited by the choices of film industry? How does this apply?

 

There are very many 1440p monitors, there are also many 2560x1600 monitors. There are others available too.

 

But we are talking about games

Nope. Photorealism at 240p vs 524p looks horrid. Yes, that is not the 1080p to 4k jump. But it is a fact. Now, does 4K prevent rendering at photorealism? No. IMO it's the art budget and GPU render ability, not the resolution, that is preventing the quality art. Again, see FF: The Spirits Within, proving we CAN do the art (and IMO we could render out to that quality in real time these days, at 1080p, though faking the raytracing effects). And also the example of Grow Home, which is not bound by art budget or resolution, but CAN support super high resolutions.

 

Quote

Bringing game artistic choice into this really is not that relevant; the only thing it shows is the impact of developer choice.

 

It is. The FF example you gave was them choosing a more cartoony style over realism. Now do the same for the Res Evil series. All of those went for "photo resolution", as they all mapped rather well to photos/traces of people/backgrounds. Note, some were pre-renders or *photos*. Yet higher resolutions will help improve them (from, say, 524p to 720p). Sometimes it's art, sometimes it's hardware limits.

 

I agree with your views and desires, but suggest they are blaming the wrong things. You are correct that we need 4K less than we need "better art", but wrong about better art being the readily reachable goal, whereas 4K is already here.

 

The game industry and film industry don't have the same content, no. But they run on the same monitors/TVs! Try getting a 1440p TV for your Xbox/PS4... now you know why 4K is more of a "target" for BF4/COD etc. (but not Grow Home ;) ). PC gaming also does not suffer from 4K. Give me one example where it does! ?

 

Quote

There are very many 1440p monitors, there are also many 2560x1600 monitors. There are others available too.

But then you cannot watch 4K Netflix. You have to choose: 1440p and native gaming at 144Hz/60Hz, or 4K and do everything, but at 30FPS. Most people understand the 4K bit; they don't understand the FPS bit. Note, I say "understand", not "don't perceive". We do see the difference though.

 

PS, sorry for the edits.


44 minutes ago, TechyBen said:

Both were from 2001, but 1 was at a much higher art budget. Plus, no matter how you scaled it, the hardware at the time could not output the texture/model/lighting fidelity of the film version. However, the hardware/game engine could arbitrarily change resolutions to anything you asked it to, at a performance hit.

To make a stupid, extremely blunt final attempt at getting you to see my point, and it is by no means accurate or realistic, here goes.

 

If, of the two images you just showed, the first (the FFX one) resulted in a game looking like that because the render target was 4k, so asset quality and other effects were reduced to make it possible at the required performance target, and the second resulted in a game looking like that because the target was, say, 1080p, then the second is the more graphically impressive game, and it was able to look like that because 4k was not the target.

 

As a reminder, this is an extremely stupid example, but maybe that is what is required. I dunno, maybe it's just an impossible task.


7 minutes ago, TechyBen said:

The game industry and film industry don't have the same content, no. But they run on the same monitors/TVs!

No, they don't have to; YOU chose to.


13 minutes ago, leadeater said:

To make a stupid, extremely blunt final attempt at getting you to see my point, and it is by no means accurate or realistic, here goes.

 

If, of the two images you just showed, the first (the FFX one) resulted in a game looking like that because the render target was 4k, so asset quality and other effects were reduced to make it possible at the required performance target, and the second resulted in a game looking like that because the target was, say, 1080p, then the second is the more graphically impressive game, and it was able to look like that because 4k was not the target.

 

As a reminder, this is an extremely stupid example, but maybe that is what is required. I dunno, maybe it's just an impossible task.

No. They ended up like that because they could spend more money on the art budget, and because texture sizes could increase and poly counts could increase. Why? Hardware improved. You could not push that number of polys, texture pipelines etc. at ANY resolution on the older hardware.

 

Again, I agree that one is better than the other, but disagree on the reason. I'm not sure you understand how the artwork is made, or how long it takes, or the cost of resolution changes (most games are made/designed around *higher* quality assets, then downscaled for release; some older/newer/quicker remasters and high-quality DLC were just re-releasing the original art assets!!!).

 

Again. The artwork pipeline is not as you describe. The artwork cost, both the money/time of the development team and the pixels/rendering on the GPU, is not quite as you are describing it.

 

So I agree, the FF image you posted has worse and better images, but not because of the resolution of the display or the target of the art team. Final example: this is a pre-game-engine art asset, before going out to be in the game:

 

[image: high-detail 3D character art asset as authored, before being cut down for a game engine]

That is actual 3D with textures, and could have animation splines/bones too... that could be 1000x (it may not be, but could be) higher quality than is possible to render on a GPU today. But it's "easy" to design/paint/model as an artist, provided you have the money/time to spend on it.

 

However, you cannot put that in a game. Why? Because of the poly count, texture resolution, animation frames etc. Rendering that at any resolution would be problematic. So "we should choose 1440p because 4K is slowing down the art" is a wrong statement. We have the art. We don't have the GPU power, or the art budgets, to cover it though. However, scaling up to 4K is easy and cheap for development, but slow and expensive for the user.

 

Quote

No, they don't have to; YOU chose to.

I love you. Really I do. :) You complain that people should not aim for 4K, as it's expensive on the GPU, costs too much for gamers (so they all get 30 FPS and under on cheaper hardware), and development teams are not spending money on the art budget to get the (second) image quality in the FF pic you posted... and then you suggest I pay twice over for 2 displays, 1 for TV content and 1 for gaming. :D


17 minutes ago, TechyBen said:

No. They ended up like that because they could spend more money on the art budget, and because texture sizes could increase and poly counts could increase. Why? Hardware improved. You could not push that number of polys, texture pipelines etc. at ANY resolution on the older hardware.

Rendering at 4k is not free; it does require more GPU resources, so you may not be able to use the higher poly assets, because if you do, at 4k you're going to get 10 FPS but at 1080p you might get 30 FPS.
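As a rough illustration of why the same settings cost more at 4k, assuming (simplistically) that cost scales with shaded pixels, which real games only approximate:

```python
# 3840x2160 has exactly 4x the pixels of 1920x1080, so a purely pixel-bound
# workload takes roughly 4x as long per frame; real games scale less than
# linearly, which is why the quoted gap (30 FPS vs 10 FPS) is closer to 3x.

pixels_1080p = 1920 * 1080              # 2,073,600
pixels_4k    = 3840 * 2160              # 8,294,400
ratio = pixels_4k / pixels_1080p
print(ratio)                            # 4.0
print(30 / ratio)                       # 7.5 FPS if cost scaled perfectly with pixels
```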

 

That was purely a hypothetical point, read it again as written. Ignore that it's 2001, ignore the hardware then, ignore the hardware now. Read the point: game one looks like crap because 4k was the target, game two looks far better because 1080p was the target.

 

Very blunt point, so maybe I was correct. Maybe you are not able to listen to the point I'm making. I totally do not disagree with what you say about requiring a higher art budget, but you are not listening to the fact that 4k may limit what those artists will or can do. 4k is not free.

 

 

Edit:

AHHHH!! it's so bloody frustrating when someone doesn't want to listen to your point.

 



5 minutes ago, leadeater said:

Rendering at 4k is not free; it does require more GPU resources, so you may not be able to use the higher poly assets, because if you do, at 4k you're going to get 10 FPS but at 1080p you might get 30 FPS.

 

That was purely a hypothetical point, read it again as written. Ignore that it's 2001, ignore the hardware then, ignore the hardware now. Read the point: game one looks like crap because 4k was the target, game two looks far better because 1080p was the target.

The setting "1080p or 4K" is free. It makes no difference to the art quality. Thus, 4K makes no difference to the quality of gameplay for those who choose 1080p. You suggested we should instead concentrate on the art.

 

I say we are concentrating on the art. Because the button to set "4K or 1080p" was 5 seconds of programming, done 5 years ago. ?

 

Quote

 it does require more GPU resources so you may not be able to use the higher poly assets because

Look at the image/s I posted. All of them are *independent of resources*. Each game comes with at least 3 resource packs: low, med, high. Texture resolutions, model densities, shader pipelines etc. Again, I am not sure you understand how the art for these things is made, or how the render engine processes it.

 

Quote

Read the point: game one looks like crap because 4k was the target, game two looks far better because 1080p was the target.

[edit] Found it:

[image: the same FFX Yuna model comparison posted earlier]

NO!

Resolution =/= art budget!

Left is 4K WITH low art budget. Right is 1080p WITH high art budget.

You can render the right at 4K. You can do this at 60FPS with scene dynamics and/or texture limits, poly limits etc., OR you can lower the FPS. However, *you* can choose which; it does not affect how the game is released.

 

Show me 1 game where they chose the image on the left over the image on the right (as said, you can already download high quality art assets for most games, because these files *already exist*).

 

Every game is already as the right image. None are made as the left unless the hardware cannot push them even at lower resolutions.


9 minutes ago, TechyBen said:

Wait, where? You never posted any image of 4K vs 1080p? (I'm going back through the thread now, so sorry if I find it after this post).

Your two, of FFX and Spirits Within. I'm using those two as an unrealistic example of why aiming for 4k could matter.

 

Hypothetical 4k game, horrible, don't want it.

[image: FFX in-game party render (2001)]

 

Hypothetical 1080p game, excellent, focus on that.

[image: Aki Ross render from Final Fantasy: The Spirits Within (2001)]

 

 

9 minutes ago, TechyBen said:

I say we are concentrating on the art. Because the button to set "4K or 1080p" was 5 seconds of programming, done 5 years ago. ?

I'm saying that what we do and can do with the art is limited when you aim for 4k, because rendering at 4k is not free; it increases GPU resource load.


12 minutes ago, leadeater said:

Your two, of FFX and Spirits Within. I'm using those two as an unrealistic example of why aiming for 4k could matter.

 

I'm saying that what we do and can do with the art is limited when you aim for 4k, because rendering at 4k is not free; it increases GPU resource load.

Exactly. Your example is unrealistic. Sorry, yours does not make sense. It does not follow the actual art development, or the render budget. Look to see if there is a game where you can say to me "it's worse because they aimed at 4K so had to reduce the poly count and texture resolution". Again, art development and game code do not work like that AFAIK; they do reduce the texture and poly count, but that does not impact FPS enough to make it worth doing for resolution increases. The pictures you posted do look worse/better, but *because they can now render better textures/models*, not because anyone stopped painting the necklace because "who cares, 16k lolz". They stopped because "320p? Oh, I cannot even see the face, let alone the necklace, better stop painting". [edit] Or in your example of the render in Blender/game, it would have to be an FPS goal. But guess what: for example, GTA V at 4K on low: 95 FPS, GTA V at 4K on Ultra: 20 FPS... but the game *still has ultra settings*. It still has better artwork than the FF example! They did not remove the "ultra" to hit the 4K budget; they left in the low, and left in the 4K, but also left in the high, and left in the 1080p... actually, what is the min res? [/edit]

 

You've literally done a Simpson's Paradox on the data. :P

 

Examples: 

Crysis high texture DLC shortly after release.

https://www.gamepressure.com/download.asp?ID=34398

 

Skyrim High quality Texture DLC shortly after release:

https://store.steampowered.com/sub/13437/

 

Fallout 4 High quality Texture DLC shortly after release:

https://store.steampowered.com/app/540810/Fallout_4__High_Resolution_Texture_Pack/

 

Watchdogs 2 Ultra Quality Texture Pack DLC (not sure when released)

https://support.ubi.com/en-GB/Faqs/000025708/Ultra-Texture-Pack-Installation

 

Again, the art already exists. The reason you don't get it is not the monitor resolution.

 

Look at those art asset packs, and see if you can see the reason it's not on general release, or not put in the base game. Can you see one or two possible difficulties with it?

