
Battlefield V with DXR on, tested by TechSpot (Hardware Unboxed)

5 hours ago, CarlBar said:

Resolution is a bit of a special case however as it doesn't need any extra work from the developer to have an actual effect.

 

Yes! And the bit about the particles and lights is also true.

 

There may be times they go "we cannot add RTX, it's too GPU intensive"... but that usually applies across 720p, 1080p, 4K, etc. Sometimes they might go "this game has to run at 4K 60FPS, so we cannot add planar reflections". But that tech still exists, and you could add it as a shader injection (for some engines; BF's Frostbite is proprietary, so you cannot).

That game was not aimed at 4K, AFAIK. They did not put in planar reflections because they tank FPS at all resolutions. Same with RTX: it tanks FPS at all resolutions. It only helps or hurts FPS depending on resolution scaling.
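That resolution-scaling point can be sketched with some quick arithmetic (the cost model is my simplification, not anything from the thread: assume a purely per-pixel effect whose cost scales with pixel count):

```python
# Simplified cost model (an assumption for illustration): a per-pixel
# effect's cost scales linearly with the number of pixels shaded.
def pixels(width, height):
    return width * height

cost_1080p = pixels(1920, 1080)  # 2,073,600 pixels
cost_4k = pixels(3840, 2160)     # 8,294,400 pixels

# 4K shades 4x the pixels of 1080p, so the same per-pixel effect
# (reflections, lighting, etc.) costs ~4x more before anything else changes.
print(cost_4k / cost_1080p)  # → 4.0
```

Which is the point: an effect that tanks FPS does so at every resolution; the resolution only multiplies by how much.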

 

BF5 chose "screen space" reflections instead of "planar" for high performance, and chose "RTX" instead of "planar" for high fidelity/quality. Granted, they could have added all 3, or not done RTX. But it's horses for courses; it's not possible to do everything.

 

Some older console games upscaled to get all the content in. Some downscaled. And some pushed for HD and fewer characters on screen (the PS2 did this, IIRC). But that was because they were memory limited. They had to choose a resolution or a number of polygons. They had to choose how many enemies on screen, or resolution. Because the hardware is limited, every asset is a choice: "do I do 2 textures and fewer enemies, or 1 texture and more?", "do I do 15 music tracks, or just 12 and use the space for extra FMV footage?", or "do I use up all the CUDA cores on AI and make it a terminal text game?" XD
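That budget tradeoff can be sketched in a few lines (the 512 MB pool and the per-asset costs are made-up numbers, purely for illustration):

```python
# Hypothetical asset budget: every choice competes for the same fixed
# memory pool, so adding one thing means cutting another.
BUDGET_MB = 512  # made-up console memory budget

def fits(choices):
    """Return True if the chosen assets fit in the budget."""
    return sum(cost for _, cost in choices) <= BUDGET_MB

option_a = [("2 texture sets", 300), ("10 enemies", 150)]
option_b = [("1 texture set", 150), ("20 enemies", 300)]
option_c = [("2 texture sets", 300), ("20 enemies", 300)]

print(fits(option_a))  # → True: fewer enemies, richer textures
print(fits(option_b))  # → True: more enemies, simpler textures
print(fits(option_c))  # → False: cannot have both
```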

 

There is nothing stopping a developer releasing a text game where they use the CUDA cores for 100% AI performance. But guess what: "Defcon" is a boring game (look it up on Steam ;) ).

 

Crysis is another example of developers not caring what the GPU limit is, and just creating the game/assets and scaling it. :)

 

5 hours ago, TechyBen said:

Yes! And the bit about the particles and lights is also true.

 

There may be times they go "we cannot add RTX, it's too GPU intensive"... but that usually applies across 720p, 1080p, 4K, etc. Sometimes they might go "this game has to run at 4K 60FPS, so we cannot add planar reflections". But that tech still exists, and you could add it as a shader injection (for some engines; BF's Frostbite is proprietary, so you cannot).

That game was not aimed at 4K, AFAIK. They did not put in planar reflections because they tank FPS at all resolutions. Same with RTX: it tanks FPS at all resolutions. It only helps or hurts FPS depending on resolution scaling.

 

BF5 chose "screen space" reflections instead of "planar" for high performance, and chose "RTX" instead of "planar" for high fidelity/quality. Granted, they could have added all 3, or not done RTX. But it's horses for courses; it's not possible to do everything.

 

Some older console games upscaled to get all the content in. Some downscaled. And some pushed for HD and fewer characters on screen (the PS2 did this, IIRC). But that was because they were memory limited. They had to choose a resolution or a number of polygons. They had to choose how many enemies on screen, or resolution. Because the hardware is limited, every asset is a choice: "do I do 2 textures and fewer enemies, or 1 texture and more?", "do I do 15 music tracks, or just 12 and use the space for extra FMV footage?", or "do I use up all the CUDA cores on AI and make it a terminal text game?" XD

 

There is nothing stopping a developer releasing a text game where they use the CUDA cores for 100% AI performance. But guess what: "Defcon" is a boring game (look it up on Steam ;) ).

 

Crysis is another example of developers not caring what the GPU limit is, and just creating the game/assets and scaling it. :)

 

 

Um, I'm a bit confused by your response, TBH. You seem to have wandered off on a wild, unrelated tangent about planar reflections after expressing agreement with a couple of things I said, before adding something relevant with your last line.

 

As far as that last line:

 

Yes, there are examples of games that push way beyond what the hardware can deliver. But they're fairly rare (Crysis is a meme for a reason), and most times it's because, for whatever reason, the developer's prediction of how good the hardware will be is off base. Again, like with Crysis, there will be some flat-out exceptions. Take a large enough sample size and, unless the laws of physics explicitly forbid it, there will be an exception. But exceptions are exceptions because they're highly abnormal. You can't build any kind of argument about the normal state of things (what we're discussing) based on the exceptions to the norm.

1 hour ago, CarlBar said:

Um, I'm a bit confused by your response, TBH. You seem to have wandered off on a wild, unrelated tangent about planar reflections after expressing agreement with a couple of things I said, before adding something relevant with your last line.

I've kind of likened it to the 'Can't see the forest for the trees' saying. On one hand I'm, or we but I'll mostly say I, trying to have a conversation about the forest as a whole and on the other someone else is wanting to talk about that tree, or that tree or some other tree. 'But the forest', 'but the trees'. Before you know it you're both lost in the forest not knowing where you are or how you got there.

Edited by leadeater
1 hour ago, leadeater said:

I've kind of likened it to the 'Can't see the forest for the trees' saying. On one hand I'm, or we but I'll mostly say I, trying to have a conversation about the forest as a whole and on the other someone else is wanting to talk about that tree, or that tree or some other tree. 'But the forest', 'but the trees'. Before you know it you're both lost in the forest not knowing where you are or how you got there.

If no one is around to read this thread in the forest, was it even posted?


1 hour ago, leadeater said:

I've kind of likened it to the 'Can't see the forest for the trees' saying. On one hand I'm, or we but I'll mostly say I, trying to have a conversation about the forest as a whole and on the other someone else is wanting to talk about that tree, or that tree or some other tree. 'But the forest', 'but the trees'. Before you know it you're both lost in the forest not knowing where you are or how you got there.

 

To be fair, I totally get the going off on tangents thing when I can see how they relate to the discussion in question. I do that all the time. I'm currently being tested for Asperger's, and that's one of the common things about autism spectrum disorders. The issue here is I'm not really sure how it relates to what we're discussing in anything but the very loosest sense.

5 hours ago, CarlBar said:

 

Um, I'm a bit confused by your response, TBH. You seem to have wandered off on a wild, unrelated tangent about planar reflections after expressing agreement with a couple of things I said, before adding something relevant with your last line.

 

As far as that last line:

 

Yes, there are examples of games that push way beyond what the hardware can deliver. But they're fairly rare (Crysis is a meme for a reason), and most times it's because, for whatever reason, the developer's prediction of how good the hardware will be is off base. Again, like with Crysis, there will be some flat-out exceptions. Take a large enough sample size and, unless the laws of physics explicitly forbid it, there will be an exception. But exceptions are exceptions because they're highly abnormal. You can't build any kind of argument about the normal state of things (what we're discussing) based on the exceptions to the norm.

I also agree with the "forest vs the trees" problem, but I posted the "Simpson's Paradox" example for how, mathematically, we were blaming the wrong bottlenecks! :) Mathematically, the bottleneck to the GPU is not where *gamers* think it is. But artists/programmers notice where it is (VRAM, memory bandwidth, CUDA/stream cores, CPU pushing data to the card, art assets, LOD scaling, development time, GPU and memory clocks).

 

We can blame the artists/developers/game designers. But are they really at fault? They take 1 light out, 1 pixel shader or 1 line of reflections code, and is it making the visuals worse compared to the things they put in instead? We don't lose visual fidelity to hit a 4K budget; we swap one type for another.

 

Planar reflections are not a "tangent"; they are data to prove or disprove my position that Crysis is not an exception but a general trend. Planar reflections are a disproof of my position: we don't get them in a lot of games. So that is one place games don't push the tech at 1080p.

 

As said, there are a few games out there where the developers turned off features. Stopped making art.

 

But the question is: do you know which games those are? Because I agree with you and leadeater to some extent, but you are blaming the wrong games/engines/settings/effects. :P

 

That's the only bit I'm trying to give more info on (hence the example of screen space vs planar vs RTX... no one is making game effects worse; they have to choose how to make it work at all, or not at all). They have to choose one or the other. We cannot have it all, even if we stop the world using resolutions above 600p.

 

We had photorealistic renders at 320p decades ago... no one made a game out of it.

12 hours ago, TechyBen said:

But artists/programmers notice where it is (VRAM, memory bandwidth, CUDA/stream cores, CPU pushing data to the card, art assets, LOD scaling, development time, GPU and memory clocks).

Thing is, I never said it wasn't these.

 

12 hours ago, TechyBen said:

We can blame the artists/developers/game designers. But are they really at fault? They take 1 light out, 1 pixel shader or 1 line of reflections code, and is it making the visuals worse compared to the things they put in instead? We don't lose visual fidelity to hit a 4K budget; we swap one type for another.

And this is what I have been saying all along: one for another. I prefer the other; I'd like to see the other, as that's just my personal preference.

 

I actually gave more than enough information to illustrate my point (I would argue too much, since it wasn't actually required and clouded the point). I posted links to some information about the different types of choices that can be made and how they affect the visual look of the game, and also the CPU and GPU resources. Like, though not the only thing, the choice to use real-time lighting, baked GI or pre-computed GI.

 

The choices that will be made will likely be different (not all of them, but at least some, not zero) when targeting 4k@30 vs 1080p@30 on the same hardware.

 

12 hours ago, TechyBen said:

But the question is: do you know which games those are? Because I agree with you and leadeater to some extent, but you are blaming the wrong games/engines/settings/effects. :P

I was never blaming anything, just wishing for a different goal/focus. I don't want all games or developers to do it how I'd like to see, and I don't want to stop developers from picking a certain art style that simply would not make sense with what I'd like to see. It's not an all-encompassing wish, and I don't want to stop 4k games from being developed.

 

My general wish was only that I'd prefer to see a game where the absolute hardware limits were pushed at 1080p or 1440p; that would mean no user would want to play it at 4k, even if they could, until better hardware came out. If I see a game, a really good one, hitting say 100 FPS on a 2080 Ti at 1440p, then I'd say the hardware is potentially capable of doing much more. I'm not implying that the developers of that game didn't do everything they possibly could at the time.

 

It's about the visual progression of games: will that progression happen as quickly for 1080p when 4k is the target? Maybe, maybe not. I know it's less likely while 4k is the target/primary consideration.

Edited by leadeater
2 hours ago, TechyBen said:

I also agree with the "forest vs the trees" problem, but I posted the "Simpson's Paradox" example for how, mathematically, we were blaming the wrong bottlenecks! :) Mathematically, the bottleneck to the GPU is not where *gamers* think it is. But artists/programmers notice where it is (VRAM, memory bandwidth, CUDA/stream cores, CPU pushing data to the card, art assets, LOD scaling, development time, GPU and memory clocks).

 

We can blame the artists/developers/game designers. But are they really at fault? They take 1 light out, 1 pixel shader or 1 line of reflections code, and is it making the visuals worse compared to the things they put in instead? We don't lose visual fidelity to hit a 4K budget; we swap one type for another.

 

Planar reflections are not a "tangent"; they are data to prove or disprove my position that Crysis is not an exception but a general trend. Planar reflections are a disproof of my position: we don't get them in a lot of games. So that is one place games don't push the tech at 1080p.

 

As said, there are a few games out there where the developers turned off features. Stopped making art.

 

But the question is: do you know which games those are? Because I agree with you and leadeater to some extent, but you are blaming the wrong games/engines/settings/effects. :P

 

That's the only bit I'm trying to give more info on (hence the example of screen space vs planar vs RTX... no one is making game effects worse; they have to choose how to make it work at all, or not at all). They have to choose one or the other. We cannot have it all, even if we stop the world using resolutions above 600p.

 

We had photorealistic renders at 320p decades ago... no one made a game out of it.

 

Except your tangent about planar reflections doesn't have anything to do with proving Crysis is not an exception. It's a tangent about what can be added to a game by modding it, which is completely different from what a developer adds to a game.

 

Also, Crysis is an exception to the rule; that's why it's a meme to this day. It's one of the few examples of a game released down the years that just wouldn't run on current hardware due to sheer graphical fidelity (stuff that won't run because it's super badly optimized is a different story; hello, Assassin's Creed games).

16 hours ago, leadeater said:

 

The choices that will be made will likely be different (not all of them, but at least some, not zero) when targeting 4k@30 vs 1080p@30 on the same hardware.

 

The industry targets 1080p 60fps and 4k 30fps. They sometimes target 1080p 30fps (IIRC The Order 1886 did 1080p at 30fps). Crackdown is targeting 1080p@60fps and 4k@30fps.

 

 

The art/resolution scales. Now... they might drop some target rendering techniques for more FPS, and a reduction at 4k will increase FPS at 1080p. I agree, on games consoles this can mean they push FPS instead of graphics fidelity. But on PC, you can just re-enable that art pack/LOD/setting in the menu (or via an ini file edit, at times).

 

Quote

I was never blaming anything, just wishing for a different goal/focus. I don't want all games or developers to do it how I'd like to see, I don't want to stop developers from picking a certain art style that simply will not make sense with what I'd like to see. It's not an all-encompassing wish and I don't want to stop 4k games from being developed...

...My general wish was only that I'd prefer to see a game where the absolute hardware limits were pushed at 1080p or 1440p

Which is why I've not said you are wrong about your opinion! You are correct! :D

I've said so, and provided examples of 12 games with what you asked for. I can provide probably hundreds if you wish. :)

 

13 hours ago, CarlBar said:

 

Except your tangent about planar reflections doesn't have anything to do with proving Crysis is not an exception. It's a tangent about what can be added to a game by modding it, which is completely different from what a developer adds to a game.

 

Also, Crysis is an exception to the rule; that's why it's a meme to this day. It's one of the few examples of a game released down the years that just wouldn't run on current hardware due to sheer graphical fidelity (stuff that won't run because it's super badly optimized is a different story; hello, Assassin's Creed games).

Nope. No tangent.

Here, as said, I like to follow the "tech". So Far Cry 5:

Quote

Previously on Far Cry games we'd used a planar reflection, but it was difficult to maintain a forward rendering pipeline (and often it didn't match up with what was rendered in the main view), and ensuring we only had one water height to generate reflections at was always a pain for our art and world building team. Moreover, for Far Cry 5 we wanted sloped water for river rapids, so planar reflections would no longer work. Plus if you already have SSLR (screen space lighting reflections) for your world, why not re-use it for water?

The technology and the art decided what to use and what to target, not the resolution. The resolution affected the decision ("difficult to maintain a forward rendering pipeline", but this was before 4k!), but it was "downgraded" because, while planar reflections are more accurate, they do not work with the rest of the art assets or the render engine.
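For anyone wondering why "one water height" matters so much in that quote: a planar reflection re-renders the scene from the camera mirrored across the reflection plane, so the whole technique assumes a single flat plane. A toy sketch (coordinates and the water height are made up):

```python
# Planar reflection core idea: the reflected view is rendered from the
# real camera mirrored across the reflection plane (here, y = height).
def mirror_across_horizontal_plane(camera_pos, plane_height):
    """Reflect a camera position (x, y, z) across the plane y = plane_height."""
    x, y, z = camera_pos
    return (x, 2 * plane_height - y, z)

camera = (10.0, 5.0, -3.0)
water_height = 1.0
print(mirror_across_horizontal_plane(camera, water_height))  # → (10.0, -3.0, -3.0)

# A sloped river has no single plane_height, so there is no one mirrored
# camera to render from — hence Far Cry 5 dropping the technique.
```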

 

As said, graphics coding/art assets/render pipeline does not work the way some people in this thread are suggesting. I agree that some of it looks better and some looks worse. RTX looks better but has a massive performance hit. Planar looks OK but is limited in render options (won't mirror all effects/assets, won't be adjustable in play), and screen space reflections are easy, quick and mostly visually acceptable.

 

Screen space -> planar -> RTX in quality, and the opposite order for performance. BUT each one also has practical and technical limits. IIRC Battlefield has never used planar, because of the tech limits. So RTX allows for some cash from Nvidia, some new reflections and higher quality images.
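For reference, here is a stripped-down sketch of what screen space reflection ray marching does (the depth values and step sizes are invented; real implementations march in projected screen space and refine hits with a binary search):

```python
# Minimal SSR ray march: step the reflected ray across the depth buffer
# and stop at the first surface the ray passes behind.
def ssr_march(depth_buffer, start_x, ray_depth, depth_step, max_steps=16):
    """Return the x index where the marched ray first goes behind the stored
    depth, or None if it leaves the screen (SSR's classic failure mode:
    there is no information outside the visible frame)."""
    x, d = start_x, ray_depth
    for _ in range(max_steps):
        x += 1
        d += depth_step
        if x >= len(depth_buffer):
            return None  # ray left the screen: nothing to reflect
        if d >= depth_buffer[x]:
            return x  # ray went behind a visible surface: reflect this pixel
    return None

depths = [5.0, 5.0, 4.0, 3.0, 2.5, 9.0]
print(ssr_march(depths, start_x=0, ray_depth=1.0, depth_step=1.0))  # → 3
```

The "only what's on screen" limitation is exactly why it's cheap, and why planar or ray tracing can look better: they can reflect things the camera can't see.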

 

I'll sit back for a bit. But why don't people here also tell me how these things are made? I've not said they are wrong that some things "look worse", but I've said they "don't look worse because of developers"; they "look worse because of limits to tech/reality/how many Titan Vs you have in your rig". XD

1 hour ago, TechyBen said:

I've said so, and provided examples of 12 games with what you asked for. I can provide probably hundreds if you wish. :)

Well, yes and no. You've provided something, but it's not really targeting what I'm talking about much. Sure, you can get HD texture pack DLC, or an official or unofficial mod, at release or later down the track, but that only addresses a smaller aspect of what I'm expressing (textures and/or models may not be everything I'd like to see improved, or by how much), nor does it prove that with these we are getting the absolute maximum possible that can be achieved at lower resolutions.

 

That's why I've said it doesn't matter how many you provide; it's not really the right area of the discussion I was having, or more correctly, trying to have.

 

It's just a shorter-term concern that, while 4k is a very demanding resolution compared to 1080p/1440p and GPUs in computers and consoles really are not sufficiently capable without compromise (something has to give), the rate at which graphical fidelity improves will temporarily wane. It might only be a 2, 3 or 5 year thing while the commonly used hardware bridges the gap in capability, but it's a "what if" I do think about and ponder.

 

1 hour ago, TechyBen said:

As said, graphics coding/art assets/render pipeline does not work the way some people in this thread are suggesting

Well, I have to disagree with this sentiment, because you've never really understood what I've been expressing, or it's never appeared like you have.

 

Edit:

1 hour ago, TechyBen said:

but I've said they "don't look worse because of developers"; they "look worse because of limits to tech/reality/how many Titan Vs you have in your rig". XD

I think there is the missing link: I don't think you've gotten that when I talk about developer choices, I am doing so in direct relation to hardware, budget, or other technology factors that all require choices to be made. The final product must actually be achievable and also run on the consumer's devices/hardware; we've all agreed on that.

Edited by leadeater
31 minutes ago, leadeater said:

nor does it prove that with these we are getting the absolute maximum possible that can be achieved at lower resolutions.

*The Order 1886*. They physically could not get more out of the render engine at 1080p. I've literally given examples many, many times, yet you say "we can get more out of 1080p".

Quote

I think there is the missing link: I don't think you've gotten that when I talk about developer choices, I am doing so in direct relation to hardware, budget, or other technology factors that all require choices to be made. The final product must actually be achievable and also run on the consumer's devices/hardware; we've all agreed on that.

I posted the developer choices. I literally posted their choices. Why "ponder" when I have actual comments from the developers? (Far Cry 5, Crackdown, The Order 1886, Ghost of a Tale, BFV.)

 

Quote

The final product must actually be achievable and also run on consumers devices/hardware, we've all agreed on that.

Yes, but your hardware (or consoles, to some extent) can run 30 or 60 fps, 10 lights or 20, high res or low res. 99% of games/developers give *both* options. About 1% choose *only* resolution (I cannot find a single one; Crackdown is close, but not quite), or *only* FPS (most Nintendo games, some Xbox/PS games, online multiplayer games, Rocket League for example), or *only* polygons (Grow Home).

 

PS: how do I ban myself from one specific thread? I really cannot help myself trying to post info on how games are made... If people ask/post their opinion that "it's made this way" and it's not, I'm gonna post (PDF/presentation/YouTube video/images of) how it actually is made.

2 minutes ago, TechyBen said:

*The Order 1886*. They physically could not get more out of the render engine at 1080p. I've literally given examples many, many times, yet you say "we can get more out of 1080p".

 

26 minutes ago, leadeater said:

It's just a shorter-term concern that, while 4k is a very demanding resolution compared to 1080p/1440p and GPUs in computers and consoles really are not sufficiently capable without compromise (something has to give), the rate at which graphical fidelity improves will temporarily wane

 

As the quote of myself above shows, I don't think you've grasped the whole talking-about-industry-trends aspect, future projections. It's great that a game has now, or in the past, done so, but is that indicative of a trend that will continue? That there are likely going to be games doing that in the future?

 

You have a tendency to latch on to a comment that is made in a post and argue that point, while missing what was actually being expressed. It really does cloud the discussion and pull it away from relevance.


[Edited out my argument, because forget adding facts and examples, comments from developers, and actual games and code... I'm done with trying to inform people about game development]

I understand. We are scared that all the games will go 4k 60fps, so they will drop what? RTX support? They did not; they *added* it. This is the thing: the trend is the opposite of the one you are scared of.

 

Bye.

57 minutes ago, TechyBen said:

How can you improve 1080p when we already hit the technological and artistic limit?

Couple of things: that was a PS4-only title, so yes, I would agree that it may be, or most likely is, the maximum possible on the PS4. It was also a game that started development and was released before the PS4 Pro, so it would not be subject to my concern of 4k having some kind of impact, since the target hardware during that time was not capable of 4k.

 

That the upgraded PS4 Pro does not have nearly a sufficiently powerful GPU for 4k, and that with the big industry push for 4k, to meet hardware that is now becoming more generally available (so it would actually be advisable to address that market), this newly obtained hardware capacity is more likely to be spent on 4k-optimized games than on 1080p-optimized games.

 

Quote

PlayStation 4 Pro (codenamed Neo)[40] was announced on September 7, 2016, and launched worldwide on November 10, 2016.[205] Its model number is CUH-7000.[206] It is an upgraded version of the PlayStation 4 with improved hardware to enable 4K rendering and improved PlayStation VR performance, including an upgraded GPU with 4.2 teraflops of processing power and hardware support for checkerboard rendering,[207]

 

Rendering games at 4K resolution is achieved through various rendering techniques and hardware features; PlayStation technical chief Mark Cerny explained that Sony could not "brute force" 4K without compromising form factor and cost, so the console was designed to support "streamlined rendering techniques" using custom hardware, "best-in-breed temporal and spatial anti-aliasing algorithms", and "many new features from the AMD Polaris architecture as well as several even beyond it". The most prominent technique used is checkerboard rendering, wherein the console only renders portions of a scene using a checkerboard pattern, and then uses algorithms to fill in the non-rendered segments. The checkerboarded screen can then be smoothed using an anti-aliasing filter. Hermen Hulst of Guerrilla Games explained that PS4 Pro could render something "perceptively so close [to 4K] that you wouldn't be able to see the difference".
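The checkerboard idea in that quote can be sketched in a few lines (the "scene" here is just a grid of made-up brightness values, and the reconstruction is a crude neighbour average, far simpler than the real algorithms):

```python
# Toy checkerboard rendering: shade only half the pixels each frame in a
# checker pattern, then fill the gaps from rendered horizontal neighbours.
def render_checkerboard(scene, frame_parity):
    h, w = len(scene), len(scene[0])
    out = [[None] * w for _ in range(h)]
    # Pass 1: shade only pixels on this frame's half of the checkerboard.
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == frame_parity:
                out[y][x] = scene[y][x]
    # Pass 2: reconstruct skipped pixels from their rendered neighbours
    # (in a checker pattern, horizontal neighbours are always rendered).
    for y in range(h):
        for x in range(w):
            if out[y][x] is None:
                nbrs = [out[y][nx] for nx in (x - 1, x + 1)
                        if 0 <= nx < w and out[y][nx] is not None]
                out[y][x] = sum(nbrs) / len(nbrs)
    return out

scene = [[1, 2, 3, 4], [5, 6, 7, 8]]
print(render_checkerboard(scene, frame_parity=0))
# → [[1, 2.0, 3, 3.0], [6.0, 6, 7.0, 8]]
```

The win is that only half the pixels get fully shaded per frame, which is roughly how the PS4 Pro gets "perceptively close" to 4K without 4K's full shading cost.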

Edit: And there is also a Sony rule that does not allow PS4 Pro-only games; you are not allowed to create a game that cannot be run on the PS4, so there is never going to be a game on the PS4 Pro, not running in 4k, that is graphically superior to the PS4. Increased frame rate is allowed. [/edit]

 

That, like the PS4 Pro and Xbox One X, the common hardware in the PC gaming market is also not sufficiently powerful for 4k, but there is still an industry push to deliver it, to make it possible, and this is going to be the predominant focus rather than achieving the best possible at 1080p/1440p on the hardware of the time.

 

The Order 1886 would have been an excellent example, and more relevant, if it had been released last year or this year, but it pre-dates hardware that was 4k capable.

Edited by leadeater
4 hours ago, TechyBen said:

how many Titan Vs you have in your rig".

 

Pardon me for cutting the rest of the post, but I needed to quote this specifically, as it's really the most relevant portion.

 

This is exactly what I've been saying all along, so what exactly are we arguing about?

 

As an aside: yes, things other than system resources can limit things; I've never said they can't. In fact, a lot of non-AAA games have budgetary constraints, and many more games are console ports with little, if any, other features enabled. But the hardware is (pardon the pun) the hard limiting factor.

 

There are all sorts of tricks developers can pull to get a given effect into the game. Remember, highly realistic movie CGI has been rendered on non-RTX hardware; if the devs are willing to go to the effort, they can put in any level of detail they choose. They may have to do all kinds of workarounds to make it happen, but they can physically do it.

 

But no matter what they can do in terms of implementing a given level of graphical detail, the hardware's rendering capabilities are always the final limiting factor they can't go beyond. The hardware is always the final limit.

 

 

To come back to that Far Cry 5 example: yes, they moved to screen space in part for technical reasons, but as the discussion about reflections a few pages back (in Killing Floor 2, I think it was) showed, there are workarounds that could have been implemented in Far Cry 5 to eliminate any quality drop-off. Yet they didn't do so, and it wasn't a technical limit that stopped them from implementing those features.


I agree with your principles. I agree with their feelings! :)  But I don't think you are correct about the statements being made about the technology. For example:

19 hours ago, CarlBar said:

Killing Floor 2 showed, there are workarounds that could have been implemented in Far Cry 5 to eliminate any quality drop-off.

(I cannot find that discussion, but I can look into that game, and see what tech it uses)

https://www.geforce.com/whats-new/articles/killing-floor-2-graphics-technology-and-gameworks

 

Killing Floor only uses screen space reflections, *which is what Far Cry 5* uses. So I am confused: which one is better? Which one should Far Cry 5 implement? And which one has "workarounds"? (As shown, Killing Floor uses no workarounds; it just uses SSR, the same as Far Cry 5. Planar reflections may have workarounds, but they would tank FPS and/or need custom code to get both SSR and planar working in one game... and I've no idea if that matrix transform/binary search is even possible :P ).

 

I'm trying to find out if KF2 got planar reflections as an update/upgrade later on. Even if it did, it hasn't got the hills/rivers/waterfalls of Far Cry 5 (which is mainly an outdoor countryside artistic choice), so that influences the choice of technology used.

 

You say Killing Floor shows you can use workarounds? What workarounds has Killing Floor used? (I cannot see any.)

 

Why do people keep saying "but they could do this" when they don't even know if they already are?

 

22 hours ago, leadeater said:

The Order 1886 would have been an excellent example, and more relevant, if it had been released last year or this year, but it pre-dates hardware that was 4k capable.

But The Order 1886 focused on 900p, not 1080p! I just let you troll yourself. Proof that the higher resolution option is being "hidden" to get higher fidelity.

Well, not really trolling, but showing you have not checked before making statements; you seem to make it up. They are doing both: they could hit 1080p, but doing so would lower the image quality. They chose quality.

720p can also be done. Would 1886 look better in 720p? Did 1080p stop The Order 1886 looking better in 720p? Or did 720p stop The Order 1886 looking better at 524p?

 

With 4K, can we see anyone who has chosen, is choosing, or will choose to lower quality? On console? OK, so are you getting worse games on your Xbox/PS4/Switch? Do you have one?

7 hours ago, TechyBen said:

With 4K, can we see anyone who has chosen, is choosing, or will choose to lower quality? On console? OK, so are you getting worse games on your Xbox/PS4/Switch? Do you have one?

Yes, I've owned every PlayStation and play all my favorite JRPGs on them, as well as racing games or any other game that is not a better experience with a keyboard and mouse. So on my PC I play RTS, TBS and FPS games.

 

7 hours ago, TechyBen said:

But The Order: 1886 targeted 900p, not 1080p! I just let you troll yourself. Proof that the higher-resolution option is being "hidden" to get higher fidelity.

And that's ignoring the resource demand difference between those two resolutions, and between 1080p vs 4k. That's also ignoring that I'm talking about the future, not the past; have I actually at any point said that developers have been limiting graphical fidelity improvements in the past to chase resolution? I think you'll find I have not.
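For anyone wanting the raw numbers behind that resource demand difference, the pixel counts alone tell most of the story (simple arithmetic, nothing engine-specific):

```python
# Raw pixel counts per frame at the resolutions being discussed.
res = {
    "900p":  1600 * 900,    # 1,440,000 pixels
    "1080p": 1920 * 1080,   # 2,073,600 pixels
    "4k":    3840 * 2160,   # 8,294,400 pixels
}

print(res["1080p"] / res["900p"])   # 1.44x more pixels from 900p -> 1080p
print(res["4k"] / res["1080p"])     # 4.0x more pixels from 1080p -> 4k
```

So the 900p-to-1080p jump The Order: 1886 skipped is a ~44% increase in shaded pixels, while 1080p to 4k is a 300% increase: a much bigger ask of the same hardware budget.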

 

I looked at the release date and immediately determined that it was not applicable to the discussion, along with all the other games released before the PS4 Pro/Xbox One X existed. I'm aware that multiple games on both the Xbox One and PS4 had lower internal resolutions, same goes for the previous console generation.

 

I could have brought up my own old example such as GT5 vs GT6, where GT6 actually looked worse. I could have even raised my own counterpoint to myself, a far more applicable one: GT Sport. GT Sport not only actually looks graphically better in 4k mode, it has two render modes, Quality or Frame Rate. Frame Rate mode runs at 1080p@60 in all situations, and Quality mode uses what they call "Dynamic 4k", where gameplay is 1800p checkerboard upscaled to 4k@60 and replays are 4k@30; Quality 4k mode on a 1080p output enables super sampling. Consideration was put into ensuring the best gameplay experience while also aiming for the best graphical experience possible. While Polyphony Digital was not allowed to make a higher quality 1080p mode on the PS4 Pro, they found a way to make it look the best they could at that resolution as well as at 4k (gameplay and replay). Could they have made a better looking game at native 1080p on the PS4 Pro? Maybe, but I'm happy enough with what they did given the restrictions they were working with.
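To put rough numbers on why checkerboarding is the compromise of choice (my own back-of-envelope sketch, not Polyphony's actual pipeline; the shade-half-the-samples figure is the commonly cited approximation):

```python
# Back-of-envelope pixel-shading cost comparison (illustrative only).

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)          # full native 4k
native_1080 = pixels(1920, 1080)

# Checkerboard rendering shades roughly half the samples of the
# chosen grid each frame and reconstructs the rest from history.
cb_1800p = pixels(3200, 1800) // 2      # "Dynamic 4k": 1800p checkerboard

print(native_4k / native_1080)   # 4.0  -> native 4k is 4x the 1080p shading work
print(cb_1800p / native_1080)    # ~1.39x the 1080p work
print(cb_1800p / native_4k)      # ~0.35 -> about a third of native 4k
```

Which is roughly why a console without 4x the GPU resources of its 1080p predecessor can still put a convincing 4k image on screen.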

 

But let's not get too hung up on console examples, because my intention from the start was never to dive into the super details; I just have a general concern about the whole "4k marketing" impact on graphical progression. I personally don't know of enough examples, data points, of games like GT Sport within the time frame that interests me.

 

My issue with your examples is that the 4k development focus hasn't been around for that long; a lot of games are console-first now, and 4k consoles have been available to consumers for almost exactly 12 months.

 

Do I expect every game to have, for example's sake, 300 million dollars dumped into it to chase the extremely diminishing returns of ultimate graphical fidelity? Of course not. I shouldn't even need to say that.

 

But no, I'm somehow incorrect in my assessment of the hardware limits that are in consoles, that most people have in their computers, and that Sony does actually place restrictions on the PS4 Pro. Oh, and all my comments must apply to every single game development studio and every single game.

 

You can have all the "but in the past" discussion you like; the future state is what I'm pondering, and I don't think your crystal ball is any more powerful than mine.

 

7 hours ago, TechyBen said:

Why do people keep saying "but they could do this" when they don't even know if developers already are?

Probably because this was exclusively a future state discussion.

 

But irrespective of all of that, this discussion is massively off topic and has gone on far too long, and we're having different discussions; at no point have I seen the discussion points align closely enough to be constructive, so I'm not going to continue it. You can with others, but I'll try my best not to reply to anything.

6 hours ago, TechyBen said:

I agree with your principles. I agree with their feelings! :)  But I don't think you are correct about the statements being made about the technology. For example:

(I cannot find that discussion, but I can look into that game, and see what tech it uses)

https://www.geforce.com/whats-new/articles/killing-floor-2-graphics-technology-and-gameworks

 

Killing Floor only uses Screen Space Reflections, *which is what Far Cry 5* uses. So, I am confused: which one is better? Which one should Far Cry 5 implement? And which one has "workarounds"? (As shown, Killing Floor uses no workarounds; it just uses SSR, the same as Far Cry 5. Planar reflections may have workarounds, but they would tank FPS and/or need custom code to get both SSR and planar working in one game... and I've no idea if that matrix transform/binary search is even possible :P ).

 

I'm trying to find out if KF2 got planar reflections as a later update/upgrade. Even if it did, it doesn't have the hills/rivers/waterfalls of Far Cry 5 (which is mainly an outdoor countryside artistic choice), so that influences the choice of technology used.

 

You say Killing Floor shows you can use workarounds? What workarounds has Killing Floor used? (I cannot see any.)

 

Why do people keep saying "but they could do this" when they don't even know if developers already are?

 

But The Order: 1886 targeted 900p, not 1080p! I just let you troll yourself. Proof that the higher-resolution option is being "hidden" to get higher fidelity.

Well, not really trolling, but showing that you have not checked before making statements; you seem to make it up? They are doing both: they could hit 1080p, but to do so would lower the image quality. They chose quality.

720p can also be done. Would The Order: 1886 look better at 720p? Did 1080p stop The Order: 1886 looking better at 720p? Or did 720p stop it looking better at 524p?

 

With 4K, can we see anyone who has chosen, is choosing, or will choose to lower quality? On console? OK, so are you getting worse games on your Xbox/PS4/Switch? Do you have one?

 

Looks like I got the name of the game wrong. It was the one mentioned earlier in the thread that had surprisingly good reflections. Having to go from memory, as I'm not even sure what page it's on.

2 hours ago, leadeater said:

have I actually at any point said that developers have been limiting graphical fidelity improvements in the past to chase resolution? 

Thanks! I now know you have experience with the PlayStation. For example, the PS4 version of GTAV may not have "super ultra high def textures", but I have no idea if that is because of a 4K budget. Otherwise the same game on PC does have "super ultra high def textures" (or better lighting, or better 3D meshes). So opting out on one specific games console seems to be more the fault of Sony advertising/marketing, not "4K". :)

 

But games consoles (Nintendo, Sony and Microsoft) have been limiting graphical fidelity due to resolution, VRAM, GPU/CPU power, shader cores, bandwidth, latency, netcode... on all the consoles, since forever, even if you never noticed it in the past. The Metal Gear series (PS1 through to PS3), the FF series (probably the same). These all did the same for 500p vs 524p, 720p vs 1080p. Yes, 1080p vs 4k is a larger jump. But game development has always had that on PC (320x200 vs 1200x728 monitors back in the HL1/2 days). I agree about games consoles. I also agree it's a fear for the future with 4k. I disagree that it's a rational fear, though. I disagree that we should blame 4k, or the people playing on it, or the consoles supporting it.

 

But if it's a future fear, we can come back in a couple of years and see what has happened. In the meantime though, can you promise me you will learn about game development and render engines? So we can look at what has improved and what has been taken away? :) (For example, I already know "Super Sampling Anti-Aliasing" was taken away about 10 years ago.)

 

1 hour ago, CarlBar said:

 

Looks like I got the name of the game wrong. It was the one mentioned earlier in the thread that had surprisingly good reflections. Having to go from memory, as I'm not even sure what page it's on.

Ah, OK. As said, you can do a lot of things, but certain technologies or game engines are not compatible. It turns out rasterization and ray tracing are. So we get the Frostbite rasterization engine with RTX ray-tracing support overlaid on it.

 

I've no idea if the game you are remembering can also do a first-person shooter with aircraft (Far Cry 5), or do rivers and waterfalls. I agree, it may look worse to have screen space reflections instead of planar. But most of those features are going to be artistically influenced or down to player opinion. Are we certain the games we are comparing are technologically different, and not artistically different? A fixed-camera-angle game renders entirely differently to a moving one. The "tricks" some game engines use are not compatible with others, because the games work differently (some never change height, some never have moving lights, etc.).

 

One example is baked-in lighting effects. They are "pre-rendered" by the artists and engine, then the scene is set up. You get wonderful lighting on the scene, buildings etc., and can put some on characters as they move through it. However, you cannot have any of the lights/shadows moving. It's pre-rendered, so it is like a photo overlaid on the texture/map/scene/buildings.

 

If you then wish to add moving lights, they will look strange over the top of the baked-in lighting, so you have to choose one *or* the other: moving lights, or baked-in. Each has different render requirements, different art, different GPU resources, etc. Some engines like CryEngine found a way to do moving lights using less GPU power, so they added that; however, the quality was a little less than volumetric lighting, so they migrated to that. The developers of the Battlefield games, on Frostbite, chose not to take the performance hit and pre-rendered the lighting instead.
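A toy sketch of that trade-off (all names and numbers here are made up for illustration, not Frostbite or CryEngine code): a baked lightmap is a fixed lookup authored at build time, while a dynamic light is re-evaluated per pixel per frame:

```python
# Toy comparison of baked vs dynamic lighting behaviour.

baked_lightmap = {"wall": 0.8, "floor": 0.4}   # precomputed at build time

def baked_shade(surface):
    return baked_lightmap[surface]             # one lookup; the light can never move

def dynamic_shade(surface_pos, light_pos, intensity):
    # Re-evaluated every frame, so the light is free to move,
    # at the cost of per-pixel math for every dynamic light.
    dist2 = sum((s - l) ** 2 for s, l in zip(surface_pos, light_pos))
    return intensity / (1.0 + dist2)           # simple distance falloff

print(baked_shade("wall"))                       # 0.8, identical every frame
print(dynamic_shade((0, 0, 0), (1, 0, 0), 1.0))  # 0.5; changes as the light moves
```

The baked value costs nothing to move the camera past, but moving the light would require re-baking; the dynamic value tracks a moving light, but its cost multiplies with light count and resolution. That is the choice each engine is making.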

 

So in the Crysis games you get tons of moving lights/shadows/godrays. Or changes of day/night (so changes to lighting of the level). In BF you play the same map over and over and over. :P

 

You'd be amazed how many games use CryEngine, so it's not a one-off. Star Citizen uses it. But I think the engine has fallen by the wayside; Unreal, Unity, Frostbite and a few other proprietary engines (like the ones used for the Final Fantasy games) are the main ones used these days.

 

VXGI (NVIDIA's voxel-based global illumination)

So BF5 could use some of these... It probably does. The NVIDIA presentation, though, pretended these other technologies did not exist. But as far as I know, that did not stop EA adding them for when RTX is turned off. :)

 

 

 

4 hours ago, TechyBen said:

Thanks! I now know you have experience with the PlayStation. For example, the PS4 version of GTAV may not have "super ultra high def textures", but I have no idea if that is because of a 4K budget. Otherwise the same game on PC does have "super ultra high def textures" (or better lighting, or better 3D meshes). So opting out on one specific games console seems to be more the fault of Sony advertising/marketing, not "4K". :)

 

But games consoles (Nintendo, Sony and Microsoft) have been limiting graphical fidelity due to resolution, VRAM, GPU/CPU power, shader cores, bandwidth, latency, netcode... on all the consoles, since forever, even if you never noticed it in the past. The Metal Gear series (PS1 through to PS3), the FF series (probably the same). These all did the same for 500p vs 524p, 720p vs 1080p. Yes, 1080p vs 4k is a larger jump. But game development has always had that on PC (320x200 vs 1200x728 monitors back in the HL1/2 days). I agree about games consoles. I also agree it's a fear for the future with 4k. I disagree that it's a rational fear, though. I disagree that we should blame 4k, or the people playing on it, or the consoles supporting it.

I'm a little confused by this section; it's saying the exact same thing as my quote, and the rest of what I said, which it appears you did not read based on what you just tried to tell me. Remember I was not limiting my "4k really is not necessary" point from 6 pages ago to consoles, but applying it as a whole. At least on consoles they more actively look into things like checkerboard rendering, because they are stuck with a truly fixed hardware limitation; PCs can and do have different GPUs of different capabilities. Not that I'm complaining that there is more choice on the PC side; plenty of games have the ability to set 3D render resolution now anyway.

 

Also, regarding the section below that about the lighting: I had already linked to that and talked about it, but I guess you didn't read that either. Nice to see you've made your conclusions about what I know about game development on incomplete information that you actively ignored.

 

4 hours ago, TechyBen said:

In the meantime though, can you promise me you will learn about game development and render engines?

I already do; need I remind you of the programming background I do have, while limited, and the fact that I have personally made a few 2D and 3D games (3D assets provided). And I have followed game development with interest; I just never felt it was at all necessary to jump into the woods and get lost in pointless discussion over an extremely basic point: "Don't waste time trying to render 4k, we don't have the hardware for it yet". Because I thought I made it really obvious multiple times that I had no interest in that side of the discussion.

 

But hey at least you finally understood the discussion I was having, if only it didn't take 6 pages to get there.

 

Edit: Btw, if you really must know so you can be a little less condescending: I am a Systems Engineer and hardware specialist at an academic institution, supporting researchers running computational experiments on large clusters of both CPUs and GPUs, and I make the hardware recommendations to them about what to purchase as well as configuring the servers, so I might know a bit more about GPU hardware than you seem to think. But this is even less relevant and further into the woods of pointlessness.

16 hours ago, leadeater said:

Also, regarding the section below that about the lighting: I had already linked to that and talked about it, but I guess you didn't read that either. Nice to see you've made your conclusions about what I know about game development on incomplete information that you actively ignored.

Sorry, you presented many "what ifs" and no actual examples/data. So I presented examples/data that run counter to the what-ifs. How would you reply if I said "all this push for GPUs in server farms means we won't be able to do precise floating point calculations on CPUs in the future!!! Theoretically, what if they do that?!" ;) Would you agree I was making a generally incorrect statement?

 

Quote

"Don't waste time trying to render 4k we don't have the hardware for it yet"

But some people do have the hardware. Don't they? As said, I'm not disagreeing with your opinion and decisions. I'm trying to understand what technology, hardware and systems you are making claims about. Do you mean "we don't have common acceptance of 4K capable hardware". Or "we don't have the ability to do all things in 4K"?

 

Quote

Edit: Btw, if you really must know so you can be a little less condescending: I am a Systems Engineer and hardware specialist at an academic institution, supporting researchers running computational experiments on large clusters of both CPUs and GPUs, and I make the hardware recommendations to them about what to purchase as well as configuring the servers, so I might know a bit more about GPU hardware than you seem to think. But this is even less relevant and further into the woods of pointlessness.

Ok then. Thanks! :) Remember, I also said I follow the art and game development from the gamers/hobbyist side of things. So my angle of observation is from the art angle mainly.

 

Please help me understand the render engine, and how developers are dropping features for 4K (or any resolution). :)

Can you give me 1 example of a games engine dropping a render/texture/art asset for a resolution target? As said, I think I know of some for the Metal Gear Solid series, Final Fantasy Series, and Crysis, though the articles I read were decades ago, and only on consoles.

1 hour ago, TechyBen said:

Ok then. Thanks! :) Remember, I also said I follow the art and game development from the gamers/hobbyist side of things. So my angle of observation is from the art angle mainly.

I have no doubt you have followed it more than I have, and have a lot more practical experience; mine is nothing modern either. XNA is nothing like the tools used today, or really even back then; you pretty much had to code everything yourself, but that was the type of experience I was after, since I was looking for more interest and variety because most programming courses early on are arduous and dry. Really it was just Visual Studio/C# with some extra game and rendering libraries, so it wasn't truly coding everything (I did write some custom DirectX code), plus some nice included tutorial/starter projects.

 

The only other adjacent thing I have done is Inventor for robotics design when I was helping out with a Vex robotics course, if it's something technical I can deal with it but I have the artistic skills of a blind and drunk person.

 

1 hour ago, TechyBen said:

But some people do have the hardware. Don't they? As said, I'm not disagreeing with your opinion and decisions. I'm trying to understand what technology, hardware and systems you are making claims about. Do you mean "we don't have common acceptance of 4K capable hardware". Or "we don't have the ability to do all things in 4K"?

Mixture of both, as well as hardware that 'can do' 4k but actually lacks the resource increase, relative to the previous generation or product, that the demands of 4k require, i.e. PS4 vs PS4 Pro. That is why checkerboard rendering was a primary focus of the PS4 Pro: everyone on the hardware side knew it wasn't capable enough for native 4k. I apply this lack of capability to anything at or below a GTX 1080. I suspect your hardware minimum point is lower than mine?

 

I take quite a different view of making do with what we have and actually defining something as capable and up to the task. You can take any car and go out and race it on a track, even have a ton of fun doing it, but I don't call it a race car; the car can be made to do it in spite of its lack of capability (bad example? it's late).

 

This is where I look at 4k and the very real and active developer time being spent on making it a thing, whether or not we have a realistic audience of what I would class as 4k-capable hardware owners. By developer I am covering the wider spectrum of the industry, not just the game studios: engine developers, tools developers who focus solely on one thing like lighting and provide that technology for use, Microsoft/DirectX and Vulkan, Nvidia and AMD driver developers. I worry that once the more common hardware does become 4k capable, how much of this was essentially wasted effort because "we don't need to do that anymore, it's no longer an issue"? Or will it stay useful, or be transferable to a different way of utilizing it? Take DLSS as an example: will we just stop caring about that in the next generation, or the one after, or will we keep using it to upscale because the technology exists so we may as well use it? Was that overall really a good spend of industry resources? For this example I take solace in that Tensor cores offer a much wider use case than just that, and DLSS was more "find a use case for the hardware" than having a use case and creating hardware for it; RT cores would be the reverse case.

 

I would rather see effort put into addressing any technical limitations that may be impacting graphical improvements, improving current technologies, or allowing higher-detail usage of them without causing significant issues in doing so. As you mentioned earlier on, you can't simply jam everything into a single game and expect it to work, even if the raw hardware itself might have been capable of doing it; there are too many points in the chain that you can overload and stall everything before the task actually makes it to an FPU.

 

I see art very much as making do with what you have, making your vision come alive under the constraints and skill limits you are under. Art can be a driver for technology and technology is also an enabler of art; it's a double-sided feedback loop, and this is why I say this big 4k focus (driven by the need to sell products, talking hardware not games) is more likely than not to impact graphical progression in the areas I care more about. Adopting 4k for the sake of adopting it because it exists is poor reasoning to me; the PS4 Pro as it is, to me, is a bad product.

 

Anyway I still don't think this warrants any more discussion, not here anyway, because of how off topic we have gotten. Do reply though if you need/want to, I might just decide not to after that is all.

32 minutes ago, leadeater said:

I suspect your hardware minimum point is lower than mine?

 

I take quite a different view of making do with what we have and actually defining something as capable and up to the task.

Minimum of what? Capable of what? :)

 

From the artistic side, we have it all. We have had above-1080p quality, even 8k-quality art, since Crysis 1. Games of that era usually had better-than-1080p art (lights/shaders/reflections/etc.) and were scaled back to hit 60fps at 1080p. So the only limiting factor here seems to be the hardware?

 

So from a technical point, with the hardware, on the PS4 Pro. What can we add to games targeted at 4K, that will make them look better than The Order 1886 at 900p? Should we limit every game to 900p, or 720p or lower? What about audio, netcode, physics, AI... are those not also preventing a game being rendered better? If we drop Netcode and AI, that leaves more compute for the screen (post processing on CPU, cuda cores/stream processors on the GPU)?

 

Quote

tools developers who focus solely on one things like lighting and provide that technology for use, Microsoft/DirectX and Vulkan, Nvidia and AMD driver developers.

As said. Are we supposing and theorising that these tools when used at 4K make the games look worse? Or make them look worse at 1080p?

 

Quote

I worry that once the more common hardware does become 4k capable how much of this was essentially wasted effort because "we don't need to do that anymore, it's no longer an issue".

This has always happened. Certain lighting techniques in games have come and gone. Every couple of years a new system/math is used to render those lights. The industry literally does this with every game engine/revision (again, see Crysis/Far Cry, which upgraded with every game, or "downgraded" for higher FPS, such as dropping oversampling for MSAA).

 

That's just covering lights, anti-aliasing, etc., and that's before counting textures, tessellation, render boxes/outputs, reflections, specular, volumetrics... The thing is, every game that runs in 4K on the PS4/Pro also runs in 1080p. Every adjustment for 4K will make the FPS/frame time quicker at 1080p, or raise the content render targets (due to more resources being available) while the LOD of art can scale higher.

 

Theoretically, BF5/GT5/Crackdown could make their games look like Grow Home to hit a 4k, 8k or 16k target... that would give no noticeable benefit to 1080p gaming. But very few improvements, if any, don't filter down. We really are not at the 1,000-poly vs 10,000-poly models of yesteryear (the N64/PS1 etc. did do resolution tweaks to get 10 or 100 extra polys out of the hardware, but can we really notice the difference these days?). Content scales dynamically, and the differences are a few percent, not the massive ones some seem to be expecting.

 

Quote

I would rather see effort put into addressing any technical limitations that may be impacting graphical improvements, improving current technologies, or allowing higher-detail usage of them without causing significant issues in doing so. As you mentioned earlier on, you can't simply jam everything into a single game and expect it to work, even if the raw hardware itself might have been capable of doing it; there are too many points in the chain that you can overload and stall everything before the task actually makes it to an FPU.

They are putting in the effort. I cannot say "I wish data centers would stop putting effort into GPUs, it's stopping them making CPUs", they make both, one does not stop the other. Does AMD making GPUs stop them making CPUs? Why would higher resolutions stop the industry finding new lighting math, shortcuts on render engines, etc?

 

Quote

Adopting 4k for the sake of adopting it because it exists is poor reasoning to me; the PS4 Pro as it is, to me, is a bad product.

Yes, but that's not stopping the game developers. It's not stopping their graphics, any more than "must have Wii-mote controls" stopped the graphics on the Nintendo Wii. Yes, it was wasted effort to put motion controls in games that did not need them, but it did not change the graphics potential and render target of the system (Nintendo often had a 60fps render target, so games had to be downgraded in graphics to hit that, or coding efficiencies found). As said, there are places where graphics are reduced. Can you find and describe those games where graphics are reduced? :)


Yeah, I can keep it to the GPU demand and art development of RTX... that is a game changer, as some engines might go RTX-only. So my hope is it's like PhysX, where no one adopts the proprietary option and it goes cross-platform. It will be a division though, as VR-only games are. There will be RTX-only games.

5 minutes ago, TechyBen said:

They are putting in the effort. I cannot say "I wish data centers would stop putting effort into GPUs, it's stopping them making CPUs", they make both, one does not stop the other. Does AMD making GPUs stop them making CPUs? Why would higher resolutions stop the industry finding new lighting math, shortcuts on render engines, etc?

It was more a "diverting of resources" comment, not a stopping or prevention.

