
Battlefield V with DXR on, tested by Techspot (Hardware Unboxed)

1 minute ago, leadeater said:

That all graphics settings, all techniques, all methods, including assets, are influenced by the decision to target 4K. You know exactly why the game will not look like that pre-render; I know why it won't. You seem to be ignoring that, arguing instead that the raw asset file is provided and included in the game files, or exists at all, while ignoring how it's actually used in the game, which is what actually matters.

Wait... so you know it will not look like the pre-render. But do you know why that model/texture/lighting is not in the game assets? You say TechyBen does... and TechyBen says it's because your graphics card cannot render it *at 1080p*. So, am I wrong? Can your graphics card render that image at 1080p?


Just now, TechyBen said:

Wait... so you know it will not look like the pre-render. But do you know why that model/texture/lighting is not in the game assets? You say TechyBen does... and TechyBen says it's because your graphics card cannot render it *at 1080p*. So, am I wrong? Can your graphics card render that image at 1080p?

Ahhh that's not what I'm saying.

 

13 minutes ago, leadeater said:

It is possible to get a different-looking game where 1080p was the design goal and not 4K. Are you really going to tell me that it's not possible? That, due to this choice, nothing more complex would be done that would only be possible at 1080p and not 4K? You know this for sure? You would stake your life on it that if, for the next three years, games only targeted 1080p, we would not implement things that could not be done at 4K at acceptable performance? We would not make developmental and technological decisions based on this 1080p constraint? We would do nothing?

Answer this.


4 minutes ago, leadeater said:

Ahhh that's not what I'm saying.

 

Answer this.

I will. First, tell me what you mean. Every time I show a picture, you say "that's not what I mean". What do you mean?

 

As said, I know the render pipeline, the art assets, the art production and the development cycle. I know the PC hardware and its limits.

 

What do you mean by "It is possible to get a different-looking game where 1080p was the design goal"? Where is this magic better-looking game at 1080p?

 

Quote

are you really going to tell me that it's not possible?

No. I'm asking "is it possible?" You are saying it is possible. I am giving examples of higher quality art + higher quality game engines. You are saying "that's not what I mean". So show me a game engine at 1080p or game art at 1080p that you mean. :)

 

(I'm guessing you mean "future" art? If you do, I will discuss that.)


1 minute ago, TechyBen said:

No. I'm asking "is it possible?" You are saying it is possible. I am giving examples of higher quality art + higher quality game engines. You are saying "that's not what I mean". So show me a game engine at 1080p or game art at 1080p that you mean. :)

You are giving inconsequential examples, not relevant to the actual game rendering we will see and get, in an attempt to show that high-quality assets exist. So what? That's a nice pre-render, why are you showing it? Do you think I'm under the belief that we could get a game to that final product quality standard right now by limiting ourselves to 1080p on current hardware?

 

You keep asking if it's possible to render it; that's not the right question, and it doesn't actually apply to what I'm talking about.

 

Here is what the final, actual game will most likely look like, not that pre-render.

[GIF: in-game footage]

 

Boy it looks nothing like the pre-render, I'm totally shocked and did not expect that.

 

You say you know the hardware limits, but then for some reason think that targeting 4K, in game, is not going to impact what is possible to achieve in the final look of the game.


6 minutes ago, leadeater said:

You are giving inconsequential examples, not relevant to the actual game rendering we will see and get, in an attempt to show that high-quality assets exist. So what? That's a nice pre-render, why are you showing it?

The *steps* of game art creation and game engine asset production. You seem to have missed a few, so you are making a Simpson's Paradox conclusion (that the direction/trend is downwards).

 

Hence the *artwork* at an oversampled quality. Now that we agree the artwork exists, I can see if we agree the 1080p resolution effects exist.

 

Quote

You say you know the hardware limits, but then for some reason think that targeting 4K, in game, is not going to impact what is possible to achieve in the final look of the game.

 

Where, in the GIF you provided, did they lower the quality for the 4K target? Or where, in the GIF, can they improve the 1080p render?


8 minutes ago, TechyBen said:

Where, in the GIF you provided, did they lower the quality for the 4K target? Or where, in the GIF, can they improve the 1080p render?

At the very start, where the developers stated the design goal of the game was 4K. Where they made active decisions to meet it, where they made choices that affected the end product. What, do you want me to go back in time and get them to change the goal to 1080p@30 instead, so we can see how the hypothetical would turn out?

 

8 minutes ago, TechyBen said:

Simpson's Paradox conclusion (that the direction/trend is downwards).

No, you are. By not understanding what I'm pointing to, you are concluding this under your assumption of what I mean, not what I actually mean.

 

"I say what your opinion/point is so I can disprove it". I get the very strong impression that this is what you are doing. 


12 minutes ago, leadeater said:

Where they made active decisions to meet it, where they made choices that affected the end product.

How does that make it worse? They said they had to meet 4K. But where does that make the game visuals worse than if hitting 1080p? As said, you've yet to provide one real-life example (your FF one is correct and true, but not a real example in a game. The game rendered those AT 1080p, it was not prevented from rendering them).

 

Quote

What, do you want me to go back in time and get them to change the goal to 1080p@30 instead so we can see how the hypothetical would turn out?

No. I'm asking, where did they say they can make it look better at 1080p@30? They said they "may" be able to. Can you show a single game (if, as you imagine, Crackdown has not) that has made this possible?

 

Quote

"I say what your opinion/point is so I can disprove it". I get the very strong impression that this is what you are doing. 

You have stated that your opinion is that they downgraded the graphics of games (I use Crackdown as a choice for us to discuss). I state they have not. As I know the artwork pipeline, they *have to increase the graphics* to hit 4K resolution. That's not an opinion. That's a fact.

 

If they have to increase the graphics fidelity to hit 4K resolution, how are they lowering the graphics to hit it? They lower the LOD to hit 30 FPS, not the graphics quality. They don't remove the quality, they adjust the LOD scale. They don't do it for performance at a resolution, they do it for performance at a resolution with matching LOD. They already have *oversampled* content (number of lights, textures, poly mesh, shaders, pixel pipelines, number of characters on screen).
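To make the LOD point concrete, here is a rough sketch of the usual approach (my own illustration, with made-up function names and thresholds, not taken from any real engine): pick an LOD level from how many pixels a mesh covers on screen, so the same oversampled assets serve every output resolution and only the coverage changes with the target.

```python
import math

# Hypothetical illustration of screen-coverage-based LOD selection:
# the same asset set serves 1080p and 4K; only the pixel coverage changes.

def screen_coverage_px(mesh_radius_m, distance_m, vertical_res, fov_deg=60.0):
    """Approximate vertical pixel coverage of a sphere-bounded mesh."""
    angular_size = 2.0 * math.atan(mesh_radius_m / max(distance_m, 1e-6))
    return vertical_res * angular_size / math.radians(fov_deg)

def pick_lod(coverage_px, thresholds=(400, 150, 50)):
    """LOD0 = full-detail mesh; higher numbers = simpler meshes (made-up cutoffs)."""
    for lod, limit in enumerate(thresholds):
        if coverage_px >= limit:
            return lod
    return len(thresholds)  # lowest detail / impostor

for res in (1080, 2160):  # 1080p vs 4K vertical resolution
    px = screen_coverage_px(mesh_radius_m=2.0, distance_m=40.0, vertical_res=res)
    print(f"{res} lines: object covers ~{px:.0f} px -> LOD{pick_lod(px)}")
```

Same model, same camera: at 4K the object simply covers more pixels and qualifies for a more detailed LOD. That is the "adjust the LOD scale" knob, rather than a change to the assets themselves.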

 

If this is wrong, then sorry. I posted multiple images trying to understand what you mean, and you say none if it is what you mean. I try to post multiple artwork pipelines and render options, and you say that's not what you mean.

 

[image]

How is this lower quality than a 1080p@30FPS target?


19 minutes ago, TechyBen said:

How does that make it worse?

I never said it made it worse; you did. You are saying I am saying that. I am saying we could have a different end product result with a different goal. Better? Worse? Subjective depending on person? I can't show you a hypothetical. I can posit that there would be a difference, you can disagree and say there wouldn't.

 

19 minutes ago, TechyBen said:

You have stated that your opinion is that they downgraded the graphics of games

No, I haven't; again, you have said that, not me.

 

19 minutes ago, TechyBen said:

No. I'm asking, where did they say they can make it look better at 1080p@30? They said they "may" be able to.

Bold part. This is what I'm interested in: the "may be able to", so try. But no, 4K was the goal, so they didn't. Not wrongly, just a different path than what I would have liked to see.

 

19 minutes ago, TechyBen said:

(your FF one is correct and true, but not a real example in a game. The game rendered those AT 1080p, it was not prevented from rendering them).

Wait, how is this not a real example? I was not using that as an example of a game having to use lower-quality assets because it targeted a higher resolution. I think you'll find, if you go read the post again, that I was using it as an example of not being able to make a model look any better than it actually is; you can make it look worse, though. Nor have I used any examples or points to mean "prevented" in the sense of a literally impossible task that cannot be done.

 

Edit:

I cannot show you something the industry is not doing; I do mean that in the literally impossible sense. If they did forgo 4K and spent GPU resources in other areas that would not otherwise be used because you aren't targeting 4K, then the final product would in all likelihood look different from the same game developed with a 4K goal but at 1080p.

 

Like above, it's the "may be able to" I'm interested in, and that comment was made with console hardware in mind. I am painfully aware that games are now designed first with consoles in mind, though, so that would be another challenge/barrier to what I'd like to see.


15 minutes ago, leadeater said:

 I am saying we could have a different end product result with a different goal. Better? Worse? Subjective depending on person? I can't show you a hypothetical. I can posit that there would be a difference, you can disagree and say there wouldn't.

Different? Yes. But knowing the artwork pipeline, I can tell you that having a 4K goal gives you a better end product, not a "worse" one. It MAY give you lower FPS, or lower LOD, *if* forced to 4K. But if you choose 1080p or 1440p, aiming at a higher-than-median resolution target will *always* give a higher LOD or FPS. If a developer cuts content, they will be cutting existing content, and nine times out of ten it will be content that would not fit on the disc (85 GB texture packs), will not fit in VRAM, or will not render at 1080p (the above pre-game assets).

Quote

This is what I'm interested in: the "may be able to", so try. But no, 4K was the goal, so they ~~didn't~~ may not have. Not wrongly, just a different path than what I would have liked to see.

See my edit to your statement. You confused (with a Simpson's Paradox comparison) the statement he made with an absolute. You think they held back something from the game. So let's see if that is true, via a comparison. So, let's now ask the Crackdown music developers how good their music is in the game:

Quote

We wanted 15.1 surround sound support, but the 1080p target is stopping us. If we dropped the visuals entirely and released the game on vinyl, we could spend the entire budget on an orchestra for the music. Yeah, so if we did not have to concentrate on showing what is happening, at 1080p, then we could make an even better audio track.

If that argument is nonsensical, then replace "music" with "4K" and it's still nonsensical. Spending more money/time/budget/processing power on the music will stop you spending as much on the visuals. Same with 4K. Spending more time/budget/processing power in that mode will stop you spending it on music, netcode, Wayne Johnson adverts...

 

But it does not prevent or make the other things "different" in and of itself.

 

Grow Home did not have worse graphics because they added a 4K render target to the game. The game works at any resolution. It renders beautifully at all of them. The *artwork* target and budget scale separately from the *render* target and budget.


1 minute ago, TechyBen said:

You confused (with a Simpson's Paradox comparison) the statement he made with an absolute.

 

You know this is the full statement, right? By "Timothy Lottes, a member of GPU maker AMD's Game Engineering team."

 

Quote

But he goes further: “I think it would be an interesting world if we stuck at 720p and then just kept on scaling performance. Then developers would have more opportunity to use the GPU for more than just filling more pixels.” On one hand, that excess GPU power could be put into perfecting every pixel to better close the distance to CG movie visuals, employing high-quality antialiasing and lighting effects such as the realtime ray-tracing showcased by Microsoft and Unreal at GDC this year. Or what about forgetting all that and putting it into world simulation to vastly expand the number of AI-controlled actors, physics and other complementary systems that go into producing dynamic and interactive places in which to play?

So this was not said by a Crackdown developer and is purely a hypothetical opinion statement pondering an outcome, like I am.

 

So I know for a fact, using the Crackdown developers' comments, that 4K was indeed the design target.

Quote

“We’ll always support 4K mode, if we’re intending on targeting it, from the earliest point in the production cycle. This makes things easier from a cost and workflow perspective as opposed to trying to retrofit support at the end.” For Sumo, thinking about 4K early is a good way of managing its demands.

 


Quote

In other words, consoles have not kept in step with the additional requirements of 4K

So, you are talking about Console gaming?

Quote

Developers must therefore, use a number of tricks to achieve 4K output while also reaching the same level of visual detail that their games can achieve in 1080p

They reach the *same* as in 1080p. Not remove/reduce/redirect.

Quote

Onrush, for example, only renders half of 4K’s pixels, thereby only doubling the additional work from outputting at 1080p rather than quadrupling it.

So, it does not affect the 1080p quality... it just upscales.
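A quick pixel-count check of that quoted claim (simple arithmetic, nothing more):

```python
# Shading half of a 4K frame (checkerboard-style) is roughly 2x the pixel work
# of native 1080p, instead of the 4x that full native 4K would be.
px_1080p = 1920 * 1080          # 2,073,600
px_4k    = 3840 * 2160          # 8,294,400
print(px_4k / px_1080p)         # 4.0 -> native 4K is 4x the pixels of 1080p
print((px_4k / 2) / px_1080p)   # 2.0 -> half of 4K's pixels is 2x 1080p
```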


6 minutes ago, TechyBen said:

So, you are talking about Console gaming?

Nope, my comment is actually exactly the same as Timothy Lottes'.

 

Quote

“I think it would be an interesting world if we stuck at 720p and then just kept on scaling performance. Then developers would have more opportunity to use the GPU for more than just filling more pixels.” On one hand, that excess GPU power could be put into perfecting every pixel to better close the distance to CG movie visuals, employing high-quality antialiasing and lighting effects such as the realtime ray-tracing showcased by Microsoft and Unreal at GDC this year. Or what about forgetting all that and putting it into world simulation to vastly expand the number of AI-controlled actors, physics and other complementary systems that go into producing dynamic and interactive places in which to play?

 

 

Edit:

OK, 1080p or 1440p, not 720p. Other than that, I'm all for seeing what is possible with the GPU resources not being used to render at 4K. Why it took, as far as I can tell, three pages for you to realize I'm talking about a hypothetical, I'm not sure. I directly stated that was the case more than once.


10 minutes ago, TechyBen said:

They reach the *same* as in 1080p. Not remove/reduce/redirect.

Now increase that 1080p visual detail past the point that would be possible at 4K on those consoles, no matter the "tricks" they use. A side point to this: they aren't going to ship a game where the visual quality is better at 1080p and worse at 4K.


15 minutes ago, leadeater said:

Nope, my comment is actually exactly the same as Timothy Lottes'.

 

 

A)

Quote

For Lottes, a high framerate is more important than resolution because it favours fluid motion and faster input response times, and he’s comfortable with the cost of lower resolution.

I compared resolution to framerate too.

B)

Quote

In terms of pure numbers, though, he points out that targeting native 4K at 30 frames per second is equal in the rate of rendered pixels to targeting native 1080p at 120 frames per second.

Which is what a lot of games already DO. So no games have dropped content to get 1080p at 30 FPS, because it would conflict with A) FPS. They prefer 1080p at high FPS, but will concede to low FPS at 4K. Not low FPS at 4K instead of high fidelity at 1080p.
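For what it's worth, the pixel-rate claim in B) checks out with basic arithmetic:

```python
# Native 4K at 30 fps pushes the same number of pixels per second
# as native 1080p at 120 fps.
rate_4k_30    = 3840 * 2160 * 30    # 248,832,000 pixels/s
rate_1080_120 = 1920 * 1080 * 120   # 248,832,000 pixels/s
print(rate_4k_30 == rate_1080_120)  # True
```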

 

Quote

He acknowledges that this calculation is entirely theoretical

Well. Um.

 

Quote

I think it would be an interesting world if we stuck at 720p and then just kept on scaling performance

We have. Like every game already has a 720p option. See if it allows you to increase the asset count (you said the devs have not made the assets because of "4K"; I've posted numerous sources for these asset libraries/DLC/packs... so we can add them to a game and render/run it at 720p to see if it looks better!).

 

Quote

On one hand, that excess GPU power could be put into perfecting every pixel to better close the distance to CG movie visuals

Yes, see here in 2014: 

But that is not good enough for a game, as the hardware cannot run it at a gameplay FPS *even at the lower resolution* of the 600x400 window in the middle.

 

Quote

Or what about forgetting all that and putting it into world simulation to vastly expand the number of AI-controlled actors, physics and other complementary systems that go into producing dynamic and interactive places in which to play

*We already have this*

[images: Knack screenshots]

 


 

But not all games want or need to have 1,000 AI agents, or millions of particles. So some choose resolution or textures instead.

 

Quote

Now increase that 1080p visual detail past the point that would be possible at 4K on those consoles, no matter the "tricks" they use. A side point to this: they aren't going to ship a game where the visual quality is better at 1080p and worse at 4K.

GTA V, Far Cry 5, Watch Dogs 2, COD, BFV, every single game ever (from scaling LOD, not from holding back the "quality"). :P

 


13 minutes ago, TechyBen said:

We have. Like every game already has a 720p option.

You are conflating the possibility of a 720p option with literally being stuck at 720p, which is what is being said. If we're stuck at 720p then we won't have an option higher than that, will we? So in that hypothetical situation we would look to other ways of increasing the visual detail of games.

 

Edit:

1 hour ago, leadeater said:

"I say what your opinion/point is so I can disprove it". I get the very strong impression that this is what you are doing. 

Still getting this feeling so I'm going to bed, this is a lost cause.

 


15 minutes ago, leadeater said:

If we're stuck at 720p then we won't have an option higher than that, will we? So in that hypothetical situation we would look to other ways of increasing the visual detail of games.

Hypothetical. Now, imagine we *do* stick to 720p. You tell me what happens. Where do we improve the image?

 

So, where is anyone not putting this extra grunt into games? Where is no one looking to increase visual detail at lower resolutions? (We already have the code and artwork, but the GPU cannot push it at 720p, 1080p, or 1440p.)

 

If it's possible, why are there not indie devs everywhere pushing amazing 720p, 600p, or 320p graphics, because "lower resolution so more AI actors", as Lottes puts it?

(See CUDA cores vs Raycasting cores. We can already choose resolution or gameplay/simulation fidelity!)

 

Night. Get a good sleep. Really, I was only trying to show the additional details, and how they are an upwards trend, not a downwards or cutting-out one (increasing resolution is not new, the industry has done it for decades, we kinda know the limits applied to it already). Not a disagreement. More a "you've missed a lot of detail"... you've tried to render yourself at 1080p. ❤️


9 hours ago, Carclis said:

You make it sound like ray tracing is a lot more achievable than many would believe. I certainly hope that is the case, since that would mean better visuals should be closer, but the demonstrations we have been shown would say otherwise. I guess the question is how much time the master rays portion of ray tracing typically takes to compute. If 1080p ultra will typically yield a frame every 6.7 ms, and at 1080p ultra with ray tracing enabled it becomes 20.4 ms (17.9 ms with RT set to low), then it's fair to say it must be somewhere around 10 ms or maybe a tad less. Then my next question would be about which type of ray is typically the most taxing to implement of the three types of rays.

 

Trust me, that's the simplified version, and the real headaches with hybrid ray tracing are the tracing of the ray paths and the denoising.
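For a rough sense of scale, here is a back-of-envelope on the frame times quoted above; subtracting frame times only approximately isolates the ray-tracing cost, since the RT passes overlap with the rest of the frame's work.

```python
# Frame times as quoted above (BF V DXR at 1080p ultra); purely illustrative.
base_ms     = 6.7    # DXR off  (~149 fps)
rt_ultra_ms = 20.4   # DXR on   (~49 fps)
rt_low_ms   = 17.9   # DXR low  (~56 fps)

print(f"added frame time, DXR ultra: {rt_ultra_ms - base_ms:.1f} ms")  # ~13.7 ms
print(f"added frame time, DXR low:   {rt_low_ms - base_ms:.1f} ms")   # ~11.2 ms
print(f"fps, DXR off/low/ultra: {1000/base_ms:.0f} / {1000/rt_low_ms:.0f} / {1000/rt_ultra_ms:.0f}")
```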

 

@TechyBen I've pretty much lost track of what you two are arguing about at this point; there's that much back and forth on little details that I'm a bit lost as to how to respond to most of it, because I'm honestly not sure how most of it relates to the core discussion. Namely, the cost to develop (and consequently the compromises required for) a game for a 4K intended viewing resolution vs a 1440p intended viewing resolution.


1 hour ago, TechyBen said:

Hypothetical. Now, imagine we *do* stick to 720p. You tell me what happens. Where do we improve the image?

 

So, where is anyone not putting this extra grunt into games? Where is no one looking to increase visual detail at lower resolutions? (We already have the code and artwork, but the GPU cannot push it at 720p, 1080p, or 1440p.)

 

If it's possible, why are there not indie devs everywhere pushing amazing 720p, 600p, or 320p graphics, because "lower resolution so more AI actors", as Lottes puts it?

(See CUDA cores vs Raycasting cores. We can already choose resolution or gameplay/simulation fidelity!)

 

Night. Get a good sleep. Really, I was only trying to show the additional details, and how they are an upwards trend, not a downwards or cutting-out one (increasing resolution is not new, the industry has done it for decades, we kinda know the limits applied to it already). Not a disagreement. More a "you've missed a lot of detail"... you've tried to render yourself at 1080p. ❤️

Sorry, I do feel like replying to this. What I was getting at is that it is not possible to know the flow-on effects there would be on the game development industry if we were limited to 720p or 1080p or really any resolution. Are you really that confident you know what the outcome would be in a world/situation where graphics hardware, graphics APIs, render techniques and the overall development workflow evolved with this limitation?

 

It's nice that you tried to show additional details; however, they didn't have much relation to the hypothetical situation, because everything you were saying, referencing and talking about, your overall knowledge, comes from the perspective of not being limited to a set resolution. All of it is of course very important for making educated commentary on this, so don't think I'm just dismissing it/sweeping it aside.

 

It was as if I was saying it would be an amazing experience if we could fly, and then you came in and asked for proof of humans that can fly. You asked for the impossible: to know something or see something that doesn't exist. We don't live in a world where we are limited to 720p or 1080p, so I can't show you what the full effects of that would be.

 

So I can't tell you where we would improve the image, not in specific detail, because right now we are not in a world where the game development industry has chosen or been stuck at a lower resolution while still having powerful graphics cards. What I will say is that, in general, we could look at ways to improve the lighting to help with realism, if that's the art style we're after. Or we could improve the shadows: their detail and how they interact with the environment. We could make sure that more objects are at a higher level of detail, are more proportionally accurate, look more physically/structurally realistic, and have more natural animations. Use your own examples; I bet you could come up with better ones than I can (at 5 am).

 

I'm probably far less ignorant of the development process than you think I am; the thing is, it was not my intention to start a huge detailed debate about it. I actually went through this book while at university doing my degree, because I thought it might be interesting and would also help with the C# programming that was actually part of the degree. What I did learn very quickly is that I'm not a 3D artist and did not have the time to dedicate to becoming even a little bit good at it; my game development experience ended with that book and the provided assets that came with it.

[book cover image]

(I would not actually recommend this book or XNA ever)

 

Development/programming is not the career path I chose or was working towards, though, so what I did learn is largely lost in the recesses of my mind. If required, I hope I can still call upon it, but it has been more than 10 years, so I'm doubtful.

 

These I do find to be valuable sources of information on this topic, though I have no actual working experience doing any of it, nor any desire to.

https://jesshiderue4.wordpress.com/real-time-rendering-an-overview-for-artists/

https://cgcookie.com/articles/maximizing-your-unity-games-performance

 

It is still my opinion that aiming to render a game at 4K with good performance will impact graphical detail choices, both in a good way and a bad way. The bad ways are the interesting bits for me: what do we have to sacrifice to achieve 4K? Those detail improvements you mentioned that are required when going to 4K: sure, but you can still make them while staying at 1080p. Could you have made them more detailed, and not usable at 4K on current hardware?

 

You might be correct: the gap in hardware performance required to achieve a graphical detail improvement that I would call significant at 1080p might just be far too much for the GPUs we have now or in the next generation. The issue is I don't see anyone trying; I see all the big-budget games designing for 4K. Other than RTX, I do not see any concerted effort to improve game graphics that is not linked to 4K, and therefore to its performance demands and the limitations that follow from it.


17 minutes ago, CarlBar said:

I've pretty much lost track of what you two are arguing about at this point

Hell if I know either; a lot of it felt like created examples/setups and then a counter-argument to them that had no relation to the original point/hypothetical that was made.


I'm not trying to argue. I'm trying to show the art/graphics development cycle. Trying to show the poly/texture/resolution budget. :)

 

Ray tracing does not take effects away from our current generation. It *may* from future ones, as they may stop supporting older hardware that does not have ray tracing. However, I don't think they will (same as with 1080p vs 4K). I don't think they WILL drop support, textures and artwork to match.

 

Theoretically they could do anything in this hypothetical "lower resolution/no RTX" world. But the observed here and now (and decades of gaming/3D content creation) shows that additional tools add additional content, and additional resources/demands add additional content too.

 

1 hour ago, CarlBar said:

 

Trust me, that's the simplified version, and the real headaches with hybrid ray tracing are the tracing of the ray paths and the denoising.

 

@TechyBen I've pretty much lost track of what you two are arguing about at this point; there's that much back and forth on little details that I'm a bit lost as to how to respond to most of it, because I'm honestly not sure how most of it relates to the core discussion. Namely, the cost to develop (and consequently the compromises required for) a game for a 4K intended viewing resolution vs a 1440p intended viewing resolution.

 

RTX reduces the cost of development, it does not increase it. I'm saying that, in the development studio, the skew is opposite to the GPU cost for the user (for RTX). The user pays more for RTX, the developer less (as it's a render pipeline, the game engine does a lot that the artist would previously have had to do by hand, so less artist time = less cost). The costs don't scale the same way for resolution increases, shader increases, poly count, or texture count. They have different costs/benefits. They have different development time (or testing) sinks.

 

See Unity art assets as an example of quick, cheap, bulk "photorealistic" art to order. It is, however, taxing on the GPU. Ray tracing is similar: it's going to tax the GPU at any resolution. It's already being culled/upscaled and interpolated from a lower resolution.


19 minutes ago, TechyBen said:

I'm not trying to argue.

But it 100% felt like you were completely and utterly ignoring my raised point to argue yours. Like you were changing what I said to suit your point, changing the examples to suit yours. At no point in the discussion did I ever feel like you were responding to anything I was raising.

 

Do you realize how frustrating that is?

 

Sure, I still need to let you raise your points and allow you to discuss them with me, but it felt like you were not doing the same for me.


I think we are like: you say "my GPU is stopping my CPU" and I'm like "well, no, it can bottleneck or hog resources". Both of those things do "stop" the CPU, but not in the same way. In fact, they do opposites (a CPU on a wait will do nothing because the GPU is too slow, and a GPU polling the CPU too quickly will put the CPU at 100% resource allocation and hog it). You are right, I need to get better at explaining/conversing that way.
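A toy model of the "bottleneck" half of that analogy (my own sketch, hypothetical numbers): with CPU and GPU work pipelined, the frame time settles at whichever side is slower, and the other side sits idle.

```python
def frame_time_ms(cpu_ms, gpu_ms):
    # In a pipelined renderer the slower side sets the pace ("stops" the other).
    return max(cpu_ms, gpu_ms)

for cpu_ms, gpu_ms in [(5.0, 12.0), (12.0, 5.0), (8.0, 8.0)]:
    ft = frame_time_ms(cpu_ms, gpu_ms)
    bound = "GPU-bound" if gpu_ms > cpu_ms else ("CPU-bound" if cpu_ms > gpu_ms else "balanced")
    print(f"CPU {cpu_ms} ms, GPU {gpu_ms} ms -> {ft} ms/frame, {bound}, "
          f"~{abs(cpu_ms - gpu_ms)} ms idle per frame on the faster side")
```

The "hogging" case is the opposite failure: the faster side busy-waits instead of idling, so it shows 100% utilisation without doing any extra useful work.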

As said with Simpson's paradox, if I'm at the top looking down, or at the bottom looking up, I get different views. If I step back, I get the full view.

 

So, the development cycle is not hiding/holding back/preventing better graphics at a lower resolution (hint: it was AMD suggesting we play at 720p, I wonder why ;)). It is a lot of complex hardware, software, artistic, etc. limitations.

 

Look up some older games and how they scale LOD; look up some of the old art assets. You may be amazed at what code they have to enable effects and resolutions, textures and LOD packs that were disabled because the graphics cards at the time could not push the pixels at any resolution.

 

RTX likewise is way too early at any resolution, 640p or 8K. :) We can already buy two 1080 Tis = zero RTX, twice the performance, instead of one 2080 Ti = RTX for half the performance. It already scales to the hardware/software/cost/performance. What we *don't* get is greater performance and ray tracing for the same price. This will take time.

 


For example, AMD is not pushing RTX... yet has not doubled GPU performance. If RTX stopped Nvidia from improving performance, then AMD would have stepped in and gone "no RTX, but twice the performance of a 1080 Ti!". They have not, so something else is stopping Nvidia from doing greater-than-10% incremental refreshes/releases. What could that be? ;)


17 hours ago, TechyBen said:

RTX reduces the cost of development, it does not increase it. I'm saying that, in the development studio, the skew is opposite to the GPU cost for the user (for RTX). The user pays more for RTX, the developer less (as it's a render pipeline, the game engine does a lot that the artist would previously have had to do by hand, so less artist time = less cost). The costs don't scale the same way for resolution increases, shader increases, poly count, or texture count. They have different costs/benefits. They have different development time (or testing) sinks.

 

See Unity art assets as an example of quick, cheap, bulk "photorealistic" art to order. It is, however, taxing on the GPU. Ray tracing is similar: it's going to tax the GPU at any resolution. It's already being culled/upscaled and interpolated from a lower resolution.

 

Right, but this all started as an argument about the cost/benefit of implementing higher resolutions as compared to higher detail at a lower resolution. The monetary development cost of a graphical feature, whilst related to the cost/benefit of implementing it, is not directly proportional to it, as some things can be cheap but still not worth it because they provide no tangible benefit to the end-user experience, and thus to current and future profits from the good reviews and brand loyalty that encourage sales of current and future products.

 

Thus I'm genuinely not sure what any of that has to do with how this whole argument started.


3 hours ago, CarlBar said:

Right, but this all started as an argument about the cost/benefit of implementing higher resolutions as compared to higher detail at a lower resolution. The monetary development cost of a graphical feature, whilst related to the cost/benefit of implementing it, is not directly proportional to it, as some things can be cheap but still not worth it because they provide no tangible benefit to the end-user experience, and thus to current and future profits from the good reviews and brand loyalty that encourage sales of current and future products.

 

Thus I'm genuinely not sure what any of that has to do with how this whole argument started.

Well, it is a complicated issue, so fair enough for someone to try and point that out, though my fundamental point was that graphical detail and resolution are not one and the same.

 

It also does no good for me and my gaming experience if, say, the main character on screen is highly detailed and then hops into or onto a vehicle that is a simplistic model, and is thus both of a different visual quality and unrealistic looking; this I find very jarring. This applies to everything on screen. Increasing resolution won't fix that. Boiling it down to "detailed assets exist" is not at all the issue.

 

Lighting is also very important to our visual perception, but it is/can be computationally expensive, and I don't mean ray tracing. There are many different types of lighting techniques, all with their own pros and cons, so if you use the wrong one, in the sense of needlessly picking a complex one when a simple one can achieve near the same result, or add an extra light source that isn't absolutely required, then you are greatly increasing the GPU or CPU resource demand.

https://80.lv/articles/learning-lighting-for-video-games/

https://iq.intel.com/bringing-games-life-with-light/

https://unity3d.com/learn/tutorials/topics/graphics/choosing-lighting-technique
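As a very rough illustration of why "just one more light" isn't free (hypothetical numbers; real renderers differ a lot between forward, deferred and clustered lighting): in a naive forward renderer every shaded pixel pays for every light, so cost grows with lights × pixels.

```python
def shading_cost(width, height, num_lights, cost_per_light_per_px=1.0):
    # Naive forward-lighting model: every pixel shades every light.
    return width * height * num_lights * cost_per_light_per_px

base     = shading_cost(1920, 1080, num_lights=4)
plus_one = shading_cost(1920, 1080, num_lights=5)
print(f"one extra light: {plus_one / base:.2f}x the shading cost")  # 1.25x
```

Deferred or clustered techniques change those constants, which is exactly why picking the right technique for the scene matters.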

 

Or we could look at making hair look and act true to life, another extremely complicated effect to pull off convincingly.

 

So for me, I would rather that what are currently considered non-critical game assets or effects get more attention: make that tree a more complicated model, make it look more real, "waste" some resources on that sort of thing, because for me it is not a waste. If doing so would mean 4K would now be impossible on current GPUs, so be it.

 

4K resolution output is not actually a well-established thing; the hardware hasn't been around that long, nor at a generally available price point. However, there is a very big push towards it, mostly from marketing, not from a very strong technical need. If 4K monitors and TVs did not exist then no one would be talking about it, and it wouldn't be preventing graphical improvements, because it wouldn't exist.

 

Games have been getting more visually detailed at 1080p over a long period of time: the hardware has been improving to allow it, the tools have been improving to allow it, and budgets in the industry have been increasing to allow it. What I personally don't want is for this progression to be slowed down to meet the demands of 4K, when everyone is saying that 4K now, and potentially even in the next generation, is unrealistic. We are now developing techniques to get around the limitations to enable 4K, or accepting certain sacrifices, and this is in my opinion not the area I would like to see development resources go into.

