
David Wang From AMD Confirms That There Will Eventually Be an Answer to DirectX Raytracing

4 hours ago, mr moose said:

And yet here is AMD essentially claiming they have nothing to compete with Nvidia, while admitting RT will be a thing. My point is not an assumption: GCN is dead, and AMD have had almost a decade to prove otherwise. Instead, where are they up to? A rehash of the 580?

Regardless of why you think Nvidia are where they are, the fact is it's an observation of the past, not an assumption. Nvidia are here and now with a tech that is being taken up very fast and that is getting the industry excited.

 

Even if they don't come out till next year, it's still faster uptake than any other new tech. What's your point supposed to be?

Calling it taxing and irrelevant doesn't actually make it so. I've got to ask why it is so important to you that Nvidia not be successful, or that RTX not be what it is. You seem to be arguing against the evidence.

 

 

Because it is not technically interesting. It's old stuff turned into a gimmick, and it doesn't even run well. I've always had a beef with the tech that Nvidia pushes, mostly because they only push it as a way to destroy performance on other hardware. There's so much progress we could've had otherwise.

It is taxing, as every available demo of it shows. And it is irrelevant because it's hardly present now, and I'm not sure it will be present in the end. Isn't Final Fantasy the game that dropped support for it recently when one of their devs left? Who says it won't be the same for other studios, or just some half-assed implementation to say it's there?

But even taking all of that out of the equation, as I said earlier, I'm fundamentally against the premise of the tech they push. The premise is that AI makes it possible through denoising. The reason is that it's easy for them to implement, and that's about it. But there is a reason the AI stuff is the only relevant tech of the RTX launch: the rest of the pipeline is the subject of deeper research that Nvidia would need to do in order to make genuinely ground-breaking advances. Because for what it actually does, people have been getting fairly close to it for a few years now.

To get back on point, I don't believe in the success of RTX as it stands, because to me it is incomplete, and it is a grave oversight on Nvidia's part to think they can make it happen solely through post-processing neural networks. That's part of the equation, but maybe one third of it, I'd say. Another really important axis is to define the best cache architecture for the underlying intersection data structures, and to reduce the number of paths traced to achieve a good-enough result, through learning methods or otherwise.
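For anyone wondering what "intersection data structures" actually deal with, here's a minimal sketch of my own (not anything Nvidia has published): the ray-vs-axis-aligned-box "slab" test that bounding volume hierarchies run millions of times per frame. How those boxes get packed in memory is exactly the cache-architecture question I mean.

```python
# Minimal sketch (mine, not Nvidia's implementation): the ray/AABB "slab" test
# that bounding-volume hierarchies evaluate constantly. The memory layout of
# these boxes is where cache architecture starts to matter.
import numpy as np

def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Return True if the ray origin + t*dir (t >= 0) intersects the box."""
    t1 = (box_min - origin) * inv_dir          # entry distances per axis
    t2 = (box_max - origin) * inv_dir          # exit distances per axis
    t_near = np.max(np.minimum(t1, t2))        # latest entry over all axes
    t_far = np.min(np.maximum(t1, t2))         # earliest exit over all axes
    return t_far >= max(t_near, 0.0)

origin = np.array([0.0, 0.0, 0.0])
direction = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)
inv_dir = 1.0 / direction                      # precomputed once per ray
print(ray_aabb_hit(origin, inv_dir, np.array([2.0, 2.0, 2.0]),
                   np.array([3.0, 3.0, 3.0])))   # True: box sits on the ray
print(ray_aabb_hit(origin, inv_dir, np.array([-3.0, 2.0, 2.0]),
                   np.array([-2.0, 3.0, 3.0])))  # False: box is off to the side
```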


7 hours ago, laminutederire said:

-Snip-

That's a pretty long-winded way to say you don't like Nvidia. That's fine and all, but let's avoid turning your opinions into statements of fact just to support that dislike.

 

Maybe RT will fail, but it doesn't look like there is sufficient evidence to support such a hypothesis just yet. EDIT: and that's not even looking at all the home animators who are all over this because they use Maya and RenderMan.


7 hours ago, laminutederire said:

Because it is not technically interesting. It's old stuff turned into a gimmick, and it doesn't even run well. I've always had a beef with the tech that Nvidia pushes, mostly because they only push it as a way to destroy performance on other hardware. There's so much progress we could've had otherwise.

Maybe we should just toss everything we've done up until this point, because plenty of "game changing" GPU tech didn't really improve the performance of then-current software.

 

Imagine the FPS we could achieve if we just stuck with DX5.


20 minutes ago, M.Yurizaki said:

Imagine the FPS we could achieve if we just stuck with DX5.

Probably less than now, because DX5 wouldn't know wtf to do with all that GPU hardware.


3 hours ago, mr moose said:

That's a pretty long-winded way to say you don't like Nvidia. That's fine and all, but let's avoid turning your opinions into statements of fact just to support that dislike.

Maybe RT will fail, but it doesn't look like there is sufficient evidence to support such a hypothesis just yet. EDIT: and that's not even looking at all the home animators who are all over this because they use Maya and RenderMan.

My technical arguments are independent of my opinion of Nvidia. There is a reason why people who depend on path tracing finishing fast have published on denoising, path guiding, and other methods, and have done so in a rigorous way, checking that their methods are unbiased and so on. We don't have much to go on with Nvidia's denoisers right now, and since neural network properties are a nightmare to prove, it's just a matter of "yeah, but it works on our examples", without any study of when it actually fails, like other publishing scientists do.

The final remark most researchers end up with is that denoisers are all nice and shiny, but they are by far the easiest place to introduce bias, which is not what you want.
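To make the bias point concrete, here's a toy sketch of my own (purely illustrative, not how any shipping denoiser works): averaging many noisy but unbiased frames converges to the ground truth, while averaging the same frames after a naive blur converges to a blurred result, a bias that no amount of extra samples ever removes.

```python
# Toy illustration (mine): accumulating unbiased noisy estimates converges to
# the true signal, but accumulating the same estimates after a naive blur
# converges to a blurred, i.e. biased, result.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 256)
truth = (x > 0.5).astype(float)                    # a hard shadow edge

def blur(signal, radius=5):
    """Naive box-filter 'denoiser'."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(signal, kernel, mode="same")

frames = 10_000
raw_accum = np.zeros_like(truth)
denoised_accum = np.zeros_like(truth)
for _ in range(frames):
    noisy = truth + rng.normal(0.0, 0.5, size=truth.shape)  # unbiased estimate
    raw_accum += noisy
    denoised_accum += blur(noisy)

raw_error = np.abs(raw_accum / frames - truth).max()
denoised_error = np.abs(denoised_accum / frames - truth).max()
print(f"max error, raw average:      {raw_error:.3f}")       # shrinks toward 0
print(f"max error, denoised average: {denoised_error:.3f}")  # stuck: the edge stays blurred
```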


29 minutes ago, laminutederire said:

The final remark most researchers end up with is that denoisers are all nice and shiny, but they are by far the easiest place to introduce bias, which is not what you want.

Well, that depends. To the person playing the game, the technicality of how it's done, or whether it's the most robust method, doesn't matter if the image/result is at or above expectation. Errors or bias in an image rendered for a game don't really matter; there is no consequence from them, unlike in a medical image or an engineering design, as long as the object or effect isn't distorted beyond noticeability.

 

When it comes to real-time ray tracing for games, sacrifices have to be made, and in this scenario that means speed over accuracy.

 

I am by no means impressed by RTX, and all the actual game demos I've seen so far not only perform poorly but show very little on-screen effect at all, but I don't actually have to care too specifically about how Nvidia is doing it. If it's not the best way, then it's on someone else to do it better if they can; if they can't, you either use what is available or you don't.

 

Nvidia isn't releasing a scientific study on ray tracing or ray tracing methodology. They aren't setting the industry standard on how to do it either; they are late to the party anyway. Nvidia might get market dominance in real-time ray tracing. It might not be the best method, but that doesn't mean they won't improve it, or that someone else won't supplant them as the market leader. No one is going to care if AMD or Intel come along with a technically superior method on technically superior hardware to support that method if it does it at 1 fps.

 

TL;DR: The wrong way at 30 FPS is better than the right way at 1 FPS. We're here to play a game, not watch PowerPoint slides.


1 hour ago, leadeater said:

 

It opens up the possibility of diverging from the artistic intention, can create artefacts, and creates non-uniform experiences. A bias-free method, on the other hand, is a guarantee that nothing can go wrong in any case. That's why companies like Disney care about making things bias-free: they get a guarantee on the result. If you allow a bias, you usually can't control how much it will diverge. In that case you can play one game just fine, and when you boot up another one you get weird artefacts that developers have to counter (or murder frame rates trying to compensate, in the denoiser-only case).

 

That being said, I'm biased myself because I care about well-done algorithms :)

 

The thing is that yes, you need to trade speed off against something, of course. However, the methods we need to use are the ones that trade precision for speed while keeping perfect accuracy.

 

Good news: I'm planning on trying to do exactly that as soon as I finish my master's degree in a few months :)

 

Well, they'll most surely succeed if dropping from 60 fps to 30 fps is the price to pay for the really interesting path-traced effects :) but we're not there yet, to be honest.


23 minutes ago, laminutederire said:

That being said, I'm biased myself because I care about well-done algorithms :)

Well, I do care as well, but I know full well the hardware is not there yet, and there isn't going to be some magic that comes along to alleviate just how big that deficit is. If you want to fly to the moon you first have to develop the rocket motor, flight computer, ascent and descent modules, space suits, communications equipment, etc., all of which individually are very hard and complex problems. There is nothing wrong with picking just one problem and working on that, even when it alone will not get you to the moon.

 

23 minutes ago, laminutederire said:

The thing is that yes, you need to trade speed off against something, of course. However, the methods we need to use are the ones that trade precision for speed while keeping perfect accuracy.

If you're going down the path of finding ways to reduce the rays, then you are reducing accuracy; that's fairly inevitable. There's also a lot to be said about precision too; sometimes it can be good to be precisely inaccurate. Precision is predictability, and knowing what will happen can be better. I can be terribly inaccurate with a gun but the gun can be very precise, for example: the bullet path is well understood even if I myself can't aim for shit.

 

Knowing exactly why the denoiser is producing an inaccurate image is very helpful; that's the first step to fixing it. Getting a perfectly accurate image but not knowing why it is this time can be useless. From a developer's point of view, I think I would rather have precision over accuracy, on the proviso that accuracy can be improved in time with more development of the technology.

 

23 minutes ago, laminutederire said:

That's why companies like Disney care about making things bias-free: they get a guarantee on the result.

Disney's primary focus isn't real time though; the final product is what matters to them and they can spend vast amounts of time perfecting it. They have more than 10 ms to 50 ms to get it right.


28 minutes ago, leadeater said:

If you're going down the path of finding ways to reduce the rays, then you are reducing accuracy; that's fairly inevitable. There's also a lot to be said about precision too; sometimes it can be good to be precisely inaccurate. Precision is predictability, and knowing what will happen can be better. I can be terribly inaccurate with a gun but the gun can be very precise, for example: the bullet path is well understood even if I myself can't aim for shit.

Knowing exactly why the denoiser is producing an inaccurate image is very helpful; that's the first step to fixing it. Getting a perfectly accurate image but not knowing why it is this time can be useless. From a developer's point of view, I think I would rather have precision over accuracy, on the proviso that accuracy can be improved in time with more development of the technology.

Disney's primary focus isn't real time though; the final product is what matters to them and they can spend vast amounts of time perfecting it. They have more than 10 ms to 50 ms to get it right.

But I want to go to the moon now!

More seriously, I'm convinced that we need to find the algorithms on paper first and then build the hardware that is most appropriate for them, because otherwise you're just penalized by cache architecture shortcomings, compute unit bottlenecks, and so on.

 

The thing is that no, cutting out rays doesn't necessarily mean you lose accuracy. You can actually prove that if you cut them smartly, you still converge to the same result. It all revolves around the fact that we're using Monte Carlo estimation, so you can cut rays while keeping the expectation untouched, and that's how you keep accuracy and improve precision within a time budget.
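A minimal toy example of my own of what "cutting rays while keeping the expectation untouched" means (illustrative only, not a renderer): weight each sample by 1/pdf and the estimator stays unbiased, so a sampler that puts its rays where the integrand is large needs far fewer of them for the same precision.

```python
# Toy sketch (mine): two unbiased Monte Carlo estimators of the same integral.
# Weighting each sample by 1/pdf keeps the expectation exact, so the smarter
# sampler reaches the same answer with far fewer "rays".
import numpy as np

rng = np.random.default_rng(1)
true_value = 1.0 / 11.0              # integral of x^10 over [0, 1]

def f(x):
    return x ** 10                   # stand-in for a sharply peaked (glossy) lobe

def uniform_estimate(n):
    x = rng.random(n)                # pdf = 1 on [0, 1]
    return f(x).mean()

def importance_estimate(n):
    # Draw x from pdf p(x) = 10 x^9 via inverse CDF, then weight by f(x)/p(x),
    # which simplifies to x / 10. Same expectation, much lower variance.
    x = rng.random(n) ** (1.0 / 10.0)
    return (x / 10.0).mean()

for n in (100, 10_000):
    u_err = abs(uniform_estimate(n) - true_value)
    i_err = abs(importance_estimate(n) - true_value)
    print(f"{n:>6} samples | uniform error {u_err:.5f} | importance error {i_err:.5f}")
```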

 

Well yes, you can precisely shoot me while inaccurately aiming for the paper target, but I'd prefer you to be less precise and more accurate, because then I can know where to stand so that you never shoot me. But anyway, I have a bias against biased methods because, with the likes of photon mapping or virtual point light methods, you can get good results in most cases but you always create displeasing artefacts. Every biased method I've seen so far creates undesirable effects that can be noticed if the scene is complex enough.

For RTX we don't see those, either because there are so few games using it that, for all we know, they have a NN overfitting the hell out of it, or because it's only used for details which aren't important enough.

 

The thing is that if you have an unbiased method, you don't have anything to fix at any point. So as a lazy computer scientist I prefer that :)

Mostly because I'm quite afraid that fixing it for new games will break something else for future or older games, at which point you either need gigantic drivers to take care of all those code paths, or, more likely, you'll have to uninstall and reinstall drivers to get the one that isn't broken for the game you want to play right now, which is not ideal from a consumer standpoint.

 

Well, to be honest, in Disney's case they try to improve things so that it takes minutes instead of years to compute one image. But they do have expertise in massively cutting compute time to get perfect images. We could, however, use lower resolutions and less complex effects to cut render times and have perfect images as well.

The reason it takes so much time in their case is that they actually make ground-breaking improvements to the effects that are possible.

For instance, the light paths in snow are really natural in their movies, but those are extremely complex to compute since the light is diffused through the volume and so on. I can accept shortcomings on that, and only that, but I can't accept them on direct lighting or simple effects.


 

1 hour ago, laminutederire said:

-Snip-

Perhaps ray tracing's problem (or I guess more accurately, path tracing's) is that non-RT rendering has gotten very good, and so whatever RT adds has a subtle look to it that a lot of people probably won't readily notice. However, when I stop to look at the scenery once in a while, I am starting to get annoyed by non-RT rendering not being able to render lighting properly at times.

 

Examples:

[Screenshot] How is it physically possible for the floor to be lit up as brightly as it's shown? This isn't a one-time instance either, nor is it limited to this game.

[Screenshots] Where the view model is actually affecting the shadowing on the floor.

[Screenshots] Another example of the character model affecting lighting when it shouldn't.

[Screenshot] I'm pretty sure water doesn't cut off reflections at this angle and that swiftly.

This is the motivation for having ray/path tracing. Making this accurate using traditional rendering methods would likely be just as taxing. Now as a disclaimer, I don't constantly look for flaws, but when I stop to enjoy the graphics of the game, things like this sort of kill it for me.

 

And while one can argue that the current path tracing methods are "too slow" for real-time graphics, given how much research ray tracing has received, from at least two major hardware companies to boot, I'm willing to bet that NVIDIA is using the best methods that the brightest minds can come up with. But theory is just theory. Until it's put into practice, it's harder to figure out where the shortcomings are and how you can improve it.


2 hours ago, M.Yurizaki said:

 

-Snip-

Actually, the need for path tracing isn't that rasterization isn't good enough, or that it's too taxing to be good enough. We could have systems nearly as accurate as path tracing; it's just that to do so you need artists creating many cheats. So in practice they don't do it at all, or at best they do it whenever they can.

A good path tracing pipeline just alleviates this entirely, and you achieve those results without much hassle on the artistic side. That would drastically improve the looks of games, because it would take less game dev time to make them look pretty. Shiny objects are a good example of that; water reflections are another :)

 

Thing is, Nvidia is a bit lazy on that one, because no, they don't use the best for everything. They use state-of-the-art NNs, because that's what they've kinda been specializing their hardware in for a while now, and that's mostly it.


55 minutes ago, laminutederire said:

We could have systems nearly as accurate as path tracing; it's just that to do so you need artists creating many cheats. So in practice they don't do it at all, or at best they do it whenever they can.

At a certain point, those cheats are likely going to cost you in terms of performance as much as just doing ray tracing. 

55 minutes ago, laminutederire said:

Thing is, Nvidia is a bit lazy on that one, because no, they don't use the best for everything. They use state-of-the-art NNs, because that's what they've kinda been specializing their hardware in for a while now, and that's mostly it.

And this is one of those cheats to achieve a result. Path tracing produces a noisy image unless you sample so many rays your FPS becomes SPF. Sampling just enough and running a de-noising algorithm achieves the "good enough" result in games.

 

The quality and performance of path tracing is literally how many rays per second you can evaluate. If you can find a more efficient algorithm that produces the same results, I'm all ears.
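For reference, that cost curve can be made concrete with the standard Monte Carlo convergence rate (a generic property, nothing RTX-specific): the error of an unbiased estimate falls as 1/sqrt(N), so each halving of the noise costs roughly 4x the rays, which is exactly why "sample a little, then denoise" is so tempting for real-time budgets. A toy sketch:

```python
# Generic Monte Carlo behaviour, nothing RTX-specific: the error of an unbiased
# pixel estimate falls roughly as 1/sqrt(N), so each halving of the noise costs
# about 4x the rays.
import numpy as np

rng = np.random.default_rng(2)
true_pixel = 0.3                  # the value an infinite-sample render would give

def render_pixel(n_rays):
    # Each "ray" returns a noisy, unbiased sample of the pixel's radiance.
    samples = true_pixel + rng.normal(0.0, 0.4, size=n_rays)
    return samples.mean()

for n_rays in (1, 4, 16, 64, 256, 1024):
    trials = np.array([render_pixel(n_rays) for _ in range(2000)])
    rmse = np.sqrt(np.mean((trials - true_pixel) ** 2))
    print(f"{n_rays:>5} rays/pixel -> RMSE {rmse:.4f}")   # roughly halves every 4x rays
```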


5 hours ago, laminutederire said:

So as a lazy computer scientist I prefer that :)

I very much support lazy.


9 hours ago, laminutederire said:

My technical arguments are independent of my opinion of Nvidia.

They are also completely irrelevant to your claims about its adoption and how good it will be. Just because you don't like something doesn't mean it won't get better or won't become the dominant method. Fact of the matter is Nvidia has it out and it is being adopted, while AMD doesn't even have a performance-matching card, let alone anything that resembles RTX.

9 hours ago, laminutederire said:

There is a reason why people who depend on path tracing finishing fast have published on denoising, path guiding, and other methods, and have done so in a rigorous way, checking that their methods are unbiased and so on. We don't have much to go on with Nvidia's denoisers right now, and since neural network properties are a nightmare to prove, it's just a matter of "yeah, but it works on our examples", without any study of when it actually fails, like other publishing scientists do.

The final remark most researchers end up with is that denoisers are all nice and shiny, but they are by far the easiest place to introduce bias, which is not what you want.

 

I really hope AMD can get back in the game, but none of the technicalities you mention change the current market or the tech on offer, much less its adoption, as I have already shown.


8 hours ago, asus killer said:

According to these guys, Battlefield V is finally ready for some RTX

https://www.dsogaming.com/news/battlefield-5-day-1-patch-adds-dxr-real-time-ray-tracing-support/

It is coming, there's no real debate about it. The problem is a few AMD supporters think that if Nvidia can't produce something that is perfect out of the box, and an open standard to boot, then they are evil and failing the community.


5 hours ago, M.Yurizaki said:

At a certain point, those cheats are likely going to cost you in terms of performance as much as just doing ray tracing. 

And this is one of those cheats to achieve a result. Path tracing produces a noisy image unless you sample so many rays your FPS becomes SPF. Sampling just enough and running a de-noising algorithm achieves the "good enough" result in games.

 

The quality and performance of path tracing is literally how many rays per second you can evaluate. If you can find a more efficient algorithm that produces the same results, I'm all ears.

The thing is that their denoiser only works because it denoises marginal effects. It just won't work as-is for fully path-traced scenes, simply because the noise structures there will be inherently more complex, unless you already have a fairly noise-free image to begin with.

 

The performance of a path tracing algorithm is in how much time you need to get close enough to the ground truth. You can achieve that through brute force by evaluating more rays, or you can achieve it by being smart about it: evaluating fewer rays and spending a little more time on each of them.
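One more toy illustration of my own of "fewer rays, spent more carefully" (illustrative only, not Nvidia's pipeline): stratifying the samples instead of throwing them down purely at random reaches a given error with a fraction of the samples.

```python
# Toy sketch (mine): stratified vs. purely random sampling of the same integral.
# The stratified estimator hits a given error with far fewer samples, which is
# the "fewer rays, spent more carefully" trade-off in miniature.
import numpy as np

rng = np.random.default_rng(3)

def f(x):
    return np.sin(np.pi * x) ** 2        # smooth integrand, true integral = 0.5

def random_estimate(n):
    return f(rng.random(n)).mean()

def stratified_estimate(n):
    # One jittered sample per stratum [i/n, (i+1)/n).
    x = (np.arange(n) + rng.random(n)) / n
    return f(x).mean()

for n in (16, 64, 256):
    rand_err = np.sqrt(np.mean([(random_estimate(n) - 0.5) ** 2 for _ in range(2000)]))
    strat_err = np.sqrt(np.mean([(stratified_estimate(n) - 0.5) ** 2 for _ in range(2000)]))
    print(f"{n:>4} samples | random RMSE {rand_err:.5f} | stratified RMSE {strat_err:.5f}")
```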

4 hours ago, mr moose said:

They are also completely irrelevant to your claims about its adoption and how good it will be. Just because you don't like something doesn't mean it won't get better or won't become the dominant method. Fact of the matter is Nvidia has it out and it is being adopted, while AMD doesn't even have a performance-matching card, let alone anything that resembles RTX.

I really hope AMD can get back in the game, but none of the technicalities you mention change the current market or the tech on offer, much less its adoption, as I have already shown.

Well, let's hope it'll get better if that's the future of gaming...

My point was that, adoption or not, it won't really matter anyway, because something else will supersede it, simply because the technicalities show it's not robust enough to scale to more interesting uses of ray tracing. So until then, sure, it'll be there, just like things like HairWorks still exist and are adopted, or other gimmicky stuff like that, but most people will turn those off most of the time because it's too much of a performance hit for the gain it actually gives. And it won't be adopted as mainstream tech for a while. Case in point: the only RTX-enabled GPU out there actually being bought is the 2080 Ti, and the 2080s of the world will soon only sell because there won't be any 1080 Tis anymore.

But even then it's still only on the top 5% of cards being bought, for a handful of games in the future, so it's not like AMD or Intel have any urgency to answer it right away. And that's a good thing, because I'm sure both of them will actually push the limits further and we'll have better products anyway in the end.


On 11/13/2018 at 2:45 PM, M.Yurizaki said:

Node size doesn't mean

Something something It's not the node size.

Something something It's how you use it.

Something something innuendo.

Something something I need a girlfriend.

 

12 hours ago, leadeater said:

If you want to fly to the moon you first have to develop the rocket motor, flight computer, ascent and descent modules, space suits, communications equipment, etc., all of which individually are very hard and complex problems.

If you don't like the guy you're sending, some of those steps become optional.


So David Wang, Pootie Tang Burger Kang, Gucci Gang, Nvidia Fan, Charlamagne gonna hate anyway, doesn't matter what I say Give me Benchmark of the Day.


53 minutes ago, laminutederire said:

 

My point was that, adoption or not, it won't really matter anyway, because something else will supersede it, simply because the technicalities show it's not robust enough to scale to more interesting uses of ray tracing. So until then, sure, it'll be there, just like things like HairWorks still exist and are adopted, or other gimmicky stuff like that, but most people will turn those off most of the time because it's too much of a performance hit for the gain it actually gives. And it won't be adopted as mainstream tech for a while. Case in point: the only RTX-enabled GPU out there actually being bought is the 2080 Ti, and the 2080s of the world will soon only sell because there won't be any 1080 Tis anymore.

But even then it's still only on the top 5% of cards being bought, for a handful of games in the future, so it's not like AMD or Intel have any urgency to answer it right away. And that's a good thing, because I'm sure both of them will actually push the limits further and we'll have better products anyway in the end.

You can say the same about all top-end cards, as they really only make up a small portion of the total market. We could have said the same about FreeSync, Vulkan and OpenGL; hell, we can even apply the same argument to CUDA and OpenCL when they first became a thing. But the fact is that technology filters down; give it a few years and the majority of cards down to the XX60 will have RTX on them.


5 hours ago, mr moose said:

You can say the same about all top-end cards, as they really only make up a small portion of the total market. We could have said the same about FreeSync, Vulkan and OpenGL; hell, we can even apply the same argument to CUDA and OpenCL when they first became a thing. But the fact is that technology filters down; give it a few years and the majority of cards down to the XX60 will have RTX on them.

The problem is we don't know what sort of improvement we'll see in the tensor cores themselves. In 5 years I expect 1080p to be mostly out the window and 1440p to be the new minimum for PC gaming. So barring a major leap in the denoising algorithms, you're looking at 50 FPS max, and closer to 30-40 FPS, at 1440p with any decent implementation of ray tracing (the Battlefield implementation is mediocre at best) on the current cores. So even if the XX60 is as good as the 2080 Ti in that respect, you're not going to see a lot of PC gamers choosing to dump framerate for ray tracing, especially since AMD will have a lock on the console market. This means that all the normal rendering tricks will still be in full use.


1 minute ago, ravenshrike said:

The problem is we don't know what sort of improvement we'll see in the tensor cores themselves.

That's not a problem, that's my point. We don't know what sort of improvements will come, let alone how hard or easy it will be to improve. All we have is very little data on the very first release of such tech.

 

 

1 minute ago, ravenshrike said:

In 5 years I expect 1080p to be mostly out the window and 1440p to be the new minimum for PC gaming. So barring a major leap in the denoising algorithms, you're looking at 50 FPS max, and closer to 30-40 FPS, at 1440p with any decent implementation of ray tracing (the Battlefield implementation is mediocre at best) on the current cores. So even if the XX60 is as good as the 2080 Ti in that respect, you're not going to see a lot of PC gamers choosing to dump framerate for ray tracing, especially since AMD will have a lock on the console market. This means that all the normal rendering tricks will still be in full use.

My point wasn't to insinuate the XX60 would be as good as the 2080 Ti in a few years; it was that such tech will filter down to that level (as it has in the past). How usable that will be remains to be seen, but if history is anything to go by, it will be there and beneficial to some degree.

