
Nvidia reps suggest 2060 and below will not have Ray Tracing support

AlTech
4 hours ago, killcomic said:

Ray tracing? It will become super popular, just like other Nvidia-only features such as VXAO and PhysX...

 

 

Did you count how many times people said that this is NOT an Nvidia-only feature before you went on about how it was one?

 

Guess we need @LinusTech to make a video about this, so we can reference it. Kinda like "What did AMD, Microsoft, NVidia, Epic etc. do in a room a few months ago, and what does it mean for RT?"

 

1 hour ago, Humbug said:

Ray tracing is a compute function, and AMD GPUs are better at compute than Nvidia's. Also, based on what AMD has told us so far, the asynchronous compute capabilities of their cards allow them to run Radeon Rays 2.0 with less performance impact.


Sounds amazing. Now we need a product we can buy to check those AMD claims. Because, you know, they claim a lot. They claim to be superior in almost every single aspect, even when they have zero products to back it up. And while I am more than willing to hope it is true, I am getting a little annoyed at the constant claims that never show up in a product I could buy.

 

Fewer claims, less magic and hope; more products, please.

I refuse to talk down any other company's real products because AMD claims it would do better. Well, then do it, AMD.


If the non-RT performance of the GTX 2060 is in line with the RTX lineup, sign me up. $100-200 for GTX 1080 performance or even greater is kind of a good deal (when considering buying new; I'm very hesitant to buy a used GPU today just because it might have been run 24/7 at 100% load).

 

After all, ray tracing won't be that huge a thing for a couple of years, and by then we'll probably have an RT[X/R/whatever] [3000/2100/whatever] lineup with much better performance. And that is just how the market works: 100% of today's GPUs can't handle RT at all, and in a year or two the percentage of GPUs that can handle it will probably be around 5-10% if everything goes extremely well. From a marketing standpoint, developers won't push it forcefully, and even when there are games supporting it, we are probably not talking about a huge percentage of released games. It's just like jumping on the VR train before the GTX 10 lineup: it wasn't pretty, and IMHO one generation later you got better performance at a lower price, products better suited to the task, and an overall better experience; and even then there wasn't that much content for it.


2 hours ago, Tech Enthusiast said:

Did you count how many times people said that this is NOT an Nvidia-only feature before you went on about how it was one?

But Nvidia is introducing specific hardware into their cards to accelerate ray tracing. No, ray tracing is not a function Nvidia made up, but Nvidia is selling cards on the promise of games with ray tracing.

Ray tracing has not been possible in games because it is so performance-expensive. Do you think any non-accelerated cards will be able to run ray-tracing effects in games at more than 10 FPS?

Same thing with VXAO. Any card can do it, but it's stupidly expensive to do so, which is why developers simply do not use it.
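To put "performance-expensive" in perspective, here is a minimal back-of-envelope sketch. The sample and bounce counts are illustrative assumptions, not measured figures from any game or card:

```python
def rays_per_second(width, height, fps, samples_per_pixel=1, bounces=1):
    """Total rays the GPU must trace per second: one primary ray per
    sample, plus one secondary ray per bounce of each sample."""
    rays_per_frame = width * height * samples_per_pixel * (1 + bounces)
    return rays_per_frame * fps

# Even a bare-minimum budget (1 sample/pixel, 1 bounce) at 1080p/60fps
# already demands roughly a quarter of a billion rays every second.
print(f"{rays_per_second(1920, 1080, 60) / 1e6:.0f} million rays/s")
```

The cost scales linearly with samples and bounces, which is why shader-based ray tracing on non-accelerated cards collapses so quickly.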

 

2 hours ago, Thaldor said:

If the non-RT performance of the GTX 2060 is in line with the RTX lineup, sign me up. $100-200 for GTX 1080 performance or even greater is kind of a good deal (when considering buying new; I'm very hesitant to buy a used GPU today just because it might have been run 24/7 at 100% load).

 

-snip-

Unlikely. It is expected to land somewhere between the 1070 and the 1080, just as the 1060 sits somewhere between the 970 and the 980 (but no better).

"Fighting for peace is like screwing for virginity"

- George Carlin (1937-2008)


2 hours ago, Tech Enthusiast said:

-snip-

It's not exactly unwarranted for this launch to be getting this sort of backlash. I mean, how many companies besides Apple can increase the asking price by 50% in a single generation without taking a hit to their reputation? Nvidia did a very poor job of demonstrating what level of performance to expect in traditional rasterized games, and either misrepresented their ray-tracing performance in games or showed just how poor it is. At this point it's hard to tell which, but all the non-synthetic demos showed awful performance.

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


3 minutes ago, killcomic said:

But Nvidia is introducing specific hardware into their cards to accelerate ray tracing. No, ray tracing is not a function Nvidia made up, but Nvidia is selling cards on the promise of games with ray tracing.

Ray tracing has not been possible in games because it is so performance-expensive. Do you think any non-accelerated cards will be able to run ray-tracing effects in games

Same thing with VXAO. Any card can do it, but it's stupidly expensive to do so, which is why developers simply do not use it.

That is what GPUs are for: accelerating graphics. That is the reason we have GPUs in the first place.

Bashing a company for being the first to add dedicated hardware for a software spec is kinda strange, don't you think? Everyone will add it sooner or later, including AMD and Intel. Hell, I am 95% sure NVidia did this to beat Intel to it for bragging rights.


1 minute ago, Carclis said:

It's not exactly unwarranted for this launch to be getting this sort of backlash.

 

I do understand the price backlash and am not questioning that even a tiny bit.

Personally, I don't care about the price, only the new tech. But I guess that is kind of a giveaway just from reading my forum handle.

 

The stuff I mentioned is not "backlash", though. It is flat-out chanting lies that have been disproven, and nothing warrants that. People can talk about the price all they want, since that is a fair point to make. But imagine people chanting "That new BMW only has 3 wheels!!!" after seeing pictures of 4 wheels. It is just flat-out idiotic stuff to claim, not what a backlash is about.

 

1 minute ago, Carclis said:

I mean how many companies besides Apple can increase the asking price by 50% for a given generation without taking a hit to their reputation? Nvidia did a very poor job of demonstrating what level of performance to expect in traditional rasterized games and either misrepresented their raytracing performance in games or showed just how poor it is. At this point it's hard to tell which one it is but all the non-synthetics showed awful performance.

This I can agree with.

The demo was done too soon; I guess they ran out of time before Gamescom. They wanted that platform.

I wish the Enlisted demo had been on stage. It was amazing and ran smoothly at 4K.

I guess money was a factor here. Tomb Raider and BFV just sell better on stage than a game from a company barely anyone knows.

All this is no excuse though, just a possible explanation for why they did it. It does not change the fact you brought up: they did a piss-poor job of showing us what we would get from a performance point of view. They only showed us how amazing the new tech is, not how it performs. They also screwed up the games used as demos, but I guess that is mostly on the developers, not NVidia. ;-)


1 hour ago, Tech Enthusiast said:

Oh wait, there is already a video of an RT 4K@130fps game, just one page ago!

That demo honestly does not look that impressive, though; the overall asset quality is equivalent to one or two generations ago. It's a lighting demo, which is great and all, but it's a regression in overall graphics. A simple scene at 4K just isn't impressive, and we know it's simple because current high-end games don't hit 130 fps at 4K either, and these RTX cards still have to do that with a layer of ray tracing on top. Not all lighting effects and shadows are done with RTX, so the cards are doing both; there isn't a significant reduction in demand on the traditional raster pipeline.

 

If it were, say, FF15 PC at 4K with all lighting replaced by ray tracing, running at 130 fps, then I'd be impressed. Amazed, to be honest, because I doubt you could do 60 fps with two cards, let alone one.
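As a rough sanity check on the scale of that "4K at 130 fps" claim (assuming just one primary ray per pixel with no bounces counted; an illustrative lower bound, not a published figure):

```python
# Claimed demo figures
PIXELS_4K = 3840 * 2160      # 8,294,400 pixels per frame
FPS_CLAIMED = 130

# Even the simplest possible budget of 1 primary ray per pixel implies
# over a billion rays traced every second at that resolution and rate.
primary_rays_per_s = PIXELS_4K * FPS_CLAIMED
print(f"{primary_rays_per_s / 1e9:.2f} billion primary rays/s")
```

Every extra sample per pixel or bounce multiplies that figure, which is why a visually simple scene is the easy case: there is little headroom left for anything heavier.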

 

As significant as ray tracing is as a technology, most of this just feels like déjà vu from when bloom first came on the scene: ultimately an overused, poorly implemented gimmick for years, until developers stopped using it the way Michael Bay uses lens flares.


10 minutes ago, leadeater said:

As significant as ray tracing is as a technology, most of this just feels like déjà vu from when bloom first came on the scene: ultimately an overused, poorly implemented gimmick for years, until developers stopped using it the way Michael Bay uses lens flares.

 

I see your point there. And I totally don't claim RT will be a performance boost.

All I wanted to point out with that video is that RT does not cripple games down to 1080p at 30-ish FPS if it is implemented on actual RT cores (their job, duh!).

The textures you mentioned can be improved without any RT performance impact, though. Those are handled by the CUDA part of the GPU, correct? So making Enlisted look prettier would not diminish the shown RT performance at all.

 

Also, developers don't have to go all-in on RT; they can pick and choose which features they want in order to look better and still perform well.

 

I have no doubt that RT is new and will have a rough start. That is to be expected. Still, that's no reason to support blatantly false claims chanted by a torch-wielding mob. It would be amazing to see more people talking about the tech, rather than "omg, the price!" or "AMD will do this with a driver patch on a 580 too!"


2 minutes ago, Tech Enthusiast said:

-snip-

I'd still rate it as too soon; it would have made more sense with the coming node shrink than trying to do it on the existing node with huge dies. I know there needs to be a first, but I don't think now was the right time. There is zero appetite for price increases in an already inflated market, and even less for a performance decrease, so satisfying both conditions is impossible. Computer component prices over the last few years are simply not sustainable, and if Intel, Nvidia, AMD, Samsung etc. think they are, they are in for a rather rude awakening when sales crash or people start settling for less. You can have all the 7980XEs, 2990WXs and 2080 Tis you want, but if all people buy is the low-price mid-range option, you'll never push something like RTX past the 0.5% of people who own the hardware.

 

No one is going to talk about the technology while a product that was in the obtainable category slowly shifts to unobtainable. Graphics cards aren't Ferraris, Bugattis or Zondas; they are items we expect to be able to buy even at the high end. We can appreciate hyper-expensive cars because they are accepted as generally unobtainable, so we treat them more like art or engineering feats; graphics cards should never turn into that. So price will always overshadow technological improvements when the two conflict.


1 hour ago, Tech Enthusiast said:

That is what GPUs are there for. To accelerate graphics. That is the reason we have GPUs in the first place.

Bashing a company for being first to add dedicated hardware for a software spec is kinda strange, don't you think? Everyone will add that sooner or later. Including AMD and Intel. Hell, I am 95% sure NVidia did this to beat Intel to it for bragging rights.

But the problem is that ray tracing may not be implemented in games at all, as only a very small subset of users will be able to use it.

And as for everyone implementing it later: maybe, maybe not. Future GPUs may be able to brute-force it, but right now, as it is, ray tracing seems like a dead end.



1 hour ago, leadeater said:

I'd still rate it as too soon; it would have made more sense with the coming node shrink than trying to do it on the existing node with huge dies. I know there needs to be a first, but I don't think now was the right time.

 

I agree on the "too soon" part, from our perspective. But Nvidia is not dumb; they know that just as well as we do. So the question is: what information do they have that we do not? We can argue they are just silly, but is that really likely? I know most people jump to conclusions right away, and it's usually "they are dumb", "they are greedy", "they want to hurt us!", "they only do this to piss us off!"

 

Stepping back a little, how likely is any of those explanations, really?

They did not become the market leader by being silly, and they won't keep that spot if they are silly now.

I honestly think they are doing this now so Intel can't do it first. We only know that Intel is going to join the GPU market Soon(TM); NVidia likely knows a lot more of the details. Given that Intel tried RT with their last shot at GPUs, we can safely assume it will be included in their next try as well. Would it be better on 7nm? No doubt about that, not in the slightest.

 

Think about the marketing boom Intel would get if they joined the GPU market for the first time while instantly delivering better graphics than AMD and NVidia. That would be a PR disaster for both, but even more so for Nvidia. I can picture the headlines: "Intel joins the GPU market. Former market leader has nothing comparable!" Nvidia clearly does not want someone else to claim the title of the company that brought us RT.

 

NVidia has to cover that. 

Still, "too soon" may be true, but they may not have had a choice. This way they get to be first (marketing happy) while also working with developers to fine-tune their implementation, before they have to cover their bases against a MUCH bigger company with MUCH more money and R&D.

 

Not that this is relevant for the consumer; I'm not going to claim that. But I am not willing to just flat-out assume that NVidia did this without thinking about it and without serious reasons.

 

1 hour ago, leadeater said:

No one is going to talk about the technology while a product that was in the obtainable category slowly starts shifting to unobtainable, graphics cards aren't Ferrari's/Bugatti's/Zonda's they are items that we expect to be able buy even on the high end. We can appreciate the hyper expensive cars because they are accepted as generally not obtainable so treat them more like art or engineering feats, graphic cards should never turn in to that. So price will always overshadow technological improvements when they conflict with each other. 


That is the sad truth. And I hate that tbh.

It is expensive, yes. But it is also amazing new tech. It should be praised, not bashed. The extreme focus on money, in a tech community no less, just makes me sad.

Why can't people just see the good, talk about it... then see the price and decide they are not going to buy it anyway? People are acting like NVidia is forcing them to buy RTX.

 

NVidia could just as well have released nothing. They would be fine: Pascal sells like hotcakes, no competition, no pressure.

And it seems like people would be happier without new GPUs than they are with RTX. I guess I just don't understand the sentiment at all. Something optional can never be bad; at worst, it can be neutral. /shrug


If the 2080 Ti can barely do 1080p@60fps, it would be pointless to release a 2060 with RT support. Would anyone play at 720p or less in 2018 just for some shadows?

Well, no!

 



2 hours ago, Tech Enthusiast said:

I agree on the too soon part, from our perspective. But Nvidia is not dumb. They know that just as well as we do. So the question is: What information do they have, that we do not have? We can argue they are just silly, but is that really likely? I know that most people jump to conclusions right away and it usually entails "they are dumb", "they are greedy", "they want to hurt us!", "they only do this to piss us off!".

Maybe not dumb, but I definitely think they are capable of mistakes. Even earlier this year they misjudged how the public would react to the GPP and were met with quite a bit of backlash from consumers and the tech press. Now they're releasing products with obscene prices compared to their predecessors and receiving flak from the majority of people for it. In all likelihood, the decision to go with enormous GPU dies was made quite a long time ago, when Nvidia thought its reputation alone would be enough to sell the products, along with the fact that this architecture is targeted at a more professional audience.



19 minutes ago, asus killer said:

If the 2080 Ti can barely do 1080p@60fps, it would be pointless to release a 2060 with RT support. Would anyone play at 720p or less in 2018 just for some shadows?

Well, no!

 

I think that argument can go both ways, though. Which situation do you think is more plausible: somebody with $1200 to spend on a GPU playing at 1080p, or somebody with a $200 GPU playing at 720p?



All these small bits of news are so annoying -.- They should just give third parties the cards so they can benchmark them and we can judge for ourselves whether it's worth a purchase.


NVIDIA future lineup:

 

Initial release, RTX 2060 6GB: Doesn't support Ray Tracing technologies

2 months after launch, RTX 2060 3GB: Doesn't support Ray Tracing Technologies

4 months after launch, RTX 2060 8GB: Doesn't support Ray Tracing Technologies

5 months after launch, RTX 2060 8GB: Supports Ray Tracing Technologies.

 

Because NVIDIA wants to make sure their consumers know exactly what they are buying.


2 hours ago, Ryujin2003 said:

Because NVIDIA wants to make sure their consumers know exactly what they are buying.

A good milking?

 

:D


8 hours ago, Tech Enthusiast said:

I agree on the too soon part, from our perspective. But Nvidia is not dumb. They know that just as well as we do. So the question is: What information do they have, that we do not have? We can argue they are just silly, but is that really likely? I know that most people jump to conclusions right away and it usually entails "they are dumb", "they are greedy", "they want to hurt us!", "they only do this to piss us off!". 

It comes down to timing. Pascal has been around for a rather long time, and Nvidia knows people would expect a fairly decent performance uplift, both because of how long it has been and because of last time. But that is just something they cannot deliver on a node with the same density as the current generation.

 

Nvidia, for its part, is right to try to shift the narrative away from pure performance gains, because that's an expectation they cannot meet. Application-specific instruction sets and accelerated hardware pathways don't actually take much die area and deliver high performance, so they're an easy way to gain specific, easily marketed performance increases. The only issue is that ray tracing is so demanding that you still need a sizeable number of RT and Tensor cores to do it in real time with any kind of applicable usage. So we're still left with a large die, because not increasing raster performance isn't an option; that really would be too hard a sell.

 

Just business optics to me: do something earlier than would normally be undertaken, because it's the best option on the table in their view. I honestly think most people would have been happier with extending product life for another year, refining costs and supply agreements, and lowering the price.


20 minutes ago, leadeater said:

Just business optics to me: do something earlier than would normally be undertaken, because it's the best option on the table in their view. I honestly think most people would have been happier with extending product life for another year, refining costs and supply agreements, and lowering the price.

 

Sad, isn't it?

Instead of seeing this generation as the optional thing it is, people cry like babies because they got something other than what they expected.

 

Even the thought that people would be happier if they had not gotten anything at all... I mean, it obviously looks like that judging from the forums, but damn, it's so hard to understand. They could just as well ignore RTX and come back in a year for the 7nm stuff. But getting something seems to be worse than getting nothing for some people.


49 minutes ago, Tech Enthusiast said:

But getting something seems to be worse than getting nothing for some people.

Can't hate something you don't know about or that doesn't exist xD


I don't find this to be too big of a deal, really. By the time ray tracing is widely used, the GTX 2060 and under will probably be obsolete anyway and won't run the features properly. Until then it will be an upgrade for those on older platforms, but in the long term, will it really matter? I don't think so.


2 hours ago, leadeater said:

-snip-

I think they marketed the RTX cards wrong.

 

They should have marketed them as:

"30% gain over the previous generation, with the added benefit of heralding the new age of ray tracing."

Instead, they chose to put ray tracing front and center, even though the cards do not have the performance for it.


2 hours ago, Tech Enthusiast said:

Sad, isn't it?

-snip-

The best part is some people saying that we don't even need another generation, that the current cards are fast enough.



5 hours ago, Tech Enthusiast said:

Sad, isn't it?

-snip-

I think Nvidia has already been getting hate for dragging out and "milking" this generation, though admittedly less than what they're getting from this launch. That said, I think this launch is what you get when you set the price too high. Any product can be valid at the right price, but in this case the market is rejecting the product, and rightly so. After two years on the same generation, an invalid product launch is sure to hit a few nerves, especially for those who were holding out and expecting something good xD

Personally, I think this is Nvidia's move to avoid a monopoly: push a new technology that has applications for professionals and people with bundles of cash, and also cater to the consumers who would buy Titans. Take the high-margin customers and leave the scraps for AMD.


