
Nvidia GeForce RTX 2080 confirmed at up to 2x GTX 1080 performance. Evidence for the 2080 Ti as well.

1 minute ago, SolarNova said:

because if you grab an average run-of-the-mill game and run it on a 780 Ti, it looks the same as it does running on a 1080 Ti; the only difference is the FPS.

I don't see how that's hard to understand.

 

I get what you're trying to say, but there are very, very few generation-specific features that can make a game look significantly better from one generation to the next.

Ray tracing at the consumer level is brand new, though. While I will wait to see benchmarks comparing current games with and without it enabled on the new cards, I still think the people tearing their hair out over a single aspect of performance are the ones at fault here.

 

That said, I do think Nvidia is taking the piss with those prices, especially given the hyper-inflated GPU market of the last couple of years.


10 minutes ago, SolarNova said:

Performance is irrelevant when talking about price in this case.

Going by the logic that higher performance = higher price, the 1080 Ti, compared against the much slower 8800 GTX, should cost somewhere around $12,000. But it doesn't, because it's a direct replacement for that price bracket.

 

The 20 series is a new series; as such, its prices can only be compared against what each individual card replaces.

The 2080 Ti replaces the 1080 Ti; that's how the naming scheme works, and that's where Nvidia has placed it. As such you have to compare the PRICE against the 1080 Ti, the 980 Ti, the 780 Ti, the 680, the 580, and so on. The price 'should' be around $700, regardless of the performance.

 

If you don't want to go back to the 8800 GTX example, then compare the 1080 Ti vs the 980 Ti.

980 Ti MSRP: $650

980 Ti vs 1080 Ti perf: up to +50%

$650 + 50% = $975

 

The 1080 Ti is not $975.

 

The 1080 Ti MSRP is $700, the same as the 780 Ti's was.
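
To spell that arithmetic out, here is a minimal Python sketch; the MSRPs are the launch prices cited in this thread, and the +50% is the rough 980 Ti to 1080 Ti uplift used above:

TIER_MSRP = {"780 Ti": 700, "980 Ti": 650, "1080 Ti": 700}  # launch USD

def perf_scaled_price(old_price, perf_gain):
    # What a new card 'would' cost if price scaled with performance.
    return old_price * (1 + perf_gain)

print(perf_scaled_price(TIER_MSRP["980 Ti"], 0.50))  # 975.0, and yet...
print(TIER_MSRP["1080 Ti"])                          # ...the actual MSRP: 700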

 

You cannot justify a massive jump in price based on its performance vs the prior generation; that is not how it has ever worked.

 

I cannot fathom the mindset of someone trying to justify the prices Nvidia is charging.

 

Actually, it seems that the RTX 2080 Ti replaces the Titan series, hence:

RTX 2080 Ti = Titan.

RTX 2080 = GTX 1080.

RTX 2070 = GTX 1070.

 

42 minutes ago, Lathlaer said:

You are trying to apply some kind of logic to their plans and I'm trying to say that there isn't one to find.

 

Might I remind you that this is the same company that first sold people the Titan X, then slapped them in the face with a 1080 Ti that was just as fast but $400 cheaper, and then slapped them again with a Titan Xp that cost the same amount but was another 15% stronger. So you tell me: out of those three, which card didn't make sense?

 

Sure, there was some time between those launches - the Titan X premiered in August 2016 (two months AFTER the 1080, by the way), but this only proves my point. I bet those who bought the Titan X were pretty sure that they had the best Pascal had to offer, but how could they foresee what would happen in 6 months (1080 Ti) or in 7 months (Titan Xp)?

 

So tell me now: how sure are you that someone who bought the RTX 2080 Ti now has the best that Turing has to offer in this generation?

 

So... RTX 2080 Ti Ti?

CPU: i7 4790K | MB: Asus Z97-A | RAM: 32GB HyperX Fury 1866MHz | GPU: GTX 1080 Ti | PSU: Corsair AX 850 | Storage: Vertex 3, 2x Sandisk Ultra II, Velociraptor | Case: Corsair Air 540

Mouse: SteelSeries Rival | KB: Corsair K70 RGB | Headset: SteelSeries H Wireless


32 minutes ago, Angel102 said:

Makes me wonder why people only consider FPS as performance. Detail and effects matter just as much as straight-up frames per second. You can get 500 FPS in Minecraft; that doesn't mean it looks any better.

If all the settings and the resolution are the same (if not, it's just a pointless comparison), the only thing measurable is FPS. What else are you going to measure?
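
For what it's worth, even within "FPS is the metric" there is more than one number to report: average FPS and 1% lows come from the same frame-time log. A minimal sketch, with hypothetical frame times rather than measurements from any real benchmark:

frame_times_ms = [16.7, 16.9, 17.1, 33.4, 16.8, 16.6, 41.2, 16.7, 16.9, 17.0]

# Average FPS from the mean frame time.
avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))

# 1% lows: FPS over the slowest ~1% of frames (at least one frame).
slowest = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
one_pct_low_fps = 1000 / (sum(slowest) / len(slowest))

print(f"average FPS: {avg_fps:.1f}")         # ~47.8, hides the stutter
print(f"1% low FPS:  {one_pct_low_fps:.1f}")  # ~24.3, exposes it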



3 minutes ago, asus killer said:

If all the settings and the resolution are the same (if not, it's just a pointless comparison), the only thing measurable is FPS. What else are you going to measure?

That's the entire point, though: ray tracing is not the same as what we currently have at all, just like a 1080p video on YouTube compared to a 1080p video on Floatplane.

 

I see ray tracing as being like going from 1080p to 4K; others might see it as going from 1080p to 1080p in 3D, though.


50 minutes ago, SolarNova said:

You cannot justify a massive jump in price based on its performance vs the prior generation; that is not how it has ever worked.

 

I cannot fathom the mindset of someone trying to justify the prices Nvidia is charging.

 


I agree, if and only IF what you are saying is what Nvidia actually does.

The difference in hardware between the 2080 and the 2080 Ti is bigger than the difference between the 1080 and the 1080 Ti.

 

We will likely see a bigger jump in performance as well, and we are seeing a larger jump in price as well.

 

It is all just guessing at this point, but try to ignore the naming of the cards for a second; it makes a lot more sense that way. You are insisting on prices based on names, not on hardware, and you are ignoring the possibility that Nvidia could simply have changed how its naming works (as suggested by Jay as well, btw).

You are also ignoring the fact that there is zero competition, and that Nvidia would just be pushing its own cards out of the market if it made the "usual" update to the lineup.

 

Again, I don't disagree if all your premises are true. But there is plenty of evidence that they are not, and that we are just looking at a different naming scheme / different positioning in the lineup. I doubt Nvidia wants to push the 1080s and 1080 Tis, which are still ahead of the competition, out of the market just for the sake of it. Why would they?

 

Don't get me wrong, I personally would love to see a 2080 Ti go for 800 bucks. I just don't see why Nvidia would do that in the current situation, and we can't just ignore the situation we are in.


19 minutes ago, asus killer said:

If all the settings and the resolution are the same (if not, it's just a pointless comparison), the only thing measurable is FPS. What else are you going to measure?

Well, if you go by that, every new technology automatically becomes an irrelevant side effect that should not be taken into consideration.

Improvements would grind to a halt and we would never see prettier games.

 

I get where you are coming from, but you can't really think ignoring new features is a good idea. We SOMEHOW have to include them; otherwise, what is the point?

 

If the new cards are 50% better in a 1:1 comparison AND have new tech to boot, that should be worth more than being 50% better without any new tech, right?


5 minutes ago, Rattenmann said:

Well, if you go by that, every new technology automatically becomes an irrelevant side effect that should not be taken into consideration.

Improvements would grind to a halt and we would never see prettier games.

 

I get where you are coming from, but you can't really think ignoring new features is a good idea. We SOMEHOW have to include them; otherwise, what is the point?

 

If the new cards are 50% better in a 1:1 comparison AND have new tech to boot, that should be worth more than being 50% better without any new tech, right?

People don't think performance is linked to visuals, but it is; performance is all-around, just not only in frames.

People don't think RAM prices should affect card pricing right now.

People think the prices of product tiers should always stay the same.

But people do accept it with phones and their service plans, lol.


1 hour ago, SolarNova said:

Performance is irrelevant when talking about price in this case.

Going by the logic that higher performance = higher price, the 1080 Ti, compared against the much slower 8800 GTX, should cost somewhere around $12,000. But it doesn't, because it's a direct replacement for that price bracket.

 

The 20 series is a new series; as such, its prices can only be compared against what each individual card replaces.

The 2080 Ti replaces the 1080 Ti; that's how the naming scheme works, and that's where Nvidia has placed it. As such you have to compare the PRICE against the 1080 Ti, the 980 Ti, the 780 Ti, the 680, the 580, and so on. The price 'should' be around $700, regardless of the performance.

 

If you don't want to go back to the 8800 GTX example, then compare the 1080 Ti vs the 980 Ti.

980 Ti MSRP: $650

980 Ti vs 1080 Ti perf: up to +50%

$650 + 50% = $975

 

The 1080 Ti is not $975.

 

The 1080 Ti MSRP is $700, the same as the 780 Ti's was.

 

You cannot justify a massive jump in price based on its performance vs the prior generation; that is not how it has ever worked.

 

I cannot fathom the mindset of someone trying to justify the prices Nvidia is charging.

 

The 1080 Ti die size was 471 mm²; the 2080 Ti's is 754 mm². Normally, the reason they get away with a big performance increase at about the same cost is a process shrink, which lets them pack more performance into the same die size. The 1080 Ti die is actually smaller than the 980 Ti die. With the 2080 Ti being on a newer process with lower yields than more mature ones, while also having a die almost twice as big as the 1080 Ti's, it's actually not that ridiculous that they would charge so much for it. It is likely much more expensive to produce than the 1080 Ti was, so the price increased naturally.
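
To put rough numbers on that, here is a minimal cost-per-good-die sketch. The die areas are the ones above; the wafer price and defect density are illustrative assumptions rather than published figures, and the Poisson yield formula is a textbook simplification:

import math

WAFER_DIAMETER_MM = 300
WAFER_COST_USD = 6000   # hypothetical wafer price, not an actual foundry quote
DEFECT_DENSITY = 0.001  # hypothetical defects per mm^2

def dies_per_wafer(die_area_mm2):
    # Standard gross-die approximation for square dies on a round wafer.
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2):
    # Poisson yield model: defect-free fraction = exp(-area * defect density).
    yield_frac = math.exp(-die_area_mm2 * DEFECT_DENSITY)
    return WAFER_COST_USD / (dies_per_wafer(die_area_mm2) * yield_frac)

for name, area in [("1080 Ti, 471 mm^2", 471), ("2080 Ti, 754 mm^2", 754)]:
    print(f"{name}: ~${cost_per_good_die(area):.0f} per good die")

Under these made-up inputs, the bigger die costs roughly 2.3x as much per good chip; the point is the direction of the effect, not the exact dollar figures.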


1 minute ago, Brooksie359 said:

The 1080 Ti die size was 471 mm²; the 2080 Ti's is 754 mm². Normally, the reason they get away with a big performance increase at about the same cost is a process shrink, which lets them pack more performance into the same die size. The 1080 Ti die is actually smaller than the 980 Ti die. With the 2080 Ti being on a newer process with lower yields than more mature ones, while also having a die almost twice as big as the 1080 Ti's, it's actually not that ridiculous that they would charge so much for it. It is likely much more expensive to produce than the 1080 Ti was, so the price increased naturally.

No, it's not only the supposed increase in manufacturing cost; they might well have high yields.

Top-tier card prices also supposedly weren't increasing much over the last couple of generations, even as the jumps got bigger.

The 7xx-to-9xx series was a big jump over many previous generations in overall percentage increase.

9xx to 10xx was even bigger.

And if this jump is bigger still, then along with the CUDA/RT/tensor cores and RAM prices, the price makes complete sense.

 


53 minutes ago, Angel102 said:

That's the entire point, though: ray tracing is not the same as what we currently have at all, just like a 1080p video on YouTube compared to a 1080p video on Floatplane.

 

I see ray tracing as being like going from 1080p to 4K; others might see it as going from 1080p to 1080p in 3D, though.

RT was not used; just read the OP's post.



6 minutes ago, asus killer said:

RT was not used; just read the OP's post.

I was answering a post, not the OP. The post I was replying to said that frame rates are all that matters when everything else is the same, and my point was that things weren't the same at all, given that ray tracing is a totally different way of rendering an image, even the same image from the same game. Anyone going purely off FPS comparisons is looking at it incorrectly.


57 minutes ago, Angel102 said:

That's the entire point, though: ray tracing is not the same as what we currently have at all, just like a 1080p video on YouTube compared to a 1080p video on Floatplane.

 

I see ray tracing as being like going from 1080p to 4K; others might see it as going from 1080p to 1080p in 3D, though.

So here is the thing, though: you are assuming that ray tracing will become the new norm. It does look better than some of the current techniques, but the problem is that those techniques also do a good job of reproducing the effects ray tracing is for.

 

The next big hurdle is going to be whether AMD picks up ray tracing. If they don't, that will put ray tracing in the same category as PhysX and HairWorks. Does it look good in the games that support it? Yes, but it isn't going to be widely adopted; it will be available in a handful of "Optimized for Nvidia" games. I mean, all of those features are grouped into what is known as Nvidia GameWorks, and all in all there are like 6-8 GameWorks titles out of ALL the games out there atm.

 

Ray tracing will end up similar to that without more people jumping on board... so to say FPS doesn't matter because this has new tech, well, that is both true and not true. The real question will be how much adoption it sees.


5 minutes ago, asus killer said:

RT was not used; just read the OP's post.

I don't think he suggested that.

Pretty sure he was going for the idea that we should not just compare 1:1 while ignoring RT, but factor in the new tech somehow.

 

And I strongly agree.

Seeing the new cards improve performance is what we all kinda expected, so it is not exciting in itself.

But if we ignore the Tensor and RT cores, we have to stop right there; all we can measure is a straight 1:1 comparison.

 

It's like comparing a hybrid car to a gas car while fully ignoring the electric motor.

It will be hard to compare this generation, that is for sure. There are arguments for all variations of comparisons: straight-up 1:1; almost 1:1, but with the new AA enabled on Turing; and RT comparisons (though those would be kinda useless, as nothing but Turing can run them at any form of playable FPS). And I bet there are more possible combinations in between.
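
Spelling that comparison matrix out as a sketch (the mode labels are this thread's shorthand, and the support flags are assumptions, not any benchmark tool's settings):

from itertools import product

cards = ["GTX 1080 Ti", "RTX 2080 Ti"]
modes = ["1:1", "new AA on", "RT on"]  # the last two are Turing-only features

for card, mode in product(cards, modes):
    # Pascal simply lacks the hardware for the Turing-only modes.
    supported = card.startswith("RTX") or mode == "1:1"
    print(f"{card:12} | {mode:10} | {'comparable' if supported else 'no hardware'}")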

 

Personally, I am kinda excited to see this new generation. Finally, we get something that is not only slightly faster at the same old shit but actually brings new graphical possibilities.


1 minute ago, AngryBeaver said:

So here is the thing, though: you are assuming that ray tracing will become the new norm. It does look better than some of the current techniques, but the problem is that those techniques also do a good job of reproducing the effects ray tracing is for.

 

The next big hurdle is going to be whether AMD picks up ray tracing. If they don't, that will put ray tracing in the same category as PhysX and HairWorks. Does it look good in the games that support it? Yes, but it isn't going to be widely adopted; it will be available in a handful of "Optimized for Nvidia" games. I mean, all of those features are grouped into what is known as Nvidia GameWorks, and all in all there are like 6-8 GameWorks titles out of ALL the games out there atm.

 

Ray tracing will end up similar to that without more people jumping on board... so to say FPS doesn't matter because this has new tech, well, that is both true and not true. The real question will be how much adoption it sees.

I agree it does depend on uptake. Like 3D televisions, it could simply not catch on, or it could be like 4K, which is becoming the norm very quickly.

 

 

I didn't say FPS doesn't matter; I said it should not be the only metric used, especially right now, with first-generation consumer ray-tracing cards, very limited driver support, and games being retrofitted rather than designed for it from the ground up.


4 minutes ago, Rattenmann said:

I don't think he suggested that.

Pretty sure he was going for the idea that we should not just compare 1:1 while ignoring RT, but factor in the new tech somehow.

 

And I strongly agree.

Seeing the new cards improve performance is what we all kinda expected, so it is not exciting in itself.

But if we ignore the Tensor and RT cores, we have to stop right there; all we can measure is a straight 1:1 comparison.

 

It's like comparing a hybrid car to a gas car while fully ignoring the electric motor.

It will be hard to compare this generation, that is for sure. There are arguments for all variations of comparisons: straight-up 1:1; almost 1:1, but with the new AA enabled on Turing; and RT comparisons (though those would be kinda useless, as nothing but Turing can run them at any form of playable FPS). And I bet there are more possible combinations in between.

 

Personally, I am kinda excited to see this new generation. Finally, we get something that is not only slightly faster at the same old shit but actually brings new graphical possibilities.

That is a brilliant comparison in bold; thank you.


2 minutes ago, AngryBeaver said:

I mean, all of those features are grouped into what is known as Nvidia GameWorks, and all in all there are like 6-8 GameWorks titles out of ALL the games out there atm.


It is not GameWorks, though; it is built into DX and Unity. That is as far from Nvidia-only GameWorks as it can get.

Also, there are more than 6-8 games with RT now, so that in itself is a positive trend. ;-)

3 minutes ago, AngryBeaver said:

The next big hurdle is going to be whether AMD picks up ray tracing. If they don't, that will put ray tracing in the same category as PhysX and HairWorks.


Why would it matter if AMD picks it up?

All AMD can do is ignore it and screw over their fan base in the process.

How would the marketing go? "The new AMD Navi 64 supports the full range of DX12, apart from features x, y, ..., z..."

 

The kicker here is that it is not an Nvidia thing; Nvidia is just the first GPU manufacturer to build hardware for it.


8 minutes ago, AngryBeaver said:

So here is the thing, though: you are assuming that ray tracing will become the new norm. It does look better than some of the current techniques, but the problem is that those techniques also do a good job of reproducing the effects ray tracing is for.

 

The next big hurdle is going to be whether AMD picks up ray tracing. If they don't, that will put ray tracing in the same category as PhysX and HairWorks. Does it look good in the games that support it? Yes, but it isn't going to be widely adopted; it will be available in a handful of "Optimized for Nvidia" games. I mean, all of those features are grouped into what is known as Nvidia GameWorks, and all in all there are like 6-8 GameWorks titles out of ALL the games out there atm.

 

Ray tracing will end up similar to that without more people jumping on board... so to say FPS doesn't matter because this has new tech, well, that is both true and not true. The real question will be how much adoption it sees.

AMD has Radeon Rays, and DX12 supports it.


1 hour ago, Rattenmann said:

It is not GameWorks, though; it is built into DX and Unity. That is as far from Nvidia-only GameWorks as it can get.

Also, there are more than 6-8 games with RT now, so that in itself is a positive trend. ;-)

Why would it matter if AMD picks it up?

All AMD can do is ignore it and screw over their fan base in the process.

How would the marketing go? "The new AMD Navi 64 supports the full range of DX12, apart from features x, y, ..., z..."

 

The kicker here is that it is not an Nvidia thing; Nvidia is just the first GPU manufacturer to build hardware for it.

The reason it matters whether AMD picks it up is adoption rate. As it stands, you are correct that it depends on game developers to incorporate it, but having both big GPU companies support it will push more companies to use it. If only Nvidia supports it, then you have the extra effort of implementing the technology, but only a limited number of people who can take advantage of it. I mean, remember these cards are first gen; not many people are going to run to the RTX stuff right away, so the market share for this technology will be low. I know that one major game engine does have support for it and that it can be incorporated into products that are already out, but the desire to put in that effort will depend on market share and the potential gains from using it.

 

So why would a company invest the time to add this to a game, which could cost them a few hundred thousand more in development, when the number of users who could take advantage of it might be 5% or less of their player base? On top of that, this also assumes the people with that hardware would actually buy their game. So it comes down to whether they stand to make enough money by drawing in more of that 5% to offset the cost of adding the feature to their game.
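
As a back-of-the-envelope version of that argument (every number below is a hypothetical placeholder, not a figure from this thread or from any publisher):

extra_dev_cost = 300_000   # hypothetical cost of adding an RT path
price_per_copy = 60        # hypothetical revenue per extra copy sold
player_base = 2_000_000    # hypothetical total audience
rtx_share = 0.05           # the "5% or less" slice cited above

# Extra copies needed just to pay for the feature:
break_even_copies = extra_dev_cost / price_per_copy          # 5,000 copies
# ...as a fraction of the RTX-capable slice of the audience:
needed_share = break_even_copies / (player_base * rtx_share)

print(f"{break_even_copies:.0f} extra sales needed, "
      f"i.e. {needed_share:.1%} of the RTX-capable audience")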

 

I mean, to be realistic, it isn't THAT much better than the technology we already use to imitate it, and it seems to have a pretty nasty performance penalty as well. Then there is the fact that 90%+ of people won't have the hardware for it anyway. I think the RTX line might have a lot of value in the design workspace, though, depending on how it is used.


1 hour ago, Angel102 said:

I was answering a post, not the OP. The post I was replying to said that frame rates are all that matters when everything else is the same, and my point was that things weren't the same at all, given that ray tracing is a totally different way of rendering an image, even the same image from the same game. Anyone going purely off FPS comparisons is looking at it incorrectly.

You should use the quote function, because it seemed like an ordinary comment on the topic and not an answer to someone. It's worse when you're answering something that has really derailed from the original post.



2 hours ago, Angel102 said:

That's the entire point, though: ray tracing is not the same as what we currently have at all, just like a 1080p video on YouTube compared to a 1080p video on Floatplane.

 

I see ray tracing as being like going from 1080p to 4K; others might see it as going from 1080p to 1080p in 3D, though.

Ray tracing won't be viable in games for several years. So it's "cool new tech" that does look better, but that's not exactly what people care about when deciding whether to upgrade their GPU; they care about whether it's worth upgrading to. In 4-6 years, when RT works well in games, that performance will matter.


25 minutes ago, AngryBeaver said:

The reason it matters whether AMD picks it up is adoption rate. As it stands, you are correct that it depends on game developers to incorporate it, but having both big GPU companies support it will push more companies to use it. If only Nvidia supports it, then you have the extra effort of implementing the technology, but only a limited number of people who can take advantage of it. I mean, remember these cards are first gen; not many people are going to run to the RTX stuff right away, so the market share for this technology will be low. I know that one major game engine does have support for it and that it can be incorporated into products that are already out, but the desire to put in that effort will depend on market share and the potential gains from using it.

 

So why would a company invest the time to add this to a game, which could cost them a few hundred thousand more in development, when the number of users who could take advantage of it might be 5% or less of their player base? On top of that, this also assumes the people with that hardware would actually buy their game. So it comes down to whether they stand to make enough money by drawing in more of that 5% to offset the cost of adding the feature to their game.

 

I mean, to be realistic, it isn't THAT much better than the technology we already use to imitate it, and it seems to have a pretty nasty performance penalty as well. Then there is the fact that 90%+ of people won't have the hardware for it anyway. I think the RTX line might have a lot of value in the design workspace, though, depending on how it is used.

RT is going to be like a lot of the xAA types: it's going to be 6+ years before it's standard practice, not only because of the replacement rate on GPUs, but also because it needs to hit the mainstream parts.

 

In ray tracing's case, the only thing that actually matters is whether the technology is in Navi. If there is no dedicated hardware for it in the next console generation (starting 2020/2021), then it will be at least three GPU generations before it becomes viable to develop for.


22 minutes ago, AngryBeaver said:

The reason it matters whether AMD picks it up is adoption rate. As it stands, you are correct that it depends on game developers to incorporate it, but having both big GPU companies support it will push more companies to use it. If only Nvidia supports it, then you have the extra effort of implementing the technology, but only a limited number of people who can take advantage of it. I mean, remember these cards are first gen; not many people are going to run to the RTX stuff right away, so the market share for this technology will be low. I know that one major game engine does have support for it and that it can be incorporated into products that are already out, but the desire to put in that effort will depend on market share and the potential gains from using it.

 

So why would a company invest the time to add this to a game, which could cost them a few hundred thousand more in development, when the number of users who could take advantage of it might be 5% or less of their player base? On top of that, this also assumes the people with that hardware would actually buy their game. So it comes down to whether they stand to make enough money by drawing in more of that 5% to offset the cost of adding the feature to their game.

 

I mean, to be realistic, it isn't THAT much better than the technology we already use to imitate it, and it seems to have a pretty nasty performance penalty as well. Then there is the fact that 90%+ of people won't have the hardware for it anyway. I think the RTX line might have a lot of value in the design workspace, though, depending on how it is used.

From what I understand, ray tracing is very simple to implement compared to the older techniques used to simulate it. You simply need a light source, give objects a reflectivity, and let the ray tracing do the work. That doesn't sound like a $100k endeavour. If it's in game engines and is simple to implement, it would be dumb not to use it. I mean, game developers spend so much time trying to mimic ray tracing with very complex, hard-to-implement methods that don't look as natural or as good, and you think they wouldn't take the time to implement the real thing? I would have to disagree with you on that one. Maybe if they use an engine that doesn't have great support for it, but if they are using one that does, then they likely will.
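
As a toy illustration of that workflow, here is a from-scratch Python sketch: one sphere, one point light, Lambert-style diffuse shading, rendered as ASCII. The scene values and the reflectivity knob are made up for illustration, and this is not any engine's actual API:

import math

WIDTH, HEIGHT = 40, 20
SPHERE_CENTER, SPHERE_R = (0.0, 0.0, 3.0), 1.0
LIGHT = (2.0, 2.0, 0.0)
REFLECTIVITY = 0.9  # the per-object material knob described above

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a):
    length = math.sqrt(dot(a, a))
    return tuple(x / length for x in a)

def hit_sphere(origin, direction):
    # Solve |origin + t*direction - center|^2 = r^2 for the nearest t > 0.
    oc = sub(origin, SPHERE_CENTER)
    b = 2 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE_R ** 2
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        # One primary ray per character "pixel".
        d = norm((i / WIDTH - 0.5, 0.5 - j / HEIGHT, 1.0))
        t = hit_sphere((0.0, 0.0, 0.0), d)
        if t is None:
            row += " "
        else:
            p = tuple(t * x for x in d)                   # hit point
            n = norm(sub(p, SPHERE_CENTER))               # surface normal
            lam = max(0.0, dot(n, norm(sub(LIGHT, p))))   # Lambert term
            row += " .:-=+*#%@"[int(lam * REFLECTIVITY * 9)]
    print(row)

The shading falls out of the geometry: place the light, set the material response, trace. That is the "let the ray tracing do the work" part.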


33 minutes ago, asus killer said:

You should use the quote function, because it seemed like an ordinary comment on the topic and not an answer to someone. It's worse when you're answering something that has really derailed from the original post.

My original comment was specifically about people who only measure FPS regardless of other factors, hence why that single post had no quotes in it. Every other post I have made since has been a direct answer to someone else.


37 minutes ago, Brooksie359 said:

From what I understand, ray tracing is very simple to implement compared to the older techniques used to simulate it. You simply need a light source, give objects a reflectivity, and let the ray tracing do the work. That doesn't sound like a $100k endeavour. If it's in game engines and is simple to implement, it would be dumb not to use it. I mean, game developers spend so much time trying to mimic ray tracing with very complex, hard-to-implement methods that don't look as natural or as good, and you think they wouldn't take the time to implement the real thing? I would have to disagree with you on that one. Maybe if they use an engine that doesn't have great support for it, but if they are using one that does, then they likely will.

Many developers were pushing for this; I actually believe it could make things cheaper.

And on that note, many hate GameWorks.

Making a game now is expensive as fuck; wouldn't you want to use or push for shortcuts and reduced costs?

But by using GameWorks, a lot of time is saved, and time is money.

Build a house with a hammer? OK, fine, it does the job. Now build it with a nail gun: wow.

That is what tools are for.

And as for saying it gimps AMD: Nvidia gets gimped too.

So, on to RT:

AMD has their own.

Companies want it, so AMD needs to come up with ways to make it happen as well or better, which I think they can.

But RT isn't like GameWorks.

It's more like adaptive sync, with FreeSync and G-Sync: each vendor will have their own version, like they do now.


26 minutes ago, pas008 said:

Many developers were pushing for this; I actually believe it could make things cheaper.

And on that note, many hate GameWorks.

Making a game now is expensive as fuck; wouldn't you want to use or push for shortcuts and reduced costs?

But by using GameWorks, a lot of time is saved, and time is money.

Build a house with a hammer? OK, fine, it does the job. Now build it with a nail gun: wow.

That is what tools are for.

And as for saying it gimps AMD: Nvidia gets gimped too.

So, on to RT:

AMD has their own.

Companies want it, so AMD needs to come up with ways to make it happen as well or better, which I think they can.

But RT isn't like GameWorks.

It's more like adaptive sync, with FreeSync and G-Sync: each vendor will have their own version, like they do now.

Not forgetting all the added DLC you get in virtually every game now on top of the initial purchase cost, so companies get far more in return for half-finished games at release.

