
Nvidia GeForce RTX 2080 confirmed at up to 2x GTX 1080 performance. Evidence for the 2080 Ti as well.

20 minutes ago, Brooksie359 said:

Yes, they will indeed. That is why they said they would release updates to the AI that you can download like drivers. This training is all done by supercomputers, mind you, and it will likely be able to train itself relatively fast.

Yeah, that'll hurt when they drop driver optimization for Turing...

That, and prepare to have a huuuuge amount of storage drained by the trained models for each renderer/game. That's the issue with deep learning: it usually only performs well on one thing, and not that well for general purposes.

8 minutes ago, Rattenmann said:

 

AI is not magic, though. It uses information that does in fact exist somewhere. It's not in the image itself, but it's somewhere. That's where the potential artifacts some people have already spotted in the BFV demo mostly come from: transposing knowledge where it shouldn't be.

(Personally, I don't even use AA most of the time, but that implementation is only free as long as you don't need the tensor cores for ray tracing, and that's where you need them most, since it struggles even at 1080p right now.)


2 hours ago, Brooksie359 said:

How would that be artificially inflating? You get AA with both cards at 4K, yet the 2080 Ti gets better performance because it can use a better form of AA that has a smaller performance hit.

This presumes that there would be support for DLAA. I think Nvidia's approach makes more sense and makes their GPU more versatile. It's just not an apples-to-apples comparison.


KNEW IT!

Good Job NVIDIA!!! so excited right now :D

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


6 hours ago, Rattenmann said:

Sources:

Techradar

videocardz.com

Computerbase (German)

HardwareLuxx (German)

 

 

[Chart: Nvidia performance comparison, GTX 1080 vs. RTX 2080, with and without DLSS]

 

My take on the articles

The source articles talk about a 40-50% performance jump from 1080 to 2080.

Even without DLSS (tensor cores) and RT factored in, Turing seems to mop the floor with Pascal.

Nvidia claimed "the biggest jump in performance in any generation", and this most definitely backs that claim.

 

If we ignore the bad chart title (which I mistook for an SLI setup before thinking more about it), we can still see how big DLSS is for performance.

Basically, everyone who has a 20-series card will switch to DLSS as a no-brainer in every game that supports it.

 

Important to note:

This is all without any RT whatsoever. This is not a chart about showing how the new cards do with RT; otherwise it would be a 10x factor, not 2x.

"Prove" of this is pretty much common sense. Neither Hitman 2 or Mass Effect (or any of those older titles) would be able to just plug in RT tech into their game, just for the sake of this chart.

 

 

Some more evidence of 2x performance for the 2080 Ti

 

Keep in mind that the Epic Infiltrator demo does not include RT technology.

Many people seem to believe the 20-series cards can only be this fast with RT enabled. The demo does not have RT at all and is from 2015, a few years old.

This is an important consideration: the performance shown is before factoring in the RT madness, so RT will be a bonus ON TOP of these amazing performance figures.

 

 

My take on the performance shown on stage

I personally calculated the 2080 Ti's performance based on an estimate from their Infiltrator demo earlier and came to the conclusion that 2x performance is actually pretty likely. You can check my Google Sheet for this here --> Google Sheet, performance comparison

 

A 20% overclocked 2080 Ti would reach a 3DMark score of about 60,500 and be a 331% improvement over the 1070 (see the sheet for details and premises).

The performance per money (Euro prices in this sheet, so with tax included) would be around 48. A 1080 Ti only gets 37.1, and a Vega 64 runs around 35.1.
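To make the ratio arithmetic explicit, here is a minimal Python sketch of the performance-per-Euro calculation. The 3DMark scores and Euro prices below are assumptions back-derived to reproduce the ratios quoted above, not figures taken from the sheet:

```python
# Performance-per-money sketch: 3DMark score divided by Euro price.
# Scores and prices are illustrative assumptions chosen to reproduce
# the ~48 / 37.1 / 35.1 ratios quoted above.
cards = {
    "RTX 2080 Ti (20% OC)": {"score": 60500, "price_eur": 1259},
    "GTX 1080 Ti":          {"score": 28000, "price_eur": 755},
    "Vega 64":              {"score": 22500, "price_eur": 640},
}

for name, card in cards.items():
    ratio = card["score"] / card["price_eur"]  # 3DMark points per Euro
    print(f"{name}: {ratio:.1f} points/EUR")
```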

 

Note:

The performance comparison for the 2080 Ti was made with the 78 FPS claim from Jensen on stage. As shown above, the press actually saw a steady 85 FPS behind the curtain (85/78 ≈ 9% higher). This means an even bigger jump for the 2080 Ti, as Jensen actually lowballed its performance.

 

 

Conclusion

If these data sheets hold true, we may have a crazy upgrade right here.

Even ignoring the RT possibilities, we are looking at a 2x gain, and possibly even more for the 2080 Ti. Then we can add RT goodness on top, which makes buying decisions easier.

 

Also, this may be confirmation that the 20xx lineup is actually an EXTENSION of the 10xx lineup and not a replacement.

They simply added more to the upper end of the spectrum, and prices reflect that. With this performance, the prices may actually be fine.

As you can see in the Google Sheet linked above, an RTX 2080 Ti may even have a better performance/price ratio than the Pascal lineup, which is impressive given the price.

 

We can only wait for benchmarks and hope they bear out these slides.

They wanna shift old stock


20 minutes ago, descendency said:

This presumes that there would be support for DLAA. I think Nvidia's approach makes more sense and makes their GPU more versatile. It's just not an apples-to-apples comparison.

It is quite a new approach to doing GPUs, so an apples-to-apples comparison is hard to do.

Since the result is the same, it is "fair enough" in my eyes. Both options try to do the same thing; they both get it done, but one is more efficient.

Then again, the more efficient one uses something most cards don't have access to. But is that misleading? I would say not. The consumer should not have to care how the task is done, as long as it is being done, right?

 

We are running into the same comparison issue with all the benchmarks.

We could just ignore all Tensor- and RT-core-related features and go with that. But is that fair to the product?

On the other hand, we cannot compare Tensor and RT features with anything else, because nothing else has them.

 

I guess the ones that kind of "suffer" the most from this are the people who are into tech enough to look for a good comparison, but not deep enough into it to understand why a fair comparison is not easily doable.


Good tech, but I hate the price. I'd opt out of paying the early adopter premium.

ROG X570-F Strix AMD R9 5900X | EK Elite 360 | EVGA 3080 FTW3 Ultra | G.Skill Trident Z Neo 64gb | Samsung 980 PRO 
ROG Strix XG349C Corsair 4000 | Bose C5 | ROG Swift PG279Q

Logitech G810 Orion Sennheiser HD 518 |  Logitech 502 Hero

 


It's not DLAA, it's DLSS: a super-sampling technique. They're using AI to render an 8K image from the game's "ground truth" before it gets rasterized and rendered at a lower resolution (1080p, 1440p, 4K, etc.). It's not anti-aliasing the image (well, it probably is after the rasterization, to make it smoother and cleaner), but it allows the game to render at your native resolution while the rasterized image you're getting in each frame is rendered in higher detail before it's rasterized completely, all in real time. And the more people who use the feature and allow Nvidia to collect data, the more "perfected" it can become, as they plug the data into their own supercomputer to improve the process and project better ground-truth images and rasterized renders.

 

At least, this is what makes the most sense to me when I see how drastically it improves performance in some games. Since the game's base is rendered at such a high resolution, it's all handled on cores that aren't busy rendering the rasterized images and the game itself. The only issues I see it having are occasional artifacting mistakes and increased latency, unless the cores can all communicate with each other simultaneously.

 

I could be wrong about what they're doing; someone more knowledgeable than me could explain it better. But anti-aliasing and super sampling work in different ways, and I doubt any kind of anti-aliasing could create an image nearly free of jaggies without a filter being applied over things to blur them and cover up the mistakes.


43 minutes ago, AdmiralMeowmix said:

It's not DLAA, it's DLSS: a super-sampling technique. They're using AI to render an 8K image from the game's "ground truth" before it gets rasterized and rendered at a lower resolution (1080p, 1440p, 4K, etc.). It's not anti-aliasing the image (well, it probably is after the rasterization, to make it smoother and cleaner), but it allows the game to render at your native resolution while the rasterized image you're getting in each frame is rendered in higher detail before it's rasterized completely, all in real time. And the more people who use the feature and allow Nvidia to collect data, the more "perfected" it can become, as they plug the data into their own supercomputer to improve the process and project better ground-truth images and rasterized renders.

 

At least, this is what makes the most sense to me when I see how drastically it improves performance in some games. Since the game's base is rendered at such a high resolution, it's all handled on cores that aren't busy rendering the rasterized images and the game itself. The only issues I see it having are occasional artifacting mistakes and increased latency, unless the cores can all communicate with each other simultaneously.

 

I could be wrong about what they're doing; someone more knowledgeable than me could explain it better. But anti-aliasing and super sampling work in different ways, and I doubt any kind of anti-aliasing could create an image nearly free of jaggies without a filter being applied over things to blur them and cover up the mistakes.

From what I understand, they use the tensor cores at the end. By this I mean they render the scene to a certain point using the CUDA cores and let the tensor cores finish by guessing what should be rendered, based on AI. From my understanding, super sampling is the opposite: they render at high resolution and then use that to create an image at lower resolution.
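To make that contrast concrete, here is a minimal conceptual sketch. It is a toy model only: all names are made up, and the naive upscaler below is a crude stand-in for the trained network, nothing like Nvidia's actual pipeline. Super sampling renders above the target resolution and averages down; a DLSS-style approach renders below it and reconstructs the missing detail:

```python
import numpy as np

def ssaa(render, out_h, out_w, factor=2):
    # Classic super sampling: render ABOVE the target resolution,
    # then average each block of pixels down to the output size.
    hi = render(out_h * factor, out_w * factor)  # expensive hi-res render
    return hi.reshape(out_h, factor, out_w, factor, 3).mean(axis=(1, 3))

def dlss_like(render, out_h, out_w, model, factor=2):
    # DLSS-style: render BELOW the target resolution, then let a trained
    # model (here, a dumb placeholder) reconstruct the missing detail.
    lo = render(out_h // factor, out_w // factor)  # cheap low-res render
    return model(lo, out_h, out_w)

def naive_upscale(img, out_h, out_w):
    # Placeholder "model": nearest-neighbour upscale. The real thing is a
    # neural network trained offline against very-high-res ground truth.
    fy, fx = out_h // img.shape[0], out_w // img.shape[1]
    return img.repeat(fy, axis=0).repeat(fx, axis=1)

render = lambda h, w: np.random.rand(h, w, 3)  # stand-in renderer
frame_ssaa = ssaa(render, 1080, 1920)          # heavy render, high quality
frame_dlss = dlss_like(render, 1080, 1920, naive_upscale)  # cheap render
```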


7 hours ago, Rattenmann said:

The ray tracing is done on separate RT cores, not on tensor cores, as far as I gathered.

See here: New RT Cores in addition to tensor Cores

It does use them, though. The performance is so abysmal that they need denoising on top of the few rays they trace in real time, so that it doesn't murder performance too much.
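To illustrate why tracing only a few rays forces a denoising pass at all, here is a toy Monte Carlo sketch. Every name and number is made up for illustration, and the box filter is a crude stand-in for Nvidia's actual AI denoiser:

```python
import numpy as np

rng = np.random.default_rng(0)
H, W = 64, 64
converged = np.full((H, W), 0.5)  # the "true" image infinite rays would give

def trace(samples_per_pixel):
    # Each ray sample is an unbiased but very noisy estimate of its pixel.
    noise = rng.normal(0.0, 0.3, (samples_per_pixel, H, W))
    return (converged + noise).mean(axis=0)

def box_denoise(img, k=2):
    # Crude spatial filter standing in for the AI denoiser.
    out = np.empty_like(img)
    for y in range(H):
        for x in range(W):
            out[y, x] = img[max(0, y - k):y + k + 1,
                            max(0, x - k):x + k + 1].mean()
    return out

noisy = trace(samples_per_pixel=2)                    # real-time ray budget
print(np.abs(noisy - converged).mean())               # large error (~0.17)
print(np.abs(box_denoise(noisy) - converged).mean())  # much smaller (~0.04)
```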


If I am correct, there has been a shuffle in Nvidia's branding: the Titan name won't be used anymore and is instead replaced by the Ti card.

 

I got this from the latest JayzTwoCents video, but I was not 100% listening.

 

Anyway, seeing this huge perf gain is not that surprising, IMO. The RTX 2080 should actually be compared to the GTX 1080 Ti.

CPU: i7 4790K | MB: Asus Z97-A | RAM: 32GB HyperX Fury 1866MHz | GPU's: GTX 1080Ti | PSU: Corsair AX 850 | Storage: Vertex 3, 2x Sandisk Ultra II, Velociraptor | Case: Corsair Air 540

Mice: Steelseries Rival | KB: Corsair K70 RGB | Headset: Steelseries H wireless


8 hours ago, [EDIM] Chazz said:

They wanna shift old stock

 

In the other thread, I dared to suggest that there might be very good reasons why Nvidia were charging so much. Apart from the claims of performance (I know it's Nvidia, but the benchmarks might be true), I also pointed out that they are still selling the 10 series.

 

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


2 hours ago, IhazHedont said:

If I am correct, there has been a shuffle in Nvidia's branding: the Titan name won't be used anymore and is instead replaced by the Ti card.

Yes, it would seem that way, but are we really to believe that they won't make a Titan RTX in half a year? Something that will be about 10% stronger than the 2080 Ti and cost about $2,000?

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones


7 minutes ago, Lathlaer said:

Yes, it would seem that way, but are we really to believe that they won't make a Titan RTX in half a year? Something that will be about 10% stronger than the 2080 Ti and cost about $2,000?

What would be the point? They are already making large margins on the 2080 Ti, which was the entire point of the lineup.


2 minutes ago, Brooksie359 said:

What would be the point? They are already making large margins on the 2080 Ti, which was the entire point of the lineup.

Oh, I believe in their ability to sell people on the idea that another Titan is essential to the lineup ;-)



Just now, Lathlaer said:

Oh, I believe in their ability to sell people on the idea that another Titan is essential to the lineup ;-)

Makes no sense to do it. The 2080 Ti is already a huge GPU, and creating a Titan would be kind of meaningless. If they were going to do a Titan lineup, they would have done it at release.


Nothing seems to make much sense to me with these new cards and the way Nvidia handled the release.

First, no actual performance comparison; then, when people complain, they come up with this weird 1.5x/2x metric instead of comparing FPS. And then they compare the 1080 vs. the 2080 instead of the flagship cards, the 1080 Ti vs. the 2080 Ti.

 

Weird the way Nvidia is handling all this.



1 minute ago, Brooksie359 said:

Makes no sense to do it. The 2080 Ti is already a huge GPU, and creating a Titan would be kind of meaningless. If they were going to do a Titan lineup, they would have done it at release.

You are trying to apply some kind of logic to their plans, and I'm trying to say that there isn't any to find.

 

Might I remind you that this is the same company that first sold people the Titan X, then slapped them in the face with a 1080 Ti that was just as fast but $400 cheaper, and then slapped them again with a Titan Xp that cost the same amount but was another 15% stronger. So you tell me: out of those three, which card didn't make sense?

 

Sure, there was some time between those launches. The Titan X premiered in August 2016 (btw, two months AFTER the 1080), but this only proves my point. I bet those who bought the Titan X were pretty sure they had the best Pascal had to offer, but how could they foresee what would happen in 6 months (1080 Ti) or 7 months (Titan Xp)?

 

So tell me now: how sure are you that someone who bought the RTX 2080 Ti now has the best that Turing has to offer in this generation?

 



1 minute ago, Lathlaer said:

You are trying to apply some kind of logic to their plans, and I'm trying to say that there isn't any to find.

 

Might I remind you that this is the same company that first sold people the Titan X, then slapped them in the face with a 1080 Ti that was just as fast but $400 cheaper, and then slapped them again with a Titan Xp that cost the same amount but was another 15% stronger. So you tell me: out of those three, which card didn't make sense?

 

Sure, there was some time between those launches. The Titan X premiered in August 2016 (btw, two months AFTER the 1080), but this only proves my point. I bet those who bought the Titan X were pretty sure they had the best Pascal had to offer, but how could they foresee what would happen in 6 months (1080 Ti) or 7 months (Titan Xp)?

 

So tell me now: how sure are you that someone who bought the RTX 2080 Ti now has the best that Turing has to offer in this generation?

 

The 7nm GPUs are rumored to be coming quite soon compared to most generations, so I honestly don't think they have the time. They won't make a Titan this time around.


10 hours ago, Lethal Seraph said:

Good tech, but I hate the price. I'd opt out of paying the early adopter premium.

I'd opt out completely and only take one for free, but the JayzTwoCents video explains the reason for the pricing, which makes sense.

 

I wonder what the 2080/2080 Ti is like when used with/without a G-Sync monitor.


19 minutes ago, Brooksie359 said:

Makes no sense to do it. The 2080 Ti is already a huge GPU, and creating a Titan would be kind of meaningless. If they were going to do a Titan lineup, they would have done it at release.

Are you forgetting the Titan X and Xp? GP102s with no DP (double-precision) performance. Not really worthy of the Titan name.


Performance is irrelevant when talking about price in this case.

Going by the logic that higher performance = higher price, the 1080 Ti, compared to the much slower 8800 GTX, should cost somewhere around $12,000. But it doesn't, because it's a direct replacement for that price bracket.

 

The 20 series is a new series; as such, its prices can only be compared against what each individual card replaces.

The 2080 Ti replaces the 1080 Ti; that's how the naming scheme works, and that's where Nvidia has placed it. As such, you have to compare the PRICE against the 1080 Ti, the 980 Ti, the 780 Ti, the 680, the 580, and so on. By that logic the price 'should' be around $700, regardless of the performance.

 

If you don't want to go back to the 8800 GTX example, then compare the 1080 Ti vs. the 980 Ti:

980 Ti MSRP: $650

980 Ti vs. 1080 Ti perf: up to +50%

$650 + 50% = $975

The 1080 Ti is not $975.

The 1080 Ti's MSRP is $700, the same as the 780 Ti's was.
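The same point as quick arithmetic (using only the MSRPs and the "up to +50%" figure from above): the actual generational price bump was tiny compared to a performance-scaled price.

```python
msrp_980ti = 650          # $ MSRP, from the post above
perf_gain = 0.50          # 980 Ti -> 1080 Ti, "up to +50%"

perf_scaled = msrp_980ti * (1 + perf_gain)
print(perf_scaled)        # 975.0 -- what the 1080 Ti "should" cost if
                          # price tracked performance

msrp_1080ti = 700         # what it actually launched at
print(msrp_1080ti / msrp_980ti - 1)  # ~0.077 -> only ~8% price increase
```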

 

You cannot justify a massive jump in price based on its performance vs. the prior generation; that is not how it has ever worked.

 

I cannot fathom the mindset of someone trying to justify the price Nvidia is charging.

 

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


Makes me wonder why people only consider FPS as performance. Detail and effects matter just as much as straight-up frames per second. You can get 500 FPS in Minecraft; that doesn't mean it looks any better.


4 minutes ago, Angel102 said:

Makes me wonder why people only consider FPS as performance. Detail and effects matter just as much as straight-up frames per second. You can get 500 FPS in Minecraft; that doesn't mean it looks any better.

Because if you grab an average, run-of-the-mill game and run it on a 780 Ti, it looks the same as when it's run on a 1080 Ti; the only difference is the FPS.

I don't see how that's hard to understand.

 

I get what you're trying to say, but there are very, very few generation-specific features, ones that can only be shown on a given gen and not on all previous gens, that make a game look significantly better from one generation to another.

 

In this case, ray tracing can actually be done on older-gen cards, so frame for frame there is no difference; however, the speed (FPS) it can be done at is different. BUT the implementations shown thus far make little difference visually.


