
Nvidia GeForce RTX 2080 confirmed up to 2x GTX 1080 performance. Evidence for 2080 Ti as well.

6 minutes ago, Rattenmann said:

 

And at least the Tensor core based AA is really easy to implement. It is basically a checkbox in the engine. 

It's actually not, if you want to avoid frequent image artefacts.


10 minutes ago, Rattenmann said:

For me, this is the actual kicker here.

More performance plus better graphics.

Sad that people just see the price and go into instant hate-mode, without even checking some key facts. Glad Nvidia tried to improve graphics for the first time in like forever. And from what we were shown, I am actually impressed.

 

Sure it won't be in every game right away. But should we ignore new stuff, just because it won't be everywhere out of the box? That would be quite sad!

 

And at least the Tensor core based AA is really easy to implement. It is basically a checkbox in the engine. 

Except we already know that with RTX turned on you get MAYBE similar (or SLIGHTLY higher) FPS numbers to an overclocked 1080 Ti in the same Infiltrator demo. So if you want to call that a 'kicker' or a 'win' for both higher performance and graphics at $1,200, go right ahead lol.

 

And even saying it's 'improving' graphics is slightly disingenuous, given we already have lighting tech that can recreate all these effects; it's just more expensive to do so than with RTX and these new cards.

 

It's great tech for next-next gen, but right now it's only great for Nvidia to fleece the sheep for their money.


I don't think we care about performance in normal games. We know it's going to be better; what everyone wants to know is how performance is with ray tracing.

 

Rumors suggest the 2080 Ti might not even be able to hit 60 fps at 1080p. So sure, regular performance might be phenomenal... but if even the highest-end card can only barely push a playable experience... there might be a problem.


2 minutes ago, DrMacintosh said:

I don't think we care about performance in normal games. We know it's going to be better; what everyone wants to know is how performance is with ray tracing.

 

Rumors suggest the 2080 Ti might not even be able to hit 60 fps at 1080p. So sure, regular performance might be phenomenal... but if even the highest-end card can only barely push a playable experience... there might be a problem.

Not to mention we don't know what quality settings were used in that demo. Could have been, and I would suspect, low/medium.

 

I mean, really, when we're even talking about RTX in these new cards, we're only talking about the 2080 Ti, because if that one can't push RTX in games, the lower tiers sure as shit aren't going to.


Just now, Underwrought said:

Except we already know that with RTX turned on you get MAYBE similar (or SLIGHTLY higher) FPS numbers to an overclocked 1080 Ti in the same Infiltrator demo.


Link to the information pretty please.

Or are you just posting wild guesses based on nothing?

 

I would love to see the Infiltrator demo that has RT tech in it. Epic sure does not know about it, which is funny, because they created the demo and distribute it with their Unreal Engine.

 

Look, I get the skepticism. I get that you want to be cautious. And I agree on all of that.

But your examples make you look pretty bad when you present objectively false information as fact. That helps no one, and at worst it invalidates actually valid claims by coating them in something any quick search unmasks as wrong.

 

Keep the skepticism; it is a good trait to have. But drop the hatefully presented lies, please. That, or post the source for the claims you present as facts.


Just now, Rattenmann said:

Link to the information pretty please.

Or are you just posting wild guesses based on nothing?

 

I would love to see the Infiltrator demo that has RT tech in it. Epic sure does not know about it, which is funny, because they created the demo and distribute it with their Unreal Engine.

 

Look, I get the skepticism. I get that you want to be cautious. And I agree on all of that.

But your examples make you look pretty bad when you present objectively false information as fact. That helps no one, and at worst it invalidates actually valid claims by coating them in something any quick search unmasks as wrong.

 

Keep the skepticism; it is a good trait to have. But drop the hatefully presented lies, please. That, or post the source for the claims you present as facts.

Sorry, you're right, it was just Infiltrator at 4K WITHOUT RTX shown in the keynote demo, so that's even worse.

 

 


Just now, Underwrought said:

Sorry, you're right, it was just Infiltrator at 4K WITHOUT RTX shown in the keynote demo, so that's even worse.


How is that worse?

A 2x performance jump is somehow bad? I am confused. The press confirmed a stable 85 fps on a demo that tops out at 45 fps on a balls-to-the-wall overclocked 1080 Ti (85/45 ≈ 1.9x)... am I missing something here? Usually people kinda freak out about performance jumps like this instead of calling them bad.


1 minute ago, Rattenmann said:

How is that worse?

A 2x performance jump is somehow bad? I am confused. The press confirmed a stable 85 fps on a demo that tops out at 45 fps on a balls-to-the-wall overclocked 1080 Ti (85/45 ≈ 1.9x)... am I missing something here? Usually people kinda freak out about performance jumps like this instead of calling them bad.

On top of people setting arbitrary requirements for what counts as "playable" or "acceptable".


1 minute ago, Rattenmann said:

The press confirmed a stable 85 fps on a demo that tops out at 45 fps on a balls-to-the-wall overclocked 1080 Ti (85/45 ≈ 1.9x)... am I missing something here? Usually people kinda freak out about performance jumps like this instead of calling them bad.

Check out the video @Underwrought posted. If the description is to be believed, it's 4K max settings with TAA on a 1080 Ti, holding a stable 60 fps during the part that was also shown during the keynote.

 

I've seen people post different results though, not sure what to think about it.



6 minutes ago, Lathlaer said:

Check out the video @Underwrought posted. If the description is to be believed, it's 4K max settings with TAA on a 1080 Ti, holding a stable 60 fps during the part that was also shown during the keynote.

 

I've seen people post different results though, not sure what to think about it.

Good point.

I am not certain I would trust that video, considering people on Reddit claimed they could not get past 60 fps.

It is something to keep in mind, though. We now have a single piece of evidence of a 1080 Ti being MUCH faster than every other result we have found so far. That's a pretty significant outlier, so it smells a little like a fake.

 

But there are people with 1080 Tis on this forum, no? Plenty, I would guess. Can any of you run the demo and let us know how realistic a 62 fps figure for a 1080 Ti is?

I know my 970 barely gets it to start *cough*.


Shouldn't the price/performance ratio account for the generational increase either way? We should always expect better performance for the same price; otherwise, what's the point of progress?

I don't see the 20 series as a slight revision of existing cards. The jump is quite significant and the included tech is noteworthy. Otherwise we would have an 11 series.

So yeah, performance is good, the price is a rip-off, and competition is nowhere to be seen. Strange times to be in PC hardware...


1 minute ago, Rattenmann said:

Can any of you run the demo and let us know how realistic a 62 fps figure for a 1080 Ti is?

I'm actually trying to download this thing. I want to run it on a single GPU and in SLI, just to see what to expect.

 

It's not easy to download it though xD



2 minutes ago, Lathlaer said:

I'm actually trying to download this thing. I want to run it on a single GPU and in SLI, just to see what to expect.

 

It's not easy to download it though xD

Yeah, you need the full engine first, then the project. Then you need to compile it.

Would be awesome to see your results tho.

 

I suspect the 62 fps video doesn't use any AA at all. That would explain the difference in fps.


3 hours ago, M.Yurizaki said:

But that's marketing's job: to make the product they're trying to sell sound better than it really is.

Well, I guess we shall see on release. I am just glad I reserved my 2080 Ti at Micro Center. I haven't paid anything yet, and I will pick it up on the 26th if it's actually good. If it isn't, I can just not pick it up and they will put it back on the shelves.


 

19 minutes ago, Rattenmann said:

Yeah, you need the full engine first, then the project. Then you need to compile it.

Would be awesome to see your results tho.

 

I suspect the 62 fps video doesn't use any AA at all. That would explain the difference in fps.

This makes me wonder how much performance (FPS) is lost by turning on DLAA. If DLAA uses the Tensor cores, then it may be a much cheaper option than other AA methods. Nvidia could artificially inflate numbers by turning on traditional AA on the 1080s and using the new Tensor cores to do DLAA on the 2080s.
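
To put rough numbers on that question: here's a toy frame-time calculation (the figures are my own, purely illustrative, not measurements). A fixed per-frame AA cost in milliseconds hurts more the higher the base frame rate already is, which is also why paying for traditional AA on one card but not the other would widen the apparent gap.

```python
def fps_with_aa(base_fps: float, aa_cost_ms: float) -> float:
    """FPS after adding a fixed per-frame AA cost (toy model)."""
    frame_ms = 1000.0 / base_fps            # frame time without AA
    return 1000.0 / (frame_ms + aa_cost_ms)

# Illustrative only: the same 2 ms AA pass costs ~4 fps at a 45 fps
# baseline, but ~12 fps at an 85 fps baseline.
for base in (45.0, 85.0):
    print(f"{base:.0f} fps + 2 ms AA -> {fps_with_aa(base, 2.0):.1f} fps")
```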


13 minutes ago, descendency said:

 

This makes me wonder how much performance (FPS) is lost by turning on DLAA. If DLAA uses the Tensor cores, then it may be a much cheaper option than other AA methods. Nvidia could artificially inflate numbers by turning on traditional AA on the 1080s and using the new Tensor cores to do DLAA on the 2080s.

How would that be artificially inflating? You get AA on both cards at 4K, yet the 2080 Ti gets better performance because it can use a better form of AA that has a smaller performance hit.


11 minutes ago, descendency said:

 

This makes me wonder how much performance (FPS) is lost by turning on DLAA. If DLAA uses the Tensor cores, then it may be a much cheaper option than other AA methods. Nvidia could artificially inflate numbers by turning on traditional AA on the 1080s and using the new Tensor cores to do DLAA on the 2080s.

That is precisely what it is.

It is a way to do good-looking AA without taxing the GPU's main core at all; it is offloaded to the Tensor cores, or at least that is what I gathered from reading about it. The result still has to make it into the final picture somehow, so the main core will need to do "something", but very little.
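
A minimal sketch of why that offloading matters, assuming (as the post above does) that the Tensor cores can run concurrently with the shader cores; the 0.5 ms composite cost below is a made-up placeholder for the "something" the main core still does:

```python
def frame_ms_serial(raster_ms: float, aa_ms: float) -> float:
    """AA on the same shader cores: the two costs simply add up."""
    return raster_ms + aa_ms

def frame_ms_offloaded(raster_ms: float, aa_ms: float,
                       composite_ms: float = 0.5) -> float:
    """AA overlapped on separate hardware: only the longer of the two
    paths matters, plus a small cost to merge the result into the frame."""
    return max(raster_ms, aa_ms) + composite_ms

print(frame_ms_serial(12.0, 3.0))     # 15.0 ms per frame
print(frame_ms_offloaded(12.0, 3.0))  # 12.5 ms per frame
```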


1 hour ago, laminutederire said:

They're comparing different anti-aliasing methods. That says more about the anti-aliasing methods than about the cards themselves.

Going all-in on deep learning will bite them in the ass in the long run. It's cheap, but it'll create image quality issues in the end.

That's just speculation. Who knows how effective AI-based AA will be? It's completely possible it will be better than normal AA in image quality. It could also end up being worse, but there is no way for you or me to know that at this point.


1 minute ago, Brooksie359 said:

That's just speculation. Who knows how effective AI-based AA will be? It's completely possible it will be better than normal AA in image quality. It could also end up being worse, but there is no way for you or me to know that at this point.

I doubt any form of non-AI-based AA can come close.

Non-AI AA can only extrapolate from the pixels it can see; AI can learn to "guess" what should be there, without it actually being there.

I may be oversimplifying this, but I can't see how non-AI AA can do what AI-based prediction can.
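
To make the distinction concrete: a conventional post-process filter is a fixed function of the pixels already in the frame, while a learned model brings in priors from its training data. A crude NumPy sketch of the fixed-function side (the neural side is only a comment, since the real model is Nvidia's):

```python
import numpy as np

def box_blur_aa(img: np.ndarray) -> np.ndarray:
    """Crude post-process "AA": each output pixel is the average of its
    3x3 neighbourhood. It can only redistribute information already in img."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + img.shape[0],
                          1 + dx : 1 + dx + img.shape[1]]
    return out / 9.0

# A hard vertical edge can only be softened using nearby pixel values.
frame = np.zeros((4, 8))
frame[:, 4:] = 1.0
print(box_blur_aa(frame))

# A learned filter would instead be `out = model(img)`, where `model` has
# seen many aliased/clean image pairs and can add plausible detail that
# was never in `img` at all.
```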


1 minute ago, Rattenmann said:

I doubt any form of non-AI-based AA can come close.

Non-AI AA can only extrapolate from the pixels it can see; AI can learn to "guess" what should be there, without it actually being there.

I may be oversimplifying this, but I can't see how non-AI AA can do what AI-based prediction can.

I think the idea is the opposite: they believe AI-based AA won't be able to guess as accurately as traditional AA. I think it will, but who knows at this point.


1 hour ago, Rattenmann said:

Would be awesome to see your results tho.

Bad news. 

 

1. Apparently I am too stupid to find the settings for the demo when I'm running the project. I have found some that say "Epic" in all relevant categories, but I can't really choose a resolution. It scales up to 100% (of my monitor's resolution, I'm assuming).

 

Does anyone know this well enough to tell me what to click? :D

 

2. I have an ultrawide monitor at 3440x1440, and for some reason it does not cooperate with a custom 4K resolution. As such, I can only set it via DSR to something either a bit lower than 4K (4213x1764) or much higher than 4K (4587x1920). Neither will give a good representation of what to expect at 4K.

 

BTW, all this discussion about forms of AA is interesting but kinda moot for the purposes of 4K, since most people turn AA off altogether when running at high resolution. Even at lower resolutions you will find many opinions that downsampling from a higher one (DSR) is a better form of AA than any other ;-)
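
For reference, DSR-as-AA is conceptually just supersampling: render at k times the target resolution on each axis, then average blocks of pixels back down. A minimal NumPy sketch of the downsample step (the expensive part, rendering k^2 times the pixels, is assumed to have happened already):

```python
import numpy as np

def dsr_downsample(frame: np.ndarray, k: int) -> np.ndarray:
    """Average k x k pixel blocks of a high-res frame down to the target
    resolution -- the "AA" part of DSR/SSAA. H and W must divide by k."""
    h, w = frame.shape[:2]
    return frame.reshape(h // k, k, w // k, k,
                         *frame.shape[2:]).mean(axis=(1, 3))

hi_res = np.random.rand(2160, 3840, 3)  # stand-in for a 4K render
target = dsr_downsample(hi_res, 2)      # 1080p output, 4 samples per pixel
print(target.shape)                     # (1080, 1920, 3)
```

Rendering those extra pixels is the whole cost, which is why this approach competes badly with anything that runs on otherwise-idle hardware.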



1 hour ago, Brooksie359 said:

That's just speculation. Who knows how effective AI-based AA will be? It's completely possible it will be better than normal AA in image quality. It could also end up being worse, but there is no way for you or me to know that at this point.

There's been extensive research on that and on denoising through AI. As soon as the noise structure or the edge structure is different, things start to get funky. They'll have to at least retrain their model on new data for each new engine, or maybe in some cases each game, for it not to have artefacts anymore.


3 minutes ago, laminutederire said:

There's been extensive research on that and on denoising through AI. As soon as the noise structure or the edge structure is different, things start to get funky. They'll have to at least retrain their model on new data for each new engine, or maybe in some cases each game, for it not to have artefacts anymore.

Yes, they will indeed. That is why they said they would make updates to the AI that you can download like drivers. This training is all done by supercomputers, mind you, and it will likely be able to train relatively fast.


13 minutes ago, laminutederire said:

There's been extensive research on that and on denoising through AI. As soon as the noise structure or the edge structure is different, things start to get funky. They'll have to at least retrain their model on new data for each new engine, or maybe in some cases each game, for it not to have artefacts anymore.

The idea behind my thinking is that AI networks can learn to supply information that is not actually there.

 

Traditional AA can only work with what is actually there.

An extreme example is Google's picture upscaling: you can input a 240p picture and it spits out an amazing 4K image.

That is simply impossible to do by just looking at the given input, but quite possible for a neural network.

 

On the other hand, AA use cases likely always have "something" to work with and don't have to guess from nothing. So that may be a reason.

 

I am not into this enough to know where the breakpoints are, or whether the available Tensor cores in the 2080 Ti will be enough to handle it, but judging from what people at Gamescom report, it does indeed look better than traditional AA. Yes, the basic training is done on a supercomputer at Nvidia and then downloaded via drivers, but the fine-tuning and use of those algorithms happen on the card itself.

42 minutes ago, Lathlaer said:

Does anyone know this well enough to tell me what to click? :D

 

I sadly don't, and I can't access my installation right now. I may be able to dig it up tomorrow if no one else is faster.

42 minutes ago, Lathlaer said:

BTW, all this discussion about forms of AA is interesting but kinda moot for the purposes of 4K, since most people turn AA off altogether when running at high resolution. Even at lower resolutions you will find many opinions that downsampling from a higher one (DSR) is a better form of AA than any other ;-)


If DLSS has zero performance cost because all the work is done on the Tensor cores, then it would make sense even for that 4K scenario. It is free, after all.

It may also be better than downsampling via DSR, which will surely still cost performance, no?


8 minutes ago, Rattenmann said:

It may also be better than downsampling via DSR, which will surely still cost performance, no?

Oh yeah, using DSR as a form of anti-aliasing is very taxing.

 

It all depends on the game and the AA implementation. There are titles where the general consensus is to avoid AA altogether because the graphics look blurred; many people want it OFF and compensate by using DSR.

 

So yeah, if the AA is essentially free, then there is no reason not to use it on top of everything else, but only if it doesn't do anything funky with the overall look. I think FF XV was a perfect example where people were very divided on whether TAA actually improves the visuals. Many did not like the blurring effect.


