
2080 Ti = 2 x 1080 Ti

mikedrewsmy


Look at those FPS numbers. Almost twice the output of the 1080 Ti, thanks to DLSS.

35 FPS vs 71 FPS for the scene below.

 

[Image: dlss_vs_taa.png — DLSS vs. TAA frame-rate comparison]


People keep bringing up the idea that using the Tensor cores for improved performance is "cheating" or "irrelevant", but I don't see why Nvidia shouldn't show off their new technology like this.


Deep learning, aka AI, is the way to go.

Pixar is using it to make new movies.

Smartphones are adopting it.

Look at GPU Turbo from Huawei, which is basically deep learning.
It improves frame rates dramatically.


I can't even begin to understand how it works, but quite frankly I don't care so long as it does.

I just hope it's easy for developers to implement into existing engines, although with talk of all these demos being hacked together in a few weeks, it seems it probably is.

What's even more amusing are the people complaining that RT is inefficient while also spouting that "you can just use SLI for the same price", the LEAST efficient way to get more frames.

They will be laughing on the other side of their faces if games optimised for RTX end up with WAY higher frame rates at better quality thanks to offloading lighting and anti-aliasing from the shader cores.
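To put rough numbers on that offloading idea, here's a hypothetical frame-time sketch (the millisecond figures below are invented for illustration, not measured from any game):

```python
# Hypothetical frame-time budget illustrating the offloading argument.
# The numbers are made up; real costs vary per game and scene.
shader_ms = 20.0  # per-frame shading work on the shader (CUDA) cores
aa_ms = 5.0       # anti-aliasing cost if it also runs on the shader cores

serial_ms = shader_ms + aa_ms        # everything on the shader cores
overlap_ms = max(shader_ms, aa_ms)   # AA offloaded to run concurrently

print(f"All on shader cores: {1000 / serial_ms:.0f} FPS")   # 40 FPS
print(f"AA offloaded:        {1000 / overlap_ms:.0f} FPS")  # 50 FPS
```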


Just now, Alex Atkin UK said:

I can't even begin to understand how it works, but quite frankly I don't care so long as it does.

I just hope it's easy for developers to implement into existing engines, although with talk of all these demos being hacked together in a few weeks, it seems it probably is.

DLSS is being handled by Nvidia.

Their supercomputers compute everything and send the DLSS data to you via GeForce Experience updates.
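In other words, the expensive training runs offline and only the trained network ships to the card. A minimal sketch of that train-on-server / infer-on-client pattern (toy PyTorch model and a made-up file name, not Nvidia's actual pipeline):

```python
import torch
import torch.nn as nn

# Toy upscaling network standing in for a DLSS-like model.
# This is NOT Nvidia's architecture; it only illustrates the split
# between server-side training and client-side inference.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
    nn.Conv2d(16, 3, kernel_size=3, padding=1),
)

# --- On the "supercomputer": train against high-quality reference frames ---
low_res = torch.rand(1, 3, 64, 64)      # stand-in for a rendered frame
reference = torch.rand(1, 3, 128, 128)  # stand-in for the supersampled target
optimizer = torch.optim.Adam(model.parameters())
for _ in range(10):                     # real training runs far, far longer
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(low_res), reference)
    loss.backward()
    optimizer.step()
torch.save(model.state_dict(), "dlss_weights.pt")  # "shipped" in an update

# --- On the gamer's PC: load the shipped weights, run inference only ---
model.load_state_dict(torch.load("dlss_weights.pt"))
model.eval()
with torch.no_grad():
    upscaled = model(low_res)           # per-frame cost is inference only
print(upscaled.shape)                   # torch.Size([1, 3, 128, 128])
```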


Is that what that GPU cluster they were bragging about was for?


13 minutes ago, fasauceome said:

People keep bringing up the idea that using the Tensor cores for improved performance is "cheating" or "irrelevant", but I don't see why Nvidia shouldn't show off their new technology like this.

They can show off new technology like that; however, it's like comparing a 4x4 Jeep to a normal car in terms of which one drives better off-road.
Of course it's going to be the Jeep, because it's made specifically for that.

So, by analogy, if you show a comparison built around a feature that relies on special hardware to keep performance up, between a card that has this hardware and one that doesn't, what do you think the outcome will be?

We want comparisons of performance in games that don't use ray-tracing features (i.e. all current games).


5 minutes ago, Morgan MLGman said:

They can show off new technology like that; however, it's like comparing a 4x4 Jeep to a normal car in terms of which one drives better off-road.
Of course it's going to be the Jeep, because it's made specifically for that.

So, by analogy, if you show a comparison built around a feature that relies on special hardware to keep performance up, between a card that has this hardware and one that doesn't, what do you think the outcome will be?

We want comparisons of performance in games that don't use ray-tracing features (i.e. all current games).

But the benchmark highlights DLSS, not ray tracing, Mr. Morgan Freeman sir.

And the Infiltrator benchmark, made in Unreal Engine, has been around wayyyyyyy before the GeForce RTX launch, so it wasn't made specifically for that.

It basically shows that DLSS and the Tensor cores actually improve the gaming capabilities of the card.


DLSS is kind of like checkerboard rendering on steroids. It is still taking a lower-quality image and upsampling it to 4K, hence the obviously better frame rates. It does this much better than checkerboarding, but it is not native 4K. If the 1080 Ti and 2080 Ti both ran native 4K, the difference would be much closer.

Personally I prefer native quality, but DLSS is a valid option, and in fast-paced games where speed rather than quality is preferred it most certainly has its place.
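The pixel arithmetic makes the frame-rate gap unsurprising. Assuming, purely for illustration, a 1440p internal render upscaled to 4K (the internal resolution here is an assumption, not a confirmed DLSS figure):

```python
# Shaded-pixel counts: native 4K vs. render-low-then-upscale.
# The 1440p internal resolution is assumed for illustration only.
native_4k = 3840 * 2160  # 8,294,400 pixels shaded per frame
internal  = 2560 * 1440  # 3,686,400 pixels shaded per frame

print(f"Native 4K pixels: {native_4k:,}")
print(f"Internal pixels:  {internal:,}")
print(f"Shading work:     {internal / native_4k:.0%} of native")  # 44%
```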


2 minutes ago, Colb said:

DLSS is kind of like checkerboard rendering on steroids. It is still taking a lower-quality image and upsampling it to 4K, hence the obviously better frame rates. It does this much better than checkerboarding, but it is not native 4K. If the 1080 Ti and 2080 Ti both ran native 4K, the difference would be much closer.

Personally I prefer native quality, but DLSS is a valid option, and in fast-paced games where speed rather than quality is preferred it most certainly has its place.

That's where you're basically wrong, though.

The bench above runs at native 4K quality + DLSS.

Yes, one of DLSS's abilities is to take a lower-res image and make it look better.

 


*facepalm*

 

55 minutes ago, mikedrewsmy said:


Look at those FPS numbers. Almost twice the output of the 1080 Ti, thanks to DLSS.

35 FPS vs 71 FPS for the scene below.

 

 

It's not that DLSS is making the 2080 Ti twice as powerful... it's making the 1080 Ti twice as bad.


1 minute ago, WereCat said:

*facepalm*

 

It's not that DLSS is making the 2080 Ti twice as powerful... it's making the 1080 Ti twice as bad.

:D
When I read the facepalm, my head was like "OMG, DID I SAY SOMETHING WRONG... AGAIN?!"


52 minutes ago, mikedrewsmy said:

:D
When I read the facepalm, my head was like "OMG, DID I SAY SOMETHING WRONG... AGAIN?!"

:D

 

Well, they compare TAA on the 1080 Ti to DLSS on the 2080 Ti.

Since DLSS is supposed to work with the Tensor Cores, it should be possible to run it even on the 1080 Ti, at least to some extent (likely fully), because the AI training process happens on NVIDIA servers and the results and data to work with are pushed to the GPU via driver updates.

Look at the Dota 2 OpenAI bot, for example. They have a huge server for training the AI to play Dota, but after it's trained they can run it on a single PC just fine (a PC with current hardware).

I think NVIDIA could easily make the GTX 1080 and 1080 Ti work just fine with DLSS and achieve very similar results if they were willing to provide the driver update for DLSS on those as well.

 

EDIT:

me stoopid, Pascal has no Tensor Cores, nvm


4 minutes ago, WereCat said:

:D

 

Well, they compare TAA on the 1080 Ti to DLSS on the 2080 Ti.

Since DLSS is supposed to work with the Tensor Cores, it should be possible to run it even on the 1080 Ti, at least to some extent (likely fully), because the AI training process happens on NVIDIA servers and the results and data to work with are pushed to the GPU via driver updates.

Look at the Dota 2 OpenAI bot, for example. They have a huge server for training the AI to play Dota, but after it's trained they can run it on a single PC just fine (a PC with current hardware).

I think NVIDIA could easily make the GTX 1080 and 1080 Ti work just fine with DLSS and achieve very similar results if they were willing to provide the driver update for DLSS on those as well.

Unfortunately, there are no Tensor Cores on the 1080 Ti. And I don't know quite how to explain this, but I don't think it works that way for graphics; it's a totally different kind of AI/deep learning compared to the bots.


6 minutes ago, mikedrewsmy said:

Unfortunately there's no Tensor Cores on the 1080ti. And i don't know how to explain this but I don't think in terms of graphical process, it's totally different kind of AI/Deep Learning compared to the bots.

There are Tensor Cores on a 1080 Ti.

Otherwise I would not be able to use my GPU to train FakeApp with fake images to make video face swaps.


21 minutes ago, WereCat said:

There are Tensor Cores on a 1080 Ti.

Otherwise I would not be able to use my GPU to train FakeApp with fake images to make video face swaps.

I've been trying to find articles saying there are Tensor Cores on the 1080 Ti, but I couldn't find one :(

 

 


7 minutes ago, mikedrewsmy said:

Trying to find articles that's saying there's tensor cores on 1080ti but couldn't find one :(

There are no official statements about the number of Tensor Cores on 10-series cards or their Tensor TFLOPS performance. But they are there.


3 minutes ago, WereCat said:

There are no official statements about the number of Tensor Cores on 10-series cards or their Tensor TFLOPS performance. But they are there.

As far as I know, Tensor Cores were introduced with the Volta architecture and first featured in the Tesla V100 (which released after the 1080 Ti).

FakeApp basically uses TensorFlow (Google's dataflow programming framework), which uses whatever cores the GPU has, which in your 1080 Ti means the CUDA cores.

Tensor Cores are far more efficient at deep learning than CUDA cores, and having them separate from the CUDA cores makes the card even more powerful.
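One way to see that split on your own machine, assuming PyTorch with CUDA support is installed: Tensor Cores arrived with compute capability 7.0 (Volta and newer), while Pascal cards like the 1080 Ti report 6.1.

```python
import torch

# Tensor Cores first appeared with Volta (compute capability 7.0).
# Pascal cards such as the GTX 1080 Ti report 6.1, so frameworks like
# TensorFlow or PyTorch run their math on the regular CUDA cores there.
if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability()
    name = torch.cuda.get_device_name()
    has_tensor_cores = (major, minor) >= (7, 0)
    print(f"{name}: compute capability {major}.{minor}, "
          f"Tensor Cores: {'yes' if has_tensor_cores else 'no'}")
else:
    print("No CUDA-capable GPU detected.")
```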


4 minutes ago, WereCat said:

There are no official statements about the amount of Tensor Cores on a 1000gen cards or the Tensor teraflop performance. But they are there.

No Tensor Cores on Pascal cards. That's why RTX whoops dey ass in AI stuff.


5 minutes ago, mikedrewsmy said:

As far as I know, Tensor Cores were introduced with the Volta architecture and first featured in the Tesla V100 (which released after the 1080 Ti).

I am stupid. Sorry.

You're right.

I assumed that deep learning on Pascal is done with Tensor Cores, but it's not.

So I am taking everything back :/ I'll go hide now :D


2 minutes ago, fasauceome said:

No Tensor Cores on Pascal cards. That's why RTX whoops dey ass in AI stuff.

How long has this been going on?
You've been creeping round on me.
Why you calling me baby?


Just now, WereCat said:

I am stupid. Sorry.

You're right.

I assumed that deep learning on Pascal is done with Tensor Cores, but it's not.

So I am taking everything back :/ I'll go hide now :D

Dude, no one's calling names or calling each other stupid.
We're here to share information and learn more :D

I'm trying to learn more myself.


1 minute ago, mikedrewsmy said:

Dude, no one's calling names or calling each other stupid.
We're here to share information and learn more :D

I'm trying to learn more myself.

I am calling myself names because I derped out quite a bit :D


