
3DMark score for RTX 2080 Ti leaks

asim1999
1 hour ago, Void Master said:

You forgot ray tracing

Exactly what an NVIDIA rep would say >:(  the plot thickens 

When the PC is acting haunted,

who ya gonna call?

*Monotone voice*: A local computer store.

*Terrible joke, I know*

 

Link to comment
Share on other sites

Link to post
Share on other sites

2 minutes ago, Delicieuxz said:

You left out the GTX 980 Ti -> GTX 1080 Ti performance increase, which, if added to the list, would make the GTX 2080 Ti performance increase below average.

Only below average for 1 out of 5 generational jumps. Like it or not, it performs better, has new features, and costs more. If you don't like it, buy a 10-series card or wait for AMD to catch up.

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


3 minutes ago, Master Disaster said:

Honestly, this is RTG's fault... hear me out...

 

So we all know that Nvidia is now about as far ahead of RTG in the 'arms race' as Intel was ahead of AMD when we had Core vs Excavator. At this point RTG aren't even a serious competitor to anything above the upper-middle tier of cards.

 

The problem with this situation is that it's given Nvidia the time to spend billions and multiple years developing ray tracing technology. If RTG were more of a threat, you can bet your ass Jen-Hsun wouldn't have spent the last 10 years developing something no one asked for and, let's be real here, might end up flopping entirely, and right now we wouldn't be getting a product that makes very little sense.

 

Thanks RTG/AMD, thanks a lot.

As much as I like to rag on RTG, AMD's GPU division isn't anywhere near as far behind as AMD's CPU division was in the Bulldozer era. Back then, AMD was selling full Dozer parts for half of what they expected to sell them for, and it wiped them out of the server market as well. AMD chose not to make a "Big Polaris" when they could have; instead they went for the compute-focused Vega. AMD can't keep Vegas in stock, mostly due to HBM2 issues, and, also because of HBM2 issues, AMD isn't too interested in supplying much to the high-end gaming market right now.

 

The RT cores, like the Tensor cores, make sense for GPUs in the compute space. In the gaming space? Eh, only once Nvidia brought in async compute. Now Turing looks a lot like GCN with some CPU-core concepts brought in.

 

A Turing-like generation had to come from Nvidia, as they had to move in certain directions to keep unified GPU designs. They just want everyone to pay them handsomely for the privilege of technology AMD brought out several years ago.


Let me sum up my question with the following; I am sure you get my drift.

 

The 1080 Ti only has about 50% more performance than the 1070, but costs twice as much.

Why would anyone ever consider buying a 1080 Ti? It makes no sense!


15 minutes ago, Princess Cadence said:

"You're not paying for more performance, you're paying for Ray Tracing"

 

Well, if it comes down to this, all you've got to do is ask yourself whether you want to play at native 4K with a 1080 Ti, or play at 1080p with fancy ray tracing on plus "DLSS", which is still 1080p, just the newest fancy anti-aliasing. Keep in mind that even the 2080 Ti seems to be struggling with ray tracing features.

 

I do want ray tracing and the rest to become the norm, but that will take another 3 years at least, and by then we'll be flooded with 7nm GPUs and have even Intel in the game.

It's very unlikely that ray tracing will become the norm. Let's look at some of the big advances and new features in gaming over the past few years....

 

3D > Failed miserably

DX12 > Still not the norm

VR > Less than 1% of Steam users run VR in any capacity

Vulkan > Only id Software seems to care

Nvidia Ansel > I'll be surprised if many of you even know what Ansel is

 

It's very simple: new tech can only succeed if developers start using it, and given just how niche RTX will be, I will be incredibly surprised if any more than 10 to 15 games ever release with actual support. By that I mean support built in from the ground up, not bolted on after the fact.

 

It's too expensive to be mainstream and reduces performance by a huge margin. Just like the last big revolution that has still utterly failed to be anything more than an expensive toy, it's destined to fail because it's not ready for prime time yet; just like VR, Nvidia released it too early.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


10 minutes ago, Sfekke said:

Exactly what an NVIDIA rep would say >:(  the plot thickens 

So, you would just ignore new features in your value calculations?

You realize humanity would still be using sticks and stones were it not for new "features" and advancements?


2 minutes ago, Master Disaster said:

It's very unlikely that ray tracing will become the norm. Let's look at some of the big advances and new features in gaming over the past few years....

 

3D > Failed miserably

DX12 > Still not the norm

VR > Less than 1% of Steam users run VR in any capacity

Vulkan > Only ID software seems to care

Nvidia Ansel > I'll be surprised if many of you even know what Ansel is

 

It's very simple, new tech can only succeed if developers start using it and given just how niche RTX will be I will be incredibly surprised if anymore than 10 to 15 games ever release with actual support, and by that I mean ground up built support, not bolted on after the fact.

 

It's too expensive to be mainstream and reduces performance by a huge margin, just like the last big revolution that still has utterly failed to be anything more than an expensive toy it's destined to fail because it's not ready for prime time yet, just like VR Nvidia released it too early.

I was avoiding mentioning Ansel, haha, though I think it's more for Game Publishers. They can make some really deceptive marketing with it.


@Master Disaster though, RTG is the one that gave Nvidia the space in the high-end market, such that they can roll out "useless for now" hardware. At 7nm, with double the RT cores there are now, I think ray tracing at 1080p will be doable. At 12nm, however, it's something pretty, like 8x MSAA, but mostly unusable in normal gameplay. 

 

A certain class of tech reviewer is going to declare the 2080 Ti a failure if it isn't getting 900 FPS in CS:GO on a 5+ GHz 9900K. That's going to be fun.


1 hour ago, Delicieuxz said:

So, 1.5 years after the 1080 Ti's release, people can pay 70% more than a 1080 Ti's MSRP for a performance increase of just 35% over the 1080 Ti.

 

That isn't an attractive deal at all. Even if the 2080 Ti released at the same time as the 1080 Ti, it would make no sense to purchase it. And even if the 2080 Ti had a 70% performance increase over the 1080 Ti to go with the 70% increase in cost, it still wouldn't make sense.

 

A 70% performance increase with a 35% price increase might make sense, though that would still be an extreme generational price-to-performance increase.

 

The 1080 Ti had around a 70% performance increase over the 980 Ti, and had an MSRP of just $50 more than the 980 Ti.

So this gets a little worse when we consider what Jay-Z brought up: the 2080 Ti will not be replacing the 1080 Ti, but serves as a replacement for Titan-level performance. 
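To put the quoted math in code, here's a quick sketch. The figures are the approximations from the post above plus the 980 Ti's $649 MSRP; `perf_per_dollar_change` is just an illustrative helper, not anything from 3DMark.

```python
# Sketch of the generational price-to-performance comparison quoted above.
# All figures are the poster's rough estimates, not measured benchmark data.

def perf_per_dollar_change(perf_gain, price_increase):
    """Relative change in performance-per-dollar for a new card that is
    `perf_gain` faster and `price_increase` more expensive (both fractions)."""
    return (1 + perf_gain) / (1 + price_increase) - 1

# 2080 Ti vs 1080 Ti: ~35% faster, ~70% more expensive
print(f"{perf_per_dollar_change(0.35, 0.70):+.1%}")      # -20.6% (value per dollar drops)

# 1080 Ti vs 980 Ti: ~70% faster, $50 on a $649 MSRP (~8% more expensive)
print(f"{perf_per_dollar_change(0.70, 50 / 649):+.1%}")  # +57.8% (value per dollar rises)
```

Same framework either way: the 1080 Ti generation improved value per dollar sharply, while on these leaked numbers the 2080 Ti regresses it.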


41 minutes ago, Master Disaster said:

It's very unlikely that ray tracing will become the norm. Let's look at some of the big advances and new features in gaming over the past few years....

 

3D > Failed miserably

DX12 > Still not the norm

VR > Less than 1% of Steam users run VR in any capacity

Vulkan > Only ID software seems to care

Nvidia Ansel > I'll be surprised if many of you even know what Ansel is

And look at other advances over the years that didn't really impress at first:

  • 3D Accelerators: S3 ViRGE wasn't very convincing at "accelerating"
  • Hardware Transform and Lighting: A fast enough CPU could make up for the lack of it
  • Programmable shaders: Didn't really show any "wow" factor at first
  • Unified shaders: DX10 required it, didn't really show any "wow" factor either
  • DX11: Tessellation was really the only "wow" feature, but it sort of played to our expectations of improving polygonal detail anyway.

The other thing is that the game development industry has really wanted ray tracing because of its potential to improve rendering quality, and the API is at least standardized, meaning AMD can take advantage of this. I'm not sure if it's possible to backport DXR to D3D 11.3, but D3D 12 and D3D 11.3 have the same render feature set, meaning you should be able to achieve the same image quality results on either API. However, if Microsoft backports DXR to D3D 11, then it will become accessible to a lot more developers.

 

EDIT: And I have no reason to believe D3D12 will completely overtake D3D11. For all intents and purposes, you can achieve the same level of image quality using either. The only difference with D3D12 is if you're CPU-bound and need to extract more performance out of the game by multithreading the GPU job batching. If you don't need that, there's really no reason to use D3D12 over D3D11.


and the haters gonna hate, hate, hate, hate, hate

i9-9900k @ 5.1GHz || EVGA 3080 ti FTW3 EK Cooled || EVGA z390 Dark || G.Skill TridentZ 32gb 4000MHz C16

 970 Pro 1tb || 860 Evo 2tb || BeQuiet Dark Base Pro 900 || EVGA P2 1200w || AOC Agon AG352UCG

Cooled by: Heatkiller || Hardware Labs || Bitspower || Noctua || EKWB


2 minutes ago, M.Yurizaki said:

And look at other advances over the years that didn't really impress at first:

  • 3D Accelerators: S3 ViRGE wasn't very convincing at "accelerating"
  • Hardware Transform and Lighting: A fast enough CPU could make up for the lack of it
  • Programmable shaders: Didn't really show any "wow" factor at first
  • Unified shaders: DX10 required it, didn't really show any "wow" factor either
  • DX11: Tessellation was really the only "wow" feature, but it sort of played to our expectations of improving polygonal detail anyway.

The other thing is that the game development industry has been really wanting ray tracing because of its potential in improving rendering quality and the API is at least standardized meaning AMD can take advantage of this. And I'm not sure if it's possible to backport DXR to D3D 11.3, but D3D 12 and D3D 11.3 have the same render feature set, meaning you should able to achieve the same image quality results on either API. However, if Microsoft backports DXR to D3D 11, then it will become accessible to a lot more developers.

And how many of those things cost the user 25% more for less actual performance once they're enabled?

 

The reason I drew more comparisons to VR is because that's what it's closest to; it costs the user more money than the alternative, and once enabled it kills performance.

 

Ray Tracing will end up like VR & 3D, a niche feature that very few people enjoy on a regular basis. The cards aren't ready yet and the developers certainly aren't.

 

To get these things to succeed you've got to make them mainstream, and exactly how many casuals do you see gaming on their 4K TVs at 1080p whilst getting below 60fps because the game looks ever so slightly prettier?



1 hour ago, Master Disaster said:

Honestly this is RTGs fault.... Hear me out.....

 

So we all know that Nvidia is now about as far ahead of RTG in the 'arms race's as Intel was ahead of AMD when we had Core Vs Excavator. At this point RTG aren't even a serious competitor to anything above the upper middle tier of cards.

 

The problem with this situation is that it's given Nvidia the time to spend billions and multiple years developing ray tracing technology. If RTG were more of a threat, you can bet your ass Jen-Hsun wouldn't have spent the last 10 years developing something no one asked for and, let's be real here, might end up flopping entirely, and right now we wouldn't be getting a product that makes very little sense.

 

Thanks RTG/AMD, thanks a lot.

 

48 minutes ago, Master Disaster said:

It's very unlikely that ray tracing will become the norm. Let's look at some of the big advances and new features in gaming over the past few years....

 

3D > Failed miserably

DX12 > Still not the norm

VR > Less than 1% of Steam users run VR in any capacity

Vulkan > Only ID software seems to care

Nvidia Ansel > I'll be surprised if many of you even know what Ansel is

 

It's very simple, new tech can only succeed if developers start using it and given just how niche RTX will be I will be incredibly surprised if anymore than 10 to 15 games ever release with actual support, and by that I mean ground up built support, not bolted on after the fact.

 

It's too expensive to be mainstream and reduces performance by a huge margin, just like the last big revolution that still has utterly failed to be anything more than an expensive toy it's destined to fail because it's not ready for prime time yet, just like VR Nvidia released it too early.

You know, Larrabee was trying to do ray tracing.

From my understanding, developers want real-time ray tracing; it's less work once implemented.


2 hours ago, valdyrgramr said:

NVLink Bridge with VRAM stacking, pooling memory from multiple cards

I heard memory pooling on the non-professional cards is disabled; I hope it's not, but I suspect that will be the case. It's not a full-bandwidth NVLink connector. Reviews can't come soon enough, grr.


33 minutes ago, Master Disaster said:

-Snip-

The problem with using VR as a comparison is that it has niche uses to begin with. What can you do in VR outside of first-person games? Not to mention VR has one issue that technology can't overcome: people reacting differently to it. When I tried a VR driving game on the PS4, it wasn't the 30FPS that bothered me; it was that every time I turned in-game, my body was expecting it in real life. There was a disconnect that I couldn't get rid of. Improvements that make the technology more accessible aren't going to change that.

 

Ray tracing touches a fundamental aspect of 3D gaming, and with it, it's possible to get rid of traditional rendering entirely. 

 

Also, I can think of one example that disproves the idea that widespread adoption hinges on cost rather than on how well a feature integrates into the fundamentals: motion control. The Wii made competent motion control very affordable. Sony and Microsoft jumped on that bandwagon. But it's now a dead feature because it was a niche thing in the end. The only use it has now is in VR, which is already a niche application.


Just now, valdyrgramr said:

If it is that is going to piss me off cuz I was planning on eventually doing multi-card, but stacking it still might be overkill for what I do.

Same, memory pooling would be so amazing to have, even just to screw around with and force it on for games that aren't supposed to support it, cos why not. Memory pooling and split-frame rendering could be the answer to all the SLI woes of the past... or not.


It's about on par with a ~2500MHz 1080 Ti. Neat.

 

21 minutes ago, leadeater said:

I heard memory pooling on the non professional cards is disabled, hope it's not but I suspect that will be the case. It's not a full bandwidth NVLink connector. Reviews can't come soon enough grr.

It may not be a full-bandwidth connector, but it's far superior to the paltry 1-2GB/s that SLI currently does. I have a feeling that reviewers will pass over it though, because sLi Is DeAd RoFl.

Our Grace. The Feathered One. He shows us the way. His bob is majestic and shows us the path. Follow unto his guidance and His example. He knows the one true path. Our Saviour. Our Grace. Our Father Birb has taught us with His humble heart and gentle wing the way of the bob. Let us show Him our reverence and follow in His example. The True Path of the Feathered One. ~ Dimboble-dubabob III


3 hours ago, Master Disaster said:

It's very unlikely that ray tracing will become the norm. Let's look at some of the big advances and new features in gaming over the past few years....

 

3D > Failed miserably

DX12 > Still not the norm

VR > Less than 1% of Steam users run VR in any capacity

Vulkan > Only ID software seems to care

Nvidia Ansel > I'll be surprised if many of you even know what Ansel is

I guess everyone forgot Pascal SMP. xD

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


If this pans out, looks like Petersen might actually be on point.

 

And this seems to reinforce my opinion: unless you’re an early adopter interested in ray tracing, or you do machine learning and need those Tensor cores, there isn’t much of a reason to upgrade to RTX, and you’d probably be better off waiting if you already have an upper-end GTX 10 GPU.

 

Of course, it’s unconfirmed, so there’s a chance everything could change over time. But if this ends up being the case, then unless you’re interested in ray tracing and DLSS and feel that you can afford the premium, you’d probably want to stick with GTX 10 if you don’t care about those features right now.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


That's seriously unimpressive. It's within 17% of what I consider to be a realistic 1080 Ti overclock.

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


Hopefully it is an early driver issue, but I don't know; I'm not expecting waves.

My Titan X Pascal overclocked gets 11,111 for a graphics score in the same test. :/

 

I'll admit my expectations were probably unreasonably high for the new cards.

9900K  / Noctua NH-D15S / Z390 Aorus Master / 32GB DDR4 Vengeance Pro 3200Mhz / eVGA 2080 Ti Black Ed / Morpheus II Core / Meshify C / LG 27UK650-W / PS4 Pro / XBox One X


8 hours ago, valdyrgramr said:

Because the costs of R&D, a new memory type being implemented, and more are going to stack.  Companies will not pay for it themselves.  What I don't understand is why people are failing to see that. 

You're posting on the LinusTechTips forums; the people who bother to look at the business strategy beyond the typical "hurr durr Intel/Nvidia are greedy!!" seem, unfortunately, to be a minority.

 


It is just a tech forum catered towards 'enthusiasts' after all.

 


10 hours ago, valdyrgramr said:

You're paying for more than that.  10 years of R&D tax, ray tracing, and more.  The point of them isn't purely that performance gain.

true

 


14 hours ago, Void Master said:

You forgot ray tracing

Nobody cares about ray tracing if it makes the game run at 30fps.

I'd rather let them improve on that tech and wait for the next gen.

My Rig "Jenova" Ryzen 7 3900X with EK Supremacy Elite, RTX3090 with EK Fullcover Acetal + Nickel & EK Backplate, Corsair AX1200i (sleeved), ASUS X570-E, 4x 8gb Corsair Vengeance Pro RGB 3800MHz 16CL, 500gb Samsung 980 Pro, Raijintek Paean


2 hours ago, Kukielka said:

Nobody cares about ray tracing if it makes the game run at 30fps.

 

Except for all the people who basically made the new cards sell out before they even hit shelves. I think some of them might care.


