
Nvidia thinks "Pascal is just unbeatable" and already decided not to move Volta ahead

Misanthrope
2 hours ago, Taf the Ghost said:

 RTG's Navi is going to really shift the market around.

Let's hope so; the market is suffering from its current stale state!


7 minutes ago, valdyrgramr said:

9370 and 9590 at launch were priced high.

$999 for the 9590 at launch


1 hour ago, valdyrgramr said:

9370 and 9590 at launch were priced high.

Yep, and that was a mistake. They had to drop the price pretty quickly, as I recall. They pushed the architecture to its very limits and thought they could justify the extra cost because of the clock speed, but the market proved them wrong.

 

It wasn't like the old Socket 939 days, when they could justify a $1,000+ CPU because they actually had the better architecture.


2 hours ago, laminutederire said:

Where have you seen those figures though?

Basing my calculations on this review:

With power saving mode, the 1070's advantage is around 25% more performance per watt, not 40%. That's nearly half your figure.

Besides, I said closer, compared to Polaris etc. At least in Hitman there is a significant improvement in perf/watt under power saving mode (a 13% improvement in that game). I don't have the figures for every game, but that's probably a valid trend for the improvement from Polaris to Vega (it could be worse versus the 1070, since Hitman is a good game for AMD, but closer anyway).

I got my numbers from TechPowerUp, which takes 20 games into consideration (not just one), which I think makes for a pretty good average without any cherry picking. To be fair, they did not use the power saving BIOS, but on the other hand, will anyone actually use that? I doubt many people will buy an expensive high-end GPU and then put it in power saving mode (keeping it in standard mode gives you about 9% higher performance on the 64, as you can see here).

Here is their chart:

[Chart: performance per watt at 3840x2160]
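For anyone who wants to sanity-check claims like the 25% vs. 40% gap, the arithmetic is just average frame rate divided by board power. Here's a minimal sketch of that calculation; the FPS and wattage numbers below are placeholders for illustration only, not figures from the TechPowerUp review.

```python
# Hedged sketch of a perf/watt comparison; all numbers are placeholders,
# not actual review data.

def perf_per_watt(avg_fps: float, avg_power_w: float) -> float:
    """Performance per watt is just average frame rate divided by board power draw."""
    return avg_fps / avg_power_w

# Hypothetical example figures for illustration only.
cards = {
    "Vega 64 (balanced BIOS)":   perf_per_watt(avg_fps=60.0, avg_power_w=295.0),
    "Vega 64 (power save BIOS)": perf_per_watt(avg_fps=55.0, avg_power_w=230.0),  # ~9% less perf, much less power
    "GTX 1070 (reference)":      perf_per_watt(avg_fps=58.0, avg_power_w=150.0),
}

baseline = cards["Vega 64 (balanced BIOS)"]
for name, value in cards.items():
    print(f"{name}: {value:.3f} FPS/W ({value / baseline:.0%} of Vega 64 balanced)")
```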


9 minutes ago, LAwLz said:

I got my numbers from TechPowerUp, which takes 20 games into consideration (not just one), which I think makes for a pretty good average without any cherry picking. To be fair, they did not use the power saving BIOS, but on the other hand, will anyone actually use that? I doubt many people will buy an expensive high-end GPU and then put it in power saving mode (keeping it in standard mode gives you about 9% higher performance on the 64, as you can see here).

Here is their chart:

[Chart: performance per watt at 3840x2160]

For future designs it's interesting though. They were pushed out of the efficiency range to get extra performance, but they're capable of way better efficiency.

9% compared to turbo or balanced mode?


Just now, laminutederire said:

For future designs it's interesting though. They were pushed out of the efficiency range to get extra performance, but they're capable of way better efficiency.

9% compared to turbo or balanced mode?

9% higher performance going from the power saving BIOS to balanced mode. Turbo is within the margin of error of balanced, so it's completely worthless.


9 hours ago, laminutederire said:

The 56 draws less power than my Fury, but I'm fine with my 530W PSU, so 750W is a bit extreme; 600W should be enough for almost everyone, granted that my CPU isn't that power hungry (6600K). With the power saving mode, the 56 is on par with reference 1070s while drawing something around 30W more, and that's not that bad for efficiency in itself.

The power draw figure I mentioned comes from AMD and assumes a high-performance gaming PC using an Intel Core i7 or a high-end Ryzen 7.

 

Yes it seems overkill to start there, but it's also never good to load a power supply to 80% or more of its total rated capacity, as that shortens the lifespan.



2 hours ago, JurunceNK said:

The power draw figure I mentioned comes from AMD and assumes a high-performance gaming PC using an Intel Core i7 or a high-end Ryzen 7.

 

Yes it seems overkill to start there, but it's also never good to load a power supply to 80% or more of its total rated capacity, as that shortens the lifespan.

The calculated minimum power supply wattage for my system is 495W, I have a 530W PSU, and it usually draws something like 400W under load. That's close to 80 percent, but still under it. 600W is probably plenty for most systems.
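To put numbers on the 80% rule of thumb mentioned above, here's a quick sketch of the headroom arithmetic using the wattages from this post; the 0.8 load target is just the guideline quoted earlier, not a hard spec.

```python
# Minimal sketch of PSU headroom math. Figures come from the post above;
# the 80% target is a rule of thumb, not a manufacturer spec.

def load_factor(system_draw_w: float, psu_rated_w: float) -> float:
    """Fraction of the PSU's rated capacity the system is actually using."""
    return system_draw_w / psu_rated_w

def minimum_psu(system_draw_w: float, max_load: float = 0.8) -> float:
    """Smallest rated wattage that keeps the PSU at or below max_load at full system draw."""
    return system_draw_w / max_load

print(f"530 W PSU at 400 W draw: {load_factor(400, 530):.0%} load")          # ~75%, just under 80%
print(f"PSU needed to keep 400 W below 80% load: {minimum_psu(400):.0f} W")  # 500 W
```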


16 hours ago, Dylanc1500 said:

Details on the Volta arch improvements: Here

Some comparisons between P100 and V100: Here

Everyone loves slides, so here are a couple of slides: Slides

Have to wonder how the gaming GPUs will be configured. Looking at all that information, there are two standout things to me: most performance comparisons/boosts shown are for the Tensor cores, and the Volta SMs are now split, with dedicated INT32 and FP32 cores.

 

Going into the second point more, the SM cores, which are more applicable to us for gaming: I have to wonder how good this could be. The SM count has increased to 80, but the core makeup has totally changed. Each SM has 64 FP32 cores, 64 INT32 cores, 32 FP64 cores, and 8 Tensor cores. We know Nvidia cuts down the FP64 cores for the gaming cards, but will they also keep the separate INT32 cores? How useful would dedicated INT32 cores be for gaming?

 

What if Nvidia swaps out the INT32 cores for FP32 cores instead, giving 128 of them per SM for 10240 CUDA cores? Does that mean Volta could have well over double the performance of Pascal?!?

 

I'm also very sure Tensor cores will be removed for GV102 and below, for cost/die size reasons.
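To make the core-count arithmetic above concrete, here's a small sketch. The per-SM makeup is the published V100 configuration; the 128-FP32 variant is purely the hypothetical swap described above, not anything Nvidia has announced.

```python
# Back-of-the-envelope core counts for an 80-SM Volta part.
# The V100 per-SM makeup is from Nvidia's published specs; the "hypothetical"
# variant is just the what-if described above (INT32 swapped for FP32, FP64/Tensor cut).

SM_COUNT = 80

v100_per_sm = {"FP32": 64, "INT32": 64, "FP64": 32, "Tensor": 8}
hypothetical_gaming_per_sm = {"FP32": 128}

def total_cores(per_sm: dict, sms: int = SM_COUNT) -> dict:
    """Multiply each per-SM core count by the number of SMs."""
    return {kind: count * sms for kind, count in per_sm.items()}

print("Full 80-SM V100:", total_cores(v100_per_sm))
# {'FP32': 5120, 'INT32': 5120, 'FP64': 2560, 'Tensor': 640}

print("Hypothetical gaming config:", total_cores(hypothetical_gaming_per_sm))
# {'FP32': 10240}
```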

 



Either Nvidia is pulling an Intel, or they have to delay and are using this as an excuse because the maturity of the node they want to use isn't where it needs to be yet. That means we're stuck with the 10 series at least until next summer.



7 hours ago, leadeater said:

Have to wonder how the gaming GPUs will be configured. Looking at all that information, there are two standout things to me: most performance comparisons/boosts shown are for the Tensor cores, and the Volta SMs are now split, with dedicated INT32 and FP32 cores.

 

Going into the second point more, the SM cores, which are more applicable to us for gaming: I have to wonder how good this could be. The SM count has increased to 80, but the core makeup has totally changed. Each SM has 64 FP32 cores, 64 INT32 cores, 32 FP64 cores, and 8 Tensor cores. We know Nvidia cuts down the FP64 cores for the gaming cards, but will they also keep the separate INT32 cores? How useful would dedicated INT32 cores be for gaming?

 

What if Nvidia swaps out the INT32 cores for FP32 cores instead, giving 128 of them per SM for 10240 CUDA cores? Does that mean Volta could have well over double the performance of Pascal?!?

 

I'm also very sure Tensor cores will be removed for GV102 and below, for cost/die size reasons.

 


From what I understand and how it was explained to us, the tensor cores can also operate as two FP16 units or a single FP32 unit doing two FP32 operations. I could be misunderstanding it entirely.
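For context on the FP16 point: Nvidia describes the V100 Tensor core as performing a 4x4 matrix fused multiply-add, D = A×B + C, with FP16 inputs and FP32 accumulation. Here's a NumPy emulation of that operation as an illustration only; it says nothing about how the hardware actually schedules the work.

```python
import numpy as np

def tensor_core_mma(a_fp16: np.ndarray, b_fp16: np.ndarray, c_fp32: np.ndarray) -> np.ndarray:
    """Emulate one 4x4 Tensor core op: FP16 inputs, FP32 accumulate (D = A @ B + C)."""
    assert a_fp16.shape == b_fp16.shape == c_fp32.shape == (4, 4)
    return a_fp16.astype(np.float32) @ b_fp16.astype(np.float32) + c_fp32

# Tiny usage example with random FP16 inputs and an FP32 accumulator.
a = np.random.rand(4, 4).astype(np.float16)
b = np.random.rand(4, 4).astype(np.float16)
c = np.zeros((4, 4), dtype=np.float32)
print(tensor_core_mma(a, b, c))
```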


5 minutes ago, Dylanc1500 said:

From what I understand and how it was explained to us, the tensor cores can also operate as two FP16 units or a single FP32 unit doing two FP32 operations. I could be misunderstanding it entirely.

Something like that. In your link there are comments down at the bottom where someone asks that question and it gets answered. Still, I can't really see Tensor cores staying, outside of the slim chance the Titans get them and not the other GeForce products. Nvidia won't be putting gaming GPUs above 610mm² on the market, that's for sure.

 

Edit:

Quadros share the same dies as the GeForce products, with minor revisions, but Tesla is the dedicated product line for the tasks Tensor is good at, so Quadro doesn't need them either.


3 minutes ago, leadeater said:

Something like that. In your link there are comments down at the bottom where someone asks that question and it gets answered. Still, I can't really see Tensor cores staying, outside of the slim chance the Titans get them and not the other GeForce products. Nvidia won't be putting gaming GPUs above 610mm² on the market, that's for sure.

 

Edit:

Quadros share the same dies as the GeForce products, with minor revisions, but Tesla is the dedicated product line for the tasks Tensor is good at, so Quadro doesn't need them either.

Oh, it will definitely be cut down; it'll be interesting to see what they do for that, that's for sure.

 

I see a possibility of them having at least one Quadro with the full V100 (similar to the Quadro P100), but I guess we will just have to wait and see. The rest of the line-up will, just as you said, share the same dies as the consumer line, with specific revisions to them.

 

On a separate thought, could you imagine them putting a fully enabled GV100 on a Titan? I do wonder whether games could take full advantage of something like the tensor cores. I haven't dealt with games (the last game I played was Portal 2 lol), and whatnot, enough to know.


9 minutes ago, Dylanc1500 said:

I see a possibility of them having at least one Quadro with the full V100 (similar to the Quadro P100), but I guess we will just have to wait and see. The rest of the line-up will, just as you said, share the same dies as the consumer line, with specific revisions to them.

Oh right, I totally forgot there was a Tesla P100 and a Quadro P100.

 

9 minutes ago, Dylanc1500 said:

On a separate thought, could you imagine them putting a fully enabled GV100 on a Titan? I do wonder whether games could take full advantage of something like the tensor cores. I haven't dealt with games (the last game I played was Portal 2 lol), and whatnot, enough to know.

I'd think, based on how much faster they are supposed to be performance-wise, that if they were any good for gaming and/or other workloads Volta would only have Tensor cores. But to be honest, my knowledge of Tensor and TensorFlow stops at knowing it exists lol.


Just now, leadeater said:

Oh right, I totally forgot there was a Tesla P100 and a Quadro P100.

 

I'd think, based on how much faster they are supposed to be performance-wise, that if they were any good for gaming and/or other workloads Volta would only have Tensor cores. But to be honest, my knowledge of Tensor and TensorFlow stops at knowing it exists lol.

Yeah, I know exactly ZERO about Tensor. I design and create databases; I didn't know what Tensor was till we got info about Volta last year lol.


On 15/08/2017 at 6:52 PM, TrigrH said:

Did anyone honestly expect Volta to come out soon anyway?

Well, if Vega had crushed Pascal, then yeah, maybe.



2 hours ago, Eroda said:

Well, if Vega had crushed Pascal, then yeah, maybe.

Nvidia has known how far Vega could be pushed since last year. That's why they pushed out the 1080 Ti & 1050 Ti when they did, along with some price cuts. Though even Nvidia expected AMD to make the April window.


5 hours ago, Dylanc1500 said:

Yeah, I know exactly ZERO about Tensor. I design and create databases; I didn't know what Tensor was till we got info about Volta last year lol.

What's Tensor for us dumb people?



1 hour ago, Taf the Ghost said:

Nvidia has known how far Vega could be pushed since last year. That's why they pushed out the 1080 Ti & 1050 Ti when they did, along with some price cuts. Though even Nvidia expected AMD to make the April window.

Yep, Linus has stated multiple times on different WAN Shows that it's pretty normal for the competition to have the opposition's new hardware in their labs weeks (if not months) before general release.

 

It's a weird concept and one which I struggle to understand (like, how the hell do they get hold of the hardware so early?), but Linus knows his shit when it comes to how these companies operate.



41 minutes ago, Master Disaster said:

Yep, Linus has stated multiple times on different WAN Shows that it's pretty normal for the competition to have the opposition's new hardware in their labs weeks (if not months) before general release.

 

It's a weird concept and one which I struggle to understand (like, how the hell do they get hold of the hardware so early?), but Linus knows his shit when it comes to how these companies operate.

GPUs are a different space than CPUs. I believe board partners aren't bound in the same way with GPUs as they are with CPUs. There's also the issue that, with GPUs, they normally both need access to the same memory; that means starting a pricing war hurts both of them, so they're a little more chummy.

 

With CPUs? They ship early test chips by armed guard.


Nice, that means my 980 Ti will soldier on. Granted, I don't play any games (except BF1, Fallout, and World of Tanks/Warships), so the 980 Ti is wasted space anyway.


