Where is Pascal? Nvidia’s Drive PX 2 prototype powered by Maxwell, not Pascal

Mr_Troll

Why would self-driving cars need HBM, though?

???

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 


Well, if that is Pascal, I don't expect HBM.

 

So either they are lying, or Pascal has no HBM. Or HBM will be featured on other chips, but that would be weird, given that they showed it off with a big-ass headline claiming 6x more of whatever it was.

I expected this would be at least high-end stuff and not some mid-range sh*t...

Pascal will have HBM2. The problem here is they are doing a dishonest demo, a problem that is not exclusive to Nvidia.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


Ark's dx12 patch has been MIA with no sign that it's coming any time soon. It took months for them to get up to par with AMD in Ashes of the Singularity. The 980 was worse than a 390 in Fable Legends which didn't even have any indication it was using Async Compute. Can Nvidia get early working drivers in future dx 12 releases? Possible, but none of the early signs point to it happening.

 

That's Ark, a game from a studio of new game developers, a game that people only bought because it's running on UE4. We're talking about the largest video card vendor, not a game studio. If the game supports the API, then I'm pretty sure Nvidia will have no issues on their end.

 

They were obviously working on getting it working for that long in AotS, and they got it working. I don't know what your point is anymore, other than that you think Nvidia just isn't going to make it work, when I have already told you they have had it working for a while now.


Man, is that all they're doing, adding HBM and GDDR5X or whatever?

The way it works is that Nvidia essentially has two big teams.

One works on the 'true' next-gen architecture (in this case Volta), and the other takes it more relaxed and works on the second revision of the current architecture (I don't work at Nvidia, so I don't know how it actually works internally, but this is from information I know of). Then they switch: the Volta team goes on to Volta 2.0, and the Pascal (Maxwell 2.0) team works on the next architecture after that. For example, the GeForce 500 series architecture is based on the 400 series Fermi architecture; the 500 series is effectively Fermi 2.0.

The more time the "revision 2.0" team has, the more improvements they can make, the more bugs in the GPU itself they can fix, and the more features they can add. For example, HBM2, which was supposed to debut with Volta, was pulled forward into Pascal, so we get that part earlier. Volta will probably focus on improving the implementation, if needed, while Pascal is on the market. In a way this reduces the performance increase between GPU generations, but it can also mean that Pascal gets unique features that will later be carried into Volta.

I don't know what is really new in Pascal; no one outside Nvidia does. It might have a few more things. But expect Pascal to have HBM2 on the highest model (Titan and/or 1080, or whatever they'll call it) and GDDR5/GDDR5X on the lower-end models, to be more power efficient, to include GPU bug fixes (though that is more for developers), and to be tweaked further to boost performance (either architecture-wise and/or by being able to push clock rates higher).


Pascal will have HBM2. The problem here is they are doing a dishonest demo, a problem that is not exclusive to Nvidia.

 

Mate, you need to read what GoodBytes mentioned. It's probably a Pascal unit; it looks the way it does because it's a fookin graphics unit, for one, and because Pascal is basically just Maxwell 2.0.


Pascal will have HBM2. The problem here is they are doing a dishonest demo, a problem that is not exclusive to Nvidia.

No, only the highest model. It could be Tesla-exclusive, or at best (for the consumer) the Titan and/or the 1080 model (or whatever it will be called).

I'll be surprised if the 1070 (or whatever it will be called) has HBM2. If it does, it would most likely be a "manufacturing reject" of the 1080: a chip that doesn't meet the 1080's specs. Based on what commonly works or doesn't on most failed chips, they cut out the broken CUDA cores / memory controllers to keep the most working ones, adjust the clock frequency, and sell the result as the lower-end model. These are common practices in processor production.
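The harvesting ("binning") process described above can be sketched in a few lines. This is purely a hypothetical illustration; the product labels, core counts, and clock thresholds are invented for the example, not real Nvidia bins:

```python
# Hypothetical die-harvesting sketch: a tested die is sold as the full part,
# as a cut-down part with its broken units fused off, or scrapped.
FULL_PART = {"cores": 2048, "min_stable_mhz": 1100}
CUT_PART = {"cores": 1792, "min_stable_mhz": 1000}  # fewer cores, lower clock

def bin_die(working_cores, stable_mhz):
    """Return which product a tested die can ship as, or None (scrap)."""
    if working_cores >= FULL_PART["cores"] and stable_mhz >= FULL_PART["min_stable_mhz"]:
        return "full part"
    if working_cores >= CUT_PART["cores"] and stable_mhz >= CUT_PART["min_stable_mhz"]:
        # defective cores are disabled; the die ships as the lower-end SKU
        return "cut-down part"
    return None

print(bin_die(2048, 1200))  # full part
print(bin_die(1900, 1050))  # cut-down part: too few cores for the full SKU
print(bin_die(1500, 900))   # None: scrap
```

The point is economic: a die with a few dead cores still sells as the cheaper model instead of being thrown away, which is why cut-down SKUs exist at all.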


That's Ark, a game from a studio of new game developers, a game that people only bought because it's running on UE4. We're talking about the largest video card vendor, not a game studio. If the game supports the API, then I'm pretty sure Nvidia will have no issues on their end.

 

They were obviously working on getting it working for that long in AotS, and they got it working. I don't know what your point is anymore, other than that you think Nvidia just isn't going to make it work, when I have already told you they have had it working for a while now.

UE4 is an Nvidia partnered engine and Nvidia has been having performance problems in other dx 12 demos/games. Coincidence?

 

I didn't say they couldn't get it working. I'm talking about getting it working in a timely manner. Not being able to do this is a huge marketing issue, one which is a huge thorn in AMD's side today. Nvidia still has a lot of time to get their shit together before more dx 12 games start coming to the market, and there are at least 2 big ones in Q1 this year.


No. Only the highest model. This can be Tesla exclusive. OR at best (for the consumer): Titan and/or 1080 model (or whatever it will be called).

I'll be surprised if the 1070 (or whatever it will be called), will have HBM2. If it does, it would be most likely "manufacture reject" of the 1080 where it doesn't meet the specs of the 1080, but using common average of what is working or not o n most chips that failed, they cut out the broken CUDA cores / memory chips, to have the most working, adjust the clock frequency, and selling that as the lower end... common things in the processor production field)

Yes, I know that HBM2 will most likely be exclusive to the enthusiast cards. I was just responding to the guy who thought Pascal won't have it at all. There's still no concrete evidence that GDDR5 is a huge issue outside of 4K, and we're even getting the upgraded GDDR5X.
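For a rough sense of why GDDR5 bandwidth isn't obviously a bottleneck below 4K, peak memory bandwidth is just bus width times per-pin data rate. A quick sketch; the 10 Gbps GDDR5X figure is an assumption based on early spec announcements, not a shipping card:

```python
def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak theoretical bandwidth in GB/s: (bus width in bits / 8) * per-pin rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 980 Ti: 384-bit GDDR5 at 7 Gbps
print(mem_bandwidth_gbs(384, 7.0))    # 336.0 GB/s
# Hypothetical GDDR5X card on the same 384-bit bus at 10 Gbps
print(mem_bandwidth_gbs(384, 10.0))   # 480.0 GB/s
# Fury X: 4096-bit HBM1 at 1 Gbps
print(mem_bandwidth_gbs(4096, 1.0))   # 512.0 GB/s
```

So GDDR5X on a wide bus gets within shouting distance of first-generation HBM without the 4GB capacity ceiling.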


Ah ok! :)

 

There's still no concrete evidence that GDDR5 is a huge issue outside of 4k, and we're even getting the upgraded GDDR5X.

Well, in reality it depends on the game engine. There are two ways to do things: put as much as you can into GPU memory up front (which needs more memory at 4K), or load textures as you need them (which needs speed). That is why, on AMD cards with HBM1, which is limited to 4GB, some games get a nice performance boost on 4K displays while others suffer due to the lack of memory. Ideally you want both capacity and speed, to cover both techniques.
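A back-of-envelope sketch of the "preload everything" cost helps here. The numbers are illustrative assumptions (uncompressed RGBA8 texels, 4096x4096 textures); real engines use compressed formats that shrink this considerably:

```python
# Rough VRAM cost of keeping textures resident, assuming uncompressed
# RGBA8 (4 bytes per texel) and a full mip chain (adds ~1/3 extra).
def texture_mib(width, height, bytes_per_texel=4, mip_chain=True):
    """Approximate size of one texture in MiB."""
    size = width * height * bytes_per_texel
    if mip_chain:
        size = size * 4 // 3
    return size / 2**20

one_4k = texture_mib(4096, 4096)
print(one_4k)              # ~85.3 MiB for one 4096x4096 RGBA8 texture with mips
print(int(4096 / one_4k))  # roughly how many fit in a 4 GB card: ~48
```

With only a few dozen large uncompressed textures filling a 4GB card, the capacity-vs-streaming trade-off in the post above falls out directly: either the budget fits up front, or the engine must stream and the memory bus speed starts to matter.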

Ah ok! :)

 

Well, in reality it depends on the game engine. There are two ways to do things: put as much as you can into GPU memory up front (which needs more memory at 4K), or load textures as you need them (which needs speed). That is why, on AMD cards with HBM1, which is limited to 4GB, some games get a nice performance boost on 4K displays while others suffer due to the lack of memory. Ideally you want both capacity and speed, to cover both techniques.

Which games suffer from this?


UE4 is an Nvidia partnered engine and Nvidia has been having performance problems in other dx 12 demos/games. Coincidence?

 

I didn't say they couldn't get it working. I'm talking about getting it working in a timely manner. Not being able to do this is a huge marketing issue, one which is a huge thorn in AMD's side today. Nvidia still has a lot of time to get their shit together before more dx 12 games start coming to the market, and there are at least 2 big ones in Q1 this year.

 

MFW you think a new game studio with a shit game somehow correlates with Nvidia's alleged lack of DX12 optimization


No, only the highest model. It could be Tesla-exclusive, or at best (for the consumer) the Titan and/or the 1080 model (or whatever it will be called).

I'll be surprised if the 1070 (or whatever it will be called) has HBM2. If it does, it would most likely be a "manufacturing reject" of the 1080: a chip that doesn't meet the 1080's specs. Based on what commonly works or doesn't on most failed chips, they cut out the broken CUDA cores / memory controllers to keep the most working ones, adjust the clock frequency, and sell the result as the lower-end model. These are common practices in processor production.

So essentially, Pascal is a rebrand, much like what AMD did with the 200-to-300 series then.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


So essentially, Pascal is a rebrand, much like what AMD did with the 200-to-300 series then.

I am unfamiliar with the AMD 200-300 series so I can't comment. But, no.

A rebrand means that zero modifications were made to the chip; you just call it something else.

What is happening here is that they take the chip and improve it: make it faster, fix bugs, maybe do a die shrink, add a few features, maybe improve DirectX 12 performance now that the API is officially out and Nvidia gets to work with the official specs, and tweak it to allow more overclocking so they can raise the clocks themselves for even greater performance, etc.

It is not a brand new architecture, just an improved product.

Intel does a similar thing: a new architecture, then an improved architecture based on the older one (the "tick-tock" system, as they call it).


Shouldn't this item be on a Ford, GM, or Chrysler tech web page, since it's a car part?


Don't know off the top of my head.

Lol, because it doesn't exist. Shadow of Mordor is the only current game capable of that (and maybe modded Skyrim?), and there's no performance degradation at 4K.


I am unfamiliar with the AMD 200-300 series so I can't comment. But, no.

A rebrand means that zero modifications were made to the chip; you just call it something else.

What is happening here is that they take the chip and improve it: make it faster, fix bugs, maybe do a die shrink, add a few features, maybe improve DirectX 12 performance now that the API is officially out and Nvidia gets to work with the official specs, and tweak it to allow more overclocking so they can raise the clocks themselves for even greater performance, etc.

It is not a brand new architecture, just an improved product.

Intel does a similar thing: a new architecture, then an improved architecture based on the older one (the "tick-tock" system, as they call it).

I understand it better now. Thank you. :)


MFW you think a new game studio with a shit game somehow correlates with Nvidia's alleged lack of DX12 optimization

A new studio with a shit game that happens to be a GameWorks game also using a GameWorks engine. Yea, I'm pretty sure Nvidia would want them to push a feature they don't have a handle on yet.


The takeaway for me is that if the prototype and all the simulations they ran were using a Maxwell 980 MXM and not Pascal, then Pascal should be even more incredible for this capability and use case. That, or they had Pascal up and running and the chip they took on stage was a 980M because that's all they had available. Either way, Nvidia is paving the future and nabbing some yuuuge freakin' clients.


The big question is why?

 

Remember the last time Nvidia went with a new node and a new memory controller at once? It didn't end up so well...

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


Well, that's what I expected; it's Nvidia, after all.

Error: 451                             

I'm not copying helping, really :P


WOW. So AMD has working Polaris samples, early GPUs. This could be a good year if AMD can get their next-gen GPUs out before Nvidia, especially by a huge margin like two quarters. AMD said June; Nvidia is rumored to be releasing in Q4. That would give AMD the huge market advantage of having the fastest single GPU on the market for half a year.

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


I don't remember if they said it outright, but they gave me the impression they had been using the PX2 for at least long enough to get quotes from auto manufacturers. He talked as if everything demoed was running on it.

Air 540, MSI Z97 Gaming 7, 4770K, SLI EVGA 980Ti, 16GB Vengeance Pro 2133, HX1050, H105840 EVO 500, 850 Pro 512, WD Black 1TB, HyperX 3K 120, SMSNG u28e590d, K70 Blues, M65 RGB.          Son's PC: A10 7850k, MSI A88X gaming, MSI gaming R9 270X, Air 240, H55, 8GB Vengeance pro 2400, CX430, Asus VG278HE, K60 Reds, M65 RGB                                                                                       Daughter's PC: i5-4430, MSI z87 gaming AC, GTX970 gaming 4G, pink air 240, fury 1866 8gb, CX600, SMSNG un55HU8550, CMstorm greens, Deathadder 2013

 


Why do some people have such a hard-on for pointing out any little thing Nvidia seems to do wrong? Tall poppy syndrome?


A new studio with a shit game that happens to be a GameWorks game also using a GameWorks engine. Yea, I'm pretty sure Nvidia would want them to push a feature they don't have a handle on yet.

How are those straws you're grasping at holding up?

