
Where is Pascal? Nvidia’s Drive PX 2 prototype powered by Maxwell, not Pascal

Mr_Troll

Nvidia’s Drive PX 2 prototype allegedly powered by Maxwell, not Pascal

 

When Nvidia’s CEO, Jen-Hsun Huang, took the stage at CES last week, he unveiled the company’s next-generation self-driving car platform, the Drive PX 2. According to Nvidia, the Drive PX 2 platform packs the same amount of compute power as six Titan X boards into just two GPUs. During the show, Jen-Hsun displayed the new system — but what he showed from the stage almost certainly wasn’t Pascal.

The Drive PX 2 Nvidia showed at CES:

 

PX2GPUs-640x383.jpg

 

This is what a GTX 980 MXM module (the laptop version of the 980) looks like:

 

GTX980MXM.jpg

As an AnandTech reader noted, the hardware Jen-Hsun showed was nearly identical to the GTX 980 in an MXM configuration. The new Drive PX 2 is shown above; the GTX 980 MXM is shown below. The hardware isn’t just similar — the chips appear to be identical. Some readers have also claimed they can read the date code on the die as 1503A1, which would mean the GPUs were produced in the third week of 2015.
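For readers curious how that reading is decoded: die date codes of this sort typically follow a YYWW convention (two-digit year, then two-digit week). A minimal sketch, assuming that convention holds for the 1503A1 marking:

```python
# Decode a die date code like "1503A1", assuming the common YYWW
# convention: first two digits = year, next two = week of that year.
# The trailing "A1" is typically a revision/lot marking, ignored here.
def decode_date_code(code: str) -> tuple[int, int]:
    year = 2000 + int(code[0:2])  # "15" -> 2015
    week = int(code[2:4])         # "03" -> week 3
    return year, week

print(decode_date_code("1503A1"))  # (2015, 3): third week of 2015
```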

 

If Nvidia actually used a GTX 980 MXM board for its mockup, it would explain why the Drive PX 2 looks as though it only uses GDDR5. While Nvidia could still be tapping that memory standard for its next-generation driving platform, this kind of specialized automotive system is going to be anything but cheap. We’ve said before that we expect GDDR5 and HBM to split the upcoming generation, but we expect that split in consumer hardware with relatively low amounts of GPU memory (2-4GB) and small memory buses.

The Drive PX 2 platform sports four Denver CPU cores, eight Cortex-A57 CPUs, 8 TFLOPS worth of single-precision floating point, and a total power consumption of 250W. Nvidia has already said that it will be water-cooling the module in electric vehicles and offering a radiator block for conventional cars. Any way you slice it, this is no tiny embedded product serving as a digital entertainment front-end. Then again, it is still possible that the compute-heavy workloads the Drive PX 2 will perform don’t require HBM. It seems unlikely, but it’s possible.
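As a rough sanity check on that 8 TFLOPS figure: peak FP32 throughput is commonly estimated as shader count × clock × 2 (one fused multiply-add counts as two operations). The GTX 980’s 2048 shaders are public; the clock used below is an assumed value for an MXM module, purely for illustration:

```python
# Rough peak-FP32 estimate: shaders * clock (GHz) * 2 ops (FMA) / 1000.
# The GTX 980 shader count is public; the ~1.126 GHz clock is an
# assumption for an MXM module (mobile clocks vary by design).
def tflops_fp32(shaders: int, clock_ghz: float) -> float:
    return shaders * clock_ghz * 2 / 1000.0

per_module = tflops_fp32(2048, 1.126)  # ~4.6 TFLOPS per module
print(round(2 * per_module, 1))        # two modules: ~9.2 TFLOPS,
                                       # same ballpark as the 8 TFLOPS claim
```

So two 980-class MXM modules landing near Nvidia’s quoted 8 TFLOPS is at least plausible, which is consistent with the mockup theory.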

 

Wood screws 2.0?

 

These issues with Pascal and the Drive PX 2 echo the Fermi “wood screw” event of 2009. Back then, Jen-Hsun held up a Fermi board that was nothing but a mock-up and proclaimed that the chip was in full production and would launch before the end of the year. In reality, Nvidia was having major problems with GF100, and the GPU only launched in late March 2010.

The good news is, we’ve seen no sign that Nvidia is having the same types of problems that delayed Fermi’s launch and hurt the final product. As far as we know, both AMD and Nvidia are on track to launch new architectural revisions in 2016.

What’s more perplexing is why Nvidia engages in these kinds of overreaches in the first place. Claiming the first public demo of a 16nm FinFET GPU may be a decent PR win, but claiming it in the context of the automotive market isn’t going to ignite the coals of a GPU fanboy’s heart. Nvidia’s push into self-driving cars and deep learning is a long-term game. Launching a new Shield or gaming GPU might push some fans to upgrade immediately; precious few people are going to have the luxury of planning a new car purchase on the basis of Nvidia’s PX 2, no matter how powerful it might be.

The entire point of holding up a product from the stage is to demonstrate to the audience that the hardware actually exists. If it’s later shown that the hardware in question wasn’t what it was claimed to be, it undercuts the original point. Worse, it invites the audience to question why the company is playing fast and loose with the truth. There’s no reason to think Pascal is suffering an unusual delay — but these kinds of antics invite speculation to the contrary.

 

Well done. Just realized I got fooled as well by this Drive PX 2. I actually believed those were Pascal chips, like Jen-Hsun said.

 

Source: http://www.extremetech.com/gaming/220818-nvidias-drive-px-2-prototype-allegedly-powered-by-maxwell-not-pascal

http://www.anandtech.com/show/9903/nvidia-announces-drive-px-2-pascal-power-for-selfdriving-cars

Tomsen's post in this topic: http://linustechtips.com/main/topic/521232-nvidia-shows-off-drive-px-2-water-cooled-supercomputer-for-self-driving-cars-deep-learning-tech/

Intel Core i7 7800x @ 5.0 Ghz with 1.305 volts (really good chip), Mesh OC @ 3.3 Ghz, Fractal Design Celsius S36, Asrock X299 Killer SLI/ac, 16 GB Adata XPG Z1 OCed to  3600 Mhz , Aorus  RX 580 XTR 8G, Samsung 950 evo, Win 10 Home - loving it :D

Had a Ryzen before ... but  a bad bios flash killed it :(

MSI GT72S Dominator Pro G - i7 6820HK, 980m SLI, Gsync, 1080p, 16 GB RAM, 2x128 GB SSD + 1TB HDD, Win 10 home

 


Does it matter? It's not a gaming card.

Location: Kaunas, Lithuania, Europe, Earth, Solar System, Local Interstellar Cloud, Local Bubble, Gould Belt, Orion Arm, Milky Way, Milky Way subgroup, Local Group, Virgo Supercluster, Laniakea, Pisces–Cetus Supercluster Complex, Observable universe, Universe.


12700, B660M Mortar DDR4, 32GB 3200C16 Viper Steel, 2TB SN570, EVGA Supernova G6 850W, be quiet! 500FX, EVGA 3070Ti FTW3 Ultra.

 


Does it matter? It's not a gaming card.

It does; Nvidia said those were Pascal GPUs.


Damn. Was hoping I could finally run Crysis with my car.


It does; Nvidia said those were Pascal GPUs.

It's a prototype, though.


Maybe they're panicking to rework Pascal after the async compute performance of Nvidia's architecture in DX12 embarrassed them?

 

Hope not, it would be great to have a whole generation of AMD recuperating market share.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


It's a prototype, though.

But they still lied about the tech being used in it... just pointing out that they lied.


But they still lied about the tech being used in it...

Why? It's a prototype; they might be using Pascal in the actual version.

 

 

Pretty sure Pascal GPUs aren't being manufactured yet; how do you want them to use Pascal?


Maybe they're panicking to rework Pascal after the async compute performance of Nvidia's architecture in DX12 embarrassed them?

 

Hope not, it would be great to have a whole generation of AMD recuperating market share.

 

Don't think so. They were talking about using software (as usual) to be able to use that, and I believe they already did it with the 980 Ti IIRC and got good results compared to AMD. Can't be arsed to Google it for you though. 


Don't think so. They were talking about using software (as usual) to be able to use that, and I believe they already did it with the 980 Ti IIRC and got good results compared to AMD. Can't be arsed to Google it for you though. 

 

When you have a $400 card with 5000 GFLOPS beginning to catch up to a $600 card with 5700 GFLOPS due to its better async compute capabilities (16x), it is embarrassing no matter how you look at it. I wouldn't call it "good results" if you can't proportionally outperform the 5K GFLOPS card with 14% more raw power.
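For what it's worth, the gap between those two throughput figures works out to roughly 14% (the GFLOPS numbers are as quoted above; the post doesn't confirm which cards they refer to):

```python
# Relative raw-throughput gap between the two GFLOPS figures quoted above.
gap = 5700 / 5000 - 1
print(f"{gap:.0%}")  # 14%
```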


Whoa. How come the people making baseless accusations in the Polaris vs 950 thread didn't catch this?

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


When you have a $400 card with 5000 GFLOPS beginning to catch up to a $600 card with 5700 GFLOPS due to its better async compute capabilities (16x), it is embarrassing no matter how you look at it. I wouldn't call it "good results" if you can't proportionally outperform the 5K GFLOPS card with 14% more raw power.

 

If it performs well in games, what does it matter?


Don't think so. They were talking about using software (as usual) to be able to use that, and I believe they already did it with the 980 Ti IIRC and got good results compared to AMD. Can't be arsed to Google it for you though. 

They did catch up, but it took them months. That is the problem with Maxwell, and they're fixing it with Pascal. It saps precious resources they could be using on other things, while AMD sits back and lets their hardware do all the work. It also defeats the purpose of DX12, which is software talking directly to the hardware, with less need for the frequent driver releases we're used to in DX11.


They did catch up, but it took them months. That is the problem with Maxwell, and they're fixing it with Pascal. It saps precious resources they could be using on other things, while AMD sits back and lets their hardware do all the work. It also defeats the purpose of DX12, which is software talking directly to the hardware, with less need for the frequent driver releases we're used to in DX11.

 

That's almost always been the case between the two vendors, though; Nvidia has almost always (at least since Kepler, anyway, to play it safe, since I'm not going to Google all of this to double-check) been ahead on the software front while AMD has been chugging along on raw performance.

 

If it works, it works. I don't need to know the details.


Can we shut the hell up about games? This prototype has nothing to do with games. And as a prototype/mock-up/whatever, no one has even tested this thing, and no one has any proof the chips being used will be in the final product.


That's almost always been the case between the two vendors, though; Nvidia has almost always (at least since Kepler, anyway, to play it safe, since I'm not going to Google all of this to double-check) been ahead on the software front while AMD has been chugging along on raw performance.

 

If it works, it works. I don't need to know the details.

You are right, but it will be a marketing problem for Nvidia, like it is for AMD now, if people have to wait months for every DX12 release. Day-one performance seems to be a reason a lot of people don't want AMD cards.


You are right, but it will be a marketing problem for Nvidia, like it is for AMD now, if people have to wait months for every DX12 release. Day-one performance seems to be a reason a lot of people don't want AMD cards.

 

I don't think we can judge their release schedule for DX12 support in games by how long it took them to get async working. Regardless of what people think about Nvidia (even more so AMD), they have intelligent people who work their asses off at the company and know what they're doing. There's no way they wouldn't be able to support it on a regular basis like they are/have been with new releases.


You must remember that Pascal is Maxwell revision 2.0.

It was added to the roadmap when Volta got delayed.

I don't know whether what was shown on stage was really Maxwell or Pascal; all I am saying is that it is normal for a Pascal board to look identical to a Maxwell one.


You must remember that Pascal is Maxwell revision 2.0.

It was added to the roadmap when Volta got delayed.

 

Man, is that all they're doing, adding HBM and GDDR5X or whatever the shit?


Nvidia lying. So what's new?

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


I don't think we can judge their release schedule for DX12 support in games by how long it took them to get async working. Regardless of what people think about Nvidia (even more so AMD), they have intelligent people who work their asses off at the company and know what they're doing. There's no way they wouldn't be able to support it on a regular basis like they are/have been with new releases.

Ark's DX12 patch has been MIA, with no sign that it's coming any time soon. It took months for them to get up to par with AMD in Ashes of the Singularity. The 980 was worse than a 390 in Fable Legends, which didn't even have any indication it was using async compute. Can Nvidia get early working drivers for future DX12 releases? Possibly, but none of the early signs point to it happening.


but they still lied about the tech being used in it... just pointing out how they lied

They haven't lied about shit; it's not out yet.

 

If the final unit is Maxwell and they haven't offered a statement correcting the Pascal claims in the near future, then they have lied/been dishonest.

 

Not before those conditions are met.

System Specs

CPU: Ryzen 5 5600x | Mobo: Gigabyte B550i Aorus Pro AX | RAM: Hyper X Fury 3600 64gb | GPU: Nvidia FE 4090 | Storage: WD Blk SN750 NVMe - 1tb, Samsung 860 Evo - 1tb, WD Blk - 6tb/5tb, WD Red - 10tb | PSU:Corsair ax860 | Cooling: AMD Wraith Stealth  Displays: 55" Samsung 4k Q80R, 24" BenQ XL2420TE/XL2411Z & Asus VG248QE | Kb: K70 RGB Blue | Mouse: Logitech G903 | Case: Fractal Torrent RGB | Extra: HTC Vive, Fanatec CSR/Shifters/CSR Elite Pedals w/ Rennsport stand, Thustmaster Warthog HOTAS, Track IR5,, ARCTIC Z3 Pro Triple Monitor Arm | OS: Win 10 Pro 64 bit


Well, if that is Pascal, I don't expect HBM.

 

So either they are lying, or Pascal has no HBM. Or HBM will be featured on other chips, but that would be weird, given that they showed this off with a big-ass headline like 6x more whatever it was.

I expected this to be at least high-end stuff and not some mid-range sh*t...

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon

