
Where is Pascal? Nvidia’s Drive PX 2 prototype powered by Maxwell, not Pascal

Mr_Troll

It does. Nvidia said those were Pascal GPUs.

Did they say it had that, or that it would have that? That's a big distinction, especially since it's a prototype.

I must say, the more I read about this, the more worried I get. Back in the day, with Fermi and the infamous wood-screw mock-up, Fermi got delayed by more than a quarter.

 

To this day, we still haven't seen any prototype from Nvidia actually showing either a 16nm FF+ Pascal chip or any HBM chip in any shape or form whatsoever.

 

Sure, we have not actually seen an Arctic Islands chip in the flesh, but we have seen HBM implemented, even launched. At CES, working Polaris prototypes were demoed, and Greenland (or Ellesmere) was demoed behind closed doors, meaning we have working prototype 14nm FF LPP GPUs. Nvidia hasn't even shown us a defective, broken chip yet.

 

Considering AMD has confirmed a Polaris launch on both laptops and desktops before the August school start, I have to question the actual state of Pascal. Will we even see it this year? Maybe only their high-end model (Titan) around the holidays?

There's also the Fury line of cards, which shows HBM working pretty well, better overclocking aside.


@GoodBytes 

You keep mentioning that this is Maxwell 2.0. Shouldn't this be Maxwell 3.0? I mean, the 750 Ti is first-gen Maxwell afaik, and the 900 series is second-gen. Also, the fact that they lied about what they had in that prototype is illegal. They are a publicly traded company, and they can't say "Look, we have Pascal up and running" when in fact it's a 980 MXM board. There were investors there, and according to SEC regulations this is not legal.

Also regarding rebrands:

AMD didn't change the ASIC of their GPUs; they only binned it better and got higher clocks, but they did update the cards with better components. So it's more of a refresh/optimization of the cards.

While the 750 Ti uses a Maxwell chip and was first on the market, it is still Maxwell 1.0. It has no differences compared to the 900 series besides having far fewer cores, lower frequencies, and all that. But architecture-wise, it is the same.

I have had Nvidia cards before in home and work systems. I never liked their driver interface, and it was even worse on mobile.

 

Yeah, the Nvidia control panel looks like a software relic from the Windows 98/XP era. I'm saying that strictly from having used the dang interface for so long. Sure, if it ain't broke don't fix it, but maybe slapping a new coat of paint on it or something wouldn't hurt.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


Yeah, the Nvidia control panel looks like a software relic from the Windows 98/XP era. I'm saying that strictly from having used the dang interface for so long. Sure, if it ain't broke don't fix it, but maybe slapping a new coat of paint on it or something wouldn't hurt.

While it could be more responsive as you load tabs/sections, it follows the Windows GUI design and is high-DPI aware.

It doesn't try to have a "gamer" look. You have GeForce Experience for that, but in the professional field, people don't like a "gamer" look on anything.

At our office, and when I used to work in IT at another company, I can tell you that when we install Nvidia drivers, the first thing we do is uncheck everything except the drivers themselves (of course) and the card's audio drivers (mostly so Device Manager doesn't show an unknown device, and just in case we use HDMI or DP with audio out one day on that system).


Yeah, the Nvidia control panel looks like a software relic from the Windows 98/XP era. I'm saying that strictly from having used the dang interface for so long. Sure, if it ain't broke don't fix it, but maybe slapping a new coat of paint on it or something wouldn't hurt.

No, please no. Just leave it as it is, or make it more responsive.

  ﷲ   Muslim Member  ﷲ

KennyS and ScreaM are my role models in CSGO.

CPU: i3-4130 Motherboard: Gigabyte H81M-S2PH RAM: 8GB Kingston hyperx fury HDD: WD caviar black 1TB GPU: MSI 750TI twin frozr II Case: Aerocool Xpredator X3 PSU: Corsair RM650


While it could be more responsive as you load tabs, it follows the Windows GUI design and is high-DPI aware.

It doesn't try to have a "gamer" look. You have GeForce Experience for that, but in the professional field, people don't like a "gamer" look on anything.

At our office, and when I used to work in IT at another company, I can tell you that when we install Nvidia drivers, the first thing we do is uncheck everything except the drivers themselves (of course) and the card's audio drivers (mostly so Device Manager doesn't show an unknown device, and just in case we use HDMI or DP with audio out one day on that system).

Crimson doesn't have a gamer look, yet it still looks amazing.

It does. Nvidia said those were Pascal GPUs.

Didn't the absence of HBM on the die and those huge GDDR5 chips sprayed around the PCB throw you off?

 

Also, it's clear as day that there are two daughterboards plugged into the whole thing.

CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


Didn't the absence of HBM on the die and those huge GDDR5 chips sprayed around the PCB throw you off?

As already mentioned earlier in this thread, Pascal will have GDDR5, GDDR5X, and HBM2 depending on the model. Expect HBM2 to be on the Tesla cards and the Titan... MAYBE on the "1080" model. MAYBE. Expect the 1070 or 1060 and lower to be on GDDR5X, then lower-end ones on GDDR5, and DDR3 for the very low-end ones, if any such models exist and aren't rebrands.

As already mentioned earlier in this thread, Pascal will have GDDR5, GDDR5X, and HBM2 depending on the model. Expect HBM2 to be on the Tesla cards and the Titan... MAYBE on the "1080" model. MAYBE. Expect the 1070 or 1060 and lower to be on GDDR5X, then lower-end ones on GDDR5, and DDR3 for the very low-end ones, if any such models exist and aren't rebrands.

When you're the leader of the graphics industry, why cheap out on the memory technology?

unless... you aren't really a leader

 

 

People will read "Pascal - HBM" online:

"Ooh, got to get one, it will be blazing fast."

 

That's as low as Intel numbering their HEDT chips one generation higher than the architecture they actually use.

People shell out thousands of dollars for the enthusiast platform because it's a 5960X, meaning it should be a 5th-gen Core processor, right? Wrong, it's actually 4th gen.
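
To make the mismatch concrete, here's a tiny Python sketch; the model-to-architecture mapping is factual, the snippet itself is just an illustration:

# Intel HEDT "Extreme" parts carry a model number one generation ahead
# of the core architecture they actually use.
hedt_parts = {
    "i7-3960X": "Sandy Bridge-E (2nd-gen core architecture)",
    "i7-4960X": "Ivy Bridge-E (3rd-gen)",
    "i7-5960X": "Haswell-E (4th-gen)",
}

for model, arch in hedt_parts.items():
    print(f"{model} -> {arch}")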

CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


Yeah, the Nvidia control panel looks like a software relic from the Windows 98/XP era. I'm saying that strictly from having used the dang interface for so long. Sure, if it ain't broke don't fix it, but maybe slapping a new coat of paint on it or something wouldn't hurt.

This^^^^^^^^^^^^^^^^^

 

I seriously have no idea how to do anything with Nvidia cards because I have only owned one, a GT 630. The interface is so stale. At least AMD did it right with their Catalyst Control Center and now the Crimson interface.

 

Please don't kill me


OK, I've just phoned Nvidia's Mr. Huang and told him that I'm selling my GTX 980 tomorrow, and that for his own sake he'd better hurry this whole Pascal thing along!

 

:lol:

Cosmic Council Department of Defense ; Interplanetary Class 3 Relations & Diplomatic Affairs - OFFICE 117


When you're the leader of the graphics industry, why cheap out on the memory technology?

unless... you aren't really a leader

 

Nvidia made a pile of money with Maxwell by shrinking the manufacturing cost by a large chunk while charging as much as they did with Kepler. I recall the numbers being something crazy, like one Maxwell card profiting as much as three equivalent-tier Kepler cards. I might be somewhat off, but it's in the right ballpark. Note I said profit, not revenue.

 

Also, anyone buying a sub-$400 graphics card probably won't care whether they have HBM or GDDR5X, and low-to-mid-range graphics cards may not have the power to make use of HBM at this time.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


Why would they need to run three different kinds of memory technology (even if two of them are quite similar)?

Can HBM and GDDR5 not offer the wide range of bandwidth needed? Adding another memory technology might actually increase cost and development effort.

If anything, I see GDDR5X as a temporary solution for those not quite ready for HBM.
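
For a rough sense of the ranges involved, here's a back-of-the-envelope Python sketch. The bandwidth formula is standard; the data rates are typical 2016-era figures I'm assuming, not confirmed specs for any upcoming part:

# Peak theoretical bandwidth = bus width (bits) / 8 * effective data rate (Gbps)
def peak_bandwidth(bus_width_bits, data_rate_gbps):
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

configs = [
    ("GDDR5, 256-bit @ 7 Gbps", 256, 7.0),              # e.g. GTX 980 class
    ("GDDR5X, 256-bit @ 10 Gbps", 256, 10.0),           # assumed launch data rate
    ("HBM1, 4 stacks (4096-bit) @ 1 Gbps", 4096, 1.0),  # e.g. Fury X class
    ("HBM2, 4 stacks (4096-bit) @ 2 Gbps", 4096, 2.0),
]

for name, width, rate in configs:
    print(f"{name}: {peak_bandwidth(width, rate):.0f} GB/s")

# Prints 224, 320, 512 and 1024 GB/s respectively, so the three technologies
# land on distinct bandwidth tiers rather than overlapping ranges.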

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


Nvidia made a pile of money with Maxwell by shrinking the manufacturing cost by a large chunk while charging as much as they did with Kepler.

Incorrect, the 900 series cost much less; Nvidia dropped the price. Now that our Canadian dollar has gone down the toilet you don't see it, but if you look at U.S. pricing, it did.

 

 

Why would they need to run three different kinds of memory technology (even if two of them are quite similar)?

Can HBM and GDDR5 not offer the wide range of bandwidth needed? Adding another memory technology might actually increase cost and development effort.

If anything, I see GDDR5X as a temporary solution for those not quite ready for HBM.

It has to do with cost. People already have a hard time buying a Titan card. Companies buy Teslas, which cost significantly more.

The problem with HBM (1 and 2) is that the memory chips sit on the same package as the GPU. So what happens if the GPU is perfect but one of the memory chips is faulty? The whole thing goes in the bin. This increases cost. Also, if you as a consumer buy a graphics card and the memory develops a fault over time, you would RMA it to the card manufacturer. Fine. The card manufacturer would normally replace the memory chip and sell the card as refurbished. With HBM, they need a whole new GPU package from Nvidia.

It is a huge mess for Nvidia, and it is fully understandable why Nvidia and AMD don't want to put HBM on all models.
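
To illustrate the yield argument with numbers, here's a toy Python model; every figure in it is a made-up assumption for the sake of the arithmetic, not real yield data:

# Toy model: with on-package HBM, one bad part scraps the whole assembly.
gpu_yield = 0.90       # assumed fraction of good GPU dies
stack_yield = 0.95     # assumed fraction of good HBM stacks
num_stacks = 4
assembly_yield = 0.98  # assumed interposer/bonding success rate

package_yield = gpu_yield * stack_yield**num_stacks * assembly_yield
print(f"Good HBM packages: {package_yield:.1%}")  # ~71.8%

# With discrete GDDR5 on the PCB, a bad memory chip is swapped out at the
# board level, so the GPU's effective yield stays near 90% and an RMA'd
# card can be repaired without a new GPU package from Nvidia.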


HBM1 is still a good option for the lower end as well, unless you want to squeeze every penny and use those pennies for car parts.


It has to do with cost. People already have a hard time buying a Titan card. Companies buy Teslas, which cost significantly more.

The problem with HBM (1 and 2) is that the memory chips sit on the same package as the GPU. So what happens if the GPU is perfect but one of the memory chips is faulty? The whole thing goes in the bin. This increases cost. Also, if you as a consumer buy a graphics card and the memory develops a fault over time, you would RMA it to the card manufacturer. Fine. The card manufacturer would normally replace the memory chip and sell the card as refurbished. With HBM, they need a whole new GPU package from Nvidia.

It is a huge mess for Nvidia, and it is fully understandable why Nvidia and AMD don't want to put HBM on all models.

Well, I'm quite sure the memory is validated beforehand.

But again, HBM is getting steadily cheaper, while GDDR5X still costs more than GDDR5 and still requires some additional development.

Splitting up orders also increases the price of the individual memory modules.

Again, why not just stay with HBM and GDDR5 and exclude GDDR5X? Can HBM and GDDR5 not offer the range of bandwidth needed?

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


Incorrect, the 900 series cost much less; Nvidia dropped the price. Now that our Canadian dollar has gone down the toilet you don't see it, but if you look at U.S. pricing, it did.

 

Note that I said profit, not revenue. Maxwell was much cheaper to manufacture, and the prices were not always much lower with Maxwell: the 780 Ti was $699 USD at launch, while the 980 Ti was $649 USD at launch.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


Well, I'm quite sure the memory is validated beforehand.

And yet you have people that buy graphics cards and end up with faulty memory. Heck, I have a GTX 780 at work that has been in use since the day it was bought, and 6 months ago the memory broke.

 

 

But again, HBM is getting steadily cheaper, while GDDR5X still costs more than GDDR5 and still requires some additional development.

Splitting up orders also increases the price of the individual memory modules.

Of course it is getting cheaper; GDDR5 is getting cheaper as well. That is how technology works.

But HBM is on the GPU package.


And yet you have people that buy graphics cards and end up with faulty memory. Heck, I have a GTX 780 at work that has been in use since the day it was bought, and 6 months ago the memory broke.

I'm not so sure GDDR memory is also validated beforehand.

Of course it is getting cheaper; GDDR5 is getting cheaper as well. That is how technology works.

But HBM is on the GPU package.

But not in the same sense. HBM is in its adoption phase: many of the components needed to make HBM work have quickly come down in cost, whereas GDDR is on a more "steady" decline. HBM also puts less complexity on the PCB, leaving a smaller footprint.

We are starting to drift from my point: couldn't HBM and GDDR5 (not GDDR5X) cover the range of needs, without making things more complex by introducing a third memory technology for a subset of GPUs between GDDR5 and HBM?

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


It costs less to make the PCB, uses less power, and offers much higher bandwidth, more than is needed atm. HBM is better than GDDR.


Has anyone in this thread mentioned that Nvidia took down the photo from their Twitter?

IMO that's an admission of guilt.

Daily Driver:

Case: Red Prodigy CPU: i5 3570K @ 4.3 GHZ GPU: Powercolor PCS+ 290x @1100 mhz MOBO: Asus P8Z77-I CPU Cooler: NZXT x40 RAM: 8GB 2133mhz AMD Gamer series Storage: A 1TB WD Blue, a 500GB WD Blue, a Samsung 840 EVO 250GB


Has anyone in this thread mentioned that Nvidia took down the photo from their Twitter?

IMO that's an admission of guilt.

Or they were tired of everyone over-analyzing everything to fucking death.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


Or they were tired of everyone over-analyzing everything to fucking death.

Yup. It won't surprise me if this is the last time Nvidia shows any circuit board on stage.

People don't get what a prototype/engineering sample is; they base their "analysis" on rumors and speculation which they take as facts, and spread false information which hurts the company. Especially since this isn't the first time this has happened, and once the product is in people's hands, they discover that, yeah, Nvidia wasn't lying about what they said about the product.


Yup. It won't surprise me if this is the last time Nvidia shows any circuit board on stage.

People don't get what a prototype/engineering sample is; they base their "analysis" on rumors and speculation which they take as facts, and spread false information which hurts the company.

At least we can count on Jen-Hsun to always come out on stage with the next iteration of the Titan. For some reason, I actually really like that display. "His smooth, peppered hair and leather jacket rustling in the artificially created wind as he revealed the next generation of high-tier performance. The crowd 'oohed' and 'aahed' as they witnessed him. He was witnessed."

