Nvidia Pascal & Volta Compute Performance Figures Released, HBM2 Power-Hungry?

patrickjp93

Do you have a source for this?

Because I can't find Nvidia on this list and this article mentions explicitly that they are not a member.

They aren't anymore; they dropped membership in 2014, shortly after switching plans to use HBM 2.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

When I looked up JEDEC's voting procedures, they said all dissenting votes have to be supported with documentation of the reasoning behind them, so the issues can be sorted out, or the whole group made aware of the disagreement, before the vote goes through.

 

We can't pull too much out of the air without some behind-closed-doors info, but the entire JEDEC system appears to be set up to keep participants operating above board.

We also see the HPC world saying that, and frankly I put more stock in their actions than anyone's words on the matter here.

Pretty sure I was talking about consumer gaming cards, but thanks for your irrelevant comment as always.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Transcend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit

Regardless, there was no reason for Samsung to vote against a product it helped design and create. Just because a product ends up in JEDEC doesn't mean everyone has to make it. Micron stays out of embedded memory like eMMC for instance. Elpida isn't lining up to make HBM 2. The only company with a vested interest in keeping HMC out was SK Hynix, and now it's becoming more and more obvious why with the technical report coming from Nvidia about power and heat scaling.

Same die, same RAM (plus ECC), slightly lower clocks. Thanks for pulling an ad hominem.

Except, when they say non-competitive in gaming card reviews, they're talking about how much it will set you back in terms of your electric bill and paying more for a power supply. Again, thanks for your irrelevant comment, like always.

Which is a valid point. Nvidia gives you better guarantees than AMD does. Considering some of the stock spikes of the 290X are 350W+, I'm further unwilling to trust AMD's figures and would rather take security over being cheap up front only to have a component fail.

 

Thank you for missing the nuance in the argument, again.

If this is true, I can't wait for nvidia fanboys to defend the 600% increase in power consumption, for a 50% increase in performance.

Also, if this is true, I'd have a hard time believing Nvidia would openly accept HBM 2, especially since they're so big on overclocking. I would have thought they'd go for the much more efficient HBM 1 and try to work out sourcing it en masse.
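For scale, the "600% more power for 50% more performance" scenario works out to roughly a quarter of the original performance per watt. A quick sketch with purely illustrative numbers (none of these are measured figures):

```python
# Rough perf-per-watt comparison for the hypothetical figures in the post:
# a 6x increase in power draw for a 1.5x increase in performance.
# All numbers here are illustrative, not measurements of any real card.

def perf_per_watt(performance: float, power_w: float) -> float:
    """Performance points delivered per watt drawn."""
    return performance / power_w

base = perf_per_watt(100, 50)    # baseline card: 100 perf points @ 50 W
new = perf_per_watt(150, 300)    # rumored card: +50% perf @ 6x the power

print(f"baseline: {base:.2f} perf/W, rumored: {new:.2f} perf/W")
print(f"efficiency ratio: {new / base:.2f}x")  # 0.25x, i.e. a 75% efficiency drop
```

Whatever the absolute numbers end up being, the ratio is what matters: if power scales 4x faster than performance, efficiency falls to a quarter.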

You realize this would affect AMD too?

Just remember: Random people on the internet ALWAYS know more than professionals, when someone's lying, AND can predict the future.

i7 9700K (5.2Ghz @1.2V); MSI Z390 Gaming Edge AC; Corsair Vengeance RGB Pro 16GB 3200 CAS 16; H100i RGB Platinum; Samsung 970 Evo 1TB; Samsung 850 Evo 500GB; WD Black 3 TB; Phanteks 350x; Corsair RM19750w.

 

Laptop: Dell XPS 15 4K 9750H GTX 1650 16GB Ram 256GB SSD

1st of all, citation needed.

2nd of all, even if it did, that would only be an issue with a bad-quality PSU. A 290X can be safely run with ZERO issues on a good-quality 550W PSU, and I don't know of any 780 Ti or 980 owners who have less than that.

 

You know what, don't even bother citing. I know you're used to other people playing this stupid game with you, but I'm not interested. I have better things to do with my time. Go back to writing code for the CIA, or working with IBM's engineers to make their next big chip, or whatever it is you do with your free time. You win, go pester someone else.
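Setting the hostility aside, the 550 W claim can be sanity-checked with a back-of-the-envelope power budget. All component draws below are rough illustrative assumptions, not measurements:

```python
# Back-of-the-envelope PSU headroom check for the "290X on a 550 W PSU" claim.
# Component figures are rough illustrative estimates, not measured values.

def total_draw(components_w: dict) -> float:
    """Sum of the sustained draws of all listed components, in watts."""
    return sum(components_w.values())

system = {
    "gpu_sustained": 290,     # R9 290X-class gaming load (approximate)
    "cpu": 90,                # quad-core under game load (approximate)
    "board_ram_drives": 60,   # motherboard, RAM, fans, storage (approximate)
}

sustained = total_draw(system)   # ~440 W
psu_capacity = 550

headroom = psu_capacity - sustained
print(f"sustained draw ~{sustained:.0f} W, headroom ~{headroom:.0f} W")

# A quality PSU also tolerates brief transients above its sustained rating,
# which is why millisecond-scale GPU spikes to ~350 W need not trip a 550 W unit.
```

The point of contention between the two posters is exactly this last comment: whether short spikes above the rated capacity matter, which depends on the PSU's transient response rather than its label wattage.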

https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#safe=off&q=290x+power+spikes+350w

 

Lots of people have reported them that high.

(sigh, why am I responding to this garbage)

 

Wow, great citation there. Let me actually do your job for you.

 

http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-4.html

 

You should probably go sell your Nvidia cards, though, since you're so worried about spikes taking out your system (I like that you ignored the PSU argument I made).

 

http://www.tomshardware.de/gigabyte-windforce-gtx-780-ti-ghz-edition,testberichte-241428-4.html

 

Even with all this, I am still declaring you the winner (just like you did last time after I proved you wrong in our AMD/ATI vs Nvidia history of market share and dx 10.1 debate), so please don't even bother responding. Facts seem very irrelevant to you.

Your argument was pointless. If you tell me exactly how much power I need, I buy that much power and a little more so I can do some minor overclocking. If you hand me a bill of goods and then put me in the position of buying something I can't return, I'm sorry, but you deserve the criticism for not being forthcoming. And further, yes, power matters.

 

Oh please, had I handed you any single source you'd have called it biased, just as you did in the DX 10.1 debate. I'm sorry, but I gave you the a la carte option since you don't respond to journalists.

What? Journalist? You're trying to pretend that you linked me to an article? Do you make up so much shit that you can't keep track of it? Just like back then, you gave no citation.

Once again, go sell your Nvidia cards if power consumption and spikes are legitimate issues to you like you're claiming. Once again, refer to actual evidence, and once again, I'm not interested in playing this stupid game with you.

No, I'm saying you'd have ignored any journalist I quoted since you have a history of doing so. So I just gave you the gateway to the slew of people who've measured it so you couldn't fire back with "source confirmation bias."

I have no interest in selling either brand of card. I have only interest in corporate honesty and getting the best product for me. You're the one who skulked over here and derailed the topic.

I have a long history of ignoring something you've never done? That's hilarious.

I'm afraid the community at large would disagree with you. @LukaP, @Opcode, @MageTank, @Shakaza, I leave the rabble to you if you'd like. He's derailed my thread enough.

I made absolutely no point about AMD, because people fully expect by now that their cards run and suck down power like a roided-out gorilla. I know my Fury X chews through more watts than a Titan X while only barely performing akin to it at 4K; I'm perfectly fine with that.

 

The reason I find it so deeply amusing is that Nvidia fanboys, especially the uninformed and screaming variety, have always defended Nvidia as being super budget-friendly when it comes to power consumption, when in reality it's never been that big of an issue unless you're at the very, very high end. (295 vs Titan Z: the 295 wins by a lot, but drinks like 60% more watts than the Titan Z, resulting in actual tangible differences on power bills each month if you play regularly.)

But if this is true, then Nvidia cards will easily use as much as AMD's do now, and it will be freaking hilarious to watch one of those aforementioned fanboys attempt to reason out that it's still efficient with power consumption. Note, I haven't seen any Nvidia fanboys like that here on the forums in a long while, though with the holidays coming up, I have no doubt we're going to start seeing a lot of fresh accounts with these kinds of values. You can quote me on this: I fully expect by 1/1/16 someone to have posted "[lol]... nvidia is so much more power efficient that you earn money back on your power bill, and amd makes your bill go up by like $100"
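The power-bill point above is easy to put rough numbers on. A small sketch, assuming an illustrative ~250 W extra draw, 4 hours of gaming a day, and $0.13/kWh (all hypothetical figures, not anyone's actual rates):

```python
# Estimating the monthly power-bill difference the post alludes to
# (one card drawing considerably more watts under load than another).
# The wattage delta, hours, and electricity rate are all assumptions.

def monthly_cost(extra_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Extra cost over a 30-day month from drawing `extra_watts` more while gaming."""
    kwh = extra_watts / 1000 * hours_per_day * 30
    return kwh * usd_per_kwh

# ~250 W extra draw, 4 h of gaming a day, at $0.13/kWh:
print(f"${monthly_cost(250, 4, 0.13):.2f} per month")  # prints "$3.90 per month"
```

Which illustrates the middle-ground position: the difference is real and tangible over a year of heavy play, but nowhere near the "$100 a month" hyperbole the post is mocking.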

Updated 2021 Desktop || 3700x || Asus x570 Tuf Gaming || 32gb Predator 3200mhz || 2080s XC Ultra || MSI 1440p144hz || DT990 + HD660 || GoXLR + ifi Zen Can || Avermedia Livestreamer 513 ||

New Home Dedicated Game Server || Xeon E5 2630Lv3 || 16gb 2333mhz ddr4 ECC || 2tb Sata SSD || 8tb Nas HDD || Radeon 6450 1g display adapter ||

I miss Opcode. I swear I found him on a forum once.

I found him on wccf actually.

Really? How is he? I've gone through a few hundred pages of conversations you've had on WCCF with many people. You piss off the AMD group and they all insult you and call you some fairly uncreative names, and then you piss off the Nvidia group and it's the same thing, only with a bit more hostility. It's kind of incredible how much crap is thrown at you over there. Very reactionary, they are.

Makes you wonder why so many people don't like him. Hmmmm

Archangel (Desktop) CPU: i5 4590 GPU: Asus R9 280 3GB RAM: HyperX Beast 2x4GB PSU: SeaSonic S12G 750W Mobo: GA-H97m-HD3 Case: CM Silencio 650 Storage: 1 TB WD Red
Celestial (Laptop 1) CPU: i7 4720HQ GPU: GTX 860M 4GB RAM: 2x4GB SK Hynix DDR3 Storage: 250GB 850 EVO Model: Lenovo Y50-70
Seraph (Laptop 2) CPU: i7 6700HQ GPU: GTX 970M 3GB RAM: 2x8GB DDR4 Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model: Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6

They went for JEDEC and Hynix played dirty with AMD's R&D money.

 

That's still a conspiracy theory of yours. Just like my conspiracy theory of Yakuza killing Iwata.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling

It makes perfect sense if what you want is good press and to look legitimate.

 

Samsung's in the middle of its Wide-IO push for mobile devices. I'm not surprised it hasn't jumped into the HMC fray yet. Samsung's perfectly happy to let Micron remain alive by having Oracle's business for another couple years.

 

JEDEC has a lot of meaning. Not being under JEDEC means being a less secure (for future support) product. That can shake demand clear to its foundations. HMC is vastly superior to DDR4 in every single way, and yet...

 

It makes absolutely no sense. As mentioned, all members must argue their vote. The HMC consortium members could have easily protested.

 

Like you said yourself, Nvidia outright left the HMC consortium and scrapped HMC. If they planned on switching to HMC at a later date, surely they would not have left the consortium. It still does not change the fact that if a market has massive demand, someone will supply it. That is market theory 101. And with behemoths like Intel in the server world, there is definitely a demand that is very stable and large (unless the industry is indeed assuming AMD will make a big comeback with Zen and HBM).

 

HMC is not superior in every way to DDR4. The prices are sky high and supply is limited to hell, and by your own point, not being JEDEC means it's not future-proof in support.

 

Either way, I doubt HBM is going to become system memory, unlike HMC. So honestly, they seem to be separate technologies for separate markets. In that case, why would it not be possible for HMC to become a JEDEC standard? There must be other reasons not apparent to us. But your HBM hating and HMC praising is getting tiresome, especially your completely unsubstantiated claims of foul play.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro

People tend to hate others who disagree with them or think differently than they do. Now, I'm not saying @patrickjp93 is correct every time, but that's a completely different story.

CPU: AMD Ryzen 7 3800X Motherboard: MSI B550 Tomahawk RAM: Kingston HyperX Predator RGB 32 GB (4x8GB) DDR4 GPU: EVGA RTX3090 FTW3 SSD: ADATA XPG SX8200 Pro 512 GB NVME | Samsung QVO 1TB SSD  HDD: Seagate Barracuda 4TB | Seagate Barracuda 8TB Case: Phanteks ECLIPSE P600S PSU: Corsair RM850x

I am a gamer, not because I don't have a life, but because I choose to have many.
