
NVIDIA Pascal officially called PK100 and PK104 w/ HBM2

BiG StroOnZ

All I can say is "Yaaaaaaaaay!"

 

Omg I can't wait, I will probably buy one of these upon release, a GTX X70 or GTX X80 (X being whatever number Nvidia sticks at the front, probably GTX 1070 or 1080 lol)

System Specs:

CPU: Ryzen 7 5800X

GPU: Radeon RX 7900 XT 

RAM: 32GB 3600MHz

HDD: 1TB Sabrent NVMe -  WD 1TB Black - WD 2TB Green -  WD 4TB Blue

MB: Gigabyte  B550 Gaming X- RGB Disabled

PSU: Corsair RM850x 80 Plus Gold

Case: BeQuiet! Silent Base 801 Black

Cooler: Noctua NH-D15

 

 

 


PK... why not GP I wonder...

 

GF = Geforce Fermi

GK = Geforce Kepler

GM = Geforce Maxwell

PK = Pascal (Kepler?)...

PK... Subban

 

Pascal's Kingdom?

CPU: i5-4690K GPU: R9 280X, with some other things


Please share some official quotes confirming the tech being used in the upcoming GPUs. I would love to get a similar time span for this claim from Jen... which is probably a 12-month period from release. We, the public, only got confirmation of HBM in the 2015 GPUs four days ago, in May... or show us a roadmap from 2014 claiming it for 2015.

 

I would appreciate it because I can't find any.

 

 

 

So I'm talking about a 3 to 4 year period, and you're talking about something they did last week, after months of hiding information about their to-be-released GPUs? And OEM cards, which are the most predictable ones (especially the ones already announced, the low-tier ones)... the 2015 GPU lineup announced in Q2 2015 lol... no veil there xD

They hadn't announced jack shit until we received confirmation four days ago of HBM's presence in the new GPUs. All the rest was vague comments and logical assumptions - nothing clear, and none of it was on a roadmap as far as we know. One thing is an assumption; another thing is confirmation from an official representative, especially a CEO.

 

AMD had HBM on their roadmaps as early as 2010/2011 if I remember correctly. Nvidia had something about HBM/HMC around the same time too. Going to scour some places and try to find it.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


It does make sense

if LTT is anything to go by (lol), then more RAM = Nvidia is a step in front of AMD, and a bigger number there = better

 

Yes, and it doesn't even matter how well all that RAM is actually connected to the GPU! :P

 

Seriously, 32GiB just screams scientific workload, because the bus connection is much less of a bottleneck when you can upload all your data to the GPU at once and don't need to stream it.

And for the same reason, this actually isn't that interesting for most of us, because these will be extremely pricey "professional" cards with no real benefit outside of specific workloads.
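To make that point concrete, here's a toy CUDA sketch (my own, not from the article; the dataset size, kernel, and names are invented): when the whole working set fits in VRAM, you pay the PCIe transfer once and then run as many compute passes as you like, instead of streaming chunks over the bus on every pass.

// Toy sketch only: sizes and kernel are made up for illustration.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

__global__ void scale(float* data, size_t n, float factor) {
    size_t i = (size_t)blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const size_t n = 1 << 26;                 // 256 MiB of floats, stand-in for a big dataset
    const size_t bytes = n * sizeof(float);
    std::vector<float> host(n, 1.0f);
    float* dev = nullptr;
    if (cudaMalloc((void**)&dev, bytes) == cudaSuccess) {
        // Working set fits in VRAM: one upload over PCIe, then any number of compute passes
        cudaMemcpy(dev, host.data(), bytes, cudaMemcpyHostToDevice);
        for (int pass = 0; pass < 100; ++pass)
            scale<<<(int)((n + 255) / 256), 256>>>(dev, n, 1.001f);
        cudaMemcpy(host.data(), dev, bytes, cudaMemcpyDeviceToHost);
        cudaFree(dev);
    } else {
        // Working set does not fit: every pass would have to stream chunks across PCIe instead,
        // which is exactly the case a 32 GiB card lets compute users avoid.
    }
    cudaDeviceSynchronize();
    printf("done\n");
    return 0;
}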

 

And to the people saying there is no need for more bandwidth... why do you think AMD and Nvidia are switching to this kind of technology, then?


Reminder: Stay on topic. This is a thread about the fact that nVidia's Pascal GPUs' codenames will be PK100 and PK104, and that they will have HBM2. Discussing the other parts of the OP is also acceptable. However, arguing about whether Intel or AMD invented this feature or that, or whether the AMD R9 295X2's cooler was good or not, doesn't count as being on topic.

HTTP/2 203


whether the AMD R9 295X2's cooler was good or not, doesn't count as being on topic.

Why? Because we talk about future products based on what we see in current and previous gens and what changes we expect.

 

That's like not talking about Haswell/Bulldozer in Skylake/Zen topics.

 

Anyway, just my opinion.


Glad I bought a GTX 960. I'm crossing my fingers for cheaper 4K monitors to coincide with Nvidia's (hopefully) beast Pascal cards.


Why? Because we talk about future products based on what we see in current and previous gens and what changes we expect.

 

That's like not talking about Haswell/Bulldozer in Skylake/Zen topics.

 

Anyway, just my opinion.

There's a difference between discussing another product and how it compares to this new product, and arguing about whether the cooler on that product was too loud or not. If you're discussing another product, but it's still all related to the original topic at hand, it's fine, but the argument that I was referring to was not.

HTTP/2 203


But these aren't out yet... So they're still behind.

Is Pascal out? Your point is moot.

Daily Driver:

Case: Red Prodigy CPU: i5-3570K @ 4.3 GHz GPU: PowerColor PCS+ 290X @ 1100 MHz MOBO: Asus P8Z77-I CPU Cooler: NZXT X40 RAM: 8GB 2133MHz AMD Gamer series Storage: A 1TB WD Blue, a 500GB WD Blue, a Samsung 840 EVO 250GB


Glad I bought a GTX 960. I'm crossing my fingers for cheaper 4K monitors to coincide with Nvidia's (hopefully) beast Pascal cards.

 

That's a weird thing to be glad about.

Such an underwhelming GPU. Even a lot of Nvidia fanatics were disappointed by its performance.

The GTX 970 is where it's at from Nvidia, an amazing card still.


Glad I bought a GTX 960. I'm crossing my fingers for cheaper 4K monitors to coincide with Nvidia's (hopefully) beast Pascal cards.

 

I'm not glad at all about any of this. I am due to upgrade my GPU next year, and even if both companies only half come good on their hype I am going to have choice panic.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


That's a weird thing to be glad about.

Such an underwhelming GPU. Even a lot of Nvidia fanatics were disappointed by its performance.

The GTX 970 is where it's at from Nvidia, an amazing card still.

 

The 970 is nearly $150 more with only 3.5GB of RAM, and it's more of a 1440p card. I'd rather cut my losses with a cheap 1080p card and spend the bank on 4K and Pascal.

 

I'm not glad at all about any of this. I am due to upgrade my GPU next year, and even if both companies only half come good on their hype I am going to have choice panic.

 

Hm... hopefully there will be a price war to help you choose, but if there isn't then GL lmao.


The 970 is nearly $150 more with only 3.5GB of RAM, and it's more of a 1440p card. I'd rather cut my losses with a cheap 1080p card and spend the bank on 4K and Pascal.

That is extremely wrong; it has 4GB of VRAM, it's just that the last 512MB isn't much faster than system memory.
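If anyone wants to see it for themselves, here's a rough sketch of the kind of probe people ran at the time (my own toy code, not an official tool; chunk size and pass count are arbitrary): grab VRAM in chunks and time a write kernel on each one. On a 970, the last chunks, the ones that land in the slow 512MB segment, report far lower bandwidth than the rest.

// Crude VRAM bandwidth probe, illustration only.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

__global__ void fill(float* p, size_t n) {
    size_t i = (size_t)blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) p[i] = (float)i;
}

int main() {
    const size_t chunkBytes = 128ull << 20;            // 128 MiB per chunk
    const size_t n = chunkBytes / sizeof(float);
    std::vector<float*> chunks;
    float* p = nullptr;

    // Grab as much of the card's memory as the driver will hand out
    while (cudaMalloc((void**)&p, chunkBytes) == cudaSuccess)
        chunks.push_back(p);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    for (size_t c = 0; c < chunks.size(); ++c) {
        cudaEventRecord(start);
        for (int pass = 0; pass < 20; ++pass)          // repeat to get a stable number
            fill<<<(int)((n + 255) / 256), 256>>>(chunks[c], n);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        double gbps = (20.0 * chunkBytes / 1e9) / (ms / 1e3);
        printf("chunk %2zu: %.1f GB/s\n", c, gbps);
    }

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    for (float* q : chunks) cudaFree(q);
    return 0;
}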

 

Hm... hopefully there will be a price war to help you choose, but if there isn't then GL lmao.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Money reasons. AMD is a small company compared to Intel...

They have GlobalFoundries; AMD just has to do the design work.

CPU: AMD FX-8350 @ 4.0GHz | Cooling: AMD Stock | Motherboard: ASRock 970 Extreme4 | RAM: 8GB (2x4) DDR3 1333MHz | GPU: AMD Sapphire R9 290 Vapor-X | Case: Fractal Define R5 Titanium

Storage: Samsung 120GB 840 EVO | PSU: Thermaltake Litepower 600W | OS: Windows 8.1 Pro 64-bit

Upgrading to - Intel i7 - New motherboard - Corsair AIO H110i GT watercooler - 1000W PSU



Thats a weird thing to be glad about.

Such a underwhelming gpu . Even a lot of nvidia fanatics  were disappointed  by its performance .

GTX 970 is where its at from nvidia , amazing card still.

 

He got a new GPU... he isn't even allowed to be happy about his purchase? No matter where it lies on the performance-per-dollar scale...

Do you have to become a total ***** to everybody as soon as a discussion involves AMD? He said nothing about AMD, nothing about anything, and you just had to stomp on whatever joy he might have had...

 

Stay classy, Zappian. Stay classy.

The Mistress: Case: Corsair 760t   CPU:  Intel Core i7-4790K 4GHz(stock speed at the moment) - GPU: MSI 970 - MOBO: MSI Z97 Gaming 5 - RAM: Crucial Ballistic Sport 1600MHZ CL9 - PSU: Corsair AX760  - STORAGE: 128Gb Samsung EVO SSD/ 1TB WD Blue/Several older WD blacks.

                                                                                        


DX 10.1 adoption was practically non-existent until Nvidia jumped aboard; it's called reading the market momentum. The 400 and 500 series were not disasters, and I currently run off a 570 in my school build. Up until very recently, 3GB was more than enough; suddenly memory management in games went to pot.

 

XDMA XFire still has pretty prolific microstutter issues. Frankly, it's not a superior solution right now, and Nvidia understands that. There's also the fact that going bridgeless means potentially creating a PCIe bottleneck where there wasn't one before; using a bridge cable allows this to be mitigated. Nvidia pulled ahead on power consumption with Kepler, or else IBM wouldn't be using Kepler in their supercomputers; they'd have picked AMD's FirePro chips.

 

Fully programmable shaders, HDMI 2.0, DP 1.2, physics engines, real-time rendering and simulation, GPGPU AI programming, and a whole slew more. Nvidia is still a step ahead, but Nvidia also knows which steps to take and when. Some of you may see Nvidia's slowness to move as complacency or arrogance, but I'd pose this question: is it arrogance if you really are the best in the room? It seems most consumers recognize Nvidia's superiority, and I can guarantee you it's not primarily marketing.

The history revision here is amazing.

 

DX 10.1 adoption was low because of Nvidia, not the other way around. Nvidia had clout with so many devs back then, and if Nvidia cards didn't support a feature, they wouldn't bother with it.

 

Did you really just say there was nothing wrong with Fermi? Let me help jog your memory.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/30321-nvidia-geforce-gtx-470-review.html

Heat issues were well known, but pay special attention to power consumption in relation to performance.

 

Nvidia pulled ahead in power consumption with Kepler?

[power consumption comparison charts]

 

Yeah, I can definitely see how you came to the conclusion that Nvidia has always been one step ahead.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


The history revision here is amazing.

DX 10.1 adoption was low because of Nvidia, not the other way around. Nvidia had clout with so many devs back then, and if Nvidia cards didn't support a feature, they wouldn't bother with it.

Did you really just say there was nothing wrong with Fermi? Let me help jog your memory.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/30321-nvidia-geforce-gtx-470-review.html

Heat issues were well known, but pay special attention to power consumption in relation to performance.

Nvidia pulled ahead in power consumption with Kepler?

-snips-

Yeah, I can definitely see how you came to the conclusion that Nvidia has always been one step ahead.

1) No, ATI and Nvidia were neck and neck for market share.

2) Look at the Teslas and Quadros, where power consumption actually matters, and compare them to the FirePro cards. There are critical clock speeds beyond which power rises quadratically or worse relative to the linear growth in clock speed/performance (see the rough numbers sketch at the end of this post). For gaming cards, where clock is king, both companies push beyond that critical point. Below it Nvidia won out by a fair margin, a margin made only wider by Maxwell, though of course without any Tesla products it's less noteworthy.

3) Read again. I said Fermi wasn't a disaster, and it wasn't. Loud and hot were the flavors du jour from both companies. That's hardly a disaster.
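A back-of-envelope for point 2 (my own sketch, using the standard CMOS dynamic power model and assuming voltage has to rise roughly in step with clock once past the sweet spot; none of the numbers come from either vendor):

P_dyn ≈ C · V^2 · f, and with V ∝ f beyond the critical clock, P_dyn ∝ f^3

So a 20% clock bump costs roughly 1.2^3 ≈ 1.73x the dynamic power for about 1.2x the performance, which is why compute parts clocked below that point look far more efficient than gaming cards pushed past it.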

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


So, can we just fast forward to 2016? Like, is there a place I can go to just speed up time, or nah?

Either a black hole, or going really fast. Or just sleep near your mom for a night

"Unofficially Official" Leading Scientific Research and Development Officer of the Official Star Citizen LTT Conglomerate | Reaper Squad, Idris Captain | 1x Aurora LN


Game developer, AI researcher, Developing the UOLTT mobile apps


G SIX [My Mac Pro G5 CaseMod Thread]


1) No, ATI and Nvidia were neck and neck for market share.

2) Look at the Teslas and Quadros, where power consumption actually matters, and compare them to the FirePro cards. There are critical clock speeds beyond which power rises quadratically or worse relative to the linear growth in clock speed/performance. For gaming cards, where clock is king, both companies push beyond that critical point. Below it Nvidia won out by a fair margin, a margin made only wider by Maxwell, though of course without any Tesla products it's less noteworthy.

3) Read again. I said Fermi wasn't a disaster, and it wasn't. Loud and hot were the flavors du jour from both companies. That's hardly a disaster.

1. http://www.bit-tech.net/news/hardware/2009/04/30/nvidia-increases-market-share/1

 

2. You have to be trolling.

 

3. Pay special attention to the 5850 vs the 470

[GTX 470/480 review charts: performance and power consumption]

 

The GTX 470 was using 100W more to give as little as 2 more fps in some cases. The GTX 480 was a bit better, but still terrible compared to what AMD was offering: it used 20W less than a 5970 while performing an average of 20 fps worse.

 

At this point, I'm convinced you're a troll, so don't bother responding anymore.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


Either a black hole, or going really fast. Or just sleep near your mom for a night

 

1. http://www.bit-tech.net/news/hardware/2009/04/30/nvidia-increases-market-share/1

 

2. You have to be trolling.

 

3. Pay special attention to the 5850 vs the 470

 

 

 

-so-

-many-

-snippity-

-snip-

-snips-

 

The GTX 470 was using 100W more to give as little as 2 more fps in some cases. The GTX 480 was a bit better, but still terrible compared to what AMD was offering: it used 20W less than a 5970 while performing an average of 20 fps worse.

 

At this point, I'm convinced you're a troll, so don't bother responding anymore.

1) An incomplete and blatantly false article that only took into account new sales in the previous year, not actual standing market share in a rapidly expanding market.

2) Not at all. You can see this behavior modeled yourself in Intel and AMD CPUs as well as in GPUs or the POWER8 processors. Everything I've said is true, even if highly inconvenient for you. External proof of this is reflected in the cost structure of GPU-accelerated supercomputers/datacenters: electricity is the dominant cost over the long term, not equipment replacement. If AMD had a decent power consumption and/or heat density lead, especially with its theoretical performance lead, then it would have vastly outsold the Nvidia Kepler Teslas purely on merit.

 

3) Your point? Also, you are quoting launch benchmarks; the performance gap did narrow over the course of the product's lifetime.

 

Fermi wasn't a disaster. It wasn't Nvidia's best architecture or time period, but it wasn't a disaster by any stretch.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


1) No, ATI and Nvidia were neck and neck for market share.

[discrete GPU market share chart]

 

Now it's 24% AMD, and the rest is taken by Nvidia.


Lol @ all the people thinking AMD is going to "disappear" or "go bankrupt" or that they are "so far behind" when it comes to GPUs. Oh ye of ignorance and little knowledge... :P

My Systems:

Main - Work + Gaming:


Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:


FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:


SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:


MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:


Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 

