
NVIDIA 16nm Pascal-Powered GeForce GTX 1080 Could Be Launching in May

33 minutes ago, Thony said:

I use DVI on both of my monitors. They didn't come with HDMI or DP cables in the box, so why buy an extra cable to do literally the same thing?

 

Only a minority of people still use VGA, and it's still being included even on high-end mobos and laptops.

 

That standard won't go away for a long time...

 

Luckily we don't see VGA on GPUs anymore. I'm sure some manufacturers will start releasing cards with only DP or HDMI I/O :)

Not every card is going to do that though. 

 

You're already right - the R9 Fury X and R9 Nano are DP/HDMI only.

- ASUS X99 Deluxe - i7 5820K - Nvidia GTX 1080 Ti SLI - 4x4GB EVGA SSC 2800MHz DDR4 - Samsung SM951 500 - 2x Samsung 850 EVO 512 -

- EK Supremacy EVO CPU Block - EK FC 1080 GPU Blocks - EK XRES 100 DDC - EK Coolstream XE 360 - EK Coolstream XE 240 -


This will most likely be a GTX 980 equivalent; the naming scheme makes perfect sense: GTX 980 -> GTX 1080

Also expected: the presence of GDDR5X.

 

As for the power requirements, that single 8-pin PCIe power connector... here's where it gets tricky:

it can draw only the additional power (from the 8-pin), for 150W,

or it can draw power from both the PCIe slot (75W) and the 8-pin, for a total of 225W.
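To make that arithmetic explicit, here's a minimal Python sketch (my own illustration, not from the article; the helper name is made up) using the PCIe limits of 75W from the slot, 75W per 6-pin, and 150W per 8-pin:

```python
# PCIe power limits: 75W from the x16 slot itself,
# 75W per 6-pin connector, 150W per 8-pin connector.
SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def max_board_power(connectors, use_slot=True):
    """Upper bound on board power for a given connector loadout."""
    total = SLOT_W if use_slot else 0
    total += sum(CONNECTOR_W[c] for c in connectors)
    return total

# A single 8-pin card: 150W from the connector alone,
# or 225W if it also pulls the full 75W through the slot.
print(max_board_power(["8-pin"], use_slot=False))  # 150
print(max_board_power(["8-pin"]))                  # 225
```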


Let's rush a product that is not ready for market, that's a good idea.

 

PS: Nvidia is going to 16nm? Isn't AMD going 14nm? Wouldn't AMD have the advantage?

 

3 hours ago, ozziestig said:

Yeah, Nvidia has said many times that Pascal is a compute powerhouse, but still, it is a node shrink, so there is that.

I thought it was a vast improvement over current Nvidia cards, but still shit compared to current AMD cards.


11 minutes ago, zMeul said:

This will most likely be a GTX 980 equivalent; the naming scheme makes perfect sense: GTX 980 -> GTX 1080

Also expected: the presence of GDDR5X.

 

As for the power requirements, that single 8-pin PCIe power connector... here's where it gets tricky:

it can draw only the additional power (from the 8-pin), for 150W,

or it can draw power from both the PCIe slot (75W) and the 8-pin, for a total of 225W.

Nah, that's not necessarily true... Z97 -> Z170, not Z107.


43 minutes ago, Mystic Mistro said:

Fair enough, I was more getting at why we have multiple types of connectors at all if they all do the same thing :P

Good point, but that's not something we have the power to change. It would be nice to have one connector do everything, but that has yet to happen, probably in the distant future.

Connection: 200Mbps / 12Mbps, 5GHz WiFi

My baby: CPU - i7-4790, MB - Z97-A, RAM - Corsair Veng. LP 16GB, GPU - MSI GTX 1060, PSU - CXM 600, Storage - Evo 840 120GB, MX100 256GB, WD Blue 1TB, Cooler - Hyper Evo 212, Case - Corsair Carbide 200R, Monitor - BenQ XL2430T 144Hz, Mouse - FinalMouse, Keyboard - K70 RGB, OS - Win 10, Audio - DT990 Pro, Phone - iPhone SE


2 minutes ago, RagnarokDel said:

Let's rush a product that is not ready for market, that's a good idea.

 

PS: Nvidia is going to 16nm? Isn't AMD going 14nm? Wouldn't AMD have the advantage?

Not necessarily. Just because it's smaller doesn't mean it's better, for now.

Steve Wozniak - "Never trust a computer you can't throw out a window."
Carl Sagan - "If you want to make an apple pie from scratch, you must first create the universe."

 


CPU: Core i5 6600K Cooling: NH-D14 Motherboard: GA-Z170XP-SLI RAM: 8GB Patriot Graphics: Sapphire Nitro R9 380 4G Case: Phanteks Enthoo Pro HDD: 2TB Seagate Barracuda PSU: Thermaltake Smart 750W

My computer runs on MSX; it's very hard to catch.


Just now, ozziestig said:

Not necessarily. Just because it's smaller doesn't mean it's better, for now.

Right, but in theory it should?


7 minutes ago, Tesel said:

Nah, that's not necessarily true... Z97 -> Z170, not Z107.

We're talking Nvidia.


8 minutes ago, RagnarokDel said:

PS: Nvidia is going to 16nm? Isn't AMD going 14nm? Wouldn't AMD have the advantage?

Didn't Samsung make the 14nm Apple chips that ran hotter than TSMC's 16nm ones (not sure if it was them)?


Only 1 power connector... and still with GDDR5 memory... meh... not worth upgrading...

Intel Core i7 7800X @ 5.0GHz with 1.305V (really good chip), Mesh OC @ 3.3GHz, Fractal Design Celsius S36, ASRock X299 Killer SLI/ac, 16GB ADATA XPG Z1 OC'd to 3600MHz, Aorus RX 580 XTR 8G, Samsung 950 EVO, Win 10 Home - loving it :D

Had a Ryzen before... but a bad BIOS flash killed it :(

MSI GT72S Dominator Pro G - i7 6820HK, 980M SLI, G-Sync, 1080p, 16GB RAM, 2x128GB SSD + 1TB HDD, Win 10 Home

 


2 hours ago, Thony said:

I use DVI on both of my monitors. They didn't come with HDMI or DP cables in the box, so why buy an extra cable to do literally the same thing?

 

Only a minority of people still use VGA, and it's still being included even on high-end mobos and laptops.

 

That standard won't go away for a long time...

 

Luckily we don't see VGA on GPUs anymore. I'm sure some manufacturers will start releasing cards with only DP or HDMI I/O :)

Not every card is going to do that though. 

DP passively supports DVI and HDMI, but DVI only passively supports HDMI. So if you really need HDMI or DVI, just buy a $5 cable.
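Roughly, the compatibility works out like the sketch below (my own summary, not an official spec; "passive" here meaning a cheap pin-adapter cable is enough):

```python
# Which sinks each source port can drive with a passive adapter/cable
# (rough summary; active adapters are needed for the other directions).
PASSIVE = {
    "DP":   {"DP", "DVI", "HDMI"},  # dual-mode DP++ ports
    "DVI":  {"DVI", "HDMI"},        # single-link DVI <-> HDMI
    "HDMI": {"HDMI", "DVI"},
    "VGA":  {"VGA"},                # analog only
}

def passive_ok(source, sink):
    return sink in PASSIVE.get(source, set())

print(passive_ok("DP", "DVI"))   # True: a $5 cable will do
print(passive_ok("DVI", "DP"))   # False: needs an active adapter
```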

if you want to annoy me, then join my teamspeak server ts.benja.cc


2 hours ago, Notional said:

Still disappointing. HBM not only provides much higher bandwidth, but also uses less power and takes up far less space, making small cards possible. Then again, I have no idea how many Polaris models will utilize HBM. I hope all of them.

 

2 hours ago, zMeul said:

This will most likely be a GTX 980 equivalent; the naming scheme makes perfect sense: GTX 980 -> GTX 1080

Also expected: the presence of GDDR5X.

 

As for the power requirements, that single 8-pin PCIe power connector... here's where it gets tricky:

it can draw only the additional power (from the 8-pin), for 150W,

or it can draw power from both the PCIe slot (75W) and the 8-pin, for a total of 225W.

What do you mean? The article said the opposite: that the card will likely use GDDR5, and that GDDR5X wasn't expected until much later, and only at the AIB partners' initiative. 'Course, this entire article is purely speculation and rumours.

1 hour ago, RagnarokDel said:

Let's rush a product that is not ready for market, that's a good idea.

 

PS: Nvidia is going to 16nm? Isn't AMD going 14nm? Wouldn't AMD have the advantage?

 

 

I thought it was a vast improvement over current Nvidia cards, but still shit compared to current AMD cards.

Nah, the difference between Samsung/GloFo 14nm and TSMC 16nm is negligible. Something about the way both of them got to their respective FinFET nodes means they are basically using equivalent nodes. I couldn't tell you the technical details.

 

In theory, 14nm might give a slight advantage, but I don't think it'll be the deciding factor.

35 minutes ago, Mr_Troll said:

Only 1 power connector... and still with GDDR5 memory... meh... not worth upgrading...

Why don't we wait until actual products are released and benchmarked/reviewed? How do you know it isn't worth upgrading, when realistically, you don't know a damn thing about power consumption or performance?

28 minutes ago, Suicidal Korean said:

DP passively supports DVI and HDMI, but DVI only passively supports HDMI. So if you really need HDMI or DVI, just buy a $5 cable.

Indeed, DP can be adapted to pretty much anything, including VGA. We use DP-to-DVI adapters on almost all of our workstations at work, because the new Dell PCs have DP only and all our monitors support only DVI or VGA. Our new public Internet computers use DP-to-VGA adapters, since those monitors only support VGA.

 

And let me tell you, when you're dealing with ~150 monitors, it's damn well cheaper to buy adapters than to replace all the monitors.
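Back-of-the-envelope, with made-up placeholder prices just to show the scale:

```python
# Hypothetical prices, purely illustrative.
monitors = 150
adapter_cost = 10    # assumed DP-to-DVI/VGA adapter price
monitor_cost = 150   # assumed cheap replacement monitor price

print(f"Adapters for all: ${monitors * adapter_cost:,}")      # $1,500
print(f"New monitors for all: ${monitors * monitor_cost:,}")  # $22,500
```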

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


8 minutes ago, dalekphalm said:

What do you mean? The article said the opposite: that the card will likely use GDDR5, and that GDDR5X wasn't expected until much later, and only at the AIB partners' initiative. 'Course, this entire article is purely speculation and rumours.

"the article" .. the article is usual WCCFTech bullcrap based on a BenchLife post: https://benchlife.info/gp104-aka-nvidia-geforce-gtx-1080-will-ship-in-may-and-no-hbm2-031112016/


1 minute ago, zMeul said:

"the article" .. the article is WCCFTech bullcrap based on a BenchLife post: https://benchlife.info/gp104-aka-nvidia-geforce-gtx-1080-will-ship-in-may-and-no-hbm2-031112016/

... I fail to see your point?

 

I just read that article on BenchLife (Translated into English), and it specifically says:

Quote

This card will remain GDDR5, or a faster GDDR5X memory, and memory capacity is 8GB in size.

The article does not confirm the presence of GDDR5X. It says it could be either/or. The WCCFTech article further explains that Micron isn't expected to begin mass production of GDDR5X until the summer, meaning that if NVIDIA truly plans to launch in May, Micron will miss the launch window.

 

Unless you have a third article somewhere that states Micron will in fact be ready sooner, and that this is wrong and the "GTX 1080" will in fact use GDDR5X?

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


I don't really care since I'm waiting for the Titan, but I can see a lot of people being annoyed by the GDDR5 VRAM.

 

I know Nvidia didn't technically promise anything, but when you hear "Pascal will feature HBM2 memory", I feel like most people would expect it to be on more than one card. I know quite a few people who were waiting for Pascal instead of upgrading to a 980 Ti, expecting the high-end cards (1070, 1080) to have HBM as well.


8 minutes ago, dalekphalm said:

... I fail to see your point?

 

I just read that article on BenchLife (Translated into English), and it specifically says:

The article does not confirm the presence of GDDR5X. It says it could be either/or. The WCCFTech article further explains that Micron isn't expected to begin mass production of GDDR5X until the summer, meaning that if NVIDIA truly plans to launch in May, Micron will miss the launch window.

 

Unless you have a third article somewhere that states Micron will in fact be ready sooner, and that this is wrong and the "GTX 1080" will in fact use GDDR5X?

 

GDDR5X mass production could very well fit the timeframe of the GTX 1080 launch - it could very well be just a paper launch, with some limited availability for the tech media.


If the 1080 is around $500 or $600, then I guess I could get two and be set for a while, if their performance is on par with or a bit above an OC'd 980 Ti or Titan X. I really want the Pascal Titan, but, eh... I don't care to wait anymore.


3 hours ago, Thony said:

Good point, but that's not something we have the power to change. It would be nice to have one connector do everything, but that has yet to happen, probably in the distant future.

Really distant future... unless someone comes up with a new 'best ever standard' :P

Hi!  Please quote me in replies otherwise I may not see your response!


5 hours ago, jasonvp said:

Correct. And I agree with Mystic: WTF are they still dragging DVI around? Chuck it in the trash where it belongs, and we can have a single-slot backplate on our big cards (assuming the OEM cooler is removed and water cooling added). 3 DP 1.3 ports and an HDMI (because HDMI is also still important... /sarcasm) should be good. It'll all fit in one slot.

I'd be happy with, like, 3 Mini DP and 2 HDMI... HDMI is just too widespread in terms of use and availability, and with HDMI 2.0 it's keeping up.

Though I'm not saying I wouldn't like to just see, like, 5 Mini DP :P

Hi!  Please quote me in replies otherwise I may not see your response!


4 hours ago, byalexandr said:

That is a big card; there are only a couple of cards that are going to be higher-end than this one.

So there are, like I said. I'm gonna buy the equivalent of the 980 Ti; that won't be the 1080.

IF YOU WANT ME TO REPLY TO YOU, QUOTE MY POST.

Fire Strike Score

5820K @ 4.8GHz - 1.25v / Uncore @ 4.5GHz - 1.2v / 3000MHz G.Skill 32GB Quad Channel / Asus Rampage V Extreme / 950 Pro NVMe / Sound Blaster ZxR / 980 Ti / Windows 7

 


8 hours ago, Sakkura said:

I still doubt it will be called GTX 1080. I think 1800 is more likely. 

Or GTX Thousand_and_eighty

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


6 hours ago, jasonvp said:

Correct. And I agree with Mystic: WTF are they still dragging DVI around? Chuck it in the trash where it belongs, and we can have a single-slot backplate on our big cards (assuming the OEM cooler is removed and water cooling added). 3 DP 1.3 ports and an HDMI (because HDMI is also still important... /sarcasm) should be good. It'll all fit in one slot.

Why would you chuck the DVI standard? It may be inferior to DP and HDMI, but who wants to upgrade their monitor just because they wanted to upgrade their GPU but have an older monitor which doesn't have DP or HDMI?

Looking at my signature are we now? Well too bad there's nothing here...

What? As I said, there seriously is nothing here :)


58 minutes ago, Mystic Mistro said:

I'd be happy with, like, 3 Mini DP and 2 HDMI... HDMI is just too widespread in terms of use and availability, and with HDMI 2.0 it's keeping up.

Though I'm not saying I wouldn't like to just see, like, 5 Mini DP :P

7990 = 4 Mini DP and a DVI, so almost 5 Mini DP :P

Looking at my signature are we now? Well too bad there's nothing here...

What? As I said, there seriously is nothing here :)


Just now, Mr.Meerkat said:

Why would you chuck the DVI standard?

Real estate.  DVI ports are fricken huge, and it's ANCIENT TECH!  There's no reason to keep dragging it along with newer cards.

 

Upgrade your crappy old panels.  Use a DP->DVI adapter.  DVI needs to go away.  I like single-slot but kick-ass GPUs.

Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz display | BenQ EW3280U display

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |

