
Huge GTX 1180 leak from TechPowerUp, possibly Volta-based, 12nm FinFET node

Master Disaster

TechPowerUp has the GTX 1180 listed with lots of info...

 

Quote

The GeForce GTX 1180 is a graphics card by NVIDIA that is not released yet. Built on the 12 nm process, and based on the GV104 graphics processor, the card supports DirectX 12.0. It features 3584 shading units, 224 texture mapping units and 64 ROPs. NVIDIA has placed 16,384 MB GDDR6 memory on the card, which are connected using a 256-bit memory interface. The GPU is operating at a frequency of 1405 MHz, which can be boosted up to 1582 MHz, memory is running at 1500 MHz.


We recommend the NVIDIA GeForce GTX 1180 for gaming with highest details at resolutions up to, and including, 3840x2160 (4K).
Being a dual-slot card, the NVIDIA GeForce GTX 1180 draws power from 1x 6-pin + 1x 8-pin power connectors, with power draw rated at 200 W maximum. Display outputs include: 1x HDMI, 3x DisplayPort. GeForce GTX 1180 is connected to the rest of the system using a PCIe 3.0 x16 interface. The card measures 267 mm in length, and features a dual-slot cooling solution.

Quote

The GPU entry lists the specifications for the graphics card – most of which are identical to the leaks we have seen before. The NVIDIA GTX 1180 graphics card will be manufactured on TSMC’s 12nm FinFET process which will introduce significant power efficiency upgrades on a core to core basis.  The exact core count will be 3584 CUDA cores divided in 28 SMs. 64 ROPs and 224 TMUs make up the rest of the specifications. According to the same entry, the memory in question will be the GDDR6 variant with up to 16 GB worth of DRAM.

[Screenshot: TechPowerUp GPU database entry for the GeForce GTX 1180]

[Image: NVIDIA GeForce GTX 1180, via Wccftech]

So about the whole Volta thing: there's a chance this might be an ID error, with TPU's database detecting Turing as Volta. Remember, Turing is reportedly a process shrink of Volta, so if TPU's database hasn't been updated it's entirely possible the card is being incorrectly identified.

Quote

Here is where things get slightly murkier, however, the TPU GPU DB entry lists the card as a volta variant, which is odd considering the previous leaks have told us that the architecture in question will be called Turing. This might simply be a MachineID error where the GPU-Z tool is reading the architecture incorrectly. Turing is essentially a process shrink and optimization of the Volta architecture and will bring significant performance and power efficiencies to the next-generation graphics card.

The GTX 1180 is expected to launch around Computex for anywhere between $699 and $799.

Quote

The expected launch date for the next generation of NVIDIA Turing graphics cards is probably around the Computex time frame. Earlier reports have indicated that they could hit the shelves as soon as July which means a tease or soft launch at Computex next month might not be entirely out of the question. As far as pricing goes, keep in mind that the crypto bubble has mostly burst (as far as graphics card pricing goes anyways) so we are expecting a slightly toned down MSRP of somewhere around $699-799 but that sticker will mean nothing if retailers and e-tailers decide to charge a heavy premium for them depending on demand.

https://wccftech.com/nvidia-geforce-gtx-1180-12nm-database/

https://www.techpowerup.com/gpudb/3224/geforce-gtx-1180

 

Interesting stuff. A $699 launch price, but we all know they're going to be much more expensive than that.

 

Will you buy one?

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


I wonder if it will push the price of current Volta chips down.

Specs: Motherboard: Asus X470-PLUS TUF gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200Mhz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctura Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition


21 minutes ago, Master Disaster said:

Will you buy one?

Apart from waiting for benchmarks from Steve to show up, I would much rather get a 1170 or 1180 Ti.

The xx80 never appealed to me, tbh. The xx70 is much better value for the money, while the xx80 Ti is a good chunk more horsepower for slightly more money.

 

A few years back I would have said "I will wait for an LTT benchmark first", but they are too concerned with not stepping on anyone's toes to be trusted anymore. Maybe trust is the wrong word, but they sugar-coat results so much that it is sometimes hard to tell how a product really did.

 

Edit: Also, there is a very slight chance that AMD can offer something to compete with. The latest rumors suggest that I am just a dreamer, though.


Interesting. But are we sure this is a GTX card? 16GB of VRAM for a consumer gaming card doesn't sound right, especially in these times of inflated DRAM pricing. Then again, it could be a GTX Titan card.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


So on core count it is at parity with the 1080 Ti, so maybe only a small difference in performance depending on where clocks end up. More RAM (more cost?) but lower bandwidth... unless there is a new trick which adds value, I don't think it'll make an impact for those already at 1080 Ti level.
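For reference, here is a minimal back-of-the-envelope sketch of that bandwidth comparison, assuming the leaked 1500 MHz GDDR6 corresponds to 12 Gbps effective per pin (the usual 8x multiplier for GDDR6) and taking the 1080 Ti at its stock 11 Gbps over a 352-bit bus; treat the numbers as illustrative, not confirmed:

# Peak memory bandwidth = effective rate per pin (Gbps) * bus width (bits) / 8 bits per byte
def bandwidth_gb_s(effective_gbps: float, bus_width_bits: int) -> float:
    return effective_gbps * bus_width_bits / 8

print(bandwidth_gb_s(12, 256))  # leaked GTX 1180: 384.0 GB/s
print(bandwidth_gb_s(11, 352))  # GTX 1080 Ti:     484.0 GB/s

So on paper the narrower 256-bit bus leaves the leaked card roughly 20% short of the 1080 Ti, GDDR6 or not.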

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


4 minutes ago, Notional said:

Interesting. But are we sure this is a GTX card? 16GB of VRAM for a consumer gaming card doesn't sound right, especially in these times of inflated DRAM pricing. Then again, it could be a GTX Titan card.

Maybe due to the memory controller it has to be a "power of two" to be effective, which would make 8GB or 16GB the options. Maybe they feel 8GB is too low for the early flagship 11-series card? Especially given the 1080 Ti and Titan Xp are at 11GB and 12GB already. I suppose on that note, a split launch of a Titan at 16GB and an 1180 at 8GB could make sense, with the 1180 priced somewhere between the 1080 Ti and the 1080.
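As a quick sketch of that capacity logic (assuming one 32-bit GDDR6 package per channel on a 256-bit bus and the common 8 Gb / 16 Gb chip densities; none of this is confirmed for the card itself):

BUS_WIDTH_BITS = 256
CHIP_WIDTH_BITS = 32  # one GDDR6 package per 32-bit slice of the bus
packages = BUS_WIDTH_BITS // CHIP_WIDTH_BITS  # 8 packages
for density_gbit in (8, 16):                  # common GDDR6 chip densities
    print(f"{density_gbit} Gb chips -> {packages * density_gbit // 8} GB total")
# 8 Gb chips -> 8 GB total
# 16 Gb chips -> 16 GB total

Which is why a 256-bit card naturally lands on 8 GB or 16 GB rather than anything in between, clamshell or mixed-density configurations aside.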

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


8 minutes ago, porina said:

So on core counts it is on parity with 1080Ti so maybe only a small difference in performance depending on where clocks end up. More ram (more cost?) but lower bandwidth... unless there is a new trick which adds value, I don't think it'll make an impact for those already at 1080Ti level.

Considering that the Titan V performs just about 20%-30% better than the 1080 Ti in games, the GTX 1180 will probably sit at or just above Titan Xp performance with a lower TDP. Just my speculation; I agree that this probably won't affect us people with a 1080 Ti... well, at least until ray tracing becomes a thing.


3 minutes ago, porina said:

Maybe due to the memory controller it has to be a "power of two" to be effective, which would make 8GB or 16GB the options. Maybe they feel 8GB is too low for the early flagship 11-series card? Especially given the 1080 Ti and Titan Xp are at 11GB and 12GB already. I suppose on that note, a split launch of a Titan at 16GB and an 1180 at 8GB could make sense, with the 1180 priced somewhere between the 1080 Ti and the 1080.

That could definitely be the case. However, Nvidia tends to skimp on memory on their GPUs. Think the 3GB 780 vs. the 4GB 290, or the 3GB 1060 vs. the 4GB 580 (that no one should buy). Heck even the 3½GB 970. Nvidia has improved on this matter though, like the 1080 Ti with 11GB. We will have to wait and see. Personally, I prefer GPUs to have a lot of VRAM, so we can have high-quality textures.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


1 minute ago, TheRandomness said:

Lmao

 

Read the part that says ‘This is just a placeholder’

Don't you dare stop the hype train.

We have been waiting for new GPUs for ages. Let us have some test tears already!


5 minutes ago, Notional said:

That could definitely be the case. However, Nvidia tends to skimp on memory on their GPUs. Think the 3GB 780 vs. the 4GB 290, or the 3GB 1060 vs. the 4GB 580 (that no one should buy). Heck even the 3½GB 970. Nvidia has improved on this matter though, like the 1080 Ti with 11GB. We will have to wait and see. Personally, I prefer GPUs to have a lot of VRAM, so we can have high-quality textures.

1180 Ti 8GB (15% fewer SMs) & 1180 Ti 16GB, cos Nvidia ;)


Just now, leadeater said:

1180 Ti 8GB (15% fewer SMs) & 1180 Ti 16GB, cos Nvidia ;)

Well the first one would just be the 1180 non Titanium :P

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


5 minutes ago, WereCat said:

Considering that the Titan V performs just about 20%-30% better than the 1080 Ti in games, the GTX 1180 will probably sit at or just above Titan Xp performance with a lower TDP. Just my speculation; I agree that this probably won't affect us people with a 1080 Ti... well, at least until ray tracing becomes a thing.

Titan V has 5120 cores vs the 3584 mentioned here. Without considering clocks, that's a 43% brute-force advantage. I guess we don't have much to compare against; has anyone tried to work out a "gaming IPC" between Volta and Pascal?
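(A quick check of that arithmetic, purely as a sketch:)

titan_v_cores, leaked_1180_cores = 5120, 3584
print(f"{titan_v_cores / leaked_1180_cores - 1:.0%}")  # prints 43%, the Titan V's raw core-count advantage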

 

Just now, Notional said:

That could definitely be the case. However, Nvidia tends to skimp on memory on their GPUs. Think the 3GB 780 vs. the 4GB 290, or the 3GB 1060 vs. the 4GB 580 (that no one should buy).

I think where they go 3GB/6GB, it's due to the way they handle memory, having something comparable to 3-channel (or multiples thereof).

 

Just now, Notional said:

Heck even the 3½GB 970.

As they said in the lawsuit, they offered 4GB, you got 4GB. Never mind 0.5GB was slower to access. :) 

 

Just now, Notional said:

Nvidia has improved on this matter though, like the 1080 Ti with 11GB. We will have to wait and see. Personally, I prefer GPUs to have a lot of VRAM, so we can have high-quality textures.

Depends on game support... unless you're into heavily modding games, how much do you really need?

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


4 minutes ago, Notional said:

Well the first one would just be the 1180 non Titanium :P

Well, there was a much bigger SM count difference this generation, but I wouldn't put it past Nvidia to pull a 1060 again on higher-end cards this time around; the 1070 Ti was only half a slap in the face compared to that.


1 minute ago, porina said:

I think where they go 3GB/6GB, it's due to the way they handle memory, having something comparable to 3-channel (or multiples thereof).

 

As they said in the lawsuit, they offered 4GB, you got 4GB. Never mind 0.5GB was slower to access. :) 

 

Depends on game support... unless you're into heavily modding games, how much do you really need?

Yeah, the memory controller dictates the amount of VRAM for sure.

Indeed, but they lost that lawsuit, so it doesn't really matter what they said.

 

That's what gamers said back in the 780 days, heck even in the 680 days, when AMD's offerings had 1 GB of VRAM extra. That extra GB came in handy a lot. I will admit that over 8GB doesn't seem to make a lot of sense right now. But then again, we don't know what industry push the next-gen consoles will bring. Last time, the PS4 and XBone got almost 12 times the VRAM of the previous gen. That's why we saw a game like Watch Dogs, a true next-gen game, have high-res textures and a lot more unique textures on screen at any given time. Suddenly you needed 4GB of VRAM, and 780 users got shafted as a result. VRAM is just a silly thing to run out of.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


3 minutes ago, leadeater said:

Well, there was a much bigger SM count difference this generation, but I wouldn't put it past Nvidia to pull a 1060 again on higher-end cards this time around; the 1070 Ti was only half a slap in the face compared to that.

Indeed, but that is also partly because the 1080 and 1080 Ti are two vastly different chips, and not just the same chip cut down.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


1 minute ago, porina said:

Titan V has 5120 cores vs the 3584 mentioned here. Without considering clocks, that's a 43% brute-force advantage. I guess we don't have much to compare against; has anyone tried to work out a "gaming IPC" between Volta and Pascal?

 

I think where they go 3GB/6GB, it's due to the way they handle memory, having something comparable to 3-channel (or multiples thereof).

 

As they said in the lawsuit, they offered 4GB, you got 4GB. Never mind 0.5GB was slower to access. :) 

 

Depends on game support... unless you're into heavily modding games, how much do you really need?

"IPC" is lower on Volta than Pascal. 

 

Volta is squarely an enterprise architecture. Anybody who says otherwise can be pointed to /dev/null.

 

It's meant for the growing neural networking space (hence the "Tensor cores"), and that's where 70% of Nvidia's profit on the Titan V and Tesla V100 will come from.

 

There will be a second architecture for consumers, and Nvidia has been heading that way for a while now. We haven't gotten top-tier silicon on day one for years, and with the rise of GPGPU in the enterprise space we will never get top-tier silicon on day one.

idk


Looks great from Nvidia.

 

But I hope TSMC's 12nm FinFET capacity can handle the demand of miners + gamers.


So a slight improvement, not surprising considering there's no competition at the high end.

 

Since 4K is now within reach with one GPU, I'm guessing they are going for more add-in modules than raw horsepower.


12 minutes ago, leadeater said:

1180 Ti 8GB (15% fewer SMs) & 1180 Ti 16GB, cos Nvidia ;)

Hey am I allowed to mark my post as an important post so it can be displayed at the top? :P Like, seriously, it says right at the bottom of the post.

[Screenshot: the TechPowerUp entry noting it is just a placeholder]

USEFUL LINKS:

PSU Tier List F@H stats


10 minutes ago, Notional said:

Indeed, but that is also partly because the 1080 and 1080 Ti are two vastly different chips, and not just the same chip cut down.

Yep, but I meant literally another 1060, just on the 1180 Ti ;)


6 minutes ago, TheRandomness said:

Hey am I allowed to mark my post as an important post so it can be displayed at the top? :P Like, seriously, it says right at the bottom of the post.

Dunno, I'm leaning more towards locking it myself, since this really isn't anything at all.


2 minutes ago, leadeater said:

Yep, but I meant literally another 1060, just on the 1180 Ti ;)

Ah, well, that would not surprise me. But that used to be the x80 Ti and Titan variants :P

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


6 minutes ago, Notional said:

Ah, well, that would not surprise me. But that used to be the x80 Ti and Titan variants :P

lol, actually a very good point, my mind blanked on that. They might need to now, with the Titan V being what it is.


If I had the money, I would definitely buy the 1180 instead of the 1080, because I want a next-gen card as soon as possible. Realistically, I'm waiting for the 1160 or 1170.

