
Nvidia Announces Titan V (Volta!)

Max_Settings
32 minutes ago, William Payne said:

Nobody has said the card will suck at gaming, it's just that Nvidia never intended it to be a gaming card, and the only people who want it to be the world's most amazing gaming card are gamers.

It's that whole sledgehammer-to-open-a-walnut scenario: sure it works, but that was never the intended use of the sledgehammer.

I am sure gamers will buy it, and I bet you they will want to SLI it and all that, and then I am sure they will complain to Nvidia that they left off SLI.

Nvidia will say "it was never meant for gamers", which is a true statement, and then gamers will just complain and say it's all Nvidia's fault.

I wasn't implying anything, just giving information on benchmarks. The more information we have about a product the better, no matter what the product is. I wasn't trying to insinuate that people thought it would suck, or that I'm defending the card. I would rather have information that we may not need/want than need/want information and not have it.

I personally don't care if someone buys it with the intention of SLI. That's on them for buying a piece of hardware (for $3000, no less) without informing themselves, but again, that's just my opinion.

Anyway, I'm sorry if my lack of input on the articles made it seem as though I was trying to be the "smart" one playing the "ha, gotcha" card. That was never my intention; as I already stated, I just wanted more information. For everyone.


1 minute ago, Dylanc1500 said:

I wasn't implying anything, just giving information on benchmarks. The more information we have about a product the better, no matter what the product is. I wasn't trying to insinuate that people thought it would suck, or that I'm defending the card. I would rather have information that we may not need/want than need/want information and not have it.

I personally don't care if someone buys it with the intention of SLI. That's on them for buying a piece of hardware (for $3000, no less) without informing themselves, but again, that's just my opinion.

Anyway, I'm sorry if my lack of input on the articles made it seem as though I was trying to be the "smart" one playing the "ha, gotcha" card. That was never my intention; as I already stated, I just wanted more information. For everyone.

No, no, sorry, I wasn't implying anything towards you. I was referring more to the benchmarks.


1 minute ago, William Payne said:

No, no, sorry, I wasn't implying anything towards you. I was referring more to the benchmarks.

Oh, ok. Well, honestly my only question is, since the card is practically identical, whether it would be possible to enable NVLink on them. Someone might figure it out. Time will tell, I suppose.


Yeah, 3 grand is a bit much considering that's getting into Quadro territory big time.

Though, that is one big arse die. I thought the dies on my OG Titans, back when I had a pair of those, were big. I sure would like to see the folding output of such a card, just for the giggles. :P



On 12/8/2017 at 5:28 AM, Master Disaster said:

Sure. NVIDIA have been selling Volta for months already; big server farms doing machine learning have been hands-on with Volta since at least spring.

As I'm sure you know, the cost of these machine learning cards is phenomenal; $3K for the Titan V is chump change when some of the machine learning stuff costs 5 figures.

Moving on to mainstream retail, NVIDIA already have the current fastest card in existence, the Titan Xp.

Basic business says that if you already have the fastest thing on the market, you don't go replacing it with something even faster until your competition catches you (or, as is the case with Vega, even gets into the same league as you). All Nvidia has done is devalue Volta ($10K+ for the machine learning parts down to $3K for the retail part), devalue the Titan Xp (as no one is going to buy it at full price when it's been replaced), and create a situation where they're competing against themselves.

Will NVIDIA move more stock in the retail sector? At $3K a pop, I'm not actually sure they will move much more than they did in the machine learning sector, and remember, each unit is up to 3x cheaper at retail than in the business sector.

NVIDIA has nothing to gain from releasing Volta right now; I legitimately can't see an upside for them. They could very easily have sat on Volta for another 12 months, kept on raking in the machine learning money, and consumers wanting the fastest thing available would still be buying the Titan Xp anyway.

Still, I'm not suggesting I know their industry better than they do. I'm sure they have their reasons; I just don't understand what those reasons are.

Ah, I hadn't considered the machine learning cards. My guess would be that while they can charge extreme premiums for those machine learning cards, the sales from them represent a very small portion of their revenue compared to the consumer GPU space. So while, yes, they could continue to sell their 10-series cards for the next 2-3 years while AMD plays catch-up, I imagine releasing 2 new product generations in that time will be worthwhile given the number of current 10-series customers who will upgrade to the latest thing.

I know I'm itching to upgrade; my 1080 can't hit 100fps in a lot of games at 3440x1440 on my X34. With the new 1440p ultrawides coming out at 200Hz, and presumably even higher-res VR headsets coming next year, there is demand for far more GPU horsepower than we currently have. Not to mention Intel is coming for Nvidia's ass in the high-end laptop market with their new CPU/GPU collab with AMD that packs GTX 1060-level performance into an ultrathin.

You rest on your laurels and you risk getting caught with your pants down like Intel did with Ryzen. Plus, there's gonna be huge money in self-driving cars in the next 10 years; why take that risk just to milk the current generation?


10 minutes ago, Dolt said:

My guess would be that while they can charge extreme premiums for those machine learning cards, the sales from them represent a very small portion of their revenue compared to the consumer GPU space.

Wow. No, just no. The vast majority of Nvidia's growth over the past 3-4 years has come pretty much explicitly from various machine learning efforts.


On 12/10/2017 at 1:59 PM, ravenshrike said:

Wow. No, just no. The vast majority of Nvidia's growth over the past 3-4 years has come pretty much explicitly from various machine learning efforts.

Really? What are the most profitable sectors of their business, in order of profitability? Are there any easy-to-follow investor graphs direct from Nvidia that break it down? I've always been curious how important consumer GPUs are to their bottom line.


51 minutes ago, Dolt said:

Really? What are the most profitable sectors of their business, in order of profitability? Are there any easy-to-follow investor graphs direct from Nvidia that break it down? I've always been curious how important consumer GPUs are to their bottom line.

Well, it's anecdotal, but look at it this way: servers doing machine learning might use 100 cards at $10K a pop.

Also, remember Nvidia are pushing hard into automotive AI too.

 

They have their fingers deep in many different pies.



oh so it was all about Volta?



I got so excited now :D. I'm doing my master's in machine learning and can't stop drooling over those 640 tensor cores now :x

Such a shame I upgraded my system just three weeks ago, not that I could afford a Titan V if I had waited a couple of weeks. But still, now my 1080 Ti seems useless in comparison. :P Anyone else feel this way?


On 12/7/2017 at 10:52 PM, RotoCoreOne said:

$3000... rip wallet

110 TFLOPS, though. Jesus, my GTX 1080 is only 9; its price to performance is still better lol.

Here's to hoping the 1180 is 20-25 TFLOPS, because that should do 4K 60fps Ultra easy. The 1080 Ti is close to doing it in a lot of games, and that would double it and then some.


4 hours ago, Umbriona said:

I got so excited now :D. I'm doing my master's in machine learning and can't stop drooling over those 640 tensor cores now :x

Such a shame I upgraded my system just three weeks ago, not that I could afford a Titan V if I had waited a couple of weeks. But still, now my 1080 Ti seems useless in comparison. :P Anyone else feel this way?

Just got a 1080 a few weeks ago, and I figured, whatever, it's probably over a year away or close to it, and the new 1100 series probably won't completely wreck my card. But that's more than 10x the power of my card; even if the 1180 is 1/5th as powerful as this Titan, it will be pretty sick for people with 4K 60Hz and 1440p ultrawide 144Hz.


1 hour ago, michaelocarroll007 said:

110 TFLOPS, though. Jesus, my GTX 1080 is only 9; its price to performance is still better lol.

Here's to hoping the 1180 is 20-25 TFLOPS, because that should do 4K 60fps Ultra easy. The 1080 Ti is close to doing it in a lot of games, and that would double it and then some.

Is this why a bunch of kids are spouting that the Ti will be better? That's machine learning FLOPS. The FLOPS relevant to gaming/compute are only a couple more than a Titan Xp's.
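To make the distinction concrete, here is a rough sketch of where each headline number comes from. The core counts and the ~1455 MHz boost clock are the published Titan V specs; the exact clock behind Nvidia's 110 TFLOPS marketing figure is an assumption.

```python
# Peak throughput estimates for the Titan V (published specs assumed:
# 5120 CUDA cores, 640 tensor cores, ~1455 MHz boost clock).
def fp32_tflops(cuda_cores, clock_ghz):
    # Each CUDA core retires one fused multiply-add (2 FLOPs) per clock.
    return cuda_cores * clock_ghz * 2 / 1000

def tensor_tflops(tensor_cores, clock_ghz):
    # Each tensor core does a 4x4x4 matrix FMA: 64 multiply-adds = 128 FLOPs per clock.
    return tensor_cores * clock_ghz * 128 / 1000

print(f"FP32:   {fp32_tflops(5120, 1.455):.1f} TFLOPS")   # the gaming-relevant number
print(f"Tensor: {tensor_tflops(640, 1.455):.1f} TFLOPS")  # the marketing-class number
```

At the full boost clock the tensor math comes out closer to ~119 TFLOPS; Nvidia's quoted 110 would imply a somewhat lower sustained clock, so treat the per-clock model above as a ceiling, not a benchmark.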


1 hour ago, TenThousand said:

Is this why a bunch of kids are spouting that the Ti will be better? That's machine learning FLOPS. The FLOPS relevant to gaming/compute are only a couple more than a Titan Xp's.

I don't know why anyone is saying the Ti will be better. But yeah, you're right, I was wrong about the TFLOPS; I didn't see the different figures and screwed up. Looks like it's just a hair short of 15 TFLOPS compared to the 1080 Ti at 12 TFLOPS (both rounded up) in terms of gaming performance. I'd hope to see the 1180, or whatever name it's given, beat this Titan in gaming, personally. I really think that for a 4K card to last 2-3 years of usability it will need around 18-22 TFLOPS (when comparing a similar arch to the current gen).

Buying one this year with 10-12 gets you decent 4K 60fps on high, some games ultra. But as that gets bumped up each year, a 1080 and 1080 Ti won't last 2-3 years at near-max settings at 4K. At least that's my guess.
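The per-card figures being thrown around in this exchange can be reproduced from a simple formula. The core counts and boost clocks below are the public spec-sheet values for each card; real clocks vary with boost behavior, so this is a paper estimate only.

```python
# Peak FP32 TFLOPS = shader cores x boost clock (GHz) x 2 (an FMA counts as 2 FLOPs).
# Core counts and boost clocks are the public spec-sheet values.
cards = {
    "GTX 1080":    (2560, 1.733),
    "GTX 1080 Ti": (3584, 1.582),
    "Titan Xp":    (3840, 1.582),
    "Titan V":     (5120, 1.455),
}
peak = {name: cores * ghz * 2 / 1000 for name, (cores, ghz) in cards.items()}
for name, tflops in peak.items():
    print(f"{name}: {tflops:.1f} TFLOPS")
```

This lines up with the "just short of 15 vs 12" comparison above: the Titan V's FP32 lead over a 1080 Ti is real but nowhere near the 110 TFLOPS headline.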


1 hour ago, TenThousand said:

Is this why a bunch of kids are spouting that the Ti will be better? That's machine learning FLOPS. The FLOPS relevant to gaming/compute are only a couple more than a Titan Xp's.

Unless somehow the game can use FP16 FMAs.

Still, it feels funny that gaming is going back to FP16.

 

1 minute ago, michaelocarroll007 said:

I don't know why anyone is saying the Ti will be better. But yeah, you're right, I was wrong about the TFLOPS; I didn't see the different figures and screwed up. Looks like it's just a hair short of 15 TFLOPS compared to the 1080 Ti at 12 TFLOPS (both rounded up) in terms of gaming performance. I'd hope to see the 1180, or whatever name it's given, beat this Titan in gaming, personally. I really think that for a 4K card to last 2-3 years of usability it will need around 18-22 TFLOPS (when comparing a similar arch to the current gen).

The FLOPS rating should be taken with a grain of salt, though. GCN has been more powerful in terms of FLOPS than whatever NVIDIA offers, but it doesn't show.
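The GCN point above is easy to put in numbers: an RX Vega 64 has a substantially higher paper FP32 rating than a GTX 1080 yet roughly trades blows with it in games. Specs below are the public shader counts and boost clocks; the comparison is illustrative, not a benchmark.

```python
# Paper FLOPS vs delivered frames: Vega 64 leads a GTX 1080 by ~43% on the
# spec sheet, yet the two perform roughly evenly in most games.
vega64  = 4096 * 1.546 * 2 / 1000   # ~12.7 TFLOPS (4096 shaders @ 1546 MHz boost)
gtx1080 = 2560 * 1.733 * 2 / 1000   # ~8.9 TFLOPS (2560 shaders @ 1733 MHz boost)
print(f"Vega 64 paper advantage: {vega64 / gtx1080 - 1:.0%}")
```

Which is exactly why same-architecture comparisons (1080 vs 1080 Ti) track FLOPS reasonably well, while cross-architecture ones (GCN vs Pascal) don't.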


2 minutes ago, M.Yurizaki said:

Unless somehow the game can use FP16 FMAs.

Still, it feels funny that gaming is going back to FP16.

The FLOPS rating should be taken with a grain of salt, though. GCN has been more powerful in terms of FLOPS than whatever NVIDIA offers, but it doesn't show.

Yeah, they should, but when it's all the info we have, that's about it. And if you look at the same arch, like the 10 series, it gives pretty decent estimates. Volta could be completely different and TFLOPS could be a bad comparison, but we won't know that till it's out; it's rumored to not be crazy different architecturally, Volta that is.


18 minutes ago, michaelocarroll007 said:

Yeah, they should, but when it's all the info we have, that's about it. And if you look at the same arch, like the 10 series, it gives pretty decent estimates. Volta could be completely different and TFLOPS could be a bad comparison, but we won't know that till it's out; it's rumored to not be crazy different architecturally, Volta that is.

Volta has a whitepaper, so it's pretty much known what the architecture is like. And considering that the consumer versions of the Gx-100 series GPUs are the same minus the FP64 units (and probably other stuff), I wouldn't be surprised if GV-104 is basically GP-104 with four more SMs per GPC, just as GV-100 is to GP-100, from a high-level point of view.

EDIT: On that note, if GV-104 ends up exactly as I described, then the next 80-series video card should have a 50% performance bump over the GTX 1080 at minimum, unless they clock it lower than the base frequency of the 1080.
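A back-of-envelope version of that speculation: scale GP-104 by the same full-die core-count factor that GV-100 has over GP-100 (whitepaper numbers). GV-104 did not exist at the time of this thread, so every GV-104 figure here is a guess, not a spec.

```python
# Hypothetical GV-104 sizing, scaled from the published GV-100/GP-100 configs.
gv100_cores, gp100_cores = 5376, 3840   # full-die FP32 core counts (whitepapers)
gp104_cores = 2560                      # fully enabled GTX 1080 die
gv104_cores = gp104_cores * gv100_cores // gp100_cores  # speculative
print(gv104_cores, f"+{gv104_cores / gp104_cores - 1:.0%} FP32 at equal clocks")
```

That comes out to ~40% more shader throughput at equal clocks, in the neighborhood of the 50%-minimum estimate once any clock or per-SM efficiency gains are added on top.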


2 hours ago, Misanthrope said:

This card is punching at Dual 1080/Vega64 numbers, dayum son I wish I had 3k to waste:

It appears to be scaling directly with the increased CUDA core count, so there's no apparent difference in architectural performance.


4 hours ago, Misanthrope said:

This card is punching at Dual 1080/Vega64 numbers, dayum son I wish I had 3k to waste:

Interesting; it seems it only gets those large performance gains over the Xp in DX12/Vulkan games. DX11 games see much smaller performance increases.


It seems someone out there may have their hands on a Titan V and is testing it in the real world: some ~26% faster than a 1080 Ti, at least in the benchmark suite below, albeit with kind of old software now. Makes me think this is similar to Nvidia's previous GPU refreshes/launches of new generations.

 

https://www.videocardbenchmark.net/high_end_gpus.html 


1 hour ago, leadeater said:

Interesting; it seems it only gets those large performance gains over the Xp in DX12/Vulkan games. DX11 games see much smaller performance increases.

Yes, apparently this is a result of Nvidia finally having proper async compute, but there's still not a lot for DX11. Though since this is a "non-gamer" card, if they did decide to release a 2080 specifically for gamers (no doubt a bit cut down), they could claim back some of those gains on DX11 as well, though probably not as dramatically.



32 minutes ago, Misanthrope said:

Yes, apparently this is a result of Nvidia finally having proper async compute, but there's still not a lot for DX11. Though since this is a "non-gamer" card, if they did decide to release a 2080 specifically for gamers (no doubt a bit cut down), they could claim back some of those gains on DX11 as well, though probably not as dramatically.

That's what I was thinking, somewhat.

I wonder if we will ever see more Voltas, or if they're going to try to push MCM fast.

