
NVIDIA Announces GP102-based TITAN X with 3,584 CUDA cores

RZeroX
1 minute ago, Hunter259 said:

 There was one picture of a Titan X Arctic Storm but that never happened.

 

Sounds nice.


Just now, SamStrecker said:

Yeah good ole marketing. #W84Vega #W84Volts

Exactly, and we don't even know whether that 60% applies to a certain workload, or to just one specific game with settings tweaked to perfection, or whether performance per watt is included in the calculation. We can get better estimates by looking at previous GPU releases and the specs we've been given.


1 hour ago, done12many2 said:

 

I would definitely like to see a 10 series Ti, but I'm not sure that we'll be getting one this go around. 

I think that will only happen if AMD has something to beat the 1080 at a competitive price. Otherwise there will be no Ti card, I agree.


1 hour ago, MageTank said:

I stand corrected. It's odd though, as I have searched and could only find EVGA doing this, or "sold separately" shrouds/water blocks. I do wonder just how much extra this will add to the card's already expensive ($1,200) price tag, and how much additional performance will be obtained from it. Guess we will know come August 2nd, when people inevitably slap water blocks on the Titan XP.

If someone has pockets deep enough to afford a $1,200 graphics card, an extra $100 water block won't be a big deal for them.


Just now, Deli said:

If someone has pockets deep enough to afford a $1,200 graphics card, an extra $100 water block won't be a big deal for them.

You know, I used to always believe that myself, until I saw people buying FX-9590s, putting them into $80-$100 970 motherboards, and complaining about fires. Now I've learned to never underestimate the combined greed and stupidity of some consumers.


59 minutes ago, MageTank said:

You know, I used to always believe that myself, until I saw people buying FX-9590s, putting them into $80-$100 970 motherboards, and complaining about fires. Now I've learned to never underestimate the combined greed and stupidity of some consumers.

Learning new things every day. xD


53 minutes ago, MageTank said:

It's amazing how they make this claim, saying it "leaves the 10 series in the dust", without having any of the performance metrics. They assume "40% more CUDA cores = much faster" and make bold statements without facts to back them up. Nvidia already came out and said the GTX Titan XP is "up to" 60% faster than the previous Titan X. The previous Titan X is only 20-30% slower than the GTX 1080. The transitive property dictates that the new Titan XP will only be 20-30% faster than the GTX 1080, using Nvidia's own information.

 

Remember, people: core count does not scale linearly if clock speeds change. Something might have 40% more cores, but if there is even a slight difference in clock speed, it will not automatically yield 40% more performance.

There's also the matter of having nearly 80% higher bandwidth though.
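As a quick sanity check on the arithmetic quoted above, here is a minimal sketch using the figures cited in this thread plus the published reference specs (the boost clocks are spec-sheet values; real cards clock higher, so treat the result as a rough estimate, not a benchmark):

```python
# Rough check of the "transitive property" estimate and the core-count scaling point.
# All inputs are thread figures or published reference specs, not measured results.

titan_xp_vs_titan_xm = 1.60   # Nvidia's "up to 60% faster" claim vs. the Maxwell Titan X
gtx_1080_vs_titan_xm = 1.30   # thread figure: GTX 1080 roughly 30% faster than the Maxwell Titan X

best_case = titan_xp_vs_titan_xm / gtx_1080_vs_titan_xm
print(f"Best-case Titan XP vs GTX 1080: {best_case:.2f}x (~{(best_case - 1) * 100:.0f}% faster)")

# Why "40% more cores" != "40% faster": shader throughput scales with cores * clock,
# and the Titan XP's reference boost clock is lower than the GTX 1080's.
cores_xp, cores_1080 = 3584, 2560        # CUDA cores
boost_xp, boost_1080 = 1531, 1733        # reference boost clocks, MHz
naive = (cores_xp * boost_xp) / (cores_1080 * boost_1080)
print(f"Cores x clock scaling: {naive:.2f}x, not the 1.40x the core count alone would suggest")
```

Both estimates land in the low-to-mid 20% range, which lines up with the quoted post.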



35 minutes ago, LeapFrogMasterRace said:

Don't forget the 1080 Ti will come out in the future, and overclocked AIB designs will run cooler, quieter and faster for around $900. I love how stupid the press is. Did anyone see Ars Technica's headline, "Nvidia unveils new GTX Titan X: 11 teraflops, 12GB GDDR5X, JUST $1,200"?

 

Looks like they just copied Nvidia's description of it without knowing what they're talking about, as if GDDR5X on a $1,200 card were a good thing anyway.

The whole "Just $1200" could've been verbal irony.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


3 minutes ago, patrickjp93 said:

There's also the matter of having nearly 80% higher bandwidth though.

Compared to the Titan X? Or the GTX 1080? The new Titan XP has 480 GB/s of memory bandwidth. The GTX 1080 has 320 GB/s, a difference of 50%. The Maxwell Titan X had 336.5 GB/s, so the Titan XP has nearly 43% more. Have we reached a situation where the general consumer is limited by memory bandwidth? I have yet to personally run into that limitation on a GTX 980 Ti, but then again, I only run 1440p at 120 Hz.
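For reference, those bandwidth figures fall straight out of the memory data rate and bus width. A minimal sketch using the published reference specs (the Maxwell Titan X is nominally 7 GT/s, hence the small rounding difference from the 336.5 GB/s above):

```python
# Memory bandwidth = effective data rate (GT/s) * bus width (bits) / 8 bits per byte.
cards = {
    "Titan X (Pascal)":  (10.0, 384),  # GDDR5X
    "GTX 1080":          (10.0, 256),  # GDDR5X
    "Titan X (Maxwell)": (7.0, 384),   # GDDR5
}

bandwidth = {name: rate * bus / 8 for name, (rate, bus) in cards.items()}
for name, gbs in bandwidth.items():
    print(f"{name:<18} {gbs:6.1f} GB/s")

xp = bandwidth["Titan X (Pascal)"]
print(f"vs GTX 1080:          +{(xp / bandwidth['GTX 1080'] - 1) * 100:.0f}%")
print(f"vs Titan X (Maxwell): +{(xp / bandwidth['Titan X (Maxwell)'] - 1) * 100:.0f}%")
```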


1 hour ago, MageTank said:

It's amazing how they make this claim, saying it "leaves the 10 series in the dust", without having any of the performance metrics. They assume "40% more CUDA cores = much faster" and make bold statements without facts to back them up. Nvidia already came out and said the GTX Titan XP is "up to" 60% faster than the previous Titan X. The previous Titan X is only 20-30% slower than the GTX 1080. The transitive property dictates that the new Titan XP will only be 20-30% faster than the GTX 1080, using Nvidia's own information.

 

Remember, people: core count does not scale linearly if clock speeds change. Something might have 40% more cores, but if there is even a slight difference in clock speed, it will not automatically yield 40% more performance.

Wow shows how much you know. Don't you know that GameWorks gives Nvidia an unfair advantage against AMD and somehow makes their cards almost always outperform AMD's cards? I wish AMD would just release their passively cooled RX 495x2 3.5ghz Black Swag Edition with 420 jigabites of high band with vroom ram


23 minutes ago, MageTank said:

Compared to the Titan X? Or the GTX 1080? The new Titan XP has 480 GB/s of memory bandwidth. The GTX 1080 has 320 GB/s, a difference of 50%. The Maxwell Titan X had 336.5 GB/s, so the Titan XP has nearly 43% more. Have we reached a situation where the general consumer is limited by memory bandwidth? I have yet to personally run into that limitation on a GTX 980 Ti, but then again, I only run 1440p at 120 Hz.

Not so much on the 1080 with GDDR5X, but people are finding pretty big performance bumps from overclocking the memory on 1060s and 1070s, with no apparent upper limit to the gains. In other words, they haven't been able to overclock the memory far enough to see diminishing returns yet. That has nothing to do with the Titan X, though; it's just something people are noting about the lower-tier cards. My 1070 sees substantial gains at 9,200 MT/s vs the stock 8,008.
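The memory overclock described there is easy to put in bandwidth terms. A minimal sketch, assuming the 1070's reference 256-bit bus (raw bandwidth only; actual frame-rate gains will be smaller):

```python
# GTX 1070 memory overclock: raw bandwidth at the stock 8,008 MT/s vs. the 9,200 MT/s above.
bus_width_bits = 256
for label, mts in [("stock 8008 MT/s", 8008), ("OC 9200 MT/s", 9200)]:
    gb_per_s = mts * bus_width_bits / 8 / 1000   # MT/s * bits -> Mbit/s, /8 -> MB/s, /1000 -> GB/s
    print(f"{label}: {gb_per_s:.0f} GB/s")
print(f"Raw bandwidth uplift: {(9200 / 8008 - 1) * 100:.1f}%")
```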


8 minutes ago, Kloaked said:

I wish AMD would just release their passively cooled RX 495x2 3.5ghz Black Swag Edition with 420 jigabites of high band with vroom ram

Is that also going to use the closed loop LN2 cooler they are designing for big Vega?


 


40 minutes ago, ace_cheaply said:

Not so much on the 1080 with GDDR5X, but people are finding pretty big performance bumps from overclocking the memory on 1060s and 1070s, with no apparent upper limit to the gains. In other words, they haven't been able to overclock the memory far enough to see diminishing returns yet. That has nothing to do with the Titan X, though; it's just something people are noting about the lower-tier cards. My 1070 sees substantial gains at 9,200 MT/s vs the stock 8,008.

That actually makes sense. Take the 1070, for example: 256 GB/s of bandwidth with the same 8GB frame buffer as the 1080. It's probably having trouble fully saturating that buffer, so more bandwidth lets it be utilized properly. Granted, I don't really know. I don't have a 1070, I'm certainly no expert on VRAM (it behaves differently from normal system memory), and I'm simply talking out of my rear on this subject.

 

I just know that at least with my 980 Ti, overclocking the VRAM only really helped in benchmarks. I saw no tangible difference in my games. 


20 hours ago, Notional said:

When NVidia made Kepler, they were surprised by how underwhelming AMD's cards were. The result was that Kepler not only launched with smaller chips than they had expected to make, but also ended up powering two entire lines of cards, the 600 and 700 series. That is also why the 700 series simply rebranded the 680 as the 770. Since AMD didn't have anything faster than the 680, NVidia decided to launch the first Titan card as a fairly cut-down GK110 chip. It was a lot faster than anything else on the market, which led NVidia to create a new price bracket of $1,000. They could do this because there was no competition, so they price skimmed the hell out of the chip.

Three months later (May 2013) they released a further cut-down GK110 chip in the form of the 780, which got a "normal" price bracket and a normal moniker in the 700 series.

Then, in October 2013, AMD launched the 290X, which was faster than both the Titan and the 780. The Titan was dead in the water. NVidia swiftly launched yet another GK110 card, but less cut down than the original Titan. New king-of-the-hill card, new high price, but not a $1,000 Titan price, as the competing 290X was too close in performance.

Three months later they released the Titan Black, a full GK110 chip. Again, no competition, so a new $1,000 Titan price.

 

20 hours ago, Notional said:

If we look only at the x80 models, the 780 launched at $649

 

It's nice that you're trying to paint a nice little picture for us, but throwing facts into a story does not prove the rest of the story is true. I could include a true statement ("Nvidia released the 580 in 2010") in a story like "Nvidia released the 580 in 2010 because they are part of the KKK", but having a fact in there doesn't make the rest of it true. You're stating facts, but drawing conclusions from them with no logic or proof.

 

And why are you so obsessed with what chip they used? If, at the x80 level, the consumer saw the same 20-30% jump in performance, then it's irrelevant and your argument is invalid. You keep claiming to be looking at things from the consumer's point of view, but if you're getting the same thing you always have, what difference does it make that Nvidia found a more cost effective way to do it?

I'll concede, at $649, the 780 was overpriced on launch... But that's a single outlier over 6 generations.

 

To say "Nvidia has been bumping up the prices for years" or "the 10 series GPUs are overpriced" is entirely inaccurate.

 

21 hours ago, Notional said:

You have talked about the customer's perceived value, but that is nothing but marketing, pricing and performance rolled into one hefty marketing campaign.

You were the one who brought up perceived value (you also included a chart for us)... I've been actually calculating value.

Based on the assumptions we were operating under at the time, I proved, using flawless logic, that the Titan XP was actually not overpriced. And despite that, you kept stating that it was overpriced based on nothing.

"It is the most expensive yet, therefore it is overpriced" is still an invalid logical statement... Made especially worse by the fact that I had already logically proved it wasn't. The fact that someone brought new numbers to our attention that tell a different story does not mean that you were any less wrong.

 

21 hours ago, Notional said:

The Titan cards were never supposed to exist, and the GK110 chips were never designed to be a thousand dollars.

Baseless statement... This is just your interpretation based on nothing.

 

21 hours ago, Notional said:

the x80 series had now been officially downgraded to a different, lower-tier chip (GM204).

You can't argue based on naming scheme of the chip. I've already shown the x80 segment did not change from the 480 through the 980. AND I've already shown that the inclusion of the Titan did not affect this segment.

 

21 hours ago, Notional said:

Yet the 1080 (GP104, not the highest-tier chip) launched at $649. In comparison, the 980 Ti (highest-tier chip) launched at $649, and the 780 Ti (highest-tier chip) launched at $699.

1) The 1080 launched with a $599 MSRP.

2) This exactly counters your argument that the x80 series has been "degraded". The relative performance people are getting with the x80 series has actually increased.

3) I've already shown that the 1080 shifted the x80 segment up in performance slightly, and they're underpriced for what this segment should be.

4) You keep bringing up the fact that the x80 is no longer the highest-tier chip. I've already explained that it changes nothing about its segment. If you get "X" amount of something for a price, and someone else spends more to get more, it doesn't mean that you got any less.

 

21 hours ago, Notional said:

As for the 1000 series, we have no clue what performance a Titan XP or a 1080Titanium will have. NVidia's words certainly are nothing to go by. Need I link the 1060 NVidia bench graph again?

Firstly, no need to link the 1060 graph. I fully understand how BS Nvidia marketing is. That graph made me cringe harder than ever before.

 

If you want to argue that we have no clue what performance the Titan XP is going to have, so we can't draw conclusions, then that literally invalidates your own argument that it's overpriced, because we don't have half the equation. If AMD launched a card that was 5000x faster than the 1080 and charged $1,300 for it, would the fact that it is the most expensive card yet make it overpriced? Absolutely not. We can either go on the most credible number we have, or we can't make any statements about it at all, especially "it's overpriced".

 

The CORRECT argument to make here is:

We can agree that Nvidia manipulates data to make their products look better than they are... yes?

If you agree with that (which I'm sure you do), then that tells us there is zero chance that the Titan XP is MORE than 60% faster than the Titan XM.

We now know that, if we use the 60% figure, we know we're using a best possible case scenario.

Looking at the 60% figure: we know the 1080 is roughly 30% faster than the Titan XM, so the fastest the Titan XP can possibly be is about 23% faster than the GTX 1080 (1.6 / 1.3 ≈ 1.23).

I think, without even needing to do the math, we can all agree that a 23% increase in performance for roughly double the price will make it overpriced.

Therefore, in a BEST case scenario, the Titan XP is overpriced.

This means we can conclude that, no matter what, the Titan XP will be overpriced.
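To make that best case concrete, here is the same arithmetic with price folded in. A minimal sketch using the thread's own figures (the $599 vs $699 launch-price dispute from earlier changes the exact ratio, but not the conclusion):

```python
# Best-case value comparison: Titan X (Pascal) vs GTX 1080, using figures from this thread.
perf_1080     = 1.30     # relative to the Maxwell Titan X
perf_titan_xp = 1.60     # Nvidia's best-case "up to 60% faster" claim
price_1080, price_titan_xp = 599.0, 1200.0   # USD (1080 Founders Edition is $699)

perf_ratio = perf_titan_xp / perf_1080
price_ratio = price_titan_xp / price_1080
print(f"Performance: {perf_ratio:.2f}x   Price: {price_ratio:.2f}x")

# Performance per dollar, normalised so the GTX 1080 = 100%.
value_vs_1080 = perf_ratio / price_ratio * 100
print(f"Titan XP best case: ~{value_vs_1080:.0f}% of the GTX 1080's performance per dollar")
```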

 

Just because you know the number won't be indicative of real world performance doesn't mean you can't extrapolate information from it.

 

21 hours ago, Notional said:

but also pushing up the price of their cards significantly?

I have shown above why this is wrong.

 

21 hours ago, Notional said:

Now do you see how and why NVidia not only is guilty of price skimming

This is a murky one. While it shares some of the characteristics of price skimming, I'd argue that it may not be.

Releasing GPUs, and then releasing new GPUs with the same performance for less $$ with a relatively short turn around is standard practice for the industry.

Essentially what is happening is customers are paying to have a given amount of performance "X" amount of time earlier, than if they had just waited for the next card to come out. That's how the GPU market works.

Unless you can show Nvidia is creating a cost to the customer that is significantly greater than the industry norm, then I don't think you can call it price skimming.

The metric you're looking for is % of card's value per month. You also want to factor in price/performance curve, since values lower down the curve will have a smaller delta than ones higher up the curve.

 

If you can show me the math for that, or a logical deduction that shows it's significantly greater than the industry norm, I will concede that it very much resembles price skimming. (Not a trap. Being genuine. I just don't have the time to do the math, and you're the one making the claim... so the onus is on you to prove it.)
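For what it's worth, the metric described above is simple to compute once you pick two price/performance points in time. A minimal sketch with purely hypothetical placeholder numbers (they are not real prices or dates):

```python
# "% of the card's value per month": how fast a card sheds its launch-price value once a
# cheaper card with equal performance arrives. Inputs below are hypothetical placeholders.
def value_lost_per_month(launch_price: float, matched_price: float, months: float) -> float:
    """Average share of the launch price lost per month until a cheaper equal card ships."""
    return (launch_price - matched_price) / launch_price / months * 100

# Placeholder example: a $999 halo card matched by a $649 card nine months later.
print(f"{value_lost_per_month(999, 649, 9):.1f}% of launch value lost per month")
```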

 

______________________________________________________________________________________________________________________

 

 

But alas, this is getting exhausting. I am officially done. The only argument I'm willing to concede on is the price skimming. I know the rest of my logic is sound, and that's going to have to be enough for me. No point in trying to explain color to a blind man.

Anyone with an understanding of logic and rational argument will be able to see what actually happened here.

Link to comment
Share on other sites

Link to post
Share on other sites

It blows me away that NV is able to charge $1200 for a card with a cut down/scavenged/partially disabled 471mm^2 chip because people will happily buy it. Holy shit.

 

No one here can say for sure what yields are like w/ TSMC's 16nm on a chip of that size, but I doubt it is more expensive (to NV) per working unit as compared to the ~600mm^2 chips of past generations that were pushing the reticle limits. I am aware $/transistor has plateaued from 28nm to 14-16nm, but GP102 in the TXP is not nearly as large as the 28nm big boys and isn't fully enabled... IDK, maybe I am way off in this thinking here.
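One way to reason about the cost-per-working-chip question is candidate dies per 300 mm wafer. A minimal sketch using the standard dies-per-wafer approximation and a simple Poisson yield model; the defect density is a made-up placeholder, since TSMC's real 16nm yield and wafer pricing aren't public:

```python
import math

# Standard dies-per-wafer approximation plus a simple Poisson yield model.
# The defect density D0 is a placeholder; real 16nm yield data is not public.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    d = wafer_diameter_mm
    return math.pi * (d / 2) ** 2 / die_area_mm2 - math.pi * d / math.sqrt(2 * die_area_mm2)

def good_dies(die_area_mm2: float, defects_per_cm2: float) -> float:
    yield_rate = math.exp(-defects_per_cm2 * die_area_mm2 / 100)   # Poisson yield
    return dies_per_wafer(die_area_mm2) * yield_rate

D0 = 0.2   # hypothetical defects per cm^2, purely illustrative
for name, area in [("GP102, ~471 mm^2", 471.0), ("28nm-era big chip, ~600 mm^2", 600.0)]:
    print(f"{name}: ~{good_dies(area, D0):.0f} fully working dies per 300 mm wafer")

# Note: salvage parts (like the Titan XP, which ships with some units disabled) recover
# many of the "bad" dies as well, pushing the effective cost per sellable chip down further.
```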

 

Such a shame, amazing technology made unreachable (to me) simply because of how the market landscape currently sits. Of course NV would be stupid not to maximize their profits by charging this much, but it does leave a bitter taste in my mouth seeing this price tag compared to what the chip actually is.

 

Still, can't wait to see what this thing will do under water...what a beast.

Link to comment
Share on other sites

Link to post
Share on other sites

15 minutes ago, KeltonDSMer said:

It blows me away that NV is able to charge $1200 for a card with a cut down/scavenged/partially disabled 471mm^2 chip because people will happily buy it. Holy shit.

 

No one here can say for sure what yields are like w/ TSMC's 16nm on a chip of that size, but I doubt it is more expensive (to NV) per working unit as compared to the ~600mm^2 chips of past generations that were pushing the reticle limits. I am aware $/transistor has plateaued from 28nm to 14-16nm, but GP102 in the TXP is not nearly as large as the 28nm big boys and isn't fully enabled... IDK, maybe I am way off in this thinking here.

 

Such a shame, amazing technology made unreachable (to me) simply because of how the market landscape currently sits. Of course NV would be stupid not to maximize their profits by charging this much, but it does leave a bitter taste in my mouth seeing this price tag compared to what the chip actually is.

 

Still, can't wait to see what this thing will do under water...what a beast.

Yeah. People will buy it. Thing is, paying an extra $300 instead of waiting for the 1080 Ti, or getting the 1080 for half the price, seems silly to the average person... But there are people out there who make a LOT of money, and $300 extra is nothing...

 

"Extra half day of work for 23% more performance on my GPU? NBD."

 

All depends on your situation, and what the value is to you... Cost to Nvidia is not really relevant.

 

Link to comment
Share on other sites

Link to post
Share on other sites

20 hours ago, -BirdiE- said:

It's nice that you're trying to paint a nice little picture for us, but throwing facts into a story does not prove the rest of the story is true. I could include a true statement ("Nvidia released the 580 in 2010") in a story like "Nvidia released the 580 in 2010 because they are part of the KKK", but having a fact in there doesn't make the rest of it true. You're stating facts, but drawing conclusions from them with no logic or proof.

 

And why are you so obsessed with what chip they used? If, at the x80 level, the consumer saw the same 20-30% jump in performance, then it's irrelevant and your argument is invalid. You keep claiming to be looking at things from the consumer's point of view, but if you're getting the same thing you always have, what difference does it make that Nvidia found a more cost effective way to do it?

I'll concede, at $649, the 780 was overpriced on launch... But that's a single outlier over 6 generations.

What an odd example. It really is not that difficult to understand. The Titan is a result of the lack of competition, nothing more. The notion that a consumer-grade GPU would be designed to reach a $1,000 price point is nonsensical, simply because the limitations of manufacturing such a chip don't lie with NVidia but with the foundry, and those limitations are universal. The only thing that pushes the cost of such chips to those levels is insanely low yields or an extremely small market. The Titan is neither of those things, as the original Titan was a heavily cut-down chip, meaning yields should not have been an issue.

This is basic profit optimization due to a lack of competition. I really fail to see how you don't understand this, given your degree.

 

NVidia defines the chips and model numbers, not me. A given chip will generally have at least two models: a full-fat version and a cut-down version (like the 970 being a cut-down 980, and the 1070 being a cut-down 1080). GK110 had more models than that, with the Titan Black being the only one not cut down, and the Titan, 780 and 780 Ti cut down to various degrees. That is odd behavior that cannot be explained by poor yields, as AMD used TSMC too in that generation. Profit optimization is the only rational answer for a company like NVidia.

 

Performance jumps are completely relative. Node shrinks usually bring huge performance boosts, beyond the usual gains from architecture changes on the same node. But there is no standard 30% increase:

[lineup.png: chart of relative performance across NVidia's GPU lineups]

 

Now notice the introduction of the Titans: minuscule performance boosts at best, but with a massive price hike.

Notice something about the 900 series? For the first time, the x80 is not the highest-end chip, but as I stated and showed before, the x80 still carries the old highest-end-chip price. A new market segment has simply been created, and cost is not the reason why (we can see this from AMD's equally sized TSMC chips).

 

Single outlier? Every single Titan has been an outlier with horrible price to performance, which is the only metric consumers should be looking at, all else being equal. Look at the Titan performance on that chart and compare it to the launch price of each card. Absolutely abysmal price/perf.

 

20 hours ago, -BirdiE- said:

To say "Nvidia has been bumping up the prices for years" or "the 10 series GPUs are overpriced" is entirely inaccurate.

 

You were the one who brought up perceived value (you also included a chart for us)... I've been actually calculating value.

Based on the assumptions we were operating under at the time, I proved, using flawless logic, that the Titan XP was actually not overpriced. And despite that, you kept stating that it was overpriced based on nothing.

"It is the most expensive yet, therefore it is overpriced" is still an invalid logical statement... Made especially worse by the fact that I had already logically proved it wasn't. The fact that someone brought new numbers to our attention that tell a different story does not mean that you were any less wrong.

 

Baseless statement... This is just your interpretation based on nothing.

 

You can't argue based on naming scheme of the chip. I've already shown the x80 segment did not change from the 480 through the 980. AND I've already shown that the inclusion of the Titan did not affect this segment.

In your opinion. When production cost (compared to equally sized chips from AMD) cannot explain these new price brackets, what can, if not price skimming? I never argued this was a bad move on NVidia's side, only a bad thing for consumers.

 

Yes, because perceived value is the reason why these cards can exist at these price brackets. That's the entire point of marketing! Well my graph was calculating value as well.

Flawless logic? Without knowing the actual performance of the card? We already know the $1,200 card, creating a completely new price bracket yet again, is, yet again, a cut-down chip (GP102). We know this because the P6000 has a higher CUDA core count while still being a GP102 chip. Expect a later Titan card at $1,200 or more. This won't happen, of course, until AMD has competitive cards out (performance-wise).
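That cut-down point can be checked from published core counts alone; a minimal sketch (128 CUDA cores per SM is the consumer-Pascal figure):

```python
# Full GP102 (as shipped in the Quadro P6000) vs. the Titan X (Pascal).
full_gp102_cores = 3840   # Quadro P6000
titan_xp_cores   = 3584   # Titan X (Pascal)
cores_per_sm     = 128    # consumer Pascal (GP102/GP104/GP106) SM size

disabled_sms = (full_gp102_cores - titan_xp_cores) // cores_per_sm
total_sms = full_gp102_cores // cores_per_sm
print(f"Titan XP ships with {disabled_sms} of {total_sms} SMs disabled "
      f"({titan_xp_cores / full_gp102_cores:.0%} of the full chip's shaders enabled).")
```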

 

It was expensive because price to performance was very poor. I know price/perf decreases as you venture toward the top of the chip sizes, due to yields and cost of production, but Titan prices have always been much higher than those cost increases would justify.

 

NVidia would never design a $1,000 chip, especially when such a price bracket had never existed before. NVidia does not take risks like that. Furthermore, such a target price is without a doubt not based on production cost. That added $300-400 price premium is profit, minus the cost of the doubled VRAM. Nothing else.

 

Actually, I can, as naming scheme (models) and pricing are connected (as you argue yourself). What changed was the degradation of the x80 model from the highest-end chip to the second highest, without a price cut.

 

20 hours ago, -BirdiE- said:

1) The 1080 launched with a $599 MSRP.

2) This exactly counters your argument that the x80 series has been "degraded". The relative performance people are getting with the x80 series has actually increased.

3) I've already shown that the 1080 shifted the x80 segment up in performance slightly, and they're underpriced for what this segment should be.

4) You keep bringing up the fact that the x80 is no longer the highest-tier chip. I've already explained that it changes nothing about its segment. If you get "X" amount of something for a price, and someone else spends more to get more, it doesn't mean that you got any less.

  1. No, the 1080 launched at $699 for the Founders Edition, which was the only card available at the (paper) launch. Find me one single card you can buy for $599. Even the MSI Aero has an MSRP on the wrong side of $700. Their third-party vendor price claim of $599 is a completely free claim to make, as NVidia has no power over it and no responsibility for it either. Of course no third-party vendor would price their cards this way. What a joke.
  2. Indeed, performance seems to rise steadily on the x80, but the issue I'm pointing out is that this is completely unhinged from the actual cost of chip production. When chip advancements happen, they usually benefit the consumer with higher-than-expected performance boosts, usually due to die shrinks. What we have seen with the 1000 series is that NVidia has not passed those huge advancements on to the consumer, but rather degraded the x80 series even further (third-best chip) while increasing the price.
  3. That's because you used a non-existent price point. It still does not counter the issue that chip advancements are not passed on to the consumer, only to NVidia's profit margin.
  4. See 2.

 

21 hours ago, -BirdiE- said:

If you want to argue that we have no clue what performance the Titan XP is going to have, so we can't draw conclusions, then that literally invalidates your own argument that it's overpriced, because we don't have half the equation. If AMD launched a card that was 5000x faster than the 1080 and charged $1,300 for it, would the fact that it is the most expensive card yet make it overpriced? Absolutely not. We can either go on the most credible number we have, or we can't make any statements about it at all, especially "it's overpriced".

 

The CORRECT argument to make here is:

We can agree that Nvidia manipulates data to make their products look better than they are... yes?

If you agree with that (which I'm sure you do), then that tells us there is zero chance that the Titan XP is MORE than 60% faster than the Titan XM.

We now know that, if we use the 60% figure, we know we're using a best possible case scenario.

Looking at the 60% figure: we know the 1080 is roughly 30% faster than the Titan XM, so the fastest the Titan XP can possibly be is about 23% faster than the GTX 1080 (1.6 / 1.3 ≈ 1.23).

I think, without even needing to do the math, we can all agree that a 23% increase in performance for roughly double the price will make it overpriced.

Therefore, in a BEST case scenario, the Titan XP is overpriced.

This means we can conclude that, no matter what, the Titan XP will be overpriced.

 

Just because you know the number won't be indicative of real world performance doesn't mean you can't extrapolate information from it.

Even based on NVidia's own bogus numbers, it's overpriced. I get your argument about first-mover advantage and so on. It's a sound conclusion, because that is exactly what NVidia did, and good for them. My entire point has always been that that is a bad thing for consumers, as the added value is nothing but price skimming; the price has nothing to do with the cost of chip production.

If AMD did that, the price to performance would be groundbreaking, instantly making NVidia obsolete. This, of course, will not happen, as both AMD and NVidia are limited by chip production they have no control over (GloFo/Samsung/TSMC).

 

I think we can conclude that, yes. Now factor in that the Titan XP is not even the full-fat second-tier Pascal chip (if we count the P100 as the highest tier; we can agree not to, though).

 

22 hours ago, -BirdiE- said:

This is a murky one. While it shares some of the characteristics of price skimming, I'd argue that it may not be.

Releasing GPUs, and then releasing new GPUs with the same performance for less $$ with a relatively short turn around is standard practice for the industry.

Essentially what is happening is customers are paying to have a given amount of performance "X" amount of time earlier, than if they had just waited for the next card to come out. That's how the GPU market works.

Unless you can show Nvidia is creating a cost to the customer that is significantly greater than the industry norm, then I don't think you can call it price skimming.

The metric you're looking for is % of card's value per month. You also want to factor in price/performance curve, since values lower down the curve will have a smaller delta than ones higher up the curve.

 

If you can show me the math for that, or a logical deduction that shows it's significantly greater than the industry norm, I will concede that it very much resembles price skimming. (Not a trap. Being genuine. I just don't have the time to do the math, and you're the one making the claim... so the onus is on you to prove it.)

_____________________________________________________________________________________________________________________

 

But alas, this is getting exhausting. I am officially done. The only argument I'm willing to concede on is the price skimming. I know the rest of my logic is sound, and that's going to have to be enough for me. No point in trying to explain color to a blind man.

Anyone with an understanding of logic and rational argument will be able to see what actually happened here.

That's actually my entire point: it's not standard for the industry. Nvidia was the first ever to do it, with the Titan cards. Your argument only holds up if you equate this to rebranding, which has been debunked, and not just by me.

I already have, with my graph and my price/perf calculations. If that is not enough, then I don't know what would be. The Titan cards have lower price/perf than not only standard NVidia cards (including those based on the same chips as the Titans) but also AMD's cards.

 

NVidia made a marketing masterpiece with the Titan cards, I will give them that. All I'm saying is that those prices were price skimming, not based on production cost. It was only ever possible due to the lack of competition from AMD, and now history is repeating itself with the 1000 series, and not only with the Titan this time.


 

Well, it's always a great exercise to have to defend and support one's claims, especially when it starts to get technical. I just hope people learned something from all of this and got some new perspectives. That is why I'm here: to learn.



Found the first actual photo of the card that isn't just a CG render. It seems to still carry the "GeForce GTX" branding, despite that being removed from the card's official title everywhere on Nvidia's site. https://twitter.com/AndrewYNg/status/757734438436343808



 

