
NVIDIA Announces GP102-based TITAN X with 3,584 CUDA cores

RZeroX
39 minutes ago, patrickjp93 said:

Volta is early-mid 2017, and big Vega will not be ready in Q4 2016. Where does everyone get this 2018 crap?

According to NVidia's official roadmap:

[Image: NVIDIA Pascal GPU roadmap]

 

But of course it's all speculation until the cards are actually close to launch.

12 minutes ago, -BirdiE- said:

Source? Sounds like you're just talking out of your ass...

A source for what? You cannot source a logical conclusion. I have finished all my master's classes in business, if that's what you mean.

 

I have argued all of my points in the post, and anyone with any knowledge of the GPU market and chip production would agree with that statement.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Just now, Notional said:

A source for what? You cannot source a logical conclusion. I have finished all my master's classes in business, if that's what you mean.

 

I have argued all of my points in the post, and anyone with any knowledge of the GPU market and chip production would agree with that statement.

Logical conclusion based on what? 

 

"Nvidia is intentionally gimping supply so they can charge way more than the card is worth" is not a logical conclusion... It's a statement based on no evidence.

 

And you clearly don't have a master's in business... maybe you took an introductory business course while doing your master's in something else...

If you did, you'd understand, as I mentioned earlier, that based on the price/performance curve for GPUs, the Titan XP is actually reasonably priced.


1 minute ago, -BirdiE- said:

Logical conclusion based on what? 

 

"Nvidia is intentionally gimping supply so they can charge way more than the card is worth" is not a logical conclusion... It's a statement based on no evidence.

 

And you clearly don't have a master's in business... maybe you took an introductory business course while doing your master's in something else...

If you did, you'd understand, as I mentioned earlier, that based on the price/performance curve for GPUs, the Titan XP is actually reasonably priced.

Don't bother mate. You're not gonna win with that lot. Not worth the headache.


Just now, Kloaked said:

Don't bother mate. You're not gonna win with that lot. Not worth the headache.

But... There's people on the internet.... and they're wrong....


1 minute ago, -BirdiE- said:

But... There's people on the internet.... and they're wrong....

They see it the same way, except they have this irrational beef with a company that's arguably just as bad as the company that they shill for.


49 minutes ago, Daegun said:

Source? Because GP100 enterprise cards aren't releasing until January.

GP100 is already out and about. You just have to buy Teslas in batches of 1,000.

 

As for Volta, it's been known that the big chips have to be in Oak Ridge's hands in July for the new POWER9 supercomputer being built there. Several outlets are also following the rumor that Volta will launch at GTC, which is in May.

 

http://info.nvidianews.com/rs/nvidia/images/An Inside Look at Summit and Sierra Supercomputers-3-1.pdf

http://www.fool.com/investing/2016/07/19/nvidia-corporation-may-launch-first-volta-processo.aspx

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Just now, -BirdiE- said:

Logical conclusion based on what? 

 

"Nvidia is intentionally gimping supply so they can charge way more than the card is worth" is not a logical conclusion... It's a statement based on no evidence.

 

And you clearly don't have a master's in business... maybe you took an introductory business course while doing your master's in something else...

If you did, you'd understand, as I mentioned earlier, that based on the price/performance curve for GPUs, the Titan XP is actually reasonably priced.

Already argued my point. 

 

Yes, you do know all companies manipulate supply and demand, right? Lowering supply increases the price, if demand is there. Raising the price lowers demand. This is basic college/bachelor-level stuff.
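
To put rough numbers on it, here's a minimal Python sketch (the demand figures are made up for illustration, not NVidia's actual numbers) of how restricting quantity raises both price and profit when demand slopes downward:

```python
# Minimal sketch with made-up numbers: a downward-sloping linear demand
# curve P = a - b*Q means a seller can earn more by restricting quantity.

a, b = 2000.0, 1.0        # hypothetical demand intercept ($) and slope ($/unit)
unit_cost = 400.0         # hypothetical cost to build one card

def profit(q):
    price = a - b * q     # the price the market will bear at quantity q
    return (price - unit_cost) * q

for q in (1500, 800):     # flood the market vs. restrict supply
    print(f"sell {q} units at ${a - b * q:.0f} each -> profit ${profit(q):,.0f}")
# sell 1500 units at $500 each -> profit $150,000
# sell 800 units at $1200 each -> profit $640,000
```

Fewer units at a higher price, more total profit. That is the whole trick.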

 

Yes, it clearly is 9_9. You actually think a regular curve dictates pricing at the top end of the scale? Especially where there is no competition? That is not how it works. You still fail to realize that pricing doesn't necessarily have anything to do with cost. And if it did, how come the x80 Ti cards, which are based on cut-down Titans, are so much cheaper if they cost the same to manufacture? Unless you believe the x80 Ti cards are subsidized? It's essentially price skimming the same chip by making two very differently segmented cards. From a business perspective it's brilliant, don't get me wrong. It's very clever by NVidia. It just sucks for the consumers, and that is where my focus lies.

 

It's fine if you don't believe me, that is your prerogative.

 

4 minutes ago, Kloaked said:

Don't bother mate. You're not gonna win with that lot. Not worth the headache.

Says the guy who would not acknowledge that GameWorks gimped performance on AMD.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Just now, Notional said:

Already argued my point. 

 

Yes, you do know all companies manipulate supply and demand, right? Lowering supply increases the price, if demand is there. Raising the price lowers demand. This is basic college/bachelor-level stuff.

 

Yes, it clearly is 9_9. You actually think a regular curve dictates pricing at the top end of the scale? Especially where there is no competition? That is not how it works. You still fail to realize that pricing doesn't necessarily have anything to do with cost. And if it did, how come the x80 Ti cards, which are based on cut-down Titans, are so much cheaper if they cost the same to manufacture? Unless you believe the x80 Ti cards are subsidized? It's essentially price skimming the same chip by making two very differently segmented cards. From a business perspective it's brilliant, don't get me wrong. It's very clever by NVidia. It just sucks for the consumers, and that is where my focus lies.

 

It's fine if you don't believe me, that is your prerogative.

 

Says the guy who would not acknowledge that GameWorks gimped performance on AMD.

That manipulation actually doesn't help profit, because most people react very heavily to price in luxury markets. If this were a market with very inelastic demand, your argument would make sense, but the GPU market is no such thing.

 

The curve doesn't dictate pricing, but many sales models adhere to it, GPUs included.

 

You get what you pay for, and market segmentation is nothing new, but this segmentation is actually more tame than previous generations' phenomena.

 

There's no reason to believe a man who can be proven wrong purely on logic.

 

Gameworks doesn't gimp AMD. AMD's shitty tessellation hardware and crap drivers gimp AMD.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


12 minutes ago, Notional said:

Yes, you do know all companies manipulate supply and demand, right? Lowering supply increases the price, if demand is there. Raising the price lowers demand. This is basic college/bachelor-level stuff.

Yes. Having an ACTUAL degree in business. I understand supply and demand.

 

12 minutes ago, Notional said:

You actually think a regular curve dictates pricing at the top end of the scale? Especially where there is no competition? That is not how it works.

Yes, but if you create a price/performance curve based on current GPUs in the market from both vendors, and another card is launched that lies below that line, then you can say that the price of that product is lower than what the market dictates.
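
As a minimal sketch of that idea (the performance index values below are illustrative placeholders, not benchmark results; prices are launch MSRPs):

```python
# Minimal sketch of the price/performance-curve argument. Performance
# index values are illustrative placeholders, not benchmark results.
import numpy as np

market = {                      # (performance index, launch price $)
    "RX 480":   (100, 240),
    "GTX 1060": (105, 250),
    "GTX 1070": (145, 380),
    "GTX 1080": (180, 600),
}
perf  = np.array([p for p, _ in market.values()])
price = np.array([c for _, c in market.values()])

coeffs = np.polyfit(perf, price, 2)       # fit price as a function of perf

titan_perf, titan_price = 234, 1200       # assumed: ~30% faster than a 1080
expected = np.polyval(coeffs, titan_perf)
side = "above" if titan_price > expected else "below"
print(f"curve predicts ${expected:.0f}; actual ${titan_price} -> {side} the curve")
```

Every card in the fit moves the curve; the verdict for any new card depends entirely on which performance numbers you feed in.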

 

12 minutes ago, Notional said:

And if it did, how come the x80 Ti cards, which are based on cut-down Titans, are so much cheaper if they cost the same to manufacture? Unless you believe the x80 Ti cards are subsidized?

What do you think rebranding cards is? The cost to manufacture the card is such a small part of the equation. The majority of the price of a GPU is trying to recoup the R&D expenses. The market dictates what a reasonable price is. And you're right in saying that a company can charge whatever it wants if it has no competition, but Nvidia clearly hasn't done that, since the price of this card actually falls below the standard price/performance function set by existing cards.


16 minutes ago, Notional said:

Says the guy who would not acknowledge that GameWorks gimped performance on AMD.

As Patrick said, GameWorks did not gimp AMD cards. It was technology created by Nvidia to capitalize on something their GPUs excelled at (tessellation). It's unfortunate that AMD cards sucked at it, but it doesn't mean Nvidia and game developers shouldn't include it as an option so Nvidia cards can be used to their full extent.

 

It's similar to async compute and Maxwell. It's not AMD's fault that Maxwell sucked at async... And just because Maxwell sucked at async doesn't mean AMD shouldn't be able to implement it so their cards can be used to their fullest extent.


7 minutes ago, patrickjp93 said:

That manipulation actually doesn't help profit, because most people react very heavily to price in luxury markets. If this were a market with very inelastic demand, your argument would make sense, but the GPU market is no such thing.

 

The curve doesn't dictate pricing, but many sales models adhere to it, GPUs included.

 

You get what you pay for, and market segmentation is nothing new, but this segmentation is actually more tame than previous generations' phenomena.

 

There's no reason to believe a man who can be proven wrong purely on logic.

 

Gameworks doesn't gimp AMD. AMD's shitty tessellation hardware and crap drivers gimp AMD.

The GPU market is indeed elastic, but things change in the upper echelon of the market. However, selling high numbers just for the sake of it is only a strategy used if you need market share (like AMD) or adoption rate (like consoles). For companies, the name of the game is profit optimization. That is achieved by finding the perfect balance between high pricing and enough sales. NVidia are masters at hitting that sweet spot with Titan. But there is a reason they saturate the highest end first, before releasing an x80 Ti model.

 

The point of the Titan is not to get it into the hands of everyone, but rather to sell it with as high a profit margin as possible, before having to sell the "defective" stock of cut-down x80 Ti chips. You know this already.

 

You only get what you pay for when there is competition. Or would you say shitty DSL in the US is worth as much as Google Fiber at the same price? No? Good. Titan cards operate in a small niche market outside of competition.

 

Industry insiders disagree with you. Heck, even NVidia cards, like Kepler, are gimped heavily by GameWorks. But I guess that is also shitty tessellation hardware and crap drivers?

 

8 minutes ago, -BirdiE- said:

Yes. Having an ACTUAL degree in business. I understand supply and demand.

 

Yes, but if you create a price/performance curve based on current GPUs in the market from both vendors, and another card is launched that lies below that line, then you can say that the price of that product is lower than what the market dictates.

 

What do you think rebranding cards is? The cost to manufacture the card is such a small part of the equation. The majority of the price of a GPU is trying to recoup the R&D expenses. The market dictates the price. And you're right in saying that a company can charge whatever it wants if it has no competition, but Nvidia clearly hasn't done that, since the price of this card actually falls below the standard price/performance function set by existing cards.

Great, so you understand that you can have several pricing, supply and demand curves for the same product under different circumstances. Even though a Titan is a different product than an x80 Ti, it's still the same chip with the same manufacturing cost. Only the supply differs between full-fat chips and cut-down chips. Raising the price for the full-fat chip goes without saying, but unless you think the x80 Ti chips are sold subsidized, the argument that a Titan has the price it has because of manufacturing costs simply is not correct.

 

The problem with "the market" is that a company can define it. Sure, a natural demand has to exist, but that's nothing a good marketing team cannot create. Remember that all full-fat Titan chips can be cut down for x80 Ti use if demand requires it. So they keep the price as high as demand can carry, for as long as an x80 Ti chip isn't needed (and it isn't until competition exists or the market is saturated).

 

Before the original Titan launched, how many cards do you know of that cost $1,000 for one GPU? All the older Titan cards, sans the Z, were $1,000. This is $200 more for the same series of cards, which already pushed the boundaries of pricing with the original. Price to performance always drops with new generations. That's kinda the point. If price to performance were constant, an NVidia 1260 would be $1,200 and a 1280 Ti would be $4,000 or something ridiculous. That simply is not how the microprocessor market works.

 

19 minutes ago, -BirdiE- said:

As Patrick said, GameWorks did not gimp AMD cards. It was technology created by Nvidia to capitalize on something their GPUs excelled at (tessellation). It's unfortunate that AMD cards sucked at it, but it doesn't mean Nvidia and game developers shouldn't include it as an option so Nvidia cards can be used to their full extent.

 

It's similar to async compute and Maxwell. It's not AMD's fault that Maxwell sucked at async... And just because Maxwell sucked at async doesn't mean AMD shouldn't be able to implement it so their cards can be used to their fullest extent.

What was the point of turning the tessellation factor up to 11 when there is zero visual reason to do so? When the triangles are smaller than one pixel, it's a massive waste of performance. You think it's a coincidence that NVidia made GameWorks work like that? And that they black-boxed the hell out of it, so neither the developer nor AMD could change the source code to work better?
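
A rough back-of-the-envelope sketch of the sub-pixel point (the patch size is a made-up example):

```python
# Rough sketch: a tessellation factor of f splits each edge of a quad
# patch into f segments, giving roughly 2*f*f triangles per patch. Once
# triangles shrink below one pixel, the extra geometry buys nothing
# visually. The patch size here is a made-up example value.

patch_pixels = 1000                 # hypothetical: patch covers 1000 pixels

for factor in (8, 16, 32, 64):
    triangles = 2 * factor * factor
    px_per_tri = patch_pixels / triangles
    note = "  <- sub-pixel, wasted work" if px_per_tri < 1 else ""
    print(f"factor {factor:2d}: {triangles:5d} tris, {px_per_tri:5.2f} px/tri{note}")
# factor  8:   128 tris,  7.81 px/tri
# factor 16:   512 tris,  1.95 px/tri
# factor 32:  2048 tris,  0.49 px/tri  <- sub-pixel, wasted work
# factor 64:  8192 tris,  0.12 px/tri  <- sub-pixel, wasted work
```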

 

Read this article: http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd/2 I linked to page 2 so you can see the examples. Tell me what possible benefit such rendering has for anyone? Or why NVidia disabled any possibility of changing it? That's not even taking Crysis 2's most tessellated concrete slab in the world into account. Or the 64x tessellation in HairWorks, which made no difference versus 32x and hardly any versus 16x. The way GameWorks is designed is not a coincidence. Nothing NVidia does is a coincidence; it's very calculated. That's why they are in the market position they are in today.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


1 hour ago, Notional said:

Says the guy who would not acknowledge that GameWorks gimped performance on AMD.

Case in point.


58 minutes ago, -BirdiE- said:

As Patrick said, GameWorks did not gimp AMD cards. It was technology created by Nvidia to capitalize on something their GPUs excelled at (tessellation). It's unfortunate that AMD cards sucked at it, but it doesn't mean Nvidia and game developers shouldn't include it as an option so Nvidia cards can be used to their full extent.

 

It's similar to async compute and Maxwell. It's not AMD's fault that Maxwell sucked at async... And just because Maxwell sucked at async doesn't mean AMD shouldn't be able to implement it so their cards can be used to their fullest extent.

Yeah, there's a lot of double standards when it comes to AMD vs. Nvidia. I like to think the key difference between AMD and Nvidia culture is this (from what I've seen, heard and read): Nvidia acts like a business that is reacting to consumers and the market; AMD acts like a business that is reacting to Nvidia. Nothing proves this more than visiting the Nvidia and AMD subreddits. One is full of consumers complaining about various problems with Nvidia, like prices or availability or driver bugs. The other is full of consumers complaining about Nvidia prices and availability, and praising Doom Vulkan and the next custom 480 to launch in two months.

 

It's like Nvidia is so large as to take up every square inch of space in the discrete GPU market, and AMD is inhabiting Nvidia's space no matter where it goes. Everything AMD does is in relation to Nvidia, while everything Nvidia does is in relation to where they can profit and advance. Nvidia's marketing is all about Nvidia and what they've accomplished since last year. AMD's marketing is about sticking it to Nvidia (the Fixer videos, for example), rising up and rebelling against the system that permeates everything (like the latest rebellion ads). Nvidia's sales pitch is "we're the best." AMD's sales pitch is "we're the good guys."

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


5 minutes ago, Briggsy said:

Yeah, there's a lot of double standards when it comes to AMD vs. Nvidia. I like to think the key difference between AMD and Nvidia culture is this (from what I've seen, heard and read): Nvidia acts like a business that is reacting to consumers and the market; AMD acts like a business that is reacting to Nvidia. Nothing proves this more than visiting the Nvidia and AMD subreddits. One is full of consumers complaining about various problems with Nvidia, like prices or availability or driver bugs. The other is full of consumers complaining about Nvidia prices and availability, and praising Doom Vulkan and the next custom 480 to launch in two months.

 

It's like Nvidia is so large as to take up every square inch of space in the discrete GPU market, and AMD is inhabiting Nvidia's space no matter where it goes. Everything AMD does is in relation to Nvidia, while everything Nvidia does is in relation to where they can profit and advance. Nvidia's marketing is all about Nvidia and what they've accomplished since last year. AMD's marketing is about sticking it to Nvidia (the Fixer videos, for example), rising up and rebelling against the system that permeates everything (like the latest rebellion ads). Nvidia's sales pitch is "we're the best." AMD's sales pitch is "we're the good guys."

We do need to recognize that not everyone at these companies may agree with how the company is being marketed. Both AMD and Nvidia have extremely intelligent people working there, and they both have the most cringe-worthy marketing that seems so out of touch with the market - the same could be said for the game developers at Ubisoft or any EA studio. At least in my opinion.

 

What's funny to me, though, is that I remember AMD's new CEO saying she doesn't want AMD to be known as the cheaper option anymore, yet I've seen nothing but that in their marketing continuously. The 480 was marketed as the budget VR card that could perform well for the money. It definitely delivers, I'm not discounting that, but the marketing is what gets me, as I mentioned.

 

People on this forum also need to realize that not everyone is going to go on the internet and talk about their hardware if they don't have any issues. Most of what you'll see are people asking for help or complaining about an issue they cannot solve, so it tends to look like more people are having issues than actually are.


28 minutes ago, Notional said:

The GPU market is indeed elastic, but things change in the upper echelon of the market. However, selling high numbers just for the sake of it is only a strategy used if you need market share (like AMD) or adoption rate (like consoles). For companies, the name of the game is profit optimization. That is achieved by finding the perfect balance between high pricing and enough sales. NVidia are masters at hitting that sweet spot with Titan. But there is a reason they saturate the highest end first, before releasing an x80 Ti model.

 

The point of the Titan is not to get it into the hands of everyone, but rather to sell it with as high a profit margin as possible, before having to sell the "defective" stock of cut-down x80 Ti chips. You know this already.

 

You only get what you pay for when there is competition. Or would you say shitty DSL in the US is worth as much as Google Fiber at the same price? No? Good. Titan cards operate in a small niche market outside of competition.

 

Industry insiders disagree with you. Heck, even NVidia cards, like Kepler, are gimped heavily by GameWorks. But I guess that is also shitty tessellation hardware and crap drivers?

 

Great, so you understand that you can have several pricing, supply and demand curves for the same product under different circumstances. Even though a Titan is a different product than an x80 Ti, it's still the same chip with the same manufacturing cost. Only the supply differs between full-fat chips and cut-down chips. Raising the price for the full-fat chip goes without saying, but unless you think the x80 Ti chips are sold subsidized, the argument that a Titan has the price it has because of manufacturing costs simply is not correct.

 

The problem with "the market" is that a company can define it. Sure, a natural demand has to exist, but that's nothing a good marketing team cannot create. Remember that all full-fat Titan chips can be cut down for x80 Ti use if demand requires it. So they keep the price as high as demand can carry, for as long as an x80 Ti chip isn't needed (and it isn't until competition exists or the market is saturated).

 

Before the original Titan launched, how many cards do you know of that cost $1,000 for one GPU? All the older Titan cards, sans the Z, were $1,000. This is $200 more for the same series of cards, which already pushed the boundaries of pricing with the original. Price to performance always drops with new generations. That's kinda the point. If price to performance were constant, an NVidia 1260 would be $1,200 and a 1280 Ti would be $4,000 or something ridiculous. That simply is not how the microprocessor market works.

 

What was the point of turning the tessellation factor up to 11 when there is zero visual reason to do so? When the triangles are smaller than one pixel, it's a massive waste of performance. You think it's a coincidence that NVidia made GameWorks work like that? And that they black-boxed the hell out of it, so neither the developer nor AMD could change the source code to work better?

 

Read this article: http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd/2 I linked to page 2 so you can see the examples. Tell me what possible benefit such rendering has for anyone? Or why NVidia disabled any possibility of changing it? That's not even taking Crysis 2's most tessellated concrete slab in the world into account. Or the 64x tessellation in HairWorks, which made no difference versus 32x and hardly any versus 16x. The way GameWorks is designed is not a coincidence. Nothing NVidia does is a coincidence; it's very calculated. That's why they are in the market position they are in today.

God, replying to this wall of text... No, nothing changes at the upper echelon. The demand is what it is and responds to price accordingly. Everyone knows the cheaper Ti version is coming later with slightly less performance at a much lower price. You get diminishing returns on investment as you go higher in product tiers. This has been true practically since the dawn of dGPUs. Is Nvidia trying to make as much money as possible? Yes. It's market segmentation. People will be willing to pay just about any price. Those who aren't will wait for the Ti, and Nvidia will time its launch to ensure the Fury doesn't have room to compete.

 

Someone has to maintain those DSL lines and someone has to try to push their speeds as high as possible to prevent more competition from moving in. The only reason Europe is doing any better is because it redesigned and replaced its infrastructure after WWII. The U.S. had no such luxury. It takes a lot of money to do the construction to lay fresh line. That's one reason Google is only doing really big cities right now: immediate ROI for big investment. It provides an opportunity for others to tie into Google's backbone and expand Fiber and even better copper options, but yes, that DSL is worth the same as Google Fiber until someone provides a better solution.

 

Find me a source on that, b/c bullshit. Nvidia is the one leading the charge with new effects libraries. If AMD can't cut the mustard and stay up to snuff with its own tech, that's its problem, which is why I have no competition qualms about AMD dumping ludicrous amounts of Asynchronous Compute and Shading into games. I have enormous technical reasons to be pissed at them for it, but that's a separate discussion.

 

I can see the difference between 64x, 32x, and 16x tessellation in TW3 and other HairWorks titles. Maybe Kepler is weaker at tessellation than Maxwell, but then again using the tessellation hardware is cheaper than doing it from raw compute anyway, so why are we mad that old technology is obsolesced by bleeding-edge techniques and new tech? No one else has produced the same or better results for cheaper. TressFX/PureHair is still a mile and a half behind on quality.

 

They are sold subsidized. The price is subsidized by lower quality.

 

Only if that company is a monopoly. Nvidia is no such thing, and thus none of your remaining arguments based on this premise need to be addressed. They're simply incorrect for this singular reason.

 

There was not zero visual reason to do so, and you can adjust tessellation in games' config files if you choose to. Also, having multiple triangles in a single pixel provides more information for hue averaging, so it is not useless when done for the right reason. It helps with blending and realism.

 

Nvidia has every right to protect its innovations from being stolen, and AMD can optimize off the binary the same way Nvidia does, so that black box argument is BS.

 

Crysis' developers were terrible at optimization. All of the tessellation libraries provided in Gameworks have the ability to let the programmer determine tessellation factors. It's a parameterized function. You can't blame Nvidia for shitty developers that didn't even properly occlude their water when running around on dry land.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


59 minutes ago, Notional said:

Great, so you understand that you can have several pricing, supply and demand curves for the same product under different circumstances.

Each product will have its own supply and demand, which can be graphed. But there is also a price/performance function for the GPU market, which is made up of all the relevant market offerings. If you lie above that curve, your product is priced higher than expected for that market. If you lie below it, your product is priced lower than expected for that market. Every new product affects the function, but does not define it... which allows it to lie above or below the curve.

 

59 minutes ago, Notional said:

but unless you think the x80 Ti chips are sold subsidized, the argument that a Titan has the price it has because of manufacturing costs simply is not correct.

When have I, at any point, said the Titan is priced where it is specifically because of manufacturing costs?

I have specifically said that manufacturing costs are only a very small part of the cost calculation. Most of it is trying to recoup R&D expenses.

People who spend more money get the benefits sooner... that's how it always works...

Again, this is why rebranding is a thing... When AMD rebranded their 7000-series cards for most of the R 200 series, the price of the same chips dropped. Was that because the cost to manufacture them went down? No. Was it some shady ploy by AMD? No. It's because, now that it was "x" amount of time later, that level of graphics performance was valued less.

 

59 minutes ago, Notional said:

Before the original Titan launched, how many cards do you know of that cost $1,000 for one GPU? All the older Titan cards, sans the Z, were $1,000. This is $200 more for the same series of cards, which already pushed the boundaries of pricing with the original. Price to performance always drops with new generations.

Price to performance, of course, drops with every generation. When has anyone been debating that? The price/performance curve is set by the RX 460-480 and the GTX 1060-1080, and is a function of RELATIVE price to performance for that generation (or the current prices of cards on the market)... The price/performance of irrelevant cards from previous generations is not a part of it.

The performance delta between the x80 and the Titan has increased, which is why the price of the Titan increased.

 

59 minutes ago, Notional said:

Before the original Titan launched, how many cards do you know of that cost $1,000 for one GPU? All the older Titan cards, sans the Z, were $1,000. This is $200 more for the same series of cards, which already pushed the boundaries of pricing with the original.

The Titan was a new market segment created by Nvidia. Before that, the flagship x80 card would have a performance increase of 20-30% per generation. As evidenced by the 680 and 780 following that pattern, the Titan was a new, higher-performance market segment that didn't previously exist, and obviously commanded a higher price.

 

 

Man. It's super obvious you've never taken an economics course...


9 minutes ago, patrickjp93 said:

God, replying to this wall of text... No, nothing changes at the upper echelon. The demand is what it is and responds to price accordingly. Everyone knows the cheaper Ti version is coming later with slightly less performance at a much lower price. You get diminishing returns on investment as you go higher in product tiers. This has been true practically since the dawn of dGPUs. Is Nvidia trying to make as much money as possible? Yes. It's market segmentation. People will be willing to pay just about any price. Those who aren't will wait for the Ti, and Nvidia will time its launch to ensure the Fury doesn't have room to compete.

 

Someone has to maintain those DSL lines and someone has to try to push their speeds as high as possible to prevent more competition from moving in. The only reason Europe is doing any better is because it redesigned and replaced its infrastructure after WWII. The U.S. had no such luxury. It takes a lot of money to do the construction to lay fresh line. That's one reason Google is only doing really big cities right now: immediate ROI for big investment. It provides an opportunity for others to tie into Google's backbone and expand Fiber and even better copper options, but yes, that DSL is worth the same as Google Fiber until someone provides a better solution.

 

Find me a source on that, b/c bullshit. Nvidia is the one leading the charge with new effects libraries. If AMD can't cut the mustard and stay up to snuff with its own tech, that's its problem, which is why I have no competition qualms about AMD dumping ludicrous amounts of Asynchronous Compute and Shading into games. I have enormous technical reasons to be pissed at them for it, but that's a separate discussion.

 

I can see the difference between 64x, 32x, and 16x tessellation in TW3 and other HairWorks titles. Maybe Kepler is weaker at tessellation than Maxwell, but then again using the tessellation hardware is cheaper than doing it from raw compute anyway, so why are we mad that old technology is obsolesced by bleeding-edge techniques and new tech? No one else has produced the same or better results for cheaper. TressFX/PureHair is still a mile and a half behind on quality.

 

They are sold subsidized. The price is subsidized by lower quality.

 

Only if that company is a monopoly. Nvidia is no such thing, and thus none of your remaining arguments based on this premise need to be addressed. They're simply incorrect for this singular reason.

 

There was not zero visual reason to do so, and you can adjust tessellation in games' config files if you choose to. Also, having multiple triangles in a single pixel provides more information for hue averaging, so it is not useless when done for the right reason. It helps with blending and realism.

 

Nvidia has every right to protect its innovations from being stolen, and AMD can optimize off the binary the same way Nvidia does, so that black box argument is BS.

 

Crysis' developers were terrible at optimization. All of the tessellation libraries provided in Gameworks have the ability to let the programmer determine tessellation factors. It's a parameterized function. You can't blame Nvidia for shitty developers that didn't even properly occlude their water when running around on dry land.

IKR :D Well, it does change when there is no competition there. The rest of that paragraph does not disprove anything it answers.

 

TWC etc. have profit margins of over 90% or so. The value of a product does not necessarily have anything to do with production cost. That was the point.

 

You should read more about it here: http://www.extremetech.com/gaming/183411-gameworks-faq-amd-nvidia-and-game-developers-weigh-in-on-the-gameworks-controversy It's interesting just in general.

But the real issue is whether or not graphics vendors SHOULD make graphics effects to begin with. Sure, things like simulated hair are cool, but going the way of AMD with GPUOpen benefits both the developer and all consumers, AMD and NVidia alike. That is a good way to go. Black-boxed proprietary effects are not.

On 19/6/2015 at 8:37 PM, patrickjp93 said:

The differences are very visible until 32 vs 64, at least for me, and I'm halfway to blind as it is.

Are you sure about that? ;)

https://linustechtips.com/main/uploads/gallery/album_1762/gallery_6624_1762_547642.jpg

64x to 16x, yeah a little bit. 64x to 32x no way.

Well, TressFX 3 is a LOT more effective than HairWorks, and it looks better imho, mostly due to the effects the TressFX hair can get, like being wet or having snow in it.

 

Sold subsidized means they are sold at a loss (or at least without a profit). x80 Ti cards are NOT sold at a loss by NVidia.

 

What competitive cards exist for the 1080? Or the Titan XP? At that performance for a single card, it's essentially a monopoly, yes. The price skimming shows that. Of course people can just choose not to buy it or wait for Vega, but the implications of the price are still there. The Titan cards only work because they are the KOTH (king of the hill) cards. The Titan Z was not a KOTH card because the 295x2 performed better. The Titan Z was an utter disaster as a result.

 

You could not change the tessellation factor on GameWorks effects. That is the entire reason why AMD included a tessellation multiplier override in their drivers to begin with. You constantly shit on game devs for their incompetence. Do you honestly think that picture of the tessellated ice in Batman is a good and optimized solution? Of course not, it's dumb.

 

No one's saying NVidia doesn't have the right. The point is that they shouldn't and that it is hurting the industry.

 

Except binary optimization is very bad. One mistake and your NVidia drivers stop rendering people's faces in Assassin's Creed Unity. Read Geldreich's comment on optimization in the ExtremeTech link further up.

 

The Crysis 2 DX11/tessellation patch was made with NVidia. This was also before GameWorks. And no, the devs could not do that. Well, technically they might be able to, but not contractually.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


40 minutes ago, Notional said:

IKR :D Well, it does change when there is no competition there. The rest of that paragraph does not disprove anything it answers.

 

TWC etc. have profit margins of over 90% or so. The value of a product does not necessarily have anything to do with production cost. That was the point.

 

You should read more about it here: http://www.extremetech.com/gaming/183411-gameworks-faq-amd-nvidia-and-game-developers-weigh-in-on-the-gameworks-controversy It's interesting just in general.

But the real issue is whether or not graphics vendors SHOULD make graphics effects to begin with. Sure, things like simulated hair are cool, but going the way of AMD with GPUOpen benefits both the developer and all consumers, AMD and NVidia alike. That is a good way to go. Black-boxed proprietary effects are not.

Are you sure about that? ;)

https://linustechtips.com/main/uploads/gallery/album_1762/gallery_6624_1762_547642.jpg

64x to 16x, yeah a little bit. 64x to 32x no way.

Well, TressFX 3 is a LOT more effective than HairWorks, and it looks better imho, mostly due to the effects the TressFX hair can get, like being wet or having snow in it.

 

Sold subsidized means they are sold at a loss (or at least without a profit). x80 Ti cards are NOT sold at a loss by NVidia.

 

What competitive cards exist for the 1080? Or the Titan XP? At that performance for a single card, it's essentially a monopoly, yes. The price skimming shows that. Of course people can just choose not to buy it or wait for Vega, but the implications of the price are still there. The Titan cards only work because they are the KOTH (king of the hill) cards. The Titan Z was not a KOTH card because the 295x2 performed better. The Titan Z was an utter disaster as a result.

 

You could not change the tessellation factor on GameWorks effects. That is the entire reason why AMD included a tessellation multiplier override in their drivers to begin with. You constantly shit on game devs for their incompetence. Do you honestly think that picture of the tessellated ice in Batman is a good and optimized solution? Of course not, it's dumb.

 

No one's saying NVidia doesn't have the right. The point is that they shouldn't and that it is hurting the industry.

 

Except binary optimization is very bad. One mistake and your NVidia drivers stop rendering people's faces in Assassin's Creed Unity. Read Geldreich's comment on optimization in the ExtremeTech link further up.

 

The Crysis 2 DX11/tessellation patch was made with NVidia. This was also before GameWorks. And no, the devs could not do that. Well, technically they might be able to, but not contractually.

It disproves plenty of it. 

 

Maintenance cost is another factor for non-commodity products such as infrastructure.

 

I've read it and it's not credible. All claims, no proof, and I provided counterproof and counter-reasoning.

 

No, GPUOpen has invited complete amateur developers who don't understand performance optimization, and it just slows down the committing process. I'm waiting on 1,300 lines of hybrid raytracing code to be committed to the lighting engine, even though it's vastly more accurate than their rasterized lighting model and performs 20% better than the current code, because I'm #60 in a queue of people who have an 85% refusal rate. Nvidia should be allowed to make effects, and people should be allowed to use them, and AMD should not automatically be able to receive full benefit, because Nvidia did the work. That is what competition is.

 

Those black-box effects are beating the competition handily in both performance/quality and in raw quality. TressFX hair does not react well with hair or water turbulence. HairWorks does (also, HairWorks hair can have snow and such in it; TW3 devs did not use it to its fullest). The only reason you don't like HairWorks is because Geralt is ugly AF, but he was designed that way, so your comparison is biased because of that.

 

Yes, I can see a difference in 64x vs. 32x.

 

Fury X with a decent overclock beats it in Doom, AOTS, and ROTR w/DX12 enabled.

 

The Titan Z performed decently better than the 295x2 if you were willing to put a water block on it.

 

Yes, you could. If you bought into the rumor that you couldn't, I have a bridge to sell you. You could do it in config files for some games, but if you couldn't, it was the developers' fault. You generally can't manipulate every setting in games, but a little disassembly can prove the capability is there if developers allowed it, because these functions are parameterized. AMD couldn't override the tessellation factor if this weren't so.

 

The tessellated ice that interacts with the cape is certainly a better solution than the alternatives, which require raw compute and would eat twice as much out of the performance budget. Now, to be fair, there is a lot of bad code in Batman, but I will defend that design choice, because it was the least expensive method to achieve that effect from a memory and compute perspective.

 

They should, and it isn't hurting the industry, by the very virtue that Nvidia is providing the best solutions. It just means more competition in making those effects, and it means cheaper games if studios don't choose to invest in their own home-brewed stuff and can use Gameworks material for far cheaper.

 

It's not bad. Nvidia and AMD do it all the time to fix shader code that shipped broken. I've read it, and that's less than half the story. Nvidia's own ex-employees say the bulk of driver optimization is done by analyzing the binary, because analyzing source code is useless thanks to Visual Studio's obfuscation system (release-style code is obfuscated at the assembly level).

 

The devs can always do that. I've programmed with GameWorks before. You can literally set tessellation and PolyMorph parameters until the cows come home. The fact that games don't allow you to do it easily is a developer laziness issue, not a GameWorks library issue.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


36 minutes ago, -BirdiE- said:

Each product will have its own supply and demand, which can be graphed. But there is also a price/performance function for the GPU market, which is made up of all the relevant market offerings. If you lie above that curve, your product is priced higher than expected for that market. If you lie below it, your product is priced lower than expected for that market. Every new product affects the function, but does not define it... which allows it to lie above or below the curve.

Demand for the Titan and the x80 Ti should essentially be the exact same market, unless you really need the added VRAM, as they are the same chips. The difference lies in pricing and supply. Exactly, and you can lie above that curve for a short while. The longer it takes for competition to make an alternative, the longer you can stay above that curve. Look at your standard VEL like this:

[Image: value equivalence line (VEL) chart]

When a brand goes above the line, it's price skimming. Such a thing can only be upheld as long as there is no competition to provide an alternative. That has always been the case with Titan cards. The only exception was the Titan Z, which performed worse than the 295x2 at twice the price. That made the Titan Z a disaster, as no one in their right mind would buy such a value proposition.

 

1 hour ago, -BirdiE- said:

When have I, at any point, said the Titan is priced where it is specifically because of manufacturing costs?

I have specifically said that manufacturing costs are only a very small part of the cost calculation. Most of it is trying to recoup R&D expenses.

People who spend more money get the benefits sooner... that's how it always works...

Again, this is why rebranding is a thing... When AMD rebranded their 7000-series cards for most of the R 200 series, the price of the same chips dropped. Was that because the cost to manufacture them went down? No. Was it some shady ploy by AMD? No. It's because, now that it was "x" amount of time later, that level of graphics performance was valued less.

Good, so pricing on the Titan cannot be claimed to be production-cost based. You can certainly argue it's recouping the fixed costs of the architecture, but honestly I very much doubt that. That usually happens through regular economies of scale or outright from the professional market, like Teslas and Quadros.

 

Yeah, but the first-mover advantage is seriously small here, as an x80 Ti is a given so far. But that is why the Titan card can exist and be sold. It does not change that it's overpriced to hell, just that people are willing to shell out despite it.

I fail to see the relevance of rebranding in this case. Sure, it's the same chip, but x80 Ti's are not full fat. And yes, chips do get cheaper the longer you make them, as yields increase and defects diminish. In your argument, sure, the fixed costs were covered a long time ago (specifically R&D), but this is still a somewhat different market situation.

Let me put it the other way around: why isn't NVidia launching the x80 Ti first, and then, when stock is high, launching a Titan at Titan prices? Well, you answered it yourself with the first-mover point.

 

1 hour ago, -BirdiE- said:

Price to performance, of course, drops with every generation. When has anyone been debating that? The price/performance curve is set by the RX 460-480 and the GTX 1060-1080, and is a function of RELATIVE price to performance for that generation (or the current prices of cards on the market)... The price/performance of irrelevant cards from previous generations is not a part of it.

The performance delta between the x80 and the Titan has increased, which is why the price of the Titan increased.

 

The Titan was a new market segment created by Nvidia. Before that, the flagship x80 card would have a performance increase of 20-30% per generation. As evidenced by the 680 and 780 following that pattern, the Titan was a new, higher-performance market segment that didn't previously exist, and obviously commanded a higher price.

The entire issue is that the relative price-to-performance line/curve/whatever has slowly risen to new heights. That is what I'm criticizing. Not that Titan cards exist, but that they are overpriced, price-skimming rip-offs (and yes, a similarly priced AMD card would be too).

 

Well, you can't argue the delta between x80 and Titan has increased when the tiers are essentially arbitrary. When Kepler launched, the 680 was the highest-end card. Now we have two higher tiers based on a completely separate higher-end chip (Gx100/102). You could just as easily state that "lower-end" cards are being marketed as x80 cards instead of, say, x60 cards, while retaining the x80 price. Again, something achievable when competition is insufficient. It's brilliant market manipulation NVidia is doing. All of a sudden a single-chip graphics card at $1,200 seems reasonable. That's the kind of thing companies love! And consumers are defending this behaviour?

 

That new segment was made by creating larger chips on the same node. That's all. Nothing impressive about it. But I guess we can thank 28 nm staying relevant for 5+ years for that.

 

1 hour ago, -BirdiE- said:

Man. It's super obvious you've never taken an economics course...

Yeah, I could say the exact same about you. But I will refrain from ridiculous statements like that and only conclude that bachelor's and master's programs in business vary vastly in specialization as well as syllabus. As such, the knowledge, theories and views will differ. Rather, I think this discussion is a matter of perspective. My perspective is from the consumer's point of view. Yours seems to be from Nvidia's.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I've been gone for a while now. Can someone catch me up?

 

From what I can see, the Titan X (Pascal) is expected to be "up to" 60% faster than the previous Titan X (Maxwell). Now, if I remember correctly, the GTX 1080 is 30% faster than the Titan X (Maxwell)/980 Ti. This puts the Titan X (Pascal) at roughly 30% faster than the GTX 1080, right? However, the Pascal Titan X's MSRP is $1,200. The GTX 1080 has an MSRP of $700 for its FE (going to use the FE for this, since both use FE coolers). That is a difference of 71.4% in cost for theoretically 30% more performance. If you compare it to the cheapest MSRP of the 1080, it's literally a 100% difference in price for 30% more performance.
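
Sanity-checking those numbers in a few lines of Python (the 30% performance figure is the assumption from above, not a benchmark):

```python
# Quick check of the numbers above. The 30% performance gain is the
# assumption from this post, not a measured benchmark.
titan, fe, base = 1200, 700, 600   # MSRPs: Titan X (Pascal), 1080 FE, 1080
perf_gain = 0.30                   # assumed Titan X (Pascal) vs GTX 1080

print(f"vs 1080 FE:   {titan / fe - 1:.1%} more money")    # 71.4% more money
print(f"vs 1080 MSRP: {titan / base - 1:.1%} more money")  # 100.0% more money

# price paid per percentage point of extra performance over the 1080 FE
print(f"${(titan - fe) / (perf_gain * 100):.0f} per extra % of performance")  # $17
```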

 

I know the Titan cards have never been great as far as performance per dollar goes, but has it ever been this bad in the past? Not to mention (correct me if I am wrong) the Titan X (Pascal) only has 12GB of VRAM, which hinders their ability to cut it down properly. They can't just cut the 1080 Ti (if it will ever exist) down to 6GB, as it would not look good on paper compared to the normal GTX 1080 with 8GB of VRAM. If they leave it at 12GB of VRAM and cut down the cores, it may not be enough to make the Titan X look like the better purchase decision (kinda how the Titan X (Maxwell) had 12GB of VRAM vs 6GB on the 980 Ti).

 

Is it possible to cut, say, 4GB off that 12GB? We already know Nvidia is capable of cutting specific pieces of the VRAM (3.5GB + 0.5GB on the 970 is a great example). Could we see such a thing on the GTX 1080 Ti (again, if it ever exists)?
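
For what it's worth, on GDDR5/GDDR5X cards the VRAM capacity follows the memory bus, one 32-bit channel per chip. A quick sketch, assuming GP102's 384-bit bus with 1GB chips, of where cut-down configurations can land:

```python
# Sketch: VRAM capacity tracks bus width on GDDR5(X) cards -- one 1GB
# chip per 32-bit channel. GP102's full bus is 384-bit (12GB), so a
# cut-down card lands on intermediate sizes without a 970-style split.
full_bus, gb_per_chip = 384, 1

for disabled in range(5):              # number of disabled memory channels
    bus = full_bus - 32 * disabled
    print(f"{bus}-bit bus -> {(bus // 32) * gb_per_chip} GB")
# 384-bit -> 12 GB, 352-bit -> 11 GB, 320-bit -> 10 GB,
# 288-bit -> 9 GB, 256-bit -> 8 GB
```

So cutting a full 4GB would mean dropping to a 256-bit bus; a milder cut to 10GB or 11GB keeps more bandwidth and still leaves daylight below the full card.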

 

This just seems like a very confusing launch to me. The timing is odd, given that AMD has yet to even release a card to put the 1080 on edge. Why is Nvidia competing against itself in the high-end market? Granted, costing nearly twice as much puts it in a market of its own, but it still seems too early from a sales standpoint. I would let people finish buying the 1080s, and wait for sales to die down or competition to emerge, before releasing a superior, more expensive product.

 

Hopefully people that are more educated on the subject can answer a few of these questions, as I am genuinely confused. This is what I get for being gone for so long.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


4 minutes ago, patrickjp93 said:

It disproves plenty of it. 

 

Maintenance cost is another factor for non-commodity products such as infrastructure.

 

I've read it and it's not credible. All claims, no proof, and I provided counterproof and counter-reasoning.

 

No, GPUOpen has invited complete amateur developers who don't understand performance optimization, and it just slows down the committing process. I'm waiting on 1,300 lines of hybrid raytracing code to be committed to the lighting engine, even though it's vastly more accurate than their rasterized lighting model and performs 20% better than the current code, because I'm #60 in a queue of people who have an 85% refusal rate. Nvidia should be allowed to make effects, and people should be allowed to use them, and AMD should not automatically be able to receive full benefit, because Nvidia did the work. That is what competition is.

 

Goes for fiber too. When you have a 90+% profit margin, it means maintenance cost is not an explanation for the price. Profit margin is. But you are completely missing the point.

 

Yes, AMD, NVidia and several big developers are not credible 9_9

 

GPUOpen effects are just as developed as GameWorks. The only difference is the licensing, which allows devs to edit, mod and change the effects as they please. Not sure what you are talking about. Developers are not committing code to AMD.

 

They are allowed. No one has said otherwise or tried to prevent them from making GameWorks. I have heavily criticized them for the anti-consumer result of their design and use of GW. Vendor lock-in is the exact opposite of competition. So that is simply NOT true.

 

4 minutes ago, patrickjp93 said:

Those black-box effects are beating the competition handily in both performance/quality and in raw quality. TressFX hair does not react well with hair or water turbulence. HairWorks does (also, HairWorks hair can have snow and such in it; TW3 devs did not use it to its fullest). The only reason you don't like HairWorks is because Geralt is ugly AF, but he was designed that way, so your comparison is biased because of that.

What are you on about? HairWorks had like a 20 fps penalty in W3. RotTR had less than a 5 fps penalty while adding effects (water/snow/etc.) and with longer hair (thus more computation needed per strand). Turbulence seems to work just fine.

The only reason I don't like HairWorks is its enormous performance penalty without any increased visual fidelity to show for it. Unoptimized is how I would describe HW.

 

4 minutes ago, patrickjp93 said:

Fury X with a decent overclock beats it in Doom, AOTS, and ROTR w/DX12 enabled.

 

The Titan Z performed decently better than the 295x2 if you were willing to put a water block on it.

 

For the 1080 in a handful of DX12/Vulkan games, sure. But we both know that is not a proper alternative. Especially taking the 4GB of VRAM into account, as well as the power requirements. 

 

No one water-cooled a Titan Z. Spending twice as much on a TZ, only to spend even more on a custom water-cooling loop and block? Either way, the Titan Z was a failure as a card, so its potential under even more expensive scenarios is irrelevant now.

 

19 minutes ago, patrickjp93 said:

Yes you could. If you bought into the rumor you couldn't, I have a bridge to sell you. You could do it in config files for some games, but if you couldn't, it was the developers' fault. You generally can't manipulate every setting in games, but a little disassembly can prove the capability is there if developers allowed it, because these functions are parameterized. AMD couldn't override the tessellation factor if this wasn't so.

 

The tessellated ice that interacts with the cape is certainly a better solution than the alternatives which require raw compute that would eat twice as much out of the performance budget. Now, to be fair, there is a lot of bad code in Batman, but I will defend that design choice, because it was the least expensive method to achieve that from a memory and compute perspective.

For which games could you change the tessellation factor in the GameWorks effects directly in a settings file?

When it comes to GameWorks, you still fail to take into account that something might be technically possible but not legally allowed under their contract. AMD never signed such a contract.

 

You cannot honestly think that a solid surface of pink triangles is performance well spent. You would tear AMD a new one if they did that. All of that performance penalty simply does not translate into actual visual fidelity.
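
To put a rough number on the "pink triangles" complaint, here is a minimal sketch of how the triangle bill grows with the tessellation factor (the base mesh size is made up for illustration):

# Uniform tessellation at factor N splits each patch into roughly N^2 sub-triangles,
# so doubling the factor quadruples the geometry cost.
base_triangles = 1_000  # hypothetical untessellated mesh

for factor in (8, 16, 32, 64):
    print(f"factor {factor:2d}: ~{base_triangles * factor ** 2:,} triangles")
# factor  8: ~64,000
# factor 16: ~256,000
# factor 32: ~1,024,000
# factor 64: ~4,096,000

Past the point where triangles shrink below a pixel, the extra geometry costs setup and raster time without adding visible detail, which is exactly why a driver-side tessellation cap can claw back performance with no visual loss.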

 

20 minutes ago, patrickjp93 said:

They should, and it isn't hurting the industry, by virtue of the fact that Nvidia is providing the best solutions. It just means more competition in making those effects, and it means cheaper games if studios skip investing in their own home-brewed stuff and use GameWorks material for far less.

 

It's not bad. Nvidia and AMD do it all the time to fix shader code that shipped broken. I've read it, and that's less than half the story. Nvidia's own ex-employees say the bulk of driver optimization is done by analyzing the binary, because analyzing source code is useless thanks to Visual Studio's obfuscation (release builds are obfuscated at the assembly level).

 

The devs can always do that. I've programmed with GameWorks before. You can literally set tessellation and PolyMorph parameters until the cows come home. The fact that the games don't let you do it easily is a developer laziness issue, not a GameWorks library issue.

It is when they are proprietary and/or skew the performance between AMD and NVidia as extremely as NVidia has done so far. From what we have seen, HairWorks is not better than the newer iterations of TressFX, neither visually nor performance-wise.

Yeah, because we all know that GameWorks games are cheaper than non-GW games. Come on, that argument is moot.

 

Again, you blame the developers for something that might easily be a contractual restriction put on them. Even CDPR hinted strongly at that. Either way, we know AMD can do it, because they added the function to their drivers. The point is that it should not be necessary in the first place.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


19 minutes ago, MageTank said:

I've been gone for a while now. Can someone catch me up?

 

From what I can see, the Titan X (Pascal) is expected to be "up to" 60% faster than the previous Titan X (Maxwell). Now if I remember correctly, the GTX 1080 is 30% faster than the Titan X (Maxwell)/980 Ti. This puts the Titan X (Pascal) at roughly 30% faster than the GTX 1080, right? However, the Pascal Titan X's MSRP is $1200. The GTX 1080 has an MSRP of $700 for its FE (going to use FE for this, since both use FE coolers). That is a difference of about 71.5% in cost for theoretically 30% more performance. If you compare that to the cheapest MSRP of the 1080, it's literally a 100% difference in price for 30% performance.

 

I know the Titan cards have never been great as far as performance per dollar goes, but has it ever been this bad in the past? Not to mention (correct me if I am wrong) the Titan X (Pascal) only has 12GB of VRAM, which hinders their ability to cut it down properly. They can't just cut the 1080 Ti (if it ever exists) down to 6GB, as it would not look good on paper next to the normal GTX 1080 with 8GB of VRAM. And if they leave it at 12GB of VRAM and only cut down the cores, it may not be enough to make the Titan X look like the better purchase decision (kinda how the Titan X (Maxwell) had 12GB of VRAM vs 6GB on the 980 Ti).

 

Is it possible to cut, say, 4GB off that 12GB? We already know Nvidia is capable of cutting off specific pieces of the VRAM (the 3.5GB + 0.5GB split on the 970 is a great example). Could we see such a thing on the GTX 1080 Ti (again, if it ever exists)?

 

This just seems like a very confusing launch to me. The timing is odd, given that AMD has yet to even release a card to put the 1080 on edge. Why is Nvidia competing against itself in the high-end market? Granted, costing nearly twice as much puts it in a market of its own, but it still seems too early from a sales standpoint. I would let people finish buying the 1080s and wait for sales to die down/competition to emerge before releasing a superior, more expensive product.

 

Hopefully people that are more educated on the subject can answer a few of these questions, as I am genuinely confused. This is what I get for being gone for so long.

They might (don't quote me on it, just a hunch) cut the memory down to 9GB on the 1080 Ti. It seems stupid with respect to the whole "power of 10" thing, but given that it's a 384-bit bus on GP102, they'd have to stick to multiples of 3, i.e. 768MB (8800 GTX), 1.5GB (GTX 580), 3GB (GTX 780), 6GB (GTX 980 Ti), 9GB, 12GB (Titan X), etc.
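
To illustrate the multiples-of-3 arithmetic, here's a minimal sketch (assuming the usual 32-bit-wide GDDR5/GDDR5X chips; the density list is just the common parts, not anything NVIDIA has confirmed):

# A 384-bit bus driven by 32-bit GDDR5/GDDR5X chips needs 384 / 32 = 12 chips,
# so the total lands on 12 x (per-chip density) -- hence the multiples of 3.
BUS_WIDTH_BITS = 384
CHIP_BUS_BITS = 32
chips = BUS_WIDTH_BITS // CHIP_BUS_BITS  # 12

for density_gb in (0.25, 0.5, 1.0):  # 2Gb, 4Gb and 8Gb parts
    print(f"{chips} x {density_gb}GB -> {chips * density_gb:g}GB total")  # 3GB, 6GB, 12GB

A 9GB card would need 6Gb chips or an asymmetric layout (shades of the 970's 3.5GB + 0.5GB split), which is why that capacity would be the odd one out.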

 

The other factor would be Vega, and how much competition big Vega provides. If big Vega has 16GB of memory (good lord), I can see Nvidia just sticking with 12GB on the 1080 Ti and cutting cores if big Vega is weaker.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


11 minutes ago, MageTank said:

I've been gone for a while now. Can someone catch me up?

 

From what I can see, the Titan X (Pascal) is expected to be "up to" 60% faster than the previous Titan X (Maxwell). Now if I remember correctly, the GTX 1080 is 30% faster than the Titan X (Maxwell)/980 Ti. This puts the Titan X (Pascal) at roughly 30% faster than the GTX 1080, right? However, the Pascal Titan X's MSRP is $1200. The GTX 1080 has an MSRP of $700 for its FE (going to use FE for this, since both use FE coolers). That is a difference of about 71.5% in cost for theoretically 30% more performance. If you compare that to the cheapest MSRP of the 1080, it's literally a 100% difference in price for 30% performance.

 

I know the Titan cards have never been great as far as performance per dollar goes, but has it ever been this bad in the past?

Off the top of my head, I can think of the Titan Z... As you said, the Titan cards were never intended to be any good at price-to-performance; they were only ever meant for those who want the best of the best. NVIDIA has a pseudo-monopoly on the high end until Vega comes out in 2017. Until then, NVIDIA can mold the market to its liking.
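
For what it's worth, a quick sanity check of the rough numbers quoted above (MSRPs and marketing performance figures, not benchmarks):

# Sketch using the thread's figures: 1080 FE ~$700, Titan XP ~$1200,
# Titan XP "up to" 60% over the Maxwell Titan X, the 1080 ~30% over it.
price_1080, price_txp = 700, 1200
perf_1080, perf_txp = 1.30, 1.60  # relative to Titan X (Maxwell) = 1.00

print(f"Price premium:  {price_txp / price_1080 - 1:.1%}")  # ~71.4%
print(f"Perf advantage: {perf_txp / perf_1080 - 1:.1%}")    # ~23.1% -- the ratios compound
print(f"Perf per dollar vs the 1080: {(perf_txp / price_txp) / (perf_1080 / price_1080):.2f}x")
# -> ~0.72x, i.e. roughly 28% worse performance per dollar

Note that the 60% and 30% figures are both measured against the Maxwell Titan X, so they combine as a ratio: 1.60 / 1.30 puts the Titan XP closer to 23% ahead of the 1080 rather than a flat 30 points.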

11 minutes ago, MageTank said:

Not to mention (correct me if I am wrong) the Titan X (Pascal) only has 12GB of VRAM, which hinders their ability to cut it down properly. They can't just cut the 1080 Ti (if it ever exists) down to 6GB, as it would not look good on paper next to the normal GTX 1080 with 8GB of VRAM. And if they leave it at 12GB of VRAM and only cut down the cores, it may not be enough to make the Titan X look like the better purchase decision (kinda how the Titan X (Maxwell) had 12GB of VRAM vs 6GB on the 980 Ti). Is it possible to cut, say, 4GB off that 12GB? We already know Nvidia is capable of cutting off specific pieces of the VRAM (the 3.5GB + 0.5GB split on the 970 is a great example). Could we see such a thing on the GTX 1080 Ti (again, if it ever exists)?

The Titan X Pascal (I'm going to just call it the Titan XP) is already a cut-down chip (the full chip should be 3840 cores), so what we're seeing is reminiscent of Kepler. I doubt a GTX 1080 Ti will emerge, since it could have 3072-3200 cores at most, which isn't much of a jump from the GTX 1080's 2560 (as you said). The Titan XP will likely be NVIDIA's final word at the top end, given that GP100 + HBM2 is solely for compute and wouldn't be any good as an enthusiast gaming card, but rather as a Tesla.
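
As a rough sketch of that cut-down arithmetic (consumer Pascal groups 128 CUDA cores per SM; the 1080 Ti entry is the hypothetical config from above, not an announced part):

# Consumer Pascal packs 128 CUDA cores per SM, so salvage parts step down in units of 128.
CORES_PER_SM = 128

configs = {
    "GP102 (full die)":         3840,  # 30 SMs
    "Titan X (Pascal)":         3584,  # 28 SMs, i.e. 2 SMs fused off
    "hypothetical GTX 1080 Ti": 3200,  # 25 SMs, per the speculation above
    "GTX 1080 (full GP104)":    2560,  # 20 SMs
}
for name, cores in configs.items():
    print(f"{name}: {cores} cores = {cores // CORES_PER_SM} SMs")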

11 minutes ago, MageTank said:

This just seems like a very confusing launch to me. The timing is odd, given that AMD has yet to even release a card to put the 1080 on edge. Why is Nvidia competing against itself in the high-end market? Granted, costing nearly twice as much puts it in a market of its own, but it still seems too early from a sales standpoint. I would let people finish buying the 1080s and wait for sales to die down/competition to emerge before releasing a superior, more expensive product.

Eh, I just think NVIDIA is taking advantage of the high-end market right now. Granted, they put billions into their R&D, and there's a part of me that thinks Vega cannot top a monster such as this. The prospects of a GTX 1080 Ti are, in my opinion, extremely slim; it would only make sense if AMD manages to get even close to the Titan XP with Vega, thus forcing NVIDIA to offer a competitive product. It's all speculation for now, and I'm excited to see how it turns out.

'Fanboyism is stupid' - someone on this forum.

Be nice to each other boys and girls. And don't cheap out on a power supply.


CPU: Intel Core i7 4790K - 4.5 GHz | Motherboard: ASUS MAXIMUS VII HERO | RAM: 32GB Corsair Vengeance Pro DDR3 | SSD: Samsung 850 EVO - 500GB | GPU: MSI GTX 980 Ti Gaming 6GB | PSU: EVGA SuperNOVA 650 G2 | Case: NZXT Phantom 530 | Cooling: CRYORIG R1 Ultimate | Monitor: ASUS ROG Swift PG279Q | Peripherals: Corsair Vengeance K70 and Razer DeathAdder

 


22 minutes ago, MageTank said:

I know the Titan cards have never been great as far as performance per dollar goes, but has it ever been this bad in the past? Not to mention (correct me if I am wrong) the Titan X (Pascal) only has 12GB of VRAM, which hinders their ability to cut it down properly. They can't just cut the 1080 Ti (if it ever exists) down to 6GB, as it would not look good on paper next to the normal GTX 1080 with 8GB of VRAM. And if they leave it at 12GB of VRAM and only cut down the cores, it may not be enough to make the Titan X look like the better purchase decision (kinda how the Titan X (Maxwell) had 12GB of VRAM vs 6GB on the 980 Ti).

 

Is it possible to cut, say, 4GB off that 12GB? We already know Nvidia is capable of cutting off specific pieces of the VRAM (the 3.5GB + 0.5GB split on the 970 is a great example). Could we see such a thing on the GTX 1080 Ti (again, if it ever exists)?

 

This just seems like a very confusing launch to me. The timing is odd, given that AMD has yet to even release a card to put the 1080 on edge. Why is Nvidia competing against itself in the high-end market? Granted, costing nearly twice as much puts it in a market of its own, but it still seems too early from a sales standpoint. I would let people finish buying the 1080s and wait for sales to die down/competition to emerge before releasing a superior, more expensive product.

 

Hopefully people that are more educated on the subject can answer a few of these questions, as I am genuinely confused. This is what I get for being gone for so long.

I guess they could technically reduce the number of memory controllers/bandwidth, but honestly I believe they will make the 1080 Ti 12GB too. That could explain launching the Titan XP already now, even though it can only cannibalize the 1080 market. That way the Titan XP will be on the market long enough that they can retire it completely after saturating the market. The Titans are usually dead in the water the second the x80 Ti cards launch anyway.

 

As for the price: Seems like a lot of people in here find it completely reasonable. I don't.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro

