
Jensen Huang (Nvidia CEO) Responds to AMD: "It's underwhelming -- we'll crush it"

Deus Voltage
6 hours ago, ZacoAttaco said:

Yeah, it's weird how no one really knows about this. The AMD CEO and the NVIDIA CEO are actually related; they mustn't get along, though.

Nah, I think most CEOs only care about money... so I am pretty sure that both of them are rich.

If it is not broken, let's fix it till it is. 


11 hours ago, yolosnail said:

The thing is, he isn't necessarily wrong that basically no effort went into the 'Radeon 7'; all they did was take one of their server GPUs, slap on a new cooler, and ask Apple for a name.

And let's be real, if you can take what is essentially a 'defective' card and rebrand it as a pretty good gaming card, then you're doing something right.

 

I think what has happened is that AMD never thought they would be able to sell this profitably as a gaming card.

 

But Nvidia has pushed up prices so much (the RTX 2080 and RTX 2080 Ti are so expensive) that it gives AMD room to sell this as a high-end gaming card at $700.

 

It surprised AMD themselves, so Nvidia's continuous price inflation every generation has benefited AMD this time, although it is really damaging PC gaming overall.


7 minutes ago, Humbug said:

I think what has happened is that AMD never thought they would be able to sell this profitably as a gaming card.

 

But Nvidia has pushed up prices so much (the RTX 2080 and RTX 2080 Ti are so expensive) that it gives AMD room to sell this as a high-end gaming card at $700.

 

It surprised AMD themselves, so Nvidia's continuous price inflation every generation has benefited AMD this time, although it is really damaging PC gaming overall.

Maybe, but Vega basically sold out at the same price when Nvidia was cheaper in some places. I think the demand for GPUs right now is high across the board. Having seen this, it wouldn't surprise me if Nvidia's prices and tier structure have little to do with AMD and more to do with knowing what the market will pay.

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


11 minutes ago, mr moose said:

Maybe, but Vega basically sold out at the same price when Nvidia was cheaper in some places. I think the demand for GPUs right now is high across the board. Having seen this, it wouldn't surprise me if Nvidia's prices and tier structure have little to do with AMD and more to do with knowing what the market will pay.

 

I think Vega also sold out during the mining craze. Miners were buying, gamers were desperate, and HBM production kept Vega supply limited.
 

You are correct that Nvidia's pricing is (within reason) dictated by how much the market will pay. Obviously, if AMD launches the Radeon 7 at $450 then Nvidia cannot ignore it. But for the most part they rely on their brand name, and their current strategy is to inflate prices every successive generation.

GTX 960 --> GTX 1060 --> RTX 2060

GTX 970 --> GTX 1070 --> RTX 2070

GTX 980 --> GTX 1080 --> RTX 2080

It's pretty alarming, and it's bad for PC gaming. But they know people will pay, and it increases their margins in the short term.

 

AMD, on the other hand, have less brand-name recognition, so they set their MSRPs by looking at Nvidia and then going a bit lower. With a GPU like the 7nm Vega, with stacks of HBM2 that are so expensive to produce, AMD expected they could only make money from this by selling it as an Instinct card. But then Nvidia became so expensive that it allowed AMD to slip this one in, and they probably feel it's better to maintain a market presence at the high end.


8 minutes ago, Humbug said:

I think Vega also sold out during the mining craze. Miners were buying, gamers were desperate, and HBM production kept Vega supply limited.
 

You are correct that Nvidia's pricing is (within reason) dictated by how much the market will pay. Obviously, if AMD launches the Radeon 7 at $450 then Nvidia cannot ignore it. But for the most part they rely on their brand name, and their current strategy is to inflate prices every successive generation.

GTX 960 --> GTX 1060 --> RTX 2060

GTX 970 --> GTX 1070 --> RTX 2070

GTX 980 --> GTX 1080 --> RTX 2080

It's pretty alarming, and it's bad for PC gaming. But they know people will pay, and it increases their margins in the short term.

 

AMD, on the other hand, have less brand-name recognition, so they set their MSRPs by looking at Nvidia and then going a bit lower. With a GPU like the 7nm Vega, with stacks of HBM2 that are so expensive to produce, AMD expected they could only make money from this by selling it as an Instinct card. But then Nvidia became so expensive that it allowed AMD to slip this one in, and they probably feel it's better to maintain a market presence at the high end.

Bugger, I just wrote out a nice post then closed the tab like a moron. Agree and disagree. AMD definitely need to get some top-end market presence, though.



12 hours ago, ZacoAttaco said:

Even if you did make it up, this is the same guy who said:

I wish people would stop quoting him on that without proper context. 

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones


32 minutes ago, Humbug said:

I think Vega also sold out during the mining craze. Miners were buying, gamers were desperate, and HBM production kept Vega supply limited.
 

You are correct that Nvidia's pricing is (within reason) dictated by how much the market will pay. Obviously, if AMD launches the Radeon 7 at $450 then Nvidia cannot ignore it. But for the most part they rely on their brand name, and their current strategy is to inflate prices every successive generation.

GTX 960 --> GTX 1060 --> RTX 2060

GTX 970 --> GTX 1070 --> RTX 2070

GTX 980 --> GTX 1080 --> RTX 2080

It's pretty alarming, and it's bad for PC gaming. But they know people will pay, and it increases their margins in the short term.

 

AMD, on the other hand, have less brand-name recognition, so they set their MSRPs by looking at Nvidia and then going a bit lower. With a GPU like the 7nm Vega, with stacks of HBM2 that are so expensive to produce, AMD expected they could only make money from this by selling it as an Instinct card. But then Nvidia became so expensive that it allowed AMD to slip this one in, and they probably feel it's better to maintain a market presence at the high end.

I blame the price increases from both Nvidia and AMD mostly on the mining craze; people were willing to pay crazy prices then, and most still are now because GPUs seem cheap compared to how expensive cards were during the mining craze. AMD is a business too, and they aren't in the business of caring any more about the consumer than Nvidia is. AMD took workstation chips that were otherwise defective and charges $699 for them, so they could definitely be undercutting while still having a competitive card. It's just pure profit for AMD, and piles of money if they sell the cards to Apple too.

Sure, the 20 series looks expensive when you just compare the numbering scheme, but the 2060 performs about as well as a 1070, a 2070 is a bit better than a 1080, and a 2080 is better than a 1080 Ti.


6 minutes ago, Lathlaer said:

I wish people would stop quoting him on that without proper context. 

never going to happen on the internet.



36 minutes ago, mr moose said:

Bugger, I just wrote out a nice post then closed the tab like a moron. Agree and disagree. AMD definitely need to get some top-end market presence, though.

hate when that happens


Quote

“The performance is lousy and there’s nothing new,” Huang said. “[There’s] no ray tracing, no AI. It’s 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it.”

I like this quote a lot; it says all you need to know about him. Granted, it is not the product we had hoped for (too expensive), but he chooses to focus on technologies such as DLSS and ray tracing that even his own products don't really deliver. Then he proceeds to insult his own products by claiming that 2080 performance is lousy. Bravo, Jensen!

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


37 minutes ago, Blademaster91 said:

I blame the price increases from both Nvidia and AMD mostly on the mining craze; people were willing to pay crazy prices then, and most still are now because GPUs seem cheap compared to how expensive cards were during the mining craze. AMD is a business too, and they aren't in the business of caring any more about the consumer than Nvidia is. AMD took workstation chips that were otherwise defective and charges $699 for them, so they could definitely be undercutting while still having a competitive card. It's just pure profit for AMD, and piles of money if they sell the cards to Apple too.

Sure, the 20 series looks expensive when you just compare the numbering scheme, but the 2060 performs about as well as a 1070, a 2070 is a bit better than a 1080, and a 2080 is better than a 1080 Ti.

And a 6600K performs worse than a 7600K, which performs worse than an 8600K, but you can't raise prices just because the technology improves and then force the upgrade by removing the previous product.

Those 20-series prices would be OK if we could still buy a 1080 or 1080 Ti for the entire 20-series lifespan, but guess what: you won't be able to buy a 10-series card for much longer :( 

Case: Corsair 760T  |  Psu: Evga  650w p2 | Cpu-Cooler : Noctua Nh-d15 | Cpu : 8600k  | Gpu: Gygabyte 1070 g1 | Ram: 2x8gb Gskill Trident-Z 3000mhz |  Mobo : Aorus GA-Z370 Gaming K3 | Storage : Ocz 120gb sata ssd , sandisk 480gb ssd , wd 1gb hdd | Keyboard : Corsair k95 rgb plat. | Mouse : Razer deathadder elite | Monitor: Dell s2417DG (1440p 165hz gsync) & a crappy hp 24' ips 1080p | Audio: Schiit stack + Akg k712pro + Blue yeti.


1 hour ago, Blademaster91 said:

I blame the price increases from both Nvidia and AMD mostly on the mining craze; people were willing to pay crazy prices then, and most still are now because GPUs seem cheap compared to how expensive cards were during the mining craze.

Agreed. I see three factors in the continuous, destructive price escalation:

-mining craze

-Nvidia's strategy to keep increasing prices every generation

-AMD's failure to offer proper competition in the high end (it took 1.5 years to match the 1080 and almost 2 years to beat the 1080 Ti)

 

1 hour ago, Blademaster91 said:

AMD is a business too, and they aren't in the business of caring any more about the consumer than Nvidia is. AMD took workstation chips that were otherwise defective and charges $699 for them, so they could definitely be undercutting while still having a competitive card. It's just pure profit for AMD, and piles of money if they sell the cards to Apple too.

Sure, the 20 series looks expensive when you just compare the numbering scheme, but the 2060 performs about as well as a 1070, a 2070 is a bit better than a 1080, and a 2080 is better than a 1080 Ti.

Oh absolutely. AMD is here to make money too.

 

However, if they could have mustered enough production, it would have been a good strategic move in the long term to launch the Radeon 7 at USD 500-550 and market it as the savior of high-end PC gaming, bringing back normal prices. The community would have eaten that up. Let's face it: when prices and performance are nearly competitive, people mostly buy Nvidia. Right now people are not happy with Nvidia, because the continuous price escalation is putting good hardware out of reach for many. So AMD could have used that situation to build the Radeon brand and endear gamers to them. 

 

It could have been a chance to take a short-term hit on margins in order to build their brand and get some penetration with the high-end PC community.


7 hours ago, Carclis said:

I like this quote a lot. It says all you need to know about him. Granted it is not the product that we had hoped for (too expensive) but he chooses to focus on the technologies such as DLSS and ray tracing that even his own products don't do. Then he proceeds to insult his own products by claiming that 2080 performance is lousy. Bravo Jensen!

It's not that he's calling the 2080's performance lousy; he's saying that for a 7nm card with 16GB of HBM2, a higher power draw, and no transistors spent on RT or Tensor cores, 2080-level performance is lousy. Which it is!

 

 


9 hours ago, Humbug said:

I think what has happened is that AMD never thought they would be able to sell this profitably as a gaming card.

 

But Nvidia has pushed up prices so much (the RTX 2080 and RTX 2080 Ti are so expensive) that it gives AMD room to sell this as a high-end gaming card at $700.

 

It surprised AMD themselves, so Nvidia's continuous price inflation every generation has benefited AMD this time, although it is really damaging PC gaming overall.

I think you are 100% correct on this. I was trying to find the HBM2 costs, and it looks like it has to be at least $200-300 for 16GB. So yeah, if Nvidia had kept the $550 price point for the 2080, AMD would likely not have bothered with a Radeon VII, as they would probably be losing money on each board sold. 
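That memory-cost point can be sanity-checked with some quick arithmetic. A rough sketch (the $200-300 figure for 16GB of HBM2 is only the estimate from the post above; the per-GB numbers and the "what's left over" split are derived from it for illustration, not sourced BOM data):

```python
# Back-of-the-envelope BOM headroom check for the Radeon VII.
# The HBM2 cost range is the estimate quoted above; all other
# figures are illustrative assumptions, not sourced data.

HBM2_COST_RANGE = (200, 300)   # estimated cost of 16 GB of HBM2, USD
VRAM_GB = 16

MSRP_ACTUAL = 699              # Radeon VII launch price
MSRP_HYPOTHETICAL = 550        # the "2080 at $550" scenario from the post

for msrp in (MSRP_ACTUAL, MSRP_HYPOTHETICAL):
    for hbm_cost in HBM2_COST_RANGE:
        left_over = msrp - hbm_cost
        print(f"MSRP ${msrp}: ${hbm_cost} of HBM2 "
              f"(${hbm_cost / VRAM_GB:.2f}/GB) leaves ${left_over} "
              f"for die, board, cooler, and margin")
```

At $550 with $300 of memory, only $250 would remain for everything else, which is why a $550 launch looks unworkable under this estimate.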


11 hours ago, Chett_Manly said:

It's not that he's calling the 2080's performance lousy; he's saying that for a 7nm card with 16GB of HBM2, a higher power draw, and no transistors spent on RT or Tensor cores, 2080-level performance is lousy. Which it is!

 

 

Which is fine, except it has those extra transistors spent elsewhere. For example, it scales past half precision all the way down to INT4 and also does half-rate FP64. There is also a massive disparity in die size that is most likely not accounted for even by the 12nm-to-7nm jump. In fact, you could easily argue the same for his own RTX 2080: it has a slightly more advanced process, uses considerably more die space, costs more, consumes more power, and is still only at parity with a 1080 Ti despite a 2.5-year head start in development time.



15 minutes ago, Carclis said:

Which is fine, except it has those extra transistors spent elsewhere. For example, it scales past half precision all the way down to INT4 and also does half-rate FP64. There is also a massive disparity in die size that is most likely not accounted for even by the 12nm-to-7nm jump. In fact, you could easily argue the same for his own RTX 2080: it has a slightly more advanced process, uses considerably more die space, costs more, consumes more power, and is still only at parity with a 1080 Ti despite a 2.5-year head start in development time.

If it were at parity with a 1080, didn't have Tensor or RT cores, and didn't have an entire world of optimizations in front of it, then yes, you would have a point. But it doesn't: it performs as it should (on par with one model up from the previous series), has plenty of room for optimization in software and adoption, and has Tensor cores and RT cores. Whether people think RT and AI are good value yet is moot; they are there.



5 hours ago, mr moose said:

If it were at parity with a 1080, didn't have Tensor or RT cores, and didn't have an entire world of optimizations in front of it, then yes, you would have a point. But it doesn't: it performs as it should (on par with one model up from the previous series), has plenty of room for optimization in software and adoption, and has Tensor cores and RT cores. Whether people think RT and AI are good value yet is moot; they are there.

Well, I think the argument has to revolve around price, since you can have the fastest part in the world but it's not worth a damn if it costs a fortune, i.e. the Titan. That's why I compared the 2080 to the 1080 Ti. They are similar products with basically identical performance, but the $100 price premium makes the 2080 disappointing. It brings no new performance for the given price point and costs more because of the added features. Hence why I think someone who bought a 1080 Ti for $700 and wanted to purchase an upgrade at the same price again would consider the performance of the 2080 lousy.

 

Now, I'm not saying the Radeon VII will be a good product by any means, but if it's on par with a 2080 and actually sells for $100 less, whilst cutting what a large portion of the market doesn't care about from the card, it could be a more compelling option. And I only say more compelling because none of the new cards from either company are impressive. Add to that the fact that it appears to be an incredibly capable compute card, with 16GB of VRAM and, I believe, the highest memory bandwidth on a single GPU to date at a fairly attainable price, and you have new levels of performance in areas that have actual demand. Whilst you're right that RT and AI are there on these cards, and probably to stay, I'd argue they're only there because Jensen wants them to be, and the end result is lousy performance. So Jensen standing there and remarking that he brought new features nobody wanted, which supposedly make the 0% performance gain acceptable at a $100 price premium, doesn't really resonate with me. Especially if he is going to bring down the hammer on a competing product that performs the same at a lower price point and has other key features that the RTX line doesn't support.



9 hours ago, Carclis said:

In fact, you could easily argue the same for his own RTX 2080: it has a slightly more advanced process, uses considerably more die space, costs more, consumes more power, and is still only at parity with a 1080 Ti despite a 2.5-year head start in development time.

Yes the 2080 is not a great product. Though at least you have some features that might be of use one day.

 

This also doesn't make the Radeon VII any better. If anything, the Radeon VII is a 1080 Ti competitor launched at the inflated RTX price point. It is not a good product either.

 

Both of them fail to deliver a performance-per-dollar increase over the previous generation, which is quite the FU to consumers.
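That "performance per dollar" complaint can be made concrete with a tiny sketch (the relative-performance numbers below are hypothetical placeholders chosen to mirror the thread's claim of identical performance at a $100 premium, not benchmark data):

```python
# Sketch: compare performance per dollar across two generations.
# Prices echo launch MSRPs mentioned in the thread; the relative
# performance figures are illustrative placeholders, not benchmarks.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Return performance units delivered per dollar spent."""
    return relative_perf / price

# If a new card performs like the old one but costs more,
# perf-per-dollar goes down -- the thread's core complaint.
old = perf_per_dollar(relative_perf=100, price=699)  # e.g. a 1080 Ti at $699
new = perf_per_dollar(relative_perf=100, price=799)  # e.g. a 2080 at $799

assert new < old  # same performance at a higher price is worse value
print(f"old: {old:.4f} perf/$   new: {new:.4f} perf/$")
```

Any real comparison would substitute measured benchmark numbers for the placeholders; the point is only that equal performance at a higher price is strictly worse value.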


10 hours ago, Carclis said:

Well, I think the argument has to revolve around price, since you can have the fastest part in the world but it's not worth a damn if it costs a fortune, i.e. the Titan. That's why I compared the 2080 to the 1080 Ti. They are similar products with basically identical performance, but the $100 price premium makes the 2080 disappointing. It brings no new performance for the given price point and costs more because of the added features. Hence why I think someone who bought a 1080 Ti for $700 and wanted to purchase an upgrade at the same price again would consider the performance of the 2080 lousy.

You're still ignoring the RT and Tensor cores. They may not mean anything to you now, but that doesn't mean they don't exist or have a price tag.

 

10 hours ago, Carclis said:

Now, I'm not saying the Radeon VII will be a good product by any means, but if it's on par with a 2080 and actually sells for $100 less, whilst cutting what a large portion of the market doesn't care about from the card, it could be a more compelling option. And I only say more compelling because none of the new cards from either company are impressive. Add to that the fact that it appears to be an incredibly capable compute card, with 16GB of VRAM and, I believe, the highest memory bandwidth on a single GPU to date at a fairly attainable price, and you have new levels of performance in areas that have actual demand. Whilst you're right that RT and AI are there on these cards, and probably to stay, I'd argue they're only there because Jensen wants them to be, and the end result is lousy performance. So Jensen standing there and remarking that he brought new features nobody wanted, which supposedly make the 0% performance gain acceptable at a $100 price premium, doesn't really resonate with me. Especially if he is going to bring down the hammer on a competing product that performs the same at a lower price point and has other key features that the RTX line doesn't support.

 

I don't think any of that makes a tangible difference. Again, your preference for the value of RT and AI now doesn't change their existence. Now, if in 3 years AI and RT are not working and Nvidia are still charging a premium for them, you'd have an argument, but right now it's just an early-adopter tax at best, and at worst we lack sufficient evidence to call it a failure. 



3 minutes ago, valdyrgramr said:

They originally acted like it was replacing the Titan as the high-tier gaming card, and like they weren't releasing a card named the Titan. Then they released a Titan, which pissed off a lot of people, IIRC. I could be wrong about those two.
 

That was just consumer speculation. I always claimed it was true from a marketing angle, which is still true if you consider they aren't aiming the Titan at gamers (for a change), and they never actually said they weren't releasing it; that was just forum talk.



11 hours ago, Chett_Manly said:

Yes the 2080 is not a great product. Though at least you have some features that might be of use one day.

 

This also doesn't make the Radeon VII any better. If anything, the Radeon VII is a 1080 Ti competitor launched at the inflated RTX price point. It is not a good product either.

 

Both of them fail to deliver a performance-per-dollar increase over the previous generation, which is quite the FU to consumers.

Well, the Radeon VII has features that are available today, versus those of the RTX line, which have yet to be proven. To be fair, though, I don't think the Radeon VII was ever supposed to be aimed at or available to consumers, which is why Linus is quite excited to see how well it performs for professional use. I suspect it became a feasible option because of how uncompetitive RTX is. I think the only losers here are gamers.

 

5 hours ago, mr moose said:

You're still ignoring the RT and Tensor cores. They may not mean anything to you now, but that doesn't mean they don't exist or have a price tag.

Well, I mentioned that in my previous comment. So I think it's fair to assume that if the cards had launched without the new features we would have seen the usual pricing structure, which means the RTX 2080 jumped $200-300 for those features alone (depending on whose pricing you wish to believe). Given that Jensen's statement reads as:

Quote

"And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it"

You have yourself a very big "if" next to that being a valid claim, a $200-300 one in fact. The thing is, Jensen likes to tell gamers and developers what they want, but that doesn't really translate into the real world, as you'll probably see if you look at stock levels of 2070s or 2080s, or at how many titles actually support the new technologies. A significant number of titles that Nvidia claimed were adopting them are still missing either feature despite already being full releases (also ignoring how poor the existing implementations are). I mean, we're told "it just works", but does it? Or are people just not interested?

DLSS and RTX may have a price tag, but that doesn't stop the RTX 2080 from being just as lousy as the Radeon VII, if not lousier.



19 minutes ago, Carclis said:

Well, the Radeon VII has features that are available today, versus those of the RTX line, which have yet to be proven. To be fair, though, I don't think the Radeon VII was ever supposed to be aimed at or available to consumers, which is why Linus is quite excited to see how well it performs for professional use. I suspect it became a feasible option because of how uncompetitive RTX is. I think the only losers here are gamers.

 

Well, I mentioned that in my previous comment. So I think it's fair to assume that if the cards had launched without the new features we would have seen the usual pricing structure, which means the RTX 2080 jumped $200-300 for those features alone (depending on whose pricing you wish to believe). Given that Jensen's statement reads as:

You have yourself a very big "if" next to that being a valid claim, a $200-300 one in fact. The thing is, Jensen likes to tell gamers and developers what they want, but that doesn't really translate into the real world, as you'll probably see if you look at stock levels of 2070s or 2080s, or at how many titles actually support the new technologies. A significant number of titles that Nvidia claimed were adopting them are still missing either feature despite already being full releases (also ignoring how poor the existing implementations are). I mean, we're told "it just works", but does it? Or are people just not interested?

DLSS and RTX may have a price tag, but that doesn't stop the RTX 2080 from being just as lousy as the Radeon VII, if not lousier.

Jensen is just spouting the usual CEO rhetoric; "we'll crush it" just means he thinks his product is exceptionally better than theirs. Not too sure I've ever seen a CEO claim otherwise.

 

It really just comes back to comparing the cards for what they are. Jensen's comments don't make one better value than the other; only the end user can determine that.



6 hours ago, mr moose said:

You're still ignoring the RT and Tensor cores. They may not mean anything to you now, but that doesn't mean they don't exist or have a price tag.

And you’re ignoring the extra 8GB of VRAM and additional compute capabilities. They both have a niche.


The thing is that the 2080 has extras whose capabilities have yet to be fully realized.

 

So if and when the RT and Tensor cores can be fully utilized on the 2080, that's where the interesting stuff begins.

 

Trouble is, the 2080 is still very new and those features have yet to be realized. It's likely that by the time the RT and Tensor capabilities are realized, a new breed of GPUs will already be on the way.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


3 minutes ago, D13H4RD said:

The thing is that the 2080 has extras whose capabilities have yet to be fully realized.

 

So if and when the RT and Tensor cores can be fully utilized on the 2080, that's where the interesting stuff begins.

 

Trouble is, the 2080 is still very new and those features have yet to be realized. It's likely that by the time the RT and Tensor capabilities are realized, a new breed of GPUs will already be on the way.

By the time the RT and Tensor capabilities are realised, I expect there to be a second, improved iteration of cards with those capabilities from NVIDIA, and an AMD implementation. Buying the first gen of a new technology is something I generally give a miss; it's just never worth it.

