Jensen Huang (Nvidia CEO) Responds to AMD: "It's underwhelming -- we'll crush it"

By Deus Voltage
2 minutes ago, mr moose said:

And it's proof to the investors that they can make the goods. The XX80 and XX80 Ti do not exist for consumer demand; they exist because A. you have to push the envelope all the time or you get overtaken, B. the investors want to see you actually prove you have the mettle in the game, C. those cards make good mid-tier next year if all else fails.

True, but at this point there's a 4th factor, and it's just market demand as AI/compute needs explode.

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


2 minutes ago, D13H4RD said:

I'd say that NVIDIA knows how to market their products. It's why the GeForce brand is strong. 

 

As I said in another thread, it's easier to market ray-tracing than HBM2. 

That, and GPUs (like most products) are designed to be the best first. Companies don't make a budget model and then try to improve it to sell as a premium product; they build the best product they can and then cut it down to make the budget options. When people see the best coming out of AMD, they know that is their best, and that turns customers off.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 minute ago, S w a t s o n said:

True, but at this point there's a 4th factor, and it's just market demand as AI/compute needs explode.

True, and it ties in nicely with this:

Just now, mr moose said:

That, and GPUs (like most products) are designed to be the best first. Companies don't make a budget model and then try to improve it to sell as a premium product; they build the best product they can and then cut it down to make the budget options. When people see the best coming out of AMD, they know that is their best, and that turns customers off.

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 minute ago, ZacoAttaco said:

That trickles down to creators as well: if there is more hype around NVIDIA products, creators will cater to that interest. Look at the ratio of LTT videos featuring or containing NVIDIA versus AMD cards.

And it trickles down further. 

 

Because when a brand is associated with a noteworthy product, that association spreads to the other products that share the same brand.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


1 minute ago, schwellmo92 said:

Was it underwhelming? It was about what I was expecting, except for the 16GB of VRAM.

And most were expecting a price tag between the 2080 and 2080 Ti.

 

At least based on talk here on the forum.


By saying the R7 performs lousy, is he not also implying that the RTX 2080 performs lousy, since the R7 beats the RTX 2080 (per AMD's slides)?

"And if we turn on DLSS we’ll crush it "  okay turn it on, wait that feature is not ready yet (ignoring the visual problems with it on that can maybe be fixed)

 "And if we turn on ray tracing we’ll crush it" only thing RT crushes at the moment it your framerate. 50+% loss of frames for a tiny visual improvement.

 

Yes, their launch is underwhelming, but it's better than the 2080 Ti, 2080, and 2070 launches, which were huge disappointments, and the most recent 2060 launch, which is also very underwhelming.
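For what that framerate hit means in frame-time terms, here's a quick illustrative sketch; the ~50% figure is the rough hit people were reporting in early RTX titles, not a measured benchmark:

```python
# Frame-time cost of a ~50% FPS drop from enabling ray tracing (illustrative).
RT_HIT = 0.5  # assumed fractional performance loss, not a measured number

for fps in (100, 60):
    rt_fps = fps * (1 - RT_HIT)
    print(f"{fps} fps ({1000 / fps:.1f} ms/frame) -> "
          f"{rt_fps:.0f} fps ({1000 / rt_fps:.1f} ms/frame)")
# 100 fps (10.0 ms/frame) -> 50 fps (20.0 ms/frame)
# 60 fps (16.7 ms/frame)  -> 30 fps (33.3 ms/frame)
```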


HBM2 being literally 50% of MSRP is fucking AMD over real bad. It costs about double what GDDR6 does, and GDDR6 is expensive as it is. Unfortunately, SK Hynix didn't live up to their claimed pricing, and with supply not being increased (*cough* oligopoly *cough*) there's not much AMD can do, as Vega needs all the memory bandwidth it can get.

In an alternate universe this card costs like $499
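To put rough numbers on the "50% of MSRP" claim, here's a sketch using the community cost estimates that were floating around at the time; none of these are confirmed AMD figures:

```python
# Rough BOM sketch for the "HBM2 is ~50% of MSRP" claim.
# All figures are community estimates, not confirmed AMD numbers.
HBM2_COST_PER_8GB = 150.0   # USD, estimated contract price per 8 GB of HBM2
INTERPOSER_COST   = 25.0    # USD, estimated interposer/packaging cost
MSRP              = 699.0   # Radeon VII launch price

memory_cost = 2 * HBM2_COST_PER_8GB + INTERPOSER_COST   # 16 GB + interposer
share = memory_cost / MSRP
print(f"Memory subsystem: ${memory_cost:.0f} (~{share:.0%} of MSRP)")
# -> Memory subsystem: $325 (~46% of MSRP)
```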

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


I still don't get Huang's statement on FreeSync. 

 

What do you mean, it was never proven to work? Because there are a large number of monitors that don't meet your standards?

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


3 minutes ago, D13H4RD said:

And it trickles down further. 

 

Because when a brand is associated with a product that is noteworthy, it spreads towards the other products that share the same brand. 

This was true for most normies when AMD didn't make relevant chips; people were turned off by laptops that didn't have an Intel chip, and if it was a gaming laptop, it had to have an NVIDIA mobile chip. Now, thanks in part to Intel's supply issues and the big step forward that Ryzen was for AMD, that tide is slowly changing.


2 minutes ago, ZacoAttaco said:

Now, thanks in part to Intel's supply issues and the big step forward that Ryzen was for AMD, that tide is slowly changing.

And for AMD, they have to keep the momentum going.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


Isn't NVIDIA in some trouble right now, since they overproduced GPUs and then the crypto market for GPUs fell out?


41 minutes ago, Morgan MLGman said:

Yeah, I just think that 16GB of HBM2 is a bit too much considering its cost (GamersNexus analyzed the cost of using HBM2 vs GDDR5X on Vega)... If they had used less memory on the Radeon VII, it could have been $599... And it should have been that much.

They doubled the quantity and doubled the bus width. Give or take some clock differences, they doubled the bandwidth. Were they forced into this quantity to get the bandwidth? And if so, does the card need that bandwidth for its performance? I think that's the question.

 

16GB might make sense for uses other than gaming, but it does seem excessive for a gamer, for now. Maybe they'll do further cut-down versions in the future.
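As a sanity check on the doubling claim, here's the arithmetic with the published memory configs (Vega 64: two HBM2 stacks on a 2048-bit bus at ~1.89 Gbps per pin; Radeon VII: four stacks on a 4096-bit bus at 2.0 Gbps):

```python
# Back-of-envelope check on "doubled bus width ~= doubled bandwidth".
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

vega64  = bandwidth_gbs(2048, 1.89)  # 2 HBM2 stacks -> ~484 GB/s
radeon7 = bandwidth_gbs(4096, 2.00)  # 4 HBM2 stacks -> 1024 GB/s (~1 TB/s)
print(f"Vega 64: {vega64:.0f} GB/s, Radeon VII: {radeon7:.0f} GB/s "
      f"({radeon7 / vega64:.2f}x)")
```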

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


29 minutes ago, D13H4RD said:

I still don't get Huang's statement on FreeSync. 

 

What do you mean, it was never proven to work? Because there are a large number of monitors that don't meet your standards?

 

it was never proven to work because I, Huang, will turn on my tensor core and crush the freesync

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 minute ago, mr moose said:

 

it was never proven to work because I, Huang, will turn on my tensor core and crush the freesync

Don't forget to engage Le G-SYNC ULTIMATE 

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


Just now, D13H4RD said:

Don't forget to engage Le G-SYNC ULTIMATE 

For some reason, I had Drax in my mind as I was reading Huang's comments.

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 hour ago, Deus Voltage said:

no AI

Well, that's actually not correct, but don't let me get in the way of someone having an ego hissy fit.

 

I would have expected Jensen to act more professionally than this; maybe CES has just made him tired and cranky.


1 minute ago, porina said:

They doubled the quantity and doubled the bus width. Give or take some clock differences, they doubled the bandwidth. Were they forced into this quantity to get the bandwidth? And if so, does the card need that bandwidth for its performance? I think that's the question.

 

16GB might make sense for uses other than gaming, but it does seem excessive for a gamer, for now. Maybe they'll do further cut-down versions in the future.

Yes, it's quite clear that Vega is bandwidth-starved: overclocking the HBM on a Vega 56 gave almost as much performance as overclocking the core (without additional tweaking). GamersNexus also confirmed that AMD was forced into using HBM on Vega because of that bandwidth issue.

I think this move, although it increases the cost and price of this card, might surprise people in terms of performance gains, because this card might be quite a bit faster than the RTX 2080 once tweaked properly. We'll see what respected reviewers make of it.

Although some people have no idea what "respected" means. I recently got into an argument with someone over Hardware Unboxed; he claimed it's one of the most UNRELIABLE benchmarking tech channels because its CPU benchmark results differed from GamersNexus'. He didn't notice that HU tested games at Ultra settings at 1080p and 1440p to give viewers a real-life scenario, while GN tested at Low/Medium 720p/1080p with an RTX 2080 Ti to showcase the CPU performance difference (as unrealistic as that scenario is). Both benchmarks were correct, just done differently with a different goal in mind...
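On the bandwidth-starvation point, here's a quick sketch of why the memory OC pays off so well, using typical Vega 56 numbers (800 MHz stock HBM2 clock; ~950 MHz is a commonly reached overclock):

```python
# Why HBM overclocking helps a bandwidth-starved Vega 56 (illustrative numbers).
BUS_BITS = 2048                 # two HBM2 stacks, 1024 bits each
stock_mhz, oc_mhz = 800, 950    # typical stock clock vs. a common memory OC

def gbs(mhz: int) -> float:
    # DDR signalling: 2 transfers per clock; bytes = bits / 8
    return BUS_BITS / 8 * (mhz * 2) / 1000

print(f"stock: {gbs(stock_mhz):.0f} GB/s, OC: {gbs(oc_mhz):.0f} GB/s, "
      f"+{gbs(oc_mhz) / gbs(stock_mhz) - 1:.0%} bandwidth")
# In a fully bandwidth-bound scene, frame rate scales roughly with that ~+19%.
```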

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


1 hour ago, Deus Voltage said:

Honestly, this is just getting ridiculous at this point. I think this could perhaps have been avoided if AMD, as some here commented, had released an 8GB version at a lower price to compete more aggressively.

I doubt that this would actually be possible or would save much cost. The configuration requires four HBM memory stacks, and smaller-capacity stacks may not be getting made, or not in high enough quantities to make them any cheaper. On top of that, because you're mounting multiple dies onto a package, you don't know if it's a functioning product until afterward; then you have to validate/bin it. Another SKU has an introduction cost, and sales volume likely doesn't project a return on doing it. So basically, no matter how we slice it, an 8GB GPU wouldn't be any cheaper unless sales projections were 10x, 20x, 30x what AMD is probably expecting.
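A toy amortization sketch of that last point; every figure here is hypothetical, since neither the SKU introduction cost nor the per-card memory saving is public, but it shows the shape of the math:

```python
# Sketch of the "another SKU has an introduction cost" argument.
# All numbers are hypothetical placeholders, not known AMD costs.
NRE = 5_000_000              # USD: hypothetical cost to validate, bin, and launch a new SKU
MEMORY_SAVED_PER_CARD = 50   # USD: hypothetical saving from 8 GB vs 16 GB of HBM2

for units in (50_000, 500_000, 5_000_000):
    nre_per_card = NRE / units
    net = MEMORY_SAVED_PER_CARD - nre_per_card
    print(f"{units:>9,} units: NRE ${nre_per_card:6.2f}/card -> "
          f"net saving ${net:7.2f}/card")
# At halo-card volumes the per-card NRE eats the memory saving (or worse),
# which is the point about needing far higher sales projections.
```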


Just now, VegetableStu said:
  • "no ray tracing, no AI. It’s 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it."
  • "It’s a weird launch, maybe they thought of it this morning"
  • "Intel’s graphics team is basically AMD, right? I’m trying to figure out AMD’s graphics team."

oh yeah, kinda, with the calm "brute attempts clever statement" poise and all


He's only missing the word "pathetic".

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


6 minutes ago, mr moose said:

he's only missing the word pathetic.

 

 

[attached image: ZomboMeme]

Now it's complete. 

Spoiler

I clearly did not do this while half drunk 

 

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


1 minute ago, D13H4RD said:

[attached image: ZomboMeme]

Now it's complete. 

Spoiler

I clearly did not do this while half drunk 

 

I'm halfway there too, fighting the urge to pull out all the Hulk/Jensen crush memes for NVIDIA right now.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 hour ago, Coaxialgamer said:

But after thinking about it, I still have doubts as to the manufacturability of TU104 and TU102. AMD's 7nm Vega is literally half the size of those chips.

This is the part I thought of immediately as well, and it amuses me greatly. Sure, Vega 20 might not be better than TU104, but that die is much larger, like by a lot. Fab cost of the Vega 20 GPU and HBM dies is going to be less than TU104's, with higher yields; if AMD has to, I'm sure they can price lower than Nvidia can, or at least bleed less.

 

If the entire point of the Radeon VII is just to put pressure on Nvidia's GPU prices and potentially force a price decrease on a product that's going to outsell it 100 times over, then AMD wins. If I could make a smug industry giant effectively punch themselves in the face, I'd jump at the chance.
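A rough die-economics sketch of the size argument below. The die areas are the published figures (Vega 20 ~331 mm², TU104 ~545 mm², TU102 ~754 mm²); the defect density is an assumed value, and this deliberately ignores that 7nm wafers cost more than 12nm ones:

```python
import math

# Die-economics sketch behind "that die is much larger, like by a lot".
WAFER_DIAMETER_MM = 300
DEFECT_DENSITY = 0.1  # defects per cm^2 -- assumed; varies with process maturity

def dies_per_wafer(area_mm2: float) -> float:
    """Classic approximation for usable rectangular dies on a circular wafer."""
    d = WAFER_DIAMETER_MM
    return math.pi * (d / 2) ** 2 / area_mm2 - math.pi * d / math.sqrt(2 * area_mm2)

def poisson_yield(area_mm2: float) -> float:
    """Poisson yield model: Y = exp(-D * A), with A converted to cm^2."""
    return math.exp(-DEFECT_DENSITY * area_mm2 / 100)

for name, area in (("Vega 20", 331), ("TU104", 545), ("TU102", 754)):
    good = dies_per_wafer(area) * poisson_yield(area)
    print(f"{name}: {area} mm^2 -> ~{good:.0f} good dies/wafer")
# Smaller dies win twice: more candidates per wafer AND a higher yield each.
```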


1 hour ago, RejZoR said:

I don't know why people always have this notion that you need to be the best of the best, the fastest of the fastest. AMD proved several times with the RX 480 that that's not the case. And then again with the RX 580, despite it being just a mild refresh. And then again with the RX 590, despite it being yet another mild refresh. Vega 64 cards, after receiving some fine wine and price drops, are actually really decent cards now and come cheaper than the GTX 1080 while also being a bit faster in most cases. The "muh power consumption" narrative really got old. Three years ago almost every system had a 750W PSU basically as standard, and now all of a sudden everyone is freaking out over 500W PSUs for some reason. Stop being poor and buy a 750W PSU and you won't even give a damn what GPU is running in your system. It's not like you're playing games 24/7, where consumption would actually matter, because at idle they are all the same.

 

So, bottom line, the Radeon VII actually doesn't look bad at all. So it has a bit higher power consumption, big fucking deal. If I weren't on a GTX 1080 Ti already, I'd probably even consider it. Or even an old Vega 64 if I were aiming at that performance level. When new, they were too expensive, but now they are more reasonably priced.

An HD 7970 GHz Edition noticeably warms my room up, and it consumes far less power than a Vega 56 or 64... so that's a shitload of heat getting dumped into the case, and from there into the room the rig is in, if you're using Vega. Power consumption matters, at the very least for keeping the room from turning into a desert or a sauna.
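Practically all of a card's board power ends up as heat in the room, so you can compare them directly; the wattages below are the commonly cited typical board powers:

```python
# A GPU is effectively a space heater: board power in ~= heat out.
# Wattages are the commonly cited typical board-power figures.
CARDS_WATTS = {"HD 7970 GHz Edition": 250, "Vega 56": 210, "Vega 64": 295}

for name, watts in CARDS_WATTS.items():
    # 1 W sustained ~= 3.412 BTU/h of heat into the room
    print(f"{name}: {watts} W -> ~{watts * 3.412:.0f} BTU/h into the room")
```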

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


From my understanding, though, with all the HBM2 and the interposer, they are selling the Radeon VII at cost. This comes from an AMD engineer who left this fall.

