
NVIDIA Fires Shots at AMD’s 7nm Tech - Claims "Can Create Most Energy-efficient GPU in the World Anytime"

1 hour ago, RejZoR said:

There was one point in my ownership of a GTX 1080 Ti where I had so many dumb problems with it I was "this" close to selling the damn thing and buying a Vega 64. NVIDIA, stfu and sit down, you annoying bragging little shit. This is one of the reasons why I don't like NVIDIA, and especially its founder, who acts like a little kid bragging about how his dad or brother can kick anyone's ass.

 

Great, so you have the most power-efficient GPU and the fastest GPU at that. Has anyone looked at the NVIDIA Control Panel recently and said with an honest face, "yup, this is modern"? The damn thing looks the same as it did back in 2004 and it's absolute shit. Organized like a turd, all flickering and twitching, selections jerking around when you change anything, and you need the fat and useless NVIDIA "Experience" to get some extra, mostly useless functionality that doesn't even work in 99% of stuff (like that custom shading thing that I couldn't get to work in any damn game). How about you shut up and improve your garbage software instead, eh NVIDIA?

 

I was setting up an AMD system for a relative, and boy, the AMD Crimson control panel is such a sublime experience. Beautiful, fast, responsive, well arranged, and the new OSD feature is freaking amazing, as we could monitor GPU clock under heavy load, see thermals, and also the framerate without having to touch the MSI Afterburner installer. No fiddling with extra fat software; it's part of the drivers.

AMD's control panel is vastly better looking and better designed, but honestly the NCP has never bothered me. The biggest thing it needs is some work to make it more responsive, a problem it's had from day one.


The reason they'll never do this is because they're too busy spending billions chasing pipe dreams like ray tracing, which no one asked for and, based on current implementation rates in software, no one wants either.

 

The phrase "the bigger you are, the harder you fall" applies to Nvidia in this scenario. Just like Intel, they got complacent being market leader, and instead of focusing on what mattered they decided to focus on a personal dream of their CEO. Guess what, Nvidia: while you were wasting time developing RTX, AMD caught up to you and might just overtake you in the market where it really matters.

 

This is what happens when you chase the top 1% and ignore the other 99%. The top 1% might buy the most expensive cards, but the other 99% is where the real money lies.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Windows 10 Pro X64 |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)

4 hours ago, ARikozuM said:

If Huang isn't going to do it, he shouldn't bring it up. Unless he's trying to be an American politician... 

 


 


Give us a single-slot, low-profile, fanless 750 Ti with the power to RTX everything at 2160p@60 Ultra. - The People

 

Many people are looking at this from too narrow a perspective. Sure, the Radeon VII might only trade blows with the RTX 2080 in gaming performance, but have you seen its performance in professional/data centre tasks? It can compete with Nvidia's highest end, and do so while costing significantly less and even consuming less power. The data centre is already a large part of Nvidia's revenue and is only going to continue growing. The only reason a CEO would ever speak in such a manner about a competitor is as a last, frankly pathetic, attempt to prevent customers from switching over to the competitor's products. Sure, Nvidia has its niche, but with the overall disappointment that ray tracing was, and the ever-increasing news that AMD GPUs can actually ray trace as well, Nvidia is shitting their pants. This is not to say that AMD will completely annihilate Nvidia, because that simply won't happen. Nvidia has way too many resources and, as stated, is still ahead of AMD in certain workloads at the current time. But change is coming. Otherwise you would most definitely not hear any such statements from the CEO of a multi-billion-dollar company.

32 minutes ago, Master Disaster said:

The reason they'll never do this is because they're too busy spending billions chasing pipe dreams like ray tracing, which no one asked for and, based on current implementation rates in software, no one wants either.

 

The phrase "the bigger you are, the harder you fall" applies to Nvidia in this scenario. Just like Intel, they got complacent being market leader, and instead of focusing on what mattered they decided to focus on a personal dream of their CEO. Guess what, Nvidia: while you were wasting time developing RTX, AMD caught up to you and might just overtake you in the market where it really matters.

 

This is what happens when you chase the top 1% and ignore the other 99%. The top 1% might buy the most expensive cards, but the other 99% is where the real money lies.

If ray tracing wasn't worth chasing, Nvidia wouldn't be investing in it, and apparently AMD wants to add ray tracing to their cards as well.

The implementation is slow because it's new, and game publishers care more about money than about implementing a feature that significantly improves visual quality. The real money is in the top 1%: in the top-tier cards and in datacenter applications.


Hah, this is a great way for Nvidia to piss off their datacenter customers, who are presumably asking for such a product.


Let's see: in gaming performance, the 7nm Radeon VII is comparable to the 1080 Ti released nearly two years earlier, and still doesn't beat it on power consumption. Am I missing something here?

 

I'll grant the VII has potential to do really well in some other specific tasks, and in those cases the performance certainly follows. On a similar note, for compute applications I did see Turing take a good step ahead of Pascal relative to gaming performance; the compute is both faster and lower power-consuming.

Desktop Gaming system: Asrock Z370 Pro4, i7-8086k stock, Noctua D15, Corsair Vengeance Pro RGB 3200 4x16GB, Asus Strix 1080Ti, NZXT E850 PSU, Cooler Master MasterBox 5, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p 144Hz G-sync

TV Gaming system: Asus X299 TUF mark 2, 7920X @ 8c8t, Noctua D15, Corsair Vengeance LPX RGB 3000 8x8GB, EVGA 2080Ti Black, Corsair HX1000i, GameMax Abyss, Samsung 970 Evo 500GB, LG OLED55B9PLA

Former Main system: Asus Maximus VIII Hero, i7-6700k stock, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, Gigabyte GTX 1650, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, HP LP2475W 1200p wide gamut

Gaming laptop: Asus FX503VD, i5-7300HQ, 2x8GB DDR4, GTX 1050, Sandisk 256GB + 480GB SSD

28 minutes ago, Master Disaster said:

The reason they'll never do this is because they're too busy spending billions chasing pipe dreams like ray tracing, which no one asked for and, based on current implementation rates in software, no one wants either.

 

The phrase "the bigger you are, the harder you fall" applies to Nvidia in this scenario. Just like Intel, they got complacent being market leader, and instead of focusing on what mattered they decided to focus on a personal dream of their CEO. Guess what, Nvidia: while you were wasting time developing RTX, AMD caught up to you and might just overtake you in the market where it really matters.

 

This is what happens when you chase the top 1% and ignore the other 99%. The top 1% might buy the most expensive cards, but the other 99% is where the real money lies.

Everyone wants ray tracing; it has been the "holy grail" of graphics and effects design for decades. The adoption rate is slow because the performance isn't there right now, nor are the sales. Also, you do realize that both companies have developed professional products that handle ray tracing for movie studios, right?

3 minutes ago, porina said:

Let's see: in gaming performance, the 7nm Radeon VII is comparable to the 1080 Ti released nearly two years earlier, and still doesn't beat it on power consumption. Am I missing something here?

 

I'll grant the VII has potential to do really well in some other specific tasks, and in those cases the performance certainly follows. On a similar note, for compute applications I did see Turing take a good step ahead of Pascal relative to gaming performance; the compute is both faster and lower power-consuming.

I still have to wonder how much difference there is between an arch shrunk to 7nm versus an arch designed for 7nm. How much difference, if any at all, I dunno, but that's the main area I'm interested in.

11 minutes ago, Blademaster91 said:

If ray tracing wasn't worth chasing, Nvidia wouldn't be investing in it, and apparently AMD wants to add ray tracing to their cards as well.

The implementation is slow because it's new, and game publishers care more about money than about implementing a feature that significantly improves visual quality. The real money is in the top 1%: in the top-tier cards and in datacenter applications.

No, the real money is in the mid-range cards that the majority buys, not the top 1% of cards.

AMD Ryzen 7 5800X | ASUS Strix X570-E | G.Skill 32GB 3600MHz CL16 | AORUS GTX 1080Ti | Samsung 850 Pro 2TB | Seagate Barracuda 8TB | Sound Blaster AE-9 MUSES

2 hours ago, RejZoR said:

There was one point in my ownership of a GTX 1080 Ti where I had so many dumb problems with it I was "this" close to selling the damn thing and buying a Vega 64. NVIDIA, stfu and sit down, you annoying bragging little shit. This is one of the reasons why I don't like NVIDIA, and especially its founder, who acts like a little kid bragging about how his dad or brother can kick anyone's ass.

 

Great, so you have the most power-efficient GPU and the fastest GPU at that. Has anyone looked at the NVIDIA Control Panel recently and said with an honest face, "yup, this is modern"? The damn thing looks the same as it did back in 2004 and it's absolute shit. Organized like a turd, all flickering and twitching, selections jerking around when you change anything, and you need the fat and useless NVIDIA "Experience" to get some extra, mostly useless functionality that doesn't even work in 99% of stuff (like that custom shading thing that I couldn't get to work in any damn game). How about you shut up and improve your garbage software instead, eh NVIDIA?

 

I was setting up an AMD system for a relative, and boy, the AMD Crimson control panel is such a sublime experience. Beautiful, fast, responsive, well arranged, and the new OSD feature is freaking amazing, as we could monitor GPU clock under heavy load, see thermals, and also the framerate without having to touch the MSI Afterburner installer. No fiddling with extra fat software; it's part of the drivers.

I've never had that problem with the Nvidia Control Panel. It could definitely use an update, although the only time I need to use it is when cleaning out drivers with DDU and reapplying the resolution and color settings. If AMD were the best, they would be constantly bragging too; they already overhype everything, then at release it's the second-best product, and they market 7nm despite higher power consumption than Nvidia's 12nm parts. Yeah, the AMD control panel looks nice, but the ability to easily tweak is almost necessary, especially undervolting if you got a card with a bad cooler. I personally don't touch apps like MSI Afterburner unless I'm benchmarking a card, since Nvidia took most of the fun out of overclocking, and it would be nice if AMD's drivers were actually optimized at release instead of buying a card and then waiting for improvements.


 

18 minutes ago, Blademaster91 said:

If ray tracing wasn't worth chasing, Nvidia wouldn't be investing in it, and apparently AMD wants to add ray tracing to their cards as well.

The implementation is slow because it's new, and game publishers care more about money than about implementing a feature that significantly improves visual quality. The real money is in the top 1%: in the top-tier cards and in datacenter applications.

When it comes to the consumer market, no. The top end cards likely produce the most profit per unit, but the vast majority of sales are at the mid-range making that the biggest money maker on the consumer side. Datacenter is a good money earner, but the consumer sector has routinely brought in more revenue and profit for Nvidia.

53 minutes ago, Master Disaster said:

The reason they'll never do this is because they're too busy spending billions chasing pipe dreams like ray tracing, which no one asked for and, based on current implementation rates in software, no one wants either.

 

The phrase "the bigger you are, the harder you fall" applies to Nvidia in this scenario. Just like Intel, they got complacent being market leader, and instead of focusing on what mattered they decided to focus on a personal dream of their CEO. Guess what, Nvidia: while you were wasting time developing RTX, AMD caught up to you and might just overtake you in the market where it really matters.

 

This is what happens when you chase the top 1% and ignore the other 99%. The top 1% might buy the most expensive cards, but the other 99% is where the real money lies.

 

Ray tracing has better uptake at the moment than DX12 did in the same timeframe after release. Five months after DX12 hit, there were two games that supported it: one was an indie title, the other was developed by Microsoft. There were no titles from professional dev studios with DX12 support that weren't associated with Microsoft until seven months after DX12 released.

 

Three third-party pro studio titles after five months is a massive level of uptake by comparison.


Turing also has some major issues though, NVIDIA. Here are just a few: the 2080 Ti, 2080, and 2070 (and even the 2060 to some extent) are like $100 more expensive than they should be. The 2060 having 6GB of VRAM and the 2080 having 8GB is a joke. The Titan RTX makes no sense whatsoever to anybody. The GTX 1660 gets beaten by Polaris in some instances, a LAST-GEN card. Ray tracing performs awfully. DLSS looks like Vaseline has been smeared on the display.

 

Other than the NVENC encoder, I can't think of anything in Turing that you could say is actually any good.

 

30% more performance for 60% more price. Anyone interested?
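That pricing math is easy to sketch. A quick illustration with made-up baseline numbers (the prices here are hypothetical, chosen only to show the arithmetic of "+30% performance, +60% price"):

```python
# Hypothetical baseline: old card at 100 "performance units" for $700.
# New card: +30% performance for +60% price.
old_perf, old_price = 100.0, 700.0
new_perf, new_price = old_perf * 1.30, old_price * 1.60

old_value = old_perf / old_price            # perf per dollar, old gen
new_value = new_perf / new_price            # perf per dollar, new gen
change_pct = (new_value / old_value - 1) * 100
print(f"perf-per-dollar change: {change_pct:+.1f}%")
```

Whatever baseline you pick, the ratio 1.30/1.60 means performance per dollar drops by roughly 19%, which is the complaint in a single number.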


So why don't they?

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


Someone's going to jail in a bit for stock manipulation with bold claims that can't be backed up. He just needs a journalist to challenge him a bit before he says stupid things he shouldn't.

3 hours ago, Humbug said:

LOL true. That would not be efficiency though.

 

Efficiency does not mean low power. It means performance per watt, getting work done for less power...

Fine, it runs on wind power from your case fan, and so draws 0W from the outlet. Will clarify.
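The performance-per-watt point above is worth making concrete. A quick sketch with made-up card specs (the fps and wattage figures are invented for illustration): a card can draw more total power and still be the more efficient one.

```python
# Made-up example: Card B draws more watts than Card A,
# yet delivers more frames per watt, i.e. it is MORE efficient.
cards = {
    "Card A": {"fps": 100.0, "watts": 180.0},
    "Card B": {"fps": 150.0, "watts": 225.0},
}

for name, spec in cards.items():
    efficiency = spec["fps"] / spec["watts"]  # frames per second per watt
    print(f"{name}: {efficiency:.3f} fps/W")
```

Here Card A manages about 0.556 fps/W while Card B manages about 0.667 fps/W, so the higher-draw card is the efficiency winner: efficiency is work done per watt, not low power draw in isolation.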

27 minutes ago, GoodBytes said:

Fine, it runs on wind power from your case fan, and so draws 0W from the outlet. Will clarify.

I'm not cashing out for an RTX card until it runs off quantum field fluctuations; dat fan power doesn't come for free, and wind power will impede cooling.


This ain't really news to me, honestly. Both Nvidia and Intel had years of barely any competition from AMD to fine-tune their products.

There's also the fact that AMD is competing on two fronts, CPUs and GPUs, while they each only have to compete on a single side of the hardware "war".

 

Funny how this sort of news from Nvidia, bashing AMD, always comes around whenever AMD is about to release a new product, eh...

CPU: AMD Ryzen 3600 / GPU: Radeon HD7970 GHz 3GB(upgrade pending) / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


I would love a 1060-like card that draws like 50 watts.

 

SFF enthusiasts unite!

Fan Comparisons          F@H          PCPartPicker         Analysis of Market Trends (Coming soon? Never? Who knows!)

Designing a mITX case. Working on aluminum prototypes.

Open for intern / part-time. Good at maths, CAD and airflow stuff. Dabbled with Python.

Please fill out this form! It helps a ton! https://linustechtips.com/main/topic/841400-the-poll-to-end-all-polls-poll/

8 hours ago, BiG StroOnZ said:

"What makes us special is we can create the most energy-efficient GPU in the world at anytime. And we should use the most affordable technology. Look at Turing. The energy efficiency is so good even compared to 'somebody else’s' 7nm."

Bring me a worthy upgrade from my GTX 1060 3GB that doesn't require more than a 6-pin.

 


mechanical keyboard switches aficionado & hi-fi audio enthusiast

switch reviews  how i lube mx-style keyboard switches

3 hours ago, leadeater said:

I still have to wonder how much difference there is between an arch shrunk to 7nm versus an arch designed for 7nm. How much difference, if any at all, I dunno, but that's the main area I'm interested in.

Considering how much cooperation it took to get 7nm parts out at all, my guess is that the differences are quite small, as they would have had to change quite a bit to get it to work. That said, Vega 20 was the first 7nm part made by AMD, so there's a good chance that had they done it later on, it would have been better, due to the experience they gained from making it.

3 hours ago, CarlBar said:

 

Ray tracing has better uptake at the moment than DX12 did in the same timeframe after release. Five months after DX12 hit, there were two games that supported it: one was an indie title, the other was developed by Microsoft. There were no titles from professional dev studios with DX12 support that weren't associated with Microsoft until seven months after DX12 released.

 

Three third-party pro studio titles after five months is a massive level of uptake by comparison.

You really can't compare the two. DX12 takes many times more work to implement; they need to change the base of the render engine for the game to take advantage of it, while ray tracing is just another effect on top of the render engine.


Now I'm really curious as to what both companies are cooking up. 

The Workhorse

R7 3700X | RTX 2070 Super | 32GB DDR4-3200 | 512GB SX8200P + 2TB 7200RPM Barracuda Compute | Windows 10 Pro

 

The Portable Station

Core i7 7700H | GTX 1060 | 8GB DDR4-2400 | 128GB SSD + 1TB HGST | Windows 10

 

Samsung Galaxy Note8 SM-N950F

Exynos 8895 ARM Mali G71 MP20 | 6GB LPDDR4 | 64GB internal + 128GB microSD | 6.3" 1440p "Infinity Display" AMOLED | Android Pie 9.0 w/ OneUI

9 hours ago, fasauceome said:

Huang knows what AMD has up their sleeves as much as we do, so these lofty claims are easy to ignore.

Yes, and thus he is afraid of that and shoots at AMD preemptively.

 

The "nVidia believers" will eat it up and go to forums, starting flame wars over it.

 

 

In other words, it means that AMD totally concentrated on Navi, as they'd seen that they were behind, and only did what they had to in order to somewhat compete.

 

And Jensen probably knows a good bit more about Navi and is shitting his pants, so he's rambling about AMD, because if AMD is good, that means less money for him.

 

But hey, once Intel brings out their GPU technology, it's pretty much over for them...

"Hell is full of good meanings, but Heaven is full of good works"

This topic is now closed to further replies.

