
Windows discrete GPU driver stability tested, but there's a catch

DrMacintosh
3 minutes ago, M.Yurizaki said:

AMD's PR campaign has basically been "let's poke fun at the other guy." If you have to poke fun at the other guy to uplift your own product, it tells me one of two things: you're either overconfident in your product (which most marketers have to be anyway), or you lack confidence that it can stand on its own without resorting to schoolboy tactics like this.

Not sure I understand this style of marketing that some of you here seem to think is a good idea. 

 

You ask a company to distinguish itself without comparing itself to those it is trying to be better than. That is not something that happens often in the free market. 

 

Comparisons will always be drawn and you have to realize and accept that those are realities of marketing. 

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

2 hours ago, leadeater said:

You mean other than the OS, its kernel, the drivers for hardware in the system, etc.? There are heaps of places for stuff to go wrong, and that's before introducing something like a hypervisor and VMs. More potential for something to go wrong CPU-wise than GPU-wise.

Software-wise ;)
Why don't we just agree that the software stack is more likely to cause problems than the hardware?

 

2 hours ago, leadeater said:

I also wouldn't say I've personally experienced more issues with workstation cards or their drivers either, but it's not like I've done any rigorous testing on that front; I just haven't observed many issues.

Yes, because you used the card as intended by the manufacturer.

You can argue that this test was not using the workstation card as intended ;)

 

In the workstation driver you don't have the crap ton of workarounds for games and other software; those drivers are:

a) specialized for the intended software I mentioned above

b) working as close to the spec as possible

 

Since all the workstation cards failed, either there was a hardware problem with one system that only showed up with the workstation cards, or the software used had a bug that only triggered on the workstation cards because the mainstream cards ignored that behaviour of the software or had a workaround in the driver.


Workstation card drivers probably don't use shader replacement and other shit to increase performance at the cost of possible inaccuracies, while that doesn't really matter (much) on gaming cards...

"Hell is full of good meanings, but Heaven is full of good works"


1 minute ago, DrMacintosh said:

Not sure I understand this style of marketing that some of you here seem to think is a good idea. 

 

You ask a company to distinguish itself without comparing itself to those it is trying to be better than. That is not something that happens often in the free market. 

 

Comparisons will always be drawn and you have to realize and accept that those are realities of marketing. 

It's fine to compare your product to the competition in an objective manner. As in, throw in performance data where you objectively do better and features you think make your product stand out from the others. But if you're trying to sell me a product and you constantly make jabs at your competition with petty insults, then that will reflect upon your company culture. And that sort of culture leaks out to its fanboys.


2 minutes ago, M.Yurizaki said:

It's fine to compare your product to the competition in an objective manner. As in, throw in performance data where you objectively do better and features you think make your product stand out from the others. But if you're trying to sell me a product and you constantly make jabs at your competition with petty insults, then that will reflect upon your company culture. And that sort of culture leaks out to its fanboys.

If their marketing works, which it does, why does it matter? 

 

Get off your high horses. 


7 minutes ago, DrMacintosh said:

If their marketing works, which it does, why does it matter? 

 

Get off your high horses. 

Because they make themselves look like idiots when they make a jab at Nvidia while they can't make a product that can compete.

 

And even if they did, marketing it like a 3-year-old loses respect for the company.


26 minutes ago, M.Yurizaki said:

AMD's PR campaign has basically been "let's poke fun at the other guy." If you have to poke fun at the other guy to uplift your own product, it tells me one of two things: you're either overconfident in your product (which most marketers have to be anyway), or you lack confidence that it can stand on its own without resorting to schoolboy tactics like this.

I actually disagree with some of this.

For example, the ad below... It's so tongue-in-cheek, so intentionally cheesy, that I find it funny. Some people don't like it or find it cringeworthy, I think because they take it too seriously... Personally I think it's good fun. And most of AMD's digs at the competition are like this: tongue-in-cheek.

 

I think that some of the people who take the GPU wars and fanboy wars too much to heart get upset by these. It's better to never take advertising at face value.


1 minute ago, mynameisjuan said:

Because they make themselves look like idiots when they make a jab at Nvidia while they can't make a product that can compete.

In the context of the products they are making jabs at, yes, they can compete. That EPYC banner is an example: Intel stands to lose market share to AMD there, and they're even willing to give AMD a good percentage. Same with the Threadripper/8086K trade-in; AMD and Intel are both competitive.

 

As for the Nvidia side of things, they really aren't actively marketing Vega at all, nor their RX line. They have some YouTube videos and some promo material at some events in some places.

 

I just realized that all this judgment of AMD has been made off of one example, with no other references to back up the claim that all AMD does is poke fun at everyone...

 


3 minutes ago, Humbug said:

I actually disagree with some of this.

For example, the ad below... It's so tongue-in-cheek, so intentionally cheesy, that I find it funny. Some people don't like it or find it cringeworthy, I think because they take it too seriously... Personally I think it's good fun. And most of AMD's digs at the competition are like this: tongue-in-cheek.

I guess different strokes for different folks.

 

But if you want to sell me a product, don't jab at the other guy like you're trying to do a mic drop. The company's public face tends to inadvertently become the face of its customers.


23 minutes ago, DrMacintosh said:

They really aren't actively marketing Vega at all, nor their RX line. They have some YouTube videos and some promo material at some events in some places.

For the recent past they've probably thought it's a waste of money. They don't think TSMC/GlobalFoundries, combined with the memory suppliers, are capable of producing AMD GPUs any faster until 7 nm... so for now, just be happy that whatever is produced is selling out; there's no point spending heavily on advertising when, even if demand increases, it can't be supplied. Better to save it for the next round...


7 hours ago, Stefan Payne said:

That doesn't change the fact that it is a hack and violates the official specification of DX11.

And the "overhead" is NOT lower, its just better threadded so that it runs on more than one thread. That is all.

 

And the last time I saw dual-core tests from some sites, it looked abysmal for nVidia; they claimed it was a bug...

Or was it rather a side effect of these optimizations?! That it does NOT use _LESS_ CPU power but more, and just hides it better on 4-thread CPUs??

What?

You just described why the overhead is lower.

 

Conroe and Penryn had quad cores as well. When was that?

However it was done, AMD's GPUs did not run as well in CPU-bound situations as Nvidia's.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


4 hours ago, Jito463 said:

 

Except if your competition can outspend you, then you're in an uphill battle to prove your performance is better.  Hopefully that will improve now that AMD is no longer bleeding money thanks to Ryzen.

 

Not sure I follow. It doesn't matter how much money Nvidia or Intel have; the benchmarks either show AMD performs better or they don't.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


8 hours ago, Stefan Payne said:

That doesn't change the fact that it is a hack and violates the official specification of DX11.

And the "overhead" is NOT lower, its just better threadded so that it runs on more than one thread. That is all.

What you're describing is called deferred contexts. It is an actual part of the DX11 specification (https://docs.microsoft.com/en-us/windows-hardware/drivers/display/introduction-to-deferred-contexts)

Quote

And the last time I saw dual-core tests from some sites, it looked abysmal for nVidia; they claimed it was a bug...

I'd like to see these.

 

Quote

Or was it rather a side effect of these optimizations?! That it does NOT use _LESS_ CPU power but more, and just hides it better on 4-thread CPUs??

If we're talking about graphics performance, then as long as the GPU can keep rendering frames, the CPU usage should increase because it's going to send more render commands to the GPU.
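To make the deferred-context point concrete, here is a minimal sketch of the pattern the DX11 docs describe, in rough C++. It's an illustration only: the function names are mine, and device creation, resource setup and error handling are all omitted.

#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Worker thread: record draw calls on a deferred context, then bake them
// into a command list. Nothing here has to run on the main rendering thread.
ComPtr<ID3D11CommandList> RecordFrameChunk(ID3D11Device* device)
{
    ComPtr<ID3D11DeviceContext> deferred;
    device->CreateDeferredContext(0, &deferred);          // 0 = no flags

    // ... set state and issue draws on the deferred context exactly as you
    // would on the immediate context (IASetVertexBuffers, Draw, and so on) ...

    ComPtr<ID3D11CommandList> commandList;
    deferred->FinishCommandList(FALSE, &commandList);     // FALSE = don't restore state
    return commandList;
}

// Render thread: only the final submission happens here.
void Submit(ID3D11DeviceContext* immediate, ID3D11CommandList* commandList)
{
    immediate->ExecuteCommandList(commandList, FALSE);
}

The recording work can be spread across however many threads you like; only ExecuteCommandList has to be serialized on the immediate context, which is exactly the kind of multithreaded submission being argued about above.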

Edited by M.Yurizaki
Better link to deferred contexts

I've had more problems with Nvidia cards than Radeon cards, so this makes complete sense to me.

 

My entire time using Radeon cards was pretty much stellar (even when running a CrossFire setup years back), but I've had weird driver crashes, BSODs, and the like with Nvidia cards. I even thought my GTX 1070 was shitting the bed because one of the drivers released in the last year caused severe artifacting in Doom when using Vulkan. I tested it on a GTX 770, a 1060, and a 1070, and the artifacting showed on all of them, which made me realize it was the driver and not the card.

 

TL;DR: the days of "lol amd drivers" are loooong gone.

Current Build:

CPU: Ryzen 7 5800X3D

GPU: RTX 3080 Ti FE

RAM: 32GB G.Skill Trident Z CL16 3200 MHz

Mobo: Asus Tuf X570 Plus Wifi

CPU Cooler: NZXT Kraken X53

PSU: EVGA G6 Supernova 850

Case: NZXT S340 Elite

 

Current Laptop:

Model: Asus ROG Zephyrus G14

CPU: Ryzen 9 5900HS

GPU: RTX 3060

RAM: 16GB @3200 MHz

 

Old PC:

CPU: Intel i7 8700K @4.9 GHz/1.315v

RAM: 32GB G.Skill Trident Z CL16 3200 MHz

Mobo: Asus Prime Z370-A


6 minutes ago, Emberstone said:

I've had more problems with Nvidia cards than Radeon cards, so this makes complete sense to me.

 

My entire time using Radeon cards was pretty much stellar (even when running a CrossFire setup years back), but I've had weird driver crashes, BSODs, and the like with Nvidia cards. I even thought my GTX 1070 was shitting the bed because one of the drivers released in the last year caused severe artifacting in Doom when using Vulkan. I tested it on a GTX 770, a 1060, and a 1070, and the artifacting showed on all of them, which made me realize it was the driver and not the card.

 

TL;DR: the days of "lol amd drivers" are loooong gone.

And to me it doesn't make sense, because I've had almost zero issues with NVIDIA cards for the past 10 years. However, I've also had almost zero issues with the ATI/AMD cards I've used in that same time frame.

 

But I'm the kind of person who tends to be leery of system software like drivers. I don't update my drivers the moment a new release comes out, because updating drivers offers zero improvement unless explicitly stated in the release notes. I also don't install more than I need to. The fewer things running on my system, the lower the chances that something will break.


4 hours ago, Humbug said:

I think that some of the people who take the GPU wars and fanboy wars too much to heart get upset by these. It's better to never take advertising at face value.

The issue with comparative marketing is that, to be successful, a company has to be able to point to an actual advantage over the competitor it is comparing itself to. So far, with many of AMD's shit digs against Nvidia, they have not actually pointed to such advantages. When a company claims its competitor's products are not as good as theirs and then presents a steaming pile of dog shit to the consumer, the consumer remembers that, and the funny marketing campaign turns into a sour-grapes schoolyard fight. The reality is you don't have to be a fanboy to dislike this style of marketing; like many forms of marketing, there are always going to be people it simply rubs the wrong way, and it's worse for comparative marketing because it has the potential to backfire if not executed perfectly.

 

I personally think this is the issue with AMD: they have been using this marketing unsuccessfully for a very long time now. I remember Richard Huddy used to engage in it during interviews. It never worked for them. In fact, I would argue it had the opposite effect, because up until Ryzen (AMD's first real performing product since god knows when), AMD's sales did not change, and this was during the time when Intel were up to their eyeballs in antitrust guilt, Nvidia had the wrath of the internet over the 970, and everyone thought DX12 was going to be outmoded because of Mantle. AMD literally had all of its competitors facing major PR issues, and their shit-dig marketing tactics resulted in nothing.



9 hours ago, DrMacintosh said:

You cannot do that’s without comparative advertising. Compatible advertising is the life blood of capitalism. 

You absolutely can.

 

Basically all this ad is saying is

"Intel sucks, AMD is better, if you want to find out why, google search it, we're not going to tell you anything."

 

They could easily have marketed their performance figures and processor specs. But no, they felt the need to take jabs, because they already know that people are rooting for the underdog and that brand bias will carry them through without them even having to market anything.

🌲🌲🌲

 

 

 

◒ ◒ 


5 hours ago, Humbug said:

I think that some of the people who take the GPU wars and fanboy wars too much to heart get upset by these. It's better to never take advertising at face value.

I apparently missed this bit, but to add to what @mr moose said, people inevitably take this advertising style to heart and it starts becoming nothing more than preaching to the choir.

 

Like Apple's ad campaign during the mid-to-late 2000s. People actually felt sorry for John Hodgman's character and perceived the Apple side as an arrogant hipster jerkwad, applying that perception to any Apple user. That is, Justin Long's caricature ended up being the stereotype of Apple users as far as non-Apple users were concerned. And at the same time on the interwebs, it seemed like Apple users were reveling in what the company was saying, regardless of how true the ad campaign actually was (but this was like 10 years ago, so my memory's hazy).

 

And Apple stopped doing that. I wonder why.

Edited by M.Yurizaki

 

Quote

BUT: AMD hasn't killed their cards with a driver (yet), because they implemented a temperature-controlled hardware solution for the fan controller. At least the older ones like Tahiti can't overheat. If they get too hot, the fan goes to 100% until it's under the threshold (something around 90°C or higher), while nVidia implemented a software solution, the first of which had only two states: 2D and 3D...

I very clearly remember a Crimson driver from a couple of years ago killing GPUs because it locked the fan speed at 20%.
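For anyone following along, the "temperature-controlled hardware solution" described in the quote boils down to a simple override, sketched below in C++ for illustration only; the trip and recovery temperatures are assumptions, not any vendor's actual firmware values.

#include <cstdint>

// Firmware-style thermal failsafe: once the GPU crosses the trip point, pin the
// fan at 100% regardless of what the driver requested, and only hand control
// back after it has cooled below a recovery point (hysteresis).
constexpr int kTripPointC    = 90;   // assumed trip temperature
constexpr int kRecoverPointC = 85;   // assumed recovery temperature

uint8_t ApplyFailsafe(int gpuTempC, uint8_t driverRequestedDuty, bool& failsafeActive)
{
    if (gpuTempC >= kTripPointC)
        failsafeActive = true;
    else if (gpuTempC <= kRecoverPointC)
        failsafeActive = false;

    // While the failsafe is active, the driver's requested fan duty is ignored.
    return failsafeActive ? 100 : driverRequestedDuty;
}

The argument for putting this in hardware or firmware rather than software is that it keeps working even when the driver misbehaves (for example, a driver locking the fan at 20%), which is the distinction being drawn against a purely software-controlled fan.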

 

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


TBH AMD drivers have always been better for me 


1 hour ago, M.Yurizaki said:


@mr moose

 

And Apple stopped doing that. I wonder why.

Apple's marketing team and product strategy have been consistently labelled the best in the industry since 2008 (maybe even a bit before). It's no surprise they're quick to realise when a marketing strategy is no longer working or starting to turn.

 

 



1 hour ago, M.Yurizaki said:

I apparently missed this bit, but to add to what @mr moose said, people inevitably take this advertising style to heart and it starts becoming nothing more than preaching to the choir.

 

 

The other thing I forgot to mention was that this comes back to what I said earlier about its effectiveness. Apple were running the PC-versus-Mac advertising during the time of Vista (which, while not technically a bad OS, did have serious teething troubles for other reasons), yet Apple did not see a gain in market share. I put it that the biggest reason for this is that people in the tech industry have long memories: they remember MS digging Apple out of a hole in the '90s, and then they see Apple shit-ripping MS at a time when Mac OS just wasn't a viable option for an industry already saturated with MS software (servers and Office). No point in pontificating about the awesomeness of being more secure and stable if your product can't be used with a company's existing hardware and software.



8 hours ago, mr moose said:

Not sure I follow. It doesn't matter how much money Nvidia or Intel have; the benchmarks either show AMD performs better or they don't.

As an example, I give you the Athlon 64 vs. the Pentium 4. Admittedly that involved Intel using dirty tricks, but it's just one example of AMD having the objectively superior product, yet failing to gain market and mind share because they lacked the funds to counter Intel's message. Money can make a significant difference when you're trying to attract new customers (or when trying to draw old customers back).


On 16/07/2018 at 9:36 PM, samcool55 said:

After taking a quick look at the results, it's clear the pro-grade cards (Quadro and Radeon Pro cards) failed more often than the GeForce and RX cards. I would have expected it to be the other way round, but nope.

 

I guess if someone complains about the results they can put their money where their mouth is and do it themselves. Everything is documented so you can do it yourself if you want to.

Yeah, that does seem strange, when workstation branding tends to mean better hardware support and/or more stable drivers.

                     ¸„»°'´¸„»°'´ Vorticalbox `'°«„¸`'°«„¸
`'°«„¸¸„»°'´¸„»°'´`'°«„¸Scientia Potentia est  ¸„»°'´`'°«„¸`'°«„¸¸„»°'´


30 minutes ago, Jito463 said:

As an example, I give you the Athlon 64 vs. the Pentium 4. Admittedly that involved Intel using dirty tricks, but it's just one example of AMD having the objectively superior product, yet failing to gain market and mind share because they lacked the funds to counter Intel's message. Money can make a significant difference when you're trying to attract new customers (or when trying to draw old customers back).

I thought the Athlons enjoyed over 20% market share back in their day. I know it was the only processor recommended, except when Intel fanboys were in the threads.


