Windows discrete GPU driver stability tested, but there's a catch

Humbug
5 minutes ago, DrMacintosh said:

AMD's PR has been pretty cool lately


 

Look at that Intel jab with the "more advanced security features"

AMD have been using this technique in PR and marketing for a long time, and it hasn't exactly gotten them far. I think that's most likely because, while we find it funny, consumers don't generally see it as evidence that their products are any better than they were. By cool PR I mean PR that makes them look like they ARE the best, not trying to dethrone the best with a humorous punchline; and that, I'm afraid, has to be backed up with performance figures.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


6 minutes ago, DrMacintosh said:

AMD's PR has been pretty cool lately

Look at that Intel jab with the "more advanced security features"

I despise comparative advertising, regardless of who does it, because it's what politicians do: "I'm not going to tell you what I'm doing, I'm just going to tell you that the competition is worse."

Instead of taking jabs, advertise what makes your product better and what your product does, not just that it's better than the competition.



1 minute ago, Arika S said:

I despise comparative advertising, regardless of who does it, because it's what politicians do: "I'm not going to tell you what I'm doing, I'm just going to tell you that the competition is worse."

Instead of taking jabs, advertise what makes your product better and what your product does, not just that it's better than the competition.

Not only that, but given their history of GPU performance, it's just embarrassing when they do it while all the while offering an underperforming alternative. When Epyc sells, I can guarantee it will be because it performs well as a product, not because of these dig-at-the-competition marketing techniques.



-> So AMD drivers are 400% more stable? ;p
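For anyone checking the quip against the report's numbers quoted later in this thread (356/436 passes for Nvidia vs. 416/436 for AMD), the arithmetic does land on 4x the failures. A quick sanity check (the counts are the report's; the script is just illustrative):

```python
# Pass counts from the commissioned stability report, as quoted in this thread.
TOTAL = 436
nvidia_passed = 356
amd_passed = 416

nvidia_failed = TOTAL - nvidia_passed   # 80 failed runs
amd_failed = TOTAL - amd_passed         # 20 failed runs

# Failure rates for each vendor
nvidia_rate = nvidia_failed / TOTAL     # ~18.3%
amd_rate = amd_failed / TOTAL           # ~4.6%

# Nvidia saw 4x as many failures, hence the "400%" joke.
print(nvidia_failed / amd_failed)              # 4.0
print(f"{nvidia_rate:.1%} vs {amd_rate:.1%}")  # 18.3% vs 4.6%
```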

CPU: Intel i7 3970X @ 4.7 GHz  (custom loop)   RAM: Kingston 1866 MHz 32GB DDR3   GPU(s): 2x Gigabyte R9 290OC (custom loop)   Motherboard: Asus P9X79   

Case: Fractal Design R3    Cooling loop:  360 mm + 480 mm + 1080 mm,  triple D5 Vario pump   Storage: 500 GB + 240 GB + 120 GB SSD,  Seagate 4 TB HDD

PSU: Corsair AX860i   Display(s): Asus PB278Q,  Asus VE247H   Input: QPad 5K,  Logitech G710+    Sound: uDAC3 + Philips Fidelio x2

HWBot: http://hwbot.org/user/tame/


1 hour ago, mr moose said:

You can't kill an internet narrative with a half-arsed study into drivers. I would have thought AMD would have learnt by now: the only thing that will kill a bad reputation on the internet is better performance and cooler PR.

And less CPU overhead as well.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


13 hours ago, Humbug said:

There was a time where Nvidia had objectively better graphics drivers than the competition.

And when was that?!
The problem with this claim is that you don't know whether it was the driver or the application, and you're mixing them up.
Because if you're "the first", everyone takes care of you and your quirks, while "the other ones" have to do more work for the same shit...

 

Sorry, but that is just propaganda and not the whole story.

 

13 hours ago, Humbug said:

ATI and later AMD developed a reputation of being second best in that regard at the time.

Yeah, in the '90s...

And some people still claimed that even when it wasn't really true, and there were even tests from (German) game magazines that showed the opposite...

 

Let me put it this way:
try to run the original Windows release of Final Fantasy VII or VIII for Windows 95 on an nVidia card of the time.

You can't, because you need a patch for that first...

while both run fine on 3dfx and Matrox hardware.

 

13 hours ago, Humbug said:

While it is generally accepted in the enthusiast community that those days are long gone (particularly since the formation of RTG)

No, R300...

And even further back...

 

13 hours ago, Humbug said:

the stigma of it still affects AMD today when it comes to buyer decisions.

Yes, because some people need arguments to justify their spending.

And when nothing else helps, you drag the skeletons out of the closet...

 

That there were days when "the other one" was no better? Well, let's ignore that.

Because there was a driver release for the Riva 128 where Wing Commander: Prophecy crashed as soon as one of the big ships came on screen...

Or the above-mentioned original Windows release of Final Fantasy VII and VIII...

 

13 hours ago, Humbug said:

As can be expected, it takes time for the general public, who are less informed, to catch up to the current state of affairs. It's something that requires more marketing from AMD, which is difficult given the brand strength of GeForce.

It's not the general public who is at fault, it's the fans of the company, who try to propagate it for as long as possible to convince other people that it must be true, because there was a time when the graphics chips were called Rage and were aimed more at OEMs (ATi was the biggest OEM manufacturer back in the day, partly because their chips ran cooler than the competition's). The screeching fan on the original Radeon was bullshit, for example; you could easily cool it with a passive heatsink like those on the GF2 MX...

 

In the end, both are equally shitty.

BUT: AMD hasn't killed their cards with a driver (yet), because they implemented a temperature-controlled hardware solution for the fan controller. At least the older ones like Tahiti can't overheat: if they get too hot, the fan goes to 100% until the card is back under the threshold (somewhere around 90°C or higher). nVidia, meanwhile, implemented a software solution; the first one had only two states, 2D and 3D...

...and it was possible to kill the card with the driver, which actually happened. There are articles about one driver that had a problem with some Fermi cards and let them overheat until they died...
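The fail-safe described above is essentially a threshold override on top of the normal fan curve. A toy sketch (the 90°C trip point is the rough figure mentioned above; the function and values are illustrative, not AMD's actual firmware logic):

```python
# Toy model of a temperature-triggered fan fail-safe, as described above:
# above the threshold the fan is forced to 100% regardless of the normal curve.
THRESHOLD_C = 90.0  # illustrative; the real trip point is firmware-defined

def fan_duty(temp_c: float, normal_duty: float) -> float:
    """Return fan duty cycle (0.0-1.0). The fail-safe overrides the normal curve."""
    if temp_c >= THRESHOLD_C:
        return 1.0          # fail-safe: full speed until back under the threshold
    return normal_duty      # otherwise follow the regular fan curve

print(fan_duty(75.0, 0.4))  # 0.4 - normal operation
print(fan_duty(94.0, 0.4))  # 1.0 - fail-safe engaged
```

The point of doing this in hardware rather than in the driver is exactly the one made above: a crashed or buggy driver can't prevent the override from firing.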

13 hours ago, Humbug said:

After 12 days Nvidia GPUs passed 356 out of 436 without crashes or hangs. AMD passed 416 out of 436.

Great result.

And hopefully we can now stop with these driver "feeling" discussions, can we?

Because that's all it is: a feeling.

 

And as I said earlier, if you are the leader, everyone takes care of you. If it crashes, THEY fix it; you can keep your bugs.

One example was TressFX in the first Tomb Raider reboot back in 2013, which led to crashes on nVidia cards...

 

13 hours ago, Humbug said:

Now for the catch. This testing was commissioned by AMD.

Reminds me of the good old days of Tom's Hardware, where they did the AMD vs. Intel live stability test. The Intel board died with the Pentium D CPU, while the AMD rig ran fine until the end without any issue, lockup, freeze or other shit...

 

 

13 hours ago, Humbug said:

Does that mean that the results are false? Probably not, looking at the source: a legitimate large company that does QA testing for HP, Microsoft, BMW, Apple, AT&T, etc. They are not going to throw their reputation down the drain by publishing a false report.

And that is the most likely explanation.

They did an external evaluation of their products to see how good they really are.

It just happened that they came out this much ahead, which led to the release of this document...

But it's not the first time AMD (or ATi, at the time) has come out on top in the driver department.

Back in the day, I believe it was GameStar that ran a big test of old games with various graphics cards, and ATi came out on top there as well.

I don't remember if it was the Radeon 8500 or 9700, but I believe it was the early 2000s, so probably more the era of the 8500 than the 9700...

 

13 hours ago, Humbug said:

But what it does mean, IMO, is that if Nvidia had come out on top, these results would have remained an internal AMD report and would never have been made public. AMD only published this because the results are favorable; it's possible that there are unfavorable research papers from the past, commissioned by both AMD and Nvidia, that they keep secret because they don't fit their narrative. But this time AMD is turning it into a marketing push.

Yeah, but we already knew that neither of them was that bad, and especially in recent years nVidia has had a couple of big problems with their drivers, while AMD only had the usual smaller things like texture flickering and slowdowns...

Given the results, you might argue that AMD puts a higher priority on stability, while nVidia puts a higher priority on performance...

 

13 hours ago, Humbug said:

Furthermore, according to the report, although GPUs were swapped between test systems to prevent bias, there was no variety of test systems: all were Intel Coffee Lake systems on MSI socket 1151 motherboards. Therefore I don't think this can be taken as definitive scientific proof that AMD is more stable than Nvidia across a variety of hardware. It will remain, however, a useful point of information.

...and here we are again, trying to make this not look too positive for AMD, are we?!
It's not about the system, it's about the graphics chips!

As long as the test system is the same every time, that is enough. You have to use the same system, because a different system might cause problems for various reasons, like slightly worse RAM in one machine than the other.

So yes, this is a scientific test, precisely because they didn't want the test system as a variable; that's not what this was about. And to be blunt: the test system is irrelevant, as long as there is no bias there (which there isn't!) and it's a somewhat modern machine...

It might well be that the contractor just happened to have those systems, and that is why they used them.

Or rather: it's possible that they used what they had instead of going out and buying new hardware for this test!

That would be the logical conclusion.

 

13 hours ago, Humbug said:

Another thing to observe: it seems that workstation GPUs aren't any more stable than gaming GPUs, regardless of IHV (Nvidia or AMD).

Of course they aren't!
It's the same chip!

The difference is the level of service and the driver certification.

And that is what you should expect from workstation-class hardware: workstation-class, premium service from the company and certified drivers for the card...

Oh, and of course the full performance, as some companies tend to deactivate features on normal cards, which makes workstation applications slower on them.

"Hell is full of good meanings, but Heaven is full of good works"


23 minutes ago, Dabombinable said:

And less CPU overhead as well.

And how do you know that there is less overhead??

Or are you talking about those hacks that push parts of the D3D task onto another core/thread??

Which, by the way, might be the main cause of the result of this test...

Because that is a hack, and it will cause instability!

"Hell is full of good meanings, but Heaven is full of good works"


13 hours ago, Froody129 said:

I've had plenty of issues with Nvidia drivers and I haven't even been that into PCs for very long. The drivers feel very bloated right now. 

Same here.

Whenever I get a (new) nVidia gaming card, I run into issues.

Especially in the olden days, nVidia hardware was just the worst garbage you could find if you wanted to do anything other than "3D stuff".

With the Riva, TNT and GeForce up to the 4, you couldn't even adjust the size of the TV output; you had to buy a third-party tool for that...

And only a handful of chips were supported: the legendary BT868 that almost everyone used, a Chrontel chip, and on later models (I only remember GeForce 4 Ti cards with it, maybe the 3?) the Philips 7104 and 7108 (ViVo).

It's just unusable, as you lose up to 50% of the screen size...

And the same was the case with the GeForce GTX 570 I got a while ago: started MPC-HC with madVR and got an error message...

13 hours ago, Froody129 said:

I've actually never personally used an AMD card but I think we are at the point where both sides are going to provide an excellent experience. People underestimate how far we've come since CLI OS drivers :P 

Exactly...

You can say that both are equally good, or that both are equally shit. That has been true for a long time, but there are still people out there who want a reason to buy only one of them and not even look at the other.

That is the point.

PS: this is written on an AMD A10-7850K with a GT710.

"Hell is full of good meanings, but Heaven is full of good works"


28 minutes ago, Stefan Payne said:

And how do you know that there is less overhead??

Or are you talking about those hacks that push parts of the D3D task onto another core/thread??

Which, by the way, might be the main cause of the result of this test...

Because that is a hack, and it will cause instability!

The performance tanks sooner when the CPU is the bottleneck. That's why an Nvidia card was the best option when looking for something to run with anything Bulldozer, Conroe or Penryn based.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


15 hours ago, Taf the Ghost said:

It should be noted that the Driver issues flipped around the time it went from ATI to AMD graphics. Nvidia has had a lot of driver issues since the late 2000s.

Nah, it was way earlier than that. It began with the original Radeon, when ATi started investing more in the driver.

The "shitty driver" legend is something from the Rage days, when they mostly did OEM stuff and made their money there...

 

14 hours ago, Arika S said:

I've never had any driver issues on any GPU I've had. Is it a big enough issue that AMD felt the need to fund this kind of test? It does raise a few questions though:

  • What was AMD hoping to get out of this?
  • Was Nvidia aware of the upcoming test, so they could optimize their drivers the way AMD might have had time to do?
  • Because I don't feel like reading the full report: do they go into detail about what kind of workload was running at the time of each failure? E.g. a workstation load may affect gaming cards differently.
  • How is AMD going to prove that this is an unbiased test when it makes them look better than the competition and they funded it (like when people got up in arms about the AMD flaws they thought were an Intel-funded smear campaign)?

Maybe it's something they do at regular intervals to evaluate their cards?
And sometimes you pay someone else to do that...

  • So what was AMD hoping to get out of this?
    Information about the stability of their cards in comparison to the competition.
  • Why is that important, and why should nVidia care? So that they could make a "Ready for AMD Testing" driver, like they always do??
  • The workload was CRASH from Microsoft's Windows Hardware Lab Kit.
  • How are you going to prove that it is a biased test, when there is no reason to suspect that rather than it simply being a good result?!

As for the test, this looks like an internal evaluation of products by an external, third-party entity, to see what happens and where AMD has problems.

It just happened that AMD came out with a +10.4-point lead in this test. It's not like they made the nVidia cards look bad intentionally...

 

As for the Drivers:

Quote

 

The latest certified/recommended graphics drivers as of mid-May 2018 were used for both vendors.

For AMD that was Adrenalin 18.5.1 and Radeon Pro Enterprise Edition 18.Q2;

for nVidia it was GeForce® Game Ready Driver 397.64 and Quadro® Desktop 391.58.

 

And that:

https://blogs.msdn.microsoft.com/windows_hardware_certification/2018/04/27/accepting-windows-10-version-1803-submissions/

 

The hardware looks like a standard prebuilt PC with somewhat OK-ish components...

 

12 hours ago, leadeater said:

Why is it that all the professional cards from both AMD and Nvidia had far more issues? Aren't we paying more for fewer of those????

NO, you pay for the software, not the hardware. And probably for the support as well, depending on where/how you got your card...

You pay for the guarantee that the card works with the certified software. That's a big chunk of the cost.

And those WS cards also have fewer features deactivated than the desktop cards.

In some instances it's FP64 throughput, where workstation cards go up to 1/2 of the FP32 rate, while desktop cards mostly stay at 1/16 or less...

though there are exceptions to that.
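To make those FP64 ratios concrete: with a hypothetical card rated at 10 TFLOPS FP32 (a made-up number, purely for illustration), the 1/2 vs. 1/16 split works out like this:

```python
# Hypothetical FP32 rating; real cards vary. The ratios are the ones
# discussed above: 1:2 on workstation parts, 1:16 on most desktop parts.
fp32_tflops = 10.0

workstation_fp64 = fp32_tflops * (1 / 2)   # 5.0 TFLOPS FP64
desktop_fp64 = fp32_tflops * (1 / 16)      # 0.625 TFLOPS FP64

print(workstation_fp64 / desktop_fp64)     # 8.0x FP64 advantage
```

So the same silicon can end up with an 8x FP64 gap purely from how the rate is capped for each market segment.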

"Hell is full of good meanings, but Heaven is full of good works"


1 hour ago, Dabombinable said:

The performance tanks sooner when the CPU is the bottleneck. That's why an Nvidia card was the best option when looking for something to run with anything Bulldozer, Conroe or Penryn based.

That doesn't change the fact that it is a hack and violates the official DX11 specification.

And the "overhead" is NOT lower; it's just better threaded, so that it runs on more than one thread. That is all.

And the last time I saw dual-core tests from some sites, it looked abysmal for nVidia; they claimed it was a bug...

Or was it rather a side effect of these optimizations? That it does NOT use LESS CPU power but more, and just hides it better on 4-thread CPUs??
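For what it's worth, the "better threaded, not less work" point can be pictured abstractly: submission work moves off the calling thread onto a worker, so the calling thread is freed up sooner even though the total CPU work is unchanged. A toy Python model (all names are invented; real drivers do nothing this simple):

```python
import queue
import threading

# Toy model: the "main thread" enqueues draw commands instead of executing
# them inline; a worker thread drains the queue. Total work is the same,
# but the main thread only pays the cheap enqueue cost.
commands: queue.Queue = queue.Queue()
submitted = []

def driver_worker():
    while True:
        cmd = commands.get()
        if cmd is None:          # sentinel: shut down the worker
            break
        submitted.append(cmd)    # stand-in for the real submission cost

worker = threading.Thread(target=driver_worker)
worker.start()

for i in range(5):               # main thread: cheap enqueues only
    commands.put(f"draw_{i}")
commands.put(None)
worker.join()

print(submitted)                 # all five draws ran on the worker, in order
```

Whether that counts as "less overhead" or just "overhead hidden on another core" is exactly the disagreement in this exchange.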

"Hell is full of good meanings, but Heaven is full of good works"


24 minutes ago, Stefan Payne said:

NO, you pay for the software, not the hardware. And probably for the support as well, depending on where/how you got your card...

You pay for the guarantee that the card works with the certified software. That's a big chunk of the cost.

And those WS cards also have fewer features deactivated than the desktop cards.

Workstation cards are bought specifically because they are supposed to produce fewer errors and system problems, and because they are designed for 24/7 use. It's hardware and software, not just software. There is a reason they feature ECC memory; you know what that stands for, right?

There is no reason workstation cards should produce more issues in standardized testing that repeats the exact same thing, none at all. That is one of the very reasons such cards exist, not just the certified drivers for professional applications... unless professionals like more errors and issues. Not.


2 minutes ago, leadeater said:

Workstation cards are bought specifically because they are supposed to produce fewer errors and system problems, and because they are designed for 24/7 use. It's hardware and software, not just software. There is a reason they feature ECC memory; you know what that stands for, right?

You mean that thing that came after parity, which can detect 2-bit errors and correct 1-bit errors? That one?
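Since ECC came up: the behaviour described (correct any single-bit error, detect any double-bit error) is usually called SECDED and is typically built from an extended Hamming code. A toy Python sketch of the idea using an (8,4) extended Hamming code; this is illustrative only, not how GPU memory controllers actually implement it:

```python
def secded_encode(data4):
    """Encode 4 data bits into an 8-bit extended Hamming codeword (SECDED)."""
    c = [0] * 8                  # positions 1..7 form a Hamming(7,4) codeword
    c[3], c[5], c[6], c[7] = data4
    c[1] = c[3] ^ c[5] ^ c[7]    # parity over positions with bit 0 set
    c[2] = c[3] ^ c[6] ^ c[7]    # parity over positions with bit 1 set
    c[4] = c[5] ^ c[6] ^ c[7]    # parity over positions with bit 2 set
    overall = 0                  # extra parity bit over the whole codeword
    for bit in c[1:]:
        overall ^= bit
    return c[1:] + [overall]     # 7 code bits + overall parity = 8 bits

def secded_decode(word8):
    """Return (status, data4). status: 'ok', 'corrected', or 'double_error'."""
    c = [0] + list(word8[:7])    # re-index as positions 1..7
    syndrome = 0
    for pos in range(1, 8):
        if c[pos]:
            syndrome ^= pos      # XOR of set positions = error location
    parity = word8[7]
    for bit in c[1:]:
        parity ^= bit            # 0 if overall parity still holds
    if syndrome == 0 and parity == 0:
        status = "ok"
    elif parity == 1:            # odd number of flips: a single error, fixable
        if syndrome:
            c[syndrome] ^= 1     # (if syndrome==0, the parity bit itself flipped)
        status = "corrected"
    else:                        # syndrome set but parity even: two errors
        return "double_error", None
    return status, [c[3], c[5], c[6], c[7]]

word = secded_encode([1, 0, 1, 1])
word[4] ^= 1                     # flip one bit "in memory"
print(secded_decode(word))       # ('corrected', [1, 0, 1, 1])
```

Flip two bits instead and the decoder reports `double_error` rather than silently returning bad data, which is the whole sales pitch of ECC memory.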

And workstation cards are designed for specific tasks and have highly specialized drivers, while "gaming" cards have general drivers.

With workstation cards you know it will be one of a handful of applications you might throw at them, while you don't/can't know that with "gaming" cards.

 

But there is also Warranty:

Quote

Retail versions of Radeon™ Pro WX Series graphics cards are covered by our 3-year limited warranty and optional 7-year limited extended warranty.

3 years of warranty, plus an optional 4 additional years.

 

 

2 minutes ago, leadeater said:

There is no reason workstation cards should produce more issues in standardized testing that repeats the exact same thing, none at all. That is one of the very reasons such cards exist, not just the certified drivers for professional applications... unless professionals like more errors and issues.

Well, in theory, yes. But how often do you use your workstation card with software that is not Autodesk 3ds Max, Maya, Blender, SolidWorks or other professional rendering software?

In the olden days, when they used OpenGL, the drivers were certified by the application companies.

And that is what those cards are made for.

And it's the reason the AMD Vega Frontier Edition has a "Game Mode" and a "Radeon Pro Mode":

https://marketrealist.com/2017/06/a-look-into-amds-new-vega-frontier-edition-gpu

It might have been the wrong environment/test for those cards, I'm not sure.

But it happened equally on both sides...

"Hell is full of good meanings, but Heaven is full of good works"


47 minutes ago, iamdarkyoshi said:

Considering I have daily issues with my 1080Ti, I can't say I'm surprised... 

What are your issues with the 1080ti?


31 minutes ago, Stefan Payne said:

You mean that thing that came after parity, which can detect 2-bit errors and correct 1-bit errors? That one?

And workstation cards are designed for specific tasks and have highly specialized drivers, while "gaming" cards have general drivers.

With workstation cards you know it will be one of a handful of applications you might throw at them, while you don't/can't know that with "gaming" cards.

But there is also warranty:

3 years of warranty, plus an optional 4 additional years.

Well, in theory, yes. But how often do you use your workstation card with software that is not Autodesk 3ds Max, Maya, Blender, SolidWorks or other professional rendering software?

In the olden days, when they used OpenGL, the drivers were certified by the application companies.

And that is what those cards are made for.

And it's the reason the AMD Vega Frontier Edition has a "Game Mode" and a "Radeon Pro Mode":

https://marketrealist.com/2017/06/a-look-into-amds-new-vega-frontier-edition-gpu

It might have been the wrong environment/test for those cards, I'm not sure.

But it happened equally on both sides...

They are the same GPU architecture, fabricated in the same way, validated to a workstation specification, and fitted with extra features like ECC memory to prevent errors and system instability. It doesn't matter that gaming graphics cards target a more general use case; professional cards are the same thing. They can run everything a gaming graphics card can while supposedly being more stable and less error-prone, while also supporting extra driver-level features that professional applications use (which are for the most part legacy OpenGL extensions that are deprecated).

The literal whole point of a professional graphics card is its ability to run the same thing countless times successfully, never producing an error. If it can do it once, it should do it a million more times in succession with no errors.

This is an extremely simple concept: you don't buy professional-grade tools so they can go blunt quicker or break sooner, otherwise we'd call them amateur tools, not professional, and the same applies to professional graphics cards. Fewer errors, not more; better stability, not less.

You don't have to explain to me what professional cards are. We have servers full of them for VDI, and high-end workstations with Quadros and Titan Vs for academic research: all critical stuff where downtime and errors are simply not acceptable, and where literally anything could be run on them, from just playing back video, to AutoCAD, to some random researcher's CUDA code they wrote 5 minutes ago and want to do a test run of.

Professional cards having more problems is fundamentally counter to the principle of their existence, which is why I pointed it out; I wasn't the only person to do so, either. What you're saying is that we should expect more issues with servers that have Xeons, ECC RDIMMs, RAID cards, 25GbE, HBAs, NVDIMMs and a server OS or hypervisor running multiple different systems of all types... nope, that's not a thing either.


Having used both AMD and Nvidia GPUs in the last 2 years... I had more problems with my PC crashing after I swapped to Nvidia than when I had AMD, though in the past it was quite the opposite. AMD has improved their drivers a lot recently, and I was surprised by how well my RX 480 and 470 performed. I was quite annoyed with Nvidia because I would either get a BSOD after an update or see weird artifacting in some of my games, namely GTA V at the time.

CPU: Intel i7 7700K | GPU: ROG Strix GTX 1080Ti | PSU: Seasonic X-1250 (faulty) | Memory: Corsair Vengeance RGB 3200Mhz 16GB | OS Drive: Western Digital Black NVMe 250GB | Game Drive(s): Samsung 970 Evo 500GB, Hitachi 7K3000 3TB 3.5" | Motherboard: Gigabyte Z270x Gaming 7 | Case: Fractal Design Define S (No Window and modded front Panel) | Monitor(s): Dell S2716DG G-Sync 144Hz, Acer R240HY 60Hz (Dead) | Keyboard: G.SKILL RIPJAWS KM780R MX | Mouse: Steelseries Sensei 310 (Striked out parts are sold or dead, awaiting zen2 parts)


6 minutes ago, leadeater said:

Professional cards having more problems is fundamentally counter to the principle of their existence, which is why I pointed it out; I wasn't the only person to do so, either.

Yes, I agree that that shouldn't have happened, and it is a strange result that the gaming products were more stable than their workstation counterparts.

The driver could be a potential explanation.

And I wasn't arguing about the hardware; I was arguing about the software side of the cards.

6 minutes ago, leadeater said:

What you're saying is we should expect more issues with servers that have Xeons, ECC RDIMMS, RAID cards, 25GbE, HBAs, NVDIMMs and a server OS or Hypervisor running multiple different systems of all types.... nope that's not a thing either.

Well, no, because there is no software layer between the CPU and whatever you throw at it, while there is a software layer between a GPU and whatever you throw at it...

"Hell is full of good meanings, but Heaven is full of good works"


13 minutes ago, Stefan Payne said:

Well, no, because there is no software layer between the CPU and whatever you throw at it, while there is a software layer between a GPU and whatever you throw at it...

You mean other than the OS, its kernel, the drivers for the hardware in the system, etc.? There are heaps of places for things to go wrong, and that's before introducing something like a hypervisor and VMs. There is more potential for something to go wrong CPU-side than GPU-side.

Just last week I had to do a kernel update on a Linux server to resolve an issue with software running on that system. Servers aren't error-free, but we certainly expect fewer errors, and we do actually get fewer.

I also wouldn't say I've personally experienced more issues with workstation cards or their drivers, but it's not like I've done any rigorous testing on that front; I just haven't observed many issues.

edit:

Oh, and those extra hardware features are there to prevent software errors; it's not mutually exclusive.


Not sure how one would market "our drivers are more stable" without looking really bad... unless the opposition is just really bad.

Though I will say I am certainly convinced that Nvidia's drivers these days are not amazing. The few months I had to use old drivers, because the new ones caused my PC to crash randomly while using Firefox, were evidence of that.


19 hours ago, Humbug said:

But what it does mean, IMO, is that if Nvidia had come out on top, these results would have remained an internal AMD report and would never have been made public. AMD only published this because the results are favorable; it's possible that there are unfavorable research papers from the past, commissioned by both AMD and Nvidia, that they keep secret because they don't fit their narrative. But this time AMD is turning it into a marketing push.

...


Cheers

Good to see rational thinking continues to survive in this fanboyism-steeped echo chamber.

Cheers indeed! :)


7 hours ago, Arika S said:

Instead of taking jabs, advertise what makes your product better and what your product does, not just that it's better than the competition.

You cannot do that without comparative advertising. Comparative advertising is the lifeblood of capitalism.

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

18 hours ago, Humbug said:

There was a time where Nvidia had objectively better graphics drivers than the competition.

Honestly, that's not really true. Nvidia has had better drivers at times, and ATI/AMD has had better drivers at times; there was no era in which Nvidia's drivers were strictly the best and ATI's the worst. They went back and forth over the years in that regard.

7 hours ago, mr moose said:

the only thing that will kill a bad reputation on the internet is better performance and cooler PR.

Except if your competition can outspend you, you're in an uphill battle to prove your performance is better. Hopefully that will improve now that AMD is no longer bleeding money, thanks to Ryzen.

1 hour ago, Sypran said:

Though I will say I am certainly convinced that Nvidia's drivers these days are not amazing. The few months I had to use old drivers, because the new ones caused my PC to crash randomly while using Firefox, were evidence of that.

We built a Vega 56 system at work several months back and ended up having to use the drivers that came on the disc, because every version of the drivers from the AMD website would make the system unstable under load. As I said before, they both go back and forth on having good/bad drivers (though at least AMD's drivers haven't killed a card..... yet ;) ).


8 hours ago, DrMacintosh said:

AMD's PR has been pretty cool lately

 

Look at that Intel jab with the "more advanced security features"

AMD's PR campaign has basically been "let's poke fun at the other guy." If you have to poke fun at the other guy to lift up your own product, it tells me one of two things: either you're overconfident in your product (which most marketers have to be anyway), or you lack the confidence that your product can stand on its own, so you resort to schoolboy tactics like this.

