
NVIDIA Fires Shots at AMD’s 7nm Tech - Claims "Can Create Most Energy-efficient GPU in the World Anytime"

1 minute ago, leadeater said:

Though that hate is a tad bipolar when you see comments hailing Intel's entry into the GPU market lol.

Clearly it means that NVIDIA needs to get into the desktop CPU market xD

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones


15 hours ago, dalekphalm said:

Jensen needs to put up or shut up.

He can't.

He's just that kind of person who can't deal with competition on a level playing field.

 

The rise of nVidia was driven by bad tech journalists who hyped useless features. Just look up how high the "32-bit penalty" of a Riva TNT was.

Until at least the GeForce DDR, hardly anyone actually played in 32-bit color because of the performance impact...

That was also a reason for the downfall of 3dfx...

 

 

19 hours ago, Blademaster91 said:

About 300 W is still from the GPU alone, and the Radeon VII's heat and noise was a valid complaint most reviewers had, despite the card having a triple-fan cooler.

a) Hardly anybody really cares about power consumption, except the nVidia side, which uses it to bash AMD.

b) And how does it look on the other side?
And what about the easy "two-click fix" for the Radeon VII??

 

Oh, it doesn't matter, it's AMD. We need fodder to bash them, I forgot.

 

But you forget that YOU can adjust the power consumption and fan speed in the driver...

19 hours ago, Blademaster91 said:

No, plenty of people pay for their own power, and companies care about power consumption.

Yeah, all those people with overclocked Intel CPUs absolutely care about power consumption and the 250-300 W their CPU draws.

Oh wait, they don't.

So why care about the GPU when you don't care about the CPU?!
That's hypocritical.

Or the ones with old CCFL-backlit screens (24-inch and smaller) that gobble up to 140 W...

 

19 hours ago, Blademaster91 said:

it isn't just something to use "against the other side".

Then why is nobody talking about the power consumption of CPUs right now?!
I don't ever see it mentioned these days. But back when AMD sucked at it, everyone felt the need to mention it.

Now that AMD is on top, nobody seems to want to mention it.

19 hours ago, Blademaster91 said:

There would be more small-form-factor cards from AIBs if AMD GPUs consumed less power, and more laptop OEMs would be using Radeon GPUs.

That's total bullshit and you know it.

 

 

Just show me the laptops with the superior AMD APUs. There are almost none. And the ones that exist are mostly shit.

 

AMD just can't do anything right in the eyes of some people.

Even when they aren't bad at all, people expect them to deliver +50% performance with half the power consumption at no more than half the price of the nVidia card...

 

 

19 hours ago, cj09beira said:

Point is, it's not 300 W extra; at most it's like 50 W extra, which is not much. PS: AIB cards from Nvidia can use the same amount of power.

Exactly!
It's not a big difference, but people seem to feel the need to make a big deal out of it to justify their purchase...

6 hours ago, Lathlaer said:

Nothing but the topic was posted here on LTT, which is significantly pro-AMD, so now we will have the obligatory several pages of people bashing NVIDIA to even out the score?

Yeah, because people are beginning to see through the bullshit coming from nVidia: they don't like the drastically increased prices of the last couple of years and the "It just works" bullshit.


As well as the GeForce Partner Program, which was meant to reduce possible choices.

 

So yes, nVidia deserves the shit thrown at them, more than any other company. They have always been on their high horse.

 

You might not know their response to people asking about integrating settings for the TV encoder chip: "Ah, that's not our thing, the board partners are responsible for that."

 

Yeah, right. So why the heck does everyone else do that?
And who is responsible for the reference design? 


And then there was the GeForce Partner Program last year, which nobody dared to break, because it would "damage" their relationship with nVidia...

5 hours ago, leadeater said:

I just think they are an evil company, purposefully withholding the best they can offer and charging more than the products are worth....

Yeah, or mention how they fucked over Microsoft with the original Xbox, together with Intel.

And they also weren't very cooperative with Sony, while the other side was.

 

If nVidia were cooperative, they would have allowed IBM/Sony to integrate the GPU into the chip, the same way it was done with the Xenos chip in the Xbox 360.

 

And there was also a bug in the design which limited the bandwidth to a couple of megabytes per second under some circumstances...

 

5 hours ago, leadeater said:

hmm I guess that does sound like I'm bashing Nvidia.

You know, if you say anything bad about nVidia, it's bashing.

For example, mentioning the uneven memory interface of the 560 Ti, or was it the 660 Ti?? (192-bit with 2 GiB VRAM)

And especially the GTX 970...

5 hours ago, leadeater said:

Seriously though, I don't think the forum is pro-AMD; I think it's more anti-Intel-and-Nvidia than that.

To put it in the words of Louis R.:

Real fans call out "their" companies on their bullshit, in the hope that they correct it and make better products.

5 hours ago, Lathlaer said:

Clearly it means that NVIDIA needs to get into the desktop CPU market xD

Yeah, right.

They can't.

Intel won't allow it.

And they shouldn't. 


nVidia deserves to die and be replaced with a better company that's less evil than they are.

They are on the same level as Facebook and Twitter.


Ever seen their privacy policy for GeForce Experience?? (Or was it their drivers?) The part where they say they will share the data with "their partners", whoever that might be...

"Hell is full of good meanings, but Heaven is full of good works"


How about the most cost-effective, bitch?

 

Personally, I don't care much if my GPU is drawing more power. I guess I'm different that way, but to me, that doesn't matter as much as how well it performs vs how much it costs.


7 hours ago, leadeater said:

evil company

 

"Evil" as defined by what...? ?

 

They act exactly as a corporation should - they act to make money for their stakeholders (usually owners/shareholders, employees/members in some co-operative models, etc).

 

A corporation (and by extension, a business) exists to make money by providing a service or a product; contrary to some beliefs, they do not exist (and are certainly not obligated beyond whatever laws exist to that effect) to make life easier or better for their customers - making life better or easier (by providing products or services) is simply the means to the end (making money). 

 

Disliking their business practices is all well and good, but you're screeching at the symptoms, not the causes (the concepts and implementations of the socio-economic order our world operates in and, if we want to get down to the baser levels, the way humans are biologically wired to acquire resources and the way each one fits into and affects a larger group).

 

7 hours ago, leadeater said:

purposefully withholding the best they can offer and charging more than the products are worth....

The "worth" of a product, as determined by who exactly...? ?

 

If the products they are "charging more for than they're worth" continue to sell (in other words, people are still willing to pay, regardless of reason), that is simply the market dictating that the product is worth purchasing at whatever particular price point(s) it is being sold at.

 

This is not limited to video cards or products in the technology sector, as I am sure you very well know.


On 3/30/2019 at 3:52 AM, CarlBar said:

 

As many, many, many, many people have pointed out, the bottlenecks in the architecture design are the main reason it's so poor at power per frame. Based on some claims about the exact % of dead time on bits of silicon, I'd guess a more rational design could deliver R7 performance at about 300 W draw. I suspect small Navi will be 200 W or less.

Do you mean the broken primitive shaders?


On 3/30/2019 at 4:19 AM, Master Disaster said:

The reason they'll never do this is because they're too busy spending billions chasing pipe dreams like ray tracing, which no one asked for and, based on current implementation rates in software, no one wants either.

 

The phrase "the bigger you are, the harder you fall" applies to Nvidia in this scenario. Just like Intel, they got complacent with being market leader, and instead of focusing on what mattered they decided to focus on a personal dream of their CEO. Guess what, Nvidia: while you were wasting time developing RTX, AMD caught up and might just overtake you in the market where it really matters.

 

This is what happens when you chase the top 1% and ignore the other 99%. The top 1% might buy the most expensive cards but the other 99% are where the real money lies.

I wanted raytraced graphics for the longest time. :/


2 hours ago, thorhammerz said:

"Evil" as defined by what...?

Common sense.

 

https://www.hardocp.com/article/2018/03/08/geforce_partner_program_impacts_consumer_choice

 

Forcing their partners not to offer competing products under the same marketing label, i.e. an "ASUS RX Vega ROG Strix" with the same name as nVidia cards like the "ASUS ROG Strix RTX 2080" would be forbidden under that label...


That is a direct attack on the sovereignty of the "partner company"; that is evil and should be looked into by the FTC, with nVidia fined a couple of billion dollars.

 

Quote

They act exactly as a corporation should - 

Companies should extort/blackmail other companies into not allowing competing products in their lineup??

 

Quote

A corporation (and by extension, a business) exists to make money by providing a service or a product;

You can make money without blackmailing your partners and pissing them off...


But yeah, I forgot. We need the money RIGHT NOW, and what happens a year or two later is not relevant, even if the company gets fined to death by the EU, the FTC and the Asian antitrust authorities.

"Hell is full of good meanings, but Heaven is full of good works"


2 hours ago, thorhammerz said:

-snip-

You know the first part was a joke right.....


17 minutes ago, Stefan Payne said:

Common sense.

That's cute.

17 minutes ago, Stefan Payne said:

https://www.hardocp.com/article/2018/03/08/geforce_partner_program_impacts_consumer_choice

 

Forcing their partners not to offer competing products under the same marketing label, i.e. an "ASUS RX Vega ROG Strix" with the same name as nVidia cards like the "ASUS ROG Strix RTX 2080" would be forbidden under that label...

That is a direct attack on the sovereignty of the "partner company"; that is evil and should be looked into by the FTC, with nVidia fined a couple of billion dollars.

Ah yes, and everyone has obviously learned their lessons, amirite?

 

Quote

Companies should extort/blackmail other companies into not allowing competing products in their lineup??

 

You can make money without blackmailing your partners and pissing them off...

Whichever generates more green!

Quote

But yeah, I forgot. We need the Money RIGHT NOW and what happens next year or two later is not relevant, if the Company got fined to death by the EU, FTC and other asian Kartell thingys.

Then their demise shall be of their own making, the consequences of their own decisions. Nothing more, nothing less.

 

12 minutes ago, leadeater said:

You know the first part was a joke right.....

All in good fun.


17 hours ago, Lathlaer said:

Nothing but the topic was posted here on LTT, which is significantly pro-AMD, so now we will have the obligatory several pages of people bashing NVIDIA to even out the score?

Not true at all. 

 

It's almost a split right down the middle, but I would actually hedge towards more people liking NVIDIA, so the forum definitely isn't pro either one.

 

The AMD fanboys might be louder than the NVIDIA fanboys - but AMD is the underdog, so that’s to be expected. 

 

Most users, like me and others - aren’t fanboys of either, and respect that both make good products, with some products being better in some areas at some times. 

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


8 hours ago, ryao said:

Do you mean the broken primitive shaders?

 

No. There are several different types of workload that take place in the silicon. You'll hear a lot about compute (CUDA being NVIDIA's implementation), but there's also a whole bunch of other stuff that goes into creating the image on your screen. Data centers mostly only care about the compute, though (many dedicated datacenter cards completely lack a graphics output), and as an architecture optimized for that, Vega has a far higher ratio of compute silicon to everything else. However, gaming workloads use a more balanced mix, and much of the time rendering a frame doesn't involve enough compute tasks to keep the compute side fully busy while the non-compute sections are running flat out with no spare capacity. This leads to the compute side stalling for lack of work. Now, silicon that's not actually doing calculations won't consume as much power as silicon that is. But dynamic power draw rises steeply with clock speed (roughly with voltage squared times frequency, and voltage has to rise with the clock), and because of how it's all interconnected you can't downclock the idle compute hardware, so it still suffers from that high-frequency power scaling. I've heard claims that up to 40% of the silicon on the die can be in this high-power-draw standby state in gaming workloads. That doesn't mean power draw would drop 40% if you fixed things, but a 25-30% drop is probably reasonable.

 

This is also why Vega does so well when handed a ray-tracing algorithm: that's primarily compute, and Vega has compute power to spare. Based on the teraflop ratings (a measure of available computing power) and what we know about BFV's very shoddy RTX implementation (it apparently doesn't use the RT or tensor cores at all), the R7 could probably achieve 2080 Ti levels of FPS if you could run BFV's ray tracing on it.
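The clock/power relationship described above can be sketched with the standard dynamic-power model, P ≈ C·V²·f, where voltage must rise roughly linearly with clock speed. This is a toy sketch with made-up constants, not real Vega figures:

```python
# Toy dynamic-power model: P = C * V^2 * f.
# All constants are illustrative, not real Vega numbers.

def dynamic_power(freq_ghz, c_eff=1.0, v0=0.8, v_per_ghz=0.3):
    """Dynamic power (arbitrary units) at a given core clock.

    Voltage is modeled as rising linearly with frequency, which is
    roughly how DVFS curves behave, so power grows close to cubically.
    """
    voltage = v0 + v_per_ghz * freq_ghz
    return c_eff * voltage ** 2 * freq_ghz

# Raising the clock 1.8x costs far more than 1.8x the power, which is
# why silicon that can't be downclocked while idle still hurts.
p_low = dynamic_power(1.0)
p_high = dynamic_power(1.8)
print(round(p_high / p_low, 2))  # → 2.67
```

The point of the sketch is just the shape of the curve: at the top of the frequency/voltage range, even stalled silicon that is kept at full clocks pays a steep power premium.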


Well, riding a bicycle is massively more efficient than driving a car, but you wouldn't want to ride one across a state.

                     ¸„»°'´¸„»°'´ Vorticalbox `'°«„¸`'°«„¸
`'°«„¸¸„»°'´¸„»°'´`'°«„¸Scientia Potentia est  ¸„»°'´`'°«„¸`'°«„¸¸„»°'´


On 3/31/2019 at 9:50 PM, dalekphalm said:

Not true at all. 

 

It's almost a split right down the middle, but I would actually hedge towards more people liking NVIDIA, so the forum definitely isn't pro either one.

 

The AMD fanboys might be louder than the NVIDIA fanboys - but AMD is the underdog, so that’s to be expected. 

 

Most users, like me and others - aren’t fanboys of either, and respect that both make good products, with some products being better in some areas at some times. 

Given that Nvidia has something like 75% market share, having it be only 50% Nvidia here would mean that this forum leans significantly toward AMD.


By the way, if people feel so strongly about GPUs, I suggest that they make their own GPUs like these guys did:

 

http://miaowgpu.org/

 

Do the design work. Then arrange a group buy where the chips are fabricated at TSMC, a contractor in China uses them to build the cards and then mails them to each buyer.

 

If you design it to mimic an actual AMD GPU, you should be able to reuse AMD’s drivers.

 

By the way, the Miaow GPU needs plenty of design work:

 

https://www.reactos.org/zh-hant/node/897


10 minutes ago, ryao said:

Given that Nvidia has something like 75% marketshare, having it be only 50% Nvidia here would mean that this forum significantly leans toward AMD.

I did say that I thought it edged towards NVIDIA.

 

Point being, there isn't an unusually high number of AMD fanboys here, and certainly no more than there are NVIDIA, Intel, Apple, Android, Samsung, or ASUS fanboys, etc.



3 minutes ago, dalekphalm said:

I did say that I thought it edged towards NVIDIA.

 

Point being, there isn't an unusually high number of AMD fanboys here, and certainly no more than there are NVIDIA, Intel, Apple, Android, Samsung, or ASUS fanboys, etc.

Even if the population here is slightly higher for Nvidia, given that Nvidia outnumbers AMD by something like 3:1 in the market, even a 51:49 Nvidia:AMD split here would mean that the forum leans toward AMD. The leaning happens on a slope that is skewed toward Nvidia because of their market share.
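That slope argument can be made concrete with a simple odds ratio, comparing the forum split to the market split. The shares below are the figures quoted in this thread (75:25 market, 51:49 forum), not verified market data:

```python
# Compare a forum's Nvidia:AMD split to the overall market split
# with an odds ratio. Shares are the figures quoted in this thread,
# not verified market data.

def preference_odds(share_a, share_b):
    """Odds of picking A over B, given two percentage shares."""
    return share_a / share_b

market = preference_odds(75, 25)  # ~3:1 Nvidia in the market
forum = preference_odds(51, 49)   # near 50:50 on the forum

# A ratio below 1 means AMD is overrepresented on the forum
# relative to its market share.
skew = forum / market
print(round(skew, 2))  # → 0.35
```

By this measure a near-even forum split would indeed mean AMD owners are roughly three times overrepresented relative to the quoted market share.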


3 minutes ago, ryao said:

Given that Nvidia has something like 75% marketshare, having it be only 50% Nvidia here would mean that this forum significantly leans toward AMD.

The forum does significantly lean towards AMD, and that's because we constantly have people coming on asking about 1660 Tis and 2060s when there's a much better solution to that problem: a Vega 56.

 

Similarly, we have people coming on saying they are interested in the RTX 2070 ALL THE TIME. It's a disgrace of a GPU for the price they are selling that thing at. Again, there's a much better solution that keeps 8 GB of VRAM and similar performance for a better price: it's called the Vega 64.

 

And then you've got the RX 570, which actually STILL makes Nvidia look bad and will continue to make them look bad until its small Navi brother takes over that job.

 

Turing is awful. Most people on this forum realise that, and that's why a lot of us recommend AMD. Who wants bad framerates in the future because of a lack of VRAM? Nobody. Who wants a technology that isn't properly finished? Nobody. Who wants Vaseline smeared all over their screen? Nobody. At the end of the day, people want high FPS in games and good performance in compute workloads. That's it. AMD does that for a better price, so it makes sense for us to recommend them instead of Turing. Other than the NVENC encoder, what is actually GOOD about Turing? Because I can't think of a single thing.

 

The majority of people would rather wait until Nvidia gets its act together on these new features and instead opt for a GPU that just does what they want it to do for a better price. The reason the market share is so high is because most people don't even look at AMD. It's our duty to help change that. We need competition. It's not AMD fanboys; it's just people who look at value and see that it's blindingly obvious that AMD has better-value cards.

 

 

 

 


16 minutes ago, MeatFeastMan said:

The forum does significantly lean towards AMD, and that's because we constantly have people coming on asking about 1660 Tis and 2060s when there's a much better solution to that problem: a Vega 56.

Similarly, we have people coming on saying they are interested in the RTX 2070 ALL THE TIME. It's a disgrace of a GPU for the price they are selling that thing at. Again, there's a much better solution that keeps 8 GB of VRAM and similar performance for a better price: it's called the Vega 64.

And then you've got the RX 570, which actually STILL makes Nvidia look bad and will continue to make them look bad until its small Navi brother takes over that job.

Turing is awful. Most people on this forum realise that, and that's why a lot of us recommend AMD. Who wants bad framerates in the future because of a lack of VRAM? Nobody. Who wants a technology that isn't properly finished? Nobody. Who wants Vaseline smeared all over their screen? Nobody. At the end of the day, people want high FPS in games and good performance in compute workloads. That's it. AMD does that for a better price, so it makes sense for us to recommend them instead of Turing. Other than the NVENC encoder, what is actually GOOD about Turing? Because I can't think of a single thing.

The majority of people would rather wait until Nvidia gets its act together on these new features and instead opt for a GPU that just does what they want it to do for a better price. The reason the market share is so high is because most people don't even look at AMD. It's our duty to help change that. We need competition. It's not AMD fanboys; it's just people who look at value and see that it's blindingly obvious that AMD has better-value cards.

 

 

 

 

 

Nvidia’s GPUs are significantly more energy efficient than AMD’s:

 

https://www.anandtech.com/show/13923/the-amd-radeon-vii-review/19

 

Their drivers are also better. Unlike AMD's drivers, Nvidia's drivers support GPU resets if the GPU hangs, yet hangs seem to occur far less often on Nvidia hardware than on AMD graphics hardware. I had nothing but ATI graphics hardware from 1998 to 2006, when I got fed up and switched to Nvidia. I have had far fewer graphics issues since then. I am told that little has changed. The DXVK author puts up with the lack of GPU reset support in AMD's drivers, but uses Polaris instead of Vega because the driver issues with Pascal are more severe.

 

I dislike Nvidia’s use of a binary blob for a driver, but upon looking at AMD’s hardware which on Linux has an open source driver stack, I concluded that even if I expended enormous time on fixing the drivers, energy efficiency would never match Nvidia.


Look at The Division 2 benchmarks, though. I think AMD has fantastic hardware but the software is EXTREMELY lacking. On full DirectX 12, like The Division 2, look at the benchmarks: the Radeon VII is beating the RTX 2080 Ti. I almost spat coffee out of my mouth when I saw the benchmarks, saying... WTF


2 hours ago, ryao said:

Even if the population here is slightly higher for Nvidia, given that Nvidia outnumbers AMD by something like 3:1 in the market, even a 51:49 Nvidia:AMD split here would mean that the forum leans toward AMD. The leaning happens on a slope that is skewed toward Nvidia because of their market share.

 

That's not how skewing in favour of something works.

 

I'd also be amazed if it's anywhere near that high. That said, the ratio of AMD to NVIDIA here is certainly more biased towards AMD than NVIDIA. That's inevitable, though. The average consumer tends to rely strongly on a combination of brand familiarity and third-party OEM-supplied parts, both of which favour NVIDIA heavily. Actual "what's the best choice for me" analysis tends to rank very low among the factors that influence NVIDIA vs AMD GPU sales.

 

Conversely, people here are understandably more tech-savvy: we do a lot more best-choice analysis, and price-per-performance tends to be a big deal for us. For people buying mid- or low-end graphics cards, AMD has had a stranglehold for a while (the 1660 Ti and 1660 have broken that for now, but how long that remains true is subject to change without notice), so those cards tend to get a lot of attention from the majority. (Most of us would like high-end GPUs in our systems, but only a few of us can actually afford them.)

 

Incidentally, I'm typing this on a system powered by a 2080 Ti in the graphics department.

 

3 minutes ago, Zeeus said:

Look at The Division 2 benchmarks, though. I think AMD has fantastic hardware but the software is EXTREMELY lacking. On full DirectX 12, like The Division 2, look at the benchmarks: the Radeon VII is beating the RTX 2080 Ti. I almost spat coffee out of my mouth when I saw the benchmarks, saying... WTF

 

What are you smoking, and can I have some of it please? The R7 beats the 2070 by 10-15%. The 2080 Ti beats the 2070 by far more...


1 minute ago, Zeeus said:

 

Can't get in; it's insisting I turn off my ad blocker, which is basically a "get lost" move for me. But I'm telling you they're wrong. Hardware Unboxed did an optimization guide, though, and there are a few graphs at the end showing actual comparative FPS between several cards. The R7 is the top card listed (the 2070 is the top NVIDIA card), and it's nowhere near that high. Hell, it manages worse performance in their optimized setup than my 2080 Ti does with everything maxed.


51 minutes ago, CarlBar said:

What are you smoking, and can I have some of it please? The R7 beats the 2070 by 10-15%. The 2080 Ti beats the 2070 by far more...

Good luck finding Radeon VIIs in stock anywhere near the price of an RTX 2070.

 

2070s in stock for ~$500 (or slightly cheaper). Radeon VIIs at.... $700? That's 2080 pricing (and therefore what consumers will compare it to).


4 hours ago, ryao said:

 

Nvidia’s GPUs are significantly more energy efficient than AMD’s:

AMD CPUs are more energy-efficient than Intel CPUs, yet nobody cares.

 

So that's just an argument used on the nVidia side to justify their purchase.

And don't come at me with the environment.

Just look at the GTX 680 vs the 7970! Which one lasted longer??

And what's better for the environment: slightly higher energy consumption but two more years or so of usable life, or slightly less consumption but having to replace the GPU a couple more times because driver optimizations were canned and only come for the newest generation...

4 hours ago, ryao said:

Their drivers are also better.

Bullshit.

There is that AMD driver study floating around, which comes to a different conclusion.

And the nVidia users in the Discord are always reporting problems with the driver.

The claim just isn't true anymore. And some people are very dissatisfied.

Also, the nVidia driver panel is the same old shit from 20 years ago.

 

4 hours ago, ryao said:

Unlike AMD’s drivers, Nvidia’s drivers support GPU resets if the GPU hangs yet hangs seem to occur far less on Nvidia hardware than on AMD graphics hardware.

Where did you get that shit from?

Remember this, from last year?

https://www.guru3d.com/news-story/third-party-audit-reveals-amd-drivers-are-the-most-stable-gamers.html

 

Those are FACTS. You are talking about anecdotes.

 

In the end, both drivers are kinda garbage; claiming "driver superiority" is just nonsense. And that is proven.

 

4 hours ago, ryao said:

I had nothing but ATI graphics hardware from 1998 to 2006 when I got fed up and switched to Nvidia.

See, I had a wide variety of graphics cards between about 1996 and 2006, when I got fed up and switched to ATi/AMD and stuck with it. Not that I never had any nVidia cards; I did (two GTX 570s and the GT 710 I use right now)...

 

So what??

4 hours ago, ryao said:

I have enjoyed far fewer graphics issues since then. I am told that little has changed.

Then you were told wrong, because I rarely have any issues today, while the nVidia users say they do.

 

Bottom line:
BOTH sides have issues!

The only difference is that on one side it is accepted and dealt with, while the other side gets bashed.

Same with an Intel system: if it doesn't work, you fix it.

If an AMD system doesn't work, you don't fix it; you go through forums and claim it's shit, because you have some shit memory.

4 hours ago, ryao said:

The DXVK author puts up with the lack of GPU reset support in AMD’s drivers, but uses Polaris instead of Vega because the driver issues with Pascal are more severe.

Wait, are you contradicting yourself right now?!

 

First, DXVK is a Linux thing. If he has issues with the AMD driver, he could fix them himself, because AMD has some nice open-source drivers; nVidia does not.

 

So you are saying that the problem is under Linux?
And instead of trying to fix the problem, someone just complains about it??

 

Yet you don't mention that the nVidia drivers violate the kernel rules: they don't come with an open-source component compiled into the kernel, but rather hack around the kernel, while AMD complies with the kernel driver rules.

And also nVidia's sabotage of the open-source drivers; they often don't help with newer GPUs and lock them down.

 

Um, really?! You see the problem, don't you??

 

4 hours ago, ryao said:

I dislike Nvidia’s use of a binary blob for a driver, but upon looking at AMD’s hardware which on Linux has an open source driver stack, I concluded that even if I expended enormous time on fixing the drivers, energy efficiency would never match Nvidia.

You are wrong and came to the wrong conclusion.

The "energy efficiency" is the usual Pseudo Argument that hardly anyone cares about, especially the bunch who has Intel "MCE" enabled or even overclock the CPU...

 

Yeah, it totally makes sense to run around with an "Intel at 5.2 GHz" badge and then complain about the power consumption of the graphics card...

"Hell is full of good meanings, but Heaven is full of good works"


1 hour ago, thorhammerz said:

Good luck finding Radeon VII's in stock anywhere near the price of an RTX 2070.

https://www.mindfactory.de/Highlights/MindStar

689 is reasonable for a card with 16GiB VRAM

1 hour ago, thorhammerz said:

2070's in stock for ~$500 (or slightly cheaper). Radeon VII's at.... $700? That's 2080 pricing (and therefore, what consumers will compare it to).

The 2070 has 8 GiB of VRAM.

The R7 has 16 GiB of VRAM.

 

And look at the Level1Techs YouTube channel: the video with EposVox, where he used a Radeon VII and it worked where it didn't with nVidia.

"Hell is full of good meanings, but Heaven is full of good works"


This topic is now closed to further replies.

