
"Nvidia Disappointed that Witcher, Cars "Tainted by False Allegations""

MegaDave91

Why couldn't AMD just buy the license, get the source code and optimize their drivers? I don't understand.

CPU: Intel i7 4790k (4.9GHz), Motherboard: ASUS Sabertooth Z97 Mark 1, RAM: Corsair Dominator Platinum 16GB (2 x 8GB), GPU: EVGA GeForce GTX 980 4GB Superclocked ACX 2.0 (2-way SLI), PSU: EVGA 850W G2, Storage: Intel 730 Series 240GB 2.5" SSD, Western Digital Caviar Black 1TB, Case: NZXT H440 Designed by Razer, EK/Alphacool/Bitspower custom loop.

“Peace cannot be kept by force. It can only be achieved by understanding.”

Albert Einstein 14 December 1930


But I agree, AMD lacks the budget to get its optimizers to work on this stuff.

 

And add to that the fact that AMD has brought out only two new high-end cards during the past couple of years, which were barely performing on par with nVidia when they launched anyway. Although there might be some things wrong with nVidia's approach to GameWorks, AMD should stop blaming other people, get their sh*t together and finally do a complete graphics lineup refresh, because just giving us the 390 and 390X is not enough.

CPU: AMD Ryzen 9 - 3900x @ 4.4GHz with a Custom Loop | MBO: ASUS Crosshair VI Extreme | RAM: 4x4GB Apacer 2666MHz overclocked to 3933MHz with OCZ Reaper HPC Heatsinks | GPU: PowerColor Red Devil 6900XT | SSDs: Intel 660P 512GB SSD and Intel 660P 1TB SSD | HDD: 2x WD Black 6TB and Seagate Backup Plus 8TB External Drive | PSU: Corsair RM1000i | Case: Cooler Master C700P Black Edition | Build Log: here


Is everyone forgetting that AMD's cards are 2 years old? With some being even older due to rebranding? You cannot expect old graphics cards to hold up against the newest ones.

 

And add to that the fact that AMD has brought out only two new high-end cards during the past couple of years, which were barely performing on par with nVidia when they launched anyway. Although there might be some things wrong with nVidia's approach to GameWorks, AMD should stop blaming other people, get their sh*t together and finally do a complete graphics lineup refresh, because just giving us the 390 and 390X is not enough.

Yep, the whole next release needs to be new cards.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Why couldn't AMD just buy the license, get the source code and optimize their drivers? I don't understand.

AMD would never buy anything from Nvidia, and vice versa. 

Case: Corsair 4000D Airflow; Motherboard: MSI Z490 Gaming Edge; CPU: i7 10700K @ 5.1GHz; Cooler: Noctua NH-D15S Chromax; RAM: Corsair LPX DDR4 32GB 3200MHz; Graphics Card: Asus RTX 3080 TUF; Power: EVGA SuperNova 750 G2; Storage: 2 x Seagate Barracuda 1TB; Crucial M500 240GB & MX100 512GB; Keyboard: Logitech G710+; Mouse: Logitech G502; Headphones / Amp: HiFiMan Sundara / Mayflower Objective 2; Monitor: Asus VG27AQ


AMD would never buy anything from Nvidia, and vice versa.

Honestly, it seems they need to, if they want optimized performance with GameWorks. Gotta spend money to make money.

CPU: Intel i7 4790k (4.9GHz), Motherboard: ASUS Sabertooth Z97 Mark 1, RAM: Corsair Dominator Platinum 16GB (2 x 8GB), GPU: EVGA GeForce GTX 980 4GB Superclocked ACX 2.0 (2-way SLI), PSU: EVGA 850W G2, Storage: Intel 730 Series 240GB 2.5" SSD, Western Digital Caviar Black 1TB, Case: NZXT H440 Designed by Razer, EK/Alphacool/Bitspower custom loop.

“Peace cannot be kept by force. It can only be achieved by understanding.”

Albert Einstein 14 December 1930


Poor NVIDIA... victims of their own bullshit practices... 


Poor NVIDIA... victims of their own bullshit practices... 

Poor AMD... victims of their own bullshit rebrands

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Poor AMD... victims of their own bullshit rebrands

Man... this is why we can't have good stuff...

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120GB | WD Blue 1TB | some RAM | a random case

 


When I am done with my GTX 760, I'll be smashing it with a mallet and throwing the remains into 10 different landfills.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


Poor AMD... victims of their own bullshit rebrands

Indeed, that is true, and you can add poor marketing as well... but I think that's a bit, if not completely, off topic.

Yet I get the feeling that you said that to me as some sort of retaliation... like you were trying to hurt me in the feels, bro... probably because you were hurt? lol

 

This brand loyalty is some case study material xD


Indeed, that is true, and you can add poor marketing as well... but I think that's a bit, if not completely, off topic.

Yet I get the feeling that you said that to me as some sort of retaliation... like you were trying to hurt me in the feels, bro... probably because you were hurt? lol

 

This brand loyalty is some case study material xD

Shots were fired one way; someone had to fire back, even if they haven't owned an AMD dGPU since 2001. (And that was my only one; before that it was my Diamond Stealth III S540, and a couple of 8-bit ISA VGA cards.)

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


NVIDIA would probably have denied the GTX 970 VRAM issues if they hadn't been caught with their pants down by the whole tech community, and would have said those problems were merely "false allegations".

 

They are scum and their credibility is forever tainted as far as I am concerned. Everything they say from now on will be taken with a truckload of salt.


NVIDIA would probably have denied the GTX 970 VRAM issues if they hadn't been caught with their pants down by the whole tech community, and would have said those problems were merely "false allegations".

 

They are scum and their credibility is forever tainted as far as I am concerned. Everything they say from now on will be taken with a truckload of salt.

Go back further. AMD's just as bad: CPUs with a 66MHz FSB sold as CPUs with a 100MHz FSB. Back when the FSB made an enormous difference in performance, a Celeron 300A at 450MHz on a 100MHz FSB was faster than a Celeron 500 on the stock 66MHz FSB, for example.
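
For anyone who wasn't around back then, here's the clock math as a quick sketch (core clock = multiplier x FSB); the multipliers are the locked values I remember those chips shipping with, so treat them as assumptions:

# Core clock = multiplier x FSB (multipliers from memory -- assumptions, not gospel)
parts = [
    ("Celeron 300A, stock 66MHz FSB", 4.5, 66.6),
    ("Celeron 300A on a 100MHz FSB", 4.5, 100.0),
    ("Celeron 500, stock 66MHz FSB", 7.5, 66.6),
]
for name, multiplier, fsb_mhz in parts:
    print(f"{name}: ~{multiplier * fsb_mhz:.0f}MHz core")

The overclocked 300A actually ran at a lower core clock than the Celeron 500, yet the 100MHz FSB made it the faster chip in practice, which is exactly the point.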

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Maybe if AMD had released a new card (or any card) after 2013 that actually fixed the issues with tessellation, they wouldn't be having so many problems.

 

They did, the R9 285. That's why it beats the 280, 960, 770 and 280X.

Build: CPU: Intel i5 4690k OC = 4.4GHz | GPU: Gigabyte R9 285 OC = 1100/1575MHz | MB: MSI Z97M Gaming | RAM: Corsair 8GB 2133MHz CL9 | Storage: Crucial MX100 512GB | Cooler: Thermaltake Frio Silent 14 | Case: Cooltek U3 | PSU: Corsair RM550 | Monitor: Dell UltraSharp 2414h | Keyboard: Steelseries 6gv2 | Mouse: Corsair M95

 

FireStrike score = 7803. Graphics score = 9363 (single R9 285 record).


They did, the R9 285. That's why it beats the 280, 960, 770 and 280X.

I've never actually looked at the 285 in depth. By the sounds of it there was a lot going on with it that should have been carried over to other models.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


The only thing Cars uses is a fucking CPU-based physics engine called PhysX. CPU-based, as in it doesn't use the GPU. Ever. On any manufacturer's hardware.

 

As for TW3, last time I checked you could turn the settings off. Someone should really call the Whaaaaaaambulance on AMD; they're making this into a bigger deal than it is to improve their image, but to me they just look like a bunch of whiners. Like I've said in the other thread, you can turn that shit off, just like you can turn settings down on a lower-end GPU when the game doesn't run well enough. Doesn't run? Turn it off, get over the fact that you won't have simulated fancy hair and keep playing. It's ugly anyway, just like TressFX. I'd say it looks like spaghetti, but that would be a compliment.

"It's a taxi, it has a FARE METER."


Cutting through all the finger pointing...

 

The bottom line, and the problem, is that the default tessellation level in the game is far higher than required, artificially tanking frame rates for no added visual benefit. AMD users have an easy fix using the tessellation override in the Catalyst control panel. I'm not aware of an equivalent driver-side fix for Nvidia users, beyond the config tweak sketched below.
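
For what it's worth, the workaround that circulated for everyone else was lowering the HairWorks MSAA level in the game's config files. A sketch, with the file location, key name and default value taken from forum reports of the time (treat all three as assumptions):

; in <Witcher 3 install>\bin\config\base\rendering.ini
; key name and default per forum reports -- treat as assumptions
HairWorksAALevel=4    ; reportedly ships at 8; lower is cheaper

Note that this only trims the HairWorks anti-aliasing cost; it doesn't touch the tessellation level itself the way the Catalyst override does.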

 

The question of why this was not optimized first needs to be answered by CD Projekt Red. It's their game and they should take responsibility for it.


They did, the R9 285. That's why it beats the 280, 960, 770 and 280X.

 

Eh eh. The 285 only beats the 280 and is (almost) equal to the 960 in performance.

The order goes something like this: 280, 960 & 285, 770, 280X.

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120GB | WD Blue 1TB | some RAM | a random case

 


We do not forbid it, we just don't provide the source code, which amounts to the same fucking thing.

 

Dear god, both AMD and Nvidia are so fucking retarded in this argument. Here's my proposal: a game about people who were all irradiated so none of them have fucking hair. Done, I just solved your fucking controversy, let's move the fuck on.

 

Wut. The guy said they don't forbid the studios from working with other IHVs in their license agreement. So it's not the same thing at all.

 

Nvidia's doing this to force AMD's hand and make them release their new cards.

 

Nah.

 

Poor NVIDIA... victims of their own bullshit practices... 

 

Victims of what, exactly? They did nothing wrong here as far as I'm concerned.

 

When I am done with my GTX 760, I'll be smashing it with a mallet and throwing the remains into 10 different landfills.

 

Or you can just give it to someone who actually needs a GPU. Even if you're just joking, that's literally going to do nothing but destroy a video card that someone could use.

 

But to respond to your anger, I don't see how a company making advances and developers wanting to implement those advances in their game is something to be pissed about.

 

NVIDIA would probably have denied the GTX 970 VRAM issues if they hadn't been caught with their pants down by the whole tech community, and would have said those problems were merely "false allegations".

 

They are scum and their credibility is forever tainted as far as I am concerned. Everything they say from now on will be taken with a truckload of salt.

 

Do we really need to go over this again? Most people had no issues with the 970. I had one for a week before I sent the system off to the company that bought it, but I ran games on it to see how it performed. If you remember, I even told you about my findings.

 

I experienced no issue with the card that I didn't also experience with my two 770s: in Watch Dogs, it would lag for a couple of seconds, but that would go away as it loaded the environment when I got into certain parts of the city. I even watched the VRAM usage go up to 3.8GB, not sit stuck at 3.5. There's nothing wrong with the card, and reviewers even confirmed that; Jay especially got tired of it and finally put it to rest in his video about it. So what you mean by "the whole tech community" is the people who are gullible enough to jump on internet bandwagons.

 

The real scum here are the people who would rather let the wool stay over their eyes than face the facts, and that's certainly not me or anyone else who thinks there's nothing wrong with GameWorks or the 970.


Wut. The guy said they don't forbid the studios from working with other IHVs in their license agreement. So it's not the same thing at all.

 

Looks like you completely fell for NVidia's spin. He says licensees can gain access to GameWorks code. Well, AMD is not a licensee and never will be (let's be realistic).

 

So sure, AMD can optimize the game with the dev, but the dev cannot show, share or talk about GameWorks code with AMD. That is the issue. Witcher 3 proves this, as the base game runs just fine on AMD.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Looks like you completely fell for NVidia's spin. He says licensees can gain access to GameWorks code. Well, AMD is not a licensee and never will be (let's be realistic).

 

So sure, AMD can optimize the game with the dev, but the dev cannot show, share or talk about GameWorks code with AMD. That is the issue. Witcher 3 proves this, as the base game runs just fine on AMD.

What exactly are you getting at, and how is this Nvidia's spin? It has been this way from the get-go.

Why is it a bad thing that they can't show or share GameWorks code with AMD? Nothing in the code is purposely gimping AMD, and the effects that do not perform well can be turned off. It's an issue you're trying to turn into an Nvidia-hate train, just like anyone else who refuses to see the facts.


Eh eh. The 285 only beats the 280 and is (almost) equal to the 960 in performance.

The order goes something like this: 280, 960 & 285, 770, 280X.

 

I was referring to tessellation performance and the impact it has in Witcher 3.

Build: CPU: Intel i5 4690k OC = 4.4GHz | GPU: Gigabyte R9 285 OC = 1100/1575MHz | MB: MSI Z97M Gaming | RAM: Corsair 8GB 2133MHz CL9 | Storage: Crucial MX100 512GB | Cooler: Thermaltake Frio Silent 14 | Case: Cooltek U3 | PSU: Corsair RM550 | Monitor: Dell UltraSharp 2414h | Keyboard: Steelseries 6gv2 | Mouse: Corsair M95

 

FireStrike score = 7803. Graphics score = 9363 (single R9 285 record).


What exactly are you getting at, and how is this Nvidia's spin? It has been this way from the get-go.

Why is it a bad thing that they can't show or share GameWorks code with AMD? Nothing in the code is purposely gimping AMD, and the effects that do not perform well can be turned off. It's an issue you're trying to turn into an Nvidia-hate train, just like anyone else who refuses to see the facts.

 

It's spin; people don't seem to understand that NVidia is actually confirming that devs cannot get help from AMD, nor help AMD with optimizing GameWorks.

 

It doesn't matter if the effect isn't directly gimping AMD (which would be illegal, btw). The issue is that AMD cannot properly optimize drivers for these effects without source code access. We know from the ex-Valve programmer that optimizing at the end of the graphics stack is both ineffective and very difficult. That is the issue.

 

Claiming that you can just turn it off is not very sustainable in the gaming scene. How much is it acceptable to have to turn off in the graphics settings, just because you have one brand of GPU? This is not about NVidia hate, but about criticizing a tendency in the gaming space that no consumer should condone or accept.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


It's spin; people don't seem to understand that NVidia is actually confirming that devs cannot get help from AMD, nor help AMD with optimizing GameWorks.

 

That's not Nvidia's or the studio's fault; it's AMD's. AMD needs to hire people who can actually do the job, not a Chief Gaming Scientist who likes to stir up marketing spin to get the scraps off the dinner table: making memes about Radeon cards "actually having 4GB of VRAM" and then claiming Nvidia is hurting their performance in GameWorks titles, a completely false accusation.

It doesn't matter if the effect isn't directly gimping AMD (which would be illegal, btw). The issue is that AMD cannot properly optimize drivers for these effects without source code access. We know from the ex-Valve programmer that optimizing at the end of the graphics stack is both ineffective and very difficult. That is the issue.

So what you're saying is you want Radeon cards to be able to run Nvidia effects? Is that why you keep bringing up these arguments?

 

Claiming that you can just turn it off is not very sustainable in the gaming scene. How much is it acceptable to have to turn off in the graphics settings, just because you have one brand of GPU? This is not about NVidia hate, but about criticizing a tendency in the gaming space that no consumer should condone or accept.

Yes, it is sustainable. AMD should get their shit together, because they could be handing Nvidia their ass. But no, they have to put Huddy out there to get all of the gullible people (AMD and Nvidia owners alike) riled up and raising their pitchforks at Nvidia for no reason.


I was referring to tessellation performance and the impact it has in Witcher 3.

 

Oh, you meant that? :lol: My apologies, the 285 definitely has better tessellation performance than those cards.

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120GB | WD Blue 1TB | some RAM | a random case

 

