
NVIDIA has been hit with a class-action lawsuit over the GTX 970.

The proof is that in some Nvidia-sponsored titles a 650 Ti Boost beats a 270X.

 

That's proof enough that some games are heavily biased toward Nvidia.

 

But you guys like blanket statements with nothing to back them up, I guess.

No, that's proof AMD's drivers need work for that game. AMD may say it can't see inside the GameWorks DLLs, but that's BS, because they can disassemble, decompile, and analyze them! I am not smarter than all of AMD's computer programmers/scientists! Or, if I am, then AMD is doomed anyway!
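For what it's worth, poking at a closed binary's surface is not exotic. Here's a minimal sketch of what "analyze them" can mean in practice, assuming the third-party pefile package and a made-up DLL name (a real investigation would go on to disassemble it with something like IDA or Ghidra):

# Minimal sketch: list the functions a Windows DLL exports.
# Assumes the third-party "pefile" package (pip install pefile);
# the DLL name below is a hypothetical placeholder, not a real GameWorks file.
import pefile

dll_path = "GFSDK_Example.dll"  # hypothetical GameWorks-style DLL

pe = pefile.PE(dll_path)
if hasattr(pe, "DIRECTORY_ENTRY_EXPORT"):
    for exp in pe.DIRECTORY_ENTRY_EXPORT.symbols:
        name = exp.name.decode() if exp.name else "<unnamed export>"
        addr = pe.OPTIONAL_HEADER.ImageBase + exp.address
        print(f"{hex(addr)}  {name}")
else:
    print("No export table found.")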

 

I have a bachelor's degree in computer science. It's garbage to claim GameWorks gimps anything. If anything, AMD is just using those titles as an excuse to stir up its base with conspiracies. Also, the 650 Ti Boost is a serious overclocker.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I am childish, yet I am providing proof, while you refute my arguments without any.

 

Nice scientific method.

 

The burden of proof is on you.

But your proof is not scientific either. Correlation does not imply causation, and you are still only citing a single game while ignoring games like Watch_Dogs (from the same developer), which runs badly on all cards.

 

Your argument is: "Splinter Cell runs badly on AMD cards. Splinter Cell is one (out of many) GameWorks titles. Therefore, it must be GameWorks that causes AMD cards to perform badly in this game!" You have posted zero evidence to support this theory.


Carnegie Mellon sending 800 students letters that will deliver real depression upon them > GTX 970 nonsense.

 

One is a material good that has no bearing on your life. The other is pretty brutal for someone. Maybe some people here have never received a rejection letter from a leading university. They suck. It hurts. Putting in all the effort these higher-level institutions require just to apply, and then getting rejected? Hurts.

So imagine being told, "Hey, you got in!" and then getting the "Actually, you didn't." I've seen people genuinely get depressed from that kind of rejection. But appreciating that requires people on this forum to have some perspective on what actually matters in life.

And out of all things, it's definitely not a $330 lump of plastic, metal and glass fiber that sits in your PC and gets hot.

Main Rig: CPU: AMD Ryzen 7 5800X | RAM: 32GB (2x16GB) KLEVV CRAS XR RGB DDR4-3600 | Motherboard: Gigabyte B550I AORUS PRO AX | Storage: 512GB SKHynix PC401, 1TB Samsung 970 EVO Plus, 2x Micron 1100 256GB SATA SSDs | GPU: EVGA RTX 3080 FTW3 Ultra 10GB | Cooling: ThermalTake Floe 280mm w/ be quiet! Pure Wings 3 | Case: Sliger SM580 (Black) | PSU: Lian Li SP 850W

 

Server: CPU: AMD Ryzen 3 3100 | RAM: 32GB (2x16GB) Crucial DDR4 Pro | Motherboard: ASUS PRIME B550-PLUS AC-HES | Storage: 128GB Samsung PM961, 4TB Seagate IronWolf | GPU: AMD FirePro WX 3100 | Cooling: EK-AIO Elite 360 D-RGB | Case: Corsair 5000D Airflow (White) | PSU: Seasonic Focus GM-850

 

Miscellaneous: Dell Optiplex 7060 Micro (i5-8500T/16GB/512GB), Lenovo ThinkCentre M715q Tiny (R5 2400GE/16GB/256GB), Dell Optiplex 7040 SFF (i5-6400/8GB/128GB)


God, this thread... LTT is fucked. I said it before, I'll say it again.


My 2 cents:

It doesn't fucking matter if Nvidia lied or not. They were at least misleading and/or did not advertise their product correctly. For an international company as big as Nvidia, this is a big fucking deal! As with any large, international company, give them an inch and they'll take a mile. Nvidia absolutely bloody must be held accountable for their actions, no matter what their intentions were, even if they simply fucked up. If they aren't held accountable, they'll continue to "mess up" or "lie" (whatever you want to call it) and it will only get worse.

 

Do you want to see shit like this continue with their future products and possibly on a larger/harsher scale? I sure as hell don't!

 

If Nvidia did indeed lie:

 

Nvidia is a very large company. Companies as big as Nvidia are constantly trying to increase profits and lower costs. Sometimes they do this by being dishonest; it's nothing new. As consumers, it's up to us to decide whether this sort of thing from Nvidia is acceptable or not and take appropriate action, like speaking with our wallets.


Get your head into gear, LTT.

 

Or pretty much like everyone else. Intel is probably still lying to us. So are Kingston, AMD, and the rest of them. I won't trust companies to always tell us the truth, because they don't.

 

Not to say that there's an excuse for lying. There isn't, but don't be surprised if they are lying to you.

 

I'm sorry, am I missing something? Did AMD lie about something?

I get the feeling they didn't. As the underdogs, they cannot afford to be caught out lying. It could possibly destroy their company if they were.

waffle waffle waffle on and on and on


And out of all things, it's definitely not a $330 lump of plastic, metal and glass fiber that sits in your PC and gets hot.

 

I'm sure it does more than that. 

 

It has feelings. 

 

Right? It feels the feels, as the kids say. 


Using ad hominem.

 

Not using proof to back up statements.

 

Very nice debate. I thought rationality was linked to using proof to demonstrate a statement.

 

That's what modern science is based on: using proof to back up a thesis.

 

Stop backing yourself into a corner with overused arguments.

 

Again, I did not refute anything you've said, other than pointing out that the GameWorks debacle has been debunked many times, even by game developers who have actually worked with the fucking code.

 

I said I was going to investigate it myself, since the benchmark you showed was an interesting one and didn't make any sense. I wasn't being insulting or going off-topic from your argument, yet you're spouting all this bullshit?

 

KK


God, this thread... LTT is fucked. I said it before, I'll say it again.

My 2 cents:

It doesn't fucking matter if Nvidia lied or not. They were at least misleading and/or did not advertise their product correctly. For an international company as big as Nvidia, this is a big fucking deal! As with any large, international company, give them an inch and they'll take a mile. Nvidia absolutely bloody must be held accountable for their actions, no matter what their intentions were, even if they simply fucked up. If they aren't held accountable, they'll continue to "mess up" or "lie" (whatever you want to call it) and it will only get worse.

 

Do you want to see shit like this continue with their future products and possibly on a larger/harsher scale? I sure as hell don't!

 

Nvidia is a very large company. Companies as big as Nvidia are constantly trying to increase profits and lower costs. Sometimes they do this by being dishonest; it's nothing new. As consumers, it's up to us to decide whether this sort of thing from Nvidia is acceptable or not and take appropriate action, like speaking with our wallets.

Get your head into gear, LTT.

 

 

I'm sorry, am I missing something? Did AMD lie about something?

I get the feeling they didn't. As the underdogs, they cannot afford to be caught out lying. It could possibly destroy their company if they were.

They lied back in 2007 about TDP ratings for CPUs by creating their own rating system, ACP (average CPU power). It's just as much a lie as any, as they never made it obvious that the quoted figure was ACP and not TDP.

 

http://www.techpowerup.com/46921/amd-fudges-power-consumption-figures-by-making-up-power-consumption-rating-system.html

Main Rig: CPU: AMD Ryzen 7 5800X | RAM: 32GB (2x16GB) KLEVV CRAS XR RGB DDR4-3600 | Motherboard: Gigabyte B550I AORUS PRO AX | Storage: 512GB SKHynix PC401, 1TB Samsung 970 EVO Plus, 2x Micron 1100 256GB SATA SSDs | GPU: EVGA RTX 3080 FTW3 Ultra 10GB | Cooling: ThermalTake Floe 280mm w/ be quiet! Pure Wings 3 | Case: Sliger SM580 (Black) | PSU: Lian Li SP 850W

 

Server: CPU: AMD Ryzen 3 3100 | RAM: 32GB (2x16GB) Crucial DDR4 Pro | Motherboard: ASUS PRIME B550-PLUS AC-HES | Storage: 128GB Samsung PM961, 4TB Seagate IronWolf | GPU: AMD FirePro WX 3100 | Cooling: EK-AIO Elite 360 D-RGB | Case: Corsair 5000D Airflow (White) | PSU: Seasonic Focus GM-850

 

Miscellaneous: Dell Optiplex 7060 Micro (i5-8500T/16GB/512GB), Lenovo ThinkCentre M715q Tiny (R5 2400GE/16GB/256GB), Dell Optiplex 7040 SFF (i5-6400/8GB/128GB)


It annoys me when people say "the 970 still has 4GB." It's like me breaking one of your legs, forcing you to permanently walk with a limp, and then claiming "you still have two legs."

False equivalence. For the two-legs analogy to hold, the 970 would have to be limited to 2GB of "fast" memory, with the remainder crippled. A better comparison would be a sprained ankle or a broken toe: it hurts a bit, but doesn't cripple the person/card.
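To put rough numbers on "hurts a bit": a back-of-the-envelope sketch, assuming the widely reported ~196 GB/s for the 3.5GB partition and ~28 GB/s for the 0.5GB segment, and naively averaging over the allocation (real access patterns will differ):

# Back-of-the-envelope effective bandwidth if a game spills past 3.5GB.
# The per-partition figures are the commonly reported GTX 970 estimates,
# used here purely for illustration; the averaging is deliberately naive.
FAST_GB, FAST_BW = 3.5, 196.0  # GB, GB/s (7 of 8 memory controllers)
SLOW_GB, SLOW_BW = 0.5, 28.0   # GB, GB/s (the crippled last segment)

def effective_bandwidth(used_gb):
    fast = min(used_gb, FAST_GB)
    slow = max(0.0, used_gb - FAST_GB)
    return (fast * FAST_BW + slow * SLOW_BW) / used_gb

for gb in (3.0, 3.5, 3.8, 4.0):
    print(f"{gb:.1f} GB used -> ~{effective_bandwidth(gb):.0f} GB/s effective")

Even the worst case (all 4GB touched evenly) only drops the naive average from 196 to about 175 GB/s, which is the "sprained ankle" rather than the broken leg.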

 

 

Nvidia is a very large company. Companies as big as Nvidia are constantly trying to increase profits and lower costs. Sometimes they do this by being dishonest; it's nothing new. As consumers, it's up to us to decide whether this sort of thing from Nvidia is acceptable or not and take appropriate action, like speaking with our wallets.

 

We're not Nvidia's main customers. If every PC gamer were to go up in arms and never buy an Nvidia product in a ludicrous attempt to cripple Nvidia's profits, Nvidia would just laugh and carry on supporting the majority of its customers (workstation and supercomputer clients), which is more than enough for it to keep making money and stay in business.

Interested in Linux, SteamOS and Open-source applications? Go here

Gaming Rig - CPU: i5 3570k @ Stock | GPU: EVGA Geforce 560Ti 448 Core Classified Ultra | RAM: Mushkin Enhanced Blackline 8GB DDR3 1600 | SSD: Crucial M4 128GB | HDD: 3TB Seagate Barracuda, 1TB WD Caviar Black, 1TB Seagate Barracuda | Case: Antec Lanboy Air | KB: Corsair Vengeance K70 Cherry MX Blue | Mouse: Corsair Vengeance M95 | Headset: Steelseries Siberia V2

 

 


I'm sorry, am I missing something? Did AMD lie about something?

I get the feeling they didn't. As the underdogs, they cannot afford to be caught out lying. It could possibly destroy their company if they were.

Not sure if you can call them lies (people in this thread seem to have a weird definition of "lie"), but here are a few things:

Not telling the whole truth about their FreeSync Demo.

 

Having "staged release dates" for benchmark results of their products. What this means is that AMD told reviewers that they were only allowed to release benchmark results where their product were good, and then several weeks later they were allowed to post other benchmarks as well. This would make the product look far better than it really was, because all the first batch of benchmarks would look good.

 

The "ACU" measurement someone else posted earlier.

 

The way they measure CPU temperatures is misleading and terrible. If someone tells you their AMD CPU temp is below ambient, it's not that they are stupid or lying (below-ambient temps are physically impossible without something like phase-change cooling); it's just that AMD's temperature readings are terrible.

"Starting with the Phenoms, AMD's digital sensor no longer reports an absolute temperature value anymore, but a reading with a certain offset, which is unknown. It is estimated that this offset is between 10 - 20c."


Would people even care if Nvidia marked it as a 3.5GB card instead? Still beats the crap out of my anemic 1GB of VRAM, though.

 

Whenever AMD releases a new wave of cards, I'm jumping on that.

 

 


Would people even care if Nvidia marked it as a 3.5GB card instead? Still beats the crap out of my anemic 1GB of VRAM, though.

 

Whenever AMD releases a new wave of cards, I'm jumping on that.

 

Someone would have found something to bitch about. That's what the internet is good at.


God, this thread... LTT is fucked. I said it before, I'll say it again.

My 2 cents:

It doesn't fucking matter if Nvidia lied or not. They were at least misleading and/or did not advertise their product correctly. For an international company as big as Nvidia, this is a big fucking deal! As with any large, international company, give them an inch and they'll take a mile. Nvidia absolutely bloody must be held accountable for their actions, no matter what their intentions were, even if they simply fucked up. If they aren't held accountable, they'll continue to "mess up" or "lie" (whatever you want to call it) and it will only get worse.

 

Do you want to see shit like this continue with their future products and possibly on a larger/harsher scale? I sure as hell don't!

 

If Nvidia did indeed lie:

 

Nvidia is a very large company. Companies as big as Nvidia are constantly trying to increase profits and lower costs. Sometimes they do this by being dishonest; it's nothing new. As consumers, it's up to us to decide whether this sort of thing from Nvidia is acceptable or not and take appropriate action, like speaking with our wallets.

Get your head into gear, LTT.

 

 

I'm sorry, am I missing something? Did AMD lie about something?

I get the feeling they didn't. As the underdogs, they cannot afford to be caught out lying. It could possibly destroy their company if they were.

Nvidia wouldn't keep doing that, because its reputation with consumers would rot away to nothing and people would stop buying. Markets self-regulate when it isn't an infrastructure business like oil.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


hahahahhahah.

 

The judge is going to take one look at the defence's application and smack the plaintiffs on their arse and tell them to get the fuck out of his office.

 

Do they really think they stand a chance here, just because they don't understand jack about the architecture?

This is what I think of Pre-Ordering video games: https://www.youtube.com/watch?v=wp98SH3vW2Y


It was a clerical error. IT HAPPENS! Carnegie Mellon just screwed up and sent 800 students acceptance letters for its master's program in computer science, when it accepts fewer than 100 a year, and it had to renege on all 800 of them. IT HAPPENS!

Absolute. Horse. Shit.

 

You really think the design team, which spent God knows how many hours working on this thing, sent out the wrong specs? Never mind that the customers found out first, and we were only told it was a "clerical error" afterwards.

 

Absolute, complete and utter HORSE SHIT.

 

hahahahhahah.

 

The judge is going to take one look at the defence's application and smack the plaintiffs on their arse and tell them to get the fuck out of his office.

 

Do they really think they stand a chance here, just because they don't understand jack about the architecture?

 

It's about false advertising.

 

 

Before this incident I was an Nvidia fanboy; now I am going to switch to AMD at my next opportunity. Nvidia is dead to me until they apologize for lying about the card. No half-assed cop-out blaming a clerical error.

 

This seems relevant.

 

-snip

 

Never seen that one before. Funny though, lol.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs



It would be, except the 970 also has a major problem with its boost clock that can cause performance to tank to about half, and a LOT of 970 owners have it, including me.
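If you want to see whether your card's boost clock really is collapsing under load, here's a minimal monitoring sketch, assuming the third-party pynvml (nvidia-ml-py) bindings; nothing in it is 970-specific:

# Poll GPU core/memory clocks once a second to watch for boost collapse.
# Assumes the third-party pynvml package (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        print(f"core: {core} MHz   mem: {mem} MHz")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

Run it in a second window while the game is running; if the core clock drops to roughly half its boost value, you're seeing the same issue.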

Desktop: Ryzen 5 3600 | MSI B450M Bazooka | EVGA 650W | Cooler Master MasterBox NR400 | 16GB DDR4 Corsair LPX | Gigabyte Aorus GTX 1070 Ti | 500GB SSD + 2TB SSHD, 2TB Seagate Barracuda [OS/games/mass storage] | HP ZR240w 1440p LED | Logitech G502 Proteus Spectrum | Cooler Master QuickFire Pro Cherry MX Brown |

 


It would be, except the 970 also has a major problem with its boost clock that can cause performance to tank to about half, and a LOT of 970 owners have it, including me.

That's most likely the same issue that @Darkman had, which was resolved by switching to a different driver.

"The of and to a in is I that it for you was with on as have but be they"


That's most likely the same issue that @Darkman had, which was resolved by switching to a different driver.

Yeah... Driver 347.09 doesn't have issues; 347.25 does, and it even makes my 650 Ti flip its shit.

 


Senor Shiny: Main- CPU Intel i7 6700k 4.7GHz @1.42v | RAM G.Skill TridentZ CL16 3200 | GPU Asus Strix GTX 1070 (2100/2152) | Motherboard ASRock Z170 OC Formula | HDD Seagate 1TB x2 | SSD 850 EVO 120GB | CASE NZXT S340 (Black) | PSU Supernova G2 750W  | Cooling NZXT Kraken X62 w/Vardars
Secondary (Plex): CPU Intel Xeon E3-1230 v3 @1.099v | RAM Samsung Wonder 16GB CL9 1600 (sadly no oc) | GPU Asus GTX 680 4GB DCII | Motherboard ASRock H97M-Pro4 | HDDs Seagate 1TB, WD Blue 1TB, WD Blue 3TB | Case Corsair Air 240 (Black) | PSU EVGA 600B | Cooling GeminII S524


(Deceased) DangerousNotDell- CPU AMD FX 8120 @4.8GHz 1.42v | GPU Asus GTX 680 4GB DCII | RAM Samsung Wonder 8GB (CL9 2133MHz 1.6v) | Motherboard Asus Crosshair V Formula-Z | Cooling EVO 212 | Case Rosewill Redbone | PSU EVGA 600B | HDD Seagate 1TB

DangerousNotDell New Parts For Main Rig Build Log, Señor Shiny

 


Absolute. Horse. Shit.

 

You really think the design team, which spent God knows how many hours working on this thing, sent out the wrong specs? Never mind that the customers found out first, and we were only told it was a "clerical error" afterwards.

 

Absolute, complete and utter HORSE SHIT.

Funny, because apparently the recruiting agents at Carnegie Mellon who slave away for months weighing the various attributes and interview notes of each candidate are just as meticulous, and YET IT HAPPENED! Holy hell! These people work their asses off into the wee hours of the night, and because people at Nvidia trust each other, something incorrect is bound to slip through eventually!

 

Tech review sites get things wrong too, by copying and pasting tables and forgetting to edit one value. It happens to everyone, including Intel, AMD, Nvidia, IBM, Samsung, and even APPLE! Nvidia's reputation is trashed, and I'm sorry, but JHH might be a narcissist who wouldn't bat an eye; if you think the engineers are happy about the misprint, though, you're sorely mistaken.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


They lied back in 2007 about TDP ratings for CPUs by creating their own rating system, ACP (average CPU power). It's just as much a lie as any, as they never made it obvious that the quoted figure was ACP and not TDP.

 

http://www.techpowerup.com/46921/amd-fudges-power-consumption-figures-by-making-up-power-consumption-rating-system.html

Not sure if you can call them lies (people in this thread seem to have a weird definition of "lie"), but here are a few things:

Not telling the whole truth about their FreeSync Demo.

 

Having "staged release dates" for benchmark results of their products. What this means is that AMD told reviewers that they were only allowed to release benchmark results where their product were good, and then several weeks later they were allowed to post other benchmarks as well. This would make the product look far better than it really was, because all the first batch of benchmarks would look good.

 

The "ACU" measurement someone else posted earlier.

 

The way they measure CPU temperatures is misleading and terrible. If someone tells you their AMD CPU temp is below ambient, it's not that they are stupid or lying (below-ambient temps are physically impossible without something like phase-change cooling); it's just that AMD's temperature readings are terrible.

"Starting with the Phenoms, AMD's digital sensor no longer reports an absolute temperature value anymore, but a reading with a certain offset, which is unknown. It is estimated that this offset is between 10 - 20c."

 

Fair enough. To be honest, "not telling the whole truth" is more or less lying by omission to me. Seems like a pretty stupid thing to do with their current financial troubles, although I don't know where they stood back in 2007. Hopefully Su will do things differently.

I guess this also says something about brand loyalty, i.e., that it's stupid.

 

A class action lawsuit is the last thing that people should want.  The only people who win are the lawyers.

 

Is it? Class-action lawsuits generate a lot of attention and heat, and Nvidia won't like that.

 

We're not Nvidia's main customers. If every PC gamer were to go up in arms and never buy an Nvidia product in a ludicrous attempt to cripple Nvidia's profits, Nvidia would just laugh and carry on supporting the majority of its customers (workstation and supercomputer clients), which is more than enough for it to keep making money and stay in business.

 

Maybe not "main", but significant, I'd say.

waffle waffle waffle on and on and on


Funny, because apparently the recruiting agents at Carnegie Mellon who slave away for months weighing the various attributes and interview notes of each candidate are just as meticulous, and YET IT HAPPENED! Holy hell! These people work their asses off into the wee hours of the night, and because people at Nvidia trust each other, something incorrect is bound to slip through eventually!

 

Tech review sites get things wrong too, by copying and pasting tables and forgetting to edit one value. It happens to everyone, including Intel, AMD, Nvidia, IBM, Samsung, and even APPLE! Nvidia's reputation is trashed, and I'm sorry, but JHH might be a narcissist who wouldn't bat an eye; if you think the engineers are happy about the misprint, though, you're sorely mistaken.

I'd be more inclined to believe you if every vendor page for the 970 that I've seen since this issue surfaced didn't still flat-out say "4GB" of VRAM. It at least warrants an asterisk saying "only 3.5GB usable at any given time."

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


Good. I hope they get taken to the cleaners for this, even if I don't get my money back. Companies need to learn that lying about a product is NOT okay.

I'm with you on this one. I'm sick of companies lying to us and this should set an example that WE as consumers have the power, not them. 

