
Some users are mad because Nvidia's new driver (347.09) blocked overclocking for the GTX 900M series

ahhming

If it doesn't overheat, I fail to see the issue. Maxwell TDPs are generally good too, and some users may want a temporary performance boost, especially when plugged in.

 

You don't actually gain much at all, though, and you risk damaging the unit. Unless something's changed with overclocking since the 600M series, I still don't see any benefit from it.


I know lol, no hard feelings, just making a statement. What the hell do you do to those hard drives though?!

Nothing that difficult; I was using them as backup HDDs. All that happened to them was that once a week I connected the SATA power cables to them to transfer a copy of all of my documents.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I never saw a point in overclocking in laptops anyways. Kinda defeats the purpose of a laptop.

It's not just overclocking. Those beefy laptops with desktop CPUs shipped with higher GPU clock speeds out of the box, and those have now been restored to default clock speeds. Those are the customers who are upset.


It's not just overclocking. Those beefy laptops with desktop CPUs and higher clock speeds out of the box have been restored to default clock speeds. Those are the customers who are upset.

 

Your sentence is hard to comprehend, so I'm assuming you're saying that NVIDIA has somehow released a driver that made a laptop with an out-of-the-box overclocked INTEL CPU go back to default speeds?


It's not just overclocking. Those beefy laptops with desktop CPUs and higher clock speeds out of the box have been restored to default clock speeds. Those are the customers who are upset.

 

Better bold and clarify that in the OP then, so it gets seen.

The Internet is the first thing that humanity has built that humanity doesn't understand, the largest experiment in anarchy that we have ever had.


It's not just overclocking. Those beefy laptops with desktop CPUs and higher clock speeds out of the box have been restored to default clock speeds. Those are the customers who are upset.

This has nothing to do with the CPU. Only the GPUs are reverted, and honestly they should be. Cooling a powerful GPU in a laptop isn't an easy feat, though some laptops do much better than others. Nvidia probably looked at the failure rates per clock speed (GeForce Experience sends anonymous system info) and said no more. I don't blame them.
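
If you want to see what the driver is actually letting the card run at, here's a rough Python sketch using the pynvml bindings (purely illustrative, assuming pynvml is installed; not every mobile GPU reports a default application clock):

# Rough sketch: compare the current graphics clock against the driver's
# default/max clocks to spot a reverted factory overclock.
# Assumes the pynvml package is installed (pip install pynvml).
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):
        name = name.decode()

    current = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    maximum = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    print(f"{name}: current {current} MHz, max {maximum} MHz")

    try:
        default = pynvml.nvmlDeviceGetDefaultApplicationsClock(
            handle, pynvml.NVML_CLOCK_GRAPHICS)
        print(f"Default application clock: {default} MHz")
    except pynvml.NVMLError:
        print("Default application clock not reported on this GPU")
finally:
    pynvml.nvmlShutdown()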

The projects never end in my line of work.

CPU: Dual Xeon E5-2650v2 || GPU: Dual Quadro K5000 || Motherboard: Asus Z9PE-D8 || RAM: 64GB Corsair Vengeance || Monitors: Dual LG 34UM95, NEC MultiSync EA244UHD || Storage: Dual Samsung 850 Pro 256GB in Raid 0, 6x WD Re 4TB in Raid 1 || Sound: Xonar Essense STX (Mainly for Troubleshooting and listening test) || PSU: Corsair Ax1500i

CPU: Core i7 5820k @ 4.7GHz || GPU: Dual Titan X || Motherboard: Asus X99 Deluxe || RAM: 32GB Crucial Ballistix Sport || Monitors: MX299Q, 29UB65, LG 34UM95 || Storage: Dual Samsung 850 EVO 1 TB in Raid 0, Samsung 850 EVO 250GB, 2TB Toshiba scratch disk, 3TB Seagate Barracuda || PSU: EVGA 1000w PS Platinum


Let's not forget that temperature readings don't translate directly into actual heat produced. The 980M may show lower temperatures, but it can still dump a significant amount of heat into the chassis and damage other components in the system. That's something that can fall back on NVIDIA.
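
For what it's worth, if you want a rough idea of the actual heat a card is dumping into the chassis (as opposed to what the core temperature reads), the board power draw is the better number to watch. A quick illustrative sketch, again assuming the pynvml bindings are installed:

# Rough sketch: board power draw in watts approximates the heat dumped into
# the chassis, regardless of what the core temperature sensor reads.
# Assumes pynvml is installed; the power limit query may be unsupported.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # reported in mW

    print(f"Core temperature: {temp_c} C")
    print(f"Board power draw: {power_w:.1f} W (approximate heat output)")

    try:
        limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0
        print(f"Enforced power limit: {limit_w:.1f} W")
    except pynvml.NVMLError:
        print("Power limit not reported on this GPU")
finally:
    pynvml.nvmlShutdown()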

if you have to insist you think for yourself, i'm not going to believe you.


Your sentence is hard to comprehend, so I'm assuming you're saying that NVIDIA has somehow released a driver that made a laptop with an out-of-the-box overclocked INTEL CPU go back to default speeds?

What I am trying to say is that those laptops that shipped with higher clock speeds have been restored to Nvidia's default clock speeds. This will likely affect those who have custom laptops with desktop CPUs.

What I am trying to say is that those laptops that shipped with higher clock speeds have been restored to Nvidia's default clock speeds.

If people want to overclock, do it on a desktop. That's the same reason 99+% of laptops have locked multipliers. I know there are desktop chips in some, but those generally have heat coming out the wazoo. On a desktop you can upgrade the cooling; on a laptop the best you've got is a cooling pad. It was really a smart move on their end, honestly.

The projects never end in my line of work.

CPU: Dual Xeon E5-2650v2 || GPU: Dual Quadro K5000 || Motherboard: Asus Z9PE-D8 || RAM: 64GB Corsair Vengeance || Monitors: Dual LG 34UM95, NEC MultiSync EA244UHD || Storage: Dual Samsung 850 Pro 256GB in Raid 0, 6x WD Re 4TB in Raid 1 || Sound: Xonar Essense STX (Mainly for Troubleshooting and listening test) || PSU: Corsair Ax1500i

CPU: Core i7 5820k @ 4.7GHz || GPU: Dual Titan X || Motherboard: Asus X99 Deluxe || RAM: 32GB Crucial Ballistix Sport || Monitors: MX299Q, 29UB65, LG 34UM95 || Storage: Dual Samsung 850 EVO 1 TB in Raid 0, Samsung 850 EVO 250GB, 2TB Toshiba scratch disk, 3TB Seagate Barracuda || PSU: EVGA 1000w PS Platinum


I don't know about you guys, but laptops and the like can make fun "Science" experiments.

 

Seriously, ever hook up a 212 to a craptop? It's great fun.

 

I'd pull out the liquid nitrogen and try to get a record for the best OC on a 9XXM.

 

 

I understand Nvidia's view on this, but I feel it should ultimately be up to the user to decide. After all, they paid for it.

"Normandy" i7 4790K - GTX 970 - Phantom 410 (Gun metal) - Z97 Extreme4 (asrock) - 128GB Crucial SSD - 1TB WD HDD - H60 Refurb. - 7 case fans | G710+ Keyboard, G230 Headset, Acer GN246HL Monitor.

Quick thoughts on system: I7 is extremely quick and I'm glad I spent the extra for hyper-threading. I regret my decision to get the GTX 970, it has horrible coil whine. There isn't any excuse for this terrible whine I and others are having. I HIGHLY recommend a 144hz monitor. Future Improvements/upgrades: Rubber fan mounts, basic speakers, more ram (for a total of 16gb), replace GPU.

144hz is love. 144hz is life. I like to submit unfinished posts then do about 20 edits. I like the Night Theme too.

If people want to overclock, do it on a desktop. That's the same reason 99+% of laptops have locked multipliers. I know there are desktop chips in some, but those generally have heat coming out the wazoo. On a desktop you can upgrade the cooling; on a laptop the best you've got is a cooling pad. It was really a smart move on their end, honestly.

I'm gonna take it you misread/don't understand what he meant.

He probably means that laptops that shipped overclocked by DEFAULT by OEMs are now having those overclocks turned down by Nvidia. These are not USER overclocks but clocks set by OEM partners (or at least that's how I'm interpreting it).


I don't know about you guys, but laptops and the like can make fun "Science" experiments.

 

Seriously, ever hook up a 212 to a craptop? It's great fun.

 

I'd pull out the liquid nitrogen and try to get a record for the best OC on a 9XXM.

 

 

I understand Nvidia's view on this, but I feel it should ultimately be up to the user to decide. After all, they paid for it.

And they also pay for any replacements, so it would be good for Nvidia (voided warranty).

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I'm gonna take it you misread/don't understand what he meant.

He probably means that laptops that shipped overclocked by DEFAULT by OEMs are now having those overclocks turned down by Nvidia. These are not USER overclocks but clocks set by OEM partners (or at least that's how I'm interpreting it).

I did read what was said about the OEM factory overclock. That was not Nvidia's doing, though. They don't have to support the vendor's modification, and obviously aren't.

The projects never end in my line of work.

CPU: Dual Xeon E5-2650v2 || GPU: Dual Quadro K5000 || Motherboard: Asus Z9PE-D8 || RAM: 64GB Corsair Vengeance || Monitors: Dual LG 34UM95, NEC MultiSync EA244UHD || Storage: Dual Samsung 850 Pro 256GB in Raid 0, 6x WD Re 4TB in Raid 1 || Sound: Xonar Essense STX (Mainly for Troubleshooting and listening test) || PSU: Corsair Ax1500i

CPU: Core i7 5820k @ 4.7GHz || GPU: Dual Titan X || Motherboard: Asus X99 Deluxe || RAM: 32GB Crucial Ballistix Sport || Monitors: MX299Q, 29UB65, LG 34UM95 || Storage: Dual Samsung 850 EVO 1 TB in Raid 0, Samsung 850 EVO 250GB, 2TB Toshiba scratch disk, 3TB Seagate Barracuda || PSU: EVGA 1000w PS Platinum


I did read what was said about the OEM factory overclock. That was not Nvidia's doing, though. They don't have to support the vendor's modification, and obviously aren't.

 

So just a dick move in general?

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


So just a dick move in general?

No, a smart move. I'd be willing to bet that they didn't approve the OEM changes, but the OEMs did it anyway.

As I said before, performance numbers (including clocks) are sent anonymously back to Nvidia through GeForce Experience. If they see OC'd cards failing at a much higher rate, of course they'd nix that. It costs them money to replace them under warranty. If I were running the show over there, I would do the same.

http://www.geforce.com/geforce-experience/faq

Read under "What data does GeForce Experience send to NVIDIA?"

Edited by Prastupok

The projects never end in my line of work.

CPU: Dual Xeon E5-2650v2 || GPU: Dual Quadro K5000 || Motherboard: Asus Z9PE-D8 || RAM: 64GB Corsair Vengeance || Monitors: Dual LG 34UM95, NEC MultiSync EA244UHD || Storage: Dual Samsung 850 Pro 256GB in Raid 0, 6x WD Re 4TB in Raid 1 || Sound: Xonar Essense STX (Mainly for Troubleshooting and listening test) || PSU: Corsair Ax1500i

CPU: Core i7 5820k @ 4.7GHz || GPU: Dual Titan X || Motherboard: Asus X99 Deluxe || RAM: 32GB Crucial Ballistix Sport || Monitors: MX299Q, 29UB65, LG 34UM95 || Storage: Dual Samsung 850 EVO 1 TB in Raid 0, Samsung 850 EVO 250GB, 2TB Toshiba scratch disk, 3TB Seagate Barracuda || PSU: EVGA 1000w PS Platinum


I did read what was said about the OEM factory overclock. That was not Nvidia's doing, though. They don't have to support the vendor's modification, and obviously aren't.

...

I don't know, the news about this kinda left a really bad taste in my mouth. Nvidia has been doing some pretty weird stuff recently.


Just to be sure, this is only affecting the GTX 900M series?

 

(Confused, as my Quadro 2000M running 342.52 still overclocks like a beast: a 300 MHz core OC (755 MHz), maxed slider in MSI Afterburner, stable. But it is in no way a 900-series card.)
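
For anyone wanting to compare behaviour across driver versions, a rough pynvml sketch like this prints the installed driver and each GPU's current graphics clock (illustrative only, assuming pynvml is installed):

# Rough sketch: print the installed driver version and each GPU's current
# graphics clock, handy for before/after comparisons across driver updates.
# Assumes the pynvml package is installed (pip install pynvml).
import pynvml

pynvml.nvmlInit()
try:
    driver = pynvml.nvmlSystemGetDriverVersion()
    if isinstance(driver, bytes):
        driver = driver.decode()
    print(f"Driver version: {driver}")

    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):
            name = name.decode()
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        print(f"GPU {i}: {name}, current graphics clock {clock} MHz")
finally:
    pynvml.nvmlShutdown()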

Spoiler

Desktop <dead?> 

Spoiler

P8P67-WS/Z77 Extreme4/H61DE-S3. 4x4 Samsung 1600MHz/1x8GB Gskill 1866MHzC9. 750W OCZ ZT/750w Corsair CX. GTX480/Sapphire HD7950 1.05GHz (OC). Adata SP600 256GB x2/SSG 830 128GB/1TB Hatachi Deskstar/3TB Seagate. Windows XP/7Pro, Windows 10 on Test drive. FreeBSD and Fedora on liveboot USB3 drives. 

 

Spoiler

Laptop <Works Beyond Spec>

Spoiler

HP-DM3. Pentium U5400. 2x4GB DDR3 1600MHz (Samsung iirc). Intel HD. 512GB SSD. 8TB USB drive (Western Digital). Coil Wine!!!!!! (Is that a spec?). 

 

 


Just to be sure, this is only affecting the GTX 900M series?

(Confused, as my Quadro 2000M running 342.52 still overclocks like a beast: a 300 MHz core OC (755 MHz), maxed slider in MSI Afterburner, stable. But it is in no way a 900-series card.)

Because most older GPUs aren't under warranty anymore. They aren't responsible after x amount of time.

People seem to forget they are a business.

Edited by Prastupok

The projects never end in my line of work.

CPU: Dual Xeon E5-2650v2 || GPU: Dual Quadro K5000 || Motherboard: Asus Z9PE-D8 || RAM: 64GB Corsair Vengeance || Monitors: Dual LG 34UM95, NEC MultiSync EA244UHD || Storage: Dual Samsung 850 Pro 256GB in Raid 0, 6x WD Re 4TB in Raid 1 || Sound: Xonar Essense STX (Mainly for Troubleshooting and listening test) || PSU: Corsair Ax1500i

CPU: Core i7 5820k @ 4.7GHz || GPU: Dual Titan X || Motherboard: Asus X99 Deluxe || RAM: 32GB Crucial Ballistix Sport || Monitors: MX299Q, 29UB65, LG 34UM95 || Storage: Dual Samsung 850 EVO 1 TB in Raid 0, Samsung 850 EVO 250GB, 2TB Toshiba scratch disk, 3TB Seagate Barracuda || PSU: EVGA 1000w PS Platinum


Because most older GPUs aren't under warranty anymore. They aren't responsible after x amount of time.

People seem to forget they are a business.

Nvidia is not responsible for the warranty anyway; that goes through the laptop manufacturer. (Also, as a driver block, it should affect all cards or none, but I could see them removing the feature from Certified drivers (assuming that 342.52 is even based on the newest consumer release).)

Spoiler

Desktop <dead?> 

Spoiler

P8P67-WS/Z77 Extreme4/H61DE-S3. 4x4 Samsung 1600MHz/1x8GB Gskill 1866MHzC9. 750W OCZ ZT/750w Corsair CX. GTX480/Sapphire HD7950 1.05GHz (OC). Adata SP600 256GB x2/SSG 830 128GB/1TB Hatachi Deskstar/3TB Seagate. Windows XP/7Pro, Windows 10 on Test drive. FreeBSD and Fedora on liveboot USB3 drives. 

 

Spoiler

Laptop <Works Beyond Spec>

Spoiler

HP-DM3. Pentium U5400. 2x4GB DDR3 1600MHz (Samsung iirc). Intel HD. 512GB SSD. 8TB USB drive (Western Digital). Coil Wine!!!!!! (Is that a spec?). 

 

 


Nvidia is not responsible for the warranty anyway; that goes through the laptop manufacturer. (Also, as a driver block, it should affect all cards or none, but I could see them removing the feature from Certified drivers (assuming that 342.52 is even based on the newest consumer release).)

If the GPU fails, who do you think the manufacturer calls? They certainly don't take the loss. It's Nvidia's part; they are responsible for it.

It is a smart move on their part. Mobile GPUs are what they are, and as I said in my first post, I'd rather sacrifice a 10% performance gain for 40% longer life. It's foolish not to.

Edited by Prastupok

The projects never end in my line of work.

CPU: Dual Xeon E5-2650v2 || GPU: Dual Quadro K5000 || Motherboard: Asus Z9PE-D8 || RAM: 64GB Corsair Vengeance || Monitors: Dual LG 34UM95, NEC MultiSync EA244UHD || Storage: Dual Samsung 850 Pro 256GB in Raid 0, 6x WD Re 4TB in Raid 1 || Sound: Xonar Essense STX (Mainly for Troubleshooting and listening test) || PSU: Corsair Ax1500i

CPU: Core i7 5820k @ 4.7GHz || GPU: Dual Titan X || Motherboard: Asus X99 Deluxe || RAM: 32GB Crucial Ballistix Sport || Monitors: MX299Q, 29UB65, LG 34UM95 || Storage: Dual Samsung 850 EVO 1 TB in Raid 0, Samsung 850 EVO 250GB, 2TB Toshiba scratch disk, 3TB Seagate Barracuda || PSU: EVGA 1000w PS Platinum


If the GPU fails, who do you think the manufacturer calls? They certainly don't take the loss. It's Nvidia's part; they are responsible for it.

There are very few things that HP, for example (the manufacturer that installed this Quadro), could return to Nvidia for. If it is a heating issue (and at 55°C at full load while OC'd, there would never be an issue), it is up to HP unless the GPU malfunctioned and caused the heat.

It is no different than if you buy a Strix card and it dies from heat; it would not be returned to Nvidia. Asus would eat the cost at that point.

 

 

But then again, without the contract between Nvidia and HP (etc.) for reference, all we can do is guess at who eats the bill for bad laptop designs (clue: it is probably not HP, on the workstation side at least).

Spoiler

Desktop <dead?> 

Spoiler

P8P67-WS/Z77 Extreme4/H61DE-S3. 4x4 Samsung 1600MHz/1x8GB Gskill 1866MHzC9. 750W OCZ ZT/750w Corsair CX. GTX480/Sapphire HD7950 1.05GHz (OC). Adata SP600 256GB x2/SSG 830 128GB/1TB Hatachi Deskstar/3TB Seagate. Windows XP/7Pro, Windows 10 on Test drive. FreeBSD and Fedora on liveboot USB3 drives. 

 

Spoiler

Laptop <Works Beyond Spec>

Spoiler

HP-DM3. Pentium U5400. 2x4GB DDR3 1600MHz (Samsung iirc). Intel HD. 512GB SSD. 8TB USB drive (Western Digital). Coil Wine!!!!!! (Is that a spec?). 

 

 


There are very few things that HP, for example (the manufacturer that installed this Quadro), could return to Nvidia for. If it is a heating issue (and at 55°C at full load while OC'd, there would never be an issue), it is up to HP unless the GPU malfunctioned and caused the heat.

It is no different than if you buy a Strix card and it dies from heat; it would not be returned to Nvidia. Asus would eat the cost at that point.

But then again, without the contract between Nvidia and HP (etc.) for reference, all we can do is guess at who eats the bill for bad laptop designs (clue: it is probably not HP, on the workstation side at least).

I had an Nvidia GPU blow up on me, manufactured by EVGA if I remember right. They had me contact Nvidia, who then issued me a replacement. Too bad it took out the rest of my workstation too, but luckily nothing else burned. I keep my desktop systems on a concrete base, in a room with overhead sprinklers.

But in most cases, if the actual card dies and it's not the fault of the card's cooling solution, it seems to fall back to them. I can only imagine it's the same with laptops.

The projects never end in my line of work.

CPU: Dual Xeon E5-2650v2 || GPU: Dual Quadro K5000 || Motherboard: Asus Z9PE-D8 || RAM: 64GB Corsair Vengeance || Monitors: Dual LG 34UM95, NEC MultiSync EA244UHD || Storage: Dual Samsung 850 Pro 256GB in Raid 0, 6x WD Re 4TB in Raid 1 || Sound: Xonar Essense STX (Mainly for Troubleshooting and listening test) || PSU: Corsair Ax1500i

CPU: Core i7 5820k @ 4.7GHz || GPU: Dual Titan X || Motherboard: Asus X99 Deluxe || RAM: 32GB Crucial Ballistix Sport || Monitors: MX299Q, 29UB65, LG 34UM95 || Storage: Dual Samsung 850 EVO 1 TB in Raid 0, Samsung 850 EVO 250GB, 2TB Toshiba scratch disk, 3TB Seagate Barracuda || PSU: EVGA 1000w PS Platinum



Curious, as the only experiences I have with failing Nvidia GPUs were both on Dell workstations, and in both cases (a D630 that failed with artifacts even in the BIOS, and an M90 where Dell issued the wrong vBIOS in the support case) Dell handled the RMAs. That said, it was years ago, and things may have changed, or it may have been handled differently on the Quadro line versus GeForce.

Spoiler

Desktop <dead?> 

Spoiler

P8P67-WS/Z77 Extreme4/H61DE-S3. 4x4 Samsung 1600MHz/1x8GB Gskill 1866MHzC9. 750W OCZ ZT/750w Corsair CX. GTX480/Sapphire HD7950 1.05GHz (OC). Adata SP600 256GB x2/SSG 830 128GB/1TB Hatachi Deskstar/3TB Seagate. Windows XP/7Pro, Windows 10 on Test drive. FreeBSD and Fedora on liveboot USB3 drives. 

 

Spoiler

Laptop <Works Beyond Spec>

Spoiler

HP-DM3. Pentium U5400. 2x4GB DDR3 1600MHz (Samsung iirc). Intel HD. 512GB SSD. 8TB USB drive (Western Digital). Coil Wine!!!!!! (Is that a spec?). 

 

 


No, a smart move. I'd be willing to bet that they didn't approve the OEM changes, but the OEMs did it anyway.

As I said before, performance numbers (including clocks) are sent anonymously back to Nvidia through GeForce Experience. If they see OC'd cards failing at a much higher rate, of course they'd nix that. It costs them money to replace them under warranty. If I were running the show over there, I would do the same.

http://www.geforce.com/geforce-experience/faq

Read under "What data does GeForce Experience send to NVIDIA?"

 

Why would it cost Nvidia money when it's the OEMs doing the overclocks and carrying the burden of warranty?

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


Sheldon_King,

I think it may have been a change over time. This happened 3 years ago, but it was a Quadro that burst into flames.

I'm still pissed to this day; it took out a two-month-old workstation that I spent over 5 grand on. Sigh. But shit happens, right?

I'm merely saying that from a business perspective, it does make sense. I've never OC'd a laptop GPU, for the simple matter of heat output.

Khvarrioiren, did you read the conversation we've been having at all?

Edited by Prastupok

The projects never end in my line of work.

CPU: Dual Xeon E5-2650v2 || GPU: Dual Quadro K5000 || Motherboard: Asus Z9PE-D8 || RAM: 64GB Corsair Vengeance || Monitors: Dual LG 34UM95, NEC MultiSync EA244UHD || Storage: Dual Samsung 850 Pro 256GB in Raid 0, 6x WD Re 4TB in Raid 1 || Sound: Xonar Essense STX (Mainly for Troubleshooting and listening test) || PSU: Corsair Ax1500i

CPU: Core i7 5820k @ 4.7GHz || GPU: Dual Titan X || Motherboard: Asus X99 Deluxe || RAM: 32GB Crucial Ballistix Sport || Monitors: MX299Q, 29UB65, LG 34UM95 || Storage: Dual Samsung 850 EVO 1 TB in Raid 0, Samsung 850 EVO 250GB, 2TB Toshiba scratch disk, 3TB Seagate Barracuda || PSU: EVGA 1000w PS Platinum

