NVIDIA forbids deployment of GeForce drivers in data centers...

Not long ago, talking to @leadeater in an AMD thread, I argued that Nvidia might be able to keep AMD on the sidelines of compute workloads with its usual tactics, since it hadn't generated as much bad faith among companies as, say, Microsoft did in the past.

 

Looks like I stand corrected: with moves like this they might just turn that around too and become a datacenter enemy. Not through this alone, of course, but they're really miscalculating if they think nobody will eventually notice: "Hey, these AMD cards can do compute even better, and we can get the cheaper ones, Bob... could we look into that?"



On 27/12/2017 at 7:52 AM, Bit_Guardian said:

He needs guaranteed SSH and SFTP access.

Fine, simply don't put his user in sudoers :)
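For what it's worth, a minimal sketch of what that looks like on a Debian/Ubuntu-style box (`devuser` is a placeholder name, not from the thread): the account simply never gets added to the `sudo` group, and sshd can optionally pin it to SFTP only.

```shell
# Create the account; just don't add it to the 'sudo' group
# ('wheel' on RHEL-style systems) and it has no root rights.
adduser devuser

# Verify it belongs to no sudo-capable group:
id -nG devuser

# Optional: restrict the account to SFTP only, in /etc/ssh/sshd_config:
#   Match User devuser
#       ForceCommand internal-sftp
# Then reload sshd (the service may be named 'ssh' on Ubuntu):
systemctl reload sshd
```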



How does this impact remote desktop/OTA gaming services?



On 12/26/2017 at 5:05 PM, leadeater said:

The issue is that for a very large variety of workloads, GeForce cards are just as fast as Quadros or Teslas, so many people have opted for those cards, saving huge amounts of money and likely getting better performance by being able to buy a much faster GPU.

 

Nvidia didn't like this, so the solution, rather than making more affordable products, is to disallow the use of GeForce cards in servers. So yeah, plenty of justification to have a bitch, in my opinion.

 

So people that run datacenters are running unvalidated, non-24/7-ready hardware? I don't believe that in the slightest lol.



33 minutes ago, Lays said:

So people that run datacenters are running unvalidated, non-24/7-ready hardware? I don't believe that in the slightest lol.

Some do, since GPU reliability isn't any different: the exact same fab and silicon supply is being used. They will also run cooler in a server chassis and get better, cleaner power than in a desktop.

 

It's not the greatest idea, but it doesn't have much to do with reliability or some mythical 24/7 validation. Quadros and Teslas have their power connectors on the edge rather than the top, so those actually work in 2U servers; you need 4U for GeForce cards because of the top-mounted power connectors.

 

Beyond that, unless you need the specific drivers and application support for something like VMware Horizon, why pay more for less? It makes no sense either.
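The "why pay more for less" point is easy to see as back-of-the-envelope arithmetic. The figures below are approximate launch MSRPs and peak FP32 numbers used purely for illustration, not authoritative pricing:

```python
# Rough price-per-TFLOP comparison across Nvidia product tiers.
# Prices and peak-FP32 figures are approximate launch numbers.
cards = {
    "GeForce GTX 1080 Ti": {"tflops_fp32": 11.3, "usd": 699},
    "Quadro P6000":        {"tflops_fp32": 12.0, "usd": 4999},
    "Tesla P100 (PCIe)":   {"tflops_fp32": 9.3,  "usd": 5899},
}

for name, c in cards.items():
    # Dollars per TFLOP of peak single-precision throughput
    print(f"{name}: ${c['usd'] / c['tflops_fp32']:.0f} per FP32 TFLOP")
```

On these assumed numbers the GeForce card costs several times less per unit of FP32 throughput, which is exactly the gap the EULA change closes off.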

 

It's the same story for Intel CPUs, a Xeon is no more reliable than any of the other Intel CPUs.

 

Also, if people weren't doing it, why is Nvidia changing the terms for GeForce cards? ;)

 

Edit:

[Spoiler images: geforce.png, maxresdefault.jpg, IMG_3064-geforce.jpg, KrakenBuild-23.jpg]

 


Sounds like their driver is so intrusive in its "data collection" that using it in a server environment could potentially send server data back to Nvidia that you wouldn't want leaving your network. This is most likely their way of saying "Don't use this in a data center (because our driver will rip apart your machine and network to collect as much analytical data as possible and send it back to us without your knowledge)."



So is Nvidia going to sue me for running a Minecraft server in my basement on a computer that has a GTX 280?


16 minutes ago, Name Taken said:

Think of all the poor miners trying to make bank! Well... they probably don't care.

Isn't mining explicitly allowed?



On 27/12/2017 at 9:35 AM, Theguywhobea said:

I think they just don't want people using consumer-grade hardware in industrial scenarios like a data center. Their issue is that if they don't explicitly forbid consumer-grade cards in data centers, they'll end up fielding all kinds of support calls where the customer believes the problem is Nvidia's fault, when it's really their own fault for using the wrong hardware.

 

To give a bit of an example: you wouldn't slap a Netgear switch in a panel in a boiler room at a school. Consumer-grade networking gear isn't rated to operate at the ambient temperatures you might see in a metal box in a boiler room, but you can bet the system integrator that installed that switch is going to have an unhappy customer when they come in the next morning to a cold building because their controllers aren't talking anymore.

 

Nvidia doesn't validate their consumer-grade cards to run flat out 24/7, and their drivers aren't validated for high accuracy in computations. The cost premium on Quadro cards comes from the extra validation Nvidia does on the cards and the drivers.

My old job placed a Netgear switch inside a shed at a school, where summer ambient inside the shed easily gets over 50°C. STRAYA



3 hours ago, Name Taken said:

Think of all the poor miners trying to make bank! Well... they probably don't care.

Nvidia allows the drivers to be used for cryptocurrency mining in data centres.



2 hours ago, Name Taken said:

Statistically, Linux-based operating systems are the most deployed in data centres, with Ubuntu 16.04 LTS specifically having the largest share. Ubuntu 16.04 LTS's repo includes the Nvidia 375 LTS driver, so how does an EULA update affect old Linux drivers?

What's that got to do with Nvidia allowing their use for crypto?


