
Nvidia unlocks GSP on selected GPUs

NumLock21

Starting with Nvidia driver version 510.39.01, selected Nvidia GPUs will have their GSP (GPU System Processor) unlocked for improved system performance.
 

Quote

 

Some GPUs include a GPU System Processor (GSP) which can be used to offload GPU initialization and management tasks. This processor is driven by the firmware file /lib/firmware/nvidia/510.39.01/gsp.bin. A few select products currently use GSP by default, and more products will take advantage of GSP in future driver releases.

Offloading tasks which were traditionally performed by the driver on the CPU can improve performance due to lower latency access to GPU hardware internals.

 

 

https://www.techpowerup.com/291088/nvidia-unlocks-gpu-system-processor-gsp-for-improved-system-performance

https://download.nvidia.com/XFree86/Linux-x86_64/510.39.01/README/gsp.html
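Per the README quoted above, the firmware lives at /lib/firmware/nvidia/&lt;version&gt;/gsp.bin. A minimal Python sketch for checking whether that blob is installed on a given box (the path layout is taken from the quote; `driver_version` is whatever your installed driver reports):

```python
import os

def gsp_firmware_path(driver_version: str,
                      firmware_root: str = "/lib/firmware/nvidia") -> str:
    """Expected location of the GSP firmware blob for a given driver version,
    following the /lib/firmware/nvidia/<version>/gsp.bin layout quoted above."""
    return os.path.join(firmware_root, driver_version, "gsp.bin")

def gsp_firmware_present(driver_version: str,
                         firmware_root: str = "/lib/firmware/nvidia") -> bool:
    """True if the GSP firmware file exists on this system."""
    return os.path.isfile(gsp_firmware_path(driver_version, firmware_root))

# Example: the driver version this thread is about.
print(gsp_firmware_path("510.39.01"))  # /lib/firmware/nvidia/510.39.01/gsp.bin
```

Just a convenience check; the authoritative status is whatever `nvidia-smi -q` reports for the GSP firmware version.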

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


This affects the below products from Nvidia: 
[Image: table of affected Nvidia products]

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown


Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


Wonder if this is related to the "Hardware-accelerated GPU scheduling" setting in Windows display settings? Still, won't matter unless this feature makes its way to consumer-tier GPUs.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


59 minutes ago, porina said:

Still, won't matter unless this feature makes its way to consumer-tier GPUs.

I think it's more meant for Linux-powered servers that want to offload more traditional workloads from the CPU onto the GPU.

Since there was no official way to do it, any CPU-side work (such as running the program itself) had to stay on the CPU.

Now the entire program and its workload can run on the GPU, which would help a lot for GPU-accelerated workloads such as rendering a scene in Blender or encoding a video stream.

It could even be used on a Linux workstation to help these workloads on the client side: faster Blender viewport rendering, faster response from DaVinci Resolve's timeline.

While it would be awesome for normal users, it's mainly meant for workstation and server use. Many consumers won't notice anything has changed, since the programs they use (i.e. Chrome and Firefox) won't take advantage of it; it doesn't improve the experience for a good portion of their user base, so they aren't going to support it.

It's cool, but it's not meant for general use.

*Insert Witty Signature here*

System Config: https://au.pcpartpicker.com/list/Tncs9N

 


I'm going to assume that regular GeForce cards do not have a GSP?

Specs: Motherboard: Asus X470-PLUS TUF Gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200MHz CL16-18-18-36 2x8GB

CPU: Ryzen 9 5900X Case: Antec P8 PSU: Corsair RM850x Cooler: Antec K240 with two Noctua Industrial PPC 3000 PWM

Drives: Samsung 970 EVO Plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 Ti Black Edition


2 hours ago, williamcll said:

I'm going to assume that regular GeForce cards do not have a GSP?

Won't at all be surprised if this were the case, nVidia only cares about their latest and greatest, that is, the 'RTX' line of cards, GTX owners can suck it! But, hey, I'd be happy to be proven wrong.

Main Rig: AMD AM4 R9 5900X (12C/24T) + Tt Water 3.0 ARGB 360 AIO | Gigabyte X570 Aorus Xtreme | 2x 16GB Corsair Vengeance DDR4 3600C16 | XFX MERC 310 RX 7900 XTX | 256GB Sabrent Rocket NVMe M.2 PCIe Gen 3.0 (OS) | 4TB Lexar NM790 NVMe M.2 PCIe4x4 | 2TB TG Cardea Zero Z440 NVMe M.2 PCIe Gen4x4 | 4TB Samsung 860 EVO SATA SSD | 2TB Samsung 860 QVO SATA SSD | 6TB WD Black HDD | CoolerMaster H500M | Corsair HX1000 Platinum | Topre Type Heaven + Seenda Ergonomic W/L Vertical Mouse + 8BitDo Ultimate 2.4G | iFi Micro iDSD Black Label | Philips Fidelio B97 | C49HG90DME 49" 32:9 144Hz Freesync 2 | Omnidesk Pro 2020 48" | 64bit Win11 Pro 23H2

2nd Rig: AMD AM4 R9 3900X + TR PA 120 SE | Gigabyte X570S Aorus Elite AX | 2x 16GB Patriot Viper Elite II DDR4 4000MHz | Sapphire Nitro+ RX 6900 XT | 500GB Crucial P2 Plus NVMe M.2 PCIe Gen 4.0 (OS)2TB Adata Legend 850 NVMe M.2 PCIe Gen4x4 |  2TB Kingston NV2 NVMe M.2 PCIe Gen4x4 | 4TB Leven JS600 SATA SSD | 2TB Seagate HDD | Keychron K2 + Logitech G703 | SOLDAM XR-1 Black Knight | Enermax MAXREVO 1500 | 64bit Win11 Pro 23H2

 

 

 

 

 

 


13 minutes ago, GamerDude said:

Won't at all be surprised if this were the case, nVidia only cares about their latest and greatest, that is, the 'RTX' line of cards, GTX owners can suck it! But, hey, I'd be happy to be proven wrong.

What do you mean? RTX and GTX are both "GeForce" cards.

The cards in this news are professional Nvidia "processors" that may or may not be called GeForce and generally cost a gazillion dollars to buy = )

But damn do they look good though!

https://www.pny.com/nvidia-a40

No gamery gamey nonsense

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


1 hour ago, Mark Kaine said:

What do you mean? RTX and GTX are both "GeForce" cards.

The cards in this news are professional Nvidia "processors" that may or may not be called GeForce and generally cost a gazillion dollars to buy = )

But damn do they look good though!

https://www.pny.com/nvidia-a40

No gamery gamey nonsense

Are you being sarcastic, or do you really not know that RTX is different from GTX? Yes, they are both GeForce cards, but RTX cards (RTX 2xxx/3xxx) come with Tensor cores (that's why RTX cards can do DLSS and GTX cards are left behind), and that's what nVidia wants everyone to buy. DLSS is a nice-to-have feature for now, as cards aren't powerful enough to run many AAA games at 4K natively, maxed out, with RT (yet another feature available only on RTX cards) and still hold a good >60fps, let alone 120fps or higher for high-refresh 4K monitors. DLSS will be rendered redundant once cards, and not only flagship-level cards, can do 4K/60fps (or higher) natively.



7 hours ago, bcredeur97 said:

This affects the below products from Nvidia: 
 

Mod needs to just edit the title to "quadro gpus" so we can all not give a shit about this 'news'.

Workstation:  14700nonk || Asus Z790 ProArt Creator || MSI Gaming Trio 4090 Shunt || Crucial Pro Overclocking 32GB @ 5600 || Corsair AX1600i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


3 hours ago, GamerDude said:

Won't at all be surprised if this were the case, nVidia only cares about their latest and greatest, that is, the 'RTX' line of cards, GTX owners can suck it! But, hey, I'd be happy to be proven wrong.

The GSP has been around since Turing which includes the 16xx series, 2060 cards and beyond. It's just been dark until now. Given how conservative and rigid Quadro drivers typically are, perhaps Nvidia is wanting to test on these first. Consumer GeForce gaming cards tend to be flaky in stability as they're often pushed way harder and crash more often because of it.

 

8 hours ago, porina said:

Wonder if this is related to the "Hardware-accelerated GPU scheduling" setting in Windows display settings? Still, won't matter unless this feature makes its way to consumer-tier GPUs.

Exactly what I was thinking too when reading about this.


25 minutes ago, AnonymousGuy said:

Mod needs to just edit the title to "quadro gpus" so we can all not give a shit about this 'news'.

Not even Quadro, it's just the Datacenter/Tesla GPUs.

 

1 hour ago, GamerDude said:

Are you being sarcastic, or do you really not know that RTX is different from GTX? Yes, they are both GeForce cards, but RTX cards (RTX 2xxx/3xxx) come with Tensor cores (that's why RTX cards can do DLSS and GTX cards are left behind), and that's what nVidia wants everyone to buy.

Zero of the affected GPUs are RTX, GeForce or otherwise. They are all Datacenter GPUs. Trust me, move on; this story doesn't mean anything to you, so pay no notice.

The statement "regular GeForce cards" was in reference to the fact that these are Datacenter GPUs, not gaming/GeForce cards.


22 minutes ago, leadeater said:

this story doesn't mean anything to you, so pay no notice.

That golden A40 though... 4000+ bux, blower style iirc,  what could go wrong?! T^T

 

 



It honestly pisses me off how much misinformation and bullshit people spread these days.

I went to look at the comments on some other news sites to try and gather information, but people are just spouting bullshit. Bullshit about how Nvidia are only doing this to help miners, or how miners will love this even though, from what I've gathered, this won't help miners at all.

Then some are saying how they should bring this to GTX and RTX cards but won't because Nvidia are greedy, even though it seems like they have no idea what this feature actually is.

It's like listening to babies who cry all the time because they just want something they don't need or understand. 

 

It's always the same thing.

People spin whatever Nvidia does into something negative and "greedy".

People complaining about miners no matter what the news is.

People complaining about the product they own not supporting some new feature, regardless of whether or not it even makes sense for the old product to support it.

 

It doesn't even seem like Windows supports this feature, so I am not sure why people are bitching about Nvidia not bringing this to some gaming card.

 

 

This piece of news just makes me depressed for the state of tech journalism as well. I have gone through like 5 articles from different sites and all of them just parrot the same stuff without understanding it. Nvidia has posted like 6 sentences about this, what it does, how to enable it etc, and all articles are just repeating those 6 sentences without adding anything. 

Apparently WCCFTech wanted to add a bit more substance to the article and started writing about how to pronounce RISC-V, because they had so little to add about Nvidia's GSP.

 

It would be great if someone from these tech news publications could reach out to Nvidia and ask for some information, instead of just parroting the one or two sentences in what are essentially patch notes.


1 hour ago, LAwLz said:

It would be great if someone from these tech news publications could reach out to Nvidia and ask for some information, instead of just parroting the one or two sentences in what are essentially patch notes.

 

Quote

Despite Nvidia saying it has locked down the feature to Enterprise solutions (for now), Nvidia also announced the feature as a critical component for Max-Q laptops coming in 2022 and beyond.

 

Quote

In a new YouTube video released today, Nvidia explains how its latest version of Max-Q will maximize efficiency between the CPU and GPU. One of those features includes GSP -- but in this instance, Nvidia calls it a "command processor" instead of a GPU System Processor.

 

We believe the GSP and "command processor" are the same thing since they function in the same way by offloading low-level tasks from the CPU to the GPU in an effort to improve performance. In the case of Max-Q, Nvidia gives the example of command validation, which performs pointer verification and balance checking, and how this workload will be transitioned from the CPU to the GPU for better performance.

https://www.tomshardware.com/uk/news/nvidia-gpu-system-processor-introduction

 

Sounds like only one, or very few, actually read the information from Nvidia at all.


2 hours ago, Mark Kaine said:

That golden A40 though... 4000+ bux, blower style iirc,  what could go wrong?! T^T

They really don't get hot

 

Thu Jan 20 10:34:52 2022
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 495.29.05    Driver Version: 495.29.05    CUDA Version: 11.5     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA A40          On   | 00000000:27:00.0 Off |                    0 |
|  0%   77C    P0   201W / 300W |   2708MiB / 45634MiB |     99%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   1  NVIDIA A40          On   | 00000000:A3:00.0 Off |                    0 |
|  0%   72C    P0   200W / 300W |   2708MiB / 45634MiB |     99%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   2  NVIDIA A40          On   | 00000000:C3:00.0 Off |                    0 |
|  0%   61C    P0   138W / 300W |   2708MiB / 45634MiB |     37%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

 


 

Also you don't really see much of them once in a server heh

[Photo: A40 cards installed in a server]

Note: Old photo, 3rd card on the right is now the 3rd A40.
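If anyone wants to pull the temps and power draw out of tables like the one above instead of eyeballing them, a short Python sketch (the regex is written against the exact table format shown here; note nvidia-smi can also emit machine-readable CSV via `--query-gpu`, which is the cleaner route on a live box):

```python
import re

# Sample rows copied from the nvidia-smi output above.
SAMPLE = """\
|   0  NVIDIA A40          On   | 00000000:27:00.0 Off |                    0 |
|  0%   77C    P0   201W / 300W |   2708MiB / 45634MiB |     99%      Default |
|   1  NVIDIA A40          On   | 00000000:A3:00.0 Off |                    0 |
|  0%   72C    P0   200W / 300W |   2708MiB / 45634MiB |     99%      Default |
|   2  NVIDIA A40          On   | 00000000:C3:00.0 Off |                    0 |
|  0%   61C    P0   138W / 300W |   2708MiB / 45634MiB |     37%      Default |
"""

# Matches the fan/temp/power/memory/utilisation status row of each GPU entry.
STATUS = re.compile(
    r"\|\s*(?P<fan>\d+)%\s+(?P<temp>\d+)C\s+P\d+\s+(?P<power>\d+)W\s*/\s*(?P<cap>\d+)W"
    r"\s*\|\s*(?P<mem_used>\d+)MiB\s*/\s*(?P<mem_total>\d+)MiB\s*\|\s*(?P<util>\d+)%"
)

def parse_smi(text: str) -> list[dict]:
    """Extract one dict of integer stats per GPU status row."""
    return [{k: int(v) for k, v in m.groupdict().items()}
            for m in STATUS.finditer(text)]

gpus = parse_smi(SAMPLE)
print(max(g["temp"] for g in gpus))  # hottest card: 77
```

Handy for feeding into monitoring; on a real system you'd pipe in `nvidia-smi` output rather than the sample string.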


18 minutes ago, leadeater said:

They really don't get hot

 

Thu Jan 20 10:34:52 2022
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 495.29.05    Driver Version: 495.29.05    CUDA Version: 11.5     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA A40          On   | 00000000:27:00.0 Off |                    0 |
|  0%   77C    P0   201W / 300W |   2708MiB / 45634MiB |     99%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   1  NVIDIA A40          On   | 00000000:A3:00.0 Off |                    0 |
|  0%   72C    P0   200W / 300W |   2708MiB / 45634MiB |     99%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   2  NVIDIA A40          On   | 00000000:C3:00.0 Off |                    0 |
|  0%   61C    P0   138W / 300W |   2708MiB / 45634MiB |     37%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

Note: Old photo, 3rd card on the right is now the 3rd A40.

I can't remember which Intel product it is, but the TDP is something like 600W, where the rack cooling solution is basically an AIO because no amount of airflow over a heatsink can handle it. Maybe Arctic Sound... yeah, I think the 4-tile configuration, but the whole Arctic Sound project may have been scrapped last year. The yields were hilariously bad, like you needed an entire wafer just to get the yield for a single 4-tile package part. Or maybe it was even worse, where you needed an entire lot of wafers to find 4 fully featured tiles that could be put on a package. I forget, but the punchline was we were shaking our heads when this was being described to us.



3 minutes ago, AnonymousGuy said:

I can't remember which Intel product it is, but the TDP is something like 600W, where the rack cooling solution is basically an AIO because no amount of airflow over a heatsink can handle it. Maybe Arctic Sound... yeah, I think the 4-tile configuration, but the whole Arctic Sound project may have been scrapped last year. The yields were hilariously bad, like you needed an entire wafer just to get the yield for a single 4-tile package part. Or maybe it was even worse, where you needed an entire lot of wafers to find 4 fully featured tiles that could be put on a package. I forget, but the punchline was we were shaking our heads when this was being described to us.

The highest-TDP SKUs of the A100 are already like that now too, liquid cooling only. The same goes for certain combinations of CPUs and GPUs: to use the AMD 7763 with GPUs you must choose the liquid cooling option, otherwise the 7713 is the highest CPU supported.

Most things are going liquid-only at the very high end now, and the coming generations of CPUs and GPUs even more so, as they have even higher TDPs.


29 minutes ago, leadeater said:

The highest-TDP SKUs of the A100 are already like that now too, liquid cooling only. The same goes for certain combinations of CPUs and GPUs: to use the AMD 7763 with GPUs you must choose the liquid cooling option, otherwise the 7713 is the highest CPU supported.

Most things are going liquid-only at the very high end now, and the coming generations of CPUs and GPUs even more so, as they have even higher TDPs.

Lovely, that means they're more likely to fail. More moving parts = more failures. At least data center systems are typically built with hot-swappable/field-replaceable parts, unlike laptops.

 

12 hours ago, NumLock21 said:

Starting with Nvidia driver version 510.39.01, selected Nvidia GPUs will have their GSP (GPU System Processor) unlocked for improved system performance.
 

 

https://www.techpowerup.com/291088/nvidia-unlocks-gpu-system-processor-gsp-for-improved-system-performance

https://download.nvidia.com/XFree86/Linux-x86_64/510.39.01/README/gsp.html

My feeling here is that the GSP is just nVidia's end-run around kernel drivers, and more binary-blob commonality between Windows and Linux.

 

I can't imagine the GSP was intended to do anything else

Quote

Supported Features

When using GSP firmware, the driver will not yet correctly support display-related features or power management related features. These features will be added to GSP firmware in future driver releases.

Looks like it's still not ready for production use either, but the wording "display-related features or power management related features" suggests it's only on the data center products right now because those features typically aren't needed when the card is run full-tilt anyway.
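For anyone who wants to experiment: the README linked in the OP documents a module parameter, NVreg_EnableGpuFirmware, for opting in or out of GSP firmware on GPUs that support it. A throwaway sketch of rendering the modprobe.d line (parameter name per that README; treat it as version-specific, and the file path in the comment is just a conventional example):

```python
def gsp_modprobe_line(enable: bool) -> str:
    """Render the modprobe.d option line the 510-series README describes for
    turning GSP firmware offload on (1) or off (0)."""
    return f"options nvidia NVreg_EnableGpuFirmware={int(enable)}"

# Would typically be written to something like /etc/modprobe.d/nvidia-gsp.conf
print(gsp_modprobe_line(True))  # options nvidia NVreg_EnableGpuFirmware=1
```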

 


7 minutes ago, Kisai said:

Lovely, that means they're more likely to fail. More moving parts = more failures. At least data center systems are typically built with hot-swappable/field-replaceable parts, unlike laptops.

I was commenting on datacenter GPUs; TDPs for laptops and desktops aren't going to change, because they are already at the maximum feasible air-cooling range. The most I'd realistically expect is 400W on an OC AIB card with increased TDP/TGP.

 

Datacenter GPUs, air or liquid, are "passive", in that there are no active cooling parts on the cards at all. For liquid, the pump is either in the rack serving that rack, in an end-of-row rack serving all racks in the row, or central facility pumps serving all racks. The last one is what we have, however we are yet to do any direct-to-chip liquid cooling; we just do hot-aisle containment, push the hot air through heat exchangers/rads, and the water takes the heat up to heat exchangers/rads on the roof. The hot aisle can easily be around 40C; makes for a nice place to go warm up in winter 🙃


1 hour ago, leadeater said:

They really don't get hot

 

Thu Jan 20 10:34:52 2022
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 495.29.05    Driver Version: 495.29.05    CUDA Version: 11.5     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA A40          On   | 00000000:27:00.0 Off |                    0 |
|  0%   77C    P0   201W / 300W |   2708MiB / 45634MiB |     99%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   1  NVIDIA A40          On   | 00000000:A3:00.0 Off |                    0 |
|  0%   72C    P0   200W / 300W |   2708MiB / 45634MiB |     99%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   2  NVIDIA A40          On   | 00000000:C3:00.0 Off |                    0 |
|  0%   61C    P0   138W / 300W |   2708MiB / 45634MiB |     37%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

 


 

Also you don't really see much of them once in a server heh

[Photo: A40 cards installed in a server]

Note: Old photo, 3rd card on the right is now the 3rd A40.

Interesting, I didn't know the A series could be set up in a set of 3. I thought they had to be either alone or in an even number.


43 minutes ago, Franck said:

Interesting, I didn't know the A series could be set up in a set of 3. I thought they had to be either alone or in an even number.

Conversely, I thought @leadeater had at least 10 of them in SLI (NVLink)! 🙃

But seriously, nice lol, the stickers sure make 'em look less premium though haha. 70-80C is what I expected for a blower; I just think it doesn't fit as nicely in a gaming PC, and well, temps are higher, my 3070 is usually around 60 while playing games. : )



1 hour ago, Franck said:

Interesting, I didn't know the A series could be set up in a set of 3. I thought they had to be either alone or in an even number.

You can run them in any number, as long as you have PCIe lanes available.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


10 hours ago, GamerDude said:

Won't at all be surprised if this were the case, nVidia only cares about their latest and greatest, that is, the 'RTX' line of cards, GTX owners can suck it! But, hey, I'd be happy to be proven wrong.

You do know the latest RTX cards are also in their GeForce product line?

 

15 hours ago, bcredeur97 said:

This affects the below products from Nvidia: 
[Image: table of affected Nvidia products]

So basically not a single GPU that would actually be relevant to this community.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.

