
GPU socket instead of PCI-E slots

ToneStar

I kind of wonder whether AMD would have a better chance if they made a GPU socket with VRAM slots, with some RAM embedded on the GPU chip itself depending on the model. Seems like you could make more powerful and cheaper GPUs that way, because you would not have to pay for a cooler, RAM, and a board every time you bought a GPU.

 

Take the Radeon VII for example: say you had a Vega 64 and were upgrading to a Radeon VII that sat in a GPU socket. You could keep 4-8 GB of your HBM2 and the AIO or air cooler you were already using, and just buy the chip and maybe 4-8 GB more of VRAM. Seems like it would be a lot cheaper that way, and a way they could really compete with Nvidia a lot better. They could still make normal GPUs too, and just have the board partners make them.

 

This would be a lot harder for Nvidia to do, because they would likely have to work with Intel, AMD, and the motherboard manufacturers to make something like this, whereas it would be a lot easier for AMD since they already have their own CPUs and motherboard platforms.


People have brought this up a lot in the past. The memory in your PC is far slower than VRAM, so using the same memory on a GPU would hugely impact performance. It would have to be a different type of memory, which would probably be expensive due to lack of demand.

There are other concerns as well, like compatibility, physically fitting the modules, etc.
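To put rough numbers on that gap (a back-of-envelope sketch using spec-sheet figures from the era, not benchmarks): peak bandwidth is just bus width in bytes times per-pin data rate.

```python
# Back-of-envelope peak bandwidth: (bus width in bytes) * (per-pin data rate).
# Illustrative spec-sheet numbers, not measured results.

def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_gbps

ddr4 = peak_bandwidth_gbs(128, 3.2)    # dual-channel DDR4-3200 (2 x 64-bit): ~51 GB/s
gddr6 = peak_bandwidth_gbs(256, 14.0)  # 256-bit GDDR6 card (e.g. RTX 2080): ~448 GB/s

print(f"DDR4-3200, dual channel: {ddr4:.0f} GB/s")
print(f"GDDR6, 256-bit bus:      {gddr6:.0f} GB/s ({gddr6 / ddr4:.1f}x)")
```

That's nearly a 9x difference in peak bandwidth before you even consider latency or signal integrity.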


1 minute ago, Snipergod87 said:

People have brought this up a lot in the past. The memory in your PC is far slower than VRAM, so using the same memory on a GPU would hugely impact performance. It would have to be a different type of memory, which would probably be expensive due to lack of demand.

There are other concerns as well, like compatibility, physically fitting the modules, etc.

Why would it be more expensive? AMD could sell the RAM, or have board partners sell it. It would be the same chips that they already sell to us, just on little boards instead of soldered to the GPU.


In addition to what @Snipergod87 mentioned, there have been cards in the past (quite a while ago) that had this as a feature: an extra memory slot on the end. Sound cards as well. However, there wasn't really specialized memory for them back then. You'd still have to get other manufacturers on board to produce the expansion modules, and even then there isn't enough of a market to warrant the change.


Video cards vary in power consumption A LOT; let's say between around 30 watts and 300 watts.

 

If you make sockets, you also have to put the VRM that powers the GPU chip on the motherboard, i.e. the voltage regulator for the video card. You'd basically force every motherboard manufacturer to duplicate the CPU VRM (which is probably 50%+ of the cost of the whole motherboard) to power ONE video card... and what if the user wants two video cards?

Don't forget the cooler to dissipate up to 300 watts: think mounting holes around the socket so you could fit a cooler just as big as the ones you normally see on processors, with six heatpipes or more.

Then, if you don't make the video card use only embedded memory like HBM2, you'd have to deal with the incredible number of traces going to the slots holding the VRAM... one memory stick is 64 bits wide, and a GTX 1060 3GB has a 192-bit wide memory bus, so you'd need 3 slots on the motherboard for VRAM... to get 256 bits you'd need 4 memory sticks, basically the equivalent of quad channel for processors.
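A quick sanity check on that slot arithmetic (a sketch using real bus widths; the Radeon VII is included since it came up above, and its HBM2 makes the problem obvious):

```python
# Slots' worth of 64-bit memory modules needed to match a GPU memory bus.
STICK_WIDTH_BITS = 64  # one standard memory stick exposes a 64-bit channel

for card, bus_bits in [("GTX 1060 3GB (GDDR5)", 192),
                       ("typical GDDR6 card", 256),
                       ("Radeon VII (HBM2)", 4096)]:
    slots = bus_bits // STICK_WIDTH_BITS
    print(f"{card}: {bus_bits}-bit bus -> {slots} x 64-bit slots")
```

Three slots for a budget card, four for a mid-range one, and a Radeon VII's 4096-bit HBM2 would need the equivalent of 64 slots, which is exactly why HBM lives on the package and not in sockets.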

The problem is that the higher the frequency something runs at, the more critical it is to make all the wires (traces) that carry data between chips equal in length (with sub-mm precision), and to keep the path as unobstructed as possible. A socket adds resistance at the point where the pins contact the socket, plus extra inductance and other things that mess up the signals.

It's one thing to use 3200 MT/s memory sticks, which run on a 1600 MHz clock, but GDDR6 and other modern memory types run at up to 14-16 Gb/s per pin (call it 4-5 times the data rate of regular memory), and the more memory slots you have, the more difficult it is to keep signal integrity and keep the power low.
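For scale, here's that per-pin comparison worked out (spec-sheet numbers only):

```python
# Per-pin data rates (spec-sheet values). "DDR4-3200" means 3200 MT/s,
# i.e. 3.2 Gb/s per pin on a 1600 MHz clock.
DDR4_3200_GBPS = 3.2

for gddr6_gbps in (14.0, 16.0):
    print(f"GDDR6 @ {gddr6_gbps:.0f} Gb/s/pin = "
          f"{gddr6_gbps / DDR4_3200_GBPS:.1f}x DDR4-3200")
```

Every one of those pins has to cross a socket's contact resistance cleanly at 14-16 Gb/s, which is the crux of the signal-integrity problem above.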

 

Look at the circuit board of a video card and notice how closely the memory chips are clustered around the GPU, compared to the distance between a CPU and its memory slots.

 


