Rumor: AMD to go LGA with AM5, PCIe Gen 5 limited to only EPYC CPUs for Zen 4 Update: More leaks

Random_Person1234

Like many before me, I prefer PGA, but in the end I never broke a CPU or mobo across the ~2,000 PCs I built back in my day. Admittedly, LGA was very fragile, and a tiny bit of inaccuracy could mean spending a minute or two straightening pins, but the only damaged socket I ever saw came that way from the supplier.

PCIe 5.0 is definitely not a must, but it's nice to have: you could, for example, get more performance out of an eGPU, or cram a shitload of SSDs into a 20-lane system. At the same time, PCIe 4.0 already uses more energy and has stricter signal-integrity requirements, so PCIe 5.0 will make everything more expensive; waiting until it's cheaper and more mature makes sense. I wonder if Zen 4 Ryzen's Infinity Fabric will use PCIe 5.0, though.


6 hours ago, Chiyawa said:

Well, if it reduces latency by a fraction, that's considered a big bonus for competitive gamers. That's the reason PCIe 4 became so widespread. Even Nvidia, who laughed at AMD for adopting PCIe 4 "nonsense" since GPU bandwidth only utilised half of PCIe 3.0 x16, jumped on the bandwagon.

For GPUs, PCIe 4.0 is still largely irrelevant to current gaming. On both vendors' cards you can cut the bandwidth all the way down to the equivalent of PCIe 3.0 x4 with hardly any impact on fps. AMD may have got there first and tried to sell it as an advantage, but it was essentially marketing and made no real difference.

 

Also, it was inevitable that Nvidia would support faster PCIe speeds. They did not go "oh, AMD did it, we'd better copy it". As far as I'm aware, that only happened with Resizable BAR (re-bar) support.

 

6 hours ago, Chiyawa said:

Still, with PCIe 4, we have Smart Access Memory support

It doesn't need PCIe 4.0. My laptop, which pairs an AMD 5800H with an Nvidia 3070, runs re-bar, but due to the CPU's cut-down connectivity the GPU only gets a PCIe 3.0 x8 link.

 

5 hours ago, AndreiArgeanu said:

Current GDDR6X memory found in an RTX 3090 can reach a theoretical maximum of 936 GB/s, while with DDR5, the information I could find says it can achieve a peak of 51.2 GB/s per module, which is double DDR4-3200.

I think we can compare peak rates. Dual-channel RAM at 3200 is around 50 GB/s, so say we get to 100 GB/s with DDR5. Even with loose-timing modules you can attain about 90% of that for read bandwidth, and perhaps 70% or so for write/copy; with tight-timing kits you get closer to the theoretical peak. Agreed it might make some impact for iGPUs designed around it.

 

Also note, PCIe 5.0 x16 would only provide a max of about 64 GB/s per direction. Doing only reads or only writes, you couldn't even max out DDR5 RAM; it might be possible with simultaneous reads and writes. I think PCIe can do both at the same time, but DDR is a shared bus, so it can only do one or the other.
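If anyone wants to sanity-check those numbers, here's a rough Python sketch of the peak-rate arithmetic (standard DDR bus widths and PCIe encoding figures; nothing here is from the leak):

```python
# Peak DDR bandwidth: transfer rate (MT/s) x 8 bytes per 64-bit channel x channel count.
def ddr_peak_gbs(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000

# PCIe 5.0: 32 GT/s per lane with 128b/130b encoding, per direction.
def pcie5_peak_gbs(lanes):
    return 32 * (128 / 130) / 8 * lanes

print(ddr_peak_gbs(3200))   # DDR4-3200, dual channel: 51.2 GB/s
print(ddr_peak_gbs(6400))   # DDR5-6400, dual channel: 102.4 GB/s
print(pcie5_peak_gbs(16))   # PCIe 5.0 x16: ~63 GB/s per direction
```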

 

5 hours ago, AndreiArgeanu said:

I doubt that PCIe 5.0 makes a measurable difference in latency over PCIe 4.0. Even once PCIe 5.0 is released, I highly doubt Nvidia or AMD would be quick to switch to it anyway, so all that latency improvement would go to waste.

Like many technologies, the time to first access is largely unchanged between generations. Where newer generations help is that, once a transfer has started, it takes less time to move the data.
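As a toy model (the 1 µs latency figure is purely illustrative, not a measured value):

```python
# Total transfer time = fixed first-access latency + payload / link bandwidth.
# Doubling the link speed shrinks only the second term.
def transfer_time_us(payload_mb, bandwidth_gbs, latency_us=1.0):
    return latency_us + payload_mb / bandwidth_gbs * 1000  # MB/(GB/s) = ms, x1000 = us

for name, bw in [("PCIe 3.0 x16", 16), ("PCIe 4.0 x16", 32), ("PCIe 5.0 x16", 64)]:
    print(name, round(transfer_time_us(64, bw)), "us")  # 64 MB payload
```

The first-access latency term is identical on every line; only the bulk-transfer portion shrinks with each generation.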

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


It would actually be nice to see a different approach to the LGA motherboard socket. Maybe they have something that's not as easy to damage as the Intel sockets.


3 minutes ago, thewill102 said:

Maybe they have something that's not as easy to damage as the Intel sockets.

Easy? It's as easy as not dropping anything on it, and tbh, depending on what you drop, nothing might be able to prevent the damage. LGA is less prone to damage than PGA, but the typical damage is more severe; PGA, on the other hand, is more prone to damage, but the damage is typically less severe.


Just now, leadeater said:

Easy? It's as easy as not dropping anything on it, and tbh, depending on what you drop, nothing might be able to prevent the damage. LGA is less prone to damage than PGA, but the typical damage is more severe; PGA, on the other hand, is more prone to damage, but the damage is typically less severe.

 

As a system integrator I handled a lot of CPUs and motherboards, and I never managed to drop anything into a socket. And yes, I completely agree. In my opinion there could still be some "innovation" in how the LGA socket on the mobo is designed, to make damage less severe.


9 minutes ago, thewill102 said:

It would actually be nice to see a different approach to the LGA motherboard socket. Maybe they have something that's not as easy to damage as the Intel sockets.

Be realistic: how much time is spent with the pins exposed on an LGA socket? It ships with a cover attached, and the only time the pins normally get exposed is when installing the CPU. If they get damaged in the process, it is most likely user error.


Just now, porina said:

Be realistic: how much time is spent with the pins exposed on an LGA socket? It ships with a cover attached, and the only time the pins normally get exposed is when installing the CPU. If they get damaged in the process, it is most likely user error.

Yes, it is most likely a user error.

 

My point is, the way Intel does it can't be the only way to make an LGA socket. I'm just curious whether AMD will take a new approach to the LGA socket type.


20 minutes ago, thewill102 said:

My point is, the way Intel does it can't be the only way to make an LGA socket. I'm just curious whether AMD will take a new approach to the LGA socket type.

What design goal are you thinking of? Making the pins better protected from damage, for example by having them retract when the bracket is open, could be possible, but that would add complexity and cost. At the end of the day, is a damaged socket a big enough problem to justify that increased cost?


4 minutes ago, James Evens said:

I would say PGA is more complicated/fragile than LGA.

PGA pins are typically thicker and more resilient to bending, and thus more likely to be fixable. Like I said: PGA is more likely to get damaged, but you're more likely to be able to fix it. With LGA pins, any damage is near-certain to be irreparable.

 

4 minutes ago, James Evens said:

the cheap part is damaged (mainboard)

Only if you buy cheap.

 

4 minutes ago, James Evens said:

ripping it out of the socket due to the thermal paste, with the risk of damaged pins (it's such a bad problem that there are now third-party metal brackets holding the CPU in the socket to fix AMD's oversight).

Twist, then lift; never pull. A 100% reliable solution to this issue. Everyone who built PCs pre-Core 2 Duo knows this. I have never had, and will never have, this problem.

 

 


17 minutes ago, James Evens said:

The difference is the exposure/risk itself.

And doesn't my original point, that it's more likely to get damaged, say exactly that?

 

You should probably read past that full stop to the next sentence of what you quoted 😉


1 hour ago, leadeater said:

Easy? It's as easy as not dropping anything on it, and tbh, depending on what you drop, nothing might be able to prevent the damage. LGA is less prone to damage than PGA, but the typical damage is more severe; PGA, on the other hand, is more prone to damage, but the damage is typically less severe.

While it's true that it's not a problem as long as you're careful, "things" can happen too.

The pin-shifting problem I've mentioned happens even if you're very careful getting the chip in and out. The issue is that, for some reason, one or more of the pins gets loose where it's mounted in the socket and tends to move/wiggle to one side or the other. If a pin is off to the side too far, in many cases the board will throw a code and not work.

To try to avoid any damage, I use a small suction cup from a smartphone repair kit to remove and set chips in these sockets, so I don't have to worry about a case of butterfingers making me drop one. It has a little ring for holding it, and I can simply set the chip right in, close the latch over the chip, and then remove the suction cup; the cup's ring passes right through the open part of the CPU latch when closing it.
As long as the CPU lid isn't scratched up or gouged in a way that breaks the suction, it works great.

Even with that, I still get an errant pin shifted to one side on occasion, and it's just annoying to deal with. The good thing is that it's always the same pin that shifts, so I know which one it is every time it happens on a given board.

"If you ever need anything please don't hesitate to ask someone else first"..... Nirvana
"Whadda ya mean I ain't kind? Just not your kind"..... Megadeth
Speaking of things being "All Inclusive", Hell itself is too.

 


46 minutes ago, leadeater said:

Only if you buy cheap.

I guess it depends a lot on where in the stack you buy on both sides. A higher-end system would have a CPU at, say, US$300 or well above, with a mobo at perhaps only $200 or so. Even on the lower end of enthusiast builds we're looking at ballpark $100+ for the CPU and perhaps a bit less for the mobo. On average, for a balanced system, I think the CPU costs more than the mobo more often than not.

 

I know, much cheaper CPUs exist, but they're unlikely to be paired with very expensive mobos most of the time.

 

46 minutes ago, leadeater said:

Twist, then lift; never pull. A 100% reliable solution to this issue. Everyone who built PCs pre-Core 2 Duo knows this. I have never had, and will never have, this problem.

I know this, but I still pull AMD CPUs out with the heatsink attached. I guess it's a different thing if you're doing this regularly, but I do it once in forever, so I forget between each time. When I remember, I like to preheat the CPU immediately before trying to extract it.


10 minutes ago, porina said:

I know, much cheaper CPUs exist, but they're unlikely to be paired with very expensive mobos most of the time.

It's more the other end of the scale: you wouldn't want to break either part, since both are a significant cost, so which one is cheaper doesn't make a whole lot of difference. Realistically, either hurts just as badly.


18 minutes ago, porina said:

I guess it depends a lot on where in the stack you buy on both sides. A higher-end system would have a CPU at, say, US$300 or well above, with a mobo at perhaps only $200 or so. Even on the lower end of enthusiast builds we're looking at ballpark $100+ for the CPU and perhaps a bit less for the mobo. On average, for a balanced system, I think the CPU costs more than the mobo more often than not.

 

I know, much cheaper CPUs exist, but they're unlikely to be paired with very expensive mobos most of the time.

 

I know this, but I still pull AMD CPUs out with the heatsink attached. I guess it's a different thing if you're doing this regularly, but I do it once in forever, so I forget between each time. When I remember, I like to preheat the CPU immediately before trying to extract it.

Happens to me all the time. I just pull straight up and out to get the chip out of the socket, and separate the chip and cooler after that's done.
Just don't twist it while pulling it out; that's a great way to distort/break pins.

Speaking of the suction cup I posted about above, I took a shot of it, the kit that came with it, and how it looks on a chip. The cup holds suction well enough that I can literally pick the board up off the tabletop, and this board isn't a light one either.
Again, make sure the chip's lid isn't scratched up or gouged in a way that would affect how it holds suction; it helps to have a very light film of grease/oil on the cup, which just wipes off the chip's lid once the chip is in the socket.
Also remember to latch the chip in place FIRST, passing the ring through the open part of the latch, then remove the cup from the chip.

[Image: the suction cup, the repair kit it came from, and the cup attached to a CPU]

"If you ever need anything please don't hesitate to ask someone else first"..... Nirvana
"Whadda ya mean I ain't kind? Just not your kind"..... Megadeth
Speaking of things being "All Inclusive", Hell itself is too.

 

Link to comment
Share on other sites

Link to post
Share on other sites

5 minutes ago, ONOTech said:

Their retention is aaawwwwwful.

Can't be awful if it doesn't exist

-sigh- feeling like I'm being too negative lately


Update: ExecutableFix has now leaked more details. Zen 4 Raphael (the first consumer Zen 4 chip) will support DDR5 (as expected), but unlike Intel Alder Lake, it will not also support DDR4. Raphael will have 28 PCIe 4.0 lanes, up from Zen 3's 24. The chips will have a 120 W TDP, with a 170 W special variant also possible. ExecutableFix also leaked a picture of the LGA 1718 pads on the CPU.

Here it is compared to LGA 1700 (Alder Lake socket):

[Image: AMD Raphael AM5 package underside compared to Intel Alder Lake LGA 1700]

 

CPU - Ryzen 5 5600X | CPU Cooler - EVGA CLC 240mm AIO  Motherboard - ASRock B550 Phantom Gaming 4 | RAM - 16GB (2x8GB) Patriot Viper Steel DDR4 3600MHz CL17 | GPU - MSI RTX 3070 Ventus 3X OC | PSU -  EVGA 600 BQ | Storage - PNY CS3030 1TB NVMe SSD | Case Cooler Master TD500 Mesh

 


7 hours ago, Random_Person1234 said:

DDR5 (as expected), but unlike Intel Alder Lake, it will not also support DDR4

DDR4 and DDR5 on the same board?

✨FNIGE✨


12 minutes ago, SlimyPython said:

DDR4 and DDR5 on the same board?

It was like that with DDR2/DDR3 for both Intel and AMD for a while.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Does this mean AM4 coolers probably won't work with AM5 socket motherboards?


1 hour ago, SlimyPython said:

DDR4 and DDR5 on the same board?

No. On Alder Lake, a given board will only support either DDR4 or DDR5; the memory controller on the CPU can handle either one.


37 minutes ago, Joduko said:

Does this mean AM4 coolers probably won't work with AM5 socket motherboards?

No, they're probably still gonna work. As mentioned in the post, the physical sockets will be the same size.


On 5/24/2021 at 5:26 AM, staticpage said:

RIP finding good working mobos at the scrapyard. Nearly all of the Intel LGA mobos that would otherwise work great are destroyed, because LGA is fragile and the socket cover never gets put back on. I find piles of good Z170s and have taken the time to repair a few, but most have destroyed pins; so much e-waste. Meanwhile, when I find an A320 in the gaylords, it works every time.

Seems like there should be a solution to that. Almost as if the motherboard should have a retention plate or lever that protects the socket; think about how ZIF sockets work. At any rate, it's an issue that could be dealt with by soldering the CPU to the board on budget models, and otherwise making the "motherboard" just a PCIe backplane into which an AMD or Intel SoC+RAM module can be installed regardless of the backplane board.

 

On 5/24/2021 at 8:51 AM, porina said:

I know for sure it was a thing in 486 land. AMD offered the same CPUs but at higher clocks. I also know it wasn't a thing for Pentium 2, where we had Slot 1 vs. Slot A (and some slocket fun). See what they did there? The generation in between, I don't recall clearly any more. I did have a Pentium MMX 120 MHz but don't recall that AMD was ever an option for my build.

AMD, Cyrix, SiS and the others had no licence to build Socket 370/Slot 1 parts. You can tell that at some point the AMD Athlon was intended to work on Intel chipsets, because Slot A used the exact same 242-contact connector as the Pentium II's Slot 1. However, the actual bus was different, and it seems like a last-minute change was made so that it used EV6 instead of GTL+.

 

Otherwise, why go this route at all?

 

On 5/24/2021 at 7:16 PM, Chiyawa said:

Well, if it reduces latency by a fraction, that's considered a big bonus for competitive gamers. That's the reason PCIe 4 became so widespread. Even Nvidia, who laughed at AMD for adopting PCIe 4 "nonsense" since GPU bandwidth only utilised half of PCIe 3.0 x16, jumped on the bandwagon.

 

Still, with PCIe 4, we have Smart Access Memory support, so it's actually a good thing, to be honest. There are a lot of applications that can take advantage of this technology, since the CPU can address every single memory cell inside the GPU directly. Maybe when PCIe 5 comes along, VRAM will be a thing of the past, as the GPU can share the super-fast RAM that DDR5 offers. Of course, they would still need a super-fast cache.

The GPU bandwidth reason is demonstrably false. The problem is not that games aren't using the full capability of the GPU, but that Nvidia isn't making chips fast enough to need PCIe 4.0 x16 while pre-DX12/Vulkan APIs were bottlenecked to one CPU thread. The RTX 3080 and 3090 show exactly how that gets bottlenecked. There is more to a GPU than just the PCIe lanes.

 

Memory bandwidth plays a much bigger part in GPU performance than the PCIe connection does, and games routinely do things to reduce the amount of data they push over the PCIe bus because of this.

 

A game running at 240 fps at 8K is simply not possible on existing hardware, never mind with HDR and ray tracing, which is effectively impossible. To give a realistic example, Final Fantasy XV and Cyberpunk 2077 are the two games I have that I have not been able to play at "maximum" settings on my GTX 1080, because 1) the GPU doesn't have ray tracing, and 2) the Nvidia GameWorks libraries are buggy as hell, making most of those Nvidia features worthless.

 

I kid you not, I could maybe get 30 minutes of gameplay in FFXV before it would crash to the desktop because of memory leaks in the GameWorks libraries, and even then the performance wasn't spectacular. If I ever get my hands on a 3090, or maybe a 4090 or something like that in the next two years at this rate, I still don't expect the GPU to do 8K. But I would certainly like to be able to play 4K games at 4Kp60 without having to sacrifice any GPU features other than antialiasing and motion blur.

 

So I do reasonably expect Vulkan and DX12 games to eventually make PCIe 4.0 cards a requirement, especially with direct access between storage and video memory, so that the CPU is no longer the bottleneck for compression/decompression.


Update:

Summary

ExecutableFix on Twitter has released some more info on AM5, along with a picture of the underside of an AM5 CPU (where the pads will be). AM5 CPU packages will also remain the same size as AM4's.

Quotes

Quote

ExecutableFix put out a handful more details about the I/O of this socket. Apparently, AM5 is a pure-DDR5 platform, with no backwards compatibility with DDR4. The socket features a dual-channel DDR5 memory interface. The PCI-Express interface is PCI-Express 4.0, with the socket putting out 28 lanes in total. 16 of these go to the PEG slot(s), four to an M.2 NVMe slot, and possibly the remaining eight as chipset bus.

 

My thoughts

This is exciting news for two reasons, in my opinion. AM4-compatible coolers should remain compatible with AM5 if the CPU height is unchanged, and the differences between the undersides of existing LGA115x CPUs and AM5 CPUs should dissuade all but the most dedicated/most foolish people from trying to put an AM5 CPU in an LGA115x socket and vice versa. I'm also intrigued as to how 8 lanes for connecting the CPU to the chipset will impact PCIe storage speeds.

Sources

https://www.techpowerup.com/282637/amd-socket-am5-package-underside-pictured

https://twitter.com/ExecuFix/status/1397173117487828992

Other notes 

This info came from a Twitter post, but Twitter is blocked at my school so I used the TechPowerUp article. I've linked the tweet, but I have no idea what it says.

elephants


18 hours ago, FakeKGB said:

I'm also intrigued as to how 8 lanes for connecting the CPU to the chipset will impact PCIe storage speeds.

That is speculation on TPU's part and not from the tweet.

 

It's a fair question where those extra lanes could go. One option is a flexible set of lanes for more mobo-connected devices: an extra CPU-connected PCIe x4 slot (or multiple x1/x2 slots?), or a second M.2 slot, for example. Dedicating them to the chipset raises the question of whether the chipset will serve more as a hub for higher-bandwidth devices. AMD's current 4.0 x4 link is around 8 GB/s; doubling the lanes doubles that to 16 GB/s.
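A quick sketch of that per-generation lane arithmetic, if anyone wants to play with it (approximate per-direction figures after encoding overhead; the lane counts are from the rumor, the rates are standard):

```python
# Per-lane signalling rates in GT/s; PCIe 3.0 onward uses 128b/130b encoding.
GT_PER_S = {3: 8, 4: 16, 5: 32}

def pcie_gbs(gen, lanes):
    return GT_PER_S[gen] * (128 / 130) / 8 * lanes

print(round(pcie_gbs(4, 4), 1))   # current x4 chipset link: ~7.9 GB/s
print(round(pcie_gbs(4, 8), 1))   # rumored x8 chipset link: ~15.8 GB/s
```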


*supports standards of having no pins on either*

Loose pins that can be locked in with teeth or some other grip and swapped out?

Just to annoy everyone.
