
Intel drops PCIe 4.0 support for Comet Lake Desktop

Reytime
10 hours ago, SeriousDad69 said:

This doesn't really change anything. I can't see GPUs being bottlenecked by PCIe 3.0 x16 for years to come, and FPS snobs will always buy Intel as long as they maintain the slight FPS advantage (220 FPS instead of 200!!! ... at 3x the power usage, lol)

I can't speak to the GPU side of things, but I saw a yuuge increase going from my B450 to the current X570 I have, though only in my PCIe-to-M.2 adapter, because I had an NVMe drive running off x4 PCIe and being throttled to half speed. The jump to x4 PCIe 4.0 just about brought it up to where it would be in an actual M.2 slot.
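For rough numbers (spec-sheet theoretical maximums, not anything I measured, and real-world throughput lands a bit lower):

```python
# Back-of-the-envelope PCIe link bandwidth. Per-lane rates after encoding
# overhead, in GB/s (theoretical maximums from the spec):
PER_LANE_GBPS = {
    "2.0": 0.5,    # 5.0 GT/s, 8b/10b encoding
    "3.0": 0.985,  # 8.0 GT/s, 128b/130b encoding
    "4.0": 1.969,  # 16.0 GT/s, 128b/130b encoding
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """One-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

for gen in PER_LANE_GBPS:
    print(f"PCIe {gen} x4: {link_bandwidth(gen, 4):.1f} GB/s")
# PCIe 2.0 x4: 2.0 GB/s  <- roughly half of what a fast 3.0 NVMe drive can do
# PCIe 3.0 x4: 3.9 GB/s
# PCIe 4.0 x4: 7.9 GB/s
```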

 

I also need to admit that my setup is a bit reckless when it comes to storage: 7 drives total right now, two M.2 and 5 SATA, certainly not something that every, or even many, gamers would resort to. None of them are in RAID; it's all just various storage for videos, RAWs, edits, and a disgusting 260 GB Photoshop folder I organize by just creating more subfolders in.

Updated 2021 Desktop || 3700x || Asus x570 Tuf Gaming || 32gb Predator 3200mhz || 2080s XC Ultra || MSI 1440p144hz || DT990 + HD660 || GoXLR + ifi Zen Can || Avermedia Livestreamer 513 ||

New Home Dedicated Game Server || Xeon E5 2630Lv3 || 16gb 2333mhz ddr4 ECC || 2tb Sata SSD || 8tb Nas HDD || Radeon 6450 1g display adapter ||


2 hours ago, TrigrH said:

We don't need PCIe 4; I claimed we need more usable CPU lanes. When did I claim we need PCIe 4? I think you missed my point.

 

7 hours ago, TrigrH said:

PCIe 4 is only needed because Intel gimps the number of lanes on mainstream desktop. If Comet Lake has 32 CPU PCIe 3.0 lanes, then that would be great. Even Ryzen has 24 CPU PCIe 4.0 lanes, which is awesome.

 

You claim it is only needed because Intel gimps the lane count, and you back that up by talking about the GPU using all the lanes. No GPU uses more than 8 lanes, and the 4 lanes you get off the chipset are barely slower for NVMe. Where do we need PCIe 4 when PCIe 3 and the chipset lanes are by and large not gimped? As I said before, we don't need PCIe 4; at worst we need current PCIe from CPU to NVMe, but that won't net you the gains you think it will, even with only 16 lanes on the CPU.
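To put numbers on "barely slower" (a rough sketch; assumes Intel's DMI 3.0 uplink, which is equivalent to a PCIe 3.0 x4 link):

```python
# Why a chipset-attached NVMe drive is "barely slower": Intel's DMI 3.0 uplink
# is equivalent to PCIe 3.0 x4 (~3.9 GB/s), the same peak as a CPU-attached
# x4 slot -- the catch is that the DMI link is shared with everything else
# hanging off the chipset (SATA, USB, NIC, ...).
PCIE3_PER_LANE_GBPS = 0.985                # 8 GT/s with 128b/130b encoding

cpu_attached_x4 = PCIE3_PER_LANE_GBPS * 4  # dedicated lanes
dmi_uplink = PCIE3_PER_LANE_GBPS * 4       # shared uplink

print(f"CPU-attached 3.0 x4 NVMe: {cpu_attached_x4:.1f} GB/s (dedicated)")
print(f"Chipset NVMe via DMI: up to {dmi_uplink:.1f} GB/s (shared)")
```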

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean a lack of understanding.  


Sounds like what AMD did back in the day with 2.0 to 3.0.

i9 11900k - NH-D15S - ASUS Z-590-F - 64GB 2400Mhz - 1080ti SC - 970evo 1TB - 960evo 250GB - 850evo 250GB - WDblack 1TB - WDblue 3TB - HX850i - 27GN850-B - PB278Q - VX229 - HP P224 - HP P224 - HannsG HT231 - 450D                                                         

Yeah, I know most people don't need PCIe 4.0 now. However, it isn't uncommon to keep the same CPU/motherboard for 3-5 years. Over that period, we might see big improvements in storage, GPUs, and other stuff that benefit from PCIe 4.0. What happens when you want to upgrade the GPU three years from now, but the next-next-gen GPU saturates PCIe 3.0 x16?

 

People keep arguing about "now".


23 minutes ago, Deli said:

Yeah, I know most people don't need PCIe 4.0 now. However, it isn't uncommon to keep the same CPU/motherboard for 3-5 years. Over that period, we might see big improvements in storage, GPUs, and other stuff that benefit from PCIe 4.0. What happens when you want to upgrade the GPU three years from now, but the next-next-gen GPU saturates PCIe 3.0 x16?

 

People keep arguing about "now".

Given that the RTX 2080 Ti doesn't really choke at PCIe 3.0 x4 speeds, which is basically the same bandwidth as PCIe 1.x x16, I find it hard to believe that in 3-5 years we'll have a GPU that demands PCIe 3.0 x16 bandwidth at a minimum. The performance impact is also less apparent at higher resolutions, meaning a GPU that's fully taxed is unlikely to need more PCIe bandwidth.

 

There is a catch: the above assumes the GPU isn't starved for VRAM and swapping data in and out between it and system RAM. If it is, then yes, PCIe bandwidth will start to matter. However, I'd argue that if you're in that situation, you should really be tuning the application settings so it's not eating up so much VRAM.
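For what it's worth, the x4-to-1.x-x16 comparison checks out on paper:

```python
# Theoretical one-direction bandwidth: PCIe 1.x x16 vs PCIe 3.0 x4.
pcie1_x16 = 0.25 * 16  # 2.5 GT/s, 8b/10b encoding    -> 4.00 GB/s
pcie3_x4 = 0.985 * 4   # 8.0 GT/s, 128b/130b encoding -> ~3.94 GB/s
print(f"PCIe 1.x x16: {pcie1_x16:.2f} GB/s | PCIe 3.0 x4: {pcie3_x4:.2f} GB/s")
```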


30 minutes ago, Mira Yurizaki said:

Given that the RTX 2080 Ti doesn't really choke at PCIe 3.0 x4 speeds, which is basically the same bandwidth as PCIe 1.x x16, I find it hard to believe that in 3-5 years we'll have a GPU that demands PCIe 3.0 x16 bandwidth at a minimum. The performance impact is also less apparent at higher resolutions, meaning a GPU that's fully taxed is unlikely to need more PCIe bandwidth.

 

There is a catch: the above assumes the GPU isn't starved for VRAM and swapping data in and out between it and system RAM. If it is, then yes, PCIe bandwidth will start to matter. However, I'd argue that if you're in that situation, you should really be tuning the application settings so it's not eating up so much VRAM.

I think the article says the 2080 Ti can overwhelm PCIe 3.0 x8. Or am I missing something?

 

Will Nvidia/AMD/Intel be able to double the performance of a top-end GPU in 3-4 years? Maybe, maybe not.


23 minutes ago, Deli said:

I think the article says the 2080 Ti can overwhelm PCIe 3.0 x8. Or am I missing something?

 

Will Nvidia/AMD/Intel be able to double the performance of a top-end GPU in 3-4 years? Maybe, maybe not.

I think their use of the word "overwhelming" is misleading. A 2% performance difference on average is considered within the margin of error by a lot of people. It also depends on how the application was designed. But as it stands, I'm not convinced that PCIe 4.0 x16 will be a requirement for top performance.

 

EDIT: I wanted to look at what Futuremark thought about PCIe bandwidth, since they have a PCIe bandwidth test. The test streams geometry data to the GPU (see https://s3.amazonaws.com/download-aws.futuremark.com/3dmark-technical-guide.pdf).

 

However, they did say:

"In real-world use with today's rendering pipelines, a PC's gaming performance is unlikely to be limited by PCIe bandwidth. Nevertheless, the increase in bandwidth that PCIe 4.0 brings is sure to open up new possibilities with future hardware."

So yes, I don't doubt the possibility in the second sentence. However, given that adoption of PCIe 4.0 is going to take a very long time (PCIe 3.0 wasn't even available until 2012 with Intel's Ivy Bridge, and AMD wasn't even on the PCIe 3.0 train until Ryzen), I doubt this will be a thing for at least another 8 years.
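If anyone wants to sanity-check raw PCIe transfer rates on their own machine, here's a rough host-to-GPU copy benchmark, a sketch assuming PyTorch with a CUDA build. It measures bulk copy bandwidth, which is not the same workload as 3DMark's geometry-streaming test:

```python
# Rough host-to-device PCIe copy benchmark (bulk transfers, pinned memory).
import time
import torch

SIZE_MB = 1024
src = torch.empty(SIZE_MB * 1024 * 1024, dtype=torch.uint8).pin_memory()
dst = torch.empty_like(src, device="cuda")

dst.copy_(src, non_blocking=True)   # warm-up so driver overhead isn't timed
torch.cuda.synchronize()

t0 = time.perf_counter()
for _ in range(10):
    dst.copy_(src, non_blocking=True)
torch.cuda.synchronize()
elapsed = time.perf_counter() - t0

print(f"~{10 * SIZE_MB / 1024 / elapsed:.1f} GB/s host -> device")
# A PCIe 3.0 x16 link should land somewhere around 12-13 GB/s in practice.
```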


18 hours ago, 2FA said:

PCIe 4.0 is more important for servers than desktops thanks to NVMe and 40/100Gbps networking.

PCIe 4.0 is just like 5G or 10 Gbps Ethernet. I mean, who has internet fast enough to take advantage of that? Similarly, 1,000 MB/s reads and writes on a super-fast disk are meaningless if your CPU isn't even fast enough to copy and write files at that speed.

Sudo make me a sandwich 


31 minutes ago, Mira Yurizaki said:

I think their use of the word "overwhelming" is misleading. A 2% performance difference on average is considered within the margin of error by a lot of people. It also depends on how the application was designed. But as it stands, I'm not convinced that PCIe 4.0 x16 will be a requirement for top performance.

 

EDIT: I wanted to look at what Futuremark thought about PCIe bandwidth, since they have a PCIe bandwidth test. The test streams geometry data to the GPU (see https://s3.amazonaws.com/download-aws.futuremark.com/3dmark-technical-guide.pdf).

 

However, they did say:

So yes, I don't doubt the possibility in the second sentence. However, given that adoption of PCIe 4.0 is going to take a very long time (PCIe 3.0 wasn't even available until 2012 with Intel's Ivy Bridge, and AMD wasn't even on the PCIe 3.0 train until Ryzen), I doubt this will be a thing for at least another 8 years.

It depends on what you're doing. PCIe 4 is currently only useful at all in a very few situations involving very fast NVMe drives, and then only just barely. A 2080 Ti can only just barely bust PCIe 3.0 x8, though, and not for everything. PCIe 4 has some weird advantages for storage, though. When PCIe 3 first came out there was a lot of argument that it was useless, just like there is now. What it mostly did was make SATA 6Gb/s a lot easier to implement, and very gradually cards got fast enough to make 2.0 kind of a PITA to have. Does one need PCIe 4 now? No. Will one wish one had PCIe 4 in a year? Unlikely. Will one wish one had PCIe 4 in 5 years? Much more likely. It probably won't be for raw GPU speed, though. It will be for something oddball like SATA 6Gb/s was.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


6 hours ago, Bombastinator said:

It depends on what you're doing. PCIe 4 is currently only useful at all in a very few situations involving very fast NVMe drives, and then only just barely. A 2080 Ti can only just barely bust PCIe 3.0 x8, though, and not for everything. PCIe 4 has some weird advantages for storage, though. When PCIe 3 first came out there was a lot of argument that it was useless, just like there is now. What it mostly did was make SATA 6Gb/s a lot easier to implement, and very gradually cards got fast enough to make 2.0 kind of a PITA to have. Does one need PCIe 4 now? No. Will one wish one had PCIe 4 in a year? Unlikely. Will one wish one had PCIe 4 in 5 years? Much more likely. It probably won't be for raw GPU speed, though. It will be for something oddball like SATA 6Gb/s was.

On the desktop side, I would have liked to see, at the PCIe 3.0 generation, a move away from x16 slots for GPUs, or at least a switch of the slot to x8 in favor of other devices being able to connect directly to the CPU. So few people run two GPUs that it makes little sense to me to have two x16 physical slots that switch to x8/x8 when both are populated, instead of utilizing those 8 lanes for dual x4 NVMe or NVMe + 10GbE.

 

If we have all accepted that dual GPUs are dead (god damn it, I refuse), then stop designing boards around x8/x8 PCIe expansion slots; that's 8 lanes wasted, and that will be just as true of PCIe 4.0. Just have one PCIe expansion slot connected to the CPU and the rest on the chipset. We don't actually need more lanes, we just need to stop them from being needlessly wasted.
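To make the lane math concrete, here's a sketch of how 16 CPU lanes could be carved up (hypothetical layouts for illustration, not any real board's topology):

```python
# Hypothetical ways to allocate 16 CPU lanes (PCIe 3.0, ~0.985 GB/s per lane).
GBPS_PER_LANE = 0.985

layouts = {
    "x16 GPU": [("GPU", 16)],
    "x8/x8 dual GPU": [("GPU", 8), ("GPU 2, rarely populated", 8)],
    "x8/x4/x4": [("GPU", 8), ("NVMe", 4), ("NVMe or 10GbE", 4)],
}

for name, devices in layouts.items():
    parts = ", ".join(f"{dev} x{lanes} ({lanes * GBPS_PER_LANE:.1f} GB/s)"
                      for dev, lanes in devices)
    print(f"{name}: {parts}")
```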


11 hours ago, Deli said:

I think the article says the 2080 Ti can overwhelm PCIe 3.0 x8. Or am I missing something?

 

Will Nvidia/AMD/Intel be able to double the performance of a top-end GPU in 3-4 years? Maybe, maybe not.

If PassMark's data is to be trusted, a 2080 Ti scores about 50% better than a 980 Ti, with 3 years between their release dates. A 2080 Ti will just barely be limited by a 3.0 x8 slot, so it's unlikely that in 3-4 years they'll put together a card that's twice as fast as a 2080 Ti. Maybe 5-6 years down the line, but even then it'll just be limited by a full 3.0 x16 slot. Outside of some very specific storage situations (I guess), a 4.0 slot is not at all necessary. I don't think a consumer or gamer will see any benefit other than "bench racing" from a 4.0 x4 SSD over a 3.0 x4 one.

My Current Setup:

AMD Ryzen 5900X

Kingston HyperX Fury 3200mhz 2x16GB

MSI B450 Gaming Plus

Cooler Master Hyper 212 Evo

EVGA RTX 3060 Ti XC

Samsung 970 EVO Plus 2TB

WD 5400RPM 2TB

EVGA G3 750W

Corsair Carbide 300R

Arctic Fans 140mm x4 120mm x 1

 


3 hours ago, atxcyclist said:

If PassMark's data is to be trusted, a 2080 Ti scores about 50% better than a 980 Ti

You got a bit mixed up there. In relative performance, if the 2080 Ti is 100% and the 980 Ti is 50% as fast, that means the 980 Ti is only half as fast as the 2080 Ti, or in other terms the 2080 Ti is twice as fast as a 980 Ti. So yes, in 3 years GPUs got twice as fast (even if they got more expensive).


All this talk is about the desktop, and in particular the small segment of gaming usage. What most of you forget are the industrial use cases, like servers. Looking at the Storinator videos Linus made, you can see where PCIe will be used a lot in the future: fast storage access. Quite a lot of that will be for storage farms where VMs reside, where access in the microsecond range is essential. Often these machines are connected via the network. A better explanation is in this video:

 

"You don't need eyes to see, you need vision"

 

(Faithless, 'Reverence' from the 1996 Reverence album)


8 minutes ago, Dutch_Master said:

All this talk is about the desktop and in particular the small segment of gaming usage.

This thread is about some random rumour that Intel dropped 4.0 from a consumer desktop CPU. No one forgot that there are real use cases for PCIe 4.0 and even 5.0; that just isn't what's being discussed here.

 

 

I'll still welcome anyone with a link to Intel saying they ever had plans to put 4.0 in anyway, for it to possibly be a thing for them to allegedly take away.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


OK, fair point.

"You don't need eyes to see, you need vision"

 

(Faithless, 'Reverence' from the 1996 Reverence album)


3 hours ago, VegetableStu said:

the Wafer Eater has spoken

So the whole thread is based on the false premise that Intel was going to put PCIe 4 on Comet Lake in the first place. It wasn't "dropped", because it was never there to begin with.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


5 hours ago, Medicate said:

You got a bit mixed up there. In relative performance, if the 2080 Ti is 100% and the 980 Ti is 50% as fast, that means the 980 Ti is only half as fast as the 2080 Ti, or in other terms the 2080 Ti is twice as fast as a 980 Ti. So yes, in 3 years GPUs got twice as fast (even if they got more expensive).

On PassMark the 980 Ti scores 11440 and the 2080 Ti scores 16692; 16692/11440 = 1.46, so the 2080 Ti is about 146% the performance of the 980 Ti according to PassMark, not 200%. Some other comparisons may be different, but that's the one I specified in my post.
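Spelling out the two readings being argued about:

```python
# "X% better" (ratio - 1) vs "X% as fast" (inverse ratio), using the scores above.
score_980ti, score_2080ti = 11440, 16692
ratio = score_2080ti / score_980ti            # ~1.46
print(f"2080 Ti is {ratio:.2f}x the 980 Ti -> {(ratio - 1) * 100:.0f}% better")
print(f"980 Ti is {100 / ratio:.0f}% as fast as the 2080 Ti (not 50%)")
```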

My Current Setup:

AMD Ryzen 5900X

Kingston HyperX Fury 3200mhz 2x16GB

MSI B450 Gaming Plus

Cooler Master Hyper 212 Evo

EVGA RTX 3060 Ti XC

Samsung 970 EVO Plus 2TB

WD 5400RPM 2TB

EVGA G3 750W

Corsair Carbide 300R

Arctic Fans 140mm x4 120mm x 1

 


1 hour ago, VegetableStu said:

preeeeeeety much, it seems. kinda interesting for a thought exercise, but now considering if even the mainboard partners did make those boards that way, the possibilities are either "eh they've got 4.0 traces already, might as well copy the designs over for intel stuff" or "they seriously thought intel was going to do the impossible" o_o

Considering that at a 30,000-foot level Intel's and AMD's platforms are basically the same, just with a different socket and chipset (yes, I know the pin counts differ), it wouldn't surprise me that they're recycling the designs.


It may support PCIe 4.0, making consumers think they're buying a board that will work for next year's release, but knowing Intel, they'll still force you to upgrade your mobo for their next 14nm refresh.

5800X3D / ASUS X570 Dark Hero / 32GB 3600mhz / EVGA RTX 3090ti FTW3 Ultra / Dell S3422DWG / Logitech G815 / Logitech G502 / Sennheiser HD 599

2021 Razer Blade 14 3070 / S23 Ultra


11 hours ago, Mira Yurizaki said:

Considering that at a 30,000-foot level Intel's and AMD's platforms are basically the same, just with a different socket and chipset (yes, I know the pin counts differ), it wouldn't surprise me that they're recycling the designs.

*remembers pin-compatible VIA chipsets on boards with 440BX/LX screening*

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


1 hour ago, Dabombinable said:

*remembers pin-compatible VIA chipsets on boards with 440BX/LX screening*

Ohh VIA.  I remember someone mentioning there’s a decent chance they’ll rise from the dead like a vampire when the AMD64 patent runs out, which is pretty soon now.  I wonder how that might change things? 

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 minute ago, Bombastinator said:

Ohh VIA.  I remember someone mentioning there’s a decent chance they’ll rise from the dead like a vampire when the AMD64 patent runs out, which is pretty soon now.  I wonder how that might change things? 

It depends on whether or not AMD will try to become the Disney of the PC world, and get their patent extended...

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


26 minutes ago, Dabombinable said:

It depends on whether or not AMD will try to become the Disney of the PC world, and get their patent extended...

My memory is that Disney managed that with a five-year lawsuit, a portion of luck, and an unbelievably large amount of money.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


20 minutes ago, Bombastinator said:

My memory is that Disney managed that with a five-year lawsuit, a portion of luck, and an unbelievably large amount of money.

Disney managed to do it twice, actually, and one of the times was through legislation. But they've stopped trying to defend Steamboat Willie. So it's only content that's 125 years old. Only.


12 hours ago, Dabombinable said:

It depends on whether or not AMD will try to become the Disney of the PC world, and get their patent extended...

I think patent extensions only apply to food/drug-related products, considering the USPTO does have a list of patents that were extended and it looks like all of them are food/drug-related.

 

11 hours ago, Taf the Ghost said:

Disney managed to do it twice, actually, and one of the times was through legislation. But they've stopped trying to defend Steamboat Willie. So it's only content that's 125 years old. Only.

Steamboat Willie's copyright is going to expire in 2023 anyway (US copyright law says 120 years after creation or 95 years after publication, whichever is shorter).

 

But they have a hold on Mickey Mouse until the end of time because the character is now considered a trademark, not a copyrighted work.

