Intel’s Entire 10th Gen Comet Lake Desktop CPU Lineup Leaked + Intel 400-Series Platform & LGA 1200 Socket Detailed

11 minutes ago, mr moose said:

 

I am pretty sure he was being sarcastic, but it is hard to tell with some of the comments this forum throws up regarding new products etc.  Poe's law I guess.

I usually look for the /s ?

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston HyperX 32GB 3200MHz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


11 hours ago, dizmo said:

 

People that have an LGA1151 based PC probably aren't looking at upgrading anyway. So it's the people who have older systems, and would be upgrading it regardless, or those who are buying new systems. I don't know why people think the only market is people who just bought a PC 9_9

Fair point, but I would normally expect a clear benefit to justify a new socket, e.g. moving to PCIe 4.0, whereas here the main benefit, as far as I can tell, is more cores. That creates extra cost for the end user, since the new-socket boards will come at a premium.


17 hours ago, mr moose said:

I am pretty sure he was being sarcastic, but it is hard to tell with some of the comments this forum throws up regarding new products etc.  Poe's law I guess.

I thought the part where I referred to a computer product as my silicon penis would be /s enough, but I suppose it's really not that far off from some genuine comments around here.


18 hours ago, VegetableStu said:

so far their excuse is more power pins, but eh I might give them that pass since they've done the same for gen 8 and 9. I'd expect their adoption of PCIe 4 or 5 would require a new socket topology

 

also PCIe 4 isn't being fully utilised by GPUs yet ._. the ones that benefit more from the higher bandwidth are SSDs, so unless Nvidia and AMD decides to bring storage closer to their GPU core (DAMMIT AMD TRICKLE DOWN THE SSG ALREADY) there's noooooott much

(TechPowerUp relative-performance chart, 3840×2160)

https://www.techpowerup.com/review/pci-express-4-0-performance-scaling-radeon-rx-5700-xt/23.html

 

 

I gave them a pass simply because they never promised anyone anything different; it's par for the course given their product history, and they haven't exactly done anything that would leave people high and dry on an old platform. The whole argument that the socket not being supported is anti-consumer is just illogical. No one claimed AMD's lack of DDR2 support on AM3+ was anti-consumer (for good reason).

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


I have nothing bad to say about Intel. Every Intel processor/product I own is doing its job very well, and I haven't even considered buying AMD processors.


8 minutes ago, D.U.F.F. said:

I have nothing bad to say about Intel. Every Intel processor/product I own is doing its job very well, and I haven't even considered buying AMD processors.

No one's saying that they can't do their job just as well as AMD's ones. It's just that AMD's stuff is better value for money.

Hand, n. A singular instrument worn at the end of the human arm and commonly thrust into somebody’s pocket.


On 11/3/2019 at 2:10 AM, MichaelWd said:

Can anyone help me understand what would be driving the reported 8% increase in performance? Same size, same pcie, is it just due to more cores? Or a slightly higher clock?

Perhaps it is from the hardware Spectre mitigations being more efficient than the software ones.


21 hours ago, VegetableStu said:

so far their excuse is more power pins, but eh I might give them that pass since they've done the same for gen 8 and 9. I'd expect their adoption of PCIe 4 or 5 would require a new socket topology

 

also PCIe 4 isn't being fully utilised by GPUs yet ._. the ones that benefit more from the higher bandwidth are SSDs, so unless Nvidia and AMD decides to bring storage closer to their GPU core (DAMMIT AMD TRICKLE DOWN THE SSG ALREADY) there's noooooott much

(TechPowerUp relative-performance chart, 3840×2160)

https://www.techpowerup.com/review/pci-express-4-0-performance-scaling-radeon-rx-5700-xt/23.html

 

This suggests that they could switch to 8x PCI-E 4.0 slots to conserve PCI-E lanes.

 

At this point, I have given up on seeing all 16x slots on consumer motherboards, so aiming for all 8x is probably the next best thing. Only GPUs use 16x slots as far as I have seen. At the very least, I can hope.

 


2 hours ago, VegetableStu said:

put it this way, if no one did it for PCIe 2.0/3.0... ._.

 

EDITx1: unless you count x16x0/x8x8 boards, iunno

EDITx2: actually hold up, there's one recent board that did another dream setup:

 

the Asus WS X570 had a secondary 3.0 x8 lane from the chipset, which had a PCIe 4.0 x4 connection remuxed into a 3.0 x8 connection. needless to say no one has done this for CPU lanes itself so far

If they switched to 8x slots for PCI-E 3.0, then people putting the cards into PCI-E 2.0 slots would have seen a performance drop from 8x PCI-E 2.0 behaving like 16x PCI-E 1.0 slots.

 

Switching now would give the PCI-E 3.0 slots a 1% performance decrease according to the numbers posted, so it seems feasible to do without really affecting people with PCI-E 3.0 slots.
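The gen-for-gen halving argued above can be sanity-checked against the published per-lane rates. A minimal sketch (theoretical link bandwidth only, ignoring protocol overhead beyond line encoding; real-world throughput is a bit lower):

```python
# Per-lane usable bandwidth in GB/s, derived from the published
# signaling rates: PCIe 1.x/2.x use 8b/10b encoding (80% efficient),
# PCIe 3.0/4.0 use 128b/130b (~98.5% efficient).
PER_LANE_GBPS = {
    "1.0": 2.5 * (8 / 10) / 8,     # 0.25 GB/s
    "2.0": 5.0 * (8 / 10) / 8,     # 0.50 GB/s
    "3.0": 8.0 * (128 / 130) / 8,  # ~0.985 GB/s
    "4.0": 16.0 * (128 / 130) / 8, # ~1.97 GB/s
}

def link_bw(gen: str, lanes: int) -> float:
    """One-direction link bandwidth in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# An x8 Gen2 link carries the same bandwidth as an x16 Gen1 link,
# which is why an x8 Gen3 card dropped into a Gen2 slot behaves like
# x16 Gen1 -- and why an x8 Gen4 slot matches an x16 Gen3 slot.
print(link_bw("2.0", 8), link_bw("1.0", 16))   # both 4.0 GB/s
print(link_bw("4.0", 8), link_bw("3.0", 16))   # both ~15.75 GB/s
```

So the ~1% hit at x8 Gen3 seen in the TechPowerUp numbers would indeed be the only cost for current boards, while x8 Gen4 loses nothing relative to x16 Gen3.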


2 hours ago, VegetableStu said:

the Asus WS X570 had a secondary 3.0 x8 lane from the chipset, which had a PCIe 4.0 x4 connection remuxed into a 3.0 x8 connection. needless to say no one has done this for CPU lanes itself so far

Quote

AMD X570 chipset
1 x PCIe 4.0 x16 (x8 mode)
1 x PCIe 4.0 x1

from the Asus website, it appears to be 8x lanes of 4.0

 

probably stole some lanes from an M.2 slot.

 

effective speed to the CPU is still the same as 8x 3.0 tho, so you aren't far from wrong.

 

edit: AFAIK there are no PCIe 4.0 to 2x 3.0 splitters like there are for 3.0 to 2.0.

 

not to mention the 3.0 splitters already cost a ton, making them a bad deal for most hardware.


3 hours ago, WereCatf said:

No one's saying that they can't do their job just as well as AMD's ones. It's just that AMD's stuff is better value for money.

Me to Intel ....... shut up and take my money.
AMD ..... not until hell freezes


Just now, D.U.F.F. said:

AMD ..... not until hell freezes

Finland already thinks hell is a frozen hellscape.

 

so that's already the case, btw.


8 minutes ago, GoldenLag said:

Finland already thinks hell is a frozen hellscape.

Ever played Frostpunk? It's clearly modeled after the Finnish winter.



1 hour ago, ryao said:

If they switched to 8x slots for PCI-E 3.0, then people putting the cards into PCI-E 2.0 slots would have seen a performance drop from 8x PCI-E 2.0 behaving like 16x PCI-E 1.0 slots.

 

Switching now would give the PCI-E 3.0 slots a 1% performance decrease according to the numbers posted, so it seems feasible to do without really affecting people with PCI-E 3.0 slots.

Just make all PCIe 4.0 slots open-ended, make the GPUs x16 electrical, and make the GPU slots on PCIe 4.0 motherboards x8 physical or just electrical. There you have backwards compatibility with previous standards at the maximum bandwidths while not over-provisioning PCIe 4.0 lanes that can be used in other places.


The socket is also what gets me here.

 

Intel really are shooting themselves in the foot a bit, I feel. Maybe I'm being stupid and missing some technical detail, but if it's basically the same chip being rehashed, why does almost every new desktop generation need a new board, other than to screw the consumer?

They're practically forcing me to wait for a big enough step in performance or pushing me to AMD at some point.

I think that I would have happily bought a new Intel chip in the last few years if I didn't have to also get a new motherboard and practically rebuild the whole machine to upgrade.


19 minutes ago, WereCatf said:

Ever played Frostpunk? It's clearly modeled after the Finnish winter.

Nope, I just knew someone Finnish not long back.


 

On 11/3/2019 at 2:36 PM, Waffles13 said:

We all know that Nvidia and Intel only ever announce products for the sole purpose of screwing over the people that bought the previous generation. 

 

If they really cared, they would slow down all product development and generational advancement of technology just so that we didn't need to feel mildly upset when we only get 12-18 months with the biggest silicon penis. 

 

On 11/3/2019 at 9:30 PM, dizmo said:

Yeah, let's slow down the advancement of the human race so that people don't feel bad about their purchases. 9_9

 

55 minutes ago, SADS said:

They're practically forcing me to wait for a big enough step in performance or pushing me to AMD at some point.

Yes, both, SADS.  I'm planning to go with AMD for my next build (when DDR5 or DDR6 comes out).

And I WANT to see significant per-generation performance bumps.  From what I understand, the 8086-to-286 and 486-to-Pentium transitions each brought a 2x or more jump in IPC.  Also, my dad's $100 (Oct 1995) 486DX4-120 was, if my estimation of MIPS is in the ballpark, about 78.75x faster than his ~$300-or-so (Jan 1989) 286-10.  I bought my 4790K in Jan 2015 (for $330), and Oct 2021 would be the same time interval...

 

There are several other criteria I'm looking for before I upgrade (thereby getting a new mobo), but I'll mention 2 here.

  1.  New CPU encodes HEVC (H.265), 3840x2160, q=0, --keyint=1 video, at least as fast as my existing 4790K encodes 320kbps, 44.1kHz, stereo, q=0, checksums mp3.  In a test I did a few months ago with those settings, the 4790K...
          a.  took about 2 minutes to encode about 2 hours of audio (source was 44kHz 16-bit stereo Wave), using a version of Lame that supported multi-threading.
          b.  took about FOUR DAYS(!!) to encode a 4-MINUTE video!  Source was an H.264 4K video shot on my Panasonic FZ1000, software was Handbrake.
          c.  (Btw how do you do tabs and font size on mobile?)
  2. The slowest Celeron-Y or Atom-Y from the new CPU's generation, at BASE clock (no turbo; bonus points if on 1 core with HT off), is faster than the fastest highest-socket-count Xeon E7 from the same generation (Haswell) as my 4790K that would be up for replacement.
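For what it's worth, the gap described in 1a/1b works out like this (a back-of-envelope calculation using the approximate timings quoted above):

```python
# Realtime encode factors from the rough timings above.
audio_src_s = 2 * 3600        # ~2 hours of source audio
audio_enc_s = 2 * 60          # encoded in ~2 minutes
video_src_s = 4 * 60          # ~4-minute 4K clip
video_enc_s = 4 * 24 * 3600   # encoded in ~4 days

audio_speed = audio_src_s / audio_enc_s  # 60x realtime
video_speed = video_src_s / video_enc_s  # ~0.0007x realtime (1/1440)

print(f"audio: {audio_speed:.0f}x realtime")
print(f"video: 1/{video_enc_s // video_src_s}x realtime")
print(f"gap:   ~{audio_speed * video_enc_s / video_src_s:,.0f}x")  # ~86,400x
```

In other words, criterion 1 asks a future CPU to close roughly a five-orders-of-magnitude throughput gap between those two workloads.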

 

 

55 minutes ago, SADS said:

I think that I would have happily bought a new Intel chip in the last few years if I didn't have to also get a new motherboard and practically rebuild the whole machine to upgrade.

Yes!  I would have liked to drop an i9-9900K or Ryzen 7 3700X into my ASRock Z97 Extreme6.  (As for why I mentioned an AMD chip... I'm old enough to remember a time when you could drop AMD AND Intel CPUs into the same socket, even if it was during my childhood, so I wasn't actually building any PCs then.)

Also, I was hoping for a die-shrunk-from-14nm, more-core CPU to replace the i7-6700K in my laptop.  (Chipset is Z170, and Clevo, AFAIK, has put out ZERO BIOS updates, not even for Spectre.)  If I could have a 10nm 12- or 16-core LGA1151v1 CPU with no iGPU (the P750DM-G requires a dGPU anyway; it would also be nice to upgrade that MXM 970M 6GB to something like a 3060 or 3070)...

 

 

On 11/3/2019 at 12:28 PM, Waffles13 said:

shift to AM5 (or whatever they call it) in time for DDR5 to hit mainstream in mid/late 2022. Depending on how slow DDR5 rolls out,

 

4 hours ago, ryao said:

At this point, I have given up on seeing all 16x slots on consumer motherboards,

Yeah ... A few things I'd like to see going forward, starting with the next generation...

  • ALL RAM would be ECC.  No more denying that feature in basic consumer boards.  (With some of the data security issues, etc, I can't see how ECC RAM would hurt anything, other than profit margins maybe.)
  • ALL boards would support high-density registered ECC, and anything over $80 at non-sale price would also support LR-DIMMs.
  • Subsequent RAM generations (and CPU sockets across the entire stack, from Atom/Athlon to Xeon Platinum/Epyc) would be mechanically and electrically compatible, like PCI Express, SATA, 4-pin Molex power, USB type A, etc, have been for decades or more.  (Bonus points for out-lasting the QWERTY keyboard layout, which is still the standard today and was invented in, I think, 1873.)
  • Count of PCIe, HSIO lanes, etc, would be based on motherboard form factor, not chipset or artificial market segmentation.  Mini-ITX would get relatively few lanes, Micro ATX would have at least 4 x16 PCIe slots, standard ATX would have at least 7 x16 slots, etc.  EATX would be dual-CPU with 16 DIMM slots minimum.
  • (I think having full-ATX H310 boards, and mini-ITX LGA2066 or LGA3647 or Micro ATX TR4, is out of place / backwards.)
  • Speaking of EATX, it would be the same size as SSI EEB.  No more saying cases are compatible with EATX when they only accept motherboards up to about 10.5" wide or so (similar to SSI CEB) - I think that should be renamed to something like "ATX+".  (A couple weeks ago, someone avoided getting the wrong case for their dual-socket EATX board.  They were maybe going to try to put a Supermicro X8DTH-6F into a Corsair 750D.)

 

 


14 hours ago, WereCatf said:

Ever played Frostpunk? It's clearly modeled after the Finnish winter.

-- Some picture --


3 hours ago, D.U.F.F. said:

-- Some picture --

What about it?



3 hours ago, WereCatf said:

What about it?

Trying to make a point: Finland's winter is not that cold. Maybe in January-February there are a few days of almost -30 degrees Celsius. Otherwise it's a normal "tropical" winter.


Just now, D.U.F.F. said:

Trying to make a point: Finland's winter is not that cold. Maybe in January-February there are a few days of almost -30 degrees Celsius. Otherwise it's a normal "tropical" winter.

First of all, no one claimed that the entire winter period is like that. It's not like one day we have +15°C and then someone flips a switch and it's suddenly -30°C. It doesn't work like that anywhere on the planet. Winter's still just coming, so obviously it's not going to be that cold yet.

 

Secondly, it was a joke.



5 minutes ago, D.U.F.F. said:

Trying to make a point: Finland's winter is not that cold. Maybe in January-February there are a few days of almost -30 degrees Celsius. Otherwise it's a normal "tropical" winter.

that wasn't the main point tho...

 

it's that, in Finnish belief, hell is a frozen hellscape.

 

also, while the Nordic winter is cold, it varies widely depending on where and when.


5 minutes ago, GoldenLag said:

also, while the Nordic winter is cold, it varies widely depending on where and when.

Just a silly anecdote, but anyway: my Aussie buddy was aghast when I mentioned last winter how we hit -33°C for a week or so. His freezer only goes down to -25°C and even that is quite literal death to him. He was laughing about how I can crawl into the freezer to warm up after having been outdoors.



Just now, WereCatf said:

Just a silly anecdote, but anyway: my Aussie buddy was aghast when I mentioned last winter how we hit -33°C for a week or so. His freezer only goes down to -25°C and even that is quite literal death to him.

yeah, hitting below -20°C is pretty awful. it's no longer just a bit cold, like with -10 to -15°C. it's very cold

 

glad I don't live anywhere it usually gets that cold, and when it does, it's when the weather is clear and there's no wind.

