
Buildzoid X570 First Look

CarlBar

I'm still watching this, so I'll update with more details once I've finished. But two quick things before the link.

 

1. He's looking at a publicly released set of images for a board, but confirms he has additional info via official contacts at various board vendors.

 

2. Although he has no information on the specifics of the processors other than core counts, he has had it confirmed to him that there will be 12- and 16-core CPUs. That's at roughly the 3-minute mark.

 

 

 

 

Not much more to it, actually. The other two main points are some speculation about worst-case 16-core power draw, and that the chipset apparently has overheating issues with some storage configurations.

 

Still, AFAIK this is the first time we've had someone come out and flat-out confirm that 16-core and 12-core parts will be a thing, rather than leaks from unofficial sources.


I wouldn't call this confirmation of anything; third-hand information is conjecture and nothing more.

 

Still, I'm pretty excited for Zen 2 and X570. I'm saving right now so that when it drops I can get a board and a 16-core CPU.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


So, the 8c/16t Ryzen shown at CES, an engineering sample not at final frequency, was drawing 133 watts (full system power) when completing Cinebench in line with a 9900K drawing 180 watts (full system power). If we assume some basics about the chiplet design, not all of that power would double when adding another 8-core chiplet, but we give that back for final-form frequency enhancements, so the 300 W that Buildzoid mentions as a possibility is a reasonable worst-case assumption. But I'd expect the final part to draw similar or lower wattage at faster clock speeds, so a 250 W or 275 W 16-core chip seems somewhat reasonable to me if that's all you consider.
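For what it's worth, the estimate above can be sketched as back-of-envelope arithmetic. The 50 W platform overhead and 1.25x frequency uplift below are my guessed assumptions, not figures from the video:

```python
# Back-of-envelope for the 16-core worst case discussed above.
# full_system_8c is the CES full-system measurement; platform_overhead
# and freq_uplift are guessed assumptions, not measured numbers.

full_system_8c = 133      # W, CES engineering sample running Cinebench
platform_overhead = 50    # W, assumed non-CPU draw (board, RAM, fans, PSU loss)
freq_uplift = 1.25        # assumed headroom for final-silicon clocks

cpu_8c = full_system_8c - platform_overhead   # ~83 W attributed to the CPU
cpu_16c = cpu_8c * 2 * freq_uplift            # second chiplet plus higher clocks
print(f"Rough 16-core CPU estimate: {cpu_16c:.0f} W")
```

Depending on how much overhead you assume, that lands anywhere from under 200 W to the 250-300 W range, which is why the guesses in this thread vary so much.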

 

However... another interesting note is that the Threadripper 1950X 16-core chip is 180 W, as is the 2950X gen-2 Threadripper. So, under 200 W really seems like a reasonable hope for a finalized 16-core Ryzen chip unless the frequency really gets pumped more than I expect.

 

I wonder what more they're moving to the chipset to increase its power draw, and whether that will further reduce CPU power draw (or allow for higher clocks at the same draw).


Just now, VegetableStu said:

inb4 noctua announces X590 coolers

inb4 ekwb comes with x699 chipset coolers


8 minutes ago, justpoet said:

So, under 200 W really seems like a reasonable hope for a finalized 16-core Ryzen chip unless the frequency really gets pumped more than I expect.

TDP is kinda irrelevant, as they can have a boost curve depending on the number of active cores, and a really low base clock.


At the end of the day, do we really care whether it's pulling 100, 200, 300W more than before?

 

If it's as good as people hope, then there's even more chance you'll buy one. The box might look pretty, but to get any use out of it you have to power it up regardless of the power consumption.


10 minutes ago, Ethariel01 said:

At the end of the day, do we really care whether it's pulling 100, 200, 300W more than before?

 

If it's as good as people hope, then there's even more chance you'll buy one. The box might look pretty, but to get any use out of it you have to power it up regardless of the power consumption.

I mean, no one cares that their 9900K is drawing between 150 and 200 watts. And when it's limited to 95 watts, in multi-core workloads it runs like an R7 2700X.

 

People don't care about power; they care about the side effects, which can all be mitigated.


1 hour ago, GoldenLag said:

I mean, no one cares that their 9900K is drawing between 150 and 200 watts. And when it's limited to 95 watts, in multi-core workloads it runs like an R7 2700X.

Recognising I'm not a typical home user, I do care about power. I effectively have a power budget which limits how much I can run at the same time. While I don't have a 9900K, I did run P95 small FFT and also CB R20 for an indication on my 8086K at stock. Actually, CB R20 used slightly more power, at 97 W compared to P95's 90 W. Without adjusting for clocks, 33% extra cores would put 8 cores at around 130 W under similar loading. AFAIK my mobo isn't set for power limiting, and I didn't observe clock drops in either of those.
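That scaling is just linear core-count arithmetic, assuming power scales with active cores and ignoring the fixed uncore/idle portion (so it slightly overestimates):

```python
# Linear core-count scaling of the measured Cinebench R20 power.
# Assumes power is proportional to active cores at fixed clocks and
# ignores the fixed uncore/idle share, so the result is a slight overestimate.

measured_watts = 97   # 6-core 8086K, CB R20, stock
measured_cores = 6
target_cores = 8      # 33% more cores

estimate = measured_watts * target_cores / measured_cores
print(f"Estimated {target_cores}-core draw: {estimate:.0f} W")  # ~129 W
```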

 

I don't have similar numbers for Zen on hand, but with the improved FPU I'm really curious where the next gen will fall. Maybe this time I'll not get a budget board so I'll have some chance of higher end overclocking fun too. Chipset power isn't a big concern if it is relatively small compared to the CPU.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


Just now, porina said:

Recognising I'm not a typical home user, I do care about power.

And I recognize there are exceptions, but I will stand by my claim that "no one" really cares about power draw.

 

But it would be nice if manufacturers could provide a peak power draw spec and not some elusive TDP.


The chipset has overheating issues? That seems unlikely unless there's no heatsink on it.


6 minutes ago, GoldenLag said:

And I recognize there are exceptions, but I will stand by my claim that "no one" really cares about power draw.

In a home environment maybe.

 

6 minutes ago, GoldenLag said:

But it would be nice if manufacturers could provide a peak power draw spec and not some elusive TDP.

Not gonna defend Intel's usage of TDP, but again we have a gap between enthusiast hardware that ignores it, and the mass market stuff like Dells that do follow it. Even in Intel's case, the number you're looking for is called PL2 (one of the turbo power levels), but it takes more work to dig that out than TDP.

 

If power is irrelevant for most, then peak power is even more so.
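For reference, Intel's commonly documented defaults make that concrete for a 95 W part like the 9900K: PL2 is typically recommended as roughly 1.25x PL1 with a 28-second window, though boards are free to (and often do) override all three values:

```python
# Intel's recommended turbo power levels for a 95 W TDP part.
# PL1 is the sustained limit (~TDP); PL2 is the short-term limit; Tau is
# the time window over which average power must fall back to PL1.
# The 1.25x multiplier and 28 s window are Intel's commonly cited defaults;
# enthusiast boards routinely raise or remove them entirely.

tdp = 95                  # W, e.g. i9-9900K
pl1 = tdp                 # sustained limit
pl2 = tdp * 1.25          # recommended short-term limit, ~119 W
tau = 28                  # seconds

print(f"PL1={pl1} W, PL2={pl2:.0f} W, Tau={tau} s")
```

Which is exactly the gap being described: the spec sheet says 95 W, but out of the box most boards let the chip sit at PL2 or above indefinitely.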


11 minutes ago, Trixanity said:

The chipset has overheating issues? That seems unlikely unless there's no heatsink on it.

 

There's a heatsink and a fan on it.

 

And apparently that's the recommended reference minimum for the chipset.


55 minutes ago, CarlBar said:

 

There's a heatsink and a fan on it.

 

And apparently that's the recommended reference minimum for the chipset.

Sounds very odd. The TDP of X470 was 4.5W (and X370 should be 6W I think). For it to require a fan I'd assume at least 10W or perhaps even 15W. That seems crazy. And if the fan can't keep it cool it might be even higher. 


Wonder if any X570 or B550 mini-ITX mobo will have the VRM to support the 12- and 16-core CPUs.


14 minutes ago, Trixanity said:

Sounds very odd. The TDP of X470 was 4.5W (and X370 should be 6W I think). For it to require a fan I'd assume at least 10W or perhaps even 15W. That seems crazy. And if the fan can't keep it cool it might be even higher. 

Supposed to be up to 28 W on the chipset.


Could anyone enlighten me as to why chipsets used to have higher TDPs, then they didn't any more, and now this one does again?

Will it perform other functions that it didn't before?

GAMING PC CPU: AMD 3800X Motherboard: Asus STRIX X570-E GPU: GIGABYTE RTX 3080 GAMING OC RAM: 16GB G.Skill 3600MHz/CL14  PSU: Corsair RM850x Case: NZXT MESHIFY 2 XL DARK TG Cooling: EK Velocity + D5 pump + 360mm rad + 280mm rad Monitor: AOC 27" QHD 144Hz Keyboard: Corsair K70 Mouse: Razer DeathAdder Elite Audio: Bose QC35 II
WHAT MY GF INHERITED CPU: Intel i7-6700K (4.7GHz @ 1.39v) Motherboard: Asus Z170 Pro GPU: Asus GTX 1070 8GB RAM: 32GB Kingston HyperX Fury Hard Drive: WD Black NVMe SSD 512GB Power Supply: XFX PRO 550W  Cooling: Corsair H115i Case: NZXT H700 White

3 hours ago, Ethariel01 said:

At the end of the day, do we really care whether it's pulling 100, 200, 300W more than before?

Whenever AMD has the advantage with efficiency it's the most important thing ever to all the AMD fanboys.

Whenever Intel has the advantage with efficiency it's the most important thing ever to all the Intel fanboys.

 

In reality, there has to be a balance, and you have to be reasonable.

Let's say there exists a chip which uses 100 watts on average and gets a score of 100 in a benchmark.

If someone releases a chip which uses 200 watts and gets a score of 90, or even 110, then that's bad. Really bad.

If another chip uses 200 watts and gets a score of 180, then it's more acceptable.

If a fourth chip comes out and uses 500 watts and gets a score of 500, then that's still pretty bad, because a 500-watt CPU causes a lot of cooling and power delivery issues.

 

There has to be a balance. But when we're talking about somewhat comparable performance and somewhat comparable efficiency then I don't think most people care about a few watts one way or another, (unless it's their favorite brand which is the most efficient).
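Scoring those hypothetical chips on performance per watt makes the point clearer: the 500-watt chip matches the baseline's efficiency, so its problem is purely the absolute wattage. (These are the made-up numbers from above, not real parts.)

```python
# Performance per watt for the hypothetical chips described above.
# Each entry is (watts, benchmark score); both values are illustrative.
chips = {
    "baseline (100 W)":   (100, 100),
    "bad (200 W)":        (200, 110),
    "acceptable (200 W)": (200, 180),
    "dense (500 W)":      (500, 500),
}

for name, (watts, score) in chips.items():
    print(f"{name:>20s}: {score / watts:.2f} points/W")
```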

 

 

 

 

17 minutes ago, CiBi said:

Could anyone enlighten me as to why chipsets used to have higher TDPs, then they didn't any more, and now this one does again?

Will it perform other functions that it didn't before? 

Back in the old days, we used to have a northbridge on our motherboards which was directly connected to the CPU. Then things like the PCIe bus and RAM were connected to the northbridge.

Then we had a southbridge which was connected to the northbridge. Since the southbridge was further away from the CPU in the chain, it handled lower bandwidth devices such as audio, USB, and stuff like that.

Spoiler: [diagram of a classic motherboard layout, with the northbridge and southbridge]

 

So the chipsets had a lot to do back in the day, shuffling both PCIe and data from/to RAM.

These days, however, both of those things are integrated directly into the CPU, which gives higher throughput and lower latency.

 

But I doubt that they have moved those things off to the chipset again. It would break backwards compatibility, which is something AMD has promised and championed.

Not sure what might have happened to make the northbridge run so hot and power hungry all of a sudden.


41 minutes ago, CiBi said:

Could anyone enlighten me as to why chipsets used to have higher TDPs, then they didn't any more, and now this one does again?

Will it perform other functions that it didn't before?

I'd have to guess PCIe 4.0 could be a factor in it going up again, assuming Zen 2 CPUs paired with a 500-series chipset can communicate with each other at PCIe 4.0 speeds instead of the current 3.0 speeds. This would allow more connectivity off the chipset, and consume more power that way.
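The bandwidth side of that guess is a straight doubling of the transfer rate. Per lane, with the 128b/130b encoding both generations use:

```python
# Per-lane PCIe throughput: why a PCIe 4.0 chipset link doubles bandwidth.
def lane_gb_per_s(transfer_rate_gt: float) -> float:
    """GB/s per lane: GT/s scaled by 128b/130b encoding, 8 bits per byte."""
    return transfer_rate_gt * (128 / 130) / 8

print(f"PCIe 3.0: {lane_gb_per_s(8):.3f} GB/s per lane")   # ~0.985
print(f"PCIe 4.0: {lane_gb_per_s(16):.3f} GB/s per lane")  # ~1.969
```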

 

44 minutes ago, LAwLz said:

If fourth chip comes out and uses 500 watts and gets a score of 500, then that's still pretty bad because a 500 watt CPU causes a lot of cooling and power delivery issues.

If we limit ourselves to the home market, maybe. But datacentres would be all over that scenario. They might not gain efficiency, but they will gain density. It doesn't matter if you're removing 500W from one CPU or 5x 100W CPUs. We kinda see that to an extent in HEDT also. Sometimes you just want the fastest single box, and multiple slower boxes don't cut it.


53 minutes ago, LAwLz said:

Back in the old days, we used to have a northbridge on our motherboards which was directly connected to the CPU. Then things like the PCIe bus and RAM were connected to the northbridge.

Then we had a southbridge which was connected to the northbridge. [...]

These days however, both of those things are directly integrated into the CPU. [...]

Not sure what might have happened to make the northbridge run so hot and power hungry all of a sudden.

Then isn't what we call "the chipset" now mostly what used to be the southbridge? And didn't that need less cooling than the northbridge?
Anyway, it gives us a reason to watercool the chipset.


He missed mentioning the nForce series of chipsets (except the 900 series and some others), which ran really hot.

 

BUT: whether the power consumption is good or not depends on the features of the chipset, on what's inside.

One thing is clear: PCIe 4.0 is not cheap and might be the reason this chipset gets so warm...

 

 

4 hours ago, LukeSavenije said:

28 freakin watts...

ON A CHIPSET

I can give you one with 30W:

https://ark.intel.com/content/www/us/en/ark/products/35143/intel-82x48-memory-controller-hub.html

"Hell is full of good meanings, but Heaven is full of good works"


2 hours ago, Trixanity said:

The chipset has overheating issues? That seems unlikely unless there's no heatsink on it.

He hinted that it could happen when there is heavy load on M.2 SSDs.

So the TDP is probably because of PCIe 4.0, and if you don't use (much) PCIe 4.0 from the chipset you might be fine...


Still: we don't know what the chipset has integrated, how many PCIe 4.0 lanes go in, how many PCIe 4.0 lanes the chipset provides, and how many SATA ports and other goodies are included.

 

Just throwing a number at the wall, without context...

 

Remember: the X48 had a 30 W TDP.


3 minutes ago, Stefan Payne said:

Scraping the bottom of the barrel with an 11-year-old chipset; we're talking over a decade old technology, and it can hardly be compared to technology today. If anything, it makes them look worse for hitting 28 W.

 

Come on Stefan, you're better than that


My old X48 chipset even has its own dedicated IHS, and it looks so freaking awesome! They need to add an IHS to those X570 chips so they look just as awesome, or maybe add RGB to bring it up to date. And all of a sudden the price of boards goes up by $50.

 

 

[photo: Intel X48 chipset with its dedicated IHS]

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 

