CarlBar

Buildzoid X570 First Look


Posted · Original Poster

I'm still watching this, so I'll update with more details once I've finished. But two quick things before the link.

 

1. He's looking at a publicly released set of images for a board, but confirms he has additional info via official contacts at various board vendors.

 

2. Although he has no information on any specifics of the processors other than core counts, he has had it confirmed that there will be 12- and 16-core CPUs. That's at roughly the 3-minute mark.

 

 

 

 

Not much more to it, actually. The other two main points are some speculation about worst-case 16-core power draw, and that the chipset apparently has overheating issues with some storage configurations.

 

Still, AFAIK this is the first time we've had someone come out and flat-out confirm that 16-core and 12-core parts will be a thing, rather than just unofficial leaks.


I wouldn't call this confirmation of anything; third-hand information is conjecture and nothing more.

 

Still, I'm pretty excited for Zen 2 and X570. I'm saving up right now so that when it drops I can get a board and a 16-core CPU.


Main Rig:-

Ryzen 7 2700X @ 4.2Ghz | Asus ROG Strix X370-F Gaming | 16GB Team Group Dark T-Force 3200Mhz | Samsung 970 Evo 500GB NVMe | Asus Rog Strix Vega 64 8GB OC Edition | Coolermaster Master Air 620P | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Coolermaster Master Box MB520P | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Windows 10 Pro X64 |


28 freakin watts...

 

ON A CHIPSET


PSU Tier List//Graphics card (cooling) tier list//Build Guide Megathread//Motherboard Tier List//Linux Guide//Build logs//Before troubleshoot//Mark Solved//Off Topic//Community standards

Don't forget to quote or mention me

 

Primary PC:

Spoiler

CPU: I5-8600k  @4.5 ghz  GPU: GTX 1070 ti EVGA SC Gaming   RAM: 8+8 3360 mhz DDR4 Trident Z   MOBO: MSI Gaming Pro Carbon AC   HDD: 1 TB 7200 RPM Seagate Baracudda, 1 TB 5400 RPM Samsung ECOGREEN   SSD: Samsung 860 EVO 500 GB   Soundcard: built in   Case: Cooler Master Masterbox Lite 5 RGB   Screen: Salora 40LED1500

 

Secondary PC: Cedar mill

Spoiler

CPU: i3-2130   GPU: Intel HD graphics   RAM: 4+2 GB 1333 mhz DDR3    MOBO: HP H series   HDD: 320 GB WD Black 7200 RPM   PSU: HP 250 watt   Soundcard: built in   Case: Sunbeam Quarterback   Screen: IIyama Prolite T2240MTS, Samsung SyncMaster710N

 

Server: CookieVault

Spoiler

CPU: core2dual E8400   GPU: Intel HD graphics   RAM: 2+1+1+1 gb 1333 mhz ddr3   MOBO: HP Q series   HDD: 4x 1tb 5400 RPM Samsung Spinpoint Ecogreen   Soundcard: built in   Case: Compaq 6000 pro mt   Screen: Samsung SyncMaster710n

 

Laptop: Acer TravelMate 8573t

Spoiler

CPU: I3-2330M   GPU: Intel HD graphics   RAM: 8+2 GB 1333 mhz DDR3   MOBO: Acer   SSD: 250 gb mx500 sata   Soundcard: built in   Case: Acer TravelMate 8573t   Screen: TN 768p

 

Consoles:

Spoiler

PS4 slim glacier white 500 gb, PS4 FTP Special Edition 500 gb, Xbox, 3 DS lites, DSI XL, Gameboy Advanced Color, PS Vita v2, Wii, PS3 500 gb

 


So, the 8c/16t Ryzen shown at CES as an engineering sample, not at final frequency, was drawing 133 watts (full system power) while matching a 9900K that was drawing 180 watts (full system power) in Cinebench. If we assume some basics about the chiplet design, not all of that power would be doubled by adding another 8c chiplet, but we can give that saving back for final-silicon frequency improvements, so the 300 W that Buildzoid mentions as a possibility is a reasonable worst-case assumption. That said, I'd expect the final chip to draw similar or lower power at higher clocks, so a 250 W or 275 W 16c chip seems reasonable to me if that's all you consider.
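The back-of-envelope estimate above can be sketched as a quick script. Only the 133 W CES figure comes from the post; the platform-overhead and clock-headroom numbers are my own assumptions:

```python
# Back-of-envelope worst-case power for a 16-core Zen 2 part,
# based on the CES demo figure quoted above. PLATFORM_OVERHEAD
# and CLOCK_HEADROOM are assumptions, not measured values.
CES_SYSTEM_POWER = 133   # W, full system, one 8c chiplet (CES demo)
PLATFORM_OVERHEAD = 30   # W, assumed non-CPU draw (board, RAM, fans)
CLOCK_HEADROOM = 1.3     # assumed uplift for final-silicon clocks

chiplet_power = CES_SYSTEM_POWER - PLATFORM_OVERHEAD  # ~103 W
# Doubling chiplets roughly doubles core power; I/O and platform
# power stay mostly constant, but higher final clocks eat that back.
worst_case_16c = 2 * chiplet_power * CLOCK_HEADROOM

print(f"Estimated worst-case 16c draw: {worst_case_16c:.0f} W")  # ~268 W
```

With these assumptions the estimate lands in the 250-275 W range suggested above; nudging the overhead or clock assumptions is what gets you toward the 300 W worst case.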

 

However... another interesting note: the 16c Threadripper 1950X is a 180 W part, as is the second-gen 2950X. So under 200 W really seems like a reasonable hope for a finalized 16c Ryzen chip unless the frequency gets pumped more than I expect.

 

I wonder what more they're moving to the chipset to increase its power draw, and whether that will further reduce CPU power draw (or allow higher clocks at the same draw).

Just now, VegetableStu said:

inb4 noctua announces X590 coolers

inb4 ekwb comes with x699 chipset coolers


8 minutes ago, justpoet said:

So, under 200w really seems like a reasonable hope for a finalized Ryzen 16c chip unless the frequency really gets pumped more than I expect.

TDP is kind of irrelevant, as they can have a boost curve depending on the number of active cores, and a really low base clock.


At the end of the day, do we really care whether it's pulling 100, 200, 300W more than before?

 

If it's as good as people hope, then there's even more chance you'll buy one. The box might look pretty, but to get any use out of it you have to power it up regardless of the power consumption.

10 minutes ago, Ethariel01 said:

At the end of the day, do we really care whether it's pulling 100, 200, 300W more than before?

 

If it's as good as people hope then that's even more chance you will buy one, box might look pretty but to get any use out of it you have to power it up regardless of the power consumption.

I mean, no one cares that their 9900K is drawing between 150-200 watts. And when it's limited to 95 watts, in multicore workloads it runs like an R7 2700X.

 

People don't care about power; people care about the side effects, which can all be mitigated.

1 hour ago, GoldenLag said:

I mean, no one cares that their 9900K is drawing between 150-200 watts. And when it's limited to 95 watts, in multicore workloads it runs like an R7 2700X.

Recognising I'm not a typical home user, I do care about power. I effectively have a power budget which limits how much I can run at the same time. While I don't have a 9900K, I did run P95 small FFT and also CB R20 for an indication on my 8086k at stock. CB R20 actually used slightly more power, at 97 W, compared to 90 W for P95. Without adjusting for clocks, 33% extra cores would put 8 cores at around 130 W under similar loading. AFAIK my mobo isn't set to power limit, and I didn't observe clock drops in either run.
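The core-count scaling above works out like this (the 97 W reading is from the post; the linear-scaling assumption is mine, and it ignores fixed uncore draw, so it's slightly pessimistic):

```python
# Linear per-core scaling of the measured 6-core package power
# to guess an 8-core figure at similar clocks and loading.
measured_cores = 6
measured_watts = 97  # W, CB R20 package power on a stock 8086k (from post)

per_core_watts = measured_watts / measured_cores
estimate_8c = 8 * per_core_watts  # assumes power scales with core count

print(f"~{estimate_8c:.0f} W for 8 cores at similar loading")  # ~129 W
```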

 

I don't have similar numbers for Zen on hand, but with the improved FPU I'm really curious where the next gen will fall. Maybe this time I'll not get a budget board so I'll have some chance of higher end overclocking fun too. Chipset power isn't a big concern if it is relatively small compared to the CPU.


Main rig: Asus Maximus VIII Hero, i7-6700k stock, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, Gigabyte Windforce 980Ti, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, HP LP2475W 1200p wide gamut

Gaming system: Asrock Z370 Pro4, i7-8086k stock, Noctua D15, G.Skill TridentZ 3000C14 2x8GB, Asus 1080 Ti Strix OC, Fractal Edison 550W PSU, Corsair 600C, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p 144Hz G-sync

Ryzen rig: Asrock B450 ITX, R5 2600, Noctua D9L, Corsair Vengeance LPX 3000 2x4GB, Vega 56, Corsair CX450M, NZXT Manta, Crucial MX300 525GB, Acer RT280K

VR rig: Asus Z170I Pro Gaming, i7-6600k stock, Silverstone TD03-E, Kingston Hyper-X 2666 2x8GB, Zotac 1070 FE, Corsair CX450M, Silverstone SG13, Samsung PM951 256GB, HTC Vive

Gaming laptop: Asus FX503VD, i5-7300HQ, 2x8GB DDR4, GTX 1050, Sandisk 256GB SSD

Total CPU heating: i7-7800X, 2x i7-6700k, i7-6700HQ, i5-6600k, i5-5675C, i5-4570S, i3-8350k, i3-6100, i3-4360, 2x i3-4150T, E5-2683v3, 2x E5-2650, R7 1700, 1600

Just now, porina said:

Recognising I'm not a typical home user, I do care about power.

And I recognize there are exceptions, but I'll stand by the claim that "no one" really cares about power draw.

 

But it would be nice if manufacturers could provide a peak power draw spec and not some elusive TDP.

6 minutes ago, GoldenLag said:

And I recognize there are exceptions, but I'll stand by the claim that "no one" really cares about power draw.

In a home environment maybe.

 

6 minutes ago, GoldenLag said:

But it would be nice if manufacturers could provide a peak power draw spec and not some elusive TDP.

Not gonna defend Intel's usage of TDP, but again, there's a gap between enthusiast hardware that ignores it and mass-market stuff like Dells that does follow it. Even in Intel's case, the number you're looking for is called PL2 (one of the turbo power levels), but it takes more work to dig that out than TDP.

 

If power is irrelevant for most, then peak power is even more so.


Posted · Original Poster
11 minutes ago, Trixanity said:

The chipset has overheating issues? That seems unlikely unless there's no heatsink on it.

 

There's a heatsink and a fan on it.

 

And apparently that's the recommended reference minimum for the chipset.

55 minutes ago, CarlBar said:

 

There's a heatsink and a fan on it.

 

And apparently that's the recommended reference minimum for the chipset.

Sounds very odd. The TDP of X470 was 4.5W (and X370 should be 6W I think). For it to require a fan I'd assume at least 10W or perhaps even 15W. That seems crazy. And if the fan can't keep it cool it might be even higher. 

14 minutes ago, Trixanity said:

Sounds very odd. The TDP of X470 was 4.5W (and X370 should be 6W I think). For it to require a fan I'd assume at least 10W or perhaps even 15W. That seems crazy. And if the fan can't keep it cool it might be even higher. 

It's supposed to be up to 28 W for the chipset.


Could anyone enlighten me as to why chipsets used to have higher TDPs, then didn't anymore, and now this one does again?

Will it perform other functions that it didn't before?


GAMING RIG CPU: Intel i7-6700K (4.7GHz @ 1.39v) Motherboard: Asus Z170 Pro GPU: Asus GTX 1070 8GB RAM: 32GB Kingston HyperX Fury Hard Drive: WD Black NVMe SSD 512GB Power Supply: XFX PRO 550W  Cooling: Corsair H115i Case: NZXT Switch 810 (white) Operating System: Windows 10 Pro 64bit Monitor: AOC 27" QHD 144Hz Keyboard: Corsair K70 Mouse: Razer DeathAdder Elite Audio: Bose QC25
3 hours ago, Ethariel01 said:

At the end of the day, do we really care whether it's pulling 100, 200, 300W more than before?

Whenever AMD has the advantage with efficiency it's the most important thing ever to all the AMD fanboys.

Whenever Intel has the advantage with efficiency it's the most important thing ever to all the Intel fanboys.

 

In reality, there has to be a balance, and be reasonable.

Let's say there exists a chip which uses 100 watts on average and gets a score of 100 in a benchmark.

If someone releases a chip which uses 200 watts and gets a score of 90, or 110, then that's bad. Really bad.

If another chip uses 200 watts and gets a score of 180, then it's more acceptable.

If a fourth chip comes out and uses 500 watts and gets a score of 500, then that's still pretty bad, because a 500-watt CPU causes a lot of cooling and power delivery issues.

 

There has to be a balance. But when we're talking about somewhat comparable performance and somewhat comparable efficiency then I don't think most people care about a few watts one way or another, (unless it's their favorite brand which is the most efficient).
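The trade-off described above boils down to performance per watt, plus an absolute power ceiling. A minimal sketch, using only the hypothetical (watts, score) pairs from the post:

```python
# Performance-per-watt for the hypothetical chips described above:
# (average watts, benchmark score) pairs taken from the post.
chips = {
    "baseline":      (100, 100),
    "slow & hungry": (200, 110),  # double the power, barely faster: bad
    "fair trade":    (200, 180),  # double the power, near-double score
    "monster":       (500, 500),  # scales linearly, but hard to cool
}

def perf_per_watt(watts: float, score: float) -> float:
    """Benchmark points delivered per watt of average draw."""
    return score / watts

for name, (watts, score) in chips.items():
    print(f"{name:>13}: {perf_per_watt(watts, score):.2f} points/W")
```

Note the "monster" chip matches the baseline at 1.00 points/W; it only loses on the absolute-power criterion (cooling and power delivery), not on efficiency.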

 

 

 

 

17 minutes ago, CiBi said:

Could anyone enlighten me as to why chipsets used to have higher TDPs, then didn't anymore, and now this one does again?

Will it perform other functions that it didn't before?

Back in the old days, we used to have a northbridge on our motherboards which was directly connected to the CPU. Then things like the PCIe bus and RAM were connected to the northbridge.

Then we had a southbridge which was connected to the northbridge. Since the southbridge was further away from the CPU in the chain, it handled lower bandwidth devices such as audio, USB, and stuff like that.

Spoiler

[Diagram: northbridge/southbridge motherboard layout]

 

So the chipsets had a lot to do back in the day, shuffling both PCIe and data from/to RAM.

These days however, both of those things are directly integrated into the CPU. It gives higher throughput and lower latency.

 

But I doubt that they have moved those things off to the chipset again. It would break backwards compatibility, which is something AMD has promised and championed.

Not sure what might have happened to make the northbridge run so hot and power hungry all of a sudden.

41 minutes ago, CiBi said:

Could anyone enlighten me as to why chipsets used to have higher TDPs, then didn't anymore, and now this one does again?

Will it perform other functions that it didn't before?

I'd have to guess PCIe 4.0 could be a factor in it going up again, assuming Zen 2 CPUs, when paired with a 500-series chipset, can communicate with each other at PCIe 4.0 speeds instead of the current 3.0 speeds. That could allow more connectivity off the chipset, and consume more power that way.

 

44 minutes ago, LAwLz said:

If fourth chip comes out and uses 500 watts and gets a score of 500, then that's still pretty bad because a 500 watt CPU causes a lot of cooling and power delivery issues.

If we limit ourselves to the home market, maybe. But datacentres would be all over that scenario. They might not gain efficiency, but they would gain density: it doesn't matter whether you're removing 500 W of heat from one CPU or from 5x 100 W CPUs. We see that to an extent in HEDT too. Sometimes you just want the fastest single box, and multiple slower boxes don't cut it.


53 minutes ago, LAwLz said:

Back in the old days, we used to have a northbridge on our motherboards which was directly connected to the CPU. Then things like the PCIe bus and RAM were connected to the northbridge.

[...]

But I doubt that they have moved those things off to the chipset again. It would break backwards compatibility, which is something AMD has promised and championed.

Not sure what might have happened to make the northbridge run so hot and power hungry all of a sudden.

Then isn't what we call "the chipset" now mostly what used to be the southbridge? And didn't that need less cooling than the northbridge?

Anyway, it gives us a reason to watercool the chipset 😁



He missed mentioning the nForce series of chipsets (except the 900 series and some others), which ran really hot.

 

BUT: whether the power consumption is good or not depends on the features of the chipset, on what's inside.

One thing is clear: PCIe 4.0 is not cheap, and it might be the reason this chipset gets so warm...

 

 

4 hours ago, LukeSavenije said:

28 freakin watts...

ON A CHIPSET

I can give you one with 30W:

https://ark.intel.com/content/www/us/en/ark/products/35143/intel-82x48-memory-controller-hub.html


"Hell is full of good meanings, but Heaven is full of good works"

2 hours ago, Trixanity said:

The chipset has overheating issues? That seems unlikely unless there's no heatsink on it.

He hinted that it could happen under heavy load on M.2 SSDs.

So the TDP is probably because of PCIe 4.0, and if you don't use (much) PCIe 4.0 from the chipset you might be fine...

Still, we don't know what the chipset has integrated: how much PCIe 4.0 bandwidth goes in, how many PCIe 4.0 lanes the chipset provides, and how many SATA ports and other goodies are included.

Just throwing a number at the wall, without context...

Remember: the X48 had a 30 W TDP.


"Hell is full of good meanings, but Heaven is full of good works"

Link to post
Share on other sites
