Ryzen Threadripper Delidding

First off, damn! That CPU did not want its heat spreader removed.

Glad to see that Threadripper uses solder; I'm not surprised, though.

What was surprising was that Threadripper has four dies.

That would mean that Threadripper is probably a reused EPYC CPU.  

Does this mean that we could see up to 32 cores on Threadripper if AMD wanted to?
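
For the arithmetic behind that question, a minimal sketch, assuming each die on the package is the same 8-core Zeppelin die used in Ryzen and EPYC:

```python
# Minimal core-count arithmetic, assuming each die on the package is the
# 8-core Zeppelin die (2 CCXs x 4 cores) used in Ryzen and EPYC.
CORES_PER_DIE = 8

def max_cores(active_dies: int) -> int:
    """Upper bound on cores if every core on every active die is enabled."""
    return active_dies * CORES_PER_DIE

print(max_cores(2))  # 16 -> the announced 1950X, two dies active
print(max_cores(4))  # 32 -> the ceiling if AMD ever lit up all four dies
```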


Just now, tom_w141 said:

I can't find it now. You might be OK if it got moved.

I didn't know whether this would be more appropriate in Tech News and Reviews or CPUs, Motherboards, and Memory.


Good to see AMD continuing to use solder instead of pulling an Intel and using subpar TIM.

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


36 minutes ago, Cinnabar Sonar said:

People did? It seems perfectly logical to me.

It makes sense considering two of the dies are not functioning. If it were four functioning dies, there would be 128 PCIe lanes instead of 64; that was one of the reasons I thought Threadripper would be two dies.
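
A quick sanity check on that lane math, as a sketch; the 32-lanes-per-die figure is implied by EPYC's 128 lanes over four dies and Threadripper's 64 over two:

```python
# Quick check on the PCIe lane math: EPYC exposes 128 lanes from 4 active
# dies and Threadripper 64 from 2, which implies roughly 32 lanes per die.
LANES_PER_ACTIVE_DIE = 32

def total_lanes(active_dies: int) -> int:
    return active_dies * LANES_PER_ACTIVE_DIE

assert total_lanes(4) == 128  # all four dies live -> EPYC-class I/O
assert total_lanes(2) == 64   # two dies active, two dark -> Threadripper
```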

hello!

is it me you're looking for?

ᴾC SᴾeCS ᴰoWᴺ ᴮEᴸoW


Desktop: X99-PC

CPU: i7 5820k

Mobo: X99 Deluxe

Cooler: Dark Rock Pro 3

RAM: 32GB DDR4
GPU: GTX 1080

Storage: 1TB 850 Evo, 1TB HDD, bunch of external hard drives
PSU: EVGA G2 750w

Peripherals: Logitech G502, Ducky One 711

Audio: Xonar U7, O2 amplifier (RIP), HD6XX

Monitors: 4k 24" Dell monitor, 1080p 24" Asus monitor

 

Laptop:

-Overkill Dell XPS

Fully maxed out early 2017 Dell XPS 15, GTX 1050 4GB, 7700HQ, 1TB NVMe SSD, 32GB RAM, 4K display. 97Wh battery :x 
Dell was having a $600 off sale for the fully specced out model, so I decided to get it :P

 

-Crapbook

Fully specced out early 2013 MacBook "pro" with a GT 650M and a constant 105°C temperature on the CPU (GPU is 80-90°C) when doing anything intensive...

A 2013 laptop with a regular-sized battery still has better battery life than a 2017 laptop with a massive battery! I think this is a testament to Apple's ability at making laptops, or maybe to how little CPU technology has improved even 4+ years later (at least, until the recent introduction of 15W 4-core CPUs). Anyway, I'm never going to get a 35W-CPU laptop again unless battery technology becomes ~5x better than it is in 2018.

Apple knows how to make proper consumer-grade laptops (they don't know how to make pro laptops though). I guess this is mostly software power-efficiency related, but getting a Mac makes perfect sense if you want a portable/powerful laptop that can do anything you want it to with great battery life.

 

 


38 minutes ago, Cinnabar Sonar said:

People did? It seems perfectly logical to me.

Yea, I originally thought that the TR rumors were just a few people mistaking it for Naples and the rest jumping on it. My reasoning for TR making no sense was that it wasn't justified to spend any R&D on making a different CPU package with two dies when you could literally just market Naples as a workstation CPU. So what actually happened is exactly what I thought it would be, Naples with different microcode.


Just now, leadeater said:

Yea, I originally thought that the TR rumors were just a few people mistaking it for Naples and the rest jumping on it. My reasoning for TR making no sense was that it wasn't justified to spend any R&D on making a different CPU package with two dies when you could literally just market Naples as a workstation CPU. So what actually happened is exactly what I thought it would be, Naples with different microcode.

Wasn't that more or less how the Intel HEDT CPUs came to be?

27 minutes ago, TetraSky said:

Good to see AMD continuing to use solder instead of pulling an Intel and using subpar TIM.

Definitely. Take notes, Intel. Solder, not mayonnaise!

3 minutes ago, rattacko123 said:

It makes sense considering two of the dies are not functioning. If it were four functioning dies, there would be 128 PCIe lanes instead of 64; that was one of the reasons I thought Threadripper would be two dies.

I would imagine that if all four dies were functional, AMD would just have made it an EPYC chip.


1 hour ago, Cinnabar Sonar said:

Wasn't that more or less how the Intel HEDT CPUs came to be?

Definitely. Take notes, Intel. Solder, not mayonnaise!

I would imagine that if all four dies were functional, AMD would just have made it an EPYC chip.

It's not the same with Intel HEDT, due to their insistence on using a monolithic die since the start of the Core series (excluding the early generations, which had the iGPU on a separate die on a larger process).

And we already have an example of what happens with Ryzen 3: half of the cores in each die aren't functional/are faulty, and AMD doesn't want to waste them.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


1 hour ago, Dabombinable said:

It's not the same with Intel HEDT, due to their insistence on using a monolithic die since the start of the Core series (excluding the early generations, which had the iGPU on a separate die on a larger process).

Really? Interesting, I would have assumed that Intel's HEDT chips were reject Xeons.

2 hours ago, Dabombinable said:

And we already have an example of what happens with Ryzen 3: half of the cores in each die aren't functional/are faulty, and AMD doesn't want to waste them.

It makes no sense to waste them, not when you can still sell them as cheaper, lower-core-count CPUs.


2 hours ago, Dabombinable said:

It's not the same with Intel HEDT, due to their insistence on using a monolithic die since the start of the Core series (excluding the early generations, which had the iGPU on a separate die on a larger process).

The big difference with Intel HEDT is that the chips aren't necessarily reject processors being repurposed; they're born as Xeon E5 processors. It's the same basic approach as AMD's HEDT lineup, where server processors get consumer features as an enthusiast platform.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


4 hours ago, leadeater said:

God damn, called it! And people gave me shit about that lol.

 

Only the smart ones knew this was basically a cut-down EPYC chip with different microcode. 

 

On topic, and to address multiple people without quoting everyone who made this point: it did not make sense to create a two-die solution, not when they could use defective EPYC chips and reuse all the good cores across the various CCXs. It's either that, or each die is a 4-core chip, which would actually make tons of sense in terms of Threadripper, but I don't think that's the case.
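
To illustrate that binning argument, a rough sketch; the thresholds and SKU labels are hypothetical, following the speculation in this thread rather than AMD's actual binning rules:

```python
# Hypothetical sketch of the binning idea discussed above: grade a 4-die
# MCM package by how many of its dies are fully functional. Thresholds and
# SKU labels are illustrative only, not AMD's actual binning rules.
from dataclasses import dataclass
from typing import List

@dataclass
class Die:
    good_cores: int            # out of 8 (2 CCXs x 4 cores)
    memory_ctrl_ok: bool = True

def classify_package(dies: List[Die]) -> str:
    fully_good = [d for d in dies if d.good_cores == 8 and d.memory_ctrl_ok]
    if len(fully_good) == 4:
        return "EPYC: all four dies live"
    if len(fully_good) >= 2:
        return "Threadripper: two best dies active, the others act as spacers"
    return "salvage the good cores for lower-end Ryzen parts"

print(classify_package([Die(8), Die(8), Die(8), Die(8)]))
print(classify_package([Die(8), Die(8), Die(6), Die(8, memory_ctrl_ok=False)]))
```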

Do you even fanboy bro?


6 hours ago, leadeater said:

God damn, called it! And people gave me shit about that lol.

Can't believe people did that.

 

People have no clue about engineering and business sometimes. lol

 

Why would AMD make a separate SKU that, at that high a price, not many people would buy anyway?

\\ QUIET AUDIO WORKSTATION //

5960X 3.7GHz @ 0.983V / ASUS X99-A USB3.1      

32 GB G.Skill Ripjaws 4 & 2667MHz @ 1.2V

AMD R9 Fury X

256GB SM961 + 1TB Samsung 850 Evo  

Cooler Master Silencio 652S (soon Calyos NSG S0 ^^)              

Noctua NH-D15 / 3x NF-S12A                 

Seasonic PRIME Titanium 750W        

Logitech G810 Orion Spectrum / Logitech G900

2x Samsung S24E650BW 16:10  / Adam A7X / Fractal Axe Fx 2 Mark I

Windows 7 Ultimate

 

4K GAMING/EMULATION RIG

Xeon X5670 4.2Ghz (200BCLK) @ ~1.38V / Asus P6X58D Premium

12GB Corsair Vengeance 1600Mhz

Gainward GTX 1080 Golden Sample

Intel 535 Series 240 GB + San Disk SSD Plus 512GB

Corsair Crystal 570X

Noctua NH-S12 

Be Quiet Dark Rock 11 650W

Logitech K830

Xbox One Wireless Controller

Logitech Z623 Speakers/Subwoofer

Windows 10 Pro


Nice, it would be fun if AMD dropped an 18-core chip too, right after Intel drops theirs, just because 

I spent $2500 on building my PC and all I do with it is play no games atm & watch anime at 1080p (finally), watch YT, and write essays... nothing, it just sits there collecting dust...

Builds:

The Toaster Project! Northern Bee!

 

The original LAN PC build log! (Old, dead and replaced by The Toaster Project & 5.0)


"Here is some advice that might have gotten lost somewhere along the way in your life. 

 

#1. Treat others as you would like to be treated.

#2. It's best to keep your mouth shut and appear to be stupid, rather than open it and remove all doubt.

#3. There is nothing "wrong" with being wrong. Learning from a mistake can be more valuable than not making one in the first place.

 

Follow these simple rules in life, and I promise you, things magically get easier. " - MageTank 31-10-2016

 

 


4 minutes ago, Bananasplit_00 said:

Nice, it would be fun if AMD dropped an 18-core chip too, right after Intel drops theirs, just because 

I suspect that if they wanted to beat Intel in terms of core count, it wouldn't only be an 18-core ;) Why not a 20- or 22-core one? ^_^

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


2 minutes ago, Morgan MLGman said:

I suspect that if they wanted to beat Intel in terms of core count, it wouldn't only be an 18-core ;) Why not a 20- or 22-core one? ^_^

Because that's where EPYC comes in. Just like Intel doesn't want to cannibalize Xeon sales, AMD doesn't want to impact EPYC sales.

I spent $2500 on building my PC and all I do with it is play no games atm & watch anime at 1080p (finally), watch YT, and write essays... nothing, it just sits there collecting dust...

Builds:

The Toaster Project! Northern Bee!

 

The original LAN PC build log! (Old, dead and replaced by The Toaster Project & 5.0)


"Here is some advice that might have gotten lost somewhere along the way in your life. 

 

#1. Treat others as you would like to be treated.

#2. It's best to keep your mouth shut and appear to be stupid, rather than open it and remove all doubt.

#3. There is nothing "wrong" with being wrong. Learning from a mistake can be more valuable than not making one in the first place.

 

Follow these simple rules in life, and I promise you, things magically get easier. " - MageTank 31-10-2016

 

 


11 minutes ago, Bananasplit_00 said:

Because that's where EPYC comes in. Just like Intel doesn't want to cannibalize Xeon sales, AMD doesn't want to impact EPYC sales.

No, I meant that IF AMD wanted to beat Intel in the core-count battle, they'd definitely have to make that TR at least a 20-core so it'd equal Intel's 18-core in performance. Unless the Intel 18-core will be clocked so low due to Kaby Lake not being nearly as efficient as Zen.

 

Besides, following your core-count assumption, TR is already about to impact lower-end EPYC sales:

[Image: AMD EPYC 7000 series lineup]

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


1 hour ago, Vode said:

People have no clue about engineering and business sometimes. lol

 

It seems like common sense to me, although admittedly I was surprised by the news myself.

19 minutes ago, Morgan MLGman said:

No, I meant that IF AMD wanted to beat Intel in the core-count battle, they'd definitely have to make that TR at least a 20-core so it'd equal Intel's 18-core in performance. Unless the Intel 18-core will be clocked so low due to Kaby Lake not being nearly as efficient as Zen.

 

AMD can, but why would they? 16 cores is already approaching server-chip territory as it is. The only reason Intel released an 18-core chip is that they couldn't be seen as inferior to AMD.

Long story short, there probably isn't a huge market that needs more than 16 cores and wouldn't be better off with EPYC.


1 hour ago, VegetableStu said:

The EPYC and TR stacks have core counts in multiples of 4 so far. I'm not sure if AMD would want to disable a different number of cores on just one of the four CCXes ._.

 

20-core sounds more possible, maybe?

Engineering samples showed up with split core counts (3+1 & 4+2), but AMD didn't want people sorting through CPUs to figure out which ones were busted. Plus it might cause some wonky performance problems. The current symmetrical setup ensures consistent performance levels.

 

The TR version they delidded is an ES, so it's possible they might not release the retail versions quite like this. Just with a placeholder in the two spots.

 

I would also assume it's one good package per side. TR would be the EPYC CPUs with a faulty memory controller on one of the packages; EPYC requires that all 4 packages have functioning memory controllers. This would explain how they can "use" >98% of dies on ~85% 8-core yields. Just make the die a spacer, haha. It got "used" at that point!
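
A back-of-the-envelope version of that >98% claim; the ~85% yield figure is from this post, and the die mix below is a made-up example:

```python
# Back-of-the-envelope check of ">98% of dies used on ~85% 8-core yields".
# The 85% figure is from the post above; the 1000-die batch is made up.
TOTAL_DIES = 1000
good = int(TOTAL_DIES * 0.85)   # dies with all 8 cores (and memory ctrl) OK
defective = TOTAL_DIES - good   # dies that can't ship as full 8-core parts

# Speculative model: each Threadripper package carries 2 active dies plus
# 2 dead "spacer" dies, so defective silicon still gets mounted and "used".
tr_packages = defective // 2                   # spacers consumed two at a time
good_used_by_tr = tr_packages * 2              # two live dies per TR package
epyc_packages = (good - good_used_by_tr) // 4  # remaining good dies go 4-up

dies_used = defective + good_used_by_tr + epyc_packages * 4
print(f"dies used: {dies_used}/{TOTAL_DIES} = {dies_used / TOTAL_DIES:.1%}")
# -> 100.0% in this toy example; the ">98%" leaves room for dies too broken
#    to mount at all, and for good dies binned into other SKUs.
```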

 

 


41 minutes ago, Cinnabar Sonar said:

It seems like common sense to me, although admittedly I was surprised by the news myself.

AMD can, but why would they? 16 cores is already approaching server-chip territory as it is. The only reason Intel released an 18-core chip is that they couldn't be seen as inferior to AMD.

Long story short, there probably isn't a huge market that needs more than 16 cores and wouldn't be better off with EPYC.

RAM controllers would be an issue with any further-up TR parts beyond 16 cores. It's possible the core-count choices are more because it'd break their microcode programmers trying to deal with the issues that would crop up.


1 minute ago, VegetableStu said:

I dunno, if the methodology with R3 and R5 made sense for them, I don't see why they should bomb two CCXes (even if they leave the glass in there) when they could pick which core to disable. I would draw parallels on that at least ._.

Well, the Zen package (2 CCXs = 8 cores) costs them about $30 USD each. That's part of the reason their costs are so low. (Obviously R&D is a big part of the end-user cost.) It seems strange to sack 2 full packages just to keep feeding the TR parts.

 

However, if a "bad" Zen package can be put on the interposer for TR and basically just act as a spacer, they've suddenly "saved" dead dies. At least some of them have bad memory controllers, and this is probably where they ended up.
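
Taking the ~$30-per-package figure above at face value, a minimal cost sketch; all numbers are this thread's estimates, not AMD's actual costs:

```python
# Rough silicon-cost comparison using the ~$30-per-Zen-package figure quoted
# above. All values are this thread's estimates, not AMD's actual costs.
COST_GOOD_PACKAGE = 30.0   # fully working 8-core Zen package
COST_SPACER = 0.0          # a dead die reused as a spacer is effectively free,
                           # since it would otherwise have been scrapped

def tr_silicon_cost(reuse_dead_dies_as_spacers: bool) -> float:
    active = 2 * COST_GOOD_PACKAGE
    fillers = 2 * (COST_SPACER if reuse_dead_dies_as_spacers else COST_GOOD_PACKAGE)
    return active + fillers

print(tr_silicon_cost(True))   # 60.0  -> "bad" packages reused as spacers
print(tr_silicon_cost(False))  # 120.0 -> sacking 2 full packages per CPU
```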


Though, honestly, if AMD can swing the quad-channel memory with 4 Zen packages (which I actually think might be the issue), they really should think about selling a 32-core TR part in the future. Just price it at $1999. Intel has clearly shown that a $1999 USD "desktop" part is acceptable!

