
[Rumour] AMD's Zen To Have Ten Pipelines Per Core


It comes down to technological limitations. This has been confirmed literally tens if not hundreds of times. You can't use solder anymore because, at such a small transistor size, it ruins the chip.

Now there are other solutions available, but Intel doesn't feel they need to use them: a better standard-type TIM, or ideally (from a temperature standpoint, not really from a technology perspective) a liquid metal TIM, as that has been shown to massively reduce core temps (by up to 10-30°C with high-powered coolers).

Then again, a liquid metal solution would drive up prices by quite a bit.


That would depend on whether there are conflicts between the optimization patterns for each memory architecture, and it seems Skylake is in general a sidestep, with improvements in some areas and losses in others under both memory architectures. Latency matters, but there are other factors at play that we need to take into account, with tests such as comparing, say, four cores of the 5960X against the 6700K with both on dual-channel DDR4. We don't have enough evidence to be conclusive yet.

So what you are saying is:

 

"well in theory DDR4 should be faster due to improvements in specifications all around. But this has yet to translate very well into real life"....

 

And yes, I can fully agree with that. Some technologies need proper time to mature. On the flip side, it also means that, for now, I will remain correct.

As it currently stands, a low-latency, high-speed DDR3 kit WILL outperform DDR4 at a lower price point.

 

If we give things some time, say six months to a year, DDR4 should start "pulling away" due to the other improvements being made.

 

Also, I found this article. While it does not give definitive answers, it does help shed light on some things here and there:

http://www.anandtech.com/show/8959/ddr4-haswell-e-scaling-review-2133-to-3200-with-gskill-corsair-adata-and-crucial


So what you are saying is:

 

"well in theory DDR4 should be faster due to improvements in specifications all around. But this has yet to translate very well into real life"....

 

And yes, I can fully agree with that. Some technologies need proper time to mature. On the flip side, it also means that, for now, I will remain correct.

As it currently stands, a low-latency, high-speed DDR3 kit WILL outperform DDR4 at a lower price point.

 

If we give things some time, say six months to a year, DDR4 should start "pulling away" due to the other improvements being made.

No. At this point, the only thing DDR3 has going for it is price.

  • You can't get a modern platform that supports DDR3.
  • It is far more power-hungry.
  • The bandwidth is generally lower (unless we are talking about high-end kits, but let's focus on JEDEC standard speeds).
  • The latency is actually better for DDR4; you just think it isn't because it's a higher number. For example, Corsair's Dominators at 3200 MHz have a CAS latency of 16, which works out to the same absolute time as DDR3-1600 at CL8. Keep in mind we are talking about a massive bandwidth difference here, on a completely new technology, and it still beats the DDR3 standard of 1600/CL10. If we instead look at standard DDR4-2133 versus high-speed DDR3-2133, we see the opposite: DDR3 comes in at CL10, while DDR4 mostly comes in at CL12-13. But look at the absolute numbers: DDR3-2133 CL10 works out to roughly 9.4 ns, while DDR4-2133 CL13 is about 12.2 ns, a difference of under 3 ns. Compare that to the bus latency, which we count in tens of nanoseconds, and the latency increase at the DIMM is so insignificant that it's better to stop talking about it. (A quick calculation sketch follows this list.)
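To sanity-check those numbers, here is a minimal Python sketch (my own illustration, not from the article above) that converts a CAS latency into nanoseconds and computes the theoretical peak bandwidth per 64-bit channel; the kits listed are just examples.

# First-word latency (ns) = CL * 2000 / data rate, because the memory clock
# runs at half the transfer rate (DDR = double data rate).
def cas_latency_ns(cl: int, data_rate_mtps: int) -> float:
    return cl * 2000 / data_rate_mtps

# Theoretical peak bandwidth of one 64-bit (8-byte) channel, in GB/s.
def peak_bandwidth_gbps(data_rate_mtps: int, bus_bytes: int = 8) -> float:
    return data_rate_mtps * bus_bytes / 1000

kits = [
    ("DDR3-1600 CL8",  1600, 8),
    ("DDR3-2133 CL10", 2133, 10),
    ("DDR4-2133 CL13", 2133, 13),
    ("DDR4-3200 CL16", 3200, 16),
]

for name, rate, cl in kits:
    print(f"{name}: {cas_latency_ns(cl, rate):.1f} ns CAS, "
          f"{peak_bandwidth_gbps(rate):.1f} GB/s per channel")

Running it gives 10.0 ns for both DDR3-1600 CL8 and DDR4-3200 CL16, versus 9.4 ns and 12.2 ns for the two 2133 kits, which is where the figures above come from.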

"Unofficially Official" Leading Scientific Research and Development Officer of the Official Star Citizen LTT Conglomerate | Reaper Squad, Idris Captain | 1x Aurora LN


Game developer, AI researcher, Developing the UOLTT mobile apps


G SIX [My Mac Pro G5 CaseMod Thread]

Link to comment
Share on other sites

Link to post
Share on other sites

Why would AMD need that level of hyperthreading? They already give their CPUs 8 physical cores fairly regularly. Zen looks stranger every time I look at it; hopefully it doesn't fall flat, so that it promotes competition.

 

Always love hearing how people want AMD to make good products so they can buy better Intel and Nvidia products afterwards.

 

AMD SHOULD get out of this market asap imo... and all of you can choke on it.


Always love hearing how people want AMD to make good products so they can buy better Intel and Nvidia products afterwards.

 

AMD SHOULD get out of this market asap imo... and all of you can choke on it.

No, they should spin off ATI, give (not sell) their IP to a company that is actually competent, and then they need to piss off.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites

But not a quantum leap... it will be like always, a 30% increase in FPS.

Well, Nvidia is expecting to have 18 billion transistors, over double what the GM200 GPU has.

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614

Link to comment
Share on other sites

Link to post
Share on other sites

Well, Nvidia is expecting to have 18 billion transistors, over double what the GM200 GPU has.

Yes, because they need to add 64-bit CUDA cores if they want to make Pascal Teslas, and that takes a lot of transistors by itself (GM200 and GK110 had a similar number of transistors, but because Maxwell has virtually no 64-bit cores, it could have far more regular cores). Then add a few more transistors for more cores and other logic, and you quickly double the transistor count while increasing gaming performance by merely 30% or so.
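To put rough numbers behind that reasoning, here is a small Python sketch comparing the two chips; the transistor and core counts are approximate public specs, and the comparison is only an illustration, not a real die-area accounting.

# Approximate public specs for the two big dies being compared.
chips = {
    "GK110 (Kepler)":  {"transistors_bn": 7.1, "fp32_cores": 2880, "fp64_units": 960},
    "GM200 (Maxwell)": {"transistors_bn": 8.0, "fp32_cores": 3072, "fp64_units": 96},
}

for name, c in chips.items():
    # FP64:FP32 ratio is 1:3 on GK110 and only 1:32 on GM200.
    ratio = c["fp32_cores"] // c["fp64_units"]
    print(f"{name}: {c['transistors_bn']} bn transistors, "
          f"{c['fp32_cores']} FP32 cores, FP64:FP32 ratio 1:{ratio}")

On a roughly similar transistor budget, GM200 fits more FP32 cores than GK110 largely because it keeps only a token amount of FP64 hardware; putting a heavy FP64 ratio back into a Pascal Tesla part is exactly the kind of thing that eats transistors without showing up in gaming benchmarks.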

"Unofficially Official" Leading Scientific Research and Development Officer of the Official Star Citizen LTT Conglomerate | Reaper Squad, Idris Captain | 1x Aurora LN


Game developer, AI researcher, Developing the UOLTT mobile apps


G SIX [My Mac Pro G5 CaseMod Thread]

Link to comment
Share on other sites

Link to post
Share on other sites

@patrickjp93 Give 'em hell LOL

 

Edit: failed to tag him xD How do I do it properly? lel


Always love hearing how people want AMD to make good products so they can buy better Intel and Nvidia products afterwards.

 

AMD SHOULD get out of this market asap imo... and all of you can choke on it.

I'm not going to buy a CPU from a company whose CPUs I have had problems with in the past; once you have ruined your reputation with me, I stay away. But I like competition promoting progress in the market rather than government regulation. GPUs are still something I would buy from them; however, I bought a GTX 970 because the 300-series cards didn't exist at the time I built my computer.


If they come out with a huge IPC jump in the end, that will be awesome. Just think of the 8-core; it may really be priced well.


It comes down to technological limitations. This has been confirmed literally tens if not hundreds of times. You can't use solder anymore because, at such a small transistor size, it ruins the chip.

Now there are other solutions available, but Intel doesn't feel they need to use them: a better standard-type TIM, or ideally (from a temperature standpoint, not really from a technology perspective) a liquid metal TIM, as that has been shown to massively reduce core temps (by up to 10-30°C with high-powered coolers).

Wasn't it confirmed that Intel's TIM isn't really that bad? It's just the distance between the CPU die and the IHS that makes the big difference?


No offence, but I'm calling it:

AMD will be a goner by the time they make the Zen CPU.

But on the other hand, the GPU part of AMD is getting shaky lately... either the GPU market will be the only thing that keeps AMD going (GG or game over), or the GPUs will triple the rate of AMD's downfall.


No offence, but I'm calling it:

AMD will be a goner by the time they make the Zen CPU.

But on the other hand, the GPU part of AMD is getting shaky lately... either the GPU market will be the only thing that keeps AMD going (GG or game over), or the GPUs will triple the rate of AMD's downfall.

And then we say goodbye to you, ignorant naysayer.


Wasn't it confirmed that Intel's TIM isn't really that bad? It's just the distance between the CPU die and the IHS that makes the big difference?

Original Haswell was awful, the Haswell refresh was decent, and Skylake is between the two.

Nothing compares to the much better stuff we use outside the CPU, but it's OK enough.
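For a rough sense of why both the TIM itself and the die-to-IHS gap matter, here is a back-of-the-envelope 1-D conduction sketch in Python; the power, die area, gap thickness and conductivity values are assumed round numbers for illustration, not measurements of any particular chip.

# Temperature drop across the die-to-IHS interface:
# delta_T = P * t / (k * A)
# P = heat flow (W), t = bond-line thickness (m),
# k = TIM thermal conductivity (W/mK), A = die area (m^2).
def interface_delta_t(power_w, thickness_m, k_w_mk, area_m2):
    return power_w * thickness_m / (k_w_mk * area_m2)

POWER = 80.0      # W through the die (assumed)
AREA = 120e-6     # ~120 mm^2 die (assumed)

tims = {"ordinary paste (k ~ 5 W/mK)": 5.0,
        "liquid metal (k ~ 40 W/mK)": 40.0}

for name, k in tims.items():
    for gap_um in (50, 100):   # thin vs thick die-to-IHS gap
        dt = interface_delta_t(POWER, gap_um * 1e-6, k, AREA)
        print(f"{name}, {gap_um} um gap: ~{dt:.1f} C across the interface")

Doubling the gap doubles the temperature drop, and swapping paste for liquid metal cuts it by roughly the ratio of the conductivities, which is why both explanations of the temperatures can be partly true.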


And then we say goodbye to you, ignorant naysayer.

Don't get me wrong, I love AMD. In fact, I have an AMD GPU inside my PC (white cougar).


No offence, but I'm calling it:

AMD will be a goner by the time they make the Zen CPU.

But on the other hand, the GPU part of AMD is getting shaky lately... either the GPU market will be the only thing that keeps AMD going (GG or game over), or the GPUs will triple the rate of AMD's downfall.

I am calling it that you will not only be wrong, but that AMD will have a winner with Zen. Not only will Zen bring back AMD's name, it will be a launching point for their turnaround.

 

The problem with the notion of AMD's demise is that it stems from people who fail to look at the big picture. AMD's demise might be seen as a good thing (it would most likely give Intel the graphics IP to put Nvidia on edge, and give Nvidia the ability to make x86 CPUs), but there are too many powers that be preventing this demise, for a number of reasons. Several companies still see value in keeping this triangular conflict alive. AMD being in this fragile, desperate situation means they will take anything they can get. This is good news for the console market, one of gaming's largest markets. Hence those numerous rumors of MS trying to acquire AMD.

 

AMD knows their time might be coming to an end. It's why Zen is their last great act of defiance. If it comes within spitting distance of Haswell, it will renew faith in everyone, even Intel fans. It will show that AMD can compete, and will mean competition has to begin again. A tiny spark is all it takes to set a forest ablaze, and Zen is AMD's last match in the box. 

 

Then again, nobody ever really considers Plan C. There is Plan A, where AMD is dissolved, Nvidia takes the CPU portion, Intel takes the GPU portion, and it becomes a 1v1 fight for total market control. Plan B is where AMD proves the world wrong with Zen, regains their footing, and wipes the blood from their mouth while squaring back up for another round. Plan C is the most unlikely, but it's also the most chaotic option of the three. It would also be funny to see. Plan C would be AMD going forward with their graphics division alone (they already reorganized the graphics department into the Radeon Technologies Group, or whatever it was called) and nullifying the x86-64 cross-license agreement with Intel, stepping out of the CPU realm and completely breaking the natural order of things. Again, this is almost 100% certain to never happen; it would just be funny if it did.

 

Well, I guess it's the part where I sum up my giant wall of text so people can skip the boring stuff and get the gist of what I am saying. TL;DR? Zen will bring AMD back. I don't know how long it will last, but I believe it will buy them enough time to stave off these current rumors. If AMD dies, Intel and Nvidia will likely go toe to toe. If AMD wanted to give them all the finger, they could dissolve the agreements made with Intel, focus entirely on their graphics division, and still exist to some extent. If I am wrong, @patrickjp93 will certainly tell me that I am. Either way, everyone wins in the end.


@MageTank

Option C would be hilarious

I've not looked into the full Intel/AMD cross-license agreement, so I don't even know if such a scenario is even remotely possible. If it is, that would be reason enough for Intel to want to keep AMD afloat. If AMD can just say "screw it", revoke AMD64 (x86-64) from them, and let the world fend for itself, it could really cause some trouble. It might even push ARM forward by a ton. That being said, Intel failed in the past when they tried to revoke x86 from AMD. The FTC was having none of that back in the day, and I doubt they'd let such shenanigans play out now. Still, it would be quite the funny move. That, and if it were to end up being possible, several big-name companies would pay to stop it from happening.


snip

 

If Intel and Nvidia went toe to toe, where Intel gained access to AMD's graphics IP and Nvidia got Zen, I think we know how that would end up. If people think that Nvidia plays dirty, I think Intel would smack Nvidia around like a rag doll, and make Nvidia look like saints by comparison.


If Intel and Nvidia went toe to toe, where Intel gained access to AMD's graphics IP and Nvidia got Zen, I think we know how that would end up. If people think that Nvidia plays dirty, I think Intel would smack Nvidia around like a rag doll.

They both play dirty. The Larrabee project died because Intel got greedy and tried to run a smear campaign against Nvidia, and Nvidia has been known to act childish when something does not go their way. I think it would take Intel a little bit of time to figure out where they would need to concentrate an attack to hurt Nvidia (Knights Landing is already going to hurt Nvidia quite a bit in the most important market). Nvidia's preferred weapon of choice is to use the developers they have in their pockets to optimize games for Nvidia-exclusive features, which tends to sway gamers to their side of the market. I just don't know if Intel can beat Nvidia in marketing, which is what it will take to put a dent in Nvidia's market-share stranglehold on consumer GPUs.

 

I think what Intel may do is take the APU idea and run with it. Bolstering their CPU lineup with even stronger graphics would only make them more of a threat in several markets. Once they show how great their APUs can be, and start gearing their software to give gamers finer control over them (like Nvidia and AMD do with their control panel suites), people will be more inclined to accept Intel once they decide to make real dedicated GPUs. Either way, we are still a long way away from this outcome being possible. Zen will have to fail before this reality comes to fruition any time soon, and even then, every other company that has an interest in AMD will also have to fold their hands before it's truly over.


No offence, but I'm calling it:

AMD will be a goner by the time they make the Zen CPU.

Not gonna happen, dude. Even if they keep up the current losses, they do have a few more years left. Zen comes out in 2016.

If they are a 'goner' eventually, it will not be before we get at least a couple of years of Zen CPUs.


the earliest AMD "can" die is 2019, as that is when their next term is due for their debts. If they cannot pay or negotiate a extension, they are goners.

But we WILL see Zen+ (which will be like going from Ivy Bridge to Haswell, or at least that is the plan) hit the market (due late 2017/early 2018).


If Intel and Nvidia went toe to toe, where Intel gained access to AMD's graphics IP and Nvidia got Zen, I think we know how that would end up. If people think that Nvidia plays dirty, I think Intel would smack Nvidia around like a rag doll, and make Nvidia look like saints by comparison.

Intel stopped playing dirty when Paul Otellini stepped down. Krzanich really is a saint by comparison.

 

They both play dirty. The Larrabee project died because Intel got greedy and tried to run a smear campaign against Nvidia, and Nvidia has been known to act childish when something does not go their way. I think it would take Intel a little bit of time to figure out where they would need to concentrate an attack to hurt Nvidia (Knights Landing is already going to hurt Nvidia quite a bit in the most important market). Nvidia's preferred weapon of choice is to use the developers they have in their pockets to optimize games for Nvidia-exclusive features, which tends to sway gamers to their side of the market. I just don't know if Intel can beat Nvidia in marketing, which is what it will take to put a dent in Nvidia's market-share stranglehold on consumer GPUs.

 

I think what Intel may do is take the APU idea and run with it. Bolstering their CPU lineup with even stronger graphics would only make them more of a threat in several markets. Once they show how great their APUs can be, and start gearing their software to give gamers finer control over them (like Nvidia and AMD do with their control panel suites), people will be more inclined to accept Intel once they decide to make real dedicated GPUs. Either way, we are still a long way away from this outcome being possible. Zen will have to fail before this reality comes to fruition any time soon, and even then, every other company that has an interest in AMD will also have to fold their hands before it's truly over.

No, just no... Larrabee died because Nvidia got scared and bought itself out of its full graphics licensing agreement with Intel, scuttling its ability to release Larrabee as a graphics card. But now we have the Xeon Phi, and Nvidia's eating crow.

 

The problem for Nvidia in trying to keep developers in their pockets is that Intel has:

1) Far more developers on its platforms

2) Support of open standards that actually work well (OpenMP, OpenACC)

3) At least double the programmers to aid developers in transitioning to its own platform

 

Intel only has to win the following to beat Nvidia in the long run:

1) HPC

2) Professionals

3) High end enthusiasts

 

Take those three and the rest falls in line on its own, not to mention Nvidia's finances would be a smoking pile of rubble at that point. There's very little need for Intel to be the king of marketing in consumer GPUs.

 

I still say Microsoft has no real interest and that Samsung would be legally barred from purchasing AMD due to national security concerns in the U.S., but that's more speculative than based solely in the facts.

