
Jim Keller leaves AMD - AMD claims Zen "on track"


There is a difference between never being wrong (never believing or uttering something untrue) and never being ignorant. I'm ignorant all the time and consuming more knowledge all the time. That said, I'm next to never wrong. I can improve, and I do. That doesn't require that I learn from being wrong. It requires I move from states of ignorance to knowledge and application of that knowledge. Honestly who taught you informal logic?

Human nature taught me that "informal logic". One cannot improve if one is perfect. To never be wrong means you always tell the truth. To always tell the truth, you cannot be human; lying is a human defense mechanism, after all. You yourself have stated multiple times that you are the best. To be the best means you are without equal: perfect.

 

The contradiction in your statement begins with the word "ignorance". You stated you are never wrong in several other threads while speaking to me. Now you claim "next to never wrong". You cannot be next to "never" something. Never is an absolute word. The term "almost never" is incorrect. 

 

That being said, if we look beyond the paradox you have created regarding your perfect ignorance, it is rather simple to discern what I was talking about, and why it applies to you. To be ignorant of something means you lack understanding of it. To speak about something you do not understand, as if you do, makes you wrong. Given your history of speaking about things you do not understand (the launch price of the 4790K, GT3e Skylake performance vs GT3e Broadwell performance, this new rumor you've invented without evidence), you have been wrong. The fact that you do not see yourself as wrong means you cannot learn from your mistakes, because in your eyes, you do not make mistakes. You've said so yourself.

 

 

There is a difference between never being wrong (never believing or uttering something untrue) and never being ignorant.

Except in this scenario, you are ignorant of the fact that you are wrong. The underlying question still remains to be answered: can you learn from your mistakes? Though for that to be answered, you first have to acknowledge that you are capable of making mistakes. An acknowledgement I doubt I will ever see.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!



Philosophical dodge. You have committed every single logical fallacy in our discussions at some point.

 

I don't lie. I haven't since I was 10. The consequences of being caught in a lie are always worse than dealing with a hard truth. It's a matter of defense and expediency to NOT lie.

 

Are you really going to argue semantics when you know the intended meaning of "next to never?" When you can't win, Straw Man! Would you prefer I use the phrase "extremely rarely wrong?" But then, what is extremely? What's my scale? You understood the meaning. Don't split hairs just to be that guy. 

 

No, ignorance is lack of knowledge, not lack of understanding. Review your English please. I can understand motion without having the knowledge (and thus being ignorant) of its bounding equations.

 

I've been right about all 3 of those.

 

I can make mistakes. It's extremely rare that it happens, much more so in public view.

 

As I've said before, if you're going to step into the ring with someone above your weight class, you better punch so hard your opponent doesn't remain conscious. You've done a poor job puffing yourself up against me.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd



Talk about irony. You mock me for arguing semantics, and yet you shoot yourself in the foot.

 

 

 

No, ignorance is lack of knowledge, not lack of understanding.

And what exactly is "Knowledge"? Knowledge: facts, information, and skills acquired by a person through experience or education; the theoretical or practical understanding of a subject.

 

I love you, Patrick. You really do think you are above the people around you. You treat every disagreement as a battle to the death, and yet here I remain. Again, I refuse to believe you have not lied once in the past 11 years. Even a small lie is a lie. Lying is, as I stated before, a human defense mechanism, used not only to defend ourselves, but even others. You try to convey this appearance that you are an infallible machine of a man, above others simply because you believe you are. I've "beaten" you several times, and have yet to "lose" to you. You never provided those benchmarks that you swore you had. You never gave me proof that you purchased your 4790K below Intel's MSRP from Amazon, because you can no longer access the account, and the physical receipt was locked in a safe in another physical location. And now you have one more day left for your current statement to come true, or else it too will be added to the short list of "defeats" you have faced at my hand.

 

I love the words you assign to our conversations. They make it feel ten times more interesting than what it really is: two nerds arguing on the internet about things that truly do not matter in the grand scheme of things. Either way, I am thankful to have someone fun to talk to. You keep me sharp  :wub:

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!



No, I used irony against you. The fact you're thinking too simplistically to see that is what's most laughable here, and yet you act like you won the fight. You're dreaming, because you're KOed.

 

No, disagreements require that both sides have something to argue. You're slandering me. It is a fight to the death, because appearances are 90% of everything. If you don't look strong, no one cares if you actually are, because you can't get your foot in the door.

 

Again, I'm not infallible. You just haven't succeeded in proving me wrong about anything, and at the rate you're going, you never will. I don't defend people at my own expense and I certainly don't lie for people. If you've done something shameful and beg me for help, I'll give you a second chance, but you're facing everyone else's guillotine without me. I'm not above you. I'm better at logic than you. That doesn't make me a superior human being. It just means one of my skills is at a higher level than yours. If you believe in God, we're all equal. If you don't, well, the work we do extends into the future generations. I'll let them decide who was the better man in our time.

 

Yes I did. Double check both the thread and our conversation. 4K benches included.

 

The physical receipt went in the garbage after the package arrived. It wasn't my money, and my Dad doesn't fiddle with receipts to file taxes. He makes enough money that it isn't a necessary bother for him.

 

Is your preferred depression drink gin or brandy?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd



Actually, my preferred drink is grape-flavored Sunny D. 200% of my daily vitamin C in a single serving; what more does a man need? Now, let's get back into the thick of it, shall we?

 

Now, you are right: I think simplistically, so whatever irony you claim to have used on me is apparently lost on me. Can you explain where the irony was in your post? I do not see it. As for "disagreements require both sides having something to argue", I agree; it's why I used the word "disagreement". I argued against your imaginary claim of Skylake GT3e being 20% faster than Broadwell GT3e, and you argued that it was. Ta-da: both sides had something to argue. Now, let's check my personal messages for those 4K benchmarks. Oh... Oh my. My vision must be failing me... HELP ME LORD, FOR THINE EYES HATH FORSAKEN THEE! Oh wait, never mind. There were no 4K benchmarks in that private message. Huh.

Still, 3DMark is a poor representation of what to expect in real gaming. In 3DMark, the 6200 scores well below how it actually performs in games: it scores 1712 at most in 3DMark, yet on average it is only 20% slower than the GTX 750 in games, while the GTX 750 averages 4800 in 3DMark. I would not consider this a valid metric for "guessing" the architectural improvements of an iGPU, let alone use it as a means of saying something will perform faster than something else.
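
For what it's worth, the mismatch is easy to put numbers on. Below is a minimal Python sketch using only the figures quoted above; the 3DMark scores and the ~20% average gaming deficit come from this post and are not independently verified:

# Figures quoted in this post (not independently verified).
iris_6200_3dmark = 1712   # Iris Pro 6200, best 3DMark score cited
gtx_750_3dmark = 4800     # GTX 750, average 3DMark score cited
gaming_deficit = 0.20     # the 6200 runs ~20% slower than the 750 in games

# Ratio the synthetic benchmark implies vs. the ratio games actually show.
synthetic_ratio = gtx_750_3dmark / iris_6200_3dmark  # ~2.80x
gaming_ratio = 1 / (1 - gaming_deficit)              # 1.25x

print(f"3DMark implies the GTX 750 is {synthetic_ratio:.2f}x faster")
print(f"Gaming averages imply it is only {gaming_ratio:.2f}x faster")

If those figures hold, the synthetic gap is more than double the real-world gap, which is exactly why I would not use 3DMark to guess at architectural improvements.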

 

Too many variables to be certain of anything at this point.

Ah, but then you have gaming benchmarks that still put Skylake between 20 and 25% above Broadwell at the same SKU levels.

 

Where are those? Remember, games take CPU speed into consideration, so I need those 4K benchmarks you talked about earlier.

 

I know in the 6700K review LTT did either 4K or 1440p on the HD 530. I'll see if I can't hunt down a gaming 5500 review from JTC, Anandtech, or another well-known outlet. Might have to go to murkier waters.

 

I just watched that video. They did Far Cry 4 at 4K and that Cities: Skylines game at 4K. They did it on a 980 Ti, not the HD 530.

 

 

Now, my eyes might still be failing me for all I know. I do require a new glasses prescription, so I will summon the two people who were also included in that private chat: @Shakaza and @Prysin

 

Perhaps they can find them in that private chat. In the meantime, let's check the thread you said you also linked them in. Oh boy... 24 pages. This might take a while.

 

http://linustechtips.com/main/topic/440575-updated-oxide-responds-to-aots-conspiracies-maxwell-has-no-native-support-for-dx12-asynchronous-compute/?p=5908338

 

That is where you first responded to me, saying you gave me the evidence. 

 

http://linustechtips.com/main/topic/440575-updated-oxide-responds-to-aots-conspiracies-maxwell-has-no-native-support-for-dx12-asynchronous-compute/?p=5908859

 

This is where you gave me broken links to 3DMark, which, as we proved in that private chat, were way off: the GTX 750 absolutely killed the Iris Pro 6200, to the tune of three- to four-fold. Because, as I stated in that private chat, 3DMark is not a representation of real-world performance. (You being beaten with your own source; would that be considered ironic? Then again, I don't understand irony, according to you.)

 

http://linustechtips.com/main/topic/440575-updated-oxide-responds-to-aots-conspiracies-maxwell-has-no-native-support-for-dx12-asynchronous-compute/?p=5917438

 

This is where you used an excuse to ignore my requests for that benchmark, and it marked the last time we spoke of the subject. You used it to avoid me because you did not have it. You did not have it because it does not exist. It does not exist, which means you are wrong. You are wrong, so you made a mistake. You made a mistake, so you can learn from it.

 

I only hope @Shakaza doesn't come to rescue you again this time. I don't know if I can face that man's wrath a second time and live. He is far better at this than you are.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!


-snippity-

While I'd love to give either of you a hand here, I have more important things that I should be doing right now. (Yes, homework, yes, I shouldn't be looking at an internet forum in the first place. I get distracted easily. >_>) Anyway, you give me waaay too much credit. And am I a man now? I still think of myself as a small, carefree child.

Why is the God of Hyperdeath SO...DARN...CUTE!?

 

Also, if anyone has their mind corrupted by an anthropomorphic black latex bat, please let me know. I would like to join you.


@MageTank The irony is that I accuse you of arguing semantics and then contradict you with fact while also building on it semantically to, admittedly, mock you. I suggest you reread with this in mind. Is juxtaposition that hard for you to understand?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd



I think you are saying that in hindsight. I totally caught you, and used your own argument against you. Notice how you only called that one specific piece out, and ignored the whole "you didn't provide benchmarks, therefore, you are wrong" part of my gigantic post? Yeah... classic deflection.

 

You should buy ESO. I'd love to play a real game with you.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!



No need to attack the whole brick wall when you only have to remove one piece to have it crumble.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd



"I can't beat you, so i'm gonna poke you once and claim victory" - Patrick Proctor

 

All that dramatic talk about how you have "beaten" me, and yet it's been over 22 days since I last asked you for proof, and you still run from me. Also, removing one brick from a brick wall won't cause it to crumble. Not when it's properly sealed and maintained, like my brick walls are. OH SNAP. EVEN YOUR ANALOGY GOT USED AGAINST YOU! OHHHHH. Okay. That was childish of me. I apologize for that.

 

Since you once again saw through my clever ruse to obtain those benchmarks that you promised, I'll retreat back into my cave, claiming total victory. Summon me whenever you get the masochistic urge to be stepped on.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!


Any news?

I'd like to know how Intel plans to respond to Patrick leaking information that clearly seems to be of high value to their company before it was publicly announced.

(Not that it was announced at all!)

Well done, Patrick. Evidence was a no-show again?



HEY, this is THE Patrick J. Proctor you are talking about. He has gotten professors and lawyers fired. He doesn't need to show you proof; he knows everything. How dare you slander this man's good name? He has contributed so much to this forum. Without him, who else would put down others in a manner that borders on pathological narcissism? Without Patrick, we would never grow as a species. He has opened my eyes to this world, and I am forever in his debt. I will not tolerate your tarnishing of his reputation.

 

Don't worry @patrickjp93, I've got your back.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!



:lol: He kinda reminds me of Sheldon, come to think of it.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro



Wow. I never realized that until you said it. Thank you for that.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!


@patrickjp93

I simply do not see why Intel would hire Keller at the moment.

Intel's own engineers are on a good track with their current projects (at least there has been no news otherwise).

Their SSD division has been doing well for ages (though their pricing of said SSDs is not good).

The only place Keller could fit at the moment, in a way that would make sense, is the new automotive security branch. Making custom chips there that are a bit more rugged than usual, with advanced security barriers, would MAYBE be his forte.

Other than this, for a chip designer who works on a project basis, I see no logical reason to hire him for any "normal" type of project Keller has been part of.

If Intel is struggling with Cannonlake, then that is a manufacturing issue, and that in turn is a machine design issue, which goes well beyond a chip designer's field of work...

Sorry mate, I think there is something else going on with that alleged investor call.

Also, Patrick: just because you have posted evidence once does not mean you have redeemed yourself.

Your way of acting still does not contribute positively to any sort of discussion.

You also mentioned to me, long ago, that "IBM clearly thought I was good enough"; now you state that you were an intern. Let me just jack your ego down a bit with some realistic facts...

No matter the company, mega-corp or family business, if you are taken in as an intern, they take you in as free labor.

So why did IBM choose you? More likely than not because huge corps only want the better students, who will hopefully not annoy or slow down their paid staff. A bad student would tie up a staff member's time, which is counterproductive for a large corp.

Until you are hired on either a contract or full-time basis, you are but convenient free labor to any company, big or small. Also remember that for the mega-corps, the students applying are not just from your uni; they are from ALL unis in the States, in Canada, in Europe. Everyone wants to work for the big guys.

So take a bottle of humility, take a deep breath, and act as if everyone is your equal.

The most successful people on this planet do not start out ruthless; they start out as sociable, friendly, fun people to be around. Those who start out ruthless, unless they have absurd amounts of money and power from the beginning, are always stomped into the ground and forgotten sooner or later. Not to mention, any co-worker will spend his or her time either avoiding you or undermining you out of spite if you act like a dick.


  • 2 weeks later...

Zen was expected to be fully taped out by 4th qtr 2014, with provisional sales by 2nd qtr 2015. Dec '14 came and went without a peep, as GloFo couldn't get their processes dialed in. The new date was 3rd-quarter-ish 2015, and it was much more vague. That was moved to 2nd qtr 2016, and now it's 4th qtr 2016. And none of it has anything to do with AMD; it's all on their foundry partner.

 

Actually, GloFo and TSMC co-built the new foundry. They may be competitors, but neither is willing to shoulder the complete cost of a $6 billion fab alone. On top of all that, the process is 16nm FinFET, not the Samsung 14nm FinFET. Different process, and not transferable.

 

You need some references for your claims here.

 

GloFo 14nm has been on the same schedule for as long as I've heard of it, though they did fail a 20nm node. In addition, it is quite likely AMD will be using 14nm LPP, though that is certainly not a guarantee. 16nm at TSMC would make sense, but AMD already shelled out tens of millions to GloFo for a node transition (per quarterly financial disclosure documents).



I think you're right. I think I'm conflating the GPU issues that TSMC is having at 16nm with GloFo's problems at 14nm. Either way, neither is coming up roses. And I just heard that GloFo is now saying 1st qtr 2017. That's way too long; I don't know if AMD can hang on that much longer. It's too bad they don't have the resources to whip out an FM4 chipset and refresh their line with new motherboards, to see if there is something that could supercharge the current lineup. Of course, if it were that easy, they would have done it by now.

 

It just goes to show what a powerhouse Intel is. They make it look easy.

Sir William of Orange: Corsair 230T - Rebel Orange, 4690K, GA-97X SOC, 16gb Dom Plats 1866C9,  2 MX100 256gb, Seagate 2tb Desktop, EVGA Supernova 750-G2, Be Quiet! Dark Rock 3, DK 9008 keyboard, Pioneer BR drive. Yeah, on board graphics - deal with it!


He did the Apple A4 and A5?

Well that guarantees that Zen won't be bad.

Did I read that right? WTF! The A4 was a joke, and the A5 was incompetent.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),



Intel doesn't really spend much more than AMD on producing a new CPU architecture. Most of Intel's R&D goes into its foundry construction, equipment purchases, and related expenses. After that you have InfiniBand/Omni-Path, USB, Thunderbolt, PCIe, Ethernet, and a plethora of standards Intel is basically the main innovator for, and then about 1.5 billion out of the whole 100 billion USD R&D budget is spent on new CPU/GPU architectures each year.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Yet another thread that got derailed by a battle of egos. Great work.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling



I don't see the derailment. The thread was basically done anyway.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd



Explain yourself.

While the A5 wasn't much of an upgrade over the A4, the A4 was a pretty damn powerful chip back in the day and is one of the reasons multi-core phone CPUs exist.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.



 

The A4 was a single-core CPU; the A5 was a dual-core CPU and a big upgrade over the A4, with significantly better single-threaded performance in addition to obviously superior multi-threaded performance.

 

http://www.anandtech.com/show/4225/the-ipad-2-review/4



Well, the A4 was a single-core Cortex-A8, and the A5 was a great upgrade: it brought a dual-core Cortex-A9 with out-of-order execution. The A5 is the reason multi-core SoCs exist.

 

Judge a product on its own merits AND the company that made it.


