AMD faces class action suit over Bulldozer misrepresentation

zMeul

Hell, if they'd at the very least die-shrunk K10 they'd have made a very good competitor to Sandy Bridge. And the "MAOR COREZ AND GHZ!!!!!!" is exactly the kind of thing that had Intel shitting bricks when they realised that to keep the Pentium 4 competitive they needed to push it to 4GHz and have it consume around 220W of power (kind of like a certain highly binned FX 8350 that AMD tried to sell for nearly $1000 originally. AMD really showed their desperation with that thing's launch).

Oh well. Engineers were given near total freedom for Zen and had the godlike Jim Keller overseeing them. I think Zen will deliver.

Oh well. Engineers were given near total freedom for Zen and had the godlike Jim Keller overseeing them. I think Zen will deliver.

Him leaving again, however, really isn't a good sign.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Him leaving again, however, really isn't a good sign.

But he worked there for two years. You do realise he only left after it was done, right? Look at his employment history. He was at AMD when K8 hit, he was at Apple when the (for the time) impressive A5 hit, etc. He's like a superhero. Once his job is complete, he flies away into the darkness of the night :ph34r:

Unfortunate. If engineers were allowed to do whatever they wanted for Bulldozer they probably would have had a very competitive chip. I hear the reason AMD went with CMT was because the management wanted to market "MOAR COREZ AND GHZ!!!!!!!"

CMT is a valid architecture philosophy. AMD's implementation was garbage but the theory behind it and the possibilities FOR it were always competitive.

CMT is a valid architecture philosophy. AMD's implementation was garbage but the theory behind it and the possibilities FOR it were always competitive.

I know cache latency was an issue with their implementation, what else?

I know cache latency was an issue with their implementation, what else?

The fact that 1 ALU has to wait for the other to finish most of the time because of the shared resources? The long pipeline?

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

The fact that 1 ALU has to wait for the other to finish most of the time because of the shared resources? The long pipeline?

Meh. A long pipeline shouldn't have been an issue with a good branch predictor...which means the branch predictor wasn't all that great?

Meh. A long pipeline shouldn't have been an issue with a good branch predictor...which means the branch predictor wasn't all that great?

Yep, it's bad (Intel had a really good branch predictor with the later Pentium 4, and their pipeline length was insane).
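If you want a feel for how much a mispredicted branch can cost, here's a minimal C++ sketch (my own illustration, not anything AMD- or Intel-specific; absolute times depend entirely on the CPU, and at higher optimisation levels some compilers turn the branch into branchless code, which hides the effect). The same loop runs over the same values twice: once shuffled so the branch is essentially random, once sorted so the predictor guesses right almost every time.

// branch_demo.cpp -- build with: g++ -O1 branch_demo.cpp -o branch_demo
#include <algorithm>
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <random>
#include <vector>

// Sums every element >= 128; the if() is the data-dependent branch
// the predictor has to guess.
static std::uint64_t sum_over_threshold(const std::vector<int>& v) {
    std::uint64_t sum = 0;
    for (int x : v)
        if (x >= 128)
            sum += x;
    return sum;
}

static void time_it(const char* label, const std::vector<int>& v) {
    auto t0 = std::chrono::steady_clock::now();
    volatile std::uint64_t s = sum_over_threshold(v);  // volatile: keep the work
    (void)s;
    auto t1 = std::chrono::steady_clock::now();
    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();
    std::printf("%-26s %lld ms\n", label, static_cast<long long>(ms));
}

int main() {
    std::vector<int> data(1 << 24);
    std::mt19937 rng(42);
    std::uniform_int_distribution<int> dist(0, 255);
    for (int& x : data) x = dist(rng);

    time_it("shuffled (unpredictable):", data);  // branch outcome ~random
    std::sort(data.begin(), data.end());
    time_it("sorted (predictable):", data);      // long runs of taken/not-taken
    return 0;
}

The deeper the pipeline, the more work gets thrown away on every wrong guess, so the gap between the two runs is the interesting number here.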

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Yep, it's bad (Intel had a really good branch predictor with the later Pentium 4, and their pipeline length was insane).

Oh, but didn't Pentium 4 suck? AMD's processors thrashed Intel back when Pentium 4 with NetBurst came out. 

Oh, but didn't Pentium 4 suck? AMD's processors thrashed Intel back when Pentium 4 with NetBurst came out. 

Yep, it sucked specifically because Intel sacrificed everything for the highest clock speed. That didn't work out so well for them and they ended up dropping it in 2008.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Yep, it sucked specifically because Intel sacrificed everything for the highest clock speed. That didn't work out so well for them and they ended up dropping it in 2008.

Pretty surprised AMD didn't learn from Intel's mistake of trying to chase high clocks :P

Pretty surprised AMD didn't learn from Intel's mistake of trying to chase high clocks :P

This is AMD we are talking about, and the main people behind Bulldozer are no longer part of AMD because they left or got fired....

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

This is AMD we are talking about, and the main people behind Bulldozer are no longer part of AMD because they left or got fired....

Speaking of, if AMD hits the performance target for Zen, do you think Zen+ will be able to compete with Ice Lake?

That is...ancient... How the Hell does the board it's attached to still work? Under electrical current copper oxidizes fairly rapidly, not as rapidly as silver, but still...

copper lasts a while... a looong while...

take it from an electrician who has been working in houses that were originally wired in 1910...

 

If you can still push 10 amps through a 105-year-old cable that was rated for 10 amps in 1910, then you can use the mobo just fine. Yes, there is degradation, but as long as the board does not allow oxygen to easily come into contact with the copper, the oxidation process will slow down A LOT.

Speaking of, if AMD hits the performance target for Zen, do you think Zen+ will be able to compete with Ice Lake?

Nope. Looking at AMD's track record they can get it close, but they won't match it.

 

copper lasts a while... a looong while...

take it from an electrician who has been working in houses that were originally wired in 1910...

 

If you can still push 10 amps through a 105-year-old cable that was rated for 10 amps in 1910, then you can use the mobo just fine. Yes, there is degradation, but as long as the board does not allow oxygen to easily come into contact with the copper, the oxidation process will slow down A LOT.

Well the motherboard does have a protective coating.

Edit: The air quality makes a difference as well.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Depends. The ones which keep Intel at bay from implementing a GDDR5 controller for a GPU expire in 2 years. The ones for a cluster-based GPU? 7 and 9 years respectively for AMD and Nvidia. The last time Intel tried was Larrabee, and Nvidia pulled out to avoid having a third financially equipped competitor (Intel historically sucked, but Larrabee was still a big enough improvement to make JSH think twice about letting it have free rein with Nvidia IP).

 

There are other fundamental techs that will last longer, but the big two are related to memory controllers and threading structure for dGPUs; without access to those, Intel is all but powerless. AMD, for all its charity, doesn't license its GPU tech to anyone except Samsung. It builds chips for Microsoft, Sony, and supposedly Nintendo, but in terms of licensing, Samsung is the only one who has external access to Radeon's IP.

Doesn't Qualcomm have the Adreno IP?

Which is Radeon, just for mobile?

AMD stated that the cores aren't like normal cores from the very beginning; this stinks of consumer entitlement for easy money to me. This is coming from an Intel user sitting on Sandy Bridge because AMD hasn't really put out anything to justify switching platforms.

 

>Buys AMD to support them

>Wants AMD to best Intel

>Files lawsuit which would take away R&D funding for better chips

I dunno what AMD stated or not, but in all the news and ads all I saw was 'WOW 8 CORE', 'FIRST 8 CORE DESKTOP PROCESSOR', etc. etc. etc.

 

These things are good for US CONSUMERS, so stop defending companies, especially their marketing. I mean lol, I know people are fond of the companies, but damn, this is good for us.

I dunno what AMD stated or not, but in all the news and ads all I saw was 'WOW 8 CORE', 'FIRST 8 CORE DESKTOP PROCESSOR', etc. etc. etc.

 

These things are good for US CONSUMERS, so stop defending companies, especially their marketing. I mean lol, I know people are fond of the companies, but damn, this is good for us.

http://anandtech.com/show/2881

 

Found this on AnandTech... so there was at least ONE article on this back in 2011, prior to Bulldozer launching.

Not really helping either side's case, though.

http://anandtech.com/show/2881

 

Found this on AnandTech... so there was at least ONE article on this back in 2011, prior to Bulldozer launching.

Not really helping either side's case, though.

AMD did say four of those "cores" or modules. Seems pretty cut and dried to me: 4 cores/modules. Plus AMD decided to change their definition of a core so that they wouldn't be seen as lying when they advertised their FX 8*** series as having 8 cores.
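For what it's worth, you can check how the OS itself counts them. Here's a rough Linux-only C++ sketch (my own, assuming the usual /proc/cpuinfo fields; how kernels map Bulldozer modules onto "core id" has also changed over the years) that compares the number of logical processors against the distinct (physical id, core id) pairs:

// core_count.cpp -- Linux only; build with: g++ -O2 core_count.cpp -o core_count
#include <cstdio>
#include <fstream>
#include <set>
#include <string>
#include <utility>

int main() {
    std::ifstream cpuinfo("/proc/cpuinfo");
    std::string line;
    int logical = 0;           // "processor" entries, i.e. what spec sheets call threads
    int phys = -1, core = -1;  // ids for the block currently being parsed
    std::set<std::pair<int, int>> cores;  // distinct (physical id, core id) pairs

    while (std::getline(cpuinfo, line)) {
        if (line.rfind("processor", 0) == 0) {
            ++logical;
        } else if (line.rfind("physical id", 0) == 0) {
            phys = std::stoi(line.substr(line.find(':') + 1));
        } else if (line.rfind("core id", 0) == 0) {
            core = std::stoi(line.substr(line.find(':') + 1));
        } else if (line.empty() && phys >= 0 && core >= 0) {
            cores.insert({phys, core});    // end of one processor block
            phys = core = -1;
        }
    }
    std::printf("logical processors:              %d\n", logical);
    std::printf("distinct (physical id, core id): %zu\n", cores.size());
    return 0;
}

On an FX 8*** chip the two numbers may or may not agree depending on how the kernel decides to report the modules, which is sort of the whole argument in this thread.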

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Really? I thought Rory did more harm than good...

In terms of PR, yeah, nightmare, but in terms of dealing with actual managerial overhead and structural issues, he was a very necessary evil.

AMD did say four of those "cores" or modules. Seems pretty cut and dried to me: 4 cores/modules. Plus AMD decided to change their definition of a core so that they wouldn't be seen as lying when they advertised their FX 8*** series as having 8 cores.

May I remind you that AMD is not THE RETAILER.

So if a retailer, say Newegg, wants to misuse the specs AMD has given the products, then Newegg is the one misrepresenting the product.

 

THAT IS THE IMPORTANT PART HERE!

AMD DOES NOT SELL THIS DIRECTLY. THEIR PARTNERS DO. HOWEVER, THEIR PARTNERS ARE BUSINESSES TOO; THEY WANT TO MAKE MONEY. SO IF THEY CAN TWIST THE TRUTH, THEY WILL!!!

Doesn't Qualcomm have the Adreno IP?

Which is Radeon, just for mobile?

Adreno is quite old at this point and is mobile only. Qualcomm is also its sole owner, since AMD sold all rights to it.

As far as I know the only company AMD is licensing its GPU tech to right now is Samsung, a complete non-competitor for the time being.

Speaking of, if AMD hits the performance target for Zen, do you think Zen+ will be able to compete with Ice Lake?

As long as Zen+ increases performance by 10-15%, it should stick around Skylake levels. And as such, if it is cheap enough, it will be competitive even if it performs slightly worse.

May I remind you that AMD is not THE RETAILER.

So if a retailer, say Newegg, wants to misuse the specs AMD has given the products, then Newegg is the one misrepresenting the product.

THAT IS THE IMPORTANT PART HERE!

AMD DOES NOT SELL THIS DIRECTLY. THEIR PARTNERS DO. HOWEVER, THEIR PARTNERS ARE BUSINESSES TOO; THEY WANT TO MAKE MONEY. SO IF THEY CAN TWIST THE TRUTH, THEY WILL!!!

AMD does sell directly to businesses and enterprise customers too. We'll have to see how this all pans out.

As long as Zen+ increases performance by 10-15%, it should stick around Skylake levels. And as such, if it is cheap enough, it will be competitive even if it performs slightly worse.

Do you think they'll be able to catch up?