AMD faces class action suit over Bulldozer misrepresentation

zMeul

Oh, you've got an old HTPC? Could you try decoding a 1080p (8-bit) video on the CPU and post the results? I know a 10-bit one won't work, but maybe an 8-bit one will. Please make sure it is not being decoded on the GPU if you decide to test it.
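
For anyone trying this, one way to time a pure CPU decode (a sketch only: ffmpeg uses its software decoder by default when no hwaccel is requested, and the file name here is hypothetical):

    # Decode only, discard the output, and print benchmark stats (CPU time, max memory)
    ffmpeg -benchmark -i sample_1080p_8bit.mkv -f null -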

 

I have to agree with Patrick on this one, though. If you get a BSOD, then it sounds like an issue with hardware (or possibly the programs you use). At worst it should just be extremely laggy.

 

 

 

 

It's just H.264 but with 10 bits of internal color precision instead of the 8 bits regular H.264 files use. It allows for smaller files at the same quality, but it breaks hardware-accelerated playback: most GPUs don't support 10-bit H.264, so it has to be decoded on the CPU instead.
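
If you want to check whether a given file is 10-bit before testing, a quick sketch using ffprobe (ships with ffmpeg; input.mkv is a placeholder name). A 10-bit file typically reports a pixel format like yuv420p10le, while regular 8-bit H.264 reports yuv420p:

    # Print the video stream's pixel format
    ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt -of default=noprint_wrappers=1 input.mkv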

It could do it when it was new... that old HTPC is 4-5 years old now... it struggles just to boot.

Yes, I did a complete wipe and fresh install on it...

 

Let's just say: Intel Atom + Nvidia ION = fail.

the end.

 

Today, as it currently is, it struggles to play a 1080p YouTube stream :|

 

I think I cooked it a little, as I mounted it on the back of an old LCD monitor which got notoriously hot... that marriage lasted 2 years, with me playing games (thus loading it 100%) for 8+ hours a day, which didn't make shit better.


Good old-fashioned American class action lawsuit, over something as stupid as what Subway had a class action lawsuit over.

Sigh... The reason AMD's CPUs don't perform as well isn't necessarily the layout but rather the fact that they use an older manufacturing process.

Maybe the four-module/two-cores-per-module design has something to do with it, but they're not really misrepresenting the product(s), because they do technically have 8 cores.

a Moo Floof connoisseur and curator.

:x@handymanshandle x @pinksnowbirdie || Jake x Brendan :x
Youtube Audio Normalization
 

 

 


If the lawyers win this, they're going to have a field day with Kaveri, counting the GPU cores too... It's carefully labeled as 4+8 instead of 12 for precisely this reason, but I don't think the lawyers will care.

I cannot be held responsible for any bad advice given.

I've no idea why the world is afraid of 3D-printed guns when clearly 3D-printed crossbows would be more practical for now.

My rig: The StealthRay. Plans for a newer, better version of its mufflers are already being made.


Good old-fashioned American class action lawsuit, over something as stupid as what Subway had a class action lawsuit over.

Sigh... The reason AMD's CPUs don't perform as well isn't necessarily the layout but rather the fact that they use an older manufacturing process.

Maybe the four-module/two-cores-per-module design has something to do with it, but they're not really misrepresenting the product(s), because they do technically have 8 cores.

Same with the APUs having 12 cores.

You are on a need to know basis, and you don't need to know.

 

 


Same with the APUs having 12 cores.

I guess that's one way of looking at it, but those extra cores are GPU cores.

The 8-core FX CPUs physically have 8 cores; it's just that there are 4 modules with 2 cores per module. All 8 cores are functional. AMD markets the quad-core APUs as 4+8, which is technically 12 cores, but 8 of them are the iGPU.

a Moo Floof connoisseur and curator.

:x@handymanshandle x @pinksnowbirdie || Jake x Brendan :x
Youtube Audio Normalization
 

 

 


I guess that's one way of looking at it, but those extra cores are GPU cores.

Yep. And AMD calls them Compute Cores, not CPU or GPU cores, so they are not misleading the customer. It's the consumer's fault for believing it has 12 CPU cores and then complaining about false advertising.

You are on a need to know basis, and you don't need to know.

 

 


Sounds interesting. Got any link to it? I would like to compare encoding/decoding speed and compression ratio for myself if that's okay.

What settings did you use to create the H.264 file?

 

Wait, so you are saying that compressing/decompressing things with WinZip is just as fast with or without hardware acceleration enabled? That's strange, because the benchmarks showed that WinZip becomes a lot faster when you enable it. Earlier in the thread you said that the extra speed didn't matter because people who need to compress and decompress things quickly will have a beefy CPU to do it with anyway, and for the average Joe the few seconds didn't matter. That to me sounded like "no to free performance".

 

What trade-off are you referring to?

I never made an H.264 file. I made a file that was an encoding of a RAW format; it was an alternate compression method. Now, I will be the first to admit it didn't have I-frames to be able to quickly start from various places in the stream (and it's not streamable, since no container could support it). I just made the compression and decompression schemes. A file goes in and gets compressed to about 1/12 the size. I run the decompressor on it, and I get back a raw file. If I then run the diff command under Linux on the two files, the total difference is less than 2% over the whole volume of the file, and it's still viewable in raw editing software, so the file's usefulness is still intact. Some of the colors gained back are imperfect, but the bulk of them are right or off by at most a value of 5 in a single channel (an average difference of 2,2,2 across the 3 channels in the worst-case loss for a mixed color). Noticeable only if you're looking for it. And it runs in very acceptable time for me being an amateur at the art.
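
A minimal sketch of that kind of byte-level comparison (this is not the author's actual tool; the file names are hypothetical, and it assumes two equal-sized raw dumps):

    /* Compare two raw files byte-by-byte: report how many bytes differ
       and the largest per-channel delta between them. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        FILE *a = fopen("original.raw", "rb");
        FILE *b = fopen("roundtrip.raw", "rb");
        if (!a || !b) { perror("fopen"); return 1; }

        long long total = 0, changed = 0;
        int ca, cb, max_delta = 0;
        while ((ca = fgetc(a)) != EOF && (cb = fgetc(b)) != EOF) {
            int d = abs(ca - cb);
            if (d > 0) changed++;
            if (d > max_delta) max_delta = d;
            total++;
        }
        printf("bytes compared: %lld\n", total);
        printf("bytes changed: %lld (%.2f%%)\n", changed,
               total ? 100.0 * (double)changed / (double)total : 0.0);
        printf("max per-channel delta: %d\n", max_delta);
        fclose(a);
        fclose(b);
        return 0;
    }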

 

You make the trade-off of being hot and wasteful with your resources. People say we need better CPUs; I say we need software that actually uses most of the performance already on the table before we complain that Intel's still selling quad cores at the top of its mainstream offering.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


If it's a quad core, we would be seeing better gaming performance than we are currently getting with FX processors.

Here's a hint: disable 1 ALU and you see better performance in games. Still behind the performance of a 45nm Phenom II X4, but better.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Yep. And AMD calls them Compute Cores, not CPU or GPU cores, so they are not misleading the customer. It's the consumer's fault for believing it has 12 CPU cores and then complaining about false advertising.

Literally no CPU before AMD's APUs was defined as having X amount of "compute cores", and the same goes for ALUs in Bulldozer. Everyone was expecting a CPU with 8 individual processing cores that could each function as a full CPU in their own right, not a modified Phenom II X2/X3/X4 with 2 ALUs per module or "core". AMD knew that people would be misled by the terminology, and they still insisted on using it.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I never made an H.264 file. I made a file that was an encoding of a RAW format; it was an alternate compression method. Now, I will be the first to admit it didn't have I-frames to be able to quickly start from various places in the stream (and it's not streamable, since no container could support it). I just made the compression and decompression schemes. A file goes in and gets compressed to about 1/12 the size. I run the decompressor on it, and I get back a raw file. If I then run the diff command under Linux on the two files, the total difference is less than 2% over the whole volume of the file, and it's still viewable in raw editing software, so the file's usefulness is still intact. Some of the colors gained back are imperfect, but the bulk of them are right or off by at most a value of 5 in a single channel (an average difference of 2,2,2 across the 3 channels in the worst-case loss for a mixed color). Noticeable only if you're looking for it. And it runs in very acceptable time for me being an amateur at the art.

 

You make the trade-off of being hot and wasteful with your resources. People say we need better CPUs; I say we need software that actually uses most of the performance already on the table before we complain that Intel's still selling quad cores at the top of its mainstream offering.

Oh, so it was not really a lossless compression (except for some errors)? I am far less impressed now.

Lossy compression is far harder to write well, especially if you want to get anywhere near the size:quality ratio of, for example, H.264 (which uses a lot of complex predictions, filters and other techniques). You can find more info about it on this website (login: fj@mailinator.com Password: Qwerty123).

So you just want programs to, for example, use newer instructions? Why run it on the CPU at all when the GPU has so much unused power? I rarely pat AMD on the back, but I think they are on the right track with HSA and moving tasks from the CPU to the GPU.

 

 

I am kind of disappointed that you just ignored all the stuff I wrote to prove you wrong on a wide variety of things.


Oh, so it was not really a lossless compression (except for some errors)? I am far less impressed now.

Lossy compression is far harder to write well, especially if you want to get anywhere near the size:quality ratio of, for example, H.264 (which uses a lot of complex predictions, filters and other techniques). You can find more info about it on this website (login: fj@mailinator.com Password: Qwerty123).

So you just want programs to, for example, use newer instructions? Why run it on the CPU at all when the GPU has so much unused power? I rarely pat AMD on the back, but I think they are on the right track with HSA and moving tasks from the CPU to the GPU.

 

 

I am kind of disappointed that you just ignored all the stuff I wrote to prove you wrong on a wide variety of things.

No need to be disappointed, it's PatrickJP after all.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Anyone thinking this suit should go forward should be sterilized.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


1: As far as I know, AMD never claimed the processors had 8 fully independent cores.

2: There is no single concrete definition of what a single core actually has to contain in terms of CPU hardware; all there is is a general idea that has been sold as fact.

3: AMD can market their stuff however they want; just because Intel does it one way and uses certain terms doesn't mean AMD has to follow suit.

4: Complaining is one thing, but filing a most likely useless class action lawsuit really won't get anyone but the lawyers anything.

 

 

This is a HUGE waste of money for AMD, because defending themselves will be super expensive and the money used for that could be put to use making better CPUs...

WHY ARE PEOPLE SO STUPID!!!!!

GAAAAAAHHHH!!!!!

Use this guide to fix text problems in your post
Go here and here for all your power supply needs

 

New Build Currently Under Construction! See here!!!! -----> 

 

Spoiler

Deathwatch:[CPU I7 4790K @ 4.5GHz][RAM TEAM VULCAN 16 GB 1600][MB ASRock Z97 Anniversary][GPU XFX Radeon RX 480 8GB][STORAGE 250GB SAMSUNG EVO SSD Samsung 2TB HDD 2TB WD External Drive][COOLER Cooler Master Hyper 212 Evo][PSU Cooler Master 650M][Case Thermaltake Core V31]

Spoiler

Cupid: [CPU Core 2 Duo E8600 3.33GHz][RAM 3 GB DDR2][750GB Samsung 2.5" HDD/HDD Seagate 80GB SATA/Samsung 80GB IDE/WD 325GB IDE][MB Acer M1641][CASE Antec][PSU Altec 425 Watt][GPU Radeon HD 4890 1GB][TP-Link 54MBps Wireless Card]

Spoiler

Carlile: [CPU 2x Pentium 3 1.4GHz][MB ASUS TR-DLS][RAM 2x 512MB DDR ECC Registered][GPU Nvidia TNT2 Pro][PSU Enermax][HDD 1 IDE 160GB, 4 SCSI 70GB][RAID CARD Dell Perc 3]

Spoiler

Zeonnight [CPU AMD Athlon x2 4400][GPU Sapphire Radeon 4650 1GB][RAM 2GB DDR2]

Spoiler

Server [CPU 2x Xeon L5630][PSU Dell Poweredge 850w][HDD 1 SATA 160GB, 3 SAS 146GB][RAID CARD Dell Perc 6i]

Spoiler

Kero [CPU Pentium 1 133Mhz] [GPU Cirrus Logic LCD 1MB Graphics Controller] [Ram 48MB ][HDD 1.4GB Hitachi IDE]

Spoiler

Mining Rig: [CPU Athlon 64 X2 4400+][GPUS 9 RX 560s, 2 RX 570][HDD 160GB something][RAM 8GBs DDR3][PSUs 1 Thermaltake 700w, 2 Delta 900w 120v Server modded]

RAINBOWS!!!

 

QUOTE ME SO I CAN SEE YOUR REPLIES!!!!


Oh, so it was not really a lossless compression (except for some errors)? I am far less impressed now.

Lossy compression is far harder to write well, especially if you want to get anywhere near the size:quality ratio of, for example, H.264 (which uses a lot of complex predictions, filters and other techniques). You can find more info about it on this website (login: fj@mailinator.com Password: Qwerty123).

So you just want programs to, for example, use newer instructions? Why run it on the CPU at all when the GPU has so much unused power? I rarely pat AMD on the back, but I think they are on the right track with HSA and moving tasks from the CPU to the GPU.

I am kind of disappointed that you just ignored all the stuff I wrote to prove you wrong on a wide variety of things.

To your latter point, I've been implementing artificial intelligence game players combined with networking all day. I'll take the time to respond to those inanities at length at a future date after my graduate knowledge representation homework due Tuesday night, my presentation of the project on Wednesday, my exam on Thursday, my AI and graphics homework due Friday, and at least 6 hours of sleep going into Saturday. I may also grade some undergrads' homework and labs in-between all that. Before you ask about tomorrow (Monday), I'm finishing the AI project.

For the record, H.264 isn't lossless without using the maximum bit rates. Mine's so close you can't even tell unless you look at the bytes themselves or line up two copies of the same frame and check pixels with a magnifying glass.

And why are you so much less impressed? I'm within 2% of your ratio with hardly any effort. Now, is it streamable? Not unless someone writes a container for it. Can you skip to wherever you want in the movie? Not easily, since I don't use I-frames. That said, VP9 and H.264 are not implemented well from a data and instruction-level parallelism standpoint, or the compilation is garbage. If you're telling me it takes thousands of operations per bit to decompress a video, or to encode it with high retention of quality and a high compression ratio, I'm going to laugh at your expense. It's bullshit.

Because unless the CPU is busy, you're wasting energy putting it on the GPU.

HSA is 3 years late to that party on design alone, 5 and growing on implementation in the real world.

#pragma omp target nowait /* OpenMP 4.5: offload asynchronously to the iGPU */
{ /* decoder already written in C/C++/Fortran here */ }
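
For context, a self-contained sketch of the same offload idea (a sketch only, assuming a toolchain with OpenMP 4.5 target-offload support, e.g. gcc -fopenmp; the array and loop are arbitrary placeholders, not the decoder itself):

    #include <stdio.h>

    int main(void)
    {
        enum { N = 1024 };
        float x[N];
        for (int i = 0; i < N; i++) x[i] = (float)i;

        /* Offload the loop to the default device (e.g. an iGPU); the map
           clause copies x to the device and back. Per the spec, this falls
           back to the host if no offload device is available. */
        #pragma omp target teams distribute parallel for map(tofrom: x)
        for (int i = 0; i < N; i++)
            x[i] *= 2.0f;

        printf("x[10] = %f\n", x[10]); /* expect 20.000000 */
        return 0;
    }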

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


-Snippity snip-

I'm beginning to think you don't sleep, and you're just a machination of advanced Intel processing power from the future, sent back in time specifically to see how it would develop in the 21st century. I mean, I get a headache just reading that list of things to do.


I'm beginning to think you don't sleep, and you're just a machination of advanced Intel processing power from the future, sent back in time specifically to see how it would develop in the 21st century. I mean, I get a headache just reading that list of things to do.

That's light compared to 3 weeks ago: 3 exams in 2 days, a 10-page paper, and performances with 2 choirs in a single concert at the end of that week (Knowledge Representation by Gelfond up through Dynamic Domains, Linear Algebra from the definition of a matrix to change of basis in Anton's book, 11th edition, and Artificial Intelligence on everything from graph/tree searches up through Alpha-Beta game playing. Now we're on neural networks, learning, and knowledge representation.) Each choir's set was fully memorized: 10 songs each, averaging 6 voice parts per song, in the following languages: English, French, Spanish, German, Latin, Swahili, and Haitian Creole.

Chorale (Mixed): Sigalagala, Twa Tanbou, Gede Nebo, Children Go Where I Send Thee, Noche de Lluvia, Sensemaya, J'entends le Moulin, Cloudburst, Unclouded Day, Precious Lord

Men's Glee Club: Baba Yetu (Civilization IV), 2 Trinklieder by Schubert in German, In Taberna Quando Sumus (not a single repeated phrase the whole song), Two Meditations: Jonah's Song, Behold Man, Pye Aleman, Lux Aurumque, (Cheesy Traditional Set) Java Jive, Johnny Schmoeker, Parting Blessing.

On top of all this, I'm also doing a capstone project building an Apple Watch app for Key Bank, an actual app that will be deployed to the store in another 2 months.

I get very little sleep this semester.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


That's light compared to 3 weeks ago: 3 exams in 2 days, a 10-page paper, and performances with 2 choirs in a single concert at the end of that week (Knowledge Representation by Gelfond up through Dynamic Domains, Linear Algebra from the definition of a matrix to change of basis in Anton's book, 11th edition, and Artificial Intelligence on everything from graph/tree searches up through Alpha-Beta game playing. Now we're on neural networks, learning, and knowledge representation.) Each choir's set was fully memorized: 10 songs each, averaging 6 voice parts per song, in the following languages: English, French, Spanish, German, Latin, Swahili, and Haitian Creole.

Chorale (Mixed): Sigalagala, Twa Tanbou, Gede Nebo, Children Go Where I Send Thee, Noche de Lluvia, Sensemaya, J'entends le Moulin, Cloudburst, Unclouded Day, Precious Lord

Men's Glee Club: Baba Yetu (Civilization IV), 2 Trinklieder by Schubert in German, In Taberna Quando Sumus (not a single repeated phrase the whole song), Two Meditations: Jonah's Song, Behold Man, Pye Aleman, Lux Aurumque, (Cheesy Traditional Set) Java Jive, Johnny Schmoeker, Parting Blessing.

I get very little sleep this semester.

I feel sorry for you. 


I feel sorry for you.

The price of a master's in 4 years.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


The price of a master's in 4 years.

Ugh. I don't know if I wanna do comsci or computer systems engineering. If I take comsci to a higher level, I'd probably eventually learn about how x86 and ARM function, right? Tbh CPU design is far more interesting to me, but I don't know if I am capable enough to do it...


Ugh. I don't know if I wanna do comsci or computer systems engineering. If I take comsci to a higher level, I'd probably eventually learn about how x86 and ARM function, right? Tbh CPU design is far more interesting to me, but I don't know if I am capable enough to do it...

Computer architecture (278 at Miami) was my first semester of sophomore year. We did x86, MIPS, and ARM programming as well as analysis of compiled programs from GCC. Comp Sci is more programming. If you like hardware, go systems design. Be aware, though, that the required electro physics gets very difficult very fast starting sophomore year. You either pay attention and get your B or above in QM, or you move to another part of systems design and think on bigger scales, such as architecting a server/cluster instead of components of a chip.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Computer architecture (278 at Miami) was my first semester of sophomore year. We did x86, MIPS, and ARM programming as well as analysis of compiled programs from GCC. Comp Sci is more programming. If you like hardware, go systems design. Be aware, though, that the required electro physics gets very difficult very fast starting sophomore year. You either pay attention and get your B or above in QM, or you move to another part of systems design and think on bigger scales, such as architecting a server/cluster instead of components of a chip.

Yeah, I don't know what interests me right now. And I have to choose in January next year, since I'm in my last year of high school/college.

Yeah, I don't know what interests me right now. And I have to choose in January next year, since I'm in my last year of high school/college.

I am in the same boat for decisions, and I seem to always gravitate back toward network systems administration, or just getting an electrical engineering degree and becoming an electrician. Not highly technical, that last one, but hey, it pays well, and it will afford me all the headphones. Every. Single. One.


I am in the same boat for decisions, and I seem to always gravitate back toward network systems administration, or just getting an electrical engineering degree and becoming an electrician. Not highly technical, that last one, but hey, it pays well, and it will afford me all the headphones. Every. Single. One.

Haha. I'm stuck between systems engineering and comsci, but I'm gravitating to the one that is stupid hard. Computer systems -.-
