
AMD faces class action suit over Bulldozer misrepresentation

zMeul

You did not answer my question. Do you think Intel should have advertised their i5-2500K as a 16 core processor? If they did, do you think they would have done so in an attempt to trick customers? That is exactly what AMD is doing, and I think the answers are "no, they should not be allowed to do it, and yes, they are doing it to trick customers".

All that stuff about Vulkan/DX12 is irrelevant to the conversation. The conversation is purely "should you be allowed to take the sum of CPU and GPU cores and advertise that under a generic core term?" AMD is doing it, and I think it is wrong. They should specify how many CPU cores it has and how many GPU cores it has. They should not be added together and advertised as a single number, ever.

 

As for the Twitter account, yes it is an official account and it has almost 50K followers.


Not really. They might have broken the law (that's up to the judge to decide). We don't let robbers get away with their crimes just because they might have cancer. Their health (physical or financial) does not put them above the law.

 

Personally I think the whole "module vs core" debate is not really anything to sue them for. It's kind of a gray zone and I don't think they did it on purpose. It's just a byproduct of how the architecture works. I think the whole concept of "let's advertise the CPU and GPU cores together as a single number" is a far bigger issue that genuinely deserves to get sued over. That is nothing but a dirty attempt to trick people and has nothing to do with the architecture.

Can Intel use GPGPU with their i5-2500K?

 

I don't think they had actual compute capabilities with their HD series back then. So no.

 

AMD's Kaveri, on the other hand, CAN use the GPU to do CPU tasks, IF the program is written to make use of it.

 

Also, so "@AMDAPU" screwed up and didn't list that it is 4 CPU cores and 8 GPU cores. Then again, in every other slide, description, and product label, the A6, A8 and A10 series APUs say "total cores = CPU + GPU cores". And AMD specifically calls them compute cores.

Also, the number of followers does not make an account official.


Can Intel use GPGPU with their i5-2500K?

I don't think they had actual compute capabilities with their HD series back then. So no.

AMD's Kaveri, on the other hand, CAN use the GPU to do CPU tasks, IF the program is written to make use of it.

Also, so "@AMDAPU" screwed up and didn't list that it is 4 CPU cores and 8 GPU cores. Then again, in every other slide, description, and product label, the A6, A8 and A10 series APUs say "total cores = CPU + GPU cores". And AMD specifically calls them compute cores.

Also, the number of followers does not make an account official.

No, the Sandy Bridge iGPU is not programmable for OpenGL 3 or OpenCL. It's a fixed-function pipeline GPU.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


No, the Sandy Bridge iGPU is not programmable for OpenGL 3 or OpenCL. It's a fixed-function pipeline GPU.

Yeah, that's what I thought.

AFAIK, only their most recent, aka the HD 500 series (Skylake) and the Iris Pro 6200, support OpenGL 2.0 from what I've read in reviews... I've seen no proof that they work or can execute the way Kaveri can. But they are said to support it.


Can Intel use GPGPU with their i5-2500K?

 

I don't think they had actual compute capabilities with their HD series back then. So no.

Yes you can.

QuickSync, for example, is a GPGPU task, and that launched with Sandy Bridge. GPGPU is not just limited to OpenCL, OpenGL, Vulkan and DirectX 12.

Also, it doesn't have to use all 16 "compute cores" for the same task, right? If it has 12 GPU cores doing one job and 4 CPU cores doing another job, then it's still 16 cores that are computing things.

 

AMD's Kaveri, on the other hand, CAN use the GPU to do CPU tasks, IF the program is written to make use of it.

Same with the iGPU Intel has.

 

Also, so "@AMDAPU" screwed up and didn't list that it is 4 CPU cores and 8 GPU cores. Then again, in every other slide, description, and product label, the A6, A8 and A10 series APUs say "total cores = CPU + GPU cores". And AMD specifically calls them compute cores.

Also, the number of followers does not make an account official.

I have seen them listed as just "compute cores", without the "(CPU + GPU cores)", on sites before. That's how I found out about this stupid classification. I saw some "10 compute core" chip on Newegg and had to Google it to find out that it was 4 CPU cores and 6 GPU cores (I think those were the numbers). It could have been a mess-up on Newegg's part, but it does not change the fact that AMD is clearly using the term "compute core" as a tool to deceive people into thinking their APUs are more powerful. There is no other explanation for why they would invent a new term and start using it so heavily in their marketing material.


No, the Sandy Bridge iGPU is not programmable for OpenGL 3 or OpenCL. It's a fixed-function pipeline GPU.

Well first of all, it does support OpenGL 3.3. Secondly, you don't need OpenGL 3 or OpenCL for GPGPU.

It certainly helps when programming, but it is by no means a requirement.


Yes you can.

QuickSync, for example, is a GPGPU task, and that launched with Sandy Bridge. GPGPU is not just limited to OpenCL, OpenGL, Vulkan and DirectX 12.

Also, it doesn't have to use all 16 "compute cores" for the same task, right? If it has 12 GPU cores doing one job and 4 CPU cores doing another job, then it's still 16 cores that are computing things.

 

Same with the iGPU Intel has.

 

I have seen them listed as just "compute cores", without the "(CPU + GPU cores)", on sites before. That's how I found out about this stupid classification. I saw some "10 compute core" chip on Newegg and had to Google it to find out that it was 4 CPU cores and 6 GPU cores (I think those were the numbers). It could have been a mess-up on Newegg's part, but it does not change the fact that AMD is clearly using the term "compute core" as a tool to deceive people into thinking their APUs are more powerful. There is no other explanation for why they would invent a new term and start using it so heavily in their marketing material.


Well first of all, it does support OpenGL 3.3. Secondly, you don't need OpenGL 3 or OpenCL for GPGPU.

It certainly helps when programming, but it is by no means a requirement.

First of all, let me correct myself: I meant OpenCL 2.0, not GL.

 

Second of all: do you even listen?

Yes, you can use GPGPU for non-gaming tasks... but for tasks SUCH AS rendering via the iGPU, you want to use DX, CUDA or OpenCL... From what I know, I CANNOT assign my Core i7-4790K's iGPU to render for me in Adobe software (which is OpenCL and OpenGL acceleration capable).

 

So clearly, Intel's HD series is supported on paper, but not actually given ANY function...

 

Third of all:

Under DX12 and Vulkan, you can use Kaveri to execute out-of-order compute tasks while doing iGPU work and CPU work.

CAN INTEL HD DO THIS? NO!

CAN IT THEN BE USED AS A PROPER COMPUTE CORE CPU? NO!


If you only have 4 FPUs per 8 integer units, you have 4 cores. A core must have both integer and floating-point capability. So yes, AMD was always lying about Bulldozer.

Says who? Can you please source an industry-standard definition of a CPU core that doesn't come from AMD or Intel? CPU cores typically have an FPU in them, but I've seen no definition that says a core "must" have one.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


Yes you can.

QuickSync, for example, is a GPGPU task, and that launched with Sandy Bridge. GPGPU is not just limited to OpenCL, OpenGL, Vulkan and DirectX 12.

Also, it doesn't have to use all 16 "compute cores" for the same task, right? If it has 12 GPU cores doing one job and 4 CPU cores doing another job, then it's still 16 cores that are computing things.

 

Same with the iGPU Intel has.

 

I have seen them listed as just "compute cores", without the "(CPU + GPU cores)", on sites before. That's how I found out about this stupid classification. I saw some "10 compute core" chip on Newegg and had to Google it to find out that it was 4 CPU cores and 6 GPU cores (I think those were the numbers). It could have been a mess-up on Newegg's part, but it does not change the fact that AMD is clearly using the term "compute core" as a tool to deceive people into thinking their APUs are more powerful. There is no other explanation for why they would invent a new term and start using it so heavily in their marketing material.


Well first of all, it does support OpenGL 3.3. Secondly, you don't need OpenGL 3 or OpenCL for GPGPU.

It certainly helps when programming, but it is by no means a requirement.

It doesn't use programmable shaders. The driver for it just converts everything to fixed-function GLSL.



First of all, let me correct myself: I meant OpenCL 2.0, not GL.

 

Second of all: do you even listen?

Yes, you can use GPGPU for non-gaming tasks... but for tasks SUCH AS rendering via the iGPU, you want to use DX, CUDA or OpenCL... From what I know, I CANNOT assign my Core i7-4790K's iGPU to render for me in Adobe software (which is OpenCL and OpenGL acceleration capable).

 

So clearly, Intel's HD series is supported on paper, but not actually given ANY function...

 

Third of all:

Under DX12 and Vulkan, you can use Kaveri to execute out-of-order compute tasks while doing iGPU work and CPU work.

CAN INTEL HD DO THIS? NO!

CAN IT THEN BE USED AS A PROPER COMPUTE CORE CPU? NO!

Intel HD 5300 and up can do out-of-order compute. With Broadwell and up you have OpenCL 2.0; Haswell has OpenCL 1.2.

https://software.intel.com/en-us/file/compute-architecture-of-intel-processor-graphics-gen8pdf



do you even listen?

Yes, you can use GPGPU for non-gaming tasks... but for tasks SUCH AS rendering via the iGPU, you want to use DX, CUDA or OpenCL... From what I know, I CANNOT assign my Core i7-4790K's iGPU to render for me in Adobe software (which is OpenCL and OpenGL acceleration capable).

 

So clearly, Intel's HD series is supported on paper, but not actually given ANY function...

 

Third of all:

Under DX12 and Vulkan, you can use Kaveri to execute out-of-order compute tasks while doing iGPU work and CPU work.

CAN INTEL HD DO THIS? NO!

CAN IT THEN BE USED AS A PROPER COMPUTE CORE CPU? NO!

I am listening. The problem is that you are not listening. You can render on the iGPU of the i5-2500K. You can't use your 4790K to do it in Adobe's software because it does not support the particular APIs Adobe takes advantage of, but it is possible to do it.

 

But if you are so hung up on OpenCL, then let's change from the 2500K to the i5-4670K. That supports OpenCL. Should Intel call the 4670K a "24 compute core" chip? How about calling the 4770R a "44 compute core" chip?

 

Who are you to decide what is and isn't a "proper compute core"? Do you realize that you are now saying pretty much exactly the same thing as the guy suing AMD for advertising their chips as 8 cores because "they aren't proper cores"? Your double-standard is ridiculous.

 

 

And you still haven't answered my question. Do you think Intel should be allowed to advertise their processors as having "x compute cores"? For example, the 2500K as a 16 compute core chip and the 4670K as a 24 compute core chip?

If you say no because Intel doesn't have "real computing cores" then you are saying exactly the same thing as the person currently suing AMD.

 

 

And like I said before:

Could have been a mess-up on Newegg's part, but it does not change the fact that AMD is clearly using the term "compute core" as a tool to deceive people into thinking their APUs are more powerful. There is no other explanation for why they would invent a new term and start using that a lot in their marketing material.

Can you come up with any explanation why they would invent this new term that doesn't involve them trying to deceive customers? Because it is not like they say "quad core with 8 GPU cores". They say "12 compute cores".

 

And just so that we are clear, I would be just as annoyed if Intel tried to pull this bullshit. I was actually annoyed when Motorola did this last year with their "X8".


It doesn't use programmable shaders. The driver for it just converts everything to fixed-function GLSL.

The 12 execution units in the HD 3000 are actually programmable. Here is a quote from Intel's official documentation for the P3000 (which is the same GPU as in the 2500K, but with a special driver that "optimizes it for workstation applications"; the hardware is exactly the same):

Intel HD Graphics takes advantage of a generalized unified shader model including support for Shader Model 4.1. The platform also has support for DirectX* 11 on DirectX* 10 hardware. The graphics core executes vertex, geometry, and pixel shaders on the programmable array of Execution Units (EUs). The EUs have programmable SIMD (Single Instruction, Multiple Data) widths of 4 and 8 element vectors (two 4 element vectors paired) for geometry processing and 8 and 16 single data element vectors for pixel processing. Each EU is capable of executing multiple threads to cover latency. The new generation of Intel HD Graphics now integrates transcendental shader instructions into the EU units, rather than a shared math box found in prior generations, resulting in improved processing of instructions such as POW, COS, and SIN. Clipping and setup have moved to Fixed Function units, further increasing performance by reducing contention within the EUs. The end result is the fastest Intel HD Graphics to date.

Straight from Intel's mouth. The EUs (the same EUs as in the HD 3000) are programmable. And before someone says that the article is about P3000 and not HD 3000. They are the same chip but with different drivers and that particular quote is a general statement about that generation of GPUs. That's why Intel keeps just saying "Intel HD Graphics" and not "P3000".


Says who?

 

Says I.

 

Cores have always had symmetrical floating-point and integer math capability. If you make a CPU effectively a 4-core for floating-point operations and an 8-core for integer calculations, you have a 4-core. Rule of the weakest link.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


lawls

Intel can freely call them X-core CPUs, if they clarify.

However, what use do these cores have other than iGPU work or extremely specialized workloads?

 

My point is, with Kaveri you can use the iGPU to accelerate CONSUMER FEATURES OR APPLICATIONS, while for Intel that seems to be locked to professional SKUs, based upon your own evidence.

So when you market something for consumers, you want to make sure the consumers CAN USE IT. In the case of Kaveri, yes they can, if the application supports it.

In the case of non-workstation SKUs from Intel, according to you, they cannot, due to drivers... Now, Skylake and Broadwell should have support for OpenCL 2.0, so they should be able to do this on the consumer side too, and thus, since these are consumer SKUs, they have the full right to call them compute cores if they want.

 

One distinguishing detail, though, is the intention behind the products.

AMD wanted to market their Kaveri APUs as a CPU with a GCN GPU stuck on it. Which is what it is. They wanted to make sure people realize that the APU has the same capabilities as a small, cheap discrete GPU...

Intel has been rather humble with their own iGPU, and aside from Broadwell and other Iris Pro SKUs, they have mostly treated it as a "reserve solution" for desktop users and a "low power, but good enough" solution for laptop users who do not need or want a mobile Radeon or GeForce GPU.


However, what use do these cores have other than iGPU work or extremely specialized workloads?

Same use as the GPU in AMD's APUs. If the program supports it, some work can be offloaded to the iGPU. The difference is which APIs they support.

 

My point is, with Kaveri you can use the iGPU to accelerate CONSUMER FEATURES OR APPLICATIONS, while for Intel that seems to be locked to professional SKUs, based upon your own evidence.

So when you market something for consumers, you want to make sure the consumers CAN USE IT. In the case of Kaveri, yes they can, if the application supports it.

In the case of non-workstation SKUs from Intel, according to you, they cannot, due to drivers... Now, Skylake and Broadwell should have support for OpenCL 2.0, so they should be able to do this on the consumer side too, and thus, since these are consumer SKUs, they have the full right to call them compute cores if they want.

So, you didn't read what I said... It is not locked to professional SKUs.

The EUs (the same EUs as in the HD 3000) are programmable. And before someone says that the article is about P3000 and not HD 3000. They are the same chip but with different drivers and that particular quote is a general statement about that generation of GPUs. That's why Intel keeps just saying "Intel HD Graphics" and not "P3000".

 

And consumers CAN use GPGPU on the i5-2500K. Again, QuickSync was one of the biggest features of Sandy Bridge and it is a GPGPU task. There is a lot of support for it (LAV Filters, OBS, RivaTuner/MSI Afterburner, Handbrake, etc.) as well. When you talk about Kaveri you keep saying "if the application supports it", and the exact same thing is true for the HD 3000 used in the 2500K.
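Both sides keep coming back to "if the application supports it": an application can only offload work to the iGPU when the chip exposes an API the code was written against, and otherwise it takes a CPU path. A minimal sketch of that capability check (the chip names are from this thread, but the capability table and function names are made up for illustration; a real application would query the driver/runtime):

```python
# Hypothetical capability table; a real app would query the OpenCL/Media
# runtime instead of hard-coding this. API lists are illustrative only.
IGPU_APIS = {
    "Intel HD 3000 (i5-2500K)": {"QuickSync", "DirectX 10.1"},
    "Intel HD 4600 (i7-4790K)": {"QuickSync", "OpenCL 1.2"},
    "AMD Kaveri (A10-7850K)": {"OpenCL 2.0", "Mantle", "DirectX 12"},
}

def can_offload(chip: str, required_api: str) -> bool:
    """True if the iGPU exposes the API this application was written against."""
    return required_api in IGPU_APIS.get(chip, set())

def choose_backend(chip: str, required_api: str) -> str:
    """Offload to the iGPU when possible, otherwise fall back to the CPU."""
    return "iGPU" if can_offload(chip, required_api) else "CPU fallback"
```

So in this toy table a QuickSync transcode path runs on the HD 3000's iGPU, while a renderer written only against OpenCL 2.0 falls back to the CPU on that same chip.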


Same use as the GPU in AMD's APUs. If the program supports it, some work can be offloaded to the iGPU. The difference is which APIs they support.

 

So, you didn't read what I said... It is not locked to professional SKUs.

 

And consumers CAN use GPGPU on the i5-2500K. Again, QuickSync was one of the biggest features of Sandy Bridge and it is a GPGPU task. There is a lot of support for it (LAV Filters, OBS, RivaTuner/MSI Afterburner, Handbrake, etc.) as well. When you talk about Kaveri you keep saying "if the application supports it", and the exact same thing is true for the HD 3000 used in the 2500K.

@patrickjp93

can you please explain to this person why GPGPU is near worthless to the normal man-in-the-street consumer?

cuz my patience is wearing a bit thin here


I bought an 8150 when it first came out; I wonder if I can cash in on this lol.

If you're a piece of fucking shit, you can.


Though I doubt this guy has a prayer in court, it still wastes AMD's time and money, which, given their finances, cuts deeper than for other large tech companies.

https://linustechtips.com/main/topic/631048-psu-tier-list-updated/ Tier Breakdown (My understanding)--1 Godly, 2 Great, 3 Good, 4 Average, 5 Meh, 6 Bad, 7 Awful

 



This lawsuit has no standing. There isn't a concrete definition of what a core really is on x86; you could argue any chip has only 1 core, or 20. It depends on who is looking at it and what the company they stand for thinks a core is.


AMD stated that the cores aren't like normal cores from the very beginning; this stinks of consumer entitlement chasing easy money to me. This is coming from an Intel user sitting on Sandy Bridge, because AMD hasn't really put out anything to justify switching platforms.

 

>Buys AMD to support them

>Wants AMD to best Intel

>Files lawsuit which would take away R&D funding for better chips

Well you can't fix stupid.


This lawsuit has no standing. There isn't a concrete definition of what a core really is on x86; you could argue any chip has only 1 core, or 20. It depends on who is looking at it and what the company they stand for thinks a core is.

That may or may not be true. I don't want AMD to be in a worse financial position, but you cannot deny their intentional misleading of the public has been a serious issue for consumers and really shouldn't have been allowed these last 5 years.

I understand that enforcement may not be possible, but that doesn't mean that ethically it shouldn't be done.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


That may or may not be true. I don't want AMD to be in a worse financial position, but you cannot deny their intentional misleading of the public has been a serious issue for consumers and really shouldn't have been allowed these last 5 years.

I understand that enforcement may not be possible, but that doesn't mean that ethically it shouldn't be done.

I would argue AMD marketed the chips correctly; they were indeed 8-core chips. Intel's standard makes no difference to what AMD sells.


@patrickjp93

can you please explain to this person why GPGPU is near worthless to the normal man-in-the-street consumer?

cuz my patience is wearing a bit thin here

Something tells me you don't know what GPGPU is. It stands for "general-purpose computing on graphics processing units". It is incredibly useful.

OpenCL is GPGPU. Are you saying OpenCL is "near worthless"? Just a few posts ago you tried to argue that Intel did not have a "proper compute core CPU" because they didn't support it.

 

Your patience is wearing thin because you don't understand what you are talking about and you are fighting a losing battle.

 

You are also still avoiding answering my question. Can you think of any reason why AMD would market their chips as having "12 compute cores" instead of continuing to market them the same way they did before, as a quad core and a GPU? The only reason I can think of is to deceive consumers into thinking they are buying a CPU with 12 CPU cores.


Something tells me you don't know what GPGPU is. It stands for "general-purpose computing on graphics processing units". It is incredibly useful.

OpenCL is GPGPU. Are you saying OpenCL is "near worthless"? Just a few posts ago you tried to argue that Intel did not have a "proper compute core CPU" because they didn't support it.

 

Your patience is wearing thin because you don't understand what you are talking about and you are fighting a losing battle.

 

You are also still avoiding answering my question. Can you think of any reason why AMD would market their chips as having "12 compute cores" instead of continuing to market them the same way they did before, as a quad core and a GPU? The only reason I can think of is to deceive consumers into thinking they are buying a CPU with 12 CPU cores.

 

 

Also, OpenCL is an API; it uses the basis of GPGPU, but it is NOT GPGPU itself. Pure GPGPU programming, without traditional APIs, is fully possible, but it is not going to benefit consumers. So if the iGPU doesn't support OpenCL 2.0 (at least), you cannot do out-of-order execution, and thus effective compute, without a lot of workarounds. The other option is CUDA, but AMD and Intel don't use CUDA, so that is a moot point.

 

My patience is wearing thin because you are just repeating yourself even though you have your answer.

 

Now, to answer your reformulated question: their old APUs based upon VLIW4 weren't able to do compute properly. Honestly, they were really shit at it.

Kaveri has the ability to actually perform "real CPU" tasks on each of its GPU cores, whereas VLIW would be restricted to one CPU task that it split across its cores internally.

 

 

Kaveri and Counting Cores

With the move towards highly integrated SoCs we've seen a variety of approaches to core counts. Apple, Intel and Qualcomm still count CPU cores when advertising an SoC. For Apple and Qualcomm that's partially because neither company is particularly fond of disclosing the configuration of their GPUs. More recently, NVIDIA took the somewhat insane stance of counting GPU CUDA cores on its Tegra K1 SoC. Motorola on the other hand opted for the bizarre choice of aggregating CPU, GPU and off-die companion processors with the X8 platform in its Moto X smartphone. Eventually we will have to find a way to characterize these highly integrated SoCs, particularly when the majority of applications actually depend on/leverage both CPU and GPU cores.

AMD finds itself in a unique position with Kaveri where it has a truly unified CPU/GPU architecture and needs to establish a new nomenclature for use in the future. With 47% of the Kaveri die dedicated for GPU use, and an architecture that treats both CPU and GPU as equals, I can understand AMD's desire to talk about the number of total cores on the APU.

AMD settled on the term "Compute Core", which can refer to either an x86 (or maybe eventually ARM) CPU core or a GCN compute unit. The breakdown is as follows:

  • Each thread on a CPU is a Compute Core
  • Each Compute Unit on the IGP is a Compute Core
  • Total Compute Cores = CPU Compute Cores + IGP Compute Cores

This means that the high end SKU, the A10-7850K will have a total of 12 compute cores: four from the CPU (two Steamroller modules supporting four threads) and eight from the IGP (due to eight compute units from the R7 graphics).

 

There are some qualifications to be made on this front. Technically, AMD is correct – each compute unit in the IGP and each thread on the CPU can run separate code. The Hawaii GCN architecture can spawn as many kernels as compute units, whereas a couple of generations ago we were restricted to one compute kernel on the GPU at once (merely with blocks of work being split across the CUs). However, clearly these 12 compute units are not equivalent: a programmer will still have to write code for the CPU and GPU specifically in order to use all the processing power available.

Whenever AMD (or partners) are to promote the new APUs, AMD tells us clearly that two sets of numbers should be quoted in reference to the Compute Cores – the total, and the breakdown of CPU/GPU on the APU. Thus this would mean that the A10-7850K APU would be marketed as a “12 Compute Core” device, with “(4 CPU + 8 GPU)” following immediately after. I applaud AMD's decision to not obfuscate the internal configuration of its APUs. This approach seems to be the most sensible if it wants to tout the total processing power of the APU as well as tell those users who understand a bit more what the actual configuration of the SoC is. The biggest issue is how to address the users who automatically assume that more cores == better. The root of this problem is very similar to the old PR-rating debates of the Athlon XP. Explaining to end users the intricacies of CPU/GPU programming is really no different than explaining why IPC * frequency matters more than absolute frequency.

 

 

That is what AMD says on the matter. AMD says it should be marketed as 12 compute cores, i.e. 4 CPU + 8 GPU.

If you read anywhere else that it doesn't say "12 compute cores = 4 CPU + 8 GPU", then that website is presenting AMD's product specifications wrong.
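The counting rule AMD lays out in the quoted article reduces to simple arithmetic. A sketch (the function name is mine; the numbers are the A10-7850K's from the quote):

```python
def compute_cores(cpu_threads: int, gpu_compute_units: int) -> str:
    """AMD's rule from the quoted article: each CPU thread and each GCN
    compute unit counts as one 'Compute Core'; marketing is supposed to
    quote the total followed by the CPU/GPU breakdown."""
    total = cpu_threads + gpu_compute_units
    return f"{total} Compute Cores ({cpu_threads} CPU + {gpu_compute_units} GPU)"

# A10-7850K: two Steamroller modules (4 threads) + 8 GCN compute units.
print(compute_cores(4, 8))  # 12 Compute Cores (4 CPU + 8 GPU)
```

The whole dispute in this thread is about listings that print only the total and drop the breakdown in the parentheses.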


So? It's still software... it shows what it was programmed to show. They probably count FPUs as indicators of a core. A core is not defined by what a piece of software says it is; if anything, the piece of software should adapt accordingly. Not that it matters in any way, since what it's called doesn't affect performance.

 

 

I've seen a couple of threads of people asking why their computers contain 8 i7s after seeing the performance tab in Task Manager. People clearly don't know what they buy regardless. Do you expect them to know the difference between a core, a module, a thread and an FPU? Do you think they'd have known any better if it had been advertised as a "quad module" CPU? What they can understand is a performance chart, and those haven't been manipulated in any way. But the thing is, these people don't even bother looking for those, even if a 5-minute Google search would have provided them with all the information they needed to make an informed decision. The reality of the matter is they bought the first random prebuilt that said "gaming" on it or had a shiny red case and fit their arbitrarily set budget. It could have contained fairy dust and a bowl of hopes and dreams for all they knew. So bear with me when I say that not only do they have no basis for a class action lawsuit, they pretty much got what was coming to them.

So you're saying that software should be changed to show Bulldozer in AMD's favour? Again: ALUs are not capable of performing all of a CPU's functions on their own, therefore they are not cores.

 

Something tells me you don't know what GPGPU is. It stands for "general-purpose computing on graphics processing units". It is incredibly useful.

OpenCL is GPGPU. Are you saying OpenCL is "near worthless"? Just a few posts ago you tried to argue that Intel did not have a "proper compute core CPU" because they didn't support it.

 

Your patience is wearing thin because you don't understand what you are talking about and you are fighting a losing battle.

 

You are also still avoiding answering my question. Can you think of any reason why AMD would market their chips as having "12 compute cores" instead of continuing to market them the same way they did before, as a quad core and a GPU? The only reason I can think of is to deceive consumers into thinking they are buying a CPU with 12 CPU cores.

Just to be clear, the HD 4600 in my i5 supports OpenCL. It's CUDA that isn't supported (or implemented yet; Intel does have a license to use it).

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


So you're saying that software should be changed to show Bulldozer in AMD's favour? Again: ALUs are not capable of performing all of a CPU's functions on their own, therefore they are not cores.

 

No, I'm saying that the hardware is the same regardless of what software calls it. Also, a core is not necessarily capable of performing all CPU operations by itself. Take one of the ALUs out and the CPU will still work; it would just effectively turn the module into a standard core. A module is effectively 90% of two separate cores in the Phenom sense - enough to consider it as two cores if you don't have time or space to explain what a module is and how it works.
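The module-vs-core trade-off being argued here can be made concrete with a deliberately crude throughput model. This is purely illustrative (it assumes one op per cycle per unit and ignores real scheduling, caches and the shared front end): integer work scales across a module's two clusters, while floating-point work serializes on the one shared FPU.

```python
def module_cycles(int_ops: int, fp_ops: int) -> int:
    """Toy cost model of one Bulldozer-style module (not cycle-accurate)."""
    # Two independent integer clusters: integer ops retire two per cycle.
    int_cycles = (int_ops + 1) // 2
    # One shared FPU: floating-point ops from both threads queue on it.
    fp_cycles = fp_ops
    return int_cycles + fp_cycles

# Pure integer work behaves like two cores (half the cycles)...
assert module_cycles(int_ops=8, fp_ops=0) == 4
# ...but pure FP work gains nothing over a single core.
assert module_cycles(int_ops=0, fp_ops=8) == 8
```

That asymmetry is the entire lawsuit in miniature: whether you call the module "two cores" depends on which column of this model you think matters.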

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


No, I'm saying that the hardware is the same regardless of what software calls it. Also, a core is not necessarily capable of performing all CPU operations by itself. Take one of the ALUs out and the CPU will still work; it would just effectively turn the module into a standard core. A module is effectively 90% of two separate cores in the Phenom sense - enough to consider it as two cores if you don't have time or space to explain what a module is and how it works.

OK, in that case the ALUs are irrelevant and Bulldozer only has 4 cores, with each module being the core. AMD still lied about the core count. I can see where you're coming from with the ALU, since it was only with the 486 that the FPU went from being a co-processor to being integrated on the chip.



their old APUs based upon VLIW4 weren't able to do compute.

Yes they were.

 

First of all, "compute" just means calculating; all GPUs can do that. What you mean when you say "compute" is GPGPU, and yes, even the APUs that used VLIW4 could do that. Llano APUs had support for OpenCL (which, like I said before, is GPGPU). Hardware acceleration was supported in Photoshop back in CS6.

