
New news on the Intel CPU and AMD GPU collaboration

5 hours ago, patrickjp93 said:

There is one reason it wouldn't be stupid, and I can't believe "I" am the one saying it... If Intel's graphics can become Radeon-compliant, and if eDRAM on desktop becomes a thing (or HBM/HMC), then hybrid XFire out of the box is not a low-value proposition. It would make AMD cards more powerful on both AMD and Intel platforms.

Theoretically yes, but practically... even AMD's hybrid CFX was shitty at best.


1 hour ago, zMeul said:

Theoretically yes, but practically... even AMD's hybrid CFX was shitty at best.

It was limited only to a couple low-end GPUs that were already underpowered and old. That's not enough to say the whole idea should be scrapped.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


29 minutes ago, patrickjp93 said:

It was limited only to a couple low-end GPUs that were already underpowered and old. That's not enough to say the whole idea should be scrapped.

Thing is, developers don't give a shit about SLI or CFX; it's up to the GPU manufacturers to sort it out, if the game's engine allows it.

It doesn't make sense for Intel to spend money on developing the hardware for AMD's CFX compatibility if it's not going to work all the time. There's one thing Intel hates just as much as losing money: bad publicity.

And then what if you pair it with an Nvidia video card?


5 minutes ago, zMeul said:

Thing is, developers don't give a shit about SLI or CFX; it's up to the GPU manufacturers to sort it out, if the game's engine allows it.

It doesn't make sense for Intel to spend money on developing the hardware for AMD's CFX compatibility if it's not going to work all the time. There's one thing Intel hates just as much as losing money: bad publicity.

And then what if you pair it with an Nvidia video card?

Intel hates bad publicity? What planet are you on? And Intel would not care if Nvidia fans got stung. What are they going to do, buy AMD everything?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


3 minutes ago, patrickjp93 said:

Intel hates bad publicity? What planet are you on?

I'm sorry, what?!

Or don't you recall #gamergate and Intel pulling their ad support -_-

Or the "oh look, our products are conflict-free" routine, where they got Linus to do a video promo on it.


1 hour ago, zMeul said:

I'm sorry, what?!

Or don't you recall #gamergate and Intel pulling their ad support -_-

Or the "oh look, our products are conflict-free" routine, where they got Linus to do a video promo on it.

Intel hired Sarkeesian and still has her on the payroll, so...

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


8 hours ago, patrickjp93 said:

75/25 market share in discrete graphics makes me think they'd be enthused either way.

Actually, it's 70/30 right now. Everything helps.

The ability to google properly is a skill of its own. 


On 04.02.2017 at 5:46 PM, Coaxialgamer said:

Intel sells a $1700 die that measures 240mm². AMD and Nvidia both sell dies around that size (GTX 1060 and RX 480) for around $200-250.

About that: CPUs are a LOT more complicated than a GPU.

A GPU is just 2000-ish dumb CPUs doing parallel math;

it's a lot easier to fill 240mm² with those than with a massive CPU.

I might be wrong though, I'm just speculating here.

RyzenAir : AMD R5 3600 | AsRock AB350M Pro4 | 32gb Aegis DDR4 3000 | GTX 1070 FE | Fractal Design Node 804
RyzenITX : Ryzen 7 1700 | GA-AB350N-Gaming WIFI | 16gb DDR4 2666 | GTX 1060 | Cougar QBX 

 

PSU Tier list

 


30 minutes ago, Bouzoo said:

Actually, it's 70/30 right now. Everything helps.

The last report we have is from Q3 2016, and Nvidia claims it made ground in Q4.

 

21 minutes ago, Space Reptile said:

About that: CPUs are a LOT more complicated than a GPU.

A GPU is just 2000-ish dumb CPUs doing parallel math;

it's a lot easier to fill 240mm² with those than with a massive CPU.

I might be wrong though, I'm just speculating here.

In terms of real production cost, the CPU is more expensive, but because of cache, not because of the logic. More of the die is used for cache and memory transactions than actual logic.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


2 hours ago, patrickjp93 said:

The last report we have is from Q3 2016, and Nvidia claims it made ground in Q4.

*claims

I've no reason not to believe it, but until we get official numbers, I'm going by the last official statistics we have. 

The ability to google properly is a skill of its own. 


Even if it's true, I completely disagree with such decisions.

If Intel wants to sell APUs, they need to make a separate line of APUs outside of the high-end CPUs.

It's mind-blowing that they still sell enthusiast i5/i7 chips with a fucking iGPU. Get fucking lost; instead of that extra useless silicon that I will never use, put 2 more cores and extra cache on the i5/i7 7600K/7700K FOR THE SAME PRICE, else fuck off Intel.

If you want a real APU, get one from AMD, an A10/A8 or whatever; it has great graphics.


22 minutes ago, deviant88 said:

Even if it's true, I completely disagree with such decisions.

If Intel wants to sell APUs, they need to make a separate line of APUs outside of the high-end CPUs.

It's mind-blowing that they still sell enthusiast i5/i7 chips with a fucking iGPU. Get fucking lost; instead of that extra useless silicon that I will never use, put 2 more cores and extra cache on the i5/i7 7600K/7700K FOR THE SAME PRICE, else fuck off Intel.

If you want a real APU, get one from AMD, an A10/A8 or whatever; it has great graphics.

No it isn't. I for one would love it if I could switch off all my dGPUs when I don't need them. Further, there's a lot of "CPU" performance being left on the table because the iGPU isn't being used, thanks to lazy devs. Even Sandy Bridge has another 4-5 years' worth of potential the moment games start making use of AVX.

 

It's not useless silicon. You (and a very lazy dev community) simply haven't found the use for it (or the will TO use it). It's fully programmable via OpenCL 2.1, OpenMP 4.5, and the most recent versions of OpenACC, and it's fully DX 12 AND 12.1 compliant at the highest feature levels.
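To give that claim some shape, here is a minimal OpenCL host-code sketch that looks for a GPU device; on a typical Intel box with no dGPU installed, the device it reports is the iGPU. (Assumes an OpenCL runtime is installed; the name and version strings depend entirely on the driver.)

```cpp
// Minimal sketch: find GPU devices via the OpenCL host API. On an Intel
// system without a dGPU, the device reported here is the iGPU.
// Build with something like: g++ find_igpu.cpp -lOpenCL
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main() {
    cl_uint count = 0;
    clGetPlatformIDs(0, nullptr, &count);               // how many platforms?
    std::vector<cl_platform_id> platforms(count);
    clGetPlatformIDs(count, platforms.data(), nullptr);

    for (cl_platform_id p : platforms) {
        cl_device_id dev;
        cl_uint found = 0;
        // Ask this platform for a GPU device; skip platforms that have none.
        if (clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, 1, &dev, &found) != CL_SUCCESS
            || found == 0)
            continue;

        char name[256] = {0}, version[256] = {0};
        clGetDeviceInfo(dev, CL_DEVICE_NAME, sizeof(name), name, nullptr);
        clGetDeviceInfo(dev, CL_DEVICE_VERSION, sizeof(version), version, nullptr);
        std::printf("GPU: %s (%s)\n", name, version);
    }
    return 0;
}
```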

 

Also, you say "never" until that one day your graphics card dies, or the video-out port dies, or the cable dies, or the monitor port dies, and you need to diagnose which of those is the issue.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I would love for AMD to focus purely on the GPU market and drop the CPU market if they don't prove their place in it with Zen.

That would result in an Intel monopoly, but at least they would focus on what they do best.

Connection200mbps / 12mbps 5Ghz wifi

My baby: CPU - i7-4790, MB - Z97-A, RAM - Corsair Veng. LP 16gb, GPU - MSI GTX 1060, PSU - CXM 600, Storage - Evo 840 120gb, MX100 256gb, WD Blue 1TB, Cooler - Hyper Evo 212, Case - Corsair Carbide 200R, Monitor - Benq  XL2430T 144Hz, Mouse - FinalMouse, Keyboard -K70 RGB, OS - Win 10, Audio - DT990 Pro, Phone - iPhone SE


Just now, patrickjp93 said:

No it isn't. I for one would love it if I could switch off all my dGPUs when I don't need them. Further, there's a lot of "CPU" performance being left on the table because the iGPU isn't being used, thanks to lazy devs. Even Sandy Bridge has another 4-5 years' worth of potential the moment games start making use of AVX.

It's not useless silicon. You (and a very lazy dev community) simply haven't found the use for it (or the will TO use it). It's fully programmable via OpenCL 2.1, OpenMP 4.5, and the most recent versions of OpenACC, and it's fully DX 12 AND 12.1 compliant at the highest feature levels.

Also, you say "never" until that one day your graphics card dies, or the video-out port dies, or the cable dies, or the monitor port dies, and you need to diagnose which of those is the issue.

Honestly, I don't care about programming for such a weak GPU. If you invest your time learning all of that, at least program for a true GPU like Pascal, Vega, or a pro-grade Quadro/Radeon, so that when you give it some "real work" it actually finishes this century.

I honestly don't care about the "OMG my GPU died and I can't fix it without an iGPU" scenario; old entry-level GT 210/410/710 cards sell for scrap change used. If you are an enthusiast/high-end PC gamer or a developer (especially a dev) and you don't have an old GPU lying around in your nerd stash, boy, you are doing something wrong.

Let's face it, most of us (especially ME) probably want a 6-core i5/i7 for the same price OR a 4-core i5/i7 for less money, without the iGPU.

I love AMD's way of segmenting the market: APUs are one thing, powerful CPUs are another.

Let's face it, stuff like HSA is dead; no real developers use iGPUs for processing. For deep learning, folding, and video processing they all have their own GPUs: Titan Xs, Quadros, ASICs. No real developer will ever use a nonsense iGPU.

Don't get triggered; if you like Intel, or Intel's way of doing things, then arguments are pointless.

The real truth is the same either way: if you have $200-400 for a K CPU, $100+ for a mobo, and $30-50 for a cooler, you'd be a fool to just run the iGPU and not spend at least $100 on a real dGPU; no arguments or extra Intel features will change that.

 


1 minute ago, deviant88 said:

Honestly, I don't care about programming for such a weak GPU. If you invest your time learning all of that, at least program for a true GPU like Pascal, Vega, or a pro-grade Quadro/Radeon, so that when you give it some "real work" it actually finishes this century.

I honestly don't care about the "OMG my GPU died and I can't fix it without an iGPU" scenario; old entry-level GT 210/410/710 cards sell for scrap change used. If you are an enthusiast/high-end PC gamer or a developer (especially a dev) and you don't have an old GPU lying around in your nerd stash, boy, you are doing something wrong.

Let's face it, most of us (especially ME) probably want a 6-core i5/i7 for the same price OR a 4-core i5/i7 for less money, without the iGPU.

I love AMD's way of segmenting the market: APUs are one thing, powerful CPUs are another.

Let's face it, stuff like HSA is dead; no real developers use iGPUs for processing. For deep learning, folding, and video processing they all have their own GPUs: Titan Xs, Quadros, ASICs. No real developer will ever use a nonsense iGPU.

Don't get triggered; if you like Intel, or Intel's way of doing things, then arguments are pointless.

The real truth is the same either way: if you have $200-400 for a K CPU, $100+ for a mobo, and $30-50 for a cooler, you'd be a fool to just run the iGPU and not spend at least $100 on a real dGPU; no arguments or extra Intel features will change that.

 

It's just as strong as an A8 from AMD. Hell, in a number of compute benchmarks Skylake GT2 goes toe to toe with an A10.

 

Compute power is compute power. Games should try to make use of everything in the system to its fullest extent, period.

 

A real developer maximizes the performance potential of any system. That is my standard as an HPC programmer, and given my age that standard should be lower than industry veterans'.

 

Sorry but still wrong. If I just want to practice parallel and heterogeneous computing but don't want to triple my electric bill by having a dGPU, an iGPU is a very good option.
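For a feel of what that kind of practice looks like, here is a minimal OpenMP 4.5 target-offload sketch (one of the programming models named above). Assumption: the compiler and runtime are built with offload support for the iGPU; without that, the region simply falls back to running on the host CPU.

```cpp
// Minimal sketch: OpenMP 4.5 target offload of a vector add.
// Whether this actually lands on the iGPU depends on the toolchain being
// built with offload support; otherwise the loop runs on the host CPU.
#include <cstdio>
#include <vector>

int main() {
    const int n = 1 << 20;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);
    float* pa = a.data();
    float* pb = b.data();
    float* pc = c.data();

    // Map inputs to the device, run the loop there in parallel,
    // and map the result back to the host.
    #pragma omp target teams distribute parallel for \
        map(to: pa[0:n], pb[0:n]) map(from: pc[0:n])
    for (int i = 0; i < n; ++i)
        pc[i] = pa[i] + pb[i];

    std::printf("c[0] = %.1f\n", c[0]);  // expect 3.0
    return 0;
}
```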

 

Also, really, how old are you? Twelve?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


8 hours ago, Space Reptile said:

About that: CPUs are a LOT more complicated than a GPU.

A GPU is just 2000-ish dumb CPUs doing parallel math;

it's a lot easier to fill 240mm² with those than with a massive CPU.

I might be wrong though, I'm just speculating here.

True, CPU cores have a higher R&D cost than single GPU cores, but the actual chip itself is usually quite a bit smaller. Plus, take into account that a huge chunk of space in a CPU is allocated to cache or an iGPU, driving down the overall complexity of the chip.

AMD Ryzen R7 1700 (3.8ghz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SDD + 7TB HDD, Phanteks enthoo pro case


6 hours ago, Coaxialgamer said:

True, CPU cores have a higher R&D cost than single GPU cores, but the actual chip itself is usually quite a bit smaller. Plus, take into account that a huge chunk of space in a CPU is allocated to cache or an iGPU, driving down the overall complexity of the chip.

The very biggest chips in the world are CPUs. Knights Landing is nearly 700mm². The E7-8890 v3 is 662mm². Back on 90nm the top Xeon was 710mm². The biggest GPU die ever, I believe, was big Maxwell (GM200) at just 601mm².

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


1 hour ago, patrickjp93 said:

The very biggest chips in the world are CPUs. Knights Landing is nearly 700mm². The E7-8890 v3 is 662mm². Back on 90nm the top Xeon was 710mm². The biggest GPU die ever, I believe, was big Maxwell (GM200) at just 601mm².

True, but those are hardly typical, are they? Those chips may hold the record for largest chip, but they also sold with HUGE margins. Consumer-grade CPUs are usually only 100-200mm², while GPU dies are traditionally much bigger for a given price segment.

AMD Ryzen R7 1700 (3.8ghz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SDD + 7TB HDD, Phanteks enthoo pro case


1 hour ago, Coaxialgamer said:

True, but those are hardly typical, are they? Those chips may hold the record for largest chip, but they also sold with HUGE margins. Consumer-grade CPUs are usually only 100-200mm², while GPU dies are traditionally much bigger for a given price segment.

Considering they vastly outnumber GPUs and other accelerators in datacenters across the world, yes, they're typical.

 

And mind you, consumer software just doesn't make use of advanced CPUs. Hell, games still really haven't discovered vectorization despite SSE being nearly two decades old now. We can get 10x CPU performance improvements from Sandy Bridge onward just by changing some industry-standard code into AVX intrinsics. You wouldn't even need multithreading at that point. One core on a 2600K using AVX can outperform the entire 6950X using scalar but multithreaded code. That should have the community up in arms against bad development-studio practices, and yet...
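To make the vectorization point concrete, here is a minimal sketch of the same loop written scalar and with AVX intrinsics (AVX only, no FMA, so it runs on Sandy Bridge; compile with -mavx). The 10x figure is the poster's claim; real speedups depend heavily on the workload and on memory bandwidth.

```cpp
// Minimal sketch: scalar vs. AVX element-wise add of two float arrays,
// 8 lanes per iteration in the AVX version.
#include <immintrin.h>

// Scalar baseline: one float per iteration.
void add_scalar(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}

// AVX version: 8 floats per iteration, with a scalar tail for n % 8.
void add_avx(const float* a, const float* b, float* out, int n) {
    int i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        _mm256_storeu_ps(out + i, _mm256_add_ps(va, vb));
    }
    for (; i < n; ++i)  // leftover elements
        out[i] = a[i] + b[i];
}
```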

 

If consumer software truly necessitated those core counts, Intel would provide them cheaply, but such software really doesn't exist outside of professional apps like the Adobe/Sony suites.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

