
So apparently, the Intel/Radeon marriage is a thing that's happening

jasonc_01

sauce: https://newsroom.intel.com/editorials/new-intel-core-processor-combine-high-performance-cpu-discrete-graphics-sleek-thin-devices/

 

Quote

The new product, which will be part of our 8th Gen Intel Core family, brings together our high-performing Intel Core H-series processor, second generation High Bandwidth Memory (HBM2) and a custom-to-Intel third-party discrete graphics chip from AMD’s Radeon Technologies Group* – all in a single processor package.

It’s a prime example of hardware and software innovations intersecting to create something amazing that fills a unique market gap. Helping to deliver on our vision for this new class of product, we worked with the team at AMD’s Radeon Technologies Group. In close collaboration, we designed a new semi-custom graphics chip, which means this is also a great example of how we can compete and work together, ultimately delivering innovation that is good for consumers.

So this was the semi-custom design AMD talked about a while back, and now it has actually happened. Here's Intel's video:

 

 

So yeah. We now have an Intel CPU with an AMD RTG GPU in the 35-55W TDP range.

 

Here's how they did that:

 

[Image: Intel's package diagram of the 8th Gen Core CPU with the discrete Radeon GPU and HBM2]

 

That's a single HBM2 stack, so we're likely limited to 4 GB of video RAM. The details of the AMD GPU are unknown at this point; it's likely Vega-based, but things such as SP count and clock speeds aren't known yet.
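For reference, here's the back-of-envelope on what a single HBM2 stack gives you. These are generic HBM2 spec numbers, not confirmed figures for this part:

```python
# Back-of-envelope figures for one HBM2 stack (generic spec-class numbers,
# not confirmed specs for this chip).

PIN_RATE_GBPS = 2.0     # HBM2 tops out at 2.0 Gb/s per pin
BUS_WIDTH_BITS = 1024   # one stack exposes a 1024-bit interface
DIES_PER_STACK = 4      # a common "4-Hi" stack of 8 Gb (1 GB) dies

bandwidth_gbs = BUS_WIDTH_BITS * PIN_RATE_GBPS / 8  # Gb/s -> GB/s
capacity_gb = DIES_PER_STACK * 1                    # 1 GB per die

print(f"Peak bandwidth: {bandwidth_gbs:.0f} GB/s")  # 256 GB/s
print(f"Capacity:       {capacity_gb} GB")          # 4 GB
```

An 8-Hi stack would double that to 8 GB, but 4-Hi was the norm at the time, hence the 4 GB guess.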

7 minutes ago, leadeater said:

Hope they actually talk about it and don't get derailed by non-tech news. Sure, very interesting stuff, but almost nothing got covered last week lol. I blame the technical issues more for that though.

A couple of weeks back they had my story about the Pixel 2 XL display issues in their list but never got to it :(

Would have been a good one too; I'm sure Linus would have had an opinion on it, considering he reviewed one.



13 minutes ago, Ryan_Vickers said:

A couple of weeks back they had my story about the Pixel 2 XL display issues in their list but never got to it :(

Would have been a good one too; I'm sure Linus would have had an opinion on it, considering he reviewed one.

Honestly, my thought on it is that it's quite overblown.

 

I actually do end up mentioning it in my upcoming LG V30 review. They use very similar displays.

 

As for the WAN Show this week, I actually may not be on it. I will be travelling in New York at the time.


12 minutes ago, LinusTech said:

As for the WAN Show this week, I actually may not be on it. I will be travelling in New York at the time.

Pre-record, aye? Go on, do it ;)

 

Joking aside, I would actually like to hear your comments on some of the bigger news items.


13 minutes ago, LinusTech said:

As for the WAN Show this week, I actually may not be on it. I will be travelling in New York at the time.

I call dibs on your spot.



But the really big question is: how will this affect the Steam surveys? And will AMD fanboys now attribute a portion of Intel's GPU percentage to AMD to soften the blow?



5 minutes ago, mr moose said:

But the really big question is: how will this affect the Steam surveys? And will AMD fanboys now attribute a portion of Intel's GPU percentage to AMD to soften the blow?

I wonder how that would work. Wouldn't the systems be running AMD GPU drivers, and so count toward AMD?
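For what it's worth, hardware surveys generally bucket GPUs by the PCI vendor ID each display adapter reports, and a machine like this would expose both an Intel and an AMD adapter. A purely illustrative sketch; Valve hasn't documented how the survey actually counts multi-GPU systems:

```python
# How a hardware survey might bucket GPUs: by the PCI vendor ID each
# display adapter reports. This package would presumably expose two
# adapters, the Intel iGPU and the semi-custom Radeon dGPU.

PCI_VENDORS = {
    0x1002: "AMD",
    0x10DE: "Nvidia",
    0x8086: "Intel",
}

def classify_gpu(vendor_id: int) -> str:
    """Map a PCI vendor ID to the vendor name a survey would report."""
    return PCI_VENDORS.get(vendor_id, "Other")

for vid in (0x8086, 0x1002):  # both adapters in the Intel+Radeon package
    print(f"{vid:#06x} -> {classify_gpu(vid)}")
```

Which bucket the survey actually reports probably depends on which adapter is driving the display when it runs, so leadeater's guess seems plausible.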


6 minutes ago, leadeater said:

I wonder how that would work. Wouldn't the systems be running AMD GPU drivers, and so count toward AMD?

Who knows? Sometimes I wonder if even Steam knows how their surveys work.



21 hours ago, Dabombinable said:

AMD doesn't need to join Intel though... unless there is some life left in Larrabee?

A lot of HD Graphics IP comes from Nvidia, if I'm not mistaken, making this a chance to diversify, make more money, and take a small chunk of revenue from Nvidia. Hopefully not at the cost of their own CPUs.


Well, that's news. Wouldn't Nvidia do the same thing, but with AMD CPUs? Think about it: there are Radeon GPUs (AMD) and AMD CPUs. Since Intel went with Radeon GPUs, would it not be logical for Nvidia to go with AMD CPUs?



17 hours ago, cj09beira said:

Yes and no; they're probably two completely different price brackets. I would like AMD to make their own version of this with the same GPU though, instead of relying on Intel.

I think Intel has better brand-name recognition than AMD, and this move may help AMD sell products to OEMs in the future through association (assuming the AMD components are successful here). I don't know though; this is pure speculation.



17 hours ago, tsk said:

Raven Ridge = 12-25 W

Intel H-series = 45 W + dGPU (probably 75 W+)

Apparently the dGPU is supposed to be 35-55W. I read this further back in the comment chain and don't feel like looking back that far, so sorry to the user I would usually be quoting now.
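Taking the figures quoted in this thread at face value (they're all rumors, and note the OP reads 35-55 W as the whole package rather than just the dGPU), the arithmetic looks like this:

```python
# Power-budget arithmetic using only the numbers quoted in this thread.
# All rumored figures, not official specs.

intel_h_cpu_w = 45           # Intel H-series CPU TDP
radeon_dgpu_w = (35, 55)     # rumored range for the semi-custom Radeon dGPU
raven_ridge_w = (12, 25)     # Raven Ridge APU range from tsk's post

lo = intel_h_cpu_w + radeon_dgpu_w[0]
hi = intel_h_cpu_w + radeon_dgpu_w[1]
print(f"Intel H + Radeon dGPU: {lo}-{hi} W")  # 80-100 W
print(f"Raven Ridge APU:       {raven_ridge_w[0]}-{raven_ridge_w[1]} W")
```

Either way it sits well above Raven Ridge's envelope, and if 35-55 W really is the whole package, the CPU and GPU would presumably be sharing a single thermal budget.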



16 hours ago, linustouchtips said:

It's my weird fantasy. As a matter of fact, they're getting a boost in the stock as we speak.

I'm confused: your profile pic is Lisa Su, the CEO of AMD, but you want AMD to sell RTG to Intel? What gives?



14 hours ago, Shahnewaz said:

-snip-

We (at SA)

-snip-

Seeking Alpha? Nice to see I'm not the only one from here who's also there.



8 hours ago, Ezilkannan said:

Uhm, actually their higher-end 1080 Ti is much better priced than the 1080.

Uhm, it would still be nice if it were cheaper though.


13 hours ago, commenter said:

- snip -

 

Thoughts?

Although I didn't think about the issue at the same level of depth you described, I did consider the general point you made with regard to Apple. The iMac lineup does use Vega graphics and Intel CPUs, and previous Apple products have used RTG parts as well, so I thought it'd make sense for a low-power combination of the two to show up in Apple's laptops, both for legal reasons and to reduce the optimization work needed in the Apple ecosystem, allowing for higher-performance products.



12 hours ago, Shahnewaz said:

- snip -

On that logic an Intel-Nvidia combo would be exactly what Apple wants, but that's not happening in this case.

- snip -

Explain the new iMac lineup with Vega, then. If you're correct, then I'm missing something I can't figure out.



11 hours ago, Taf the Ghost said:

Part of me thinks AMD did the deal to help that along, though AMD already needs HBM3 for Vega. (Clocks barely do much for the RX Vega cards; it's all about the HBM overclocking.)

Yeah, in my mind the feasibility of Navi really needs HBM3 to be ready for action, because some cards will end up with only one HBM stack, which needs to be as fast as possible.
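To put rough numbers on why the HBM clock dominates: peak bandwidth scales linearly with the pin rate, and a single-stack card only has half the bus width to play with. Illustrative figures based on public RX Vega specs:

```python
# Peak HBM bandwidth scales linearly with pin rate, so a single-stack
# (1024-bit) card lives or dies by its memory clock. Illustrative numbers.

def hbm_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits * pin_rate_gbps / 8

print(hbm_bandwidth_gbs(2048, 1.89))  # RX Vega 64, two stacks: ~484 GB/s
print(hbm_bandwidth_gbs(2048, 2.08))  # same card, ~10% HBM overclock: ~532 GB/s
print(hbm_bandwidth_gbs(1024, 2.00))  # single stack at HBM2's limit: 256 GB/s
```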

10 hours ago, Trik'Stari said:

I feel like AMD's graphics division isn't much of a threat to anyone. Ryzen was and is great, but Vega is just another disappointment. It's not top of the line, and it's basically unavailable because of HBM2 yields combined with miners buying up all the cards.

Vega is top of the line; the issue is that it's bottlenecked in other parts of the pipeline while rendering games. Its compute performance is actually really good, basically as fast as a P100 while being a smaller die. It seems AMD is focusing more on the markets where they think Vega will sell on its merits.

1 hour ago, Gravesnear said:

A lot of HD Graphics IP comes from Nvidia, if I'm not mistaken, making this a chance to diversify, make more money, and take a small chunk of revenue from Nvidia. Hopefully not at the cost of their own CPUs.

Not exactly. Intel is developing its own iGPUs; the deal with Nvidia (now with AMD) was just so that Intel doesn't get sued, as right now AMD and Nvidia hold so many patents that it's impossible to make a fast, efficient GPU without infringing on them.


Smart. People at my college have pegged AMD as "bad" when looking for laptops and just default to Intel + Nvidia.

 

I suppose if AMD can get people to buy laptops with an Intel + AMD setup and realize AMD is good now, it'll help rebuild their public image on the laptop side of things, and ultimately help sell their Ryzen laptops.


1 hour ago, ATFink said:

Seeking Alpha? Nice to see I'm not the only one from here who's also there.

SemiAccurate. ;)

1 hour ago, ATFink said:

Explain the new iMac lineup with Vega, then. If you're correct, then I'm missing something I can't figure out.

Same argument: Nvidia has better GPUs, yet Apple refuses to use them, while paying top dollar for the best-binned Intel CPUs and AMD GPUs, even specially binned ones.



On 11/6/2017 at 7:28 AM, Misanthrope said:

Wait, why? Now that AMD actually might have a very competitive product with the Ryzen + Vega APUs for laptops, why do this now that those are officially announced? If anything, it makes far less sense now.

 

This rumor needs to die already.

Take a look at this idiot who thought it couldn't possibly be real.

 

Quite something, isn't it?



Here's another article that makes some interesting points about why AMD is doing it. It argues that the reason AMD and Intel are teaming up isn't to remove AMD from the market but actually to bring AMD into competition with Nvidia (if you read between the lines), because they want to bring discrete-graphics-class power to the CPU package itself, much like what Nvidia has done with the MX150.

 

Also, the chip is meant to go into small 2-in-1s or thin-and-light ultrabooks, which doesn't take away from AMD's discrete cards or CPUs in larger, more purpose-built gaming laptops.

 

I think this is great because it could (depending on performance) put more pressure on Nvidia to put research and development into small-form-factor graphics rather than their Max-Q design, and actually make a purpose-built chip for the small 2-in-1s or ultrabooks, which is what more and more people are buying.

 

https://www.pcworld.com/article/3235934/components-processors/intel-and-amd-ship-a-core-chip-with-radeon-graphics.html


@cj09beira

 

AMD's GPU department seems to have had the opposite problem: they made the compute too strong and ended up without enough culling throughput, plus asset bottlenecks. Beyond just optimization issues, I think the CUs are too far ahead of the rest of the system. It's an interesting possibility, and it lines up with the issues both Intel and AMD have on the CPU side of things. (It also explains the big push in memory tech over the last few years: they can't keep the cores filled.)
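One way to put a number on "can't keep the cores filled" is the compute-to-bandwidth ratio. Using approximate public RX Vega 64 figures (the exact values matter less than the ratio):

```python
# Compute-to-bandwidth ratio for RX Vega 64, using approximate public specs.
# The higher this ratio, the harder it is to keep the CUs fed from memory.

peak_fp32_tflops = 12.7    # 4096 SPs * ~1.55 GHz * 2 ops per FMA
peak_bandwidth_gbs = 484   # 2048-bit HBM2 at ~1.89 Gb/s per pin

flops_per_byte = peak_fp32_tflops * 1e12 / (peak_bandwidth_gbs * 1e9)
print(f"~{flops_per_byte:.0f} FP32 ops per byte of DRAM traffic")  # ~26

# Any pass doing fewer ops per byte than that is bandwidth-bound, which
# lines up with RX Vega responding mostly to HBM overclocks.
```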


This makes me think AMD, or rather RTG, is running out of options now. I see Nvidia strengthening itself into a monopoly on desktop graphics. It's going to hurt our wallets in the future, but it can't be helped much. Guess we should all just get a console at this rate :|


1 hour ago, Shahnewaz said:

Same argument: Nvidia has better GPUs, yet Apple refuses to use them, while paying top dollar for the best-binned Intel CPUs and AMD GPUs, even specially binned ones.

Nvidia has higher-end GPUs that perform better under DX11 via their expensively maintained driver team. Those aren't just random caveats: Nvidia has an advantage in certain APIs and specific game engines under Windows, and that doesn't translate one-for-one to the Macintosh environment. The technical aspects of "why" are really important here. Apple has the most money of any corporation ever; they have specific reasons for sticking with their AMD alliance.

