So apparently, the Intel/Radeon marriage is a thing that's happening

jasonc_01

sauce: https://newsroom.intel.com/editorials/new-intel-core-processor-combine-high-performance-cpu-discrete-graphics-sleek-thin-devices/

 

Quote

The new product, which will be part of our 8th Gen Intel Core family, brings together our high-performing Intel Core H-series processor, second generation High Bandwidth Memory (HBM2) and a custom-to-Intel third-party discrete graphics chip from AMD’s Radeon Technologies Group* – all in a single processor package.

It’s a prime example of hardware and software innovations intersecting to create something amazing that fills a unique market gap. Helping to deliver on our vision for this new class of product, we worked with the team at AMD’s Radeon Technologies Group. In close collaboration, we designed a new semi-custom graphics chip, which means this is also a great example of how we can compete and work together, ultimately delivering innovation that is good for consumers.

So this was the semi-custom design AMD talked about a while back. But it has happened. Here's Intel's video:

 

 

So yeah. We now have an Intel CPU with an AMD RTG GPU in the 35-55W TDP range.

 

Here's how they did that:

 

Intel-8th-Gen-CPU-discrete-graphics.jpg

 

That's a single HBM2 stack, so we're likely limited to 4GB of video RAM. Details of the AMD GPU are unknown at this point; it's likely a Vega-based design, but specifics such as SP count and clock speeds aren't known yet.
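As a back-of-the-envelope sketch of why one stack likely means 4GB (assuming standard JESD235-style HBM2 figures: a 1024-bit bus per stack, 8 Gb dies stacked 4- or 8-high, up to 2.0 Gb/s per pin; the actual configuration of this part is unknown):

```python
# Rough HBM2 stack math. Assumptions (generic JESD235 HBM2 figures,
# NOT confirmed specs for this Intel/AMD part): 1024-bit bus per stack,
# 8 Gb (= 1 GB) dies, 2.0 Gb/s per data pin.
BUS_WIDTH_BITS = 1024
DIE_CAPACITY_GB = 1.0   # one 8 Gb die = 1 GB
PIN_RATE_GBPS = 2.0

def stack_capacity_gb(num_dies: int) -> float:
    """Capacity of one stack; dies are typically stacked 4-high or 8-high."""
    return num_dies * DIE_CAPACITY_GB

def stack_bandwidth_gbs(pin_rate_gbps: float = PIN_RATE_GBPS) -> float:
    """Peak bandwidth of one stack in GB/s (bus width x pin rate / 8 bits)."""
    return BUS_WIDTH_BITS * pin_rate_gbps / 8

print(stack_capacity_gb(4))   # 4.0 -> the likely 4GB limit of a single stack
print(stack_bandwidth_gbs())  # 256.0 GB/s peak for one stack at 2.0 Gb/s/pin
```

A second stack would double both figures, which is why single-stack designs like this one top out where they do.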

2 minutes ago, MageTank said:

 the topic at hand: Didn't we see a fabricated story recently where someone photoshopped "Vega inside" on a CFL box? 

I think the story you're referring to wasn't a shop but actually a poster for employee of the month (or something to that effect) and the employee was coincidentally named Vega.

At least that's what I recall.


Just now, Trixanity said:

I think the story you're referring to wasn't a shop but actually a poster for employee of the month (or something to that effect) and the employee was coincidentally named Vega.

At least that's what I recall.

You are right, just googled it and found this article:

https://www.pcgamesn.com/intel-amd-vega-mobile-cpu

intellifts.jpg

 

Intel needs to seriously work on their elevator art. I'd be mighty creeped out if I opened that "Paul Inside" door, only for Paul to actually be standing inside. 

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


I don't really care what happens so long as it's bad for Nvidia; they've been taking the piss recently with their xx80 Ti pricing.


Quote

A spokesman for AMD told the paper the chip it is working on with Intel won’t compete with its own Ryzen Mobile semiconductor that combines processing and graphics. The spokesman said the Intel chip is targeting gamers while the chip AMD is launching by the end of the year isn’t specialized to support that market specifically. “We’re playing in a complementary market,” the spokesman told the Wall Street Journal.

I don't see how chips with better iGPUs are not targeted at gamers... you're not going to use them in a render farm, and the average non-gamer hasn't needed a better iGPU in years.

 

I don't necessarily trust the WSJ much, especially since they aren't quoting any official statement, but they say we should get something official later today, so I guess we'll see.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


Just now, Cookybiscuit said:

I don't really care what happens so long as it's bad for Nvidia; they've been taking the piss recently with their xx80 Ti pricing.

What's wrong with the X80 Ti pricing? From what I've seen in the States, the 1080 Ti is priced in line with every X80 Ti card since Kepler, which launched at $700; MSRP on the 1080 Ti is also $700. The 980 Ti was $650 MSRP, but that was done to undercut the Fury X's launch, so it was a special case.

Unless you are referring to mining causing prices to skyrocket, in which case that's not really Nvidia's fault. What people do with their GPUs after they're brought to market is out of Nvidia's control.

I will say, it's fairly easy for us in the States to find reference 1080 Tis for $649, $50 less than MSRP, so it could also be a regional thing. I got a solid discount on my hybrid 1080 Ti: it normally retails for $850, but I got it for $770, paying only 10% more than MSRP for a decent AIO strapped to it.
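Just to show the arithmetic behind that 10% figure, using the prices quoted above ($700 MSRP, $770 paid, $850 usual retail for the hybrid card):

```python
# Price premium/discount math for the figures quoted above.
msrp, paid, retail = 700, 770, 850

premium_over_msrp = (paid - msrp) / msrp        # 70 / 700 = 0.10
discount_off_retail = (retail - paid) / retail  # 80 / 850 ~= 0.094

print(f"{premium_over_msrp:.0%} over MSRP")     # 10% over MSRP
print(f"{discount_off_retail:.1%} off retail")  # 9.4% off retail
```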


Intel just tweeted this out. Looks like it's official.

 

They are integrating their Coffee Lake laptop CPUs with a semi-custom AMD GPU and HBM2 memory in a single package. The press release doesn't say whether the GPU is Vega-based, only that it uses HBM2.

 

According to the press release, we should be hearing more during Q1 2018.

Quote

At the heart of this new design is EMIB, a small intelligent bridge that allows heterogeneous silicon to quickly pass information in extremely close proximity. EMIB eliminates height impact as well as manufacturing and design complexities, enabling faster, more powerful and more efficient products in smaller sizes. This is the first consumer product that takes advantage of EMIB.

 

The GPU design is a custom one made in collaboration with Intel, and as the quote above mentions, the two dies are connected through a custom interface named EMIB (Embedded Multi-Die Interconnect Bridge).

 

They are touting it as a high performance solution, targeted at both gamers and productivity professionals alike. 

 

Quote

Intel-8th-Gen-CPU-discrete-graphics-2.jpg

 

https://newsroom.intel.com/editorials/new-intel-core-processor-combine-high-performance-cpu-discrete-graphics-sleek-thin-devices/


2 minutes ago, MageTank said:

What's wrong with the X80 Ti pricing? From what I've seen in the States, the 1080 Ti is priced in line with every X80 Ti card since Kepler, which launched at $700; MSRP on the 1080 Ti is also $700. The 980 Ti was $650 MSRP, but that was done to undercut the Fury X's launch, so it was a special case.

Unless you are referring to mining causing prices to skyrocket, in which case that's not really Nvidia's fault. What people do with their GPUs after they're brought to market is out of Nvidia's control.

I will say, it's fairly easy for us in the States to find reference 1080 Tis for $649, $50 less than MSRP, so it could also be a regional thing. I got a solid discount on my hybrid 1080 Ti: it normally retails for $850, but I got it for $770, paying only 10% more than MSRP for a decent AIO strapped to it.

Yeah, actually you're right, I must be losing it; looks like they've been $699 or thereabouts since the 780 Ti. I just remember there being a big price difference between my 780, 980 Ti and 1080 Ti; guess it's just the shit USD/GBP exchange rate recently.

Link to comment
Share on other sites

Link to post
Share on other sites

Just now, Cookybiscuit said:

Yeah, actually you're right, I must be losing it; looks like they've been $699 or thereabouts since the 780 Ti. I just remember there being a big price difference between my 780, 980 Ti and 1080 Ti; guess it's just the shit USD/GBP exchange rate recently.

Hopefully AMD can come back at the high end on the GPU side, so that the X80 Ti cards have enough competition to force prices down like they did during the Maxwell days. Either that, or wait for Nvidia to step on their own toes once again, like they did with the 1070 Ti stepping on both the 1070's and 1080's toes. Hopefully Nvidia doesn't take my joke seriously and deliver a 1085; we don't need another Fermi-esque product stack.


4 minutes ago, anthonyjc2010 said:

I still don't understand why this is happening. The reason to choose a Ryzen APU is the superior iGPU; if you take that away, AMD's Ultrabook market is dead. I'm probably missing the big picture here, but I just don't get it.

Supposedly this is a more expensive solution, although AMD should have just done this themselves.


3 minutes ago, anthonyjc2010 said:

Yeah... I still think this is going to take away a large number of potential customers for AMD Ryzen APUs.

My guess is that they are hungry for market share, and this is a quick way to get it (on the GPU side).


Honestly speaking, I do wish Ice Lake had the i5 9600K at 6c/12t with an iGPU and the i7 9700K at 8c/16t without one to fit in the extra cores. I've never understood why Intel puts an iGPU on the unlocked, highest-end mainstream i7 when 99% of people buying it will pair it with a graphics card, but oh well...

Personal Desktop:

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

Soooo, I accidentally broke the thread by trying to merge a repost into it, because it contained another source. Should be good now, sorry guys!

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


1 minute ago, Princess Cadence said:

Honestly speaking, I do wish Ice Lake had the i5 9600K at 6c/12t with an iGPU and the i7 9700K at 8c/16t without one to fit in the extra cores. I've never understood why Intel puts an iGPU on the unlocked, highest-end mainstream i7 when 99% of people buying it will pair it with a graphics card, but oh well...

It's mostly because they use the same die down the chain for the locked i7s/i5s.


1 minute ago, Morgan MLGman said:

Soooo, I accidentally broke the thread by trying to merge a repost into it, because it contained another source. Should be good now, sorry guys!

Yeah, I noticed. Thought someone accidentally deleted it.

 

Anyhow, this has been known for months.

The ability to google properly is a skill of its own. 


Just now, cj09beira said:

It's mostly because they use the same die down the chain for the locked i7s/i5s.

I know, there's always a reason. Truth be told, though, it's disappointing to pay the i7 price tag like I did on my 6700, knowing that the iGPU, roughly a third of the CPU die, will never be used... Intel releases processors that make no sense all the time, like Kaby Lake-X, but can't release something to address that gap in the market.


Just now, Princess Cadence said:

I know, there's always a reason. Truth be told, though, it's disappointing to pay the i7 price tag like I did on my 6700, knowing that the iGPU, roughly a third of the CPU die, will never be used... Intel releases processors that make no sense all the time, like Kaby Lake-X, but can't release something to address that gap in the market.

I think they're going to change that, maybe, since they've talked about splitting the CPU into multiple silicon dies to make it cheaper, and having the GPU on a separate piece of silicon would help. But it's Intel we're talking about here; they wouldn't give us lower prices because of that.



A bit unrelated, but do we know if anyone else (particularly AMD) has EMIB in the pipeline? It seems far superior to an interposer.


Can't help but think AMD may have shot themselves in the foot here.

Sounds like it's their GPU technology from their APUs, but paired with Intel's CPU power... I can already guess which will be the better performer when it comes to gaming, and maybe even for other uses depending on the core counts.

PC - CPU Ryzen 5 1600 - GPU Power Color Radeon 5700XT- Motherboard Gigabyte GA-AB350 Gaming - RAM 16GB Corsair Vengeance RGB - Storage 525GB Crucial MX300 SSD + 120GB Kingston SSD   PSU Corsair CX750M - Cooling Stock - Case White NZXT S340

 

Peripherals - Mouse Logitech G502 Wireless - Keyboard Logitech G915 TKL  Headset Razer Kraken Pro V2's - Displays 2x Acer 24" GF246(1080p, 75hz, Freesync) Steering Wheel & Pedals Logitech G29 & Shifter

 

         


Nice. I'm kind of wondering why they haven't done the same with some Ryzen chips; what's stopping them?


Just now, RKRiley said:

Can't help but think AMD may have shot themselves in the foot here.

Sounds like it's their GPU technology from their APUs, but paired with Intel's CPU power... I can already guess which will be the better performer when it comes to gaming.

From what we've seen so far, mobile Ryzen has no reason to fear comparison with Intel's offerings.


Didn't Intel and/or AMD go out and say this wasn't happening? Like once or twice already? And now it's on again? wut...

I spent $2500 on building my PC and all i do with it is play no games atm & watch anime at 1080p(finally) watch YT and write essays...  nothing, it just sits there collecting dust...

Builds:

The Toaster Project! Northern Bee!

 

The original LAN PC build log! (Old, dead and replaced by The Toaster Project & 5.0)

Spoiler

"Here is some advice that might have gotten lost somewhere along the way in your life. 

 

#1. Treat others as you would like to be treated.

#2. It's best to keep your mouth shut; and appear to be stupid, rather than open it and remove all doubt.

#3. There is nothing "wrong" with being wrong. Learning from a mistake can be more valuable than not making one in the first place.

 

Follow these simple rules in life, and I promise you, things magically get easier. " - MageTank 31-10-2016

 

 


Just now, RKRiley said:

Can't help but think AMD may have shot themselves in the foot here.

Sounds like it's their GPU technology from their APUs, but paired with Intel's CPU power... I can already guess which will be the better performer when it comes to gaming.

AMD's Raven Ridge is competing in the ULV space, not among higher-power laptop chips. AMD doesn't have anything in that segment yet, so getting a GPU in there alongside a fast CPU will earn them some much-needed money.

Ye ole' train

