
Massive leak regarding Intel Xe/X2 AKA Intel's new GPUs

Master Disaster

Coming from WCCF, but this time they have proof.

 

Recently Intel held an internal presentation of their new GPUs and someone decided to film the trailer and leak some slides.

 

So, the nomenclature: Xe. It turns out the e is actually the number of GPUs, so X2 has 2 GPUs, X4 has 4, and so on.

 

Quote

The mother of all leaks from Intel has just happened. Intel recently held a high profile “Xe Unleashed” event internally where the GPU leadership presented their finalized Xe methodology to Bob Swan and some other key people (I am told  certain reps from certain AIBs like ASUS were also present) and needless to say, one of them thought we should know about it as well. We were able to get our hands on some presentation slides and even footage of the actual teaser! Know the pesky little “e” in Intel Xe? Well, it represents the number of GPUs.

 

The Intel Xe philosophy holds that innovation needs to happen on 3 main fronts: process, microarchitecture and “e”. We are already familiar with the first two but ‘e’ is something that has not been successfully implemented so far. Sure there have been dual GPUs, but they all had to trade off some part of the functionality and never scaled linearly. Intel’s graphics team believes it has solved just that. With a brand new architectural approach (Xe) and a software layer (OneAPI) that can scale indiscriminately between any number of GPUs, it’s ready to remedy the years of neglect that ‘e’ has faced in the industry.

Here are some slides from the presentation Raja Koduri gave...

[Slide: intel-xe-page-002.jpg]

Quote

This slide is the cornerstone of the Xe philosophy and the big reveal about what e actually denotes. This also reveals the existence of the X4 class of GPUs by the way, which as you will see is just one step in Intel’s plan to dominate the GPU market.

 

They have designed OneAPI to act as an intermediary between the Direct3D layer and the GPU(s) (I am told they have a Linux solution in the works as well) and allow the user to scale between multiple GPUs seamlessly. Seamless is the keyword here, as a multi-GPU card that can perform cohesively as a single GPU has never been made. According to the presentation shown at the Xe Unleashed event, the hardware will register essentially as one large GPU. This will allow it to work with applications that have no multi-GPU support and retain almost all backwards compatibility.
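To make that concept a bit more concrete, here's a toy sketch in Python (purely illustrative; this is not Intel's actual OneAPI or any real driver code, and every name below is invented) of what "one logical GPU fanning frames out to multiple dies" would look like:

```python
# Toy illustration only -- not OneAPI, not real driver code; all names are made up.
class PhysicalGPU:
    """Stand-in for one die; just records which frames it rendered."""
    def __init__(self, name):
        self.name = name
        self.frames = []

    def render(self, frame_id):
        self.frames.append(frame_id)
        return f"{self.name} rendered frame {frame_id}"


class LogicalGPU:
    """What the game sees: a single device that round-robins frames across dies."""
    def __init__(self, dies):
        self.dies = dies
        self._next = 0

    def render(self, frame_id):
        die = self.dies[self._next]
        self._next = (self._next + 1) % len(self.dies)  # alternate-frame style dispatch
        return die.render(frame_id)


# The application only ever talks to one "GPU", so it needs no multi-GPU code path.
gpu = LogicalGPU([PhysicalGPU("die-0"), PhysicalGPU("die-1")])
for frame in range(4):
    print(gpu.render(frame))
```

The hard part in reality is of course shared memory, synchronisation and frame pacing between the dies, which is exactly where SLI/CrossFire style setups always struggled.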

 

Developers won’t need to worry about optimizing their code for multi-GPU; OneAPI will take care of all that. This will also allow the company to beat the foundry’s usual lithographic limit on die size, which is currently in the range of ~800mm². Why have one 800mm² die when you can have two 600mm² dies (the smaller the die, the higher the yield) or four 400mm² ones? Armed with OneAPI and the Xe macroarchitecture, Intel plans to ramp all the way up to Octa GPUs by 2024. From this roadmap, it seems like the first Xe class of GPUs will be X2.
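And for anyone wondering why smaller dies are supposed to yield better, here's a quick back-of-the-envelope sketch (my own illustrative numbers, not from the leak) using a simple Poisson defect model, where the share of good dies falls off exponentially with die area:

```python
import math

# Simple Poisson yield model: yield = exp(-D0 * A)
# D0 (defect density) of 0.1 defects/cm^2 is an assumed, illustrative value.
D0 = 0.1  # defects per cm^2 (assumed)

def die_yield(area_mm2):
    area_cm2 = area_mm2 / 100.0
    return math.exp(-D0 * area_cm2)

for area in (800, 600, 400):
    print(f"{area} mm^2 die -> ~{die_yield(area):.0%} good dies")
```

Even with a made-up defect density the trend is obvious: several smaller dies that each yield well can end up cheaper than one reticle-sized monolith, which is the whole pitch behind MCM designs.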

[Slide: intel-xe-unleashed-page-003.jpg]

Quote

The tentative timeline for the first X2 class of GPUs was also revealed: June 31st, 2020. This will be followed by the X4 class sometime in 2021. It looks like Intel plans to add two more GPUs every year, so we should have the X8 class by 2024. Assuming Intel has the scaling solution down pat, it should actually be very easy to scale these up. The only concern here would be the packaging yield, which Intel should be more than capable of handling, and binning should take care of any wastage issues quite easily. Neither NVIDIA nor AMD have yet gone down the MCM path, and if Intel can truly deliver on this design then the sky’s the limit.

And the trailer

Quote
 

Without any further ado, here’s the footage of the teaser our spy managed to take:

There you go folks, here’s your very first look at the official Intel Xe GPU, or more accurately the Intel X2 GPU. This short but rather spicy trailer gives a lot away. The design rocks a carbon fibre aesthetic with blue accents (from what I have been told, the blue stripes will be glow in the dark!) and the first reference design will be made in partnership with ASUS. You can also quite clearly see two intake pipes for what appears to be an internal water loop.


My source has told me that the card will actually have two modes. A standard mode, which will allow the dual GPU to function at moderate clock speeds for most users, and a turbo boost mode, which, when connected to the AIO upgrade, will allow the user to reach clock speeds exceeding 2.7 GHz (well, 2.71828 to be exact) on both GPUs! This is an absolutely astonishing feat that allows Intel to reduce the upfront cost of their GPU. You can either buy the card with the AIO as a package or pay less and upgrade later.

 

I am told that Intel is planning to be very competitive on pricing and, when asked, hinted that their flagship would be more affordable than any counterpart on the market. This means we are looking at a maximum MSRP of $699 for the X2 flagship. The X2 GPU will be based on the new 4D XPoint memory and feature the Direct3D 14_2 feature level as far as hardware goes. Here are the complete specs that were discussed during the event:

https://wccftech.com/intel-xe-unleashed-gpu-lineup-leaked-xe-power-2-flagship-graphics-card-roadmap-and-more/

 

Very interesting stuff. Hopefully they'll knock Nvidia down a few notches.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


Soooo April Fools or just WCCFSalt?

I spent $2500 on building my PC and all I do with it is play no games atm & watch anime at 1080p (finally), watch YT and write essays... nothing, it just sits there collecting dust...

Builds:

The Toaster Project! Northern Bee!

 

The original LAN PC build log! (Old, dead and replaced by The Toaster Project & 5.0)

Spoiler

"Here is some advice that might have gotten lost somewhere along the way in your life. 

 

#1. Treat others as you would like to be treated.

#2. It's best to keep your mouth shut; and appear to be stupid, rather than open it and remove all doubt.

#3. There is nothing "wrong" with being wrong. Learning from a mistake can be more valuable than not making one in the first place.

 

Follow these simple rules in life, and I promise you, things magically get easier. " - MageTank 31-10-2016

 

 


its april 1st

8086k

aorus pro z390

noctua nh-d15s chromax w black cover

evga 3070 ultra

samsung 128gb, adata swordfish 1tb, wd blue 1tb

seasonic 620w dogballs psu

 

 


1 minute ago, fantasia. said:

its april 1st

Damn, maybe they got me....

 



That moment when you can't distinguish WCCF's April 1st joke from their regular news.

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones


2 minutes ago, Master Disaster said:

Damn, maybe they got me....

 

Don't feel bad, with WTFtech you only had a 50% chance of getting real news anyway.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


6 minutes ago, Master Disaster said:

Damn, maybe they got me....

 

honestly, we're all just going to have to see how it plays out


 

 


Should have called it Xfire


59 minutes ago, leadeater said:

Moved to off topic, unless I can walk on water and this is not an April fools article. Personally rooting for the ability to walk on water myself, yay Intel!

Considering the article claims it will have 32GB of 4D XPoint RAM, I'd say they got me.

 

And DirectX 14 lmao. It was 7am when I posted this; that's my excuse and I'm sticking to it.

