
[Rumour] [AdoredTV] Intel Xe graphics project effectively dead?

Never-edge

According to this AdoredTV article, Intel's Xe graphics project, the Xe team, and the discrete GPU-based products are already effectively dead before they even launch their first discrete graphics card.

Also, it’s rumored that Raja Koduri is leaving Intel within the next few months.

 

 

Quote

.....

 

Finally, our Taiwanese sources say Intel will eventually cancel Xe and dissolve the graphics division. We can’t be quite sure when this will happen, but given DG3’s cancelation and its prior launch target of 2023, 2023 could be the year Xe finally ends. The reason for Xe’s cancelation is just down to money. It has cost Intel about $500 million to fund Arctic Sound and DG1 (with Arctic Sound taking the lion’s share of that sum of money) and these graphics projects have yielded few results. Bob Swan is a financially focused CEO and he will seek to start cutting Intel’s lowest margin and least profitable ventures. Xe is first on the chopping block.
 

So, there you have it, the rough timeline of Intel Xe Graphics from today to 2023. Intel’s second foray into high end graphics ended much in the same way as the first: a slow death preceded by incredible hope and hype. 

 

Overall, it’s incredible stuff if true, especially on top of all Intel’s CPU [manufacturing] woes, and Jim Keller’s premature departure.

It really does look like Intel is in big trouble, and it's going to need a very dramatic turnaround...

My systems:


Family Gaming PC:

  • Ryzen 5600X w/ 32 GB - 2 DIMMs of Corsair 3600 MT/s DDR4 RAM
  • ASRock Phantom Gaming 4
  • Radeon RX 6700 XT (Powercolor Hellhound)
  • EVGA 750W BR PSU

 

Workstation:

  • Ryzen 7950X w/ 64 GB - 2 DIMMs of 5200 MT/s DDR5 RAM
  • ASRock X670E Steel Legend
  • Radeon RX 6600 - ASRock
  • 10 GbE NIC - Asus XG-C100F
  • Seasonic  1200W Platinum Prime PSU

 

Virtualization server:

  • Ryzen 3950X w/ 64 GB - 2 DIMMs of 3200 MT/s DDR4 ECC RAM
  • ASRock X570 Taichi
  • 10 GbE NIC - Asus XG-C100F
  • HBA - Broadcom / LSI SAS2008
  • Seasonic 450W Gold PSU

 


I hope the rumors aren’t true, but I can see it happening, I guess. It seems like a lot of companies are quick to end a product if the returns aren’t there in the short term. Or they already see that the long-term investment isn’t worth it... I’d hope they’d know that building a good GPU takes several generations.


Can’t polish a turd, as they say.

AMD blackout rig

 

cpu: ryzen 5 3600 @4.4ghz @1.35v

gpu: rx5700xt 2200mhz

ram: vengeance lpx c15 3200mhz

mobo: gigabyte b550 auros pro 

psu: cooler master mwe 650w

case: masterbox mbx520

fans:Noctua industrial 3000rpm x6

 

 


37 minutes ago, RotoCoreOne said:

It seems like a lot of companies are quick to end a product if the returns aren’t there in the short term

Which is sad, because when entering a new market like that, it should be accepted that short-term results aren’t possible and shouldn’t be the priority.


I hope it's not true because I really just want to see what could've been.

 

Also what happens to all of the supercomputers that were planning on using Intel Xe compute cards? Doesn't Intel face a lot of broken contract fees for not having the GPUs ready for them?


Man, I thought if anyone could do it, it was gonna be Intel. I guess not. 

 

1 hour ago, Never-edge said:

According to this AdoredTV article

Oh..


 

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


This is terrible. A technologically illiterate CEO is going to cut costs by reducing R&D which means Intel will become less relevant if not totally obsolete. If true, this would be yet another example of a shortsighted business person crashing a tech company.

 

Just to recap, Intel’s CPU division has been struggling and they recently announced they are postponing their 7nm chips. Before that, they gave up on making 5G modems, one of the hottest techs according to every analyst on Wall St.


Doubt it. Unless Intel is thinking long term, they most likely don’t need to shrink assets just yet.


Having seen Larrabee play out, this was always the likely outcome. There was a chance they might not repeat their most recent attempt at anything other than iGPUs, but it was always a long shot.


That would be stupid, sacking something before it even goes to market. And I was already excited to see what Intel had cooked up in graphics. Even if they didn't have top-of-the-line stuff, mid-range and maybe something just reaching into the high end would be enough. Also, having a third player in the discrete GPU segment would be nice.

 

And even if the info is correct, they'll probably just replace their current iGPU tech with this and stick with it. And it'll just remain that, a mediocre iGPU at best, meanwhile AMD will churn out powerful APUs... Which is why it's laughable how some mention Intel when talking about graphics market share. Just because you stick a shitty iGPU in almost every CPU doesn't mean you dominate the market or have the biggest market share. It's absurd to compare iGPUs and discrete graphics cards on the same level. One you have to purposely buy, and the other is there on the CPU even if it'll never be used.

 

Also, we can't compare Larrabee to Xe. Larrabee was a weird project to use x86 cores to run GPU instructions, which was interesting, but in all honesty I knew it would never work. Graphics, despite being much more flexible these days, is still very much a fixed-function thing, for which you need highly specialized compute cores to do the rendering. Xe, on the other hand, is an actual GPU just like all the Radeon and GeForce GPUs, and it has a valid shot at the market. The problem is financial people who are entirely disconnected from the actual market and just think, oh, if we can't make it in 1 year, we're just gonna sack it. AMD didn't have anything good for a while and then they pulled RX Navi almost out of nowhere, and it was super competitive almost up to the top-of-the-line competitor products.

 

I still hope this is BS and the dumb suitcase knobs at Intel won't sack Xe. Who knows, maybe I'll rock an Intel graphics card on a Ryzen platform next time... That would be a funny turnaround...


14 minutes ago, RejZoR said:

That would be stupid, sacking something before it even goes to market.

Not necessarily. Intel could have assessed that bringing a competitive product to market, at a time when their main business, x86 CPUs, is struggling to compete, is not worth the cost. 

 

 

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

4 minutes ago, DrMacintosh said:

Not necessarily. Intel could have assessed that bringing a competitive product to market, at a time when their main business, x86 CPUs, is struggling to compete, is not worth the cost. 

 

 

Sacking something after you've spent all the R&D on it and already have working products is what's actually stupid. Also, it could be what keeps them afloat, just like graphics did for AMD when their CPU division was struggling. Unless it's so crap it won't even do that on any level. Remember, AMD was kept up by Polaris-class cards, the RX 480/RX 580 and RX 590... mid-range cards.


2 minutes ago, RejZoR said:

Also, it could be what keeps them afloat, just like graphics did for AMD when their CPU division was struggling. Unless it's so crap it won't even do that on any level.

I have to assume that this was the case. How far do you think Intel could have gotten in terms of building an architecture from scratch by working off of old information gathered from a failed ex-leader of Radeon? 

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

3 hours ago, RotoCoreOne said:

I hope the rumors aren’t true, but I can see it happening, I guess. It seems like a lot of companies are quick to end a product if the returns aren’t there in the short term. Or they already see that the long-term investment isn’t worth it... I’d hope they’d know that building a good GPU takes several generations.

Well, when it's a hardware product, you don't want to sink money into it for 20 years like Intel had to do with Itanium (aka the Itanic).

 

Intel has ventured into and abandoned graphics 3 or 4 times now, and it's just like, please just bloody learn not to be stupid one day, Intel, and either license AMD's iGPU part or license something from Nvidia if you're that desperate to produce a GPU. The Xe stuff was likely still based on Larrabee's tech, which means it's a buttload of P54 cores with AVX instructions on them, and not the most efficient thing in the world, even if it could be built on a 7nm fab. The timing of this loss with Apple dumping Intel for laptops is pretty indicative of why.

 

Intel has hit a wall.

 

I'm sure the engineers can still get one or two more die shrinks, but I think we're at the end. Unless something comes along and allows for like 500pm (that's 0.5nm), I think we're done. Future chips might have to be built at the molecular level to get any smaller. A water molecule is 0.27nm; DNA is 2nm. Once chips are built at the molecular level, that's it, you can't shrink molecules. The absolute "wall", more or less, is 0.2nm, as that's the size of one Si atom. So a chip would have to be 3D printed at the atomic level to get there, which requires an entirely different chip design that is built in layers.
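
As a minimal sketch of that "one or two more shrinks" arithmetic: the numbers below assume each generation ideally halves the feature size, which is a big simplification since marketed node names no longer track physical dimensions.

import math

# Sketch: how many idealized "halving" shrinks fit between a 7nm-class
# node and the ~0.2nm size of a single silicon atom mentioned above.
# Assumption: each generation halves the feature size (illustrative only;
# marketing node names don't correspond to physical gate size).
current_nm = 7.0
si_atom_nm = 0.2   # approximate diameter of one Si atom

halvings = math.log2(current_nm / si_atom_nm)
print(f"~{halvings:.1f} idealized halvings before hitting atomic scale")
# prints ~5.1, and real shrinks stop delivering long before the atomic limit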

 


6 minutes ago, DrMacintosh said:

I have to assume that this was the case. How far do you think Intel could have gotten in terms of building an architecture from scratch by working off of old information gathered from a failed ex-leader of Radeon? 

 

That's assuming all he had was old information and that he was a failure. All we know about AMD's failures is that the GCN uArch didn't lend itself to improvements the way NVIDIA's uArch did. I don't think the best GPU engineers in the world could have made GCN competitive in the gaming market.

 

If Intel's goal is compute and supercomputers, then all bets are off. No amount of leaks and rumors is going to do justice to whatever they end up with.

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


42 minutes ago, DrMacintosh said:

I have to assume that this was the case. How far do you think Intel could have gotten in terms of building an architecture from scratch by working off of old information gathered from a failed ex-leader of Radeon? 

Failed? What is your measure of success then? Radeon was competing well in the market, which is not defined by the 1080 Ti/2080 Ti alone, with significantly fewer resources than Nvidia (and Intel, who is a factor here). Sure, high-end performance is important and is a big component of brand reputation, but do you really think any single person from, say, Nvidia could have done any better in the same situation?

 

Fiscally Polaris was an important success for Radeon and the return on investment has been better than anything Nvidia has done.

 

And this is assuming that all the problems were even hardware related at all; throw in the same software support and ecosystem as Nvidia has around its hardware and CUDA, and things would be very different, which actually had little to do with Raja.

 

However, Raja wouldn't have been pivotal to Intel's success at making dedicated GPUs anyway; Intel's been in the graphics sector for a very long time and doesn't need to be taught how to make GPUs, and Raja is in an executive position. That's like saying a single person was responsible for building the Saturn V rocket.


3 minutes ago, leadeater said:

Failed? What is your measure of success then? Radeon was competing well in the market, which is not defined by the 1080 Ti/2080 Ti alone, with significantly fewer resources than Nvidia (and Intel, who is a factor here). Sure, high-end performance is important and is a big component of brand reputation, but do you really think any single person from, say, Nvidia could have done any better in the same situation?

 

Fiscally Polaris was an important success for Radeon and the return on investment has been better than anything Nvidia has done.

 

And this is assuming that all the problems were even hardware related at all; throw in the same software support and ecosystem as Nvidia has around its hardware and CUDA, and things would be very different, which actually had little to do with Raja.

 

GCN performed very well in the compute space. It may not have sold as many units as NVIDIA, but I'm led to believe that was not due to the hardware sucking. 

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 minute ago, mr moose said:

GCN performed very well in the compute space. It may not have sold as many units as NVIDIA, but I'm led to believe that was not due to the hardware sucking. 

The problem with being good at compute was that there wasn't the software support like CUDA, so Radeon's market share was extremely small because of that. With the funding RTG got, that basically led to a situation where they could either develop new hardware or a software ecosystem; only one was possible, and in that situation you always go with hardware.


Those bean counters will be counting all the way to an early grave at this point. Given the reputation of AdoredTV I think I'll apply some scepticism though.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


4 hours ago, scottyseng said:

Also what happens to all of the supercomputers that were planning on using Intel Xe compute cards? Doesn't Intel face a lot of broken contract fees for not having the GPUs ready for them?

I don't think that's at risk: according to OP, Xe will supposedly be shut down at the end of the current roadmap. That would mean they will deliver anything already planned/contracted, but development would halt. In turn, that means we should see the graphics division start to shrink quite some time before 2023.

 

That's also why I don't think this (if confirmed) is a pure short-term profit decision, but more of an update on long-run prospects based on what they managed to secure so far, what they perceive demand moving toward, and what they are seeing in the lab.

I mean, with the level of information I have (zero) it could be anything, but we have seen cases of product development halted for more than short-term motivations: for example, AMD never bothered bringing Steamroller and Excavator to the FX lineup, and while I personally would have liked to see what a late Excavator FX looked like, the truth is it wouldn't have been worth it for AMD. If instead of AMD it had been, I don't know, Nvidia making a foray into the CPU market, that could have been the end of the CPU division for some time.


1 hour ago, 5x5 said:

This is to be expected. Intel tried this once and they failed. Rumours of Xe being slower than a 1050 Ti have been circling for months. Guess those probably ended up true, so Intel cancelled it before releasing a 300W card that can barely match a 1660/5500 XT.

That doesn't seem very likely to be the case to me. We know what the laptop configuration can do, so for it to be that bad at the proposed configuration it would have to be the worst scaling of anything ever in history.


Just now, leadeater said:

That doesn't seem very likely to be the case to me. We know what the laptop configuration can do, so for it to be that bad at the proposed configuration it would have to be the worst scaling of anything ever in history.

We don't have a single laptop Xe GPU out. All we have is Iris Pro, which is based on last gen. It's all rumours.


5 minutes ago, 5x5 said:

We don't have a single laptop Xe GPU out. All we have is Iris Pro, which is based on last gen. It's all rumours.

 

Quote

Apart from showing us that the integrated Xe graphics can handle running Battlefield 5, it also shows us exactly how well it runs the game. In the video, the FPS game’s running at 1080p, high settings, DX11, and sits at around 30fps (give or take a couple of frames) for the entirety of the video. This isn’t a benchmark, so we can’t extrapolate too much from these results, but it certainly looks promising.

Even without full independent testing data, we know roughly what an integrated laptop Xe is doing; an AIB card scaled up and also using 300W performing around a 1660 just is not likely, going off current information.
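
As a back-of-the-envelope sketch of that scaling argument: the power figures and efficiency factor below are illustrative assumptions, not leaked specs.

# Back-of-the-envelope sketch of the scaling argument above.
# Assumptions (not leaked specs): the integrated Xe demo holds ~30 fps in
# BF5 at 1080p high within roughly a 25W shared CPU+iGPU power envelope.
igpu_fps = 30.0
igpu_watts = 25.0          # assumed envelope for the integrated part
dgpu_watts = 300.0         # the rumoured AIB card's board power

# Even with heavily diminishing returns (say only 40% of ideal linear
# power scaling actually realized), the result is nowhere near 1050 Ti class.
scaling_efficiency = 0.4
naive_fps = igpu_fps * (dgpu_watts / igpu_watts) * scaling_efficiency
print(f"naive estimate: ~{naive_fps:.0f} fps")   # ~144 fps in the same BF5 scenario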

 


Just now, leadeater said:

 

Even without full independent testing data, we know roughly what an integrated laptop Xe is doing; an AIB card scaled up and also using 300W performing around a 1660 just is not likely, going off current information.

 

Well, from what I know, Lenovo at least are not interested in using Xe in laptops at all. Something about C-states being too power hungry. Sooo, that's what I'm basing my assumption off of.

