
Intel confirms that its discrete Xe GPUs will release in 2020

Quote

Intel has confirmed at its Developer Conference ‘IDC’ 2019 in Tokyo that its new discrete Xe GPUs will release in 2020. Moreover, the blue team confirmed that these GPUs will support Ray Tracing, something that will please a lot of gamers.

 

Unfortunately, Intel did not reveal any additional details. Still, it's good to know that these GPUs are still planned for a 2020 release and that they will support ray tracing. Again, Intel did not provide any extra details about the RT implementation, but we suspect the Xe GPUs will have hardware rather than software support.

 

Raja Koduri has tweeted the following image:

 

The picture shows a brand-new Tesla Model X with an interesting license plate, which suggests a June 2020 release for the Xe GPUs.

 

Source: https://www.dsogaming.com/news/intel-confirms-that-its-discrete-xe-gpus-will-release-in-2020-and-will-support-ray-tracing/

Source 2: https://www.tomshardware.com/news/intel-rajaq-koduri-xe-graphics-card-release-date,40562.html

 

Not much to say about this, as there is plenty of speculation within the rumor itself (mostly about whether there is dedicated hardware for ray tracing and other details; the 2020 arrival looks largely settled). But at the very least, 2020 as a whole is shaping up to be a very interesting year for computer enthusiasts (from all of the big 3). Really looking forward to seeing what Intel's first dGPU winds up being, especially which market(s) they intend to enter first (price bracket/tier).

 

Apparently, ray tracing on Intel's GPUs isn't actually confirmed after all:

 

Quote

Intel said translation errors may have led some to believe its upcoming graphics card will have ray tracing support.

 

Source: https://www.pcworld.com/article/3445421/no-intel-didnt-confirm-its-discrete-xe-gpus-will-support-ray-tracing.html


Is it still 14nm?



The hardest thing for the Intel graphics division is going to be drivers. I don't think Intel will have trouble getting OEMs to come on board and use their chips; it's the consumers that Intel has to worry about.


54 minutes ago, DrMacintosh said:

The hardest thing for the Intel graphics division is going to be drivers. I don't think Intel will have trouble getting OEMs to come on board and use their chips; it's the consumers that Intel has to worry about.

Intel has working drivers across all modern game engines, but without high-end hardware available, no one knows what their optimizations need to be. So the drivers will work, but performance is going to be wildly erratic.

 

However, they've previously said their first product would be a GPGPU on 10nm, so consumer-facing parts are still a long way off. They might open up OEM parts first to displace Nvidia, but I'd be surprised if they don't release mobile graphics before we see them on the desktop. Consumer dGPUs just aren't worth jumping into until they've taken more market share elsewhere.

 

As a general reminder, Intel has long been the largest PC GPU vendor by volume, thanks to its integrated graphics.


But I have it on good authority that RT is just a novelty and a sales gimmick. Why is Intel wasting money on it?



Xe-lent 


26 minutes ago, mr moose said:

But I have it on good authority that RT is just a novelty and a sales gimmick. Why is Intel wasting money on it?

RT (DXR) is actually a neat feature when it's open and anyone can use it. What NVIDIA was doing with RTX was mostly proprietary crap, which is why people hated it, along with the fact that only one party supported it and was desperate to make it an exclusive thing. If more vendors do it, DXR will be the default and not some locked-down RTX thing that only works if you have an RTX graphics card. Hell, just the fact of having three GPU vendors for the first time in over a decade and a half is amazing by itself. I just hope Intel gets their shit together on the software and hardware side enough to be a viable option and not something that requires a lot of compromises to be usable, which is what their GPUs have been for ages.
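To illustrate the point about DXR being a vendor-neutral API rather than an Nvidia-specific one, here is a minimal C++ sketch (not from any of the linked articles; it assumes a Windows 10 SDK with D3D12 headers) of how an application asks Direct3D 12 whether the installed GPU supports raytracing. It only uses the standard feature-check call and never asks which vendor made the card:

```cpp
// Minimal sketch: query the generic DXR raytracing tier through D3D12.
// Any vendor's driver (NVIDIA, AMD, Intel) fills in the same structure.
#include <cstdio>
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter, whichever vendor made it.
    ComPtr<ID3D12Device5> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No suitable D3D12 device found.");
        return 1;
    }

    // Generic feature query; the driver reports its raytracing tier here.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5)))) {
        std::puts("Feature query not available (older OS/runtime).");
        return 1;
    }

    std::puts(opts5.RaytracingTier == D3D12_RAYTRACING_TIER_NOT_SUPPORTED
                  ? "DXR: not supported on this GPU/driver."
                  : "DXR: supported (tier reported by the driver).");
    return 0;
}
```

Nothing in that check is tied to RTX branding, which is the whole appeal of DXR as the common path once more vendors implement it.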


If those Intel Xe GPUs support ray tracing, will one of those with ray tracing switched on be called Xe-On (if they do those RTX on/RTX off comparisons)?

Anyway, I won't hold my breath for Intel to sell discrete GPUs that compete with AMD's and Nvidia's gaming GPUs. It would be a good thing if they did, since the GPU market could use more competition to bring prices down to more reasonable levels (in my opinion).


"Support Raytracing", like the intel iGPUs currently "support" DX9? ;)

41 minutes ago, mr moose said:

But I have it on good authority that RT is just a novelty and a sales gimmick. Why is Intel wasting money on it?

It's Nvidia. It's basically PhysX all over again (physics code is now common and standard, rather than "special" or proprietary).


18 minutes ago, RejZoR said:

RT (DXR) is actually a neat feature when it's open and anyone can use it. What NVIDIA was doing with RTX was mostly proprietary crap, which is why people hated it, along with the fact that only one party supported it and was desperate to make it an exclusive thing. If more vendors do it, DXR will be the default and not some locked-down RTX thing that only works if you have an RTX graphics card. Hell, just the fact of having three GPU vendors for the first time in over a decade and a half is amazing by itself. I just hope Intel gets their shit together on the software and hardware side enough to be a viable option and not something that requires a lot of compromises to be usable, which is what their GPUs have been for ages.

Nvidia bad mmmkay.



1 minute ago, mr moose said:

Nvidia bad mmmkay.

??? That's not what they said. If AMD/Intel/ARM were doing the same thing... "We bring you VR + foveated rendering, only available on Intel Xe processors"... we'd all go, "Yeah, you can mask out a render area on any GPU, that's not really proprietary..." (Or any other basic tech. We are not talking entirely "new" tech here, and IMO open APIs are good; just make the underlying code proprietary if you must.)


Just now, TechyBen said:

??? That's not what they said. If AMD/Intel/ARM were doing the same thing... "We bring you VR + foveated rendering, only available on Intel Xe processors"... we'd all go, "Yeah, you can mask out a render area on any GPU, that's not really proprietary..." (Or any other basic tech. We are not talking entirely "new" tech here, and IMO open APIs are good; just make the underlying code proprietary if you must.)

All the arguments when RTX came out were that the tech was a gimmick and a novelty. People didn't even know that most of what we were seeing with RTX was DXR. Honestly, go back and read through the complaints people were making about RT (not even knowing DXR was a thing) being proprietary NVIDIA tech and locked down. It's absolutely mind-numbing how people can even dream this stuff up, let alone believe it.



29 minutes ago, RejZoR said:

RT (DXR) is actually a neat feature when it's open and anyone can use it. What NVIDIA was doing with RTX was mostly proprietary crap, which is why people hated it, along with the fact that only one party supported it and was desperate to make it an exclusive thing. If more vendors do it, DXR will be the default and not some locked-down RTX thing that only works if you have an RTX graphics card. Hell, just the fact of having three GPU vendors for the first time in over a decade and a half is amazing by itself. I just hope Intel gets their shit together on the software and hardware side enough to be a viable option and not something that requires a lot of compromises to be usable, which is what their GPUs have been for ages.

AMD patented their own ray-tracing tech; does that make it just as bad as Nvidia's proprietary RTX feature?

There really needs to be another competitor; both Nvidia and AMD have been driving up prices. And Nvidia is the first with ray tracing on the mainstream desktop, so I wouldn't be surprised if Intel GPUs support RTX in some way.


Just now, mr moose said:

All the arguments when RTX came out were that the tech was a gimmick and a novelty. People didn't even know that most of what we were seeing with RTX was DXR. Honestly, go back and read through the complaints people were making about RT (not even knowing DXR was a thing) being proprietary NVIDIA tech and locked down. It's absolutely mind-numbing how people can even dream this stuff up, let alone believe it.

It was, though only for a few months, because the implementation was atrocious. Similar to PhysX, it was a nice tech demo; by the time physics was common and well implemented, it had become non-proprietary (Havok, HL2's Source engine, etc.) without being locked down.

 

If the full feature set had been there on release, or if adoption had matched the advertising, it wouldn't have been much of a "gimmick". A bit like "HairWorks"... AFAIK any card could do it, so why make it proprietary? Make your version "better", but why lock out users with different hardware?

 

"This Ford car is the only one that can use Petrol... all other cars have a triangular fuel cap, because proprietary!!!" we'd all gawp and laugh at Ford.


2 minutes ago, Blademaster91 said:

AMD patented their own ray-tracing tech; does that make it just as bad as Nvidia's proprietary RTX feature?

There really needs to be another competitor; both Nvidia and AMD have been driving up prices. And Nvidia is the first with ray tracing on the mainstream desktop, so I wouldn't be surprised if Intel GPUs support RTX in some way.

IMO, yes. Patent the implementation by all means, but make the API open (as FreeSync was, IIRC). Play nice!


1 minute ago, TechyBen said:

IMO, yes. Patent the implementation by all means, but make the API open (as FreeSync was, IIRC). Play nice!

What's wrong with DirectX?



And the rumors are that Gen12 will offer DOUBLE the performance of Gen11, which is itself double the performance of its predecessor. And that's from Intel's mouth, not mine or anyone else's. In basically just a few years they would have quadrupled performance. Given Gen11 is only slightly behind mobile Vega, this is a huge step forward.

 

Do the maths: double Vega. Nvidia and AMD had better have the alarm bells ringing, because chances are Intel will be in the same ballpark as them on the first attempt.
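As a rough sanity check of that back-of-the-envelope maths, here is a minimal sketch that treats the rumored figures (Gen11 roughly double its predecessor and only slightly behind mobile Vega, Gen12 double Gen11) purely as assumptions rather than measured numbers:

```cpp
// Hypothetical back-of-the-envelope scaling using only the rumored factors.
#include <cstdio>

int main() {
    const double gen9  = 1.0;          // arbitrary baseline unit
    const double gen11 = gen9 * 2.0;   // "double the performance of its predecessor"
    const double gen12 = gen11 * 2.0;  // rumored: "double the performance of Gen11"
    const double vega  = gen11 * 1.1;  // assumption: Gen11 "only slightly behind mobile Vega"

    std::printf("Gen12 vs Gen9 baseline: %.1fx\n", gen12 / gen9);  // ~4.0x ("quadrupled")
    std::printf("Gen12 vs mobile Vega:   %.1fx\n", gen12 / vega);  // ~1.8x (roughly "double Vega")
    return 0;
}
```

Of course, real GPU performance doesn't multiply this cleanly across generations, so take the result as the shape of the rumor, not a prediction.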


2 hours ago, mr moose said:

Nvidia bad mmmkay.

Because proprietary shit is good, mkay? Coz PhysX worked out so well, right? Or anything else NVIDIA has churned out as proprietary junk, right? Yeah, ray tracing that only works on RTX cards is bad; ray tracing that works on any DXR-capable graphics card is good.


...how did a thread about an Intel GPU turn into ANOTHER Nvidia bashing contest?

 

Really wish people knew how to stay on topic.

 

 

Looking forward to another player in the game; hoping they shake things up. Personally I still don't care about RT, but it's good to know it will be there should people want to use it.



23 minutes ago, Arika S said:

...how did a thread about an Intel GPU turn into ANOTHER Nvidia bashing contest?

 

Really wish people knew how to stay on topic.

 

 

Looking forward to another player in the game; hoping they shake things up. Personally I still don't care about RT, but it's good to know it will be there should people want to use it.

No one is bashing anything. But some people keep on insisting I'm an NVIDIA hater, while I have an NVIDIA graphics card. Apparently being against proprietary tech makes you a hater. Heh. All I said is that I'm looking forward to having a third player in the GPU market, and also looking forward to wider DXR support, which means ray tracing might be closer to becoming a mainstream thing now.


It's fascinating how much pushback Nvidia got from certain corners of the internet on the whole ray-tracing thing, compared to how widespread adoption of ray tracing (Intel, AMD, Sony, Microsoft) has become among upcoming hardware releases.



12 hours ago, mr moose said:

All the arguments when RTX came out were that the tech was a gimmick and a novelty. People didn't even know that most of what we were seeing with RTX was DXR. Honestly, go back and read through the complaints people were making about RT (not even knowing DXR was a thing) being proprietary NVIDIA tech and locked down. It's absolutely mind-numbing how people can even dream this stuff up, let alone believe it.

As I understand it, isn't the big problem with RTX ray tracing that the cards just don't have the horsepower to do much with it? As I recall they've only got 40-something parallel paths for an embarrassingly parallel problem, and I don't think you really get enough until you're at something like four times that number.

 

In the current generation they really only seem to have enough horsepower to support limited, gimmicky effects. It's not like you've got enough to handle, say, the light dynamics of sitting in a BF-110's greenhouse canopy (with mirrors) while being illuminated from multiple directions by searchlights that are also reflecting off the overcast night sky, and then extending the main beams back to the AI to determine which ones can actually see you at the moment.

 

Right now it's just being used for water reflections, and that seems to happily bring even the 2080 Ti to its knees, so it's not something a dev can depend on for critical gameplay functions like line of sight or whether the radar works.
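As a side note on the "embarrassingly parallel" point above: every pixel's rays can be traced independently, so throughput scales almost linearly with the number of parallel ray paths the hardware provides, which is why a small number of RT units limits you to a few effects. A minimal, hypothetical C++ sketch (camera_ray and trace_ray are stand-in placeholders, not any real engine's API):

```cpp
// Minimal sketch of why ray tracing is "embarrassingly parallel": each pixel's
// primary ray is independent of every other pixel, so the work splits cleanly
// across however many parallel ray paths the hardware offers.
#include <cstddef>
#include <cstdio>
#include <vector>

struct Color { float r, g, b; };
struct Ray   { float ox, oy, oz, dx, dy, dz; };

// Hypothetical pinhole camera: map a pixel to a ray direction (placeholder).
Ray camera_ray(int x, int y, int w, int h) {
    return { 0.f, 0.f, 0.f,
             (x + 0.5f) / w - 0.5f, (y + 0.5f) / h - 0.5f, 1.f };
}

// Stand-in for the expensive part: intersect the scene, shade, maybe recurse.
Color trace_ray(const Ray& r) {
    return { r.dx * 0.5f + 0.5f, r.dy * 0.5f + 0.5f, 0.2f };
}

std::vector<Color> render(int width, int height) {
    std::vector<Color> framebuffer(static_cast<std::size_t>(width) * height);

    // Each iteration writes only its own pixel and shares no mutable state,
    // so the loop parallelizes trivially (here via OpenMP, if enabled).
    #pragma omp parallel for collapse(2)
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            framebuffer[static_cast<std::size_t>(y) * width + x] =
                trace_ray(camera_ray(x, y, width, height));

    return framebuffer;
}

int main() {
    auto image = render(320, 180);
    std::printf("Traced %zu independent pixels.\n", image.size());
    return 0;
}
```

The catch is that trace_ray is extremely expensive per ray, so with only a few dozen hardware paths you can afford it for a handful of effects (reflections, a few shadows) rather than whole-scene lighting.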


9 minutes ago, Harry Voyager said:

As I understand it, isn't the big problem with RTX ray tracing that the cards just don't have the horsepower to do much with it? As I recall they've only got 40-something parallel paths for an embarrassingly parallel problem, and I don't think you really get enough until you're at something like four times that number.

 

There are lots of problems with any new tech when it comes out (RTX is no different in that regard). But none of that legitimizes arguments that it's a novelty or a sales gimmick when literally everyone is jumping on board and trying to push the tech forward, on top of the fact that many companies have been trying to make it a mainstream thing for decades.



5 minutes ago, mr moose said:

There are lots of problems with any new tech when it comes out (RTX is no different in that regard). But none of that legitimizes arguments that it's a novelty or a sales gimmick when literally everyone is jumping on board and trying to push the tech forward, on top of the fact that many companies have been trying to make it a mainstream thing for decades.

Agreed on the tech side, but does that make the limited implementation of it, as incorporated in the RTX 20XX cards, worth spending that kind of money on now, or when they were released?

 

With the GTX 10XX line's market penetration, those cards are likely to remain the minimum useful target, but the RTX 20XX cards don't offer enough power to do the big stuff, so despite the cost, I don't see how they are going to age well at all.

