
Intel confirms that its discrete Xe GPUs will release in 2020

3 minutes ago, Harry Voyager said:

Agreed on the tech side, but does that make the limited implementation of it, as incorporated on the RTX 20XX cards, worth spending that kind of money on, either now or when they were released?

 

With the GTX 10XX line's market penetration, those are going to be the likely minimum useful level, but the RTX 20XX don't offer enough power to do the big stuff, so despite the cost, I don't see how they are going to age well at all.

That's up to each buyer to determine.  There will always be people who complain no matter what any company does.  Had Nvidia made RTX cards $200 cheaper, people would have complained they were being unfair to those who recently bought the 10XX series; as it was, they complained the 20XX series cost too much for what it does.  All new tech is the same in that regard: a high premium for the bleeding edge and little support.

 

All that aside, I was only pointing out the asinine commentary that RT is a gimmick or novelty.



14 hours ago, mr moose said:

But I have it on good authority that RT is just a novelty and sales gimmick.  Why is Intel wasting money on it?

Right now it is; there are, what, 3 games that support it and another 3 on the way over the next 7-8 months.

 

Once it's matured and been adopted a lot more, then it will be more than a novelty and a marketing tool. But realistically we're looking at a few years before that happens; let's say 2021 or 2022 and the RTX 4xxx series (if they continue the current practice of updating the existing line-up with a 'super' refresh).

 

To become more than the gimmick it currently is requires market penetration and adoption. The 2060-2070 can't use full-on RT without a severe penalty, essentially making those games unplayable due to such a low frame rate. It requires AMD and Intel to come on board and support it, and it requires developers to support it in their games.

 

All of that takes time. I mean FFS, we still see games being released that use DX11... DX12 hasn't even reached full market adoption yet.



1 minute ago, Anomnomnomaly said:

Right now it is; there are, what, 3 games that support it and another 3 on the way over the next 7-8 months.

 

Once it's matured and been adopted a lot more, then it will be more than a novelty and a marketing tool. But realistically we're looking at a few years before that happens; let's say 2021 or 2022 and the RTX 4xxx series (if they continue the current practice of updating the existing line-up with a 'super' refresh).

 

To become more than the gimmick it currently is requires market penetration and adoption. The 2060-2070 can't use full-on RT without a severe penalty, essentially making those games unplayable due to such a low frame rate. It requires AMD and Intel to come on board and support it, and it requires developers to support it in their games.

 

All of that takes time. I mean FFS, we still see games being released that use DX11... DX12 hasn't even reached full market adoption yet.

How can it be a gimmick when AMD, Intel and Nvidia amongst others are all building in RT support?

 

If that is the case, then everything is a gimmick when it is first released, and so there is literally no point in calling anything a gimmick; you may just as well call it a product in its infancy instead.  No, people are using the terms gimmick and novelty in a very intentional way here: they are trying to insinuate it is akin to a plastic whistle you get in a showbag.

 

And I am pretty sure people are not using the term novelty by its dictionary definition (something being new and interesting), because that is at odds with the term gimmick.  They are using it by its colloquial definition: something that is new but whose interest will wear off.

 

 



My question is: if Intel enters the GPU market and does well, will Nvidia counter by entering the x86 CPU market?



13 minutes ago, SuperCookie78 said:

My question is: if Intel enters the GPU market and does well, will Nvidia counter by entering the x86 CPU market?

Only if Nvidia manages to procure access to a set of x86 IP (the majority of which is held in a stranglehold by Intel/AMD) :P.


1 hour ago, thorhammerz said:

Only if Nvidia manages to procure access to a set of x86 IP (the majority of which is held in a stranglehold by Intel/AMD) :P.

That is true, so probably not, but it would be so cool to see a three-way battle between parts on both the CPU and GPU sides.



20 hours ago, MeatFeastMan said:

And the rumors are that Gen12 will offer DOUBLE the performance of Gen11, which is double the performance of its predecessor. And that's from Intel's mouth, not mine or anyone else's. In basically just a few years they would have quadrupled performance. Given Gen11 is only slightly behind mobile Vega, this is a huge step forward.

 

Do the maths: double Vega. Nvidia and AMD had better have the alarm bells ringing, because chances are Intel will be in the same ballpark as them at the first attempt.

Even at double from Gen 11 to Gen 12, Intel is still going to be about 50% behind in performance per area of silicon. The one slide where they showed the Ice Lake iGPU on par with the AMD APU required double the memory bandwidth (and iGPUs are always bandwidth starved) and a significantly higher-clocked CPU. Intel is still way behind AMD & Nvidia at even moderate power levels for their GPUs.

 

Maybe by Gen13 they'll be on par in the >75 W range, but it's going to be a while. It's why Intel is going after server parts first, as raw FP throughput should be easier to do.


8 hours ago, mr moose said:

There are lots of problems with new tech when it comes out (RTX is no different in that regard).  But none of that legitimizes arguments that it is a novelty or sales gimmick when literally everyone is jumping on board and trying to push the tech forward, on top of the fact that many companies have been trying to make it mainstream for decades.

Turing RTX is a gimmick. Or a Tech Demo, if you prefer. It actually exists in Turing for two reasons: 1) Because AMD/Sony/MS were bringing it to the next console generation, so they "had" to get out first (that's just how Nvidia operates) and 2) so Nvidia had a public justification for wasting so much die space on units (Tensor Cores) that are really only for certain professional workloads.

 

Turing RTX is the Bulldozer of Ray Tracing. Internal bottlenecks mess up a good idea. Nvidia's 2nd gen RTX will be upwards of 5-10x faster because they'll remove the timing bottleneck, which will make Turing EOL for any DXR game pretty much the instant Ampere launches. (Or whatever Nvidia calls their next gaming design. Ampere still might just be Server only.)


22 hours ago, RejZoR said:

RT (DXR) is actually a neat feature, when it's open and anyone can use it. What NVIDIA was doing (RTX) was mostly proprietary crap, which is why people hated it, along with the fact that only one party supported it and was desperate to make it an exclusive thing. If we have more vendors doing it, DXR will be the default and not some locked-down RTX thing that only works if you have an RTX graphics card. Hell, just the fact of having 3 GPU vendors for the first time in over a decade and a half is amazing by itself. I just hope Intel will get their shit together on the software and hardware side enough to be a viable option and not something that requires a lot of compromises to be usable, which is what their GPUs have been for ages.

Well, Nvidia RTX is DXR with hardware-accelerated paths in it, which is supported as part of DXR. What you're complaining about is Nvidia being the only vendor ready with something that does RTX and the only vendor currently willing to support it in games.


37 minutes ago, leadeater said:

Well, Nvidia RTX is DXR with hardware-accelerated paths in it, which is supported as part of DXR. What you're complaining about is Nvidia being the only vendor ready with something that does RTX and the only vendor currently willing to support it in games.

No, I'm saying that I don't give a flying fart if RTX runs through DXR. What I care about is whether a game that says RTX all over the place can also use the effects on non-NVIDIA cards through DXR itself. Battlefield V does, because they openly state DXR in the settings and they openly talked about it being DXR as is. I'm not so sure about other games like Metro. I'm talking about feature exposure to the graphics cards. Because if it's RTX graphics cards only and nothing else, then it sucks donkey balls, and I wouldn't be surprised if it was specifically locked down to NVIDIA RTX series cards. Wouldn't be the first time nor the last... Such locking down just makes adoption of the tech even longer and more painful. But if the effects also work on the new AMD cards that will allegedly have ray tracing, and the same for Intel's Xe cards, then I'm fine with it even if it says RTX in the settings. But I have doubts about that... Nothing sucks more than having a game and the "wrong" graphics card because NVIDIA locked it down to their cards. That's the issue I'd have, and I'd have it with ANY brand, not just NVIDIA. It's just that AMD prefers to make open standards from the get-go which anyone can adopt, so it's hard to complain about them, you know... We'll see how Intel operates in this regard. Will they push the industry forward, or lock down and stagnate by doing proprietary shit? We'll see...


42 minutes ago, leadeater said:

Well, Nvidia RTX is DXR with hardware-accelerated paths in it, which is supported as part of DXR. What you're complaining about is Nvidia being the only vendor ready with something that does RTX and the only vendor currently willing to support it in games.

Scuttlebutt is that Nvidia has to pay for RTX modes because it's horrible to code for and the engines aren't ready to actually take advantage of DXR yet anyway. Turing RTX is simply a tech demo that somehow made it to production hardware. Nvidia still has to try to sell it, but gen 2 is going to obliterate gen 1 RTX and it won't even be close.


My Riva TNT2 Vanta supports raytracing. Doesn't mean that it works well.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


4 hours ago, RejZoR said:

Because if it's RTX graphics cards only and nothing else, then it sucks donkey balls, and I wouldn't be surprised if it was specifically locked down to NVIDIA RTX series cards.

For DXR implementations there is no feature lock-in or exclusivity. Nvidia has exclusive acceleration hardware capabilities, which may make enabling the feature exclusive to supported cards, but that is a hardware capability and performance issue. It really isn't any different from other DirectX features, where different GPUs support different parts of the hardware capability matrix, which flows on to what the end user can actually enable.

 

That doesn't make it locked down to Nvidia; it just means only Nvidia has supported hardware and the willingness to support its usage. You could do everything with DXR right now, at feature parity to Nvidia RTX, using the legacy (software) support model, but you'd get less than 1 FPS, so no one is supporting it.

 

RTX is just Nvidia's marketing brand name for their hardware acceleration features, which don't work without DirectX DXR. Hardware acceleration is not mandatory in DXR either, but you really, really want/need it.
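
To make that concrete, here is a minimal sketch (C++ against the standard D3D12 headers; the device object and everything around it are assumed to already exist, and the function name is just a placeholder) of the kind of capability query an engine can use to find out whether the installed GPU/driver exposes a hardware DXR tier at all; this is roughly what decides whether a game even lets you switch ray tracing on:

#include <d3d12.h>

// Sketch only: returns true if the GPU/driver pair reports any hardware DXR tier.
// Device creation and error handling beyond the query itself are omitted.
bool SupportsHardwareDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options, sizeof(options))))
    {
        return false; // the query itself is unsupported on this runtime
    }
    // TIER_NOT_SUPPORTED means only the (unusably slow) software path exists.
    return options.RaytracingTier != D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
}

Whether a given card clears that check is a driver/hardware question, not an API one, which is the point being made above.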


Yeah, but if having an RTX card is a prerequisite to enable RTX, then it doesn't matter how many DXRs are underneath; you can't use it. And that's what I fear with NVIDIA. Always...


4 minutes ago, RejZoR said:

Yeah, but if having an RTX card is a prerequisite to enable RTX, then it doesn't matter how many DXRs are underneath; you can't use it. And that's what I fear with NVIDIA. Always...

And you needed a GTX 400 series card for DX11; how did that turn out? DX11 gained mass market appeal, but you had to have the right GPU for it. There are plenty of features I cannot enable because I'm still using 290Xs.

 

Sure, developers are not going to go back and backport AMD support into current RTX games, but as soon as AMD has something to offer it will be implemented, games will support both, and you'll similarly need a current AMD GPU to use them. And we are likely going to have dual-stack DXR in games for 3 generations while the industry sees where and how things progress, where common standards can be applied, and the hardware structure of RT accelerators moves towards shader-core levels of standardization.


11 hours ago, VegetableStu said:

 

It's kind of disappointing that journalists are using Google Translate.  That would be like your wedding photographer using Paint.  Be professional or go home.



5 minutes ago, mr moose said:

 

It's kind of disappointing that journalists are using Google Translate.  That would be like your wedding photographer using Paint.  Be professional or go home.

The best machine translator is DeepL.com. It's actually amazing, a lot better than Google Translate.



On 10/11/2019 at 9:33 AM, mr moose said:

What's wrong with DirectX?

Is it proprietary? AFAIK anyone can program for it / implement it: AMD, Intel or Nvidia.


57 minutes ago, TechyBen said:

but I ask... can I run DXR games on any hardware

Yes, you can run DXR on any GPU, even an Intel iGPU.

 

Quote

DXR introduces 4 main elements to the DirectX 12 API:[1]

  • An "acceleration structure" which is an object that holds a representation of a 3D environment so that objects contained in the scene can be found quickly by the GPU
  • A command list method called DispatchRays that controls raytracing
  • HLSL shader types appropriate for raytracing
  • The Raytracing pipeline state, comparable to the existing Graphics and Compute pipeline states

 

If, however, you want to use RTX, you need supported hardware.
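
To give a sense of the four API elements quoted above, here is a tiny, hypothetical sketch (same C++/D3D12 setting as the earlier example; the acceleration structure, shader tables and raytracing state object are all assumed to have been built elsewhere, and the names are placeholders) of the point where a DXR renderer actually launches rays, i.e. the DispatchRays command and the raytracing pipeline state from the list:

#include <d3d12.h>

// Sketch only: `desc` must already reference valid ray-gen/miss/hit shader tables.
void TraceFrame(ID3D12GraphicsCommandList4* cmdList,
                ID3D12StateObject* raytracingPipeline,
                D3D12_DISPATCH_RAYS_DESC desc,
                UINT width, UINT height)
{
    desc.Width  = width;   // one ray-generation invocation per pixel
    desc.Height = height;
    desc.Depth  = 1;

    cmdList->SetPipelineState1(raytracingPipeline); // the raytracing pipeline state
    cmdList->DispatchRays(&desc);                   // the DispatchRays method
}

None of this is vendor-specific; what RTX adds is the silicon (plus GameWorks-style tooling) that makes this call return at playable frame rates.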

 

On 10/11/2019 at 9:29 PM, TechyBen said:

IMO yes. Patent the implementation by all means, but make the API open (as FreeSync was, IIRC). Play nice!

The API is open: that is DXR. The patented implementation is RTX. It is literally happening the way you want it to.


- Thread cleaned -

Let's not get caught up in pointless arguments & personal insults. If you can't discuss the topic in a mature manner the thread will be locked.



5 hours ago, leadeater said:

Yes, you can run DXR on any GPU, even an Intel iGPU.

 

 

If, however, you want to use RTX, you need supported hardware.

 

The API is open: that is DXR. The patented implementation is RTX. It is literally happening the way you want it to.

Wait what? "DXR [can run] on any GPU" and "to use RTX you need supported hardware".

 

Is RTX not NVidia's proprietary solution? Do BF and Tomb Raider use DXR on DirectX 12 or proprietary RTX?

Not fanboying, as others have accused; I feel the same about AMD hairworks or Intel 4K DRM (though I think in Intel's case they were just supporting it in hardware before AMD got there?).

 

If AMD/Intel can plug into those games then that's great. If it's like PhysX, that's really annoying.


3 hours ago, TechyBen said:

Wait what? "DXR [can run] on any GPU" and "to use RTX you need supported hardware".

 

Is RTX not NVidia's proprietary solution? Do BF and Tomb Raider use DXR on DirectX 12 or proprietary RTX?

Not fanboying, as others have accused; I feel the same about AMD hairworks or Intel 4K DRM (though I think in Intel's case they were just supporting it in hardware before AMD got there?).

 

If AMD/Intel can plug into those games then that's great. If it's like PhysX, that's really annoying.

It doesn't matter what you wish; everyone in the industry knows that doing ray tracing on existing hardware for current-gen games is impossible, so Microsoft, along with Nvidia, AMD and Intel, got together to come up with an extension to DirectX 12 called DXR. DXR enables new pipeline paths and support for hardware acceleration, as well as a software emulation layer which will never be used for actual games (beyond ridiculously simple/non-complex ones).

 

RTX is not DXR; RTX is the hardware acceleration that DXR allows to exist, as well as the development framework to implement it (much the same as GameWorks).

 

Yes, any GPU can run DXR, but nobody is going to do it because it'll be garbage, and idealism won't change that.

 

And yes, BF and Tomb Raider are using DXR; you cannot use RTX without it. AMD and Intel cannot plug in to those games because they have no DXR support outside of the software emulation layer, and thus no support for them was implemented in the games in any way.

 

On 10/11/2019 at 9:29 PM, TechyBen said:

IMO yes. Patent the implementation by all means, but make the API open (as FreeSync was, IIRC). Play nice!

So I'll point to your own words again: that is what happened.

 

RTX = Patented Implementation

DXR = Open API


1 hour ago, leadeater said:

 

RTX is not DXR; RTX is the hardware acceleration that DXR allows to exist, as well as the development framework to implement it (much the same as GameWorks).

 

Yes, any GPU can run DXR, but nobody is going to do it because it'll be garbage, and idealism won't change that.

 

And yes, BF and Tomb Raider are using DXR; you cannot use RTX without it. AMD and Intel cannot plug in to those games because they have no DXR support outside of the software emulation layer, and thus no support for them was implemented in the games in any way.

The question is how much of the implementation in current RT games is built to be completely platform agnostic and just use DXR, and how much is built (likely with Nvidia funding/support) to specifically utilize RTX exclusive features. 

 

Hopefully, once AMD and Intel have RT cards on the market, you can plug one in and load up BF or Tomb Raider and it'll Just Work™, with maybe a handful of RTX optimized effects missing. 

 

The worry, though, is that games are currently being built with RTX at the core, meaning that while they may technically still be running over DXR, we could wind up with a buggy or outright incompatible experience once new cards come out. I'm sure it would be relatively simple to patch, but how many devs are going to go back and rework parts of their engine for a game that may be 2+ years old at that point?

 

I'm not saying that the second scenario is the more likely one, just that it is a concern. The one saving grace is that at this rate there will be like a dozen RTX games at most out by the time AMD rolls their implementation out, so in the absolute worst-case scenario I doubt there's going to be any significant portfolio of RTX "exclusive" games. And once consoles have AMD-based RT hardware, any future games are pretty much guaranteed to be running the most open standard of RT available, just for platform agnosticism.

 

Spoiler

As a side note, I wonder how the PS5 is going to implement things. Xbone is obviously going to use some version of DXR, but have we gotten any more development towards Vulkan RT? 

 


On 10/11/2019 at 2:48 PM, melete said:

It's fascinating how much pushback Nvidia got from certain corners of the internet on the whole raytracing thing, compared to how widespread (Intel, AMD, Sony, Microsoft) adoption of raytracing has become among upcoming hardware releases.

It's more fascinating how everyone that was pushing against ray tracing is now backtracking and saying it was always "NVidia's RTX is just a gimmick that'll die out! Everyone else doing ray tracing is right! Don't look at my reply history!"



This topic is now closed to further replies.
