
Low-end Intel Xe Graphics card specs & performance detailed in benchmarks

Results45
24 minutes ago, LAwLz said:

No, you're reading that incorrectly. Zen APUs beat the Intel GPU in synthetics as well as in gaming.

The average lead of the AMD GPUs in synthetic benchmarks is 13.34%.

The average lead for AMD GPUs in gaming benchmarks is 13.9% (excluding Rise of the Tomb Raider, which is an outlier).

 

So the results in synthetic benchmarks are pretty much the same as they are in gaming benchmarks.
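
For illustration, a minimal Python sketch of how average leads like those can be computed while excluding an outlier; the benchmark names and percentages below are hypothetical placeholders, not the actual data behind the quoted averages:

# Average per-benchmark lead, with optional outlier exclusion.
# All benchmark names and percentages here are made-up placeholders.

def average_lead(leads, exclude=()):
    """Mean percentage lead across benchmarks, skipping any named in `exclude`."""
    kept = [pct for name, pct in leads.items() if name not in exclude]
    return sum(kept) / len(kept)

synthetic = {"Synthetic A": 12.0, "Synthetic B": 14.7}
gaming = {"Game A": 11.0, "Game B": 16.8, "Rise of the Tomb Raider": 48.0}

print(f"Synthetic lead: {average_lead(synthetic):.2f}%")
print(f"Gaming lead: {average_lead(gaming, exclude={'Rise of the Tomb Raider'}):.1f}%")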

 

That's because until now most Intel CPUs have more or less had the same iGPUs since Skylake (HD 530/UHD 630 or weaker), while AMD has been introducing and optimizing higher-performing alternatives (Vega 6/7/8/9/10/11) for the past 2 years.

 

Integrated graphics from both brands could spit out 30%+ more frames if factory-set or overclocked to boost at 1550 MHz, which could have been especially beneficial for Intel's offerings in recent years.

 

Also, it seems that Zen+ and Zen 2 were less power-efficient in most laptops than Skylake derivatives, partly because of more power-hungry iGPUs. Not to mention that for AMD, "TDP" is how much an SoC consumes near peak performance, while for Intel, "TDP" is a general power-consumption target that they design products for.


8 hours ago, Results45 said:

 

What I mean is that Intel has to put 50%+ more powerful versions of their latest laptop iGPUs in desktop CPUs in order to be competitive against AMD's desktop APUs, right?

 

So if an Ice Lake Core i7-1065G7 contains a 15 W 64EU iGPU and a Tiger Lake "Core i7-11875H" contains a 25 W 96EU iGPU (both boosting at 1100 MHz), then those same integrated graphics would need to consume more power (25 W & 40 W respectively) and boost higher (1500-1750 MHz) when paired with Intel's Rocket Lake & Alder Lake 10nm desktop processors arriving early next year.

 

Again, this is assuming that Intel will actually try to be competitive against the Ryzen 4000-series APUs that AMD will be shipping by the end of this year.
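
Those wattage guesses can be sanity-checked with a back-of-the-envelope Python sketch. Dynamic power scales roughly as P ~ V^2 * f, so holding voltage fixed gives a linear lower bound, and letting voltage rise in step with frequency gives a cubic upper bound; all numbers below are the hypotheticals from the post above, not Intel specifications:

# Crude iGPU power-scaling bounds: dynamic power P ~ C * V^2 * f.
# Fixed voltage -> P scales ~linearly with frequency (lower bound).
# Voltage rising with frequency -> P scales ~f^3 (upper bound).
# The 15 W / 25 W at 1100 MHz figures are the post's hypotheticals.

def scaled_power(base_watts, base_mhz, target_mhz, exponent):
    return base_watts * (target_mhz / base_mhz) ** exponent

for base_w in (15, 25):
    low = scaled_power(base_w, 1100, 1500, 1)
    high = scaled_power(base_w, 1100, 1500, 3)
    print(f"{base_w} W iGPU pushed to 1500 MHz: roughly {low:.0f}-{high:.0f} W")

On that crude model, the guessed 25 W and 40 W figures both land inside the plausible ranges (about 20-38 W and 34-63 W respectively).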

 

 

Thank you, that made way more sense. First, remember the 96EU dGPU is running at nearly a 50% frequency uptick over the iGPU in the laptops. It also has to have its own memory controller and memory. And lastly, laptops get the silicon that does well in low-power testing when they bin out the chips, so the dGPU may have higher consumption due to lower average silicon quality. All told, I'd bet the iGPU is at most 10 W, and may be under 5 W.

 

Also, I doubt Intel will even try to compete with the 4000-series APUs. Intel and AMD have had very different approaches there for quite a while, with Intel putting weak but present iGPUs on every CPU, while AMD only puts iGPUs on specific models but goes for higher-capability stuff.

 

@LAwLz I'm not saying you shouldn't ask for a source or not be sceptical. But in a situation like this, where you're dealing with obscure subject matter, it's not remotely unlikely or unreasonable that someone could read something years ago and have the content stick in their mind without remembering the source.


17 minutes ago, Results45 said:

Integrated graphics from both brands could spit out 30%+ more frames if factory-set or overclocked to boost at 1550 MHz, which could have been especially beneficial for Intel's offerings in recent years.

I guess every CPU, GPU, and stick of RAM needs to be 100% overclocked to its limit out of the factory then.

 

The existence of overclocking does not mean that the manufacturer is deliberately holding things back.

 

You don't use an iGPU for gaming, much in the same way you don't take a motorcycle to Home Depot to pick up two-by-fours. While you COULD cobble the wood home on a motorcycle, they are not designed to be used for such.



28 minutes ago, RejZoR said:

-snip-

Can you for one second just pause and admit that you were either using personal experiences from several years ago, or just making stuff up earlier in the thread?

We have gone from this post:

23 hours ago, RejZoR said:

Intel's drivers USED to be horrendously bad. Not in terms of stability issues and crashes, but just straight-out garbage drivers. Poor standards support, missing features, unsupported features, glitches in games, or games that plain refused to run because the drivers were so crap due to all of the above or parts of the above. Not that anyone ever seriously gamed on Intel graphics, but even the stuff that could theoretically run didn't because of problems. It's only in the last few years that they've put even some effort into it.

 

The reason I'd hesitate to go with their graphics card is the control, or lack of it, in the control panel. Like "Anisotropic filter": ON or OFF. No info on what level of AF at all. You just turn it on and hope it does something at some unknown level. Antialiasing? I mean, I guess? Post-process AA like FXAA or SMAA? What even is this? These are often so low-impact they should be available on any graphics card, even integrated junk. It just isn't there. And the whole Intel GPU control panel is a weird arrangement of ancient-looking components. Can't decide which is the worse offender, NVIDIA's archaic, clunky Control Panel or Intel's. They both stink though, that's what I'm sure about.

 

To "I don't like how the GUI looks".

 

Here is a list of claims you made:

  • Poor standards support - This was debunked.
  • Missing Features - Not sure which ones you were referring to, and you didn't really elaborate.
  • Unsupported Features - Again, not sure what that means, and you didn't elaborate.
  • Games that should be able to run on the iGPU didn't - You haven't posted any proof of this.
  • Anisotropic filtering only having on/off and no granularity - This was proven false.
  • No support for SMAA - This was proven false.
  • You don't like how the GUI looks - The only thing that has been proven to be true out of all your claims.

 

 

  

40 minutes ago, RejZoR said:

If history has taught me anything, it's that scaling is never linear. Just because you can make a shitty GPU look good compared to another shitty GPU doesn't mean you'll do great against the high end. As evidenced by AMD's struggles.

I have never disagreed or said anything to the contrary.

The only thing I have done in this thread so far is question and research the claim that Intel's GPU drivers are bad for gaming. So far, I have found 0 evidence to support those claims. I have also seen several other claims made about Intel's GPU drivers, and those have also been found to be false. So I would like people to either stop talking rubbish about things they don't know anything about so that this myth can die like it should, or at the very least apologize for the time they have made me waste.

 

Making a bullshit, incorrect claim takes like 2 minutes. Researching to verify whether a claim is bullshit (which has happened several times in this thread) takes like 20 minutes.

 

  

45 minutes ago, RejZoR said:

And if it was actually good I wouldn't even complain, but it's absolute trash, and I can't believe a multi-million-dollar company like NVIDIA can't redesign it to be less of a turd. Applying shit takes seconds, menus twitch and glitch as you're selecting things and applying them, it takes ages to fire up, profiles (the list of games) take forever to load even on an SSD, it's clumsy to navigate and use, and it just feels terrible. And that's exactly how Intel's feels as well.

Looks and works fine to me. But to each their own.


10 minutes ago, LAwLz said:

The only thing I have done in this thread so far is question and research the claim that Intel's GPU drivers are bad for gaming.

I think this is the point a lot of people are confusing: the drivers aren't bad for gaming, the iGPUs (in general) are bad for gaming. The AMD Vega APUs are better than Intel iGPUs, but they are still shit overall.



43 minutes ago, Results45 said:

That's because until now most Intel CPUs have more or less had the same iGPUs since Skylake (HD 530/UHD 630 or weaker), while AMD has been introducing and optimizing higher-performing alternatives (Vega 6/7/8/9/10/11) for the past 2 years.

 

Integrated graphics from both brands could spit out 30%+ more frames if factory-set or overclocked to boost at 1550 MHz, which could have been especially beneficial for Intel's offerings in recent years.

 

Also, it seems that Zen+ and Zen 2 were less power-efficient in most laptops than Skylake derivatives, partly because of more power-hungry iGPUs. Not to mention that for AMD, "TDP" is how much an SoC consumes near peak performance, while for Intel, "TDP" is a general power-consumption target that they design products for.

I am not interested in discussing the hardware side of things. I just want to talk about the software (driver) side of things.

I'm not arguing that Intel's hardware is good or bad. What I am arguing against and questioning is the people who keep parroting "Intel's drivers are bad for gaming!" with no evidence.

 

 

24 minutes ago, CarlBar said:

 

@LAwLz I'm not saying you shouldn't ask for a source or not be sceptical. But in a situation like this, where you're dealing with obscure subject matter, it's not remotely unlikely or unreasonable that someone could read something years ago and have the content stick in their mind without remembering the source.

 

I totally get that, but if you are making arguments and claims about some product today based on something you read 7 years ago, then maybe you should stop and ask "is this information outdated?" before going on and on about it.

 

Imagine if I started making claims about Ryzen processors based on benchmarks of the AMD FX-9590. Don't you think I would sound stupid if I went "AMD's processors are shit because their 8-core processor doesn't actually have 8 full cores, here is an article about Bulldozer that explains it"?

That claim would be just as silly as going "well, these benchmarks from 7 years ago show that Intel's drivers might not be optimized for games, therefore their drivers today aren't optimized for games".

 

 

 

  

2 minutes ago, Arika S said:

I think this is the point a lot of people are confusing: the drivers aren't bad for gaming, the iGPUs (in general) are bad for gaming. The AMD Vega APUs are better than Intel iGPUs, but they are still shit overall.

That's what I think too. I wouldn't have any problem with people saying "Intel's iGPUs are shit for gaming". But I do have a problem with people saying "Intel's GPU drivers are shit for gaming", especially when they try to apply that logic to whether Intel's future discrete graphics card will be good or bad.

It's easy to prove that Intel's iGPUs are shit for gaming. Just post any benchmark. But the "because of their drivers" portion is just flat out pulled out of someone's ass from what I can tell.


4 minutes ago, LAwLz said:

I have never disagreed or said anything to the contrary.

The only thing I have done in this thread so far is question and research the claim that Intel's GPU drivers are bad for gaming. So far, I have found 0 evidence to support those claims. I have also seen several other claims made about Intel's GPU drivers, and those have also been found to be false.

Well, Intel drivers were MUCH worse back in the days of Windows 7...

Maybe that was only in my case :D...

Windows Aero issues, blue screens... but the drivers were much more polished around 2014...

People tend to blame the drivers, but Intel iGPUs were never intended for gaming...

They are intended for media consumption and basic use...

3 minutes ago, LAwLz said:

I am not interested in discussing the hardware side of things. I just want to talk about the software (driver) side of things.

I'm not arguing that Intel's hardware is good or bad. What I am arguing against and questioning is the people who keep parroting "Intel's drivers are bad for gaming!" with no evidence.

 

TBH, Intel drivers are MUCH better than they used to be...

even for gaming.



41 minutes ago, CarlBar said:

 

@LAwLz I'm not saying you shouldn't ask for a source or not be sceptical. But in a situation like this, where you're dealing with obscure subject matter, it's not remotely unlikely or unreasonable that someone could read something years ago and have the content stick in their mind without remembering the source.

 

I think if someone is recalling from memory an article they read several years ago but can't find, then they should make that known. Because A) an old article means old data, which is no longer relevant to a discussion about current software, and B) human memory is pretty shit, so that sort of caveat is more important than making big, absolute claims that can't be backed up with any relevant and current information.

 

 



48 minutes ago, Arika S said:

while you COULD cobble the wood home on a motorcycle

When you do this, could you post a photo? :)


Just now, leadeater said:

When you do this, could you post a photo? :)

Next time I go to Vietnam, I'll get some pointers from the locals.

 

[Embedded photos: motorbikes in Vietnam carrying mind-boggling cargo loads]



2 hours ago, LAwLz said:

Can you for one second just pause and admit that you were either using personal experiences from several years ago, or just making stuff up earlier in the thread?

We have gone from this post:

 

To "I don't like how the GUI looks".

 

Here is a list of claims you made:

  • Poor standards support - This was debunked.
  • Missing Features - Not sure which ones you were referring to, and you didn't really elaborate.
  • Unsupported Features - Again, not sure what that means, and you didn't elaborate.
  • Games that should be able to run on the iGPU didn't - You haven't posted any proof of this.
  • Anisotropic filtering only having on/off and no granularity - This was proven false.
  • No support for SMAA - This was proven false.
  • You don't like how the GUI looks - The only thing that has been proven to be true out of all your claims.

 

 

  

I have never disagreed or said anything to the contrary.

The only thing I have done in this thread so far is question and research the claim that Intel's GPU drivers are bad for gaming. So far, I have found 0 evidence to support those claims. I have also seen several other claims made about Intel's GPU drivers, and those have also been found to be false. So I would like people to either stop talking rubbish about things they don't know anything about so that this myth can die like it should, or at the very least apologize for the time they have made me waste.

 

Making a bullshit, incorrect claim takes like 2 minutes. Researching to verify whether a claim is bullshit (which has happened several times in this thread) takes like 20 minutes.

 

  

Looks and works fine to me. But to each their own.

Basically sums up your competence to discuss the matter of drivers and control panel quality... Now stop nitpicking my words and trying to find the "gotcha" moments. Jesus. Also, supporting a standard on paper means exactly dick, which has been proven throughout history. Like S3 supporting T&L in hardware while it never actually worked correctly in actual games. Or NVIDIA struggling with Pixel Shader 2.0 under Direct3D 9 because they picked the wrong precision for shader compute; it looked great on paper and was there to tick the checkbox, but it never really worked well in games and carried massive performance penalties. Or how AMD was struggling with tessellation during the HD 2000 series, iirc? Or how NVIDIA had higher D3D12 specs on paper, but AMD offered more actually functioning ones in games?

And then there is the issue of supporting a feature that's entirely unfeasible in actual games because the GPU is just so shit it can't realistically use it even if it supports it to perfection. Which is still true. We'll be able to judge whether supported standards are actually functional when they release an Xe card actually capable of running games the way AMD's and NVIDIA's mid and high end can. You know, so you can actually run a game and play it, not just barely launch it. But hey, go on and nitpick my words and throw any kind of realistic expectations or common sense out the window. Who needs that shit anyway, right?


2 hours ago, RejZoR said:

Basically sums up your competence to discuss the matter of drivers and control panel quality... Now stop nitpicking my words and trying to find the "gotcha" moments. Jesus. Also, supporting a standard on paper means exactly dick, which has been proven throughout history. Like S3 supporting T&L in hardware while it never actually worked correctly in actual games. Or NVIDIA struggling with Pixel Shader 2.0 under Direct3D 9 because they picked the wrong precision for shader compute; it looked great on paper and was there to tick the checkbox, but it never really worked well in games and carried massive performance penalties. Or how AMD was struggling with tessellation during the HD 2000 series, iirc? Or how NVIDIA had higher D3D12 specs on paper, but AMD offered more actually functioning ones in games? And then there is the issue of supporting a feature that's entirely unfeasible in actual games because the GPU is just so shit it can't realistically use it even if it supports it to perfection. Which is still true. We'll be able to judge whether supported standards are actually functional when they release an Xe card actually capable of running games the way AMD's and NVIDIA's mid and high end can. You know, so you can actually run a game and play it, not just barely launch it. But hey, go on and nitpick my words and throw any kind of realistic expectations or common sense out the window. Who needs that shit anyway, right?

I don't think I have nitpicked your comments at all. It's just that you have said flat-out wrong things and made them your primary arguments. Nitpicking would be "you said AMD struggled with tessellation in the HD 2000 series, but they struggled with it in the 4000 series as well". You know, irrelevant stuff that doesn't have anything to do with the point or argument you're making.

 

But when you say "Intel's drivers are bad because X, Y and Z" and it turns out all three of X, Y and Z are wrong, then it's not nitpicking. It's not nitpicking to point out that you're wrong about stuff like "Intel only has an ON/OFF switch for AF" when you're using that statement to validate a previous statement made (that Intel's GPU drivers are bad).

 

If you think supporting standards "means exactly dick", then why did you bring it up as an argument for why Intel's drivers are bad? You can't just go "Intel is bad because they don't support standards. Oh wait, they do? Well, in that case standards don't matter". It seems to me like you only cared about standards when you thought Intel didn't support them. But then, when you got proven wrong (Intel does in fact support standards), you tried to play it off as unimportant because you had already made up your mind that Intel = bad no matter what.

 

And are you seriously going to try and discredit my knowledge of the subject just because I think the Nvidia control panel looks fine and you don't? Sorry, but I care more about functionality and performance than how pretty something is. I don't think that lowers my competence regarding the subject.

I'm glad that the discussion seems to have moved from "Intel's drivers are bad" to "I don't like Intel's drivers because I don't like how the control panel looks" though. Hopefully the whole "Intel's drivers are bad" myth will stop being repeated now that I think we have quite clearly demonstrated and debunked the misconceptions people have about them.

 

Side note: I have zero experience with DirectX or OpenGL development, but from what I have heard (quite old and unreliable data, mainly from IRC channels and from blogs/Twitter), in terms of supporting standards the list goes like this:

Nvidia are the best. They implement the standards accurately and very nearly completely.

 

Intel are the second best. Their hardware is lacking in quite a few places, but software-wise they are fantastic. Very few bugs, and the features that are supported in hardware are also implemented well and function as expected.

 

AMD are the worst of the big three. When using some functions, data gets corrupted; they delete stuff on their developer forum when bugs are reported (which might have been a mistake) or just don't respond to bug reports at all; things crash without even providing errors, so you have no idea why; and the hardware reports supporting features that don't actually work.

Stuff like that.

But like I said, this is old info that might no longer be correct. My point is just that anecdotally, I have heard nothing but praise of Intel's GPU drivers (from developers). Their hardware has been terrible, but their drivers haven't. From what I can tell this is still the case, and if we are going to give Intel shit for something related to GPUs, it should be their hardware. Shitting on their drivers when that hasn't been the problem won't do any good.


I'll phrase it differently. Intel has excellent standards support that rarely actually functions or is usable in any real-world scenario. Happy?


5 minutes ago, RejZoR said:

I'll phrase it differently. Intel has excellent standards support that rarely actually functions or is usable in any real-world scenario. Happy?

Not really happy, because I would like some sources to back that claim up.


33 minutes ago, LAwLz said:

Not really happy, because I would like some sources to back that claim up.

Not really a standards example, but Dolphin has had trouble with Vulkan on Intel (scroll down to the integrated section).

"The first thing we need to address about the results above is the lack of Vulkan. Despite improving Vulkan drivers on Windows, Dolphin's Vulkan backend still will not run on Intel HD's Windows drivers. Users wishing to use Vulkan on their Intel HD graphics chips have to use Linux and the Mesa drivers."


11 hours ago, Jurrunio said:

I made my claim with Iris Pro, which is mostly found in MacBooks; unsurprisingly, few used them for gaming, so data presented alongside Windows equivalents is really scarce.

 

And your source proves my point, doesn't it? In synthetics the 1065G7 always beats the Zen and Zen+ APUs, showing its stronger GPU hardware, but in games this is not always the case.

I have a desktop with Iris Pro on a 4950HQ under a 120mm AIO. What do you all need me to test to settle this squabble?


2 hours ago, Bitter said:

I have a desktop with Iris Pro on a 4950HQ under a 120mm AIO. What do you all need me to test to settle this squabble?

 

Something something... claims about driver support hindering iGPU performance (debunked by @leadeater and others)... something something about comparable GT/GTX/Radeon cards always being faster in games... something something a lack of benchmarks showing whether driver support/optimization affects iGPU performance or if it's just a general lack of horsepower.

 

I would skim through the last 2 pages to understand the specifics of the debate.

 

I personally am curious to see whether overclocking the Iris Pro 5200 from 1300 MHz to 1550-1650 MHz would help it match or beat a GT 640/730/650M/740M, GTX 830M/920M/920MX, or MX110. Same goes for old mid-range Radeon graphics cards from 7+ years ago, APUs with R5/R7 integrated graphics, and Vega 3/6 APUs.
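
As a quick upper bound on what that overclock could do, here is a short Python sketch assuming frame rate scales at most linearly with core clock (memory bandwidth usually caps real gains well below this); the 30 FPS baseline is a hypothetical placeholder:

# Best-case Iris Pro 5200 overclock estimate: assumes FPS scales at most
# linearly with GPU clock. Real gains are usually lower because the iGPU
# is bandwidth-limited. The 30 FPS baseline is a made-up placeholder.

BASE_MHZ = 1300

def best_case_fps(base_fps, target_mhz):
    return base_fps * target_mhz / BASE_MHZ

for target_mhz in (1550, 1650):
    gain = (target_mhz / BASE_MHZ - 1) * 100
    print(f"{target_mhz} MHz: +{gain:.0f}% clock, so a 30 FPS baseline "
          f"becomes at most ~{best_case_fps(30, target_mhz):.0f} FPS")

Even with perfect scaling, that tops out at roughly a 19-27% uplift, so clocks alone are unlikely to leapfrog a whole GPU tier.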


2 hours ago, Results45 said:

 

Something something... claims about driver support hindering iGPU performance (debunked by @leadeater and others)... something something about comparable GT/GTX/Radeon cards always being faster in games... something something a lack of benchmarks showing whether driver support/optimization affects iGPU performance or if it's just a general lack of horsepower.

 

I would skim through the last 2 pages to understand the specifics of the debate.

 

I personally am curious to see whether overclocking the Iris Pro 5200 from 1300 MHz to 1550-1650 MHz would help it match or beat a GT 640/730/650M/740M, GTX 830M/920M/920MX, or MX110. Same goes for old mid-range Radeon graphics cards from 7+ years ago, APUs with R5/R7 integrated graphics, and Vega 3/6 APUs.

I have a few benchmarks installed; which one do you want?

 

I'm currently using a GTX 960 as the display out. It should be as simple as swapping the HDMI cable to the motherboard header and installing the Intel graphics drivers, right?


Good for Intel! I'd love to see how these compute per watt!


5 weeks later...

@Results45 I already started a discussion thread for that.

 

 



They really gotta work on the naming convention.


