
Low-end Intel Xe Graphics card specs & performance detailed in benchmarks

1 hour ago, LAwLz said:

No, you can't just say a very specific thing like "Intel GPUs are held back by drivers" and then, when asked to provide a source, go "well, I don't want to give examples without owning the hardware myself".

I was under the impression Intel GPU drivers were well known to be good, very stable, and quick to support new features and APIs. Sure, the hardware sucks, but I've never seen anyone say the drivers suck; in fact, hardly anyone talks about their drivers at all because of how irrelevant the hardware is.


8 minutes ago, leadeater said:

I was under the impression Intel GPU drivers were well known to be good, very stable, and quick to support new features and APIs. Sure, the hardware sucks, but I've never seen anyone say the drivers suck; in fact, hardly anyone talks about their drivers at all because of how irrelevant the hardware is.

I also have the impression of Intel's drivers being really good. But over the last couple of years this forum in particular has started parroting this "Intel's GPU drivers are bad" and I have never seen anyone actually post evidence that there are driver issues.

Like you said, the hardware has never been that good. I think a lot of people have seen poor performance (from lacking hardware) and that has somehow evolved into "Intel's drivers are bad" because a lot of people say it. When you hear like 20 different people say something you tend to believe it at face value. Doesn't help that it's hard to find good benchmarks and analysis of Intel's GPUs either.


19 minutes ago, LAwLz said:

Doesn't help that it's hard to find good benchmarks and analysis of Intel's GPUs either.

 

And then you expect people to be able to find, on demand, an article they probably haven't read in ages (because there's so little testing done on them), which may or may not contain the evidence as part of a larger overall article. Cute.


34 minutes ago, LAwLz said:

I also have the impression of Intel's drivers being really good. But over the last couple of years this forum in particular has started parroting this "Intel's GPU drivers are bad" and I have never seen anyone actually post evidence that there are driver issues.

Like you said, the hardware has never been that good. I think a lot of people have seen poor performance (from lacking hardware) and that has somehow evolved into "Intel's drivers are bad" because a lot of people say it. When you hear like 20 different people say something you tend to believe it at face value. Doesn't help that it's hard to find good benchmarks and analysis of Intel's GPUs either.

Intel's drivers USED to be horrendously bad. Not in terms of stability issues and crashes, but just straight-up garbage drivers: poor standards support, missing features, unsupported features, glitches in games, or games that plain refused to run because of all or some of the above. Not that anyone ever seriously gamed on Intel graphics, but even the stuff that could theoretically run didn't, because of problems. It's only in the last few years that they've put any real effort into it.

 

The reason I'd hesitate to go with their graphics card is the control, or lack of it, in the control panel. Like "Anisotropic filter": ON or OFF. No info on what level of AF at all. You just turn it on and hope it does something at some unknown level. Antialiasing? I mean, I guess? Post-process AA like FXAA or SMAA? What even is this? And these are often so low-impact they should be available on any graphics card, even integrated junk. It just isn't there. And the whole Intel GPU control panel is just a weird arrangement of ancient-looking components. Can't decide which is the worse offender, NVIDIA's archaic, clunky Control Panel or Intel's. They both stink though, that much I'm sure of.


23 minutes ago, CarlBar said:

And then you expect people to be able to find, on demand, an article they probably haven't read in ages (because there's so little testing done on them), which may or may not contain the evidence as part of a larger overall article. Cute.

Well I expect people that make comments about the current situation to back up their claims with up to date info.

That shouldn't be too much to ask, right? And if they don't have that info, why are they making the claims to begin with?

 

 

7 minutes ago, RejZoR said:

Intel's drivers USED to be horrendously bad. Not in terms of stability issues and crashes, but just straight-up garbage drivers: poor standards support, missing features, unsupported features, glitches in games, or games that plain refused to run because of all or some of the above. Not that anyone ever seriously gamed on Intel graphics, but even the stuff that could theoretically run didn't, because of problems. It's only in the last few years that they've put any real effort into it.

When you say "used to", how long ago are we talking?

Poor standards support? Intel has been second only to Nvidia in terms of standards support for as long as I can remember.

What missing features are you talking about specifically?

Not sure what you mean by "unsupported features" or how that's different from missing features. Can you elaborate?

This is the first time I am hearing about games that in theory should be able to run on Intel graphics, but didn't because of driver issues. Got any sources for this claim?

 

10 minutes ago, RejZoR said:

The reason I'd hesitate to go with their graphics card is the control, or lack of it, in the control panel. Like "Anisotropic filter": ON or OFF. No info on what level of AF at all.

Now you're just flat out wrong.

Here is a screenshot of the Intel graphics control panel.

[Screenshot: Intel graphics control panel, showing anisotropic filtering levels and SMAA options]

I don't know where you got the idea that anisotropic filtering was just an on/off switch, but that's straight up false. As you can see, you can select 2x, 4x, 8x, or 16x right there in the control panel.

 

 

15 minutes ago, RejZoR said:

Antialiasing? I mean, I guess? Post-process AA like FXAA or SMAA? What even is this?

As you can see in the screenshot above, you can use SMAA as well.

 

 

18 minutes ago, RejZoR said:

And these are often so low-impact they should be available on any graphics card, even integrated junk. It just isn't there.

But it is. I'm starting to think you haven't even looked at the graphics control panel before bashing it, because a large portion of what you're saying is straight up false.


I don't think I've ever personally experienced an issue with Intel graphics. Anecdotal, I know, but we are talking about 15 computers I manage for family and friends that at various times in their history ran solely on Intel graphics and were used for light gaming.

 

Seeing as I never had problems, I haven't gone looking to see if anyone else has, and in my scrolling through the troubleshooting and GPU forums I don't see Intel drivers pop up more than any other device driver does. In fact, if I'm honest, I think I see more AMD and Nvidia driver issues than I do Intel.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


19 minutes ago, LAwLz said:

Poor standards support? Intel has been second only to Nvidia in terms of standards support for as long as I can remember.

[Image attachment]

 

Intel's drivers have always supported every DirectX and OpenGL feature the hardware was actually capable of supporting, as soon as those features were available in those APIs. The problem is the hardware couldn't always support everything, because it's not designed to be the best gaming GPU. Even then, Intel iGPUs have never really lacked what AMD and Nvidia had at the time, just actual performance. If anyone has been slow it's Nvidia, but they are also ahead in other areas, so it's all meh anyway. Games don't fail to work because of a lack of these or a lower support level; anything that would cause that would mean failing to comply with that DirectX feature level, and the game wouldn't try to use it.

 

Weird performance results are almost always due to not having dedicated memory and poor memory bandwidth when the game requires it. To talk about optimization you really need to look at the same game, same version, across many different driver versions; if the performance never improves, either it can't (it's already as optimized as possible) or they just aren't interested in optimizing. I don't think Intel is the latter.

 

Intel iGPUs are just crap and slow, and it's really easy to try to run a game past the maximum shared VRAM allowed for that generation, which will obviously cause the game to crash or not run. This is why putting high-res displays on devices without the hardware to support them is stupid.
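The feature-level point above can be sketched in a few lines. This is an illustrative toy model in Python, not a real Direct3D call (the actual API is `D3D11CreateDevice` with an array of `D3D_FEATURE_LEVEL` values, and the level strings below are hypothetical): the game lists the levels it can use, best first, and gets back the highest one the driver and hardware actually support, so it never tries to use a capability the GPU can't honour.

```python
# Toy model of Direct3D feature-level negotiation (illustrative only).
def negotiate_feature_level(requested, supported):
    """Return the first (most preferred) level the hardware supports."""
    for level in requested:  # requested is ordered best-first
        if level in supported:
            return level
    return None  # nothing usable: the game falls back or refuses to start

# Hypothetical game and iGPU capabilities:
game_wants = ["11_1", "11_0", "10_1"]
igpu_supports = {"11_0", "10_1", "10_0"}
print(negotiate_feature_level(game_wants, igpu_supports))  # -> 11_0
```

So a missing capability shows up as the game running at a lower level (or declining to start), not as the driver claiming support it can't deliver.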


1 hour ago, mr moose said:

I don't think I've ever personally experienced an issue with Intel graphics. Anecdotal, I know, but we are talking about 15 computers I manage for family and friends that at various times in their history ran solely on Intel graphics and were used for light gaming.

 

Seeing as I never had problems, I haven't gone looking to see if anyone else has, and in my scrolling through the troubleshooting and GPU forums I don't see Intel drivers pop up more than any other device driver does. In fact, if I'm honest, I think I see more AMD and Nvidia driver issues than I do Intel.

I just tried to add a custom resolution to my laptop...can't do it with Intel control panel.  Literally says "exceeds bandwidth".  I don't give a fuuuuck...that's why I'm messing around in the "advanced" menu to begin with.  (I also don't care about the bandwidth because it's not going to an actual display, it's being remoted into where I can run at 4K 60Hz if I want)

 

Meanwhile Nvidia control panel on another box lets me do it no problem.

 

Irrelevant anyway... Intel graphics aren't for consumers, they're for the datacenter. Maybe DG2 will be a discrete graphics card for consumers, but that's like 18 months away.

Workstation:  14700nonk || Asus Z790 ProArt Creator || MSI Gaming Trio 4090 Shunt || Crucial Pro Overclocking 32GB @ 5600 || Corsair AX1600i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


3 minutes ago, AnonymousGuy said:

I just tried to add a custom resolution to my laptop...can't do it with Intel control panel.  Literally says "exceeds bandwidth".  I don't give a fuuuuck...that's why I'm messing around in the "advanced" menu to begin with.  (I also don't care about the bandwidth because it's not going to an actual display, it's being remoted into where I can run at 4K 60Hz if I want)

 

Meanwhile Nvidia control panel on another box lets me do it no problem.

Why does it say not enough bandwidth? Is it programmed to work only within set bandwidth limits due to some other factor of being in the processor?



Just now, mr moose said:

Why does it say not enough bandwidth?  Is it programed to work only within set bandwidth limits due to some other factor of being in the processor?

Not sure. Cursory googling says the 8550U only supports DisplayPort 1.2, so it's probably going by that... probably not many people are trying to hang 4K screens off their laptops. Fortunately there's a workaround via Custom Resolution Utility to force it outside the Intel control panel.
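For what it's worth, the bandwidth check the driver is doing can be approximated by hand. A rough sketch, using the standard DisplayPort payload rates and a crude ~12% blanking overhead in place of the exact CVT-R timing math:

```python
# Rough estimate of the link bandwidth a display mode needs.
def mode_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.12):
    # blanking=1.12 is a crude stand-in for CVT reduced-blanking timings
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

DP_1_2_GBPS = 17.28  # HBR2, 4 lanes, payload after 8b/10b coding
DP_1_3_GBPS = 25.92  # HBR3, 4 lanes, payload after 8b/10b coding

need = mode_gbps(3840, 2160, 60)
print(f"4K60 @ 8bpc needs ~{need:.1f} Gbps vs {DP_1_2_GBPS} Gbps on DP 1.2")
```

By this rough math, 4K60 at 8 bpc (~13.4 Gbps) actually fits within HBR2, so the control panel may be checking the laptop's real wired link (eDP lane count and rate) rather than the DP 1.2 spec ceiling — that part is a guess.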



7 minutes ago, AnonymousGuy said:

Not sure. Cursory googling says the 8550U only supports DisplayPort 1.2, so it's probably going by that... probably not many people are trying to hang 4K screens off their laptops. Fortunately there's a workaround via Custom Resolution Utility to force it outside the Intel control panel.

What are you using to remote in? RDP, for example, lets you set resolutions, and it's a null software driver on the host computer, so you'd never hit that problem. If you're having to set resolutions in the Intel control panel, then making sure you don't nuke your display output is a good thing, not a bad one.

 

Also, is there a display connected? It might only check when there is a display; good luck if it's a laptop, though.

 

Edit again:

No you were talking about remote, I give up, going to sleep.


Just now, leadeater said:

What are you using to remote in? RDP, for example, lets you set resolutions, and it's a null software driver on the host computer, so you'd never hit that problem. If you're having to set resolutions in the Intel control panel, then making sure you don't nuke your display output is a good thing, not a bad one.

 

Also, is there a display connected? It might only check when there is a display; good luck if it's a laptop, though.

RDP isn't really optimized for anything above 1080p; it thrashes the CPU above that, leaving no resources for actual programs. It got doubly worse since my laptop has two screens docked to it. Allegedly Windows Server has GPU acceleration support but not Enterprise. Splashtop is what I'm using, since it includes GPU acceleration and doesn't squash text with compression (I had problems with AnyDesk doing this).

 

I had to plug in some dummy DisplayPort dongles to be able to add custom resolutions. Windows absolutely shits itself and is incapable of doing displays unless they're physically attached or via RDP.



5 minutes ago, AnonymousGuy said:

Allegedly Windows Server has GPU acceleration support but not Enterprise

RemoteFX is supported on Enterprise.

 

7 minutes ago, AnonymousGuy said:

thrashes the CPU above that leaving no resources for actual programs

Personally not really had problems like that but I only use dual 2560x1600.


Good luck turning on MSAA in any old game there, and old games are the only sort really meant to be run on Intel graphics... It literally shows what I was complaining about, even if it has changed a bit in the last year or two...


12 minutes ago, RejZoR said:

Good luck turning on MSAA in any old game there, and old games are the only sort really meant to be run on Intel graphics... It literally shows what I was complaining about, even if it has changed a bit in the last year or two...

What do you mean?

Are you saying that Intel's GPUs would not be able to handle MSAA? That's not a driver issue, that's a hardware issue.


1 hour ago, LAwLz said:

What do you mean?

Are you saying that Intel's GPUs would not be able to handle MSAA? That's not a driver issue, that's a hardware issue.

No, that's a driver and control panel issue. Older games, like most on GOG, don't have native MSAA support, and you had to enable it in the graphics card's control panel. I guess having post-process AA and actually controllable AF is a start. I hope they'll improve this when they launch consumer Xe graphics cards.


2 hours ago, RejZoR said:

No, that's a driver and control panel issue. Older games, like most on GOG, don't have native MSAA support, and you had to enable it in the graphics card's control panel. I guess having post-process AA and actually controllable AF is a start. I hope they'll improve this when they launch consumer Xe graphics cards.

I get the feeling that you just throw out a bunch of stuff you think Intel graphics can't do, and then hope that one of your statements is true. Look at the post above and count how many statements you made and how many turned out to be false. I don't understand why you would say so many incorrect things, with no shame and no apology when it turns out you were wrong.

 

But it seems like you have been right about one thing, you can't force MSAA on with Intel's graphics control panel.

That's far from what was originally discussed though. The original subject was whether or not Intel's drivers are bad for performance in games.


44 minutes ago, LAwLz said:

I get the feeling that you just throw out a bunch of stuff you think Intel graphics can't do, and then hope that one of your statements is true. Look at the post above and count how many statements you made and how many turned out to be false. I don't understand why you would say so many incorrect things, with no shame and no apology when it turns out you were wrong.

 

But it seems like you have been right about one thing, you can't force MSAA on with Intel's graphics control panel.

That's far from what was originally discussed though. The original subject was whether or not Intel's drivers are bad for performance in games.

Gee, sorry, coz the only Intel device I've owned in the last 2-3 years is a fucking Atom netbook, so I'm not totally up to date on what their stupid interface looks like. It's still basic and crap, and if I got this served on a GTX 2000 or RX 7000 competitor I'd be greatly disappointed.


18 hours ago, CarlBar said:

 

I'm trying to parse what you just typed and I can't come up with anything that makes sense. You seem to switch from talking about integrated to discrete graphics halfway through.

What I mean is that Intel has to put 50%+ more powerful versions of their latest laptop iGPUs in desktop CPUs in order to be competitive against AMD's desktop APUs, right?

 

So if an Ice Lake Core i7-1065G7 contains a 15W 64EU iGPU and a Tiger Lake "Core i7-11875H" contains a 25W 96EU iGPU (both boosting at 1100 MHz), then those same integrated graphics would need to consume more power (25W and 40W respectively) and boost higher (1500-1750 MHz) when paired with Intel's Rocket Lake and Alder Lake 10nm desktop processors arriving early next year.

 

Again this is assuming that Intel will actually try to be competitive against Ryzen 4000-series APUs that AMD will be shipping by the end of this year.
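The scaling assumption above can be made explicit. This is a back-of-the-envelope sketch treating iGPU throughput as proportional to EU count times clock — a crude model that ignores the memory-bandwidth limits discussed elsewhere in the thread, and the 96EU desktop clock is the speculation from the post, not a confirmed spec:

```python
# Crude throughput model: EUs x clock (ignores bandwidth, IPC, thermals).
def rel_throughput(eus, clock_mhz):
    return eus * clock_mhz

ice_lake = rel_throughput(64, 1100)    # i7-1065G7 iGPU, per the post
tiger_lake = rel_throughput(96, 1100)  # 96EU part at the same boost clock
desktop = rel_throughput(96, 1650)     # speculative desktop boost clock

print(f"96EU vs 64EU at equal clocks: {tiger_lake / ice_lake:.2f}x")  # 1.50x
print(f"96EU at 1650 MHz vs baseline: {desktop / ice_lake:.2f}x")     # 2.25x
```

On this model, the EU bump alone is the "50%+ more powerful" figure, and the clock bump roughly multiplies on top of it.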

 

17 hours ago, LAwLz said:

When you hear like 20 different people say something you tend to believe it at face value. Doesn't help that it's hard to find good benchmarks and analysis of Intel's GPUs either.

 

Have you tried looking up synthetic, professional, and gaming benchmark results for each Intel iGPU since Haswell?

 

Cuz both Anandtech (via laptop reviews) and Notebookcheck (in their database) have results on quite a few of them.


9 hours ago, LAwLz said:

 I don't understand why you would say so many incorrect things, with no shame and no apology when it turns out you were wrong.

 

After many discussions, the issue seems to be that he takes a small experience or understanding he has about something and extrapolates it as if it affects everything, across the board, all the time. We had the same issue when we tried to explain the evolution of technology and how most great new technologies have very slow uptake in the consumer sphere, but he was adamant that wasn't true because it wasn't his personal experience.

 

 



5 hours ago, Results45 said:

Have you tried looking up synthetic, professional, and gaming benchmark results for each Intel iGPU since Haswell?

 

Cuz both Anandtech (via laptop reviews) and Notebookcheck (in their database) have results on quite a few of them.

I have more or less done that, yes. The problem, I think, is that reviews do not contain enough information to actually do a comparison with.

The best and most up to date comparison I've found is this review from Anandtech.

Here we can compare the i7-1065G7 vs AMD's 4700U.

 

Rise of the Tomb Raider is clearly an outlier (33.70% difference) but let us include that anyway.

Average AMD lead for synthetics: 13.34%

Average AMD lead for gaming: 17.86%

Average AMD lead for gaming (excluding Rise of Tomb Raider): 13.9%

 

So based on Anandtech's latest review, if we compare AMD vs Intel for synthetics, then gaming, and then compare those two results, we come to the conclusion that...?

That Intel's graphics are somewhere between 0.6% and 4.5% slower in gaming than we should expect, assuming their GPU drivers were on par with AMD's? If those are the actual results, then I seriously think people have been overstating the whole "Intel's GPU drivers are so bad!" thing, if they are only a couple of percent behind AMD's drivers in terms of gaming optimizations.

 

@Jurrunio is this what you wanted me to do? Sorry but I came to the exact opposite conclusion than you did when reading benchmarks and comparing synthetics vs real world benchmarks. Not sure if that's because you use 7 year old benchmarks to draw your conclusion while I use 7 days old benchmarks, or if we use a different methodology.
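The arithmetic behind those percentages, using the averages quoted above (the underlying per-game numbers are from the AnandTech review referenced in the post):

```python
# AMD's average lead over the i7-1065G7, in percent, per the post above.
synthetic_lead = 13.34      # synthetic benchmarks
gaming_lead_all = 17.86     # games, all titles
gaming_lead_trimmed = 13.9  # games, excluding the Rise of the Tomb Raider outlier

# If drivers were equally well optimized, the gaming gap should match the
# synthetic (raw hardware) gap; the leftover is the most that "bad drivers"
# could plausibly be costing Intel in these titles.
print(round(gaming_lead_all - synthetic_lead, 2))      # 4.52 points
print(round(gaming_lead_trimmed - synthetic_lead, 2))  # 0.56 points
```

That difference between the synthetic gap and the gaming gap is the "0.6% to 4.5%" range in the conclusion.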


14 minutes ago, LAwLz said:

@Jurrunio is this what you wanted me to do? Sorry but I came to the exact opposite conclusion than you did when reading benchmarks and comparing synthetics vs real world benchmarks. Not sure if that's because you use 7 year old benchmarks to draw your conclusion while I use 7 days old benchmarks, or if we use a different methodology.

I made my claim based on Iris Pro, which is mostly found in MacBooks; unsurprisingly, few used them for gaming, so data presented alongside Windows equivalents is really scarce.

 

And doesn't your source prove my point? In synthetics the 1065G7 always beat the Zen and Zen+ APUs, showing its stronger GPU hardware, but in games this is not always the case.

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


1 minute ago, Jurrunio said:

I made my claim based on Iris Pro, which is mostly found in MacBooks; unsurprisingly, few used them for gaming, so data presented alongside Windows equivalents is really scarce.

 

And doesn't your source prove my point? In synthetics the 1065G7 always beat the Zen and Zen+ APUs, showing its stronger GPU hardware, but in games this is not always the case.

No, you're reading that incorrectly. Zen APUs beat the Intel GPU in synthetics as well as in gaming.

The average lead of the AMD GPUs in synthetic benchmarks is 13.34%.

The average lead for AMD GPUs in gaming benchmarks is 13.9% (excluding Rise of the Tomb Raider which is an outlier).

 

So the results in synthetic benchmarks are pretty much the same as they are in gaming benchmarks.


58 minutes ago, LAwLz said:

I have more or less done that, yes. The problem, I think, is that reviews do not contain enough information to actually do a comparison with.

The best and most up to date comparison I've found is this review from Anandtech.

Here we can compare the i7-1065G7 vs AMD's 4700U.

 

Rise of the Tomb Raider is clearly an outlier (33.70% difference) but let us include that anyway.

Average AMD lead for synthetics: 13.34%

Average AMD lead for gaming: 17.86%

Average AMD lead for gaming (excluding Rise of Tomb Raider): 13.9%

 

So based on Anandtech's latest review, if we compare AMD vs Intel for synthetics, then gaming, and then compare those two results, we come to the conclusion that...?

That Intel's graphics are somewhere between 0.6% and 4.5% slower in gaming than we should expect, assuming their GPU drivers were on par with AMD's? If those are the actual results, then I seriously think people have been overstating the whole "Intel's GPU drivers are so bad!" thing, if they are only a couple of percent behind AMD's drivers in terms of gaming optimizations.

 

@Jurrunio is this what you wanted me to do? Sorry but I came to the exact opposite conclusion than you did when reading benchmarks and comparing synthetics vs real world benchmarks. Not sure if that's because you use 7 year old benchmarks to draw your conclusion while I use 7 days old benchmarks, or if we use a different methodology.

If history has taught me anything, it's that scaling is never linear. Just because you can make a shitty GPU look good compared to another shitty GPU doesn't mean you'll do great against the high end, as evident by AMD's struggles: they made great mid-range cards but couldn't pull off a good high-end one until Navi-ish. Also, if you look at my posts from the past, you can see how relentlessly I piss on NVIDIA's garbage, slow, twitchy, ugly, outdated-looking Control Panel. NVIDIA has the best GPUs in the world currently, but their control panel is an absolute joke. It literally hasn't changed since, what was it, 2004 or 2006? And if it were actually good I wouldn't even complain, but it's absolute trash, and I can't believe a multi-million company like NVIDIA can't redesign it to be less of a turd. Applying anything takes seconds, menus twitch and glitch as you select things and apply them, it takes ages to fire up, profiles (the list of games) take forever to load even on an SSD, it's clumsy to navigate, and it just feels terrible. And that's exactly how Intel's feels as well.


So when pressed for specifics and a case where said problems can be observed, we retreat to excusing the lack of evidence by introducing a new metric (comparing onboard to a high-end GPU, or rather the lack of such a comparison, because we all know that would be dumb and pointless), as if that observation somehow magically explains the lack of substance in the previous claims.

 

Redirecting the argument toward a product that has ZERO bearing on the issue raised (Intel having crappy drivers) does not excuse the earlier claims and the lack of evidence to support them.

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

