
Intel Arc A750 Trades Blows with GeForce RTX 3060 in 50 Games in DirectX 12 and Vulkan APIs

Summary

Intel's Tom Petersen and Ryan Shrout shared more performance numbers for the much-anticipated Arc A750 in a video released late yesterday, August 10, 2022. The numbers come from as many as 50 game benchmarks using the DirectX 12 and Vulkan APIs, comparing the A750's performance at 1080p and 1440p against that of the NVIDIA GeForce RTX 3060.

 

[Benchmark slides from Intel's video]

Quotes

Quote

According to the combined findings, the Arc A750 is barely faster than the GeForce RTX 3060 by up to 3% at 1080p and up to 5% at 1440p in 43 DirectX 12 titles. In games that also used the Vulkan API, the Arc A750 outperformed the GeForce RTX 3060. Intel measured a 4% performance difference at 1080p and a 5% performance delta at 1440p in favor of the Arc A750.

 

All testing was done without ray tracing, and performance enhancements such as XeSS or DLSS weren't used. The small set of 6 Vulkan API titles shows a more consistent performance lead for the A750 over the RTX 3060, whereas the DirectX 12 titles see the two trade blows, with results varying among game engines. In "Dolmen," for example, the RTX 3060 scores 347 FPS compared to the Arc's 263. In "Resident Evil VIII," the Arc scores 160 FPS compared to 133 FPS for the GeForce. Such variations among the titles pull up the average in favor of the Intel card.

 

Some of the benchmarks shared were taken from built-in game benchmarks, such as those in Dirt 5 and Hitman 3. The rest were run using repeatable scenarios, though Intel admits "other testers might come up with different results." All measurements for both the Intel Arc and NVIDIA RTX cards were run on a per-game basis by the same Intel labs.

 

Intel hasn't officially revealed the specifications for the Arc A750. However, the graphics card will likely come with 24 Xe cores, 3,072 shaders, and 12GB of GDDR6 memory across a 192-bit interface. In addition, the boost clock speed probably hovers around the 2,300 MHz mark. Meanwhile, Intel used EVGA's GeForce RTX 3060 XC Gaming for the comparison, one of the faster custom GeForce RTX 3060 cards on the market, flaunting a 1,882 MHz boost clock.

 

Intel benchmarked the Arc A750 and GeForce RTX 3060 on identical systems powered by the Core i9-12900K, the current Alder Lake flagship. The testbeds also had 32GB (2x16GB) of Corsair Dominator Platinum RGB DDR5-5200 C38 memory downclocked to 4,800 MHz and an MP600 Pro XT 4TB SSD. Intel used Windows 11 and the Balanced power plan for the tests.

 

One thing to note is that the Arc A750 was on Intel's engineering driver, whereas the GeForce RTX 3060 used the GeForce 516.59 WHQL driver. Arc Alchemist's drivers are still a work in progress, so that could be holding the Arc A750 back in these gaming benchmarks. Intel has admitted that Arc underperforms in older APIs, so the company only used DirectX 12 and Vulkan titles.

 

My thoughts

For a "first" GPU this isn't too bad. If priced right, this could be a pretty decent contender for midrange builds. Problem is from many other outlets testing, the Arc A380 performed sub-par with its rivals in games based on the DirectX 11 API. Which still many games use today. Therefore testing in DX12 and Vulkan is really putting the Arc Alchemist GPU at an advantage. Going forward many games might start using these APIs, but if people want to play older games using these Intel video cards, they will have serious performance issues as shown in A380 reviews. Regardless, it seems that the A750 will see gaming performance about the same as NVIDIA's RTX 3060 in the games tested. Although, there are way more than 50 games out there, and how well the card performs in those games is yet to be seen. Nonetheless, it's still reassuring to see that the card is performing well with at least a nice chunk of games. I feel as though if these cards were released one year earlier, they really could have struck gold. It might be too little too late at this point. Despite everything, wait until independent reviewers get ahold of the card to confirm these results as manufacturers' results will put the card in the best light. Also, Intel still has a lot of time to get its driver act together. So, this might be an AMD FineWine Technology situation. 

 

Sources

https://videocardz.com/newz/intel-reveals-arc-a750-gaming-performance-in-48-dx12-vulkan-games-up-to-5-faster-than-rtx-3060

https://www.guru3d.com/news-story/intel-has-released-performance-figures-for-the-arc-a750-vulkan-and-directx-12-apis.html

https://www.tweaktown.com/news/87876/intel-arc-a750-benched-in-nearly-50-games-against-geforce-rtx-3060/index.html

https://www.techpowerup.com/297689/intel-arc-a750-trades-blows-with-geforce-rtx-3060-in-50-games

https://www.tomshardware.com/news/arc-a750-trades-blows-with-rtx-3060-across-nearly-50-games

https://hothardware.com/news/intel-graphics-division-lays-out-arc-a750-performance-expectations

https://game.intel.com/story/intel-arc-graphics-a750-benchmarks-dx12-vulkan/


If I were in the market today for a graphics card that competes with the 3060, this thing would still be a hard sell due to how terrible the drivers looked in GN's video (and this is coming from someone who lived with ATI/AMD graphics drivers for 12 years).

 

It's good to see competition, but I can only hope Intel fixes its drivers before it has to cut prices on these things due to low sales volumes and eventually kills them off a year or two down the line.

mY sYsTeM iS Not pErfoRmInG aS gOOd As I sAW oN yOuTuBe. WhA t IS a GoOd FaN CuRVe??!!? wHat aRe tEh GoOd OvERclok SeTTinGS FoR My CaRd??  HoW CaN I foRcE my GpU to uSe 1o0%? BuT WiLL i HaVE Bo0tllEnEcKs? RyZEN dOeS NoT peRfORm BetTer wItH HiGhER sPEED RaM!!dId i WiN teH SiLiCON LotTerrYyOu ShoUlD dEsHrOuD uR GPUmy SYstEm iS UNDerPerforMiNg iN WarzONEcan mY Pc Run WiNdOwS 11 ?woUld BaKInG MY GRaPHics card fIX it? MultimETeR TeSTiNG!! aMd'S GpU DrIvErS aRe as goOD aS NviDia's YOU SHoUlD oVERCloCk yOUR ramS To 5000C18

 


What worries me is people keep saying things like "for a first GPU", like Intel hasn't been making iGPUs for years already.

 

If they haven't learned to make decent drivers by this point, I'm rather worried if they ever will.

The rumour of there being a hardware flaw in the architecture that can't be fixed by drivers does at least give me a little hope that the next revision might perform more consistently/reliably, if they're willing to stick it out and don't completely kill their reputation with this generation.

That said, how they managed to let such a flaw remain in the design right into mass manufacturing is rather worrying in itself.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


18 minutes ago, Alex Atkin UK said:

What worries me is people keep saying things like "for a first GPU", like Intel hasn't been making iGPUs for years already.

 

If they haven't learned to make decent drivers by this point, I'm rather worried if they ever will.

I used to think like that until it became apparent how much per-game tuning goes into drivers for gaming. iGPUs might be able to run basic games, but getting consistently high performance appears to be proving a bigger challenge. I'd even go as far as to say this work might improve the relative performance of iGPUs in future. This doesn't seem to be a works/doesn't-work kind of scenario, but a how-well-it-works situation.

 

18 minutes ago, Alex Atkin UK said:

The rumour of there being a hardware flaw in the architecture that can't be fixed by drivers 

Is there anything even vaguely tangible about this? It smells of FUD and clickbaiting to me.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


46 minutes ago, porina said:

I used to think like that until it became apparent how much per-game tuning goes into drivers for gaming. iGPUs might be able to run basic games, but getting consistently high performance appears to be proving a bigger challenge. I'd even go as far as to say this work might improve the relative performance of iGPUs in future. This doesn't seem to be a works/doesn't-work kind of scenario, but a how-well-it-works situation.

 

Is there anything even vaguely tangible about this? It smells of FUD and clickbaiting to me.

I will admit, the comments about "maybe this is why it sucks without resizeable bar" did rub me the wrong way a bit.

 

The fact that they possibly did things differently to fully make use of that feature for better performance doesn't seem like a "bug" to me. It's quite possibly just allowing the hardware to punch above its weight.

 

As for the comments about it having huge API overheads, though, surely someone could profile that to confirm or deny it? Although that still sounds more like a driver issue than a hardware one.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


1 hour ago, Alex Atkin UK said:

 

 

If they haven't learned to make decent drivers by this point, I'm rather worried if they ever will.
 

They have never... ever... made decent drivers. Trust me, I've been fighting Intel drivers since the 4th-gen iGPU, and the following issues have "always been a problem" that Intel has never solved:

1. Installing the driver package locks up, rendering the installer unable to install or remove it.

2. The installed driver package gets overwritten by Windows Update.

3. Windows deciding to roll back the drivers in the middle of doing work (the only way I found to stop this is to force Windows Update to never update the GPU driver; a rough registry sketch follows below the list).

4. The Intel Driver and Support Assistant often doesn't show the latest driver, or sometimes treats older releases with larger build numbers as newer than more recent versions.

5. When I got the 11th-gen CPU, it also failed to install other CPU- and GPU-related drivers (the Intel® Gaussian Neural Accelerator 2.0 (GNA 2.0) driver does not get installed, and I had to explicitly dig it out of an Intel driver package for something else; despite that, I don't know of anything that actually uses it).

6. Every time the monitor attached to the iGPU is turned on or off, every damn program that can change audio outputs asks if you want to switch your output to the monitor. BTW, the NVIDIA drivers ALSO cause this. Please, Microsoft or GPU manufacturers: if the last monitor is powered off, the default action should be to keep sending sound to the device that was last receiving it, not to completely eject it as an audio endpoint. On top of this, the resolution of every program also gets messed around with, with windows ending up on the wrong monitor or sized wrong.
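
For reference, one way to force Windows Update to never touch drivers (my best guess at what's needed here) is the "Do not include drivers with Windows Updates" group policy; a rough Python sketch of setting its registry value is below. Note that it blocks every driver delivered through Windows Update, not only the GPU one, it needs an elevated prompt, and gpedit.msc can set the same policy on Pro SKUs.

# Sketch: set the "Do not include drivers with Windows Updates" policy so
# Windows Update stops replacing the GPU driver behind your back.
# Run from an elevated (administrator) Python session; note this excludes
# ALL drivers from Windows Update, not just the graphics driver.
import winreg

KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "ExcludeWUDriversInQualityUpdate", 0, winreg.REG_DWORD, 1)

print("Driver updates via Windows Update are now excluded (set the value to 0 to undo).")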

 

Like, overall, do Intel GPUs work? Yes. Do they work well? Only if you can get past the driver-install hurdle, which is unfortunately the problem we continue to see with the Intel Arc GPUs.

 

What Intel should be doing (and so should NVIDIA and AMD, because they do this too) is completely rip apart their fancy installers and break them into smaller parts: a basic part that will install even in "safe mode" (the DCH driver), and the full WDDM package, which supplants the DCH driver when installed. Those fancy installers are the primary cause of things getting half-installed. But the DCH + Windows Store driver split absolutely sucks for getting things to work on a gaming PC.


2 hours ago, Alex Atkin UK said:

I will admit, the comments about "maybe this is why it sucks without resizeable bar" did rub me the wrong way a bit.

 

The fact that they possibly did things differently to fully make use of that feature for better performance doesn't seem like a "bug" to me. It's quite possibly just allowing the hardware to punch above its weight.

 

As for the comments about it having huge API overheads, though, surely someone could profile that to confirm or deny it? Although that still sounds more like a driver issue than a hardware one.

Well, that's great and all, but if all you see is benchmarks based on ideal scenarios, then you would think that's the normal performance, when in reality if you don't have access to resizable bar then it has abysmal performance. If the GPU needs the correct API and Resizable BAR to be competitive, then yeah, that is an issue.


I recall Linus/Ryan mentioning that the price will be set in line with its counterpart, going by the card's DX11 performance. 🤔

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


How does it compare to the RX 6600/XT?

Specs: Motherboard: Asus X470-PLUS TUF gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200Mhz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctura Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition


7 hours ago, Alex Atkin UK said:

What worries me is people keep saying things like "for a first GPU", like Intel hasn't been making iGPUs for years already.

The bar was very low for Intel iGPUs until AMD started making decent APUs recently. Users only expected a video output out of Intel iGPUs.

I think the biggest problem is making a scalable driver that can translate as many raw teraflops from the execution units into frame rate as possible, and Intel never had to achieve that with UHD Graphics. It's fair to say this is the first GPU where Intel is actually trying to do that.

3 hours ago, Brooksie359 said:

if you don't have access to resizable bar then it has abysmal performance.

I think the Resizable BAR requirement is not a problem as long as it is advertised accurately, which seems to be the case. It means only newer low/mid-range systems are suitable for using an Intel dGPU. Intel is pushing more toward system integrators, which solves both the "blessed hardware" problem and the "driver install" problem.

7 hours ago, Alex Atkin UK said:

The rumour of there being a hardware flaw in the architecture that can't be fixed by drivers

It came from "Moore Law Is Dead", giving how many predictions he makes and how few come to pass, I wouldn't put much stock into it until it is announced. That said, it's plausible this first generation architecture has some fundamental hardware flaw that holds it back in some scenario. I remember when the i386 processor had a mistake in the ALU and a given multiplication would always give the wrong result. This things happen. Mistakes are fine, it's about how a company deals with them, with disclosure, and remediation.


7 hours ago, Kisai said:

What Intel should be doing (and so should NVIDIA and AMD, because they do this too) is completely rip apart their fancy installers and break them into smaller parts: a basic part that will install even in "safe mode" (the DCH driver), and the full WDDM package, which supplants the DCH driver when installed. Those fancy installers are the primary cause of things getting half-installed. But the DCH + Windows Store driver split absolutely sucks for getting things to work on a gaming PC.

I think the DCH thing is a push from Microsoft. At least that's how it feels to me. MS wants to distribute a minimal functioning driver via WU. Separating out the components allows those that want more to have more.

 

At least on the NVIDIA side, I don't have any problem with the NVIDIA installer. I just checked my main gaming system and laptop; GPU-Z reports both as having the DCH driver installed, but on top of that I have the full NVIDIA package, so I'm feature-complete. Does non-DCH still exist for leading-edge drivers?

 

1 hour ago, 05032-Mendicant-Bias said:

I think the Resizable BAR requirement is not a problem as long as it is advertised accurately, which seems to be the case. It means only newer low/mid-range systems are suitable for using an Intel dGPU. Intel is pushing more toward system integrators, which solves both the "blessed hardware" problem and the "driver install" problem.

ReBAR has been supported on 3 generations of Intel and 2+ generations of AMD CPUs. It does suck a bit, as I wanted to use my Coffee Lake system to try an Arc once they're available, assuming they're at a low enough price, but there's no ReBAR on Coffee Lake. I was pleasantly surprised to find Skylake-X has been given ReBAR support. So, Arc will not be the choice for upgrading an older system.

 

Edit: I just looked up my Z390 mobo and was surprised to see there is now a beta BIOS adding "Clever Access Memory". ASRock, why? Anyway, it may be possible to use that after all.

 

1 hour ago, 05032-Mendicant-Bias said:

I remember when the i386 processor had a mistake in the ALU and a given multiplication would always give the wrong result. These things happen. Mistakes are fine; it's about how a company deals with them, with disclosure and remediation.

I don't recall one in the 386 era, but there's the FDIV bug in the original Pentium, which led to a recall: https://en.wikipedia.org/wiki/Pentium_FDIV_bug

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


9 hours ago, porina said:

I think the DCH thing is a push from Microsoft. At least that's how it feels to me. MS wants to distribute a minimal functioning driver via WU. Separating out the components allows those that want more to have more.

 

At least on the NVIDIA side, I don't have any problem with the NVIDIA installer. I just checked my main gaming system and laptop; GPU-Z reports both as having the DCH driver installed, but on top of that I have the full NVIDIA package, so I'm feature-complete. Does non-DCH still exist for leading-edge drivers?

 

https://nvidia.custhelp.com/app/answers/detail/a_id/4777/~/nvidia-dch%2Fstandard-display-drivers-for-windows-10-faq

Quote

What are DCH Display Drivers from NVIDIA?

Microsoft DCH (Declarative Componentized Hardware supported apps) drivers refers to a new universal Windows 10 driver package. Click here for more information.

 

What is the difference between NVIDIA Standard and DCH Display Drivers?

Functionally, there is no difference between NVIDIA’s Standard and DCH drivers. While the base core component files remain the same, the way DCH drivers are packaged and installed differs from previous (Standard) drivers. When directly comparing the two driver types, the DCH driver package has a smaller size and a faster installation time than the Standard package.

Just to be clear, "DCH" is a packaging of the driver. Maybe I didn't make that clear. Basically the DCH driver is what Windows Update will install, and ONLY that driver. The rest of the control panels and other janky parts that you'd usually get in "the standard driver package" are then only available through the Windows Store.

 

The problem here, with GPU drivers, is that the separation is not clean. On corporate laptops, for example, installing only the DCH driver is not enough, but often the "store" part gets blocked. So you end up in a situation where you have no NVIDIA control panel, thus can't adjust the screen resolution or rotation, and might not even be able to get it out of clone mode when there are two monitors connected. This also happens to sound drivers and network drivers. Installing the driver is only enough to get the onboard speakers working; the headphone jacks will not operate, because the "switching functionality" is part of the full driver package, which is now trapped on the Windows Store and unable to be downloaded. Network drivers are often missing their control panels, though usually this only matters for WiFi parts, which then can't join networks seamlessly.

 

Like, I get it, DCH is supposed to make things easier, but it hasn't done that. Instead, it's thrown some rather large road cones at users who now can't figure out how to do X thing, because the panel that is needed to do X thing no longer gets installed. Don't get me wrong: I prefer installing the driver manually, but if that's how it's going to work, I would rather the "DCH driver" expose to the OS all the functionality that is in the standard driver if it's not going to install the control panel.

 


19 hours ago, BiG StroOnZ said:

DirectX 12 and Vulkan

Considering that 98% of my game library is DX11, DX10, and DX9 games, I would say nope to ARC.

For example: The Witcher 3: Wild Hunt, GTA V, Kingdom Come: Deliverance, the Batman: Arkham series, and many more...

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

Really interesting to see this, but using percentages instead of FPS is rather strange. It also looks like they fixed a problem that "Moore's Law Is Dead" talked about in his video.


15 hours ago, williamcll said:

How does it compare to the RX 6600/XT?

 

The RTX 3060 is 4% faster @ 1080p, and 10% faster @ 1440p than the RX 6600 non-XT:

 

[Relative performance chart, 1920x1080]

[Relative performance chart, 2560x1440]

 

The RX 6600 XT is 11-14% faster @ 1080p, and 8-11% faster @ 1440p than the RTX 3060:

 

[Relative performance chart, 1920x1080]

[Relative performance chart, 2560x1440]

 

The Arc A750 is supposedly 3-5% faster than the RTX 3060 on average (according to Intel).

 

So @ 1080p the Arc A750 is 7-9% faster than an RX 6600 and @ 1440p 13-15% faster than an RX 6600.

 

And @ 1080p the RX 6600 XT is 7-10% faster than the Arc A750 and @ 1440p the RX 6600 XT is 3-7% faster than the Arc A750.
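
If you want to sanity-check that chaining, here's a quick Python sketch that treats each "X% faster" claim as a ratio and multiplies (or divides) the ratios instead of adding and subtracting percentages. The inputs are the relative-performance figures above plus Intel's 3-5% claim, so treat the outputs with the same grain of salt; they land close to, but not exactly on, the simple sums.

# Chain "X% faster" claims as ratios (1 + X/100); ratios multiply, they don't add.
# Inputs: the relative-performance chart figures above and Intel's own 3-5% A750 claim.

def pct(ratio: float) -> str:
    return f"{(ratio - 1) * 100:.0f}%"

a750_vs_3060 = (1.03, 1.05)                                   # Intel: A750 is 3-5% faster than the RTX 3060
r3060_vs_6600 = {"1080p": 1.04, "1440p": 1.10}                # RTX 3060 vs RX 6600 (non-XT)
xt_vs_3060 = {"1080p": (1.11, 1.14), "1440p": (1.08, 1.11)}   # RX 6600 XT vs RTX 3060

for res, r in r3060_vs_6600.items():
    print(f"A750 vs RX 6600 @{res}: {pct(a750_vs_3060[0] * r)} to {pct(a750_vs_3060[1] * r)} faster")

for res, (lo, hi) in xt_vs_3060.items():
    # divide out the A750's lead over the 3060 to compare the RX 6600 XT against the A750
    print(f"RX 6600 XT vs A750 @{res}: {pct(lo / a750_vs_3060[1])} to {pct(hi / a750_vs_3060[0])} faster")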

 

3 hours ago, Vishera said:

Considering that 98% of my game library is DX11, DX10, and DX9 games, I would say nope to ARC.

For example: The Witcher 3: Wild Hunt, GTA V, Kingdom Come: Deliverance, the Batman: Arkham series, and many more...

 

Yeah, besides Resizable BAR having to be enabled, performance only being good in modern APIs is a serious deterrent for most people. Since these are supposed to be for midrange builds, you would want to be able to slot the card into an older machine. Having these caveats sort of nullifies the point of it being a midrange card in the first place.


Nice of Intel to provide specifics on how they did their testing, but is it normal to use single-run results (for their second set of games)? I get there probably won't be that much variance, but why didn't they just stick with the median-of-three-runs method they used for their first set of games? I'm not a statistics guy, but at least be consistent?

 

No matter how I look at it, running a test through once just seems kinda lazy compared to all the benchmarks and testing I'm used to from tech youtubers and independent reviewers.
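
To illustrate why the median-of-three approach is the more defensible one, here's a toy Python example with made-up FPS numbers: if something hiccups during one run, a single-run result can be way off, while the median of three barely moves.

# Toy example (made-up numbers): single-run results vs the median of three runs.
from statistics import median

clean_runs = [141.8, 143.2, 142.5]    # three well-behaved runs of the same scene
hiccup_runs = [141.8, 127.4, 142.5]   # same scene, but run 2 hit a background task

print("single run (unlucky pick):", hiccup_runs[1])         # 127.4 -- misleading
print("median of 3 (clean):", median(clean_runs))           # 142.5
print("median of 3 (with hiccup):", median(hiccup_runs))    # 141.8 -- barely moves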


On 8/11/2022 at 3:30 PM, BiG StroOnZ said:

So, this might be an AMD FineWine Technology situation. 

 

 

AMD FineWine was essentially the result of developing a highly compute-focused architecture (while NVIDIA seemed to have gone leaner with Kepler). This approach seemed to have worked once shaders became more complex and differing types of work began being handled by the GPU (such as physics and particle simulation), and it paid off massively when low-level APIs became more common.
 

Intel's strong performance in DX12 and Vulkan titles is promising, as it shows the hardware is there and that Intel has the right architecture for modern gaming. They do need to step it up with drivers, though.

My eyes see the past…

My camera lens sees the present…


13 hours ago, Zodiark1593 said:

AMD FineWine was essentially the result of developing a highly compute-focused architecture (while NVIDIA seemed to have gone leaner with Kepler). This approach seemed to have worked once shaders became more complex and differing types of work began being handled by the GPU (such as physics and particle simulation), and it paid off massively when low-level APIs became more common.
 

Intel's strong performance in DX12 and Vulkan titles is promising, as it shows the hardware is there and that Intel has the right architecture for modern gaming. They do need to step it up with drivers, though.

 

What I meant by it being an AMD FineWine situation was that there's room for optimization in the Intel Arc drivers that will increase performance over time:

 

Quote

AMD FineWine Technology, as the community loves to call it, is a situation where a Radeon GPU is seemingly able to catch up or even surpass its corresponding GeForce rival SKU due to the performance gains it undergoes during its life cycle. In essence, it gets better as it ages, something like a high-quality wine would do.

 

The gains it achieves are a consequence of subsequent driver updates as well as more favorable game optimizations. The FineWine phenomenon generally starts to kick in a couple of years into a GPU generation.

 

https://www.neowin.net/news/amds-big-navi-already-showing-finewine-magic-rx-6900-xt-almost-on-par-with-rtx-3090/

