
Please Buy Intel GPUs.

AlexTheGreatish


Sorry, I probably edited my post. Refresh plz. Build specs below.

System

  • CPU
    Ryzen 9 5900x
  • Motherboard
    ASUS ROG STRIX X570-F
  • RAM
    32 GB (2X8) Trident Z Neo 3600MHz CAS 16
  • GPU
    ASUS ROG STRIX RTX 3070
  • Case
    Corsair 4000D Airflow
  • Storage
Sabrent 1 TB TLC PCIe 4.0 NVMe M.2
  • PSU
NZXT C850 Gold
  • Display(s)
    MSI Optix MAG342CQR 34" UWQHD
  • Cooling
    Corsair H100i RGB Pro XT 240mm
  • Operating System
    Windows 11

So being DX12 "optimized"... and a "first generation" product... this makes me say pass. I'd rather get an RX 6600 or RTX 3060, with solid drivers by this point and game engines that have optimizations built into them.

 

Being a new video card, a lot of games that are optimized for consoles first and then ported to PC will by default be optimized for RDNA, RDNA2 and so on... and nVidia has enough money to pay studios to optimize games for them as well... so by default Intel cards will be lacking optimizations, and not all studios will bother due to the low number of cards in the market... which leaves game engines having to catch up and introduce optimizations for Intel cards, so it will take some time for this to be a solid product worth buying.

Maybe the 2nd generation, if Intel doesn't give up on it.

 

 


8 minutes ago, mariushm said:

Being a new video card, a lot of games that are optimized for consoles first and then ported to PC will by default be optimized for RDNA, RDNA2 and so on... and nVidia has enough money to pay studios to optimize games for them as well... so by default Intel cards will be lacking optimizations, and not all studios will bother due to the low number of cards in the market... which leaves game engines having to catch up and introduce optimizations for Intel cards, so it will take some time for this to be a solid product worth buying.

Maybe the 2nd generation, if Intel doesn't give up on it.

It's a two-way street, and Intel can do A LOT on the driver side to optimize for games, which will translate to other games and older games depending on what API is used.

 

So they have a mountain of work ahead, BUT there is some decent raw performance available, so there might be hope yet.


In my eyes, Intel doesn't have a good reputation when it comes to drivers and product cancellations... so what you're saying may be true, but I'll still wait.

 

For example, they pulled all the drivers and BIOSes for legacy Intel motherboards from their website: https://www.zdnet.com/article/intel-to-remove-old-drivers-and-bios-updates-from-its-site-by-the-end-of-the-week/ - as if Intel doesn't have the money to buy a bunch of hard drives and rent a server to serve static downloads.

 

They discontinued the processor that was made in collaboration with AMD after just a year... these were launched in Q3 2018 and cancelled in autumn 2019: https://ark.intel.com/content/www/us/en/ark/products/codename/136847/products-formerly-kaby-lake-g.html

 

If they get bored with video cards, they'll cancel them next year, just like they cancelled other projects... see Larrabee and Timna (at least they had an excuse for that one, as it used Rambus RAM).


34 minutes ago, AlexTheGreatish said:

Intel's Arc Alchemist GPUs are FINALLY launching and these are some surprisingly competitive cards, sometimes.

 

 

 

You know what, I'd consider buying one very soon if I could find out where to. I found two listings on Newegg, but that was it: no Amazon, no eBay, no Walmart, nowhere else I looked. Does anybody know where I can get one of these?


I really want to see Final Fantasy XIV tested, though, as it's an MMO that runs on either DX11 or DX9. If anyone from LMG staff wants access to my account for benchmarking, hit me up. @AlexTheGreatish @LinusTech

CPU: Intel Core i7-8086K Case: CORSAIR Crystal 570X RGB CPU Cooler: Corsair Hydro Series H150i PRO RGB Storage: Samsung 980 Pro - 2TB NVMe SSD PSU: EVGA 1000 GQ, 80+ GOLD 1000W, Semi Modular GPU: MSI Radeon RX 580 GAMING X 8G RAM: Corsair Dominator Platinum 64GB (4 x 16GB) DDR4 3200MHz Motherboard: Asus ROG STRIX Z370-E Gaming


1 hour ago, AlexTheGreatish said:

Intel's Arc Alchemist GPUs are FINALLY launching and these are some surprisingly competitive cards, sometimes.

 

 

 

I am curious how good the AV1 encoding is compared to NVIDIA's 4000 series. Linus brought up on a WAN Show the possibility of using the lower-end Arc card as a dedicated video encoder for editing and streaming. Seems like it would be an awesome option for that.
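If anyone wants to poke at it themselves, something like this is the quickest way to exercise the hardware encoder - a minimal sketch in Python driving ffmpeg, assuming an ffmpeg build with Quick Sync (QSV) support that exposes the av1_qsv encoder; the file names are placeholders:

```python
import subprocess

# Transcode a clip with Arc's hardware AV1 encoder via ffmpeg's QSV path.
# Assumes an ffmpeg build with Quick Sync support exposing av1_qsv;
# "input.mp4" and "arc_av1.mkv" are placeholder file names.
subprocess.run([
    "ffmpeg",
    "-i", "input.mp4",   # source clip
    "-c:v", "av1_qsv",   # Intel Quick Sync AV1 encoder (Arc)
    "-b:v", "6M",        # target bitrate, roughly streaming-tier
    "-c:a", "copy",      # pass the audio through untouched
    "arc_av1.mkv",
], check=True)
```

Running the same clip through a 4000-series card's AV1 encoder at the same bitrate would make for a pretty direct quality comparison.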

CPU-AMD Ryzen 7 7800X3D GPU- RTX 4070 SUPER FE MOBO-ASUS ROG Strix B650E-E Gaming Wifi RAM-32gb G.Skill Trident Z5 Neo DDR5 6000cl30 STORAGE-2x1TB Seagate Firecuda 530 PCIE4 NVME PSU-Corsair RM1000x Shift COOLING-EK-AIO 360mm with 3x Lian Li P28 + 4 Lian Li TL120 (Intake) CASE-Phanteks NV5 MONITORS-ASUS ROG Strix XG27AQ 1440p 170hz+Gigabyte G24F 1080p 180hz PERIPHERALS-Lamzu Maya+ 4k Dongle+LGG Saturn Pro Mousepad+Nk65 Watermelon (Tangerine Switches)+Autonomous ErgoChair+ AUDIO-RODE NTH-100+Schiit Magni Heresy+Motu M2 Interface


wtf is with this drama switching, or is that just going to be the standard?

Like "first we are going to hate them in our videos because that is "trendy", then we will beg and ask you to support them!" It felt a bit like that for GN too.

Like yes, the drivers were bad and all, but some of it felt so artificial, driven by whatever opinion of them is "trendy".

 

And yeah, the stats in the video are missing labels, which made the numbers useless.


 

Monitors not detected, fan speeds not configurable, other driver bugs... still a lot of issues reported by GamersNexus.

 

 


Would love to see CS:GO with the -vulkan launch option, and I'd also like to see it on Linux using DXVK to translate DX9 to Vulkan instead of the Windows DirectX 9-to-12 conversion layer.


I fully agree that we need a third competitor in the GPU market, but that still doesn't mean that I have 300 bucks just lying around to spend on a, let's be real, terrible GPU.

Meanwhile in 2024: Ivy Bridge-E has finally retired from gaming (but is still not dead).

Desktop: AMD Ryzen 9 7900X; 64GB DDR5-6000; Radeon RX 6800XT Reference / Server: Intel Xeon 1680V2; 64GB DDR3-1600 ECC / Laptop:  Dell Precision 5540; Intel Core i7-9850H; NVIDIA Quadro T1000 4GB; 32GB DDR4


I play quite a few DX9 titles, so Intel Arc really makes me think twice... However, I really want to know if I could get a DAZ Studio Iray render using Arc... but I doubt it, as Iray is NVIDIA's render engine.

Mind you, I'm half tempted to run this alongside my GTX 1060... the AV1 support sounds nice.


3 hours ago, mariushm said:

So being DX12 "optimized"... and a "first generation" product... this makes me say pass. I'd rather get an RX 6600 or RTX 3060, with solid drivers by this point and game engines that have optimizations built into them.

 

Being a new video card, a lot of games that are optimized for consoles first and then ported to PC will by default be optimized for RDNA, RDNA2 and so on... and nVidia has enough money to pay studios to optimize games for them as well... so by default Intel cards will be lacking optimizations, and not all studios will bother due to the low number of cards in the market... which leaves game engines having to catch up and introduce optimizations for Intel cards, so it will take some time for this to be a solid product worth buying.

Maybe the 2nd generation, if Intel doesn't give up on it.

 

 

Owning an AMD GPU atm, I will say F AMD GPUs. There's no value in them when they're priced the same as or higher than NVIDIA's.


9 minutes ago, Just that Mario said:

Owning an AMD GPU atm, I will say F AMD GPUs. There's no value in them when they're priced the same as or higher than NVIDIA's.

Why?


So, the "Microsoft" tech they are using is basically Mesa. Which is cool, because Mesa supports DX9 on Linux, and does it fairly competitively. So what gives with Mesa on Windows?

 

Consider that on Linux, Mesa talks directly to the GPU driver, but on Windows this is converted to DX12, which then talks to the driver. At a guess, that is where much of the performance is going.

 

This does give us some hope, however. Zink on the Linux side is getting some impressive numbers lately, which show that you can get very close to native with the right optimizations. It is just going to take time.
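For a feel of what that layering costs, Zink is easy to toggle on Linux - a minimal sketch, assuming a Mesa build recent enough to ship Zink and the glxinfo utility installed:

```python
import os
import subprocess

def renderer(env=None):
    """Return glxinfo's brief report of the active GL driver/renderer."""
    result = subprocess.run(["glxinfo", "-B"], env=env,
                            capture_output=True, text=True)
    return result.stdout

# Native OpenGL driver first, as a baseline.
print(renderer())

# Now force Mesa's Zink driver (OpenGL layered on top of Vulkan) - the
# closest Linux analogue to Intel running DX9 on top of DX12 on Windows.
zink_env = dict(os.environ)
zink_env["MESA_LOADER_DRIVER_OVERRIDE"] = "zink"
print(renderer(zink_env))  # the renderer string should now mention zink
```

Benchmark the same game both ways and the gap is roughly the price of the extra translation hop.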

 

And since their strategy is to use this, they will use it for their next cards, and continue to optimize it, benefiting all those cards that use it.

 

That is the hopeful outlook, anyway.


Why would I do that to myself? After all, my CPU doesn't even support Resizable BAR.

 

Intel shot themselves in the foot by requiring ReBAR in order to get the full performance of their GPUs, which excludes a large number of would-be customers; even 9th-gen owners are left out.

When it comes to CPU lifespan, 4 years is not much; in this day and age a lot of people hold on to their CPUs for 7-10 years.

30.99% of gamers on the Steam hardware survey have a quad-core CPU - and that's a lot of people.
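If you're unsure whether your own machine qualifies, on Linux you can at least check whether the card reports the capability - a rough sketch that parses lspci output; it assumes the pciutils lspci tool and usually needs root to show the capability list:

```python
import subprocess

# Rough Resizable BAR check on Linux: lspci -vv prints a
# "Physical Resizable BAR" capability (with current and supported BAR
# sizes) on devices that expose it. Usually needs root to show capabilities.
out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

# lspci separates devices with blank lines; look only at GPU entries.
for device in out.split("\n\n"):
    if "VGA compatible controller" in device or "3D controller" in device:
        name = device.splitlines()[0]
        status = "yes" if "Resizable BAR" in device else "not reported"
        print(name)
        print("  Resizable BAR capability:", status)
```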

 

Also, the drivers are a huge issue.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

It's funny how often I'm seeing the 6600 being shouted out as a good buy considering how hard reviews went on it when it came out.

 

Edit: Come to think of it, would running DX9/10/11 games under DXVK improve performance at all? That's something that definitely warrants testing imo


2 hours ago, mariushm said:

 

Monitors not detected, fan speeds not configurable, other driver bugs... still a lot of issues reported by GamersNexus.

 

 

Interesting. I haven't gotten to that vid yet, but I'm looking forward to his coverage. I'm a little surprised, as Digital Foundry said that, aside from the performance issues everyone is mentioning, they found the cards shockingly stable and problem-free otherwise, particularly for a first run.


CS:GO can use Vulkan the same way Intel translates DX9 to DX12 - through a translation layer, in this case DXVK.

I think it'll be interesting to compare Intel's native DX9-to-DX12 translation with the DX9-to-Vulkan translation layer made by doitsujin/Valve.
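For anyone setting that comparison up, DXVK's built-in HUD makes it easy to confirm the Vulkan path is actually in use - a minimal sketch, assuming Wine with DXVK installed in the prefix; Game.exe is a placeholder:

```python
import os
import subprocess

# Launch a DX9 title through Wine with DXVK's overlay enabled, to confirm
# the game really is going D3D9 -> Vulkan and to watch the frame rate.
# Assumes DXVK is installed in the Wine prefix; "Game.exe" is a placeholder.
env = dict(os.environ)
env["DXVK_HUD"] = "devinfo,api,fps"  # GPU name, D3D feature level, FPS

subprocess.run(["wine", "Game.exe"], env=env, check=True)
```

Capture frame times from that run and from the same title on Windows through Intel's D3D9-on-12 layer and you'd have exactly the comparison you're describing.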


1 hour ago, BillDStrong said:

So, the "Microsoft" tech they are using is basically Mesa. Which is cool, because Mesa supports DX9 on Linux, and does it fairly competitively. So what gives with Mesa on Windows?

 

Consider that on Linux, Mesa talks directly to the GPU driver, but on Windows this is converted to DX12, which then talks to the driver. At a guess, that is where much of the performance is going.

 

This does give us some hope, however. Zink on the Linux side is getting some impressive numbers lately, which show that you can get very close to native with the right optimizations. It is just going to take time.

 

And since their strategy is to use this, they will use it for their next cards, and continue to optimize it, benefiting all those cards that use it.

 

That is the hopeful outlook, anyway.

Been gaming on Linux for nearly 2 decades with varying success. It's actually not a rare occurrence for a Windows game to perform better in Wine (or similar) than it does on Windows. I used to dual-boot for that very reason, so I could switch over to Linux for the games that would perform better.

 

I've seen plenty of speculation as to why, but it usually boils down to the idea that Windows just has so much overhead that the Linux compatibility layer doesn't have to deal with, so if the game runs stably that way, there's a chance you can pick up a few more FPS.

 

I'm excited to see how Arc runs on Linux, especially on a system with SteamOS. If they can nail their Linux drivers, then Intel immediately becomes the top option for building cheap Plex+Steam media centers.

CPU: Ryzen 5 5600X  | Motherboard: ASROCK B450 pro4 | RAM: 2x16GB  | GPU: MSI NVIDIA RTX 2060 | Cooler: Noctua NH-U9S | SSD: Samsung 980 Evo 1T 


I'd be curious to know if some older games even run at all. I remember having issues getting games like KotOR 1/2 and Fallout 3 to even launch on Intel integrated GPUs regardless of settings, although it has been a fair number of years since I tried. Software support is kind of important; nobody wants to buy a $300 item and then not be able to run something they enjoy that could run on a computer from the 2000s.


1 hour ago, Error 52 said:

It's funny how often I'm seeing the 6600 being shouted out as a good buy considering how hard reviews went on it when it came out.

The price is completely different. At release price it was a horrible buy. At $250 it's a great buy.


It was also trendy to be hard on every GPU release at the time.

I'm not actually trying to be as grumpy as it seems.

I will find your mentions of Ikea or Gnome and I will /s post. 

Project Hot Box

CPU 13900k, Motherboard Gigabyte Aorus Elite AX, RAM CORSAIR Vengeance 4x16gb 5200 MHZ, GPU Zotac RTX 4090 Trinity OC, Case Fractal Pop Air XL, Storage Sabrent Rocket Q4 2tbCORSAIR Force Series MP510 1920GB NVMe, CORSAIR FORCE Series MP510 960GB NVMe, PSU CORSAIR HX1000i, Cooling Corsair XC8 CPU block, Bykski GPU block, 360mm and 280mm radiator, Displays Odyssey G9, LG 34UC98-W 34-Inch,Keyboard Mountain Everest Max, Mouse Mountain Makalu 67, Sound AT2035, Massdrop 6xx headphones, Go XLR 

Oppbevaring

CPU i9-9900k, Motherboard, ASUS Rog Maximus Code XI, RAM, 48GB Corsair Vengeance LPX 32GB 3200 mhz (2x16)+(2x8) GPUs Asus ROG Strix 2070 8gb, PNY 1080, Nvidia 1080, Case Mining Frame, 2x Storage Samsung 860 Evo 500 GB, PSU Corsair RM1000x and RM850x, Cooling Asus Rog Ryuo 240 with Noctua NF-12 fans

 

Why is the 5800x so hot?

 

 

