
I hope no one gets fired for this...

AlexTheGreatish

Intel brought us the Arc A770 to *definitely* not test. We certainly didn't show any performance data. For sure.

 

 

 


Who cares?  It's a steaming pile like the rest of the first ones.  When it's Erica, I might be interested.


Inb4 the email:

Quote

 

G*ddammit Linus what the F**K!

 

Love,

Intel

 

 

Aerocool DS are the best fans you've never tried.


A question on the mind of capital-E Enthusiasts would be this: does Arc support SR-IOV? For example, could I install an Arc GPU in my machine and use it to game in a Windows VM while it's also available to Linux? Are the Linux drivers fully open source, with nothing locked out? That alone would eliminate NVIDIA as the go-to for anyone on Linux. Give us an Arc with a large RAM buffer and watch it grow.

THIS MAY BE OF USE TO NON-ENTHUSIASTS IN THE FUTURE, especially if Windows goes to fully robust VM-based security, with your browsers and each app running sandboxed in VMs isolated from each other, similar to something like Qubes OS. We know they have it on their mind: https://support.microsoft.com/en-us/topic/virtualization-based-security-in-windows-10-on-arm-818468f0-d67f-9bf9-4a5c-183fa55c0898
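For anyone curious, a quick way to check on Linux whether a card's driver actually advertises SR-IOV is to look for the sriov_totalvfs attribute the kernel exposes in sysfs. A minimal sketch, using a hypothetical PCI address for the Arc card (look yours up with lspci):

```python
from pathlib import Path

# Hypothetical PCI address for the Arc card; find yours with `lspci | grep -i vga`.
GPU_ADDR = "0000:03:00.0"

def sriov_total_vfs(pci_addr: str):
    """Return the max number of SR-IOV virtual functions, or None if the
    kernel/driver does not expose SR-IOV for this device."""
    attr = Path("/sys/bus/pci/devices") / pci_addr / "sriov_totalvfs"
    if not attr.exists():
        return None  # no SR-IOV capability advertised
    return int(attr.read_text().strip())

if __name__ == "__main__":
    vfs = sriov_total_vfs(GPU_ADDR)
    if vfs is None:
        print(f"No SR-IOV capability exposed for {GPU_ADDR}")
    else:
        print(f"{GPU_ADDR} advertises up to {vfs} virtual functions")
```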


5 minutes ago, Uttamattamakin said:

A question on the mind of capital-E Enthusiasts would be this: does Arc support SR-IOV? For example, could I install an Arc GPU in my machine and use it to game in a Windows VM while it's also available to Linux? Are the Linux drivers fully open source, with nothing locked out? That alone would eliminate NVIDIA as the go-to for anyone on Linux. Give us an Arc with a large RAM buffer and watch it grow.

THIS MAY BE OF USE TO NON-ENTHUSIASTS IN THE FUTURE, especially if Windows goes to fully robust VM-based security, with your browsers and each app running sandboxed in VMs isolated from each other, similar to something like Qubes OS. We know they have it on their mind: https://support.microsoft.com/en-us/topic/virtualization-based-security-in-windows-10-on-arm-818468f0-d67f-9bf9-4a5c-183fa55c0898

They actually asked this on the WAN Show last week; the first hour is a discussion with the Intel guys. They (Intel) said they would definitely look into bringing that if it's not supported already, but they weren't 100% sure offhand. Not sure on the driver part, but I vaguely remember them saying something about that too. Could be wrong, though.



Nice video; too bad about the level of gaming performance.

That makes none of these cards a good deal for gamers... yet.

Even if Linus wanted to say the A380 could be a good deal, yeah, no. Maybe at some point it's "decent".

For AI or other workloads, maybe, and paired with an Intel CPU.


Curious whether using DXVK is an option for DX11/DX9 games on Intel Arc. DXVK can already help with some old DX9 games.
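For anyone who wants to try it, a minimal sketch of forcing a DX9/DX11 game through DXVK under Wine, assuming Wine and the DXVK DLLs are already installed in the prefix (the prefix and game paths are placeholders):

```python
import os
import subprocess

# Placeholder paths; point these at your own prefix and game install.
WINE_PREFIX = os.path.expanduser("~/.wine-arc-test")
GAME_EXE = os.path.expanduser("~/games/old_dx9_game/game.exe")

env = os.environ.copy()
env.update({
    "WINEPREFIX": WINE_PREFIX,
    # Use the native (DXVK) D3D DLLs instead of Wine's built-in implementation.
    "WINEDLLOVERRIDES": "d3d9,d3d11,dxgi=n",
    # DXVK's HUD confirms the Vulkan translation layer is actually in use.
    "DXVK_HUD": "devinfo,fps",
})

subprocess.run(["wine", GAME_EXE], env=env, check=True)
```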


Is anyone else getting a "Sony vs Microsoft 'vs' Nintendo" vibe from this Intel GPU news? They're not trying to be top of the line. They're just happy to be part of the game at the mid range.


7 minutes ago, battlepants220 said:

Is anyone else getting a "Sony vs Microsoft 'vs' Nintendo" vibe from this Intel GPU news? They're not trying to be top of the line. They're just happy to be part of the game at the mid range.

I get the sense that they want to be in the same breath as those two, but the tech isn’t there yet and they know it. So, they’re making chicken salad from chicken feathers.

Aerocool DS are the best fans you've never tried.


I wish someone would do an NVENC vs whatever-encoder-the-Arc-GPU-uses comparison. All the reviewers are focused on gaming, which is definitely a bigger market so I get it, but I've been holding off on upgrading my media server because of prices. If the Arc series can do AV1 encode/decode in Docker or a VM, I'll no longer have a reason to keep my 1000-series GPUs.
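Until someone does a proper head-to-head, a rough way to eyeball it yourself is to push the same clip through each hardware encoder with ffmpeg and compare the outputs. A minimal sketch, assuming an ffmpeg build with both QSV (Intel) and NVENC (NVIDIA) support compiled in, and a placeholder input file:

```python
import subprocess

# Placeholder test clip; any short source file works.
SOURCE = "sample_clip.mkv"

# Encoder names assume an ffmpeg build with QSV and NVENC enabled.
jobs = {
    "arc_av1_qsv.mkv":       ["-c:v", "av1_qsv"],     # AV1 on Intel (Arc-class hardware)
    "arc_hevc_qsv.mkv":      ["-c:v", "hevc_qsv"],    # HEVC on Intel
    "nvidia_hevc_nvenc.mkv": ["-c:v", "hevc_nvenc"],  # HEVC on NVIDIA
}

for out_file, codec_args in jobs.items():
    cmd = ["ffmpeg", "-y", "-i", SOURCE, *codec_args, "-b:v", "4M", out_file]
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)
```

Comparing output sizes plus running ffmpeg's ssim (or libvmaf, if your build has it) filter against the source would give a rough quality read at the same bitrate.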


Never thought there would be that much of a difference between DX11 and DX12.

Useful threads: PSU Tier List | Motherboard Tier List | Graphics Card Cooling Tier List ❤️

Baby: MPG X570 GAMING PLUS | AMD Ryzen 9 5900X w/ PBO | Corsair H150i Pro RGB | ASRock RX 7900 XTX Phantom Gaming OC (3020 MHz core & 2650 MHz memory) | Corsair Vengeance RGB PRO 32GB DDR4 (4x8GB) 3600 MHz | Corsair RM1000x | WD_BLACK SN850 | WD_BLACK SN750 | Samsung EVO 850 | Kingston A400 | PNY CS900 | Lian Li O11 Dynamic White | Display(s): Samsung Odyssey G7, ASUS TUF GAMING VG27AQZ 27" & MSI G274F

 

I also drive a Volvo, as one does being Norwegian haha, a Volvo V70 D3 from 2016.

Reliability was a key thing, and it's my second car; it's working pretty well for its 6 years of age xD


1 hour ago, kryptonitecb said:

I wish someone would do an NVENC vs whatever-encoder-the-Arc-GPU-uses comparison.

If the Arc series can do AV1 encode/decode in Docker or a VM

You can't yet, I would assume, and some of the support is in software rather than hardware. There's also the question of what "better" means here: closer to the original, or nicer to look at? That comes down to personal preference, plus how each codec handles motion per element.

3 Things You Should Know About AV1: "better at lower bitrates"

https://youtu.be/ibXKKllz4xQ?t=694

Intel's own video looks a bit different, though, since it shows footage in motion rather than a still image.

I'm also thinking about the DPC++ Compatibility Tool. It will not migrate all code and manual changes may be required, but it covers roughly 80-90% (a rough sketch of the basic invocation is below):

https://www.intel.com/content/www/us/en/develop/documentation/oneapi-programming-guide/top/software-development-process/migrating-code-to-dpc/migrating-from-cuda-to-dpc.html

"90-95%" - Intel oneAPI Tools: Empowering GROMACS Cross-Architecture Development: "they were able to migrate CUDA code to Data-Parallel C++ using the Intel® DPC++ Compatibility Tool to create new cross-architecture-ready code."

https://www.youtube.com/watch?v=yBvWh6sp3bc
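For context, a minimal sketch of what driving that migration looks like, with a hypothetical CUDA source file (larger projects would point dpct at a compilation database and use --in-root/--out-root, and still need manual cleanup afterwards, which is where those 80-95% figures come from):

```python
import subprocess
from pathlib import Path

# Hypothetical CUDA source file to migrate.
cuda_source = Path("vector_add.cu")

# Basic invocation of the DPC++ Compatibility Tool on a single file.
subprocess.run(["dpct", str(cuda_source)], check=True)

# The migrated SYCL code typically lands under ./dpct_output as <name>.dp.cpp
# and still needs a manual review pass before it builds cleanly.
migrated = Path("dpct_output") / cuda_source.with_suffix(".dp.cpp").name
print("Expect migrated output at:", migrated)
```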

 


They might have had luck with upcoming UE5 titles if only the card's overall performance weren't abysmal. At that performance level, AAA titles on DX12 will hit this card hard, and poorly optimized indie games will possibly hit it even harder.

They're claiming Freesync support under a different name, but if I had to guess, I'd say there will be compatibility quirks with a lot of monitors, just like there were when one camp first opened its sync support to the other. On lesser-known monitors, adaptive sync might not work well enough to count, or might not work at all.

That, plus the load of other bugs, makes this beyond a bad product.


I'll just watch this all unfold with interest. I'm not buying anything anytime soon, but if they can make decent gamer cards at solid prices, by all means I'll be happy to see that.

 

BTW, these dudes from Intel really will do anything for some press; they're just straight-up shilling for LTT and the video still got sponsored. I'm now expecting them to show up in the next Linus house update, just helping hang stuff up.


3 hours ago, battlepants220 said:

Is anyone else getting a "Sony vs Microsoft 'vs' Nintendo" vibe from this Intel GPU news? They're not trying to be top of the line. They're just happy to be part of the game at the mid range.

Nah. Nintendo has a whole different market. Nintendo consoles are either the only gaming device for a person who wouldn't even consider anything else, or something someone buys in addition to their PC/PS/Xbox for a different type of game. Nintendo hasn't been competing directly against Microsoft and Sony since the GameCube generation.

 

Intel wants to compete against Nvidia and AMD straight up but their technology isn't there yet.

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | PowerColor Red Devil RX 6900 XT | 4x8GB Crucial Ballistix 3600mt/s CL16


So Intel is marketing this towards mid-range gamers, the market most likely to buy older games because of their price?

Yeah, that's not gonna work; games that run on older DX levels, such as CS:GO, are gonna perform TERRIBLY.

If I was spec'ing out a system with an Arc GPU and found out it can't run CS:GO well, I would remove it and get a competitor's card.

And before anyone says "ahhhhhh but Source 2!": Source 2 doesn't even support DX12, it only supports Vulkan, OpenGL and DirectX 11!

It's good and all to support newer standards, but there are SO many games that run on DX11 or older that people still play to this day. DX12 games are EXPENSIVE: Final Fantasy VII Remake (2021) is $100 AUD! Most of them are above $80 AUD, with the only exception being Tomb Raider at $15 AUD.

If you are an Intel engineer reading this and you want this product to succeed, you HAVE to put more into getting these older games to run well before the mainstream gaming crowd gets their hands on it. Let's not have another Fallout 76, for god's sake.

*Insert Witty Signature here*

System Config: https://au.pcpartpicker.com/list/Tncs9N

 


4 hours ago, Quackers101 said:

You can't yet, I would assume, and some of the support is in software rather than hardware. There's also the question of what "better" means here: closer to the original, or nicer to look at? That comes down to personal preference, plus how each codec handles motion per element.

 

 

Lots of good info. I skimmed the Intel API docs but I'll watch the videos later. I understand it enough to say that hopefully it's simple, so that programmers are willing to dive in and start using it. My biggest concern is that if Intel doesn't have a Docker integration, Nvidia will continue to be the only compatible discrete GPUs. Docker already works with Intel's iGPU, so I dare to assume Arc might work too, but I'm waiting for someone to verify that hands-on. If it does, I don't have to do anything except slap in an Arc and make a few changes in my compose files. Right now, to use an Intel iGPU in Docker Compose you add "devices: - /dev/dri:/dev/dri" and it just works. That would be the best case for Arc, so I'm doubtful things will play out that way.
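For what it's worth, that same compose-style device mapping can also be expressed with the Docker SDK for Python; a minimal sketch, assuming a Jellyfin-style transcoding container (the image name and library path are placeholders):

```python
import docker

client = docker.from_env()

# Equivalent of the compose entry "devices: - /dev/dri:/dev/dri":
# pass the render/card nodes through so the container can use VA-API/QSV.
container = client.containers.run(
    image="jellyfin/jellyfin:latest",      # placeholder media-server image
    name="media-server-arc-test",
    devices=["/dev/dri:/dev/dri:rwm"],     # host:container:permissions
    volumes={"/srv/media": {"bind": "/media", "mode": "ro"}},  # placeholder library
    detach=True,
)
print("Started container:", container.short_id)
```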


4 hours ago, Salv8 (sam) said:

So Intel is marketing this towards mid-range gamers, the market most likely to buy older games because of their price?

Yeah, that's not gonna work; games that run on older DX levels, such as CS:GO, are gonna perform TERRIBLY.

If I was spec'ing out a system with an Arc GPU and found out it can't run CS:GO well, I would remove it and get a competitor's card.

And before anyone says "ahhhhhh but Source 2!": Source 2 doesn't even support DX12, it only supports Vulkan, OpenGL and DirectX 11!

It's good and all to support newer standards, but there are SO many games that run on DX11 or older that people still play to this day. DX12 games are EXPENSIVE: Final Fantasy VII Remake (2021) is $100 AUD! Most of them are above $80 AUD, with the only exception being Tomb Raider at $15 AUD.

If you are an Intel engineer reading this and you want this product to succeed, you HAVE to put more into getting these older games to run well before the mainstream gaming crowd gets their hands on it. Let's not have another Fallout 76, for god's sake.

Intel's move here will have to be making credibly good GPUs, at least as good as AMD's, and then leveraging their advantage in everything CPU by giving us a mainstream-PC-worthy RISC SoC. They must fear that AMD or Nvidia will somehow end up with both the top GPU architecture and the top CPU architecture at the same time and put them on a single chip. As Anthony pointed out, that is the next vista for the PC.


I mean, if Intel keeps true to their promise on pricing, they could have at least some pretty good budget options, even if the cards don't perform quite as well in DirectX 11 and so on. The drivers are still being worked on. For instance, while the A380 performs worse than the RX 6400, it's cheaper, and the performance will rise with driver updates. The A380 also has a very low TDP of around 35 W but can be overclocked to around 50-55 W for a significant boost in frames, from 12% up to even 60% in Doom Eternal at least, though I assume the difference is more in the 15-20% range for most games. That puts it just below a 1650 after the overclock while being quite a bit cheaper.

While I wouldn't put this particular GPU in my computer, it's definitely a good card for something like a family computer that's meant to be on the cheap side. A 10th-gen i5 with an A380 would be a very good, cheap computer for playing something like Minecraft, Roblox, or other games like those. I'm honestly interested to see what they put out later this year.

https://www.tomshardware.com/news/intel-arc-a380-overclock-shows-impressive-gains


8 hours ago, Salv8 (sam) said:

DX12 games are EXPENSIVE: Final Fantasy VII Remake (2021) is $100 AUD! Most of them are above $80 AUD, with the only exception being Tomb Raider at $15 AUD.

I don't know what games cost down under, but there are certainly a lot more games under $80 than just Tomb Raider.

Back 4 Blood, Anno 1800, all modern Battlefield and CoD titles, Cyberpunk, Dead by Daylight, Death Stranding, Dirt 5, Elden Ring, all modern F1 titles, Forza Horizon 3, 4 and 5...

 

8 hours ago, Salv8 (sam) said:

If you are an Intel engineer reading this and you want this product to succeed, you HAVE to put more into getting these older games to run well

I don't think that's worth it. If you as a customer want support for all these old APIs, you already have two options. Intel should concentrate on what's coming and not look towards the past, so they can actually compete with AMD and Nvidia in new titles. If they don't do that, we might end up with another product that's neither good nor bad in any scenario and won't get sh*t done at all.

 

 

 

 

 


I would very much like for someone to test DXVK on this card.

If it was useful, give it a like :) BTW, if you're into Linux, pay a visit here

 


14 hours ago, Motifator said:

They're claiming Freesync support under a different name, but if I had to guess, I'd say there will be compatibility quirks with a lot of monitors, just like there were when one camp first opened its sync support to the other. On lesser-known monitors, adaptive sync might not work well enough to count, or might not work at all.

From what they seem to say in the video, if I'm not mistaken, at least some of it will be done on the GPU. Of course, a monitor might need its own buffer.


15 hours ago, Salv8 (sam) said:

So Intel is marketing this towards mid-range gamers, the market most likely to buy older games because of their price?

Yeah, that's not gonna work; games that run on older DX levels, such as CS:GO, are gonna perform TERRIBLY.

If I was spec'ing out a system with an Arc GPU and found out it can't run CS:GO well, I would remove it and get a competitor's card.

And before anyone says "ahhhhhh but Source 2!": Source 2 doesn't even support DX12, it only supports Vulkan, OpenGL and DirectX 11!

It's good and all to support newer standards, but there are SO many games that run on DX11 or older that people still play to this day. DX12 games are EXPENSIVE: Final Fantasy VII Remake (2021) is $100 AUD! Most of them are above $80 AUD, with the only exception being Tomb Raider at $15 AUD.

I'm not saying I disagree, but I like to look a bit toward the future with this. Yes, right now there are games that are definitely not performing how they should, but I can understand them looking toward the future. It isn't that people will stop playing DX11 games; it's that Intel's cards will get powerful enough to run them at a very high framerate. They're jumping in late, so it's fine if they don't support everything properly, but they need to go long on this, because it could be like 5 years before no one gives a shit anymore that Intel's cards only get 120 fps vs 240 fps in DX11 games. (That's 5 years to create more powerful cards.)


I do like the direction they're going. We needed another option in the GPU space, and it's a very strong start for them. It has taken the other two years to get to where they are, and as someone who likes the style Intel is going with, I'd love to see more competition in the space; the more competition, the better for us.

 

Case in point: Intel vs AMD. When AMD came out with Ryzen, things heated up in that space. In a few years I might be looking at an Intel GPU, but it depends on how it performs in my design applications, the Adobe ones.

PC: AMD Ryzen 7 5800X Gigabyte X570 UD Corsair Vengeance 32GB 3200MHz Gigabyte RTX 2070 Intel 512GB boot / Seagate 2TB spinner Windows 10 Corsair 4000D black

 

Fyi i am Autistic (Aspergers) so sorry for any social mistakes (im mostly okay) If you want to learn more, Just ask! 

 

