No longer integrated - Intel launches Arc GPUs for laptops.

williamcll
12 minutes ago, porina said:

@WereCat my question was more about the potential userbase for it. How many use AV1 now? How many more would use it if it had higher encoding performance?

Anybody who streams will use it; Twitch already supports it, YouTube supports it, there was just no way to take advantage of it. The video quality will go up significantly at those low bitrates.

 

The only reason streaming services haven't adopted HEVC is the licensing fees, but AV1 is open source and free.


39 minutes ago, starsmine said:

Not many when the first chips with hardware encoding came out, but many do now. AV1 is a good codec and it's past due for this.

"A journey of a thousand miles begins with a single step"

 

Everything will take adoption time. My thinking was more along the lines of who could/would use it NOW? If we're talking a year away, red and green may have implemented similar too, so it would no longer be a feature advantage for Intel.

 

30 minutes ago, WereCat said:

Anybody who streams will use it; Twitch already supports it, YouTube supports it, there was just no way to take advantage of it. The video quality will go up significantly at those low bitrates.

Thanks for the info. Guess I'm out of the loop on AV1's role in the near future and this is a notable step in increasing its adoption.

 

Just been reading up more on it. Specifically on Twitch, it sounds to me like a work in progress. As I found out as a small streamer, I don't get all the tech features those at partner level do, such as re-encoding. AV1 sounds a bit of a hog without hardware decode, and unless you have recent gen hardware that might be a problem?

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


What a huge letdown.

 

So this Intel dGPU is slower than the Vega iGPU from the 5700G.

 

The RDNA2 iGPU (RX 680M) will obliterate this Intel dGPU.

 

 

Yeah, we're all just a bunch of idiots experiencing nothing more than the placebo effect.

1 hour ago, porina said:

 

 

Thanks for the info. Guess I'm out of the loop on AV1's role in the near future and this is a notable step in increasing its adoption.

 

Just been reading up more on it. Specifically on Twitch, it sounds to me like a work in progress. As I found out as a small streamer, I don't get all the tech features those at partner level do, such as re-encoding. AV1 sounds a bit of a hog without hardware decode, and unless you have recent gen hardware that might be a problem?

I didn't read into the details. Early adoption, as always, takes time, but it has to start somewhere.

 

Most mobile devices released in the last 2 to 3 years have AV1 decode, and on PC you can always use the CPU for SW decode, but it will use more resources.

 

But it's not as simple as that. Let's say you need 6000 bitrate for acceptable 1080p60 with x264 but you only need 1500 bitrate for better quality with AV1.

 

Now compare the users who couldn't watch your stream because they didn't have fast enough Internet for a 6000 bitrate vs the people who don't have a capable enough PC to handle AV1 decode.
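To put rough numbers on that, here's a quick back-of-the-envelope sketch (Python); the 6000 and 1500 are the kbps figures from above, everything else is just arithmetic:

# Rough bandwidth math for the 1080p60 example above:
# x264 at ~6000 kbps vs AV1 at ~1500 kbps (figures from the post, not benchmarks).

def stream_gigabytes(bitrate_kbps: float, hours: float) -> float:
    """Data one viewer pulls watching `hours` of a stream at `bitrate_kbps`."""
    bits = bitrate_kbps * 1000 * hours * 3600   # kbps -> total bits over the session
    return bits / 8 / 1e9                       # bits -> gigabytes

for codec, kbps in [("x264", 6000), ("AV1", 1500)]:
    print(f"{codec:5s} @ {kbps} kbps: {stream_gigabytes(kbps, 1):.2f} GB per viewer-hour")

# x264  @ 6000 kbps: 2.70 GB per viewer-hour
# AV1   @ 1500 kbps: 0.68 GB per viewer-hour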


42 minutes ago, rcarlos243 said:

What a huge letdown.

 

So this Intel dGPU is slower than the Vega iGPU from the 5700G.

 

The RDNA2 iGPU (RX 680M) will obliterate this Intel dGPU.

 

 

Yeah, I'd say so. The Intel desktop GPUs are rumored to be really bad, FWIW. Maybe the price will be okay but their top-end SKU will be like a weaker 3070 with no DLSS/encoder.

CPU-AMD Ryzen 7 7800X3D GPU- RTX 4070 SUPER FE MOBO-ASUS ROG Strix B650E-E Gaming Wifi RAM-32gb G.Skill Trident Z5 Neo DDR5 6000cl30 STORAGE-2x1TB Seagate Firecuda 530 PCIE4 NVME PSU-Corsair RM1000x Shift COOLING-EK-AIO 360mm with 3x Lian Li P28 + 4 Lian Li TL120 (Intake) CASE-Phanteks NV5 MONITORS-ASUS ROG Strix XG27AQ 1440p 170hz+Gigabyte G24F 1080p 180hz PERIPHERALS-Lamzu Maya+ 4k Dongle+LGG Saturn Pro Mousepad+Nk65 Watermelon (Tangerine Switches)+Autonomous ErgoChair+ AUDIO-RODE NTH-100+Schiit Magni Heresy+Motu M2 Interface


Could somebody please clarify how efficient the AV1 codec actually is? I heard it's comparable to HEVC but royalty-free, I'm not totally sure though.


52 minutes ago, WereCat said:

But it's not as simple as that. Let's say you need 6000 bitrate for acceptable 1080p60 with x264 but you only need 1500 bitrate for better quality with AV1.

Intel's slides from yesterday claimed "50% more efficient than H.264" and "20% more efficient than HEVC". Without rewatching that part, does that sound about right for bitrate relative to a given quality? So perhaps 2x improvement, not 4x. Even if so, that's still quite a saving.
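If I read "X% more efficient" as "same quality at (100 - X)% of the bitrate" (my assumption about what the slide means, not Intel's wording), the rough numbers for the 6000 example above would be:

# Assumption: "X% more efficient" = same quality at (100 - X)% of the bitrate.
# That reading is mine, so treat the result as illustrative only.
h264_bitrate_kbps = 6000          # the 1080p60 x264 figure from earlier in the thread
av1_saving_vs_h264 = 0.50         # "50% more efficient than H.264" (slide claim)

av1_kbps = h264_bitrate_kbps * (1 - av1_saving_vs_h264)
print(f"Same quality at roughly {av1_kbps:.0f} kbps -> {h264_bitrate_kbps / av1_kbps:.1f}x, not 4x")
# Same quality at roughly 3000 kbps -> 2.0x, not 4x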

 

52 minutes ago, WereCat said:

Now compare the users who couldn't watch your stream because they didn't have fast enough Internet for a 6000 bitrate vs the people who don't have a capable enough PC to handle AV1 decode.

I've kinda run into that already. Currently I'm streaming at 5500, but I know someone who can't watch my streams because internet in their country sucks.

 

45 minutes ago, Ryan829 said:

Maybe the price will be okay but their top-end SKU will be like a weaker 3070 with no DLSS/encoder.

What encoder, AV1 or NVENC? AMD doesn't have DLSS either; FSR 2.0 is coming soon, as is Intel XeSS. In terms of the tech, XeSS seems closer to DLSS than FSR 2.0 will be. We'll have to wait and see results and, more importantly, developer support.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 hour ago, porina said:

Intel's slides from yesterday claimed "50% more efficient than H.264" and "20% more efficient than HEVC". Without rewatching that part, does that sound about right for bitrate relative to a given quality? So perhaps 2x improvement, not 4x. Even if so, that's still quite a saving.

IDK what they mean by "efficient". 

 

The HW encoder will likely not have the same quality as SW encoding, so the test they did with 1440p 120FPS (pre-encoded) will likely still not be possible, but it will for sure be able to bring 1080p 120FPS or really high quality 1080p 60FPS to the table.


8 hours ago, Ydfhlx said:

Could somebody please clarify how efficient the AV1 codec actually is? I heard it's comparable to HEVC but royalty-free, I'm not totally sure though.

"efficient" in what way you depend on what.

It's open to use, but there's not much hardware for testing it yet, although Intel and AMD are now starting to push it; AMD in their newer 6000 series laptops (GPU or CPU?).

It also depends on encode vs decode, and partly on preference. Do note the point about only now getting hardware for AV1.

 

In general: smaller file sizes, but more time and processing for some workloads.

And sometimes better, but not always.

https://youtu.be/ibXKKllz4xQ?t=689

 

Not just closer; it is like DLSS, unlike FSR. FSR is pretty good without the AI and reconstruction part, although AMD has talked about how they want to add AI to that part.


1 hour ago, Quackers101 said:

"efficient" in what way you depend on what

I meant efficient in terms of bitrate/quality. For instance, HEVC requires about half the bitrate of H.264 for similar quality (especially at higher resolutions).

 

I'm asking out of curiosity, because I have quite a lot of DVDs that I am in the process of copying to a NAS, and was wondering if AV1 brings improvements in this area for the future.


1 hour ago, Ydfhlx said:

I meant efficient in terms of bitrate/quality. For instance, HEVC requires about half the bitrate of H.264 for similar quality (especially at higher resolutions).

 

I'm asking out of curiosity, because I have quite a lot of DVDs that I am in the process of copying to a NAS, and was wondering if AV1 brings improvements in this area for the future.

AV1 in its current state will typically produce smaller files at the same quality as HEVC, with faster encode times, if you compare Intel's SVT-AV1 encoder (arguably the best for real-world usage) against x265 (arguably the best HEVC encoder).

 

If all you care about is file size and quality then libaom will hands down be the best, but it will take ages to encode. 

 

And just to clarify, when people talk about a codec being efficient, they are talking about size vs quality. So your word use was correct.

 

Edit:

Here is one of the many sources I base this post on.

 

 

x265 at CRF 28: quality 70.9 (VMAF), encoding speed 22.11 FPS.

SVT-AV1 at CRF 34: quality 73.3 (VMAF), encoding speed 24.99 FPS.

 

It's a similar story with other presets as well.
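If anyone wants to run this kind of comparison themselves, something like the sketch below would do it, assuming a reasonably recent ffmpeg build with libsvtav1, libx265 and libaom enabled; the CRF values are the ones quoted above, the file names are just placeholders, and VMAF scoring is left as a comment:

# Sketch of an encoder comparison like the one above.
# Assumes ffmpeg is on PATH and was built with libsvtav1, libx265 and libaom.
import subprocess
import time

SOURCE = "source.mkv"  # placeholder input clip

encoders = {
    "x265":    ["-c:v", "libx265",    "-preset", "medium", "-crf", "28"],
    "SVT-AV1": ["-c:v", "libsvtav1",  "-preset", "8",      "-crf", "34"],
    "libaom":  ["-c:v", "libaom-av1", "-cpu-used", "4",    "-crf", "34", "-b:v", "0"],
}

for name, args in encoders.items():
    out = f"out_{name}.mkv"
    start = time.time()
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *args, "-an", out], check=True)
    print(f"{name}: encoded in {time.time() - start:.1f}s -> {out}")
    # Quality can then be scored with ffmpeg's libvmaf filter, e.g.:
    #   ffmpeg -i out_<name>.mkv -i source.mkv -lavfi libvmaf -f null -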


On 3/30/2022 at 10:13 PM, leadeater said:

lol this looks familiar, wonder whose homework they were copying 🙃

 

[attached image]

Samsung /s

[attached image]


What is people's faith in Intel? With Arc 3 GPUs not looking that great performance-wise for a dedicated graphics card, and sitting closer to an iGPU.

Hope the price is good, and the higher end GPUs too.


On 3/30/2022 at 9:11 AM, Ryan829 said:

Kinda weird they are starting with laptop GPUs. How many people even use laptop GPUs?

Those who can't afford a desktop GPU.

Intel® Core™ i7-12700 | GIGABYTE B660 AORUS MASTER DDR4 | Gigabyte Radeon™ RX 6650 XT Gaming OC | 32GB Corsair Vengeance® RGB Pro SL DDR4 | Samsung 990 Pro 1TB | WD Green 1.5TB | Windows 11 Pro | NZXT H510 Flow White
Sony MDR-V250 | GNT-500 | Logitech G610 Orion Brown | Logitech G402 | Samsung C27JG5 | ASUS ProArt PA238QR
iPhone 12 Mini (iOS 17.2.1) | iPhone XR (iOS 17.2.1) | iPad Mini (iOS 9.3.5) | KZ AZ09 Pro x KZ ZSN Pro X | Sennheiser HD450bt
Intel® Core™ i7-1265U | Kioxia KBG50ZNV512G | 16GB DDR4 | Windows 11 Enterprise | HP EliteBook 650 G9
Intel® Core™ i5-8520U | WD Blue M.2 250GB | 1TB Seagate FireCuda | 16GB DDR4 | Windows 11 Home | ASUS Vivobook 15 
Intel® Core™ i7-3520M | GT 630M | 16 GB Corsair Vengeance® DDR3 | Samsung 850 EVO 250GB | macOS Catalina | Lenovo IdeaPad P580


8 hours ago, Quackers101 said:

What is people's faith in Intel? With Arc 3 GPUs not looking that great performance-wise for a dedicated graphics card, and sitting closer to an iGPU.

Hope the price is good, and the higher end GPUs too.

Rumours were that the delay in release was, at least in part, due to getting driver maturity up. If so, while it sucks, it would be worse if they didn't have a substantially working solution when people buy them. It isn't like they don't have experience with GPUs, although gaming-focused ones with the expected level of support are likely a learning experience for them as a company.

 

As for performance, keep in mind that so far they've only given some numbers for the 3 series mobile part. This was never going to be high end. AMD released a slide showing how the 6500M beats the A370M, but keep in mind they are comparing a 5 tier part against a 3 tier part, so it really shouldn't be a surprise. Based on the launch slide, the A550M should have about 16% more compute potential than the A370M with possibly 2x the bandwidth, so how that balances out for gaming remains to be seen. It also depends on what practical clocks may be achieved in use. Most likely for any serious modern gaming the 7 tier parts will be required, with the 5 tier maybe OK for the more price conscious willing to give up performance to lower the price.
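Rough math behind that ~16% figure, assuming the launch-slide specs of 8 Xe cores at up to 1550 MHz for the A370M and 16 Xe cores at 900 MHz for the A550M (my reading of the slide, so treat the numbers as approximate):

# Back-of-the-envelope compute estimate: Xe cores x graphics clock.
# Specs as I read them off the launch slide, so double-check before relying on them.
parts = {
    "A370M": (8, 1550),    # 8 Xe cores @ up to 1550 MHz
    "A550M": (16, 900),    # 16 Xe cores @ 900 MHz
}

compute = {name: cores * mhz for name, (cores, mhz) in parts.items()}
ratio = compute["A550M"] / compute["A370M"]
print(f"A550M vs A370M compute potential: {ratio:.2f}x (~{(ratio - 1) * 100:.0f}% more)")
# A550M vs A370M compute potential: 1.16x (~16% more)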

 

6 hours ago, BlueChinchillaEatingDorito said:

Those who can't afford a desktop GPU.

Forgot where I posted it, but I did look at the Steam Hardware Survey results for Feb. 2022. Looking only at Ampere, since it does break out laptop versions from desktop, laptop versions made up over a third of Ampere results on the survey. That's a good proportion of current gen gamer laptops out there! Over a year ago I did get both a 3070 desktop and 3070 laptop. The desktop GPU alone was £720 from memory, and the laptop was £1300. Desktop system may be higher performing overall (laptop GPU is limited to 130W) but laptop is still a decent amount of gaming power for not much total spend.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


2 hours ago, porina said:

As for performance, keep in mind that so far they've only given some numbers for the 3 series mobile part. This was never going to be high end. AMD released a slide showing how the 6500M beats the A370M, but keep in mind they are comparing a 5 tier part against a 3 tier part, so it really shouldn't be a surprise. Based on the launch slide, the A550M should have about 16% more compute potential than the A370M with possibly 2x the bandwidth, so how that balances out for gaming remains to be seen. It also depends on what practical clocks may be achieved in use. Most likely for any serious modern gaming the 7 tier parts will be required, with the 5 tier maybe OK for the more price conscious willing to give up performance to lower the price.

General note: one of the best ways to compare GPUs, when marketing etc. gets involved, is just the TDP. Performance tracks very well with that, even more so when using the same overarching silicon node, i.e. N6/N7. Transistor counts, memory bus and bandwidth, core clocks etc. are all nice information, but as far as I've seen over a very long time, TDP alone is actually a very good ballpark for performance.

 

A370M vs 6500M is actually fair from what I can see; I don't think the 6500M is an Arc 5 series product segment GPU. Both are up to 50W TDP, 64-bit bus, 4GB RAM. What we don't know is cost.
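To put that rule of thumb in sketch form, using the up-to TDP figures above plus the A770M's 120-150W range as I recall it from the launch info (ballpark numbers, not benchmarks):

# Crude TDP-based ballpark, per the "performance tracks TDP on the same node" rule of thumb.
# TDPs are "up to" figures; this is a heuristic for product-class comparisons, not a benchmark.
tdp_watts = {
    "Arc A370M": 50,
    "RX 6500M": 50,
    "Arc A770M": 150,   # top mobile Arc part, upper end of its quoted range
}

baseline = "Arc A370M"
for part, tdp in tdp_watts.items():
    print(f"{part:9s}: ~{tdp / tdp_watts[baseline]:.1f}x the power budget of {baseline}")
# Same-node parts on the same budget (A370M vs 6500M) should land in roughly the same class.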


3 hours ago, leadeater said:

General note: one of the best ways to compare GPUs, when marketing etc. gets involved, is just the TDP. Performance tracks very well with that, even more so when using the same overarching silicon node, i.e. N6/N7.

Hadn't thought of it that way before, and I'll keep that in mind for forward looking info. I note Intel uses the term "graphics power", so what does that mean? Is it the GPU under gaming or other loads? Does it include the VRAM? Just have to check those details. If we assume that is comparable, that would put the top A770M in the class of the mobile 3070.

 

Just want to make sure we don't get into an AMD chip TDP vs Nvidia TBP comparison, which was a common mistake in the past.

 

Having said that, I looked again at Nvidia laptop GPUs, and they describe it as "GPU subsystem power", so presumably that includes any power delivery and VRAM as well as the GPU itself. But the 3070 is listed as 80-125W, where I'm sure Lenovo quotes the one in my laptop as 130W. But that's a diversion and something for me to look at separately.

 

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


43 minutes ago, porina said:

I note Intel uses the term "graphics power", so what does that mean?

I would guess the same as Nvidia TGP (Total Graphics Power) or TBP (Total Board Power). Currently if you see Nvidia talking about power and TDP they mean Total Board Power.

 

If Intel is making the effort to make a distinction, then I'd guess it's for the same reasons as Nvidia, and out of the two I would say Intel means the TGP equivalent, since these aren't add-in cards.

 

50 minutes ago, porina said:

But the 3070 is listed as 80-125W, where I'm sure Lenovo quotes the one in my laptop as 130W. But that's a diversion and something for me to look at separately.

RTX 3070 Mobile at 130W is an "official" and widely used SKU option. As in, Nvidia, like on their AIB cards, allows vendors to set power limits within their allowed guidelines, so there are many laptops with 130W RTX 3070s in them.

 

[attached image]


On 3/31/2022 at 4:03 PM, Ydfhlx said:

Could somebody please clarify how efficient the AV1 codec actually is? I heard it's comparable to HEVC but royalty-free, I'm not totally sure though.

It's open source, and about 20% better quality than HEVC at the same bitrate, or about 20% smaller files at the same quality.

Obviously, encoding it was a huge pain and took a ton of time because it is very demanding, so having a dedicated HW accelerated encoder is a huge plus for this to get widely adopted. I bet that many people will buy their GPUs just for AV1 encode itself.


On 3/31/2022 at 1:17 PM, porina said:

My thinking was more along the lines of who could/would use it NOW? If we're talking a year away, red and green may have implemented similar too, so it would no longer be a feature advantage for Intel.

Just saw this, which is what I was kinda looking for but not getting details:

 

 

 

 

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

