
Oh yeah? Well we didn't even want that much performance anyway - Intel Arc to prioritize "mainstream" markets with 200W max power targets.

BachChain

Summary

 

In an interview with Gadgets 360, the head of Intel's graphics division, Raja Koduri, stated that the current plan for Arc is not to compete at the ultra-high end on absolute performance, but to focus on performance-per-watt and "mass-market" products. As part of this goal, Intel will apparently be limiting current and future Arc products to a single PCIe power connector and a TBP of 200-225 watts.
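
For context on where that 200-225W figure comes from: the PCIe slot itself is specced to deliver up to 75W, a 6-pin connector adds up to 75W, and an 8-pin adds up to 150W. A minimal sketch of the arithmetic (Python, with those spec maximums hard-coded as assumptions):

# Rough board-power ceilings per aux-connector configuration.
# 75W slot / 75W 6-pin / 150W 8-pin are the PCIe spec maximums.
PCIE_SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def board_power_budget(connectors):
    # Maximum sustained board power for a given list of aux connectors.
    return PCIE_SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_budget([]))          # 75W  - slot-powered cards
print(board_power_budget(["6-pin"]))   # 150W
print(board_power_budget(["8-pin"]))   # 225W - the "one connector" ceiling

So "one power connector" caps a card at 225W board power, which is exactly the range Koduri cites in the quotes below.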

 

Quotes

Quote

We're seeing modern GPUs consuming ridiculous amounts of power, even though manufacturers have moved to more efficient process nodes. 600W and 800W power supplies are becoming the norm now. Will Intel also follow this trend?

 

Raja Koduri: Performance per Watt, or delivering higher performance at lower power, is my top priority. There will always be someone with some skill who can say “I'm going to give you more juice”, but my focus is lower power. The other issue I find with just increasing power and bragging about benchmarks is that while it's good from a marketing standpoint, [there is a limited] number of PC users who can just buy such a card and plug it in. It dramatically reduces your overall market, right?

Quote

That mass-market approach, would that mean that you primarily focus on the mid- and lower-tier SKUs first and then push out high-end ones?

 

Raja Koduri: High-end has no limit right now. What is the definition of high-end? Is it 600 Watts? Obviously our partners and our customers want some halo SKUs for bragging rights, and we always like to figure out ways to enable that. But my priority at this point is getting that core audience, with one power connector. And that can get you up to 200-225W. If you nail that, and something a little above and a little below, all that falls into the sweet spot.

 

My thoughts

While this definitely feels like damage control for what has so far been an all-around underwhelming first-generation product, it's still nice to see someone care about that market segment.

 

Sources

https://www.gadgets360.com/laptops/features/raja-koduri-interview-intel-arc-launch-price-ai-xe-gaming-in-india-meteor-lake-integrated-3585931


3 minutes ago, BachChain said:

this definitely feels like damage control for what has so far been an all-around underwhelming first-generation product

I don't know about that... this seems like Koduri saying what was obvious to anyone paying attention: Intel was going to have to compete on something other than raw performance for the first few generations. The idea that they could come out of nowhere and compete with Nvidia or AMD at the top end was always wishful thinking. It's a sound business strategy to establish a foothold in the midrange market, where performance per watt still sort of matters and where most people reside anyway, then maybe move on to the high end.

 

I'd agree with him that getting something out that competes with a 3090 may have been possible for them, but it would have required insane power limits that wouldn't have been justified given Intel's weak position as the new kid on the block; why would you buy an Arc over a 3090 if the price were similar and the performance per watt worse?

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


Gotta walk before you run. Intel feels like AMD in the Polaris era: if you know you're not going up against the high end, you play for the volume/value market in the lower to mid range. Build share and experience before attacking higher up. Even AMD has failed to make a dent in Nvidia's domination, and I do feel Intel can overtake AMD for the 2nd spot in dGPU sales within a few generations.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


This is the play they've been going for since the start. The market of people buying RTX 4090s is tiny; the market buying RTX 3060s is huge.

It's why they marketed the A770 the way they did: "between the 3060 and 3060 Ti". They need to scoop out a bit of that much larger market.
It's a smart move for a variety of reasons, but it's also how they'll win people over: by simply having a better product in that common midrange market.


I would have been floored had Intel's first-ever dedicated GPU been able to go head to head with, or even surpass, the other two established players. That they got as close as they did on the first attempt is nothing short of amazing, and I hope to get one of their cards one day (assuming I ever buy a system new enough... and they sort out the driver teething issues).

 

Absolutely hats off to Intel for even daring to enter this market. 

NOTE: I no longer frequent this site. If you really need help, PM/DM me and my e.mail will alert me. 


7 minutes ago, 8tg said:

It's why they marketed the A770 the way they did: "between the 3060 and 3060 Ti".

The problem is that the die is a lot bigger than the 3060's (406mm² vs 276mm²).

It's basically the same size as the 3060 Ti's (~400mm²), but the 3060 Ti is on Samsung's cheaper 8nm node, while Intel is using TSMC 6nm.
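
To put the die-size gap in cost terms, here's a standard back-of-the-envelope dies-per-wafer approximation (wafer area over die area, minus an edge-loss term; it ignores yield and scribe lines, so treat the exact numbers as illustrative):

import math

WAFER_D = 300  # mm, standard wafer diameter

def dies_per_wafer(die_area_mm2):
    # Classic approximation: gross dies = wafer area / die area,
    # minus a loss term proportional to the wafer circumference.
    r = WAFER_D / 2
    return math.pi * r**2 / die_area_mm2 - math.pi * WAFER_D / math.sqrt(2 * die_area_mm2)

print(round(dies_per_wafer(406)))  # ~141 die candidates (ACM-G10 / A770)
print(round(dies_per_wafer(276)))  # ~216 die candidates (GA106 / 3060)

That's roughly 50% more candidate dies per wafer for the 3060's silicon, before even accounting for TSMC N6 wafers costing more than Samsung 8nm ones.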


5 minutes ago, Forbidden Wafer said:

The problem is that the die is a lot bigger than the 3060's (406mm² vs 276mm²).

It's basically the same size as the 3060 Ti's (~400mm²), but the 3060 Ti is on Samsung's cheaper 8nm node, while Intel is using TSMC 6nm.

The average consumer doesn't know or care about any of that; it performs better than a 3060 and costs less than a 3060.

That's also the big reason behind the 16GB one: it has to have a bigger number than the 12GB 3060.

Same reason there was an 8GB RX 470. They're not targeting users of this forum; they're targeting the guy who bought a GTX 1060 a few years ago, can't play Cyberpunk all that well right now, and knows he needs a better GPU. That person isn't informed on all the fine details; they just want a modern equivalent, so they gravitate towards the 3060.

Intel then shows off "hey, we have a cheaper product than the 3060 that performs better" and bolsters that viewpoint with 16GB of video memory.

Big companies don't look at this stuff the same way us generalized consumer computer nerds do.


1 hour ago, 8tg said:

The average consumer doesn't know or care about any of that; it performs better than a 3060 and costs less than a 3060.

I meant that they're selling it with a tiny margin, possibly at a loss given their investment. They were even talking about writing off the entire division.


7 hours ago, BachChain said:

While this definitely feels like damage control for what has so far been an all-around underwhelming first-generation product, it's still nice to see someone care about that market segment.

Yeah nah.

 

You don't design high-end parts just to sell them to gamers. Products like the 4090 probably wouldn't be profitable if that were the only use for that piece of silicon - the development costs for something like AD102 would be far too high for the comparatively tiny number of units they actually end up selling. No, you make the high-end 4090 as a way of selling leftover, not-quite-perfect GPUs to gamers, while the proper, fully functional chips are sold to professionals as Quadros for 4x the price. That's where the actual profit from that piece of silicon comes from.

 

And this is why RTX is even a thing. The reason we have tensor cores and RT cores as consumers is that Nvidia wanted to sell them to professionals, who benefit far more from them than you or I. But because Nvidia uses the same pieces of silicon for both markets, they needed a way to market those features to gamers in order to justify the die space: RTX.

 

Intel knows that they have no customers in the professional space right now. Even if they were to make high-end workstation GPUs, nobody would buy them as the software support and reputation simply isn't there. And as such, they know that designing high-end GPUs simply wouldn't be profitable for them today.

 

Intel's GPU business is in a very similar place today to where AMD's CPU business was when they released 1st-gen Ryzen. Nobody wanted to put one in a server for the exact same reasons - sure, Zen was a decent product, but Intel's was tried and trusted. Even today Intel still has over 85% market share in the server space, despite the huge benefits offered by EPYC. That's just how the professional world works, and Intel is going to be fighting that exact same battle in the pro GPU space against Nvidia, which dominates it to a similar degree.

 

And so it makes sense not to even try to compete at the high end at first. Instead of designing GPUs that won't be profitable anyway, you may as well spend that money on drivers and software to create a package that people actually want to buy, so that you can build up a solid foundation - and a reputation - upon which to produce more competitive products in the coming years.

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


21 hours ago, Heliian said:

I think this is quite important as no-one else is even talking about power savings and efficiency. 

 

 

Imagine actually being able to game while on battery - and not just shredding through a fully charged battery in an hour of gaming and destroying your battery health in the process.

Intel® Core™ i7-12700 | GIGABYTE B660 AORUS MASTER DDR4 | Gigabyte Radeon™ RX 6650 XT Gaming OC | 32GB Corsair Vengeance® RGB Pro SL DDR4 | Samsung 990 Pro 1TB | WD Green 1.5TB | Windows 11 Pro | NZXT H510 Flow White
Sony MDR-V250 | GNT-500 | Logitech G610 Orion Brown | Logitech G402 | Samsung C27JG5 | ASUS ProArt PA238QR
iPhone 12 Mini (iOS 17.2.1) | iPhone XR (iOS 17.2.1) | iPad Mini (iOS 9.3.5) | KZ AZ09 Pro x KZ ZSN Pro X | Sennheiser HD450bt
Intel® Core™ i7-1265U | Kioxia KBG50ZNV512G | 16GB DDR4 | Windows 11 Enterprise | HP EliteBook 650 G9
Intel® Core™ i5-8520U | WD Blue M.2 250GB | 1TB Seagate FireCuda | 16GB DDR4 | Windows 11 Home | ASUS Vivobook 15 
Intel® Core™ i7-3520M | GT 630M | 16 GB Corsair Vengeance® DDR3 |
Samsung 850 EVO 250GB | macOS Catalina | Lenovo IdeaPad P580


11 hours ago, BachChain said:

My thoughts

While this definitely feels like damage control for what has so far been an all-around underwhelming first-generation product, it's still nice to see someone care about that market segment.

 

Well, Intel can only do what it has the leverage to do, and its current process nodes DO NOT give it that leverage.

10 hours ago, Sauron said:

I don't know about that... this seems like Koduri saying what was obvious to anyone paying attention: Intel was going to have to compete on something other than raw performance for the first few generations.

If Intel is smart, they'll realize that their cards should never exceed two-slot width. Figure out how to stay within that space, because these 2.x- and 3.x-slot widths require motherboard and chassis configurations that aren't reasonable.

 

It was already "unreasonable" to take even 2 slots, but we let that slide; 3-slot cards should never, ever have been a thing in a consumer product. Nvidia and AMD should not have made those cards consumer cards, with the expectation that you can just go to BB, buy any one, and fit it in your HP/Dell. Nvidia should have kept the Titan name for cards that require more than 2 slots of cooling.

 

Likewise for Intel and AMD with their CPUs: if it requires liquid cooling, it should not carry the same labeling as the air-cooled processors. Such parts fundamentally require a specific chassis and cooling setup.

 

As Intel has generally produced extremely under-performing iGPU parts, they need to draw the line somewhere: "okay, we know our parts are not going to be RTX 3090 killers at 300 watts, so let's just pick a specific power target and try to beat the competitors there." The 3060 is "enough" for people who aren't running 4K, so that seems like the target to "kill" first.

 

 


9 hours ago, Forbidden Wafer said:

I meant that they're selling it with a tiny margin, possibly at a loss given their investment. They were even talking about writing off the entire division.

Moving into a market is going to be a costly exercise. It would be nice if sales were profitable right away, but this is a longer game.

 

I don't think I've heard any credible talk about Intel killing Arc outside the noise of gossip sites. Even with the cuts they're making elsewhere, it wasn't mentioned even as a thought.

 

9 hours ago, tim0901 said:

Intel knows that they have no customers in the professional space right now. Even if they were to make high-end workstation GPUs, nobody would buy them as the software support and reputation simply isn't there. And as such, they know that designing high-end GPUs simply wouldn't be profitable for them today.

There is the Ponte Vecchio offering, but I guess that overshoots the workstation level and goes straight to HPC. Not sure if it stops at enterprise on the way.

 

9 hours ago, tim0901 said:

Intel's GPU business is in a very similar place today to where AMD's CPU business was when they released 1st-gen Ryzen.

I still wonder if Polaris is a better comparison, at least staying within the same market. Polaris was not made to compete at the top end. Looking back, maybe Polaris isn't a perfect example, since AMD also had other consumer-facing designs above it, like Fury, with Vega stepping in later. Still, if we follow the Ryzen analogy, then Celestial might be the generation to look at: the first two Ryzen offerings (Zen, Zen+) had notable weaknesses vs Intel that were largely addressed with Zen 2, which was also when AMD gained the confidence to push harder. Battlemage development would already be well underway, so the learnings from Alchemist it can incorporate will be limited.

 

Early rumours put Arc against the 3070, and that might still have been Intel's target performance given the die size and other factors.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


It is quite often very profitable for a company to provide a decent product at an attractive price. Consumers (particularly those who talk and make recommendations on forums) are often less inclined to make a big deal out of smaller product issues if the overall product is great for the budget and has a decent reputation for being solid. I have seen this with some Aldi products (their 4x4 winches were in high demand here in Aus after a stint of really well-performing winches that were literally half the price of the cheapest named brand at the time). It also happened with a set of bookshelf speakers from Dick Smith back in 2003: they were priced at $100 a pair yet were touted on many forums as sounding as good as named brands costing upwards of $300.

The best thing about Intel's position right now is that they can make bigger and better stuff down the line, but because they will have built a reputation on "cheap and cheerful", that reputation will be more robust than if they had come out of the gates with a hellraiser of a GPU and then become one-hit wonders.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


I don't think this is Intel doing damage control. I think it's a sound strategy that I hope they stick to. 

 

Making these $1000+ graphics cards looks good in marketing, but realistically they're just halo products designed for bragging rights. The big majority of people want mid-range stuff that doesn't break the bank, and focusing on that is what will benefit most gamers.


2 hours ago, Marko1600 said:

I'm willing to bet that this is somehow related to Apple and Laptops.

Laptops I can see... but Apple? They went in-house with their own silicon...

 

NOTE: I no longer frequent this site. If you really need help, PM/DM me and my e.mail will alert me. 


If you ask me, more pressure in the middle drags down prices on the mid/high and high end, making the overall market more affordable. Considering how few 4080s are being sold, it doesn't make much sense to compete on high-end cards. If Intel puts enough pressure on Nvidia (AMD is already applying pressure on price), the 4060 tier will have to be very low margin for those cards to compete. Hopefully that pressure pushes more companies to do what EVGA did and drop Nvidia's contracts, which would create a better industry overall.

The best gaming PC is the PC you like to game on, how you like to game on it


2 hours ago, GhostRoadieBL said:

If you ask me, more pressure in the middle drags down prices on the mid/high and high end, making the overall market more affordable. Considering how few 4080s are being sold, it doesn't make much sense to compete on high-end cards. If Intel puts enough pressure on Nvidia (AMD is already applying pressure on price), the 4060 tier will have to be very low margin for those cards to compete. Hopefully that pressure pushes more companies to do what EVGA did and drop Nvidia's contracts, which would create a better industry overall.

The 4080s aren't selling like hotcakes because they're not a good value for the dollar.

It's the single worst value:

[Chart: GPU value (performance per dollar) scatter plot]

src: https://www.videocardbenchmark.net/gpu_value.html#xy_scatter_graph 

The next two worst values are the 3080 Ti and 3090 Ti.

The best-value cards (relatively speaking, not the cheapest) are the 1660/2060/3060/3060 Ti, and if you draw a line connecting all the Nvidia parts, you'd see an extremely steep curve starting from the 3060.

 

Their value is actually not that much worse than that of the 13th-gen Intel i9 CPUs and the AMD 7950X.

 

What you want to see on these charts is a straight line, or something pretty close to one. When you see a curve, that means the value doesn't scale at all.
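
To make "straight line vs curve" concrete, here's a toy perf-per-dollar calculation with made-up (price, performance) points - purely illustrative, not real benchmark data:

# Hypothetical (price $, relative performance) tiers - illustrative only.
cards = {
    "mid-tier":  (300, 100),
    "high-tier": (700, 180),
    "halo":      (1200, 220),
}

for name, (price, perf) in cards.items():
    print(f"{name:9s}: {perf / price:.3f} perf per dollar")
# mid-tier : 0.333
# high-tier: 0.257
# halo     : 0.183
# Flat perf/price across tiers would plot as a straight line;
# falling numbers like these are the steep curve described above.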

 


1 hour ago, Kisai said:

The 4080s aren't selling like hotcakes because they're not a good value for the dollar.

src: https://www.videocardbenchmark.net/gpu_value.html#xy_scatter_graph 

The next two worst values are the 3080 Ti and 3090 Ti.

The best-value cards (relatively speaking, not the cheapest) are the 1660/2060/3060/3060 Ti, and if you draw a line connecting all the Nvidia parts, you'd see an extremely steep curve starting from the 3060.

That's only a factor for informed consumers; the other, arguably more important, factor is price vs average income, which directly impacts the tier of cards the majority of the population can purchase regardless of fps/$.

It's the reason the best-selling cards are in the $200-300 range regardless of relative performance.

The best gaming PC is the PC you like to game on, how you like to game on it


I'm good with high end being 250 watts. Anything more than that is too much of a space heater for me lol

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

Spoiler

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


It is not a matter of "we didn't want to" but rather "we weren't able to"; this is just pure damage control.

Head down, work on the next project, and deliver something with good price-to-performance that, for the good of both the company and the consumer, is competitive enough to fit into the AMD/Nvidia space.


6 hours ago, GhostRoadieBL said:

That's only a factor for informed consumers; the other, arguably more important, factor is price vs average income, which directly impacts the tier of cards the majority of the population can purchase regardless of fps/$.

It's the reason the best-selling cards are in the $200-300 range regardless of relative performance.

Generally, "nobody" wants/needs a $1200 GPU. The x90, and even the x80/x80Ti/x80Super tier parts have always been priced out of the range of consumer's and are well into HEDT territory, Unfortunately neither Intel or AMD have released CPU's to match a HEDT, their i9/79xx CPU's are basically consumer tier parts with the TDP uncapped, and the PCIe lanes nerfed. At least the Threadripper gives the illusion of being HEDT but it's really not a good value compared to a EPYC chip with the same configuration. Good luck getting a workstation board for an EPYC.

 

Just look at the actual slot and USB port configuration on an ASUS Pro WS WRX80E (sWRX8); it absolutely embarrasses the "consumer" boards. Yet good luck finding competition here. If you have money to burn, why are you spending it on Intel i9/Ryzen x9xx parts that force multiple compromises? Hence, again, why do these parts exist at all except to stroke the egos of their manufacturers for having the fastest power-hungry parts? The GeForce xx90 parts belong in that category of "yeah, you could put this on an i3, but why?"

 

When you're already spending thousands of dollars on the highest-end parts, why would you compromise that build? I personally feel that Intel and AMD "cheapened" their CPUs for a typical user who only has one GPU, one SSD, and NOTHING else, but then all this USB 3.2 and Thunderbolt stuff comes out and now that pitiful 24 PCIe lanes isn't sufficient for more than one fast USB port.
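
For illustration, the lane math on a mainstream desktop platform (the 24-lane figure and the x4 chipset uplink are assumptions that roughly match recent consumer CPUs; exact numbers vary by platform):

# Back-of-envelope CPU lane budget - numbers are assumptions, not a spec quote.
CPU_LANES = 24

devices = {
    "GPU (x16 slot)":      16,
    "NVMe SSD (x4)":        4,
    "chipset uplink (x4)":  4,  # everything else - USB, SATA, extra M.2 - shares this
}

used = sum(devices.values())
print(f"{used}/{CPU_LANES} lanes used, {CPU_LANES - used} to spare")
# -> 24/24 lanes used, 0 to spare: every added USB/Thunderbolt controller
#    ends up contending for the same chipset uplink.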

 


28 minutes ago, Kisai said:

Generally, "nobody" wants/needs a $1200 GPU.

"nobody" might be a small niche, but a small portion of a big number can still be a big number. If people really need performance, they'll pay for it if they can. If they can't, we get into value judgements. Most people will have to decide how much performance they want to pay for.

 

On a similar note, I feel many people aren't asking the right question with "what's the best performance within a budget" when they really mean "what will do the job I want at a price I can afford". It's a subtle distinction. Once you hit a "good enough" performance level, it usually switches to "how little can I pay for it?" I think that's where Intel might aim in the early gens.

 

This is also in part why I tend to look at performance first, then pricing. Perf doesn't really change much over a product's lifetime - maybe a bit more for relatively new products like Arc, where driver updates can have more impact than usual. Pricing is a choice and can vary a lot more. It has to be considered at some point, but I tend to look at it later in the process, not as a starting point.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


I think Intel couldn't make a high-end GPU, and Nvidia/AMD have completely abandoned the market for mid-range GPUs between $200 and $300 at 200W.

 

If there is one spot where Intel can enter, it's there.

On 12/11/2022 at 11:32 AM, porina said:

Early rumours put Arc against the 3070, and that might still have been Intel's target performance given the die size and other factors

I agree, Intel meant for the A770 to compete against the RTX 3070 (GA104).


I look forward to the B770 card. Intel only needs to make something competitive vs the RTX 5060 to sell GPUs in volume.

