
The Philosophy of Arc GPUs from Intel

FuuMasta

So I was thinking about the best possible route Intel could take to gain a massive share of the GPU market.

 

The truth is that Intel will probably never enter the top-of-the-line GPU market with performance on the level of the 4080/4090 or their equivalents every gen anyway, BUT...

Intel could choose to become the reasonably priced GPU manufacturer that absolutely FOCUSES on efficiency and performance per watt (FPS per watt and GFLOPS per watt).

 

We have to get real at some point: GPUs get more powerful every generation, but they also use more power.

RTX 2070 --------- 175W -------- 6,497 GFlops ---------------- 37.13 GFlops/W

RTX 2080 --------- 215W -------- 8,920 GFlops ---------------- 41.49 GFlops/W

RTX 3070 --------- 220W -------- 17,664 GFlops ------------- 80.29 GFlops/W

RTX 3080 --------- 320W -------- 25,068 GFlops -------------- 78.34 GFlops/W

RTX 3090 --------- 350W -------- 29,284 GFlops -------------- 83.67 GFlops/W

RTX 4080 --------- 320W -------- 42,998 GFlops -------------- 134.37 GFlops/W

RTX 4090 --------- 450W -------- 73,073 GFlops -------------- 162.38 GFlops/W

 

This is unsustainable: electricity cost and scarcity are problems that grow every year, and with the 40 Series we can now see how bad this path is hardware-wise.

So yes, I think Intel should take that path, and if they haven't made that decision yet, I hope this thread becomes popular enough to catch their attention.

 

Sadly, at the moment they seem to be behind what I was hoping for:

A770 16GB ------ 225W -------- 17,203 GFlops --------------- 76.46 GFlops/W
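
For anyone who wants to check or extend these lists, the per-watt column is just peak GFLOPS divided by board power. A quick Python sketch that reproduces it from the numbers above:

```python
# Reproduce the GFLOPS-per-watt column from the board power and peak GFLOPS listed above.
cards = {
    "RTX 2070":      (175, 6_497),
    "RTX 2080":      (215, 8_920),
    "RTX 3070":      (220, 17_664),
    "RTX 3080":      (320, 25_068),
    "RTX 3090":      (350, 29_284),
    "RTX 4080":      (320, 42_998),
    "RTX 4090":      (450, 73_073),
    "Arc A770 16GB": (225, 17_203),
}

for name, (watts, gflops) in cards.items():
    print(f"{name:<14} {gflops / watts:7.2f} GFLOPS/W")
```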

 

Somehow, I hope this idea gains popularity and that this is the path Intel chooses to take.


Besides the 3080, your chart has Nvidia consistently getting more efficient with newer architectures, and more efficient higher up the SKU list within the same architecture. Looks like Nvidia is already doing what you want Intel to do?

That's the natural progression of things, and Intel will have to follow it. Same power and same performance as last gen won't sell. Same performance with more power won't sell. More performance at the same power, or more performance with slightly more power, will sell.
 

If power is a big worry, look at getting a more power-hungry card and undervolting it.
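
For what it's worth, the easily scriptable cousin of undervolting is capping the board power limit; a proper undervolt is usually done on the voltage/frequency curve in a tool like MSI Afterburner. A minimal sketch using nvidia-smi (NVIDIA cards only; setting the limit needs admin/root and the value must be within the card's allowed range):

```python
import subprocess

# Read the current power draw and configured limit (read-only query).
query = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,power.draw,power.limit", "--format=csv"],
    capture_output=True, text=True, check=True,
)
print(query.stdout)

# Cap GPU 0 at 250 W (example value; needs admin/root privileges).
subprocess.run(["nvidia-smi", "-i", "0", "-pl", "250"], check=True)
```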



4 minutes ago, IkeaGnome said:

Looks like Nvidia is already doing what you want Intel to do?

2 Things:

-I'm calling for Nvidia to stop the rise of power usage, for the sake of the environment.

-I'm blaming Intel for not being able to beat the efficiency of the 3080, a 2+ year old GPU.


5 minutes ago, FuuMasta said:

-I'm calling for Nvidia to stop the rise of power usage, for the sake of the environment.

You don't have to buy the top model. As long as efficiency continues to increase, you can pick lower models and still see improvements gen on gen.

 

5 minutes ago, FuuMasta said:

-I'm blaming Intel for not being able to beat the efficiency of the 3080, a 2+ year old GPU.

FLOPS might be useful as an indicator of compute performance, but it definitely isn't a good indicator of gaming performance across architectures/generations.

 

I'd also be more interested in architecture efficiency than product efficiency: since products can be positioned at any point on the efficiency curve, it's more interesting to compare the curves themselves.
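
To make "comparing the curves" concrete, one way to trace a single card's curve is to sweep its power limit and re-run the same fixed workload at each step. A rough sketch, assuming an NVIDIA card controlled via nvidia-smi; run_benchmark() is a placeholder for whatever workload you score with, not a real API:

```python
import subprocess

def set_power_limit(watts: int, gpu: int = 0) -> None:
    """Cap the board power via nvidia-smi (needs admin/root, NVIDIA only)."""
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)], check=True)

def run_benchmark() -> float:
    """Placeholder: run a fixed GPU workload and return a score (e.g. average FPS)."""
    raise NotImplementedError

# Sweep the power limit and record score-per-watt at each point;
# plotting watts against score/W gives the card's efficiency curve.
curve = []
for watts in range(150, 351, 25):
    set_power_limit(watts)
    score = run_benchmark()
    curve.append((watts, score, score / watts))

for watts, score, eff in curve:
    print(f"{watts:>3} W  score {score:8.1f}  score/W {eff:.3f}")
```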

 

The hardware was "done" about a year ago. The extra wait was mostly to get software in a better state. Give them a couple of gens and I think they'll have a good go at knocking AMD off 2nd spot. This takes time.



16 minutes ago, FuuMasta said:

-I'm calling for Nvidia to stop the rise of power usage, for the sake of the environment.

Blame gamers and everyone else who wants more performance every generation. Many generations of screaming for more performance, without giving efficiency time to catch up to those performance jumps, is on you, me, and everyone else.

17 minutes ago, FuuMasta said:

-I'm blaming Intel for not being able to beat the efficiency of the 3080, a 2+ year old GPU.

Cards have been "out" for a while now; drivers kept them from being released. I think you had too high expectations for their first round of cards. I wouldn't expect them to be competitive until later this decade.



4 hours ago, IkeaGnome said:

Blame gamers and everyone else who wants more performance every generation. Many generations of screaming for more performance, without giving efficiency time to catch up to those performance jumps, is on you, me, and everyone else.

The other thing that seriously bugs me is how mobile hardware is always a shit-ton more efficient.

If we compare a 3080 to a 3080m, the desktop uses 2.5x to 3x more power, on both the GPU and the CPU, than the laptop with the 3080m does.

However, you only get a 30-40% increase in FPS.

How can this be acceptable or possible to begin with? I have no idea.
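
Back-of-the-envelope, using the figures I quoted above (these are the claimed ratios, not measurements):

```python
# Implied efficiency gap: desktop draws ~2.5x the power for ~1.35x the FPS.
power_ratio = 2.5   # desktop power / laptop power (claimed 2.5x-3x)
fps_ratio = 1.35    # desktop FPS / laptop FPS (claimed +30-40%)

desktop_fps_per_watt = fps_ratio / power_ratio                      # laptop = 1.0
print(f"Desktop FPS/W vs laptop: {desktop_fps_per_watt:.2f}x")      # ~0.54x
print(f"Laptop FPS/W advantage:  {1 / desktop_fps_per_watt:.2f}x")  # ~1.85x
```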

 


2 minutes ago, FuuMasta said:

How can this be acceptable or possible to begin with? I have no idea.

 

 


