Are CPUs destined to become irrelevant for gaming?

YoungBlade

When I look back on the games I've played throughout the years, I notice that the amount of RAM a game needs seems to have stagnated in the last decade. It took until just recently for 16GB to overtake the 8GB I was recommended back in 2013. Before that, it wasn't uncommon for new games to require a RAM upgrade.

 

Will the same thing happen to the CPU?

 

Yes, some games benefit from 6+ cores or high single-core performance, but most can get by with any old Intel quad-core from the last decade.

 

Imagine trying to use a 2GHz Pentium 4 to play Skyrim or Arkham City. That's the equivalent of gaming today on the 2600k, except that the former would have been a disaster and the latter is often doable. The fact that the 2600k doesn't absolutely choke to death on AAA games is bonkers compared to the prior decade.

 

Yes, some titles are CPU bound, but often only at 1080p and below. At resolutions above 1080p, which are becoming more common, newer titles generally stop caring about the CPU unless the graphics card is top-of-the-line. At 4K, even a 3090 won't notice whether you run a 5600X or 5950X.
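A toy model of why that happens (all the frame times below are invented for illustration): a frame can't finish faster than its slowest stage, so once the GPU's share of the frame time dominates, swapping CPUs changes nothing.

```python
# Toy model: CPU and GPU work overlap, so frame time is roughly
# the slower of the two stages. Numbers are made up, not measurements.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 4.0, 6.0    # ms of game logic per frame for two CPUs
gpu_1080p, gpu_4k = 5.0, 16.0    # ms of rendering per frame at two resolutions

print(fps(cpu_fast, gpu_1080p))  # 200.0  -> CPU choice visible at 1080p
print(fps(cpu_slow, gpu_1080p))  # ~166.7
print(fps(cpu_fast, gpu_4k))     # 62.5   -> identical at 4K...
print(fps(cpu_slow, gpu_4k))     # 62.5   ...because the GPU is the bottleneck
```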

 

Do you think this trend will continue like what happened with RAM, and that people will comfortably game on a 5900X or 10900k in 2031? Or do you think games will start to increase in requirements more rapidly in the 2020s than they did in the 2010s, relegating today's CPUs to the e-waste bin?


TLDR. But CPUs are definitely not going to be irrelevant for gaming, no.

mY sYsTeM iS Not pErfoRmInG aS gOOd As I sAW oN yOuTuBe. WhA t IS a GoOd FaN CuRVe??!!? wHat aRe tEh GoOd OvERclok SeTTinGS FoR My CaRd??  HoW CaN I foRcE my GpU to uSe 1o0%? BuT WiLL i HaVE Bo0tllEnEcKs? RyZEN dOeS NoT peRfORm BetTer wItH HiGhER sPEED RaM!!dId i WiN teH SiLiCON LotTerrYyOu ShoUlD dEsHrOuD uR GPUmy SYstEm iS UNDerPerforMiNg iN WarzONEcan mY Pc Run WiNdOwS 11 ?woUld BaKInG MY GRaPHics card fIX it? MultimETeR TeSTiNG!! aMd'S GpU DrIvErS aRe as goOD aS NviDia's YOU SHoUlD oVERCloCk yOUR ramS To 5000C18

 


As with everything else, current CPUs will become obsolete for modern games at some point in the future. But no one can say when that will happen, and don't expect a new 6-core CPU to be obsolete in the next few years.

 

Also, CPU scaling with a 3090 will level off to some degree, but that doesn't mean your FPS would be the same with a Core 2 Duo. At higher resolutions the CPU becomes less important, but regardless of resolution, the CPU is the second most important part for games after your GPU, and that will not change. There is a lot of work that the GPU cannot do as efficiently as a CPU, and vice versa.

 

So saying when today's CPUs will become obsolete is basically impossible.

 

But by the time current CPUs are obsolete, the rest of the system (GPU, RAM, etc.) will also be lacking in performance.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


I mean, it's not like video games suddenly require less hardware; it's about how developers make their game engines utilize the hardware. Seeing how AMD is about to make 20+ cores mainstream, video games in 5-10 years or so will probably start to take advantage of 20+ CPU cores, because the engines for those games will have been optimized for it.

PC Setup: 

HYTE Y60 White/Black + Custom ColdZero ventilation sidepanel

Intel Core i7-10700K + Corsair Hydro Series H100x

G.SKILL TridentZ RGB 32GB (F4-3600C16Q-32GTZR)

ASUS ROG STRIX RTX 3080Ti OC LC

ASUS ROG STRIX Z490-G GAMING (Wi-Fi)

Samsung EVO Plus 1TB

Samsung EVO Plus 1TB

Crucial MX500 2TB

Crucial MX300 1TB

Corsair HX1200i

 

Peripherals: 

Samsung Odyssey Neo G9 G95NC 57"

Samsung Odyssey Neo G7 32"

ASUS ROG Harpe Ace Aim Lab Edition Wireless

ASUS ROG Claymore II Wireless

ASUS ROG Sheath BLK LTD

Corsair SP2500

Beyerdynamic DT 770 PRO X (Limited Edition) & Beyerdynamic TYGR 300R + FiiO K7 DAC/AMP

RØDE VideoMic II + Elgato WAVE Mic Arm

 

Racing SIM Setup: 

Sim-Lab GT1 EVO Sim Racing Cockpit + Sim-Lab GT1 EVO Single Screen holder

Svive Racing D1 Seat

Samsung Odyssey G9 49"

Simagic Alpha Mini

Simagic GT4 (Dual Clutch)

CSL Elite Pedals V2

Logitech K400 Plus


If anything, CPUs are going to become even more important as developers continue to add more physics and general complexity to their games. Are there games you can run on a quad core? Sure, but they're relatively simple in functionality, mostly run-and-shoot on a 2D map with a handful of enemies on screen at any given time. Games like Cyberpunk will easily scale to 12 cores/24 threads because of the complexity of everything happening on screen at any given time.

CPU: AMD Ryzen 9 5900X · Cooler: Arctic Liquid Freezer II 280 · Motherboard: MSI MEG X570 Unify · RAM: G.Skill Ripjaws V 2x16GB 3600MHz CL16 (2Rx8) · Graphics Card: ASUS GeForce RTX 3060 Ti TUF Gaming · Boot Drive: 500GB WD Black SN750 M.2 NVMe SSD · Game Drive: 2TB Crucial MX500 SATA SSD · PSU: Corsair White RM850x 850W 80+ Gold · Case: Corsair 4000D Airflow · Monitor: MSI Optix MAG342CQR 34” UWQHD 3440x1440 144Hz · Keyboard: Corsair K100 RGB Optical-Mechanical Gaming Keyboard (OPX Switch) · Mouse: Corsair Ironclaw RGB Wireless Gaming Mouse


2 minutes ago, Chris Pratt said:

If anything, CPUs are going to become even more important as developers continue to add more physics and general complexity to their games. Are there games you can run on a quad core? Sure, but they're relatively simple in functionality, mostly run-and-shoot on a 2D map with a handful of enemies on screen at any given time. Games like Cyberpunk will easily scale to 12 cores/24 threads because of the complexity of everything happening on screen at any given time.

Physics is a poor example. Simulations are almost impossible to parallelize by their nature. You can't have an accurate simulation that has all of its parts running independently. Game engines might figure out how to cheat, and give reasonably accurate simulations that aren't single threaded, but it would require some fundamental changes to how such games are coded.

 

AI for NPCs, however, is a good example of how future games could leave current CPUs in the dust. Those can each "think" independently of the world thread.
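To make the contrast concrete, here's a minimal sketch (a generic time-stepped loop of my own, not how any real engine works): the physics steps form a serial chain because each one needs the previous result, while the NPCs can all think at once against a frozen snapshot of the world.

```python
from concurrent.futures import ThreadPoolExecutor

def physics_step(state: dict) -> dict:
    # Step N needs the *result* of step N-1, so the steps themselves
    # must run one after another, no matter how many cores you have.
    return {body: pos + 1 for body, pos in state.items()}  # stand-in integration

def npc_think(npc_id: int, world_snapshot: dict) -> str:
    # Each NPC reads a frozen snapshot and decides independently,
    # so all of them can run at once on separate cores.
    return f"npc {npc_id} saw {len(world_snapshot)} bodies"

state = {"crate": 0, "door": 0}
for _ in range(3):                  # serial chain across time steps
    state = physics_step(state)

with ThreadPoolExecutor() as pool:  # embarrassingly parallel across NPCs
    decisions = list(pool.map(lambda i: npc_think(i, state), range(8)))
print(decisions)
```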


You're mixing up memory and CPU in your original post.

 

You don't notice the CPU so much because games have learned to multithread, so they make better use of multiple cores. Also, the IPC (performance per core) has jumped up significantly.

 

Video card drivers also benefit from higher performance per clock and multithreading - just go on YouTube and search for any video that runs a higher-end video card with an older 1-2 core CPU and then with a modern CPU, and you'll see differences in minimum framerate, latency and other things.

 

The memory amount doesn't need to go up because video cards have kept up with the increased demands from higher resolutions and higher quality settings. For example, from my Radeon 4850 with 512 MB of memory I went up to a 7770 GHz Edition with 1 GB of memory, then jumped up to an RX 470 with 4 GB of VRAM, and now 6 GB and 8 GB VRAM cards are the norm. You can buy cards with 24-32 GB of VRAM.

A game only has to keep in RAM the stuff related to the current level, plus the textures and assets that can't fit inside the video card or that will have to go to the video card soon. And as storage speeds have only increased, games don't have to cache gigabytes worth of content in RAM to transfer to the video card as needed; they can grab smaller chunks from disk, or the video card's RAM is plentiful enough to make such transfers unneeded.
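As a rough sketch of that streaming idea (the chunk names and the tiny cache budget here are invented), a game can hold just a handful of chunks in RAM and lean on the fast disk for the rest:

```python
from collections import OrderedDict

class AssetCache:
    """Keep only a few chunks in RAM; re-read anything else from disk."""
    def __init__(self, budget: int = 4):
        self.budget = budget         # max chunks resident in RAM at once
        self.chunks = OrderedDict()  # LRU order: oldest first

    def get(self, chunk_id: str) -> bytes:
        if chunk_id in self.chunks:
            self.chunks.move_to_end(chunk_id)    # mark as recently used
        else:
            if len(self.chunks) >= self.budget:
                self.chunks.popitem(last=False)  # evict the oldest chunk
            self.chunks[chunk_id] = self._read_from_disk(chunk_id)
        return self.chunks[chunk_id]

    def _read_from_disk(self, chunk_id: str) -> bytes:
        # Fast SSDs make this cheap enough to do mid-game, which is
        # exactly why the RAM budget above can stay small.
        return b"..."  # placeholder for a real file read

cache = AssetCache()
for chunk in ["terrain_a", "terrain_b", "terrain_a", "city_block_1"]:
    cache.get(chunk)
```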

 

Also, the CPU really does matter in some games; just because you don't play those kinds of games doesn't mean others don't play them. See real-time strategy games, turn-based strategy games (Civilization, city builders), and big simulators (airplane stuff).

 

 

 


5 minutes ago, mariushm said:

Also, the CPU really does matter in some games; just because you don't play those kinds of games doesn't mean others don't play them. See real-time strategy games, turn-based strategy games (Civilization, city builders), and big simulators (airplane stuff).

I do play KSP and Civ V. Those are games that benefit from high IPC and frequency, but the core count doesn't really matter, because they are basically impossible to parallelize. If I upgraded to a 9900k from my 9600k, I would only benefit from the clock speed boost. If I overclock my 9600k to match the frequency of the 9900k, the difference will be negligible.

 

Look at GN's video on the 2600k in 2018. You'll see that, with an OC, the 2600k isn't much slower than the modern CPUs in the turn-time test, because its core count is not an issue.


21 minutes ago, mariushm said:

You don't notice the CPU so much because games have learned to multithread, so they make better use of multiple cores. Also, the IPC (performance per core) has jumped up significantly.

IPC =/= performance per core

 

IPC = Instructions per clock (or cycle)

 

Two entirely different things.
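Put numbers on it (made up for illustration) and the difference is obvious: per-core throughput is IPC times clock speed, so a chip can win on per-core performance while losing on clock, or vice versa.

```python
# IPC (instructions per clock) is only one factor in per-core performance;
# clock speed is the other. Figures are invented for the example.

def per_core_perf(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz * 1e9  # rough instructions per second for one core

old_cpu = per_core_perf(ipc=1.0, clock_ghz=5.0)  # low IPC, high clock
new_cpu = per_core_perf(ipc=2.0, clock_ghz=3.0)  # high IPC, modest clock

print(old_cpu)  # 5000000000.0 (5.0e9 instructions/s)
print(new_cpu)  # 6000000000.0 -> more per-core performance despite lower clock
```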

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


Just now, Stahlmann said:

IPC =/= performance per core

 

IPC = Instructions per clock (or cycle)

 

Two entirely different things.

I know what it is ... I was just trying to write it in the simplest way possible, so I wouldn't have to get into explaining what IPC is and be super technical about it.

 


18 minutes ago, Mel0nMan said:

I can play some lighter modern games on my 4c4t Phenom X4. So yes, it's technically outdated, as all tech things eventually will be. However, CPUs will not become 'irrelevant' for gaming.

To me, that's insane. You're using a 13-year-old processor (unless you mean a Phenom II). It shouldn't even be able to boot into Windows by the standards of computers in the late 90s.

 

My family's 200MHz Pentium basically couldn't play any new games in 2000, after just 4 years. The Sims was the most recent title it ever ran, and it struggled with that. It was struggling even to run Internet Explorer by the end of its life in 2001.


1 hour ago, YoungBlade said:

To me, that's insane. You're using a 13-year-old processor (unless you mean a Phenom II). It shouldn't even be able to boot into Windows by the standards of computers in the late 90s.

 

My family's 200MHz Pentium basically couldn't play any new games in 2000, after just 4 years. The Sims was the most recent title it ever ran, and it struggled with that. It was struggling even to run Internet Explorer by the end of its life in 2001.

Sorry I should have clarified. It's not my main rig. Granted I use 9 year old Xeons in my main rig, but it's dual socket 16 core 32 thread, so... 

But yes, that's a Phenom X4, not a Phenom II; just a regular, bog-standard Phenom X4 9650. I don't use it for much gaming; it now runs Ubuntu. But it's actually not a bad CPU for everyday tasks. And also, pretty sure the 90s weren't 13 years ago?


I think it may be partly because, as most games are multiplatform, you don't want a game that needs more CPU horsepower than a console can provide. Plus, it's in devs' interest to keep required specs somewhat reasonable, especially for the CPU, which many PC gamers don't swap out as frequently as their GPU.

 

 


41 minutes ago, Mel0nMan said:

Sorry I should have clarified. It's not my main rig. Granted I use 9 year old Xeons in my main rig, but it's dual socket 16 core 32 thread, so... 

But yes, that's a Phenom X4, not a Phenom II; just a regular, bog-standard Phenom X4 9650. I don't use it for much gaming; it now runs Ubuntu. But it's actually not a bad CPU for everyday tasks. And also, pretty sure the 90s weren't 13 years ago?

I guess I could have clarified as well. I didn't mean that the 90s were only 13 years ago. I meant that, in the 90s, a 13-year-old CPU was incapable of doing even basic computing tasks with a modern OS. It didn't matter if you bought the best 80386 they made in 1985; by the time Windows 98 came out, it wouldn't even be able to boot. For Vista in 2006, even the top-of-the-line Pentium II from 1997 (which retailed for almost $2,000!) might as well have been a literal hunk of sand, despite coming out only 9 years prior.

 

I'm trying to point out that the fact that you're even able to run modern operating systems on a chip that old is a recent phenomenon. It was impossible just 15 years ago.


It depends on the developers. If they just want to make a fast cash-grab title, there's no need to try hard, and at some point, when the DirectStorage API becomes mainstream, a lot of games could be running with basically idling CPUs. But the ones who want to push realism and boundaries further will see this not as one requirement removed from the list, but as additional computing power. You have the GPU handling all the usual beauty stuff - physics, textures and so on - and then you have the CPU doing the more hidden computing. Stuff like multiple story paths drawn from a much larger database, based on your previous playstyle. Actual interactive gameplay where you can truly play your character the way you like, with your own text/audio input and the rest of the characters reacting to you instead of following a dummy script. In racing games, a much more advanced AI where your opponents react to your actions and don't simply follow the path embedded in them. The possibilities are limitless; you just have to use your imagination.

| Ryzen 7 5800X3D | Arctic Liquid Freezer II 360 Rev 7| AsRock X570 Steel Legend |

| 4x16GB G.Skill Trident Z Neo 4000MHz CL16 | Sapphire Nitro+ RX 6900 XT | Seasonic Focus GX-1000|

| 512GB A-Data XPG Spectrix S40G RGB | 2TB A-Data SX8200 Pro| Phanteks Eclipse G500A |


I noticed this earlier today: there's not much of a gap in performance between Sandy Bridge and Coffee Lake CPUs. According to UserBenchmark, my i7-8700K only outmatches the i5-2400 by a 35% margin and the i7-2600K by a mere 23%, and both of the latter CPUs were fully capable of playing triple-A games ahead of their time just fine with a decent GPU.

 

It's fascinating, really... improvements in software design aside, I am not sure if Intel is stagnating or the x86 architecture is hitting its limit. Could be both, could be neither, I have no clue.

 

That doesn't mean CPUs will be irrelevant, but it does remove the incentive to buy anything considered flagship and new when building a gaming PC.


I'd look at it from a different direction. What are the driving factors behind needing more hardware performance?

 

We have increasing resolutions, and framerates to a lesser degree. This scales well with GPU power, so it is easy to look to more GPU to reach your performance target. I'd consider a 2080 Ti/3070 to be 4K60-high class for modern games. 4K displays are ever more common, so more people might want a higher-end GPU.

 

CPU demands? Much less important, but not unimportant. Again, this is more of an enough-vs-not-enough thing. If you have a quad core, sure, it'll probably function, but you may not be getting the most out of your GPU. I think we're at the stage where, if you want anything beyond entry level, you should be looking at a modern 6-core CPU, with 8 being the performance sweet spot for now. More than that, outside niche scenarios, doesn't really buy you more for gaming. RAM is similar to the CPU: you need enough capacity, but more doesn't really help. Speed helps performance, but as a smaller factor.

 

We should also look at the consoles, since they will influence the target level for gaming. We now have 8 decent cores in the current generation, along with 16GB of (shared) RAM. It is unlikely that game devs will target far above that for the life of the current generation, so that will likely be plenty for several years. Of course, there might be settings provided in some games that go beyond what current hardware allows, but that's more of a stretch goal than a requirement for today.

 

3 hours ago, Rhianwen said:

I noticed this earlier today: there's not much of a gap in performance between Sandy Bridge and Coffee Lake CPUs. According to UserBenchmark, my i7-8700K only outmatches the i5-2400 by a 35% margin and the i7-2600K by a mere 23%, and both of the latter CPUs were fully capable of playing triple-A games ahead of their time just fine with a decent GPU.

The first thing NOT to do is take the UserBenchmark numbers literally. Any measure that tries to simplify a very complex thing into a single number will have problems.

 

Now, for much older or less demanding games, their figure of a 35% difference between the 8700K and the 2400 might not be far off. Try running Cyberpunk 2077, though. Chances are it'll be a bad experience on the 2400 but fine on the 8700K. Who cares if the difference is 35% at that point? It is not very useful when people say CPU 1 is X% faster than CPU 2. You need to look at what it does for your gaming. If both CPUs are good enough, the performance difference between them doesn't matter much. If one drops below a performance acceptability threshold, that's when you should care.
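One way to picture that threshold framing (the frame rates and the 60 fps bar below are invented for the example): compare CPUs against a fixed acceptability floor instead of against each other.

```python
# "X% faster" matters less than "does it clear the bar?"
# Frame rates and the 60 fps floor are invented for the example.

def acceptable(avg_fps: float, floor: float = 60.0) -> bool:
    return avg_fps >= floor

old_game = {"i5-2400": 140.0, "i7-8700K": 190.0}
print([cpu for cpu, f in old_game.items() if acceptable(f)])
# both clear the bar -> the ~35% gap is moot

demanding_game = {"i5-2400": 50.0, "i7-8700K": 67.5}
print([cpu for cpu, f in demanding_game.items() if acceptable(f)])
# same ~35% gap, but only one clears the bar -> now it matters
```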

 

IMO, modern 6-core CPUs (Haswell-E, Coffee Lake, Zen or newer) will be good enough for years to come. Quad cores, I feel, are close to dropping below performance acceptability if you want to play the latest demanding games at a great experience level.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


Irrelevant? Probably never. 
Less impactful? Yes. 

 

 

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


Why less impactful? Imagine an entirely new piece of software, like ray tracing, that allows much more complex AI choices and reactions. Pretty much like what Cities: Skylines does with its traffic settings: the more active and true-to-life you make the AI, the more CPU the game utilizes. Crowds could be set to react basically like they do now in GTA 5, like total morons, or to have real-life reactions. Not so long ago, shadows were an optional setting for many people, back when GPUs weren't yet capable of handling physics AND graphics simultaneously. So turning AI options on or off based on your CPU's capability isn't all that unthinkable.

| Ryzen 7 5800X3D | Arctic Liquid Freezer II 360 Rev 7| AsRock X570 Steel Legend |

| 4x16GB G.Skill Trident Z Neo 4000MHz CL16 | Sapphire Nitro+ RX 6900 XT | Seasonic Focus GX-1000|

| 512GB A-Data XPG Spectrix S40G RGB | 2TB A-Data SX8200 Pro| Phanteks Eclipse G500A |


2 hours ago, QuantumSingularity said:

Why less impactful? Imagine an entirely new piece of software, like ray tracing, that allows much more complex AI choices and reactions. Pretty much like what Cities: Skylines does with its traffic settings: the more active and true-to-life you make the AI, the more CPU the game utilizes. Crowds could be set to react basically like they do now in GTA 5, like total morons, or to have real-life reactions. Not so long ago, shadows were an optional setting for many people, back when GPUs weren't yet capable of handling physics AND graphics simultaneously. So turning AI options on or off based on your CPU's capability isn't all that unthinkable.

This is a possible counter-argument. If AI stays on the CPU, and doesn't move to the dedicated AI cores on newer graphics cards, that would be something where I could see 12-core or 16-core chips becoming necessary to play the latest games.

 

The reason I'm not sure that would happen is that I see a trend of CPUs being less relevant for gaming as time goes on. When I was in high school in 2008, you started to need a dual-core processor, and those had only existed for two or three years. But then it took about five or six years before recommended specs were listing a quad-core, around 2014. Now, after about seven more years, I'm starting to see 6-core chips show up as recommended for games (although most still recommend an Intel quad-core).

 

The whole Ryzen 5000X stack is basically equivalent for gaming today. It doesn't matter whether you bought a 5600X or a 5950X, because in a blind taste-test you couldn't tell the difference in most titles. Yes, there are a couple where you might notice something at 1080p, but if you just up the resolution to 1440p, the difference usually goes away as you become GPU bound. I'm wondering if that's a glimpse of the future.

 

If that slow-down trend were to continue, so that it now took 10 or 11 more years before 8-core chips were needed, it would make the choice of CPU effectively irrelevant, because I doubt you'll even be able to buy a new 4- or 6-core CPU a decade from now unless you're shopping at the absolute bottom of the product stack.

 

Similar to how today, you can't even buy two DDR4 sticks and not get at least 8GB (which is the recommended minimum for most games), in 2031, I'm wondering if you might not be able to buy a CPU that can't game reasonably well. Sure, you can buy just one stick of 4GB DDR4, and sure, you could buy the six-core Athlon 8000G, but in either case, you know you're scraping the bottom of the barrel.


Well, I disagree with everything you said in the OP lol. If anything, GPUs will become obsolete in the long run, since 3D stacking is a thing; you'll just get a chip that does everything … good for smaller PCs and lower energy consumption, which is generally where the trend is going.

 

I think one of the reasons you're misunderstanding the current situation is that Moore's law is literally dead, while you seem to think it isn't. The times when we saw double the performance after 1-2 years are long over.

But you still need a CPU, and they're still progressing, just at a slower pace.

 

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


1 minute ago, Mark Kaine said:

Well, I disagree with everything you said. If anything, GPUs will become obsolete in the long run, since 3D stacking is a thing; you'll just get a chip that does everything … good for smaller PCs and lower energy consumption.

 

I think one of the reasons you're misunderstanding the current situation is that Moore's law is literally dead, while you seem to think it isn't. The times when we saw double the performance after 1-2 years are long over.

I agree that the former could be the sort of Black Swan event that would make current predictions about the evolution of computer hardware useless. Even within chips, like the M1, or RTX with its Tensor and RT cores, I'm seeing more specialization in hardware, not less. And VRAM seems to be getting more specialized for those requirements. That said, you could also just have VRAM modules on the motherboard or even the CPU itself. Then again, if the iGPU is the future, I feel like that's more of a semantic distinction; is the GPU actually obsolete if it's just moved onto what we call the CPU? And I don't see why that necessarily implies lower energy consumption - it could, but you could also see CPU energy consumption stay the same or grow as more is put onto the chip.

 

And why would Moore's Law being dead hurt my case? If CPU speed can't increase, then surely old CPUs will be just as potent as new ones.


12 minutes ago, YoungBlade said:

And why would Moore's Law being dead hurt my case? If CPU speed can't increase, then surely old CPUs will be just as potent as new ones.

Because that isn't the case; they're now getting better by 10% or so every 1-2 years. So you're right that they'll last longer, but you still need to upgrade once in a while. For example, I don't think a 2600, a great chip otherwise, would cut it in combination with a 3070 or 3080.
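For scale, compounding that estimate (my arithmetic, assuming a flat 10% gain every 18 months) shows how much slower this is than the old doubling every couple of years:

```python
# ~10% per generation, one generation every 18 months (the poster's estimate)
gain_per_gen = 1.10
generations_per_decade = 120 / 18          # ~6.7 generations in ten years

total = gain_per_gen ** generations_per_decade
print(f"{total:.2f}x over ten years")      # ~1.89x, i.e. barely a doubling
```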

 

And as for stacking, that's what I thought of too; they could integrate "RAM" on the chip as well. But what I think happens first is that RAM will become so fast that the chip will just use system RAM for video RAM.

 

I can also totally see Nvidia being at the forefront of this; they already have excellent and highly efficient CPUs. That's just going to be the future, at least mid-term; who knows what happens after we hit ~1nm.

 

As said, I just don't see CPUs becoming irrelevant any time soon; if anything, it's going to be dedicated GPUs.

 

And specialized stuff - well, nobody in the industry would want that, especially if it's not necessary. Things are becoming more and more faceless IMO - the opposite of specialized? 🤔

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


11 minutes ago, Mark Kaine said:

And specialized stuff - well, nobody in the industry would want that, especially if it's not necessary. Things are becoming more and more faceless IMO - the opposite of specialized? 🤔

But without large performance gains, don't you have to specialize? It becomes the only way to make hardware faster, so it is effectively necessary. If Nvidia could have used CUDA cores for RT and AI, I'm sure they would have. It'd be a lot easier to just use the existing architecture.

 

And I'm not sure the industry would want a market with just one chip that's equally good at everything. Video editing, gaming, number crunching, AI, code compiling, etc. have different needs, so it would make sense to have different chips for each. And then, since some people have less money than others, you'll want to segment those markets.

 

So while it might be called a "CPU," if gamers only want the gaming chips because they have the best iGPU, and they frankly don't care about the general computing cores as long as they meet a basic level of performance, then I'd argue it's the CPU, not the GPU, that has become irrelevant for gaming.

