
Should PC gaming change focus to just APUs and laptops? We should stop thinking of GPUs as gaming hardware. Change my mind.


 

19 hours ago, Uttamattamakin said:

Change my mind. How am I wrong given what we have just witnessed today?  

What did you witness today? I don't get it.

 

That playing at 720p is acceptable? Yeah, maybe it is for some people, but for many it isn't. Then there's the whole "I need at least 500 fps because my reflexes are faster than light!" fad, which is extremely popular nowadays in the "competitive" scene.

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


13 hours ago, HelpfulTechWizard said:

So the average user is someone who needs a 3090!?!? Because Nvidia had a target for that card.

 

Yes, they did, the term you're looking for is 'whales'. 

 

 


22 hours ago, Uttamattamakin said:

Don't even pay them any mind until they are once again a reasonable price for a mere part of a computer.

 

21 hours ago, Uttamattamakin said:

I have an old GTX 1080.   It was the ONLY one on the shelf when I lucked into it.  The ONLY one in the store when I lucked into it.

 

Why not just forget the PC as a game platform at all if the components needed to make it competitive can't be had at any price that makes sense ....

Settle down there, Thanos. Just because you can't afford/find a shiny new GPU doesn't mean that all gamers should boycott them in favor of APUs. After all, graphical fidelity doesn't matter and doesn't contribute at all to the enjoyment and/or playability of a game. I personally love turning down the graphics options and lowering the resolution just to hit >20 fps. I think I'll stick to gaming on my TI-83.


  

  

19 hours ago, Uttamattamakin said:

The diffraction limit on the ability of any optical system to distinguish between objects is determined by the wavelength of the light and the aperture of the optical system.  This is a law of physics and human vision does not break it.

 

There's plenty of research out there that focuses on or uses superresolution to go beyond the diffraction limit. Angular resolution has various definitions depending on the configuration, and no one will argue that it is based on the properties of the system. There is, however, much more information available that allows one to push further. Dithering for the HST was invented for exactly this reason: to extract information at scales smaller than the pixels on the camera. Superresolution is not uncommon in astronomy or other fields and does allow you to exceed the diffraction limit. The key lies in using all of the information available to you. Here's a taste:

 

18 hours ago, Uttamattamakin said:

I'd put my actual name here and invite you to Google Scholar me. I am a published theoretical astrophysicist myself. But what do I know about anything.

As a fellow astrophysicist I am going to ask you to get off your high horse.

 

Putting this in a spoiler, as it's sort of off topic.
 

Spoiler

 

18 hours ago, Uttamattamakin said:

In fact all of the physics we know can be expressed with one equation.

Not sure which equation you mean here, but we have not found a complete grand unified theory, equation, or similar yet. The Standard Model lacks GR and says neutrinos are massless, and general relativity and quantum mechanics still aren't exactly best friends.

18 hours ago, Uttamattamakin said:

Meaning you can not distinguish with your eyes between objects with a smaller angular separation than 11.34 arc seconds.  Since the dot pitch on a 1080p screen is measured in fractions of a millimeter ... at arms length you cannot see the pixels.   It is all in your head. 

Since when are seconds of arc and millimetres the same unit? Don't compare apples and oranges; compare apples to apples and at least finish your calculations. Half the people here have probably never even heard of arcseconds.

 

The gaps between the pixels are separated by the pixel pitch which, as per my previous post, is about 149 µm (172 µm) at 1920x1080 on a 13" (15") screen, and 74 µm (86 µm) at 3840x2160 on a 13" (15") screen. An arm's length is something like 30-60 cm. At 30 cm your 11.34 arcsec corresponds to 16.5 µm, and at 60 cm to 33.0 µm.

 

For all intents and purposes, distinguishing the individual pixels is the same as seeing the gaps separating them, so both the laws of physics and mathematics are on our eyeballs' side here. Especially with your generous resolution of 11 arcsec, our eyes are technically capable of seeing it. Is it disturbingly obvious? Not even close. It's hard, and your brain will likely filter it out anyway if you're not focusing on it, but if I try hard I can definitely see them on the 13" laptop I'm writing this post on.
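
For anyone who wants to check the arithmetic, here's a minimal back-of-the-envelope sketch in Python (panel sizes and distances taken from the numbers above; it illustrates the geometry only, not a model of vision):

import math

def pixel_pitch_um(diag_in, h_px=1920):
    """Pixel pitch of a 16:9 panel in micrometres, from its diagonal in inches."""
    width_in = diag_in * 16 / math.hypot(16, 9)  # 16:9 diagonal -> panel width
    return width_in * 25.4e3 / h_px

def subtended_arcsec(size_um, dist_cm):
    """Angle subtended by size_um at dist_cm, in arcseconds."""
    return math.degrees(math.atan2(size_um * 1e-6, dist_cm * 1e-2)) * 3600

print(round(pixel_pitch_um(13)))                     # ~150 um for 1080p on 13", as above
for d_cm in (30, 60):                                # the "arm's length" range
    print(d_cm, round(subtended_arcsec(150, d_cm)))  # ~103 arcsec and ~52 arcsec

Both values sit comfortably above the quoted 11.34 arcsec threshold, which is the whole point.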

 

8 hours ago, Uttamattamakin said:

"Hyper acuity" some are claiming, being able to see the individual pixels on the screen, requires breaking the laws of optics.   To argue with them isn't to argue with me ... but with a large well understood body of well tested science.  

No it doesn't. Much more information can be extracted by using additional constraints and information. You can go beyond the technical limits and extract more detailed information.

Spoiler

From university lecture material on visual acuity, emphasis mine:

Quote

For a long time human spatial discriminations have been known where thresholds are markedly lower than the resolution limit. For vernier acuity, the foveal alignment threshold for two abutting lines is just a few arcsecs, as compared with the 1 arcmin or so of ordinary resolution limit. Such high precision of localization is shared by many kinds of pattern elements, both in the direction joining them, in their alignment, and in deviations from rectilinearity (Fig. 12). The word hyperacuity is applied to this discrimination of relative position, in recognition that it surpasses by at least an order of magnitude the traditional acuity. Whereas the limitations of the latter are mainly in the resolving capacity of the eye's optics and retinal mosaic, hyperacuity depends on the neural visual system's ability to extract subtle differences within the spatial patterns of the optical image on the retina.

<snip>

These hyperacuity tasks do not contradict any laws of optics and are the result of sophisticated neural circuits identifying the centroids of light distributions and comparing their location.

<snip>

Hence, from the standpoint of optics, there is nothing mysterious about the precision of locating a visual target that is now called hyperacuity. Rather it draws attention to the physiological and perceptual apparatus that enables this precision to be attained.

 

https://cgvr.cs.uni-bremen.de/teaching/cg_literatur/visual_acuity.pdf

and the corresponding publication

https://www.sciencedirect.com/science/article/pii/S1350946212000389

That is practically dithering, or other fancier image-processing techniques: you use context and additional information to distinguish things (far) beyond the typical limit. I will give you that hyperacuity seems to be more about determining differences between things than about direct detail, like beating in audio: hearing two close tones separately you may not readily distinguish them, but playing them together causes beating, which you can pick up easily.
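
To make the centroid idea concrete, here's a toy sketch in Python (all numbers invented for illustration): a spot blurred wider than a pixel and sampled on a coarse grid still pins its position down to a tiny fraction of a pixel via an intensity-weighted centroid, which is the kind of trick the quote describes.

import numpy as np

x = np.arange(8.0)                   # coarse pixel centres, pitch = 1 (arbitrary units)
true_pos = 3.27                      # sub-pixel ground truth (hypothetical)
sigma = 1.2                          # blur wider than one pixel
flux = np.exp(-0.5 * ((x - true_pos) / sigma) ** 2)  # sampled blurred spot

centroid = (x * flux).sum() / flux.sum()                  # intensity-weighted centroid
print(f"true {true_pos:.3f}, recovered {centroid:.3f}")   # off by only ~0.002 pixels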

 

 

18 hours ago, Uttamattamakin said:

Here is a fact I did not consider. It is impossible to buy an APU to use in your own build or upgrade of a computer. For the last two generations, the 4000 series and 5000 series, the desktop APU has not come and quite likely never will. Therefore, there is no choice but to buy a video card to get some sort of video output, much less game. Which is a problem, since one can find the GT 1030 in stock OR the Quadro RTX 5000 or RTX 4000 or some such in stock.... and NOTHING in between that makes actual sense to use.

While APU is an AMD-specific thing, you do realize that Intel has had integrated graphics for like a decade now, right? They suck for gaming, but getting video output is not an issue in the slightest in this day and age.

 

I also don't get the laptop argument. I have a laptop for work, and MacBooks are omnipresent because they are nice machines for productivity, but why should game developers care about that? Gaming laptops also aren't meant to be powerhouses.

18 hours ago, The_russian said:

I am not a game developer so I don’t know how games are developed, but judging from what I can see, games are developed to take advantage of the current top graphics cards, or maybe even the soon to be released ones.

Exactly. Why wouldn't they? They want to make a stunning and great-playing game. Certain levels of fidelity will require certain levels of graphics horsepower. It's already sad that there are relatively few games out there that really push hardware to its limits, to the point where even top-of-the-line hardware can barely run them. We need more Crysis-level games that eat top-shelf hardware for breakfast, where ultra settings mean ultra hardware, or Doom Eternal-level games that make beautiful use of hardware.

 

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


My mind is already changed by the basic fact that there are no boxed Ryzen 5 4500G or Ryzen 5 5500G APUs available in stock to be purchased, and according to a widely reported study the supply will not get better for two years.

 

So not only should we give up on actually getting a GPU, we should give up on getting an APU (not talking about crappy Intel graphics; those aren't APUs).

 

Either retask an old 10 series GPU with an existing CPU, OR, if you don't have a gaming-grade GPU... it's an 8C/16T Ryzen 7 5800X with a GT 1030.


53 minutes ago, tikker said:

There's plenty of research out there that focuses on or uses superresolution to go beyond the diffraction limit. Angular resolution has various definitions depending on the configuration, and no one will argue that it is based on the properties of the system. There is, however, much more information available that allows one to push further. Dithering for the HST was invented for exactly this reason: to extract information at scales smaller than the pixels on the camera. Superresolution is not uncommon in astronomy or other fields and does allow you to exceed the diffraction limit. The key lies in using all of the information available to you. Here's a taste:

 

Yeah, I know all of that. It is those who would confuse the terms IGP and APU who need to really think things over.

Intel integrated graphics are a joke. AMD's APU graphics, on the other hand, are a serviceable stand-in for being able to get a graphics card.

 

I chose that term deliberately, and I also took into consideration which cards actually exist and are available.

 

You are either not reading the research on so-called "super resolution", misunderstanding it, or misrepresenting it. Using multiple different frequency bands to computationally discern point sources that are close together is not what your eyes or brain do.

You and I know how these screens work. I know that I am looking at an array of lighted pixels, so when I think about it I can fool myself into thinking I see them. That is all: the power of psychological suggestion.

 

Here is what some eye doctors say about this. 

https://www.lasikmd.com/blog/can-the-human-eye-see-in-8k

 

Don't Be Fooled. 8K TVs Are A Waste Of Money For Most Viewers (forbes.com)

With the qualification being that most people won't have a TV gigantic enough for the pixels to be seen at any sensible viewing distance. 

 

Feel free to cite papers that have the word "superresolution" in them without knowing what that means or the limitations or caveats.  🙂 

 

Anything to distract from the following two facts. 

We'd all love to have a GPU rather than not. 

They don't exist in any numbers worthy of being bothered with. 

 


So I read a few pages of this topic, and the OP doesn't sound like he is a PC gamer at all. Merely a console peasant who is claiming, without evidence, that there shouldn't be gaming PCs at all, or at least that APUs/iGPUs are good enough for PC gamers.

 

And most PC gamers don't game on laptops, but on desktops instead.


45 minutes ago, Uttamattamakin said:

Intel integrated graphics are a joke. 

Someone hasn't heard of Xe 😛

Spoiler

^-^


15 minutes ago, Uttamattamakin said:

You are either not reading the research on so-called "super resolution", misunderstanding it, or misrepresenting it. Using multiple different frequency bands to computationally discern point sources that are close together is not what your eyes or brain do.

I'm not talking about combining different frequency bands. Occultation provides you with information beyond the diffraction limit, fact. Pixels on some of Hubble's cameras grossly undersample its diffraction-limited PSF, and with dithering you can recover this information. This is indeed not directly comparable to our brain; our eyes don't even operate in terms of resolution. The brain can, however, use information like intensity, its distribution, etc. to extract more.
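
For the curious, here's a rough 1-D sketch in Python of what dithering buys you (a toy interlacing example with made-up numbers, not HST's actual drizzle pipeline): two exposures offset by half a pixel sample the scene twice as finely as either exposure alone.

import numpy as np

def scene(x):
    # a feature narrower than one detector pixel
    return np.exp(-0.5 * ((x - 5.25) / 0.4) ** 2)

pix = np.arange(10.0)          # detector pixel centres, pitch = 1
exp_a = scene(pix)             # first exposure
exp_b = scene(pix + 0.5)       # second exposure, pointing nudged half a pixel

combined = np.empty(2 * pix.size)
combined[0::2], combined[1::2] = exp_a, exp_b  # interleave -> effective pitch 0.5
print(pix.size, "samples/exposure ->", combined.size, "combined samples")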

19 minutes ago, Uttamattamakin said:

You and I know how these screens work. I know that I am looking at an array of lighted pixels, so when I think about it I can fool myself into thinking I see them. That is all: the power of psychological suggestion.

You can also fool yourself into thinking you can't see them. What happened to your super vision from page 1? The numbers prove you wrong that it's "mathematically/physically impossible". Especially at 1080p, you definitely have the ability to discern them. I'm not saying it's effortless, distracting, or even noticeable in day-to-day work.

23 minutes ago, Uttamattamakin said:

Here is what some eye doctors say about this. 

https://www.lasikmd.com/blog/can-the-human-eye-see-in-8k

 

Don't Be Fooled. 8K TVs Are A Waste Of Money For Most Viewers (forbes.com)

With the qualification being that most people won't have a TV gigantic enough for the pixels to be seen at any sensible viewing distance. 

I'm not arguing for 8k in home environments; there I would say it's overkill. 4k is a noticeable upgrade from 1080p, though, and in my opinion just right for "normal-sized" televisions. Your article addresses the heart of the problem: our eyes and brain don't see in pixels or FPS. The real world doesn't work that way. That does not mean, however, that higher resolutions aren't useful. Film, for example, has a resolution around or higher than 4k. In games, 4k gives much more real estate for curves, small details, etc. The law of diminishing returns of course rears its head the higher you go.

 

42 minutes ago, Uttamattamakin said:

Feel free to cite papers that have the word "superresolution" in them without knowing what that means or the limitations or caveats.  🙂

Superresolution literally means "beyond" resolution: achieving a higher resolution than the limits the system at first glance sets, which is also how it's used in the literature.

48 minutes ago, Uttamattamakin said:

Anything to distract from the following two facts. 

or having to defend your own "facts".

12 minutes ago, Uttamattamakin said:

We'd all love to have a GPU rather than not. 

They don't exist in any numbers worthy of being bothered with. 

Which the majority agrees on. You seem to have the biggest problem with it, and you started off by saying we should forget about dedicated GPUs and settle for APUs or similar. Most, as you can see, would happily agree that we just need to suck it up and spend the money, or sit this one out.

 


I will not move back to console peasantry, and I couldn't even find a console if I were willing... The PS5 has a shit controller and literally self-destructs! The XBONE? Has no exclusives, plus no Steam! PCMR all the way. Getting a GPU is hard at the moment, but things will probably normalize! 😉 Nvidia should just hand us a 2060 Super 16GB instead of this lame new 3060 12GB... UwU

Lake-V-X6-10600 (Gaming PC)

R23 score MC: 9190pts | R23 score SC: 1302pts

R20 score MC: 3529cb | R20 score SC: 506cb

Spoiler

Case: Cooler Master HAF XB Evo Black / Case Fan(s) Front: Noctua NF-A14 ULN 140mm Premium Fans / Case Fan(s) Rear: Corsair Air Series AF120 Quiet Edition (red) / Case Fan(s) Side: Noctua NF-A6x25 FLX 60mm Premium Fan / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: Cooler Master Hyper 212 Evo / CPU: Intel Core i5-10600, 6-cores, 12-threads, 4.4/4.8GHz, 13,5MB cache (Intel 14nm++ FinFET) / Display: ASUS 24" LED VN247H (67Hz OC) 1920x1080p / GPU: Gigabyte Radeon RX Vega 56 Gaming OC @1501MHz (Samsung 14nm FinFET) / Keyboard: Logitech Desktop K120 (Nordic) / Motherboard: ASUS PRIME B460 PLUS, Socket-LGA1200 / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 850W / RAM A1, A2, B1 & B2: DDR4-2666MHz CL13-15-15-15-35-1T "Samsung 8Gbit C-Die" (4x8GB) / Operating System: Windows 10 Home / Sound: Zombee Z300 / Storage 1 & 2: Samsung 850 EVO 500GB SSD / Storage 3: Seagate® Barracuda 2TB HDD / Storage 4: Seagate® Desktop 2TB SSHD / Storage 5: Crucial P1 1000GB M.2 SSD/ Storage 6: Western Digital WD7500BPKX 2.5" HDD / Wi-fi: TP-Link TL-WN851N 11n Wireless Adapter (Qualcomm Atheros)

Zen-II-X6-3600+ (Gaming PC)

R23 score MC: 9893pts | R23 score SC: 1248pts @4.2GHz

R23 score MC: 10151pts | R23 score SC: 1287pts @4.3GHz

R20 score MC: 3688cb | R20 score SC: 489cb

Spoiler

Case: Medion Micro-ATX Case / Case Fan Front: SUNON MagLev PF70251VX-Q000-S99 70mm / Case Fan Rear: Fanner Tech(Shen Zhen)Co.,LTD. 80mm (Purple) / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: AMD Near-silent 125w Thermal Solution / CPU: AMD Ryzen 5 3600, 6-cores, 12-threads, 4.2/4.2GHz, 35MB cache (T.S.M.C. 7nm FinFET) / Display: HP 24" L2445w (64Hz OC) 1920x1200 / GPU: MSI GeForce GTX 970 4GD5 OC "Afterburner" @1450MHz (T.S.M.C. 28nm) / GPU: ASUS Radeon RX 6600 XT DUAL OC RDNA2 32CUs @2607MHz (T.S.M.C. 7nm FinFET) / Keyboard: HP KB-0316 PS/2 (Nordic) / Motherboard: ASRock B450M Pro4, Socket-AM4 / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 550W / RAM A2 & B2: DDR4-3600MHz CL16-18-8-19-37-1T "SK Hynix 8Gbit CJR" (2x16GB) / Operating System: Windows 10 Home / Sound 1: Zombee Z500 / Sound 2: Logitech Stereo Speakers S-150 / Storage 1 & 2: Samsung 850 EVO 500GB SSD / Storage 3: Western Digital My Passport 2.5" 2TB HDD / Storage 4: Western Digital Elements Desktop 2TB HDD / Storage 5: Kingston A2000 1TB M.2 NVME SSD / Wi-fi & Bluetooth: ASUS PCE-AC55BT Wireless Adapter (Intel)

Vishera-X8-9370 | R20 score MC: 1476cb

Spoiler

Case: Cooler Master HAF XB Evo Black / Case Fan(s) Front: Noctua NF-A14 ULN 140mm Premium Fans / Case Fan(s) Rear: Corsair Air Series AF120 Quiet Edition (red) / Case Fan(s) Side: Noctua NF-A6x25 FLX 60mm Premium Fan / Case Fan VRM: SUNON MagLev KDE1209PTV3 92mm / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: Cooler Master Hyper 212 Evo / CPU: AMD FX-8370 (Base: @4.4GHz | Turbo: @4.7GHz) Black Edition Eight-Core (Global Foundries 32nm) / Display: ASUS 24" LED VN247H (67Hz OC) 1920x1080p / GPU: MSI GeForce GTX 970 4GD5 OC "Afterburner" @1450MHz (T.S.M.C. 28nm) / GPU: Gigabyte Radeon RX Vega 56 Gaming OC @1501MHz (Samsung 14nm FinFET) / Keyboard: Logitech Desktop K120 (Nordic) / Motherboard: MSI 970 GAMING, Socket-AM3+ / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 850W PSU / RAM 1, 2, 3 & 4: Corsair Vengeance DDR3-1866MHz CL8-10-10-28-37-2T (4x4GB) 16.38GB / Operating System 1: Windows 10 Home / Sound: Zombee Z300 / Storage 1: Samsung 850 EVO 500GB SSD (x2) / Storage 2: Seagate® Barracuda 2TB HDD / Storage 3: Seagate® Desktop 2TB SSHD / Wi-fi: TP-Link TL-WN951N 11n Wireless Adapter

Godavari-X4-880K | R20 score MC: 810cb

Spoiler

Case: Medion Micro-ATX Case / Case Fan Front: SUNON MagLev PF70251VX-Q000-S99 70mm / Case Fan Rear: Fanner Tech(Shen Zhen)Co.,LTD. 80mm (Purple) / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: AMD Near-silent 95w Thermal Solution / Cooler: AMD Near-silent 125w Thermal Solution / CPU: AMD Athlon X4 860K Black Edition Elite Quad-Core (T.S.M.C. 28nm) / CPU: AMD Athlon X4 880K Black Edition Elite Quad-Core (T.S.M.C. 28nm) / Display: HP 19" Flat Panel L1940 (75Hz) 1280x1024 / GPU: EVGA GeForce GTX 960 SuperSC 2GB (T.S.M.C. 28nm) / GPU: MSI GeForce GTX 970 4GD5 OC "Afterburner" @1450MHz (T.S.M.C. 28nm) / Keyboard: HP KB-0316 PS/2 (Nordic) / Motherboard: MSI A78M-E45 V2, Socket-FM2+ / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 550W PSU / RAM 1, 2, 3 & 4: SK hynix DDR3-1866MHz CL9-10-11-27-40 (4x4GB) 16.38GB / Operating System 1: Ubuntu Gnome 16.04 LTS (Xenial Xerus) / Operating System 2: Windows 10 Home / Sound 1: Zombee Z500 / Sound 2: Logitech Stereo Speakers S-150 / Storage 1: Samsung 850 EVO 500GB SSD (x2) / Storage 2: Western Digital My Passport 2.5" 2TB HDD / Storage 3: Western Digital Elements Desktop 2TB HDD / Wi-fi: TP-Link TL-WN851N 11n Wireless Adapter

Acer Aspire 7738G custom (changed CPU, GPU & Storage)
Spoiler

CPU: Intel Core 2 Duo P8600, 2-cores, 2-threads, 2.4GHz, 3MB cache (Intel 45nm) / GPU: ATi Radeon HD 4570 515MB DDR2 (T.S.M.C. 55nm) / RAM: DDR2-1066MHz CL7-7-7-20-1T (2x2GB) / Operating System: Windows 10 Home / Storage: Crucial BX500 480GB 3D NAND SATA 2.5" SSD

Complete portable device SoC history:

Spoiler
Apple A4 - Apple iPod touch (4th generation)
Apple A5 - Apple iPod touch (5th generation)
Apple A9 - Apple iPhone 6s Plus
HiSilicon Kirin 810 (T.S.M.C. 7nm) - Huawei P40 Lite / Huawei nova 7i
Mediatek MT2601 (T.S.M.C 28nm) - TicWatch E
Mediatek MT6580 (T.S.M.C 28nm) - TECNO Spark 2 (1GB RAM)
Mediatek MT6592M (T.S.M.C 28nm) - my|phone my32 (orange)
Mediatek MT6592M (T.S.M.C 28nm) - my|phone my32 (yellow)
Mediatek MT6735 (T.S.M.C 28nm) - HMD Nokia 3 Dual SIM
Mediatek MT6737 (T.S.M.C 28nm) - Cherry Mobile Flare S6
Mediatek MT6739 (T.S.M.C 28nm) - my|phone myX8 (blue)
Mediatek MT6739 (T.S.M.C 28nm) - my|phone myX8 (gold)
Mediatek MT6750 (T.S.M.C 28nm) - honor 6C Pro / honor V9 Play
Mediatek MT6765 (T.S.M.C 12nm) - TECNO Pouvoir 3 Plus
Mediatek MT6797D (T.S.M.C 20nm) - my|phone Brown Tab 1
Qualcomm MSM8926 (T.S.M.C. 28nm) - Microsoft Lumia 640 LTE
Qualcomm MSM8974AA (T.S.M.C. 28nm) - Blackberry Passport
Qualcomm SDM710 (Samsung 10nm) - Oppo Realme 3 Pro

 


Given the supply constraints another user hit the nail on the head.

3 hours ago, whm1974 said:

Most Gamers will simply use whatever dGPU they already have and upgrade when the Market becomes Stable.

Which, given the current supply constraints, is all anyone should plan on. Look at the stock levels at, say, Micro Center right now: TONS of Ryzen CPUs with no graphics, and tons of GT (not GTX) level graphics cards.

 

@tikker   To see how you are wrong, demonstrated with observation: you mentioned that with computing power more information can be extracted from a Hubble image. That is similar to this image taken of Pluto with Hubble before the New Horizons spacecraft visited it.

 

Pluto: New Horizons vs. Hubble | The Planetary Society

 

The image on the right is that super resolution you are going on about. Compare that to an image taken up close, astronomically speaking, where the angular resolution of the camera allows discernment of features barely hinted at. You, sitting in front of your monitor, are NOT seeing something like what is on the right.

 

 

_____ Some general comments and my final realization _____

The idea that you need 4k AND 120 Hz AND ray-traced global illumination to game is just pure marketing hogwash. All of those features are really only useful for AI, high-fidelity physics simulations... research-grade simulations... and other such tasks. Most games barely scratch the surface of them and don't do so very well.

 

For games, what matters more is a smooth real-time experience. Turn down the resolution and sit farther from the screen and you get that.
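
The geometry behind that advice, as a quick sketch (the 60 cm starting distance is my own example): a pixel's apparent size scales as pitch over distance, so dropping from 1080p to 720p on the same panel grows the pitch by 1080/720 = 1.5x, and sitting 1.5x farther back keeps each pixel subtending the same angle as before.

d_1080p_cm = 60                  # hypothetical starting viewing distance
scale = 1080 / 720               # pitch grows 1.5x at 720p on the same panel
d_720p_cm = d_1080p_cm * scale   # 90 cm gives the same angular pixel size
print(d_720p_cm)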

 

The new reality right now, for most people, is that if you seek to game on PC, these are the real options:

  • Just buy a laptop with a good APU (NEVER to be confused with an Intel IGP; maybe, just maybe, their newest graphics).
  • Just buy a prebuilt. *See the video from ETA PRIME about gaming on a Ryzen 4700G-based prebuilt. Plus, notice it has an expansion slot: one could add a GPU if they can get one. Cheapest Ryzen 4700G Prebuilt PC - Outstanding Performance From This APU! - YouTube
  • Just buy what parts are available, BUT with an eye towards upgrading to what is next if you can. This would be much more viable if we weren't at EOL for socket AM4.
    • Given what is actually available and obtainable in stock, in stores, that means gaming on a GT 710 or GT 1030.

Then, if you get lucky, you can upgrade. You can live in hopes and dreams or you can live in reality. Just keep it real with yourself... wanting an RTX 30 series is like trying to date a movie star or win a lottery.

 

As for the idea that I am not into building computers: I built one at the start of the pandemic.

 

See here: The Stimulus Payment Build, Putting It Together. - Build Logs - Linus Tech Tips

 

I have been using computers since the 1980s and gaming on them since before the movie WarGames was new. Now, if you are a real OG PC gamer, you know where I'm coming from. The idea that a GPU is really a requirement is a recent innovation that was true for about 15 years... but by necessity it just can't be true anymore.

 

Edited by Uttamattamakin
I wanted it to be really clear I was agreeing with that user.

41 minutes ago, Uttamattamakin said:

The idea that you need 4k AND 120hz AND ray traced global illumination to game is just pure marketing hogwash.

That's the thing: marketing doesn't say you need something. It shows you why X thing is better.

It's subjective. You don't need 4k or 120 Hz or RTX, but some people prefer one or more of those things.

And I'd much rather have that than a 720p 30 fps non-RTX gameplay experience, like you keep saying gaming should shift its focus to.

I could use some help with this!

please, pm me if you would like to contribute to my gpu bios database (includes overclocking bios, stock bios, and upgrades to gpus via modding)

Bios database

My beautiful, but not that powerful, main PC:

prior build:

Spoiler

 

 


1 hour ago, HelpfulTechWizard said:

That's the thing: marketing doesn't say you need something. It shows you why X thing is better.

It's subjective. You don't need 4k or 120 Hz or RTX, but some people prefer one or more of those things.

And I'd much rather have that than a 720p 30 fps non-RTX gameplay experience, like you keep saying gaming should shift its focus to.

Exactly. We'd all love to have the rare high-end computer made with pure consumer-grade unobtainium parts. I'd buy such parts if I could find them.

 

The thing is, they aren't really needed.


You have a point. I'm talking about what people need VS what we all may want because these GPUs and CPUs are marketed to us. The point of marketing is to make you think you need something so you will buy it; make no mistake about that. Otherwise, it would just be Nvidia sending everyone a spec sheet in their email.

 

I'd rather have games written with the idea that most people have a "last gen / midrange / budget" computer, and then, as extra sauce on your taco, these other things that can be enabled.

 

 


7 hours ago, HelpfulTechWizard said:

That's the thing: marketing doesn't say you need something. It shows you why X thing is better.

It's subjective. You don't need 4k or 120 Hz or RTX, but some people prefer one or more of those things.

And I'd much rather have that than a 720p 30 fps non-RTX gameplay experience, like you keep saying gaming should shift its focus to.

Why would you settle for 720p@30fps? PCs should play at 1080p@60fps.


35 minutes ago, whm1974 said:

Why would you settle for 720p@30fps? PCs should play at 1080p@60fps.

Please reread.

8 hours ago, HelpfulTechWizard said:

And I'd much rather have that than a 720p 30 fps non-RTX gameplay experience

 


43 minutes ago, whm1974 said:

Why would you settle for 720p@30fps? PCs should play at 1080p@60fps.

If that is all that the really available parts will allow, it is either 720p or nothing. An Intel IGP can do that.
An AMD APU will do better... but if you want to build with one, they are very rare as well.


Since the 3060 is fine, why does Nvidia not simply spam the market with the 2060 12GB and the 2060 Super 16GB instead of these dumb mining cards? The 2060 and 2060 Super could be used by miner and gamer alike...


2 minutes ago, Uttamattamakin said:

If that is all that the really available parts will allow, it is either 720p or nothing. An Intel IGP can do that.
An AMD APU will do better... but if you want to build with one, they are very rare as well.

Maybe the newest Intel iGPU can play some low-end titles, but no major games. And yes, you can buy Ryzen-based APUs, just not the 4000 series.


1 hour ago, BillyGoat1776 said:

I agree with the part about basically telling ALL graphics card manufacturers to f**k right off until prices go DOWN and supply goes UP....couldn't agree more....solidarity goes a long way to turning things back in favor of the consumer....so yeah....thumbs up there

I also agree that games will be coded with an eye on mobile platforms. However, this is very BAD and we should not encourage it. It will promote an environment in which most of the game(s) will run server side (in some godforsaken datacenter somewhere), and this can only lead to gamers being milked for data and money. The gameplay will suck, the graphics will suck... well... it will just suck 😛

Heavy-duty clients running on powerful hardware will always provide the most immersive experience in my opinion.  However, market trends do seem to indicate that my 'opinion' is not sustainable and the gaming industry will probably go down the rabbit hole of mobile gaming and e-sports anyway.  So far, that's been a real sh*t show.

 

For what it's worth, that's the Goat's two cents 😄

 

~Billy G.

I agree fully with the bolded.  

 

The corporations that make the GPUs have all the power to impose this on the market. I could easily hear someone from Nvidia saying, "You can use GeForce Now with your GT 1030 for a GREAT RTX-on gaming experience." They'd likely even be correct.

 

That may be the only way one gets to game now, IF they have the internet for it.
 


17 minutes ago, Uttamattamakin said:

I agree fully with the bolded.

The corporations that make the GPUs have all the power to impose this on the market. I could easily hear someone from Nvidia saying, "You can use GeForce Now with your GT 1030 for a GREAT RTX-on gaming experience." They'd likely even be correct.

That may be the only way one gets to game now, IF they have the internet for it.
 

This (relying on cloud gaming) does lock out a lot of rural gamers. Even with fast internet, latency is an ever-present issue and may constrain the genres of games that will run satisfactorily.
 

If the shortages (both console and PC) do prove to be especially persistent, console manufacturers would probably do well to ask developers to stick to the PS4/Xbox One as the lead platforms until supplies stabilize. They could even seek to attract PC gamers by offering future rebates on PS5/Xbox Series X hardware to those who purchase last-gen hardware.

 

In the latter case, Apple could also be well positioned to take some of the gaming market should they choose to move aggressively. They probably aren't as severely constrained in silicon supplies as everyone else, and they have strong CPU and GPU designs they could potentially leverage.

My eyes see the past…

My camera lens sees the present…

