WHY?!

Smc253

Tell me WHAT is the point of 4k gaming when you can game at 1440p for like 50 percent of the cost, pretty much the same visuals, and less thermals with longer lifespan of hardware. WHAT IS THE POINT besides wasting money on upgraded PCIe 4.0/5.0 compatible hardware?


The visuals are really not that different, it just seems stupid.


I'm sure somebody somewhere posted something along these lines for every major advancement in graphical standards of years past.

 

5 minutes ago, Smc253 said:

and less thermals with longer lifespan of hardware.

There is no direct relationship between resolution and these things. 

 

5 minutes ago, Smc253 said:

WHAT IS THE POINT besides wasting money on upgraded PCIe 4.0/5.0 compatible hardware?

Only a few GPUs are meaningfully limited by PCIe 3.0. You certainly don't need 5.0 for 4k gaming.

 

3 minutes ago, Smc253 said:

The visuals are really not that different, it just seems stupid.

In number of pixels, there is actually a bigger jump from 1440p to 4k than there is from 1080p to 1440p. 

 

1080p is approximately 2 million pixels.

1440p is a little under 3.7 million pixels.

4k is 8.3 million pixels.
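If you want to sanity-check those figures and the relative jumps, here's a quick back-of-the-envelope script (a hypothetical sketch, assuming the standard 1920x1080, 2560x1440, and 3840x2160 pixel grids):

```python
# Pixel counts for common gaming resolutions, and the relative jump between them.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4k": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.2f} million pixels")

# 1440p renders ~1.78x the pixels of 1080p; 4k renders ~2.25x the pixels of 1440p,
# so the 1440p -> 4k step is the larger one in both relative and absolute terms.
print(f"1080p -> 1440p: {pixels['1440p'] / pixels['1080p']:.2f}x")
print(f"1440p -> 4k:    {pixels['4k'] / pixels['1440p']:.2f}x")
```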

 

Of course pixel density comes into play depending on the size of the screen. 

 

If 4k isn't a good value proposition for you yet, that's fine. It isn't for me either. That doesn't mean it's "stupid" or pointless. 

 

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | PowerColor Red Devil RX 6900 XT | 4x8GB Crucial Ballistix 3600mt/s CL16


With that logic, what's the point of buying 4090s and other enthusiast-tier parts? For 1440p a 3070 is plenty of GPU power.

 

People sometimes like to spend big bucks on PC parts, and a lot of times they have the income or savings (or financing) to be able to afford it.

 

Also, 4K resolution does make a difference, especially if you want a large display or a TV. Personally I don't have the budget or much desire to switch to 4K. 1440p alone was a bit of a hard pill for me to swallow to begin with, money-wise (you have to buy higher-end GPUs and upgrade more frequently).

MAIN SYSTEM: Intel i9 10850K | 32GB Corsair Vengeance Pro DDR4-3600C16 | RTX 3070 FE | MSI Z490 Gaming Carbon WIFI | Corsair H100i Pro 240mm AIO | 500GB Samsung 850 Evo + 500GB Samsung 970 Evo Plus SSDs | EVGA SuperNova 850 P2 | Fractal Design Meshify C | Razer Cynosa V2 | Corsair Scimitar Elite | Gigabyte G27Q

 

Other Devices: iPhone 12 128GB | Nintendo Switch | Surface Pro 7+ (work device)


11 minutes ago, Alcarin said:

For 1440p a 3070 is plenty of GPU power.

Eh, not for 1440p 120hz+ in AAA games; my 3070 gets around 60 fps with everything cranked in SOTTR and Cyberpunk (using DLSS balanced).

 

But I don’t care because I mostly play competitive games like Valorant on a 240hz 1440p monitor, where my 3070 has no trouble whatsoever pushing out that framerate on max settings.


Just now, Smc253 said:

The visuals are really not that different, it just seems stupid.

You can say the same thing about basically everything in the PC market; by that logic, it could be argued they shouldn't have ever made a better GPU than a GTX 1070.

 

Some people can tell the difference, at larger monitor sizes like 43" it looks a lot better, and 4K panels have been coming down in price a fair bit. It's still expensive and into the territory of diminishing returns, but it's not that far into diminishing returns, and for the people who have the money for it and want a better experience it's a pretty fair choice. It's not for everyone, and definitely not a good idea for a first system, but if you just like tinkering with hardware or want a really nice monitor upgrade it makes sense.

 

If you don't want to do it, don't do it, but for the people who consider it a better experience (and it is in certain circumstances) it can make some sense. Personally I game at 4K because I like gaming on really large panels: 1440p on a 40"+ monitor is both hard to come by and very noticeably worse than 4k, and getting a used GPU that's very capable of gaming at 4K isn't actually that expensive (6900 XTs, for example, have been relatively easy to find for the ~$500 mark, and I even know a guy who got one for $350 US). If you think it's not worth that, that's fine, but there is a place for it, and it will only get more relevant over time as panel prices keep dropping and capable hardware keeps getting cheaper.


An RTX 2080 (not the Ti) can run 1440p games at over 90fps without ray tracing. It's basically just a 1080 with ray tracing added. A 3070 can run 1440p games great, but older GPUs can run that resolution just fine at over 60fps.


I mean on a large screen it can make a difference. Obviously gaming on a 24" monitor at 4k doesn't make a whole lot of sense at typical viewing distances. Large monitors and ultrawide displays though? Might start to make sense.

 

Another use case is VR. My Vive Pro runs somewhere between 1440p and 4k in terms of pixels per frame, and it's hardly the highest-res headset out there. Something like a Vive Pro 2 or Pimax 8KX dwarfs a UHD/4k display when it comes to pixel count.
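To put rough numbers on that, here's a hypothetical comparison using the commonly listed per-eye panel resolutions (the real render targets are usually even higher once lens-distortion supersampling is factored in):

```python
# Per-frame pixel counts: flat monitors vs. VR headsets (both eyes combined).
# Per-eye figures are the commonly listed panel specs; treat them as approximate.
displays = {
    "1440p monitor": 2560 * 1440,
    "4k monitor": 3840 * 2160,
    "Vive Pro": 2 * (1440 * 1600),    # 1440x1600 per eye
    "Vive Pro 2": 2 * (2448 * 2448),  # 2448x2448 per eye
    "Pimax 8KX": 2 * (3840 * 2160),   # 3840x2160 per eye
}

for name, count in sorted(displays.items(), key=lambda kv: kv[1]):
    print(f"{name:>14}: {count / 1e6:5.1f} million pixels per frame")
```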

Having a GPU capable of pushing those pixels, and doing so at 90+ fps, isn't cheap, but it's definitely worth it to people who buy $1000 headsets.

AMD Ryzen R7 1700 (3.8ghz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SDD + 7TB HDD, Phanteks enthoo pro case


Just now, Smc253 said:

An RTX 2080 (not the Ti) can run 1440p games at over 90fps without ray tracing. It's basically just a 1080 with ray tracing added. A 3070 can run 1440p games great, but older GPUs can run that resolution just fine at over 60fps.

ok. and?


I'm saying stop giving these GPU makers money for no reason. 


16 minutes ago, Middcore said:

In number of pixels, there is actually a bigger jump from 1440p to 4k than there is from 1080p to 1440p. [...] If 4k isn't a good value proposition for you yet, that's fine. It isn't for me either. That doesn't mean it's "stupid" or pointless.

 

Yeah, but when it hits your eyes it doesn't look much different.


2 minutes ago, Smc253 said:

I'm saying stop giving these GPU makers money for no reason. 

Games do get more demanding over time. Even at a steady 1080p, today's titles are far more demanding than those from 10 years ago. I'm willing to bet that I could run Far Cry 3 more easily at 4k than I could run some of the newer AAA titles at 1080p. As long as we keep asking for more impressive visuals, we will need more powerful hardware to drive them.



1 minute ago, Smc253 said:

Yeah, but when it hits your eyes it doesn't look much different.

That's subjective. Some people would say the same about 1440p compared to 1080p. Objectively, assuming the same screen size, the jump from 1440p to 4k is bigger than the jump from 1080p to 1440p.



Like what happened with EVGA leaving Nvidia, supposedly because of how they were treated. Nvidia needs a wake-up call, but it won't happen because they are swimming in cash from effing over the consumer. I'm happy AMD is finally starting to become a major threat to them with those lower price points on hardware, especially with the 7000 series GPUs about to be released.


4 minutes ago, NF-A12x25 said:

Eh, not for 1440p 120hz+ in AAA games; my 3070 gets around 60 fps with everything cranked in SOTTR and Cyberpunk (using DLSS balanced).

 

But I don’t care because I mostly play competitive games like Valorant on a 240hz 1440p monitor, where my 3070 has no trouble whatsoever pushing out that framerate on max settings.

I haven't played SOTTR, but Cyberpunk is probably one of the most extreme cases of performance demand on a card. Not the norm.

 

Virtually my entire library of games runs at 90+ FPS on my 3070 at 1440p. I haven't played Cyberpunk since its launch though, as I am still upset about how badly it turned out.



9 minutes ago, Smc253 said:

Like what happened with EVGA leaving Nvidia, supposedly because of how they were treated. Nvidia needs a wake-up call, but it won't happen because they are swimming in cash from effing over the consumer. I'm happy AMD is finally starting to become a major threat to them with those lower price points on hardware, especially with the 7000 series GPUs about to be released.

 

You're just kind of ranting now. I have my own thoughts about Nvidia's pricing and sales tactics, but this has little or nothing to do with the topic of how worthwhile 4k gaming is.



This is kind of a useless argument that can be applied to any enthusiast market. Why get a fast car if there's a speed limit? Why get designer clothes if you can cover up just fine with Goodwill clothes? Even with food, why buy name-brand cereal when there's generic?

 

It comes down to personal preference. There's a market for it for a reason. If you don't like it, you aren't the target audience. Simple.

My PC Specs:

 

 

Main Gaming Machine

CPU: Intel Core i7-10700K - OC to 5 GHz All Cores
CPU Cooler: Corsair iCUE H115i RGB Pro XT (Front Mounted AIO)
Motherboard: Asus TUF GAMING Z490-PLUS (WI-FI)
Memory: Corsair Vengeance LPX 32 GB (2 x 16 GB) DDR4-3600

Storage: Intel 665p 1 TB M.2-2280 NVME SSD (x2)
Video Card: Zotac RTX 3070 8 GB GAMING Twin Edge OC

Power Supply: Corsair RM850 850W
Case: Corsair 4000D Airflow
Case Fan 120mm: Noctua F12 PWM 54.97 CFM 120 mm (x1)
Case Fan 140mm: Noctua A14 PWM 82.5 CFM 140 mm (x4)
Monitor Main: Asus VG278QR 27.0" 1920x1080 165 Hz
Monitor Vertical: Asus VA27EHE 27.0" 1920x1080 75 Hz


OP needs to get his eyes/glasses checked if he says 4K has the same relative fidelity as 1440p.

1 hour ago, Smc253 said:

and less thermals with longer lifespan of hardware. WHAT IS THE POINT besides wasting money on upgraded PCIe 4.0/5.0 compatible hardware?

This just shows that you don't know what you're talking about.

Not an expert, just bored at work. Please quote me or mention me if you would like me to see your reply. **may edit my posts a few times after posting**

CPU: Intel i5-12400

GPU: Asus TUF RX 6800 XT OC

Mobo: Asus Prime B660M-A D4 WIFI MSI PRO B760M-A WIFI DDR4

RAM: Team Delta TUF Alliance 2x8GB DDR4 3200MHz CL16

SSD: Team MP33 1TB

PSU: MSI MPG A850GF

Case: Phanteks Eclipse P360A

Cooler: ID-Cooling SE-234 ARGB

OS: Windows 11 Pro

Pcpartpicker: https://pcpartpicker.com/list/wnxDfv
Displays: Samsung Odyssey G5 S32AG50 32" 1440p 165hz | AOC 27G2E 27" 1080p 144hz

Laptop: ROG Strix Scar III G531GU Intel i5-9300H GTX 1660Ti Mobile| OS: Windows 10 Home


2 hours ago, Smc253 said:

and less thermals with longer lifespan of hardware.

What? This is running F@H (which puts a similar load on the GPU as mining) while watching YouTube and playing Scum at 5120x1440 with max settings. Getting around 100 fps in game, steady.

[attached screenshot]

With the stock cooler, that is cooler than my 6800 XT ran on water with no F@H in the background, while only drawing about 50 more watts than the 6800 XT did...

I'm not actually trying to be as grumpy as it seems.

I will find your mentions of Ikea or Gnome and I will /s post. 

Project Hot Box

CPU 13900k, Motherboard Gigabyte Aorus Elite AX, RAM CORSAIR Vengeance 4x16gb 5200 MHZ, GPU Zotac RTX 4090 Trinity OC, Case Fractal Pop Air XL, Storage Sabrent Rocket Q4 2tb, CORSAIR Force Series MP510 1920GB NVMe, CORSAIR FORCE Series MP510 960GB NVMe, PSU CORSAIR HX1000i, Cooling Corsair XC8 CPU block, Bykski GPU block, 360mm and 280mm radiator, Displays Odyssey G9, LG 34UC98-W 34-Inch, Keyboard Mountain Everest Max, Mouse Mountain Makalu 67, Sound AT2035, Massdrop 6xx headphones, Go XLR

Oppbevaring

CPU i9-9900k, Motherboard, ASUS Rog Maximus Code XI, RAM, 48GB Corsair Vengeance LPX 32GB 3200 mhz (2x16)+(2x8) GPUs Asus ROG Strix 2070 8gb, PNY 1080, Nvidia 1080, Case Mining Frame, 2x Storage Samsung 860 Evo 500 GB, PSU Corsair RM1000x and RM850x, Cooling Asus Rog Ryuo 240 with Noctua NF-12 fans

 

Why is the 5800x so hot?

 

 


This argument is exactly the same as the arguments about 60hz vs 144hz monitors. 😂

RELOAD BEFORE QUOTING. I EDIT MY POSTS A LOT!

MARK MY POST AS THE ANSWER IF I HELPED.

 

CPU: AMD Ryzen 5 5600 | GPU: RTX 2070 Super | Mobo: MSI X470 Gaming Pro | RAM: Corsair Vengeance 32GB 8x2 3200mhz CL16 | SSD: Samsung QVO 870 1TB | PSU: SeaSonic FOCUS Plus 750 Gold 750 W 80+ Gold | Cooler: Thermaltake Contac Silent 12 | OS: Windows 11 Pro | Pcpartpicker: https://pcpartpicker.com/list/rGR6gb
Displays: ViewSonic XG2401 23.6" 1920 x 1080 144 Hz 

Laptop: Lenovo Ideapad Gaming i5-1130 RTX3050 16GB | OS: Windows 11 Pro


8 hours ago, Smc253 said:

The visuals are really not that different, it just seems stupid.

Because most games aren't optimized for 4k: textures are poor, and clipping will be even more apparent at 4k... For better "visuals" it would actually make more sense to render in 4k and downsample to 1080p; at least you get really good "AA" that way. Downsampling from 4k to 1440p is less ideal btw, because the pixels don't divide evenly (it's a 1.5:1 ratio per axis instead of a clean 2:1), so there will be more artifacts.
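To make that scaling point concrete, here's a minimal sketch of the linear downscale ratios involved (assuming the standard 1920x1080, 2560x1440, and 3840x2160 grids):

```python
from fractions import Fraction

# Linear downscale ratios when supersampling from a 4k render target.
SOURCE_WIDTH = 3840
targets = {"1080p": 1920, "1440p": 2560}

for name, width in targets.items():
    ratio = Fraction(SOURCE_WIDTH, width)
    if ratio.denominator == 1:
        note = "integer ratio: each output pixel averages an exact block of rendered pixels"
    else:
        note = "fractional ratio: rendered pixels get split across output pixels (softer result)"
    print(f"4k -> {name}: {float(ratio):.2f}x per axis, {note}")

# 4k -> 1080p is an exact 2x per axis (a 2x2 box of rendered pixels per output pixel),
# which is why it behaves like very clean anti-aliasing; 4k -> 1440p is 1.5x per axis.
```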

 

8 hours ago, Alcarin said:

With that logic, what's the point of buying 4090s

There is literally no point for gaming, because games nowadays don't take advantage of the power of such a card; even a 3080 is basically "too much" already...

The only "point" is bragging rights (lol), and maybe if you're really chasing the fps... but even that seems pointless (for like 99% of people).

Edited by Mark Kaine

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


6 hours ago, IkeaGnome said:

What? This is running F@H (which puts a similar load on the GPU as mining) while watching YouTube and playing Scum at 5120x1440 with max settings. Getting around 100 fps in game, steady.

For an edge case like this such a card maybe makes sense, but not many people would do this, and not many people buy such a card to begin with. It does show very well, however, that (as I said) many games nowadays don't take full advantage of such a card.

 

I think whether Nvidia (and maybe AMD too) cards are "efficient" in themselves shouldn't really be a question, because they are.

Whether a "350W" GPU should even be a thing is another question entirely, though.

Pretty sure that if this gets more widespread the EU would ban those or put massive restrictions on them; they already cry about drinking straws! 😅

(Not to mention "4K TVs", I guess...)



Just seems insane to me that people spend over $2000 on a custom PC so they can get 100 fps at 4K. Oh! The 2080 was gonna be great for 4K! Nope. The 3090 is gonna be great for FPS! Eh, not that great really. Damn that power bill though, and crap, gotta buy a new case with a support bracket. Well, maybe the 4090 will do it! Yup, not too bad! But now I've got burning power connectors and a 1000 watt power supply? Have people gone nuts?

