
Is this 4K comparison accurate or fake?

zindan



Is the difference really that big and that visible? That is insane! 


Well, before watching the video (which I won't, I'm at work): if you don't have a 4K display in front of you, then how do you expect to watch "4K footage"? Go to a store and check it out for yourself for the best examples of what 4K looks like. Definitely check out a monitor or smaller screen and not a TV though, because it's the pixel density that really makes it look so sharp.


2 minutes ago, NinJake said:

Well, before watching the video (which I won't, I'm at work): if you don't have a 4K display in front of you, then how do you expect to watch "4K footage"? Go to a store and check it out for yourself for the best examples of what 4K looks like. Definitely check out a monitor or smaller screen and not a TV though, because it's the pixel density that really makes it look so sharp.

Continuing on your tangent: it's ridiculous to look to a site like YouTube for comparisons on this sort of topic, since the video you're seeing is still heavily compressed.

Ideally you'd be downloading barely compressed files to your computer then watching them on a 4k screen for the full effect in this case.

Sig under construction.


Just now, NinJake said:

Well, before watching the video (which I won't, I'm at work): if you don't have a 4K display in front of you, then how do you expect to watch "4K footage"? Go to a store and check it out for yourself for the best examples of what 4K looks like. Definitely check out a monitor or smaller screen and not a TV though, because it's the pixel density that really makes it look so sharp.

It's extreme close-up shots, "amplifying" the difference.

 

which is like playing games like this:

[image: staring-too-close-at-the-computer-screen]

---

 

I own a first-gen Dell 4K UltraSharp, so I should have every "subjective" reason to say 4K is all the new rage. It isn't. The benefit of going 1080p > 4K is seriously hitting diminishing returns gaming-wise, and even the productivity side is seriously meh in all cases where having multiple displays is a possibility.

 

IMO, 2560x1440 is where it's at for now. They're a nice slice cheaper, and you don't need HDMI 2.0 or DP 1.2 to drive them. Just regular ol' DVI will do.


1 minute ago, manikyath said:

It's extreme close-up shots, "amplifying" the difference.

 

which is like playing games like this:

[image: staring-too-close-at-the-computer-screen]

---

 

I own a first-gen Dell 4K UltraSharp, so I should have every "subjective" reason to say 4K is all the new rage. It isn't. The benefit of going 1080p > 4K is seriously hitting diminishing returns gaming-wise, and even the productivity side is seriously meh in all cases where having multiple displays is a possibility.

 

IMO, 2560x1440 is where it's at for now. They're a nice slice cheaper, and you don't need HDMI 2.0 or DP 1.2 to drive them. Just regular ol' DVI will do.

So I shouldn't get 4K? 


At the point where you're starting the video, those shots look pretty zoomed in and will exaggerate the differences compared to the naked eye. But yes, there is a noticeable difference between 1080p and 4K. Anyone who can't see a difference either needs glasses or a better monitor, because they bought such a crappy model that even though it has the technical resolution, it doesn't have the capacity to display it accurately enough (like the LG IPS 4K TVs that can't balance brightness and darkness without massive "IPS glow", or the cheaper 4K TVs that suffer HORRIBLE backlight banding issues).


1 minute ago, manikyath said:

IMO, 2560x1440 is where it's at for now. They're a nice slice cheaper, and you don't need HDMI 2.0 or DP 1.2 to drive them. Just regular ol' DVI will do.

I've got a 1440p TN panel. I honestly loved the switch from 1080p to 1440p, but I'm waiting a few years for more powerful graphics cards and cheaper 4K IPS screens, and then as soon as I can, I'm buying a 4K IPS monitor :D


1 minute ago, NinJake said:

I've got a 1440p TN panel. I honestly loved the switch from 1080p to 1440p, but I'm waiting a few years for more powerful graphics cards and cheaper 4K IPS screens, and then as soon as I can, I'm buying a 4K IPS monitor :D

You'll be waiting a long time, then.


2 minutes ago, zindan said:

Will 1080 Ti SLI hold up until then?

1080 Ti SLI isn't the most budget-minded decision either.


1 minute ago, (嗜杀本性)狼队 女 梅 said:

With Volta coming out soon? We're already getting into the next gen.

 

Not so soon, if Vega isn't proving all that threatening to Pascal.

Sig under construction.


3 minutes ago, Tedster said:

Not so soon, if Vega isn't proving all that threatening to Pascal.

It's about 3 times Pascal or so, but it costs accordingly: "NVIDIA's new Volta-powered DGX-1 costs $149,000". But it is the next gen, and it will have lower-tier versions down the line. 


Just now, (嗜杀本性)狼队 女 梅 said:

It's about 3 times Pascal or so, but it costs accordingly: "NVIDIA's new Volta-powered DGX-1 costs $149,000". But it is the next gen, and it will have lower-tier versions down the line. 

What do you mean by this?

Sig under construction.


1 minute ago, (嗜杀本性)狼队 女 梅 said:

"Quadrupling" 

Sorry, I'm still somewhat unclear.

You're talking about how GN mentioned 4x the "CUDA core throughput" in reference to performance for HPC tasks, and somehow relating that to gaming?

Sig under construction.


32 minutes ago, Tedster said:

Continuing on your tangent, it's ridiculous to look to a site like YouTube for comparisons on this sort of topic, since the video you're seeing is still heavily compressed and all that.

Ideally you'd be downloading barely compressed files to your computer then watching them on a 4k screen for the full effect in this case.

You're not really wrong, but it's more nuanced than that. For example, you can tell the difference between super-sampling a game at 4K on a 1080p screen and native 1080p on a 1080p screen. The biggest difference is in the aliasing and in thin, distant objects like tree branches or wires. (On a side note: you can also tell a huge difference between a 4K YT video and a 1080p YT video even when both are watched on a 1080p screen, although that's due to YT bitrate compression, not original quality.)

My point is that rendering something in 4K is different from filming something in 4K, because a 3D render sets distinct object edges, whereas a camera sensor captures the average photon properties on each pixel without establishing distinct edges. A camera basically, inherently, "anti-aliases" the world around it when capturing an image (a very simplified explanation, but it serves the point). So even if the image is compressed/downscaled post-render, you can still often see a difference between an image rendered in 4K and an image rendered in 1080p (especially in this case, where they zoom in on the image). In fact, if you can see the difference on a downscaled and compressed video, then the difference you'll notice in real life is much more noticeable.

 

I have gamed at 1080p vs 4K and the difference is absolutely massive, especially with aliasing, which is a personal pet peeve of mine. The other big difference is that once you use 4K for a little while, 1080p looks blurry and out of focus in comparison.

 

You can try this for yourself on a 1080p screen by playing a game that allows you to change the resolution scaling (like BF1, for example). Play at 1080p and 100% scaling, then switch to 1080p with 200% scaling (basically rendering at 4K); you should notice a huge difference.
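The super-sampling idea described above (render at a higher resolution, then average back down, which softens hard render edges into intermediate greys) can be sketched in a few lines. This is a minimal box-filter downsample over a list-of-lists greyscale image with made-up sample data; the `downsample_2x` helper is purely illustrative, not from any game engine:

```python
def downsample_2x(img):
    """Box-filter a 2x-supersampled greyscale image down to half resolution:
    each output pixel is the average of a 2x2 block of source pixels."""
    out = []
    for y in range(0, len(img), 2):
        row = []
        for x in range(0, len(img[y]), 2):
            block_sum = img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]
            row.append(block_sum / 4)
        out.append(row)
    return out

# A hard diagonal edge "rendered" at 2x resolution: white (255) above the
# diagonal, black (0) below it.
hi_res = [[255 if x > y else 0 for x in range(4)] for y in range(4)]
lo_res = downsample_2x(hi_res)
# lo_res == [[63.75, 255.0], [0.0, 63.75]] -- blocks straddling the diagonal
# average out to grey, which is exactly the "inherent anti-aliasing" effect.
```

Real renderers use fancier filters than a plain 2x2 box, but 200% resolution scale plus downsample is conceptually this operation.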

 

Having said all that: I think the sweet spot right now is 1440p for monitors 24-28", and 4K is only worth it if you have a beast of a GPU and are using a screen larger than 28" (on a much larger screen, or when using a TV as a monitor, 4K definitely is worth the higher PPI). I'd personally rather have 1440p/144Hz than 4K/60Hz, considering they're about the same price.

Primary PC-

CPU: Intel i7-6800k @ 4.2-4.4Ghz   CPU COOLER: Bequiet Dark Rock Pro 4   MOBO: MSI X99A SLI Plus   RAM: 32GB Corsair Vengeance LPX quad-channel DDR4-2800  GPU: EVGA GTX 1080 SC2 iCX   PSU: Corsair RM1000i   CASE: Corsair 750D Obsidian   SSDs: 500GB Samsung 960 Evo + 256GB Samsung 850 Pro   HDDs: Toshiba 3TB + Seagate 1TB   Monitors: Acer Predator XB271HUC 27" 2560x1440 (165Hz G-Sync)  +  LG 29UM57 29" 2560x1080   OS: Windows 10 Pro

Album

Other Systems:

Spoiler

Home HTPC/NAS-

CPU: AMD FX-8320 @ 4.4Ghz  MOBO: Gigabyte 990FXA-UD3   RAM: 16GB dual-channel DDR3-1600  GPU: Gigabyte GTX 760 OC   PSU: Rosewill 750W   CASE: Antec Gaming One   SSD: 120GB PNY CS1311   HDDs: WD Red 3TB + WD 320GB   Monitor: Samsung SyncMaster 2693HM 26" 1920x1200 -or- Steam Link to Vizio M43C1 43" 4K TV  OS: Windows 10 Pro

 

Offsite NAS/VM Server-

CPU: 2x Xeon E5645 (12-core)  Model: Dell PowerEdge T610  RAM: 16GB DDR3-1333  PSUs: 2x 570W  SSDs: 8GB Kingston Boot FD + 32GB Sandisk Cache SSD   HDDs: WD Red 4TB + Seagate 2TB + Seagate 320GB   OS: FreeNAS 11+

 

Laptop-

CPU: Intel i7-3520M   Model: Dell Latitude E6530   RAM: 8GB dual-channel DDR3-1600  GPU: Nvidia NVS 5200M   SSD: 240GB TeamGroup L5   HDD: WD Black 320GB   Monitor: Samsung SyncMaster 2693HM 26" 1920x1200   OS: Windows 10 Pro

Having issues with a Corsair AIO? Possible fix here:

Spoiler

Are you getting weird fan behavior, speed fluctuations, and/or other issues with Link?

Are you running AIDA64, HWinfo, CAM, or HWmonitor? (ASUS suite & other monitoring software often have the same issue.)

Corsair Link has problems with some monitoring software so you may have to change some settings to get them to work smoothly.

-For AIDA64: First make sure you have the newest update installed, then, go to Preferences>Stability and make sure the "Corsair Link sensor support" box is checked and make sure the "Asetek LC sensor support" box is UNchecked.

-For HWinfo: manually disable all monitoring of the AIO sensors/components.

-For others: Disable any monitoring of Corsair AIO sensors.

That should fix the fan issue for some Corsair AIOs (H80i GT/v2, H110i GTX/H115i, H100i GTX and others made by Asetek). The problem is bad coding in Link that fights for AIO control with other programs. You can test if this worked by setting the fan speed in Link to 100%, if it doesn't fluctuate you are set and can change the curve to whatever. If that doesn't work or you're still having other issues then you probably still have a monitoring software interfering with the AIO/Link communications, find what it is and disable it.


10 minutes ago, Tedster said:

Sorry, I'm still somewhat unclear.

You're talking about how GN mentioned 4x the "CUDA core throughput" in reference to performance for HPC tasks, and somehow relating that to gaming?

Well, it's like running multiple high-end CPUs for rendering: it speeds up the process. Even though we aren't currently fully using this tech, and almost all games are limited to 32 cores, it's a step forward in the theoretical direction of GPU tech. In another sense, I'm saying we're moving forward faster. Still, a high-end GPU that's meant for creating content isn't something we should care about; it won't perform better in games. But it will, or might, give offspring of other tiers, similar products that are optimized for gaming.


It's worth it for me, but that's because I have a 43" monitor. I could probably go down to 38" before things get too small at 100% scaling. In terms of clarity, it's about the same as a 27" at 2560x1440 or a 23" at 1920x1080.
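That clarity equivalence is straightforward pixels-per-inch arithmetic: PPI is the diagonal resolution in pixels divided by the diagonal size in inches. A quick sketch (the `ppi` helper is just for illustration):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal length in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The three "same clarity" combinations from the post:
print(round(ppi(3840, 2160, 43)))  # 43" 4K    -> 102 PPI
print(round(ppi(2560, 1440, 27)))  # 27" 1440p -> 109 PPI
print(round(ppi(1920, 1080, 23)))  # 23" 1080p -> 96 PPI
```

All three land in roughly the 96-109 PPI band, which is why they look similarly sharp at a typical desk viewing distance.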

Aragorn (WS): 250D | 6800k | 840 Pro 512GB | Intel 530 480GB  | Asus X99-M WS | 64GB DDR4 | Corsair HX720i | GTX 1070 | Corsair H115i | Philips BDM4350UC 43" 3840x2160 IPS

Gimli (server):  Node 304 | G4560 | ADATA XPG SX8000 128GB | 2x 5TB WD Red | ASROCK H270M-ITX/AC  | 8GB DDR4 | Seasonic 400FL

 Omega (server):                 Fractal Arc Mini R2 | i3 4130 | 500GB Maxtor | 2TB WD Red : Raid 1 | 3TB Seagate Barracuda | 16GB RAM | Seasonic G-450w
Alpha (WS): 900D | 4770k | GTX 780  | 840 Pro 512GB  | GA-Z87X-OC | Corsair RM 850 | 24GB 2400mhz | Samsung S27B970D 2560x1440



I made the jump to a 32-inch 4K/60Hz from a 34-inch 1440p/100Hz (21:9), and for me it was worth it.

I have 1080 Tis in SLI; if I only had one card, though, I'm not sure I'd be happy. Running Fallout 4 without G-Sync or V-Sync, I'm anywhere between 70 and 90 FPS at 4K max settings.

 


I personally don't even see any difference between 1080p and 4K on a 28" 4K monitor at arm's length, in fullscreen. I do see a slight difference when alternating between a 1080p image and the same image at 4K, but when viewing content at the same resolution, you just don't notice it. At least I don't, but maybe my eyes are just bad, or everyone else has super vision.

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


17 hours ago, zindan said:

You'll be waiting a long time, then.

Well, I did say I'm planning on waiting "a few years", so if that's a long time to you...

 

1440p is treating my 980 Ti just fine, and so far I can play any game at ultra settings on it. Makes me a happy camper.


17 hours ago, zindan said:

Will 1080 Ti SLI hold up until then?

Check my post on Volta in Graphics Cards. It's been there for 6 months already!!

