
What do you think about 4K?

Lairlair

Hi folks,

 

Lately I've been thinking a lot about how 4K isn't that great.

For a 55" 4K TV you'd need to be about 1 meter (3.3 feet) away from it to start seeing the difference between individual pixels, and from around 5 meters away (16.4 feet) you physically wouldn't be able to notice the difference between full HD and 4K. I know a more realistic scenario is to sit somewhere in between, so that the TV takes up about 30° of your field of view (as seen in this LTT video), which would be about 2m/6.5 feet away. But still, at that distance, if you swapped from full HD to 4K on a film/video game... how many people would notice the difference? I'm aware that I'm talking to a tech-savvy crowd, but I'm sure most people around me wouldn't notice. I even worked in a cinema for a few years, and they only used 2K projectors. Once I went to another cinema that used 4K and I couldn't for the life of me see any benefit in image quality.
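For anyone who wants to check those numbers, here's a back-of-the-envelope sketch, assuming 20/20 vision resolves about one arcminute and a 16:9 panel (the function name is mine):

```python
import math

def max_useful_distance_m(diagonal_in, horiz_px, arcmin=1.0):
    """Distance at which one pixel subtends `arcmin` arcminutes
    (one arcminute is roughly the resolving limit of 20/20 vision)."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 panel width
    pitch_m = (width_in / horiz_px) * 0.0254         # pixel pitch in metres
    return pitch_m / math.tan(math.radians(arcmin / 60))

# 55" panel: beyond ~1.1 m individual 4K pixels blur together;
# beyond ~2.2 m even 1080p pixels do.
print(round(max_useful_distance_m(55, 3840), 2))  # 1.09
print(round(max_useful_distance_m(55, 1920), 2))  # 2.18
```

The 1 m figure lines up with one-arcminute acuity; the 5 m cutoff corresponds to a more forgiving assumption about real-world eyesight.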

 

So this is my tepid take: I'll admit that 4K does have some small benefits for the average Joe, and can be useful for enthusiasts or people working in video/image production. BUT the flip side is that it's a lot more expensive, and I'm not just talking about money. It uses more energy (for the TVs, but also for the servers streaming 4x more pixels and the graphics cards processing them all) and requires upgrading a whole production chain for any of it to even start making sense (so we're talking mineral extraction, refining and assembly for all the new cameras, monitors, computers and other equipment). And for what? So you can tell Chris Hemsworth's beard hairs apart? Is it really worth it?

 

What do you think? Do you use 4K and like it? Would you recommend it and why? Are you also caught between being excited for new tech and hating how dirty its production is?


For consuming media in general, I think 1080 to 4K is less of a step up than SD to 1080 was.

 

My main PC monitor is a 43" 4K TV though, and for that use case it's amazing. Regular PC monitors feel like working through a periscope in comparison. There's SO. MANY. PIXELS.

I sold my soul for ProSupport.


I tend to agree, depending on your setup. 4K can start to have a really big impact on larger screens.
I have a projector setup with a 120-inch screen, and switching between 1080p and 4K content is extremely noticeable.
However, watching the same content on my living room TV, the difference is marginal at best.

 

For things like gaming and even regular PC use, I would rather the power go to refresh rate.

 

If your question is answered, mark it so.  | It's probably just coil whine, and it is probably just fine |   LTT Movie Club!

Read the docs. If they don't exist, write them. | Professional Thread Derailer

Desktop: i7-8700K, RTX 2080, 16G 3200Mhz, EndeavourOS(host), win10 (VFIO), Fedora(VFIO)

Server: ryzen 9 5900x, GTX 970, 64G 3200Mhz, Unraid.

 


50 minutes ago, Lairlair said:

Lately I've been thinking a lot about how 4K isn't that great.

For a 55" 4K TV you'd need to be about 1 meter (3.3 feet) away from it to start seeing the difference between individual pixels, and from around 5 meters away (16.4 feet) you physically wouldn't be able to notice the difference between full HD and 4K.

The whole point of going higher resolution is to reach a point where the picture becomes seamless. If your focus is on being able to clearly see the difference between adjacent pixels, you're completely missing the point; the goal is the opposite: for them to seamlessly blend into each other.

 

For example, if I play The Crew 2 at 1440p it looks like crap, because the engine is rather dated now. But if I run it at 4K it looks a lot more seamless and realistic. In a lot of older games I don't have to worry about anti-aliasing, because the pixels bleed together, creating natural anti-aliasing. I mean, it doesn't fix shimmering, but in a lot of games nothing does.

 

Why anyone would want to sit 5m away from a 55" screen is beyond me. Do you always sit right at the back when going to the cinema? I know I don't; I want the image to almost fill my field of view and be in the sweet spot for surround sound. Far too many people sit way too far from the TV and at an awkward angle (looking at you, people who put the TV right up at the ceiling), which is not only awkward to watch, it's bad for your neck.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


I sit about 2m away from a 55" 4K TV for most of my PC gaming. The difference between 1080p and 4K is obvious, with a bit of "it depends on the game". Basically, the more photorealistic the game is, the less it matters. Anything with fine detail is really noticeable.

 

I find Genshin Impact blurry-looking running at native 1080p on my laptop compared to 4K on the TV. However, Yakuza: Like a Dragon looked hardly any different at 1080p and 4K, apart from aliasing of fine detail such as distant buildings.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 hour ago, Lairlair said:

And for what? That you can tell Chris Hemsworth's beard hairs apart? Is it really worth it?

Totally worth it, I want to see every follicle.

 

Jokes aside, for my TV I'm pretty content with 1080p. When I get a new one I'll go for 4K, because why not? But I'd be OK with 1080p. It does make a difference, but sure, it's not as marked as going from SD to 1080p (although some people can't even notice that).

 

On monitors it makes a huge difference though.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


The advantage of 4k is that you can have a bigger screen without making the image look ugly when you're close to it. This is great for gaming and high res videos since it can look really sharp and pretty.

 

But yeah there are definitely diminishing returns at this point.

Ryzen 1600x @4GHz

Asus GTX 1070 8GB @1900MHz

16 GB HyperX DDR4 @3000MHz

Asus Prime X370 Pro

Samsung 860 EVO 500GB

Noctua NH-U14S

Seasonic M12II 620W

+ four different mechanical drives.


1 hour ago, Alex Atkin UK said:

Do you always sit right at the back when going to the cinema?

I sit somewhere in the middle. I just cited 1m and 5m as the two extremes, where it's not best to sit anyway.

 

18 minutes ago, Giganthrax said:

But yeah there are definitely diminishing returns at this point.

Yeah, that's what I find most problematic: diminishing returns at increasing costs (consumer price tag and environmental burden).


2 hours ago, Needfuldoer said:

For consuming media in general, I think 1080 to 4K is less of a step up than SD to 1080 was.

 

My main PC monitor is a 43" 4K TV though, and for that use case it's amazing. Regular PC monitors feel like working through a periscope in comparison. There's SO. MANY. PIXELS.

Same here. I don't really mind the extra res for media consumption, but using a 4k 43" TV as a monitor gives me so much real estate and allows me to have so much stuff open at the same time, it's awesome.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


On 8/1/2022 at 3:41 PM, igormp said:

Same here. I don't really mind the extra res for media consumption, but using a 4k 43" TV as a monitor gives me so much real estate and allows me to have so much stuff open at the same time, it's awesome.

Or photos. Seriously, look at good DSLR photos at 1080p, then look at them at 2160p; it's an even bigger difference because of the quality. Smartphone photos, even the best ones, look like junk on a good monitor or TV, as you can see the noise reduction and lack of dynamic range.

 

I feel a lot of people are missing the difference because of streaming, which kills a lot of the finer detail. I mostly watch streaming myself, and a properly mastered 4K Blu-ray (even a 2K upscale) is night-and-day different to most streaming content; sometimes even a normal 1080p Blu-ray will look better than the 4K streaming edition. The only thing that seems consistently good is Stranger Things, where I suspect Netflix makes extra sure it looks terrific, though I have season 2 on UHD Blu-ray and it still looks better than the streaming version.
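The streaming-vs-disc gap is easy to put rough numbers on. The bitrates below are ballpark typical values, not exact service specs, and codec efficiency differences are ignored:

```python
# Rough bits-per-pixel-per-frame for a few sources, showing how thinly a
# 4K stream spreads its bits compared to disc formats.
def bits_per_pixel(mbps, width, height, fps=24):
    """Average bits available per pixel per frame at a given bitrate."""
    return mbps * 1e6 / (width * height * fps)

print(round(bits_per_pixel(15, 3840, 2160), 3))  # 0.075  ~4K streaming
print(round(bits_per_pixel(80, 3840, 2160), 3))  # 0.402  ~UHD Blu-ray
print(round(bits_per_pixel(30, 1920, 1080), 3))  # 0.603  ~1080p Blu-ray
```

By this crude measure a 1080p disc spends roughly eight times the bits per pixel of a typical 4K stream, which is why it can keep fine detail the stream smears away.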

 

For gaming it entirely depends on the game. A game with extensive post-processing can look great at 1080p, but a game with less will look rough at 1080p and a whole lot better at 2160p, as I already mentioned. This is even more so if the game has a long render distance. It's actually a little distracting in The Crew 2, for example, as it renders fake traffic in the distance that you can visibly see disappear when you get closer, as it pulls in the real traffic.

But things like the Forza Horizon games look so much more realistic at 4K. It was actually a big reason I got a 3080: Forza Horizon 4 looked night-and-day different compared to 1440p, but it feels more realistic at 80+ fps.

 

I've even AI upscaled a few movies from 1080p to 4K and the image appears a lot clearer with more depth.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


Just for watching TV at a normal living-room distance it may not matter. But all newer TVs are 4K, so the whole question is moot. It's almost like wondering if you should buy a color TV.

 

Even if you had to pay a bit more to get a 4K TV, it would be stupid not to do it. It keeps your options open for future use. Like, you could recruit a living room TV to become a monitor. Or you end up sitting closer to the TV than you thought.

 

Some years ago my workplace bought 55" wall-hung HD TVs for conference rooms since they were a bit cheaper than 4K; maybe they saved $50 on a $1,200 purchase. Those are useless now, since if you look at a PDF of plans or anything similar you get fuzzy eyes from the large pixels. So to actually use them they'd have to buy new 4K TVs/monitors and throw these away, since no one today will buy an HD TV.

 

If someone still manufactures 55" HD TVs these days, they can ship them directly to the recycling center to avoid shipping them directly to the landfill.

AMD 9 7900 + Thermalright Peerless Assassin SE

Gigabyte B650m DS3H

2x16GB GSkill 6000 CL30

Samsung 980 Pro 2TB

Fractal Torrent Compact

Seasonic Focus Plus 550W Platinum

W11 Pro


Another thing is the quality of the screen. My 4K monitor looks great for photos, but watching a 4K HDR movie on it looks like utter garbage, as it's only an HDR400 monitor.

 

Now watch the same movie on an OLED TV, or a MacBook Pro M1 XDR: night-and-day difference. There is much more to 4K than just the resolution; it's why I own some movies on 4K even though they weren't mastered in 4K, because the HDR grading makes a HUGE difference to perceived detail and depth.

 

I was actually a doubter when it came to HDR; I didn't believe it could be remotely as good as people were claiming. But once I got an LG OLED TV, I was proven very wrong.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


25 minutes ago, Mel0n. said:

It's just another resolution in the gradual progression of resolutions. Eventually it'll be as common as 1080p is now and people will ask, "Oh your display is only 4k?"

Meanwhile it took practically a decade for 768P to die.

Intel® Core™ i7-12700 | GIGABYTE B660 AORUS MASTER DDR4 | Gigabyte Radeon™ RX 6650 XT Gaming OC | 32GB Corsair Vengeance® RGB Pro SL DDR4 | Samsung 990 Pro 1TB | WD Green 1.5TB | Windows 11 Pro | NZXT H510 Flow White
Sony MDR-V250 | GNT-500 | Logitech G610 Orion Brown | Logitech G402 | Samsung C27JG5 | ASUS ProArt PA238QR
iPhone 12 Mini (iOS 17.2.1) | iPhone XR (iOS 17.2.1) | iPad Mini (iOS 9.3.5) | KZ AZ09 Pro x KZ ZSN Pro X | Sennheiser HD450bt
Intel® Core™ i7-1265U | Kioxia KBG50ZNV512G | 16GB DDR4 | Windows 11 Enterprise | HP EliteBook 650 G9
Intel® Core™ i5-8520U | WD Blue M.2 250GB | 1TB Seagate FireCuda | 16GB DDR4 | Windows 11 Home | ASUS Vivobook 15 
Intel® Core™ i7-3520M | GT 630M | 16 GB Corsair Vengeance® DDR3 |
Samsung 850 EVO 250GB | macOS Catalina | Lenovo IdeaPad P580


It's kinda pointless, but all TVs are 4K now. The only 1080p ones are ultra-budget models with terrible panels. No one will buy a 500+ TV that isn't 4K, so it HAS to be 4K now. Similar to laptops and phones, where they'll push higher resolutions even when it's just not needed.


For monitors I'd put color gamut before pixel density + 4K/HDR (I probably won't buy a 4K monitor without true HDR support).

 

That being said, the image-quality jump from a 24G2 to a 4K HDR monitor doesn't justify the cost increase of the monitor itself, plus the GPU needed to drive 4K/120Hz.

 

Maybe 4K will become more important once GPUs can comfortably drive it.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


6 hours ago, Lairlair said:

Hi folks,

Lately I've been thinking a lot about how 4K isn't that great. [...] What do you think? Do you use 4K and like it? Would you recommend it and why? Are you also caught between being excited for new tech and hating how dirty its production is?

I am a meter away from my OLED and I don't see pixels. 

Here is my setup (photo attached in a spoiler).

It is worth it to me.

I started in 2015, and it was mainly for my modded games that used 2K and 4K textures.

It was also great for space and plane sims since distant objects have more detail.

Since I can easily do 1000s of hours in these types of games, 4k is worth it.

 

Most games don't have enough texture detail to benefit from it, or they have poor-quality LODs that make distant objects look even worse.

 

Before I started doing 4K I usually bought xx70 cards, like the GTX 470, 570, 670 and 970, but after 4K it was a 980, then 2x 980 Tis in SLI, and even 2x 1080 Tis in SLI.

Now it is 3090s and 3090 tis. 

 

Now it is only expensive if you want to be close to 120Hz as well.

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


I'm using a 28" 4K display, and compared to 1440p or 1080p it is just amazing. At the distance I sit from the monitor I cannot see individual pixels, but with 1440p and 1080p at the same distance I can. That means I get a way sharper and more realistic-looking image while also only needing the lowest AA options. As someone who isn't into fast-paced twitch shooters I don't care about FPS that much, and to me the prettier image is far more enjoyable and important than FPS. I honestly don't want to go back to anything lower than 4K at this point.

Desktop: i9-10850K [Noctua NH-D15 Chromax.Black] | Asus ROG Strix Z490-E | G.Skill Trident Z 2x16GB 3600Mhz 16-16-16-36 | Asus ROG Strix RTX 3080Ti OC | SeaSonic PRIME Ultra Gold 1000W | Samsung 970 Evo Plus 1TB | Samsung 860 Evo 2TB | CoolerMaster MasterCase H500 ARGB | Win 10

Display: Samsung Odyssey G7A (28" 4K 144Hz)

 

Laptop: Lenovo ThinkBook 16p Gen 4 | i7-13700H | 2x8GB 5200Mhz | RTX 4060 | Linux Mint 21.2 Cinnamon


9 hours ago, Lairlair said:

For a 55" 4K TV you'd need to be about 1 meter (3.3 feet) away from it to start seeing the difference between individual pixels

There are two sides to this coin. One is simply reducing pixel size so that you can, for example, sit closer without seeing individual pixels. There is another aspect to images however, and that is sampling of detail. A single pixel can't convey much information besides a colour. You can only start to see structures of things that are 2-3 pixels or more in size, because then you start resolving the actual structure, and won't look smooth until you go a bit further than that. Higher resolutions are thus also needed to show proper detail of things larger than the individual pixels. It's also nice for geometric constructions where you have lines and whatnot to have the 'natural AA' of smaller pixels. How much detail you can resolve does of course depend on how far away you sit from the screen, so there is a limit.
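That sampling argument can be made concrete as angular pixel density. A quick sketch, assuming a 16:9 panel (the function name is mine):

```python
import math

def pixels_per_degree(diagonal_in, horiz_px, distance_m):
    """How many pixels cover one degree of the viewer's field of view."""
    width_m = diagonal_in * 16 / math.hypot(16, 9) * 0.0254  # 16:9 width, metres
    pitch_m = width_m / horiz_px                             # pixel pitch
    return 1 / math.degrees(math.atan(pitch_m / distance_m))

# At 2 m from a 55" set, 1080p sits right at the ~60 px/deg limit of
# 20/20 vision, while 4K doubles the sampling across any given feature.
print(round(pixels_per_degree(55, 3840, 2.0)))  # 110
print(round(pixels_per_degree(55, 1920, 2.0)))  # 55
```

So at that distance a feature needs to be only half as large on a 4K panel before it spans the 2-3 pixels required to resolve its structure.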

 

9 hours ago, Lairlair said:

What do you think? Do you use 4K and like it? Would you recommend it and why? Are you also caught between being excited for new tech and hating how dirty its production is?

I do use 4K. Productivity-wise it's great to have that much screen real estate. For the TV, well, there isn't really much choice. It's still great, but I do think HDR may have a bigger overall impact on perceived image quality than 4K.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


I've never thought 4K was anything special. Too hard to drive for too little of a difference, imho. 1080p works fine for me. I do have a 1440p monitor and it was a nice little bump up from 1080p, but for me, I have no desire for 4K content/games. (I don't expect anyone else to agree with me, though.)

System Specs: Second-class potato, slightly mouldy


You do see a difference, especially on larger TVs. I own a 65-inch 4K, and the picture is much better in 4K. 1080p on a 65-inch looks like 480p would on a sub-40-inch set next to 1080p: the pixels per inch aren't up to par because the screen is so much larger.

 

If your screen is much smaller, you can get away with a lower resolution and it still looks like the same quality. You can see this with the smaller iPhones, the Switch, and the Steam Deck: they aren't rocking full 1080p HD, but they still have good-quality pictures thanks to the pixel density of the smaller screen.
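The density point is easy to put numbers on; a quick sketch (the 7" 1280x800 line is a Steam Deck-class screen):

```python
import math

def ppi(diagonal_in, width_px, height_px):
    """Linear pixel density of a panel, in pixels per inch."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(65, 1920, 1080)))  # 34  -- 65" 1080p TV: very coarse
print(round(ppi(65, 3840, 2160)))  # 68  -- 65" 4K TV
print(round(ppi(7, 1280, 800)))    # 216 -- 7" handheld-class screen
```

The small screen beats both TVs on density despite the far lower resolution, which is why sub-1080p handhelds still look sharp.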

Sudo make me a sandwich 


14 hours ago, Lairlair said:

Is it really worth it?

Based on our data usage and how close we get to our internet cap, it's not worth it for me. I still use a 720p TV. I've recently thought about upgrading to a 4K one just for the extra screen real estate for when I use my MacBook Pro in bed.
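For a rough sense of what streaming does to a data cap (assuming constant average bitrates, which real adaptive streams only approximate):

```python
def gb_per_hour(mbps):
    """Data consumed per hour of streaming at a given average bitrate."""
    return mbps * 1e6 * 3600 / 8 / 1e9

print(round(gb_per_hour(5), 2))   # 2.25  -- typical 1080p stream
print(round(gb_per_hour(15), 2))  # 6.75  -- typical 4K stream
```

At roughly triple the data per hour, a 4K habit eats a monthly cap three times as fast as 1080p.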

I just want to sit back and watch the world burn. 


Depends on whether your eyes have gigapixel sensors built in, or whether you sit about 10cm away from your 55-inch TV. 4K is just a scam to me. I couldn't tell the difference between a 1080p and a 4K image side by side unless I walked very close to the TV.

I have ASD (Autism Spectrum Disorder). More info: https://en.wikipedia.org/wiki/Autism_spectrum

 

I apologize if my comments or posts offend you in any way, or if my rage goes a little too far. I'll try my best to make my posts as non-offensive as possible.


4 hours ago, Chiyawa said:

4k is just a scam to me

Oh but wait until you see an 8K TV 😂

