
Why is Linus mentioning 4K?

filipcro

I find it really annoying that Linus keeps mentioning 4K, confusing people with something useless and irrelevant.

 

A couple of facts:

 

1. No game supports this resolution.

2. Nobody seriously thinks any game will support this resolution for at least 5 years.

3. You can barely get 60 fps at 1920x1200 with the highest settings in most games, even with the most expensive GPU you could possibly buy.

4. The necessary leap would make GPUs monstrously expensive by default and would drastically shorten the GPU upgrade cycle, making the whole thing completely unviable.


 

Well, it's a new technology, so why shouldn't he mention it?

 

GPUs are getting more powerful, so give it time :) It would be interesting to see three 4K displays and gaming across them.



Maybe because 4K is pretty awesome? o.O And I think with the new consoles we'll see much better support for 4K.


Maybe because 4K is pretty awesome? o.O And I think with the new consoles we'll see much better support for 4K.

 

 

For what? Playing games at 10fps?


Single GPUs are going to get there quickly, and many people are already playing at beyond-4K resolutions (Surround/Eyefinity with 1440p/1600p panels) using multiple GPUs.

 

If I could buy a 32" 4k panel, I would....
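A quick, illustrative pixel count of those surround setups versus a single 4K panel (assuming the usual 2560x1440 and 2560x1600 panels for the "1440/1600p" numbers):

# Rough pixel-count comparison: triple-monitor surround vs a single 4K panel.
resolutions = {
    "1080p": (1920, 1080),
    "3 x 1440p surround": (3 * 2560, 1440),
    "3 x 1600p surround": (3 * 2560, 1600),
    "4K UHD": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name:20s} {w} x {h} = {w * h:>10,} pixels")

# 3 x 1440p surround is ~11.1 million pixels and 3 x 1600p is ~12.3 million,
# both more than the ~8.3 million of a single 4K UHD panel.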



It's a tech channel, and 4K monitors are cutting-edge technology. It doesn't take a genius to work out why he talks about them...


Games do support 5760x1080, which can be thought of as 3K. I'm sure that, as graphics hardware improves, games will support 4K very soon.


4K is awesome, and you can do way more with a PC than play games. I think it will take 2-3 years before 4K is as practical to use as 1440p/1600p is now.


You'll need to do a little more research on this topic, I think.

4K is supported by pretty much any PC game, and modern graphics cards can handle many of them.

Another cool thing is that, because it's exactly 4x 1080p, you can run 1080p content without interpolation.

Not to mention the productivity advantages of higher-resolution displays.
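To spell out the "exactly 4x 1080p" point, here is a minimal arithmetic sketch (nothing implementation-specific, just the resolution math):

# 4K UHD (3840x2160) is an exact 2x multiple of 1080p in both dimensions.
uhd_w, uhd_h = 3840, 2160
fhd_w, fhd_h = 1920, 1080

scale_x = uhd_w / fhd_w                            # 2.0
scale_y = uhd_h / fhd_h                            # 2.0
pixel_ratio = (uhd_w * uhd_h) / (fhd_w * fhd_h)    # 4.0 -> "exactly 4x 1080p"

# Whole-number scale factors mean every 1080p pixel owns a clean 2x2 block of
# 4K pixels, so the panel never has to blend neighbouring pixels together.
assert scale_x.is_integer() and scale_y.is_integer()
print(scale_x, scale_y, pixel_ratio)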


Lots of game engines can be set to "any" resolution you want. With a good dual-GPU setup you should be able to do OK. And remember, AA shouldn't be necessary at 4K.

Edit: Ninja'd by Linus!


Interpolation?

If you don't run the native resolution, or an exact fraction of it like 1/4, you lose a lot of image quality. Just set your display to 1600x900 (if you have a 1080p display): your monitor then needs to interpolate new pixels between existing ones. Text loses sharpness and everything looks squishy.

 

Just play around with your resolution and you can see for yourself.
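As a small illustration of why that blending happens (just arithmetic, not tied to any particular monitor): scaling 1600x900 up to a 1920x1080 panel gives a factor of 1.2, so most panel pixels fall between two source pixels.

# Scaling 1600x900 content to a 1920x1080 panel: the factor is 1.2, so most
# panel pixel positions fall between two source pixels and must be blended.
panel_w, source_w = 1920, 1600
scale = panel_w / source_w              # 1.2, not a whole number

for target_x in range(6):
    source_x = target_x / scale         # where this panel pixel samples the source
    status = "exact" if source_x.is_integer() else "between two source pixels"
    print(f"panel pixel {target_x} -> source position {source_x:.2f} ({status})")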


Interpolation?

In discrete mathematics, you only have information at specific points. Say you measure something, like the temperature inside an oven: you take one measurement every minute and note it down. So you know what temperature the oven had at 3 minutes and at 4 minutes, but if you want to know the temperature at 3 minutes and 30 seconds, you don't have a data point for it. What you do then is guess how the temperature changes over time and use the data you do have to estimate the value at 3:30. Normally you would just use linear interpolation, so at any midpoint you simply take the average of the two closest known values.

Anyway, since the image signal only contains information about certain points, if the resolution provided by the hardware doesn't match the display's native resolution, the display ends up having to light pixels it has no data for, so it has to guess what should be there, much like the oven example ;)
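Here is the oven example as a tiny sketch; the temperatures are made up, purely to show the linear-interpolation step:

# Linear interpolation between two known samples, as in the oven example:
# readings exist at minute 3 and minute 4, and we estimate the value at 3:30.
def lerp(t0, v0, t1, v1, t):
    """Estimate the value at time t from the samples (t0, v0) and (t1, v1)."""
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

temp_at_3 = 180.0   # made-up oven reading at minute 3 (degrees C)
temp_at_4 = 200.0   # made-up oven reading at minute 4 (degrees C)

# At the midpoint (3:30) this is just the average of the two readings: 190.0
print(lerp(3.0, temp_at_3, 4.0, temp_at_4, 3.5))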


When thinking of 4K, gaming is the least of my concerns. It'll be great for photo editing and would be king for productivity. When a 27-inch 4K monitor comes out, I'll jump on it quickly and use my two 23-inch 1080p panels on either side for additional screen space for sub-tasks. Gaming at 4K isn't that important to me.


Interesting. What about 1920x1200?

It's the same problem. 960x600 would look good; everything in between would not :D


Interesting. What about 1920x1200?

What about it? You can't run it on a 1080p monitor. If you have a 1920x1200 monitor and you set it to a 16:9 resolution, you can either have it add black bars and act like a 1080p monitor, or have it stretch the image, in which case things will look noticeably different from native resolution.


Interpolation?

 

Here's something I quickly drew in SNote.

 

[attached sketch: KTIjb4f.jpg]

 

As you can see, a 2x2 image cannot be scaled pixel-perfectly to a 3x3 image. It will require interpolation.

 

However, it can scale fine to 4x4 by using 4 pixels as one pixel.
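And here is the same idea as the sketch in code form, a minimal nearest-neighbour example with arbitrary pixel labels:

# A 2x2 image scales pixel-perfectly to 4x4: each source pixel simply becomes
# a 2x2 block. A 3x3 target cannot be reached this way, because 3 is not a
# whole multiple of 2, so the extra row/column would have to be interpolated.
image_2x2 = [
    ["A", "B"],
    ["C", "D"],
]

def integer_upscale(image, factor):
    """Repeat every pixel `factor` times horizontally and vertically."""
    return [
        [pixel for pixel in row for _ in range(factor)]
        for row in image
        for _ in range(factor)
    ]

for row in integer_upscale(image_2x2, 2):
    print(row)
# ['A', 'A', 'B', 'B']
# ['A', 'A', 'B', 'B']
# ['C', 'C', 'D', 'D']
# ['C', 'C', 'D', 'D']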



How about some 4K Benchmarks?

Here is a 4K benchmark from PCPerspective: http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-High-End-GPUs-Benchmarked-4K-Resolutions


1. No, most game engines can run at custom resolutions.

2. No.

3. With Titan SLI, you can do pretty well on a 120 Hz monitor in Crysis 3.

4. That makes no sense.


3. You can barely get 60 fps at 1920x1200 with the highest settings in most games, even with the most expensive GPU you could possibly buy.

 

I have a GTX 580. I can play at 60 fps on max settings in basically every game I own...


Because I would rather buy one 30" 4K monitor than have a 2x2 grid of 22"/24" monitors for the office.

 

 


  • 3 weeks later...


 

1: Wrong.

2: Wrong again.

3: Wrong.

4: Expensive, but hardly "monstrous".


I thought Linus had a tech channel and not a gaming channel.

 

You seriously have to understand the productivity increase that 4K will bring.


 


You also have to realise that they aren't just for gaming. I don't know where you get the idea that a 4k monitor would only be used for gaming.

 

Consider this: gaming is probably one of the smallest shares of computer use; it's dominated by business-oriented stuff (word processing, data entry, programming), web browsing, movie watching, and probably even schoolwork.

 

The reason he might be mentioning it (other than the fact that it's a tech channel and this is technology) is probably that, like 98% of the world, he does things other than gaming on his computer. Remember, this isn't a gaming channel; it's about technology. It's the same reason he was looking at TVs and other non-gaming stuff at CES.

15" MBP TB

AMD 5800X | Gigabyte Aorus Master | EVGA 2060 KO Ultra | Define 7 || Blade Server: Intel 3570k | GD65 | Corsair C70 | 13TB

