
Are we spoiled by HD?

Mooshi

1080p IS normal, 720p is pathetic!

....I'm waiting for 4k.....

My main monitor is 720p. I'll upgrade when I have money, so I can utilize the full power of my 780...

Someone told Luke and Linus at CES 2017 to "Unban the legend known as Jerakl" and that's about all I've got going for me. (It didn't work)

 


We just need higher bitrate videos now to keep up with the resolution! The worst thing is watching a 1080p youtube video with a bitrate of less than 14Mb/s...
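As a rough sketch of why bitrate matters here (the numbers are illustrative, not any streaming service's actual encoding settings), bitrate translates directly into data transferred:

```python
# Rough streaming-size math: how bitrate turns into data downloaded.
# Numbers are illustrative, not any service's actual encoding settings.

def video_size_mb(bitrate_mbps: float, seconds: float) -> float:
    """Approximate data transferred, in megabytes, for a given video bitrate."""
    return bitrate_mbps * seconds / 8  # divide by 8: bits -> bytes

# A 10-minute 1080p video at the 14 Mb/s mentioned above,
# versus a heavily compressed ~4 Mb/s stream:
print(video_size_mb(14, 600))  # 1050.0 MB
print(video_size_mb(4, 600))   # 300.0 MB
```

Triple the bitrate means triple the bandwidth, which is why streaming services compress so aggressively.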

 

 

h.265, my friend. Read up on it, and rejoice, for thou hast forthwith double the efficiency.

I am a female pc hardware expert and enthusiast, over 170 IQ, been in the tech scene since the 80s. get over it.  This message brought to you by me being tired of people which have problems with any of those things.   ~Jaqie Fox

-=|Fighting computer ignorance since 1995|=-


My fav game of all time is an obscure PS2 title, so... no hate. :P

SpongeBob: Battle for Bikini Bottom? If not, I've never heard of that game. In fact, what were we talking about?

- "some salty pretzel bun fanboy" ~ @helping, 2014
- "Oh shit, watch out guys, we got a hopscotch bassass here..." ~ @vinyldash303

- "Yes the 8990 is more fater than the 4820K and as you can see this specific Video card comes with 6GB" ~ Alienware 2014


I agree that people have lost their appreciation for 1080p now. But what makes me enjoy the content more is pixel density, which is what really matters here. This is the reason why I don't think we'll go much further than 4K in terms of resolution on the desktop side of things. A 27 inch 4K monitor has a density of about 160 dpi; viewed from more than 50 cm away (a typical sitting distance at a desk), you cannot distinguish individual pixels.

 

If we went to 8K resolutions on a 27 inch monitor you would only have to be 25 cm away from the display to be able to notice any pixels, which is a little too much to be honest. 8K will come to televisions though to bring the pixel density up, and we'll start to get 60-70 inch TVs as the norm. But I think we will stop at 4K with monitors, or perhaps increase the resolution by an extra 50%. 5760x3240 anyone? :P

 

Beyond that there's no benefit to increasing desktop resolution.
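The arithmetic behind those figures is easy to check. A minimal sketch, assuming the usual 1-arcminute acuity limit for 20/20 vision (the exact threshold varies from person to person):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch, from the pixel dimensions and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def min_distance_cm(ppi_value: float) -> float:
    """Viewing distance at which one pixel subtends 1 arcminute,
    roughly the acuity limit of 20/20 vision."""
    pixel_in = 1.0 / ppi_value
    one_arcmin = math.radians(1.0 / 60.0)
    return pixel_in / math.tan(one_arcmin) * 2.54  # inches -> cm

p = ppi(3840, 2160, 27)   # ~163 ppi for a 27" 4K panel
d = min_distance_cm(p)    # ~53 cm, close to the ~50 cm figure above
```

Run it for a 27" 8K panel and the distance halves, which is where the ~25 cm figure comes from.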

 

We just need higher bitrate videos now to keep up with the resolution! The worst thing is watching a 1080p youtube video with a bitrate of less than 14Mb/s...

Yes, but the thing is, not being able to distinguish individual pixels doesn't mean the image won't seem sharper.

- "some salty pretzel bun fanboy" ~ @helping, 2014
- "Oh shit, watch out guys, we got a hopscotch bassass here..." ~ @vinyldash303

- "Yes the 8990 is more fater than the 4820K and as you can see this specific Video card comes with 6GB" ~ Alienware 2014


Broadcast TV isn't even up to 1080p yet. It's hardly a "standard". For PC enthusiasts it is, but remember, not everyone is into tech; plenty of people just want something they can check their email and Facebook on, and don't even know what resolution is.

"Energy drinks don't make my mouth taste like yak buttholes like coffee does, so I'll stick with them." - Yoinkerman


I will start getting into 4k once they release monitors at 120hz or 144hz that aren't just TN panel at a reasonable price, even though I have been considering the 1440p 120hz g sync monitor from asus, I am happy with my 144hz asus monitor I have atm. Maybe in like 4 or 5 years I will get a 4k monitor.


Yes, but the thing is, not being able to distinguish individual pixels doesn't mean the image won't seem sharper.

Huh? That would only be the case if the content being viewed wasn't at the native resolution of the monitor. A raw 4K video, or a page of text, viewed on a 4K monitor at the appropriate distance (50 cm for a 27" display) would look perfectly sharp, because you wouldn't be able to distinguish the pixels, so there are no jagged edges. But if you watched a 1080p video at the same distance, then yes, you would be able to see the pixels; you would need to be 1 m away from the same monitor to stop seeing individual pixels. But I thought we were talking about monitor resolutions, not the resolution of the content being viewed on them.

 

A 4K video would only look blurry, on a 4K display viewed from the appropriate distance, if the bitrate was incredibly low.

| My first build: http://linustechtips.com/main/topic/117400-my-very-first-build/ | Build for my friend's 18th: http://linustechtips.com/main/topic/168660-pc-for-my-friends-18th-with-pictures-complete/ |


ATH-M50X Review: http://linustechtips.com/main/topic/165934-review-audio-technica-ath-m50-x/ | Nintendo 3DS XL Review: http://linustechtips.com/main/topic/179711-nintendo-3ds-xl-review/ | Game Capture Guide: http://linustechtips.com/main/topic/186547-ultimate-guide-to-recording-your-gameplay/


Case: Corsair 200R CPU: i5 4670K @ 3.4GHz RAM: Corsair 8GB 1600MHz C9 Mobo: GIGABYTE Z87-HD3 GPU: MSI R9 290 Cooler: Hyper 212 EVO PSU: EVGA 750W Storage: 120GB SSD, 1TB HDD Display: Dell U2212HM OS: Windows 8


Huh? That would only be the case if the content being viewed wasn't at the native resolution of the monitor. A raw 4K video, or a page of text, viewed on a 4K monitor at the appropriate distance (50 cm for a 27" display) would look perfectly sharp, because you wouldn't be able to distinguish the pixels, so there are no jagged edges. But if you watched a 1080p video at the same distance, then yes, you would be able to see the pixels; you would need to be 1 m away from the same monitor to stop seeing individual pixels. But I thought we were talking about monitor resolutions, not the resolution of the content being viewed on them.

 

A 4K video would only look blurry, on a 4K display viewed from the appropriate distance, if the bitrate was incredibly low.

I'm just saying that even if you can't distinguish the individual pixels, you'll still perceive greater clarity. Just because you can't count them doesn't mean that you aren't perceiving them at all. :P I guess it's an opinion though, as I can't scientifically prove it, but I think it makes sense.

- "some salty pretzel bun fanboy" ~ @helping, 2014
- "Oh shit, watch out guys, we got a hopscotch bassass here..." ~ @vinyldash303

- "Yes the 8990 is more fater than the 4820K and as you can see this specific Video card comes with 6GB" ~ Alienware 2014


I am fine with 1080p at the moment. More is nice, but gaming at 1080p still looks good to me, both on a monitor and on a bigger display like a TV, because I'm normally much further away from the TV.

 

I'm also fine with 1080p for watching movies, again because of viewing distance; however, when 4K native content is widespread, I won't deny the benefits of upgrading.

 

 

I don't think we are spoiled by HD quite yet, if HD means 1080p. It's still a resolution the majority of consumers struggle to see natively: consoles are still upscaling to reach 1080p, and 1080p video streamed from services like Netflix and Amazon is compressed.

 

If you are a person with a 50 inch Panasonic ST series television, or a high quality HD projector at the perfect viewing distance, and exist on a diet of Blu-ray glory, then I'd say you're eligible to be spoiled, but not most of us.


My 1366x768 laptop looks sexy compared to my old Acer 1080p.

 

But my IPS in sig..... FTW.

Le Bastardo+ 

i7 4770k + OCUK Fathom HW labs Black Ice 240 rad + Mayhem's Gigachew orange + 16GB Avexir Core Orange 2133 + Gigachew GA-Z87X-OC + 2x Gigachew WF 780Ti SLi + SoundBlaster Z + 1TB Crucial M550 + 2TB Seagate Barracude 7200rpm + LG BDR/DVDR + Superflower Leadex 1KW Platinum + NZXT Switch 810 Gun Metal + Dell U2713H + Logitech G602 + Ducky DK-9008 Shine 3 MX Brown

Red Alert

FX 8320 AMD = Noctua NHU12P = 8GB Avexir Blitz 2000 = ASUS M5A99X EVO R2.0 = Sapphire Radeon R9 290 TRI-X = 1TB Hitachi Deskstar & 500GB Hitachi Deskstar = Samsung DVDR/CDR = SuperFlower Golden Green HX 550W 80 Plus Gold = Xigmatek Utguard = AOC 22" LED 1920x1080 = Logitech G110 = SteelSeries Sensei RAW

From my viewpoint, not yet. But it looks like we're getting there. Gamers I know demand everything when it comes to graphics, and it's just insane that they aren't already satisfied with the HD we have.

IdeaCentre K450 Review - My favorite (pre-built) gaming desktop under $1000.

Gaming PC Under $500 = Gaming on a budget.


h.265, my friend. Read up on it, and rejoice, for thou hast forthwith double the efficiency.

No decent encoders exist even though the spec has been around for 2+ years. It'll take a while to overtake h.264, just as h.264 took a while to overtake AVI, at least on the consumer end.

Broadcast TV isn't even up to 1080p yet. It's hardly a "standard". For PC enthusiasts it is, but remember, not everyone is into tech; plenty of people just want something they can check their email and Facebook on, and don't even know what resolution is.

It's required in the US to have a digital HD OTA (over the air) broadcast now. The cable companies are also finally, slowly, being required to do the same.


My fav game of all time is an obscure PS2 title, so... no hate. :P

What game?

 

4k is the new 1080p!

1080p IS normal, 720p is pathetic!

....I'm waiting for 4k.....

 

I'm also waiting for 4K DPI Scaling. Until software developers actually get their scaling issues together, 24"-27" 1440p is perfect for me. Adobe, I'm looking at you.

Desktop: KiRaShi-Intel-2022 (i5-12600K, RTX2060) Mobile: OnePlus 5T | REDACTED - 50GB US + CAN Data for $34/month
Laptop: Dell XPS 15 9560 (the real 15" MacBook Pro that Apple didn't make) Tablet: iPad Mini 5 | Lenovo IdeaPad Duet 10.1
Camera: Canon M6 Mark II | Canon Rebel T1i (500D) | Canon SX280 | Panasonic TS20D Music: Spotify Premium (CIRCA '08)


Console fanboys need not apply, 720p is not serious beans. And 1080p is pretty standard now.

 

So, the question: do you feel like we're so accustomed to 1080p that it doesn't feel all that special anymore? I remember years ago when 1080p was the buzzword of the week, and HD channels as well as TVs were being marketed to death. Looking at where we are now, 1920x1080 no longer feels like "HD" to me; if anything, it feels like standard def, or at least what I consider standard these days.

 

This kinda bothers me because my monitor is dying, and I'm actually having second thoughts about quality replacements because they are natively 1920x1080; my current monitor has a res of 1920x1200 - dat 16:10. I could get a generic Korean 1440p monitor, but I'm not sure I'm comfortable dropping 400 dollars on a risk with a crappy stand and limited inputs. LG, Samsung, Dell... release 2560x1440 monitors at reasonable prices already so I don't have to ship something from Korea. And this isn't even getting into 4K, which is on a whole other plane of existence altogether.

 

So what do you guys think? Does 1920x1080 feel too normal now?

In Germany there's still a shit ton of ads for HD. Buy this and you get HD! HD IS SOO GOOD VERY NEW MUCH GOOD.

My Rig: AMD Ryzen 5800x3D | Scythe Fuma 2 | RX6600XT Red Devil | B550M Steel Legend | Fury Renegade 32GB 3600MTs | 980 Pro Gen4 - RAID0 - Kingston A400 480GB x2 RAID1 - Seagate Barracuda 1TB x2 | Fractal Design Integra M 650W | InWin 103 | Mic. - SM57 | Headphones - Sony MDR-1A | Keyboard - Roccat Vulcan 100 AIMO | Mouse - Steelseries Rival 310 | Monitor - Dell S3422DWG


I've never heard of a Blu-ray disc (which is what PS3 games are on) that was 720p

So, this is very well known; you can look it up, good sir. Also, I don't think you understand how Blu-ray works (it's "Blu-ray Disc" because it's a brand name). A Blu-ray disc is similar to a CD or DVD: they are all simply storage media, just like a flash drive, HDD or SSD, in a different form factor. A Blu-ray disc differs from a CD or DVD in that it holds more data. You can also put standard definition (480p) video onto a Blu-ray disc. When you buy a movie it will be full HD (1080p), and it will also have better sound, because the disc has room for better picture and sound.

 

The problem isn't the Blu-ray disc. The problem is the hardware of the PS3. It can't support higher resolutions because it's just too weak for games: you'd get terrible FPS and more heat. It just doesn't support 1080p for gaming. Videos can definitely be in full HD; it's the games that aren't.


To answer the question in the title... go to any random YouTube video, manually change it to 144p (no, not 1440p, 144p) and watch it fullscreen. Then ask yourself if you are spoiled by HD. The answer is most definitely a resounding YES.

 

I recently moved, and due to special circumstances my only net for now (maybe several months even) is Cricket 3G through my BlackBerry 8350 wifi hotspot... it can barely do 144p YouTube, and that's when it's working at all. This has most definitely awakened me to how truly high quality video has become in the last few years, with 1080p finally becoming the truly ubiquitous norm.

 

Then I ask myself... would I want more than 1080p? Only with h.265 compression (twice as efficient as h.264 at the same quality)... and not really, yet. Eventually it will be good, but until this whole net debacle that is coming to a head with Comcast and the like is sorted, 1080p h.265 is a very happy place indeed.

 

 

 

As for the rest? Gaming can stay at 1080p; we need more detail behind the pixels, which requires better, more efficient programming and better video cards (more power efficient and more powerful - Mantle et al. is a good step in the right direction) before gaming should go beyond 1080p as a standard, IMO.

Desktop resolution? Yes please. I want 4K just so I can put many more open windows on one monitor, and not need a pile of monitors hooked up to my PC to multitask without hiding windows behind each other.

The problem with your 144p comparison is that it's so low no one uses 144p. Standard definition is 480p: everything that plays on TV that isn't HD is standard def, aka 480p. So, compared to 480p, HECK YES we are still being spoiled. That's not a bad thing, because it's good to improve; things look nicer. Being stuck at a lower def isn't bad in itself, it just sucks compared to 1080p and higher. It's great that things are getting higher def.
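To put the SD-to-HD jump in numbers (a quick sketch using the common widescreen pixel dimensions; 854x480 is assumed here for widescreen SD):

```python
# Pixel counts for common resolutions: the gap between SD and HD is
# bigger than the names alone suggest. 854x480 assumed for widescreen SD.
resolutions = {
    "480p": (854, 480),
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
}
sd_pixels = 854 * 480
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels ({w * h / sd_pixels:.1f}x SD)")
```

1080p pushes about five times the pixels of SD, and 4K about twenty times.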


My main monitor is 720p. I'll upgrade when I have money, so I can utilize the full power of my 780...

LOL, you have a 780 but only a 720p monitor? Talk about overkill. Definitely upgrade soon - you can get pretty cheap 1080p monitors. My friend has gotten higher-res monitors for free from junkyards, and my father picked up a 1680x1050 monitor for only 20 or 30 bucks on Craigslist, I think. PLEASE upgrade so you can actually use that 780 to its potential.


Everybody should be at 4K now. Everyone should be able to own a display that is 4K. It should cost you nothing.
 

NWO

Too many ****ing games!  Back log 4 life! :S


LOL, you have a 780 but only a 720p monitor? Talk about overkill. Definitely upgrade soon - you can get pretty cheap 1080p monitors. My friend has gotten higher-res monitors for free from junkyards, and my father picked up a 1680x1050 monitor for only 20 or 30 bucks on Craigslist, I think. PLEASE upgrade so you can actually use that 780 to its potential.

What's so wrong with 720p with a powerful GPU? IMO it's not that bad (though I'm using a 1280x1024 display, so I'm not sure)...

 

And btw: no need to bash other people's decisions. They'll do whatever they want with their own money.

 

One last thing: no need to do a triple post. You could've contained all those quotations and your answers in one long post instead of 3 separate ones in succession.

Never trust my advice. Only take any and all advice from me with a grain of salt. Just a heads up.


What's so wrong with 720p with a powerful GPU? IMO it's not that bad (though I'm using a 1280x1024 display, so I'm not sure)...

 

And btw: no need to bash other people's decisions. They'll do whatever they want with their own money.

 

One last thing: no need to do a triple post. You could've contained all those quotations and your answers in one long post instead of 3 separate ones in succession.

No, I'm not trying to bash your decision. I just find it funny that you have such a powerful card for a 720p monitor. It's just so OP for that resolution. That's all.

 

Yes, I know I could have. What I did was read posts, find one I wanted to comment on, quote it, and post. Then I found another one, quoted it, and posted again. You can't really change pages while you quote someone and have it stay there.


No, I'm not trying to bash your decision. I just find it funny that you have such a powerful card for a 720p monitor. It's just so OP for that resolution. That's all.

 

Yes, I know I could have. What I did was read posts, find one I wanted to comment on, quote it, and post. Then I found another one, quoted it, and posted again. You can't really change pages while you quote someone and have it stay there.

You can actually switch pages when quoting: you quote the post, copy the entire thing from the bottom, then paste it into a post you're editing (in another tab), and you're done. At least that's what I do... A bit slow, I know, but it works.

 

And yeah, a 780 with a 720p monitor isn't that well balanced. It's like having a >700 hp engine in a Ford Fiesta: unnecessarily powerful for a daily driver, but some people want it just for bragging rights or whatever reason they might have. In this case, though, I think he'll upgrade to a sharper monitor (1080p) later - who knows?

 

Other than that, we're all good.

Never trust my advice. Only take any and all advice from me with a grain of salt. Just a heads up.


No, I'm not trying to bash your decision. I just find it funny that you have such a powerful card for a 720p monitor. It's just so OP for that resolution. That's all.

 

Yes I know I could have. What I did was read posts, found one I wanted to comment on and quoted it then posted it. Then I found another one and just quoted it and posted again. You can't really change pages while you quote someone and have it stay there.

 

 

You can actually switch pages when quoting: you quote the post, copy the entire thing from the bottom, then paste it into a post you're editing (in another tab), and you're done. At least that's what I do... A bit slow, I know, but it works.

 

And yeah, a 780 with a 720p monitor isn't that well balanced. It's like having a >700 hp engine in a Ford Fiesta: unnecessarily powerful for a daily driver, but some people want it just for bragging rights or whatever reason they might have. In this case, though, I think he'll upgrade to a sharper monitor (1080p) later - who knows?

 

Other than that, we're all good.

 

Here is a tip for you both: when you find a post you want to comment on (say you were reading through this thread and wanted to comment on post 10), click the "Multiquote" button. Then read on and mark every post you want to comment on until you get to the end. Then press the "Reply to quoted posts" button in the tab that opened at the bottom right of your screen when you first pressed "Multiquote", and write what you were going to say to each person under the corresponding quote :)

 

Good day to you both and sorry if you knew this already :)

 

EDIT: Here is a step-by-step list (people seem to like lists  :ph34r: ).

 

1) Read through the thread until you find a post you want to comment on.

 

2) Press the "Multiquote" button on that post which is between "Report" and "Quote".

 

3) Continue through the thread until you have found all the posts you wanted to comment on (and pressed Multiquote on them as well).

 

4) Press the "Reply to quoted posts" button in the "tab" that opened down to the right when you first pressed Multiquote.

 

5) Then a series of quotes will pop up. Write your comment for each person under their corresponding quote.

 

6) They will all be notified and everyone is happy :)

Tor
Corsair Obsidian 650D - Intel 4770K CPU - Gigabyte G1 Sniper 5 - ASUS GTX 780 Direct CU 2 - Kingston Beast Hyperx Beast 16 GB RAM -  Corsair AX 1200i PSU - Samsung EVO drive 750 GB - Corsair AF series 120mm fans - Corsair H100i - Razer Blackwidow Ultimate 2013 edition - Razer Ouroboros - Razer Manticor - Windows 7 - Beyerdynamic MMX 300


Here is a tip for you both: when you find a post you want to comment on (say you were reading through this thread and wanted to comment on post 10), click the "Multiquote" button. Then read on and mark every post you want to comment on until you get to the end. Then press the "Reply to quoted posts" button in the tab that opened at the bottom right of your screen when you first pressed "Multiquote", and write what you were going to say to each person under the corresponding quote :)

 

Good day to you both and sorry if you knew this already :)

 

EDIT: Here is a step-by-step list (people seem to like lists  :ph34r: ).

 

1) Read through the thread until you find a post you want to comment on.

 

2) Press the "Multiquote" button on that post which is between "Report" and "Quote".

 

3) Continue through the thread until you have found all the posts you wanted to comment on (and pressed Multiquote on them as well).

 

4) Press the "Reply to quoted posts" button in the "tab" that opened down to the right when you first pressed Multiquote.

 

5) Then a series of quotes will pop up. Write your comment for each person under their corresponding quote.

 

6) They will all be notified and everyone is happy :)

 

This just does not work with unreliable internet connections.  The cached data tends to reset at the first net trouble.

 

I tend to Ctrl+A, Ctrl+C the contents of the post box before hitting post, and before any other big changes, so that is mitigated.

 

 

 

 

Back on topic: 144p is most definitely still used. Whenever the net connection gets really slow, Google's servers start sending lower quality to the viewer, and it happens more often than you would care to admit on cellphone connections. Also, 144p was most definitely a thing; you may not have been around for it, but it was, and even lower than that. Back in the earlier days of computing, watching video was limited to that because of the (lack of) power of the average PC of the day.

 

Heck, there are a few videos on the Windows 95 disc that look a whole lot like that, and those were SHOWCASING the power of the "modern PC" and Windows 95; just having full screen, full motion video was a huge deal back then.

I am a female pc hardware expert and enthusiast, over 170 IQ, been in the tech scene since the 80s. get over it.  This message brought to you by me being tired of people which have problems with any of those things.   ~Jaqie Fox

-=|Fighting computer ignorance since 1995|=-


Spoiled by HD? Yeah, right. Technology needs to move on, or we would still be stuck at VGA and SVGA. One day we will look back at 4K and think "oh god, it's horrible".

 


CPU:Intel Xeon X5660 @ 4.2 GHz RAM:6x2 GB 1600MHz DDR3 MB:Asus P6T Deluxe GPU:Asus GTX 660 TI OC Cooler:Akasa Nero 3


SSD:OCZ Vertex 3 120 GB HDD:2x640 GB WD Black Fans:2xCorsair AF 120 PSU:Seasonic 450 W 80+ Case:Thermaltake Xaser VI MX OS:Windows 10
Speakers:Altec Lansing MX5021 Keyboard:Razer Blackwidow 2013 Mouse:Logitech MX Master Monitor:Dell U2412M Headphones: Logitech G430

Big thanks to Damikiller37 for making me an awesome Intel 4004 out of trixels!


I take 1080p for granted until I see someone playing on a new console, or someone without a 1080p display on their phone/tablet/laptop, and then I remember how amazing it is.

| Currently no gaming rig | Dell XPS 13 (9343) |

| Samsung Galaxy Note5 | Gear VR | Nvidia Shield Tab | Xbox One |

