
Does PPI actually matter?

A while back I took a close look at a 1440p screen next to a 4K one, and I noticed that the pixels on the 1440p panel are physically larger - on a large enough display you can actually see the individual pixels, while on a 4K panel I found the pixels virtually invisible to the naked eye even on a gigantic TV. This led me to believe for quite a while that when it comes to seeing a pixel, the nature of the pixel itself (4K vs 1440p vs 1080p) matters more than the pixel count per unit of area, since a physically larger pixel will always be visible regardless of the screen size.

 

But then I noticed the panel on my phone, which is 1080p, and I've never managed to identify an individual pixel on it despite it being just a 1080p panel. So I'm just kind of curious: is my assumption above correct, or is pixel density the be-all and end-all when it comes to screen sharpness? If so, what makes the 1440p display I mentioned have such large, visible pixels despite having the same PPI as some 4K TVs, while this has never been the case on the latter?

 

BTW, what does PPI actually give us? Does it let us sit closer to the screen without seeing any individual pixels, or does it just make the image sharper at the same distance?


PPI means pixels per inch. Your phone has a drastically higher PPI at 1080p than your monitor or TV does. I don't think you grasp the concept.

10 minutes ago, e22big said:

is my assumption above correct, or is pixel density the be-all and end-all when it comes to screen sharpness?

Yep. A 2in 240p screen would look better than a 50in 1080p one in terms of sharpness.
 

 

11 minutes ago, e22big said:

If so, what makes the 1440p display I mentioned have such large, visible pixels despite having the same PPI as some 4K TVs, while this has never been the case on the latter?

What? I'm not following your point here.


PPI is exactly what it is on the label: the pixels per inch. The higher the PPI, the more tightly the pixels are packed together.

 

Since this is a function of the number of pixels and the area they're spread over, it's affected by both the resolution and the screen size.

 

That's where viewing distance comes in. From a far enough distance, even big things look small. So, when you hold a phone 12in from your face, the pixels need to be very small for you not to be able to see them, but when you're watching TV from 6-8ft away, the pixels can be much larger and still not be perceptible.

 

Long and short, to not see the pixels, you need a combination of PPI (pixel density) and viewing distance. What's an acceptable PPI for a phone is different from what's acceptable for a monitor, which is different again for a TV, because the viewing distance is different for each.
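The PPI side of this can be sketched in a few lines of Python. The panel sizes and resolutions below are illustrative assumptions (a generic 1080p-class phone, a 32in 1440p monitor, a 55in 4K TV), not the exact displays being discussed:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Illustrative, assumed panel sizes
print(round(ppi(1080, 2340, 6.5)))  # 1080p-class phone: roughly 400 PPI
print(round(ppi(2560, 1440, 32)))   # 32in 1440p monitor: roughly 92 PPI
print(round(ppi(3840, 2160, 55)))   # 55in 4K TV: roughly 80 PPI
```

Same 1080p-class resolution, wildly different PPI once the physical size changes - which is why the phone looks pixel-free up close while a big monitor doesn't.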

2 minutes ago, HelpfulTechWizard said:

Yep. A 2in 240p screen would look better than a 50in 1080p one in terms of sharpness.
 

 

What? I'm not following your point here.

Ok, let me be a bit more specific - I took a look at a 32 inch 1440p display (which works out to around 90 PPI) and a 55 inch 4K one (around 80 PPI). For some reason, the 32 inch 1440p had some very large, visible pixels, while the 4K TV, despite having a lower PPI, actually remained sharp and never showed any individual pixels, even when I took a closer look.

 

This is why I feel a bit confused and have started to doubt whether high PPI has the same effect on a 4K display as on a 1440p one.

16 minutes ago, Chris Pratt said:

PPI is exactly what it is on the label: the pixels per inch. The higher the PPI, the more tightly the pixels are packed together.

 

Since this is a function of the number of pixels and the area they're spread over, it's affected by both the resolution and the screen size.

 

That's where viewing distance comes in. From a far enough distance, even big things look small. So, when you hold a phone 12in from your face, the pixels need to be very small for you not to be able to see them, but when you're watching TV from 6-8ft away, the pixels can be much larger and still not be perceptible.

 

Long and short, to not see the pixels, you need a combination of PPI (pixel density) and viewing distance. What's an acceptable PPI for a phone is different from what's acceptable for a monitor, which is different again for a TV, because the viewing distance is different for each.

That I know, but the thing is, when I took a close-up look at both the 1440p and the 4K display (both from less than an inch away), the 1440p had very large pixels while the 4K, despite its lower PPI, actually looked sharp. Hold on, let me upload some images to give you a clearer picture.

3 minutes ago, e22big said:

Ok, let me be a bit more specific - I took a look at a 32 inch 1440p display (which works out to around 90 PPI) and a 55 inch 4K one (around 80 PPI). For some reason, the 32 inch 1440p had some very large, visible pixels, while the 4K TV, despite having a lower PPI, actually remained sharp and never showed any individual pixels, even when I took a closer look.

 

This is why I feel a bit confused and have started to doubt whether high PPI has the same effect on a 4K display as on a 1440p one.

They're both about 80 PPI.

PPI has the same effect at any resolution. I'd guess that either the 4K is smaller than you think, or you're just expecting it to be sharper, so it looks that way.

8 minutes ago, Chris Pratt said:

PPI is exactly what it is on the label: the pixels per inch. The higher the PPI, the more tightly the pixels are packed together.

 

Since this is a function of the number of pixels and the area they're spread over, it's affected by both the resolution and the screen size.

 

That's where viewing distance comes in. From a far enough distance, even big things look small. So, when you hold a phone 12in from your face, the pixels need to be very small for you not to be able to see them, but when you're watching TV from 6-8ft away, the pixels can be much larger and still not be perceptible.

 

Long and short, to not see the pixels, you need a combination of PPI (pixel density) and viewing distance. What's an acceptable PPI for a phone is different from what's acceptable for a monitor, which is different again for a TV, because the viewing distance is different for each.

So the first image is how I see a 4K display up close and the second is how I see a 1440p one (neither is from the exact displays I mentioned, they're just to illustrate the point). While it's true that if I get close enough to both I can see pixels, the pixels on the 4K seem finer and kind of round (to the naked eye; the photo doesn't look exactly the way I saw it), but the pixels on the 1440p seem very large, almost the size of a grain of rice, and visibly square.

 

 

 

 

[attached photo: close-up of the 4K panel]

 

 

[attached image: close-up of the 1440p panel]

2 minutes ago, e22big said:

[quoted image: close-up of the 1440p panel]

I see a very pronounced moire pattern, which makes it a bit difficult to focus on the pixels. Are you sure you're not confusing the moire pattern with pixels here? It's also possible the image itself simply isn't 1440p, so you get some aliasing along edges.

 

The 32" 1440p display should have about 30% more pixels per square inch than the 4K display at 55", so the pixels on the 32" screen should definitely be smaller.
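That ~30% figure checks out if you run the numbers, assuming both are standard 16:9 panels (a sketch, not measurements of the actual displays):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

ppi_1440 = ppi(2560, 1440, 32)  # roughly 92 PPI
ppi_4k   = ppi(3840, 2160, 55)  # roughly 80 PPI

# Pixels per square inch grow with the *square* of PPI
ratio = (ppi_1440 / ppi_4k) ** 2
print(f"{(ratio - 1) * 100:.0f}% more pixels per square inch")
```

So the 32in 1440p screen packs about 31% more pixels into each square inch than the 55in 4K TV, i.e. its pixels are smaller, not larger.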

38 minutes ago, e22big said:

A while back I took a close look at a 1440p screen next to a 4K one, and I noticed that the pixels on the 1440p panel are physically larger - on a large enough display you can actually see the individual pixels, while on a 4K panel I found the pixels virtually invisible to the naked eye even on a gigantic TV. This led me to believe for quite a while that when it comes to seeing a pixel, the nature of the pixel itself (4K vs 1440p vs 1080p) matters more than the pixel count per unit of area, since a physically larger pixel will always be visible regardless of the screen size.

 

But then I noticed the panel on my phone, which is 1080p, and I've never managed to identify an individual pixel on it despite it being just a 1080p panel. So I'm just kind of curious: is my assumption above correct, or is pixel density the be-all and end-all when it comes to screen sharpness? If so, what makes the 1440p display I mentioned have such large, visible pixels despite having the same PPI as some 4K TVs, while this has never been the case on the latter?

 

BTW, what does PPI actually give us? Does it let us sit closer to the screen without seeing any individual pixels, or does it just make the image sharper at the same distance?

Rule of thumb: anything above 1080p is baller. Anything above 4K is overkill.

3 minutes ago, Eigenvektor said:

I see a very pronounced moire pattern, which makes it a bit difficult to focus on the pixels. Are you sure you're not confusing the moire pattern with pixels here? It's also possible the image itself simply isn't 1440p, so you get some aliasing along edges.

 

The 32" 1440p display should have about 30% more pixels per square inch than the 4K display at 55", so the pixels on the 32" screen should definitely be smaller.

Honestly, I'm not sure, but the moiré pattern on the 32 inch display was very visible to the naked eye, while on the 4K the pattern, if it existed at all, was not. That's the source of my confusion.

 

Too bad I don't have either of those displays anymore, so I can't take exact pictures to show the effect more clearly.

1 minute ago, e22big said:

Honestly, I'm not sure, but the moiré pattern on the 32 inch display was very visible to the naked eye, while on the 4K the pattern, if it existed at all, was not. That's the source of my confusion.

The moire pattern is an effect caused by the pixel grid of the display and the sensor grid of the camera, which depends on the resolution/density of both.

 

The pattern being more visible for the 1440p display doesn't say anything about its pixel density. It's just that this particular combination of monitor resolution and sensor resolution produces a more visible pattern in the resulting image.

9 minutes ago, Eigenvektor said:

I see a very pronounced moire pattern, which makes it a bit difficult to focus on the pixels. Are you sure you're not confusing the moire pattern with pixels here? It's also possible the image itself simply isn't 1440p, so you get some aliasing along edges.

 

The 32" 1440p display should have about 30% more pixels per square inch than the 4K display at 55", so the pixels on the 32" screen should definitely be smaller.

The other point I found weird is that on a 22-24 inch 1080p display the letter W looks visibly pixelated, yet it looks sharp on the 32 inch 1440p, despite the 1440p having an even lower PPI when scaled to the same size - so I dunno, it's kind of weird, like the same PPI never seems to scale perfectly across displays of various sizes.


Some displays have wider black gaps between pixels than others (a lower fill ratio), so you see the grid and the pixels more easily, even at the same PPI. It comes down to the particular display rather than the resolution, and it isn't listed in the specs.

 

https://www.nanolumens.com/blog/defining-distance-pixel-pitch-vs-fill-ratio/
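To put rough numbers on it: pixel pitch is fixed by PPI, but the fill ratio decides how much of each pitch is lit versus left as black grid. A quick sketch, where the two fill-ratio values are made-up illustrations rather than specs of any real panel:

```python
# Same PPI, different fill ratio -> different visible black grid.
PPI = 92                  # e.g. a 32in 1440p panel
pitch_mm = 25.4 / PPI     # distance between pixel centers, ~0.28 mm

for fill_ratio in (0.90, 0.70):
    # For square pixels, lit width scales with the square root of area fill
    lit_mm = pitch_mm * fill_ratio ** 0.5
    gap_mm = pitch_mm - lit_mm
    print(f"fill ratio {fill_ratio:.0%}: black gap of about {gap_mm:.3f} mm")
```

At the same 92 PPI, dropping the fill ratio from 90% to 70% roughly triples the visible gap between pixels, which could be enough to make the grid obvious up close on one panel but not another.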

Just now, Eigenvektor said:

The moire pattern is an effect caused by the pixel grid of the display and the sensor grid of the camera, which depends on the resolution/density of both.

 

The pattern being more visible for the 1440p display doesn't say anything about its pixel density. It's just that this particular combination of monitor resolution and sensor resolution produces a more visible pattern in the resulting image.

Like I said, the moiré pattern appears to my naked eye. I picked that image because it resembles what I saw on the 1440p monitor, but what I saw wasn't caused by any camera lens.

1 minute ago, Kilrah said:

Some displays have wider black gaps between pixels than others (a lower fill ratio), so you see the grid and the pixels more easily, even at the same PPI. It comes down to the particular display rather than the resolution, and it isn't listed in the specs.

 

https://www.nanolumens.com/blog/defining-distance-pixel-pitch-vs-fill-ratio/

Oh, that's very likely it. A display where a black grid appears everywhere is exactly what I saw. Thanks, that's most likely it.


Yes. 1440p on 27" looks much better than on 31.5". 1440p on 24" looks amazing, very sharp. I do prefer 27" for the immersion though.
