Some people still say that 1080p is plenty, whether for reading text or watching videos (including gaming), but anyone who has used a 4K monitor knows that text looks far clearer and games look far more realistic and detailed at 4K. And the same will probably happen all over again with whatever comes after 4K, once 4K becomes truly mainstream on desktops.
> but anyone who has used a 4K monitor knows that text looks far clearer and games look far more realistic and detailed at 4K
It depends on the distance really.
Text on a desktop at arm's length at most, especially on 24"+ screens, will be noticeably better.
I have a 34" ultrawide 1440p and I would definitely love higher pixel density.
But get down to 24" or less and 1440p vs 4K is borderline marketing.
People swear they can see the difference, yet I remember a randomized test of 120+ gamers who were shown the same TV at different output resolutions: the distribution of guesses showed only a very slight advantage for 4K, well within the margin of error, and it dropped to nothing with just a few centimeters more viewing distance.
It also heavily depends on the content being displayed. You might not be able to tell any difference in a typical game frame or a movie, but a 1 px black-on-white line still activates your cones while being well below the minimum angular resolution you can resolve. You can see stars for the same reason.
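To put a rough number on that, here is a back-of-the-envelope sketch in Python; the 27" 16:9 4K panel and 70 cm viewing distance are just assumed example values, and the 1 arcminute figure is the textbook resolution limit, not a measured one:

    import math

    # Assumed example setup: a 27" 16:9 4K panel (3840x2160) viewed from 70 cm.
    diag_in = 27
    width_in = diag_in * 16 / math.hypot(16, 9)        # ~23.5 in wide
    pixel_mm = width_in * 25.4 / 3840                  # ~0.16 mm per pixel
    distance_mm = 700

    # Angular size of a single pixel, in arcminutes.
    pixel_arcmin = math.degrees(math.atan2(pixel_mm, distance_mm)) * 60
    print(f"one pixel subtends ~{pixel_arcmin:.2f} arcmin")   # ~0.76 arcmin

    # That is below the textbook ~1 arcmin resolution limit, yet a single
    # high-contrast 1 px line can still be detected, because detection only
    # needs the line to change a receptor's response enough; stars are visible
    # for the same reason despite subtending far less than 1 arcmin.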
I personally struggle to tell 1440p from 4K as soon as you get to the 27" mark, but I'm generally 70-80 centimeters (about 2.5 feet) from my screen.
This is hard to gauge though because it's rarely only a change from 1080p to 4k. The screen tech often changes from LCD to LED (or microLED, or OLED), there's all manner of motion smoothing added, sometimes higher refresh rates, and so on.
Anecdotally, I was at a family Christmas where relatives were watching some home videos from a flash drive, encoded at 480p, on a new TV, and they all said the quality of the 4K picture was amazing despite the fact they were all watching 480p video (without upscaling, because I'd turned it off). To be fair it did look better than an old TV, but not because of the resolution.
I have a 34" 4K TV and from the couchthere is a difference between 1080p and 4K on it. 4K is crisper whether on Netflix or YouTube. Only potential variable, I think is the codec and compression used on each resolution.
I find it fascinating that the same is true for frame rate. Some people think 60Hz is OK, while anyone who has tried a 120Hz screen will agree it is infinitely smoother. The same is true again for a 240Hz screen. I have yet to try a 480Hz screen but imagine the jump will be equally impressive.
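For reference, the frame-time arithmetic behind those refresh rates is just the reciprocal of the rate (a trivial Python sketch; each doubling halves the remaining frame time, so the absolute gain shrinks each step):

    # Frame time (ms) at each refresh rate, and how much each doubling actually saves.
    rates = [60, 120, 240, 480]
    for prev, cur in zip(rates, rates[1:]):
        saved = 1000 / prev - 1000 / cur
        print(f"{prev} Hz -> {cur} Hz: {1000/prev:.1f} ms -> {1000/cur:.1f} ms per frame "
              f"(saves {saved:.1f} ms)")
    # 60->120 saves ~8.3 ms, 120->240 ~4.2 ms, 240->480 ~2.1 ms.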
Yeah, I think diminishing returns kick in at some point.
Going from 1080p to 1440p feels like a huge improvement.
Going from 1440p to 4k (aka 2160p) is a little bit sharper.
I don't think the jump from 4k to 8k will improve things that much.
I can tell the difference between 1080p (or upscaled 1080p) and 4k on a 50" screen at "living room" distances, but it's nowhere near as obvious as SD to DVD was.
At "laptop" screen distances the difference between my Retina display and non-retina external monitors is quite noticeable; so much so that I run 4k in 1080p mode more and more.
8k is going to require those curved monitors because you'll have to be that close to it to get the advantage.
> I can tell the difference between 1080p (or upscaled 1080p) and 4k
Are you talking about the resolution of the video or of the screen itself? Lower-resolution video also looks worse because of the compression. I've seen a bigger difference from video compression than from screen resolution; e.g. a well-encoded 1080p video looked better than a badly compressed 1080p video on any screen, at any resolution.
I have a 43" 4k monitor at ~1m distance and when I use the computer set up with 100% scaling it looks bad, I see pixelation... and "subpixelation". On a different computer connected to the same screen but something like 150% scaling it's like night and day difference. Everything looks smooth, perfect antialiasing.
This is the money picture [0]. Above a certain distance any improvement is imperceptible. But don't compare compressed video on a screen; it will add quality issues that influence your perception.
[0] https://media.springernature.com/full/springer-static/image/...
For plain video, 1080p at a high enough bitrate is fine.
For example:
- 40 cm viewing distance (e.g. smartphone): 300 ppi is roughly the maximum that's useful
- 100 cm (e.g. desktop monitor): about 200 ppi
https://www.nature.com/articles/s41467-025-64679-2/figures/2
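The geometry behind figures like those is simple enough to sketch in Python; the pixels-per-degree threshold is the assumption you have to supply (about 60 px/deg corresponds to the classic 20/20 one-arcminute rule of thumb, and 90 is just a higher example value, not a number from the study):

    import math

    def max_useful_ppi(distance_cm: float, px_per_degree: float) -> float:
        """PPI beyond which extra pixels are packed tighter than the chosen angular threshold."""
        distance_in = distance_cm / 2.54
        inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
        return px_per_degree / inches_per_degree

    # 60 px/deg is the classic "1 arcminute per pixel" rule; 90 is just a higher
    # example threshold to show how sensitive the answer is to that assumption.
    for dist_cm in (40, 100):
        for ppd in (60, 90):
            print(f"{dist_cm:>3} cm, {ppd} px/deg -> {max_useful_ppi(dist_cm, ppd):.0f} ppi")

The cut-offs scale linearly with whatever acuity threshold you assume, which is largely why quoted "maximum useful ppi" figures vary so much from source to source.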
I remember those numbers being a third of that 20 years ago. Either we have evolved brand-new eyes without noticing, or you are just talking about the current state of the art as if it were the limit of human vision.
Put another way: look at 300 ppi prints and 1200 ppi prints. The difference is night and day at 30 cm viewing distance.
You don't need 1200 ppi for a nice 1200 dpi print; even 300 ppi may be enough.
For printing photos, you are right that a 300 ppi printer is better than a 1200 dpi printer.
On the other hand, for printing text, a 1200 dpi printer has the quality of a 1200 ppi printer.
Many well-designed traditional typefaces have relied on optical effects caused by details whose printing requires a resolution higher than that at which the human eye can distinguish a set of bars from a uniform background (which is the subject of TFA). For instance, in some typefaces the edges of the strokes are concave or convex rather than straight, which could be rendered on a computer display only by a much higher resolution or by more sophisticated pixel preprocessing (to simulate the effect on the eye). Whenever such typefaces are displayed at a lower resolution, i.e. on computer monitors, they are very noticeably uglier than when printed on paper by traditional metal type or even by a high-resolution laser printer.
You mean "the minimum that is useful", because you want a resolution at least equal to that in order not to see the pixel structure.
A 27" monitor has a height around 17", i.e. about 43 cm, and for watching a movie or anything else where you look at the screen as a whole the recommended viewing distance is twice the screen height, i.e. about 86 cm.
At this distance, the resolution needed to match the human vision is provided by a height of slightly less than 3000 pixels by this study, but by about 3300 pixels by older studies. In these conditions you are right, the minimum acceptable resolution is around 200 ppi.
This means that a 27 inch 5k monitor, with a resolution of 2880 by 5120 pixels, when viewed from a distance twice its height, i.e. about 86 cm (34 inch), provides a resolution close, but slightly less than that of typical human vision. (That viewing distance that is double the height corresponds to the viewing angle of camera lenses with normal focal length, which has been based on studies about the maximum viewing angles where humans are able to perceive a correct perspective when looking at an image as a whole.)
However, when not watching movies, but working with text documents, you normally stay closer to the monitor than that, so even a 5k monitor is not good enough (but an 8k monitor may be enough, so that might be the final monitor resolution, beyond which an increase is useless).
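A quick Python sketch of the arithmetic in the previous comment (the 105 and 117 px/deg values are just back-computed examples chosen to land near the ~3000 and ~3300 row figures quoted above, not numbers taken from the study; 60 px/deg is the classic one-arcminute rule of thumb):

    import math

    # 27" 16:9 5K panel (5120x2880), viewed from twice its height.
    diag_in = 27
    height_in = diag_in * 9 / math.hypot(16, 9)            # ~13.2 in
    distance_in = 2 * height_in                             # ~26.5 in (~67 cm)
    panel_ppi = math.hypot(5120, 2880) / diag_in            # ~218 ppi

    # At a distance of 2x the height, the screen always spans 2*atan(1/4) ~ 28
    # degrees vertically, no matter how big it actually is.
    vert_angle_deg = 2 * math.degrees(math.atan(height_in / (2 * distance_in)))

    for ppd in (60, 105, 117):   # classic 1 arcmin rule, then two higher example thresholds
        needed_rows = vert_angle_deg * ppd
        verdict = "5K has enough rows" if 2880 >= needed_rows else "5K falls slightly short"
        print(f"{ppd:>3} px/deg -> ~{needed_rows:.0f} rows needed ({verdict})")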
Higher ppi on mobile is still useful if it enables “manual zoom” (i.e. move your head closer). I do this with Google Sheets on mobile all the time, as I like to have a lot of the sheet displayed at once to see the overall structure, and then peer closer to read the text.
About 15 cm view distance for smartphones is pretty normal for me (shortsighted with glasses) on websites where the text is very small, e.g. Hacker News.
Oh, that's a great idea. I see Firefox also has this in the settings under Accessibility. I guess I never looked there because I don't think of myself as disabled. (Maybe I should.)
More and more - especially on Apple products - Accessibility is the "customization" menu, not the "disabled support" it used to be.
Size increases, animation decreases (when your iPhone is getting old, turn off animation and behold it operating super fast!), etc can all be found there.
I zoom websites until they "feel right" which is usually something close to "they are as wide as the window I have them in" - HN is a few taps up from "actual size".
One would expect the results to be highly correlated with corrected vision, which is all over the place... but they get suspiciously tightly grouped results.
Did they maybe not measure how many pixels we can see, but rather how laughably bad COTS IPS panels are at contrast as the examined pattern approaches their resolution? I wonder what happens if you repeat that with a reasonably bright 16K OLED.
Looks like the good anti-aliasing that makes text look better on lower-DPI displays is slowly getting the bitrot treatment...