Resolution Limit of the Eye – How Many Pixels Can We See?
Posted 2 months ago · Active about 2 months ago
Source: nature.com (Research, story)
Key topics: Display Resolution, Human Vision, Technology Limits
A study published in Nature explores the resolution limit of the human eye, sparking discussion on the practical limits of display resolution and the value of high-resolution displays.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion. First comment after 24 minutes; peak of 28 comments in the 168-180h window; average of 13.8 comments per period. Based on 55 loaded comments.
Key moments
- Story posted: Oct 28, 2025 at 7:28 AM EDT (2 months ago)
- First comment: Oct 28, 2025 at 7:53 AM EDT (24 minutes after posting)
- Peak activity: 28 comments in the 168-180h window (hottest stretch of the conversation)
- Latest activity: Nov 5, 2025 at 8:37 AM EST (about 2 months ago)
ID: 45731469 · Type: story · Last synced: 11/20/2025, 3:29:00 PM
Want the full context? Read the primary article or dive into the live Hacker News thread.
For example (the geometry behind these figures is sketched below):
- 40 cm viewing distance (e.g. smartphone): 300 ppi is roughly the maximum that's useful
- 100 cm (e.g. desktop monitor): about 200 ppi
https://www.nature.com/articles/s41467-025-64679-2/figures/2
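A minimal sketch of that geometry, assuming the acuity limit is expressed in pixels per degree (ppd): 60 ppd is the classic 1-arcminute figure, and the 94 ppd value is only an assumed stand-in for a higher modern estimate, not a number taken from the paper.

```python
import math

def max_useful_ppi(viewing_distance_cm: float, pixels_per_degree: float) -> float:
    """Pixel density beyond which a single pixel subtends less than the
    assumed acuity limit at the given viewing distance."""
    angle_per_pixel_rad = math.radians(1.0 / pixels_per_degree)
    pitch_cm = viewing_distance_cm * math.tan(angle_per_pixel_rad)  # physical pixel pitch
    return 2.54 / pitch_cm  # cm per pixel -> pixels per inch

for distance_cm in (40, 100):      # smartphone vs. desktop viewing distance
    for ppd in (60, 94):           # assumed acuity figures (pixels per degree)
        print(f"{distance_cm} cm, {ppd} ppd -> {max_useful_ppi(distance_cm, ppd):.0f} ppi")
```

At 40 cm this lands in roughly the 220-340 ppi range, which brackets the 300 ppi figure above; the exact curves in the linked figure come from the paper's own model.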
Size increases, animation decreases (when your iPhone is getting old, turn off animations and behold it running super fast!), etc. can all be found there.
I zoom websites until they "feel right", which is usually something close to "they are as wide as the window I have them in" - HN is a few taps up from "actual size".
Put another way, look at 300 ppi prints and 1200 ppi prints. The difference is night and day at 30 cm viewing distance.
You don't need a 1200 ppi source image for a nice 1200 dpi print; even 300 ppi may be enough.
On the other hand, for printing text, a 1200 dpi printer delivers the quality of a true 1200 ppi image.
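One rough way to see why, assuming a conventional halftone screen: for photos the printer spends its dots on gray levels, so its effective image resolution is far below its dot resolution, while bilevel text keeps the full dot resolution. The 4x4 cell size below is an illustrative assumption, not a value from the thread.

```python
# Halftoning trades printer dots for gray levels: an n x n cell of binary
# dots yields n*n + 1 tones but only printer_dpi / n effective image pixels per inch.
def halftone_tradeoff(printer_dpi: int, cell: int) -> tuple[int, float]:
    gray_levels = cell * cell + 1
    effective_ppi = printer_dpi / cell
    return gray_levels, effective_ppi

for cell in (1, 4):  # 1 = bilevel text, 4 = a modest halftone cell (assumed)
    levels, ppi = halftone_tradeoff(1200, cell)
    print(f"{cell}x{cell} cell: {levels} tones, ~{ppi:.0f} effective ppi")
```

With these assumptions, bilevel text keeps the full 1200 ppi, while a 4x4 halftone cell drops photographic content to about 300 effective ppi, which is why a 300 ppi source can already look fine in print.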
Many well-designed traditional typefaces rely on optical effects produced by details whose printing requires a resolution higher than the one at which the human eye can distinguish a set of bars from a uniform background (which is what TFA measures). For instance, in some typefaces the edges of the strokes are concave or convex rather than straight, which a computer display could render only with a much higher resolution or with more sophisticated pixel preprocessing (to simulate the effect on the eye). When such typefaces are displayed at a lower resolution, e.g. on computer monitors, they look noticeably uglier than when printed on paper with traditional metal type or even with a high-resolution laser printer.
A 27" monitor has a height around 17", i.e. about 43 cm, and for watching a movie or anything else where you look at the screen as a whole the recommended viewing distance is twice the screen height, i.e. about 86 cm.
At this distance, the resolution needed to match the human vision is provided by a height of slightly less than 3000 pixels by this study, but by about 3300 pixels by older studies. In these conditions you are right, the minimum acceptable resolution is around 200 ppi.
This means that a 27 inch 5k monitor, with a resolution of 2880 by 5120 pixels, when viewed from a distance twice its height, i.e. about 86 cm (34 inch), provides a resolution close, but slightly less than that of typical human vision. (That viewing distance that is double the height corresponds to the viewing angle of camera lenses with normal focal length, which has been based on studies about the maximum viewing angles where humans are able to perceive a correct perspective when looking at an image as a whole.)
However, when not watching movies, but working with text documents, you normally stay closer to the monitor than that, so even a 5k monitor is not good enough (but an 8k monitor may be enough, so that might be the final monitor resolution, beyond which an increase is useless).
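A minimal sketch of that arithmetic, assuming a 16:9 panel; the pixels-per-degree values are assumptions (the lower one roughly reproduces the "around 200 ppi" figure, the higher one the "about 3300 pixels" figure), not numbers quoted from the paper.

```python
import math

def vertical_pixels_needed(diagonal_in: float, aspect=(16, 9),
                           distance_over_height: float = 2.0,
                           pixels_per_degree: float = 94.0):
    """Vertical pixel count needed to reach an assumed acuity limit when the
    screen is viewed from `distance_over_height` times its own height."""
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)      # physical screen height
    distance_in = distance_over_height * height_in      # recommended viewing distance
    # Vertical angle subtended by the screen at that distance.
    vfov_deg = 2 * math.degrees(math.atan(height_in / (2 * distance_in)))
    pixels = vfov_deg * pixels_per_degree
    return height_in, distance_in, vfov_deg, pixels, pixels / height_in

for ppd in (94, 120):  # assumed acuity figures: newer vs. older estimates
    h_in, d_in, fov, px, ppi = vertical_pixels_needed(27, pixels_per_degree=ppd)
    print(f"27in 16:9: height {h_in:.1f} in, distance {d_in:.0f} in, "
          f"vFOV {fov:.1f} deg, {ppd} ppd -> {px:.0f} px (~{ppi:.0f} ppi)")
```

Because the viewing distance scales with the screen height, the vertical field of view (about 28 degrees) and hence the required pixel count are the same for any screen size; only the resulting ppi depends on the physical dimensions.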
Going from 1080p to 1440p feels like a huge improvement. Going from 1440p to 4k (aka 2160p) is a little bit sharper. I don't think the jump from 4k to 8k will improve things that much.
At "laptop" screen distances the difference between my Retina display and non-retina external monitors is quite noticeable; so much so that I run 4k in 1080p mode more and more.
8k is going to require those curved monitors because you'll have to be that close to it to get the advantage.
Are you talking about the resolution of the video or of the screen itself? Lower-resolution video also looks worse because of compression. I saw a bigger difference from video compression than from screen resolution: e.g. well-encoded 1080p video looked better than badly encoded 1080p video on any screen and at any resolution.
I have a 43" 4k monitor at ~1m distance and when I use the computer set up with 100% scaling it looks bad, I see pixelation... and "subpixelation". On a different computer connected to the same screen but something like 150% scaling it's like night and day difference. Everything looks smooth, perfect antialiasing.
This is the money picture [0]. Above a certain distance any improvement is imperceptible. But don't compare compressed video on a screen; compression adds quality issues that influence your perception.
[0] https://media.springernature.com/full/springer-static/image/...
For plain video, 1080p at high enough bitrate is fine.
It depends on the distance really.
Text at a desktop, at arm's length at most, possibly on a 24"+ display, will be noticeably better.
I have a 34" ultrawide 1440p and I would definitely love higher pixel density.
But at 24" or less, 1440p vs 4K is borderline marketing.
People do swear they can see the difference, yet I remember a randomized test of 120+ gamers who were shown the same TV at different output resolutions: the distribution of guesses showed only a very slight advantage for 4K, well within the margin of error, and it dropped to nothing with just a few centimeters more distance.
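For reference, the raw pixel densities behind these comparisons (the sizes and resolutions are common examples, not the specific products mentioned):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density of a panel from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

panels = {
    "34in ultrawide 1440p (3440x1440)": (3440, 1440, 34),
    "24in 1440p (2560x1440)":           (2560, 1440, 24),
    "24in 4K (3840x2160)":              (3840, 2160, 24),
    "27in 5K (5120x2880)":              (5120, 2880, 27),
}
for name, (w, h, d) in panels.items():
    print(f"{name}: {ppi(w, h, d):.0f} ppi")
```

Whether those densities (roughly 110 to 220 ppi) are distinguishable still depends on viewing distance, as the earlier calculations show.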
- we are all colorblind, as we see only a very small part of the light spectrum
- no two individuals in the world perceive colors in the same way, as receptor distribution varies from person to person (and actually even from day to day within a single individual)
That's not what "colorblind" means. Are we also all deaf, because we can't hear sounds outside a certain frequency range?
Our resolution varies across our FOV, but is best measured in the center of vision. (Ironically, we can potentially see smaller objects away from the center, but... biology is analog, not digital, and weird.)
Focal length is merely the conversion factor from physical resolution on the receiver side to angular resolution in the real world. It varies, but not by much, from eye to eye, with vision aids (glasses, etc.), and (very slightly) with distance to the object. So it's basically a constant.
To stick with your example: the pilots who "see" aircraft at 4 nautical miles don't "see" them because their foveas have superhuman pixel density, but because high-contrast, very low-res foveal samples + contextual inference + motion + time give the brain more time to infer the presence of the aircraft. If you could stop the motion and freeze the image, the very same aircraft would "disappear".
Your statement about resolution being infinite is laughably wrong. I assume you don't even understand what "infinite" means.
You wrote two sentences that are both wrong.
Also, your understanding of resolution is lacking. Our eyes aren't uniformly covered with receptors at all. The megapixel figure you claim is "virtual", for want of a better word. It's like taking one second of SD (480p) video, combining it with AI, and extrapolating a single 500 MP photo. It's very much like that, in practice.
And my smartphone and my DSLR take pictures at more than 4K resolution; you can immediately see the difference in sharpness there as well.
What content is left? Games, but they also have text. Higher resolution means less antialiasing is needed.
The most frustrating thing about this type of discussion is: it's 2025! 4K displays have been in the consumer space for how long now? Ten years?
8K is also nice. It might actually be something I wouldn't pay extra for, but supporting 8K opens up so many possibilities, like controlling four 4K displays without issues (demo walls, event setups, etc.) or just two or three 24/27" displays on your desk.
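A quick sanity check on the "four 4K displays" point: an 8K frame carries exactly four 4K frames' worth of pixels. The refresh rate and bit depth below are illustrative assumptions for the bandwidth figure, not values from the thread.

```python
def pixel_rate_gbps(width: int, height: int, hz: int = 60, bits_per_pixel: int = 30) -> float:
    """Uncompressed pixel payload in Gbit/s (ignores blanking and link overhead)."""
    return width * height * hz * bits_per_pixel / 1e9

print(7680 * 4320 // (3840 * 2160))   # 4 -> one 8K frame holds four 4K frames' pixels
print(f"{pixel_rate_gbps(7680, 4320):.1f} Gbit/s at 60 Hz, 10-bit RGB (assumed)")
```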
I remember comparing a 4K MacBook with my 1440p Samsung, both 14", and it was impossible to tell the difference at that screen size.
Looks like the good text anti-aliasing that made lower-DPI displays look better is slowly getting the bitrot treatment...
Anecdotally, I was at a family Christmas where relatives were watching some home videos encoded at 480p from a flash drive on a new TV, and they all said the quality of the 4K picture was amazing despite the fact that they were watching 480p video (without upscaling, because I'd turned it off). To be fair, it did look better than on an old TV, but not because of the resolution.
Having a high-resolution monitor is great; it's even become semi-affordable. But color accuracy can be a pain point.
This doesn't matter if you're just coding, unless you really care how realistic that code looks... but to anyone doing video or photo manipulation or front-end design it can be a deal breaker.
I always buy good displays (IPS, high resolution, good test results), and the calibration was very good out of the box. I sold the calibration device afterwards.
If you don't have to deliver pinpoint accuracy from display to an industry-level printer with a color profile, you won't need calibration as long as you don't buy cheap.
It seems that after the failure of 3D and 8K, the industry doesn't seem to care about anything anymore.
Did they maybe not measure how many pixels we can see, but rather how laughably bad COTS IPS panels are at contrast as the examined pattern approaches their resolution? I wonder what happens if you repeat that with a reasonably bright 16K OLED.
1 more comment available on Hacker News