What is the Resolution of an Image, Screen, or Camera?
Resolution is a term that has two different meanings when talking about digital images, cameras, computer screens, images for printing (DPI), or physical screen density (PPI):
- The resolution of a screen is its dimensions in pixels, e.g. a screen with 1600x900px resolution has 1600 pixels in width and 900 pixels in height.
- The resolution of an image can be its dimensions in pixels, too.
- The resolution of a camera refers to the dimensions of the digital images it creates. It has the same meaning as the above, but is normally measured in megapixels (one million pixels), e.g. if you take a photo and it saves a JPEG that's 1000x2000px, that's 2 million pixels in total, or 2 megapixels (2 MP).
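The megapixel figure above is just simple arithmetic; a quick sketch using the 1000x2000px photo from the example:

```python
# Megapixels are the total pixel count divided by one million.
width, height = 1000, 2000  # dimensions of the example JPEG

total_pixels = width * height
megapixels = total_pixels / 1_000_000

print(total_pixels)  # 2000000
print(megapixels)    # 2.0
```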
- The resolution of an image for printing is measured in DPI (dots per inch). Essentially, it refers to how many pixels of a digital image occupy a single inch in print. The equivalent term for computer screens is PPI (pixels per inch): how many pixels fit in a single inch of the physical screen. Typical values are 90 DPI (typical screen density), 150 DPI, 300 DPI, and 600 DPI (high-quality print). Terms like HiDPI (or Retina) refer to very dense screens that pack many pixels into a small area.
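To see where a number like "90 DPI for a typical screen" comes from, you can compute PPI from a monitor's pixel dimensions and its physical diagonal size (the standard way monitors are advertised). The monitor size used below is just an illustrative example:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: the diagonal length in pixels divided by the
    diagonal length in inches."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

# A common 24-inch 1920x1080 monitor lands near the "typical" ~90 PPI.
print(round(ppi(1920, 1080, 24), 1))  # 91.8
```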
Important: resolution refers to the amount of graphical data, not to its quality. Although it's tempting to assume that the more data (in this case, pixels) you have, the higher the quality will be, that's not always the case: other factors can make the quality worse than a lower-resolution alternative.
A simple example is digital cameras. The megapixel count gives an easy way to measure the resolution of the images taken, but that doesn't necessarily mean a camera that takes 16-megapixel images will take photos of better quality than one that takes 8-megapixel images. The camera's hardware could be poorer and capture worse images, yet it could still be marketed as higher resolution because it creates images with more pixels.
Similarly, for monitors, terms like 4K and 8K refer to the resolution of the monitor screen. Because these terms are marketable, manufacturers may create monitors that technically fit the definition of 4K even though in practice they look worse than a sub-4K monitor. This can happen due to fractional scaling.

On a computer, most graphics are created to be displayed at 1.0 pixel scaling. They will still look reasonably good at integer scaling, e.g. if we jump to 2.0 or 3.0. What this means is that if we had a 1600x900px monitor before, we need a 3200x1800px monitor of the same physical size to go from 1.0 to 2.0. The screen doesn't get bigger; it just packs more pixels into the same space.

However, that's a huge leap, and most people won't buy a new monitor that's exactly the same physical size as their old one. You're more likely to find monitors that require fractional scaling instead, such as 1.25, 1.5, or 1.75. These can make graphics look very poor in various software, and then users blame the software for not supporting fractional scaling, instead of realizing they purchased a piece of hardware that only makes scalable graphics look good. Every other graphical asset is going to look "scaled up," the graphics card will have to do more work (consuming more electricity), and worst of all, they would probably be happier with a 1.0 monitor that is simply physically bigger than the one they got.
I want to note that some people say they can see the pixels and that's bad. I don't really see any pixels on my 1.0 monitor. I wouldn't really mind even if I did see pixels. I like pixels. You would need a very large monitor with a very small PPI or to shove your face against the screen to see any pixels.