DPI


What is DPI?

DPI stands for dots per inch. It's a term used when measuring the resolution of something, so a high DPI means high resolution, which is generally better than low resolution (low DPI). Many things in a computer have a DPI: digital images, screens, mice, drawing tablets, and scanners. DPI is fundamentally the same concept as PPI (pixels per inch), SPI (samples per inch), and CPI (counts per inch).

What is DPI in an Image?

In a digital image, DPI matters when printing the image: it refers to how many pixels of the image will be used to fill one inch of the surface the image is printed onto.

For example, if an image is 1000x1000px and it's printed at 100 DPI, that means we'll use 100 pixels of image per inch of paper. Since the image is 1000 pixels wide, the printed image will be 10 inches wide.

If this same image, the same pixels, were printed at 200 DPI, then we would use more pixels of the image to fill a single inch of paper: the image would still be 1000px wide, but its physical width would become only 5 inches.
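
The arithmetic is just a division. Here's a minimal Python sketch of it (the function name is my own):

def printed_size_inches(width_px, height_px, dpi):
    # Physical print size: pixels divided by pixels-per-inch.
    return width_px / dpi, height_px / dpi

print(printed_size_inches(1000, 1000, 100))  # (10.0, 10.0) -> 10x10 inches
print(printed_size_inches(1000, 1000, 200))  # (5.0, 5.0)   -> 5x5 inches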

Some image formats, like JPG, support saving a DPI for the image inside the image file. This property is just a number in the image's metadata, which means that, with appropriate software, it should be possible to change the DPI without creating JPG artifacts. Printing software may also let you ignore the image's DPI completely and specify the DPI manually.
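
As a sketch of what "just a number in the metadata" means in practice, here's how you could read and rewrite the DPI with the Pillow library in Python (the file names are hypothetical):

from PIL import Image

img = Image.open("photo.jpg")
print(img.info.get("dpi"))  # e.g. (72, 72), or None if the file stores no DPI

# Saving with a new DPI changes only the metadata number. One caveat: Pillow
# re-encodes the JPEG data on save; a dedicated metadata editor (e.g. exiftool)
# can change the DPI without touching the compressed pixel data at all.
img.save("photo-300dpi.jpg", dpi=(300, 300))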

[Infographic: a 70x64px raster image (a crop of the Mona Lisa's face) zoomed in until its pixel grid is visible; a single pixel's color is decomposed into 8-bit red, green, and blue channels (93, 46, 37, or hex code #5D2E25); finally, the memory footprint of the uncompressed 8-bit RGB image is computed: 70 x 64 x 3 bytes = 13,440 bytes.]
An infographic explaining what a pixel in a raster image is, and what sort of data a pixel contains. Observe how nothing here has anything to do with the actual size of the pixel on the screen or when it's printed.

Does DPI Matter for Images on the Web?

DPI doesn't affect images displayed on a screen at all, whether in a web browser or an image viewer. DPI only matters when printing an image. If you are uploading images to a website or creating your own webpage, DPI isn't something you should worry about, unless the images are meant to be printed.

Note: nowadays, if you're creating webpages yourself, you can provide multiple image files at different resolutions for high-DPI screens (see below for what those are). This can be done with the <img> element's srcset attribute [1].
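
A minimal sketch of what that looks like (the file names are hypothetical; a browser on a 2x screen would pick the second file):

<img src="photo-400.jpg"
     srcset="photo-400.jpg 1x, photo-800.jpg 2x"
     width="400" height="300" alt="A photo">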

DPI-Aware Graphics Software

Another scenario where DPI matters is in a DPI-aware image editor or graphic design software, such as Inkscape. In Inkscape, you can create a document by specifying its size in inches or centimeters. If you drag a JPG file into Inkscape, it will be sized according to its DPI. This means that two images with the same pixel dimensions (e.g. 1000x1000px) but different DPIs will be embedded at different sizes inside Inkscape. Under the hood this is the same division as before: a 1000x1000px image at 72 DPI embeds at about 13.9x13.9 inches, while at 300 DPI it embeds at about 3.3x3.3 inches.

What is a Good DPI for an Image?

By default, digital images are created with the assumption that they will only be displayed on screens, so they are given a very low DPI. In most cases, this default value is 72. If you print at 72 DPI, you'll end up with blurry printed images.

For printing, it's recommended to use 300 DPI or 600 DPI. Note that changing the DPI value of an image doesn't help much, since the number of pixels in the image won't change, and upscaling the image won't help either, since that just interpolates (blurs) the existing pixels to stretch them to a new size.

For example, if you are a digital illustrator and you want to print something that is 10x10 inches, you must draw at a 3000x3000px resolution if you want to print at 300 DPI, or at a 6000x6000px resolution to print at 600 DPI.
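
The same division run backwards gives the resolution you need to draw at (a sketch; the names are my own):

def required_pixels(width_in, height_in, dpi):
    # Canvas resolution needed to print a given physical size at a given DPI.
    return width_in * dpi, height_in * dpi

print(required_pixels(10, 10, 300))  # (3000, 3000)
print(required_pixels(10, 10, 600))  # (6000, 6000)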

There have been claims that 360 PPI is the most you need, but 720 PPI images can easily be seen to be much sharper again in print, if this data is available at good quality from the original file.

[...]

The existing standard for high quality, photographic printed images is 300 PPI, meaning that for each inch of the printed image, there must be 300 source pixels to use.

https://imagescience.com.au/knowledge/the-difference-between-ppi-and-dpi (accessed 2024-07-21)

What is DPI in Screens?

In a screen, be it a desktop monitor, a laptop screen, or a smartphone or tablet screen, DPI is normally called PPI (pixels per inch), or pixel density, and refers to how many pixels are displayed in a single inch of the screen. The distance between these physical pixels is called the pixel pitch.

Why are Screens 72 DPI?

Screens aren't actually 72 DPI. We can do some basic math to find out the DPI of a screen.

Let's take a typical 20-inch monitor with a 1600x900px resolution. This "20 inch" refers to the diagonal measurement (i.e. the hypotenuse). To find its width in inches, knowing its aspect ratio, we can calculate [2]:

# Width of a screen with a 16:9 resolution and a 20-inch diagonal.
width_px = 1600
height_px = 900
ratio = height_px / width_px  # 9/16 = 0.5625
diagonal_in = 20
physical_width_in = (diagonal_in**2 / (ratio**2 + 1)) ** 0.5

In other words:

(20^2 / ((9/16)^2 + 1))^(1/2) = 17.4315107425

So this monitor is about 17.43 inches wide, and those 17.43 inches hold 1600 columns of pixels. If we divide 1600 by 17.43, we get about 91.8. This monitor's DPI is roughly 92, then, not 72.
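
Putting the whole calculation into one Python function (the function name and the extra examples are mine):

def screen_dpi(width_px, height_px, diagonal_in):
    # Horizontal pixel density of a screen, from its resolution and diagonal.
    ratio = height_px / width_px
    physical_width_in = (diagonal_in**2 / (ratio**2 + 1)) ** 0.5
    return width_px / physical_width_in

print(round(screen_dpi(1600, 900, 20)))   # ~92
print(round(screen_dpi(3840, 2160, 27)))  # a 27-inch 4K monitor: ~163, HiDPI
print(25.4 / screen_dpi(1600, 900, 20))   # pixel pitch: ~0.277 mm per pixel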

Why is 72 the Default DPI?

I found only one source on this. It seems this has a very old origin!

In the 1980's, computer screens did have resolutions of 72 DPI, it was because dot matrix printers printed at 144 DPI, so what you saw on the screen was going to be roughly equal to what was printed on a 2 to 1 scale. As screens and printers have improved, the 72 DPI rule for web resolution is now completely irrelevant.

https://www.pixelperfectcreative.com/blog/72-dpi-why (accessed 2024-07-21)

What is HiDPI?

HiDPI is a term used for screens that have high pixel density, generally twice that of the average screen. If we consider the traditional screen to have somewhere between 72 and 90 DPI, then a HiDPI screen would have 150 to 200 DPI.

For the DPI of a screen to increase, the number of pixels in its resolution has to increase while its physical size stays the same. If we doubled the DPI of a monitor while keeping its physical size, everything on the screen would look half as large.

For example, say we have a 300px wide button on a 75 DPI screen; it occupies 4 inches of space on the screen. If we double the DPI to 150, the number of pixels in the button won't magically increase, which means that same 300px button will now look much smaller, occupying only 2 inches of space on the screen.

In order to compensate for this, every program that displays something on the screen has to draw everything at twice its normal size in pixels so that it looks the same size in inches. When you do this, the memory and processing required actually quadruple. For example, a 10x10px image has 100 pixels in total, but a 20x20px image has 400 pixels in total, and 400 is 4 times 100.
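
A quick sketch of that quadratic growth, assuming an uncompressed RGBA buffer at 4 bytes per pixel:

def buffer_bytes(width_px, height_px, scale=1.0):
    # Memory for an uncompressed RGBA buffer drawn at a given scale factor.
    return int(width_px * scale) * int(height_px * scale) * 4

print(buffer_bytes(10, 10))       # 400 bytes at 1x
print(buffer_bytes(10, 10, 2.0))  # 1600 bytes at 2x: 4 times as much
print(buffer_bytes(10, 10, 1.5))  # 900 bytes at 1.5x: 2.25 times as much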

Drawing at a larger size without losing quality is only possible with vector images such as SVG, or with graphics that programs draw programmatically at the target size.

Some things cannot be compensated for. If you are browsing a website that was made for normal-DPI screens, the images on the website will have been created assuming a "72 DPI" screen. If the web browser displayed them at their real size, they would look smaller than you expect. For them to look "normal" size, the browser needs to stretch each image to twice its size.

This "twice" we're talking about is also known as 2x scaling (integer scaling). Some screens have fractional scaling, such as 1.5x. Fractional scaling has the same problems of consuming more resources just to stretch images, and it generally looks worse than integer scaling3.

HiDPI and fractional scaling are common in handheld devices' screens because they're held closer to the eyes: they occupy more degrees of arc in our field of vision and therefore need a greater resolution per inch, just like print. Among desktop screens, 4K and 8K monitors are typically HiDPI. Note that a 4K monitor at 2x scaling fits the same amount of content as a 1080p ("2K") monitor at 1x of the same physical size.

What is Retina Display?

This is Apple's marketing term for HiDPI. It means the same thing as the above.

If there is any single number that people point to for resolution, it is the 1 arcminute value that Apple uses to indicate a “Retina Display”. This number corresponds to around 300 PPI for a display that is at 10-12 inches from the eye. In other words, this is about 60 pixels per degree (PPD).

https://www.anandtech.com/show/7743/the-pixel-density-race-and-its-technical-merits (accessed 2024-07-21)
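
We can sanity-check those numbers: at a viewing distance d, one degree of vision spans roughly d x tan(1°) inches of screen, so pixels per degree is PPI x d x tan(1°). A quick sketch:

import math

def pixels_per_degree(ppi, distance_in):
    # One degree of vision covers distance * tan(1 degree) inches of screen.
    return ppi * distance_in * math.tan(math.radians(1))

print(round(pixels_per_degree(300, 11.5)))  # ~60 PPD, matching the quote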

What is DPI in a Mouse?

A mouse can have a DPI too. In this case, we're talking about the resolution of the hardware sensor; in other words, the DPI of a mouse relates to its sensitivity. Optical mice need to sense the surface they're moved across to detect which direction they've been moved in. The DPI of a mouse, or more accurately its CPI (counts per inch), refers to how many times per inch the mouse samples the surface. This measurement assumes the mouse moves in a straight line.

[..] if your mouse sensor was calibrated at 1 DPI, if you moved it linearly one inch to the right, your cursor would move 1 pixel across to the right.

https://www.corsair.com/us/en/explorer/gamer/mice/what-is-dpi-does-it-affect-gaming/ (accessed 2024-07-21)

It's worth noting that you can also set the mouse sensitivity in your operating system. The DPI of a mouse is, then, the maximum sensitivity its hardware is capable of. If a mouse has 800 DPI, it can only report up to 800 "movements" for each inch it travels. We'll call these movements "events" to keep things simple.

So every time the physical mouse senses it has moved, it sends an event to the operating system, and the operating system moves the mouse cursor by 1 pixel on the screen. If a mouse has 500 DPI, it can only sense movements as small as 1/500 = 0.002 inches. If it's 1000 DPI, as small as 0.001 inches. And so on.

Theoretically, we could make the cursor move by 2 pixels per event, making the mouse twice as fast. However, the mouse cursor is really an invisible virtual object whose size is a single pixel on the screen, with an arrow icon drawn around it. Because its real size is only 1 pixel, moving 2 pixels per event would "jump over" some rows or columns of pixels on the screen, making those pixels effectively unclickable.

Let's say we have a 1600x900px screen. With a 1600 DPI mouse, if the cursor were at the left edge and we moved the mouse 1 inch to the right, the cursor would travel across the whole screen.
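
In code (a sketch that ignores OS sensitivity settings and pointer acceleration, and assumes 1 pixel per event):

def inches_to_cross_screen(screen_width_px, mouse_dpi):
    # Inches of mouse travel needed to move the cursor across the screen.
    return screen_width_px / mouse_dpi

print(inches_to_cross_screen(1600, 1600))  # 1.0 inch
print(inches_to_cross_screen(1600, 800))   # 2.0 inches
print(1 / 500)  # smallest movement a 500 DPI sensor can detect: 0.002 inches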

Note: DPI isn't the only factor that affects "mouse speed." Another factor is the USB polling rate for USB mice. For example, if the polling interval is 8ms (a 125Hz polling rate), then no matter how fast you move the mouse, the "events" won't be processed until the device is polled, which can take up to 8ms, so there is always a tiny lag between the hardware events and the software processing them.

References

  1. https://developer.mozilla.org/en-US/docs/Learn/HTML/Multimedia_and_embedding/Responsive_images (accessed 2024-07-21) ↩︎
  2. https://stackoverflow.com/a/1043229 (accessed 2024-07-21) ↩︎
  3. https://blog.elementary.io/what-is-hidpi/ (accessed 2024-07-21) ↩︎
