DPI

by Joe Gillespie — Aug 1, 1999

How a laser printer emulates halftone dots using laser dots.

Laser matrix

With computer printers, dots are also used to build up an image, but in a slightly different way. Laser printers generally try to simulate the effect of a traditional halftone screen with varying sizes of dots. They can usually only produce a single dot size of one color - 300 or 600 DPI are typical. These tiny dots are clustered to produce larger halftone dots in a kind of snail shell pattern. The effective halftone screen is relatively crude and shows up as banding or posterisation in photographic images.

The example below shows how 16 different halftone dot clusters can be made from a grid of 4 x 4 laser toner dots.
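If you like to see these things in code, here is a minimal Python sketch of the same idea - not how any real printer driver works, and the 'snail shell' ordering in SPIRAL_ORDER is just an illustrative, made-up table - showing how a single grey value becomes a cluster of identically sized laser dots.

# A 4 x 4 'spiral' threshold table: the numbers give the order in which
# the 16 toner dots are switched on, growing outwards from the centre.
SPIRAL_ORDER = [
    [12, 13, 14, 15],
    [11,  0,  1,  4],
    [10,  3,  2,  5],
    [ 9,  8,  7,  6],
]

def halftone_cell(grey):
    """Map grey (0.0 = white, 1.0 = black) to a 4 x 4 pattern of laser dots."""
    dots_on = round(grey * 16)          # 0..16 dots in the cluster
    return [
        ["#" if SPIRAL_ORDER[y][x] < dots_on else "." for x in range(4)]
        for y in range(4)
    ]

# A 50% grey fills half of the cell with toner:
for row in halftone_cell(0.5):
    print(" ".join(row))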

Stochastic Screening

Stochastic screening as used in ink-jet printers.

Ink-jet printers employ a more random dot pattern known as 'Stochastic' screening, rather than the regular halftone patterns used in traditional print. This irregular, granular pattern gives more tonal range and finer detail than conventional fixed-screen halftones. Together with higher resolutions of 720 or 1440 DPI and the fact that there are four or more colors, ink-jet printers can give a virtually 'screenless' photographic image. More sophisticated ink-jet printers can vary the dot density too.
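To see why the result looks 'screenless', compare that with a rough Python sketch of stochastic screening - again only an illustration, not the algorithm any particular ink-jet driver uses. Each position is switched on or off against a random threshold, so a flat grey becomes an irregular, granular scatter of dots rather than a regular grid of clusters.

import random

def stochastic_screen(grey, width, height, seed=1):
    """Turn a flat grey (0.0 = white, 1.0 = black) into an irregular dot pattern."""
    rng = random.Random(seed)           # fixed seed so the example is repeatable
    return [
        ["#" if rng.random() < grey else "." for _ in range(width)]
        for _ in range(height)
    ]

# A 30% grey gives roughly 30% ink coverage with no visible screen pattern:
for row in stochastic_screen(0.3, 32, 8):
    print("".join(row))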

So, by now, you should understand what a 'dot' is. It is a tiny dollop of ink or laser toner on a page.

Pixels

This 4x enlargement shows the individual pixels; there are 1,628 of them in total.

A 'pixel' is the smallest addressable element on a computer monitor. The name is short for 'PIcture ELement'. A pixel is just a space in the computer's video memory. The amount of space it occupies depends on the color depth. A 1-bit pixel can only be on or off, giving a monochromatic (black and white) image. A 4-bit color depth can produce 16 colors, typical of older PCs running MS-DOS or early versions of Windows. 8 bits give a palette of 256 colors. 16-bit and 24-bit can provide 65K and 16M colors per pixel respectively.
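The arithmetic behind those figures is simple enough to check for yourself. The little Python sketch below just raises 2 to the power of the bit depth; the 800 x 600 memory figure is purely illustrative.

# The number of colors a pixel can show is 2 to the power of its bit depth.
for bits in (1, 4, 8, 16, 24):
    print(f"{bits:>2}-bit pixel: {2 ** bits:,} colors")

# Video memory for an 800 x 600 screen at 8 bits (1 byte) per pixel:
pixels = 800 * 600
print(f"{pixels // 1024} KB of video memory")   # 468 KB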

The resolution of a computer monitor is given in Pixels Per Inch. Quite simply, this is the number of pixels per linear inch - if your screen is set to 800 x 600 pixels and it is 12 inches wide, your screen resolution is 800/12 = 66.7 PPI. If you change the resolution to 1024 x 768 on the same 12 inch wide monitor, the pixels get smaller and your effective resolution is 85.3 PPI.

Smaller pixels = higher resolution!
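In Python, the same back-of-the-envelope sum looks like this (the 12-inch width is just the example from above):

def ppi(pixels_across, width_inches):
    """Pixels per linear inch: horizontal pixel count over physical width."""
    return pixels_across / width_inches

print(f"{ppi(800, 12):.1f} PPI")    # 66.7 PPI at 800 x 600 on a 12-inch-wide screen
print(f"{ppi(1024, 12):.1f} PPI")   # 85.3 PPI at 1024 x 768 - smaller pixels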

Monitor sizes are not very meaningful. The sizes quoted are usually the diagonal measurement of the whole tube rather than the effective working area. My '17 inch' Apple Studio Display monitor has an effective width of 12 inches whereas my '15 inch' flat screen monitor is exactly the same width - and resolution.

Sometimes you will see the term 'dot-pitch' used to describe a monitor. This is a physical attribute of the cathode ray tube's shadow mask and nothing to do with pixel resolution. The screen's dot pitch has to be smaller than the smallest pixel you are likely to require - small enough to contain at least one Red, Green and Blue phosphor dot.

So, what matters is the number of pixels per inch on a given screen. This can vary considerably but averages out to about 72 PPI. At 72 PPI, each pixel is one typographic point square. The original Macintosh computers worked at this resolution, but with modern multi-resolution monitors it is just a notional size. It does have the advantage that what you see on the screen and what you print out on paper are exactly the same size.

PCs started out with relatively low resolution monitors - just glorified television sets in fact. They didn't have the capability to display anything like 72 PPI, so when the Windows OS came along, they made everything on the screen one third bigger than the actual print size to compensate. They couldn't make the pixels smaller, so they made the inches bigger. So, a 'logical inch' on a PC monitor is an inch and a third and the quoted 96 PPI is based on a bigger inch, not smaller pixels. In reality, the screen resolution at a standard inch is still about 72 PPI, the same as a Mac.

Working with screen-based Web or multimedia images, DPI has no meaning whatsoever. If you scan an image at 50 dpi or 600 dpi and it is 100 pixels wide, it will be 100 pixels wide on any screen regardless of its resolution. Of course, if it is on a 640 x 480 screen it will occupy a larger area of the screen real estate than on a 1024 x 768 monitor.

If you think of screen images as being like mosaic tiles where every tile is a different color but fixed in size, you will get the idea. Pixels are just mosaic tiles made from glowing phosphors or LCD elements on a screen. They are almost always square, and approximately 1/72 of an inch in each direction.
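Here is that point as a quick Python calculation - the three PPI figures are just plausible examples, corresponding roughly to the screens discussed above:

def physical_width_inches(image_pixels, screen_ppi):
    """How much screen real estate a fixed pixel width actually covers."""
    return image_pixels / screen_ppi

# The same 100-pixel-wide image on three different screens:
for screen_ppi in (66.7, 85.3, 96):
    inches = physical_width_inches(100, screen_ppi)
    print(f"100 px at {screen_ppi} PPI is {inches:.2f} inches across")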

Research shows that more than half of all Web surfers use an 800 x 600 pixel monitor, followed by 1024 x 768. 16-bit color also accounts for more than half of all surfers. Web designers use more highly specified monitors and computers, but should make sure that their pages work for the majority of surfers at 800 x 600 with 16-bit color.

Using type is more confusing. 12 point type on a Mac is about 12 pixels high, but 12 point type on a PC is a third bigger - 16 pixels high. The flow of the text columns will be considerably different from one machine or browser to another. If you consider that each user's Web browser has a 'normal' type size, it is best to work relative to that by specifying -1, -2, +1, +2 etc. The fact that pictures stay a constant size and type doesn't is potentially the Web designers' greatest trap.
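The point-to-pixel arithmetic is worth spelling out, because it is where the trap lies. This little Python sketch simply converts a nominal point size at an assumed PPI; it is not a description of how any particular browser's font engine works.

def points_to_pixels(points, ppi):
    """A point is 1/72 of an inch, so pixel height depends on the assumed PPI."""
    return points * ppi / 72

print(points_to_pixels(12, 72))   # 12.0 px - the classic Mac assumption
print(points_to_pixels(12, 96))   # 16.0 px - the Windows 'logical inch' assumption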

People often ask what is the best resolution for scanning Web images. There are a number of issues here. Desktop scanners are similar to LCD monitors in that they have a 'natural' resolution corresponding to one element per pixel. My Apple Flat Screen Display's natural resolution, for instance, is 1024 x 768. It can be set to 800 x 600 or 640 x 480, but then it loses the one pixel per LCD element relationship because the image has to be interpolated to fit. Neither 800 nor 640 divides evenly into 1024 and there is no such thing as a half pixel, so the image is fuzzy - the same thing happens with a scanner.

A scanner will produce the best results at its natural resolution; this is most apparent on images with regular patterns. Scanning at multiples of the natural resolution and reducing back by the same amount keeps the scanning head elements and pixels in sync with one another and minimises distortion, but the higher the magnification you scan at, the more you have to interpolate when you reduce the image to its final size, and the softer it gets. So, it is best to scan at 2x or 4x your scanner's natural resolution as quoted by the manufacturer.
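As a rough Python sketch of that rule of thumb - the 600 PPI optical figure and the 4-inch original are just assumed example values, not a recommendation for any particular scanner:

def scan_plan(optical_ppi, original_width_inches, target_width_px):
    """Compare integer multiples of the scanner's natural (optical) resolution."""
    for multiple in (1, 2, 4):
        scan_ppi = optical_ppi * multiple
        scanned_px = scan_ppi * original_width_inches
        reduction = scanned_px / target_width_px
        # The bigger the reduction factor, the more interpolation and softening.
        print(f"scan at {scan_ppi} PPI -> {scanned_px:.0f} px wide, "
              f"reduce {reduction:.1f}x to reach {target_width_px} px")

# A 4-inch-wide original destined to become a 400-pixel-wide Web image:
scan_plan(600, 4, 400)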
