Understanding Resolution

Artists often ask me about image resolution, usually because they have been asked to provide images for a gallery, a publication, or a show application.  In the process they are presented with required specifications for the images, and those specs generally include resolution.

It turns out there are a lot of misconceptions about image resolution.  I’d like to try to clear some of them up with a simple discussion of how resolution works, as much in layman’s terms as possible.  You won’t need to be a geek to follow this!

Resolution is basically a measure of how much fine detail is in the image.  The higher the resolution, the greater the detail.  OK, but how does it work?  How do we know what resolution an image has, or what it should have?

To answer that, you have to look at the basic structure of a digital image.  Simply put, a digital image consists of a collection of tiny dots called pixels.  Each one represents a tiny part of the image.  When you put them all together, you get a whole image.  (Kind of like a magazine or newsprint image.  From a distance it looks like a picture, but up close it looks like a bunch of dots.)

Since digital images start life as rectangles or squares, they are broken into rows and columns of pixels.  (Think of a grid like a window screen being laid over an image, breaking it into tiny squares.)  The resolution of the image depends on how fine the screen is.  The finer the screen, the smaller the dots, and the smaller the dots, the higher the resolution.

For any image, you have a certain number of pixels across the image, and a certain number of pixels from top to bottom, and more pixels (dots) equals more resolution.  But how do we know how many dots there are?

Resolution is measured by how many of these tiny dots, these pixels, are across one inch of the image.  This is where we get the term ‘dpi,’ which stands for ‘dots per inch.’  More dots per inch equals higher resolution.

For example, say we have a picture that is 10 inches wide by 10 inches tall.  We also have two screens, one that has 100 holes per inch, and one that has 300 holes per inch.  If we lay the first one over the image and count the holes, or dots, we get 10 inches broken into 100 dots per inch, or 1000 dots.  But if we instead lay the second screen over the same image we get 10 inches broken into 300 dots per inch, or 3000 dots.
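If you like seeing that arithmetic spelled out, here it is as a few lines of Python.  (The numbers are just the ones from the example above; feel free to skip the little code sketches in this post if they’re not your thing.)

    width_inches = 10    # our picture is 10 inches wide
    coarse_dpi = 100     # first screen: 100 holes per inch
    fine_dpi = 300       # second screen: 300 holes per inch

    print(width_inches * coarse_dpi)   # 1000 dots across
    print(width_inches * fine_dpi)     # 3000 dots across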

More dots per inch (a higher dpi) means higher resolution, which means more fine detail captured in the image.  300 dots per inch is way better than 100 dots per inch.

Simple enough, eh?  Not so fast.  Resolution is only part of the picture, so to speak.  There is another critical factor that needs to be taken into account, and that is image size, or dimensions.  In fact, resolution is not something that can be specified independently from image dimensions.  The two are closely interrelated.  Here’s why:

Say we have an image that is 300 dpi.  OK, how big is it?  There’s no way to know, just by the resolution.  It might be one inch across or ten inches across.  And the difference really matters, as you will see.  They both have the same dpi, but one has much higher resolution than the other.

Remember that images are broken into rows and columns of pixels, like laying a window screen over them.  The finer the screen, the smaller the dots, and the smaller the dots, the more of them you can pack into a given space, and the higher the resolution.

So imagine that we now have an image that is one inch by one inch, and we have a screen which has 300 holes per inch.  We lay the screen over the image and it breaks the image into dots.   Since the image is one inch wide and one inch tall and the screen is 300 dpi, the image will be broken into 300 dots vertically and 300 horizontally.

Now imagine the same image at 10 inches wide and 10 inches tall.  We place the same screen over it and we get an entirely different result.  Since the image is 10 inches wide and the screen is 300 dots per inch, we wind up with 3000 dots across the image and 3000 from top to bottom.

Same image, very different number of dots.  It should be clear that the latter version has much higher resolution than the former.  But wait, aren’t they both 300 dpi?  Yes, they are.  And this is why simply specifying dpi is not enough.  Dimensions are critical to knowing the true resolution of an image.

We’ve already established that the more pixels into which a picture is broken, the higher the resolution of the image.  The fact is that for any two images with the same dpi, the larger one has higher resolution because it has more pixels.  Thus a six-inch wide image at 300 dpi has way more (4x) resolution than the same image at three inches wide at 300 dpi.  (Doubling the width also doubles the height, so you get four times the pixels.)
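Here’s that four-times claim worked out.  I’ve assumed a 3:2 shaped image (6 x 4 inches versus 3 x 2 inches) purely to have concrete numbers; the ratio comes out the same for any shape.

    dpi = 300

    # Same picture printed at two sizes.  Doubling the width doubles
    # the height too, so the pixel count goes up by a factor of four.
    small_pixels = (3 * dpi) * (2 * dpi)   # 900 x 600   = 540,000 pixels
    large_pixels = (6 * dpi) * (4 * dpi)   # 1800 x 1200 = 2,160,000 pixels

    print(large_pixels // small_pixels)    # 4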

In the final analysis, a digital image is nothing more than a certain number of pixels horizontally, and a certain number of pixels vertically.  Pixels per inch and number of inches are simply ways we have devised to talk about how many pixels are in the image.  But the image itself is actually independent of either.  Technically an image has neither dpi nor dimensions nor resolution until we assign them to it.  All it has is pixels, which can be sliced up however we want.

Once you understand this, you can actually use a common shorthand for specifying an image – how many pixels on the long side.  It is not uncommon to see an image specified as “2100 pixels on the long side,” or “3000 pixels on the long side.”  The reason is simple.  If you have 3000 pixels, you know that at 300 dpi it will be 10 inches wide, because 10 inches times 300 pixels per inch equals 3000 pixels.  The very same image at 600 dpi will be 5 inches across, because five inches times 600 pixels per inch equals, again, 3000 pixels.
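If you want to check that yourself, the conversion is a single division, pixels divided by dpi:

    pixels_long_side = 3000

    print(pixels_long_side / 300)   # 10.0 inches at 300 dpi
    print(pixels_long_side / 600)   # 5.0 inches at 600 dpi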

In the end, the image doesn’t care about resolution, dimensions or dpi.  All it has is a certain number of pixels.  How you slice them up is your business, with the understanding that the more of them you pack into an inch, the fewer inches you get, or the more inches you require, the fewer pixels will be available for each inch.

So what is a good resolution for an image?  Well, that depends on what you intend to do with it.  For print, 300 dpi is considered the standard minimum resolution.  For most printing, a 300 dpi image will look great, with no visible artifacts from it being a digital image.  For very high quality printing, some printers specify 450 dpi or even 600 dpi.

So if you are sending an image off for print, read the requirements closely and make sure you conform your image both to the required dpi AND the required dimensions.  If you are having a 4×6 postcard printed, your digital image will need to be 1200 x 1800 pixels to print 4” x 6” @ 300 dpi.

For web usage the requirements are much lower, since screen resolution is so much lower than print resolution.  Standard web resolution is 96 dpi.  So if you are putting an image on your website and you want the image to appear on-screen at four inches wide, it needs to be 4 inches x 96 pixels per inch, or 384 pixels wide.
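Both of those calculations, print and web, are the same multiplication: inches times dpi.  Here it is as one tiny Python helper (pixels_needed is just a name I made up for illustration, not part of any software you need):

    def pixels_needed(inches, dpi):
        # How many pixels it takes to fill that many inches at that dpi.
        return inches * dpi

    # 4 x 6 inch postcard at 300 dpi:
    print(pixels_needed(4, 300), "x", pixels_needed(6, 300))   # 1200 x 1800

    # Four-inch-wide image on a 96 dpi web page:
    print(pixels_needed(4, 96))                                # 384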

 

Takeaway:

Basic image resolution is expressed in terms of dots per inch, or ‘dpi.’  The higher the dpi, the higher the image resolution, all other things being equal, such as image dimensions.

It is not enough to only specify an image’s dpi when expressing its resolution.  You also need to specify its dimensions.  Dpi and image dimensions are interdependent and inseparable.  One without the other is meaningless.

An example of a properly specified image is “five inches by seven inches @ 300 dpi.”

A common shorthand is number of pixels on the long side, such as “2100 pixels on the long side.”

Images to be printed should be at least 300 dpi at the intended print dimensions.

Images to be used on websites should be 96 dpi at the intended screen dimensions.

 
