kristinf opened this issue on Jul 30, 2006 · 10 posts
thundering1 posted Mon, 31 July 2006 at 1:48 PM
DPI is the LEAST important number in the 3-part equation (pixel width, pixel height, DPI) when figuring out the size of an image. The ONLY time DPI comes into play is when you're figuring out how large your image can be printed properly, and that depends on how it's going to be printed.
A 2400x3000 image file will give you an 8x10 print for publication (divide by 300 - get it?). Actually, it can be printed even bigger: you can REALLY go down to 212dpi when figuring the largest size you can print it and still have it sharp for publication. Nine times out of ten, publications are printed at 150 lines per inch; they tell you they REQUIRE double that at finished size for the sake of quality - "finished size" being the size it will actually be printed, whether that's a 2x3.5 inch ad or a full-page spread.
That same image, printed at your minilab: labs using LEDs and lasers to expose REAL photographic paper (not inkjet) actually smooth things out a little bit, so you can figure out the largest size you can print by dividing by 125 (believe it or not!) - meaning you can actually print a sharp 19.2x24 inch picture.
Inkjet: 150dpi is the lowest you can go before you start to see pixellation. Why? Because when you print a 16x20 inch picture (again, divide our starting numbers above by 150) you're NOT going to be holding it inches from your face, the edges in the picture blend VERY nicely, and inkjet printers ALSO do a bit of smoothing themselves - so you don't need to count on a huge dpi setting.
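If it helps, here's that arithmetic as a little Python sketch. The 300/212/125/150 figures are just the rule-of-thumb numbers from the paragraphs above, not hard standards, and the function name is mine:

```python
# Sketch of the print-size arithmetic described above.
# The minimum-DPI figures (300/212 publication, 125 minilab, 150 inkjet)
# are the rule-of-thumb numbers from this post, not hard standards.

def max_print_size(width_px, height_px, min_dpi):
    """Largest print (in inches) before the image drops below min_dpi."""
    return width_px / min_dpi, height_px / min_dpi

if __name__ == "__main__":
    w, h = 2400, 3000
    for label, dpi in [("publication (strict)", 300),
                       ("publication (practical)", 212),
                       ("minilab / photo paper", 125),
                       ("inkjet", 150)]:
        pw, ph = max_print_size(w, h, dpi)
        print(f"{label:24s} {dpi:3d} dpi -> {pw:.1f} x {ph:.1f} in")
```

Run it on the 2400x3000 example and you get 8x10 at 300, roughly 11.3x14.2 at 212, 19.2x24 at 125, and 16x20 at 150 - the same sizes quoted above.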
Wabe is exactly right - a 2400x3000 image file is the exact same file at 10dpi or 5000dpi. DPI only matters once we turn the image into something we can hold in our hands (physical-world dimensions) - until then, the PPI value is just a setting you can change on a whim without affecting a thing in your image. When you go to print it out, depending on your method (one of the 3 described above), you use the dpi to figure out the maximum size before noticeable pixellation.
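A quick way to see that for yourself - a minimal sketch using the Pillow library in Python, assuming you have it installed (the file names are just examples):

```python
# Minimal sketch: the DPI tag is metadata only - the pixels never change.
from PIL import Image

img = Image.new("RGB", (2400, 3000), "white")   # same 2400x3000 file as above
img.save("low_dpi.png", dpi=(10, 10))
img.save("high_dpi.png", dpi=(5000, 5000))

a = Image.open("low_dpi.png")
b = Image.open("high_dpi.png")
print(a.size == b.size)                       # True - both are 2400x3000 pixels
print(a.tobytes() == b.tobytes())             # True - identical pixel data
print(a.info.get("dpi"), b.info.get("dpi"))   # only this metadata differs
```

Swap in whatever dpi numbers you like - the pixel comparison still comes back True.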
Hope that makes sense-
-Lew ;-)