Graphics 0
1. Image resolution is correctly stated in Pixels Per Inch (PPI) or in Pixels (H) by Pixels (V), not Dots Per Inch. There's a reason for that distinction, as explained below.
2. Physical printer and display resolution (resolving power) is accurately and consistently stated in Dots (or Lines) per Inch.
3. Keep in mind that a digital file in and of itself does not have a physical size in inches or millimeters. It's just a collection of numbers arranged in an array. You can display or paint them out at any size you want. If you want actual size, the dpi of the display device needs to match the ppi of the image, and all of the image pixels need to be mapped 1:1 to their respective display pixels. (A small sketch of that arithmetic follows this list.)
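Here is a minimal sketch of that arithmetic in plain Python. The 1600 x 1200 image and the 100 PPI / 96 DPI figures are made-up example numbers, not anything prescribed:

```python
# Physical size only exists once you pick a PPI for the image and a DPI for
# the device. All numbers below are made-up examples.

def physical_size_inches(pixels_h, pixels_v, ppi):
    """Physical size an image occupies when rendered at a given pixels-per-inch."""
    return pixels_h / ppi, pixels_v / ppi

def is_actual_size(image_ppi, device_dpi):
    """'Actual size' requires the device dpi to match the image ppi,
    with every image pixel mapped 1:1 onto a device pixel."""
    return image_ppi == device_dpi

# A 1600 x 1200 pixel image tagged at 100 ppi...
w_in, h_in = physical_size_inches(1600, 1200, ppi=100)
print(f"{w_in:.1f} in x {h_in:.1f} in")               # 16.0 in x 12.0 in

# ...appears at actual size only on a 100 dpi display.
print(is_actual_size(image_ppi=100, device_dpi=100))  # True
print(is_actual_size(image_ppi=100, device_dpi=96))   # False
```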
These two resolution definitions are not the same and are easily confused by many people, especially when combined with a discussion of the resolution of their graphics cards, a third "resolution" number.
For example, one can have a JPEG image with a resolution of, say, 1024 x 768 pixels, which was downscaled (i.e., crushed to some degree) from an original 2048 x 1536 pixel "raw" data file. One can then display this on a monitor with a graphics card setting of 1280 x 960 pixels (a dimensionless pair of numbers) and then display it on a 20.1" monitor with a native resolution of 1600 x 1200 pixels, or 100 dpi. (More below on where the 100 dpi comes from.)
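To make that chain concrete, here is a small Python sketch of those numbers. It assumes the 20.1" monitor is a 4:3 panel roughly 16.08 inches wide, and that a lower graphics card setting gets stretched to fill the whole screen; both are illustrative assumptions, not specifications:

```python
# The chain: image pixels -> graphics card setting -> physical screen.
# Assumes a 20.1-inch 4:3 panel, i.e. roughly 16.08 inches wide.
SCREEN_WIDTH_IN = 16.08

def effective_dpi(desktop_pixels_h, screen_width_in=SCREEN_WIDTH_IN):
    """The dpi you actually get when the card's setting fills the screen;
    it depends on the panel's physical size, not on the card."""
    return desktop_pixels_h / screen_width_in

for desktop_h in (1280, 1600):
    dpi = effective_dpi(desktop_h)
    image_width_in = 1024 / dpi        # the 1024 x 768 JPEG drawn pixel-for-pixel
    print(f"card at {desktop_h} px wide -> {dpi:.0f} dpi, "
          f"1024-px-wide image spans {image_width_in:.1f} in on screen")

# card at 1280 px wide -> 80 dpi, 1024-px-wide image spans 12.9 in on screen
# card at 1600 px wide -> 100 dpi, 1024-px-wide image spans 10.3 in on screen
```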
Printers (not the machines, the guys) typically discuss lines per inch rather than dots per inch when discussing the resolving power of their printing process. Laser printers, like monitors, are best described by their resolving power in dots per inch (as placed on paper).
The point of all of this blather is that image resolution (digital files) is best described using the term pixels per inch (PPI), while the native resolution of physical devices (monitors and printers) is best described by dots per inch (DPI). It is inappropriate to assign "inches" to a graphics card setting, as the resulting "dpi" depends on the physical size of the display. That is, you should not say your graphics card is set to 1280 x 1024 dpi; it is simply rendering an image that is 1280 by 1024 pixels.
You'll hear all kinds of nonsense about 72 dpi and 96 dpi as if they had some special meaning for Macintosh and Windows systems. Balderdash. It's an old wives' tale. At one time, early Macintosh systems with closed (i.e., fixed) graphics systems and built-in monitors *might* have had 72 dpi screens by default. My Samsung monitors have a native resolution of 100 dpi, simply because they are 20.1" diagonal screens with a 1600 x 1200 array of pixels. A larger monitor with the same number of pixels has a lower dpi value. Either will give its best picture when the image file is 1600 x 1200 pixels and the graphics card resolution is set to 1600 x 1200 pixels.
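And here, as promised, is where the 100 dpi comes from, worked out in a short Python sketch. The 24" panel in the second line is a made-up comparison to illustrate the "larger monitor, lower dpi" point:

```python
import math

def native_dpi(pixels_h, pixels_v, diagonal_in):
    """Native dpi of a panel from its pixel grid and its diagonal size."""
    diagonal_px = math.hypot(pixels_h, pixels_v)   # 1600 x 1200 -> 2000 px diagonal
    return diagonal_px / diagonal_in

print(f"{native_dpi(1600, 1200, 20.1):.1f} dpi")   # ~99.5 dpi, i.e. about 100
print(f"{native_dpi(1600, 1200, 24.0):.1f} dpi")   # same pixels, bigger panel: ~83.3 dpi
```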
Standard GIF files, by the way, support only 256 colors (an indexed palette). Photographic images therefore have to be color-reduced to fit in a GIF, so the format should not be used for high-quality photographic images.
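If you want to see the palette limit for yourself, here is a small sketch using the Pillow library (assumed to be installed); "photo.jpg" is just a placeholder for any photograph you have on hand:

```python
from PIL import Image

img = Image.open("photo.jpg").convert("RGB")
print(len(set(img.getdata())))             # a photograph: typically tens of thousands of colors

# GIF forces the image into a palette of at most 256 entries.
palettized = img.quantize(colors=256)      # the color reduction is where the damage is done
palettized.save("photo.gif")
print(len(palettized.getcolors(maxcolors=256)))   # at most 256
```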
Underlying all of this is yet another sense of "resolution," one usually rejected outright by computer graphics people when you try to de-confuse them. You can have two different JPEG images with the same resolution (say, 1600 x 1200 pixels) but with different degrees of compression. If one were looking for detail, the least compressed one would generally offer more of it. That is, it offers more detail (information) to the viewer than the other one. One could argue (successfully, I might add) that even though both images have the same resolution (1600 x 1200), the least compressed one has more resolution because you can resolve more detail. The heavily compressed one will be blurry and "fuzz out" some details. Of course you knew this, but you'd be surprised how many "experts" will argue that the two files have the same resolution. It all depends on how you use the terminology and what it means to you. That's what prompted me to write this missive.
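A quick way to convince yourself (or the "experts") is to recompress the same image at two JPEG quality settings and compare. The sketch below uses Pillow (assumed installed), with "photo.jpg" again standing in for any photograph; the RMS error against the original is a crude stand-in for lost detail:

```python
import os
from PIL import Image, ImageChops, ImageStat

original = Image.open("photo.jpg").convert("RGB")

for quality in (95, 20):
    original.save("recompressed.jpg", quality=quality)
    recompressed = Image.open("recompressed.jpg").convert("RGB")

    # Both files have identical pixel dimensions...
    assert recompressed.size == original.size

    # ...but the low-quality one deviates more from the original: less detail kept.
    rms = ImageStat.Stat(ImageChops.difference(original, recompressed)).rms
    print(f"quality={quality}: {os.path.getsize('recompressed.jpg')} bytes, "
          f"RMS error per channel = {[round(v, 1) for v in rms]}")
```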
End.