I have an older scanner (Visioneer 6100b) which says it’s 600 x 1200 dpi optical resolution and 2400 x 2400 dpi maximum resolution. A new scanner I bought says it’s 600 dpi optical resolution and 600 x 1200 dpi hardware resolution.
How do they compare? What’s hardware resolution? Is the new scanner better resolution than the old scanner? Or, does some info seem to be missing? I need your advice!
When you compare scanners, there are other factors than resolution to consider. The most important is colour fidelity: how close is the scanned image to the original in terms of colour? The closer, the better.
Other factors include scanning speed, reliability, scan size, and even bundled software. Better scanners let you adjust colour, contrast, and brightness, and even apply colour and other filters.
It’s like a camera: one resolution is what the hardware can do, the other is what the software can do.
Anyway, consider file size when you consider resolution. At 1200 x 1200 dpi, an 8 x 10 inch image covers 80 square inches, which makes for a huge file. In other words, this many dots:
1200 x 1200 x 80 = 115,200,000
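To get a rough feel for those numbers, here is a small Python sketch (the function name scan_size_bytes is mine, not part of any scanner software) that computes the uncompressed size of a scan from its dimensions and dpi:

```python
def scan_size_bytes(width_in, height_in, dpi_x, dpi_y, bytes_per_pixel=3):
    """Uncompressed scan size: pixels across x pixels down x bytes per pixel.

    bytes_per_pixel=3 assumes 24-bit colour (one byte each for R, G, B).
    """
    pixels = (width_in * dpi_x) * (height_in * dpi_y)
    return pixels * bytes_per_pixel

# An 8 x 10 inch photo at 1200 x 1200 dpi in 24-bit colour:
size = scan_size_bytes(8, 10, 1200, 1200)
print(size)                   # 345600000 bytes
print(round(size / 1024**2))  # roughly 330 MB
```

That 8 x 10 scan is over 115 million pixels, around a third of a gigabyte uncompressed, which is why you should only scan at the resolution you actually need.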
Scanners vary tremendously in quality; a well-made 600x600 scanner can beat a cheaply made 1200x1200 one.
PC Magazine does reviews of scanners on a regular basis. One nice thing they include is a blowup of a small part of a test image from the tested scanners. The variability in quality is obvious. Such tests are the only real way you can find out.
Manufacturers’ claims of resolution should not be taken seriously in any way, shape, or form.
Hardware resolution is the resolution of the image sensor. Optical resolution is the resolution after you account for the quality of the lens system. Ignore interpolated resolution. It is a mathematical trick and only gives you a bigger file, not a better image.