BOOT CAMP ARCHIVE 2001

  

 

BOOT CAMP 188 (16/08/01)

 

PICTURES AND PIXELS part 2

 

In part one of this short series on computer imaging we looked at the rudiments of resolution with special reference to monitor screens. This week we'll tackle resolution in the context of digital cameras, scanners and printers and make a start on how imaging devices process colours.

 

Broadly speaking, resolution -- the ability of a digital camera to capture fine detail -- is determined by the number of pixels on the surface of the image sensor chip inside the camera. (Each pixel is made up of three light-sensitive elements, sensitive to red, green and blue light.) The sensor is the equivalent of a frame of photographic film, so it's tempting to compare the two in terms of the number of pixels. However, film is an analogue medium and doesn't easily lend itself to that kind of comparison, but we'll have a go anyway. In the case of a top-quality camera and lens shooting a static scene in good light, we can say that a frame of 35mm film contains the equivalent of 20 million or so pixels, rising to more than 30 million pixels on the finest grain professional films.

 

That figure falls to between 9 and 12 million pixels in the case of a mid-range camera loaded with ordinary film. The best of today's high-end digital cameras have sensors with between 4 and 6 million pixels (4 to 6 'megapixels'), so you can see that they still have a way to go to catch up with film.

 

Many other factors are involved in determining the quality of a picture shot on a digital still camera, including the way the camera and PC process the data, but all things being equal, and assuming a decent printer, a mid-market digital still camera with a 2.1 or 3-megapixel sensor can produce very acceptable-looking 4 x 6-inch prints that stand comparison with photos shot on a 35mm compact camera.
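As a quick back-of-envelope sketch of why a 2-megapixel camera copes with a 4 x 6-inch print, the few lines below divide pixel dimensions by a printing resolution. The 1600 x 1200 pixel figure for a 2.1-megapixel sensor and the 300 pixels-per-inch rule of thumb are illustrative assumptions, not figures from this column.

```python
# Rough guide: largest 'photo quality' print a sensor can manage,
# assuming around 300 pixels per printed inch is needed.

def max_print_size(width_px, height_px, ppi=300):
    """Largest print (in inches) at the given pixels per inch."""
    return width_px / ppi, height_px / ppi

# A typical 2.1-megapixel camera captures around 1600 x 1200 pixels
w, h = max_print_size(1600, 1200)
print(f"{w:.1f} x {h:.1f} inches")  # 5.3 x 4.0 inches
```

Which is comfortably in 4 x 6-inch territory; push the same file much larger and the individual pixels start to show.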

 

Scanner resolution is measured in a slightly different way to camera resolution: not in pixels but in dots per inch, or 'dpi'. A dot, as we explained last week, is a pixel by any other name. Inside a flatbed scanner, on the scan 'head' that moves under the glass 'platen' on which the image is placed, there is a strip of light-sensitive elements -- the dots or pixels -- and how many of them there are to the inch is the scanner's 'optical' resolution. This is typically 600 to 800dpi on budget models and 1200 to 2000dpi on more advanced types. A scanner's optical resolution is usually quoted as two figures (e.g. 600 x 1200dpi); the second number, '1200', denotes the number of 'steps' the scanner head makes per inch as it travels down the platen.
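Because a dot is just a pixel, the pixel dimensions of a scan are simply the original's size in inches multiplied by the scanning resolution. A small sketch (the 6 x 4-inch photo is an assumed example, not one from the column):

```python
# Pixel dimensions produced by a flatbed scan: inches times dpi.

def scan_pixels(width_in, height_in, dpi):
    """Pixel dimensions of a scan at a given resolution in dpi."""
    return int(width_in * dpi), int(height_in * dpi)

w, h = scan_pixels(6, 4, 600)  # a 6 x 4-inch photo at 600dpi
print(w, h)                    # 3600 2400
print(f"{w * h / 1e6:.1f} megapixels")  # 8.6 megapixels
```

Note how quickly the numbers climb: even a budget 600dpi scanner turns a modest snapshot into more pixels than a high-end digital camera produces.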

 

Note that optical resolution is the true measure of a scanner's performance, and it is not the same as the larger and more impressive-looking 'interpolated' resolution figure that many scanner manufacturers are fond of quoting. Interpolation is essentially a trick whereby the scanner software takes an educated guess and fills in the missing detail between pixels.

 

How much resolution you need from a scanner depends on what you are using it for. If you only want to scan images that will appear on a video monitor -- i.e. pictures for web pages, multimedia presentations etc. -- then you may be surprised to know that the resolution of most PC screens is between 75 and 100dpi; scanning at higher resolutions is basically a waste of time, effort and disc space, as all of the extra detail is lost. Scanning text and documents for faxing, copying or optical character recognition (OCR) is another relatively undemanding application, and a resolution of 300dpi is usually more than sufficient. Higher resolutions start to make sense when the aim is to print out the results. Most laser printers and colour inkjets operate in the range 300 to 800dpi. Imagesetters, used in the production of books and magazines, normally require images to be scanned at resolutions of at least 1200dpi, but for high-quality work it can rise to 3500dpi and above.

 

We've already touched briefly on printers and the sort of resolutions they can achieve; as with scanners, the ability to reproduce fine detail is measured in dots per inch (dpi). However, the numbers are not so clear-cut. There are many different printer technologies and numerous techniques for making sharper images and more natural-looking colours. The type and quality of paper can also have a big impact on the finished results, so it's a bit of a minefield. The simple rule of thumb is that the more dpi a printer can manage the better, and 'photorealistic' models with multi-colour (i.e. 4, 5 or 6 ink) printing systems produce the best results with photographs shot on a digital still camera.

 

Colour plays a very important role in image quality, in particular something called colour depth. In keeping with tradition the PC industry has managed to make the whole business seem a lot more complicated than it actually is, but we'll try to make some sense of it.

 

The human eye is very sensitive to colour and we have the ability to distinguish up to 10 million shades. Digital imaging systems are even more adept at processing colour and can detect and reproduce millions more colours by assigning each colour a number, and as you know, computers are very good with numbers.

 

Most PCs and peripherals use four basic colour formats. The first is the VGA standard of 16 colors, also referred to as 4-bit color (excuse the American spelling); four is the number of binary digits (bits) used to identify each of the colours. 4-bit color is okay for displaying simple graphics and icons but it can't handle the shades in photographic images, which end up looking really coarse and blotchy. Next is 8-bit, or 256 colors. This is just enough for a photographic image, though variations in colour and shade tend to look very patchy. Picture quality takes a big leap with 16-bit color, also known as 'High Color', which can resolve 65,536 colours. Finally there's 24-bit and 32-bit 'True Color', which describes more than 16 million (16,777,216 to be precise) colours. 32-bit True Colour is a special format used mainly for video games and high-end graphics applications, where the extra 8 bits of information, known as the 'Alpha Channel', are used for creating special transparency and texture effects.
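All of those figures come from the same piece of arithmetic: with n bits per pixel a device can label 2 to the power n distinct colours. A few lines confirm the numbers quoted above:

```python
# Colour depth: n bits per pixel gives 2**n distinct colour values.
for bits, name in [(4, "VGA"), (8, "256-colour"),
                   (16, "High Color"), (24, "True Color")]:
    print(f"{bits}-bit {name}: {2 ** bits:,} colours")

# 4-bit VGA: 16 colours
# 8-bit 256-colour: 256 colours
# 16-bit High Color: 65,536 colours
# 24-bit True Color: 16,777,216 colours
```

The same maths explains why 32-bit True Color offers no extra colours over 24-bit: the additional 8 bits go to the alpha channel rather than to colour values.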

 

Next week – Pictures and pixels part 3 – image sizing

 

JARGON FILTER

 

IMAGESETTER

Device used to convert image data produced on a PC into photographic film used for making lithographic printing plates

 

MEGAPIXEL

As near as makes no difference one million pixels or picture elements

 

OCR

Optical Character Recognition – converting the scanned image of a document into a text file that can be read by a word processor

 

TOP TIP

It's all very well your PC being able to process over 16 million colours but can you see them all on your monitor screen? This simple little freeware monitor test program will help you find out and adjust your settings to produce the best possible picture. The self-extracting 'zip' file is only 278Kb and should only take a couple of minutes to download from:

http://www.monitortest.net/monitortest.html


 Copyright 2006-2009 PCTOPTIPS UK.
