
Digital Image Processing in Life Sciences

March 14th, 2012

Lecture number 1: Digital Image Fundamentals

Lectures outline
What Is Digital Image Processing? (The Origins of Digital Image Processing)
Fundamental Steps in Digital Image Processing
Image Sampling and Quantization
Spatial and Gray-Level Resolution
Some Basic Relationships Between Pixels
Zooming and Shrinking Digital Images
Lookup tables
Color spaces

Terms to be conveyed: pixel, gray level, bit depth, dynamic range, connectivity types/neighborhood, interpolation types, look-up tables

Book: Digital Image Processing, Rafael C. Gonzalez and Richard E. Woods

Web resources: www.microscopy.fsu.edu (very thorough and informative)

www.cambridgeincolour.com (beautiful examples, excellent tutorials)

Next topics:

2. Image enhancement in the spatial domain

3. Segmentation

4. Image enhancement in the frequency domain

5. Multi-dimensional image processing

6-7. Guest lectures (TBD)

What Is A Digital Image?

Image = a two-dimensional function f(x, y), where x and y are spatial coordinates, and the amplitude of f at any pair of coordinates (x, y) is called the intensity (gray level) of the image at that point. When x, y, and the amplitude values of f are all finite, discrete quantities, we call the image a digital image (Gonzalez and Woods).
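This definition can be made concrete in a few lines of NumPy (a hypothetical 3 x 4 toy image, not taken from the lecture):

```python
import numpy as np

# A digital image: a finite 2-D array of discrete intensity values.
# f[y, x] is the gray level at spatial coordinates (x, y);
# by convention, (0, 0) is the upper-left corner.
f = np.array([[  0,  64, 128, 255],
              [ 32,  96, 160, 224],
              [ 16,  80, 144, 208]], dtype=np.uint8)

height, width = f.shape   # finite, discrete spatial extent
corner = f[0, 3]          # intensity at (x=3, y=0)
```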

These arrays of numbers can also be depicted in terms of spatial frequencies:

http://cvcl.mit.edu/hybrid_gallery/gallery.html

Digital Image Processing-Points to consider:

Why process?
Are both the input and output of a process images?
Where does image processing stop and image analysis start?
Are the processing results intended for human perception or for machine perception? (Compare character recognition and fingerprint matching with intelligence photographs.)

We can define three types of computerized processes: low-, mid-, and high-level.
Low: image preprocessing, e.g. noise reduction and contrast enhancement.
Mid: segmentation, sorting, and classification.
High: assembly of all components into a meaningful, coherent whole.

Digital image origins
The digital image dates back to the 1920s and the Bartlane cable picture transmission system between New York and London. An image took 3 hours to transmit instead of more than a week. The system started with 5 tone levels and was increased to 15 levels by 1929.

Taken from Gonzalez and Woods


Essential steps when processing digital images:
Acquisition, enhancement, restoration, color image processing (outputs are digital images).
Wavelets, morphological processing, segmentation, representation, recognition (outputs are attributes of the image).

Image acquisition
Acquire or receive an image for further processing. This step has a major impact on the entire procedure of processing and analysis.

Image Enhancement
Improving quality subjectively (e.g. by changing the contrast).
Image Restoration
Improving quality objectively (e.g. by removing the point spread function, PSF, through deconvolution).

microscopy.fsu.edu

Morphological processing
Extracting components for the purpose of representing shapes.
Segmentation
Deconstructing the image into its constituent objects; a crucial step for successful recognition of the image contents.
Representation
Feature selection: classification/grouping of objects.


Sampling and quantization

Keep in mind:
The sensor used to create the image has a continuous output, but the transition from a continuous signal to a digital image requires two processes: sampling and quantization. Sampling is the digitization of the spatial coordinates; quantization is the digitization of the amplitude values at those coordinates. The arrangement of the sensor used to create the image determines the sampling method and its output, and different limits govern the performance of optical versus mechanical sensors.

Sampling and quantization result in arrays of discrete quantities. By convention, the coordinate (x,y)=(0,0) is located at the upper leftmost corner of the image.

(Gonzalez and Woods)


Picture elements = image elements = pels = pixels. Sampling results in typical image sizes that vary from 128 x 128 to 4096 x 4096, or any combination of dimensions.

An Image Formation Model

Let l be the gray level (gl) value at (x0, y0): l = f(x0, y0).
l is bounded, Lmin <= l <= Lmax, and the interval [Lmin, Lmax] is the gray scale. This interval is usually shifted to [0, L-1], where 0 represents black gl values and L-1 represents white gl values.

Quantization results in discrete values of gray levels, typically an integer power of 2: L=2k .

If k=8, the result is 256 gray levels, from 0 to 255.
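A minimal sketch of quantization, assuming intensities normalized to [0, 1) (the function name is hypothetical):

```python
import numpy as np

def quantize(f, k):
    """Map continuous intensities in [0, 1) to L = 2**k discrete gray levels."""
    L = 2 ** k
    return np.clip(np.floor(f * L), 0, L - 1).astype(int)

# k = 8 gives 256 levels: 0 (black) through 255 (white).
levels = quantize(np.array([0.0, 0.5, 0.999]), 8)
```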


Dynamic range: the portion of the entire gray scale that the gray levels in the image actually occupy.

Think about high vs. low dynamic range images: how does the dynamic range affect the contrast of the image? (Next lecture.)

How many bits are required to save a digital image? b = M x N x k (or b = M²k for images of equal dimensions, M = N).

Image size in bytes as a function of dimensions (M = N) and bit depth:

M = N     8 bit (256)    12 bit (4096)    16 bit (65536)
128            16,384           24,576            32,768
256            65,536           98,304           131,072
512           262,144          393,216           524,288
1024        1,048,576        1,572,864         2,097,152
2048        4,194,304        6,291,456         8,388,608
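The table entries follow directly from the formula b = M x N x k; a quick check (the helper name is hypothetical):

```python
def image_bytes(M, N, k):
    """Storage for an M x N image with k bits per pixel: b = M * N * k bits, then bits / 8 = bytes."""
    return M * N * k // 8

# Reproduce a few table entries (sizes in bytes):
small = image_bytes(128, 128, 8)      # smallest entry
large = image_bytes(2048, 2048, 16)   # largest entry
```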

Resolution

8-bit images: values are unsigned integers.
16-bit images: values are integers; some software packages allow signed values.
32-bit images: values are signed floating point.


Spatial and gray-level resolution
Spatial resolution is rather intuitive, and is determined by the quality and density of the sampling.

Sampling theorems (e.g. Nyquist-Shannon) state that sampling must be performed at a rate at least twice the highest spatial frequency, i.e. at least two samples across the smallest object. Relative to this rate, over-sampling and under-sampling (= spatial aliasing) can occur.
Gray-level resolution describes the binning of the signal rather than the actual intensity differences we managed to capture when quantizing it. 8-bit and 16-bit images are the most common, but 10- and 12-bit images can also be found.
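A toy illustration of under-sampling (a hypothetical 1-D "image" of stripes, not from the lecture): a pattern with a 2-pixel period is the highest frequency a pixel grid can hold, and keeping only every second sample destroys it entirely.

```python
import numpy as np

stripes = np.tile([0, 1], 8)   # 0,1,0,1,... : period of exactly 2 pixels
undersampled = stripes[::2]    # sampling below the Nyquist rate
# The alternating pattern aliases to a constant signal - the stripes vanish.
```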

Changing the resolution of the image without changing the bit depth: checkerboard patterns.

[Figure: the same image sampled at 512 x 512, 256 x 256, 128 x 128, and 64 x 64]

Changing the bit depth of the image without changing the resolution: false contouring.

[Figure: the same image quantized at 8, 4, 3, 2, and 1 bit]


Neighbors of a pixel p = (x, y):

(x-1, y-1)  (x, y-1)  (x+1, y-1)
(x-1, y)    (x, y)    (x+1, y)
(x-1, y+1)  (x, y+1)  (x+1, y+1)

(x+1, y), (x-1, y), (x, y+1), (x, y-1) = the 4-neighbors of p, or N4(p).
(x+1, y+1), (x+1, y-1), (x-1, y+1), (x-1, y-1) = the four diagonal neighbors, or Nd(p).
N4(p) together with Nd(p) form N8(p). Consider the case of image borders.
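The neighborhoods above, including the border case, can be sketched as follows (the function names are hypothetical):

```python
def n4(x, y, width, height):
    """4-neighbors of p = (x, y), clipped at the image borders."""
    candidates = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [(u, v) for u, v in candidates if 0 <= u < width and 0 <= v < height]

def nd(x, y, width, height):
    """The four diagonal neighbors Nd(p), clipped at the borders."""
    candidates = [(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)]
    return [(u, v) for u, v in candidates if 0 <= u < width and 0 <= v < height]

def n8(x, y, width, height):
    """N8(p) = N4(p) together with Nd(p)."""
    return n4(x, y, width, height) + nd(x, y, width, height)
```

An interior pixel of a 3 x 3 image has 8 neighbors, while a corner pixel has only 3 — the border case the slide asks you to consider.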

Adjacency/Connectivity, Regions, and Boundaries
Pixels are said to be connected if they are neighbors and if their gray levels satisfy a specified criterion of similarity. Consider this example of binary pixels.

V = the set of gray levels used to define adjacency. In this binary example, V = {0} defines adjacency between pixels with the value 0. In non-binary images, V can contain a wider range of values.

A region R of an image: a subset of pixels that forms a connected set, meaning there exists a path of adjacent pixels between any two pixels in R. The boundary (= border = contour) of R is the set of pixels in R that have one or more neighbors that are not in R.

What happens when R is the entire image? Do not confuse a boundary with an edge: an edge is formed by a discontinuity of gray levels at a point. In binary images, edges and boundaries correspond.

Distances between pixels


Between p = (x, y) and q = (s, t):
Euclidean distance: given by Pythagoras, De(p, q) = sqrt((x - s)² + (y - t)²).
D4 distance (= city-block distance): D4(p, q) = |x - s| + |y - t|.

The pixels with D4 <= 2 from a center pixel form a diamond pattern:

        2
    2   1   2
2   1   0   1   2
    2   1   2
        2

D8 distance (= chessboard distance): D8(p, q) = max(|x - s|, |y - t|) results in a square pattern around the center pixel:

2   2   2   2   2
2   1   1   1   2
2   1   0   1   2
2   1   1   1   2
2   2   2   2   2
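Both distance measures, and the diamond and square patterns they generate, can be sketched directly from their formulas:

```python
def d4(p, q):
    """City-block distance: D4(p, q) = |x - s| + |y - t|."""
    (x, y), (s, t) = p, q
    return abs(x - s) + abs(y - t)

def d8(p, q):
    """Chessboard distance: D8(p, q) = max(|x - s|, |y - t|)."""
    (x, y), (s, t) = p, q
    return max(abs(x - s), abs(y - t))

center = (2, 2)
diamond = [[d4(center, (x, y)) for x in range(5)] for y in range(5)]  # diamond contours
square  = [[d8(center, (x, y)) for x in range(5)] for y in range(5)]  # square contours
```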


Zooming and shrinking digital images
Zoom: 1. Create new pixel locations. 2. Assign gray-level values to those locations.

For increasing the size of an image by an integer factor, the method of pixel replication is used. For example, when enlarging a 512 x 512 image to 1024 x 1024, every column and every row of the original image is duplicated. At high magnification factors, checkerboard patterns appear.
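Pixel replication for an integer zoom factor is one call in NumPy (a minimal sketch on a toy 2 x 2 image; the function name is hypothetical):

```python
import numpy as np

def replicate(img, factor):
    """Enlarge by an integer factor by duplicating every row and every column."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

small = np.array([[1, 2],
                  [3, 4]])
big = replicate(small, 2)   # each pixel becomes a 2 x 2 block
```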

Nearest-neighbor interpolation
Bilinear interpolation (2 x 2 neighborhood)
Bicubic interpolation (4 x 4 neighborhood)

Examples of non-adaptive interpolation

Scaling up using different methods:
[Figure: the same image enlarged by pixel replication, bilinear, and bicubic interpolation]
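The three methods can be compared with SciPy, where the spline order selects the interpolation type (assuming `scipy.ndimage` is available; `order` 0, 1, and 3 correspond to nearest-neighbor, bilinear, and bicubic):

```python
import numpy as np
from scipy import ndimage

img = np.arange(16, dtype=float).reshape(4, 4)

nearest  = ndimage.zoom(img, 2, order=0)  # nearest-neighbor (replication-like)
bilinear = ndimage.zoom(img, 2, order=1)  # bilinear: 2 x 2 neighborhood
bicubic  = ndimage.zoom(img, 2, order=3)  # bicubic: 4 x 4 neighborhood
```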


Look-up tables:
Save computational time (LUTs can be found early in the history of image processing).
Require a mapping or transformation function: an equation that converts the brightness value of the input pixel to another value in the output pixel.
Do not alter the stored pixel values.
Image transformations that involve look-up tables can be implemented by either of two mechanisms: at the input, so that the original image data are transformed, or at the output, so that a transformed image is displayed while the original image remains unmodified.
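The display-side mechanism can be sketched in a few lines: build a 256-entry table once, then apply it to every pixel by array indexing, leaving the original data untouched (an inversion LUT, chosen here purely as an example):

```python
import numpy as np

# 256-entry inversion LUT for an 8-bit image: output = 255 - input.
lut = np.arange(255, -1, -1, dtype=np.uint8)

img = np.array([[0, 64],
                [128, 255]], dtype=np.uint8)
displayed = lut[img]   # mapped at display time; img itself is unchanged
```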

www.microscopy.fsu.edu


There are ways to describe color images other than the RGB space.
Color space = color gamut. RGB = 3 x 8-bit channels = 24-bit = true color. The histograms of RGB images can be viewed either as separate channels or as a weighted average of the channels. Some grayscale representations of color images compute a weighted average of the green, red, and blue channels.
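One common choice of weights is the Rec. 601 luma weighting, shown here as an illustration (not necessarily the weighting any particular package uses):

```python
import numpy as np

def luminance(rgb):
    """Weighted average of R, G, B; green dominates because the eye is most sensitive to it."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

white = np.array([[[255.0, 255.0, 255.0]]])
gray_value = luminance(white)[0, 0]   # pure white maps back to full intensity
```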

Hue-Saturation-Intensity (more intuitive, closer to how we perceive the world): Hue = position on the color spectrum, Saturation = color purity, Intensity = brightness.

More: Hue-Saturation-Lightness; Hue-Saturation-Brightness

End of Lecture 1

Thank you!
