
1-D Kernel Density Estimation For Image Processing

Pi19404
September 21, 2013

Contents

0.1 Introduction
0.2 Non Parametric Methods
0.3 Kernel Density Estimation
    0.3.1 Kernel Density Estimation: Parzen window technique
    0.3.2 Rectangular windows
    0.3.3 Gaussian windows
0.4 Code



0.1 Introduction
In this article we will look at the basic methods of kernel density estimation.

0.2 Non Parametric Methods


The idea of the non-parametric approach is to avoid restrictive assumptions about the form of f(x) and to estimate it directly from the data, rather than assuming some parametric form for the distribution (e.g. Gaussian, exponential, mixture of Gaussians, etc.).

0.3 Kernel Density Estimation


Kernel density estimation (KDE) is a non-parametric way to estimate a probability density function, where the estimate of the population PDF is computed from a finite data sample. A general expression for non-parametric density estimation is

p(x) = \frac{k}{N V}

where k is the number of samples inside V, V is the volume surrounding x, and N is the total number of samples.

Histograms are the simplest non-parametric method for estimating the PDF. To construct a histogram, we divide the interval covered by the data values into equal sub-intervals, known as bins. Each time a data value falls into a particular bin, the count associated with that bin is incremented by 1. For a histogram, V can be defined as W x H, where W is the bin width and H is unbounded. Figure 1 shows the hue histogram of a rectangular region of an image.

Figure 1: Object model: (a) original image, (b) hue histogram with bin width 6, (c) bin width 1.

Histograms are described by bin width and the range of values. In the figure the range of hue values is 0-180 and the number of bins is 30. We can see that histograms are discontinuous, which is not necessarily due to discontinuity of the underlying PDF but may be caused by discretization into bins; binning may also introduce inaccuracies into the histogram. Histograms are not smooth and depend on the endpoints and width of the bins. This can be seen in figure 1 (b). Typically the estimate becomes better as we increase the number of points and shrink the bin width, and this holds for non-parametric estimation in general, as seen in figure 1 (c). In practice the number of samples is finite, so we do not observe samples for all possible values; if the bin width is small, some bins may not enclose any samples and the estimate will exhibit large discontinuities. For a histogram we group adjacent sample values into a bin.

0.3.1 Kernel Density Estimation


Kernel density estimation provides another way to arrive at an estimate of the PDF when the sample size is small. The density of samples about a given point is proportional to its probability; KDE approximates the probability density by estimating the local density of points.

Parzen window technique

Parzen-window density estimation is essentially a data-interpolation technique and provides a general framework for kernel density estimation. Given an instance of the random sample, x, Parzen windowing estimates the PDF P(X) from which the sample was derived. It superposes kernel functions placed at each observation, so that each observation x_i contributes to the PDF estimate. Suppose that we want to estimate the value of the PDF P(x) at point x. We place a window function at x and determine how many observations x_i fall within the window, or rather, what the contribution of each observation x_i to the window is. The PDF value P(x) is then the sum of these contributions.

Let (x_1, x_2, ..., x_n) be an i.i.d. sample drawn from some distribution with an unknown density f. We are interested in estimating f. Its Parzen window estimate is defined as
\hat{f}_h(x) = \frac{1}{n} \sum_{i=1}^{n} K_h(x - x_i) = \frac{1}{nh} \sum_{i=1}^{n} K\left(\frac{x - x_i}{h}\right)

where K is called the kernel, h is called its bandwidth, and K_h(u) = (1/h) K(u/h) is called the scaled kernel.

Kernel density estimates are related to histograms, but can be endowed with properties such as smoothness by using a suitable kernel. Commonly used kernel functions are the uniform, Gaussian, Epanechnikov, etc. Superposing kernels centered at each data point is equivalent to convolving the data points with the kernel: we smooth the histogram by performing a convolution with the kernel. Different kernels produce different effects.

0.3.2 Rectangular windows


For the univariate case, the rectangular window encloses k samples within a region of width h centered about x on the histogram. To find the number of samples that fall within this region, the kernel function is defined as

k(x) = \begin{cases} 1, & \text{if } |x| < h \\ 0, & \text{otherwise} \end{cases}

Since the total number of bins of the histogram is 180, the bin width is 1. Let us apply a window function with bandwidth 6, 12, 18, etc., and observe the effect on the histogram. The kernel density estimates obtained using a Parzen window of bandwidth 6, 12 and 18 are shown in figure 2.

Figure 2: Rectangular window: (a) bandwidth 6, (b) bandwidth 12, (c) bandwidth 18.

0.3.3 Gaussian Windows


The kernel function for the Gaussian window is defined as

k(x) = C \exp\left(-\frac{x^2}{2\sigma^2}\right)

where \sigma is the width of the window and C is a normalizing constant.

Instead of the Parzen rectangular window, let us apply a Gaussian window of width 6, 12 and 18 and observe the effect on the histogram.

Figure 3: Gaussian window: (a) bandwidth 6, (b) bandwidth 12, (c) bandwidth 18.

It can be seen that the estimate of the PDF is smooth; however, the bandwidth plays an important role in the estimated PDF. A small bandwidth of 6 estimates a bimodal PDF with well-separated peaks. With a bandwidth of 12 the estimate is still bimodal, but the peaks are no longer separated. A larger bandwidth of 18 estimates a unimodal PDF. The bandwidth of the kernel is a free parameter which exhibits a strong influence on the estimate of the PDF. Selecting the bandwidth is a trade-off between accuracy and generality.


0.4 Code
The class Histogram contains methods to perform kernel density estimation for a 1-D histogram using rectangular and Gaussian windows. The definition of the Histogram class can be found in the files Histogram.cpp and Histogram.hpp. The code is available at https://github.com/pi19404/m19404/tree/master/OpenVision/ImgProc. The file that tests the kernel density estimation is kde_demo.cpp and can be found at https://github.com/pi19404/m19404/tree/master/OpenVision/demo. To compile the code for kde_demo, run the command

make -f MakeDemo kde_demo


