
Ahsanullah University of Science and Technology

(AUST)
Department of Computer Science and Engineering (CSE)

CSE 4214: Pattern Recognition Lab

List of Experiments
Experiment 1: Designing a minimum distance to class mean classifier.
Experiment 2: Implementing the Perceptron algorithm for finding the weights of
a Linear Discriminant function.
Experiment 3: Designing a Minimum Error Rate classifier.
Experiment 4: Density Estimation using Parzen Window.
Experiment 5: Implementing a feature selection system using PCA.
Experiment 6: Implementing a feature transformation system using LDA.

Text Book:
1. Pattern Classification (2nd Edition) by R. O. Duda, P. E. Hart, and D. G. Stork, Wiley,
2001
Course Overview

1. Goals:

The goals of this course are:

• To learn the fundamental topics of pattern classification.
• To implement different algorithms and techniques in MATLAB.
• To understand their working principles and effects.
• To introduce a few advanced algorithms used in pattern recognition.
• To know which approach is suitable for solving a given problem.

2. Course Content:

The course will cover both basic and advanced topics of pattern classification, including
discriminant functions, linear/non-linear classifiers, parametric/non-parametric
techniques, and feature selection and transformation techniques. It will cover the theory
behind the taught algorithms and techniques as well as their behavior on different
scenarios and data sets. Students will therefore not only implement different techniques
but also learn their motivation, working principles, and impact, and know when to use
each. During this course they will also learn to code in MATLAB. By the end of the
semester, students will understand the different components of a general pattern
recognition system.

3. Course Organization:

Students will carry out their experiments individually. Each week a task will be assigned,
and every student will implement and discuss their own assignment. There are therefore
two sorts of work to complete:
i. Lab Implementation.
ii. Report Writing.
In the implementation part, students implement the given algorithm on suitable data. In
the report-writing part, they write a formal report on the given task, with findings and
discussion sections that must highlight the working principles and the effects of changing
parameters or data sets. Finally, all students will sit a final test and face a viva board
during the last week of the semester.

3.1 Marks Distribution:

• Class attendance 10%
• Lab Implementation 30%
• Report Writing 10%
• Lab viva 30%
• Lab Quiz 20%

Experiment No 1

“Designing a minimum distance to class mean classifier”

A. Sessional Task:
Given the following two-class set of prototypes

1 = {(2,2), (3,1), (3,3), (-1,-3), (4,2), (-2,-2)}


2 = {(0,0), (-2,2), (-1,-1), (-4,2), (-4,3), (2,6)}

1. Plot all sample points from both classes, but samples from the same class should have
the same color and marker.
2. Using a minimum distance classifier with respect to ‘class mean’, classify the following
points by plotting them with the designated class-color but different marker.

X1 = (-1,-1)
X2 = ( 3, 2)
X3 = (-2, 1)
X4 = ( 8, 2)

Linear Discriminant Function: g_i(X) = XᵀȲ_i − (1/2) Ȳ_iᵀȲ_i, where Ȳ_i is the mean of class i.
3. Draw the decision boundary between the two-classes.

Note: Useful MATLAB functions: plot, mean, sqrt, hold.
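For reference, here is a minimal sketch of the classification step, assuming the Euclidean
distance to each class mean; the variable names (w1, w2, m1, m2) are illustrative, not
prescribed:

% A minimal sketch: classify one point by minimum distance to the class means.
w1 = [2 2; 3 1; 3 3; -1 -3; 4 2; -2 -2];    % class 1 prototypes
w2 = [0 0; -2 2; -1 -1; -4 2; -4 3; 2 6];   % class 2 prototypes
m1 = mean(w1); m2 = mean(w2);               % class means (row vectors)
x = [-1 -1];                                % unknown sample X1
d1 = sqrt(sum((x - m1).^2));                % Euclidean distance to mean 1
d2 = sqrt(sum((x - m2).^2));                % Euclidean distance to mean 2
if d1 < d2, label = 1; else label = 2; end  % assign the nearer class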



Experiment No 2

“Implementing the Perceptron algorithm for finding the weights of
a Linear Discriminant function.”

A. Sessional Task:
You are given the following sample points in a 2-class problem:
1 = (1,1), (1,-1), (4,5)
2 = (2,2.5), (0,2), (2,3)
1. Plot all sample points from both classes, but samples from the same class should have
the same color and marker. Observe whether these two classes can be separated with a
linear boundary.
2. Consider the case of a second-order polynomial discriminant function. Recast the
problem of finding such a nonlinear discriminant function as the problem of finding a
linear discriminant function for a set of sample points of higher dimension. Generate
the high-dimensional sample points. [Hint: Φ-machine.]
3. Use Perceptron Algorithm (one at a time) for finding the weight-coefficients of the
discriminant function boundary for your linear classifier in question 2.

4. Draw the decision boundary between the two classes.

Note: Useful MATLAB Functions: sym, solve, subs


% Plotting the discriminant function and the prototypes.
syms x1 x2;
s = 10*x1*x1 - 6*x2*x2 + 24*x1*x2 - 24*x1 - 68*x2 + 65;  % example 2nd-order boundary
s2 = solve(s, x2);                              % solve the boundary for x2
xvals1 = -10:0.01:10;
xvals2(1,:) = double(subs(s2(1), x1, xvals1));  % evaluate one branch numerically
plot(xvals1, xvals2(1,:), 'k');
grid on; hold on;
% Class S1
plot(1, 1, 'ro');
plot(1, -1, 'ro');
plot(4, 5, 'ro');
% Class S2
plot(2, 2.5, 'gs');
plot(0, 2, 'gs');
plot(2, 3, 'gs');
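As one possible starting point for questions 2 and 3, the sketch below maps each sample
through a second-order Φ-function and runs the fixed-increment, single-sample perceptron
rule. The feature ordering, the learning rate eta, and all variable names are assumptions,
not a prescribed solution:

% A minimal sketch: quadratic Phi-mapping plus the one-at-a-time perceptron rule.
S1 = [1 1; 1 -1; 4 5];        % class 1 samples
S2 = [2 2.5; 0 2; 2 3];       % class 2 samples
phi = @(p) [1, p(1), p(2), p(1)^2, p(1)*p(2), p(2)^2];  % augmented quadratic features
Y = zeros(6, 6);
for i = 1:3, Y(i,:)   =  phi(S1(i,:)); end   % class 1 kept as-is
for i = 1:3, Y(i+3,:) = -phi(S2(i,:)); end   % class 2 negated (sign normalization)
a = zeros(6,1); eta = 1;      % initial weight vector and learning rate
misclassified = true;
while misclassified           % converges because the mapped classes are separable
    misclassified = false;
    for i = 1:6
        if Y(i,:)*a <= 0              % sample i is on the wrong side
            a = a + eta*Y(i,:)';      % one-at-a-time update
            misclassified = true;
        end
    end
end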

B. Exercise:
You are given the following sample points in a 3-class problem:
ω1 = (0,2), (1,3), (1,1)
ω2 = (1.5,0), (0,-1), (2,-1)
ω3 = (4,0), (4,3), (3,1)
Use the Perceptron algorithm (one at a time) to find the weight-coefficients of the
discriminant function boundaries for your linear classifier. Draw the decision
boundaries between the classes.

Experiment No 3

“Designing a Minimum Error Rate classifier”

A. Sessional Task:
Design a Minimum Error Rate classifier for a two-class problem with the data given
below:

P(x|1)=N(µ1,1) P(x|2)= N(µ2,2)


where, µ1=[0 0] where, µ2=[2 2]
and1= [.25 .3; .3 1]; and2= [.5 0; 0 .5];
P(1) =0.5 P(2) =0.5

1. Classify the following unknown samples:
(1,1), (1,-1), (4,5), (-2,2.5), (0,2), (2,-3)

2. Classified samples should have different colored markers according to the
assigned class label.
3. Change the parameter values (µi, Σi) to see different recognition results.
4. Label the regions Ri for each of the two classes.
5. Submit your sessional report, with the necessary illustrations and analysis, within 7
days from the day the task is assigned.

Note: Useful MATLAB Code:

x1 = -7:.2:7; x2 = -7:.2:7;
[X1,X2] = meshgrid(x1,x2);                 % evaluation grid

mu1 = [0 0];
Sigma1 = [.25 .3; .3 1];
F1 = mvnpdf([X1(:) X2(:)],mu1,Sigma1);     % class-1 likelihood over the grid
F1 = reshape(F1,length(x2),length(x1));
%surfc(x1,x2,F1);
meshc(X1,X2,F1);                           % mesh plot with contour projection
axis([-7 7 -7 7 -1.0 .6])
xlabel('x1'); ylabel('x2'); zlabel('Probability Density');
hold on;

mu2 = [2 2];
Sigma2 = [.5 .0; 0 .5];
F2 = mvnpdf([X1(:) X2(:)],mu2,Sigma2);
F2 = reshape(F2,length(x2),length(x1));
%surfc(x1,x2,F2);
meshc(X1,X2,F2);
axis([-7 7 -7 7 -1.0 .6])
xlabel('x1'); ylabel('x2'); zlabel('Probability Density');

caxis([min(F2(:))-.5*range(F2(:)),max(F2(:))]);

%% Write Your CODE here

plot3(1,1,-1.0,'rx');
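A minimal sketch of the classification step itself, assuming the densities defined above:
a sample is assigned to the class with the larger value of p(x|ωi)P(ωi). The marker styles
are illustrative.

P1 = 0.5; P2 = 0.5;                       % priors
xu = [1 1];                               % one unknown sample
g1 = mvnpdf(xu, mu1, Sigma1) * P1;        % score for class 1
g2 = mvnpdf(xu, mu2, Sigma2) * P2;        % score for class 2
if g1 > g2
    plot3(xu(1), xu(2), -1.0, 'rx');      % class-1 marker on the floor plane
else
    plot3(xu(1), xu(2), -1.0, 'bo');      % class-2 marker on the floor plane
end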

Experiment No 4

“Density Estimation using Parzen Window”

A. Sessional Task:
For the following samples in a one-dimensional problem:
x1, x2, …, x16 = 0, 1, 3, 4.5, 5.5, 6.0, 6.5, 7.0, 7.2, 7.5, 8.0, 8.8, 9.2, 9.3, 11, 13

Give the values of the k-nearest-neighbor estimate p_j(x), for j = 16 and k_j = √j, for
the range x = 0 to 13.

Vj  5,
When centered at x=2,
p j  (k j / j ) / Vj  (4 /16) / 5  0.05
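A minimal sketch of this estimator under the definitions above (variable names are
illustrative): at each evaluation point the window grows until it encloses the k_j nearest
samples.

% A minimal sketch of the k-nearest-neighbor density estimate.
x = [0 1 3 4.5 5.5 6.0 6.5 7.0 7.2 7.5 8.0 8.8 9.2 9.3 11 13];
j = 16; kj = sqrt(j);                 % kj = 4 neighbors
xs = 0:0.1:13;                        % evaluation points
p = zeros(size(xs));
for n = 1:length(xs)
    d = sort(abs(x - xs(n)));         % distances to all samples, ascending
    Vj = 2 * d(kj);                   % window length reaching the kj-th neighbor
    p(n) = (kj / j) / Vj;             % density estimate
end
plot(xs, p);
xlabel('x'); ylabel('p_j(x)');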

B. Exercise
For the following samples in a one-dimensional problem:

{xi = -7, -5, -4, -3, -2, -2.1, 0, 1.1, 2, 3, 4, 4.5, 4.6, 4.8, 5, 6}

Plot the Parzen-window estimates p_n(x), for n = 16 and h_n = h_j/√n, for the range
x = −10 to 10 (step interval should be 0.1, with h_j = 1).

1 n 1 x  xi
When centered at x, the estimated density value is pn ( x)   ( h )
n i 1 hn n

Your window function φ is the zero-mean, unit-variance normal density function.
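A minimal sketch of this estimate with the Gaussian window, assuming n = 16 as above;
variable names are illustrative:

% A minimal sketch of the Parzen-window estimate with a Gaussian kernel.
xi = [-7 -5 -4 -3 -2 -2.1 0 1.1 2 3 4 4.5 4.6 4.8 5 6];
n = length(xi);                        % n = 16 samples
hj = 1; hn = hj / sqrt(n);             % window width hn = hj/sqrt(n)
xs = -10:0.1:10;                       % evaluation points
p = zeros(size(xs));
for k = 1:length(xs)
    u = (xs(k) - xi) / hn;             % normalized distances to every sample
    p(k) = sum(exp(-u.^2/2) / sqrt(2*pi)) / (n * hn);  % sum of Gaussian kernels
end
plot(xs, p);
xlabel('x'); ylabel('p_n(x)');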

Experiment No 5

“Implementing a feature selection system using PCA”

A. Sessional Task:
Today’s task is to carry out Principal Component Analysis (PCA) on a set of face
images. The basic task is to learn the use of the princomp() MATLAB function to obtain
the eigenvectors and eigenvalues. Reducing the original feature dimension is the main
objective.

1. Read a set of face images and find the newly transformed face vectors based on PCA.
Reconstruct the face images from the transformed face vectors.

i. Read a set of input face images and generate the face feature (column) vectors. Each
image is a 2D array of intensity values, i.e., a 2D matrix. To prepare the column-vector
representation, all row values should be stored as a column and appended one after
another.
For example: face1 = [a b; c d]  →  [a; b; c; d]
ii. Prepare the matrix containing all the face vectors, where each row contains a face
vector (transposed).
For example: X = [face1'; face2'; ...] = [a b c d; p q r s; ...]
iii. Compute the eigenvectors (evec) and eigenvalues (eval). Discard unnecessary
eigenvectors based on the eigenvalues. [Show the eigenfaces: each eigenvector should
be rearranged as a 2D matrix like the input image.]
Use the function: [COEFF,SCORE,latent] = princomp(X); see help for details.

iv. Derive the new transformed feature vectors.
Take the projection onto the selected eigenvectors: nX = Xµ * COEFFr, where Xµ is the
mean-subtracted data matrix and COEFFr contains the retained eigenvector columns.
v. From the transformed feature vectors, reconstruct the face images and display them.

2. The computation of the eigenvectors and eigenvalues should also be done with the
cov() and eig() functions.

i. Subtract the mean from the original face features.
ii. Calculate the covariance matrix using cov().
iii. Calculate the eigenvectors and eigenvalues of the covariance matrix using eig().

Note: Useful MATLAB code:

% Use a loop as necessary.
filepre = 'imgset/img_';
s = num2str(i);                      % i is the image number.
impath = strcat(filepre, s, '.bmp');

% Convert to grayscale image.

% Arrange each image as a vector.

% Write all your code here to complete the given task.
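For orientation, here is a minimal sketch of the whole princomp-based pipeline. The image
count N, the image size h-by-w, and the number of retained components r are assumptions
to be replaced with your own values:

% A minimal sketch of the PCA pipeline (N, h, w, r are assumed values).
N = 20; h = 64; w = 64;                     % assumed image count and size
X = zeros(N, h*w);                          % one face vector per row
for i = 1:N
    img = imread(strcat('imgset/img_', num2str(i), '.bmp'));
    if size(img,3) == 3, img = rgb2gray(img); end   % convert to grayscale if needed
    X(i,:) = reshape(double(img)', 1, h*w); % rows appended one after another
end
[COEFF, SCORE, latent] = princomp(X);       % eigenvectors, projections, eigenvalues
r = 10;                                     % retained components (assumption)
Xmu = X - repmat(mean(X), N, 1);            % mean-subtracted data
nX = Xmu * COEFF(:, 1:r);                   % transformed vectors (= SCORE(:,1:r))
Xrec = nX * COEFF(:, 1:r)' + repmat(mean(X), N, 1);   % reconstruction from r components
imshow(uint8(reshape(Xrec(1,:), w, h)'));   % display the first reconstructed face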



Experiment No 6

“Implementing a feature transformation system using LDA”

A. Sessional Task:
Today’s task is to carry out Linear Discriminant Analysis (LDA) on a set of input data
and to learn the basic equations associated with LDA.

Compute the Linear Discriminant projection for the following two-dimensional dataset:

– Samples for class ω1 : {(4,2),(2,4),(2,3),(3,6),(4,4)}

– Samples for class ω2 : {(9,10),(6,8),(9,5),(8,7),(10,8)}

1. The computation of the coefficients of w should be done using the inv() and
eig() MATLAB functions.

2. Also draw the projection line in the original space.


3. Plot the projected dataset for each projection direction found, including the ones
with non-maximum Fisher response J.
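A minimal sketch of the required computation, using inv() and eig() as the task instructs;
the eigen-decomposition of inv(Sw)*Sb yields the projection directions and their Fisher
responses J. Variable names are illustrative:

% A minimal sketch of two-class Fisher LDA with inv() and eig().
X1 = [4 2; 2 4; 2 3; 3 6; 4 4];          % class w1 samples
X2 = [9 10; 6 8; 9 5; 8 7; 10 8];        % class w2 samples
m1 = mean(X1)'; m2 = mean(X2)';          % class means (column vectors)
S1 = (X1 - repmat(m1',5,1))' * (X1 - repmat(m1',5,1));   % scatter of class 1
S2 = (X2 - repmat(m2',5,1))' * (X2 - repmat(m2',5,1));   % scatter of class 2
Sw = S1 + S2;                            % within-class scatter
Sb = (m1 - m2) * (m1 - m2)';             % between-class scatter
[V, D] = eig(inv(Sw) * Sb);              % directions and Fisher responses J
[~, idx] = max(diag(D));                 % direction with maximum J
w = V(:, idx);                           % optimal projection vector
y1 = X1 * w; y2 = X2 * w;                % projected datasets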
