
CHAPTER 1

INTRODUCTION

1.1 Pattern Recognition

Pattern recognition is concerned with the automatic detection or classification


of objects or events. The measurements or properties used to classify the objects are
called features, and the types or categories into which they are classified are called
classes.

1.2 Need for Pattern Recognition

The need for pattern recognition is to perform scalable classification and clustering. Classification is a task that learns a function to map a data item into one of several predefined classes. Clustering is a task that identifies a finite set of clusters to describe the data.

1.3 Applications of Pattern Recognition

Typical applications are automatic speech recognition, the classification of text into several categories (e.g. spam/non-spam email messages), the automatic recognition of handwritten postal codes on envelopes, and the automatic recognition of images of human faces.
1.4 Types of Pattern Recognition

Pattern recognition can be broadly classified into two main categories:

1) Supervised Learning

2) Unsupervised Learning

1.4.1 Supervised Learning

The term Supervised Learning refers to the process of designing a pattern classifier by using a training set of patterns of known class to determine the choice of a specific decision-making technique for classifying additional similar samples in the future. The classifier, in other words, is designed using the training data. To obtain an unbiased estimate of the classifier's accuracy on new data, we must test it on a separate test set of patterns for which the class of each pattern is known.

Linear Discriminant Analysis, the Back Propagation Algorithm, Adaptive Decision Making and the Minimum Squared Error Deviation Method are some Supervised Learning methods.

1.4.2 Unsupervised Learning

Unsupervised Learning studies how systems can learn to represent particular input patterns in a way that reflects the statistical structure of the overall collection of input patterns. By contrast with Supervised Learning, there are no explicit target outputs or environmental evaluations associated with each input; rather, the unsupervised learner brings to bear prior biases as to what aspects of the structure of the input should be captured in the output.

Unsupervised Learning is important since it is likely to be much more common in the brain than supervised learning.

K-Means clustering, the Single Linkage Algorithm, Adaptive K-Means clustering and Ward's method are some methods of Unsupervised Learning.

The aim of the project is to develop a simple, comprehensive toolbox that enables the user to choose an appropriate technique by working through all possible methods.

The techniques discussed here are:

1) Linear Discriminant Analysis

2) Back Propagation Algorithm


CHAPTER 2

LINEAR DISCRIMINANT ANALYSIS

2.1 Introduction

Linear discriminant analysis (LDA) is a method used in statistics and machine learning to find the linear combination of features that best separates two or more classes of objects or events. The resulting combination may be used as a linear classifier or, more commonly, for dimensionality reduction before later classification.

LDA is closely related to ANOVA (analysis of variance) and Regression Analysis, which also attempt to express one dependent variable as a linear combination of other features or measurements. In the other two methods, however, the dependent variable is a numerical quantity, while for LDA it is a categorical variable (i.e. the class label).

LDA is also closely related to Principal Component Analysis (PCA) and Factor Analysis in that all of them look for linear combinations of variables that best explain the data. LDA explicitly attempts to model the difference between the classes of data; PCA, on the other hand, does not take into account any difference in class, and Factor Analysis builds the feature combinations based on differences rather than similarities. Discriminant Analysis is also different from Factor Analysis in that it is not an interdependence technique: a distinction between independent variables and dependent variables (also called criterion variables) must be made.

If we can assume that the groups are linearly separable, we can use the linear discriminant model (LDA).
2.2 Purpose

The purpose of Discriminant Analysis is to classify objects (people, customers, things, etc.) into one of two or more groups based on a set of features that describe the objects (e.g. gender, age, income, weight, preference score, etc.). In general, we assign an object to one of a number of predetermined groups based on observations made on the object.

2.3 LDA Formula

Using a classification criterion that minimizes the Total Error of Classification (TEC), we make the proportion of objects that the rule misclassifies as small as possible. TEC is the performance of the rule "in the long run" on a random sample of objects; thus, TEC should be thought of as the probability that the rule under consideration will misclassify an object. The classification rule is to assign an object to the group with the highest conditional probability. This is called Bayes' rule, and it also minimizes the TEC. If there are g groups, Bayes' rule is to assign the object with measurements x to the group i for which P(i | x) > P(j | x) for all j ≠ i.

We want to know the probability P(i | x) that an object belongs to group i, given the set of measurements x. In practice, however, the quantity P(i | x) is difficult to obtain directly. What we can get is P(x | i): the probability of getting a particular set of measurements x given that the object comes from group i. For example, only after we know that a soap is good or bad can we measure the object (weight, smell, color, etc.). What we actually want is the reverse: to determine the group of the soap (good or bad) based on the measurements alone.

Fortunately, there is a relationship between the two conditional probabilities, well known as Bayes' theorem:

    P(i | x) = P(x | i) P(i) / ∑ P(x | j) P(j)

The prior probability P(i) is the probability of the group, known without making any measurement. In practice we can assume the prior probability is equal for all groups, or base it on the number of samples in each group.

In practice, however, using Bayes' rule directly is impractical, because obtaining P(x | i) requires a great deal of data to estimate the relative frequency of each group for each measurement. It is more practical to assume a distribution and obtain the probability theoretically. If we assume that each group has a multivariate Normal distribution and all groups share the same covariance matrix C, we get the Linear Discriminant Analysis formula: assign object k to the group i that has the maximum

    f_i = μ_i C^-1 x_k^T - (1/2) μ_i C^-1 μ_i^T + ln(p_i)

The quadratic term μ_i C^-1 μ_i^T comes from the Mahalanobis distance, which is a distance measure of the dissimilarity between groups.
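Where this formula comes from can be seen with one short, standard derivation, added here for completeness (it is not part of the original tutorial text). Treating x and μ_i as row vectors, the logarithm of the numerator of Bayes' theorem under the shared-covariance Normal assumption is, up to terms that are the same for every group,

    ln( P(x | i) p_i ) = -(1/2) (x - μ_i) C^-1 (x - μ_i)^T + ln p_i + const
                       = μ_i C^-1 x^T - (1/2) μ_i C^-1 μ_i^T - (1/2) x C^-1 x^T + ln p_i + const

The term (1/2) x C^-1 x^T is identical for every group, so it can be dropped, which leaves exactly the linear score f_i above.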

2.4 Algorithm

Step 1: Assume that the given data are linearly separable.

Step 2: Assign the features matrix to X and the class matrix to Y.

Step 3: Let g be the number of groups.

Step 4: Separate X into several groups based on the number of categories in Y, each group represented by X_i.

Step 5: Calculate the mean of features for each group, μ_i.

Step 6: Calculate the global mean of features, μ.

Step 7: Compute the mean corrected data for each group, x_i^0:
        mean corrected data = features data for group i (X_i) - global mean (μ)

Step 8: Compute the covariance matrix for each group, c_i:
        c_i = covariance matrix of group i = (x_i^0)^T (x_i^0) / n_i

Step 9: Compute the global (pooled) covariance matrix, C. It is calculated for each entry in the matrix as C = (1/N) ∑ n_i c_i.

Step 10: Compute the prior probability vector p (each row represents the prior probability p_i of group i). If we do not know the prior probability, we just assume it is equal to the total sample of each group divided by the total samples, that is p_i = n_i / N.

Step 11: Compute the frequency f_i = μ_i C^-1 x_k^T - (1/2) μ_i C^-1 μ_i^T + ln(p_i) for each object in the training set, and finally compute the frequency for the test set using the same discriminant function.

Step 12: Assign the object to the group i that has the maximum f_i.
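To make the steps concrete, here is a small standalone C++ sketch of Steps 1 to 12 for the two-feature, two-group case, applied to the chip-ring data of the example in Section 2.5. It is an illustration only, not the project's implementation (that is the MFC code of Chapter 4); like that code, it adds the prior as 2.303*log(p_i) rather than ln(p_i), so its printed values match the result table of Section 2.5.

#include <math.h>
#include <stdio.h>

int main()
{
    const int N = 7, F = 2, G = 2;
    /* Step 2: features X and classes Y (0 = Passed, 1 = Not Passed) */
    double X[7][2] = {{2.95,6.63},{2.53,7.79},{3.57,5.65},{3.16,5.47},
                      {2.58,4.46},{2.16,6.22},{3.27,3.52}};
    int Y[7] = {0,0,0,0,1,1,1};
    int i, k, f, r, s;

    /* Steps 5-6: group means mu_i and global mean */
    double mu[2][2] = {{0,0},{0,0}}, gm[2] = {0,0};
    int n[2] = {0,0};
    for (k = 0; k < N; ++k) {
        ++n[Y[k]];
        for (f = 0; f < F; ++f) { mu[Y[k]][f] += X[k][f]; gm[f] += X[k][f]; }
    }
    for (i = 0; i < G; ++i) for (f = 0; f < F; ++f) mu[i][f] /= n[i];
    for (f = 0; f < F; ++f) gm[f] /= N;

    /* Steps 7-9: because every group is corrected by the *global* mean,
       the pooled sum (1/N) sum_i n_i c_i collapses to one pass over all
       samples, which this sketch exploits */
    double C[2][2] = {{0,0},{0,0}};
    for (k = 0; k < N; ++k) {
        double d0 = X[k][0]-gm[0], d1 = X[k][1]-gm[1];
        C[0][0] += d0*d0; C[0][1] += d0*d1;
        C[1][0] += d1*d0; C[1][1] += d1*d1;
    }
    for (r = 0; r < 2; ++r) for (s = 0; s < 2; ++s) C[r][s] /= N;

    /* Inverse of the 2x2 pooled covariance matrix */
    double det = C[0][0]*C[1][1] - C[0][1]*C[1][0];
    double Ci[2][2] = {{  C[1][1]/det, -C[0][1]/det },
                       { -C[1][0]/det,  C[0][0]/det }};

    /* Steps 10-12: priors, discriminant values, decision for a new object */
    double x[2] = {2.81, 5.46};            /* the disputed chip ring */
    int best = 0; double fbest = -1e30;
    for (i = 0; i < G; ++i) {
        double w0 = mu[i][0]*Ci[0][0] + mu[i][1]*Ci[1][0]; /* w = mu_i C^-1 */
        double w1 = mu[i][0]*Ci[0][1] + mu[i][1]*Ci[1][1];
        double p  = (double)n[i]/N;
        double fi = w0*x[0] + w1*x[1]
                  - 0.5*(w0*mu[i][0] + w1*mu[i][1])
                  + 2.303*log(p);          /* prior term as in the Chapter 4 code */
        printf("f%d = %.2f\n", i+1, fi);
        if (fi > fbest) { fbest = fi; best = i; }
    }
    printf("Assign the object to group %s\n", best==0 ? "Passed" : "Not Passed");
    return 0;
}

Compiled and run, it should print f1 = 43.10 and f2 = 42.76 and assign the new chip ring to the group Passed.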


2.5 Example

Factory "ABC" produces very expensive, high-quality chip rings whose quality is measured in terms of curvature and diameter. The result of quality control by experts is given in the table below.

Curvature Diameter Quality Control Result


2.95 6.63 Passed
2.53 7.79 Passed
3.57 5.65 Passed
3.16 5.47 Passed
2.58 4.46 Not Passed
2.16 6.22 Not Passed
3.27 3.52 Not Passed

As a consultant to the factory, you get the task of setting up criteria for automatic quality control. The manager of the factory also wants to test your criteria on a new type of chip ring about which even the human experts argue with each other. The new chip ring has curvature 2.81 and diameter 5.46.

When we plot the features, we can see that the data are linearly separable: we can draw a line that separates the two groups. The problem is to find that line, and to rotate the features in such a way as to maximize the distance between groups and minimize the distance within each group.
X = features (or independent variables) of all data. Each row (denoted by x_k) represents one object; each column stands for one feature.

Y = group of the object (or dependent variable) of all data. Each row represents one object and it has only one column.

In our example,

    X = | 2.95  6.63 |        Y = | Passed     |
        | 2.53  7.79 |            | Passed     |
        | 3.57  5.65 |            | Passed     |
        | 3.16  5.47 |            | Passed     |
        | 2.58  4.46 |            | Not Passed |
        | 2.16  6.22 |            | Not Passed |
        | 3.27  3.52 |            | Not Passed |

x_k = data of row k. For example, x_2 = [2.53  7.79].

g = number of groups in Y. In our example, g = 2.

X_i = features data for group i. Each row represents one object; each column stands for one feature. We separate X into several groups based on the number of categories in Y: X_1 (Passed) is the first four rows of X, and X_2 (Not Passed) is the last three rows.

μ_i = mean of features in group i, which is the average of X_i:

    μ_1 = [3.05  6.38],    μ_2 = [2.67  4.73]

μ = global mean vector, that is the mean of the whole data set. In this example,

    μ = [2.89  5.68]

x_i^0 = mean corrected data, that is the features data for group i, X_i, minus the global mean vector μ (see the "mean corrected data" columns of the table below).

c_i = covariance matrix of group i, c_i = (x_i^0)^T (x_i^0) / n_i:

    c_1 = |  0.168  -0.193 |        c_2 = |  0.257  -0.281 |
          | -0.193   1.354 |              | -0.281   2.143 |

C = pooled within-group covariance matrix. It is calculated for each entry in the matrix as C = (1/N) ∑ n_i c_i. In our example, n_1 = 4, n_2 = 3 and N = 7, therefore

    C = |  0.206  -0.231 |
        | -0.231   1.692 |

The inverse of the pooled covariance matrix is

    C^-1 = | 5.732  0.782 |
           | 0.782  0.698 |

p = prior probability vector (each row represents the prior probability p_i of group i). If we do not know the prior probability, we just assume it is equal to the total sample of each group divided by the total samples, that is p_1 = 4/7 and p_2 = 3/7.

The discriminant function is

    f_i = μ_i C^-1 x_k^T - (1/2) μ_i C^-1 μ_i^T + ln(p_i)

and we should assign object k to the group i that has the maximum f_i. The table below lists the discriminant values computed by the program for every training object and for the new chip ring.


LINEAR DISCRIMINANT ANALYSIS: 2 Groups, 2 Features

              Training Data    Mean Corrected Data    Discriminant Values
Class           x1     x2        x1^0     x2^0          f1       f2      Classification
Passed         2.95   6.63      0.060    0.951         54.25    51.72    Passed
Passed         2.53   7.79     -0.357    2.109         52.74    49.99    Passed
Passed         3.57   5.65      0.679   -0.025         61.49    58.23    Passed
Passed         3.16   5.47      0.269   -0.209         51.04    49.47    Passed
Not Passed     2.58   4.46     -0.305   -1.218         31.08    32.99    Not Passed
Not Passed     2.16   6.22     -0.732    0.547         33.68    34.50    Not Passed
Not Passed     3.27   3.52      0.386   -2.155         40.17    41.04    Not Passed
Prediction     2.81   5.46     -0.078   -0.219         43.10    42.76    Passed
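Reading off the last row: for the new chip ring, f1 = 43.10 exceeds f2 = 42.76, so it is assigned to the group Passed.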
BACK PROPAGATION ALGORITHM

2.6 Introduction

The area of Neural Networks probably belongs to the borderline between Artificial Intelligence and Approximation Algorithms. Think of them as algorithms for smart approximation. Neural Nets are used as universal approximation tools mapping input to output, as tools capable of learning from their environment, as tools for finding non-evident dependencies between data, and so on.

Neural Networking algorithms are modeled after the brain and the way it processes information. The brain is a very efficient tool: despite having a response time about 100,000 times slower than computer chips, it beats the computer at complex tasks such as image and sound recognition, motion control and so on. It is also about 10,000,000,000 times more efficient than the computer chip in terms of energy consumption per operation.

The brain is a multi-layer structure, with 6-7 layers of neurons and about 10^11 neurons in all, that works as a parallel computer capable of learning from the feedback it receives from the world and changing its design by growing new neural links between neurons or altering the activities of existing ones. To make the picture a bit more complete, let us also mention that a typical neuron is connected to 50-100 other neurons, and sometimes to itself, too.

[Figure: example of a neural net with inputs and an output]

The Back Propagation Algorithm for training the weights in a multilayer net uses the steepest-descent minimization procedure and the sigmoid function. It consists of two main steps: a feed-forward step, in which the outputs of the nodes are computed starting at layer 1 and working forward to the output layer K, and a back-propagation step, in which the weights are updated in an attempt to get better agreement between the observed outputs and the desired outputs.

The Neural Net receives the inputs, which can be a pattern of some kind. In the case of image recognition, for example, they would be pixels from a photosensitive matrix of some kind.

After a neuron in the first layer receives its input, it applies the Linear Combiner and the Activation Function to the inputs and produces the Output. This output, as you can see from the picture, becomes the input for the neurons in the next layer. Each layer thus feeds the data forward to the next layer, and so on, until the last layer is reached.

When we look at the output layer, we know not only its actual output but also the output we are trying to predict, called the desired output of the Neural Net. Comparing the two values, we can compute the error: dError = dDesiredOutput - dOutput. Now we can adjust this particular neuron to work better with this particular input. For the errors to decrease as the Neural Net learns, we have to adjust the weights in every layer.

Once we have decided what adjustment we need to apply to the neurons in the output layer, we can back-propagate the changes to the previous layers of the network. Indeed, as soon as we have desired outputs for the output layer, we can make an adjustment to reduce the error; the adjustment changes the weights on the input connections of the neurons in the output layer.

But the input nodes of the last layer are the output nodes of the previous layer. So we have the actual output of the previous layer and its desired output, and we can adjust the previous layer of the net, and so on, until we reach the first layer.

2.7 Algorithm

Step 1: Initialize the weights wij(k) to small random values, and choose a positive constant c (the learning rate).

Step 2: Repeatedly set x1(0), …, xM0(0) equal to the features of samples 1 to N, cycling back to sample 1 after sample N is reached.

Step 3: Feed Forward Step

Step 3.1: For k = 0, …, K-1, compute

          xj(k+1) = R( ∑ wij(k+1) xi(k) ),  the sum running over i = 0, 1, …, Mk,

          for nodes j = 1, …, Mk+1. We use the sigmoid threshold function R(s) = 1/(1+e^-s).

Step 4: Back Propagation Step

Step 4.1: For the nodes in the output layer, j = 1, …, MK, compute

          δj(K) = xj(K) (1 - xj(K)) (xj(K) - dj)

Step 4.2: For layers k = K-1, …, 1 compute

          δi(k) = xi(k) (1 - xi(k)) ∑ δj(k+1) wij(k+1)

          for i = 1, …, Mk, the sum running over j = 1, …, Mk+1.

Step 5: Replace wij(k) by wij(k) - c δj(k) xi(k-1) for all i, j, k.

Step 6: Repeat steps 2 to 5 until the weights wij(k) cease to change significantly.
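As an illustration of Steps 1 to 6, here is a small standalone C++ sketch that trains a 2-2-1 sigmoid network; node 0 of each layer is a constant bias input, which is why the sums of Step 3 start at i = 0. The XOR training set, the layer sizes and the learning rate are illustrative assumptions, not part of the project.

#include <math.h>
#include <stdio.h>
#include <stdlib.h>

double R(double s) { return 1.0/(1.0 + exp(-s)); }          /* sigmoid R(s) = 1/(1+e^-s) */
double small_rand() { return rand()/(double)RAND_MAX - 0.5; }

int main()
{
    const double c = 0.7;            /* positive constant of Step 1 */
    double wh[2][3], wo[3];          /* wh[j][i]: input i -> hidden j; wo[i]: hidden i -> output */
    int i, j, s, pass;
    for (j = 0; j < 2; ++j)          /* Step 1: small random weights */
        for (i = 0; i < 3; ++i)
            wh[j][i] = small_rand();
    for (i = 0; i < 3; ++i)
        wo[i] = small_rand();

    const double X[4][2] = {{0,0},{0,1},{1,0},{1,1}};   /* samples 1..N */
    const double d[4]    = {0,1,1,0};                   /* desired outputs */

    for (pass = 0; pass < 20000; ++pass)    /* Step 6: repeat until converged */
        for (s = 0; s < 4; ++s)             /* Step 2: cycle through the samples */
        {
            /* Step 3: feed forward (index 0 is the bias input, x0 = 1) */
            double xh[2], xo;
            for (j = 0; j < 2; ++j)
                xh[j] = R(wh[j][0] + wh[j][1]*X[s][0] + wh[j][2]*X[s][1]);
            xo = R(wo[0] + wo[1]*xh[0] + wo[2]*xh[1]);

            /* Step 4.1: delta at the output node */
            double dout = xo*(1 - xo)*(xo - d[s]);

            /* Step 4.2: deltas at the hidden nodes */
            double dh[2];
            for (j = 0; j < 2; ++j)
                dh[j] = xh[j]*(1 - xh[j])*dout*wo[j+1];

            /* Step 5: w <- w - c * delta * input */
            wo[0] -= c*dout;  wo[1] -= c*dout*xh[0];  wo[2] -= c*dout*xh[1];
            for (j = 0; j < 2; ++j)
            {
                wh[j][0] -= c*dh[j];
                wh[j][1] -= c*dh[j]*X[s][0];
                wh[j][2] -= c*dh[j]*X[s][1];
            }
        }

    for (s = 0; s < 4; ++s)          /* the trained net should approximate XOR */
    {
        double xh0 = R(wh[0][0] + wh[0][1]*X[s][0] + wh[0][2]*X[s][1]);
        double xh1 = R(wh[1][0] + wh[1][1]*X[s][0] + wh[1][2]*X[s][1]);
        printf("%.0f xor %.0f -> %.3f (desired %.0f)\n",
               X[s][0], X[s][1], R(wo[0] + wo[1]*xh0 + wo[2]*xh1), d[s]);
    }
    return 0;
}

With plain steepest descent, a net this small can occasionally settle into a poor local minimum; seeding rand() differently (srand) and rerunning usually helps.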

2.8 Example

Consider the simple network below: two inputs, A = 0.35 and B = 0.9, feed two hidden neurons, whose outputs feed a single output neuron. The initial weights are 0.1 (A to the top hidden neuron), 0.8 (B to the top hidden neuron), 0.4 (A to the bottom hidden neuron), 0.6 (B to the bottom hidden neuron), 0.3 (top hidden neuron to the output) and 0.9 (bottom hidden neuron to the output).

Assume that the neurons have a Sigmoid activation function and

(i) Perform a forward pass on the network.

(ii) Perform a reverse pass (training) once (target = 0.5).

(iii) Perform a further forward pass and comment on the result.


Answer:

(i) Forward Pass

Input to top neuron = (0.35x0.1)+(0.9x0.8)=0.755. Out = 0.68.

Input to bottom neuron = (0.9x0.6)+(0.35x0.4) = 0.68. Out = 0.6637.

Input to final neuron = (0.3x0.68)+(0.9x0.6637) = 0.80133. Out = 0.69.

(ii) Reverse Pass

Output error δ = (t-o)(1-o)o = (0.5-0.69)(1-0.69)(0.69) = -0.0406.

New weights for the output layer:

w1+ = w1 + (δ x input) = 0.3 + (-0.0406 x 0.68) = 0.272392.

w2+ = w2 + (δ x input) = 0.9 + (-0.0406 x 0.6637) = 0.87305.

Errors for the hidden layer neurons (each using its own output o):

δ1 = δ x w1 x (1-o)o = -0.0406 x 0.272392 x (1-0.68) x 0.68 = -2.406x10^-3

δ2 = δ x w2 x (1-o)o = -0.0406 x 0.87305 x (1-0.6637) x 0.6637 = -7.916x10^-3


New hidden layer weights:

w3+ = 0.1 + (-2.406x10^-3 x 0.35) = 0.09916.

w4+ = 0.8 + (-2.406x10^-3 x 0.9) = 0.7978.

w5+ = 0.4 + (-7.916x10^-3 x 0.35) = 0.3972.

w6+ = 0.6 + (-7.916x10^-3 x 0.9) = 0.5928.

(iii) Result

The old error was -0.19. The new error is -0.18205. Therefore the error has been reduced.
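The hand calculation above is easy to check mechanically. The following standalone C++ fragment reproduces it step by step; the weight names w1 to w6 follow the example, and it is a verification aid only, not part of the project code.

#include <math.h>
#include <stdio.h>

double R(double s) { return 1.0/(1.0 + exp(-s)); }   /* sigmoid */

int main()
{
    double A = 0.35, B = 0.9, target = 0.5;
    double w3 = 0.1, w4 = 0.8;    /* A, B -> top hidden neuron */
    double w5 = 0.4, w6 = 0.6;    /* A, B -> bottom hidden neuron */
    double w1 = 0.3, w2 = 0.9;    /* top, bottom -> output neuron */

    /* (i) forward pass: approx. 0.68, 0.6637, 0.69 */
    double top = R(A*w3 + B*w4);
    double bot = R(A*w5 + B*w6);
    double out = R(top*w1 + bot*w2);
    printf("forward: top=%.4f bottom=%.4f out=%.2f\n", top, bot, out);

    /* (ii) reverse pass */
    double delta = (target - out)*(1 - out)*out;   /* approx. -0.0406 */
    w1 += delta*top;                               /* approx. 0.272392 */
    w2 += delta*bot;                               /* approx. 0.87305  */
    double d1 = delta*w1*(1 - top)*top;            /* approx. -2.406e-3 (uses the updated w1, as above) */
    double d2 = delta*w2*(1 - bot)*bot;            /* approx. -7.916e-3 */
    w3 += d1*A;  w4 += d1*B;                       /* approx. 0.09916, 0.7978 */
    w5 += d2*A;  w6 += d2*B;                       /* approx. 0.3972,  0.5928 */

    /* (iii) second forward pass: the error shrinks from about -0.19 to about -0.182 */
    top = R(A*w3 + B*w4);
    bot = R(A*w5 + B*w6);
    out = R(top*w1 + bot*w2);
    printf("after one update: out=%.5f error=%.5f\n", out, target - out);
    return 0;
}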
CHAPTER 3

DEVELOPMENT ENVIRONMENT

3.1 HARDWARE ENVIRONMENT

The hardware used for the development of the project is:

Processor  : Pentium IV
RAM        : 512 MB
Monitor    : 15" Color
Hard Disk  : 20 GB

3.2 SOFTWARE ENVIRONMENT

The software used for the development of the project is:

Operating system : Windows 2000 Professional


Environment : Microsoft Visual Studio 6.0
Language : Microsoft Visual C++
CHAPTER 4

SYSTEM IMPLEMENTATION

4.1 Creating the Dialog Box

Header File

// ProjectDlg.h : header file

//

#if !defined(AFX_PROJECTDLG_H__BF56921F_0F09_4C8A_BA63_8EA9856308A5__INCLUDED_)
#define AFX_PROJECTDLG_H__BF56921F_0F09_4C8A_BA63_8EA9856308A5__INCLUDED_

#if _MSC_VER > 1000

#pragma once

#endif // _MSC_VER > 1000

/////////////////////////////////////////////////////////////////////////////

// CProjectDlg dialog
class CProjectDlg : public CDialog
{
// Construction
public:

CProjectDlg(CWnd* pParent = NULL); // standard constructor

// Dialog Data

//{{AFX_DATA(CProjectDlg)

enum { IDD = IDD_PROJECT_DIALOG };

int m_ch;

//}}AFX_DATA

// ClassWizard generated virtual function overrides

//{{AFX_VIRTUAL(CProjectDlg)

protected:

	virtual void DoDataExchange(CDataExchange* pDX);	// DDX/DDV support

//}}AFX_VIRTUAL

// Implementation

protected:

HICON m_hIcon;
// Generated message map functions

//{{AFX_MSG(CProjectDlg)

virtual BOOL OnInitDialog();

afx_msg void OnSysCommand(UINT nID, LPARAM lParam);

afx_msg void OnPaint();

afx_msg HCURSOR OnQueryDragIcon();

afx_msg void OnOk();

afx_msg void OnCancel();

//}}AFX_MSG

DECLARE_MESSAGE_MAP()

};

//{{AFX_INSERT_LOCATION}}

// Microsoft Visual C++ will insert additional declarations immediately before the previous line.

#endif // !defined(AFX_PROJECTDLG_H__BF56921F_0F09_4C8A_BA63_8EA9856308A5__INCLUDED_)
Implementation Code:

// ProjectDlg.cpp : implementation file

//

#include "stdafx.h"

#include "Project.h"

#include "ProjectDlg.h"

#include "super.h"

#include "unsuper.h"

#ifdef _DEBUG

#define new DEBUG_NEW

#undef THIS_FILE

static char THIS_FILE[] = __FILE__;

#endif

/////////////////////////////////////////////////////////////////////////////

// CAboutDlg dialog used for App About

class CAboutDlg : public CDialog
{
public:

CAboutDlg();
// Dialog Data

//{{AFX_DATA(CAboutDlg)

enum { IDD = IDD_ABOUTBOX };

//}}AFX_DATA

// ClassWizard generated virtual function overrides

//{{AFX_VIRTUAL(CAboutDlg)

protected:

	virtual void DoDataExchange(CDataExchange* pDX);	// DDX/DDV support

//}}AFX_VIRTUAL

// Implementation

protected:

//{{AFX_MSG(CAboutDlg)

//}}AFX_MSG

DECLARE_MESSAGE_MAP()

};

CAboutDlg::CAboutDlg() : CDialog(CAboutDlg::IDD)
{
	//{{AFX_DATA_INIT(CAboutDlg)
	//}}AFX_DATA_INIT
}

void CAboutDlg::DoDataExchange(CDataExchange* pDX)
{
	CDialog::DoDataExchange(pDX);
	//{{AFX_DATA_MAP(CAboutDlg)
	//}}AFX_DATA_MAP
}

BEGIN_MESSAGE_MAP(CAboutDlg, CDialog)

//{{AFX_MSG_MAP(CAboutDlg)

// No message handlers

//}}AFX_MSG_MAP

END_MESSAGE_MAP()

/////////////////////////////////////////////////////////////////////////////

// CProjectDlg dialog

CProjectDlg::CProjectDlg(CWnd* pParent /*=NULL*/)
	: CDialog(CProjectDlg::IDD, pParent)
{
	//{{AFX_DATA_INIT(CProjectDlg)
	m_ch = -1;
	//}}AFX_DATA_INIT
	// Note that LoadIcon does not require a subsequent DestroyIcon in Win32
	m_hIcon = AfxGetApp()->LoadIcon(IDR_MAINFRAME);
}

void CProjectDlg::DoDataExchange(CDataExchange* pDX)
{
	CDialog::DoDataExchange(pDX);
	//{{AFX_DATA_MAP(CProjectDlg)
	DDX_Radio(pDX, IDC_RADIO5, m_ch);
	//}}AFX_DATA_MAP
}

BEGIN_MESSAGE_MAP(CProjectDlg, CDialog)

//{{AFX_MSG_MAP(CProjectDlg)

ON_WM_SYSCOMMAND()

ON_WM_PAINT()

ON_WM_QUERYDRAGICON()

ON_BN_CLICKED(IDC_BUTTON1, OnOk)

ON_BN_CLICKED(IDC_BUTTON2, OnCancel)

//}}AFX_MSG_MAP

END_MESSAGE_MAP()

/////////////////////////////////////////////////////////////////////////////

// CProjectDlg message handlers


BOOL CProjectDlg::OnInitDialog()
{
	CDialog::OnInitDialog();

	// Add "About..." menu item to system menu.
	// IDM_ABOUTBOX must be in the system command range.
	ASSERT((IDM_ABOUTBOX & 0xFFF0) == IDM_ABOUTBOX);
	ASSERT(IDM_ABOUTBOX < 0xF000);

	CMenu* pSysMenu = GetSystemMenu(FALSE);
	if (pSysMenu != NULL)
	{
		CString strAboutMenu;
		strAboutMenu.LoadString(IDS_ABOUTBOX);
		if (!strAboutMenu.IsEmpty())
		{
			pSysMenu->AppendMenu(MF_SEPARATOR);
			pSysMenu->AppendMenu(MF_STRING, IDM_ABOUTBOX, strAboutMenu);
		}
	}

	// Set the icon for this dialog. The framework does this automatically
	// when the application's main window is not a dialog
	SetIcon(m_hIcon, TRUE);			// Set big icon
	SetIcon(m_hIcon, FALSE);		// Set small icon

	// TODO: Add extra initialization here

	return TRUE; // return TRUE unless you set the focus to a control
}

void CProjectDlg::OnSysCommand(UINT nID, LPARAM lParam)
{
	if ((nID & 0xFFF0) == IDM_ABOUTBOX)
	{
		CAboutDlg dlgAbout;
		dlgAbout.DoModal();
	}
	else
	{
		CDialog::OnSysCommand(nID, lParam);
	}
}
// If you add a minimize button to your dialog, you will need the code below

// to draw the icon. For MFC applications using the document/view model,

// this is automatically done for you by the framework.

void CProjectDlg::OnPaint()
{
	if (IsIconic())
	{
		CPaintDC dc(this); // device context for painting

		SendMessage(WM_ICONERASEBKGND, (WPARAM) dc.GetSafeHdc(), 0);

		// Center icon in client rectangle
		int cxIcon = GetSystemMetrics(SM_CXICON);
		int cyIcon = GetSystemMetrics(SM_CYICON);
		CRect rect;
		GetClientRect(&rect);
		int x = (rect.Width() - cxIcon + 1) / 2;
		int y = (rect.Height() - cyIcon + 1) / 2;

		// Draw the icon
		dc.DrawIcon(x, y, m_hIcon);
	}
	else
	{
		CDialog::OnPaint();
	}
}

// The system calls this to obtain the cursor to display while the user drags

// the minimized window.

HCURSOR CProjectDlg::OnQueryDragIcon()
{
	return (HCURSOR) m_hIcon;
}

void CProjectDlg::OnOk()
{
	// Open the dialog for the chosen learning category and hide this one.
	UpdateData(true);
	super* s=new super;
	unsuper* u=new unsuper;
	switch(m_ch)
	{
	case 0:		// Supervised Learning
		s->Create(IDD_DIALOG1);
		s->ShowWindow(SW_SHOW);
		this->ShowWindow(SW_HIDE);
		break;
	case 1:		// Unsupervised Learning
		u->Create(IDD_DIALOG2);
		u->ShowWindow(SW_SHOW);
		this->ShowWindow(SW_HIDE);
		break;
	default:
		MessageBox("Select any one of the choices");
		break;
	}
}

void CProjectDlg::OnCancel()
{
	exit(0);
}
4.2 Mapping Resources

Header File:

// Project.h : main header file for the PROJECT application

//

#if !defined(AFX_PROJECT_H__9334AE5F_4A2E_4CF6_A7DF_ABD1B9CD2BC5__INCLUDED_)
#define AFX_PROJECT_H__9334AE5F_4A2E_4CF6_A7DF_ABD1B9CD2BC5__INCLUDED_

#if _MSC_VER > 1000

#pragma once

#endif // _MSC_VER > 1000

#ifndef __AFXWIN_H__

#error include 'stdafx.h' before including this file for PCH

#endif

#include "resource.h" // main symbols


/////////////////////////////////////////////////////////////////////////////

// CProjectApp:

// See Project.cpp for the implementation of this class

//

class CProjectApp : public CWinApp
{
public:

CProjectApp();

// Overrides

// ClassWizard generated virtual function overrides

//{{AFX_VIRTUAL(CProjectApp)

public:

virtual BOOL InitInstance();

//}}AFX_VIRTUAL

// Implementation

//{{AFX_MSG(CProjectApp)

	// NOTE - the ClassWizard will add and remove member functions here.

// DO NOT EDIT what you see in these blocks of generated code !

//}}AFX_MSG
DECLARE_MESSAGE_MAP()

};

/////////////////////////////////////////////////////////////////////////////

//{{AFX_INSERT_LOCATION}}

// Microsoft Visual C++ will insert additional declarations immediately before the previous line.

#endif // !defined(AFX_PROJECT_H__9334AE5F_4A2E_4CF6_A7DF_ABD1B9CD2BC5__INCLUDED_)

Implementation File

// Project.cpp : Defines the class behaviors for the application.

//

#include "stdafx.h"

#include "Project.h"

#include "ProjectDlg.h"

#ifdef _DEBUG
#define new DEBUG_NEW

#undef THIS_FILE

static char THIS_FILE[] = __FILE__;

#endif

/////////////////////////////////////////////////////////////////////////////

// CProjectApp

BEGIN_MESSAGE_MAP(CProjectApp, CWinApp)

//{{AFX_MSG_MAP(CProjectApp)

	// NOTE - the ClassWizard will add and remove mapping macros here.

// DO NOT EDIT what you see in these blocks of generated code!

//}}AFX_MSG

ON_COMMAND(ID_HELP, CWinApp::OnHelp)

END_MESSAGE_MAP()

/////////////////////////////////////////////////////////////////////////////

// CProjectApp construction

CProjectApp::CProjectApp()
{
	// TODO: add construction code here,
	// Place all significant initialization in InitInstance
}

/////////////////////////////////////////////////////////////////////////////

// The one and only CProjectApp object

CProjectApp theApp;

/////////////////////////////////////////////////////////////////////////////

// CProjectApp initialization

BOOL CProjectApp::InitInstance()
{
	// Standard initialization
	// If you are not using these features and wish to reduce the size
	// of your final executable, you should remove from the following
	// the specific initialization routines you do not need.

#ifdef _AFXDLL
	Enable3dControls();			// Call this when using MFC in a shared DLL
#else
	Enable3dControlsStatic();	// Call this when linking to MFC statically
#endif

	CProjectDlg dlg;
	m_pMainWnd = &dlg;
	int nResponse = dlg.DoModal();
	if (nResponse == IDOK)
	{
		// TODO: Place code here to handle when the dialog is
		// dismissed with OK
	}
	else if (nResponse == IDCANCEL)
	{
		// TODO: Place code here to handle when the dialog is
		// dismissed with Cancel
	}

	// Since the dialog has been closed, return FALSE so that we exit the
	// application, rather than start the application's message pump.
	return FALSE;
}
4.3 Linear Discriminant Analysis Dialog Box

Header File:

#if !defined(AFX_LDA_H__3F872543_4AED_4A6F_9B73_DA05E37DE6F5__INCLUDED_)
#define AFX_LDA_H__3F872543_4AED_4A6F_9B73_DA05E37DE6F5__INCLUDED_

#if _MSC_VER > 1000

#pragma once

#endif // _MSC_VER > 1000

// lda.h : header file

//

/////////////////////////////////////////////////////////////////////////////

// lda dialog

class lda : public CDialog
{
// Construction
public:

char *ver_groups[20];
int grpcount;

float *freq;

float *testfreq;

	lda()
	{
		grpcount=0;
		m_Filename="";
		m_nos=0;
		m_nof=0;
		m_nog=0;
		m_grp="";
		m_test="";
	}

// Dialog Data

//{{AFX_DATA(lda)

enum { IDD = IDD_DIALOG3 };

CEdit m_testfile;

CEdit m_inputfile;

CEdit m_out;

CButton m_nxtgrp;

CButton m_calc;

CString m_Filename;

int m_nos;
int m_nof;

int m_nog;

CString m_grp;

CString m_test;

CString m_outfile;

int m_output;

//}}AFX_DATA

// Overrides

// ClassWizard generated virtual function overrides

//{{AFX_VIRTUAL(lda)

protected:

	virtual void DoDataExchange(CDataExchange* pDX);	// DDX/DDV support

//}}AFX_VIRTUAL

// Implementation

protected:

// Generated message map functions

//{{AFX_MSG(lda)

afx_msg void OnBack();

afx_msg void OnExit();


afx_msg void OnCalculate();

afx_msg void OnNextGrp();

afx_msg void OnYes();

afx_msg void OnNo();

afx_msg void OnBrowseInput();

afx_msg void OnBrowseTest();

afx_msg void OnViewSampleFile();

//}}AFX_MSG

DECLARE_MESSAGE_MAP()

};

//{{AFX_INSERT_LOCATION}}

// Microsoft Visual C++ will insert additional declarations immediately before the previous line.

#endif // !defined(AFX_LDA_H__3F872543_4AED_4A6F_9B73_DA05E37DE6F5__INCLUDED_)

Implementation File:

// lda.cpp : implementation file

//
#include "stdafx.h"

#include "Project.h"

#include "lda.h"

#include "super.h"

#include "headers.h"

#include "math.h"

#include "graph.h"

#ifdef _DEBUG

#define new DEBUG_NEW

#undef THIS_FILE

static char THIS_FILE[] = __FILE__;

#endif

/////////////////////////////////////////////////////////////////////////////

// lda dialog

// Linked list node holding one group's covariance matrix
struct T
{
	float *data;
	struct T *next;
}*first,*temp,*last,*perm;

//float *features;
void lda::DoDataExchange(CDataExchange* pDX)
{
	CDialog::DoDataExchange(pDX);

//{{AFX_DATA_MAP(lda)

DDX_Control(pDX, IDC_EDIT6, m_testfile);

DDX_Control(pDX, IDC_EDIT1, m_inputfile);

DDX_Control(pDX, IDC_EDIT7, m_out);

DDX_Control(pDX, IDC_BUTTON2, m_nxtgrp);

DDX_Control(pDX, IDC_BUTTON5, m_calc);

DDX_Text(pDX, IDC_EDIT1, m_Filename);

DDX_Text(pDX, IDC_EDIT2, m_nos);

DDX_Text(pDX, IDC_EDIT3, m_nof);

DDX_Text(pDX, IDC_EDIT4, m_nog);

DDX_Text(pDX, IDC_EDIT5, m_grp);

DDX_Text(pDX, IDC_EDIT6, m_test);

DDX_Text(pDX, IDC_EDIT7, m_outfile);

DDX_Radio(pDX, IDC_RADIO1, m_output);

	//}}AFX_DATA_MAP
}

BEGIN_MESSAGE_MAP(lda, CDialog)

//{{AFX_MSG_MAP(lda)

ON_BN_CLICKED(IDC_BUTTON4, OnBack)
ON_BN_CLICKED(IDC_BUTTON6, OnExit)

ON_BN_CLICKED(IDC_BUTTON5, OnCalculate)

ON_BN_CLICKED(IDC_BUTTON2, OnNextGrp)

ON_BN_CLICKED(IDC_RADIO1, OnYes)

ON_BN_CLICKED(IDC_RADIO2, OnNo)

ON_BN_CLICKED(IDC_BUTTON1, OnBrowseInput)

ON_BN_CLICKED(IDC_BUTTON3, OnBrowseTest)

ON_BN_CLICKED(IDC_BUTTON7, OnViewSampleFile)

//}}AFX_MSG_MAP

END_MESSAGE_MAP()

/////////////////////////////////////////////////////////////////////////////

// lda message handlers

void lda::OnBack()
{
	// Return to the Supervised Learning menu.
	super* s=new super;
	s->Create(IDD_DIALOG1);
	s->ShowWindow(SW_SHOW);
	this->ShowWindow(SW_HIDE);
}

void lda::OnExit()
{
	exit(0);
}

void lda::OnCalculate()
{
	UpdateData(true);
	CString str;
	int i,j,k,t,choice=1,groupcount,n,m,l,*group_item_count,flag=1;
	float *temp1,*global_mean,*group_data,*group_mean,*tempptr,sum,*inverse_matrix,*featuretest;
	float *features;
	float *mean_corrected_data,*transpose_data,*covariance_matrix,*final_covariance_matrix;
	float *prior_prob,*tempptr1,*tempptr2,*tempptr3,*test_data_features,*frequency;
	char *groups[15],*tempgrp,*groupname[15],*grouptest[15];
	FILE *fp=NULL,*fp1=NULL,*fp2=NULL;
	int msgbxchoice;
	graph *g=new graph;

	// Validate the test data file: it must exist and contain exactly
	// m_nof positive feature values.
	if(m_Filename!="")
	{
		fp=fopen(m_Filename,"r+");
		fp1=fopen(m_test,"r+");
		featuretest=(float *)malloc(m_nof*sizeof(float));
		if(fp1==NULL)
		{
			MessageBox("Test File Does not exist");
			choice=0;
		}
		else
		{
			for(i=0;i<m_nof;i++)
			{
				fscanf(fp1,"%f",featuretest+i);
				if(*(featuretest+i)<=0)
				{
					MessageBox("Test File Contains Less number of Features than you specified OR an EMPTY FILE","LINEAR DISCRIMINANT ANALYSIS");
					choice=0;
					msgbxchoice=MessageBox("Want to Correct the File?","LINEAR DISCRIMINANT ANALYSIS",MB_OKCANCEL|MB_ICONQUESTION);
					if(msgbxchoice==IDOK)
						ShellExecute(NULL,"open","notepad.exe",m_test,NULL,SW_SHOWNORMAL);
					else
						MessageBox("Correct the File and Execute Again");
					break;
				}
				if(i==m_nof-1)
				{
					if(fgetc(fp1)!=EOF)
					{
						MessageBox("Features in the test file Exceeds","LINEAR DISCRIMINANT ANALYSIS");
						choice=0;
						msgbxchoice=MessageBox("Want to Correct the File?","LINEAR DISCRIMINANT ANALYSIS",MB_OKCANCEL|MB_ICONQUESTION);
						if(msgbxchoice==IDOK)
							ShellExecute(NULL,"open","notepad.exe",m_test,NULL,SW_SHOWNORMAL);
						else
							MessageBox("Correct the File and Execute Again");
						break;
					}
				}
			}
			fclose(fp1);
		}
		free(featuretest);
	}

	// Validate the dialog inputs.
	if(fp==NULL||m_Filename=="")
	{
		MessageBox("Input File Does Not Exist","LINEAR DISCRIMINANT ANALYSIS");
		choice=0;
	}
	else if(grpcount<m_nog)
	{
		MessageBox("Enter the group Value","LINEAR DISCRIMINANT ANALYSIS");
		choice=0;
	}
	else if(m_nos==0)
	{
		MessageBox("Please Enter the number of samples","LINEAR DISCRIMINANT ANALYSIS");
		choice=0;
	}
	else if(m_nof==0)
	{
		MessageBox("Please Enter the number of Features","LINEAR DISCRIMINANT ANALYSIS");
		choice=0;
	}
	else if(m_nog==0)
	{
		MessageBox("Please Enter the number of Groups","LINEAR DISCRIMINANT ANALYSIS");
		choice=0;
	}
	else if(m_Filename=="")
	{
		MessageBox("Please Enter the path of the File name","LINEAR DISCRIMINANT ANALYSIS");
		choice=0;
	}
	else if(m_test=="")
	{
		MessageBox("Please Enter the path of the Test Data File name","LINEAR DISCRIMINANT ANALYSIS");
		choice=0;
	}
	else if(m_output==0 && m_outfile=="")
	{
		MessageBox("Enter the output file name","LINEAR DISCRIMINANT ANALYSIS");
		choice=0;
	}
	else if(m_output!=0 && m_output!=1)
	{
		MessageBox("Select Yes or No","LINEAR DISCRIMINANT ANALYSIS");
		choice=0;
	}
	else if(choice==1)
	{
		// Validate the training file: every feature must be positive and
		// every group name must match one of the names entered by the user.
		featuretest=(float *)malloc(m_nos*m_nof*sizeof(float));
		for(i=0;i<m_nos;i++)
		{
			for(j=0;j<m_nof;j++)
			{
				fscanf(fp,"%f",featuretest+(i*m_nof)+j);
				if(*(featuretest+(i*m_nof)+j)<=0)
					flag=0;
			}
			if(flag==0)
			{
				choice=0;
				break;
			}
			tempgrp=(char *)malloc(15*sizeof(char));
			fscanf(fp,"%s",tempgrp);
			k=0;
			while(tempgrp[k])
			{
				tempgrp[k]=toupper(tempgrp[k]);
				k++;
			}
			choice=0;
			for(t=0;t<m_nog;t++)
			{
				if(strcmp(tempgrp,ver_groups[t])==0)
				{
					choice=1;
					break;
				}
			}
			if(choice==1)
			{
				grouptest[i]=(char *)malloc(16*sizeof(char));
				grouptest[i]=tempgrp;
			}
			else
				break;
			if(i==m_nos-1)
			{
				if(fgetc(fp)!=EOF)
					choice=0;
			}
		}
		if(choice==0)
		{
			MessageBox("Error in the file due to following reasons\n1.Less or More Number of Features than you specified\n2.Group Name May not Match\n3.EMPTY FILE","LINEAR DISCRIMINANT ANALYSIS");
			msgbxchoice=MessageBox("Want to Correct the File?","LINEAR DISCRIMINANT ANALYSIS",MB_OKCANCEL|MB_ICONQUESTION);
			if(msgbxchoice==IDOK)
				ShellExecute(NULL,"open","notepad.exe",m_Filename,NULL,SW_SHOWNORMAL);
			else
				MessageBox("Correct the File and Execute Again");
		}
		free(featuretest);
		fclose(fp);
	}

	if(choice!=0)
	{
		CString str;

		// Opening the files
		fp=fopen(m_Filename,"r+");
		switch(m_output)
		{
		case 0:
			fp2=fopen(m_outfile,"w+");
			break;
		case 1:
			break;
		default:
			break;
		}

		// Getting the features (Step 2)
		features=(float *)malloc(m_nos*m_nof*sizeof(float));
		temp1=features;
		for(i=0;i<m_nos;i++)
		{
			for(j=0;j<m_nof;j++)
				fscanf(fp,"%f",features+(i*m_nof)+j);
			groups[i]=(char *)malloc(16*sizeof(char));
			fscanf(fp,"%s",groups[i]);
		}
		features=temp1;
		fclose(fp);
		lda::OnDisp(features);

		// Calculating the Global Mean (Step 6)
		global_mean=(float *)malloc(m_nof*sizeof(float));
		for(i=0;i<m_nof;i++)
		{
			*(global_mean+i)=sum_of_features(features,m_nos,m_nof,i)/m_nos;
			if(m_output==0)
			{
				if(i==0)
					fprintf(fp2,"%s","----------- Global Mean-----------\n");
				fprintf(fp2,"%f",*(global_mean+i));
				fprintf(fp2,"%s","\t");
			}
		}
		if(m_output==0)
			fprintf(fp2,"%s","\n\n");

		// Categorizing different Groups (Step 4)
		group_data=(float *)malloc(m_nof*m_nos*sizeof(float));
		group_item_count=(int *)malloc(m_nog*sizeof(int));
		groupcount=-1;
		n=0;
		for(i=0;i<m_nos;i++)
		{
			// k counts earlier occurrences of this group name;
			// k==0 means sample i starts a new group
			k=0;
			for(j=0;j<i;j++)
			{
				if(strcmp(groups[j],groups[i])==0)
					k++;
			}
			l=0;
			if(k==0)
			{
				groupcount++;
				groupname[groupcount]=(char *)malloc(16*sizeof(char));
				strcpy(groupname[groupcount],groups[i]);
				if(m_output==0)
				{
					fprintf(fp2,"%s","----------- ");
					fprintf(fp2,"%s",groupname[groupcount]);
					fprintf(fp2,"%s"," -----------\n");
				}
				// Copy every sample of this group into group_data
				for(t=0;t<m_nos;t++)
				{
					m=0;
					if(strcmp(groups[i],groups[t])==0)
					{
						while(m<m_nof)
						{
							*(group_data+(n*m_nof)+m)=*(features+(m_nof*t)+m);
							if(m_output==0)
							{
								fprintf(fp2,"%f",*(group_data+(n*m_nof)+m));
								fprintf(fp2,"%s","\t");
							}
							m++;
						}
						l++;
						n++;
						if(m_output==0)
							fprintf(fp2,"%s","\n");
					}
				}
			}
			if(l!=0)
				*(group_item_count+groupcount)=l;
			if(m_output==0)
				fprintf(fp2,"%s","\n");
		}

		// Calculating Group Mean (Step 5)
		group_mean=(float *)malloc((groupcount+1)*m_nof*sizeof(float));
		tempptr=group_data;
		if(m_output==0)
			fprintf(fp2,"%s","---------Group Mean--------\n");
		for(k=0;k<=groupcount;k++)
		{
			if(m_output==0)
			{
				fprintf(fp2,"%s",groupname[k]);
				fprintf(fp2,"%s","\t");
			}
			for(i=0;i<m_nof;i++)
			{
				*(group_mean+(k*m_nof)+i)=sum_of_features(tempptr,*(group_item_count+k),m_nof,i)/(*(group_item_count+k));
				if(m_output==0)
				{
					fprintf(fp2,"%f",*(group_mean+(k*m_nof)+i));
					fprintf(fp2,"%s","\t");
				}
			}
			if(m_output==0)
				fprintf(fp2,"%s","\n");
			tempptr=tempptr+((*(group_item_count+k))*m_nof);
		}

		// Calculating the Mean Corrected Data (Step 7)
		n=0;
		mean_corrected_data=(float *)malloc(m_nof*m_nos*sizeof(float));
		if(m_output==0)
			fprintf(fp2,"%s","\n\n---------Mean Corrected Data--------\n");
		for(k=0;k<=groupcount;k++)
		{
			if(m_output==0)
			{
				fprintf(fp2,"%s",groupname[k]);
				fprintf(fp2,"%s","\n");
			}
			for(i=0;i<*(group_item_count+k);i++)
			{
				for(j=0;j<m_nof;j++)
				{
					*(mean_corrected_data+(n*m_nof)+j)=*(group_data+(n*m_nof)+j)-(*(global_mean+j));
					if(m_output==0)
					{
						fprintf(fp2,"%f",*(mean_corrected_data+(n*m_nof)+j));
						fprintf(fp2,"%s","\t");
					}
				}
				if(m_output==0)
					fprintf(fp2,"%s","\n");
				n++;
			}
			if(m_output==0)
				fprintf(fp2,"%s","\n");
		}

		// Calculating the Covariance Matrix of each group (Step 8),
		// kept in a linked list of struct T
		tempptr=mean_corrected_data;
		for(k=0;k<=groupcount;k++)
		{
			temp = (struct T *)malloc(sizeof(struct T));
			transpose_data = find_transpose(tempptr,*(group_item_count+k),m_nof);
			covariance_matrix = cal_covar(transpose_data,tempptr,m_nof,*(group_item_count+k));
			temp->data = covariance_matrix;
			temp->next = NULL;
			if(first==NULL)
			{
				first = temp;
				last = temp;
			}
			else
			{
				last->next = temp;
				last = temp;
			}
			tempptr=tempptr+((*(group_item_count+k))*m_nof);
		}
		free(mean_corrected_data);
		if(m_output==0)
		{
			k=0;
			temp=first;
			while(temp!=NULL)
			{
				fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
				fprintf(fp2,"%s","                COVARIANCE MATRIX FOR GROUP ");
				fprintf(fp2,"%s",groupname[k++]);
				fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
				tempptr=temp->data;
				for(i=0;i<m_nof;i++)
				{
					for(j=0;j<m_nof;j++)
					{
						fprintf(fp2,"%f",*tempptr++);
						fprintf(fp2,"%s","\t");
					}
					fprintf(fp2,"%s","\n");
				}
				temp = temp->next;
			}
		}

		// Calculating the Final (pooled) Covariance Matrix (Step 9)
		final_covariance_matrix=(float *)malloc(m_nof*m_nof*sizeof(float));
		temp=first;
		n=0;
		if(m_output==0)
		{
			fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
			fprintf(fp2,"%s","                    FINAL COVARIANCE MATRIX\n");
			fprintf(fp2,"%s","-------------------------------------------------------------------------------\n");
		}
		for(i=0;i<m_nof;i++)
		{
			for(j=0;j<m_nof;j++)
			{
				// Entry (i,j) is the sample-size weighted average of
				// the corresponding group covariance entries
				sum=0.0;
				t=0;
				while(temp!=NULL)
				{
					tempptr=temp->data;
					for(k=0;k<n;k++)
						tempptr++;
					sum+=(*(group_item_count+t)*(*tempptr));
					temp=temp->next;
					t++;
				}
				n++;
				temp=first;
				*(final_covariance_matrix+(i*m_nof)+j)=(sum/m_nos);
				if(m_output==0)
				{
					fprintf(fp2,"%f",*(final_covariance_matrix+(i*m_nof)+j));
					fprintf(fp2,"%s","\t");
				}
			}
			if(m_output==0)
				fprintf(fp2,"%s","\n");
		}
		free(first);

		// Calculating the Inverse Matrix
		inverse_matrix=cal_inverse(final_covariance_matrix,m_nof);
		if(m_output==0)
		{
			fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
			fprintf(fp2,"%s","                         INVERSE MATRIX\n");
			fprintf(fp2,"%s","-------------------------------------------------------------------------------\n");
			for(i=0;i<m_nof;i++)
			{
				for(j=0;j<m_nof;j++)
				{
					fprintf(fp2,"%f",*(inverse_matrix+(i*m_nof)+j));
					fprintf(fp2,"%s","\t");
				}
				fprintf(fp2,"%s","\n");
			}
		}
		free(final_covariance_matrix);

		// Computing the prior probabilities (Step 10)
		prior_prob=(float *)malloc((groupcount+1)*sizeof(float));
		if(m_output==0)
		{
			fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
			fprintf(fp2,"%s","                       PRIOR PROBABILITIES\n");
			fprintf(fp2,"%s","-------------------------------------------------------------------------------\n");
		}
		for(i=0;i<=groupcount;i++)
		{
			*(prior_prob+i)=(*(group_item_count+i)*1.0)/m_nos;
			if(m_output==0)
			{
				fprintf(fp2,"%f",*(prior_prob+i));
				fprintf(fp2,"%s","\t");
			}
		}

		if(m_nof==2)
			freq=(float *)malloc((2*m_nos)*sizeof(float));

		// Calculating the Frequencies, i.e. the discriminant values (Step 11)
		if(m_output==0)
		{
			fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
			fprintf(fp2,"%s","                           Frequencies\n");
			fprintf(fp2,"%s","-------------------------------------------------------------------------------\n");
		}
		for(i=0;i<=groupcount;i++)
		{
			if(m_output==0)
			{
				fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
				fprintf(fp2,"%s","                ");
				fprintf(fp2,"%s",groupname[i]);
				fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
			}
			tempptr=mat_multi(group_mean+(i*m_nof),inverse_matrix,1,m_nof,m_nof);
			tempptr1=find_transpose(group_mean+(i*m_nof),1,m_nof);
			tempptr2=mat_multi(tempptr,tempptr1,1,m_nof,1);
			for(j=0;j<m_nos;j++)
			{
				tempptr1=find_transpose(features+(j*m_nof),1,m_nof);
				tempptr3=mat_multi(tempptr,tempptr1,1,m_nof,1);
				sum=*tempptr3-(*tempptr2*0.5)+(2.303*log(*(prior_prob+i)));
				if(m_nof==2)
					*(freq+(i*m_nos)+j)=sum;
				if(m_output==0)
				{
					fprintf(fp2,"%s","Frequency of Object ");
					fprintf(fp2,"%d",j);
					fprintf(fp2,"%s","  ");
					fprintf(fp2,"%f",sum);
					fprintf(fp2,"%s","\n");
				}
			}
		}

		// Testing a new data item (Step 12)
		if(m_nof==2)
			testfreq=(float *)malloc(2*sizeof(float));
		fp1=fopen(m_test,"r+");
		test_data_features=(float *)malloc(m_nof*sizeof(float));
		for(i=0;i<m_nof;i++)
			fscanf(fp1,"%f",test_data_features+i);
		if(m_output==0)
			fprintf(fp2,"%s","\n\n");
		frequency=(float *)malloc((groupcount+1)*sizeof(float));
		for(i=0;i<=groupcount;i++)
		{
			tempptr=mat_multi(group_mean+(i*m_nof),inverse_matrix,1,m_nof,m_nof);
			tempptr1=find_transpose(group_mean+(i*m_nof),1,m_nof);
			tempptr2=mat_multi(tempptr,tempptr1,1,m_nof,1);
			tempptr1=find_transpose(test_data_features,1,m_nof);
			tempptr3=mat_multi(tempptr,tempptr1,1,m_nof,1);
			*(frequency+i)=*tempptr3-(*tempptr2*0.5)+(2.303*log(*(prior_prob+i)));
			if(m_nof==2)
				*(testfreq+i)=*(frequency+i);
			if(m_output==0)
			{
				fprintf(fp2,"%s","Frequency of Object ");
				fprintf(fp2,"%d",i);
				fprintf(fp2,"%s","  ");
				fprintf(fp2,"%f",*(frequency+i));
				fprintf(fp2,"%s","\n");
			}
		}

		// Decision Making: the test object belongs to the group
		// with the maximum discriminant value
		sum=*frequency;
		t=0;
		for(i=0;i<=groupcount;i++)
		{
			if(sum<=*(frequency+i))
			{
				sum=*(frequency+i);
				t=i;
			}
		}
		str.Format("%s",groupname[t]);
		MessageBox("The Object Belongs to the group "+str,"LINEAR DISCRIMINANT ANALYSIS");
		if(m_output==0)
		{
			fprintf(fp2,"%s","\n\n");
			fprintf(fp2,"%s","The Object Belongs to the group ");
			fprintf(fp2,"%s",groupname[t]);
			fclose(fp2);
		}

		// For two features the frequencies can be plotted
		if(m_nof==2)
		{
			g->freq=freq;
			g->m_nof=m_nof;
			g->m_nos=m_nos;
			g->testfreq=testfreq;
			g->Create(IDD_DIALOG5);
			g->ShowWindow(SW_SHOW);
		}
		grpcount=0;
		m_nxtgrp.EnableWindow(true);
	}
	else
	{
		//MessageBox("Correct the Errors and then click Calculate","LINEAR DISCRIMINANT ANALYSIS");
		grpcount=0;
		m_nxtgrp.EnableWindow(true);
	}
}

void lda::OnNextGrp()
{
	// Store the next group name entered by the user, upper-cased so that
	// comparisons with the training file are case-insensitive.
	UpdateData(true);
	if(grpcount<m_nog&&m_grp!="")
	{
		int k;
		ver_groups[grpcount]=(char *)malloc(16*sizeof(char));
		strcpy(ver_groups[grpcount],m_grp);
		k=0;
		while(ver_groups[grpcount][k])
		{
			ver_groups[grpcount][k]=toupper(ver_groups[grpcount][k]);
			k++;
		}
		m_grp="";
		grpcount++;
		if(grpcount==m_nog)
		{
			MessageBox("You Have successfully Entered the group value","LINEAR DISCRIMINANT ANALYSIS");
			m_nxtgrp.EnableWindow(false);
		}
	}
	else
	{
		MessageBox("Group Value Should not be empty","LINEAR DISCRIMINANT ANALYSIS");
	}
	UpdateData(false);
}

void lda::OnYes()
{
	// Enable the output file edit box
	m_out.EnableWindow(true);
}

void lda::OnNo()
{
	m_out.EnableWindow(false);
}

void lda::OnBrowseInput()
{
	// Let the user pick the training data file
	UpdateData(true);
	CFileDialog FileDialog(TRUE,"*.*",NULL,OFN_HIDEREADONLY,"Text Files: (*.txt)|*.txt||");
	if(FileDialog.DoModal() == IDOK)
	{
		CString PathName = FileDialog.GetPathName();
		m_inputfile.SetWindowText(PathName);
	}
}

void lda::OnBrowseTest()
{
	// Let the user pick the test data file
	UpdateData(true);
	CFileDialog FileDialog(TRUE,"*.*",NULL,OFN_HIDEREADONLY,"Text Files: (*.txt)|*.txt||");
	if(FileDialog.DoModal() == IDOK)
	{
		CString PathName = FileDialog.GetPathName();
		m_testfile.SetWindowText(PathName);
	}
}

void lda::OnViewSampleFile()
{
	// Show the sample input file in Notepad
	ShellExecute(NULL,"open","notepad.exe","input.txt",NULL,SW_SHOWNORMAL);
}
