INTRODUCTION
1) Supervised Learning
2) Unsupervised Learning
2.1 Introduction
If we can assume that the groups are linearly separable, we can use a linear
discriminant analysis (LDA) model.
2.2 Purpose
Estimating the class distributions empirically would need a great deal of data in
order to obtain the relative frequencies of each group for each measurement. It is
more practical to assume a distribution and obtain the probability theoretically. If
we assume that each group has a multivariate Normal distribution and that all
groups share the same covariance matrix, we obtain what is called the Linear
Discriminant Analysis formula:

f_i = mu_i C^(-1) x^T - 0.5 mu_i C^(-1) mu_i^T + ln(p_i)

where mu_i is the mean vector of group i, C is the pooled within-group covariance
matrix, x is the feature vector being classified, and p_i is the prior probability of
group i.
2.4 Algorithm
The prior probability, p_i, is equal to the number of samples in each group divided
by the total number of samples, that is p_i = n_i / N.
Step 11: Compute the frequency, f_i, for each item in the training set and finally
compute the frequency for the test set using the discriminant function.
Factory “ABC” produces very expensive, high-quality chip rings whose quality is
measured in terms of curvature and diameter. The results of quality control by the
experts are given in the table below.
As a consultant to the factory, you are given the task of setting up the criteria for
automatic quality control. The manager of the factory then also wants to test your
criteria on a new type of chip ring about which even the human experts argue with
one another. The new chip rings have curvature 2.81 and diameter 5.46.
When we plot the features, we can see that the data are linearly separable: we can
draw a line to separate the two groups. The problem is to find that line, and to
rotate the features in such a way as to maximize the distance between the groups
and minimize the distance within each group.
x = features (or independent variables) of all data. Each row (denoted by x_i)
represents one object; each column stands for one feature.
y = group of the object (or dependent variable) of all data. Each row represents one
object and it has only one column.
x_k = features data for group k. Each row represents one object; each column stands
for one feature. We separate x into several groups based on the number of categories
in y.
mu = global mean vector, that is, the mean of each feature over all data; mu_k is the
mean vector of group k.
x_k^0 = mean corrected data, that is, the features data for group k, x_k, minus the
global mean vector mu.
C = pooled within group covariance matrix. It is calculated from the covariance
matrices C_k of the individual groups, weighted by their sample counts n_k, therefore
C = (n_1 C_1 + n_2 C_2 + ...) / N.
Discriminant function:
f_i = mu_i C^(-1) x^T - 0.5 mu_i C^(-1) mu_i^T + ln(p_i)
2.6 Introduction
The area of Neural Networks probably lies on the borderline between Artificial
Intelligence and Approximation Algorithms; think of them as algorithms for smart
approximation. Neural nets are used as universal approximation tools mapping input
to output, tools capable of learning from their environment, tools for finding
non-evident dependencies between data, and so on.
Neural networking algorithms are modeled after the brain and the way it processes
information. The brain is a very efficient tool: despite having a response time about
100,000 times slower than computer chips, it beats the computer at complex tasks,
such as image and sound recognition, motion control and so on. It is also about
10,000,000,000 times more efficient than the computer chip in terms of energy
consumption per operation.
The brain is a multi-layer structure, with 6-7 layers of neurons and about 10^11
neurons in total, that works as a parallel computer capable of learning from the
feedback it receives from the world and of changing its design by growing new links
between neurons or altering the activities of existing ones. To make the picture a bit
more complete, let us also mention that a typical neuron is connected to 50-100 other
neurons, and sometimes to itself, too.
Example of a neural net with inputs and output.
The Back Propagation Algorithm for training the weights in a multilayer net
uses the steepest-descent minimization procedure and the sigmoid function. It
consists of two main steps: a feed-forward step, in which the outputs of the nodes
are computed starting at layer 1 and working forward to the output layer k, and a
back-propagation step, in which the weights are updated in an attempt to get better
agreement between the observed outputs and the desired outputs.
The neural net receives the inputs, which can be a pattern of some kind; in the
case of image recognition, for example, they would be pixels from a photosensitive
matrix of some kind.
After a neuron in the first layer receives its input, it applies the linear
combiner and the activation function to the inputs and produces the output. This
output, as you can see from the picture, becomes the input for the neurons in the
next layer. That layer then feeds the data forward to the next layer, and so on,
until the last layer is reached.
When we work with the output of the last layer, we know not only that output
but also the output we are trying to produce, called the desired output of the
neural net. Comparing the two values, we can compute the error:
dError = dDesiredOutput - dOutput. Now we can adjust this particular neuron to
work better with this particular input. As the neural net learns, the errors will
decrease, and we will have to adjust the weights in every layer.
But the input nodes of the last layer are the output nodes of the previous layer.
So we have the actual output of the previous layer and its desired output, and we
can adjust the previous layer of the net, and so on, until we reach the first layer.
2.7 Algorithm
Step 1: Initialize the weights wij(k) to small random values, and choose a
positive learning constant c.
Step 4.1: For the nodes in the output layer, j = 1, ..., Mk, compute the error term
dj = (desiredj - outputj) * outputj * (1 - outputj).
Repeat the process until the total error no longer decreases significantly.
2.8 Example
Inputs: 0.8, 0.4; Input B = 0.9, 0.9, 0.6
(iii) Result
The old error was -0.19; the new error is -0.18205. Therefore the error has been reduced.
CHAPTER 3
DEVELOPMENT ENVIRONMENT
Processor : Pentium IV
RAM : 512 MB
Monitor : 15” Color
Hard Disk : 20 GB
SYSTEM IMPLEMENTATION
Header File
//
#if !defined(AFX_PROJECTDLG_H__BF56921F_0F09_4C8A_BA63_8EA9856308A5__INCLUDED_)
#define AFX_PROJECTDLG_H__BF56921F_0F09_4C8A_BA63_8EA9856308A5__INCLUDED_

#pragma once
/////////////////////////////////////////////////////////////////////////////
// CProjectDlg dialog
class CProjectDlg : public CDialog
{
// Construction
public:
	CProjectDlg(CWnd* pParent = NULL);	// standard constructor

// Dialog Data
	//{{AFX_DATA(CProjectDlg)
	int	m_ch;
	//}}AFX_DATA

	// ClassWizard generated virtual function overrides
	//{{AFX_VIRTUAL(CProjectDlg)
	protected:
	virtual void DoDataExchange(CDataExchange* pDX);	// DDX/DDV support
	//}}AFX_VIRTUAL

// Implementation
protected:
	HICON m_hIcon;

	// Generated message map functions
	//{{AFX_MSG(CProjectDlg)
	virtual BOOL OnInitDialog();
	afx_msg void OnSysCommand(UINT nID, LPARAM lParam);
	afx_msg void OnPaint();
	afx_msg HCURSOR OnQueryDragIcon();
	afx_msg void OnOk();
	afx_msg void OnCancel();
	//}}AFX_MSG
	DECLARE_MESSAGE_MAP()
};
//{{AFX_INSERT_LOCATION}}
// Microsoft Visual C++ will insert additional declarations immediately before the previous line.

#endif // !defined(AFX_PROJECTDLG_H__BF56921F_0F09_4C8A_BA63_8EA9856308A5__INCLUDED_)
Implementation Code:
//
#include "stdafx.h"
#include "Project.h"
#include "ProjectDlg.h"
#include "super.h"
#include "unsuper.h"
#ifdef _DEBUG
#undef THIS_FILE
#endif
/////////////////////////////////////////////////////////////////////////////
// CAboutDlg dialog used for App About

class CAboutDlg : public CDialog
{
public:
	CAboutDlg();

// Dialog Data
	//{{AFX_DATA(CAboutDlg)
	//}}AFX_DATA

	// ClassWizard generated virtual function overrides
	//{{AFX_VIRTUAL(CAboutDlg)
	protected:
	virtual void DoDataExchange(CDataExchange* pDX);	// DDX/DDV support
	//}}AFX_VIRTUAL

// Implementation
protected:
	//{{AFX_MSG(CAboutDlg)
	//}}AFX_MSG
	DECLARE_MESSAGE_MAP()
};
CAboutDlg::CAboutDlg() : CDialog(CAboutDlg::IDD)
{
	//{{AFX_DATA_INIT(CAboutDlg)
	//}}AFX_DATA_INIT
}

void CAboutDlg::DoDataExchange(CDataExchange* pDX)
{
	CDialog::DoDataExchange(pDX);
	//{{AFX_DATA_MAP(CAboutDlg)
	//}}AFX_DATA_MAP
}
BEGIN_MESSAGE_MAP(CAboutDlg, CDialog)
//{{AFX_MSG_MAP(CAboutDlg)
// No message handlers
//}}AFX_MSG_MAP
END_MESSAGE_MAP()
/////////////////////////////////////////////////////////////////////////////
// CProjectDlg dialog

CProjectDlg::CProjectDlg(CWnd* pParent /*=NULL*/)
	: CDialog(CProjectDlg::IDD, pParent)
{
	//{{AFX_DATA_INIT(CProjectDlg)
	m_ch = -1;
	//}}AFX_DATA_INIT
	// Note that LoadIcon does not require a subsequent DestroyIcon in Win32
	m_hIcon = AfxGetApp()->LoadIcon(IDR_MAINFRAME);
}

void CProjectDlg::DoDataExchange(CDataExchange* pDX)
{
	CDialog::DoDataExchange(pDX);
	//{{AFX_DATA_MAP(CProjectDlg)
	//}}AFX_DATA_MAP
}
BEGIN_MESSAGE_MAP(CProjectDlg, CDialog)
//{{AFX_MSG_MAP(CProjectDlg)
ON_WM_SYSCOMMAND()
ON_WM_PAINT()
ON_WM_QUERYDRAGICON()
ON_BN_CLICKED(IDC_BUTTON1, OnOk)
ON_BN_CLICKED(IDC_BUTTON2, OnCancel)
//}}AFX_MSG_MAP
END_MESSAGE_MAP()
/////////////////////////////////////////////////////////////////////////////
BOOL CProjectDlg::OnInitDialog()
{
	CDialog::OnInitDialog();
	CMenu* pSysMenu = GetSystemMenu(FALSE);
	if (pSysMenu != NULL)
	{
		CString strAboutMenu;
		strAboutMenu.LoadString(IDS_ABOUTBOX);
		if (!strAboutMenu.IsEmpty())
		{
			pSysMenu->AppendMenu(MF_SEPARATOR);
			pSysMenu->AppendMenu(MF_STRING, IDM_ABOUTBOX, strAboutMenu);
		}
	}
	// Set the icon for this dialog. The framework does this automatically
	SetIcon(m_hIcon, TRUE);
	return TRUE;  // return TRUE unless you set the focus to a control
}

void CProjectDlg::OnSysCommand(UINT nID, LPARAM lParam)
{
	if ((nID & 0xFFF0) == IDM_ABOUTBOX)
	{
		CAboutDlg dlgAbout;
		dlgAbout.DoModal();
	}
	else
	{
		CDialog::OnSysCommand(nID, lParam);
	}
}
// If you add a minimize button to your dialog, you will need the code below
// to draw the icon. For MFC applications using the document/view model,
void CProjectDlg::OnPaint()
{
	if (IsIconic())
	{
		CPaintDC dc(this); // device context for painting
		SendMessage(WM_ICONERASEBKGND, (WPARAM) dc.GetSafeHdc(), 0);
		// Center the icon in the client rectangle
		int cxIcon = GetSystemMetrics(SM_CXICON);
		int cyIcon = GetSystemMetrics(SM_CYICON);
		CRect rect;
		GetClientRect(&rect);
		int x = (rect.Width() - cxIcon + 1) / 2;
		int y = (rect.Height() - cyIcon + 1) / 2;
		dc.DrawIcon(x, y, m_hIcon);
	}
	else
	{
		CDialog::OnPaint();
	}
}
// The system calls this to obtain the cursor to display while the user drags
// the minimized window.
HCURSOR CProjectDlg::OnQueryDragIcon()
{
	return (HCURSOR) m_hIcon;
}

void CProjectDlg::OnOk()
{
	UpdateData(true);
	switch(m_ch)
	{
	case 0:	// open the supervised-learning dialog (s points to the super dialog)
		s->Create(IDD_DIALOG1);
		s->ShowWindow(SW_SHOW);
		this->ShowWindow(SW_HIDE);
		break;
	case 1:	// open the unsupervised-learning dialog (u points to the unsuper dialog)
		u->Create(IDD_DIALOG2);
		u->ShowWindow(SW_SHOW);
		this->ShowWindow(SW_HIDE);
		break;
	default:
		break;
	}
}

void CProjectDlg::OnCancel()
{
	exit(0);
}
4.2 Mapping Resources
Header File:
//
#if !defined(AFX_PROJECT_H__9334AE5F_4A2E_4CF6_A7DF_ABD1B9CD2BC5__INCLUDED_)
#define AFX_PROJECT_H__9334AE5F_4A2E_4CF6_A7DF_ABD1B9CD2BC5__INCLUDED_

#pragma once

#ifndef __AFXWIN_H__
	#error include 'stdafx.h' before including this file for PCH
#endif
// CProjectApp:
//
class CProjectApp : public CWinApp
{
public:
	CProjectApp();

// Overrides
	// ClassWizard generated virtual function overrides
	//{{AFX_VIRTUAL(CProjectApp)
	public:
	virtual BOOL InitInstance();
	//}}AFX_VIRTUAL

// Implementation
	//{{AFX_MSG(CProjectApp)
	// NOTE - the ClassWizard will add and remove member functions here.
	//}}AFX_MSG
	DECLARE_MESSAGE_MAP()
};
/////////////////////////////////////////////////////////////////////////////
//{{AFX_INSERT_LOCATION}}
// Microsoft Visual C++ will insert additional declarations immediately before the previous line.

#endif // !defined(AFX_PROJECT_H__9334AE5F_4A2E_4CF6_A7DF_ABD1B9CD2BC5__INCLUDED_)
Implementation File
//
#include "stdafx.h"
#include "Project.h"
#include "ProjectDlg.h"
#ifdef _DEBUG
#define new DEBUG_NEW
#undef THIS_FILE
#endif
/////////////////////////////////////////////////////////////////////////////
// CProjectApp
BEGIN_MESSAGE_MAP(CProjectApp, CWinApp)
	//{{AFX_MSG_MAP(CProjectApp)
	// NOTE - the ClassWizard will add and remove mapping macros here.
	//}}AFX_MSG_MAP
	ON_COMMAND(ID_HELP, CWinApp::OnHelp)
END_MESSAGE_MAP()
/////////////////////////////////////////////////////////////////////////////
// CProjectApp construction

CProjectApp::CProjectApp()
{
}

/////////////////////////////////////////////////////////////////////////////
CProjectApp theApp;
/////////////////////////////////////////////////////////////////////////////
// CProjectApp initialization
BOOL CProjectApp::InitInstance()
{
	// Standard initialization
	// If you are not using these features and wish to reduce the size
	// of your final executable, remove the initialization routines you do not need.
#ifdef _AFXDLL
	Enable3dControls();			// Call this when using MFC in a shared DLL
#else
	Enable3dControlsStatic();	// Call this when linking to MFC statically
#endif
	CProjectDlg dlg;
	m_pMainWnd = &dlg;
	int nResponse = dlg.DoModal();
	if (nResponse == IDOK)
	{
		// dismissed with OK
	}
	// Since the dialog has been closed, return FALSE so that we exit the
	// application, rather than start the application's message pump.
	return FALSE;
}
4.3 Linear Discriminant Analysis Dialog Box
Header File:
#if !defined(AFX_LDA_H__3F872543_4AED_4A6F_9B73_DA05E37DE6F5__INCLUDED_)
#define AFX_LDA_H__3F872543_4AED_4A6F_9B73_DA05E37DE6F5__INCLUDED_

#pragma once
//
/////////////////////////////////////////////////////////////////////////////
// lda dialog

class lda : public CDialog
{
// Construction
public:
	char *ver_groups[20];
	int grpcount;
	float *freq;
	float *testfreq;
	lda()
	{
		grpcount=0;
		m_Filename="";
		m_nos=0;
		m_nof=0;
		m_nog=0;
		m_grp="";
		m_test="";
	}
	void OnDisp(float *features);

// Dialog Data
	//{{AFX_DATA(lda)
	CEdit	m_testfile;
	CEdit	m_inputfile;
	CEdit	m_out;
	CButton	m_nxtgrp;
	CButton	m_calc;
	CString	m_Filename;
	int	m_nos;
	int	m_nof;
	int	m_nog;
	CString	m_grp;
	CString	m_test;
	CString	m_outfile;
	int	m_output;
	//}}AFX_DATA

// Overrides
	// ClassWizard generated virtual function overrides
	//{{AFX_VIRTUAL(lda)
	protected:
	virtual void DoDataExchange(CDataExchange* pDX);	// DDX/DDV support
	//}}AFX_VIRTUAL

// Implementation
protected:
	//{{AFX_MSG(lda)
	afx_msg void OnBack();
	afx_msg void OnExit();
	afx_msg void OnCalculate();
	afx_msg void OnNextGrp();
	afx_msg void OnYes();
	afx_msg void OnNo();
	afx_msg void OnBrowseInput();
	afx_msg void OnBrowseTest();
	afx_msg void OnViewSampleFile();
	//}}AFX_MSG
	DECLARE_MESSAGE_MAP()
};
//{{AFX_INSERT_LOCATION}}
// Microsoft Visual C++ will insert additional declarations immediately before the previous line.

#endif // !defined(AFX_LDA_H__3F872543_4AED_4A6F_9B73_DA05E37DE6F5__INCLUDED_)
Implementation File:
//
#include "stdafx.h"
#include "Project.h"
#include "lda.h"
#include "super.h"
#include "headers.h"
#include "math.h"
#include "graph.h"
#ifdef _DEBUG
#undef THIS_FILE
#endif
/////////////////////////////////////////////////////////////////////////////
// lda dialog
struct T
{
	float *data;
	struct T *next;
} *first, *temp, *last, *perm;
//float *features;
void lda::DoDataExchange(CDataExchange* pDX)
{
	CDialog::DoDataExchange(pDX);
	//{{AFX_DATA_MAP(lda)
	//}}AFX_DATA_MAP
}
BEGIN_MESSAGE_MAP(lda, CDialog)
//{{AFX_MSG_MAP(lda)
ON_BN_CLICKED(IDC_BUTTON4, OnBack)
ON_BN_CLICKED(IDC_BUTTON6, OnExit)
ON_BN_CLICKED(IDC_BUTTON5, OnCalculate)
ON_BN_CLICKED(IDC_BUTTON2, OnNextGrp)
ON_BN_CLICKED(IDC_RADIO1, OnYes)
ON_BN_CLICKED(IDC_RADIO2, OnNo)
ON_BN_CLICKED(IDC_BUTTON1, OnBrowseInput)
ON_BN_CLICKED(IDC_BUTTON3, OnBrowseTest)
ON_BN_CLICKED(IDC_BUTTON7, OnViewSampleFile)
//}}AFX_MSG_MAP
END_MESSAGE_MAP()
/////////////////////////////////////////////////////////////////////////////
void lda::OnBack()
{
	s->Create(IDD_DIALOG1);
	s->ShowWindow(SW_SHOW);
	this->ShowWindow(SW_HIDE);
}

void lda::OnExit()
{
	exit(0);
}
void lda::OnCalculate()
{
	UpdateData(true);
	CString str;
	int i,j,k,t,choice=1,groupcount,n,m,l,*group_item_count,flag=1;
	float *temp1,*global_mean,*group_data,*group_mean,*tempptr,sum,*inverse_matrix,*featuretest;
	float *features;
	float *mean_corrected_data,*transpose_data,*covariance_matrix,*final_covariance_matrix;
	float *prior_prob,*tempptr1,*tempptr2,*tempptr3,*test_data_features,*frequency;
	char *groups[15],*tempgrp,*groupname[15],*grouptest[15];
	FILE *fp,*fp1,*fp2;
	int msgbxchoice;
fp=fopen(m_Filename,"r+");
fp1=fopen(m_test,"r+");
featuretest=(float *)malloc(m_nof*sizeof(float));
if(fp1==NULL)
choice=0;
for(i=0;i<m_nof;i++)
fscanf(fp1,"%f",&(*(featuretest+i)));
if(*(featuretest+i)<=0)
ANALYSIS");
choice=0;
MB_ICONQUESTION);
if(msgbxchoice==IDOK)
ShellExecute(NULL,"open","notepad.exe",m_test,NULL,SW_SHOWNORMAL);
else
break;
fclose(fp1);
if(i==m_nof-1)
if(fgetc(fp1)!=EOF)
choice=0;
MB_ICONQUESTION);
if(msgbxchoice==IDOK)
ShellExecute(NULL,"open","notepad.exe",m_test,NULL,SW_SHOWNORMAL);
else
Again");
break;
free(featuretest);
if(fp==NULL||m_Filename=="")
DISCRIMINANT ANALYSIS");
choice=0;
else if(grpcount<m_nog)
ANALYSIS");
choice=0;
else if(m_nos==0)
DISCRIMINANT ANALYSIS");
choice=0;
else if(m_nof==0)
{
DISCRIMINANT ANALYSIS");
choice=0;
else if(m_nog==0)
DISCRIMINANT ANALYSIS");
choice=0;
else if(m_Filename=="")
DISCRIMINANT ANALYSIS");
choice=0;
else if(m_test=="")
choice=0;
DISCRIMINANT ANALYSIS");
choice=0;
ANALYSIS");
choice=0;
else if(choice==1)
featuretest=(float *)malloc(m_nos*m_nof*sizeof(float));
for(i=0;i<m_nos;i++)
for(j=0;j<m_nof;j++)
fscanf(fp,"%f",&(*(featuretest+(i*m_nof)+j)));
/*str.Format("%f",*(featuretest+(i*m_nof)+j));
MessageBox(str);*/
if(*(featuretest+(i*m_nof)+j)<=0)
flag=0;
}
choice=0;
if(flag==0)
choice=0;
break;
tempgrp=(char *)malloc(15*sizeof(char));
fscanf(fp,"%s",tempgrp);
k=0;
while(tempgrp[k])
tempgrp[k]=toupper(tempgrp[k]);
k++;
for(t=0;t<m_nog;t++)
if(strcmp(tempgrp,ver_groups[t])==0)
choice=1;
break;
}
if(choice==1)
grouptest[i]=(char *)malloc(16*sizeof(char));
strcpy(grouptest[i],tempgrp);
else
break;
/*str.Format("%d",i);
MessageBox(str);*/
if(i==m_nos-1)
if(fgetc(fp)!=EOF)
choice=0;
if(choice==0)
MB_ICONQUESTION);
if(msgbxchoice==IDOK)
ShellExecute(NULL,"open","notepad.exe",m_Filename,NULL,SW_SHOWNORMAL);
else
free(featuretest);
fclose(fp);
if(choice!=0)
CString str;
fp=fopen(m_Filename,"r+");
switch(m_output)
case 0:
fp2=fopen(m_outfile,"w+");
break;
case 1:
break;
default:
break;
features=(float *)malloc(m_nos*m_nof*sizeof(float));
temp1=features;
str.Format("%u",features);
MessageBox(str);
for(i=0;i<m_nos;i++)
for(j=0;j<m_nof;j++)
fscanf(fp,"%f",&(*(features+(i*m_nof)+j)));
/*str.Format("%f",*(features+(i*m_nof)+j));
MessageBox("j = "+str);*/
groups[i]=(char *)malloc(16*sizeof(char));
fscanf(fp,"%s",groups[i]);
/*str.Format("%s",groups[i]);
MessageBox("grp = "+str);*/
/*str.Format("%u",temp1);
MessageBox(str);*/
features=temp1;
fclose(fp);
lda::OnDisp(features);
global_mean=(float *)malloc(m_nof*sizeof(float));
for(i=0;i<m_nof;i++)
*(global_mean+i)=sum_of_features(features,m_nos,m_nof,i)/m_nos;
/*str.Format("%f",sum_of_features(features,m_nos,m_nof,i));
MessageBox(str);*/
if(m_output==0)
if(i==0)
fprintf(fp2,"%s","----------- Global Mean -----------\n");
fprintf(fp2,"%f",*(global_mean+i));
fprintf(fp2,"%s","\t");
/*str.Format("%f",*(global_mean+i));
MessageBox(str);*/
MessageBox("sssssssssssssss");
if(m_output==0)
fprintf(fp2,"%s","\n\n");
group_data=(float *)malloc(m_nof*m_nos*sizeof(float));
group_item_count=(int *)malloc(m_nog*sizeof(int));
groupcount=-1;
n=0;
for(i=0;i<m_nos;i++)
k=0;
for(j=0;j<i;j++)
if(strcmp(groups[j],groups[i])==0)
k++;
}
l=0;
if(k==0)
groupcount++;
groupname[groupcount]=(char *)malloc(16*sizeof(char));
strcpy(groupname[groupcount],groups[i]);
if(m_output==0)
fprintf(fp2,"%s","----------- ");
fprintf(fp2,"%s",groupname[groupcount]);
fprintf(fp2,"%s"," -----------\n");
for(t=0;t<m_nos;t++)
m=0;
if(strcmp(groups[i],groups[t])==0)
while(m<m_nof)
*(group_data+(n*m_nof)+m)=*(features+(m_nof*t)+m);
if(m_output==0)
fprintf(fp2,"%f",*(group_data+(n*m_nof)+m));
fprintf(fp2,"%s","\t");
m++;
l++;
n++;
if(m_output==0)
fprintf(fp2,"%s","\n");
if(l!=0)
*(group_item_count+groupcount)=l;
if(m_output==0)
fprintf(fp2,"%s","\n");
}
//Calculating Group Mean
group_mean=(float *)malloc((groupcount+1)*m_nof*sizeof(float));
tempptr=group_data;
if(m_output==0)
fprintf(fp2,"%s","---------Group Mean--------\n");
for(k=0;k<=groupcount;k++)
if(m_output==0)
fprintf(fp2,"%s",groupname[k]);
fprintf(fp2,"%s","\t");
for(i=0;i<m_nof;i++)
*(group_mean+(k*m_nof)+i)=(sum_of_features(tempptr,*(group_item_count+k),m_nof,i)/(*(group_item_count+k)));
if(m_output==0)
fprintf(fp2,"%f",*(group_mean+(k*m_nof)+i));
fprintf(fp2,"%s","\t");
if(m_output==0)
fprintf(fp2,"%s","\n");
tempptr=tempptr+((*(group_item_count+k))*m_nof);
n=0;
mean_corrected_data=(float *)malloc(m_nof*m_nos*sizeof(float));
if(m_output==0)
for(k=0;k<=groupcount;k++)
if(m_output==0)
fprintf(fp2,"%s",groupname[k]);
fprintf(fp2,"%s","\n");
for(i=0;i<*(group_item_count+k);i++)
for(j=0;j<m_nof;j++)
*(mean_corrected_data+(n*m_nof)+j)=*(group_data+(n*m_nof)+j)-(*(global_mean+j));
if(m_output==0)
{
fprintf(fp2,"%f",*(mean_corrected_data+(n*m_nof)+j));
fprintf(fp2,"%s","\t");
if(m_output==0)
fprintf(fp2,"%s","\n");
n++;
if(m_output==0)
fprintf(fp2,"%s","\n");
tempptr=mean_corrected_data;
for(k=0;k<=groupcount;k++)
transpose_data = find_transpose(tempptr,*(group_item_count+k),m_nof);
covariance_matrix = cal_covar(transpose_data,tempptr,m_nof,*(group_item_count+k));
temp=(struct T *)malloc(sizeof(struct T));
temp->data = covariance_matrix;
temp->next = NULL;
if(first==NULL)
first = temp;
last = temp;
else
last->next = temp;
last = temp;
tempptr=tempptr+((*(group_item_count+k))*m_nof);
free(mean_corrected_data);
if(m_output==0)
k=0;
temp=first;
while(temp!=NULL)
fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
fprintf(fp2,"%s"," COVARIANCE ");
fprintf(fp2,"%s",groupname[k++]);
fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
tempptr=temp->data;
for(i=0;i<m_nof;i++)
for(j=0;j<m_nof;j++)
fprintf(fp2,"%f",*tempptr++);
fprintf(fp2,"%s","\t");
fprintf(fp2,"%s","\n");
temp = temp->next;
final_covariance_matrix=(float *)malloc(m_nof*m_nof*sizeof(float));
temp=first;
n=0;
if(m_output==0)
{
	fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
	fprintf(fp2,"%s","          POOLED WITHIN GROUP COVARIANCE MATRIX \n");
	fprintf(fp2,"%s","-------------------------------------------------------------------------------\n");
}
for(i=0;i<m_nof;i++)
for(j=0;j<m_nof;j++)
sum=0.0;
t=0;
while(temp!=NULL)
{
tempptr=temp->data;
for(k=0;k<n;k++)
tempptr++;
sum+=(*(group_item_count+t)*(*tempptr));
temp=temp->next;
t++;
n++;
temp=first;
*(final_covariance_matrix+(i*m_nof)+j)=(sum/m_nos);
if(m_output==0)
fprintf(fp2,"%f",*(final_covariance_matrix+(i*m_nof)+j));
fprintf(fp2,"%s","\t");
if(m_output==0)
fprintf(fp2,"%s","\n");
free(first);
//Calculating Inverse Matrix
inverse_matrix=cal_inverse(final_covariance_matrix,m_nof);
if(m_output==0)
{
	fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
	fprintf(fp2,"%s","          INVERSE MATRIX \n");
	fprintf(fp2,"%s","-------------------------------------------------------------------------------\n");
}
for(i=0;i<m_nof;i++)
for(j=0;j<m_nof;j++)
fprintf(fp2,"%f",*(inverse_matrix+(i*m_nof)+j));
fprintf(fp2,"%s","\t");
fprintf(fp2,"%s","\n");
}
free(final_covariance_matrix);
prior_prob=(float *)malloc((groupcount+1)*sizeof(float));
if(m_output==0)
{
	fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
	fprintf(fp2,"%s","          PRIOR PROBABILITIES \n");
	fprintf(fp2,"%s","-------------------------------------------------------------------------------\n");
}
for(i=0;i<=groupcount;i++)
*(prior_prob+i)=(*(group_item_count+i)*1.0)/m_nos;
if(m_output==0)
fprintf(fp2,"%f",*(prior_prob+i));
fprintf(fp2,"%s","\t");
}
if(m_nof==2)
freq=(float *)malloc((2*m_nos)*sizeof(float));
if(m_output==0)
{
	fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
	fprintf(fp2,"%s","          Frequencies \n");
	fprintf(fp2,"%s","-------------------------------------------------------------------------------\n");
}
for(i=0;i<=groupcount;i++)
if(m_output==0)
{
	fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
	fprintf(fp2,"%s","          ");
	fprintf(fp2,"%s",groupname[i]);
	fprintf(fp2,"%s","\n-------------------------------------------------------------------------------\n");
}
tempptr=mat_multi(group_mean+(i*m_nof),inverse_matrix,1,m_nof,m_nof);
tempptr1=find_transpose(group_mean+(i*m_nof),1,m_nof);
tempptr2=mat_multi(tempptr,tempptr1,1,m_nof,1);
for(j=0;j<m_nos;j++)
tempptr1=find_transpose(features+(j*m_nof),1,m_nof);
tempptr3=mat_multi(tempptr,tempptr1,1,m_nof,1);
sum=*tempptr3-(*tempptr2*0.5)+(2.303*log(*(prior_prob+i)));
if(m_nof==2)
*(freq+(i*m_nos)+j)=sum;
str.Format("%f",*(freq+(i*m_nos)+j));
MessageBox(str);
if(m_output==0)
fprintf(fp2,"%d",j);
fprintf(fp2,"%s"," ");
fprintf(fp2,"%f",sum);
fprintf(fp2,"%s","\n");
if(m_nof==2)
testfreq=(float *)malloc(2*sizeof(float));
fp1=fopen(m_test,"r+");
test_data_features=(float *)malloc(m_nof*sizeof(float));
for(i=0;i<m_nof;i++)
fscanf(fp1,"%f",&(*(test_data_features+i)));
}
if(m_output==0)
fprintf(fp2,"%s","\n\n");
frequency=(float *)malloc((groupcount+1)*sizeof(float));
for(i=0;i<=groupcount;i++)
tempptr=mat_multi(group_mean+(i*m_nof),inverse_matrix,1,m_nof,m_nof);
tempptr1=find_transpose(group_mean+(i*m_nof),1,m_nof);
tempptr2=mat_multi(tempptr,tempptr1,1,m_nof,1);
tempptr1=find_transpose(test_data_features,1,m_nof);
tempptr3=mat_multi(tempptr,tempptr1,1,m_nof,1);
*(frequency+i)=*tempptr3-(*tempptr2*0.5)+(2.303*log(*(prior_prob+i)));
if(m_nof==2)
*(testfreq+i)=*(frequency+i);
str.Format("%f",*(testfreq+i));
MessageBox(str);
if(m_output==0)
fprintf(fp2,"%d",i);
fprintf(fp2,"%s"," ");
fprintf(fp2,"%f",*(frequency+i));
fprintf(fp2,"%s","\n");
t++;
//Decision Making
sum=*(frequency+i);
for(i=0;i<=groupcount;i++)
if(sum<=*(frequency+i))
sum=*(frequency+i);
t=i;
str.Format("%s",groupname[t]);
DISCRIMINANT ANALYSIS");
if(m_output==0)
fprintf(fp2,"%s","\n\n");
if(m_output==0)
{
fprintf(fp2,"%s",groupname[t]);
fclose(fp2);
if(m_nof==2)
g->freq=freq;
g->m_nof=m_nof;
g->m_nos=m_nos;
g->testfreq=testfreq;
g->Create(IDD_DIALOG5);
g->ShowWindow(SW_SHOW);
grpcount=0;
m_nxtgrp.EnableWindow(true);
else
grpcount=0;
m_nxtgrp.EnableWindow(true);
}
void lda::OnNextGrp()
UpdateData(true);
CString str;
str.Format("%d",m_nog);
MessageBox(str);
if(grpcount<m_nog&&m_grp!="")
int k;
ver_groups[grpcount]=(char *)malloc(16*sizeof(char));
strcpy(ver_groups[grpcount],m_grp);
k=0;
while(ver_groups[grpcount][k])
ver_groups[grpcount][k]=toupper(ver_groups[grpcount][k]);
k++;
m_grp="";
grpcount++;
if(grpcount==m_nog)
m_nxtgrp.EnableWindow(false);
else
DISCRIMINANT ANALYSIS");
UpdateData(false);
void lda::OnYes()
m_out.EnableWindow(true);
void lda::OnNo()
void lda::OnBrowseInput()
UpdateData(true);
CFileDialog FileDialog(TRUE,"*.*",NULL,OFN_HIDEREADONLY,"Text Files: (*.txt)|*.txt||");
if(FileDialog.DoModal() == IDOK)
m_inputfile.SetWindowText(PathName);
void lda::OnBrowseTest()
UpdateData(true);
CFileDialog FileDialog(TRUE,"*.*",NULL,OFN_HIDEREADONLY,"Text Files: (*.txt)|*.txt||");
if(FileDialog.DoModal() == IDOK)
{
m_testfile.SetWindowText(PathName);
void lda::OnViewSampleFile()
ShellExecute(NULL,"open","notepad.exe","input.txt",NULL,SW_SHOWNORMAL);