Lovely Professional University, Jalandhar-Delhi Highway (GT Road, NH-1), Phagwara - 144402
CERTIFICATE
This is to certify that AMLANI PRATIK KETANBHAI, bearing Registration No. 10903436, has completed the dissertation/capstone project titled PATTERN RECOGNITION USING NEURAL NETWORK WITH MATLAB under my guidance and supervision. To the best of my knowledge, the present work is the result of the candidate's original investigation and study. No part of the dissertation has ever been submitted for any other degree at any university. The dissertation is fit for submission and the partial fulfillment of the conditions for the award of the capstone project.
Mr. Srinivas Perala (UID 14910), Assistant Professor, ECE/EEE, Lovely Professional University, Phagwara, Punjab. Date:
DECLARATION
I, AMLANI PRATIK KETANBHAI, student of the Lovely Faculty of Technology & Sciences, B.Tech ECE (LEET, RE38E1), under the Department of Electronics & Communication Engineering of Lovely Professional University, Punjab, hereby declare that all the information furnished in this dissertation/capstone project report is based on my own intensive research and is genuine. This dissertation/report does not, to the best of my knowledge, contain any part of work which has been submitted for the award of a degree, either at this university or any other university, without proper citation.
Date : 26/11/2011
2011
INDEX
1.0 ABSTRACT
1.1 INTRODUCTION
1.2 KEYWORDS
2.0 WHAT IS NEURAL NETWORK?
2.1 APPLICATIONS OF NEURAL NETWORKS
2.2 BIOLOGICAL INSPIRATION
2.3 GATHERING DATA FOR NEURAL NETWORK
3.0 BASIC ARTIFICIAL MODEL
3.1 PATTERN RECOGNITION
3.2 TRAINING MULTILAYER PERCEPTRON
3.3 BACKPROPAGATION ALGORITHM
4.0 HANDWRITTEN ENGLISH ALPHABET RECOGNITION SYSTEM
HANDWRITTEN CHARACTER RECOGNITION SYSTEM
TRAINING METHOD WITH REJECT OUTPUT
CHARACTER RECOGNITION BY LEARNING RULE
ARTIFICIAL NEURAL NETWORK TRAINING
MATLAB IMPLEMENTATION
TRANSLATING THE INPUT INTO MATLAB CODE
CHARACTER RECOGNITION MATLAB SOURCE CODE
REFERENCES
1.0 Abstract
In this paper, I have tried to develop a neural network that can be used for handwritten character recognition. Supervised learning is used to train the neural network. The network is designed here with a reject output so that all the training patterns can be classified exactly. A competitive learning algorithm is used to obtain the best result. The fine-classifiers are realized using multilayered neural networks, each of which solves a two-category classification problem. We also use the rough-classifier to select the training samples during the learning of the multilayered neural networks, in order to reduce the learning time.
1.1 Introduction
One of the most important effects the field of Cognitive Science can have on the field of Computer Science is the development of technologies that make our tools more human. A very relevant present-day field of natural-interface research is handwriting recognition technology. As evidenced by the fact that we are not currently all using tablet computers, accurate handwriting recognition is clearly a difficult problem to solve.

Before pattern recognition, air photo interpretation and photo reading from aerial reconnaissance were used to identify objects and their significance. In this process, the human interpreter must have vast experience and imagination to perform the task efficiently, and the interpreter's judgment is considered subjective. More recently, applications utilizing aerial photographs have multiplied, and significant technological advancements have moved the field from plain aerial photographs to high-resolution digital images. Pattern recognition is now widely used in large-scale applications such as monitoring changes in the water levels of lakes and reservoirs, assessing crop diseases, assessing land use, and mapping archaeological sites.

Each character is written with a single stroke. This solves the character-level segmentation problem that previously plagued handwriting recognition: the curve drawn between pen-down and pen-up events can be recognized in isolation. Uni-stroke recognition algorithms can be relatively simple because there is no need to decide which parts of the curve belong to which character, or to wait for more strokes belonging to the same character, as is often the case when recognizing conventional handwriting. To develop a handwriting recognition system that is both as reliable as uni-stroke and natural enough to be comfortable, the system must be highly adaptable.
1.2 Keywords
Pattern recognition, Hidden Markov Model, learning rules in neural networks, MATLAB toolbox.
Von Neumann machines are based on the processing/memory abstraction of human information processing. Neural networks are based on the parallel architecture of animal brains.
Neural networks are built from:
- simple processing elements
- a high degree of interconnection
- simple scalar messages
- adaptive interaction between elements

A biological neuron may have as many as 10,000 different inputs, and may send its output (the presence or absence of a short-duration spike) to many other neurons. Neurons are wired up in a three-dimensional pattern.
Real brains, however, are orders of magnitude more complex than any artificial neural network so far considered.
Neural networks grew out of research in Artificial Intelligence; specifically, attempts to mimic the fault-tolerance and capacity to learn of biological neural systems by modeling the low-level structure of the brain (see Patterson, 1996). The main branch of Artificial Intelligence research in the 1960s -1980s produced Expert Systems. These are based upon a high-level model of reasoning processes (specifically, the concept that our reasoning processes are built upon manipulation of symbols). It became rapidly apparent that these systems, although very useful in some domains, failed to capture certain key aspects of human intelligence. According to one line of speculation, this was due to their failure to mimic the underlying structure of the brain. In order to reproduce intelligence, it would be necessary to build systems with a similar architecture.
The brain is principally composed of a very large number (circa 10,000,000,000) of neurons, massively interconnected (with an average of several thousand interconnects per neuron, although this varies enormously). Each neuron is a specialized cell which can propagate an electrochemical signal.
Figure: the human neuron model; millions of such elements combine to form the brain and transfer signals.
In general, if you use a neural network, you won't know the exact nature of the relationship between inputs and outputs - if you knew the relationship, you would model it directly. The other key feature of neural networks is that they learn the input/output relationship through training. There are two types of training used in neural networks, with different types of networks using different types of training. These are supervised and unsupervised training, of which supervised is the most common and will be discussed in this section.
For example, consider a neural network being trained to estimate the value of houses. The price of a house depends critically on the area of a city in which it is located. A particular city might be subdivided into dozens of named locations, so it might seem natural to use a nominal-valued variable representing these locations. Unfortunately, it would be very difficult to train a neural network under these circumstances, and a more credible approach would be to assign ratings (based on expert knowledge) to each area; for example, you might assign ratings for the quality of local schools, convenient access to leisure facilities, etc.
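The rating-based encoding described above can be sketched as follows. This is an illustrative Python fragment, not part of the report's system; the area names and rating values are invented for the example.

```python
# Hypothetical expert-assigned ratings replacing a nominal "area" variable.
AREA_RATINGS = {
    "riverside":  {"schools": 0.9, "leisure": 0.7},
    "industrial": {"schools": 0.3, "leisure": 0.2},
}

def encode_area(area):
    """Map a nominal area name to a numeric feature vector of ratings,
    suitable as network input (unlike an arbitrary nominal code)."""
    r = AREA_RATINGS[area]
    return [r["schools"], r["leisure"]]

print(encode_area("riverside"))  # [0.9, 0.7]
```

The network then receives meaningful numeric features rather than an arbitrary label, which is what makes training feasible.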
The network has 2 inputs and one output, all binary. The output is 1 if W0*I0 + W1*I1 + Wb > 0.
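The threshold rule above can be sketched directly. This is an illustrative Python sketch of the stated rule, not code from the report; the AND weights below are a chosen example.

```python
def perceptron_output(i0, i1, w0, w1, wb):
    # Fires (returns 1) when the weighted sum plus the bias weight exceeds 0.
    return 1 if w0 * i0 + w1 * i1 + wb > 0 else 0

# With w0 = w1 = 1 and wb = -1.5 the unit computes the logical AND of its inputs.
for i0 in (0, 1):
    for i1 in (0, 1):
        print(i0, i1, perceptron_output(i0, i1, 1.0, 1.0, -1.5))
```

Changing the weights and bias changes which boundary line the unit draws in the input space.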
(Figure: stages of a pattern recognition system: sensory input, retrieval/strategic processing, and I/O.)
Automatic machine recognition, description, classification, and grouping of patterns are important problems in a variety of engineering and scientific disciplines such as biology, psychology, medicine, marketing, computer vision, artificial intelligence, and remote sensing. A pattern could be a fingerprint image, a handwritten cursive word, a human face, or a speech signal. Given a pattern, its recognition/classification may consist of one of the following two tasks: 1) Supervised classification (e.g., discriminant analysis), in which the input pattern is identified as a member of a predefined class.
2) Unsupervised classification (e.g., clustering), in which the pattern is assigned to a hitherto unknown class. The recognition problem here is posed as a classification or categorization task, where the classes are either defined by the system designer (in supervised classification) or learned from the similarity of patterns (in unsupervised classification).

Applications include data mining (identifying a pattern, e.g., a correlation or an outlier, in millions of multidimensional patterns), document classification (efficiently searching text documents), financial forecasting, organization and retrieval of multimedia databases, and biometrics. The rapidly growing available computing power, while enabling faster processing of huge data sets, has also facilitated the use of elaborate and diverse methods for data analysis and classification. At the same time, demands on automatic pattern recognition systems are rising enormously due to the availability of large databases and stringent performance requirements (speed, accuracy, and cost).

The design of a pattern recognition system essentially involves the following three aspects:
1) data acquisition and preprocessing,
2) data representation, and
3) decision making.
Neural networks (NN) have been widely used to solve complex problems of pattern recognition, and they can recognize patterns very fast once they have been trained. There are two types of training used in neural networks: supervised and unsupervised, of which supervised is the most common. In supervised training, the neural network assembles a training data set. This data set contains examples of input patterns together with the corresponding output results, and the network learns to infer the relationship between input patterns and output results through training. The training process requires a training algorithm, which uses the training data set to adjust the network's weights and biases so as to minimize an error function, such as the mean squared error (MSE), and tries to classify all patterns in the training data set. After training, the NN will have a set of weights and biases that will be used to recognize new patterns. In general, training an NN with a larger training data set can reduce the recognition error rate, but it increases the complexity. Handwritten character recognition is a typical application of neural networks.
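The supervised-training loop described above can be sketched in miniature: repeatedly adjust weights and bias to shrink the squared error on the training set. This is a hedged illustration (a single linear unit trained by gradient steps, in Python), not the report's actual network or algorithm.

```python
# Minimal supervised training sketch: fit one linear unit y = w*x + b
# by stepping weights against the gradient of the squared error.
def train(patterns, targets, lr=0.1, epochs=200):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, t in zip(patterns, targets):
            y = w * x + b
            err = y - t
            w -= lr * err * x   # gradient of 0.5*err**2 with respect to w
            b -= lr * err       # gradient with respect to b
    return w, b

# training examples drawn from y = 2x + 1
w, b = train([0.0, 1.0, 2.0], [1.0, 3.0, 5.0])
```

After training, `w` and `b` approach 2 and 1: the "set of weights and biases" the text says is kept for recognizing new inputs.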
A.
Training a neural network with a large training data set requires more time to minimize the error function: a large number of patterns slows the reduction of the error function. The training data set should not, however, be so small that the network cannot identify the patterns properly. If the neural network is not provided with a proper set of training data, it cannot recognize difficult patterns (Fig. 1). If the NN has to recognize a pattern that approximates in shape one of the provided patterns, the recognition result will be wrong.
Fig. 1. Difficult-to-recognize patterns are usually clustered in the area near the boundary line in the pattern space.
Neural network error surfaces are much more complex, and are characterized by a number of unhelpful features, such as local minima (which are lower than the surrounding terrain but above the global minimum), flat spots and plateaus, saddle points, and long narrow ravines. It is not possible to determine analytically where the global minimum of the error surface is, so neural network training is essentially an exploration of the error surface. From an initially random configuration of weights and thresholds (i.e., a random point on the error surface), the training algorithm incrementally seeks the global minimum. Typically, the gradient (slope) of the error surface is calculated at the current point and used to make a downhill move. Eventually, the algorithm stops at a low point, which may be a local minimum (but hopefully is the global minimum).
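The downhill exploration described above can be demonstrated on a toy one-dimensional "error surface" with two minima; the starting point alone decides whether descent ends in the local or the global minimum. This is an illustrative Python sketch (the function f is invented, not a real network error surface).

```python
# Toy error surface with a global minimum near x = -1.30
# and a local minimum near x = 1.13.
def f(x):
    return x**4 - 3 * x**2 + x

def df(x):
    # derivative (gradient) of f
    return 4 * x**3 - 6 * x + 1

def descend(x, lr=0.01, steps=2000):
    # repeated downhill moves along the negative gradient
    for _ in range(steps):
        x -= lr * df(x)
    return x

left = descend(-2.0)    # reaches the global minimum
right = descend(2.0)    # gets trapped in the local minimum
print(round(left, 2), round(right, 2))
```

Both runs "stop at a low point", but only one of them is the global minimum, which is exactly the hazard the text describes.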
Input alphabet: B. Digitization on a 25-segment (5 x 5) grid:

0 0 0 0 0
1 1 1 1 1
0 0 1 0 1
0 0 1 1 1
0 0 0 0 0
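The digitization step above (character bitmap to 25-segment grid) can be sketched as follows. This is an illustrative Python fragment under a simple assumption not stated in the report: a segment is marked 1 if any dark pixel falls inside it.

```python
# Divide a character bitmap into a grid x grid (here 5x5 = 25-segment) array
# and mark a segment 1 if any dark pixel falls in it.
def digitize(bitmap, grid=5):
    h, w = len(bitmap), len(bitmap[0])
    out = [[0] * grid for _ in range(grid)]
    for r in range(h):
        for c in range(w):
            if bitmap[r][c]:                       # dark pixel
                out[r * grid // h][c * grid // w] = 1
    return out

# A 10x10 vertical bar occupies exactly one grid column after digitization.
bar = [[1 if c in (4, 5) else 0 for c in range(10)] for r in range(10)]
print(digitize(bar))
```

The resulting 25 binary values are what the network receives as its input vector.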
B.
One idea that can be implemented to solve the above problems is to separate the training data set into several parts, so that the neural network uses several sets of weights and biases to classify all patterns within and between parts. A new structure of pattern recognition NN is introduced here with a special output, called the Reject output, together with a training method corresponding to this structure. The name Reject output means that it is used to separate all difficult-to-recognize patterns from the training data set; hence, these patterns are called rejected patterns. Our training method uses the reject output to separate the training data set into parts, and with a smaller number of patterns in each part, the patterns can be classified by the NN using a distinct set of weights and biases.
Using Reject output to separate the training data set into two parts for classifying.
The figure illustrates our idea in principle. A large training data set has been separated into two parts for classification; thus, a relatively small NN can use two sets of weights and biases (SWBs) to classify the large training data set. Of course, the NN will use the two SWBs for recognizing new patterns. SWB1 is used for recognition first.
If the reject output gives the result Reject, SWB1 will be replaced by SWB2 for a second recognition pass. As a result, all the sets of weights and biases have to be kept in the order in which we receive them from the training process.
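The recognize-with-fallback flow just described can be sketched as follows. This is an illustrative Python fragment; `classify_with` is a hypothetical stand-in for running the trained network under one set of weights and biases.

```python
# Try each SWB in training order; fall through to the next one whenever the
# reject output fires. classify_with(pattern, swb) -> (label, rejected) is a
# hypothetical stand-in for the trained network.
def recognize(pattern, swbs, classify_with):
    label = None
    for swb in swbs:                 # SWBs kept in training order
        label, rejected = classify_with(pattern, swb)
        if not rejected:
            return label
    return label                     # last SWB's answer if every pass rejects

# Toy classifier: SWB1 only handles patterns < 5 and rejects everything else.
def toy(p, swb):
    if swb == "SWB1":
        return ("low", False) if p < 5 else (None, True)
    return ("high", False)

print(recognize(7, ["SWB1", "SWB2"], toy))  # prints 'high'
```

Keeping the SWBs in training order matters because each later SWB was trained only on the patterns the earlier ones rejected.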
The handwritten character recognition system includes the following steps: 1) scanning of the image, 2) preprocessing, 3) classification of the pattern, and 4) output. A 300 dpi scanner can be used to scan the page that contains the handwritten characters. Here we consider only black-and-white images of the characters, so color scanning is avoided.
Preprocessing
At first, a character image is normalized and smoothed in the preprocessing stage. In normalization, a linear normalization method is employed and the input image is adjusted to 128 x 128 dots. For smoothing, bumps and holes in strokes are patched up using 3 x 3 masks. Then, contour extraction is done: if a white pixel adjoins a black pixel in the upward, downward, left, or right direction, the black pixel is regarded as being on the contour. The feature vector is extracted from the contour pixels.
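The contour rule stated above can be sketched directly. This is an illustrative Python fragment of that rule (pixels outside the image are simply not treated as white, an assumption the report does not spell out).

```python
# A black pixel (1) is on the contour if a white pixel (0) adjoins it
# above, below, to the left, or to the right.
def contour(img):
    h, w = len(img), len(img[0])
    def white(r, c):
        return 0 <= r < h and 0 <= c < w and img[r][c] == 0
    out = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            if img[r][c] == 1 and (white(r - 1, c) or white(r + 1, c)
                                   or white(r, c - 1) or white(r, c + 1)):
                out[r][c] = 1
    return out

# 3x3 black square inside a 5x5 white field: the 8 ring pixels form the contour.
img = [[0] * 5 for _ in range(5)]
for r in range(1, 4):
    for c in range(1, 4):
        img[r][c] = 1
edge = contour(img)
```

Only the ring survives; the filled interior is dropped, which is what makes the extracted feature vector compact.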
Classification of Pattern
Different patterns in the form of line elements and directional elements have been developed by scanning a pixel and the eight adjacent pixels of that pixel, which helps in determining the actual shape of the input pattern.
Line elements in the directional element features: this process is called dot orientation, and it helps in weight vector construction and updating during the learning of the neural network.
where α(t) is the learning coefficient after t learning steps. The coefficient starts from its initial value and then decreases monotonically as t increases, reaching its minimum value at the preset maximum number of learning steps Tmax. h(pi, t) is a neighborhood function with its center at the winner cell c, and pi is the distance from cell i to the winner cell c. In equation (2), σ(t) is a time-varying parameter that defines the neighborhood size in the competitive layer. Like the parameter in equation (1), this parameter decreases monotonically as t increases. In general, SOM has the significant characteristic that the distribution of code-vectors after learning reflects the distribution of the input data; that is, code-vectors tend to converge on areas where the density of the input data is high. In this research, based on these characteristics of SOM, we construct the rough-classifier of our modular neural networks by assigning one SOM to each class category. Here, we use a one-dimensional SOM for each SOM in the rough-classifier. In the rough-classification, we select the k templates nearest to the test sample among all templates for all class categories. The class categories of the selected k templates become the candidate classes for the fine-classifiers.
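The rough-classification step at the end of this passage can be sketched as follows. This is a hedged Python illustration of selecting the k nearest templates; the template vectors and class labels are invented, and squared Euclidean distance is an assumption.

```python
# Rough-classification: pick the k templates (SOM code-vectors) nearest the
# test sample; their class labels become the candidate classes passed on to
# the fine-classifiers.
def rough_classify(sample, templates, k):
    # templates: list of (class_label, code_vector) pairs
    def dist(v):
        return sum((a - b) ** 2 for a, b in zip(sample, v))
    ranked = sorted(templates, key=lambda t: dist(t[1]))
    return [label for label, _ in ranked[:k]]

templates = [("A", [0.0, 0.0]), ("B", [1.0, 1.0]), ("C", [0.2, 0.1])]
print(rough_classify([0.1, 0.0], templates, 2))  # ['A', 'C']
```

Because only the k candidate classes reach the fine-classifiers, most two-category networks never run on a given sample, which is where the learning-time saving comes from.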
As a result, the rejected patterns are used in rotation in each training epoch; thus, they are usually clustered by the reject output faster than the un-rejected patterns. The reject output value is always in the range -1 to +1. The NN uses the reject output to cluster the rejected patterns toward the value +1 and the un-rejected patterns toward the value -1. Therefore, a threshold value R is determined for the reject output to separate all the rejected patterns from the un-rejected patterns: the minimum reject-output value over all rejected patterns (call it Vrej) must be higher than R, and the maximum reject-output value over all un-rejected patterns (call it Vacc) must be lower than or equal to R. The training software can track Vrej and Vacc in this phase, and the NN should be trained until Vrej > Vacc; at that time, the reject output can separate all the rejected patterns from the un-rejected patterns with the threshold R = Vacc. However, we do not need to wait until Vrej > Vacc, because if Vrej < Vacc and R is set just below Vrej, all the rejected patterns are still clustered by the reject output, although some un-rejected patterns are classified as rejected. This is no problem, because those patterns will be classified again in the next phase. Thus, if Vrej is still smaller than Vacc after hundreds of epochs, we should check all the training patterns and mark the rejected patterns again: if a pattern is already clustered by the reject output, it stays marked as a rejected pattern, and if a pattern is not classified correctly by the normal outputs, it should be marked as a rejected pattern. As a result, the number of patterns marked as rejected may change in this phase.
Determine the threshold for the reject output. After this phase, we have the first set of weights and biases (SWB1) and the threshold value R for the reject output, which separates the training data set into two parts.
A large training data set can be separated into more than two parts to be classified using many sets of weights and biases (SWBs). The update process need not start from the first phase. In the case of a large training data set, there are different phases of learning, which are derived from the main phases 1 and 2. Phases 1 and 2 decide the rejected and un-rejected patterns, and if a pattern is not rejected, the NN classifies it further for a precise output. From Fig. 6, we can see that at the end of each phase there is a set of updated new patterns, which helps that layer of neurons to solve the problem easily and fast.
The trained neural network is then used for simulation. This simulator program is designed so that the characters can be written at any spot on the blank paper. The program has the ability to capture the characters through careful filtering; 8-connectivity is used so that connected pixels in any direction are taken as the same object.
EXEMPLARS PREPARATION. The characters prepared as explained in Section 2 are scanned using a scanner or captured using a digital camera, and these characters are segregated according to their own character group. One example is shown in Fig. 1.
Note that the scanned or captured images are in RGB. These images have to be converted into grayscale format before further processing can be done. Using appropriate grayscale thresholding, binary images are then created.
Fig. 2 Binary image of sample character A. Fig. 2 shows the binary image generated using the Image Processing Toolbox in MATLAB and 8-connectivity analysis. The next step is to obtain the bounding box of the characters (Fig. 3). The bounding box refers to the minimum rectangular box that is able to encapsulate the whole character. The size of this bounding box is important, as only objects whose bounding boxes have a certain width-to-height (WH) ratio will be considered in capturing. Objects whose bounding boxes are not within a specific range will not be captured, and hence will be treated as unwanted objects. The next criteria used for selecting objects are the relative-height (RH) ratio and the relative-width (RW) ratio. The RH ratio is defined as the ratio of the bounding box height of one object to the maximum bounding box height among all the objects in that image. Likewise, RW refers to the ratio of the width of one bounding box to the maximum width among all bounding box widths in that image.
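The WH/RH/RW filtering rules above can be sketched as follows. This is an illustrative Python fragment; the accepted ranges are hypothetical, since the report does not give the actual thresholds.

```python
# Keep an object only if its bounding-box width-to-height (WH) ratio lies in
# an accepted range and its relative-height (RH) and relative-width (RW)
# ratios are large enough. Threshold values here are assumptions.
def filter_objects(boxes, wh_range=(0.3, 1.2), rh_min=0.5, rw_min=0.3):
    max_h = max(h for w, h in boxes)
    max_w = max(w for w, h in boxes)
    kept = []
    for i, (w, h) in enumerate(boxes):
        wh = w / h            # width-to-height of this box
        rh = h / max_h        # relative height
        rw = w / max_w        # relative width
        if wh_range[0] <= wh <= wh_range[1] and rh >= rh_min and rw >= rw_min:
            kept.append(i)
    return kept

# box 0 is a letter-sized object; box 1 is a tiny speck that gets rejected
print(filter_objects([(20, 28), (3, 2)]))  # [0]
```

Specks and smudges fail one of the three ratio tests and are discarded before training, leaving only valid character objects.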
Fig. 3 The outer rectangle that encapsulates the letter A is the bounding box. The captured objects should now all consist of valid characters (and not unwanted objects) to be used for neural network training. Each captured character image has different pixel dimensions because each has a different bounding box size. Hence, each of these images needs to be resized to standard image dimensions. However, in order to perform the multi-scale training technique, different image resolutions are required. For this purpose, the images are resized to dimensions of 20 by 28 pixels, 10 by 14 pixels, and 5 by 7 pixels. Note that these objects are resized using an averaging procedure: 4 pixels are averaged and mapped into one pixel (Fig. 4).
Fig. 4 The pixels shaded in gray on the left are averaged (computing mean of 101, 128, 88, and 50) and the pixel shaded in gray on the right is the averaged pixel intensity value. This example shows averaging procedure from 20 by 28 pixel image into 10 by 14 pixel image.
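The averaging procedure of Fig. 4 can be sketched directly: every 2 x 2 block of pixels is averaged into one pixel, halving each dimension (e.g., 20 by 28 into 10 by 14). This is an illustrative Python fragment, not the report's MATLAB code.

```python
# Downsample an image by averaging each non-overlapping 2x2 pixel block.
def downsample(img):
    h, w = len(img), len(img[0])
    return [[(img[2 * r][2 * c] + img[2 * r][2 * c + 1]
              + img[2 * r + 1][2 * c] + img[2 * r + 1][2 * c + 1]) / 4.0
             for c in range(w // 2)]
            for r in range(h // 2)]

# The four intensities from Fig. 4 average to one pixel of 91.75.
print(downsample([[101, 128], [88, 50]]))  # [[91.75]]
```

Applying this twice takes the 20 by 28 exemplar down to 10 by 14 and then 5 by 7, producing the three resolutions the multi-scale training uses.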
Fig. 5 Conceptual diagram of multi scale training technique. The training begins with 5 by 7 pixels exemplars (stage 1). These input vectors are fed into the neural network for training. After being trained for a few epochs, the neural network is boosted by manipulating the weights between the first and the second layer. This resultant neural network is trained for another few epochs by feeding in 10 by 14 pixels exemplars (stage 2). Again, the trained neural network is boosted for the next training session. In a similar fashion, the boosted neural network is fed in with 20 by 28 pixel exemplars for another few epochs until satisfactory
convergence is achieved (stage 3). The conceptual diagram of the multi-scale neural network is shown in Fig. 5. Fig. 6 Original neural network (top) and the boosted neural network (bottom). Referring to Fig. 6, P1, P2, P3, and P4 are pixel intensity values and Pave is the averaged pixel intensity value of these pixels. After the boosting process, the original weight value W is split into 4 weights, each connected to one pixel location.
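The boosting step of Fig. 6 can be sketched as follows. The report does not state the exact split rule, so the choice of W/4 for each new weight is an assumption; it is the split that preserves the unit's input, since W*Pave = (W/4)*(P1 + P2 + P3 + P4). This is an illustrative Python fragment.

```python
# Split one weight W (connected to the averaged pixel Pave) into four weights,
# one per original pixel P1..P4. W/4 each preserves the weighted input exactly
# (assumed split rule; not stated in the report).
def boost_weight(W):
    return [W / 4.0] * 4

W = 0.8
P = [101, 128, 88, 50]          # the pixel intensities from Fig. 4
Pave = sum(P) / 4.0
before = W * Pave                               # input at the coarse scale
after = sum(w * p for w, p in zip(boost_weight(W), P))  # after boosting
print(abs(before - after) < 1e-9)  # True
```

Because the net input is unchanged, the boosted network starts the finer-resolution training stage from the behavior it had already learned at the coarse scale.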
5.2 MATLAB Implementation
Handwriting samples were obtained from 4 subjects, two male and two female. All the subjects were given a piece of paper with boxes for each letter of the alphabet, and a large box in which to write the sentence "The quick brown fox jumped over the lazy dogs". All the subjects used identical felt-tip pens. In addition to the handwriting samples, one printed sample was created using Microsoft Word with the font Arial 12 pt, in all capital letters.
Preprocessing consists of splitting the alphabets into 25-segment grids, scaling the resulting segments to a standard size, and thinning the resultant character segments to obtain skeletal patterns. The following preprocessing steps may also be required to support the recognition process.
Above, the user has loaded a file and painted over it to register the active pixels. Pressing the Save button adds this image's data to the growing amount of MATLAB code in the bottom text box. Each input must also be associated with a letter of the alphabet; this is specified in the interface using the small text box under the Add Scan button. This target letter appears in the MATLAB code as a 26-unit vector containing one 1 and twenty-five 0s.

Error identifications:
hora had 12 errors
krav had 3 errors
blaz had 7 errors
chme had 15 errors
bart had 13 errors
hyne had 3 errors
barta had 10 errors
fris had 8 errors
cerm had 4 errors
fise had 19 errors
kriz had 9 errors
cimp had 9 errors
holi had 3 errors
jako had 9 errors
krat had 9 errors
habe had 4 errors
bern had 4 errors
chlu had 6 errors
kost had 1 error

Unsuccessful recognition stats:
0 not recognised 18 times
1 not recognised 4 times
2 not recognised 11 times
3 not recognised 8 times
4 not recognised 13 times
5 not recognised 5 times
6 not recognised 33 times
7 not recognised 19 times
8 not recognised 20 times
9 not recognised 18 times

We had 851 successful recognitions on 1000 patterns (85.1%). We had 68 uncertain-right and 56 unclassified.
The first part shows how many patterns were misclassified for every person, while the second gives the same information per character to be classified. At the end, we show the rate of successfully classified patterns. To decide whether a pattern was uncertain-right (the highest output in the right position) or unclassified (the highest value in the wrong position), we checked whether the output was lower than 0.5. We also tried to reduce the input space dimensionality by reducing the number of segments from seven to three. This reduction allowed us to test a network with an input layer of 72 nodes, but this network was not able to successfully recognize more than 74% of the validation set, which stopped us from continuing any further in this direction.
What can be seen from this image, which compares our best-recognized volunteer (Kost, left) with the worst (Fise, right), is that the characters do not differ from each other too much, but the results show that discriminants exist. We might find one in the Z axis, which represents the pressure. This also hints at where to find further improvements in the network and the feature extraction. Another hint is the number of unclassified patterns: if we could classify even just 80% of these patterns, we could get close to the 93% successful recognition achieved by Marek Musil with an ad-hoc K-means-based solution. Considering the number of weights, increasing the number of patterns in the training set might help as well (data we did not have access to).
Considering that both RBF and K-means use codebook vectors, we would have expected the RBF approach to yield good results, but we think its failure to do so is caused by the high dimensionality of the input space.
WEIGHTS FOR TRAINING. The figures given below show how to initialize the weights for training the neural network. Weight initialization is a very important task, because training the neural network consists of reducing the error between the obtained output and the desired output.
% applied to the GUI before gskmadechar_OpeningFunction gets called. An
% unrecognized property name or invalid value makes property application
% stop. All inputs are passed to gskmadechar_OpeningFcn via varargin.
%
% *See GUI Options on GUIDE's Tools menu. Choose "GUI allows only one
% instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help gskmadechar
% Last Modified by GUIDE v2.5 23-Mar-2008 23:58:49

% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @gskmadechar_OpeningFcn, ...
                   'gui_OutputFcn',  @gskmadechar_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT

% --- Executes just before gskmadechar is made visible.
function gskmadechar_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   command line arguments to gskmadechar (see VARARGIN)

% Choose default command line output for gskmadechar
handles.output = hObject;

% Update handles structure
guidata(hObject, handles);

% UIWAIT makes gskmadechar wait for user response (see UIRESUME)
% uiwait(handles.figure1);

clc
clear global
global inmv000 wtmv000 incmv000 iolmv000
inmv000 = zeros(100,1);
inmv000 = (-1) + inmv000;
incmv000 = zeros(1,100);
incmv000 = (-1) + incmv000;
iolmv000 = 45;
wtmv000 = zeros(100,100);
wtmv000 = wtmv000 + (inmv000 * inmv000');
%for intv011 = 1:100
%    wtmv000(intv011,intv011) = 0;
%end

% --- Outputs from this function are returned to the command line.
function varargout = gskmadechar_OutputFcn(hObject, eventdata, handles)
% varargout  cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure
varargout{1} = handles.output;
function edit1_Callback(hObject, eventdata, handles)
% hObject    handle to edit1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: get(hObject,'String') returns contents of edit1 as text
%        str2double(get(hObject,'String')) returns contents of edit1 as a double

% --- Executes during object creation, after setting all properties.
function edit1_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in pushbutton1.
function pushbutton1_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global wtmv000 inmv000 incmv000 iolmv000
inmv001 = zeros(100,1);
inmv001 = (-1) + inmv001;
chrv000 = cd;
chrv001 = get(handles.edit1,'string');
chrv002 = get(handles.edit3,'string');
chrvv002 = str2double(chrv002);
dos('%SystemRoot%\system32\mspaint.exe gskchar.bmp');
un8v000 = imread(fullfile(chrv000,'gskchar.bmp'));
% calculation of x1, y1, x2, y2
[intv000,intv001,intv002] = size(un8v000);
intv005 = 0;
intv006 = 0;
intv007 = 0;
intv008 = 0;
intv009 = intv000;
intv010 = intv001;
for intv003 = 1:intv000
    intv007 = 0;
    for intv004 = 1:intv001
        if (un8v000(intv003,intv004,1) ~= 255) || (un8v000(intv003,intv004,2) ~= 255) || (un8v000(intv003,intv004,3) ~= 255)
            intv007 = intv007 + 1;
            if intv007 == 1
                intv009 = min(intv003,intv009);
                intv010 = min(intv004,intv010);
            end
            intv005 = max(intv005,intv003);
            intv006 = max(intv006,intv004);
        end
    end
end
if ((intv005 == intv009) || (intv006 == intv010)) || (intv005 == 0) || (intv006 == 0) || (intv009 == 0) || (intv010 == 0)
    msgbox([{'Image cannot have less than 2 lines (line must be longer than 2 pixels)',char(13),'Retrieval is not sucessful'}],'ChrRecg - GSK');
else
    if (intv005 - intv009) > (intv006 - intv010)
        intv003 = 9 / (intv005 - intv009);
    else
        intv003 = 9 / (intv006 - intv010);
    end
    axes(handles.axes1);
    % figure
    cla
    hold on
    for intv011 = 1:intv000
        for intv012 = 1:intv001
            if (un8v000(intv011,intv012,1) ~= 255) || (un8v000(intv011,intv012,2) ~= 255) || (un8v000(intv011,intv012,3) ~= 255)
                intv013 = round(intv003 * (intv011 - intv009)) + 1;
                intv014 = round(intv003 * (intv012 - intv010)) + 1;
                inmv001((intv013-1) * 10 + intv014,1) = 1;
                plot(intv014,-intv013);
            end
        end
    end
    xlim([0 11]);
    ylim([-11 0]);
    hold off
    logv000 = 0;
    intv004 = 0;
    intv005 = 0;
    [intv000,intv001] = size(incmv000);
    opmv000 = inmv001;
    while logv000 == 0
        intv004 = intv004 + 1;
        intv016(1:intv000,1) = zeros(intv000,1);
        for intv002 = 1 : intv000
            intv015(1:100,1) = incmv000(intv002,1:100) - opmv000(1:100,1)';
            intv016(intv002,1) = norm(intv015(1:100,1));
        end
        intv017 = min(intv016);
        intv018 = 0;
        for intv002 = 1 : intv000
            if intv016(intv002,1) == intv017
                intv018 = intv018 + 1;
                intv003 = intv002;
            end
        end
        if (intv018 == 1)
            logv000 = 1;
        end
        inmmv001 = opmv000;
        if logv000 ~= 1
            if intv004 > 100
                intv004 = intv004 - 100;
            end
            intv005 = intv005 + 1;
            if get(handles.checkbox2,'value') == 1
                inmv001(intv004,1) = opmv000(intv004,1);
            else
                inmv001 = opmv000;
            end
        end
        if intv005 == chrvv002 * 10000
            msgbox([{'Processing taking too long',char(13),'Retrieval is not successful'}],'ChrRecg - GSK');
            logv000 = 1;
            intv003 = 1;
        end
        opmv000 = wtmv000' * inmv001;
        for intv002 = 1 : 100
            if opmv000(intv002) < 0
                opmv000(intv002) = -1;
            else
                opmv000(intv002) = 1;
            end
        end
    end
    set(handles.edit4,'string',num2str(intv005));
    chrv001 = strcat(chrv001,char(iolmv000(intv003)));
    set(handles.edit1,'string',chrv001);
end
intv000 = get(handles.checkbox1,'value');
intvs000 = num2str(intv000);
if intvs000 == '1'
    gskmadechar('pushbutton1_Callback',hObject, eventdata, handles)
end

% --- Executes on button press in pushbutton2.
function pushbutton2_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

function edit2_Callback(hObject, eventdata, handles)
% hObject    handle to edit2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit2 as text
%        str2double(get(hObject,'String')) returns contents of edit2 as a double

% --- Executes during object creation, after setting all properties.
function edit2_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in pushbutton3.
function pushbutton3_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton3 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
set(handles.edit1,'string','');

% --- Executes on button press in pushbutton4.
function pushbutton4_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton4 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global inmv000 wtmv000 incmv000 iolmv000
[chrv002,chrv003] = uiputfile('*.mat','Select a File to store Data');
if ~isequal(chrv002,0)   % uiputfile returns 0 when the user cancels
    save(fullfile(chrv003,chrv002));
    msgbox('Saving is successful','ChrRecg - GSK');
else
    msgbox([{'You have to specify a file to save data',char(13),'Saving is not successful'}],'ChrRecg - GSK');
end

% --- Executes on button press in pushbutton5.
function pushbutton5_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton5 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
[chrv005,chrv006] = uigetfile('*.mat','Select a File to Load Data');
if isequal(chrv005,0)   % uigetfile returns 0 when the user cancels
    msgbox([{'You have to specify a file to load data',char(13),'Loading is not successful'}],'ChrRecg - GSK');
else
    chrv007 = questdlg([{'This will delete present learned data!!!'};{'Are You Sure ???'}],'Question ?','Yes','No.','No.','Yes');
    if strcmp(chrv007,'Yes')
        chrv008 = handles;
        load(fullfile(chrv006,chrv005));
    else
        msgbox([{'Cancelled by user',char(13),'Loading is not successful'}],'ChrRecg - GSK');
    end
end

% --- Executes on button press in pushbutton6.
function pushbutton6_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton6 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global wtmv000 inmv000 incmv000 iolmv000
inmv000 = zeros(100,1);
inmv000 = (-1) + inmv000;
chrv000 = cd;
chrv001 = get(handles.edit2,'string');
dos('%SystemRoot%\system32\mspaint.exe gskchar.bmp');
un8v000 = imread(fullfile(chrv000,'gskchar.bmp'));
% calculation of x1, y1, x2, y2
[intv000,intv001,intv002] = size(un8v000);
intv005 = 0; intv006 = 0; intv007 = 0; intv008 = 0;
intv009 = intv000;
intv010 = intv001;
for intv003 = 1:intv000
    intv007 = 0;
    for intv004 = 1:intv001
        if (un8v000(intv003,intv004,1) ~= 255) || (un8v000(intv003,intv004,2) ~= 255) || (un8v000(intv003,intv004,3) ~= 255)
            intv007 = intv007 + 1;
            if intv007 == 1
                intv009 = min(intv003,intv009);
                intv010 = min(intv004,intv010);
            end
            intv005 = max(intv005,intv003);
            intv006 = max(intv006,intv004);
        end
    end
end
if ((intv005 == intv009) || (intv006 == intv010)) || (intv005 == 0) || (intv006 == 0) || (intv009 == 0) || (intv010 == 0)
    msgbox([{'Image cannot have less than 2 lines (line must be longer than 2 pixels)',char(13),'Teaching is not successful'}],'ChrRecg - GSK');
else
    if (intv005 - intv009) > (intv006 - intv010)
        intv003 = 9 / (intv005 - intv009);
    else
        intv003 = 9 / (intv006 - intv010);
    end
    axes(handles.axes1);
    % figure
    cla
    hold on
    for intv011 = 1 : intv000
        for intv012 = 1 : intv001
            if (un8v000(intv011,intv012,1) ~= 255) || (un8v000(intv011,intv012,2) ~= 255) || (un8v000(intv011,intv012,3) ~= 255)
                intv013 = round(intv003 * (intv011 - intv009)) + 1;
                intv014 = round(intv003 * (intv012 - intv010)) + 1;
                inmv000((intv013-1) * 10 + intv014,1) = 1;
                plot(intv014,-intv013);
            end
        end
    end
    xlim([0 11]);
    ylim([-11 0]);
    hold off
    [intv000,intv001] = size(incmv000);
    if intv000 > 0
        logv000 = 0;
        for intv002 = 1 : intv000
            if inmv000(1:100,1) == incmv000(intv002,1:100)'
                logv000 = 1;
                if ~isempty(chrv001)
                    intv004 = unicode2native(chrv001);
                    iolmv000(intv002,1) = intv004(1,1);
                    logv001 = 1;
                end
            end
        end
    end
    if (intv000 == 0) || (logv000 == 0)
        incmv000(intv000+1,1:100) = inmv000;
        intv004 = unicode2native(chrv001);
        iolmv000(intv000+1,1) = intv004(1,1);
        wtmv000 = wtmv000 + (inmv000 * inmv000');
        % for intv011 = 1:100
        %     wtmv000(intv011,intv011) = 0;
        % end
    end
end

% --- Executes on button press in pushbutton8.
function pushbutton8_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton8 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
chrv000 = [{'CHARACTER RECOGNITION WITH NEURAL ANALYSIS'}...
    {''}...
    {''}...
    {'This programme uses a simple algorithm based on the auto-associative principle of artificial neural network analysis with asynchronous updating. The algorithm used here is simple and robust.'}...
    {''}...
    {''}...
    {'The number of iterations required depends on the match between the given input and the available teaching data. The number of iterations may grow excessively with improper teaching. You can specify a time limit for each processing run so that, in case of a long iteration, processing stops instead of appearing to hang.'}...
    {''}...
    {''}...
    {'First, train the system by typing a letter or numeral in the text box provided.'}...
    {''}...
    {'Paintbrush will open, where you can draw the input image; use the "Air Brush" tool in Paintbrush for good results.'}...
    {''}...
    {'The same character can be taught more than once (it is recommended to teach each character more than 10 times).'}...
    {''}...
    {'You can save the system''s learned data for future use or later updating; this is like stopping work now and continuing afterwards. Use the Load / Save buttons for this purpose.'}...
    {''}...
    {'By pressing the Start button, you can enter an image, which will be approximated to the nearest letter or numeral.'}...
    {''}...
    {'Please send your suggestions to "gadisureshkumar@gmail.com"'}...
    {''}...
    {''}...
    {'Prepared by Suresh Kumar Gadi'}];
msgbox(chrv000,'ChrRecg - GSK');

% --- Executes on button press in pushbutton9.
function pushbutton9_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton9 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% --- Executes on button press in pushbutton10.
function pushbutton10_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton10 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
chrv000 = questdlg([{'This will delete present learned data!!!'};{'Are You Sure ???'}],'Question ?','Yes','No.','No.','Yes');
if strcmp(chrv000,'Yes')
    clear global
    global inmv000 wtmv000 incmv000 iolmv000
    inmv000 = zeros(100,1);
    inmv000 = (-1) + inmv000;
    incmv000 = zeros(1,100);
    incmv000 = (-1) + incmv000;
    iolmv000 = 45;
    wtmv000 = zeros(100,100);
    wtmv000 = wtmv000 + (inmv000 * inmv000');
    for intv011 = 1:100
        wtmv000(intv011,intv011) = 0;
    end
else
    msgbox([{'Cancelled by user',char(13),'Learned data clearing is not successful'}],'ChrRecg - GSK');
end

function edit3_Callback(hObject, eventdata, handles)
% hObject    handle to edit3 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit3 as text
%        str2double(get(hObject,'String')) returns contents of edit3 as a double

% --- Executes during object creation, after setting all properties.
function edit3_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit3 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in pushbutton12.
function pushbutton12_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton12 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global inmv000 wtmv000 incmv000 iolmv000
intv000 = wtmv000 / max(abs(max(abs(wtmv000))));
figure
hold on
grid off
for intv001 = 1:100
    for intv002 = 1:100
        if intv000(intv001,intv002) < 0
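The help text in pushbutton8 above describes the scheme this listing implements: bipolar patterns stored in wtmv000 by an outer-product (Hebbian) rule and recalled by repeated multiply-and-threshold, as in a Hopfield-style auto-associative memory. A minimal standalone sketch of that idea follows; the pattern vectors p1, p2 and the simplified synchronous update loop are illustrative only, not part of the GUI code.

% Store two bipolar (+1/-1) patterns with the outer-product rule and
% recall one of them from a corrupted probe, mirroring
% wtmv000 = wtmv000 + inmv000*inmv000' and the sign loop on opmv000 above.
p1 = [1; 1; 1; 1; -1; -1; -1; -1];
p2 = [1; -1; 1; -1; 1; -1; 1; -1];

W = p1*p1' + p2*p2';            % Hebbian (outer-product) storage
W(logical(eye(8))) = 0;         % zero diagonal: no self-feedback

x = p1; x(8) = 1;               % probe: p1 with its last bit flipped
for k = 1:5                     % a few update sweeps suffice here
    x = W * x;
    x(x >= 0) = 1;              % threshold back to +/-1, exactly as the
    x(x < 0)  = -1;             % recognition loop does with opmv000
end
isequal(x, p1)                  % the stored pattern p1 is recovered

The GUI's actual recall loop is more involved (it measures distances to the stored rows of incmv000 and offers asynchronous single-unit updates via checkbox2), but the store-and-threshold core is the same.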
            plot(intv001,100-intv002,'-rs','LineWidth',1,'MarkerEdgeColor','b','MarkerFaceColor',[-1*intv000(intv001,intv002) 0 0],'MarkerSize',10); % negative weight: red shade (sign negated so the colour value stays in [0,1])
        else
            if intv000(intv001,intv002) > 0
                plot(intv001,100-intv002,'-rs','LineWidth',1,'MarkerEdgeColor','b','MarkerFaceColor',[0 intv000(intv001,intv002) 0],'MarkerSize',10); % positive weight: green shade
            else
                plot(intv001,100-intv002,'-rs','LineWidth',1,'MarkerEdgeColor','b','MarkerFaceColor',[1 1 0],'MarkerSize',10); % zero weight: yellow
            end
        end
    end
end
hold off

% --- Executes on button press in pushbutton15.
function pushbutton15_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton15 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global inmv000 wtmv000 incmv000 iolmv000
[intv000, intv001] = size(incmv000);
figure
hold on
for intv002 = 1:intv000
    for intv003 = 1:100
        if incmv000(intv002,intv003) > 0
            plot(intv003,intv002,'-rs','LineWidth',1,'MarkerEdgeColor','b','MarkerFaceColor',[0 1 0],'MarkerSize',10);
        else
            plot(intv003,intv002,'-rs','LineWidth',1,'MarkerEdgeColor','b','MarkerFaceColor',[1 0 0],'MarkerSize',10);
        end
    end
end
hold off
%msgbox([{'Recorded input patterns are displayed on the main window of MATLAB'}],'ChrRecg - GSK');

% --- Executes on button press in pushbutton16.
function pushbutton16_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton16 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global inmv000 wtmv000 incmv000 iolmv000
char(iolmv000)
msgbox([{'Recorded outputs are displayed on the main window of MATLAB'}],'ChrRecg - GSK');
function edit4_Callback(hObject, eventdata, handles)
% hObject    handle to edit4 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit4 as text
%        str2double(get(hObject,'String')) returns contents of edit4 as a double

% --- Executes during object creation, after setting all properties.
function edit4_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit4 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in checkbox1.
function checkbox1_Callback(hObject, eventdata, handles)
% hObject    handle to checkbox1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hint: get(hObject,'Value') returns toggle state of checkbox1

% --- Executes on button press in pushbutton17.
function pushbutton17_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton17 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global inmv000 wtmv000 incmv000 iolmv000
figure
hold on
for intv000 = 1:100
    if inmv000(intv000,1) > 0
        plot(intv000,1,'-rs','LineWidth',1,'MarkerEdgeColor','b','MarkerFaceColor',[0 1 0],'MarkerSize',10);
    else
        plot(intv000,1,'-rs','LineWidth',1,'MarkerEdgeColor','b','MarkerFaceColor',[1 0 0],'MarkerSize',10);
    end
end
hold off
%msgbox([{'Present input matrix is displayed on the main window of MATLAB'}],'ChrRecg - GSK');

% --- Executes on selection change in listbox1.
function listbox1_Callback(hObject, eventdata, handles)
% hObject    handle to listbox1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: contents = get(hObject,'String') returns listbox1 contents as cell array
%        contents{get(hObject,'Value')} returns selected item from listbox1

% --- Executes during object creation, after setting all properties.
function listbox1_CreateFcn(hObject, eventdata, handles)
% hObject    handle to listbox1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: listbox controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in radiobutton1.
function radiobutton1_Callback(hObject, eventdata, handles)
% hObject    handle to radiobutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hint: get(hObject,'Value') returns toggle state of radiobutton1

% --- Executes on button press in radiobutton2.
function radiobutton2_Callback(hObject, eventdata, handles)
% hObject    handle to radiobutton2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hint: get(hObject,'Value') returns toggle state of radiobutton2
% --- Executes on button press in checkbox2.
function checkbox2_Callback(hObject, eventdata, handles)
% hObject    handle to checkbox2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hint: get(hObject,'Value') returns toggle state of checkbox2
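For reference, the input-preparation step shared by pushbutton1 and pushbutton6 above (scan for non-white pixels, find the bounding box, scale the longer side to 10 cells, and mark hit cells as +1 in a 100-element bipolar vector) can be written compactly as follows. The names img, mask and vec are hypothetical, and the loop is a simplified equivalent of the per-pixel loops in the listing, not a drop-in replacement:

% Map the non-white pixels of an m-by-n-by-3 uint8 image (e.g. from
% imread) onto the 10x10 bipolar grid used as network input above.
mask   = ~all(img == 255, 3);          % true where a pixel is drawn
[r, c] = find(mask);                   % coordinates of drawn pixels
r1 = min(r); r2 = max(r);              % bounding box of the drawing
c1 = min(c); c2 = max(c);
s  = 9 / max(r2 - r1, c2 - c1);        % scale the longer side onto 10 cells
vec = -ones(100, 1);                   % -1 = background, +1 = ink
for k = 1:numel(r)
    i = round(s * (r(k) - r1)) + 1;    % target row cell, 1..10
    j = round(s * (c(k) - c1)) + 1;    % target column cell, 1..10
    vec((i - 1) * 10 + j) = 1;         % row-major cell index, as in inmv000
end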
6.0 REFERENCES
[1] T. Horiuchi and S. Kato, "A Study on Japanese Historical Character Recognition that Uses Modular Neural Networks," IEEE, ref. no. 978-0-7695-3873-0/09.
[2] L. Dung and M. Mizukawa, "A Pattern Recognition Neural Network Using Many Sets of Weights and Biases," IEEE, ref. no. 1-4244-0790-7/07.
[3] L. Dung and M. Mizukawa, "Automatic Handwritten Postcode Reading System using Image Processing and Neural Networks," SI2006, pp. 930-931.
[4] http://www.codeproject.com/KB/recipes/HopfieldNeuralNetwork.aspx
[5] http://home.eunet.no/~khunn/papers/2039.html
[6] http://www.codeproject.com/KB/dotnet/simple_ocr.aspx