
1

An Introduction to
Artificial Neural Networks

Presented by:
Pranjal Roy
Applied Electronics and Instrumentation Engg.
Roll no- 08103005018
Haldia Institute of Technology

2
Contents

1. Introduction
2. Why Artificial Neural Network?
3. Development of ANN model.
4. Basic building block of ANN
5. Classifications of ANN.
6. Classical Activation Functions.
7. Perceptron Models for ANN.
8. McCulloch-Pitts Neuron Model.
9. Learning Rules.
10. Issues and Applications of ANN
11. References

3
Introduction
Artificial Neural Networks are nonlinear information (signal) processing
devices, built from interconnected elementary processing units called
neurons.



Who is stronger and why?
Neural networks offer a modern theory and new mathematical models of
information processing, based on the biological prototypes and
mechanisms of human brain activity.
Applied Problems:
1. Decision making
2. Knowledge discovery
3. Context-Dependent Analysis
4
Why Artificial Neural Networks
The long course of evolution has given the human brain many desirable
characteristics not present in the von Neumann architecture.

1. Massive parallelism.
2. Distributed representation and computation .
3. Learning ability.
4. Generalization ability.
5. Adaptivity.
6. Inherent contextual information processing.
7. Fault tolerance.
8. Low energy consumption .

5
Development of ANN model
a) 1943 - McCulloch and Pitts: start of the modern era of neural
networks.
This forms a logical calculus of neural networks. A network consisting of
a sufficient number of neurons with properly set synaptic connections can
compute any computable function.

b) 1949 - Hebb's book "The Organization of Behavior"
An explicit statement of a physiological learning rule for synaptic
modification was presented for the first time.

c) 1972 - Kohonen's Self-Organizing Maps (SOM)
Kohonen's Self-Organizing Maps are capable of reproducing important
aspects of the structure of biological neural nets.

d) 1990 - Vapnik
Vapnik developed the support vector machine.
6
Basic Building Block of ANN
An artificial neural network is characterised by three building blocks:

1. Network Architecture: the arrangement of neurons into layers and the
pattern of connections within and between them.
2. Setting the Weights: the method of setting the values of the weights;
this enables the process of learning and training.
3. Activation Function: used to calculate the response of a neuron; the
sum of the weighted input signals is passed through an activation
function.
7
Classification of ANN
1. Feedforward ANN: allows signals to travel one way only, from input to
output. Used in pattern recognition.
2. Feedback ANN: can have signals travelling in both directions by
introducing loops in the network.
3. Supervised Network: the network learns by comparing the network
output with the correct answer.
4. Unsupervised Network: the network does not get any feedback from the
errors; the network itself discovers the features of the data.
8
Classical Activation Functions

Linear activation:             φ(z) = z
Threshold activation:          φ(z) = sign(z) = +1 if z > 0, -1 if z < 0
Hyperbolic tangent activation: φ(u) = tanh(u) = (e^u - e^(-u)) / (e^u + e^(-u))
Logistic activation:           φ(z) = 1 / (1 + e^(-z))
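These classical activations can be written as one-line functions; a minimal Python sketch (the function names are ours, not from the slides):

```python
import math

def linear(z):
    # Linear activation: phi(z) = z
    return z

def threshold(z):
    # Threshold (sign) activation: +1 if z > 0, else -1
    return 1 if z > 0 else -1

def tanh_act(u):
    # Hyperbolic tangent: (e^u - e^-u) / (e^u + e^-u)
    return (math.exp(u) - math.exp(-u)) / (math.exp(u) + math.exp(-u))

def logistic(z):
    # Logistic (sigmoid): 1 / (1 + e^-z)
    return 1.0 / (1.0 + math.exp(-z))
```

Note that the logistic output lies in (0, 1) while tanh and the threshold lie in (-1, 1) and {-1, +1} respectively.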
9

Perceptron Model for ANN

A perceptron network is arranged in an input layer, a hidden layer, and an
output layer (in the figure: inputs x1 ... x5 feed hidden units, which feed
outputs y1 ... y4). Each neuron j applies an activation function f to the
weighted sum E of its inputs z_k:

y_j = f( Σ_k w_kj z_k + w_0j )

When the signals z_k are themselves outputs of hidden neurons, this
expands to

y_j = f( Σ_k w_kj f( Σ_i w_ik x_i + w_0k ) + w_0j )
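The weighted-sum-then-activation step of a single perceptron neuron can be sketched in Python; a minimal illustration (the helper name neuron_output and the default logistic f are our choices):

```python
import math

def neuron_output(z, w, w0, f=lambda s: 1.0 / (1.0 + math.exp(-s))):
    # y_j = f( sum_k w_kj * z_k + w_0j ): weighted sum plus bias, then activation f
    s = sum(wk * zk for wk, zk in zip(w, z)) + w0
    return f(s)
```

With an identity activation, neuron_output([1, 2], [3, 4], 5, f=lambda s: s) reduces to the plain weighted sum 3*1 + 4*2 + 5 = 16.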
10
Single Layer Perceptron Model (contd.)

The network maps the inputs to the output layer by layer (the superscript
denotes the layer). The first layer computes

y1_i = f(x, w1_i),  i = 1, ..., 4,

collected into the vector y1 = (y1_1, y1_2, y1_3, y1_4)^T. The second
layer computes

y2_i = f(y1, w2_i),  i = 1, 2, 3,

collected into y2 = (y2_1, y2_2, y2_3)^T, and the output is

y_out = f(y2, w3).
11
MLP neural networks (contd.)
MLP = multi-layer perceptron

Perceptron (a single linear unit), mapping x directly to the output:

y_out = w^T x

MLP neural network (two sigmoidal hidden layers, linear output):

First hidden layer:
y1_k = 1 / (1 + e^(-(w1_k)^T x - a1_k)),  k = 1, 2, 3;   y1 = (y1_1, y1_2, y1_3)^T

Second hidden layer:
y2_k = 1 / (1 + e^(-(w2_k)^T y1 - a2_k)),  k = 1, 2;   y2 = (y2_1, y2_2)^T

Output:
y_out = (w3)^T y2 = w3_1 y2_1 + w3_2 y2_2
12
McCulloch-Pitts model:

y_out = w^T x

Nonlinear generalization of the McCulloch-Pitts neuron:

y = f(x, w)

where y is the neuron's output, x is the vector of inputs, and w is the
vector of synaptic weights.

Examples:

Sigmoidal neuron:  y = 1 / (1 + e^(-w^T x - a))
Gaussian neuron:   y = e^(-||x - w||^2 / (2 a^2))
13
Learning Rules
ENERGY MINIMIZATION
We need an appropriate definition of energy for artificial neural
networks, and having that we can use mathematical optimisation
techniques to find how to change the weights of the synaptic
connections between neurons.

ENERGY = measure of task performance error
14
Perceptron Learning Rule (contd.)
The rule is applicable only for a binary neuron response (-1 or +1).

1. Start with random weights for the connections.
2. Select an input vector x from the set of training samples.
3. If the output y_k ≠ d_k (the perceptron gives an incorrect response),
modify all connections w_i according to:

   ΔW_i = η (d_k - y_k) x_i,   where η is the learning rate.
15
Delta Learning Rule (contd.)
The delta learning rule is valid only for continuous activation functions
and in the supervised training mode. It performs gradient descent on the
error

E(t) = 1/2 [d_i - f(W^T X)]^2
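Gradient descent on E(t) gives the update Δw_i = η (d - y) f'(W^T X) x_i. A minimal one-step Python sketch using a logistic f, whose derivative is y(1 - y) (the names are ours):

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def delta_rule_step(w, x, d, eta):
    # E = 1/2 * (d - f(w.x))^2; one gradient-descent step on E
    s = sum(wi * xi for wi, xi in zip(w, x))
    y = sigmoid(s)
    fprime = y * (1.0 - y)   # derivative of the logistic at s
    # w_i <- w_i + eta * (d - y) * f'(s) * x_i
    return [wi + eta * (d - y) * fprime * xi for wi, xi in zip(w, x)]
```

For a small enough learning rate, a single step reduces the error E on that sample.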
16
Hebbian Learning Formula (contd.)
If x_j is the input of the neuron, x_i the output of the neuron, w_ij
the strength of the connection between them, and η the learning rate,
then one form of the learning formula is:

ΔW_ij(t) = η x_j x_i


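The Hebbian update strengthens a connection whenever input and output are active together; a minimal Python sketch for the incoming weights of one output neuron (the names are ours):

```python
def hebbian_step(w, x_in, x_out, eta):
    # Delta w_j = eta * x_out * x_in_j: co-active units strengthen their connection
    return [wj + eta * x_out * xj for wj, xj in zip(w, x_in)]
```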
17
Back-Propagation Learning Model
(contd.)
Solution of the complicated learning problem:
a) first calculate the changes for the synaptic weights of the
output neurons;
b) then calculate the changes backward, starting from layer p-1, and
propagate the local error terms backward.





The method is still relatively complicated but it is much simpler than the
original optimisation problem.
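Steps (a) and (b) can be sketched for one hidden layer and one logistic output: compute the output neuron's local error term first, then propagate it back to the hidden weights. A minimal Python illustration (names and the specific derivatives assume logistic activations throughout; this is our sketch, not the slides' notation):

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def backprop_step(x, d, W1, w2, eta):
    # forward pass: hidden activations h, then output y
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    y = sigmoid(sum(w * hi for w, hi in zip(w2, h)))
    # (a) local error term of the output neuron (logistic derivative is y(1 - y))
    delta_out = (d - y) * y * (1.0 - y)
    # (b) propagate backward: hidden error terms use the old output weights
    delta_h = [delta_out * w2[k] * h[k] * (1.0 - h[k]) for k in range(len(h))]
    # gradient-descent weight updates for both layers
    w2_new = [w2[k] + eta * delta_out * h[k] for k in range(len(h))]
    W1_new = [[W1[k][j] + eta * delta_h[k] * x[j] for j in range(len(x))]
              for k in range(len(W1))]
    return W1_new, w2_new, y
```

Repeating the step on one sample drives the output toward its target, illustrating the two-phase (forward, then backward) structure of the method.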
18
Issues and Applications of Artificial
Neural Networks
1. Artificial Intellect with Neural Networks
2. Technical Diagnostics
3. Intelligent Data Analysis and Signal Processing
4. Advanced Robotics
5. Machine Vision
6. Image & Pattern Recognition
7. Intelligent Security Systems
8. Intelligent Medical Devices
9. Intelligent Expert Systems
19
References
1) Neural Networks for Pattern Recognition - Christopher Bishop, Clarendon
Press, Oxford, 1995.
2) The Essence of Neural Networks - Robert Callan, Prentice Hall Europe, 1999.
3) An Introduction to Neural Networks - Kevin Gurney, UCL Press, 1997.
4) Wikipedia Encyclopedia.
5) Process Control: Principles and Applications - Surekha Bhanot.
6) Introduction to Neural Networks - S.N. Sivanandam, S. Sumathi, S.N. Deepa.
7) Papaloukas, C., Fotiadis, D. I., Likas, A., & Michalis, L. K. (2002b). An ischemia
detection method based on artificial neural networks. Artificial Intelligence
in Medicine, 24, 167-178.
8) Green, M. et al. (2006). Comparison between neural networks and multiple
logistic regression to predict acute coronary syndrome in the emergency
room, Elsevier B.V.
9) Neural Networks: A Comprehensive Foundation - Simon Haykin, Prentice Hall,
1999.
10) An Introduction to the Theory of Neural Computation - J. Hertz, A. Krogh &
R.G. Palmer, Addison Wesley, 1991.









