
Artificial Neural Networks

The Tutorial
With MATLAB
Contents

1. PERCEPTRON
   1.1. Classification with a 2-input perceptron
   1.2. Classification with a 3-input perceptron
   1.3. Classification with a 2-neuron perceptron
   1.4. Classification with a 2-layer perceptron
2. LINEAR NETWORKS
   2.1. Pattern association with a linear neuron
   2.2. Training a linear layer
   2.3. Adaptive linear layer
   2.4. Linear prediction
   2.5. Adaptive linear prediction
3. BACKPROPAGATION NETWORKS
   3.1. Pattern association with a logsig neuron
1. Perceptron

1.1. Classification with a 2-input perceptron

SIMUP - Simulates a perceptron layer.
TRAINP - Trains a perceptron layer with the perceptron rule.

Using the above functions, a 2-input hard limit neuron is trained to classify 4 input vectors into two categories.

DEFINING A CLASSIFICATION PROBLEM
A matrix P defines four 2-element input (column) vectors:

P = [-0.5 -0.5 +0.3 +0.0;
     -0.5 +0.5 -0.5 +1.0];

A row vector T defines the vectors' target categories:

T = [1 1 0 0];

PLOTTING THE VECTORS TO CLASSIFY
We can plot these vectors with PLOTPV:

plotpv(P,T);

The perceptron must properly classify the 4 input vectors in P into the two categories defined by T.

DEFINE THE PERCEPTRON
Perceptrons have HARDLIM neurons. These neurons are capable of separating an input space with a straight line into two categories (0 and 1).
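What such a neuron computes can be written out in a few lines. The following is a plain-Python sketch (the tutorial's own code is MATLAB; the weights here are arbitrary examples, not values INITP would produce):

```python
def hardlim(n):
    """Hard limit transfer function: 1 if the net input is >= 0, else 0."""
    return 1 if n >= 0 else 0

def neuron_output(W, b, p):
    """a = hardlim(W*p + b) for one neuron: weight list W, bias b, input p."""
    n = sum(w_i * p_i for w_i, p_i in zip(W, p)) + b
    return hardlim(n)

# Points on either side of the decision line W*p + b = 0 get different classes.
print(neuron_output([1.0, 2.0], -1.0, [0.5, 0.5]))    # 1 (net input +0.5)
print(neuron_output([1.0, 2.0], -1.0, [-1.0, -1.0]))  # 0 (net input -4.0)
```

The line W*p + b = 0 is exactly the classification line PLOTPC draws below.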
INITP generates initial weights and biases for our neuron:

[W,b] = initp(P,T)

INITP - Initializes a perceptron layer.
[W,B] = INITP(P,T)
  P - RxQ matrix of input vectors.
  T - SxQ matrix of target outputs.
Returns weights and biases.

INITIAL PERCEPTRON CLASSIFICATION
The input vectors can be replotted...

plotpv(P,T)

...with the neuron's initial attempt at classification:

plotpc(W,b)

The neuron probably does not yet make a good classification! Fear not... we are going to train it.

TRAINING THE PERCEPTRON
TRAINP trains perceptrons to classify input vectors. It returns new weights and biases that will form a better classifier, the number of epochs the perceptron was trained, and the perceptron's errors throughout training.

[W,b,epochs,errors] = trainp(W,b,P,T,-1);

TRAINP - Trains a perceptron layer with the perceptron rule.
[W,B,TE,TR] = TRAINP(W,B,P,T,TP)
  W  - SxR weight matrix.
  B  - Sx1 bias vector.
  P  - RxQ matrix of input vectors.
  T  - SxQ matrix of target vectors.
  TP - Training parameters (optional).
Returns:
  W  - New weight matrix.
  B  - New bias vector.
  TE - Trained epochs.
  TR - Training record: errors in row vector.
Training parameters are:
  TP(1) - Epochs between updating display, default = 1.
  TP(2) - Maximum number of epochs to train, default = 100.
Missing parameters and NaN's are replaced with defaults.
If TP(1) is negative, and a 1-input neuron is being trained, the input vectors and classification line are plotted instead of the network error.

PLOTTING THE ERROR CURVE
Here the errors are plotted with respect to training epochs:

ploterr(errors);

USING THE CLASSIFIER
We can now classify any vector using SIMUP. Let's try an input vector of [-0.5; 0.5]:

p = [-0.5; 0.5];
a = simup(p,W,b)

SIMUP - Simulates a perceptron layer.
SIMUP(P,W,B)
  P - RxQ matrix of input (column) vectors.
  W - SxR weight matrix.
  B - Sx1 bias (column) vector.
Returns outputs of the perceptron layer.

Now, use SIMUP yourself to test whether [0.3; -0.5] is correctly classified as 0.
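The rule TRAINP applies is simple enough to sketch in plain Python: for each pattern, the error e = t - hardlim(W*p + b) drives the updates W <- W + e*p and b <- b + e, repeated until an epoch passes with no mistakes. A minimal sketch on this section's four vectors (an illustration, not toolbox code):

```python
def hardlim(n):
    return 1 if n >= 0 else 0

def train_perceptron(P, T, max_epochs=100):
    """Perceptron rule: per pattern, e = t - a; then W <- W + e*p, b <- b + e."""
    W = [0.0] * len(P[0])
    b = 0.0
    for epoch in range(1, max_epochs + 1):
        mistakes = 0
        for p, t in zip(P, T):
            a = hardlim(sum(w * x for w, x in zip(W, p)) + b)
            e = t - a
            if e != 0:
                mistakes += 1
                W = [w + e * x for w, x in zip(W, p)]
                b += e
        if mistakes == 0:          # every vector classified correctly
            return W, b, epoch
    return W, b, max_epochs

# This section's four input vectors (the columns of P) and their targets:
P = [[-0.5, -0.5], [-0.5, 0.5], [0.3, -0.5], [0.0, 1.0]]
T = [1, 1, 0, 0]
W, b, epochs = train_perceptron(P, T)
print([hardlim(sum(w * x for w, x in zip(W, p)) + b) for p in P])  # [1, 1, 0, 0]
```

Because these vectors are linearly separable, the perceptron convergence theorem guarantees the loop terminates.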
1.2. Classification with a 3-input perceptron

Using the above functions, a 3-input hard limit neuron is trained to classify 8 input vectors into two categories.

DEFINING A CLASSIFICATION PROBLEM
A matrix P defines eight 3-element input (column) vectors:

P = [-1 +1 -1 +1 -1 +1 -1 +1;
     -1 -1 +1 +1 -1 -1 +1 +1;
     -1 -1 -1 -1 +1 +1 +1 +1];

A row vector T defines the vectors' target categories:

T = [0 1 0 0 1 1 0 1];

PLOTTING THE VECTORS TO CLASSIFY
We can plot these vectors with PLOTPV:

plotpv(P,T);

The perceptron must properly classify the 8 input vectors in P into the two categories defined by T.

DEFINE THE PERCEPTRON

[W,b] = initp(P,T)

INITIAL PERCEPTRON CLASSIFICATION
The input vectors can be replotted...

plotpv(P,T)

...with the neuron's initial attempt at classification:

plotpc(W,b)

The neuron probably does not yet make a good classification! Fear not... we are going to train it.

TRAINING THE PERCEPTRON

[W,b,epochs,errors] = trainp(W,b,P,T,-1);

PLOTTING THE ERROR CURVE
Here the errors are plotted with respect to training epochs:

ploterr(errors);

USING THE CLASSIFIER
We can now classify any vector using SIMUP. Let's try an input vector of [0.7; 1.2; -0.2]:

p = [0.7; 1.2; -0.2];
a = simup(p,W,b)

Now, use SIMUP to see if [-1; +1; -1] is properly classified as a 0.
1.3. Classification with a 2-neuron perceptron

Using the above functions, a layer of 2 hard limit neurons is trained to classify 10 input vectors into 4 categories.

DEFINING A CLASSIFICATION PROBLEM
A matrix P defines ten 2-element input (column) vectors:

P = [+0.1 +0.7 +0.8 +0.8 +1.0 +0.3 +0.0 -0.3 -0.5 -1.5; ...
     +1.2 +1.8 +1.6 +0.6 +0.8 +0.5 +0.2 +0.8 -1.5 -1.3];

A matrix T defines the categories with target (column) vectors:

T = [1 1 1 0 0 1 1 1 0 0;
     0 0 0 0 0 1 1 1 1 1];

PLOTTING THE VECTORS TO CLASSIFY

plotpv(P,T);

The perceptron must properly classify the 10 input vectors in P into the 4 categories defined by T.

DEFINE THE PERCEPTRON
A perceptron layer with two neurons is able to separate the input space into 4 different categories.

[W,b] = initp(P,T)

INITIAL PERCEPTRON CLASSIFICATION
The input vectors can be replotted...

plotpv(P,T)

...with the layer's initial attempt at classification:

plotpc(W,b)

The layer probably does not yet make a good classification! Fear not... we are going to train it.

TRAINING THE PERCEPTRON

[W,b,epochs,errors] = trainp(W,b,P,T,-1);

PLOTTING THE ERROR CURVE
Here the errors are plotted with respect to training epochs:

ploterr(errors);

USING THE CLASSIFIER
We can now classify any vector we like using SIMUP. Let's try an input vector of [0.7; 1.2]:

p = [0.7; 1.2];
a = simup(p,W,b)

Now, use SIMUP to see if [0.1; 1.2] is properly classified as [1; 0].
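Why two hard-limit neurons suffice for four categories: each neuron contributes one output bit, so the layer's output pair can label up to 2^2 = 4 classes. A plain-Python sketch with hand-picked weights (hypothetical values, not produced by INITP or TRAINP):

```python
def hardlim(n):
    return 1 if n >= 0 else 0

def layer_output(W, b, p):
    """One bit per neuron: hardlim of each neuron's net input."""
    return tuple(hardlim(sum(w * x for w, x in zip(row, p)) + bi)
                 for row, bi in zip(W, b))

W = [[-1.0, 1.0],   # neuron 1 fires when y >= x
     [1.0, 1.0]]    # neuron 2 fires when x + y >= 0
b = [0.0, 0.0]

# One test point in each of the four regions cut out by the two lines:
points = [(0.0, 2.0), (2.0, 0.0), (0.0, -2.0), (-2.0, 0.0)]
codes = [layer_output(W, b, p) for p in points]
print(codes)  # [(1, 1), (0, 1), (0, 0), (1, 0)]: four distinct categories
```

Each neuron draws its own classification line; the two lines together tile the plane into the four categories.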
1.4. Classification with a 2-layer perceptron

Using the above functions, a two-layer perceptron can often classify non-linearly separable input vectors.

The first layer acts as a non-linear preprocessor for the second layer. The second layer is trained as usual.

DEFINING A CLASSIFICATION PROBLEM
A matrix P defines five 2-element input (column) vectors:

P = [-0.5 -0.5 +0.3 -0.1 -0.8;
     -0.5 +0.5 -0.5 +1.0 +0.0];

A row vector T defines the vectors' target categories:

T = [1 1 0 0 0];

PLOTTING THE VECTORS TO CLASSIFY

plotpv(P,T);

The perceptron must properly classify the 5 input vectors in P into the 2 categories defined by T.

Because the vectors are not linearly separable (you cannot draw a line between the x's and o's), a single-layer perceptron cannot classify them properly. We will try using a two-layer perceptron to classify them.

DEFINE THE PERCEPTRON
To maximize the chance that the preprocessing layer finds a linearly separable representation for the input vectors, it needs a lot of neurons. We will try 20.

S1 = 20;

INITP generates initial weights and biases for our network:

[W1,b1] = initp(P,S1);   % Preprocessing layer
[W2,b2] = initp(S1,T);   % Learning layer

TRAINING THE PERCEPTRON
TRAINP trains perceptrons to classify input vectors. The first layer is used to preprocess the input vectors:

A1 = simup(P,W1,b1);

TRAINP is then used to train the second layer to classify the preprocessed input vectors A1:

[W2,b2,epochs,errors] = trainp(W2,b2,A1,T,-1);

PLOTTING THE ERROR CURVE
Here the errors are plotted with respect to training epochs:

ploterr(errors);

If the hidden (first) layer preprocessed the original non-linearly separable input vectors into new linearly separable vectors, then the perceptron will reach 0 error. If the error never reached 0, it means a new preprocessing layer should be created (perhaps with more neurons), i.e. try running this script again.

USING THE CLASSIFIER
If the classifier WORKED we can now classify any vector we like using SIMUP. Let's try an input vector of [0.7; 1.2]:

p = [0.7; 1.2];
a1 = simup(p,W1,b1);   % Preprocess the vector
a2 = simup(a1,W2,b2)   % Classify the vector
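To see why preprocessing makes this problem solvable, here is a plain-Python sketch with hand-picked, hypothetical weights: two hardlim hidden units recode this section's five inputs so that a single output perceptron separates them. The tutorial reaches the same end with a random 20-neuron layer plus training.

```python
def hardlim(n):
    return 1 if n >= 0 else 0

def layer(W, b, p):
    """Outputs of a hardlim layer: one weight row and one bias per neuron."""
    return [hardlim(sum(w * x for w, x in zip(row, p)) + bi)
            for row, bi in zip(W, b)]

# This section's data: the five columns of P and their targets.
P = [[-0.5, -0.5], [-0.5, 0.5], [0.3, -0.5], [-0.1, 1.0], [-0.8, 0.0]]
T = [1, 1, 0, 0, 0]

# Hypothetical hidden layer: unit 1 fires for x <= -0.4, unit 2 for x >= -0.65,
# so together they pick out the narrow band of x containing the two class-1 points.
W1 = [[-1.0, 0.0], [1.0, 0.0]]
b1 = [-0.4, 0.65]
# Output layer: an AND of the two hidden bits.
W2 = [[1.0, 1.0]]
b2 = [-1.5]

A = [layer(W2, b2, layer(W1, b1, p))[0] for p in P]
print(A)  # [1, 1, 0, 0, 0], matching T
```

The hidden layer turned a problem no single line can solve into one a single neuron handles.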
2. Linear networks

2.1. Pattern association with a linear neuron

Using the above functions, a linear neuron is designed to respond to specific inputs with target outputs.

DEFINING A PATTERN ASSOCIATION PROBLEM
P defines two 1-element input patterns (column vectors):

P = [1.0 -1.2];

T defines the associated 1-element targets (column vectors):

T = [0.5 1.0];

PLOTTING THE ERROR SURFACE AND CONTOUR
ERRSURF calculates errors for a neuron with a range of possible weight and bias values. PLOTES plots this error surface with a contour plot underneath.

w_range = -1:0.1:1;
b_range = -1:0.1:1;
ES = errsurf(P,T,w_range,b_range,'purelin');

ERRSURF(P,T,WV,BV,F)
  P  - 1xQ matrix of input vectors.
  T  - 1xQ matrix of target vectors.
  WV - Row vector of values of W.
  BV - Row vector of values of B.
  F  - Transfer function (string).
Returns a matrix of error values over WV and BV.

plotes(w_range,b_range,ES);

PLOTES(WV,BV,ES,V)
  WV - 1xN row vector of values of W.
  BV - 1xM row vector of values of B.
  ES - MxN matrix of error values.
  V  - View, default = [-37.5, 30].
Plots the error surface with a contour underneath. Calculate the error surface ES with ERRSURF.

The best weight and bias values are those that result in the lowest point on the error surface.
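What ERRSURF tabulates here is just the sum-squared error at each grid point. A plain-Python sketch of the same surface and its lowest grid point (an illustration, not toolbox code):

```python
# This section's data: inputs and targets of the linear ('purelin') neuron.
P = [1.0, -1.2]
T = [0.5, 1.0]

def sse(w, b):
    """Sum-squared error of the linear neuron a = w*p + b over all patterns."""
    return sum((t - (w * p + b)) ** 2 for p, t in zip(P, T))

grid = [i / 10 for i in range(-10, 11)]    # -1 to 1 in steps of 0.1
err, w, b = min((sse(w, b), w, b) for w in grid for b in grid)
print(w, b, err)   # the grid point at the lowest spot on the error surface
```

For a linear neuron this surface is a paraboloid, so it has a single minimum, which is what SOLVELIN computes exactly below.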
DESIGN THE NETWORK
The function SOLVELIN will find the weight and bias that result in the minimum error:

[w,b] = solvelin(P,T)

SOLVELIN - Designs a linear network.
[W,B] = SOLVELIN(P,T)
  P - RxQ matrix of Q input vectors.
  T - SxQ matrix of Q target vectors.
Returns:
  W - SxR weight matrix.
  B - Sx1 bias vector.

CALCULATING THE NETWORK ERROR
SIMULIN is used to simulate the neuron for the inputs P:

A = simulin(P,w,b);

SIMULIN - Simulates a linear layer.
SIMULIN(P,W,B)
  P - RxQ matrix of input (column) vectors.
  W - SxR weight matrix of the layer.
  B - Sx1 bias (column) vector of the layer.
Returns outputs of the linear layer.

We can then calculate the neuron's errors:

E = T - A;

SUMSQR adds up the squared errors:

SSE = sumsqr(E)

PLOTTING THE SOLUTION ON THE ERROR SURFACE
PLOTES replots the error surface:

plotes(w_range,b_range,ES);

PLOTEP plots the "position" of the network using the weight and bias values returned by SOLVELIN:

plotep(w,b,SSE)

As can be seen from the plot, SOLVELIN found the minimum error solution.

USING THE PATTERN ASSOCIATOR
We can now test the associator with one of the original inputs, -1.2, and see if it returns the target, 1.0.

p = -1.2;
a = simulin(p,w,b)

Use SIMULIN to check that the neuron's response to 1.0 is 0.5.
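SOLVELIN's design step is ordinary least squares. For this one-input, one-neuron case it reduces to a 2x2 system of normal equations, sketched here in plain Python (an illustration of the method, not the toolbox routine):

```python
# This section's data again: find w, b minimizing sum((t - (w*p + b))^2).
P = [1.0, -1.2]
T = [0.5, 1.0]

n = len(P)
sp, st = sum(P), sum(T)
spp = sum(p * p for p in P)
spt = sum(p * t for p, t in zip(P, T))

# Solve the normal equations  [spp sp; sp n] * [w; b] = [spt; st]  by hand.
det = spp * n - sp * sp
w = (spt * n - sp * st) / det
b = (spp * st - sp * spt) / det
print(w, b)  # w = -0.2272..., b = 0.7272... (exact fit: two points, two parameters)
```

With two data points and two free parameters the fit is exact, which is why the plotted solution sits at the very bottom of the error surface.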
2.2. Training a linear layer

INITLIN - Initializes a linear layer.
TRAINWH - Trains a linear layer with the Widrow-Hoff rule.
SIMULIN - Simulates a linear layer.

Using the above functions, a linear layer is trained to respond to specific inputs with target outputs.

DEFINING A PATTERN ASSOCIATION PROBLEM
P defines four 3-element input patterns (column vectors):

P = [+1.0 +1.5 +1.2 -0.3;
     -1.0 +2.0 +3.0 -0.5;
     +2.0 +1.0 -1.6 +0.9];

T defines the associated 4-element targets (column vectors):

T = [+0.5 +3.0 -2.2 +1.4;
     +1.1 -1.2 +1.7 -0.4;
     +3.0 +0.2 -1.8 -0.4;
     -1.0 +0.1 -1.0 +0.6];

DEFINE THE NETWORK
INITLIN generates initial weights and biases for our layer:

[W,b] = initlin(P,T);

TRAINING THE NETWORK
TRAINWH uses the Widrow-Hoff rule to train PURELIN networks.

me = 400;     % Maximum number of epochs to train.
eg = 0.001;   % Sum-squared error goal.
[W,b,epochs,errors] = trainwh(W,b,P,T,[NaN me eg NaN]);

TRAINWH - Trains a linear layer with the Widrow-Hoff rule.
[W,B,TE,TR] = TRAINWH(W,B,P,T,TP)
  W  - SxR weight matrix.
  B  - Sx1 bias vector.
  P  - RxQ matrix of input vectors.
  T  - SxQ matrix of target vectors.
  TP - Training parameters (optional).
Returns:
  W  - New weight matrix.
  B  - New bias vector.
  TE - The actual number of epochs trained.
  TR - Training record: [row of errors]
Training parameters are:
  TP(1) - Epochs between updating display, default = 25.
  TP(2) - Maximum number of epochs to train, default = 100.
  TP(3) - Sum-squared error goal, default = 0.02.
  TP(4) - Learning rate, default found with MAXLINLR.
Missing parameters and NaN's are replaced with defaults.

The plot shows that the final error met the error goal.

PLOTTING INDIVIDUAL ERRORS
BARERR creates a bar plot of the errors associated with each input vector:

barerr(T-simulin(P,W,b))

BARERR - Plots a bar chart of errors.
BARERR(E)
  E - SxQ matrix of error vectors.
Plots a bar chart of the squared errors in each column.

Note that while the sum of these squared errors is less than our error goal, the individual errors are not the same.

USING THE PATTERN ASSOCIATOR
We can now test the associator with one of the original input vectors, [1; -1; 2], and see if it returns the appropriate target vector [0.5; 1.1; 3; -1].

p = [1; -1; 2];
a = simulin(p,W,b)

Use SIMULIN to check that the layer's response to [1.5; 2; 1] is the target response [3; -1.2; 0.2; 0.1].
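The Widrow-Hoff update TRAINWH applies per pattern is W <- W + lr*e*p' and b <- b + lr*e, with e = t - (W*p + b). Below is a plain-Python sketch on this section's data. The fixed learning rate 0.05 and the 2000-epoch budget are hand-picked assumptions: TRAINWH would instead choose a near-optimal rate via MAXLINLR and typically meets the goal in far fewer epochs.

```python
# This section's data: columns of P are patterns, columns of T are targets.
P = [[+1.0, +1.5, +1.2, -0.3],
     [-1.0, +2.0, +3.0, -0.5],
     [+2.0, +1.0, -1.6, +0.9]]
T = [[+0.5, +3.0, -2.2, +1.4],
     [+1.1, -1.2, +1.7, -0.4],
     [+3.0, +0.2, -1.8, -0.4],
     [-1.0, +0.1, -1.0, +0.6]]

cols = list(zip(*P))        # the four 3-element input patterns
targets = list(zip(*T))     # the matching 4-element target vectors
W = [[0.0] * 3 for _ in range(4)]
b = [0.0] * 4
lr = 0.05                   # hand-picked stable rate (assumption)

for epoch in range(2000):
    for p, t in zip(cols, targets):
        a = [sum(w * x for w, x in zip(row, p)) + bi for row, bi in zip(W, b)]
        e = [ti - ai for ti, ai in zip(t, a)]
        W = [[w + lr * ei * x for w, x in zip(row, p)] for row, ei in zip(W, e)]
        b = [bi + lr * ei for bi, ei in zip(b, e)]

sse = sum((ti - (sum(w * x for w, x in zip(row, p)) + bi)) ** 2
          for p, t in zip(cols, targets)
          for row, bi, ti in zip(W, b, t))
print(sse)  # ends up below the tutorial's 0.001 goal
```

Each neuron of the layer learns independently; the rule is the same LMS update used throughout this chapter.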
2.3. Adaptive linear layer

INITLIN - Initializes a linear layer.
ADAPTWH - Trains a linear layer with the Widrow-Hoff rule.

Using the above functions, a linear neuron is allowed to adapt so that, given one signal, it can predict a second signal.

DEFINING A WAVE FORM
TIME defines the time steps of this simulation:

time = 1:0.0025:5;

P defines a signal over these time steps:

P = sin(sin(time).*time*10);

T is a signal which is linearly related to P:

T = P * 2 + 2;

PLOTTING THE SIGNALS
Here is how the two signals are plotted:

plot(time,P,time,T,'--')
title('Input and Target Signals')
xlabel('Time')
ylabel('Input ___ Target _ _')

DEFINE THE NETWORK

[w,b] = initlin(P,T)

ADAPTING THE LINEAR NEURON
ADAPTWH simulates adaptive linear neurons. It takes initial weights and biases, an input signal, and a target signal, and filters the signal adaptively. The output signal and the error signal are returned, along with the new weights and biases.

lr = 0.01;   % Learning rate.
[a,e,w,b] = adaptwh(w,b,P,T,lr);

PLOTTING THE OUTPUT SIGNAL
Here the output signal of the linear neuron is plotted with the target signal:

plot(time,a,time,T,'--')
title('Output and Target Signals')
xlabel('Time')
ylabel('Output ___ Target _ _')

It does not take the adaptive neuron long to figure out how to generate the target signal.

PLOTTING THE ERROR SIGNAL
A plot of the difference between the neuron's output signal and the target signal shows how well the adaptive neuron works:

plot(time,e)
hold on
plot([min(time) max(time)],[0 0],':r')
hold off
title('Error Signal')
xlabel('Time')
ylabel('Error')
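ADAPTWH's sample-by-sample behavior can be sketched as a single loop over the signal: simulate, record the error, then apply the Widrow-Hoff update before the next sample arrives. A plain-Python sketch, assuming zero initial weight and bias:

```python
import math

# The section's signals: P drives the neuron, T = 2*P + 2 is the target.
time = [1 + 0.0025 * i for i in range(1601)]       # 1 to 5 s, step 0.0025
P = [math.sin(math.sin(t) * t * 10) for t in time]
T = [2 * p + 2 for p in P]

w, b, lr = 0.0, 0.0, 0.01
a, e = [], []
for p, t in zip(P, T):
    out = w * p + b       # simulate with the current weights
    err = t - out
    a.append(out)
    e.append(err)
    w += lr * err * p     # Widrow-Hoff update, one sample at a time
    b += lr * err

print(abs(e[0]), abs(e[-1]))  # the error shrinks as the neuron adapts
```

Because the target really is a fixed linear function of the input, the error decays toward zero once w and b approach 2 and 2.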
2.4. Linear prediction

SOLVELIN - Solves for a linear layer.
SIMULIN - Simulates a linear layer.

Using the above functions, a linear neuron is designed to predict the next value in a signal, given the last five values of the signal.

DEFINING A WAVE FORM
TIME defines the time steps of this simulation:

time = 0:0.025:5;   % from 0 to 5 seconds

T defines the signal in time to be predicted:

T = sin(time*4*pi);

The input P to the network is the last five values of the signal T:

P = delaysig(T,1,5);

DELAYSIG - Creates a delayed signal matrix from a signal matrix.
DELAYSIG(X,D)
  X - SxT matrix with S-element column vectors for T timesteps.
  D - Maximum delay.
Returns signal X delayed by 0, 1, ..., and D timesteps.
DELAYSIG(X,D1,D2)
  X  - SxT matrix with S-element column vectors for T timesteps.
  D1 - Minimum delay.
  D2 - Maximum delay.
Returns signal X delayed by D1, D1+1, ..., and D2 timesteps.
The signal X can be a row vector of values, or a matrix of (column) vectors.
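A sketch of the delayed-signal matrix this call builds, written in plain Python and assuming zeros fill in for samples before the start of the signal:

```python
def delaysig(x, d1, d2):
    """Rows d1..d2: the row-vector signal x delayed by d steps, zero-padded."""
    n = len(x)
    return [[x[i - d] if i >= d else 0.0 for i in range(n)]
            for d in range(d1, d2 + 1)]

X = [1, 2, 3, 4, 5]
for row in delaysig(X, 1, 2):
    print(row)
# [0.0, 1, 2, 3, 4]
# [0.0, 0.0, 1, 2, 3]
```

Column i of the result therefore holds the recent history of the signal at timestep i, which is exactly what the predictor needs as input.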
PLOTTING THE SIGNAL
Here is a plot of the signal to be predicted:

plot(time,T)
axlabel('Time','Target Signal','Signal to be Predicted')

SOLVELIN solves for the weights and biases which will let the linear neuron model the system:

[w,b] = solvelin(P,T)

TESTING THE PREDICTOR
SIMULIN simulates the linear neuron, which attempts to predict the next value in the signal at each timestep:

a = simulin(P,w,b);

The output signal is plotted with the targets:

plot(time,a,time,T,'+')
axlabel('Time','Output _ Target +','Output and Target Signals')

The linear neuron does a good job, doesn't it?

Error is the difference between the output and target signals:

e = T-a;

This error can be plotted:

plot(time,e)
hold on
plot([min(time) max(time)],[0 0],':r')
hold off
axlabel('Time','Error','Error Signal')

Notice how small the error is!
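Why a linear neuron can predict this signal at all: a sampled sinusoid satisfies an exact linear recurrence on its past values, x[n] = 2*cos(omega*dt)*x[n-1] - x[n-2]. A simplified plain-Python sketch using this hand-derived two-delay predictor (the tutorial's SOLVELIN instead finds weights for five delays):

```python
import math

dt = 0.025
omega = 4 * math.pi
T = [math.sin(omega * n * dt) for n in range(201)]   # the signal, 0 to 5 s

w1 = 2 * math.cos(omega * dt)   # weight on the 1-step delay
w2 = -1.0                       # weight on the 2-step delay
a = [w1 * T[n - 1] + w2 * T[n - 2] for n in range(2, len(T))]
e = [t - y for t, y in zip(T[2:], a)]
print(max(abs(x) for x in e))   # tiny (floating-point noise): essentially exact
```

Five delays give SOLVELIN more than enough freedom to represent this relationship, which is why the plotted error is so small.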
2.5. Adaptive linear prediction

INITLIN - Initializes a linear layer.
ADAPTWH - Trains a linear layer with the Widrow-Hoff rule.

Using the above functions, a linear neuron is adaptively trained to predict the next value in a signal, given the last five values of the signal. The linear neuron is able to adapt to changes in the signal it is trying to predict.

DEFINING A WAVE FORM
TIME1 and TIME2 define two segments of time:

time1 = 0:0.05:4;        % from 0 to 4 seconds
time2 = 4.05:0.024:6;    % from 4 to 6 seconds

TIME defines all the time steps of this simulation:

time = [time1 time2];    % from 0 to 6 seconds

T defines a signal which changes frequency once:

T = [sin(time1*4*pi) sin(time2*8*pi)];

The input P to the network is the last five values of the target signal:

P = delaysig(T,1,5);

PLOTTING THE SIGNAL
Here is a plot of the signal to be predicted:

plot(time,T)
axlabel('Time','Target Signal','Signal to be Predicted')

DEFINE THE NETWORK
INITLIN generates initial weights and biases for our neuron:

[w,b] = initlin(P,T)

ADAPTING THE LINEAR NEURON
ADAPTWH simulates adaptive linear neurons. It takes initial weights and biases, an input signal, and a target signal, and filters the signal adaptively. The output signal and the error signal are returned, along with the new weights and biases.

We will use a learning rate of 0.1:

lr = 0.1;
[a,e,w,b] = adaptwh(w,b,P,T,lr);

ADAPTWH - Adapts a linear layer with the Widrow-Hoff rule.
[A,E,W,B] = ADAPTWH(W,B,P,T,lr)
  W  - SxR weight matrix.
  B  - Sx1 bias vector.
  P  - RxQ matrix of input vectors.
  T  - SxQ matrix of target vectors.
  lr - Learning rate (optional, default = 0.1).
Returns:
  A - Output of the adaptive linear filter.
  E - Error of the adaptive linear filter.
  W - New weight matrix.
  B - New bias vector.

PLOTTING THE OUTPUT SIGNAL
Here the output signal of the linear neuron is plotted with the target signal:

plot(time,a,time,T,'--')
axlabel('Time','Output ___ Target _ _','Output and Target Signals')

It does not take the adaptive neuron long to figure out how to generate the target signal.

A plot of the difference between the neuron's output signal and the target signal shows how well the adaptive neuron works:

plot(time,e,[min(time) max(time)],[0 0],':r')
axlabel('Time','Error','Error Signal')
3. Backpropagation networks

3.1. Pattern association with a logsig neuron

INITFF - Initializes a feed-forward network.
TRAINBP - Trains a feed-forward network with backpropagation.
SIMUFF - Simulates a feed-forward network.

Using the above functions, a neuron is trained to respond to specific inputs with target outputs.

DEFINING A VECTOR ASSOCIATION PROBLEM
P defines two 1-element input vectors (column vectors):

P = [-3.0 +2.0];

T defines the associated 1-element targets (column vectors):

T = [+0.4 +0.8];

PLOTTING THE ERROR SURFACE AND CONTOUR
ERRSURF calculates errors for a neuron with a range of possible weight and bias values. PLOTES plots this error surface with a contour plot underneath.

wv = -4:0.4:4;
bv = -4:0.4:4;
es = errsurf(P,T,wv,bv,'logsig');
plotes(wv,bv,es,[60 30]);

The best weight and bias values are those that result in the lowest point on the error surface.

DESIGN THE NETWORK
INITFF is used to initialize the weights and biases for the LOGSIG neuron:

[w,b] = initff(P,T,'logsig')

INITFF - Initializes a feed-forward network of up to 3 layers.
[W1,B1,...] = INITFF(P,S1,'F1',...,Sn,'Fn')
  P  - RxQ matrix of input vectors.
  Si - Size of the ith layer.
  Fi - Transfer function of the ith layer (string).
Returns:
  Wi - Weight matrix of the ith layer.
  Bi - Bias (column) vector of the ith layer.
TRAINING THE NETWORK
TBP1 uses backpropagation to train 1-layer networks.

df = 5;     % Frequency of progress displays (in epochs).
me = 100;   % Maximum number of epochs to train.
eg = 0.01;  % Sum-squared error goal.
lr = 2;     % Learning rate.
[w,b,ep,tr] = tbp1(w,b,'logsig',P,T,[df me eg lr],wv,bv,es,[60 30]);

TBP1 - Trains a 1-layer feed-forward network with backpropagation.
[W,B,TE,TR] = TBP1(W,B,F,P,T,TP)
  W  - SxR weight matrix.
  B  - Sx1 bias vector.
  F  - Transfer function (string).
  P  - RxQ matrix of input vectors.
  T  - SxQ matrix of target vectors.
  TP - Training parameters (optional).
Returns:
  W  - New weights.
  B  - New biases.
  TE - The actual number of epochs trained.
  TR - Training record: [row of errors]
Training parameters are:
  TP(1) - Epochs between updating display, default = 25.
  TP(2) - Maximum number of epochs to train, default = 1000.
  TP(3) - Sum-squared error goal, default = 0.02.
  TP(4) - Learning rate, default = 0.01.
Missing parameters and NaN's are replaced with defaults.

TBP1 has returned new weight and bias values, the number of epochs trained EP, and a record of the training errors TR.

PLOTTING THE ERROR CURVE
Here the errors are plotted with respect to training epochs:

ploterr(tr,eg);

USING THE PATTERN ASSOCIATOR
We can now test the associator with one of the original inputs, -3, and see if it returns the target, 0.4.

p = -3.0;
a = simuff(p,w,b,'logsig')

Training to a lower error goal would reduce this error. Use SIMUFF to check the neuron's response to an input of 2.0. The target response is 0.8.
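The gradient-descent step TBP1 takes for a single logsig neuron can be sketched in plain Python. Each epoch moves w and b down the gradient of the sum-squared error, using the logsig derivative a*(1-a); the learning rate and epoch count below mirror this section's parameters (an illustration, not the toolbox routine):

```python
import math

def logsig(n):
    """Log-sigmoid transfer function."""
    return 1.0 / (1.0 + math.exp(-n))

# This section's association problem.
P = [-3.0, 2.0]
T = [0.4, 0.8]
w, b, lr = 0.0, 0.0, 2.0

for epoch in range(100):
    dw = db = 0.0
    for p, t in zip(P, T):
        a = logsig(w * p + b)
        delta = (t - a) * a * (1.0 - a)   # error times logsig derivative
        dw += delta * p
        db += delta
    w += lr * dw      # batch update: accumulate over patterns, then step
    b += lr * db

sse = sum((t - logsig(w * p + b)) ** 2 for p, t in zip(P, T))
print(sse)  # small: the trained neuron reproduces both targets closely
```

With one neuron and two patterns the error surface is the one plotted above, and gradient descent simply walks downhill across it.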
