
NAME: FAHMIL KHOIR NST

NPM (student ID): 17351197

BACKPROPAGATION (KCB)

>> p=[0.25 0.25 0.5 0.5 0.5 0.75 0.75 0.75 0.5 0.5;60 80 70 70 70 70 70 70 70 70;65 70 80 80 80 70 90 70 80 80;1 0 1 1 0 1 0 1 0 1];

>> t=[1 1 0 0 1 1 0 1 1 0;0 0 1 1 0 0 1 0 0 1;0 0 0 0 0 0 0 0 0 0];
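Here p holds the 10 training samples as columns, each with 4 input features, and t holds the matching 3-row target vectors, with exactly one active class per sample (the third output row is never the target). A quick sanity check of that structure, sketched in Python since the transcript itself is MATLAB:

```python
# Target matrix from the transcript: 3 output rows x 10 samples.
t = [
    [1, 1, 0, 0, 1, 1, 0, 1, 1, 0],
    [0, 0, 1, 1, 0, 0, 1, 0, 0, 1],
    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
]

# Each column (one sample) should activate exactly one target unit.
for j in range(10):
    column = [t[i][j] for i in range(3)]
    assert sum(column) == 1, f"sample {j} has {sum(column)} active targets"
print("all 10 samples have exactly one target class")
```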

>> net= newff(minmax(p),[6,3],{'tansig','logsig'},'traingdx');

Warning: NEWFF used in an obsolete way.
> In nntobsu at 18
  In newff at 86
See help for NEWFF to update calls to the new argument list.
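The newff call builds a 4-6-3 network: 6 hidden neurons with the 'tansig' transfer function and 3 output neurons with 'logsig', trained with 'traingdx'. As an illustration (in Python, since the transcript is MATLAB), the two transfer functions follow MATLAB's documented formulas:

```python
import math

def tansig(n):
    # MATLAB's tansig: 2/(1+exp(-2n)) - 1, mathematically equal to tanh(n);
    # squashes to (-1, 1).
    return 2.0 / (1.0 + math.exp(-2.0 * n)) - 1.0

def logsig(n):
    # MATLAB's logsig: 1/(1+exp(-n)); squashes to (0, 1),
    # which is why the network outputs stay strictly between 0 and 1.
    return 1.0 / (1.0 + math.exp(-n))

print(tansig(0.0))  # 0.0
print(logsig(0.0))  # 0.5
```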

>> net.IW{1,1}

ans =

4.1467 -0.0730 0.1205 1.9250

5.7831 0.0167 -0.0042 3.2742

-4.8053 0.1473 0.0774 1.0032

4.2381 0.1192 -0.0734 -2.3800

2.2631 -0.1464 -0.0268 2.9847

-4.0877 0.1195 0.0845 2.2040

>> net.b{1}

ans =

-9.4560

-6.6895

-13.9707

-3.1408

11.0090

-16.1588

>> net.LW{2,1}

ans =

0.8129 -0.4901 0.9371 -2.0640 0.8861 -2.1173

1.2144 0.7326 -2.2060 -1.8983 -0.8618 -0.2886

1.1143 -1.5069 -1.0224 1.4824 2.0634 -0.5428

>> net.b{2}

ans =

-3.3626

3.3626

>> [a,Pf,Af,e,Perf]=sim(net,p,[],[],t)

a =

Columns 1 through 8

0.7177 0.0247 0.6324 0.6324 0.0290 0.0274 0.0318 0.0274

0.5189 0.0079 0.1525 0.1525 0.0315 0.3416 0.0959 0.3416

0.9932 0.9393 0.9313 0.9313 0.9933 0.9954 0.9985 0.9954

Columns 9 through 10

0.0290 0.6324

0.0315 0.1525

0.9933 0.9313

Pf =

[]

Af =

[]
e =

Columns 1 through 8

0.2823 0.9753 -0.6324 -0.6324 0.9710 0.9726 -0.0318 0.9726

-0.5189 -0.0079 0.8475 0.8475 -0.0315 -0.3416 0.9041 -0.3416

-0.9932 -0.9393 -0.9313 -0.9313 -0.9933 -0.9954 -0.9985 -0.9954

Columns 9 through 10

0.9710 -0.6324

-0.0315 0.8475

-0.9933 -0.9313

Perf =

0.6303
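Perf is the network's mse performance: the mean of the squared entries of the error matrix e = t - a over all 3 x 10 = 30 values. Recomputing it from the printed errors (values copied from the transcript, so rounded to 4 decimals) reproduces the 0.6303 of the untrained network:

```python
# Error matrix e = t - a for the untrained network (from the transcript).
e = [
    [0.2823, 0.9753, -0.6324, -0.6324, 0.9710, 0.9726, -0.0318, 0.9726, 0.9710, -0.6324],
    [-0.5189, -0.0079, 0.8475, 0.8475, -0.0315, -0.3416, 0.9041, -0.3416, -0.0315, 0.8475],
    [-0.9932, -0.9393, -0.9313, -0.9313, -0.9933, -0.9954, -0.9985, -0.9954, -0.9933, -0.9313],
]

# mse = mean of all squared errors.
mse = sum(x * x for row in e for x in row) / 30
print(round(mse, 4))  # 0.6303
```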

>> net.trainParam.epochs= 100000;

>> net.trainParam.show=500;

>> net.trainParam.goal=0.01;
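These settings cap training at 100000 epochs, report progress every 500 epochs, and stop early once mse drops to 0.01. The chosen trainer, traingdx, is gradient descent with momentum plus an adaptive learning rate: after each epoch, if the error grew by more than max_perf_inc the step is rejected and the rate shrinks by lr_dec; if the error fell, the step is kept and the rate grows by lr_inc. A minimal sketch of that adaptation rule, assuming the toolbox's documented default parameter values:

```python
def adapt_lr(lr, perf_new, perf_old,
             lr_inc=1.05, lr_dec=0.7, max_perf_inc=1.04):
    """One traingdx-style learning-rate update (sketch, default params assumed).

    Returns (new_lr, keep_step)."""
    if perf_new > perf_old * max_perf_inc:
        # Error grew too much: discard the step, shrink the rate.
        return lr * lr_dec, False
    if perf_new < perf_old:
        # Error fell: accept the step and grow the rate.
        return lr * lr_inc, True
    # Error roughly flat: accept the step, keep the rate.
    return lr, True

print(adapt_lr(0.01, 0.5, 1.0))  # rate grows, step kept
print(adapt_lr(0.01, 2.0, 1.0))  # rate shrinks, step discarded
```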

>> net=train(net,p,t)
net =

Neural Network object:

architecture:

numInputs: 1

numLayers: 2

biasConnect: [1; 1]

inputConnect: [1; 0]

layerConnect: [0 0; 1 0]

outputConnect: [0 1]

numOutputs: 1 (read-only)

numInputDelays: 0 (read-only)

numLayerDelays: 0 (read-only)

subobject structures:

inputs: {1x1 cell} of inputs

layers: {2x1 cell} of layers

outputs: {1x2 cell} containing 1 output

biases: {2x1 cell} containing 2 biases

inputWeights: {2x1 cell} containing 1 input weight

layerWeights: {2x2 cell} containing 1 layer weight


functions:

adaptFcn: 'trains'

divideFcn: (none)

gradientFcn: 'calcgrad'

initFcn: 'initlay'

performFcn: 'mse'

plotFcns: {'plotperform','plottrainstate','plotregression'}

trainFcn: 'traingdx'

parameters:

adaptParam: .passes

divideParam: (none)

gradientParam: (none)

initParam: (none)

performParam: (none)

trainParam: .show, .showWindow, .showCommandLine, .epochs,
            .time, .goal, .max_fail, .lr,
            .lr_inc, .lr_dec, .max_perf_inc, .mc,
            .min_grad

weight and bias values:


IW: {2x1 cell} containing 1 input weight matrix

LW: {2x2 cell} containing 1 layer weight matrix

b: {2x1 cell} containing 2 bias vectors

other:

name: ''

userdata: (user information)

>> [a,Pf,Af,e,Perf]=sim(net,p,[],[],t)

a =

Columns 1 through 8

0.8944 0.8976 0.1590 0.1590 0.8507 0.8956 0.1261 0.8956

0.0800 0.0771 0.8751 0.8751 0.1208 0.0789 0.9054 0.0789

0.0073 0.0072 0.0485 0.0485 0.0090 0.0073 0.0552 0.0073

Columns 9 through 10

0.8507 0.1590

0.1208 0.8751

0.0090 0.0485
Pf =

[]

Af =

[]

e =

Columns 1 through 8

0.1056 0.1024 -0.1590 -0.1590 0.1493 0.1044 -0.1261 0.1044

-0.0800 -0.0771 0.1249 0.1249 -0.1208 -0.0789 0.0946 -0.0789

-0.0073 -0.0072 -0.0485 -0.0485 -0.0090 -0.0073 -0.0552 -0.0073

Columns 9 through 10

0.1493 -0.1590

-0.1208 0.1249

-0.0090 -0.0485
Perf =

0.0100

>> a'

ans =

0.8944 0.0800 0.0073

0.8976 0.0771 0.0072

0.1590 0.8751 0.0485

0.1590 0.8751 0.0485

0.8507 0.1208 0.0090

0.8956 0.0789 0.0073

0.1261 0.9054 0.0552

0.8956 0.0789 0.0073

0.8507 0.1208 0.0090

0.1590 0.8751 0.0485
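With performance down at the 0.01 goal, each row of a' (one sample per row after the transpose) can be read as a class decision: the largest of the three outputs picks the class, and for all 10 samples it matches the active row of t. Checked in Python with the outputs copied from the transcript:

```python
# Rows of a' (trained outputs, one sample per row) from the transcript.
a_T = [
    [0.8944, 0.0800, 0.0073],
    [0.8976, 0.0771, 0.0072],
    [0.1590, 0.8751, 0.0485],
    [0.1590, 0.8751, 0.0485],
    [0.8507, 0.1208, 0.0090],
    [0.8956, 0.0789, 0.0073],
    [0.1261, 0.9054, 0.0552],
    [0.8956, 0.0789, 0.0073],
    [0.8507, 0.1208, 0.0090],
    [0.1590, 0.8751, 0.0485],
]

# Index of the 1 in each target column of t (0 = class 1, 1 = class 2).
targets = [0, 0, 1, 1, 0, 0, 1, 0, 0, 1]

# Winner-take-all: the largest output decides the class.
predicted = [row.index(max(row)) for row in a_T]
assert predicted == targets
print("10/10 samples classified correctly")
```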

>>
