
Prashant Kumar (PhD First Year)

Q: Generate the output of the logic AND function using the McCulloch-Pitts neuron model.

Ans: clear;
clc;
%Getting weights and threshold value
disp('Enter weights');
w1=input('Weight w1=');
w2=input('weight w2=');
disp('Enter Threshold Value');
theta=input('theta=');
x1=[0 0 1 1];
x2=[0 1 0 1];
z=[0 0 0 1];
con=1;
while con
neuron1=x1*w1+x2*w2;
for i=1:4
if neuron1(i)>=theta
y(i)=1;
else
y(i)=0;
end
end
disp('Output of Net');
disp(y);
if y==z
con=0;
else
disp('Net is not learning; enter another set of weights and threshold value');
disp('Enter weights');
w1=input('Weight w1=');
w2=input('weight w2=');
disp('Enter Threshold Value');
theta=input('theta=');
end
end
disp('McCulloch-Pitts Net for AND function');
disp('Weights of Neuron 1');
disp(w1);
disp(w2);
disp('Threshold value');
disp(theta);
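For reference, a standard set of parameters that satisfies the AND truth table is w1 = w2 = 1 with theta = 2; these particular values are not produced by the interactive program above, so the following minimal, non-interactive check is only an illustrative sketch under that assumption.

% Minimal check of the McCulloch-Pitts AND neuron (assumed weights w1 = w2 = 1, theta = 2)
x1 = [0 0 1 1];
x2 = [0 1 0 1];
w1 = 1; w2 = 1; theta = 2;
y = double((x1*w1 + x2*w2) >= theta);   % fires only when both inputs are 1
disp(y);                                % expected output: 0 0 0 1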

Q: Generate the output of the logic OR function using the McCulloch-Pitts neuron model.

Ans: clear;
clc;
%Getting weights and threshold value
disp('Enter weights');
w1=input('Weight w1=');
w2=input('weight w2=');
disp('Enter Threshold Value');
theta=input('theta=');
x1=[0 0 1 1];
x2=[0 1 0 1];
z=[0 1 1 1];
con=1;
while con
neuron1=x1*w1+x2*w2;
for i=1:4
if neuron1(i)>=theta
y(i)=1;
else
y(i)=0;
end
end
disp('Output of Net');
disp(y);
if y==z
con=0;
else
disp('Net is not learning; enter another set of weights and threshold value');
disp('Enter weights');
w1=input('Weight w1=');
w2=input('weight w2=');
disp('Enter Threshold Value');
theta=input('theta=');
end
end
disp('McCulloch-Pitts Net for OR function');
disp('Weights of Neuron 1');
disp(w1);
disp(w2);
disp('Threshold value');
disp(theta);
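Similarly, one standard parameter choice for the OR function is w1 = w2 = 1 with theta = 1 (an assumption, not output of the program above); a minimal sketch of the check:

% Minimal check of the McCulloch-Pitts OR neuron (assumed weights w1 = w2 = 1, theta = 1)
x1 = [0 0 1 1];
x2 = [0 1 0 1];
w1 = 1; w2 = 1; theta = 1;
y = double((x1*w1 + x2*w2) >= theta);   % fires when at least one input is 1
disp(y);                                % expected output: 0 1 1 1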

Q: Generate the output of the logic NOT function using the McCulloch-Pitts neuron model.

Ans: clear;
clc;
%Getting weights and threshold value
disp('Enter weights');
w=input('Weight w=');
disp('Enter Threshold Value');
theta=input('theta=');
x=[0 1];
z=[1 0];
con=1;
while con
neuron1=x*w;
for i=1:2
if neuron1(i)>=theta
y(i)=1;
else
y(i)=0;
end
end
disp('Output of Net');
disp(y);
if y==z
con=0;
else
disp('Net is not learning; enter another set of weights and threshold value');
disp('Enter weights');
w=input('Weight w=');
disp('Enter Threshold Value');
theta=input('theta=');
end
end
disp('McCulloch-Pitts Net for NOT function');
disp('Weights of Neuron');
disp(w);
disp('Threshold value');
disp(theta);
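For the NOT function an inhibitory weight is needed; one standard choice is w = -1 with theta = 0 (again an assumed example, not produced by the program above):

% Minimal check of the McCulloch-Pitts NOT neuron (assumed weight w = -1, theta = 0)
x = [0 1];
w = -1; theta = 0;
y = double((x*w) >= theta);   % fires only when the input is 0
disp(y);                      % expected output: 1 0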

Q: Generate the output of the logic XOR function using the McCulloch-Pitts neuron model.

Ans: clear;
clc;
%Getting weights and threshold value
disp('Enter weights');
w11=input('Weight w11=');
w12=input('weight w12=');
w21=input('Weight w21=');
w22=input('weight w22=');
v1=input('weight v1=');
v2=input('weight v2=');
disp('Enter Threshold Value');
theta=input('theta=');
x1=[0 0 1 1];
x2=[0 1 0 1];
z=[0 1 1 0];
con=1;
while con
zin1=x1*w11+x2*w21;
zin2=x1*w12+x2*w22;
for i=1:4
if zin1(i)>=theta
y1(i)=1;
else
y1(i)=0;
end
if zin2(i)>=theta
y2(i)=1;
else
y2(i)=0;
end
end
yin=y1*v1+y2*v2;
for i=1:4
if yin(i)>=theta
y(i)=1;
else
y(i)=0;
end
end
disp('Output of Net');
disp(y);
if y==z
con=0;
else
disp('Net is not learning; enter another set of weights and threshold value');
w11=input('Weight w11=');
w12=input('weight w12=');
w21=input('Weight w21=');
w22=input('weight w22=');
v1=input('weight v1=');
v2=input('weight v2=');
theta=input('theta=');
end
end
disp('McCulloch-Pitts Net for XOR function');
disp('Weights of Neuron Z1');
disp(w11);
disp(w21);
disp('weights of Neuron Z2');
disp(w12);
disp(w22);
disp('weights of Neuron Y');
disp(v1);
disp(v2);
disp('Threshold value');
disp(theta);
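Because XOR is not linearly separable, the net above uses two hidden neurons, Z1 = x1 AND NOT x2 and Z2 = NOT x1 AND x2, combined by Y = Z1 OR Z2. One parameter set that works with a common threshold theta = 1 is w11 = 1, w21 = -1, w12 = -1, w22 = 1, v1 = v2 = 1; these values are an illustrative assumption, not produced by the program above.

% Minimal check of the two-layer McCulloch-Pitts XOR net (assumed weights)
x1 = [0 0 1 1];
x2 = [0 1 0 1];
w11 = 1; w21 = -1;   % weights into Z1 (x1 AND NOT x2)
w12 = -1; w22 = 1;   % weights into Z2 (NOT x1 AND x2)
v1 = 1; v2 = 1; theta = 1;
z1 = double((x1*w11 + x2*w21) >= theta);
z2 = double((x1*w12 + x2*w22) >= theta);
y  = double((z1*v1 + z2*v2) >= theta);   % Y = Z1 OR Z2
disp(y);                                 % expected output: 0 1 1 0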

Q: Generate the output of the logic NOR function using the McCulloch-Pitts neuron model.

Ans: clear;
clc;
%Getting weights and threshold value
disp('Enter weights');
w1=input('Weight for first neuron w1=');
w2=input('weight for first neuron w2=');
v=input('weight for second neuron v=');
disp('Enter Threshold Value for First Neuron');
theta1=input('theta-1=');
disp('Enter Threshold Value for Second Neuron');
theta2=input('theta-2=');
x1=[0 0 1 1];
x2=[0 1 0 1];
z=[1 0 0 0];
con=1;
while con
neuron1=x1*w1+x2*w2;
for i=1:4
if neuron1(i)>=theta1
y1(i)=1;
else
y1(i)=0;
end
end
y2=y1*v;
for i=1:4
if y2(i)>=theta2
y(i)=1;
else
y(i)=0;
end
end
disp('Output of Net');
disp(y);
if y==z
con=0;
else
disp('Net is not learning; enter another set of weights and threshold values');
disp('Enter weights');
w1=input('Weight for first neuron w1=');
w2=input('Weight for first neuron w2=');
v=input('Weight for second neuron v=');
disp('Enter Threshold Value for First Neuron');
theta1=input('theta-1=');
disp('Enter Threshold Value for Second Neuron');
theta2=input('theta-2=');
end
end
disp('McCulloch-Pitts Net for NOR function');
disp('Weights of Neuron 1');
disp(w1);
disp(w2);
disp('Weights of Neuron 2');
disp(v);
disp('Threshold value for first Neuron');
disp(theta1);
disp('Threshold value for second Neuron');
disp(theta2);
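Here the first neuron computes OR and the second neuron inverts it. One parameter set that works is w1 = w2 = 1, theta1 = 1, v = -1, theta2 = 0 (assumed values for illustration only):

% Minimal check of the two-neuron McCulloch-Pitts NOR net (assumed weights)
x1 = [0 0 1 1];
x2 = [0 1 0 1];
w1 = 1; w2 = 1; theta1 = 1;   % first neuron: OR
v = -1; theta2 = 0;           % second neuron: NOT
y1 = double((x1*w1 + x2*w2) >= theta1);
y  = double((y1*v) >= theta2);
disp(y);                      % expected output: 1 0 0 0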

Q: Delta Rule for a Single Output Unit


Ans: The delta rule changes the weights of the connections so as to minimize the difference between the net input to the output unit, y_{in}, and the target value t.
The delta rule is
\Delta w_i = \alpha (t - y_{in}) x_i
where
x_i is the activation of the i-th input unit,
y_{in} = \sum_i x_i w_i is the net input to the output unit,
t is the target output, and \alpha is the learning rate.

The mean square error for a particular training pattern is
E = \sum_j (t_j - y_{inj})^2
The gradient of E is a vector consisting of the partial derivatives of E with respect to each of the weights. The error can be reduced rapidly by adjusting the weight w_{IJ} in the direction of the negative gradient.
Taking the partial derivative of E with respect to w_{IJ}:
\frac{\partial E}{\partial w_{IJ}} = \frac{\partial}{\partial w_{IJ}} \sum_j (t_j - y_{inj})^2 = \frac{\partial}{\partial w_{IJ}} (t_J - y_{inJ})^2
since the weight w_{IJ} influences the error only at the output unit y_J.
Also, since y_{inJ} = \sum_{i=1}^{n} x_i w_{iJ}, we get
\frac{\partial E}{\partial w_{IJ}} = -2 (t_J - y_{inJ}) \frac{\partial y_{inJ}}{\partial w_{IJ}} = -2 (t_J - y_{inJ}) x_I
Thus the error will be reduced rapidly, depending upon the chosen learning rate, by adjusting the weights according to the delta rule
\Delta w_{IJ} = \alpha (t_J - y_{inJ}) x_I
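As a concrete illustration, the rule can be applied pattern by pattern to a single linear output unit. The sketch below is only a minimal example: the training set, the bias treatment, the learning rate, and the epoch count are assumptions chosen for illustration, not part of the derivation above.

% Minimal sketch of delta-rule training for a single linear output unit
X = [0 0; 0 1; 1 0; 1 1];   % assumed input patterns, one per row
t = [0; 1; 1; 1];           % assumed target values
w = zeros(2,1);             % weights
b = 0;                      % bias, treated as a weight on a constant input of 1
alpha = 0.1;                % assumed learning rate
for epoch = 1:100           % assumed number of passes over the data
    for p = 1:size(X,1)
        yin = X(p,:)*w + b;          % net input y_in to the output unit
        err = t(p) - yin;            % (t - y_in)
        w = w + alpha*err*X(p,:)';   % delta rule: dw_i = alpha*(t - y_in)*x_i
        b = b + alpha*err;
    end
end
disp(w');   % learned weights for the assumed data
disp(b);    % learned bias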
