ABSTRACT
In this paper, variable memory schemes are proposed to improve the response of the traditional CMAC. When such a network is used to stabilize a system, there is a tradeoff between the weight space of the CMAC and the peak overshoot. This tradeoff can be resolved by varying the weight space according to the error. The weight space can be varied either by varying the quantization or by varying the generalization of the CMAC. An increase in quantization improves the accuracy of the network, but at the cost of an enormous weight space. A decrease in generalization reduces learning interference and hence speeds convergence of the error, but again at the cost of a significant increase in weight space. Thus, when the error level goes beyond the permissible error range, the first scheme, FGVQ, increases the quantization, while the second scheme, FQVG, decreases the generalization. Simulation results obtained in MATLAB show that the variable memory schemes give better results than the traditional CMAC.
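The switching rule common to the two schemes can be sketched in Python as follows (the MATLAB simulations themselves are not reproduced here; the threshold name and the doubling/halving step sizes are illustrative assumptions, not the paper's exact update rule):

```python
def select_memory_params(error, permissible_error,
                         quantization=20, generalization=4,
                         scheme="FGVQ"):
    """Return (quantization, generalization) for the current error level.

    FGVQ (fixed generalization, variable quantization): when the error
    leaves the permissible range, increase the quantization.
    FQVG (fixed quantization, variable generalization): when the error
    leaves the permissible range, decrease the generalization.
    The doubling/halving step sizes here are illustrative assumptions.
    """
    if abs(error) > permissible_error:
        if scheme == "FGVQ":
            quantization *= 2  # finer resolution, larger weight space
        else:  # "FQVG"
            generalization = max(1, generalization // 2)  # less interference
    return quantization, generalization
```

Within the permissible error range both schemes leave the network's memory parameters unchanged, so the extra weight space is only paid for when the error actually demands it.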
I. INTRODUCTION
CMAC is a learning structure which emulates the human cerebellum. It is an associative neural network [8-10] in which only a small subset of the network influences any instantaneous output, and that subset is determined by the input to the network, as shown in Fig. 1. The region of operation of each input is quantized into Q levels, i.e. the number of distinguishable elements of a particular input is Q. This quantization determines the resolution of the network [11] and the shift positions of the overlapping regions. If n inputs are presented to the network, the total number of elements in the input space is Q^n, which is quite large. To reduce this memory, the presented inputs are generalized: each input excites a fixed number of overlapping memory cells, so the physical weight space is far smaller than Q^n.
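As a concrete illustration of this addressing scheme, the following minimal one-dimensional Python sketch (an assumption for illustration only; the paper's simulations are in MATLAB, and a practical CMAC adds hash mapping and multi-dimensional tiling) shows how each quantized input excites g overlapping cells, so that neighbouring inputs share most of their active weights:

```python
def cmac_active_cells(x, g):
    """Indices (layer, block) of the g memory cells excited by a
    quantized input x.  Layer j tiles the input axis into blocks of
    width g, offset by j quantization steps, so adjacent inputs
    share most of their active cells (generalization)."""
    return [(j, (x + j) // g) for j in range(g)]

def cmac_output(x, weights, g):
    """The output is the sum of the g addressed weights -- only a
    small subset of the whole network, as described above."""
    return sum(weights.get(cell, 0.0) for cell in cmac_active_cells(x, g))
```

For g = 4, the inputs x = 5 and x = 6 share three of their four active cells, which is exactly how the network generalizes between neighbouring inputs.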
www.ijsret.org
30
International Journal of Scientific Research Engineering & Technology (IJSRET), ISSN 2278 0882,
Volume 3 Issue 1, April 2014
TABLE I: Effect of changing Quantization
Quantization   Memory   Peak Overshoot (%)
20             105      20.25
40             339      15.78
100            1839     14.10
TABLE II: Effect of changing Generalization (Quantization = 20, Learning rate = 1000)
Generalization   Memory   Peak Overshoot (%)
—                162      13.71
—                105      20.23
—                88       23.88
It is quite clear from Fig. 6 and the table that an increase in quantization results in a decrease in peak overshoot, but at the expense of weight space.
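The growth of the weight space with quantization can be illustrated with a simplified cell-count estimate (an assumed formula for a two-input CMAC without hash mapping; it shows the qualitative trend only and is not intended to reproduce the exact memory figures in the tables):

```python
import math

def cmac_weight_count(quantization, generalization, n_inputs=2):
    """Approximate number of physical weights: g overlapping layers,
    each tiling the n-dimensional quantized input space with blocks
    of width g (simplified; real implementations add hashing)."""
    blocks_per_axis = math.ceil(quantization / generalization) + 1
    return generalization * blocks_per_axis ** n_inputs
```

With the generalization fixed at 4, this estimate grows from 144 weights at Q = 20 to 2704 at Q = 100; halving the generalization likewise inflates the count, matching the two tradeoffs discussed above.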
In Fig. 7, the responses of the magnetic levitation system are compared for fixed values of quantization.
Learning rate   Quantization   Memory   Peak Overshoot (%)
100             20             105      13.65
1000            100            1839     0.20
VI. CONCLUSION

REFERENCES