

It is time for change:
New Deep Learning Algorithms beyond Backpropagation

IBM Developers UnConference 2018, Zurich

By Bojan PLOJ, PhD
College of Ptuj, Slovenia, EU
Motivation

 Deep learning is a very hot topic today.
 The backpropagation algorithm, which is widely used today, is:
   Slow
   Unreliable
   Unconstructive
   Non-informative
Bipropagation
1st new algorithm
Bipropagation - Basic idea

 Decomposing the MLP into separate layers.
 Determining the inner desired values.
 Learning the individual layers – no local minima (a code sketch follows below).
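
A minimal NumPy sketch of this idea, under my own assumptions about the unstated details (sigmoid units, delta-rule updates, inner desired values supplied by the caller): each layer is trained on its own input/target pair instead of propagating the error through the whole stack.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_layer(inputs, targets, epochs=2000, lr=0.5, seed=0):
    """Train a single sigmoid layer (with bias) by the delta rule."""
    rng = np.random.default_rng(seed)
    n_in, n_out = inputs.shape[1], targets.shape[1]
    W = 0.1 * rng.standard_normal((n_in + 1, n_out))       # small random start
    X = np.hstack([inputs, np.ones((len(inputs), 1))])     # append bias column
    for _ in range(epochs):
        Y = sigmoid(X @ W)
        W -= lr * X.T @ ((Y - targets) * Y * (1.0 - Y))    # gradient of squared error
    return W

def bipropagation(x, inner_targets, outer_targets):
    """Decompose a two-layer MLP and train each layer on its own targets."""
    W1 = train_layer(x, inner_targets)                      # input -> inner desired values
    h = sigmoid(np.hstack([x, np.ones((len(x), 1))]) @ W1)  # actual inner activations
    W2 = train_layer(h, outer_targets)                      # inner -> final output
    return W1, W2
```

The two trained layers can then be stacked back together into the original MLP.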
Bipropagation - Time complexity

The dependency of learning time on the number of layers is either:

 exponential:  T = a^n   (if a = 10: T = 10, 100, 1000, …)
 linear:       T = n·a   (if a = 10: T = 10, 20, 30, …)

where T is the learning time, n is the number of layers and a is a case-specific constant with a >> 1.

Learning could therefore be much faster when n is big.
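
A tiny check of the two growth laws with the slide's a = 10 (plain Python):

```python
a = 10
for n in range(1, 6):
    exponential = a ** n     # T = a^n
    linear = n * a           # T = n * a
    print(f"{n} layers: exponential T = {exponential:6d}, linear T = {linear}")
# 1 layers: exponential T =     10, linear T = 10
# 2 layers: exponential T =    100, linear T = 20
# ...
```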


Bipropagation - Description

 Many layers: no longer a problem.
 Gradualness: only a small change in each layer.
 Initialization: the W matrix is very similar to the identity matrix I, so Winit = I.
 Small number of epochs: only a few are needed (once again).
 Biologically plausible algorithm (a claim without proof).
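
A short sketch of the initialisation point above; the small noise term is my own assumption (the slide only says Winit = I):

```python
import numpy as np

def init_near_identity(n, noise=0.01, seed=0):
    """Start a square weight matrix at (almost) the identity, so the
    untrained layer initially passes its input through nearly unchanged."""
    rng = np.random.default_rng(seed)
    return np.eye(n) + noise * rng.standard_normal((n, n))

W_init = init_near_identity(4)
```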


Bipropagation – XOR example

Case 1 (two output units):

Input     Inner       Output
0, 0      0, 0        0, 0
0, 1      0.5, 1      1, 1
1, 0      1, 0.5      1, 1
1, 1      0.5, 0.5    0, 0

Case 2 (one output unit):

Input     Inner     Output
0, 0      0, 0      0
0, 1      0, 1      1
1, 0      1, 0      1
1, 1      0, 0      0
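
One observation worth making explicit (visible in the table above, though not stated on the slide): the Case 1 inner desired values are exactly the element-wise average of the input and the desired output, i.e. a half step from input toward target. A small check:

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # XOR inputs
T = np.array([[0, 0], [1, 1], [1, 1], [0, 0]], dtype=float)   # Case 1 targets (two units)

inner = (X + T) / 2.0          # half step from input toward target
print(inner)                   # [[0. 0.] [0.5 1.] [1. 0.5] [0.5 0.5]]  -- matches the table
```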
Bipropagation – gender classification
Bipropagation - Advantages

 Very similar to backpropagation -> easy to understand & implement for people with backpropagation experience
 Multiple times faster than backpropagation
 Much more reliable than backpropagation
 Suitable for combining with other methods
Border Pairs Method
2nd new algorithm
Border Pairs - Basic idea

All border pairs must be divided by a border line.

Border Pairs - Border pair definition

If the intersection of two neighboring patterns of different classes is empty, then those two patterns form a border pair. (A sketch of this test follows below.)
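
The definition above is informal, so the sketch below encodes one plausible reading of it (my assumption, not stated on the slide): two patterns of different classes form a border pair when no third pattern falls inside the ball whose diameter is the segment joining them.

```python
import numpy as np

def border_pairs(X, y):
    """Return index pairs (i, j) of opposite-class patterns with no third
    pattern inside the ball spanned by them (one reading of 'empty
    intersection of neighborhoods')."""
    pairs = []
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            if y[i] == y[j]:
                continue                                   # same class: not a border pair
            centre = (X[i] + X[j]) / 2.0
            radius = np.linalg.norm(X[i] - X[j]) / 2.0
            blocked = any(np.linalg.norm(X[k] - centre) < radius
                          for k in range(len(X)) if k not in (i, j))
            if not blocked:
                pairs.append((i, j))
    return pairs
```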
Border Pairs - First layer

 Input space with two classes (figure: the patterns and the two separating lines, Line A and Line B).
 There are 3 border pairs – only 6 patterns out of 14 are used.
 Two straight lines can separate all border pairs – only two neurons in the 1st layer are needed (a training sketch follows below).
 All areas are homogeneous (clustering).
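
A minimal sketch of the training step this slide implies, under my own assumptions (classic perceptron rule, and the caller decides which border pairs each line has to separate, since the slide does not say how pairs are grouped): only the patterns that belong to border pairs are used.

```python
import numpy as np

def train_line(X, y, pair_idx, epochs=100, lr=0.1):
    """Fit one first-layer neuron (a separating line) using only the
    patterns that appear in the given border pairs."""
    idx = sorted({i for pair in pair_idx for i in pair})
    Xb, yb = X[idx], np.asarray(y)[idx]            # border patterns only
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):                        # classic perceptron updates
        for xi, ti in zip(Xb, yb):
            pred = 1.0 if xi @ w + b > 0 else 0.0
            w += lr * (ti - pred) * xi
            b += lr * (ti - pred)
    return w, b
```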
Border Pairs - Second layer

The first layer gives two binary values, A and B, to the second layer.

A   B   Patterns
0   0   1 (4) red
0   1   1 (2) red
1   0   1 (1) red
1   1   3 (7) blue

(Figure: the four points plotted in the A–B plane.)

Patterns are joined again – the three blue patterns become one. Now the second layer needs only one line (neuron) for the separation of only four patterns (see the check below).
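
A quick check that the second-layer table really is separable by a single neuron; the weights (1, 1) and threshold 1.5 are my own illustrative choice:

```python
# Second-layer truth table from the slide: the point is blue only when A = B = 1.
for A in (0, 1):
    for B in (0, 1):
        blue = 1 * A + 1 * B > 1.5      # one neuron: weights (1, 1), threshold 1.5
        print(A, B, "blue" if blue else "red")
```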
Border Pairs - Advantages
 Constructive (near optimal NN construction)
 One step only (no longer iterative)
 Per Partes (one neuron at a time)
 100% reliability (every time successful)
 Condensed (eliminated barren patterns)
 Online suitable (easy additional learning)
 Difficulty prediction (a priori known difficulty)
 Noise reduction (easy & effective)
 Clustering (automatically in the 1st layer)
 Rules extraction (Boolean algebra from the 2nd layer on)
 Overfitting resistant (accurate learning)
Solving Brain Contradiction
3rd new algorithm
Solving Brain Contradiction - Basic idea

 The primary author is Germano Resconi from the Catholic University, Brescia, Italy.
 Additional direct connection (red colour).
Solving Brain Contradiction – OR example

Neuron with a hard limiter (figure: X → weighted sum ∑ → hard limiter → Z, with weights w1 and w2 and a bias input −1 weighted by b):

X = [0 0; 0 1; 1 0; 1 1],   Z = [0; 1; 1; 1]

Requiring Y = X · [w1; w2] to reproduce Z gives:

0·w1 + 0·w2 < b
0·w1 + 1·w2 > b
1·w1 + 0·w2 > b
1·w1 + 1·w2 > b

Here are 4 inequalities with 3 free parameters, Z = f(w1, w2, b), which in general means an overdetermined system. In the case of the OR function a solution nevertheless exists despite the overdetermination:

w1 = w2 = 1/2 and b = 1/3
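
A quick NumPy check of that OR solution:

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
Z = np.array([0, 1, 1, 1])                    # OR
w, b = np.array([0.5, 0.5]), 1 / 3

out = (X @ w > b).astype(int)                 # hard-limiter neuron
print(out, np.array_equal(out, Z))            # [0 1 1 1] True
```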
Solving Brain Contradiction – XOR example

X = [0 0; 0 1; 1 0; 1 1],   Z = [0; 1; 1; 0]

The required inequalities are now:

0·w1 + 0·w2 < b
0·w1 + 1·w2 > b
1·w1 + 0·w2 > b
1·w1 + 1·w2 < b

In the case of the XOR function such a simple solution does not exist. The trouble is the 4th inequality: with w1 = w2 = 1/2 and b = 1/3,

Y = X · [1/2; 1/2] = [0; 1/2; 1/2; 1]  →  0 < 1/3, 1/2 > 1/3, 1/2 > 1/3, but the required 1 < 1/3 fails.

The solution is an additional neuron (the red one) which is active only in the 4th line (the AND function) and sends a strong negative signal (−2) to the original neuron (the black one):

Y = [0 0 0; 0 1 0; 1 0 0; 1 1 1] · [1/2; 1/2; −2] = [0; 1/2; 1/2; −1]  →  0 < 1/3, 1/2 > 1/3, 1/2 > 1/3, −1 < 1/3.

(Figure: the XOR network with direct connections weighted 1/2, thresholds 1/3, the red AND neuron, and its −2 connection to the output neuron.)
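
And the corresponding check for XOR with the extra red neuron; the red unit's own threshold (0.75) is my guess of a value that makes it fire only for input (1, 1), since it is not fully legible on the slide:

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
Z = np.array([0, 1, 1, 0])                                  # XOR
b = 1 / 3

red = (X @ np.array([0.5, 0.5]) > 0.75).astype(int)         # AND unit, fires only on (1, 1)
y = X @ np.array([0.5, 0.5]) - 2 * red                      # direct inputs plus the -2 signal
out = (y > b).astype(int)
print(y, out, np.array_equal(out, Z))                       # [0. 0.5 0.5 -1.] [0 1 1 0] True
```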
Solving Brain Contradiction - Features

 Fewer neurons than an equivalent MLP
 Fewer neurons than logic gates
 Neuronal computer in progress
Thank you for your attention

Any questions?

Contact: Bojan PLOJ


bojan.ploj@gmail.com

