
Theano / Lasagne / Keras

Student: Marcelo Romero

Professor: Heitor S. Lopes
Graduate Program in Electrical and Computer Engineering
Deep Learning Libraries

2
Theano
• Theano is a Python library for efficiently handling mathematical expressions
involving multi-dimensional arrays (also known as tensors).

• Developed at the University of Montreal, in a group led by Yoshua Bengio, since 2008.

3
Theano Basics
32-bit floats

0.2*1.0 + 0.7*1.0 = 0.9
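The weighted sum on the slide can be checked in plain Python (Theano would compute the same thing in 32-bit floats):

```python
# The dot product from the slide: inputs [1.0, 1.0], weights [0.2, 0.7].
x = [1.0, 1.0]
W = [0.2, 0.7]
y = sum(xi * wi for xi, wi in zip(x, W))
print(y)  # ~0.9, up to floating-point rounding
```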

4
Symbolic Graph
• When we create a model with Theano:

• We first define a symbolic graph of all the variables and operations that need to be
performed.

• Then we apply this graph to specific inputs to get outputs.

5
Symbolic Expression

y = (x * W).sum()

The system takes x and W, multiplies them elementwise, and sums the values. A Theano
object y is created that knows its value can be calculated as the dot product of x and W.

6
Symbolic Graph

>>> import theano
>>> a = theano.tensor.vector("a")  # declare symbolic variable
>>> b = a + a ** 10                # build symbolic expression
>>> f = theano.function([a], b)    # compile function
>>> f([0, 1, 2])                   # apply the graph to concrete inputs
array([    0.,     2.,  1026.])
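The values the compiled function returns can be confirmed by evaluating the same expression directly with NumPy:

```python
import numpy as np

# Evaluate a + a**10 elementwise, as the compiled Theano function does:
# 0 + 0**10 = 0, 1 + 1**10 = 2, 2 + 2**10 = 1026.
a = np.array([0.0, 1.0, 2.0])
b = a + a ** 10
print(b)
```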

7
Symbolic Graph

8
Variables
Constructor   dtype     ndim
fvector       float32   1
ivector       int32     1
fscalar       float32   0
fmatrix       float32   2
ftensor3      float32   3
dtensor3      float64   3

x = theano.tensor.vector('x', dtype='float32')

9
Shared Variables
• They are shared between different functions and different function calls.

• Normally, these would be used for weights in our neural network.

10
Shared Variables

W = theano.shared(numpy.asarray([0.2, 0.7]), 'W')

Access:  W.get_value()
Modify:  W.set_value([0.1, 0.9])

11
Functions
f = theano.function([x], y)

The first argument is the list of inputs ([x]); the second is the output (y).
Commonly used for passing input into a network and collecting the resulting output.
12
Example
import theano
import numpy

x = theano.tensor.fvector('x')
target = theano.tensor.fscalar('target')

W = theano.shared(numpy.asarray([0.2, 0.7]), 'W')


y = (x * W).sum()

cost = theano.tensor.sqr(target - y)
gradients = theano.tensor.grad(cost, [W])
W_updated = W - (0.1 * gradients[0])
updates = [(W, W_updated)]

f = theano.function([x, target], y, updates=updates)

for i in range(10):
    output = f([1.0, 1.0], 20.0)
    print(output)
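The `updates` above implement one step of gradient descent on the squared error. As a check, the same loop can be written in plain NumPy (no symbolic graph); it produces the same sequence of outputs, with y pulled toward the target on each step:

```python
import numpy as np

W = np.asarray([0.2, 0.7])
x = np.asarray([1.0, 1.0])
target = 20.0

for i in range(10):
    y = (x * W).sum()                  # forward pass, as in the Theano graph
    print(y)                           # the value f() returns before the update
    grad_W = -2.0 * (target - y) * x   # d/dW of (target - y)**2
    W = W - 0.1 * grad_W               # the same update rule: W - 0.1 * gradient
```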
13
Lasagne
• Lightweight library to build and train neural networks in Theano.

• The Lasagne project was started by Sander Dieleman in September 2014.

• It is developed by a core team of eight people, with additional contributions on GitHub.

14
Lasagne – MNIST Example

15
Load Dataset
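The slide's loading code is not reproduced here. In the Lasagne MNIST tutorial, the dataset is read from the gzipped IDX files published by Yann LeCun; a minimal sketch of the image-parsing step (the function name and the reshape to (examples, channels, rows, cols) follow the tutorial's conventions):

```python
import gzip
import struct

import numpy as np

def load_mnist_images(path):
    """Parse a gzipped MNIST IDX image file into a float32 array in [0, 1)."""
    with gzip.open(path, "rb") as f:
        # Header: magic number 2051, image count, rows, cols (big-endian uint32).
        magic, n, rows, cols = struct.unpack(">IIII", f.read(16))
        if magic != 2051:
            raise ValueError("not an MNIST image file")
        pixels = np.frombuffer(f.read(), dtype=np.uint8)
    # Shape (examples, channels, rows, cols), scaled from [0, 255] to [0, 1).
    return pixels.reshape(n, 1, rows, cols).astype(np.float32) / 256.0
```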

16
Build MLP

17
Build MLP

[Figure: MLP architecture]
Input 28x28 → Dropout (p=0.2) → Dense 800 (ReLU) → Dropout (p=0.5) → Dense 800 (ReLU) → Dropout (p=0.5) → Output 10 (Softmax)

18
Custom MLP

19
Convolutional Neural Network

20
Convolutional Neural Network

[Figure: CNN architecture]
Input 28x28 → Conv (32 filters, 5x5, ReLU) → MaxPool (2x2) → Conv (32 filters, 5x5, ReLU) → MaxPool (2x2) → Dropout (p=0.5) → Dense 256 (ReLU) → Dropout (p=0.5) → Output 10 (Softmax)
21
Using the Networks – MiniBatches Function
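The slide's helper is not reproduced here; as in the Lasagne MNIST tutorial, a minibatch iterator needs only NumPy:

```python
import numpy as np

def iterate_minibatches(inputs, targets, batchsize, shuffle=False):
    """Yield (inputs, targets) slices of `batchsize` examples.

    A trailing batch smaller than `batchsize` is dropped, as in the tutorial.
    """
    assert len(inputs) == len(targets)
    indices = np.arange(len(inputs))
    if shuffle:
        np.random.shuffle(indices)
    for start in range(0, len(inputs) - batchsize + 1, batchsize):
        excerpt = indices[start:start + batchsize]
        yield inputs[excerpt], targets[excerpt]
```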

22
Using the Networks – Main
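The main loop on these slides drives a compiled Theano training function over shuffled minibatches, epoch by epoch. A sketch of that structure, where `train_fn` stands for the compiled function (like `f` on slide 12, returning the batch loss) and the helper name is illustrative:

```python
import numpy as np

def train(train_fn, X, y, num_epochs=10, batchsize=500):
    """Run `train_fn` over shuffled minibatches; report mean loss per epoch."""
    losses = []
    for epoch in range(num_epochs):
        order = np.random.permutation(len(X))   # reshuffle every epoch
        batch_losses = []
        for start in range(0, len(X) - batchsize + 1, batchsize):
            sl = order[start:start + batchsize]
            batch_losses.append(train_fn(X[sl], y[sl]))
        losses.append(float(np.mean(batch_losses)))
        print("epoch %d  training loss: %.6f" % (epoch + 1, losses[-1]))
    return losses
```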

23
Using the Networks – Main

24
Using the Networks – Main

25
Keras – MLP Example
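A sketch of the equivalent MLP in Keras's Sequential API; the layer sizes mirror the Lasagne example (flattened 784-pixel input, two Dense 800 ReLU layers with dropout, softmax output), while the optimizer choice here ('sgd') is an assumption:

```python
from keras.layers import Dense, Dropout
from keras.models import Sequential

def build_mlp():
    # 784 -> 800 (ReLU) -> 800 (ReLU) -> 10 (Softmax), dropout in between.
    model = Sequential()
    model.add(Dense(800, activation='relu', input_shape=(784,)))
    model.add(Dropout(0.5))
    model.add(Dense(800, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(10, activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='sgd',
                  metrics=['accuracy'])
    return model
```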

26
Keras – MLP

27
