Part 2
Supervised Learning
Output:
Positive (+) and negative (−) examples

Input representation:
x1: price, x2: engine power
Training set X:
X = {x^t, r^t}, t = 1, ..., N

r = 1 if x is positive
    0 if x is negative
[Figure: training examples plotted in the (x1 = price, x2 = engine power) plane]
Class C:
(p1 ≤ price ≤ p2) AND (e1 ≤ engine power ≤ e2)
Hypothesis class H:
h(x) = 1 if h says x is positive
       0 if h says x is negative
Error of h on X:
E(h | X) = (1/N) Σ_{t=1}^{N} 1(h(x^t) ≠ r^t)
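A minimal sketch of the ideas above, with made-up bounds and data points: an axis-aligned rectangle hypothesis h over (price, engine power), and its empirical error E(h | X) on a small training set. The specific numbers are hypothetical.

```python
# Hypothetical rectangle bounds p1, p2 (price) and e1, e2 (engine power).
def h(x, p1=10_000, p2=20_000, e1=60, e2=120):
    """Return 1 if h says x = (price, engine power) is positive, else 0."""
    price, power = x
    return 1 if (p1 <= price <= p2) and (e1 <= power <= e2) else 0

# Toy training set X = {(x^t, r^t)}: (price, engine power) -> label r^t
X = [((12_000, 80), 1), ((18_000, 100), 1),
     ((5_000, 40), 0), ((30_000, 200), 0), ((15_000, 50), 0)]

# E(h|X) = (1/N) * sum_t 1(h(x^t) != r^t)
E = sum(h(x) != r for x, r in X) / len(X)
```

Here h happens to be consistent with X, so E is 0; shrinking or growing the rectangle past any example flips its prediction and raises the error.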
Margin: the distance of the boundary of h to the closest instances; among consistent hypotheses, prefer the one with the largest margin.
Steps in Supervised Learning
Facts in Supervised Learning
Simpler to use (lower computational complexity)
Easier to explain (more interpretable)
Simpler explanations are more plausible, and any unnecessary complexity should be shaved off.
Multiple classes C_i, i = 1, ..., K:

r_i^t = 1 if x^t ∈ C_i
        0 if x^t ∈ C_j, j ≠ i

Train hypotheses h_i(x), i = 1, ..., K:

h_i(x^t) = 1 if x^t ∈ C_i
           0 if x^t ∈ C_j, j ≠ i
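The one-vs-all encoding r_i^t can be sketched as follows, using toy class labels; each row of r is the binary training target for one hypothesis h_i:

```python
# Toy multiclass labels: the class index of each x^t, with K = 3 classes.
labels = [0, 1, 2, 1, 0]
K = 3

# r_i^t = 1 if x^t belongs to C_i, 0 if x^t belongs to C_j, j != i
r = [[1 if c == i else 0 for c in labels] for i in range(K)]

# r[1] is the binary target vector used to train hypothesis h_1.
```

Each example belongs to exactly one class, so each column of r contains a single 1.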
Regression

X = {x^t, r^t}, t = 1, ..., N
r^t ∈ ℝ
r^t = f(x^t) + ε

Linear model: g(x) = w1 x + w0
Quadratic model: g(x) = w2 x^2 + w1 x + w0

Empirical error:
E(g | X) = (1/N) Σ_{t=1}^{N} (r^t − g(x^t))^2
E(w1, w0 | X) = (1/N) Σ_{t=1}^{N} (r^t − (w1 x^t + w0))^2
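A small sketch with made-up points: fitting the linear model g(x) = w1 x + w0 by minimizing the empirical squared error E(w1, w0 | X), using the closed-form least-squares solution (set the derivatives with respect to w1 and w0 to zero):

```python
# Toy data generated by r = 2x + 1 (no noise), so the fit is exact.
xs = [0.0, 1.0, 2.0, 3.0]
rs = [1.0, 3.0, 5.0, 7.0]
N = len(xs)

# Closed-form minimizer of E(w1, w0 | X):
#   w1 = sum((x - mean_x)(r - mean_r)) / sum((x - mean_x)^2),  w0 = mean_r - w1 * mean_x
mx = sum(xs) / N
mr = sum(rs) / N
w1 = sum((x - mx) * (r - mr) for x, r in zip(xs, rs)) / sum((x - mx) ** 2 for x in xs)
w0 = mr - w1 * mx

# Empirical error of the fitted model
E = sum((r - (w1 * x + w0)) ** 2 for x, r in zip(xs, rs)) / N
```

With noise-free data the recovered parameters are exactly w1 = 2, w0 = 1 and E = 0; with noisy r^t the same formulas give the least-squares line through the cloud.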
Triple Trade-Off

There is a trade-off between three factors:
Complexity of H, c(H)
Training set size, N
Generalization error, E, on new data

As N increases, E decreases.
As c(H) increases, E first decreases and then increases.
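The trade-off can be illustrated with synthetic data: polynomial degree plays the role of c(H). Training error keeps falling as the degree grows, while error on new data improves at first and then worsens once the model starts fitting the noise. The data and degrees here are arbitrary choices for illustration.

```python
import numpy as np

# Noisy training sample from an underlying sine function,
# plus a clean held-out sample to stand in for "new data".
rng = np.random.default_rng(0)
x_tr = np.linspace(0, 1, 10)
r_tr = np.sin(2 * np.pi * x_tr) + 0.2 * rng.standard_normal(10)
x_new = np.linspace(0, 1, 50)
r_new = np.sin(2 * np.pi * x_new)

def errors(degree):
    """Training and new-data mean squared error of a degree-d polynomial fit."""
    w = np.polyfit(x_tr, r_tr, degree)
    e_tr = np.mean((np.polyval(w, x_tr) - r_tr) ** 2)
    e_new = np.mean((np.polyval(w, x_new) - r_new) ** 2)
    return e_tr, e_new

# Degree 1 underfits, degree 3 is a reasonable fit,
# degree 9 interpolates the 10 noisy points (overfits).
results = {d: errors(d) for d in (1, 3, 9)}
```

Training error is monotone in complexity here (each richer polynomial class contains the simpler one), which is exactly why it cannot be used on its own to pick the model.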
Cross-Validation
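A minimal sketch of the idea behind cross-validation, with hypothetical sizes: split the data into k folds, train on k−1 of them, measure error on the held-out fold, and average the k held-out errors to estimate generalization error.

```python
def k_fold_indices(n, k):
    """Split indices 0..n-1 into k contiguous folds of near-equal size."""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

folds = k_fold_indices(10, 3)
# For each fold f: train on all indices NOT in f, evaluate on f,
# then average the per-fold errors.
```

In practice the indices are shuffled before splitting so that each fold is representative of the whole set.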
Dimensions of a Supervised Learner

1. Model:
   g(x | θ)

2. Loss function:
   E(θ | X) = Σ_t L(r^t, g(x^t | θ))

3. Optimization procedure:
   θ* = argmin_θ E(θ | X)
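The three dimensions can be sketched together on a toy problem: a linear model g(x | θ), squared loss as E(θ | X), and gradient descent as one possible optimization procedure for finding θ*. The data, learning rate, and iteration count are illustrative choices.

```python
# Toy data from r = 2x + 1, so theta* = (w1, w0) = (2, 1).
xs = [0.0, 1.0, 2.0, 3.0]
rs = [1.0, 3.0, 5.0, 7.0]
N = len(xs)

def g(x, theta):                      # 1. model: g(x | theta)
    w1, w0 = theta
    return w1 * x + w0

def E(theta):                         # 2. loss: E(theta | X), squared error
    return sum((r - g(x, theta)) ** 2 for x, r in zip(xs, rs)) / N

theta = [0.0, 0.0]                    # 3. optimization: gradient descent
lr = 0.05
for _ in range(2000):
    grad1 = sum(-2 * (r - g(x, theta)) * x for x, r in zip(xs, rs)) / N
    grad0 = sum(-2 * (r - g(x, theta)) for x, r in zip(xs, rs)) / N
    theta = [theta[0] - lr * grad1, theta[1] - lr * grad0]
```

Any of the three dimensions can be swapped independently: a richer model, a different loss (e.g. absolute error), or a different optimizer, without changing the other two.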