1. Bayesian Networks
2. Conditional Independence
3. Creating Tables
4. Notations for Bayesian Networks
5. Calculating conditional probabilities from the tables
6. Calculating conditional independence
7. Markov Chain Monte Carlo
8. Markov Models
9. Markov Models and Probabilistic methods in vision
Introduction to Probabilistic Robotics
1. Probabilities
2. Bayes rule
3. Bayes filters
4. Bayes networks
5. Markov Chains
Bayesian Networks and Markov Models

Bayesian networks and Markov models:
1. Applications in User Modeling
2. Applications in Natural Language Processing
3. Applications in robotic control
4. Applications in robot Vision
...and many more
Extensions of BNs
Decision Networks
Dynamic Bayesian Networks (DBNs)
Definition of Bayesian Networks

A data structure that represents the dependence between variables.
Gives a concise specification of the joint probability distribution.

A Bayesian Network is a directed acyclic graph (DAG) in which the following holds:
1. A set of random variables makes up the nodes in the network
2. A set of directed links connects pairs of nodes
3. Each node has a probability distribution that quantifies the effects of its parents
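The three-part definition above can be sketched as a small data structure: each node stores its parents and a CPT indexed by parent values. This is an illustrative sketch; the node names and the prior values P(Rain) = 0.2 and P(Sprinkler) = 0.3 are assumptions, not taken from the slides (only the four CPT entries for Wet appear later in the deck).

```python
# A minimal sketch of a Bayesian network: each node has a parent list and
# a CPT mapping a tuple of parent values to P(node = True | parents).
# The priors 0.2 and 0.3 are assumed values for illustration.
network = {
    "Rain":      {"parents": [], "cpt": {(): 0.2}},
    "Sprinkler": {"parents": [], "cpt": {(): 0.3}},
    "Wet":       {"parents": ["Sprinkler", "Rain"],
                  "cpt": {(True, True): 0.95, (True, False): 0.90,
                          (False, True): 0.90, (False, False): 0.01}},
}

def prob(node, value, assignment):
    """P(node = value | parent values taken from assignment)."""
    spec = network[node]
    key = tuple(assignment[p] for p in spec["parents"])
    p_true = spec["cpt"][key]
    return p_true if value else 1.0 - p_true
```

Each node's distribution depends only on its parents, which is exactly point 3 of the definition.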
Conditional Independence

The relationship between conditional independence and BN structure is important for understanding how BNs work.

Conditional independence of A and C given B:
P(C | A, B) = P(C | B)   (A indep C | B)
equivalently
P(A, C | B) = P(A | B) P(C | B)   ((A indep C) | B)

Example network: Pollution -> Cancer <- Smoking, with Cancer -> Xray and Cancer -> Dyspnoea.
Is (A indep C) here? For a common effect, the two causes are marginally independent, but they become dependent once the effect is observed.

Example: Cancer is a common effect of pollution and smoking.
Given cancer, smoking explains away pollution: if we know that you smoke and have cancer, we do not need to assume that your cancer was caused by pollution.
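The explaining-away effect can be checked numerically on the small common-effect network Pollution -> Cancer <- Smoker. All CPT numbers below are hypothetical, chosen only to make the effect visible; they are not from the slides.

```python
# Hypothetical CPT numbers for the common-effect net (illustration only).
p_pol_high = 0.10                 # P(Pollution = high)
p_smoke    = 0.30                 # P(Smoker = T)
# P(Cancer = T | Pollution, Smoker)
p_cancer = {("high", True): 0.05, ("high", False): 0.02,
            ("low",  True): 0.03, ("low",  False): 0.001}

def joint(pol, s, c):
    """P(Pollution=pol, Smoker=s, Cancer=c) for the common-effect net."""
    pp = p_pol_high if pol == "high" else 1 - p_pol_high
    ps = p_smoke if s else 1 - p_smoke
    pc = p_cancer[(pol, s)]
    return pp * ps * (pc if c else 1 - pc)

# P(Pollution = high | Cancer = T)
num = sum(joint("high", s, True) for s in (True, False))
den = sum(joint(pol, s, True) for pol in ("high", "low") for s in (True, False))
p_high_given_c = num / den        # ~ 0.249

# P(Pollution = high | Cancer = T, Smoker = T): smoking explains away pollution
p_high_given_cs = joint("high", True, True) / sum(
    joint(pol, True, True) for pol in ("high", "low"))   # ~ 0.156
```

Learning that the patient smokes lowers the probability that pollution was the cause, even though Pollution and Smoker are marginally independent.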
Problem 1: Creating the Joint Distribution Table

The Joint Distribution Table is an important concept: a probabilistic truth table.
Where can the table come from?
1. You can guess this table
2. You can take the data from some statistics
3. You can build this table based on some partial tables
Use independence while creating the tables.

Wet-Sprinkler-Rain Example
Problem 1: Creating the Joint Table
What extra assumptions can help to create this table?
An understanding of causation.
Independence simplifies probabilities
We use the independence of variables S and R.
P(S | R) = probability the sprinkler was on, under the condition that it rained.
Since S and R are independent, P(S | R) = P(S).
We create the CPT for S and R based on our knowledge of the problem.
Random variables: S = sprinkler was on, R = it rained, W = grass is wet.
Conditional Probability Table (CPT) for P(W = T | S, R):

S  R  P(W=T | S, R)
T  T  0.95
T  F  0.90
F  T  0.90
F  F  0.01

What about children playing or a dog urinating? That is still possible: the last row's value is small but nonzero.
Full joint probability
You have a table; you want to calculate some probability, e.g. P(~W).
Six numbers: P(S), P(R), and the four CPT entries of P(W | S, R).
We reduced the specification only from seven numbers (2^3 - 1 for the full joint over three Boolean variables) to six.
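The six-number specification can be sketched in code. The four CPT entries are the ones on the slide; P(S) = 0.3 and P(R) = 0.2 are assumed priors, since this deck chunk does not show them.

```python
# Rebuild the 8-entry joint from six numbers, using independence of S and R.
p_s, p_r = 0.3, 0.2               # assumed priors (not given on the slide)
p_w = {(True, True): 0.95, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.01}   # P(W=T | S, R), from slide

def pr(s, r, w):
    """Joint P(S=s, R=r, W=w): P(S) * P(R) * P(W | S, R)."""
    p = (p_s if s else 1 - p_s) * (p_r if r else 1 - p_r)
    return p * (p_w[(s, r)] if w else 1 - p_w[(s, r)])

# Marginalize to get P(~W): sum the joint over all values of S and R.
p_not_w = sum(pr(s, r, False) for s in (True, False) for r in (True, False))
```

The eight joint entries are never stored; every query is answered from the six numbers.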
Explanation of Diagrammatic Notations such as Bayes Networks
You do not need to build the complete table!
You can build a graph of tables: nodes which correspond to certain types of tables.
Problem 2: Calculating conditional probabilities from the Joint Distribution Table
We will use this in the next slide.
We showed examples of both causal inference and diagnostic inference.
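Causal inference P(W | R) and diagnostic inference P(R | W) can both be read off the joint table. As before, the priors P(S) = 0.3 and P(R) = 0.2 are assumptions; the CPT entries are the slide's.

```python
# Causal vs. diagnostic inference from the joint distribution.
p_s, p_r = 0.3, 0.2               # assumed priors (not given on the slide)
p_w = {(True, True): 0.95, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.01}   # P(W=T | S, R)

def joint(s, r, w):
    p = (p_s if s else 1 - p_s) * (p_r if r else 1 - p_r)
    return p * (p_w[(s, r)] if w else 1 - p_w[(s, r)])

# Causal inference: from cause to effect, P(W | R) = P(W, R) / P(R).
p_w_given_r = sum(joint(s, True, True) for s in (True, False)) / p_r

# Diagnostic inference: from observed effect back to cause, P(R | W).
p_wet = sum(joint(s, r, True) for s in (True, False) for r in (True, False))
p_r_given_w = sum(joint(s, True, True) for s in (True, False)) / p_wet
```

Both directions use the same joint table; only the conditioning changes.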
Problem 3: What if S and R are dependent? Calculating conditional independence
Conditional Independence of S and R
Wet-Sprinkler-Rain Example, extended with a Cloudy node (CLOUDY - Wet-Sprinkler-Rain Example)
Example: Lung Cancer Diagnosis
Node       Type     Values
Pollution  Binary   {low, high}
Smoker     Boolean  {T, F}
Cancer     Boolean  {T, F}
Dyspnoea   Boolean  {T, F}
Xray       Binary   {pos, neg}
Example of variables as nodes in a BN: Pollution, Smoker, Cancer, Xray, Dyspnoea.

Conditional Probability Tables (CPTs) in Bayesian Networks
Probability of cancer given the state of variables P and S.
C = cancer, P = pollution, S = smoking, X = Xray, D = Dyspnoea
Software: NETICA for Bayesian Networks and joint probabilities (Lung Cancer example)
The joint probability factorizes according to the network:

Pr(P = low, S = F, C = T, X = pos, D = T)
  = Pr(P = low) * Pr(S = F) * Pr(C = T | P = low, S = F) * Pr(X = pos | C = T) * Pr(D = T | C = T)

This graph shows how we can calculate the joint probability from the other probabilities in the network.
P = pollution, S = smoking, X = Xray, D = Dyspnoea
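The factorization can be evaluated directly. The five CPT numbers below are hypothetical placeholders, since this deck chunk does not show the Netica values; only the structure of the product follows the slide.

```python
# Hypothetical CPT values (illustration only); the factorization structure
# Pr(P) * Pr(S) * Pr(C | P, S) * Pr(X | C) * Pr(D | C) is the slide's.
p = {
    "P=low":          0.9,
    "S=F":            0.7,
    "C=T|P=low,S=F":  0.001,
    "X=pos|C=T":      0.9,
    "D=T|C=T":        0.65,
}

# Pr(P=low, S=F, C=T, X=pos, D=T) = product of the five local factors
joint = (p["P=low"] * p["S=F"] * p["C=T|P=low,S=F"]
         * p["X=pos|C=T"] * p["D=T|C=T"])
```

Each factor is one CPT lookup, so the joint over five variables never has to be stored explicitly.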
Problem 4: Determining Causality and Bayes Nets: Advertisement example
Bayes nets allow one to learn about causal relationships.
One more example:
Marketing analysts want to know whether to increase, decrease or leave unchanged the exposure of some advertisement in order to maximize profit from the sale of some product.
Advertised (A) and Buy (B) will be variables for someone having seen the advertisement or purchased the product.
Advertised-Buy Example: Causality
How can causality be represented in a graph? Several structures can explain the observed dependence:
1. A causes B
2. B causes A (if you buy more, they have more money to advertise)
3. A hidden common cause of A and B (e.g. income: in a rich country they advertise more and they buy more)
4. Selection bias
Causality Example, continued
But we still don't know if A causes B.
Suppose we learn about the Income (I) and geographic Location (L) of the purchaser, and we learn with high Bayesian probability the network on the right, over Advertised (A = Ad) and Buy (B).
Problem 5: Determine D-separation in Bayesian Networks

D-separation in Bayesian Networks
We will formulate a graphical criterion of conditional independence.
We can determine whether a set of nodes X is independent of another set Y, given a set of evidence nodes E, via the Markov property:
If every undirected path from a node in X to a node in Y is d-separated by E, then X and Y are conditionally independent given E.
A path is d-separated (blocked) by E if it contains a node Z for which one of the following holds:
1. Z is in E and Z has one arrow on the path leading in and one arrow out (chain)
2. Z is in E and Z has both path arrows leading out (common cause)
3. Neither Z nor any of Z's descendants is in E, and both path arrows lead in to Z (common effect)
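The three blocking rules can be written as a small predicate. This is a sketch that assumes the local arrow pattern at Z has already been classified as chain, common cause, or common effect.

```python
# The three d-separation blocking rules for a node Z on an undirected path,
# given an evidence set E.
def blocks(kind, z_in_e, z_descendant_in_e=False):
    """Does Z block the path, given the evidence?"""
    if kind == "chain":          # -> Z ->  : blocked iff Z is observed
        return z_in_e
    if kind == "common_cause":   # <- Z ->  : blocked iff Z is observed
        return z_in_e
    if kind == "common_effect":  # -> Z <-  : blocked iff neither Z nor a
        return not (z_in_e or z_descendant_in_e)  # descendant is observed
    raise ValueError(kind)
```

Note the asymmetry: observing Z blocks chains and common causes but *unblocks* a common effect, which is exactly the explaining-away phenomenon from the cancer example.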
Another Example of Bayesian Networks: Alarm
Bayes Net corresponding to the Alarm-Burglar problem
Compactness, Global Semantics, Local Semantics and Markov Blanket

Compactness of Bayes Net
Alarm Example: Burglary and Earthquake are parents of Alarm; Alarm is the parent of JohnCalls and MaryCalls.
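The compactness claim is easy to check for the alarm network: a Boolean node with k Boolean parents needs 2^k numbers, versus 2^n - 1 for the full joint over n Boolean variables.

```python
# Parameter count for the alarm network (all nodes Boolean).
parents = {"Burglary": 0, "Earthquake": 0, "Alarm": 2,
           "JohnCalls": 1, "MaryCalls": 1}     # number of parents per node
bn_params = sum(2 ** k for k in parents.values())   # 1 + 1 + 4 + 2 + 2 = 10
full_joint_params = 2 ** len(parents) - 1           # 2**5 - 1 = 31
```

Ten numbers instead of thirty-one here; the gap grows exponentially with the number of variables when the maximum number of parents stays bounded.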
Global Semantics, Local Semantics and Markov Blanket: useful concepts.
A node's Markov blanket consists of:
1. its parents
2. its children
3. its children's other parents
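The three-part definition can be sketched as a function over a directed edge list, tested here on the alarm network's structure.

```python
# Markov blanket from a list of directed edges (parent, child).
def markov_blanket(node, edges):
    parents   = {p for p, c in edges if c == node}
    children  = {c for p, c in edges if p == node}
    coparents = {p for p, c in edges if c in children and p != node}
    return parents | children | coparents   # parents + children + co-parents

# Alarm network: Burglary, Earthquake -> Alarm -> JohnCalls, MaryCalls
edges = [("B", "A"), ("E", "A"), ("A", "J"), ("A", "M")]
```

Given its Markov blanket, a node is conditionally independent of every other node in the network.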
Problem 6: How to systematically build a Bayes Network - Example
Alarm Example, built step by step: when a newly added variable directly depends on a variable already in the network, we add an arrow.
First method of simplification: Enumeration
Alarm Example: variable A was eliminated.
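Inference by enumeration on the alarm network can be sketched as follows. The CPT numbers are the ones commonly used for this network in Russell & Norvig's AIMA (listed in this deck's sources), assumed here since the chunk does not show them.

```python
# Inference by enumeration on the alarm network (AIMA's CPT numbers).
P_B, P_E = 0.001, 0.002                       # P(Burglary), P(Earthquake)
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(Alarm=T | B, E)
P_J = {True: 0.90, False: 0.05}               # P(JohnCalls=T | Alarm)
P_M = {True: 0.70, False: 0.01}               # P(MaryCalls=T | Alarm)

def joint(b, e, a, j, m):
    """Full joint as the product of the local CPT entries."""
    p = (P_B if b else 1 - P_B) * (P_E if e else 1 - P_E)
    pa = P_A[(b, e)]
    p *= pa if a else 1 - pa
    p *= P_J[a] if j else 1 - P_J[a]
    p *= P_M[a] if m else 1 - P_M[a]
    return p

# P(Burglary=T | JohnCalls=T, MaryCalls=T): enumerate hidden vars E and A.
tv = (True, False)
num = sum(joint(True, e, a, True, True) for e in tv for a in tv)
den = sum(joint(b, e, a, True, True) for b in tv for e in tv for a in tv)
p_burglary = num / den            # ~ 0.284
```

Enumeration sums the joint over every assignment of the hidden variables; variable elimination improves on this by summing each variable out once and caching the intermediate factors.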
3SAT Example
Polytrees are better.
IDEA: Convert the DAG to polytrees. Clustering is used to convert non-polytree BNs.
Figure: "Not a polytree" vs. "Is a polytree".
Approximate Inference
1. Direct sampling
2. Rejection sampling
3. Likelihood weighting
4. Markov Chain Monte Carlo (MCMC)
1. Direct Sampling Methods

Direct Sampling
Direct Sampling generates minterms with their probabilities.
We start from the top of the network.
Wet-Sprinkler-Rain Example: W = wet, C = cloudy, R = rain, S = sprinkler
Sample Cloudy first: Cloudy = yes.
Then sample Sprinkler given Cloudy: sprinkler = no.
Sample Rain and then Wet from their CPTs in the same way.
We generated a sample minterm over (C, S, R, W).
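The top-down sampling walk above can be sketched as code. The CPT numbers are the values commonly used for the Cloudy-Sprinkler-Rain-Wet example, assumed here since this chunk does not show them.

```python
import random

# Direct (prior) sampling: sample each node top-down given its
# already-sampled parents.  CPT numbers are assumed (the usual ones for
# this example), not taken from a slide.
P_C = 0.5
P_S = {True: 0.10, False: 0.50}          # P(S=T | C)
P_R = {True: 0.80, False: 0.20}          # P(R=T | C)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.01}   # P(W=T | S, R)

def sample():
    """One sampled minterm (c, s, r, w), drawn top-down."""
    c = random.random() < P_C
    s = random.random() < P_S[c]
    r = random.random() < P_R[c]
    w = random.random() < P_W[(s, r)]
    return c, s, r, w
```

Each call reproduces the walk on the slides: Cloudy, then Sprinkler and Rain given Cloudy, then Wet given Sprinkler and Rain; the frequency of any minterm converges to its joint probability.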
2. Rejection Sampling Methods

Rejection Sampling
Reject the samples that are inconsistent with the evidence.
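Rejection sampling reuses the prior sampler and simply discards samples that disagree with the evidence; a sketch for P(Rain=T | Wet=T), with the same assumed CPT numbers as in the direct-sampling sketch.

```python
import random

# CPT numbers are assumed (the usual ones for this example).
P_C = 0.5
P_S = {True: 0.10, False: 0.50}          # P(S=T | C)
P_R = {True: 0.80, False: 0.20}          # P(R=T | C)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.01}   # P(W=T | S, R)

def sample():
    """One prior sample (c, s, r, w), drawn top-down."""
    c = random.random() < P_C
    s = random.random() < P_S[c]
    r = random.random() < P_R[c]
    w = random.random() < P_W[(s, r)]
    return c, s, r, w

def estimate_rain_given_wet(n):
    """Estimate P(R=T | W=T) by rejecting samples inconsistent with W=T."""
    kept = rain = 0
    for _ in range(n):
        c, s, r, w = sample()
        if not w:                # sample disagrees with the evidence: reject
            continue
        kept += 1
        rain += r
    return rain / kept
```

The weakness is visible in the code: every rejected sample is wasted effort, and the rejection rate grows as the evidence becomes less likely.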
3. Likelihood Weighting Methods
(Wet-Sprinkler-Rain Example: W = wet, C = cloudy, R = rain, S = sprinkler)
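Likelihood weighting avoids rejection: evidence variables are clamped to their observed values and each sample is weighted by the probability of that evidence given the sampled parents. A sketch for P(Rain=T | Wet=T), with the same assumed CPT numbers as in the sampling sketches above.

```python
import random

# CPT numbers are assumed (the usual ones for this example).
P_C = 0.5
P_S = {True: 0.10, False: 0.50}          # P(S=T | C)
P_R = {True: 0.80, False: 0.20}          # P(R=T | C)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.01}   # P(W=T | S, R)

def weighted_sample():
    """Sample non-evidence nodes; weight by P(evidence W=T | parents)."""
    c = random.random() < P_C
    s = random.random() < P_S[c]
    r = random.random() < P_R[c]
    # W is evidence (W=T): do not sample it, weight the sample instead.
    return r, P_W[(s, r)]

def estimate_rain_given_wet(n):
    """Weighted estimate of P(R=T | W=T); no sample is ever rejected."""
    num = den = 0.0
    for _ in range(n):
        r, w = weighted_sample()
        den += w
        num += w * r
    return num / den
```

Every sample contributes, so for the same budget the estimate is typically less noisy than rejection sampling's.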
Sources
Prof. David Page
Matthew G. Lee
Nuria Oliver
Barbara Rosario
Alex Pentland
Ehrlich Av
Ronald J. Williams
Andrew Moore's tutorial with the same title
Russell & Norvig's AIMA site
Alpaydın's Introduction to Machine Learning site