
Counting Belief Propagation

Initial Work
Obtain the Factor Graph from Tuffy with the help of the in-memory data structure MRF. Tuffy uses Postgres for grounding and later converts the grounded network into a Markov Random Field; the instance of the MRF class is used to build the Factor Graph.

The Python code then uses this Factor Graph for compression.
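The compression step of Counting BP groups symmetric nodes by colour passing: nodes that keep receiving identical colour signatures from their neighbours end up in one supernode. A minimal sketch of that idea, assuming the factor graph is given as a map from each factor to its ordered argument list (the function names and the tiny smokers graph below are illustrative, not Tuffy's actual API):

```python
from collections import defaultdict

def _canon(sig):
    # Map each distinct signature to a small integer colour id
    ids = {}
    return {k: ids.setdefault(s, len(ids)) for k, s in sig.items()}

def color_pass(factor_vars, var_init, factor_init, rounds=3):
    """Colour passing: factor_vars maps factor -> ordered list of variables."""
    var_factors = defaultdict(list)   # variable -> [(factor, argument position)]
    for f, args in factor_vars.items():
        for pos, v in enumerate(args):
            var_factors[v].append((f, pos))
    vcol, fcol = dict(var_init), dict(factor_init)
    for _ in range(rounds):
        # factors absorb the ordered colours of their arguments
        fcol = _canon({f: (fcol[f], tuple(vcol[v] for v in args))
                       for f, args in factor_vars.items()})
        # variables absorb factor colours together with their argument position
        vcol = _canon({v: (vcol[v], tuple(sorted((fcol[f], p) for f, p in fps)))
                       for v, fps in var_factors.items()})
    return vcol, fcol

# Tiny ground fragment of the smokers MLN: one clause per person, no evidence
fg = {"f_A": ["smokes_A", "cancer_A"], "f_B": ["smokes_B", "cancer_B"]}
all_vars = [v for args in fg.values() for v in args]
vcol, fcol = color_pass(fg, {v: "?" for v in all_vars}, {f: "w=0.5" for f in fg})
# smokes_A/smokes_B share one colour and cancer_A/cancer_B another,
# so the compressed graph has 2 variable supernodes instead of 4
```

With evidence, the initial variable colours would differ (evidence atoms get their truth value as the starting colour), which is what breaks the symmetry and limits compression.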

Results: Smokers
Predicates:
  smokes(person)
  cancer(person)
  friends(person, person)

Weighted formulas:
  0.5    !smokes(x) v cancer(x)
  10.4   smokes(x) v !friends(x,y) => smokes(y)

Smokers MLN — Time for Compression (in seconds):

Number of Ground Nodes   Babak Code    Tuffy + Compression
100                      0.006123687   1.022147987
1000                     0.01266217    1.48674701
10000                    1.053718149   2.287228033
100000                   4.107149501   10.07290667

The compression results for the smokers Markov logic network are approximately the same for the current code and Babak's code. The major difference is in the conversion of the MLN to a Factor Graph.

Results: Class

Predicates:
  Wrote(person, paper)
  Refers(paper, paper)
  Category(paper, cat)
  SameCat(cat, cat)

Weighted formulas:
  1    !wrote(a1,a3) v !wrote(a1,a2) v category(a3,a4) v category(a2,a4)
  2    !refers(a1,a2) v category(a2,a3) v !category(a1,a3)
  2    !refers(a1,a2) v category(a1,a3) v !category(a2,a3)
  10   sameCat(a2,a3) v !category(a1,a3) v !category(a2,a3)

[Chart: Class MLN — Time (in seconds) for Tuffy + Compression vs. Number of Ground Nodes, 100 to 100000]

For the Class Markov logic network, the current code runs out of memory for n = 10000, and Babak's code runs out of memory for n = 1000 (while converting to the factor graph). This is because the number of factors is very high.
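To see why the factor count blows up, note that a clause with k distinct logical variables over domains of size n produces on the order of n^k ground factors; the four-variable wrote/category clause grows much faster than the two-variable smokers clauses. A quick illustration (the domains are made up):

```python
from itertools import product

def groundings(variables, domains):
    """All assignments of a clause's logical variables to their domains."""
    return [dict(zip(variables, combo))
            for combo in product(*(domains[v] for v in variables))]

people = ["A", "B", "C"]
# "0.5  !smokes(x) v cancer(x)"            -> one factor per person
assert len(groundings(["x"], {"x": people})) == 3
# "smokes(x) v !friends(x,y) => smokes(y)" -> one factor per ordered pair
assert len(groundings(["x", "y"], {"x": people, "y": people})) == 9
```

A clause with three or four logical variables therefore yields n^3 or n^4 factors, which is consistent with the out-of-memory failures seen for Class at n = 1000 and n = 10000.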

Results: Time from Tuffy and Compression


Smokers:

Ground Nodes   MLN to FG (Tuffy)   Compression     Total Time
100            1.021               0.0011479867    1.0221479867
1000           1.482               0.00474701      1.48674701
10000          2.24                0.0472280333    2.2872280333
100000         9.491               0.5819066667    10.0729066667

Class:

Ground Nodes   MLN to FG (Tuffy)   Compression     Total Time
100            1.023               0.00759538      1.03059538
1000           2.52017             0.6183          3.13847
10000          3.671               -               -
100000         -                   -               -

Challenges Faced in the Current Implementation


Grounding the Markov Logic Network into a Factor Graph using Tuffy takes a considerable amount of time. The current Python implementation also runs out of memory because it uses Tuffy's MRF data structure. The implementation would take less time and memory if the RDBMS tables were used directly instead of the MRF data structure.
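One way to use the tables directly is to stream ground clauses from the database in batches rather than materialising the whole MRF. A minimal sketch of the pattern, using SQLite as a stand-in for Tuffy's Postgres (the `clauses` table and its columns are invented for illustration, not Tuffy's actual schema):

```python
import sqlite3

# In-memory stand-in for the grounding database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clauses (cid INTEGER, weight REAL, lits TEXT)")
conn.executemany(
    "INSERT INTO clauses VALUES (?, ?, ?)",
    [(1, 0.5, "!smokes_A,cancer_A"), (2, 0.5, "!smokes_B,cancer_B")],
)

def stream_factors(conn, batch=1000):
    """Yield ground factors batch by batch; at most `batch` rows live in memory."""
    cur = conn.execute("SELECT cid, weight, lits FROM clauses")
    while rows := cur.fetchmany(batch):
        for cid, weight, lits in rows:
            yield cid, weight, lits.split(",")

factors = list(stream_factors(conn))
```

With Postgres the same pattern would use a server-side cursor, so the compression code never holds the full set of ground factors at once.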

Issues with Using Tuffy's RDBMS


When Tuffy grounds the network, it optimizes the network and does not ground the evidence atoms. Therefore we need to send Tuffy just the domain size as evidence, and later add the real evidence by updating the tables.
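Adding the evidence afterwards then amounts to a table update. A hypothetical sketch (again SQLite in place of Postgres; the `atoms` table and its columns are assumptions, not Tuffy's real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE atoms (aid INTEGER, pred TEXT, is_evidence INTEGER, truth INTEGER)"
)
# Ground with the domain only: every atom starts out as a non-evidence query atom
conn.executemany(
    "INSERT INTO atoms VALUES (?, ?, ?, ?)",
    [(1, "smokes_A", 0, None), (2, "smokes_B", 0, None)],
)

# After grounding, mark the evidence atoms by updating the table in place
conn.execute("UPDATE atoms SET is_evidence = 1, truth = 1 WHERE pred = ?", ("smokes_A",))
row = conn.execute(
    "SELECT is_evidence, truth FROM atoms WHERE pred = 'smokes_A'"
).fetchone()
```

This keeps the grounding step free of evidence (so Tuffy cannot optimise it away) while still letting the compression code see the evidence values.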

Future Work
Obtain results for the National Football League MLN to predict the game winner.
