
Comp 307, Lecture 19

Knowledge-based systems


Reasoning:
Backward chaining
Forward chaining

Backward Chaining Interpreter


Input: a goal G, and a set S of rules and facts
Output: yes if G is logically implied by S; no otherwise

Initialize the current set of goals to {G};
repeat
    Choose a goal A from the current set of goals;
    if A is a fact in S then
        remove A from the current set of goals
    else if there is no rule in S whose head is A then
        exit and output no
    else begin
        Choose a rule A :- Body in S, where Body is a set of goals;
        Remove A from the current set of goals;
        Add Body to the current set of goals;
    end
until the current set of goals is empty;
output yes
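A minimal Prolog rendering of this procedure, as a sketch only: rule(Head, BodyList) and fact(F) are assumed representations for S, not notation from the slides. Backtracking supplies the two "choose" steps, and failure corresponds to the "output no" exit.

% solve(Goals): succeed if every goal in the list follows from S
solve([]).                        % empty goal set: output yes
solve([A|Goals]) :-
    fact(A),                      % A is a fact: remove it from the goals
    solve(Goals).
solve([A|Goals]) :-
    rule(A, Body),                % choose a rule A :- Body
    append(Body, Goals, Rest),    % replace A by the body's goals
    solve(Rest).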


Backward Chaining: Example


Goal: u

Rules:
u :- w, p.
v :- n, r.
w :- r.
p :- a, b.
q :- a, c, d.
r :- e, p.

Facts: a. b. e. n.
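The example encoded for the solve/1 sketch above (same assumed rule/2 and fact/1 representation):

fact(a).  fact(b).  fact(e).  fact(n).
rule(u, [w, p]).
rule(v, [n, r]).
rule(w, [r]).
rule(p, [a, b]).
rule(q, [a, c, d]).
rule(r, [e, p]).

% ?- solve([u]).
% succeeds: u needs w and p; w needs r; r needs e and p; p needs a and b.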

Backward chaining in Prolog


mammal :- hair.
mammal :- milk.
bird :- feathers.
bird :- eggs, flies.
carnivore :- mammal, meat.
classify(tiger) :- carnivore, tawny, striped.
classify(puma) :- carnivore, black.
hair.
black.
meat.

?- classify(X).
X = puma.



Design your own rules in Prolog


:- op(800, fx, if).
:- op(700, xfx, then).
:- op(300, xfy, or).
:- op(200, xfy, and).

if hall_wet and kitchen_dry
then leak_in_bathroom.

if window_closed or no_rain
then no_water_from_outside.

fact(hall_wet).
fact(bathroom_dry).

Backward chaining in Prolog

is_true(P) :-
    fact(P).
is_true(P) :-
    if Condition then P,
    is_true(Condition).
is_true(P1 and P2) :-
    is_true(P1),
    is_true(P2).
is_true(P1 or P2) :-
    is_true(P1) ; is_true(P2).
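With these definitions loaded, queries run the rule base directly (illustrative):

?- is_true(hall_wet).
true.                          % hall_wet is a given fact
?- is_true(leak_in_bathroom).
false.                         % its condition also needs kitchen_dry, which is not a fact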


Forward Chaining
Data-driven, data-directed reasoning, bottom-up

Forward Chaining Interpreter


Input: a set of goals G, and a set S of rules and facts
Output: an element of G logically implied by S

Initialize working memory to the facts in S;
repeat
    Choose a rule A :- Body in S;
    if Body is a subset of working memory then
        if A is not in working memory then
            add A to working memory
until an element of G is in working memory;
output the element of G in working memory

Search from facts to valid conclusions

Given a database of true facts:
Apply all rules that match facts in the database
Add conclusions to the database
Repeat until a goal is reached, OR
Repeat until no new facts are added
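A compact Prolog sketch of this loop, reusing the hypothetical rule(Head, BodyList) and fact(F) representation from the backward-chaining sketch earlier; the slides' own version, over if-then rules, appears below. This variant stops when no new fact can be added.

:- dynamic fact/1.               % working memory grows as facts are asserted

% one step: a rule whose whole body is in working memory, head not yet known
forward_step(Head) :-
    rule(Head, Body),
    \+ fact(Head),
    forall(member(B, Body), fact(B)).

run_forward :-
    forward_step(F), !,
    assertz(fact(F)),            % add the new conclusion to working memory
    run_forward.
run_forward.                     % no rule fired: fixpoint reached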


Forward Chaining: Example


Goal: u

Rules:
u :- w, p.
v :- n, r.
w :- r.
p :- a, b.
q :- a, c, d.
r :- e, p.

Facts: a. b. e. n.
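One possible run on this rule base: working memory starts as {a, b, e, n}; p is derived from a and b, then r from e and p, then w from r (v also becomes derivable from n and r), and finally u from w and p, at which point the goal is reached.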

Forward Chaining in Prolog

forward :-
    new_derived_fact(P), !,
    assert(fact(P)),
    forward.
forward :-
    write('No more facts').

new_derived_fact(Conclusion) :-
    if Cond then Conclusion,
    not fact(Conclusion),
    composed_fact(Cond).

composed_fact(Cond) :-
    fact(Cond).
composed_fact(Cond1 and Cond2) :-
    composed_fact(Cond1),
    composed_fact(Cond2).
composed_fact(Cond1 or Cond2) :-
    composed_fact(Cond1) ; composed_fact(Cond2).
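For example, with the if-then rule base from the earlier slide loaded, and assuming fact(kitchen_dry) were asserted alongside the given facts (it is not among them), the query ?- forward. would derive and assert leak_in_bathroom and then report that no more facts can be derived.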


Backward Chaining and Forward Chaining

Backward chaining (goal-driven, goal-directed reasoning, top-down):
Search from hypotheses to relevant facts.
Good when:
    there is a limited number of hypotheses
    determining the truth of facts is costly
    there is a very large number of possible facts, mostly irrelevant

Forward chaining (data-driven, data-directed reasoning, bottom-up):
Search from facts to valid conclusions.
Good when:
    there is a very large number of possible conclusions
    the true facts are known at the start

Bi-directional reasoning combines the two.

Uncertainty
Sources of uncertainty?

errors or noise in observations and measurements
not all relevant factors captured
abductive reasoning

Abduction vs deduction
Deduction = reasoning from premises to consequences:
A → B, observe A, conclude B (e.g. classification systems).

Abduction = reasoning from symptoms to causes:


A causes B, observe B, conclude A (e.g. diagnostic systems).

Abduction is necessarily uncertain.


Dealing with uncertainty


We can add confidence factors to rules:

if high_temperature and sore_neck then meningitis (with confidence 0.7)

We must combine the confidences in the conditions with the confidence in the rule, as sketched below.
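One common scheme for this combination (a MYCIN-style sketch, not from the slides; fact/2 and cf_rule/3 are assumed representations) takes the minimum over conjuncts, the maximum over disjuncts, and multiplies the condition's confidence by the rule's:

:- op(300, xfy, or).             % as declared on the earlier slide
:- op(200, xfy, and).

cf(P, CF) :-
    fact(P, CF).                 % fact(P, CF): a fact with confidence CF
cf(P1 and P2, CF) :-
    cf(P1, C1), cf(P2, C2),
    CF is min(C1, C2).           % conjunction: take the weaker confidence
cf(P1 or P2, CF) :-
    cf(P1, C1), cf(P2, C2),
    CF is max(C1, C2).           % disjunction: take the stronger confidence
cf(P, CF) :-
    cf_rule(Cond, P, RuleCF),    % cf_rule(Condition, Conclusion, RuleCF)
    cf(Cond, CondCF),
    CF is RuleCF * CondCF.       % scale by the rule's own confidence

% e.g. cf_rule(high_temperature and sore_neck, meningitis, 0.7).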

We can add real probabilities to rules:

if high_temperature and sore_neck then meningitis
    ( P(m | ht ∧ sn) = 0.7,
      P(m | ht ∧ ¬sn) = 0.2,
      P(m | ¬ht ∧ sn) = 0.1,
      P(m | ¬ht ∧ ¬sn) = 0.00001 )

Use belief nets instead!

Common Sense Reasoning

Capturing common sense knowledge has been the biggest challenge for artificial intelligence, and little progress has been made.

Meaningless symbols! Consider the rule:
    should_send_birthday_card(X) :- my_friend(X).
Internally, this is equivalent to:
    p001(X) :- p002(X).
As is:
    should_send_birthday_card(X) :- my_enemy(X).


Common sense: examples

John likes fruit. People eat what they like. Does John eat apples?

The amount of common sense knowledge is HUGE.
Representation and reasoning are DIFFICULT.
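The John example, encoded naively in Prolog (likes/2, eats/2, and isa/2 are hypothetical predicates), shows where the gap is:

likes(john, fruit).
eats(Person, Food) :- likes(Person, Food).

% ?- eats(john, apples).
% fails: nothing tells the system that apples are a kind of fruit.
% That "is-a" knowledge must be represented explicitly, e.g.:
isa(apples, fruit).
likes(Person, Food) :- isa(Food, Kind), likes(Person, Kind).
% now ?- eats(john, apples). succeeds.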

Building large knowledge bases


Motivation:
1. To reduce brittleness: specialized knowledge-based systems are brittle.
2. To share knowledge: domain-specific systems share the same primitives and can communicate easily with one another.

Projects: CYC
