
@@What is Fuzzy Logic: Fuzzy logic was developed by Lotfi Zadeh at UC Berkeley.

"Fuzzy logic" is a branch of logic specially designed for representing knowledge
and human reasoning in such a way that it is amenable to processing by a
computer.
WHAT IS FUZZY LOGIC? Definition of fuzzy: "not clear, distinct, or precise; blurred."
Definition of fuzzy logic: a form of knowledge representation suitable for notions that
cannot be defined precisely, but which depend upon their contexts.
FOUNDATION OF FUZZY SYSTEMS
• Fuzziness pertains to the uncertainty associated with systems. Nothing can be predicted with exact precision.
– Vagueness/uncertainty cannot be adequately expressed by crisp variables.
– So we need fuzzy variables and fuzzy functions.
Applications of fuzzy logic: *Power system stability control *Power system
stability assessment *Line fault detection and optimization of generation,
transmission and distribution *Speed control *Wind turbine control
*Motor efficiency optimization *Waveform estimation, etc.
Advantages of using fuzzy logic: *Fuzzy logic controllers do not depend on
an accurate mathematical model *They are based on heuristics, and they are
able to incorporate human intuition and experience.
From crisp to fuzzy sets: A crisp set is a collection of distinct (precisely
defined) elements. *In a crisp set, an element is either a member
of the set or not. *For example, a jelly bean belongs to the class of food known as
candy (the super set); mashed potatoes do not.
@@Fuzzy sets: *Fuzzy sets allow elements to be partially in a set. *Each element is given
a degree of membership in a set. *This membership value can range from 0 (not
an element of the set) to 1 (a full member of the set). *A membership function is the
relationship between the values of an element and its degree of membership in a set.
• An example of membership functions is shown in the
figure. In this example, the sets (or classes) are numbers
that are negative large, negative medium, negative
small, near zero, positive small, positive medium, and
positive large. The value μ is the degree of
membership in the set.
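As an illustration, a minimal sketch of a triangular membership function in Python; the break-points a, b, c and the "near zero" set below are made-up values, not taken from the figure:

def triangular_membership(x, a, b, c):
    # Degree of membership of x in a triangular fuzzy set rising from a to b and falling from b to c
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical "near zero" set over [-1, 1], peaking at 0
print(triangular_membership(0.0, -1.0, 0.0, 1.0))   # 1.0 (full member)
print(triangular_membership(0.5, -1.0, 0.0, 1.0))   # 0.5 (partial member)
print(triangular_membership(2.0, -1.0, 0.0, 1.0))   # 0.0 (not a member)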
@@Expert System: An expert system is a system that employs human knowledge
captured in a computer to solve problems that ordinarily require human
expertise. (Turban)
A computer program that emulates the behaviour of human experts who are
solving real-world problems associated with a particular domain of knowledge.
(Pigford & Baur)
@@Expert System: Expert systems manipulate knowledge, while conventional
programs manipulate data. An expert system is often defined by its
structure. Knowledge-Based System vs Expert System.
@@Characteristics of an Expert System (Pigford & Baur): *Inferential processes:
uses various reasoning techniques *Heuristics: decisions based on experience and knowledge.
@@Knowledge and Uncertainty: Facts and rules are structured into a knowledge
base and used by expert systems to draw conclusions. There is often a degree of
uncertainty in the knowledge: *Things are not always true or false *The
knowledge may not be complete. In an expert system, certainty factors are one
way to indicate the degree of certainty attached to a fact or rule.
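As an illustration only (MYCIN-style certainty-factor combination, an assumption not stated in these notes), two positive certainty factors supporting the same conclusion can be combined as CF = CF1 + CF2*(1 - CF1):

def combine_cf(cf1, cf2):
    # Combine two positive certainty factors (0..1) for the same conclusion, MYCIN-style
    return cf1 + cf2 * (1.0 - cf1)

# Hypothetical example: two rules support "battery is dead" with CF 0.6 and 0.4
print(combine_cf(0.6, 0.4))   # 0.76 - more certain than either rule alone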
@@Classification of Expert Systems: *Classification based on "expertness" or purpose.
Expertness: *An assistant: used for routine analysis and points out those portions of the
work where human expertise is required. *A colleague: the user talks over
the problem with the system until a "joint decision" is reached. *A true expert:
the user accepts the system's advice without question.
@@Components of an Expert System: (diagram)
@@Desirable Features of an Expert System: *Dealing with uncertainty (certainty
factors) *Explanation *Ease of modification *Transportability *Adaptive learning
@@Advantages: *Capture of scarce expertise *Superior problem solving *Reliability
*Ability to work with incomplete information *Transfer of knowledge

@@Representing facts in predicate logic (the Marcus example):
1. Marcus was a man. man(Marcus)
2. Marcus was a Pompeian. Pompeian(Marcus)
3. All Pompeians were Romans. ∀x: Pompeian(x) → Roman(x)
4. Caesar was a ruler. ruler(Caesar)
5. All Romans were either loyal to Caesar or hated him. ∀x: Roman(x) → loyalto(x, Caesar) ∨ hate(x, Caesar)
6. Everyone is loyal to someone. ∀x: ∃y: loyalto(x, y)
7. People only try to assassinate rulers they are not loyal to. ∀x: ∀y: person(x) ∧ ruler(y) ∧ tryassassinate(x, y) → ¬loyalto(x, y)
8. Marcus tried to assassinate Caesar. tryassassinate(Marcus, Caesar)
9. All men are people. ∀x: man(x) → person(x)
AI: •Thinking humanly •Acting humanly •Thinking rationally •Acting rationally
Cognitive science: the brain as an information-processing machine. Requires
scientific theories of how the brain works.
How to understand cognition as a computational process? •Introspection: try
to think about how we think •Predict and test the behavior of human subjects
•Image the brain and examine neurological data. The latter two methodologies
are the domains of cognitive science and cognitive neuroscience.
Turing (1950), "Computing Machinery and Intelligence":
•The Turing Test
What capabilities would a computer need to have to pass the Turing Test?
*Natural language processing *Knowledge representation *Automated reasoning *Machine learning
Turing predicted that by the year 2000, machines would be able to fool
30% of human judges for five minutes.
Topics: •Search: uninformed search, informed search; adversarial search: minimax;
constraint satisfaction problems •Planning •Logic •Probability: basic laws of
probability, Bayes networks, hidden Markov models •Learning: decision trees,
linear classifiers (neural nets, support vector machines), reinforcement learning
@@History of AI: Early excitement
1940s McCulloch & Pitts neurons; Hebb’s learning rule
1950 Turing’s “Computing Machinery and Intelligence”
1954 Georgetown-IBM machine translation experiment
1956 Dartmouth meeting: “Artificial Intelligence” adopted
1950s-1960s “Look, Ma, no hands!” period:
Samuel’s checkers program, Newell & Simon’s
Logic Theorist, Gelernter’s Geometry Engine
The rest of the story
1974-1980 The first “AI winter”
1970s Knowledge-based approaches
1980-88 Expert systems boom
1988-93 Expert system bust; the second “AI winter”
1986 Neural networks return to popularity
1988 Pearl’s Probabilistic Reasoning in Intelligent Systems
1990 Backlash against symbolic systems; Brooks’ “nouvelle AI”
1995-present Increasing specialization of the field
Agent-based systems
Machine learning everywhere
Tackling general intelligence again?
@@Genetic algorithms: Genetic algorithms (GAs) are a technique for solving problems that
need optimization. *GAs are a subclass of evolutionary computing and
are random search algorithms. *GAs are based on Darwin's theory of evolution.
@@History of GAs: *Evolutionary computing evolved in the 1960s. *GAs
were created by John Holland in the mid-1970s.
@@Search space: *Most often one is looking
for the best solution in a specific subset of solutions. *This subset is called
the search space (or state space). *Every point in the search space is a
possible solution. *Therefore every point has a fitness value, depending on
the problem definition. *GAs are used to search the search space for the
best solution, e.g. a minimum. *Difficulties are the local minima and the
starting point of the search.
@@Basic algorithm: Start with a subset of n
randomly chosen solutions from the search space (i.e. chromosomes). This
is the population.*This population is used to produce a next generation of
individuals by reproduction.*Individuals with a higher fitness have more
chance to reproduce (i.e. natural selection).
Outline of the basic algorithm (see the sketch below):
0 START: Create a random population of n chromosomes.
1 FITNESS: Evaluate the fitness f(x) of each chromosome in the population.
2 NEW POPULATION:
  1 SELECTION/REPRODUCTION: Select parents based on f(x).
  2 CROSSOVER: Cross over the chromosomes.
  3 MUTATION: Mutate the chromosomes.
3 REPLACE: Replace the old population with the new one: the new generation.
4 TEST: Test the problem criterion.
5 LOOP: Continue steps 1-4 until the criterion is satisfied.
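A minimal Python sketch of this outline; the population size, mutation rate, 1-point crossover and the toy fitness function (counting 1-bits in a fixed-length bit string) are illustrative assumptions, not part of the notes:

import random

def fitness(chromosome):
    # Illustrative fitness: number of 1-bits (the "OneMax" toy problem)
    return sum(chromosome)

def select(population):
    # Fitness-proportionate (roulette-wheel) selection; +1 keeps all weights positive
    return random.choices(population, weights=[fitness(c) + 1 for c in population], k=1)[0]

def mutate(chromosome, rate=0.01):
    # Flip each gene with a small probability
    return [1 - gene if random.random() < rate else gene for gene in chromosome]

def genetic_algorithm(n=20, length=16, generations=50):
    # START: create a random population of n chromosomes
    population = [[random.randint(0, 1) for _ in range(length)] for _ in range(n)]
    for _ in range(generations):                              # LOOP until the criterion (here: a generation budget)
        new_population = []
        for _ in range(n):
            p1, p2 = select(population), select(population)   # SELECTION based on f(x)
            cut = random.randint(1, length - 1)               # 1-point CROSSOVER
            child = p1[:cut] + p2[cut:]
            new_population.append(mutate(child))              # MUTATION
        population = new_population                           # REPLACE old population with the new generation
    return max(population, key=fitness)                       # best chromosome found

print(fitness(genetic_algorithm()))   # close to 16 on this toy problem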
N-point crossover (a generalization of 1-point crossover; see the sketch below):
• Choose n random crossover points.
• Split both parents along those points.
• Glue the parts together, alternating between parents.
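A small sketch of n-point crossover under the same bit-string representation assumed above (the parent contents are made up):

import random

def n_point_crossover(parent1, parent2, n):
    # Cut both parents at n random points and glue the pieces, alternating between parents
    points = sorted(random.sample(range(1, len(parent1)), n))   # n distinct crossover points
    child, source, previous = [], 0, 0
    for point in points + [len(parent1)]:
        parent = parent1 if source == 0 else parent2
        child.extend(parent[previous:point])                    # copy the next segment
        source, previous = 1 - source, point                    # alternate parent for the next segment
    return child

p1 = [0] * 8
p2 = [1] * 8
print(n_point_crossover(p1, p2, 2))   # e.g. [0, 0, 1, 1, 1, 0, 0, 0]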
Ant Colony Optimization Algorithms: •Construction heuristics •How ants find the
shortest route •Stigmergy •The general ACO metaheuristic •Ant System for the TSP
@@Motivation: •NP-hard problems – there are no algorithms that can solve large
instances of these problems to optimality (discrete combinatorial problems).
•Approximate methods – can find solutions of good quality in reasonable time.
Approximate methods: •Local search/optimization – iteratively improves a
complete solution (typically initialized at random) until it reaches some local optimum.
•Construction algorithms – build a solution making use of some problem-specific
heuristic information. •Ant Colony Optimization (ACO) algorithms – extend
traditional construction heuristics with an ability to exploit experience gathered during
the optimization process.
@@Construction Algorithms: •Build solutions to a
problem under consideration in an incremental way, starting with an empty initial
solution and iteratively adding suitably defined solution components, without
backtracking, until a complete solution is obtained.
Procedure GreedyConstructionHeuristic: sp = empty partial solution; while sp is not a
complete solution, add the best-ranked feasible solution component to sp; return sp.
•Pros/Cons: + fast, solutions of reasonable quality; – the solution may
be far from the optimum; – generates only a limited number of different solutions; – decisions
made at early stages reduce the set of possible steps at later stages.
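For concreteness, a minimal sketch of one such construction heuristic, nearest-neighbour tour construction for the TSP; the 4-city distance matrix is a made-up example:

def nearest_neighbour_tour(distance, start=0):
    # Greedy construction: start from a one-city tour and repeatedly append the closest unvisited city
    n = len(distance)
    tour, unvisited = [start], set(range(n)) - {start}
    while unvisited:                                   # no backtracking: each step is final
        last = tour[-1]
        nearest = min(unvisited, key=lambda city: distance[last][city])
        tour.append(nearest)
        unvisited.remove(nearest)
    return tour

# Hypothetical symmetric distance matrix
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 8],
     [10, 4, 8, 0]]
print(nearest_neighbour_tour(d))   # [0, 1, 3, 2]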
@@Ant Algorithms: Biological Inspiration •Inspired by the behavior of an ant colony.
Social insects behave towards the survival of the colony; simple individual behavior
leads to complex behavior of the colony. •Ability to find the shortest path from the colony to
the source of food and back using indirect communication via pheromone.
•Write – ants lay down pheromone on their way to food. •Read – an ant detects
pheromone (and can sense different intensities) laid down by other ants and can choose the
direction of the highest concentration of pheromone. •Emergence – this simple
behavior, applied by the whole colony, can lead to the emergence of the shortest path.
@@Ant Colony Optimization Metaheuristic •ACO can be applied to any discrete
optimization problem for which some solution construction mechanism can be
conceived. •Artificial ants are stochastic solution construction heuristics that
probabilistically build a solution by iteratively adding solution components to partial
solutions, taking into account (i) heuristic information on the problem instance
being solved, if available, and (ii) (artificial) pheromone trails, which change dynamically at
run-time to reflect the agents' acquired search experience. •The stochastic component
allows generating a large number of different solutions.
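A minimal sketch of the probabilistic component choice described above, where an artificial ant picks the next city with probability proportional to pheromone^alpha * heuristic^beta; the matrices, alpha and beta are illustrative assumptions, and pheromone evaporation/deposit is omitted:

import random

def choose_next_city(current, unvisited, pheromone, distance, alpha=1.0, beta=2.0):
    # Probability proportional to tau^alpha * eta^beta, with eta = 1/distance
    weights = [(pheromone[current][j] ** alpha) * ((1.0 / distance[current][j]) ** beta)
               for j in unvisited]
    return random.choices(list(unvisited), weights=weights, k=1)[0]

# Hypothetical 3-city instance with uniform pheromone: the closer city is chosen more often
pheromone = [[1.0] * 3 for _ in range(3)]
distance = [[0, 1, 5],
            [1, 0, 2],
            [5, 2, 0]]
print(choose_next_city(0, [1, 2], pheromone, distance))   # usually 1, occasionally 2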
Knowledge Representation: Knowledge → facts/principles (the information must be
true) → intelligence; procedural knowledge → algebraic equations.
Semantic network → composed of multiple attribute values. *Design →
hierarchical structure. Example (tree structure for the statements below; see the
sketch that follows): *All bikes are 2-wheelers *All 2-wheelers are vehicles
*All 4-wheelers are vehicles *All cars are 4-wheelers *Kawasaki is a gear bike
*All gear bikes are 2-wheelers
*Knowledge acquisition → acquire a knowledge base through a domain expert
*Selection → domain – knowledge engineers – experts *Inference engine → control
mechanism for the expert system
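A minimal sketch of that is-a hierarchy as a semantic network in Python; the dictionary encoding is just one possible representation of the statements above:

# Each entry: node -> its parent ("is-a" link) in the semantic network
is_a = {
    "Kawasaki": "gear bike",
    "gear bike": "2-wheeler",
    "Bike": "2-wheeler",
    "2-wheeler": "Vehicle",
    "Car": "4-wheeler",
    "4-wheeler": "Vehicle",
}

def is_kind_of(node, category):
    # Follow is-a links upward to test class membership
    while node is not None:
        if node == category:
            return True
        node = is_a.get(node)          # None once we reach the root
    return False

print(is_kind_of("Kawasaki", "Vehicle"))   # True: Kawasaki -> gear bike -> 2-wheeler -> Vehicle
print(is_kind_of("Car", "2-wheeler"))      # False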
Converting to clausal form:
1. man(Marcus)
2. Pompeian(Marcus)
3. ¬Pompeian(x1) ∨ Roman(x1)
   (removing the implication, and the universal quantifier by replacing its variable with x1)
4. ruler(Caesar)
5. ¬Roman(x2) ∨ loyalto(x2, Caesar) ∨ hate(x2, Caesar)
   (removing the implication, and the universal quantifier by replacing its variable with x2)
6. loyalto(x3, F(x3))
   (removing the universal quantifier by replacing its variable with x3, and the existential
   quantifier by the Skolem function F(x3), since the existential quantifier was inside the
   universal one)
7. ¬man(x4) ∨ ¬ruler(y1) ∨ ¬tryassassinate(x4, y1) ∨ ¬loyalto(x4, y1)
   (removing the universal quantifiers for x and y by replacing them with x4 and y1 respectively)
8. tryassassinate(Marcus, Caesar)
9. Question (negated goal): ¬hate(Marcus, Caesar) – proof by refutation.
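One way the refutation can proceed from these clauses (resolution steps, with the unifying substitutions in brackets):
- Resolve 9 with 5 [x2 = Marcus]: ¬Roman(Marcus) ∨ loyalto(Marcus, Caesar)
- Resolve with 3 [x1 = Marcus]: ¬Pompeian(Marcus) ∨ loyalto(Marcus, Caesar)
- Resolve with 2: loyalto(Marcus, Caesar)
- Resolve with 7 [x4 = Marcus, y1 = Caesar]: ¬man(Marcus) ∨ ¬ruler(Caesar) ∨ ¬tryassassinate(Marcus, Caesar)
- Resolve with 1: ¬ruler(Caesar) ∨ ¬tryassassinate(Marcus, Caesar)
- Resolve with 4: ¬tryassassinate(Marcus, Caesar)
- Resolve with 8: the empty clause, a contradiction; therefore hate(Marcus, Caesar) follows.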
Event A1: It rains on Marie's wedding.
Event A2: It does not rain on Marie's wedding.
Event B: The weatherman predicts rain.
In terms of probabilities, we know the following:
P(A1) = 5/365 = 0.0136985 [It rains 5 days out of the year.]
P(A2) = 360/365 = 0.9863014 [It does not rain 360 days out of the year.]
P(B|A1) = 0.9 [When it rains, the weatherman predicts rain 90% of the time.]
P(B|A2) = 0.1 [When it does not rain, the weatherman predicts rain 10% of the time.]
We want to know P(A1|B), the probability that it will rain on the day of Marie's wedding,
given a forecast for rain by the weatherman. The answer can be determined from Bayes'
theorem, as shown below.
P( A1 | B ) = P( A1 ) P( B | A1 ) / [ P( A1 ) P( B | A1 ) + P( A2 ) P( B | A2 ) ]
P( A1 | B ) = (0.014)(0.9) / [ (0.014)(0.9) + (0.986)(0.1) ]
P( A1 | B ) = 0.111
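The same calculation carried out in Python (the numbers are copied from above; only the function wrapper is new):

def posterior(p_a1, p_a2, p_b_given_a1, p_b_given_a2):
    # Bayes' theorem for two mutually exclusive, exhaustive events A1 and A2
    return (p_a1 * p_b_given_a1) / (p_a1 * p_b_given_a1 + p_a2 * p_b_given_a2)

# P(A1|B): probability of rain on the wedding day, given the forecast of rain
print(posterior(5 / 365, 360 / 365, 0.9, 0.1))   # ~0.111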
