MOHAMMED SHADAB
2009-2010
BUSINESS MATHEMATICS LOGIC
CERTIFICATE
(GUIDE)
DR. G. S. HEDGE
Lecturer, Business Mathematics
Great Eastern Management School, Bangalore
SEP 2009
Acknowledgment
The project work was carried out under the remarkable guidance of
Dr. G. S. Hedge, Lecturer in Business Mathematics, Great Eastern
Management School, Bangalore.
I also express my sincere gratitude and thanks to all the subjects who
participated in the study.
CONTENTS
1. Statements and Logical Operators. Exercises for Section 1
2. Logical Equivalence, Tautologies and Contradictions. Exercises for Section 2
3. The Conditional and the Biconditional. Exercises for Section 3
4. Tautological Implications and Tautological Equivalences. Exercises for Section 4
5. Rules of Inference. Exercises for Section 5
6. Arguments and Proofs. Exercises for Section 6
7. Predicate Calculus. Exercises for Section 7
• 1 Nature of logic
o 1.1 Logical form
o 1.2 Deductive and inductive reasoning
o 1.3 Consistency, soundness, and completeness
o 1.4 Rival conceptions of logic
• 2 History of logic
• 3 Topics in logic
o 3.1 Syllogistic logic
o 3.2 Sentential (propositional) logic
o 3.3 Predicate logic
o 3.4 Modal logic
o 3.5 Informal reasoning
o 3.6 Mathematical logic
o 3.7 Philosophical logic
o 3.8 Logic and computation
• 4 Controversies in logic
o 4.1 Bivalence and the law of the excluded middle
o 4.2 Is logic empirical?
o 4.3 Implication: strict or material?
o 4.4 Tolerating the impossible
o 4.5 Rejection of logical truth
INTRODUCTION
You have been assigned the job of evaluating the attempts of mortals to
prove the existence of God. And many attempts there have been. Three in
particular have caught your attention: they are known as the cosmological
argument, the teleological argument, and the ontological argument.
Cosmological Argument (St. Thomas Aquinas): Everything in the world has
a cause. The chain of causes cannot go back forever, so there must be a first
cause, and this first cause is God.
Teleological Argument (St. Thomas Aquinas): All things in the world act
towards an end. They could not do this without their being an intelligence
that directs them. This intelligence is God.
Ontological Argument (St. Anselm): God is a being than which none greater
can be thought. A being thought of as existing is greater than one thought of
as not existing. Therefore, one cannot think of God as not existing, so God
must exist.
Since Boole and De Morgan, logic and mathematics have been inextricably
intertwined. Logic is part of mathematics, but at the same time it is the
language of mathematics. In the late 19th and early 20th century it was
believed that all of mathematics could be reduced to symbolic logic and
made purely formal. This belief, though still held in modified form today,
was shaken by K. Gödel in the 1930s, when he showed that there would
always remain truths that could not be derived in any such formal system.
We'll mention more about this as we go along.
The study of symbolic logic is usually broken into several parts. The first
and most fundamental is the propositional calculus, and this is the subject
of most of this web text. Built on top of this is the predicate calculus, which
is the language of mathematics. We shall study the propositional calculus in
the first six sections and look at the predicate calculus briefly in the last two.
1. Statements and Logical Operators
This on-line text is, for the most part, devoted to the study of so-called
Propositional Calculus. Contrary to what the name suggests, this has
nothing to do with the subject most people associate with the word
"calculus." Actually, the term "calculus" is a generic name for any area of
mathematics that concerns itself with calculating. For example, arithmetic
could be called the calculus of numbers. Propositional Calculus is then the
calculus of propositions. A proposition, or statement, is any declarative
sentence, which is either true (T) or false (F). We refer to T or F as the
truth-value of the statement.
Example 1 Propositions
The sentence "2+2 = 4" is a statement, since it can be either true or false.
Since it happens to be a true statement, its truth value is T.
The sentence "1 = 0" is also a statement, but its truth value is F.
"It will rain tomorrow" is a proposition. For its truth value we shall have to
wait for tomorrow.
A question such as "Is the moon round?" is not a proposition, since it is not declarative.
A command such as "Close the door" is not a proposition.
An expression such as "2 + 2" is not a proposition, since it asserts nothing.
"This statement is false" gets us into a bind: If it were true, then, since it is
declaring itself to be false, it must be false. On the other hand, if it were
false, then its declaring itself false is a lie, so it is true! In other words, if it
is true, then it is false, and if it is false, then it is true, and we go around in
circles. We get out of this bind by refusing to accord it the privileges of
statementhood. In other words, it is not a statement. An equivalent pseudo-
statement is: "I am lying," so we call this the liar's paradox.
"This statement is true" may seem like a statement, but there is no way that
its truth value can ever be determined: is it true, or is it false? We thus
disqualify it as well. (In fact, it is the negation of the liar's paradox; see
below for a discussion of negation.)
Sentences such as these are called self-referential sentences, since they
refer to themselves.
Here are some rather amusing (and slightly disturbing) examples of self-
referential sentences, the first two being taken from Douglas R. Hofstadter's
Metamagical Themas:
"While the last sentence had nothing to say, this sentence says a lot."
"This sentence has more to say than the last two sentences combined, if you
count the number of words."
We can form new propositions from old ones in several different ways. For
example, starting with p: "I am an Anchovian," we can form the negation of
p: "It is not the case that I am an Anchovian" or simply "I am not an
Anchovian." We denote the negation of p by ~p, read "not p." What we
mean by this is that, if p is true, then ~p is false, and vice-versa. We can
show this in the form of a truth table:
p ~p
T F
F T
On the left are the two possible truth values of p, with the corresponding
truth values of ~p on the right. The symbol ~ is our first example of a logical
operator.
Negation
The negation of p is the statement ~p, which we read "not p." Its
truth value is defined by the following truth table.
p ~p
T F
F T
Solution
(a) ~p is the statement "it is not true that 2+2 = 4," or more simply,
~p: "2+2 ≠ 4."
(b) ~q: "1 ≠ 0."
(d) ~s: "Not all the politicians in this town are crooks."
To say that diamonds are not a pearl's best friends is not to say that
diamonds are a pearl's worst enemy. The negation is not the polar opposite,
but whatever would deny the truth of the original statement. Similarly,
saying that not all politicians are crooks is not the same as saying that no
politicians are crooks, but is the same as saying that some politicians are not
crooks. Negations of statements involving the quantifiers "all" or "some"
are tricky. We'll study quantifiers in more depth when we discuss the
predicate calculus.
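Before moving on, note that the ~ table can be reproduced mechanically. A small Python sketch (our addition, not part of the original text): the built-in `not` behaves exactly like the negation operator.

```python
# Python's built-in `not` plays the role of the logical operator ~.
def show(v):
    return "T" if v else "F"

# Print the truth table for ~p: one row for each truth value of p.
for p in (True, False):
    print(show(p), show(not p))
# prints:
# T F
# F T
```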
Example 3 Conjunction
If p: "This galaxy will ultimately wind up in a black hole" and q: "2+2 = 4,"
what is p ∧ q?
Solution
p ∧ q: "This galaxy will ultimately disappear into a black hole and 2+2 = 4," or
the more astonishing statement: "Not only will this galaxy ultimately
disappear into a black hole, but 2+2 = 4!"
Solution
p ∧ (~q) says: "This galaxy will ultimately disappear into a black hole and
2+2 ≠ 4," or "Contrary to your hopes and aspirations, this galaxy is doomed
to eventually disappear into a black hole; moreover, two plus two is
decidedly different from four!"
Solution
The statement is asserting that all three statements p, q and r are true. (Note
that "but" is simply an emphatic form of "and.") Now we can combine them
all in two steps: Firstly, we can combine p and q to get p ∧ q, meaning "This
topic is boring and this web site is boring." We can then conjoin this with r
to get (p ∧ q) ∧ r. This says: "This topic is boring, this web site is boring and
life is boring." On the other hand, we could equally well have done it the
other way around: conjoining q and r gives "This web site is boring and life
is boring." We then conjoin p to get p ∧ (q ∧ r), which again says: "This topic is
boring, this web site is boring and life is boring." We shall soon see that
(p ∧ q) ∧ r
is logically the same as
p ∧ (q ∧ r),
a fact called the associative law for conjunction. Thus both answers (p ∧ q) ∧ r
and p ∧ (q ∧ r) are equally valid. This is like saying that (1+2)+3 is the same as
1+(2+3). As with addition, we sometimes drop the parentheses and write
p ∧ q ∧ r.
Any sentence that suggests that two things are both true is a conjunction.
The use of symbolic logic strips away the elements of surprise or judgement
that can also be expressed in an English sentence.
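The associative law can be checked mechanically rather than by inspection. A brute-force Python sketch (our addition): try all eight truth assignments and confirm the two groupings always agree.

```python
from itertools import product

# Brute-force check of the associative law for conjunction:
# (p ∧ q) ∧ r agrees with p ∧ (q ∧ r) on every row of the truth table.
for p, q, r in product((True, False), repeat=3):
    assert ((p and q) and r) == (p and (q and r))
print("associative law holds on all 8 rows")
```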
We now introduce a third logical operator. Starting once again with p: "I am
clever," and q: "You are strong," we can form the statement "I am clever or
you are strong," which we write symbolically as p ∨ q, read "p or q." Now in
English the word "or" has several possible meanings, so we have to agree on
which one we want here. Mathematicians have settled on the inclusive or:
p ∨ q means p is true or q is true or both are true.
With p and q as above, p ∨ q stands for "I am clever or you are strong, or
both." We shall sometimes include the phrase "or both" for emphasis, but
even if we do not that is what we mean. We call p ∨ q the disjunction of p
and q.
Disjunction
p q p ∨ q
T T T
T F T
F T T
F F F
Notice that the only way for the whole statement to be false is for
both p and q to be false. For this reason we can say that p ∨ q also
means "p and q are not both false." We'll say more about this in the
next section.
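This reading of "or" can itself be verified by machine. A Python sketch (our addition): p ∨ q agrees with "not (both false)" on every row, an instance of a De Morgan law.

```python
from itertools import product

# "p or q" is true exactly when p and q are not both false:
# p ∨ q agrees with ~(~p ∧ ~q) on every row of the truth table.
for p, q in product((True, False), repeat=2):
    assert (p or q) == (not ((not p) and (not q)))
print("p ∨ q is equivalent to ~(~p ∧ ~q)")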
Solution
Notice how we get the ~(p ∨ q) column from the p ∨ q column: we reverse
all the truth values, since that is what negation means.
Solution
Since there are two variables, p and q, we again start with the p and q
columns. Working from inside the parentheses, we then evaluate p ∧ q, and
finally take the disjunction of the result with p:
p q p ∧ q p ∨ (p ∧ q)
T T T T
T F F T
F T F F
F F F F
Before we go on...
How did we get the last column from the others? Since we are "or-ing" p
with p ∧ q, we must look at the values in the p and p ∧ q columns and "or"
those together, according to the instructions for "or." Thus, for example, in
the second row, we get T ∨ F = T, and in the third row, we get F ∨ F = F. (If
you look at the second row of the truth table for "or" you will see T | F | T,
and in the last row you will see F | F | F.)
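The column-by-column procedure just described translates directly into a short program. A Python sketch (our addition): compute the inner column first, then "or" it with p.

```python
from itertools import product

def show(v):
    return "T" if v else "F"

# Build the table column by column, as described above:
# first the inner p ∧ q, then p ∨ (p ∧ q).
print("p q | p ∧ q | p ∨ (p ∧ q)")
for p, q in product((True, False), repeat=2):
    inner = p and q        # the parenthesized column
    whole = p or inner     # "or" it with the p column
    print(show(p), show(q), "|", show(inner), "|", show(whole))
```

Notice that the final column simply repeats the p column: p ∨ (p ∧ q) is logically equivalent to p, an instance of the absorption law.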
Example 3 Three Variables
Solution
Here, there are three variables: p, q and r. Thus we start with three initial
columns showing all eight possibilities:
p q r
T T T
T T F
T F T
T F F
F T T
F T F
F F T
F F F
We now add columns for p ∧ q, ~(p ∧ q) and ~r, and finally ~(p ∧ q) ∨ (~r)
according to the instructions for these logical operators. Here is how the
table would grow as you construct it:
p q r p ∧ q
T T T T
T T F T
T F T F
T F F F
F T T F
F T F F
F F T F
F F F F
p q r p ∧ q ~(p ∧ q) ~r
T T T T F F
T T F T F T
T F T F T F
T F F F T T
F T T F T F
F T F F T T
F F T F T F
F F F F T T
and finally,
p q r p ∧ q ~(p ∧ q) ~r ~(p ∧ q) ∨ (~r)
T T T T F F F
T T F T F T T
T F T F T F T
T F F F T T T
F T T F T F T
F T F F T T T
F F T F T F T
F F F F T T T
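The last column can also be computed by machine. A Python sketch (our addition), using `not`, `and` and `or` for ~, ∧ and ∨, with the rows in the same T-to-F order as the table:

```python
from itertools import product

def show(v):
    return "T" if v else "F"

# Compute ~(p ∧ q) ∨ (~r) for each of the eight rows,
# with p, q, r running from T T T down to F F F.
for p, q, r in product((True, False), repeat=3):
    print(show(p), show(q), show(r), show((not (p and q)) or (not r)))
```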
The Conditional
Consider the following statement: "If you earn an A in logic, then I'll buy
you a Yellow Mustang." It seems to be made up out of two simpler
statements:
p: "You earn an A in logic."
q: "I will buy you a Yellow Mustang."
What the original statement is then saying is this: if p is true, then q is true,
or, more simply, if p, then q. We can also phrase this as p implies q, and we
write p → q.
Now let us suppose for the sake of argument that the original statement: "If
you earn an A in logic, then I'll buy you a Yellow Mustang," is true. This
does not mean that you will earn an A in logic; all it says is that if you do so,
then I will buy you that Yellow Mustang. Thinking of this as a promise, the
only way that it can be broken is if you do earn an A and I do not buy you a
Yellow Mustang. In general, we use this idea to define the statement p → q.
Conditional
The conditional p → q, which we read "if p, then q" or "p implies q," is
defined by the following truth table.
p q p → q
T T T
T F F
F T T
F F T
The arrow "→" is the conditional operator, and in p → q the statement p
is called the antecedent, or hypothesis, and q is called the
consequent, or conclusion.
Notes
1. The only way that p → q can be false is if p is true and q is false; this is the
case of the "broken promise."
2. If you look at the truth table again, you see that we say that "p → q" is true
when p is false, no matter what the truth value of q. This again makes sense
in the context of the promise: if you don't get that A, then whether or not I
buy you a Yellow Mustang, I have not broken my promise. However, it goes against
the grain if you think of "if p then q" as saying that p causes q. The problem
is that there are really many ways in which the English phrase "if ... then ..."
is used. Logicians have simply agreed that the meaning given by the truth
table above is the most useful for mathematics, and so that is the meaning
we shall always use. Shortly we'll talk about other English phrases that we
interpret as meaning the same thing.
Here are some examples that will help to explain each line in the truth table.
Notice that the statements need not have anything to do with one
another. We are not saying that the sun rises in the east because 1+1 =
2, simply that the whole statement is logically true.
Here p: "the moon is made of green cheese," which is false, and q: "I am the
King of England." The statement p → q is true, whether or not the speaker
happens to be the King of England (or whether, for that matter, there even is
a King of England).
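In Boolean terms, the conditional can be computed as "(not p) or q". A Python sketch (our addition) that reproduces the truth table above:

```python
from itertools import product

def implies(p, q):
    # p → q is false only in the "broken promise" row: p true, q false.
    return (not p) or q

for p, q in product((True, False), repeat=2):
    print(p, q, implies(p, q))
```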
Tautological Implications
[(p → q) ∧ p] → q.
Example
Letting p: "I love math" and q: "I will pass this course," we get
If my loving math implies that I will pass this course, and if I indeed love
math, then I will pass this course.
In symbols:
p → q
p
q
Notice that we draw a line in the argument form to separate what we are
given from the conclusion that we draw. This tautology represents the most
direct form of everyday reasoning, hence its name "direct reasoning."
Another bit of terminology: We say that p → q and p together logically imply
q.
To check that it is a tautology, we use a truth table.
p q p → q (p → q) ∧ p [(p → q) ∧ p] → q
T T T T T
T F F F T
F T T F T
F F T F T
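Such tautology checks are easy to automate. A Python sketch (our addition): evaluate the whole statement on every row and confirm the final column is all T.

```python
from itertools import product

def implies(p, q):
    return (not p) or q

# [(p → q) ∧ p] → q evaluates to True on every row, so it is a tautology.
rows = [implies(implies(p, q) and p, q)
        for p, q in product((True, False), repeat=2)]
print("tautology:", all(rows))
# prints: tautology: True
```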
Once more, modus ponens says that, if we know that p implies q, and we
know that p is indeed true, then we can conclude that q is also true. This is
sometimes known as affirming the hypothesis. You should not confuse this
with a fallacious argument like: "If I were an Olympic athlete then I would
drink Boors. I do drink Boors, therefore I am an Olympic athlete." (Do you
see why this is nonsense?) This is known as the fallacy of affirming the
consequent. There is, however, a correct argument in which we deny the
consequent:
[(p → q) ∧ (~q)] → (~p)
Example
If we once again take p: "I love math" and q: "I will pass this course," we get
If I love math then I will pass this course; but I know that I will fail it.
Therefore, I must not love math.
In argument form:
In symbols:
p → q
~q
~p
As you can see, this argument is not quite so direct as that in the first
example; it seems to contain a little twist: "If p were true then q would also
be true. However, q is false. Therefore p must also be false (else q would be
true.)" That is why we refer to it as indirect reasoning.
We'll leave the truth table for the exercises. Note that there is again a similar,
but fallacious argument form to avoid: "If I were an Olympic athlete then I
would drink Boors. However, I am not an Olympic athlete. Therefore I can't
drink Boors." This is a mistake Boors sincerely hopes you do not make!
Simplification
(p ∧ q) → p
and
(p ∧ q) → q
In words, the first says: If both p and q are true, then, in particular, p is true.
Example
If the sky is blue and the moon is round, then (in particular) the sky is blue.
Argument Form
In symbols:
p ∧ q
p
Addition
p → (p ∨ q)
In words, this says: If p is true, then we know that either p or q is true.
Example
If the sky is blue, then either the sky is blue or some ducks are kangaroos.
Argument Form
In symbols:
p
p ∨ q
Notice that it doesn't matter what we use as q, nor does it matter whether it is
true or false. The reason is that the disjunction p ∨ q is true if at least one of p
or q is true. Since we start out knowing that p is true, the truth value of q
doesn't matter.
Warning
The following are not tautologies:
(p ∨ q) → p;
p → (p ∧ q).
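To see why these fail, it is enough to exhibit a falsifying row for each. A Python sketch (our addition) that searches the truth table and prints the counterexamples:

```python
from itertools import product

def implies(p, q):
    return (not p) or q

# Neither (p ∨ q) → p nor p → (p ∧ q) is a tautology: each has a
# falsifying row, which this search prints.
for p, q in product((True, False), repeat=2):
    if not implies(p or q, p):
        print("(p ∨ q) → p fails at p =", p, ", q =", q)
    if not implies(p, p and q):
        print("p → (p ∧ q) fails at p =", p, ", q =", q)
```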
5. Rules of Inference
In the last section, we wrote out all our tautologies in what we called
"argument form." For instance, Modus Ponens, [(p → q) ∧ p] → q, was
represented as
p → q
p
q
We think of the statements above the line, the premises, as statements given
to us as true, and the statement below the line, the conclusion, as a statement
that must then also be true.
Our convention has been that small letters like p stand for atomic statements.
But, there is no reason to restrict Modus Ponens to such statements. For
example, we would like to be able to make the following argument:
If roses are red and violets are blue, then sugar is sweet and so are you.
Roses are red and violets are blue.
Therefore, sugar is sweet and so are you.
In symbols, this is
(p ∧ q) → (r ∧ s)
p ∧ q
r ∧ s
So, we really should write Modus Ponens in the following more general and
hence usable form:
A → B
A
B
where, as our convention has it, A and B can be any statements, atomic or
compound.
In this form, Modus Ponens is our first rule of inference. We shall use rules
of inference to assemble lists of true statements, called proofs. A proof is a
way of showing how a conclusion follows from a collection of premises.
Modus Ponens, in particular, allows us to say that, if A → B and A both
appear as statements in a proof, then we are justified in adding B as another
statement in the proof.
Solution
Notice that all the statements are compound statements, and that they
have the following patterns:
1. A → B
2. C
3. A.
Statement A appears twice; in lines (1) and (3). Looking at Modus
Ponens, we see that we can deduce B = r ∧ (~s) from these lines. (Line
(2) is not going to be used at all; it just goes along for the ride.) Thus,
we can enlarge our list as follows:
1. (p ∧ q) → (r ∧ ~s) Premise
2. ~r ∨ s Premise
3. p ∧ q Premise
4. r ∧ (~s) 1,3 Modus Ponens
On the right we have given the justification for each line: lines (1)
through (3) were given as premises, and line (4) follows by an
application of Modus Ponens to lines (1) and (3); hence the
justification "1,3 Modus Ponens."
r ∧ (~s) Conclusion
Example 2 Using T1
Solution
A → B
~B
~A
This matches the first two premises, so we can apply Modus Tollens
to get the following.
1. (p ∧ q) → (r ∧ ~s) Premise
2. ~(r ∧ ~s) Premise
3. (p ∧ q) → p Premise
4. ~(p ∧ q) 1,2 Modus Tollens
a → q
b → q
(a ∨ b) → q
P1 Premise
P2 Premise
P3 Premise
.....
Pr Premise
C Conclusion
(P1 ∧ P2 ∧ . . . ∧ Pr) → C
is a tautology. In other words, validity means that if all the premises are true,
then the conclusion must be true.
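This definition of validity can be checked by brute force. A Python sketch (our addition) for the argument with premises a → q and b → q and conclusion (a ∨ b) → q: verify that (P1 ∧ P2) → C is true on all eight rows.

```python
from itertools import product

def implies(p, q):
    return (not p) or q

# An argument is valid when (P1 ∧ P2) → C is a tautology.  Here:
# P1: a → q,  P2: b → q,  C: (a ∨ b) → q.
valid = all(
    implies(implies(a, q) and implies(b, q),
            implies(a or b, q))
    for a, b, q in product((True, False), repeat=3)
)
print("valid:", valid)
# prints: valid: True
```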
Question
Answer
Well, that would work, but there are a couple of problems. First, the
truth table can get quite large. The truth table for
[(a → q) ∧ (b → q)] → [(a ∨ b) → q] has eight rows and nine columns.
It gets worse quickly, since each extra variable doubles the number of rows.
Answer
Example
As an example, we have the following proof of the argument given
above, which we considered in the preceding section:
1. a → q Premise
2. b → q Premise
3. ~a ∨ q 1, Switcheroo
4. ~b ∨ q 2, Switcheroo
5. (~a ∨ q) ∧ (~b ∨ q) 3,4 Rule C
6. (~a ∧ ~b) ∨ q 5, Distributive Law
7. ~(a ∨ b) ∨ q 6, De Morgan
8. (a ∨ b) → q 7, Switcheroo
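Each rewriting step in this proof relies on a logical equivalence, and each equivalence can be spot-checked by brute force. A Python sketch (our addition) for the two equivalences used most, Switcheroo and De Morgan:

```python
from itertools import product

# The table for p → q entered row by row, independent of any formula.
IMPLIES = {(True, True): True, (True, False): False,
           (False, True): True, (False, False): True}

# Switcheroo: p → q is equivalent to ~p ∨ q (lines 3, 4 and 8 above).
for p, q in product((True, False), repeat=2):
    assert IMPLIES[(p, q)] == ((not p) or q)

# De Morgan: ~(a ∨ b) is equivalent to ~a ∧ ~b (line 7 above).
for a, b in product((True, False), repeat=2):
    assert (not (a or b)) == ((not a) and (not b))

print("Switcheroo and De Morgan both check out")
```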
Question
I'm convinced that proofs may be a good thing, but I'm still a little
skeptical. What does a proof actually have to do with the validity of
an argument?
Answer
The only way to learn to find proofs is by looking at lots of examples and
doing lots of practice. In the following examples we'll try to give you some
tips as we go along.
7. Predicate Calculus
Question
What are you talking about? These are just three ordinary statements
in the propositional calculus:
p: All men are mortal.
q: Socrates is a man.
r: Socrates is mortal.
Answer
Question
OK. That was a tricky one. I now see that we cannot take those
statements as atomic statements, but should write them as compound
statements. Now I get it! It is just the transitive rule:
Something is a man → It is mortal
Something is Socrates → It is a man
Something is Socrates → It is mortal
Answer
This looks more convincing, but there is another catch: "Something is
a man" and "It is a man", while perfectly good sentences, are not
propositions (what, after all, are their truth values?). The same goes
for the other "statements" in the argument. No matter how we try to
rephrase the argument as a valid argument in the propositional calculus,
we are doomed to run into one technical difficulty or another.
Universal Quantifier
We begin by rewording the statement "All men are mortal" a little more
slickly than we did above:
Question
What are those square brackets doing around Px → Qx?
Answer
They define what is called the scope of the quantifier ∀x. That is, they
surround what it is we are claiming is true for all x.
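Over a finite domain, the quantified statement has a rough computational analogue. A Python sketch (our addition; the sets and names are made up for illustration): "for all x [Px → Qx]" becomes a check that Qx holds whenever Px does.

```python
# A rough finite-domain analogue of ∀x [Px → Qx].
men = {"Socrates", "Plato"}
mortals = {"Socrates", "Plato", "Fido"}

def P(x):
    return x in men      # Px: "x is a man"

def Q(x):
    return x in mortals  # Qx: "x is mortal"

domain = men | mortals
# "All men are mortal": Px → Qx, i.e. (not Px) or Qx, for every x.
print(all((not P(x)) or Q(x) for x in domain))
# prints: True
```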
Nature of logic
The concept of logical form is central to logic: the validity of an argument
is held to be determined by its logical form, not by its content.
Traditional Aristotelian syllogistic logic and modern symbolic logic are
examples of formal logics.
Logical form
Certain parts of the sentence must be replaced with schematic letters.
Thus, for example, the expression 'all As are Bs' shows the logical form
which is common to the sentences 'all men are mortals', 'all cats are
carnivores', 'all Greeks are philosophers' and so on.
That the concept of form is fundamental to logic was already recognized in
ancient times. Aristotle uses variable letters to represent valid inferences in
the Prior Analytics, leading Jan Łukasiewicz to say that the introduction of
variables was 'one of Aristotle's greatest inventions'. According to the
followers of Aristotle (such as Ammonius), only the logical principles stated
in schematic terms belong to logic, and not those given in concrete terms.
The concrete terms 'man', 'mortal', etc., are analogous to the substitution
values of the schematic placeholders 'A', 'B', 'C', which were called the
'matter' (Greek 'hyle') of the inference.
Consistency, which means that no theorem of the system contradicts another.
Soundness, which means that the system's rules of proof will never allow a
false inference from a true premise. If a system is sound and its axioms are
true then its theorems are also guaranteed to be true.
Completeness, which means that there are no true sentences in the system
that cannot, at least in principle, be proved in the system.
Some logical systems do not have all three properties. As an example, Kurt
Gödel's incompleteness theorems show that no standard formal system of
arithmetic can be both consistent and complete. At the same time, his
completeness theorem shows that first-order predicate logic with equality,
when not extended by specific axioms that make it a formal system of
arithmetic, is complete and consistent.
History of logic
During the later medieval period, major efforts were made to show that
Aristotle's ideas were compatible with Christian faith. During the later
period of the Middle Ages, logic became a main focus of philosophers, who
would engage in critical logical analyses of philosophical arguments.
Topics in logic
Syllogistic logic
The Organon was Aristotle's body of work on logic, with the Prior
Analytics constituting the first explicit work in formal logic, introducing the
syllogistic. The parts of syllogistic, also known by the name term logic, were
the analysis of the judgements into propositions consisting of two terms that
are related by one of a fixed number of relations, and the expression of
inferences by means of syllogisms that consisted of two propositions sharing
a common term as premise, and a conclusion which was a proposition
involving the two unrelated terms from the premises.
Aristotle's work was regarded in classical times and from medieval times in
Europe and the Middle East as the very picture of a fully worked out system.
It was not alone: the Stoics proposed a system of propositional logic that
was studied by medieval logicians; nor was the perfection of Aristotle's
system undisputed; for example the problem of multiple generality was
recognised in medieval times. Nonetheless, problems with syllogistic logic
were not seen as being in need of revolutionary solutions.
Predicate logic
Predicate logic is the generic term for symbolic formal systems such as first-
order logic, second-order logic, many-sorted logic, and infinitary logic.
Frege's original system of predicate logic was second-order, rather than first-
order. Second-order logic is most prominently defended (against the
criticism of Willard Van Orman Quine and others) by George Boolos and
Stewart Shapiro.
Modal logic
The logical study of modality dates back to Aristotle, who was concerned
with the alethic modalities of necessity and possibility, which he observed to
be dual in the sense of De Morgan duality. While the study of
necessity and possibility remained important to philosophers, little logical
innovation happened until the landmark investigations of Clarence Irving
Lewis in 1918, who formulated a family of rival axiomatizations of the
alethic modalities. His work unleashed a torrent of new work on the topic,
expanding the kinds of modality treated to include deontic logic and
epistemic logic. The seminal work of Arthur Prior applied the same formal
language to treat temporal logic and paved the way for the marriage of the
two subjects. Saul Kripke discovered (contemporaneously with rivals) his
theory of frame semantics which revolutionized the formal technology
available to modal logicians and gave a new graph-theoretic way of looking
at modality that has driven many applications in computational linguistics
and computer science, such as dynamic logic.
Informal reasoning
The motivation for the study of logic in ancient times was clear: it
is so that one may learn to distinguish good from bad arguments,
and so become more effective in argument and oratory, and
perhaps also to become a better person. Half of the works of
Aristotle's Organon treat inference as it occurs in an informal
setting, side by side with the development of the syllogistic, and in
the Aristotelian school, these informal works on logic were seen as
complementary to Aristotle's treatment of rhetoric.
Mathematical logic
Mathematical logic really refers to two distinct areas of research: the first is
the application of the techniques of formal logic to mathematics and
mathematical reasoning, and the second, in the other direction, the
application of mathematical techniques to the representation and analysis of
formal logic.
If proof theory and model theory have been the foundation of mathematical
logic, they have been but two of the four pillars of the subject. Set theory
originated in the study of the infinite by Georg Cantor, and it has been the
source of many of the most challenging and important issues in
mathematical logic, from Cantor's theorem, through the status of the Axiom
of Choice and the question of the independence of the continuum
hypothesis, to the modern debate on large cardinal axioms.
Recursion theory captures the idea of computation in logical and arithmetic
terms; its most classical achievements are the undecidability of the
Entscheidungsproblem by Alan Turing, and his presentation of the Church-
Turing thesis. Today recursion theory is mostly concerned with the more
refined problem of complexity classes — when is a problem efficiently
solvable? — and the classification of degrees of unsolvability.
Logic and computation
In the 1950s and 1960s, researchers predicted that when human knowledge
could be expressed using logic with mathematical notation, it would be
possible to create a machine that reasons, or artificial intelligence. This
turned out to be more difficult than expected because of the complexity of
human reasoning. In logic programming, a program consists of a set of
axioms and rules. Logic programming systems such as Prolog compute the
consequences of the axioms and rules in order to answer a query.
Controversies in logic
Just as we have seen there is disagreement over what logic is about, so there
is disagreement about what logical truths there are.
The logics discussed above are all "bivalent" or "two-valued"; that is, they
are most naturally understood as dividing propositions into true and false
propositions. Non-classical logics are those systems which reject bivalence.
Hegel developed his own dialectic logic that extended Kant's transcendental
logic but also brought it back to ground by assuring us that "neither in
heaven nor in earth, neither in the world of mind nor of nature, is there
anywhere such an abstract 'either–or' as the understanding maintains.
Whatever exists is concrete, with difference and opposition in itself".
In 1910 Nicolai A. Vasiliev rejected the law of excluded middle and the law
of contradiction and proposed the law of excluded fourth and logic tolerant
to contradiction. In the early 20th century Jan Łukasiewicz investigated the
extension of the traditional true/false values to include a third value,
"possible", so inventing ternary logic, the first multi-valued logic.
Logics such as fuzzy logic have since been devised with an infinite number
of "degrees of truth", represented by a real number between 0 and 1.
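One common formulation of fuzzy connectives (Zadeh's operators; an illustrative sketch of ours, not from the text) takes truth values as reals in [0, 1], with min for "and", max for "or", and 1 − x for "not".

```python
# Fuzzy connectives in the Zadeh formulation: degrees of truth in [0, 1].
def f_and(a, b):
    return min(a, b)

def f_or(a, b):
    return max(a, b)

def f_not(a):
    return 1.0 - a

cloudy, windy = 0.7, 0.3
print(f_and(cloudy, windy), f_or(cloudy, windy), f_not(windy))
```

Restricted to the values 0 and 1, these operators reduce to the classical two-valued connectives.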
Intuitionistic logic was proposed by L.E.J. Brouwer as the correct logic for
reasoning about mathematics, based upon his rejection of the law of the
excluded middle as part of his intuitionism. Brouwer rejected formalization
in mathematics, but his student Arend Heyting studied intuitionistic logic
formally, as did Gerhard Gentzen. Intuitionistic logic has come to be of great
interest to computer scientists, as it is a constructive logic, and is hence a
logic of what computers can do.
Modal logic is not truth conditional, and so it has often been proposed as a
non-classical logic. However, modal logic is normally formalized with the
principle of the excluded middle, and its relational semantics is bivalent, so
this inclusion is disputable.
Is logic empirical?
In a paper entitled "Is Logic Empirical?", Hilary Putnam argued that the facts
of quantum mechanics might give us grounds to revise classical logic itself.
Another paper by the same name by Sir Michael Dummett argues that
Putnam's desire for realism mandates the law of distributivity. Distributivity
of logic is essential for the realist's understanding of how propositions are
true of the world in just the same way as he has argued the principle of
bivalence is. In this way, the question, "Is logic empirical?" can be seen to
lead naturally into the fundamental controversy in metaphysics on realism
versus anti-realism.
Implication: strict or material?
The material conditional gives rise to some notorious "paradoxes of material
implication." The first class of paradoxes involves counterfactuals, such as
"If the moon is made of green cheese, then 2+2=5", which are puzzling
because natural language does not support the principle of explosion.
Eliminating this class
of paradoxes was the reason for C. I. Lewis's formulation of strict
implication, which eventually led to more radically revisionist logics such as
relevance logic.
Hegel was deeply critical of any simplified notion of the Law of Non-
Contradiction. It was based on Leibniz's idea that this law of logic also
requires a sufficient ground to specify from what point of view (or
time) one says that something cannot contradict itself: a building, for
example, both moves and does not move; the ground for the first is our solar
system, for the second the earth. In Hegelian dialectic the law of non-
contradiction, of identity, itself relies upon difference and so is not
independently assertable.