Classical Mathematical Logic: The Semantic Foundations of Logic

About this ebook

In Classical Mathematical Logic, Richard L. Epstein relates the systems of mathematical logic to their original motivations to formalize reasoning in mathematics. The book also shows how mathematical logic can be used to formalize particular systems of mathematics. It sets out the formalization not only of arithmetic, but also of group theory, field theory, and linear orderings. These lead to the formalization of the real numbers and Euclidean plane geometry. The scope and limitations of modern logic are made clear in these formalizations.


The book provides detailed explanations of all proofs and the insights behind the proofs, as well as detailed and nontrivial examples and problems. The book has more than 550 exercises. It can be used in advanced undergraduate or graduate courses and for self-study and reference.



Classical Mathematical Logic presents a unified treatment of material that until now has been available only by consulting many different books and research articles, written with various notation systems and axiomatizations.

Language: English
Release date: Dec 18, 2011
ISBN: 9781400841554

    Book preview

    Classical Mathematical Logic - Richard L. Epstein


    Introduction

    Classical mathematical logic is an outgrowth of several trends in the 19th century.

    In the early part of the 19th century there was a renewed interest in formal logic. Since at least the publication of Logic or the Art of Thinking by Antoine Arnauld and Pierre Nicole in 1662, formal logic had meant merely the study of the Aristotelian syllogisms. Richard Whately, 1827, and others began a more serious study of that subject, though still far below the level of clarity and subtlety of the medieval logicians (compare Chapter XXII).

    Shortly thereafter, George Boole, Augustus De Morgan and others began a mathematization of reasoning. This was part of a general trend of abstraction in mathematics. Using algebraic techniques, Boole, 1854, treated the study of Aristotelian syllogisms mathematically; De Morgan, 1847, showed the limitations of the Aristotelian tradition in dealing with inferences that depended on relations.

    At roughly the same time, problems working with the calculus, particularly justifications and theorems about approximation procedures, prompted many mathematicians to re-examine their conception of the real numbers. Richard Dedekind, 1872, and Karl Weierstrass offered new analyses, which were elaborated in the work of Georg Cantor, 1895-1897, on theories of collections—now called ‘sets’ (Chapter IX.B). Defining the real numbers in terms of infinite collections or limits of infinite sequences of rationals (Chapter XVII) led to a view of mathematics as the study of relations among objects. Functions were understood entirely as relations, with the idea of functions as processes entirely abandoned. Two kinds of paradoxes resulted from this work. The set-theoretic paradoxes were those that led to a contradiction from postulating that there is a set of all sets, or that there is a set of all those sets that are not elements of themselves. These showed that the intuition that any objects whatsoever could be brought together as a collection was faulty. Another kind of paradox came from the definitions of the real numbers and the view of functions as relations. Constructions were given of a continuous function that is nowhere differentiable and of a space-filling curve.

    The discovery of non-Euclidean geometries led to a greater interest in the procedure of axiomatizing. Starting with a basic system of axioms, one could obtain different geometries depending on whether one took Euclid's parallel postulate or one of two postulates that contradicted it. By the end of the 19th century Giuseppe Peano, 1889, had set out his axiomatization of the natural numbers and David Hilbert, 1899, had perfected an axiomatization of plane geometry.

    In the early years of the 20th century Alfred North Whitehead and Bertrand Russell, 1910-1913, brought together many of these trends. They showed how to axiomatize logic, and within that axiomatization, apparently avoiding paradoxes related to the set-theoretic ones, they were able to develop a very substantial part of mathematics.

    Leopold Löwenheim and others, on the other hand, began to develop the idea of a model of an axiom system, with models for the different kinds of plane geometry as archetype.

    It was only in the 1930s with the work of Alfred Tarski, Kurt Gödel, Hilbert, and many others that something recognizably like the work in this text was developed. (See van Heijenoort, 1967 for seminal works by these and others, along with introductory essays on those works.)

    The word ‘mathematical’ in ‘classical mathematical logic’, then, had two meanings: the mathematization of models of reasoning, and the use of formal logic to formalize reasoning in mathematics.

    This development of formal logic stood aside from considerations of other parts of Aristotle's Organon. The study of reasoning with non-valid inferences, the analysis of cause-and-effect reasoning, and the use of Aristotelian logic for reasoning about masses and qualities were studied by John Stuart Mill, 1874, and others, but were not of interest to mathematical logicians. Even the use of formal logic for the analysis of reasoning in ordinary language was almost completely ignored. (Those subjects and their relation to formal logic are presented in Epstein, 1994 and Epstein, 2001.)

    The orientation of formal logic, following the mathematicians’ lead, was in terms of reasoning about things. Processes were considered fully analyzed by casting them in terms of relations (Chapter VIII), and within the mathematical community the paradoxes of the calculus came to be seen as natural consequences of fully satisfactory definitions. The notions of qualities and predicates came to be highly restricted, being identified in formal logic with their extensions (Chapter IX). The assumption that reasoning was to be restricted to reasoning about things (Chapter III) came to dominate to such an extent that it was no longer explicitly mentioned. The ‘classical’ in ‘classical mathematical logic’ in the 20th century came to mean the extreme restriction of the notion of predicate to just its extension: those objects of which it is true. This was in opposition to other views of reasoning in which epistemological, psychological, modal, or other aspects of propositions and predicates were taken into account (see Epstein, 1990).

    In this text we will see the basic outlines of the limitations as well as the scope of modeling reasoning within classical mathematical logic.

    A.  Propositions

    Other views of propositions

    B.  Types

    • Exercises for Sections A and B

    C.  The Connectives of Propositional Logic

    • Exercises for Section C

    D.  A Formal Language for Propositional Logic

    1  Defining the formal language

    A platonist definition of the formal language

    2  The unique readability of wffs

    3  Realizations

    • Exercises for Section D

    E.  Classical Propositional Logic

    1  The classical abstraction and truth-functions

    2  Models

    • Exercises for Sections E.1 and E.2

    3  Validity and semantic consequence

    • Exercises for Section E.3

    F.  Formalizing Reasoning

    • Exercises for Section F

    Proof by induction


    To begin our analysis of reasoning we need to be clear about what kind of thing is true or false. Then we will look at ways to reason with combinations of those things.

    A. Propositions

    When we argue, when we prove, we do so in a language. And we seem to be able to confine ourselves to declarative sentences in our reasoning.

    I will assume that what a sentence is and what a declarative sentence is are well enough understood by us to be taken as primitive, that is, undefined in terms of any other fundamental notions or concepts. Disagreements about some particular examples may arise and need to be resolved, but our common understanding of what a declarative sentence is will generally suffice.

    So we begin with sentences, written (or uttered) concatenations of inscriptions (or sounds). To study these we may ignore certain aspects, such as what color ink they are written in, leaving ourselves only certain features of sentences to consider in reasoning. The most fundamental is whether they are true or false.

    In general we understand well enough what it means for a simple sentence such as ‘Ralph is a dog’ to be true or to be false. For such sentences we can regard truth as a primitive notion, one we understand how to use in most applications, while falsity we can understand as the opposite of truth, the not-true. Our goal is to formalize truth and falsity for more complex and controversial sentences.

    Which declarative sentences are true or false, that is, have a truth-value? It is sufficient for our purposes in logic to ask whether we can agree that a particular sentence, or class of sentences as in a formal language, is declarative and whether it is appropriate for us to assume it has a truth-value. If we cannot agree that a certain sentence such as ‘The King of France is bald’ has a truth-value, then we cannot reason together using it. That does not mean that we adopt different logics or that logic is psychological; it only means that we differ on certain cases.

    Propositions A proposition is a written or uttered declarative sentence used in such a way that it is true or false, but not both.

    Other views of propositions

    There are other views of what propositions are. Some say that what is true or false is not the sentence, but the meaning or thought expressed by the sentence. Thus ‘Ralph is a dog’ is not a proposition; it expresses one, the very same one expressed by ‘Ralph is a domestic canine’.

    Platonists take this one step further. A platonist, as I use the term, is someone who believes that there are abstract objects not perceptible to our senses that exist independently of us. Such objects can be perceived by us only through our intellect. The independence and timeless existence of such objects account for objectivity in logic and mathematics. In particular, propositions are abstract objects, and a proposition is true or is false, though not both, independently of our even knowing of its existence.

    But a platonist, as much as a person who thinks a proposition is the meaning of a sentence or a thought, reasons in language, using declarative sentences that, they say, represent, or express, or point to propositions. To reason with a platonist it is not necessary that I believe in abstract propositions or thoughts or meanings. It is enough that we can agree that certain sentences are, or from the platonist’s viewpoint represent, propositions. Whether for the platonist such a sentence expresses a true proposition or a false proposition is much the same question as whether, from my point of view, it is true or is false.¹

    B. Types

    When we reason together, we assume that words will continue to be used in the same way. That assumption is so embedded in our use of language that it’s hard to think of a word except as a type, that is, as a representative of inscriptions that look the same and utterances that sound the same. I don’t know how to make precise what we mean by ‘look the same’ or ‘sound the same’. But we know well enough in writing and conversation what it means for two inscriptions or utterances to be equiform.

    Words are types We will assume that throughout any particular discussion equiform words will have the same properties of interest to logic. We therefore identify them and treat them as the same word. Briefly, a word is a type.

    This assumption, while useful, rules out many sentences we can and do reason with quite well. Consider ‘Rose rose and picked a rose.’ If words are types, we have to distinguish the three equiform inscriptions in this sentence, perhaps as ‘Rosename roseverb and picked a rosenoun’.

    Further, if we accept this agreement, we must avoid words such as ‘I’, ‘my’, ‘now’, or ‘this’, whose meaning or reference depends on the circumstances of their use. Such words, called indexicals, play an important role in reasoning, yet our demand that words be types requires that they be replaced by words that we can treat as uniform in meaning or reference throughout a discussion.

    Suppose now that I write down a sentence that we take to be a proposition:

    All dogs bark.

    Later I want to use that sentence in an argument, say:

    If all dogs bark, then Ralph barks. All dogs bark.

    Therefore, Ralph barks.

    But we haven’t used the same sentence twice, for sentences are inscriptions.

    Since words are types, we can argue that these two equiform sentences should both be true or both false. It doesn’t matter to us where they’re placed on the paper, or who said them, or when they were uttered. Their properties for logic depend only on what words (and punctuation) appear in them in what order. Perhaps it is a simplification, but let’s agree that any property that differentiates them won’t be of concern to our reasoning.

    Propositions are types  In any discussion in which we use logic we’ll consider a sentence to be a proposition only if any other sentence or phrase that is composed of the same words in the same order can be assumed to have the same properties of concern to logic during that discussion. We therefore identify equiform sentences or phrases and treat them as the same sentence. Briefly, a proposition is a type.

    It is important to identify both sentences and phrases, for in the inference above we want to identify the phrase ‘all dogs bark’ in ‘If all dogs bark, then Ralph barks.’ with ‘All dogs bark.’.

    The device I just used of putting single quotation marks around a word or phrase is a way of naming that word or phrase or any linguistic unit. We need some such convention because confusion can arise if it’s not clear whether a word or phrase is being used or referred to as a word or phrase. For example, when I write:

    The Taj Mahal has eleven letters

    I don’t mean that the building has eleven letters, but that the phrase does, and I should write:

    ‘The Taj Mahal’ has eleven letters

    When we do this we say that we mention the word or phrase in quotation marks and that the entire inscription including the quotes is a quotation name of the word or phrase. Otherwise, we simply use the word or phrase, as we normally do. Note that the assumption that words are types is essential in using quotation names. We can also indicate we are mentioning a linguistic unit by italicizing it or putting it in display format. And single quotation marks are used for quoting direct speech, too.

    The device of enclosing a word or phrase in double quotation marks is equivalent to a wink or a nod in conversation, indicating that we’re not to be taken literally or that we don’t really subscribe to what we’re saying. Double quotes are called scare quotes, and they allow us to get away with murder.

    Exercises for Sections A and B _________

    1  Of the following, (i) Which are declarative sentences? (ii) Which contain indexicals? (iii) Which are propositions?

    a. Ralph is a dog.

    b. I am 2 meters tall.

    c. Is any politician not corrupt?

    d. Feed Ralph.

    e. Ralph didn’t see George.

    f. Whenever Juney barks, Ralph gets mad.

    g. If anyone should say that cats are nice, then he is confused.

    h. If Ralph should say that cats are nice, then he is confused.

    i. If Ralph should say that cats are nice, then Ralph is confused.

    j. Dogs can contract myxomatosis.

    k. 2 + 2 = 4 .

    2  Explain why we cannot take sentence types as propositions if we allow the use of indexicals in our reasoning.

    C. The Connectives of Propositional Logic

    We begin with propositions that are sentences. There are two features of such sentences that contribute to our reasoning: their syntax, by which we mean the analysis of their form or grammar, and their semantics, by which we mean an analysis of their truth-values and meaning or content. These are inextricably linked. The choice of what forms of propositions we’ll study leads to what and how we can mean, and the meaning of the forms leads to which of those forms are acceptable.

    There are so many properties of propositions we could take into account in our reasoning that we must begin by restricting our attention to only a few. Our starting point is propositional logic where we ignore the internal structure of propositions except as they are built from other propositions in specified ways.

    There are many ways to connect propositions to form a new proposition. Some are easy to recognize and use. For example, ‘Ralph is a dog and dogs bark’ can be viewed as two sentences joined by the connective ‘and’.

    Some other common connectives are: ‘but’, ‘or’, ‘although’, ‘while’, ‘only if’, ‘if…then…’, ‘neither…nor…’. We want to strike a balance between choosing as few as possible to concentrate on in order to simplify our semantic analyses, and as many as possible so that our analyses will be broadly applicable.

    Our starting point will be the four traditional basic connectives of logic: ‘and’, ‘or’, ‘if…then…’, ‘not’. In English ‘not’ is used in many different ways. Here we take it as the connective that precedes a sentence, as in ‘It’s not the case that…’, or other uses that can be assimilated to that understanding. These four connectives will give us a rich enough grammatical basis to begin our logical investigation.

    These English connectives have many connotations and properties, some of which may be of no concern to us in logic. For example, in American English ‘not’ is usually said more loudly than the surrounding words in the sentence. We’ll replace these connectives with formal symbols to which we will give fairly explicit and precise meanings based on our understanding of the English words.

    For example, we will write ‘∧’ for ‘and’, ‘∨’ for ‘or’, ‘→’ for ‘if…then…’, and ‘¬’ for ‘not’, giving ‘Ralph is a dog ∧ dogs bark’, corresponding to the earlier example.

    Parentheses are a further formal device important for clarity. In ordinary speech we might say, ‘If George is a duck then Ralph is a dog and Dusty is a horse’. But it’s not clear which of the following is meant:

    If George is a duck, then: Ralph is a dog and Dusty is a horse.

    If George is a duck, then Ralph is a dog; and Dusty is a horse.

    Such ambiguity should have no place in our reasoning. We can require the use of parentheses to enforce one of those two readings:

    George is a duck → (Ralph is a dog ∧ Dusty is a horse)

    (George is a duck → Ralph is a dog) ∧ Dusty is a horse

    To lessen the proliferation of quotation marks in naming linguistic items, we can assume that a formal symbol names itself. Each of ¬, ∧, ∨, and → is a formal connective. Here is some more terminology.

    The result of joining two sentences with ∧ is a conjunction of them; each of the original propositions is a conjunct we conjoin with the other. When ∨ joins two sentences the result is a disjunction formed by disjoining the disjuncts. The proposition ‘Ralph is a dog or Ralph does not bark’ is not a disjunction, because the formal symbol ‘∨’ does not appear in it.

    The result of writing ¬ in front of another sentence is the negation of it; it is a negation. For example, the negation of ‘Ralph is a dog’ is ‘¬(Ralph is a dog)’, where the parentheses are used for clarity.

    The symbol → is called the arrow or the conditional. The result of joining two sentences with → is a conditional; the proposition on the left is called the antecedent, and the one on the right is the consequent.

    Exercises for Section C ___________

    1  Classify each of the following as a conjunction, disjunction, negation, conditional, or none of those. If a conditional, identify the antecedent and consequent.

    dogs bark.

    dogs bark.

    (cats bark).

    dogs bark.

    e. Cats are mammals and dogs are mammals.

    f. Either Ralph is a dog or Ralph isn’t a dog.

    (cats are dogs)).

    h. Cats aren’t nice.

    dogs bark).

    j. Ralph is a dog.

    k. It is possible that Ralph is a dog.

    l. Some dogs are not white.

    2  a. Write a sentence that is a negation of a conditional, the antecedent of which is a conjunction.

    b. Write a sentence that is a conjunction of disjunctions, each of whose disjuncts is either a negation or has no formal symbols in it.

    3  Write a sentence that might occur in daily speech that is ambiguous but which can be made precise by the use of parentheses, indicating at least two ways to parse it.

    4  List at least three words or phrases in English not discussed in the text that are used to form a proposition from one or more propositions and that you believe are important in a study of reasoning.

    D. A Formal Language for Propositional Logic

    1. Defining the formal language

    We have no dictionary, no list of all propositions in English, nor do we have a method for generating all propositions, for English is not a fixed, formal, static language. But by using variables we can introduce a rigid formal language to make precise the syntax of the propositions we will study.

    Let p0, p1, … be propositional variables. The symbols of the language are these variables, the connectives ¬, ∧, ∨, →, and parentheses. This will be the formal language. To be able to talk about that language, indeed even to give a precise definition of it, we need further variables, the metavariables A, B, C, A0, A1, A2, … to stand for any of p0, p1, p2, … or complex expressions formed from those, and p, q to stand for any of p0, p1, p2, …. The analogue of a sentence in English is a well-formed formula, or wff. We could define that by:

    a. (pi) is a wff for each i = 0, 1, 2,….

    b. If A and B are wffs, then so are (¬A), (A∧B), (A∨B), and (A→B).

    c. Only such concatenations of symbols as arise from repeated applications of (a) and (b) are wffs.

    But clause (c) is too vague. So instead we use an inductive definition, that is, one that correlates the structure of what is being defined to the natural numbers.

    Wffs and the formal language L(¬, →, ∧, ∨, p0, p1, …)

    i. For each i = 0, 1, 2,…, (pi) is an atomic wff, to which we assign the number 1.

    ii. If A and B are wffs and the maximum of the numbers assigned to A and to B is n, then each of (¬A), (A∧B), (A∨B), and (A→B) is a compound wff to which we assign the number n + 1. These are compound wffs.

    iii. A concatenation of symbols is a wff if and only if it is assigned some number n ≥ 1 according to (i) or (ii) above.

    We will often write just L(¬, →, ∧, ∨) since we usually use the same propositional variables.

    Propositions are written or uttered sentences. Since we previously agreed to view words and propositions as types, let’s now agree that the symbols and formulas of the formal language are types, too.
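
    The inductive definition can be mirrored in a short program, which some readers may find helpful for experimenting. The following Python sketch is not part of the text; it assumes a representation of wffs as nested tuples, with ('p', i) playing the role of the atomic wff (pi) and ('not', …), ('and', …), ('or', …), ('imp', …) playing the roles of the compound wffs, and it computes the number that clauses (i) and (ii) assign.

def assigned_number(wff):
    """The number assigned to a wff by clauses (i) and (ii) of the inductive definition."""
    if wff[0] == 'p':                          # clause (i): an atomic wff (pi) is assigned 1
        return 1
    if wff[0] == 'not':                        # clause (ii): one more than the maximum of the
        return assigned_number(wff[1]) + 1     # numbers assigned to the immediate parts
    return max(assigned_number(wff[1]), assigned_number(wff[2])) + 1

# (((p0) → (p1)) → ((p1) → (p0))) in the tuple representation
example = ('imp', ('imp', ('p', 0), ('p', 1)), ('imp', ('p', 1), ('p', 0)))
print(assigned_number(example))                # 3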

    Now we need to know that there is only one way to parse each wff.

    A platonist definition of the formal language

    Platonists conceive of the formal language as a complete infinite collection of formulas. Infinite collections, they say, are real, though abstract, entities. So they define the formal language as:

    L(¬, →, ∧, ∨, p0, p1, …) is the smallest collection containing (pi) for each i ≥ 0 and closed under the formation of (¬A), (A∧B), (A∨B), and (A→B).

    One collection is said to be smaller than another if it is contained in or equal to the other collection.

    Platonists argue that their definition is clearer and more precise because it replaces condition (c) with strict criteria on collections. Whatever clarity is gained, however, depends on our accepting that there are completed infinite totalities.

    2. The unique readability of wffs

    We now show that each wff can be read in only one way, which justifies the inductive definition of wffs in Section D.

    Theorem 1 Unique readability of wffs

    There is one and only one way to parse each wff.

    Proof To prove this we will use a proof by induction; in an appendix to this chapter I present a short review of that method.

    To each primitive symbol α of the formal language assign an integer λ(α) according to the following chart:

    λ(‘(’) = –1    λ(‘)’) = 1    λ(pi) = 0 for each i    λ(¬) = λ(∧) = λ(∨) = λ(→) = 0

    To the concatenation of symbols α1 α2…αn assign the number:

    λ(α1 α2…αn) = λ(α1) + λ(α2) + … + λ(αn)

    First I will show that λ(A) = 0 for every wff A, using induction on the number of occurrences of the connectives in A.

    If there are no occurrences, then A is atomic, that is, for some i ≥ 0, A is (pi). Then λ(‘(’) = –1, λ(pi) = 0, and λ(‘)’) = 1. Adding, we have λ(A) = 0.

    Otherwise A has one of the forms (¬B), (B∧C), (B∨C), or (B→C). By induction λ(B) = λ(C) = 0, so in each case by adding, we have λ(A) = 0.

    Now I’ll show that if α is an initial segment of a wff (reading from the left) other than the entire wff itself, then λ(α) < 0; and if α is a final segment other than the entire wff itself, then λ(α) > 0. So no proper initial or final segment of a wff is a wff. To establish this I will again use induction on the number of occurrences of connectives in the wff. I’ll let you establish the base case for atomic wffs, where there are no (that is, zero) connectives.

    Suppose the claim is true of every wff containing fewer than n occurrences of the connectives. If A contains n occurrences and has one of the forms (B∧C), (B∨C), or (B→C), then an initial segment of A must have one of the following forms:

    i. (

    ii. (β where β is an initial segment of B

    iii. (B∧γ, (B∨γ, or (B→γ, where γ is a (possibly empty) initial segment of C

    The cases where A has the form (¬B) follow similarly, and I’ll leave those and the proof for final segments to you.

    Now suppose that some wff could be parsed in two ways, say (A → B) is the same as (C → D) where A is not the same wff as C. Then one of A, C is a proper initial segment of the other and so, by what we have just shown, is not a wff, which is a contradiction. The other connectives are treated similarly, so there is only one way to parse each wff.
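
    The parenthesis count λ can be checked mechanically. Here is a minimal Python sketch, not from the text, assuming the chart above: ‘(’ counts –1, ‘)’ counts 1, and variables and connectives count 0. For the sample wff, λ is 0, while every proper initial segment has a negative count.

def lam(symbols):
    """Sum of λ over a sequence of primitive symbols."""
    weights = {'(': -1, ')': 1}                # every other symbol counts 0
    return sum(weights.get(s, 0) for s in symbols)

# (((p0)→(p1))→((p1)→(p0))) written out as a sequence of primitive symbols
wff = ['(', '(', '(', 'p0', ')', '→', '(', 'p1', ')', ')', '→',
       '(', '(', 'p1', ')', '→', '(', 'p0', ')', ')', ')']

print(lam(wff))                                            # 0 for the whole wff
print(all(lam(wff[:k]) < 0 for k in range(1, len(wff))))   # True: proper initial segments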

    Theorem 1 shows that every wff has exactly one number assigned to it in the inductive definition of wffs. Hence we can make the following definition.

    The length of a wff The length of a wff is the number assigned to it in the inductive definition of wffs.

    For example:

    The length of (p0) is 1.

    The length of (¬(p32)) is 2.

    The length of (((p0)→ (p1)) → ((p1)→ (p0))) is 3.

    Now when we want to show that all wffs have some property we can use the following method of proof.

    Induction on the length of wffs First show that the atomic wffs have the property and then show that if all wffs of length n have it, so do all wffs of length n + 1.
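
    As a small illustration of the kind of property established by induction on the length of wffs, the Python sketch below, which is not from the text, checks by brute force that every wff of length at most 3 built from p0 and p1 has as many left parentheses as right parentheses.

def wffs_up_to(n):
    """All wffs, as fully parenthesized strings, of length at most n over p0 and p1
    (some wffs are generated more than once)."""
    layers = [['(p0)', '(p1)']]                # length 1: the atomic wffs
    for _ in range(n - 1):
        prev = [w for layer in layers for w in layer]
        new = ['(¬' + a + ')' for a in prev]
        new += ['(' + a + c + b + ')' for a in prev for b in prev for c in '∧∨→']
        layers.append(new)
    return [w for layer in layers for w in layer]

print(all(w.count('(') == w.count(')') for w in wffs_up_to(3)))   # True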

    Excessive parentheses can make it difficult to read a formal wff, for example:

    B)→ C). We can also dispense with the outermost parentheses and those surrounding a variable, and we can use [ ] in place of the usual parentheses in informal discussions. So the example above can be written informally as:

    Now it is easier to see that the length of this wff is 5.

    We can write extended conjunctions or disjunctions without parentheses if we adopt the convention that conjuncts or disjuncts are associated to the left. For example, p1 ∧ p2 ∧ p3 ∧ p4 abbreviates ((p1 ∧ p2) ∧ p3) ∧ p4.

    Formal wffs exemplify the forms of propositions. To talk about the forms of wffs of the formal language, we use schema: expressions built from the metavariables and the connectives. A wff that results from replacing the metavariables of a scheme by wffs is an instance of the scheme; for example, (p136) is an instance of the scheme A.

    We can now give an inductive definition of subformula:

    If C has length 1, then it has just one subformula, C itself.

    If C has length n + 1 then:

    If C is ¬A, its subformulas are C, A, and the subformulas of A.

    If C is A ∧ B, A ∨ B, or A → B, its subformulas are C, A, B, and the subformulas of A and of B.

    We say pi appears in A if (pi) is a subformula of A, and write ‘A(p)’ to mean that p appears in A, and ‘C(A)’ to mean that A is a subformula of C.
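
    The definition of subformula translates directly into a recursive computation. Here is a minimal Python sketch, not from the text, again using nested tuples for wffs: ('p', i) for (pi), and ('not', …), ('and', …), ('or', …), ('imp', …) for the compound wffs.

def subformulas(c):
    """The subformulas of a wff, following the inductive definition."""
    if c[0] == 'p':                    # length 1: the only subformula is c itself
        return [c]
    if c[0] == 'not':                  # ¬A: c, A, and the subformulas of A
        return [c] + subformulas(c[1])
    return [c] + subformulas(c[1]) + subformulas(c[2])   # A∧B, A∨B, A→B

def appears(i, a):
    """pi appears in A if and only if (pi) is a subformula of A."""
    return ('p', i) in subformulas(a)

example = ('imp', ('not', ('p', 0)), ('and', ('p', 1), ('p', 0)))
print(subformulas(example))
print(appears(2, example))             # False: p2 does not appear in the example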

    In some proofs about the language it is useful to have a numbering of all wffs that assigns distinct numbers to distinct wffs. There are many ways to number all wffs, and it won’t matter for what follows which one we use. I include a particular one here for completeness.

    Numbering of wffs

    This numbering relies on the Fundamental Theorem of Arithmetic: for each natural number there is, except for the order of multiplication, one and only one way to factor it into primes. We first define a function from wffs to natural numbers:

    Then we number all wffs by taking the nth wff to be the wff that is assigned the nth smallest number that is an output of the function f.
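
    The particular function f is not reproduced in this excerpt. Purely as an illustration of the idea, and not as Epstein’s definition, here is one prime-power coding of wffs in Python; by the Fundamental Theorem of Arithmetic, distinct wffs receive distinct numbers, so such a coding could serve as an f.

def f(wff):
    """One possible coding of wffs by natural numbers (illustrative only)."""
    kind = wff[0]
    if kind == 'p':
        return 2 ** (wff[1] + 1)                   # (pi) is sent to 2^(i+1)
    if kind == 'not':
        return 3 ** f(wff[1])                      # ¬A is sent to 3^f(A)
    bases = {'and': (5, 7), 'or': (11, 13), 'imp': (17, 19)}
    a, b = bases[kind]
    return a ** f(wff[1]) * b ** f(wff[2])         # a different pair of primes per connective

print(f(('p', 0)), f(('not', ('p', 0))), f(('imp', ('p', 0), ('p', 1))))   # 2 9 37662769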

    3. Realizations


    Realizations A realization is an assignment of propositions to some or all of the propositional variables. The realization of a formal wff is the formula we get when we replace the propositional variables appearing in the formal wff with the propositions assigned to them; it is a semi-formal wff. The semi-formal language for that realization is the collection of realizations of formal wffs all of whose propositional variables are realized. We call the semi-formal language of a realization semi-formal English or formalized English. I will use the same metavariables A, B, C, A0, A1, A2,…for semi-formal wffs.

    For example, we could take as a realization the following assignment of propositions to variables:

    (1)  p0 ‘Ralph is a dog’

     p1 ‘Four cats are sitting in a tree’

     p2  ‘Four is a lucky number’

     p3  ‘Dogs bark’

     p4  ‘Juney is barking loudly’

     p5  ‘Juney is barking’

     p6  ‘Dogs bark’

     p7  ‘Ralph is barking’

     p8  ‘Cats are nasty’

     p9  ‘Ralph barks’

     p47  ‘Howie is a cat’

     p301  ‘Ralph will catch a cat’

     p312  ‘Bill is afraid of dogs’

     p317  ‘Bill is walking quickly’

     p4318  ‘If Ralph is barking, then Ralph will catch a cat’

     p4719  ‘Ralph is barking’

    Some of the semi-formal wffs of this realization are:

    ¬(four cats are sitting in a tree)

    Ralph barks → Bill is walking quickly

    Juney is barking loudly → Juney is barking

    dogs bark → Ralph barks

    The second of these is the realization of ((p9) → (p317)). Note that not all variables need be realized. Nor need we assign distinct propositions to distinct variables. Here the realization of p7 is the same as the realization of p4719. And note well that a realization is linguistic, a formalized fragment of English.

    The propositions that we assign to the variables are atomic, that is, they are the simplest propositions of the realization, ones whose internal structure will not be under consideration in this particular analysis. So the assignment of ‘If Ralph is barking, then he will catch a cat’ to p4318 is a bad choice, for it will not allow us to make a semantic analysis based on the form of that proposition in relation to ‘Ralph is barking’. Generally speaking, the atomic propositions of a realization shouldn’t include any English expressions that we’ve agreed to formalize with our formal connectives.
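
    In programming terms a realization is simply a table from propositional variables to sentences. The following Python sketch is not from the text; it uses a fragment of realization (1) and produces the semi-formal wff that realizes a formal wff, with the formal wff again represented as a nested tuple.

realization = {                       # a fragment of realization (1)
    0: 'Ralph is a dog',
    5: 'Juney is barking',
    9: 'Ralph barks',
    317: 'Bill is walking quickly',
}

def realize(wff):
    """Replace each propositional variable by the proposition assigned to it."""
    kind = wff[0]
    if kind == 'p':
        return realization[wff[1]]
    if kind == 'not':
        return '¬(' + realize(wff[1]) + ')'
    symbol = {'and': '∧', 'or': '∨', 'imp': '→'}[kind]
    return '(' + realize(wff[1]) + ' ' + symbol + ' ' + realize(wff[2]) + ')'

print(realize(('imp', ('p', 9), ('p', 317))))   # (Ralph barks → Bill is walking quickly)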

    Before we can speak of the realizations of compound wffs as propositions we must decide upon interpretations of the formal connectives.

    Exercises for Section D __________

    1  Why do we introduce a formal language?

    2  Identify which of the following are formal (unabbreviated) wffs:

    3  Abbreviate the following wffs according to our conventions on abbreviations:

    4  Write the following informally presented wffs without abbreviations:

    5  Give an example (using abbreviations) of a formula that is:

    a. A conjunction, the conjuncts of which are disjunctions of either atomic propositions or negated atomic propositions.

    b. A conditional whose antecedent is a disjunction of negations and whose consequent is a conditional whose consequent is a conditional.

    6  a. Calculate the length of the following wffs.

    b. Give an unabbreviated example of a wff of length 5. For a wff of length 5 what is the minimum number of parentheses that can be used? The maximum? For a wff of length 10? (Hint: See Exercise 5 on p. 26 below.)

    7  List all subformulas of each of the wffs in Exercise 6.a. What propositional variables appear in them?

    8  Distinguish the following: ordinary language, the formal language, a semi-formal language.

    9  a. Give the realization of the following, using (1) on p. 11:

    four is a lucky number) →

    (If Ralph is barking then he will catch a cat → Howie is a cat)

    c. Exhibit formal wffs of which the following could be taken to be realizations:

    i. Cats are nasty → Howie is nasty

    Ralph is a puppet) → no number greater than 4 billion is a perfect square

    E. Classical Propositional Logic

    1. The classical abstraction and truth-functions

    What are the simplest interpretations we can provide for the formal connectives that will be consonant with the assumptions we’ve made? We ignore every aspect of propositions other than their form and the single semantic property that makes them propositions.

    The classical abstraction The only properties of a proposition that matter to logic are its form and its truth-value.

    When one of our connectives connects two sentences, how are we to reason? From the truth of one proposition how could we deduce the truth of another? Without some regularity reasoning cannot take place.

    The Fregean assumption The truth-value of a proposition is determined by its form and the semantic properties of its constituents.

    So if we join two atomic propositions with one of our connectives, or put ¬ in front of one, the truth-value of the resulting proposition must depend on only the truth-values of those atomic propositions. The connectives must operate semantically as functions of the truth-values of the constituent propositions. This is what we mean when we say they are truth-functional.

    The evaluations of ¬ and ∧ are given by the following tables, where p and q stand for atomic propositions, and ‘T’ and ‘F’ stand for ‘true’ and ‘false’.

    p   ¬p
    T   F
    F   T

    p   q   p ∧ q
    T   T   T
    T   F   F
    F   T   F
    F   F   F

    That is, ¬p is T if p is F, and F if p is T; p ∧ q is T if both p and q are T; otherwise it is F.

    For ∨ there are two choices, corresponding to an inclusive or exclusive reading of ‘or’ in English. The choice is arbitrary, but it’s customary to use the inclusive version, as in ‘p or q or both’:

    p   q   p ∨ q
    T   T   T
    T   F   T
    F   T   T
    F   F   F

    The hardest choice, however, is the table for → . For that we use:

    The second row is the essence of ‘if…then…’, namely, a true conditional does not have a true antecedent and a false consequent. The first row is also fundamental, so long as we ignore all aspects of atomic propositions except their truth-values.

    The question is what truth-values to assign in the last two rows, where the antecedent p is F. If we were to take the third row as T and the fourth as F, then p → q would be the same as q; were the third row F and the last T, then p → q would be true exactly when p and q have the same truth-value. Rejecting these counterintuitive choices, we are left with the table that is most generous in assigning T to conditional propositions: p → q is true so long as it’s not the case that p is T and q is F.

    An example from mathematics will illustrate why we want to classify conditionals with false antecedent as true. I think you’ll agree the following is true of the counting numbers:

    If x and y are odd, then the sum of x and y is even.

    If we were to take either of the last two rows of the table for ‘→ ’ to be F, the formalization of this proposition would be false, for we could provide false instances of it: 4 + 8 = 12, which is even, and 4 + 7 = 11, which is not even. The formalization of ‘if…then…’ we have chosen allows us to deal with cases where the antecedent does not apply by treating them as vacuously true.
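
    The point can be checked by brute force. In the Python sketch below, which is not from the text, the classical table for → is applied to every pair of counting numbers up to 50; the formalized proposition comes out true in every instance, including those where the antecedent is false.

def implies(p, q):
    """The classical material conditional: false only when p is true and q is false."""
    return (not p) or q

print(all(implies(x % 2 == 1 and y % 2 == 1, (x + y) % 2 == 0)
          for x in range(1, 51) for y in range(1, 51)))        # True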

    I must assume you understand the ordinary English connectives.

    The tables tell us directly how to evaluate a compound whose immediate parts are atomic, as in p ∧ q. But what about:

    (2)  (Ralph is a dog ∧ dogs bark) → Ralph barks

    Here the antecedent is not a formless entity that is simply true or false: It contains a formal connective that must be accounted for. How shall we analyze this conditional compared to the following?

    (3) (Ralph is a dog → dogs bark) → Ralph barks

    The antecedents of (2) and (3), which have the forms p ∧ q and p → q, are to be evaluated by different methods. But if both are evaluated as true or both as false, is there anything that can distinguish them semantically? We have agreed that the only semantic value of a proposition that we shall consider is its truth-value. So if two propositions have the same truth-value, they are semantically indistinguishable and should play the same role in any further semantic analysis.

    The division of form and content If two propositions have the same semantic properties, then they are indistinguishable in any semantic analysis, regardless of their form.

    So if ‘Ralph is a dog ∧ dogs bark’ and ‘Ralph is a dog → dogs bark’ have the same truth-value, then the two conditionals (2) and (3) will have the same truth-value.

    Hence we may take the tables above to apply to all propositions, not only to atomic ones.

    The classical truth-tables (classical evaluations of the connectives)
    For any propositions A and B:
    ¬A is T if A is F, and F if A is T.
    A ∧ B is T if both A and B are T; otherwise it is F.
    A ∨ B is T if at least one of A and B is T; otherwise it is F.
    A → B is F if A is T and B is F; otherwise it is T.

    Suppose now that we are given a realization, say (1) above. We agree that the sentences assigned to the propositional variables are propositions. In addition we have a method for assigning truth-values to compound formulas in terms of the truth-values of their parts. Are we then to assume that every well-formed formula of the semi-formal language is a proposition? Consider:

    What of the formula we get by writing a very large number of ¬’s in front of ‘(Ralph is a dog)’?

    It seems a harmless idealization to accept such a formula as a proposition, even though we might not ever use it. And to try to establish further criteria for what formulas of the semi-formal language are to be taken as propositions more than to say that they are well-formed (or perhaps shorter in length than some specific limit) involves us in a further semantic analysis not already taken into account by the assumptions we have adopted. So we simplify our work by adopting the following idealization.

    Form and meaningfulness What is grammatical and meaningful is determined solely by form and what primitive parts of speech are taken as meaningful. In particular, given a semi-formal propositional language, every well-formed formula will be taken to be a proposition.

    Thus, the semi-formal language of a realization is the collection of propositions formed from the realizations of the propositional variables on the pattern of the formulas of the formal language.

    2. Models

    Suppose we have a realization, say (1) above. We agree that the sentences assigned to the propositional variables, for example, ‘Ralph is a dog’ for p0, are propositions, that is, each has a truth-value. But it is a further agreement to say which truth-value it has. When we do that we call all the assignments of specific truth-values together a propositional valuation, writing v(p) = T or v(p) = F according to whether the atomic proposition p is taken to be true or false.

    We then extend the valuation to all the propositions of the semi-formal language by means of the truth-tables, writing v(A) = T or v(A) = F if A is assigned T or is assigned F by this method.

    Models A model is a realization together with a valuation v of its atomic propositions, extended to all propositions of the semi-formal language by the classical truth-tables. If v(A) = T, A is true in the model; if v(A) = F, A is false in the model.

    A formal wff is true or false in the model according to whether its realization is true or false.
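
    In programming terms, a model supplies a valuation v of the atomic propositions, and the classical truth-tables extend it recursively to every compound wff (compare Exercise 5 below). A minimal Python sketch, not from the text, with an arbitrarily chosen valuation:

v = {0: True, 5: False, 9: True, 317: False}       # a valuation of some atomic wffs

def value(wff):
    """The truth-value of a wff in the model determined by v and the classical tables."""
    kind = wff[0]
    if kind == 'p':
        return v[wff[1]]
    if kind == 'not':
        return not value(wff[1])
    a, b = value(wff[1]), value(wff[2])
    return {'and': a and b, 'or': a or b, 'imp': (not a) or b}[kind]

print(value(('imp', ('p', 9), ('p', 317))))        # False: true antecedent, false consequent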

    Exercises for Sections E.1 and E.2 _______

    1.  Why do we take A → B to be true if A is false?

    2.  What is a truth-functional connective?

    , if that is possible.

    a. Ralph is barking → cats are nasty

    (Juney is barking)

    cats are nasty → Ralph is barking

    (p6)]

    (p3)]

    (cats are nasty))

    (Ralph is dog)) → dogs bark

    4.  Distinguish between a realization and a model.

    5.  Show by induction on the length of wffs that in a model the valuation v plus the truth-tables determine uniquely the truth-value of every compound proposition.

    3. Validity and semantic consequence

    Bill is walking quickly → Ralph is a dog’ is true. Turning to formal wffs for generality and precision, we can make the following definition.

    Tautologies A formal wff A is a tautology or is valid means that in every model its realization is evaluated as true; in that case we write ⊨ A.

    A semi-formal proposition is a tautology means that it is the realization of some wff that is a tautology.

    For example, in every model ‘Ralph is a dog ∨ ¬(Ralph is a dog)’ is true, so it is a tautology.

    By contrast, ‘Ralph is a dog ∧ ¬(Ralph is a dog)’ will be evaluated as false no matter what value is assigned to ‘Ralph is a dog’. A proposition or formal wff that is evaluated as false regardless of the truth-values of its atomic constituents is called a contradiction. A is a contradiction if and only if ¬A is a tautology.

    To say that a scheme, such as A → (B → A), is a tautology or valid means that every result of replacing the metavariables by wffs is a valid wff. The replacement must be uniform; for example, that A → (B → A) is a valid scheme requires that when we replace one A by a wff we use that same wff to replace the second A.

    Extending these notions to collections of propositions or wffs, we make the following definition for formal wffs.

    Semantic consequence A wff B is a semantic consequence of a collection of wffs means that in every model in which every wff of the collection is true, B is true too.

    Two wffs are semantically equivalent if each is a semantic consequence of the other: In every model they have the same truth-value.

    The notion of semantic consequence formalizes the idea that one proposition follows from another or from a collection of others, relative to our understanding of the connectives. To assert that A follows from some other propositions is an inference (or deduction or implication); those propositions are the premises and A is the conclusion, and it is this that is our formalization of ‘therefore’, relative to our semantic assumptions. A valid inference is a semantic consequence. We often write an inference with a line separating the premises from the conclusion, as in:

    This notation does not assume that the inference is actually valid.

    The notation ‘⊨ A’ when A is a tautology is still appropriate, for if A is a tautology then it is a semantic consequence of the empty collection of wffs.

    The same terminology can be applied to semi-formal propositions viewed as realizations of formal wffs, and then in turn to ordinary English propositions, as described for validity. For example, consider:

    Either Ralph is a dog or Howie is a duck.

    No way is Howie a duck.

    So Ralph is a dog.

    Arguing that ‘either…or…’ should be formalized as classical disjunction and ‘no way’ as classical negation, this is formalized as:

    Ralph is a dog ∨ Howie is a duck
    ¬(Howie is a duck)
    --------------------------------
    Ralph is a dog

    p7.

    For example, ‘¬(Ralph is a dog ∧ ¬(George is a duck))’ is semantically equivalent to ‘Ralph is a dog → George is a duck’. Semantic equivalence is a formalization of the idea that two propositions mean the same for all our logical purposes.
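
    Because a wff involves only finitely many propositional variables, tautology and semantic consequence in propositional logic can be checked by running through every valuation of those variables. Here is a minimal Python sketch, not from the text; it confirms the scheme A → (B → A) and an inference of the form just discussed.

from itertools import product

def variables(wff):
    """The set of indices of propositional variables appearing in a wff."""
    return {wff[1]} if wff[0] == 'p' else set().union(*(variables(w) for w in wff[1:]))

def value(wff, v):
    """Truth-value of a wff under a valuation v, by the classical truth-tables."""
    kind = wff[0]
    if kind == 'p':
        return v[wff[1]]
    if kind == 'not':
        return not value(wff[1], v)
    a, b = value(wff[1], v), value(wff[2], v)
    return {'and': a and b, 'or': a or b, 'imp': (not a) or b}[kind]

def valuations(wffs):
    """Every assignment of truth-values to the variables appearing in the wffs."""
    vs = sorted(set().union(*(variables(w) for w in wffs)))
    for bits in product([True, False], repeat=len(vs)):
        yield dict(zip(vs, bits))

def is_tautology(a):
    return all(value(a, v) for v in valuations([a]))

def is_consequence(premises, conclusion):
    """Semantic consequence: the conclusion is true in every model of the premises."""
    return all(value(conclusion, v)
               for v in valuations(premises + [conclusion])
               if all(value(p, v) for p in premises))

p0, p1 = ('p', 0), ('p', 1)
print(is_tautology(('imp', p0, ('imp', p1, p0))))              # True: A → (B → A)
print(is_consequence([('or', p0, p1), ('not', p1)], p0))       # True: disjunction, negated disjunct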


    Then we made the assumption that the only aspects of a proposition that are of concern to logic are its truth-value and its form as built up from sentence connectives. Making assumptions about the relationship of the semantic properties of a proposition to the properties of its constituents, we analyzed the truth of a compound proposition in terms of the truth-values of its constituents.
