Propositional calculus
From Wikipedia, the free encyclopedia
The propositional calculus[a] is a branch of logic.[1] It is also called propositional logic,[2] statement logic,[1] sentential calculus,[3] sentential logic,[1] or sometimes zeroth-order logic.[4][5] It deals with propositions[1] (which can be true or false)[6] and relations between propositions,[7] including the construction of arguments based on them.[8] Compound propositions are formed by connecting propositions by logical connectives representing the truth functions of conjunction, disjunction, implication, equivalence, and negation.[9][10][11][12] Some sources include other connectives, as in the table below.
Unlike first-order logic, propositional logic does not deal with non-logical objects, predicates about them, or quantifiers. However, all the machinery of propositional logic is included in first-order logic and higher-order logics. In this sense, propositional logic is the foundation of first-order logic and higher-order logic.
Propositional logic is typically studied with a formal language, in which propositions are represented by letters, which are called propositional variables. These are then used, together with symbols for connectives, to make compound propositions. Because of this, the propositional variables are called atomic formulas of a formal zeroth-order language.[10][2] While the atomic propositions are typically represented by letters of the alphabet,[10] there is a variety of notations to represent the logical connectives. The following table shows the main notational variants for each of the connectives in propositional logic.
Connective | Symbol |
---|---|
AND | ∧, ·, &, && |
equivalent | ≡, ⇔, ↔ |
implies | ⇒, ⊃, → |
NAND | ↑, ⊼ |
nonequivalent | ≢, ⇎, ↮ |
NOR | ↓, ⊽ |
NOT | ¬, −, ~, ! |
OR | ∨, +, ∥ |
XNOR | XNOR |
XOR | ⊕, ⊻ |
History
Although propositional logic (which is interchangeable with propositional calculus) had been hinted at by earlier philosophers, it was developed into a formal logic (Stoic logic) by Chrysippus in the 3rd century BC[15] and expanded by his successor Stoics. The logic was focused on propositions. This was different from the traditional syllogistic logic, which focused on terms. However, most of the original writings were lost[16] and, at some time between the 3rd and 6th century CE, Stoic logic faded into oblivion, to be resurrected only in the 20th century, in the wake of the (re)-discovery of propositional logic.[17]
Symbolic logic, which would come to be important to refine propositional logic, was first developed by the 17th/18th-century mathematician Gottfried Leibniz, whose calculus ratiocinator was, however, unknown to the larger logical community. Consequently, many of the advances achieved by Leibniz were recreated by logicians like George Boole and Augustus De Morgan, completely independent of Leibniz.[18]
Gottlob Frege's predicate logic builds upon propositional logic, and has been described as combining "the distinctive features of syllogistic logic and propositional logic."[19] Consequently, predicate logic ushered in a new era in logic's history; however, advances in propositional logic were still made after Frege, including natural deduction, truth trees and truth tables. Natural deduction was invented by Gerhard Gentzen and Stanisław Jaśkowski. Truth trees were invented by Evert Willem Beth.[20] The invention of truth tables, however, is of uncertain attribution.
Works by Frege[21] and Bertrand Russell[22] contain ideas influential to the invention of truth tables. The actual tabular structure (being formatted as a table), itself, is generally credited to either Ludwig Wittgenstein or Emil Post (or both, independently).[21] Besides Frege and Russell, others credited with having ideas preceding truth tables include Philo, Boole, Charles Sanders Peirce,[23] and Ernst Schröder. Others credited with the tabular structure include Jan Łukasiewicz, Alfred North Whitehead, William Stanley Jevons, John Venn, and Clarence Irving Lewis.[22] Ultimately, some, such as John Shosky, have concluded that "It is far from clear that any one person should be given the title of 'inventor' of truth-tables."[22]
Characteristics
Propositional logic, as currently studied in universities, is a specification of a standard of logical consequence in which only the meanings of propositional connectives are considered in evaluating the conditions for the truth of a sentence, or whether a sentence logically follows from some other sentence or group of sentences.[2]
Declarative sentences
Propositional logic deals with statements, which are defined as declarative sentences having truth value.[24][1] Examples of statements might include:
- Wikipedia is a free online encyclopedia that anyone can edit.
- London is the capital of England.
- All Wikipedia editors speak at least three languages.
Declarative sentences are contrasted with questions, such as "What is Wikipedia?", and imperative statements, such as "Please add citations to support the claims in this article."[25][26] Such non-declarative sentences have no truth value,[27] and are dealt with only in nonclassical logics, called erotetic and imperative logics.
Compounding sentences with connectives
In propositional logic, a statement can contain one or more other statements as parts.[1] Compound sentences are formed from simpler sentences and express relationships among the constituent sentences.[28] This is done by combining them with logical connectives:[28][29] the main types of compound sentences are negations, conjunctions, disjunctions, implications, and biconditionals,[28] which are formed by using the corresponding connectives to connect propositions.[30][31] In English, these connectives are expressed by the words "and" (conjunction), "or" (disjunction), "not" (negation) and "if" (material conditional), and "if and only if" (biconditional).[1][9] Examples of such compound sentences might include:
- Wikipedia is a free online encyclopedia that anyone can edit, and millions already have. (conjunction)
- It is not true that all Wikipedia editors speak at least three languages. (negation)
- Either London is the capital of England, or London is the capital of the United Kingdom, or both. (disjunction)[b]
If sentences lack any logical connectives, they are called simple sentences,[1] or atomic sentences;[29] if they contain one or more logical connectives, they are called compound sentences,[28] or molecular sentences.[29]
Sentential connectives are a broader category that includes logical connectives.[2][29] Sentential connectives are any linguistic particles that bind sentences to create a new compound sentence,[2][29] or that inflect a single sentence to create a new sentence.[2] A logical connective, or propositional connective, is a kind of sentential connective with the characteristic feature that, when the original sentences it operates on are (or express) propositions, the new sentence that results from its application also is (or expresses) a proposition.[2] Philosophers disagree about what exactly a proposition is,[6][2] as well as about which sentential connectives in natural languages should be counted as logical connectives.[29][2]
Inference
The following is an example of an inference within the scope of propositional logic:
- Premise 1: If it's raining, then it's cloudy.
- Premise 2: It's raining.
- Conclusion: It's cloudy.
Both premises and the conclusion are propositions. The premises are taken for granted, and with the application of modus ponens (an inference rule),[32] the conclusion follows.
Propositional variables
As propositional logic is not concerned with the structure of propositions beyond the point where they cannot be decomposed any more by logical connectives,[1] it is typically studied by replacing such atomic (indivisible) statements with letters of the alphabet, which are interpreted as variables representing statements (propositional variables).[1] With propositional variables, the argument above would then be symbolized as follows:
- Premise 1: P → Q
- Premise 2: P
- Conclusion: Q
When P is interpreted as "It's raining" and Q as "it's cloudy", these symbolic expressions correspond exactly with the original expression in natural language. Not only that, but they will also correspond with any other inference of the same logical form.
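Because the inference depends only on logical form, it can be checked mechanically. The short sketch below (an illustration, not part of the article's formal apparatus) enumerates every assignment of truth values to P and Q and confirms that whenever both premises are true, the conclusion is true:

```python
from itertools import product

def implies(a, b):
    """Material conditional: a -> b is false only when a is true and b is false."""
    return (not a) or b

# Modus ponens: whenever both premises (P -> Q and P) hold, Q must hold.
for p, q in product([True, False], repeat=2):
    premises_true = implies(p, q) and p
    if premises_true:
        assert q  # the conclusion holds in every such case

print("modus ponens is valid")
```

The loop never hits a failing assertion, because the only assignment making both premises true is the one in which Q is also true.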
Gentzen notation
If we assume that the validity of modus ponens has been accepted as an axiom, then the same argument can also be depicted like this:

  P → Q, P
  ────────
      Q
This method of displaying it is Gentzen's notation for natural deduction and sequent calculus.[33] The premises are shown above a line, called the inference line,[11] separated by a comma, which indicates combination of premises.[34] The conclusion is written below the inference line.[11] The inference line represents syntactic consequence,[11] sometimes called deductive consequence,[35] which is also symbolized with ⊢.[36][35] So the above can also be written in one line as P → Q, P ⊢ Q.[c]
Syntactic consequence is contrasted with semantic consequence,[37] which is symbolized with ⊧.[36][35] In this case, the conclusion follows syntactically because the natural deduction inference rule of modus ponens has been assumed. For more on inference rules, see the sections on proof systems below.
Formalization
Propositional logic may be studied through a formal system in which formulas of a formal language may be interpreted to represent propositions. A proof system allows certain formulas to be derived. These derived formulas are called theorems and may be interpreted to be true propositions. A constructed sequence of such formulas is known as a derivation or proof and the last formula of the sequence is the theorem. The derivation may be interpreted as proof of the proposition represented by the theorem.
When a formal system is used to represent formal logic, only statement letters (usually capital roman letters such as P, Q and R) are represented directly. The natural language propositions that arise when they are interpreted are outside the scope of the system, and the relation between the formal system and its interpretation is likewise outside the formal system itself.
The most thoroughly researched branch of propositional logic is classical truth-functional propositional logic,[1] in which formulas are interpreted as having precisely one of two possible truth values, the truth value of true or the truth value of false.[38] The principle of bivalence and the law of excluded middle are upheld. By comparison with first-order logic, truth-functional propositional logic is considered to be zeroth-order logic.[4][5]
Language
The language (commonly called ℒ)[39][40][29] of a propositional calculus is defined in terms of:[2][10]
- a set of primitive symbols, called atomic formulas, atomic sentences,[29] atoms,[41] placeholders, prime formulas,[41] proposition letters, or variables, and
- a set of operator symbols, called connectives,[14][1] logical connectives,[1] logical operators,[1] truth-functional connectives,[1] or propositional connectives.[2]
A well-formed formula is any atomic formula, or any formula that can be built up from atomic formulas by means of operator symbols according to the rules of the grammar. The language ℒ, then, is defined either as being identical to its set of well-formed formulas,[40] or as containing that set (together with, for instance, its set of connectives and variables).[10][29]
Usually the grammar of ℒ is defined recursively by just a few definitions, as seen next; some authors explicitly include parentheses as punctuation marks when defining their language's grammar,[29][42] while others use them without comment.[2][10]
Composition of formulas
More specifically, given a set of atomic propositional variables p₁, p₂, p₃, …, and a set of propositional connectives c¹₁, c¹₂, c¹₃, …, c²₁, c²₂, c²₃, …, c³₁, c³₂, c³₃, … (where the superscript indicates a connective's arity), a formula of propositional logic is defined recursively by these definitions:[2][10][43]
- Definition 1: Atomic propositional variables are formulas.
- Definition 2: If c is an m-ary propositional connective, and A, B, C, … is a sequence of m, possibly but not necessarily atomic, possibly but not necessarily distinct, formulas, then the result of applying c to A, B, C, … is a formula.
- Definition 3: Nothing else is a formula.
Writing the result of applying c to A, B, C, … in functional notation, as c(A, B, C, …), well-formed formulas of arbitrary complexity can be built up from the atoms accordingly.
It is this sort of recursive definition that justifies the use of "atomic" for propositional variables, since all formulas in the language are built up from the atoms as ultimate building blocks.[2] Composite formulas (all formulas besides atoms) are called molecules,[41] or molecular sentences.[29] (This is an imperfect analogy with chemistry, since a chemical molecule may sometimes have only one atom, as in monatomic gases.)[41]
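The recursive character of Definitions 1–3 can be mirrored directly in code. The sketch below (class and function names are illustrative, not from the article) represents atoms and molecules as a recursive datatype and recovers the set of atoms a formula is ultimately built from:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Atom:
    """Definition 1: an atomic propositional variable is a formula."""
    name: str

@dataclass(frozen=True)
class Molecule:
    """Definition 2: a connective applied to a sequence of formulas is a formula."""
    connective: str
    parts: Tuple[object, ...]

def atoms(formula):
    """Collect the atoms a formula is ultimately built from."""
    if isinstance(formula, Atom):
        return {formula.name}
    result = set()
    for part in formula.parts:
        result |= atoms(part)
    return result

# (p AND q) -> p, written in functional notation as ->(AND(p, q), p)
p, q = Atom("p"), Atom("q")
example = Molecule("->", (Molecule("AND", (p, q)), p))
print(sorted(atoms(example)))  # ['p', 'q']
```

Because nothing else counts as a formula (Definition 3), the two dataclasses exhaust the datatype, and any function over formulas can recurse on just these two cases.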
Constants and schemata
Mathematicians sometimes distinguish between propositional constants, propositional variables, and schemata. Propositional constants represent some particular proposition,[44] while propositional variables range over the set of all atomic propositions.[44] Schemata, or schematic letters, however, range over all formulas.[45][1] It is common to represent propositional constants by A, B, and C, propositional variables by P, Q, and R, and schemata by Greek letters, most often φ, ψ, and χ.[45][1]
However, some authors recognize only two "propositional constants" in their formal system: the special symbol ⊤, called "truth", which always evaluates to True, and the special symbol ⊥, called "falsity", which always evaluates to False.[46][47][48]
Semantics
To serve as a model of the logic of a given natural language, a formal language must be semantically interpreted.[29] In classical logic, all propositions evaluate to exactly one of two truth-values: True or False.[1][49] For example, "Wikipedia is a free online encyclopedia that anyone can edit" evaluates to True,[50] while "Wikipedia is a paper encyclopedia" evaluates to False.[51]
In other respects, the following formal semantics can apply to the language of any propositional logic, but the assumptions that there are only two semantic values (bivalence), that only one of the two is assigned to each formula in the language (noncontradiction), and that every formula gets assigned a value (excluded middle), are distinctive features of classical logic.[49][52][53] To learn about nonclassical logics with more than two truth-values, and their unique semantics, one may consult the articles on "Many-valued logic", "Three-valued logic", "Finite-valued logic", and "Infinite-valued logic".
Interpretation of a language
For a given language ℒ, an interpretation,[54] or case,[29] is an assignment of semantic values to each formula of ℒ.[29] For a formal language of classical logic, a case is defined as an assignment, to each formula of ℒ, of one or the other, but not both, of the truth values, namely truth (T, or 1) and falsity (F, or 0).[55][56] An interpretation of a formal language for classical logic is often expressed in terms of truth tables.[57][1] Since each formula is only assigned a single truth-value, an interpretation may be viewed as a function, whose domain is ℒ, and whose range is its set of semantic values {T, F},[2] or {1, 0}.[29]
For n distinct propositional symbols there are 2ⁿ distinct possible interpretations. For any particular symbol a, for example, there are 2¹ = 2 possible interpretations: either a is assigned T, or a is assigned F. And for the pair a, b there are 2² = 4 possible interpretations: either both are assigned T, or both are assigned F, or a is assigned T and b is assigned F, or a is assigned F and b is assigned T.[57] Since ℒ has ℵ₀, that is, denumerably many propositional symbols, there are 2^ℵ₀, and therefore uncountably many, distinct possible interpretations of ℒ as a whole.[57]
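For a finite stock of propositional symbols, this enumeration of the 2ⁿ interpretations is easy to make concrete; a minimal sketch (the function name is illustrative):

```python
from itertools import product

def interpretations(symbols):
    """Yield every assignment of T/F to the given propositional symbols."""
    for values in product([True, False], repeat=len(symbols)):
        yield dict(zip(symbols, values))

# Two symbols give 2**2 = 4 interpretations, matching the count in the text.
cases = list(interpretations(["a", "b"]))
print(len(cases))  # 4
```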
Propositional connective semantics
An interpretation assigns semantic values to atomic formulas directly.[54][29] Molecular formulas are assigned a value as a function of the values of their constituent atoms, according to the connective used;[54][29] this is done by defining the connectives, as follows.[54][29]
Logical connectives are defined semantically in terms of the truth values that they take when the propositional variables that they're applied to take either of the two possible truth values.[1][29] This is usually represented as a truth table for each of the connectives,[1][29] as seen below:
p | q | p ∧ q | p ∨ q | p → q | p ⇔ q | ¬p | ¬q |
---|---|---|---|---|---|---|---|
T | T | T | T | T | T | F | F |
T | F | F | T | F | F | F | T |
F | T | F | T | T | F | T | F |
F | F | F | F | T | T | T | T |
This table covers each of the five main logical connectives:[9][10][11][12] conjunction (here notated p ∧ q), disjunction (p ∨ q), implication (p → q), equivalence (p ⇔ q) and negation (¬p, or ¬q, as the case may be). It is sufficient for determining the semantics of each of these operators.[1][58][29] For more detail on each of the five, see the articles on each specific one, as well as the articles "Logical connective" and "Truth function". For truth tables of further kinds of connectives, see the article "Truth table".
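The table can be regenerated programmatically. In this sketch (the dictionary and its labels are illustrative), each connective is modeled as an ordinary two-valued truth function, and the four rows are produced by iterating over all assignments to p and q:

```python
from itertools import product

CONNECTIVES = {
    "p AND q":  lambda p, q: p and q,
    "p OR q":   lambda p, q: p or q,
    "p -> q":   lambda p, q: (not p) or q,   # material conditional
    "p <-> q":  lambda p, q: p == q,         # biconditional (equivalence)
    "NOT p":    lambda p, q: not p,
}

# Reproduce the truth table row by row: T T, T F, F T, F F.
for p, q in product([True, False], repeat=2):
    row = {name: fn(p, q) for name, fn in CONNECTIVES.items()}
    print(p, q, row)
```

Note that for booleans, `p == q` computes exactly the biconditional: true when both operands agree, false otherwise.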
Some of these connectives may be defined in terms of others: for instance, implication, p → q, may be defined in terms of disjunction and negation, as ¬p ∨ q;[59] and disjunction may be defined in terms of negation and conjunction, as ¬(¬p ∧ ¬q).[42] In fact, a truth-functionally complete system, in the sense that all and only the classical propositional tautologies are theorems, may be derived using only disjunction and negation (as Russell, Whitehead, and Hilbert did),[2] or using only implication and negation (as Frege did),[2] or using only conjunction and negation,[2] or even using only a single connective for "not and" (the Sheffer stroke),[3][2] as Jean Nicod did.[2] A joint denial connective (logical NOR) also suffices to define all other connectives,[42] but no other connectives do.[42]
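The functional completeness of the Sheffer stroke is easy to confirm by exhaustive checking. The sketch below defines negation, conjunction, disjunction, and implication from "not and" alone (using the standard constructions) and compares each against its usual truth table:

```python
from itertools import product

def nand(p, q):
    """The Sheffer stroke: 'not and'."""
    return not (p and q)

# Standard definitions of the other connectives from NAND alone.
def not_(p):        return nand(p, p)
def and_(p, q):     return nand(nand(p, q), nand(p, q))
def or_(p, q):      return nand(nand(p, p), nand(q, q))
def implies(p, q):  return nand(p, nand(q, q))

# Check agreement with the usual truth tables on every assignment.
for p, q in product([True, False], repeat=2):
    assert not_(p) == (not p)
    assert and_(p, q) == (p and q)
    assert or_(p, q) == (p or q)
    assert implies(p, q) == ((not p) or q)

print("NAND alone defines the other connectives")
```

A symmetric set of definitions works for joint denial (NOR), but, as the text notes, no other single binary connective suffices.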
Semantic truth, validity, consequence
Given φ and ψ as formulas of a language ℒ, and ℐ as an interpretation of ℒ, then the following definitions apply:[57][56]
- Truth-in-a-case:[29] A sentence φ of ℒ is true under an interpretation ℐ if ℐ assigns the truth value T to φ.[56][57] If φ is true under ℐ, then ℐ is called a model of φ.[57]
- Falsity-in-a-case:[29] φ is false under an interpretation ℐ if, and only if, ¬φ is true under ℐ.[57][60][29] This is the "truth of negation" definition of falsity-in-a-case.[29] Falsity-in-a-case may also be defined by the "complement" definition: φ is false under an interpretation ℐ if, and only if, φ is not true under ℐ.[56][57] In classical logic, these definitions are equivalent, but in nonclassical logics, they are not.[29]
- Semantic consequence: A sentence ψ of ℒ is a semantic consequence (φ ⊨ ψ) of a sentence φ if there is no interpretation under which φ is true and ψ is false.[56][57][29]
- Validity (or tautology): A sentence φ of ℒ is logically valid (⊨ φ),[d] or a tautology,[61][62][42] if it is true under every interpretation,[56][57] or true in every case.[29]
- Consistency: A sentence φ of ℒ is consistent if it is true under at least one interpretation. It is inconsistent (or self-contradictory)[1] if it is not consistent.[56][57]
For classical logic, the following theorems apply:
- For any given interpretation, a given formula is either true or false.[57][60]
- No formula is both true and false under the same interpretation.[57][60]
- ¬φ is true under ℐ if, and only if, φ is false under ℐ;[57][60] ¬φ is true under ℐ if, and only if, φ is not true under ℐ.[57]
- If φ and (φ → ψ) are both true under ℐ, then ψ is true under ℐ.[57][60]
- If ⊨ φ and ⊨ (φ → ψ), then ⊨ ψ.[57]
- (φ → ψ) is true under ℐ if, and only if, either φ is not true under ℐ, or ψ is true under ℐ.[57]
- φ ⊨ ψ if, and only if, (φ → ψ) is logically valid, that is, φ ⊨ ψ if, and only if, ⊨ (φ → ψ).[57][60]
Proof systems
Proof systems in propositional logic can be broadly classified into semantic proof systems and syntactic proof systems,[63][64][65] according to the kind of logical consequence that they rely on: semantic proof systems rely on semantic consequence (⊨),[66] whereas syntactic proof systems rely on syntactic consequence (⊢).[67] Semantic consequence deals with the truth values of propositions in all possible interpretations, whereas syntactic consequence concerns the derivation of conclusions from premises based on rules and axioms within a formal system.[68]
Semantic proof systems
Semantic proof systems rely on the concept of semantic consequence, symbolized as Γ ⊨ φ, which indicates that if every formula in Γ is true, then φ must also be true, in every possible interpretation.[68]
Truth tables
A truth table is a semantic proof method used to determine the truth value of a propositional logic expression in every possible scenario.[69] By exhaustively listing the truth values of its constituent atoms, a truth table can show whether a proposition is true, false, tautological, or contradictory.[70]
Semantic tableaux
A semantic tableau is another semantic proof technique that systematically explores the truth of a proposition.[71] It constructs a tree where each branch represents a possible interpretation of the propositions involved.[72] If every branch leads to a contradiction, the original proposition is considered to be a contradiction, and its negation is considered a tautology.[73]
Syntactic proof systems
Syntactic proof systems, in contrast, focus on the formal manipulation of symbols according to specific rules. The notion of syntactic consequence, Γ ⊢ φ, signifies that φ can be derived from Γ using the rules of the formal system.[68]
Axiomatic systems
An axiomatic system is a set of axioms or assumptions from which other statements (theorems) are logically derived.[74] In propositional logic, axiomatic systems define a base set of propositions considered to be self-evidently true, and theorems are proved by applying deduction rules to these axioms.[75]
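As a concrete illustration, the classic Hilbert-style derivation of p → p can be replayed symbolically. The two axiom schemas used here, often labeled A1 and A2 in presentations of the implicational fragment, are assumptions of this sketch rather than a system given in this article:

```python
# Formulas as nested tuples: ("->", A, B); atoms as strings.
def imp(a, b):
    return ("->", a, b)

def ax1(a, b):
    # Axiom schema A1: a -> (b -> a)
    return imp(a, imp(b, a))

def ax2(a, b, c):
    # Axiom schema A2: (a -> (b -> c)) -> ((a -> b) -> (a -> c))
    return imp(imp(a, imp(b, c)), imp(imp(a, b), imp(a, c)))

def mp(x, y):
    # Modus ponens: from x and (x -> z), detach the consequent z.
    assert y[0] == "->" and y[1] == x, "modus ponens does not apply"
    return y[2]

# The classic five-step derivation of p -> p:
p = "p"
s1 = ax2(p, imp(p, p), p)   # A2 instance
s2 = ax1(p, imp(p, p))      # A1 instance
s3 = mp(s2, s1)             # (p -> (p -> p)) -> (p -> p)
s4 = ax1(p, p)              # p -> (p -> p)
s5 = mp(s4, s3)             # p -> p
print(s5)  # ('->', 'p', 'p')
```

Every line is either an axiom instance or follows from earlier lines by modus ponens, which is exactly the shape of proof an axiomatic system demands.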
Natural deduction
Natural deduction is a syntactic method of proof that emphasizes the derivation of conclusions from premises through the use of intuitive rules reflecting ordinary reasoning.[76] Each rule reflects a particular logical connective and shows how it can be introduced or eliminated.[76]
Sequent calculus
The sequent calculus is a formal system that represents logical deductions as sequences or "sequents" of formulas.[77] Developed by Gerhard Gentzen, this approach focuses on the structural properties of logical deductions and provides a powerful framework for proving statements within propositional logic.[77][78]
Semantic proof via truth tables
Taking advantage of the semantic concept of validity (truth in every interpretation), it is possible to prove a formula's validity by using a truth table, which gives every possible interpretation (assignment of truth values to variables) of that formula.[70][41][45] If, and only if, all the lines of a truth table come out true, the formula is semantically valid (true in every interpretation).[70][41] Further, if, and only if, ¬φ is valid, then φ is inconsistent.[79][80][81]
For instance, this table shows that "p → (q ∨ r → (r → ¬p))" is not valid:[41]
p | q | r | q ∨ r | r → ¬p | q ∨ r → (r → ¬p) | p → (q ∨ r → (r → ¬p)) |
---|---|---|---|---|---|---|
T | T | T | T | F | F | F |
T | T | F | T | T | T | T |
T | F | T | T | F | F | F |
T | F | F | F | T | T | T |
F | T | T | T | T | T | T |
F | T | F | T | T | T | T |
F | F | T | T | T | T | T |
F | F | F | F | T | T | T |
The computation of the last column of the third line may be displayed as follows:[41]
p | → | (q | ∨ | r | → | (r | → | ¬ | p)) |
---|---|---|---|---|---|---|---|---|---|
T | → | (F | ∨ | T | → | (T | → | ¬ | T)) |
T | → | ( | T | → | (T | → | F | )) | |
T | → | ( | T | → | F | ) | |||
T | → | F | |||||||
F | |||||||||
T | F | F | T | T | F | T | F | F | T |
Further, using the theorem that φ ⊨ ψ if, and only if, (φ → ψ) is valid,[57][60] we can use a truth table to prove that a formula is a semantic consequence of a set of formulas: φ ⊨ ψ if, and only if, we can produce a truth table that comes out all true for the formula (φ → ψ) (that is, if ⊨ (φ → ψ)).[82][83]
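The truth-table method above is easy to mechanize. The checker below (an illustrative sketch) confirms that the example formula p → (q ∨ r → (r → ¬p)) is not valid, failing exactly on the two rows of the table where p and r are both true:

```python
from itertools import product

def implies(a, b):
    """Material conditional."""
    return (not a) or b

def formula(p, q, r):
    # p -> ((q or r) -> (r -> not p))
    return implies(p, implies(q or r, implies(r, not p)))

# Collect the interpretations under which the formula comes out false.
failing = [(p, q, r)
           for p, q, r in product([True, False], repeat=3)
           if not formula(p, q, r)]
print(failing)  # [(True, True, True), (True, False, True)]
```

A formula is valid exactly when this list is empty; here the two failing rows match lines 1 and 3 of the truth table above.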
List of classically valid argument forms
Using semantic checking methods, such as truth tables or semantic tableaux, to check for tautologies and semantic consequences, it can be shown that, in classical logic, the following classical argument forms are semantically valid, i.e., these tautologies and semantic consequences hold.[45] We use φ ⟚ ψ to denote the equivalence of φ and ψ, that is, as an abbreviation for both φ ⊨ ψ and ψ ⊨ φ;[45] as an aid to reading the symbols, a description of each formula is given. The description reads the symbol ⊧ (called the "double turnstile") as "therefore", which is a common reading of it,[45][84] although many authors prefer to read it as "entails",[45][85] or as "models".[86]
Name | Sequent | Description |
---|---|---|
Modus Ponens | ((p → q) ∧ p) ⊨ q[29] | If p then q; p; therefore q |
Modus Tollens | ((p → q) ∧ ¬q) ⊨ ¬p[29] | If p then q; not q; therefore not p |
Hypothetical Syllogism | ((p → q) ∧ (q → r)) ⊨ (p → r) | If p then q; if q then r; therefore, if p then r |
Disjunctive Syllogism | ((p ∨ q) ∧ ¬p) ⊨ q[87] | Either p or q, or both; not p; therefore, q |
Constructive Dilemma | ((p → q) ∧ (r → s) ∧ (p ∨ r)) ⊨ (q ∨ s) | If p then q; and if r then s; but p or r; therefore q or s |
Destructive Dilemma | ((p → q) ∧ (r → s) ∧ (¬q ∨ ¬s)) ⊨ (¬p ∨ ¬r) | If p then q; and if r then s; but not q or not s; therefore not p or not r |
Bidirectional Dilemma | ((p → q) ∧ (r → s) ∧ (p ∨ ¬s)) ⊨ (q ∨ ¬r) | If p then q; and if r then s; but p or not s; therefore q or not r |
Simplification | (p ∧ q) ⊨ p[29] | p and q are true; therefore p is true |
Conjunction | p, q ⊨ (p ∧ q)[29] | p and q are true separately; therefore they are true conjointly |
Addition | p ⊨ (p ∨ q)[29][87] | p is true; therefore the disjunction (p or q) is true |
Composition | ((p → q) ∧ (p → r)) ⊨ (p → (q ∧ r)) | If p then q; and if p then r; therefore if p is true then q and r are true |
De Morgan's Theorem (1) | ¬(p ∧ q) ⟚ (¬p ∨ ¬q)[29] | The negation of (p and q) is equiv. to (not p or not q) |
De Morgan's Theorem (2) | ¬(p ∨ q) ⟚ (¬p ∧ ¬q)[29] | The negation of (p or q) is equiv. to (not p and not q) |
Commutation (1) | (p ∨ q) ⟚ (q ∨ p)[87] | (p or q) is equiv. to (q or p) |
Commutation (2) | (p ∧ q) ⟚ (q ∧ p)[87] | (p and q) is equiv. to (q and p) |
Commutation (3) | (p ↔ q) ⟚ (q ↔ p)[87] | (p iff q) is equiv. to (q iff p) |
Association (1) | (p ∨ (q ∨ r)) ⟚ ((p ∨ q) ∨ r) | p or (q or r) is equiv. to (p or q) or r |
Association (2) | (p ∧ (q ∧ r)) ⟚ ((p ∧ q) ∧ r) | p and (q and r) is equiv. to (p and q) and r |
Distribution (1) | (p ∧ (q ∨ r)) ⟚ ((p ∧ q) ∨ (p ∧ r))[87] | p and (q or r) is equiv. to (p and q) or (p and r) |
Distribution (2) | (p ∨ (q ∧ r)) ⟚ ((p ∨ q) ∧ (p ∨ r))[87] | p or (q and r) is equiv. to (p or q) and (p or r) |
Double Negation | p ⟚ ¬¬p[29][87] | p is equivalent to the negation of not p |
Transposition | (p → q) ⟚ (¬q → ¬p)[29] | If p then q is equiv. to if not q then not p |
Material Implication | (p → q) ⟚ (¬p ∨ q)[87] | If p then q is equiv. to not p or q |
Material Equivalence (1) | (p ↔ q) ⟚ ((p → q) ∧ (q → p))[87] | (p iff q) is equiv. to (if p is true then q is true) and (if q is true then p is true) |
Material Equivalence (2) | (p ↔ q) ⟚ ((p ∧ q) ∨ (¬p ∧ ¬q))[87] | (p iff q) is equiv. to either (p and q are true) or (both p and q are false) |
Material Equivalence (3) | (p ↔ q) ⟚ ((p ∨ ¬q) ∧ (¬p ∨ q)) | (p iff q) is equiv. to both (p or not q is true) and (not p or q is true) |
Exportation | ((p ∧ q) → r) ⊨ (p → (q → r))[88] | from (if p and q are true then r is true) we can prove (if q is true then r is true, if p is true) |
Importation | (p → (q → r)) ⟚ ((p ∧ q) → r) | If p then (if q then r) is equivalent to if p and q then r |
Tautology (1) | p ⟚ (p ∨ p)[87] | p is true is equiv. to p is true or p is true |
Tautology (2) | p ⟚ (p ∧ p)[87] | p is true is equiv. to p is true and p is true |
Tertium non datur (Law of Excluded Middle) | ⊨ (p ∨ ¬p)[29][87] | p or not p is true |
Law of Non-Contradiction | ⊨ ¬(p ∧ ¬p)[29][87] | (p and not p) is false, is a true statement |
Explosion | (p ∧ ¬p) ⊨ q[29] | p and not p; therefore q |
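Each row of the table can be verified by exhaustive checking over all truth-value assignments. The helper below (an illustrative sketch, not a standard library) confirms a few representative forms:

```python
from itertools import product

def implies(a, b):
    """Material conditional."""
    return (not a) or b

def entails(premises, conclusion, n):
    """True if the conclusion holds in every case where all premises hold."""
    return all(implies(all(f(*vs) for f in premises), conclusion(*vs))
               for vs in product([True, False], repeat=n))

# Modus tollens: (p -> q), not q |= not p
assert entails([lambda p, q: implies(p, q), lambda p, q: not q],
               lambda p, q: not p, 2)

# De Morgan's Theorem (1): not(p and q) is equivalent to (not p or not q)
assert all((not (p and q)) == ((not p) or (not q))
           for p, q in product([True, False], repeat=2))

# Exportation: ((p and q) -> r) |= (p -> (q -> r))
assert entails([lambda p, q, r: implies(p and q, r)],
               lambda p, q, r: implies(p, implies(q, r)), 3)

print("all checked forms are valid")
```

The same `entails` helper, with an empty premise list, checks tautologies such as the law of excluded middle.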
Example syntactic proof systems
In a syntactic proof system for the propositional calculus, an argument is defined as a list of propositions. A valid argument is a list of propositions, the last of which follows from—or is implied by—the rest. All other arguments are invalid. The simplest valid argument is modus ponens, one instance of which is the following list of propositions:

1. P → Q
2. P
3. Q
This is a list of three propositions; each line is a proposition, and the last follows from the rest. The first two lines are called premises, and the last line the conclusion. We say that any proposition C follows from any set of propositions if C must be true whenever every member of the set is true. In the argument above, for any P and Q, whenever P → Q and P are true, necessarily Q is true. Notice that, when P is true, we cannot consider cases 3 and 4 (from the truth table). When P → Q is true, we cannot consider case 2. This leaves only case 1, in which Q is also true. Thus Q is implied by the premises.
This generalizes schematically. Thus, where φ and ψ may be any propositions at all,

  φ → ψ, φ
  ────────
      ψ
Other argument forms are convenient, but not necessary. Given a complete set of axioms (see below for one such set), modus ponens is sufficient to prove all other argument forms in propositional logic, so they may be considered derivative. Note that this is not true of the extension of propositional logic to other logics like first-order logic, which requires at least one additional rule of inference in order to obtain completeness.
The significance of argument in formal logic is that one may obtain new truths from established truths. In the first example above, given the two premises, the truth of Q is not yet known or stated. After the argument is made, Q is deduced. In this way, we define a deduction system to be a set of all propositions that may be deduced from another set of propositions. For instance, given the set of propositions A = {P ∨ Q, ¬Q ∧ R, (P ∨ Q) → R}, we can define a deduction system, Γ, which is the set of all propositions which follow from A. Reiteration is always assumed, so A ⊆ Γ. Also, from the first element of A, the last element, and modus ponens, R is a consequence, and so R ∈ Γ. Because we have not included sufficiently complete axioms, though, nothing else may be deduced. Thus, even though most deduction systems studied in propositional logic are able to deduce (P ∨ Q) ↔ (Q ∨ P), this one is too weak to prove such a proposition.
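Such a weak deduction system, closed only under reiteration and modus ponens, can be simulated by forward chaining. In this sketch the premise set and its encoding as nested tuples are illustrative, chosen so that the first and last members yield R by modus ponens, as in the text:

```python
# Formulas are opaque strings or nested tuples; a conditional is ("->", antecedent, consequent).
A = [("or", "P", "Q"), ("and", ("not", "Q"), "R"), ("->", ("or", "P", "Q"), "R")]

def closure(premises):
    """Close a set of formulas under reiteration and modus ponens only."""
    derived = set(premises)
    changed = True
    while changed:
        changed = False
        for f in list(derived):
            # Modus ponens: a conditional whose antecedent is already derived
            # lets us add its consequent.
            if isinstance(f, tuple) and f[0] == "->" and f[1] in derived:
                if f[2] not in derived:
                    derived.add(f[2])
                    changed = True
    return derived

gamma = closure(A)
print("R" in gamma)  # True: R follows by modus ponens
```

As the text observes, nothing beyond the premises and R is derivable: the system has no rules for decomposing conjunctions or commuting disjunctions, so formulas like Q remain out of reach.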
Formal structure for example systems
One of the main uses of a propositional calculus, when interpreted for logical applications, is to determine relations of logical equivalence between propositional formulas. These relationships are determined by means of the available transformation rules, sequences of which are called derivations or proofs.
The following examples of proof systems for a propositional calculus will assume a calculus defined as a formal system ℒ = ℒ(Α, Ω, Ζ, Ι), where:
- The alpha set Α is a countably infinite set of ℒ's atomic formulas or propositional variables. In the examples to follow, the elements of Α are typically the letters p, q, r, and so on.
- The omega set Ω is a finite set of elements called operator symbols or logical connectives. The set Ω is partitioned into disjoint subsets as Ω = Ω₀ ∪ Ω₁ ∪ … ∪ Ωⱼ ∪ … ∪ Ωₘ, where Ωⱼ is the set of operator symbols of arity j. For instance, a partition of Ω for the typical five connectives would have Ω₁ = {¬} and Ω₂ = {∧, ∨, →, ↔}. Also, the constant logical values are treated as operators of arity zero, so that Ω₀ = {⊤, ⊥}.
- The zeta set Ζ is a finite set of transformation rules, called inference rules when they acquire logical applications.
- The iota set Ι is a countable set of initial points that are called axioms when they receive logical interpretations.
The language of ℒ is its set of well-formed formulas, inductively defined by the following rules:
- Base: Any element of the alpha set Α is a formula of ℒ.
- If p₁, p₂, …, pⱼ are formulas and ω is in Ωⱼ, then (ω p₁ p₂ … pⱼ) is a formula.
- Closed: Nothing else is a formula of ℒ.
Repeated applications of these rules permit the construction of complex formulas. Examples of formulas that follow these rules include "p" (by rule 1), "(¬p)" (by rule 2), "q" (by rule 1), and "((¬p) ∨ q)" (by rule 2).[e]
In the discussion to follow, after a proof system is defined, a proof is presented as a sequence of numbered lines, with each line consisting of a single formula followed by a reason or justification for introducing that formula. Each premise of the argument, that is, an assumption introduced as a hypothesis of the argument, is listed at the beginning of the sequence and is marked as a "premise" in lieu of other justification. The conclusion is listed on the last line. A proof is complete if every line follows from the previous ones by the correct application of a transformation rule. (For a contrasting approach, see proof-trees).
Natural deduction proof system example[edit]
Let 𝓛₁ = 𝓛(Α, Ω, Ζ, Ι), where Α, Ω, Ζ, Ι are defined as follows:
- The alpha set Α is a countably infinite set of symbols, for example: Α = {p, q, r, s, t, u, …}.
- The omega set Ω = Ω₁ ∪ Ω₂ partitions as Ω₁ = {¬} and Ω₂ = {∧, ∨, →, ↔}.
In the following example of a propositional calculus, the transformation rules are intended to be interpreted as the inference rules of a so-called natural deduction system. The particular system presented here has no initial points, which means that its interpretation for logical applications derives its theorems from an empty axiom set.
- The set of initial points is empty, that is, Ι = ∅.
- The set of transformation rules, Ζ, is described as follows:
Our propositional calculus has eleven inference rules. These rules allow us to derive other true formulas, given a set of formulas that are assumed to be true. The first ten simply state that we can infer certain well-formed formulas from other well-formed formulas. The last rule, however, uses hypothetical reasoning, in the sense that in the premise of the rule we temporarily assume an (unproven) hypothesis to be part of the set of inferred formulas, to see if we can infer a certain other formula. Since the first ten rules do not do this, they are usually described as non-hypothetical rules, and the last one as a hypothetical rule.
In describing the transformation rules, we may introduce a metalanguage symbol ⊢. It is basically a convenient shorthand for saying "infer that". The format is Γ ⊢ ψ, in which Γ is a (possibly empty) set of formulas called premises, and ψ is a formula called the conclusion. The transformation rule Γ ⊢ ψ means that if every proposition in Γ is a theorem (or has the same truth value as the axioms), then ψ is also a theorem. Note that, by the rule of conjunction introduction below, whenever Γ has more than one formula we can always safely reduce it to a single formula using conjunction; so, for short, we may represent Γ as one formula instead of a set. Another omission for convenience: when Γ is the empty set, it may simply not appear.
Inference rules[edit]
- Negation introduction: From (p → q) and (p → ¬q), infer ¬p; that is, {(p → q), (p → ¬q)} ⊢ ¬p.
- Negation elimination: From ¬p, infer (p → r); that is, ¬p ⊢ (p → r).
- Double negation elimination: From ¬¬p, infer p; that is, ¬¬p ⊢ p.
- Conjunction introduction: From p and q, infer (p ∧ q); that is, {p, q} ⊢ (p ∧ q).
- Conjunction elimination: From (p ∧ q), infer p, and from (p ∧ q), infer q; that is, (p ∧ q) ⊢ p and (p ∧ q) ⊢ q.
- Disjunction introduction: From p, infer (p ∨ q), and from q, infer (p ∨ q); that is, p ⊢ (p ∨ q) and q ⊢ (p ∨ q).
- Disjunction elimination: From (p ∨ q) and (p → r) and (q → r), infer r; that is, {p ∨ q, p → r, q → r} ⊢ r.
- Biconditional introduction: From (p → q) and (q → p), infer (p ↔ q); that is, {p → q, q → p} ⊢ (p ↔ q).
- Biconditional elimination: From (p ↔ q), infer (p → q), and from (p ↔ q), infer (q → p); that is, (p ↔ q) ⊢ (p → q) and (p ↔ q) ⊢ (q → p).
- Modus ponens (conditional elimination): From p and (p → q), infer q; that is, {p, p → q} ⊢ q.
- Conditional proof (conditional introduction): From [accepting p allows a proof of q], infer (p → q); that is, (p ⊢ q) ⊢ (p → q).
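As an informal illustration, a few of the non-hypothetical rules above can be rendered as functions that take tuple-encoded premises and return the inferred formula. The encoding (("→", p, q) for p → q, ("∧", p, q) for p ∧ q) and the function names are assumptions of this sketch, not part of the calculus:

```python
# Sketch: three inference rules as functions on tuple-encoded formulas.

def conjunction_introduction(p, q):
    # from p and q, infer p ∧ q
    return ("∧", p, q)

def conjunction_elimination(conj):
    # from p ∧ q, infer p (and q)
    op, p, q = conj
    assert op == "∧", "premise must be a conjunction"
    return p, q

def modus_ponens(p, implication):
    # from p and p → q, infer q
    op, antecedent, consequent = implication
    assert op == "→" and antecedent == p, "premises do not match the rule"
    return consequent

# {p, p → q} ⊢ q
print(modus_ponens("p", ("→", "p", "q")))          # 'q'
print(conjunction_introduction("p", "q"))          # ('∧', 'p', 'q')
print(conjunction_elimination(("∧", "p", "q")))    # ('p', 'q')
```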
Example of a proof in natural deduction system[edit]
- To be shown that A → A.
- One possible proof of this (which, though valid, happens to contain more steps than are necessary) may be arranged as follows:
Number | Formula | Reason |
---|---|---|
1 | A | premise |
2 | A ∨ A | From (1) by disjunction introduction |
3 | (A ∨ A) ∧ A | From (1) and (2) by conjunction introduction |
4 | A | From (3) by conjunction elimination |
5 | A ⊢ A | Summary of (1) through (4) |
6 | ⊢ A → A | From (5) by conditional proof |
Interpret A ⊢ A as "Assuming A, infer A". Read ⊢ A → A as "Assuming nothing, infer that A implies A", or "It is a tautology that A implies A", or "It is always true that A implies A".
Jan Łukasiewicz axiomatic proof system example[edit]
Let 𝓛₂ = 𝓛(Α, Ω, Ζ, Ι), where Α, Ω, Ζ, Ι are defined as follows:
- The set Α is the countably infinite set of symbols that serve to represent logical propositions: Α = {p, q, r, s, t, u, …}.
- The functionally complete set of logical operators (logical connectives and negation) is as follows. Of the three connectives for conjunction, disjunction, and implication (∧, ∨, and →), one can be taken as primitive and the other two can be defined in terms of it and negation (¬).[89] Alternatively, all of the logical operators may be defined in terms of a sole sufficient operator, such as the Sheffer stroke (nand). The biconditional (↔) can of course be defined in terms of conjunction and implication, with (p ↔ q) defined as ((p → q) ∧ (q → p)). Adopting negation and implication as the two primitive operations of a propositional calculus is tantamount to having the omega set Ω = Ω₁ ∪ Ω₂ partitioned as Ω₁ = {¬} and Ω₂ = {→}. Then (p ∨ q) is defined as (¬p → q), and (p ∧ q) is defined as ¬(p → ¬q).
- The set Ι (the set of initial points of logical deduction, i.e., logical axioms) is the axiom system proposed by Jan Łukasiewicz, and used as the propositional-calculus part of a Hilbert system. The axioms are all substitution instances of:
  - (p → (q → p))
  - ((p → (q → r)) → ((p → q) → (p → r)))
  - ((¬p → ¬q) → (q → p))
- The set Ζ of transformation rules (rules of inference) is the sole rule modus ponens (i.e., from any formulas of the form (p → q) and p, infer q).
This system is used in the Metamath set.mm formal proof database.
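As a sanity check on the definitions of disjunction and conjunction in terms of negation and implication, one can enumerate all truth assignments. This short Python sketch (the helper `implies` is our own convenience, not a library function) confirms that the defined connectives agree with the usual ones:

```python
from itertools import product

# Sketch: verify by truth table that p ∨ q ≡ (¬p → q)
# and p ∧ q ≡ ¬(p → ¬q), as in the definitions above.

def implies(a, b):
    # material implication: false only when a is true and b is false
    return (not a) or b

for p, q in product([False, True], repeat=2):
    assert (p or q) == implies(not p, q)          # ∨ via ¬ and →
    assert (p and q) == (not implies(p, not q))   # ∧ via ¬ and →
print("both definitions agree on all four truth assignments")
```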
Example of a proof in an axiomatic propositional calculus system[edit]
We now prove the same theorem in the axiomatic system by Jan Łukasiewicz described above, which is an example of a Hilbert-style deductive system for the classical propositional calculus.
The axioms are:
- (A1) (p → (q → p))
- (A2) ((p → (q → r)) → ((p → q) → (p → r)))
- (A3) ((¬p → ¬q) → (q → p))
And the proof is as follows:
- (A → ((A → A) → A)) (instance of (A1))
- ((A → ((A → A) → A)) → ((A → (A → A)) → (A → A))) (instance of (A2))
- ((A → (A → A)) → (A → A)) (from (1) and (2) by modus ponens)
- (A → (A → A)) (instance of (A1))
- (A → A) (from (4) and (3) by modus ponens)
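Because the system is sound (see the next section), every line of this proof must be a tautology, and with a single variable A that is a two-row check. In this Python sketch, each proof line is encoded with a hypothetical helper `implies` and evaluated on both truth values of A:

```python
# Sketch: confirm by brute force that all five lines of the proof
# of A → A are tautologies in the single variable A.

def implies(a, b):
    return (not a) or b

lines = [
    lambda A: implies(A, implies(implies(A, A), A)),              # 1: instance of (A1)
    lambda A: implies(implies(A, implies(implies(A, A), A)),      # 2: instance of (A2)
                      implies(implies(A, implies(A, A)), implies(A, A))),
    lambda A: implies(implies(A, implies(A, A)), implies(A, A)),  # 3: modus ponens on (1), (2)
    lambda A: implies(A, implies(A, A)),                          # 4: instance of (A1)
    lambda A: implies(A, A),                                      # 5: modus ponens on (4), (3)
]

for i, f in enumerate(lines, 1):
    assert f(False) and f(True), f"line {i} is not a tautology"
print("all five proof lines are tautologies")
```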
Soundness and completeness of the rules[edit]
The crucial properties of this set of rules are that they are sound and complete. Informally this means that the rules are correct and that no other rules are required. These claims can be made more formal as follows. The proofs for the soundness and completeness of the propositional logic are not themselves proofs in propositional logic; these are theorems in ZFC used as a metatheory to prove properties of propositional logic.
We define a truth assignment as a function that maps propositional variables to true or false. Informally such a truth assignment can be understood as the description of a possible state of affairs (or possible world) where certain statements are true and others are not. The semantics of formulas can then be formalized by defining for which "state of affairs" they are considered to be true, which is what is done by the following definition.
We define when such a truth assignment A satisfies a certain well-formed formula with the following rules:
- A satisfies the propositional variable P if and only if A(P) = true
- A satisfies ¬φ if and only if A does not satisfy φ
- A satisfies (φ ∧ ψ) if and only if A satisfies both φ and ψ
- A satisfies (φ ∨ ψ) if and only if A satisfies at least one of either φ or ψ
- A satisfies (φ → ψ) if and only if it is not the case that A satisfies φ but not ψ
- A satisfies (φ ↔ ψ) if and only if A satisfies both φ and ψ or satisfies neither one of them
With this definition we can now formalize what it means for a formula φ to be implied by a certain set S of formulas. Informally this is true if in all worlds that are possible given the set of formulas S the formula φ also holds. This leads to the following formal definition: We say that a set S of well-formed formulas semantically entails (or implies) a certain well-formed formula φ if all truth assignments that satisfy all the formulas in S also satisfy φ.
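The satisfaction clauses and this definition of semantic entailment translate directly into a brute-force checker. In the following Python sketch, formulas are encoded as nested tuples and the relevant variables must be listed explicitly; both choices are conveniences of the sketch, not part of the definition:

```python
from itertools import product

# Sketch: satisfaction of tuple-encoded formulas, and semantic entailment
# checked by enumerating all truth assignments.

def satisfies(A, phi):
    """A is a truth assignment: a dict from variables to True/False."""
    if isinstance(phi, str):                  # a propositional variable P
        return A[phi]
    op, *args = phi
    if op == "¬": return not satisfies(A, args[0])
    if op == "∧": return satisfies(A, args[0]) and satisfies(A, args[1])
    if op == "∨": return satisfies(A, args[0]) or satisfies(A, args[1])
    if op == "→": return (not satisfies(A, args[0])) or satisfies(A, args[1])
    if op == "↔": return satisfies(A, args[0]) == satisfies(A, args[1])
    raise ValueError(f"unknown connective {op!r}")

def entails(S, phi, variables):
    """True iff every assignment satisfying all of S also satisfies phi."""
    for values in product([False, True], repeat=len(variables)):
        A = dict(zip(variables, values))
        if all(satisfies(A, s) for s in S) and not satisfies(A, phi):
            return False                      # found a countermodel
    return True

# {p, p → q} semantically entails q, but {p → q} alone does not.
print(entails([("→", "p", "q"), "p"], "q", ["p", "q"]))   # True
print(entails([("→", "p", "q")], "q", ["p", "q"]))        # False
```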
Finally we define syntactical entailment such that φ is syntactically entailed by S if and only if we can derive it with the inference rules that were presented above in a finite number of steps. This allows us to formulate exactly what it means for the set of inference rules to be sound and complete:
Soundness: If the set of well-formed formulas S syntactically entails the well-formed formula φ then S semantically entails φ.
Completeness: If the set of well-formed formulas S semantically entails the well-formed formula φ then S syntactically entails φ.
For the above set of rules this is indeed the case.
Sketch of a soundness proof[edit]
(For most logical systems, this is the comparatively "simple" direction of proof)
Notational conventions: Let G be a variable ranging over sets of sentences. Let A, B and C range over