
STEPHEN READ

HARMONY AND AUTONOMY IN CLASSICAL LOGIC

Received on 2 October 1998; revised on 8 November 1999

Journal of Philosophical Logic 29: 123–154, 2000. © 2000 Kluwer Academic Publishers. Printed in the Netherlands.

ABSTRACT. Michael Dummett and Dag Prawitz have argued that a constructivist theory of meaning depends on explicating the meaning of logical constants in terms of the theory of valid inference, imposing a constraint of harmony on acceptable connectives. They argue further that classical logic, in particular, classical negation, breaks these constraints, so that classical negation, if a cogent notion at all, has a meaning going beyond what can be exhibited in its inferential use.

I argue that Dummett gives a mistaken elaboration of the notion of harmony, an idea stemming from a remark of Gerhard Gentzen’s. The introduction-rules are autonomous if they are taken fully to specify the meaning of the logical constants, and the rules are harmonious if the elimination-rule draws its conclusion from just the grounds stated in the introduction-rule. The key to harmony in classical logic then lies in strengthening the theory of the conditional so that the positive logic contains the full classical theory of the conditional. This is achieved by allowing parametric formulae in the natural deduction proofs, a form of multiple-conclusion logic.

KEY WORDS: classical negation, Dummett, fundamental assumption, Gentzen, multiple-conclusion logic

1. HARMONY AND AUTONOMY

1.1. Gentzen’s Remark

In an oft-quoted remark, Gerhard Gentzen wrote: “the introductions represent, as it were, the ‘definitions’ of the symbols concerned, and the eliminations are no more, in the final analysis, than the consequences of these definitions.”1 He was commenting in an informal way on the rules of his natural deduction calculi, NJ and NK, for intuitionistic and classical logic. His idea has been taken up and developed by many authors, belonging to quite diverse traditions. Nonetheless, the remark can be seen to encapsulate a conception of logic as in a certain way autonomous. The inference rules divide themselves into two classes, those which introduce into the conclusion an occurrence of a logical constant not appearing in the premises, and those which eliminate an occurrence of a constant in the premises, so that it has no match in the conclusion. The former set of rules exhibit the grounds for asserting a proposition whose form displays the constant, in general as main connective; the latter show what can be inferred from such an


assertion. Thus the introduction-rules are self-justifying in this sense: their conclusions have just the meaning which they confer on them. Gentzen wished the reader to observe that the elimination-rules which he had set out drew no more from an assertion than was justified by its grounds, its means of introduction.

This result was encapsulated in his Hauptsatz, the Cut-Elimination Theorem. This theorem shows the utility, and dispensability, of the method of lemmas. In general, we establish a result in stages, setting forth a series of lemmas, which are then composed to yield a further result. This aids clarity and understanding; but it is justified by the fact that the proofs of the lemmas can themselves be composed into a single and direct proof of the main theorem. So too with the introduction- and elimination-rules. Their composition was only directly established by Prawitz, Gentzen finding it more convenient to develop his ideas in relation to sequent calculi rather than natural deduction. But the results transfer, and the essential point is that in any derivation of a conclusion A from assumptions X (in a calculus of Gentzen’s sort), any formula which is the conclusion of an introduction-rule for a connective δ and simultaneously major premise of an elimination-rule for δ (so-called “maximal formulae”) can be removed and replaced by a direct proof with no such detours. Prawitz describes such a reduction as “normalization”.2

Gentzen’s remark epitomizes the conception of logic as self-justifying. It ran into opposition in Arthur Prior’s famous attack on what he called the “analytical validity” view. “It is sometimes alleged,” he wrote, “that there are inferences whose validity arises solely from the meanings of certain expressions occurring in them.”3 He produced a pair of introduction- and elimination-inferences for a supposed connective ‘tonk’ which collapsed the logic to triviality. By a suitable detour through ‘tonk’ (A, so ‘A tonk B’ by tonk-I, so B by tonk-E) every proposition became equivalent to every other. But whether or not Prior’s observation was salutary in the light of certain claims (by Popper and Kneale, for example), ‘tonk’ has no place in Gentzen’s conception of logic. Prior’s tonk-E rule is in no way a consequence of his tonk-I. It is getting clear about this sense of ‘consequence of’ which will form the focus of the present paper.
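The detour Prior exploits can be displayed as a one-step derivation (a minimal sketch of my own, using the standard bussproofs LaTeX package; the rules, but not the rendering, are Prior’s):

\documentclass{article}
\usepackage{bussproofs}
\begin{document}
% Prior's detour: from any A, 'A tonk B' by tonk-I, then any B by tonk-E.
\begin{prooftree}
  \AxiomC{$A$}
  \RightLabel{tonk-$I$}
  \UnaryInfC{$A \mathbin{\mathrm{tonk}} B$}
  \RightLabel{tonk-$E$}
  \UnaryInfC{$B$}
\end{prooftree}
\end{document}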

1.2. The Autonomy of Logic

Gentzen’s remark has been developed, in varying ways, by (amongst others) Dag Prawitz, Michael Dummett and Ian Hacking. Hacking’s focus of interest, uniquely among these three, is classical logic. A putative constant is not logical, he claims, unless it conforms to certain constraints, namely, that the extension of a logical system by the addition of this constant to


the language, and operational rules governing its behaviour, must be conservative and satisfy the sub-formula principle. An extension of a logic S on a language L by the addition of a constant δ, yielding a language L′ extending L and a system S′ extending S containing rules for the use of δ, is conservative if any inference (of A from X) in L provable in S′ (i.e. provable in the extension but not containing the new vocabulary δ) is provable in S. S satisfies the sub-formula principle if whenever A is derivable from X in S, there is a proof of A from X in S containing only sub-formulae of A and of formulae in X. (Any formula is a sub-formula of itself; and if A is a well-formed formula, according to the standard recursive definition of wff, only if {Ai} are wffs, then {Ai} are sub-formulae of A.)

Hacking’s claim is that, unless an extension S′ of S satisfies his two constraints, S′ will not preserve the essential marks of the consequence relation on S, namely, Reflexivity, Monotonicity and Cut (or Lemma), due to Tarski.4 If A is derivable from X in S, we write X ⊢S A. Let CnS(X) = {A : X ⊢S A}. CnS is reflexive if X ⊆ CnS(X); CnS is monotonic if CnS(X) ⊆ CnS(Y) whenever X ⊆ Y; and CnS satisfies Cut if CnS(CnS(X)) ⊆ CnS(X). Let us say that a relation is Tarski if it is reflexive, monotonic and satisfies Cut. Hacking’s thought is that, unless appropriately constrained, S′ will no longer be a system of logic if its consequence relation is not Tarski, and so δ is not a logical constant.

Hacking’s requirement that S′ be a conservative extension of S is too stringent, as Sundholm has shown.5 So too is it when Michael Dummett equates it with harmony, his evocative epithet for the desired relation between the rules for any new constant. The demand for conservativeness stems from Nuel Belnap’s response to Prior. It certainly cannot be denied that ‘tonk’ was spectacularly non-conservative. “We may now state,” Belnap wrote, “the demand for consistency of the definition of [any] new connective . . . as follows: the extension must be conservative . . . and (although this requirement is not as essential as the first) we ought to add uniqueness as a requirement.”6 A constant δ is unique if, given δ′ characterized by rules formally identical to those for δ, we can establish that each formula containing δ is deductively equivalent to the corresponding formula containing δ′ in place of δ.

The central idea of the autonomy of logic, recall, is that the rules for a constant (or at least, the rules for the assertion of formulae containing it, i.e. the introduction-rules) should “fully determine its meaning.”7 “If [these rules] did not guarantee uniqueness, [two connectives] would, in an obvious sense, obey the same logical laws, even though, not being equivalent, they might bear different meanings” (loc. cit.); so their meanings would


go beyond the laws. But this is too strong: the laws should certainly not be consistent with the two constants’ being inequivalent. This is compatible, however, with there being no proof of their equivalence. It is an essentially internalist demand to require that the equivalence be provable. If use fully determines meaning, then their equivalence must not be ruled out. But to infer, as Dummett does in the quotation above, that the lack of proof of uniqueness entails inequivalence, is unwarranted. It is sufficient that inequivalence not be provable.8

1.3. Total Harmony

The same objection must be levelled at the conservativeness claim. Harmony does not justify such a stringent condition. Dummett distinguishes, in fact, between total harmony, which he equates with conservativeness, and intrinsic or local harmony, which he identifies with normalization.

Consider Dummett’s argument that harmony requires conservativeness, in The Logical Basis of Metaphysics, p. 218.9 Harmony is first introduced in that book as an informal notion (pp. 210 ff.), meaning that the principle for verifying an assertion should cohere with the consequences that can be drawn from it – “the difference made by an utterance.” Suppose these principles, as governing some constant δ, were not in harmony. Then, he says, we should be drawing more in the way of consequences from an assertion involving δ than were justified by the grounds for that assertion. But those grounds must be stateable in terms not involving δ. (This claim, which he does not there defend, must ultimately depend on the “Fundamental Assumption,” which we will examine later, and his rejection of holism.) Thus our practice would be to infer more from an assertion involving δ than is possible in its absence. So the addition of δ to our vocabulary has resulted in a non-conservative extension.

There are three crucial points to notice in this argument. The first we have noted already – its dependence on the “Fundamental Assumption,” which we will examine closely in Section 1.4 below. Secondly, Dummett interprets disharmony to mean that the E-principles (the principles governing what conclusion to draw) are stronger than are warranted by the I-principles (those regulating the grounds for assertion). He had earlier (p. 217) suggested that a strengthening of the I-principles would also upset existing harmony, but he now interprets harmony as requiring that we infer no more than, though possibly less than, the I-principles warrant. Inferring too little is not “so deleterious an effect” and so not disharmonious. Nonetheless, he later introduces the conception of stability specifically to exclude such a situation (p. 287).


Thirdly, note that what the argument shows, if anything, is that conservativeness is sufficient for harmony; it does not show that it is necessary. Dag Prawitz in fact gives a counterexample,10 referring to the incompleteness of arithmetic. The addition of higher-order concepts, for example, second-order quantifiers, or a ‘truth’-predicate, can readily yield a non-conservative extension, even though those new concepts are perfectly harmonious. For example, the ‘truth’-predicate obeys the rules

p                       Tp
---- T-intro.           ---- T-elim.
Tp                      p

which even satisfy stability.11 But adding the ‘truth’-predicate to arithmetic (and allowing ‘T’ to appear in the induction axiom) allows us to prove the Gödel sentence, so is non-conservative.
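The harmony of these two rules can be seen in the usual way: a maximal occurrence of Tp, introduced and at once eliminated, reduces away (a trivial worked example of my own, rendered with bussproofs):

\documentclass{article}
\usepackage{bussproofs}
\begin{document}
% A maximal formula Tp, introduced by T-intro and immediately eliminated
% by T-elim, drops out: the detour reduces to the premise p standing alone.
\begin{prooftree}
  \AxiomC{$p$}
  \RightLabel{$T$-intro.}
  \UnaryInfC{$Tp$}
  \RightLabel{$T$-elim.}
  \UnaryInfC{$p$}
\end{prooftree}
\end{document}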

Dummett’s conception of total harmony as requiring conservativeness and uniqueness clearly stems from Belnap’s response to Prior. Nonetheless, the two requirements are far more stringent than can be justified. They certainly suffice to prevent such problematic connectives as ‘tonk’. But they would rule out otherwise unexceptionable connectives, and the arguments for their imposition do not support that stringency. Thus Dummett is quite wrong to identify harmony with conservativeness, as he does, for example, in his reference to Belnap (p. 246). We do not yet have the “more precise characterization of the notion of harmony” which Dummett was seeking.

1.4. Intrinsic Harmony

The conception of harmony so far considered is total harmony. We noted that Dummett distinguishes total harmony from intrinsic harmony (p. 250). Intrinsic harmony is a particular requirement that the rules governing a logical constant be appropriately related; total harmony is the constraint that, whatever the exact detail of the rules for a constant, consequences of an assertion should not exceed the grounds for that assertion. Whether these two conceptions are really distinct – even whether Dummett believes they are distinct – is unclear, partly because total harmony is an informal notion made precise by a flawed argument (for conservativeness).

Intrinsic harmony stems from the work of Prawitz and Gentzen, as total harmony from that of Belnap. Dummett frequently identifies harmony with normalizability (e.g., p. 250). Indeed, he claims that normalizability entails conservativeness, but as we have seen, that cannot be right. Note that normalizability is a particular feature of certain formulations of the rules for a logical constant. This feature obtains when the elimination-rules can be read off from – are seen to be generated by – the introduction-rules.


Dummett calls this procedure “the upwards justification procedure.” However, his understanding of the procedure is deeply mistaken, and this misconception leads to many problems with his analysis.

The problems come to a head on p. 278. At this point, Dummett concedes (unnecessarily) that neither the conditional nor the universal quantifier is given its (full) meaning by the specification of the grounds for its assertion. Although he bravely recovers in later chapters from this low point, it marks such a failure of the project to explain the logical constants as self-justifying that it should induce a reassessment of the whole method of approach.

The root of the mistake is Dummett’s application of what he calls the “fundamental assumption”: “that, if we have a valid argument for a complex statement, we can construct a valid argument for it which finishes with an application of one of the introduction rules governing its principal operator” (p. 254). The notion is more commonly called “inversion.” A rule of inference is invertible if whenever there is a derivation of its conclusion, there is a derivation of its premises.12 Dummett’s fundamental assumption is that the introduction-rules are invertible, that is, that if we can prove their conclusion, we can derive the premise(s) and so proceed to the conclusion directly by the rule. But not all rules are invertible, even when the system is normalizable. For example, Prawitz’ NJ, corresponding to LJ, is normalizable. But →-left is not invertible in LJ. So normalization does not warrant the “fundamental assumption.” Normalization, in Dummett’s helpful metaphor, “levels local peaks”: maximal formulae can be removed, and proofs obey the sub-formula principle. That does not show that the proof will end with application of an introduction-rule for the main connective.
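A small witness to the failure of invertibility (my example, not the paper’s): the LJ sequent p → q ⇒ p → q is derivable outright, yet inverting →-left on it would require a derivation of the premise ⇒ p, which there is not. In LaTeX:

\documentclass{article}
\begin{document}
% The LJ sequent  p -> q => p -> q  is derivable, but inverting ->-left
% on it would demand the underivable premise  => p.
\[
  p \rightarrow q \;\Rightarrow\; p \rightarrow q
  \qquad \mbox{but not} \qquad
  {} \;\Rightarrow\; p
\]
\end{document}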

In fact, Chapters 11–13 of The Logical Basis of Metaphysics present a succession of counterexamples to the “fundamental assumption.” In the face of them, Dummett successively revises his upwards justification procedure – but throughout those revisions he retains the assumption. The procedure is in essence this: an inference from A to B is valid if, by successively tracing the antecedents of A as yielded by applying the fundamental assumption to it and its antecedents, we eventually arrive at B, or some formulae which we know independently to entail B. For example, the existential quantifier distributes over disjunction because any proof of (∃v)(A ∨ B) must, by the fundamental assumption, be convertible to one ending in ∃I, yielding a proof of A(t/v) ∨ B(t/v), and again by the assumption, that proof must give way to one ending in a proof of A(t/v) or B(t/v). Now apply ∃I and ∨I to A(t/v) (or B(t/v)), and we have a proof of (∃v)A ∨ (∃v)B, that is, of the conclusion of the relevant distributive law.


That such a procedure appears to validate valid rules of inference shows nothing at all, since it also “validates” invalid laws. Dummett’s counterexamples are the distribution of the universal quantifier over disjunction (p. 259); to infer A from ◊A (possibly-A: p. 265); the distribution of the conditional over disjunction (classically valid but intuitionistically invalid and so anathema to Dummett: p. 273); and finally to infer ¬A from a derivation of B from A (p. 294)! In each case (except ◊A), to be sure, a restriction or refinement of the procedure is found. But the procedure itself has no force until the need for further such refinements is ruled out; no such guarantee is given, nor can be. The case of ◊A conclusively refutes the fundamental assumption. The introduction-rule for ◊ infers ◊A from A. But clearly many things follow from A (in particular, A itself) which do not follow from ◊A.

In fact, a crucial equivocation lies at the heart of the fundamental assumption and Dummett’s justification procedure. It can frequently be the case that any provable wff A has a proof that ends with an application of the introduction-rule for its main connective, while no derivation of A from some assumptions X does, but with applications of elimination-rules for wffs in X. Contrast, e.g., the proof of (∃v)(A → A) with the derivation of (∃v)(B → A) from (∃v)A. Now suppose that for some A, ◊A is actually provable. Then, we can suppose, there is a proof of ◊A ending with an application of ◊I, and so there is a proof of A itself. That is, if ⊢ ◊A then ⊢ A. That shows that the calculus is normal, sometimes called Gödel’s rule. The problem only arises if we suppose that given any derivation of ◊A from other assumptions there is a derivation ending in ◊I. Then we could indeed conclude that A follows from ◊A, and modalities would collapse. Yet it is clear from Dummett’s remarks on p. 272 that he does conceive of the fundamental assumption’s being applied in the justification procedure in the latter situation, when there are “arbitrarily many additional assertible premises.”

What is needed is a rethink of the notion of intrinsic harmony, to discern the true relationship between the introduction- and elimination-rules. Dummett’s conclusion (p. 299) is that “intuitionistic logic . . . has come out of our enquiry very well . . . [but] classical negation . . . is not amenable to any proof-theoretic justification procedure based on laws which may reasonably be regarded as self-justifying.” This is not so; ‘→’, ‘∀’, ‘◊’ and (classical) ‘¬’ all fall down as incapable of self-justification on his approach. We will see, however, that once Gentzen’s remark is properly understood, and we comprehend how the elimination-rules are properly “consequences of” the introduction-rules, all these connectives, including classical negation, are self-justifying and harmonious.


2. THE PROOF-CONDITIONAL THEORY OF MEANING

2.1. The Meaning of the Logical Constants

Gentzen himself gave an example to clarify his remark: “[i]f we . . . wished to use [A ⊃ B] by eliminating the ⊃-symbol . . . we could do this precisely by inferring B directly, once A has been proved, for what A ⊃ B attests is just the existence of a derivation of B from A” (loc. cit.).13 Thus what is inferred in the elimination-rule from, for example, A ⊃ B is no more than can be inferred (directly) from what warrants an assertion of A ⊃ B in the first place, that is, the premises of the introduction-rule. In general, if {Πi} denotes the grounds for introducing some formula A (introducing an occurrence of a connective δ in A), then the elimination-rule for δ should permit inference to an arbitrary formula C only if {Πi} themselves entail C. (We need to permit multiple introduction-rules here, to accommodate, for example, the case of disjunction, ‘∨’. There may be a whole variety of grounds for asserting A.) Schematically, given the introduction-rules

Π1              Π2
----- δI        ----- δI     . . .
 A               A

the elimination-rule will take the form

        (Π1)    (Π2)
 A       C       C     . . .
------------------------------ δE
              C

where the background premises {Πi} are discharged by the application of the elimination-rule.

We can illustrate the directness of the inference to C by considering a case where δI is followed immediately by an application of δE:

 Πi
----- δI       (Π1)     (Π2)
  A             C        C      . . .
---------------------------------------- δE
                C

for the proof reduces, as Prawitz showed in his influential book,14 by removing the maximal formula A, to obtain

Πi
C

extracting the particular sub-proof of the application of δE matching the actual premise which led originally to A. Thus we obtain Prawitz’ normalization theorem (that maximal formulae can be eliminated from proofs), corresponding to Gentzen’s cut-elimination theorem (that Cut can be eliminated from proofs in the corresponding sequent system).


It is when introduction- and elimination-rules lie in this relationship permitting normalization that there is the harmony which Dummett seeks in the rules, a harmony between the means by which a formula can be established indirectly and the way it can be obtained directly. In the above schema, C can be established directly from Πi (as in the sub-proof of δE), and it can be established indirectly from Πi by a detour through A, first introducing A, then eliminating it. When this situation obtains, δI and δE are in harmony.

One may still puzzle, however, over the sanction by which the I- and E-rules are required to be in harmony. There is much talk of how things “should be”, and there is certainly an aesthetic satisfaction in seeing proofs simplify by the reduction procedure. But of course, proofs do not disappear totally under this procedure. There will be assumptions, containing connectives δ, which are not introduced by δI, and conclusions acting as major premises for elimination. What “harmony”, that the elimination-rule follows from the introduction-rules in Gentzen’s sense, shows is that no more is inferred from a formula containing δ than the introduction-rules for δ warrant. In other words, the constant is entirely logical in character, in that the introduction-rules fully specify the meaning, and so the connective is, in the intended sense, autonomous.

It so happens that harmony also eases proof-search. A direct proof, from assumptions to conclusion, is not only more pleasing aesthetically, it is also easier to find and to follow. Maximal formulae impede proof-search, for they need not be sub-formulae of the premises and conclusion. But, as Gentzen himself observed (ibid., p. 69), proofs in normal form (whether in natural deduction, or in sequent calculus) are “not roundabout”. That was why Cut-elimination, showing the directness of sequent calculus proof, was his main goal, his “Hauptsatz”, and the sub-formula principle the immediate corollary.

But the philosophical importance of harmony is the autonomy which it confers on the logical constants. The guiding principle of a “proof-conditional theory of meaning” is that if the meaning of a logical constant is solely (i.e. completely) given by its introduction-rule(s), then one is entitled to infer from a formula containing it no more and no less than one can infer from the grounds for its introduction (assertion). All indirect proof reduces to direct proof. Such constants are self-justifying and autonomous. Their meaning is fully contained in the introduction-rule.


2.2. Conjunction

We need now to spell out exactly how this general statement applies to particular cases. Let us start with the case of conjunction. The introduction-rule has the form:

X : A    Y : B
--------------- &I
X, Y : A & B

spelling out the dependencies of the formulae explicitly.15 X : A denotes a (single-conclusion) sequent, which is a pair consisting of a multiset of wffs and a wff. (A multiset is a set each of whose members is indexed by the number of its occurrences in the multiset.) In accordance with the recipe in Section 2.1, the elimination-rule should allow us to infer from A & B (directly) whatever the premises (A and B) themselves entail. That is,

X : A & B    Y, A, B : C
------------------------- &E
X, Y : C

Given that C follows from A and B (and other assumptions Y), as the right-hand premise reveals, then C follows from whatever entails A & B in the left-hand premise, or in other words, given that A & B follows from X, C follows from X together with Y.

The sense in which &I “justifies” &E, following Gentzen’s remark, is that an indirect proof via A & B can be reduced to a direct proof. Consider the general scheme:

X : A    Y : B
--------------- &I
X, Y : A & B               Z, A, B : C
--------------------------------------- &E
X, Y, Z : C                                  (α)

A & B is here a maximal formula, and can be eliminated by what Prawitz calls a reduction step. In Prawitz’ notation, it would read:

Π1    Π2
A     B            (A, B)
----------- &I       Π3
 A & B                C
-------------------------- &E
            C

and the simplified proof, removing A & B, would be displayed as:

Π1    Π2
A     B
  Π3
  C

where the assumptions A and B in Π3 are “covered” by the proofs Π1 and Π2. In the original proof, the final occurrence of C does not depend on A


and B, since they are discharged by the application of &E. In the simplified proof, C does not depend on A and B since they are each derived from Π1 and Π2.

In moving to the explicit notation where each node displays the assumptions on which each formula depends as well as the formula itself, a further step is needed to show the joins where Π1 and Π2 meet Π3. This is a step of what (in the sequent calculus presentation) we noted that Gentzen called Cut. The general form is

X : A    Y, A : B
------------------ Cut
X, Y : B

This Cut rule is, however, not a sequent calculus rule, but an admissible rule of natural deduction, that is, a claim that any proof of the premises can be converted to a proof of the conclusion. In the derivation of B from Y and A, all occurrences of A introduced by assumption can be replaced by derivations of A from X. Then the original schema (α) reduces as follows:

           X : A    Z, A, B : C
           --------------------- Cut
Y : B           X, Z, B : C
----------------------------- Cut
X, Y, Z : C

C is now derived directly from X, Y and Z, without first introducing and then eliminating the maximal formula A & B. In other words, what justifies &E (given &I), that is, what constitutes their being in harmony, is the admissibility of Cut. The introduction- and elimination-rules (for &, or more generally, for any logical constant δ) are in harmony if maximal formulae can be removed by applications of Cut.

In addition, we need two structural rules to govern the explicit notation, affecting the assumptions:

X : B                       X, A, A : B
--------- Thinning          ------------ Contraction
A, X : B                    X, A : B

that is, the assumptions can be arbitrarily augmented at any time, and multiple occurrences of a wff in the assumptions can be contracted to a single occurrence.

2.3. Simplified E-Rules

The elimination-rule for & given in Section 2.2 has an unusual form – sometimes called “generalized &E”. In its more usual form, &E has two cases:

X : A & B                X : A & B
----------- &E-left      ----------- &E-right
X : A                    X : B


These are both special cases of (generalized) &E, and together suffice for completeness. How is this possible?

In (generalized) &E, as given in Section 2.2, let Y = ∅ and let C = A. We obtain

X : A & B    A, B : A
----------------------
X : A

The right-hand premise is derivable by the structural rule of Thinning. The application of &E, therefore, reduces to &E-left. Similarly, letting Y = ∅ and C = B yields &E-right. Thus the two more usual rules are special cases.

The admissibility of Cut guarantees that the special cases turn out to suffice: in fact, full &E is a derived rule:

X : A & B
----------- &E-left
X : A                  Y, A, B : C
----------------------------------- Cut
X : A & B              X, Y, B : C
----------- &E-right
X : B
----------------------------------- Cut
X, Y : C

We will find that this situation, where a special case of the (generalized) rule justified by Gentzen’s remark suffices for completeness in the presence of Cut, is repeated for ‘∀’. It is important to understand that the two cases of &E, though they appear to be weaker than &E itself, being special cases where C = A or C = B, are in fact not weaker, but together suffice to derive &E. &I justifies &E, which is equivalent to the pair of rules &E-left and &E-right.

The usual form of ∨E is already in the generalized form – it does not simplify. It is justified by Gentzen’s remark as follows. Take the two cases of ∨I:

X : B                      X : A
------------ ∨I-left       ------------ ∨I-right
X : A ∨ B                  X : A ∨ B

There are thus two grounds for the assertion of A ∨ B. Accordingly, the elimination-rule must contain two sub-proofs:

X : A ∨ B    Y, A : C    Z, B : C
---------------------------------- ∨E
X, Y, Z : C

That is, given a derivation of A ∨ B, together with demonstrations that C follows from both grounds for asserting A ∨ B, that is, C follows both from A (together with Y) and from B (and Z), we may infer that C follows from


A ∨ B itself – and so from whatever supports A ∨ B, viz. X. For ∨E, so stated, permits removal of maximal formulae by Cut:

X : A
----------- ∨I
X : A ∨ B          Y, A : C    Z, B : C
----------------------------------------- ∨E
X, Y, Z : C

reduces to

X : A    Y, A : C
------------------ Cut
X, Y : C
-------------- Thinning
X, Y, Z : C

and similarly for the case where A ∨ B is derived from B.

2.4. The Conditional

In the case of the conditional, ‘→’, matters are a little more complex, for there is a discharge of a hypothesis in the introduction-rule. It reads:

X, A : B
----------- →I
X : A → B

Thus in the elimination rule, we need to hypothesize a derivation of B from A, preparatory to inferring C. Let us, temporarily, introduce a novel notation, V, for this derivation:

X : A → B    Y, A V B : C
--------------------------- →E′
X, Y : C

We cannot yet use this rule, for we do not know how to deal with ‘A V B’.16 However, we can adapt Gentzen’s sequent calculus to this end. For ‘A V B’ is really no more than the introduction of a conditional into the assumption set, and such an introduction is governed by (what might be called) Gentzen’s V-left rule:

Y : A    Y, B : C
------------------ V-left
Y, A V B : C

adapted from the sequent calculus rule. Accordingly, the generalized form of our elimination-rule reads:

X : A → B    Y : A    Y, B : C
-------------------------------- →E″
X, Y : C

This simplifies (as with &) to a special case which suffices for completeness. Let B = C. We obtain

X : A → B    Y : A
-------------------- →E
X, Y : B


removing the now redundant third premise, Y, B : B. That is, we obtain the familiar modus ponens rule. Modus ponens, together with Cut, yields →E″ as a derived rule:

X : A → B    Y : A
-------------------- →E
X, Y : B                    Y, B : C
-------------------------------------- Cut
X, Y : C

Maximal formulae of the form A → B are removed without difficulty. The proof-scheme

X, A : B
----------- →I
X : A → B           Y : A
---------------------------- →E
X, Y : B

reduces to

Y : A    X, A : B
------------------ Cut
X, Y : B

Note that our play with V was entirely heuristic. We wanted to hypothesize a derivation of B from A, for such a derivation was the ground given in →I for A → B. Gentzen’s sequent calculus was set up precisely to study such an hypothesis, the introduction of an implication into the antecedent, i.e., into the assumptions (more generally, into the assumptions and what has been derived from them). A V B says in effect that what B entails is entailed by whatever entails A – it provides a link in a chain. So if Y entails A, Y will entail B (and whatever B entails in turn) in conjunction with the grounds for asserting A → B. Thus, using only reflection on the grounds for asserting A → B specified in →I, we obtain →E.

2.5. The Universal Quantifier

Further complications arise in applying our theory to the quantifiers, ∀ and ∃. The theory will point up the analogy between & and ∀ and between ∨ and ∃.

The introduction-rule for the universal quantifier, ∀, is standardly given as:

X : A(u/v)
------------ ∀I
X : (∀v)A

(where A(u/v) results from A by replacing all free occurrences of v by u), provided X is u-free, that is, provided there are no free occurrences of u


in any formulae in X. But what does this restriction on u really mean? It means that universal introduction is actually an infinitary rule:17

X1 : A(t1/v)    X2 : A(t2/v)    . . .    for all ti
----------------------------------------------------- ∀I∞
⋃i Xi : (∀v)A

Given that everything has a name (ti) and that the (possibly) infinite number of premises exhaust the names, the rule is valid (intuitively – not, of course, in a compact logic like the finitary first-order logic we are actually examining) – and with no restriction on the possible (indeed, likely) occurrence of ti in Xi. What the usual ∀I-rule does is to collapse this scheme to the manageability of finitary logic. Perhaps not everything does have a name; and anyway, proofs are required to be finitary objects, so cannot contain a proof of A(ti/v) when i ranges over an infinite index set I. A(u/v) goes proxy for each A(ti/v), and X for each Xi. But since u takes the place of any of the ti, u cannot occur (free) in X – on pain of collapsing into a specific premise. Thus by letting u in ∀I range over the whole domain (we give free variables what is standardly called the “generality interpretation”), we avoid two aspects of ∀I∞, its infinitary nature and the assumption that everything can be named. Nonetheless, the reason ∀I is valid is that, if everything were named and A(ti/v) were proven for all i ∈ I, then (∀v)A would follow.

In the case of &I, there were two premises (two parts to the derivation of A & B), and consequently, two assumptions in the minor premise of &E. Since there are infinitely many premises constituting the ground for asserting (∀v)A, there will be infinitely many assumptions in the minor premise of ∀E∞:

X : (∀v)A    Y, A(t1/v), A(t2/v), . . . : B
--------------------------------------------- ∀E∞
X, Y : B

A maximal formula (∀v)A sandwiched between ∀I∞ and ∀E∞ can clearly be removed by infinitely many Cuts – infinitary logic requires infinitary proofs to deal with its infinitary schemata. But ∀E∞ can be simplified to a finitary rule by a move similar to the simplification of &E. There are infinitely many such rules, one for each ti. Let Y = ∅ and let B = A(ti/v):

X : (∀v)A    A(t1/v), A(t2/v), . . . : A(ti/v)
------------------------------------------------ ∀Ei
X : A(ti/v)

Removing the trivially valid minor premise (it results from A(ti/v) : A(ti/v) by infinitely many applications of Thinning) yields the familiar version of ∀E. Normalization (removing a maximal formula of the form (∀v)A) now requires a global manipulation of the proof, replacing u by


ti throughout the derivation of X : A(u/v) – a replacement guaranteed to preserve the correctness of the derivation by the absence of (free) u from X.18

Finally, note that the admissibility of Cut assures us that the simplified ∀E-rule generates the generalized form as a derived rule:

X : (∀v)A
------------- ∀E1
X : A(t1/v)          Y, A(t1/v), A(t2/v), . . . : B
----------------------------------------------------- Cut
X : (∀v)A            X, Y, A(t2/v), A(t3/v), . . . : B
------------- ∀E2
X : A(t2/v)
----------------------------------------------------- Cut
X, Y, A(t3/v), . . . : B
      ·
      ·                                                 Cut
      ·
X, Y : B

Thus the special cases taken all together are equivalent to the rule ∀E∞ which is justified harmoniously by ∀I∞.

2.6. The Existential Quantifier and Negation

Realising that the usual ∀E-rule abbreviates infinitely many rules (as &E has two cases), makes it clear that ∃I is also an infinitary set of rules, one for each ti:

X : A(ti/v)
------------- ∃I
X : (∃v)A

It follows that the proper form of ∃E, in accordance with Gentzen’s remark, will have infinitely many minor premises (as ∨E has two):

X : (∃v)A    Y1, A(t1/v) : B    Y2, A(t2/v) : B    . . .
---------------------------------------------------------- ∃E∞     (β)
X, ⋃i Yi : B

A maximal occurrence of (∃v)A is then removed by pairing its premise X : A(ti/v) with the i-th minor premise:

X : A(ti/v)    Yi, A(ti/v) : B
-------------------------------- Cut
X, Yi : B
---------------- Thinning
X, ⋃i Yi : B

However, we must, as before, simplify the infinitary ∃E-rule to the finitary case. To do so, we collapse the infinitely many minor premises to a single exemplary case:

X : (∃v)A    Y, A(u/v) : B
---------------------------- ∃E
X, Y : B


But consider Y and B: although proofs and inference-figures in, for example, (β) are infinitary, formulae are not. So B can contain only finitely many terms. It cannot, therefore, contain more than finitely many of the ti. It follows that B must not contain ‘u’. For ‘u’ goes proxy for all the ti. Moreover, Y is (in finitary proofs) finitary. So again, though each Yi will in general contain ti, Y may not contain ‘u’, for ‘u’ stands arbitrarily for any of the ti. Thus we obtain the standard restriction on the validity of ∃E, that ‘u’ does not occur free in Y or B.

Finally, note as in Section 2.5 that in the reduction procedure for a maximal formula between ∃I and ∃E, we once again have to make a global manipulation, replacing ‘u’ throughout the proof of Y, A(u/v) : B by ti, correctness being maintained by the absence of u (free) from Y and B.

Prawitz showed how to add negation harmoniously.19 Take an absurdity constant, ⊥, and define negation by the scheme: ¬A ≡ A → ⊥. Posit no introduction-rule for ⊥. Gentzen’s remark, interpreted as in Section 2.2 onwards, shows that the elimination-rule for ⊥ reads as follows:

X : ⊥
------- ⊥E
X : A

There is no minor premise, deriving A from assuming the grounds satisfied for the introduction of ⊥, since there are no such grounds. ⊥E is in harmony with there being no rule of ⊥I. There are, therefore, no maximal formulae of the form ⊥, and so no reduction step is needed.20
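A small worked illustration of the resulting theory of negation (my example, in the explicit notation of Section 2.2, rendered with bussproofs): from A and ¬A, i.e. A → ⊥, any C follows by →E and then ⊥E.

\documentclass{article}
\usepackage{bussproofs}
\begin{document}
% From A and A -> falsum, any C follows: ->E gives falsum, then falsum-E gives C.
\begin{prooftree}
  \AxiomC{$Y : A \to \bot$}
  \AxiomC{$X : A$}
  \RightLabel{$\to E$}
  \BinaryInfC{$X, Y : \bot$}
  \RightLabel{$\bot E$}
  \UnaryInfC{$X, Y : C$}
\end{prooftree}
\end{document}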

2.7. Modality

What ultimately showed Dummett’s conception of intrinsic harmony to fail was the modal operator, ◊ (see Section 1.4 above). For the introduction-rule for ◊ reads:

X : A
-------- ◊I
X : ◊A

If we suppose (by the Fundamental Assumption) that any derivation of ◊A must end with an application of ◊I, then whatever entails ◊A must entail whatever A entails, from which it follows that ◊A entails A. But it does not.

One might infer from the procedure outlined in Sections 2.1–2.6, however, that it will lead to the same absurd conclusion. For given a derivation of ◊A, and given that B follows from whatever the grounds for asserting ◊A are (viz. A), we surely can conclude B:

X : ◊A    Y, A : B
------------------- ◊E
X, Y : B

Yet ◊E seems to permit derivation of ◊A ⊢ A:


◊A      A (1)
-------------- ◊E (1)
A

◊A and A become equivalent. What has gone wrong?

Without the prior realization that ∃I was an infinitary set of rules, we might have been led into similar error about ∃E. Without restriction on B in ∃E we could prove the equally fallacious sequent, (∃v)A ⊢ A(ti/v):

(∃v)A      A(ti/v) (1)
------------------------ ∃E (1)
A(ti/v)

There must, therefore, be some similar restrictions on Y and B in ◊E. But what are they, and how can they be justified in a non-ad hoc way, i.e. independently of a desire to prevent the equation of ◊A with A?

What ◊I does is infer from the truth of A at some particular world (one where every member of X is true) that A is true at some world or other. There must then be some implicit world-index on A denoting the world wi at which it is true:

X : Awi
------------- ◊I′
X : (∃w)Aw

◊I′ is, like ∃I, essentially an infinitary rule, one for each wi. Thus ◊E is shorthand for:

X : ◊A    Y1, Aw1 : B    Y2, Aw2 : B    . . .
----------------------------------------------- ◊E′
X, ⋃i Yi : B

Once again, B can contain only finitely many (implicit) references to worlds, nor can Y stand arbitrarily for the Yi unless it makes no implicit reference to any particular world. Thus each member of Y must be true everywhere if anywhere, so they must each be equivalent to a □-wff; correspondingly, B must be false everywhere if anywhere, so equivalent to a ◊-wff. We thereby obtain the standard restrictions on ◊E: in S4, that every member of Y must be essentially equivalent to a □-wff and B to a ◊-wff; in S5, that every member of Y and B be fully modal, that is, each variable in every member of Y and B must lie within the scope of a modal operator.21

Our procedure again shows its value in exhibiting the way the elimination-rule, this time for ◊, is justified by the introduction-rule. Whereas Dummett’s simplistic Fundamental Assumption led him astray, our procedure reveals the true nature of the I- and E-rules and their relationship.

2.8. A Proof-Conditional Liar

Let me summarize the general theory. The introduction-rule(s) for a connective δ specify the grounds for assertion of formulae δA in which δ is


a new occurrence of a connective without a match in the premises. There may be several such rules. From them we may read off the elimination-rule for δ in the following way: given a proof of δA, and proofs that a further formula C follows from each and every one of the various grounds, given in the introduction-rules, for assertion of δA, we may infer C, discharging the hypotheses of each of those grounds and retaining only the parametric formulae together with the hypotheses justifying δA itself. The introduction-rules provide sufficient grounds for δA; the elimination-rule exhibits the necessary condition, namely, that to assert δA, one of the introduction-rules is required. Suppose the various introduction-grounds are {Πi}; then each of Πi is sufficient for δA, and one of Πi is necessary for δA.

Although this theory shows that Prior’s ‘tonk’-rules are not harmonious, the theory does not prevent inconsistency or triviality in the way Belnap hoped to provide. For the introduction-rule in itself can be psychotic, as the following example shows.22 Suppose we introduce a new symbol • as a zero-place connective, and give it the introduction-rule:

X : ¬•
-------- •I
X : •

(where ¬• abbreviates • → ⊥). Thus a derivation of ¬• from X suffices to infer • from X. Assuming this is the sole introduction-rule, we may conclude that the elimination-rule should read:

X : •    Y, ¬• : C
------------------- •E
Y, X : C

that is, given a derivation of • from X and a derivation of C from the sole ground for assertion of •, viz. ¬•, we may infer C from X together with any parametric assumptions.

Let X = •. Then a special case of •E is:

Y, ¬• : C
----------- •E′
Y, • : C

But •I and •E′ quickly yield triviality:

¬• : ¬• (1)                              ¬• : ¬• (3)
------------ •E′                         ------------ •E′
• : ¬•          • : • (2)                • : ¬•          • : • (4)
-------------------------- →E            -------------------------- →E
• : ⊥                                    • : ⊥
------- →I                               ------- →I
: ¬•                                     : ¬•
                                         ------ •I
                                         : •
----------------------------------------------------- →E
                       : ⊥
                      ------ ⊥E
                       : C


Note that •E′ discharges an assumption of ¬• (at 1 and 3) and replaces it with an assumption of • which is later discharged, along with the other assumptions of • at 2 and 4, by the applications of →I. Thus the logic of • is inconsistent.

Does this show that there is something wrong with the above analysis of harmony, or with harmony itself? Should harmony exclude inconsistency? I believe not. For we can see by looking at •I itself that there is something very strange about it, which •E simply serves to spell out explicitly. The wff • is rather like the Liar sentence. For •I says that the falsity of • suffices for its truth, i.e., that ¬• ⊢ •. Hence • cannot be false. But • has only one introduction-rule, so (as is spelled out in •E), the falsity of • is also necessary for its truth, i.e., • ⊢ ¬•. Thus • ⊣⊢ ¬•, whence triviality is immediate from the theory of ¬. So • is a wff which is equivalent to its own negation, which indeed in a way (since its “meaning” is, according to Gentzen’s remark, given by the grounds for its assertion) means that it itself is false. Thus it constitutes a kind of proof-conditional Liar sentence.

This fact does not impugn harmony. Rather, harmony has proved a useful tool by which to spell out the absurdity of • so characterized. Belnap and others look to harmony and kindred notions to outlaw connectives like • and rules like •I. But this represents only one of a range of attitudes to such paradoxes as the Liar. Suppose, for example, that we had previously defined • as an abbreviation for ‘This sentence is false’ or some other Liar-type sentence. Then •I would be a natural way of representing one-half (the sufficiency half) of the truth-schema for •; and •E would result from observing that •I was its sole ground for truth. Perhaps in the end we would conclude that the truth-scheme had best not apply (at least, straightforwardly) to •, and that •I and •E were not good rules to have. But in coming to that conclusion, we need to be able to express •I and •E and to understand their (harmonious) relationship.

Reflection on • also brings out another point rather well. Clearly, from •I and •E one can derive more than one can derive from the grounds for introduction of •, for from •I and •E one can derive anything, whereas the grounds for asserting • say only that • is false (¬•). But Gentzen’s claim that “the eliminations are not more than consequences of those definitions” (the introductions) was not supposed to mean that the elimination-rule is somehow a derived rule, or that one could dispense with the elimination-rule. The inconsistency, and here in the context of ⊥E, the triviality, to which • leads, arises from combining the thought that the falsity of • is sufficient for its truth with the fact that its falsity is also necessary for its truth, for its falsity is the only ground for asserting •. •, unlike ‘tonk’, really is incoherent, but at the same time, harmonious.


3. HARMONY IN CLASSICAL LOGIC

3.1. The Root of Dummett’s Complaint

The set of rules for &, ∨, →, ∀, ∃ and ⊥ in Sections 2.2–2.6 yields (first-order) intuitionistic logic. Does that show that intuitionistic logic is uniquely a logic whose rules are in harmony? I believe not. Dummett, as we have seen in Section 1.4, claims that the connectives of classical logic, in particular, negation, cannot be seen as self-justifying, and its rules do not stand in harmony. Prawitz too, claims that “there is no known procedure that justifies . . . the classical rule of indirect proof (i.e. the rule of inferring A given a derivation of a contradiction from ¬A)”23 in accordance with Gentzen’s remark, that is, in a way which shows classical logic to be autonomous, and the consequences of negative formulae to accord with the grounds for their assertion. These are strong claims. But they are mistaken.

The root of Dummett’s complaint about classical negation is that, as he puts it, “the addition of negation, subject to the classical rules, does not produce a conservative extension” (p. 291). The standard natural deduction rules for &, ∨, →, ∀, ∃, as given above, do not suffice to establish certain classically valid negation-free formulae, such as

⊢ ((p → q) → p) → p        (Peirce’s law)

and

⊢ (∃x)((∃y)Fy → Fx)        (Peirce’s second law)

(see Read, 1992). They become provable once ‘¬’ is added to the positive logic above, but this time subject to such rules inter alia as classical reductio (what Prawitz called “indirect proof”):

X, ¬A : ⊥
-----------
X : A

or alternatively, the rule of double negation elimination:

X : ¬¬A
---------
X : A

Dummett’s and Prawitz’ objections depend crucially on the presentation of classical logic which is considered. The problem does not affect, for example, standard sequent calculus, LK. LK is a conservative extension of its positive fragment, that is, the calculus obtained by dropping the rules for ¬, ¬-left and ¬-right. Proofs of the above classically valid formulae in LK do not involve negation (by the sub-formula property). The reason is that LK allows multiple conclusions, and Peirce’s law is provable


directly by two applications of →-right, and one each of →-left, Thinning and Contraction. Also truth-tables, or in the first order case, semantic trees, arguably the real basis of classical logic (i.e., the study of structures, proofs being only a means to that end), yield all the negation-free formulae of classical logic directly without any consideration of negation. What I intend to do is to explain how the sense of the connectives of classical logic can be captured by inference rules in such a way that harmony and autonomy are guaranteed; and in particular, to exhibit a reformulation of NK in which all the negation-free theses of classical logic are provable without use of the rules for negation. The rules will be in harmony, in that all proofs in ¬, &, ∨, →, ∀ and ∃, complete for the theses of classical logic, will be normalizable. Every proof can be put in normal form by eliminating maximal formulae, formulae introduced by introduction-rules and acting as major premise of the corresponding elimination-rules. The result is a natural deduction system for full classical logic satisfying the criteria we have noted above for a logic to be autonomous.

The operative thought is this: the fault with NK which Dummett and Prawitz are pointing out is not in fact with negation, but with the conditional. The theory of the conditional in NK is incomplete. It does not suffice on its own to establish all the ‘→’-theses of (classical) logic. The solution is to strengthen the positive fragment before adding negation. The question is whether this can be done while preserving harmony.

3.2. Multiple Conclusions

LK succeeds in capturing classical logic harmoniously by allowing multiple conclusions (or succedents) and then contracting. For example, in the proof of Peirce’s law, there is a contraction on the right of two occurrences of p:24

p ⇒ p
---------- Thinning
p ⇒ q, p
-------------- →-right
⇒ p → q, p          p ⇒ p
----------------------------- →-left
(p → q) → p ⇒ p, p
--------------------- Contraction
(p → q) → p ⇒ p
------------------------ →-right
⇒ ((p → q) → p) → p

Shoesmith and Smiley say that “classical logicians, like so many Monsieur Jourdains, have been speaking multiple conclusions all their lives without knowing it”.25 The interpretation of an LK sequent X1, . . . , Xm ⇒ Y1, . . . , Yn is the formula X1 & . . . & Xm → Y1 ∨ . . . ∨ Yn; that is, multiple conclusions are essentially disjunctive. The crucial move permitted in


LK but not in LJ (its single-conclusion counterpart, yielding intuitionistic logic) is that from X, A ⇒ B, Y to X ⇒ A → B, Y, that is, from the formula A → (B ∨ C) to (A → B) ∨ C.26 So the natural thought is to strengthen the →I-rule of NK to allow disjoined parameters, X:

A
 ⋮
B, X
------------- →I′
(A → B), X

This way of converting LK derivations to “natural deduction” proofs was developed by Boricic (1985), following von Kutschera (1962). The new calculus is called NC.

Dummett and Prawitz are, of course, not unaware of the existence of LK. They exclude multiple conclusions from consideration because they allow the assertion of disjunctions neither of whose disjuncts is assertible. But that is to beg the question. The question is whether intuitionistic logic is superior proof-theoretically to classical logic. To exclude forms of proof which are intuitionistically unacceptable is to introduce a circle in the reasoning.

Proofs now consist of multisets of wffs at each node in the tree. That is, in the explicit notation, sequents are pairs of multisets, X : Y. Thinning and Contraction extend to the succedent.

How does this generalization affect the justification for →E in Section 2.4? We need to repeat that reasoning, suitably adapted to the case of →I′, where we may not only assert ‘A → B’ on its own, but also within a disjunctive context of parametric wffs. Suppose we have a proof of ‘A → B’. We may infer something from it, provided we can infer it from whatever justifies A → B, which is in general a derivation of B from A:

X : A → B, Z    Y, A V B : W
------------------------------
X, Y : Z, W

As before, we turn to LK for help with dismantling the right-hand premise, to obtain formulae we can handle in natural deduction:

Y′ : A, W′    Y″, B : W″
-------------------------- V-left
Y, A V B : W

Let B ∈ W ′′. We obtain the simple form ofmodus ponens, sinceB : W ′′becomes derivable by Thinning:

X : A → B, Z    Y : A, W
--------------------------- →E
X, Y : B, Z, W


Conversely, the generalized form is derivable from modus ponens and Cut:

X : A → B, Z    Y′ : A, W′
---------------------------- →E
X, Y′ : B, Z, W′                  Y″, B : W″
---------------------------------------------- Cut
X, Y : Z, W

These rules, →I′ and →E (with the rules of assumptions, Thinning and Contraction) capture the →-fragment of classical logic. The proof of Peirce’s law in the resulting system runs as follows:

                     p (1)
                     ------ Thinning
                     p, q
                     -------------- →I′ (1)
(p → q) → p (2)      p, p → q
-------------------------------- →E
p, p
----- Contraction
p
---------------------- →I′ (2)
((p → q) → p) → p

The comma here serves as a (structural) disjunction. It presents a multiple of conclusions whose disjunction has been proved, as in the interpretation of comma in the succedent of sequents in LK.

It is straightforward to show that →I′ is (classically) sound, that is, that if X ⊨ B ∨ C (i.e., ‘B ∨ C’ is true in every model of X) then X − A ⊨ (A → B) ∨ C. Thus every thesis provable by the standard rules is provable with the new ones; and every thesis provable with the new rules is classically valid.
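The routine case analysis behind this soundness claim can be set out as follows (my sketch of the argument, not the paper’s own wording):

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Assume X |= B v C and let M be any model of X - {A}.
% If M does not satisfy A, then M satisfies A -> B outright.
% If M does satisfy A, then M is a model of all of X, so M |= B v C,
% and whichever disjunct holds, M |= (A -> B) v C.
\begin{align*}
  M \not\models A &\implies M \models A \to B,\\
  M \models A &\implies M \models B \lor C \implies M \models (A \to B) \lor C.
\end{align*}
\end{document}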

The gain is that the new rules give us the full classical theory of implication, not the intuitionistic implicative fragment yielded by the standard rules. The completeness in ‘→’ is shown by deriving the single sufficient axiom ((p → q) → r) → ((r → p) → (s → p)) of Łukasiewicz and Wajsberg:27

                     p (1)
                     ------ Thinning
                     p, q
                     --------------- →I′ (1)
(p → q) → r (3)      p → q, p
------------------------------- →E
r → p (2)            r, p
------------------------------- →E
p, p
----- Contraction
p
---------- →I′
s → p
-------------------------- →I′ (2)
(r → p) → (s → p)
-------------------------------------------- →I′ (3)
((p → q) → r) → ((r → p) → (s → p))


The rules for ‘&’, ‘∨’, ‘∀’ and ‘∃’ also need extending to the multiple-conclusion formulation. The connection between the rules is repeated. For example, given:

X : A, Y    X : B, Y
--------------------- &I
X : A & B, Y

the justification of &E is as before:

X : A & B, Z    Y, A, B : W
----------------------------- &E
X, Y : Z, W

Let A ∈ W. Then Y, A, B : W is derivable by Thinning, so the right-hand premise can be removed, yielding

X : A & B, Y
--------------
X : A, Y

and the same for B.

For disjunction, the two introduction-rules:

X : A, Y                   X : B, Y
--------------- ∨I         --------------- ∨I
X : A ∨ B, Y               X : A ∨ B, Y

justify, by the familiar procedure:

X : A ∨ B, W    Y, A : Z    Y, B : Z
-------------------------------------- ∨E
X, Y : Z, W

However, we can now perform a simplification similar to that made with ‘&’. Let Y = ∅ and Z = {A, B}. We obtain:

X : A ∨ B, W    A : A, B    B : A, B
--------------------------------------
X : A, B, W

The minor premises are trivially derivable by Thinning, so we have as a special case of ∨E:

X : A ∨ B, W
-------------- ∨E′
X : A, B, W

Nonetheless, this special case suffices to derive the general rule, as for ‘&’, ‘→’ etc.:

$$\dfrac{\dfrac{\dfrac{X : A\lor B, W}{X : A, B, W}\ \lor E'\qquad Y, A : Z}{X, Y : B, Z, W}\ \text{Cut}\qquad Y, B : Z}{X, Y : Z, W}\ \text{Cut}$$


Normalization for ∨E′ is immediate:

$$\dfrac{\dfrac{X : A, Y}{X : A\lor B, Y}\ \lor I}{X : A, B, Y}\ \lor E'$$

reduces to Thinning:

$$\dfrac{X : A, Y}{X : A, B, Y}\ \text{Thinning}$$

The presence of parametric wffs in the succedent is crucial for the classical theory of ‘∀’. Reverting to the briefer notation, with the premises implicit, the ∀-introduction rule becomes:

$$\dfrac{A(u/v), X}{(\forall v)A, X}\ \forall I'$$

provided ‘u’ is free neither in the assumptions, nor in (∀v)A, nor in any wff in X. The elimination-rule again reduces to the form:

$$\dfrac{(\forall v)A, X}{A(t/v), X}\ \forall E'$$

It is now straightforward to prove (∀v)(p ∨ A) ⊢ p ∨ (∀v)A, needed for completeness:

$$\dfrac{\dfrac{\dfrac{\dfrac{\dfrac{\dfrac{(\forall v)(p\lor A)}{p\lor A(t/v)}\ \forall E\quad\dfrac{p^{1}}{p,\;A(t/v)}\ \text{Thinning}\quad\dfrac{A(t/v)^{2}}{p,\;A(t/v)}\ \text{Thinning}}{p,\;A(t/v)}\ \lor E(1,2)}{p,\;(\forall v)A}\ \forall I}{p\lor(\forall v)A,\;(\forall v)A}\ \lor I}{p\lor(\forall v)A,\;p\lor(\forall v)A}\ \lor I}{p\lor(\forall v)A}\ \text{Contraction}$$

Lastly, ∃I:

$$\dfrac{A(t/v), X}{(\exists v)A, X}\ \exists I$$

which justifies:

$$\dfrac{W : (\exists v)A, X\qquad Y, A(u/v) : Z}{W, Y : X, Z}\ \exists E$$

where Y, Z are u-free. In all these proofs, the strategy is exactly modelled on that in LK.28
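For example, the LK counterpart of the derivation of p ∨ (∀v)A from (∀v)(p ∨ A) given above runs as follows (a sketch, writing LK sequents with ‘⇒’ as in note 24, and with u an eigenvariable):

1. p ⇒ p and A(u/v) ⇒ A(u/v) (axioms)
2. p ⇒ p, A(u/v) and A(u/v) ⇒ p, A(u/v) (Thinning)
3. p ∨ A(u/v) ⇒ p, A(u/v) (∨-left)
4. (∀v)(p ∨ A) ⇒ p, A(u/v) (∀-left)
5. (∀v)(p ∨ A) ⇒ p, (∀v)A (∀-right, u not free in the conclusion)
6. (∀v)(p ∨ A) ⇒ p ∨ (∀v)A, p ∨ (∀v)A (∨-right, twice)
7. (∀v)(p ∨ A) ⇒ p ∨ (∀v)A (Contraction)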


3.3. Classical Negation

We can conservatively extend the above positive system (in ‘&’, ‘∨’, ‘→’, ‘∃’ and ‘∀’) to full classical logic simply by adding so-called “intuitionistic” negation.29 ¬A is defined as A → ⊥, and the sole rule for ‘⊥’ simply generalizes what we had before:

$$\dfrac{X : \bot, Y}{X : A, Y}\ \bot E$$

It permits us to prove the other axiom in Wajsberg’s formulation, ⊥ → p.30 So the calculus yields full classical logic. Classical reductio, (¬p → p) → p, is just a special case of Peirce’s law. But we now see that defining ‘¬A’ as ‘A → ⊥’ is not a specially “intuitionistic” idea. It is simply negation. Whether ‘A → ⊥’ is “intuitionistic” or “classical” depends on one’s theory of ‘→’. The theory of ‘→’ in NC is classical, so ‘¬’, so defined, is classical too.
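For the record, the derivation of ⊥ → p is immediate (a sketch in the briefer notation):

$$\dfrac{\dfrac{\bot^{1}}{p}\ \bot E}{\bot\to p}\ {\to}I'(1)$$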

Double negation elimination, ¬¬p → p, is derivable as follows:

$$\dfrac{\dfrac{\dfrac{\dfrac{\lnot\lnot p^{2}\quad\dfrac{\dfrac{p^{1}}{\bot,\;p}\ \text{Thinning}}{\lnot p,\;p}\ {\to}I'(1)}{\bot,\;p}\ {\to}E}{p,\;p}\ \bot E}{p}\ \text{Contraction}}{\lnot\lnot p\to p}\ {\to}I'(2)$$

Alternatively, one can add ‘¬’ as primitive,31 for example, if one wants to develop the ‘→’-free fragment, or for its own sake. The introduction-rule will be:

$$\dfrac{X, A : \bot, Y}{X : \lnot A, Y}\ \lnot I$$

taking the lead from →I′ and the previous definition of ‘¬’ (‘¬’ is not now a defined symbol, but primitive). But note that ⊥, Y is derivable if and only if Y is (⊥, Y follows from Y by Thinning, and Y follows from ⊥, Y by ⊥E and Contraction). So we obtain:

$$\dfrac{X, A : Y}{X : \lnot A, Y}\ \lnot I$$

Gentzen’s remark then leads to the following E-rule:

$$\dfrac{X : \lnot A, Z\qquad Y, A\mathbin{V}\bot : W}{X, Y : Z, W}\ \lnot E'$$


Once again, we exploit the V-left rule from sequent calculus to analyse the right-hand premise:

$$\dfrac{Y' : A, W'\qquad Y'', \bot : W''}{Y, A\mathbin{V}\bot : W}\ V\text{-left}$$

Y′′, ⊥ : W′′ is derivable from ⊥E. We obtain:

$$\dfrac{X : \lnot A, Z\qquad Y : A, W}{X, Y : Z, W}\ \lnot E$$

¬I and ¬E are clearly (classically) sound. Moreover, ¬I and ¬E normalize, as they should if we have followed Gentzen’s instructions properly:

$$\dfrac{\dfrac{X, A : Z}{X : \lnot A, Z}\ \lnot I\qquad Y : A, W}{X, Y : Z, W}\ \lnot E$$

reduces to

$$\dfrac{Y : A, W\qquad X, A : Z}{X, Y : Z, W}\ \text{Cut}$$
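The →-rules reduce in exactly parallel fashion (a sketch, using the simple form of →E given earlier):

$$\dfrac{\dfrac{X, A : B, Z}{X : A\to B, Z}\ {\to}I'\qquad Y : A, W}{X, Y : B, Z, W}\ {\to}E$$

reduces to

$$\dfrac{Y : A, W\qquad X, A : B, Z}{X, Y : B, Z, W}\ \text{Cut}$$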

Finally, ¬I and ¬E are complete for classical logic, for ¬¬p ⊢ p is proved as follows:

$$\dfrac{\lnot\lnot p\quad\dfrac{p^{1}}{\lnot p,\;p}\ \lnot I(1)}{p}\ \lnot E$$

Boricic (1985, p. 375) shows that the calculus is normalizable. In fact, normalization should not be a surprise. For it was with normalization in mind that we designed the E-rule for →. We looked at the I-rule for ‘→’ and considered what E-rule it justified. Normalization simply confirms that we carried out that exercise correctly. The surprise (and feeling of satisfaction) comes rather in the fact that these rules are complete – that is, complete for the full classical theory of disjunction and implication. In NK as usually formulated, we cannot achieve completeness with the standard ‘∨’ and ‘→’-rules without essential, non-normalizable use of the ‘¬’-rules. We have now removed the need for that lack of harmony – that discordance, we might say.


4. CONCLUSION

NC is a complete and harmonious natural deduction proof-system for full classical logic. Dummett and Prawitz criticized classical logic for failing to meet their demand for harmony between the introduction- and elimination-rules; Dummett went further in requiring that the rules be conservative. I argued in Section 1.3 that the latter requirement cannot be justified; nonetheless, the above system satisfies it. Prawitz claimed that there was no way of justifying the rule of double negation elimination. This we see is not true. Preserving harmony and normalization, we can capture the classical theory of ‘→’, ‘∨’ and ‘¬’ with a natural deduction system. In consequence, we can absolve classical negation from the opprobrium heaped upon it by Dummett and Prawitz. Classical negation only produces an inharmonious extension of implicational logic when added to a constructivist and inadequate theory of implication. When treated in their proper context, those positive implicational theses are already provable in a properly classical implicational logic. Moreover, we now see that full classical logic, including the theory of negation, has a pleasing natural deduction formulation, normalizable in the same way that its sequent calculus presentation permits Cut elimination. Classical logic meets the correct demands on an autonomous logic.

This shows, in fact, that NK as formulated by Gentzen and developed by Prawitz and others is deeply misleading about the difference between classical and intuitionistic logic. Its formulation suggests that the two systems agree on disjunction, the conditional and so on, differing only in their treatment of negation. But NC shows that formally, the calculi can be made to agree on disjunction and negation and disagree on the conditional. The right conclusion is that the formalism does not reveal the true nature of the disagreement between classicist and intuitionist, and in particular, harmony and autonomy are interesting properties open to both.

Of course, the constructivist can still mount a challenge to classical logic. But we now see where that challenge should be concentrated – and where it is misguided. The proper challenge is to Bivalence, and to the classical willingness to assert disjunctions, neither of whose disjuncts is separately justified, simply on the basis that either some proposition is true – in which case one pole of the disjunction is true – or it is false – in which case the other pole obtains. For why is the generalized →I-rule (→I′) given above classically sound? Suppose we know that ‘A → (B ∨ C)’ is true, that is, we are in a position to assert ‘B ∨ C’ conditionally on A. Then either A is false, in which case ‘A → B’, and so ‘(A → B) ∨ C’ are true; or ‘B ∨ C’ is true, in which case, either B is true, whence ‘A → B’ and so


‘(A → B) ∨ C’ are true; or C is true, and so once again, ‘(A → B) ∨ C’ is true. The justification depends crucially on Bivalence. Reliance on it allows the classical logician to assert a disjunction when he is in no position to assert either of its disjuncts. This is the basis of constructivist objections to multiple-conclusion logic, that it permits the manipulation of wffs within a disjunctive context in such a way as to produce assertible disjunctions neither of whose disjuncts is assertible. It is on the classical logician’s willingness to do that, and not on claims that the classical theory somehow imports non-logical (metaphysical) presumptions, that constructivist objections should concentrate.

NOTES

1 Gentzen, 1969, p. 80.
2 Prawitz, 1973, p. 226.
3 Prior, 1960–61, p. 38.
4 Tarski, 1956, p. 64.
5 Sundholm, 1981.
6 Belnap, 1961, pp. 132–133.
7 Dummett, 1991, p. 247.
8 For further comments on uniqueness, see Milne, 1998.
9 Dummett, 1991. Note that all page references without other attribution will be to this work.
10 Prawitz, 1994, p. 374.
11 Shapiro, 1998, §3.
12 Troelstra and Schwichtenberg, 1996, p. 66.
13 Cf. Read, 1994, pp. 260–261.
14 Prawitz, 1965, Ch. II.
15 Cf. Prawitz, 1965, p. 23.
16 A V B is denoted B(x)[x ∈ A], where B is not dependent on x, by Martin-Löf: see, e.g., Nordström et al., 1990, Ch. 7.
17 Where there are countably infinitely many premises, this rule is often called the ω-rule. See, e.g., Isaacson, 1992, §2 and Carnap, 1937, §14.
18 See, e.g., Prawitz, 1965, p. 37.
19 Prawitz, 1973, p. 243.
20 Note that the argument generates essentially the same elimination-rule for any constant, regardless of its adicity, which is given no introduction-rule. Suppose δ is such a constant. Then the elimination-rule will read
$$\dfrac{X : \delta A}{X : B}\ \delta E,$$
that is, anything follows from δA (where δA is any wff A with main connective δ) since it follows from whatever justifies assertion of δA (viz nothing). Thus δA is always false.
21 Similar reflections reveal the restrictions on □I and the dependence of □E on □I.


22 The example was raised by a referee.
23 Prawitz, 1977, p. 34.
24 Here and below, I take sequents of LK to be pairs of multi-sets of formulae, using ‘⇒’ as the separator of antecedent and succedent.
25 Shoesmith and Smiley, 1978, p. 4.
26 Cf. Ungar, 1992, p. 56.
27 See Church, 1956, pp. 140, 159.
28 See Boricic, 1985, pp. 369 ff.
29 Cf. Ungar, 1992, p. 17.
30 Church, loc. cit.
31 As does Boricic, p. 367.

REFERENCES

Belnap, N. (1961–62): Tonk, plonk and plink, Analysis 22, 30–34.
Boricic, B. (1985): On sequence-conclusion natural deduction systems, J. Philos. Logic 14, 359–377.
Carnap, R. (1937): The Logical Syntax of Language, Routledge, London.
Church, A. (1956): Introduction to Mathematical Logic, Princeton University Press, Princeton.
Dummett, M. (1991): The Logical Basis of Metaphysics, Harvard University Press, Cambridge, Mass.
Gentzen, G. (1969): Untersuchungen über das logische Schliessen, in The Collected Papers of Gerhard Gentzen, tr. M. Szabo, North-Holland, Amsterdam.
Hacking, I. (1979): What is logic?, J. Philos. 76, 285–319.
Isaacson, D. (1992): Some considerations on arithmetical truth and the ω-rule, in M. Detlefsen (ed.), Proof, Logic and Formalization, Routledge, London.
Milne, P. (1998): Disjunction and disjunctive syllogism, Canad. J. Philos. 28, 21–32.
Nordström, B., Petersson, K. and Smith, J. (1990): Programming in Martin-Löf’s Type Theory, Clarendon Press, Oxford.
Prawitz, D. (1965): Natural Deduction, Almqvist and Wiksell, Stockholm.
Prawitz, D. (1973): Towards a foundation of general proof theory, in P. Suppes et al. (eds), Logic, Methodology and Philosophy of Science, IV, North-Holland, Amsterdam.
Prawitz, D. (1977): Meaning and proofs: On the conflict between intuitionistic and classical logic, Theoria 43, 2–40.
Prawitz, D. (1994): Review of M. Dummett, ‘The Logical Basis of Metaphysics’, Mind 103, 373–376.
Prior, A. (1960–61): The runabout inference ticket, Analysis 21, 38–39.
Read, S. (1992): Conditionals are not truth-functional: An argument from Peirce, Analysis 52, 5–12.
Read, S. (1994): Formal and material consequence, J. Philos. Logic 23, 247–265.
Shapiro, S. (1998): Induction and indefinite extensibility: The Gödel sentence is true, but did someone change the subject?, Mind 107, 597–624.
Shoesmith, D. and Smiley, T. (1978): Multiple-Conclusion Logic, Cambridge University Press, Cambridge.
Sundholm, G. (1981): Hacking’s logic, J. Philos. 78, 160–168.


Tarski, A. (1956): Fundamental concepts of the methodology of the deductive sciences, in Logic, Semantics, Metamathematics, ed. and tr. J. Woodger, Clarendon Press, Oxford, pp. 60–109.
Troelstra, A. and Schwichtenberg, H. (1996): Basic Proof Theory, Cambridge University Press, Cambridge.
Ungar, A. M. (1992): Normalization, Cut-Elimination and the Theory of Proofs, CSLI, Stanford.
von Kutschera, F. (1962): Zum Deduktionsbegriff der klassischen Prädikatenlogik erster Stufe, in Max Käsbauer and F. von Kutschera (eds), Logik und Logikkalkül, Alber, Freiburg, pp. 211–236.

Department of Logic and Metaphysics, University of St Andrews, Fife KY16 9AL, Scotland, U.K. (e-mail: [email protected])