PhilMath Intersem 2, 2011
Simplicity/Complexity of Proof
June 2–June 30, 2011
Description
Simplicity and economy of thinking have been perennial concerns of mathematicians. In this seminar we will focus on issues concerning simplicity or economy of thinking in proof.
We will be particularly concerned with the following questions:
I. What types of simplicity/complexity concerning proofs have mathematicians found most significant, and why?
II. In what specific ways (i.e. by means of what specific practices) have mathematicians pursued such economies?
Particular attention will be paid to the general practice of introducing ideal elements as a means of achieving thought-economies of various types. Here some more particular concerns will be:
III. To find important historical examples of economies of thinking that have been achieved through the introduction of ideal elements/methods.
IV. To come to a clear understanding of what the benefits of such economies are.
Featured Speakers

Andrew Arana (Kansas State U)

Jeremy Avigad (Carnegie Mellon U)

Matthias Baaz (Technical U Vienna)

Guillaume Burel (ENSIIE)

Sam Buss (University of California, San Diego)

Pierre Cartier (IHES)

Mic Detlefsen (U of Notre Dame)

Gilles Dowek (INRIA, Paris)

Kevin Kelly (Carnegie Mellon U)

Jeffrey Ketland (Ludwig-Maximilians-U)

Timothy McCarthy (U of Illinois–Urbana/Champaign)

Richard Pettigrew (U of Bristol)

Michael Rathjen (U of Leeds)

John Stillwell (U of San Francisco & Monash U)
Locations of Meetings

Meeting 1: Université de Paris 7–Diderot, Rive Gauche Campus, Bâtiment Condorcet, salle Klee (454A)

Meetings 2–3: Université de Nancy 2, Laboratoire d’Histoire des Sciences et de Philosophie – Archives Henri Poincaré, 91 avenue de la Libération (3rd floor)

Meetings 4–10: Université de Paris 7–Diderot, Rive Gauche Campus, Bâtiment Condorcet, salle Klimt (366A)
>> Maps & Info for Paris-Diderot, Rive Gauche Campus.
>> Maps & Info for Nancy 2 Campus.
Schedule of Meetings
Meeting 1: June 2, 14h00–18h00. Paris, P7, Bâtiment Condorcet, salle Klee (454A) [readings] [abstracts]

Andrew Arana, “Toward a measure of the difficulty of following proofs”

Timothy McCarthy, “Normativity and Mechanism”

Mic Detlefsen, “Complexities of Proof”
Meeting 2: June 6, 14h00–18h00. Nancy, N2, 91 avenue de la Libération (3rd floor), room 326 [readings] [abstracts]

John Stillwell, “Ideal Elements in Geometry”
Meeting 3: June 8, 14h00–18h00. Nancy, N2, 91 avenue de la Libération (3rd floor), room 326 [readings] [abstracts]

John Stillwell, “Ideal Elements in Number Theory, Analysis, and Algebraic Geometry”
Meeting 4: June 10, 14h00–18h00. Paris, P7, Bâtiment Condorcet, salle Klimt (366A) [readings] [abstracts]

Richard Pettigrew, “Ketland and the epistemological consequences of speedup”

Jeff Ketland, “Speed-Up, Explanation and Indispensability”
Meeting 5: June 14, 14h00–18h00. Paris, P7, Bâtiment Condorcet, salle Klimt (366A) [readings] [abstracts]

Kevin Kelly, “A Topological Theory of Simplicity and its Connection with Truth”

Pierre Cartier, “Monstrous Proofs”
Meeting 6: June 16, 14h00–18h00. Paris, P7, Bâtiment Condorcet, salle Klimt (366A) [readings] [abstracts]

Gilles Dowek, “What could and what should a notion of proof complexity measure?”

Guillaume Burel, “The Impact of theories on proof complexity”
Meeting 7: June 20, 14h00–18h00. Paris, P7, Bâtiment Condorcet, salle Klimt (366A) [abstracts]

Michael Rathjen, “On the (unreasonable?) effectiveness of ideal elements”
Meeting 8: June 23, 14h00–18h00. Paris, P7, Bâtiment Condorcet, salle Klimt (366A) [abstracts]

Matthias Baaz, “A logical view on the simplicity of proofs”
Meeting 9: June 27, 14h00–18h00. Paris, P7, Bâtiment Condorcet, salle Klimt (366A) [abstracts]

Sam Buss, “Expressibility and Derivability, and the Complexity of Proofs”
Meeting 10: June 30, 14h00–18h00. Paris, P7, Bâtiment Condorcet, salle Klimt (366A) [readings] [abstracts]

Jeremy Avigad, “Simplicity, extensionality, and functions as objects”
Readings

Timothy McCarthy, “Normativity and Mechanism” [Lecture notes]
Hilary Putnam, "Minds and Machines", in Mind, Language and Reality, Philosophical Papers, v. 2

Mic Detlefsen, “Complexities of Proof”
Lewis Feuer, "The Principle of Simplicity", Philosophy of Science 24 (1957): 109–122
Ernst Mach, "The Economy of Science", in The Science of Mechanics
Ernst Mach, "The Economical Nature of Physical Inquiry", in Popular Scientific Lectures
Rüdiger Thiele, "Hilbert's Twenty-Fourth Problem", American Mathematical Monthly 110.1 (2003): 1–24

John Stillwell, “Ideal Elements in Geometry”
David Hilbert: Foundations of Geometry, any edition (e.g. Open Court 1971, or the French edition by Dunod 1971)
David Hilbert: Lectures on the Foundations of Geometry 1891–1902 (Springer 2004).

John Stillwell, “Ideal Elements in Number Theory, Analysis, and Algebraic Geometry”
David Hilbert: Foundations of Geometry, any edition (e.g. Open Court 1971, or the French edition by Dunod 1971)
David Hilbert: Lectures on the Foundations of Geometry 1891–1902 (Springer 2004).

Richard Pettigrew, “Ketland and the epistemological consequences of speedup”

Jeff Ketland, “Speed-Up, Explanation and Indispensability”

Kevin Kelly, “A Topological Theory of Simplicity and its Connection with Truth”
Kevin T. Kelly (2010) "Simplicity, Truth, and Probability", in Handbook on the Philosophy of Statistics, Prasanta Bandyopadhyay and Malcolm Forster, eds., Dordrecht: Elsevier. Preprint at: http://www.andrew.cmu.edu/user/kk3n/ockham/prasantasubmitfinal.pdf
Kevin T. Kelly (2008) “Ockham’s Razor, Truth, and Information”, in Handbook of the Philosophy of Information, J. van Benthem and P. Adriaans, eds., Dordrecht: Elsevier. Preprint at: http://www.hss.cmu.edu/philosophy/kelly/papers/kellyinfo14.pdf
Kevin T. Kelly (2007) “Ockham’s Razor, Empirical Complexity, and Truth-finding Efficiency”, Theoretical Computer Science, 383: 270–289. Preprint at: http://www.hss.cmu.edu/philosophy/kelly/papers/burginfixed4.pdf
Kevin T. Kelly (2007) “Simplicity, Truth, and the Unending Game of Science”, in Infinite Games: Foundations of the Formal Sciences V, S. Bold, B. Löwe, T. Räsch, and J. van Benthem, eds., Roskilde: College Press, 223–270. Preprint: http://www.hss.cmu.edu/philosophy/kelly/papers/bonnfinal.pdf

Gilles Dowek, “What could and what should a notion of proof complexity measure?”
G. Dowek and B. Werner, Arithmetic as a theory modulo, in J. Giesl (Ed.), Term Rewriting and Applications, Lecture Notes in Computer Science 3467, Springer-Verlag, 2005, pp. 423–437.
G. Dowek, On the convergence of reduction-based and model-based methods in proof theory, Logic Journal of the Interest Group in Pure and Applied Logic, 2009.

Guillaume Burel, “The Impact of theories on proof complexity”
Gilles Dowek, Thérèse Hardin, and Claude Kirchner, Theorem proving modulo, Journal of Automated Reasoning, 31, 2003, pp. 33–72. http://www.loria.fr/~ckirchne/=tpm/tpm.pdf
Guillaume Burel. Efficiently simulating higher-order arithmetic by a first-order theory modulo. Logical Methods in Computer Science, 7(1:3):1–31, 2011. http://arxiv.org/pdf/0805.1464

Matthias Baaz: “A logical view on the simplicity of proofs”
1. M. Baaz, "Note on the generalization of calculations", Theoretical Computer Science 244(1–2) (1999): 3–11
2. M. Baaz, P. Wojtylak, "Generalizing proofs in monadic languages", APAL 154(2) (2008): 71–138 (including a postscript by G. Kreisel)
3. J. Krajíček, P. Pudlák, "The number of proof lines and the size of proofs in first-order logic", Arch. Math. Logic 27 (1988): 69–84

Jeremy Avigad, “Simplicity, extensionality, and functions as objects”
See, in particular, section 4.3. My talk will expand on some of these ideas.
On the historical side, I'll review some of Frege's comments on the distinction between function and object (in the Grundlagen, "Concept and Object," and the Grundgesetze).
I will also discuss Dirichlet's proof that there are infinitely many primes in any arithmetic progression whose first term and common difference are coprime, as it evolved from Dirichlet through Dedekind, de la Vallée Poussin, and Hadamard. I'll focus on the treatment of Dirichlet characters, summations over those characters, and so on. There is a modern version of the proof here: http://people.csail.mit.edu/kuat/courses/dirichlet.pdf
Dirichlet's original paper is here: http://bibliothek.bbaw.de/bibliothekdigital/digitalequellen/schriften/anzeige?band=07abh/1837&seite:int=00000286
There is a translation here: http://arxiv.org/abs/0808.1408
John Stillwell has translated Dedekind's writeup of Dirichlet's lectures on number theory, http://www.amazon.com/LecturesNumberTheoryHistoryMathematics/dp/0821820176
I will discuss Supplement VI, which has a version of the proof, as well as later versions by de la Vallée Poussin and Hadamard.
Participation Welcome
All interested persons are encouraged to attend and participate. If you have questions, please contact Mic Detlefsen at mdetlef1@nd.edu or Andrei Rodin at rodin@ens.fr.
Sponsorship
PhilMath Intersem 2011 is jointly sponsored by:

The University of Notre Dame

The Université de Paris 7–Diderot

The Ecole normale supérieure (ENS)

The Université de Paris 1 (IHPST)

The Université de Nancy 2 (Archives Henri Poincaré & MSH Lorraine)

The Ideals of Proof (IP) project of the ANR
Abstracts

Andrew Arana: “Toward a measure of the difficulty of following proofs”
Abstract: Proofs that seem convoluted or “knotty” are familiar from experience. Think of the usual proof of Tychonoff’s theorem in general topology, for instance, or priority arguments in computability theory. It is normal practice to seek less convoluted proofs of results already proved. One key reason to seek more “streamlined” proofs is that they are easier to follow. Standard proof-theoretic measures of proof complexity fail to measure this ease, since the number of symbols or lines in a proof by itself gives no information about how knotty or streamlined a proof is. In this talk we set out toward a theoretical account of measuring the difficulty of following proofs.

Timothy McCarthy: “Normativity and Mechanism”
Abstract: I shall discuss Gödel’s remarks about the limits of the mechanizability of mathematical proof in the Gibbs Lecture (1951). Central to the discussion are different answers to the question of in what way mathematical proof is algorithmically representable. Normative constraints on algorithmic representability in terms of a static formal theory lead to a weak form of the Lucas-Penrose thesis. However, Gödel’s later conception of axiomatic evolution requires the notion of algorithmic representation for our theorem-proving capacities to be recast in dynamic terms. A general class of such dynamic models is introduced which makes it possible to raise, and answer in precise terms, a variety of questions about the limits of algorithmic representability. Gödel distinguishes, at least implicitly, apodictic proofs from a more general class of proofs. Apodictic proofs arise by evident inferences from evident axioms; in the more general class of proofs, Gödel allows a role for axioms which are not apodictically evident. The dynamic models referred to above are required to describe apodictic as well as non-apodictic proofs, for it is argued that an inference can be both apodictic and, in a certain sense, ampliative. An effective dynamic model of apodictic provability is introduced which shows how the class of apodictically provable arithmetical sentences might be c.e. (thus illustrating another claim Gödel makes in the Gibbs Lecture); but it is shown that the wider class of provable number-theoretic statements cannot be captured by any effective model of this type.

Mic Detlefsen: “Complexities of Proof”
Abstract: In this talk I will identify and compare/contrast different types of complexity that concern proof. Among the types of complexity I’ll consider are verificational complexity (i.e., complexity concerning the recognition of a given or presented item as a proof) and discovermental complexity (i.e., complexity involved in finding a proof of a given theorem (or a solution to a given problem) ab initio or “from scratch”, as it were). If time permits, I’ll also say something about complexities that may differentially affect our ability to identify something as a proof and our ability to derive other epistemic benefits from it.

John Stillwell: “Ideal Elements in Geometry”
Abstract: The “vanishing points” used by Italian artists of the 15th century led to the discovery of projective geometry and the powerfully simplifying idea of “points at infinity”. In this talk we trace the spread of this idea to other areas of geometry, such as non-Euclidean geometry and topology. The simplifying power of other ideal elements in geometry, such as “imaginary points,” will also be discussed.

John Stillwell: “Ideal Elements in Number Theory, Analysis, and Algebraic Geometry”
Abstract: The concept of “ideal element” migrated from geometry to number theory in the 1840s with Kummer’s introduction of “ideal numbers.” Shortly afterward, Riemann discovered that the study of complex functions is simplified by completing the plane of complex numbers by a point at infinity. These two threads were drawn together in Dedekind’s theory of ideals, and the result of their combination was modern algebraic geometry.

Richard Pettigrew: “Ketland and the epistemological consequences of speedup”
Abstract: In “Some more curious inferences”, Ketland argues that there are inferences that can be stated in nominalistically acceptable language, but which cannot be derived in first-order logic in a feasible number of steps without introducing numbers or sets or some other nominalistically unacceptable objects; moreover, these are inferences that we all take ourselves to be justified in making. I read this as a new form of indispensability argument for mathematics: the claim is not that mathematical language is indispensable for the formulation of our best scientific theories, as in the Quine-Putnam indispensability argument; it is rather that knowledge of mathematical objects is indispensable for the justification of many of our everyday inferences. I consider the various ways in which the original indispensability argument fails, and ask whether similar problems arise for Ketland’s new version. I conclude that, while Ketland’s argument is a marked improvement on the original, there are still serious problems.

Jeffrey Ketland: “Speed-Up, Explanation and Indispensability”
Abstract: In a 1936 note, Gödel announced the result that, for each n > 0, (n + 1)th order arithmetic exhibits “speed-up” over its predecessor. For certain classes of theorems, moving up an order significantly reduces the size of the shortest proof of the theorem. A standard example is the speed-up of ACA_{0} over PA. In 1987, Boolos provided an interesting example of a first-order logical truth whose proof, in a standard deductive system, is astronomically long, but whose proof in a deductive system for second-order logic fills only a page or so.
Another kind of example involves Field’s 1980 analysis of the applicability of mathematics to reasoning: Field’s conservativeness result says that, for “nominalistic” sentences A, B, if there is a proof P of B in “mathematics + A”, there is a proof P′ of B from A alone. But P′ will typically be much longer than P. In Ketland 2005, I mentioned cases of shortening proofs of logical truths expressed using (finite) cardinality quantifiers. Intuitively, the picture is that moving to a higher level of abstraction shortens and simplifies proofs, possibly turning infeasible proofs into feasible ones. The phenomenon of incompleteness is, in a sense, a limiting case of the phenomenon of speed-up. And a good deal more is now known about the general phenomenon of speed-up (Buss 1994).
What might the epistemological significance of this be? I will argue that it reinforces the usual indispensability arguments against nominalism. One sometimes cannot feasibly demonstrate a first-order formula to be valid unless one uses mathematics somehow (e.g., prove A in higher-order logic, or prove “A is valid” using mathematics, and then appeal to some soundness property of the higher-order logic or the mathematics). Moreover, it seems clear that, in certain cases, the mathematical proof explains the validity of the formula in a way that the (astronomically long) first-order derivation doesn’t: for the validity of a formula may encode a mathematical principle or perhaps the non-halting of a certain Turing machine. The only plausible nominalistic response seems to invoke a modal claim that one “could” have given a concrete token derivation of A, given enough paper and time. But one’s sole justification for this modal claim about proof-tokens is based on the soundness (albeit only Σ_{1}-soundness) of the mathematical reasoning used.

Kevin Kelly: “A Topological Theory of Simplicity and its Connection with Truth”
Abstract: This talk presents a theory of empirical simplicity developed in terms of topological and computational concepts. The theory provides a (weak) sense in which favoring simple hypotheses can be demonstrated to advance the search for truth, without falling back on a (circular) appeal to a bias in favor of simplicity. According to the proposed account, simplicity, like computational complexity, is highly sensitive to the structure of the question at hand. In particular, simplicity reflects nested topological structure within the boundaries of answers to an empirical question. This nested boundary structure provides Nature with a strategy to force an arbitrary, convergent scientist to produce a sequence of theories prior to converging to the truth, and empirical simplicity reflects the order of such sequences. It follows that scientists who always favor simple theories change their minds, in the worst case, no more than arbitrary, convergent scientists, so they are efficient. It can also be shown that scientists who fail to favor simple theories can be forced, in the worst case, to change their minds even more, so a taste for simplicity is also necessary for efficiency. Furthermore, the approach is built on concepts that have a natural interpretation within the theory of computability, so it applies to formal inquiry as well as to empirical inquiry, albeit not as crisply as in the empirical case. I will discuss such applications and some questions they raise about the role of simplicity in mathematics.

Pierre Cartier: “Monstrous Proofs”
Abstract: TBA

Gilles Dowek: “What could and what should a notion of proof complexity measure?”
Abstract: Several notions of proof complexity have been proposed. Some view proofs as derivations in a formal system and count the number of lines, or symbols, in the derivation. Others use a more abstract notion of proof, seen as any algorithmically checkable certificate, and combine the size of the certificate with the complexity of the checking algorithm. Both types of notions assign a higher complexity to the obvious proof of 1000 × 1000 = 1000000 than to the proof of 2 × 2 = 4, although both proofs are mere computations. We investigate another notion of proof complexity, based on the number of non-deterministic choices required to build the proof.

Guillaume Burel: “The Impact of theories on proof complexity”
Abstract: Gödel’s famous theorem implies that there exist formulas that are provable in second-order arithmetic but not in first-order arithmetic. But even for formulas that are provable in first-order arithmetic, arbitrarily smaller proofs can be obtained by using second-order arithmetic. The minimal length of the proofs of a given formula therefore depends on the theory in which the proofs are done. We investigate whether it depends on the way the theory is presented, or whether there is a notion of proof complexity that is intrinsic to the theory. In particular, we show how to present second-order arithmetic as a finite first-order theory without increasing the length of proofs.

Michael Rathjen: “On the (unreasonable?) effectiveness of ideal elements”
Abstract: What Hilbert called the method of ideal elements has a bearing on mathematics in several respects. It appears to be vital for the progress of mathematics. It raises questions of consistency, conservativity and speedup. It even seems to be closely linked with our capacity for doing mathematics. In the talk I shall touch on several of these issues.

Matthias Baaz: “A logical view on the simplicity of proofs”
Abstract: We will start the lecture by discussing the most significant views on the simplicity of proofs: the mathematical, metamathematical, and logical views. (Consider Euclid’s proof of the existence of infinitely many primes: mathematically, this proof is considered to be simpler than, e.g., the proof using Fermat numbers, and is sometimes even considered the simplest proof. Metamathematically, the proof using open induction and the reflection principle is simpler. Logically, Euclid’s proof might be simple or not, depending on the representation.)
In the second part of the lecture we will discuss effective and non-effective logical notions of simplicity. (A representation of proofs is effective if any proof can always be effectively reconstructed from the information given.) Interestingly, it turns out that modern mathematics can be associated more with non-effective logical notions of the simplicity of proofs.

Sam Buss: “Expressibility and Derivability, and the Complexity of Proofs”
Abstract: It is a general principle that the power of a formal proof system can be directly tied to the expressive power of its formulas. However, there are notable exceptions, and a number of important related open questions. Indeed, there are proof systems that can express the concepts needed for the proof of a theorem, but still cannot prove, or are not known to prove, the theorem efficiently. Conversely, it can happen that a proof system is unable to express the concepts that are seemingly needed for a theorem, but is still able to state and efficiently prove the theorem. We discuss examples of these, along with related questions about the complexity of proofs. We discuss mostly weak proof systems, including propositional logic, but also examples in first-order logic. Particular attention will be paid to open problems.

Jeremy Avigad: “Simplicity, extensionality, and functions as objects”
Abstract: This talk will consist of two parts. In the first part, I will highlight the importance of developing better measures of simplicity and complexity in the philosophy of mathematics. I will argue that such measures are needed not only to address a wide range of issues in the broader epistemology of mathematics, but also to address traditional metaphysical concerns regarding the existence of mathematical objects and the justification of mathematical knowledge claims.
In the second part, I will discuss current work by Rebecca Morris, which explores the evolution of the treatment of certain types of functions (specifically, Dirichlet characters and L-series) in nineteenth-century number theory, against the backdrop of Fregean concerns about the nature of functions as objects. In particular, I will explain how not only Frege but also mathematicians including Dirichlet, Dedekind, Hadamard, and de la Vallée Poussin were reacting to pragmatic, methodological pressures (including demands of simplicity!) that push towards a reification of the function concept.
Student Participants

Sylvain Cabanacq (ENS, Paris 7)

Jordan Corwin (Notre Dame)

Davide Crippa (Paris 7)

Jules-Henri Greber (Nancy 2)

Emmylou Haffner (Paris 7)

Roman Ikonicoff (Paris 7)

Ramzi Kebaili (Paris 1 & Paris 7)

Graham Leach-Krouse (Notre Dame)

Rebecca Morris (Carnegie Mellon)

Alberto Naibo (Paris 1)

Mattia Petrolo (Paris 7)

Chris Porter (Notre Dame)

Pablo Ruiz (Notre Dame)

Monica Solomon (Notre Dame)

Catharine St. Croix (Minnesota)

David Thomasette (Nancy 2)
Other Participants

Andrew Arana (Kansas State)

Cyrille Imbert (Nancy 2)

Jean-Yves Marion (LORIA, Nancy)

Timothy McCarthy (Illinois–U/C)

David Rabouin (CNRS; Rehseis/Sphere)

Ivahn Smadja (Paris 7; Rehseis/Sphere)

Frédérick Tremblay (Nancy 2)

Sean Walsh (Birkbeck)