Christopher Scambler (NYU), "The Philosophical Significance of Semi-Constructive Set Theory"


In recent work, Solomon Feferman has revived interest in set theories employing mixed classical and intuitionistic logics. From a philosophical point of view, Feferman sees such theories as sources of evidence capable of contributing to philosophical debate on the determinacy of problems like the Continuum Hypothesis. In this paper, I argue that Feferman’s philosophical motivation for semi-constructive set theory is ill-suited to his philosophical project; I then offer an alternative motivation, argue for its superiority, and re-assess the philosophical significance of Feferman’s semi-constructive set theories from the new perspective thus attained. I begin in sec. 2 by presenting the axioms for the basic system SCS of Semi-Constructive Set theory. In sec. 3 and sec. 4, I present and criticise Feferman’s philosophical motivation for SCS and its natural extensions, and consider his views on the philosophical significance of those theories. In sec. 5 and sec. 6 I present an alternative motivation for SCS that does not succumb to the difficulties seen to arise for Feferman’s motivation in sec. 4; finally in sec. 7, I reassess the philosophical significance of the system SCS on the basis of its new philosophical motivation, paying special attention to the role the theory might play in an approach to truth and ontology in set theory wherein there is a unique universe of sets in which truth is only partial.


Douglas Jesseph (USF), "Evolving Conceptions of Rigor in Seventeenth-Century Mathematics"


This talk is concerned with a cluster of developments in 17th-century mathematics that involve criteria for rigorous demonstration. I argue that the conception of rigor inherited from classical sources stressed the role of “perfect” demonstrations from “true causes”, a requirement that was difficult to reconcile with both infinitesimal methods and the application of algebra to geometry. After a brief exposition of the classical notion of rigor, I outline various ways in which the work of Descartes, Wallis, Barrow, Cavalieri, and Roberval led to disputes over just what counted as a truly rigorous derivation of a result. In the end, these controversies led to conceptions of rigor that largely abandoned the notion of mathematical causality, even as authors stressed their supposed fidelity to classical models of mathematical demonstration.


Michael Stoeltzner (U of South Carolina), "Hilbert's Axiomatic Method and Carnap's General Axiomatics"


This paper compares the axiomatic method of David Hilbert and his school with Rudolf Carnap's general axiomatics, which was developed in the late 1920s and influenced his understanding of the logic of science throughout the 1930s, when his logical pluralism developed. The distinct perspectives become visible most clearly in how Richard Baldus, along the lines of Hilbert, and Carnap and Friedrich Bachmann analyzed the axiom system of Hilbert's Foundations of Geometry---the paradigmatic example for the axiomatization of science. Whereas Hilbert's axiomatic method started from a local analysis of individual axiom systems, in which the foundations of mathematics as a whole entered only when establishing the system's consistency, Carnap and his Vienna Circle colleague Hans Hahn instead advocated a global analysis of axiom systems in general. A primary goal was to evade, or formalize ex post, mathematicians' 'material' talk about axiom systems, since such talk was held to be error-prone and susceptible to metaphysics.


Alex Paseau (U of Oxford), "What's the point of complete rigour?"


Complete inferential rigour is achieved by breaking down arguments into steps that are as small as possible: inferential ‘atoms’. For example, a mathematical or philosophical argument may be made completely inferentially rigorous (‘atomised’) by decomposing its inferential steps into the type of step found in a natural deduction system. It is commonly thought that atomisation, paradigmatically in mathematics but also more generally, is pro tanto epistemically valuable. My talk will investigate what the epistemic value of atomising an argument might be.


Matthias Jenny (MIT), "The 'if' of relative computability"


I develop a theory of counterfactuals about relative computability, i.e. counterfactuals such as

• If we could decide the validity problem, then we could also decide the halting problem,

which is true, and

• If we could decide the validity problem, then we could also decide the problem of arithmetical truth,

which is false.

These counterfactuals are interesting for two reasons. First, because their antecedents are metaphysically impossible, they pose a challenge to the orthodoxy about counterfactuals and metaphysical necessity. Second, they raise questions about the relation between the extensional mathematical theory of relative computability and the counterfactual language used above. I argue for a realist understanding of these counterfactuals. This means that the challenge they pose to the orthodoxy is robust. However, at least on the technical side, the challenge isn’t particularly deep. I show that a model theory can be given for these counterfactuals that is a straightforward extension of comparative similarity models. I close by discussing various options for how to interpret the indices used in this model theory.
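The recursion-theoretic facts behind the two verdicts above can be put in standard notation (my gloss, not part of the abstract): first-order validity is computably enumerable and hence Turing-reducible to the halting problem, whereas true arithmetic is, by Tarski's theorem, not arithmetical and so not reducible to it:

\[
\mathrm{Val} \;\leq_T\; \mathbf{0}' \qquad\text{but}\qquad \mathrm{TA} \;\not\leq_T\; \mathbf{0}',
\]

where \(\mathbf{0}'\) is the Turing degree of the halting problem.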


Graham Priest (CUNY Graduate Center & U of Melbourne), "What is the Specificity of Classical Mathematics?"


Some of the areas to which one might not want to apply classical logic come from mathematics itself, notably some of the areas of mathematics that have been developed since 1900. In fact, we now know that there are such things as intuitionist mathematics and paraconsistent mathematics, where one can apply classical reasoning only on pain of disaster.

Indeed, it is clear that there are many pure logics, which, as mathematical structures themselves, are all equally good. And one may build more sophisticated mathematical structures on top of these, in the way that the analysis of smooth infinitesimals is built on intuitionist logic, or dialetheic set theory is built on paraconsistent logic. This does not imply that all such mathematical structures are equal—in their mathematical interest, beauty, applicability, and so on. That is another matter, and a subject for on-going mathematical investigation.

But whatever one makes of logical pluralism, it is clear that we are faced with mathematical pluralism. From this perspective, it makes sense to ask what is distinctive about classical mathematics. The obvious and straightforward answer is that it can be regimented employing classical logic. One can hardly gainsay this answer. But I think that it may mask a more profound answer: one to do with the nature of the mathematics itself, and not just its underlying logic. What follows is an attempt to unearth this.


Juliet Floyd (Boston U), "Gödel and Rigor: Mathematical and Philosophical"


Gödel’s “MaxPhil” Gabelsberger notebooks IX-X (1942-3), begun on the day he accepted the invitation to write for Russell’s Schilpp volume, reveal a fascinating attempt on Gödel’s part to come to grips with Russell’s overall philosophy---and not just the mathematical and logical aspects of Principia. The distance between Gödel’s manner of working in these notebooks and his published tribute to Russell (1944) is considerable, evincing a more sophisticated and longstanding grappling with Russell’s overall philosophy (and philosophy as such) than has been thought. In particular, during this period Gödel did not uncritically hold that we can “see” sets, or that axioms “force themselves upon us”. Instead, he aimed in 1942-3 to rigorize Russell’s Principia idea of truth-as-correspondence, the “multiple relation theory” of judgment, rooted at the atomic level in (what Russell called) “judgments of perception”. Aware of Wittgenstein’s impact on Russell after 1918, Gödel holds out for an infinitary version of the view, one that takes order as basic.

Every rigorization leaves some interpretive residue behind, the trail where the human serpent brings philosophy and knowledge into the garden. Gödel’s picture of rigor, expressed in the following quote, will be analyzed:

"A board game is something purely formal, but in order to play well, one must grasp the corresponding content [the opposite of combinatorically]. On the other hand the formalization is necessary for control [note: for only it is objectively exact], therefore knowledge is an interplay between form and content." [Max Phil IX 16]


Steve Awodey (Carnegie Mellon U), "Univalence as a new principle of logic"


This talk is about the Univalence Axiom recently proposed by Vladimir Voevodsky as a new principle for the foundations of mathematics. It is formulated within a new system of foundations called Homotopy Type Theory, and, roughly speaking, permits isomorphic structures to be identified.

The resulting picture of the mathematical universe is rather different from that corresponding to conventional set-theoretic foundations. It is not entirely alien, however, and it even has many interesting connections to some traditional philosophical issues, and that is what I hope to explain in this talk.

There are three main themes:

1. The advance in computer technology has given rise to the possibility of a new, computer-assisted, formalization of mathematics, breathing new life into the old Logicist program of Frege and Russell.

2. One of the main tools for such formalizations is MLTT, which has recently been connected to homotopy theory (of all things). But why should these remote ends be related at all? As it turns out, there are good reasons why logic and homotopy are closely connected, and they have to do with some of the oldest problems of logic and analytic philosophy.

3. In this setting, called Homotopy Type Theory, a new axiom for foundations has been proposed: the Univalence Axiom. It implies that isomorphic structures can be identified---in a certain, precise sense that I will explain. This sounds like the sort of thing that one hears from structuralists in the philosophy of mathematics. And in fact, the development of mathematics within this new system does have a distinctly structural character.
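For readers who want the formal statement, the axiom can be put as follows (my gloss, standard in the Homotopy Type Theory literature): for any types \(A\) and \(B\) in a univalent universe \(\mathcal{U}\), the canonical map sending identifications to equivalences is itself an equivalence,

\[
(A =_{\mathcal{U}} B) \;\simeq\; (A \simeq B),
\]

so that every equivalence between \(A\) and \(B\), and in particular every isomorphism of structures, induces an identification of them.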


Jeremy Shipley (McHenry County College), "Poincaré on the Foundation of Geometry in the Understanding"


I argue that Poincaré had a unified philosophy of mathematics that was grounded in a conception of mathematical structures as forms of the understanding, but that unlike, say, Hilbert or Russell, Poincaré’s account of forms of the understanding is not captured by the use of axiomatics to implicitly define mathematical concepts. Instead, for Poincaré the understanding contains groups of action potentials, some of which have geometric applications under certain conventional choices. According to Poincaré, arithmetic and geometry are built on the same foundation. In order to articulate and clarify this point of view, I will show how he arrived at it by modifying the Kantian view of geometry as grounded in intuition to adopt his own view of geometry as grounded in forms of the understanding. It is in relation to Kant that Poincaré’s views are most clearly distinguished from the logicists and formalists.


Jamie Tappenden (U of Michigan), "Frege, Karl Snell and Romanticism; Fruitful Concepts and the 'Organic/Mechanical' Distinction"


The paper has a broad scholarly/context-setting objective and a narrower textual one. The broad objective allows a deeper and more satisfying interpretation of some key Fregean texts; the narrow textual point illustrates the value of the stage-setting for even nuts-and-bolts problems of Frege interpretation.

The broad objective is part of an even broader project of reconstructing Frege's local mathematical and philosophical environment. This installment addresses a neglected figure in Frege scholarship: the man he describes as his "revered teacher", Karl Snell. (Ernst Abbe, Rudolf Eucken and Otto Liebmann also make key cameo appearances). It turns out that there is more of interest to say about Snell than can fit into one paper, so I'll restrict attention here to just this aspect of his thought: the role of the concept of "organic", and a contrast with "mechanical".

The paper also goes beyond Snell to indicate some other places in Frege's environment where the contrast is made, to establish that these terms, and the "organic/mechanical" contrast, had reached the status of an "accepted, recognized cliché". (Though, as with many clichés, the expressions could mean different things to different people.)

The narrow project is a re-examination of Frege's account of "extending knowledge" via "fruitful concepts". This was the subject of a paper I published many years ago (Tappenden [1995]), and I think the account I gave there of what Frege is doing in the key texts in Grundlagen is correct as far as it goes.

But in the intervening years various questions have nagged at me, pushing me to probe deeper. The context-setting sheds light on some plausible answers: that Frege's key metaphors would have been understood in a particular way, that this understanding included a recognized contrast between "organic" and "mechanical" connection, [mechanische/organische Verbindung and cognates] and that this recognition allows us to see that many Fregean remarks were not disconnected asides, but appear to reflect a connected picture of the nature of mathematical thought.

To avoid a common misunderstanding that sometimes comes up with work of this type: The role of context is not to argue that Frege must have believed something because someone he respects believed that thing. Rather, the argument begins with the observation that Frege said certain things, and presumably chose his words carefully. The context shows how such words were understood by the people closest to Frege, and consequently suggests how Frege would have expected them to be understood by his readers.