MWPMW 14

When


The fourteenth annual Midwest PhilMath Workshop (MWPMW 14) will take place the weekend of Saturday, October 5th and Sunday, October 6th, 2013.

As usual, there will be a full day of talks and discussions on Saturday and a half day on Sunday. Also as usual, there will be a workshop dinner Saturday evening, with all participants invited to attend as guests of the workshop.

Prior to the MWPMW 14, there will be a two-day workshop (October 3rd & 4th) of the Association for the Philosophy of Mathematical Practice (APMP). Further information is available from the APMP.

Where


University of Illinois at Urbana-Champaign

Lincoln Hall, Room 1092

Campus Map

Lodging


Hotel reservations should be made as soon as possible. General Hotel Information

The following hotels also have blocks of rooms reserved for the APMP (October 3-4) and the MWPMW 14.

Hawthorne Suites is within a one-mile walk of the meetings and campus.

Eastland Suites and Holiday Inn are both farther from campus, but have shuttle service available to campus.

Call the hotels' local lines to reserve a room within the block reservation.

Other hotels that do not have reserved blocks, but should have rooms available, include:

Homewood Suites

Hilton Garden Inn, Hampton Inn, and

Urbana Landmark Hotel

These are all within about a one-mile walk of campus.

We look forward to seeing you at the MWPMW 14, Saturday, October 5th and Sunday, October 6th.

Speakers


Walter Dean, U of Warwick, "Arithmetical Reflection, Induction & the Provability of Soundness"

Michael Ernst, U of California-Irvine, "A Paradox in Naïve Category Theory"

William Ewald, U of Pennsylvania, "The Discovery of the Quantifiers"

Janet Folina, Macalester College, "Mathematical Intensions, Intensionality in Mathematics, & Intuition"

Emily Grosholz, Pennsylvania State U, "Reducibility and Meaning: Logic and Number Theory"

Geoffrey Hellman, U of Minnesota-TC; Stewart Shapiro, The Ohio State U, "Regions-based Continua, continued"

Harold Hodes, Cornell U, "Varieties of Ramified-Type Assignment Systems"

Douglas Marshall, U of Minnesota-TC, "Taking Applications of Mathematics to Mathematics Seriously"

Victor Pambuccian, Arizona State U, "Surprises in the Axiomatics of Ordered Geometry"

Marcus Rossberg, U of Connecticut, "Inferentialism and Conservativeness"

Neil Tennant, The Ohio State U, "The Concept of Real Number"

Schedule


Saturday, October 5th

Session I: Lincoln Hall, Room 1092

9:00am--10:00am: Neil Tennant, "The Concept of Real Number"

10:10am--11:10am: Janet Folina, "Mathematical Intensions, Intensionality in Mathematics, & Intuition"

11:20am--12:20pm: Michael Ernst, "A Paradox in Naïve Category Theory"

Lunch 12:45pm--2:00pm: Tryon Festival Theatre Foyer, Krannert Center for the Performing Arts

Session II: Lincoln Hall, Room 1092

2:00pm--3:00pm: Victor Pambuccian, "Surprises in the Axiomatics of Ordered Geometry"

3:10pm--4:10pm: Douglas Marshall, "Taking Applications of Mathematics to Mathematics Seriously"

4:20pm--5:20pm: Walter Dean, "Arithmetical Reflection, Induction & the Provability of Soundness"

5:30pm--6:30pm: William Ewald, "The Discovery of the Quantifiers"

Dinner 7:00pm: ACES Library, Heritage Room

Sunday, October 6th

Session III: Lincoln Hall, Room 1092

9:00am--10:00am: Geoffrey Hellman & Stewart Shapiro, "Regions-based Continua, continued"

10:10am--11:10am: Harold Hodes, "Varieties of Ramified-Type Assignment Systems"

11:20am--12:20pm: Marcus Rossberg, "Inferentialism and Conservativeness"

12:30pm--1:30pm: Emily Grosholz, "Reducibility and Meaning: Logic and Number Theory"

Adjournment

Summaries


Walter Dean, U of Warwick, "Arithmetical Reflection, Induction & the Provability of Soundness"

Summary

A reflection principle is a statement or schema which seeks to express the soundness of a mathematical theory T within its own language.  For instance, the so-called local reflection principle for Peano arithmetic---i.e. Bew_PA('A') --> A---can be understood to assert that any sentence provable in PA is true in the standard model of arithmetic. 
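For reference, the schema just mentioned, together with its uniform counterpart (the form usually at issue in Kreisel-Lévy-style results), can be written out as follows; the corner-quote notation is a standard rendering and is not fixed by the abstract itself:

```latex
% Local reflection for PA: one instance for each sentence A.
\mathrm{Rfn}(\mathsf{PA}):\qquad
  \mathrm{Bew}_{\mathsf{PA}}(\ulcorner A \urcorner) \rightarrow A

% Uniform reflection: one instance for each formula A(x),
% with \dot{x} the numeral for the value of x.
\mathrm{RFN}(\mathsf{PA}):\qquad
  \forall x\,\bigl(\mathrm{Bew}_{\mathsf{PA}}(\ulcorner A(\dot{x}) \urcorner)
    \rightarrow A(x)\bigr)
```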

In this talk, I will first seek to highlight a tension between the original technical uses of reflection principles in proof theory (which primarily pertain to proofs of non-finite axiomatizability) and their more recent philosophical appropriation in debates about the role of the concept of truth in mathematical reasoning  (wherein it is often claimed that acceptance of a theory T entails commitment to some form of reflection principle for T). I will next argue (on the basis of results of Kreisel, Lévy, Schmerl, and others) that the justification of reflection principles is closely related to the justification of induction (both mathematical and transfinite). 

On this basis, I will suggest that the task of accounting for our (putative) knowledge of reflection principles may not be as straightforward as it might at first appear. Finally, I will suggest that this motivates the consideration of various forms of "non-canonical" definitions of arithmetical provability---e.g. cut-free or Herbrand provability---relative to which appropriate formulations of reflection are provable in sufficiently strong arithmetical theories.

 

Michael Ernst, U of California-Irvine, "A Paradox in Naïve Category Theory"

Summary

I derive a paradox for naïve category theory in the application of categorial methods to graph theory. Analysis of the derivation illuminates the necessary assumptions. There are different costs depending on which assumptions one is disposed to jettison. However, the paradox guarantees their mutual inconsistency. I use this fact to draw important consequences for Solomon Feferman's ongoing project on the foundations of unlimited category theory.

 

William Ewald, U of Pennsylvania, "The Discovery of the Quantifiers"

Summary

It is frequently said that Frege revolutionized logic by his discovery of quantification theory in 1879.  I shall argue that in fact the 'discovery' of the quantifiers was not a single event, datable to a specific year, but rather is best seen as a process that lasted nearly fifty years, and that required the work of many hands.

 

Janet Folina, Macalester College, "Mathematical Intensions, Intensionality in Mathematics, & Intuition"

Summary

In 1892 Frege published two seminal papers: "On Concept and Object" and "On Sense and Reference". These papers articulated some basic asymmetries between co-referring expressions (a=a vs. a=b), some gaps between senses and referents, and some solutions to the problems thereby raised for the philosophy of language. What is their relevance to mathematics?

Several approaches to intensionality in mathematics will be contrasted; one of them will be highlighted. I will argue that post-Kantian appeals to mathematical intuition can be seen as addressing the most basic question about senses or intensions: the question of when, and how, an intension (or an understanding of it) determines an extension (or an understanding of it). I will close by addressing a concern this may raise regarding the relationship between Kant's appeal to intuition and that of his successors.

 

Emily Grosholz, Pennsylvania State U, "Reducibility and Meaning: Logic and Number Theory"

Summary

In this paper, I want to argue that problem-reduction is just as important in mathematics as theory reduction, illustrating my arguments by reference to the development of number theory. I will note episodes in the development leading from Fermat’s Last Theorem to Andrew Wiles’ celebrated proof (Wiles 1995), and examine arguments made about number theory, and Wiles’ proof in particular, by early twentieth-century and contemporary philosophers.

Theory reduction is important in the search for proofs, because it forces mathematicians to articulate high level assumptions about the things they are investigating and the procedures they use. However, when mathematicians want to solve problems, they resort to problem-reduction, which typically embeds a problem—originally introduced in a narrower and more well-defined setting—into a broader, richer and less well-understood context.

Wiles’ proof (and the research by many forerunners and colleagues on which it draws) shows that such generalization is never ‘mere generalization,’ in Aristotle’s sense of generalizing abstraction that sheds content; nor is the ensuing re-particularization trivial, as Descartes’ reductionist method and Cassirer’s account in Substance and Function suggest. Rather, in both generalization and re-particularization we find a complex process of adding content sideways, as it were, which deserves greater attention from philosophers.

 

Geoffrey Hellman, U of Minnesota-TC; Stewart Shapiro, The Ohio State U, "Regions-based Continua, continued"

Summary

First, we review our regions-based recovery of the Dedekind-Cantor classical continuum. Second, we will show how to approach Aristotle’s non-punctiform conception of the continuous more closely by avoiding the actually infinite in favor of just the potentially infinite. It emerges that, in contrast to the original situation with actual infinities, the defined pointy superstructure exhibits features not shared by the structure of regions, and depends on choice of constructive framework. Then, third, we sketch how to extend our (classical) methods to multidimensional continua and geometries, Euclidean, hyperbolic, and spherical. Contrary to the impression given by Tarski’s “recovery” of 3-dimensional Euclidean geometry, these tasks---even with definitions like Tarski's---are not entirely trivial.

 

Harold Hodes, Cornell U, "Varieties of Ramified-Type Assignment Systems"

Summary

Assume that we have a set of types (thought of as linguistic expressions). A type-context is a function, whose domain is a set of variables, that assigns each variable in its domain to a type. Relative to a given assignment of non-logical constants to types, a type-assignment system is an inductively defined relation between type-contexts, expressions, and types. When a type-context, expression and type are so related, such an expression is a term of that type relative to that type-context.

The ideas behind PM suggest a notion of ramified-type and a notion of termhood that would be captured by type assignment systems such that (a) formulas (i.e. terms of propositional type) signify propositions, this relative to assignments of type-appropriate values to their free variables, and (b)  λI-terms are constructed from formulas and signify propositional functions (again relative to variable-assignments) which behave as the notation should suggest.

The following constraints on such type assignment systems are appealing.

  1. Every term has a unique type. (Thus orders are not cumulative.)
  2. β-conversion (of a β-redex) does not change type.
  3. The values of a propositional function have the same order as that function. (Thus the type of a β-redex is the type of its head.)
  4. The order of a propositional function is greater than the orders of each of its arguments (Hylton’s “first principle of ramification”).
  5. An atomic formula can have an argument (or several) that is not a variable but that contains a free variable (or several). (E.g. of the form τ(v(c)), with v a variable.) And we can form λI-terms from such formulas.
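For readers unfamiliar with the notation presupposed by constraints (2), (3), and (5), the relevant convention is the standard λ-calculus one (nothing here is specific to the abstract):

```latex
% A beta-redex with head (\lambda v.\,M) applied to argument N,
% and its contractum, obtained by substituting N for v in M.
(\lambda v.\,M)\,N \;\triangleright_{\beta}\; M[N/v]

% Constraint (2) requires both sides to receive the same type;
% constraint (3) makes the redex's type that of its head.
```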

The big difficulty: these constraints cannot all be satisfied.

Dropping (2) would be terrible. For a while I thought that (3) was dispensable while satisfying (5); I was wrong.

Schütte presented ramified-type assignment systems in which orders are cumulative, abandoning (1). This avoids the problem. Though he gave no reason for this, I suspect that he was aware of the big difficulty, and that it was the reason for his cumulative approach. This is an attractive option, since such systems support a reasonable model-theory.

The Russell-Whitehead position on this problem is unclear. Although they introduce their infix use of ‘^’ to represent the abstraction of a 1-place propositional function, they introduce no notation for applying a ‘^’-term to an argument. (Thus they missed out on the discovery of β-conversion, and thus on the invention of the λ-calculus.) For the most part they just used formulas with one free variable to signify such 1-place propositional functions. The latter usage suggests rejection of (4). But there is some reason to interpret them as rejecting (5) and (3). Laan and Nederpelt so interpret PM: their system RTT in effect requires that every argument of an atomic formula is either a variable or a closed term. I agree that PM does suggest this. But it’s unclear whether there is a good justification, independent of avoiding the big difficulty, for ruling out such formulas and the λI-terms generated from them. (E.g., why not allow that (λw:t,v:t’.w(v(c))), for appropriate types t and t’, signifies a propositional function?)

All of the secondary literature on this subject with which I am acquainted accepts (4); and Hylton offered an argument, from Russellian principles, in its favor. Nonetheless, I suggest that dropping (4) is another option. This involves expanding the set of ramified-types to a larger set of what I call new ramified-types. But a model-theoretic semantics for new-ramified-type assignment systems faces a problem. At the moment, I’m not completely clear about how to solve it. I hope to sketch a solution based on translation from such systems to cumulative systems.

 

Douglas Marshall, U of Minnesota-TC, "Taking Applications of Mathematics to Mathematics Seriously"

Summary

Applications of one branch of mathematics to another---for example, of algebra to geometry---have played a crucial role in mathematics since at least the Seventeenth Century. Nonetheless, a great deal of work in the philosophies of mathematics and science either ignores these applications altogether or treats them as second-rate. Rather, the philosophically interesting questions are usually taken to concern the application of mathematics to the physical world.

In this talk, I argue that two philosophical puzzles arise from ignoring the applications of the various branches of mathematics to each other. The first puzzle: Assuming mathematics describes abstract objects and the empirical sciences describe the physical world, how can mathematics be relevant to the empirical sciences? The second puzzle: Assuming pure mathematics lacks any factual content (as some empiricists have argued), how can one branch of mathematics ever be used to represent and reason about another? 

I will argue that these philosophical puzzles can be resolved by examining the applications of mathematics internal to mathematics itself. 

 

Victor Pambuccian, Arizona State U, "Surprises in the Axiomatics of Ordered Geometry"

Summary

Ordered geometry was born in 1882, when Moritz Pasch provided the first axiom system for a two-dimensional geometry based on the concepts of point, segment, and the belonging of a point to a segment (or, equivalently, solely on points and betweenness). This was partially anticipated by de Morgan between 1854 and 1860 in his work on the four color problem.

Analyzing the axiomatics of both plane ordered geometry and of a dimension-free version thereof, in particular the Pasch axiom and its two higher-dimensional versions, we point out that there are alternatives to Pasch's way of achieving plane separation, and that some of these alternatives show a surprising connection with the non-planarity of $K_5$ and $K_{3,3}$, providing a direct connection to de Morgan and the four color problem. We also provide a simpler axiomatization of plane ordered geometry, where {\em simple} has a precise formal definition. The results obtained are very recent (post-2010), and there are still open problems, pointing to the very complex nature of this very basic topological notion, which had hitherto been considered well understood.

 

Marcus Rossberg, U of Connecticut, "Inferentialism and Conservativeness"

Summary

A difficulty for inferentialism is presented. It is shown that third-order logic is not conservative over second-order logic: there are sentences formulated in pure second-order logic that are theorems of third-order logic, but cannot be proven in second-order logic. This new incompleteness challenge is formulated purely proof-theoretically rather than by appeal to model-theoretic semantics. The impossibility of demonstrating the truth of such second-order sentences using the inference rules of second-order logic alone seems to refute the inferentialist's claim that the meaning of the quantifiers is determined by their introduction and elimination rules: such sentences, being truths of third-order logic, should be true in virtue of the meaning of the logical vocabulary.
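The conservativeness property whose failure is reported here can be stated as follows; the turnstile subscripts are an illustrative labelling, not notation from the abstract:

```latex
% Third-order logic would be conservative over second-order logic if,
% for every sentence \varphi of the pure second-order language,
\vdash_{3} \varphi \;\Longrightarrow\; \vdash_{2} \varphi
% The claim is that this implication fails: some purely second-order
% sentence is a theorem of third-order logic but is unprovable in
% second-order logic.
```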

I argue that the inferentialist can answer this challenge. In the course of the argument, previously neglected features of proof-theoretic consequence will be elucidated.

 

Neil Tennant, The Ohio State U, "The Concept of Real Number"

Summary

I shall argue for a 'logico-geometric' explication of the concept of real number. The concept has its source in geometric intuition (the intuition of the continuity of a line), although it builds on a logicist conception of natural number. Despite the geometric nature of the source of the concept, we are able to attain to 'dimensionless' reals by the end of the process of explication. I shall commend a so-called 'Schema R' in stating an adequacy condition on a theory of the reals. Schema R results from an analysis of the logical forms of statements of measurement in accordance with respectively appropriate units for different dimensions of measurement. Schema R is analogous to Schema N for the natural numbers. Each Schema serves to explain the applicability of its particular concept of number. Another adequacy condition is the following: one should be able to show that the natural number n is identical to (not just, as structuralists would have it, somehow correlated with) the real number n. Only thereby can we do justice to the intuition that the natural numbers are real numbers, and lie among the other real numbers. The problem of giving such an explanation is called the Inclusion Problem.