Thursday, 30 June 2011

Should math be taught in schools?

(Cross-posted at NewAPPS.)

I came across a funny but disturbing video today, via Language Log. The background context is the fact that one of the questions contestants had to answer at the Miss USA pageant earlier this month was whether evolution should be taught in schools. We are all familiar with the heated debates on this topic in the US (especially after the whole Synthese drama...), so from a societal point of view it is not an entirely absurd question to be asked in this context. However, more broadly speaking, to wonder whether evolution (the theory which arguably unifies all fields of research in biology) should be taught in schools is as absurd as wondering whether math should be taught in schools. So someone came up with this video to illustrate the analogy:





It's *very* funny, no doubt about that, but also somewhat disturbing. There are in particular all kinds of gender dimensions worth emphasizing -- at some point, one of the 'contestants' says that math should be taught in schools, "but not to women". Sounds absurd, but if you recall that not that long ago speaking Barbie dolls had 'Math is so hard!' as one of their pre-recorded statements, it's clear that this is not coming out of nowhere. (Of course, the problem also lies with the very concept of beauty queens and beauty contests, but let's not get started.)

Anyway, I guess the bottom line is that we should be happy that, unlike biologists, mathematicians and philosophers of mathematics do not need to defend the very idea of teaching math in schools. And of course, yay for Miss Vermont!

Wednesday, 29 June 2011

Report on Extended Cognition Workshop

In the 'report on conferences' spirit, I've just written a report on the Extended Cognition Workshop which I organized and which took place in Amsterdam over the last few days. Again, it's not really M-Phi material, but here is the link in case anyone is interested. Actually, there were two talks within the M-Phi remit: my talk on an extended cognition perspective on formal languages and formalisms (very similar to the talk I gave in Munich a few weeks ago, 'Cognitive motivations for treating formalisms as calculi' -- video podcast here), and Helen de Cruz's talk offering an analysis of the role of external devices (notations, calculating devices) in mathematics from the point of view of extended cognition. She presented a case study on early Chinese algebra, arguing that Chinese mathematicians discovered a precursor of the method of Gaussian elimination, specifically prompted by the particular perceptual properties of the counting rods they used to calculate. Very cool stuff!

Tuesday, 28 June 2011

Report on conferences -- Guest post by Shawn Standefer

I don't know about you, dear reader, but I very much enjoy reading reports of conferences I would have liked to attend but did not. This month there were two great conferences with closely related themes (with just a weekend in between): Paradoxes of Truth and Denotation in Barcelona, and Truth at Work in Paris. It was a real marathon for truth-seekers and paradox-avengers! Having missed the conferences myself, I thought of asking Shawn Standefer, one of the brave people who attended both, for a short report for M-Phi. As you can see, we are expanding our line of business here at this blog, now also offering live coverage of recent events :) Anyway, lame jokes aside, here's Shawn's very thorough and informative report.

_______________________


I recently had the pleasure of attending two conferences on truth. The first was the Seventh Barcelona Workshop on Issues in the Theory of Reference, held in Barcelona, and the second was Truth at Work, held in Paris. Both were well-run, multi-day conferences jammed full of interesting talks, and the organizers of both should be commended. I will highlight a few things that cropped up at both conferences and some talks that stood out to me. I do not have good notes for, or clear memories of, all the talks, so the summary will unfortunately be uneven.

There were several presentations motivating substructural logics in response to the paradoxes. Julien Murzi presented some criticisms of Field's recent work based on its inability to contain an adequate validity predicate. This was, I think, based on joint work with JC Beall. Murzi used this to motivate rejecting the structural rule of contraction in addition to the rule of contraction for the conditional. Elia Zardini motivated a contraction-free logic based on paradoxes that arise from adding predicates for naive logical properties, such as consistency and inconsistency, that obey simple inference rules. Zardini has obtained some interesting metatheoretic results concerning the validity relation for his logic using a non-classical metalanguage. One point that was pushed in discussion was that there are difficulties in making philosophical and semantic sense of the failure of structural contraction.

Joint work by Pablo Cobreros, Paul Egre, David Ripley, and Robert van Rooij was presented at both conferences. They used work on vagueness to motivate a non-transitive logic that can be given a surprisingly clean, surprisingly classical proof theory. Their Barcelona presentation focused on semantic paradoxes and proof theory and their Paris presentation focused on paradoxes of vagueness, model theory, and conditionals. In discussion, one point that came out was that there is a sense in which their logic is classical and a sense in which it is not. As pointed out in questions, getting clearer on the notion or notions of classicality at issue here will be useful in understanding and evaluating some of the competing theories of truth.

There was more good work on proof theory presented in Paris by Volker Halbach and Leon Horsten. Halbach presented a talk on disquotational theories of truth, full of technical results demonstrating that disquotational theories, ones that add T-sentences and no general principles as axioms to PA, can be rather powerful and complex, depending on which set of T-sentences is added to PA. He said that many of the results can be found in his 2009 RSL paper and his new book. Horsten closed out the Paris conference with a talk, based on joint work with Welch and someone whose name I did not write down, about axioms for different revision theories of truth. He began by pointing out that revision theories based on a single Herzberger sequence can define a notion of determinateness just as powerful as the one found in Field's work. The results he presented were, to me, surprising. He went on to discuss difficulties in axiomatizing the stable and nearly stable truth sets in such sequences.

There was a talk at each conference responding to Leitgeb's work on the paradoxes. In Barcelona, Rafal Urbaniak argued that Leitgeb's analysis of Yablo's paradox and circularity was inadequate and proposed a semantic approach that categorized sentences better than Leitgeb's notion. In Paris, Denis Bonnay presented joint work with someone (my apologies for missing the name) on developing Leitgeb's analysis of groundedness. They showed how there were several parameters involved in the inductive definition of the set of grounded sentences in Leitgeb's sense, and how varying these can bring them into line with Cantini's supervaluationist scheme. Both Urbaniak's and Bonnay's work pointed to fruitful directions in formal work on aboutness and groundedness.

Both conferences featured discussion of Stephen Read's work on Bradwardine's theory of truth. In Barcelona, Graham Priest gave a talk on difficulties that Bradwardine's theory faces in dealing with the paradoxes of denotation when extended to terms in the way one would expect. This was the only talk on paradoxes of denotation at the conference. Stephen Read was in the audience and gave an extended response indicating points in the argument he would resist. In Paris, Stephen Read went over which of the compositional or commutativity principles were compatible with Bradwardine's theory. On the Read-Bradwardine view, truth commutes with conjunction and disjunction but not with the conditional and negation.
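To unpack the commutation talk a little (this is a rough gloss, not an exact quotation of Read's formulation): 'truth commutes with conjunction' is the claim that a principle of the shape

$T(A \wedge B) \leftrightarrow (T(A) \wedge T(B))$

holds, with the analogous principle holding for disjunction, while the corresponding biconditionals for negation and the conditional fail on the Read-Bradwardine view.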

Michael Glanzberg gave interesting talks at both conferences on two different approaches to truth: the complex and the simple. The former sees truth as substantive and embraces hierarchies of truth but no substantive notions of determinateness. The latter takes a deflationary view of truth and embraces notions of determinateness but rejects hierarchies of truth. Glanzberg presented some technical results concerning iteration and reflection, including some analysis of the complexity involved in long iterations found in different approaches.

Anil Gupta presented talks at both conferences on the role of the T-sentences in theories of truth, focusing on what sort of conditional should be used in them. Gupta initially motivated the analysis of the T-sentences by framing a logical problem, the sorting of good arguments from bad. He used this to motivate a new conditional for the revision theory and showed how it can be used to address some of the initial problems he raised.

In Barcelona, Hartry Field presented some of his work on restricted quantification and the difficulties faced by his paracomplete approach. This was worked into a criticism of paraconsistent theories, because Field thinks that the prospects for an adequate treatment of restricted quantification are more promising in paracomplete approaches than in paraconsistent ones. Some of the discussion focused on how we were supposed to adjudicate between the different sets of principles that the two approaches preserved. I did not, however, get notes on the responses.

In Barcelona, James Woodbridge and Bradley Armour-Garb gave a talk on how their deflationary view of truth as pretense deals with semantic defect. In Paris, Woodbridge gave a talk on developments of his truth as pretense view, focusing on responses to several objections that it has faced.

In Paris, there were two talks on the role of the word `true' in natural language. Friederike Moltmann presented some distributional data concerning different truth-like constructions and argued they were non-equivalent. Dora Achourioti motivated and defended a project of looking at the use of the truth predicate in communication. While not about natural language, there was a talk on truth in formal semantics. James Shaw argued that assigning an extension to the truth predicate causes problems for the compositionality of semantics. He used this to motivate a procedural interpretation of the truth predicate. Unfortunately, I am not sure what happened to my notes for these talks.

Riccardo Bruni gave a great talk on a sequent calculus presentation for the calculus for finite definitions from the Revision Theory book. He introduced a special conditional, and he sketched how to prove an elimination theorem for the calculus with the definitional rules, classical logic on indexed formulas, and his new conditional. Stefan Wintein argued that Philip Kremer's formalization of the Gupta-Belnap criterion of vicious circularity did not adequately respect the original intuitions of the informal idea. Wintein then proposed a new version that focused on T-sentences. Wintein's version was, I believe, slightly weaker than Kremer's, as the two come apart in the case of Wintein's generalized strong Kleene schemes.

Graham Leach-Krouse gave an excellent talk on the surprise examination paradox. Leach-Krouse showed how to formalize the reasoning involved in the paradox and then presented a generalization of it. The generalized version leads to Solovay's theorem connecting provability logic and its arithmetic interpretation. Leach-Krouse used this fact to argue against Ramsey's famous distinction between the logical and the linguistic paradoxes. Great stuff.

There were a few other talks at the conferences, but I either cannot find my notes for them or do not have good notes for them. In Barcelona, Christopher Gauker argued for a well-foundedness principle about contexts in formal semantics. Andre Billon presented a relativist solution to the paradoxes. David Liggins argued for alethic nihilism. Colin Johnston presented an analysis of the paradoxes in terms of conflicting rules. Alexis Burgess presented and adjudicated a dispute between Scharp and Eklund about the inconsistency theory of truth and revising the truth predicate.

In Paris, Henri Galinon talked about what is involved in accepting a theory and drew consequences of his analysis for deflationism. Pascal Engel argued that the normative role of the truth predicate poses a problem for deflationism. Philippe de Rouilhan talked about abstraction in relation to Frege's distinction between concept and object. Gila Sher sketched a correspondence account of truth. Luca Tranchini talked about validity in the Dummett-Prawitz tradition of proof-theoretic semantics and the problems it poses for that project.

Doug Patterson gave a talk about Tarski's views on definition and language, which was, I believe, similar to a talk he gave at Munich that is now available in iTunes. Finally, Julien Boyer presented some negative results concerning Hintikka's idea that Dummett's ideas about proof and verification are best cashed out in terms of recursive winning strategies in game theoretic semantics.

I will close by mentioning two things that came up in discussion in different talks that I want to highlight. The first is the complexity of theories of truth. This was mentioned by several people. There are some nice results about the complexity of different approaches, even of straightforward disquotational theories. Some people took these results as strong reasons against adopting different theories. What should we make of these arguments?

The second thing has to do with the philosophy of logic. There seemed to be some important questions about how we are supposed to think of logic in the context of truth. There was a divide over whether classical logic should be privileged, what this means, and if it should be privileged, why. This may, however, just be a recasting of old debates about the status of non-classical logic. The wealth of material in the area of theories of truth seems promising as a way of shedding new light on those debates.

Saturday, 25 June 2011

Roy's Fortnightly Puzzle: Volume 5

(A day early, to make up for the last one being a day late!)

Assume we are working within Peano arithmetic (where, for a formula F, writing F inside one of the predicates below stands in for the Goedel code of F) supplemented with a new predicate (either "T(...)", "K(...)", or "B(...)"). We obtain the Liar paradox by adding the T-schema to PA:

T-schema: For any statement F:
T(F) <--> F
is a theorem.

[This rule is of course initially plausible if we read "T(...)" as a truth predicate.] The Paradox of the Knower (or Montague's paradox) proceeds by weakening this assumption, replacing the T-schema with the following three principles:

Necessitation: For any formula F, if F is a theorem, then:
K(F)
is a theorem.

Closure: For any formulas F and G:
K(F --> G) --> (K(F) --> K(G))
is a theorem.

Factivity: For any formula F:
K(F) --> F
is a theorem.

[These rules are initially plausible if we read "K(...)" as an in principle knowability predicate.] Notably, both of these paradoxes require Factivity. But there are similar paradoxes that do not require that Factivity hold, in general, of the predicate in question. For example, let "B(...)" be a "warranted belief" predicate, and consider the following rules. First, we have versions of Necessitation and Closure for B(...). Second, we have:

Transparency of Disbelief: For any formula F:
B( ~ B(F)) --> ~ B(F)
is a theorem.

In other words: (i) We are warranted to believe theorems. (ii) If we are warranted to believe a conditional, and we are warranted to believe the antecedent, then we are warranted to believe the consequent. (iii) If we are warranted to believe that we aren't warranted in believing something, then we aren't warranted in believing that something.

We can derive a contradiction similar to the one obtained in the Liar and the Knower, using only diagonalization and the three principles above (derivation left to the reader!). This derivation, to emphasize the central point, does not require that warranted belief entails truth!
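[For readers who would rather see a sketch than work it out themselves, here is one way the derivation can go. By diagonalization there is a sentence D such that D <--> ~B(D) is a theorem. From left to right this gives D --> ~B(D) as a theorem; by Necessitation, B(D --> ~B(D)) is a theorem; by Closure, B(D) --> B(~B(D)) is a theorem. Transparency of Disbelief gives B(~B(D)) --> ~B(D), so B(D) --> ~B(D) is a theorem, and hence ~B(D) is a theorem. But then, by the diagonal equivalence again, D is a theorem, so by Necessitation B(D) is a theorem -- contradiction.]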

[Note: Given the last few decades of work in epistemology, the transparency condition might not seem all that plausible. But I think it would have been plausible pre-Gettier. More importantly, the fact that its failure is a matter of pure logic (or, perhaps, pure logic plus pure arithmetic) is, I think, novel and substantial!]

Some questions about this paradox:

(1) Is it novel?

[If so, I hereby dub it the Cook-St. Croix paradox, because I am an arrogant @$%& and because I got thinking about this again while doing an independent study on this stuff with Cat St. Croix (who is completely not an arrogant @$%&, by the way!), who helped me work out some of the stuff below]

(2) Is it Yablo-izable? Are there additional assumptions needed that are not needed for the Liar variant of Yablo?

(3) Is there a set-theoretic analogue of this paradox?

(4) Is it an instance of Priest's inclosure schema?

(5) Is the existence of a paradox that does not depend on factivity of any other philosophical significance?

(6) What is the connection between this paradox and Loeb's theorem? (There are, of course, deep connections between Loeb's theorem and both the Liar and the Knower!)

Discuss.

Friday, 24 June 2011

A song of love and logic

"In the land of P and not-P
You’d both be and not be mine"

(listen)

"The 21st Century Monads are an international musical collaboration whose songs address fundamental issues in philosophy, including specialized topics in contemporary analytic philosophy and the history of philosophy. The musical genres range from dance to folk. The songs are unique, original songs, not cheesy parodies."

Monday, 20 June 2011

The Law of Permutation and Paracompleteness

The law of permutation (C) is the following conditional law: $\vdash (A \rightarrow (B \rightarrow C)) \rightarrow (B \rightarrow (A \rightarrow C))$. In a sense (C) is characteristic of structural exchange, i.e., the sequent calculus rule that permits commutation of premises:
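In a standard sequent presentation, the (left) exchange rule licenses swapping adjacent premises:

$\frac{\Gamma, A, B, \Delta \vdash C}{\Gamma, B, A, \Delta \vdash C}$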


Hartry Field's paracomplete theory of truth rejects (C). I think this is a pretty remarkable and under-discussed aspect of his proposal. The reason the law has to go is that it is equivalent, under pretty minimal assumptions (see Anderson & Belnap 1975, 79), to the following law (A12):

$A \rightarrow ((A \rightarrow B) \rightarrow B)$

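To see one half of this equivalence (a quick sketch, assuming the identity axiom is available): the identity instance $(A \rightarrow B) \rightarrow (A \rightarrow B)$ has the shape $X \rightarrow (Y \rightarrow Z)$ with $X = A \rightarrow B$, $Y = A$ and $Z = B$, so one application of (C) plus modus ponens turns it into $A \rightarrow ((A \rightarrow B) \rightarrow B)$, which is just A12. The converse direction is where the (still fairly minimal) extra assumptions cited from Anderson & Belnap come in.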

From A12, together with another law that Field's algebraic semantics validates, conjunctive syllogism (CS),

$ ((A \rightarrow B) \land (B \rightarrow C)) \rightarrow (A \rightarrow C)$


Field shows that we can derive the law of contraction (W):

$(A \rightarrow (A \rightarrow B)) \rightarrow (A \rightarrow B)$


This leads to disaster, since most systems, including Field's, cannot have (W) on pain of the Curry Paradox. Field's diagnosis is that we have no choice but to reject (A12), and thus the law of permutation (C) with it. The cost is high, but Field suggests that it would be worse to give up the sensible-looking (CS). After all, (CS) simply looks like a law recording the transitivity of the conditional, a property we definitely want to preserve.
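To see why (W) spells Curry trouble, here is the familiar sketch, assuming modus ponens and a sentence $\kappa$ which (via naive truth plus diagonalization) is equivalent to $\kappa \rightarrow \bot$: from the equivalence we get $\kappa \rightarrow (\kappa \rightarrow \bot)$; (W) then gives $\kappa \rightarrow \bot$; the other direction of the equivalence gives $\kappa$; and modus ponens delivers $\bot$.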

Nevertheless, I think that this line of reasoning is not entirely satisfactory. If we instead give up (CS), we might be able to maintain both (C) and the equivalence with A12. As far as intuitions go, I suppose that (C) is a pretty natural-looking fellow too. Indeed, I'm not sure what would constitute a tie-breaker between (CS) and (C). But that aside, there is another consideration in favour of giving up (CS) rather than (C). (W) is the conditional law characteristic of structural contraction (LW), i.e.
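In standard sequent form, (LW) licenses contracting two copies of a premise into one:

$\frac{\Gamma, A, A \vdash C}{\Gamma, A \vdash C}$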


If we reject (LW) then it naturally follows that we give up (W) as well. More interestingly, if we give up (LW) then (CS) isn't normally derivable either. Notice that (CS) relies on it being possible to apply conjunction elimination ($\land$L) twice in order to derive the consequent. In sequent calculus this involves applying contraction to duplicate the antecedent, and then applying the conjunction rule twice (alternatively, having a conjunction rule with absorbed contraction, in which case full contraction is typically admissible). In natural deduction, we have to assume two copies of the antecedent, and subsequently use multiple discharge (the ND analogue of contraction) in a conditional proof application.

In other words, there is some proof theoretic reason to think that both (W) and (CS) are symptoms of the underlying presence of structural contraction. In contrast, (C) is attached to the structural rule of exchange. Whatever one may think intuitively about (CS), I think there is some reason to think that if we are already committed to rejecting (W), giving up (CS) as well is the move which is most in tune with the revisionary strategy.

Does that mean that we have a non-transitive conditional? No. It's just that expressing transitivity with the extensional conjunction is unfortunate if one wants to reject the law of contraction as well. There are other, pure conditional laws that express transitivity in both Field's system and related systems:
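The natural candidates are the suffixing and prefixing laws -- presumably these (or their rule forms) are the (A10) and (A11) referred to in the update below:

$(A \rightarrow B) \rightarrow ((B \rightarrow C) \rightarrow (A \rightarrow C))$

$(B \rightarrow C) \rightarrow ((A \rightarrow B) \rightarrow (A \rightarrow C))$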


Of course, that leaves us with an open question: is there a consistent logic in the vicinity of Field's which has (C) but not (CS)? I think the answer is yes. This brings us roughly to the area of the contraction-free relevant logic RW, and perhaps RW plus K.

(Cross-posted on www.hiddenabacus.com)

Update: Shawn Standefer made some very valuable comments. First, A10 and A11 above only hold in rule form, not as axioms. Second, although I wasn't aware of this, it looks like (CS) isn't valid in the semantics Field gives in the book Saving Truth From Paradox, although it does hold in the semantics of his earlier JPL paper.


PhD position in Konstanz and how to get women to apply for jobs

Over the weekend, Franz Huber (Konstanz) got in touch with me regarding a new PhD position they are advertising within the Formal Epistemology Research Group in Konstanz. He asked me to draw the attention of potential candidates, especially women, to the position. Franz is clearly committed to having as many female applicants as possible, as this is of course likely to increase the chances that the position will be filled by a woman. Here is the ad:

The Formal Epistemology Research Group invites applications for a PhD position in Theoretical Philosophy for an initial period of two years, starting October 1, 2011, or some date agreed upon. The position is subject to the positive evaluation of an interim report. Applications should include at least two letters of reference as well as a description of the dissertation project and/or a writing sample. They should be sent to: formal.epistemology@uni-konstanz.de by July 31, 2011.

Here is the link to the Formal Epistemology Research Group in Konstanz.

Besides advertising the position here as one for which women are particularly encouraged to apply, I’d like to offer some considerations on ‘gender differences’ concerning matters such as applying for jobs. I know the readership of this blog is overwhelmingly male (it’s just the demographics of the area), and it is pretty clear to me that men in general have no idea of what goes on when a woman sees a job ad which might be suitable for her. A woman’s first reaction is often to think that she is not suitable, that she does not fit the profile; in short, that she is not ‘good enough’. This phenomenon has been widely documented by studies in social psychology, and is certainly not restricted to PhD positions or academic jobs more generally. Typically, if there is a list of five items in the job description and a woman satisfies 4.8 of them, she will think she has no business applying; by contrast, if a man satisfies 3 of them, he already feels he should apply. There are all kinds of reasons why this is so, none of which entails gender essentialism; it is simply a consequence of how women’s potentials are perceived throughout their lives and of the fact that they have internalized a general feeling of inadequacy.

What I mean to say by all this is that if you, fairly senior male researcher, know of a suitable female candidate for this position, or any other position, she will typically need much more support and encouragement to go and apply than a male candidate. You will need to tell her explicitly that yes, she’s very talented, that she has what it takes for the job, that she should definitely apply. In fact, this holds also for women who have already taken up positions in environments that are predominantly (or overwhelmingly) male in their composition; such women are much more likely to feel inadequate and ‘not good enough’ from the mere fact that they are one of the very few women around. (This is essentially related to all kinds of unconscious mechanisms usually referred to as ‘implicit biases’, also widely documented in the psychology literature.)

The bottom-line is: as long as a given area is marked by serious gender imbalance, it is much harder to be working in the said area if you belong to the under-represented gender (and similar considerations hold for any other minority as well), simply because the feeling of inadequacy is constantly looming large. So it is only fair that the people in the under-represented group (women, in our case here) should receive additional support and encouragement, as they are swimming against the current non-stop. (Of course, there are a few women who somehow manage to neutralize the swimming-against-the-current effect, but this doesn't invalidate the point that the effect is there alright.) It would be very nice if senior people, both male and female, could keep this in mind when coaching their students and younger colleagues.


Thursday, 16 June 2011

Prizes and lectures in London

As a quick follow-up to Catarina's latest post, Dan Sperber has just been named Chandaria Laureate by the Institute of Philosophy in London. He and Hugo Mercier are currently giving a series of three lectures there. Audio podcasts of the first and second lecture are already out.

A while back, the LSE announced the decision for another London prize, the 2010 Lakatos Award in Philosophy of Science. The Prize has gone to Peter Godfrey-Smith (Harvard), for his book Darwinian Populations and Natural Selection (Oxford University Press, 2009), and Godfrey-Smith also gave his public lecture in London a couple of weeks ago. An mp3 can be found at this page.

Wednesday, 15 June 2011

Mercier and Sperber on the origins of reasoning

I just wrote a post at NewAPPS on a very interesting article on the origins of reasoning by Hugo Mercier and Dan Sperber. As the post is a bit too heavy on the psychology side of things, I suspect it is not exactly within the remit of M-Phi, so I won't cross-post it. But perhaps some M-Phi'ers will be interested, so here is the link. It should be relevant for discussions on rationality and Bayesian epistemology, which many of us M-Phi'ers are interested in.

Mercier and Sperber argue for the bold view that the real function of reasoning is argumentative, and thus not to improve knowledge or make better decisions. Here is the abstract:

Reasoning is generally seen as a means to improve knowledge and make better decisions. However, much evidence shows that reasoning often leads to epistemic distortions and poor decisions. This suggests that the function of reasoning should be rethought. Our hypothesis is that the function of reasoning is argumentative. It is to devise and evaluate arguments intended to persuade. Reasoning so conceived is adaptive given the exceptional dependence of humans on communication and their vulnerability to misinformation. A wide range of evidence in the psychology of reasoning and decision making can be reinterpreted and better explained in the light of this hypothesis. Poor performance in standard reasoning tasks is explained by the lack of argumentative context. When the same problems are placed in a proper argumentative setting, people turn out to be skilled arguers. Skilled arguers, however, are not after the truth but after arguments supporting their views. This explains the notorious confirmation bias. This bias is apparent not only when people are actually arguing but also when they are reasoning proactively from the perspective of having to defend their opinions. Reasoning so motivated can distort evaluations and attitudes and allow erroneous beliefs to persist. Proactively used reasoning also favors decisions that are easy to justify but not necessarily better. In all these instances traditionally described as failures or flaws, reasoning does exactly what can be expected of an argumentative device: Look for arguments that support a given conclusion, and, ceteris paribus, favor conclusions for which arguments can be found.
I have a lot of sympathy for this approach, but ultimately I think it is flawed, as I argue in my post at NewAPPS.

Monday, 13 June 2011

Roy's Fortnightly Puzzle: Volume 4

The axioms of separation:

For any set x and predicate F(...), there is a set y such that, for all z: z is a member of y iff (z is a member of x and F(z))

and of complement:

For any set x, there is a set y such that, for all z: z is a member of y iff z is not a member of x.

are incompatible with the existence of any sets. Proof:

Assume a set exists. Then by separation, the empty set exists. So by complement the universal set exists. So by separation, the Russell set of all non-self-membered sets exists. Russell's paradox follows.
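Spelling out the steps: separation applied to any set x with the condition 'z is not identical to z' yields the empty set; complement applied to the empty set yields a universal set V; and separation applied to V with the condition 'z is not a member of z' yields the Russell set R. Since everything is a member of V, R is a member of R iff R is not a member of R. Contradiction.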

So, we need to give up either separation or complement. Needless to say, standard mathematical practice has opted for the latter option.

The question is: Should we have? In other words, are there philosophically respectable reasons for thinking that it is complement, and not separation, that should be jettisoned?

Each of these principles corresponds to a natural intuition to have about the 'definiteness' or 'determinateness' of the results of certain set-forming operations. To put things in slightly silly terms:

Complement: If the sheep can be determinately isolated from the non-sheep, then the non-sheep can be determinately isolated from the sheep.

Separation: If the sheep can be determinately isolated from the non-sheep, then the black sheep can be determinately isolated from the non-black-sheep.

["non-black-sheep" here refers to anything that is not a black sheep, not anything that is a non-black sheep!]

The immediate answer that I usually get is that the intuition behind complement, but not the intuition behind separation, depends on the collection of all objects - the universal set - being determinate (so if the sheep are a determinate collection, then everything 'left over' is as well). But this strikes me as a little question-begging. After all, it is only our acceptance of separation (and of ZFC more generally) that throws the existence of the universal set into question. There are all sorts of alternative set theories (such as New Foundations or Boolos' DualV) that accept complement (and most of the standard ZFC axioms in the case of DualV) and hence accept the existence of a universal set.

So, what arguments can be given for the claim that it is separation and not complement that we should retain - arguments that don't depend on things like the iterative conception of set that already presuppose rejection of complement?

Thursday, 9 June 2011

Latest news on the 'inconsistency of PA' affair

UPDATE: I'm adding a few small changes to the original post, which were suggested by Steve Awodey.

(Cross-posted at NewAPPS.)

As previously reported (here and here), over the last few weeks there has been a heated debate within the foundations of mathematics community on what appeared to be a controversial statement by Fields medalist V. Voevodsky: the inconsistency of PA is an open problem. But what seemed at first a rather astonishing statement has become more transparent over the last couple of days, in particular thanks to posts by Bill Tait and Steve Awodey over at the FOM list. (Moreover, I’ve had the pleasure of discussing the matter with Steve Awodey in person here in Munich, in conversations which have been most informative.)

First, it might be useful to clarify some of the basic points of the program Voevodsky and others are engaged in. A few years back, Awodey established a neat correspondence between homotopy theory (Voevodsky’s main topic of research) and Martin-Löf’s constructive type theory. The result then led to the project of providing foundations for mathematics within the newly founded framework of Homotopy Type Theory and Univalent Foundations. One of the distinctive characteristics of the framework is that it is based on the notion of a continuum (homotopy is a branch of topology), while the more familiar set-theoretical framework is based on the notion of discrete elements (sets). Moreover, constructive type-theory is, well, constructive, so it is much more straightforward to maintain ‘epistemic control’ over proofs. At the time that Awodey’s correspondence results became known, independently, Voevodsky had also been interested in related ideas, and they are now actively collaborating on it (Martin-Löf is also involved, as is Thierry Coquand). Crucially, given its very distinctive starting point (continuity vs. discreteness) and its constructive nature, the framework could provide a truly revolutionary approach to the foundations of mathematics.

Now, within the bigger picture of things, the consistency of PA is actually a tangential, secondary issue. Voevodsky’s seemingly polemical statement concerning the potential inconsistency of PA in fact seems to amount to the following: all the currently available proofs of the consistency of PA rely on the very claim they prove, namely the consistency of PA, on the meta-level. (S. Awodey adds that it is a consequence of Gödel's results that all proofs of consistency of PA must use methods that are stronger than PA in the meta-language.) So if PA was inconsistent, these proofs would still go through; in other words, there is a sense in which such proofs are circular in that they presuppose the very fact that they seek to prove. I raised a similar point before in comments here: if PA was unsound (which it would be if it were inconsistent), it might be able to prove its consistency even if it is actually inconsistent (which also means that Hilbert’s original goal may have had a suspicious circular component all along). Now, we know by Gödel that PA cannot prove its own consistency, but all the available proofs of the consistency of PA seem to presuppose the consistency of PA (on the meta-level), so it boils down to roughly the same situation of epistemic uncertainty. Bill Tait sums up this general point in his latest message to FOM:

I understand him [Voevodsky] to be saying the following: PA is just a formal system and if it is inconsistent, then none of its formal deductions are reliable: false things can be proved without an explicit inconsistency appearing. He wants to say that this will not happen in type theory: the 'proofs' of a sentence \phi in type theory are not just strings of symbols, but are objects of type \phi that we actually construct and we can check that we have made a proper construction of such an object (a pair, a function, etc.). So even if there is an inconsistency in the large, individual proofs in type theory can be checked for reliability. For example, to see that a particular term (proof) t of type \forall x \phi(x) really yields a function of that type, one need 'only' to show that tn computes to an object of type \phi(n) for each number n.

So it would seem that the main conclusion to be drawn from these debates, as pointed out by Steve Awodey in one of his messages to FOM, is that the consistency of PA is really a minor, secondary issue within a much broader and much more ambitious new approach to the foundations of mathematics. Time will tell what will come out of it, but given the brilliance of the people involved in the project and its innovative character, we can all look forward to the results to come. For those who would like to know more and keep track of the project, their website is:

www.homotopytypetheory.org

Saturday, 4 June 2011

Science map

It's the worldwide network of scientific collaborations 2005-2009.

Cool, no?



(From Wired Science, where much more can be found.)

Wednesday, 1 June 2011

Explanation in Mathematics (Paolo Mancosu, SEP)

Updated version of Paolo Mancosu's very nice article on Explanation in Mathematics at SEP.

[Hat-tip to Chris Pincock at Honest Toil]

Intuitions Regarding Geometry Are Universal, Study Suggests

Are geometric intuitions universal? A study reported in Science Daily suggests:
All human beings may have the ability to understand elementary geometry, independently of their culture or their level of education. This is the conclusion of a study carried out by CNRS, Inserm, CEA, the Collège de France, Harvard University and Paris Descartes, Paris-Sud 11 and Paris 8 universities. It was conducted on Amazonian Indians living in an isolated area, who had not studied geometry at school and whose language contains little geometric vocabulary. Their intuitive understanding of elementary geometric concepts was compared with that of populations who, on the contrary, had been taught geometry at school. The researchers were able to demonstrate that all human beings may have the ability of demonstrating geometric intuition. This ability may however only emerge from the age of 6-7 years. It could be innate or instead acquired at an early age when children become aware of the space that surrounds them. This work is published in the PNAS.
The article is:
Véronique Izard, Pierre Pica, Elizabeth S. Spelke, and Stanislas Dehaene. 2011. Flexible intuitions of Euclidean geometry in an Amazonian indigene group. Proceedings of the National Academy of Sciences, 23 May 2011.