Monday, 30 May 2011

Roy's Fortnightly Puzzle: Volume 3

Okay, this one has been keeping me up at night for at least two decades.

What is the value of 0^0?

Mathematicians seem to agree that if it has a unique value at all, the value is 1 (there seems to be less consensus on the antecedent here, however).
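(As an aside: programming languages have had to legislate the question, and the major conventions side with the 'unique value 1' camp. A quick check in Python, offered purely as an illustration, not as evidence:)

```python
import math

# Integer exponentiation follows the "0^0 = 1" convention,
# and so does floating-point pow (as in IEEE 754's pown).
print(0 ** 0)              # 1
print(math.pow(0.0, 0.0))  # 1.0
```

Whether this counts as evidence or merely as convention piled on convention is, of course, part of the puzzle.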

The puzzle, of course, is not merely figuring out the answer. The puzzle is to say something about what the criteria for deciding such a question might be. In particular:

Are mathematical practice and convenience the only arbiters here?

Might the philosopher of mathematics have something interesting to say about cases like this?

Is this an uncomfortable situation for the platonist (since the choice of 1 over 0 seems like a case where the facts are just stipulated for convenience, and not 'discovered' via examination of the platonic forms or whatnot)?

Have at it.

On the origins of analytic philosophy

(Cross-posted at NewAPPS.)

Greg Frost-Arnold has a nice post on the origins of the phrase ‘analytic philosophy’, in particular on when it began to be used roughly with its current meaning. He has a useful chart showing that already in the 1960s the phrase was being regularly used, whereas ‘continental philosophy’ only became more prominent as a phrase in the 1990s.

But besides the question of uses of the terminology, in comments to the post different people have been presenting insightful remarks on the origins of the very idea of ‘analytic philosophy’ as a particular way of doing philosophy. Greg himself (in comments) offers the following observation as a starting point:

Go back to Europe in 1932. We have the following 3 intellectual groups: the phenomenologists (esp. Husserl and Heidegger), the folks dedicated to (something like) Moorean analysis, and the Vienna Circle and their intellectual allies (perhaps we include here the Lvov-Warsaw school).

He then goes on to argue, and correctly to my mind, that analytic philosophy emerged essentially from the ‘fusion’ of the last two groups, the Mooreans and those who viewed logical analysis as the main philosophical methodology (Vienna Circle etc.). Continental philosophy would have naturally emerged from the phenomenology tradition. (An aside: for a while, I entertained the hypothesis that the analytic/continental divide could be explained by observing which of the Critiques each tradition focused on: mostly the first for analytic philosophy, mostly the third for continental philosophy. But Eric Schliesser told me at some point that this hypothesis doesn’t work, and although I have forgotten his arguments, I found them compelling at the time.)

Anyway, I think Greg hit the nail on the head in identifying Moorean ‘common sense’ philosophy and logical positivism (or at least a strong emphasis on the role of logic as a methodology for philosophy) as the two main sources for the development of analytic philosophy as we know it. But this also means that this tradition has always had a mild schizophrenic component, so to speak (and I hope disability philosophers will not take offense here!). In a sense, Moorean common sense philosophy is inimical to the conception of logical analysis in question, which at least to some extent *is* about discovering new facts.

Let me make this more precise. In comments, Greg quotes the ‘statement of policy’ of the very first edition of the journal Analysis, in 1933:

"the contributions to be published will be concerned, as a rule, with the elucidation or explanation of facts, ... the general nature of which is, by common consent, already known; rather than with attempts to establish new kinds of facts" (p.1).

Clearly, it doesn’t get more Moorean than this… And why am I saying that this is in tension with the project of using logical analysis as a key philosophical methodology? After all, one can (and does) also use logical analysis for the explanation of facts which are already known. However, as I see it, this is definitely not the core of the good old Leibnizian idea of using logic to discover new truths, which was to a great extent one of Frege’s main sources of inspiration (although it is true that the very logicist project can be seen as the search for a logical analysis of already known mathematical facts; but it seems to me that this is really just the beginning of the story).

Let me mention a couple of examples. In ‘On Denoting’, when Russell claims that the actual logical form of sentences such as ‘The present king of France is bald’ is not the subject-predicate form but rather a tri-partite claim, he is clearly questioning ‘already known facts’ by common consent. Similarly, the Tractatus is filled with un-commonsensical statements. A little later, in his works on truth and logical consequence, Tarski does take something that he himself refers to as the ‘common conception’ of these concepts as his starting point to formulate criteria of adequacy for his formal theories. But it’s clear that these formal theories are intended to go well beyond merely offering an elucidation of these ‘common conceptions’. In the same vein, Carnapian explication does not in any way rule out the possibility of establishing ‘new kinds of facts’ about its object of analysis (which may well be already known facts, but as a starting point).

It is no secret to anybody that I am no enthusiast of Moorean intuition-based philosophical methodology (as explained here and here), so I am not neutral on any of this. But the genealogy offered by Greg Frost-Arnold seems to me to highlight the fact that this tension between sticking to (and explaining) what is known and discovering new facts (in particular, by means of logical analysis) has always been at the heart of analytic philosophy, insofar as it emerged from the confluence of two rather distinct approaches to philosophy. This explains for example the reactions to Michael Strevens's definition of philosophy in a post at Leiter’s blog: “The point of philosophy is to defy common sense.” (For the record, I couldn’t agree more.) It’s Mooreanism still alive and kicking! But modern counterparts of logical positivism are also alive and kicking, and it is surprising that people are not more aware of the intrinsically conflicting nature of philosophical methodology within analytic philosophy.

(Let me also add a little plug for Mike Beaney's great entry on the concept of analysis at SEP.)

Wednesday, 25 May 2011

MCMP video podcasts (Berit Brogaard, Volker Halbach)

LMU’s Virtuelle Hochschule, Armin Rubner and Erik Keller have done a fantastic job preparing MCMP's first two video podcasts. Berit Brogaard (UMSL)  and Volker Halbach (New College, Oxford) have the honour of being first out.

Brogaard: Do 'Looks' Reports Reflect the Contents of Perception?

Halbach: The conservativity of truth and the disentanglement of syntax and semantics

Check them out here.

Tuesday, 24 May 2011

The Three Paradoxes

Okay, so some of you might have noticed by now that I am into comics, and am going to freely inform you about math-and-logic related comics that I run across (both the good - see Peanuts below - and the bad - see Superman below). Hell, I think I am the only person in the world publishing both on logic and on the aesthetics of comics. Give me a break ;)

Anyway, most of you are probably already aware of A. Doxiadis et al.'s Logicomix - the story of Russell's logic in comic book form (I reviewed it for History and Philosophy of Logic, for anyone interested in that sort of thing!)

Recently another comic involving the history of logic has appeared: Paul Hornschemeier's The Three Paradoxes. The comic tells five distinct but interconnected stories. The title derives from one of these stories: Hornschemeier's brilliant seven-page depiction of Zeno of Elea presenting his three paradoxes of motion to the Athenian philosophers (he leaves out the Stadium on Parmenides' advice!)

The Zeno material in the comic is connected to a passage where Hornschemeier sees a scarred childhood acquaintance for the first time in years and finds himself tongue-tied. Afterwards he and his father compare this 'frozen' feeling - this inability to speak or move appropriately - to Zeno's paradoxes of motion.

There are a couple notable things about the portion of the comic that depicts Zeno explaining his puzzles to the Athenians. First off is the fact that Hornschemeier gets the paradoxes right (something Doxiadis couldn't be bothered to do in Logicomix even though his co-author is a professional computer scientist - see the review above for details). Second, however, is his depiction of Socrates and Socrates' reaction to Zeno. After hearing about the paradox he exclaims:

"Man, no offense. But are you guys retarded?"

Then he asks what happens if Zeno is successful, and they all come to believe Zeno's views - views they didn't previously hold. He then answers his own question:

"Here is Athens we call that shit change. Change! What do they call that shit over in Elea?"

It's both hilarious and, in my opinion, completely in keeping with Socrates' character (insofar as we know what that is).

Good stuff. I recommend it.

[By the way, I have decided to let the offensive use of 'retarded' slide, since the comic is so great otherwise.]

Sunday, 22 May 2011

European formal philosophers, you've been warned...

(Folk over at Choice and Inference are very amused.)

UPDATE: Leiter has added an update to his original post.

It has come to my attention that one self-identified "European formal philosopher," who shares the concerns about the editorial misconduct at Synthese, has taken offense at my attempt to introduce a note of levity into this affair. So I hereby make clear that, of course, not all formal philosophers in Europe are willing to excuse the editorial misconduct; the title was prompted primarily with one particularly obtuse defender of the EICs in mind (Reinhard Muskens, a frequent commenter at the New APPS blog). I apologize for the unfair implication!
If that's about me, I neither took offense nor am I a 'self-identified "European formal philosopher"', but the clarification is much appreciated (less so for the use of the term 'obtuse' here).

Saturday, 21 May 2011

How to Write Proofs, 2

In the earlier post, "How to Write Proofs, A Quick Guide", I gave the example of a simple kind of result that one might come across (and should know how to prove) in intermediate logic:
(*) Suppose $S_0$ is $P \rightarrow Q$ and $S_{n+1}$ is $P \rightarrow S_n$. Show that, for all $n$, $S_n$ is equivalent to $P \rightarrow Q$.
The obvious proof uses induction (on $n$, as one says). We want to show that every number $n \in \mathbb{N}$ has a certain property: namely, that the formula $S_n$ is equivalent to $P \rightarrow Q$. We can show this by induction. Induction says that if a property holds of $0$, and holds of $k+1$ whenever it holds of $k$, then it holds of all numbers. So, induction proofs proceed in three steps. First (the base step) show the property holds of $0$. Second (the induction step) show that, assuming it holds of $k$, it also holds of $k+1$. Finally, conclude that it holds of all numbers.

$\textbf{Base}$. I.e., when $n = 0$.
$S_0$ is $P \rightarrow Q$. This is obviously equivalent to $P \rightarrow Q$.

$\textbf{Induction Step}$. I.e., from $k$ having the property, we want to show $k+1$ has it too.
So, suppose $S_k$ is equivalent to $P \rightarrow Q$. (This is the Induction Hypothesis.) We want to show that $S_{k+1}$ is equivalent to $P \rightarrow Q$ as well.
First, note that $S_{k+1}$ is defined to be $P \rightarrow S_k$. The Induction Hypothesis tells us that $S_k$ is equivalent to $P \rightarrow Q$. But we can use the simple lemma that "the substitution of equivalents leads to equivalents". So, we can conclude that $S_{k+1}$ is equivalent to $P \rightarrow (P \rightarrow Q)$. We then only need to show that this is equivalent to $P \rightarrow Q$. I.e., that $P \rightarrow (P \rightarrow Q) \equiv P \rightarrow Q$. This can be done with a truth table. So, that completes the induction step.

From the Base Step and the Induction Step, we can conclude, using the Induction Principle, that for all $n$, $S_n$ is equivalent to $P \rightarrow Q$, as required. $\sf{QED}$.
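For readers who like to check such claims mechanically, here is a small brute-force verification in Python (a sketch of my own; the names `implies` and `S` are just illustrative, not part of the exercise). It is not a substitute for the induction proof, of course, but it confirms the result for small $n$:

```python
from itertools import product

def implies(a, b):
    """Material conditional: a -> b."""
    return (not a) or b

def S(n, p, q):
    """Evaluate S_n at truth values p, q, following the definition:
    S_0 = P -> Q  and  S_{k+1} = P -> S_k."""
    v = implies(p, q)          # S_0
    for _ in range(n):
        v = implies(p, v)      # S_{k+1} = P -> S_k
    return v

# Check that S_n has the same truth table as P -> Q, for n = 0..10.
for n in range(11):
    assert all(S(n, p, q) == implies(p, q)
               for p, q in product([True, False], repeat=2))
print("S_n is equivalent to P -> Q for every n checked")
```

The loop over rows of the truth table is exactly the "this can be done with a truth table" step, applied at each $n$.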

Thursday, 19 May 2011

But WHAT is an open problem in mathematics?

UPDATE: This is the second post in a series of three on the same topic; the first one is here and the third one is here.

Two days ago I reported here on an ongoing debate over at the FOM list on some controversial statements made by Fields medalist Voevodsky on the status of the consistency of PA as a mathematical problem. In particular, I mentioned that Harvey Friedman had reported sending a message to Voevodsky, asking for clarifications: "how you view the usual mathematical proof that Peano Arithmetic is consistent, and to what extent and in what sense is "the consistency of Peano Arithmetic" a genuine open problem in mathematics."

Now Friedman reports that Voevodsky has replied:
Such a comment will take some time to write ...

To put it very shortly I think that in-consistency of Peano arithmetic as well as in-consistency of ZFC are open and very interesting problems in mathematics. Consistency on the other hand is not an interesting problem since it has been shown by Goedel to be impossible to proof [sic].
So, what do we make of this? I am now more and more convinced that Toby Meadows (in correspondence) has it right when he says that there are different senses of a 'mathematical open problem' floating around. Toby suggested a weak and a strong reading of 'open problem/question' in this context:
On a weak reading of "open question", we might understand it as a question that is worth asking ... perhaps a research programme worth taking up. In this case, the fact that so much mathematical practice presupposes Con(PA) is going to be good evidence that Con(PA) is not open on this way.
The strong reading would be "that it is not the case that the question has been settled one way or the other in such a way that it is "impossible" for it to be otherwise."

On the strong reading, Con(PA) is perhaps not an open question, as Goedel's results have shown that, assuming Con(PA), we cannot prove Con(PA) in PA alone; this is exactly what Voevodsky seems to be saying. However, as again noted by Toby, this doesn't mean that ~Con(PA) is not an open question: we know we cannot prove Con(PA) in PA, but we don't know yet whether we can or cannot prove ~Con(PA) in PA! Admittedly, this is a *very* unlikely outcome, but there is as of yet no proof that ~Con(PA) cannot be proved in PA. And this is precisely what Voevodsky seems to be saying, at least as I understand him.

Another point worth mentioning is that here he seems to be relying on a Hilbertian sense of proving the consistency of a system T: Con(T) is only proved if it is proved in T itself, and it is perhaps in this sense that he (wrongly, to my mind) dismisses Gentzen's proofs. (I have lots of thoughts on this too, including foundational worries of circularity; if T is unsound, it might well prove Con(T) even though that's not the case, in particular given that Sou(T) --> Con(T), and an unsound T may well be able to prove Sou(T). But that's for another time.)

Anyway, all this to raise the deeply philosophical question: what is an open problem in mathematics? Are there different senses of 'open problem' floating around? Might this be what is behind all the debate between FOM'ers? M-Phi'ers seem particularly well-positioned to discuss these matters, so please go ahead and shoot!

(On this note, I'm off to Rio tomorrow, so internet access is going to be disrupted. But you can of course debate without me!)

Wednesday, 18 May 2011

Peanuts and Platonism

Schulz, June 29, 1954.

[I recommend that you click on the photo for a larger, more legible version]

Tuesday, 17 May 2011

Voevodsky: "The consistency of PA is an open problem"

UPDATE: This is the first post in a series of three on the same topic; the second one is here and the third one is here.

Those of you who subscribe to the FOM list have certainly noticed that it's been "a lot more interesting than usual", in the terms of Archean Toby Meadows. For those of you who do not subscribe to the list (perhaps precisely for the reasons implicated in Toby's remark!), here's a summary (but you can read the messages yourself at the FOM archives for this month). There are some videos of lectures by Fields medalist Voevodsky made available on the internet, where (at least according to some) he seems to display a complete lack of understanding of Goedel's incompleteness results (here and here). In Neil Tennant's words:
He stated the theorem as follows (written version, projected on the screen):
It is impossible to prove the consistency of any formal reasoning system which is at least as strong as the standard axiomatization of elementary number theory ("first order arithmetic").
So he failed to inform his audience that the impossibility that Goedel actually established was the impossibility of proof-in-S of a sentence expressing the consistency of S, for any consistent and sufficiently strong system S.
As we know, Gentzen's proofs of the consistency of PA are among the most important results in proof-theory, second only to Goedel's results themselves and perhaps Prawitz' normalization results. (For a great overview of Gentzen's proofs, see von Plato's SEP entry.) What I find most astonishing about Gentzen's proofs, based on transfinite induction, is that the theory obtained by adding quantifier free transfinite induction to primitive recursive arithmetic is not stronger than PA, and yet it can prove the consistency of PA (it is not weaker either, obviously; they just prove different things altogether). One may raise eyebrows concerning transfinite induction (and apparently this is what lies behind Voevodsky's dismissal of Gentzen's results), but apparently most mathematicians and logicians seem quite convinced of the cogency of the proof. In Vaughan Pratt's words (he's my favorite regular contributor to FOM, and we've corresponded on a number of interesting topics):
Since Gentzen's proof is a reasonably straightforward induction on epsilon_0 in a system tailored to reasoning not about numbers under addition and multiplication but about proofs, one can only imagine Voevodsky rejects Gentzen's argument on the ground that Goedel's result must surely show that no plausible proof of the consistency of PA can exist, hence why bother thinking about any such proof?
So Voevodsky seems to seriously entertain the possibility of PA's inconsistency. Is it because he doesn't understand Goedel's results, or Gentzen's results, or both? Or is there something else going on? Even granting that a great mathematician is not necessarily savvy on such hair-splitting foundational issues, as pointed out by Juliette Kennedy, "that Voevodsky's coworkers in Univalent Foundations are Awodey and Coquand seals the deal for me." In other words, we have reasons to believe that Voevodsky should know his way around even in foundational issues (his website linked above says that "he is working on new foundations of mathematics based on homotopy-theoretic semantics of Martin-Lof type theories.")

H. Friedman reports having sent him a letter asking for clarifications, in particular "how you view the usual mathematical proof that Peano Arithmetic is consistent, and to what extent and in what sense is "the consistency of Peano Arithmetic" a genuine open problem in mathematics." So the debate is likely to continue, and (needless to say) it clearly has all kinds of important philosophical implications.

Monday, 16 May 2011

Roy's Fortnightly Puzzle: Volume 2

Sorensen's No-No paradox [2001] consists of two statements F1 and F2 such that (where, inside the truth predicate T(...), a formula H stands for the Goedel code of H):

|- F1 <-> ~T(F2)
|- F2 <-> ~T(F1)

(loosely speaking, each statement is equivalent to the claim that the other is false). Now, given the F1 and F2 instances of the T-schema, we have:

|- F1 <-> ~ F2

Symmetry considerations suggest, however, that F1 and F2 ought to have the same truth value. Hence the 'paradox'.

The question I will set before you today is this. Assume we are working in a consistent extension of Q, and given a predicate P(x), consider two statements G1 and G2 such that:

|- G1 <-> P(G2)
|- G2 <-> P(G1)

Under what conditions is it the case that:

|- G1 <-> G2

In other words, when can we prove the symmetry claim?

Here is a start: If P is a provability predicate, then we can prove G1 <-> G2 (actually, we can prove both G1 and G2 individually!)

Harder?: With the set-up the same as the above, consider a 'Sorensen n-cycle':

|- G1 <-> P(G2)
|- G2 <-> P(G3)
|- G3 <-> P(G4)
...
|- Gn-1 <-> P(Gn)
|- Gn <-> P(G1)

Under what conditions can we prove:

|- G1 <-> G2
|- G2 <-> G3
...
|- Gn-1 <-> Gn
|- Gn <-> G1

Again, P being a provability predicate is sufficient.

Have at it.

[Important Disclaimer: I do not have a general solution to this. I do have some partial results that I will dole out if the question attracts sufficient interest.]

[Edited to correct typo noted by Shane]

Sunday, 15 May 2011

Synthese: the EiC's response to the petition

At least some of the pending issues seem to be settled by this (not all, though).

(The petition was here, and Leiter Reports is once again my source.)

Thursday, 12 May 2011

Philosophy and its technicalities

"There would be something badly wrong if work in the philosophy of physics were as accessible to a linguist as to a physicist, or if work in the philosophy of language were as accessible to a physicist as to a linguist."

It's from a brand new piece by James Ladyman in The Philosophers' Magazine on Philosophy that's not for the masses. Well deserving of a read.

Wednesday, 11 May 2011

Exciting new directions in formal semantics

(Cross-posted at New APPS)

The Amsterdam Colloquium is a biennial event focusing mostly (though not exclusively) on formal semantics and formal pragmatics. Its 18th installment will be held December 19 - 21, 2011 at (surprise, surprise!) the University of Amsterdam. The call for papers is out, and there will also be three thematic workshops: Formal semantics and pragmatics of sign languages, Formal semantic evidence, and Inquisitiveness.

Especially the first two workshops indicate that formal semantics and pragmatics as a field is moving in refreshingly new directions (and the third one is also bound to be interesting). The workshop on sign languages shows that the field is finally paying attention to the significant dissimilarities between languages expressed in different media: written, spoken, and in this case sign language. Following Roy Harris and others, I am convinced of the crucial importance of incorporating these differences into linguistic theorizing. To my mind, it is deeply misleading to speak of ‘natural language’ as a blanket term covering both speech and writing (and possibly sign languages too). Moreover, I am with Linell in identifying a chronic ‘written language bias’ in language studies in general, which means that features proper to speech (intonation, prosody) tend to be overlooked. Ironically, features proper to writing are also unduly ignored (as argued e.g. in this great paper by Sybille Krämer).

A workshop on sign language clearly indicates the realization that each form of human language must be studied also from the point of view of its specific medium and the features arising from using the medium in question (clearly, one of my motivations to insist on this aspect is my belief in the fruitfulness of embodied approaches in general). There is some work already being done within formal semantics and pragmatics on e.g. intonation (for instance, by my colleague Floris Roelofsen), and now that sign language is receiving attention in the field, it all seems to be going in a very good direction. Moreover, given that most researchers are not competent users of sign language themselves, such studies will necessarily have a strong empirical component, which brings me to the second workshop.

Working in Amsterdam, I’ve always been exposed to large amounts of formal semantics and pragmatics, and while I’ve always marveled at the technical ingenuity displayed, philosophically I always felt a bit uncomfortable with what exactly was going on. What is formal semantics a model of? Does it purport to describe actual cognitive processes of language users? Does it describe the ‘mathematical’ properties of language, considered as ontologically independent from speakers? (Roughly, this seems to have been Montague’s original take.) And as long as it is not clear what exactly the target phenomenon is for these theories, it is also not clear where the evidence to support or refute a given formal semantic theory should be coming from. So I worried about the kind of epistemic confirmation these theories were based on, and similarly about the explanatory power they could have (what exactly were they explaining?). In other words, I felt that formal semantics and pragmatics as a field was in dire need of serious methodological reflection. I was not alone there, as my boss Martin Stokhof, formerly a full-blown formal semanticist (in particular in his joint work with Jeroen Groenendijk: the logic of questions, dynamic predicate logic etc.), has focused extensively on the philosophical and methodological foundations of the enterprise in recent years, taking a rather critical stance (see his papers here; he is also supervising the dissertation of my friend and co-author Edgar Andrade-Lotero on the philosophical foundations of formal semantics).

Now, given the reservations that I’ve had through the years, I am thrilled to see a whole workshop at the Amsterdam Colloquium dedicated to methodological reflection on the foundations of formal semantics (not surprisingly, it is organized by my very talented colleagues Katrin Schulz and Galit Weidman Sassoon). From the description of the workshop:

Formal semantics as a field of linguists undergoes a rapid change with respect to the status of quantitative methodologies, the application of which is gradually becoming a standard in the field, replacing the good old 'armchair' methodology. In light of this development, we invite submissions reporting of high level formal semantic research benefiting from the use of a quantitative methodology, corpora-based, experimental, neurolinguistic, computational or other.

Clearly, this is a debate with deep philosophical implications, and particularly significant against the background of the sustained debates on methodology that have been taking place in philosophy in general over the last years. In other words, this post is not only intended as a plug for the Amsterdam Colloquium :) It is also intended to suggest that the field of formal semantics and pragmatics may be moving in exciting new directions; to my mind, if these directions continue to be pursued, the field will gain substantially in philosophical depth.

Tuesday, 10 May 2011

Two important blogs

One more post with feminist activism, and then I promise to give you all a break for a while! (^_^)

It occurred to me now that, after my previous post on gender imbalance, it would be good also to advertise two important blogs run by Jender of the Feminist Philosophers: What is it like to be a woman in philosophy?, with all kinds of depressing stories sent by readers (and some 'good news' stories too!), but also What we're doing about what it's like, with submissions on what people have been doing so as to improve the situation of women in philosophy.

So in a sense they form a 'bad news'/'good news' pair, and unsurprisingly, the bad news one gets a lot more submissions and readership. But lots of people have been doing all kinds of interesting things, and 'What we're doing' is an important source of ideas for those looking for inspiration on how to get involved with improving things. It deserves just as much attention as its 'bad news' sibling.

Let me also notice that, while most of my activism is in the feminist direction, I certainly keep in mind that there are all kinds of other minorities in philosophy and elsewhere facing similar difficulties. In particular, I think it's much harder for people working outside the mainstream axis North-America/Western Europe to be taken seriously in the profession; there are also strong 'geographical biases' operating, and in the medium run I'm hoping to be able to do something about this too.

Friday, 6 May 2011


And you thought super-ventriloquism was Superman's most useless power.

There are just so many things wrong with this.

M-Phi and gender (im)balance

Philosophy has a notoriously skewed gender balance, and things tend to be even worse in more techy areas such as logic, philosophy of mathematics etc. Those of you who know me are probably aware that this is something I worry about a lot, for all kinds of reasons. For starters, poor gender balance is self-perpetuating: if there is a strong association between a certain area and men, which is reinforced precisely by the low number of women in the area, then young women who might otherwise consider pursuing their interests in the given area are very likely to feel unwelcome and uncomfortable, thus turning to areas and occupations that they feel are more 'congenial'. And so the cycle continues... Hoping that things will get better by themselves has proven to be an overly optimistic attitude, precisely in virtue of the self-perpetuating nature of the phenomenon.

So instead, I am convinced that it is only by making conscious efforts towards redressing gender imbalance that we are likely to make any progress (yes, affirmative action). It would be crucial, for example, to increase the visibility of the women who already do good work in a given field typically associated with 'maleness', so as to counter the stereotype, and one way this can be done is by trying to increase the proportion of women as speakers at conferences. With this in mind, I have created a list of women working in philosophical logic and philosophy of logic, which is meant, among other things, to serve as a source of ideas for conference organizers. The list is being constantly updated, and suggestions for additions are always very welcome!

Another important measure is mentoring/coaching individual women at the early stages of their careers. It is probably very hard for men to understand that women have to counter a lot of biases to pursue their interests in a given 'male' area, including self-imposed biases. They often need to be explicitly told that they have what it takes, that they do belong in a given field, that they have good potential. Most women will profit from the kind of reassurance that might be seen as 'overkill' by men. So I ask you, fellow M-Phi'ers, to pay particular attention to the talented young women around you who are constantly confronted with feelings of inadequacy, and who could certainly benefit from some extra encouragement.

Well, there is lots more that I could write on the topic, but I will leave it at that for now. To be sure, I certainly don't intend to use this blog as a constant outlet for my feminist activism (fortunately, I already have New APPS for that!), but I thought it would be worth reminding the readers of this blog that this is an important issue, and one which requires a conscious effort to be addressed. We are so used to the situation that we usually don't even pause to think of its utter absurdity.

Thursday, 5 May 2011

Coherence not measured by probability

Mark Siebel has an interesting paper in the new Analysis arguing that the popular programme of measuring coherence by probability cannot succeed. The essence of his argument is that if G explains E better than H does, then {G,E} is more coherent than {H,E}, so any measure of coherence must assign these two sets different degrees of coherence. Yet there are cases of this kind in which G and H are logically equivalent, so that the joint probability distributions of G and E and of H and E are the same; a consequence is that no probability-based measure will distinguish {G,E} from {H,E}. Seems pretty cogent.
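The probabilistic half of the argument can be made concrete with a toy example (mine, not from Siebel's paper), using Shogenji's well-known ratio measure as a stand-in for 'any probability-based coherence measure': logically equivalent hypotheses determine the same joint distribution with the evidence, so the measure cannot tell them apart.

```python
from itertools import product

# A toy probability space over two atoms A and B, with a uniform measure.
worlds = list(product([True, False], repeat=2))
P = {w: 0.25 for w in worlds}

def prob(f):
    """Probability of the proposition expressed by the function f(a, b)."""
    return sum(P[w] for w in worlds if f(*w))

def shogenji(f, g):
    """Shogenji's ratio measure of coherence: P(f & g) / (P(f) * P(g))."""
    return prob(lambda a, b: f(a, b) and g(a, b)) / (prob(f) * prob(g))

E = lambda a, b: b                     # the evidence
G = lambda a, b: a and b               # one formulation of the hypothesis
H = lambda a, b: not (not a or not b)  # a logically equivalent reformulation

assert all(G(*w) == H(*w) for w in worlds)  # same truth conditions...
assert shogenji(G, E) == shogenji(H, E)     # ...hence the same coherence score
print(shogenji(G, E))
```

The explanatory asymmetry Siebel appeals to would thus have to distinguish G from H even though no function of the joint distribution can.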

Wednesday, 4 May 2011

Inconsistent Math Curricula Hurting US Students, Study Finds

Reading this headline in my inbox from Science Daily, I wrongly thought it might be about dialetheism: "Oh, so this is what happens when inconsistent mathematics is taught to students!" But apparently not. It's about the effects of different levels of difficulty in mathematics curricula in different places.
A new study finds important differences in math curricula across U.S. states and school districts. The findings, published in the May issue of the American Journal of Education, suggest that many students across the country are placed at a disadvantage by less demanding curricula.
Interesting nonetheless.

Tuesday, 3 May 2011

The KK principle and empirical data

So, today I went up to Groningen to attend a lecture by T. Williamson, with the title "Very improbable knowledge". As it turns out, I didn't make it in time for the lecture, as there were massive train delays due to adventurous goats crossing the train tracks, which then got hit by a train (I do feel sorry for the goats!). I arrived just in time for the Q&A, and while I probably shouldn't have had the nerve to ask a question after missing the talk itself, there was one question that just *had* to be asked.

At some point during the Q&A, the KK principle came up, namely the principle that knowing p entails knowing that you know p (Kp --> KKp). Williamson was arguing against it (as he presumably had during the lecture) on conceptual grounds, suggesting that there are situations where it simply does not seem to hold (some convoluted thought-experiment). His interlocutor at that point, Allard Tamminga, insisted that the KK principle is fundamentally correct, and it wasn't clear how the debate could be carried on any further.

So I asked Williamson whether he thought that a debate on the KK principle might benefit from attention to empirical data. He first thought that I meant carrying out surveys, asking people whether they thought that the KK principle holds -- which sort of defeats the purpose, as they may know the principle and yet not know that they know it! (duh...) I then clarified that I actually had psychological experiments in mind, in particular the kind of thing that has been investigated under the heading of meta-cognition in recent years (very cool stuff!). He was still not very enthusiastic about my idea, arguing that, just as psychological phenomena are not reliable guides to the truth of mathematical statements, they cannot be reliable guides to the truth of logical principles, and according to him KK is a logical principle.

But is it really? I admit that I tend to think that an awful lot of questions are ultimately empirical questions, but I take KK to be essentially about cognition, and thus not a logical principle as such (unless one wants to take the Kantian transcendental road to cognition and infer everything a priori). In fact, many of the results in the meta-cognition tradition seem to suggest that we often 'know' without knowing that we know (and similarly, that we often think we know when we actually do not know). In other words, the meta-cognition literature investigates, among other things, the accuracy with which we judge our own epistemic capacities. My hunch is that the results basically support the view that KK does not hold, but a more serious investigation would have to be carried out for more definitive claims. In any case, I think it is pretty obvious that it could be very interesting to look into the meta-cognition material from the point of view of KK.

I must say that I was surprised to see Williamson so dismissive of the possible relevance of empirical data to this issue. After all, in The Philosophy of Philosophy, he does enlist the mental models theory of reasoning to argue against the inferentialist programme (something that Ole still owes us a paper on!). (Basically, his argument doesn't work, but at least it starts from the premise that empirical data could be relevant to such philosophical debates.) But today, he was not in any way sympathetic to the idea -- a shame, if you ask me.

Monday, 2 May 2011

Logic and external target phenomena

It is a real pleasure to have been invited to contribute to M-phi! As some of you may know, I also contribute to the New APPS blog, where I write about all kinds of things besides logic (feminism, philosophical methodology, current affairs). But from now on, I will be cross-posting my posts on logic and the more ‘formal’ parts of philosophy here; I look forward to debates with the knowledgeable readership of this blog :)

For my first post here, I’d like to discuss the very general question of what logical theories are theories of, if anything at all, and to inquire into the appropriate terminology to be used in such contexts. In a forthcoming JPL paper (co-authored with Edgar Andrade-Lotero), we start with the following remark:

Let us start with a fairly uncontroversial observation. Generally speaking, a logical system can be viewed from (at least) two equally important but fundamentally different angles: i) it can be viewed as a pair formed by a syntax, i.e. a deductive system, and a semantics, i.e. a class of mathematical structures onto which the underlying language is interpreted; or ii) it can be viewed as a triad consisting of a syntax, a semantics and the target phenomenon that the logic is intended to capture. In the first case, both syntax and semantics are viewed as autonomous mathematical structures, not owing anything to external elements. In the second case, both syntax and semantics are accountable towards the primitive target phenomenon, which may be an informally formulated concept, or even phenomena in the ‘real world’ (e.g. logics of action, logics of social interaction, quantum logic etc.). Indeed, in the second case, both syntax and semantics seek to be a ‘model’ in some sense or another of the target phenomenon.

In Chap. 12 of Doubt Truth to be a Liar, Graham Priest draws a similar distinction between pure vs. applied logics. As I read him, the distinction is not really intended to differentiate logical systems as such, but rather to outline different attitudes one can have towards a logical system. He then goes on to argue that, from the applied logic point of view, the canonical application of logic is correct reasoning. In a similar vein, Paoli (JPL, 2005) draws on Quine's distinction between immanent and transcendent predicates, and remarks:

According to Quine, in fact, logical connectives are immanent, not transcendent. There is no pretheoretical fact of the matter a theory of logical constants must account for; rather, the vicissitudes of a connective are wholly internal to a specified formal language, to a given calculus. There is nothing, in sum, that precedes or transcends formalization, no external data to “get right”.

The issue is very general and does not concern logical connectives in particular. The key opposition is between the internal features of a logical theory, and its potential relation with external phenomena, which the logical theory purports to be a model of. Although Quine (in Philosophy of Logic) seemed to mean something slightly different with his notions of immanent and transcendent predicates, I find Paoli’s appropriation of the terminology quite fitting.

My question now is: whenever there is something transcendent that a logical system is intended to capture, what is the appropriate terminology to refer to these external phenomena? In practice, the adjectives 'pre-theoretical' and 'intuitive' are often appended to whatever target phenomenon a given logical system is 'about' (the 'being about' part is also what needs to be explained). Presumably, the idea is that the phenomenon is conceptually prior to its systematization within the theory; this seems right on the 'transcendent' approach, but there are problems.

Elsewhere, I have objected to the qualification 'pre-theoretical' as attributed to the notion(s) of logical consequence which is (are) presumably the target of the familiar technical accounts of logical consequence (proof-theoretic, model-theoretic). The trouble with the terminology is that it suggests a theoretically neutral target phenomenon, emerging from 'common, everyday' practices (terms used in Tarski's seminal 1936 paper). In truth, the notions in question are inherently couched in robust theoretical frameworks (T. Smiley (1988) and P. Smith (2010) make similar points; the latter specifically criticizes Field's misconception of what is 'squeezed' in a squeezing argument). My general worry is that, by describing these notions as 'pre-theoretical' and 'intuitive', we seem to be suggesting that they are transparent and unproblematic, whereas what is often required to make philosophical progress in these discussions is precisely a deeper understanding of the target phenomenon as such. (Shapiro's and Prawitz's papers in the 2005 Handbook are good attempts in this direction.)

So ‘pre-theoretical’ and ‘intuitive’ are problematic; what could possibly be used instead? I’ve been contemplating using ‘extra-systematic’ or ‘extra-theoretic’, but they don’t sound all the way right either. In a sense, perhaps there is no terminology to be used across the board, precisely because the target phenomena of different logical systems may be widely dissimilar kinds of phenomena. Some of them may come closer to what one could describe as ‘intuitive’ (e.g. the truth predicate as used in everyday language), while others will be grounded in a considerable amount of theorizing (e.g. the validity predicate, extensively discussed in blog posts recently – my general position is that it is a conceptual mistake to treat the truth predicate and the validity predicate on a par, even though there are interesting technical connections). So for the time being, I continue to use the vague and uninformative phrase ‘target phenomenon’, which is more of a place-holder, but this may well be what is required here.

(Alternatively, one may simply maintain that there are no target phenomena that logical systems seek to capture in any interesting way, i.e. that everything in a logical system is an immanent matter. Although frustrating for a variety of reasons, this remains an available move for the theorist.)

Sunday, 1 May 2011

Roy's Fortnightly Puzzle: Volume 1

Intuitionism and "Unless"

Anyone who has taught an introductory logic course will be familiar with the difficulties students have regarding how to properly translate "unless" into propositional logic. The correct translation of "F unless G" is:

T1. F v G

(Although it is surprising how many supposedly professional logic teachers get this wrong.) We often motivate this translation in terms of various inferences we make with "unless", taking advantage of the fact that T1 is logically equivalent to both:

T2. ~ F -> G
T3. ~ G -> F

in classical logic. By De Morgan's laws, the disjunctive translation of "unless" is also classically equivalent to:

T4. ~ (~ F & ~ G)
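As a quick sanity check on these classical equivalences (a brute-force sketch of my own, not part of the original puzzle), one can verify that T1-T4 agree on all four classical valuations of F and G:

```python
from itertools import product

def impl(a, b):                # material conditional: A -> B
    return (not a) or b

def T1(f, g): return f or g                   # F v G
def T2(f, g): return impl(not f, g)           # ~F -> G
def T3(f, g): return impl(not g, f)           # ~G -> F
def T4(f, g): return not (not f and not g)    # ~(~F & ~G)

# All four translations agree on every classical valuation of F and G.
for f, g in product([True, False], repeat=2):
    assert T1(f, g) == T2(f, g) == T3(f, g) == T4(f, g)
```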

Now, here is the puzzle: How should the intuitionistic logician translate "F unless G"? Note that each of the four options is intuitionistically distinct from the other three.

The initial temptation to go with the simplest (and 'strongest') translation (T1) conflicts with the thought that the "un" in "unless" suggests the presence of negation. A more sophisticated approach is to examine inferences that intuitionists make with "unless" and determine which translation is supported. Now, all four translations support the following inferences:

I1a. F unless G, not-F, therefore not-not-G.
I2a. F unless G, not-G, therefore not-not-F.

The question then becomes whether "unless" intuitionistically supports the following stronger inference rules:

I1b. F unless G, not-F, therefore G.
I2b. F unless G, not-G, therefore F.

T1 supports both inferences, T2 supports I1b but not I2b, T3 supports I2b but not I1b, and T4 supports neither I1b nor I2b.
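These claims can be checked against Kripke semantics for intuitionistic logic. As a sketch (the tuple encoding of formulas is my own, not from the post), here is a two-world model where F holds only at the refinement and G holds nowhere, which is a countermodel showing that T2 does not support I2b:

```python
# Kripke semantics for intuitionistic propositional logic over a fixed
# two-world model: world 0 (the root) is refined by world 1.
WORLDS = (0, 1)
LE = {(0, 0), (0, 1), (1, 1)}          # reflexive, transitive order

def forces(w, phi, val):
    """val maps each world to the set of atoms true there (must be persistent)."""
    op = phi[0]
    if op == 'atom':
        return phi[1] in val[w]
    if op == 'or':
        return forces(w, phi[1], val) or forces(w, phi[2], val)
    if op == 'and':
        return forces(w, phi[1], val) and forces(w, phi[2], val)
    if op == 'imp':   # w forces A -> B iff every v >= w forcing A also forces B
        return all(not forces(v, phi[1], val) or forces(v, phi[2], val)
                   for v in WORLDS if (w, v) in LE)
    if op == 'not':   # ~A is A -> falsum: no refinement of w forces A
        return all(not forces(v, phi[1], val)
                   for v in WORLDS if (w, v) in LE)

F, G = ('atom', 'F'), ('atom', 'G')
T2 = ('imp', ('not', F), G)            # ~F -> G

# Countermodel to I2b under T2: F holds only at world 1, G holds nowhere.
val = {0: set(), 1: {'F'}}
assert forces(0, T2, val)              # the root forces ~F -> G (vacuously:
                                       # no world forces ~F, since 1 forces F)
assert forces(0, ('not', G), val)      # the root forces ~G
assert not forces(0, F, val)           # yet the root does not force F
```

Swapping in T4 as `('not', ('and', ('not', F), ('not', G)))` with the valuation `{0: set(), 1: {'G'}}` similarly yields a countermodel to I1b for T4; the positive claims (e.g. that T1 supports both inferences) of course need a proof rather than a model search.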

So, how should intuitionists translate "unless"?

Roy's Fortnightly Puzzle: Introduction

Okay, I am starting a new feature on M-Phi, and will attempt to keep it going as long as I have ideas.

I will post a simple puzzle involving logic or the philosophy of mathematics that is either a little too 'cute' to actually publish, or something I thought about but didn't make any substantial progress on.

A new puzzle will be posted every two weeks (roughly), as the title suggests.

The hope, of course, is that these simple puzzles will motivate us to think about issues in a new way. Of course, if that doesn't happen, and they are merely fun, then that works too.