Tuesday, 31 January 2012

Looking for photos of Gentzen's manuscript

(Cross-posted at NewAPPS)

Ok, this is going to be a bit of a self-serving blog post, I hope readers will not mind too much...

I am now working on ideas for the cover of my forthcoming book, and after a few other attempts, I think I have finally found exactly what I need: a photo of the manuscript of Gerhard Gentzen’s doctoral thesis, Untersuchungen über das logische Schliessen. (For general historical background on proof-theory, see the wonderful SEP entry by Jan von Plato.) The manuscript was found in 2005 in the Paul Bernays collection of the ETH-Zurich, and is remarkable in many respects. One of them is that it contains a proof of normalization for natural deduction which was not included in the published version of the thesis. Dag Prawitz proved the same result some 30 years later (in 1965), and it was (and still is) seen as a groundbreaking result for proof-theory. It is quite astonishing to think that Gentzen had arrived at the same result but never went on to publish it! (For those who want to know more, here is the article by Jan von Plato containing a translation of the proof, and here is a blog post by Ole Hjortland commenting on Gentzen’s result.)

Now comes the self-serving part: I recall seeing photos of the manuscript at a talk once, but strenuous google searches have not yielded anything so far. Does anyone know where to find photos of Gentzen’s manuscript? (Needless to say, possible copyright issues would be dealt with by the publisher.) It is clear to me now that I simply cannot move on with my life unless a handwritten formula by Gentzen is on the cover of my book! This may be related to the fact that I’ve done (and still do) a lot of work on medieval logic (and indeed had the picture of a manuscript on the cover of my PhD dissertation); somehow, it seems that I’m still nurturing a manuscript obsession.

Any help on locating photos of Gentzen’s manuscript would be much appreciated!

Wednesday, 25 January 2012

What is your favorite deep, elegant, or beautiful explanation?

It's the 2012 Edge question. I quickly went through the responses (almost two hundred so far) and tried to track how often some big shots were cited. By my count, Einstein gets the highest score, followed closely by Darwin and Watson & Crick, in a tie. Turing beats Feynman and gets a rather striking draw with both Maxwell and quantum theory. Other noticeable mentions include Herbert Simon, the Pascal-Fermat correspondence, and, oh well, Marcel Duchamp. Believe it or not, one even finds a vote for structural realism! And M-Phi proper also gets its share, I guess, with Russell's theory of descriptions. Nice.

Saturday, 21 January 2012

The Fibonacci sequence, plant growth, and Vi Hart

(Cross-posted at NewAPPS)

The Fibonacci numbers are those in the following sequence of integers: 0, 1, 1, 2, 3, 5, 8, 13, 21, 34 etc. By definition, the first two numbers are 0 and 1, and each subsequent number is the sum of the previous two. The sequence is named after Fibonacci, aka Leonardo of Pisa, who introduced the sequence (known already in Indian mathematics) to Western audiences in his famous book Liber Abaci (1202) – which, by the way, is also one of the main sources for the dissemination of Hindu-Arabic numerals in Europe, no less. (Fibonacci had learned ‘Eastern’ mathematics while studying to become a merchant in North Africa -- see an earlier post on the importation of Indian and Arabic mathematics into Europe through a sub-scientific, merchant tradition.)

It has been known for a long time that the ratio between a number in the sequence and its immediate predecessor converges towards what is referred to as the golden ratio or ‘phi’: 1.618… It has also been known that the Fibonacci sequence and the golden ratio permeate a great amount of natural phenomena, in particular plant growth. This observation has given rise to all kinds of mystical, Pythagorean hypotheses on how numbers really are the building blocks of reality.
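The convergence of consecutive Fibonacci ratios towards phi is easy to check numerically. Here is a minimal Python sketch (the function name is my own, purely for illustration):

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers: 0, 1, 1, 2, 3, 5, ..."""
    seq = [0, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

# The golden ratio, phi = (1 + sqrt(5)) / 2 = 1.618...
phi = (1 + 5 ** 0.5) / 2

seq = fibonacci(20)
# The ratio of the last two terms already agrees with phi
# to roughly seven decimal places.
print(seq[-1] / seq[-2], phi)
```

Already at the twentieth term the ratio and phi differ by less than 10⁻⁷, which gives a concrete sense of how fast the convergence is.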

Enter Vi Hart. She is a video artist whose creations explain complex mathematical concepts in a fun, accessible way, mostly through “doodling in math class”. These videos can be seen as a manifesto against traditional math education and a plea for approaches which preserve the beauty and exhilaration that can be experienced through mathematics. She has a new sequence of videos in which she explains why the Fibonacci sequence is such a pervasive pattern in plant growth: the naturalist explanation (which has been known by botanists for a while) is that following a Fibonacci pattern allows plants to maximize sun exposure. No need to call upon Pythagorean hypotheses! Here is Part 1 of the series; for Parts 2 and 3, click here and here (H/T John Baez).



The applicability of mathematics to the study of natural phenomena is an old but still widely discussed philosophical question: is math (numbers in particular) really in the world, or are we just imposing an approximated, artificial order upon unruly natural phenomena? If numbers are not in the world, how come mathematics is such a powerful tool to describe these phenomena? But if numbers are in fact in the world, who put them there in the first place? Vi Hart’s Fibonacci videos suggest that what may appear to be almost eerie can also have a perfectly reasonable, down-to-earth explanation; the only premise to be accepted is the idea that, given time, the evolutionary development of living beings will tend towards maximization of use of resources that are important for survival (in this case, sun light exposure).

But anyway, eerie or not, one thing is clear: math is a many splendored thing. (And so is nature.)


Wednesday, 18 January 2012

Announcing the MCMP Round Table on Acceptance

The Munich Center for Mathematical Philosophy has laid the cloth: You are invited to join us for the Round Table on Acceptance with Hannes Leitgeb (MCMP), Kevin Kelly (CMU Pittsburgh/Center for Formal Epistemology), and Hanti Lin (CMU Pittsburgh). This public MCMP event will take place on February 3rd, 2012; it will centre on probabilities, belief, and acceptance, and will bring the philosophical theories of Leitgeb and Kelly/Lin to one table.
Download more info about venue and programme here (low-res PDF).

Monday, 16 January 2012

Paradoxes: something's gotta go, but what? And why?

(Cross-posted at NewAPPS)

In the coming months, I’ll be teaching a course on paradoxes, which will focus on historical and methodological rather than technical aspects, so it is quite likely that there will be a constant stream of blog posts on paradoxes. We shall see…

One useful definition of a paradox is the one offered by M. Sainsbury in his highly influential book Paradoxes (p.1):

This is what I understand by a paradox: an apparently unacceptable conclusion derived by apparently acceptable reasoning from apparently acceptable premises. Appearances have to deceive, since the acceptable cannot lead by acceptable steps to the unacceptable. So, generally, we have a choice: either the conclusion is not really unacceptable, or else the starting point, or the reasoning, has some non-obvious flaw.

Paradoxes create a situation of cognitive dissonance which must be resolved in one way or another. (Moreover, in those fields of inquiry where theories cannot be straightforwardly tested empirically against reality, if a paradox arises, it is often seen as a sign of the inadequacy of the theory.) Different entrenched beliefs are shown to be in tension with one another, so something’s gotta give. But what should give?

Sainsbury’s passage already suggests the three main alternatives: i) one of the premises must be rejected; ii) one of the steps of the reasoning involved must be rejected; iii) the apparent unacceptability of the conclusion must be revised. Different solutions to the Liar paradox illustrate these three approaches:


1) Tarskian approaches reject some of the premises, namely some of the principles guiding a naïve conception of truth.

2) Revisionist approaches (e.g. Field’s) revise the logic underlying the reasoning giving rise to the paradox.

3) Dialetheist approaches reject the apparent unacceptability of the conclusion that there is a sentence (the Liar sentence) which is both true and false.

Naturally, some solutions to paradoxes blend more than one of these approaches, but any proposed solution to paradoxes must take at least one of these three routes to dispel the situation of cognitive dissonance.

I’ve professed elsewhere my sympathy for type 3 approaches, i.e. those which are not afraid to embrace the conclusion after all, in spite of its apparent counter-intuitiveness. Naturally, the point is not simply to uncritically accept the unacceptable conclusion, but rather to view it as a possibly non-trivial, genuine discovery; the history of science is full of surprising discoveries.

But rather than arguing for type 3 approaches, today my goal is to elaborate a bit on why I especially dislike type 2 approaches, i.e. revisionist approaches. My distaste for them does not stem from a particular fondness for classical logic, or whatever other well-entrenched system of reasoning. Rather, my issue with type 2 approaches is that such rejections of certain rules of inference typically have a ‘fix-up’ feel to them: this particular rule has served us well until now, seems well-motivated, but in order to avoid paradox, we should just ditch it, for no other reason. However, if a given rule or principle of logic is to be rejected, it seems to me that such a rejection must be based on independent grounds, not only the fact that the paradox may be blocked.

Another reason for rejecting such revisionist approaches (and one which has been articulated by e.g. Stewart Shapiro in presentations) is that you may end up with a logical system that is no longer useful as a tool for reasoning. Try doing mathematics with the ‘logic’ developed by Field to cope with paradoxes! You may be able to avoid paradoxes, but the price is just too high. Or to pursue the ‘pathology’ metaphor which is often used in connection with paradoxes: you may cure the disease, but the treatment is so violent that you are left with a highly dysfunctional organism. (So there might be pragmatic reasons for adopting type 3 approaches as well, something along the lines of ‘stop worrying and learn to love the paradoxes’.)

This way of setting up the issue occurred to me when skimming through the recent paper by Elia Zardini in RSL 4(4), 2011, ‘Truth without contra(di)ction’ (which I haven’t gotten around to giving all the attention it deserves yet!). The abstract says:

I propose a new solution to [semantic] paradoxes, based on a principled revision of classical logic. Technically, the key idea consists in the rejection of the unrestricted validity of the structural principle of contraction.

Contraction is the structural rule according to which if premises A1, A2 … B, B … entail C, then premises A1, A2 … B … also entail C; that is, one of the copies of B has been ‘deleted’ (contracted), and the consequence still holds. Contraction allows for the ‘deletion’ of premises that occur multiple times (as long as at least one copy is left in place), and is a valid rule in most (but not all) logical systems available in the literature. Linear logic is one of the few prominent logical systems which reject contraction.
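In sequent-calculus notation (writing Γ for the other premises), the rule in question can be displayed as follows:

```latex
% Left contraction: a duplicated premise B may be merged into one copy.
\[
\frac{\Gamma, B, B \vdash C}{\Gamma, B \vdash C}
\quad (\text{contraction})
\]
```

Read bottom-up, this is what makes a single occurrence of a premise reusable as many times as one likes; restricting the rule, as Zardini proposes, blocks exactly that kind of reuse in the derivations that generate the paradoxes.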

In section 2.3 of his paper, Zardini briefly discusses the gist of his independent motivation for rejecting – in fact, restricting – contraction. He writes (p. 504):

But what is the intuitive rationale for restricting contraction? What is it about the state-of-affairs expressed by a sentence that explains its failure to contract? […] I believe that in attempting to find these answers one has to step out of the abstract realm of formal theories of truth and engage in some concrete metaphysics of truth.

He then goes on to argue for the idea of unstable states-of-affairs as those where contraction would fail. Others (e.g. Ole Hjortland) have been thinking about the idea that contraction may be the ‘real villain’ in terms of giving rise to paradoxes. But to my knowledge Elia is the first to ask the question of the plausibility for rejecting/restricting contraction outside a formal system, and independently of the goal of blocking paradoxes in and of itself. I still need to think more carefully about his idea of ‘unstable states-of-affairs’, but at least I think he is asking the right questions.

This way to go about type 2 solutions to paradoxes, i.e. revisionist solutions, makes me much happier than the usual ‘fix-up’ approaches that abound in the literature. I sure hope that revisionists will follow Elia’s lead and start engaging in deeper philosophical, not only technical, analyses of why a given logical principle or rule is to be rejected.

Friday, 6 January 2012

Two Conferences at MCMP: Posters



Here are the new posters for two of our upcoming 2012 events. The design is by Roland Poellinger, one of the center's fellows. 

Thursday, 5 January 2012

2nd CFP: Ninth Annual Formal Epistemology Workshop (FEW 2012)


We are happy to announce that the Ninth Annual Formal Epistemology Workshop (FEW 2012) will be held in Munich, May 29 - June 1, 2012. This year's meeting is sponsored by the Munich Center for Mathematical Philosophy. The meeting will take place at the (stunningly beautiful) Nymphenburg Palace (compliments of the Carl Friedrich von Siemens Foundation).


We are accepting submissions for contributed papers. The deadline for submissions is January 15, 2012. Notifications will be sent out by March 15, 2012. Please send submissions to Branden Fitelson. A selection of papers presented at FEW 2012 will be published in a special issue of Erkenntnis.

Some funding will be available for graduate student participation. Please contact Hannes Leitgeb for more information.

There will be two special (afternoon) sessions at this year's FEW. The first will be a special session on Logic & Rationality, which will include talks by David Christensen, Gila Sher, and Robbie Williams, and the second will be a memorial session for Horacio Arló-Costa, which will include talks (pertaining to Horacio's various seminal philosophical contributions) by Cristina Bicchieri, Eric Pacuit, Rohit Parikh, and Paul Pedersen.

We will also have two (two-part) tutorials, presented by Jeff Paris (inductive probability) and Charlotte Werndl (determinism, indeterminism, and underdetermination).

This year's local organizers are Hannes Leitgeb, Florian Steinberger, Vincenzo Crupi, and Ole Hjortland.

FEW 2012 is being funded by the Munich Center for Mathematical Philosophy.

Monday, 2 January 2012

What Scientific Theories Could Not Be (Halvorson)

Occasionally, the issue of the model-theoretic, or "semantic", view of scientific theories has been mentioned here on M-Phi. I've spent ages arguing that this view is mistaken. My objections have always been these. First, a model $\mathcal{A}$ (or a collection $\Sigma$ of models) is not a truth-bearer, whereas a scientific theory must be a truth-bearer: otherwise one simply assumes instrumentalism from the start, which is cheating. Second, if one tries to remedy this by saying that a collection $\Sigma$ of models is "true" if and only if some $\mathcal{A} \in \Sigma$ represents the world, then the notion of a "structure representing the world" has never been either defined precisely or explicated coherently. Third, if one tries to do this in the most obvious way, then one is led very quickly to a Newman problem: a structure $\mathcal{A}$ represents the world just if the world is large enough. This is because the world isn't, or isn't obviously, a structure in the usual mathematical sense. And fourth, there is a non-obvious way to remedy the Newman problem, which involves defining the notion of an interpretation of a structure; but this, in effect, reintroduces the notion of an interpreted language, $(\mathcal{L}, \mathcal{I})$ -- precisely the thing the advocates of the "semantic view" insisted should be avoided! (Interesting, is it not, that advocates of a so-called "semantic" view should reject the concept of an interpretation: perhaps the central concept of semantics.) Some of these objections have also been made by others (e.g., Anjan Chakravartty in 2001 and Roman Frigg in 2006).

The philosopher of physics Hans Halvorson (Princeton) now has a very interesting preprint on the Pittsburgh PhilSci archive called "What Scientific Theories Could Not Be", which I believe is forthcoming in Philosophy of Science. Halvorson raises several further objections against the semantic view of theories. In particular, he argues that by identifying a theory with a collection of models and analysing equivalence in terms of isomorphisms between models, the view gets theoretical equivalences wrong. It's a very stimulating paper and I recommend it to anyone interested in this important topic within General Philosophy of Science.