Monday, 21 May 2012

Canada Research Chair in Logic and the Philosophy of Science -- Calgary

Richard Zach draws my attention to a very attractive position that the Calgary Philosophy Department is currently advertising: a Tier I Canada Research Chair in Logic and the Philosophy of Science. It is a research-intensive, tenured position with a very low teaching load, intended for those keen on developing their own long-term research program. In short, it is a great opportunity at a department that is already very strong in logic and related areas. The official announcement with further details can be found below.
-------------------------------------------------------------------

The Department of Philosophy at the University of Calgary invites applications and nominations for a Tier I Canada Research Chair in Logic and the Philosophy of Science. The Canada Research Chairs program has been established by the Government of Canada to enable Canadian universities to foster excellence in research and teaching. Further information on the program is available on the CRC website at www.chairs.gc.ca.

We are seeking an established scholar and a leader in any area of logic or the philosophy of science. The successful candidate will have an outstanding record of research, teaching and graduate supervision, and an innovative research program. The appointment, at the rank of Associate Professor or Professor, is expected to start on July 1, 2013.

Specific inquiries about this position may be directed to:

Ali Kazmi, Head
Department of Philosophy
University of Calgary
Email: akazmi at ucalgary.ca

All Chairs are subject to review and final approval by the CRC Secretariat. Applications, including a CV, a writing sample, a teaching dossier, a description of a 7-year research plan, and the names and contact information of three referees, may be sent to:

Merlette Schnell, Manager
Department of Philosophy
University of Calgary
2500 University Drive NW
Calgary, Alberta T2N 1N4
CANADA
Email: schnell at ucalgary.ca

Monday, 14 May 2012

The 'logic-as-umpire' view of logic: a Kantian relic

(Cross-posted at NewAPPS)

As widely circulated on the internet, Tim Williamson has a recent post at The Stone on the purported ‘neutrality’ of logic. He writes:
Here’s an idea many philosophers and logicians have about the function of logic in our cognitive life, our inquiries and debates. It isn’t a player. Rather, it’s an umpire, a neutral arbitrator between opposing theories, imposing some basic rules on all sides in a dispute. The picture is that logic has no substantive content, for otherwise the correctness of that content could itself be debated, which would impugn the neutrality of logic. One way to develop this idea is by saying that logic supplies no information of its own, because the point of information is to rule out possibilities, whereas logic only rules out inconsistencies, which are not genuine possibilities. On this view, logic in itself is totally uninformative, although it may help us extract and handle non-logical information from other sources. 
The idea that logic is uninformative strikes me as deeply mistaken, and I’m going to explain why. […]
He then goes on to argue against this Tractarian account of logic as uninformative, and against the view that there can be no rational debates about logical principles. Now, the information gain afforded by a piece of deductive reasoning is a topic that has received quite some attention recently, in particular from philosophers of information such as Luciano Floridi and my colleague and friend Sebastian Sequoiah-Grayson. Similarly, the idea of rational debates about logical rules and principles dates back at least to the Putnam-Dummett debate, and is alive and well in recent discussions about logical revision (Hartry Field, among others, has written on it). On the face of it, then, Williamson is not saying anything strikingly new, but because the view he criticizes is still utterly pervasive (and utterly wrong!), such critiques remain important.

What could one do to convince the proponent of the ‘neutrality-umpire’ view of logic that she is wrong? One approach that I’ve defended on several occasions (such as here) is what I call ‘conceptual archeology’: why would anyone have thought that this is a compelling account of the nature of logic in the first place? What reasons and arguments allowed this view to establish itself as widely endorsed, indeed almost as a truism about logic? (This exercise of ‘conceptual archeology’ and ‘deconstruction’ is part of a larger project to argue for a dialogical reconceptualization of logic.) As I’ve argued elsewhere (for example, in a BBS commentary), the view of logic as an ontologically neutral umpire, as having no substantive content, is essentially a Kantian view, at the heart of the project of transcendental idealism.

As shown by B. Longuenesse in her Kant and the Capacity to Judge (1998), Kant takes as his starting point the transcendental question, “what are the a priori conditions for the representations of objects in general?”, and reconfigures the logic of his time so as to render it useful for his transcendental project. In particular, he selectively absorbs the notions of “judgment,” “form,” and “categories” as found in the logical textbooks of the time, and puts them to use so as to describe the very conditions of possibility of our thinking and perceiving. Moreover, Kant insists on the normative import of the rules of thought as described by logic. According to him, (general) logic deals with “absolutely necessary rules of thought without which there can be no employment whatsoever of the understanding” (KrV A52/B76).

For Kant, general logic has no substantive content because it pertains to the forms of thought as such, with no connection to objects whatsoever (not even the a priori conditions for the relation of the understanding to objects, which is the domain of transcendental logic). Similarly, since the laws of logic determine the very conditions of thought as such, no rational debate can be had about them, as they are presupposed in any rational debate.

I used to think that this conception of logic was entirely new with Kant, but my PhD student Leon Geerdink (a walking encyclopedia of the history of philosophy) tells me that much of it can already be found in Kant’s predecessors, Wolff in particular. At any rate, what is clear is that it was the historical influence of Kantianism in the 19th and 20th centuries that saw to it that the view criticized by Williamson established itself as the received view on the nature of logic. However, once you start asking yourself what reasons you might have to endorse it, beyond the weight of tradition, it quickly becomes apparent that one would be hard-pressed to find any. The neutrality and umpire claims make good sense against the background of transcendental idealism (something also argued by MacFarlane in his PhD dissertation), but if you are not prepared to buy into that whole framework, what other reasons would you have to endorse the ‘logic-as-umpire’ and ‘logic-as-ontologically-neutral’ view? So far, I’ve never come across a truly convincing argument, and as argued by Williamson and others, there seems to be a wealth of reasons why one should not endorse this view. And yet, it is alive and well. How long will we remain under this Kantian spell?

Monday, 7 May 2012

Non-monotonicity is a fever

(Cross-posted at NewAPPS)

In his 2008 paper ‘Logical dynamics meets logical pluralism?’, Johan van Benthem writes (p.185):
… Many observations in terms of structural rules address mere symptoms of some more basic underlying phenomenon. For instance, non-monotonicity is like ‘fever’: it does not tell you which disease causes it.
I’ve always been puzzled by this observation – among other reasons because I’m a non-monotonicity enthusiast, so it seemed odd to me that non-monotonicity should be treated as the symptom of some disease! But beyond the disease metaphor, it was also not clear to me why Johan saw non-monotonicity as an unspecified, possibly multifaceted phenomenon. After all, there should be nothing esoteric about non-monotonicity: a non-monotonic consequence relation is one where the addition of new premises/information may turn a valid consequence into an invalid one. The classical notion of validity has monotonicity as one of its defining features: once a consequence, always a consequence, come what may. This is why a mathematical proof, if indeed valid/correct, remains indefeasible forever.
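To fix the terminology before going on, monotonicity can be stated schematically as follows (this is just the standard textbook formulation, not a quotation from van Benthem; ⊨ stands for whatever consequence relation is under discussion):

\[
\text{Monotonicity: if } \Gamma \models \varphi, \text{ then } \Gamma \cup \Delta \models \varphi \text{ for any further set of premises } \Delta.
\]

A consequence relation is non-monotonic precisely when this schema can fail for some choice of Γ, Δ and φ.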

The non-monotonic logics which developed in recent decades, such as circumscription and default logics, have as their main feature the fact that something that counts as a valid consequence may become invalid upon the arrival of new information. As described by A. Antonelli in the SEP entry on non-monotonic logics:
The term “non-monotonic logic” covers a family of formal frameworks devised to capture and represent defeasible inference, i.e., that kind of inference of everyday life in which reasoners draw conclusions tentatively, reserving the right to retract them in the light of further information. Such inferences are called “non-monotonic” because the set of conclusions warranted on the basis of a given knowledge base, given as a set of premises, does not increase (in fact, it can shrink) with the size of the knowledge base itself. This is in contrast to standard logical frameworks (e.g., classical first-order logic), whose inferences, being deductively valid, can never be “undone” by new information.
The monotonic nature of the classical notion of validity is precisely one of the aspects I want to investigate in more detail in the coming years with my ‘Roots of Deduction’ project, and the initial hypothesis is that necessary truth-preservation and monotonicity are not primitive concepts but rather corollaries of the notion of an indefeasible argument, understood in a dialogical setting.

Now, last week I was in Konstanz presenting some of the preliminary results of the project, in particular how my ‘built-in opponent’ conception of deduction (I have a draft of a paper on this, available upon request) sheds new light on the model-theory vs. proof-theory debate on logical consequence. I was, as usual, focusing on monotonicity and claiming that the points made applied to classical logic and also to other logics where necessary truth-preservation is a defining feature of the consequence relation; I had in mind things like intuitionistic or relevant logic.

But as pointed out to me by Ole Hjortland during the Q&A, the relevant consequence relation is not monotonic, in that weakening fails. Weakening is the following structural rule, in its sequent calculus formulation:

Γ, A => B, Δ
-------------------
Γ, A, C => B, Δ

(There is a counterpart on the right side too, but for present purposes it is only left-weakening that matters.) In other words, if B follows from Γ together with A, then adding a further premise C is not going to change that – which is pretty much the property of monotonicity in sequent form. Now, weakening fails for relevant logic: given the requirement that the premises must somehow be topically ‘related’ to the conclusion – must be about the same ‘things’ – it naturally follows that one cannot add arbitrary premises and still maintain the required relevance relation between premises (antecedent) and conclusion (consequent).
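For illustration (a standard observation about relevant logics such as R and E, not something specific to Ole’s remark): at the level of the conditional, left-weakening corresponds to the so-called ‘positive paradox’ schema, which relevant logics reject precisely because B may have nothing whatsoever to do with A:

\[
A \rightarrow (B \rightarrow A)
\]

Accepting this schema would allow an arbitrary, irrelevant antecedent B to be added and then discharged, which is exactly what the relevance requirement rules out.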

But clearly, the reason why relevant logic is non-monotonic is profoundly different from the reason why, say, default logic is non-monotonic. In the first case, necessary truth-preservation remains a necessary – though not sufficient – property of the consequence relation. What relevant logicians want to say is that the impossibility of the premises being the case while the conclusion is not the case is necessary but not sufficient for validity; something else is needed, namely a relation of relevance, which in practice blocks or at least restricts the ex falso rule, among others. By contrast, in non-monotonic logics such as default logic or circumscription – which operate with notions like default extensions and minimal/preferred models – necessary truth-preservation is not even a necessary condition for the consequence relation. There is also something very different about the corresponding responses to the arrival of new information which invalidates a previously established consequence: for relevant logic, new information is viewed as an intruder, disrupting the previously established relation of relevance between antecedent and consequent; for non-monotonic logics such as default logic and others, the effect of the arrival of new information on defeasible reasoning is precisely what is of interest.
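To see the two failures of monotonicity side by side, here are the textbook illustrations (again, standard examples from the literature rather than anything discussed in Konstanz); relevant logic rejects ex falso because the contradictory premises bear no relevance to the conclusion, whereas a defeasible consequence relation (written |~ below) withdraws a previously warranted conclusion when new information comes in:

\[
\text{Relevant logic: } A \wedge \neg A \not\vdash B
\]
\[
\text{Defeasible consequence: } \mathit{Bird}(t) \mathrel{|\!\sim} \mathit{Flies}(t), \quad \text{but} \quad \mathit{Bird}(t),\ \mathit{Penguin}(t) \mathrel{|\!\not\sim} \mathit{Flies}(t)
\]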

I still need to think more carefully about the implications of all this for the central role that I ascribe to the monotonicity vs. non-monotonicity dichotomy in my ‘built-in opponent’ story, but at least for now I can say I understand better what Johan van Benthem means when he says that non-monotonicity is like ‘fever’; it arises from phenomena as diverse as relevance concerns and the defeasibility of ‘everyday’ reasoning.

Two professorships at the ILLC in Amsterdam

The Institute for Logic, Language and Computation (ILLC) at the University of Amsterdam has just announced two vacancies for full professorships, in philosophy of language and philosophical logic. The first position is to replace Jeroen Groenendijk and Martin Stokhof, who will both retire in a few years, and is thus meant to cover a good deal of formal semantics as well. The second position is meant to replace Frank Veltman, and here the emphasis on the interface between logic and cognition is important.

I have worked within the Logic & Language research group of the ILLC for four years, and to say that it is a highly stimulating environment is an understatement... More generally, the ILLC is without a doubt one of the world's leading centers in logic and related areas, and is known for its focus on interdisciplinary research cutting across mathematics, computer science, linguistics, philosophy, and more. So it is to be expected that competition for these two positions will be fierce! Still, everyone fitting the profile and fancying a relocation to Amsterdam should most definitely apply; my understanding is that there is no pre-determined internal candidate, so everyone stands a chance.

Tuesday, 1 May 2012

Philosophy of Information at Hertfordshire University


FOURTH WORKSHOP ON THE PHILOSOPHY OF INFORMATION
Boardroom, McClaurin Building, University of Hertfordshire, Hatfield

Thursday 10 May 2012
9.00 Welcome, Tea, Coffee
9.30 GREGORY WHEELER (Keynote Speaker)
Is there a logic of information?
10.00 PATRICK ALLO
Tentative inference, open worlds and the informational conception of logic
10.30 – 11.00 Coffee Break
11.00 SIMON D’ALFONSO
The logic of knowledge and the flow of information
11.30 SONJA SMETS
Playing for knowledge
12.00 ORLIN VAKARELOV
From interface to correspondence: How to recover classical representations in a pragmatic theory of semantic information
12.30 – 13.30 Lunch Break
13.30 ANTHONY BEAVERS
Transcendental philosophy in the age of information: Floridi’s Neo-Kantian epistemology
14.00 BERT BAUMGÄRTNER
Defence of modeling vagueness with discrete degrees
14.30 NIR FRESCO & PHILLIP STAINES
A revised attack on digital ontology
15.00 PHYLLIS ILLARI
Information and causal inference: in defence of generality
15.30 MARK JAGO
Bounded rationality and epistemic blindspots
16.00 – 16.30 Tea Break
16.30 ERIC KERR
An informational approach to epistemic agency
17.00 LUCIANO FLORIDI
A plea for antinaturalism
17.30 – 18.00 Final discussion and end of first day
19.00 Workshop Dinner

Friday 11 May 2012
9.00 Welcome, Tea, Coffee
9.30 STEPHAN HARTMANN (Keynote Speaker)
Updating on conditionals = Kullback-Leibler + causal structure
10.00 SEBASTIAN SEQUOIAH-GRAYSON
Epistemic operations
10.30 – 11.00 Coffee Break
11.00 GUSTAVO CEVOLANI
Truthlikeness, partial truth, and (strongly) semantic information
11.30 GIUSEPPE PRIMIERO
Towards a taxonomy of errors for information systems
12.00 HILMI DEMIR
Taking stock: Arguments for the veridicality thesis
12.30 – 13.30 Final discussion and end of workshop