Accuracy
Considering "... the most accurate report" has reminded me of a puzzle connected to the topic of "truthlikeness" or "accuracy" (also sometimes called "verisimilitude" or "approximate truth"), as might be expressed by,

$A$ is closer to the truth than $B$ is,

where $A$ and $B$ are statements or theories.
In two papers (1974, "Popper's Qualitative Theory of Verisimilitude", BJPS; and 1976, "Verisimilitude Redeflated", BJPS), David Miller published an interesting problem for attempts to make sense of this concept. Because the notion seems clearly relevant to any decent theory of scientific method, Sir Karl Popper had previously tried to develop an explication of it, but his explication turned out to suffer from a serious (separate) problem, also discovered by Miller and presented in the 1974 paper (roughly, on that explication, all false theories are as truthlike as each other).
But the other problem - the Language Dependence Problem - is this. (It is explained also in Sec. 1.4.4 of Oddie, "Truthlikeness", SEP.) The statements $A$ and $B$ are, we suppose, both false. But we should nonetheless like to make sense of what it might mean for $A$ to be closer to the truth (or more accurate) than $B$ is. For a scientific example, we intuitively would like to say that Einstein's relativistic equation for the kinetic energy of a point particle of mass $m$ at speed $v$,

$E_{k} = \left(\frac{1}{\sqrt{1 - v^{2}/c^{2}}} - 1\right) m c^{2},$

is more accurate than the classical equation,

$E_{k} = \frac{1}{2} m v^{2}.$

(Miller 1975, "The Accuracy of Predictions" (Synthese), explains how the language dependence problem arises here also, for such comparisons.)
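Just to illustrate the sense in which the relativistic formula is the more accurate one, here is a small Python sketch comparing the two predictions at a few speeds (the test mass and the speeds are my own illustrative choices, not anything from Miller):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def ke_classical(m, v):
    """Classical kinetic energy: (1/2) m v^2."""
    return 0.5 * m * v**2

def ke_relativistic(m, v):
    """Relativistic kinetic energy: (gamma - 1) m c^2."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return (gamma - 1.0) * m * C**2

m = 1.0  # a 1 kg test mass
for v in (10.0, 3.0e7, 2.5e8):  # slow, fast, very fast (m/s)
    k_cl = ke_classical(m, v)
    k_rel = ke_relativistic(m, v)
    print(f"v = {v:.1e} m/s: classical = {k_cl:.4e} J, "
          f"relativistic = {k_rel:.4e} J, ratio = {k_cl / k_rel:.4f}")
```

At everyday speeds the two formulas agree almost exactly; at speeds approaching $c$ the classical value falls well short of the relativistic one.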
Suppose $A$ and $B$ are false sentences in language $L$, and let the truth be $T$. I.e., $T$ is the single true statement that $A$ and $B$ are falsely approximating. Miller pointed out that, given some very natural ways to measure the "distance" between $A$ (or $B$) and the truth, a language relativity appears. One such way is to count the number of "errors" in a false statement; and then the statement with the least number of errors is closer to the truth.

I will give an example which is based on Miller's weather example, but a bit simpler. Let the language $L$ be a simple propositional language with, as its primitive sentences,

$p$: "it is raining"
$q$: "it is cold".

Suppose the truth $T$ is that it is not raining and it is cold. Let $A$ say that it is raining and it is cold, and let $B$ say that it is raining and it is not cold. So, both $A$ and $B$ are false. In symbols, we have:

$T := \neg p \wedge q$.
$A := p \wedge q$.
$B := p \wedge \neg q$.

Which of $A$ or $B$ is more accurate? It seems intuitively clear that

(1) For $L$: $A$ is closer to the truth than $B$ is.

$A$ makes one error, while $B$ makes two errors. For those interested in the fancier details, this is called the Hamming distance between the corresponding binary sequences. For this case, it amounts to $(1,1)$ being closer to $(0,1)$ than $(1,0)$ is.
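For anyone who wants to check the error-counting, here is a minimal Python sketch; representing each statement by its truth-value assignment for $(p, q)$ is my own bookkeeping, not notation from Miller:

```python
def hamming(x, y):
    """Number of coordinates on which two assignments disagree."""
    return sum(a != b for a, b in zip(x, y))

# Assignments for (p, q), with p = "it is raining", q = "it is cold".
T = (0, 1)  # the truth: not raining, cold
A = (1, 1)  # raining and cold
B = (1, 0)  # raining and not cold

print("errors in A:", hamming(A, T))  # 1
print("errors in B:", hamming(B, T))  # 2
```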
Miller's language dependence problem is that if we translate the statements into an equivalent language $L^{*}$, then we can reverse this evaluation! We can get the translation of $B$ to be closer to the truth than the translation of $A$ is.
First, we define $L^{*}$ to have primitive sentences $p$ and a new sentence $r$, whose translation into $L$ is "it is raining if and only if it is cold". I.e., $r := p \leftrightarrow q$. One can "invert" this translation, and see that the translation of $q$ into $L^{*}$ is given by $p \leftrightarrow r$. (This is because if $r \leftrightarrow (p \leftrightarrow q)$, then $q \leftrightarrow (p \leftrightarrow r)$.)
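The "inversion" step can be checked by brute force over the four truth-value assignments; a quick Python sketch:

```python
# With r defined as (p <-> q), check that q is equivalent to (p <-> r).
for p in (False, True):
    for q in (False, True):
        r = (p == q)          # r := p <-> q
        assert q == (p == r)  # q <-> (p <-> r)
print("q is equivalent to (p <-> r) under every assignment")
```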
Next we translate $T$, $A$ and $B$ into $L^{*}$ as follows:

$T^{*} := \neg p \wedge \neg r$.
$A^{*} := p \wedge r$.
$B^{*} := p \wedge \neg r$.

Expressed in the new language $L^{*}$, we have:

(2) For $L^{*}$: $B^{*}$ is closer to the truth than $A^{*}$ is.

$B^{*}$ makes only one error, while $A^{*}$ makes two errors.
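Putting the translation and the error-counting together, here is a sketch showing the reversal explicitly (again, the tuple bookkeeping is mine):

```python
def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

def to_L_star(assignment):
    """Re-describe a (p, q)-assignment as a (p, r)-assignment, where r := p <-> q."""
    p, q = assignment
    return (p, int(p == q))

# (p, q)-assignments for the truth and the two false statements.
T, A, B = (0, 1), (1, 1), (1, 0)
T_star, A_star, B_star = (to_L_star(s) for s in (T, A, B))

print("errors in L :", hamming(A, T), hamming(B, T))                      # 1, 2
print("errors in L*:", hamming(A_star, T_star), hamming(B_star, T_star))  # 2, 1
```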
Consequently, if we adopt this measure of distance from the truth, we can reverse closeness to truth or accuracy by simply translating into an equivalent language (i.e., one that has different primitives).
In technical terms, the problem is this. We have placed a metric $d$ on the set $X$ of propositional assignments (or models, if you like): the Hamming distance. Indeed, $X$ is just the set of four binary ordered pairs, i.e.,

$X = \{(0,0), (0,1), (1,0), (1,1)\}$.

And the Hamming distances are given by:

$d((0,0), (0,1)) = 1$, $d((0,0), (1,1)) = 2$, etc.

So, $(X, d)$ is then a metric space. A Miller-style translation from $L$ to $L^{*}$ induces a bijection of this space, but this mapping is not an isometry of $(X, d)$.
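And the failure of isometry can be exhibited concretely: the translation induces the permutation $(p, q) \mapsto (p, p \leftrightarrow q)$ of $X$, and this permutation does not preserve all Hamming distances. A short Python sketch:

```python
from itertools import product

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

def f(point):
    """Bijection of X induced by the translation: (p, q) |-> (p, p <-> q)."""
    p, q = point
    return (p, int(p == q))

X = list(product((0, 1), repeat=2))
assert sorted(f(x) for x in X) == sorted(X)  # f permutes X

# Print the pairs whose Hamming distance is not preserved by f.
for x, y in product(X, repeat=2):
    if hamming(x, y) != hamming(f(x), f(y)):
        print(f"d({x}, {y}) = {hamming(x, y)}, "
              f"but d(f{x}, f{y}) = {hamming(f(x), f(y))}")
```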
As regards Boddington's "most accurate report" comment on Leiter Reports, she does not say that she means the Daily Mail minimizes error. Indeed, she says it only "more or less" comes from the evidence at the inquest. Rather, what she says is that it *maximizes assertions made* (perhaps including the "following" embellishment) and that it (as compared with certain non-tabloid papers, called "quality" in scare quotes) is *informative about the identity of the author* (so that one may question them about the source of the witness statements and, as with Boddington, not get an answer).