Comments on M-Phi: "Update on updating -- or: a fall from favour" (Jeffrey Ketland)

Luke101 — 2020-07-07, 08:43 (+01:00)

comment test

Luke101 — 2020-07-07, 09:01 (+01:00)

(This is Leszek Wroński here; I can't figure out my account at BlogSpot.)

So: fascinating! But there's one thing that's bugging me about this. I last thought about these issues sometime in 2017, I think, but I desperately want to return to them this year. Minimizing your expected score according to the only local proper scoring rule, that is, the log rule, delivers Conditionalization: this can be seen, e.g., in Chapter 15 of your book, Richard.

(To get here from a different direction: assume 'M(I)RE' is 'Minimise (Inverse) Relative Entropy' in the sense of Douven & Romeijn (2011). Theorem 8.6 of Paris's book shows that MRE delivers Conditionalization, and the same argument can be tweaked to show that this also holds for MIRE.)

Doesn't this contradict the result from this post, since the log rule is strictly proper?

Luke101 — 2020-07-07, 09:07 (+01:00)

(Uh, the MIRE bit is understandable when you note that it is equivalent to 'minimize inaccuracy as measured by the log rule'.)
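The MRE claim in the thread — that minimizing relative entropy subject to the evidence constraint recovers Conditionalization — can be illustrated numerically. The sketch below uses a hypothetical four-world prior and evidence E (all numbers are made up for illustration): among posteriors q with q(E) = 1, a grid search for the q minimizing D(q‖p) lands on the prior renormalized on E.

```python
import math

def kl(q, p):
    # Relative entropy D(q || p); terms with q_i = 0 contribute 0.
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

# Hypothetical prior over four worlds; evidence E = {world 0, world 1}.
prior = [0.1, 0.3, 0.4, 0.2]
E = {0, 1}

# Conditionalization: restrict the prior to E and renormalize.
p_E = sum(p for i, p in enumerate(prior) if i in E)
cond = [p / p_E if i in E else 0.0 for i, p in enumerate(prior)]

# Grid search over posteriors q with q(E) = 1, i.e. q = (x, 1 - x, 0, 0).
best_q, best_d = None, float("inf")
steps = 10000
for k in range(1, steps):
    x = k / steps
    q = [x, 1 - x, 0.0, 0.0]
    d = kl(q, prior)
    if d < best_d:
        best_q, best_d = q, d

print(cond)    # conditionalized prior, ~[0.25, 0.75, 0.0, 0.0]
print(best_q)  # MRE minimizer, should match cond up to grid resolution
```

Since D(q‖p) is strictly convex in q, the grid minimum sits at the grid point nearest the true minimizer, which here coincides with the conditionalized distribution.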