In *Transformative Experience*, Paul issues two challenges to orthodox decision theory; they are based upon examples such as these.

(In this post and the next, I'd like to try out some ideas concerning Paul's challenges to orthodox decision theory. The idea is that some of them will make it into my contribution to the *Philosophy and Phenomenological Research* book symposium on *Transformative Experience*.)

The experience of eating Vegemite for the first time is, in Paul's terminology, an *epistemically transformative experience*: that is, it is an experience that teaches you something that you can only learn by having an experience of that type; what it teaches you is what it is like for you to taste Vegemite. This raises an apparent problem for orthodox decision theory. Suppose that, like me, you have never tried Vegemite. And suppose that you are faced with a decision problem---a choice between a number of alternative actions---where at least one of the possible outcomes of at least one of the alternative actions involves you eating Vegemite: for instance, you might have the choice whether or not to try Vegemite; or you might be offered a bet where the stake is three jars of the stuff. Then, contends Paul, you will not be able to use orthodox decision theory straightforwardly to make your choice. The problem is that, in orthodox decision theory, in order to decide between alternative actions, we need at the very least a set of possible states of the world and a utility function that assigns utilities to those states of the world. However, in the case of the decision problem that features Vegemite, you do not have access to your utility function: after all, one of the states of the world involves you eating Vegemite at some point in the future; but you don't know what it's like to eat Vegemite, and the phenomenal character of an experience is often important in determining the utility you assign to having that experience. Thus, there are decision problems that an agent might face where it seems that orthodox decision theory cannot provide the action-guiding advice that is one of its central purposes.

Some claim that the experience of becoming a parent is also an epistemically transformative experience: before you do it for the first time, you cannot know what it is like to do so. But it can also be what Paul calls a *personally transformative experience*: that is, it is an experience that can change your values in such a way that your utility function after the experience is different from your utility function before the experience. This raises another apparent problem for orthodox decision theory. Suppose that, like me, you have not been a parent. So there is a chance that, if you were to become a parent, your utility function might change. Let us suppose that there is in fact no problem of epistemic access to your current or future utility functions in this case: you know what your current utility function is, and you know what your future utility function would be if you were to become a parent and what it would be if you were not to. And suppose that one thing you know about your current utility function and your hypothetical utility function were you to become a parent is that they are different---you know that, before becoming a parent, you value the company of your friends more than the company of your hypothetical offspring; afterwards, this will be reversed. According to orthodox decision theory, you make any decision by appealing to your current utility function. But in a situation in which at least one of the actions between which you must choose may lead to a change in your utility function, this seems to ignore an important vantage point, namely, the vantage point of your future self after the choice has been made---indeed, the part of your self that will in fact have the experiences you're choosing between. If I know that becoming a parent will give me less time to spend in the company of my friends and more time in the company of my hypothetical child, and if I currently value the former more than the latter but will come, upon becoming a parent, to value the latter more than the former, then it would seem irrational to make the decision by appealing only to my current utilities. As before, there are decision problems that an agent might face where it seems that orthodox decision theory cannot provide the action-guiding advice that is one of its central purposes.

In the sequel, I will argue that epistemically transformative experiences pose no special problem for orthodox decision theory---I'll deal with this in the present post. Personally transformative experiences, on the other hand, do. I will offer an extension to orthodox decision theory designed to accommodate decisions that involve personally transformative experiences---I'll deal with this in the second post.

## Orthodox decision theory

Let us begin by describing orthodox decision theory. In this post, I'll take a very simple non-causal, non-evidential decision theory. All of the problems we're considering arise for this version, and nothing new arises when we move to the causal version. In the next post, where I deal with personally transformative experiences, we will consider causal decision theory; but in fact no new problems arise for it---the challenges Paul raises could be raised for a non-causal theory.

In this simple decision theory, we model a decision problem as follows:

- $\mathcal{S}$ is a set of propositions each of which describes a different possible state of the world;
- $\mathcal{A}$ is a set of propositions each of which describes a different possible action that our agent might perform and states that in fact she does perform that action;
- $U$ is a function that takes a conjunction of the form $AS$, where $A$ is in $\mathcal{A}$ and $S$ is in $\mathcal{S}$, and returns the utility that the agent would obtain were that conjunction to hold---that is, the utility she would obtain if she were to perform that action in that world;
- $p$ is a subjective probability function (or probabilistic credence function) over the states of the world $S \in \mathcal{S}$.

Given these, we can define the *value* of an act as follows:

$$V(A) := \sum_{S \in \mathcal{S}} p(S)U(AS)$$

Finally, we can state the main decision rule of simple decision theory:

**Maximize value** If there is an action with maximal value and $A$ does not have maximal value, then choosing $A$ is irrational.

Stated this way, the decision rule gives a requirement of rationality. But it can also be stated as advice-giving: *Don't choose an action if it has less value than an action of maximal value*. It is this advice-giving principle that is the primary target of Paul's challenges.
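The value calculation and the Maximize Value rule above are easy to sketch in code. Here is a minimal illustration; the umbrella example, with its states, credences, and utilities, is invented purely for illustration and is not from the post.

```python
def value(action, states, p, U):
    """Value of an action: V(A) = sum over states S of p(S) * U(A and S)."""
    return sum(p[S] * U[(action, S)] for S in states)

def irrational_choices(actions, states, p, U):
    """Per Maximize Value: choosing A is irrational if some action has
    maximal value and A does not."""
    vals = {A: value(A, states, p, U) for A in actions}
    best = max(vals.values())
    return [A for A, v in vals.items() if v < best]

# Toy decision problem (invented): carry an umbrella or not.
states = ["rain", "dry"]
actions = ["umbrella", "no umbrella"]
p = {"rain": 0.3, "dry": 0.7}
U = {("umbrella", "rain"): 5, ("umbrella", "dry"): 3,
     ("no umbrella", "rain"): -10, ("no umbrella", "dry"): 6}

# V(umbrella) = 0.3*5 + 0.7*3 = 3.6; V(no umbrella) = 0.3*(-10) + 0.7*6 = 1.2,
# so the advice-giving principle rules out "no umbrella".
```

Note that the decision rule only ever compares the numbers $V(A)$; this is what the redescription strategies below exploit.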

## Epistemically transformative experience

Now let us turn to the apparent challenge to this orthodox version of decision theory that arises from epistemically transformative experiences. In order for an agent to use the advice-giving principle stated at the end of the previous section, it seems that she must be able to calculate the value $V(A)$ of each action $A$ that is available to her; and, in order to do that, she must have access to her own subjective probability function $p$ and utility function $U$. Thus, it seems, when an agent faces a decision problem in which the utility she assigns to a particular possible outcome of one of the possible actions is not accessible to her, she cannot follow the advice-giving principle of orthodox decision theory. This is exactly the situation created by epistemically transformative experiences: if you face a decision problem in which one of the possible outcomes involves such an experience, you have no access to the phenomenal character of that experience; and since the phenomenal character of an experience is an essential factor in determining the utility you assign to it, you have no access to the utility you assign to that outcome. Thus, you cannot follow the advice-giving principle of orthodox decision theory and you are unable to make the decision in a rational way. Or so Paul contends.

In order to explore this challenge to orthodox decision theory, I'd like to consider a simpler sort of decision problem: one that doesn't involve epistemically transformative experiences, but which does involve apparently inaccessible utilities. We'll see how orthodox decision theory can accommodate this simpler decision problem and ask whether we might use a similar technique to accommodate the sorts of decision problems that Paul has in mind.

Here is the decision problem:

**Room of Unknown Outcome** You must choose whether or not to enter the Room of Unknown Outcome. In this room you will either be given £5, or £5 will be taken from you, but you do not know which. If you do not enter, you will be given nothing and nothing will be taken away from you.

Thus, there are two actions: *Enter* and *Don't Enter*. And, at least as we will initially frame the decision problem, there is just one state of the world---while it might seem that there is uncertainty, the initial framing of the problem locates that uncertainty as uncertainty about which utility function you have, rather than uncertainty about the way the world is. Thus, the value you assign to *Don't Enter* is 0, while the value you assign to *Enter* is either the utility you assign to £5 or the utility you assign to -£5, but you do not know which, since you don't have access to your utility function---in order to know the utility that you assign to *Enter*, you must know what the outcome of entering the room will be.

There is no epistemically transformative experience involved in Room of Unknown Outcome: you have had experiences in the past that are similar enough to gaining £5 and losing £5 that you know what it would be like to have one of those experiences again. The problem is that you do not know which of those two experiences you will have. Nonetheless, the example shares with Paul's examples the lack of access to the agent's utility function. How might we accommodate Room of Unknown Outcome in orthodox decision theory? The standard move is to reframe the uncertainty in the problem as uncertainty about the state of the world, rather than uncertainty about the utility function. That is, we reframe the decision problem as follows. There are still two actions: *Enter* and *Don't Enter*. But there are now two states of the world: the first is the state in which you receive £5 upon entering the room (denote this *Prize is £5*); the second is the state in which you are divested of £5 upon entering the room (denote this *Prize is -£5*). You then assign subjective probabilities to these two states. Since the acts have no influence on the states---whether or not you enter the room has no effect on what the prize is---these amount to subjective probabilities over the propositions *Prize is £5* and *Prize is -£5*. Finally, you assign utilities to the conjunctions *Enter & Prize is £5*, *Don't Enter & Prize is £5*, etc. Whereas there was uncertainty about the utilities that attached to the various act-state conjunctions in the original framing of the problem, there is no such uncertainty in this framing: for instance, $U$(*Enter & Prize is £5*) = $U$(£5), and you know the latter quantity since you've had the experience of gaining £5 before. In this way, you might use orthodox decision theory to choose between *Enter* and *Don't Enter*. For instance, if you are equally confident in *Prize is £5* and *Prize is -£5*---so $p$(*Prize is £5*) = $p$(*Prize is -£5*) = 0.5---and if your utility is linear in money---so $U$(*Enter & Prize is -£5*) = $-U$(*Enter & Prize is £5*)---then the expected utility of *Enter* is 0, which is the same as the expected utility of *Don't Enter*. Thus, their values are the same and both actions are permissible.
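The reframed version of Room of Unknown Outcome can be sketched directly. Here I assume, as in the worked example, equal credence in the two prize states and utility linear in money, with the illustrative scale $U(£x) = x$.

```python
# The fine-grained framing: two actions, two states.
states = ["Prize is £5", "Prize is -£5"]
p = {"Prize is £5": 0.5, "Prize is -£5": 0.5}

# With utility linear in money, U(Enter & Prize is -£5) = -U(Enter & Prize is £5).
U = {("Enter", "Prize is £5"): 5,
     ("Enter", "Prize is -£5"): -5,
     ("Don't Enter", "Prize is £5"): 0,
     ("Don't Enter", "Prize is -£5"): 0}

def value(A):
    return sum(p[S] * U[(A, S)] for S in states)

# Both actions come out with value 0, so both are permissible.
```

The crucial point is that in this framing every entry of `U` is known, because each fine-grained state fixes the prize.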

Now, you might wonder how we can possibly set our subjective probabilities in *Prize is £5* and *Prize is -£5*. After all, the description of the example gives us so little information that it would seem that no particular probabilities are warranted. This is a genuine problem; but it is not a new problem. It is simply the problem of setting credences in the presence of very limited evidence. Extreme subjective Bayesians say that, in such a situation, almost any probability function is permissible. At the other end of the spectrum, objective Bayesians say that we should apply the Principle of Indifference and assign equal credence to each of the possibilities---thus, we assign a credence of 0.5 to *Prize is £5* and to *Prize is -£5*. Both approaches have their problems, but they are not peculiar to cases of ignorance about utility. If I tell you that I have a handkerchief in my pocket that is blue or red, you must set your credence in each possibility in the presence of similarly impoverished evidence.

So we accommodate Room of Unknown Outcome in orthodox decision theory by considering a more fine-grained space of possible states of the world: the crucial feature of this new space is that each of the new fine-grained states specifies enough information that we know the utility we assign to the act-state conjunctions $AS$. We then use standard Bayesian techniques to set our credences in these new states and apply orthodox decision theory. We call this fine-graining strategy a *redescription strategy*, since it involves redescribing the decision problem so as to make it amenable to treatment within orthodox decision theory. Might we use a similar strategy to accommodate decision problems that involve epistemically transformative experience? Let's take a simple example:

**Vegemite** You must choose whether or not to try Vegemite.

We assume that you haven't eaten Vegemite before. So one of the possible actions---namely, the action of eating Vegemite---will result in an epistemically transformative experience. As a result, before you make the choice, you will not know what it is like to eat Vegemite and thus you will not know the utility you assign to eating Vegemite. We initially frame the problem, as we did above in the case of Room of Unknown Outcome, as involving ignorance about your utilities. Thus, there is just one possible state of the world; there are two possible actions, *Eat* and *Don't Eat*; and while you know the utility of *Don't Eat* (conjoined with the one possible state of the world), you don't know the utility of *Eat* (conjoined with that state). Thus, you cannot assess the value of *Eat*, and so cannot use orthodox decision theory to make your choice. Or so Paul contends.

Inspired by the redescription strategy used above, we might try to reframe Vegemite as a case of ignorance about the world. In Room of Unknown Outcome, we did not know what the prize would be; but, for each possible prize, we knew the utility of that prize; so we fine-grained the states of the world to specify the prize in that state. In Vegemite, we do not know what the taste will be; but, you might hope, for each possible taste, we know the utility of experiencing that taste; so we fine-grain the states of the world to specify the taste in that state. But things are not so straightforward. In Room of Unknown Outcome, the possible prizes were specified in the original description of the decision problem, and they were all prizes of which we'd had similar experiences before. This is not the case in Vegemite. When we fine-grain the states of the world in Vegemite, which tastes should we include as possible? Yahoo! Answers suggests the following for Vegemite: "fermented yeast that have puked and died", "salty beyond belief", "sucking on a rusty nail that has been soaking in salt water for over a year", "a little bit of salty heaven". But even if we have had sufficiently similar experiences to these to know the utilities we would assign to them, we can't be sure that these are all of the possibilities; and we can't be sure that there aren't possibilities whose utilities we do not know because we have never had an experience that is sufficiently similar. After all, there are two types of epistemically transformative experience:

- (Type I) You have in fact had an experience in the past that is phenomenologically similar to eating Vegemite. You know this, but you don't know which of your past experiences it is. In this case, the experience of eating Vegemite is epistemically transformative because it teaches you which of your past experiences is phenomenologically similar to eating Vegemite.
- (Type II) You have not had an experience in the past that is phenomenologically similar to that of eating Vegemite. You know this might be the case. In this case, the experience of eating Vegemite introduces you to a completely new phenomenological experience and teaches you that eating Vegemite gives rise to that experience.

However, all is not lost. Consider again the definition of the value of an act given above:

$$V(A) = \sum_{S \in \mathcal{S}} p(S) U(AS)$$

And note that this may be rewritten as follows:

$$V(A) = \sum_{u \in \mathrm{ran}(U)} u \times p(\mbox{My utility for $A$ is $u$})$$

where

- $\mathrm{ran}(U)$ is the range of the utility function $U$; that is, it is the set of values that $U$ takes;
- the proposition *My utility for $A$ is $u$* is true at all and only those states $S$ such that $U(AS) = u$.

Thus, in order to calculate the value of an act $A$, we need not know the utility of each act-state conjunction; we need only assign probabilities to propositions of the form *My utility for $A$ is $u$*.
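It is easy to check numerically that the two expressions for $V(A)$ agree: summing $p(S)U(AS)$ over states gives the same number as summing $u \times p(\mbox{My utility for $A$ is $u$})$ over the range of $U$. The states, credences, and utilities below are invented for illustration.

```python
# Fine-grained framing: three states, credences, and U(AS) for a fixed act A.
states = ["S1", "S2", "S3"]
p = {"S1": 0.2, "S2": 0.5, "S3": 0.3}
U_A = {"S1": 10, "S2": -4, "S3": 10}

# V(A) computed state by state.
v_fine = sum(p[S] * U_A[S] for S in states)

# Coarse-grained framing: p("My utility for A is u") sums the credence
# of the states at which U(AS) = u.
ran_U = set(U_A.values())
p_util = {u: sum(p[S] for S in states if U_A[S] == u) for u in ran_U}
v_coarse = sum(u * p_util[u] for u in ran_U)

assert abs(v_fine - v_coarse) < 1e-9  # the two framings agree
```

This is why the redescription strategy can work even when you cannot enumerate the states themselves: the coarse-grained sum only needs probabilities over utility levels.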

There are two issues to address before we can be confident that we have accommodated epistemically transformative experiences in orthodox decision theory. The first we have met already: it is the worry that there is no rational way to assign probabilities to each of the possible states of the world when they are specified in this way. How, for instance, am I to assign a probability to the proposition *My utility for tasting Vegemite is 5*? As I said, we've met this problem before, at least in the case in which I have no evidence that bears on these propositions. In such cases, some Bayesians will permit a large range of different probability functions, whereas others will specify a unique rational response to this paucity of evidence.

You might think that one of the usual problems with the latter response---namely, the language-dependence objection to the Principle of Indifference---is particularly acute in the present case. The Principle of Indifference says (at least) that an agent who lacks any evidence should distribute their credences equally over all the possibilities. However, as is often pointed out, this gives rise to different credences depending on how you present the possibilities: if you know that the handkerchief in my pocket is red or blue, you might apply the Principle of Indifference to the possibilities *Red*, *Blue*, in which case you'd assign 1/2 to *Red*; or you might apply that principle to the possibilities *Red*, *Light Blue*, *Dark Blue*, in which case you'd assign 1/3 to *Red*. You might think that this problem is particularly apparent in the present case, where we are assigning probabilities to propositions such as *My utility for $A$ is $u$*, where we don't know how many states of the world make that proposition true. For instance, suppose I know that my utility for tasting Vegemite is either -100, -50, 10, 20, or 30. The Principle of Indifference tells me to assign a credence of 1/5 to each of these possibilities. But I don't know how many possible tastes correspond to each of these possible utilities. For all I know, it could be that there is just one possible taste that corresponds to a utility of -100, but fifteen possible tastes that correspond to a utility of -50. If this were the case, you might expect my probability for -50 to be fifteen times greater than my probability for -100.
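The worry can be made concrete: applying the Principle of Indifference to utility levels and applying it to the (hypothetical) underlying tastes give different credences. The taste counts below are the invented ones from the example; they are not claims about Vegemite.

```python
utilities = [-100, -50, 10, 20, 30]

# Indifference over utility levels: credence 1/5 in each.
p_over_utilities = {u: 1 / len(utilities) for u in utilities}

# Suppose, as in the text, one possible taste has utility -100 but
# fifteen possible tastes have utility -50 (and, say, one taste each
# for the remaining utilities -- an invented stipulation).
tastes_per_utility = {-100: 1, -50: 15, 10: 1, 20: 1, 30: 1}
total = sum(tastes_per_utility.values())

# Indifference over tastes instead: credence proportional to taste counts.
p_over_tastes = {u: n / total for u, n in tastes_per_utility.items()}

# Under the second partition, utility -50 is fifteen times as probable
# as utility -100; under the first, they are equally probable.
```

This is exactly the language-dependence problem: the two partitions disagree, and nothing in the evidence picks one out.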

All of this is true. But, again, it is not a new problem for Bayesian epistemology. When I assign a probability of 0.5 to *Red* and 0.5 to *Blue* in the handkerchief case, I am well aware that these propositions are unlikely to carve nature at its joints; they are unlikely to respect the true symmetries in the world; I am well aware that, if one considers the true possible states of the world---if such a notion even makes sense---there may be twice as many at which *Blue* is true as there are at which *Red* is true. But the mere existence of this possibility does not seem to undermine my assignment of 0.5 to each in this case. Nor, I claim, should it undermine my assignment of 1/5 to each of -100, -50, 10, 20, 30 in the Vegemite case.

Of course, in many cases like Vegemite, you would not be in the state of complete ignorance I have been considering. For instance, you've heard the testimony of friends about their experience of Vegemite; you've read the Yahoo! Answers thread with its poetic descriptions; you've watched your friends' faces as they bite into their first piece of Vegemite on toast; in short, you've gathered data about the utilities that others assign to the experience of eating Vegemite. Surely this will affect the subjective probabilities you assign to the different possible utilities that you assign to that experience. I think that's right. However, Paul has two concerns about this evidence. The first is that, in some cases of epistemically transformative experience, the relevant testimony is not reliable. Of course, people have no reason to misreport their utilities for eating Vegemite, so the testimony in that case is likely to be reliable. But there is strong pressure in some societies to report high utilities for becoming a parent whether or not one's utility truly is high, so the testimony in that case is likely to be less reliable. Again, this is all true. But again Bayesians have ways of incorporating evidence that is less than completely reliable. When we use Bayes' Theorem to compute the probability of a hypothesis given some evidence, part of our calculation involves the probability of the evidence given the hypothesis. It is at this stage that we can incorporate our doubts about the reliability of the testimony: we simply note that the probability that a person testifies to assigning high utility to becoming a parent given that he in fact assigns low utility to becoming a parent is non-zero.
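The point about incorporating unreliable testimony via Bayes' theorem can be sketched numerically. All the numbers below are invented; the point is only that a non-zero probability of misreporting softens the update.

```python
# Hypothesis H: my utility for becoming a parent would be high.
prior_high = 0.5

# Likelihoods: probability of a person *reporting* high utility, given
# the truth about their utility. Social pressure means that even people
# with low utility often report high utility -- so the second likelihood
# is well above zero, as the text notes.
p_report_high_given_high = 0.95
p_report_high_given_low = 0.60

# Bayes' theorem:
# P(H | report) = P(report | H) P(H) / [P(report|H)P(H) + P(report|not-H)P(not-H)]
numer = p_report_high_given_high * prior_high
denom = numer + p_report_high_given_low * (1 - prior_high)
posterior_high = numer / denom

# The posterior rises only modestly above the prior, reflecting the
# testimony's limited reliability. (With p_report_high_given_low = 0,
# the same report would push the posterior to 1.)
```

Lowering `p_report_high_given_low` toward zero makes the report more diagnostic; raising it toward `p_report_high_given_high` makes the report nearly worthless, which is the Bayesian rendering of "this testimony is unreliable".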

Paul's second concern about such testimonial or behavioural evidence is that, in order to extrapolate from that evidence to your own case, you must solve the notorious problem of the reference class. After all, in order to set your probability that you assign utility -100 to eating Vegemite, you do not simply take the proportion of people who assign that utility amongst all people whose utilities you know. Rather, you take the proportion of people who assign that utility amongst all people whose utilities you know *and who are sufficiently similar to you*. You are unlikely, for instance, to consider people who have been eating Vegemite since childhood, since they are unlikely to have similar utilities to yours. The problem of the reference class is the problem of determining who is and who isn't sufficiently similar to you. Again, this is surely a problem. But it is a problem that haunts Bayesian epistemology independently of its use in decision theory and certainly independently of its application to decision problems that involve epistemically transformative experiences. The same problem is faced by a doctor trying to set her credence in the efficacy of a particular drug for a particular patient: she knows the frequencies of its efficacy amongst various groups of patients; what she doesn't necessarily know is which of those groups contains the patients most relevantly similar to her current patient---indeed, it is not clear that there is even a fact of the matter to know in this case.
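The doctor example makes the reference class problem vivid: a single patient belongs to many groups at once, and each group yields a different frequency, hence a different candidate credence. The groups and frequencies below are invented for illustration.

```python
# Hypothetical efficacy frequencies for one drug, by reference class.
efficacy_by_group = {
    "all patients":              0.60,
    "patients over 60":          0.45,
    "female patients over 60":   0.55,
    "patients with condition X": 0.30,
}

# One patient may fall into all four groups simultaneously. Nothing in
# the statistics themselves says which frequency the doctor should
# adopt as her credence -- each choice of reference class gives a
# different answer.
candidate_credences = sorted(set(efficacy_by_group.values()))
# Four groups, four distinct candidate credences.
```

The same structure appears in the Vegemite case: which subset of the people whose utilities you know counts as "sufficiently similar to you" determines the probabilities you extract.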

In order to use this final version of the redescription strategy---the version in which we assign probabilities to propositions of the form *My utility for $A$ is $u$*---it seems that we must know the range of possible utilities. Paul worries that, in cases of epistemically transformative experiences, that isn't always possible. I certainly agree that, as we saw above, it isn't always possible to know the range of *possible experiences* that might result in a case of epistemically transformative experience. That was the lesson of distinguishing Type I from Type II cases of such experience: in Type I cases, you know the range of possible experiences; in Type II cases, you don't. But that doesn't prevent you from knowing the range of *possible utilities* that these experiences might have, even in the Type II case. After all, utilities are measured by real numbers, so we know all the possible utilities for any given action at any given state of the world: they are simply the real numbers. Indeed, in nearly all cases of transformative experience, we can narrow the range significantly. For instance, however unpleasant Vegemite tastes, it will be more valuable than experiencing severe pain; and however good it tastes, it will not be as valuable as seeing a loved one happy. So it seems that there is no problem in assuming that we know the range of possible utilities that may attach to epistemically transformative experiences.

I conclude that, while epistemically transformative experiences give rise to problems for decision-makers who hope to use the advice-giving versions of orthodox decision theory to make their decisions, they are problems that Bayesian epistemologists face already, and they are problems that already arise for decisions that do not involve epistemically transformative experience. As we will see in the next post, this is not the case for personally transformative experience: to accommodate that, we will have to extend orthodox decision theory substantially.

## Comments

Richard, this is really nicely put and extremely interesting. I hadn't thought of the Type I version of an epistemically transformative experience, but you are absolutely right to point it out.

Regarding epistemically transformative experiences and their utilities: one thing I like to emphasize is that a way to see the problem with such utilities is not just as simple ignorance, but as an inability to grasp them in a particularly important way. Roughly, the idea is that you need to be able to assess how the outcome will affect you in order to determine its utility for you, so without acquaintance with the experience the problem of uncertainty expands hugely. Think of never having seen color before, and choosing between descriptions given by those who have of outcomes characterizing what it is like to see red. Such descriptions are nearly meaningless for you except for the utility numeral they include.

This takes us to a dilemma.

First horn: merely knowing the numerical utility is enough. Then since we can't imaginatively project in any useful way in order to constrain the range of outcomes, the uncertainty is vast. In the context of decisions where, for example, you are choosing to have a child or choosing to undergo a major medical procedure such that your life or entire way of living is at stake, this brutal fact has personal and philosophical implications for the way we decide. I say that, then, in such contexts, if we embrace the full range of uncertainty, we cannot (rationally) make these decisions in any way that is even close to how we ordinarily want to make them. Instead, we get dangerously close to regarding such decisions as analogous to a flip of a coin. (Choosing to have a child is just a roll of a many-sided die.)

Second horn: merely knowing the numerical utility isn’t enough; we need to understand the utility in the sense of knowing what the numerical utility represents in terms of the character of our lived experience. But we cannot imaginatively project ourselves into the outcomes to assess the utilities this way via (simulated) acquaintance. So, we need to accept that we cannot grasp these relevant acquaintance-based facts about the utilities. The solution here is to revise your approach to the decision. (You can’t rationally choose to have a child by deciding based on what it’s like. You need some other basis---maybe you have a farm and need more help.)

One tendency, when we are faced with this dilemma, is to take the first horn and then attempt to constrain the range of outcomes by imaginatively projecting yourself into the outcomes. This is a way of trying, at the individual level, to partly solve the version of the reference class problem that is related to the fundamental identity problem (“I’d probably react this way, not that way….”). But it won’t work---you can’t do that in this situation, by definition. With low-stakes cases like trying Vegemite, it isn’t a big deal that you can’t do this. But with high-stakes cases it is. My argument is intended to show (a) that when your whole way of living is at stake, the fact that you cannot close the gap is deeply problematic, from a personal point of view, even if it is not a problem from standard decision theory’s point of view. And (b) that this sort of problem is incredibly common, perhaps even a defining feature of a contemporary society that gives us a ton of choice about how to live our lives and realize our goals, and we need to face up to it.

Thanks very much for this, Laurie -- it's really helpful. In the end, of course, I want to take the first horn of your dilemma. One of the great strengths of decision theory is that the only features of a state of the world that are relevant to decision making are its utility and the credence you assign to it. It is this that allows us to use decision theory to choose between actions whose outcomes are very different and the phenomenal characters of which are wildly dissimilar. By measuring the 'goodness' or 'to-be-brought-about-ness' or 'utility' of all such outcomes on the same scale (the real numbers), we can compare what I called the 'value' of the different possible acts. Of course, this is also what raises worries about it, since some outcomes seem incomparable. But that's not the situation we're considering here. So I grasp the first horn.

Having grasped it, I agree that there is a great deal of uncertainty about the utilities you assign to a particular epistemically transformative experience. But that doesn't seem to be unique to epistemically transformative experiences and the uncertainty about their utilities. There is similarly very large uncertainty about the efficacy of particular medical treatments on certain patients. And the statistical techniques you can use to reduce that uncertainty are similar to the ones you can use to reduce the uncertainty about utilities -- both face the Fundamental Identification Problem and the reference class problem.

So the question is: Is the decision making concerning epistemically transformative experience that we can do using standard decision theory and these techniques for reducing uncertainty within the bounds of societal norms for this sort of decision making? Are the sorts of considerations that people actually try to bring to bear on their decision when they are choosing whether or not to have a child in fact considerations that reduce the uncertainty about utilities and thus contribute rationally to the decision making process?

Yes—these questions need to be asked, and in high-stakes cases involving transformative experience the answers will not be satisfactory. Consider the decision to have a child. My thought is, at least when we consider the ordinary way we regard the decision to have a first child (that is, ordinary from a western, contemporary, relatively wealthy cultural perspective), that many of the main considerations people bring to bear on the question essentially involve introspection. (For example: “How will I respond to becoming a parent?”) But such considerations, I argue, do not reduce the uncertainty about utilities that the orthodox solution presents us with when we must consider epistemically transformative cases, and thus do not contribute rationally to the decision making process.

It might be helpful to (i) identify which of the von Neumann-Morgenstern axioms are supposed to be violated by examples like these and (ii) explain how the examples differ in kind from the "adverse selection" problems introduced by Akerlof in 1970.

And by (i), the question is what response to give to the standard Savage-style arguments for each axiom. For example, completeness is motivated by imagining forced-choice decision problems, where the agent is compelled to express a preference between all options in the menu. If you object to this step, how is the objection different to those we know about?

The reason that it would be helpful to try to connect this to the literature is that it appears there is a lot of the standard theory that is not endorsed. For example, a traditional decision theorist sees nothing at all puzzling about utilities: they are just a numerical representation of the agent's preferences (which, typically, satisfy at least the vN&M axioms) over options.

Thanks for this, Greg. I don't think Laurie is claiming that any vN&M axioms are violated. The vN&M axioms give coherence constraints amongst an agent's preferences. You're right that many decision theorists only take the preferences to be real, so decision theory amounts to the vN&M (or Savage or Joyce) axioms. But this particular debate assumes a sort of realism about the credences and utilities as well, and raises a problem for an agent who is trying to set her preferences (and determine her choice behaviour) by appealing to her utilities and credences; or an agent who is trying to justify or give reasons for her preferences and choice behaviour by appealing to her credences and utilities. The problem is that, if you don't know your utilities for all possible states of the world, you can't use them to determine your preferences and thus help you make decisions. And the solution offered is (roughly) to redescribe the states of the world so that you do know the utilities that are assigned to them.

I think this is part of a general move amongst some decision theorists to see decision theory as more than a set of coherence constraints on rational preferences (and choice behaviour) along with a mathematical representation of preferences that satisfy those constraints. This movement thinks of the expected utility calculation as providing the rational way to combine utilities and credences to give preferences. (Or, in Lara Buchak's version, there are other states of the agent that serve to determine the preferences, namely, for her, the agent's risk function.)
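The expected utility calculation mentioned here is easy to make concrete. Below is a minimal Python sketch of combining credences and utilities into a value for an act; the states and all the numbers are invented purely for illustration (a toy version of the Vegemite bet):

```python
# Minimal sketch of the expected-utility calculation: an act is evaluated
# by weighting the utility of each possible state by the credence assigned
# to that state. States, credences, and utilities below are illustrative.

def expected_utility(credences, utilities):
    """Combine credences and utilities into a single value for an act."""
    assert abs(sum(credences.values()) - 1.0) < 1e-9, "credences must sum to 1"
    return sum(credences[state] * utilities[state] for state in credences)

# A toy Vegemite bet with two states: liking the taste or hating it.
credences = {"like it": 0.5, "hate it": 0.5}
utilities = {"like it": 10.0, "hate it": -4.0}

print(expected_utility(credences, utilities))  # 3.0
```

The Vegemite problem, on this framing, is that the agent cannot fill in the `utilities` dictionary at all, not that the calculation itself is in doubt.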

Hi Greg, thanks for your comment. I agree with Richard’s response to you. Let me add that I also agree with Richard that orthodox decision theory can respond to the Vegemite problem in the way he suggests, and that one can respond to the problem of epistemically transformative experiences more generally by, as Richard argues above, redescribing the case and assigning subjective probabilities to a wide range of propositions describing utilities. It is important to note that this solution accepts that there is no substantive role for introspection in narrowing the range of propositions we are to consider (that is, in narrowing the range of utilities).

My complaint is not that orthodox decision theory doesn’t have this solution to offer to those of us who want a guide for rational decision-making in such cases. Rather, for high-stakes cases like choosing to have a child, that is, cases that are both epistemically and personally transformative, this solution is normatively unacceptable in a way that has not been previously appreciated.

Thanks, Laurie. The reason I framed my question the way I did was that it seemed that you might reject the completeness axiom or the Archimedean axiom, or both.

But if the idea is to assume a sort of realism about credences and utilities, as Richard puts it, the question then is how this realism about utilities is supposed to square with vN&M axioms. For traditionalists, it is preference that is primitive and utilities are simply a numerical representation of those preferences, and one is cautioned against the idea of taking "utilities" as real things to work out one's preferences. It is hard enough to motivate the idea that an agent is supposed to have preferences over options and lotteries, but there is at least a behavioral story for how to do so to satisfy the conditions for (one of) the representation theorem(s) for utility. What is the behavioral story that motivates starting with utilities? Does that story discriminate between affine transformations of one's particular numerical utilities? If not, why not? If so, then why doesn't this discrimination scupper using the representation theorem (of your choosing) in this fashion?

The general point is that the bi-conditional for a representation theorem is truth preserving, but not necessarily behavior or Big-O preserving: else, why bother with proving a representation theorem? The particular point is that there is a behavioral justification for the steps necessary for an agent's preferences to be represented numerically by a utility function (up to affine transformations). What is the behavioral justification for starting with numerical utilities?
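The uniqueness point at issue here can be illustrated concretely: a vN&M utility function represents preferences only up to positive affine transformation, so u and a·u + b (with a > 0) rank every lottery identically. A minimal sketch, with utilities and lotteries invented purely for the example (echoing the Munich/Muncie comparison below):

```python
# Sketch of the vN&M uniqueness point: utilities represent preferences only
# up to positive affine transformation, so u and 3u + 7 agree on the
# expected-utility ranking of every lottery. All numbers are illustrative.

def expected_utility(lottery, u):
    """Expected utility of a lottery given as {outcome: probability}."""
    return sum(p * u[outcome] for outcome, p in lottery.items())

u = {"Munich": 1.0, "Muncie": 0.2, "nothing": 0.0}
u_affine = {o: 3.0 * v + 7.0 for o, v in u.items()}  # same preferences

lottery_a = {"Munich": 0.5, "nothing": 0.5}
lottery_b = {"Muncie": 1.0}

# Both representations agree on the ranking of the two lotteries.
print(expected_utility(lottery_a, u) > expected_utility(lottery_b, u))              # True
print(expected_utility(lottery_a, u_affine) > expected_utility(lottery_b, u_affine))  # True
```

This is why "starting with" a particular numerical utility function needs a further story: nothing in the representation privileges u over any of its positive affine transforms.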

If realism about utilities involves running the representation theorems in reverse in this fashion, then I agree that orthodox decision theory would be silent on such cases.

Hi Greg, this is a fair question, and it concerns the way I’m trying to fit a philosophical problem that crosscuts a number of subfields into a discussion framed in decision-theoretic terms. So please bear with me.

My thought is that the behavioral story that supports the sort of psychological realism about utilities I accept is found in psychology, for example, in social psychology, descriptive psychology and cognitive science, as well as in sociology. (I don’t think utility must be independent of preferences, but I take utility to be psychologically real in a way that allows it to be empirically measurable.)

OK, so I take part of your point to be that, if an individual’s preferences conform to the von Neumann and Morgenstern axioms, then there exists a (suitably unique) utility function for her. So if I am concerned about a problem with the individual’s utility function, doesn’t this just translate into a problem with her preferences conforming to these axioms?

Yes and no, depending on how we understand the problem (I, personally, like both ways of understanding it, but for different reasons).

Yes, if the problem with utility is that the individual cannot grasp or determine her utility function, because grasping or determining that function depends on grasping or having the relevant phenomenal concept. (The phenomenal concept is what is at issue here, for it is what the individual needs in order to introspectively assess what-it’s-like.) We can then take this to imply that the individual lacks the concepts she needs to grasp or determine her preferences. (Here, obviously, as a psychological realist I am taking preferences to be more than merely behavioral.) If she lacks preferences in the relevant sense, then in that sense her preferences can’t conform to the axioms, because the entities that are supposed to be doing the conforming don’t exist.

No, if the problem with utility is that the individual has too many utility functions. (This version is what I think of as the problem of imprecise utilities, pairing it with the current focus on imprecise credences.) In that case, the problem with utilities could be understood in terms of a problem of imprecise preferences, even if the preferences that determine each of her equally acceptable utility functions conform to the axioms.
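The "too many utility functions" reading can be made concrete: when a set of utility functions is equally acceptable, an act receives an interval of expected utilities rather than a single value. A minimal sketch, with states, credences, and the candidate utility functions all invented for illustration:

```python
# Sketch of imprecise utilities: with several equally acceptable utility
# functions, an act is evaluated by an interval of expected utilities
# rather than a single number. All values below are illustrative.

def expected_utility(credences, u):
    return sum(credences[state] * u[state] for state in credences)

credences = {"child flourishes": 0.6, "child struggles": 0.4}

# Candidate utility functions the agent cannot choose between.
utility_set = [
    {"child flourishes": 10.0, "child struggles": -2.0},
    {"child flourishes": 6.0,  "child struggles": -8.0},
    {"child flourishes": 8.0,  "child struggles": -5.0},
]

evaluations = [expected_utility(credences, u) for u in utility_set]
print(min(evaluations), max(evaluations))  # the act gets an interval, not a value
```

Each member of `utility_set` can itself come from preferences satisfying the axioms; the imprecision lies in the agent's inability to settle on one of them.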

Hi Laurie - I'll press the traditionalist's case: I don't find myself doing this very often :)

A traditionalist would say preference is real and (cardinal) utilities are a representation through which to work out the consequences from those preferences. So, not grasping or determining one's utility function could only come from one's preferences failing to satisfy the axioms. Perhaps you think that certain options are not comparable -- that, say, one truly cannot compare apples and oranges. Then completeness would have to go. Or maybe you think that some options are infinitely better or worse than others. Then the Archimedean axiom has got to go. Or maybe you think the problem is our computational limitations and think that it is implausible to insist that an agent express preferences over options, over all lotteries involving those options, over all compound lotteries involving those initial lotteries, over all lotteries compounded from those initial compound lotteries, ...(!). In any event, a traditionalist will only understand a failure to determine the decision-maker's utility function as a breakdown in determining her preferences, assuming probabilities are given (as they are for vN&M's basic setup).

The traditionalist will also have a reply to the 'too-many utilities' concern, which also conveniently strengthens the case for his unabashed pragmatism. A traditionalist can entirely agree with you that preferences are real things, and not merely behavioral, but deny that they --in and of themselves-- are numerically quantifiable. There is nothing about my preference for Munich over Muncie (qua psychological state) that is numerical. However, you can present to me a (forced choice) decision problem that would yield behavior from me that you could then use to quantify my preference: "Which would you choose: a lottery that gave you a 1/7 chance at a night in Muncie and nothing otherwise, or a lottery in which ..." The nice thing about this traditional, pragmatic, behavioral approach is that it is all on you, the decision-maker, to work out how to combine the different values you might consider in making your choices. In the end it is your choice behavior --assuming your choices do not violate the utility axioms-- that determines your utility function.
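The forced-choice elicitation described here is essentially the 'standard gamble': fix the best and worst outcomes at utility 1 and 0, then search for the probability at which the agent is indifferent between the middle outcome for sure and a lottery over the two extremes. A minimal sketch, in which the agent's answers are simulated by a hidden utility value (in practice they would come from the decision-maker's actual choice behavior):

```python
# Sketch of lottery-based utility elicitation (the 'standard gamble'):
# with u(best) = 1 and u(worst) = 0, bisect on the probability p at which
# the agent is indifferent between the middle outcome for sure and a
# lottery giving best with probability p. The agent's choices here are
# simulated by a hidden utility; real elicitation uses observed behavior.

def prefers_lottery(p, hidden_u_middle):
    """Simulated forced choice: lottery (p: best, 1-p: worst) vs middle for sure."""
    return p * 1.0 > hidden_u_middle

def elicit_utility(hidden_u_middle, tol=1e-6):
    """Bisect on p until indifference; the resulting p is the elicited utility."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        p = (lo + hi) / 2
        if prefers_lottery(p, hidden_u_middle):
            hi = p
        else:
            lo = p
    return (lo + hi) / 2

print(round(elicit_utility(0.7), 3))  # recovers the hidden value: 0.7
```

The pragmatist's point is that nothing numerical needs to exist "inside" the agent for this to work; the number is read off the pattern of choices.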

Re: Imprecise utilities: Martin Peterson has a Phil Studies paper (2006), 'Indeterminate Preferences', which discusses this proposal and responds to Isaac's objections to indeterminate preferences. (There are papers in the stats literature on this idea, too.)

It might be the case that the project I am developing is simply incompatible with a traditionalist perspective. As Richard says above, I am very sympathetic with the idea that we should use decision theory to provide a philosophically richer understanding of the expected utility calculation. I’m a big fan of Lara Buchak’s work in this area. It isn’t a coincidence that my arguments draw on ideas that stem from debates in the philosophy of mind from the 1980s and 1990s about the nature of consciousness and experience. Those debates develop post-behaviorist theories of consciousness, and explicitly recognize the importance of “what it’s like” information. (My own arguments are conservative in the sense that I am not rejecting physicalism or embracing dualism. But behaviorism is just not part of the picture.)

In an important sense, my project explores a tension between how we want to be able to model decision-making from the conscious, first-personal perspective of an agent and how we model decision-making from an impersonal perspective. The problems I am raising can be seen as a challenge to the behavioralist presuppositions of orthodox decision theory, because those presuppositions may keep the orthodox from giving an adequate account of how an individual is to make rational decisions from the first-personal, conscious perspective in certain kinds of very important cases.

That said, I appreciate your point, because many decision theorists are sympathetic with the traditionalist approach, and it is helpful to say more about where the differences lie. So let me see if I can make progress by responding more directly.

The issue is that, once the role for consciousness is granted, there are new ways to understand how preferences can and cannot conform to the axioms.

You said: “So, not grasping or determining one's utility function could only come from one's preferences failing to satisfy the axioms. Perhaps you think that certain options are not comparable -- that, say, one truly cannot compare apples and oranges. Then completeness would have to go.” The problem isn’t incomparability, at least not in the usual sense, because the outcomes are comparable. But I suppose we could say that completeness fails (in a new way): it fails when the options are comparable (in principle, or by an ideal knower), but the actual decision-maker in the case, who is rational, is unable to form the relevant psychological preferences because she lacks the relevant phenomenal concepts, having never had the requisite experience.

My response is also different from the usual worries about computational limits, because, again, the problem isn’t with complexity or other kinds of computational limits. It’s a problem for an agent who has the right rational capacities but lacks the ability to manifest those capacities due to lack of crucial experience. (I’m very interested in how Joe Halpern’s and John Quiggin’s work on known unknowns/unknown unknowns and precautionary principles might be relevant here. John and I are working on a paper that develops a model for “big” choices using such a framework.)

So in some sense I can agree with the traditionalist that there must be a breakdown in determining preferences. But I worry that it is a bit misleading to describe the problem this way. It is misleading because the breakdown occurs only in the sense that the agent lacks a crucial ability to form the preferences, an ability that she can only gain by having the right sorts of experiences. This means that the way the breakdown occurs isn’t represented in the traditionalist’s usual menu of options.
