Thursday, November 13, 2008

Endowment Effect II

So I had a nice discussion about probability distributions concerning ideas. Unfortunately I failed to make a sufficient connection to answer the question.

How would a method of more accurately accounting for the truth of our beliefs counteract the endowment effect?

Normally when considering the probability of alternatives there are two key parameters: 1.) the probability of an outcome's occurrence, and 2.) the magnitude of the gain or loss given that occurrence. In this way widely different "projects" become tradeable. For example, I have $100,000. I am offered an investment opportunity in the stock market with a 5% chance of earning a 100% return and a 95% chance of earning 0%. The expected payoff for this investment is simply calculated [.05*($100,000*1.0) + .95*($100,000*0.0) = $5,000]. One could compare this opportunity with, say, an investment in alpaca farms. Let's say that opportunity is expected to yield $20,000 over the same time horizon. Although these are wildly different projects, they are tradeable in the sense that they are alternatives.
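The arithmetic above can be sketched in a few lines of Python. The stock numbers come from the example; the alpaca figure is simply taken as the stated $20,000 expected yield.

```python
def expected_payoff(principal, outcomes):
    """Expected dollar payoff, where outcomes is a list of
    (probability, return_fraction) pairs."""
    return sum(p * principal * r for p, r in outcomes)

# 5% chance of a 100% return, 95% chance of 0% on $100,000
stock = expected_payoff(100_000, [(0.05, 1.0), (0.95, 0.0)])  # roughly $5,000

# the alpaca farm's stated expected yield over the same horizon
alpaca = 20_000

print(stock, alpaca)
print(alpaca > stock)  # the alpaca farm dominates in expectation
```

The two projects differ wildly in kind, but once reduced to expected payoffs they sit on a single scale and can be traded off against each other.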

In the market of ideas, the values of ideas don't appear to be tradeable in this way. Ideas are limited to their topic: within a topic they are tradeable, but cross-topic tradeability doesn't hold. So we have millions of topics (policy issues, philosophical questions, scientific inquiries, etc.), and within each topic are alternatives. Each alternative could theoretically have a truth probability; hence the only value an idea holds is its truth probability. We could conceivably multiply this probability by some measure of the value of the topic in general, which would enable cross-topic idea rankings. But that is not the subject of our inquiry. We are concerned with the appropriate level of confidence in an idea's truth concerning a specific topic.
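The cross-topic ranking idea can be made concrete in a small sketch. Every topic, idea, probability, and topic-importance weight below is invented purely for illustration; the point is only the mechanics of multiplying truth probability by topic value.

```python
# Hypothetical ideas: (topic, idea, truth_probability, topic_value).
# None of these numbers is real; they just illustrate the scheme.
ideas = [
    ("minimum wage policy", "modest rise, small employment effect", 0.30, 8.0),
    ("minimum wage policy", "any rise causes large job losses",     0.15, 8.0),
    ("alpaca farming ROI",  "farms outperform index funds",         0.10, 2.0),
]

# Cross-topic ranking: score = truth probability * topic value.
ranked = sorted(ideas, key=lambda x: x[2] * x[3], reverse=True)
for topic, idea, p, v in ranked:
    print(f"{p * v:5.2f}  {topic}: {idea}")
```

Within a topic the weights cancel and only truth probability matters, which is why the post can set the weighting aside and focus on confidence within a single topic.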

Back to the endowment effect. What the endowment effect means in this context is that ideas that are "owned" are given too great a value; their truth probabilities are ratcheted up too high. My argument is for a method that estimates truth probabilities more accurately. I do not attempt to counterbalance the bias but instead hope to improve the mechanism causing the error.

Often I find myself falling into the trap of discrete thinking about ideas: either an idea is "right" or it is "wrong," a 0% or 100% probability. In reality nothing is so cut and dried, especially on complex issues. This flaw seems a natural shortcut for the mind and may be behind the endowment effect in ideas. (Although we certainly haven't proven there is such an endowment effect, and given its different features it may deserve a different name.) If we can learn to think probabilistically, instead of in all-or-nothing terms, we may realize greater rationality in the domain of ideas.

Friday, November 7, 2008

Endowment Effect

The endowment effect is one of many cognitive biases recognized by the behavioral camp of economics. I like the name "ownership bias" better (I think it's clearer). In short, the ownership bias leads people to place greater value on goods they own than on ones they don't, the ones still out in the marketplace somewhere.

My question is: can this bias be applied to ideas?

Well, first of all, how can we "own" ideas? I'll define the ownership of an idea as the state where an individual moves beyond a stage of fact gathering and has reached a conclusion on the relevant issue.

The consequence of such a bias would be that the individual no longer views all ideas fairly, but disproportionately values the ideas he has determined to be "correct" in the past. Such a state would pose a dilemma for the individual's hopes of rationality in the consideration of alternative ideas.

Obviously, I haven't tested this. Nor have I checked for similar concepts in the prior literature. On merely intuitive grounds it seems to make some sense. Observationally, people exert resistance to changing their minds (of course the variance in resistance must be massive across different types of issues). Also, when a mind is confronted with incontrovertible evidence that a belief is mistaken, severe cognitive dissonance is the usual symptom. Such a reaction would coincide with the destruction of a large value (the owned idea) and the gain of a small value (the new idea), resulting in a net pain or loss, even though the new idea must be superior based on the evidence.

With this in mind, how can we restore some semblance of rationality? The first logical answer is not to be hasty in reaching conclusions. This is good advice, but quite well known and, therefore, boring. At some point, for some issues, we will inevitably be forced to reach a conclusion. No matter how well thought through the process of acceptance, according to the ownership bias theory the end result will be a disproportionate assignment of value to the idea's merit, validity, truth, etc. However, it does stand to reason that on issues where one is relatively uninformed, the best policy may be not to reach a conclusion at all.

Upon reaching a conclusion, a good technique may be to take inventory of the likely truth of your ideas: assign some sort of probability to their correctness. It seems that when most people follow this advice they come up with a distribution where a 99% probability is reserved for ideas of the utmost certainty and a 51% probability for positions of a tenuous nature. However, this only applies to simple true/false questions, where there are just two options. It is important to note that for real-world issues there will be infinitely many possible solutions. It is only through careful framing (e.g. "will the human race become extinct in the next 100 years?") that the problem can be reduced to two options, and this framing often removes much of the importance of the initial inquiry.

The point is that when you consider the probability that your view is correct, think of all the possible alternatives. It may be very realistic to hold a 5% chance that your view is correct. This may be a view you have thought long and hard about; you may be relatively more confident in your correctness here than in most of the conclusions you indulge. Yet, given the complexity of the problem, it is very likely that you are wrong (a 95% chance), even while the view you have chosen remains the most likely winner of all possible candidates.
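This front-runner-but-probably-wrong situation is easy to demonstrate numerically. The 30 rival views and the even spread of the remaining probability below are invented assumptions, not anything from the post.

```python
# My view gets 5%; suppose 30 rival views split the remaining 95% evenly.
my_view_p = 0.05
n_rivals = 30
rival_p = (1.0 - my_view_p) / n_rivals  # about 3.2% each

print(my_view_p > rival_p)   # my view is the single most likely candidate...
print(1.0 - my_view_p)       # ...yet it is 95% likely to be wrong
```

With only two options, 5% would be a near-dismissal; spread across dozens of alternatives, the same 5% makes a view the modal candidate.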

Not resisting the urge to speculate wildly: 5% seems a much more realistic probability than 51% on many social science questions (and the 5% view may not even be a relatively tenuous position, as the 51% one was).

Whether this type of "realistic" probability assignment has the power to overcome the endowment effect, or ownership bias, is a question I will not answer.