Sunday, July 14, 2013

What does it even mean to "believe" something?



I've done three posts recently that dealt with the issue of "beliefs". First I talked about "derp", which I defined as the tedious repetition of beliefs too strong to be swayed by evidence. Then I jumped back into the blogosphere discussion of whether bets reveal beliefs. Finally, I asked whether inflationistas really believe their warnings of inflation.

But in all these discussions, there's been a more subtle and fundamental issue nagging at me. It's the question: What does it even mean to "believe" something in the first place?? This question seems like a trivial semantic issue, but it's very deeply important for all kinds of economics issues, from Bayesian inference to behavioral economics to the axioms of standard consumer choice theory and game theory.

And it's a question to which I don't really know a satisfactory answer. I am not sure what it means to "believe" something.

One idea of belief is a feeling of certitude. I may have a strong emotional reaction of "Yeah!" to one statement, and a reaction of "No way!" to another. For example, if you say "The sun rises in the east" I feel a feeling of "Yep!", but if you say "The sun rises in the west", I feel a feeling of "No way!" So is this a good definition of "belief"?

Not necessarily. First off, it can't be measured very precisely. Suppose I'm trying to decide whether I believe there's a 50% chance of rain tomorrow, or a 60% chance. My feeling of certitude might be about the same for both those propositions, and there's no way to tell which I "believe" more.

Also, certitude may not be invariant to the situation in which the question is posed. When I am actually required to act, my feeling of certitude may vanish. A great example is this recent study on partisan differences in survey responses:
But when there was money on the line, the size of the [partisan answer] gaps [on factual questions] shrank by 55 percent. The researchers ran another experiment, in which they increased the odds of winning for those who answered the questions correctly but also offered a smaller reward to those who answered “don’t know” rather than answering falsely. The partisan gaps narrowed by 80 percent.
This illustrates the conflict between what I call "Tribal Reality" and "Extant Reality". In response to a statement like "Global warming is a myth", conservatives may feel an upwelling of emotional certitude, due to their tribal affinity with a movement that has long sought to deny or downplay global warming. When there's nothing on the line, that feeling of certitude will determine the response to survey questions. But when there are actual consequences for getting the question right or wrong - when Extant Reality comes into the picture, in other words - emotional certitude may take a back seat.

OK, then how about the idea of a belief as "the degree to which you're willing to bet on something"? That seems like a reasonable definition, but it has big problems too. First of all, single bets can be hedged by outside bets, as I pointed out in the discussion on whether bets reveal beliefs. In that case, bets are not informative. Second of all, even if they are not hedged, bets depend on personal psychological characteristics like risk aversion and loss aversion and ambiguity aversion. In other words, bets will always depend on preferences. Since preferences depend on many outside things, a definition of beliefs that includes preferences will again result in "beliefs" changing depending on totally unrelated things, like whether I lose my job.

OK, well how about the notion of "probability" from Bayesian inference? In Bayesian probability theory, a probability and a belief are the same thing. I used this concept in my definition of "derp" (a "prior" and a "posterior" are both "probabilities"), but I have to admit that here too, I was working with a term without being sure of its usefulness.

In Bayesian probability theory, you assign a number to an event. That number is a "probability", and there are rules for how to update it in response to new data. But suppose you ask me to assign a probability to the event of the Republicans winning the election, and I say "I think there's a 120 percent chance!" Obviously I'm just saying words that I heard somewhere, and obviously my notions of what a "percent chance" means are very different from those of, say, most statisticians. I can feed a probability of 1.2 into Bayes' Rule, sure, but does the output of that exercise deserve to be called a "belief"?
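A minimal sketch of this point (the likelihoods below are hypothetical, chosen only for illustration): Bayes' Rule will mechanically process a "prior" of 1.2, but the output is no longer a probability.

```python
# Hedged sketch: hypothetical likelihoods P(D|R) = 0.6 and P(D|~R) = 0.3
# for some piece of election news D. Nothing here comes from real data.

def bayes_update(prior, lik_h, lik_not_h):
    """Posterior P(H|D) = P(D|H)P(H) / [P(D|H)P(H) + P(D|~H)(1 - P(H))]."""
    return (lik_h * prior) / (lik_h * prior + lik_not_h * (1 - prior))

print(bayes_update(0.5, 0.6, 0.3))  # legitimate prior: posterior ≈ 0.667
print(bayes_update(1.2, 0.6, 0.3))  # "120 percent chance": the formula
                                    # still runs, but returns ≈ 1.09,
                                    # which is not a probability at all
```

The formula doesn't police its inputs; whether its output deserves to be called a "belief" is exactly the question.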

OK, so suppose you tell me "No, silly, you have to give a number between 0% and 100%. That's how percents work!" So I think carefully for a second, and say "OK, I think there's a 99.999% chance that the Republicans will win the election." But obviously I am just repeating another popular catch phrase here. My number comes from my emotional feeling of certitude, not from any sort of internal engagement with Extant Reality.

Now of course this example is of a silly survey respondent, but in a subtler way it applies to mathematically sophisticated people too, even Bayesian statisticians! As Larry Wasserman points out, statisticians often choose their prior when conducting a Bayesian inference. They choose the prior based on some attractive properties, like "uninformativeness" with respect to some function of the parameters. If I choose my "prior" based on some consideration that has nothing to do with the question at hand, can the "prior" really be said to constitute a "belief"? This sort of "belief" is just as unstable as the others. (Also note that it lacks any emotional certitude, and you probably wouldn't bet on it either.)
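Wasserman's point can be made concrete with a standard coin-flip sketch (the numbers are mine, not his): two common "off-the-shelf" priors, each chosen for formal properties rather than for any belief about the coin, give different posteriors from the same data.

```python
# Beta-Binomial example: estimate a coin's bias after 7 heads in 10 flips,
# under two priors chosen for their properties, not out of "belief".
heads, n = 7, 10

for a, b, name in [(1.0, 1.0, "uniform Beta(1,1)"),
                   (0.5, 0.5, "Jeffreys Beta(0.5,0.5)")]:
    post_mean = (a + heads) / (a + b + n)  # mean of Beta(a+heads, b+n-heads)
    print(name, round(post_mean, 4))       # 0.6667 vs 0.6818
```

Same data, different "beliefs", and neither prior was chosen because anyone believed it.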

So all of our intuitive definitions of "belief" will sometimes rely on external conditions that have nothing to do with the statement about which we are trying to determine our "belief". It seems to me that whatever my true "belief" about statement X is, it should (for most types of X) have nothing to do with whether I'm in a good or bad mood that day, or whether the question is framed using politically incendiary language, or whether my financial portfolio is net long inflation, or whether I am a risk-averse person, or whether I'm trying to use that "belief" to publish an empirical paper.

And yet I cannot think of any definition of belief that satisfies those invariance criteria. Furthermore, all of our intuitive definitions of "belief" seem to conflict with each other pretty severely in certain situations.

So I'm still not really sure what it means to "believe" something.

47 comments:

  1. Anonymous 5:49 AM

    I suppose that the Pythagorean monad - that one plus one is two - is the precept on which everything else is built, if you 'believe' we exist in a simulation, that is. It's the difference between nothing and something that makes a 0 or a 1, and on that the cosmos exists.

    But a 'value judgement' is something else. Death bad, red good, love is all.



    Then a mathematical truism like "the economy will fail if leverage continues" could be correct or it could be wrong, depending on whether the math included everything necessary to reach this conclusion. Here the word 'belief' takes the meaning 'I believe my math is correct', and so the economy will fail.

    A good trader will not 'fall in love' with his trading position because then he is likely to bend facts to fit in with his love. That is a natural human behaviour based on blind protection of self and family.

    To continue ....bla bla bla.... followed by school definitions of the word 'belief'




  2. Jeez, Noah, you need a stiff shot of (later) Wittgenstein. Now.

    Replies
    1. If I may summarize all that: it's all derp at the bottom.

      Also, the set of all derp includes, but is not limited to, Bayesian priors, the later Wittgenstein, praxeology, Confucianism, and maybe all MMT blog posts.

  3. According to this philosopher, believing involves nothing more than "being disposed to do and experience certain kind of things":

    http://www.faculty.ucr.edu/~eschwitz/SchwitzPapers/AcctBel.pdf

    And here's a more recent paper by him on the subject, "A Dispositional Approach to Attitudes":

    http://www.faculty.ucr.edu/~eschwitz/SchwitzPapers/OutsideBelBox121101.pdf

    Replies
    1. Sounds non-quantifiable...

  4. Comment on http://noahpinionblog.blogspot.com/2013/07/what-does-it-even-mean-to-believe.html

    The question "what does it mean to believe something" is complicated by an even more maddening question: "what does it mean to mean something?" I'm serious about that, as silly as it sounds. Try to understand the meaning of "meaning" in your mind. It's something our brain just does, like tasting or seeing: it makes things "mean".

    What does a word mean? There are dictionary definitions, but those are just more words. Using words to answer "what does a word mean" is circular, it just begs the question. At some point you have to gain traction in the conceptual space of the mind, beyond words, when it comes to meaning. And at that point, trying to understand what meaning is resembles trying to understand what consciousness is, to some degree like trying to lift yourself from the ground by your bootstraps. Meaning involves logical connection or association, equality, equivalence, similarity, implication, connotation, but also the evocation of associated memory and emotion, the connection between "what means" and its value to us.

    If we go with this dichotomy between what words mean in terms of other words vs that subjective energy or experience that creates or realizes meaning, then at the subjective level, belief motivates action and emotion. A believable threat causes fight or flight response, but fictional danger, say in a film or book, usually does not, unless it is very believable fiction, unless we occupy the story, identify with characters, and successfully suspend disbelief. But this is temporary. This is playing at belief, where an operational definition of play would be a simulation of reality without real consequences, a rehearsal of sorts. It is not the real thing. When belief is present there is a real perception of possible costs or benefits in "Extant Reality", which motivates the expenditure of energy, to wager or risk some effort, which ties in with the betting aspect. Of course not every belief represents a threat, but threat seems a stark illustration of the difference between belief and non-belief.

    A belief need not be accurate to have real effect, to motivate action or the expenditure of energy. A belief can be delusional. Belief seems to be the mechanism or avenue whereby the world outside of our mind gains purchase inside our mind to trigger action, commitment, or emotion, whether that belief is accurate or not, due to error or deception either internal or external. Belief is in a sense the bridge between the subjective (where "Tribal Reality" is created and maintained) and the real, or "Extant Reality". Belief is like the tag or label associated with an idea or concept that assigns real weight to it, that identifies it as an idea or thought with practical consequences, as opposed to fancy or hypothesis or conjecture or wishful thinking. Even so, sometimes we err and fail to attach the belief tag to something real, or attach the belief tag to what is not real.


  5. Rather than searching for a unified theory of belief, why not accept that there are different types of belief? While this may initially seem problematic, one solution is to add a variable to characterize the probability that a given answer is Tribal/Extant/Other.

  6. Anonymous 11:55 AM

    Alexander Bain, via C.S. Peirce: belief is [a thought] "that upon which a man is prepared to act".
    I found Peirce's essay "The Fixation of Belief" an interesting article - tangentially related - that may help to muddy the waters further.

    Replies
    1. Ah! But we do not have access to the 'thoughts' of others. For instance, I think Noah is exaggerating the distinction between tribal and extant reality. I think that when money is on the line, people may answer according to what they think the questioner believes, rather than what they believe themselves.

      My disagreement with Noah comes down to how we infer thoughts from actions.

  7. Noah - you seem to be trying to get a definition that is both qualitative/intuitive ("feeling") and quantitative/measurable. Every time I do that in academic work it's a complete failure. In such cases a definition which has the right *properties* is generally more useful than one with the right meaning. For example, utility is mostly a bogus term (most economists don't even think about their choices in terms of utility), but the definition has nice properties which allow the discussion of behavior. If you want a physics example, how about relativity - Einstein proposed it as a convenient choice for defining coordinate systems rather than the only choice (ie it has useful *properties* for thinking about the universe).

    Based on what you've written so far, I think you're looking for either A) confidence-weighted Bayesian priors to a prediction - which are of course posteriors from some data set, including tribal data - that a person would report if they reported their true prior (ie, not just what they heard) OR B) the whole-portfolio fractional bet they would make assuming/given a particular risk aversion. Fractional bets are simply a general form of 'how much to bet' and can be both optimized and computed from likelihoods and payoffs (see the Kelly criterion). These are fractions of total wealth, and so are invariant across whether or not you have a job (you'll bet more $ when you have a job, less $ without, but the fraction of your wealth should remain constant when risk aversion is held constant). Of course for this to be useful in measuring one's "beliefs" about the probabilities and payoffs, we have to assume a no-arbitrage world -- ie if your inflation bet portfolio is a perfect arbitrage then your rational bet is as much leverage as you can obtain; if your odds didn't create an arbitrage you'd need to construct a bet portfolio which more closely matches your "beliefs" about probabilities.
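    The wealth-invariance claim above can be sketched with the standard Kelly formula (the probability and odds below are made up for illustration):

```python
# Kelly stake for a binary bet at net odds b:1 with believed win prob p:
# f* = p - (1 - p) / b. The optimal *fraction* of wealth is the same
# whatever the wealth; only the dollar stake scales.

def kelly_fraction(p, b):
    return p - (1 - p) / b

f = kelly_fraction(0.6, 2.0)      # believe 60% at 2:1 -> stake 40% of wealth
for wealth in (1_000, 100_000):
    print(wealth, f * wealth, f)  # dollars change, the fraction does not
```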

  8. Anonymous 1:14 PM

    Thus proving that economics is actually a dismal branch of metaphysics.

    Imagine a universe where the random walk hypothesis holds for market prices....

  9. As an economist, you are surely already familiar with words that have different contextual meanings, especially formal vs colloquial:

    "I really _want_ that new Porsche!"

    "Obviously not."

    From a formal perspective, you can solve the betting-based definition by making it a situation where the individual must make a real-world decision that requires spending any scarce resource.

    Replies
    1. I translate your example into how most economists (including me) would express it:

      "I am really, really willing to buy that new Porsche."

      "But you are not able to."

      And, as Noah would suggest, if you can hedge your commitment of real-world resources, then it's not quite clear what your action (belief) is or amounts to.

    2. But "not able to" really doesn't explain it, as one of my old econ profs noted (I think it was John Taylor, actually). One could easily imagine that the hypothetical economist in the joke could in fact make the sacrifices necessary to acquire the Porsche. However, he chooses not to take out a second mortgage, or whatever.

      When it comes to betting or otherwise committing resources, it seems like your belief amounts to the implied odds of all your bets or overall position. So you make a bet of $100 for the proposition, receiving 3-1 odds. You then make a bet of $120 against the proposition, receiving 2-1 odds.

      If the proposition is true, you win a net $180. If it is false, you win $140. So I assume you believe (ignoring risk tolerance) the true odds are about 9-7. If you believed they were even odds, you would have bet $133 in the second bet.

      Yes, I realize that there may be transaction availability and risk tolerance problems, but if your total portfolio produces a 5-1 payoff and you had reasonable hedging opportunities, it seems pretty safe to say you don't "believe" the proposition is 50% likely.
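      For what it's worth, the arithmetic in this thread can be checked directly (using the comment's own hypothetical numbers):

```python
# $100 on the proposition at 3:1, $120 against it at 2:1.

def net_payoffs(stake_for, odds_for, stake_against, odds_against):
    if_true = stake_for * odds_for - stake_against       # win first, lose second
    if_false = stake_against * odds_against - stake_for  # lose first, win second
    return if_true, if_false

print(net_payoffs(100, 3, 120, 2))  # (180, 140), as the comment says
# Indifference point for the second stake x: 300 - x = 2x - 100 -> x = 400/3,
# i.e. roughly the $133 figure in the comment.
```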

  10. Rigorous arguments are constructed by stating definitions and assumptions, then presenting data, then analyzing data, and finally drawing a conclusion. Beliefs can be inferred throughout the process, especially if arguments are not rigorous and are full of logical fallacies.

    Are the definitions and assumptions fair? When trying to demonstrate one side over another side, do the definitions and assumptions presented make it easier to arrive at a specific conclusion?

    What data is presented? Does the data legitimately represent the discussion, or is it being shoehorned into a conversation because there is no other set of data that will yield the desired conclusion? Is some readily available data specifically ignored? Why is the data ignored?

    How is the data being analyzed? Does the presentation of data justify a particular analysis (is the arguer using gross numbers when they should be using percentages)?

    Does the conclusion honestly follow the argument? If A implies B, where the hell does F come from?

  11. Noah, I love your blog because you are brilliant, inquisitive and generous hearted. And I like the sorts of questions you ask.

    Here aren't we bumping up against the paradox of granularity, so to speak? The closer we observe reality the more doubtful are the "things" which, from a distance, seem to populate the world. In other words, most of our ideas are approximations, or shorthand, for navigating life. An obvious example which might even set the outer limit for this problem: the self. If you ask a person, a Buddhist monk for example, who dedicates the majority of their energy to investigating their self--the content of their mind--chances are they will tell you, "well, of course there isn't any self! There are only sensations, feelings, thoughts. And these are constantly changing."

    So, why should belief be quantifiable, any more so than the betting record or observable actions of a person? Rather, given the way that we use the word 'belief', we can be fairly confident that it must be less quantifiable than our history of action or betting. That is implied by the usage.

    I do think your line of inquiry suggests a related question, what is the difference between intelligence and wisdom? Seems to me intelligence without wisdom is potentially dangerous, and also that wisdom has a large emotional component which intelligence does not.

  12. Some say they don't believe in evolution. I, myself, don't believe in gravity.

    Replies
    1. Bill Ellis 7:44 PM

      Finally!
      Someone else who sees through the Gravity Conspiracy. Gravity is just a myth invented by Home Depot and Ace Hardware to sell ladders and guardrails.

      People are such dupes.

    2. Welcome fellow Gnostic! Of course, HD and Ace don't take into account that we will soon be Greece. Their problem, not ours.

  13. [wonkish]

    "Since preferences depend on many outside things, a definition of beliefs that includes preferences will again result in "beliefs" changing depending on totally unrelated things, like whether I lose my job."

    I think you are taking too narrow a view of preferences here, Noah. If we take a complete Savage state space (as we should) then things like losing your job do not change your preferences - they simply change the state of the world. Now, of course, we may have different conditional preference relations that depend on states. The distinction may sound trivial, but when we formulate the state space properly we can have a much more coherent view of the different roles of preferences and beliefs.

    This is not to say that we can always actually separate beliefs from preferences. If we take, for example, Choquet expected utility (from Schmeidler) it is very, very tempting to interpret the core of the capacity as the agent's beliefs. This is incorrect, as the capacity is an amalgam of both beliefs and preferences.

  14. Bill Ellis 7:28 PM

    This post is ironic.
    Noah,
    You want to define belief to fit the feeling of certitude that the mental state of belief provides us. It is like you believe in belief.
    But belief is a mental state that evolved in us to provide us with the NECESSARY illusion of truth/rightness/certitude. Belief lets us act in the face of ever-present uncertainty. Without belief we could never come to a rational decision; we would be frozen... lost in pondering all the possibilities.

    Does that pack of wolves want to eat me ? I can't be certain. Let me take a moment to figure it out. CHOMP CHOMP...

    To me the really interesting thing about belief is just how often we are right when we listen to that feeling and jump to conclusions based on it...

    Replies
    1. Bill, I think you have it here. Isn't there a book about this - Thinking, Fast and Slow?

      But Noah, I think is addressing something subtly different - the ambiguity in the language. To be honest, if somebody says they are a believer, I'm not too sure what that really means. Does it mean, that there are things that they refuse to think about?

    2. Belief is a word to express how we made up our minds to choose one course of action over another, in the absence of complete knowledge. A best guess. It would not be possible to survive without making 'best guesses.'

    3. To me the really interesting thing about belief is just how often we are right when we listen to that feeling and jump to conclusions based on it...

      Alternatively, even a "wrong" decision can be better than no decision at all. This was one of the nuggets of wisdom from the excellent mountaineering film, Touching the Void. And relatedly, having beliefs does not necessarily imply attachment to them. "This is my best guess. If it turns out to be wanting I'll switch to something else."

  15. Anonymous 9:05 PM

    I think the global warming denier offered money for the "right" answer would see things differently. He wants the money. So when asked whether global warming is real, he is not necessarily going to give the answer that he believes is correct, but rather the answer he believes the person carrying out the experiment -- you know, the one giving out the money -- believes is correct. And since that person is an academic researcher at a university, chances are in excess of 99% that he believes global warming is real. And the denier knows this.

    In this sense, I don't see how the experiment gives any insight into what deniers "really" believe about global warming.

    Replies
    1. The point is that what it means to "really believe" something is not so clear.

    2. Anonymous 8:14 AM

      Research into shifts in view based on payment shows just the opposite. People do change their opinion when paid to express an opinion contrary to the one they started with. Apparently, they don't believe they changed their view to get the money, and so settle on some other explanation for their behavior. The readiest explanation is often "belief".

  16. Anonymous 8:11 AM

    I think you need to allow for two modes of definition. The operational definition of "belief" would have to do with action taken or resources committed. Belief doesn't matter otherwise, in an operational sense. This opens the way to tautology, of course, but doesn't that happen with lots of operational definitions?

    The other definition of belief is what most of this discussion seems to focus on. Here, we wander into the thicket of common usage. "Belief" means a bunch of things, same as "mandate" and "fork". Trying to make use of a common definition when what you are aiming at is an operational definition will drive you crazy and waste your time.

  17. You set an interesting threshold for a definition of belief if belief has to be measurable or quantifiable. That definition must almost certainly be a contradiction in terms, because the salient feature that distinguishes a belief from a fact or knowledge is that beliefs are held about things for which there is no evidence or proof.

    A belief is simply an opinion about something unknown - often, as in the case of 'God', about something that is unknowable.

    Another important feature about beliefs is that they change along with experience, reflection, and other influences. Indeed the more flexible and fluid one is with one's beliefs - the more healthy a relationship one has with the whole business of belief.

    There is a long tradition that one mark of intellectual maturity is comfort with not knowing (ie. in having flexible and fluid opinions about what isn't known, beliefs), and by extension, comfort with contradictory and opposite truths.

    In a word, show me someone who makes it clear what it means to 'really believe' something, and I'll show you an inflationista, a Republican, a social conservative, or some other philistine object of easy ridicule. Tell me this: who is closer to the truth, the person declaring that inflation is always and everywhere a monetary phenomenon, or the person declaiming that when the facts change, they change their mind?

    Understanding beliefs is only difficult for the person who tries to reduce belief to a mathematically tractable model that is always and everywhere true. I guess this makes a belief a local approximation (that is not constant) for a parameter value that is itself not a constant?

  18. Whahey! Off your field and into mine!

    @Peter Dorman is right -- you need some later Wittgenstein. The Philosophical Investigations is the place for a full treatment, but for something easier you could look at the Blue and Brown Books or, perhaps even better, some high-quality introduction.

    Some quick thoughts:

    Why should words in a language used by certain primates to communicate with one another be expected to have clear definitions? There is no reason. The meaning of a word is just the manner in which it is used, and the same word can be used in all sorts of different ways by the primates. Some people throw up their hands in horror at the very idea of that, but the system works very well (except in philosophy, where things get confusing and you have to understand this stuff to make progress). It is just not a problem, in ordinary language, that the concepts we use have fuzzy edges.

    What about in formal language? Sometimes, we can, and do, define words precisely. That can be useful, for example, when we are modelling things, because clear assumptions lead to clear implications. It is also, however, always a further question whether a model is a good approximation of the world.

    You are mixing ordinary and formal language together. You ask what "belief" means and pull some examples from ordinary language, but you also want something "quantifiable", which is the kind of thing you only get in a formal system. If you want to know what a word means in ordinary language, you need to do a survey -- finding out its meaning is an exercise in naturalism (with the advantage that, if careful, you will be able to recall pertinent examples to mind). You have started to do that, but what you take to be objections to any single notion of belief are better seen as different uses of the concept that do not completely overlap.

    If you want to make a model based on some concept, then you need to pick a precise definition, work out the model, and see if it is a good fit for reality. The precisely-defined notion of "belief" will be different from the notion that wild human beings actually use, but if you choose carefully it may be a useful approximation.

    The reason that you are not sure what "belief" means is that you are looking for precise meaning in nature (ordinary language) rather than defining it for the purpose of use in a formal system. You need to do the latter, and then see whether the resultant model is illuminating.

    Replies
    1. ^As a recent Economics & Philosophy grad, I can confirm johnbutters.org's testimony and also recommend Wittgenstein. Also, I'm not even sure that a survey would be sufficient to capture the variants of "belief", as the options themselves would affect the results, but it might be the best way.

      Here's a related question, Noah:

      Regarding your post title, what does the word "mean" mean?

    2. This! You are totally right. I would also expand on it: there are some subtle differences in various words and their uses.

      Let's take an example, like the word "apple". What is an apple? The word "apple" is a concept that represents some set of configurations of reality where molecules, atoms etc. create an object with properties that are assigned to an apple (shape, colour, taste, etc.).

      There is not only one representation of what an apple is. There is a whole bunch of options. An apple may be red or green. It may be fresh or rotten. It may be big or small. We have a large advantage in that we may relate it to some objective reality that anchors its meaning. But even then we may have to come up with better definitions as time passes. For instance, if there is a GMO apple that tastes like pear, we may have some dispute about whether it really is an apple or not.

      But there are some words that do not relate to a single specific reality. These are words like "beauty". You may come up with some definition like "an object is considered beautiful if, when shown to a randomized sample of people, it creates a certain response in brain activity in more than 50% of subjects". But at this point it may not be universally used, as the remaining 49% of people may disagree with this definition.

  19. A large part of the confusion around the word "belief" is due to the fact that it is used for BOTH of the two radically different thought processes that humans use.

    One process is "faith": it is internal (requiring no real-world evidence), and absolute (valid for all times and all places).

    The other process is "reason": it is external (requiring real-world evidence), and yields a result that is expected to be improved over time.

    When I say "I believe in Jesus Christ," I mean "I have faith in Jesus Christ." When I say "I believe in evolution," I mean "I have seen enough evidence to be convinced of the validity of the theory."

    The two uses of "believe" are radically different... but 99% of us do not notice.

    "I believe that life begins at conception." For one person, that statement requires evidence; for another it does not. Debate between those two persons is impossible.

    Replies
    1. Nathanael 10:33 PM

      The "faith" definition doesn't even correspond to the usual psychological meanings of belief.

      Things which people "believe in" by "faith" may not even be coherent or meaningful, such as "I believe in one god, and only one god, and that god is simultaneously three gods", or "I believe that Jesus Christ is 100% human and 100% divine".

      To include *that* sort of "belief" you have to define belief as "utterance".

  20. I "believe" proposition X is true if I estimate the probability of truth at over 50%. Trying to pin belief down to a narrower range of probability estimates is a mistake, because belief is just the set of probability estimates over the 50% threshold.

    It is a mistake to equate what a person "believes" with what he says he believes. All communication has a purpose and all communication is composed with its intended audience in mind.
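The threshold definition in this comment can be sketched in a few lines of code. This is purely illustrative; the function name and numbers are mine, not from the thread.

```python
# Sketch of the comment's definition: "belief" as a >50% probability threshold.
def believes(p: float, threshold: float = 0.5) -> bool:
    """A proposition counts as 'believed' iff its estimated probability
    strictly exceeds the threshold."""
    return p > threshold

print(believes(0.6))  # True: a 60% estimate counts as belief
print(believes(0.5))  # False: exactly 50% is not over the threshold
```

Note that under this definition a 51% estimate and a 99% estimate are both simply "belief", which is exactly the coarseness the comment is defending.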

  21. An interesting take on this is presented by Elie Ayache in his book The Blank Swan (not The Black Swan, which is by NNT). What he proposes is that there is no probability, such as a 10% chance that a light will turn green before I get to it; rather, future possible states of the world only "exist" to the extent that they have a measurable impact today. The easiest example is that of financial derivatives (the area he originally comes from). A contract that pays $1 if something happens and $0 if it doesn't has some value today when priced in an open market. That price, say $0.70, is the only meaningful thing that can be said about those potential future events: not that there is a 70% chance of X occurring, or 10%, or whatever, but only that you can pay $0.70 today to get $1 in that state. To talk about future "probabilities" is meaningless. So in this interpretation, bettors are the link to the future; they connect future events with present values.

    Not that I totally buy it, but it is thought provoking.
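The binary-contract example above can be made concrete with a small sketch. The numbers and function name are hypothetical; the point is only that the market price per dollar of payoff is the primitive quantity, and calling it a "probability" is an extra interpretive step.

```python
# A binary (digital) contract pays a fixed amount if an event occurs, $0 otherwise.
# Under Ayache's reading, the market price IS the meaningful fact about the future;
# the "implied probability" is just that price rescaled per dollar of payoff.
def implied_state_price(contract_price: float, payoff: float = 1.0) -> float:
    """Price paid today per dollar received in the event-state."""
    return contract_price / payoff

price = 0.70  # hypothetical market price of a contract paying $1 if X occurs
print(implied_state_price(price))
```

The standard risk-neutral reading would call the printed number a 70% probability of X; Ayache's claim is that the $0.70 price is all there is.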

  22. Why would an economist care about belief? Economics is about behavior, what people actually do. That's why the idea of utility or value is basically suspect. The only thing you can actually measure is price. It might help to think of belief or value as a local hidden variable, the existence or non-existence of which has as much impact on doing economics as hidden variables have on doing physics.

  23. "Why would an economist care about belief? Economics is about behavior"

    Because people's beliefs inform their behavior.

  24. I think a working, non-ostensive, minimal definition of Belief is simply "a belief is an assertion"; e.g., "If the Earth is flat then 2 and 2 sum to 4" or "the Kraken is real". Anything beyond that is something more than merely a belief. I suppose a fair alternative to what I proposed would be that a Belief is a *sincere* assertion.

    Other than those, I'm not sure that there is a commonality to all variants of Belief.

  25. Jamie 5:05 AM

    Very interesting thread and discussion. Here is a non-academic perspective.

    Beliefs are just the simplifying assumptions we use to help us make sense of the world.

    We all have beliefs because we all have to make simplifying assumptions. None of us has the time, the inclination or the ability to verify all human knowledge from scratch, or to fill in all of the gaps in human knowledge, so we mostly have to fall back on simplifying assumptions, even in areas where others see us as experts.

    Beliefs form into chains and networks using logic, so we end up with consequential beliefs which follow logically from underlying beliefs, and underlying beliefs which follow logically from deep-seated core beliefs. Derp also forms into logic chains. That is why it is so difficult to erase.

    For example, if you assume that the bible is a statement of historical fact then it is entirely logical to use family trees to calculate that the age of the earth is a few thousand years. If the earth is a few thousand years old then it follows logically that evolution must be wrong. Consequential derp such as the denial of evolution is impervious to attack as it is defended by underlying derp which, in turn, is defended by core derp. Derp can only be beaten by attacking the core derp on which the rest of the chain of derp is built.

    When we build layers of logic on top of our core beliefs, we can convince ourselves that we have eradicated belief and are pursuing scientific ‘truth’. However, this can be self-deception. For example, when right-wing economists always justify right-wing policies and left-wing economists always justify left-wing policies, it is clear to everyone outside the economics profession that much of modern economics is just a veneer of logic built on top of some core ideological derp, rather than a scientific pursuit.

    When we express beliefs about the future, unconditional statements are normally pure derp. Scientists tend to caveat their statements with probabilities and risks. For example, there is a 10% chance of rain tomorrow, or you have an 80% chance of surviving a medical procedure. Successful gamblers also tend to be the ones who have the best ability to quantify probability and risk about the future. However, economic forecasts are rarely discussed in terms of probability. Economic policies are rarely evaluated, publicly at least, on their probability of success or risks of failure. Even when economists discuss bets, they rarely mention probability or risk.

    When economists failed to predict the current economic crisis, there was much talk about the ‘credibility’ of economics. It is worth remembering that the word ‘credibility’ comes from the Latin word ‘credo’, which means ‘I believe’. The thing that we call ‘the truth’ is really just a form of belief that has been subjected to rigorous testing against reality. In practice, that means that ‘the truth’ is just a belief that the testing has covered all scenarios from all perspectives. For example, Newton’s laws were considered to be ‘the truth’ only until Einstein thought about certain unusual circumstances in which they fail. Nevertheless, Newton’s laws are still useful in normal circumstances.

    Belief is so strong in all of us that ‘a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it’ (Max Planck).

  26. Paging Dr. Wittgenstein, we have a patient speculating on real, essential meaning.

    http://en.wikipedia.org/wiki/Philosophical_investigations#Meaning_and_definition

  27. Noah, we've thought a lot about the same things.

    I do wish we could find a way to work together to create a system that can cut through most of this and let more people in on the secrets of our economic system. I'm just an evil IT guy, but I believe technology can make a huge difference. It just needs the right mechanisms to fit into the culture it aims to change.

    I'm working hard, but it is difficult to do it all myself. You might even call my product a derp filter. But it has no use without a crowd of users. I need to find the right people to work with on this in the right way.

    Not sure what that is right now, but I'm going to keep plugging away at it. I haven't seen any indication of interest from the bloggers at all, but perhaps I haven't really explained what I'm doing.

  28. Nathanael 10:31 PM

    Noah, contact your local psychology department. They will tell you almost immediately that "belief" is not really a coherent concept.

    "Belief" is used as shorthand for a number of different psychological states, which are not the same as one another. People very frequently "believe" something in one sense and do not "believe" it in another sense. It is of course well known that people frequently hold multiple contradictory beliefs in their head at the same time.

    In short, you are asking two questions.

    One is an empirical psychological question ("What are the various mental states which we characterize as 'belief'?").

    And another is a linguistic question ("Which of these states do I want to use the word 'belief' for?").

    1. Yes. If I had some advice for Noah that wouldn't be taken as condescending, which I don't want to be, it would be this: as an economist, please think for yourself and ignore your father's influence.

      From what I've seen, Noah's father is very, very smart, but his ideas about these subjects are not helpful to a future Nobel winner.

      Psychology is a field that has very strange views, and many of them are just not transferable to a field that has such rigorous mathematical checks as economics does.

  29. Of course we are just atoms rearranging ourselves to understand other atoms... when you look at things too closely (which is what you are doing here) then things devolve into this truth.

    1. Yep. And the truth is that poor people need foodstamps or they will die.

      That is the primary truth.
