
After reading up on probability theory, it seems that there are two camps: objective and subjective probability theory.

Objective probability can refer to something like the probability of a die roll being 1/6. Often these are frequentist claims, and what they really mean is “if I were to roll this die many times, about 1/6 of the rolls would land on 1.” Okay, so in this case the probability just means historical frequency. So ultimately we already have a term for this, frequency, which we then rephrase as probability for god knows what reason.

Subjective probability, common in Bayesianism, treats probability as a matter of how strongly you believe something. But why on earth is this relevant, important, significant, or even worth talking about? It came as a surprise to me that entire books have been written about this concept and about how to systematize your “degrees of belief” when you come across evidence. But since when does how strongly you believe something have anything to say about what’s true? Why would how many people strongly believe that the earth is flat matter in philosophy? If it doesn’t, as subjective probability proponents will hopefully recognize, why bother systematizing it?

If philosophy is about seeking truth and not bookkeeping your feelings, why is this part of it?

Baby_philosopher
  • See Henry E. Kyburg Jr., "Are there degrees of belief?" (2003): Keynes is unequivocal in his insistence that probability represents a logical relation that is objective. “... in the sense important to logic, probability is not subjective. It is not, that is to say, subject to human caprice. A proposition is not probable because we think it so. When once the facts are given which determine our knowledge, what is probable or improbable in these circumstances has been fixed objectively, and is independent of our opinion. 1/2 – Mauro ALLEGRANZA Feb 09 '24 at 14:05
  • The Theory of Probability is logical, therefore, because it is concerned with the degree of belief which it is rational to entertain in given conditions, and not merely with the actual beliefs of particular individuals, which may or may not be rational” “When we argue that Darwin gives valid grounds for our accepting his theory of natural selection, we do not simply mean that we are psychologically inclined to agree with him; ... We believe that there is some real objective relation between Darwin’s evidence and his conclusions . . . ” 2/2 – Mauro ALLEGRANZA Feb 09 '24 at 14:06
  • @MauroALLEGRANZA But things either are true or not. Probability is not inherent in things. How can there be a mind-independent standard or relation between evidence and how strong your belief should be? – Baby_philosopher Feb 09 '24 at 15:28
  • Probably indeed so, since credence, aka degree of belief, as the principle behind probabilities often gets emotional when applied to anything in reality, except perhaps tossing an absolutely fair, cold, hard, objective coin... – Double Knot Feb 09 '24 at 19:01
  • I strongly disagree with the distinction you make between frequentist and bayesian, or the definitions you give for frequentist and bayesian. The prototypical use-case of Bayes' theorem is the "what is the probability that you have the disease, knowing that you tested positive for the disease?" problem, in which both the prior "probability to have the disease" and "probability to test positive" are literally measured as frequencies in the population, and certainly not as "beliefs" (a worked version of this calculation is sketched after these comments). – Stef Feb 09 '24 at 21:56
  • Did you miss the parts about whose degrees of belief are considered? "Probabilities are degrees of confidence, or credences, or partial beliefs of suitable agents. Thus, we really have many interpretations of probability here — as many as there are suitable agents... the suitable agents must be, in a strong sense, rational... this implies that the agent obeys the axioms of probability", SEP. "Subjective" and "belief" do not mean what they mean in psychology, for "rational agents that obey the axioms" emotions are moot. – Conifold Feb 10 '24 at 00:01
  • @Stef Bayes' theorem is a feature of any interpretation of probability. After all, it is a theorem. But Bayesianism is a distinctly different interpretation from the frequentist and other interpretations. It understands probabilities as degrees of rational belief and treats probability theory as a way to calculate and update degrees of beliefs in a rational fashion. The result is quite a different set of methods. E.g. Bayesians do not use null hypothesis significance testing. They do not use confidence intervals: they have their own similar quantity called a credible interval. – Bumble Feb 10 '24 at 02:07
  • @Bumble ...Yes? Thank you, I guess? – Stef Feb 10 '24 at 09:45
  • @Conifold The axioms of probability don’t tell you what credence you should have in a particular proposition. For example, your credences in P and in not-P should add up to 1, but there are an infinite number of ways for them to add to 1. So your comment is a distraction from what was actually being referred to. Secondly, no axiom in probability can tell you that you should represent beliefs as probabilities in the first place – Baby_philosopher Feb 15 '24 at 22:24
  • Whatever indeterminacies there are cannot be filled by emotions when the agents considered do not have them. So how are your remarks relevant to the question? – Conifold Feb 16 '24 at 00:39
  • @Conifold Indeterminacies do not exist outside the mind. Meaningful propositions are either true or not: the notion of mind independent probability is meaningless. If the agents considered in your proposal do not have emotions, then they must also not have indeterminacies, so there is nothing to fill. An indeterminacy is merely a feeling/sensation/state of belief/call it whatever word you want: the world is indifferent to it all the same. – Baby_philosopher Feb 16 '24 at 04:26
  • I do not have a proposal, but "if X does not have emotions, then it must also not have indeterminacies" is trivially invalid (take any quantum system). Bayesians do have a calculus and credences there are typically used comparatively. So it matters little what their absolute values are, only how they are updated, and that is done by formulaic rules. "Mind independent probability is meaningless" is also trivially false. What is true is that "subjective" probability is available information-dependent, but whether the holder of said information is a mind or a computer is moot. – Conifold Feb 16 '24 at 06:47
  • @Conifold We were talking about indeterminacies that an agent has. These are psychological states that cannot exist mind independently. Can you give me an example of a mind independent probabilistic statement that doesn’t just resort to talking about frequencies? Even probability in QM is just shorthand for the results we see through empirical tests in the past. Regardless of QM, there is no mind independent standard that tells you what degree of belief you should have in a proposition. After all, there can’t be. Beliefs only exist in your head and so do probabilities behind belief. – Baby_philosopher Feb 16 '24 at 20:57
  • What is the degree of belief that one should have in the proposition that beliefs should be represented as degrees and updated a certain way? – Baby_philosopher Feb 16 '24 at 21:01
  • Again, your further questions are moot to the title question. What credences a rational agent should have and how they should be updated is a separate issue, but the answer is certainly not determined by "just emotions", or emotions at all. There are Dutch book arguments that for predictive success they should use probability calculus, and that is surely mind-independent. You keep confusing mind dependence with available information dependence, and mind with computations, see Yudkowsky on Less Wrong. – Conifold Feb 16 '24 at 21:19
  • What I’m trying to say is that obeying the probability calculus does not tell you what credence you should have. At best they tell you how to update credences, not your prior ones. And so the title question implies that those “credences” are ultimately nothing more than conscious sensations since there is nothing in reason that can tell you what credence you should have or what it even means to have a credence in the first place. The very notion of a degree of belief is up for debate @Conifold – Baby_philosopher Feb 16 '24 at 23:36
  • What is the case does not tell you what should be or what to do; that is just Hume's guillotine and applies universally. It is also irrelevant here since predictive success is assumed as the purpose. Specific values of priors are moot when comparing predictive success after many updates. Bayesianism is not about their selection (one can pick some nominal value), it is about what to do with them to make decisions. So yes, reason can tell you what rational credences should be given the purpose and modulo irrelevant numerics, and what the title question "implies" is trivially false. – Conifold Feb 16 '24 at 23:50
  • @Conifold “Predictive success” deals with the base rate problem where you have to decide which one of the past cases a new case is similar to, and that is up to judgement that cannot itself be instilled through Bayesianism. More importantly, important propositions such as God’s existence have no base cases to be compared to, so there’s simply no way reason can help you not only define what credence is but what specific credence you should have. – Baby_philosopher Feb 17 '24 at 03:56
  • So your "argument" is that since Bayesianism cannot do everything, it therefore can do nothing but channel emotions. And without slapping a number on the prior "there’s simply no way reason can help you". I got news for you. Electrostatic potential is also defined only up to an arbitrary number. So there is no way reason can help you with the potential you should have, and electrostatics is just emotions. Kidding. Physicists do with it what Bayesians do with the prior: they call it fixing the gauge. – Conifold Feb 17 '24 at 05:08
  • @Conifold It’s not about whether or not the prior is accurate: it’s about the fact that the very notion of a probability attached to a proposition is meaningless. And there is nothing in belief making that requires attaching probabilities to your beliefs. Arguably, it is actually never rational to attach them; we only ever believe something as opposed to something else anyways, never by itself. One just needs to know which one of A, B, or C one believes in if they are mutually exclusive. One does not need to attach probabilities to them to ever make a decision – Baby_philosopher Feb 17 '24 at 06:27
  • It is not about whether or not the potential is accurate either, accuracy is moot, it only serves to determine the field. The analog of the field in Bayesianism is the ranking, the prior is just a mathematical convenience, like potentials, frames of reference, etc. Apparently, you cannot distinguish "does not have invariant meaning" from "meaningless" either. I am not even mentioning attaching emotions to minds, which need not have them, etc. This "argument" is so full of non-sequiturs that you need to scrap it and start from scratch. With a more defensible thesis. – Conifold Feb 17 '24 at 09:30
  • @Conifold Your analogy is poor then, since not only is electric potential mind-independent, unlike subjective degrees of belief, but the very notion of potential is defined with respect to a field. Talk of fields implies talk of potential. However, talk of rankings among propositions does not imply talk of probabilities or degrees. People ranked beliefs among propositions in their heads without resorting to probabilities, degrees of belief, or any form of Bayesian-like updating for centuries. They did this by comparing epistemic values among different theories: no numbers needed – Baby_philosopher Feb 17 '24 at 19:33
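
For reference, here is a worked version of the disease-test calculation Stef mentions in the comments above (a minimal sketch; the prevalence, sensitivity, and false positive rate are assumptions chosen for illustration, not taken from the thread). Both inputs are frequencies measured in a population, as Stef says, yet the output is most naturally read as how strongly one should believe the patient has the disease given a positive test:

```python
# Classic base-rate example referenced in the comments.
# All numbers are illustrative assumptions.
prevalence = 0.01           # P(disease), a population frequency
sensitivity = 0.95          # P(test positive | disease)
false_positive_rate = 0.05  # P(test positive | no disease)

# Total probability of a positive test, then Bayes' theorem.
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(round(p_disease_given_positive, 3))  # about 0.161: even after a positive
                                           # test, the disease remains unlikely
```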

3 Answers


Rather than objective and subjective, it might be better to say that, broadly speaking, accounts of probability divide into physical and epistemic. Bayesians and others fall into the epistemic camp and treat probabilities as a way to represent degrees of rational belief.

This is valuable in philosophy because we are interested not only in what things are true but also in questions like: How do we know things are true? Is there such a thing as a justification for a belief? Are some beliefs more strongly justified than others? If so, how? Are there criteria for determining what combinations of beliefs are inconsistent? Is it possible to devise a formal system for describing how we confirm and disconfirm beliefs? All of these questions form part of epistemology.

If you proceed to ask why these questions are important, then it is because life is full of uncertainty. Much of the information we have is inaccurate and imprecise. Arguably, all of our information is incomplete. We are constantly compelled as a matter of practical necessity to form uncertain beliefs and make decisions under this uncertainty. Being able to quantify uncertainty helps greatly if we wish to make good decisions and avoid bad decisions. Probability theory, understood epistemically, is a step in the direction of quantifying uncertainty. Not perfect, but a good approximation.

The fact that probability theory is useful for this purpose can be justified theoretically in at least two ways. One was worked out by Richard Cox in a series of articles published as The Algebra of Probable Inference. Cox shows how probability theory arises as a way of describing how degrees of rational belief are conserved in valid arguments. Another approach was taken by Bruno de Finetti starting from decision theory and using Dutch book arguments. De Finetti shows how probability theory can be derived from assumptions about how to avoid irrational combinations of beliefs, where irrationality is characterised by the criterion that were you to bet on your beliefs you would find yourself in a position where you are bound to lose.
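
To make the Dutch book idea concrete, here is a minimal sketch in Python (the rain example and the 0.6/0.6 credences are illustrative assumptions of mine, not de Finetti's): an agent whose credences in a proposition and its negation sum to more than 1 can be sold a pair of bets, each at the agent's own fair price, that lose money no matter what happens.

```python
# Minimal Dutch book sketch (illustrative numbers only).
# A fair-price bet on a proposition, at credence p, costs p and pays 1 if the
# proposition turns out true, 0 otherwise.

def bet_payoff(credence, outcome_true, stake=1.0):
    """Net payoff of buying a bet at the agent's own fair price."""
    cost = credence * stake
    payout = stake if outcome_true else 0.0
    return payout - cost

credence_rain = 0.6
credence_no_rain = 0.6  # incoherent: 0.6 + 0.6 = 1.2 > 1, violating the axioms

for it_rains in (True, False):
    total = (bet_payoff(credence_rain, it_rains) +
             bet_payoff(credence_no_rain, not it_rains))
    print(f"it_rains={it_rains}: agent's net result = {total:+.2f}")

# The agent loses 0.20 whether it rains or not -- a guaranteed loss.
```

If the two credences summed to exactly 1, the same pair of bets would net to zero in both cases, which is the sense in which obeying the probability axioms protects against a sure loss.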

Using probabilities epistemically is highly practical: it is used in risk analysis, in actuarial calculations by insurance companies, in forensic analysis, in evidence-based medicine, in machine learning, and in cryptography.

One common application is probabilistic information retrieval. Search engines, at least in the early days, expressly used Bayesian methods to return results. When you type a search expression, the engine determines how probable it is that you are interested in a particular document or web page, conditional upon the given search terms, and it ranks the results accordingly. When you click to read a page, the engine updates its assessment to determine how probable it is that you are interested in some other documents, given the ones you have already looked at.
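
A toy sketch of the kind of ranking-by-posterior described above (the two documents, the priors, and the term likelihoods are hypothetical numbers chosen purely for illustration; real engines are far more elaborate):

```python
# Toy Bayesian relevance ranking (hypothetical numbers, for illustration only).
# prior: P(document is relevant) before seeing the query term
# p_term_given_rel / p_term_given_irrel: P(term appears | relevant / irrelevant)

def posterior_relevance(prior, p_term_given_rel, p_term_given_irrel):
    """P(relevant | term observed), computed via Bayes' theorem."""
    numerator = p_term_given_rel * prior
    evidence = p_term_given_rel * prior + p_term_given_irrel * (1 - prior)
    return numerator / evidence

docs = {
    "doc_A": dict(prior=0.10, p_term_given_rel=0.80, p_term_given_irrel=0.05),
    "doc_B": dict(prior=0.10, p_term_given_rel=0.30, p_term_given_irrel=0.10),
}

# Rank documents by posterior probability of relevance given the search term.
ranking = sorted(docs, key=lambda d: posterior_relevance(**docs[d]), reverse=True)
for d in ranking:
    print(d, round(posterior_relevance(**docs[d]), 3))  # doc_A 0.64, doc_B 0.25
```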

Bumble
  • The cases you mention are not epistemic. They are as objective as the probability of a die roll landing on 1 being 1/6. You look at base data, create frequencies, and then when you come across new data, you assume it’s representative and assign a figure based on how much you would bet on it being the case. None of this tells you what you should believe. In fact, the very notion of how “sure” you are of something seems to be fundamentally an emotion. It is not out there in the world attached to things – Baby_philosopher Feb 09 '24 at 15:33
  • If the probabilities are interpreted as degrees of belief, I would say they are epistemic. This is why I prefer this distinction to talking about objective vs. subjective. There are objective Bayesians and subjective Bayesians but both treat probabilities as degrees of rational belief. Degrees of belief are mental states, but they are not emotions. – Bumble Feb 09 '24 at 16:06
  • @Bumble If degrees of belief are not emotions, do you intend them as mental attitudes which are clearly distinct from emotions as mental states? If not, then what exactly is the difference between degrees of belief as mental states and emotions as mental states?... – Double Knot Feb 09 '24 at 19:33
  • @DoubleKnot Beliefs are quite different things from emotions. There are many different kinds of mental states. E.g., sensation, thought, desire, mood, motivation, imagination, etc. – Bumble Feb 10 '24 at 01:58
  • Beliefs, as a type of mental state assumed here to really exist non-eliminatively, are generally known as propositional attitudes, so are you sure that degrees of such attitudes are mutually exclusive from emotion-type mental states? – Double Knot Feb 10 '24 at 04:40
  • @DoubleKnot Well, many emotions are not propositional attitudes. Beliefs are not usually considered to be emotions in commonly used classifications. Though I'm aware that some philosophers have argued that beliefs are directly motivating in the way that desires are. David Lewis argued against this thesis. – Bumble Feb 10 '24 at 05:21
  • I'm aware of your above 'aware', where non-committed awareness is close to the usual committed belief, and even with that 'aware', I smell some transient emotional state here, and then I see the name David Lewis pop up... – Double Knot Feb 11 '24 at 06:17
  • @Bumble Emotions or not, either way these degrees of belief are psychological states. Thus, degrees of belief must be mind dependent. But the nature of reality does not depend upon our mind. Thus, degrees of belief are irrelevant to determining what’s real and what is not. 99% of the world having a 90% degree of belief in the earth being flat says absolutely zero about whether the earth is actually flat. If you agree with this, then I see no use case for Bayesianism in updating beliefs in your head. Note that I’m not arguing against the use of Bayesian statistics with defined datasets. – Baby_philosopher Feb 16 '24 at 04:35
  • Beliefs are mental states, and the way the world stands is not dependent on our beliefs. But that hardly makes degrees of belief useless. People have to make decisions with uncertain and incomplete information. There is an important difference between good decisions and bad decisions. Making good decisions depends on having rational beliefs. Bayesian updating is one fairly good way to form rational beliefs from uncertain information. So Bayesianism is a helpful way to make good decisions. – Bumble Feb 16 '24 at 06:45
  • @Bumble Bayesianism cannot justify the belief that beliefs should be represented as degrees of probability and updated in a certain way. So it begs the question for it leading to good decisions – Baby_philosopher Feb 16 '24 at 21:00
  • I don't agree that it is question-begging. The work of de Finetti in particular shows that the probability calculus does a pretty good job of representing degrees of belief that adhere to a minimal criterion of rationality, viz, that you should not hold combinations of beliefs that, were you to bet on them, would lead to a guaranteed loss. And in practice lots of businesses use the Bayesian approach to reasoning with uncertain information in order to improve decision making. – Bumble Feb 17 '24 at 13:16

Probabilities are subjective in the way that an estimate might be. If I show you a pile of stones and ask how many stones you think are in the pile, you will come up with your guess, which is likely to be different from mine, based on your personal assessment.

Clearly, the guess is only as good as the guesser. If you show me a pile of thousands of stones and I stupidly guess there are a hundred, my ridiculous guess doesn't increase the likelihood of there being a hundred. Probability theory is subject to the 'garbage in, garbage out' rule in a big way.

Leaving aside stupidity (as much as I am able!), suppose I go on to ask you how confident you are that your guess about the number of stones in the pile is correct to within ten percent - how would you answer that? Suppose I also ask how confident you are that your guess is correct to within fifty percent - how would you answer that? And suppose I ask you how confident you are that your guess is correct to within 99.999 percent - how would you answer that?

If you are rational, your degree of confidence will increase in line with the size of the margin for error I give you. To take another analogy, if I give you a dart, you are probably more confident in hitting the board than in hitting the bullseye. If the board is within touching distance, you will probably say you are certain that you can put the dart in it. If the board is a mile away, you will be certain that you can't hit the bullseye. Between those extremes, you are likely to have intermediate degrees of confidence in the various outcomes.

Probabilities are just a way of comparing different degrees of confidence. They should line up with frequentist probabilities in the sense that if I judge that I have a 0.5% chance of hitting the bullseye, I should find that I hit it five times in a thousand throws, or thereabouts. It's not an exact science, but it is still meaningful. If I judge that my probability of hitting a twenty is higher than that of hitting a bull, it is a meaningful assessment of the likely outcomes, and it will be borne out in tests when I try to hit those targets. So you are going too far if you write off subjective probabilities as 'just' emotions.
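
As a small illustration of that calibration claim (assuming, purely for the sake of the sketch, that the judged 0.5% chance equals the true chance of hitting the bullseye):

```python
# Calibration sketch: if my judged chance of hitting the bullseye is 0.005 and
# that judgement is well calibrated (assumed here by construction), the observed
# frequency over many throws should come out close to 0.005.
import random

random.seed(0)
judged_probability = 0.005  # "0.5% chance of hitting the bullseye"
throws = 1000

hits = sum(random.random() < judged_probability for _ in range(throws))
print(f"hits in {throws} throws: {hits} "
      f"(expected about {judged_probability * throws:.0f})")
```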

Marco Ocram
  • There is a fact of the matter behind how many stones there are in a pile. There is no fact of the matter as to how “certain” one should be in a belief. The notion of that is merely a sensation in consciousness and nothing else – Baby_philosopher Feb 09 '24 at 15:34
  • @thinkingman All assessments are the product of the human mind. That doesn't make them the same as emotions. – Marco Ocram Feb 09 '24 at 17:07
  • There has been some research showing that humans tend to be good at making estimations, but tend to be really, really bad at estimating the confidence or accuracy of their own estimations. – Stef Feb 09 '24 at 21:52
  • @stef that doesn't surprise me, as I'm really, really bad at most things. – Marco Ocram Feb 09 '24 at 22:21

The bottom line is that the origin of probability is an engineering design problem. We humans are interested in how to design a system of reasoning that produces judgments that tend to yield better outcomes.

It just so happens that probabilistic reasoning tends to yield better outcomes in many domains. If we are designing robots, a robot capable of weighing the probabilities of different outcomes has a tool it can use to outperform a robot that only focuses on a single outcome and assumes that will definitely happen. If we are evaluating stocks on the stock market, an analyst capable of estimating the numerical risk of different stocks will outperform an analyst who can't do that. If we are predicting where photons will be observed in a quantum physics experiment, someone incapable of probabilistic reasoning simply can't do the job.
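
Here is a minimal sketch of that contrast (the actions, probabilities, and payoffs are invented for illustration): an agent that weighs each outcome by its probability picks the action with the higher expected payoff, while an agent that simply assumes its single best outcome will definitely happen does not.

```python
# Hypothetical decision problem: two actions, each with (probability, payoff)
# pairs. All numbers invented for illustration.
actions = {
    "risky":  [(0.10, 100.0), (0.90, -20.0)],
    "steady": [(1.00, 5.0)],
}

def expected_payoff(outcomes):
    """Probability-weighted average payoff of an action."""
    return sum(p * payoff for p, payoff in outcomes)

def best_guess_payoff(outcomes):
    """Payoff assumed by an agent that ignores probabilities and just
    takes the single best outcome as certain."""
    return max(payoff for _, payoff in outcomes)

prob_choice = max(actions, key=lambda a: expected_payoff(actions[a]))
naive_choice = max(actions, key=lambda a: best_guess_payoff(actions[a]))

print("probability-weighing agent picks:", prob_choice)   # steady (EV +5 vs -8)
print("single-outcome agent picks:", naive_choice)        # risky (hopes for +100)
```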

Precisely which probabilities an agent ought to assign for maximum performance will depend on many factors, and sometimes there might be no way to pin down the exact best set of probabilities. (Or maybe there is, in theory - see the Aumann Agreement Theorem). But we do find that agents that use probabilistic reasoning tend to outperform agents who don't use it. So it's important. Finding the right probabilities - or at least good ones, reasonable ones - is often key to an agent's success.

causative