How do you define "Probability" without using a synonym?

psionl0
I know that in a Bernoulli trial situation, probability can be defined as the relative frequency of successes if the trial is repeated indefinitely.
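A minimal illustration of that relative-frequency definition (a Python sketch, assuming a fair coin with success probability 0.5):

Code:
import random

# Estimate P(success) for an assumed fair coin by the relative frequency
# of successes; the estimate settles toward 0.5 as trials accumulate.
p_true = 0.5
for n in (100, 10_000, 1_000_000):
    successes = sum(random.random() < p_true for _ in range(n))
    print(f"{n:>9} trials: relative frequency = {successes / n:.4f}")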

But how do we define probability in one-off situations (for example, predicting that a certain event will take place on a certain date)? The concept of probability is still meaningful in these situations because if you can accurately calculate the probability of one-off events, you can (in the long run) make money by betting on them.

Even though it may be possible to accurately calculate the probability of a one-off event, defining probability in this situation is still problematic. Many textbooks define probability as a "degree of belief", but that seems to me to be just a synonym for probability.

Can anybody do better?
 
psionl0 said:
But how do we define probability in one-off situations (for example, predicting that a certain event will take place on a certain date)? [...] Many textbooks define probability as a "degree of belief", but that seems to me to be just a synonym for probability. Can anybody do better?

What do you mean by a one-off situation? Do you mean a situation where no pattern has been established?

I'm not clear on why probability is not an adequate word for this.
 
Many textbooks define probability as a "degree of belief", but that seems to me to be just a synonym for probability.

I don't understand how, or why, you are differentiating between a synonym and a definition. What is wrong with defining "probability" as a quantification of certainty about a proposition on a scale from 0 to 1, where 0 is certainty that the proposition is false, 1 is certainty that the proposition is true, and in between, the probability of a proposition is proportional to our certainty of it? It follows, for instance, that a probability of 0.25 means that we are three times as confident that the proposition is false as we are that it is true.
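That last figure is just the usual conversion from probability to odds; as a quick check:

\[ \frac{P(\text{false})}{P(\text{true})} = \frac{1 - 0.25}{0.25} = \frac{0.75}{0.25} = 3 \]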

Jay
 
What is wrong with defining "probability" as a quantification of certainty about a proposition on a scale from 0 to 1, where 0 is certainty that the proposition is false and 1 is certainty that the proposition is true, . . .
That is certainly how we measure probability/certainty and many maths teachers define probability by the way it is measured.

I would just like to be able to give my students more than a vague intuitive idea of what probability actually is.
 
But how do we define probability in one-off situations (for example, predicting that a certain event will take place on a certain date)?


Aren't all situations one-off situations?

I mean, the Red Sox and the Yankees are tied right now. One of them will win. Bookmakers are giving odds on the game. Their odds are based on the individual player histories, the franchise history, the manager's history ... but there's only one time in the history of the world the Yankees will play the Red Sox on July 29, 2012 and the winner of that event cannot yet be known.

Nor can it ever be known how well the odds-makers predicted the outcome, or how likely a different outcome would have been. All we know is that on this one day, one set of events will happen.

They're all one-offs in my opinion.

ETA: And they're going to extra innings.
 
That is certainly how we measure probability/certainty and many maths teachers define probability by the way it is measured.

I don't understand that statement either. What do you mean by "measure" probability? What do we measure it with?

I think you've got it backward. We measure (i.e., quantify) our degree of certainty by probability, not the other way around. Probability is quantitative; it's the ruler. Certainty is the thing needing measuring, like the length of a desk.

I would just like to be able to give my students more than a vague intuitive idea of what probability actually is.

Well, you could have them read Jaynes, whose book derives the axioms of probability from the axioms of propositional logic. Short of that, I don't understand what is wrong with defining Bayesian probability as a quantification of (un)certainty. Frequentist probability, as you noted in your OP, can be defined as the long-run relative frequency of an event.

Jay
 
Here's another variation:

Probability is a measure of the degree of confidence that we assign to an event's occurrence. We assign 1 for complete confidence, 0 for no confidence, and any number in between depending on the level of confidence.
 
I think Bayesian probability as such is quite a simple concept:

http://en.wikipedia.org/wiki/Bayesian_probability

The problem, as described in the article, arises between the objectivist and subjectivist approaches. I somehow feel that there is an "objective" probability for an event, if everything were known, without it necessarily being 1. But of course we can never know everything, so it seems there always has to be a certain subjective element in judging the odds. Having no mathematics background, this is just a common-sense view, of course, and I would be interested to read comments from mathematically literate persons here...
 
psionl0 said:
But how do we define probability in one-off situations (for example, predicting that a certain event will take place on a certain date)? [...] Many textbooks define probability as a "degree of belief", but that seems to me to be just a synonym for probability. Can anybody do better?

I guess you could define it as “largeness of opportunities”. Imagine a ball that falls from a random position above 4 slots. If the slots are evenly spaced, there are 4 options of equal “size”, so they have equal probability. If you double the width of slot 4, you still have 4 options, but now slot 4 has twice the “size”, so it has twice the probability. Slot 4 has a greater “largeness of opportunities”. Similarly, if there are 5 slots of equal size numbered 1, 2, 3, 4, 4, there are still 4 options, but 4 has twice as many “opportunities”, so it has a greater “largeness of opportunities”.
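A minimal sketch of that picture (in Python, with assumed slot widths 1, 1, 1, 2):

Code:
import bisect
import random

# A ball drops at a uniformly random horizontal position above four slots.
# Assumed widths for illustration: slots 1-3 have width 1, slot 4 has width 2,
# so slot 4 should catch the ball about twice as often as any other slot.
widths = [1, 1, 1, 2]
edges = []          # cumulative right edge of each slot: [1, 2, 3, 5]
total = 0
for w in widths:
    total += w
    edges.append(total)

trials = 100_000
counts = [0] * len(widths)
for _ in range(trials):
    x = random.uniform(0, total)               # where the ball falls
    counts[bisect.bisect_left(edges, x)] += 1  # which slot catches it

for i, c in enumerate(counts, start=1):
    print(f"slot {i}: {c / trials:.3f}")       # roughly 0.2, 0.2, 0.2, 0.4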

The word “opportune” is from the Latin for “favorable”, from the phrase ob portum veniens, “coming toward a port”. Each slot has an opportunity to get the ball, but some may be more opportune or favorable than others based on their size or frequency. Just as wind drives a ship into a harbor, probability drives a ball into a slot.

The problem with this analogy is that it tends to anthropomorphize things. Is there some god who decides which options are the most favorable and who drives the ball to the most favorable slot most of the time? If not, then what does? What drives the ball?

The word “probability” comes from the Latin probare, meaning to test or try. This is similar to the definition you quote, “degree of belief”. The word “chance” comes from the Latin cadens, meaning “to fall”, like that ball I mentioned above. In English it has come to mean both “opportunity” and “randomness”.

I think the puzzle you are facing is not the definition of “probability” but rather the definition of “randomness”. We can calculate the probability that an event will happen, but in a one-off test there can only be one result. What determines the result is “randomness”. In the real world, we don’t see true randomness. People can learn to flip coins to get a certain result. Rolling dice is more difficult, but we could build dice-rolling machines that get certain results fairly accurately. Even computers don’t generate truly random numbers. Perhaps in quantum physics there is something in the real world that is truly random, but I don’t think we understand that yet.

So the “randomness” that determines the result of a one-off event is the very “vague intuitive idea” that you are trying to avoid. I don’t think we can avoid it. But I also don’t think it is a difficult concept.
 
If I were explaining it to adolescents, I would tell them it's a way of estimating how likely or unlikely something is, using the available details to form an educated guess.
 
"Given the information available, the probability of something is our estimation that it will occur."

I guess you could also say that it describes the relative frequency of an event in an infinity of multiverses where all of the information you know holds true.
 
various posters said:
. . . likely . . . certainty . . . degree of confidence . . . largeness of opportunities . . . estimation . . .
Hmmm, we are getting quite a few synonyms here. :D

some other posters said:
. . . So what? . . . Why do you want to know? . . . unnecessary . . .
I have noticed that the harder the question, the more posts of this nature you see. Of course, if it were easy then even I could have answered it. ;)

I know how to teach the concept of probability to children. You run a number of scenarios by them (e.g. "rolling a 7 on a die") until they can correctly apply words such as possible/impossible, certain/uncertain, likely/unlikely or even chance to the scenarios. From there, it is a simple matter to introduce them to the probability scale.

Unfortunately, a definition of probability will probably elude them - even if they become skilled statisticians.
 
Hmmm, we are getting quite a few synonyms here.

As I explained in #29, in the Bayesian sense, probability is degree of belief scaled from 0 to 1. That is close to one definition you had (you also had a reasonable frequentist definition) all along. But for some incomprehensible reason that you have refused to explain, even when specifically asked, you have dismissed it as being a "synonym." You also made an incoherent objection (something about "measure") to my definition in #29, to which I objected, and you just ignored the objection and went on complaining that no one has given you a proper definition yet.

The fact is, we have given you a definition. It is now your job to either accept it, or to explain coherently why it is inadequate. Just repeating "it's a synonym" or it's how we "measure probability" (whatever that means) isn't good enough.

We've done what you've asked. The ball is now in your court.

Unfortunately, a definition of probability will probably elude them - even if they become skilled statisticians.

Don't worry. If they become skilled statisticians, they will learn the modern definition of probability based on the Kolmogorov axioms.
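For reference, the standard statement: a probability measure P on a sample space Ω satisfies the three Kolmogorov axioms:

\[
\begin{aligned}
&1.\ P(A) \ge 0 \ \text{for every event } A,\\
&2.\ P(\Omega) = 1,\\
&3.\ P\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \sum_{i=1}^{\infty} P(A_i) \ \text{for pairwise disjoint events } A_1, A_2, \ldots
\end{aligned}
\]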

Jay
 
As I explained in #29, in the Bayesian sense, probability is degree of belief scaled from 0 to 1. That is close to one definition you had (you also had a reasonable frequentist definition) all along. But for some incomprehensible reason that you have refused to explain, even when specifically asked, you have dismissed it as being a "synonym." You also made an incoherent objection (something about "measure") to my definition in #29, to which I objected, and you just ignored the objection and went on complaining that no one has given you a proper definition yet.
It would appear that I inadvertently brushed you off. Sorry about that. :o

I will deal with your previous post now.
I don't understand that statement either. What do you mean by "measure" probability? What do we measure it with?
"quantify" is a better choice of words.

I think you've got it backward. We measure (i.e., quantify) our degree of certainty by probability, not the other way around. Probability is quantitative; it's the ruler. Certainty is the thing needing measuring, like the length of a desk.
I would have thought that "probability" was like the word "charge", which can refer either to the thing itself or to the measure/quantity of the thing. I'm not disputing your use of the word, but I have never heard it expressed that way before.

I was just hoping that probability could be generally defined in purely mathematical terms just as it can be in the Bernoulli Trial case. I guess the best that can be done is to use some generalized formula.
 
I think the puzzle you are facing is not the definition of “probability” but rather the definition of “randomness”.
"randomness" is something else altogether. I'm not sure that there is anything such as a purely "random" force but it is not germane to the definition of "randomness".

If the outcome of an event can neither be predicted nor controlled, then it can be considered random. Many computer games make use of a pseudo-random number generator, which is not random at all. However, since the average user knows nothing about the seed or algorithm of the generator, it is effectively a random event for the user.
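To illustrate the "not random at all" part, here is a minimal sketch of a linear congruential generator (the constants are the well-known ANSI C textbook example values, used purely for illustration, not any particular game's implementation). Given the same seed, the output sequence is completely determined:

Code:
# A minimal linear congruential generator (LCG). The constants a, c and m
# are the classic ANSI C textbook example values; illustrative only.
def lcg(seed, a=1103515245, c=12345, m=2**31):
    state = seed
    while True:
        state = (a * state + c) % m
        yield state

gen = lcg(seed=42)
print([next(gen) for _ in range(3)])  # identical output on every run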
 
Proper pseudo-random number algorithms pass tests that show their distributions don't significantly deviate from random expectations.

However, they can easily be stretched too far if you don't know their limitations. I once did a simulation of some Dungeons and Dragons combat. This used the C rand() function, built on one very well-known pseudo-random number generator. I forget the details, but for some hit percentage, say 10%, the longer you ran it, the longer the longest chain of consecutive hits you would see.

Fair enough. However, at some point you got to 9 in a row but never, ever got 10, no matter how long you waited, including waits many times longer than 10x what it took to see 9 in a row.

If your simulation relies on such strings, look for better generators.
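A minimal sketch of that kind of check (in Python, using the 10% hit chance from the example above; any generator exposing a random() function could be swapped in via the rng parameter):

Code:
import random

# Track the longest run of consecutive "hits" at an assumed 10% hit chance.
# With a sound generator the longest observed run keeps creeping upward as
# the sample grows; a flawed generator may plateau at some hard ceiling.
def longest_run(trials, p=0.10, rng=random.random):
    best = current = 0
    for _ in range(trials):
        if rng() < p:
            current += 1
            best = max(best, current)
        else:
            current = 0
    return best

for n in (10**4, 10**5, 10**6, 10**7):
    print(f"{n:>8} trials: longest hit streak = {longest_run(n)}")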
 
For the nerds: the internal seed value is 32 bits, and the generator just returns the low (?) 16 bits. The algorithm then generates the next value based on that seed and returns the low 16 bits again.

The seed for the next number is just the full 32 bits of the value computed on the previous call.

When you call srand() you're just initializing this value. Typically it generates numbers 55 at a time, so every 55th call to rand() will take some extra time.

Anyway, this immediately suggests a cycle of 4-billion-ish values before the sequence repeats. Hence you would be unlikely to see runs whose probability is much lower than about 1 in 4 billion.
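A quick back-of-the-envelope check ties this to the 10% example above (assuming a cycle of 2^32 outputs):

\[
2^{32} \times 0.1^{9} \approx 4.3, \qquad 2^{32} \times 0.1^{10} \approx 0.4
\]

So a full cycle would be expected to contain a few 9-hit runs but quite plausibly no 10-hit run at all, and since the cycle is fixed, waiting longer never helps.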

If there were a lot of those, then the algorithm would fail statistical randomness checks and would not have been used, of course.
 
