## Conditional probability

The conditional probability of an event A, given an event B with P(B) > 0, is defined by:

P(A|B) = P(A ∩ B) / P(B)

This last equation specifies a new (conditional) probability law on the same sample space Ω. In particular, all properties of probability laws remain valid for conditional probability laws.

• Conditional probabilities can also be viewed as a probability law on a new universe B, because all of the conditional probability is concentrated on B.
• If the possible outcomes are finitely many and equally likely, then:

P(A|B) = (number of elements of A ∩ B) / (number of elements of B)

`Explanation`

Conditional probability provides us with a way to reason about the outcome of an experiment, based on partial information. Here are some examples of situations we have in mind:

• In an experiment involving two successive rolls of a die, you are told that the sum of the two rolls is 9. How likely is it that the first roll was a 6?
• In a word guessing game, the first letter of the word is a “t”. What is the likelihood that the second letter is “h”?
• How likely is it that a person has a certain disease given that a medical test was negative?
• A spot shows up on a radar screen. How likely is it to correspond to an aircraft?

In more precise terms, given an experiment, a corresponding sample space, and a probability law, suppose that we know that the outcome is within some given event B. We wish to quantify the likelihood that the outcome also belongs to some other given event A. We thus seek to construct a new probability law that takes into account the available knowledge: a probability law that, for any event A, specifies the conditional probability of A given B, denoted by P(A|B).

An appropriate definition of conditional probability when all outcomes are equally likely is given by:

P(A|B) = (number of elements of A ∩ B) / (number of elements of B)

Generalizing the argument, we introduce the following definition of conditional probability:

P(A|B) = P(A ∩ B) / P(B)

where we assume that P(B) > 0; the conditional probability is undefined if the conditioning event has zero probability. In words, out of the total probability of the elements of B, P(A|B) is the fraction that is assigned to possible outcomes that also belong to A.
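For instance, the first question above (two successive die rolls with a sum of 9) can be worked out both by counting and by the general definition; a minimal Python sketch:

```python
from fractions import Fraction
from itertools import product

# Sample space: the 36 equally likely ordered outcomes of two fair die rolls
omega = set(product(range(1, 7), repeat=2))

B = {w for w in omega if w[0] + w[1] == 9}   # conditioning event: the sum is 9
A = {w for w in omega if w[0] == 6}          # event of interest: first roll is 6

# Equally likely outcomes: P(A|B) = |A ∩ B| / |B|
by_counting = Fraction(len(A & B), len(B))

# General definition: P(A|B) = P(A ∩ B) / P(B), with P(B) > 0
p = lambda event: Fraction(len(event), len(omega))   # uniform law on omega
by_definition = p(A & B) / p(B)

print(by_counting, by_definition)  # 1/4 1/4
```

Out of the four equally likely pairs that sum to 9, namely (3,6), (4,5), (5,4), (6,3), only one has a first roll of 6, so the conditional probability is 1/4.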

Sources:

Prof. Larry Francis Obando – Technical Specialist – Educational Content Writer

WhatsApp: +34 633129287, +593998524011

Copywriting, Content Marketing, Theses, Monographs, Academic Papers, White Papers (Spanish – English)

Escuela de Ingeniería Eléctrica de la Universidad Central de Venezuela, UCV CCs

Escuela de Ingeniería Electrónica de la Universidad Simón Bolívar, USB Valle de Sartenejas.

Escuela de Turismo de la Universidad Simón Bolívar, Núcleo Litoral.

Contact: Spain +34 633129287; Caracas, Quito, Guayaquil, Cuenca.

## Probabilistic model – models and axioms

A probabilistic model is a mathematical description of an uncertain situation. It must be in accordance with a fundamental framework that has two main ingredients.

`Introduction`

A probabilistic model is a quantitative description of a situation, a phenomenon, or an experiment whose outcome is uncertain. Putting together such a model involves two key steps.

First, we need to describe the possible outcomes of the experiment. This is done by specifying a so-called sample space, denoted by Ω.

Second, we specify a probability law, which assigns probabilities to outcomes or to collections of outcomes. The probability law tells us, for example, whether one outcome is much more likely than some other outcome.

Probabilities have to satisfy certain basic properties in order to be meaningful. These are the axioms of probability theory. For example, probabilities cannot be negative. Interestingly, there will be very few axioms, but they are powerful, and we will see that they have lots of consequences: they imply many other properties that were not part of the axioms.

`Sample space and Events`

Every probabilistic model involves an underlying process, called the experiment, that will produce exactly one out of several possible outcomes. The set of all possible outcomes is called the sample space of the experiment, and is denoted by Ω. A subset of the sample space, that is, a collection of possible outcomes, is called an event. It is important to note that in our formulation of a probabilistic model, there is only one experiment.

The sample space of an experiment may consist of a finite or an infinite number of possible outcomes. Finite sample spaces are conceptually and mathematically simpler. Still, sample spaces with an infinite number of elements are quite common. As an example, consider throwing a dart on a square target and viewing the point of impact as the outcome.
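For the dart example, if we assume the uniform law on the unit square, the probability of hitting a region is its area; a minimal sketch for axis-aligned rectangular regions:

```python
# Dart on the unit square [0,1] x [0,1] under the uniform law (an illustrative
# assumption): the probability of hitting an axis-aligned rectangle is its area.
def prob_rect(x0, y0, x1, y1):
    """P(impact lands in [x0,x1] x [y0,y1]), clipped to the unit square."""
    w = max(min(x1, 1.0) - max(x0, 0.0), 0.0)
    h = max(min(y1, 1.0) - max(y0, 0.0), 0.0)
    return w * h

print(prob_rect(0.0, 0.0, 0.5, 0.5))  # 0.25 — the lower-left quarter
```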

Regardless of their number, different elements of the sample space should be distinct and mutually exclusive, so that, when the experiment is carried out, there is a unique outcome.

Generally, the sample space chosen for a probabilistic model must be collectively exhaustive, in the sense that no matter what happens in the experiment, we always obtain an outcome that has been included in the sample space. In addition, the sample space should have enough detail to distinguish between all outcomes of interest to the modeler, while avoiding irrelevant details.

To summarize: this set should be such that, at the end of the experiment, you should always be able to point to one, and exactly one, of the possible outcomes and say that this is the outcome that occurred. Physically different outcomes should be distinguished in the sample space and correspond to distinct points. But when we say physically different outcomes, what do we mean? We really mean different in all relevant aspects, but perhaps not different in irrelevant aspects.

`Probability Laws`

Suppose we have settled on the sample space associated with an experiment. To complete the probabilistic model, we must now introduce a Probability Law.

Intuitively, a probability law specifies the “likelihood” of any outcome, or of any set of possible outcomes (an event, as we called it earlier). More precisely, the probability law assigns to every event A a number P(A), called the probability of A, satisfying the following axioms:

1. Nonnegativity: P(A) ≥ 0 for every event A.

2. Additivity: If A and B are two disjoint events, then the probability of their union satisfies P(A ∪ B) = P(A) + P(B). More generally, if the sample space has an infinite number of elements and A1, A2, A3, … is a sequence of disjoint events, then the probability of their union satisfies P(A1 ∪ A2 ∪ A3 ∪ …) = P(A1) + P(A2) + P(A3) + …

3. Normalization: The probability of the entire sample space is equal to 1, that is, P(Ω) = 1.

In order to visualize a probability law, consider a unit of mass which is “spread” over the sample space Ω. Then, P(A) is simply the total mass that was assigned collectively to the elements of A. In terms of this analogy, the additivity axiom becomes quite intuitive: the total mass in a sequence of disjoint events is the sum of their individual masses.
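These axioms can be checked mechanically for any finite probability law; a minimal Python sketch (the four-outcome law is an illustrative assumption):

```python
import math

# A candidate probability law on a four-outcome sample space (illustrative values)
law = {"a": 0.1, "b": 0.2, "c": 0.3, "d": 0.4}

def prob(event):
    """Total probability mass assigned to the outcomes in the event."""
    return sum(law[s] for s in event)

# Axiom 1 (nonnegativity): P(A) >= 0 for every event
assert all(p >= 0 for p in law.values())

# Axiom 2 (additivity): for disjoint A and B, P(A ∪ B) = P(A) + P(B)
A, B = {"a"}, {"c", "d"}
assert A.isdisjoint(B)
assert math.isclose(prob(A | B), prob(A) + prob(B))

# Axiom 3 (normalization): P(Ω) = 1
assert math.isclose(prob(law.keys()), 1.0)
```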

There are many natural properties of a probability law which can be derived from the axioms. For example, using the normalization and additivity axioms we may find the probability of the empty event P(Ø) as follows: since Ω and Ø are disjoint, P(Ω) = P(Ω ∪ Ø) = P(Ω) + P(Ø) = 1. This implies that P(Ø) = 0.

`Discrete Model - Discrete Probability Law`

If the sample space consists of a finite number of possible outcomes, then the probability law is specified by the probabilities of the events that consist of a single element. In particular, the probability of any event {s1, s2, …, sn} is the sum of the probabilities of its elements:

P({s1, s2, …, sn}) = P(s1) + P(s2) + … + P(sn)

In the special case where the probabilities P(s1), P(s2), …, P(sn) are all the same, in view of the normalization axiom we obtain the following law.
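A minimal Python sketch of such a discrete probability law (the loaded-die probabilities are illustrative assumptions):

```python
from fractions import Fraction

# Single-element probabilities of a hypothetical loaded die
p = {1: Fraction(1, 12), 2: Fraction(1, 12), 3: Fraction(1, 6),
     4: Fraction(1, 6), 5: Fraction(1, 4), 6: Fraction(1, 4)}

# P({s1, ..., sn}) = P(s1) + ... + P(sn)
def prob(event):
    """Probability of an event as the sum of its elements' probabilities."""
    return sum(p[s] for s in event)

print(prob({5, 6}))    # 1/2 — this die favors high faces
print(prob(set(p)))    # 1 — the normalization axiom holds
```

When all six masses are equal to 1/6 (a fair die), `prob` reduces to the discrete uniform law P(A) = |A| / 6 described next.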

`Discrete Uniform Probability Law `

If the sample space consists of n possible outcomes which are equally likely (i.e., all single-element events have the same probability), the probability of any event A is given by:

P(A) = (number of elements of A) / n

`Continuous Model`

Discrete models are conceptually much easier. Continuous models involve some more sophisticated concepts.

Probabilistic models with a continuous sample space differ from their discrete counterparts in that the probabilities of the single-element events may not be sufficient to characterize the probability law.
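For instance, under the uniform law on [0, 1] (an illustrative continuous model), every single point has probability zero, so the law is instead specified by interval probabilities; a minimal sketch:

```python
# Uniform probability law on the interval [0, 1]: the probability of a
# subinterval is its length, so single-element events carry zero probability.
def prob_interval(a, b):
    """P([a, b]) under the uniform law on [0, 1], clipped to the interval."""
    lo, hi = max(a, 0.0), min(b, 1.0)
    return max(hi - lo, 0.0)

print(prob_interval(0.25, 0.75))  # 0.5
print(prob_interval(0.4, 0.4))    # 0.0 — a single point has zero probability
```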

`Properties of Probability Laws`

Probability laws have a number of properties which can be deduced from the axioms. For example, if A ⊂ B, then P(A) ≤ P(B), and for any two events, P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

`The role of Probability Theory`

Probability theory can be a very useful tool for making predictions and decisions that apply to the real world. Whether your predictions and decisions will be any good depends on whether you have chosen a good model: have you chosen a model that provides a good enough representation of the real world? How do you make sure that this is the case?

There is a whole field, the field of statistics, whose purpose is to complement probability theory by using data to come up with good models. So we have the following relation between the real world, statistics, and probability: the real world generates data; the field of statistics and inference uses these data to come up with probabilistic models; once we have a probabilistic model, we use probability theory and the analysis tools that it provides; and the results that we get from this analysis lead to predictions and decisions about the real world.

Suggested video: Interpretation and uses of Probability
