
PART 1

a) TASK 1
HISTORY OF PROBABILITY
Probability is a way of expressing knowledge or belief that an event will occur or has occurred. In mathematics the concept has been given an exact meaning in probability theory, which is used extensively in such areas of study as mathematics, statistics, finance, gambling, science, and philosophy to draw conclusions about the likelihood of potential events and the underlying mechanics of complex systems.

The scientific study of probability is a modern development. Gambling shows that there has been an interest in quantifying the ideas of probability for millennia, but exact mathematical descriptions of use in those problems only arose much later.

According to Richard Jeffrey, "Before the middle of the seventeenth century, the term 'probable' (Latin probabilis) meant approvable, and was applied in that sense, univocally, to opinion and to action. A probable action or opinion was one such as sensible people would undertake or hold, in the circumstances."[4] However, in legal contexts especially, 'probable' could also apply to propositions for which there was good evidence.

Aside from some elementary considerations made by Girolamo Cardano in the 16th century, the doctrine of probabilities dates to the correspondence of Pierre de Fermat and Blaise Pascal (1654). Christiaan Huygens (1657) gave the earliest known scientific treatment of the subject. Jakob Bernoulli's Ars Conjectandi (posthumous, 1713) and Abraham de Moivre's Doctrine of Chances (1718) treated the subject as a branch of mathematics. See Ian Hacking's The Emergence of Probability and James Franklin's The Science of Conjecture for histories of the early development of the very concept of mathematical probability.

The theory of errors may be traced back to Roger Cotes's Opera Miscellanea (posthumous, 1722), but a memoir prepared by Thomas Simpson in 1755 (printed 1756) first applied the theory to the discussion of errors of observation. The reprint (1757) of this memoir lays down the axioms that positive and negative errors are equally probable, and that there are certain assignable limits within which all errors may be supposed to fall; continuous errors are discussed and a probability curve is given.

Pierre-Simon Laplace (1774) made the first attempt to deduce a rule for the combination of observations from the principles of the theory of probabilities. He represented the law of probability of errors by a curve y = φ(x), x being any error and y its probability, and laid down three properties of this curve:

1. it is symmetric as to the y-axis;
2. the x-axis is an asymptote, the probability of the error being 0;
3. the area enclosed is 1, it being certain that an error exists.

He also gave (1781) a formula for the law of facility of error (a term due to Lagrange, 1774), but one which led to unmanageable equations. Daniel Bernoulli (1778) introduced the principle of the maximum product of the probabilities of a system of concurrent errors.
PROBABILITY THEORY

Like other theories, the theory of probability is a representation of probabilistic concepts in formal terms, that is, in terms that can be considered separately from their meaning. These formal terms are manipulated by the rules of mathematics and logic, and any results are then interpreted or translated back into the problem domain.

There have been at least two successful attempts to formalize probability, namely the Kolmogorov formulation and the Cox formulation. In Kolmogorov's formulation (see probability space), sets are interpreted as events and probability itself as a measure on a class of sets. In Cox's theorem, probability is taken as a primitive (that is, not further analyzed) and the emphasis is on constructing a consistent assignment of probability values to propositions. In both cases, the laws of probability are the same, except for technical details.

There are other methods for quantifying uncertainty, such as the Dempster-Shafer theory or possibility theory, but those are essentially different and not compatible with the laws of probability as they are usually understood.

APPLICATION
Two major applications of probability theory in everyday life are in risk assessment and in trade on commodity markets. Governments typically apply probabilistic methods in environmental regulation, where it is called "pathway analysis", often measuring well-being using methods that are stochastic in nature, and choosing projects to undertake based on statistical analyses of their probable effect on the population as a whole.

A good example is the effect of the perceived probability of any widespread Middle East conflict on oil prices, which has ripple effects in the economy as a whole. An assessment by a commodity trader that a war is more likely vs. less likely sends prices up or down, and signals other traders of that opinion. Accordingly, the probabilities are not assessed independently nor necessarily very rationally. The theory of behavioral finance emerged to describe the effect of such groupthink on pricing, on policy, and on peace and conflict.

It can reasonably be said that the discovery of rigorous methods to assess and combine probability assessments has had a profound effect on modern society. Accordingly, it may be of some importance to most citizens to understand how odds and probability assessments are made, and how they contribute to reputations and to decisions, especially in a democracy.

Another significant application of probability theory in everyday life is reliability. Many consumer products, such as automobiles and consumer electronics, utilize reliability theory in the design of the product in order to reduce the probability of failure. The probability of failure may be closely associated with the product's warranty.

b)

CATEGORIES OF PROBABILITY

Empirical probability of an event is an "estimate" that the event will happen, based on how often the event occurs after collecting data or running an experiment (in a large number of trials). It is based specifically on direct observations or experiences.

Comparing Empirical and Theoretical Probabilities:

Empirical probability is the probability a person calculates from many different trials. For example, someone can flip a coin 100 times and then record how many times it came up heads and how many times it came up tails. The number of recorded heads divided by 100 is the empirical probability that one gets heads.

The theoretical probability is the result that one should get if an infinite number of trials were done. One would expect the probability of heads to be 0.5 and the probability of tails to be 0.5 for a fair coin.
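The comparison above can be sketched in code. This is a minimal Python illustration; the function name, the fixed seed, and the trial counts are my own choices, not part of the assignment.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

def empirical_heads_probability(num_flips):
    """Flip a fair coin num_flips times and return the fraction of
    heads observed, i.e. the empirical probability of heads."""
    heads = sum(random.random() < 0.5 for _ in range(num_flips))
    return heads / num_flips

THEORETICAL = 0.5  # theoretical probability of heads for a fair coin

# The empirical estimate drifts toward the theoretical value as the
# number of trials grows (the law of large numbers).
for n in (100, 10_000, 1_000_000):
    print(n, empirical_heads_probability(n))
```

With only 100 flips the estimate can easily land a few percentage points away from 0.5; with a million flips it is typically within a fraction of a percent.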

PART 2
(a)

Suppose you are playing the Monopoly game with two of your friends. To start the game, each player will have to toss the die once. The player who obtains the highest number will start the game. List all the possible outcomes when the die is tossed once.

= {1, 2, 3, 4, 5, 6}
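The toss-off described above can also be sketched as a small simulation. This is an illustrative Python sketch; the player names, the helper name, and the rule of re-tossing only among tied players are assumptions, not part of the assignment.

```python
import random

random.seed(7)  # fixed seed so the example is reproducible

# Sample space for a single toss of a fair die.
SAMPLE_SPACE = {1, 2, 3, 4, 5, 6}

def choose_starting_player(players):
    """Each player tosses the die once; the player with the highest
    roll starts. Ties are re-tossed among the tied players only
    (a common house rule, assumed here)."""
    rolls = {p: random.randint(1, 6) for p in players}
    best = max(rolls.values())
    tied = [p for p, r in rolls.items() if r == best]
    if len(tied) == 1:
        return tied[0]
    return choose_starting_player(tied)  # re-toss on a tie

print(choose_starting_player(["you", "friend_1", "friend_2"]))
```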

(b)
Table

PART 3

Table (see paper)

(b)

A = { (1,1), (1,2), (1,3), (1,4), (1,5), (1,6),
(2,1), (2,2), (2,3), (2,4), (2,5), (2,6),
(3,1), (3,2), (3,3), (3,4), (3,5), (3,6),
(4,1), (4,2), (4,3), (4,4), (4,5), (4,6),
(5,1), (5,2), (5,3), (5,4), (5,5), (5,6),
(6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }

B=

P = Both numbers are prime
= {(2,2), (2,3), (2,5), (3,2), (3,3), (3,5), (5,2), (5,3), (5,5)}

Q = The difference of the two numbers is odd
= {(1,2), (1,4), (1,6), (2,1), (2,3), (2,5), (3,2), (3,4), (3,6), (4,1), (4,3), (4,5),
(5,2), (5,4), (5,6), (6,1), (6,3), (6,5)}
C = P ∪ Q
= {(1,2), (1,4), (1,6), (2,1), (2,2), (2,3), (2,5), (3,2), (3,3), (3,4), (3,5), (3,6),
(4,1), (4,3), (4,5), (5,2), (5,3), (5,4), (5,5), (5,6), (6,1), (6,3), (6,5)}

R = The sum of the two numbers is even
= {(1,1), (1,3), (1,5), (2,2), (2,4), (2,6), (3,1), (3,3), (3,5), (4,2), (4,4), (4,6),
(5,1), (5,3), (5,5), (6,2), (6,4), (6,6)}

D = P ∩ R
= {(2,2), (3,3), (3,5), (5,3), (5,5)}
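As a quick check, the events above can be enumerated in Python under the standard definitions (both coordinates prime, odd difference, even sum). The variable names mirror the labels used here; this sketch is my own illustration, not part of the assignment.

```python
from itertools import product

# S: all ordered outcomes when two dice are tossed.
S = set(product(range(1, 7), repeat=2))

PRIMES = {2, 3, 5}

P = {(a, b) for a, b in S if a in PRIMES and b in PRIMES}  # both numbers prime
Q = {(a, b) for a, b in S if (a - b) % 2 == 1}             # difference is odd
R = {(a, b) for a, b in S if (a + b) % 2 == 0}             # sum is even

C = P | Q  # P union Q
D = P & R  # P intersect R

print(len(S), len(P), len(Q), len(C), len(R), len(D))  # → 36 9 18 23 18 5
```

Enumerating this way confirms, for instance, that D contains exactly the five prime pairs whose sum is even.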
