Introduction
Generate U1, U2, ..., Un.
Estimate θ with

θ̂ = (1/n) Σ_{j=1}^{n} Xj,  where Xj = h(Uj).

An approximate 100(1 − α)% confidence interval for θ is

( θ̂ − z_{α/2} s/√n ,  θ̂ + z_{α/2} s/√n )

where s² is the usual estimate of Var(X) based on X1, X2, ..., Xn.
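As a concrete illustration, the estimator and its confidence interval can be computed in R (a sketch; the choice h(u) = exp(u) is an assumption here, anticipating Example 9d below):

```r
set.seed(1)                      # for reproducibility
n <- 1000
x <- exp(runif(n))               # Xj = h(Uj) with h(u) = exp(u)
theta.hat <- mean(x)             # point estimate of theta
s <- sd(x)                       # usual estimate of sd(X)
ci <- theta.hat + c(-1, 1) * qnorm(0.975) * s / sqrt(n)  # 95% CI
theta.hat
ci
```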
We would like the confidence interval to be small, but sometimes this is difficult to achieve. This may be because Var(X) is too large, or because too much computational effort is required to simulate each Xj, so that n is necessarily small, or some combination of the two.
There are a number of things we can do:
Develop a good simulation algorithm.
Program carefully to minimize execution time.
Program carefully to minimize storage requirements. For example, we do not need to store all the Xj: we only need to keep track of Σ Xj and Σ Xj².
Decrease the variability of the simulation output that we use to estimate θ. The techniques used to do this are usually called variance reduction techniques.
Freddy Hernández Barajas, Universidad Nacional de Colombia
Antithetic variables.
Control variates.
Conditioning.
1. Antithetic variables
Var( (X1 + X2)/2 ) = (1/4) [ Var(X1) + Var(X2) + 2 Cov(X1, X2) ]

so if X1 and X2 are negatively correlated, the variance of the average is smaller than when they are independent.
. . . continuation
To see how we might arrange for X1 and X2 to be negatively correlated, suppose that X1 is a function of m random numbers:

X1 = h(U1, U2, ..., Um)

where U1, U2, ..., Um are m independent random numbers.
If U ~ U(0, 1), then 1 − U is also distributed as U(0, 1), and

X2 = h(1 − U1, 1 − U2, ..., 1 − Um)

has the same distribution as X1 and, when h is a monotone function of each coordinate, is negatively correlatedated with X1.
What type of function could h() be? Consult page 156.
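A quick numerical check of this negative correlation (a sketch, using the monotone choice h(u) = exp(u), which anticipates Example 9d):

```r
set.seed(1)
u <- runif(10000)
cor(exp(u), exp(1 - u))   # strongly negative (theoretical value is about -0.97)
```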
Example 9d

Estimate θ = ∫_0^1 e^x dx = e − 1 ≈ 1.7183.
. . . continuation
Without antithetic variables

g <- function(x) exp(x)
n <- 1000
u1 <- runif(n)                # first set of n uniforms
u2 <- runif(n)                # second, independent set
x1 <- ( g(u1) + g(u2) ) / 2   # average of two independent evaluations
mean(x1)
## [1] 1.721373
var(x1)
## [1] 0.1284228
. . . continuation

With antithetic variables

u <- runif(n)
x2 <- ( g(u) + g(1 - u) ) / 2   # antithetic pairs
mean(x2)
## [1] 1.722226
var(x2)
## [1] 0.004299438
. . . continuation
Comparing the two approaches

[Figure: mean evolution of the estimate over 1000 iterations, without antithetic variables (left) and with antithetic variables (right); vertical axis from 1.5 to 2.0.]
. . . continuation
. . . continuation
. . . continuation
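The definition of the aprox() function used below is not visible on the extracted slides; a minimal sketch consistent with the reported means and standard deviations (an assumption, not necessarily the original code) is:

```r
# Hypothetical reconstruction: estimate the integral of exp(x) on (0, 1)
# from n uniforms in total, with or without antithetic pairing
aprox <- function(antithetic = TRUE, n = 1000) {
  if (antithetic) {
    u <- runif(n / 2)
    mean( (exp(u) + exp(1 - u)) / 2 )   # n/2 antithetic pairs
  } else {
    mean( exp(runif(n)) )               # n independent draws
  }
}
```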
n <- 1000
without.anti <- replicate(n=n, aprox(antithetic=FALSE))
with.anti <- replicate(n=n, aprox(antithetic=TRUE))
. . . continuation
mean(without.anti)
## [1] 1.718942
sd(without.anti)
## [1] 0.01525596
mean(with.anti)
## [1] 1.718274
sd(with.anti)
## [1] 0.002755573
Example 1

Estimate θ = ∫_0^∞ g(x) dx = ∫_0^∞ log(1 + x²) e^{−x} dx.

With the substitution u = e^{−x}, this becomes θ = ∫_0^1 f(u) du, where f(u) = log(1 + (log u)²).
. . . continuation

[Figure: left panel, the integrand g(x) on x ∈ [0, 15], with values up to about 0.25; right panel, the transformed integrand f(x) on x ∈ [0, 1], with values up to about 3.]
. . . continuation
. . . continuation
. . . continuation
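Again the aprox() definition was lost in extraction; a sketch consistent with the reported results, assuming the transformed integrand f(u) = log(1 + (log u)²) on (0, 1), is:

```r
# Hypothetical reconstruction of aprox() for Example 1
aprox <- function(antithetic = TRUE, n = 1000) {
  f <- function(u) log(1 + log(u)^2)   # transformed integrand on (0, 1)
  if (antithetic) {
    u <- runif(n / 2)
    mean( (f(u) + f(1 - u)) / 2 )      # n/2 antithetic pairs
  } else {
    mean( f(runif(n)) )                # n independent draws
  }
}
```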
n <- 1000
without.anti <- replicate(n=n, aprox(antithetic=FALSE))
with.anti <- replicate(n=n, aprox(antithetic=TRUE))
mean(without.anti)
## [1] 0.6867205
sd(without.anti)
## [1] 0.02383796
mean(with.anti)
## [1] 0.6865537
sd(with.anti)
## [1] 0.01382919
Example 2

Estimate V = ∫_0^{π/4} ∫_0^{π/4} x² y² sin(x + y) log(x + y) dx dy.
. . . continuation
Not using antithetic variables
g <- function(x) prod(x)^2 * sin(sum(x)) * log(sum(x))  # integrand at x = (x, y)
n <- 1000
u1 <- matrix(runif(2*n), ncol=2)        # n random points in the unit square
x1 <- apply(u1*pi/4, 1, g) * pi^2 / 16  # rescale to [0, pi/4]^2; pi^2/16 is its area
u2 <- matrix(runif(2*n), ncol=2)        # a second, independent set of points
x2 <- apply(u2*pi/4, 1, g) * pi^2 / 16
x <- ( x1 + x2 ) / 2
mean(x)
## [1] 0.004155943
var(x)
## [1] 9.682597e-05
. . . continuation
Using antithetic variables
g <- function(x) prod(x)^2 * sin(sum(x)) * log(sum(x))  # integrand at x = (x, y)
n <- 1000
u1 <- matrix(runif(2*n), ncol=2)
x1 <- apply(u1*pi/4, 1, g) * pi^2 / 16
u2 <- 1 - u1                            # antithetic points
x2 <- apply(u2*pi/4, 1, g) * pi^2 / 16
y <- ( x1 + x2 ) / 2
mean(y)
## [1] 0.003887711
var(y)
## [1] 7.70069e-05
. . . continuation
Comparing results.

[Figure: mean evolution over 1000 iterations, without a.v. and with a.v.; both curves settle near 0.004.]
2. Control variates
. . . continuation
The variance of the controlled estimator X + c (Y − E[Y]) is minimized by taking

c* = − Cov(X, Y) / Var(Y).
. . . continuation
. . . continuation
. . . continuation
. . . continuation
Example 9h
. . . continuation
where the above used, from Example 9d, that Var(e^U) = 0.2420.
Hence, in this case, the use of the control variate U can lead to a
variance reduction of up to 98.4 percent.
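This figure can be checked numerically: the maximal reduction fraction equals the squared correlation, Cov(e^U, U)² / (Var(U) Var(e^U)). A quick sketch:

```r
set.seed(1)
u <- runif(1e5)
x <- exp(u)
reduction <- cov(x, u)^2 / (var(u) * var(x))  # squared correlation
reduction   # close to 0.984
```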
. . . continuation
We want to estimate E[e^U], sampling sequentially until the standard deviation of the estimator is less than d = 0.01.

d  <- 0.01
J0 <- 20                 # J0: minimum number of variates
x  <- exp( runif(J0) )
s2 <- var(x)
i  <- J0
. . . continuation
. . . continuation
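The sampling loop itself is not recoverable from the extracted slide; a self-contained sketch of the sequential rule (keep generating variates until the estimated standard deviation of the mean falls below d) is:

```r
# Hypothetical reconstruction: sequential estimation without control variate
set.seed(1)
d <- 0.01
x <- exp(runif(20))      # J0 = 20 initial variates
i <- 20
while ( sqrt(var(x) / i) > d ) {
  x <- c(x, exp(runif(1)))
  i <- i + 1
}
mean(x)   # estimate of E[exp(U)]
i         # roughly var(exp(U)) / d^2 = 2420 variates needed
```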
Using control variate.

U  <- runif(J0)
X  <- exp(U)
Y  <- U                  # using the notation of the book
S2 <- var(X)
i  <- J0
while ( sqrt(S2 / i) > d ) {
  u <- runif(1)
  Y <- c(Y, u)
  X <- c(X, exp(u))
  cc <- - cov(X, Y) / var(Y)
  Z  <- X + cc * (Y - 0.5)   # E[Y] = 1/2
  S2 <- var(Z)
  i  <- i + 1
}
. . . continuation
. . . continuation
Summarizing

[Figure: mean evolution of x̄ without the control variate (about 2500 iterations needed) and of Z̄ with the control variate (about 40 iterations needed) to reach the target precision.]
. . . continuation
3. Conditioning

Example 9k

Show how we can estimate π by determining how often a randomly chosen point in the square of area 4 centered at the origin falls within the inscribed circle of radius 1. Specifically, if we let Vi = 2Ui − 1, where Ui, i = 1, 2, are random numbers, and set I = 1 if V1² + V2² ≤ 1 and I = 0 otherwise, then E[I] = π/4.
. . . continuation
. . . continuation
Since E[I | V1 = v] = P(V2² ≤ 1 − v²) = √(1 − v²), we can instead average √(1 − V1²), the conditional expectation E[I | V1], which has smaller variance than I.
. . . continuation
. . . continuation
. . . continuation
Mean evolution

[Figure: mean evolution of the estimate of π/4 over 500 iterations, without and with conditioning, plotted against the true value; the conditioned estimator stays much closer to the true value.]
. . . continuation
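A sketch comparing the raw indicator with its conditional expectation (both are unbiased estimators of π/4):

```r
set.seed(1)
n <- 500
v1 <- 2 * runif(n) - 1                 # V1 ~ U(-1, 1)
v2 <- 2 * runif(n) - 1                 # V2 ~ U(-1, 1)
raw  <- as.numeric(v1^2 + v2^2 <= 1)   # I: point falls inside the circle
cond <- sqrt(1 - v1^2)                 # E[I | V1], the conditioned estimator
c(mean(raw), mean(cond))               # both near pi/4 = 0.7854
c(var(raw), var(cond))                 # conditioning reduces the variance
```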
Homework
Do exercises.