Toolbox for quasi-random number generation
The Torus algorithm, and the Sobol and Halton low-discrepancy sequences.
halton(n, dim = 1, init = TRUE, normal = FALSE, usetime = FALSE,
       mixed = FALSE, method = "C", mexp = 19937)

sobol(n, dim = 1, init = TRUE, scrambling = 0, seed = 4711,
      normal = FALSE, mixed = FALSE, method = "Fortran", mexp = 19937)

torus(n, dim = 1, prime, init = TRUE, mixed = FALSE, usetime = FALSE,
      normal = FALSE, mexp = 19937)
n: number of observations. If length(n) > 1, the length is taken to be the required number.

dim: dimension of observations, default 1.

init: a logical; if TRUE the sequence is initialized and restarts, otherwise not. Default TRUE.

normal: a logical indicating whether normal deviates are needed instead of uniform ones. Default FALSE.

scrambling: an integer value; if 1, 2 or 3 the Sobol sequence is scrambled, otherwise not. Default 0 (no scrambling).

seed: an integer value, the random seed for the initialization of the scrambling process. Default 4711.

prime: a single prime number or a vector of prime numbers to be used in the Torus sequence (optional argument).

mixed: a logical to combine the Torus algorithm with the SFMT algorithm. Default FALSE.

usetime: a logical to use the machine time to start the Torus (or Halton) sequence. Default FALSE.

method: a character string, either "C" or "Fortran", selecting the implementation; defaults to "C" for halton and "Fortran" for sobol.

mexp: an integer, the Mersenne exponent of the SFMT algorithm, only used when mixed = TRUE. Default 19937.
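For orientation, here is a minimal sketch of basic calls that relies only on the arguments and defaults listed above; the numerical output is omitted.

torus(5)                          # 5 terms of the one-dimensional Torus sequence
halton(5, dim = 2)                # a 5 x 2 matrix of Halton numbers
sobol(5, dim = 2, scrambling = 1) # a 5 x 2 matrix of scrambled Sobol numbers
torus(5, normal = TRUE)           # normal deviates instead of uniform ones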
The currently available generators are described below.
The kth term of the Torus algorithm in dimension d is given by

u_k = (frac(k sqrt(p_1)), ..., frac(k sqrt(p_d)))

where p_i denotes the ith prime number and frac the fractional part
(i.e. frac(x) = x - floor(x)). We use the first 100 000 prime numbers
from http://primes.utm.edu/, hence the dimension is limited to 100 000.
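As an illustration only (not the package's internal implementation), the first terms in dimension 3 can be recomputed directly from this formula, assuming the primes 2, 3 and 5 are used for the first three dimensions:

# recompute u_k = frac(k * sqrt(p_i)) for k = 1..5 and p = (2, 3, 5)
k <- 1:5
p <- c(2, 3, 5)
outer(k, sqrt(p)) %% 1   # frac(x) computed as x modulo 1
# expected to agree with torus(5, dim = 3) on a freshly initialized sequence
torus(5, dim = 3)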
If the user supplies prime numbers through the prime argument, we do NOT check for primality, and numeric values are cast to integers (e.g. prime = 7.1234 is cast to prime = 7 before the Torus sequence is computed).
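A small sketch of the casting behaviour just described; since prime = 7.1234 is documented to be cast to 7, both calls below should return the same sequence:

torus(5, prime = 7)
torus(5, prime = 7.1234)  # cast to prime = 7, expected to match the call above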
By default, the Torus sequence starts from k = 1 when initialized with init = TRUE and does not depend on the machine time (usetime = FALSE). When init = FALSE, the sequence is not reinitialized (to 1) and continues from its last term. The machine time can also be used to start the sequence with usetime = TRUE, which overrides init.
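A condensed sketch of this initialization behaviour (also demonstrated in the Examples below):

torus(5)                 # terms 1 to 5, since init = TRUE by default
torus(5, init = FALSE)   # continues with terms 6 to 10
torus(10)                # restarts from 1: the two previous calls combined
torus(5, usetime = TRUE) # starts from the machine time, overriding init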
sobol calculates a matrix of uniform or normal deviates from the Sobol low-discrepancy sequence. Optional scrambling of the sequence can be selected. The Sobol sequence is initialized and restarts when init = TRUE, and otherwise not.
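A short sketch of Sobol draws with and without scrambling, using only the arguments documented above:

sobol(5, dim = 2)                               # unscrambled uniform deviates
sobol(5, dim = 2, scrambling = 1, seed = 4711)  # scrambled sequence, seeded scrambling
sobol(5, dim = 2, normal = TRUE)                # normal deviates instead of uniform ones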
halton calculates a matrix of uniform or normal deviates from the Halton low-discrepancy sequence. Note that the Halton sequence in dimension 1 is the Van der Corput sequence.
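To make the link with the Van der Corput sequence concrete, the one-dimensional Halton numbers can be recomputed with a hand-written radical-inverse function in base 2. The helper vdc() below is only an illustrative sketch, not part of the package, and assumes the sequence starts at k = 1 as described in the next paragraph.

# radical inverse (Van der Corput sequence) in base 2, illustration only
vdc <- function(k, base = 2) {
  res <- 0
  f <- 1 / base
  while (k > 0) {
    res <- res + f * (k %% base)
    k <- k %/% base
    f <- f / base
  }
  res
}
sapply(1:5, vdc)  # 0.500 0.250 0.750 0.125 0.625
halton(5)         # expected to match the values above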
By default, the Halton sequence starts from k = 1 when initialized with init = TRUE and does not depend on the machine time (usetime = FALSE). When init = FALSE, the sequence is not reinitialized (to 1) and continues from its last term. The machine time can also be used to start the sequence with usetime = TRUE, which overrides init.
See the pdf vignette for details.
torus, halton and sobol generate random variables in ]0,1[. They return an n x dim matrix when dim > 1, otherwise a vector of length n.
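A quick sketch checking the return shapes described above:

dim(torus(10, dim = 3))  # a 10 x 3 matrix when dim > 1
length(torus(10))        # a plain vector of length 10 when dim = 1
range(torus(1000))       # values lie in ]0,1[ as stated above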
Christophe Dutang and Diethelm Wuertz
Bratley P., Fox B.L. (1988); Algorithm 659: Implementing Sobol's Quasirandom Sequence Generator, ACM Transactions on Mathematical Software 14, 88–100.
Joe S., Kuo F.Y. (1998); Remark on Algorithm 659: Implementing Sobol's Quasirandom Sequence Generator.
Planchet F., Jacquemin J. (2003), L'utilisation de methodes de simulation en assurance. Bulletin Francais d'Actuariat, vol. 6, 11, 3-69. (available online)
See pseudoRNG for pseudo-random number generation and .Random.seed for what is done in R about random number generation.
# (1) the Torus algorithm
#
torus(100)

# example of setting the seed
setSeed(1)
torus(5)
setSeed(6)
torus(5)
#the same
setSeed(1)
torus(10)

#no use of the machine time
torus(10, use=FALSE)

#Kolmogorov-Smirnov test
#KS statistic should be around 0.0019
ks.test(torus(1000), punif)
#KS statistic should be around 0.0003
ks.test(torus(10000), punif)

#the mixed Torus sequence
torus(10, mix=TRUE)
par(mfrow = c(1,2))
acf(torus(10^6))
acf(torus(10^6, mix=TRUE))

#usage of the init argument
torus(5)
torus(5, init=FALSE)
#should be equal to the combination of the two
#previous calls
torus(10)

# (2) Halton sequences
#
# uniform variate
halton(n = 10, dim = 5)

# normal variate
halton(n = 10, dim = 5, normal = TRUE)

#usage of the init argument
halton(5)
halton(5, init=FALSE)
#should be equal to the combination of the two
#previous calls
halton(10)

# some plots
par(mfrow = c(2, 2), cex = 0.75)
hist(halton(n = 5000, dim = 1), main = "Uniform Halton",
     xlab = "x", col = "steelblue3", border = "white")
hist(halton(n = 5000, dim = 1, norm = TRUE), main = "Normal Halton",
     xlab = "x", col = "steelblue3", border = "white")

# (3) Sobol sequences
#
# uniform variate
sobol(n = 10, dim = 5, scrambling = 3)

# normal variate
sobol(n = 10, dim = 5, scrambling = 3, normal = TRUE)

# some plots
hist(sobol(5000, 1, scrambling = 2), main = "Uniform Sobol",
     xlab = "x", col = "steelblue3", border = "white")
hist(sobol(5000, 1, scrambling = 2, normal = TRUE), main = "Normal Sobol",
     xlab = "x", col = "steelblue3", border = "white")

#usage of the init argument
sobol(5)
sobol(5, init=FALSE)
#should be equal to the combination of the two
#previous calls
sobol(10)

# (4) computation times on my macbook, mean of 1000 runs
#
## Not run:
# algorithm        time in seconds for n = 10^6
# Torus algo       0.058
# mixed Torus algo 0.087
# Halton sequence  0.878
# Sobol sequence   0.214

n <- 1e+06
mean( replicate( 1000, system.time( torus(n), gcFirst=TRUE)[3]) )
mean( replicate( 1000, system.time( torus(n, mixed=TRUE), gcFirst=TRUE)[3]) )
mean( replicate( 1000, system.time( halton(n), gcFirst=TRUE)[3]) )
mean( replicate( 1000, system.time( sobol(n), gcFirst=TRUE)[3]) )
## End(Not run)