Randomized Methods for Analysis and Design of Control Systems
IFAC World Congress 2014
Tutorial on Randomized Methods, Cape Town @RT 2014
Marco Campi, Fabrizio Dabbene, Simone Garatti, Maria Prandini, Roberto Tempo
Schedule

8:30 - 8:45    Welcome and presentation of the day
8:45 - 10:15   Introduction to the scenario approach (Campi)
10:15 - 10:45  Coffee break
10:45 - 11:30  The fundamental theorems of the scenario approach (Garatti)
11:30 - 12:15  Application to channel equalization (Prandini)
12:15 - 13:45  Lunch
13:45 - 14:45  Probabilistic methods for analysis of uncertain systems (Tempo)
14:45 - 15:00  Coffee break
15:00 - 16:00  Sequential randomized algorithms for design (Dabbene)
16:00 - 16:30  Conclusions and discussion
Key References – Tutorials and Books
S. Garatti and M.C. Campi, "Modulating Robustness in Control Design: Principles and Algorithms," IEEE CSM, 2013

R. Tempo, G. Calafiore and F. Dabbene, "Randomized Algorithms for Analysis and Control of Uncertain Systems, with Applications," Second Edition, Springer-Verlag, London, 2013
Key References – Tutorials and Surveys
M.C. Campi, "Why Is Resorting to Fate Wise? A Critical Look at Randomized Algorithms in Systems and Control," European Journal of Control, 2010

G. Calafiore, F. Dabbene and R. Tempo, "Research on Probabilistic Design Methods," Automatica, 2011

R. Tempo and H. Ishii, "Monte Carlo and Las Vegas Randomized Algorithms for Systems and Control: An Introduction," European Journal of Control, 2007

M. Vidyasagar, "Statistical Learning Theory and Randomized Algorithms for Control," IEEE CSM, 1998
Key References – Recent Tutorials
I.R. Petersen and R. Tempo, "Robust Control of Uncertain Systems: Classical Results and Recent Developments," Automatica, May 2014

H. Ishii and R. Tempo, "The PageRank Problem, Multiagent Consensus and Web Aggregation: A Systems and Control Viewpoint," IEEE CSM, 2014

F. Dabbene and R. Tempo, "Randomized Methods for Control," Encyclopedia of Systems and Control, Springer-Verlag, London, 2015 (to appear)
Software (Open Source)
R-RoMulOC: Randomized and Robust Multi-Objective Control toolbox
http://projects.laas.fr/OLOCEP/rromuloc/

RACT: Randomized Algorithms Control Toolbox for Matlab
http://ract.sourceforge.net
Randomized Algorithms (RAs)
Randomized algorithms are frequently used in many areas of engineering, computer science, physics, finance, optimization, ...

Main objective of this tutorial: an introduction to the rigorous study of randomized methods for uncertain systems and control

The theory is ready for specific applications
Randomized Algorithms (RAs)
Computer science (RQS for sorting, data structuring)
Robotics (motion and path planning problems)
Mathematics of finance (path integrals)
Bioinformatics (string matching problems)
Computer vision (computational geometry)
PageRank computation (distributed algorithms)
Opinion dynamics in social networks
A Success Story: Randomization in Computer Science
A Success Story in CS
Problem: Sorting N real numbers
Algorithm: RandQuickSort (RQS)
RQS is implemented in a C library of Linux for sorting numbers[1-2]

[1] C.A.R. Hoare (1962)
[2] D.E. Knuth (1998)
Sorting Problem

Given N real numbers x1, x2, x3, x4, x5, x6, sort them in increasing order; call this set S1
RandQuickSort (RQS)
The idea is to divide the original set S1 into two sets having (approximately) the same cardinality

This requires finding the median of S1 (which may be difficult)

This operation is performed using randomization
RandQuickSort (RQS)
RQS is a recursive algorithm consisting of two phases:

1. randomly select a number xi (e.g. x4)
2. deterministic comparisons between xi and the other (N-1) numbers

The pivot x4 splits S1: the numbers smaller than x4 (x2, x3, x1) form the set S2, and the numbers larger than x4 (x5, x6) form the set S3
RQS: Binary Tree Structure
We use randomization at each step of the (binary) tree
Running Time of RQS
Because of randomization, the running time may be different from one run of the algorithm to the next

RQS is very fast: the average running time is O(N log N)

This is a major improvement compared to the brute force approach (e.g. when N = 2^M)

The average running time holds for every input with probability at least 1 - 1/N (i.e. it is highly probable)

The so-called Chernoff bound can be used to prove this

Improvements of RQS exist to avoid the worst-case running time O(N²)
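The two phases of RQS can be sketched in a few lines of Python (an illustrative sketch, not the C library implementation the slides refer to):

```python
import random

def rand_quick_sort(items):
    """RandQuickSort: pick a random pivot, compare the remaining
    N-1 numbers against it, and recurse on both halves."""
    if len(items) <= 1:
        return list(items)
    pivot = random.choice(items)                 # phase 1: random selection
    smaller = [x for x in items if x < pivot]    # phase 2: deterministic
    equal = [x for x in items if x == pivot]     # comparisons
    larger = [x for x in items if x > pivot]
    return rand_quick_sort(smaller) + equal + rand_quick_sort(larger)

print(rand_quick_sort([0.3, 1.2, -5.0, 0.3, 7.1]))
```

Whatever pivots are drawn, the output is the correctly sorted list; only the running time varies from run to run.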
Find Algorithm
Find Algorithm: find the k-th smallest number in a set

Basically it is an RQS that terminates as soon as the number is found

The average running time of Find is O(N)
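A matching Python sketch of Find (often called quickselect), recursing only into the half that can contain the k-th smallest element, which is what brings the average running time down to O(N):

```python
import random

def find_kth(items, k):
    """Return the k-th smallest element (k = 1 gives the minimum).
    Like RQS, but recurse only into the half containing the answer."""
    pivot = random.choice(items)
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    if k <= len(smaller):
        return find_kth(smaller, k)
    if k <= len(smaller) + len(equal):
        return pivot
    return find_kth(larger, k - len(smaller) - len(equal))

print(find_kth([9, 4, 7, 1, 3], 2))  # second smallest
```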
Another Success Story: Randomization in Mathematical Finance
(Quasi) Monte Carlo Methods for Computational Finance

QMC methods are used to estimate the price of collateralized mortgage obligations

The problem is to approximate the average mortgage value, i.e. the integral

∫_{[0,1]^n} f(u) du

A grid takes N samples for each variable, but then we need N^n points in total

Curse of dimensionality: n = 360!
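A plain Monte Carlo estimate of such an integral uses a sample size chosen independently of the dimension, avoiding the N^n grid cost; a sketch in Python (the integrand below is a hypothetical stand-in for the cash-flow model):

```python
import random

def mc_integral(f, n, num_samples=10_000, seed=0):
    """Plain Monte Carlo estimate of the integral of f over [0,1]^n:
    the sample size does not depend on n, so there is no N^n blow-up."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        total += f([rng.random() for _ in range(n)])
    return total / num_samples

# hypothetical integrand standing in for the mortgage cash-flow model
f = lambda u: sum(u) / len(u)  # its exact integral is 0.5 for every n
print(mc_integral(f, n=360))   # close to 0.5 despite n = 360
```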
Probabilistic Methods for Analysis of Uncertain Systems
Example: H∞ Performance

Consider the linear system

ẋ = [0 1; -a0 -a1] x + [0; 1] u + [0; 1] w,    z = [1 0] x

with (nominal) parameters a0 = 1, a1 = 0.8

The transfer function of z = G(s) w (w: disturbances, z: errors) is given by

G(s) = 1 / (s² + 0.8 s + 1)

H∞ performance: ||G(s)||∞ = sup_ω |G(jω)| ≤ γ

Performance is satisfied for γ = 1.35 (Bode magnitude plot omitted)
System Performance with Uncertainty
Consider an uncertain stable transfer function G(s,q), z = G(s,q) w,

where w and z are disturbances and errors, and q represents uncertainty bounded in a set Q of radius ρ > 0
Consider the uncertain linear system
Example[1]: System Performance with Uncertainty
0 1
0 1 0 01 1
x x u wa a
1 0z x
Tutorial on Randomized Methods, Cape Town @RT 2014
with parametersa0 = 1 + q0 a1 = 0.8 + q1
and bounding setQ = {q = [q0 q1 ]T : ||q|| }
[1] R. Tempo, G. Calafiore, F. Dabbene (2013)
Given performance level the objective is tocompute the maximal radius of Q such that
G(s,q) is stable and ||G(s,q)|| for all q Q
Example: Radius of Uncertainty

G(s,q) is stable and ||G(s,q)||∞ ≤ γ for all q ∈ Q if and only if

ρ < 0.8   and   (0.8 - ρ)² [(1 - ρ) - (0.8 - ρ)²/4] ≥ 1/γ²
Example: Radius of Uncertainty

The largest radius of Q such that performance is satisfied is ρ = 0.025

Conclusion: stability and performance are satisfied for all q ∈ Q with radius ρ = 0.025
Objective of Robustness
Objective of robustness: to guarantee stability and performance for all q ∈ Q

We may also use the notation Δ to denote uncertainty and B for the bounding set
Probabilistic Robustness
Different Paradigm Proposed
A different paradigm has been proposed, based on a probabilistic model of uncertainty, which leads to randomized algorithms for analysis and synthesis

Within this setting a different notion of problem tractability is needed

Benefits and pitfalls of risk analysis

Objective: breaking the curse of dimensionality[1]

[1] R. Bellman (1957)
Probabilistic Methods
Probabilistic Model of Uncertainty
Assume that q is a random vector with given density function and support set Q

A probability density function is associated to q

Examples: uniform or Gaussian pdf
Uniform Density U[Q]

Univariate uniform density on [a,b]:

U[a,b](x) = 1/(b - a) for x ∈ [a,b], and 0 otherwise

Multivariate uniform density U[Q]:

U[Q](q) = 1/vol(Q) if q ∈ Q, and 0 otherwise
Probability of Performance
Define a performance function

J(q): Q → R

Given a level γ, the probability of performance (reliability) is

PJ = Prob{q ∈ Q : J(q) ≤ γ}

Example: if G(s,q) is stable and J(q) = ||G(s,q)||∞, then

PJ = Prob{q ∈ Q : ||G(s,q)||∞ ≤ γ}
Measure of Performance Violation
Objective: achieve the probabilistic performance

PJ = Prob{q ∈ Q : J(q) ≤ γ} ≥ 1 - ε

where ε ∈ (0,1) is a probabilistic parameter called accuracy
Computation of Probability of Performance
Computing

PJ = Prob{q ∈ Q : J(q) ≤ γ}

requires solving a difficult integration problem

Taking the uniform density U[Q],

Prob{q ∈ Q : J(q) ≤ γ} = (1/vol(Q)) ∫_{J(q) ≤ γ} dq

In some special cases we can easily compute this probability
Worst Case vs Probabilistic Approaches
Example: H∞ Performance, Recalled

Observation: if we allow a small violation of performance, we may increase the radius ρ significantly
Computation of Performance Violation
Take the uniform pdf in Q

For several values of ρ we compute PJ(ρ)

Allowing a 5% violation, we increase ρ by 54%, obtaining ρ = 0.038 (instead of 0.025)
Performance Degradation Function
If a 5% violation is allowed, we increase ρ by 54%, obtaining ρ = 0.038 (plot of PJ(ρ) omitted)

Radius ρ = 0.038, compared with ρ = 0.025
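How such a curve PJ(ρ) can be estimated by Monte Carlo is sketched below; the closed-form H∞ norm of the second-order transfer function and the performance level γ = 1.45 are assumptions made here for illustration only:

```python
import math, random

def hinf_norm(a0, a1):
    """Closed-form H-infinity norm of G(s) = 1/(s^2 + a1 s + a0), a0, a1 > 0."""
    if a1 * a1 < 2.0 * a0:                     # resonant peak
        return 1.0 / (a1 * math.sqrt(a0 - a1 * a1 / 4.0))
    return 1.0 / a0                            # peak at omega = 0

def prob_performance(rho, gamma, n_samples=20_000, seed=0):
    """Monte Carlo estimate of PJ = Prob{||G(s,q)||inf <= gamma} for
    q = [q0, q1] uniform in the box ||q||inf <= rho."""
    rng = random.Random(seed)
    good = 0
    for _ in range(n_samples):
        a0 = 1.0 + rng.uniform(-rho, rho)
        a1 = 0.8 + rng.uniform(-rho, rho)
        if a1 > 0.0 and hinf_norm(a0, a1) <= gamma:
            good += 1
    return good / n_samples

print(prob_performance(rho=0.038, gamma=1.45))  # reliability at an enlarged radius
```

Sweeping rho over a grid and plotting the estimates reproduces a performance degradation curve of the kind shown in the slides.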
Probabilistic Robustness Analysis
Probabilistic Model

We assume that Δ is a random matrix (vector) with given density function and support B

A probability density function is associated to Δ

Example: uniform density in B
Uniform Density
Consider the uniform density U[B] within B:

U[B](Δ) = 1/vol(B) if Δ ∈ B, and 0 otherwise

In this case, for a subset S ⊆ B,

Prob{Δ ∈ S} = ∫_S dΔ / vol(B) = vol(S)/vol(B)
Good and Bad Sets
We define two subsets of B:

Bgood = {Δ ∈ B : J(Δ) ≤ γ}
Bbad = {Δ ∈ B : J(Δ) > γ}

Bgood is the set of Δ satisfying performance

A measure of robustness is vol(Bgood)/vol(B)
Probability of Performance
Given a performance level γ, we define the probability of performance

Prob{J(Δ) ≤ γ}
Violation and Reliability
We define the violation probability

V = 1 - Prob{J(Δ) ≤ γ} = Prob{J(Δ) > γ}

The probability of performance is also denoted as reliability:

R = Prob{J(Δ) ≤ γ} = 1 - V
Computation of Violation and Reliability
Computing V and R requires solving a difficult integration problem

In some special cases we can easily compute violation and reliability

Otherwise, we use randomized algorithms to determine probabilistic estimates of V and R
Monte Carlo and Las Vegas Randomized Algorithms

Monte Carlo was invented by Metropolis, Ulam, von Neumann, Fermi, ... (Manhattan project)

Las Vegas first appeared in computer science in the late seventies
Randomized Algorithm: Definition
Randomized Algorithm (RA): an algorithm that makes random choices during its execution to produce a result

An example of a "random choice" is a coin toss: heads or tails
Randomized Algorithm: Definition
Randomized Algorithm (RA): an algorithm that makes random choices during its execution to produce a result

Example: Matlab code

set_r = 1:0.01:3;
for k = 1:length(set_r)
  r = set_r(k);
  if (rand > 0.5)
    a_opt(k) = hel(k);
  else
    a_opt(k) = 3.7;
  end
  a_lin(k) = (e/(e-1))*r;
  a_sub(k) = (a/(a-1))*(r+log(a)-1);
end
Randomized Algorithm: Definition
Randomized Algorithm (RA): an algorithm that makes random choices during its execution to produce a result

For hybrid systems, "random choices" could be switching between different states or logical operations

For uncertain systems, "random choices" require (vector or matrix) random sample generation
Monte Carlo Randomized Algorithm
Monte Carlo Randomized Algorithm (MCRA): a randomized algorithm that may produce incorrect results, but with bounded probability of "error":

Prob{"error" > ε} < 2e^(-2Nε²)   (Hoeffding inequality)

where ε is the probabilistic accuracy of the estimate, N is the sample size (sample complexity) and e is the Euler number
Example of Monte Carlo: Area/Volume Estimation
Estimate the volume of the red area: generate N samples uniformly in the rectangle; count how many (M) fall within the red area; then the estimated area is M/N (times the area of the rectangle)
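A concrete sketch, taking the "red area" to be the quarter disc inside the unit square (an assumption chosen for illustration, since its true area is π/4):

```python
import random

def estimate_area(n_samples=100_000, seed=0):
    """Area of the quarter disc x^2 + y^2 <= 1 inside the unit square,
    estimated as M/N with M the number of samples falling inside."""
    rng = random.Random(seed)
    m = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            m += 1
    return m / n_samples

area = estimate_area()
print(area, 4.0 * area)  # roughly pi/4 and pi
```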
Las Vegas Randomized Algorithm

Las Vegas Randomized Algorithm (LVRA): a randomized algorithm that always produces correct results; the only variation from one run to another is the running time

Example: Randomized Quick Sort (RQS)
Randomized Algorithms for Control
Ingredients for RAs
Assume that Δ is random with given pdf and support B

Accuracy ε ∈ (0,1) and confidence δ ∈ (0,1) are assigned

A performance function J = J(Δ) for analysis, and a level γ
Randomized Algorithms for Analysis
Different classes of randomized algorithms for probabilistic analysis are used to estimate:

the probability of performance
the probability of failure

They are based on randomization of the uncertainty Δ

A sample complexity is obtained
Estimating the Probability of Performance
Estimate of the Probability of Performance
Objective: construct a probabilistic estimate of the reliability (probability of performance)

R = Prob{J(Δ) ≤ γ}

using Monte Carlo randomized algorithms
Monte Carlo Experiment
We draw N i.i.d. random samples of Δ according to the given probability measure:

Δ(1), Δ(2), ..., Δ(N) ∈ B

The multisample within B is

Δ(1,...,N) = {Δ(1), ..., Δ(N)}

We evaluate J(Δ(1)), J(Δ(2)), ..., J(Δ(N))
Example: the values J(Δ(1)), ..., J(Δ(6)) computed at six random samples Δ(1), ..., Δ(6) and compared against the level γ (plot omitted)
Empirical Reliability
We construct the empirical reliability

R̂N = (1/N) Σ_{i=1}^{N} I(J(Δ(i)) ≤ γ)

where I(·) denotes the indicator function

I(J(Δ(i)) ≤ γ) = 1 if J(Δ(i)) ≤ γ, and 0 otherwise

Notice that

R̂N = Ngood / N

where Ngood is the number of samples such that J(Δ(i)) ≤ γ
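The empirical reliability is immediate to compute; a minimal Python sketch with a hypothetical scalar performance function (chosen so the true reliability is known):

```python
import random

def empirical_reliability(J, samples, gamma):
    """R_hat = N_good / N = (1/N) * sum of indicator(J(sample) <= gamma)."""
    n_good = sum(1 for d in samples if J(d) <= gamma)
    return n_good / len(samples)

# hypothetical performance function with scalar uncertainty, uniform on [-1, 1]
J = lambda d: d * d
rng = random.Random(0)
samples = [rng.uniform(-1.0, 1.0) for _ in range(50_000)]
print(empirical_reliability(J, samples, gamma=0.25))  # true reliability is 0.5
```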
Sample Complexity
We need to compute the size of the Monte Carlo experiment (sample complexity)

This requires introducing a probabilistic accuracy ε ∈ (0,1) and confidence δ ∈ (0,1)

Given ε, δ ∈ (0,1), we want to determine N such that the probability event

|R - R̂N| ≤ ε    (this is the "error" previously discussed)

holds with probability at least 1 - δ
A Good Estimate
If the probability event

|R - R̂N| ≤ ε

holds with probability at least 1 - δ, then we say that the empirical reliability R̂N is a "good" estimate of the reliability R
(Additive) Chernoff Bound[1]
(Additive) Chernoff Bound: given ε, δ ∈ (0,1), if

N ≥ Nch = (1/(2ε²)) log(2/δ)

then the probability inequality

|R - R̂N| ≤ ε

holds with probability at least 1 - δ

[1] H. Chernoff (1952)
Remarks
The Chernoff bound improves upon other bounds such as the Law of Large Numbers (Bernoulli)

The dependence is logarithmic in 1/δ and quadratic in 1/ε

The sample size is independent of the number of controller and uncertain parameters
Accuracy vs Confidence
Confidence is "cheap" because of the logarithmic dependence on 1/δ

Accuracy is computationally more expensive because of the quadratic dependence on 1/ε

Can we improve the quadratic dependence? The answer to this question is provided by the (multiplicative) Chernoff bound
Hoeffding Inequality and Chernoff Bound - 1
Given ε ∈ (0,1), from the Hoeffding inequality we obtain

Prob{Δ(1,...,N) : |R - R̂N| > ε} ≤ 2e^(-2Nε²)

where e denotes the Euler number

To guarantee confidence δ ∈ (0,1), we need to take N samples such that 2e^(-2Nε²) ≤ δ holds

We obtain the (additive) Chernoff bound

N ≥ (1/(2ε²)) log(2/δ)
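The additive Chernoff bound translates directly into code; a minimal sketch:

```python
import math

def chernoff_bound(eps, delta):
    """Smallest N with 2*exp(-2*N*eps**2) <= delta,
    i.e. N >= log(2/delta) / (2*eps**2)."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

print(chernoff_bound(0.01, 0.001))  # confidence is cheap, accuracy is not
print(chernoff_bound(0.01, 1e-9))   # much smaller delta, only modest growth
```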
Hoeffding Inequality and Chernoff Bound - 2
The Hoeffding inequality provides a bound on the tail distribution: 2e^(-2Nε²)

From the computational point of view, computing the minimum value of N such that 2e^(-2Nε²) ≤ δ is immediate (given ε and δ, it is a one-parameter problem)

The Chernoff bound provides a fundamental explicit relation (sample complexity) N = N(ε, δ), showing that 1/ε enters quadratically and 1/δ logarithmically
Hoeffding Inequality and Chernoff Bound - 3
The Chernoff bound and the Hoeffding inequality hold only for a fixed performance function J

Some results are available for a finite number of performance functions

For an infinite number of performance functions we need to use statistical learning theory
Parallel and Distributed Simulations
Samples q(1), q(2), ..., q(N) are i.i.d.

Contrary to Markov Chain Monte Carlo (MCMC) or sequential Monte Carlo, this approach leads to parallel and distributed simulations

Sample generation requires tools from importance sampling techniques

Connections with the theory of random matrices[1]

[1] G. Calafiore, F. Dabbene, R. Tempo (2000)
Bounds on the Binomial Distribution

The so-called probability of failure is studied in the scenario approach and in statistical learning theory

This requires bounding the binomial distribution tail

B(N, ε, m) = Σ_{i=0}^{m} (N choose i) ε^i (1 - ε)^(N-i)
Bounding the Binomial Distribution and Sample Complexity
Theorem[1]: given ε, δ ∈ (0,1) and m ≥ 0, if

N ≥ inf_{a>1} (1/ε) (a/(a-1)) (m log a + log(1/δ))

then

B(N, ε, m) = Σ_{i=0}^{m} (N choose i) ε^i (1 - ε)^(N-i) ≤ δ

[1] T. Alamo, R. Tempo and A. Luque (2010)
Bounding the Binomial Distribution and Sample Complexity
A suboptimal value of a is the Euler number e

The sample complexity is then given by

N ≥ (e/(e-1)) (1/ε) (m + log(1/δ))

The sample complexity is linear in
- 1/ε (not quadratic!)
- m
- log(1/δ)
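Both forms of the bound are easy to evaluate numerically; a sketch (the grid search over a is only illustrative):

```python
import math

def alamo_bound(eps, delta, m, a):
    """N(a) = (1/eps) * (a/(a-1)) * (m*log(a) + log(1/delta)), for a > 1."""
    return (a / (a - 1.0)) * (m * math.log(a) + math.log(1.0 / delta)) / eps

def sample_complexity(eps, delta, m):
    """Suboptimal but explicit choice a = e:
    N >= (e/(e-1)) * (1/eps) * (m + log(1/delta)), linear in 1/eps and m."""
    return math.ceil(alamo_bound(eps, delta, m, math.e))

print(sample_complexity(0.01, 1e-6, m=10))
# a crude grid search over a shows that a = e is nearly optimal
best = min(alamo_bound(0.01, 1e-6, 10, 1.0 + k / 100.0) for k in range(1, 1000))
print(best)
```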
Probabilistic Methods: Benefits and Drawbacks

Benefits:
- very general method with immediate practical applications, for example in aircraft design and the process control industry
- specific sample generation methods have been developed (e.g. for norm bounded sets, hit-and-run for convex sets, particle filtering, importance sampling, MCMC)
- sample size bounds are available for non-recursive methods
- Monte Carlo methods are very effective in dealing with the "curse of dimensionality"; the probability of error is bounded

Drawbacks:
- the results obtained provide no "deterministic certificate" of property satisfaction, for example H-infinity performance
- for recursive methods the number of required experiments is generally not specified a priori
- the method does not cover the entire sample space, but only a finite subset of it
- crucial points of the safety region can be missed; this may lead to erroneous conclusions
Computational Complexity of RAs
RAs are efficient (polynomial-time) because:

1. Random sample generation of Δ(i) can be performed in polynomial time
2. The cost associated with the evaluation of J(Δ(i)) for fixed Δ(i) is polynomial
3. The sample size is polynomial in the problem size and in the probabilistic levels ε and δ
1. Bounds on the Sample Size
The Chernoff bound is independent of the size of B, of the uncertainty structure, of the pdf, and of the number of uncertainty blocks

It depends only on the probabilistic accuracy ε and confidence δ

The same comments can be made for other bounds (such as Bernoulli)
2. Cost of Checking Stability
Consider a polynomial

p(s, a) = a0 + a1 s + ... + an s^n

To check left half plane stability we can use the Routh test. The number of multiplications needed is

n²/4 for n even,  (n² - 1)/4 for n odd

The number of divisions and additions is equal to this number

We conclude that checking stability is O(n²)
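A minimal Python sketch of the Routh test (assuming descending coefficients and no degenerate zero entries in the first column):

```python
def routh_stable(coeffs):
    """Routh test sketch: returns True when all roots of the polynomial with
    descending coefficients `coeffs` lie in the open left half plane.
    Degenerate cases (a zero in the first column) are not handled here."""
    n = len(coeffs) - 1
    rows = [list(coeffs[0::2]), list(coeffs[1::2])]
    while len(rows[1]) < len(rows[0]):
        rows[1].append(0.0)
    for _ in range(n - 1):
        prev, cur = rows[-2], rows[-1]
        new = [(cur[0] * prev[j + 1] - prev[0] * cur[j + 1]) / cur[0]
               for j in range(len(cur) - 1)]
        new.append(0.0)
        rows.append(new)
    first_col = [r[0] for r in rows[: n + 1]]
    return all(c > 0 for c in first_col) or all(c < 0 for c in first_col)

print(routh_stable([1, 0.8, 1]))  # s^2 + 0.8 s + 1: stable
print(routh_stable([1, -1, 1]))   # s^2 - s + 1: unstable
```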
3. Random Sample Generation
Random number generation (RNG): linear and nonlinear methods for uniform generation in [0,1), such as Fibonacci, feedback shift register, BBS, MT, ...

Non-uniform univariate random variables: suitable functional transformations (e.g., the inversion method)

A much harder problem: multivariate generation of samples of Δ with given pdf and support B

It can be resolved in polynomial time
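As an illustration of the inversion method for a non-uniform univariate variable (the exponential target density is an assumption chosen here for its simple inverse cdf):

```python
import math
import random

def exponential_via_inversion(lam, rng):
    """Inversion method sketch: for the target cdf F(x) = 1 - exp(-lam*x),
    x = F^{-1}(u) = -log(1 - u)/lam maps u ~ U[0,1) to the target density."""
    u = rng.random()
    return -math.log(1.0 - u) / lam

rng = random.Random(0)
samples = [exponential_via_inversion(2.0, rng) for _ in range(100_000)]
print(sum(samples) / len(samples))  # sample mean, close to 1/lam = 0.5
```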
Choice of the Probability Distribution
Choice of the Probability Distribution - 1

The probability Prob{Δ ∈ S} depends on the underlying pdf

It may vary between 0 and 1 depending on the pdf
Choice of the Probability Distribution - 2

The bounds discussed are independent of the choice of the distribution, but computing an estimate of Prob{J(Δ) ≤ γ} requires knowing the distribution

Research has been done to find the worst-case distribution in a certain class[1]

The uniform distribution is the worst case if a certain target set is convex and centrally symmetric

[1] B. R. Barmish and C. M. Lagoa (1997)
Choice of the Probability Distribution - 3

Minimax properties of the uniform distribution have been shown[1]

[1] E. W. Bai, R. Tempo and M. Fu (1998)
Random Sample Generation
True Random Number Generators

Hardware sources of truly statistically random numbers:

High-voltage reverse-biased P-N semiconductor junctions
Reverse-biased Zener diodes
Radioactive decay
Lava-rand
Mechanical systems (e.g. the Entropy Key)
Random Generation
(Pseudo) random number generation (RNG): various methods are available for generation in the interval [0,1): linear and nonlinear RNGs, Fibonacci, feedback shift register, BBS, MT, ...

Non-uniform univariate random variables: suitable functional transformations (e.g., the inversion method)

Multivariate random variables: rejection and conditional density methods
Multivariate Random Vector Generation
Parametric Uncertainty

We study parametric uncertainty q in ℓp norm balls

Objective: sample generation in the ball

B = {q : ||q||p ≤ 1}

We are interested in uniform sample generation within B
ℓp Vector Norms
Recall the ℓp vector norm of x ∈ F^n,

||x||p = (Σ_{i=1}^{n} |xi|^p)^(1/p)  for p ∈ [1, ∞)

and the ℓ∞ vector norm

||x||∞ = max_i |xi|
Rejection Methods
Goal: to generate uniform samples in a set B (e.g. a norm ball)

Idea: if we have a "simpler" set Bd that contains B, we can generate uniform samples in Bd, and then reject those that fall outside B

The rejection rate of the method is η = vol(Bd)/vol(B)

Note: generation in Bd should be easy, and membership of B should be efficiently checkable
Rejection Methods
Find a bounding set Bd ⊇ B

Generate points x(i) in Bd

Keep the points in B and reject the others
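These steps can be sketched in Python for the Euclidean unit ball, with a hypercube as the bounding set Bd (an illustrative choice):

```python
import random

def rejection_sample_ball(n, rng):
    """Draw uniformly in the bounding hypercube Bd = [-1, 1]^n and keep
    the point only if it lies in the Euclidean unit ball B."""
    while True:
        x = [rng.uniform(-1.0, 1.0) for _ in range(n)]
        if sum(v * v for v in x) <= 1.0:
            return x

rng = random.Random(0)
points = [rejection_sample_ball(2, rng) for _ in range(1000)]
print(points[0])
```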
Rejection Methods: Curse of Dimensionality

Rejection rate for the generation of uniform samples in the sphere B using a hypercube Bd as bounding set:

η(n) = (2/√π)^n Γ(n/2 + 1)

n:      1      2       3       4       10       20       30
η(n):   1   1.2732  1.9099  3.2423  401.54   4·10^7   5·10^13
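The rejection rate formula and the table values can be checked numerically (math.gamma evaluates Γ):

```python
import math

def rejection_rate(n):
    """eta(n) = vol([-1,1]^n) / vol(unit ball) = (2/sqrt(pi))**n * Gamma(n/2 + 1)."""
    return (2.0 / math.sqrt(math.pi)) ** n * math.gamma(n / 2.0 + 1.0)

for n in (1, 2, 3, 4, 10, 20, 30):
    print(n, rejection_rate(n))
```

The exponential growth of η(n) is exactly why rejection from a hypercube becomes useless in high dimension, motivating the direct generation method below.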
Gamma Density
A random variable x has (unilateral) Gamma density with parameters (a,b) if

f(x) = x^(a-1) e^(-x/b) / (Γ(a) b^a),  x ≥ 0

where Γ(·) is the Gamma function

Γ(x) = ∫_0^∞ ξ^(x-1) e^(-ξ) dξ,  x > 0

We write x ~ G(a,b)

There exist standard and efficient methods for random generation according to G(a,b)
Generalized Gamma Density
A random variable x has (unilateral) Generalized Gamma density with parameters (a,c) if

f(x) = (c/Γ(a)) x^(ca-1) e^(-x^c),  x ≥ 0

We write x ~ Gg(a,c)
Generalized Gamma Density
Plot of the density Gg(1/p, p),

f(x) = (p/Γ(1/p)) e^(-x^p),

for p = 1, 2, 4, 10, 100 (figure omitted)
Comments
Using the power transformation method, a random variable x ~ Gg(a,c) is simply obtained as x = z^(1/c), where z ~ G(a,1)

Samples distributed according to a (univariate) bilateral density x ~ fx(x) can be easily obtained from a (univariate) unilateral density z ~ fz(z): take x = s z, where s is an independent random sign taking the values +1 and -1 with equal probability
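Both transformations combine into a short Python sketch (random.gammavariate supplies the G(a,1) samples):

```python
import random

def bilateral_gg_sample(p, rng):
    """x ~ bilateral Gg(1/p, p): z ~ G(1/p, 1), power transform x = z**(1/p),
    then an independent random sign s = +/-1."""
    z = rng.gammavariate(1.0 / p, 1.0)
    x = z ** (1.0 / p)
    s = 1.0 if rng.random() < 0.5 else -1.0
    return s * x

rng = random.Random(0)
samples = [bilateral_gg_sample(2.0, rng) for _ in range(100_000)]
# for p = 2 this is the normal density pi**(-1/2) * exp(-x**2): mean 0, variance 1/2
print(sum(samples) / len(samples))
```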
Joint Density
Let x = [x1, ..., xn]ᵀ with components independently distributed according to the (bilateral) Generalized Gamma density with parameters 1/p and p

The joint density of x is

f(x) = (p/(2Γ(1/p)))^n e^(-Σ_i |xi|^p) = (p/(2Γ(1/p)))^n e^(-||x||p^p)
Example: Multivariate Laplace
Recall that Γ(1) = 1

The multivariate (bilateral) Laplace density

f(x) = (1/2)^n e^(-Σ_i |xi|)

is a Generalized Gamma density with parameters 1 and 1
Example: Multivariate Normal
The multivariate (bilateral) normal density N with mean 0 and covariance (1/2) I,

f(x) = π^(-n/2) e^(-xᵀx)

is a Generalized Gamma density with parameters 1/2 and 2
Uniform Multivariate Generation in B
Theorem
Let the xi be i.i.d. random variables distributed according to the (bilateral) Generalized Gamma density xi ~ Gg(1/p, p), and let x = [x1, ..., xn]ᵀ

Let w ∈ [0,1] be a uniformly distributed random variable

Then the vector

y = w^(1/n) x / ||x||p

is uniformly distributed in B
Algorithm Vector Uniform Generation
Input: n, p. Output: uniform random sample y

• Generate n independent real scalars ξi ~ Gg(1/p, p)
• Construct the vector x of components xi = si ξi, where the si are random signs
• Generate w uniform in [0, 1]
• Return y = w^(1/n) x / ||x||p
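A Python sketch of the four steps (an illustrative implementation; random.gammavariate supplies the Gamma samples used by the power transformation):

```python
import random

def uniform_in_lp_ball(n, p, rng):
    """Uniform sample in B = {x : ||x||_p <= 1}, following the algorithm:
    1) xi = si * zi**(1/p), zi ~ G(1/p, 1), si random signs, so each xi
       has the bilateral Generalized Gamma density Gg(1/p, p);
    2) x / ||x||_p is uniform on the surface of the p-norm ball;
    3) the radial factor w**(1/n), w ~ U[0,1], fills the ball uniformly."""
    x = []
    for _ in range(n):
        z = rng.gammavariate(1.0 / p, 1.0)   # Gamma G(1/p, 1)
        s = 1.0 if rng.random() < 0.5 else -1.0
        x.append(s * z ** (1.0 / p))
    norm = sum(abs(v) ** p for v in x) ** (1.0 / p)
    w = rng.random()
    return [w ** (1.0 / n) * v / norm for v in x]

rng = random.Random(0)
print(uniform_in_lp_ball(n=2, p=2.0, rng=rng))  # a point in the unit disc
```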
Uniform Random Generation in ℓ2 - Step 1

Generate n i.i.d. random real scalars ξi ~ Gg(1/p, p)

Construct x ∈ R^n of components xi = si ξi (si i.i.d. random signs)
Uniform Random Generation in ℓ2 - Step 2
0 2
0.4
0.6
0.8
1step 2 Construct the
normalized vector
xzx
‖ ‖
Tutorial on Randomized Methods, Cape Town @RT 2014
-1 -0.8 -0.6 -0.4 -0.2 0 0.2 0.4 0.6 0.8 1-1
-0.8
-0.6
-0.4
-0.2
0
0.2
The vector z isuniformly distributedon the surface of thep-norm ball
px‖ ‖
Uniform Random Generation in ℓ2 - Step 3

Generate w uniform in [0,1], and return y = w^(1/n) z = w^(1/n) x / ||x||p

The vector y is uniformly distributed inside the p-norm ball
Uniform Random Generation in B for p = 1 (scatter plot of samples omitted)
Uniform Random Generation in B for p = 0.7 (scatter plot of samples omitted)
Uniform Random Generation in B for p = 4 (scatter plot of samples omitted)