Tutorial of Meshfree Approximation Method
Qi Ye
Department of Applied Mathematics, Illinois Institute of Technology
Advisor: Prof. G. E. Fasshauer
Thanks to Prof. Xu Sun for the invitation.
Dec. 2010
[email protected] Huazhong University of Science and Technology Dec. 2010
Meshfree Methods Seminars at Illinois Institute of Technology
Background
Outline
1 Background
2 Applications
3 Books and Papers
4 Theorems and Algorithms
5 RBF collocation methods for PDE
Background: Sample Data

We have the data values $Y := \{y_1, \ldots, y_N\} \subset \mathbb{R}$ sampled from a function $f : \Omega \subseteq \mathbb{R}^d \to \mathbb{R}$ on data points $X := \{x_1, \ldots, x_N\} \subset \Omega$, i.e.,

$$f(x_k) = y_k, \quad k = 1, \ldots, N.$$

Two-Dimensional Example: $X$ are the Halton points in $\Omega := (0,1)^2 \subset \mathbb{R}^2$ and $f$ = Franke's test function.

[Figure: surface plot of the sampled data over $(0,1)^2$; axes $x_1$, $x_2$, $y$.]
Background: Scattered Data Interpolation

We want to set up an interpolation function $s_{f,X}$ dependent on the data $X$ and $Y$ to approximate the unknown function $f$, i.e.,

interpolation conditions: $s_{f,X}(x_k) = y_k, \quad k = 1, \ldots, N$
convergence property: $\lim_{N \to \infty} s_{f,X}(x) = f(x), \quad x \in \Omega$

Examples for Meshfree Approximation Methods by Sobolev Splines

[Figures: approximating solution surface and absolute error plot (error range roughly $-0.05$ to $0.3$).]
Applications
Outline
1 Background
2 Applications
3 Books and Papers
4 Theorems and Algorithms
5 RBF collocation methods for PDE
Applications

[Figure: different types of N = 289 data points.]
Applications

Terrain Modeling
361201 elevation values (from the U.S. Geological Survey website)

[email protected] Multivariate RBF Approximation New Mexico Tech, November 2, 2007
Applications

Point Cloud Modeling
What are multivariate scattered data?
Stanford bunny (simplified): 8171 point cloud data points in 3D
Applications

FastSCAN Scanners
Weta Digital staff scanning the cave troll with FastSCAN for "The Lord of the Rings: The Fellowship of the Ring" (http://www.fastscan3d.com/)

[Embedded video: HUST006.mpeg (video/mpeg)]
Applications

ARANZ Medical SilhouetteMobile

SilhouetteMobile is an innovative portable computer device with a custom camera and software that allows a medical professional to capture information about a wound at the point of care. The information is analyzed, managed and stored in a database on the device. (http://www.aranzmedical.com/)
Applications

Gasoline Engine Design

Variables:
spark timing
speed
load
air-fuel ratio
exhaust gas re-circulation rate
intake valve timing
exhaust valve timing
fuel injection timing
Applications

Engine Data Fitting

Find a function $s_{f,X}$ (the model) that fits the "input" variables and the "output" (fuel consumption), and use the model to decide which variables lead to optimal fuel consumption, i.e.,

inputs: $x_1$ = spark timing, $x_2$ = speed, $x_3$ = load, $x_4$ = air-fuel ratio
output: $s_{f,X}(x_1, x_2, x_3, x_4)$ = fuel consumption

and

$$(x_1^{opt}, x_2^{opt}, x_3^{opt}, x_4^{opt})^T = x^{opt} = \arg\min_{x \in \mathbb{R}^4} s_{f,X}(x).$$
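The workflow above can be sketched numerically. The following toy illustration uses invented data, not real engine measurements: a hypothetical fuel-consumption function of two of the four inputs is sampled on a grid, a Gaussian-RBF interpolant $s_{f,X}$ is fitted by solving $A_{K,X}c = Y$, and its minimizer is located by a coarse grid search. The function `fuel`, the shape parameter and the grids are all assumptions of the sketch.

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= fac * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fuel(x1, x2):
    # Hypothetical stand-in for measured fuel consumption (NOT real engine data).
    return (x1 - 0.6) ** 2 + (x2 - 0.4) ** 2

eps = 3.0  # Gaussian shape parameter (illustrative choice)
def kernel(p, q):
    return math.exp(-eps ** 2 * sum((a - b) ** 2 for a, b in zip(p, q)))

# "Measurements" on a 5x5 grid of the two chosen input variables.
X = [(i / 4, j / 4) for i in range(5) for j in range(5)]
Y = [fuel(*p) for p in X]

# Fit the RBF model s_{f,X}: solve A_{K,X} c = Y.
A = [[kernel(p, q) for q in X] for p in X]
c = solve(A, Y)
def s(p):
    return sum(cj * kernel(p, xj) for cj, xj in zip(c, X))

# Coarse grid search for x^opt = argmin s_{f,X}(x).
grid = [(i / 20, j / 20) for i in range(21) for j in range(21)]
x_opt = min(grid, key=s)
print(x_opt)
```

The located `x_opt` should land near the true minimizer $(0.6, 0.4)$ of the toy surface, even though that point is not among the samples.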
Books and Papers
Outline
1 Background
2 Applications
3 Books and Papers
4 Theorems and Algorithms
5 RBF collocation methods for PDE
Books and Papers
Scattered Data Approximation
( [Buhmann 2003], [Wendland 2005] and [Fasshauer 2007] )
Books and Papers
Statistical Learning and Support Vector Machines
( [Wahba 1990], [Berlinet and Thomas-Agnan 2004] and [Steinwart and Christmann 2008] )
Books and Papers

Approximation of (Stochastic) Dynamic Systems
( [Ye 2009 Report] and [Giesl and Wendland 2010] )

Approximation of (Stochastic) Partial Differential Equations
( [Boogaart 2001], [Koutsourelakis and Warner 2009] and [Fasshauer and Ye 2010 SPDE] )

Statistical Signal Processing
( [Kay 1998] )
Theorems and Algorithms
Outline
1 Background
2 Applications
3 Books and Papers
4 Theorems and Algorithms
5 RBF collocation methods for PDE
Theorems and Algorithms

One-Dimensional Case: Lagrange Polynomial Bases

$$s_{f,X}(x) = \sum_{j=1}^{N} c_j B_j(x), \quad x \in \mathbb{R},$$

where

polynomial bases: $B_j(x) = \prod_{k=1, k \neq j}^{N} \frac{x - x_k}{x_j - x_k}, \quad j = 1, \ldots, N$
coefficients: $c_j = y_j, \quad j = 1, \ldots, N$

How about two dimensions, three dimensions, ..., $d$ dimensions?
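A minimal sketch of the one-dimensional construction above: each $B_j$ equals 1 at $x_j$ and vanishes at every other node, so $c_j = y_j$ gives an exact interpolant, and for polynomial data of degree $< N$ the interpolant reproduces the function everywhere. The nodes and the test function $f(x) = x^2$ are illustrative choices.

```python
def lagrange_basis(nodes, j, x):
    """B_j(x) = prod_{k != j} (x - x_k) / (x_j - x_k)."""
    Bj = 1.0
    for k, xk in enumerate(nodes):
        if k != j:
            Bj *= (x - xk) / (nodes[j] - xk)
    return Bj

def lagrange_interpolant(nodes, values, x):
    """s_{f,X}(x) = sum_j c_j B_j(x) with c_j = y_j."""
    return sum(yj * lagrange_basis(nodes, j, x) for j, yj in enumerate(values))

nodes = [0.0, 1.0, 2.0]
values = [xk ** 2 for xk in nodes]   # sample f(x) = x^2 at the nodes
# Degree-2 data with N = 3 nodes: the interpolant is exact everywhere.
print(lagrange_interpolant(nodes, values, 1.5))   # prints 2.25
```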
Theorems and Algorithms

Meshfree Approximation Methods

Use a linear combination of basis functions $B_j$,

$$s_{f,X}(x) = \sum_{j=1}^{N} c_j B_j(x), \quad x \in \mathbb{R}^d,$$

and enforce the interpolation conditions

$$s_{f,X}(x_k) = y_k, \quad k = 1, \ldots, N.$$

This leads to the linear system

$$\begin{pmatrix} B_1(x_1) & \cdots & B_N(x_1) \\ \vdots & \ddots & \vdots \\ B_1(x_N) & \cdots & B_N(x_N) \end{pmatrix} \begin{pmatrix} c_1 \\ \vdots \\ c_N \end{pmatrix} = \begin{pmatrix} y_1 \\ \vdots \\ y_N \end{pmatrix}$$
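The three steps above (choose $B_j$, form the matrix, solve) can be sketched directly. Here the $B_j$ are Gaussian radial basis functions (an example appears on a later slide); the scattered points, the test function $\sin(2\pi x)$ and the shape parameter $\varepsilon = 3$ are illustrative choices, and the linear solver is a plain Gaussian elimination for the small dense system.

```python
import math

eps = 3.0  # shape parameter of the Gaussian basis (illustrative choice)

def B(x, xj):
    """Gaussian basis function B_j(x) = exp(-eps^2 (x - x_j)^2)."""
    return math.exp(-(eps * (x - xj)) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= fac * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# Scattered data sampled from an (otherwise "unknown") test function.
X = [0.05, 0.2, 0.33, 0.5, 0.62, 0.81, 0.95]
Y = [math.sin(2 * math.pi * xk) for xk in X]

# Interpolation matrix (B_j(x_k))_{k,j} and coefficient vector c.
A = [[B(xk, xj) for xj in X] for xk in X]
c = solve(A, Y)

def s(x):
    """Meshfree interpolant s_{f,X}(x) = sum_j c_j B_j(x)."""
    return sum(cj * B(x, xj) for cj, xj in zip(c, X))

# The interpolation conditions s_{f,X}(x_k) = y_k now hold by construction.
print(max(abs(s(xk) - yk) for xk, yk in zip(X, Y)))
```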
Theorems and Algorithms

How do we construct the basis functions $B_j$?

Radial basis function $\phi : [0,\infty) \to \mathbb{R}$:
$$B_j(x) = \phi(\|x - x_j\|_2), \quad j = 1, \ldots, N$$

For example, the Gaussian with shape parameter $\varepsilon > 0$:
$$\phi(r) := \exp(-\varepsilon^2 r^2), \quad r \geq 0$$

Kernel function $K : \Omega \times \Omega \to \mathbb{R}$:
$$B_j(x) = K(x, x_j), \quad j = 1, \ldots, N$$

For example, the min kernel defined on $\Omega := (0,1)^d$:
$$K(x, y) := \prod_{k=1}^{d} \min(x_k, y_k), \quad x, y \in \Omega$$
Theorems and Algorithms: Deterministic and Random Problems

Why are meshfree approximation methods useful in both the computational and the statistical fields?

Because $X$, $Y$, $f$ and $K$ can be deterministic or random, e.g.,

$X$ can be Halton points or Sobol' points,
$y_j := f(x_j)$ or $y_j := f(x_j) + \epsilon_j$, $\epsilon_j \sim \mathcal{N}(0,1)$, $j = 1, \ldots, N$,
$f$ can be a Gaussian process,
$K_\varepsilon(x, y) := \exp(-\varepsilon^2 \|x - y\|_2^2)$ with random shape parameter $\varepsilon \sim \mathcal{N}(\mu, \sigma^2)$.
Theorems and Algorithms: Positive Definite Kernels

Definition ([Wendland 2005, Definition 6.24]). A symmetric kernel $K : \Omega \times \Omega \to \mathbb{R}$ is said to be positive definite if, for all $N \in \mathbb{N}$, pairwise distinct centers $x_1, \ldots, x_N \in \mathbb{R}^d$, and nonzero vectors $c = (c_1, \ldots, c_N)^T \in \mathbb{R}^N$, the quadratic form

$$c^T A_{K,X} c = \sum_{j=1}^{N} \sum_{k=1}^{N} c_j c_k K(x_j, x_k) > 0,$$

where the square matrix

$$A_{K,X} := \big( K(x_j, x_k) \big)_{j,k=1}^{N,N} \in \mathbb{R}^{N \times N}.$$

($K$ is positive definite if and only if all the matrices $A_{K,X}$ are positive definite.)
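A quick numerical sanity check of the definition, using the Gaussian kernel and the min kernel from the preceding slide (the specific centers and coefficient vectors are arbitrary test choices): for pairwise distinct centers and nonzero $c$, the quadratic form $c^T A_{K,X} c$ should come out strictly positive.

```python
import math
import random

def gauss(x, y, eps=2.0):
    """Gaussian kernel K(x, y) = exp(-eps^2 ||x - y||_2^2)."""
    return math.exp(-eps ** 2 * sum((a - b) ** 2 for a, b in zip(x, y)))

def min_kernel(x, y):
    """Min kernel K(x, y) = prod_k min(x_k, y_k) on (0,1)^d."""
    p = 1.0
    for a, b in zip(x, y):
        p *= min(a, b)
    return p

def quadratic_form(K, X, c):
    """c^T A_{K,X} c = sum_{j,k} c_j c_k K(x_j, x_k)."""
    return sum(c[j] * c[k] * K(X[j], X[k])
               for j in range(len(X)) for k in range(len(X)))

random.seed(1)
X = [(random.random(), random.random()) for _ in range(8)]  # distinct centers in (0,1)^2
for trial in range(100):
    c = [random.uniform(-1, 1) for _ in X]   # (almost surely) nonzero vector
    assert quadratic_form(gauss, X, c) > 0
    assert quadratic_form(min_kernel, X, c) > 0
print("all quadratic forms positive")
```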
Theorems and Algorithms: Reproducing-Kernel Hilbert Space

Definition ([Wendland 2005, Definition 10.1]). Let $\mathcal{H}_K(\Omega)$ be a real Hilbert space of functions $f : \Omega \subseteq \mathbb{R}^d \to \mathbb{R}$. A kernel $K : \Omega \times \Omega \to \mathbb{R}$ is called a reproducing kernel for $\mathcal{H}_K(\Omega)$ if

(i) $K(\cdot, y) \in \mathcal{H}_K(\Omega)$, for all $y \in \Omega$;
(ii) $f(y) = (K(\cdot, y), f)_{\mathcal{H}_K(\Omega)}$, for all $f \in \mathcal{H}_K(\Omega)$ and $y \in \Omega$.

In this case $\mathcal{H}_K(\Omega)$ is called a reproducing-kernel Hilbert space.
Theorems and Algorithms: Theorems

Theorem ([Wendland 2005, Theorem 10.10]). If $K : \Omega \times \Omega \to \mathbb{R}$ is a symmetric positive definite kernel, then there is a reproducing-kernel Hilbert space $\mathcal{H}_K(\Omega)$ with reproducing kernel $K$.

Error Bounds: If $K \in C^{2n}(\Omega \times \Omega)$ and $f \in \mathcal{H}_K(\Omega)$, then
$$|f(x) - s_{f,X}(x)| \leq C_{K,x} \, h_{X,\Omega}^{n} \, \|f\|_{\mathcal{H}_K(\Omega)}, \quad x \in \Omega,$$
where the fill distance $h_{X,\Omega} := \sup_{x \in \Omega} \min_{x_j \in X} \|x - x_j\|_2$.

Optimal Recovery:
$$\|s_{f,X}\|_{\mathcal{H}_K(\Omega)} = \min\big\{ \|f\|_{\mathcal{H}_K(\Omega)} : f \in \mathcal{H}_K(\Omega),\ f(x_j) = y_j,\ j = 1, \ldots, N \big\}$$
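The fill distance in the error bound can be estimated numerically. In the sketch below, $h_{X,\Omega}$ is approximated by a max-min search over a fine evaluation grid (an assumed discretization of the sup over $\Omega$); because the Halton sequence is nested, adding points can only shrink the fill distance, which is what drives the bound toward zero.

```python
import math

def van_der_corput(n, base):
    """Radical inverse of n in the given base (one coordinate of a Halton point)."""
    q, bk = 0.0, 1.0 / base
    while n > 0:
        n, r = divmod(n, base)
        q += r * bk
        bk /= base
    return q

def halton_points(N):
    """First N points of the 2-D Halton sequence in (0,1)^2 (bases 2 and 3)."""
    return [(van_der_corput(i, 2), van_der_corput(i, 3)) for i in range(1, N + 1)]

def fill_distance(X, grid_n=60):
    """Approximate h_{X,Omega} = sup_x min_j ||x - x_j||_2 on a fine grid over (0,1)^2."""
    h = 0.0
    for i in range(grid_n + 1):
        for j in range(grid_n + 1):
            gx, gy = i / grid_n, j / grid_n
            d = min(math.hypot(gx - x, gy - y) for x, y in X)
            h = max(h, d)
    return h

h25 = fill_distance(halton_points(25))
h100 = fill_distance(halton_points(100))
print(h25, h100)   # the fill distance shrinks as N grows
```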
Theorems and Algorithms: Examples

Example (Univariate Sobolev Splines). Let $\varepsilon > 0$ be a shape parameter and define the radial basis function

$$\phi(r) := \frac{1}{2\varepsilon} \exp(-\varepsilon r), \quad r \geq 0.$$

We can determine that the kernel function

$$K(x, y) := \phi(|x - y|), \quad x, y \in \mathbb{R},$$

is positive definite. Its related reproducing-kernel Hilbert space is

$$\mathcal{H}_K(\mathbb{R}) \equiv \text{the Sobolev space } H^1(\mathbb{R}) := \big\{ f : \mathbb{R} \to \mathbb{R} : f, f' \in L_2(\mathbb{R}) \big\},$$

and it is equipped with the inner product

$$(f, g)_{\mathcal{H}_K(\mathbb{R})} := \int_{\mathbb{R}} f'(x) g'(x)\,dx + \varepsilon^2 \int_{\mathbb{R}} f(x) g(x)\,dx.$$
Theorems and Algorithms: Examples

Example (Modified Min Kernel). The kernel

$$K(x, y) := \min\{x, y\} - xy + \frac{1}{2}$$

is positive definite on $\Omega := (0,1)$. Its related reproducing-kernel Hilbert space has the form

$$\mathcal{H}_K(\Omega) := \big\{ f \in H^1(\Omega) : f(0) = f(1) \big\}$$

with the inner product

$$(f, g)_{\mathcal{H}_K(\Omega)} = \int_0^1 f'(x) g'(x)\,dx + f(0) g(0) + f(1) g(1),$$

where $H^1(\Omega) := \{ f : \Omega \to \mathbb{R} : f, f' \in L_2(\Omega) \}$ is a Sobolev space.
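The reproducing property $f(y) = (K(\cdot, y), f)_{\mathcal{H}_K(\Omega)}$ of this kernel can be checked numerically: $\partial K(x,y)/\partial x$ equals $1 - y$ for $x < y$ and $-y$ for $x > y$, and $K(0,y) = K(1,y) = 1/2$, so plugging $g = K(\cdot, y)$ into the inner product should return $f(y)$ whenever $f(0) = f(1)$. The sketch below verifies this by quadrature; the test function $f(x) = \sin(2\pi x)$ and the trapezoidal rule are assumptions of the sketch.

```python
import math

def f(x):  return math.sin(2 * math.pi * x)        # test function with f(0) = f(1)
def fp(x): return 2 * math.pi * math.cos(2 * math.pi * x)

def Kx(x, y):
    """d/dx of K(x, y) = min(x, y) - x*y + 1/2 (it jumps at x = y)."""
    return (1.0 - y) if x < y else -y

def trapezoid(g, a, b, n=4000):
    h = (b - a) / n
    return h * (0.5 * g(a) + sum(g(a + i * h) for i in range(1, n)) + 0.5 * g(b))

def inner_product_with_K(y):
    """(K(.,y), f)_{H_K} = int_0^1 Kx(x,y) f'(x) dx + K(0,y) f(0) + K(1,y) f(1)."""
    # Split the integral at x = y, where Kx jumps, so the quadrature stays accurate.
    integral = trapezoid(lambda x: Kx(x, y) * fp(x), 0.0, y) \
             + trapezoid(lambda x: Kx(x, y) * fp(x), y, 1.0)
    return integral + 0.5 * f(0.0) + 0.5 * f(1.0)   # K(0,y) = K(1,y) = 1/2

y = 0.3
print(inner_product_with_K(y), f(y))   # the two values should agree
```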
RBF collocation methods for PDE
Outline
1 Background
2 Applications
3 Books and Papers
4 Theorems and Algorithms
5 RBF collocation methods for PDE
RBF collocation methods for PDE

Let $\Omega \subseteq \mathbb{R}^d$ be a polygonal open region. Suppose that the elliptic partial differential equation

$$\begin{cases} Lu = f, & \text{in } \Omega, \\ u = g, & \text{on } \partial\Omega, \end{cases}$$

has a unique solution $u$, where

$$L := \sum_{j,k=1}^{d} a_{jk}(x) \frac{\partial^2}{\partial x_j \partial x_k} + \sum_{j=1}^{d} b_j(x) \frac{\partial}{\partial x_j} + c(x).$$

Next we want to use the RBF collocation methods to approximate the numerical solution of this PDE.
RBF collocation methods for PDE

We are given the data points

$$X := \{x_1, \ldots, x_N\} \subset \Omega, \quad X_\partial := \{x_{N+1}, \ldots, x_{N+M}\} \subset \partial\Omega,$$

and the data values

$$Y := \{f(x_1), \ldots, f(x_N)\}, \quad Y_\partial := \{g(x_{N+1}), \ldots, g(x_{N+M})\}.$$

Suppose the positive definite kernel function $K$ is set up by the radial basis function $\phi$, i.e.,

$$K(x, y) := \phi(\|x - y\|_2).$$
RBF collocation methods for PDE

We propose the following expansion $\hat{u}$ for the unknown function $u$, i.e.,

$$u(x) \approx \hat{u}(x) = \sum_{j=1}^{N} c_j L^y K(x, y)\big|_{y=x_j} + \sum_{j=N+1}^{N+M} c_j K(x, x_j)$$
$$= \sum_{j=1}^{N} c_j L^y \phi(\|x - y\|_2)\big|_{y=x_j} + \sum_{j=N+1}^{N+M} c_j \phi(\|x - x_j\|_2),$$

and then enforce the collocation conditions

$$L\hat{u}(x_j) = f(x_j),\ x_j \in X \quad \text{and} \quad \hat{u}(x_j) = g(x_j),\ x_j \in X_\partial.$$
RBF collocation methods for PDE

We end up with the linear system

$$\begin{pmatrix} A_{LL} & A_{L\partial} \\ A_{L\partial}^T & A_{\partial\partial} \end{pmatrix} \begin{pmatrix} c_L \\ c_\partial \end{pmatrix} = \begin{pmatrix} f(X) \\ g(X_\partial) \end{pmatrix},$$

where

$$(A_{LL})_{jk} := L^x L^y \phi(\|x - y\|_2)\big|_{x=x_j,\, y=x_k}, \quad x_j, x_k \in X,$$
$$(A_{L\partial})_{jk} := L^x \phi(\|x - y\|_2)\big|_{x=x_j,\, y=x_k}, \quad x_j \in X,\ x_k \in X_\partial,$$
$$(A_{\partial\partial})_{jk} := \phi(\|x_j - x_k\|_2), \quad x_j, x_k \in X_\partial,$$
$$c_L := (c_1, \ldots, c_N)^T, \quad c_\partial := (c_{N+1}, \ldots, c_{N+M})^T,$$
$$f(X) := (f(x_1), \ldots, f(x_N))^T, \quad g(X_\partial) := (g(x_{N+1}), \ldots, g(x_{N+M}))^T.$$
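The whole construction can be sketched end-to-end in one dimension. In the sketch below, $L = d^2/dx^2$ on $\Omega = (0,1)$ with the Gaussian $\phi(r) = \exp(-\varepsilon^2 r^2)$, so $L^y K$, $L^x K$ and $L^x L^y K$ reduce to the 2nd and 4th derivatives of $e^{-\varepsilon^2 t^2}$ in $t = x - y$. The test problem $u'' = -\pi^2 \sin(\pi x)$, $u(0) = u(1) = 0$ (exact solution $\sin(\pi x)$), the point counts and $\varepsilon = 4$ are illustrative choices.

```python
import math

eps = 4.0
a = eps ** 2

def G0(t): return math.exp(-a * t * t)
def G2(t): return (4 * a * a * t * t - 2 * a) * G0(t)                            # (d/dt)^2 e^{-a t^2}
def G4(t): return (16 * a ** 4 * t ** 4 - 48 * a ** 3 * t * t + 12 * a * a) * G0(t)  # (d/dt)^4

def solve(A, b):
    """Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= fac * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# Test problem: u'' = rhs_f on (0,1), u = 0 on the boundary; exact u(x) = sin(pi x).
u_exact = lambda x: math.sin(math.pi * x)
rhs_f = lambda x: -math.pi ** 2 * math.sin(math.pi * x)

Xi = [i / 10 for i in range(1, 10)]    # interior collocation points
Xb = [0.0, 1.0]                        # boundary points

# Block system: rows = collocation at Xi (apply L^x), then at Xb (plain evaluation);
# columns = centers in Xi (basis L^y K), then centers in Xb (basis K).
A = [[G4(x - z) for z in Xi] + [G2(x - z) for z in Xb] for x in Xi] \
  + [[G2(x - z) for z in Xi] + [G0(x - z) for z in Xb] for x in Xb]
rhs = [rhs_f(x) for x in Xi] + [0.0, 0.0]
c = solve(A, rhs)

N = len(Xi)
def u_hat(x):
    """Approximate solution: sum c_j L^y K(x, x_j) + sum c_j K(x, x_j^boundary)."""
    return sum(cj * G2(x - z) for cj, z in zip(c[:N], Xi)) \
         + sum(cj * G0(x - z) for cj, z in zip(c[N:], Xb))

err = max(abs(u_hat(i / 50) - u_exact(i / 50)) for i in range(51))
print(err)
```

This is the symmetric (Hermite-based) collocation variant matching the block system above; the unsymmetric (Kansa) variant would instead use plain kernels $K(x, x_j)$ for all centers.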
RBF collocation methods for PDE

Theorem ([Fasshauer 2007, Theorem 38.1]). Let $L \neq 0$ be a second-order linear elliptic differential operator with coefficients in $C^{2(n-2)}(\Omega)$ that either vanish on $\Omega$ or have no zero there. Suppose that $\phi \in C^{2n}(\mathbb{R}_0^+)$ and $f \in C(\Omega)$, $g \in C(\partial\Omega)$. If $u \in \mathcal{H}_K(\Omega)$, then

$$\|u - \hat{u}\|_{L_\infty(\Omega)} \leq C h^{n-2} \|u\|_{\mathcal{H}_K(\Omega)},$$

where $h$ is the larger of the fill distances in the interior and on the boundary, respectively, i.e.,

$$h = \max\big\{ h_{X,\Omega},\ h_{X_\partial,\partial\Omega} \big\}.$$
RBF collocation methods for PDE

Meshfree Methods (Radial Basis Functions)
Stochastic Analysis (Random Dynamic Systems)
Statistical Learning (Kriging Methods)

Combining the above three fields, we are now introducing a new numerical method to approximate stochastic partial differential equations.
Appendix: References

References I

R. A. Adams and J. J. F. Fournier, Sobolev Spaces (2nd Ed.), Pure and Applied Mathematics, Vol. 140, Academic Press, 2003.

A. Berlinet and C. Thomas-Agnan, Reproducing Kernel Hilbert Spaces in Probability and Statistics, Kluwer Academic Publishers, 2004.

M. D. Buhmann, Radial Basis Functions: Theory and Implementations, Cambridge University Press (Cambridge), 2003.

D. G. Duffy, Green's Functions with Applications, Studies in Advanced Mathematics, Chapman & Hall/CRC, 2001.
References II

G. E. Fasshauer, Meshfree Approximation Methods with MATLAB, Interdisciplinary Mathematical Sciences, Vol. 6, World Scientific Publishers (Singapore), 2007.

J. K. Hunter and B. Nachtergaele, Applied Analysis, World Scientific Publishers (Singapore), 2005.

S. M. Kay, Fundamentals of Statistical Signal Processing: Estimation Theory and Detection Theory, Vols. 1 and 2, Prentice Hall, 1998.
References III

B. Schölkopf and A. J. Smola, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, MIT Press (Cambridge, MA), 2002.

E. M. Stein and G. Weiss, Introduction to Fourier Analysis on Euclidean Spaces, Princeton University Press, 1975.

M. L. Stein, Interpolation of Spatial Data: Some Theory for Kriging, Springer Series in Statistics, Springer-Verlag (New York), 1999.

I. Steinwart and A. Christmann, Support Vector Machines, Springer Science Press, 2008.
References IV

G. Wahba, Spline Models for Observational Data, CBMS-NSF Regional Conference Series in Applied Mathematics 59, SIAM (Philadelphia), 1990.

H. Wendland, Scattered Data Approximation, Cambridge University Press, 2005.

K. E. Atkinson and O. Hansen, A Spectral Method for the Eigenvalue Problem for Elliptic Equations, Reports on Computational Mathematics #177, Dept. of Mathematics, University of Iowa, 2009.
References V

K. G. Boogaart, Kriging for Processes Solving Partial Differential Equations, 2001.

R. DeVore and A. Ron, Approximation Using Scattered Shifts of a Multivariate Function, Transactions of the AMS, electronically published on July 15, 2010.

J. Duchon, Splines minimizing rotation-invariant semi-norms in Sobolev spaces, in Constructive Theory of Functions of Several Variables, W. Schempp and K. Zeller (Eds.), Springer-Verlag (Berlin), 1977, 85-100.
References VI

G. E. Fasshauer and Q. Ye, Reproducing Kernels of Generalized Sobolev Spaces via a Green Function Approach with Distributional Operator, submitted.

G. E. Fasshauer and Q. Ye, Reproducing Kernels of Sobolev Spaces via a Green Function Approach with Differential Operators and Boundary Operators, in preparation.

G. E. Fasshauer and Q. Ye, Approximation of (Stochastic) Partial Differential Equations by Gaussian Processes via Reproducing Kernels, in preparation.
References VII

P. Giesl and H. Wendland, Numerical determination of the basin of attraction for exponentially asymptotically autonomous dynamical systems, preprint, Oxford/Sussex, 2010.

S. Koutsourelakis and J. Warner, Learning Solutions to Multiscale Elliptic Problems with Gaussian Process Models, research report at Cornell University, 2009.

J. Kybic, T. Blu and M. Unser, Generalized sampling: A variational approach - Parts I and II: Theory, IEEE Trans. Signal Proc. 50/8 (2002), 1965-1985.
References VIII

R. Schaback, Spectrally Optimized Derivative Formulae, Data Page of R. Schaback's Research Group, 2008.

Q. Ye, Approximation of Stable/Unstable Manifolds of Dynamic Systems by Support Vector Machine, Illinois Institute of Technology (IIT) project report, 2009.

Q. Ye, Reproducing Kernels of Generalized Sobolev Spaces via a Green Function Approach with Differential Operator, preprint, IIT technical report, 2010.