8/9/2019 A Tutorial on Random Fields and Maximum Entropy
Random Fields and Maximum Entropy: A Brief Tutorial on the FRAME Model and Gibbs Learning
Julian Antolin Camarena
Department of Physics and Astronomy
Wednesday, November 20, 2013
Coming Up
Markov Random Fields and Gibbs Measures
The Maximum Entropy Method
The FRAME Model
Satellite Maximum Likelihood Estimation
Random Fields
A stochastic process is a set of random variables \{X_t : t \in T\}, with X_t taking values in a finite set S_t.
The joint probability distribution of the variables is

p(x) = P(X_t = x_t, t \in T), \quad x = (x_1, x_2, \dots, x_n).

Let T be the set of nodes of a graph G, and N_t the neighborhood of t, i.e. the set of nodes s for which (t, s) share an edge in G. The process is said to be a Markov random field (MRF) if
i. p(x) > 0 for all x;
ii. for each t and x,

P(x_t \mid \{x_s, s \in G \setminus t\}) = P(x_t \mid \{x_s, s \in N_t\}).
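As a quick sanity check of the Markov property just stated, the sketch below builds a toy three-node chain 1-2-3 with pairwise energies (the coupling value J is an illustrative assumption, not from the slides) and verifies numerically that x_1 is independent of x_3 given x_2, i.e. P(x_1 | x_2, x_3) = P(x_1 | x_2):

```python
import itertools
import math

J = 0.8  # illustrative coupling strength (assumption)

def H(x):
    # Pairwise energy over the chain's edges (1,2) and (2,3).
    return -J * (x[0] * x[1] + x[1] * x[2])

# Exact joint distribution over all 2^3 spin configurations.
states = list(itertools.product([-1, 1], repeat=3))
Z = sum(math.exp(-H(x)) for x in states)
p = {x: math.exp(-H(x)) / Z for x in states}

def cond_p_x1(x1, x2, x3):
    # P(x1 | x2, x3), computed from the joint.
    num = p[(x1, x2, x3)]
    den = sum(p[(a, x2, x3)] for a in (-1, 1))
    return num / den

def cond_p_x1_given_x2(x1, x2):
    # P(x1 | x2), marginalizing over x3.
    num = sum(p[(x1, x2, b)] for b in (-1, 1))
    den = sum(p[(a, x2, b)] for a in (-1, 1) for b in (-1, 1))
    return num / den

# Markov property on a chain: conditioning on the neighbor x2 screens off x3.
for x2 in (-1, 1):
    for x3 in (-1, 1):
        assert abs(cond_p_x1(1, x2, x3) - cond_p_x1_given_x2(1, x2)) < 1e-12
```

Because the joint factorizes over the chain's edges, the x_3-dependent factor cancels in the conditional, which is exactly what the assertions confirm.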
The neighborhood N_t of a node t must satisfy the following properties:
i. A site is not its own neighbor: t \notin N_t.
ii. The neighborhood relation is reciprocal: t \in N_s \iff s \in N_t.
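Both properties can be checked mechanically for the familiar 4-connected lattice neighborhood; a minimal sketch (the lattice size and function names are illustrative):

```python
def neighbors(t, rows=3, cols=3):
    # 4-connected neighborhood of site t = (row, col) on a finite lattice.
    r, c = t
    cand = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return {(a, b) for a, b in cand if 0 <= a < rows and 0 <= b < cols}

sites = [(r, c) for r in range(3) for c in range(3)]
for t in sites:
    assert t not in neighbors(t)      # i. a site is not its own neighbor
    for s in neighbors(t):
        assert t in neighbors(s)      # ii. reciprocity: t in N_s <=> s in N_t
```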
A clique is a subset of nodes of the graph, C \subset G, in which every pair of nodes are neighbors. Examples are:
Single-site: C_1 = \{t \mid t \in G\}
Pair-site: C_2 = \{\{t, s\} \mid s \in N_t, t \in G\}
Triple-site: C_3 = \{\{t, s, r\} \mid t, s, r \in G \text{ are neighbors of one another}\}
In statistical physics the Boltzmann distribution is given by

p(x) = \frac{1}{Z} e^{-H(x)}, \quad Z = \sum_x e^{-H(x)}.

In the MRF literature the Boltzmann distribution is called the Gibbs measure or distribution.
Hammersley-Clifford Theorem
Theorem
X is a Markov random field on G with respect to N if and only if X is a Gibbs random field on G with respect to N.

The proof is omitted. In plain English: every MRF distribution can be written as a Gibbs distribution.
A Gibbs distribution has the clique factorization property:

H(x) = \sum_{c \in C} h_c(x);

that is, the sum is over the local energy functions of each clique.
A GRF is said to be homogeneous if h_c(x) is independent of the relative position of the clique c, and isotropic if h_c(x) is independent of the orientation of c.
Sometimes it is convenient to write H(x) as a sum over cliques of equal size. For example, for cliques up to size two:

H(x) = \sum_{t \in G} h_1(x_t) + \sum_{t \in G} \sum_{s \in N_t} h_2(x_t, x_s),

which is the form of a much celebrated model in statistical physics.
Ising Model
The Ising model of magnetism is a prototypical example of a Gibbs random field. The Ising Hamiltonian is

H_I = -\sum_{\langle i,j \rangle} J_{ij}\, \sigma_i \sigma_j - \sum_j h_j \sigma_j,

where \langle i, j \rangle denotes pairs i, j in the same neighborhood.
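A minimal numerical illustration of this Hamiltonian on a 2x2 lattice with 4-connected neighbors (the values of J and h are illustrative assumptions). With h = 0 and all spins aligned, the four edges each contribute -J, so the ground-state energy is -4J:

```python
import itertools
import math

J, h = 1.0, 0.0  # illustrative coupling and external field (assumptions)

sites = [(0, 0), (0, 1), (1, 0), (1, 1)]
edges = [((0, 0), (0, 1)), ((1, 0), (1, 1)),
         ((0, 0), (1, 0)), ((0, 1), (1, 1))]

def energy(spins):
    # H_I = -J * sum over neighbor pairs s_i s_j - h * sum over sites s_j
    pair = sum(spins[i] * spins[j] for i, j in edges)
    field = sum(spins[s] for s in sites)
    return -J * pair - h * field

# Enumerate all 2^4 configurations and compute the partition function Z.
configs = [dict(zip(sites, vals))
           for vals in itertools.product([-1, 1], repeat=4)]
Z = sum(math.exp(-energy(c)) for c in configs)
p_ground = math.exp(-energy({s: 1 for s in sites})) / Z
```

On a lattice this small the Gibbs measure can be enumerated exactly; for realistic sizes one resorts to the sampling methods discussed later.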
MRF model textures
Source: Statistical Image Processing and Multidimensional Modeling by Paul Fieguth, Springer, 2012.
Maximum Entropy Method
The ME distribution is maximally noncommittal with respect to missing information and depends solely on the available data.
The resulting distribution is in the exponential family; more specifically, it is a Gibbs distribution.
Remember, it is not the true underlying distribution; it is simply the best distribution that can be obtained from the data, and it will, on average, yield the same statistics as the data.
To construct it:
i. The data are assumed to give a good estimate of the average value of the measured functions: measurement of \phi_i(x) yields

\langle \phi_i(x) \rangle = \sum_x \phi_i(x)\, p(x).

ii. Solve the optimization problem via Lagrange multipliers:

\max_{p(x)} \left( -\sum_x p(x) \log p(x) \right)

subject to

\sum_x p(x) = 1, \quad \langle \phi_i(x) \rangle = \sum_x \phi_i(x)\, p(x).

iii. Solving, one has the ME distribution:

p(x; \Lambda) \equiv p(x) = \frac{1}{Z} e^{-\sum_i \lambda_i \phi_i(x)},

where \Lambda = (\lambda_1, \lambda_2, \dots, \lambda_N).
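The construction can be made concrete with the classic loaded-die example: states x in {1,...,6}, one feature \phi(x) = x, and a target mean of 4.5 (the target value is an illustrative assumption, not from the slides). The sketch below finds the single Lagrange multiplier by iterating a simple fixed-point update until the model mean matches the constraint:

```python
import math

target = 4.5          # illustrative constrained mean (assumption)
xs = range(1, 7)      # faces of the die

def model_mean(lam):
    # Mean of phi(x) = x under p(x) proportional to exp(-lam * x).
    w = [math.exp(-lam * x) for x in xs]
    Z = sum(w)
    return sum(x * wi for x, wi in zip(xs, w)) / Z

# Iterate lam <- lam + eta * (<phi>_p - target); since <phi>_p decreases
# in lam, this drives the model mean toward the target.
lam = 0.0
for _ in range(20000):
    lam += 0.01 * (model_mean(lam) - target)
```

At convergence the ME distribution p(x) ∝ e^{-λx} reproduces the constrained statistic exactly; a target above the uniform mean 3.5 forces λ < 0, tilting mass toward the high faces.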
Z satisfies

\frac{\partial \log Z}{\partial \lambda_i} = -\langle \phi_i(x) \rangle_p, \quad \frac{\partial^2 \log Z}{\partial \lambda_i \partial \lambda_j} = \mathrm{cov}\{\phi_i(x), \phi_j(x)\}.

The second property says that the Hessian of \log Z is positive semidefinite, so \log Z is convex in \Lambda and the log-likelihood of p(x; \Lambda) is concave. Thus, given a set of consistent constraints, the Lagrange multipliers are unique.
The maximum likelihood estimate of the Lagrange multipliers satisfies

\frac{d\lambda_n}{dt} = \langle \phi_n(x) \rangle_p - \mu_n, \quad n = 1, 2, \dots, N,

where \mu_n is the observed value of the n-th statistic.
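The first identity is easy to verify numerically on a tiny discrete model; the sketch below (the state space and feature \phi(x) = x^2 are illustrative choices) compares a central-difference derivative of \log Z against -\langle\phi\rangle_p:

```python
import math

xs = [-2, -1, 0, 1, 2]          # illustrative state space
phi = lambda x: x * x           # illustrative feature

def logZ(lam):
    return math.log(sum(math.exp(-lam * phi(x)) for x in xs))

def mean_phi(lam):
    # <phi>_p under p(x) proportional to exp(-lam * phi(x)).
    w = [math.exp(-lam * phi(x)) for x in xs]
    Z = sum(w)
    return sum(phi(x) * wi for x, wi in zip(xs, w)) / Z

lam, eps = 0.3, 1e-6
numeric = (logZ(lam + eps) - logZ(lam - eps)) / (2 * eps)
# d(log Z)/d(lambda) should equal -<phi>_p.
assert abs(numeric + mean_phi(lam)) < 1e-8
```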
Overview
We now discuss the paper "Filters, Random Fields and Maximum Entropy (FRAME): Towards a Unified Theory for Texture Modeling" [International Journal of Computer Vision 27(2), 107-126 (1998)] by Zhu, Wu, and Mumford. Given an input texture image:
a set of filters is selected from a general set of filters;
histograms of the filtered images are calculated, as they approximate the marginals of the true underlying distribution f(I);
a maximum entropy distribution p(I) is constructed, constrained by the marginal distributions of f(I).
Filters
A filter is a system that performs mathematical operations on an input signal to enhance or reduce desired features of the input.
Linear space-invariant (LSI) filters are popular because they can be implemented with a convolution operation. Let h be an LSI filter's impulse response (filter window/Green function) and x an input signal; then the filtered signal is given by their convolution

y(z) = \int h(z')\, x(z - z')\, dz'

or

y_n = \sum_{k=-\infty}^{\infty} x_{n-k} h_k.
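The discrete sum above translates directly into code; a minimal sketch of full discrete convolution for finite sequences (the input signal and the difference filter are illustrative):

```python
def convolve(x, h):
    # Full discrete convolution: y_n = sum_k x_{n-k} h_k.
    y = [0.0] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        for k, hk in enumerate(h):
            if 0 <= n - k < len(x):
                y[n] += x[n - k] * hk
    return y

x = [1.0, 2.0, 3.0]
h = [1.0, -1.0]        # simple first-difference filter (illustrative)
y = convolve(x, h)     # -> [1.0, 1.0, 1.0, -3.0]
```

Space invariance is what makes this possible: the same window h is applied at every position n.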
Laplacian filter
Lena filtered with the Laplacian filter. Source: http://asura.iaigiri.com/OpenGL/Image/LaplacianFilter/LaplacianFilter.png

L(x, y) = \frac{\partial^2}{\partial x^2} + \frac{\partial^2}{\partial y^2}
Gaussian filter
Source: Wikipedia

G(x, y; x_0, y_0, \sigma_x, \sigma_y) = \frac{1}{2\pi \sigma_x \sigma_y}\, e^{-\left(\frac{(x - x_0)^2}{2\sigma_x^2} + \frac{(y - y_0)^2}{2\sigma_y^2}\right)}
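In practice the continuous Gaussian above is sampled on a small grid and renormalized to form a discrete filter window; a minimal sketch with isotropic sigma (the window size and sigma are illustrative assumptions):

```python
import math

def gaussian_kernel(size=5, sigma=1.0):
    # Sample the Gaussian at integer offsets from the center, then
    # renormalize so the discrete window sums to 1.
    c = size // 2
    k = [[math.exp(-((x - c) ** 2 + (y - c) ** 2) / (2 * sigma ** 2))
          for x in range(size)]
         for y in range(size)]
    s = sum(sum(row) for row in k)
    return [[v / s for v in row] for row in k]

k = gaussian_kernel()
# The window sums to 1 and peaks at its center entry.
```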
Laplacian of Gaussian
Source: http://www.aishack.in/wp-content/uploads/2010/08/conv-laplacian-of-gaussian-result.jpg

LG(x, y; x_0, y_0, \sigma_x, \sigma_y) = L(x, y)\, G(x, y; x_0, y_0, \sigma_x, \sigma_y)
Model Assumptions and Definitions
The image I is a random field on a discrete lattice and is a stationary process.
I contains sufficiently many pixels for statistical analysis.
Filters are denoted by F^{(k)}, k = 1, \dots, K, and the filtered images by I^{(k)} = I * F^{(k)}.
Further, since I is stationary and the F^{(k)} are LSI, I^{(k)} = I * F^{(k)} is a convolution.
The histograms of I^{(k)} are good approximations to the marginals f^{(k)}(I). They are vectors and are denoted H^{(k)}.
Knowing a sufficient number of marginals, we can build the distribution.
The observed (input) image is denoted I^{obs}. The observed filtered (by F^{(k)}) images are denoted by I^{(k)}_{obs} and the corresponding histograms by H^{(k)}_{obs}. Similar notation is used for the synthesized quantities.
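A histogram in this sense is just the normalized count of filter-response values; a minimal sketch with illustrative toy data (8 grey levels, a random 64x64 "image"):

```python
import random

random.seed(0)
G = 8                                        # grey levels (illustrative)
image = [random.randrange(G) for _ in range(64 * 64)]

def histogram(pixels, bins):
    # Normalized counts: H[v] estimates the marginal P(pixel value = v).
    H = [0.0] * bins
    for v in pixels:
        H[v] += 1.0
    n = len(pixels)
    return [c / n for c in H]

H_obs = histogram(image, G)
# H_obs sums to 1; for uniform toy data each bin is near 1/G.
```

In FRAME the same computation is applied to each filtered image I^{(k)}, after quantizing the filter responses into bins.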
The ME distribution depends upon the selected filter set S_K and the Lagrange multipliers \Lambda_K:

p(I; S_K, \Lambda_K) = \frac{1}{Z_K} e^{-\sum_{n=1}^{K} \lambda^{(n)\top} H^{(n)}}.

We look for

\Lambda_K = \arg\max_{\Lambda_K} \{\log p(I^{obs}; S_K, \Lambda_K)\} = \arg\max_{\Lambda_K} \left( -\log Z_K - \sum_{n=1}^{K} \lambda^{(n)\top} H^{(n)}_{obs} \right),

which is equivalent to

\frac{d\lambda^{(n)}}{dt} = \langle H^{(n)}_{syn} \rangle_{p(I; S_K, \Lambda_K)} - H^{(n)}_{obs}.
FRAME Algorithm
Input a texture image I^{obs}.
Select a set of K filters, S_K = \{F^{(1)}, F^{(2)}, \dots, F^{(K)}\}.
Compute H^{(k)}_{obs}, k = 1, 2, \dots, K.
Initialize \lambda^{(k)} \leftarrow 0, k = 1, 2, \dots, K.
Initialize I^{syn} \leftarrow white Gaussian noise texture.
While \frac{1}{2} \sum_k \| \langle H^{(k)}_{syn} \rangle_p - H^{(k)}_{obs} \|_1 > \epsilon:
  for k = 1, 2, \dots, K:
    Calculate H^{(k)}_{syn} from I^{syn}; use it for \langle H^{(k)}_{syn} \rangle_p.
    Update \lambda^{(k)} by \Delta\lambda^{(k)} = \langle H^{(k)}_{syn} \rangle_p - H^{(k)}_{obs}. This updates p.
  Sample p(I; S_K, \Lambda_K) (via Gibbs sampling, MCMC, etc.) to update I^{syn}.
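A heavily simplified version of this loop can be run end to end when the only "filter" is the identity: the model then factorizes over pixels, so the sampling step is exact rather than MCMC. Everything here (grey-level count, target histogram, learning rate, sample sizes) is an illustrative assumption; the real algorithm uses a filter bank and Gibbs sampling:

```python
import math
import random

random.seed(1)
G = 4                                  # grey levels (toy)
H_obs = [0.1, 0.2, 0.3, 0.4]           # "observed" histogram (toy)
lam = [0.0] * G
N = 10000                              # synthesized pixels per sweep

def sample_hist(lam):
    # Identity filter => i.i.d. pixels with p(g) proportional to exp(-lam[g]);
    # sample N pixels exactly and return their histogram.
    w = [math.exp(-l) for l in lam]
    Z = sum(w)
    p = [wi / Z for wi in w]
    counts = [0] * G
    for _ in range(N):
        r, acc = random.random(), 0.0
        for g in range(G):
            acc += p[g]
            if r < acc:
                counts[g] += 1
                break
        else:
            counts[-1] += 1
    return [c / N for c in counts]

for _ in range(300):
    H_syn = sample_hist(lam)
    for g in range(G):                 # d lam / dt = H_syn - H_obs
        lam[g] += 0.5 * (H_syn[g] - H_obs[g])

err = sum(abs(a - b) for a, b in zip(sample_hist(lam), H_obs))
```

The update raises \lambda_g wherever the synthesized histogram overshoots the observed one, suppressing that grey level, which is exactly the gradient dynamics on the previous slide.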
Filter Selection Algorithm
Let B be a general filter bank, S the set of selected filters, I^{obs} the observed texture image, and I^{syn} the synthesized texture image.
Initialize k = 0, S \leftarrow \emptyset, p(I) = U[0, G-1], and I^{syn} \sim U[0, G-1].
For \alpha = 1, \dots, |B|, compute H^{(\alpha)}_{obs} from I^{(\alpha)}_{obs}.
Repeat:
  Calculate H^{(\alpha)}_{syn} from I^{(\alpha)}_{syn}.
  d(\alpha) = \frac{1}{2} \| H^{(\alpha)}_{syn} - H^{(\alpha)}_{obs} \|.
  Choose F^{(k+1)} so that d(k+1) = \max\{d(\alpha) : F^{(\alpha)} \in B / S\}.
  S \leftarrow S \cup \{F^{(k+1)}\}, k \leftarrow k + 1.
  Update p(I) and I^{syn} with the FRAME algorithm.
Until d(\alpha) < \epsilon.
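The selection step itself is a one-liner once the histograms are available: pick the unselected filter whose observed and synthesized histograms disagree most. A minimal sketch with toy histogram data (the filter names and numbers are illustrative, not from the paper):

```python
# Toy bank: filter name -> (H_obs, H_syn), both already normalized.
bank = {
    "laplacian": ([0.5, 0.3, 0.2], [0.2, 0.3, 0.5]),
    "gaussian":  ([0.4, 0.3, 0.3], [0.35, 0.3, 0.35]),
    "identity":  ([0.3, 0.4, 0.3], [0.3, 0.4, 0.3]),
}

def d(name):
    # d(alpha) = (1/2) * L1 distance between synthesized and observed.
    H_obs, H_syn = bank[name]
    return 0.5 * sum(abs(a - b) for a, b in zip(H_syn, H_obs))

selected = []
chosen = max((f for f in bank if f not in selected), key=d)
selected.append(chosen)   # the filter with the largest histogram distance
```

The intuition: a filter whose statistics the current model already matches carries no new information, so the algorithm greedily adds the most "surprising" filter first.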
Reported Results: K = 0, 1, 2, 3, 6 filters
Reported Results: histograms and Lagrange multipliers for subband images
Graphically, we have
Overview
We now give a brief review of a follow-up paper by Song Chun Zhu and Xiuwen Liu, "Learning in Gibbsian Fields: How Fast and How Accurate Can It Be?" [IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 24, No. 7, July 2002].
The authors identify two major issues in Gibbsian learning:
1. the efficiency of likelihood functions, and
2. the variance in approximating partition functions using Monte Carlo integration.
This paper proposes three algorithms for learning Gibbs distribution parameters (Gibbsian learning):
1. a maximum partial likelihood estimator,
2. a maximum patch likelihood estimator, and
3. a maximum satellite likelihood estimator.
They find that these algorithms have different benefits and drawbacks, but generally outperform standard MCMC Gibbsian learning. They claim that the third algorithm offers the best trade-off between accuracy and speed of estimation.
The Common Framework of Gibbsian Learning
The authors identify two choices that need to be made in the Gibbsian learning problem:
1. The number, sizes, and shapes of the foreground patches S_i and corresponding backgrounds \partial S_i, i = 1, 2, \dots, M.
2. The reference models used to estimate the partition functions.
Choice 1: The foreground and background
The foreground pixels S_i and corresponding backgrounds \partial S_i, i = 1, 2, \dots, M, are shown in light and dark shading, respectively. (a)-(c) are m \times m patches. In one extreme, the log-likelihood G in (a) chooses m = N - 2w and is used in MCMCMLE methods. The other extreme in (c) chooses m = 1, and G is the pseudo-likelihood used in MPLE. The midpoint is shown in (b), where G is the log-patch-likelihood. The choice in (d) has M = 1 irregular patch, S_1, with pixels randomly selected; the rest of the lattice is the background \partial S_1, and G is the log-partial-likelihood. In (b) and (c) patches are allowed to overlap.
Choice 2: Reference model for estimation of Z
Now we need to estimate Z(I^{obs}_{\partial S_i}) for each \{S_i\}_{i=1}^{M} by Monte Carlo integration, using a reference model at \Lambda = \Lambda_0:

Z_\Lambda(I^{obs}_{\partial S_i}) \approx Z_{\Lambda_0}(I^{obs}_{\partial S_i}) \cdot \frac{1}{L} \sum_{j=1}^{L} e^{-\langle \Lambda - \Lambda_0,\, h(I^{syn}_{ij} \mid I^{obs}_{\partial S_i}) \rangle},

where \{I^{syn}_{ij}\}_{j=1}^{L} are typical samples of the reference model. The log-likelihood can then be estimated iteratively by gradient descent. In the accompanying figure, the dashed line shows the inverse Fisher information and the solid curves show the variance in a sequence of models approaching the true parameter value.
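The identity behind this estimate is the importance-sampling ratio Z(\Lambda)/Z(\Lambda_0) = E_{\Lambda_0}[e^{-\langle \Lambda - \Lambda_0, h(x) \rangle}]. The sketch below checks it on a tiny one-parameter model where Z can also be computed exactly (the state space, feature \phi(x) = x, and parameter values are illustrative assumptions):

```python
import math
import random

random.seed(2)
xs = [0, 1, 2, 3]              # tiny state space (illustrative)
phi = lambda x: float(x)
lam0, lam = 0.0, 0.7           # reference and target parameters (illustrative)

def Z(l):
    # Exact partition function by enumeration.
    return sum(math.exp(-l * phi(x)) for x in xs)

# The reference model at lam0 = 0 is uniform over xs.
w0 = [math.exp(-lam0 * phi(x)) for x in xs]
p0 = [w / sum(w0) for w in w0]

def draw():
    # Inverse-CDF sampling from the reference model.
    r, acc = random.random(), 0.0
    for x, p in zip(xs, p0):
        acc += p
        if r < acc:
            return x
    return xs[-1]

# Monte Carlo estimate of Z(lam) / Z(lam0) from L reference samples.
L = 200000
est = sum(math.exp(-(lam - lam0) * phi(draw())) for _ in range(L)) / L
ratio = Z(lam) / Z(lam0)       # exact value for comparison
```

The variance of this estimator grows quickly as \Lambda moves away from \Lambda_0, which is precisely why the choice of reference model matters in the paper.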
Algorithm 1: Maximizing partial likelihood (MPLE)
We choose S as in the figure by randomly selecting 1/3 of the pixels as foreground. The log-partial-likelihood is G = \log p(I^{obs}_{S_1} \mid I^{obs}_{S \setminus S_1}; \Lambda). Maximizing G by gradient descent, we update \Lambda iteratively. This is the same setup as in FRAME, although MPLE trades off accuracy (lower Fisher information) for speed (~25x) in a better way than FRAME. This is mainly due to FRAME's image synthesis under nontypical conditions (initializing I^{syn} to noise), whereas MPLE always has typical boundary conditions.
Algorithm 2: Maximizing patch likelihood (MPaLE)
The foreground is a set of overlapping patches from I^{obs}_S, and we dig a hole S_i in each patch, as in the figure. The patch likelihood is

G = \sum_{i=1}^{M} \log p(I^{obs}_{S_i} \mid I^{obs}_{S \setminus S_i}; \Lambda).

Maximizing G by gradient descent, we update \Lambda iteratively. Algorithms 1 and 2 have similar performance.
Algorithm 3: Maximizing satellite likelihood (MSLE)
In contrast to Algorithms 1 and 2, MSLE does not synthesize images online (within the learning algorithm), which is computationally intensive. We select a set of reference models in the exponential family: R = \{p(I; \Lambda_j) : \Lambda_j, j = 1, 2, \dots, s\}. Each model is sampled to synthesize a large image. The log-satellite likelihood is given by

G = \sum_{j=1}^{s} G^{(j)}(\Lambda; \Lambda_j), \quad G^{(j)}(\Lambda; \Lambda_j) = \sum_{i=1}^{M} \log \frac{e^{-\langle \Lambda,\, h(I^{obs}_{S_i} \mid I^{obs}_{S \setminus S_i}) \rangle}}{\hat{Z}^{(j)}_i},

where

\hat{Z}^{(j)}_i = Z_{\Lambda_j}(I^{obs}_{\partial S_i}) \cdot \frac{1}{L} \sum_{\ell=1}^{L} e^{-\langle \Lambda - \Lambda_j,\, h(I^{syn}_{j\ell} \mid I^{obs}_{\partial S_i}) \rangle}

is estimated by Monte Carlo integration. In the above, the index 1 \le \ell \le L runs over the different realizations of the reference models; 1 \le j \le s runs over the different models; and 1 \le i \le M runs over the foreground lattices. Maximizing G by gradient descent, we update \Lambda iteratively.
Reported results: FRAME used as truth
Results
Top row: the difference between the two MSLE-synthesized images is that result (b) ignores all boundary conditions, whereas (c) uses observed boundary conditions.
Bottom row: \Lambda was learned with MSLE for different hole sizes: (a) m = 2; (b) m = 6; and (c) m = 9.
Summary of Algorithms
Group 1. In (a), ML estimators (FRAME, MPLE, MPaLE, MCMCMLE) generate a sequence of satellites \Lambda_0, \Lambda_1, \Lambda_2, \dots, \Lambda_k online.
Group 2. In (c), the maximum pseudo-likelihood uses a uniform model \Lambda_0 = 0 to estimate any model and thus has large variance.
Group 3. In (b), the MSLEs use a general set of satellites which are precomputed and sampled offline. To save time, one can compute the differences d(j) = |h(I^{syn}_j) - h(I^{obs})|; the index values that return the smallest s values correspond to satellites that are closer to the truth.
THANK YOU!