
Sketches with Curvature:

The Curve Indicator Random Field

and Markov Processes

Jonas August

Robotics Institute

Carnegie Mellon University

Pittsburgh, PA

[email protected]

Steven W. Zucker

Center for Computational Vision and Control

Yale University

New Haven, CT

[email protected]

September 10, 2002


Abstract

There are two fundamental aspects in the contour enhancement of noisy and low-contrast curve images such as edge or line operator responses. The first is the inference mechanism and the second is the order of the curve model. To obtain an inference mechanism, we introduce a formal model of sketches called the curve indicator random field, whose role is to provide a basis for defining edge likelihood models. For curves modeled with stationary Markov processes, this ideal edge prior is non-Gaussian, and its moment generating functional has a form closely related to the Feynman-Kac formula. It leads to a nonlinear, minimum mean squared error contour enhancement filter that requires the solution of two elliptic partial differential equations. As a contour model, we introduce a Markov process model for contour curvature, and then analyze the distribution of such curves. Its mode is the Euler spiral, a curve minimizing changes in curvature. Example computations using the contour enhancement filter with the curvature-based contour model are provided, highlighting how the filter is curvature-selective even when curvature is absent in the input.


1 Introduction

Imagine the situation of a bug trying to “track” the contour in Fig. 1. Suppose the bug is

special in that it can only “search” for its next piece of contour in a cone in front of it centered

around its current predicted position and direction (i.e., orientation with polarity) [51, 13].

This strategy is appropriate so long as the contour is relatively straight. However, when the

bug is on a portion of the contour veering to the right, it will constantly waste time searching

to the left, perhaps even mistracking completely if the curvature is too large. In estimation

terms, the errors of our searching bug are correlated, a clue that assuming the contour is

straight is biased. A good model would only lead to an unavoidable uncorrelated error, as in

the principle of orthogonality [17]. We present a Markov process that models not only the

contour’s direction, but also its local curvature.

It may appear that one may avoid these problems altogether by allowing a higher bound

on curvature. However, this forces the bug to spend more time searching in a larger cone. In

stochastic terms, this larger cone amounts to asserting that the current (position, direction)

state has a weaker influence on the next state; in other words, the prior on contour shape

is weaker (less peaked or broader). But a weaker prior will be less able to counteract a

weak likelihood (high noise): it will not be robust to noise. Thus we must accept that good

continuation models based only on contour direction are forced to choose between allowing

high curvature or high noise; they cannot have it both ways.1

Although studying curvature is hardly new in vision, modeling it probabilistically began

only recently with Zhu’s empirical model of contour curvature [52]. In [6, 53, 26] and [15,

361 ff.], measuring curvature in images was the problem addressed. In [25], curvature is used

for smooth interpolations, following on the work on elastica in [44, 19] and later [32]. The

¹Observe in the road-tracking examples in [13] how all the roads have fairly low curvature. While this is realistic in flat regions such as the area of France considered, other regions, more mountainous perhaps, have roads that wind through the hillsides.

1

Page 4: Sketches with Curvature: The Curve Indicator Random Field and …pdfs.semanticscholar.org/49c4/eba05a93838b1b8eff9bf21bfd... · 2017-10-23 · The Curve Indicator Random Field and

Figure 1: (Top) Mistracking without curvature. A bug (grey dot) attempts to track the contour, "looking" in the cone of search directions centered around its current direction. At point (a), the curve is straight and the bug is successful, although at (b), the curve is veering to the right and the bug can barely still track. At (c), the curvature is so high that tracking fails. A better model would explicitly include the curvature of the contour, giving rise to a "bent" search cone (d) for the bug. The same difficulty arises in contour enhancement, which is the application considered in this paper. (Middle) When a (visual) image of a curve is formed, the parameterization used to define the curve as a mapping t ↦ Rt from an interval [0, T] to the plane (left) is lost. The result is a scalar function on the plane, where "height" U indicates the existence of the curve. Since the original (parameterized) curve is uncertain, we call this scalar function the curve indicator random field, which formalizes the intuitive idea of a sketch of a curve. Similar remarks apply when the plane (ℝ²) is embellished with direction (ℝ² × S¹) or curvature as well (ℝ² × S¹ × ℝ), which is the case in this paper. (Bottom) The distinction between a parameterized curve and its sketch is crucial for defining contour observation models (likelihoods). In Kalman filtering, one must cope with a noisy curve, where the noise is added to contour points. On the other hand, contour enhancement must confront a noisy sketch of the curve, where the cirf is the entity to which noise is added.


closest work in spirit to this is relaxation labeling [54], several applications of which include a

deterministic form of curvature [37, 21]. Markov random fields for contour enhancement using

orientation [29] and co-circularity [18, 45] have been suggested, but these have no complete

stochastic model of individual curves. The explicit study of stochastic but direction-only

models of visual contours was initiated by Mumford [32] and has been an extended effort of

Williams and co-workers [47, 48], culminating in a model allowing variable-speed and hence

indirectly supporting a notion of curvature [49].

The objective motivating the study of higher-order curve models is the enhancement of

contours in noisy images. We begin by obtaining a local image measurement m(r) ∈ ℝ at each r in R. Depending on whether the measurements depend on curvature κ in addition to position (x, y) and direction θ, we let r be (x, y, θ) or (x, y, θ, κ) and R be ℝ² × S¹ or ℝ² × S¹ × ℝ, respectively. Typically, these measurements are the result of applying an edge or line operator to an image. To emphasize the spatial relationships among these measurements, we think of the set of measurements as a field m = m(·) : R → ℝ. Clearly these measurements are

uncertain, and therefore m is a realization of a measurement random field M . To relate

these measurements to curves, consider M as a corrupted form of some underlying curve

indicator random field U , which is unknown. In this setup, the problem of organizing curve

elements becomes one of estimating U in a statistical inference framework. In particular, we

seek a minimum mean squared error estimate u of the curve indicator random field (cirf)

U , given the observed measurement field m. In this Bayesian setting, our task amounts to

computing the posterior mean of U given m, where p(M = m|U = u) is the likelihood of

measuring m given a particular curve field u, and P{U ∈ du} is the prior (a generalized

probability density) on U . The likelihood can be used to model various distortions in the

edge measurements, such as blur and noise; such imperfections have been difficult to express

in discrete, sparse forms of low-level perceptual organization. Our prior provides an exact

notion of an ideal edge or line map, while capturing probable patterns in images of contours,


such as good continuation and proximity (Fig. 2).
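As a toy illustration of the posterior-mean computation described above (not the paper's filter, which requires solving elliptic pdes), consider a single site whose ideal "ink" level U is either 0 or τ and whose measurement is M = U plus Gaussian noise; all names and parameter values below are hypothetical:

```python
import numpy as np

# Toy MMSE estimate at one site: the "ink" level U is either 0 (no curve)
# or tau (curve present) with prior P(U = tau) = p1, and the measurement
# is M = U + Gaussian noise of standard deviation sigma.
def posterior_mean(m, tau=1.0, p1=0.2, sigma=0.5):
    l0 = np.exp(-m**2 / (2 * sigma**2))           # likelihood p(m | U = 0)
    l1 = np.exp(-(m - tau)**2 / (2 * sigma**2))   # likelihood p(m | U = tau)
    w1 = p1 * l1 / (p1 * l1 + (1 - p1) * l0)      # posterior P(U = tau | m)
    return w1 * tau                               # posterior mean E[U | M = m]
```

A strong measurement drives the estimate toward τ, a near-zero one toward 0, with the prior breaking ties, which is the qualitative behavior the cirf prior provides at every site jointly.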

We define our ideal edge/line map prior, the underlying random field U , in two steps.

First, we require a Markov process model of the unknown individual curves, characterized

by its (infinitesimal) generator L. In this paper we emphasize two such contour models:

(a) the direction process of Mumford and (b) the curvature process introduced here. Second,

we seek some means of converting the parameterized curves, each a function of time, to a

field, which is a function of space r. For this we define the cirf as an ideal sketch of such

random curves, enabling us to formulate the prior U in a manner directly comparable to edge

or line operator responses M (Fig. 1). The cirf is an intermediary, or, in the terminology

of the EM algorithm [5], a latent variable, that facilitates the transition from curve models

to measured edge/line maps. Although we cannot observe a parametrized curve, we can

observe its corrupted sketch (e.g., edge responses); the cirf is an explicit model of the sketch

which was corrupted. Similar models to the cirf include the line-process of Geman and

Geman [14] and the curve layer of Tu and Zhu [42]. By making the connection from (1-

dimensional) processes to (multi-dimensional) fields, the cirf is the first explicit random

field model of images of contours where the statistics of the field, including its moment

generating functional, are a derivable consequence of those of the contour model (i.e., its

generator). We begin by considering the first step—modeling individual curves—in more

detail.

This paper is organized as follows. We first introduce a curvature random process and

its diffusion equation; we then present example impulse responses, which act like the “bent”

search cones. Second, we relate the mode of the distribution for the curvature process to

the Euler-spiral energy functional on smooth curves, by generalizing the same connection

between a rougher direction process curve model and elasticae [32]. Next we review some

Markov process theory and then develop the theory of the cirf, our formal model of an

ideal sketch [3, 1] that provides a bridge between the parametrized curves, e.g., realizations


Figure 2: The curve indicator random field as an ideal edge/line map. Contour structure in natural images (top row: "Lenna," angiogram, and possible ice cracks on Jupiter moon Europa) appears noisy, blurry, and imprecise in local edge/line maps (second row: logical/linear edge and line operator responses M(x, y, θ) [22]). We regard such measurements M as a corrupted rendition of an ideal edge/line map that we call the curve indicator random field (cirf) U; our model exhibits no response away from the contours and a large response along them. The loci of nonzero response are modeled using a Markov process with local tangent information (second last row: samples of direction process cirf under various parameter settings) or with curvature as well for greater smoothness (bottom row: samples of curvature process cirf under various parameter settings). The contours in the cirf samples shown (bottom two rows) differ in number, length, and smoothness. We thus view the perceptual organization of imperfect edge/line maps as the statistical estimation of the underlying cirf.


of the direction and curvature processes, that we cannot directly observe and the corrupt

sketches, e.g., edge/line maps, that we can. Finally, we derive a minimum mean squared

error filter for enhancing image contours; example computations reveal greater enhancement

in contours with high curvature using the curvature process than with the direction process.

In the appendix we compare the correlations of the cirf to empirical edge correlations from

some natural images.

2 A Brownian Motion in Curvature

Recall that a planar curve is a function taking a parameter t ∈ ℝ to a point (x(t), y(t)) in the plane ℝ². Its direction θ is defined via (ẋ, ẏ) = (cos θ, sin θ), where the dot denotes differentiation with respect to the arc-length parameter t (ẋ² + ẏ² = 1 is assumed). Curvature κ is equal to θ̇, the rate of change of direction.
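These definitions can be checked numerically: given samples of an arc-length-parameterized curve, direction and curvature follow by finite differencing. A minimal sketch (the function name and tolerances are ours):

```python
import numpy as np

def direction_and_curvature(x, y, dt):
    """Estimate direction theta and curvature kappa of an
    arc-length-parameterized curve sampled at spacing dt."""
    dx = np.gradient(x, dt)                 # x-dot
    dy = np.gradient(y, dt)                 # y-dot
    theta = np.unwrap(np.arctan2(dy, dx))   # direction, unwrapped past +/- pi
    kappa = np.gradient(theta, dt)          # curvature = rate of change of direction
    return theta, kappa

# Check on a circle of radius 5, traversed by arc length:
# curvature should be 1/5 everywhere.
R, dt = 5.0, 0.01
t = np.arange(0.0, 2 * np.pi * R, dt)
x, y = R * np.cos(t / R), R * np.sin(t / R)
theta, kappa = direction_and_curvature(x, y, dt)
```

The `np.unwrap` call matters: raw `arctan2` values jump by 2π, which would otherwise produce spurious curvature spikes.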

Now we introduce a Markov process that results from making curvature a Brownian

motion. Let R(t) = (X, Y, Θ, K)(t) be random², with realization r = (x, y, θ, κ) ∈ ℝ² × S¹ × ℝ. Consider the following stochastic differential equation:

Ẋ = cos Θ,  Ẏ = sin Θ,  Θ̇ = K,  dK = σ dW,

where σ = σκ is the “standard deviation in curvature change” (see §3) and W denotes

standard Brownian motion. The corresponding Fokker-Planck partial differential equation

(pde), describing the diffusion of a particle’s probability density, is

∂p/∂t = (σ²/2) ∂²p/∂κ² − cos θ ∂p/∂x − sin θ ∂p/∂y − κ ∂p/∂θ    (1)
      = (σ²/2) ∂²p/∂κ² − (cos θ, sin θ, κ, 0) · ∇p,

²Capitals will often be used to denote random variables, with the corresponding letter in lower case denoting a realization. However, capitals are also used to denote operators later in the paper.


where p = p(x, y, θ, κ, t) = p(R(t) = r|R(0) = r0), the conditional probability density that

the particle is located at r at time t given that it started at r0 at time 0. Observe that this

pde describes probability transport in the (cos θ, sin θ, κ, 0)-direction at point r = (x, y, θ, κ),

and diffusion in κ. Note there is no transport in the κ-direction; in the θ-direction, the rate

of transport is proportional to the curvature κ, which agrees with the basic result that κ is

the rate of change of θ along a smooth curve. An extra decay term [32, 47] is later included

to penalize length (see §3). We have solved this parabolic equation by first analytically

integrating the time variable and then discretely computing the solution to the remaining

elliptic pde. Details are reported in [1]. See Fig. 3 for example time-integrated transition

densities.
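A sample-path view of the same diffusion: the stochastic differential equation above can also be simulated directly by Euler-Maruyama, with the lifetime drawn from the exponential length distribution of §3. A minimal sketch (parameter defaults are illustrative, not the paper's grid settings):

```python
import numpy as np

def sample_curvature_process(sigma=0.01, lam=10.0, dt=0.1,
                             r0=(0.0, 0.0, 0.0, 0.0), rng=None):
    """Euler-Maruyama sample path of the curvature process
    dX = cos(Theta) dt, dY = sin(Theta) dt, dTheta = K dt, dK = sigma dW,
    run for an exponentially distributed lifetime of mean lam."""
    rng = np.random.default_rng() if rng is None else rng
    T = rng.exponential(lam)                 # contour length ~ Exp(mean lam)
    n = max(int(T / dt), 1)
    x, y, th, k = r0
    path = [(x, y, th, k)]
    for _ in range(n):
        x += np.cos(th) * dt
        y += np.sin(th) * dt
        th += k * dt
        k += sigma * np.sqrt(dt) * rng.standard_normal()  # Brownian increment in kappa
        path.append((x, y, th, k))
    return np.array(path)
```

With σ = 0 and zero initial curvature the path degenerates to a straight line, matching the intuition that the randomness enters only through curvature changes.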

3 What is the Mode of the Distribution of the Curvature Random Process?

To get more insight into our random process in curvature, consider one of the simplest aspects

of its probability distribution: its mode. First, let us consider the situation for Mumford’s

random process³ in ℝ² × S¹, called the direction process, which satisfies

Ẋ = cos Θ,  Ẏ = sin Θ,  dΘ = σκ dW,

where σκ is the “standard deviation in curvature” and W is standard Brownian motion. In

other words, the direction process is a random model of a planar curve whose curvature θ̇ is white noise of variance σκ². This process has the following Fokker-Planck diffusion equation:

∂p/∂t = (σκ²/2) ∂²p/∂θ² − cos θ ∂p/∂x − sin θ ∂p/∂y,    (2)

where p = p(x, y, θ, t) is the transition density for time t. As Mumford has shown [32], the

mode of the distribution of this direction process is described by elastica, or planar curves

³ℝ² × S¹ is also called (x, y, θ)-space, the unit tangent bundle, and orientation space [23].


(Panel labels — top row: κ0 = −0.2, −0.1, 0, 0.1, 0.2; bottom row: θ0 = 0, 33.)

Figure 3: (Top) Curvature diffusions for various initial curvatures. For all cases the initial position of the particle is an impulse centered vertically on the left, directed horizontally to the right. Shown is the time integral of the transition density of the Markov process in curvature (1), integrated over direction θ and curvature κ; therefore, the brightness displayed at position (x, y) indicates the expected time that the particle spent in (x, y). (Only a linear scaling is performed for displays in this paper; no logarithmic or other nonlinear transformation in intensity is taken.) Observe that the solution "veers" according to curvature, as sought in the Introduction. Contrast this with the straight search cone in Fig. 3. The pde was solved on a discrete grid of size 32 × 32 × 32 × 5, with σκ = 0.01 and an exponential decay of characteristic length λ = 10 (see §3 for length distribution). (Bottom) Solutions to the direction process diffusion. Equation (2), minus a decay term on the right to penalize contour length, was integrated over time and then solved for a slightly blurred impulse on an 80 × 80 × 44 grid, with parameters σκ = 1/24, λ = 100, and at discrete directions 0 (left) and 4 (right). Depicted is the integral over θ, cropped slightly. The method used [1] responds accurately at all directions. Note that these responses are straight, analogous to the search cone described in the Introduction. Given their initial direction, particles governed by the direction process move roughly straight ahead, in contrast to those described by our curvature process (Fig. 3).


that minimize the following functional:

∫ (βκ² + α) dt,    (3)

where β and α are nonnegative constants. With such a succinct expression for the mode

of the Mumford process, we seek a corresponding functional for the curvature process. We

follow a line of analysis directly analogous to Mumford [32]. First, we discretize our random

curve into N subsections. Suppose our random curve from the curvature process has length

T, distributed with the exponential density p(T) = λ⁻¹ exp(−λ⁻¹T), and independent of

the shape of the contour. Each step of the N -link approximation to the curve has length

∆t := T/N . Using the definition of the t-derivatives, for example,

Ẋ = dX/dt = lim_{N→∞} (Xi+1 − Xi)/(T/N),

we make the approximation Xi+1 ≈ Xi + ∆t Ẋ. Recalling the stochastic differential equation

(1), we therefore let the curvature process be approximated in discrete time by

Xi+1 = Xi + ∆t cos Θi, Yi+1 = Yi + ∆t sin Θi, Θi+1 = Θi + ∆tKi,

where i = 1, . . . , N . Because Brownian motion has independent increments whose standard

deviation grows with the square root √∆t of the time increment ∆t, the change in curvature for the discrete process becomes

Ki+1 = Ki + √∆t εi,

where the εi are independent and identically distributed 0-mean Gaussian random variables of standard deviation σ = σκ. Let the discrete contour be denoted by

ΓN = {(Xi, Yi, Θi, Ki) : i = 0, . . . , N}.

Given an initial point p0 = (x0, y0, θ0, κ0), the probability density for the other points is

p(ΓN | p0) = λ⁻¹ exp(−λ⁻¹T) · (√(2π) σ)⁻ᴺ exp(−∑i εi² / (2σ²))
           ∝ exp[ −∑i (1/(2σ²)) ((κi+1 − κi)/∆t)² ∆t − λ⁻¹T ].

9

Page 12: Sketches with Curvature: The Curve Indicator Random Field and …pdfs.semanticscholar.org/49c4/eba05a93838b1b8eff9bf21bfd... · 2017-10-23 · The Curve Indicator Random Field and

We immediately recognize (κi+1 − κi)/∆t as an approximation to dκ/dt = κ̇, and so we conclude that

p(ΓN | p0) → p(Γ | p0) ∝ e^{−E(Γ)} as N → ∞,

where the energy E(Γ) of the (continuous) curve Γ is

E(Γ) = ∫ (βκ̇² + α) dt,    (4)

where β = (2σ²)⁻¹ and α = λ⁻¹.

Maximizers of the distribution p(Γ) for the curvature random process are planar curves

that minimize the energy functional E(Γ). When α = 0 (when there is no penalty on length), such curves are known as Euler spirals, and have been studied recently in [25].

A key aspect of the Euler spiral functional (4) is that it penalizes changes in curvature,

preferring curves with slowly varying curvature. In contrast, the elastica functional (3)

penalizes curvature itself, and therefore allows only relatively straight curves, to the dismay

of the imaginary bug in the Introduction.
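The contrast between the two functionals is visible in a discrete version of (4): with curvature sampled at spacing ∆t, the energy penalizes squared differences of curvature plus a length term. In the sketch below (function name and test profiles are ours), a curvature profile that is linear in arc length, as on an Euler spiral, has lower energy than one with the same trend but oscillating curvature:

```python
import numpy as np

def discrete_energy(kappa, dt, beta=1.0, alpha=0.0):
    """Discrete version of the Euler-spiral energy (4):
    sum_i beta * ((kappa[i+1] - kappa[i]) / dt)^2 * dt + alpha * length."""
    kdot = np.diff(kappa) / dt          # discrete kappa-dot
    return beta * np.sum(kdot**2) * dt + alpha * len(kappa) * dt

dt = 0.01
t = np.arange(0.0, 1.0, dt)
kappa_spiral = 0.5 + 2.0 * t                          # linear in arc length (Euler spiral)
kappa_wiggly = 0.5 + 2.0 * t + 0.1 * np.sin(20 * t)   # same trend, oscillating curvature
```

Note that both profiles have large curvature values; only the wiggly one is penalized more, whereas the elastica energy (3) would penalize both heavily.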

4 The Curve Indicator Random Field

Given a Markov process model for the uncertain shape of a visual contour, we now introduce

the curve indicator random field (cirf), which formalizes the notion of an ideal sketch

of curves, and provides a precise definition for the kind of output we desire, e.g., from

an edge operator. Roughly, the curve indicator random field is non-zero-valued along the

true contours, and zero-valued elsewhere. This ideal sketch is important because it directly

expresses how an image of a curve is formed: intuitively as ink is deposited by a pen tracing

out the curve. The sketch thus captures what remains of a curve after its parameterization is

lost as its image is created. The sketch is also the first edge model that locally captures the

interactions between crossing curves as the “build-up of ink” at a point; handling crossings

with curve parameterizations is very difficult, requiring global information. The measured


edge/line map is then viewed as an imperfect cirf, i.e., a sketch corrupted by noise, blur, etc.

The goal of filtering, then, is to estimate the true cirf (which acts as a prior on sketches)

given the imperfect one. Before deriving our filter in §7, we now define and develop the

theory of the cirf, including an exact, intuitive characterization of the complete set of the

cirf statistics. In particular, we conclude with an expression for the moment generating

functional of the cirf that only requires the solution of a linear system. See [1] for more

detail and Norris’ text for background on Markov processes [35].

For generality, we shall define the curve indicator random field for any continuous-time,

stationary Markov process R : t ↦ Rt (for 0 ≤ t < T) taking values Rt in a finite (or at

most countable) set, or state space, I of cardinality |I|. As in §3, the random variable T is

exponentially-distributed with mean value λ > 0, and represents the length of a contour. To

ensure the finiteness of the expressions that follow, we assume λ < ∞. Sites or states within

I will be denoted i and j. (Think of I as a discrete approximation to the continuous state

space R = ℝ² × S¹ × ℝ (resp., ℝ² × S¹) where the curvature random process (resp., direction process) takes values.) Let 1{condition} denote the (indicator) function that takes the value 1 if condition is true, and the value 0 otherwise. With these notations we define the curve indicator random field V for a single curve to be

Vi := ∫₀ᵀ 1{Rt = i} dt,  ∀i ∈ I.

Observe that Vi is the (random) amount of time that the Markov process spent in state i.

In particular, Vi is zero unless the Markov process passed through site i. In the context

of Brownian motion or other symmetric processes, V is variously known as the occupation

measure or the local time of Rt [7, 8].

Generalizing to multiple curves, let the random number N of curves be Poisson distributed with average value N̄. Then choose N independent copies R⁽¹⁾, . . . , R⁽ᴺ⁾ of the Markov process R, with independent lengths T1, . . . , TN, each distributed as T. To define

11

Page 14: Sketches with Curvature: The Curve Indicator Random Field and …pdfs.semanticscholar.org/49c4/eba05a93838b1b8eff9bf21bfd... · 2017-10-23 · The Curve Indicator Random Field and

the (multiple curve) cirf, take the superposition of the single-curve cirfs V⁽¹⁾, . . . , V⁽ᴺ⁾ for the N curves.

Definition 1. The curve indicator random field U is defined to be

Ui := ∑_{n=1}^{N} Vi⁽ⁿ⁾ = ∑_{n=1}^{N} ∫₀^{Tn} 1{Rt⁽ⁿ⁾ = i} dt,  ∀i ∈ I.

Thus Ui is the total amount of time that all of the Markov processes spent in site i. Again,

observe that this definition satisfies our desiderata for an ideal edge/line map: (1) non-zero

value where the contours are, and (2) zero-value elsewhere. The probability distribution of

U will become our prior for inference.
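For a finite state space, the definition above can be sampled directly: draw a Poisson number of curves, give each an exponential lifetime, run each as a continuous-time Markov chain, and accumulate occupation times. A minimal sketch (the generator and parameters are toy choices, not the paper's discretized operators):

```python
import numpy as np

def sample_cirf(L, mu, lam, Nbar, rng):
    """Sample the curve indicator random field U on a finite state space.
    L: conservative generator matrix (rows sum to 0); mu: initial law;
    lam: mean contour length; Nbar: mean (Poisson) number of curves."""
    n = L.shape[0]
    U = np.zeros(n)
    for _ in range(rng.poisson(Nbar)):       # number of curves
        T = rng.exponential(lam)             # this curve's lifetime
        i = rng.choice(n, p=mu)              # initial state ~ mu
        t = 0.0
        while True:
            rate = -L[i, i]                  # total jump rate out of i
            hold = rng.exponential(1 / rate) if rate > 0 else np.inf
            stay = min(hold, T - t)          # time actually spent in state i
            U[i] += stay                     # accumulate occupation time
            t += stay
            if t >= T:
                break
            p = L[i].copy(); p[i] = 0.0; p /= rate   # jump distribution from i
            i = rng.choice(n, p=p)
    return U
```

Since each curve contributes its full lifetime to the total "ink," E[∑i Ui] = N̄λ, which gives a quick sanity check on the sampler.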

4.1 Statistics of the Curve Indicator Random Field

Probabilistic models in vision and pattern recognition have been specified in a number of

ways. For example, Markov random field models [14] are specified via clique potentials

and Gaussian models are specified via means and covariances. Here, instead of providing

the distribution of the curve indicator random field itself, we report its moment generating

functional, from which all moments can be computed straightforwardly.

Before doing so, we review some Markov process theory. Let the transition probability

pij(t) = P{Rt = j | R0 = i} be the probability that the Markov process R is in state j at time t given that it was in state i at time 0. Further, let the |I| × |I| transition probability matrix be P(t) = (pij(t)). Because of the stationarity of R, the transition probability pij(t) can be computed from its initial time derivative lij = (d/dt) pij(t)|_{t=0}. Specifically, letting the generator of the Markov process R be the matrix L = (lij), we have P(t) = e^{Lt}, the matrix exponential of the matrix Lt = (lij t). For the curvature process, we let L be a discretization of

the partial differential operator on the right hand side of (1), or

L ≈ (σκ²/2) ∂²/∂κ² − cos θ ∂/∂x − sin θ ∂/∂y − κ ∂/∂θ    (curvature process),

12

Page 15: Sketches with Curvature: The Curve Indicator Random Field and …pdfs.semanticscholar.org/49c4/eba05a93838b1b8eff9bf21bfd... · 2017-10-23 · The Curve Indicator Random Field and

and for the direction process, L is the discretization of the corresponding operator in (2), or

L ≈ (σκ²/2) ∂²/∂θ² − cos θ ∂/∂x − sin θ ∂/∂y    (direction process).
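One concrete and deliberately simple way to discretize such an operator on a periodic grid is upwind differencing for the transport terms and a central second difference for the diffusion term; this yields a matrix with nonnegative off-diagonal entries and zero row sums, i.e., a valid generator of an approximating jump process. This is a sketch of one discretization choice, not necessarily the scheme of [1]:

```python
import numpy as np

def direction_generator(nx, ny, nth, sigma):
    """Dense sketch of a discrete generator L for the direction process on a
    periodic nx x ny x nth grid (unit spatial spacing): upwind differences for
    the transport -cos(th) d/dx - sin(th) d/dy, central second difference in th."""
    n = nx * ny * nth
    L = np.zeros((n, n))
    idx = lambda ix, iy, it: (ix % nx) * ny * nth + (iy % ny) * nth + (it % nth)
    dth = 2 * np.pi / nth
    for ix in range(nx):
        for iy in range(ny):
            for it in range(nth):
                i = idx(ix, iy, it)
                th = it * dth
                cx, cy = np.cos(th), np.sin(th)
                # upwind transport: mass jumps toward the (cos th, sin th) direction
                L[i, idx(ix + (1 if cx > 0 else -1), iy, it)] += abs(cx)
                L[i, i] -= abs(cx)
                L[i, idx(ix, iy + (1 if cy > 0 else -1), it)] += abs(cy)
                L[i, i] -= abs(cy)
                # diffusion in theta: central second difference
                d = sigma**2 / (2 * dth**2)
                L[i, idx(ix, iy, it + 1)] += d
                L[i, idx(ix, iy, it - 1)] += d
                L[i, i] -= 2 * d
    return L
```

For realistic grids one would of course store L sparsely; the dense form here is only to make the stencil explicit.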

To include the exponential distribution over T (the lifetime of each particle), we construct

a killed Markov process with generator Q = L − λ⁻¹I. (Formally, we do this by augmenting the discrete state space I with a single "death" state. When t ≥ T, the process enters the death state and it cannot leave.) Slightly changing our notation, we shall now use Rt to mean the killed Markov process with generator Q. The Green's function matrix G = (gij) of the Markov process is the matrix ∫₀^∞ e^{Qt} dt = ∫₀^∞ P(t) e^{−t/λ} dt. The (i, j)-entry gij in the Green's function matrix G represents the expected amount of time that the Markov process spent in j before death, given that the process started in i. One can show that G = −Q⁻¹ by integrating the identity (d/dt) e^{Qt} = Q e^{Qt} from t = 0 to t = ∞.
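The identity G = −Q⁻¹ = ∫₀^∞ e^{Qt} dt is easy to verify numerically on a toy chain (the two-state generator below is an arbitrary example, and the matrix exponential is our own small helper):

```python
import numpy as np

def expm(A, terms=30):
    """Matrix exponential via scaling-and-squaring with a truncated Taylor series."""
    norm = np.abs(A).sum(axis=1).max()
    s = max(0, int(np.ceil(np.log2(norm))) + 1) if norm > 0 else 0
    B = A / 2.0**s
    E, term = np.eye(A.shape[0]), np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ B / k
        E = E + term
    for _ in range(s):              # undo the scaling by repeated squaring
        E = E @ E
    return E

lam = 2.0
Lgen = np.array([[-1.0, 1.0], [3.0, -3.0]])   # toy conservative generator
Q = Lgen - np.eye(2) / lam                    # killed generator Q = L - (1/lam) I

G = -np.linalg.inv(Q)                         # Green's function G = -Q^{-1}

# Riemann-sum approximation of int_0^infty e^{Qt} dt (horizon t = 40)
h, steps = 0.01, 4000
P = expm(Q * h)
S, term = np.zeros_like(Q), np.eye(2)
for _ in range(steps):
    S += term * h                             # term equals e^{Q n h} at step n
    term = term @ P
```

Both routes give the same matrix of expected occupation times, and all entries of G come out positive, as they must for expected times.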

5 Moments of the Single-Curve CIRF

Although we are interested in the statistics of the general curve indicator random field U , we

first consider the simpler, single-curve case, which we studied earlier in discrete-time [2]. The

first step (Prop. 1) is to derive all the moments of the single-curve cirf V . Then we shall

summarize this result as a moment generating functional (Prop. 2), which is then generalized

to the multiple-curve case (Prop. 4).

To prepare for studying these moments, we first recognize the rather pathological nature

of the cirf: realizations of this field are zero except along the curves. While we do not

develop this connection here, we observe that when the state space is a continuum, realizations of the analogous⁴ cirf would generally not even be continuous; hence we would be faced with generalized functions, or distributions (in the sense of Schwartz, not probability),

⁴Loosely, we obtain the cirf in the continuum by replacing the indicators above with Dirac δ-distributions [3].

13

Page 16: Sketches with Curvature: The Curve Indicator Random Field and …pdfs.semanticscholar.org/49c4/eba05a93838b1b8eff9bf21bfd... · 2017-10-23 · The Curve Indicator Random Field and

where the field is “probed” by taking inner products with (appropriately well-behaved) test

functions [38].

It is also convenient to use this distribution-theoretic formalism for studying the cirf

on our discrete space I. Let the inner product between vectors a and b in ℝ^{|I|} be ⟨a, b⟩ := ∑_{i∈I} ai bi. We use a bias vector c ∈ ℝ^{|I|} as a test function for "probing" V by taking an inner product:

⟨c, V⟩ = ∑_i ci ∫₀ᵀ 1{Rt = i} dt = ∫₀ᵀ (∑_i ci 1{Rt = i}) dt = ∫₀ᵀ c(Rt) dt,    (5)

where c(i) = ci. The integral ∫₀ᵀ c(Rt) dt is known as an additive functional of the Markov process Rt [11]. In the following, we let α := λ⁻¹ to simplify expressions. We also introduce a final weighting ν(R_{T−}) on the state of the curve just before death; ν can be used to encourage the curve to end in certain states over others.⁵ Let 𝔼_i Z denote the expected value of the random variable Z given that R0 = i, and let 𝔼_µ Z denote the same expectation except given that P{R0 = i} = µi, where µ is the distribution of the initial point of the curve. To reduce the clutter of many brackets we adopt the convention that the expectation operator applies to all multiplied (functions of) random variables to its right: e.g., 𝔼 f(X)g(Y) := 𝔼[f(X)g(Y)].

We now provide a formula for the moments of 〈c, V 〉, a “probed” single-curve cirf V . The

proof uses techniques that are used in statistical physics and in studying order-statistics.

Proposition 1. The k-th moment of ⟨c, V⟩ with initial distribution µ and final weighting ν = (ν(i)) = (νi), i ∈ I, is:

𝔼_µ ⟨c, V⟩^k ν(R_{T−}) = α k! ⟨µ, (GC)^k Gν⟩,    (6)

where C = diag c.

⁵Following [11], we use the notation T− to represent the left limit approaching T from below, i.e., ν(R_{T−}) = lim_{t↑T} ν(Rt).

14

Page 17: Sketches with Curvature: The Curve Indicator Random Field and …pdfs.semanticscholar.org/49c4/eba05a93838b1b8eff9bf21bfd... · 2017-10-23 · The Curve Indicator Random Field and

Proof. We first consider the case where µj = δ_{i,j}, and then generalize. Recall the density of the exponentially-distributed length T: pT(t) = α e^{−αt}. Substituting this and (5) into the left side of (6), we get:

𝔼_i ⟨c, V⟩^k ν(R_{T−}) = 𝔼_i (∫₀ᵀ c(Rt) dt)^k ν(R_{T−})
                      = α 𝔼_i ∫₀^∞ e^{−αt} (∫₀ᵗ c(Rt′) dt′)^k ν(Rt) dt,    (7)

where we have used the fact that:

∫ f(t−) dt = ∫ f(t) dt,

for piecewise continuous functions f with finitely many discontinuities in finite intervals.

We further note that:

(∫₀ᵗ c(Rt′) dt′)^k = ∫₀ᵗ · · · ∫₀ᵗ c(Rt1) · · · c(Rtk) dt1 · · · dtk
                   = k! ∫ · · · ∫_{0 ≤ t1 ≤ ··· ≤ tk ≤ t} c(Rt1) · · · c(Rtk) dt1 · · · dtk,    (8)

where the second line follows because the k-dimensional cube [0, t]^k decomposes into k! regions, one per ordering of the ti's, each having the same integral after relabeling the ti's appropriately. Taking integrals iteratively starting with respect to t1, the right side of (8) becomes, by induction in k:

k! ∫₀ᵗ ∫₀^{tk} · · · ∫₀^{t2} c(Rt1) · · · c(Rtk) dt1 · · · dtk.

The right side of (7) then becomes:
$$\alpha\, k! \int_0^\infty e^{-\alpha t} \int_0^t \int_0^{t_k} \cdots \int_0^{t_2} \sum_{i_1, \ldots, i_k, j} \mathbb{P}_i\{R_{t_1} = i_1, \ldots, R_{t_k} = i_k, R_t = j\}\, c(i_1) \cdots c(i_k)\, \nu(j)\, dt_1 \cdots dt_k\, dt$$
$$= \alpha\, k! \sum_{i_1, \ldots, i_k, j} c(i_1) \cdots c(i_k)\, \nu(j) \left[ \int_0^\infty e^{-\alpha t} \left\{ \int_0^t \int_0^{t_k} \cdots \int_0^{t_2} p_{i,i_1}(t_1)\, p_{i_1,i_2}(t_2 - t_1) \cdots p_{i_{k-1},i_k}(t_k - t_{k-1})\, p_{i_k,j}(t - t_k)\, dt_1\, dt_2 \cdots dt_{k-1}\, dt_k \right\} dt \right], \qquad (9)$$


using the Markovianity and stationarity of $R_t$. Recalling that the formula for the convolution of two functions $f$ and $g$ is
$$(f * g)(t) = \int_0^t f(\tau)\, g(t - \tau)\, d\tau,$$
we see that the expression in braces in (9) can be written as:
$$\int_0^t \cdots \int_0^{t_2} p_{i,i_1}(t_1)\, p_{i_1,i_2}(t_2 - t_1)\, dt_1 \cdots p_{i_k,j}(t - t_k)\, dt_k = \int_0^t \cdots \int_0^{t_3} (p_{i,i_1} * p_{i_1,i_2})(t_2)\, p_{i_2,i_3}(t_3 - t_2)\, dt_2 \cdots p_{i_k,j}(t - t_k)\, dt_k = (p_{i,i_1} * p_{i_1,i_2} * \cdots * p_{i_{k-1},i_k} * p_{i_k,j})(t),$$
which is a $k$-fold convolution, by induction in $k$. Now observe that the expression in brackets in (9) is the Laplace transform $\hat{h}(\alpha)$ with respect to $t$, evaluated at $\alpha$, of the expression $h(t)$ in braces. Therefore, by using the convolution rule of the Laplace transform $k$ times, the expression in brackets in (9) becomes:
$$\hat{p}_{i,i_1}(\alpha) \cdot \hat{p}_{i_1,i_2}(\alpha) \cdots \hat{p}_{i_{k-1},i_k}(\alpha) \cdot \hat{p}_{i_k,j}(\alpha).$$
But as shown earlier we know that $g_{i,j} = \int_0^\infty e^{-\alpha t} p_{i,j}(t)\, dt = \hat{p}_{i,j}(\alpha)$, and so we can write:

$$\mathbb{E}_i \langle c, V \rangle^k\, \nu(R_{T-}) = \alpha\, k! \sum_{i_1, \ldots, i_k, j} c(i_1) \cdots c(i_k)\, \nu(j)\, g_{i,i_1} g_{i_1,i_2} \cdots g_{i_{k-1},i_k} g_{i_k,j} \qquad (10)$$
$$= \alpha\, k! \sum_{i_1} g_{i,i_1} c(i_1) \left[ \sum_{i_2} g_{i_1,i_2} c(i_2) \cdots \left[ \sum_{i_k} g_{i_{k-1},i_k} c(i_k) \left( \sum_j g_{i_k,j}\, \nu(j) \right) \right] \cdots \right]$$
$$= \alpha\, k! \sum_{i_1} (GC)_{i,i_1} \left[ \sum_{i_2} (GC)_{i_1,i_2} \cdots \left[ \sum_{i_k} (GC)_{i_{k-1},i_k} (G\nu)_{i_k} \right] \cdots \right] = \alpha\, k!\, \left( (GC)^k G\nu \right)_i.$$
Since for any random variable $Z$ we have $\mathbb{E}_\mu Z = \sum_i \mathbb{P}\{R_0 = i\}\, \mathbb{E}[Z \,|\, R_0 = i] = \sum_i \mu_i\, \mathbb{E}_i Z$, the result follows. □
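As a concrete check of the identity $g_{i,j} = \int_0^\infty e^{-\alpha t} p_{i,j}(t)\, dt = (-Q^{-1})_{i,j}$ used above, the following sketch (ours, not part of the original development; it assumes a hypothetical two-state chain with unit flip rate) compares a numerically integrated Laplace transform of the transition function against the matrix inverse:

```python
import numpy as np

# Toy two-state Markov chain: generator L with unit flip rate,
# killed at rate alpha, so Q = L - alpha*I and G = -Q^{-1}.
alpha = 0.5                      # death rate; expected length lambda = 1/alpha
L = np.array([[-1.0, 1.0],
              [1.0, -1.0]])
Q = L - alpha * np.eye(2)
G = -np.linalg.inv(Q)            # Green's operator

# Closed-form transition function of this chain (eigenvalues 0 and -2):
# p_ii(t) = (1 + e^{-2t})/2, p_ij(t) = (1 - e^{-2t})/2 for i != j.
t = np.linspace(0.0, 60.0, 600001)   # fine grid; integrand decays fast
p_same = 0.5 * (1.0 + np.exp(-2.0 * t))
p_diff = 0.5 * (1.0 - np.exp(-2.0 * t))

def laplace_at_alpha(p):
    """Trapezoid-rule Laplace transform of p(t) evaluated at alpha."""
    integrand = np.exp(-alpha * t) * p
    dt = t[1] - t[0]
    return dt * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))

g_same = laplace_at_alpha(p_same)
g_diff = laplace_at_alpha(p_diff)
G_quad = np.array([[g_same, g_diff],
                   [g_diff, g_same]])
```

The row sums of $G$ also recover the expected lifetime $1/\alpha$, i.e., the expected total time spent by the killed chain over all sites.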


5.1 Moment Generating Functional of the Single-Curve CIRF: A Feynman-Kac Formula

To obtain a formula for the moment generating functional (mgf) of the single-curve cirf, we first define the Green's operator $G_c$ with (spatially-varying) bias (vector) $c$ as the Green's operator for the killed Markov process with extra killing $-c$, i.e., having generator $Q_c := Q + C$, where $C = \operatorname{diag} c = \operatorname{diag}(c_1, \ldots, c_{|I|})$. The bias $c$ behaves exactly opposite to the decay or death term $\alpha$ in $Q = L - \alpha I$: if $c_i$ is positive, there is a bias towards the creation of particles at site $i$; if negative, there is a bias towards killing them. Using the same proof as for $G = -Q^{-1}$, we know $G_c := -Q_c^{-1} = -(Q + C)^{-1}$. (Khas'minskii found an explicit condition for the invertibility of $Q + C$; see [24, 1].) We now compute the moment generating functional for the single-curve case using Prop. 1. This is known as the Feynman-Kac formula [7]. (For a recent and more general discussion of these ideas, see [11].)

Proposition 2. For all $c \in \mathbb{R}^{|I|}$ such that $|c|$ is sufficiently small,
$$\mathbb{E}_\mu \exp\langle c, V \rangle\, \nu(R_{T-}) = \alpha \langle \mu, G_c \nu \rangle.$$

Proof. Using the power series representation of the exponential, we write:
$$\mathbb{E}_\mu \exp\langle c, V \rangle\, \nu(R_{T-}) = \mathbb{E}_\mu \sum_{k=0}^\infty \frac{\langle c, V \rangle^k}{k!}\, \nu(R_{T-}) = \sum_{k=0}^\infty \mathbb{E}_\mu \langle c, V \rangle^k\, \nu(R_{T-}) / k! = \sum_{k=0}^\infty \alpha \langle \mu, (GC)^k G\nu \rangle, \qquad (11)$$
using Prop. 1. Recalling the fact that $\sum_{k=0}^\infty A^k = (I - A)^{-1}$ as long as $\|A\| < 1$, the right hand side of (11) becomes:
$$\alpha \left\langle \mu, \left[ \sum_{k=0}^\infty (GC)^k \right] G\nu \right\rangle = \alpha \langle \mu, (I - GC)^{-1} G\nu \rangle,$$
as long as (for some operator norm) $\|GC\| < 1$. Since $C = \operatorname{diag}(c_1, \ldots, c_{|I|})$ is diagonal, the matrix $GC$ is simply $G$ with the $i$-th column weighted by $c_i$. Therefore $\|GC\| < 1$ for $|c|$ sufficiently small. The result follows because $(I - GC)^{-1} = (Q^{-1}Q + Q^{-1}C)^{-1} = -(Q + C)^{-1} G^{-1}$, and hence $(I - GC)^{-1} G = -(Q + C)^{-1} = G_c$. □

Observe that to evaluate the Feynman-Kac formula, one must solve the linear system $(Q + C) h + \nu = 0$ for $h$. This equation will become a key component in our contour enhancement filter (§7).
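To make this evaluation concrete, the sketch below (ours; it reuses the hypothetical two-state chain with unit flip rate) solves $(Q + C) h + \nu = 0$ and checks that $\alpha \langle \mu, G_c \nu \rangle$ agrees with the moment series $\sum_k \alpha \langle \mu, (GC)^k G\nu \rangle$ from Prop. 1 for a small bias $c$:

```python
import numpy as np

alpha = 0.5
L = np.array([[-1.0, 1.0], [1.0, -1.0]])
Q = L - alpha * np.eye(2)
G = -np.linalg.inv(Q)

mu = np.full(2, 0.5)             # uniform initial distribution
nu = np.full(2, alpha)           # nu_i = 1/lambda, so <mu, G nu> = 1
c = np.array([0.1, -0.05])       # small bias so that ||GC|| < 1
C = np.diag(c)

# Evaluate the Feynman-Kac formula by solving (Q + C) h + nu = 0.
h = np.linalg.solve(Q + C, -nu)  # h = G_c nu
fk = alpha * (mu @ h)            # alpha <mu, G_c nu>

# Compare against the truncated moment series sum_k alpha <mu, (GC)^k G nu>.
series, term = 0.0, G @ nu
for _ in range(60):
    series += alpha * (mu @ term)
    term = G @ C @ term          # term becomes (GC)^{k+1} G nu
```

Since the spectral radius of $GC$ is well below 1 here, the series converges geometrically and matches the linear solve to machine precision.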

5.2 Initial and Final Weightings

As the interpretation of the "final weighting" $\nu$ above may seem mysterious, we now restrict $\mu$ and $\nu$ to be finite measures satisfying the normalization constraint $1 = \langle \mu, G\nu \rangle$. (If this equality is not satisfied, one need only divide by a suitable normalizing constant.)

Corollary 1. Suppose that the joint distribution over initial and final positions is $\mathbb{P}\{R_0 = i, R_{T-} = j\} = \mu_i g_{i,j} \nu_j$. Then the moment generating functional of $V$, with this joint distribution over initial and final states, is:
$$\mathbb{E} \exp\langle c, V \rangle = \langle \mu, G_c \nu \rangle. \qquad (12)$$

Remark 1. Although not studied here, it is interesting to consider the problem of finding those measures $\mu, \nu$ that induce a joint distribution $\mathbb{P}\{R_0 = i, R_{T-} = j\}$ with prescribed marginals $\mathbb{P}\{R_0 = i\}$ and $\mathbb{P}\{R_{T-} = j\}$ over initial and final states, respectively. By specifying these marginals we could control the contour endpoints for specialized applications, in medical imaging for example.

We shall forego answering this question now because in general-purpose contour enhancement we typically have no a priori preference for the start and end locations of each contour, and so we would set these measures proportional to the constant vector $\mathbf{1} = (1, \ldots, 1)$. One can show that by letting $\mu_i = |I|^{-1}$, $\nu_i = \lambda^{-1}$, $\forall i \in I$, the normalization constraint is satisfied. Before we begin the proof of Corollary 1, we state a basic result.


Lemma 1. If $X$ and $Z$ are random variables, and $X$ is discrete (i.e., $X$ can only take on one of an at most countable number of values $x$), then:
$$\mathbb{E}[Z \,|\, X = x] = \mathbb{E}\left[ Z\, \mathbf{1}\{X = x\} \right] / \mathbb{P}\{X = x\}.$$

Proof. We compute:
$$\mathbb{E}[Z \,|\, X = x] = \int z\, \mathbb{P}\{Z \in dz \,|\, X = x\} = \int z\, \mathbb{P}\{Z \in dz, X = x\} / \mathbb{P}\{X = x\}$$
$$= \sum_{x'} \int z\, \mathbf{1}\{x' = x\}\, \mathbb{P}\{Z \in dz, X = x'\} / \mathbb{P}\{X = x\} = \mathbb{E}\left[ Z\, \mathbf{1}\{X = x\} \right] / \mathbb{P}\{X = x\}. \;\square$$

Proof of Corollary 1. Observe that $\mu_i g_{i,j} \nu_j$ is indeed a distribution. We now compute the result:
$$\mathbb{E} \exp\langle c, V \rangle = \sum_{i,j} \mathbb{P}\{R_0 = i, R_{T-} = j\}\, \mathbb{E}[\exp\langle c, V \rangle \,|\, R_0 = i, R_{T-} = j]$$
$$= \sum_{i,j} \mu_i g_{i,j} \nu_j\, \mathbb{E}_i[\exp\langle c, V \rangle \,|\, R_{T-} = j] = \sum_{i,j} \mu_i g_{i,j} \nu_j\, \mathbb{E}_i\left[ \exp\langle c, V \rangle\, \mathbf{1}\{R_{T-} = j\} \right] / \mathbb{P}_i\{R_{T-} = j\},$$
using Lemma 1. But using Prop. 2 we see that:
$$\mathbb{P}_i\{R_{T-} = j\} = \mathbb{E}_i \exp\langle 0, V \rangle\, \mathbf{1}\{R_{T-} = j\} = \alpha\, g_{i,j},$$
and therefore:
$$\mathbb{E} \exp\langle c, V \rangle = \alpha^{-1} \sum_{i,j} \mu_i \nu_j\, \mathbb{E}_i\left[ \exp\langle c, V \rangle\, \mathbf{1}\{R_{T-} = j\} \right] = \alpha^{-1}\, \mathbb{E}_\mu \left[ \exp\langle c, V \rangle \left( \sum_j \nu_j\, \mathbf{1}\{R_{T-} = j\} \right) \right] = \alpha^{-1}\, \mathbb{E}_\mu \left[ \exp\langle c, V \rangle\, \nu(R_{T-}) \right].$$
The result follows after another application of Prop. 2 to the above expectation. □


The next corollary gives all of the (joint) moments of $V$. Let $\mathrm{perm}_k$ denote the set of permutations of the integers $1, \ldots, k$.

Corollary 2. If $k \ge 1$, the $k$-th (joint) moment of $V$ at sites $i_1, \ldots, i_k$ is:
$$\mathbb{E}\, V_{i_1} \cdots V_{i_k} = \sum_{i,j} \mu_i \nu_j \sum_{a \in \mathrm{perm}_k} g_{i, i_{a_1}}\, g_{i_{a_1}, i_{a_2}} \cdots g_{i_{a_{k-1}}, i_{a_k}}\, g_{i_{a_k}, j}. \qquad (13)$$

Proof. Take partial derivatives of (12) with respect to $c_{i_1}, \ldots, c_{i_k}$ and evaluate them at $c = 0$. The only nonzero terms come from differentiating an expression proportional to (10). □
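The permutation formula (13) can be verified directly on a small chain. The sketch below (ours; a toy two-state example) implements (13) with `itertools.permutations` and compares the $k = 2$ moment against a central finite difference of the moment generating functional $\langle \mu, G_c \nu \rangle$:

```python
import numpy as np
from itertools import permutations

alpha = 0.5
L = np.array([[-1.0, 1.0], [1.0, -1.0]])
Q = L - alpha * np.eye(2)
G = -np.linalg.inv(Q)
mu = np.full(2, 0.5)
nu = np.full(2, alpha)           # normalization <mu, G nu> = 1

def moment(sites):
    """k-th joint moment E V_{i1}...V_{ik} via the permutation formula (13)."""
    total = 0.0
    for a in permutations(sites):
        # chain of Green's factors g_{i,a1} g_{a1,a2} ... g_{ak,j},
        # summed over initial i (weight mu) and final j (weight nu)
        term = mu @ G[:, a[0]]                  # sum_i mu_i g_{i,a1}
        for s, s_next in zip(a, a[1:]):
            term *= G[s, s_next]
        total += term * (G[a[-1], :] @ nu)      # sum_j g_{ak,j} nu_j
    return total

def mgf(c):
    """<mu, G_c nu> with G_c = -(Q + diag c)^{-1}."""
    return mu @ np.linalg.solve(Q + np.diag(c), -nu)

# Central finite difference for the second mixed derivative of the mgf
# at c = 0, which approximates E V_0 V_1.
eps = 1e-4
def e(i):
    v = np.zeros(2); v[i] = eps; return v
fd = (mgf(e(0) + e(1)) - mgf(e(0) - e(1))
      - mgf(-e(0) + e(1)) + mgf(-e(0) - e(1))) / (4 * eps**2)
```

For $k = 1$, (13) collapses to $(G^*\mu)_i (G\nu)_i$, which the test below also checks.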

In our earlier work on the curve indicator random field for discrete-time Markov processes [2], we arrived at a similar result, except under the condition that the sites $i_1, \ldots, i_k$ are distinct. By observing the connection to the Feynman-Kac formula, we have overcome this limitation and can now summarize the single-curve cirf with its moment generating functional. The weighting over final states (the other end of the contour) is also new.

6 Multiple-Curve Moment Generating Functional

In order to model more than one curve in an image, we need a joint distribution over both the number of curves and the curves (and the corresponding cirfs) themselves. To make our computations concrete, we adopt a Poisson distribution over the number $N$ of curves, and assume conditional independence of the curves given $N$. To compute the moment generating functional of this (multiple-curve) cirf as a Poisson distribution over (single-curve) cirfs, we first consider the general case of a Poisson "point" process given a distribution over each point, where a point will be interpreted as an entire single-curve cirf. This use of the Poisson distribution is based on [7], but our presentation is more elementary.


6.1 The Poisson Measure Construction

We begin with a finite measure⁶ $P : \mathcal{F} \to \mathbb{R}_+$ over the measure space $(\Omega, \mathcal{F})$, where $\mathcal{F}$ is a $\sigma$-algebra and $\mathbb{R}_+$ denotes the nonnegative real numbers. Intuitively, the finite measure $P$ is the (un-normalized) distribution over "points" $\omega \in \Omega$, where in this paper $\omega$ is a curve realization (i.e., a Markov process realization) and $\Omega$ is the set of all possible curves. We shall now define a probability distribution over random configurations $\bar\omega = (\omega_1, \ldots, \omega_N) \in \mathrm{Con}(\Omega) := \Omega^0 \cup \Omega^1 \cup \Omega^2 \cup \Omega^3 \cup \cdots$, where $\Omega^0 = \{\emptyset\}$, $\Omega^1 = \Omega$, $\Omega^2 = \Omega \times \Omega$, and so on; each $\omega_n$ is a curve in $\Omega$ and $N$ is the random number of curves. In our context, $\Omega^0$ is the 0-curve configuration (no curves), $\Omega^1$ is the set of one-curve configurations (single curves), $\Omega^2$ is the set of two-curve configurations (pairs of curves), and so on. We now compute the Poisson point measure via its expectation $\mathbb{E}F$ of any (measurable) function $F : \mathrm{Con}(\Omega) \to \mathbb{R}$ (clearly this defines a probability distribution, for we could take $F$ to be an indicator over any measurable subset of $\mathrm{Con}(\Omega)$ to get its probability).

Proposition 3. Suppose $N$ is a Poisson deviate with mean $P(\Omega)$. Further suppose that, given $N = n$, the points $\omega_1, \ldots, \omega_n$ are (conditionally) independent and identically distributed with law $P(\cdot)/P(\Omega)$. Then:
$$\mathbb{E} F := \sum_{n=0}^\infty \frac{e^{-P(\Omega)}}{n!} \int_{\Omega^n} F(\omega_1, \ldots, \omega_n)\, P(d\omega_1) \cdots P(d\omega_n). \qquad (14)$$

Proof. We need only take a conditional expectation and recall the formula for the Poisson

⁶A finite measure can always be normalized to a probability distribution because $P(\Omega) < \infty$. In particular, $\bar{P}(d\omega) := P(d\omega)/P(\Omega)$ is a (normalized) probability distribution over $\omega$.


distribution, as follows:
$$\mathbb{E} F = \mathbb{E}\left( \mathbb{E}[F \,|\, N] \right) = \mathbb{E}\left( \mathbb{E}\left[ F(\omega_1, \ldots, \omega_N) \,\big|\, N \right] \right) = \mathbb{E}\left( \int_{\Omega^N} F(\omega_1, \ldots, \omega_N)\, \frac{P(d\omega_1)}{P(\Omega)} \cdots \frac{P(d\omega_N)}{P(\Omega)} \right)$$
$$= \sum_{n=0}^\infty e^{-P(\Omega)}\, \frac{P(\Omega)^n}{n!} \left( P(\Omega)^{-n} \int_{\Omega^n} F(\omega_1, \ldots, \omega_n)\, P(d\omega_1) \cdots P(d\omega_n) \right).$$
The result follows. □
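Formula (14) can be illustrated numerically on a discrete toy $\Omega$ (ours, with illustrative masses), where the integrals become sums. For $F(\omega_1, \ldots, \omega_n) = n$ the formula must return the Poisson mean $P(\Omega)$, and for a count of points of a given type it must return that type's intensity:

```python
from itertools import product
from math import exp, factorial

# Discrete toy Omega = {0, 1} with un-normalized masses P(dw):
P = {0: 0.7, 1: 1.6}
P_Omega = sum(P.values())

def poisson_expectation(F, n_max=14):
    """Truncated version of (14): sum over n and over Omega^n."""
    total = 0.0
    for n in range(n_max + 1):
        inner = 0.0
        for config in product(P, repeat=n):   # "integral" over Omega^n
            weight = 1.0
            for w in config:
                weight *= P[w]
            inner += F(config) * weight
        total += exp(-P_Omega) / factorial(n) * inner
    return total

mean_count = poisson_expectation(lambda cfg: len(cfg))      # E N
mean_zeros = poisson_expectation(lambda cfg: cfg.count(0))  # E #{w_k = 0}
```

The truncation at `n_max = 14` is adequate here because the Poisson tail beyond 14 is negligible for mean 2.3.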

6.2 Application to the MGF of the Curve Indicator Random Field for Multiple Curves

We now finally consider the joint distribution over many curves. Suppose there are $N$ contours on average. We now state and prove the key theoretical result of this paper.

Proposition 4. Suppose all curves $R_t^{(n)}$, $n = 1, \ldots, N$, have the same initial distribution $\mu$ and final weighting $\nu$, where the normalization constraint $\langle \mu, G\nu \rangle = 1$ is satisfied (see §5.2). Let $N$ be the average number of curves. Then the moment generating functional of the curve indicator random field $U$ is
$$\mathbb{E} \exp\langle c, U \rangle = \exp\langle \mu, N (G_c - G) \nu \rangle.$$

Proof. To take advantage of the Poisson point measure construction, we let $\omega$ be a realization of the killed Markov process $R_t$, $t \in [0, T-)$, such that the finite measure $P = P(d\omega)$ is the probability distribution for $\omega$ but multiplied by the constant $N$, i.e., $P(\Omega) = N$. Let $F := \exp\langle c, U \rangle = \exp \sum_{n=1}^N \langle c, V^{(n)} \rangle = \prod_{n=1}^N \exp\langle c, V^{(n)} \rangle$, where $V^{(n)}$ is a function of $\omega_n$.


Applying (14) we obtain:
$$\mathbb{E} \exp\langle c, U \rangle = \sum_{n'=0}^\infty \frac{e^{-P(\Omega)}}{n'!} \int_{\Omega^{n'}} \prod_{n=1}^{n'} \exp\langle c, V^{(n)} \rangle\, P(d\omega_1) \cdots P(d\omega_{n'})$$
$$= \sum_{n'=0}^\infty \frac{e^{-P(\Omega)}}{n'!} \prod_{n=1}^{n'} \left( \int_\Omega \exp\langle c, V^{(n)} \rangle\, P(d\omega_n) \right) = e^{-P(\Omega)} \sum_{n'=0}^\infty \frac{1}{n'!} \left( \int_\Omega \exp\langle c, V^{(1)} \rangle\, P(d\omega_1) \right)^{n'},$$
since $V^{(1)}, \ldots, V^{(n')}$ are identically distributed. But then the latter integral is $N\, \mathbb{E} \exp\langle c, V \rangle$, and so the above sum becomes:
$$\sum_{n'=0}^\infty \left( N\, \mathbb{E} \exp\langle c, V \rangle \right)^{n'} / n'! = \exp\left( N\, \mathbb{E} \exp\langle c, V \rangle \right).$$
So using $P(\Omega) = N \langle \mu, G\nu \rangle = N$ and Corollary 1 ($\mathbb{E} \exp\langle c, V \rangle = \langle \mu, G_c \nu \rangle$), we conclude:
$$\mathbb{E} \exp\langle c, U \rangle = \exp\left( N \langle \mu, G_c \nu \rangle - N \langle \mu, G\nu \rangle \right). \;\square$$
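The multiple-curve construction is easy to simulate. The following Monte Carlo sketch (ours; a hypothetical two-state state space standing in for the image grid) draws a Poisson number of killed chains, accumulates their occupation times into $U$, and checks that the empirical mean of $U_i$ equals $N\lambda/|I|$, which follows from the construction (each of the $N$ curves contributes expected length $\lambda$ spread uniformly over the $|I|$ sites):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy CIRF on I = {0, 1}: Poisson(N_mean) many independent curves,
# each a killed two-state chain (flip rate 1, death rate alpha).
alpha, N_mean, n_sites = 0.5, 2.0, 2
lam = 1.0 / alpha                      # expected curve length
eta = N_mean * lam / n_sites           # predicted mean of U

def sample_U():
    U = np.zeros(n_sites)
    for _ in range(rng.poisson(N_mean)):
        T = rng.exponential(lam)       # curve length ~ Exp(alpha)
        t, state = 0.0, rng.integers(n_sites)  # uniform initial state
        while t < T:
            dwell = rng.exponential(1.0)       # holding time, flip rate 1
            U[state] += min(dwell, T - t)      # occupation time at site
            t += dwell
            state = 1 - state                  # flip to the other state
    return U

U_mean = np.mean([sample_U() for _ in range(40000)], axis=0)
```

With 40,000 samples the Monte Carlo error is well under the 0.1 tolerance used in the check.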

6.3 Cumulants of the CIRF

While Prop. 4 may seem abstract, it is actually very useful. First observe its similarity to the single-curve case. More importantly, with Prop. 4 we can compute the higher-order cumulants [28, 34] of $U$ (recall that the moments determine the cumulants and vice versa):

Corollary 3. If $k \ge 1$, the $k$-th (joint) cumulant of the curve indicator random field $U$ at sites $i_1, \ldots, i_k$ is
$$\operatorname{cum}(U_{i_1}, \ldots, U_{i_k}) = N \sum_{i,j} \mu_i \nu_j \sum_{a \in \mathrm{perm}_k} g_{i, i_{a_1}}\, g_{i_{a_1}, i_{a_2}} \cdots g_{i_{a_{k-1}}, i_{a_k}}\, g_{i_{a_k}, j}. \qquad (15)$$

Proof. Since the cumulant generating functional of $U$, which is the natural logarithm of the moment generating functional of $U$, differs from the moment generating functional of $V$ only by additive and multiplicative constants, we use (13) with no further work. □


The cumulant formula has a simple interpretation. First recall that the Green's operator entry $g_{ij}$ is the expected amount of time spent by $R_t$ in state $j$ given that it started in $i$. For any ordering of the $k$ points we take the product of the $g_{ij}$'s for the successive points in order (the first and last factors deal with the initial and final points). Since the contour could have passed through the points in any order, all permutations must be considered. This loss of ordering is the same as that in an image of curves: we do not observe the strokes that an artist makes in creating a sketch; we only see the resulting pattern of ink. The cirf thus captures this fundamental aspect of visual curves.

Letting $G^*$ denote the transpose of $G$, we can rephrase Corollary 3 to obtain the mean and covariance of the cirf [3].

Corollary 4. Suppose that $\mu_i = |I|^{-1}$, $\nu_i = \lambda^{-1}$, $\forall i \in I$, and let $\eta = N\lambda|I|^{-1}$. The mean of the curve indicator random field $U$ is $\mathbb{E} U_i = \eta$, $\forall i \in I$. The covariance matrix of $U$ is $\operatorname{cov} U = \eta (G + G^*)$.
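Corollary 4 can be checked mechanically against the cumulant formula (15). The sketch below (ours; toy two-state chain with uniform $\mu$ and $\nu = \lambda^{-1}\mathbf{1}$) evaluates (15) for $k = 1$ and $k = 2$ and compares with $\eta$ and $\eta(G + G^*)$:

```python
import numpy as np
from itertools import permutations

alpha, N_mean = 0.5, 3.0
L = np.array([[-1.0, 1.0], [1.0, -1.0]])
Q = L - alpha * np.eye(2)
G = -np.linalg.inv(Q)
lam = 1.0 / alpha
n = 2
mu = np.full(n, 1.0 / n)               # mu_i = |I|^{-1}
nu = np.full(n, 1.0 / lam)             # nu_i = lambda^{-1}
eta = N_mean * lam / n

def cumulant(sites):
    """k-th joint cumulant of U via the permutation formula (15)."""
    total = 0.0
    for a in permutations(sites):
        term = mu @ G[:, a[0]]         # sum over initial site i
        for s, s_next in zip(a, a[1:]):
            term *= G[s, s_next]
        total += term * (G[a[-1], :] @ nu)  # sum over final site j
    return N_mean * total

mean_U = np.array([cumulant([i]) for i in range(n)])
cov_U = np.array([[cumulant([i, j]) for j in range(n)] for i in range(n)])
```

Note that for repeated sites (the diagonal of the covariance), `permutations` still produces $k!$ index tuples, matching the sum over $\mathrm{perm}_k$ in (15).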

One column of the covariance matrix for the direction process is illustrated in Fig. 4 (left), by taking its impulse response. Several columns of the covariance matrix for the curvature process are illustrated in Fig. 4 (right), by taking its impulse response for several positions, directions, and curvatures. In the appendix we demonstrate that empirical correlations obtained from natural images with contours qualitatively agree with the cirf based on the direction process, but point to the need for a curvature process as well.

Note that by Corollary 3, the cumulants of $U$ of order greater than two are generally not zero, proving the following:

Corollary 5. The curve indicator random field $U$ is non-Gaussian.

Although non-Gaussianity is often difficult to handle, in our case it is easier because we have an explicit, tractable formula for the cirf's moment generating functional, which we now exploit in deriving a contour enhancement filter.


Figure 4: (Left) Impulse response of the covariance matrix (Corollary 4) for the curve indicator random field for the direction process in $(x, y, \theta)$. The impulse is located at the image center, with direction 0, then Gaussian blurred with radius 1.0. The operator $\operatorname{cov} U$ was then applied to this impulse vector and the resulting (impulse) response is shown. Parameters are $\sigma_\kappa = 1/8$, $\lambda = 200$, $N = 10$. The sum over $\theta$ is taken to produce this single image in $(x, y)$. Observe the elongation of the response along the horizontal direction, capturing the smoothness and length of the contours. In [1], the cirf covariance is used as a linear filter for enhancing contour images. Note that the covariance impulse response resembles the direction-only compatibility fields used in relaxation labeling [54, 21] and in more recent models for horizontal connections in primary visual cortex of primates [10]. (Right) Impulse responses of the covariance matrix for the curve indicator random field for the curvature process (panel labels $\kappa_0$: 0.2, 0, -0.1). Impulses are located at the center of each image, directed at discrete direction 4 out of 32, with 5 curvatures. Parameters are $\sigma_\kappa = 0.01$, $\lambda = 10$.


7 Minimum Mean Squared Error Filtering

Instead of the unknown random field $U$, what we actually observe is some realization $m$ of a random field $M$ of (edge or line) measurements. Given $m$, we seek the approximation $\hat{u}$ of $U$ that minimizes the mean squared error (MMSE), or
$$\hat{u} := \arg\min_u \mathbb{E}_m \|u - U\|^2,$$
where $\mathbb{E}_m$ denotes taking an expectation conditioned on the measurement realization $m$. It is well known that the MMSE estimate is the posterior mean
$$\hat{u} = \mathbb{E}_m U,$$
but in many interesting, non-Gaussian cases this is extremely difficult to compute. In our context, however, we are fortunate to be able to make use of the moment generating functional (Prop. 4) to simplify computations.

Before developing our MMSE estimator, we must define our likelihood function $p(M|U)$. First let $H_i$ be the binary random variable taking the value 1 if one of the contours passed through (or "hit") site $i$, and 0 otherwise; thus $H$ is a binary random field on $I$. In this paper we consider conditionally independent, local likelihoods: $p(M|H) = p(M_1|H_1) \cdots p(M_{|I|}|H_{|I|})$. Following [13, 51], we consider two distributions over measurements at site $i$: $p_{\mathrm{on}}(M_i) := p(M_i | H_i = 1)$ and $p_{\mathrm{off}}(M_i) := p(M_i | H_i = 0)$. It follows [51] that $\ln p(M|H) = \sum_i \ln\left( p_{\mathrm{on}}(M_i) / p_{\mathrm{off}}(M_i) \right) H_i$, up to an additive term that does not depend on $H$. Now let $\tau$ be the average amount of time spent by the Markov processes in a site, given that the site was hit; observe that $U_i / \tau$ and $H_i$ are therefore equal on average. This suggests that we replace $H$ with $U/\tau$ above to construct a likelihood in $U$; in particular,
$$\ln p(M|U) \approx \sum_i c_i U_i = \langle c, U \rangle, \quad \text{where } c_i = c_i(M_i) = \tau^{-1} \ln \frac{p_{\mathrm{on}}(M_i)}{p_{\mathrm{off}}(M_i)}. \qquad (16)$$

Now we use Bayes' rule to find the posterior mean:
$$\mathbb{E}_m U_i = \frac{\mathbb{E}[U_i\, p(M|U)]}{\mathbb{E}[p(M|U)]} \approx \frac{\mathbb{E}[U_i \exp\langle c, U \rangle]}{\mathbb{E}[\exp\langle c, U \rangle]} = \frac{\partial}{\partial c_i} \ln \mathbb{E} \exp\langle c, U \rangle.$$
So to compute the posterior mean we have to compute
$$\nabla_c \ln \mathbb{E} \exp\langle c, U \rangle = \left( \frac{\partial}{\partial c_1} \ln \mathbb{E} \exp\langle c, U \rangle, \ldots, \frac{\partial}{\partial c_{|I|}} \ln \mathbb{E} \exp\langle c, U \rangle \right),$$
where $\nabla_c$ denotes the gradient with respect to $c$. Substituting the cirf moment generating functional (Prop. 4), we have to take the gradient of $\langle \mu, N (G_c - G) \nu \rangle = N [\langle \mu, G_c \nu \rangle - 1]$. Applying the gradient to the latter inner product, we can write $\nabla_c \langle \mu, G_c \nu \rangle = \langle \mu, [\nabla_c G_c] \nu \rangle$. Since $G_c := -(Q + \operatorname{diag} c)^{-1}$, to continue the calculation we must understand how to take derivatives of inverses of operators, which we study in the following lemma.

Lemma 2. Suppose matrices $A = (a_{ij})$ and $B = (b_{ij})$ are functions of some parameter $\alpha$, i.e., $A = A(\alpha)$, $B = B(\alpha)$. Letting primes ($'$) denote (componentwise) derivatives $d/d\alpha$ with respect to $\alpha$, the following formulas hold:
$$(AB)' = A'B + AB', \qquad (17)$$
$$(A^{-1})' = -A^{-1} A' A^{-1}. \qquad (18)$$

Proof. Formula (17) follows by applying the single-variable product rule to the sum from the inner product of a row of $A$ and a column of $B$: $(AB)'_{ij} = (\sum_k a_{ik} b_{kj})' = \sum_k (a'_{ik} b_{kj} + a_{ik} b'_{kj}) = (A'B + AB')_{ij}$. To obtain formula (18), we first apply formula (17) to write $0 = I' = (A A^{-1})' = A' A^{-1} + A (A^{-1})'$. Then we solve for $(A^{-1})'$. □

Returning to the calculation of the cirf posterior mean, we use (18) to write
$$\frac{\partial}{\partial c_i} G_c = \frac{\partial}{\partial c_i}\left[ -(Q + \operatorname{diag} c)^{-1} \right] = (Q + \operatorname{diag} c)^{-1} \left[ \frac{\partial}{\partial c_i}(Q + \operatorname{diag} c) \right] (Q + \operatorname{diag} c)^{-1} = G_c D_i G_c,$$
where $D_i$ is the matrix with a one at the $(i,i)$-entry and zeroes elsewhere. The matrix $D_i$ can be expressed as the outer product $\delta_i \delta_i^*$ of discrete impulse vectors $\delta_i$ at site $i$, where $(\delta_i)_j = \delta_{ij}$. Therefore
$$\frac{\partial}{\partial c_i} \langle \mu, G_c \nu \rangle = (\delta_i^* G_c^* \mu)(\delta_i^* G_c \nu) = (G_c^* \mu)_i (G_c \nu)_i,$$
proving the following:


Result 1. Under the likelihood approximation in (16), the cirf posterior mean is
$$\mathbb{E}_m U_i \approx N f_i b_i, \quad \forall i \in I, \qquad (19)$$
where $f = (f_1, \ldots, f_{|I|})$ is the solution to the forward equation
$$(Q + \operatorname{diag} c) f + \nu = 0 \qquad (20)$$
and $b = (b_1, \ldots, b_{|I|})$ is the solution to the backward equation
$$(Q^* + \operatorname{diag} c) b + \mu = 0. \qquad (21)$$

Note that the above forward and backward equations arose earlier in the Feynman-Kac formula (§5.1). In addition, these linear systems are finite-dimensional; however, since $Q = L - \lambda^{-1} I$, where the generator $L$ is the discretization of an (elliptic) partial differential operator, we view the forward and backward equations as (linear) elliptic partial differential equations, by replacing $L$ with its undiscretized counterpart, and $c, f, b, \mu$, and $\nu$ with corresponding (possibly generalized) functions on a continuum, such as $\mathbb{R}^2 \times S^1$ for the direction process, or $\mathbb{R}^2 \times S^1 \times \mathbb{R}$ for the curvature process.
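In the discrete setting, Result 1 reduces to two linear solves and a pointwise product. The sketch below (ours; a hypothetical two-state space standing in for the $(x, y, \theta)$ grid) computes the posterior mean and checks that with no evidence ($c = 0$) it reduces to the prior mean $\eta$ of Corollary 4:

```python
import numpy as np

# Discrete sketch of the posterior-mean filter (Result 1) on a toy
# 2-state space; in the paper I would be a grid over (x, y, theta[, kappa]).
alpha, N_mean = 0.5, 2.0
L = np.array([[-1.0, 1.0], [1.0, -1.0]])
Q = L - alpha * np.eye(2)
lam = 1.0 / alpha
n = 2
mu = np.full(n, 1.0 / n)
nu = np.full(n, 1.0 / lam)

def posterior_mean(c):
    """Solve the forward (20) and backward (21) equations, combine via (19)."""
    A = Q + np.diag(c)
    f = np.linalg.solve(A, -nu)        # forward: (Q + diag c) f + nu = 0
    b = np.linalg.solve(A.T, -mu)      # backward: (Q* + diag c) b + mu = 0
    return N_mean * f * b              # E_m U_i ~ N f_i b_i

u_flat = posterior_mean(np.zeros(n))             # no evidence: prior mean
u_bias = posterior_mean(np.array([0.4, -0.4]))   # evidence favoring site 0
```

The second call illustrates the filter's nonlinearity: positive bias at site 0 raises its posterior mean through both the forward and backward solutions.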

Despite the linearity of the forward and backward equations, observe that two nonlinearities arise in the posterior mean in equation (19). First, there is a product of the forward and backward solutions, analogous to the source/sink product in the stochastic completion field (scf) [47]. (The scf was a major inspiration for the current work; the main difference is our use of the statistical estimation framework with a cirf prior. Most fundamentally, the cirf is crucial in defining a measurement model for curve enhancement: it is a corrupted cirf that is observed, where curve parametrization is lost. Recall how this was expressed by the cirf cumulants in Corollary 3. The scf was originally intended as a model of illusory contour formation, where the contours are known to begin and end at corners and junctions. In contour enhancement we do not have such keypoints to anchor our computations.) Second, although both the forward and backward equations are linear, they represent nonlinear mappings from input ($c$) to output ($f$ or $b$). For example, it follows that $f = (I - G \operatorname{diag} c)^{-1} G\nu = \sum_{k=0}^\infty (G \operatorname{diag} c)^k G\nu$, i.e., $f$ is a power series in, and thus a nonlinear function of, the input $c$.

7.1 Example Computations

We have implemented a version of the cirf posterior mean filter. For our initial experimentation, we adopted a standard additive white Gaussian noise model for the likelihood $p(M|U)$. As a consequence, we have $c(m) = \gamma_1 m - \gamma_2$, a simple transformation of the input $m$, where $\gamma_1$ and $\gamma_2 = 3$ are constants. The direction-dependent input $m$ was set to the result of logical/linear edge and line operators [22]. The output of the logical/linear operator was linearly interpolated to as many directions as necessary. For direction-only filtering, this input interpolation was sufficient, but not for the curvature-based filtering, as curvature was not directly measured in the image; instead, the directed logical/linear response was simply copied over all curvature values, i.e., the input $m$ was constant as a function of curvature. The limit of the invertibility of $Q + \operatorname{diag} c$ and its transpose was used to set $\gamma_1$ (see the proof of Prop. 2 and [1]). The technique used to solve the forward and backward cirf equations for the curvature process in $(x, y, \theta, \kappa)$ is reported in [1], and is a generalization of the method used for the forward and backward cirf equations for the direction process in $(x, y, \theta)$ [1], which is also used here. Briefly, we use a conjugate gradient method which iteratively applies the Green's operator $G$ on the order of ten times. For the direction process, applying $G$ involves two FFTs and the solution of a tridiagonal system, while for the curvature process we use an additional linear solver that is cubic in the number of directions and curvatures (which is generally much smaller than the length and width of the image). Parameter settings were $N = 1$, $\mu = |I|^{-1}\mathbf{1}$, $\nu = \lambda^{-1}\mathbf{1}$. The direction process cirf was solved on a grid the size of the given image but with 32 directions. For the curvature process cirf filtering, very few curvatures were used (3 or 5) in order to keep down computation times in our unoptimized implementation. Unless we state otherwise, all filtering responses (which are fields over either discrete $(x, y, \theta)$-space or discrete $(x, y, \theta, \kappa)$-space) are shown summed over all variables except $(x, y)$.

For our first example, we considered a blood cell image (Fig. 5, top). To illustrate robustness, noise was added to a small portion of the image that contained two cells (top left), and the result was processed with the logical/linear edge operator at the default settings.⁷ The result was first filtered using the cirf posterior mean based on Mumford's direction process (top center). Despite using two very different bounds on curvature, the direction-based filtering cannot close the blood cell boundaries appropriately. In contrast, the cirf posterior mean with the curvature process (top right) was more effective at forming a complete boundary. To illustrate in more detail, we plotted the filter responses for the direction-based filter at $\sigma_\kappa = 0.025$ for 8 of the 32 discrete directions in the middle of Fig. 5. The brightness in each of the 8 sub-images is proportional to the response for that particular direction as a function of position $(x, y)$. Observe the over-straightening effect shown by the elongated responses. The curvature filter responses were plotted as a function of direction and curvature (bottom). Despite the input having been constant as a function of curvature, the result shows curvature selectivity. Indeed, one can clearly see in the $\kappa > 0$ row (Fig. 5, bottom) that the boundary of the top left blood cell is traced out in a counter-clockwise manner. In the $\kappa < 0$ row, the same cell is traced out in the opposite manner. (Since the parameterization of the curve is lost when forming its image, we cannot know which way the contour was traversed; our result is consistent with both ways.) The response for the lower right blood cell was somewhat weaker but qualitatively similar. Unlike the direction-only process, the curvature process can effectively deal with highly curved contours.

For our next example, we took two sub-images of a low-contrast angiogram (top of Fig. 6;

⁷Code and settings are available at "http://cvc.yale.edu".


[Figure 5 panels: top row — Original, Direction cirf ($\sigma_\kappa = 0.025$ and $\sigma_\kappa = 0.33$), Curvature cirf ($\sigma_\kappa = 0.01$); middle — Direction cirf output by direction; bottom — Curvature cirf output by direction ($\theta$: 0, 90, 180, 270) and curvature ($\kappa > 0$, $\kappa = 0$, $\kappa < 0$).]
Figure 5: Curvature filtering of a blood cell image (see text).


sub-images from the left and top right of the original). The first sub-image (top left) contained a straight structure, which was enhanced by our curvature-based cirf filter (summed responses at top right). The distinct responses at separate directions and curvatures show curvature selectivity as well, since the straight ($\kappa = 0$) response at $\theta = 45°$ had the greatest response (center). The second sub-image (bottom left), of a loop structure, also produced a reasonable filter response (bottom right); the individual responses (bottom) also show some curvature selectivity.

As argued in the Introduction, the bug with a direction-only search cone would mistrack on a contour as curvature builds up. To make this point computationally, we consider an image (top left of Fig. 7) of an Euler spiral extending from a straight line segment.⁸ Observe that the contour curvature begins at zero (straight segment) and then builds up gradually. To produce a 3-dimensional input to our direction-based filter, this original (2-d) image was copied to all directions (i.e., $m(x, y, \theta) = \mathrm{image}(x, y)$, for all $\theta$). Similarly, the image was copied to all directions and curvatures to produce a 4-d input to the curvature-based filter. For this test only, our 2-dimensional outputs were produced by taking, at each position $(x, y)$, the maximum response over all directions (for the direction-based filtering) or over all directions and curvatures (for the curvature-based filtering). The direction-based cirf posterior mean (with parameters $\sigma_\kappa = 0.025$, $\lambda = 10$, and 64 directions) was computed (center), showing an undesirable reduction in response as curvature built up. The curvature-based cirf posterior mean (right, with parameters $\sigma_\kappa = 0.05$, $\lambda = 10$, 64 directions, and 7 curvatures $(0, \pm 0.05, \pm 0.1, \pm 0.15)$) shows strong response even at the higher-curvature portions of the contour. To test robustness, 0-mean Gaussian noise of standard deviation 0.4 was added (bottom left) to the image (0 to 1 was the signal range before adding noise). The results (bottom center and right) show that the curvature-based filter performs better

⁸We used formula (16.7) of Kimia et al. [25], and created the plot in Mathematica with all parameters 0, except $\gamma = 0.1$ (in Kimia et al.'s notation). The resulting plot was grabbed, combined with a line segment, blurred with a Gaussian, and then subsampled.


[Figure 6 panels: Curvature cirf output by direction ($\theta$: 0, 45, 90, 135) and curvature ($\kappa > 0$, $\kappa = 0$, $\kappa < 0$), for each of the two sub-images.]
Figure 6: Curvature filtering for an angiogram (see text).


in high curvature regions despite noise.

Computations were conveniently performed using the Python scripting language with

numerical extensions in a GNU/Linux environment.

8 Conclusion

In this paper we studied the curve indicator random field as an abstraction that mediates between parameterized curves and noisy line operator responses. The cirf acts as a prior for Bayesian contour enhancement, and the form of the resulting nonlinear filter requires only the solution of two linear systems. This framework leaves the curve model as a free parameter, allowing us to introduce a new stochastic model for contour curvature to better capture the shape of image curves. Whereas most contour models penalize large curvatures, our curvature Markov process allows highly curving contours, and only penalizes changes in curvature. Our computations show that the filter responds well along smooth contours, even those having large curvature. In future work we seek to exploit the cirf's unique capacity to locally represent contour interactions, e.g., to discourage contour crossings.

A Edge Correlations in Natural Images

Whether an algorithm ever works depends on the correctness of the assumptions upon which it is based. It is therefore useful to determine if the cirf model of contours bears any resemblance to natural contour images so that our efforts are not wasted on an unrealistic model. In the last section we derived the covariances of the cirf U; here we present empirical edge covariances for several natural images. The qualitative similarity between the covariances for our model and the observations suggests that the cirf based on the direction process is a reasonable starting point for modeling image contours; the results also suggest a role for


Figure 7: Filtering an Euler spiral without noise (top) and with noise (bottom). The columns show the original image (left), the result after filtering using the curve indicator random field based on Mumford's direction-based Markov process (center), and our curvature-based Markov process (right). Notice that the direction cirf result tends to suppress the signal at high curvatures, while the curvature process has more consistent performance, even at higher curvatures. See text for details.


the curvature process.

Although we cannot measure the covariances of U directly (since we cannot observe its realizations), we can estimate covariances in local edge operator responses, or what we call the measurement field M. These correlations (see footnote 9) can reveal, albeit crudely, the qualitative structure of the correlations of U.

While a detailed treatment of observation models is presented elsewhere [1], we briefly mention the connection between the covariances Σ_M and Σ_U of M and U, respectively. Suppose that the measurement field M was generated by the corruption of the cirf U with a blur operator B and additive noise N, statistically independent of U. Because of the independence of N, the cumulant generating functional (the logarithm of the moment generating functional [34]) of the measurements M = BU + N is the sum of the cumulant generating functionals for BU and N, respectively. This implies that the measurement covariance is Σ_M = Σ_BU + Σ_N = BΣ_U B* + Σ_N, where B* is the transpose of B. If the noise is relatively white and the blur has relatively small scale, then this result says that the measurement covariances will be a slightly blurry form of the cirf covariances, but different along the diagonal. Therefore studying the measurement covariances is warranted for verifying the plausibility of the cirf.
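The relation Σ_M = BΣ_U B* + Σ_N can be checked numerically. The sketch below builds a toy one-dimensional stationary field; the circulant blur, prior covariance, and noise level are illustrative assumptions, not the paper's actual operators.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64  # sites in a toy 1-d circulant discretization

# Illustrative ingredients (not the paper's operators): a circulant
# Gaussian blur B, a slowly decaying prior covariance Sigma_U, and
# white observation noise Sigma_N.
x = np.arange(n)
d = np.minimum(np.abs(x[:, None] - x[None, :]), n - np.abs(x[:, None] - x[None, :]))
B = np.exp(-d**2 / (2 * 1.5**2))
B /= B.sum(axis=1, keepdims=True)
Sigma_U = np.exp(-d / 4.0)
Sigma_N = 0.1 * np.eye(n)

# Predicted measurement covariance: Sigma_M = B Sigma_U B^T + Sigma_N.
Sigma_M = B @ Sigma_U @ B.T + Sigma_N

# Monte Carlo check: draw U ~ N(0, Sigma_U) and N ~ N(0, Sigma_N),
# form M = BU + N, and estimate the covariance of M from samples.
L = np.linalg.cholesky(Sigma_U + 1e-9 * np.eye(n))
samples = 20000
U = L @ rng.standard_normal((n, samples))
N = np.sqrt(0.1) * rng.standard_normal((n, samples))
M = B @ U + N
Sigma_hat = (M @ M.T) / samples  # U and N are zero mean here

print(np.abs(Sigma_hat - Sigma_M).max())  # small sampling error
```

Because N is independent of U, the cross terms vanish in expectation, which is exactly why the cumulant generating functionals add.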

To carry out our observations, we use oriented edge filter banks. Specifically, at each image position (x, y) and direction θ we shall measure the edge strength m(x, y, θ), which is a realization of the random variable M(x, y, θ) (see footnote 10). The resultant set of these random variables forms our measurement random field M = M(·). The analysis in the previous paragraph suggests that for good continuation to have any meaning for natural contour images, at least the following must hold:

9. Following common practice, the terms correlation and covariance will be used interchangeably unless there is confusion.

10. In this section we are using explicit spatial coordinates instead of the abstract site index i.


Conjecture 1. For natural images, M is not "white," i.e., its covariance

Σ_M(r1, r2) := E[(M(r1) − E[M(r1)])(M(r2) − E[M(r2)])] ≠ 0,

for all r1 ≠ r2, where r_i := (x_i, y_i, θ_i).

In §A.2, we shall present direct evidence not only in support of this conjecture, but also in agreement with the cirf model: the observed edge correlations have an elongated pattern.

After first presenting the results in this section [4], we found a paper by Okajima [36] in which edge correlations of natural images were observed and used for learning edge filters. We later found that Kruger [27] and van der Schaaf [46] had already independently observed an elongated structure in edge correlations as well. This research area has recently become very active, with edge correlation reports in [12] and [40]. One distinction of our work is that we assume the statistical homogeneity (§A.1) that is qualitatively apparent in others' results, allowing us to compute edge correlations from a single image. This suggests that it is possible to adapt or "tune" contour filter parameters to individual images. A more important difference is the theoretical cirf framework we provide for interpreting these observations.

More generally, Elder and Krupnik have recently estimated the statistical power of grouping cues other than good continuation [9]. Martin et al. have computed statistics of Gestalt cues as well, but using hand-segmented images instead [30]. In related work on image statistics, the joint distributions of neighboring wavelet coefficients have been measured [41, 20]. Mumford and Gidas have recently suggested an axiomatic generative model of natural images using infinitely divisible distributions [33]. We currently ignore higher-order moments in order to focus on the longer-distance spatial variation of edge relations. Invoking techniques from the study of turbulence, self-similarity properties in image gradients have also been measured [43]. For a review of natural image statistics, see [39].


A.1 Statistical Homogeneity for Random Fields in ℝ² × S¹

Unfortunately, since E[M(x1, y1, θ1)M(x2, y2, θ2)] is a function of six variables, it is clear that computing, storing, and visualizing this object, and thus assessing Conjecture 1, would be difficult without simplifications. Inspired by the Euclidean invariance of stochastic completion fields [47], we suggest [3] that the proper analogue to the translation homogeneity of (x, y)-images for (x, y, θ)-space is the following translation and rotation homogeneity in edge correlations (Fig. 8):

E[M(x1, y1, θ1)M(x2, y2, θ2)] = E[M(0, 0, 0)M(rot_{−θ1}[x2 − x1, y2 − y1]^T, θ2 − θ1)],

where

rot_φ = [ cos φ   −sin φ
          sin φ    cos φ ].

This notion of homogeneity in correlation can be naturally extended to all statistics of random fields on ℝ² × S¹. Zweck and Williams have introduced what they call a shiftable-twistable basis for performing Euclidean invariant scf computations on ℝ² × S¹ [55, 50].
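This homogeneity can be exploited in code by mapping each tangent pair to canonical coordinates: translate the first tangent to the origin and rotate by −θ1, so that pairs related by a rigid motion coincide. The helper below is an illustrative sketch, not code from the paper.

```python
import numpy as np

def canonical_pair(x1, y1, t1, x2, y2, t2):
    """Canonical (x, y, theta) coordinates of a tangent pair under the
    homogeneity assumption: pairs related by translation and rotation
    map to the same point."""
    c, s = np.cos(-t1), np.sin(-t1)
    dx, dy = x2 - x1, y2 - y1
    return (c * dx - s * dy,          # rot_{-theta1} applied to the
            s * dx + c * dy,          # displacement [dx, dy]^T
            (t2 - t1) % (2 * np.pi))  # relative direction

# A tangent pair, and the same pair rotated by 30 degrees about the origin.
phi = np.pi / 6
c, s = np.cos(phi), np.sin(phi)
p1, p2 = (1.0, 2.0, 0.2), (4.0, 3.0, 0.9)
q1 = (c * p1[0] - s * p1[1], s * p1[0] + c * p1[1], p1[2] + phi)
q2 = (c * p2[0] - s * p2[1], s * p2[0] + c * p2[1], p2[2] + phi)

print(canonical_pair(*p1, *p2))
print(canonical_pair(*q1, *q2))  # identical canonical coordinates
```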

A.2 Observed Edge Correlations

To test Conjecture 1, we measured edges using the logical/linear edge operator [22]. Here and throughout this paper the default scales of σ_tangential = 2.5 pixels (length) and σ_normal = 1.4 pixels (width) were used. The correlations were then estimated as const · Σ m(r1)m(r2), where the sum runs over all pairs r1, r2 such that (rot_{−θ1}[x2 − x1, y2 − y1]^T, θ2 − θ1) = (x, y, θ). An appropriate spatial normalization was used to account for the relative lack of distant tangent (edge) pairs over nearby pairs.
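A minimal version of this pooled estimator, on a toy (x, y, θ) grid, might look like the following; the grid, the synthetic responses, and the integer binning are hypothetical stand-ins for the logical/linear operator output.

```python
import numpy as np

def edge_correlations(m, max_r):
    """Pooled correlation estimate over an (x, y, theta) grid m[ix, iy, it].
    Each pair of sites contributes m(r1) * m(r2) to the bin indexed by the
    canonical coordinates (rot_{-theta1}[dx, dy]^T, theta2 - theta1); the
    accumulator is divided by the per-bin pair count, the spatial
    normalization that corrects for the scarcity of distant pairs."""
    nx, ny, nt = m.shape
    dth = 2 * np.pi / nt
    acc = np.zeros((2 * max_r + 1, 2 * max_r + 1, nt))
    cnt = np.zeros_like(acc)
    for x1 in range(nx):
        for y1 in range(ny):
            for t1 in range(nt):
                c, s = np.cos(-t1 * dth), np.sin(-t1 * dth)
                for x2 in range(nx):
                    for y2 in range(ny):
                        dx, dy = x2 - x1, y2 - y1
                        iu = int(round(c * dx - s * dy)) + max_r
                        iv = int(round(s * dx + c * dy)) + max_r
                        if 0 <= iu <= 2 * max_r and 0 <= iv <= 2 * max_r:
                            for t2 in range(nt):
                                k = (t2 - t1) % nt
                                acc[iu, iv, k] += m[x1, y1, t1] * m[x2, y2, t2]
                                cnt[iu, iv, k] += 1
    return acc / np.maximum(cnt, 1)

# Toy field: a horizontal contour of theta = 0 responses.  Correlation mass
# concentrates in the collinear bins (v = 0, delta-theta = 0).
m = np.zeros((16, 16, 4))
m[:, 8, 0] = 1.0
corr = edge_correlations(m, max_r=6)
```

On real edge maps, m would hold logical/linear operator responses and the loops would be vectorized; this direct form just makes the binning and normalization explicit.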

The results show first (Fig. 9) that natural images have a more extended correlation pattern than does pure noise, supporting the Conjecture. Other experiments (Fig. 10) show a similar correlation pattern. They also bear a qualitative resemblance to the correlations in


Figure 8: Rigid motions of tangent pairs. If the measurement random field is homogeneous, then the correlation of the (edge) measurements located at the primed pair of tangents (r1′, r2′) equals that for the unprimed pair (r1, r2).


Figure 9: The empirical edge covariance function E[M(0, 0, 0)M(x, y, θ)] of a pure noise (i.i.d. uniform(0, 255)) image (top left) is contrasted with that of a natural image with contours (top right). Constant-θ slices of the covariances are shown for θ = 22.5, 0, and −22.5 in the bottom three rows, where whiteness indicates the amount of covariance. (These correlation images, each 100 × 100 pixels, have been enhanced by setting the gamma correction to 2.) The slight edge correlations in the noise image (left) are due entirely to the support of the logical/linear edge operator. The elongation on the right indicates that edges are strongly correlated in the tangential direction. Note the resemblance to the covariances of the direction process-based cirf (Fig. 4). We consider this as empirical evidence of good continuation and proximity in a natural image. Again, the pattern of correlations is also consistent with the models in [53, 16, 47].


the curve indicator random field (Fig. 4), which suggests that the orientation-based curve organization models [31, 54, 16] may have a statistical basis.

To more fully compare the distance over which edges are correlated, we plotted the correlation pattern along the central horizontal strips in the 0-orientation slices of the correlations (Fig. 11). The most important observation is that collinear edges are correlated over large distances. More surprising is the fact that the actual length of the contours in the blood cells image (the true perimeter of the cells is over 100 pixels) is not reflected in its collinear edge correlations. That there is statistical structure to be exploited in this image is shown in Fig. 12, which reveals a curved pattern in the distribution of correlations. These observations not only suggest that the above straight-line-seeking direction-based curve organization models would be ineffective on this image, but they point to a way out. By including curvature as an explicit variable in a curve organization framework [53], one may more fully capture the local geometry of the highly curved contours. The curvature process (§2) provides a random model for these observations.

References

[1] J. August. The Curve Indicator Random Field. PhD thesis, Yale University, 2001.

[2] J. August and S. W. Zucker. The moments of the curve indicator random field. In Proceedings of the 2000 Conference on Information Sciences and Systems, volume 1, pages WP5–19–WP5–24, Princeton, NJ, March 2000.

[3] J. August and S. W. Zucker. The curve indicator random field: curve organization via edge correlation. In K. Boyer and S. Sarkar, editors, Perceptual Organization for Artificial Vision Systems, pages 265–288. Kluwer Academic, Boston, 2000.


Figure 10: Edge covariances (continued; see caption of Fig. 9 for explanation). The Paolina (left) and lily (center) images have correlations in the tangential direction which extend much farther than those for the blood cells image (right). Note how these more distant correlations in the Paolina image occur despite the contours being open, unlike the closed contours in the blood cell image, suggesting curve enhancement systems should not rely only on contour closure.


[Log-linear plot: correlation coefficient of collinear edges (0.01 to 1) versus distance (5 to 50 pixels) for the Twigs, Paolina, Lily, Blood cells, and Noise images.]

Figure 11: Comparison of the correlation coefficient between collinear edges, obtained by extracting a central strip E[M(0, 0, 0)M(x, 0, 0)] through the zero-orientation slices of the correlations in Figs. 9 and 10, subtracting the square of the mean (estimated by taking a spatial average in an approximately constant region of the covariance), and then dividing by the estimated variance of M. Observe that although the correlation of a pure noise image falls off rapidly, the correlation for natural images persists over much longer distances. For example, note that even at 25 pixels of separation, collinear edges in the twigs (Fig. 9, right), Paolina and lily (Fig. 10, left and center) images are appreciably more correlated than pure noise. The correlation of the blood cells image drops off rapidly due to high curvature (see Fig. 12).
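The normalization described in the caption of Fig. 11 can be sketched in a few lines; the covariance strip, the "approximately constant region," and the variance value below are hypothetical illustrations, not measured data.

```python
import numpy as np

def collinear_correlation(cov_strip, baseline_idx, var_m):
    """Correlation coefficient vs. collinear distance: subtract the squared
    mean of M (estimated from an approximately constant region of the
    covariance strip) and divide by the estimated variance of M."""
    mean_sq = cov_strip[baseline_idx].mean()  # estimate of (E[M])^2
    return (cov_strip - mean_sq) / var_m

# Hypothetical strip E[M(0,0,0) M(x,0,0)]: exponential fall-off to a baseline.
x = np.arange(50)
strip = 0.25 * np.exp(-x / 10.0) + 0.04
rho = collinear_correlation(strip, np.s_[40:], var_m=0.25)
```

The far tail of the strip serves as the "approximately constant region," so the estimated correlation decays toward zero at large separations.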


Figure 12: Edge correlations as a function of position only. The orientation integrals ∫₀^{2π} E[M(0, 0, 0)M(x, y, θ)] dθ for the Paolina and blood cell images (left and right, respectively, of Fig. 10) are shown (left and right, respectively). Observe how the positive "ridge" of the correlation bends around in a circle (right), just as the image of blood cells is composed of circles. The black-blood-cell/white-background asymmetry is captured in the vertical asymmetry of the correlation. More importantly, this example shows that the apparently short-distance correlations of the blood cells (Fig. 11) were observed because we only considered collinear edges there. It is the non-collinear, indeed the co-circular [37], edges that give rise to the circular correlation pattern (right). Thus this figure supports the use of curvature in curve enhancement systems [53] to exploit longer-distance correlations: the contour structure to be enhanced may exist along a straight line (left) or along a circle (right).


[4] J. August and S. W. Zucker. Organizing curve elements with an indicator random field on the (unit) tangent bundle. In IEEE Workshop on Perceptual Organization in Computer Vision, Corfu, September 1999.

[5] A. P. Dempster, N. M. Laird, and D. B. Rubin. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B, 39(1):1–38, 1977.

[6] A. Dobbins, S. W. Zucker, and M. S. Cynader. Endstopped neurons in the visual cortex as a substrate for calculating curvature. Nature, 329(6138):438–441, 1987.

[7] E. B. Dynkin. Markov processes as a tool in field theory. Journal of Functional Analysis, 50:167–187, 1983.

[8] E. B. Dynkin. Gaussian and non-Gaussian fields associated with Markov processes. Journal of Functional Analysis, 55:344–376, 1984.

[9] J. Elder and A. Krupnik. Contour grouping with strong prior models. In The 3rd Workshop on Perceptual Organization in Computer Vision, Vancouver, 2001.

[10] D. J. Field, A. Hayes, and R. Hess. Contour integration by the human visual system: Evidence for a local 'association field'. Vision Research, 33:173–193, 1993.

[11] P. J. Fitzsimmons and J. Pitman. Kac's moment formula and the Feynman-Kac formula for additive functionals of a Markov process. Stochastic Processes and their Applications, 79:117–134, 1999.

[12] W. S. Geisler, J. S. Perry, B. J. Super, and D. P. Gallogly. Edge co-occurrence in natural images predicts contour grouping performance. Vision Research, 41:711–724, 2001.

[13] D. Geman and B. Jedynak. An active testing model for tracking roads in satellite images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(1):1–14, 1996.


[14] S. Geman and D. Geman. Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 6(6):721–741, 1984.

[15] G. H. Granlund and H. Knutsson. Signal Processing for Computer Vision. Kluwer Academic, Dordrecht, 1995.

[16] G. Guy and G. Medioni. Inferring global perceptual contours from local features. International Journal of Computer Vision, 20(1/2):113–133, 1996.

[17] C. W. Helstrom. Probability and Stochastic Processes for Engineers. Macmillan, New York, 1991.

[18] L. Herault and R. Horaud. Figure-ground discrimination: A combinatorial optimization approach. IEEE Transactions on Pattern Analysis and Machine Intelligence, 15(9):899–914, 1993.

[19] B. K. P. Horn. The curve of least energy. ACM Transactions on Mathematical Software, 9:441–460, 1983.

[20] J. Huang and D. Mumford. Statistics of natural images and models. In Proceedings, Computer Vision and Pattern Recognition, pages 541–547, 1999.

[21] L. A. Iverson. Toward Discrete Geometric Models for Early Vision. PhD thesis, McGill University, Montreal, 1994.

[22] L. A. Iverson and S. W. Zucker. Logical/linear operators for image curves. IEEE Transactions on Pattern Analysis and Machine Intelligence, 17(10):982–996, 1995.

[23] S. N. Kalitzin, B. M. ter Haar Romeny, and M. A. Viergever. Invertible orientation bundles on 2D scalar images. In Proc. Scale-Space '97, LNCS, pages 77–88. Springer, 1997.


[24] R. Z. Khas'minskii. On positive solutions of the equation Au + Vu = 0. Theory of Probability and Its Applications, 4(3):309–318, 1959.

[25] B. B. Kimia, I. Frankel, and A.-M. Popescu. Euler spiral for shape completion. In K. Boyer and S. Sarkar, editors, Perceptual Organization for Artificial Vision Systems, pages 289–309. Kluwer Academic, Boston, 2000.

[26] J. J. Koenderink and W. Richards. Two-dimensional curvature operators. J. Opt. Soc. Am. A, 5(7):1136–1141, 1988.

[27] N. Kruger. Collinearity and parallelism are statistically significant second-order relations of complex cell responses. Neural Processing Letters, 8:117–129, 1998.

[28] V. P. Leonov and A. N. Shiryaev. On a method of calculation of semi-invariants. Theory of Probability and Its Applications, 4(3):319–329, 1959.

[29] J. L. Marroquin. A Markovian random field of piecewise straight lines. Biological Cybernetics, 61:457–465, 1989.

[30] D. Martin, C. Fowlkes, D. Tal, and J. Malik. A database of human segmented natural images and its application to evaluating segmentation algorithms and measuring ecological statistics. In Proceedings of the Eighth International Conference on Computer Vision, volume II, pages 416–423, 2001.

[31] U. Montanari. On the optimal detection of curves in noisy pictures. Comm. ACM, 14:335–345, 1971.

[32] D. Mumford. Algebraic Geometry and Its Applications, chapter Elastica and Computer Vision, pages 491–506. Springer-Verlag, 1994.

[33] D. Mumford and B. Gidas. Stochastic models for generic images. Quart. Appl. Math., 59(1):85–111, 2001.


[34] C. L. Nikias and A. P. Petropulu. Higher-Order Spectra Analysis: A Nonlinear Signal Processing Framework. Prentice Hall, Englewood Cliffs, NJ, 1993.

[35] J. R. Norris. Markov Chains. Cambridge University Press, 1997.

[36] K. Okajima. A model visual cortex incorporating intrinsic horizontal neuronal connections. Neural Networks, 9(2):211–222, 1996.

[37] P. Parent and S. W. Zucker. Trace inference, curvature consistency, and curve detection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 11(8):823–839, August 1989.

[38] J. I. Richards and H. K. Youn. Theory of Distributions: A Non-technical Introduction. Cambridge University Press, Cambridge, UK, 1990.

[39] D. L. Ruderman. The statistics of natural images. Network: Computation in Neural Systems, 5:517–548, 1994.

[40] M. Sigman, G. A. Cecchi, C. D. Gilbert, and M. O. Magnasco. On a common circle: Natural scenes and Gestalt rules. Proceedings of the National Academy of Sciences, 98(4):1935–1940, 2001.

[41] E. P. Simoncelli. Modeling the joint statistics of images in the wavelet domain. In Proc. SPIE 44th Annual Meeting, volume 3813, 1999.

[42] Z. Tu and S.-C. Zhu. Parsing images into region and curve processes. In European Conference on Computer Vision, pages 393–407, 2002.

[43] A. Turiel, G. Mato, N. Parga, and J.-P. Nadal. Self-similarity properties of natural images resemble those of turbulent flows. Physical Review Letters, 80(5):1098–1101, 1998.


[44] S. Ullman. Filling-in the gaps: The shape of subjective contours and a model for their generation. Biological Cybernetics, 25:1–6, 1976.

[45] S. Urago, J. Zerubia, and M. Berthod. A Markovian model for contour grouping. Pattern Recognition, 28(5):683–693, 1995.

[46] A. van der Schaaf. Natural Image Statistics and Visual Processing. PhD thesis, Groningen University, The Netherlands, 1998.

[47] L. Williams and D. Jacobs. Stochastic completion fields: A neural model of illusory contour shape and salience. Neural Computation, 9(4):837–858, 1997.

[48] L. Williams, T. Wang, and K. Thornber. Computing stochastic completion fields in linear-time using a resolution pyramid. In Proc. of 7th Intl. Conf. on Computer Analysis of Images and Patterns, Kiel, Germany, 1997.

[49] L. R. Williams and K. K. Thornber. Orientation, scale, and discontinuity as emergent properties of illusory contour shape. Neural Computation, 13:1683–1711, 2001.

[50] L. R. Williams and J. Zweck. A rotation and translation invariant discrete saliency network. In Neural Information Processing Systems (NIPS '01), Vancouver, BC, Canada, 2001.

[51] A. L. Yuille and J. M. Coughlan. Fundamental bounds of Bayesian inference: Order parameters and phase transitions for road tracking. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(2):160–173, 2000.

[52] S.-C. Zhu. Embedding Gestalt laws in Markov random fields. IEEE Transactions on Pattern Analysis and Machine Intelligence, 21(11):1170–1187, 1999.

[53] S. W. Zucker, A. Dobbins, and L. Iverson. Two stages of curve detection suggest two styles of visual computation. Neural Computation, 1:68–89, 1989.


[54] S. W. Zucker, R. Hummel, and A. Rosenfeld. An application of relaxation labelling to line and curve enhancement. IEEE Trans. Computers, C-26:393–403, 922–929, 1977.

[55] J. Zweck and L. R. Williams. Euclidean group invariant computation of stochastic completion fields using shiftable-twistable functions. In Proc. of the 6th European Conference on Computer Vision, Dublin, Ireland, 2000.
