
Abdul Awwal
Automatic Alignment & Imaging Team
Integrated Computer Controls System

National Ignition Facility (NIF)
Lawrence Livermore National Laboratory

November 18-19, 2004

Two-Step Position Detection for NIF Automatic Alignment

Presentation to LLNL CASIS Workshop
Livermore, California

The work was performed under the auspices of the U.S. Department of Energy National Nuclear Security Administration by the University of California, Lawrence Livermore National Laboratory under Contract No. W-7405-ENG-48.

UCRL-PRES-208084


• Introduction

• Problem Statement

• Background

• Two-step algorithm examples: simple to complex

• Results

• Challenges

Outline


NIF facility flyover: National Ignition Facility


National Ignition Facility

[Facility layout labels: Master Oscillator, Capacitor Bay 3, Laser Bay 2, Target Chamber, Optics Assembly, Control Room, Switchyard 2]


• Automatic alignment uses video images of fiducials to find the position of the laser beam

Automatic alignment needs automated detection of beam position and automatic mirror adjustments


Problem: How do we detect positions when the fiducials vary in shape and size?

• Automatic alignment needs to detect the position of the beam

— Challenges

– Beams with different markers

– Same marker, but sizes vary due to defocus

– Circles of different sizes

– Variations are relatively large


Alignment image contains circles and squares


Defocused spots exhibit wide variation in size and quality


Different markers identify different beams


Some alignment images vary from 35 pixels to 200 pixels


How to choose the template?

• Centroid

• Template

Possible solutions


Classical matched filter (CMF):

$H_{CMF}(U_x, U_y) = F^*(U_x, U_y) = |F(U_x, U_y)|\exp(-j\Phi(U_x, U_y))$

Phase-only filter (POF):

$H_{POF}(U_x, U_y) = \exp(-j\Phi(U_x, U_y))$

Amplitude-modulated phase-only filter (AMPOF):

$H_{AMPOF}(U_x, U_y) = \dfrac{a\,F^*(U_x, U_y)}{\left[\,b + c\,|F(U_x, U_y)| + d\,|F(U_x, U_y)|^2\,\right]^m}$

Algorithm background
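Below is a minimal sketch, not the NIF production code, of how such a frequency-domain filter can be applied to a scene; the function name, the use of NumPy, and the default AMPOF constants a, b, c, d, m are illustrative assumptions.

    import numpy as np

    def ampof_correlate(scene, template, a=1.0, b=1e-3, c=0.0, d=1.0, m=1.0):
        # AMPOF-style correlation: H = a*conj(F) / (b + c*|F| + d*|F|^2)^m,
        # where F is the template spectrum. With b=c=0, d=1, m=0.5 this reduces
        # to the POF; with b=1, c=d=0 it reduces to the CMF.
        S = np.fft.fft2(scene)
        F = np.fft.fft2(template, s=scene.shape)   # zero-pad template to scene size
        H = a * np.conj(F) / (b + c * np.abs(F) + d * np.abs(F) ** 2) ** m
        return np.abs(np.fft.ifft2(S * H))          # correlation plane; peak = match

    # Usage: the peak gives the cross-correlation location
    # corr = ampof_correlate(scene, template)
    # y_cross, x_cross = np.unravel_index(np.argmax(corr), corr.shape)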


$x_{pos} = x_{cross} - x_{auto} + x_c$

$y_{pos} = y_{cross} - y_{auto} + y_c$

Position is obtained from the cross- and auto-correlation peaks and the original template location
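As a hedged illustration of this bookkeeping (the helper names and the template-center arguments x_c, y_c are assumptions, not names from the NIF code):

    import numpy as np

    def peak_location(corr):
        # Return (x, y) of the maximum of a correlation plane
        y, x = np.unravel_index(np.argmax(corr), corr.shape)
        return float(x), float(y)

    def beam_position(cross_corr, auto_corr, x_c, y_c):
        # x_pos = x_cross - x_auto + x_c, and likewise for y
        x_cross, y_cross = peak_location(cross_corr)
        x_auto, y_auto = peak_location(auto_corr)
        return x_cross - x_auto + x_c, y_cross - y_auto + y_c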


CMF output shows cross-correlation location

Peak detected


Problem


• Two circles, r1 < r < r2 and r3 < r < r4

• Square, d1 < d < d2

• Why?

• By design

• Due to optics

Problem


Two-step approach

• Step 1: Search for the right radius (see the sketch below)

• Segment based on blob size

— Segment 1: less than 800

— Segment 2: between 900 and 1100

— Segment 3: between 1200 and 1400

Flow: Segment the image → Detect edges → Search for r1 < r < r2

• Change to square, repeat step 1
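A minimal sketch of the step-1 radius search under stated assumptions: edges is an edge-detected image, the ring-template generator is hypothetical, and the normalization is simplified relative to the production filter.

    import numpy as np

    def circle_edge_template(radius, shape):
        # Binary ring of the given radius, centered in an array of the given shape
        y, x = np.mgrid[:shape[0], :shape[1]]
        r = np.hypot(x - shape[1] / 2, y - shape[0] / 2)
        return (np.abs(r - radius) < 1.0).astype(float)

    def best_radius(edges, radii):
        # Sweep candidate radii and return the one whose ring template gives
        # the strongest correlation peak against the edge image
        E = np.fft.fft2(edges)
        scores = []
        for r in radii:
            ring = circle_edge_template(r, edges.shape)
            corr = np.abs(np.fft.ifft2(E * np.conj(np.fft.fft2(ring))))
            # normalize by template energy so larger rings are not favored
            scores.append(corr.max() / (np.linalg.norm(ring) + 1e-9))
        return radii[int(np.argmax(scores))]

    # Usage: r_best = best_radius(edges, np.arange(12.0, 18.5, 0.5))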


Two-step process

• Step 2: Find the position

• Change to square, repeat step 2

Flow: Detect edges → Correlate the whole image with the chosen radius → Find location (sketched below)
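A self-contained sketch of step 2, assuming an edge-detected image and the radius chosen in step 1; a plain matched-filter correlation stands in for the production filter.

    import numpy as np

    def locate_at_radius(edges, best_r):
        # Step 2: correlate the whole edge image with a ring template of the
        # chosen radius and return the (x, y) pixel of the correlation peak
        y, x = np.mgrid[:edges.shape[0], :edges.shape[1]]
        ring = (np.abs(np.hypot(x - edges.shape[1] / 2,
                                y - edges.shape[0] / 2) - best_r) < 1.0).astype(float)
        corr = np.abs(np.fft.ifft2(np.fft.fft2(edges) * np.conj(np.fft.fft2(ring))))
        py, px = np.unravel_index(np.argmax(corr), corr.shape)
        return float(px), float(py)

    # The peak feeds the x_pos = x_cross - x_auto + x_c bookkeeping shown earlier.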


Segmentation is performed based on pixel size

• Feature sizes of interest (area in pixels): 1086, 1074, 1086, 836, 832, 836 (see the segmentation sketch below)

• Searching for squares (589.0, 158.5)

• Searching for circles at (52.0, 158.0)
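A minimal illustration of area-based blob segmentation; scipy.ndimage is an assumed choice, and the threshold and area bounds are placeholders rather than NIF values.

    import numpy as np
    from scipy import ndimage

    def blobs_by_area(binary_image, min_area, max_area):
        # Label connected blobs and keep only those whose pixel area
        # falls inside [min_area, max_area]
        labels, n = ndimage.label(binary_image)
        areas = ndimage.sum(binary_image, labels, index=range(1, n + 1))
        keep = [i + 1 for i, a in enumerate(areas) if min_area <= a <= max_area]
        return np.isin(labels, keep)

    # e.g. keep blobs whose areas fall between 900 and 1100 pixels:
    # mask = blobs_by_area(image > threshold, 900, 1100)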


Movie of the circle match in the correlation plane shows the correlation becomes a point at the right radius

Correlation is also a circle

Circle becomes smaller as it nears the right radius


Search selected a circle radius of 16.5 pixels

[Plot: circle correlation vs. radius; x-axis: radius in pixels (12 to 18), y-axis: normalized correlation]


Search selected a square with a dimension of 15.5 pixels

[Plot: square correlation vs. side; x-axis: square dimension in pixels (14 to 20), y-axis: normalized correlation]


Circles of two sizes and squares of a single size detected (after 3 searches and 3 correlations)


Example 2

• 5 circles, r1 < r < r2

— Where r1 = 35 and r2 = 260


Pinhole images could vary from 35 pixels to 200 pixels (radius)


Solution to complex problem: Example 2

• Successive approximation

• Search by matching

Flow: Centroid → Section the image at the centroid → Estimate r1 < r < r2 → Correlate the whole edge image with the radius range → Find location


Cross-section of images could provide an initial estimate of the radius, such as 37 and 204 pixels
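A hedged sketch of one way to get such an initial estimate from a cross-section through the blob centroid; the function name and threshold are assumptions.

    import numpy as np

    def radius_from_cross_section(image, cy, cx, threshold):
        # Estimate a circle radius as half the above-threshold extent of the
        # row and column profiles passing through the centroid (cy, cx)
        row_extent = int((image[int(round(cy)), :] > threshold).sum())
        col_extent = int((image[:, int(round(cx))] > threshold).sum())
        return 0.25 * (row_extent + col_extent)   # average of the two half-widths

    # The estimate seeds the r1 < r < r2 range for the correlation search.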


Edge-detected image


Two typical final results show the edge and center of the circular images


Refined algorithm

• Successive approximation

• Search by matching

Flow: Segment the image to find the ROI → Measure the pedestal → Estimate r1 < r < r2 → Correlate the segmented image with the range of radii → Find location (sketched below)

More details in paper no. 5556-30
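A sketch of how the refined pipeline might be strung together; it reuses the hypothetical best_radius and locate_at_radius helpers from the earlier sketches, and the pedestal-to-radius bounds follow the box-classifier ranges shown on the next slide.

    import numpy as np

    def detect_spot(image, threshold):
        # Refined detection: segment, bound the radius search from the pedestal
        # (blob) area, then run the two-step correlation over that radius range
        mask = image > threshold
        pedestal = int(mask.sum())                 # pedestal area in pixels

        # Box-classifier style bounds (ranges follow the talk's example values)
        if pedestal < 700:
            radii = np.arange(5.0, 10.5, 0.5)
        elif pedestal < 1000:
            radii = np.arange(11.0, 22.5, 0.5)
        else:
            radii = np.arange(14.0, 27.5, 0.5)

        edges = np.hypot(*np.gradient(image.astype(float)))   # simple edge map
        r_best = best_radius(edges, radii)         # step 1 (earlier sketch)
        return locate_at_radius(edges, r_best)     # step 2 (earlier sketch)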


Box classifier

Flow: Data → Feature estimation → Features → Cluster / Classifier → Classes


Search space shows 3 classes; the range is chosen to bound the class (box classifier)

[Scatter plot: radius vs. pedestal, showing Class I, Class II, and Class III; this sets 14 as the lower radius limit of class III]


Box classifier

if (pedestal lt 700) then
    minRad = 5
    maxRad = 10
elseif (pedestal lt 1000) then
    minRad = 11
    maxRad = 22
else
    minRad = 14
    maxRad = 27
end

Measure the pedestal area

Estimate r1 < r < r2


Detection of four defocused spots


Challenges

• Fiducials may be partially missing

• Blobs may be divided into two

• Successive approximation may fail if the initial guess is incorrect

— Automated guess correction by counting the right and wrong moves


• Wilbert McClay, Jim Candy, Walter Ferguson, Scott Burkhart, Karl Wilhelmsen (Automatic Alignment team)

• Erlan Bliss, Mark Bowers

• Eric Mertens, Patrick J. Opsahl (control room - data collection)

• Paul Van Arsdall (feedback)

Acknowledgement
