
Air Force Office of Scientific Research


Page 1: Air Force Office of Scientific Research

1

Air Force Office of Scientific Research

The Basic Research Manager for the Air Force

Distribution authorized to DoD components only (Critical Technology) (10/01/04). Other requests for

this document shall be referred to AFOSR/PIP

Basic Research: Target Recognition, Navigation

21 October 2004

Dr. Jon Sjogren, AFOSR/NM, 703-696-6564

www.afosr.af.mil

Page 2: Air Force Office of Scientific Research

2

Signals Communication/Surveillance 6.1 Funding Profile FY04

                                              Total Program    ATR-Navig
Intramural (lab tasks)                           $1,035 K       $  625 K
Extramural (university grants)                   $1,374 K       $  832 K
HBCU/MI, DEPSCoR, DURIP, STTR, DARPA ISP         $1,650 K       $  900 K
MURI                                             $1,838 K       $    0 K
Total Administered                               $5,897 K       $2,357 K

Page 3: Air Force Office of Scientific Research

3

• The “Programme” is to move toward Integrated treatment of

– Synthetic Aperture (SAR)

– High Resolution Ranging (HRR)

– Laser Radar (Ladar)

– Infra-Red (IR)

• Image Formation accentuates the target features that you seek

– Move target detection/identification/recognition toward the sensor

• Combine physical models (electromagnetic scattering) with statistical models of reception (Doppler, phase, and bearing)

• “Factor in” the clutter and hostile interference

– ‘structure of (radar) clutter as it affects detection has defied solution’ : Army Night Vision Lab

• Study of parameter spaces that describe complex scenes (several moving targets): “General Pattern Theory”, A. Lanterman interprets U. Grenander

Automatic Target Recognition (ATR): Foundations

Page 4: Air Force Office of Scientific Research

4

• Colorado State Univ. (Kirby): Self-correlation of images, eigen-object analysis, manifold structures and dimensional reduction of data.

• Georgia Tech. (Lanterman): Pattern recognition, structure of “clutter”.

• UC Santa Cruz (Milanfar): Video fusion, edge-modeling, motion and aliasing.

• Rice University (Baraniuk): Multi-dimensional wavelet transforms; rapid reconstruction of singularities.

• Yale Univ (S. Zucker): Multi-scale texture and color reconstruction; neural techniques based on animal visual recognition.

• Boston Univ. (Clem Karl): Unified enhancement and object extraction for ATR.

• Rensselaer Polytechnic Inst. (Yazici): Methods of representation of continuous groups in the design of digital filters.

• Arizona State Univ. (Morrell/Cochran): Adaptive sensing modality.

• Colorado State Univ. (Scharf/Chong): Waveform coding, information-theoretic processing and target/environment modeling. [DARPA]

• SUNY Buffalo (Soumekh): SAR return processing, full Sommerfeld interference model, exploitation of massive computation. [DURIP]

• Geophex Inc. (STTR, $500K): Time Exposure Acoustics (Passive)

Mathematical Methods Enable Sensing

Page 5: Air Force Office of Scientific Research

5

Aaron D. Lanterman, Georgia Institute of Technology

Problem
• Clutter may have just as much interesting structure as a target, yielding high false alarm rates

Objective

• ATR algorithms that are robust to scene variability, particularly clutter

• Algorithm-independent performance metrics for ATR problems

Scientific Approach
• Instead of trying to “filter out” clutter, allow the algorithm to estimate the clutter structure along with the targets

• Riding Moore’s law: real-time scene simulation once required an expensive Silicon Graphics workstation; now a cheap PC with a decent graphics card will do!

• Developing metrics based on Kullback-Leibler distances

Accomplishments/Transitions

Inference via Jump-Diffusion Processes

• To facilitate transition, jump-diffusion code is being refactored into flexible, reusable C++ classes employing OpenGL

• Developed Kullback-Leibler metrics for a radar scenario

Pattern-Theoretic Foundations of Automatic Target Recognition in Clutter
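The accomplishments above cite Kullback-Leibler metrics as algorithm-independent performance measures. As a minimal illustration (not the program's C++/OpenGL code), the sketch below computes the closed-form KL divergence between two hypothetical Gaussian radar-return models; the means and variances are illustrative assumptions only.

```python
import numpy as np

def kl_gaussian(mu0, var0, mu1, var1):
    """Closed-form KL divergence D(p0 || p1) between univariate Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

# Hypothetical return-amplitude models (numbers are illustrative only):
# p0 = clutter-only resolution cell, p1 = target-plus-clutter cell.
d01 = kl_gaussian(0.0, 1.0, 0.8, 1.5)
d10 = kl_gaussian(0.8, 1.5, 0.0, 1.0)
print(f"D(clutter || target) = {d01:.3f}, D(target || clutter) = {d10:.3f}")
# A larger KL distance means the two hypotheses are easier to tell apart,
# which is the sense in which KL can serve as an algorithm-independent metric.
```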

Page 6: Air Force Office of Scientific Research

6

Foundations of ATR with 3-D Data: Motivation from the DARPA E3D BAA

• DARPA’s E3D program seeks:
– “Efficient techniques for rapidly exploiting 3-D sensor data to precisely locate and recognize targets.”
– Achieve specific and detailed milestones.

• Natural questions:
– If such a milestone is not reached, is that the fault of the algorithm or the sensor?
– What performance from a particular sensor is necessary to achieve a certain level of ATR performance, independent of the question of what algorithm is used?

• The AFOSR Foundations of ATR program fills the gap

Page 7: Air Force Office of Scientific Research

7

• Derive lower bounds on the performance of any algorithm

• These are based on extraction of features
• Feature extraction may involve loss of information; so use all the data!

• Algorithm design is driven by real-time constraints imposed by current hardware

• Computers keep getting faster; what are the ultimate limits placed by the sensor hardware itself?

• Many ad hoc algorithms have been built

Foundations of ATR with 3-D Data

Applying the Grenander Program

• Pose is a nuisance variable in the ATR problem; pattern theory deals with it head-on

• At a given viewing angle, Target A at one orientation may look much like Target B at a different orientation

Page 8: Air Force Office of Scientific Research

8

Multi-scale Geometric Analysis

2-d complex wavelets 3-d hyper-complex wavelets

Richard Baraniuk, Rice University

• Highly directional atomic representation to match signal geometry

• Complex, quaternion, octonion structure matched to piecewise-smooth multi-D signals with singularities along manifolds

• Enables coherent magnitude/phase multi-scale analysis

• Applications: geometric multi-scale estimation, detection, classification, segmentation, compression

[Figure: “Barbara” test image; a real wavelet subband compared with the complex magnitude and complex phase of a 1-d complex wavelet]
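To make the “coherent magnitude/phase multi-scale analysis” bullet concrete, here is a minimal Python sketch using a generic analytic (complex Morlet-style) wavelet rather than Rice's dual-tree or hyper-complex constructions; the scale, carrier frequency, and test signal are illustrative assumptions.

```python
import numpy as np

def complex_wavelet(scale, length=128, f0=0.8):
    """Analytic (complex-valued) Morlet-style wavelet at the given scale."""
    t = (np.arange(length) - length / 2) / scale
    return np.exp(1j * 2 * np.pi * f0 * t) * np.exp(-t ** 2 / 2) / np.sqrt(scale)

# Piecewise-smooth 1-d test signal with a singularity (step edge) at n = 300.
n = np.arange(1024)
signal = np.where(n < 300, 0.2, 1.0) + 0.05 * np.sin(2 * np.pi * n / 90)

coeffs = np.convolve(signal, complex_wavelet(scale=16), mode="same")
magnitude, phase = np.abs(coeffs), np.angle(coeffs)

# The magnitude responds smoothly and peaks at the edge (shift-insensitive,
# unlike an oscillating real subband), while the phase encodes where within
# the wavelet support the edge sits: the coherent magnitude/phase property.
interior = magnitude[64:-64]          # avoid zero-padding artifacts at the ends
print("edge located near sample", int(np.argmax(interior)) + 64)
```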

Page 9: Air Force Office of Scientific Research

9

Complex Wavelet Analysis

[Figure: real wavelet coefficients vs. complex magnitude; color scale runs from blue (negative) through green (zero) to red (positive)]

Page 10: Air Force Office of Scientific Research

10

Multi-scale Geometric Compression

• Zoom of image compressed using JPEG2000 wavelet encoder

• Strong artifacts at low bit-rates

• Zoom of image compressed using geometry-based WSFQ coder

• Employs cartoon image model combining wavelets and wedgelets

– reduced artifacts
– state-of-the-art compression
– optimal approximation theorem

• Explicit geometric information in coded bit-stream

• Potential application: multi-scale geometric target representation
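As a toy illustration of the wedgelet half of the cartoon model (not the WSFQ coder itself), the sketch below fits a single wedgelet, two constant regions separated by a straight edge, to one image block by brute-force search; the block size and search grids are illustrative assumptions.

```python
import numpy as np

def fit_wedgelet(block, n_angles=32, n_offsets=16):
    """Brute-force fit of a wedgelet (two constant regions split by a straight
    edge) to a square image block; returns the approximation and its MSE."""
    h, w = block.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cx, cy = (w - 1) / 2, (h - 1) / 2
    best = (np.full_like(block, block.mean(), dtype=float),
            np.mean((block - block.mean()) ** 2))
    for theta in np.linspace(0, np.pi, n_angles, endpoint=False):
        # Signed distance of each pixel from a line at angle theta
        d = (xx - cx) * np.cos(theta) + (yy - cy) * np.sin(theta)
        for offset in np.linspace(d.min(), d.max(), n_offsets):
            side = d > offset
            if side.all() or (~side).all():
                continue
            approx = np.where(side, block[side].mean(), block[~side].mean())
            mse = np.mean((block - approx) ** 2)
            if mse < best[1]:
                best = (approx, mse)
    return best

# Toy block: dark/bright regions split by a diagonal edge plus noise.
yy, xx = np.mgrid[0:16, 0:16]
block = (xx + 0.5 * yy > 12).astype(float) \
        + 0.1 * np.random.default_rng(1).standard_normal((16, 16))
approx, mse = fit_wedgelet(block)
print(f"wedgelet MSE = {mse:.4f}  "
      f"(flat-block MSE = {np.mean((block - block.mean()) ** 2):.4f})")
```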

Page 11: Air Force Office of Scientific Research

11

Object-Image Metrics & Duality

d_Obj[u, x] = inf { d[x′, x] : x′ ∈ E_u }
d_Img[u, x] = inf { d[u′, u] : u′ ∈ E_x }

Duality Theorem: d_Obj[u, x] = d_Img[u, x]

E_u: the set of all images of the object u.
E_x: the set of all objects that could have produced the image x.

[Diagram: object u and image x linked by the Object-Image Relations]

Matching can (in principle) be performed in either object or image space without loss of performance!

Object Shape Space X = ℝ^{3n}/G        Image Shape Space U = ℝ^{2n}/Ĝ
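A toy numerical reading of the image-space side of this relation, under heavy assumptions: the object is a small 3-D point set, the object-image relation is orthographic projection under a coarsely sampled pair of rotations standing in for the group G, and point correspondences are taken as known. This does not construct the quotient shape-space metrics of the duality theorem; it only illustrates matching by minimizing over the set E_u of images of the object.

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def d_obj(obj_pts, img_pts, n=90):
    """Image-space matching: distance from the observed image to the set E_u of
    (orthographically projected) images of the object, minimized over a sampled
    set of rotations that stands in for the transformation group."""
    best = np.inf
    for a in np.linspace(0, 2 * np.pi, n, endpoint=False):
        for b in np.linspace(0, np.pi, n // 2, endpoint=False):
            x_u = (rot_x(b) @ rot_z(a) @ obj_pts.T)[:2].T   # one image of the object
            best = min(best, np.linalg.norm(x_u - img_pts))
    return best

# Toy 3-D object: a few feature points (illustrative numbers only).
obj = np.array([[0, 0, 0], [4, 0, 0], [2, 3, 0], [2, -3, 0], [2, 0, 1.0]])
true_img = (rot_x(0.7) @ rot_z(1.1) @ obj.T)[:2].T          # image at an unknown pose
print(f"d_Obj[u, x] ≈ {d_obj(obj, true_img):.3f}")          # near zero: x is an image of u
```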

Page 12: Air Force Office of Scientific Research

12

Statistical Modeling & Curve Evolution
W. Clement Karl, Boston Univ.

• Challenges
– Inclusion of accurate sensor and scene models in curve evolution methods
– Unified enhancement and object extraction for ATR

• Existing Methods
– Image enhancement followed by boundary extraction
– Physical sensor model often ignored

• Progress
– Joint ML-EM and curve evolution allowing explicit inclusion of sensor anomaly model and target range behavior
– Unified anomaly suppression and object extraction

• Application
– Laser radar range image target extraction
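For orientation, here is a schematic region-based curve-evolution step (a Chan-Vese-style level set) applied to a synthetic range map with anomalies. It is a generic sketch, not the joint ML-EM formulation above, and it does not reproduce the sensor anomaly model; sizes, step lengths, and the test scene are illustrative assumptions.

```python
import numpy as np

def curve_evolution_step(phi, img, dt=0.5, nu=0.2):
    """One explicit update of a region-based (Chan-Vese-style) level-set curve.
    phi: level-set function whose zero contour is the evolving boundary.
    img: observed image, e.g. a laser-radar range map."""
    inside = phi > 0
    c_in = img[inside].mean() if inside.any() else 0.0
    c_out = img[~inside].mean() if (~inside).any() else 0.0
    # Curvature term: keeps the extracted boundary smooth.
    gy, gx = np.gradient(phi)
    norm = np.sqrt(gx ** 2 + gy ** 2) + 1e-8
    curvature = np.gradient(gx / norm, axis=1) + np.gradient(gy / norm, axis=0)
    # Region term: each pixel pushes the curve toward the better-fitting mean.
    force = -(img - c_in) ** 2 + (img - c_out) ** 2
    return phi + dt * (force + nu * curvature)

# Synthetic range scene: a raised rectangular target plus range anomalies.
rng = np.random.default_rng(0)
img = np.zeros((64, 64)); img[20:44, 24:48] = 1.0
img[rng.random(img.shape) < 0.05] = 3.0                 # anomalous range returns
yy, xx = np.mgrid[0:64, 0:64]
phi = 20.0 - np.hypot(yy - 32, xx - 32)                 # initial curve: a circle
for _ in range(200):
    phi = curve_evolution_step(phi, img)
print("target pixels inside final curve:",
      int((phi > 0)[20:44, 24:48].sum()), "of", 24 * 24)
```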

Page 13: Air Force Office of Scientific Research

13

Laser Radar Range Data Example

True Synthetic Range Scene & Initial Curve

Reconstructed Scene & Extracted Object Boundary using statistical sensor and scene model

Laser Radar Observation With Range Anomalies

Page 14: Air Force Office of Scientific Research

14

Sensor and Processor Integration for Improved Resolution

P. Milanfar, University of California, Santa Cruz

Problems
• Spatial and temporal resolution of available imaging sensors is not always adequate.

Objective
• Improvement of spatial and temporal fidelity and resolution of video imagery.

• Optimal adaptation of imaging sensor and “impedance match” to post-processor resulting in:

– Improved information transfer from scene to user

– Improved usage of imaging system’s bandwidth

Scientific Approach
• Development of a fast and robust computational estimation framework based on the L1 norm.

– Prior based on new multi-scale edge model

• Study of performance limits via statistical bounds

– Improved algorithms minimize the lower bounds

• Measurement of information content in space/time

– Feedback to sensor to maximize information content

• Verify algorithms and approach on real data.

Accomplishments/Transitions
• Algorithms and software suite for resolution enhancement from video available to AFRL
– Video-to-still/Video (gray and color)

• Proof-of-concept implementation of sensor optimization on an IEEE 1394 camera.

• Transitions and extensions to
– Closed-loop operation of adaptive sensor
– Joint optimization and operation of sensor and resolution enhancement algorithms.

Infrared Sequence from AF Wright Labs

Before After

Page 15: Air Force Office of Scientific Research

15

f_k = Sample[ f(x, y, t_k) * h(x, y) ] + noise

f_k = Translate( f_j ; v_{j,k} ) + error

• Reconstruction Problem: Given the frames, estimate the high resolution image. (Super-resolution)

• Implicit problem: Estimate the motion vectors

Fusion of Multiple Video Frames

[Diagram: observed low-resolution frames f_1, f_2, …, f_N related by motion vectors v_{1,2}, v_{2,3}, … (nuisance parameters); the desired unknown is the high-resolution f(x, y, t)]
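A minimal sketch of this observation model and of a crude shift-and-add fusion step, standing in for the robust L1 estimator described on the previous page; the PSF, decimation factor, motion vectors, and test scene are illustrative assumptions (SciPy's ndimage routines supply the blur and translation, with circular boundaries to keep the toy exact).

```python
import numpy as np
from scipy.ndimage import gaussian_filter, shift as nd_shift

def observe(hr, v, psf_sigma=1.0, factor=4, noise=0.01, rng=None):
    """Forward model for one low-resolution frame:
    translate the scene by v, blur with the PSF h, decimate, add noise."""
    if rng is None:
        rng = np.random.default_rng()
    frame = gaussian_filter(nd_shift(hr, v, mode="wrap"),
                            psf_sigma, mode="wrap")[::factor, ::factor]
    return frame + noise * rng.standard_normal(frame.shape)

def shift_and_add(frames, shifts, factor=4, hr_shape=(128, 128)):
    """Crude fusion: place each LR pixel back on the HR grid at its offset and
    average.  Here the motion vectors are assumed known; estimating them is
    the 'implicit problem' on the slide."""
    acc = np.zeros(hr_shape); cnt = np.zeros(hr_shape)
    for f, (dy, dx) in zip(frames, shifts):
        ys = (np.arange(f.shape[0]) * factor - dy).round().astype(int) % hr_shape[0]
        xs = (np.arange(f.shape[1]) * factor - dx).round().astype(int) % hr_shape[1]
        acc[np.ix_(ys, xs)] += f
        cnt[np.ix_(ys, xs)] += 1
    return acc / np.maximum(cnt, 1)

rng = np.random.default_rng(0)
hr = np.kron(rng.random((16, 16)) > 0.5, np.ones((8, 8))).astype(float)  # toy HR scene
shifts = [(dy, dx) for dy in range(4) for dx in range(4)]   # v_{j,k}: one frame per offset
frames = [observe(hr, v, rng=rng) for v in shifts]
sr = shift_and_add(frames, shifts)
print("RMSE of fused estimate vs. HR scene:",
      round(float(np.sqrt(np.mean((sr - hr) ** 2))), 3))
```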

Page 16: Air Force Office of Scientific Research

16

Generic Super-resolution Algorithm

Motion Estimation

Image Reconstruction

Page 17: Air Force Office of Scientific Research

17

Effect of Aliasing

How does aliasing affect the ability to estimate translation between sets of images?

Little aliasing Lots of aliasing

Note “false” motions.
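A small numerical illustration of the point: decimating a scene whose energy lies above the post-decimation Nyquist rate destroys the correspondence that translation estimation relies on. The scene, shift, and decimation factor are illustrative assumptions; the estimator is a plain circular cross-correlation peak.

```python
import numpy as np

def estimate_shift(a, b):
    """Integer translation estimate from the peak of the circular cross-correlation."""
    corr = np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

rng = np.random.default_rng(0)
scene = rng.standard_normal((256, 256))        # fine texture above the decimated Nyquist rate
shifted = np.roll(scene, (0, 3), axis=(0, 1))  # true motion: 3 pixels horizontally

print("full-resolution estimate:", estimate_shift(shifted, scene))             # expect [0, 3]
print("4x-decimated estimate:   ", estimate_shift(shifted[::4, ::4], scene[::4, ::4]))
# The decimated (aliased) frames no longer share corresponding samples, so the
# second estimate is a spurious "false motion" unrelated to the true shift.
```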

Page 18: Air Force Office of Scientific Research

18

“Algebraic and Topological Structure for Signal and Image Processing”,

Michael Kirby, Colorado State Univ.

• Large data sets of images or signals often possess “geometric structure” that may be exploited to assist in analysis, classification and representation.

• Failure to exploit such structure leads to inferior solutions.

• Data may be represented by manifolds or algebraic varieties. New algorithms involve

• Geometric, Algebraic, Topological Approaches

• Whitney’s theorem. Nash’s theorem.

• Parameterizing Subspace Optimization Problem

• Smooth optimization over Grassmannians.

• Maxi-min approximation criterion.

Michael Kirby, Department of Mathematics, Colorado State University www.math.colostate.edu/~kirby

Page 19: Air Force Office of Scientific Research

19

[Figure: eigen-image coefficient variation across an image sequence, with sample frames shown at images 25, 100, and 200]

Shortcomings of Subspace Methods

Problem
• Subspace approaches not optimal given large variations in eigen-coefficients over one person.

• Face images under varying pose and illumination lie on a manifold! (see left)

Objective
• Model manifolds directly.

• Classification on manifolds versus subspaces

Applications
• Biometrics, human identification, face recognition, machine lip reading, signal separation.


Michael Kirby, Department of Mathematics, Colorado State University www.math.colostate.edu/~kirby

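To make “classification on manifolds versus subspaces” concrete, a minimal sketch (not Kirby's algorithms) compares image sets by the principal angles between their best-fitting subspaces, i.e. a geodesic-style distance on the Grassmannian; the synthetic image sets and subspace dimension are illustrative assumptions.

```python
import numpy as np

def subspace(data, k=3):
    """Orthonormal basis for the k-dimensional subspace best fitting the
    column vectors in `data` (each column = one vectorized image)."""
    u, _, _ = np.linalg.svd(data - data.mean(axis=1, keepdims=True),
                            full_matrices=False)
    return u[:, :k]

def grassmann_distance(A, B):
    """Geodesic-style distance on the Grassmannian: l2 norm of the principal
    angles between the subspaces spanned by orthonormal bases A and B."""
    s = np.clip(np.linalg.svd(A.T @ B, compute_uv=False), -1.0, 1.0)
    return np.linalg.norm(np.arccos(s))

# Toy "image sequences": three sets of vectorized images, two drawn from the
# same low-dimensional generative model and one from a different model.
rng = np.random.default_rng(0)
basis1, basis2 = rng.standard_normal((400, 3)), rng.standard_normal((400, 3))
seq_a = basis1 @ rng.standard_normal((3, 50)) + 0.05 * rng.standard_normal((400, 50))
seq_b = basis1 @ rng.standard_normal((3, 50)) + 0.05 * rng.standard_normal((400, 50))
seq_c = basis2 @ rng.standard_normal((3, 50)) + 0.05 * rng.standard_normal((400, 50))
d_same = grassmann_distance(subspace(seq_a), subspace(seq_b))
d_diff = grassmann_distance(subspace(seq_a), subspace(seq_c))
print(f"same model: {d_same:.3f}   different model: {d_diff:.3f}")  # expect d_same << d_diff
```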

Page 20: Air Force Office of Scientific Research

20

Steven Zucker, David and Lucile Packard Professor,Yale University

Column-to-column interactions

Primate visual cortex

• Biologically-inspired work funded out of Life Sciences as part of the AFOSR (NM and NL) Data Fusion Concentration

• Models of the Primate Cortex motivate Visual recognition algorithms that go beyond Edge Detection

• Column-to-Column interaction among vision cells: object recognition through “consistent orientations” e.g. of attached shadow

Cognition and Image Fusion/Recognition

Page 21: Air Force Office of Scientific Research

21

Biologically-motivated ATR

[Figure panels: Yale model, Standard model, China Lake imagery]

Layer-to-layer interactions

• By contrast, Layer-to-Layer interactions lead to a suite of Non-Linear operators well-suited for Object Detection

• In foggy and other scenes where obscuration is heavy, the China Lake database shows out-performance relative to the Canny model on the left (on the right the ship stands out)

• The complementary mathematical theory is based on the Unit Tangent Bundle for the shading flow field (the cells detect feature orientation)

Page 22: Air Force Office of Scientific Research

22

Vision-based Precision Navigation and Control

• Various Vision Algorithms offer the potential to reduce/eliminate reliance on GPS or other external navigation sources
– Optic Flow
– Feature Tracking
– Bio-Inspired Vision Systems

• Provides the ability to navigate indoors/underground to survey denied targets

• Vision-based control could reduce reliance on onboard IMU systems and improve robustness to extended operating conditions

Sample Aerial Imagery with optic flow vectors
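As a minimal illustration of the optic-flow item above (not a flight-ready algorithm), the sketch below estimates a single Lucas-Kanade flow vector from two synthetic frames; the window size and test pattern are illustrative assumptions.

```python
import numpy as np

def lucas_kanade(frame0, frame1, y, x, win=7):
    """Single-window Lucas-Kanade: least-squares optic-flow vector (vx, vy)
    for the patch centered at (y, x), from spatial and temporal gradients."""
    h = win // 2
    p0 = frame0[y - h:y + h + 1, x - h:x + h + 1].astype(float)
    p1 = frame1[y - h:y + h + 1, x - h:x + h + 1].astype(float)
    iy, ix = np.gradient(p0)
    it = p1 - p0
    A = np.stack([ix.ravel(), iy.ravel()], axis=1)
    v, *_ = np.linalg.lstsq(A, -it.ravel(), rcond=None)
    return v                                  # (vx, vy) in pixels/frame

# Toy aerial frames: a smooth intensity pattern translated by (vx, vy) = (1, 0.5).
yy, xx = np.mgrid[0:64, 0:64].astype(float)
frame0 = np.sin(xx / 6.0) + np.cos(yy / 8.0)
frame1 = np.sin((xx - 1.0) / 6.0) + np.cos((yy - 0.5) / 8.0)
print("estimated flow at image center:", np.round(lucas_kanade(frame0, frame1, 32, 32), 2))
```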

Page 23: Air Force Office of Scientific Research

23

A Few Of Many Benefits Arising from ‘Local-Sensing’ Navigation

Agile Autonomous Flight
• Ability to navigate without external aid
• Can fly in complex environments without extensive mission planning

• Self-awareness of surroundings and other movers

• Can build detailed 3D maps, perform wide-area 3D autonomous target search, and generate coordinates for active sensor cueing

Indoor Autonomous Agents
• Ability to self-navigate without external sources
• Can explore complex environments with no a priori map

• Self-awareness of potential threats – ability to “hide”

• Can obtain 3D maps of denied targets for mission planning, possible ability to conduct functional defeat of denied targets

Page 24: Air Force Office of Scientific Research

24

• Fulfilling the Long-Term Challenge “Finding and Tracking” depends on sensing, mathematical-statistical signals analysis, data fusion, and bio-mimetics

• Collaboration between AFRL, DARPA and National Agencies to achieve ATR in our lifetime

• AFOSR moves to sparkplug new methodologies in Imaging science that provide the leading edge of surveillance systems development

• Early support and encouragement to the most promising investigators working on problems of critical relevance

• The benefits to DoD of a research focus managed by AFOSR should reach beyond a single research generation

Summary