
Data Processing and Recording at University of Southern California

M.I. Todorovska and V.W. Lee

Civil Engineering Department, University of Southern California

Los Angeles, CA 90089-2531

www.usc.edu/dept/civil_eng/Earthquake_eng/

Laboratories

• Strong Motion Data Processing Laboratory (established in 1976)

• Strong Motion Recording Laboratory (established in 1978, in support of Los Angeles and Vicinity Strong Motion Network)

People

• Faculty: M.D. Trifunac, V.W. Lee, M.I. Todorovska

• Graduate students and postdoctoral associates who took an interest in this topic (E.I. Novikova).

SM Data Processing Laboratory – Purpose

• Software development for routine and specialized processing of analog and digital strong motion accelerograms.

• Routine processing of large accelerogram data sets and database organization for use in regression analyses.

• Large scale regression analyses for empirical scaling of strong ground motion.

Activities later extended to:

• Advanced calibration of strong motion instruments.

• Ambient vibration surveys of full-scale structures.

• Structural health monitoring and damage detection.

Presentation outline

• The network

• Digitization of accelerograms recorded on film (LeFilm software package)

• Processing of digitized and digital accelerograms (LeBatch software package)

The Los Angeles and Vicinity SM Network

• Operating since 1980.

• 80 analog stations with absolute time (SMA-1).

• First urban SM network.

• Sensitivity calibrated in 1996.

• Supported by NSF.

Processed data

No.  Earthquake name              Date              M    H (km)  Records
 1   Santa Barbara Island         09/04/1981        5.5    5        7
 2   North Palm Springs           07/08/1986        5.9   10       15
 3   Oceanside                    07/13/1986        5.3    9        6
 4   Whittier-Narrows             10/01/1987        5.9   14       68
 5   Whittier-Narrows aft. (1)    10/01/1987        3.8   15       16
 6   Whittier-Narrows aft. (2)    10/01/1987         -     -        1
 7   Whittier-Narrows aft. (3)    10/01/1987        4.4   14       33
 8   Whittier-Narrows aft. (4)    10/01/1987        3.5   14        3
 9   Whittier-Narrows aft. (5)    10/01/1987        3.9   13       41
10   Whittier-Narrows aft. (6)    10/01/1987        3.1   15        3
11   Whittier-Narrows aft. (7)    10/01/1987        4.0   15       35
12   Whittier-Narrows aft. (8)    10/01/1987        3.3   15        2
13   Whittier-Narrows aft. (9)    10/01/1987        3.8   15       12
14   Whittier-Narrows aft. (10)   10/01/1987        3.3   16        2
15   Whittier-Narrows aft. (11)   10/01/1987        3.4   15        1
16   Whittier-Narrows aft. (12)   10/04/1987        5.3   13       60
17   Whittier-Narrows aft. (13)   02/11/1988        4.7   17       37
18   Sierra Madre                 06/28/1991        5.8   12       65
19   Landers                      06/28/1992        7.5    5       61
20   Big Bear                     06/28/1992        6.5    5       50
21   Northridge                   01/17/1994        6.7   18       65
22   Northridge aftershocks       01/17-03/23/1994  >5     -      115
23   Hector Mine                  10/16/1999        7.1    6       39

Digitization of film records

• Hardware: PC, flatbed scanner and printer

• Software: LeAuto system of interactive, menu-driven programs

• Analog-to-digital image conversion

• Automatic trace following

• Editing

• Trace concatenation

Remarks

• Trace following is a highly non-unique estimation process; the result depends on the chosen threshold level (see the sketch after this list).

• Scanning resolution: 600 dpi is optimal.

• Depth: a minimum of 256 levels of gray is recommended.

• High-contrast image enhancement should be avoided unless it is essential.
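To illustrate why the traced curve depends on the threshold level, here is a minimal, hypothetical sketch of column-by-column trace following on a scanned record. It is not the LeAuto algorithm; the function name, synthetic image and numbers are ours.

```python
# Minimal, hypothetical sketch of column-by-column trace following on a
# scanned film record; NOT the LeAuto algorithm.  It only illustrates why the
# recovered trace depends on the chosen gray-level threshold.
import numpy as np

def follow_trace(image, threshold):
    """image: 2-D array of gray levels (0 = black trace, 255 = white film).
    Returns one trace ordinate per pixel column (NaN where no pixel in the
    column is darker than the threshold)."""
    n_rows, n_cols = image.shape
    trace = np.full(n_cols, np.nan)
    for col in range(n_cols):
        dark = np.where(image[:, col] <= threshold)[0]  # pixels darker than threshold
        if dark.size:
            trace[col] = dark.mean()                    # centroid of the dark pixels
    return trace

# The same synthetic scan traced with two different thresholds:
rng = np.random.default_rng(0)
img = np.full((200, 500), 255.0)
rows = (100 + 40 * np.sin(np.linspace(0.0, 10.0, 500))).astype(int)
img[rows, np.arange(500)] = 30.0                        # the trace itself
img += rng.normal(0.0, 10.0, img.shape)                 # film grain / scanner noise
t_low = follow_trace(img, threshold=60)
t_high = follow_trace(img, threshold=120)
print("mean |difference| between thresholds:", np.nanmean(np.abs(t_low - t_high)))
```

Raising or lowering the threshold changes which pixels are attributed to the trace, and hence the recovered ordinates, which is one reason operator experience and quality control matter.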

Remarks (cont.)

• The limit on recovering high frequencies is not the scanner resolution but the finite width of the light beam.

• The quality of the digitized data depends critically on the operator's experience and on quality control.

• We digitize the entire recorded length.

Illustrations

Digital Signal Processing

• Software package: LeBatch (Lee and Trifunac, 1990).

• Programs: Volumes 1, 2 and 3

Volume 1 processing

• Scaling of digitized image to time-acceleration units.

• Initial baseline correction (subtraction of the fixed trace and removal of a linear trend); see the sketch after this list.

• Correction for transducer misalignment and cross-axis sensitivity.

• Output: unequally spaced “uncorrected” acceleration.
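A rough illustration of the scaling and initial baseline-correction steps listed above. The function and argument names, and the sensitivity value in the example, are our assumptions, not the LeBatch interface; the misalignment and cross-axis corrections are not shown.

```python
# Rough sketch of Volume 1 style scaling and initial baseline correction.
import numpy as np

def volume1_uncorrected(t, trace_cm, fixed_cm, sensitivity_cm_per_g):
    """t: digitized times (s), generally unequally spaced.
    trace_cm: accelerogram ordinates read off the film (cm).
    fixed_cm: fixed (reference) trace ordinates at the same times (cm)."""
    rel = trace_cm - fixed_cm                 # subtract the fixed trace
    acc = rel / sensitivity_cm_per_g * 981.0  # film deflection -> acceleration (cm/s^2)
    slope, intercept = np.polyfit(t, acc, 1)  # least-squares straight line
    return acc - (slope * t + intercept)      # remove the linear trend

# Example with made-up numbers (the sensitivity value is illustrative only):
t = np.sort(np.random.default_rng(1).uniform(0.0, 20.0, 400))
trace = 0.05 * np.sin(2 * np.pi * 1.5 * t) + 0.002 * t + 0.20  # cm, slight drift
fixed = np.full_like(t, 0.20)                                   # cm
acc_uncorrected = volume1_uncorrected(t, trace, fixed, sensitivity_cm_per_g=1.9)
```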

Volume 2 processing

• Instrument correction – standard and higher order.

• Band-pass filtering – to ensure SNR > 1.

• Ormsby filter – minimal phase distortion.

• Filtering – by convolution, in the time domain.

• End conditions – even extension beyond the domain of the data (see the sketch after this list).
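Below is a minimal sketch of the band-pass step as described: a trapezoidal (Ormsby-type) amplitude response realized as a symmetric FIR kernel, applied by convolution in the time domain, with even extension of the data at both ends. It only illustrates the idea; the kernel here is obtained by frequency sampling rather than from Ormsby's analytic impulse response, and all names and defaults are ours.

```python
# Trapezoidal (Ormsby-type) band-pass applied by time-domain convolution,
# with even (mirror) extension as the end condition.  A sketch, not LeBatch.
import numpy as np

def ormsby_like_kernel(dt, f1, f2, f3, f4, half_len):
    """Symmetric kernel of length 2*half_len+1 whose amplitude response
    approximates: 0 below f1, ramp to 1 at f2, flat to f3, ramp to 0 at f4."""
    n = 2 * half_len + 1
    freqs = np.fft.rfftfreq(n, dt)
    H = np.interp(freqs, [0.0, f1, f2, f3, f4, freqs[-1]],
                         [0.0, 0.0, 1.0, 1.0, 0.0, 0.0])
    h = np.fft.irfft(H, n)              # real, circularly symmetric about sample 0
    return np.roll(h, half_len)         # centre the peak so the kernel is zero-phase

def bandpass_even_extension(acc, dt, f1, f2, f3, f4, half_len=500):
    """Band-pass by convolution in the time domain; even extension at both ends."""
    h = ormsby_like_kernel(dt, f1, f2, f3, f4, half_len)
    ext = np.concatenate([acc[half_len:0:-1],           # mirror of the beginning
                          acc,
                          acc[-2:-half_len - 2:-1]])    # mirror of the end
    filtered = np.convolve(ext, h, mode='same')
    return filtered[half_len:half_len + len(acc)]

# Example: 0.1-0.2 Hz low-frequency ramp, 25-27 Hz high-frequency ramp, dt = 0.01 s.
acc = np.random.default_rng(0).normal(size=4000)
acc_bp = bandpass_even_extension(acc, dt=0.01, f1=0.1, f2=0.2, f3=25.0, f4=27.0)
```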

Volume 2 processing (cont.)

• High frequency cut-off: fixed at 25-27 Hz in automatic mode (for standard processing).

• Low-frequency cut-off: in automatic mode, determined by the program from a representative noise spectrum so that SNR > 1; component specific. Checked by the operator through visual inspection of the velocity and displacement time histories, considering earthquake magnitude, distance, etc. (a sketch follows).
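A crude sketch of such an automatic choice, assuming a representative digitization-noise amplitude spectrum is supplied by the caller; this is not the actual LeBatch noise model.

```python
# Pick the low-frequency cut-off as the lowest frequency at which the record's
# Fourier amplitude exceeds a representative noise spectrum (SNR > 1).
import numpy as np

def low_frequency_cutoff(acc, dt, noise_freqs, noise_amps):
    freqs = np.fft.rfftfreq(len(acc), dt)
    amp = np.abs(np.fft.rfft(acc)) * dt                 # Fourier amplitude spectrum
    noise = np.interp(freqs, noise_freqs, noise_amps)   # noise amplitude at same frequencies
    above = amp > noise                                  # bins where SNR > 1
    if not above[1:].any():
        return None                                      # record never rises above the noise
    first = 1 + np.argmax(above[1:])                     # first bin (excluding DC) with SNR > 1
    return freqs[first]
```

In practice the operator then inspects the resulting velocity and displacement before accepting the cut-off, as stated above.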

Volume 2 processing (cont.)

• For specialized applications that require linear combination of different traces, the record is filtered with the same low frequency cut-off (the highest of the low frequency cut-offs chosen by the program).

• Necessary, e.g., for analyses of building records and of radial and transverse motions (see the sketch below).
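For example, radial and transverse motions are linear combinations of the two horizontal components, so both components are first re-filtered with the common (highest) low-frequency cut-off. In this sketch a Butterworth high-pass stands in for the Ormsby filter, and the azimuth convention and all names are illustrative assumptions.

```python
# Re-filter both horizontal components with the common low-frequency cut-off
# before rotating into radial and transverse motions.
import numpy as np
from scipy.signal import butter, filtfilt

def radial_transverse(acc_ns, acc_ew, dt, cutoff_ns, cutoff_ew, azimuth_deg):
    fl = max(cutoff_ns, cutoff_ew)                   # common low-frequency cut-off
    b, a = butter(4, fl, btype='highpass', fs=1.0 / dt)
    ns = filtfilt(b, a, acc_ns)
    ew = filtfilt(b, a, acc_ew)
    az = np.radians(azimuth_deg)                     # epicentre-to-station azimuth, from north
    radial = ns * np.cos(az) + ew * np.sin(az)       # positive away from the source
    transverse = -ns * np.sin(az) + ew * np.cos(az)
    return radial, transverse
```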

Volume 2 processing (cont.)

• Output: “corrected” acceleration, velocity and displacement, equally sampled at 100 points per second.
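A minimal sketch of producing such output, assuming the corrected acceleration is resampled by linear interpolation and integrated by the trapezoid rule; the actual LeBatch resampling and integration scheme may differ.

```python
# Corrected acceleration resampled to 100 samples/s, then integrated to
# velocity and displacement.
import numpy as np

def volume2_output(t, acc, sps=100):
    """t, acc: corrected acceleration at possibly unequally spaced times (s)."""
    dt = 1.0 / sps
    t_eq = np.arange(t[0], t[-1], dt)
    a = np.interp(t_eq, t, acc)                                          # resample to 100 sps
    v = np.concatenate(([0.0], np.cumsum(0.5 * (a[1:] + a[:-1]) * dt)))  # velocity
    d = np.concatenate(([0.0], np.cumsum(0.5 * (v[1:] + v[:-1]) * dt)))  # displacement
    return t_eq, a, v, d
```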

Remarks

• High-pass filtering is a form of baseline correction (proposed by Trifunac in 1971).

• It is necessary for analog records to remove a “wavy” baseline.

• We apply the same baseline correction to digital records. Piecewise baseline offsets, apparently instrument related, are not uncommon. At very long periods, the recorded linear acceleration is “contaminated” by contributions from rotations to the transducer response (which can be regarded as “noise”). A small demonstration follows.
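The following toy demonstration, with made-up numbers, shows why a slowly varying baseline must be removed: after double integration, even a small long-period drift dominates the displacement. A Butterworth high-pass stands in for the Ormsby filter used in practice.

```python
# Why high-pass filtering acts as a baseline correction: a small "wavy" drift
# in the acceleration dominates the displacement after double integration
# unless it is removed.
import numpy as np
from scipy.signal import butter, filtfilt

dt = 0.01
t = np.arange(0.0, 40.0, dt)
true_acc = np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.1 * t)  # "true" motion, 2 Hz
drift = 0.02 * np.sin(2 * np.pi * 0.02 * t)                # wavy baseline, 50 s period
recorded = true_acc + drift

def double_integrate(acc, dt):
    vel = np.cumsum(acc) * dt
    return np.cumsum(vel) * dt

b, a = butter(4, 0.1, btype='highpass', fs=1.0 / dt)        # 0.1 Hz cut-off
raw_disp = double_integrate(recorded, dt)
corrected_disp = double_integrate(filtfilt(b, a, recorded), dt)
print("peak displacement, raw vs high-passed:",
      np.max(np.abs(raw_disp)), np.max(np.abs(corrected_disp)))
```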

Remarks (cont.)

• Permanent displacement cannot be computed reliably from recorded linear accelerations unless rotations are measured independently (Trifunac and Todorovska, 2001).
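As a hedged illustration of the argument (our notation, not a formula from the slide): to first order, a horizontal pendulum transducer records

```latex
a_{\mathrm{rec}}(t) \;\approx\; \ddot{u}(t) + g\,\psi(t),
```

where \ddot{u}(t) is the translational acceleration along the sensitive axis, g is the acceleration of gravity, and \psi(t) is the tilt about the perpendicular horizontal axis. Double integration of the unknown g\psi(t) term produces a spurious displacement that grows with time, which is why the permanent displacement cannot be separated from tilt unless the rotations are recorded independently.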

Conclusions

• Digitization and signal processing are both art and science.

• There is no exact answer.

• Both can be viewed as estimation processes: extracting a signal contaminated by noise.

Conclusions (cont.)

• Permanent displacement cannot be recovered reliably from 3-comp. translational transducers.

• High-pass filtering is still the most reliable method for baseline correction.

Conclusions (cont.)

• Peak displacement is meaningful only if the frequency band is also specified.

• Best data processing methods depend on the application.

• Uncorrected acceleration and instrument constants should always be supplied for custom processing by advanced users.

Conclusions (cont.)

• The profession would benefit from higher spatial resolution of recording (Trifunac and Todorovska, 2001).

• We need more affordable instruments.
