Source catalog generation
Aim: Build the LAT source catalog (1, 3, 5 years)
Jean Ballet, CEA Saclay GSFC, 29 June 2005
Four main functions:
Find unknown sources over the whole sky (output: list of positions).
Localize sources (output: list of precise positions and uncertainties).
Characterize sources (significance, flux, spectrum, variability). This is no different from studying already known sources, and can be done using the likelihood method.
Identify sources (find counterparts in existing external catalogs).
Catalog production pipeline
[Flow diagram: LAT raw data → remove space-ground transmission artifacts → Level 0 database (raw events) → "pipeline" processing at the LAT ISOC, using calibration and ancillary data → Level 1 database (reconstructed events). At CEA Saclay, photon maps and exposure maps (U4, U6) feed the source search; maximum likelihood (A1) characterizes the source list using the interstellar model (U5); source identification (A2) finds counterparts. Output: the LAT catalog (D5), a list of sources and identifications, stored in the catalog database (D6).]
Catalog pipeline. Sequence
Aim: Implement automatic loop to find and characterize the sources
Minimal features:
1. Detect sources over current diffuse model
2. Get a precise position
3. Run Likelihood on all sources to get precise flux, spectrum and significance
4. Split into time pieces to get light curve
5. Run sourceIdentify to get counterparts
Task scheduling tool (like OPUS) for distributing work over CPUs
Simple database for bookkeeping and for the source lists
Associated product: sensitivity maps in several energy bands, or tool to provide minimum detectable flux as a function of spectral index and duration
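The loop above can be sketched in Python. All step functions here are hypothetical placeholders standing in for the real tools (source detection, likelihood, sourceIdentify), not actual interfaces; the structure only illustrates the sequence of the five minimal features.

```python
# Minimal sketch of the catalog pipeline loop (hypothetical stubs,
# not the real LAT tool interfaces).

def detect_sources(photon_map, diffuse_model):
    """Step 1: detect excesses over the current diffuse model."""
    return [{"name": "SRC001", "ra": 83.6, "dec": 22.0}]  # stub

def localize(source):
    """Step 2: refine the position (stub keeps it unchanged)."""
    return source

def run_likelihood(source):
    """Step 3: fit flux, spectral index and test statistic."""
    source.update({"flux": 1e-6, "index": 2.1, "ts": 25.0})
    return source

def light_curve(source, n_bins):
    """Step 4: split the observation into time pieces."""
    source["light_curve"] = [source["flux"] / n_bins] * n_bins
    return source

def identify(source, catalogs):
    """Step 5: look for counterparts in external catalogs."""
    source["counterparts"] = []  # stub: no match found
    return source

def catalog_pipeline(photon_map, diffuse_model, n_time_bins=4, catalogs=()):
    sources = detect_sources(photon_map, diffuse_model)
    out = []
    for src in sources:
        src = localize(src)
        src = run_likelihood(src)
        src = light_curve(src, n_time_bins)
        src = identify(src, catalogs)
        out.append(src)
    return out
```

In the real pipeline each step would be a separate task dispatched by the scheduler, with the bookkeeping database recording which sources have passed which stage.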
Catalog pipeline. Schedule
1. Identify candidate source search algorithms. Done
2. Define evaluation criteria. Not yet concluded.
3. Build pipeline prototype. Good progress.
4. Evaluate candidate algorithms. Done on DC1.
5. First selection of source search algorithm. Not done yet. Wait until DC2.
6. Define processing database. By end 2005
7. Integrate pipeline elements (including flux history, identification). 2006.
8. Ready: end 2006 (for DC3)
All-sky source search. Algorithms
1. Wavelet algorithm developed at Perugia (F. Marcucci, C. Cecchi, G. Tosti)
2. Wavelet algorithm developed at Saclay (MR_filter by J.L. Starck, presented in September 2004). On DC1, was quite comparable to 1.
3. Optimal filter PSF/(1+BKG) in Fourier space (myself, presented in September 2004). On DC1, did not find as many sources as 1 or 2 but returns source significance.
4. Multichromatic wavelet method developed by S. Robinson and T. Burnett. Tries using the PSF at the energy of each photon. On DC1, did not find as many sources as 1 or 2 but still interesting for localization.
5. Aperture photometry tried by T. Stephens as a reference. On DC1, did not find as many sources as the other methods but useful exercise.
6. Voronoi algorithm proposed by J. Scargle. Was not ready for DC1.
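To illustrate the optimal-filter idea (method 3), here is a toy 1-D version: correlate the counts with the PSF, down-weighting each pixel by its background level. The numbers are invented and the real method works on 2-D sky maps in Fourier space; this direct-space sketch only shows why the filter peaks at the source position.

```python
# Toy 1-D matched filter ~ PSF/(1+BKG), direct-space version of the
# Fourier-space optimal filter described above. Illustrative only.

def matched_filter(counts, psf, background):
    """Cross-correlate counts with psf, weighted by 1/(1+background)."""
    half = len(psf) // 2
    n = len(counts)
    out = []
    for i in range(n):
        s = 0.0
        for j, w in enumerate(psf):
            k = i + j - half
            if 0 <= k < n:
                s += counts[k] * w / (1.0 + background[k])
        out.append(s)
    return out

# A point source at pixel 5 on a flat background of 1 count/pixel.
counts = [1, 1, 1, 1, 2, 6, 2, 1, 1, 1]
psf = [0.2, 0.6, 0.2]            # simple 3-pixel PSF
background = [1.0] * 10
filtered = matched_filter(counts, psf, background)
peak = filtered.index(max(filtered))   # peak lands on the source pixel
```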
Source localization
Done locally (for each source in turn)
Typical algorithm (like SExtractor) uses a smoothed map as input, and interpolates to find the maximum.
Another possibility would be to use a different algorithm (like the multichromatic wavelet) to localize sources once they are detected.
We need to provide a precision on source position
Building TSmaps for all sources is certainly VERY CPU intensive.
The precision depends mostly on the source spectrum (estimated by likelihood) and the source significance. This can be computed once and for all, and simulations can tell whether secondary parameters (like background shape) are important.
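The SExtractor-like approach above can be sketched as follows: take the maximum pixel of the smoothed map, then refine each coordinate with a parabola through the peak and its two neighbours. The map and the interpolation formula here are a generic illustration, not the actual pipeline code.

```python
# Sketch of SExtractor-style localization: maximum pixel of a
# smoothed map, refined by 1-D parabolic interpolation per axis.

def refine_peak(m):
    """Return (x, y) of the interpolated maximum of 2-D list m."""
    ny, nx = len(m), len(m[0])
    iy, ix = max(((y, x) for y in range(ny) for x in range(nx)),
                 key=lambda p: m[p[0]][p[1]])

    def parabola(lo, c, hi):
        # Vertex offset of the parabola through (-1, lo), (0, c), (1, hi).
        denom = lo - 2.0 * c + hi
        return 0.0 if denom == 0 else 0.5 * (lo - hi) / denom

    dx = parabola(m[iy][ix - 1], m[iy][ix], m[iy][ix + 1]) if 0 < ix < nx - 1 else 0.0
    dy = parabola(m[iy - 1][ix], m[iy][ix], m[iy + 1][ix]) if 0 < iy < ny - 1 else 0.0
    return ix + dx, iy + dy
```

This is cheap compared to building a TS map per source, which is why a precomputed precision-versus-significance relation is attractive.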
Pipeline prototype
We have started with the likelihood step (most time consuming)
Basis provided by J. Chiang (catalogAnalysis package, sourceAnalysis Python script). Chains event selection, exposure map generation and likelihood run.
Use the OPUS task scheduler
Try using OPUS in a minimal way (do not decompose too much) in order to facilitate portability.
Standard region of interest (20° radius) contains many sources (several tens)
Optimizers have trouble converging over so many parameters.
Reduce area over which sources are fitted, taking nearby sources (fixed) from run on nearby region of interest.
In addition, call likelihood in several steps (first diffuse emission, then add bright sources, then add faint ones).
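The fitting strategy above can be sketched as two helpers: one partitions sources into free (inside the core region) and fixed (taken from the neighbouring region's run), the other calls the fitter in steps. Names, the planar distance (a small-angle approximation) and the TS threshold for "bright" are all illustrative assumptions.

```python
# Sketch of the reduced-area, staged likelihood strategy.
# Illustrative only: planar distances, invented TS threshold.

def split_free_fixed(sources, center, core_radius):
    """Free sources inside the core; sources outside are held fixed."""
    free, fixed = [], []
    for s in sources:
        # Small-angle approximation, fine for this illustration.
        d = ((s["ra"] - center[0]) ** 2 + (s["dec"] - center[1]) ** 2) ** 0.5
        (free if d <= core_radius else fixed).append(s)
    return free, fixed

def staged_fit(free_sources, fit):
    """Call the fitter in steps: diffuse, then bright, then faint."""
    bright = [s for s in free_sources if s["ts"] >= 100]   # assumed cut
    faint = [s for s in free_sources if s["ts"] < 100]
    fit(["diffuse"])                    # step 1: diffuse emission only
    fit(["diffuse"] + bright)           # step 2: add bright sources
    fit(["diffuse"] + bright + faint)   # step 3: add faint sources
```

Freezing the outer sources keeps the number of free parameters small enough for the optimizer to converge reliably.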
What we plan to provide at DC2 kickoff
Run all-sky source detection
Run likelihood over all detections of first step
Provide source list (FITS format for the catalog, but not all columns will be filled)
Source_Name (just a number for internal reference), RA, Dec, Lon, Lat from all-sky source detection
Test_Statistic, Flux (> 100 MeV), Unc_Flux, Spectral_Index, Unc_Spectral_Index from likelihood
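The column set above can be sketched as a Python record; the column names follow the slide, while the helper name and placeholder values are invented for illustration.

```python
# DC2 source-list columns sketched as a Python record.
# Column names are from the slide; everything else is a placeholder.

def make_catalog_row(seq, ra, dec, lon, lat,
                     ts, flux, unc_flux, index, unc_index):
    return {
        "Source_Name": f"{seq:04d}",   # just a number for internal reference
        "RA": ra, "Dec": dec,          # equatorial position (deg)
        "Lon": lon, "Lat": lat,        # Galactic position (deg)
        "Test_Statistic": ts,          # from likelihood
        "Flux": flux,                  # > 100 MeV
        "Unc_Flux": unc_flux,
        "Spectral_Index": index,
        "Unc_Spectral_Index": unc_index,
    }
```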
What we will try to add during DC2
Error box or ellipse
Counterparts (using sourceIdentify)
Source catalog generation
Several open points:
1. Should we provide fluxes in a different band (starting at a higher energy than 100 MeV) to optimize the relative error?
2. Do we need a separate step for source localization and position error?
3. Should we implement additional cuts on the data (e.g. on off-axis angle)?
Catalog generation is on the way!