LIGO Plans for OSG
J. Kent Blackburn
LIGO Laboratory
California Institute of Technology
Open Science Grid Technical Meeting
UCSD
December 15-17, 2004




LSC Data Grid

The LIGO Scientific Collaboration's (LSC) Data Grid:
• Nine clusters (CIT, MIT, LHO, LLO, UWM, PSU, AEI, ISI, Birmingham)
• Close to 2000 CPUs within the LSC
• Condor is the primary tool
• LDAS used for data reduction, database, and other analyses
• Learn more at http://www.lsc-group.phys.uwm.edu/lscdatagrid/


Issues with the LSC Data Grid

Many LIGO analysis groups take a local approach to using the LSC Data Grid

Concrete "DAG" workflows have been the workhorse, targeting specific sites
• This culture developed out of the particulars of the analysis methods and the nature of the compute resources, e.g., …
   Where is the data I need most likely to be?
   Where are the compute nodes the fastest?

Scientific results from the inspiral & pulsar analyses are limited by the available compute resources
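A concrete workflow of this kind is typically a Condor DAGMan input file whose submit files pin each node to a particular site. The node names, submit files, and requirements expression below are hypothetical, just to sketch the site-specific style:

```text
# Hypothetical concrete DAG: the workflow names specific submit files,
# and each submit file in turn is hard-wired to one site's pool.
JOB  TmpltBank  tmpltbank.sub
JOB  Inspiral   inspiral.sub
PARENT TmpltBank CHILD Inspiral

# tmpltbank.sub (excerpt) -- the site choice is baked in here:
#   requirements = (Arch == "INTEL") && regexp("ligo-wa", Machine)
```

Because the site binding lives in the submit files, moving such a workflow to a different cluster means editing it by hand, which is the limitation the abstract "DAX" approach addresses.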


Supercomputing 2004

Deployed an "inspiral" analysis across the LSC Data Grid
• Used Pegasus to "plan execution" across this distributed grid
   First use of an abstract "DAX" in a LIGO analysis
• Included use of the LSU cluster
• Considered very successful by the LSC
• Encountered transfer-client timeouts due to the large number of connections to any single GridFTP server – a solution is currently under development by the Pegasus team
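The gain from an abstract "DAX" is that jobs name only logical files and a planner binds them to sites at plan time. The toy sketch below illustrates that idea only; it is not the Pegasus planner, and the site names, file names, and catalog contents are made up:

```python
# Toy illustration of abstract-workflow planning: jobs reference logical
# file names; a "planner" consults a replica catalog to bind each job to
# a site that already holds its input data. All names are hypothetical.

replica_catalog = {              # logical file name -> sites holding a copy
    "H1-GWDATA.gwf": ["CIT", "UWM"],
    "L1-GWDATA.gwf": ["LLO"],
}

abstract_jobs = [                # note: no site appears in the abstract workflow
    {"name": "inspiral_H1", "input": "H1-GWDATA.gwf"},
    {"name": "inspiral_L1", "input": "L1-GWDATA.gwf"},
]

def plan(jobs, catalog, default_site="CIT"):
    """Bind each abstract job to a site that already holds its input."""
    concrete = []
    for job in jobs:
        sites = catalog.get(job["input"], [])
        concrete.append({**job, "site": sites[0] if sites else default_site})
    return concrete

for job in plan(abstract_jobs, replica_catalog):
    print(job["name"], "->", job["site"])
```

The same abstract workflow can be re-planned against a different catalog without editing the workflow itself, which is what made the SC04 run portable across the LSC Data Grid.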


Going Beyond SC04

SC04 demonstrated that non-localized usage of the LSC Data Grid by LSC analysis groups is possible!

Pegasus will soon efficiently support LIGO's dataset challenges through bundled transfer support on a single connection

In January, a workshop on Pegasus is planned for the LSC to bootstrap other analysis groups on using "DAX" workflows on a distributed grid.
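The point of bundling transfers is that a single GridFTP server sees far fewer simultaneous client connections. A back-of-the-envelope sketch, with made-up numbers (the file counts and batch size are illustrative, not LIGO's actual figures):

```python
# Illustration of why bundling helps: moving many files over a few reused
# connections instead of opening one connection per file, so a single
# server no longer drowns in simultaneous clients. Numbers are made up.

from math import ceil

def connections_needed(n_files, files_per_connection=1):
    """Connections a server must accept to serve n_files requests."""
    return ceil(n_files / files_per_connection)

n_files = 5000   # e.g., frame files staged for one hypothetical analysis run
unbundled = connections_needed(n_files)            # one file per connection
bundled = connections_needed(n_files, files_per_connection=250)

print(unbundled, "connections unbundled vs", bundled, "bundled")
```

Dropping from thousands of connections to a few dozen is what keeps transfer clients from timing out against a single server, which was the failure mode seen at SC04.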


Migration to Grid3

The January workshop will also include a tutorial on using Grid3

The goal is to carry out the inspiral analysis utilizing Grid3 when possible

Hope to deploy the stochastic analysis across the LSC Data Grid and onto Grid3 as well

LIGO plans to build up in-house technical expertise for running on Grid3


On to OSG

Based on experiences running on Grid3 in late winter 2005, the plan is to migrate the inspiral analysis, and if available the stochastic analysis, onto OSG0 once it is up and available in the spring